Now, you are going to incrementally
fetch and move the data to the target table named UpdatedInvestments.
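Before wiring up the flow, make sure the target table exists in your SQL Server database. The following is a minimal sketch of its schema, assuming the source table holds investment records with a last modified column (the column names here are illustrative, not taken from the original setup):

```sql
-- Hypothetical schema for illustration; match it to your actual source table.
CREATE TABLE UpdatedInvestments (
    InvestmentID INT PRIMARY KEY,
    InvestorName VARCHAR(100),
    Amount       DECIMAL(18, 2),
    LastModified DATETIME      -- used by the incremental fetch to detect changes
);
```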
1. Select the processor group and click the configuration icon in the Operate panel. To learn more about controller services, refer here.
Use the QueryDatabaseTable processor to fetch the updated values from the given table based on the last modified date. Double-click the processor to open its configuration dialog and specify the following properties.
Columns to Return: Leave empty. If no columns are specified, all columns will be returned from the source table.
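Conceptually, on each run QueryDatabaseTable issues a query equivalent to the sketch below, substituting its stored state value into the filter (the table and column names are assumptions for illustration):

```sql
-- Sketch of the query the processor effectively runs on each execution;
-- the literal timestamp stands in for the processor's stored state value.
SELECT *
FROM Investments
WHERE LastModified > '2018-01-15 10:30:00';
```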
Figure 9: Configure source table details.
This processor stores the latest date value as its state and filters records based on the stored date. To view the state value, right-click the processor and select View state. The Component State dialog will appear, showing the last modified date with the column name and table name as keys. To clear the state value, select the “Clear state” option.
Figure 10: View component state.
On running, this processor returns the
output in AVRO format.
Figure 11: View AVRO data.
Next, use the ConvertAvroToJSON processor to convert the incoming AVRO data into JSON format. In the processor's properties, set the JSON Container Options to “array” so that the AVRO data is converted to an array of JSON objects.
Figure 12: Convert AVRO data into JSON data.
You can see the JSON output in the output queue.
Figure 13: View JSON data.
You must convert the JSON input into the relevant SQL insert statements. For this operation, use the ConvertJSONToSQL processor. In its properties, set the Statement Type to either “INSERT” or “UPDATE,” and enter the target table name “UpdatedInvestments” in the Table Name property.
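For instance, with the statement type set to “INSERT,” an incoming JSON record would be translated into a parameterized statement along these lines (the column names are assumed for illustration; ConvertJSONToSQL places the actual values in `sql.args.N.value` flow file attributes, which are bound to the ? placeholders when the statement is executed):

```sql
-- Hypothetical output for one JSON record; the ? parameters are
-- filled from flow file attributes at execution time.
INSERT INTO UpdatedInvestments (InvestmentID, InvestorName, Amount, LastModified)
VALUES (?, ?, ?, ?);
```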
Figure 14: Configure target table details.
Under the output queue, you can see the SQL statements, as shown in the following screenshot.
Figure 15: View generated SQL query.
The PutSQL processor is used to execute the incoming SQL statements. Just point it to the relevant connection pool.
Figure 16: Choose SQL Server.
Now your data flow is ready.
You must schedule the overall dataflow
to run periodically and move the data from source to destination.
Open the configuration dialog for the QueryDatabaseTable processor, move to the Scheduling tab, choose the “Timer driven” strategy, and set the run schedule to one minute. This ensures that the data flow runs every minute, checking for new and updated records and moving them from the source to the destination.
Figure 17: Schedule data flow.
Please refer to the link below to learn
more about scheduling:
Now, start all the processors by clicking the “Start” button in the Operate panel.
Figure 18: Execute the data flow.
Whenever values are inserted or updated (along with the LastModified date field) in the source table, they will be automatically moved to the target table by the data flow.
You can try directly editing the source table (updating the last modified column as well) in SQL Server to see whether the records are moved to the target table correctly.
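For example, you could run an update like the following against the source table (names assumed as before). Because it also bumps LastModified, the next scheduled run picks the row up and carries it to UpdatedInvestments:

```sql
-- Touching LastModified is what makes the change visible to the next run.
UPDATE Investments
SET Amount = 7500.00,
    LastModified = GETDATE()   -- SQL Server's current date and time
WHERE InvestmentID = 1;
```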
Figure 19: Target table.
Hopefully, this blog has shown you the steps involved in incrementally fetching data from a source SQL table and moving it to a target SQL table using the QueryDatabaseTable processor in the Syncfusion Data Integration Platform. This data flow will be very helpful when you want to perform data migration, data synchronization, and data transformation between SQL databases.
If you are new to the Data Integration Platform, it is highly recommended that you follow our Getting Started guide.
If you’re already a Syncfusion user, you can download the product setup here. If you’re not yet a Syncfusion user, you can
download a free, 30-day trial here.
If you have any questions or require clarification about this feature, please let us know in the comments below. You can also contact us through our support forum or Direct-Trac. We are happy to assist you!