Creating a Dataflow
This document helps you understand the Dataflows feature.
How to navigate to Dataflows?
To create a dataflow or manage existing dataflows:
Go to the Data module.
Choose the Batch Stream/Realtime Stream menu.
The Dataflows module offers two data transformation options; choose the one that fits your requirements.
Batch Streaming: transforms data on a scheduled basis.
Real-time Streaming: transforms data in real time.
For instance,
If you regularly update transaction data into CDP 365, a Batch Streaming dataflow is ideal.
If you want to send order confirmations immediately after a Messenger message, use a Real-time Streaming dataflow.
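To make the contrast concrete, here is a minimal Python sketch (hypothetical, not CDP 365 code; all function names are placeholders) of the two transformation modes:

```python
import time

def batch_transform():
    # Batch Streaming: the transformation runs on a fixed schedule,
    # e.g. once every 24 hours for routine transaction updates.
    print("transforming the latest batch of transaction data")

def run_on_schedule(job, interval_seconds):
    # Simplified scheduler loop: run the job, then wait for the next cycle.
    while True:
        job()
        time.sleep(interval_seconds)

def on_new_message(message):
    # Real-time Streaming: the transformation fires for each incoming
    # event as it arrives, e.g. a Messenger message.
    print(f"sending order confirmation for: {message}")

# run_on_schedule(batch_transform, 24 * 60 * 60)  # scheduled (batch) mode
# on_new_message("new Messenger order")           # event-driven (real-time) mode
```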
Step 1: Choose the Batch Streaming option.
Step 2: Click "Blank Dataflow".
Step 3: Select your data source.
Step 4: Use the nodes available within the dataflow (action nodes and destination channels) to transform the input data.
Notes:
Action nodes: nodes that process data within the dataflow.
Destination channels: nodes that push data out of the dataflow.
Step 1: Choose the Real-time Streaming option.
Step 2: Click either "Facebook Lead" or "Facebook Messenger" as your data source.
Step 3: Configure your destination channels, which can be either a CDP 365 event or a CDP 365 Business Object.
While performing data transformations, a dataflow maintains a log. Note, however, that the Streaming log conveys a different meaning for each type of dataflow.
In Batch Streaming: the log documents when scheduled data updates run.
In Real-time Streaming: the log records when each data record is updated into CDP 365.
For instance,
For a Batch Streaming dataflow that routinely updates transaction data, the log documents the daily updates.
For a Real-time Streaming dataflow connected to Facebook Messenger, the log records the messages exchanged from your account.
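As an illustration only (CDP 365's actual log format is not documented here), the difference in granularity could be modeled like this:

```python
from datetime import datetime, timezone

batch_log = []     # one entry per scheduled run
realtime_log = []  # one entry per updated record

def log_batch_run():
    # Batch Streaming: the log marks each scheduled update, e.g. daily.
    batch_log.append({"ran_at": datetime.now(timezone.utc)})

def log_realtime_update(record_id):
    # Real-time Streaming: the log marks each record updated into CDP 365,
    # e.g. every Messenger message exchanged.
    realtime_log.append({"record": record_id,
                         "updated_at": datetime.now(timezone.utc)})
```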
Data source nodes allow you to configure your data inputs.
Action nodes enable you to define data transformation conditions.
Destination channels are where you configure your data outputs.
In a Batch Streaming dataflow, the sequence typically begins with one or several data source nodes, followed by the action nodes, and concludes with one or several destination channels.
In a Real-time Streaming dataflow, you choose a single data input and have the option to establish multiple destination channels, which can be either events or Business Objects.
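The typical sequence can be pictured with a small Python sketch (hypothetical node implementations; in CDP 365 these nodes are configured visually, not in code):

```python
def source_orders():
    # Data source node: yields the raw input records.
    yield {"order_id": 1, "amount": 120.0, "status": "paid"}
    yield {"order_id": 2, "amount": 35.5, "status": "refunded"}

def action_keep_paid(records):
    # Action node: applies a transformation condition to the data.
    return (r for r in records if r["status"] == "paid")

def destination_push(records):
    # Destination channel: pushes the transformed data out of the dataflow.
    for record in records:
        print("pushed:", record)

# Data source -> action node -> destination channel
destination_push(action_keep_paid(source_orders()))
```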
Every time you save your dataflow, a new version is generated.
You can access the Version History to review your past configurations and restore them as needed.
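Conceptually, the versioning behavior works like an append-only history; a rough sketch (illustrative only, not the product's implementation):

```python
version_history = []

def save(dataflow_config):
    # Every save appends a new version rather than overwriting the old one.
    version_history.append(dict(dataflow_config))

def restore(version_index):
    # Version History lets you bring back any past configuration.
    return dict(version_history[version_index])

save({"type": "Batch Streaming", "schedule": "daily"})
save({"type": "Batch Streaming", "schedule": "hourly"})
assert restore(0)["schedule"] == "daily"
```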
To run your dataflow outside of its scheduled time, click "Force Run".
When you're ready to save the current settings, click Save.
To activate your dataflow, click on Activate.
Before creating your dataflows, it's important to have a well-defined understanding of your data requirements. This will enable you to choose the appropriate type of dataflow (Batch Streaming or Real-time Streaming) that aligns with your specific use cases.
Nevertheless, you can only construct a dataflow once you have both the data source and the destination channel already established.
The creation steps for dataflows vary depending on the type of dataflow you choose.