Creating a Dataflow

This document helps you understand the Dataflows feature and how to use it.

How to navigate to Dataflows?

To create a dataflow or manage existing dataflows:

  1. Go to the Data module.

  2. Choose the Batch Stream/Realtime Stream menu.

How to create your dataflow?

Before creating your dataflows, make sure you have a well-defined understanding of your data requirements. This lets you choose the type of dataflow (Batch Streaming or Real-time Streaming) that fits your specific use cases.

Note that you can only build a dataflow once both the data source and the data destination are already set up.

The creation steps for dataflows vary significantly depending on the type of dataflow you choose.

Dataflow options

The Dataflows module offers two data transformation options that you can choose based on your requirements.

  1. Batch Streaming: This option transforms data on a scheduled basis.

  2. Realtime Streaming: This option transforms data in real time.

For instance,

  • If you regularly update transaction data into CDP 365, a Batch Streaming dataflow is ideal.

  • If you want to send order confirmations promptly after a Messenger message, use a Real-time Streaming dataflow.
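The difference between the two options can be sketched in code. This is a conceptual illustration only: CDP 365 dataflows are configured through the UI, and all function and field names below are hypothetical.

```python
def transform(record):
    # Example transformation: normalize a transaction amount to cents.
    return {**record, "amount_cents": round(record["amount"] * 100)}

def run_batch_dataflow(records):
    """Batch Streaming: transform all accumulated records at the scheduled time."""
    return [transform(r) for r in records]

def on_event(record):
    """Real-time Streaming: transform each record the moment it arrives."""
    return transform(record)

# Batch: a daily job processes the day's transactions in one pass.
daily_batch = [{"id": 1, "amount": 9.99}, {"id": 2, "amount": 25.50}]
print(run_batch_dataflow(daily_batch))

# Real-time: a Messenger-triggered order is processed immediately on arrival.
print(on_event({"id": 3, "amount": 5.00}))
```

The key distinction is simply when the transformation runs: on a schedule over a batch of records, or per record as each one arrives.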

Batch Streaming Dataflows

  • Step 1: Choose the Batch Streaming option.

  • Step 2: Click "Blank Dataflow".

  • Step 3: Select your data source.

Notes:

  1. Action nodes: nodes that process data within the dataflow.

  2. Destination channels: nodes that push data out of the dataflow.

(An example of the Batch Streaming dataflow)

Real-time Streaming Dataflows

  • Step 1: Choose the Real-time Streaming option.

  • Step 2: Click either "Facebook Lead" or "Facebook Messenger" as your data source.

  • Step 3: Configure your destination channels, which can be either a CDP 365 event or a CDP 365 Business Object.

(An example of the Real-time Streaming dataflow)

Explanation of the Streaming Log tab

While performing data transformations, a dataflow maintains a log. Note, however, that the Streaming Log means something different for each type of dataflow.

  • In Batch Streaming: the log records when data updates ran according to the schedule.

  • In Real-time Streaming: the log records when each data record is updated in CDP 365.

For instance,

  • For a Batch Streaming dataflow that routinely updates transaction data, the log documents each daily update.

  • For a Real-time Streaming dataflow connected to Facebook Messenger, the log records the messages exchanged through your account.
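The two log granularities above can be sketched as follows. This is a conceptual illustration only; the log structures and field names are hypothetical, not the actual CDP 365 schema.

```python
batch_log = []      # one entry per scheduled run
realtime_log = []   # one entry per record update

def log_batch_run(run_time, records_updated):
    """Batch Streaming log: one entry records when the scheduled update ran."""
    batch_log.append({"run_at": run_time, "records": records_updated})

def log_realtime_update(record_id, received_at):
    """Real-time Streaming log: one entry per individual record update."""
    realtime_log.append({"record_id": record_id, "received_at": received_at})

# A daily batch run produces a single log entry covering many records...
log_batch_run("2024-01-02T02:00:00", records_updated=1500)

# ...while each incoming Messenger message produces its own log entry.
log_realtime_update("msg-001", "2024-01-02T09:14:31")
log_realtime_update("msg-002", "2024-01-02T09:15:05")

print(len(batch_log), len(realtime_log))
```

In short, the Batch log is coarse-grained (per scheduled run), while the Real-time log is fine-grained (per record).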

An example of a Batch Streaming dataflow

  1. Data source nodes allow you to configure your data inputs.

  2. Action nodes enable you to define data transformation conditions.

  3. Destination channels are where you configure your data outputs.

In a Batch Streaming dataflow, the sequence typically begins with one or several data source nodes, followed by the action nodes, and concludes with one or several destination channels.
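The source-to-action-to-destination sequence above can be sketched as a simple pipeline. This is a conceptual illustration only; the node functions and record fields are hypothetical, not part of the CDP 365 product.

```python
def source_transactions():
    """Data source node: yields input records."""
    yield {"order_id": "A1", "amount": 120.0}
    yield {"order_id": "A2", "amount": 40.0}

def action_filter_large(records):
    """Action node: a transformation condition keeping transactions >= 100."""
    return (r for r in records if r["amount"] >= 100.0)

def destination_print(records):
    """Destination channel: pushes records out of the dataflow."""
    for r in records:
        print("pushed:", r)

def run_dataflow():
    # The typical Batch Streaming order: sources, then actions, then destinations.
    destination_print(action_filter_large(source_transactions()))

run_dataflow()
```

Each stage only consumes the output of the previous one, which is why the sequence always starts at the data source nodes and ends at the destination channels.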

An example of a Real-time Streaming dataflow

In a Real-time Streaming dataflow, you choose a single data input and have the option to establish multiple destination channels, which can be either events or Business Objects.
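The single-input, multiple-destination shape can be sketched as a fan-out. This is a conceptual illustration only; the destination labels ("event", "business_object") and function names are hypothetical, not real CDP 365 APIs.

```python
destinations = []

def to_event(record):
    """Destination channel 1: deliver the record as an event."""
    destinations.append(("event", record["id"]))

def to_business_object(record):
    """Destination channel 2: deliver the record to a Business Object."""
    destinations.append(("business_object", record["id"]))

def on_record(record, channels):
    """A single data input fans out to every configured destination channel."""
    for channel in channels:
        channel(record)

# One incoming record is delivered to both destinations.
on_record({"id": "lead-42"}, [to_event, to_business_object])
print(destinations)
```

Unlike a Batch Streaming dataflow, there is exactly one input here, but the same record can reach several destinations.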

Feature Explanation

Version History

Every time you save your dataflow, a new version is generated.

You can access the Version History to review your past configurations and restore them as needed.

Force Run

If you want to run your dataflow outside of its scheduled time, click "Force Run".

Save

When you're ready to save the current settings, click Save.

Activate

To activate your dataflow, click on Activate.
