Your company receives both batch- and stream-based event data. You want to process the data using
Google Cloud Dataflow over a predictable time period. However, you realize that in some instances data
can arrive late or out of order. How should you design your Cloud Dataflow pipeline to handle data that is
late or out of order?
Recent Comments (The most recent comments are at the top.)
C is the right answer. A watermark is the system's notion of the point in event time by which it expects all data for a given window to have arrived in the pipeline.
It should be A.
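To make the watermark comment above concrete, here is a toy sketch (not Beam's actual implementation) of how a watermark can decide which out-of-order events still land in their window and which are dropped as too late. The window size, allowed lateness, and the heuristic of trailing the maximum event time seen so far are all illustrative assumptions.

```python
from collections import defaultdict

WINDOW_SIZE = 60       # assumed fixed windows of 60 seconds
ALLOWED_LATENESS = 30  # assumed: accept data up to 30s behind the watermark

def window_start(event_time):
    # Map an event timestamp to the start of its fixed window.
    return (event_time // WINDOW_SIZE) * WINDOW_SIZE

def process(events, watermark_lag=10):
    """Assign out-of-order (event_time, value) pairs to fixed windows.

    The watermark here is a simple heuristic: it trails the maximum
    event time observed so far by `watermark_lag` seconds. Events
    older than (watermark - ALLOWED_LATENESS) are dropped as late.
    """
    windows = defaultdict(list)
    dropped = []
    max_seen = 0
    for event_time, value in events:
        max_seen = max(max_seen, event_time)
        watermark = max_seen - watermark_lag
        if event_time < watermark - ALLOWED_LATENESS:
            dropped.append((event_time, value))  # beyond allowed lateness
        else:
            windows[window_start(event_time)].append(value)
    return dict(windows), dropped

# Events arrive out of order; "c" and "e" arrive after the watermark
# has moved too far past their windows, so they are dropped.
events = [(5, "a"), (65, "b"), (10, "c"), (130, "d"), (4, "e")]
windows, dropped = process(events)
```

In a real Dataflow/Beam pipeline you would express the same idea declaratively with fixed windows, an `AfterWatermark` trigger, and an allowed-lateness setting, rather than tracking the watermark by hand.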