Maritime Event Detection

Synopsis

This operator uses the Maritime Event Detection Service (with connection information provided by its parameters) to detect events in streamed maritime data within a streaming analytic workflow.

Description

To use this operator's functionality, an Akka cluster running the Maritime Event Detection algorithm has to be set up. The Kafka cluster and the names of the input and output topics used by the Maritime Event Detection service have to be provided to the operator.

The operator starts the Maritime Event Detection job on the Akka cluster (using the connection information of the Maritime connection). The data events received at the input stream port are then pushed to the input topic of the Kafka cluster. The Maritime Event Detection service performs the event detection and pushes the detected events to the output topic of the Kafka cluster.

The operator reads from the output topic of the Kafka cluster and pushes the detected events further downstream (to the output stream port).
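For illustration, the following sketch shows the Kafka interaction described above in plain Java: each incoming data event is produced to the input topic, and detected events are consumed from the output topic and handed downstream. The broker address, topic names, key value, and JSON event layout are placeholders and not part of the actual operator implementation, which runs on the configured streaming platform.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class MaritimeKafkaBridgeSketch {

    public static void main(String[] args) {
        // Placeholder values: in the workflow these come from the Kafka connection
        // object and the "Input topic" / "Output topic" parameters.
        String brokers = "localhost:9092";
        String inputTopic = "maritime-input";
        String outputTopic = "maritime-output";

        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", brokers);
        producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", brokers);
        consumerProps.put("group.id", "maritime-bridge-sketch");
        consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps);
             KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {

            // Push one incoming data event to the input topic of the service.
            String key = "244660000"; // e.g. a vessel identifier used as the record key
            String event = "{\"mmsi\":\"244660000\",\"lat\":52.1,\"lon\":4.3}";
            producer.send(new ProducerRecord<>(inputTopic, key, event));
            producer.flush();

            // Read detected events back from the output topic and hand them downstream.
            consumer.subscribe(List.of(outputTopic));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println("Detected event: " + record.value());
            }
        }
    }
}
```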

This is a streaming operator and needs to be placed inside a Streaming Nest or a Streaming Optimization operator. The operator defines the logical functionality and can be used in any streaming analytic workflow for any supported streaming platform (currently Flink and Spark). The actual implementation used depends on the type of connection connected to the Streaming Nest operator in which this operator is placed.

Input

maritime-connection

The connection to the Maritime Akka Cluster to start the Maritime Event Detection service.

kafka-connection

The connection to the Kafka cluster that the Maritime Event Detection service uses for communication.

input stream

The input of this streaming operation. It needs to receive the output of a preceding streaming operator to define the flow of data events in the streaming analytic workflow.

Output

output stream

The output of this streaming operation. Connect it to the next streaming operator to define the flow of data events in the designed streaming analytic workflow.

Parameters

Job type

Type of the Maritime job to start remotely.

  • events: 'events' job.
  • fusion: 'fusion' job.

Input topic

Name of the Kafka topic from which the Maritime Event Detection service receives its input data events.

Output topic

Name of the Kafka topic to which the Maritime Event Detection service pushes the detected output events.

Key field

Name of the key field to use for the Kafka records.
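As a hedged illustration of how this parameter could be applied, the sketch below takes the value of the configured key field from an event (represented here as a flat map of field names to values, which is an assumption about the event format) and uses it as the key of the Kafka record written to the input topic.

```java
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerRecord;

public class KeyFieldSketch {

    // Builds the Kafka record for one data event: the value of the configured
    // key field becomes the record key, the serialized event becomes the value.
    static ProducerRecord<String, String> toRecord(String inputTopic, String keyField,
                                                   Map<String, String> event, String serializedEvent) {
        String key = event.get(keyField); // e.g. the vessel's "mmsi"
        return new ProducerRecord<>(inputTopic, key, serializedEvent);
    }

    public static void main(String[] args) {
        // Hypothetical event represented as a simple map for illustration only.
        Map<String, String> event = Map.of("mmsi", "244660000", "lat", "52.1", "lon", "4.3");
        ProducerRecord<String, String> record =
                toRecord("maritime-input", "mmsi", event, event.toString());
        System.out.println("record key = " + record.key());
    }
}
```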