Pipes Language
A pipe consists of an input, which generates data; that data passes through a series of actions and is finally delivered to its destination by an output (see the sketch below). In addition, some inputs may also be used as actions, as may some outputs.
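As a first orientation, here is a minimal sketch of what a pipe definition might look like, assuming a YAML-style configuration; every key and option name in it (name, input, actions, output, and the per-component settings) is an assumption for illustration, not confirmed syntax.

    # A minimal pipe: a static input, one action, and a terminal output.
    # All key and option names are assumed for illustration.
    name: hello-pipe
    input:
      echo:
        event: '{"greeting": "hello"}'   # echo creates a simple static event
    actions:
      - add:
          output-fields:
            - pipe: hello-pipe           # add attaches a new field to the event
    output:
      print: STDOUT                      # print writes events to standard output

Read top to bottom: the input produces events, each action transforms them in turn, and the output delivers the result.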
Inputs
amqp: Read from AMQP queues
azure-blob: Read data from a Microsoft Azure Storage Blob (Block Storage)
echo: Create a simple static event
exec: Execute arbitrary commands
files: Read from multiple files, in order of creation
http-poll: Run HTTP queries (GET and POST)
http-server: Run an HTTP server
internal-messages: Receive internal messages
kafka: Consume events from one or more Kafka topics
mqtt: Consume events from an MQTT topic
nsq: Consume events from an NSQ topic
redis: Read from the Redis in-memory key-value store
s3: Stream data from an S3 object
scuba: Run BQL queries against a Scuba API endpoint
sql: Query a SQL database
tcp: Listen for incoming TCP connections (or connect to an existing server)
udp: Listen for incoming UDP datagrams
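To make the shape of an input stanza concrete, here is a sketch of how one of the inputs above might be configured; the option names (url, interval) are invented for illustration and are not documented syntax.

    # Illustrative input stanza: poll an HTTP endpoint on a schedule.
    # The option names (url, interval) are assumptions.
    input:
      http-poll:
        url: http://example.com/api/status
        interval: 30s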
Outputs
amqp: Send events to an AMQP server
azure-blob: Send data to a Microsoft Azure Storage Blob (Block Storage)
azure-monitor: Send data to Azure Monitor
elastic: Send events to an Elasticsearch server
exec: Execute arbitrary commands
file: Write to a file
http-get: Run HTTP GET requests
http-post: Run HTTP POST requests
http-server: Run an HTTP server
kafka: Write to a Kafka topic
mqtt: Publish events to an MQTT topic
nsq: Publish events to an NSQ topic
print: Print to either STDOUT (the standard output for the terminal) or STDERR
message: Create a message for the internal message subsystem
redis: Write to the Redis in-memory key-value store
s3: Write events to a file in an S3 bucket
splunk-hec: Output events to a Splunk HTTP Event Collector endpoint (Splunk HEC)
sql: Insert data into a SQL database
tcp: Send data to a TCP server
udp: Send data to a UDP server
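By way of illustration, an output stanza might look like the sketch below; the option names (brokers, topic) are assumptions, not confirmed syntax.

    # Illustrative output stanza: publish each event to a Kafka topic.
    # The option names (brokers, topic) are assumptions.
    output:
      kafka:
        brokers: localhost:9092
        topic: pipe-events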
Actions
abort: Abort the pipe if the condition is met
assert: Validate an event against a JSON Schema, based on IETF's draft v7 (http://json-schema.org)
add: Add new fields to an event
collapse: Convert JSON records to another format, such as CSV or key-value pairs
convert: Convert the data types of field values
enrich: Enrich data using a CSV lookup
expand: Convert simple delimiter-separated data into JSON
extract: Extract data from plain text, using a pattern
exec: Execute arbitrary commands
filter: Remove events based on given conditions
flatten: Flatten nested JSON objects and arrays into a single JSON object containing only top-level fields
generate: Create new events, specifically for alerts
raw: Operations on raw (non-JSON) data
remove: Remove fields
rename: Rename fields
script: Set fields to computed values, perhaps conditionally
stalled: Report when a stream has stopped receiving events for a given duration
stream: Create a new field calculated from historical data
time: Manipulate times and timestamps
transaction: Collect events together, based on some condition, into a single new event
transition: Perform various actions when a field's value changes
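Actions carry most of a pipe's logic, so a sketch of a small chain may help; the option names and the condition syntax below are illustrative assumptions, not confirmed syntax.

    # Illustrative action chain, applied to each event in order.
    # Option names and the condition expression are assumptions.
    actions:
      - extract:
          pattern: '(?P<level>\w+): (?P<message>.+)'  # parse plain text into fields
      - add:
          output-fields:
            - host: web-01                            # attach a new field
      - filter:
          condition: level != 'DEBUG'                 # remove events based on a condition
      - remove:
          - message                                   # strip a field before output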