The type of the data that results from being processed
A function that takes the data to be processed, the done callback, and a push function that can be used to wrap a command to do work.
If processing succeeded, you don't need to checkpoint, and you don't want to pass anything on to the next processing step, just call this function with no arguments.
The callback function to send back an error or the successful processed result.
If processing failed, set this to an Error object representing the failure.
If processing succeeded and you don't need to pass anything to another processing step, pass true here and the SDK will checkpoint the event being processed for you. If processing succeeded and you need to send a result of processing to another processing step, set this to the data to send.
A convenience method to push the resulting processed data to a queue.
The data to push to a queue.
The data to give to the next processing step; in a pipeline, this puts the data into the pipeline so it flows on to the next pipeline step.
This is a standard callback used to tell the SDK that you are done processing something, either successfully or not. If processing succeeded, you don't need to checkpoint, and you don't want to pass anything on to the next processing step, just call this function with no arguments.
If processing failed, set this to an Error object representing the failure.
If processing succeeded and you don't need to pass anything to another processing step, pass true here and the SDK will checkpoint the event being processed for you. If processing succeeded and you need to send a result of processing to another processing step, set this to the data to send.
This allows you to override the normal options used by the SDK for this one event that you are saying is now "done".
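The done-callback contract described above can be sketched as follows. This is an illustrative example only: the names `ProcessCallback` and `processor` are hypothetical, not the SDK's own, but the three outcomes mirror the documentation (fail with an Error, checkpoint with `true`, or pass a result on).

```typescript
// Hypothetical sketch of the done callback described above.
// done(err) signals failure; done(null, true) checkpoints without
// sending anything on; done(null, result) passes a result along.
type ProcessCallback<U> = (err?: Error | null, result?: U | true) => void;

function processor(payload: string, done: ProcessCallback<string>): void {
  if (payload.length === 0) {
    done(new Error("empty payload")); // processing failed
  } else if (payload === "skip") {
    done(null, true); // succeeded; checkpoint only, nothing sent on
  } else {
    done(null, payload.toUpperCase()); // succeeded with a result
  }
}
```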
The type of the data that is to be processed
The type of the data that will be the result of processing
A function that takes the payload of the event (T) and the wrapper of the entire event (ReadEvent).
The data to be processed
The function to call when done with the result, if any
For convenience, a re-export of the popular event-stream library.
This creates a pipeline step to tell the SDK to micro-batch events received in one pipeline step before sending them to the next pipeline step. It's useful to control how many events arrive all at once, roughly, to the next pipeline step. It can be helpful for a pipeline step to receive a micro-batch of events, say 100 at a time, instead of 1 at a time to leverage economies of scale when writing to a database, e.g. doing one query to the database to get 100 records back all at once instead of having to do a query to the database for each event as it comes in.
The type of the data being batched from the previous pipeline step before sending to the next pipeline step
If a number, this is how many events to batch up before sending along. If BatchOptions, this gives fine-grained control to ensure events keep flowing smoothly whether there are a few or many at a given moment.
The pipeline step that is ready to be used in a pipeline
The pipeline step that is ready to be used in a pipeline
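The micro-batching idea above can be sketched with a plain Node Transform. This is a minimal illustration, not the SDK's implementation: it handles only a fixed event count, while the real step's BatchOptions also allow time- and size-based flushing.

```typescript
import { Transform } from "node:stream";

// Buffer `size` events and emit them downstream as a single array,
// so the next pipeline step receives micro-batches instead of
// one event at a time.
function batch<T>(size: number): Transform {
  let buffer: T[] = [];
  return new Transform({
    objectMode: true,
    transform(event: T, _enc, done) {
      buffer.push(event);
      if (buffer.length >= size) {
        this.push(buffer); // send the full batch on
        buffer = [];
      }
      done();
    },
    flush(done) {
      if (buffer.length > 0) this.push(buffer); // emit the partial final batch
      buffer = [];
      done();
    },
  });
}
```

A database-writing step downstream could then issue one query per array of 100 events instead of one query per event.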
This creates a pipeline step that takes an event, logs it and then passes the event on to the next pipeline step. The log will include the event ID of the event, if it's present. This is helpful to get visibility into the pipeline.
The type of the event that flows in, gets logged and then flows unchanged to the next pipeline step
If present, log statements are prefixed with this string
If present, only log every Nth event that flows through, where N is the value of records
The pipeline step that is ready to be used in a pipeline
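The logging step above can be approximated with a pass-through Transform. This sketch is illustrative (the SDK's real step also logs the event ID when present); the prefix and every-Nth behavior match the parameters described.

```typescript
import { Transform } from "node:stream";

// Log every Nth event (optionally prefixed), then pass it on unchanged.
function log<T>(prefix?: string, records = 1): Transform {
  let seen = 0;
  return new Transform({
    objectMode: true,
    transform(event: T, _enc, done) {
      seen += 1;
      if (seen % records === 0) {
        const body = JSON.stringify(event);
        console.log(prefix !== undefined ? `${prefix} ${body}` : body);
      }
      done(null, event); // always pass the event through unchanged
    },
  });
}
```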
This creates a pipeline step that takes an event, logs it and then passes the event on to the next pipeline step. The log will include the event ID of the event, if it's present. This is helpful to get visibility into the pipeline.
The type of the event that flows in, gets logged and then flows unchanged to the next pipeline step
If present, only log every Nth event that flows through, where N is the value of records
The pipeline step that is ready to be used in a pipeline
Creates a pipeline step that can act as a noop sink.
Sometimes you don't care to push data anywhere when you have a pipeline, but you need the fine-grained control of making your own pipeline. When that's the case, use this to create a final pipeline step, a sink, to end your pipeline.
Pipelines must have a sink or data won't flow, since Node streams pull data starting at the sink: the sink asks the previous pipeline step for data, that step asks the one before it, and so on. No sink means no data flows. This gives you a noop sink.
The type of the data sent into this final pipeline step
If a string, logs events that come in, prefixing the log statement with the string. If this is true, logs the event. Otherwise, does nothing.
The pipeline step that is ready to be used in a pipeline
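A noop sink like the one described is just an object-mode Writable that consumes and discards everything. This sketch is illustrative, not the SDK's implementation; the optional `divert` parameter mirrors the string/boolean logging behavior described above.

```typescript
import { Writable } from "node:stream";

// A sink that keeps data flowing through the pipeline but discards it,
// optionally logging each event first.
function devnull<T>(divert?: string | boolean): Writable {
  return new Writable({
    objectMode: true,
    write(event: T, _enc, done) {
      if (typeof divert === "string") console.log(divert, event);
      else if (divert === true) console.log(event);
      done(); // discard the event
    },
  });
}
```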
Helper function to turn a timestamp into an RStreams event ID.
The timestamp you want to turn into an RStreams event ID which can be anything used to construct a Moment object
Specify the granularity of the event ID, maybe just year/month or year/month/hour, etc.
The generated event ID
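To make the granularity idea concrete, here is a sketch of building a date-bucketed event ID from a timestamp. The exact "z/YYYY/MM/DD/HH/mm" path format below is an assumption for illustration only; consult the SDK for the real event ID format.

```typescript
// Illustrative only: assumes RStreams event IDs bucket a UTC timestamp
// into a slash-separated date path whose depth depends on granularity.
function eventIdFromTimestamp(
  ms: number,
  granularity: "month" | "minute" = "minute"
): string {
  const d = new Date(ms);
  const pad = (n: number) => String(n).padStart(2, "0");
  const parts = ["z", String(d.getUTCFullYear()), pad(d.getUTCMonth() + 1)];
  if (granularity === "minute") {
    // Deeper granularity adds day, hour, and minute buckets.
    parts.push(pad(d.getUTCDate()), pad(d.getUTCHours()), pad(d.getUTCMinutes()));
  }
  return parts.join("/");
}
```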
Helper function to turn an RStreams event ID into a timestamp.
The event ID to turn into an epoch timestamp
The timestamp as a time since the epoch
This creates a pipeline step that will parse events from a CSV file and send them to the next step. Underneath the covers it uses the popular fast-csv node library.
List of fields to transform, or true to build the header list dynamically
fast-csv options: https://c2fo.github.io/fast-csv/docs/parsing/options
The pipeline step that is ready to be used in a pipeline
This creates a pipeline step that can act as the first step in a pipeline, the source, which reads data from an S3 file.
What to read from.
The name of the S3 bucket to read from
The name of the file in the bucket to read from
Read from a specific range in the file. This is a string that must look like bytes=<startingByteOffset>-<endingByteOffset>, the standard HTTP Range header format with zero-based, inclusive byte offsets.
The pipeline step that is ready to be used in a pipeline
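A small helper can build that range string safely. The function name below is hypothetical; the `bytes=<start>-<end>` format follows the standard HTTP Range header syntax described above.

```typescript
// Build the "bytes=<start>-<end>" string the range option expects.
// Offsets are zero-based; the end offset is inclusive (HTTP Range syntax).
function byteRange(start: number, end: number): string {
  if (!Number.isInteger(start) || !Number.isInteger(end) || start < 0 || end < start) {
    throw new Error(`invalid byte range: ${start}-${end}`);
  }
  return `bytes=${start}-${end}`;
}
```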
The pipeline step that joins an external data source with events
Creates a pipeline step that will log events as they pass through which can be helpful for debugging in between streaming operations.
The type of the data that flows through the step to be logged
If provided, this prefix is included with each log
The pipeline step that is ready to be used in a pipeline
A pipeline step that will split and parse JSON lines, turning them into events.
If true and a line fails to parse, the error and the offending JSON line are skipped and processing continues. Defaults to false.
The pipeline step that is ready to be used in a pipeline
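The split-and-parse behavior above can be sketched with a Node Transform. This is a minimal illustration (the SDK's real step may differ); it buffers partial lines across chunks and honors a `skipErrors` flag like the one documented.

```typescript
import { Transform } from "node:stream";

// Split incoming text on newlines and parse each line as JSON,
// emitting one object per line to the next pipeline step.
function parseJsonLines(skipErrors = false): Transform {
  let remainder = "";
  return new Transform({
    readableObjectMode: true,
    transform(chunk: Buffer, _enc, done) {
      const lines = (remainder + chunk.toString("utf8")).split("\n");
      remainder = lines.pop() ?? ""; // last piece may be an incomplete line
      for (const line of lines) {
        if (line.trim() === "") continue;
        try {
          this.push(JSON.parse(line));
        } catch (err) {
          if (!skipErrors) return done(err as Error);
          // skipErrors: drop the bad line and keep going
        }
      }
      done();
    },
    flush(done) {
      if (remainder.trim() !== "") {
        try {
          this.push(JSON.parse(remainder));
        } catch (err) {
          if (!skipErrors) return done(err as Error);
        }
      }
      done();
    },
  });
}
```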
A callback-based version of pipeAsync. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: The sink that is the last step of the pipe
Called if something goes wrong
The pipeline itself
A callback-based version of pipeAsync. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
The type of data generated and that moves to the final step of the pipe
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 3: The sink that is the last step of the pipe
Called if something goes wrong
The pipeline itself
A callback-based version of pipeAsync. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the final step of the pipe
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 3: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 4: The sink that is the last step of the pipe
Called if something goes wrong
The pipeline itself
A callback-based version of pipeAsync. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the final step of the pipe
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 3: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 4: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 5: The sink that is the last step of the pipe
Called if something goes wrong
The pipeline itself
A callback-based version of pipeAsync. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the final step of the pipe
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 3: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 4: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 5: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 6: The sink that is the last step of the pipe
Called if something goes wrong
The pipeline itself
A callback-based version of pipeAsync. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the final step of the pipe
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 3: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 4: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 5: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 6: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 7: The sink that is the last step of the pipe
Called if something goes wrong
The pipeline itself
An async/await-friendly version of pipe. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: The sink that is the last step of the pipe
A promise so it can play nice with async/await
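The source/transform/sink shape described above behaves like Node's own promise-based pipeline, which can stand in for a sketch. This example uses only node:stream, not the SDK; the SDK's pipeAsync additionally understands RStreams queue sources and sinks.

```typescript
import { Readable, Transform, Writable } from "node:stream";
import { pipeline } from "node:stream/promises";

const results: number[] = [];
await pipeline(
  Readable.from([1, 2, 3]), // step 1: the source
  new Transform({
    objectMode: true,
    transform(n: number, _enc, done) {
      done(null, n * 10); // step 2: a transformation step
    },
  }),
  new Writable({
    objectMode: true,
    write(n: number, _enc, done) {
      results.push(n); // step 3: the sink
      done();
    },
  })
);
```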
An async/await-friendly version of pipe. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
The type of data generated and that moves to the final step of the pipe
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 3: The sink that is the last step of the pipe
A promise so it can play nice with async/await
An async/await-friendly version of pipe. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the final step of the pipe
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 3: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 4: The sink that is the last step of the pipe
A promise so it can play nice with async/await
An async/await-friendly version of pipe. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the final step of the pipe
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 3: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 4: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 5: The sink that is the last step of the pipe
A promise so it can play nice with async/await
An async/await-friendly version of pipe. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the final step of the pipe
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 3: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 4: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 5: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 6: The sink that is the last step of the pipe
A promise so it can play nice with async/await
An async/await-friendly version of pipe. Creates a pipeline of steps where the first step produces the data, which then flows to the next step and so on. The first step is the source, producing the content; the final step is the sink.
The type of data that is produced by the source
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the next step of the pipe
The type of data generated and that moves to the final step of the pipe
Pipeline step 1: The source that produces the data, the first step of the pipe
Pipeline step 2: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 3: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 4: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 5: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 6: A transformation step that takes data from the previous step and pushes the result to the next step
Pipeline step 7: The sink that is the last step of the pipe
A promise so it can play nice with async/await
Only used in advanced scenarios where you need to wrap multiple pipeline steps into a single writable stream. This is useful when you need to give some other code access to feed data into your pipeline. It is not commonly used.
The type of data that will be written to the resulting stream
The first step to be wrapped
The single pipeline step that wraps up all the steps passed in into a single step
This creates a pipeline step that takes events of type T, allows your code to process each one, and then sends an event of type U on to the next pipeline step.
The type of the data sent into the function to be processed
The type of the data you send on after being processed
The name of the bot to act as
The function to process data, getting data of type T and returning data of type U
The queue to send the resulting data to
The pipeline step that is ready to be used in a pipeline
This creates a pipeline step that turns a JavaScript object into a JSON line (the stringified object followed by a newline). This makes it easy to create JSON lines files.
The pipeline step that is ready to be used in a pipeline
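The stringify step above can be sketched as a Transform that accepts objects and emits text. This is an illustration on plain node:stream, not the SDK's implementation.

```typescript
import { Transform } from "node:stream";

// Turn each incoming object into one JSON line: the stringified
// object followed by a newline, ready to append to a JSON lines file.
function stringify(): Transform {
  return new Transform({
    writableObjectMode: true, // objects in, text out
    transform(obj: unknown, _enc, done) {
      done(null, JSON.stringify(obj) + "\n");
    },
  });
}
```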
This creates a callback based pipeline step that will take data in, possibly transform the data or do computation, and then sends the result on to the next step in the pipeline.
The type of the data sent in to be passed through this step.
The type of data to be sent on to the next step in the pipeline.
A function that does the work of taking the data in, doing something with it and then calling the done function when done.
The first argument is a this parameter that TypeScript erases at compile time; it exists only so the this keyword is typed inside the function, where it will be the transform stream instance itself, which can be useful. For example, say you want to push an event to a queue from inside the function. You could do that by calling this.push to push the event to a queue while still sending the event on to the next step in the pipeline afterwards. So, the first real argument your function receives is obj, the data event sent in to be processed/transformed and sent on to the next pipeline step. The second argument is done, which you call when you're done. Call done() if there's no error but you want to filter out this event and not pass it on to the next pipeline step. Call done(err) if an error occurred, where err is a string or Error object. Call done(null, result) when there is no error and you want to pass an event on to the next step in the pipeline, where result is of type U, the type of object being sent on.
A function to be called when the entire pipeline has been flushed to allow for cleanup, perhaps closing a DB connection.
The pipeline step that is ready to be used in a pipeline
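The done-callback semantics above can be sketched on a plain Node Transform. This is illustrative, not the SDK's implementation: `done()` with no result filters the event, `done(err)` fails the stream, and `done(null, result)` passes the result on.

```typescript
import { Transform } from "node:stream";

type DoneCallback<U> = (err?: Error | string | null, result?: U) => void;

// A callback-style through step: fn receives each event and a done
// callback; this inside fn is the transform stream instance.
function through<T, U>(
  fn: (this: Transform, obj: T, done: DoneCallback<U>) => void
): Transform {
  return new Transform({
    objectMode: true,
    transform(obj: T, _enc, callback) {
      fn.call(this as Transform, obj, (err, result) => {
        if (err) callback(err instanceof Error ? err : new Error(String(err)));
        else callback(null, result); // an undefined result filters the event
      });
    },
  });
}
```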
This creates an async/await-friendly pipeline step that will take data in, possibly transform the data or do computation, and then sends the result on to the next step in the pipeline.
The type of the data sent in to be passed through this step.
The type of data to be sent on to the next step in the pipeline.
A function that does the work of taking the data in, doing something with it and then rejecting or resolving
the promise with the result object type U. If you resolve with no result data, the event is skipped and not sent to the next pipeline step.
The first argument is a this parameter that TypeScript erases at compile time; it exists only so the this keyword is typed inside the function, where it will be the transform stream instance itself, which can be useful. For example, say you want to push an event to a queue from inside the function. You could do that by calling this.push to push the event to a queue while still sending the event on to the next step in the pipeline afterwards. So, the first real argument your function receives is obj, the data event sent in to be processed/transformed and sent on to the next pipeline step.
A function to be called when the entire pipeline has been flushed to allow for cleanup, perhaps closing a DB connection.
The pipeline step that is ready to be used in a pipeline
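The async variant can be sketched the same way: resolve with a value to pass it on, resolve with undefined to skip the event, reject to fail the stream. Again, this is an illustration on plain node:stream, not the SDK's implementation.

```typescript
import { Transform } from "node:stream";

// An async/await-friendly through step: fn returns a promise whose
// resolved value (if any) is sent to the next pipeline step.
function throughAsync<T, U>(
  fn: (this: Transform, obj: T) => Promise<U | undefined>
): Transform {
  return new Transform({
    objectMode: true,
    transform(obj: T, _enc, callback) {
      fn.call(this as Transform, obj).then(
        (result) => callback(null, result), // undefined is filtered out
        (err) => callback(err instanceof Error ? err : new Error(String(err)))
      );
    },
  });
}
```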
This creates a pipeline step that will create a CSV file from the events that flow into this step. Underneath the covers it uses the popular fast-csv node library.
List of fields to transform, or true to build the header list dynamically
fast-csv options: https://c2fo.github.io/fast-csv/docs/parsing/options
The pipeline step that is ready to be used in a pipeline
This creates a pipeline step meant to be the last step in a pipeline, the sink, that writes events that flow into it into S3. You should micro-batch events before getting to this step to control how many events to write to the file.
The name of the AWS S3 bucket to write the file to
The name of the file to write.
The pipeline step that is ready to be used in a pipeline
Wraps a command function as a writable stream.
Wraps a command function as a writable stream.
Generated using TypeDoc
The type of the data to be processed