Custom Stream API Documentation

The Custom Stream API has recently been rewritten from the ground up on a new architecture that offers improved reliability, scalability and functionality, and provides a platform for launching new Data Hub features and applications over the coming months. The API is now built on MongoDB, allowing users to tap into many of MongoDB’s advanced querying methods. Whilst the functionality of the old stream API is maintained, we advise users to read through this documentation and update any scripts or sensors that push or pull data via the API accordingly.


Pushing data to a dataset

Pushing new data to a dataset is performed through an HTTP PUT request using Basic Authentication, with the dataset access key supplied as the authentication username and the password left blank. The URL for pushing data using the API is as follows:

<dataset-uuid>/
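As a minimal sketch, the push request above might be built with Python's standard library as follows. The host placeholder, access key and field names are illustrative assumptions, not part of the API; substitute your own dataset UUID and access key.

```python
import base64
import json
import urllib.request

# Illustrative placeholders -- substitute your own endpoint and credentials.
PUSH_URL = "https://<data-hub-host>/<dataset-uuid>/"
ACCESS_KEY = "your-dataset-access-key"

def build_push_request(document):
    """Build the PUT request carrying a JSON document as the request body."""
    credentials = base64.b64encode(f"{ACCESS_KEY}:".encode()).decode()  # blank password
    return urllib.request.Request(
        PUSH_URL,
        data=json.dumps(document).encode("utf-8"),
        method="PUT",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {credentials}",  # access key as username
        },
    )

def push_reading(document):
    """Send the document to the dataset and return the HTTP status code."""
    with urllib.request.urlopen(build_push_request(document)) as response:
        return response.status
```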

Data should be formatted as a JSON document and passed as the HTTP request body, using application/json as a Content-Type HTTP request header. A typical JSON document to be added to a dataset might look as follows:
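As a purely illustrative example (field names and values are hypothetical), a sensor reading pushed to a dataset might be formatted as:

```json
{
  "sensor": "lounge-thermometer",
  "temperature": 21.5,
  "humidity": 40.2
}
```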


Additional annotations to your data

Note that, as data is added to your dataset, the API automatically annotates your JSON documents with some additional values. These help add context to your data and can be useful when constructing queries to return specific ranges of data. These additional fields can be distinguished from your own, as they are prefixed with an underscore:

  • _datasetid: The UUID of the dataset the JSON document was added to
  • _timestamp: The UNIX timestamp indicating when the data point was added to the dataset
  • _timestamp_year: The year component of the timestamp
  • _timestamp_month: The month component of the timestamp
  • _timestamp_day: The day (of the month) component of the timestamp
  • _timestamp_hour: The hour component of the timestamp
  • _timestamp_minute: The minute component of the timestamp
  • _timestamp_second: The second component of the timestamp

Be aware that the timestamp values added to your data reflect the time at which the JSON document was added to your dataset via the API, not, for example, the time at which a sensor reading was actually taken. Bear this in mind when batch processing large amounts of data and writing to your dataset in bulk.
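If the original reading time matters, one option is to record it yourself in a field of your own before pushing. A minimal sketch (the `read_at` field name is our own assumption, not part of the API):

```python
import time

def timestamped(document):
    """Attach the client-side reading time; the API's _timestamp field
    records only when the document reached the dataset."""
    return {**document, "read_at": int(time.time())}  # "read_at" is our own field
```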

A data point that has been annotated with additional fields may look as follows:
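For example (all field names and values here are illustrative), the annotated version of a pushed reading might look like:

```json
{
  "sensor": "lounge-thermometer",
  "temperature": 21.5,
  "_datasetid": "123e4567-e89b-12d3-a456-426614174000",
  "_timestamp": 1465893013,
  "_timestamp_year": 2016,
  "_timestamp_month": 6,
  "_timestamp_day": 14,
  "_timestamp_hour": 8,
  "_timestamp_minute": 30,
  "_timestamp_second": 13
}
```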


Querying a dataset

A dataset can be queried through an HTTP POST request with Basic Authentication, with the dataset access key supplied as the authentication username and the password left blank. The dataset query URL is as follows:

<dataset-uuid>/

Making a simple POST request to this URL, with the appropriate authentication, will return all the data points from the dataset as an array of JSON documents. Note that this request can also be made via an HTTP GET, although GET does not support any of the advanced querying methods described below and can only be used to return all data points.


Specifying a query

A reading request can be refined using a query over the fields of the data points contained in the specified dataset. The Custom Stream API uses MongoDB’s JSON-based query language for querying data. The query document should be constructed and passed in the body of the HTTP POST request, specifying application/json as a Content-Type HTTP request header.

The simplest query, an empty JSON document ({}), returns all data points. To specify an equality condition, use a <field>: <value> expression in the query filter document; multiple equality conditions can be combined, separated by commas. Query operators can be used to construct more sophisticated queries and filter on a range of conditions: for example, the $gte operator selects values greater than or equal to a given value.
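The query documents described above can be sketched as Python dictionaries before being serialised into the request body. The field names (`sensor`, `temperature`) are illustrative, not part of the API:

```python
import json

match_all = {}                                   # empty document: return every data point
by_sensor = {"sensor": "lounge-thermometer"}     # a single equality condition
by_sensor_and_temp = {                           # multiple conditions, comma-separated
    "sensor": "lounge-thermometer",
    "temperature": 21.5,
}
warm_readings = {"temperature": {"$gte": 20}}    # range query using the $gte operator

# Each dict serialises to the JSON body of the POST request:
print(json.dumps(warm_readings))  # -> {"temperature": {"$gte": 20}}
```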


Putting the above query into practice within a script would look like the following:
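A sketch of such a script, using Python's standard library and a $gte query; the host placeholder, access key and `temperature` field are illustrative assumptions:

```python
import base64
import json
import urllib.request

# Illustrative placeholders -- substitute your own endpoint and credentials.
QUERY_URL = "https://<data-hub-host>/<dataset-uuid>/"
ACCESS_KEY = "your-dataset-access-key"

def build_query_request(query):
    """Build the POST request carrying a MongoDB-style query document."""
    credentials = base64.b64encode(f"{ACCESS_KEY}:".encode()).decode()  # blank password
    return urllib.request.Request(
        QUERY_URL,
        data=json.dumps(query).encode("utf-8"),
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {credentials}",  # access key as username
        },
    )

def query_dataset(query):
    """POST the query and return the matching data points as a list of dicts."""
    with urllib.request.urlopen(build_query_request(query)) as response:
        return json.loads(response.read().decode("utf-8"))

# Example: fetch all readings with temperature >= 20
# results = query_dataset({"temperature": {"$gte": 20}})
```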

MongoDB provides a sophisticated query language that offers functionality similar to that available in SQL. We recommend reading MongoDB’s full documentation on constructing query documents.