Beeks are committed to open architecture and offer a choice of interfaces for accessing Beeks Analytics output. This approach allows you to:

  • Match the interface to your requirements
    Use the RESTful API for request/response access to analytics, or the Core Data Feed publish/subscribe streaming interface for consuming individual messages or for ongoing big data system integration.

  • Consume high-volume data as close to the source as possible, with minimal processing overhead
    If you want to consume high-volume data cost-effectively (e.g., near real-time message decodes or notifications of gaps), you need costs and deployments that scale linearly, rather than the wasteful step costs of appliance-based licensing models.

  • Perform interactive analysis directly within the analytics platform using SQL
    Use the Beeks Analytics CDF-Q interface to query decoded messages and statistics in situ from the QuestDB timeseries store, ideal for dynamic exploration, rapid prototyping, or building ad hoc dashboards without needing to egress the data.

This document includes references to the CDF-Q interactive interface, which allows you to query Beeks Analytics data in QuestDB directly using SQL. Learn more about CDF-Q in the Analytics Concepts Guide.
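
As a minimal sketch of the kind of interactive query CDF-Q enables, the Python snippet below connects to QuestDB over its PostgreSQL wire protocol and runs a SAMPLE BY aggregation. The hostname, credentials, table name and column names are illustrative assumptions, not the documented Beeks Analytics schema.

```python
# Minimal sketch of an ad hoc CDF-Q style query against QuestDB.
# Assumptions: QuestDB is reachable on its default PostgreSQL wire port (8812)
# with default credentials, and decoded messages live in a table called
# "decoded_messages" with "timestamp", "symbol" and "latency_us" columns.
import psycopg2

conn = psycopg2.connect(
    host="analytics-appliance.example.com",  # assumed hostname
    port=8812,                               # QuestDB default PG wire port
    user="admin",
    password="quest",
    dbname="qdb",
)

with conn.cursor() as cur:
    # SAMPLE BY is QuestDB's native time-series aggregation.
    cur.execute(
        """
        SELECT timestamp, symbol, avg(latency_us) AS avg_latency_us
        FROM decoded_messages
        WHERE timestamp > dateadd('m', -15, now())
        SAMPLE BY 1m
        """
    )
    for row in cur.fetchall():
        print(row)

conn.close()
```

Because the query runs in situ, none of the underlying decoded messages need to leave the appliance.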

Core Data Feed

The Beeks Core Data Feed (CDF) streaming interface is the strategic interface for programmatically accessing data in Beeks Analytics.
The CDF streaming interface:

  • provides high-volume access to data from VMX-Analysis and VMX-Capture, enabling you to realise real-time insights, drive optimisation, and keep your big data systems up to date with decoded data from the wire.

  • streams at extremely high volumes. At STAC in October 2024, we demonstrated streaming at 300 million messages per appliance. See the Beeks Analytics Performance Guide for details.

  • is a Publish/Subscribe interface.

  • streams analytics data such as network telemetry, network session information, and network metrics, as well as business information such as correlated business transactions.

  • can provide an alert stream.

  • streams decoded messages (captured from the wire) to downstream systems for tick storage, quant analysis, etc.

  • accesses data using the CNCF OpenTelemetry framework to empower you to move beyond log data by integrating metrics, spans, and distributed traces.

  • can seamlessly publish analytics events and decoded message flows into a standard Kafka infrastructure.
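
As an illustration of the Kafka integration, the sketch below subscribes to a CDF topic and deserialises each event. The broker address, topic name and JSON payload layout are assumptions made for this example; the actual topics and schemas are covered in the CDF documentation.

```python
# Minimal sketch of a downstream consumer for CDF events published to Kafka.
# The broker address, topic name and message layout are illustrative assumptions.
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "beeks.cdf.decoded-messages",             # assumed topic name
    bootstrap_servers=["kafka-broker:9092"],  # assumed broker address
    group_id="tick-store-loader",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="latest",
)

for record in consumer:
    event = record.value
    # Hand the decoded message or analytics event off to a tick store,
    # quant pipeline, alerting system, etc.
    print(event)
```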

CDF acceleration with QuestDB

Newer versions of Beeks Analytics introduce QuestDB as the fast, open time-series database at the core of the system, enabling efficient SQL-based querying over wire data. To enhance portability and high-speed data exchange, QuestDB integrates seamlessly with Apache Parquet and Apache Arrow.

Apache Parquet allows Beeks to export decoded wire data in a compressed, columnar format that's ideal for storage and analytical workflows - making it easy to move data into cloud data lakes or downstream tools like Spark, Snowflake, or Python-based analytics.
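
For example, an exported Parquet file of decoded wire data can be loaded straight into a Python analytics workflow. The file name and column names below are assumptions for illustration only.

```python
# Minimal sketch: load an exported Parquet file of decoded wire data.
# The file name and column names are illustrative assumptions.
import pyarrow.parquet as pq

table = pq.read_table("decoded_messages_2025-01-01.parquet")
df = table.to_pandas()

# Columnar storage makes scans over a handful of fields cheap,
# e.g. simple per-symbol latency statistics.
print(df.groupby("symbol")["latency_us"].describe())
```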

Apache Arrow, in contrast, is optimized for in-memory analytics and inter-process communication. It's used for real-time streaming of decoded telemetry, especially when ultra-high throughput and low latency are critical. Arrow enables zero-copy transfers between systems - eliminating bottlenecks common with JSON or CSV formats.
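
The sketch below shows what consuming an Arrow IPC stream can look like from Python; the endpoint and the schema of the batches are assumptions for illustration, and the actual CDF transport options are described in the CDF documentation.

```python
# Minimal sketch: read record batches from an Arrow IPC stream over a socket.
# The endpoint and the schema of the batches are illustrative assumptions.
import socket
import pyarrow.ipc as ipc

sock = socket.create_connection(("cdf-stream.example.com", 9400))  # assumed endpoint
with sock.makefile("rb") as stream:
    reader = ipc.open_stream(stream)
    for batch in reader:
        # Each RecordBatch arrives in Arrow's columnar format, so there is no
        # per-row parsing step as there would be with JSON or CSV.
        print(batch.num_rows, batch.schema.names)
sock.close()
```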

By combining QuestDB's SQL-native time-series capabilities with Parquet and Arrow, Beeks delivers both flexibility and speed: historical insights at scale, real-time streaming at the edge, and seamless interoperability with the broader data ecosystem. Together, they unlock data agility across the entire AI and data storage pipeline - from packet capture to predictive analytics.

The role of QuestDB in the overall Beeks Analytics architecture is covered in the Analytics Concepts Guide, and you can also read more about why Beeks selected QuestDB.

Request/response APIs

Beeks Analytics REST API

The REST API provides an interface to programmatically access information from within the Beeks Analytics system. The Analysis Server (part of the VMX-Analysis component of the overall Beeks Analytics architecture) presents this REST API to users.

The REST API is a request/response API.

Although the REST API is presented by the Analysis Server, which is part of the VMX-Analysis architectural layer, the REST API is also used to query data from the VMX-Capture architectural layer.
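
As a minimal sketch of a request/response call (the base URL, endpoint path, parameters and response shape are hypothetical; the Swagger documentation linked from the Worked Examples Guide lists the real endpoints):

```python
# Minimal sketch of a request/response call to the Analysis Server REST API.
# The base URL, endpoint path, parameters and response shape are illustrative
# assumptions; see the Swagger documentation for the real API.
import requests

BASE_URL = "https://analysis-server.example.com/api"  # assumed base URL

response = requests.get(
    f"{BASE_URL}/sessions",  # hypothetical endpoint
    params={"from": "2025-01-01T00:00:00Z", "limit": 100},
    timeout=10,
)
response.raise_for_status()
for session in response.json():
    print(session)
```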

Clients can deploy appliances that are used exclusively for packet capture and/or streaming of decoded data to their own systems. These appliances do not carry the overhead of running a full VMX-Analysis instance; in these cases, a thin deployment of VMX-Analysis that provides a more limited REST API and Grafana data source can be installed on the appliance.

For more information on the REST API, see the Beeks Analytics REST API Worked Examples Guide, which also contains links to the Swagger documentation.

VMX-Explorer Grafana Datasources

VMX-Explorer presents two Beeks Analytics Grafana data sources to the user. The advantages of a dedicated Beeks Analytics Grafana data source over direct access to the REST API are as follows:

  • Each data source provides its own query editor, which allows you to easily create, edit, and modify custom queries to access the data that you need.

  • Having a data source allows you to use the Grafana Explore interface to understand the underlying data and to help fast-track the development of custom queries and visualisations.

  • The back-end data source allows you to take advantage of Grafana’s rich alerting rules to create custom alerts.

For more information on the VMX-Explorer Grafana datasources, see the Beeks Analytics User Guide for VMX-Explorer.

CDF vs REST API

So when is it better to use the Core Data Feed streaming interface instead of the REST API? The Core Data Feed streaming interface is better suited to interfacing directly with the VMX-Capture layer, or to maintaining a continuously updating stream of information from the VMX-Analysis server.

The Core Data Feed streaming interface:

  • is a Publish/Subscribe messaging feed.

  • outputs Agent Events, Intervals, Timeseries, Associations, Alerts, and configuration messages (a short dispatch sketch follows this list).

  • offers pre-filtering of messages to reduce volume.

  • offers reliable rather than guaranteed messaging, allowing messages to be dropped if a consumer cannot keep up.
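
The dispatch sketch below illustrates one way a consumer might route the message types listed above. The "type" field name and the type labels are assumptions for illustration; the actual CDF message schema is defined in the CDF documentation.

```python
# Minimal sketch of routing CDF messages by type once they have been consumed
# (for example via the Kafka consumer shown earlier). The "type" field and the
# labels are illustrative assumptions, not the documented CDF schema.
def handle_cdf_message(message: dict) -> None:
    msg_type = message.get("type")
    if msg_type == "Alert":
        print("alert:", message)
    elif msg_type in ("Interval", "Timeseries"):
        print("metric update:", message)
    elif msg_type == "AgentEvent":
        print("agent event:", message)
    else:
        # Associations, configuration messages and anything unrecognised.
        print("other:", message)

# Example usage with a synthetic message.
handle_cdf_message({"type": "Alert", "detail": "gap detected on session 42"})
```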