Low-latency, intelligent data ingest with the Apache Pulsar messaging stack

SigmaX Open Source and Optimized IoT Stack:

  • Tiered storage for enterprise and cloud
  • Full-stack geo-replication support
  • HDFS support
  • Streams and topics with Apache Pulsar
  • Higher performance than Apache Kafka
  • Lower latency than Apache Kafka
  • Apache Arrow data ingest
  • Durable writes with Apache Pulsar
  • Apache Presto for SQL query
  • 10x lower message latency
  • HOSS supported by SigmaX

An IoT and Edge optimized platform featuring:

  • Intel PAC FPGA: ultra-low-latency data ingest and data-format coercion, with pre-analytics capability.
  • Intel Optane persistent memory in App Direct or Memory Mode: deep, affordable main-memory space to accelerate your data cache.
  • NVIDIA T4 for AI and inferencing acceleration.
  • Baseline of 8x SSD drives for excellent storage bandwidth.

OPEN Platform:

Uniquely matched server hardware and open-source software, ideal for IoT and edge application spaces. Our software stack is expertly supported, and the state-of-the-art hardware has been integrated and tested.

Why SigmaX node.IoT?

Compare Kafka to Pulsar

It goes beyond latency and performance. Pulsar has built-in data-reliability features such as geo-replication. It offers both streaming and queuing, a distributed log, tiered storage, and end-to-end encryption.

Your most valuable data is your most recent

In-hardware data coercion to the ultra-efficient Apache Arrow data format, paired with very deep and cost-effective Intel Optane memory. The result? Lower latency, higher message throughput, and more data available for immediate processing.

Based in and Supported from the USA

Based in Virginia, SigmaX has made significant investments in US-based development and support. We offer customization services and include options for government customers.


Ingest directly to Apache Arrow

The Apache Arrow format is incredibly efficient and flexible: it is directly queryable, resident in memory, and supports durable writes. node.IoT uses FPGA hardware assist to coerce your data into the Apache Arrow format in real time.

Real, Real-time processing

Real-time meets near-real-time messaging. Deploy real-time pre-analytics in the FPGA hardware-assisted ingest path to feature-extract or transform your data before it arrives in a Pulsar stream and topic.
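The pre-analytics step can be sketched in plain Python (node.IoT does this kind of transform in FPGA hardware; the function below is an illustrative stand-in, and the payload layout is an assumption):

```python
import json
import statistics

def extract_features(raw_payload: bytes) -> bytes:
    """Illustrative pre-analytics: reduce a raw sample window to summary
    features before the message is published to a Pulsar topic."""
    window = json.loads(raw_payload)      # assumed shape: {"samples": [...]}
    samples = window["samples"]
    features = {
        "mean": statistics.fmean(samples),
        "peak": max(samples),
        "count": len(samples),
    }
    return json.dumps(features).encode()

# A producer would then publish the reduced message, e.g. with the
# pulsar-client package (broker URL and topic name are placeholders):
#   client = pulsar.Client("pulsar://localhost:6650")
#   producer = client.create_producer("persistent://public/default/features")
#   producer.send(extract_features(raw_payload))
print(extract_features(b'{"samples": [1.0, 2.0, 3.0]}'))
```

Reducing each window to features before publish shrinks both the message size and the downstream processing load.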

Distributed and scalable

Support for tiered storage: move old data out to S3 cloud buckets or HDFS, with Apache Presto built in for distributed SQL query capability.
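Tiered storage is configured in Pulsar itself. A minimal sketch of the broker settings for S3 offload (bucket and region values are placeholders):

```
# broker.conf — offload closed ledger segments to S3 (example values)
managedLedgerOffloadDriver=aws-s3
s3ManagedLedgerOffloadBucket=example-pulsar-offload
s3ManagedLedgerOffloadRegion=us-east-1
```

Offload can then be triggered per topic, for example with `bin/pulsar-admin topics offload --size-threshold 10G <topic>`.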

Get 60% of your CPU back

A foundational philosophy behind SigmaX stack development is the elimination of waste. As much as 60% of your CPU cycles can be wasted preparing data for query. We avoid ETL and SerDes operations, freeing your CPU to do more important things.

Built to work with node.AI and node.HDFS

node.HDFS is a storage-bandwidth-optimized Hadoop storage solution offering significant drive bandwidth and depth. node.AI is the SigmaX standard for hardware-accelerated AI and ML algorithm execution.

Enterprise Blockchain

With enterprise blockchain support you can establish a secure and trusted path for edge data to transact against smart contracts. SigmaX includes IPFS and Ethereum support connected to our Pulsar message queue to do just that!

Node.IoT Server

FPGA: Intel Arria 10 PAC

  • QSFP+ 4x 10G network ports
  • 1,150K logic elements
  • Intel Acceleration Stack support

GPU: NVIDIA T4 universal deep-learning accelerator

Memory: Intel Optane 512GB standard

  • Application Direct Mode and Memory Mode supported
  • Quad 128GB Intel Optane 2666 SR DIMMs

Server Hardware

  • Chassis: 2U, 27.8″ deep
  • CPU: Dual Xeon Cascade Lake 4215R (8C/16T, 3.2GHz, 11MB cache, 9.6GT/s)
  • Memory: 128GB DDR4-2933 2Rx8 (8x 16GB DIMMs), LP ECC
  • Storage: 8x Intel SSD, 240GB each, 6Gb/s, 3D TLC, 2.5″
  • RAID: 0, 1, 5, 6, 10, 50, 60 with Broadcom SuperCap cache protection
  • Ports: 1x QSFP+ 40G (FPGA); 2x 25G SFP28 LAN; 1x RJ45 dedicated IPMI LAN

Did you know?

Intel Optane persistent memory offers three big benefits to an edge system: #1 greater message throughput, #2 lower message latency, and #3 more recent data in memory.