
Own Your Data With Data Streaming

Gain complete control of your Crunchtime data to power your own dashboards and analysis. 


Your data to do what you want

We made Data Streaming so you can analyze your data the way you want.


Data is captured in Crunchtime Apps

(Inventory counts, employee hours worked, etc.)

 


Data moves to Crunchtime Database


Crunchtime streams data to your Snowflake environment

Built for data experts

Data Streaming is designed with data professionals in mind. It gives you full control over your data, making it easy to extract, transform, and analyze large volumes of data exactly how you need.


Learn how top brands like Five Guys, Jersey Mike’s, and sweetgreen use Crunchtime to achieve ops excellence.


Crunchtime Data Streaming FAQ

Are other databases available for this service besides Snowflake?

No. We chose Snowflake because it is an industry leader in data lakes and reporting, and it supports egress to virtually every popular database.

What does Data Streaming include?

Crunchtime will manage the streaming data replication to a Snowflake data warehouse operated by you. This includes:

  • Monitoring the stream and availability of data delivery to the Snowflake data warehouse.
  • Synchronizing eligible tables and columns to the Snowflake data warehouse, with schema changes orchestrated as new product features are released.

What latency should be expected for data freshness?

Up to 4 hours for inventory and labor, and up to 12 hours for Zenput. This means a change made in the system will become visible in the Snowflake data warehouse within 4 hours (inventory and labor) or 12 hours (Zenput) of its creation.
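As a rough illustration of what those latency windows mean in practice, here is a hypothetical freshness check. This is not Crunchtime code; the function names and the idea of checking arrival times yourself are purely illustrative, and only the SLA values (4 hours for inventory/labor, 12 hours for Zenput) come from the answer above.

```python
from datetime import datetime, timedelta

# Latency windows stated in the FAQ: 4h for inventory and labor, 12h for Zenput.
LATENCY_SLA = {
    "inventory": timedelta(hours=4),
    "labor": timedelta(hours=4),
    "zenput": timedelta(hours=12),
}

def latest_arrival(created_at: datetime, source: str) -> datetime:
    """Latest time a change should be visible in the Snowflake warehouse."""
    return created_at + LATENCY_SLA[source]

def is_within_sla(created_at: datetime, seen_at: datetime, source: str) -> bool:
    """True if the change appeared in the warehouse inside its latency window."""
    return seen_at <= latest_arrival(created_at, source)

created = datetime(2024, 1, 1, 8, 0)
print(is_within_sla(created, datetime(2024, 1, 1, 11, 30), "inventory"))  # True: 3.5h < 4h
print(is_within_sla(created, datetime(2024, 1, 1, 13, 0), "inventory"))   # False: 5h > 4h
print(is_within_sla(created, datetime(2024, 1, 1, 19, 0), "zenput"))      # True: 11h < 12h
```

In other words, a change captured at 8:00 AM in an inventory app should be queryable in Snowflake by noon at the latest.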

What does “stream” mean?

It means that change logs are monitored on the production transaction system, and changes are moved to the Snowflake environment as they are found. This is not a “batch” process.
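The change-log approach described above can be sketched in a few lines of Python. Everything here (the event shape, table names, and offset tracking) is an illustrative stand-in, not Crunchtime's actual implementation; the point is only that each change is forwarded as soon as it appears in the log, rather than being collected into periodic batches.

```python
from dataclasses import dataclass
from typing import Iterator, List, Tuple

@dataclass
class ChangeEvent:
    # One row-level change read from the source system's change log.
    table: str
    op: str        # "insert", "update", or "delete"
    row_id: int
    payload: dict

def stream_changes(change_log: List[ChangeEvent], last_seen: int) -> Iterator[Tuple[int, ChangeEvent]]:
    """Yield (offset, event) for every log entry newer than last_seen.

    In a real streaming replicator this loop runs continuously, forwarding
    each change to the warehouse as it is found instead of on a batch schedule.
    """
    for offset, event in enumerate(change_log):
        if offset > last_seen:
            yield offset, event

# Simulated change log on the production transaction system.
log = [
    ChangeEvent("inventory_counts", "insert", 1, {"qty": 40}),
    ChangeEvent("employee_hours", "update", 7, {"hours": 6.5}),
]

applied = []
last_seen = -1
for offset, event in stream_changes(log, last_seen):
    applied.append(event.table)   # stand-in for writing the change to Snowflake
    last_seen = offset            # remember progress so nothing is re-sent

print(applied)    # ['inventory_counts', 'employee_hours']
print(last_seen)  # 1
```

Tracking the last-seen offset is what lets the replicator resume where it left off after an interruption, which is one reason log-based streaming tends to be more robust than re-running batch extracts.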