Most modern businesses generate new data constantly. And for most businesses, that data has a shelf life. The longer data sits unprocessed and not acted upon, the less valuable it becomes.

Of course, the rate at which its value declines depends on the speed of the business. A global online retailer selling thousands of articles of clothing a minute on Black Friday has very different data needs from a small, brick-and-mortar boutique in a rural town on a Tuesday morning. Frogger’s ability to safely cross the road depends on how many cars there are, how fast they’re going, and how quickly he can hop.

For real-time applications, the value of data drops precipitously as time passes. Real-time analytics, then, is the synapse that connects high-speed data generation to processes and activities that keep the business alive and profitable. It keeps companies on top of the constant barrage of data that could be (and should be) informing their decisions, and it lets them act upon that data at peak value. It’s the system whereby Frogger nimbly springs from lane to lane, one hop ahead of his wheeled nemeses.

Said as simply as possible, real-time analytics is the ability to do something valuable with data as quickly as it is generated. And for data-intensive businesses, this is a hard problem.

In order to be valuable to businesses, real-time analytical architectures have to deal with three data attributes: freshness, latency, and concurrency.

Freshness is the delay between when data is created through a real-world event and when it is available to act upon. In SQL terms, you could think of freshness as the difference between the result of the `now()` function and the latest timestamp in your database.
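To make that concrete, here is a minimal sketch of a freshness check. The `events` table and its `inserted_at` column are hypothetical names chosen for illustration; the query is just the comparison described above, `now()` minus the newest timestamp.

```sql
-- Freshness: how far behind real-world events is the newest row?
-- `events` and `inserted_at` are hypothetical names for illustration.
SELECT now() - max(inserted_at) AS freshness_lag
FROM events;
```

If `freshness_lag` comes back in seconds, you’re in real-time territory; if it comes back in hours, you’re doing batch analytics.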
Latency is how quickly you can request that data and get a result. It’s how long it takes your dashboard to load when you click refresh.
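One rough way to see this number for yourself, assuming a Postgres database and the same hypothetical `events` table as above, is psql’s built-in `\timing` switch, which reports the round-trip time of each query:

```sql
-- In psql, report round-trip time for every query that follows.
\timing on

-- A dashboard-style query: how many events arrived in the last five minutes?
-- `events` and `inserted_at` are hypothetical names for illustration.
SELECT count(*)
FROM events
WHERE inserted_at > now() - interval '5 minutes';
```

The duration psql prints after the result is roughly the latency your dashboard’s refresh button would feel.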
Concurrency is how many different clients want low-latency access to fresh data at the same time.

Frankly, it’s really hard to build systems that do all three well. It’s like that sign you see at some hole-in-the-wall restaurants: good, fast, or cheap. Pick two.
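Concurrency is harder to capture in a single query, but as a rough sketch, Postgres ships a built-in `pg_stat_activity` view that lists every connected client; counting the active ones gives a snapshot of concurrent demand:

```sql
-- Concurrency snapshot: how many clients are running a query right now?
-- pg_stat_activity is a built-in Postgres view.
SELECT count(*) AS active_queries
FROM pg_stat_activity
WHERE state = 'active';
```

Watching that count climb while freshness and latency hold steady is, in effect, the test a real-time system has to pass.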
To be able to ingest massive volumes of data being generated constantly, enrich and transform that data, and expose that data to applications and interfaces accessed by many concurrent users is still exceptionally difficult, and it’s not something that established data architectures were built for.

Data warehouses, in particular, were built to solve a particular problem: complex, batch analysis over huge volumes of historical data. But when it comes to building low-latency, high-concurrency applications on top of these massive volumes of fresh data, they fail. They simply can’t keep up with real-time demands. There’s a technical reason for this, and I’ll cover it in a subsequent blog.

*Typical data architectures: Fast & fresh won’t be concurrent.*

Real-time analytics is certainly a growing space within the larger data ecosystem, but historically it has only been achievable for big players with deep pockets who can dedicate large teams to building and maintaining beefy infrastructure.

That’s changing. To begin with, serverless technologies, including DBaaS (Database as a Service) offerings, are becoming increasingly widespread. They’re developer-friendly and accelerate the ability to write differentiated code. There’s a very good chance that analytical databases will follow the path of general compute: the stuff under the hood will eventually become a commodity.

But the headache doesn’t stop at the database. Real-time analytics requires so much more than a scalable DBMS.

First: Developer experience matters. When you’re engaged in any new data project, your ability to gain insight from your data relies on being able to understand and leverage the capabilities of the tools you’re using. You need to be able to get started easily, and you should never be held back as you progress. There are many good reasons people love Postgres more than MySQL.

Second, as I mentioned, the value of real-time analytics doesn’t end at “Insight” but rather at “Action.” Sure, you can run low-latency, high-concurrency queries on your database, but the only thing you’ll gain is more Insight. Action means putting that data to use within applications. Until Action is precipitated from Insight, “data-driven” businesses get locked into “explore and explain” infinity, where there is little value to be had.

There are plenty of companies going after the real-time analytics prize. But we’re doing it with a fundamentally different approach than most. We’re making it delightful for developers to build real-time applications, and we’re not stopping at Business Intelligence.