The Pop-Up Playground Party at AWS re:Invent 2019, sponsored by MongoDB and Confluent, was an exclusive, interactive event featuring immersive art, light, gaming, and play activations with an open bar, DJs, and more, at the Industrial Las Vegas.
Attendees got to create unforgettable memories (or at least killer Instagram photos) at the glowing ball pit, DIY graffiti walls, retro arcade games, and other immersive activities, such as getting a temporary tattoo in a sports car. Meanwhile, the Hood Internet and Vegas’s hottest DJs kept the soundtrack fresh all night long!
In a word, the Pop-Up Playground was: fun.
Something that is far less fun: being one of the many companies that find it increasingly impossible to keep up with the pace of business because their data platforms can't. Under those circumstances, life is anything but a party.
That’s why MongoDB and Confluent teamed up, not just to throw the best party at re:Invent but to help companies run their businesses in real time.
An effective approach to real-time data could hardly be more timely, because many companies are still trying to meet their data needs in ways that don't match modern business reality. Some are mired in 40-year-old relational database technology. Others think adopting a NoSQL database positions them for the future. They soon learn that while getting data in may be easy enough, getting insights out is a whole different story, and what they are actually positioned for are niche applications, not an effective new data strategy.
The results of pursuing the wrong data strategy can be harmful or fatal. CNBC Markets found that the average lifespan of S&P 500 companies has fallen from nearly 60 years in the 1950s to less than 20 years today. Innosight modeled the rate of attrition from the S&P 500 and predicted that over the next 10 years, half the companies will be replaced: they will falter, go out of business, or be acquired, as digital transformation lowers barriers to entry for new players and helps drive out incumbents trapped in legacy infrastructure and processes.
The traditional request-driven data architecture that still exists at many companies helps keep them trapped in legacy limitations. It requires users and applications to make requests and wait until the requested information becomes available. Waiting for data kills opportunity and agility.
By contrast, event-driven architecture, which I posted about on jeffcotrupe.com, proactively makes a stream of data from source systems (producers) available in real time. Consuming applications and services (consumers) subscribe to topics of interest and consume data at their own pace. Capturing and acting on events in real time enables systems to react automatically and immediately to events. This helps a company rapidly position itself to outflank competitors. Detecting operational errors lets it take immediate corrective action. These benefits translate into not only operational excellence and cost savings but also enhanced customer experience as the company optimizes customer-facing processes. More broadly, an event-driven architecture helps the organization improve business agility.
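The contrast with request-driven systems can be sketched with a minimal in-memory publish/subscribe example. This is an illustrative sketch only: the `EventBus` class, topic names, and event shapes are assumptions for the example, not the API of Kafka, MongoDB, or any other product.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-memory stand-in for an event streaming platform.

    Producers publish to named topics; subscribed consumers are notified
    as events arrive, rather than polling with requests and waiting.
    """

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # A consuming service registers interest in a topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # A producer emits an event; every subscriber reacts immediately.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []

# A consumer subscribes to the topic it cares about ...
bus.subscribe("orders", lambda evt: received.append(evt))

# ... and reacts the moment a producer publishes, with no request/wait cycle.
bus.publish("orders", {"order_id": 1001, "status": "created"})
```

Real platforms add durability, ordering, partitioning, and consumer-paced reads on top of this basic pattern, but the inversion is the same: data is pushed to interested consumers instead of being pulled on request.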
The key driver making real-time data an organizational imperative today is the emergence of microservices. Microservices architecture breaks monolithic applications into small, discrete services or functions. It creates self-sufficient sprint teams empowered to bring new capabilities online independently of one another and then, over time, evolve and upgrade their microservice without impacting adjacent microservices. That is the essence of agility.
As beneficial as microservices can be, though, they require the ability to work with large volumes of frequently changing data, which is a challenge many existing systems cannot meet. Trying to implement microservices on a legacy relational database incurs the pain and friction of defining a schema in the database and then re-implementing that same schema for object-relational mapping (ORM) at the application layer. Your development team has to repeat that process for each microservice, and then again for every change to the data model as application functionality evolves.
With MongoDB, data modeling for microservices is easy, which is a big reason MongoDB is at the core of many event-driven systems today. MongoDB’s flexible document model gives you the best way to work with data, lets you intelligently place data where you need it (and when, as in immediately), and gives you the freedom to run anywhere. MongoDB helps you move at the speed your users demand. It gives you the power to launch new digital initiatives and bring modernized applications to market faster, running reliably and securely at scale, unlocking insights and intelligence ahead of your competitors.
By starting with the core MongoDB data platform and binding in complementary technologies, MongoDB provides the data persistence heart of an event-driven architecture. MongoDB and Confluent work together to enable you to readily build microservices and event-driven architectures to become an agile organization.
Confluent Platform, including Apache Kafka® and Kafka Connect, is designed for massive streams of events: it sequentially writes events into commit logs, allowing real-time data movement between your services and data sources. The MongoDB Connector for Apache Kafka® — developed and supported by MongoDB engineers, and verified by Confluent as a first-class component of Confluent Platform — simplifies building robust, reactive pipelines that move events between systems. You can use MongoDB as a sink (consumer) to ingest events from Kafka topics directly into MongoDB collections, exposing the data to your services for efficient querying, enrichment, and analytics, as well as for long-term storage. You can also use MongoDB as a source (producer) for Kafka topics; in this mode, data changes are captured via Change Streams within the MongoDB cluster and published straight into Kafka topics. These capabilities enable consuming apps to react to data changes in real time.
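As a rough illustration of the sink direction, a Kafka Connect connector of this kind is typically registered with a small JSON configuration. The topic, URI, database, and collection values below are placeholders, and the exact set of supported properties should be checked against the connector's documentation for your version:

```json
{
  "name": "mongo-sink-example",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "topics": "pricing.events",
    "connection.uri": "mongodb://mongo1:27017",
    "database": "ecommerce",
    "collection": "price_changes"
  }
}
```

The source direction is configured analogously with the source connector class, pointing at the MongoDB cluster whose Change Streams should be published to Kafka.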
MongoDB-powered event-driven architectures are at work in a range of use cases including IoT and other time series applications; financial services; AI; predictive maintenance, primarily in manufacturing but also in other verticals; web activity tracking and log aggregation; and as an operational data layer (ODL) integrating and organizing siloed enterprise data to make it available to all users and consuming apps. Customers who have deployed event-driven architecture powered by MongoDB include Ticketek, EG, ao.com, Man AHL, and comparethemarket.com.
The figure below shows MongoDB and Confluent working together in an event-driven architecture supporting a microservices-based e-commerce application.
In this scenario, fuel costs to ship some items have just gone up, which could impact pricing. The cost increase produces events that are placed into Apache Kafka. The Pricing microservice consumes these events, analyzes them against existing data, and produces events conveying the new pricing. MongoDB Atlas captures this data and, through the MongoDB Connector for Apache Kafka, publishes it into Kafka topics, making the data available to all consumers. Microservices directly impacted by pricing changes, such as those that manage inventory, marketing, promotions & coupons, point of sale (POS), and the e-commerce provider's order management system (OMS), consume the price-change events and update their individual databases accordingly. MongoDB Atlas aggregates and persists data from all microservices, enriches event streams with data from other sources, including historical data, and provides a central repository. This enables applications and users to benefit from data across all microservices and provides a unified view of state across the e-commerce provider's enterprise.
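The fan-out described in this scenario can be sketched in a few lines, with a plain in-memory topic map standing in for Kafka. The service names, topic name, and event shape are assumptions for the example; in the real architecture each consumer would persist its update to its own database rather than to a dictionary.

```python
from collections import defaultdict

# In-memory stand-in for Kafka topics: topic name -> list of consumer handlers.
topics = defaultdict(list)

def subscribe(topic, handler):
    topics[topic].append(handler)

def publish(topic, event):
    # Every subscribed microservice receives the event and reacts on its own.
    for handler in topics[topic]:
        handler(event)

# Each microservice maintains its own local view of state (its "database").
inventory_prices = {}
pos_prices = {}

subscribe("price-changes", lambda e: inventory_prices.update({e["sku"]: e["new_price"]}))
subscribe("price-changes", lambda e: pos_prices.update({e["sku"]: e["new_price"]}))

# The Pricing service, having analyzed the fuel-cost increase, publishes the
# new price once; every impacted service updates itself without being asked.
publish("price-changes", {"sku": "SKU-42", "new_price": 19.99})
```

The point of the sketch is the decoupling: the Pricing service does not know or care how many downstream services consume the event, which is what lets new microservices be added without changing the producer.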