MongoDB Blog

Articles, announcements, news, updates and more

New Aggregation Pipeline Text Editor Debuts in MongoDB Compass

There's a reason why Compass is one of MongoDB's most-loved developer tools: it provides an approachable and powerful visual user interface for interacting with data on MongoDB. As part of this, Compass's Aggregation Pipeline Builder abstracts away the finer points of MongoDB's Query API syntax and provides a guided experience for developing complex queries. But what about when you want less rather than more abstraction? That's where our new Aggregation Pipeline Text Editor comes in.

Recently released in Compass, the Aggregation Pipeline Text Editor allows users to write free-form aggregations. While users could previously write and edit pipelines through a guided, structured builder organized by aggregation stage, a text-based builder can be preferable for some users. The new pipeline editor makes it easy for users to:

- See the entire pipeline without excessive scrolling through the UI
- Stay "in the flow" when writing aggregations if they are already familiar with MongoDB's Query API syntax
- Copy and paste aggregations built elsewhere (like in MongoDB's VS Code Extension) into Compass
- Use built-in syntax formatting to make pipeline text "pretty" before copying it from Compass to other tools

The Aggregation Pipeline Text Editor in Compass. Notice how, toward the top right, you can click on "Stages" to move back to the traditional stage-based Aggregation Pipeline Builder.

Ultimately, the addition of the Aggregation Pipeline Text Editor gives users more flexibility in how they build aggregations. For a more guided experience, with result previews after each new stage, the existing Aggregation Pipeline Builder will work best for most users. But when writing free-form aggregations or copying and pasting aggregation text from other tools, the Aggregation Pipeline Text Editor may be preferable. It also previews the final pipeline output, rather than the stage-by-stage preview that exists today.
Users can access both the traditional Aggregation Pipeline Builder and the new Pipeline Text Editor directly within the Aggregations tab in Compass and can switch between the two views without losing their work. To get access to the new Aggregation Pipeline Text Editor, download the latest version of Compass here. And as always, we welcome your continued feedback on how to improve Compass. If you have ideas for improving your experience with Compass, you can submit them on our UserVoice platform here. We'll have even more great features coming to Compass soon. Keep checking back on our blog for the latest news!
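To illustrate the kind of free-form pipeline you might paste into the new text editor, here is a minimal sketch. The collection and field names are hypothetical, and the tiny evaluator below only mimics what $match and $group compute so the example runs without a server; it is not the real aggregation engine.

```javascript
// A pipeline of the kind you might write in the Aggregation Pipeline Text
// Editor. The "orders" collection and its fields are made up for this example.
const pipeline = [
  { $match: { status: "shipped" } },
  { $group: { _id: "$customerId", total: { $sum: "$amount" } } },
];

// Tiny in-memory stand-in for the two stages above, just to show what the
// pipeline computes.
function runPipeline(docs) {
  const matched = docs.filter((d) => d.status === "shipped");
  const totals = new Map();
  for (const d of matched) {
    totals.set(d.customerId, (totals.get(d.customerId) || 0) + d.amount);
  }
  return [...totals].map(([_id, total]) => ({ _id, total }));
}

const sample = [
  { customerId: "a", status: "shipped", amount: 10 },
  { customerId: "a", status: "shipped", amount: 5 },
  { customerId: "b", status: "pending", amount: 7 },
];
console.log(runPipeline(sample)); // groups shipped orders per customer
```

Writing the same two stages in the stage-based builder or the text editor produces an identical pipeline; the editor simply lets you see and edit it as one block of text.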

January 26, 2023
Updates

5 Ways to Learn MongoDB

MongoDB offers a variety of ways for users to gain product knowledge, get certified, and advance their careers. In this guide, we'll provide an overview of the top five ways to get MongoDB training, resources, and certifications.

#1: MongoDB University

The best place to go to get MongoDB-certified and improve your technical skills is MongoDB University. At our last MongoDB.local London event, we announced the launch of a brand new, enhanced university experience, with new courses and features and a seamless path to MongoDB certification to help you take your skills and career to the next level. MongoDB University offers courses, learning paths, and certifications in a variety of content types and programming languages. Key features of MongoDB University include:

- Hands-on labs and quizzes
- Bite-sized video lectures
- Badges for certifications earned
- Study guides and materials

Getting certified through MongoDB University is a great way to start your developer journey. Our education offerings also include benefits for students and educators.

#2: MongoDB Developer Center

For continued self-paced learning, the MongoDB Developer Center is the place to go. The Developer Center houses the latest MongoDB tutorials, videos, community forums, and code examples in your preferred languages and tools. The MongoDB Developer Center is a global community of more than seven million developers. Within the Developer Center, you can code in different languages, integrate technologies you already use, and start building with MongoDB products, including:

- MongoDB, the original NoSQL database
- MongoDB Atlas, the cloud document database as a service and the easiest way to deploy, operate, and scale MongoDB
- MongoDB Atlas App Services, the easy way to get new apps into the hands of your users faster

#3: Instructor-led training

As an IT leader, you can help your team succeed with MongoDB instructor-led training taught live by expert teachers and consultants.
With MongoDB's instructor-led training offering, you can access courses aimed at various roles. Our Developer and Operations learning paths cover the fundamental skills needed to build and manage critical MongoDB deployments. Beyond that, our specialty courses help learners master their skills and explore advanced MongoDB features and products. You can also choose how you want to learn. MongoDB offers public remote courses, which are perfect for individuals or teams that want to send a few learners at a time. If your goal is to upskill your entire team with MongoDB, our courses can be delivered privately, either onsite or remotely. Instructor-led training also provides the opportunity for Q&A, so you can get answers to your specific questions.

#4: Resources

Beyond formal training programs, MongoDB is committed to providing thought leadership resources for those looking to dive deeper and learn more about MongoDB and database technologies in general. Our website offers an active blog with ongoing thought leadership and how-to articles, along with additional coding documentation, guides, and drivers. You can also check out the MongoDB Podcast for information about new and emerging technology, MongoDB products, and best practices.

#5: Events

You can also engage with MongoDB experts at our many events, including MongoDB World, our annual conference for developers and other IT leaders. After MongoDB World, we take our show on the road with MongoDB .local events across the globe. These events give you the opportunity to learn in a hands-on fashion and meet other MongoDB users. MongoDB also hosts MongoDB Days in various global regions, focusing on developer workshops and leveling up skills. Beyond that, you can keep up with our webinars and other learning opportunities through our Events page.

Build your own MongoDB story

Of course, many people like to learn by doing. To get started using MongoDB Atlas in minutes, register for free.

January 20, 2023
News

Hydrus Helps Companies Improve ESG Performance

More organizations are embracing workforce diversity, environmental sustainability, and responsible corporate governance in an effort to improve their Environmental, Social, and Governance (ESG) performance. As investors increasingly favor ESG in their portfolios, organizations are under greater pressure to capture, store, and verify ESG metrics. San Francisco-based startup Hydrus is helping companies make ESG data more usable and actionable.

The platform

Hydrus, a MongoDB for Startups program member, is a software platform that enables enterprises to collect, store, report, and act on their environmental, social, and governance data. ESG data includes things like:

- How a company safeguards the environment
- Its energy consumption and how it impacts climate change
- How it manages relationships with employees, suppliers, and customers
- Details about the company's leadership, executive pay, audits, and internal controls

The Hydrus platform enables organizations to collect, store, and audit diversity and environmental data, and run analytics and machine learning against that data. Hydrus offers users a first-rate UI/UX so that even non-technical users can leverage the platform. With its auditing capabilities, organizations can ensure the provenance and integrity of ESG data over time. Other solutions don't allow users to go back in time and determine who made changes to the data, why they made them, what earlier versions of the data looked like, and when the changes were made. Hydrus gives users complete visibility into these activities.

The tech stack

MongoDB Atlas was the preferred database for Hydrus because of the flexibility of the data model. George Lee, founder and CEO of Hydrus, says the traditional SQL database model was too limiting for the startup's needs. MongoDB's document model eliminated the need to create tables or enforce restrictions on data fields. With MongoDB, they could simply add fields without undertaking any major schema changes.
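The schema flexibility Lee describes can be sketched with plain objects, no server required. The collection contents and field names below are hypothetical, not Hydrus's actual data model; the point is only that two ESG records can live in the same collection even though one carries fields the other lacks.

```javascript
// Two hypothetical ESG records in the same collection: the second adds a
// field without any migration or ALTER TABLE step.
const esgMetrics = [
  { company: "Acme", year: 2022, energyKwh: 120000 },
  { company: "Acme", year: 2023, energyKwh: 110000, refrigerantKg: 30 },
];

// Helper: list every field name that appears across the collection,
// illustrating that documents need not share a fixed schema.
function distinctFields(docs) {
  return [...new Set(docs.flatMap((d) => Object.keys(d)))];
}

console.log(distinctFields(esgMetrics));
// → [ 'company', 'year', 'energyKwh', 'refrigerantKg' ]
```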
Hydrus also tapped MongoDB for access to engineers and technical resources. This enabled the company to architect its platform for all of the different types of sustainability data that exist. MongoDB technical experts helped Hydrus model data for future scalability and flexibility so it could add data fields when the need arises.

On top of Atlas and MongoDB technical support, Hydrus leans heavily on MongoDB Charts, a data visualization tool for creating, sharing, and embedding visualizations from MongoDB Atlas. Charts enables Hydrus to derive insights from ESG data, giving its Fortune 200 clients better visibility into their operational efficiency. Charts uses a drag-and-drop interface that makes it easy to build charts and answer questions about ESG data. One Hydrus customer using MongoDB Charts was better able to understand the impact of its footprint from a greenhouse gas perspective and a resource usage perspective. Another customer detected a 30x increase in refrigerant usage in one of its facilities. The visual analytics generated with MongoDB Charts enabled the company to make changes to improve its ESG performance.

MongoDB Charts enabled Hydrus to visualize sustainability data.

"MongoDB Charts enables our customers to directly report their sustainability data, customize the charts, and better tell the sustainability story in a visual format," Lee says. "It's way better than the traditional format where you have data, tables, and spreadsheets everywhere."

The roadmap

Hydrus seeks to take the hassle out of managing a sustainable business by streamlining data collection, reporting, and auditing processes. Its platform is designed to eliminate manual tasks for sustainability managers so they can focus on decarbonization, resource usage optimization, and hitting their sustainability goals.
Hydrus accelerates these activities by helping companies model their sustainability data around science-based targets so they can better decarbonize and meet other ESG goals. If you're interested in learning more about how to help your organization become more sustainable, decarbonize, and succeed in your sustainability journey, visit the Hydrus website . Are you part of a startup and interested in joining the MongoDB for Startups program? Apply now . For more startup content, check out our wrap-up of the 2022 year in startups .

January 18, 2023
Applied

Predictions 2023: Modernization Efforts in the Financial Services Industry

As a global recession looms, banks face tough economic conditions in 2023. Lowering costs will be vital for many organizations to remain competitive in a data-intensive and highly regulated environment. Thus, it's important that any IT investments accelerate digital transformation with innovative technologies that break down data silos, increase operational efficiency, and build personalized customer experiences. Read on to learn about areas in which banks are looking to modernize in 2023 to build better customer experiences at a lower cost and at scale.

Shaping a better banking future with composable designs

With banks eager to modernize and innovate, institutions must move away from the legacy systems that are restricting their ability to show progress. Placing consumers at the center of a banking experience made up of interconnected yet independent services offers technology-forward banks the chance to reshape their business models and subsequently grow market share and increase profitability. These opportunities have brought to fruition a composable architecture design that allows faster innovation, improves operational efficiency, and creates new revenue streams by extending the portfolio of services and products. Thus, banks are able to adopt the best-of-breed, perfect-fit-for-purpose software available by orchestrating strategic partnerships with relevant fintechs and software providers. This new breed of suppliers can provide everything from know your customer (KYC) services to integrated booking, loan services, or basic marketing and portfolio management functionalities. This approach is more cost efficient for institutions than building and maintaining the infrastructure themselves, and it is significantly faster in terms of time to market and time to revenue. Banks adopting such an approach see fintechs less as competitors and more as part of an ecosystem to collaborate with to accelerate innovation and reach customers.
Operational efficiency with intelligent automation

Financial institutions will continue to focus on operational efficiency and cost control by automating previously manual, paper-driven processes. Banks have made some progress digitizing and automating what were once almost exclusively paper-based, manual processes. But the primary driver of this transformation has been compliance with local regulations rather than an overarching strategy for really getting to know the client and achieving true customer delight. The market is eager for better automated and data-driven decisions, and legacy systems can't keep up. Creating the hyper-personalized experiences that customers demand, which include things like chatbots, self-service portals, and digital forensics, is difficult for institutions using outdated technology. And having data infrastructure in silos prohibits any truly integrated modern experience. Using a combination of robotic process automation (RPA), machine learning (ML), and artificial intelligence (AI), financial institutions are able to streamline processes, freeing the workforce to focus on tasks that drive a bigger impact for the customer and the business. Institutions must not digitize without considering the human interaction that will be replaced, as customers prefer a hybrid approach. The ability to act on real-time data is the way forward for driving value and transforming customer experiences, and it must be accompanied by the modernization of the underlying data architecture. The prerequisite for this goal is de-siloing data and sources into a holistic data landscape. Some call it a data mesh; others, composable data sources or virtualized data.

Solving ESG data challenges

Along with high inflation, the cost-of-living crisis, energy turmoil, and rising interest rates, environmental, social, and governance (ESG) is also in the spotlight.
There is growing pressure from regulators to provide ESG data and from investors to make sure portfolios are sustainable. The role of ESG data in conducting market analysis, supporting asset allocation and risk management, and providing insights into the long-term sustainability of investments continues to expand. The nature and variability of many ESG metrics is a major challenge facing companies today. Unlike financial datasets, which are mostly numerical, ESG metrics can include both quantitative and qualitative data to help investors and other stakeholders understand a company's actions and intentions. This complexity, coupled with the lack of a universally applicable ESG reporting standard, means institutions must consider different standards with different data requirements. To master ESG reporting, including the integration of relevant KPIs, institutions need appropriate, high-quality data that is at the right level of granularity and covers the required industries and regions. Given the data volume and complexity, financial institutions are building ESG platforms underpinned by modern data platforms that are capable of consolidating different types of data from various providers, creating customized views, modeling data, and performing operations with no barriers.

Digital payments: Unlocking an enriched experience

Pushed by new technologies and global trends, the digital payments market is flourishing globally. With a valuation of more than $68 billion in 2021 and expectations of double-digit growth over the next decade, emerging markets are leading the way in terms of relative expansion. This growth has been driven by pandemic-induced cashless payments, e-commerce, government initiatives, and fintechs. Digital payments are transforming the payments experience.
While it was once enough for payment service providers to supply account information and orchestrate simple transactions, consumers now expect an enriched experience in which each transaction offers new insights and value-added services. Meeting these expectations is difficult, especially for companies that rely on outdated technologies created long before transactions were carried out with a few taps on a mobile device. To meet the needs of customers, financial institutions are modernizing their payments data infrastructure to create personalized, secure, and real-time payment experiences, all while protecting consumers from fraud. This modernization allows financial institutions to ingest any type of data, launch services more quickly at a lower cost, and have the freedom to run in any environment, from on-premises to multi-cloud.

Security and risk management

Data is critical to every financial institution; it is recognized as a core asset to drive customer growth and innovation. As the need to leverage data efficiently increases, however, the legacy technology that still underpins many organizations is, according to 57% of decision makers, too expensive and doesn't fulfill the requirements of modern applications. Not only is this legacy infrastructure complex, it is unable to meet current security requirements. Given the huge amount of confidential client and customer data that the financial services industry deals with on a daily basis, and the strict regulations surrounding that data, security must be the highest priority. The perceived value of this data also makes financial services organizations a primary target for data breaches. Fraud protection, risk management, and anti-money laundering are high priorities for any new data platform, according to Forrester's What's Driving Next-Generation Data Platform Adoption in Financial Services study.
To meet these challenges, adoption of next-generation data platforms will continue to grow as financial institutions realize their full potential to manage costs, maximize security, and foster innovation. Download Forrester’s full study — What’s Driving Next-Generation Data Platform Adoption in Financial Services — to learn more.

January 17, 2023
Applied

How Startups Stepped Up in 2022

After muddling through the global pandemic in 2021, entrepreneurs emerged in 2022 ready to transform the way people live, learn, and work. Through the MongoDB for Startups program, we got a close-up view of their progress. What we observed was a good indication of how critical data is to delivering the transformative experiences users expect.

Data access vs. data governance

The increasing importance of data in the digital marketplace has created a conflict that a handful of startups are working to solve: granting access to data to extract value from it while simultaneously protecting it from unauthorized use. In 2022, we were excited to work with promising startups seeking to strike a balance between these competing interests. Data access service provider Satori enables organizations to accelerate their data use by simplifying and automating access policies while helping to ensure compliance with data security and privacy requirements. At most organizations, providing access to data is a manual process often handled by a small team that's already being pulled in multiple directions by different parts of the organization. It's a time-consuming task that takes precious developer resources away from critical initiatives and slows down innovation. Data governance is a high priority for organizations because of the financial penalties of running afoul of data privacy regulations and the high cost of data breaches. While large enterprises make attractive targets, small businesses and startups in particular need to be vigilant because they can less afford financial and reputational setbacks. San Francisco-based startup Vanta is helping companies scale security practices and automate compliance for the most prevalent data security and privacy regulatory frameworks. Its platform gives organizations the tools they need to automate up to 90% of the work required for security audits.
Futurology

The Internet of Things (IoT), artificial intelligence (AI), virtual reality (VR), and natural language processing (NLP) remain at the forefront of innovation and are only beginning to fulfill their potential as transformative technologies. Through the MongoDB for Startups program, we worked with several promising ventures that are leveraging these technologies to deliver game-changing solutions for both application developers and users. Delaware-based startup Qubitro helps companies bring IoT solutions to market faster by making the data collected from mobile and IoT devices accessible anywhere it's needed. Qubitro creates APIs and SDKs that let developers activate device data in applications. With billions of devices producing massive amounts of data, the potential payoff in enabling data-driven decision making in modern application development is huge. London-based startup Concured uses AI technology to help marketers know what to write about and what's working for themselves and their competitors. It also enables organizations to personalize experiences for website visitors. Concured uses NLP to generate semantic metadata for each document or article and to understand the relationship between articles on the same website. Another London-based startup using AI and NLP to deliver transformative experiences is Semeris. Analyzing legal documents is a tedious, time-consuming process, and Semeris enables legal professionals to reduce the time it takes to extract information from documentation. The company's solution creates machine learning (ML) models based on publicly available documentation to analyze less-seen or more private documentation that clients have internally. The language we use in day-to-day communication says a lot about our state of mind. Sydney-based startup Pioneera looks at language and linguistic markers to determine if employees are stressed out at work or at risk of burnout.
When early warning signs are detected, the person gets the help they need, confidentially and in real time, to reduce stress, promote wellness, and improve productivity. Technologies like AR and VR are transforming learning for students. Palo Alto-based startup Inspirit combines 3D and VR instruction to create an immersive learning experience for middle and high school students. The platform helps students who love science engage with the subject matter more deeply and those who dislike it experience it in a more compelling format.

No code and low code

The startup space is rich with visionary thinkers and ideas. But the truth is that you can't get far with an idea if you don't have access to developer talent, which is scarce and costly in today's job market. We've worked with a couple of companies through the MongoDB for Startups program that are helping entrepreneurs breathe life into their ideas with low- and no-code solutions for building applications and bringing them to market. Low- and no-code platforms enable users with little or no coding background to satisfy their own development needs. For example, Alloy Automation is a no-code integration solution that integrates with and automates ecommerce services, such as CRM, logistics, subscriptions, and databases. Alloy can automate SMS messages, automatically start a workflow after an online transaction, determine if follow-up action should be taken, and automate actions in coordination with connected apps. Another example is Thunkable, a no-code platform that makes it easy to build custom mobile apps without any advanced software engineering knowledge or certifications. Thunkable's mission is to democratize mobile app development. It uses a simple drag-and-drop design and powerful logic blocks to give innovators the tools they need to breathe life into their app designs.
The startup journey Although startups themselves are as diverse as the people who launch them, all startup journeys begin with the identification of a need in the marketplace. The MongoDB for Startups program helps startups along the way with free MongoDB Atlas credits, one-on-one technical advice, co-marketing opportunities, and access to a vast partner network. Are you a startup looking to build faster and scale further? Join our community of pioneers by applying to the MongoDB for Startups program. Apply now .

January 16, 2023
Applied

Improving Building Sustainability with MongoDB Atlas and Bosch

Every year, developers from more than 45 countries head to Berlin to participate in the Bosch Connected Experience (BCX) hackathon, one of Europe's largest AI and Internet of Things (AIoT) hackathons. This year, developers were tasked with creating solutions to tackle a mix of important problems, from improving sustainability in commercial building operations and facility management to accelerating innovation of automotive-grade, in-car software stacks, using a variety of hardware and software solutions made available through Bosch, Eclipse, and their ecosystem partners. MongoDB also took part in this event and even helped one of the winning teams build their solution on top of MongoDB Atlas. I had the pleasure of connecting with a participant from that winning team, Jonas Bruns, to learn about his experience building an application for the first time with MongoDB Atlas.

Ashley George: Tell us a little bit about your background and why you decided to join this year's BCX hackathon.

Jonas Bruns: I am Jonas, an electrical engineering student at Friedrich Alexander University in Erlangen-Nürnberg. Before I started my master's program, I worked in the automotive industry in the Stuttgart area. I was familiar with the BCX hackathon from my time in Stuttgart and, together with two friends from my studies, decided to set off to Berlin this year to take part in this event. The BCX hackathon is great because there are lots of partners on site to help support the participants and provide knowledge on both the software and hardware solutions available to them, allowing teams to turn their ideas into a working prototype within the short time available. We like being confronted with new problems and felt this was an important challenge to take on, so participation this year was a must for us.

AG: Why did you decide to use MongoDB Atlas for your project?

JB: We started with just the idea of using augmented reality (AR) to improve the user experience (UX) of smart devices.
To achieve this goal, we needed not only a smartphone app but also a backend in which all of our important data is stored. Due to both limited time and the fact that no one on our team had worked with databases before, we had to find a solution that would grow with our requirements and allow us to get started as easily as possible. Ideally, the solution would also be fully managed, to save us from having to take care of security on our own. After reviewing our options, we quickly decided on MongoDB Atlas.

AG: What was it like working with MongoDB Atlas, especially having not worked with a database solution before?

JB: The setup was super easy and went pretty fast. Within just a short time, we were able to upload our first set of data to Atlas using MongoDB Compass. As we started to dive in and explore Atlas a bit more, we discovered the trigger functionality (Atlas Triggers), which we were able to use to simplify our infrastructure. Originally, we planned to use a server connected to the database that would react to changed database entries and then send a request to control the desired periphery. The possibility to configure triggers directly in the database made a server superfluous and saved us a lot of time. We configured the trigger so that it executes a JavaScript function when a change is made to the database. This function evaluates data from the database and executes corresponding requests, which directly control the periphery. Initially, we hit a minor roadblock in determining how to handle the authentication needs (creating security tokens) that the periphery expects during a request. To solve this, we stored the security tokens on an AWS server that listens for an HTTP request. From Atlas, we then just have to call the URL, and the AWS instance handles the authentication and control of the lights. After we solved this problem, we were thrilled with how little configuration was needed and how intuitive Atlas is.
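The trigger logic Jonas describes can be sketched as a plain function that inspects a change event and decides which request to send to the light-control endpoint. The endpoint URL, field names, and event contents below are hypothetical, and a real Atlas Trigger would send the request via its built-in HTTP facilities rather than returning an object, but the decision logic is the same idea.

```javascript
// Sketch of the decision logic inside a trigger function. The event shape
// follows MongoDB change events; the endpoint URL is a placeholder.
const LIGHT_CONTROL_URL = "https://example.com/lights"; // hypothetical AWS endpoint

function requestForChange(changeEvent) {
  // Only react to updates of a light's "on" state.
  const updated =
    (changeEvent.updateDescription && changeEvent.updateDescription.updatedFields) || {};
  if (!("on" in updated)) return null;
  return {
    url: LIGHT_CONTROL_URL,
    method: "POST",
    body: { lightId: changeEvent.documentKey._id, on: updated.on },
  };
}

// Example change event: the app switched light "kitchen-1" on.
const event = {
  operationType: "update",
  documentKey: { _id: "kitchen-1" },
  updateDescription: { updatedFields: { on: true }, removedFields: [] },
};
console.log(requestForChange(event));
```

Keeping the logic a pure function of the change event makes it easy to test before wiring it to a live trigger.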
The next steps, like connecting Atlas to the app, were easy. We achieved this by sending data from Flutter to Atlas over HTTPS with the Atlas Data API.

AG: How did Atlas enable you to build your winning application?

JB: By the end of the challenge, we had developed our idea into a fully functional prototype using Google ARCore, Flutter, MongoDB Atlas, and the Bosch Smart Home hardware (Figure 1). We built a smartphone application that uses AR to switch a connected light in a smart building on and off. The position and state of the light (on or off) are stored in the database. If the state of the light should change, the app manipulates the corresponding value in the database. The change triggers a function that then sets the light to the desired state (on or off). The fact that we were able to achieve this within a short time without much prior knowledge is mainly due to the ease and intuitive nature of Atlas. The simple handling allowed us to quickly learn and use the available features to build the functionality our app needed.

Figure 1: Tech stack for the project's prototype.

AG: What additional features within Atlas did you find the most valuable in building your application?

JB: We created different users to easily control the access rights of the app and the smart devices. By eliminating the need for another server to communicate with the smart devices and using the trigger function of Atlas, we were able to save a lot of time on the prototype. In addition, the provided preconfigured code examples in various languages facilitated easy integration with our frontend and helped us avoid errors. Anyone who is interested can find the results of our work in the GitHub repo.

AG: Do you see yourself using Atlas more in the future?

JB: We will definitely continue to use Atlas in the future. The instance from the hackathon is still online, and we want to get to know the other functionalities that we haven't used yet.
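The Flutter-to-Atlas calls Jonas mentions go over HTTPS to the Data API's action endpoints. Below is a sketch of how such a request might be assembled; the app ID, API key, and database/collection names are placeholders, and the request is only constructed, never sent.

```javascript
// Build (but don't send) a Data API insertOne-style request of the kind the
// app could use. The app ID, key, and names below are placeholders.
function buildInsertRequest(state) {
  return {
    url: "https://data.mongodb-api.com/app/<app-id>/endpoint/data/v1/action/insertOne",
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "api-key": "<api-key>", // placeholder; never hard-code real keys
    },
    body: JSON.stringify({
      dataSource: "Cluster0", // hypothetical cluster name
      database: "smartHome",  // hypothetical database
      collection: "lights",   // hypothetical collection
      document: state,        // e.g., { lightId: "kitchen-1", on: true }
    }),
  };
}

const req = buildInsertRequest({ lightId: "kitchen-1", on: true });
console.log(req.method, req.url);
```

Any HTTP client, in Flutter or elsewhere, can send a request shaped like this, which is what makes the Data API convenient for frontends without a driver.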
Given how intuitive Atlas was in this project, I am also sure that we will continue to use it for future projects.

Through this project, Jonas and his team were able to build a functional prototype that can help commercial building owners gain more control over their buildings and take steps to help reduce CO₂ emissions.

January 12, 2023
Applied

Introducing MongoDB Connector for Apache Kafka version 1.9

Today, MongoDB released version 1.9 of the MongoDB Connector for Apache Kafka! This article highlights the key features of this new release.

Pre/post document states

In MongoDB 6.0, change streams added the ability to retrieve the before and after state of an entire document. To enable this functionality on a new collection, set it as a parameter in the createCollection command:

db.createCollection(
  "temperatureSensor",
  { changeStreamPreAndPostImages: { enabled: true } }
)

Alternatively, for existing collections, use collMod as shown below:

db.runCommand( {
  collMod: <collection>,
  changeStreamPreAndPostImages: { enabled: <boolean> }
} )

Once the collection is configured for pre- and post-images, you can set the change.stream.full.document.before.change source connector parameter to include this extra information in the change event. For example, consider this source definition:

{
  "name": "mongo-simple-source",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
    "connection.uri": "<< MONGODB CONNECTION STRING >>",
    "database": "test",
    "collection": "temperatureSensor",
    "change.stream.full.document.before.change": "whenavailable"
  }
}

When the following document is inserted:

db.temperatureSensor.insertOne({ 'sensor_id': 1, 'value': 100 })

and an update is then applied:

db.temperatureSensor.updateOne({ 'sensor_id': 1 }, { $set: { 'value': 105 } })

the change stream event written to the Kafka topic is as follows:

{
  "_id": { "_data": "82636D39C8000000012B022C0100296E5A100444B0F5E386F04767814F28CB4AAE7FEE46645F69640064636D399B732DBB998FA8D67E0004" },
  "operationType": "update",
  "clusterTime": { "$timestamp": { "t": 1668102600, "i": 1 } },
  "wallTime": { "$date": 1668102600716 },
  "ns": { "db": "test", "coll": "temperatureSensor" },
  "documentKey": { "_id": { "$oid": "636d399b732dbb998fa8d67e" } },
  "updateDescription": {
    "updatedFields": { "value": 105 },
    "removedFields": [],
    "truncatedArrays": []
  },
  "fullDocumentBeforeChange": {
    "_id": { "$oid": "636d399b732dbb998fa8d67e" },
    "sensor_id": 1,
    "value": 100
  }
}

Note that the fullDocumentBeforeChange key includes the original document before the update occurred.

Starting the connector at a specific time

Prior to version 1.9, when the connector starts as a source, it opens a MongoDB change stream, and any new data is processed by the source connector. To copy all the existing data in the collection before processing new data, you specify the "copy.existing" property. One frequent user request is to start the connector from a specific timestamp rather than from when the connector starts. In 1.9, a new parameter called startup.mode was added to specify when to start writing data.

startup.mode=latest (default)

"latest" is the default behavior: it starts processing data when the connector starts and ignores any existing data.

startup.mode=timestamp

"timestamp" allows you to start processing at a specific point in time, as defined by additional startup.mode.timestamp.* properties. For example, to start the connector from 7 AM on November 21, 2022, you set the value as follows:

startup.mode.timestamp.start.at.operation.time='2022-11-21T07:00:00Z'

Supported values are an ISO-8601 format date string, as shown above, or a BSON extended string format.

startup.mode=copy.existing

This is the same behavior as the existing configuration option, "copy.existing=true". Note that "copy.existing" as a separate parameter is now deprecated. If you defined any granular copy.existing parameters, such as copy.existing.pipeline, just prepend them with the "startup.mode.copy.existing." property name.

Reporting MongoDB errors to the DLQ

Kafka supports writing errors to a dead letter queue (DLQ). In version 1.5 of the connector, you could write all exceptions to the DLQ through mongo.error.tolerance='all'. One thing to note is that these were Kafka-generated errors rather than errors that occurred within MongoDB.
Thus, if the sink connector failed to write to MongoDB due to a duplicate _id error, for example, this error wouldn't be written to the DLQ. In 1.9, errors generated within MongoDB are also reported to the DLQ.

Behavior change on inferring schema

Prior to version 1.9 of the connector, if you were inferring schema and inserted a MongoDB document containing arrays whose elements had different value data types, the connector was naive and simply set the type for the whole array to string. For example, consider a document that resembles:

```json
{
  "myfoo": [
    { "key1": 1 },
    { "key1": 1, "key2": "dogs" }
  ]
}
```

If we set output.schema.infer.value to true on a source connector, the message in the Kafka topic would resemble the following:

```json
…
"fullDocument": {
  …
  "myfoo": [
    "{\"key1\": 1}",
    "{\"key1\": 1, \"key2\": \"dogs\"}"
  ]
},
…
```

Notice that the array items contain different values: the first item in the "myfoo" array is a subdocument with a single field, "key1", holding the integer 1; the second item is a subdocument with the same "key1" field plus another field, "key2", that has a string value. When this scenario occurred, the connector wrapped the entire array as a string. The same behavior also applied when different keys contained different data type values. In version 1.9, the connector, when presented with this configuration, no longer wraps the arrays; instead, it creates the appropriate schemas for arrays whose elements have different data types. The same document when run in 1.9 will resemble:

```json
"fullDocument": {
  …
  "myfoo": [
    { "key1": 1 },
    { "key1": 1, "key2": "dogs" }
  ]
},
```

Note that this behavior is a breaking change, and that inferring schemas for arrays can cause performance degradation for very large arrays whose elements have different data types.

Download the latest version of the MongoDB Connector for Apache Kafka from Confluent Hub! To learn more about the connector, read the MongoDB Online Documentation. Questions?
Ask on the MongoDB Developer Community Connectors and Integrations forum!
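As a rough illustration of what a consumer can do with the new pre/post image data, the sketch below parses a change event like the one shown in this article and compares a field's before and after values. It is plain Python over a trimmed copy of the example event; the helper name `value_delta` is our own invention, not part of the connector.

```python
import json

# A trimmed version of the change event shown above, as it would be
# consumed from the Kafka topic.
event_json = """
{
  "operationType": "update",
  "ns": { "db": "test", "coll": "temperatureSensor" },
  "updateDescription": { "updatedFields": { "value": 105 },
                         "removedFields": [], "truncatedArrays": [] },
  "fullDocumentBeforeChange": { "sensor_id": 1, "value": 100 }
}
"""

def value_delta(event, field):
    """Return (before, after) for a field, using the 1.9 pre-image when present."""
    before_doc = event.get("fullDocumentBeforeChange") or {}
    before = before_doc.get(field)
    # If the field was not touched by the update, its "after" value equals "before".
    after = event.get("updateDescription", {}).get("updatedFields", {}).get(field, before)
    return before, after

event = json.loads(event_json)
before, after = value_delta(event, "value")
print(before, after)  # 100 105
```

Without the pre-image, `fullDocumentBeforeChange` would be absent and a consumer would only see the new value in `updatedFields`.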

January 12, 2023
Updates

Top 3 Wins and Wants from the Latest TDWI Modernization Report

We recently reported that analyst and research firm TDWI had released its latest report on IT modernization: Maximizing the Business Value of Data: Platforms, Integration, and Management . The report surveyed more than 300 IT executives, data analysts, data scientists, developers, and enterprise architects to find out what their priorities, objectives, and experiences have been in terms of IT modernization. In many ways, organizations have made great progress. From new data management and data integration capabilities to smarter processes for higher business efficiency and innovation, IT departments have helped organizations get more value from the data they generate. In other cases, organizations are still stuck in data silos and struggling to improve data quality as data distribution increases due to the proliferation of multi-cloud environments. In this article, we'll summarize the top three areas where organizations are winning and the top three ways that organizations are left wanting when it comes to digital transformation and IT modernization. Download the complete report, Maximizing the Business Value of Data: Platforms, Integration, and Management , and find out the latest strategies, trends, and challenges for businesses seeking to modernize.

Wins

1. Cloud migration

Moving legacy applications to the cloud is essential for organizations seeking to increase operational efficiency and effectiveness, generate new business models through analytics, and support automated decision-making — the three biggest drivers of modernization efforts. And most organizations are succeeding: Seventy-two percent of respondents in the TDWI survey reported being very or somewhat successful in moving legacy applications to cloud services. Migrating to the cloud is one thing, but getting data to the right people and systems at the right time is another.
For organizations to get the full value of their data in the cloud, they also need to ensure the flow of data into business intelligence (BI) reports, data warehouses, and embedded analytics in applications.

2. 24/7 operations

The ability to run continuous operations is a widely shared objective when organizations take on a transformation effort. Increasingly global supply chains, smaller and more dispersed office locations, and growing international customer bases are major drivers of 24/7 operations. And, according to the TDWI survey, more than two-thirds of organizations say they've successfully transitioned to continuous operations.

3. User satisfaction

Organizations are also winning the race to match users' needs when provisioning data for BI, analytics, data integration, and the data management stack. Eighty percent of respondents said their users were satisfied with these capabilities. Additionally, 72% trusted the quality of data and how it's governed, and 68% were satisfied that role-based access controls were doing a good job of ensuring that only authorized users had access to sensitive data.

Wants

1. Artificial intelligence, machine learning, and predictive intelligence

Machine learning (ML) and artificial intelligence (AI) comprise a key area where organizations are left wanting. While 51% of respondents were somewhat or very satisfied with their use of AI and ML data, almost the same number (49%) said they were neither satisfied nor dissatisfied, somewhat dissatisfied, or very dissatisfied. Similar results were reported for data-driven predictive modeling. The report notes that provisioning data for AI/ML is more complex and varied than for BI reporting and dashboards, but that cloud-based data integration and management platforms for analytics and AI/ML could increase satisfaction for these use cases.

2. More value from data

Perhaps related to the AI/ML point, the desire to get more value out of their data was cited as the biggest challenge organizations face by almost 50% of respondents. Organizations today capture more raw, unstructured, and streaming data than ever, and they're still generating and storing structured enterprise data from a range of sources. One of the big challenges organizations reported is running analytics on so many different data types. According to TDWI, organizations need to overcome this challenge to inform data science and capitalize on modern, analytics-infused applications .

3. Easier search

A big part of extracting more value from data is making it easy to search. Traditional search functionality, however, depends on technically challenging SQL queries. According to the TDWI report, 19% of users were dissatisfied with their ability to search for data, reports, and dashboards using natural language. Unsurprisingly, frustration with legacy technologies was cited as the third-biggest challenge facing organizations, according to the survey.

The way forward

"In most cases, data becomes more valuable when data owners share data," the TDWI report concludes. Additionally, the key to making data more shareable is moving toward a cloud data platform , one that makes data more available while simultaneously governing access when there's a need to protect the confidentiality of sensitive data. Not only does a cloud data platform make data more accessible and shareable for users, it also creates a pipeline for delivering data to applications that can use it for analytics, AI, and ML. Read the full TDWI report: Maximizing the Business Value of Data: Platforms, Integration, and Management .

January 11, 2023
News

MongoDB Is A Best Place to Work in 2023, According to Our Employees on Glassdoor

MongoDB is pleased to announce that we are among the winners of the annual Glassdoor Employees’ Choice Awards, a list of the Best Places to Work in 2023 . Unlike other workplace awards, there is no self-nomination or application process; instead, it’s entirely based on the feedback our employees have voluntarily and anonymously shared on Glassdoor. To determine the winners of the awards, Glassdoor evaluates company reviews shared by current and former employees over the past year. This year, we are proud to be recognized as a Best Place to Work among U.S. companies with more than 1,000 employees. A huge thank you goes out to all our employees who took the time to share their perspective on what it’s like to work here. We appreciate all the valuable feedback, as it only helps us improve. Below are just a few words employees shared on Glassdoor that contributed toward the award and make us feel incredibly honored:

Senior Staff Engineer, Sydney

“I have been working on the Storage Engine for MongoDB for over ten years now. In my tenure at MongoDB I have taken on a lot of different roles and responsibilities and am now a senior individual contributor. Working with my colleagues to build the best storage engine in the world as well as carefully crafting a diverse, inclusive, pragmatic, engaged and curious engineering culture. During my time here I've been able to actively contribute to its success, and have clearly understood the vision and pathway to that success. The company is continually growing and evolving to meet changing needs - it's an exciting place to work full of opportunity and challenges.”

Enterprise Account Executive, Tel-Aviv

“Amazing tech and some of the most smart & experienced you'll ever have a chance to work with.
Feedback is a big part of the culture and is given in an actionable, clear way that is intended to make you better in your craft and your results.”

Deal Strategy Manager, Dublin

“MongoDB is very passionate about culture and ensuring everyone who walks in the door fits the existing culture. This is a culture where openness, inclusiveness and respect are really important. Management wants to try as hard as they can to maintain the small company feel while the company scales. I have worked in some large companies where the term 'family' is used a lot but here there is truth in saying that there is a family feel amongst my team and in my office. I can attest to this as within my first year I have had to deal with two quite serious changes in my personal life and the team has been so supportive and nothing has ever been an issue. The Senior Leadership here is the strongest I have ever seen in my career and I have no doubt this company will continue to grow over the next 5 years. The offices are incredible and the employee benefits are exceptional.”

Director, Developer Relations, Austin

“The C-Suite management team is amazing. Dev is an amazing CEO who has surrounded himself with brilliant people who know how to execute. The market opportunity is incredible. MongoDB is the hands down leader in the NoSQL space and the "great replacement" of RDBMS is just getting started. Outstanding growth position in a turbulent market. The entire team is focused on one mission. MongoDB has one goal. We will extend our lead in the NoSQL technology sector as we disrupt the global database technology market and replace the RDBMS. Everyone here marches to the beat of the same drum.”

We’re hiring in 2023 and would love for you to join us. View our current career opportunities .

January 11, 2023
Culture

Build Analytics-Driven Apps with MongoDB Atlas and the Microsoft Intelligent Data Platform

Customers increasingly expect engaging applications informed by real-time operational analytics, yet meeting these expectations can be difficult. MongoDB Atlas is a popular operational data platform that makes it straightforward to manage critical business data at scale. For some applications, however, enterprises may also want to apply insights gleaned from data warehouse, business intelligence (BI), and related solutions, and many enterprises depend on the Microsoft Intelligent Data Platform to apply analytics and governance solutions to operational data stores. MongoDB and Microsoft have partnered to make it simple to use the Microsoft Intelligent Data Platform to glean and apply comprehensive analytical insights to data stored in MongoDB. This article details how enterprises can successfully use MongoDB with the Microsoft Intelligent Data Platform to build more engaging, analytics-driven applications.

Microsoft Intelligent Data Platform + MongoDB

MongoDB Atlas provides a unified interface for developers to build distributed, serverless, and mobile applications, with support for diverse workload types including operational, real-time analytics, and search. With the ability to model graph, geospatial, tabular, document, time series, and other forms of data, developers don’t need to adopt multiple niche databases, an approach that results in highly complex, polyglot architectures. The Microsoft Intelligent Data Platform offers a single platform for databases, analytics, and data governance by integrating Microsoft’s database, analytics, and data governance products. In addition to all Azure database services, the Microsoft Intelligent Data Platform includes Azure Synapse Analytics for data warehousing and analytics, Power BI for BI reporting, and Microsoft Purview for enterprise data governance requirements. Although customers have always been able to apply the Microsoft Intelligent Data Platform services to MongoDB data, doing so hasn't always been as simple as it could be.
Through this new integration, customers gain a seamless way to run analytics and data warehousing operations on the operational data they store in MongoDB Atlas. Customers can also more easily use Microsoft Purview to manage and run data governance policies against their most critical MongoDB data, thereby ensuring compliance and security. Finally, through Power BI, customers can easily query and extract insights from MongoDB data using powerful built-in and custom visualizations. Let’s dive deeper into each of these integrations.

Operationalize insights with MongoDB Atlas and Azure Synapse Analytics

MongoDB Atlas is an operational data platform that can handle multiple workload types, including transactional, search, and operational analytics, and can cater to multiple application types, including distributed, serverless, and mobile. For data warehousing workloads, long-running analytics, and AI/ML, it complements Azure Synapse Analytics very well. MongoDB Atlas can be easily integrated as a source or as a sink resource in Azure Synapse Analytics. This connector is useful to:

* Fetch all the MongoDB Atlas historical data into Synapse
* Retrieve incremental data for a period based on filter criteria in a batch mode, to run SQL-based or Spark-based analytics

The sink connector allows you to store the analytics results back in MongoDB, which can then power applications built on top of it. Many enterprises require real-time analytics, for example, in fraud detection, anomaly detection for IoT devices, predicting stock depletion, and maintenance of machinery, where a delay in getting insights could cause serious repercussions. MongoDB and Microsoft have worked together on a best-practice architecture for these cases, which can be found in this article . Figure 1: Schematic showing integration of MongoDB with Azure Synapse Analytics.
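As a rough sketch of the batch-retrieval pattern described above, a Synapse pipeline copy activity with MongoDB Atlas as the source might look like the following. This is an illustrative, hand-written fragment rather than something taken from the article: the activity name, dataset references, and filter field are assumptions, so consult the Azure Synapse documentation for the exact connector schema before using it.

```json
{
  "name": "CopyFromAtlasToSynapse",
  "type": "Copy",
  "inputs":  [ { "referenceName": "MongoDbAtlasSensorData", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "SynapseStagingTable",    "type": "DatasetReference" } ],
  "typeProperties": {
    "source": {
      "type": "MongoDbAtlasSource",
      "filter": "{ \"updatedAt\": { \"$gte\": { \"$date\": \"2023-01-01T00:00:00Z\" } } }",
      "batchSize": 1000
    },
    "sink": { "type": "SqlDWSink" }
  }
}
```

The `filter` expression here implements the "incremental data for a period" case: only documents changed since a cutoff date are copied into the Synapse staging table for SQL- or Spark-based analytics.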
Business intelligence reporting and visualization with Power BI

Together, MongoDB Atlas and Microsoft Power BI offer a sophisticated real-time data platform, providing customers with the ability to present specialized operational and analytical query engines on the same data sets. Information on connecting from Power BI Desktop to MongoDB is available in the official documentation . MongoDB is also excited to announce the forthcoming MongoDB Atlas Power BI Connector, which will expose the richness of JSON document data to Power BI (see Figure 2). This MongoDB Atlas Power BI Connector allows users to unlock access to their Atlas cloud data. Figure 2: Schematic showing integration of MongoDB and Microsoft Power BI. Beyond providing mere access to MongoDB Atlas data, this connector will provide a SQL interface to let you interact with semi-structured JSON data in a relational way, thereby ensuring you can take full advantage of Power BI's rich business intelligence capabilities. Importantly, support is planned for two connectivity modes: import and direct. This new MongoDB Atlas Power BI Connector will be available in the first half of 2023.

Conclusion

Together with the Microsoft Intelligent Data Platform offerings, MongoDB Atlas can help operationalize the insights derived from customers’ data spread across siloed legacy databases and help build modern applications with ease. With MongoDB Atlas on Microsoft Azure, developers receive access to the most comprehensive, secure, scalable, cloud-based developer data platform in the market. Now, with the availability of Atlas on the Azure Marketplace, it’s never been easier for users to start building with Atlas while streamlining procurement and billing processes. Get started today through the MongoDB Atlas on Azure Marketplace listing .

January 10, 2023
Applied

Break Down Silos with a Data Mesh Approach to Omnichannel Retail

Omnichannel experiences are increasingly important for customers, yet still hard for many retailers to deliver. In this article, we’ll cover an approach to unlock data from legacy silos and make it easy to operate across the enterprise — perfect for implementing an omnichannel strategy.

Establishing an omnichannel retail strategy

An omnichannel strategy connects multiple, siloed sales channels (web, app, store, phone, etc.) into one cohesive and consistent experience. This strategy allows customers to purchase through multiple channels with a consistent experience (Figure 1). Most established retailers started with a single point of sale or “channel” — the first store — then moved to multiple stores and introduced new channels like ecommerce, mobile, and B2B. Omnichannel is the next wave in this journey, offering customers the ability to start a journey on one channel and end it on another. Figure 1: Omnichannel experience examples. Why are retailers taking this approach? In a super-competitive industry, an omnichannel approach lets retailers maximize great customer experience, with a subsequent effect on spend and retention. Looking at recent stats , Omnisend found that purchase frequency is 250% higher on omnichannel, and Harvard Business Review’s research saw omnichannel customers spend 10% more online and 4% more in-store.

Omnichannel: What's the challenge?

So, if all retailers want to provide these capabilities to their customers, why aren’t they? The answer lies in the complex, siloed data architectures that underpin their application architecture. Established retailers who have built up their business over time traditionally incorporated multiple off-the-shelf products (e.g., ERP, PIMS, CMS, etc.) running on legacy data technologies (mainframe, RDBMS, file-based) into their stack.
With this approach, each category of data is stored in a different technology, platform, and rigid format — making it impossible to combine this data to serve omnichannel use cases (e.g., in-store stock + ecommerce to offer same-day click and collect). See Figure 2. Figure 2: Data sources for omnichannel. The next challenge is the separation of operational and historical data — older data is moved to archives, data lakes, or warehouses. Perhaps you can see today’s stock in real time, but you can’t compare it to stock on the same day last year because that is held in a different system. Any business comparison occurs after the fact. To meet the varied volume and variety of requests, retailers must extract, transform, and load (ETL) data into different databases, creating a complex disjointed web of duplicated data. Figure 3 shows a typical retailer architecture: A document database for key-value lookup, cache added for speed, wide column storage for analytics, graph databases to look up three degrees of separation, time series to track changes over time, etc. Figure 3: An example of a typical data architecture sprawl in modern retailers. The problem is that ETL’d data becomes stale as it moves between technologies, lagging behind real-time and losing context. This sprawl of technology is complex to manage and difficult to develop against — inhibiting retailers from moving quickly and adapting to new requirements. If retailers want to create experiences that can be used by consumers in real-time — operational or analytical — this architecture does not give them what they need. Additionally, if they want to use AI or machine learning models, they need access to current behavior for accuracy. Thus, the obstacle to delivering omnichannel experiences is a data problem that requires a data solution. Let's look at a smart approach to fixing it. 
Modern retailers are taking a data mesh approach

Retail architectures have gone through many iterations, starting from vendor solutions per use case, moving toward a microservices approach, and landing on domain-driven design (Figure 4).

Vendor applications:
* Each vendor decides the framework and governance of the data layer. The enterprise has no control over app or data.
* Data is not interoperable between components.

Microservices:
* Microservices pull data from the API layer.
* DevOps teams control their microservices, but data is managed by a centralized enterprise team.

Domain-driven design:
* Microservices and core datasets are combined into bounded contexts by business function.
* DevOps teams control microservices AND data.

Figure 4: Architecture evolution.

Domain-driven design has emerged through an understanding that the team with domain expertise should have control over the application layer and its associated data — this is the “bounded context” for their business function. This means they can change the data to innovate quickly, without reliance on another team. Of course, if data remains in its bounded context only, we end up with the same situation as the commercial off-the-shelf (COTS) and legacy architecture model. Where we see value is when the data in each domain can be used as a product throughout the organization. Data as a product is a core data mesh concept — it includes data, metadata, and the code and infrastructure to use it. Data as a product is expected to be discoverable (searchable), addressable, self-identifying, and interoperable (Figure 5). In a retail example, the product, customer, and store can be thought of as bounded contexts. The product bounded context contains the product data and the microservices/applications that are built for product use cases.
But, for a cross-domain use case like personalized product recommendations, the data from both customer and product domains must be available “as a product.” Figure 5: Bounded contexts and data as a product.

What we’re creating here is a data mesh — an enterprise data architecture that combines intentionally distributed data across distinctly defined, bounded contexts. It is a business domain-oriented, decentralized data ownership and architecture, where each domain makes its data available as an interoperable “data product.” The key is that the data layer must serve all real-time workloads that are required of the business — both operational and real-time analytical (Figure 6). Figure 6: Data mesh.

Why use MongoDB for omnichannel data mesh

Let’s look at the data layer requirements needed for a data mesh move to be successful and how MongoDB can meet those requirements.

Capable of handling all operational workloads:
* An expressive query language, including joining data, ACID transactions, and time series collections, makes it a great fit for multiple workloads.
* MongoDB is known for its performance and speed. The ability to use secondary indexes means that several workloads can run performantly.
* Search is key for retail applications — MongoDB Atlas has the Lucene search engine built in for full-text search with no data movement.
* Omnichannel experiences often involve mobile interaction. MongoDB Realm and Flexible Device Sync can seamlessly ensure consistency between mobile and backend.

Capable of handling analytical workloads:
* MongoDB’s distributed architecture means analytical workloads can run on a real-time data set, without ETL or additional technology and without disturbing operational workloads.
* For real-time analytical use cases, the aggregation framework can be used to perform powerful data transformations and run ad hoc exploratory queries.
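To make the aggregation framework concrete, here is a small, hypothetical example: a pipeline that totals units sold per product for the web channel. The pipeline is the list of stages you would pass to `aggregate()` in a driver such as PyMongo (the collection and field names are our own invention); the plain-Python loop below it mirrors what the $match/$group/$sort stages compute, purely to illustrate the semantics.

```python
# Hypothetical pipeline for an "orders" collection; with PyMongo you would run
# db.orders.aggregate(pipeline). Field names are illustrative only.
pipeline = [
    {"$match": {"channel": "web"}},                                   # keep one channel
    {"$group": {"_id": "$productId", "totalSold": {"$sum": "$qty"}}}, # sum per product
    {"$sort": {"totalSold": -1}},                                     # best sellers first
]

# Sample order events, and a plain-Python rendering of what the pipeline computes:
orders = [
    {"productId": "p1", "channel": "web",   "qty": 2},
    {"productId": "p2", "channel": "web",   "qty": 5},
    {"productId": "p1", "channel": "store", "qty": 7},
    {"productId": "p1", "channel": "web",   "qty": 1},
]

totals = {}
for order in orders:
    if order["channel"] == "web":                                     # $match
        totals[order["productId"]] = totals.get(order["productId"], 0) + order["qty"]  # $group

result = sorted(totals.items(), key=lambda kv: -kv[1])                # $sort
print(result)  # [('p2', 5), ('p1', 3)]
```

Because the pipeline runs inside the database, the same computation happens next to the live operational data, with no ETL step.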
* For business intelligence or reporting workloads, data can be queried by Atlas SQL or piped through the BI Connector to other data tools (e.g., Tableau and Power BI).

Capable of serving data as a product:
* Serving data as a product is often done via API: MongoDB’s BSON-based document model maps well to JSON-based API payloads for speed and ease. MongoDB Atlas provides both the Data API and the GraphQL API , fully hosted.
* Depending on the performance needed, direct access may also be required. MongoDB has drivers for all common programming languages, meaning that other teams using different languages can easily interact with it. Rules for access must, of course, be defined, and one option is to use MongoDB App Services .
* Real-time data can also be published to Apache Kafka topics using the MongoDB Kafka Connector , which can act as a sink and a source for data. For example, one bounded context could publish data in real time to a named Kafka topic, allowing another context to consume it and store it locally to serve latency-sensitive use cases.
* The tunable schema allows for flexibility in non-product fields, while schema validation capabilities enforce specific fields and data types in a collection to provide consistent datasets.

Resilient, secure, and scalable:
* MongoDB Atlas has a 99.995% uptime guarantee and provides auto-healing capability, with multi-region and multi-cloud resiliency options.
* MongoDB provides the ability to scale up or down to meet your application requirements — vertically and horizontally.
* MongoDB follows a best-in-class security protocol.

Choose the flexible data mesh approach

Providing customers with omnichannel experiences isn’t easy, especially with legacy siloed data architectures. Omnichannel requires a way of making your data work easily across the organization in real time, giving access to data to those who need it while also giving the power to innovate to the domain experts in each field.
A data mesh approach provides the capability and flexibility to continuously innovate. Ready to build deeper business insights with in-app analytics and real-time business visibility? Read our new white paper: Application-Driven Analytics: In-App and Real-Time Insights for Retailers .
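To ground the data-as-a-product idea from this article, a bounded context could expose its collection on a Kafka topic with a source connector definition along these lines. This is an illustrative sketch using the connector's documented property names; the database, collection, and topic prefix are invented for the example.

```json
{
  "name": "product-context-source",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
    "connection.uri": "<< MONGODB CONNECTION STRING >>",
    "database": "retail",
    "collection": "products",
    "topic.prefix": "dataproduct",
    "publish.full.document.only": "true"
  }
}
```

Other bounded contexts can then subscribe to the resulting dataproduct.retail.products topic and materialize a local copy for their latency-sensitive use cases.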

January 10, 2023
Applied

MongoDB Named a Leader in 2022 Gartner® Magic Quadrant™ for Cloud Database Management Systems

MongoDB is proud to be named a Leader in the 2022 Gartner® Magic Quadrant for Cloud Database Management Systems (DBMS). We believe this achievement makes MongoDB the only pure-play application database provider recognized as a Leader. Here at MongoDB, we feel the true achievement is not in the placement itself but in the means by which it was achieved. At MongoDB, we strive to offer engineering teams an enhanced approach through a unified developer data platform, as opposed to contending with multiple disparate and fragmented technologies and products. Our desire is to give customers a single cohesive design philosophy with a consistent workflow, API, data model, query language, management, and navigation through this unified developer data platform. This design is flexible enough to service almost any application, significantly reducing the cognitive toil engineering teams have to deal with. As a result, teams can build and release faster, evolving their apps at higher velocity than competitors. We believe that this philosophy has been validated, not only by the community of developers that has built up around MongoDB, but now by this placement in this publication from Gartner.

“Based on our conversations with customers, we don’t think that having a tool for every job makes sense because over time, the tax and cost of learning, managing and supporting those different tools doesn’t make a lot of sense or just becomes cost-prohibitive.” 1 — Dev Ittycheria, MongoDB CEO, November 2022

Gartner defines cloud database management systems as follows: “Core capabilities are that vendors fully supply provider-managed public or private cloud software systems that manage data in cloud storage. Data is stored in a cloud storage tier.
Optionally, they may cater to multiple data models and data types — relational, nonrelational (document, key value, wide column, graph), geospatial, time series and others.” To help users understand this emerging technology landscape, Gartner published its first Cloud Database Management Systems Magic Quadrant back in 2020. Two years on, and after evolving criteria, Gartner has named MongoDB a Leader in its debut as a qualifying vendor in the latest 2022 Magic Quadrant. We believe MongoDB was named a Leader in this report due to the R&D investments made in further building out capabilities in MongoDB Atlas , our multi-cloud developer data platform. These investments were driven by the demands of the developer communities we work with. You told us how you struggle to bring together all of the data infrastructure needed to power modern digital experiences – from transactional databases to analytics processing, full-text search, and edge computing. This is exactly what our developer data platform offers. It provides an elegant, integrated, and fully managed data architecture accessed via a unified set of APIs. With MongoDB Atlas, developers are more productive; they ship code faster and improve it more frequently.

Cloud native - first and foremost

Virtually every organization we work with at MongoDB has a cloud strategy. It's not zero-sum, so there are some apps that will never go there. However, everyone is either using or looking to use the cloud because of the agility it brings. MongoDB delivers this to our customers via a fully managed, cloud-native, document-based database: MongoDB Atlas , our globally distributed developer data platform. This platform easily and securely reduces time spent on development cycles and empowers organizations with flexible schemas and the tools they need to innovate. MongoDB Atlas’ multi-cloud clusters take the concept a step further by enabling a single application to use multiple clouds simultaneously.
With multi-cloud clusters, data is easily distributed across different public clouds, such as Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure. This enables data mobility and resilience without the complexity of manual data replication. This philosophy was taken a stage further earlier this year with the delivery of our serverless architecture, which abstracts away server, storage, and network provisioning, plus the associated management overhead. Organizations and their development teams can thus focus on building differentiating features and creating great app experiences for their customers.

Gartner called out this expanding product vision as a notable strength. The authors of the Magic Quadrant recognized our plans to deliver more comprehensive analytics support and SQL capabilities alongside the functionality delivered to date. These additions have extended the document data model to embrace time series, application search, and application-driven analytics use cases, among others, and deliver an industry-proven, multi-model, general-purpose operational and transactional database. The community of MongoDB users has also spoken on Gartner Peer Insights™: 97% of MongoDB users who provided reviews on the Gartner Peer Insights platform said they would recommend us 2 (based on 36 ratings in the last 12 months, as of December 21, 2022).

Evaluating a Leader in the Magic Quadrant for cloud database management systems

Gartner evaluated 20 of the most significant cloud DBMS vendors against 15 criteria (7 on execution and 8 on vision). These criteria span current product offering, market responsiveness and track record, innovation, and business model. Our rapid adoption over the past decade and our move to the cloud were highlighted by Gartner as strengths. In our opinion, our placement as a Leader in this Magic Quadrant is an acknowledgement of this.
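The flexible schema of the document data model mentioned above can be illustrated without any database at all. The sketch below uses plain Python dicts as stand-in documents; `find` is a hypothetical helper, not a MongoDB API, and only mimics the shape of an equality query against a collection.

```python
# Stand-in "collection": plain Python dicts as documents.
# The two documents deliberately carry different fields -- with a
# flexible schema, no migration is needed to introduce "formats".
products = [
    {"_id": 1, "name": "keyboard", "price": 49.99, "keys": 104},
    {"_id": 2, "name": "ebook", "price": 9.99, "formats": ["epub", "pdf"]},
]

def find(collection, **criteria):
    """Hypothetical helper mimicking an equality query: return every
    document whose fields match all of the given criteria."""
    return [doc for doc in collection
            if all(doc.get(key) == value for key, value in criteria.items())]

print(find(products, name="ebook"))  # matches only the second document
```

Documents that lack a queried field simply fail to match, rather than forcing every record into one rigid table shape up front.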
Visit the Gartner website to obtain the full report (requires a Gartner subscription) 3.

Customer momentum

Organizations frequently start their journey with MongoDB by employing it as an operational database, both for new cloud-native services and for modernized legacy apps. Both scenarios are routinely enhanced by adding application search, mobile, and analytics use cases to the core operational database requirement. Increasingly diverse teams are now improving customer experience and attaining business agility by embracing what the MongoDB community has been accustomed to for years. Examples include:

Forbes, for migration to the cloud in six months
Verizon, for pioneering 5G connectivity and data storage at the edge
Powerledger, for legacy data platform migration

Getting started on your cloud journey

MongoDB Atlas is engineered to help you make the shift to the cloud. We can come in to learn more about your key transformation initiatives and workshop ideas with your teams to accelerate delivery. In the interim, your engineers and developers can familiarize themselves with MongoDB right away by signing up for a free account on MongoDB Atlas. Have them create a free database cluster, load your own data or our sample data sets, and explore what’s possible within the platform. Since November 2022, we have enhanced the MongoDB learning experience. The updated program includes an expanded catalog of courses, streamlined developer certifications, 24/7 exam access, hands-on Atlas labs, and foreign-language support. Additionally, the MongoDB Developer Center hosts an array of resources, including tutorials, sample code, videos, and documentation.

1 Source: SiliconANGLE article, 28 November 2022: “MongoDB’s Dev Ittycheria on how the cloud is expanding developers’ influence”, Mike Wheatley
2 Gartner® and Peer Insights™ are trademarks of Gartner, Inc. and/or its affiliates. All rights reserved.
Gartner Peer Insights content consists of the opinions of individual end users based on their own experiences, and should not be construed as statements of fact, nor do they represent the views of Gartner or its affiliates. Gartner does not endorse any vendor, product or service depicted in this content, nor makes any warranties, expressed or implied, with respect to this content, about its accuracy or completeness, including any warranties of merchantability or fitness for a particular purpose.

3 Gartner and Magic Quadrant are registered trademarks of Gartner, Inc. and/or its affiliates in the U.S. and internationally, and are used herein with permission. All rights reserved. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose. The Gartner logo is a trademark and service mark of Gartner, Inc., and/or its affiliates, and is used herein with permission. All rights reserved.

January 10, 2023
News
