MongoDB: Powering Digital Natives

Today's rapidly evolving digital landscape is dominated by digital native companies driving innovation. These are companies born in the digital age that operate through digital channels, with business models enabled by technology and data. They are not only adept at using technology but are also reshaping the way software is developed and deployed. This article delves into the challenges and opportunities facing digital natives in modern application development, with a particular focus on the complexities of managing data. We’ll explore how the right data platform can empower your digital native organization to build high-quality software faster, adapt to changing market demands, and unlock the full potential of your business.

Strong foundations: The four pillars of tech-fueled growth for digital natives

Achieving explosive growth requires a strong foundation built on specific principles that empower rapid scaling and success. Here, we explore the four key pillars that fuel tech-driven growth for digital natives:

Product-market fit, fast: As a digital native, you must continuously ship and iterate products to achieve product-market fit quickly. This builds customer trust and captures opportunities before competitors can in an evolving market.

Data and AI-driven decisions: You must leverage data to personalize experiences, automate processes, and guide product decisions. A robust data architecture feeds real-time data into AI models, enabling data-driven decisions organization-wide.

Balance of freedom and control: Your developers must have the freedom to choose technologies, even as your organization maintains control over the infrastructure to manage risks and costs at scale. Selected technologies must integrate within your overall technology estate.

Extensible and open technologies: You must be able to explore disruptive technologies while maintaining existing systems.
Freedom from platform and vendor lock-in enables quick adoption of innovations, from current generative AI capabilities to future technological advances.

Data: The unsolved challenge in modern application development

From cloud platforms and managed services to gen AI code assistants, advancements have transformed how engineering teams build, ship, and run applications: Agile methods and programmatic APIs streamline development, while CI/CD and infrastructure as code automate processes. Containerization, microservices, and serverless architectures enable modularity, while new languages and frameworks boost capabilities. Enhanced logging and monitoring tools provide deep insight into application health.

Figure 1: Tools and processes to maximize velocity.

But none of these advancements address where developers spend most of their time—data. In fact, 73% of developers consistently report that working with data is the hardest part of building an application or feature.

So why is data the problem? Traditionally, selecting a database, often an open-source relational one, is the first step in development. However, these databases can struggle with the characteristics of modern data: it’s high volume, unstructured, and constantly evolving. As applications mature and their data demands grow, development teams may encounter challenges with achieving scalability and maintaining service resilience. Some teams turn to NoSQL databases, but even then they find the capabilities limited, pushing them back to relational databases.

As the application gains traction, the business’s appetite for innovation grows, compelling development teams to incorporate an expanding array of database technologies. This results in architectural sprawl, imposing on teams the challenges of mastering, sustaining, and harmonizing new technologies. Concurrently, the technology landscape undergoes constant evolution, demanding that teams adjust swiftly.
As a result, self-contained, autonomous teams encounter these hurdles repeatedly, highlighting the pressing need for streamlined solutions that mitigate complexity and enhance agility.

Figure 2: The evolving tech landscape.

Data sprawl: A major threat to developer productivity and business agility

Data sprawl is slowing everyone down. The more systems we add, the harder it is for developers to keep up. Each new database brings its own unique language, format, and way of working. This creates a huge headache for managing everything—from buying new systems to making sure they all work together securely. It’s a constant battle to keep data accessible, consistent, and backed up across all these different platforms.

Figure 3: Teams building on separate stacks leads to data sprawl and manageability issues across the organization.

Data sprawl compromises every one of the four outcomes your technology foundation should provide, yielding the opposite results:

Missed opportunities, lost customers: Fragmented development experiences consume time as engineers struggle with multiple technologies, frameworks, and extract, transform, and load (ETL) mechanisms for duplicating data between systems. This slows down releases, degrades digital product quality, and impedes engineers from achieving product-market fit and competing effectively.

Flying blind: With your operational data siloed across multiple systems, you lack the data foundations necessary to use live data in shaping customer experiences or reacting to market changes. This is because you are unable to feed reliable, consistent, real-time data into your AI models to take action within the flow of the application or to provide the business with up-to-the-second visibility into operations.

High attrition, high costs: Complex data architecture impacts development team culture, leading to siloed knowledge, inefficient collaboration, and decreased developer satisfaction.
This complexity also consumes substantial resources to maintain existing systems, diverting resources from new projects that are vital for competing in new markets.

Disruption from new technologies: Dependence on any one cloud provider can stifle innovation for development teams by restricting access to the latest technologies. Developers are confined to the tools and services offered by a single provider, hindering their ability to explore and integrate new, potentially more efficient or advanced technologies.

Speed: A unified developer experience for building high-quality software faster

In today’s digital world, speed is king. Your customers expect seamless experiences, and clunky applications leave them frustrated. Yet traditional databases can be a bottleneck, struggling to keep pace with your ever-evolving data and slowing down development.

The future of data is here, and it’s flexible: a data platform built for digital natives. It leverages a flexible document model, letting you store and work with your data exactly how you need it. This eliminates rigid structures and complex migrations, freeing your developers to focus on what matters—building amazing applications faster.

Flexible document data models empower developers to handle today’s rapidly evolving application data (80%+ unstructured) that relational databases struggle with. MongoDB documents are richly typed, boosting developer productivity by eliminating the need for lengthy schema migrations when implementing new features.

Developers get to use their preferred tools and languages. Through its drivers and integrations, MongoDB supports all of the most popular programming languages, frameworks, integrated development environments, and AI code-assistance tools.

MongoDB scales! It starts small and scales globally. Built for elasticity and horizontal scaling, it handles massive workloads without application changes.
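The schema flexibility described above can be sketched in plain Python. This is a conceptual illustration only: the dicts stand in for MongoDB documents, and the collection, field names, and toy query helper are invented for the example, not part of any driver API.

```python
# Illustrative only: plain Python dicts stand in for MongoDB documents.
# In a real application these would be inserted with a driver such as PyMongo.
products = [
    {"_id": 1, "name": "T-shirt", "price": 19.99, "sizes": ["S", "M", "L"]},
    # A later feature adds a "reviews" field; no schema migration is needed,
    # because documents in the same collection may carry different fields:
    {"_id": 2, "name": "Sneakers", "price": 89.0, "reviews": [{"stars": 5}]},
]

def find(collection, predicate):
    """A toy stand-in for a query: return documents matching a predicate."""
    return [doc for doc in collection if predicate(doc)]

reviewed = find(products, lambda d: "reviews" in d)
print([d["name"] for d in reviewed])  # -> ['Sneakers']
```

The point of the sketch: the second document gained a field the first one lacks, and queries keep working without altering any table definition.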
Figure 4: A unified developer experience, integrating all necessary data services for building sophisticated modern applications.

Introducing MongoDB Atlas: a fully managed cloud database built for the modern developer. It enables the integration of real-time data from devices with AI capabilities (through vector embeddings and large language models) to personalize user experiences. Stream processing empowers constant data analysis, while in-app analytics provides real-time insights without needing separate data warehouses, all while automatically managing data movement and storage for cost-effectiveness.

MongoDB Atlas simplifies database management with the following:

Easy deployment via UI, API, CLI, Kubernetes, and infrastructure-as-code tools.

Automated operations for cost-effective performance and real-time monitoring.

MongoDB Atlas customer success stories: Development with speed, scale, and efficiency

Delivery Hero

Delivery Hero, a global leader in online food delivery, leverages MongoDB Atlas to power its rapid service. Founded in 2011, Delivery Hero now serves millions of customers in over 70 countries through brands like PedidosYa, foodpanda, and Glovo. Having replaced its legacy SQL database, Delivery Hero optimized operations and bolstered performance by using MongoDB Atlas. By leveraging MongoDB Atlas Search, Delivery Hero revolutionized its search functionality, ensuring a seamless user experience for its extensive customer base through simplified indexing and real-time data accuracy. MongoDB’s scalability has empowered Delivery Hero to manage over 100 million products in its catalog without encountering latency issues, enabling the company to expand its services while maintaining peak performance. This agility, coupled with MongoDB’s cost-effectiveness, has enabled Delivery Hero to swiftly adapt to evolving customer demands, solidifying its position in the fiercely competitive delivery market.

MongoDB Atlas Search was a game changer.
We ran a proof of concept and discovered how easy it is to use. We can index in one click, and because it’s a feature of MongoDB, we know data is always up-to-date and accurate.

Andrii Hrachov, Principal Software Engineer, Delivery Hero

Read the full customer story to learn more.

Coinbase

Coinbase, a prominent cryptocurrency exchange boasting 245,000 ecosystem partners and managing assets worth $273 billion, trusts MongoDB to handle its extensive data workload. As the company grew, MongoDB scaled seamlessly to accommodate the increased demand. To further improve performance in the fast-paced crypto world, Coinbase partnered with MongoDB to develop a system that significantly accelerated data transfer to reporting tools, reducing processing time from days to a mere 5-6 hours. This near-real-time data access enables Coinbase to rapidly analyze trends and make informed decisions, maintaining a competitive edge in the ever-evolving crypto landscape.

Watch Coinbase's full session at MongoDB.local Austin, 2024 to learn more.

MongoDB: Your flexible platform for digital growth

With MongoDB, you can freely explore, experiment, develop, and deploy according to your digital-native business needs. If you would like to learn more about how MongoDB can empower your digital-native business to conquer market trends, visit:

Innovate With AI: The Future Enterprise

Application-Driven Intelligence: Defining the Next Wave of Modern Apps

AI-Driven Real-Time Pricing with MongoDB and Vertex AI

November 7, 2024

MongoDB and Partners: Building the AI Future, Together

If you’re like me, over the past year you’ve closely watched AI’s developments—and the world’s reactions to them. From infectious excitement about AI’s capabilities to impatience with its cost and return on investment, every day has been filled with AI twists and turns. It’s been quite the roller coaster.

During the ride, from time to time I’ve wondered where AI falls on the Gartner hype cycle, which gives "a view of how a technology or application will evolve over time." Have we hit the "peak of inflated expectations" only to fall into the "trough of disillusionment"? Or is the hype cycle an imperfect guide, as The Economist argues? The reality is that it takes time for any new technology—even transformative ones like AI—to take hold. And every advance, no matter how big, has had its detractors. A famous example is that of Picasso (!), who in 1968 said, “Computers are useless. They can only give you answers.” (!!)

For our part, MongoDB is convinced that AI is a once-in-a-generation technology that will enhance every future application—a belief that has been reinforced by the incredible work our partners have shared at MongoDB’s 2024 events.

Speeding AI development

MongoDB is committed to helping organizations of all sizes succeed with AI, and one way we’re doing that is by collaborating with the MongoDB partner ecosystem to create powerful, user-friendly AI development tools and solutions. For example, Fireworks.ai—a member of the MongoDB AI Applications Program (MAAP) ecosystem—created an inference solution that hosts gen AI models and supports containerized deployments. This tool makes it easier for developers to build and deploy powerful applications with a range of easy-to-use tools and customization options. They can choose to use state-of-the-art, open-source language, image, and multimodal foundation models off the shelf, or they can customize and fine-tune models to their needs.
Jointly, Fireworks.ai and MongoDB provide a solution for developers who want to leverage highly curated and optimized open-source models and combine them with their organization’s own proprietary data—and to do so with unparalleled speed and security. “MongoDB is one of the most sophisticated database providers, and it’s very easy to use,” said Benny Chen, cofounder of Fireworks.ai. "We want developers to be able to use these tools, and we want to work with providers who enable and empower developers."

Nomic, another MAAP ecosystem member, also enables developers with best-in-class solutions across the entire unstructured data workflow. Its Embed offering, available through the Nomic API, allows users to vectorize large-scale datasets for use in text, image, and multimodal retrieval applications, including retrieval-augmented generation (RAG), using only their web browser. The Nomic-MongoDB solution is a highly efficient, open-weight model that developers can use to visualize the unstructured datasets they store in MongoDB Atlas. These insights help users quickly discover trends and articulate data-driven value propositions. Nomic also supported the recently announced vector quantization in MongoDB Atlas Vector Search, which reduces vector sizes while preserving performance.

Last—but hardly least!—there’s our new reference architecture with MAAP partners AWS and Anthropic. Announced at MongoDB.local London, the reference architecture supports building memory-enhanced AI agents and is designed to streamline complex processes and develop smarter, more responsive applications. For more—including a link to the code on GitHub—check out the MongoDB Developer Center.

Making AI work for anyone and everyone

The companies MongoDB partners with aren’t just making gen AI easier for developers—they’re building tools for everyone. For example, Capgemini has invested $2 billion in gen AI and is training 100,000 of its employees in the technology.
GenYoda, a solution that helps insurance professionals with their daily work, is a product of this investment. GenYoda leverages MongoDB Atlas Vector Search to analyze large amounts of customer data, like policy statements, premiums, claims history, and health information. Using GenYoda, insurance professionals can quickly analyze underwriters’ reports to make informed decisions, create longitudinal health summaries, and streamline customer interactions to improve contact center efficiency.

GenYoda can ingest 100,000 documents in just a few hours and respond to users’ queries in two to three seconds—a metric on par with the most widely used gen AI models. And it produces results: in one example, by using Capgemini’s solution, an insurer was able to increase productivity by 15%, add new reports 25% faster (thus speeding decision-making), and reduce the manual effort of searching PDFs, increasing efficiency by 10%.

Building the future of AI together

So, what’s next? Honestly, I’m as curious as you are. But I’m also incredibly excited. At MongoDB, we’re active participants in the AI revolution, working to embrace the possibilities that lie ahead. The future of gen AI is bright, and I can’t wait to see what we’ll build together.

To learn more about how MongoDB can accelerate your AI journey, explore the MongoDB AI Applications Program.

November 4, 2024

MongoDB Atlas Introduces Enhanced Cost Optimization Tools

MongoDB Atlas was designed with elasticity at its core and has always allowed customers to scale capacity vertically and horizontally, as required and automatically. Today, these inherent capabilities are even better and more cost-effective. At the recent MongoDB.local London, MongoDB announced several new MongoDB Atlas features that improve elasticity and help optimize costs while maintaining the performance and availability that business-critical applications demand. These include scaling each shard independently, extending storage to 4 TB or more, and 5X more responsive auto-scaling.

Organizations and their customers are inherently dynamic, with operations, web traffic, and application usage growing unpredictably and non-linearly. For example, website traffic can spike due to a single video going viral on social media, and holidays are a frequent cause of application usage slowdowns. Traditionally, organizations have tackled this volatility by over-provisioning infrastructure, often at significant cost. Cloud adoption has improved the speed at which infrastructure can be provisioned in response to growing and volatile demand. Simultaneously, companies are focused on striking the perfect balance between performance and cost efficiency. This balance is acute in the current economic climate, where cost optimization is a top priority for Infrastructure & IT Operations (I&O) leaders.

“The goal is not balance between supply and demand. The goal is to meet the most profitable and mission-critical demand with the resources available.”

Nathan Hill, Distinguished VP Analyst, Gartner, December 2023

However, scaling infrastructure to meet demand without overprovisioning can be complex and costly. Organizations have often relied on manual processes (like scheduled scripts) or dedicated teams (like IT ops) to manage this challenge. MongoDB Atlas enables a more effective approach.
With MongoDB Atlas, customers can manage flexible provisioning, zero-downtime scaling, and easy auto-scaling of their clusters. From October 2024, all Atlas customers with dedicated tier clusters can employ these recently announced enhancements for improved cost optimization.

Granular resource provisioning

MongoDB’s tens of thousands of customers have complex and diverse workloads with constantly changing requirements. Over time, workloads can grow unpredictably, requiring storage, compute, and IOPS to be scaled independently and at differing granularities. Imagine a global retailer preparing for Cyber Monday, when traffic could be 512% higher than average—additional resources to serve customers are vital. Independent shard scaling enables customers running MongoDB Atlas to do this in a cost-optimal manner.

Customers can independently scale the tier of individual shards in a cluster when one or more shards experience disproportionately higher traffic. For customers running workloads on sharded clusters, scaling each shard independently of all other shards is now an option (for example, only the shards serving US traffic during Thanksgiving). Customers can also scale operational and analytical nodes independently within a single shard. This improves scalability and cost optimization by providing fine-grained control to add resources to hot shards while maintaining the resources provisioned to other shards. All Atlas customers running dedicated clusters can use this feature through Terraform and the Admin API. Support for independent shard auto-scaling and configuration management via the Admin API and Terraform will be available in late 2024.

Extended storage and IOPS on Azure

MongoDB is introducing the ability to provision additional storage and IOPS on Atlas clusters running on Azure. This enables support for optimal performance without over-provisioning.
Customers can create new clusters on Azure to provision additional IOPS and extended storage of 4 TB or more on larger clusters (M40+). This feature is being rolled out and will be available to all Atlas clusters by late 2024. Head over to our docs page to learn more.

With these updates, customers have greater flexibility and granularity in provisioning and scaling resources across their Atlas clusters on all three major cloud providers. As a result, customers can optimize for performance and costs more effectively.

More responsive auto-scaling

Granular provisioning is excellent for optimizing costs while ensuring availability for an expected increase in traffic. But what happens if a website gets 13X higher traffic or a surge in app interactions due to an unexpected social media post? Several enhancements to the algorithms and infrastructure powering MongoDB’s auto-scaling capabilities were announced in October 2024 at .local London. Cumulatively, these improve the time taken to scale and the responsiveness of MongoDB’s auto-scaling engine. Customers running dynamic workloads, particularly those with sharper peaks, will see up to 5X improvement in responsiveness. Smarter scaling decisions by Atlas will ensure that resource provisioning is optimized while maintaining high performance. This capability is available on all Atlas clusters with auto-scaling turned on, and customers should experience the benefits immediately.

Industry-leading MongoDB Atlas customers like Conrad and Current use auto-scaling to automatically scale their compute capacity, storage capacity, or both without needing custom scripts, manual intervention, or third-party consulting services. Customers can set upper and lower tier limits, and Atlas will automatically scale their storage and tiers depending on their workload demands. This ensures clusters always have the optimal resources to maintain performance while optimizing costs.
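As a rough illustration of the tier-limit idea described above, here is a minimal sketch of threshold-based scaling logic in plain Python. This is not Atlas's actual auto-scaling algorithm; the thresholds, window, and decision rule are invented for the example (only the M-series tier names and the notion of user-set upper and lower limits come from the text).

```python
# Illustrative threshold-based scaling logic. NOT MongoDB Atlas's actual
# algorithm: thresholds and window size are invented for this sketch.
# Atlas itself scales automatically between user-set tier limits.
TIERS = ["M10", "M20", "M30", "M40", "M50"]  # ordered instance tiers

def recommend_tier(current, cpu_window, lower="M10", upper="M50",
                   scale_up_at=0.75, scale_down_at=0.35):
    """Suggest the next tier from a window of recent CPU utilization (0-1)."""
    avg = sum(cpu_window) / len(cpu_window)
    i = TIERS.index(current)
    if avg > scale_up_at:
        # Sustained pressure: move up one tier, never past the upper limit.
        return TIERS[min(i + 1, TIERS.index(upper))]
    if avg < scale_down_at:
        # Sustained idleness: move down one tier, never below the lower limit.
        return TIERS[max(i - 1, TIERS.index(lower))]
    return current  # within bounds: no change

print(recommend_tier("M30", [0.9, 0.85, 0.8]))   # -> M40 (heavy load)
print(recommend_tier("M30", [0.2, 0.1, 0.15]))   # -> M20 (light load)
```

The upper and lower bounds mirror the tier limits customers set in Atlas: the scaler never recommends a tier outside them, which is what keeps cost optimization from compromising a workload's performance floor.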
Take a look at how Coinbase is optimizing for both availability and cost in the volatile world of cryptocurrency with MongoDB Atlas’ help, or read our auto-scaling docs page to learn more.

Optimize price and performance with MongoDB Atlas

As businesses focus more on optimizing cloud infrastructure costs, the latest MongoDB Atlas enhancements—independent shard scaling, more responsive auto-scaling, and extended storage with IOPS—empower organizations to manage resources efficiently while maintaining top performance. These tools provide the flexibility and control needed to achieve cost-effective scalability.

Ready to take control of your cloud costs? Sign up for a free trial today or spin up a cluster to get the performance, availability, and cost efficiency you need.

October 31, 2024

Health-Tech Startup Aktivo Labs Scales Up With MongoDB Atlas

Aktivo Labs, a pioneering health-tech startup based in Singapore, has made significant strides in the fight against chronic diseases. Aktivo Labs develops innovative preventative healthcare technology solutions that encourage healthier lifestyles. The Aktivo Score®—the flagship product of Aktivo Labs, built on MongoDB Atlas—is a simple yet powerful tool designed to guide users toward healthier living. “By collecting and analyzing data from smartphones and wearables—including physical activity, sleep patterns, and sedentary behavior—the Aktivo Score provides personalized recommendations to help users improve their health,” said Aktivo Labs CTO Jonnie Avinash at MongoDB.local Singapore in August 2024.

Aktivo Labs also works closely with insurance companies. Acting as a data processor, it helps insurers integrate some of the Aktivo Score features into their own apps to improve customer engagement.

Empowering insurers with out-of-the-box apps and user journeys

From the start, the Aktivo Labs engineering team chose to work on MongoDB Atlas because the platform’s document model and cloud nature provided the flexibility and scalability required to support the company’s business model. The engineering team’s first goal was to enable insurance providers to integrate the Aktivo Score smoothly within their own infrastructures. The team built software development kits (SDKs) that insurers can embed in various iOS and Android apps. The SDKs enable progressive web app journeys for the user experience, which insurers can then rebrand and customize as their own.

Next, the Aktivo Labs team created a web portal to help companies manage their apps and monitor their performance. This required discrete, direct integrations with a myriad of wearables. “When we started to deploy things with companies, we were able to replicate this architecture so we could support all kinds of configurations,” Avinash said.
“We could give you dedicated clusters if the number of users that you’re expecting is big enough. If you’re not expecting too many customers, we could give you colocated or shared environments.”

Finding more efficiencies, flexibility, and scalability with MongoDB Atlas

“When we started off, one of our challenges was that we had a very small engineering team. A lot of the focus had to be on functionality, and the cost of tech had to be kept low,” said Avinash. Working on MongoDB Atlas allowed the Aktivo Labs team to focus on product development rather than on database management and overhead costs.

As the company grew and expanded to markets across Asia, Africa, and the Middle East, another challenge arose: Aktivo Labs needed to ensure its platform could scale and handle large volumes of disparate data efficiently. MongoDB Atlas was the optimal solution because its fully managed multi-cloud platform could easily scale as the company grew. MongoDB Atlas also provided Aktivo Labs the flexibility it needed to handle the wide variety, volume, and complexity of data generated by users’ health metrics.

Based on insights from the MongoDB Atlas oplog, the engineering team made proactive updates to the database in real time in anticipation of dynamic changes to leaderboards and challenges in the app. This approach enables Aktivo Labs to manage complex data flows efficiently, ensuring that users always have access to the latest metrics about their health. MongoDB Atlas’s secondary nodes and analytics nodes provide isolated environments for intensive data-processing tasks, such as calculating risk scores for diabetes and hypertension. This separation ensures that the primary user-facing applications remain responsive, even during periods of heavy data processing. These isolated environments have also been an important factor in achieving compliance with the data-anonymization requirements of health insurers.
“The moment you start showing that it’s a managed service and you’re able to show a lot of these things, the amount of faith that both auditors and clients have in us is a lot more,” said Avinash.

Powered by MongoDB Atlas, Aktivo Labs is now looking to expand into U.S. and European markets, pursuing its mission of preventing chronic diseases on a global scale.

Visit our product page to learn more about MongoDB Atlas.

October 29, 2024

Away From the Keyboard: Rafa Liou, Senior Partner Marketing Manager

Welcome to the latest article in our “Away From the Keyboard” series, which features interviews with people at MongoDB discussing what they do, how they prioritize time away from work, and their advice for others looking to create a more holistic approach to coding.

Rafa Liou, Senior Partner Marketing Manager at MongoDB, was gracious enough to tell us why he's not ashamed to advocate strongly for a healthy work-life balance, and how his past career in the wild world of advertising helped him first recognize the need to do so.

Q: What do you do at MongoDB?

RAFA: I’m a Marketing Manager focused on MongoDB’s AI partner ecosystem. I help promote our partnerships with companies such as Anthropic, Cohere, LangChain, Together AI, and many others. I work to drive mutual awareness, credibility, and product adoption in the gen AI space via marketing programs—basically telling the world why we’re better together. It’s a cool job where I’m able to wear many hats and interact with lots of different teams internally and externally.

Q: What does work-life balance look like for you?

RAFA: Work-life balance is really important to me. It’s actually one of the things I value most in a job. I know some people advise against this, but anytime I’m interviewing with a company I ask about it, because it definitely impacts my mental health, how I spend my time outside of work, and my ability to do the things I love. I’m very fortunate to work for a company that understands that and trusts me to do my job and, at the same time, be able to step out for a walk or a workout, not miss a dinner reservation with my husband, or whatever it is. It makes a lot of difference in both my productivity and happiness. After I log off, you can find me taking a HIIT class, exploring the restaurant scene in LA, or biking at the beach. It’s so good to be able to do all of that stress-free!

Q: How do you ensure you set boundaries between work and personal life?
RAFA: I usually joke that if you do everything you’re tasked with at the pace you’d like things to get done, you will never stop working. It’s really important to prioritize tasks based on value, urgency, and feasibility. By assessing your pipeline more critically, you’ll be able to distill what needs to be done right now and also be at peace with the things that will be handled down the road, making it easier to disconnect when you’re done for the day. It’s also important to set expectations and boundaries with your manager and teams so you can fully enjoy life after work without worrying about that Slack message when you’re at the movies.

Q: Has work-life balance always been a priority for you, or did you develop it later in your career?

RAFA: Before tech, I worked in advertising, which is a very fast-paced industry with the craziest deadlines. For some time in my career, working relentlessly was not only required but also rewarded by agency culture. When you’re young, nights in the office brainstorming over pizza with friends may sound fun. But it starts to wear you out pretty quickly, especially when you don’t have the time, energy, or even the mental state to enjoy your personal life after long hours. As I matured and climbed a few steps in my career, I felt the urge and empowerment to set some boundaries to protect myself. Now, it’s a non-negotiable factor for me.

Q: What benefits has this balance given you in your career?

RAFA: By constantly exercising prioritization, I’ve become a more efficient professional. When you focus on what really matters, you’re also able to execute at higher quality, without distractions or the feeling of getting overwhelmed. Of course, with prioritization come a lot of trade-offs and discussions with stakeholders on what should be prioritized today versus tomorrow. So, I think I’ve also gotten better at negotiation and conflict resolution (things I’ve always struggled with).
Last but not least: having consistent downtime to unwind makes me more creative and energized to come up with new ideas and take on new projects.

Q: What advice would you give to someone seeking to find a better balance?

RAFA: First and foremost: don’t be ashamed of wanting a better work-life balance. I often find people living and breathing work just because they don’t want to be seen as lazy or uncommitted. Once you understand that a better work-life balance will actually make you a better professional—more intentional, efficient, and even strategic (as you will spend energy solving what creates the most value in a timely manner)—it will be easier to have this mindset, communicate it to others, and live by it.

Something more practical would be to start a list of all the things you have to do, acknowledge you can’t finish them all by the end of the day (or week, or month), and ask yourself: Do they all carry the same importance? How can I prioritize them? What would happen if I work on X now instead of Y? I would experiment with this approach and check how you feel and how it impacts your day-to-day life. You might be surprised by the result. Making time for personal life events, hobbies, and meet-ups with family and friends will also help you have something to look forward to after closing your laptop. This is all easier said than done, but I guarantee that once this becomes part of your core values and you find the balance that works for you, it is totally worth it!

Thank you to Rafa Liou for sharing his insights! And thanks to all of you for reading. For past articles in this series, check out our interviews with:

Senior AI Developer Advocate Apoorva Joshi

Developer Advocate Anaiya Raisinghani

Interested in learning more about or connecting more with MongoDB? Join our MongoDB Community to meet other community members, hear about inspiring topics, and receive the latest MongoDB news and events.
And let us know if you have any questions for our future guests when it comes to building a better work-life balance as developers. Tag us on social media: @/mongodb

October 29, 2024

MongoDB Atlas and YoMio.AI: A Near-Perfect Fit with Faster Inference, More Flexible Queries, and Richer Scenarios

The world of artificial intelligence (AI) is evolving at lightning speed, with new applications appearing constantly, including one of the coolest new AI chatbots out there: character AI. A character AI can hold engaging conversations, help you learn a new language, or let you create a chatbot of your own. YoMio.AI is an angel-round startup focused on character AI and AI entertainment, with the goal of making AI a true companion in every aspect of life. YoMio.AI's main product is Rubii, an AI-native entertainment app, and the company has built a full product matrix around it by breaking Rubii's features out into standalone services, including: one of the world's fastest voice-generation inference engines; one-click export of characters from Rubii to other social platforms, such as QQ; a public Roleplay LLM Arena for benchmarking the role-playing abilities of large language models; and rapid creation of knowledge-rich custom bots. Startups, and AI startups in particular, are reshaping our daily lives with boundless imagination. They build tools for us every day, and in the process they urgently need great tools themselves. According to YoMio.AI founder Junity, what a startup needs most on the development side is, first, a unified and effective cloud architecture that lets it run all of its applications on a single cloud; second, because startup requirements change quickly and schemas must be changed on the fly, a non-relational database is a better fit; and third, multilingual full-text search is a must-have capability. To meet these challenges and needs, MongoDB Atlas became a near-perfect solution for YoMio.AI. Binary storage for caching tensors: a MongoDB-based prompt cache behind one of the world's fastest TTS inference engines. Using MongoDB's ability to store binary files, YoMio.AI built the industry's first ultra-fast GPT-SoVITS inference pipeline, improving from roughly one audio clip every 3 seconds to 160 clips inferred in 15 seconds. (GPT-SoVITS is a leading TTS framework with more than 30,000 stars on GitHub, known for cross-lingual synthesis and cloning a voice from 3 seconds of audio with no training.) According to Junity, with MongoDB Atlas, YoMio.AI needed neither a PostgreSQL plugin for Chinese full-text search nor dedicated search nodes as with Elasticsearch: once an Atlas Index was configured, a few lines of code were enough to search. Search Index for multilingual full-text search. MongoDB's full-text indexes help users quickly find data containing specific keywords or phrases, which is essential for many applications. With MongoDB's support, YoMio.AI implemented search across Chinese, Japanese, English, Korean, and Cantonese, along with cross-language search and even mixed-language queries within a single sentence. Atlas Vector Search paired with the Infinity inference engine for extremely low-latency, high-performance retrieval and reranking. MongoDB Atlas offers rich out-of-the-box functionality: vector search powers a minimal-latency system that handles both retrieval and reranking, and a local Infinity image makes embedding and reranker models plug-and-play, with end-to-end latency under 50 ms per retrieval. Beyond that, with Atlas Global Clusters, these systems are low-latency and highly available anywhere in the world, and it all took just two months to build. Junity explains that YoMio.AI's business splits into ToC and ToB. On the ToC side, the flagship is the AI character Rubii, which is becoming ever more immersive and engaging thanks to rich data and refined algorithms. On the ToB side, the focus is chatbots enriched with custom knowledge: YoMio.AI's internal retrieval engine chunks a customer's documents, converts them into vectors, and parses them with a knowledge graph, so that in every conversation the bot receives the document chunks best suited to that moment in the dialogue. On both the ToC and ToB sides, YoMio.AI is racing against time and always has to ship the fastest, highest-quality product. As YoMio.AI's database technology partner, MongoDB is also pushing hard at the AI frontier, actively exploring how AI can be applied to application modernization, particularly in code analysis, intelligent schema mapping, and code conversion. By bringing in AI, MongoDB will further simplify application modernization and shorten migration times, helping enterprises adapt to market demands faster. With MongoDB's new releases and innovations, YoMio.AI's lightning-fast growth is worth watching. Click to register and start using MongoDB Atlas for free
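To make the full-text search setup described above concrete, here is a minimal sketch of an Atlas Search query built as an aggregation pipeline. The collection name, field names, and index name (`default`) are hypothetical examples, not YoMio.AI's actual configuration; the multilingual behavior would come from how the Atlas Search index itself is configured (for example, with a CJK-capable analyzer).

```python
# Sketch: building an Atlas Search "$search" stage for full-text search.
# Index, collection, and field names here are illustrative assumptions.

def build_text_search_pipeline(query: str, limit: int = 10) -> list:
    """Return an aggregation pipeline matching `query` against the
    `name` and `description` fields, ranked by search score."""
    return [
        {
            "$search": {
                "index": "default",  # hypothetical Atlas Search index name
                "text": {
                    "query": query,
                    "path": ["name", "description"],
                },
            }
        },
        {"$limit": limit},
        {"$project": {"name": 1, "description": 1,
                      "score": {"$meta": "searchScore"}}},
    ]

# Against a live Atlas cluster this would run as, e.g.:
#   results = db.characters.aggregate(build_text_search_pipeline("魔法"))
pipeline = build_text_search_pipeline("hello world", limit=5)
```

Because the pipeline is just a list of stage documents, it can be constructed and inspected without a database connection, which is also a convenient way to unit-test query-building code.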

October 29, 2024

Driving Neurodiversity Awareness and Education at MongoDB

Roughly 20% of the US population is neurodiverse, which means that you likely work with a colleague who learns and navigates the workplace (and the world) differently than you do. Which is a good thing! Studies have shown that hiring neurodiverse individuals benefits workplaces , with Deloitte noting that organizations “can gain a competitive edge from increased diversity in skills, ways of thinking, and approaches to problem-solving.” Config at MongoDB —which Cian and I are the global leaders of—recognizes the prevalence, importance, and power of neurodiversity in the workplace. Config’s mission is to educate both our members and the wider employee population at MongoDB about neurodiversity in the workplace, and through education to empower them to embrace—and champion—neurodiversity. Since it was founded in April 2023, Config’s membership has grown by over 150%, and it now has members in New York, Dublin, Paris, Gurugram, and Sydney. In fact, more than 200 people who span a range of MongoDB teams—from Engineering and Product, to the People team, to Marketing—take part in Config. We like to say that no one succeeds until all of us succeed. And that no one belongs until all of us belong. As managers, culture leaders, and as people, it's our responsibility to do whatever we can to make that true. Invisible differences like neurodiversity are hard to spot, but they enrich our work and our lives. Config.MDB plays an important role in helping us achieve this ambition. Making an impact on the MongoDB community Over the last year and a half, Config has held over fifteen events globally—with almost 1,000 employees in attendance. Config has held educational events for both the group’s members and the wider MongoDB audience on neurodiversity-related topics like autism awareness and ADHD awareness, along with events tailored to allies and members who identify as neurodivergent or who are part of a neurodivergent family. 
Config has also held training sessions for MongoDB people managers that provide them with the knowledge and tools to better manage neurodiverse team members. Ger Hartnett, an Engineering Lead at MongoDB, said the training “gave me a much better understanding and appreciation for neurodiversity. This course was truly eye-opening for me. I learned practical ways to be more inclusive and supportive, both at work and in everyday life.” The group also holds quarterly virtual meetings to share the latest updates, personal experiences, and practical tips for members, focusing on career development, benefit entitlements, and events happening within MongoDB. Outside of events and training sessions, Config has had a broader business impact on the company, with some Config leads partnering with the employee inclusion and recruiting teams to put together an interview accommodation program. This program supports candidates who are neurodiverse or have a disability by allowing them to apply for special requests to make their interview experience more inclusive and enjoyable. Making a difference for individual members Config’s focus on educational and training events has had a dramatic and direct impact on members. The group is a safe space for neurodiverse or disabled people to share their experiences and seek advice on various issues. Cian is one of Config’s founding members, and had this to say about his personal experience: I was diagnosed with dyslexia in college and wanted to start a group like Config after speaking with other employees who were neurodiverse. We agreed that there was a need for a group like this at MongoDB. After the group was formed, I attended several events that focused on ADHD and saw a lot of similarities between traits and experiences of those with ADHD and myself. After attending these events, I realized that struggles I’d had, which I thought were personality traits, could be signs of ADHD, so I turned to some of our members for guidance on how to seek a diagnosis. 
Earlier this year, I was diagnosed with ADHD by a medical professional. I have noticed an improvement in my quality of life, and thanks to Config, I have a lot of valuable tips and resources to help me in my day-to-day. Had it not been for Config and these events I would still be none the wiser. Config has also made an impact on employees who are parents of neurodivergent children, like Sarah Lin , a senior information/content architect and Config member: I joined Config to be part of the change I want to see in the world—to help make the inclusive and supportive workplace I'd want my autistic daughter to experience. I certainly hope I'm contributing because membership has benefitted me personally. I've learned more about different types of neurodivergence and ways to support my colleagues. From our employee resource group events, I've learned more about autism and the lives of autistic adults so that I can be a better support for my daughter as we look toward her adulthood. The best part has been conversations with other parents and seeing myself reflected in their struggles, persistence, and achievements. Looking ahead As Config continues to expand its footprint within MongoDB, the group plans to introduce advanced educational programming to raise awareness for neurodiversity in the workplace. It also plans to hold workshops to foster professional development and executive functioning. Config also hopes to grow its global membership to provide community outreach at scale for nonprofit organizations that specifically service neurodiverse individuals. Ultimately, Config’s aim is to create the best environment for teams at MongoDB. Our view of success is not only the “what” but also the “how.” Being sustainable, encouraging growth through learning, and accomplishing goals as a team are all meaningful to us. 
And we believe strongly in the power of allyship; we want MongoDB to be a place where amazing people feel supported and are given the opportunity to do their best. After all, many of us are already close to neurodivergent individuals. One of Config’s Executive Sponsors, Mick Graham, has a daughter who is neurodivergent—which he says gives him extra inspiration to support Config now and in the future. Overall, being part of Config has raised our understanding of how neurodivergent people navigate the world. And the group—and the inspirations and experiences members have shared—contribute to making MongoDB a place that great people want to be. Interested in learning more about employee resource groups at MongoDB? Join our talent community to receive the latest MongoDB culture highlights.

October 24, 2024

Reflections On Our Recent AI "Think-A-Thon"

Interesting ideas are bound to emerge when great minds come together, and there was no shortage of them on October 2nd, when MongoDB’s Developer Relations team hosted our second-ever AI Build Together event at MongoDB.local London. In some ways, the event is similar to a hackathon: a group of developers come together to solve a problem. But in other ways, the event is quite different. While hackathons normally take an entire day and involve intensive coding, the AI Build Together events are organized to take place over just a few hours and don't involve any coding at all. Instead, it’s all centered on discussion and ideation. For these reasons, MongoDB’s Developer Relations team likes to dub them “think-a-thons.” Our first AI Build Together event was held earlier this year at .local NYC. After seeing the energy in the room and the excitement from attendees, our Developer Relations team knew it wanted to host another one. The .local London event’s fifty attendees—who included developers from numerous industries and leading AI innovators who served as mentors—came together to brainstorm and discuss AI-based solutions to common industry problems. .local London AI Build Together attendees brainstorming AI solutions for the healthcare industry The AI mentors included: Loghman Zadeh (gravity9), Ben Gutkovich (Superlinked), Jesse Martin (Hasura), Marlene Mhangami (Microsoft), Igor Alekseev (AWS), and John Willis and Patrick Debois (co-founders of DevOps). Upon arrival, participants joined a workflow group best aligned with their industry and/or area of interest—AI for Education, AI for DevOps, AI for Healthcare, AI for Optimizing Travel, AI for Supply Chain, and AI for Productivity. The AI for Productivity group collaborating on their workflow The discussions were lively, and it was amazing to see how much energy attendees brought to them. 
For example, the AI for Education workflow group vigorously discussed developing a personalized AI education coach to help students develop their educational plans and support them with career advice. Meanwhile, the AI for Healthcare workflow group focused on the idea of creating an AI-driven tool to provide personalized healthcare to patients and real-time insights to their providers. The AI for Productivity team came up with a clever product that helps you read, digest, and identify the key aspects of long legal documents. The AI for Optimizing Travel group seeking advice from AI mentor Marlene A talented artist was also brought in to visualize each workflow group’s problem statements and potential solutions—literally and figuratively illustrating their innovative ideas. Graphic recorder Maria Foulquié putting the final touches on the illustration Final illustration documenting the 2024 MongoDB.local London AI Build Together event All in all, our second time hosting this event was deemed a success by everyone involved. “It was impressive to see how attendees, regardless of their technical background, found ways to contribute to complex AI solutions,” says Loghman Zadeh, AI Director at gravity9, who served as one of the event’s advisors. “Engaging with so many creative and forward-thinking individuals, all eager to push the boundaries of AI innovation was refreshing. The collaborative atmosphere fostered dynamic discussions and allowed participants to explore new ideas in a supportive environment.” If you’re interested in taking part in events like these—which offer a range of networking opportunities—there are three more MongoDB.local events slated for 2024: São Paulo, Paris, and Stockholm. Additionally, you can join your local MongoDB user group to learn from and connect with other MongoDB developers in your area.

October 23, 2024

Gamuda Puts AI in Construction with MongoDB Atlas

Gamuda Berhad is a leading Malaysian engineering and construction company with operations across the world, including in Australia, Taiwan, Singapore, Vietnam, the United Kingdom, and more. The company is known for its innovative approach to construction through the use of cutting-edge technology. Speaking at MongoDB.local Kuala Lumpur in August 2024, John Lim, Chief Digital Officer at Gamuda, said: “In the construction industry, AI is increasingly being used to analyze vast amounts of data, from sensor readings on construction equipment to environmental data that impacts project timelines.” One of Gamuda’s priorities is determining how AI and other tools can impact the company’s methods for building large projects across the world. For that, the Gamuda team needed the right infrastructure, with a database equipped to handle the demands of modern AI-driven applications. MongoDB Atlas fulfilled all the requirements and enabled Gamuda to deliver on its AI-driven goals. Why Gamuda chose MongoDB Atlas “Before MongoDB, we were dealing with a lot of different databases and we were struggling to do even simple things such as full-text search,” said Lim. “How can we have a tool that's developer-friendly, helps us scale across the world, and at the same time helps us to build really cool AI use cases, where we're not thinking about the infrastructure or worrying too much about how things work but are able to just focus on the use case?” After some initial conversations with MongoDB, Lim’s team saw that MongoDB Atlas could help it streamline its technology stack, which was becoming very complex and time-consuming to manage. MongoDB Atlas provided the optimal balance between ease of use and powerful functionality, enabling the company to focus on innovation rather than database administration. “I think the advantage that we see is really the speed to market. We are able to build something quickly. We are fast to meet the requirements to push something out,” said Lim. 
Chi Keen Tan, Senior Software Engineer at Gamuda, added, “The team was able to use a lot of developer tools like MongoDB Compass , and we were quite amazed by what we can do. This [ability to search the items within the database easily] is just something that’s missing from other technologies.” Being able to operate MongoDB on Google Cloud was also a key selling point for Gamuda: “We were able to start on MongoDB without any friction of having to deal with a lot of contractual problems and billing and setting all of that up,” said Lim. How MongoDB is powering more AI use cases Gamuda uses MongoDB Atlas and functionalities such as Atlas Search and Vector Search to bring a number of AI use cases to life. This includes work implemented on Gamuda’s Bot Unify platform, which Gamuda built in-house using MongoDB Atlas as the database. By using documents stored in SharePoint and other systems, this platform helps users write tenders quicker, find out about employee benefits more easily, or discover ways to improve design briefs. “It’s quite incredible. We have about 87 different bots now that people across the company have developed,” Lim said. Additionally, the team has developed Gamuda Digital Operating System (GDOS), which can optimize various aspects of construction, such as predictive maintenance, resource allocation, and quality control. MongoDB’s ability to handle large volumes of data in real-time is crucial for these applications, enabling Gamuda to make data-driven decisions that improve efficiency and reduce costs. Specifically, MongoDB Atlas Vector Search enables Gamuda’s AI models to quickly and accurately retrieve relevant data, improving the speed and accuracy of decision-making. It also helps the Gamuda team find patterns and correlations in the data that might otherwise go unnoticed. Gamuda’s journey with MongoDB Atlas is just beginning as the company continues to explore new ways to integrate technology into its operations and expand to other markets. 
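The kind of semantic retrieval described above is typically expressed as a `$vectorSearch` aggregation stage in Atlas. Below is a minimal sketch of building such a stage; the index name, embedding field, and oversampling factor are hypothetical illustrations, not Gamuda's actual configuration.

```python
# Sketch: building an Atlas Vector Search "$vectorSearch" stage.
# Index name ("vector_index"), field names, and the 20x candidate
# oversampling factor are illustrative assumptions.

def build_vector_search_pipeline(query_vector: list, k: int = 5) -> list:
    """Return an aggregation pipeline that finds the k documents whose
    `embedding` field is nearest to `query_vector`."""
    return [
        {
            "$vectorSearch": {
                "index": "vector_index",
                "path": "embedding",
                "queryVector": query_vector,
                "numCandidates": k * 20,  # oversample, then keep top k
                "limit": k,
            }
        },
        {"$project": {"text": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]

# With a live cluster, the query vector would come from an embedding
# model, e.g.: db.docs.aggregate(build_vector_search_pipeline(vec, k=3))
pipeline = build_vector_search_pipeline([0.1, 0.2, 0.3], k=3)
```

Setting `numCandidates` higher than `limit` trades a little query cost for better recall in the approximate nearest-neighbor search.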
To learn more and get started with MongoDB Vector Search, visit our Vector Search Quick Start page.

October 22, 2024

Empower Innovation in Insurance with MongoDB and Informatica

For insurance companies, determining the right technology investments can be difficult, especially in today's climate where technology options are abundant but their future is uncertain. As is the case with many large insurers, there is a need to consolidate complex and overlapping technology portfolios. At the same time, insurers want to make strategic, future-proof investments to maximize their IT expenditures. What does the future hold, however? Enter scenario planning. Using the art of scenario planning, we can find some constants in a sea of uncertain variables, and we can more wisely steer the organization when it comes to technology choices. Consider the following scenarios: Regulatory disruption: A sudden regulatory change forces re-evaluation of an entire market or offering. Market disruption: Vendor and industry alliances and partnerships create disruption and opportunity. Tech disruption: A new CTO directs a shift in the organization's cloud and AI investments, aligning with a revised business strategy. What if you knew that one of these three scenarios was going to play itself out in your company but weren’t sure which one? How would you invest now to prepare for one of the three? At the same time that insurers are grappling with technology choices, they’re also facing clashing priorities: Running the enterprise: supporting business imperatives and maintaining health and security of systems. Innovating with AI: maintaining a competitive position by investing in AI technologies. Optimizing spend: minimizing technology sprawl, technical debt, and maximizing business outcomes. Data modernization What is the common thread among all these plausible future scenarios? How can insurers apply scenario planning principles while bringing diverging forces into alignment? There is one constant in each scenario, and that’s the organization’s data—if it’s hard to work with, any future scenario will be burdened by this fact. 
One of the most critical strategic investments an organization can make is to ensure data is easy to work with. Today, we refer to this as data modernization, which involves removing the friction that manifests itself in data processing, ensuring data is current, secure, and adaptable. For developers, who are closest to the data, this means enabling them with a seamless and fully integrated developer data platform along with a flexible data model. In the past, data models and databases would remain unchanged for long periods. Today, this approach is outdated. Consolidation creates a data model problem, resulting in a portfolio with relational, hierarchical, and file-based data models—or, worst of all, a combination of all three. Add to this the increased complexity that comes with relational models, including supertype-subtype conditional joins and numerous data objects, and you can see how organizations wind up with a patchwork of data models and overly complicated data architecture. A document database, like MongoDB Atlas , stores data in documents and is often referred to as a non-relational (or NoSQL) database. The document model offers a variety of advantages and specifically excels in data consolidation and agility: Serves as the superset of all other data model types (relational, hierarchical, file-based, etc.) Consolidates data assets into elegant single-views, capable of accommodating any data structure, format, or source Supports agile development, allowing for quick incorporation of new and existing data Eliminates the lengthy change cycles associated with rigid, single-schema relational approaches Makes data easier to work with, promoting faster application development By adopting the document model, insurers can streamline their data operations, making their technology investments more efficient and future-proof. The challenges of making data easier to work with include data quality. 
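To picture the consolidation the document model enables, here is a hedged sketch of a single customer document absorbing what a relational design would spread across supertype/subtype tables joined at query time. All field names and values are illustrative, not a prescribed insurance schema.

```python
# Sketch: one "customer" document consolidating policies of different
# shapes (auto vs. home), which a relational model would split into a
# policy supertype table plus subtype tables and conditional joins.
# Field names are hypothetical examples.

customer = {
    "_id": "cust-1001",
    "name": "Avery Jones",
    "policies": [
        {   # auto policy: carries vehicle-specific fields
            "type": "auto",
            "policy_no": "A-778",
            "premium": 920.00,
            "vehicle": {"make": "Toyota", "model": "Corolla", "year": 2021},
        },
        {   # home policy: a different shape in the same array, no joins
            "type": "home",
            "policy_no": "H-312",
            "premium": 1450.00,
            "property": {"zip": "10001", "construction": "masonry"},
        },
    ],
    "claims": [{"policy_no": "A-778", "status": "open", "amount": 2300.00}],
}

# Each subtype keeps only the fields it needs, yet the policies can
# still be addressed uniformly, e.g. total premium across all types:
total_premium = sum(p["premium"] for p in customer["policies"])
```

Adding a new policy type later means adding documents with a new shape, not running a schema migration, which is the agility advantage the bullets above describe.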
One significant hurdle insurers continue to face is the lack of a unified view of customers, products, and suppliers across various applications and regions. Data is often scattered across multiple systems and sources, leading to discrepancies and fragmented information. Even with centralized data, inconsistencies may persist, hindering the creation of a single, reliable record. For insurers to drive better reporting, analytics, and AI, there's a need for a shared data source that is accurate, complete, and up-to-date. Centralized data is not enough; it must be managed, reconciled, standardized, cleansed, and enriched to maintain its integrity for decision-making. Mastering data management across countless applications and sources is complex and time-consuming. Success in master data management (MDM) requires business commitment and a suite of tools for data profiling, quality, and integration. Aligning these tools with business use cases is essential to extract the full value from MDM solutions, although the process can be lengthy. Informatica’s MDM solution and MongoDB Informatica’s MDM solution has been developed to answer the key questions organizations face when working with their customer data: “How do I get a 360-degree view of my customer, partner and & supplier data?” “How do I make sure that my data is of the highest quality?” The Informatica MDM platform helps ensure that organizations around the world can confidently use their data and make business decisions based on it. Informatica’s entire MDM solution is built on MongoDB Atlas , including its AI engine, Claire. Figure 1: Everything you need to modernize the practice of master data management. Informatica MDM solves the following challenges: Consolidates data from overlapping and conflicting data sources. Identifies data quality issues and cleanses data. Provides governance and traceability of data to ensure transparency and trust. 
Insurance companies typically have several claim systems that they’ve amassed over the years through acquisitions, with each one containing customer data. The ability to relate that data together and ensure it’s of the highest quality enables insurers to overcome data challenges. MDM capabilities are essential for insurers who want to make informed decisions based on accurate and complete data. Below are some of the different use cases for MDM: Modernize legacy systems and processes (e.g. claims or underwriting) by effectively collecting, storing, organizing, and maintaining critical data Improve data security and improve fraud detection and prevention Effective customer data management for omni-channel engagement and cross- or up-sell Data management for compliance, avoiding or predicting in advance any possible regulatory issues Given we already leverage the performance and scale of MongoDB Atlas within our cloud-native MDM SaaS solution and share a common focus on high-value, industry solutions, this partnership was a natural next step. Now, as a strategic MDM partner of MongoDB, we can help customers rapidly consolidate and sunset multiple legacy applications for cloud-native ones built on a trusted data foundation that fuels their mission-critical use cases. Rik Tamm-Daniels, VP of Strategic Ecosystems and Technology at Informatica Taking the next step For insurance companies navigating the complexities of modern technology and data management, MDM combined with powerful tools like MongoDB and Informatica provide a strategic advantage. As insurers face an uncertain future with potential regulatory, market, and technological disruptions, investing in a robust data infrastructure becomes essential. MDM ensures that insurers can consolidate and cleanse their data, enabling accurate, trustworthy insights for decision-making. 
By embracing data modernization and the flexibility of document databases like MongoDB, insurers can future-proof their operations, streamline their technology portfolios, and remain agile in an ever-changing landscape. Informatica’s MDM solution, underpinned by MongoDB Atlas, offers the tools needed to master data across disparate systems, ensuring high-quality, integrated data that drives better reporting, analytics, and AI capabilities. If you would like to discover more about how MongoDB and Informatica can help you on your modernization journey, take a look at the following resources: Unify data across the enterprise for a contextual 360-degree view and AI-powered insights with Informatica’s MDM solution Automating digital underwriting with machine learning Claim management using LLMs and vector search for RAG

October 22, 2024

Built With MongoDB: Buzzy Makes AI Application Development More Accessible

AI adoption rates are sky-high and showing no signs of slowing down. One of the driving forces behind this explosive growth is the increasing popularity of low- and no-code development tools that make this transformative technology more accessible to tech novices. Buzzy , an AI-powered no-code platform that aims to revolutionize how applications are created, is one such company. Buzzy enables anyone to transform an idea into a fully functional, scalable web or mobile application in minutes. Buzzy developers use the platform for a wide range of use cases, from a stock portfolio tracker to an AI t-shirt store. The only way the platform could support such diverse applications is by being built upon a uniquely versatile data architecture. So it’s no surprise that the company chose MongoDB Atlas as its underlying database. Creating the buzz Buzzy’s mission is simple but powerful: to democratize the creation of applications by making the process accessible to everyone, regardless of technical expertise. Founder Adam Ginsburg—a self-described husband, father, surfer, geek, and serial entrepreneur—spent years building solutions for other businesses. After building and selling an application that eventually became the IBM Web Content Manager, he created a platform allowing anyone to build custom applications quickly and easily. Buzzy initially focused on white-label technology for B2B applications, which global vendors brought to market. Over time, the platform evolved into something much bigger. The traditional method of developing software, as Ginsburg puts it, is dead. Ginsburg observed two major trends that contributed to this shift: the rise of artificial intelligence (AI) and the design-centric approach to product development exemplified by tools like Figma. Buzzy set out to address two major problems. First, traditional software development is often slow and costly. 
Small-to-medium-sized business (SMB) projects can cost anywhere from $50,000 to $250,000 and take up to nine months to complete. Due to these high costs and lengthy timelines, many projects either fail to start or run out of resources before they’re finished. The second issue is that while AI has revolutionized many aspects of development, it isn’t a cure-all for generating vast amounts of code. Generating tens of thousands of lines of code using AI is not only unreliable but also lacks the security and robustness that enterprise applications demand. Additionally, the code generated by AI often can’t be maintained or supported effectively by IT teams. This is where Buzzy found a way to harness AI effectively, using it in a co-pilot mode to create maintainable, scalable applications. Buzzy’s original vision was focused on improving communication and collaboration through custom applications. Over time, the platform’s mission shifted toward no-code development, recognizing that these custom apps were key drivers of collaboration and business effectiveness. The Buzzy UX is highly streamlined so even non-technical users can leverage the power of AI in their apps. Initially, Buzzy's offerings were somewhat rudimentary, producing functional but unpolished B2B apps. However, the platform soon evolved. Instead of building its own user experience (UX) and user interface (UI) capabilities, Buzzy integrated with Figma, giving users access to the design-centric workflow they were already familiar with. The advent of large language models (LLMs) provided another boost to the platform, enabling Buzzy to accelerate AI-powered development. What sets Buzzy apart is its unique approach to building applications. Unlike traditional development, where code and application logic are often intertwined, Buzzy separates the "app definition" from the "core code." This distinction allows for significant benefits, including scalability, maintainability, and better integration with AI. 
Instead of handing massive chunks of code to an AI system—which can result in errors and inefficiencies—Buzzy gives the AI a concise, consumable description of the application, making it easier to work with. Meanwhile, the core code, written and maintained by humans, remains robust, secure, and high-performing. This approach not only simplifies AI integration but also ensures that updates made to Buzzy’s core code benefit all customers simultaneously, an efficiency that few traditional development teams can achieve. Flexible platform, fruitful partnership The partnership between Buzzy and MongoDB has been crucial to Buzzy’s success. MongoDB’s Atlas developer data platform provides a scalable, cost-effective solution that supports Buzzy’s technical needs across various applications. One of the standout features of MongoDB Atlas is its flexibility and scalability, which allows Buzzy to customize schemas to suit the diverse range of applications the platform supports. Additionally, MongoDB’s support—particularly with new features like Atlas Vector Search—has allowed Buzzy to grow and adapt without complicating its architecture. In terms of technology, Buzzy’s stack is built for flexibility and performance. The platform uses Kubernetes and Docker running on Node.js with MongoDB as the database. Native clients are powered by React Native, using SQLite and WebSockets for communication with the server. On the AI side, Buzzy leverages several models, with OpenAI as the primary engine for fine-tuning its AI capabilities. Thanks to the MongoDB for Startups program, Buzzy has received critical support, including Atlas credits, consulting, and technical guidance, helping the startup continue to grow and scale. With the continued support of MongoDB and an innovative approach to no-code development, Buzzy is well-positioned to remain at the forefront of the AI-driven application development revolution. 
A Buzzy future Buzzy embodies the spirit of innovation in its own software development lifecycle (SDLC). The company is about to release two game-changing features that will take AI-driven app development to the next level: Buzzy FlexiBuild, which will allow users to build more complex applications using just AI prompts, and Buzzy Automarkup, which will allow Figma users to easily mark up screens, views, lists, forms, and actions with AI in minutes. Ready to start bringing your own app visions to life? Try Buzzy and start building your application in minutes for free. To learn more and get started with MongoDB Vector Search, visit our Vector Search Quick Start guide.

October 18, 2024

Announcing Hybrid Search Support for LlamaIndex

MongoDB is excited to announce enhancements to our LlamaIndex integration. By combining MongoDB’s robust database capabilities with LlamaIndex’s innovative framework for context-augmented large language models (LLMs), the enhanced MongoDB-LlamaIndex integration unlocks new possibilities for generative AI development. Specifically, it supports vector (powered by Atlas Vector Search ), full-text (powered by Atlas Search ), and hybrid search, enabling developers to blend precise keyword matching with semantic search for more context-aware applications, depending on their use case. Building AI applications with LlamaIndex LlamaIndex is one of the world’s leading AI frameworks for building with LLMs. It streamlines the integration of external data sources, allowing developers to combine LLMs with relevant context from various data formats. This makes it ideal for building application features like retrieval-augmented generation (RAG), where accurate, contextual information is critical. LlamaIndex empowers developers to build smarter, more responsive AI systems while reducing the complexities involved in data handling and query management. Advantages of building with LlamaIndex include: Simplified data ingestion with connectors that integrate structured databases, unstructured files, and external APIs, removing the need for manual processing or format conversion. Organizing data into structured indexes or graphs , significantly enhancing query efficiency and accuracy, especially when working with large or complex datasets. An advanced retrieval interface that responds to natural language prompts with contextually enhanced data, improving accuracy in tasks like question-answering, summarization, or data retrieval. Customizable APIs that cater to all skill levels—high-level APIs enable quick data ingestion and querying for beginners, while lower-level APIs offer advanced users full control over connectors and query engines for more complex needs. 
MongoDB's LlamaIndex integration

Developers can build powerful AI applications using LlamaIndex as a foundational AI framework alongside MongoDB Atlas as the long-term memory database. With MongoDB’s developer-friendly document model and the powerful vector search capabilities of MongoDB Atlas, developers can easily store and search vector embeddings for building RAG applications. And because of MongoDB’s low-latency transactional persistence capabilities, developers can do much more with the MongoDB integration in LlamaIndex to build AI applications in an enterprise-grade manner.

LlamaIndex's flexible architecture supports customizable storage components, allowing developers to leverage MongoDB Atlas as a powerful vector store and a key-value store. By using Atlas Vector Search capabilities, developers can:

- Store and retrieve vector embeddings efficiently (llama-index-vector-stores-mongodb)
- Persist ingested documents (llama-index-storage-docstore-mongodb)
- Maintain index metadata (llama-index-storage-index-store-mongodb)
- Store key-value pairs (llama-index-storage-kvstore-mongodb)

Figure adapted from Liu, Jerry and Agarwal, Prakul (May 2023). “Build a ChatGPT with your Private Data using LlamaIndex and MongoDB”. Medium. https://medium.com/llamaindex-blog/build-a-chatgpt-with-your-private-data-using-llamaindex-and-mongodb-b09850eb154c

Adding hybrid and full-text search support

Developers may use different search approaches for different use cases. Full-text search retrieves documents by matching exact keywords or linguistic variations, making it efficient for quickly locating specific terms within large datasets, such as in legal document review where exact wording is critical. Vector search, on the other hand, finds content that is semantically similar, even if it does not contain the same keywords. Hybrid search combines full-text search with vector search to identify both exact matches and semantically similar content.
This approach is particularly valuable in advanced retrieval systems and AI-powered search engines, enabling results that are both precise and aligned with the needs of the end user. This integration makes it simple for developers to try out powerful retrieval capabilities on their data and improve the accuracy of their AI applications. In the LlamaIndex integration, the MongoDBAtlasVectorSearch class is used for vector search. To enable full-text search, use VectorStoreQueryMode.TEXT_SEARCH in the same class; similarly, to use hybrid search, enable VectorStoreQueryMode.HYBRID. To learn more, check out the GitHub repository.

With the MongoDB-LlamaIndex integration, developers no longer need to navigate the intricacies of implementing Reciprocal Rank Fusion or determine the optimal way to combine vector and text searches: we’ve taken care of the complexities for you. The integration also includes sensible defaults and robust support, ensuring that building advanced search capabilities into AI applications is easier than ever. This means that MongoDB handles the intricacies of storing and querying your vectorized data, so you can focus on building!

We’re excited for you to work with our LlamaIndex integration. Here are some resources to expand your knowledge on this topic:

- Check out how to get started with our LlamaIndex integration
- Build a content recommendation system using MongoDB and LlamaIndex with our helpful tutorial
- Experiment with building a RAG application with LlamaIndex, OpenAI, and our vector database
- Learn how to build with private data using LlamaIndex, guided by one of its co-founders
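For the curious, the Reciprocal Rank Fusion technique that the integration implements for you can be illustrated in a few lines. This is a standalone, simplified sketch of the general RRF idea, not MongoDB's internal implementation; the document IDs and the conventional constant k=60 are illustrative:

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse several ranked result lists: each document scores the sum of
    1 / (k + rank) over every list it appears in (ranks are 1-based)."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest combined score first.
    return sorted(scores, key=scores.get, reverse=True)


# Full-text results favor exact keyword matches...
keyword_hits = ["doc_a", "doc_b", "doc_c"]
# ...while vector results favor semantic similarity.
semantic_hits = ["doc_b", "doc_d", "doc_a"]

fused = reciprocal_rank_fusion([keyword_hits, semantic_hits])
# doc_b ranks first: it places highly in both lists.
```

Documents that appear near the top of both lists outrank documents that dominate only one list, which is exactly the blend of precision and semantic relevance that hybrid search aims for.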

October 17, 2024