MongoDB Blog

Announcements, updates, news, and more

Ubuy Scales E-Commerce Globally and Unlocks AI With MongoDB

In today’s digital era, global e-commerce presents a major growth opportunity, particularly for businesses looking to expand beyond their local markets. While some companies thrive by serving domestic customers, others capitalize on cross-border e-commerce to reach a wider audience. Ubuy, a Kuwait-based e-commerce company, is tapping into this opportunity. Operating in over 180 countries, Ubuy enables customers worldwide to purchase products that may not be available in their local markets. The Ubuy App, available on both iOS and Android, supports over 60 languages and is a popular way to access Ubuy’s platform.

Ubuy simplifies logistics, customs, and shipping to create a seamless shopping experience, acting as a bridge between customers and international sellers. Unlike traditional marketplaces, Ubuy provides end-to-end services, from sourcing products and performing quality checks to handling shipping and customs. This ensures that products, even those newly launched in international markets, are accessible to buyers globally with minimal hassle.

Founded in 2012, Ubuy initially focused on the Gulf Cooperation Council region. Having identified a gap in the availability of international products there, it spent the next three years expanding its services. Today, Ubuy offers a diverse catalog of over 300 million products, and customers worldwide can access products from nine warehouses strategically located in the United States, the United Kingdom, China (including Hong Kong), Turkey, Korea, Japan, Germany, and Kuwait.

Scaling such a vast operation represented a significant technological challenge. MongoDB Atlas proved critical in enabling Ubuy to scale its operations and address specific search performance and inventory management issues.

Overcoming search and scalability challenges

Before adopting MongoDB Atlas, Ubuy relied on MySQL to manage product data and search functions.
However, this model’s limitations led to performance bottlenecks: it couldn’t handle large-scale search operations, lacked high availability, and struggled with the complex search queries coming from customers across different markets. Slow query responses, averaging 4–5 seconds per search, impacted the user experience, making it critical for Ubuy to identify a more scalable and performant solution.

Ubuy migrated to MongoDB Atlas and implemented both MongoDB Atlas Search and MongoDB Atlas Vector Search to overcome these hurdles. By using these products, Ubuy significantly improved search efficiency, reducing response times to milliseconds. The company can now ensure high search relevancy, enabling users to find products more accurately and quickly.

Migrating a large platform to MongoDB Atlas

At Ubuy’s scale, the migration to MongoDB Atlas required careful planning. In March 2023, the team conducted a proof of concept to test MongoDB Atlas’s capabilities in handling its vast inventory. A month later, the migration was complete: Ubuy had transitioned from MySQL to a fully managed MongoDB Atlas environment. The transition was seamless, with no downtime. The MongoDB team provided ongoing guidance to help Ubuy optimize search filters and facilitate a smooth integration with its existing e-commerce systems. The result was an improved customer experience through faster and more relevant search results.

Ubuy chose MongoDB Atlas for three key reasons:

- Scalability: MongoDB Atlas handles massive data loads efficiently, enabling smooth search performance even during peak traffic.
- High availability: As a fully managed cloud database, MongoDB Atlas provides resilience and reduces downtime.
- AI-powered search: MongoDB Atlas Search improves Ubuy’s product discovery experience, helping customers find the right products without wading through irrelevant results. Additionally, MongoDB Atlas Vector Search provides semantic search capabilities.
This enables more intuitive product discovery based on intent rather than merely on keywords, enhancing customer satisfaction.

Using AI-powered enhancements to drive customer engagement

Beyond improving search performance, Ubuy has been enhancing its customers’ shopping experience through AI. Ubuy integrated AI-powered search and recommendation systems with MongoDB Atlas’s vector database capabilities. This enabled a transition from simple keyword-based searches to a more intuitive, intent-driven discovery experience. For example, when a user searches for a specific keyword, like “Yamaha guitar,” the AI-enhanced product page now provides structured information on the product’s suitability for beginners, professionals, and trainers. This improves the user experience and enhances SEO visibility, driving organic traffic to Ubuy’s platform.

“With MongoDB Atlas Search and Atlas Vector Search, we are able to deliver personalized product recommendations in real-time, making it easier for customers to find what they need faster than ever before,” said Mr. Omprakash Swami, Head of IT at Ubuy.

Achieving response speed and business growth

Since implementing MongoDB Atlas and AI-driven enhancements, Ubuy has seen remarkable improvements:

- Search response time reduced from 4–5 seconds to milliseconds
- Over 150 million search queries handled annually with improved relevancy
- Higher engagement on product pages due to AI-enriched content
- Ability to scale inventory beyond 300 million products with zero performance concerns

“Moving to MongoDB Atlas and being able to use features such as Atlas Vector Search have been a game changer,” said Swami. “The ability to handle massive search queries in milliseconds while maintaining high relevancy has dramatically improved our customer experience and business operations.
The flexibility of MongoDB Atlas has not only improved our search performance but also set the stage for AI-powered innovations that were previously impossible with our relational database setup.”

Enhancing the future of e-commerce

Looking ahead, Ubuy aims to optimize search by consolidating inventory visibility across multiple stores. The goal is to enable users to search across all warehouses from a single interface, delivering even greater convenience. Ubuy’s transformation showcases how employing MongoDB Atlas, along with its fully integrated search capabilities and AI-driven insights, can significantly enhance global e-commerce operations. By addressing scalability and search relevance challenges, the company has positioned itself as a leader in cross-border e-commerce. With a relentless focus on innovation, Ubuy is set to redefine how consumers access international products. Together, Ubuy and MongoDB are helping make shopping across borders effortless and efficient.

Visit our product page to learn more about MongoDB Atlas Search. Check out our Atlas Vector Search Quick Start Guide to get started with Vector Search today. Boost your MongoDB skills with our Atlas Learning Hub.
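For readers curious what queries behind a search experience like Ubuy’s can look like, here is a minimal sketch of a keyword (`$search`) stage and a semantic (`$vectorSearch`) stage. The index names, field names, and collection are hypothetical, and the pymongo call against a live Atlas cluster is shown only as a comment:

```python
# Sketch of Atlas Search pipeline stages for a product catalog.
# Index and field names ("default", "vector_index", "title", "title_embedding")
# are illustrative -- they must match indexes defined in your Atlas cluster.

def keyword_search_stage(query: str) -> dict:
    """Full-text search stage backed by an Atlas Search index."""
    return {
        "$search": {
            "index": "default",
            "text": {
                "query": query,
                "path": "title",
                "fuzzy": {"maxEdits": 1},  # tolerate small typos
            },
        }
    }

def semantic_search_stage(query_vector: list) -> dict:
    """Semantic (intent-based) stage backed by an Atlas Vector Search index."""
    return {
        "$vectorSearch": {
            "index": "vector_index",
            "path": "title_embedding",
            "queryVector": query_vector,
            "numCandidates": 100,  # candidates considered before final ranking
            "limit": 10,
        }
    }

keyword_pipeline = [keyword_search_stage("yamaha guitar"), {"$limit": 10}]
# With a live cluster this would run as:
#   results = db.products.aggregate(keyword_pipeline)
```

In practice the two stages are often combined (hybrid search), with keyword results and vector results merged and re-ranked.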

May 6, 2025

Teach & Learn with MongoDB: Professor Chanda Raj Kumar

Welcome to the second edition of our series highlighting how educators and students worldwide are using MongoDB to transform learning. In this post, we chat with Professor Chanda Raj Kumar of KL University Hyderabad. The MongoDB for Educators program provides free resources like curriculum materials, MongoDB Atlas credits, certifications, and access to a global community of more than 700 universities—helping educators teach practical database skills and inspire future tech talent.

Applied learning: Using MongoDB in real-world teaching

Chanda Raj Kumar, Assistant Professor at KLEF Deemed to be University, Hyderabad, India, is a MongoDB Educator and leader of the MongoDB User Group—Hyderabad. With ten years of teaching experience, he empowers students to gain hands-on experience with MongoDB in their projects. Thanks to his mentorship, 80% of his students earned MongoDB certifications during last semester’s Skill Week, preparing them for careers in tech. His dedication earned him the 2024 Distinguished Mentor Award from MongoDB. His story shows how educators can use MongoDB to inspire students and prepare them for careers in tech.

Tell us about your educational and professional journey and what initially sparked your interest in databases and MongoDB.

My educational journey began with an undergraduate degree from Kakatiya University. Following that, I pursued an M.Tech from Osmania University, where I gained immense knowledge of the landscape of computer science, which laid a strong foundation for my technical expertise. Currently, I am pursuing a PhD at Annamalai University, focusing my research on machine learning. Additionally, qualifying for exams like UGC NET and TSET has further strengthened my understanding of databases and why they are a core aspect of developing an application. Over the past ten years, I have gained extensive experience in academia and industry, and I currently serve as an Assistant Professor at KL University, Hyderabad.
My interest in databases stems from their universal presence in almost every application. Early on, when I first dabbled in the world of databases, I was intrigued by how efficient storage mechanisms directly affect the speed and accuracy of data retrieval and the other operations an application performs on its data. While working with relational databases, I encountered challenges related to fixed schemas—certain data insertions were not feasible due to strict structural constraints or the unavailability of data types for spatial and vector data. This led me to delve into MongoDB, where the flexible JSON-based document structure provided a more scalable and dynamic approach to data management, with MongoDB Atlas fitting today's rapidly evolving cloud computing landscape.

What courses related to databases and MongoDB are you currently teaching?

At my university, I teach database-related courses across different levels. As a core course, I teach Database Management Systems (DBMS), covering database fundamentals and operations. I also teach Python Full Stack, MERN Stack, and Java Full Stack Development, integrating MongoDB with modern frameworks. Additionally, I conduct MongoDB certification courses, helping students gain industry-standard knowledge in database technologies.

What motivated you to incorporate MongoDB into your curriculum?

My journey with databases began when I realized the challenges of relational databases like SQL, with their rigid schemas and complex queries. This led me to explore MongoDB, which offers a more flexible, user-friendly approach to data management. I actively advocate for adding MongoDB to the college curriculum to prepare students for the growing demand for NoSQL technologies. By teaching MongoDB alongside relational databases, I aim to help students build practical skills to design and manage modern, dynamic applications.
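The schema flexibility described above is easy to demonstrate: documents with different shapes, including spatial and vector-like fields, can live side by side in one collection. A minimal sketch follows; the collection and field names are made up, and the pymongo insert is shown only as a comment:

```python
# Two differently shaped documents -- a rigid relational schema would need
# nullable columns or extra tables to store both; a document collection does not.
guitar = {
    "sku": "GTR-001",
    "name": "Acoustic guitar",
    "strings": 6,
}
sensor = {
    "sku": "SNS-042",
    "name": "Campus air-quality sensor",
    "location": {"type": "Point", "coordinates": [78.39, 17.45]},  # GeoJSON point
    "embedding": [0.12, -0.55, 0.33],  # vector data stored as a plain array
}

docs = [guitar, sensor]
# With pymongo, both insert without any schema migration:
#   db.items.insert_many(docs)
```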
You have successfully built an active student community around MongoDB on your campus. Can you share some insights into how you achieved this and the impact it's had on students?

Building an active student community around MongoDB on campus has been not only an exciting journey but a very enlightening one as well. I concentrated on a step-by-step teaching approach, beginning with the basics and slowly working up to more complex topics. This helped students build a strong foundation while feeling confident about what they were learning. One of the main ways I involved students was by incorporating MongoDB into different courses, where they could work on hands-on projects that required using the database. I also encouraged students to earn certifications like Developer and DBA, which gave them valuable credentials that validated their MongoDB skills. Furthermore, I arranged group discussions where students brainstormed, solved problems together, and stayed actively engaged in their learning. On top of that, each semester I held week-long training sessions called “Skill Weeks” to make sure everyone was aware of ongoing MongoDB advancements while also teaching newcomers.

How do you design your course content to integrate MongoDB in a way that engages students and ensures practical learning experiences?

I often begin by building a strong foundation, going over fundamental concepts such as document-oriented storage, collections, indexing, and CRUD operations to ensure students grasp the essentials. Once a solid base has been established, I introduce advanced concepts like aggregation pipelines, indexing strategies, query optimization techniques, and sharding, while putting the utmost emphasis on hands-on learning with real datasets to further fortify understanding.
I also incorporate real-world projects where students design and build complete applications that integrate MongoDB in the backend, simulating industry use cases to sharpen their problem-solving in a professional environment. As for the certification component, I include model quizzes, practice tests, and assignments to evaluate their knowledge and ensure they are job-ready with a validated skill set.

How has MongoDB supported you in enhancing your teaching methodologies and upskilling your students?

The curated learning paths and comprehensive resources in MongoDB Academia, such as PowerPoint presentations for educators, have best supported me and my teaching methods. The platform offers a wide variety of materials, covering basic to advanced concepts, often accompanied by visual aids that make complex concepts easier to grasp. The learning paths also provide a set of practice questions for students that reinforce their understanding. Moreover, the availability of the Atlas free cluster allows students to experiment with real-world database operations at no cost, providing practical experience. These resources offered by MongoDB have significantly reshaped my pedagogy to better accommodate practical elements.

Have you conducted any projects or studies on students' experiences with MongoDB? If so, what key insights have you discovered, and how can they benefit other educators?

Through surveys, Q&A sessions, and project reviews, I have identified students' strengths and weaknesses in working with MongoDB. Many students find the document-oriented model intuitive and appreciate the flexibility of schema design, but they often struggle with optimizing queries, indexing strategies, and understanding aggregation pipelines. These insights have helped me refine my teaching style by focusing more on demonstrations, interactive exercises, and targeted explanations of complex topics.
Other educators can benefit from these conclusions by incorporating regular feedback sessions and adapting their teaching methods to address these gaps.

Could you share a memorable experience or success story of a project from your time teaching MongoDB that stands out to you?

One of the most memorable experiences from my time teaching MongoDB was during Skill Week, when 80% of my students earned MongoDB certifications. The structured pedagogy I implemented—combining hands-on learning, real-world projects, and guided problem-solving—played a crucial role in their success. This was further recognized when I received an award last semester for my contributions to MongoDB education, proof of the impact of my teaching approach. Seeing students excel, gain industry-recognized skills, and confidently apply MongoDB in their careers has been incredibly rewarding.

How has your role as a MongoDB Educator impacted your professional growth and the growth of the student community at your university?

I have been able to demonstrate the power of non-relational databases, breaking the initial stigma around NoSQL databases and helping students see the advantages of flexible, scalable data models. This journey has also helped me establish my position as a subject matter expert, allowing me to lead discussions on advanced database concepts and real-world applications. As a MongoDB User Group (MUG) leader, I have built a global network, collaborating with educators, developers, and industry professionals. Additionally, conducting mentoring workshops at other colleges has strengthened my leadership skills while expanding MongoDB awareness beyond my institution. Most importantly, this role has provided students with direct industry exposure, which I believe plays a pivotal role in the growth of their careers.
What advice would you give to educators who are considering integrating MongoDB into their courses to ensure a successful and impactful learning experience for students?

My advice is to build upon students’ pre-existing knowledge while gradually introducing the shift to NoSQL concepts. Since most students start with relational databases, it’s important to first highlight the key differences between SQL and NoSQL, and to explain when to use each. Given that students are generally inclined toward SQL (as it’s often the first database they work with), introducing MongoDB as a schema-less, document-oriented database makes the transition smoother. Once the basics are covered, progressing to advanced topics like data modeling, aggregation pipelines, and indexing ensures students gain a deeper understanding of database optimization and performance tuning. By adopting this structured approach, educators can provide a comprehensive, real-world learning experience that prepares students for industry use cases.

To learn more, apply to the MongoDB for Educators program and explore free resources for educators crafted by MongoDB experts to prepare learners with in-demand database skills and knowledge.

May 5, 2025

Announcing the MongoDB MCP Server

Today, MongoDB is pleased to share the MongoDB Model Context Protocol (MCP) Server in public preview. The MongoDB MCP Server enables AI-powered development by connecting MongoDB deployments—whether they’re on MongoDB Atlas, MongoDB Community Edition, or MongoDB Enterprise Advanced—to MCP-supported clients like Windsurf, Cursor, GitHub Copilot in Visual Studio Code, and Anthropic’s Claude. Using MCP as the two-way communication protocol, the MongoDB MCP Server makes it easy to interact with your data using natural language and perform database operations with your favorite agentic AI tools, assistants, and platforms.

Originally introduced by Anthropic, the Model Context Protocol has been gaining traction as an open standard for connecting AI agents and diverse data systems. The growing popularity of MCP comes at a pivotal moment, as LLMs and agentic AI are reshaping how we build and interact with applications. MCP unlocks new levels of integrated functionality, ensuring that the LLMs behind agentic workflows have access to the most recent and contextually relevant information. And it makes it easier than ever for developers to take advantage of the fast-growing and fast-changing ecosystem of AI technologies.

The MongoDB MCP Server: Connecting to the broader AI ecosystem

The MongoDB MCP Server enables developer tools with MCP clients to interact directly with a MongoDB database and to handle a range of administrative tasks, such as managing cluster resources, as well as data-related operations like querying and indexing.

Figure 1. Overview of MongoDB MCP Server integration with MCP components.

Forget separate tools, custom integrations, and manual querying. With the MongoDB MCP Server, developers can leverage the intelligence of LLMs to perform crucial database tasks directly within their development environments, with access to the most recent and contextually relevant data.
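Getting started typically means registering the server in your MCP client's configuration file. The entry below is only a sketch: the exact package name, option names, and file location vary by client and may change during the preview, so check the MongoDB MCP Server README on GitHub before copying it.

```json
{
  "mcpServers": {
    "MongoDB": {
      "command": "npx",
      "args": ["-y", "mongodb-mcp-server"],
      "env": {
        "MDB_MCP_CONNECTION_STRING": "mongodb+srv://<user>:<password>@<cluster-host>/"
      }
    }
  }
}
```

Once the client restarts, the server's tools (querying, schema inspection, cluster management) become available to the agent.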
The MongoDB MCP Server enables:

- Effortless data exploration: Ask your AI to "show the schema of the 'users' collection" or "find the most active users in the collection."
- Streamlined database management: Use natural language to perform database administration tasks like "create a new database user with read-only access" or "list the current network access rules."
- Context-aware code generation: Describe the data you need, and let your AI generate the MongoDB queries and even the application code to interact with it.

AI-powered software development with Windsurf and MongoDB

To make it easier for developers everywhere to use the MongoDB MCP Server right away, we've made it available out of the box in Windsurf, an AI code editor used by over a million developers and counting. Developers building with MongoDB can leverage Windsurf's agentic AI capabilities to streamline their workflows and accelerate application development.

“MongoDB is aligned with Windsurf’s mission of empowering everyone to continuously dream bigger,” said Rohan Phadte, Product Engineer at Windsurf. “Through our integration with the MongoDB MCP Server, we’re helping innovators to create, transform, and disrupt industries with software in this new age of development. Developers can get started today by accessing the MongoDB MCP Server through our official server templates, and take advantage of the combined power of Windsurf and MongoDB for building their next project.”

Figure 2. Windsurf MCP server templates.

The MongoDB MCP Server in action

Check out the videos below to see how to use the MongoDB MCP Server with popular tools like Claude, Visual Studio Code, and Windsurf.

Using the MongoDB MCP Server for data exploration

With an AI agent capable of directly accessing and exploring your database, guided by natural language prompts, you can minimize context switching and stay in the flow of your work.
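To make the data-exploration example concrete, a prompt like "find the most active users in the collection" would typically lead the agent to emit an aggregation along these lines. This is a sketch only; the `events` collection and `userId` field are hypothetical names, and the pymongo call is shown as a comment:

```python
# Aggregation an agent might generate for "find the most active users".
# "events" and "userId" are illustrative names for your own data.
pipeline = [
    {"$group": {"_id": "$userId", "eventCount": {"$sum": 1}}},  # count per user
    {"$sort": {"eventCount": -1}},                              # most active first
    {"$limit": 5},                                              # top five users
]
# Against a live deployment:
#   top_users = list(db.events.aggregate(pipeline))
```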
Using the MongoDB MCP Server for database management

The MongoDB MCP Server enables AI agents to interact directly with MongoDB Atlas or self-managed MongoDB databases, making it easier to automate manual tasks around cluster and user management.

Using the MongoDB MCP Server for code generation

Using LLMs and code agents has become a core part of developers’ workflows. Providing context, such as schemas and data structures, enables more accurate code generation, reducing hallucinations and enhancing agent capabilities.

The future of software development is agentic

The MongoDB MCP Server is a step forward in MongoDB’s mission to empower developers with advanced technologies to effortlessly bring bold ideas to life. By providing an official MCP server release, we’re meeting developers in the workflows and tools they rely on to build the future on MongoDB. As MCP adoption continues to gain momentum, we’ll continue to actively listen to developer feedback and to prioritize enhancements to our MCP implementation. If you have input on the MongoDB MCP Server, please create an issue on GitHub. And to stay abreast of the latest news and releases from MongoDB, check out the MongoDB blog.

Check out the MongoDB MCP Server on GitHub and give it a try—see how it can accelerate your development workflow!

May 1, 2025

Multi-Agentic Systems in Industry with XMPro & MongoDB Atlas

In 2025, agentic AI applications are no longer pet projects—companies around the world are investing in software to incorporate AI agents into their business workflows. The most common uses of AI agents are assisting with research analysis and writing code. LangChain’s recent survey of over 1,000 professionals across multiple industries showed that over 51% have already deployed agents in production, with 60% using agents for research and summarization tasks.

However, leveraging AI agents for tasks more complex than research and summarization—and implementing them in industrial environments like manufacturing—presents certain challenges. For example, as new technology is introduced into established companies, brownfield deployments become more common: newly installed and configured hardware or software must coexist with legacy IT systems. And while it is easy to run an AI agent in a sandboxed environment, it is harder to integrate agents with machines and Operational Technology (OT) systems speaking industrial protocols like Modbus, PROFINET, and BACnet, due to existing legacy infrastructure and an accumulation of tech debt.

To ensure governance and security in industrial environments, data security policies, regulatory compliance, and governance models are essential. Agent profiles with defined goals, rules, responsibilities, and constraints must be established before agents are deployed. Additionally, addressing real-world constraints—like LLM latency—and strategically selecting use cases and database providers can enhance AI agent effectiveness and optimize response times.

What’s more, the successful implementation of AI agents in industrial environments requires a number of foundational elements, including:

- Flexible data storage and scalability: An agent requires different types of data to function, such as an agent profile, short-term memory, and long-term memory.
Industrial AI agents require even more types of data, such as time series data from sensors and PLCs, and they need efficient and scalable data storage that adapts to the dynamic needs of the environment.

- Continuous monitoring and analysis: An agent deployed in a manufacturing environment requires real-time observability of the ever-changing data generated by the factory. It also needs to keep humans in the loop for any critical decisions that might affect production.
- High availability: Industrial environments demand near-zero downtime, making system resilience and failover capabilities essential.

XMPro joins forces with MongoDB

To address these challenges, we are pleased to announce XMPro’s partnership with MongoDB. XMPro offers APEX AI, a low-code control room for creating and managing advanced AI agents for industrial applications. To ensure seamless control over these autonomous agents, XMPro APEX serves as the command center for configuring, monitoring, and orchestrating agent activities, empowering operators to remain in control.

Figure 1. XMPro APEX AI platform working with MongoDB Atlas.

APEX AI, combined with MongoDB Atlas and MongoDB Atlas Vector Search, addresses a variety of challenges faced by developers when building AI agents for industrial environments. XMPro complements this by seamlessly integrating with industrial equipment such as SCADA systems, PLCs, IoT sensors, and ERPs, enabling continuous monitoring of operations. This integration ensures real-time data acquisition, contextualization, and advanced analytics, transforming raw data into actionable insights. XMPro’s capabilities include condition monitoring, predictive maintenance, anomaly detection, and process optimization, which help reduce downtime and improve operational efficiency while maintaining compliance and safety standards.

XMPro’s industrial AI agents rely on memory persistence for contextual decision-making.
MongoDB Atlas acts as the database for storing and retrieving agent memories. Using a flexible document database for agentic memories enables agents to store different types of data, such as conversational logs, state transitions, and telemetry data, without requiring schema redesign. The capabilities of MongoDB Atlas Vector Search empower APEX AI agents with a retrieval-augmented generation (RAG) tool, which helps reduce LLM hallucinations. This integration allows agents to access and retrieve verified data, grounding their responses. Having database and vector search tools together in MongoDB Atlas also helps reduce agent latency and speeds up development.

APEX AI-enabled multi-agent systems working together in an industrial setting. These context-aware agents can work in tandem, retrieving relevant knowledge stored in MongoDB Atlas to enable meaningful collaboration and better decision-making.

XMPro APEX AI also leverages MongoDB Atlas’s robust security and high availability to ensure that agents can securely access and leverage data in real time; features such as role-based access controls, network isolation, and encryption in transit and at rest are key reasons why this agent-based AI solution is well suited to securing industrial production environments. MongoDB’s high availability and horizontal scalability ensure seamless data access at scale as organizations scale up their APEX AI deployments.

Unlocking the future of AI in industrial automation

XMPro APEX AI and MongoDB Atlas are a winning combination that paves the way for a new era of industrial automation. By tackling the core challenges of deploying AI agents in industrial environments, we’re enabling organizations to deploy robust, intelligent, and autonomous industrial AI agents at scale. To learn more about MongoDB’s role in the manufacturing industry, please visit our manufacturing and automotive webpage. Ready to boost your MongoDB skills?
Head over to our MongoDB Atlas Learning Hub to start learning today.
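The agent-memory pattern described in this post can be sketched in a few lines. The document shape, collection name, and vector index name below are hypothetical, and the pymongo calls needing a live Atlas cluster are shown only as comments:

```python
import datetime

# Heterogeneous agent memories -- conversation logs, state transitions, and
# telemetry -- can share one collection because documents need no common schema.
memory = {
    "agentId": "pump-monitor-01",
    "kind": "telemetry",
    "timestamp": datetime.datetime(2025, 4, 29, 8, 30),
    "payload": {"vibration_mm_s": 7.2, "status": "warning"},
    "embedding": [0.04, -0.91, 0.33],  # vector used for semantic recall
}
# db.agent_memories.insert_one(memory)

# Semantic recall of relevant memories via Atlas Vector Search
# ("memory_index" is an illustrative index name):
recall_stage = {
    "$vectorSearch": {
        "index": "memory_index",
        "path": "embedding",
        "queryVector": [0.05, -0.90, 0.30],
        "numCandidates": 50,
        "limit": 3,
    }
}
# grounded_context = list(db.agent_memories.aggregate([recall_stage]))
```

Keeping both the memory store and the vector index in one database is what lets a RAG step run as a single aggregation rather than a cross-system round trip.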

April 29, 2025
Artificial Intelligence

VPBank Builds OpenAPI Platform With MongoDB

Open banking is the practice of banks exposing some of their financial data and services to third-party financial service providers through APIs. Open banking has accelerated the digitization of the financial services and banking industries. It also helps foster innovation and enhance customer experience by enabling customer-centric, personalized services and experiences. MongoDB has been at the forefront of this revolution. Specifically, MongoDB helps financial institutions worldwide take advantage of OpenAPI, an open standard that enables an organization’s applications, software, and digital platforms to connect and exchange data with third-party services efficiently and securely.

An example is VPBank. One of Vietnam’s largest private banks, it serves over 30 million customers. In 2020, VPBank became the first Vietnamese bank to adopt MongoDB Atlas for OpenAPI. Working with MongoDB, VPBank moved to a microservices architecture, which supported the creation of its own OpenAPI platform and set a new standard for digital banking in Vietnam. Speaking at MongoDB Day in Vietnam in November 2024, Anh K. Pham, Head of Database Services and Operations for VPBank, shared how MongoDB set the bank up for success with open banking.

Migrating from the relational model to the document model

Before working with MongoDB, VPBank operated on SQL databases. The COVID pandemic and the rise of models such as open banking in the early 2020s mandated rapid digitization of banking operations and services. VPBank realized it needed to build the next generation of intelligent banking services to remain competitive. This was not feasible with traditional relational database management systems and the SQL model. VPBank’s primary goal was to harness the power of data and to manage unstructured data more efficiently. This meant switching to an agile architecture based on microservices. “When I was introduced to NoSQL, it made sense,” said Pham.
“Data is not always structured. There’s a bunch of different data points here and there, and you can’t make anything of it. But it has to be stored somewhere, it has to be read, and it has to be fed into your applications.”

MongoDB Atlas was hosted on Amazon Web Services (AWS) as part of VPBank’s cloud transformation journey. The bank chose MongoDB Atlas for its ability to handle multiple workload types, which had been inadequately supported by its relational databases. These workloads include time series data, event data, real-time analytics, notifications, and big data (like transaction histories, catalog data, and JSON data).

Powering 220 microservices with flexibility, scalability, and performance

VPBank’s OpenAPI platform consists of over 220 microservices, and it processes more than 100 million transactions per month. By supporting these transactions, MongoDB is ultimately helping VPBank enhance customer experiences and streamline operations. By using MongoDB Atlas, VPBank can better unlock the power of its data to quickly build data-driven applications and services on its microservices architecture. It experienced three substantial benefits:

- Flexibility: MongoDB Atlas empowers VPBank to handle complex data, conduct rapid development and iteration, and facilitate efficient API development with BSON.
- Scalability: MongoDB enables dynamic scaling to handle increasing workloads. Additionally, horizontal scaling distributes data across multiple servers to handle high volumes, spikes in transactions, and API requests.
- Performance: MongoDB Atlas’s performance capabilities enable VPBank to manage large volumes of data in real time, regardless of acute throughput and latency demands.

“We have flexibility; we have scalability; we have performance. Those are the main things we want to look at when we’re talking about banking. I need to be flexible. I need to be scalable.
I need my performance to be high, because I want my customers to not wait and see if their money is going to go through or not, said Anh K. Pham, Head of Database Services and Operations at VPBank. Using OpenShift Container Platform (OCP), VPBank deployed a microservices architecture to run its open banking services. “Choosing MongoDB as the modern database was the best choice since it can handle multiple types of data workloads with the performance we needed,” said Pham. Looking to the future VPBank plans to continue its cloud transformation journey. “We’re continuing to migrate our applications from on-premises into the cloud, and we’re continuing to modernize our applications as well,” said Pham. “That means that maybe those other databases that we used to have might be turning into MongoDB databases.” VPBank is also looking at MongoDB to support its AI-driven future: “We really want to focus on AI and data analytics, pulling information from all our customers’ transactions,” explained Pham. “We want to ensure that what we build caters to our 30-plus million customers.” Visit our MongoDB Atlas Learning Hub to boost your MongoDB skills. To learn more about MongoDB for financial services, visit our solutions page.
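The multi-workload flexibility Pham describes comes down to the document model: records with different shapes can live side by side in one collection. Here is a minimal sketch in Python; every field name and value is hypothetical, illustrating the idea rather than VPBank's actual schema.

```python
# Illustrative only: hypothetical documents showing how heterogeneous
# banking workloads (transactions, notifications) can share one flexible
# collection without a schema migration.
from datetime import datetime, timezone

transaction = {
    "type": "transaction",
    "accountId": "ACC-1001",
    "amount": 250_000,
    "currency": "VND",
    "channel": "openapi",
    "createdAt": datetime(2024, 11, 5, 9, 30, tzinfo=timezone.utc),
}

notification = {
    "type": "notification",
    "accountId": "ACC-1001",
    "message": "Payment received",
    # Fields differ per document type; no ALTER TABLE equivalent needed.
    "read": False,
}

documents = [transaction, notification]

# A query on a shared field still addresses both shapes at once.
for_account = [d for d in documents if d["accountId"] == "ACC-1001"]
print(len(for_account))  # both documents match
```

In a relational design these two record types would typically need separate tables and a join; in the document model they are simply two documents whose shared fields (here, `accountId`) make them queryable together.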

April 29, 2025
Artificial Intelligence

How Neurodiversity Shines at MongoDB

At MongoDB, we continually work to foster a workplace culture where everyone can be their authentic selves. In fact, “Embrace the Power of Differences” is one of our core company values . Our employees’ diverse experiences, backgrounds, and perspectives help give MongoDB its competitive edge and lead to continuous innovation. April is Autism Awareness Month, and across industries, organizations are putting an increased focus on neurodiversity in the workplace. Luce and Ronan—members of our Config employee resource group—open up about their experiences as part of the neurodivergent community and how it shapes their work lives. Read on to learn more about the importance of neurodiversity awareness, building an inclusive culture, and ways we can all be better colleagues and advocates. Luce: Advocating for developers and neurodiversity Luce, a Senior Developer Advocate on MongoDB’s Developer Relations team, and someone with ADHD and high-functioning autism, says she’s always had a strong connection to technology. With a deep knowledge of Microsoft and C#, Luce brings both technical expertise and a wealth of personal experience to her role. But her journey to becoming an advocate for neurodiversity was shaped by challenges she faced early in her career. “I started out as a software developer, and in some ways, the logic of coding suited my neurodiversity,” said Luce. However, she struggled with breaking tasks into manageable steps and understanding the broader impact of a single change. This led to feelings of impostor syndrome, where she often saw herself as a “beginner” despite her technical skills. What kept Luce moving forward, though, was her passion for learning and her desire to support others. She began sharing her knowledge through blog posts and tutorials, which eventually led to her becoming a Microsoft MVP—a recognition she’s received seven times—and finding a career in developer advocacy. Luce’s advice to her colleagues? 
“Find out how your colleagues work best and leverage that. Corporate life can sometimes be too structured, and people should be allowed to work in ways that make them feel comfortable.” For someone with ADHD like Luce, this often looks like periods of hyperfocus and intense productivity, followed by periods of distraction that appear to be an “inconsistent” working cadence. Part of MongoDB’s hybrid working approach means that employees are owners of their success and trusted to work in a way that’s best for them—here you have the flexibility to decide how you get your work done in a setting that’s most productive for you. Ronan: A problem-solver at heart Ronan, a Senior Escalation Engineer on MongoDB’s Technical Services team, has spent his career untangling complex technical issues under pressure. As part of the escalation team, his role involves solving high-stakes, challenging problems that require deep focus and problem-solving skills. But as Ronan reflects, his journey to understanding his neurodivergence wasn’t straightforward. “For a long time, I didn’t realize what made me different,” said Ronan. “I loved working on complex technical issues, but at the same time, simple tasks would leave me stressed and anxious.” It wasn’t until his niece was diagnosed with autism that Ronan started to recognize similarities between her experience and his own. After pursuing a diagnosis in 2024, he finally understood what had been holding him back for so long. Ronan has learned that his own autism means he’s a highly detailed thinker who thrives when faced with a problem to solve. However, hyperfocus—while a strength—can be overwhelming, especially when switching between tasks. Ronan admits that large Zoom calls or constant Slack pings can overstimulate him, so he prefers written communication where he can process information at his own pace. One of Ronan’s biggest lessons learned is that neurodivergence is unique to each individual. “There’s no one-size-fits-all,” he said. 
“Some people may have sensory needs, communication preferences, or support needs that are very different from mine. It’s important to learn what neurodivergence means for the person in front of you.” The power of Config: A place for connection Both Luce and Ronan have found a supportive community in Config, MongoDB’s Employee Resource Group (ERG) for neurodiverse employees and allies. For Luce, being one of the leads of Config is incredibly fulfilling. She’s able to interact with members, understand how neurodiversity impacts them, and offer support in tangible ways. “I set up an always-open Zoom call where members can drop by for ‘ body doubling ,’ a technique that can help boost productivity,” Luce noted. “It’s so rewarding to see how simple actions can make a big difference for someone.” Ronan also values the sense of connection that Config has given him. “It’s helped me find a place where I can share my experiences and learn from others who understand,” he says. “The leadership team is fantastic at organizing events, and I feel like I can finally talk about the challenges that I face as a neurodivergent employee without fear of being judged.” Embracing neurodivergence: What we can all learn Luce and Ronan both emphasize the importance of understanding that neurodivergence isn’t the same for everyone. As Ronan puts it, “If you’ve met one neurodivergent person, you’ve met one neurodivergent person.” Whether it’s about communication styles, sensory needs, or work preferences, the key takeaway is this: when we listen, adapt, and support one another, we create an environment where everyone can thrive. Ronan’s advice to other neurodivergent employees? “There’s nothing wrong with you. You are not broken.” For Luce, the message is clear: “If you’re struggling, you don’t have to do it alone.” Both of them remind us that support is out there, whether it’s from friends, colleagues, employee resource groups, or the neurodiverse community at large. 
At MongoDB, we’re committed to celebrating neurodiversity and creating a workplace where everyone’s unique strengths are recognized. By continuing to build a culture of empathy, understanding, and collaboration, we’re ensuring that everyone—neurodivergent or not—has the opportunity to shine. Learn more about employee resource groups and careers at MongoDB when you join our talent community .

April 28, 2025
Culture

Reimagining Legacy Systems with AI: Why We're Building the Future

This article was adapted from an interview with Galileo’s Chain of Thought podcast. Watch the full episode on YouTube. At MongoDB, we talk a lot about what it means to be at an inflection point—a moment where you can either maintain the status quo or redefine what's possible. For my team and for software engineers around the world, that inflection point is here. Now. Large language models (LLMs), agents, and now model context protocol (MCP) are fundamentally changing not just how we work, but what is possible with technology. So we must adapt; we have to build differently, and be faster. At MongoDB, we're creating something unique. Something smarter. We’re embracing and exploring everything that AI can do for developers. And we’re inviting the next generation of engineers to join us in shaping it. The hidden cost of legacy systems If you've spent any time in enterprise engineering, you know the challenge many organizations face: decades-old systems that are critical to a company’s operations but are held together with duct tape and wishful thinking. The developers who built them have long since moved on. The documentation is missing, outdated, or was never written. The last update was six years ago; some of the dependencies are abandoned. Whether you call it “tech debt” or “care and feeding” or “maintenance”, it consumes a huge fraction of our time and engineering budget—without a clear path forward. This is the reality many companies face—whether they’re in finance , healthcare, or the public sector—they have no choice but to pour millions of dollars into just keeping these systems afloat. Or do they? At MongoDB, we're building a new kind of engineering capability—one that combines the latest advancements in generative AI with the principles of forward-deployed engineering to help modernize legacy systems at a speed and scale that feels impossible. What is forward-deployed AI engineering? Because it’s new, you may not have heard of this role before. 
It's a bridge between engineering, consulting, and product development. As an AI Forward Deployed Engineer, you won’t just sit behind the scenes writing code. You'll be embedded with our customers, working side by side with their teams to solve real-world modernization challenges. No theory, just hands-on engineering at the sharpest edge of AI innovation. Your goal? Write software with AI, write software for AI, at speeds you’ve never experienced. In some tasks, we have benchmarked AI as being more than eight hundred times faster than a human being… working in that environment is, I assure you, radically different. You’ll deliver immediate, meaningful impact—whether that’s untangling a million-line code base, modernizing outdated Java frameworks, or helping teams migrate from niche, unsupported languages to modern tech stacks. MongoDB + AI: Changing the game One of the most exciting parts of our work is how AI is fundamentally changing what's possible. In the past, a modernization project like this might take five years and dozens of engineers. Today, with the power of MongoDB and AI, that same work is done in a fraction of the time—sometimes in months, sometimes in weeks—with a much smaller, highly focused team. We use LLMs and other AI tools to do things like: Add missing documentation to legacy code Write unit tests where none existed Remove outdated frameworks, replacing them with others Analyze and map massive, messy code bases Move between programming languages, frameworks, platforms, ORMs, databases, and front-end technologies. Let me be clear—none of this is easy, or on autopilot. We augment great engineers with powerful tools so they can focus on the work that matters most, so they can think big, and go far . Combining human expertise with AI capability leads to outcomes no one thought possible. Why it matters Modernizing these systems isn’t just about efficiency or cost savings—it’s mission-critical. 
The legacy platforms power trading systems, healthcare infrastructure, and governmental services that support real people every day. In multiple countries around the world. Being able to show them a fully functional prototype in weeks, instead of years, is game-changing. It proves what's possible. It builds momentum. It allows them to rethink how they, too, can structure teams and processes in a world where technology moves faster than ever. Why join MongoDB? If you're an engineer who thrives on autonomy, problem-solving, and building real solutions with immediate impact—and you want to work with AI every day in novel and complex ways— this is your chance. We’re looking for folks from a variety of backgrounds—software development, consulting, product engineering, or technical architecture—who are passionate about learning and applying new technologies, skilled communicators who are excited to partner with customers, and who are comfortable operating in ambiguity. As part of our Application Modernization function, you will: Work directly with customers and stakeholders Rapidly build solutions that solve meaningful business challenges Operate at the intersection of engineering, product, and consulting Learn and apply cutting-edge AI, inventing both new methodologies and technologies Be part of a small, fast-moving, high-impact team You might even write a white paper or two. You’ll also be part of a culture that values leadership at every level. At MongoDB, we believe in making it matter —and this role is designed to make change, not just for our customers, but for the software industry as a whole. If you’ve ever wanted to be part of a startup within an enterprise, this is your opportunity. We're building a new way of working — one where experimentation, agility, and ownership are at the core. Help us build what’s next At MongoDB, we’re not just modernizing applications—we’re modernizing how software is made . 
If that excites you, if you want to shape the future of software development alongside a team of builders, thinkers, and innovators—we’d love to hear from you. Check out our open roles and join us in redefining what's possible.

April 23, 2025
Culture

Transforming News Into Audio Experiences with MongoDB and AI

You wake up, brew your coffee, and start your day with a perfectly tailored podcast summarizing the latest news—delivered in a natural, engaging voice. No manual curation, no human narration, just seamless AI magic. Sounds like the future? It's happening now, powered by MongoDB and generative AI. In 2025, the demand for audio content—particularly podcasts—surged, with 9 million new active listeners in the United States alone, prompting news organizations to seek efficient ways to deliver daily summaries to their audiences. However, automating news delivery has proven to be a challenging task, as media outlets must manage dynamic article data and convert this information into high-quality audio formats at scale. To overcome these hurdles, media organizations can use MongoDB for data storage alongside generative AI for podcast creation, developing a scalable solution for automated news broadcasting. This approach unlocks new AI-driven business opportunities and can attract new customers while strengthening the loyalty of existing ones, contributing to increased revenue streams for media outlets. Check out our AI Learning Hub to learn more about building AI-powered apps with MongoDB. The secret sauce: MongoDB + AI In a news automation solution, MongoDB acts as the system’s backbone, storing news article information as flexible documents with fields like title, content, and publication date—all within a single collection. Alongside this, dynamic elements (such as the number of qualified reads) can be seamlessly integrated into the same document to track content popularity. Moreover, derived insights—e.g., sentiment analysis and key entities—can be generated and enriched through a gen AI pipeline directly within the existing collection. Figure 1. MongoDB data storage for media. This adaptable data structure ensures that the system remains both efficient and scalable, regardless of content diversity or evolving features. 
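To make the single-collection idea concrete, here is a sketch of what one news-article document might look like, with source fields, an engagement counter, and gen AI-derived insights side by side. All field names and values are hypothetical, not a prescribed schema.

```python
# Hypothetical shape of a single news-article document: publisher fields,
# dynamic engagement metrics, and AI-derived insights live in one document.
from datetime import datetime, timezone

article = {
    "title": "Markets Rally on Rate Cut Hopes",
    "content": "Stocks climbed on Tuesday as ...",
    "publicationDate": datetime(2025, 4, 15, tzinfo=timezone.utc),
    "readCount": 4821,                      # dynamic engagement metric
    "insights": {                           # enriched by a gen AI pipeline
        "sentiment": "positive",
        "entities": ["Federal Reserve", "S&P 500"],
    },
    "embedding": [0.01, -0.27, 0.54],       # placeholder vector for semantic search
}

# New derived fields can be added later without a schema migration:
article["insights"]["keywords"] = ["rates", "equities"]
```

Because the document carries everything the pipeline needs, enrichment steps can append fields (keywords, summaries, scores) over time without touching any other record.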
As a result, media outlets have created a robust framework to query and extract the latest news and metadata from MongoDB. They can now integrate AI with advanced language models to transform this information into an audio podcast. With this foundation in place, let's examine why MongoDB is well-suited for implementing AI-driven applications. Why MongoDB is the perfect fit News data is inherently diverse, with each article containing a unique mix of attributes, including main content fields (e.g., id, title, body, date, imageURL), calculated metadata (e.g., read count), fields generated with the help of gen AI (e.g., keywords, sentiment), and embeddings for semantic/vector search. Some of these elements originate from publishers, while others emerge from user interactions or AI-driven analysis. MongoDB’s flexible document model accommodates all these attributes—whether predefined or dynamically generated—within a single, adaptable structure. This eliminates the rigidity of traditional databases and ensures that the system evolves seamlessly alongside the data it manages. What’s more, speed is critical in news automation. By storing complete, self-contained documents, MongoDB enables rapid retrieval and processing without the need for complex joins. This efficiency allows articles to be enriched, analyzed, and transformed into audio content in near real-time. And scalability is built in. Whether handling a small stream of updates or processing vast amounts of constantly changing data, MongoDB’s distributed architecture ensures high availability and seamless growth, making it ideal for large-scale media applications. Last but hardly least, developers benefit from MongoDB’s agility. Without the constraints of fixed schemas, new data points—whether from evolving AI models, audience engagement metrics, or editorial enhancements—can be integrated effortlessly.
This flexibility allows teams to experiment, iterate, and scale without friction, ensuring that the system remains future-proof as news consumption evolves. Figure 2. MongoDB benefits for AI-driven applications. Bringing news to life with generative AI Selecting MongoDB for database storage is just the beginning; the real magic unfolds when text meets AI-powered speech synthesis. In our labs, we have experimented with Google’s NotebookLM model to refine news text, ensuring smooth narration with accurate intonation and pacing. Putting all these pieces together, the diagram below illustrates the workflow for converting AI-generated news summaries into audio. Figure 3. AI-based text-to-audio conversion architecture. The process begins with a script that retrieves relevant news articles from MongoDB, using the Aggregation Framework and Vector Search to ensure semantic relevance. These selected articles are then passed through an AI-powered pipeline, where they are condensed into a structured podcast script featuring multiple voices. Once the script is refined, advanced text-to-speech models transform it into high-quality audio, which is stored as a .wav file. To optimize delivery, the generated podcast is cached, ensuring seamless playback for users on demand. The result? A polished, human-like narration, ready for listeners in MP3 format. Thanks to this implementation, media outlets can finally let go of the robotic voices of past automations. Instead, they can now deliver a listening experience to their customers that's human, engaging, and professional. The future of AI-powered news consumption This system isn’t just a technological innovation; it’s a revolution in how we consume news. By combining MongoDB’s efficiency with AI’s creative capabilities, media organizations can deliver personalized, real-time news summaries without human intervention. It’s faster, smarter, and scalable—ushering in a new era of automated audio content.
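The retrieval step described above can be sketched as an aggregation pipeline. The sketch below assumes an Atlas Vector Search index named "news_vectors" on an "embedding" field; both names, and the placeholder query vector, are hypothetical.

```python
# Sketch of semantic retrieval with Atlas Vector Search, expressed as a
# plain Python pipeline definition. Index and field names are assumptions.
query_vector = [0.12, -0.04, 0.33]  # placeholder embedding of the day's topic

pipeline = [
    {
        "$vectorSearch": {
            "index": "news_vectors",     # hypothetical Atlas index name
            "path": "embedding",         # field holding article embeddings
            "queryVector": query_vector,
            "numCandidates": 200,        # candidates scanned before ranking
            "limit": 10,                 # articles handed to the script step
        }
    },
    # Keep only the fields the podcast-script generation step needs.
    {"$project": {"title": 1, "content": 1, "publicationDate": 1}},
]

# With pymongo this would run as: articles.aggregate(pipeline)
```

The `$vectorSearch` stage must be the first stage of the pipeline; downstream stages such as `$project` then shape the results before they reach the LLM that drafts the podcast script.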
Want to build the next-gen AI-powered media platform? Start with MongoDB and let your content speak for itself! To learn more about integrating AI into media systems using MongoDB, check out the following resources to guide your next steps: The MongoDB Solutions Library: Gen AI-powered video summarization The MongoDB Blog: AI-Powered Media Personalization: MongoDB and Vector Search

April 21, 2025
Artificial Intelligence

Away From the Keyboard: Kyle Lai, Software Engineer 2

In “Away From the Keyboard,” MongoDB developers discuss what they do, how they keep a healthy work-life balance, and their advice for people seeking a more holistic approach to coding. In this article, Kyle Lai describes his role as a Software Engineer 2 at MongoDB; why he’d rather not be like the characters on the TV show, Severance; and how his commute helps set boundaries between his professional and personal lives. Q: What do you do at MongoDB? Kyle: Hi! I’m an engineer on Atlas Growth 1, where we run experiments on Atlas and coordinate closely with marketing, product, design, and analytics to improve the user experience. Atlas Growth 1 is part of the broader Atlas Growth engineering teams, where we own the experimentation platform and experiment software development kit, allowing other teams to run experiments as well! The engineers on Atlas Growth are very involved with the product side of our experiments. We help the analytics team collect metrics and decide if a given experiment was a win. Sometimes, seemingly obvious positive improvements can turn out to be detrimental to the user flow, so our experimentation process allows us to learn greatly about our users, whether the experiment wins or not. Q: What does work-life balance look like for you? Kyle: Work-life balance for me means that I won’t be worrying about responding to messages or needing to open my laptop after work hours. It also means that my teammates equally respect my work-life balance and do not expect me to work during non-work hours. Q: How do you ensure you set boundaries between work and personal life? Kyle: Generally, for me, it’s more difficult to set boundaries between work and personal life when I’m working from home, so I try to come into the office most days. My commute also provides me with time to wind down and signal that work is over for the day. 
In a way, the drive to and from the train station allows me to transition to getting into the mindset for work or to decompress at the end of the day. Q: Has work-life balance always been a priority for you, or did you develop it later in your career? Kyle: As someone who is early in my career, work-life balance is something that I’ve grown to appreciate and see as a priority in my life. It allows me to enjoy my personal life, and definitely contributes to a healthier me and a healthier team. Q: What benefits has this balance given you in your career? Kyle: Our team has a weekly Friday hangout meeting, where we have a different question posed to us each week. One of the questions was based on the TV show, Severance. Would we choose to be “severed” like the characters in the show? They undergo a procedure that separates their work and personal brains—their work brains have no awareness of their personal lives, and vice versa. As someone who hasn’t seen the show, but has heard about it from the rest of my team, I wouldn’t do it. Balancing my work and personal lives allows me to enjoy each side more. I’m motivated for the end of the week so I can enjoy the weekend, and I’m also excited to come to work with a fresh headspace on Mondays, since I am not overworking during non-work hours. Q: What advice would you give to someone seeking to find a better balance? Kyle: I’ll sometimes have the urge to continue working past work hours, as I’ll feel like I’m about to finish whatever task I’m working on very soon or think I can get even more done if I don’t stop working. That backfires pretty quickly. You have to realize you can be easily fatigued and are not able to give your best work if you constantly keep working. A proper work-life balance will allow you to have a fresh start and a clear mind each day. 
As for how to better separate work and personal life, I’d suggest changing notification settings on your phone for Slack, so that non-urgent work messages won’t tempt you to open your laptop. Another strategy would be to associate some event with a cutoff for checking work things, such as not reading messages once you’ve left the office or boarded the train. I’ve had teammates tell me they delete Slack from their phones when they’re on vacation, which is a good idea! Thank you to Kyle Lai for sharing these insights! And thanks to all of you for reading. For past articles in this series, check out our interviews with: Staff Engineer, Ariel Hou Senior AI Developer Advocate, Apoorva Joshi Developer Advocate, Anaiya Raisinghani Senior Partner Marketing Manager, Rafa Liou Staff Software Engineer, Everton Agner Interested in learning more about or connecting more with MongoDB? Join our MongoDB Community to meet other community members, hear about inspiring topics, and receive the latest MongoDB news and events. And let us know if you have any questions for our future guests when it comes to building a better work-life balance as developers. Tag us on social media: @/mongodb #LoveYourDevelopers #AwayFromTheKeyboard

April 17, 2025
Culture

Unlocking BI Potential with DataGenie & MongoDB

Business intelligence (BI) plays a pivotal role in strategic decision-making. Enterprises collect massive amounts of data yet struggle to convert it into actionable insights. Conventional BI is reactive, constrained by predefined dashboards, and human-dependent, making it error-prone and difficult to scale. Businesses today are data-rich but insight-poor. Enter DataGenie, powered by MongoDB—BI reimagined for the modern enterprise. DataGenie autonomously tracks millions of metrics across the entire business datascape. It learns complex trends like seasonality, discovers correlations & causations, detects issues & opportunities, connects the dots across related items, and delivers 5 to 10 prioritized actionable insights as stories in natural language to non-data-savvy business users. This enables business leaders to make bold, data-backed decisions without the need for manual data analysis. With advanced natural language capabilities through Talk to Data, users can query their data conversationally, making analytics truly accessible. The challenges: Why DataGenie needed a change DataGenie processes large volumes of enterprise data on a daily basis for customers, tracking billions of time series metrics and performing anomaly detection autonomously to generate deep, connected insights for business users. The diagram below shows the functional layers of DataGenie. Figure 1. DataGenie’s functional layers. Central to the capability of DataGenie is the metrics store, which stores, rolls up, and serves billions of metrics. At DataGenie, we were using an RDBMS (PostgreSQL) as the metrics store. As we scaled to larger enterprise customers, DataGenie processed significantly higher volumes of data. The complex feature sets we were building also required enormous flexibility and low latency in how we store & retrieve our metrics.
DataGenie had multiple components that served different purposes, and all of these had to be scaled independently to meet our sub-second latency requirements. We had used PostgreSQL as the metrics store for quite some time and tried to squeeze the maximum out of it, at the cost of flexibility. Because we over-optimized its structure for performance, we lost the flexibility we needed to build our next-gen features, which were extremely demanding. We also defaulted to PostgreSQL for storing the insights (i.e., stories), again optimized for storage and speed, which hurt us on flexibility. For the vector store, we had been using ChromaDB to store all our vector embeddings; as data volumes grew, the most challenging part was keeping the data in sync. On top of that, we had to use a different data store for the knowledge store and yet another technology for caching. The major problems we faced were as follows: a rigid schema that hindered flexibility for evolving data needs; high latency and processing costs due to the extensive preprocessing needed to achieve the desired structure; and slow development cycles that hampered rapid innovation. How MongoDB gave DataGenie a superpower After extensive experimentation with time-series databases, document databases, and vector stores, we realized that MongoDB would be the perfect fit for us, since it solved all our requirements with a single database. Figure 2. MongoDB data store architecture. Metrics store When we migrated to MongoDB, we achieved a remarkable reduction in query latency. Previously, complex queries on 120 million documents took around 3 seconds to execute. With MongoDB's efficient architecture, we brought this down to an impressive 350-500 milliseconds for 500M+ documents, an 85-90% improvement in query speed at a much larger scale. Additionally, for storing metrics, we transitioned to a key-value pair schema in MongoDB.
This change allowed us to reduce our data volume significantly—from 300 million documents to just 10 million—thanks to MongoDB's flexible schema and optimized storage. This optimization not only reduced our storage footprint for metrics but also enhanced query efficiency. Insights store By leveraging MongoDB for the insight service, we eliminated the need for extensive post-processing, which previously consumed substantial computational resources. This resulted in a significant cost advantage, reducing our Spark processing costs by 90% or more (from $80 to $8 per job). Querying 10,000+ insights previously took a minute. With MongoDB, the same task is now completed in under 6 seconds—a 10x improvement in performance. MongoDB’s flexible aggregation pipeline was instrumental in achieving these results. For example, we extensively use dynamic filter presets to control which insights are shown to which users, based on their role & authority. The MongoDB aggregation pipeline dynamically adapts to user configurations, retrieving only the data that’s relevant. LLM service & vector store The Genie+ feature in DataGenie is our LLM-powered application that unifies all DataGenie features through a conversational interface. We leverage MongoDB as a vector database to store KPI details, dimensions, and dimension values. Each vector document embeds essential metadata, facilitating fast and accurate retrieval for LLM-based queries. By serving as the vector store for DataGenie, MongoDB enables efficient semantic search, allowing the LLM to retrieve contextual, relevant KPIs, dimensions, and values with minimal latency, enhancing the accuracy and responsiveness of Genie+ interactions. Additionally, integrating MongoDB Atlas Search for semantic search significantly improved performance. It provided faster, more relevant results while minimizing integration challenges. MongoDB’s schema-less design and scalable architecture also streamlined data management.
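The "dynamic filter preset" idea can be sketched as a small pipeline builder: a `$match` stage is assembled from a per-user preset before the aggregation runs. This is a hypothetical illustration of the pattern; the field names, preset keys, and defaults are assumptions, not DataGenie's actual implementation.

```python
# Hypothetical sketch: build an aggregation pipeline from a user's
# role-based filter preset, so each user sees only relevant insights.
def build_insights_pipeline(preset: dict) -> list:
    """Assemble a MongoDB aggregation pipeline from a filter preset."""
    match = {"severity": {"$gte": preset.get("min_severity", 0)}}
    if "regions" in preset:
        match["region"] = {"$in": preset["regions"]}
    return [
        {"$match": match},                          # filter to allowed scope
        {"$sort": {"impactScore": -1}},             # most impactful first
        {"$limit": preset.get("max_insights", 10)}, # cap per user
    ]

# Example: an analyst limited to EMEA, top five insights only.
analyst_preset = {"regions": ["EMEA"], "min_severity": 3, "max_insights": 5}
pipeline = build_insights_pipeline(analyst_preset)
# Pass to MongoDB via collection.aggregate(pipeline) in pymongo.
```

Because the pipeline is just data, it can be composed per request from role and authority settings, which is what lets a single insights collection serve differently scoped views to different users.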
Knowledge store & cache MongoDB’s schema-less design enables us to store complex, dynamic relationships and scale them with ease. We also shifted to using MongoDB as our caching layer. Previously, having separate data stores made syncing and maintenance cumbersome. Centralizing this information in MongoDB simplified operations, enabled automatic syncing, and ensured consistent data availability across all features. With MongoDB, DataGenie is reducing time-to-market for feature releases. Although we started the MongoDB migration to solve only our existing scalability and latency issues, we soon realized that just by migrating to MongoDB, we could imagine even bigger and more demanding features without engineering limitations. Figure 3. MongoDB + DataGenie integration. The DataGenie engineering team refers to this as its “v2 magic moment,” since migrating to MongoDB made it much easier and more flexible to roll out the following new features: DataGenie Nirvana: A delay in the supply chain for a raw material can cascade into a revenue impact. Conventional analytics relies on complex ETL pipelines and data marts to unify disparate data and deliver connected dashboard metrics. DataGenie Nirvana eliminates the need for a centralized data lake by independently generating aggregate metrics from each source and applying advanced correlation and causation algorithms on aggregated data to detect hidden connections. DataGenie Wisdom: Wisdom leverages an agentic framework and knowledge stores to achieve two outcomes: Guided onboarding: Onboarding a new use case in DataGenie is as simple as explaining the business problem, success criteria, and sharing sample data - DataGenie autonomously configures itself for relevant metrics tracking to deliver the desired outcome. Next best action: DataGenie autonomously surfaces insights - like a 10% brand adoption spike in a specific market and customer demographic.
By leveraging enterprise knowledge bases and domain-specific learning, DataGenie would propose targeted marketing campaigns as the Next Best Action for this insight.

Powered by Genie: DataGenie offers powerful augmented analytics that can be quickly configured for any use case and integrated through secure, high-performance APIs. This powers data products in multiple verticals, including healthcare and FinOps, delivering compelling augmented analytics as a premium add-on while drastically reducing their engineering burden and go-to-market risk.

All of these advanced features require enormous schema flexibility, low-latency aggregation, and a vector database that is always in sync with the metrics and insights. That's exactly what we get with MongoDB! Powered by MongoDB Atlas, DataGenie delivers actionable insights to enterprises, helping them unlock new revenue potential and reduce costs. The following are some DataGenie use cases in retail:

Demand shifts & forecasting: Proactively adjust inventory or revise marketing strategies based on changes in product demand.
Promotional effectiveness: Optimize marketing spend by understanding which promotions resonate with which customer segments.
Customer segmentation & personalization: Personalize offers based on customer behavior and demographics.
Supply chain & logistics: Minimize disruptions by identifying potential bottlenecks and proposing alternative solutions.
Inventory optimization: Streamline inventory management by flagging potential stockouts or overstock.
Fraud & loss prevention: Detect anomalies in transaction data that may signal fraud or errors.
Customer retention & loyalty: Propose retention strategies to address customer churn.
Staffing optimization: Optimize customer support staffing.

Final thoughts

Migrating to MongoDB did more than just solve DataGenie's scalability and latency challenges - it unlocked new possibilities.
The flexibility of MongoDB allowed DataGenie to innovate faster and conceptualize new features such as Nirvana, Wisdom, and ultra-efficient microservices. This transformation stands as a proof point for other product companies considering a partnership with MongoDB. The partnership between DataGenie and MongoDB is a testament to how the right technology choices can drive massive business value, improving performance, scalability, and cost-efficiency. Ready to unlock deeper retail insights? Head over to our retail page to learn more. Check out our Atlas Learning Hub to boost your MongoDB skills.

April 16, 2025
Applied

Introducing Database Digest: Building Foundations for Success

Today at MongoDB.local Toronto, I'm excited to share the first issue of Database Digest, MongoDB's new digital magazine that explores the critical role of data in powering modern applications. This inaugural issue explores modern data architecture and shows how, when the right data foundation meets emerging technologies, pioneering companies are fundamentally reimagining what's possible.

The dawn of data ubiquity

We currently stand in what McKinsey calls the "data ubiquity era," with data flowing through organizations as essentially as electricity flows through the modern world. The transformation to this era has brought both unprecedented opportunity and formidable challenges. Organizations must simultaneously manage huge volumes of data while delivering the real-time, personalized experiences that define competitive advantage today. Doing so successfully doesn't mean merely adopting new technologies. Instead, it requires fundamentally rethinking how data is stored, processed, and leveraged to drive business value. Traditional relational database systems simply cannot meet these demands. The future belongs to organizations with data architectures designed for the agility, scalability, and versatility needed to handle diverse data types while seamlessly integrating with emerging technologies like AI.

The cornerstone of AI success

The rise of AI and the speed at which the market has been changing have fundamentally shifted the importance of adaptability. However, software can only adapt as fast as its foundation allows. At MongoDB, we believe modern databases are the cornerstone of the age of AI, providing the essential capabilities needed for success in this new era. To do so, they must be able to:

Handle all forms of data and provide intelligent search: Modern databases consolidate structured and unstructured data into a single system, eliminating silos that restrict AI innovation. They ground AI output in accurate, contextual data that drives better outcomes.
Scale without constraints and react instantly: Databases should adapt to unpredictable workloads and massive data volumes without performance degradation, enabling real-time decisions and actions when opportunities or threats emerge.
Embed domain-specific AI and secure data throughout: Modern databases enhance accuracy with specialized models that reduce hallucinations, and they protect information at every stage without sacrificing speed or functionality.

Figure 1. Modern database demands.

The impact of a modern database on AI innovation isn't theoretical: we're seeing organizations like Swisscom leverage this approach to apply generative AI to their extensive expert content library, transforming how they serve the banking industry by delivering bespoke, contextual information within seconds.

The AI revolution

Perhaps nowhere is the impact of a modern data foundation more profound than in the rapid evolution of AI applications. In just a short time, we've progressed from simple LLM-powered chatbots to more advanced agentic systems capable of understanding complex goals, breaking them into manageable steps, and executing them autonomously, all while maintaining context and learning from interactions. This represents more than incremental progress; it's a fundamental shift in how AI serves as a strategic partner in solving business challenges. MongoDB sits at the heart of this transformation, providing the critical bridge between AI models and data while enabling vector storage, real-time processing, and seamless integration with LLM orchestrators. Companies like Questflow demonstrate the power of this approach, revolutionizing the future of work through a decentralized, autonomous AI agent network that orchestrates multiple AI agents collaborating with humans.
By leveraging MongoDB's flexible document model and vector search capabilities, they're enabling startups to create dynamic AI solutions that handle everything from data analysis to content creation, all while maintaining context and learning from interactions.

Modernizing legacy systems: the strategic imperative

For established enterprises, the journey to a modern data foundation often begins with addressing the legacy systems that consume up to 80% of IT budgets yet constrain innovation. Modernization isn't just a technical upgrade; it's a strategic move toward growth, efficiency, and competitive advantage. The evidence is compelling: Bendigo and Adelaide Bank achieved a staggering 90% reduction in both the time and cost of modernizing core banking applications using MongoDB's repeatable modernization framework and AI-powered migration tools. This transformation isn't just about cost savings; it's about creating the foundation for entirely new capabilities that drive business value. Modern data architecture must embody flexibility across multiple dimensions, from supporting diverse data models to providing deployment options spanning cloud-native, cloud-agnostic, and on-premises environments. This approach enables organizations to break free from silos, integrate AI capabilities, and create a unified data foundation supporting both current operations and future innovations.

What's next for data

The organizations featured throughout Database Digest share a common vision: they recognize that tomorrow's success depends on today's data foundation. The convergence of flexible document models, advanced AI integration, and cloud-native capabilities isn't just enabling incremental improvements; it's powering applications and experiences that were never before possible. So I invite you to explore the full, inaugural issue of Database Digest to discover how MongoDB is helping organizations across industries build the foundation for tomorrow's success.
This isn't just about technology; it's about creating the foundation for transformation that delivers real business value in our increasingly data-driven world. Visit mongodb.com/library/database-digest to download your copy and join us on this journey into the future of data.

April 15, 2025
News

Now Generally Available: 7 New Resource Policies to Strengthen Atlas Security

Organizations demand a scalable means to enforce security and governance controls across their database deployments without slowing developer productivity. To address this, MongoDB introduced resource policies in public preview on February 10, 2025. Resource policies enable organization administrators to set up automated, organization-wide 'guardrails' for their MongoDB Atlas deployments. At public preview, three policies were released to this end. Today, MongoDB is announcing the general availability of resource policies in MongoDB Atlas. This release introduces seven additional policies and a new graphical user interface (GUI) for creating and managing policies. These enhancements give organizations greater control over MongoDB Atlas configurations, simplifying security and compliance automation.

How resource policies enable secure innovation

Innovation is essential for organizations to maintain competitiveness in a rapidly evolving global landscape. Companies with higher levels of innovation outperform their peers financially, according to a Cornell University study analyzing S&P 500 companies between 1998 and 2023¹. One of the most effective ways to drive innovation is to equip developers with the right tools and give them the autonomy to put those tools into action². However, without standardized controls governing those tools, developers can inadvertently configure Atlas clusters in ways that deviate from corporate or regulatory best practices. Manual approval processes for every new project create delays. Concurrently, platform teams struggle to enforce consistent security policies across the organization, leading to increased complexity and costs. As cybersecurity threats evolve daily and regulations tighten, granting developers autonomy and quickly provisioning access to essential tools can introduce risks. Organizations must implement strong security measures to maintain compliance and enable secure innovation.
Resource policies empower organizations to enforce security and compliance standards across their entire Atlas environment. Instead of targeting specific user groups, these policies establish organization-wide guardrails that govern how Atlas can be configured. This reduces the risk of misconfigurations and security gaps. With resource policies, security and compliance standards are applied automatically across all Atlas projects and clusters. This eliminates the need for manual approvals. Developers gain self-service access to the resources they need while remaining within approved organizational boundaries. Simultaneously, platform teams can centrally manage resource policies to ensure consistency and free up time for strategic initiatives. Resource policies strengthen security, streamline operations, and help accelerate innovation by automating guardrails and simplifying governance. Organizations can scale securely while empowering developers to move faster without compromising compliance.

What resource policies are available?

Policy type: Restrict cloud provider
Description: Ensure clusters are only deployed on approved cloud providers (AWS, Azure, or Google Cloud). This prevents accidental or unauthorized deployments in unapproved environments and supports organizations in meeting regulatory or business requirements.
Available since: Public preview
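To make the "restrict cloud provider" guardrail concrete, the sketch below shows what such a policy could look like as a declarative forbid rule in a Cedar-style syntax. This is an illustrative sketch only; the specific action and attribute identifiers (`cluster.createEdit`, `context.cluster.cloudProviders`) are assumptions, so consult the Atlas resource-policy documentation for the authoritative syntax:

```
forbid (
  principal,
  action == cloud::Action::"cluster.createEdit",
  resource
) when {
  // Block cluster creation or edits that target an unapproved
  // provider (attribute and operator names are assumptions).
  context.cluster.cloudProviders.containsAny([cloud::cloudProvider::"azure"])
};
```

Because the rule is evaluated organization-wide, a developer attempting to deploy on the forbidden provider is stopped automatically, with no manual review step in the loop.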

April 14, 2025
Updates

Ready to get Started with MongoDB Atlas?

Start Free