After nearly 100 years as the largest U.S.-based business media brand, Forbes has established itself as a technology leader in the news industry. To compete in a new mobile environment, Forbes designed a next-generation mobile application to better engage users with its stories, and turned to MongoDB to create a new infrastructure for engaging, dynamic content.
Steven Bond, the group director for the Forbes.com Software Development Team, chose MongoDB for its intuitive web interface, ease of use, and low cost. Says Bond of his experience with MongoDB: “It just works.”
MongoDB made it possible for Forbes.com to store all of its data in a single database. This database contains information on nearly one million articles from thousands of global contributors, and more than 120,000 user, company, and place list entries. With MongoDB, Forbes is able to aggregate its data, connect it to its mobile and web applications, and integrate partner feeds from a centralized location, creating a rich user experience.
“The beauty of MongoDB is that we can constantly evolve without reengineering our entire approach,” says Bond. In his next project, Bond aims to use social media statistics to predict where users will consume content in the future and the kinds of content that will drive traffic. With MongoDB, Bond will help Forbes change how news is consumed and understood.
Using Big Data for Humanitarian Crisis Mapping
In the wake of natural disasters like Typhoon Haiyan, which brought widespread destruction to the Philippines several weeks ago, data management tools have become a critical component of the post-disaster landscape. Aid groups are monitoring tweets and instant messages where the infrastructure exists to support them, while tracking local news reports on the ground, to find the areas suffering the greatest damage and direct resources to those most in need. Sourcing data this way can significantly improve the efforts of aid initiatives after a disaster.

Big data for development, or data philanthropy, streamlines crisis management and prevention by using data processing tools to anticipate and respond to humanitarian emergencies. Initiatives like the UN Global Pulse team are using data to find the “digital smoke signals of distress” that can appear months before trouble shows up in official reports. Real-time monitoring of social networks, cell phones, blogs, and online commerce platforms can alert the team to indicators of social distress or natural disaster. And with the capacity to recognize these trends comes the ability to prepare the aid or prevention plan that could save lives.

What Big Data Can Do

Big data can create a clear picture of a disaster’s regional effects. A program called Ushahidi sourced eyewitness reports (in person and through social media) of the 2010 earthquake in Haiti. The reports’ data became a live crisis map, showing where victims lay buried under collapsed buildings and where aid was most needed. After Typhoon Bopha struck the Philippines last year, the Digital Humanitarian Initiative used over 20,000 social media messages to create a map of the storm’s impact and determine where to send aid first.

Some organizations believe data for development can also soothe social discontent. CNN reported that the U.S. State Department has analyzed data to try to prevent conflict from starting or escalating. Its Conflict and Stabilization Operations office analyzes behavioral patterns and semantic trends in social media to anticipate threats to peace and to design strategies that thwart potential outbreaks of violence.

Partnerships for Philanthropy

As the data philanthropy movement grows, the tech industry will be watching which companies and corporations are the first to join this global project. Twitter, Facebook, or Instagram might help us move toward a future where disease or disaster can be instantly monitored and possibly prevented, or where the spread of poverty can be stopped in its tracks. But the success of these new ventures will depend on more than the determination of the people who work on them. Small humanitarian initiatives will need to develop partnerships with the larger corporations that control telecommunications and census data. Without access to big data and the proper processing tools, data philanthropy groups will not be able to keep up with the demands of crises happening in real time.

Going Forward

The United Nations Office for the Coordination of Humanitarian Affairs released a report this past June on the importance of big data to humanitarianism. Finding ways to improve humanitarian aid services with data is one of the great challenges and opportunities of our age. But accessing that data is not necessarily straightforward: negotiating with data providers can be difficult, and privacy concerns could make corporations unwilling to participate.
And while big data processing can be used to improve lives, it should augment existing data gathering methods, not replace them.

MongoDB has already helped several organizations use data mining to augment public service. The city of Chicago used MongoDB to design WindyGrid, a geographic information system that provides a unified view of the city’s operations across a map. Real-time data, such as 911 and 311 service calls, is geospatially enabled and tracked to help Chicago’s Office of Emergency Management and Communications handle events and crises across the city. To explore the frontiers of physics, CERN built its Data Aggregation System (DAS) on MongoDB to help physicists search for and aggregate information across complex data landscapes. The data and metadata CERN handles are constantly evolving, but DAS allows researchers to find information with text-based queries, aggregating results from distributed providers while preserving integrity and security. While these organizations haven’t used data mining directly for humanitarian aid, the same techniques can easily be adapted to philanthropic service.

Data philanthropy has the potential to influence humanitarian efforts and change how we understand the scope of big data. As these aid organizations grow in influence, it will be interesting to see how the industry shifts to make room for this new use of data.
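To ground the WindyGrid example above, here is a minimal sketch of the kind of geospatial query such a system relies on, written with PyMongo. The `incidents` collection, its fields, and the sample coordinates are assumptions for illustration, not details from the WindyGrid system itself.

```python
from pymongo import MongoClient, GEOSPHERE

client = MongoClient("mongodb://localhost:27017")
incidents = client.city.incidents  # hypothetical collection of 911/311 service calls

# Documents are assumed to carry a GeoJSON point, e.g.:
# {"type": "911", "received_at": ..., "location": {"type": "Point", "coordinates": [lng, lat]}}
incidents.create_index([("location", GEOSPHERE)])  # a 2dsphere index enables geo queries

# Find 911 calls within 2 km of a point (downtown Chicago), nearest first.
nearby = incidents.find({
    "type": "911",
    "location": {
        "$near": {
            "$geometry": {"type": "Point", "coordinates": [-87.6298, 41.8781]},
            "$maxDistance": 2000,  # meters
        }
    },
})
for call in nearby:
    print(call["type"], call["location"]["coordinates"])
```

With the index in place, queries like this return the service calls closest to any point on the map, which is exactly the kind of lookup a unified operations view needs.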
How DataSwitch and MongoDB Atlas Can Help Modernize Your Legacy Workloads
Data modernization is here to stay, and DataSwitch and MongoDB are leading the way forward. Research strongly indicates that the future of the database management system (DBMS) market is in the cloud, and the ideal way to shift from an outdated, legacy DBMS to a modern, cloud-friendly data warehouse is through data modernization.

There are a few key factors driving this shift. Increasingly, companies need to store and manage unstructured data in a cloud-enabled system, whereas a legacy DBMS is designed only for structured data. Moreover, the amount of data generated by a business is increasing at a rate of 55% to 65% every year, and the majority of it is unstructured. A modernized database that improves data quality and availability provides tremendous benefits in performance, scalability, and cost optimization. It also provides a foundation for improving business value through informed decision-making. Additionally, cloud-enabled databases support greater agility, so you can upgrade current applications and build new ones faster to meet customer demand.

Gartner predicts that by 2022, 75% of all databases will be on the cloud, either by direct deployment or through data migration and modernization. But research shows that over 40% of migration projects fail, due to challenges such as:

● Inadequate knowledge of legacy applications and their data design
● Complexity of code and design across different legacy applications
● Lack of automation tools for transforming legacy data processing into cloud-friendly data and processes

It is essential to take a strategic approach and choose the right partner for your data modernization journey. We’re here to help you do just that.

Why MongoDB?

MongoDB is built for modern application developers and for the cloud era. As a general-purpose, document-based, distributed database, it facilitates high productivity and can handle huge volumes of data. The document database stores data in JSON-like documents and is built on a scale-out architecture that suits any developer building scalable applications through agile methodologies. Ultimately, MongoDB fosters business agility, scalability, and innovation.

Key MongoDB advantages include:

● Rich JSON documents
● A powerful query language
● Multi-cloud data distribution
● Security for sensitive data
● Quick storage and retrieval of data
● Capacity for huge volumes of data and traffic
● A design that supports greater developer productivity
● Extreme reliability for mission-critical workloads
● An architecture built for optimal performance and efficiency

Key advantages of MongoDB Atlas, MongoDB’s hosted database as a service, include:

● Multi-cloud data distribution
● Secure handling of sensitive data
● Designed for developer productivity
● Reliable for mission-critical workloads
● Built for optimal performance
● Managed for operational efficiency

To be clear, JSON documents are the most productive way to work with data: they support nested objects and arrays as values, and they allow schemas that are flexible and dynamic. MongoDB’s powerful query language enables sorting and filtering on any field, regardless of how deeply nested it is within a document. It also supports aggregations as well as modern use cases such as graph search, geo-based search, and text search. Queries are themselves expressed in JSON and are easy to compose. MongoDB supports joins in queries, and it supports two styles of relationships, with the ability to both reference and embed. It has all the power of a relational database and much, much more. The sketch below illustrates a few of these capabilities.
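Here is a minimal PyMongo sketch of nested-field filtering and a reference-style join. The `orders` and `customers` collections, their fields, and the connection string are hypothetical, chosen only to illustrate the query language:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # an Atlas URI works the same way
db = client.shop  # hypothetical database for illustration

# Embedding: filter and sort on a nested field with dot notation,
# no matter how deeply it sits inside the document.
chicago_orders = db.orders.find(
    {"customer.address.city": "Chicago"}
).sort("placed_at", -1)

# Referencing: if customers live in their own collection instead,
# $lookup joins them to orders inside an aggregation pipeline.
pipeline = [
    {"$lookup": {
        "from": "customers",          # referenced collection
        "localField": "customer_id",  # field in orders
        "foreignField": "_id",        # field in customers
        "as": "customer",
    }},
    {"$unwind": "$customer"},
    {"$group": {"_id": "$customer.city", "revenue": {"$sum": "$amount"}}},
    {"$sort": {"revenue": -1}},
]
for row in db.orders.aggregate(pipeline):
    print(row["_id"], row["revenue"])
```

Queries and pipelines are plain JSON-like structures, which is what makes them easy to compose and to generate programmatically.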
Companies of all sizes can use MongoDB, which operates on a large and mature platform ecosystem. Developers enjoy a great user experience, with the ability to provision MongoDB Atlas clusters and commence coding instantly. A global community of developers and consultants makes it easy to get the help you need, if and when you need it. In addition, MongoDB supports all major languages and provides enterprise-grade support.

Why DataSwitch as a partner for MongoDB?

DataSwitch is a trusted partner for cost-effective, accelerated solutions for digital data transformation, migration, and modernization through a modern database platform, offering automated schema redesign, data migration, and code conversion. Our no-code and low-code solutions, along with cloud data expertise and unique, automated schema generation, accelerate time to market. We provide end-to-end data, schema, and process migration with automated replatforming and refactoring, thereby delivering:

● 50% faster time to market
● 60% reduction in total cost of delivery
● Assured quality, with built-in best practices, guidelines, and accuracy

Data modernization: How “DataSwitch Migrate” helps you migrate from an RDBMS to MongoDB

DataSwitch Migrate (“DS Migrate”) is a no-code and low-code toolkit that leverages advanced automation to provide intuitive, predictive, and self-serviceable schema redesign from a traditional RDBMS model to MongoDB’s document model, with built-in best practices. Based on data volume, performance, and criticality, DS Migrate automatically recommends the appropriate ETTL (Extract, Transfer, Transform & Load) data migration process. DataSwitch delivers data engineering solutions and transformations in half the time frame of typical existing data modernization solutions. Consider these key areas:

● Schema redesign – construct a new framework for data management. DS Migrate provides automated data migration and transformation based on your redesigned schema, as well as no-touch code conversion from legacy data scripts to MongoDB Atlas APIs. Users simply drag and drop the schema for redesign, and the platform converts it to a document-based JSON structure by applying MongoDB modeling best practices. The platform then automatically migrates the data to the new, redesigned JSON structure and converts the legacy database scripts for MongoDB. This automated, user-friendly data migration is faster than anything you’ve seen before.
● Refactoring – change the data structure to match the new schema. DS Migrate handles this through automatic code generation for migrating the data. This goes far beyond a mere lift and shift: DataSwitch takes care of refactoring and replatforming (moving from the legacy platform to MongoDB) automatically, a game-changing capability that performs all of these tasks within a single platform.
● Security – mask and tokenize data while moving it from on-premises systems to the cloud. Because the data may be moving to a public cloud, you must keep it secure. DataSwitch’s tool can configure and apply security measures automatically while migrating the data.
● Data quality – ensure that data is clean, complete, trustworthy, and consistent. DataSwitch allows you to configure your own quality rules and apply them automatically during data migration.

In summary: first, the DataSwitch tool automatically extracts the data from an existing database, such as Oracle. It then exports the data and stores it locally before zipping and transferring it to the cloud. Next, DataSwitch transforms the data, altering the data structure to match the redesigned schema and applying data security measures during the transform step. Lastly, DS Migrate loads the data into MongoDB and processes it in its entirety. The sketch below shows this flow in miniature.
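What follows is a hand-written PyMongo sketch of such an extract-transform-load pass, not DataSwitch code or DS Migrate output: the sample rows, the `tokenize` helper, and the collection names are all assumptions made for illustration.

```python
import hashlib
from pymongo import MongoClient

# 1. Extract: rows pulled from a normalized legacy database (e.g. Oracle),
#    represented here as plain dicts for the sake of the sketch.
customer_row = {"customer_id": 101, "name": "Acme Corp", "city": "Chicago",
                "email": "ops@acme.example"}
order_rows = [
    {"order_id": 1, "customer_id": 101, "amount": 250.0},
    {"order_id": 2, "customer_id": 101, "amount": 75.5},
]

def tokenize(value: str) -> str:
    """Stand-in for masking/tokenization: replace a sensitive value
    with an irreversible token before it leaves the premises."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]

# 2. Transform: reshape the 3NF rows into one embedded document
#    and mask the PII field along the way.
customer_doc = {
    "_id": customer_row["customer_id"],
    "name": customer_row["name"],
    "city": customer_row["city"],
    "email_token": tokenize(customer_row["email"]),
    "orders": [{"order_id": o["order_id"], "amount": o["amount"]}
               for o in order_rows],
}

# 3. Load into MongoDB (an Atlas connection string would be used in practice).
client = MongoClient("mongodb://localhost:27017")
customers = client.modernized.customers
customers.insert_one(customer_doc)

# Process conversion in miniature: a legacy CRUD statement such as
#   UPDATE CUSTOMERS SET CITY = 'Boston' WHERE CUSTOMER_ID = 101;
# maps onto the equivalent MongoDB API call:
customers.update_one({"_id": 101}, {"$set": {"city": "Boston"}})
```

Embedding the orders inside the customer document is the kind of 3NF-to-document reshaping the schema redesign step automates, and the final line previews the process conversion described next.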
Process Conversion

Process conversion, where scripts and process logic are migrated from a legacy DBMS to a modern DBMS, is made easier thanks to a high degree of automation: minimal coding and manual intervention are required, and the journey is accelerated. It covers:

● DML – Data Manipulation Language
● CRUD – typical application functionality (Create, Read, Update & Delete)
● Conversion of both to the equivalent MongoDB Atlas API calls

Degree of automation DataSwitch provides during migration

Schema migration activities, with DS automation capability:
● Application Data Usage Analysis – 70%
● 3NF to NoSQL Schema Recommendation – 60%
● Schema Re-Design Self Services – 50%
● Predictive Data Mapping – 60%

Process migration activities, with DS automation capability:
● CRUD-based SQL conversion (Oracle, MySQL, SQL Server, Teradata, DB2) to the MongoDB API – 70%

Data migration activities, with DS automation capability:
● Migration Script Creation – 90%
● Historical Data Migration – 90%
● Catch-up Load – 90%

DataSwitch Legacy Modernization as a Service (LMaaS)

Our consulting expertise, combined with the DS Migrate tool, allows us to harness the power of the cloud to transform legacy RDBMS data systems to MongoDB. Our solution delivers legacy transformation in half the time frame through pay-per-usage. Key strengths include:

● Data Architecture Consulting
● Data Modernization Assessment and Migration Strategy
● Specialized Modernization Services

DS Migrate Architecture Diagram

Contact us to learn more.