This document discusses business analytics and data analytics capabilities. It covers key concepts like data warehouses, data marts, ETL processes, business intelligence, data mining techniques, and how organizations can use analytics to gain insights from data to support decision making and gain a competitive advantage. The document provides examples of how companies like IHG and retailers use analytics to improve operations and customer understanding.
This document discusses the opportunities and challenges of big data. It defines big data as huge volumes of structured and unstructured data from various sources that require new tools to analyze and extract business insights. Big data provides both statistical and predictive views to help businesses make smarter decisions. While big data allows companies to integrate diverse data sources and gain real-time insights, challenges include processing large and complex data volumes and ensuring data quality, privacy and management. The document outlines the big data lifecycle and how analytics can be used descriptively, predictively and prescriptively.
The document discusses the rise of data analysts and their use of data blending. It describes how data analysts are taking on more responsibilities and need tools to efficiently analyze large, complex data from multiple sources. Data blending tools allow analysts to easily access both traditional and emerging data, cleanse and prepare it, then perform advanced analytics without relying on IT. This empowers analysts to get answers more quickly and help business decision makers. The key is combining data from different sources into a unified dataset for analysis rather than permanent integration.
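To make the blending idea concrete, here is a minimal Python sketch (not any particular vendor's tool) that joins a traditional warehouse extract with emerging clickstream data into one analysis-ready frame. All table and column names are invented for illustration; in practice the frames would come from pd.read_csv or pd.read_json.

```python
# Minimal data-blending sketch: combine two sources into one
# analysis-ready DataFrame without permanently integrating them.
# Inline data is used so the example is self-contained.
import pandas as pd

# Source 1: traditional data (e.g., a warehouse extract).
sales = pd.DataFrame({
    "customer_id": [" 101", "102", "103"],   # note the messy key
    "revenue": [1200.0, 340.0, 560.0],
})

# Source 2: emerging data (e.g., clickstream events).
clicks = pd.DataFrame({
    "customer_id": ["101", "101", "103"],
    "page_views": [5, 3, 9],
})

# Cleanse and prepare: normalize join keys before blending.
sales["customer_id"] = sales["customer_id"].str.strip()

# Blend: a left join enriches every sales record with web behavior
# where available; the source systems themselves stay untouched.
blended = sales.merge(
    clicks.groupby("customer_id", as_index=False)["page_views"].sum(),
    on="customer_id", how="left",
).fillna({"page_views": 0})
print(blended)
```

The point of the design is that the join lives in the analysis layer: the two sources remain independent systems, and no permanent integration is built.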
Implementing an efficient data governance and security strategy with ..., by Denodo
Watch full webinar here: https://bit.ly/3lSwLyU
In an era when information is exploding across disparate sources, data governance is a key component for guaranteeing the availability, usability, integrity, and security of information. Likewise, the set of processes, roles, and policies it defines lets organizations reach their objectives while ensuring the efficient use of their data.
Data virtualization is one of the strategic tools for implementing and optimizing data governance. This technology lets companies create a 360º view of their data and establish security controls and access policies across the entire infrastructure, regardless of format or location. In doing so, it brings together multiple data sources, makes them accessible through a single layer, and provides lineage capabilities to monitor changes to the data.
We invite you to join this webinar to learn:
- How to accelerate the integration of data from fragmented sources across internal and external systems and obtain a comprehensive view of the information.
- How to enable a single, protected data-access layer across the entire company.
- How data virtualization provides the pillars for complying with current data protection regulations through auditing, cataloging, and data security.
What does data governance have in common with an amusement park?, by Denodo
Watch full webinar here: https://bit.ly/3Ab9gYq
Imagine arriving at an amusement park with your family and starting your day without the usual map that lets you plan which shows to see, which rides to go on, and where the kids can and cannot ride… You probably wouldn't get the most out of your day, and you would miss plenty. Some people enjoy going in blind and discovering things little by little, but when it comes to business, going in blind can be disastrous...
In an era when information is exploding across disparate sources, data governance is key to guaranteeing the availability, usability, integrity, and security of that information. Likewise, the set of processes, roles, and policies it defines lets organizations reach their objectives while ensuring the efficient use of their data.
Data virtualization, a strategic tool for implementing and optimizing data governance, lets companies create a 360º view of their data and establish security controls and access policies across the entire infrastructure, regardless of format or location. In doing so, it brings together multiple data sources, makes them accessible through a single layer, and provides lineage capabilities to monitor changes to the data.
In this webinar you will learn how to:
- Accelerate the integration of data from fragmented sources across internal and external systems and obtain a comprehensive view of the information.
- Enable a single, protected data-access layer across the entire company.
- Use data virtualization to provide the pillars for complying with current data protection regulations through auditing, cataloging, and data security.
The presentation includes an introduction to the topic, the various dimensions of big data, its evolution from big data 1.0 to big data 3.0, its impact on various industries, its uses, and the challenges it faces. The concluding slide briefly looks at the future of big data.
When the business needs intelligence (15Oct2014), by Dipti Patil
When an organization needs to make important decisions, business intelligence can help by analyzing internal and external data to generate knowledge. Business intelligence enables fact-based decisions by aggregating, enriching, and presenting data from sources like ERP systems and databases. The goals of a business intelligence implementation are to capture data from across the business to create a unified view, produce an integrated data warehouse to improve decision making, and enable ongoing analysis of data rather than just collecting it.
The New Self-Service Analytics - Going Beyond the Tools, by Katherine Gabriel
In today’s business climate, using data to make quick decisions is a common demand across organizations. To meet it, business users want more, faster, and better access to data and analytic tools, while IT wants to balance this need for speed with its responsibility to protect data assets from security, privacy, and quality risks. A common solution is self-service BI, or self-service analytics. Chances are you are already using self-service BI in some way, shape, or form, or have heard a pitch from an analytic-tool vendor!
Self-service BI has been around for several decades and yet business users keep asking for more and more. Has self-service BI failed to deliver on its promise? Is it time to revisit what self-service really means? How can business and IT work together to achieve better decision-making outcomes for their organization?
We cover:
• How to demystify what self-service analytics means
• New trends driving the self-service analytics evolution
• Best practices and lessons learned from real-life examples
• Recommendations for making progress within your organization
Advance your self-service journey.
Business intelligence (BI) systems allow companies to gather, store, access, and analyze corporate data to aid in decision-making. These systems illustrate intelligence in areas like customer profiling, market research, and product profitability. A hotel franchise uses BI to compile statistics on metrics like occupancy and room rates to analyze performance and competitive position. Banks also use BI to determine their most profitable customers and which customers to target for new products.
This document discusses data analytics and big data. It begins with definitions of data analytics and big data. It then discusses perceptions of data analytics from different perspectives within an organization. It outlines the data analytics evolution and maturity cycle, highlighting that excellence is about gaining business insights using available data and collaborating across teams. The rest of the document provides examples of how data analytics can be applied and help business strategies in areas like human resources and sales/marketing.
Fuel your Data-Driven Ambitions with Data Governance, by Pedro Martins
The document discusses the importance of data governance and provides an overview of how to implement an effective data governance program. It recommends obtaining executive sponsorship, aligning objectives to business initiatives, prioritizing initiatives, getting frameworks ready, and socializing the program. The document outlines data governance building blocks, including assessing maturity, developing a master plan, selecting tools, and establishing an organizational framework. It also discusses preparing an organization for success with data governance.
Tips -- Break Down the Barriers to Better Data Analytics, by Abhishek Sood
1) Analytics executives face challenges in collecting, analyzing, and delivering insights from data due to a lack of skills, cultural barriers, IT backlogs, and productivity drains.
2) Legacy systems and complex analytics platforms also impede effective data use. Modular solutions that integrate with existing systems and empower self-service are recommended.
3) The document promotes the Statistica software as addressing these challenges through its ease of use, integration capabilities, and support for big data analytics.
Big Data Analysis: Transforming Industries and Unlocking Potential, by Sanjeev T
Big Data Analysis has revolutionized industries worldwide by enabling organizations to process, analyze, and extract valuable insights from vast amounts of structured and unstructured data. With advancements in computing power, artificial intelligence, and cloud storage, businesses can now harness big data to enhance decision-making, improve efficiency, and create new opportunities across various sectors.
Understanding Big Data Analysis
Big Data refers to large volumes of data generated at high velocity from multiple sources, including social media, IoT devices, financial transactions, and healthcare records. Traditional data processing methods are insufficient to manage such enormous datasets, necessitating advanced analytics techniques like machine learning, natural language processing, and predictive modeling.
Big Data Analysis involves collecting, organizing, and interpreting data to identify patterns, trends, and correlations. This process helps businesses make informed decisions, predict customer behavior, detect fraud, optimize operations, and enhance user experience.
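As a hedged illustration of that process, the sketch below computes a simple trend and a correlation with pandas; the transaction sample and column names are assumptions for illustration, not data from the document.

```python
# Minimal sketch of descriptive analysis on a (tiny) sample:
# identify a monthly spend trend and a correlation between behaviors.
import pandas as pd

tx = pd.DataFrame({
    "month":       ["2024-01", "2024-01", "2024-02", "2024-02", "2024-03"],
    "basket_size": [3, 5, 4, 6, 7],
    "spend":       [30.0, 55.0, 42.0, 61.0, 75.0],
})

# Trend: average spend per month.
print(tx.groupby("month")["spend"].mean())

# Correlation: do larger baskets go with higher spend?
print(tx["basket_size"].corr(tx["spend"]))
```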
Transforming Industries with Big Data
Healthcare
Big Data in healthcare improves patient outcomes by enabling predictive analytics, real-time monitoring, and personalized treatments.
Electronic Health Records (EHRs) and wearable devices generate data that aids in diagnosing diseases and reducing hospital readmission rates.
AI-powered analytics assist in drug discovery, reducing the time required for clinical trials.
Finance and Banking
Financial institutions leverage Big Data to detect fraud, assess credit risk, and optimize investment strategies.
AI-driven trading algorithms analyze market trends in real time, allowing for data-driven investment decisions.
Customer sentiment analysis helps banks personalize financial products and improve customer service.
Retail and E-commerce
Businesses use Big Data to track consumer behavior, predict trends, and optimize supply chains.
Personalized recommendations based on purchase history increase customer engagement and sales.
Inventory management systems powered by data analytics reduce wastage and improve efficiency.
Manufacturing and Supply Chain
Predictive maintenance helps manufacturers reduce downtime by identifying potential failures before they occur (a short sketch follows this section).
IoT-enabled sensors collect real-time data to enhance production efficiency and quality control.
Logistics companies use route optimization and demand forecasting to streamline supply chain operations.
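Below is a toy illustration of the predictive-maintenance idea mentioned above: sensor readings are compared against a rolling baseline, and drift beyond a threshold raises an early-warning flag. The vibration values and the 25% threshold are assumptions for illustration only, not a production rule.

```python
# Toy predictive-maintenance check: flag readings that drift far from
# a rolling baseline, as an early-warning signal before failure.
import pandas as pd

vibration = pd.Series([0.51, 0.50, 0.52, 0.49, 0.53, 0.71, 0.88, 0.95])

# Recent baseline: rolling mean over the last few readings.
baseline = vibration.rolling(window=4, min_periods=1).mean()
deviation = (vibration - baseline).abs()

# Alert when a reading deviates more than 25% from the recent baseline
# (the threshold is an assumption chosen for this example).
alerts = vibration[deviation > 0.25 * baseline]
print(alerts)
```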
Education
Big Data transforms education by providing insights into student performance, learning patterns, and curriculum effectiveness.
AI-powered platforms personalize learning experiences, ensuring adaptive teaching methods.
Predictive analytics help institutions improve student retention and academic success.
Government and Smart Cities
Big Data plays a crucial role in urban planning, traffic management, and disaster response.
Real-time data analysis enhances public safety.
Information Strategy: Updating the IT Strategy for Information, Insights and ..., by Jamal_Shah
The document discusses the need for organizations to update their IT strategies to address the growing amounts of data from various sources and how emerging technologies enable new approaches to managing data and insights. It recommends that an updated IT strategy focus on business capabilities and prioritize information, insights, and governance. The strategy should emphasize cross-functional use of data and analytics to enable fast, fact-driven decisions.
Bigdata for sme-industrial intelligence information-24july2017-final, by stelligence
This document discusses how small and medium enterprises (SMEs) can benefit from big data analytics. It defines key concepts like the 5 V's of big data and explains challenges SMEs face in adopting analytics. Common types of analytics like reporting, trend analysis, and predictive modeling are described. The document provides recommendations for simple analytic tools and techniques SMEs can use, such as data exploration, time-series analysis, and regression in Excel. Finally, it discusses how cloud-based solutions can help SMEs overcome barriers to adopting traditional IT solutions and analyzes the big data business landscape in Thailand.
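To ground the "regression in Excel" suggestion, here is an equivalent minimal sketch in Python with NumPy (a free alternative an SME could also use); the monthly revenue figures are made up for illustration.

```python
# Simple trend regression an SME might otherwise do in Excel:
# fit revenue against a month index and project the next period.
import numpy as np

months = np.arange(1, 13)  # Jan..Dec as 1..12
revenue = np.array([10, 11, 10, 12, 13, 13, 14, 15, 15, 16, 17, 18],
                   dtype=float)

# Least-squares line: revenue ~ slope * month + intercept
slope, intercept = np.polyfit(months, revenue, deg=1)
forecast_next = slope * 13 + intercept

print(f"monthly growth ~ {slope:.2f}, forecast for month 13 ~ {forecast_next:.1f}")
```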
This document discusses trends in data analytics. It begins by defining big data and how it differs from traditional data approaches in terms of size, techniques, and ability to solve new problems. It then provides examples of big data applications across various industries like retail, automotive, healthcare, and insurance. Specifically, it outlines how big data is used for predictive analytics, personalization, fraud detection, and risk adjustment. Finally, it discusses some risks of big data like privacy issues and ensuring the right problems are addressed.
This describes a conceptual-model approach to designing an enterprise data fabric: the set of hardware and software infrastructure, tools, and facilities used to implement, administer, manage, and operate data operations across the entire span of the enterprise's data. It covers all data activities (acquisition, transformation, storage, distribution, integration, replication, availability, security, protection, disaster recovery, presentation, analytics, preservation, retention, backup, retrieval, archival, recall, deletion, monitoring, and capacity planning) across all data storage platforms, enabling applications to meet the data needs of the enterprise.
The conceptual data fabric model represents a rich picture of the enterprise’s data context. It embodies an idealised, target view of the data.
Designing a data fabric enables the enterprise to respond to and take advantage of key related data trends:
• Internal and External Digital Expectations
• Cloud Offerings and Services
• Data Regulations
• Analytics Capabilities
It enables the IT function to demonstrate positive data leadership. It shows the IT function is able and willing to respond to business data needs. It allows the enterprise to meet data challenges:
• More and more data of many different types
• Increasingly distributed platform landscape
• Compliance and regulation
• Newer data technologies
• Shadow IT, where the IT function cannot deliver IT change and new data facilities quickly
The approach is concerned with designing an open and flexible data fabric that improves the responsiveness of the IT function and reduces shadow IT.
Federated data organizations in the public sector face more challenges today than ever before. As discovered via research performed by North Highland Consulting, these are the top issues you are most likely experiencing:
• Knowing what data is available to support programs and other business functions
• Data is more difficult to access
• Without insight into the lineage of data, it is risky to use as the basis for critical decisions
• Analyzing data and extracting insights to influence outcomes is difficult at best
The solution to these challenges lies in creating a holistic enterprise data governance program and enforcing it with a full-featured enterprise data management platform. Kreig Fields, Principal, Public Sector Data and Analytics, from North Highland Consulting and Rob Karel, Vice President, Product Strategy and Product Marketing, MDM, from Informatica will walk through a pragmatic “How To” approach, full of useful information on how you can improve your agency’s data governance initiatives.
Learn how to kick-start your data governance initiatives and how an enterprise data management platform can help you:
• Innovate and expose hidden opportunities
• Break down data access barriers and ensure data is trusted
• Provide actionable information at the speed of business
The document provides an overview of a presentation on using financial analytics for performance management. It discusses trends in business intelligence and analytics, including the increasing involvement of CFOs and a focus on predictive rather than just historical analytics. It also outlines challenges around data management and describes frameworks for building an analytics support center. Finally, it discusses governance issues and provides examples of analytics tools and platforms from vendors like IBM, Oracle, SAP, and Teradata.
Your AI and ML Projects Are Failing – Key Steps to Get Them Back on Track, by Precisely
With recent studies indicating that 80% of AI and machine learning projects fail due to data-quality issues, it’s critical to think about this holistically. This is not a simple topic: data-quality issues can occur anywhere from project kickoff through model implementation and usage.
View this webinar on-demand, where we start with four foundational data steps to get our AI and ML projects grounded and underway, specifically:
• Framing the business problem
• Identifying the “right” data to collect and work with
• Establishing baselines of data quality through data profiling and business rules (a minimal sketch follows this list)
• Assessing fitness for purpose for training and evaluating the subsequent models and algorithms
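As a rough sketch of what the profiling-and-business-rules step can look like in practice (a generic illustration, not Precisely's product; the fields and the age-range rule are hypothetical):

```python
# Minimal data-profiling baseline: completeness, ranges, and one
# business rule, computed before any model training starts.
import pandas as pd

df = pd.DataFrame({
    "age":    [34, 29, None, 41, 120],      # 120 looks like an outlier
    "income": [52000, 48000, 61000, None, 58000],
})

# Profile: null rates and value ranges per column.
profile = pd.DataFrame({
    "null_rate": df.isna().mean(),
    "min": df.min(),
    "max": df.max(),
})
print(profile)

# Business rule (hypothetical): age must fall in a plausible range.
violations = df[(df["age"] < 0) | (df["age"] > 110)]
print(f"{len(violations)} record(s) violate the age rule")
```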
Part 2 - 20 Years in Healthcare Analytics & Data Warehousing: What did we lea..., by Health Catalyst
Lessons learned over 20 years. This time we focus on technology lessons drawn from experience at Intermountain Healthcare, Northwestern Medicine, and the Cayman Islands Health Authority.
Oracle is a leading technology company focused on database software and cloud computing. It generates revenue from software licenses and cloud services. While Oracle faces competition from other large tech companies, its strengths include consulting services, global sales channels, and expertise in data storage and applications. The rise of big data presents both opportunities and challenges for Oracle to leverage new types and volumes of customer information through its products.
Objective Benchmarking for Improved Analytics Health and Effectiveness, by PersonifyMarketing
Achieving a high state of analytics excellence can be a daunting task. It involves mastering progressive stages of data health, technological capability, and staff readiness, all while putting out countless fires and responding to last-minute requests for analysis. Strategic progress can be slow, and charting that progress for the executive team, cumbersome and uncertain.
Join us as Denny Lengkong from Personify Implementation Partner, IntelliData, and Personify's Solution Director, Bill Connell, present a rational framework for understanding analytics health and effectiveness. This webinar will help you learn how to make targeted investments in analytics over time that everyone in your organization will understand.
Transform Your Downstream Cloud Analytics with Data Quality, by Precisely
Nearly half of respondents to Precisely’s Enterprise Data Quality survey reported that untrustworthy results or inaccurate insights from ML, AI, and advanced analytics systems were due to a lack of quality in the data. Are you ready to improve your trust in the data your organization is using in the cloud for business decision-making?
Register now to learn how to take the first steps to high-quality data in the cloud by better understanding your data through profiling.
During this on-demand webinar, we will explore key topics such as:
• The five key steps to effective data profiling
• How profiling informs your next steps to deliver quality data to the cloud
• How Precisely customers have elevated marketing and customer service results by focusing on data quality
20 Years in Healthcare Analytics & Data Warehousing: What did we learn? What'..., by Health Catalyst
The enterprise data warehouse (EDW) at Intermountain Healthcare went live in 1998. The EDW at Northwestern Medicine went live in 2006. Dale Sanders was the chief architect and strategist for both. The business inspiration behind Health Catalyst was, in essence, to create the commercial availability of the technology, analytics, and data utilization skills associated with these systems at Intermountain and Northwestern. Lee Pierce assumed leadership of the Intermountain EDW in 2008. Andrew Winter assumed leadership of the Northwestern EDW in 2009, and transitioned leadership of the EDW to Shakeeb Akhter in 2016. This webinar is a fireside chat among friends and colleagues as they look back across their healthcare IT decisions to answer these questions:
What did we do right and what did we do wrong?
What advice do we have for others in this emerging era of Big Data?
What does the future of analytics and Big Data look like in healthcare?
The document discusses how utilities are increasingly collecting and generating large amounts of data from smart meters and other sensors. It notes that utilities must learn to leverage this "big data" by acquiring, organizing, and analyzing different types of structured and unstructured data from various sources in order to make more informed operational and business decisions. Effective use of big data can help utilities optimize operations, improve customer experience, and increase business performance. However, most utilities currently underutilize data analytics capabilities and face challenges in integrating diverse data sources and systems. The document advocates for a well-designed data management platform that can consolidate utility data to facilitate deeper analysis and more valuable insights.
Watch this webinar in full here: https://buff.ly/2MVTKqL
Self-Service BI promises to remove the bottleneck that exists between IT and business users. The truth is, if data is handed over to a wide range of data consumers without proper guardrails in place, it can result in data anarchy.
Attend this session to learn why data virtualization:
• Is a must for implementing the right self-service BI
• Makes self-service BI useful for every business user
• Accelerates any self-service BI initiative
1. Phase 1B Recommendations and Roadmap
DAAP – Data Architecture and Analytical Platform
February 5, 20XX (current as of 10/19/2020)
2. Focus Areas
• The Data Driven Organization
• Summary of Findings: TOWS Analysis, Voice of the Enterprise, Primary Issues
• Data Strategy
• Business Requirements
• Architecture: Current, Phase 1B, Phase 2 Transition, Phase 3 Transition
• Program Definition
• Appendix
3. The Data Driven Organization
❑ What is it?
ᵒ Use of analytics to make fact-based business decisions
ᵒ Gather relevant data from all aspects of the business
❑ What does it mean?
ᵒ High variety of data for analysis
ᵒ Access as needed
ᵒ Data is centralized and organized
ᵒ Tools allow for intuitive reasoning and are easy to use
❑ How? (Organization and Resources)
ᵒ Capabilities are challenged
ᵒ Capabilities are transformed
5. TOWS Analysis

Threats:
• Loss of relevance
• Clients taking data and doing their own analytics
• Lack of interest to pursue work associated with data strategy
• Lack of funding available to implement data strategy
• Client belief that, as a captive malpractice insurance provider, it is impervious to outside competition or changes in the healthcare industry

Opportunities:
• History and size of current base can entice new institutions to use Client for business intelligence and analytics
• Access to large hospital population provides sizable denominator base
• Increase in availability of data from 3rd parties allows for increased analysis and reporting
• Strong relationships with current institutional base
• Changes in technology allow for increased ease and efficiencies in analysis of data

Weaknesses:
• Underlying data structure and organization does not support requisite reporting and business intelligence
• No "single source of the truth"
• Single point of addressing analytics issues
• Structure on which CRIT was developed has been retired by the vendor
• Extensively modified vendor software inhibits upgrade path and scalability
• This is the 4th time an assessment of this type has been undertaken; this may indicate a lack of commitment
• Report KPIs have not changed, and differ
• No internal KPIs

Strengths:
• Strong analytics base and understanding of needs of the customer
• Strong analytics skillset available in-house
• Repeatable report and query types make generation of analytics easier
• Age and size make Client a perceived force to be reckoned with
6. Who We Spoke With
Interviews spanned IT, Patient Safety, Underwriting, Strategies, Graphics, Finance, and Claims, as well as the CEO, CFO, CMO, and CIO (names and dates omitted).
7. Voice of the Customer
• "What is the right analytics tool for our business?"
• "I rely on institutional knowledge for data access & data usage… wish there was a better way."
• "What is this data field?"
• "Has any other team built similar metrics that we can leverage?"
• "It takes a long time and costs a lot to add data to EDW."
• "Where should I look for my data?"
• "There are no guiding principles for us (users) to do what we should be doing!"
• "Wish I could get data from EDW whenever I want to, and faster!"
8. Voice of the Enterprise: Organizational
• Reluctance to change
• Lack of interest; it's "Good Enough"
• Reactive rather than proactive
• Ineffective alignment (e.g., silos)
• Lack of effective coordination and collaboration across departments
• Business units and organizations leery to share data
• Difficult to gain consensus on assumptions & parameters
• Difficult to match data to business needs (e.g., the Primary Care example)
• Difficult to establish business rules and drivers
• Distrust in IT's ability to serve the organization ("Just give me all the transaction data and I'll do the analytics…")
9. Voice of the Enterprise: Process
• Lack of governance; unknown ownership of data, subject, or application
• Exhaustive manual manipulation required to produce a single report (pervasive observations and comments): 2–3 months to develop a typical report, 1 year to develop the Benchmark report
• Ad-hoc, one-off nature of report builds; no library for standard queries
• Business rules differ: embedded in scripts, ETL, and queries; they also differ due to latency and developer; coding differences
• Manual collection and application of data: 3rd-party provided, Location, Practice
• "Be able to predict before it happens"
• "Organization grapples with what it does not know and cannot solve for"
• "Lack of Clarity"
• "Everything requires an analyst."
10. Voice of the Enterprise: Technology
• Data architecture that does not support reporting and business intelligence, or a move into advanced analytics
• No "single source of the truth"
• Structure on which CRIT was developed has been retired by the vendor
• Extensively modified vendor software inhibits upgrade path and scalability
• Poor performance of applications, queries, and scripts
• Cannot roll up or aggregate properly; cannot apply corrections to source systems
• Confusing mart structures and naming conventions
• Distrust of the quality and veracity of data and reports; irreconcilable differences
• Manual data-quality maintenance: 80% of time spent cleansing & prepping data
• Difficulty providing historical views (as-of, as-was processing)
• Shadow IT (pervasive observations): data held locally that is unavailable to the enterprise; individuals' knowledge not shared outside the business unit; addresses the inability of IT to service the organization; cannot leverage proper resources; 3rd-party data; locally held data
11. Client Data Strategy
• Crawl: "Do better with what we have…" Restructure the data architecture to support an initial set of reports; bring in initial 3rd-party data.
• Walk: "Use current data in a way that is more meaningful…" Additional reports; iterative and incremental expansion of the supporting data architecture.
• Run: Systematic use of 3rd-party data; additional reports; iterative and incremental expansion of the supporting data architecture.
• Fly: Advanced analytics, including predictive and prescriptive analytics and data mining; additional supporting technologies required (e.g., data lake, Internet of Things, big data).
The stages track a risk progression, from backward looking to forward looking: identify risk, understand risk, mitigate risk, predict risk.
12. Primary Business Requirements
❑ Providing consistent, integrated data
ᵒ A single source of truth
ᵒ Eliminating the need for individual departmental marts
❑ Making data available across the enterprise
ᵒ Ensuring that everyone has access to the same information
ᵒ Eliminating the need for 'shadow' marts
❑ Providing consistent reporting
ᵒ Allowing for differences in methodology, the same question asked of different people should yield the same result
❑ Governance
ᵒ Ensuring the ownership and quality of the data
❑ Aging technologies
ᵒ Increasing functions and features
ᵒ Reducing costs and maintenance issues
13. Provide Consistent, Integrated Data (Same Data Used by All Resources)

Issue: Proliferation of conflicting data (terminology, methodology, business rules, metrics); SMT cannot make informed business decisions.
Examples: Incurred Loss (e.g., different Incurred Loss results produced by CRIT and BO); Case Rates per 100 PCY; Finance; Patient Safety (Claims).
Root cause: Lack of governance; no metadata management (e.g., business glossary; derivations, business rules, and metrics catalogs); no "single source of truth."
People: Tolerance of substandard behaviors; necessity for workarounds; locally held and maintained data; data not shared across the enterprise.
Process: Reports produced using different tools lead to discrepancies that are difficult to reconcile adequately; differing or inconsistent definitions and derivations (Finance, Claims, Patient Safety); arbitrary application of business rules, metrics, and derivations; lack of automation leads to extensive manual manipulation of data.
Technology: Lack of integrated data; Excel is not an enterprise database; business rules, metrics, and derivations embedded in ETL, scripts, and application customizations over time, inconsistently.
Proposed solution: Uncouple embedded business rules, metrics, and derivations from ETL, scripts, and queries; create a business glossary with conformed, agreed-upon definitions; create a metrics catalog using conformed definitions and establishing precise derivations; create new ETL to extract source data; create new underlying data structures. (A short sketch of the uncoupling idea follows.)
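To illustrate the proposed uncoupling, here is a hedged Python sketch in which a metrics catalog holds each derivation exactly once, and every consumer (ETL job, script, report) computes the metric through it. The metric names echo the slide's examples, but the formulas are placeholders, not the organization's actual derivations.

```python
# Sketch: a metrics catalog that holds each derivation exactly once,
# so ETL jobs, scripts, and reports all compute a metric the same way
# instead of embedding their own copies of the rule.
# Metric formulas below are hypothetical placeholders.
import pandas as pd

METRIC_CATALOG = {
    # name -> (business-glossary description, derivation function)
    "incurred_loss": (
        "Paid losses plus change in reserves",
        lambda df: df["paid_losses"] + df["reserve_change"],
    ),
    "case_rate_per_100": (
        "Cases per 100 covered years",
        lambda df: 100 * df["cases"] / df["covered_years"],
    ),
}

def derive(df: pd.DataFrame, metric: str) -> pd.Series:
    """Apply a cataloged derivation; all consumers share this one path."""
    _, fn = METRIC_CATALOG[metric]
    return fn(df)

claims = pd.DataFrame({
    "paid_losses":    [100.0, 250.0],
    "reserve_change": [20.0, -5.0],
    "cases":          [3, 7],
    "covered_years":  [150, 400],
})
print(derive(claims, "incurred_loss"))
print(derive(claims, "case_rate_per_100"))
```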
14. Make That Data Available Across the Enterprise (Data from the Same Source Available Across the Enterprise; Create a "Single Source of Truth")

Issue: Inability to incorporate additional departments' data; inability to handle specialized or boutique institutions; SMT cannot make informed business decisions.
Examples: Underwriting and Claims difficult to link; premium collected versus payouts; Primary Care; manual tracking of trial results; locally held Finance data.
Root cause: Lack of governance; mastering of data (Location, Practice); reference data (e.g., taxonomies, classification schemes, hierarchies); data models; ineffectively structured database and data marts.
People: Tolerance of substandard behaviors; necessity for workarounds; locally held and maintained data; data not shared across the enterprise.
Process: Individual departments focus on solving their own critical issues and do not recognize opportunities for collaboration; necessity for workarounds; locally held and maintained data; data not shared across the enterprise.
Technology: Improperly structured data does not support historical analyses; inability to support competing perspectives; inability to properly aggregate data.
Proposed solution: Create a "single source of truth"; procure a vendor MDM solution; restructure data to support integrated, enterprise reporting (aggregations, history, competing perspectives, dimensional).
15. 15 Current as of 10/19/2020
Same underlying data supports Competing
Perspectives
Manifestations
Issue | Examples | Root Causes (People / Process / Technology) | Proposed Solution
• Lack of integrated enterprise data repository
• SMT cannot make informed business decisions
• Quarterly Report
• Inability to perform as-of, as-was reporting
• Lack of governance
• No Metadata Management (e.g., business glossary; derivations, business rules, and metrics catalogs)
• Mastering of data: Location; Practice; Reference Data (e.g., taxonomies, classification schemes, hierarchies); Data models; Library of Standard Reports; Catalog of Standard Metrics
• No “single source of the truth”
• Tolerance of substandard behaviors
• Plethora of tools and scripting confounds the ability to provide consistent reporting
• Cobbling together reports using data from various tools
• Necessitates multiple queries to satisfy single requirements to avoid inflating reported values
• Lack of automation leads to extensive manual manipulation of data
• Current complement: SAP Business Objects; Tableau; SAS; Scripts
• Inability to support competing reporting perspectives
• Inability to properly aggregate data
• Inability to report on requisite dimensions
• Create “Single Source of Truth”
• Incorporate governance into SDLC (gates)
• Create new report definition, implementation, and execution parameters
• Create standardized, parameter-based reporting structures: portal; self-service; facilitate ad-hoc requests (see the sketch after this slide)
Provide Consistent Reporting
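One way to read "standardized, parameter-based reporting structures" is a single conformed report definition that is executed many times with different parameters, instead of one-off cobbled-together queries. A minimal sketch under that assumption; the class, the template, and the table name are invented for illustration, and the string substitution shown is not how a production system would bind SQL parameters:

```python
from dataclasses import dataclass
from string import Template

@dataclass
class ReportDefinition:
    """A standardized, parameter-driven report: one conformed definition,
    many executions (portal, self-service, ad hoc)."""
    name: str
    sql_template: Template    # conformed query; parameters declared up front
    required_params: tuple

    def render(self, **params):
        missing = [p for p in self.required_params if p not in params]
        if missing:
            raise ValueError(f"Missing report parameters: {missing}")
        # Illustration only: a real implementation would use bound parameters,
        # not string substitution, to avoid SQL injection.
        return self.sql_template.substitute(**params)

claims_trend = ReportDefinition(
    name="claim_rates_by_specialty",
    sql_template=Template(
        "SELECT specialty, period, claim_rate "
        "FROM edw.claim_rate_fact "
        "WHERE period BETWEEN '$start' AND '$end' AND specialty = '$specialty'"
    ),
    required_params=("start", "end", "specialty"),
)

print(claims_trend.render(start="2020-01-01", end="2020-03-31", specialty="OB"))
```

The same definition then serves the portal, self-service use, and ad-hoc requests; only the parameter values change between runs.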
16. Instill Quality, Reliability, and Confidence
Manifestations
Issue | Examples | Root Cause (People / Process / Technology) | Proposed Solution
• Insufficient Data Governance in place: ownership; accountability; metadata management; mastering of data
• Lost productivity
• Loss of confidence in data
• Governance in place: Business Analytics Meetings; Data Stewardship Meeting; Underwriting Policy Task Force
• Prevalent distrust of the quality of available data
• No data ownership assigned
• No data-related accountabilities assigned
• Relationship between Business and IT
• Tolerance of substandard behaviors
• Necessity for workarounds: locally held and maintained data; data not shared across the enterprise
• Difficult for teams to resolve inter-departmental issues
• Lack of automation leads to extensive manual manipulation of data
• Inability to effectively manage and change business requirements
• Inability to perform cross-business-unit requirements analysis
• Lack of automation
• Data changes manually maintained in Excel spreadsheets that reside on a local machine
• Changes not reflected in source systems
• Changes are not fed back to source systems for update
• Stand up Data Governance organization
• Automate governance processes (see the sketch after this slide)
Governance
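"Automate governance processes" can be made concrete with even a small check: verify that every critical data element has an assigned owner, a steward, and a conformed glossary entry, and report the gaps. A minimal sketch; the class, field names, and example elements are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CriticalDataElement:
    """Governance metadata for one critical data element (names illustrative)."""
    name: str
    owner: Optional[str] = None          # accountable business owner
    steward: Optional[str] = None        # day-to-day data steward
    glossary_entry: Optional[str] = None # conformed business definition

def governance_gaps(elements):
    """Automated governance check: list every element missing ownership,
    stewardship, or a glossary definition."""
    gaps = []
    for e in elements:
        missing = [f for f in ("owner", "steward", "glossary_entry")
                   if getattr(e, f) is None]
        if missing:
            gaps.append((e.name, missing))
    return gaps

elements = [
    CriticalDataElement("claim_status", owner="Claims", steward="J. Doe",
                        glossary_entry="Lifecycle state of a claim."),
    CriticalDataElement("practice_location"),  # locally held today; no owner yet
]
for name, missing in governance_gaps(elements):
    print(f"{name}: missing {', '.join(missing)}")
```

Run on a schedule, a check like this turns "no data ownership assigned" from an anecdote into a measurable, trackable backlog.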
17. Ease of Use and Associated Maintenance Costs
Manifestations
Issue | Examples | Root Cause (People / Process / Technology) | Proposed Solution
• CRIT and other technologies have reached the point of obsolescence
• Lack of vendor support
• Lack of upgrade path
• Can no longer sell capabilities to clients
• CRIT: has become ostensibly unusable; suffers from extremely poor performance
• Current MDM is custom-developed software
• No accountability assigned for tool usage
• Lack of funding for upgrades or replacement
• Tolerance of substandard behaviors
• Causes extensive reliance on manual manipulation to provide reporting
• Severely curtails the ability to provide timely reports from CRIT-held data
• CRIT: seen as being accurate; extremely poor performance; unsupported technology (Microsoft ProClarity)
• MDM: home-grown, custom software; not extensible, scalable, or flexible; lack of upgrade path
• Business Objects: out of rev (behind the current release)
• Thoroughly decompose CRIT capabilities and outputs
• Instantiate CRIT-provided capabilities using Business Objects
• Provide client-facing portal to new capabilities
• Articulate and instantiate proper security
• Upgrade Business Objects to current release
Aging Technologies
25. Phase 1B
• Deliver prioritized set of 6 reports targeted for internal RMF needs
• Build foundational Data Management components
• Build EDW with focus on subset of data entities (claims, case financials, OPE, reference data)
• Bring RMF and Strategies data into EDW for those entities
• Build basic business glossary and metrics catalog (foundation for Data Governance)
Phase 2
• Deliver next set of 8 reports for internal RMF needs
• Evaluate and migrate CRIT reports to EDW
• Build external portal to deliver CRIT reports
• Bring MMS data into EDW
• Extend and scale foundational Data Management components
• Build and scale data governance
Phase 3
• Migrate rest of existing BO reports and sunset existing BO reports/environment
• Bring in external 3rd-party ANA data
• Enhance CBS process to include 3rd-party data
• Build CBS as a batch-driven process
DAAP Program Definition
26. Project-Theme Connection (Phase 1B)
Columns: Project / Phase | Project | Provide Consistent, Integrated Data | Make Data Available Across the Enterprise | Provide Consistent Reporting | Governance | Aging Technologies
1B/A | Deliver prioritized set of 6 reports targeted for internal RMF needs | X
1B/B | Build foundational Data Management components | X X X X
1B/C | Build EDW with focus on subset of data entities (claims, case financials, OPE, reference data) | X X X
1B/D | Bring RMF and Strategies data into EDW for those entities | X X X
1B/E | Build basic business glossary and metrics catalog (foundation for Data Governance) | X X
27. Project-Theme Connection (Phase 2-3)
Columns: Project / Phase | Project | Provide Consistent, Integrated Data | Make Data Available Across the Enterprise | Provide Consistent Reporting | Governance | Aging Technologies
2/A | Deliver next set of 8 reports for internal RMF needs | X X X
2/B | Evaluate and migrate CRIT reports to EDW | X X X
2/C | Build external portal to deliver CRIT reports | X X X
2/D | Extend and scale foundational Data Management components | X X
2/E | Build and scale data governance | X X
3/A | Migrate rest of existing BO reports and sunset existing BO reports/environment | X X X X
3/B | Bring external 3rd-party ANA data | X X
3/C | Enhance CBS process to include 3rd-party data | X X
3/D | Build CBS as a batch-driven process | X X X X
28. Requirements Satisfaction
Columns: Enabling Paradigm | Definition | Providing Consistent, Integrated Data | Making Data Available Across the Enterprise | Providing Consistent Reporting | Governance | Aging Technologies
Single source of truth across enterprise | Ensure consistency, currency, meaning, integrity, and quality of data used within or across multiple business areas or processes | X X
Historical reporting | Accurate historical data & DB structures to support “AS-WAS” and “AS-IS” reporting | X X X
RMF and Strategies data stored together | The EDW data model supports multi-tenancy, thereby co-locating RMF and Strategies data | X X
Create Metric Catalog/Data Dictionary | Create metric catalog/business glossary using conformed definitions and establishing precise derivations | X X
CRIT remediation | Design and develop an alternative for CRIT that would satisfy current and future needs of Client and Strategies clients | X X
Parameter Driven Portal | Create two separate parameter-driven portals for reporting needs: one portal for RMF (internal) and a second for Strategies clients | X
Loss Abstract document/report in BO | Ensure the Loss Abstract document is available through SAP Business Objects | X X
SAP Business Objects XI R 4.1 | Upgrade SAP Business Objects to the XI R 4.1 version | X X
3rd Party Data | Create 3rd-party data integration and enrichment capabilities | X X
Create Data Quality Engine | Metadata-driven quality checks on critical data elements (see the sketch after this slide) | X X
Data Standardization | Apply data cleaning and standardization rules; ensures data from multiple sources is stored in a common format | X X
Audit Balance & Control (ABC) metrics | Perform reconciliation at the entity level as data moves through different data management functions/processes | X X X
Requirements Satisfaction
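The Data Quality Engine and ABC rows above describe metadata-driven checks plus entity-level reconciliation. A minimal sketch of both ideas together; the rule names, fields, and allowed values are invented for illustration and are not the program's actual rule set:

```python
# Metadata-driven quality engine: checks are declared as data per critical
# element, not hard-coded per pipeline.
RULES = [
    {"element": "claim_id",     "check": "not_null"},
    {"element": "indemnity",    "check": "non_negative"},
    {"element": "claim_status", "check": "in_set", "allowed": {"open", "closed"}},
]

CHECKS = {
    "not_null":     lambda value, rule: value is not None,
    "non_negative": lambda value, rule: value is not None and value >= 0,
    "in_set":       lambda value, rule: value in rule["allowed"],
}

def run_quality_checks(rows):
    """Apply every declared rule to every row; return failures for steward review."""
    failures = []
    for i, row in enumerate(rows):
        for rule in RULES:
            if not CHECKS[rule["check"]](row.get(rule["element"]), rule):
                failures.append((i, rule["element"], rule["check"]))
    return failures

def abc_reconcile(source_count, landed_count):
    """Audit/Balance/Control at entity level: counts must match across each hop."""
    return {"in": source_count, "out": landed_count,
            "balanced": source_count == landed_count}

rows = [{"claim_id": "C-1", "indemnity": 1200.0, "claim_status": "closed"},
        {"claim_id": None,  "indemnity": -5.0,   "claim_status": "void"}]
print(run_quality_checks(rows))   # -> three failures, all on the second row
print(abc_reconcile(len(rows), 2))
```

Because the rules live in metadata, adding a new critical data element means adding a row of configuration, not writing new pipeline code.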
29. Phase 1B
Project | Key Objective | Key Deliverables
Architecture Modification | Revise architecture to support new components | ▪ Installation of any necessary hardware and software components
EDW | Build EDW with focus on subset of data entities (claims, case financials, OPE, reference data) | ▪ Initial EDW capable of supporting Phase 1B requirements
Reports | Deliver prioritized set of 6 reports targeted for internal RMF needs | ▪ Defined reports
Data Management Foundation | Build foundational Data Management components | ▪ Foundation for Data Management and Governance
RMF Strategies Data | Bring RMF and Strategies data into EDW for those entities | ▪ Inclusion of data in the EDW, allowing for increased reporting and analytics
Business Glossary | Build basic business glossary and metrics catalog (foundation for Data Governance) | ▪ Business Glossary and Taxonomy
30. Phase 2
Project | Key Objective | Key Deliverables
RMF Report | Deliver the next set of 8 reports for RMF needs | ▪ Defined reports
CRIT Data | Evaluate and migrate CRIT data and reports to the new architecture | ▪ CRIT data available within the EDW for analytics and reporting
Portal | Build the external portal to make CRIT reports available | ▪ External portal defined and built, capable of handling internal and external report queries; CRIT reports for external users included
MMS Data | Bring MMS data into the EDW | ▪ MMS data now available for reporting and analytics
Data Management | Extend and scale foundational Data Management components | ▪ Data Management paradigm capable of supporting all new data within the EDW
Data Governance | Build and scale Data Governance | ▪ Increased Data Governance capabilities and functionality
31. Phase 3
Project | Key Objective | Key Deliverables
Reports | Migrate remaining BO reports and sunset existing BO reports and environment | ▪ All reports operating within the new environment; enhanced reporting and analytic capabilities
3rd Party ANA | Integrate 3rd-party ANA data into EDW | ▪ Increased reporting and analytic capabilities based on increased data availability
CBS Enhancements | Enhance CBS process to include 3rd-party ANA data | ▪ Increased reporting and analytic capabilities based on increased data availability
CBS Batch | Rebuild CBS as a batch-driven process | ▪ Reduced processing time for CBS data
32. Next Steps
• Ensure DAAP Roadmap and Strategy are aligned with Business Strategy
• Complete socialization with Executive Stakeholders
• Jump-start Phase 1B by:
  • Completing detailed design of the reports to be delivered
  • Beginning definition of the EDW data model
  • Determining the resources necessary to support definition of the data dictionary and metrics catalog
• Define success metrics for Data Governance
• Determine and acquire resourcing
34. Phase 1B Reports
Report | Business Units (Finance / Underwriting / Claims / Strategies / Patient Safety) | Impact / Results

Client Quarterly "CEO" Report (X X X)
• Exhaustive manual effort required to assemble
• Independent data collection from various sources leads to discrepancies that cannot be adequately resolved
• Different understandings of definitions and derivations
• SMT unable to make informed business decisions
• Cost of lost productivity (1 month to prepare)

Client Institution Report (X X X)
• Inability to incorporate additional departments' data
• Inability to handle specialized or boutique institutions
• Lack of automation leads to extensive manual manipulation of data
• Loss of relevance to clients
• Distrust of data presented to clients
• Cost of lost productivity (2-3 months to prepare)
• Inability to perform cross-institution analysis

Claims Dashboard (X)
• Manually created Excel spreadsheet that resides on a local machine; Excel is not an enterprise database
• Needs to be institutionalized
• Needs to source data from enterprise repositories
• SMT cannot make informed decisions about KPIs and metrics

Primary Care Report (X X X)
• New report meant to ascertain Primary Care issues
• Location and Practice not supported by current data repositories (Location held locally in an Excel spreadsheet); Excel is not an enterprise database
• SMT cannot make informed business decisions about PSO opportunities, claims management, case financials, and Underwriting exposure

Specialty – OB, ED, Surgery (X)
• Manually created Excel spreadsheet that resides on a local machine; Excel is not an enterprise database
• Distrust of data quality and veracity
• Inability to properly ascertain issues pertaining to specialties
• Loss of relevance to clients

Claim rates by specialty, trend report (X X X)
• Historical exposure of trends and claims
• Inability to dynamically set premiums
• Inability to fully understand trends pertaining to exposures and claims
• Inability of SMT to make informed decisions
35. Phase 1B Reports Detail
Report | Description | Current State | Future State | Department(s) | Contacts

Client Quarterly "CEO" Report
Description: Mark Reynolds sends out a quarterly email report to Board Members and Quality Leaders -- this report includes sections on Claims Management, Current Financial Standing, and Corporate Achievements of Note. In particular, the Claims Management section includes trend data (with comparisons to past benchmark data) for closed and asserted cases -- for several KPIs, including Closed-With-Payment, %High-Severity, and Payment > $1M.
Current State: Data pulled/submitted independently by Patient Safety, Finance, and Claims -- collected and organized by Communications. Some disagreement between numbers from different sources. Some philosophical disagreement about how meaningful it is to report quarterly numbers and how they should be projected/interpreted.
Future State: Address issues in "Current State." Define whole-brain consensus spec with definitions and a single source of data (along with underlying data structures and data governance).
Department(s): Patient Safety, Claims, Finance
Contacts: Jonathan, Beth, Sean
Data needs: Claims (frequency, indemnity, closed cases); case financials; trends; coding taxonomy; policy-coverage not needed; billing financials not needed

Client Institution Report
Description: Client Patient Safety leadership, along with a data analyst and the Patient Safety Director, meet with Quality/Safety leadership for each insured institution 2x/year (~20 meetings each time). The purpose of the meeting is to inform the institution about trends in their claims data, highlight any emerging risks or things the institution "needs to know," align the presentation with Experience Adjustments (Spring), and present topics of interest to the institution (e.g., HIT Risk or Primary Care Risk).
Current State: In Spring, the data analyst produces a "standard" claims summary Excel book, which is reviewed with the institution contact. Areas of interest/focus are identified; further data queries and analysis are done iteratively until the data is presented. In addition, the Patient Safety Director reads/reviews specific claim summaries and is prepared to talk about them. Sometimes a Claims Rep presents, too. If there is a specific topic of interest, data/slides are prepared for that topic as well. "Boutique" institutions may get a review of overall Client data or customized presentations.
Future State: In 2016, there will be 2 meetings per institution. The Spring meeting will involve both Patient Safety and Finance and include review of claims data, as well as Experience Adjustment. The Fall meeting will be more topical. A goal is to reduce the amount of customized work. Q: Involvement of the Claims Department? Q: How to handle boutique institutions? An automation opportunity is the initial data pull/reporting book (for each institution).
Department(s): Patient Safety, Finance, Claims
Contacts: Winnie, Sean, Carl
Data needs: coding taxonomy; contributing factors - compare to Harvard and CBS peers; org; claims; case financials

Claims Dashboard
Description: Provides the Claims Department (Beth, Carl, Kay) with the information needed to track claims activity -- includes trends of cases/defendants over time by different types of dates; losses/defendant sliced by firm, etc.
Current State: Astrid has built a SAS data set and uses Microsoft Excel as the front end.
Future State: Make production-ready. Verify what works well, what doesn't, and what new KPAs/displays are wanted (by Claims).
Department(s): Claims
Contacts: Beth

Phase 1B Reports Detail
36. Phase 1B Reports Detail (continued)
Report | Description | Current State | Future State | Department(s) | Contacts

Primary Care Report
Description: Primary Care is a strategic focus for Client. The content of a Primary Care "Report" will need to be defined -- interventions will focus in a number of areas, but especially Patient Engagement and Referral Management -- reports could focus on these areas, building on Diagnostic Process of Care and Referral Process.
Current State: Custom analyses. Inability to know where PCPs are practicing -- currently, Client data does not include the concept of "practice."
Future State: A high-level report. What is really needed is a clear understanding of what the business owner (Carol) wants to be able to know about Primary Care, and data definition/acquisition/mapping for PCP and practice so that we can report on this data, i.e., a Primary Care Data Mart. The deliverable here may be a data mart with an Excel/Tableau front end. Underwriting is another involved stakeholder here, and we need to understand implications for MMS.
Department(s): Patient Safety, Underwriting
Contacts: Carol, Caren-Elise
Data gaps: site & location not available for the enterprise; claims; case financials; org. Claims has location-site stored in either CMAPS or Excel; Patient Safety OPE also captures primary care practice sites; UW also captures location-site information. Need to identify the master record for location-site and also build the hierarchy.

Specialty – OB, ED, Surgery
Description: Prepared for the Chief's convening. Includes trend/comparison data for claim rate (per 10K births), academic versus community, benchmarked against CBS.
Current State: Denominator (and numerator) data required preparation/cleaning -- knowing where the birth occurred versus the sponsoring org was key.
Future State: Automate this. Note: denominators are different for different specialties (ED, Surgery). OB uses births from AHA.
Department(s): Patient Safety
Contacts: OB - Lisa, Tom Beatty; ED - Jay Scherer; Surgery - Bill, Kathy Dwyer

Claim rates by specialty, trend report
Description: In 2015, Elena/Astrid used our Claim Risk Model to plot observed claim rate trends by specialty, broken down by Academic/Community status. Using the model, they were able to predict what the claims rate should be (and compare it with the observed rate). What was most useful about this exercise was the ability to look at each specialty's trend lines and see any interesting or concerning patterns.
Future State: Automate this (a sketch of the observed-versus-expected comparison follows).
Department(s): Patient Safety
Contacts: Jonathan, Astrid, Elena

Phase 1B Reports Detail
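The trend report above asks to automate the observed-versus-expected comparison. A minimal sketch under stated assumptions: the record layout and the stand-in expected rates are invented for illustration, and in the real workflow the Claim Risk Model would supply the expected values:

```python
# Observed vs. model-expected claim rates by specialty and year.
claims = [  # hypothetical: one record per (specialty, year)
    {"specialty": "OB", "year": 2014, "claims": 12, "exposure_10k_births": 3.0},
    {"specialty": "OB", "year": 2015, "claims": 9,  "exposure_10k_births": 3.1},
]

EXPECTED_RATE = {"OB": 3.5}  # stand-in for model-predicted claims per 10K births

def observed_vs_expected(records):
    """Per specialty-year: observed rate, expected rate, and the gap to flag."""
    out = []
    for r in records:
        observed = r["claims"] / r["exposure_10k_births"]
        expected = EXPECTED_RATE[r["specialty"]]
        out.append({"specialty": r["specialty"], "year": r["year"],
                    "observed": round(observed, 2), "expected": expected,
                    "gap": round(observed - expected, 2)})
    return out

for row in observed_vs_expected(claims):
    print(row)
```

Once this runs on governed EDW data instead of a hand-built extract, "look at each specialty's trend lines" becomes a repeatable report rather than an annual exercise.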
37. Future-state EDW – BI Delivery Framework
[Diagram: a layered EDW (data layer) feeding an information consumption layer (BO Universe/Extracts framework, BO WebI, dashboards), with a Sandbox alongside. Three layers by user community interest: a detailed base layer (atomic, 100% of data elements) for detailed data and investigative analysis, which also helps set up reporting capability for the user community; a Most Used data layer (60% of data elements) offering one additional level of detail plus pre-built reports; and an Aggregate data layer (20% of data elements) for scorecards, dashboards, and a holistic view of the business. Data refresh frequency varies by layer, from daily to weekly to monthly. Audiences: Analysts, Power Users/trading-partner data extracts, Executives.]
EDW – Single Source of Truth for all Information Consumption needs
✓ Atomic detailed data serves Power Users who are interested in detailed data investigation, serviced through the BO Universe/Extract framework
✓ The Most Used data layer serves Analysts who are interested in pre-built reports, serviced through BO WebI
✓ The Aggregate data layer serves Executives who are interested in scorecards, dashboards, and higher-level aggregate business metrics, serviced through dashboards (a layering sketch follows)
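The layering above keeps the EDW as the single source of truth because both upper layers are derived from the atomic layer rather than loaded independently. A minimal sketch of that derivation, assuming hypothetical table and column names:

```python
# Deriving the "Most Used" and Aggregate layers from the atomic layer,
# so every consumption path traces back to one source of truth.
import pandas as pd

# Atomic layer: 100% of data elements, finest grain (one row per claim).
atomic = pd.DataFrame({
    "claim_id":  ["C1", "C2", "C3", "C4"],
    "specialty": ["OB", "OB", "ED", "ED"],
    "period":    ["2020-Q1", "2020-Q1", "2020-Q1", "2020-Q2"],
    "indemnity": [100_000.0, 0.0, 25_000.0, 0.0],
})

# "Most Used" layer: a curated subset of elements, still row-level,
# backing the pre-built reports analysts consume through BO WebI.
most_used = atomic[["specialty", "period", "indemnity"]]

# Aggregate layer: pre-summarized metrics for executive dashboards.
aggregate = (most_used
             .groupby(["specialty", "period"], as_index=False)
             .agg(claims=("indemnity", "size"),
                  total_indemnity=("indemnity", "sum")))

print(aggregate)
```

Because each layer is a projection or aggregation of the one below it, a correction applied to the atomic layer flows upward automatically on the next refresh, which is exactly the behavior a "single source of truth" claim requires.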