Summary of three national webinars: the three V's, the market, functional areas showing the most traction, hot revenue/ROI areas, architecture options, and using use cases to overcome objections.
SAP HANA and Apache Hadoop for Big Data Management (SF Scalable Systems Meetup), by Will Gardella
In this presentation I argue that the future of data management may see a split between (1) real-time in-memory systems such as SAP HANA for most enterprise workloads and (2) disk-based, free and open-source Apache Hadoop for certain specialized big data uses.
The presentation starts with a definition of the term big data, then discusses SAP HANA and Apache Hadoop from the perspective of suitability for enterprise use, with a particular focus on Hadoop. (The basics of SAP HANA were covered in the immediately preceding session.) This is followed by a description of currently available SAP support for Apache Hadoop in SAP BI 4.0 and SAP Data Services / EIM. Due to time constraints I did not discuss the Apache Hadoop support built into Sybase IQ.
CIO Guide to Using the SAP HANA Platform for Big Data, by Snehanshu Shah
This guide supports CIOs in setting up a system infrastructure for their business that can get the best out of Big Data. We describe what the SAP HANA platform can do and how it integrates with Hadoop and related technologies, looking at data life cycle management and data streaming. Concrete use cases point out the requirements associated with Big Data as well as the opportunities it offers, and how companies are already taking advantage of them.
This document discusses harnessing big data in real-time. It outlines how business requirements are increasingly demanding real-time insights from data. Traditional systems struggle with high latency, complexity, and costs when dealing with big data. The document proposes using SAP HANA and Hadoop together to enable instant analytics on vast amounts of data. It provides examples of using this approach for cancer genome analysis and other use cases to generate personalized and timely results.
Leveraging SAP, Hadoop, and Big Data to Redefine Business (DataWorks Summit)
The document discusses SAP's big data solutions leveraging SAP HANA, Hadoop, and related technologies. It covers SAP's strategy to provide an end-to-end platform for ingesting, processing, analyzing and acting on both structured and unstructured data at large scale. Key components discussed include Smart Data Access for querying Hadoop data virtually, virtual user defined functions for custom MapReduce jobs, and Smart Data Streaming for real-time analytics of streaming data. Use cases and customer deployments integrating SAP HANA and Hadoop are also mentioned.
Leveraging SAP HANA with Apache Hadoop and SAP Analytics (Method360)
The rise of big data and the Apache Hadoop platform allows for the capture and processing of data at an unprecedented scale and velocity. Watch this slide deck to get a comprehensive overview of the Apache Hadoop platform architecture and learn how to leverage the strengths of both the Apache Hadoop and SAP HANA platforms.
This is a point-of-view document showing the various possible techniques for integrating SAP HANA and Hadoop, their pros and cons, and the scenarios in which each is recommended.
The document discusses building an information analytics platform by integrating Hadoop and SAP HANA. It describes VUPICO's profile as a consulting firm focused on analytics using Hadoop and SAP HANA. It outlines the benefits of integrating these technologies, such as scalability, real-time access to large data, and a common view of business data. It also discusses using SAP HANA VORA to bridge Hadoop and SAP HANA and generate further value from predictive analytics.
Leveraging SAP, Hadoop, and Big Data to Redefine Business (DataWorks Summit)
The document discusses leveraging SAP, Hadoop, and big data technologies to redefine businesses. It describes how the volume of digital data is exploding and includes both relational and non-relational machine-generated data. The document outlines how SAP focuses on providing an end-to-end value chain through its HANA data platform, which provides in-memory analytics, dynamic data tiering between HANA and Hadoop, smart data integration and quality features, and the ability to consume, compute and store data. Key features of HANA's integration with Hadoop include smart data access to Hive and Spark, support for MapReduce jobs, and access to HDFS.
SAP HANA Vora is an in-memory query engine that leverages and extends the Apache Spark execution framework to provide interactive analytics on Hadoop. It bridges the gap between enterprise and big data by enabling precision decisions with contextual insights across systems. Key features include compiled queries for efficient processing across nodes, drill-downs into HDFS data from HANA, and open programming for data scientists. SAP HANA Vora simplifies big data ownership and makes all data more accessible for tasks like fraud detection, risk mitigation, and targeted marketing.
Hadoop, Spark and Big Data Summit presentation with SAP HANA Vora and a path ... (Ocean9, Inc.)
Slides from BrightTALK webinar; listen here: http://bit.ly/1rpVkov
Understand how popular open source and vendor-provided technologies integrate well, each making a key contribution to a successful big data stack.
But more important than the technology are the business goals you define and aim to achieve; see the many examples in this deck and the online presentation.
Lastly, how do you bridge the digital implementation gap between in-house capabilities and the market demands driving endless new technologies? How can you test-drive and prototype without owning and committing to all of these new technologies?
Listen to the full webcast to find out how: http://bit.ly/1rpVkov
Slides from a session at GigaCon Big Data conference in Warsaw, Poland on Jan 27, 2014. Updated with the content presented at Big Data Budapest on Aug 18, 2014.
Whether MVC for user interfaces or Spine and Leaf for data centers, new architecture patterns in our industry act as historical markers of the effectiveness and acceptance of new technologies. Practical techniques push the bounds, resulting in a shift. The application of distributed storage and streaming capabilities such as Kafka and, of course, Hadoop is shifting Big Data architectures from a layer-cake, North/South-oriented approach to one that can be thought of as an East/West architectural concept. The Lambda Architecture has recently become popular; this article presents an SAP HANA-based rendering of it.
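The Lambda Architecture described above can be sketched in a few lines: a batch layer holds precomputed views over historical data, a speed layer covers recent events, and a serving layer merges both at query time. This is a minimal illustration with hypothetical names and in-memory dictionaries standing in for the real batch and streaming stores, not the HANA-based rendering the article presents.

```python
# Toy Lambda Architecture: batch view + speed view, merged at query time.
# All names and stores are illustrative stand-ins for real systems.

batch_view = {}   # precomputed aggregates from the full historical dataset
speed_view = {}   # incremental aggregates from recent streaming events

def batch_recompute(events):
    """Batch layer: recompute the view from scratch over all history."""
    batch_view.clear()
    for key, value in events:
        batch_view[key] = batch_view.get(key, 0) + value

def stream_update(key, value):
    """Speed layer: fold a single new event into the real-time view."""
    speed_view[key] = speed_view.get(key, 0) + value

def query(key):
    """Serving layer: merge the batch and real-time views."""
    return batch_view.get(key, 0) + speed_view.get(key, 0)
```

A query after a batch run plus a few fresh events sees both: `batch_recompute([("clicks", 10)])` followed by `stream_update("clicks", 2)` makes `query("clicks")` return 12.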
FlexPod with SAP HANA and SAP Applications (Lishantian)
This document discusses Cisco and NetApp solutions for implementing SAP HANA, including:
1) The FlexPod approach which provides a simplified architecture for deploying SAP HANA appliances on Cisco UCS and NetApp storage up to 48TB.
2) Implementing SAP HANA using Tailored Data Center Integration (TDI) on FlexPod, which provides more flexibility compared to appliance-based deployments.
3) Two use cases for SAP HANA TDI: running multiple SAP HANA production systems on a single Cisco UCS, and reusing an existing data center network rather than the network components included in the solution.
This guide was generated in the January-February 2014 timeframe.
Using SAP HANA Smart Data Access (SDA), it is possible to access remote data without having to replicate the data to the SAP HANA database beforehand. The following sources are supported (as of 2013):
- Teradata database,
- SAP Sybase ASE,
- SAP Sybase IQ,
- Intel Distribution for Apache Hadoop,
- SAP HANA.
SAP HANA handles the remote data like local tables on the database. Automatic data type conversion makes it possible to map data types from databases connected via SAP HANA Smart Data Access to SAP HANA data types.
This guide explains a step-by-step approach to SAP HANA SDA for Hadoop data, including the following:
- Hadoop Installation
- Data Load in Hadoop system
- Activities on Unstructured Data in Hadoop system
- ODBC driver installation and configuration on the HANA server for Hadoop system data access
- Smart Data Access in SAP HANA (through SAP HANA Studio), using Hadoop as a remote data source
Setup used for this guide:
1) Hadoop: HDP 1.3 for Windows (Hortonworks Data Platform), standalone, on a Dell laptop running Windows 7 64-bit with 8 GB RAM
2) SAP HANA Server: running on a VM, 24 GB, standalone HANA 1.0 SPS 7, SLES 11 SP1
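The automatic data type conversion mentioned above can be illustrated with a toy mapping: remote column types (here, Hive types) are translated to SAP HANA SQL types so a remote table can be exposed like a local one. The mapping table below is a plausible sketch for illustration, not the exact conversion table HANA ships.

```python
# Illustrative Hive-to-HANA type mapping for an SDA-style virtual table.
# This dictionary is an assumption for demonstration, not SAP's actual map.
HIVE_TO_HANA = {
    "TINYINT": "TINYINT",
    "SMALLINT": "SMALLINT",
    "INT": "INTEGER",
    "BIGINT": "BIGINT",
    "FLOAT": "REAL",
    "DOUBLE": "DOUBLE",
    "STRING": "NVARCHAR(5000)",
    "BOOLEAN": "BOOLEAN",
    "TIMESTAMP": "TIMESTAMP",
}

def map_hive_columns(columns):
    """Map (name, hive_type) pairs to (name, hana_type) pairs.

    Unknown remote types fall back to a wide NVARCHAR, the kind of
    lowest-common-denominator choice a federation layer might make.
    """
    return [
        (name, HIVE_TO_HANA.get(hive_type.upper(), "NVARCHAR(5000)"))
        for name, hive_type in columns
    ]
```

For example, `map_hive_columns([("id", "bigint"), ("name", "string")])` yields `[("id", "BIGINT"), ("name", "NVARCHAR(5000)")]`, which is the shape of column list a virtual table definition would need on the HANA side.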
20100430 Introduction to BusinessObjects Data Services, by Junhyun Song
This document provides an overview and agenda for a presentation on SAP BusinessObjects Data Services XI 3.0. It discusses how data integration and quality tools like Data Services can help address challenges around managing enterprise data by providing a single tool for data integration, quality management, and metadata management. The presentation agenda covers why effective information management is important, an introduction to Data Services, how metadata management impacts data lineage and trustworthiness, use cases for Data Services in SAP environments, and concludes with a wrap-up.
Big Data, Big Thinking: Untapped Opportunities (SAP Technology)
The document discusses a webinar by SAP and Ernst & Young on big data. It explores big data adoption trends, how organizations can leverage big data to improve business performance and manage risks, and common use cases across industries like retail, transportation, and government. The webinar provides guidance on how organizations can get started with big data initiatives by identifying executive sponsors, use cases, architectural gaps, and building a business case to justify investment.
Business intelligence in the era of big data, by JC Raveneau
This document discusses how companies can leverage big data and business intelligence (BI) tools to gain insights. It provides examples of how companies like GE generate large amounts of machine data daily. It also outlines the primary big data scenarios for insights from real-time data sources. Finally, it summarizes SAP's BI and big data solutions like HANA, which provide a platform for real-time analytics across structured and unstructured data at large volumes.
Hybrid Data Architecture: Integrating Hadoop with a Data Warehouse (DataWorks Summit)
Mather Economics wanted a data architecture that could integrate Hadoop and a data warehouse to provide a responsive user experience for data slicing, aggregating, and modeling on 100% of data samples. A hybrid approach was implemented that uses Hadoop for ingestion and storage and a data warehouse for transformation, integration, and dimensional modeling to support both internal analysts and external customers. This hybrid approach meets the goals of being data and technology agnostic while providing speed for analytics.
This document discusses strategies for successfully utilizing a data lake. It notes that creating a data lake is just the beginning and that challenges include data governance, metadata management, access, and effective use of the data. The document advocates for data democratization through discovery, accessibility, and usability. It also discusses best practices like self-service BI and automated workload migration from data warehouses to reduce costs and risks. The key is to address the "data lake dilemma" of these challenges to avoid a "data swamp" and slow adoption.
Tomorrow’s industry leaders are the ones that draw new insights by creatively combining the data they have today. That can be a real challenge for most companies, especially since data types and volumes have grown astronomically over the last decade. And when you pair that with the tasks of gathering, organizing, and managing information from all over an organization and ensuring its accuracy, you have quite a job on your hands. But with our comprehensive portfolio of enterprise information management solutions, your business can unlock the power of its data, achieve entirely new insights, and use them to create a winning future.
SAP HANA SPS10 will include enhancements across several key areas:
- It will simplify administration and monitoring for multi-tenant and cloud deployments.
- It will strengthen high availability and disaster recovery capabilities to ensure the highest service level agreements.
- It will provide new and enhanced security features for enterprise-ready deployment, including simplified security administration and enhanced isolation for multitenant database containers.
Eliminating the Challenges of Big Data Management Inside Hadoop (Hortonworks)
Your Big Data strategy is only as good as the quality of your data. Today, deriving business value from data depends on how well your company can capture, cleanse, integrate and manage data. During this webinar, we discussed how to eliminate the challenges to Big Data management inside Hadoop.
Go over these slides to learn:
· How to use the scalability and flexibility of Hadoop to drive faster access to usable information across the enterprise.
· Why a pure-YARN implementation for data integration, quality and management delivers competitive advantage.
· How to use the flexibility of RedPoint and Hortonworks to create an enterprise data lake where data is captured, cleansed, linked and structured in a consistent way.
The Modern Data Architecture for Advanced Business Intelligence with Hortonwo... (Hortonworks)
The document provides an overview of a webinar presented by Anurag Tandon and John Kreisa of Hortonworks and MicroStrategy respectively. It discusses the drivers for adopting a modern data architecture including the growth of new types of data and the need for efficiency. It outlines how Apache Hadoop can power a modern data architecture by providing scalable storage and processing. Key requirements for Hadoop adoption in the enterprise are also reviewed like the need for integration, interoperability, essential services, and leveraging existing skills. MicroStrategy's role in enabling analytics on big data and across all data sources is also summarized.
The Modern Data Architecture for Predictive Analytics with Hortonworks and Re... (Revolution Analytics)
Hortonworks and Revolution Analytics have teamed up to bring the predictive analytics power of R to Hortonworks Data Platform.
Hadoop, being a disruptive data processing framework, has made a large impact in the data ecosystems of today. Enabling business users to translate existing skills to Hadoop is necessary to encourage the adoption and allow businesses to get value out of their Hadoop investment quickly. R, being a prolific and rapidly growing data analysis language, now has a place in the Hadoop ecosystem.
This presentation covers:
- Trends and business drivers for Hadoop
- How Hortonworks and Revolution Analytics play a role in the modern data architecture
- How you can run R natively in Hortonworks Data Platform to simply move your R-powered analytics to Hadoop
Presentation replay at:
http://www.revolutionanalytics.com/news-events/free-webinars/2013/modern-data-architecture-revolution-hortonworks/
Apache Hadoop and its role in Big Data architecture - Himanshu Bari (jaxconf)
In today’s world of exponentially growing big data, enterprises are becoming increasingly aware of the business utility and necessity of harnessing, storing and analyzing this information. Apache Hadoop has rapidly evolved to become a leading platform for managing and processing big data, with the vital management, monitoring, metadata and integration services required by organizations to glean maximum business value and intelligence from their burgeoning amounts of information on customers, web trends, products and competitive markets. In this session, Hortonworks' Himanshu Bari will discuss the opportunities for deriving business value from big data by looking at how organizations utilize Hadoop to store, transform and refine large volumes of this multi-structured information. He will also discuss the evolution of Apache Hadoop and where it is headed, the component requirements of a Hadoop-powered platform, as well as solution architectures that allow for Hadoop integration with existing data discovery and data warehouse platforms. In addition, he will look at real-world use cases where Hadoop has helped to produce more business value, augment productivity or identify new and potentially lucrative opportunities.
This document provides an overview of Apache Atlas and how it addresses big data governance issues for enterprises. It discusses how Atlas provides a centralized metadata repository that allows users to understand data across Hadoop components. It also describes how Atlas integrates with Apache Ranger to enable dynamic security policies based on metadata tags. Finally, it outlines new capabilities in upcoming Atlas releases, including cross-component data lineage tracking and a business taxonomy/catalog.
Learn how, when organizations combine the HP Vertica Analytics Platform and Hortonworks, they can quickly explore and analyze a broad variety of data types and transform them into actionable information that allows them to better understand how their customers and site visitors interact with their business, offline and online.
A beginner's guide to Scrum, not only for software. Defines roles, key meetings, and artifacts. Seven certifications are available through the Scrum Alliance. Make the journey.
Shows how RDS supports HANA, the new assemble-to-order strategy utilizing RDS, business case studies tied to technology, and an evolution path for CRM utilizing RDS, HANA, and the cloud.
Understanding new Rapid Deployment Solutions: 150+ applications that help solve business problems in weeks, not years. Written from a basic user's viewpoint.
The document describes a three-part webinar series from the Americas' SAP Users' Group on Agile methodology based on SAP's ASAP 8.0 approach. The webinars cover Agile project launch and requirements, and Agile realization and releases. They discuss how to leverage SAP's Rapid Deployment Solutions and Agile approaches to accelerate SAP projects.
I think of training as the filling in a marketing sandwich. Training itself should be part of a company's or organization's overall marketing plan. But you also need a marketing plan to ensure the success of your training. E-learning, or online training and education, keeps the online channel open to your prospective students and thus can play a tactical role in delivering marketing messages. This presentation discusses the necessity of having a solid marketing plan for your training deployment and some tactical implementations we've used successfully.
This document identifies common website performance bottlenecks and how to diagnose and address them. It begins by outlining the objectives of identifying the source, symptom, cause, measurements, and cures for bottlenecks. It then provides terms and concepts in application performance testing. The rest of the document discusses specific bottlenecks that can occur in the web server, application server, database server, and network tiers and provides examples of measurements and cures for each.
The document presents information about Thomas Wallet, an Agile and SAP expert who has done Agile coaching, training, and evangelization. He also has experience implementing SAP modules and upgrades. The document discusses how to adapt Agile methodologies such as Scrum to SAP projects, and how SAP culture can be reconciled with Agile practices.
The document discusses agile methodologies for SAP projects as an alternative to traditional waterfall models. It describes the challenges of waterfall approaches, including difficulties estimating budgets, requirements changes late in the project, and inability to adapt to changes. The document then summarizes the Scrum and Kanban agile methodologies. Scrum uses short iterative sprints to incrementally develop functionality. Kanban uses a pull-based system with visual boards and limits on work-in-progress to manage flow and identify bottlenecks. Both aim to deliver value earlier, adapt to changes, and improve throughput and lead times over traditional waterfall approaches.
The document discusses applying agile project management methods to ERP implementations. It outlines key principles for an agile ERP approach, including ensuring communication, simplicity, feedback and embracing change. Specific practices for applying agile methods in the product data management domain are also presented, such as assuming simplicity, enabling incremental change, and maximizing stakeholder value.
The document outlines the 6 phases of the SAP ASAP 8 implementation methodology: project preparation, business blueprint, realization, final preparation, go-live and support, and operate. It then describes the role of an SAP functional consultant in more detail, including evaluating business requirements, configuring the system, documentation, testing, training, and support. The consultant is responsible for transforming requirements into logical and technical models, customizing the system, and ensuring the new processes work as intended.
This document discusses new features in SAP HANA SPS 10 for Hadoop and Spark integration, including a native Spark SQL integration using a Spark adapter, Ambari integration with the HANA cockpit for unified administration of HANA and Hadoop nodes, and data lifecycle management between HANA and Hadoop using a relocation agent. It also provides steps for configuring the Spark controller and details the Ambari integration with the HANA cockpit.
The strategic relationship between Hortonworks and SAP enables SAP to resell Hortonworks Data Platform (HDP) and provide enterprise support for their global customer base. This means SAP customers can incorporate enterprise Hadoop as a complement within a data architecture that includes SAP HANA, Sybase and SAP BusinessObjects enabling a broad range of new analytic applications.
Here is a presentation that I recently presented for an SAP Users Group. It provides a good overview of the concepts of Lean and Agile and the considerations for introducing Lean and Agile into an SAP or ERP project.
ERP Implementation Using Agile Project Management with Scrum - dj1arry
This document discusses using Agile project management and Scrum for ERP implementation. It introduces ERP systems and Agile project management approaches like Scrum. Scrum uses product owners, development teams, and scrum masters along with artifacts like product backlogs and sprints. Key Scrum ceremonies include sprint planning, daily stand-ups, and sprint reviews. Using Scrum for ERP implementation provides benefits like transparency, inspection, adaptation and aligning with Agile values.
SAP Sales and Distribution Tutorial (PPT) - chandusapsd
www.magnifictraining.com - "SAP QM (Quality Management)" online training. Contact us: [email protected] or +91 9052666559. Taught by real-time experts from Hyderabad and Bangalore; serving India, USA, Canada, UK, Australia, and South Africa.
SAP SD(Sales and Distribution) Online Training Course Content
SAP SD Introduction. Sales Process flow, Test case details. SAP SD Terminologies
Enterprise Structure -1
Enterprise Structure -2
Customer Master – Customising
Customer Master- Creation
Material Master-1
Material Master-2
Pricing -1, General pricing Concepts, Condition Technique
Pricing -2, SAP pricing
Condition records
Sales Order Processing -1 – Sales document Types
Sales Order Processing -2 – Item Category and Schedule Line Category
Sales order processing – End User Part
Transfer of Requirements and Availability Check
Standard Delivery Process -1-Customising
Standard Delivery Process -2-Creation
Account Determination
Standard Billing Process
Quotation and Follow on Quotation
Taxes
Rush Order and Cash Sales
Consignment process
Contracts -1, Document flow and billing Plan
Contracts-2-Creation
Credit Memo and debit memo
Returns
Copy Control and Incompletion Log
Reports
Free Goods
Material Determination
Output Determination
Additional topics if time permits
SAP Tables
Revision of Topics
Condition technique is a configuration technique in SAP used to configure complex business rules, such as pricing. It consists of several key components, including a field catalog, condition tables, an access sequence, condition types, pricing procedures, and pricing procedure determination. Condition tables contain business rules and are accessed in the order specified by the access sequence. Condition types represent logical components like taxes or discounts. Pricing procedures combine condition types and are assigned to documents like sales orders. Overall, condition technique provides a rules engine for flexibly configuring diverse and changing business rules through its various components.
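The access-sequence idea described above — search the most specific condition table first, then fall back to more general ones — can be sketched in a few lines. The following is an illustrative model only (the field names, keys, and rates are invented), not SAP's actual implementation:

```python
# Hypothetical condition tables, keyed by increasingly general field combinations.
# The access sequence searches them from most specific to least specific
# and stops at the first hit.
CONDITION_TABLES = [
    # (key fields, {key tuple: discount rate})
    (("customer", "material"), {("C100", "M-01"): 0.15}),  # customer/material discount
    (("customer",),            {("C100",): 0.10}),         # customer-level discount
    (("material",),            {("M-01",): 0.05}),         # material-level discount
]

def find_condition(doc):
    """Return the first matching rate for a sales document, or None."""
    for key_fields, table in CONDITION_TABLES:
        key = tuple(doc[f] for f in key_fields)
        if key in table:
            return table[key]
    return None

order = {"customer": "C100", "material": "M-01"}
print(find_condition(order))                                     # most specific hit: 0.15
print(find_condition({"customer": "C200", "material": "M-01"}))  # falls back: 0.05
```

The point of the layered lookup is that new business rules can be added as data (new rows or new tables in the sequence) rather than as code changes, which is what makes the condition technique behave like a rules engine.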
The document discusses SAP's ASAP (AcceleratedSAP) implementation methodology. It provides an overview of the ASAP roadmap structure and phases. The roadmap is organized into phases including project preparation, business blueprint, realization, final preparation, and go-live and support. Each phase has deliverables and activities to implement SAP solutions according to proven best practices to help ensure project success.
Big Data Management: A Unified Approach to Drive Business Results - CA Technologies
Traditional data management is changing rapidly, attributed to significant changes brought on by evolving big data environments. IT complexity is on the rise as businesses choose the technologies they need to support their big data strategies and targeted business outcomes. Now, more than ever, we need IT management tools that can accommodate and effectively manage these evolving, complex environments to ensure that enterprises can move forward with their preferred technology and vendor choices.
For more information on Mainframe solutions from CA Technologies, please visit: http://bit.ly/1wbiPkl
This document discusses Hadoop and big data. It notes that digital data doubles every two years and that 85% of data is unstructured. Hadoop provides a cheaper way to store large amounts of both structured and unstructured data compared to traditional storage options. Hadoop also allows data to be stored first before defining what questions will be asked of the data.
This Presentation is completely on Big Data Analytics and Explaining in detail with its 3 Key Characteristics including Why and Where this can be used and how it's evaluated and what kind of tools that we use to store data and how it's impacted on IT Industry with some Applications and Risk Factors
The Double Win: Business Transformation and In-Year ROI and TCO Reduction - MongoDB
This document discusses how modern information management with flexible data platforms like MongoDB can help businesses transform and drive ROI through cost reduction and increased productivity compared to legacy systems. It provides examples of strategic areas where MongoDB can modernize an organization's full technology stack from data in motion/at rest to apps, compute, storage and networks. Success stories show how MongoDB has helped companies like Barclays reduce costs and complexity while improving resiliency, agility and innovation.
This document provides an overview of big data presented by five individuals. It defines big data, discusses its three key characteristics of volume, velocity and variety. It explains how big data is stored, selected and processed using techniques like Hadoop and MapReduce. Examples of big data sources and tools are provided. Applications of big data across various industries are highlighted. Both the risks and benefits of big data are summarized. The future growth of big data and its impact on IT is also outlined.
This document outlines a seminar presentation on big data. It begins with an introduction that defines big data and notes how it emerged in the early 21st century mainly through online firms. It then covers the three key characteristics of big data - volume, velocity and variety. Other sections discuss storing, selecting and processing big data, as well as tools used and applications. Risks, benefits and the future impact and growth of big data are also summarized. The presentation provides an overview of the key concepts regarding big data.
This document provides an overview of big data in a seminar presentation. It defines big data, discusses its key characteristics of volume, velocity and variety. It describes how big data is stored, selected and processed. Examples of big data sources and tools used are provided. The applications and risks of big data are summarized. Benefits to organizations from big data analytics are outlined, as well as its impact on IT and future growth prospects.
This document provides an overview of big data, including its definition, characteristics, sources, tools, applications, risks and benefits. It defines big data as large volumes of diverse data that can be analyzed to reveal patterns and trends. The three key characteristics are volume, velocity and variety. Examples of big data sources include social media, sensors and user data. Tools used for big data include Hadoop, MongoDB and analytics programs. Big data has many applications and benefits but also risks regarding privacy and regulation. The future of big data is strong with the market expected to grow significantly in coming years.
Implementing an Efficient Data Governance and Security Strategy with ... - Denodo
Watch full webinar here: https://bit.ly/3lSwLyU
In the era of an information explosion spread across multiple sources, data governance is a key component for guaranteeing the availability, usability, integrity, and security of information. Likewise, the set of processes, roles, and policies it defines enables organizations to achieve their objectives while ensuring the efficient use of their data.
Data virtualization is one of the strategic tools for implementing and optimizing data governance. This technology allows companies to create a 360º view of their data and establish security controls and access policies across the entire infrastructure, regardless of the data's format or location. In doing so, it brings together multiple data sources, makes them accessible from a single layer, and provides lineage capabilities for monitoring changes to the data.
Join this webinar to learn:
- How to accelerate the integration of data from fragmented sources across internal and external systems and obtain a complete view of the information.
- How to enable a single, protected data-access layer across the entire enterprise.
- How data virtualization provides the foundations for complying with current data-protection regulations through data auditing, cataloging, and security.
1. Introduction
2. Overview
3. Why Big Data
4. Applications of Big Data
5. Risks of Big Data
6. Benefits & Impact of Big Data
7. Conclusion
'Big Data' is similar to 'small data', but bigger in size.
Handling bigger data, however, requires different approaches: techniques, tools, and architecture.
The aim is to solve new problems, or old problems in a better way.
Big Data generates value from the storage and processing of very large quantities of digital information that cannot be analyzed with traditional computing techniques.
It's Not About Big Data - It's About Big Insights - SAP Webinar - 20 Aug 2013 - Edgar Alejandro Villegas
Presentation slides of:
It’s Not About Big Data – It’s About Big Insights - SAP Webinar - 20 Aug 2013 - PDF
Scott Mackenzie - Sr. Director, Platform & Analytics CoE
Michael Golz - CIO, SAP Americas
Ken Demma - VP, Insight Driven Marketing
20 Aug 2013 - Webcast - http://goo.gl/T74WAL
Rethink Your 2021 Data Management Strategy with Data Virtualization (ASEAN) - Denodo
Watch full webinar here: https://bit.ly/2O2r3NP
In the last several decades, BI has evolved from large, monolithic implementations controlled by IT to orchestrated sets of smaller, more agile capabilities that include visual-based data discovery and governance. These new capabilities provide more democratic analytics accessibility that is increasingly being controlled by business users. However, given the rapid advancements in emerging technologies such as cloud and big data systems and the fast changing business requirements, creating a future-proof data management strategy is an incredibly complex task.
Catch this on demand session to understand:
- BI program modernization challenges
- What is data virtualization and why is its adoption growing so quickly?
- How data virtualization works and how it compares to alternative approaches to data integration
- How modern data virtualization can significantly increase agility while reducing costs
Topics include: the transformative value of real-time data and analytics, and current barriers to adoption; the importance of an end-to-end solution for data-in-motion that includes ingestion, processing, and serving; and Apache Kudu's role in simplifying real-time architectures.
Capgemini's Data WARP: Accelerate Your Journey to Insights - Capgemini
More data, more insights. Data is at the center of change and business value! There is no limit to volume, structure, or timing. But more data brings more challenges, such as storage costs, increased complexity of the data architecture, and a lack of agility. Many organizations are still faced with scattered data lying in silos across the organization. They often lack a clear business case for funding a transformation of their data landscape, or they suffer from ineffective coordination of data and analytics initiatives. Finally, the dependency on legacy systems for data processing and management remains high. Data WARP (Wide Angle Rationalization Program) helps organizations improve the performance of their data and insights architecture landscape by providing key deliverables such as rationalization designs, business cases, and a transformation roadmap.
Presented at Informatica World 2016 by Jorgen Heizenberg, CTO- Netherlands, Capgemini Insights & Data
What Does Data Governance Have in Common with an Amusement Park? - Denodo
Watch full webinar here: https://bit.ly/3Ab9gYq
Imagine arriving at an amusement park with your family and starting your day without the usual map that lets you plan which shows to see, which rides to go on, and where the kids can and cannot ride. You probably won't get the most out of your day and will have missed a lot. Some people like to explore as they go and discover things little by little, but when it comes to business, going in unprepared can be fatal...
In the era of an information explosion spread across multiple sources, data governance is key to guaranteeing the availability, usability, integrity, and security of that information. Likewise, the set of processes, roles, and policies it defines enables organizations to achieve their objectives while ensuring the efficient use of their data.
Data virtualization, a strategic tool for implementing and optimizing data governance, allows companies to create a 360º view of their data and establish security controls and access policies across the entire infrastructure, regardless of the data's format or location. In doing so, it brings together multiple data sources, makes them accessible from a single layer, and provides lineage capabilities for monitoring changes to the data.
In this webinar you will learn how to:
- Accelerate the integration of data from fragmented sources across internal and external systems and obtain a complete view of the information.
- Enable a single, protected data-access layer across the entire enterprise.
- Use data virtualization to provide the foundations for complying with current data-protection regulations through data auditing, cataloging, and security.
This document discusses big data, including its definition, characteristics of volume, velocity, and variety. It describes sources of big data like administrative data, transactions, public data, sensor data, and social media. It discusses processing big data using techniques like Hadoop MapReduce. It outlines benefits like real-time decision making but also drawbacks like security, privacy, and performance issues. It provides some facts about the size of data generated daily by companies and potential impacts and future growth of the big data industry and job market.
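The MapReduce model mentioned above can be illustrated without a Hadoop cluster: map each record to key/value pairs, shuffle the pairs by key, then reduce each group. The following in-process sketch is illustrative only and does not use Hadoop's actual API:

```python
from collections import defaultdict

def map_phase(records, mapper):
    """Apply the mapper to every record, emitting (key, value) pairs."""
    for record in records:
        yield from mapper(record)

def shuffle(pairs):
    """Group all values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    """Apply the reducer to each key's list of values."""
    return {key: reducer(key, values) for key, values in groups.items()}

# The classic word-count example over a toy "dataset"
lines = ["big data needs big tools", "hadoop stores big data"]
mapper = lambda line: [(word, 1) for word in line.split()]
reducer = lambda word, counts: sum(counts)

counts = reduce_phase(shuffle(map_phase(lines, mapper)), reducer)
print(counts["big"], counts["data"])  # 3 2
```

The value of the model is that the map and reduce steps are independent per record and per key, so a framework like Hadoop can run them in parallel across many machines without changing the program's logic.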
The document discusses IBM's Big Data Platform for turning large and complex data into business insights. It provides an overview of key big data challenges faced by organizations and how the IBM platform addresses these challenges through solutions that handle the volume, velocity, variety and veracity of big data. These solutions include analytics, data warehousing, streaming analytics and Hadoop technologies. Use cases are presented for big data exploration, enhancing customer views, security intelligence, operations analysis and augmenting data warehouses.
This document discusses big data and provides an overview of key concepts:
- Big data is defined as datasets that are too large or complex for traditional data management tools to handle. It is characterized by volume, velocity, and variety.
- Big data comes from a variety of sources like social media, sensors, web logs, and transaction systems. It is growing rapidly due to the digitization of information.
- Big data can be used for applications like enhancing customer insights, optimizing operations, and extending security and intelligence capabilities. Example use cases are described.
- Architecting solutions for big data requires handling its scale and integrating diverse data types and sources. Both traditional and new analytics approaches are needed.
Communicating your brand today is best done through stories: an effective way for people to understand your uniqueness through visual stories. From why stories matter to using "PAR" as a way to communicate.
In today's workforce there are five active generations. How has our past shaped us, and how do we interact? What values do we all share? What is the best way to communicate and learn? This presentation addresses the basics.
Order to Cash. Cash is king. Covers the prime elements, the points that block a successful end-to-end flow, and the KPIs/metrics for gauging where your company really ranks: business leader, average, or laggard.
A look at what is driving Big Data: market projections to 2017, plus customer and infrastructure priorities. What drove Big Data in 2013, and what the barriers were. An introduction to business analytics and its types, building an analytics approach, and ten steps to build an analytics platform within your company, plus key takeaways.
So what is SAP HANA? How can it help my area (line of business) and our business overall? The presentation lays out the basics and shows how HANA can help users enable their area and business in real time.
The document discusses order-to-cash (O2C) processes. It notes that O2C refers to taking customer orders through various channels, fulfilling orders, shipping products, generating invoices, and collecting payments. Key pressures driving focus on O2C include reducing costs and improving customer service. Common objectives of O2C improvement are to reduce days sales outstanding and improve cash flow forecasting. The document outlines a typical O2C process and symptoms of a broken O2C process. It also applies Aberdeen's PACE framework to analyze pressures, actions, capabilities, and enablers related to optimizing an organization's O2C cycle.
Special Meetup Edition - TDX Bengaluru Meetup #52 - shyamraj55
We’re bringing the TDX energy to our community with 2 power-packed sessions:
🛠️ Workshop: MuleSoft for Agentforce
Explore the new version of our hands-on workshop featuring the latest Topic Center and API Catalog updates.
📄 Talk: Power Up Document Processing
Dive into smart automation with MuleSoft IDP, NLP, and Einstein AI for intelligent document workflows.
TrsLabs - Fintech Product & Business Consulting - Trs Labs
Hybrid Growth Mandate Model with TrsLabs
Strategic investments, inorganic growth, and business-model pivoting are critical activities that businesses don't do or change every day. In cases like this, it may benefit your business to engage a temporary external consultant.
An unbiased plan driven by clearcut deliverables, market dynamics and without the influence of your internal office equations empower business leaders to make right choices.
Getting things done within a budget and a timeframe is key to growing a business, no matter whether you are a start-up or a big company.
Talk to us & Unlock the competitive advantage
The Evolution of Meme Coins: A New Era for Digital Currency - Abi John
Analyze the growth of meme coins from mere online jokes to potential assets in the digital economy. Explore the community, culture, and utility as they elevate themselves to a new era in cryptocurrency.
Increasing Retail Store Efficiency: How Can Planograms Save Time and Money? - Anoop Ashok
In today's fast-paced retail environment, efficiency is key. Every minute counts, and every penny matters. One tool that can significantly boost your store's efficiency is a well-executed planogram. These visual merchandising blueprints not only enhance store layouts but also save time and money in the process.
Massive Power Outage Hits Spain, Portugal, and France: Causes, Impact, and On... - Aqusag Technologies
In late April 2025, a significant portion of Europe, particularly Spain, Portugal, and parts of southern France, experienced widespread, rolling power outages that continue to affect millions of residents, businesses, and infrastructure systems.
Complete Guide to Advanced Logistics Management Software in Riyadh - Software Company
Explore the benefits and features of advanced logistics management software for businesses in Riyadh. This guide delves into the latest technologies, from real-time tracking and route optimization to warehouse management and inventory control, helping businesses streamline their logistics operations and reduce costs. Learn how implementing the right software solution can enhance efficiency, improve customer satisfaction, and provide a competitive edge in the growing logistics sector of Riyadh.
Designing Low-Latency Systems with Rust and ScyllaDB: An Architectural Deep Dive - ScyllaDB
Want to learn practical tips for designing systems that can scale efficiently without compromising speed?
Join us for a workshop where we’ll address these challenges head-on and explore how to architect low-latency systems using Rust. During this free interactive workshop oriented for developers, engineers, and architects, we’ll cover how Rust’s unique language features and the Tokio async runtime enable high-performance application development.
As you explore key principles of designing low-latency systems with Rust, you will learn how to:
- Create and compile a real-world app with Rust
- Connect the application to ScyllaDB (NoSQL data store)
- Negotiate tradeoffs related to data modeling and querying
- Manage and monitor the database for consistently low latencies
Mobile App Development Company in Saudi Arabia - Steve Jonas
EmizenTech is a globally recognized software development company, proudly serving businesses since 2013. With over 11 years of industry experience and a team of 200+ skilled professionals, we have successfully delivered 1200+ projects across various sectors. As a leading mobile app development company in Saudi Arabia, we offer end-to-end solutions for iOS, Android, and cross-platform applications. Our apps are known for their user-friendly interfaces, scalability, high performance, and strong security features. We tailor each mobile application to meet the unique needs of different industries, ensuring a seamless user experience. EmizenTech is committed to turning your vision into a powerful digital product that drives growth, innovation, and long-term success in the competitive mobile landscape of Saudi Arabia.
AI Changes Everything - Talk at Cardiff Metropolitan University, 29th April 2025 - Alan Dix
Talk at the final event of Data Fusion Dynamics: A Collaborative UK-Saudi Initiative in Cybersecurity and Artificial Intelligence funded by the British Council UK-Saudi Challenge Fund 2024, Cardiff Metropolitan University, 29th April 2025
https://alandix.com/academic/talks/CMet2025-AI-Changes-Everything/
Is AI just another technology, or does it fundamentally change the way we live and think?
Every technology has a direct impact with micro-ethical consequences, some good, some bad. However more profound are the ways in which some technologies reshape the very fabric of society with macro-ethical impacts. The invention of the stirrup revolutionised mounted combat, but as a side effect gave rise to the feudal system, which still shapes politics today. The internal combustion engine offers personal freedom and creates pollution, but has also transformed the nature of urban planning and international trade. When we look at AI the micro-ethical issues, such as bias, are most obvious, but the macro-ethical challenges may be greater.
At a micro-ethical level AI has the potential to deepen social, ethnic and gender bias, issues I have warned about since the early 1990s! It is also being used increasingly on the battlefield. However, it also offers amazing opportunities in health and educations, as the recent Nobel prizes for the developers of AlphaFold illustrate. More radically, the need to encode ethics acts as a mirror to surface essential ethical problems and conflicts.
At the macro-ethical level, by the early 2000s digital technology had already begun to undermine sovereignty (e.g. gambling), market economics (through network effects and emergent monopolies), and the very meaning of money. Modern AI is the child of big data, big computation and ultimately big business, intensifying the inherent tendency of digital technology to concentrate power. AI is already unravelling the fundamentals of the social, political and economic world around us, but this is a world that needs radical reimagining to overcome the global environmental and human challenges that confront us. Our challenge is whether to let the threads fall as they may, or to use them to weave a better future.
What is Model Context Protocol (MCP)? The new technology for communication bw... - Vishnu Singh Chundawat
The MCP (Model Context Protocol) is a framework designed to manage context and interaction within complex systems. This SlideShare presentation will provide a detailed overview of the MCP Model, its applications, and how it plays a crucial role in improving communication and decision-making in distributed systems. We will explore the key concepts behind the protocol, including the importance of context, data management, and how this model enhances system adaptability and responsiveness. Ideal for software developers, system architects, and IT professionals, this presentation will offer valuable insights into how the MCP Model can streamline workflows, improve efficiency, and create more intuitive systems for a wide range of use cases.
DevOpsDays Atlanta 2025 - Building 10x Development Organizations - Justin Reock
Building 10x Organizations with Modern Productivity Metrics
10x developers may be a myth, but 10x organizations are very real, as proven by the influential study performed in the 1980s, ‘The Coding War Games.’
Right now, here in early 2025, we seem to be experiencing YAPP (Yet Another Productivity Philosophy), and that philosophy is converging on developer experience. It seems that with every new method we invent for the delivery of products, whether physical or virtual, we reinvent productivity philosophies to go alongside them.
But which of these approaches actually work? DORA? SPACE? DevEx? What should we invest in and create urgency behind today, so that we don’t find ourselves having the same discussion again in a decade?
How Can I Use the AI Hype in My Business Context? - Daniel Lehner
Is AI just hype? Or is it the game changer your business needs?
Everyone’s talking about AI but is anyone really using it to create real value?
Most companies want to leverage AI. Few know 𝗵𝗼𝘄.
✅ What exactly should you ask to find real AI opportunities?
✅ Which AI techniques actually fit your business?
✅ Is your data even ready for AI?
If you’re not sure, you’re not alone. This is a condensed version of the slides I presented at a Linkedin webinar for Tecnovy on 28.04.2025.
HCL Nomad Web – Best Practices and Managing Multiuser Environments - panagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-nomad-web-best-practices-and-managing-multiuser-environments/
HCL Nomad Web is heralded as the next generation of the HCL Notes client, offering numerous advantages such as eliminating the need for packaging, distribution, and installation. Nomad Web client upgrades will be installed “automatically” in the background. This significantly reduces the administrative footprint compared to traditional HCL Notes clients. However, troubleshooting issues in Nomad Web present unique challenges compared to the Notes client.
Join Christoph and Marc as they demonstrate how to simplify the troubleshooting process in HCL Nomad Web, ensuring a smoother and more efficient user experience.
In this webinar, we will explore effective strategies for diagnosing and resolving common problems in HCL Nomad Web, including
- Accessing the console
- Locating and interpreting log files
- Accessing the data folder within the browser’s cache (using OPFS)
- Understand the difference between single- and multi-user scenarios
- Utilizing Client Clocking
TrustArc Webinar: Consumer Expectations vs Corporate Realities on Data Broker... - TrustArc
Most consumers believe they’re making informed decisions about their personal data—adjusting privacy settings, blocking trackers, and opting out where they can. However, our new research reveals that while awareness is high, taking meaningful action is still lacking. On the corporate side, many organizations report strong policies for managing third-party data and consumer consent yet fall short when it comes to consistency, accountability and transparency.
This session will explore the research findings from TrustArc’s Privacy Pulse Survey, examining consumer attitudes toward personal data collection and practical suggestions for corporate practices around purchasing third-party data.
Attendees will learn:
- Consumer awareness around data brokers and what consumers are doing to limit data collection
- How businesses assess third-party vendors and their consent management operations
- Where business preparedness needs improvement
- What these trends mean for the future of privacy governance and public trust
This discussion is essential for privacy, risk, and compliance professionals who want to ground their strategies in current data and prepare for what’s next in the privacy landscape.
AI and Data Privacy in 2025: Global TrendsInData Labs
In this infographic, we explore how businesses can implement effective governance frameworks to address AI data privacy. Understanding it is crucial for developing effective strategies that ensure compliance, safeguard customer trust, and leverage AI responsibly. Equip yourself with insights that can drive informed decision-making and position your organization for success in the future of data privacy.
This infographic contains:
-AI and data privacy: Key findings
-Statistics on AI data privacy in the today’s world
-Tips on how to overcome data privacy challenges
-Benefits of AI data security investments.
Keep up-to-date on how AI is reshaping privacy standards and what this entails for both individuals and organizations.
AI and Data Privacy in 2025: Global TrendsInData Labs
Big data/Hadoop/HANA Basics
1. How Big Data Technologies Provide Solutions for Big Data Problems
John Choate – PMMS SIG Chair
David Burdett – Strategic Technology Advisor, SAP
Henrik Wagner – Global SAP Lead, Alliances, EMC Corp
2. The Challenge of Big Data
[Diagram: data at the center, surrounded by the people who need it – decision-makers, customers, LOB users, analysts, and IT developers]
3. The 5-Part Series
Webinar 1: Why Big Data matters, how it can fit into your Business and Technology Roadmap, and how it can enable your business!
Webinar 2: How Big Data technologies provide Solutions for Big Data problems
Webinar 3: Using Hadoop in an SAP Landscape with HANA
Webinar 4: Leveraging Hadoop with SAP HANA smart data access
Webinar 5: Using SAP Data Services with Hadoop and SAP HANA
Resources …
Webinar Registration
1. Go to www.saphana.com
2. Search “ASUG Big Data Webinar”
3. Registration links in blog …
Big Data, Hadoop and HANA – How they Integrate and How they Enable your Business!
Info on SAP and Big Data – go to www.sapbigdata.com
4. AREAS TO COVER
- Setting the Stage
- Market
- Technology
- Use Cases
- Summary
5. How did we get here?
- Facebook: 1 billion users; 600 million mobile users; more than 42 million pages and 9 million apps
- YouTube: 4 billion views per day
- Google+: 400 million registered users
- Skype: 250 million monthly connected users
- More people have mobile phones than have electricity or safe drinking water
[Timeline graphic: from the personal computer, client/server, and database era (circa 1980), through the B2B/B2C web and the rise of analytics and predictive analytics in the 1990s and 2000s, to today’s real-time, social, mobile Big Data world]
6. How big is Big Data?
Today we measure available data in zettabytes (1 zettabyte = 1 trillion gigabytes).
In 2011, the amount of data surpassed 1.8 zettabytes – the equivalent of eight 32GB iPads per person alive in the world.
90% of the world’s data today has been created in the last two years alone!
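As a quick sanity check on the slide’s figure, here is a back-of-envelope calculation. It assumes a 2011 world population of roughly 7 billion, which the slide does not state:

```python
# Back-of-envelope check: does 1.8 zettabytes really equal about
# eight 32GB iPads per person? (Population figure is an assumption.)
ZETTABYTE = 10**21          # bytes
IPAD_32GB = 32 * 10**9      # bytes
POPULATION = 7 * 10**9      # assumed 2011 world population

total_bytes = 1.8 * ZETTABYTE
ipads_per_person = total_bytes / IPAD_32GB / POPULATION
print(round(ipads_per_person))  # → 8
```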
7. Big Data Simplified
Definition
“Big data is high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making.” – Gartner
Three Key Parts
Part One: The 3 V’s – Volume, Velocity, Variety
Part Two: Cost-effective, innovative forms of information processing
Part Three: Enhanced insight for “real time” decision making
8. The 7 Key Drivers Behind the Big Data Movement*
* http://hortonworks.com/blog/7-key-drivers-for-the-big-data-market/
Business
- Opportunity to enable innovative new business models
- Potential for new insights that drive competitive advantage
Technical
- Data collected and stored continues to grow exponentially
- Data is increasingly everywhere and in many formats
- Traditional solutions are failing under new requirements
Financial
- Cost of data systems, as a percentage of IT spend, continues to grow
- Cost advantages of commodity hardware and open source software
9. Today’s Key Challenges in Big Data
Data Analytics
1. Data Capture & Retention – What data should be kept and why
2. Behavioral Analytics – Understanding and leveraging customer behavior
3. Predictive Analytics – Using new data types (sentiment, clickstream, video, image and text) to predict future events
Information Strategy
1. Which investments will deliver the most business value and ROI?
2. Governance – New expectations for data quality and management
3. Talent – How will you assemble the right teams and align skills?
Enterprise Information Management (EIM)
1. User expectations – Making “Big Data” accessible for the end user in “real time”
2. Costs – How to provide access to big data in a rapid and cost-effective way to support better decision-making?
3. Tools – Have you identified the processes, tools and technologies you need to support big data in your enterprise?
11. The RAPIDLY GROWING Market
“By 2015, 4.4 million IT jobs globally will be created to support big data, generating 1.9 million IT jobs in the United States.”
– Peter Sondergaard, Senior Vice President at Gartner and global head of Research
http://www.gartner.com/newsroom/id/2207915
“The global big data market is estimated to be $14.87 billion in 2013 and expected to grow to $46.34 billion … an estimated Compound Annual Growth Rate (CAGR) of 25.52% from 2013 to 2018.”
http://www.marketsandmarkets.com/PressReleases/big-data.asp
“IDC expects the Big Data technology and services market to grow at a 31.7% compound annual growth rate through 2016.”
http://www.idc.com/getdoc.jsp?containerId=238746
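The quoted CAGR follows directly from the two market-size figures; a quick sketch of the arithmetic:

```python
# Derive the compound annual growth rate from the quoted market sizes:
# $14.87B in 2013 growing to $46.34B in 2018 (5 years).
start, end, years = 14.87, 46.34, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.2%}")  # close to the quoted 25.52%
```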
12. Products and Services under the Umbrella of Big Data
- Hadoop software and related hardware
- NoSQL database software and related hardware
- Next-generation data warehouse/analytic database software and related hardware
- Non-Hadoop Big Data platforms, software, and related hardware
- In-memory databases – both DRAM and flash – as applied to Big Data workloads
- Data integration and data quality platforms and tools as applied to Big Data deployments
- Advanced analytics and data science platforms and tools
- Application development platforms and tools as applied to Big Data use cases
- Business intelligence and data visualization platforms and tools as applied to Big Data use cases
- Analytic and transactional applications as applied to Big Data use cases
- Big Data support, training, and professional services
13. WHO IS SPENDING $$$ ON BIG DATA?
COMPANIES
- Median spend = $10M
- 25% spend less than $2.5M
- 15% spend greater than $100M
- 7% spend greater than $500M
INDUSTRIES
- Most: Banking, High Tech, Telecommunications, Travel
- Least: Energy/Resources, Life Sciences, Retail
Source: 2012 Tata Consulting Services (TCS) Global Study
14. How the market is growing
Fastest growing area is Applications (49% CAGR), 2012–17
Wikibon: http://wikibon.org/wiki/v/Big_Data_Vendor_Revenue_and_Market_Forecast_2012-2017
Wikibon: http://wikibon.org/vault/Special:FilePath/2012BigDataSegmentGrowth20112017.png
15. Big Data Vendor Revenue
Big Data vendors are a mix of established players and pure-plays.
Source data: http://wikibon.org/wiki/v/Big_Data_Vendor_Revenue_and_Market_Forecast_2012-2017
16. 10 Big Data Trends Changing the Face of Business
1. Machine Data and the Internet of Things Takes Center Stage
2. Compound Applications That Combine Data Sets to Create Value
3. Explosion of Innovation Built on Open Source Big Data Tools
4. Companies Taking a Proactive Approach to Identifying Where Big Data Can Have an Impact
5. There Are More Actual Production Big Data Projects
6. Large Companies Are Increasingly Turning to Big Data
7. Most Companies Spend Very Little, A Few Spend A Lot
8. Investments Are Geared Toward Generating and Maintaining Revenue
9. The Greatest ROI of Big Data Is Coming from the Logistics and Finance Functions
10. The Biggest Challenges Are as Much Cultural as Technological
18. Aspect of Time Value of Data
“Hot” data may be better suited for in-memory HANA residency. This data is largely derived from structured SAP sources.
“Warm” and “cold” data may be better suited for Hadoop residency. This data is largely unstructured in nature and may present very large data sets (multi-PB).
Business value reflected by use cases may consist of queries and data structures enabled in three different ways:
- Enabled by SAP HANA
- Enabled by Hadoop
- Enabled by HANA and Hadoop simultaneously
Source: EMC Corporation
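A minimal sketch of the hot/warm/cold routing idea described on this slide. The age thresholds and store labels are illustrative assumptions, not SAP guidance:

```python
# Illustrative routing of records by age ("data temperature") to an
# in-memory store (HANA) or to Hadoop. Thresholds are invented for
# the sketch; real placement depends on access patterns and cost.
from datetime import date, timedelta

def target_store(record_date: date, today: date) -> str:
    age = today - record_date
    if age <= timedelta(days=90):        # "hot": recent, queried often
        return "HANA (in-memory)"
    elif age <= timedelta(days=730):     # "warm": queried occasionally
        return "Hadoop (warm)"
    return "Hadoop (cold archive)"       # "cold": rarely touched, multi-PB

print(target_store(date(2013, 9, 1), today=date(2013, 10, 1)))  # → HANA (in-memory)
```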
20. Big Data High-Level Software Architecture
- Big Data Storage holds the data in memory or on SSD/HDD.
- Big Data Database Software manages data in the Big Data Storage; includes SQL and NoSQL DBMSs.
- Processing Engines are software that can process / manipulate data in the Big Data Storage.
- Analytic Software analyzes data using the Processing Engines or Big Data Database Software.
- Big Data Applications provide solutions for specific business problems.
- Development Software is used to build Big Data Applications.
- Visualization Software presents results to end users from Analytic Software or Big Data Applications.
- Data Capture Software on-boards and manages data from multiple Data Sources.
- Management Software handles operation of the Big Data implementation / solution.
[Diagram: these layers stacked from Data Sources and Data Capture Software at the bottom, through Big Data Storage (in-memory, SSD, HDD) and Database Software, up to Applications and Visualization]
21. Big Data Software – Other Solutions
[Diagram: the same architecture layers populated with open-source stacks – Hadoop HDFS as Big Data Storage with Hive/HBase as Database Software and Mahout/Giraph etc. as Processing Engines, alongside Cassandra and MongoDB, each providing its own storage, database, and management layers]
Big Data Software solutions only handle part of the problem.
22. Big Data Software Architecture and HANA
[Diagram: SAP products mapped onto the architecture – SAP HANA / Sybase IQ and Hadoop HDFS as Big Data Storage and Database Software, the “R” engine and text analytics as Processing Engines, SAP Data Services for Data Capture, HANA Studio for Development, SAP BI tools for Analytics, SAP Lumira for Visualization, and SAP Landscape Management for Management]
ACQUIRE – Meet the Expanding Data Demand
- Acquire and store large volumes of data from a variety of data sources.
- Flexible data management capabilities delivered via the SAP HANA platform.
- Best option based on business requirements for accessibility, complexity of analytics, processing speed, and storage costs.
ACCELERATE – “Real Time” Visibility
- Increase business speed with cost-performance data processing options:
- In-memory processing with SAP HANA to massively parallel processing with the SAP Sybase IQ database.
- Distributed processing of large data sets with Hadoop.
ANALYZE – Analytics!
- Analyze and visualize Big Data using tools that best serve your business needs.
- Reduce delays associated with complex analysis of large data sets using in-memory analytics.
- Uncover new opportunities and expose hidden risks using algorithms, R integration, and predictive analysis.
- Enable business users to access and visualize insight using charts, graphs, maps, and more.
- Uncover hidden value from unstructured data with text analytics.
See: http://www.sapbigdata.com/platform/
24. Looking for Big Data Potential in your Company
ACQUIRE – Meet the Expanding Data Demand
1. Acquire and store large volumes of data from a variety of data sources.
2. Flexible data management capabilities delivered via the SAP HANA platform.
3. Best option based on business requirements for accessibility, complexity of analytics, processing speed, and storage costs.
ACCELERATE – “Real Time” Visibility
1. Increase business speed with cost-performance data processing options.
2. In-memory processing with SAP HANA to massively parallel processing with the SAP Sybase IQ database.
3. Distributed processing of large data sets with Hadoop.
ANALYZE – Analytics!
1. Analyze and visualize Big Data using tools that best serve your business needs.
2. Reduce delays associated with complex analysis of large data sets using in-memory analytics.
3. Uncover new opportunities and expose hidden risks using algorithms, R integration, and predictive analysis.
4. Enable business users to access and visualize insight using charts, graphs, maps, and more.
5. Uncover hidden value from unstructured data with text analytics.
25. OVERCOMING OBJECTIONS – USE CASES
1. Big Data projects are too expensive.
2. Big Data is technology in search of a business problem to solve!
3. Big Data is an IT project; we don’t need to involve the business.
4. Big Data is just the new buzzword, just like Cloud! Soon another trend and new buzzword will come along.
5. We don’t have the skills to use Big Data solutions.
26. Big Data and Competitive Advantage
Utilize your data to gain a competitive advantage!
[Chart: competitiveness of “fact-finders” vs. “fumblers” – leaders cluster among the fact-finders, laggards among the fumblers; n=1,002]
Leading businesses can outpace the competition because they can:
- Base decisions on the latest, granular, multi-structured data
- Make decisions on analytics rather than intuition
- Frequently reassess forecasts and plans
- Utilize analytics to support a spectrum of strategic, operational and tactical decision making
- Rapidly evaluate alternative scenarios
Source: IDC’s SAP HANA Market Assessment, August 2011
27. Soliciting Allies
[Chart: functional areas ranked by where Big Data drives REVENUE (Sales, Marketing, Customer Service, R&D/NPI, IT, Finance) versus where it delivers ROI greater than 25% (Finance, Logistics, Marketing, Sales, HR)]
Source: 2012 Tata Consulting Services (TCS) Global Study
28. T-Mobile USA, Inc.
Telecom – Optimize Marketing Campaign Effectiveness
Product: Agile Datamart
Highlights: 56x faster analysis; a report over 5 billion+ records for 33M customers executed in 9 seconds
Business Challenges
- Proliferation of offers/micro-offers increasingly strategic in a highly competitive market
- Marketing Operations needs to collect, analyze and report on results of campaigns/offers very quickly and with great flexibility
- Current and future campaigns have to be fine-tuned to improve customer adoption and profitability
Technical Challenges
- Data for 33M customers required a lot of time to be explored and analyzed in detail with previous technology
Benefits
- Dynamic read-outs on the upsell/cross-sell performance of stores and call centers
- Easy, fast access to the performance of all campaigns (e.g. by geo, by store, etc.)
- Quicker forecast of the financial impact of marketing campaigns
“Based on the rapid analytics that we’re performing on SAP HANA, we are now able to quickly fine-tune our current and future campaigns to improve the customer adoption rate, reduce churn and increase profit.”
– Alison Bessho, Director, Enterprise Systems Business Solutions, T-Mobile USA
29. University of Kentucky
Higher Education – Student Retention
Highlights: $1.1M increase in revenue with a 1% increase in retention rate; 420x improvement in reporting speed (2–3 seconds versus 15–20 minutes on the competing Oracle DW); 15x improvement in query load time
Business Challenges
- Enable the University to increase student retention and thus increase the Graduation Rate from 60% to 70% over a 10-year period
- Huge costs and long turnaround times for student classification to improve student satisfaction and the retention rate
Technical Challenges
- Lack of speed, accuracy and visibility into data analysis
- Handling Big Data efficiently: the SAP ECC V6 production system is 1.5 TB, and SAP BW V7 and the Oracle Data Warehouse combined are 4 TB
Benefits
- Increased student retention rate; fast collection of new information related to student interactions and various student behaviors
- Reduced IT infrastructure costs and increased IT FTE productivity
- Allowed the University to retire several systems including Informatica, BI Web Focus (IBI), and Oracle (DB)
“SAP HANA offers an effective real-time data-driven system which is essential to giving immediate performance feedback and increasing the retention rate of students, increasing millions in revenue for the University every year.”
– Vince Kellen, CIO, University of Kentucky
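The 420x figure is consistent with the quoted timings, taking the midpoint of each range:

```python
# Check the "420x" reporting-speed claim: 15-20 minutes on the old
# Oracle DW versus 2-3 seconds on HANA, using range midpoints.
old_seconds = (15 + 20) / 2 * 60   # 17.5 min = 1050 s
new_seconds = (2 + 3) / 2          # 2.5 s
print(old_seconds / new_seconds)   # → 420.0
```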
30. Hardware Preventative Maintenance
Business Challenges
- A computer server manufacturer wants to implement effective preventative maintenance by identifying problems as they arise, then taking prompt action to prevent the problem occurring at other customer sites
Technical Challenges
- Identifying problems by analyzing text data from call centers and customer questionnaires together with server logs generated by their hardware
- Combining results with CRM, sales and manufacturing data to predict which servers are likely to have problems in the future
Solution
- Use SAP Data Services to analyze call center data and questionnaires stored in Hadoop and identify potential problems
- Use HANA to merge results from Hadoop with server logs to identify indicators in those logs of potential problems
- Combine with CRM, bill of material and production/manufacturing data to identify cases where preventative maintenance would help
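The logic behind this use case can be sketched in miniature. The field names, keywords, and threshold below are invented for illustration; the actual solution uses SAP Data Services text analysis over Hadoop combined with HANA:

```python
# Toy version of the use case's scoring logic: flag a server whose
# call-center notes mention a failure symptom AND whose logs show
# repeated errors. Keywords and the threshold are illustrative only.
def at_risk(call_notes: list[str], log_error_count: int,
            keywords=("overheat", "fan failure", "disk error")) -> bool:
    symptom = any(k in note.lower() for note in call_notes for k in keywords)
    return symptom and log_error_count >= 3

print(at_risk(["Customer reports fan failure noise"], log_error_count=5))  # → True
```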
31. Data Warehouse Migration
Business Challenges
- A high-tech company with a major web presence uses non-SAP software for its data warehouse to analyze the activity on their web site properties and combine it with data in SAP Business Suite
- They want to both reduce the cost and improve the responsiveness of their data warehouse solutions by moving to a combination of SAP HANA and Hadoop
Technical Challenges
- How to complete the migration without disrupting existing reporting processes
Solution – a four-step process
Step 1. Replicate Data in Hadoop. SAP Data Services is used to replicate in Hadoop all data from web logs and SAP Business Suite being captured by the current Data Warehouse.
Step 2. Aggregate Data in Hadoop. The aggregation process in the existing Data Warehouse is re-implemented in Hadoop and the aggregate results fed back to the existing Data Warehouse, significantly reducing its workload.
Step 3. Copy the Aggregate Data to HANA. The aggregate data created by Hadoop is also copied to HANA together with historical aggregate data already in the existing Data Warehouse. The result is that eventually HANA has a complete copy of the data in the existing Data Warehouse.
Step 4. Replace Reporting with SAP HANA. New reports are developed in HANA to replace reports in the original Data Warehouse. Once complete, the original Data Warehouse will be decommissioned.
The end result is a faster, more responsive and lower-cost Data Warehouse built on HANA and Hadoop.
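Step 2 above, in miniature: a toy aggregation over raw records standing in for the job that would be re-implemented in Hadoop (the field names are illustrative, not from the actual system):

```python
# Toy illustration of Step 2: the aggregation the Data Warehouse used
# to perform is re-run over raw web-log records (plain dicts here,
# standing in for data replicated into Hadoop).
from collections import Counter

weblog = [
    {"page": "/home", "bytes": 1200},
    {"page": "/home", "bytes": 800},
    {"page": "/buy",  "bytes": 500},
]

hits_per_page = Counter(r["page"] for r in weblog)
print(hits_per_page["/home"])  # → 2
```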
33. SUMMARY
1. The Big Data market is not going away!
2. There are 3 distinct components of the Big Data market.
3. It’s not a new trend, but a way for technology to enable your business.
4. Case studies help you visualize your own company’s Big Data opportunities – benchmark and assess!
5. Don’t go the journey alone – there are many resources available to make your journey successful!
35. The 5-Part Series
Webinar 1: Why Big Data matters, how it can fit into your Business and Technology Roadmap, and how it can enable your business!
Webinar 2: How Big Data technologies provide Solutions for Big Data problems
Webinar 3: Using Hadoop in an SAP Landscape with HANA
Webinar 4: Leveraging Hadoop with SAP HANA smart data access
Webinar 5: Using SAP Data Services with Hadoop and SAP HANA
Resources …
Webinar Registration
1. Go to www.saphana.com
2. Search “ASUG Big Data Webinar”
3. Registration links in blog …
Big Data, Hadoop and HANA – How they Integrate and How they Enable your Business!
Info on SAP and Big Data – go to www.sapbigdata.com
36. THANK YOU FOR PARTICIPATING.
SESSION CODE:
Learn more year-round at www.asug.com