Introduces Microsoft’s data platform for on-premises and cloud environments, the challenges businesses face with data and its many sources, the evolution of database systems in the modern world, and how businesses’ needs are shifting as the industry landscape changes.
Dives into the opportunities available to businesses and industry verticals: those already identified and those not yet explored.
Explains Microsoft’s cloud vision and what the Azure platform offers, as Infrastructure as a Service or Platform as a Service, for building your own offerings.
Introduces and demos real-world scenarios and case studies where businesses have used the cloud and Azure to create new and innovative solutions that unlock this potential.
Big Data in the Cloud with Azure Marketplace Images - Mark Kromer
The document discusses strategies for modern data warehousing and analytics on Azure including using Hadoop for ETL/ELT, integrating streaming data engines, and using lambda and hybrid architectures. It also describes using data lakes on Azure to collect and analyze large amounts of data from various sources. Additionally, it covers performing real-time stream analytics, machine learning, and statistical analysis on the data and discusses how Azure provides scalability, speed of deployment, and support for polyglot environments that incorporate many data processing and storage options.
This is a run-through at a 200 level of the Microsoft Azure Big Data Analytics for the Cloud data platform based on the Cortana Intelligence Suite offerings.
This document discusses big data and analytics solutions from Microsoft. It introduces Azure Data Lake Store as a hyper-scale repository for big data analytics workloads that allows storing any data in its native format. It also describes Azure Data Lake Analytics as a service for big data analytics that offers distributed, parallel processing with U-SQL and integration with Visual Studio. The document provides examples of using Azure Data Lake Analytics to extract, transform, and analyze big data from various sources like call log files and customer tables.
The Hive Think Tank - The Microsoft Big Data Stack by Raghu Ramakrishnan, CTO... - The Hive
Until recently, data was gathered for well-defined objectives such as auditing, forensics, reporting and line-of-business operations; now, exploratory and predictive analysis is becoming ubiquitous, and the default increasingly is to capture and store any and all data, in anticipation of potential future strategic value. These differences in data heterogeneity, scale and usage are leading to a new generation of data management and analytic systems, where the emphasis is on supporting a wide range of very large datasets that are stored uniformly and analyzed seamlessly using whatever techniques are most appropriate, including traditional tools like SQL and BI and newer tools, e.g., for machine learning and stream analytics. These new systems are necessarily based on scale-out architectures for both storage and computation.
Hadoop has become a key building block in the new generation of scale-out systems. On the storage side, HDFS has provided a cost-effective and scalable substrate for storing large heterogeneous datasets. However, as key customer and systems touch points are instrumented to log data, and Internet of Things applications become common, data in the enterprise is growing at a staggering pace, and the need to leverage different storage tiers (ranging from tape to main memory) is posing new challenges, leading to caching technologies, such as Spark. On the analytics side, the emergence of resource managers such as YARN has opened the door for analytics tools to bypass the Map-Reduce layer and directly exploit shared system resources while computing close to data copies. This trend is especially significant for iterative computations such as graph analytics and machine learning, for which Map-Reduce is widely recognized to be a poor fit.
While Hadoop is widely recognized and used externally, Microsoft has long been at the forefront of Big Data analytics, with Cosmos and Scope supporting all internal customers. These internal services are a key part of our strategy going forward, and are enabling new state of the art external-facing services such as Azure Data Lake and more. I will examine these trends, and ground the talk by discussing the Microsoft Big Data stack.
The document discusses different types of big data including unstructured, semi-structured, and structured data. It provides examples of each type such as audio, video, and images for unstructured data. JSON, XML, and sensor data are given as examples for semi-structured data. The document also discusses the challenges of processing big data due to its variety, velocity, and volume.
Big Data Analytics in the Cloud with Microsoft Azure - Mark Kromer
This session discussed Big Data Analytics in the Cloud using Microsoft Azure services. Key points included:
1) Azure provides tools for collecting, processing, analyzing and visualizing big data including Azure Data Lake, HDInsight, Data Factory, Machine Learning, and Power BI. These services can be used to build solutions for common big data use cases and architectures.
2) U-SQL is a language for preparing, transforming and analyzing data that allows users to focus on the what rather than the how of problems. It uses SQL and C# and can operate on structured and unstructured data.
3) Visual Studio provides an integrated environment for authoring, debugging, and monitoring U-SQL scripts and jobs.
Building Modern Data Platform with Microsoft Azure - Dmitry Anoshin
This document provides an overview of building a modern cloud analytics solution using Microsoft Azure. It discusses the role of analytics, a history of cloud computing, and a data warehouse modernization project. Key challenges covered include lack of notifications, logging, self-service BI, and integrating streaming data. The document proposes solutions to these challenges using Azure services like Data Factory, Kafka, Databricks, and SQL Data Warehouse. It also discusses alternative implementations using tools like Matillion ETL and Snowflake.
Verizon Centralizes Data into a Data Lake in Real Time for Analytics - DataWorks Summit
Verizon – Global Technology Services (GTS) was challenged by a multi-tier, labor-intensive process when trying to migrate data from disparate sources into a data lake to create financial reports and business insights. Join this session to learn more about how Verizon:
• Easily accessed data from multiple sources including SAP data
• Ingested data into major targets including Hadoop
• Achieved real-time insights from data leveraging change data capture (CDC) technology
• Reduced costs and labor
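The change data capture (CDC) idea mentioned above can be sketched in miniature: rather than re-copying whole tables, only the rows that changed between two states are emitted as events. The snapshot diff below is a hypothetical simplification for illustration only; production CDC tools, such as those used in the Verizon case, typically read the database transaction log instead of diffing snapshots:

```python
# Toy snapshot-diff CDC: compare two table snapshots (dicts keyed
# by row id) and emit (operation, key, row) change events.
def capture_changes(before, after):
    """Return the change events that transform `before` into `after`."""
    events = []
    for key, row in after.items():
        if key not in before:
            events.append(("insert", key, row))
        elif before[key] != row:
            events.append(("update", key, row))
    for key in before:
        if key not in after:
            events.append(("delete", key, None))
    return events

snapshot_t0 = {1: {"amount": 100}, 2: {"amount": 250}}
snapshot_t1 = {1: {"amount": 120}, 3: {"amount": 75}}

for op, key, row in capture_changes(snapshot_t0, snapshot_t1):
    print(op, key, row)
```

Only the three change events are produced (an update, an insert, and a delete), which is what makes CDC far cheaper than reloading the full table into the data lake.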
In this slide deck, Infochimps Director of Product Tim Gasper discusses how Infochimps tackles business problems for customers by deploying a comprehensive Big Data infrastructure in days, sometimes in just hours. Tim explains how Infochimps is now taking the same aggressive approach to deliver faster time to value by helping customers develop analytic applications with remarkable speed.
Entity Resolution Service - Bringing Petabytes of Data Online for Instant Access - DataWorks Summit
2.5B+ IDs, 2 ms latency, 15K+ TPS, and petabytes of data. These numbers outline the challenges behind eBay’s Entity Resolution Service (ERS). ERS provides a temporal map between any ID and any other ID. The ERS technology stack uses Hadoop as the batch layer, Couchbase as the cache layer, Spring Batch to load data into Couchbase, and a REST API at the service layer. In our presentation we will take you through the journey from concept to production release. It’s a great story and we would like to share it with you!
Everyone is awash in the new buzzword, Big Data, and it seems as if you can’t escape it wherever you go. But there are real companies with real use cases creating real value for their businesses by using big data. This talk will discuss some of the more compelling current or recent projects, their architecture & systems used, and successful outcomes.
Cortana Analytics Workshop: Operationalizing Your End-to-End Analytics Solution - MSAdvAnalytics
Wee Hyong Tok. With Azure Data Factory (ADF), existing data movement and analytics processing services can be composed into data pipelines that are highly available and managed in the cloud. In this demo-driven session, you learn by example how to build, operationalize, and manage scalable analytics pipelines. Go to https://channel9.msdn.com/ to find the recording of this session.
Microsoft and Hortonworks Deliver the Modern Data Architecture for Big Data - Hortonworks
Joint webinar with Microsoft and Hortonworks on the power of combining the Hortonworks Data Platform with Microsoft’s ubiquitous Windows, Office, SQL Server, Parallel Data Warehouse, and Azure platform to build the Modern Data Architecture for Big Data.
This document discusses big data concepts like volume, velocity, and variety of data. It introduces NoSQL databases as an alternative to relational databases for big data that does not require data cleansing or schema definition. Hadoop is presented as a framework for distributed storage and processing of large datasets across clusters of commodity hardware. Key Hadoop components like HDFS, MapReduce, Hive, Pig and YARN are described at a high level. The document also discusses using Azure services like Azure Storage, HDInsight and Stream Analytics with Hadoop.
Big data is driving transformative changes in traditional data warehousing. Traditional ETL processes and highly structured data schemas are being replaced with schema flexibility to handle all types of data from diverse sources. This allows for real-time experimentation and analysis beyond just operational reporting. Microsoft is applying lessons from its own big data journey to help customers by providing a comprehensive set of Apache big data tools in Azure along with intelligence and analytics services to gain insights from diverse data sources.
This document summarizes the history and evolution of data warehousing and analytics architectures. It discusses how data warehouses emerged in the 1970s and were further developed in the late 1980s and 1990s. It then covers how big data and Hadoop have changed architectures, providing more scalability and lower costs. Finally, it outlines components of modern analytics architectures, including Hadoop, data warehouses, analytics engines, and visualization tools that integrate these technologies.
This document discusses using Azure HDInsight for big data applications. It provides an overview of HDInsight and describes how it can be used for various big data scenarios like modern data warehousing, advanced analytics, and IoT. It also discusses the architecture and components of HDInsight, how to create and manage HDInsight clusters, and how HDInsight integrates with other Azure services for big data and analytics workloads.
Best Practices: Hadoop migration to Azure HDInsight - Revin Chalil
This document provides guidance on migrating Hadoop workloads from on-premises environments to Azure HDInsight. It discusses best practices such as choosing the appropriate HDInsight cluster type based on workload, selecting virtual machine sizes and storage locations, configuring security and networking, using metastores for metadata migration, moving data over, and remediating applications. The document also provides recommendations on optimization techniques after migration such as using Spark jobs instead of MapReduce and Apache Ambari for cluster management.
Cortana Analytics Suite is a fully managed big data and advanced analytics suite that transforms your data into intelligent action. It is comprised of data storage, information management, machine learning, and business intelligence software in a single convenient monthly subscription. This presentation will cover all the products involved, how they work together, and use cases.
Embarking on building a modern data warehouse in the cloud can be an overwhelming experience due to the sheer number of products that can be used, especially when the use cases for many products overlap others. In this talk I will cover the use cases of many of the Microsoft products that you can use when building a modern data warehouse, broken down into four areas: ingest, store, prep, and model & serve. It’s a complicated story that I will try to simplify, giving blunt opinions of when to use what products and the pros/cons of each.
Dr. Christian Kurze from Denodo, "Data Virtualization: Fulfilling the Promise..." - Dataconomy Media
This document discusses data virtualization and how it can help organizations leverage data lakes to access all their data from disparate sources through a single interface. It addresses how data virtualization can help avoid data swamps, prevent physical data lakes from becoming silos, and support use cases like IoT, operational data stores, and offloading. The document outlines the benefits of a logical data lake created through data virtualization and provides examples of common use cases.
Pentaho Big Data Analytics with Vertica and Hadoop - Mark Kromer
Overview of the Pentaho Big Data Analytics Suite from the Pentaho + Vertica presentation at Big Data Techcon 2014 in Boston for the session called "The Ultimate Selfie | Picture Yourself with the Fastest Analytics on Hadoop with HP Vertica and Pentaho"
Data Mesh in Practice: How Europe’s Leading Online Platform for Fashion Goes ... - Databricks
Zalando transitioned from a centralized data platform to a data mesh architecture. This decentralized their data infrastructure by having individual domains own datasets and pipelines rather than a central team. It provided self-service data infrastructure tools and governance to enable domains to operate independently while maintaining global interoperability. This improved data quality by making domains responsible for their data and empowering them through the data mesh approach.
Modernizing to a Cloud Data Architecture - Databricks
Organizations with on-premises Hadoop infrastructure are bogged down by system complexity, unscalable infrastructure, and the increasing burden on DevOps to manage legacy architectures. Costs and resource utilization continue to go up while innovation has flatlined. In this session, you will learn why, now more than ever, enterprises are looking for cloud alternatives to Hadoop and are migrating off of the architecture in large numbers. You will also learn how the benefits of elastic compute models helped one customer scale their analytics and AI workloads, along with best practices from their successful migration of data and workloads to the cloud.
MapR on Azure: Getting Value from Big Data in the Cloud - MapR Technologies
Public cloud adoption is exploding and big data technologies are rapidly becoming an important driver of this growth. According to Wikibon, big data public cloud revenue will grow from 4.4% in 2016 to 24% of all big data spend by 2026. Digital transformation initiatives are now a priority for most organizations, with data and advanced analytics at the heart of enabling this change. This is key to driving competitive advantage in every industry.
There is nothing better than a real-world customer use case to help you understand how to get value from big data in the cloud and apply the learnings to your business. Join Microsoft, MapR, and Sullexis on November 10th to:
• Hear from Sullexis on the business use case and technical implementation details of one of their oil & gas customers
• Understand the integration points of the MapR Platform with other Azure services and why they matter
• Know how to deploy the MapR Platform on the Azure cloud and get started easily
You will also get to hear about customer use cases of the MapR Converged Data Platform on Azure in other verticals such as real estate and retail.
Speakers
Rafael Godinho
Technical Evangelist
Microsoft Azure
Tim Morgan
Managing Director
Sullexis
Introducing Big Data and Microsoft Azure - Khalid Salama
The purpose of these slides is to give a high-level overview of Big Data concepts and techniques, as well as related tools and technologies, focusing on Microsoft Azure. They start by defining what Big Data is and why Big Data platforms are needed. Fundamental components of a Big Data platform are discussed, followed by some theory about distributed processing and the CAP theorem, and their relevance to how Big Data solutions compare to traditional RDBMSs. Use cases of how Big Data fits into enterprise data platforms are shown. The Hadoop ecosystem is briefly reviewed before Big Data on Microsoft Azure is discussed, followed by some directions on how to get started with Big Data.
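The distributed processing model at the heart of the Hadoop ecosystem is easiest to see in the canonical MapReduce word count. The sketch below runs the three phases in a single process purely for illustration; Hadoop distributes the same map, shuffle, and reduce steps across a cluster:

```python
# Toy MapReduce word count. Map emits (word, 1) pairs, shuffle
# groups pairs by key, reduce sums the counts per word.
from collections import defaultdict

def map_phase(line):
    """Emit a (word, 1) pair for every word in the input line."""
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    """Group all values by key, as the framework does between map and reduce."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Sum the grouped counts for each word."""
    return {key: sum(values) for key, values in grouped.items()}

lines = ["big data on azure", "big data at scale"]
pairs = [pair for line in lines for pair in map_phase(line)]
print(reduce_phase(shuffle(pairs)))  # {'big': 2, 'data': 2, ...}
```

Because each phase only sees independent keys or lines, every step can be parallelized across machines, which is exactly the property that makes the model scale on commodity hardware.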
The document outlines a reference architecture for using big data and analytics to address challenges in areas like fraud detection, risk reduction, compliance, and customer churn prevention for financial institutions. It describes components like streaming data ingestion, storage, processing, analytics and machine learning, and presentation. Specific applications discussed include money laundering prevention, using techniques like decision trees, cluster analysis, and pattern detection on data from multiple sources stored in Azure data services.
Big Data in the Cloud - Montreal April 2015 - Cindy Gross
slides:
Basic Big Data and Hadoop terminology
What projects fit well with Hadoop
Why Hadoop in the cloud is so powerful
Sample end-to-end architecture
See: Data, Hadoop, Hive, Analytics, BI
Do: Data, Hadoop, Hive, Analytics, BI
How this tech solves your business problems
Visualising the tabular model for power view upload - Jen Stirrup
This document discusses visualizing tabular models using Power View. It begins by explaining what a tabular model is, as part of Microsoft's business intelligence semantic model (BISM) vision. It is based on the relational data model for familiarity and ease of use. The document then explains what Power View is - an interactive data exploration and visual presentation experience. It concludes by discussing how the tabular model and Power View can work together for data analysis and visualization.
Belgian Windows Server 2012 Launch windows azure insights for the enterprise ... - Mike Martin
The accompanying video can be found here: http://technet.microsoft.com/en-us/video/windows-azure-insights-for-the-enterprise-it-pro
The document introduces Azure Functions as a serverless compute option on the Azure platform. It provides an overview of Azure's compute services spectrum, positioning Azure Functions as a highly agile and scalable option with less complexity compared to other services like virtual machines, cloud services, and service fabric. The document also includes information about event sponsor BlueMetal, an interactive design and technology architecture firm, and contact details for following up.
The document discusses real-time fraud detection patterns and architectures. It provides an overview of key technologies like Kafka, Flume, and Spark Streaming used for real-time event processing. It then describes a high-level architecture involving ingesting events through Flume and Kafka into Spark Streaming for real-time processing, with results stored in HBase, HDFS, and Solr. The document also covers partitioning strategies, micro-batching, complex topologies, and ingestion of real-time and batch data.
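The micro-batching mentioned above is the core idea behind Spark Streaming: incoming timestamped events are bucketed into fixed-width intervals and each bucket is processed as one small batch. This toy sketch ignores arrival order, back-pressure, and fault tolerance, which real streaming engines must handle:

```python
# Toy micro-batching: bucket (timestamp, payload) events into
# fixed-width batches keyed by the batch start time.
from collections import defaultdict

def micro_batches(events, interval):
    """Group events into batches of `interval` seconds."""
    batches = defaultdict(list)
    for ts, payload in events:
        batch_start = ts // interval * interval  # floor to the interval boundary
        batches[batch_start].append(payload)
    return dict(sorted(batches.items()))

events = [(0.5, "a"), (1.2, "b"), (2.7, "c"), (3.1, "d")]
print(micro_batches(events, 2))  # two batches, starting at t=0 and t=2
```

Each resulting batch can then be handed to an ordinary batch job, which is why the model reuses so much existing batch infrastructure while still delivering near-real-time results.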
Application Insights is a Microsoft Azure service that helps application developers understand whether their applications are available, performing well, and successful. It provides a 360-degree view for developers to be alerted to problems quickly and to learn what customers are doing in order to prioritize work items. The document discusses using Application Insights for Java applications, with the service currently in preview and working towards general availability. It seeks external help with Application Insights extensions and support for technologies like Java, PHP, Node.js, Ruby, and Python.
Azure API App metrics with Application Insights (Nicolas Takashi)
This document discusses Azure Application Insights, which provides metrics and telemetry for applications. It introduces Application Insights, addresses concerns about overhead, lists the types of data that can be collected, and promises a demonstration.
This document discusses Azure IOT Hub and Azure Stream Analytics. Azure IOT Hub is a fully managed service that enables reliable and secure bidirectional communication between IoT devices and backend solutions. It provides device-to-cloud and cloud-to-device messaging at scale with security credentials and access control. Azure Stream Analytics is a low-cost event processing engine that helps uncover real-time insights from streaming data sources. It allows developers to use SQL-like queries to develop solutions faster and elastically scales in the cloud. The document outlines how these services can be used to build IoT solutions that process and analyze real-time device data.
Big data streaming with Apache Spark on Azure (Willem Meints)
A talk for the Breda Dev meetup in which I showed what challenges microservices architectures bring for data analysis and how you can tackle these challenges with Apache Spark on Azure.
SQLSaturday #230 - Introduction to Microsoft Big Data (Part 1) (Sascha Dittmann)
In this session we use a practical scenario to show how concrete tasks can be solved with HDInsight in practice:
- Fundamentals of HDInsight for Windows Server and Windows Azure
- Working with Windows Azure HDInsight
- Implementing MapReduce jobs with JavaScript and .NET code
This document provides an overview of big data and how Azure HDInsight can be used to work with big data. It discusses the evolution of data from gigabytes to exabytes and the big data utility gap where most data is stored but not analyzed. It then discusses how to store everything, analyze anything, and build the right thing using big data. Examples are provided of companies generating large amounts of data. An overview of the Hadoop ecosystem is given along with examples of using Hive and Pig on HDInsight to query and analyze large datasets. A case study of Klout is also summarized.
Azure Stream Analytics: Analyse Data in Motion (Ruhani Arora)
The document discusses evolving approaches to data warehousing and analytics using Azure Data Factory and Azure Stream Analytics. It provides an example scenario of analyzing game usage logs to create a customer profiling view. Azure Data Factory is presented as a way to build data integration and analytics pipelines that move and transform data between on-premises and cloud data stores. Azure Stream Analytics is introduced for analyzing real-time streaming data using a declarative query language.
2016-08-25 TechExeter - Going serverless with Azure (Steve Lee)
This document discusses serverless computing options on Microsoft Azure, including Azure Functions, Logic Apps, and Mobile Apps. Azure Functions allow developers to write small code fragments or "nanoservices" that run in ephemeral containers in a serverless computing environment. Logic Apps enable the creation of declarative, event-driven workflows to automate business processes. Mobile Apps provide backend services like user authentication, data synchronization, and push notifications for mobile applications. The document argues that serverless options on Azure simplify development by allowing developers to focus on their code while outsourcing server management.
This document provides an overview of serverless computing using Azure Functions. It discusses the benefits of serverless such as increased server utilization, instant scaling, and reduced time to market. Serverless allows developers to focus on business logic rather than managing servers. Azure Functions is introduced as a way to develop serverless applications using triggers and bindings in languages like C#, Node.js, Python and more. Common serverless patterns are also presented.
Building IoT and Big Data Solutions on Azure (Ido Flatow)
This document discusses building IoT and big data solutions on Microsoft Azure. It provides an overview of common data types and challenges in integrating diverse data sources. It then describes several Azure services that can be used to ingest, process, analyze and visualize IoT and other large, diverse datasets. These services include IoT Hub, Event Hubs, Stream Analytics, HDInsight, Data Factory, DocumentDB and others. Examples and demos are provided for how to use these services to build end-to-end IoT and big data solutions on Azure.
The document discusses Microsoft's Windows Azure cloud computing platform. It provides an overview of the platform's infrastructure, services, and pricing models. The key points are:
1. Windows Azure provides infrastructure and services for building applications and storing data in the cloud. It offers compute, storage, database, and connectivity services.
2. The platform's infrastructure includes globally distributed data centers housing servers in shipping containers for high density.
3. Services include SQL Azure, storage, content delivery, queues, and an app development platform. Pricing models are consumption-based or via subscriptions.
The document discusses data mesh vs data fabric architectures. It defines data mesh as a decentralized data processing architecture with microservices and event-driven integration of enterprise data assets across multi-cloud environments. The key aspects of data mesh are that it is decentralized, processes data at the edge, uses immutable event logs and streams for integration, and can move all types of data reliably. The document then provides an overview of how data mesh architectures have evolved from hub-and-spoke models to more distributed designs using techniques like kappa architecture and describes some use cases for event streaming and complex event processing.
This is Part 4 of the GoldenGate series on Data Mesh - a series of webinars helping customers understand how to move off of old-fashioned monolithic data integration architecture and get ready for more agile, cost-effective, event-driven solutions. The Data Mesh is a kind of Data Fabric that emphasizes business-led data products running on event-driven streaming architectures, serverless, and microservices based platforms. These emerging solutions are essential for enterprises that run data-driven services on multi-cloud, multi-vendor ecosystems.
Join this session to get a fresh look at Data Mesh; we'll start with core architecture principles (vendor agnostic) and transition into detailed examples of how Oracle's GoldenGate platform is providing capabilities today. We will discuss essential technical characteristics of a Data Mesh solution, and the benefits that business owners can expect by moving IT in this direction. For more background on Data Mesh, Parts 1, 2, and 3 are on the GoldenGate YouTube channel: https://www.youtube.com/playlist?list=PLbqmhpwYrlZJ-583p3KQGDAd6038i1ywe
Webinar Speaker: Jeff Pollock, VP Product (https://www.linkedin.com/in/jtpollock/)
Mr. Pollock is an expert technology leader for data platforms, big data, data integration and governance. Jeff has been CTO at California startups and a senior exec at Fortune 100 tech vendors. He is currently Oracle VP of Products and Cloud Services for Data Replication, Streaming Data and Database Migrations. While at IBM, he was head of all Information Integration, Replication and Governance products, and previously Jeff was an independent architect for US Defense Department, VP of Technology at Cerebra and CTO of Modulant – he has been engineering artificial intelligence based data platforms since 2001. As a business consultant, Mr. Pollock was a Head Architect at Ernst & Young’s Center for Technology Enablement. Jeff is also the author of “Semantic Web for Dummies” and "Adaptive Information,” a frequent keynote at industry conferences, author for books and industry journals, formerly a contributing member of W3C and OASIS, and an engineering instructor with UC Berkeley’s Extension for object-oriented systems, software development process and enterprise architecture.
So you got a handle on what Big Data is and how you can use it to find business value in your data. Now you need an understanding of the Microsoft products that can be used to create a Big Data solution. Microsoft has many pieces of the puzzle, and in this presentation I will show how they fit together. How does Microsoft enhance and add value to Big Data? From collecting data, transforming it, storing it, to visualizing it, I will show you Microsoft's solutions for every step of the way.
Microsoft Azure DocumentDB is a NoSQL document database service that is part of Microsoft Azure. It allows for the storage and querying of JSON documents and offers rich query capabilities over schema-free data using SQL and JavaScript. DocumentDB provides scalability, availability, and predictable performance for cloud applications.
This document discusses building applications with SQL Data Services and Windows Azure. It provides an agenda that introduces SQL Data Services architecture, describes SDS application architectures, and how to scale out with SQL Data Services. It also discusses the SQL Data Services network topology and performance considerations for accessing SDS from applications.
Big Data Analytics from Azure Cloud to Power BI Mobile (Roy Kim)
This document discusses using Azure services for big data analytics and data insights. It provides an overview of Azure services like Azure Batch, Azure Data Lake, Azure HDInsight and Power BI. It then describes a demo solution that uses these Azure services to analyze job posting data, including collecting data using a .NET application, storing in Azure Data Lake Store, processing with Azure Data Lake Analytics and Azure HDInsight, and visualizing results in Power BI. The presentation includes architecture diagrams and discusses implementation details.
Azure Data Explorer deep dive - review 04.2020 (Riccardo Zamana)
Modern Data Science Lifecycle with ADX & Azure
This document discusses using Azure Data Explorer (ADX) for data science workflows. ADX is a fully managed analytics service for real-time analysis of streaming data. It allows for ad-hoc querying of data using Kusto Query Language (KQL) and integrates with various Azure data ingestion sources. The document provides an overview of the ADX architecture and compares it to other time series databases. It also covers best practices for ingesting data, visualizing results, and automating workflows using tools like Azure Data Factory.
Let's talk about what Microsoft has to offer as a platform to help you build an Internet of Things solution. Mainly about Azure cloud but also Machine Learning, Cognitive Services, Windows, Hololens, Open Source
Microsoft Azure is a cloud computing platform offering a range of services including compute, analytics, storage, networking, and more. Users can choose services to develop and scale applications in the public cloud. Key Azure products include Virtual Machines, App Service, SQL Database, Storage, Backup, API Management, Cosmos DB, Machine Learning, Security and Compliance, SQL Data Warehouse, Notification Hubs, IOT Hub, and more. Azure provides scalable, reliable cloud services to build applications across platforms and devices.
The document discusses Microsoft's data platform and cloud services. It highlights:
1) Microsoft's data platform provides intelligence over all data with SQL and Apache Spark, enabling AI and machine learning over any data.
2) Microsoft offers data modernization solutions for migrating to the cloud or managing data on-premises and in hybrid environments.
3) Migrating databases to Azure provides cost savings, security, high performance, and intelligent capabilities through services like Azure SQL Database and Azure Cosmos DB.
Understanding the Windows Azure Platform - Dec 2010 (David Gristwood)
This document provides an overview of the Windows Azure platform. It describes Windows Azure as a platform as a service (PaaS) that provides scalable compute and storage services in the cloud. It outlines the core services of Windows Azure including compute, storage, networking and tools for development, deployment and management. It also discusses key advantages like scalability, reliability, flexibility and the pay-as-you-go business model.
Cloud Modernization and Data as a Service Option (Denodo)
Watch here: https://bit.ly/36tEThx
The current data landscape is fragmented, not just in location but also in terms of shape and processing paradigms. Cloud has become a key component of modern architecture design. Data lakes, IoT, NoSQL, SaaS, etc. coexist with relational databases to fuel the needs of modern analytics, ML and AI. Exploring and understanding the data available within your organization is a time-consuming task. Dealing with bureaucracy, different languages and protocols, and the definition of ingestion pipelines to load that data into your data lake can be complex. And all of this without even knowing if that data will be useful at all.
Attend this session to learn:
- How dynamic data challenges and the speed of change requires a new approach to data architecture – one that is real-time, agile and doesn’t rely on physical data movement.
- Learn how logical data architecture can enable organizations to transition data faster to the cloud with zero downtime and ultimately deliver faster time to insight.
- Explore how data as a service and other API management capabilities is a must in a hybrid cloud environment.
2. About me
Eyal Ben Ivri
Big Data & Cloud Architect, Sela Group
Focus on the Hadoop ecosystem, big data, and NoSQL solutions
3. Modern Data – The Big Picture
IoT
User Data
Media Files
Documents
Machine Data
Log Files
5. The Light Rail problem – TLV Railway
Imagine the new light rail maintenance company
IoT – Internet of Trains (and cameras, and cash registers, and carts, and rails, and more…)
Analyze data in stream and in batch
Dashboards
Alerts
The perfect problem
6. What We Need
An integrated data solution that will be:
Able to process events from external sources
Able to walk data through different pipelines
Fast and responsive
Big-Data Ready
7. In Other Words
Consume: BI, dashboards, applications
Process: ETL, aggregations, computation, analysis, querying
Persist: Hadoop, SQL, NoSQL
Ingest: IoT, structured data, un-structured data
8. Microsoft Azure Services for IoT and BigData
Devices: devices and external data sources
Device Connectivity: Event Hubs, Service Bus
Storage: SQL Database, Table/Blob Storage, DocumentDB, Data Lake Store
Analytics: Machine Learning, Stream Analytics, HDInsight, Data Lake Analytics, Data Factory
Presentation & Action: App Service, Power BI, Notification Hubs, Mobile Services, BizTalk Services
10. Event Hub
Messages at scale
Why not throw it into a queue and have a listener at the backend? Scaling limits, because of the architecture of queues and topics in a standard Service Bus
Event Hub uses a partition model
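To make the partition model concrete, here is an illustrative Python sketch (not the Azure SDK): events carrying the same partition key always hash to the same partition, so per-device ordering is preserved while separate partitions can be consumed in parallel. The device IDs and event names are hypothetical.

```python
# Illustrative sketch of a partitioned event log: a partition key maps to
# a stable partition index, so one device's events stay ordered while the
# overall stream fans out across partitions for parallel consumers.
import hashlib

def assign_partition(partition_key: str, partition_count: int) -> int:
    """Map a partition key to a stable partition index."""
    digest = hashlib.sha256(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % partition_count

partitions = {i: [] for i in range(4)}
events = [("train-17", "door_open"), ("train-42", "speed"), ("train-17", "door_close")]
for device_id, payload in events:
    partitions[assign_partition(device_id, 4)].append((device_id, payload))

# All events from "train-17" land in one partition, in send order.
print(partitions[assign_partition("train-17", 4)])
```

A plain Service Bus queue has a single ordered log behind it, which caps throughput; hashing keys across partitions is what lets an event hub scale out while keeping the ordering guarantee that matters (per device).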
11. Getting Started
Easy to set up
Two configurations:
Partition Count – depends on the number of consumers (2-32)
Message Retention – between 1 and 7 days
Secured using SAS policies
13. IoT & Data Processing Patterns
Devices (RTOS, Linux, Windows, Android, iOS) → Field Gateway (protocol adaptation) → Cloud Gateway (Event Hubs & IoT Hub) → Analytics & Operationalized Insights
Device connectivity & management
Hot Path Analytics: Azure Stream Analytics, Azure HDInsight Storm
Hot Path Business Logic: Service Fabric & Actor Framework
Batch Analytics & Visualizations: Azure HDInsight, AzureML, Power BI, Azure Data Factory
14. TLV Railway
Can now ingest millions of messages each second
These messages carry data from:
Devices
End-machines
Servers
Next, we need to use this data to create real-time alerts when something goes wrong
15. Azure Stream Analytics
Automatic recovery
Monitoring and alerting
Scale on demand
Managed cloud service
Each unit handles 1MB/s; can scale up to 1GB/s
SQL-like language with temporal windowing semantics and support for reference data
16. Stream Analytics – Main Concepts
Inputs
Can be stream or reference data (metadata)
Stream data sources can be Event Hub, Blob Storage (using blobs with timestamps) or IoT Hub (preview)
Serialization types supported: CSV, JSON, and Avro
Query
A SQL query that selects from input(s) and writes results to output(s)
Output
Can be Blob, SQL, Event Hub (notification), Power BI (preview), Table Storage, Service Bus or DocumentDB
17. Tumbling Windows
How many trains entered each station every 5 minutes?
SELECT StationId, COUNT(*)
FROM EntryStream
GROUP BY StationId, TumblingWindow(minute, 5)
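The semantics of that query can be emulated in a few lines of Python: each event falls into exactly one fixed, non-overlapping 5-minute bucket, and we count per (window, station). The timestamps and station names below are made up for illustration.

```python
# Hypothetical emulation of a tumbling-window count: bucket each event
# into a fixed, non-overlapping 5-minute window, then count per station.
from collections import Counter

WINDOW_SECONDS = 5 * 60

def tumbling_counts(events):
    """events: iterable of (timestamp_seconds, station_id).
    Returns {(window_start_seconds, station_id): count}."""
    counts = Counter()
    for ts, station in events:
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[(window_start, station)] += 1
    return counts

events = [(10, "Allenby"), (70, "Allenby"), (301, "Allenby"), (40, "Arlozorov")]
print(tumbling_counts(events))
```

The events at 10s and 70s share the window starting at 0; the event at 301s starts a new window at 300, which is exactly what `TumblingWindow(minute, 5)` expresses declaratively.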
18. Temporal Windows
Tumbling Window: a series of fixed-sized, non-overlapping and contiguous time intervals
Hopping Window: scheduled overlapping windows
Sliding Window: outputs events only for those points in time when the content of the window actually changes
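The tumbling/hopping distinction comes down to how far apart the window starts are. A small sketch (window sizes are illustrative; a sliding window is event-driven rather than scheduled, so it is not generated this way):

```python
# Generate the (start, end) intervals of scheduled windows over a
# 60-second timeline. Tumbling is the special case where the hop
# equals the window size, so intervals never overlap.
def window_starts(size, hop, horizon=60):
    """Return (start, start + size) for one window every `hop` seconds."""
    return [(start, start + size) for start in range(0, horizon, hop)]

tumbling = window_starts(size=20, hop=20)  # non-overlapping, contiguous
hopping = window_starts(size=20, hop=10)   # each window overlaps the next
print(tumbling)
print(hopping)
```

Hopping windows trade extra computation for smoother, more frequent results over the same data; tumbling windows partition time cleanly so every event is counted exactly once.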
19. TLV Railway
Can now respond in near-real-time to events as they happen
Track and maintain malfunctioning equipment
Receive real-time data regarding customers entering and leaving stations
Data can now be processed, so we need a place to save it, preferably at scale.
20. DocumentDB and Azure Data Services
A fully managed, scalable, queryable, schema-free JSON document database service for modern applications
Transactional processing
Rich query
Managed as a service
Elastic scale
Internet accessible via HTTP/REST
Schema-free data model
Arbitrary data formats
21. DocumentDB Features
JSON documents
SQL support
LINQ support
REST API support
JavaScript support (triggers, UDFs, stored procedures)
Automatic indexing
Multi-document transactions
Tunable consistency
22. DocumentDB Key Concept
Collection
A collection of documents
Not a table (different entities can go into the same collection)
Collections = partitions
Not just logical containers, but physical ones
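The "not a table" point is easy to see with plain JSON (illustrative only, not the DocumentDB SDK): documents of different shapes live side by side in one collection, and a query filters on whatever fields each document happens to have. The document shapes and field names are hypothetical.

```python
# A schema-free collection can hold different entity shapes side by side;
# queries filter on fields without any table schema being declared.
import json

collection = [
    {"type": "sensor_reading", "trainId": 17, "temp": 41.5},
    {"type": "alert", "trainId": 17, "severity": "high"},
    {"type": "sensor_reading", "trainId": 42, "temp": 36.0},
]

# Equivalent in spirit to:
#   SELECT * FROM c WHERE c.type = 'sensor_reading' AND c.temp > 40
hot = [d for d in collection
       if d.get("type") == "sensor_reading" and d.get("temp", 0) > 40]
print(json.dumps(hot))
```

Because collections are also physical partitions, which documents you co-locate in a collection is a capacity and transaction-scope decision, not just an organizational one.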
24. TLV Railway
Can now store its data in a highly scalable store
Great for interactive querying of any data
Messages from sensors
Reference data
But this data (and other data) needs to move to other places (SQL, batch processing, ML). How?
25. What is Azure Data Factory?
Azure Data Factory is a managed service to produce trusted information from data stored in the cloud and on-premises. Easily create, orchestrate and schedule highly available, fault-tolerant workflows to move and transform your data at scale.
26. Evolving Approaches to Analytics
Traditional: extract original data → ETL tool (SSIS, etc.) → transform → load transformed data → EDW (SQL Server, Teradata, etc.) → BI tools
Modern: ingest original data and streaming data → scale-out storage & compute (HDFS, Blob Storage, etc.) serving as data lake(s) → transform & load → data marts → dashboards and apps
27. Data Factory – Main Concepts
Data Store
A data source/sink component
SQL (Azure or on-premises), Storage, DocumentDB and more
Data Set
A defined data set that is contained inside a data store
One data store can have many data sets
Compute
A service for computation
HDInsight, Azure Batch, Data Lake Analytics, Azure ML
28. Data Factory – Main Concepts
Pipeline
A set of instructions: "Take data from data set A and move to compute, then store results in data set B"
Slices
Everything is time-sliced
A data set (source) can declare on what time intervals the data can be sliced, and the pipeline will be activated when a new slice is ready
Everything is defined in JSON
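The slicing idea can be sketched in a few lines of Python: a dataset declares an availability interval, the timeline is cut into slices of that length, and the pipeline fires once per ready slice. The function name and dates are illustrative, not the Data Factory JSON schema.

```python
# Cut a time range into fixed-length slices; a pipeline run would be
# triggered once per slice as its data becomes available.
from datetime import datetime, timedelta

def slice_starts(start, end, interval):
    """Yield the start of each slice in [start, end)."""
    current = start
    while current < end:
        yield current
        current += interval

day = list(slice_starts(datetime(2016, 1, 1), datetime(2016, 1, 2),
                        timedelta(hours=6)))
print(day)  # four 6-hour slices: 00:00, 06:00, 12:00, 18:00
```

Time-slicing is what makes the orchestration fault-tolerant: a failed slice can be re-run on its own without reprocessing the whole dataset.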
32. TLV Railway
Can now integrate different services and different data sources
Move data with ease and as little hassle as possible
What about aggregations and a deeper dive into the data, for more complex analysis?
34. HDInsight
Hadoop-as-a-Service
Based on the Hortonworks distribution
A few flavors:
Hadoop (Windows + Linux)
Storm (Windows + Linux)
HBase (Windows + Linux)
Spark (Windows + Linux)
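What HDInsight runs at cluster scale is the MapReduce model, which can be sketched in plain Python to show the three phases: map each record to key/value pairs, shuffle (group) by key, then reduce each group. The log lines are hypothetical; a real job would distribute each phase across the cluster.

```python
# A shrunk-down word count in the MapReduce style: map -> shuffle -> reduce.
from collections import defaultdict

def map_phase(lines):
    """Emit (word, 1) for every word in every line."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Group emitted values by key (the framework does this between phases)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Combine each key's values into a final count."""
    return {key: sum(values) for key, values in groups.items()}

logs = ["train 17 delayed", "train 42 on time", "train 17 resumed"]
counts = reduce_phase(shuffle(map_phase(logs)))
print(counts)
```

The same shape applies whether the engine is classic MapReduce, Hive (which compiles SQL down to such jobs), or Spark (which keeps the intermediate groups in memory).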
37. TLV Railway - Summary
Can now perform advanced analytics on top of large amounts of data, in a variety of formats (not just structured, boring data)
Can integrate all the loose ends of data coming in with data generated in "old-school" data platforms like SQL that is collected from line-of-business applications
We've covered data ingestion, responding in real-time, querying, storing and processing with the Azure stack
38. Hadoop and OSS vs. Azure IoT and BigData Ecosystem
Azure Ecosystem ↔ OSS
Event Hubs ↔ Kafka
Stream Analytics ↔ Storm
HDInsight ↔ Hadoop
MapReduce ↔ MapReduce
Hive ↔ Hive
Spark ↔ Spark
HBase ↔ HBase
Azure ML ↔ Mahout
Data Factory ↔ Pig
DocumentDB ↔ MongoDB / Couchbase
#9: Key goal of slide:
IoT, as you know, is a hot area these days, and there are a number of players that claim to be active in this space. They tend to focus on specific elements you see in this diagram.
Microsoft has the most comprehensive portfolio of cloud services that customers need to develop and deploy end-to-end IoT solutions.
Customers are adopting these services and are successfully deploying their solutions today (reference Rockwell, ThyssenKrupp)
Talk track [Short Version for Sam’s Leadership Session]:
As we think about Azure IoT services, Microsoft has the most comprehensive portfolio of cloud services that customers need to develop and deploy end-to-end IoT solutions
Ranging from devices that produce data, to connecting them to the cloud storage, and driving analytics to gain valuable business insights that allows enterprises to take actions
Talk track [Long Version Chris’ Breakout Session]:
As we think about Azure IoT services, there are a collection of capabilities involved.
First there are Producers. These can be basic sensors, small form factor devices, traditional computer systems, or even complex assets made up of a number of data sources.
Next we have the Connect Devices capabilities on the ingress level within and around Azure. The primary destination is Service Bus & Event Hubs, but this relies on client agent technology either at the edge device level or within a field or cloud gateway. We also have capabilities for other external data sources to provide data.
As data is ingressed to Azure, there are various storage options, and a number of destinations can be engaged: traditional database technology, Table or Blob storage, or even more complex destinations like DocumentDB. External or third-party technologies can also be used. This is where the flexibility and agility of a platform shows its strength, and where analysts like Gartner are forming opinions about just how robust our platform can be.
As this data is processed in Azure, there are a number of capabilities that can be utilized. Machine Learning, HDInsight, and Stream Analytics are examples of tools that can analyze the data in various ways.
Finally the concept of Take Actions uses Azure services. Data may populate a LOB portal, be pushed to apps, or presented in analytics and productivity tools. These are all ways that the data gets out of these architecture points to allow organizations to use analysis to change / transform their business.
Through all of these areas, there is the possibility of utilizing existing investments either within your Azure environment, or elsewhere.
#10: Key goal of slide:
IoT as you know is a hot area these days and there are a number of players that claim to be active in this space…. And they tend to focus on specific elements you see in this diagram.
Microsoft has the most comprehensive portfolio of cloud services that customers need to develop and deploy end-to-end IoT solutions.
Customers are adopting these services and are successfully deploying their solutions today (reference Rockwell, ThyssenKrupp)
Talk track [Short Version for Sam’s Leadership Session]:
As we think about Azure IoT services, Microsoft has the most comprehensive portfolio of cloud services that customers need to develop and deploy end-to-end IoT solutions
Ranging from devices that produce data, to connecting them to the cloud storage, and driving analytics to gain valuable business insights that allows enterprises to take actions
Talk track [Long Version Chris’ Breakout Session]:
As we think about Azure IoT services, there are a collection of capabilities involved.
First there are Producers. These can be basic sensors, small form factor devices, traditional computer systems, or even complex assets made up of a number of data sources.
Next we have the Connect Devices capabilities on the ingress level within and around Azure. The primary destination is Service Bus & Event Hubs, but this relies on client agent technology either at the edge device level or within a field or cloud gateway. We also have capabilities for other external data sources o provide data
As data is ingressed to Azure, there are various Storage options there can be a number of destinations engaged. Traditional database technology, table or blob, or even more complex destinations like Document DB are possible. External or third party technologies can also be used. This is where the flexibility and agility of a platform shows its strength, This is where analysts like Gartner are forming opinions about just how robust our platform can be.
As this data is processed in Azure, there are a number of capabilities that can be utilized. Machine Learning, HD Insight, Stream Analytics are examples of tools that can analytics the data in various ways.
Finally the concept of Take Actions uses Azure services. Data may populate a LOB portal, be pushed to apps, or presented in analytics and productivity tools. These are all ways that the data gets out of these architecture points to allow organizations to use analysis to change / transform their business.
Through all of these areas, there is the possibility of utilizing existing investments either within your Azure environment, or elsewhere.
#31: Key goal of slide:
IoT as you know is a hot area these days and there are a number of players that claim to be active in this space…. And they tend to focus on specific elements you see in this diagram.
Microsoft has the most comprehensive portfolio of cloud services that customers need to develop and deploy end-to-end IoT solutions.
Customers are adopting these services and are successfully deploying their solutions today (reference Rockwell, ThyssenKrupp)
Talk track [Short Version for Sam’s Leadership Session]:
As we think about Azure IoT services, Microsoft has the most comprehensive portfolio of cloud services that customers need to develop and deploy end-to-end IoT solutions
Ranging from devices that produce data, to connecting them to the cloud storage, and driving analytics to gain valuable business insights that allows enterprises to take actions
Talk track [Long Version Chris’ Breakout Session]:
As we think about Azure IoT services, there are a collection of capabilities involved.
First there are Producers. These can be basic sensors, small form factor devices, traditional computer systems, or even complex assets made up of a number of data sources.
Next we have the Connect Devices capabilities on the ingress level within and around Azure. The primary destination is Service Bus & Event Hubs, but this relies on client agent technology either at the edge device level or within a field or cloud gateway. We also have capabilities for other external data sources o provide data
As data is ingressed to Azure, there are various Storage options, and a number of destinations can be engaged: traditional database technology, Table or Blob storage, or more complex destinations like DocumentDB. External or third-party technologies can also be used. This is where the flexibility and agility of a platform shows its strength, and where analysts like Gartner are forming opinions about just how robust our platform can be.
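The storage choice described above can be pictured as a simple routing decision based on the shape of the data. This is a toy rule of thumb, not an Azure API: raw bytes suit Blob storage, flat key/value rows suit Table storage, and nested JSON suits a document store like DocumentDB:

```python
def pick_store(record):
    """Toy routing rule for illustration only (not an Azure API):
    raw bytes -> Blob storage, nested documents -> a DocumentDB-style
    store, flat key/value rows -> Table storage."""
    if isinstance(record, bytes):
        return "blob"
    if isinstance(record, dict) and any(
        isinstance(v, (dict, list)) for v in record.values()
    ):
        return "document"
    return "table"
```

In practice the choice also depends on query patterns, scale, and cost, which is where the flexibility of the platform matters.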
As this data is processed in Azure, there are a number of capabilities that can be utilized. Machine Learning, HDInsight, and Stream Analytics are examples of tools that can analyze the data in various ways.
Finally, the Take Action step uses Azure services. Data may populate a LOB portal, be pushed to apps, or be presented in analytics and productivity tools. These are all ways that data gets out of these architecture points, allowing organizations to use analysis to change and transform their business.
Through all of these areas, there is the possibility of utilizing existing investments, either within your Azure environment or elsewhere.
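The end-to-end flow in this talk track (produce, connect, store, analyze, take action) can be sketched with in-memory stand-ins: a `Queue` in place of Event Hubs, a list in place of Table/Blob storage, a threshold rule in place of a Stream Analytics query, and an alert list in place of a LOB portal or app push. Purely illustrative; none of this is Azure SDK code:

```python
from queue import Queue

ingress = Queue()   # stand-in for Service Bus / Event Hubs
store = []          # stand-in for Table/Blob storage
alerts = []         # stand-in for a LOB portal or app notification

def produce(readings):
    """Producers push telemetry toward the ingress point."""
    for r in readings:
        ingress.put(r)

def process(threshold=30.0):
    """Drain ingress: store every reading, and 'take action'
    (raise an alert) when a reading crosses the threshold."""
    while not ingress.empty():
        reading = ingress.get()
        store.append(reading)                    # Store
        if reading["temperatureC"] > threshold:  # Analyze
            alerts.append(reading["deviceId"])   # Take action

produce([{"deviceId": "a", "temperatureC": 25.0},
         {"deviceId": "b", "temperatureC": 35.0}])
process()
```

The real services add durability, partitioning, and scale, but the shape of the data flow is the same.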
#32: Key goal of slide: