Cloud computing is transforming how businesses run their applications. Join us as we present the latest insights on why and how businesses are using cloud computing applications. You’ll hear the latest industry trends and get practical strategies around adoption success. Can cloud computing give you the security and trust you require while providing you with scalable solutions?
Asyma E3 2012 - Impact of cloud computing - Robert Lavery – asyma
This document discusses how cloud computing can provide benefits to small and medium-sized enterprises (SMEs). It outlines how cloud services have evolved from earlier hosted systems by providing economies of scale, reducing costs, and allowing for scalable resources on demand. The document then discusses how the cloud can help SMEs by providing access to sophisticated software and analytics tools. It also notes concerns around data privacy and security for businesses considering cloud adoption. Overall, the cloud is positioned as potentially helping SMEs reduce costs while gaining access to flexible IT resources and applications.
How to create a secure high performance storage and compute infrastructure – Abhishek Sood
Creating a secure, high-performance enterprise storage system presents a number of challenges.
Without a high throughput, low latency connection between your SAN and your cloud compute infrastructure, your business will struggle to extract actionable insights in time to make the best decisions.
Download this white paper to discover technology designed to deliver maximum storage and compute capacity for enterprises with massive data stores that need to solve business problems fast without compromising the security of user information.
Mohave County Arizona Experiences Software-Defined Storage with DataCore – DataCore Software
A virtualized software-defined storage solution enables this local government to minimize downtime related to storage hardware migrations. Mohave County is a local government entity in northwestern Arizona. Through its experience, Mohave County has become a vocal advocate of Software-Defined Storage (SDS), giving DataCore a “5” or “Excellent” rating (on a scale of 1-5) in response to the question “How well has DataCore delivered Software-Defined Storage to your organization?”
With rapid business expansion through acquisitions and organic growth, City and County Healthcare Group wanted to redesign its IT infrastructure. Local servers at branch offices were becoming harder to manage as the company's geographical reach expanded, so the lean IT team wanted to find an easier way to maintain availability and performance for core business applications by developing on a robust and scalable centralized platform.
Five Best Practices for Improving the Cloud Experience – Hitachi Vantara
This document summarizes a report on best practices for improving the cloud experience based on lessons learned from 232 global IT executives. The five best practices are: 1) Ensure cloud providers meet business and IT requirements through service level agreements. 2) Choose the right cloud service model based on needed control over security and data protection. 3) Use architectures that integrate cloud services with existing infrastructure. 4) Consider benefits beyond cost like improved operations and innovation. 5) Define business requirements for IT and have IT act as a cloud broker. The Hitachi Content Platform portfolio aligns with these practices by providing a secure, scalable cloud that meets business needs and accelerates cloud adoption.
- Three organizations realized benefits from using IBM PureFlex and Flex Systems as integrated platforms for cloud infrastructure, including 33% faster deployment times, near-100% system availability, and 37% lower operations costs.
- Capgemini deployed an IBM PureFlex system to support its SAP cloud hosting service 33% faster while improving performance 10-15% and reducing operations costs up to 30%.
- Redcentric transformed its cloud business model using IBM Flex System to rapidly provision resources and expand its service portfolio.
- Agilisys implemented an IBM PureFlex system for new financial services applications to quickly bring them to market and avoid complex integration work.
The document discusses how IBM's cloud services provide clients with unprecedented choice and control when deploying applications in cloud environments. It describes IBM's cloud reference architecture which includes infrastructure as a service, platform as a service, and business process as a service. The architecture is customized to meet clients' needs around management, security, availability, technology platforms, and pricing. Whether clients are designing new applications or automating existing ones like ERP, IBM can provide the right cloud solution.
The document provides an overview of IBM's Big Data platform vision. The platform addresses big data use cases involving high volume, velocity and variety of data. It integrates with existing data warehouse and master data management systems. The platform handles different data types and formats, provides real-time and batch analytics, and has tools to make it easy for developers and users to work with. It is designed with enterprise-grade security, scalability and failure tolerance. The platform allows organizations to analyze big data from various sources to gain insights.
Three organizations share how VMware Cloud on AWS helps them extend their data centers to the cloud in a hybrid cloud environment. IHS Markit was able to rapidly extend global services without buying additional data center infrastructure. ZENRIN DataCom increased the flexibility and scalability of their IT infrastructure. West Windsor-Plainsboro Regional School District improved organizational preparedness and reduced IT operating costs by migrating workloads to VMware Cloud on AWS and having the ability to move them back on-premises.
IT leaders are adopting hybrid cloud services at a rapid pace to increase business agility and cost containment. According to a February 2016 IDG study on Hybrid Cloud Computing, up to 83 percent of C-level respondents use or plan to use a hybrid cloud. Transforming IT service delivery to a hybrid cloud consumption model is the clear path for the vast majority of organizations. The bigger issue is how quickly an organization can change.
While building your own hybrid cloud solution may seem attractive, you should seriously evaluate the challenges such an option presents to already overextended IT staff resources. As detailed in this paper, organizations must be prepared for the likelihood of higher costs, longer deployments, and greater risks than they may initially expect. In addition, the IT community as a whole is rapidly moving away from integrating components and delivering services manually to a more strategic focus on providing high-value services to businesses.
The race to deliver cloud-native applications and services to the business demands that traditional IT services and applications either evolve to a hybrid cloud consumption model or risk placing the business at a competitive disadvantage. An engineered solution not only saves time, money, and resources but also allows your IT staff to focus on innovation and delivering IT services that increase business value and align with the evolving marketplace of IT services for enterprise-level organizations.
Organizations should seize the opportunity to achieve the transformational efficiencies the hybrid cloud can deliver. When planning your journey, consider buying rather than building your own solution to speed time to value, minimize expenditures, and reduce risk.
Bilcare Cost Effectively Scales For Business Growth – Bilcare Research
Bilcare Research Cost-Effectively Scales for Business Growth with Red Hat Enterprise Linux with Integrated Virtualization. For more, see the white papers at http://www.bilcare.com/
Steve Mills - Dispelling the Vapor Around Cloud Computing – Mauricio Godoy
The document discusses IBM's perspective on cloud computing. It defines cloud computing, outlines various cloud service and delivery models, and summarizes IBM's cloud computing offerings including consulting services, infrastructure, platforms, and applications.
Bilcare Ltd. is a pharmaceutical company that needed to centralize its data protection across multiple locations. It implemented Symantec NetBackup with PureDisk deduplication, which reduced backup data by 91-94% through deduplication and compression. This allowed 6x faster recoveries and improved backup success rates. Storage management time was reduced by 96%.
This document outlines a cloud migration plan with the following sections: executive summary, scope, cloud migration prerequisites, considerations, planning assessment tools, application discovery services, evaluation of discovered data, migration recommendations, and conclusion. The scope section defines cloud-ready applications as IaaS, SaaS, and PaaS and identifies the cloud provider as AWS. The document assesses growth projections, service level agreements, data security, vendor governance strategy, and the business and technical impact of cloud migration. It aims to develop a comprehensive cloud adoption policy and migration strategy.
The document discusses the rise of big data and why it is important for organizations now. It notes that the volume of data is growing exponentially and will soon reach zettabytes in size. However, most of this data is unstructured and many business leaders do not have access to all the information they need or do not fully trust the information they have. The traditional approach of having IT design structured solutions based on business requirements is no longer sufficient. Instead, the document advocates a big data approach where organizations explore and analyze all available data sources using a platform to discover new insights in an iterative manner and determine new questions to ask. IBM's big data platform is presented as a solution to address these needs by handling large volumes, velocities, and varieties of data.
The promise of cloud computing is realized because of its essential fundamentals—standardization of infrastructure, virtualized resources and automated processes—and the business results are measurable. Cloud computing represents a paradigm shift at many levels, but the ‘return on investment’ that cloud customers are realizing cannot be overstated.
New vulnerabilities have been discovered that could allow attackers to hack networks by sending malicious faxes to fax machines. The vulnerabilities affect tens of millions of fax devices globally that use certain communication protocols. Researchers from Check Point demonstrated that specially crafted image files sent by fax could encode malware payloads that would be executed on the target fax machine and potentially spread across connected networks. While fax machines are less commonly thought of as modern devices, there are over 45 million in use by businesses globally, and many office printers have faxing capabilities built-in, representing an overlooked security risk.
FICO with VMware Virtualization Technology – Mainstay
FICO is a world leader in decision management and predictive analytics. Its investment in VMware virtualization technology allowed the company to consolidate its sprawling IT infrastructure, automate system management tasks, boost availability, and shrink energy consumption, while allowing it to deploy a state-of-the-art cloud computing environment to speed product development worldwide.
Yahoo uses Apache Hadoop at a massive scale to power many of its products and services. Hadoop clusters at Yahoo contain tens of thousands of servers and store over 170 petabytes of data. Hadoop is used for data analytics, content optimization, machine learning, advertising products, and more. One example is Yahoo's homepage, where Hadoop enables the personalization of content for each user, increasing engagement on the site.
Over the last decade, cloud computing has transformed the market for IT services. But the journey to cloud adoption has not been without its share of twists and turns. This report looks at lessons that can be derived from companies' experiences implementing cloud computing technology.
This document discusses hybrid cloud models for Vietnam. It begins by outlining the evolution of cloud computing, from the virtualization era focused on infrastructure as a service (IaaS) to the current era of hybrid data. It then discusses how the digital economy is driving disruption through ecosystem-based innovation, insight-driven processes, and apps that consolidate decision making. The document recommends targeting a technology stack to achieve digital disruption and outlines a high-level target architecture. It emphasizes that a hybrid IT approach integrating new agile capabilities with existing environments provides for two-speed IT delivery. Finally, the document discusses common use cases for hybrid cloud and factors to consider when selecting a cloud model.
Using Oracle SOA Suite 11g to Integrate with PeopleSoft 9.1 – Brad Bukacek Jr.
Using Oracle SOA Suite 11g, Veolia implemented a fully integrated solution to eliminate redundant manual processes across its PeopleSoft 9.1 and Hard Dollar systems. Key components included integrating bids from Hard Dollar to create contracts and projects in PeopleSoft, sharing customer and rate data between the systems, and handling large data transfers through throttling. Services were developed using a reusable approach based on Oracle Application Integration Architecture methodology.
The document discusses the challenges of enterprise data management and Veritas' solutions to address these challenges. It notes that data is the most critical asset for organizations and must be always available, compliant, and relevant. It then summarizes some of Veritas' data management products like NetBackup, Information Map, Veritas Resiliency Platform, and Velocity that provide capabilities like data visibility, predictable resilience, integrated copy data management, and a 360-degree view of the data landscape. The document aims to demonstrate how Veritas can help organizations gain insights from their data and meet requirements around availability, protection, visibility and access.
This document discusses how choosing the right NAS platform can help organizations address challenges related to rapidly growing data and flat budgets. It recommends looking for a solution that can scale to meet future capacity and performance needs efficiently over 3-4 years, drive capacity efficiencies through deduplication, integrate well with VMware, simplify storage administration, and streamline upgrades. The document then introduces the Hitachi NAS Platform 4000 series as a solution that can provide these benefits, helping organizations consolidate storage, improve productivity, and reduce costs and complexity.
Technology has helped humans overcome natural limitations and improve quality of life. As populations grew, technology advanced to help humans communicate over long distances, farm more efficiently, travel via land, water and air, and build stronger structures like houses and bridges. Overall, technological developments have transformed how humans live and made the world a more comfortable place.
The document discusses Cisco Nexus 1000V and the Nexus 1010 appliance. It provides an overview of the Nexus 1000V architecture, comparing it to a physical modular switch. It describes how the Nexus 1000V uses Virtual Supervisor Modules (VSMs) and Virtual Ethernet Modules (VEMs) to replace the functionality of physical linecards and supervisors. It also discusses how the Nexus 1010 appliance allows hosting of VSMs on a physical device for improved performance and redundancy.
The document discusses applications of factoring polynomials. It provides examples of how factoring can be used to evaluate polynomials by substituting values into the factored form. Factoring is also useful for determining the sign of outputs and for solving polynomial equations, which is described as the most important application of factoring. Examples are given to demonstrate evaluating polynomials both with and without factoring, and checking the answers obtained from factoring using the expanded form.
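As a small worked illustration of the points above (an example of ours, not one taken from the document), factoring lets you evaluate a polynomial from its factors, read off its roots, and see its sign at a glance:
\[
p(x) = x^{2} - 5x + 6 = (x-2)(x-3)
\]
\[
p(4) = (4-2)(4-3) = 2, \qquad \text{check: } 4^{2} - 5\cdot 4 + 6 = 16 - 20 + 6 = 2
\]
\[
p(x) = 0 \;\Longrightarrow\; x = 2 \ \text{or}\ x = 3
\]
For x between 2 and 3, one factor is positive and the other negative, so p(x) is negative on that interval.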
This document summarizes the key aspects of deduplication on NetApp storage arrays. It discusses what deduplication does, the core enabling technology of fingerprints, how fingerprints work, dedupe metadata space requirements, how dedupe metadata is handled in different ONTAP versions, potential dedupe savings rates for different data types, and considerations around when dedupe is appropriate.
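To illustrate the fingerprint idea described above, here is a toy Python sketch of block-level deduplication. It is our own simplification, not NetApp's implementation (ONTAP pairs lighter-weight fingerprints with a block verification pass), and the 4 KB block size and SHA-256 hash are assumptions made for the example.

```python
import hashlib

BLOCK_SIZE = 4096  # assume fixed 4 KB blocks for this sketch


def deduplicate(data: bytes):
    """Toy fingerprint-based dedupe: hash each block, keep one copy per
    unique fingerprint, and report the resulting space savings."""
    store = {}        # fingerprint -> block payload (stored once)
    references = []   # logical layout: ordered list of fingerprints
    for offset in range(0, len(data), BLOCK_SIZE):
        block = data[offset:offset + BLOCK_SIZE]
        fingerprint = hashlib.sha256(block).hexdigest()
        if fingerprint not in store:
            store[fingerprint] = block
        references.append(fingerprint)
    logical = len(references) * BLOCK_SIZE
    physical = sum(len(b) for b in store.values())
    savings = 1 - physical / logical if logical else 0.0
    return store, references, savings


# Example: highly repetitive data deduplicates almost completely.
_, _, savings = deduplicate(b"A" * BLOCK_SIZE * 100 + b"B" * BLOCK_SIZE * 100)
print(f"space savings: {savings:.0%}")  # -> 99%
```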
The document introduces the payroll parallel testing process for an ADP implementation. It explains that a parallel test involves loading payroll data from a past period into the new ADP system and comparing the results to the legacy system to validate configurations and ensure accurate payroll processing. It provides an overview of the parallel testing process, including preparation activities, testing scope, and exit criteria to sign off on the implementation. Employees are asked to actively participate and provide input to help ensure a successful transition to the new ADP payroll system.
The document discusses population pharmacokinetic (PK) analysis. Population PK seeks to identify factors that cause variability in drug concentrations among patients and quantify their effects to help determine appropriate dosages. It describes common PK parameters, software used for PK analysis like NONMEM, and approaches for analyzing population PK data, including nonlinear mixed-effects modeling. An example population PK analysis is provided using simulated gentamicin concentration-time data from 30 patients to illustrate modeling the typical response, heterogeneity between individuals, and uncertainty in the model.
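To make the modeling approach concrete, a generic one-compartment nonlinear mixed-effects formulation of the kind typically fit to IV gentamicin data looks like the following (a sketch of the standard model, not the document's actual analysis):
\[
C_{ij} = \frac{D_i}{V_i}\, e^{-(CL_i/V_i)\,t_{ij}}\,\bigl(1+\varepsilon_{ij}\bigr),
\qquad
CL_i = \theta_{CL}\,e^{\eta_{i,CL}},
\qquad
V_i = \theta_{V}\,e^{\eta_{i,V}},
\]
\[
(\eta_{i,CL},\,\eta_{i,V}) \sim \mathcal{N}(0,\Omega),
\qquad
\varepsilon_{ij} \sim \mathcal{N}(0,\sigma^{2}).
\]
Here \(C_{ij}\) is the j-th concentration observed in patient i after dose \(D_i\); the fixed effects \(\theta\) describe the typical clearance and volume of distribution, the random effects \(\eta_i\) capture between-patient heterogeneity, and \(\varepsilon_{ij}\) is the residual (proportional) error, corresponding to the layers of typical response, between-individual variability, and residual uncertainty mentioned above.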
The mobile space is exploding and affiliates need to avoid mistakes and move quickly to generate revenue. Discuss practical case studies and learn from our real-life client examples.
Experience level: Beginner, Intermediate, Advanced
Target audience: Affiliate/Publisher, Merchant/Advertiser
Niche/vertical: Mobile
Jeff Stevens, Director of Sales, DirectTrack (Twitter: @jstevnz)
If you’re responsible for creating diverse, scalable automated tests but don’t have the time, budget, or a skilled-enough team to create yet another custom test automation framework, then you need to know about Robot Framework!
In this webinar, Bryan Lamb (Founder, RobotFrameworkTutorial.com) and Chris Broesamle (Solutions Engineer, Sauce Labs) will reveal how you can use this powerful, free, open source, generic framework to create continuous automated regression tests for web, batch, API, or database testing. With the simplicity of Robot Framework, in conjunction with Sauce Labs, you can improve your test coverage and time to delivery of your applications.
The document discusses the use of procurement analytics. It begins by explaining what procurement analytics is and why organizations should use it. Analytics can increase demand forecasting accuracy and contract negotiation power. The document then discusses how analytics can be applied in areas like vendor evaluation, spend analysis, and demand forecasting. It also outlines challenges to implementation and provides recommendations for next steps like gaining leadership support, collaborating cross-functionally, developing skills, and integrating systems.
This document provides an overview of IBM's Identity and Access Management (IAM) product portfolio, including IBM Security Identity Manager, IBM Security Privileged Identity Manager, and IBM Security Access Manager. It discusses how these products help customers secure access, streamline user provisioning and access requests, safeguard access in cloud/SaaS environments, address compliance needs, and centrally manage privileged identities. Specific capabilities highlighted include identity lifecycle management, self-service access requests, centralized password management, account reconciliation, access recertification, reporting for audits, and broad application integration.
Stylus is a CSS preprocessor that aims to simplify CSS by removing syntactic sugar like brackets and semicolons, enforcing indentation, and allowing variables, mixins, and nested selectors. Nib is a library of mixins, utilities, and components for Stylus that handles vendor prefixes and provides things like clearfixes and hiding text. Together, Stylus and Nib allow for more concise and maintainable CSS code.
Bridgestone proposes launching run-flat motorcycle tires called Battlax RF in Thailand. Battlax RF provides safety, comfort and fuel efficiency. They allow riders to continue traveling after a tire pressure loss. Thailand has many unpaved roads and motorcycle accidents, creating a potential market. Financial projections show the 5-year NPV is $57.8 million and 10-year is $125.5 million. Battlax RF faces minimal competition and leverages Bridgestone's existing production and distribution in Thailand. The proposal concludes Battlax RF is a lucrative opportunity in line with Bridgestone's mission.
The document discusses the importance of data storage for large tech companies and the challenges of storing large amounts of data reliably. It provides an overview of NetApp's storage solutions, including Data ONTAP, WAFL file system, Snapshot technology, replication tools like SnapMirror, and management tools like My AutoSupport. NetApp believes in providing a unified storage platform with integrated data protection, management and optimization capabilities.
Management 315: International Management, Professor In Hyeock Lee
Loyola University Chicago Spring 2013
This case study analyzes Honda's overall performance as a multinational enterprise using the company's revenue data, 4 distances, firm specific advantages, country specific advantages, foreign direct investment, and much more.
Sham Hassan Chikkegowda, CS Engineer, and Timothee Maret, Senior Developer, of Adobe provide a review of using Security Assertion Markup Language (SAML) with your Experience Manager deployments. SAML is an XML-based, open-standard data format for exchanging authentication and authorization data between parties, in particular, between an identity provider and a service provider. SAML is a product of the OASIS Security Services Technical Committee. Watch the session on demand at http://bit.ly/AEMGems72016 or the MP4 version at http://bit.ly/AEMGem72016
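For readers new to the format, the short sketch below shows the kind of XML assertion an identity provider issues and how a service provider might extract the subject from it. The assertion, issuer URL, and subject shown are hypothetical, and real deployments must also validate the signature, audience, and timestamps before trusting the assertion.

```python
import xml.etree.ElementTree as ET

# Minimal, hypothetical SAML 2.0 assertion for illustration only.
ASSERTION = """<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion"
    ID="_example-id" Version="2.0" IssueInstant="2016-07-01T00:00:00Z">
  <saml:Issuer>https://idp.example.com/idp</saml:Issuer>
  <saml:Subject>
    <saml:NameID>jane.doe@example.com</saml:NameID>
  </saml:Subject>
</saml:Assertion>"""

NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

root = ET.fromstring(ASSERTION)
issuer = root.findtext("saml:Issuer", namespaces=NS)
name_id = root.findtext("./saml:Subject/saml:NameID", namespaces=NS)
print(f"identity provider: {issuer}")
print(f"authenticated subject: {name_id}")
```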
HORMONAL REGULATION OF OVULATION, PREGNANCY, PARTURITION – Sudarshan Gokhale
The document discusses the hormonal regulation of ovulation, pregnancy, and parturition. It describes the key hormones involved in each process, including estrogen, progesterone, LH, FSH, hCG, relaxin, corticotropin, and oxytocin. Ovulation is regulated by the hypothalamus and pituitary gland releasing hormones like LH and FSH. Pregnancy involves changes in the maternal body and is maintained by hormones like estrogen, progesterone, hCG, and corticotropin. Parturition is triggered by a drop in progesterone and rise in oxytocin, relaxing ligaments and stimulating uterine contractions.
This document summarizes the history and types of surgical dressings. It discusses how dressings have evolved from simple cloths to advanced engineered skin substitutes. The key types of dressings covered are dry dressings, moisture-keeping dressings, bioactive dressings, and skin substitutes. Examples of commonly used dressings like gauze, foams, hydrocolloids, and alginates are provided along with their characteristics and uses.
Redefining HCI: How to Go from Hyper Converged to Hybrid Cloud Infrastructure – NetApp
The hyper converged infrastructure (HCI) market is entering a new phase of maturity. A modern HCI solution requires a private cloud platform that integrates with public clouds to create a consistent hybrid multi-cloud experience.
During this webinar, NetApp and an IDC guest speaker covered what led to the next generation of hyper converged infrastructure and which five capabilities are required to go from hyper converged to hybrid cloud infrastructure.
Hipskind Cloud Services offers comprehensive data protection solutions including backup, disaster recovery, and archiving through its Infrastructure as a Service platform. Key benefits include lowering costs by transforming capital expenditures to operational expenses, improving productivity and efficiency. Hipskind utilizes industry-leading Commvault Simpana software across its SSAE16 and SOC2 certified data centers to provide scalable, secure protection of customer data.
See how AWPRx used Jitterbit to successfully move its business to the cloud. Jitterbit was used to migrate, synchronize, and replicate data between force.com, private clouds, and Salesforce.
If you visit Revlon, Inc.’s data center in Oxford, North Carolina, don’t blink, or you’ll miss the infrastructure. Just two racks house roughly 3.6PB and 800 virtual servers that process an average of 14,000 transactions per second (TPS) from systems around the world, with 99.9999% uptime. When people walk into our data center, they ask, “That’s it?” The answer is yes, and it runs everything. Find out more about Revlon and NetApp here: http://www.netapp.com/us/campaigns/builton/?REF_SOURCE=smctwitter-initiative-builton
Accelerating Cloud Data Warehouse Adoption using Hexaware’s Cloud EDMA – Hexaware Technologies
Hexaware assessed the current data landscape for the US arm of one of the largest stock exchanges in Europe and came up with an option to build a highly scalable and agile data warehouse on cloud leveraging their Cloud EDMA platform.
Azure Migration
Azure migration is the process of moving your workloads to the Azure cloud. This can include migrating your infrastructure, databases, and applications. Azure migration can help you improve your scalability, reliability, and security, while also reducing your costs. Csharptek is a trusted Microsoft solution partner in Digital and Innovation (Azure) for Azure migration. We have a team of experienced and certified Azure professionals who can help you with every aspect of your migration. We offer a variety of services to meet your needs, and we're committed to helping you achieve your business goals.
The document is a collection of case studies for Riverbed technology. It provides summaries of how Riverbed solutions helped various organizations including Fortune 500 companies across different industries like manufacturing, retail, healthcare, transportation and more. The case studies describe challenges like slow application performance over WAN, need for server consolidation and mobility support. Riverbed deployments delivered benefits like bandwidth reduction, accelerated application performance, productivity gains and cost savings.
Risc and velostrata 2 28 2018 lessons_in_cloud_migration – RISC Networks
Learn how to accelerate and de-risk your cloud migration project.
Despite the surge in enterprises migrating applications to the public cloud, more than half of all projects are delayed or over budget, and an even greater number are more difficult than expected.
Cloud migrations don’t begin when you start moving applications into the cloud. They begin with your application landscape discovery and assessment. The second phase comprises the actual migration, where applications are moved to the public cloud. Working with purpose-built, enterprise-grade cloud migration platforms, especially those that partner to integrate both phases, greatly simplifies and accelerates projects.
RISC Networks and Velostrata have teamed up to deliver this webinar where we’ll share real-world examples, tips, and tricks on crafting a seamless cloud migration from start to completion.
Value Journal, a monthly news journal from Redington Value Distribution, intends to update the channel on the latest vendor news and Redington Value’s Channel Initiatives.
Key stories from the August Edition:
•AWS Announces General Availability of AWS IoT SiteWise
•Dell Technologies Explores VMware Spin-Off
•HPE To Drive Edge-to-Cloud Strategy With Acquisition Of Silver Peak
•Oracle Announces Oracle Dedicated Region Cloud@Customer
•Fortinet Acquires Cloud Security and Networking Innovator OPAQ Networks
•Nexthink Launches Experience Platform
•Nutanix Sheds Light On Smart City Development Expectations
•Palo Alto Networks Launches World’s First ML-Powered NGFW
•57% of UAE IT Heads See Challenge In Gaining Holistic Network Visibility: VMware
•Mimecast Acquires MessageControl
•Cybersecurity Top Priority In COVID-19 Recovery: CrowdStrike Report
•Cohesity Launches Consumption-Based Subscription Pricing for MSP Partner
•Kaspersky’s New VPN Secure Connection Ensures User Privacy
•Veeam Announces New Competencies To EMEA ProPartner Program
•82% of UAE Remote Workers Have Gained Cybersecurity Awareness: Trend Micro
•Imperva Unveils Cloud Data Security Solution
Cloud computing case studies with ProfitBricks IaaS – ProfitBricks
ProfitBricks cloud computing case study with one cloud user in the USA (Cloud Pharmaceuticals) and one in Germany (DriveNow). Cloud case studies include cloud use cases and cloud architecture overviews presented at Cloud Expo Santa Clara.
Cloud computing adoption in SAP technologies – sveldanda
Cloud computing is emerging as an exciting trend in ICT, and with this presentation we explore opportunities for adopting cloud computing in SAP technologies.
Why you should trust Stack Harbor with your data
The most performance- and security-oriented Canadian cloud company.
Learn more about our all-SSD instances, comparable to and outperforming AWS, Azure, SoftLayer, iWeb, etc.
Gartner IT Symposium 2013: Delivering IT-as-a-Service with Cloud Brokering an... – Gravitant, Inc.
This document discusses how Gravitant provides a cloud brokering and management solution that delivers IT-as-a-Service using a cloud solution factory. Key benefits highlighted include speeding up the process of delivering application infrastructure, reducing costs by up to 50%, and providing flexibility, control and governance over multiple cloud environments and services. The solution aims to simplify cloud complexity and allow organizations to choose the best cloud providers and offerings for their needs. An example case study with the State of Texas is provided, showing how Gravitant helped achieve a 40% reduction in cloud spending and a decrease in IT solution delivery times from over 9 months to less than 1 month.
Juniper Networks' QFabric is an innovative data center fabric that provides a flattened, single-tier network architecture with any-to-any connectivity between devices. This allows for rapid deployment of services by eliminating bottlenecks and simplifying network management. QFabric also improves cost efficiency by reducing complexity, scaling more easily, and lowering power consumption and space needs compared to traditional hierarchical network designs. The document examines the business benefits of QFabric, such as rapid service provisioning, lower costs, increased efficiency, and improved resiliency and security for data center networks.
Overview of Cloud Computing from the CFO perspective. Focuses on business advantages, costs, risks, and organizational impact across a wide range of emerging platforms.
When it comes to the cloud, Gartner may have said it best:
“By 2020, a corporate ‘no-cloud’ policy will be as rare as a corporate ‘no-internet’ policy is today.”
If your organization is still skeptical of the cloud, now is the time to take a closer look. Faster implementation timelines and reduced maintenance costs are just two reasons why the cloud is becoming the standard across all industries.
In our webinar, we dispelled common concerns and explored the benefits of operating in the cloud. We also provided real-world examples of companies that have taken the leap and discovered just how much better business works in the cloud.
This document discusses how continuous delivery of software is putting pressure on security teams to keep up with frequent releases. It describes how leading companies are using Fortify's application security solutions to scan more applications faster, better prioritize issues, and integrate security testing throughout development. By shifting security left to earlier phases, these companies find and fix vulnerabilities sooner, reducing remediation time and allowing for faster software delivery cycles to support business needs. The document surveys software security operations at several large financial, energy, and technology companies to evaluate how Fortify helps with scan setup, performance, triaging, remediation, and scalability.
The document discusses Oracle's Customer 2 Cloud program which allows customers to convert existing on-premise software spending and unused seat licenses into Oracle cloud application services. The program provides attractive financial incentives to move to cloud services, rapid deployment options through packaged integrations, and expert guidance from partners like IBM to help customers seamlessly transition to the cloud.
Allegro Group, a large Polish e-commerce company, was struggling with an outdated and fragmented data management system that could not handle their rapid growth. They implemented the Oracle Exadata database platform to centralize their data and gain real-time insights. This improved performance dramatically, with query times improving by 48x and data access by 15x. A financial analysis projected the system would deliver $7.5 million in savings over three years from improved productivity, fraud prevention, and revenue growth.
California State University, Fullerton implemented a major initiative involving a digital print center managed by Xerox and replacing aging printers with new Xerox multifunction devices. This was projected to save the university $554k total through operational efficiencies and cost reductions. The digital print center provides printing and finishing services cheaper than third parties, saving an estimated $304k over 5 years. Standardizing the printer fleet reduced costs by 26%, saving $250k over 3 years. The investments helped make processes more digital, mobile, and sustainable while preparing students for technology careers.
The document discusses a new hyperscale data center solution from Ericsson called Ericsson Hyperscale Datacenter System 8000. It claims this solution can deliver significant cost savings for enterprise data centers by enabling higher utilization rates compared to traditional architectures. Specifically, it argues the solution can achieve CPU utilization up to 4 times higher, network utilization 26% higher, and allow one administrator to manage thousands of servers rather than hundreds. These efficiencies are achieved through a pooled infrastructure that allows dynamic allocation of resources and improved matching of capacity to workload demands.
- Craft a compelling RFP Executive Summary that includes quantified measures of business impact and KPIs.
- Prepare a Business Value Assessment (BVA) of their existing solution’s business value.
- Prepare an executive-ready presentation that is included as an appendix to the RFP.
- Provide existing RFP customers with a “cost to conduct an RFP” calculator.
- Estimate the full cost of going out to multiple RFPs.
Working with the Mainstay team, the Cisco IOT Manufacturing Marketing team combined research from manufacturing trade associations, management consulting research and an internal benchmarking project to create an Executive Briefing Presentation that would educate CxOs on the opportunities IOT can provide. This content was also repurposed to create a manufacturing IOT whitepaper to provide an asset to entice prospective customers to consider Cisco’s IOT offerings.
Kofax turned to Mainstay to help define the key value drivers and impact levels to help promote their Claims Automation Solution. Working closely with Kofax’s product team and working with key customer references, Mainstay was able to build a very compelling infographic that provides a simple, rapid way to digest a very complex solution.
Mainstay was introduced to Bluewolf through their relationship with Oracle and brought our team in to help capture the business value story at Kele. Working with the Bluewolf sales team and the Kele project sponsor, Mainstay was able to develop a quantitative view of the business value achieved. The story focused on the impact of developing a marketing automation solution to benefit Kele’s customers by providing greater customer support and a deeper partnership with their clients.
Mainstay Company was contracted by Texas Bank to develop customer personas to enhance their marketing efforts. The personas would identify key influencers in industries like financial services and banking, and understand their needs, preferences and buying decisions related to Texas Bank's solutions and services. Mainstay conducted research including interviews with Texas Bank staff and customers to develop composite personas representing important user groups. The personas would provide a focus on end users to improve communication, innovation and marketing strategies at Texas Bank.
Mainstay provides various conference content management services to help companies develop, produce, and distribute content to promote their solutions at conferences. These services include developing white papers, presentations, videos, and success stories to share at conferences, as well as providing consulting resources at client booths. Mainstay works with clients to identify key content needs and assets, prioritize projects, and produce content through various phases from planning to execution. Examples of content produced for clients include white papers, executive presentations, videos, ROI case studies, and interactive calculators.
1) The document outlines two social media packages - Package A and Package B - offered by Mainstay Company LLC to leverage customer value messaging and promote social sharing.
2) Package A includes content for 10 social media posts on Twitter, 2 blog posts, 5 posts on LinkedIn, 3 slides for Slideshare, 1 press release, and a 4-minute customer video testimonial.
3) Package B builds on Package A and also includes a 10-minute webinar and a 3-minute podcast derived from the webinar. The packages are designed to spread content across multiple social media channels and increase marketing and sales efforts.
This document discusses how improving the customer experience is critical for insurance companies to drive growth. It states that customer experience, especially during the claims process, has become the new battleground for attracting and retaining customers. The document recommends that insurance companies implement "next practices" like automated claims processing using capture technologies to streamline the claims process and enhance the customer experience. This improves customer satisfaction and loyalty, reduces churn, and boosts the bottom line through increased revenue and customer retention.
The Evolution of Meme Coins A New Era for Digital Currency ppt.pdf – Abi john
Analyze the growth of meme coins from mere online jokes to potential assets in the digital economy. Explore the community, culture, and utility as they elevate themselves to a new era in cryptocurrency.
Increasing Retail Store Efficiency How can Planograms Save Time and Money.pptx – Anoop Ashok
In today's fast-paced retail environment, efficiency is key. Every minute counts, and every penny matters. One tool that can significantly boost your store's efficiency is a well-executed planogram. These visual merchandising blueprints not only enhance store layouts but also save time and money in the process.
Generative Artificial Intelligence (GenAI) in Business – Dr. Tathagat Varma
My talk for the Indian School of Business (ISB) Emerging Leaders Program Cohort 9. In this talk, I discussed key issues around adoption of GenAI in business - benefits, opportunities and limitations. I also discussed how my research on Theory of Cognitive Chasms helps address some of these issues
AI and Data Privacy in 2025: Global Trends – InData Labs
In this infographic, we explore how businesses can implement effective governance frameworks to address AI data privacy. Understanding it is crucial for developing effective strategies that ensure compliance, safeguard customer trust, and leverage AI responsibly. Equip yourself with insights that can drive informed decision-making and position your organization for success in the future of data privacy.
This infographic contains:
- AI and data privacy: Key findings
- Statistics on AI data privacy in today’s world
- Tips on how to overcome data privacy challenges
- Benefits of AI data security investments.
Keep up-to-date on how AI is reshaping privacy standards and what this entails for both individuals and organizations.
This is the keynote of the Into the Box conference, highlighting the release of the BoxLang JVM language, its key enhancements, and its vision for the future.
Designing Low-Latency Systems with Rust and ScyllaDB: An Architectural Deep Dive – ScyllaDB
Want to learn practical tips for designing systems that can scale efficiently without compromising speed?
Join us for a workshop where we’ll address these challenges head-on and explore how to architect low-latency systems using Rust. During this free interactive workshop aimed at developers, engineers, and architects, we’ll cover how Rust’s unique language features and the Tokio async runtime enable high-performance application development.
As you explore key principles of designing low-latency systems with Rust, you will learn how to (see the sketch after this list):
- Create and compile a real-world app with Rust
- Connect the application to ScyllaDB (NoSQL data store)
- Negotiate tradeoffs related to data modeling and querying
- Manage and monitor the database for consistently low latencies
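The workshop itself uses Rust with the Tokio runtime; purely to illustrate the "connect and query" step, the following minimal Python sketch uses the cassandra-driver package, which works because ScyllaDB speaks the Cassandra CQL protocol. The contact point, keyspace, and table names are hypothetical.

```python
from datetime import datetime, timezone

from cassandra.cluster import Cluster  # pip install cassandra-driver

# Connect to a (hypothetical) local ScyllaDB node on the default CQL port.
cluster = Cluster(["127.0.0.1"], port=9042)
session = cluster.connect()

# Create a demo keyspace and a simple time-series table.
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.set_keyspace("demo")
session.execute("""
    CREATE TABLE IF NOT EXISTS readings (
        sensor_id text, ts timestamp, value double,
        PRIMARY KEY (sensor_id, ts)
    )
""")

# Prepared statements avoid re-parsing the query and keep latency predictable.
insert = session.prepare(
    "INSERT INTO readings (sensor_id, ts, value) VALUES (?, ?, ?)"
)
session.execute(insert, ("sensor-1", datetime.now(timezone.utc), 21.5))

# Query by partition key and print the rows.
for row in session.execute("SELECT * FROM readings WHERE sensor_id = 'sensor-1'"):
    print(row.sensor_id, row.ts, row.value)

cluster.shutdown()
```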
Quantum Computing Quick Research Guide by Arthur Morgan – Arthur Morgan
This is a Quick Research Guide (QRG).
QRGs include the following:
- A brief, high-level overview of the QRG topic.
- A milestone timeline for the QRG topic.
- Links to various free online resource materials to provide a deeper dive into the QRG topic.
- Conclusion and a recommendation for at least two books available in the SJPL system on the QRG topic.
QRGs planned for the series:
- Artificial Intelligence QRG
- Quantum Computing QRG
- Big Data Analytics QRG
- Spacecraft Guidance, Navigation & Control QRG (coming 2026)
- UK Home Computing & The Birth of ARM QRG (coming 2027)
Any questions or comments?
- Please contact Arthur Morgan at [email protected].
100% human made.
Book industry standards are evolving rapidly. In the first part of this session, we’ll share an overview of key developments from 2024 and the early months of 2025. Then, BookNet’s resident standards expert, Tom Richardson, and CEO, Lauren Stewart, have a forward-looking conversation about what’s next.
Link to recording, presentation slides, and accompanying resource: https://bnctechforum.ca/sessions/standardsgoals-for-2025-standards-certification-roundup/
Presented by BookNet Canada on May 6, 2025 with support from the Department of Canadian Heritage.
Role of Data Annotation Services in AI-Powered Manufacturing – Andrew Leo
From predictive maintenance to robotic automation, AI is driving the future of manufacturing. But without high-quality annotated data, even the smartest models fall short.
Discover how data annotation services are powering accuracy, safety, and efficiency in AI-driven manufacturing systems.
Precision in data labeling = Precision on the production floor.
What is Model Context Protocol (MCP) - The new technology for communication bw... – Vishnu Singh Chundawat
The MCP (Model Context Protocol) is a framework designed to manage context and interaction within complex systems. This SlideShare presentation will provide a detailed overview of the MCP Model, its applications, and how it plays a crucial role in improving communication and decision-making in distributed systems. We will explore the key concepts behind the protocol, including the importance of context, data management, and how this model enhances system adaptability and responsiveness. Ideal for software developers, system architects, and IT professionals, this presentation will offer valuable insights into how the MCP Model can streamline workflows, improve efficiency, and create more intuitive systems for a wide range of use cases.
Complete Guide to Advanced Logistics Management Software in Riyadh.pdf – Software Company
Explore the benefits and features of advanced logistics management software for businesses in Riyadh. This guide delves into the latest technologies, from real-time tracking and route optimization to warehouse management and inventory control, helping businesses streamline their logistics operations and reduce costs. Learn how implementing the right software solution can enhance efficiency, improve customer satisfaction, and provide a competitive edge in the growing logistics sector of Riyadh.
Noah Loul Shares 5 Steps to Implement AI Agents for Maximum Business Efficien... - Noah Loul
Artificial intelligence is changing how businesses operate. Companies are using AI agents to automate tasks, reduce time spent on repetitive work, and focus more on high-value activities. Noah Loul, an AI strategist and entrepreneur, has helped dozens of companies streamline their operations using smart automation. He believes AI agents aren't just tools—they're workers that take on repeatable tasks so your human team can focus on what matters. If you want to reduce time waste and increase output, AI agents are the next move.
Dev Dives: Automate and orchestrate your processes with UiPath Maestro - UiPathCommunity
This session is designed to equip developers with the skills needed to build mission-critical, end-to-end processes that seamlessly orchestrate agents, people, and robots.
📕 Here's what you can expect:
- Modeling: Build end-to-end processes using BPMN.
- Implementing: Integrate agentic tasks, RPA, APIs, and advanced decisioning into processes.
- Operating: Control process instances with rewind, replay, pause, and stop functions.
- Monitoring: Use dashboards and embedded analytics for real-time insights into process instances.
This webinar is a must-attend for developers looking to enhance their agentic automation skills and orchestrate robust, mission-critical processes.
👨🏫 Speaker:
Andrei Vintila, Principal Product Manager @UiPath
This session streamed live on April 29, 2025, 16:00 CET.
Check out all our upcoming Dev Dives sessions at https://community.uipath.com/dev-dives-automation-developer-2025/.
HCL Nomad Web – Best Practices and Management of Multiuser Environments - panagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-nomad-web-best-practices-und-verwaltung-von-multiuser-umgebungen/
HCL Nomad Web is hailed as the next generation of the HCL Notes client and offers numerous advantages, such as eliminating the need for packaging, distribution, and installation. Nomad Web client updates are installed “automatically” in the background, which significantly reduces administrative overhead compared with traditional HCL Notes clients. However, troubleshooting in Nomad Web presents unique challenges compared with the Notes client.
Join Christoph and Marc as they demonstrate how the troubleshooting process in HCL Nomad Web can be simplified to ensure a smooth and efficient user experience.
In this webinar, we will examine effective strategies for diagnosing and resolving common issues in HCL Nomad Web, including:
- Accessing the console
- Locating and interpreting log files
- Accessing the data folder in the browser's cache (using OPFS)
- Understanding the differences between single-user and multiuser scenarios
- Using the Client Clocking feature
DCI NetApp Benefits
DCI CASE STUDY
NETAPP BENEFITS SERIES
CUSTOMER PROFILE
Company Name: DCI
Website URL: www.datacenterinc.com
Industry: High Technology/Financial Services
Employees: 300
HIGHLIGHTS
NetApp Solution:
- NetApp® FAS3140, FAS2050, and FAS2020 systems with NetApp Snapshot™ and deduplication technologies; SnapManager® for Microsoft SQL Server, SnapManager for Microsoft Exchange, SnapManager for Virtual Infrastructure, and SnapManager for Oracle; SnapMirror®, SnapRestore®, FlexClone®, FlexVol®, SnapVault®, SnapDrive®, and SnapLock® Enterprise; Operations Manager and Protection Manager software
Key Benefits:
Financial:
- On track to achieve 40% ROI over four years on $1.3M investment, with payback within 23 months
- Improved storage administration productivity by the equivalent of one employee; $95K personnel cost optimization
- Avoided hiring two storage administrators, saving $720K over three years
- Saved $47K in annual traveling costs and lost productivity
- Saved $108K in energy costs over four years
- Avoided $45K in WAN costs over four years
DCI Contains IT Sprawl, Improves Business Continuity with Virtualized NetApp Platform; 40% ROI and 23-Month Payback Projected
EXECUTIVE SUMMARY
One of the nation’s first bank technology companies, Kansas-based DCI, provides core banking services to nearly 200 community banks around the country. To satisfy the high-volume data-processing requirements of its bank clients, DCI’s IT platform has to be ultradependable and fast. DCI managed to achieve high processing performance over the years, but it took increasing amounts of hardware sprawled over several data centers. Eventually, the company realized that it needed to rein in its server and storage infrastructure to control costs, better safeguard data, and support next-generation Web-based products and services. With the help of ISG Technology, Inc., a leading Midwestern
computer technology firm, DCI chose to build a highly virtualized server and storage environment based on NetApp and VMware technology. Today, the new platform supports DCI’s major business applications, databases, and systems, including its flagship iCore360® application, Microsoft SQL and Exchange servers, Citrix, Blackberry, and more.
The move to a VMware on NetApp platform enabled DCI to consolidate its IT infrastructure on 80% fewer servers, cutting energy consumption and allowing the company to administer the whole environment with significantly fewer resources. Furthermore, it built a fully automated, centralized disaster recovery center in Oklahoma City and adopted consistent data-backup processes companywide. All of these moves have added new levels of reliability and scalability to the firm’s IT backbone, laying the foundation for the company’s move into IT-as-a-Service and cloud computing services.
According to an assessment by Mainstay Partners, the NetApp investment has put DCI on track to realize total benefits of approximately $2.1M over four years, including savings from avoiding outlays for an alternative high-availability system and savings from productivity gains, hardware consolidation, and lower energy consumption. The result: DCI is expected to realize a 40% ROI over four years and to break even in 23 months.
CUSTOMER PROFILE
Formed in 1963 by four Kansas bankers, DCI was the first bank technology company of its kind west of the Mississippi and remains unique in ownership, history, and product offering. Today, DCI is still a private company with many client banks as shareholders and board members, making the company more directly responsive to the needs of all its clients.
DCI serves nearly 200 community banks nationwide and enjoys a consistent contract renewal rate of over 95%. DCI’s reputation is one of quality service, innovative development, and a collaborative focus on the needs of its community bank clients.
DCI’s product offering includes iCore360°, its latest ASP.NET system for complete core processing, bank management, and automation. Designed by banks to automate and streamline workflow, iCore360° is delivered via an ordinary PC and Internet browser, requiring no software installation or costly servers.
THE CHALLENGE
The financial institutions that depend on DCI are among the most demanding customers around. They execute millions of dollars of transactions every day, everything from checking deposits to loan processing, using DCI’s computing resources and iCore360° applications. These banks can’t afford system downtime or the loss of critical financial data, which can have an immediate impact on the business.
To ensure that the company could run its business without downtime or data loss that would impact its customers, DCI built a large IT network spanning data centers in three states and a disaster recovery site in Oklahoma. But as DCI’s business expanded, so did its data-processing and storage needs.
[Figure 1: Operational Benefits. With the VMware on NetApp infrastructure, DCI realized significant operational benefits and improvements, positioning the company for future growth: reductions in SA server provisioning time, overall SA maintenance time, database recovery and testing effort, DBA provisioning workload, capacity optimization cycle times, and the number of servers and racks, along with an increase in the data recovery success rate.]
“Storage was just growing all over the place,” said Robert Ross, DCI’s Senior Vice President of Network/Technical Services and Chief Security Officer. The challenge was compounded by occasional disk-drive replacement and a backup strategy that relied heavily on inefficient tape drives at remote sites.
Maintaining efficient data backup and recovery capabilities was a key concern, in part because of new regulatory requirements designed to tighten information management in the financial services sector. DCI also wanted to improve the efficiency of IT and thereby control costs. Administrators were spending too much time backing up DCI’s data, and recovery performance was less than optimal. “We had to offer a better business continuity environment to give customers the backup capabilities they needed,” Ross said.
The sprawling infrastructure also burdened DCI’s environment in Denver with steep maintenance costs and high energy bills. Basic IT administration tasks like database and server provisioning, and data backup and recovery testing, were becoming a costly time sink.
All of these factors threatened DCI’s strategic growth potential and the company’s plans to deliver services using new technologies like cloud computing. “We wanted better growth management,” Ross said, “including the ability to deliver a flexible IT solution and offer our version of cloud computing as an IT service.”
WHY VMWARE ON NETAPP?
DCI engaged ISG Technology, Inc. to examine the potential of virtualization to overcome the technical limitations and rising costs of the existing infrastructure. ISG is an IT solutions provider and a trusted advisor to DCI. After reviewing DCI’s architecture and capacity, the firm proposed a major overhaul, employing virtualization technology from VMware and NetApp to consolidate the company’s 28 servers into a lean, flexible platform composed of just six physical servers connected to a centralized, shared storage area network (SAN).
NetApp’s network-based shared storage was an essential component of DCI’s virtualization plan because without it, many business-continuity features of a virtualized environment, including distributed resource management, high availability, and advanced VMware features like vMotion, would have been impossible to implement. NetApp’s storage management capabilities also enabled a cost-effective disaster recovery model known as virtual-to-virtual DR, which provides enterprise-class disaster recovery at a low cost.
DCI ruled out the option of beefing up its nonvirtualized architecture because of the prohibitive cost. In fact, an analysis showed that to match the DR and backup capabilities of the VMware on NetApp platform, DCI would need to double the number of servers, add three employees to administer the extra machines, upgrade its wide area network (WAN), and spend significantly more on energy costs. The total cost of the expanded platform would have exceeded $1M.
Choosing the virtualization strategy, DCI moved ahead with ISG’s recommendations, installing VMware ESX virtual servers and NetApp FAS systems at its data centers. According to Ross, “We integrated both at the same time to give us the ability to integrate our data, create an extremely reliable business-continuity environment, and support the technological innovations needed to drive new business initiatives such as cloud-based service delivery.”
With ISG’s assistance, the project moved quickly: DCI’s Hutchinson (Kansas) data center went virtual in October 2008; Commerce (California) and Golden (Colorado) in June 2009; and Oklahoma City in August 2009. “ISG had a fast-start program that enabled us to complete the project very efficiently, rapidly, and successfully,” Ross said. “The implementation speed we were able to attain was amazing.”
The move to the VMware on NetApp infrastructure achieved the company’s initial business and technical objectives, allowing DCI to virtualize and consolidate its Windows
environment and then monitor and dynamically configure the platform to better serve its customers. The platform’s centrally managed, virtualized storage infrastructure also allowed DCI to adopt standard, consistent backup processes across every site and to automate disaster recovery processes. DCI now encounters fewer errors during backups, despite doubling the backup frequency. “Our objective was to ensure rapid data restore and DR support for our customers, and the NetApp and VMware infrastructure enabled us to achieve this,” Ross said.
Significant efficiency and productivity gains are being achieved in the consolidated, virtualized environment through the use of NetApp software tools, including SnapManager for Oracle, SnapManager for Exchange, SnapManager for SQL Server, SnapManager for Virtual Infrastructure, Snapshot, SnapMirror, FlexVol, and FlexClone. These software tools automate storage management tasks, decrease backup times, and optimize disk space in support of the company’s business-critical iCore360° banking applications, Microsoft SQL databases, and other systems. Today, DCI is managing more than 100 terabytes of storage with one full-time employee.
BUSINESS BENEFITS
Mainstay Partners quantified the benefits of DCI’s investment in the NetApp storage solution and has projected more than $800K in net savings over four years ($2,091,240 in total benefits less $1,266,067 in total costs; see Table 1). The following sections detail these benefits.
Infrastructure Optimization
The VMware on NetApp infrastructure helped DCI shrink its hardware footprint and contain server sprawl, enabling the company to run its systems and applications using fewer physical servers and storage devices. For example, using NetApp’s deduplication technology, which eliminates multiple copies of data, DCI cut the amount of storage space it consumes by 50%.[1] It cut another 25% through the use of thin provisioning, a NetApp technology that lets administrators allocate only as much storage space as is actually needed by the operating systems and applications.
Overall, the virtualization project allowed DCI to consolidate IT operations onto just 14 VMware ESX servers, an 80% reduction in physical machines, as shown in Figure 2. The move to VMware on NetApp also enabled DCI to shut down three smaller processing centers and helped DCI avoid outlays for three new physical servers a year to keep up with data-processing growth.
[Figure 2: Decrease in the Number of Physical Servers: from 70 servers in the previous environment to 14 in the NetApp environment.]
[Figure 3: Improvement in Database Recovery Time: time to recover a database cut from 30 minutes in the previous environment to 2 minutes in the NetApp environment.]
[1] NetApp is the only storage technology that deduplicates primary data at the block level. These primary storage savings are subsequently multiplied and extended across the storage environment during the backup, replication, disaster recovery, and archiving processes. Deduplicating also results in more rapid data replication routines.
Real Estate and Energy Savings
Today, the company’s IT infrastructure consumes about 75% less floor space, offering DCI the possibility of future savings from real estate consolidation and sales. The smaller hardware footprint also takes less energy to power and cool. After implementing its virtualized platform, the company has seen power consumption fall by 22%, saving an estimated $108K in energy costs over four years while also helping to cut carbon emissions.
Productivity Gains
The introduction of automated storage-management tools like NetApp SnapManager, SnapMirror, and FlexVol has significantly boosted staff productivity. For example, system administrators who used to spend 4–8 hours per day checking and troubleshooting backup jobs now use Snapshot technology to do the same job in half the time and without impacting system performance. “NetApp Snapshot copies eliminate the redundancy inherent in traditional backups,” said Brock Benard, DCI’s lead network engineer. “Snapshot shortens backup windows and provides 100% data validation.” As a result of these efficiency gains, DCI expects to forego hiring two new administrators, saving approximately $720K over three years.
Meanwhile, administrators are using FlexVol to resize and optimize storage capacity about 60% faster than before. These efficiencies have resulted in smoother-running applications and have decreased the number of timeouts affecting DCI’s mission-critical ATM bank-machine applications.
More Reliable and Rapid Data Recovery
With its NetApp storage infrastructure, DCI now expects to take 93% less time to perform Oracle, Microsoft SQL Server, and Exchange Server database recoveries, from 30 minutes down to just 2 minutes. NetApp FlexClone technology, which creates instant data copies, was seen as key to this improvement. In addition, DCI was able to start performing database recovery testing remotely, which is saving the company $47K over four years in travel costs,[2] as shown in Figure 4.
[Figure 4: Data Recovery Time: from 4 hours in the previous environment to 0.25 hours in the NetApp environment.]
[Figure 5: Decrease in Database Provisioning Time: from 2 hours in the previous environment to 1 hour in the NetApp environment.]
[Figure 6: Decrease in Server Provisioning Time: from 12 hours in the previous environment to 1 hour in the NetApp environment.]
[2] In the previous environment, six team members travelled to the Colorado data center for DR testing for one week each year, a trip that is no longer required.
Data recovery performance has also improved since moving to NetApp. With NetApp Snapshot and SnapManager for Virtual Infrastructure, DCI backs up data faster and more frequently with little impact to system performance. It also improved average data recovery time by more than 90%.
Faster Provisioning, Easier Maintenance
Efficiency gains were reported by database administrators, who are now using automated storage-management tools to provision databases 50% faster. DBAs also use NetApp FlexClone to rapidly reload more than 30 databases during disaster recovery testing. Over three years, the company expects to save an estimated $16K in DBA labor costs from using these storage-optimization tools.
Similarly, system administrators in DCI’s virtual environments are provisioning servers in less than an hour, a job that used to take 12 hours in the previous environment. They are also spending 75% less time performing routine maintenance on servers, a productivity improvement (valued at about $95K per year) that lets SAs devote more time to higher-level tasks like capacity planning, user support, and data recovery.
Supporting Growth and New Initiatives
A key driver behind DCI’s investment in virtualization and NetApp was to create a modern IT platform that could scale to accommodate DCI’s strategic goals and growth trajectory. The launch of a dedicated disaster recovery center in Oklahoma City, based on NetApp technology, was a critical step in building the new platform. The DR center provides every office with reliable, consistent DR capabilities and eliminates single points of failure in the storage environment.
The enhanced DR environment, combined with virtualization, is providing the foundation of a high-availability platform that DCI needs for future initiatives, including moving its iCore360° product to a Web-based service in 2010 by using .NET and Silverlight technology.
DCI also hopes to utilize the virtualized platform to launch IT-as-a-Service (ITaaS) and private cloud service offerings, which will depend on efficiently partitioning and allocating parts of its infrastructure to customers. As DCI’s Ross said, “Our NetApp and VMware infrastructure is the cornerstone for our future private cloud and ITaaS initiatives.”
THE BOTTOM LINE
According to Mainstay’s assessment, DCI will achieve a 40% ROI in the first four years on a $1.3M investment in the NetApp platform and will realize $2.1M in total benefits. The assessment projects that the company will achieve payback on its investment within 23 months. Figure 7 breaks down the investment costs among hardware, software, consulting, and other costs, which include business and internal IT resource costs for design, development, and training, as well as asset write-off costs. Figure 8 summarizes the costs and benefits over four years, and Table 1 breaks out these figures on a yearly basis.
[Figure 7: Investment Cost Breakdown: Hardware Cost 82%, Other 9%, Software Cost 5%, Consulting 4%.]
[Figure 8: Summary of Costs and Benefits Over 4 Years: Total Investment -$1,266,067 (Hardware Cost -$1,036,537, Software Cost -$67,530, Miscellaneous Costs -$162,000); Total Benefits $2,091,240 (Hardware Savings $1,060,500, Productivity Savings $877,740, Miscellaneous Savings $153,000); Net Benefits $825,173.]

Table 1: Financial Summary

Initial Investment     2007        2008      2009      2010      Total
Hardware Cost          $1,036,537  $0        $0        $0        $1,036,537
Software Cost          $42,645     $8,295    $8,295    $8,295    $67,530
Miscellaneous Cost     $62,000     $100,000  $0        $0        $162,000
Total Cost             $1,141,182  $108,295  $8,295    $8,295    $1,266,067
Hardware Savings       $0          $523,500  $268,500  $268,500  $1,060,500
Productivity Savings   $0          $292,580  $292,580  $292,580  $877,740
Miscellaneous Savings  $0          $51,000   $51,000   $51,000   $153,000
Total Benefits         $0          $867,080  $612,080  $612,080  $2,091,240

Summary:
Total Cost      $1,266,067
Total Benefits  $2,091,240
ROI             40%
Payback         23 months