An introduction to Amazon RDS for SQL Server, how to lower the cost of running SQL Server on Amazon RDS, and how to migrate your data into and out of Amazon RDS for SQL Server.
2015 SQL Pass Summit Breakfast session #1
1. Breakfast Session #1
Amazon RDS for SQL Server
Optimizing Cost & Data Migration
Ghim-Sim Chua, Sr. Technical Product Manager, AWS
2. What to expect from this session
Quick overview of Amazon RDS for SQL Server
Lowering the costs of running Amazon RDS for SQL Server
Migrating your data into and out of Amazon RDS for SQL Server
3. What is Amazon RDS for SQL Server?
[Diagram: the management tasks Amazon RDS for SQL Server handles for you]
Power, HVAC, network
Rack and stack
Server maintenance
OS installation
OS patches
DB software installs
DB software patches
Database backups
High availability
Scaling
4. Amazon RDS for SQL Server Options
Express, Web, Standard, Enterprise
License Included, Bring Your Own License
10. Other Amazon RDS Instance Types
M3 (Standard)
vCPUs: 1 - 8
RAM: 3GB to 30GB
R3 (Memory Optimized)
vCPUs: 2 - 32
RAM: 15GB to 244GB
11. Optimize your RDS SQL Server for cost
Region
Instance
Storage type
Multi-AZ
Pricing model
Licensing model
12. On Demand vs Reserved Instances
On-Demand
Pay by the hour
No term commitment
Reserved Instances
No upfront
Partial upfront
All upfront
13. Import/Export data options
1. Import and Export Wizard
2. Bulk Copy (bcp utility)
3. AWS Database Migration Service
Importing and Exporting SQL Server Data
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/SQLServer.Procedural.Importing.html
14. AWS Database Migration Service
Start your first migration in 10 minutes or less
Keep your apps running during the migration
Replicate within, to or from Amazon EC2 or RDS
Move data to the same or different database engine
Sign up for preview at aws.amazon.com/dms
15. Using the AWS Database Migration Service
[Architecture diagram: application users and the source database on the customer premises, connected over the Internet or a VPN to the AWS Database Migration Service and the target database in AWS]
• Start a replication instance
• Connect to source and target databases
• Select tables, schemas or databases
Let the AWS Database Migration Service create tables, load data and keep them in sync
Switch applications over to the target at your convenience
16. Replication and data integration
Replicate data in on-premises databases to AWS
Replicate OLTP data to Amazon Redshift
Integrate tables from third-party software into your reporting or core OLTP systems
Hybrid cloud is a stepping stone in migration to AWS
17. Cost effective and no upfront costs
T2 pricing starts at $0.018 per Hour for T2.micro
C4 pricing starts at $0.154 per Hour for C4.large
50GB GP2 storage included with T2 instances
100GB GP2 storage included with C4 instances
Data transfer inbound and within AZ is free
Data transfer across AZs starts at $0.01 per GB
[Diagram: included storage covers swap space, logs, and cache]
18. Q & A
Check out Amazon RDS for SQL Server and
AWS Database Migration Service!
Thank you!
Editor's Notes
#3: In today’s session, I want to give a quick overview of what Amazon RDS for SQL Server is, as well as talk about how you can lower your costs of running SQL Server in RDS and how to migrate your data into and out of Amazon RDS for SQL Server.
#4: For those who are not familiar, Amazon RDS for SQL Server is a fully managed SQL Server database service. By managed service, we mean that you can launch a wide variety of SQL Server databases with a few clicks of a button. It is easy to set up, operate, and scale SQL Server databases. Amazon RDS takes care of mundane DBA tasks such as patching and backups, and automatically replaces failed hosts for you. You can also monitor, manage, scale, and create high-availability SQL Server databases easily.
#5: RDS for SQL Server offers many flavors of SQL Server, from SQL Server 2008 R2 and SQL Server 2012 to SQL Server 2014, which we launched earlier this week. You can also pick from SQL Server Express, Web, Standard, and Enterprise Editions, in License Included and Bring-Your-Own-License offerings. With the License Included offering, you pay an hourly rate that covers all the underlying hardware operating costs as well as the software license fees necessary to run SQL Server. If you have purchased your own SQL Server license, you can bring it with License Mobility under Software Assurance and pay only for the cost of operating the hardware.
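The deck stays at the console level, but for readers who want to see where the edition and licensing choices land in code, here is a minimal boto3 sketch that provisions a License Included SQL Server Standard Edition instance. This sketch is an illustration added to these notes, not part of the original talk; the identifier, instance class, storage size, and credentials are placeholders.

```python
import boto3

rds = boto3.client("rds", region_name="us-west-2")

# Launch SQL Server Standard Edition ("sqlserver-se") under the
# License Included model. Use LicenseModel="bring-your-own-license"
# instead if you bring a license via License Mobility.
rds.create_db_instance(
    DBInstanceIdentifier="my-sqlserver-db",   # placeholder name
    Engine="sqlserver-se",                    # Express/Web/Standard/Enterprise
                                              # map to sqlserver-ex/-web/-se/-ee
    LicenseModel="license-included",
    DBInstanceClass="db.m3.large",            # placeholder size
    AllocatedStorage=200,                     # placeholder, in GB
    MasterUsername="admin",                   # placeholder credentials
    MasterUserPassword="change-me-please",
)
```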
#6: A lot of our customers come to us and ask us how they can reduce their RDS bill while not sacrificing performance, manageability, scalability and durability.
#7: One of the most important cost optimizations is picking the right instance size for your workload. Not only are larger instances more costly to run, but SQL Server license costs are also based on the number of cores or vCPUs. If you select an instance that is bigger than you need, a disproportionate share of your SQL Server cost goes to licensing. We often see customers over-provisioning their SQL Server instances, resulting in a much higher RDS bill than necessary.
#8: So how do you pick the right instance to run your SQL Server? Well, one way is to understand what the demand pattern of your workload looks like. In talking to our customers, we found four generic patterns:
Constant, steady-state: your workload may require the same performance characteristics throughout the lifetime of the workload, some always-on mission critical enterprise systems fit in this category
Predictable fluctuations, but steady-state: over the lifetime of the workload, the load on the database fluctuates over predictable time intervals. This can be seen in line of business systems that receive increased load during business hours, but less outside business hours
Growing, but predictable: these workloads grow over time, as more data gets accumulated, or the business itself grows in a fairly predictable way
Fluctuating and spiky: these are workloads that have an unpredictable curve and can see unexpected spikes in load. We see this pattern in web-based consumer workloads, where there are viral spikes.
Your workload may fit more than one of these patterns at different times. If you’re a startup, you may see a combination of the latter two, driven by organic business growth and viral events.
#9: Amazon RDS for SQL Server offers a variety of instance types to meet the needs of customers with all sorts of workloads.
For instance, though T2 instances are generally less expensive than other instance types, they are burst capable, which makes them great for fluctuating and spiky workloads or workloads with predictable fluctuations. They have decent base performance, with the capability to burst up to 100% vCPU utilization when the workload needs it. You earn credits per hour when running below base performance, and the instance stores up to 24 hours’ worth of credits. You can use Amazon CloudWatch metrics to see credits and usage.
(CLICK) T2 credits are earned during low CPU utilization and used up during heavy utilization. For T2, 1 CPU credit is equal to 100% utilization of the vCPU for 1 minute, or 50% of the vCPU for 2 minutes, or 25% of the vCPU for 4 minutes.
For instance, if you use a T2.micro instance, you are given 30 initial CPU credits to ensure decent startup performance. What are 30 credits good for? You can run the instance for an hour consuming half the CPU resources.
Thereafter, you earn 6 CPU credits per hour and have a max CPU credit balance of 144 units. 6 credits per hour means you are able to consume 10% of the CPU over the course of that hour. If you consume less, the remainder gets added to the balance, up to 144 credits. You can use these credits to burst your workload up to 100% utilization of a vCPU. Once your credit is exhausted, you will go back down to your base performance of 10% utilization.
CPU credits expire 24 hours after they are earned, and they do not persist across shutdowns and startups. However, you are given the initial CPU credits again on startup.
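To make that credit arithmetic concrete, here is a back-of-the-envelope Python sketch of the T2.micro rules described above (30 launch credits, 6 credits earned per hour, a 144-credit cap, and 1 credit = one vCPU-minute at 100%). It is a simplification, not an AWS tool: it ignores mid-hour throttling and the 24-hour expiry.

```python
INITIAL_CREDITS = 30    # launch credits on a T2.micro
EARN_PER_HOUR = 6       # credits earned per hour below baseline
MAX_BALANCE = 144       # cap on the stored balance

def simulate(hourly_utilization):
    """Yield the credit balance after each hour of average vCPU utilization."""
    balance = INITIAL_CREDITS
    for util in hourly_utilization:
        spent = util * 60                     # 1 credit = 1 minute at 100%
        balance = balance + EARN_PER_HOUR - spent
        balance = max(0.0, min(balance, MAX_BALANCE))
        yield balance

# Example: idle for 12 hours, then a 2-hour burst at 100% CPU.
workload = [0.0] * 12 + [1.0] * 2
for hour, balance in enumerate(simulate(workload), start=1):
    print(f"hour {hour:2d}: {balance:6.1f} credits")
```

After 12 idle hours the balance has grown from 30 to 102 credits; the first full-burst hour spends 60 of them, and the second exhausts the balance, at which point the instance would drop back to its 10% baseline.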
#11: Other instance types that RDS offers include the standard M3, which has 1 to 8 vCPUs and 3 GB to 30 GB of memory. The memory-optimized R3 has 2 to 32 vCPUs and 15 to 244 GB of memory. If you have a memory-intensive workload, you may want to pick the R3 instance type over the M3. If your workload is less memory intensive but constant and steady-state, consider the M3 instance type.
#12: So your workload pattern provides one avenue to save on costs. If your growth is predictable, you don’t need to run over-provisioned instances from the get-go. You can run on smaller instance types until you grow out of them, then simply restart into a larger instance type.
If your workload has known fluctuations, you can do the same, scaling down and back up as needed. You can even stop, or terminate and re-provision from a snapshot, if you know you won’t need the instance for a while.
Beyond leveraging the workload pattern, here are some other factors to consider when optimizing costs. For example, instances in different regions have different prices; you may want to compare prices between regions and pick a cheaper one if network latency is not an issue. Multi-AZ also doubles the cost of your database instance, since you are paying for two instances, so we generally recommend reserving Multi-AZ for production workloads.
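If you take this scale-as-you-grow approach, the resize itself is a single API call. Here is a hedged boto3 sketch, where the identifier and target class are placeholders; note that the instance restarts when the new class is applied.

```python
import boto3

rds = boto3.client("rds", region_name="us-west-2")

# Move an instance to a different class. With ApplyImmediately=False the
# change waits for the next maintenance window; True applies it right
# away, at the cost of an immediate restart.
rds.modify_db_instance(
    DBInstanceIdentifier="my-sqlserver-db",   # placeholder
    DBInstanceClass="db.r3.xlarge",           # placeholder target size
    ApplyImmediately=False,
)
```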
#13: You can also run your RDS SQL Server instance using the on-demand or Reserved Instance pricing model. On-demand is great if you don’t want a long-term commitment: you pay by the hour, and when you are done, you can simply turn your SQL Server database off and charges stop accumulating.
If you want to save even more and have no problem committing to a term, you can purchase a 1- or 3-year Reserved Instance (RI) and save up to 60% over on-demand costs. There are three types of RIs: no upfront, partial upfront, and all upfront. Generally, savings are greatest with all-upfront RIs.
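As a sanity check on these pricing models, here is a toy comparison. The rates below are deliberately made-up placeholders, not actual AWS prices; the point is only the shape of the arithmetic.

```python
# Toy on-demand vs. all-upfront RI comparison with HYPOTHETICAL rates.
HOURS_PER_YEAR = 8760

on_demand_rate = 1.00      # $/hour, placeholder
ri_all_upfront = 5000.00   # $ one-time for a 1-year term, placeholder

on_demand_year = on_demand_rate * HOURS_PER_YEAR
savings = 1 - ri_all_upfront / on_demand_year
print(f"On-demand, 24x7 for a year: ${on_demand_year:,.0f}")
print(f"All-upfront RI:             ${ri_all_upfront:,.0f} ({savings:.0%} cheaper)")

# Break-even: the RI pays for itself once you would have run this many
# on-demand hours, so always-on workloads favor RIs.
print(f"Break-even at {ri_all_upfront / on_demand_rate:,.0f} hours")
```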
#14: Next, I’d like to talk about how you can import and export data into and out of RDS SQL Server. There are currently three options to choose from: the Import and Export Wizard, the bulk copy (bcp) utility, and the recently announced AWS Database Migration Service and AWS Schema Conversion Tool.
So when do you use which tool?
Use the Import and Export Wizard if you want to transfer small- to medium-size tables to another DB instance.
Use the bulk copy utility if you have a large quantity of data to move, say more than 1 GB, because it is more efficient.
Use the Database Migration Service if you cannot afford long downtime for your database. It keeps your database online while it makes an initial copy to the migrated instance, then continually replicates changes until you choose a short outage to cut over from one database to the other.
Use the Schema Conversion Tool if you want to migrate your data to or from a totally different database engine, such as MySQL, Oracle, or Postgres to SQL Server. The tool helps you transform tables, partitions, sequences, views, stored procedures, triggers, and functions. It automatically finds equivalent functions to transform from one database to another; where it cannot, it gives you links explaining how to do it manually.
I highly recommend reading the Importing and Exporting SQL Server Data guide listed here for more information.
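For the bcp path, the basic flow is an export (bcp out) from the source server followed by an import (bcp in) against the RDS endpoint, as the guide above describes. Here is a hedged sketch of that flow driven from Python; the table name, server names, and credentials are all placeholders.

```python
import subprocess

TABLE = "MyDatabase.dbo.MyTable"   # placeholder three-part table name
DATAFILE = "mytable.bcp"

# Export from the source server in native format (-n), which is
# suitable for SQL Server-to-SQL Server transfers.
subprocess.run(
    ["bcp", TABLE, "out", DATAFILE, "-n",
     "-S", "source-server", "-U", "sa", "-P", "secret"],
    check=True,
)

# Import into the RDS for SQL Server instance through its endpoint.
subprocess.run(
    ["bcp", TABLE, "in", DATAFILE, "-n",
     "-S", "my-sqlserver-db.xxxxxxxx.us-west-2.rds.amazonaws.com",
     "-U", "admin", "-P", "change-me-please"],
    check=True,
)
```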
#15: Like all AWS services, it is easy and straightforward to get started with the AWS Database Migration Service, or DMS.
*You can get started with your first migration task in 10 minutes or less. You simply connect it to your source and target databases; it copies the data over and begins replicating changes from source to target.
*That means you can keep your apps running during the migration, then switch over at a time that is convenient for your business.
* In addition to one-time database migration, you can also use DMS for ongoing data replication: replicate within, to, or from Amazon EC2 or RDS databases. For instance, after migrating your database, you can use DMS to replicate data into your Redshift data warehouse, cross-region to other RDS instances, or back to your on-premises database.
*With DMS, you can move data between engines. DMS supports Oracle, Microsoft SQL Server, MySQL, PostgreSQL, MariaDB, Amazon Aurora, and Amazon Redshift.
* If you would like to sign up for the preview of DMS, go to aws.amazon.com/dms.
#16: Using the AWS Database Migration Service to migrate data to AWS is simple.
(CLICK) Start by spinning up a DMS instance in your AWS environment
(CLICK) Next, from within DMS, connect to both your source and target databases
(CLICK) Choose what data you want to migrate. DMS lets you migrate tables, schemas, or whole databases
Then sit back and let DMS do the rest. (CLICK) It creates the tables, loads the data, and best of all, keeps them synchronized for as long as you need
That replication capability, which keeps the source and target data in sync, allows customers to switch applications (CLICK) over to point to the AWS database at their leisure. DMS eliminates the need for high-stakes extended outages to migrate production data into the cloud and provides a graceful switchover capability.
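Those same three steps map onto the DMS API. The boto3 sketch below is illustrative only: the service was in preview at the time of this talk, and every identifier, hostname, and credential is a placeholder. In real use you would also wait for the replication instance to become available before creating the task.

```python
import json
import boto3

dms = boto3.client("dms", region_name="us-west-2")

# 1. Start a replication instance.
instance = dms.create_replication_instance(
    ReplicationInstanceIdentifier="my-dms-instance",   # placeholder
    ReplicationInstanceClass="dms.t2.medium",
    AllocatedStorage=50,
)["ReplicationInstance"]

# 2. Connect to the source and target databases.
source = dms.create_endpoint(
    EndpointIdentifier="onprem-sqlserver", EndpointType="source",
    EngineName="sqlserver", ServerName="source-server", Port=1433,
    Username="sa", Password="secret", DatabaseName="MyDatabase",
)["Endpoint"]
target = dms.create_endpoint(
    EndpointIdentifier="rds-sqlserver", EndpointType="target",
    EngineName="sqlserver",
    ServerName="my-sqlserver-db.xxxxxxxx.us-west-2.rds.amazonaws.com",
    Port=1433, Username="admin", Password="change-me-please",
    DatabaseName="MyDatabase",
)["Endpoint"]

# 3. Select what to migrate (every table in the dbo schema here), then
#    run a full load followed by ongoing change replication (CDC).
mappings = {"rules": [{
    "rule-type": "selection", "rule-id": "1", "rule-name": "1",
    "object-locator": {"schema-name": "dbo", "table-name": "%"},
    "rule-action": "include",
}]}
dms.create_replication_task(
    ReplicationTaskIdentifier="migrate-mydatabase",    # placeholder
    SourceEndpointArn=source["EndpointArn"],
    TargetEndpointArn=target["EndpointArn"],
    ReplicationInstanceArn=instance["ReplicationInstanceArn"],
    MigrationType="full-load-and-cdc",
    TableMappings=json.dumps(mappings),
)
```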
#17: But DMS is for much more than just migration. DMS enables customers to adopt a hybrid approach to the cloud, maintaining some applications on premises, and others within AWS. There are dozens of compelling use cases for a hybrid cloud approach using DMS. For customers just getting their feet wet, AWS is a great place to keep up-to-date read-only copies of on-premises data for reporting purposes. AWS services like Aurora, Redshift and RDS are great platforms for this.
With DMS, you can maintain copies of critical business data from third-party or ERP applications, like employee data from Peoplesoft, or financial data from Oracle E-Business Suite, in the databases used by the other applications in your enterprise. In this way, it enables application integration in the enterprise.
Another nice thing about the hybrid cloud approach is that it lets customers become familiar with AWS technology and services gradually. Moving to the cloud is much simpler if you have a way to link the data and applications that have moved to AWS with those that haven’t.
#18: AWS Database Migration Service currently supports the T2 and C4 instance classes. T2 instances are suitable for developing, configuring and testing your database migration process, and for periodic data migration tasks that can benefit from the CPU burst capability.
C4 instances are designed to deliver the highest level of processor performance and achieve significantly higher packet per second (PPS) performance, lower network jitter, and lower network latency. You should use C4 instances if you are migrating large databases and are looking to minimize the migration time.
With the AWS Database Migration Service you pay for the migration instance that moves your data from your source database to your target database. Each database migration instance includes storage sufficient to support the needs of the replication engine, such as swap space, logs, and cache.
(CLICK) Inbound data transfer is free.
(CLICK) Additional charges only apply if you decide to allocate additional storage for data migration logs or when you replicate your data to a database in another region or on-premises.
#19: So thank you for attending this breakfast session on RDS for SQL Server and the AWS Database Migration Service. I hope you’ll check out both services. I have a few $50 credit codes here for anyone who is interested in trying them out; come by after the Q&A to pick one up.