Introduction to
AWS Database Migration
Service
김일호, Solutions Architect
What to Expect from the Session
• Learn about migrating databases with minimal downtime to
Amazon RDS, Amazon Redshift and Amazon Aurora
• Discuss database migrations to same and different engines
• Learn about converting schemas and stored code from Oracle
and SQL Server to MySQL and Aurora
• One more thing~
Embracing the cloud demands a cloud data
strategy
• How will my on-premises data migrate to the cloud?
• How can I make it transparent to my customers?
• Afterwards, how will on-premises and cloud data interact?
• How can I integrate my data assets within AWS?
• Can I get help moving off of commercial databases?
Historically, Migration = Cost, Time
• Commercial Migration / Replication software
• Complex to setup and manage
• Legacy schema objects, PL/SQL or T-SQL code
• Application downtime
Introducing
AWS Database Migration Service
Purposes of data migration
One-time data migration
Between on premises and AWS
Between Amazon EC2 and Amazon RDS
Ongoing Replication
Replicate on premises to AWS
Replicate AWS to on premises
Replicate OLTP to BI
Replicate for query offloading
Ways to migrate data
Bulk Load
AWS Database Migration Service
Oracle Import/Export
Oracle Data Pump Network Mode
Oracle SQL*Loader
Oracle Materialized Views
CTAS / INSERT over dblink (see the sketch after this list)
Ongoing Replication
AWS Database Migration Service
Oracle Data Pump Network Mode
Oracle Materialized Views
Oracle GoldenGate
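For the "CTAS / INSERT over dblink" option above, here is a minimal sketch. It assumes a hypothetical RDS Oracle target reachable from Python via the cx_Oracle driver and a pre-created database link named onprem back to the source; none of these names come from the deck.

import cx_Oracle  # requires the Oracle client libraries on the machine running this

# Hypothetical target connection details (not from the deck)
conn = cx_Oracle.connect("admin", "secret", "target-rds.example.com:1521/ORCL")
cur = conn.cursor()

# Assumes a database link to the on-premises source already exists, e.g.
#   CREATE DATABASE LINK onprem CONNECT TO app IDENTIFIED BY ... USING 'tns_alias';
# Create-table-as-select pulls the table across the link in one statement:
cur.execute("CREATE TABLE orders AS SELECT * FROM orders@onprem")

# Alternatively, for a table that already exists on the target, a direct-path
# insert over the same link:
# cur.execute("INSERT /*+ APPEND */ INTO orders SELECT * FROM orders@onprem")
# conn.commit()

conn.close()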
High-speed database migration prior to AWS DMS
Diagram: an on-premises Linux host exports a 500 GB Oracle DB with Data Pump (compressed to 175 GB, ~2.5 hours), Tsunami transfers the dump files over UDP to an EC2 instance in an AWS Availability Zone (~2.5 hours), the files are pushed into the RDS Oracle instance's DATA_PUMP_DIR (~3.5 hours; a sketch of this step follows below), and the Data Pump import takes ~4 hours. Because the steps overlap, the total time is ~7 hours.
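The speaker notes for this slide describe pushing the Data Pump dump files into the RDS instance's DATA_PUMP_DIR with UTL_FILE as they arrive. Below is a minimal sketch of that step; the connection details and file name are placeholders, and it uses the cx_Oracle driver rather than the helper script referenced in the notes.

import cx_Oracle

CHUNK = 32767  # maximum size UTL_FILE.PUT_RAW accepts per call

# Hypothetical RDS Oracle connection (placeholder credentials)
conn = cx_Oracle.connect("admin", "secret", "target-rds.example.com:1521/ORCL")
cur = conn.cursor()

# One self-contained PL/SQL block per chunk: open the file in DATA_PUMP_DIR,
# write the raw bytes, and close it again.
plsql = """
DECLARE
  fh UTL_FILE.FILE_TYPE;
BEGIN
  fh := UTL_FILE.FOPEN('DATA_PUMP_DIR', :fname, :mode, 32767);
  UTL_FILE.PUT_RAW(fh, :data, TRUE);
  UTL_FILE.FCLOSE(fh);
END;"""

with open("expdp_01.dmp", "rb") as f:           # a dump file received via Tsunami
    first = True
    while True:
        data = f.read(CHUNK)
        if not data:
            break
        mode = "wb" if first else "ab"          # create on the first chunk, then append
        cur.execute(plsql, fname="expdp_01.dmp", mode=mode, data=data)
        first = False

conn.close()
# The import itself is then submitted from within RDS using DBMS_DATAPUMP.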
Start your first migration in 10 minutes or less
Keep your apps running during the migration
Replicate within, to or from Amazon EC2 or RDS
Move data to the same or different database engine
Sign up for preview at aws.amazon.com/dms
AWS
Database Migration
Service
10 minutes or less to migration
Diagram: application users on the customer premises reach databases both on premises and in AWS, connected over the Internet or a VPN, with the AWS Database Migration Service replication instance between source and target.
• Start a replication instance
• Connect to source and target databases
• Select tables, schemas, or databases
Let AWS Database Migration Service create tables, load data, and keep them in sync (a minimal API sketch of these steps follows below).
Switch applications over to the target at your convenience.
Keep your apps running during the migration
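A minimal API-level sketch of those three steps, using the boto3 DMS client; identifiers, hostnames, and credentials below are placeholders, and this call sequence is just one plausible way to drive the service, not the console flow shown in the deck.

import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")   # region is an assumption

# 1. Start a replication instance
ri = dms.create_replication_instance(
    ReplicationInstanceIdentifier="demo-dms-instance",
    ReplicationInstanceClass="dms.t2.micro",
    AllocatedStorage=50,
)

# 2. Connect to source and target databases
source = dms.create_endpoint(
    EndpointIdentifier="onprem-oracle", EndpointType="source", EngineName="oracle",
    ServerName="onprem-db.example.com", Port=1521,
    Username="app", Password="secret", DatabaseName="ORCL",
)
target = dms.create_endpoint(
    EndpointIdentifier="aurora-target", EndpointType="target", EngineName="aurora",
    ServerName="demo-aurora.cluster-xxxx.us-east-1.rds.amazonaws.com", Port=3306,
    Username="admin", Password="secret",
)

# 3. Select tables, schemas, or databases, then let DMS load them and keep them in sync.
#    In practice, wait for the instance and endpoints to become available first.
table_mappings = {"rules": [{
    "rule-type": "selection", "rule-id": "1", "rule-name": "include-hr",
    "object-locator": {"schema-name": "HR", "table-name": "%"},
    "rule-action": "include",
}]}
task = dms.create_replication_task(
    ReplicationTaskIdentifier="hr-full-load-and-cdc",
    SourceEndpointArn=source["Endpoint"]["EndpointArn"],
    TargetEndpointArn=target["Endpoint"]["EndpointArn"],
    ReplicationInstanceArn=ri["ReplicationInstance"]["ReplicationInstanceArn"],
    MigrationType="full-load-and-cdc",          # full load, then ongoing replication
    TableMappings=json.dumps(table_mappings),
)
dms.start_replication_task(
    ReplicationTaskArn=task["ReplicationTask"]["ReplicationTaskArn"],
    StartReplicationTaskType="start-replication",
)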
After migration, use for replication and data
integration
• Replicate data in on-premises databases to AWS
• Replicate OLTP data to Amazon Redshift
• Integrate tables from third-party software into your reporting
or core OLTP systems
• Hybrid cloud is a stepping stone in migration to AWS
Cost-effective and no upfront costs
• T2 pricing starts at $0.018 per hour for T2.micro
• C4 pricing starts at $0.154 per hour for C4.large
• 50 GB GP2 storage included with T2 instances
• 100 GB GP2 storage included with C4 instances (the included storage covers the replication engine's swap space, logs, and cache)
• Data transfer inbound and within an AZ is free
• Data transfer across AZs starts at $0.01 per GB
A rough worked cost example follows below.
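As a worked example of those rates (the duration and data volume here are assumptions, not figures from the deck):

# Hypothetical 48-hour migration on a C4.large replication instance,
# replicating 200 GB to a target in another Availability Zone.
hours = 48
instance_rate = 0.154      # USD per hour for C4.large (from the slide)
cross_az_gb = 200          # assumed data volume crossing AZs
cross_az_rate = 0.01       # USD per GB (from the slide)

total = hours * instance_rate + cross_az_gb * cross_az_rate
print(f"Estimated cost: ${total:.2f}")   # ≈ $9.39; inbound/in-AZ transfer is free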
Migrate and replicate between database engines
Introducing
AWS Schema Conversion Tool
Migrate off Oracle and SQL Server
Move your tables, views, stored procedures and DML to MySQL, MariaDB, and Amazon Aurora
Know exactly where manual edits are needed
Download at aws.amazon.com/dms
AWS
Schema Conversion
Tool
Get help with converting tables, views, and code
Schemas
Tables
Indexes
Views
Packages
Stored Procedures
Functions
Triggers
Sequences
User Defined Types
Synonyms
Move your database schema and code
Know exactly where manual edits are needed
One more thing.
RAC on Amazon EC2 would be useful
• Test / dev / non-prod; allow testing to cover RAC-related regression cases
• Scale out and back elastically; a good match for the cloud
• Scale beyond the largest instances
• Low-RTO redundancy at the host/instance level; Application Continuity for near-zero downtime
• Test scaling limits; a given workload scales only to n nodes on RAC
• Some applications “require” RAC
• Some customers don’t want to re-engineer everything just to move to AWS
• Customers want it!
Why no RAC on EC2?
Diagram: an EBS volume cannot serve as shared storage for multiple EC2 instances; attaching the same volume to a second instance is marked with an X.
Shared storage with iSCSI
Diagram: EBS volumes are attached to EC2 instances that act as iSCSI targets; the RAC node instances log in to those targets and run ASM on top of the exported LUNs. The initiator side uses the Open-iSCSI project (open-iscsi-2.0-873, open-iscsi.org). A minimal sketch of the node-side setup follows below.
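A minimal sketch of what each RAC node would run to attach the shared LUNs, assuming the two iSCSI target instances sit at placeholder addresses 10.0.1.10 and 10.0.1.11 and export their volumes via tgtd on the default port:

import subprocess

# The open-iscsi initiator tools (the open-iscsi-2.0-873 package named on the
# slide) discover and log in to the EC2-hosted targets; ASM then consumes the
# resulting block devices on every RAC node.
targets = ["10.0.1.10", "10.0.1.11"]   # placeholder IPs for the two target instances

for ip in targets:
    # Ask the target which LUNs it exports
    subprocess.run(["iscsiadm", "-m", "discovery", "-t", "sendtargets",
                    "-p", f"{ip}:3260"], check=True)
    # Log in to everything discovered at that portal
    subprocess.run(["iscsiadm", "-m", "node", "-p", f"{ip}:3260", "--login"],
                   check=True)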
Why no RAC on EC2?
Diagram: the RAC interconnect expects a multicast network between the cluster's EC2 instances, and EC2 networking does not provide native multicast.
Multicast on EC2
Diagram: each EC2 instance runs an N2N Edge process on an edge0 interface, all connecting to an N2N Supernode on a separate instance; the resulting n2n VPN carries the multicast traffic (ntop, ntop.org/n2n).
RAC on EC2 prototype: aws.amazon.com/articles
Diagram: the prototype runs in a VPC subnet inside a placement group of dedicated instances, with a Route 53 private hosted zone providing DNS for the SCAN, VIPs, etc. Two iSCSI target instances (i2.8xl) each expose a 4,800 GB DATA LVM volume, carved from 6,400 GB of ephemeral SSD and served by tgtd, holding 800 GB of RECO and 800 GB of free space. Two RAC nodes (c3.8xl) each run iscsid, a 600 GB Flash Cache on 640 GB of ephemeral SSD, ASM with normal redundancy, Grid Infrastructure, and Oracle Database 12c; the n2n edge processes (with the supernode) provide the cluster interconnect.
Sign Up for AWS Database Migration Service
• Sign up for AWS Database Migration Service Preview now:
• aws.amazon.com/dms
• Download the AWS Schema Conversion Tool:
• aws.amazon.com/dms
Thank you

Editor's Notes

  • #3: Introduce self and Sergei (Senior Product Manager) Migration from on-premises and traditionally hosted databases to AWS managed database services… Not only how to migrate between same engines, but also between different, like… But before you can migrate data anywhere, you need a schema; tables and objects into which to load the data. We’ll talk about how you can convert database objects, in order to support moving between engines.
  • #4: These days, we’re hearing a lot of customers tell us they want to move their on-premises applications into the cloud. But moving applications is simpler than moving the databases they depend on. Applications are usually stateless, and can be moved fairly easily using a lift and shift approach. (CLICK) But databases are stateful, and they require more care. To move databases to AWS, requires a data migration strategy. (CLICK) And when it comes to designing those strategies, customers want to be able to do it with the least possible inconvenience and visibility to their users. (CLICK) And once an application is migrated to AWS, it’s not the end of the story. Often customers have several applications, some in the cloud and some on premises or in hosted environments. Customers need to be able to synchronize their data between on-premises and cloud-based applications. (CLICK) And the same goes for applications within AWS. Those applications often share data, and customers want to be able to synchronize and replicate data between the various databases they maintain within AWS. (CLICK) And one other thing: customers moving applications to the cloud, often see it as an opportunity to break free from commercial databases, which tend to have a heavy licensing burden. We often hear customers asking us for a way to convert their commercial databases into AWS solutions, such as RDS MySQL, Postgres, Aurora and Redshift.
  • #6: Announcing the preview of AWS DMS. Explain what it basically is: a managed, hosted data replication service engineered to support graceful migration from legacy database systems into the next-generation managed databases at AWS.
  • #7: There are multiple reasons why you’d want to migrate your data. Read-only replication: reporting, read scaling. Read/write replication: multi-master.
  • #8: There is a multitude of approaches for migrating your data to AWS. Choose the method based on data set size, network connectivity (access, latency, and bandwidth), the ability to sustain downtime for the source DB, and the need for continuous data synchronization. If you can take a day or two of downtime, and you don’t have to do the migration process several times, you want to do a Bulk Load. If you need to minimize downtime, or when your dataset is large and you can’t shut down access to the source while you are migrating your data, you want to consider Ongoing Replication.
  • #9: In the past we recommended the following high-performance technique for moving large databases.
    Step 1. Export with Data Pump: export the files in parallel, on a box with multiple disks to parallelize I/O. Compression reduces 500 GB to 175 GB. Time to export 500 GB is ~2.5 hours.
    Step 2. Transport the compressed files to an EC2 instance using UDP and the Tsunami server. Install Tsunami on both the source data host and the target EC2 host; with UDP you can achieve higher transfer rates than with TCP. Start the upload as soon as the first files from Data Pump become available and upload in parallel; there is no need to wait until all 18 files are done. Time to upload 175 GB is ~2.5 hours.
    Step 3. Transfer the files to the Amazon RDS DB instance. The RDS DB instance has an externally accessible Oracle directory object, DATA_PUMP_DIR. Use UTL_FILE to move the data files into it:
    BEGIN perl_global.fh := utl_file.fopen(:dirname, :fname, 'wb', :chunk); END;
    BEGIN utl_file.put_raw(perl_global.fh, :data, true); END;
    BEGIN utl_file.fclose(perl_global.fh); END;
    Transfer the files as they are received; start the transfer to the RDS instance as soon as the first file arrives. Total time to transfer files to RDS: ~3.5 hours.
    Step 4. Import the data into the Amazon RDS instance from within the instance, using the DBMS_DATAPUMP package and submitting the job with a PL/SQL script. Total time to import data into Amazon RDS: ~4 hours.
    Because the work is staged across 18 distinct files, the total duration is ~7 hours.
    Background: open port 46224 for Tsunami communication. Tsunami UDP Protocol: a fast user-space file transfer protocol that uses TCP control and UDP data for transfer over very high-speed, long-distance networks (≥ 1 Gbps and even 10 GE), designed to provide more throughput than possible with TCP over the same networks. Tsunami servers: http://sourceforge.net/projects/tsunami-udp/files/latest/download
    Optimize the Data Pump export: reduce the data set to optimal size and avoid indexes; use compression and parallel processing; use multiple disks with independent I/O.
    Optimize the data upload: use Tsunami for UDP-based file transfer; use a large Amazon EC2 instance with SSD or PIOPS volumes; use multiple disks with independent I/O; you could use multiple EC2 instances for parallel upload.
    Optimize the data file upload to RDS: use the largest Amazon RDS DB instance possible during the import process; avoid using the RDS DB instance for any other load during this time; provision enough storage in the RDS DB instance for the uploaded files and imported data.
  • #10: * Like all AWS services, it is easy and straightforward to get started. You can get started with your first migration task in 10 min or less. You simply connect it to your source and target databases, and it copies the data over and begins replicating changes from source to target. * That means you can keep your apps running during the migration, then switch over at a time that is convenient for your business. * In addition to one-time database migration, you can also use DMS for ongoing data replication: replicate within, to, or from AWS EC2 or RDS databases. For instance, after migrating your database, use the AWS Database Migration Service to replicate data into your Redshift data warehouses, cross-region to other RDS instances, or back to on-premises. * Again, it is heterogeneous: with DMS, you can move data between engines. Supports Oracle, Microsoft SQL Server, MySQL, PostgreSQL, MariaDB, Amazon Aurora, Amazon Redshift. * If you would like to sign up for the preview of DMS, go to…
  • #11: Let’s take a look at how to use the database migr. Service… From the landing page, just click “get started”. That will take you to page that describes how DMS works to migrate your data; how you connect it to a source database and target database, then define replication tasks to move the data.
  • #12: Using the AWS Database Migration Service to migrate data to AWS is simple. (CLICK) Start by spinning up a DMS instance in your AWS environment (CLICK) Next, from within DMS, connect to both your source and target databases (CLICK) Choose what data you want to migrate. DMS lets you migrate tables, schemas, or whole databases Then sit back and let DMS do the rest. (CLICK) It creates the tables, loads the data, and best of all, keeps them synchronized for as long as you need That replication capability, which keeps the source and target data in sync, allows customers to switch applications (CLICK) over to point to the AWS database at their leisure. DMS eliminates the need for high-stakes extended outages to migrate production data into the cloud. DMS provides a graceful switchover capability.
  • #13: But DMS is for much more than just migration. (CLICK) DMS enables customers to adopt a hybrid approach to the cloud, maintaining some applications on premises, and others within AWS. There are dozens of compelling use cases for a hybrid cloud approach using DMS. (CLICK) For customers just getting their feet wet, AWS is a great place to keep up-to-date read-only copies of on-premises data for reporting purposes. AWS services like Aurora, Redshift and RDS are great platforms for this. (CLICK) With DMS, you can maintain copies of critical business data from third-party or ERP applications, like employee data from Peoplesoft, or financial data from Oracle E-Business Suite, in the databases used by the other applications in your enterprise. In this way, it enables application integration in the enterprise. (CLICK) Another nice thing about the hybrid cloud approach is that it lets customers become familiar with AWS technology and services gradually. DMS enables that. Moving to the cloud is much simpler if you have a way to link the data and applications that have moved to AWS with those that haven’t.
  • #14: With the AWS Database Migration Service you pay for the migration instance that moves your data from your source database to your target database.(CLICK) (Actually talk to points) Each database migration instance includes storage sufficient to support the needs of the replication engine, such as swap space, logs, and cache. (CLICK) (actually talk to points) Inbound data transfer is free. (CLICK) Additional charges only apply (CLICK) if you decide to allocate additional storage for data migration logs or when you replicate your data to a database in another region or on-premises. AWS Database Migration Service currently supports the T2 and C4 instance classes. T2 instances are low-cost standard instances designed to provide a baseline level of CPU performance with the ability to burst above the baseline. They are suitable for developing, configuring and testing your database migration process, and for periodic data migration tasks that can benefit from the CPU burst capability. C4 instances are designed to deliver the highest level of processor performance and achieve significantly higher packet per second (PPS) performance, lower network jitter, and lower network latency. You should use C4 instances if you are migrating large databases and are looking to minimize the migration time.
  • #15: Elaborate on heterogeneous use cases: database engine migration for cost savings; moving to fully managed, scalable, cloud-native, enterprise-class engines like Aurora; low-cost reporting, analytics and BI (on MySQL, Postgres, Aurora) for systems running on commercial OLTP databases; data integration, where data such as customer accounts can be presented not only on the master platform but also in applications based on non-commercial engines. But you can’t just pick up an Oracle table and put it down in MySQL. You can’t run an Oracle PL/SQL package on Postgres. To migrate or replicate data between engines, you need a way to convert the schema, to build a set of tables and objects on the destination that is native to that engine. We’ve been working on that problem. Introduce Sergei.
  • #17: The AWS Schema Conversion Tool is a development environment that you download to your desktop and use to save time when migrating from Oracle and SQL Server to next-generation cloud databases such as Amazon Aurora. You can convert database objects such as tables, indexes, views, stored procedures, and Data Manipulation Language (DML) statements like SELECT, INSERT, DELETE, UPDATE.