Cloning Oracle Transportation Management Database Schemas for 6.4.3 (Doc ID 2593800.1)
In this Document
Abstract
Method
Naming Conventions
Prerequisites
History
Details
Export Option 1 - Clone Entire Schemas
Step 1. Export Schemas
Sample Command
Export option 2 - Export Schemas without TRANSIENT Data and with Metadata
Sample Command
Step 2. Prepare Target Database
Step 3. Import Schemas
Sample Command
3.1 Import Schemas without TRANSIENT Data Tables
Sample Command
3.2 Import TRANSIENT Data Tables Metadata
Sample Command
Step 4. Post Import Tasks
Step 5. Clean Up Data (Optional)
Step 6. Review Oracle Transportation Management Configuration Data (Optional)
Step 7. Modify Scalability Configuration Settings in the Database (Optional)
Summary
APPLIES TO:
ABSTRACT
This document explains how to copy Oracle Transportation Management database schemas from one database to another
for version 6.4.3 only. This procedure could be used, for instance, to copy production Oracle Transportation Management
database schemas to a test database.
Note: This note only covers the export and import of the following schemas: GLOGOWNER, REPORTOWNER, GLOGOAQ, ARCHIVE_C, ARCHIVE_C_USER, GLOGLOAD, GLOGDBA, GLOBALREPORTUSER, and DIR_XML_USER.
FTI/GTI and ADF repository schemas are not covered in this process.
1 of 7 6/14/2021, 8:57 AM
Document 2593800.1 https://support.oracle.com/epmos/faces/DocumentDisplay?_adf.ctrl-stat...
Method
1. Export Oracle Transportation Management schemas of the source database using data pump.
2. Prepare destination database.
3. Import Oracle Transportation Management schemas into the destination database using data pump.
4. Perform post import tasks, which recompile objects, create remaining objects, and perform grants.
5. Modify configuration data to reflect new database in the Oracle Transportation Management configuration tables.
Naming Conventions
Source database: Database from which Oracle Transportation Management schemas and data are being copied.
Target database: Database to which Oracle Transportation Management schemas and data are being copied.
TRANSIENT data: Temporary data which is no longer required by Oracle Transportation Management for day to
day functionality.
Data pump: Oracle Database utility to export and import schemas.
DATAPUMP DIRECTORY: Directory on the database server available to store exported schema and data.
Prerequisites
The Oracle Transportation Management application version of the target database must be the same as that of the source database.
The target database should be installed with Oracle Transportation Management init.ora parameters similar to the source.
Both the source and target databases must have a DATAPUMP DIRECTORY created, with adequate disk space and the proper permissions available to the user.
The person performing this task needs basic DBA skills and prior experience with Data Pump. Refer to the Oracle Database Utilities documentation for more information about Data Pump.
Export Option 2 can significantly reduce the runtime and disk space required during the copy process. This option can be used to copy a production database to development systems.
HISTORY
DETAILS
Export Option 1 - Clone Entire Schemas
Step 1. Export Schemas
Export the following schemas from the source database. Use SYS or another DBA account to perform the export/import. Make sure that any other DBA account has the "Exempt Access Policy" role assigned, because Oracle Transportation Management uses VPD for data security.
Sample Command
All of these parameters can be combined into a single par file you can use for the export.
Userid='sys/<SysUserPWD>@<source_dbsid> as sysdba'
DIRECTORY=<DATAPUMP_DIR_NAME>
DUMPFILE=OTM_643_exp_%U.dmp
SCHEMAS=GLOGOWNER,REPORTOWNER,GLOGOAQ,ARCHIVE_C,ARCHIVE_C_USER,GLOGLOAD,GLOGDBA,GLOBALREPORTUSER,DIR_XML_USER
LOGFILE=otm_643_full_<source_dbsid>.log
expdp parfile=full_643_exp_<source_dbsid>.par
Note: Be sure to review the otm_643_full_<source_dbsid>.log file for any errors before moving on to the next step.
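One way to automate that log review is a small shell check along these lines. This is a sketch, not part of the note's attachments: the function name is hypothetical, the ORA-/UDE-/UDI- prefixes are the usual Data Pump error markers, and the ORA-31684 ("object already exists") exclusion anticipates the ignorable errors mentioned for the import logs later in this document.

```shell
# check_dp_log: fail if a Data Pump log contains unexpected errors.
# ORA- covers database errors; UDE-/UDI- cover expdp/impdp client errors.
# ORA-31684 (object already exists) is excluded as ignorable on import.
check_dp_log() {
    log="$1"
    bad=$(grep -E 'ORA-|UDE-|UDI-' "$log" | grep -vc 'ORA-31684' || true)
    if [ "$bad" -gt 0 ]; then
        echo "$bad unexpected error line(s) in $log -- review before continuing." >&2
        return 1
    fi
    return 0
}
# Example: check_dp_log otm_643_full_PRODDB.log || exit 1
```

Extend the exclusion filter with any additional error numbers your site has reviewed and deemed harmless.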
Export option 2 - Export Schemas without TRANSIENT Data and with Metadata
Export schemas from the source database, excluding tables which contain transient data. The exclude table list is provided in the parameter file otm_643_expdp_exclude_list.par, and the same tables are listed in otm_643_expdp_metadata_list.par (these files are an attachment to this knowledge document).
otm_643_expdp_exclude_list.par - this file contains the Userid, DIRECTORY, FILESIZE, DUMPFILE, LOGFILE, SCHEMAS, and EXCLUDE (list of tables to be excluded from the export) parameters.
otm_643_expdp_metadata_list.par - this file contains the Userid, DIRECTORY, FILESIZE, DUMPFILE, LOGFILE, CONTENT=METADATA_ONLY, and TABLES (list of tables to have their metadata exported) parameters. The TABLES parameter has a schema owner of GLOGOWNER and the full table name associated with it. If you view the file, you will see that all the tables end with _T.
Each file has several parameters that need to be updated before running expdp.
Userid='sys/<SysUserPWD>@<source_dbsid> as sysdba'
DIRECTORY=<DATAPUMP_DIR_NAME>
FILESIZE=2G
DUMPFILE=otm_exclude_643_<source_dbsid>_%U.dmp
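Filling in those placeholders can be scripted. The helper below is a hypothetical sketch using sed; it assumes the placeholder spellings shown above and that none of the substituted values contain a | character.

```shell
# fill_par: substitute the <...> placeholders in a downloaded .par file.
# Output goes to stdout; redirect it to the .par file you pass to expdp.
fill_par() {
    par="$1" sid="$2" syspwd="$3" dpdir="$4"
    sed -e "s|<source_dbsid>|$sid|g" \
        -e "s|<SysUserPWD>|$syspwd|g" \
        -e "s|<DATAPUMP_DIR_NAME>|$dpdir|g" "$par"
}
# Example:
#   fill_par otm_643_expdp_exclude_list.par PRODDB secret DATA_PUMP_DIR > exp.par
```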
Sample Command
expdp parfile=otm_643_expdp_exclude_list.par
followed by:
expdp parfile=otm_643_expdp_metadata_list.par
Note: Check the otm_643_exp_exclude.log and otm_643_metadata_exp.log files for errors before moving to the next step.
Step 2. Prepare Target Database
1. Create the Oracle Transportation Management tablespaces similar to those on the source database. Make sure enough space is allocated to accommodate the source data. You can use the Oracle Transportation Management provided script create_gc3_tablespaces.sql to create the tablespaces. The script is located in the directory < OTM Home >/glog/oracle/script8.
2. Create the Oracle Transportation Management users on the target database using the create_glog_users.sql script. It is important to use this script because it also grants necessary privileges which do not get transferred by data pump. Go to the directory < OTM Home >/glog/oracle/script8 on the application server, log in as SYS, and run the following script: @create_glog_users.sql. Make sure no errors are generated.
Log into the database using sqlplus as sys and run the following commands. Enter OTM643 when prompted.
define edition_name=&edname
Step 3. Import Schemas
Import the schemas exported in Step 1 into the target database. This may involve transferring the DUMP file from the source database DATAPUMP DIRECTORY to the target database DATAPUMP DIRECTORY.
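When the dump files must be copied between servers, a checksum manifest makes it easy to confirm the transfer was clean. The sketch below uses hypothetical function names and assumes GNU coreutils sha256sum; the copy itself is left to your preferred tool (scp, sftp, etc.).

```shell
# On the source: record checksums alongside the dump files in the
# DATAPUMP DIRECTORY (pass the directory path as the argument).
make_manifest() {
    (cd "$1" && sha256sum ./*.dmp > dump_checksums.txt)
}

# Copy the .dmp files plus dump_checksums.txt with your usual tool, e.g.:
#   scp /u01/dp_src/*.dmp /u01/dp_src/dump_checksums.txt oracle@target:/u01/dp_tgt/

# On the target: confirm nothing was corrupted in transit (exits non-zero
# and names the bad file on any mismatch).
verify_manifest() {
    (cd "$1" && sha256sum -c --quiet dump_checksums.txt)
}
```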
Sample Command
These parameters can be combined into a single .par file to make the import process easier.
Userid='sys/<SysUserPWD>@<target_dbsid> as sysdba'
DIRECTORY=<DATAPUMP_DIR_NAME>
DUMPFILE=OTM_643_exp_%U.dmp
SCHEMAS=GLOGOWNER,REPORTOWNER,GLOGOAQ,ARCHIVE_C,ARCHIVE_C_USER,GLOGLOAD,GLOGDBA,GLOBALREPORTUSER,DIR_XML_USER
LOGFILE=import_otm_643_<target_dbsid>.log
impdp parfile=imp_643_<target_dbsid>.par
Note: Review the import_otm_643_<target_dbsid>.log for errors. You can ignore "user already exists" errors, trigger errors, and compilation warnings.
3.1 Import Schemas without TRANSIENT Data Tables
These steps are not required if a full export was taken. Use the steps listed below to import the transient-excluded schemas and the metadata.
Import the schemas without TRANSIENT data tables into the target database using the DUMP file created earlier. This may involve transferring the DUMP file from the source database DATAPUMP DIRECTORY to the target database DATAPUMP DIRECTORY.
Sample Command
Similar to the export command, a .par file can be created for the import process too. Create a file called
imp_exclude_643_<target_dbsid>.par with the following parameters updated for your environment.
Userid='sys/<sysUserPWD>@<target_dbsid> as sysdba'
directory=<DATAPUMP_DIR_NAME>
DUMPFILE=otm_exclude_643_<source_dbsid>_01.dmp
LOGFILE=import_exclude_643_<target_dbsid>.log
SCHEMAS=GLOGOWNER,REPORTOWNER,GLOGOAQ,ARCHIVE_C,ARCHIVE_C_USER,GLOGLOAD,GLOGDBA,GLOBALREPORTUSER,DIR_XML_USER
impdp parfile=imp_exclude_643_<target_dbsid>.par
Note: Review the import_exclude_643_<target_dbsid>.log file generated for possible errors. You can ignore the "user
already exists" errors, trigger errors and compilation warnings as shown below.
3.2 Import TRANSIENT Data Tables Metadata
Import the TRANSIENT data tables metadata only into the target database using the DUMP file created earlier. This may involve transferring the DUMP file from the source database DATAPUMP DIRECTORY to the target database DATAPUMP DIRECTORY.
Sample Command
A .par file can be created for the import process. Create a file called imp_643_metadata_<target_dbsid>.par with the
following parameters updated for your environment.
Userid='sys/<sysUserPWD>@<target_dbsid> as sysdba'
directory=<DATAPUMP_DIR_NAME>
DUMPFILE=otm_metadata_643_<source_dbsid>_01.dmp
LOGFILE=import_metadata_643_<target_dbsid>.log
impdp parfile=imp_643_metadata_<target_dbsid>.par
Step 4. Post Import Tasks
1. Copy the post_import_643.sql script from this note to the < OTM Home >/glog/oracle/script8 directory on the instance that just had the data imported.
2. Go to the < OTM Home >/glog/oracle/script8 directory and log into the database using sqlplus as sys and run the
following commands.
begin
sys.utl_recomp.recomp_serial();
end;
/
3. While logged into the database as SYS, run the following script.
@cloud/install_pkg_awr_report.sql
4. Log into database as GLOGOWNER user via SQLPLUS and run post_import_643.sql
@post_import_643.sql
You may see compilation errors at the start of the script; however, at the end there should be no invalid objects.
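A quick way to confirm that the post-import work left no invalid objects is to query DBA_OBJECTS as SYS. The owner list below is illustrative; include any additional Oracle Transportation Management schemas present in your installation.

```sql
-- Should return no rows once post_import_643.sql has completed.
SELECT owner, object_name, object_type
  FROM dba_objects
 WHERE status = 'INVALID'
   AND owner IN ('GLOGOWNER', 'REPORTOWNER', 'GLOGDBA', 'ARCHIVE_C');
```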
The "Clean-up Data" step is optional depending on the intention of the cloned instance. For instance, if a production
instance is being cloned to a test environment, it may be necessary to disable production contacts/external systems. You may need to update the passwords for the "guest" and "system" users via the update_password.sh script.
The following SQL statements are commonly used to clean up production data that should not be used in a test instance.
Note: Review these statements and analyze their impact on the installation before running them.
Note: This list of statements is not meant to be a comprehensive list. It may be necessary to modify other data when
cloning a production instance.
Step 6. Review Oracle Transportation Management Configuration Data (Optional)
Please review the data listed below and make any necessary changes before starting Oracle Transportation Management.
Table Column
------------------ ------------------
APP_MACHINE MACHINE_URL
DATA_SOURCE JDBC URL
EXTERNAL_SYSTEM URL
USER_MENU URL
LOGFILE FILE_NAME: Verify it does not have absolute path
STYLESHEET_PROFILE TEMPLATE_NAME: Verify it does not have absolute path
WEB_SERVICE_ENDPOINT SERVICE_ENDPOINT
REPORT_COMMON_PROPERTIES REPORT_COMMON_PROPERTY_VALUE: Verify URL of web server is correct
Also check that other database objects, such as database links, match the target database.
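As a sketch, the environment-specific values in the table above can be reviewed with queries along these lines. Run them as GLOGOWNER; the table and column names are taken from the list above, but verify them against your schema before relying on them.

```sql
-- Spot-check URLs and endpoints copied over from the source instance.
SELECT machine_url FROM app_machine;
SELECT url FROM external_system;
SELECT url FROM user_menu WHERE url IS NOT NULL;
SELECT service_endpoint FROM web_service_endpoint;
-- These should not contain absolute paths:
SELECT file_name FROM logfile WHERE file_name LIKE '/%';
SELECT template_name FROM stylesheet_profile WHERE template_name LIKE '/%';
```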
Step 7. Modify Scalability Configuration Settings in the Database (Optional)
If the source database was part of an Oracle Transportation Management scalability configuration, refer to the Oracle Transportation Management scalability guide to change the configuration data to match the target configuration.
SUMMARY