Re: Max db (Content Server) Export/Import using several loadercli


Hi All,

 

I am sharing the process we followed for a parallel export of the Content Server database using loadercli, as part of a heterogeneous system copy.

 

We tested the export in our test system, in which COMPONENTS0001 had a row count of 1031374 and DOCUMENTS0001 had about the same row count.
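
If you want to check the row counts yourself before deciding how to split, a simple count per table is enough. The statements below are only a sketch and assume the Content Server tables belong to the SAPR3 schema (adjust the owner to your installation); they can be run from any SQL client such as SQL Studio or sqlcli:

SELECT COUNT(*) FROM SAPR3.COMPONENTS0001
SELECT COUNT(*) FROM SAPR3.DOCUMENTS0001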

 

We exported all the tables one by one, and split COMPONENTS0001 into several parallel exports based on record count.

 

Here is the order in which we proceeded:

1) Export only the catalog information, using EXPORT_CAT.sql with the following commands:

 

EXPORT USER

catalog outstream file '/<path of export>/SAPR3.cat'
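
This first script can be run on its own with the same loadercli call that is shown further below, simply pointing -b at EXPORT_CAT.sql:

loadercli -n <server_name> -d <database_name> -u <user_name>,<password> -b EXPORT_CAT.sql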

 

2) Once the above is done, export all tables apart from COMPONENTS0001 individually, using a separate script for each table. The tables to be exported are:

 

CONTREP

DOCUMENTS0001 to DOCUMENTS0006

COMPONENTS0002 to COMPONENTS0006

 

 

Create EXPORT_CONTREP.sql with the following commands:

EXPORT TABLE CONTREP

DATA OUTSTREAM FILE 'CONTREP.data'

 

Create one script for each DOCUMENTS000x table. For example, EXPORT_DOCU001.sql for DOCUMENTS0001:

EXPORT TABLE DOCUMENTS0001

DATA OUTSTREAM FILE 'DOCU01.data'

LOB OUTSTREAM LONG_PROPERTY 'DOCU01_PROP'

LONGFILE LONG_VALUE 'DOCU01_LONG'

     

... and so on, until the last table (we had DOCUMENTS0006, so EXPORT_DOCU006.sql):

EXPORT TABLE DOCUMENTS0006

DATA OUTSTREAM FILE 'DOCU06.data'

LOB OUTSTREAM LONG_PROPERTY 'DOCU06_PROP'

LONGFILE LONG_VALUE 'DOCU06_LONG'

 

In the same way, create the SQL scripts for the COMPONENTS000x tables.

Since we are only splitting COMPONENTS0001, the remaining tables were exported as below. For example, EXPORT_COMP002.sql for COMPONENTS0002 has the following syntax:

EXPORT TABLE COMPONENTS0002

DATA OUTSTREAM FILE 'COMP02.data'

LONGFILE LONG_VALUE 'COMP02_LONG'

 

... and so on, until the last table:

$ cat EXPORT_COMP006.sql

EXPORT TABLE COMPONENTS0006

DATA OUTSTREAM FILE 'COMP06.data'

LONGFILE LONG_VALUE 'COMP06_LONG'

 

By this point you have scripts ready for every table apart from the one you are going to split. For us that was COMPONENTS0001 with roughly 1 million rows, so we broke it into 4 sets using the START clause as below (a small shell sketch for generating these four scripts follows after the listings):

 

$ cat EXPORT_COMP01_01.sql

EXPORT TABLE COMPONENTS0001

DATA OUTSTREAM FILE 'COMP01_1.data' START 1 250000

LONGFILE LONG_VALUE 'COMP01_LONG_1'

 

$ cat EXPORT_COMP01_02.sql

EXPORT TABLE COMPONENTS0001

DATA OUTSTREAM FILE 'COMP01_2.data' START 250001 500000

LONGFILE LONG_VALUE 'COMP01_LONG_2'

 

$ cat EXPORT_COMP01_03.sql

EXPORT TABLE COMPONENTS0001

DATA OUTSTREAM FILE 'COMP01_3.data' START 500001 750000

LONGFILE LONG_VALUE 'COMP01_LONG_3'

 

$ cat EXPORT_COMP01_04.sql

EXPORT TABLE COMPONENTS0001

DATA OUTSTREAM FILE 'COMP01_4.data' START 750001 1100000

LONGFILE LONG_VALUE 'COMP01_LONG_4'
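
If you prefer not to write the four split scripts by hand, a small shell loop can generate them. The helper script name below is hypothetical, and the sketch assumes a fixed chunk size of 250000 rows with the file names used above:

$ cat gen_comp01_splits.sh
# hypothetical helper: generates EXPORT_COMP01_01.sql to EXPORT_COMP01_04.sql
chunk=250000
for i in 1 2 3 4; do
  start=$(( (i - 1) * chunk + 1 ))
  end=$(( i * chunk ))
  printf "EXPORT TABLE COMPONENTS0001\nDATA OUTSTREAM FILE 'COMP01_${i}.data' START ${start} ${end}\nLONGFILE LONG_VALUE 'COMP01_LONG_${i}'\n" > EXPORT_COMP01_0${i}.sql
done

Note that in our case the end value of the last chunk was set to 1100000, i.e. higher than the actual row count, so that any remaining rows are covered; with a generator like this you would raise the end of the last chunk in the same way.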

 

At this point, all our export scripts were ready.

 

The next step was to run them in parallel, which can be done with the following command:

loadercli -n <server_name> -d <database_name> -u <user_name>,<password> -b <script>
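
To actually get the parallelism, one simple approach is to start each loadercli call in the background from the shell and wait for all of them to finish. A minimal sketch using the split scripts above (the connection placeholders are the same as in the command line above):

loadercli -n <server_name> -d <database_name> -u <user_name>,<password> -b EXPORT_COMP01_01.sql &
loadercli -n <server_name> -d <database_name> -u <user_name>,<password> -b EXPORT_COMP01_02.sql &
loadercli -n <server_name> -d <database_name> -u <user_name>,<password> -b EXPORT_COMP01_03.sql &
loadercli -n <server_name> -d <database_name> -u <user_name>,<password> -b EXPORT_COMP01_04.sql &
wait

Each call runs in its own loadercli session, so the four exports do not block each other.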

 

We first exported all tables apart from COMPONENTS0001 in parallel; these finished quickly, being small in size.

 

Then we ran the scripts for COMPONENTS0001 in parallel, from EXPORT_COMP01_01.sql to EXPORT_COMP01_04.sql, and all of them ran fine. The loadercli command is the same for all exports; only the script name changes.

 

We are yet to test the import. Once we have done it, I will share the import process as well.

 

In the meantime, please feel free to share your own experiences or any steps you took that might benefit others.

 

Warm Regards

Ajay Sehgal

