Tips for migration of Records Scaling to 200 Million Rows

Giganews Newsgroups
Posted by:  seshan.parameswar…
Date: Thu, 29 Nov 2007

Hello Group,

Please excuse the blast. I am seeking expert opinion on migrating data from an existing Oracle RDBMS to another. The existing system is proprietary and can be accessed only via a Java API interface it provides. My plan is to use that Java API to extract the essential data, generate SQL files from it, and then load those files directly into the new Oracle system with the NOLOGGING option to minimize undo/redo.

In doing so, I want to use partitioning as much as possible, since I will be dealing with records spanning 30 years. I am aware that to update a large number of rows it is advisable to use partitioning: create a temp table, insert into it the data from the partition you plan to update, apply the update to the temp table, and then exchange the partition with the updated rows. Is there a similar approach that can be used for inserting new records? In particular, as new rows are inserted the table will keep growing, and subsequent inserts will get slower.

I am also investigating the option of using Oracle Data Pump, driven from a Java API. Moreover, I have a limited timeframe of one month to perform this migration. I am looking for expert advice on the most feasible approach for this scenario. Thanks in advance for all inputs.
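
To make the extract-and-generate step concrete, here is a minimal sketch of what I have in mind. LegacyRow and the file layout are made up for illustration; since plain INSERT scripts do not get direct-path loads, the sketch writes delimited data files intended for SQL*Loader run with DIRECT=TRUE, which is what actually lets a NOLOGGING table skip most redo:

```java
public class ExtractFileWriter {

    // Hypothetical holder for one row extracted via the vendor's Java API.
    public static class LegacyRow {
        final long id;
        final String eventDate;  // ISO yyyy-mm-dd
        final String payload;
        public LegacyRow(long id, String eventDate, String payload) {
            this.id = id; this.eventDate = eventDate; this.payload = payload;
        }
    }

    // Quote a text field CSV-style, doubling any embedded quotes.
    static String quote(String s) {
        return "\"" + s.replace("\"", "\"\"") + "\"";
    }

    // One delimited line per row, matching the loader control file's layout.
    static String toCsvLine(LegacyRow r) {
        return r.id + "," + r.eventDate + "," + quote(r.payload);
    }

    // Build the body of one data file for a batch of extracted rows;
    // the caller writes it to disk and hands it to SQL*Loader.
    public static String buildFile(java.util.List<LegacyRow> rows) {
        StringBuilder sb = new StringBuilder();
        for (LegacyRow r : rows) {
            sb.append(toCsvLine(r)).append('\n');
        }
        return sb.toString();
    }
}
```

The idea is that each extraction batch becomes one data file, loaded with something like `sqlldr control=migrate.ctl data=batch001.dat direct=true` against a control file describing the same column layout.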
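
For reference, the exchange-partition technique for updates, and the insert-side analogue I am wondering about (bulk-load new rows into a standalone table, add an empty partition for their date range, then swap the loaded table in), both reduce to a short DDL sequence. A sketch that just assembles those statements — all table and partition names are hypothetical:

```java
import java.util.Arrays;
import java.util.List;

public class PartitionExchange {

    // Update path: rebuild one partition's rows in a temp table, apply the
    // update there, then swap the result back in as a dictionary operation.
    public static List<String> updateViaExchange(String table, String partition,
                                                 String tempTable, String updateSet) {
        return Arrays.asList(
            "CREATE TABLE " + tempTable + " NOLOGGING AS SELECT * FROM "
                + table + " PARTITION (" + partition + ")",
            "UPDATE " + tempTable + " SET " + updateSet,
            "ALTER TABLE " + table + " EXCHANGE PARTITION " + partition
                + " WITH TABLE " + tempTable + " WITHOUT VALIDATION"
        );
    }

    // Insert path: add an empty partition covering the new rows' date range,
    // then exchange the pre-loaded standalone table into it, so the big
    // table's existing partitions are never touched by the load.
    public static List<String> insertViaExchange(String table, String newPartition,
                                                 String highValue, String loadTable) {
        return Arrays.asList(
            "ALTER TABLE " + table + " ADD PARTITION " + newPartition
                + " VALUES LESS THAN (" + highValue + ")",
            "ALTER TABLE " + table + " EXCHANGE PARTITION " + newPartition
                + " WITH TABLE " + loadTable + " WITHOUT VALIDATION"
        );
    }
}
```

As I understand it, the exchanged table must match the partitioned table's column structure, and local indexes can be carried across with an INCLUDING INDEXES clause — corrections welcome.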