What Happens To Undo Tablespace During DataPump Import And How To Fix It
A request to refresh a specific schema in the test environment came in. This is going to be a breeze, you think.
You prepare the scripts and the environment.
The datapump export of the source system is complete, and the target environment is ready for the import.
You kick off the import. All is good… until you receive an OS alert.
The volume hosting the database data files is running out of space.
This can’t happen, you think. Before you started, you confirmed there is enough free space.
You quickly check what is going on, and with horror you see UNDO tablespace reaching 500GB and growing.
Why is UNDO tablespace growing so big?
Most likely, autoextend was turned on for the UNDO tablespace with a very large maximum size (this is possible with bigfile tablespaces). Otherwise, you would have gotten the following error during the import:
ORA-30036: unable to extend segment by 8 in undo tablespace 'UNDOTBS1'
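To confirm whether autoextend is the culprit, you can check the undo tablespace's data files in the data dictionary. A quick sketch, assuming the undo tablespace is named UNDOTBS1 as in the error above; adjust the name to match your environment:

```sql
-- Check size and autoextend settings for the undo tablespace data files
-- (UNDOTBS1 is an assumption -- use your own undo tablespace name)
select file_name,
       bytes / 1024 / 1024 / 1024    as size_gb,
       autoextensible,
       maxbytes / 1024 / 1024 / 1024 as max_gb
from   dba_data_files
where  tablespace_name = 'UNDOTBS1';
```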
Who is using all the UNDO, and why?
It turns out the undo tablespace is used during a Data Pump import when indexes and primary key constraints are present.
Import datapump will not use the direct_path access method for loading data into a table if indexes are present.
Import datapump will use the undo tablespace if the primary key constraint on the table is enabled.
A-ha, this starts making sense now.
You were importing data into a large table that had its primary key enabled. Import datapump maintains the unique index on the table during the import and needs undo space to do it, so the UNDO tablespace grows.
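If you want to see for yourself which session is consuming the undo while the import runs, a query along these lines (joining v$transaction to v$session) will show it. A sketch; run it as a privileged user while the import is active:

```sql
-- Sessions with active transactions, ordered by the undo blocks they hold;
-- the Data Pump worker session should appear at the top during the import
select s.sid,
       s.serial#,
       s.username,
       s.program,
       t.used_ublk as undo_blocks,
       t.used_urec as undo_records
from   v$transaction t
       join v$session s on s.taddr = t.addr
order  by t.used_ublk desc;
```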
What can you do?
There are 2 scenarios:
1. For small database imports – it doesn’t really matter. If UNDO tablespace grows by 2-10GB, it is not a big deal.
I wouldn’t complicate the import steps; just use import datapump as is. Keep all the indexes on the table and the constraints enabled.
2. For large database imports – it really matters. You don’t want the UNDO tablespace to grow too big and run out of space.
This is exactly what you do:
- prior to running the import, disable the primary key constraints. If you have foreign key constraints that reference these primary keys, disable those as well.
ALTER TABLE ... MODIFY CONSTRAINT ... DISABLE NOVALIDATE;
- import data only, and create the indexes on the tables after the import.
- use the following parameter for impdp:
ACCESS_METHOD=DIRECT_PATH
- after the import completes, enable the primary key constraints (and of course the foreign key constraints, if applicable).
ALTER TABLE ... MODIFY CONSTRAINT ... ENABLE VALIDATE;
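Putting the steps above together, a data-only import could look like this. A sketch only: the connection string, directory, dump file, and schema names are placeholders, and it assumes the tables already exist in the target:

```shell
# Data-only import using the direct path access method
# (DATA_PUMP_DIR, expdp_hr.dmp and HR are placeholders -- adjust to your environment)
impdp system@testdb \
  directory=DATA_PUMP_DIR \
  dumpfile=expdp_hr.dmp \
  schemas=HR \
  content=DATA_ONLY \
  access_method=direct_path
```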
Dbaparadise.com Tip:
Prior to running the import, generate the disable constraint statements, and at the same time the enable constraint statements:
--disable constraints
select 'ALTER TABLE ' || owner || '.' || table_name ||
       ' MODIFY CONSTRAINT ' || constraint_name || ' DISABLE NOVALIDATE;'
from   dba_constraints
where  owner = 'HR'
and    constraint_type = 'P'
order  by owner, table_name, constraint_name;

--enable constraints
select 'ALTER TABLE ' || owner || '.' || table_name ||
       ' MODIFY CONSTRAINT ' || constraint_name || ' ENABLE VALIDATE;'
from   dba_constraints
where  owner = 'HR'
and    constraint_type = 'P'
order  by owner, table_name, constraint_name;
For your next large datapump import, you know: Disable PK constraints prior to the import!