Upgrade Your Database To 19c With Export-Import Datapump
Upgrading the Oracle database to 19c is a very hot topic nowadays! I am not going to give you a speech about why you should do it, I am sure you know that already!
There are many ways to approach a 19c upgrade.
In today’s post, I’ll give you one approach: using export/import datapump. Is this the best method? I would not call any method the best method.
The method that works best for you is the best method for you.
What do I mean by export-import datapump as a method? In very simple terms, you export the data you need from the older version database, and then you import it into a brand new 19c database.
5 Reasons I like this method
- My favorite reason for this method is that you start with a brand new database, usually on a brand new server, rather than upgrading on top of previous versions. Brand new database, brand new data dictionary, fewer bugs and fewer problems.
- My second favorite reason is that the rollback process is very easy and very quick. You just point the application and the users back to the original database and you are back in business. No need to restore from a previous backup, as the source database is left intact.
- You can set up everything ahead of time; the only thing remaining is to bring over the users and the data.
- You can test the upgrade process many times and fine-tune it until you get the desired results. You can also test the upgraded database ahead of time, after your test upgrade runs.
- You can time the upgrade as well, so you know exactly how long your upgrade takes, and how long the outage will be.
The cons of this method
- Something might get missed: a schema, a profile, a scheduler job and so on. Since you only export the users and data you select, anything you forget to include stays behind. However, these mistakes should be discovered during your test runs.
- The duration of this method can be long, depending on the size of the database. With large databases, the time to export the data and then import it can be very long, and that outage might not be acceptable to the business users. Remember: before you start the export, the application must be stopped, and it can only be restarted after the import completes.
Now that we have looked at the pros and cons, let’s see the main steps to upgrade your database:
- Install Oracle 19c on the server where you are going to set up the new database, and install the latest patches at the same time. If you want to know how to do the install and the patching in one line, check it out here.
- If you are going with a new server, the mount points for datafiles, redo logs, archived logs, backups etc. have to be set up with the appropriate size. If the source database is 100GB in size and you allocated only 50GB on the new server, the new database will not fit. Make sure you plan accordingly.
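As a quick sanity check before sizing the new mount points, you can measure how much space the source database currently occupies. A minimal sketch, run on the source database (add redo/archive log and backup space on top of this number):

```sql
-- Total allocated size of datafiles and tempfiles, in GB
SELECT ROUND(SUM(bytes)/1024/1024/1024, 1) AS size_gb
  FROM (SELECT bytes FROM dba_data_files
        UNION ALL
        SELECT bytes FROM dba_temp_files);
```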
- Create the new database with DBCA. Take the following into consideration: the character set and national character set of the source and target databases must match; SGA/PGA size; database edition (Enterprise or Standard Edition 2); installed components (only install what is required); database size.
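To get the character sets right in DBCA, check what the source database uses first. A simple query on the source:

```sql
-- Run on the source database; create the 19c database with matching values
SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');
```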
- In the source database, extract the DDL for the tablespaces that need to be created. Edit the generated script to fit the location and naming convention of the new database. One thing to consider: start small with the tablespace sizes and let the datafiles grow during the import. Because the import effectively “reorganizes” the tables and rebuilds the indexes, you might actually end up with a smaller database than the one you started with.
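One way to extract the tablespace DDL is with DBMS_METADATA. A sketch, assuming the default tablespace names SYSTEM, SYSAUX, TEMP and UNDOTBS1 are the ones you want to skip (adjust the exclusion list to your database):

```sql
-- Make the output readable and re-runnable
SET LONG 1000000 PAGESIZE 0 LINESIZE 200
EXEC DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'SQLTERMINATOR', TRUE);

-- Generate CREATE TABLESPACE statements for the application tablespaces
SELECT DBMS_METADATA.GET_DDL('TABLESPACE', tablespace_name)
  FROM dba_tablespaces
 WHERE tablespace_name NOT IN ('SYSTEM', 'SYSAUX', 'TEMP', 'UNDOTBS1');
```

Spool the output to a file, then edit the file paths and initial sizes before running it on the target.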
- In the source database, extract the DDL for roles, profiles, public database links, and public synonyms for schema objects (excluding SYS, SYSTEM and other Oracle-delivered objects). Also extract the system privileges granted to roles and to users, as these will not come across through the import.
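DBMS_METADATA can generate most of this as well. A minimal sketch, assuming a 12c or later source (older versions have no ORACLE_MAINTAINED column, so filter roles by name instead); the grantee APP_ROLE is a hypothetical example:

```sql
SET LONG 1000000 PAGESIZE 0 LINESIZE 200

-- Custom roles and non-default profiles
SELECT DBMS_METADATA.GET_DDL('ROLE', role)
  FROM dba_roles WHERE oracle_maintained = 'N';
SELECT DBMS_METADATA.GET_DDL('PROFILE', profile)
  FROM (SELECT DISTINCT profile FROM dba_profiles WHERE profile <> 'DEFAULT');

-- System privileges granted to a role or user (APP_ROLE is illustrative)
SELECT 'GRANT ' || privilege || ' TO ' || grantee ||
       CASE WHEN admin_option = 'YES' THEN ' WITH ADMIN OPTION;' ELSE ';' END
  FROM dba_sys_privs
 WHERE grantee = 'APP_ROLE';
```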
- In the target database, create the tablespaces, roles and profiles with the scripts generated in steps 4 and 5.
- In the source database, stop the application and get everyone logged out. It is best to put the database in restricted session mode before you start the export.
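Enabling restricted session and checking for leftover application sessions might look like this (a sketch; how you end the remaining sessions depends on your environment):

```sql
-- Block new logins from users without RESTRICTED SESSION privilege
ALTER SYSTEM ENABLE RESTRICTED SESSION;

-- List application sessions that are still connected
SELECT sid, serial#, username, program
  FROM v$session
 WHERE username IS NOT NULL
   AND username NOT IN ('SYS', 'SYSTEM');
```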
- Export the source database with Data Pump, exporting the users and schemas that you want to migrate to 19c. This is a good opportunity to clean up users that no longer exist or schemas that were decommissioned but never dropped. This step can be time consuming, depending on the size of the database. One recommendation: exclude statistics. The stats will be useless anyway, since you will gather fresh stats in 19c after the import, and not exporting them will save you some time on the export.
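An expdp invocation along these lines could work; everything here is illustrative — SRCDB, the schema list, and DATA_PUMP_DIR (which must exist as a directory object) are assumptions to adapt:

```shell
# Schema-mode export, statistics excluded, parallel workers writing %U-numbered files
expdp system@SRCDB \
  directory=DATA_PUMP_DIR \
  dumpfile=srcdb_exp_%U.dmp \
  logfile=srcdb_exp.log \
  schemas=APP_OWNER,APP_USER \
  exclude=statistics \
  parallel=4
```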
- Copy the export dump file to the target location, or a location that is accessible by the new database.
- Now you are ready to run the import in the new database. To reiterate, the import could take a long time. If the total time to export and import the data from the source database to the target 19c database is too long and unacceptable for the business, then this upgrade method is not for you, and you should upgrade the database using a different method. During the import, if you had users with 10G password versions, those accounts will be locked and expired unless SQLNET.ALLOWED_LOGON_VERSION_SERVER is set to 10 or 11. I talked about this in my previous post.
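The matching impdp run is a mirror image of the export; the names below are the same illustrative assumptions as before:

```shell
# Schema-mode import from the copied dump files into the new 19c database
impdp system@NEWDB19C \
  directory=DATA_PUMP_DIR \
  dumpfile=srcdb_exp_%U.dmp \
  logfile=srcdb_imp.log \
  schemas=APP_OWNER,APP_USER \
  parallel=4
```

Review the import log afterwards; ORA- errors for objects you deliberately excluded can be ignored, but anything unexpected should be investigated before sign-off.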
- Once the import completes, run the scripts to create the public synonyms, public database links, and system grants generated in Step 5.
- Compile invalid objects. This step can be time consuming. There might be objects that are invalid in both the source and the target database; you don’t need to worry about those. There might also be objects that show as valid in the source system but would not actually compile if you tried. Work through this step and make sure that the objects that are supposed to be valid really are valid.
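The standard approach is Oracle’s utlrp.sql recompilation script, followed by a check of what is still invalid (run the same check on the source to identify objects that were already broken there):

```sql
-- Recompile all invalid objects in the database
@?/rdbms/admin/utlrp.sql

-- Whatever remains invalid needs manual attention
SELECT owner, object_type, object_name
  FROM dba_objects
 WHERE status = 'INVALID'
 ORDER BY owner, object_type, object_name;
```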
- In the new target database, gather statistics for the whole database. Again, this step might take some time, depending on the size of the database.
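A minimal sketch using DBMS_STATS; on a fresh database it is also worth gathering dictionary and fixed-object stats:

```sql
-- Gather optimizer statistics for all schemas
EXEC DBMS_STATS.GATHER_DATABASE_STATS;

-- Dictionary and fixed-object stats for the new data dictionary
EXEC DBMS_STATS.GATHER_DICTIONARY_STATS;
EXEC DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;
```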
- Run verification scripts for object counts, grants, roles, users, and confirm that everything you expect in the new target database is there.
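A simple verification query you can run on both databases and diff is an object count per owner and type (APP_OWNER and APP_USER are the same hypothetical schema names used above):

```sql
-- Run on source and target, then compare the two outputs
SELECT owner, object_type, COUNT(*) AS cnt
  FROM dba_objects
 WHERE owner IN ('APP_OWNER', 'APP_USER')
 GROUP BY owner, object_type
 ORDER BY owner, object_type;
```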
- Start up the application (usually done by someone else) and have the users test the application and the database.
That’s it! Of course, you need to run through the steps to fine-tune them and create your own scripts for each step, so you can easily go through the process multiple times, faster each time.
How did you upgrade your database? Leave a comment below, I read every comment!
If you enjoyed this article, and would like to learn more about databases, please sign up to my weekly email, and you will receive my FREE guide: 7 Questions to Ask When Troubleshooting Database Performance Problems!
If you are interested in improving your Oracle Tuning skills, check out my course theultimatesqltuningformula.com