Wonder Time For DBAs – What Are The Results?

February 7, 2019 By dianarobete

Last year, I introduced you to the idea of Wonder Time For DBAs. I did not come up with the Wonder Time idea myself; it was introduced to me by none other than my children.

Kids really know how to wonder. There is a special joy in their eyes when they do it, and if the subject interests them a lot, they wonder with their whole being.

Have you been wondering about anything related to your job?

Let me tell you what I have been wondering lately; maybe my wonders will spark some ideas for you.

My daily wonder is about what questions users or managers could ask me about the databases. I think about this all the time.

I keep asking myself: what if they ask me this, what if they ask me that? Could I answer the question with the data and knowledge I have available?
Could I answer it fast, or would I need to do lots of research and formatting?

I also wonder about how accurate my answers are!

These are some of the questions I came up with over time:

  • how many users are logging into each database every hour? Is there a trend? Has the number of user connections increased or decreased compared to a month ago, 6 months ago, a year ago?
  • what applications (programs) are users using to connect to each database?
  • what users are connecting to each database?
  • in which databases does a specific user exist?
  • have RMAN configurations changed? What were the previous configurations for each database?
  • what was the execution plan for a specific SQL statement, and what was its elapsed time last month?
  • can I look something up in the AWR from last year for a specific database? How about in the Statspack reports? Do I have that information available?
  • what is the status of the Data Guard system, what would you check to confirm everything is working well?

All of the above questions, or “wonders”, pushed me to develop scripts and reports that answer each one.

To answer how many users are logging into a database every hour, I created a report from the audit trail that provides a clear output with the number of successful logins for each hour of the previous day. This report is saved on the server indefinitely, for future use. Another report is sent to my inbox with the unsuccessful logins in each database, for investigation.
Once the initial wonder or question was fulfilled, the answers created new interesting questions, such as: why is an application account configured with the wrong password, and why did nobody notice it until now? Or why is an application account logging into the database 9,000 times each hour?
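
The exact report is my own, but to give you an idea, here is a minimal sketch of the kind of query it could be built on, assuming traditional auditing (AUDIT SESSION) is enabled and DBA_AUDIT_TRAIL is populated (with unified auditing you would query UNIFIED_AUDIT_TRAIL instead):

```sql
-- Minimal sketch, assuming traditional auditing (AUDIT SESSION) is enabled:
-- successful logins per hour for the previous day, from DBA_AUDIT_TRAIL.
SELECT TO_CHAR(timestamp, 'YYYY-MM-DD HH24') AS login_hour,
       COUNT(*)                              AS successful_logins
FROM   dba_audit_trail
WHERE  action_name = 'LOGON'
AND    returncode  = 0
AND    timestamp  >= TRUNC(SYSDATE) - 1
AND    timestamp  <  TRUNC(SYSDATE)
GROUP  BY TO_CHAR(timestamp, 'YYYY-MM-DD HH24')
ORDER  BY 1;

-- Unsuccessful logins over the same window (non-zero return codes,
-- such as 1017 for ORA-01017), for investigation.
SELECT username, os_username, userhost, COUNT(*) AS failed_logins
FROM   dba_audit_trail
WHERE  action_name = 'LOGON'
AND    returncode <> 0
AND    timestamp  >= TRUNC(SYSDATE) - 1
AND    timestamp  <  TRUNC(SYSDATE)
GROUP  BY username, os_username, userhost
ORDER  BY failed_logins DESC;
```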

To answer the question of what applications (programs) users are using to connect to database X, I implemented the listener log mining process, which I described in detail last week! By reviewing the reports from these logs, I uncovered some errors I was not aware of, and solutions for them.
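
The listener log mining itself is covered in last week's post. As a quick complementary check (not the listener-log approach, and only a point-in-time view rather than a history), you can see which programs are connected right now from V$SESSION:

```sql
-- Point-in-time sketch (not the listener-log mining described above):
-- which programs are connected right now, and as which users.
SELECT username,
       program,
       COUNT(*) AS sessions
FROM   v$session
WHERE  type = 'USER'
GROUP  BY username, program
ORDER  BY sessions DESC;
```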

To answer the question of what users are connecting to each database, I expanded the first report on the audit trail to get a listing of the usernames that connect to each database.
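
A minimal sketch of that expansion, using the same DBA_AUDIT_TRAIL source as the hourly login query above:

```sql
-- Minimal sketch: which usernames logged in over the previous day,
-- and how often, from DBA_AUDIT_TRAIL.
SELECT username,
       COUNT(*) AS logins
FROM   dba_audit_trail
WHERE  action_name = 'LOGON'
AND    returncode  = 0
AND    timestamp  >= TRUNC(SYSDATE) - 1
AND    timestamp  <  TRUNC(SYSDATE)
GROUP  BY username
ORDER  BY logins DESC;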

To answer the question of in which databases a specific user exists, I created a script that checks for the username in all the databases in the enterprise and tells you whether the user exists or not. I implemented many flavors of this script, such as: change the password for a user in all the databases, create the user in all the databases, drop the user in all the databases. The important thing here is the automation of the process.
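
The per-database check itself is simple; the value is in the automation that runs it against every database. A minimal sketch of the check, where &&username is a hypothetical SQL*Plus substitution variable:

```sql
-- Minimal sketch of the per-database check; the loop over all databases
-- is handled by the surrounding automation (e.g. a shell script).
-- &&username is a hypothetical substitution variable.
SELECT CASE
         WHEN COUNT(*) > 0 THEN 'EXISTS'
         ELSE 'DOES NOT EXIST'
       END AS user_status
FROM   dba_users
WHERE  username = UPPER('&&username');
```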

To answer the question of whether RMAN configurations have changed, I created a script that expects to find certain values in the RMAN configuration, and if the values are not the same, it sends an email with both the expected value and the current value.
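
One way to sketch the comparison, assuming the expected settings live in a hypothetical table called EXPECTED_RMAN_CONFIG (the email part is left to the surrounding script):

```sql
-- Minimal sketch, assuming a hypothetical table
-- EXPECTED_RMAN_CONFIG(name, expected_value).
-- V$RMAN_CONFIGURATION lists only the non-default RMAN settings;
-- any row returned here differs from what is expected and would
-- trigger the alert email.
SELECT e.name,
       e.expected_value,
       NVL(c.value, '<default>') AS current_value
FROM   expected_rman_config e
LEFT   JOIN v$rman_configuration c
       ON c.name = e.name
WHERE  NVL(c.value, '<default>') <> e.expected_value;
```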

Answering the question of what the execution plan of a SQL statement was last month, or X months in the past, is a bit trickier. I created a process that exports the AWR repository regularly and generates the hourly AWR reports. These reports are kept indefinitely, as you never know how far back in time you will need to check. The AWR repository exports can be imported at any time for analysis. You can do the same thing with Statspack if you are not licensed for AWR.
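
The repository export and import are typically done with the awrextr.sql and awrload.sql scripts under $ORACLE_HOME/rdbms/admin. As a minimal sketch (assuming a Diagnostics Pack license), a text AWR report for the most recent snapshot interval can be generated like this:

```sql
-- Minimal sketch (requires the Diagnostics Pack license): a text AWR
-- report for the most recent snapshot interval of the current instance.
SELECT output
FROM   v$database d,
       v$instance i,
       (SELECT MAX(snap_id) AS end_snap FROM dba_hist_snapshot) s,
       TABLE(DBMS_WORKLOAD_REPOSITORY.AWR_REPORT_TEXT(
               d.dbid, i.instance_number, s.end_snap - 1, s.end_snap));
```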

To answer the question of what the status of the Data Guard system is, and I mean for all the databases that are part of a Data Guard configuration, I created a script that outputs the status, lag time, and gap (if any) for the databases involved in a Data Guard configuration. This script can also be used after an outage to confirm everything came back up correctly.
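
A minimal sketch of the kind of check involved, run on a standby (the full script also covers status and gap detection across all the configurations):

```sql
-- Minimal sketch, run on a standby: database role plus transport and
-- apply lag from V$DATAGUARD_STATS. V$ARCHIVE_GAP (not shown) would
-- report any missing archived-log sequence range.
SELECT d.db_unique_name,
       d.database_role,
       s.name,
       s.value
FROM   v$database d
CROSS  JOIN v$dataguard_stats s
WHERE  s.name IN ('transport lag', 'apply lag');
```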

Do any of these scripts or reports sound familiar?

What are your wonders and the results of your wonders?

Leave a comment below, I read every single comment!

If you enjoyed this article, and would like to learn more about databases, please sign up below, and you will receive
The Ultimate 3 Step Guide To Find The Root Cause Of The Slow Running SQL!

–Diana