DBAzine.com

10G Data Warehouses

It has been a long process, but we have successfully converted our first production warehouse to Oracle10G. The database has been running for over a week now with no problems and is performing better than we thought it would. In this first post-turnover blog, I'll cover some of the testing strategies we used to ensure a successful migration to Oracle10G. In part 2, I'll provide you with a laundry list of helpful tips and techniques you can use to make any database upgrade easier and less error-prone.

I wanted to wait for at least a week after converting our first warehouse database to discuss the conversion process.  Now that we have 10 days of processing complete, it looks like our migration from 9i to 10G was a success.  As I stated in previous blogs, the PL/SQL programs run much faster in 10G than they do in 9i.  In addition, we have had no problems (knock on a DBA's head) since the database was converted.

I just reviewed the database's alert log and associated trace files (for the thirtieth time) and found nothing that would cause me to believe that the database is having problems.   In addition, we have been using Oracle 10G Enterprise Manager to administer and monitor the newly converted database and both the tool and the database have been performing flawlessly.

In upcoming blogs, I'll cover how we have configured 10G Enterprise Manager to monitor and administer our 10G test and production databases.  I'll also describe how we use the tool for performance monitoring and tuning.  Performance monitoring and tuning using 10G EM is so drastically different from 9i OEM that it can only be described as a "paradigm shift".

I hate to use such a trite expression, but I think that's the best way to describe it.   9i OEM's tools primarily focused on database statistics (buffer hit ratios, file stats, events and waits).  Workload tuning (SQL, PL/SQL) tools were available but they didn't seem to be the primary focus (compared to 10G EM). 

10G EM continues to provide database statistical information, but it also provides numerous tools to monitor and evaluate database workloads and the effect they have on database performance.  Administrators are now able to easily view all of the programs and SQL that are running (or have run) in the Oracle database and determine how well they are performing.  If they aren't performing well, 10G EM quickly provides the reasons why.  This new release of EM is so much better at monitoring database workload performance that it is worthwhile for me to spend some time describing how we use it.
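Much of that workload visibility comes from the new Active Session History view that 10G samples automatically.  As a rough sketch (this is my own illustrative query, not something lifted from EM itself), you can get a quick "who's burning the most time" picture straight from SQL*Plus:

```sql
-- Sketch: rank recent SQL by active-session samples (10G and later).
-- v$active_session_history records one sample per active session per second,
-- so the sample count approximates seconds of active database time.
SELECT sql_id,
       COUNT(*) AS samples,
       SUM(CASE WHEN session_state = 'WAITING' THEN 1 ELSE 0 END) AS wait_samples
FROM   v$active_session_history
WHERE  sample_time > SYSDATE - 1/24   -- last hour
GROUP  BY sql_id
ORDER  BY samples DESC;
```

EM presents the same kind of data graphically, which is exactly why the tool feels so different from its 9i predecessor.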

But let's get back to the topic at hand.   I thought I would provide you with some of the reasons why I think our conversion went off without a hitch.  Some DBAs are smart, some are lucky and the rest of us have to be thorough.    Since I am neither lucky nor super-smart, I absolutely have to be thorough to be successful in my profession. 

Let's face it: the database upgrade is one of the tasks that keeps many DBAs awake at night.  Most point upgrades (e.g., 8.1.6 to 8.1.7) are relatively simple; the DBA runs a few SQL scripts to upgrade the database to the new release.  But upgrades from one Oracle version to the next, like our 9i to 10G migration, are major projects that require a well thought-out, documented strategy to ensure success.
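For those who haven't been through one, the manual 9i-to-10G path boils down to a handful of supplied scripts (a sketch only; the script names below are from the 10G Release 1 upgrade documentation, and your paths and release may differ, so check the Upgrade Guide before running anything):

```sql
-- 1. On the 9i database: run the pre-upgrade information tool and fix
--    anything it flags (tablespace sizes, init.ora parameters, etc.).
@?/rdbms/admin/utlu101i.sql

-- 2. From the new 10G home: start the database in upgrade mode and run
--    the catalog upgrade script.
STARTUP UPGRADE
@?/rdbms/admin/catupgrd.sql

-- 3. Recompile any objects invalidated by the upgrade.
@?/rdbms/admin/utlrp.sql
```

The scripts are the easy part; it's everything around them that the rest of this post is about.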

My primary piece of advice is to be thorough.  Take a look at the test plans below and you'll see that we tried to test everything we could.  The end result was that the new database went in smoothly and everyone, including management, is happy.  The database engine plays such a strategic role in organizations that upgrade projects are often highly visible to upper management.  There's nothing like an upgrade gone bad to tarnish a reputation you have spent years building.
 
OEM Testing
Because 10G Enterprise Manager is so radically different from its 9i counterpart, I felt that it required a separate testing plan.  So, I created a 10G EM testing plan that covered virtually every feature the new tool provides.  Sound like overkill to you?  Not to me.  I am now confident that we will be able to administer the databases without a problem.
 
RMAN Testing
If you have read my past blogs, you know that I was also highly concerned about the new release of RMAN.  Hey, that tool scares me to begin with, so I wasn't going to rest easy until I created an RMAN test plan that covered every RMAN command that we use here at Giant Eagle.  The test plan didn't take long to set up and execute, so it was well worth the effort.
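To give a flavor of what such a plan exercises, an excerpt might look like the following (a hypothetical sketch, not our actual plan; the tag name is made up for illustration):

```
# Back up the database and archived logs under a recognizable tag.
RUN {
  BACKUP DATABASE PLUS ARCHIVELOG TAG 'upgrade_test';
}

# Verify the backup is restorable without actually restoring anything.
RESTORE DATABASE VALIDATE;

# Confirm the retention policy sees the new backups.
REPORT NEED BACKUP;
LIST BACKUP SUMMARY;
```

The point isn't any one command; it's walking every command you actually depend on through the new release before production does it for you.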
 
Base Functionality Testing
Our base functionality test plan included: startup/shutdown, remote access, utility execution (SQL*Loader, Export/Import), interoperability with the UNIX and Linux operating systems, querying the data dictionary, backup/recovery, and basic DML and DDL statement execution.  Please note that since we had been testing Oracle10G in our labs for some time, most of this was just a quick review to ensure that everything continued to work as it had in the past.
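A couple of the quick post-conversion sanity checks from that review are worth showing (a sketch of the kind of queries we ran, not the full checklist):

```sql
-- Every component registered in the data dictionary should show VALID
-- after the catalog upgrade completes.
SELECT comp_name, version, status
FROM   dba_registry;

-- There should be no unexpected invalid objects left behind.
SELECT owner, object_type, COUNT(*) AS invalid_count
FROM   dba_objects
WHERE  status = 'INVALID'
GROUP  BY owner, object_type;
```

Two queries, thirty seconds, and a lot of peace of mind.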
 
Application Testing
Our application team counterparts created their own formal 10G test plans. Some of the components they tested are provided below:

  • Business rules enforced by constraints, triggers and procedures
  • Data feeds from other architectures
  • Load utility processing
  • Nightly batch processing
  • Online transaction processing
  • Development tools used to build front-end application screens
  • Third-party tools used to provide ad-hoc query access
  • Procedural language used to build business logic (Java, PL/SQL)
  • Advanced features required (replication, advanced queuing, partitioning, parallelism)

Although application functionality testing isn't intended to identify performance problems, general run times were evaluated.  If something ran twice as fast or twice as slow as we expected, we intended to perform an in-depth analysis.  Luckily for us, everything ran the same or faster in Oracle10G.
 
Performance Testing
Performance testing allowed us to compare the performance of various SQL statements in the new Oracle10G database with the statements' performance in the old release (in this case Oracle Version 9.2).  We ran selected online transactions and batch jobs and compared their execution times and performance statistics. 
 
We didn't have to run traditional stress tests.  Like many data warehouses, this database's pressing issue is not a high number of concurrent users; we are concerned with loading lots of data and with SQL statements that scan millions and millions of rows.  If our test database could survive our nightly batch processing job stream, we knew that 10G could handle virtually everything we threw at it.
 
We have numerous Oracle-supplied and third-party toolsets available.  Since 10G and 9i have different versions of Oracle Enterprise Manager, we didn't use them during our performance evaluation.  Instead, we used the basic toolsets at our disposal.
 
We began by comparing the elapsed execution times of both workloads.  We used explain plans to compare access paths and STATSPACK to generate statistical performance reports.  Yes, even though 10G has all kinds of high-tech tuning gadgets, we used the old tried-and-true STATSPACK tool to generate our performance statistics.  We also used SQL Trace and TKPROF when more in-depth analysis was warranted.  These are the same tools I have been using since Scott's tiger was a cub.
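Run on each release, the comparison workflow looks roughly like this (a sketch; the table and statement here are hypothetical stand-ins for our warehouse SQL):

```sql
-- Compare access paths: explain the statement and display the plan.
EXPLAIN PLAN FOR
  SELECT COUNT(*)
  FROM   sales_fact
  WHERE  sale_date > SYSDATE - 30;

SELECT * FROM TABLE(dbms_xplan.display);

-- Bracket the batch run with STATSPACK snapshots, then report on the
-- interval between them.
EXECUTE statspack.snap;
-- ... run the workload ...
EXECUTE statspack.snap;
@?/rdbms/admin/spreport.sql

-- When deeper analysis is warranted, trace the session, then format the
-- trace file from the OS prompt, e.g.:
--   tkprof <tracefile> report.txt sort=prsela,exeela,fchela
ALTER SESSION SET sql_trace = TRUE;
```

Diff the plans and the STATSPACK reports between 9i and 10G, and the statements that deserve attention surface quickly.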
 
Our UNIX team was also involved.  They used nmon, vmstat, iostat, and various third-party toolsets to measure the overall load on the system during each of the test runs.
    
Good Testing Documentation is Essential
Creating new release testing checklists is a task that is often considered to be boring and mundane. Most DBAs would rather perform virtually any other activity than sit in front of a screen using a word processor. As a result, creating testing documentation is often postponed until the last minute. 

But a robust and detailed documentation library is the key to success.  The documents we created for our first production migration can now be used for future migration processes.  The testing checklists ensured that nothing was missed and were essential to the testing process.

Next up
In my next blog, I'll provide you with a laundry list of helpful tips and techniques you can use to make any database upgrade easier and less error-prone.


Friday, March 25, 2005