Over the last few days I've been playing with the export and import of a 750 GB database. During this whole trial-and-error process I learned several things that were unfamiliar to me. In this blog post I want to share some of the stuff that I learnt.
I have an Oracle 10.2.0.4 database on Solaris 64-bit which I want to migrate to Oracle 18.104.22.168 on Linux 64-bit. It is a 750 GB database consisting mostly of partitioned tables (with compressed partitions).
Let's get to the subject of the post.
I won't talk much about the export operation - just to mention (more...)
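For context, a full Data Pump export of a database this size might look something like the sketch below. The directory path, parallelism, and file size are my own illustrative choices, not values from the post:

```shell
# Create a directory object pointing at the dump location (run as a DBA).
# The path /backup/export is an assumption for illustration.
sqlplus / as sysdba <<EOF
CREATE OR REPLACE DIRECTORY exp_dir AS '/backup/export';
EOF

# Full export, split into multiple dump files so no single file grows too large;
# PARALLEL speeds things up considerably on a multi-CPU server.
expdp system FULL=Y DIRECTORY=exp_dir DUMPFILE=full_%U.dmp \
      LOGFILE=full_exp.log PARALLEL=4 FILESIZE=20G
```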
If you are running 10.2.0.2 in your production environment, check document [ID 4604970.8] on the Oracle Support site.
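To confirm which version and patch level you are actually running before checking the note, you can query the registry views (a generic check, not specific to the document above):

```sql
-- Component versions registered in the database
SELECT comp_name, version, status
  FROM dba_registry;

-- History of applied patch sets and upgrades, if any
SELECT action, version, comments
  FROM dba_registry_history;
```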
This week a colleague noticed that a query he wanted to use for some reports returned different results than he expected. He made some additional checks using Microsoft Excel and confirmed that something was wrong - he was getting incorrect results.
My first suspicion was that something was wrong with the query, or maybe that logical corruption had happened.
Example of the queries (I've changed the names of the tables/columns):
-- partitioned_table (date range partitioned table)
create table temp_tab_2011 as
select col, date_to, sum (col1) (more...)
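A complete CREATE TABLE ... AS SELECT of this shape would look roughly as follows. The column names are the anonymized ones from the post; the WHERE and GROUP BY clauses are my assumption about how the truncated query continues:

```sql
-- Hypothetical completion of the truncated CTAS above
CREATE TABLE temp_tab_2011 AS
  SELECT col, date_to, SUM (col1) AS col1_sum
    FROM partitioned_table
   WHERE date_to >= DATE '2011-01-01'
   GROUP BY col, date_to;
```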
I’ve decided to perform this test case and publish blog post after reading this topic on OTN forums - http://bit.ly/mLwaA9
How do you recover a lost datafile when you don't have a valid backup of your database? This works if your database is running in archivelog mode and you have all the necessary archive logs.
If you don't have a valid backup and your database is running in noarchivelog mode, your datafile is lost forever.
So take regular backups of your database and enable archivelog mode ;-)
Environment for this test:
OS: OEL 5
DB: Oracle EE 10.2.0.4
To check am (more...)
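The standard technique here is to recreate an empty datafile and roll it forward from the archive logs. A minimal sketch - the datafile path is illustrative, not the one from the test case:

```sql
-- Take the lost datafile offline so the database can be opened
ALTER DATABASE DATAFILE '/u01/oradata/testdb/users01.dbf' OFFLINE;
ALTER DATABASE OPEN;

-- Recreate an empty datafile with the same name; this works only because
-- all redo generated since the file was created is available in the archive logs
ALTER DATABASE CREATE DATAFILE '/u01/oradata/testdb/users01.dbf';

-- Apply archived redo to bring the new file up to date, then put it online
RECOVER DATAFILE '/u01/oradata/testdb/users01.dbf';
ALTER DATABASE DATAFILE '/u01/oradata/testdb/users01.dbf' ONLINE;
```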
We had serious problems with the Oracle TNS Listener over the last few days. The problems were happening on a very critical system - our most active OLTP database.
Every few hours or minutes, at irregular intervals, the TNS listener suddenly crashed, which forced our Red Hat Cluster Manager to restart the instance and even switch the database to another node. Business systems that relied on this database were experiencing great difficulties.
It is worth mentioning that everything had worked without any problems for about two years, without any intervention in the OS or database configuration.
OS: Red Hat Enterprise Linux ES release 4 (64bit)
DB: Oracle EE 10.2. (more...)
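When the listener dies like this, the first places to look are its status and its log. These are generic diagnostic commands, not the actual fix from the post:

```shell
# Is the listener up, and which services has it registered?
lsnrctl status

# The listener log usually records the error that preceded the crash
tail -100 $ORACLE_HOME/network/log/listener.log
```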
I've received the following error while trying to perform a full database export using Data Pump to NFS:
Export: Release 10.2.0.4.0 - Production on Tuesday, 31 May, 2011 11:27:50
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 10g Release 10.2.0.4.0 - Production
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "/backup-nfs/testdb/export/exportfull.dmp"
ORA-27086: unable to lock file - already in use
Linux Error: 37: No locks available
Additional information: 10
In this case the problem was in the NFS locking daemons - on most Unix-like (more...)
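On Red Hat style systems, the usual suspects are the portmap and nfslock services on the NFS client; as a workaround you can also mount the share with the nolock option. Illustrative commands, assuming root on the client:

```shell
# Check whether the lock daemons are running on the NFS client
service portmap status
service nfslock status

# Start the lock daemon if it is down
service nfslock start

# Alternatively, remount with file locking disabled - safe only if
# nothing else accesses the same dump files concurrently
mount -o remount,nolock /backup-nfs
```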
We are currently in the process of replacing our old database servers with new ones. During the evaluation process we're checking various technical solutions to find the best option in terms of price/performance.
For me, real-life experience is the most important thing when we are talking about database server performance, so I posted a question on the Oracle-L list to get some answers about the servers in question and their performance.
DBAs from the Oracle-L list helped me a lot. Thanks guys!
In reply to my question, David Robillard posted an answer that is worth publishing on this blog. It is a nicely summarized project checklist that can (more...)
A colleague reported to me that the execution of his query was taking too long. After a quick look it was obvious that this was an ideal tuning candidate.
I will describe my tuning process in this blog post.
Table and column names are changed.
DB: Oracle 10.2.0.2.0
OS: Solaris 9 64-bit
SUM (poz_traj) / 60,2)
SUM (amount) iznos,
COUNT (*) pozivi
FROM usr.ugov@remote_db co,
WHERE 1 = 1
AND co.ugov_id = ci.ugov_id
AND ci.ugov_item_id = np.ugov_item_id
AND NP.kol_id (more...)
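With a candidate like this, my usual first step is to look at the execution plan. A generic sketch - the actual query text is abbreviated above, so a trivial statement stands in for it here:

```sql
-- Generate the plan for the slow statement
EXPLAIN PLAN FOR
  SELECT /* the slow report query goes here */ COUNT (*) FROM dual;

-- Display it in readable form
SELECT * FROM TABLE (DBMS_XPLAN.DISPLAY);
```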
You can use Oracle Grid Control to check for policy violations and keep compliance with company security or configuration standards. Grid Control comes with a pre-installed set of policies and best-practice recommendations for databases.
With that in mind, I checked the Critical Policy Violations for my database and noticed these critical violations:
Execute Privileges on DBMS_JOB to PUBLIC
Restricted Privilege to Execute UTL_TCP
Restricted Privilege to Execute UTL_HTTP
Restricted Privilege to Execute UTL_SMTP
Execute Privileges on DBMS_LOB to PUBLIC
Execute Privileges on UTL_FILE To PUBLIC
It is recommended to remove excessive privileges from some users to prevent possible attacks for (more...)
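Fixing the violations listed above typically comes down to revoking the EXECUTE privilege from PUBLIC, after verifying that no application depends on those grants:

```sql
-- Verify first that no application relies on the PUBLIC grants!
REVOKE EXECUTE ON utl_tcp  FROM PUBLIC;
REVOKE EXECUTE ON utl_http FROM PUBLIC;
REVOKE EXECUTE ON utl_smtp FROM PUBLIC;
REVOKE EXECUTE ON dbms_job FROM PUBLIC;
REVOKE EXECUTE ON dbms_lob FROM PUBLIC;
REVOKE EXECUTE ON utl_file FROM PUBLIC;
```

If a specific schema still needs one of these packages, grant EXECUTE to that user or role directly instead of leaving the PUBLIC grant in place.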