Loading “Views” from One Database to Another

Hi all,

This seems to be a simple request, right? Let’s go over the regular cases / simple answers:

  • Use DBMS_METADATA.GET_DDL to get the view code: DBMS_METADATA.get_ddl('VIEW', <VIEW_NAME>, <OWNER>)
  • Use Datapump with INCLUDE=VIEW, or even specifying the list: INCLUDE=VIEW:\"IN (\'VW_EXAMPLE1\', \'VW_EXAMPLE2\')\", or part of the name: INCLUDE=VIEW:\"LIKE \'VW_%EXAMPLE%\'\" (see the example right after this list).
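For reference, a minimal sketch of the first option. The view name VW_EXAMPLE1 comes from above; the owner APP_OWNER is just a placeholder:

set long 100000 pagesize 0
select dbms_metadata.get_ddl('VIEW', 'VW_EXAMPLE1', 'APP_OWNER') from dual;

For the Datapump INCLUDE syntax, the quotes are easier to handle in a parfile than with shell escaping; a full example is shown near the end of this post.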

However, when supporting a client with this need, I noticed it has some caveats. The options above load only the view definition, not the “content” of the view from an application perspective.

And here let’s make it clear: a view doesn’t hold any data; instead, it stores a query that retrieves data from regular tables. Materialized views do have underlying tables as a support mechanism, but in essence views are transitory constructs built from the actual database tables. Now, how do we load views into another database that doesn’t have the source tables, including the underlying data?
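For instance (hypothetical object names), the view below stores only the query; the rows themselves still live in TAB_A and TAB_B:

create view vw_example1 as
  select a.id, b.descr
    from tab_a a
    join tab_b b
      on b.id = a.id;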

This question has 2 answers:

  • Exporting the views as tables.

Broadly speaking, I’d do it manually: create tables from the views and then export them. This means I’d need enough space to create those tables, though, which can be a lot.
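A sketch of that manual approach, reusing the hypothetical VW_EXAMPLE1 from above:

-- Materialize the view's result set into a regular staging table:
create table vw_example1_tab as
  select * from vw_example1;
-- Export VW_EXAMPLE1_TAB with expdp/exp, then clean up:
drop table vw_example1_tab purge;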

But then I see we have the VIEWS_AS_TABLES clause in Datapump, which does exactly that. As a reference: https://docs.oracle.com/database/121/SUTIL/GUID-E4E45E81-5391-43BE-B27D-B763EF79A885.htm#SUTIL3904
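It looks roughly like this (available from 12c onwards; schema and view names are placeholders):

expdp \"/ as sysdba\" directory=DATA_PUMP_DIR dumpfile=views_as_tables.dmp logfile=vat.log views_as_tables=app_owner.vw_example1,app_owner.vw_example2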

However, in this case the import will bring in the views as tables, not as views. That theoretically resolves the issue from a data perspective, but it may not be what is required: what if I need the views as views on the destination database?

Well, in that case we need to load all the underlying tables too. How to find them? The hard way is opening every view’s code and listing the referenced tables by hand. The smart way is the second option:

  • Listing the tables referenced by Views from dba_dependencies and exporting them.

select owner as schema_name,
       name as view_name,
       referenced_owner as referenced_schema_name,
       referenced_name,
       referenced_type
  from sys.dba_dependencies
 where type = 'VIEW'
   -- excluding some Oracle-maintained schemas
   and owner not in ('ANONYMOUS','CTXSYS','DBSNMP','EXFSYS','LBACSYS',
       'MDSYS','MGMT_VIEW','OLAPSYS','OWBSYS','ORDPLUGINS','ORDSYS','OUTLN',
       'SI_INFORMTN_SCHEMA','SYS','SYSMAN','SYSTEM','TSMSYS','WK_TEST','WKSYS',
       'WKPROXY','WMSYS','XDB','APEX_040000','APEX_PUBLIC_USER','DIP',
       'FLOWS_30000','FLOWS_FILES','MDDATA','ORACLE_OCM','XS$NULL',
       'SPATIAL_CSW_ADMIN_USR','SPATIAL_WFS_ADMIN_USR','PUBLIC')
 order by owner, name, referenced_name, referenced_owner, referenced_type;

Aaaand, that’s what I actually needed. With the list in place, it’s just a matter of exporting with Datapump, including the tables from the list plus the views themselves (as mentioned above).
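For instance, using a parfile (hypothetical object names; inside a parfile the quotes don’t need shell escaping, and INCLUDE may appear more than once):

# views_and_tables.par
directory=DATA_PUMP_DIR
dumpfile=views_and_tables.dmp
logfile=views_and_tables.log
schemas=APP_OWNER
include=VIEW:"IN ('VW_EXAMPLE1', 'VW_EXAMPLE2')"
include=TABLE:"IN ('TAB_A', 'TAB_B')"

# Then run:
expdp \"/ as sysdba\" parfile=views_and_tables.par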

I hope it helps!

Export/Backup directly to Zip using MKNOD!

We’ve all faced that situation where we have to make a logical backup/export and don’t have enough space for it, right?
We know exports usually compress a lot with zip/gzip… Wouldn’t it be great if we could export directly to a compressed file?

This situation became much more common with Datapump, which requires a directory accessible by the database server. If you have no way to add a mount point or any other staging area, this can help…

## BKP with MKNOD
BKP_DEST=/just/example
DATE=`date +%Y%m%d%H%M`
cd $BKP_DEST
# Create a named pipe (FIFO): data written to it is streamed straight to the
# reader, so the uncompressed dump file is never stored on disk
mknod bkp_$DATE.dmp p
# Start gzip in the background, reading from the pipe and writing the .gz file
gzip < bkp_$DATE.dmp > bkp_$DATE.dmp.gz &
### Uncomment and adjust one of:
## MySQL:
#mysqldump -u $user -p$password $database > bkp_$DATE.dmp
## Oracle (Datapump or EXP)
## (DIRECTORY_EXAMPLE must point to $BKP_DEST, where the pipe was created)
expdp \"/ as sysdba\" dumpfile=bkp_$DATE.dmp full=y directory=DIRECTORY_EXAMPLE logfile=log_bkpzipped.log
#expdp $user/$password dumpfile=bkp_$DATE.dmp full=y directory=DIRECTORY_EXAMPLE logfile=log_bkpzipped.log
#exp \"/ as sysdba\" file=bkp_$DATE.dmp log=log_bkpzipped.log compress=y [tables=owner.table,..] [owner=schema1,..] [...]

Hugs!
Matheus.