Last week, a new Oracle release dropped. Instead of an entirely new release, it is an update of the 23c build with a few highly anticipated changes, like vector support and updates to JSON duality views. Instead of the "c" suffix referring to cloud, Oracle now markets it as 23ai. Currently, it is available in the cloud, and you can run the "FREE" release on your own hardware.
As I want to try out the new features and need to look into the JSON duality view updates in preparation for my KScope24 talk on them, I wanted to upgrade my 23c FREE instance. I hoped the upgrade would be a breeze, with me simply reusing the datafiles and starting the new image from them, but this is not the case. Instead, we have to migrate all the data into a fresh database instance. So let's get started.
## Stop DB writes

As backing everything up will take some time, we need to stop users from changing data. Since only an ORDS instance connects users to the data in my setup, I simply stopped it. You might want to disconnect other users or disable writes in another way.
```shell
docker stop oracle-23c-free-ords
```
## Export data from the old 23c

### Create an Oracle directory

As I want to export the data with Data Pump, I need a directory object pointing to where the dumps get written.
```shell
# ssh into the docker database container
# my container is named "oracle-23c-free"; you might need to change this
docker exec -u oracle -it oracle-23c-free bash

cd /opt/oracle/oradata
mkdir datapump
exit

# Connect to FREEPDB1 as sysdba
sql sys/▋▋▋▋▋▋▋@23c.phartenfeller.de:15211/FREEPDB1 as sysdba

# Create a new directory to store datapump exports
SQL> CREATE OR REPLACE DIRECTORY dp_dir AS '/opt/oracle/oradata/datapump';
# Directory created.

# Repeat this for every schema you want to export
SQL> GRANT READ, WRITE ON DIRECTORY dp_dir TO apex_off_grid;

SQL> exit
```
### Export data

To back up the schema data, we will use SQLcl and run a Data Pump export. Note that the dumps will be written to the database directory, not to the machine that runs SQLcl.
```shell
# connect to a database user/schema you want to export
sql apex_off_grid/▋▋▋▋▋▋▋@23c.phartenfeller.de:15211/FREEPDB1

SQL> datapump export -
  -schemas apex_off_grid -
  -directory DP_DIR -
  -dumpdirectory DP_DIR -
  -dumpfile apex_off_grid.dmp -
  -logfile apex_off_grid.log -
  -version latest
# I needed to run this two times as the first time I got this strange error:
## Value DATA_PUMP_DIR is not valid for Parameter directory
## DataPump: Unable to process request. Verify specified options:
```
Repeat this step for every database schema you want to back up.
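With several schemas, you can also generate the export command blocks instead of typing them by hand. A sketch, assuming a hypothetical list of schema names:

```shell
# Hypothetical list of schemas to export -- replace with your own
schemas="apex_off_grid offl_recipes"

# Emit one SQLcl datapump export command block per schema
out=export_all.txt
: > "$out"
for s in $schemas; do
  cat >> "$out" <<EOF
datapump export -
  -schemas $s -
  -directory DP_DIR -
  -dumpdirectory DP_DIR -
  -dumpfile $s.dmp -
  -logfile $s.log -
  -version latest
EOF
done
cat "$out"
```

Paste each generated block into a SQLcl session connected as the respective schema.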
## Export APEX Workspaces and Apps

This time, SQLcl will export all the workspace and app scripts to the machine where you run SQLcl. So I changed my current directory to a new folder to have all the files in one place.
```shell
# create a folder to store workspace and app files
mkdir apex_exports
cd apex_exports

sql apex_off_grid/▋▋▋▋▋▋▋@23c.phartenfeller.de:15211/FREEPDB1

SQL> select workspace_display_name, workspace_id from apex_workspaces;
# WORKSPACE_DISPLAY_NAME    WORKSPACE_ID
# _________________________ ____________________
# APEX_OFF_GRID             14850141765267480

SQL> apex export-workspace -woi 14850141765267480
# Exporting Workspace 14850141765267480:'APEX_OFF_GRID'
# File w14850141765267480.sql created

SQL> apex export-all-applications -woi 14850141765267480
# Exporting Workspace APEX_OFF_GRID - application 100:Recipes (Blueprint)
# File f100.sql created
# Exporting Workspace APEX_OFF_GRID - application 101:Changes
# File f101.sql created
# ...
```
## Start a new 23ai database container

As I am using docker-compose, I created a new entry for the new database container. I changed the port so I can temporarily run the old and new containers next to each other.
```yaml
oracle-23ai-free:
  container_name: oracle-23ai-free
  image: container-registry.oracle.com/database/free:23.4.0.0
  ports:
    - 95212:1521
  environment:
    - ORACLE_PWD=**SetTheSamePwHere**
    - TZ=Europe/Berlin
  volumes:
    - ~/dbs/oracle-23ai-free/oradata:/opt/oracle/oradata
  hostname: orcl-23ai-free
```
Now we can start the new database:
```shell
docker-compose up -d oracle-23ai-free
```
I had some issues with file permissions. Even though the container created the oradata folder, the DB init failed with the following error message:
```
[2024-05-05 16:30:40.671 CEST] Prepare for db operation
[2024-05-05 16:30:40.713 CEST] Cannot create directory "/opt/oracle/oradata/FREE".
DBCA_PROGRESS : 7%
DBCA_PROGRESS : 100%
[2024-05-05 16:30:40.718 CEST] [FATAL] Prepare Operation has failed.
DBCA_PROGRESS : 0%
```
So I lazily ran `sudo chmod 777 ./oradata`, dropped the container, and created a new one; now it worked.

You can follow the DB initialization process with `docker logs oracle-23ai-free --follow`. This can take some time, depending on your hardware.
This is what the log should look like when the DB is ready:
```
Prepare for db operation
7% complete
Copying database files
29% complete
Creating and starting Oracle instance
30% complete
33% complete
36% complete
39% complete
43% complete
Completing Database Creation
47% complete
49% complete
50% complete
Creating Pluggable Databases
54% complete
71% complete
Executing Post Configuration Actions
93% complete
Running Custom Scripts
100% complete
Database creation complete. For details check the logfiles at:
 /opt/oracle/cfgtoollogs/dbca/FREE.
Database Information:
Global Database Name:FREE
System Identifier(SID):FREE
Look at the log file "/opt/oracle/cfgtoollogs/dbca/FREE/FREE.log" for further details.

Connect to Oracle Database using one of the connect strings:
     Pluggable database: orcl-23ai-free/FREEPDB1
     Multitenant container database: orcl-23ai-free

SQL*Plus: Release 23.0.0.0.0 - Production on Sun May 5 16:41:16 2024
Version 23.4.0.24.05

Copyright (c) 1982, 2024, Oracle. All rights reserved.

Connected to:
Oracle Database 23ai Free Release 23.0.0.0.0 - Develop, Learn, and Run for Free
Version 23.4.0.24.05

SQL>
System altered.

SQL>
Pluggable database altered.

SQL>
PL/SQL procedure successfully completed.

SQL> SQL>
Session altered.

SQL>
User created.

SQL>
Grant succeeded.

SQL>
Grant succeeded.

SQL>
Grant succeeded.

SQL>
User altered.

SQL> SQL> Disconnected from Oracle Database 23ai Free Release 23.0.0.0.0 - Develop, Learn, and Run for Free
Version 23.4.0.24.05
The Oracle base remains unchanged with value /opt/oracle

Executing user defined scripts
/opt/oracle/runUserScripts.sh: ignoring /opt/oracle/scripts/extensions/setup/*

DONE: Executing user defined scripts

The Oracle base remains unchanged with value /opt/oracle
#########################
DATABASE IS READY TO USE!
#########################
```
## Connect to the new DB

You can now connect to the new DB and should be greeted by the following message:
```shell
sql sys/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1 as sysdba

# Connected to:
# Oracle Database 23ai Free Release 23.0.0.0.0 - Develop, Learn, and Run for Free
# Version 23.4.0.24.05
```
## Import data into the new 23ai DB

### Create a datapump dir in the new DB

To import our Data Pump dumps, they need to be accessible to the database inside a directory object.
```shell
# ssh into the docker database container
docker exec -u oracle -it oracle-23ai-free bash

cd /opt/oracle/oradata
mkdir datapump
exit

sql sys/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1 as sysdba

# Create a new directory to read datapump dumps from
SQL> CREATE OR REPLACE DIRECTORY dp_dir AS '/opt/oracle/oradata/datapump';
# Directory created.

# Repeat this for every schema you want to import
SQL> GRANT READ, WRITE ON DIRECTORY dp_dir TO apex_off_grid;

SQL> exit
```
### Copy datapump dumps to the 23ai directory

```shell
sudo cp -r ~/dbs/orcl-23c-free/oradata/datapump ~/dbs/oracle-23c-bkp
sudo chmod 777 -R ~/dbs/oracle-23c-bkp/datapump
sudo chown -R myuser:myuser ~/dbs/oracle-23c-bkp/datapump

docker cp ~/dbs/oracle-23c-bkp/datapump/apex_off_grid.dmp oracle-23ai-free:/opt/oracle/oradata/datapump
# repeat for every *.dmp file you have
```
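Instead of repeating the `docker cp` by hand, you can generate one command per dump file. A sketch that writes the commands into a script; the demo directory and files below are stand-ins for your actual backup folder:

```shell
# Stand-in for ~/dbs/oracle-23c-bkp/datapump -- adjust to your setup
src=./datapump-demo
mkdir -p "$src"
touch "$src/apex_off_grid.dmp" "$src/offl_recipes.dmp"   # demo files

# Emit a `docker cp` command for every *.dmp in the backup folder
for f in "$src"/*.dmp; do
  echo "docker cp $f oracle-23ai-free:/opt/oracle/oradata/datapump"
done > copy_dumps.sh
cat copy_dumps.sh
```

Review `copy_dumps.sh` and then run it with `sh copy_dumps.sh`.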
### Recreate DB users

This is my typical create user/schema script, which I ran for every schema I wanted to import. I reused all my old passwords for convenience.
```sql
create tablespace tbs_apex_off_grid
  datafile 'tbs_apex_off_grid.dat'
  size 10M
  reuse
  autoextend on next 2M;

create user apex_off_grid
  identified by ##secret_password##
  default tablespace tbs_apex_off_grid
;

grant create session to apex_off_grid;
grant create table to apex_off_grid;
grant create view to apex_off_grid;
grant create any trigger to apex_off_grid;
grant create any procedure to apex_off_grid;
grant create sequence to apex_off_grid;
grant create synonym to apex_off_grid;
grant unlimited tablespace to apex_off_grid;

-- also recommended when used with an APEX workspace
grant create cluster to apex_off_grid;
grant create dimension to apex_off_grid;
grant create indextype to apex_off_grid;
grant create job to apex_off_grid;
grant create materialized view to apex_off_grid;
grant create operator to apex_off_grid;
grant create procedure to apex_off_grid;
grant create trigger to apex_off_grid;
grant create type to apex_off_grid;
grant create any context to apex_off_grid;
grant create mle to apex_off_grid;
grant create property graph to apex_off_grid;
grant execute dynamic mle to apex_off_grid;
```
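Since the script only differs by schema name, it can be templated. A minimal sketch with a hypothetical `gen_user_sql` helper; for brevity it emits only a subset of the grants from the script above:

```shell
# Sketch: generate the create-user script for a given schema name
# (password placeholder stays as-is; extend the grants to the full list)
gen_user_sql() {
  s=$1
  cat <<EOF
create tablespace tbs_$s
  datafile 'tbs_$s.dat'
  size 10M reuse autoextend on next 2M;

create user $s
  identified by ##secret_password##
  default tablespace tbs_$s;

grant create session to $s;
grant create table to $s;
grant unlimited tablespace to $s;
-- ...remaining grants as in the script above
EOF
}

gen_user_sql apex_off_grid > create_apex_off_grid.sql
cat create_apex_off_grid.sql
```

Run the generated file as sys against the new PDB, once per schema.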
### Import Data

```shell
# Connect with SQLcl to the user we want to import data into
sql offl_recipes/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1

SQL> datapump import -
  -schemas offl_recipes -
  -directory dp_dir -
  -dumpfile offl_recipes.dmp -
  -logfile offl_recipes-imp.log -
  -version latest
# had to run this two times again because of:
## Value DATA_PUMP_DIR is not valid for Parameter directory
## DataPump: Unable to process request. Verify specified options:
```
Repeat this step for every schema.
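As on the export side, the import command blocks can be generated per schema. A sketch with placeholder schema names:

```shell
# Hypothetical list of schemas to import -- replace with your own
schemas="apex_off_grid offl_recipes"

# Emit one SQLcl datapump import command block per schema
for s in $schemas; do
  cat <<EOF
datapump import -
  -schemas $s -
  -directory dp_dir -
  -dumpfile $s.dmp -
  -logfile $s-imp.log -
  -version latest
EOF
done > import_all.txt
cat import_all.txt
```

Paste each block into a SQLcl session connected as the respective schema.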
## Install APEX

```shell
wget https://download.oracle.com/otn_software/apex/apex_23.2_en.zip
unzip apex_23.2_en.zip
cd apex

# Connect to FREEPDB1 as sys and run the APEX install script
sql sys/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1 as sysdba @apexins.sql SYSAUX SYSAUX TEMP /i/
```
## Import Workspaces and Apps

To make our lives easier, we can go to the folder where we exported all the APEX files and call `ls -A1`. Copy the file names into a new `install-all.sql` file and prefix each with `@`. Move the `w`-prefixed workspace files to the top.
```sql
-- install workspaces
@w3600473184497866.sql
@w4395603499594930.sql

-- install apps
@f100.sql
@f101.sql
@f102.sql
@f103.sql
@f104.sql
@f105.sql
@f106.sql
@f107.sql
@f110.sql
@f111.sql
@f113.sql
@f145.sql
@f185.sql
@f9910.sql
```
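Building `install-all.sql` by hand works, but the file can also be generated, workspace files first. A sketch using demo stand-in files in a hypothetical `apex_exports_demo` folder:

```shell
# Demo stand-ins for the exported workspace/app files
d=apex_exports_demo
mkdir -p "$d"
touch "$d"/w3600473184497866.sql "$d"/f100.sql "$d"/f101.sql

# Workspace (w*.sql) installs first, then the apps (f*.sql), each prefixed with @
{
  echo "-- install workspaces"
  for f in "$d"/w*.sql; do echo "@$(basename "$f")"; done
  echo "-- install apps"
  for f in "$d"/f*.sql; do echo "@$(basename "$f")"; done
} > "$d"/install-all.sql
cat "$d"/install-all.sql
```

Run this in the folder with your real exports and you get the same file without manual copying.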
Now we can connect to FREEPDB1 and install all workspaces and apps with a single script:

```shell
sql sys/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1 as sysdba @install-all.sql
```
## Point ORDS to the new DB

Now the data and APEX components should be back as they were. We can change our ORDS config to point to the new database.
```shell
cd [PATH_TO_ORDS_CONFIG]
sudo rm -r databases

# Create conn_string.txt with DB credentials
cd [PATH_TO_ORDS_VARIABLES_DIR]
nano conn_string.txt
CONN_STRING=sys/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1
```
Now we can restart ORDS:
```shell
docker start oracle-23c-free-ords
```
I also needed to unlock the ORDS user (typically `ORDS_PUBLIC_USER`) in the database.
## Summary

This whole journey took me longer than expected, but I have everything up and running again. If you know of better/faster/easier approaches, please let me know.