Upgrading from Oracle 23c FREE to 23ai

My upgrade journey from dockerized Oracle 23c FREE to 23ai

Last week, a new Oracle release dropped. Instead of an entirely new release, it is an update of the 23c build with a few highly anticipated changes, like vector support and updates to JSON duality views. Instead of the “c” suffix referring to cloud, they are now marketing it as a 23ai release. Currently, it is available in the cloud, and you can run the “FREE” release on your own hardware.

As I want to try out the new features and need to look into the JSON duality view updates in preparation for my KScope24 talk on them, I wanted to upgrade my 23c FREE instance. I hoped the upgrade would be a breeze, with me just being able to reuse the datafiles and start the new image from them, but this is not the case. Instead, we have to migrate all the data into a fresh database instance. So let's get started.

Stop DB writes

As it will take some time to back up everything, we need to stop users from being able to change data. As I only have an ORDS instance that connects users to the data, I simply stopped it. You might want to disconnect other users or disable writing.

docker stop oracle-23c-free-ords

Export data from the old 23c

Create an Oracle directory

As I want to export the data with Data Pump, I need a database directory that the dumps get exported to.

# open a shell inside the docker database container
# my container is named "oracle-23c-free" you might need to change this
docker exec -u oracle -it oracle-23c-free bash

cd /opt/oracle/oradata
mkdir datapump
exit

# Connect to FREEPDB1 as sysdba
sql sys/▋▋▋▋▋▋▋@23c.phartenfeller.de:15211/FREEPDB1 as sysdba

# Create a new directory to store datapump exports
SQL> CREATE OR REPLACE DIRECTORY dp_dir AS '/opt/oracle/oradata/datapump';
# Directory created.

#                                             schema_name
SQL> GRANT READ, WRITE ON DIRECTORY dp_dir TO apex_off_grid;
# Repeat this for every schema you want to export

SQL> exit

Export data

To back up the schema data, we will use SQLcl and run a Data Pump export. Note that the dumps will be written to the database directory, not to the machine that runs SQLcl.

## connect to a database user/schema you want to export
sql apex_off_grid/▋▋▋▋▋▋▋@23c.phartenfeller.de:15211/FREEPDB1

SQL> datapump export -
     -schemas apex_off_grid -
     -directory DP_DIR -
     -dumpdirectory DP_DIR -
     -dumpfile apex_off_grid.dmp -
     -logfile apex_off_grid.log -
     -version latest
# I needed to run this twice, as the first time I got this strange error:
## Value DATA_PUMP_DIR is not valid for Parameter directory
## DataPump: Unable to process request. Verify specified options:

Repeat this step for every database schema you want to back up.

Export APEX Workspaces and Apps

This time, SQLcl will export all the workspace and app scripts to the machine where you run SQLcl. So I changed my current directory to a new folder to have all the files in one place.

# create folder to store workspace and app files
mkdir apex_exports
cd apex_exports

sql apex_off_grid/▋▋▋▋▋▋▋@23c.phartenfeller.de:15211/FREEPDB1

SQL> select WORKSPACE_DISPLAY_NAME, WORKSPACE_ID  from apex_workspaces;
# WORKSPACE_DISPLAY_NAME            WORKSPACE_ID
# _________________________ ____________________
# APEX_OFF_GRID                14850141765267480

SQL> apex export-workspace -woi 14850141765267480
# Exporting Workspace 14850141765267480:'APEX_OFF_GRID'
# File w14850141765267480.sql created

SQL> apex export-all-applications -woi 14850141765267480
# Exporting Workspace APEX_OFF_GRID - application 100:Recipes (Blueprint)
# File f100.sql created
# Exporting Workspace APEX_OFF_GRID - application 101:Changes
# File f101.sql created
# ...

Start a new 23ai database container

As I am using docker-compose, I created a new entry there for the new database container. I changed the port so I can temporarily run the old and new containers next to each other.

oracle-23ai-free:
  container_name: oracle-23ai-free
  image: container-registry.oracle.com/database/free:23.4.0.0
  ports:
    - 95212:1521
  environment:
    - ORACLE_PWD=**SetTheSamePwHere**
    - TZ=Europe/Berlin
  volumes:
    - ~/dbs/oracle-23ai-free/oradata:/opt/oracle/oradata
  hostname: orcl-23ai-free

Now we can start the new database:

docker-compose up -d oracle-23ai-free

I had some issues with file permissions. Even though the container created the oradata folder, the DB init failed with the following error message:

[ 2024-05-05 16:30:40.671 CEST ] Prepare for db operation
[ 2024-05-05 16:30:40.713 CEST ] Cannot create directory "/opt/oracle/oradata/FREE".
DBCA_PROGRESS : 7%
DBCA_PROGRESS : 100%
[ 2024-05-05 16:30:40.718 CEST ] [FATAL] Prepare Operation has failed.
DBCA_PROGRESS : 0%

So I lazily ran sudo chmod 777 ./oradata, dropped the container, and created a new one; now it worked.
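
A less blunt alternative is to hand the directory to the container's database user instead of opening it up to everyone. This is only a sketch: it assumes Oracle's official images run the database as UID/GID 54321 (verify with `docker exec oracle-23ai-free id oracle`), and it demonstrates the permission bits on a scratch directory so you can dry-run it before pointing it at the real oradata path.

```shell
# Sketch: restrict oradata instead of chmod 777.
# Assumption: the image's oracle user has UID/GID 54321 (check with
#   docker exec oracle-23ai-free id oracle).
ORADATA=./oradata-demo   # replace with ~/dbs/oracle-23ai-free/oradata
mkdir -p "$ORADATA"

# On the real host directory this needs root:
#   sudo chown -R 54321:54321 "$ORADATA"
chmod -R 750 "$ORADATA"  # owner rwx, group rx, no access for others
```

With ownership transferred, the container can write its datafiles while other host users are locked out.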

You can follow the DB initialization process with docker logs oracle-23ai-free --follow. This can take some time, depending on your hardware.

This is what the log should look like when the DB is ready:

Prepare for db operation
7% complete
Copying database files
29% complete
Creating and starting Oracle instance
30% complete
33% complete
36% complete
39% complete
43% complete
Completing Database Creation
47% complete
49% complete
50% complete
Creating Pluggable Databases
54% complete
71% complete
Executing Post Configuration Actions
93% complete
Running Custom Scripts
100% complete
Database creation complete. For details check the logfiles at:
 /opt/oracle/cfgtoollogs/dbca/FREE.
Database Information:
Global Database Name:FREE
System Identifier(SID):FREE
Look at the log file "/opt/oracle/cfgtoollogs/dbca/FREE/FREE.log" for further details.

Connect to Oracle Database using one of the connect strings:
     Pluggable database: orcl-23ai-free/FREEPDB1
     Multitenant container database: orcl-23ai-free

SQL*Plus: Release 23.0.0.0.0 - Production on Sun May 5 16:41:16 2024
Version 23.4.0.24.05

Copyright (c) 1982, 2024, Oracle.  All rights reserved.


Connected to:
Oracle Database 23ai Free Release 23.0.0.0.0 - Develop, Learn, and Run for Free
Version 23.4.0.24.05

SQL>
System altered.

SQL>
Pluggable database altered.

SQL>
PL/SQL procedure successfully completed.

SQL> SQL>
Session altered.

SQL>
User created.

SQL>
Grant succeeded.

SQL>
Grant succeeded.

SQL>
Grant succeeded.

SQL>
User altered.

SQL> SQL> Disconnected from Oracle Database 23ai Free Release 23.0.0.0.0 - Develop, Learn, and Run for Free
Version 23.4.0.24.05
The Oracle base remains unchanged with value /opt/oracle

Executing user defined scripts
/opt/oracle/runUserScripts.sh: ignoring /opt/oracle/scripts/extensions/setup/*

DONE: Executing user defined scripts

The Oracle base remains unchanged with value /opt/oracle
#########################
DATABASE IS READY TO USE!
#########################

Connect to the new DB

You can now connect to the new DB and should be greeted by the following message:

sql sys/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1 as sysdba

# Connected to:
# Oracle Database 23ai Free Release 23.0.0.0.0 - Develop, Learn, and Run for Free
# Version 23.4.0.24.05

Import data into the new 23ai DB

Create datapump dir in new DB

To import our Data Pump dumps, we need to make them accessible to the DB through a directory.

# open a shell inside the docker database container
docker exec -u oracle -it oracle-23ai-free bash

cd /opt/oracle/oradata
mkdir datapump
exit

sql sys/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1 as sysdba

# Create a new directory to store datapump exports
SQL> CREATE OR REPLACE DIRECTORY dp_dir AS '/opt/oracle/oradata/datapump';
# Directory created.

SQL> GRANT READ, WRITE ON DIRECTORY dp_dir TO apex_off_grid;
# Repeat this for every schema you want to import

SQL> exit

Copy datapump dumps to 23ai directory

sudo cp -r ~/dbs/orcl-23c-free/oradata/datapump ~/dbs/oracle-23c-bkp
sudo chmod 777 -R ~/dbs/oracle-23c-bkp/datapump
sudo chown -R myuser:myuser ~/dbs/oracle-23c-bkp/datapump

docker cp ~/dbs/oracle-23c-bkp/datapump/apex_off_grid.dmp oracle-23ai-free:/opt/oracle/oradata/datapump
# repeat for every *.dmp file you have

Recreate DB users

This is my typical create user/schema script that I ran for every schema I wanted to import. I reused all my old passwords for convenience.

create tablespace tbs_apex_off_grid
  datafile 'tbs_apex_off_grid.dat'
    size 10M
    reuse
    autoextend on next 2M;

create user apex_off_grid
  identified by ##secret_password##
  default tablespace tbs_apex_off_grid
;

grant create session to apex_off_grid;
grant create table to apex_off_grid;
grant create view to apex_off_grid;
grant create any trigger to apex_off_grid;
grant create any procedure to apex_off_grid;
grant create sequence to apex_off_grid;
grant create synonym to apex_off_grid;
grant unlimited tablespace to apex_off_grid;

-- also recommended when the schema backs an APEX workspace
grant create cluster to apex_off_grid;
grant create dimension to apex_off_grid;
grant create indextype to apex_off_grid;
grant create job to apex_off_grid;
grant create materialized view to apex_off_grid;
grant create operator to apex_off_grid;
grant create procedure to apex_off_grid;
grant create trigger to apex_off_grid;
grant create type to apex_off_grid;
grant create any context to apex_off_grid;
grant create mle to apex_off_grid;
grant create property graph to apex_off_grid;
grant execute dynamic mle to apex_off_grid;

Import Data

# Connect with SQLcl to the user we want to import data into
sql offl_recipes/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1

datapump import -
  -schemas offl_recipes -
  -directory dp_dir -
  -dumpfile offl_recipes.dmp -
  -logfile offl_recipes-imp.log -
  -version latest
# had to run this twice again because of:
## Value DATA_PUMP_DIR is not valid for Parameter directory
## DataPump: Unable to process request. Verify specified options:

Repeat this step for every schema.

Install APEX

wget https://download.oracle.com/otn_software/apex/apex_23.2_en.zip
unzip apex_23.2_en.zip
cd apex

# Connect to FREEPDB1 as sys and run APEX install script
sql sys/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1 as sysdba @apexins.sql SYSAUX SYSAUX TEMP /i/

Import Workspaces and Apps

To make our lives easier, we can go to the folder where we exported all the APEX files and call ls -A1. Copy the file names into a new install-all.sql file and prefix them with @. Move the w-prefixed workspace files to the top.

-- install workspaces
@w3600473184497866.sql
@w4395603499594930.sql

-- install apps
@f100.sql
@f101.sql
@f102.sql
@f103.sql
@f104.sql
@f105.sql
@f106.sql
@f107.sql
@f110.sql
@f111.sql
@f113.sql
@f145.sql
@f185.sql
@f9910.sql
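
Instead of copying the file names by hand, a file like the one above can also be generated from the directory listing, with the workspace scripts sorted to the top. A sketch (demo files stand in for the real exports so the snippet is self-contained; run it inside your actual apex_exports folder instead):

```shell
# Sketch: build install-all.sql with workspace exports (w*.sql) first,
# then the application exports (f*.sql). Demo dir/files for illustration.
mkdir -p apex_exports-demo
touch apex_exports-demo/w3600473184497866.sql \
      apex_exports-demo/f100.sql apex_exports-demo/f101.sql

(
  cd apex_exports-demo &&
  { ls -1 w*.sql; ls -1 f*.sql; } | sed 's/^/@/' > install-all.sql
)

cat apex_exports-demo/install-all.sql
```

The `sed 's/^/@/'` prefixes every file name with @, and listing the `w*.sql` files before the `f*.sql` files keeps the required install order.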

Now we can connect to FREEPDB1 and install all workspaces and apps with a single script:

sql sys/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1 as sysdba @install-all.sql

Reconfigure ORDS

Now the data and APEX components should be back as they were. We can change our ORDS config to point to the new database.

cd [PATH_TO_ORDS_CONFIG]
sudo rm -r databases

# Create conn_string.txt with DB credentials
cd [PATH_TO_ORDS_VARIABLES_DIR]
nano conn_string.txt
CONN_STRING=sys/▋▋▋▋▋▋▋@23ai.phartenfeller.de:95212/FREEPDB1

Now we can restart ORDS:

docker start oracle-23c-free-ords

And I needed to unlock the ORDS user in the database.

Summary

This whole journey took longer than expected, but I have everything up and running again. If you know of better, faster, or easier approaches, please let me know.

Resources