Wednesday 28 May 2014

Querying XML CLOB Data Directly in Oracle

Oracle 9i introduced a new datatype - XMLType - specifically designed for handling XML natively in the database.

However, we may find we have a database that stores XML directly in a CLOB/NCLOB datatype (either because of legacy data, or because we are supporting multiple database platforms).

It can be useful to query this data directly using the XMLType functions available. This post just shows a very simple example of a query to do this.

(Note: there may be better ways to do this - but this worked for me when I needed an ad-hoc query quickly!)

Our Table

Assume we have a table called USER with the following columns:
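
Something along these lines - the exact column names here are just illustrative, with the XML held in a CLOB column:

    -- Illustrative DDL (USER is a reserved word in Oracle, hence the quoted identifier)
    CREATE TABLE "USER" (
        id        NUMBER PRIMARY KEY,
        name      VARCHAR2(100),
        user_xml  CLOB
    );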



Our embedded XML (example for Ringo)
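
Assume something along these lines is stored in the USER_XML column - the structure here is illustrative, the key point being the account details with a sort code:

    <user>
      <name>Ringo</name>
      <account>
        <accountNumber>12345678</accountNumber>
        <sortCode>10-20-30</sortCode>
      </account>
    </user>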



Our Query

Say we want to look at account details in the XML - e.g. find which users share the same sort code. A query like the following:
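
(The column names and XPath below assume the illustrative structure above - adjust to suit your own XML. The XMLTYPE constructor converts the CLOB, and EXTRACTVALUE pulls out a single value by XPath.)

    SELECT u.id,
           u.name,
           EXTRACTVALUE(XMLTYPE(u.user_xml), '/user/account/sortCode') AS sort_code
    FROM   "USER" u;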



This will produce a row per user with the sort code extracted from the XML. Obviously from here you can use all the standard SQL operators to filter, join etc. to get what you need.
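
For example, to find sort codes shared by more than one user, you can group on the extracted value (again using the illustrative names from above):

    SELECT EXTRACTVALUE(XMLTYPE(u.user_xml), '/user/account/sortCode') AS sort_code,
           COUNT(*) AS user_count
    FROM   "USER" u
    GROUP  BY EXTRACTVALUE(XMLTYPE(u.user_xml), '/user/account/sortCode')
    HAVING COUNT(*) > 1;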




Friday 14 March 2014

Data Pumper - Migrating Data from MySQL to PostgreSQL

I recently made the decision to migrate from MySQL to PostgreSQL (the reasons why will be covered in a later post). I thought I'd just jot down my notes on a tool I tried which helped with the migration - the SQL Workbench/J Data Pumper (no sniggering at the back...).

Migrating the Database

My MySQL database was running on Amazon RDS. I first created a new PostgreSQL instance on Amazon RDS and created the basic table structures (I'm using JPA/Hibernate in my app, so this was just a case of updating my JPA config, pointing it at the new PostgreSQL instance and letting it create all the tables for me) - the benefits of ORM!

Migrating the Data

I then needed to migrate the data over. I didn't have a huge amount of data (in the region of a few hundred to low thousands of rows across a number of tables).

I was initially going to just use mysqldump - possibly to a CSV - and then import that into Postgres. However, I had a few issues getting the data out this time (I've used mysqldump before on a semi-regular basis, so I'm not sure exactly what the problem was - though it was the first time I'd used it on RDS with MySQL 5.6). As I was moving away from MySQL, I wasn't particularly bothered about getting to the bottom of it, so I changed tack and thought I'd give the SQL Workbench Data Pumper a whirl.

It worked really well - and is a great little tool for jobs like this (undoubtedly not the tool of choice if you have large datasets or remote systems, but for small 'hacky' jobs like this it was ideal).

Just to outline below - this is what I did:

Pre-requisites

  • Download SQL Workbench/J   
    • http://www.sql-workbench.net/ 
    • I did this on Linux, so it was a simple case of extracting the archive and running 'chmod +x *.sh' to make the shell scripts executable
    • run 'sqlworkbench' to fire up the app
  • Download the JDBC drivers for both databases
  • Configure SQL Workbench with connections to both your databases

  • Try connecting to a database just to confirm it all works as expected


  • Open the 'Data Pumper' (Tools > Data Pumper)

  • Select the source connection in the top left (MySQL in my case) and the target connection in the top right (PostgreSQL). Select the source and target tables and the field mappings, then click the 'Start' button.

  • Bingo - data transferred! Simple and easy to use. I'd definitely use it again for small jobs like this - useful to have in the toolbox (and supports a wide range of database platforms).
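
One extra step I'd suggest after a transfer like this is a quick row-count check on both sides to confirm everything came across (the table name below is just an example - repeat per table against both the MySQL source and the PostgreSQL target and compare):

    -- Run against both source and target and check the counts match
    SELECT COUNT(*) AS row_count FROM customer;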