Populating data model tables with historical data

About

AEMO provides NEM participants with a monthly archive containing a historical subset of the Electricity Data Model data. Below are the prerequisites and four procedures participants can follow to load the historical data into the data model tables.

Prerequisites

On AEMO's Market Data NEMweb, navigate to Data Model > Monthly Archive and download the archived data.

Using the supplied SQL Loader Wrapper script

  1. Load the monthly DVD into a drive or copy the files to a directory.

  2. Unzip the data files.

  3. In the target schema, disable the Foreign Key constraints if you have implemented any (see the SQL sketch after this procedure).

  4. Save the <VISIBILITYID>_RUN_SQLLDR_<YYYYMM>.bat script to a directory where you have write access and open it for editing.

  5. In the script, locate the configurable settings (such as the LOGDIR entry), make the necessary changes and save the file.

  6. On the command line, set the working directory (using CD) to the directory containing the <VISIBILITYID>_RUN_SQLLDR_<YYYYMM>.bat script and run the script.

  7. At the location specified by the LOGDIR entry set in the <VISIBILITYID>_RUN_SQLLDR_<YYYYMM>.bat, check the produced log files for any errors.

  8. In the target schema, re-enable the Foreign Key constraints if you disabled any in step 3.
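
For steps 3 and 8, the statements below are a minimal sketch of one way to toggle foreign key constraints in an Oracle target schema. They assume the constraints are owned by the schema you connect to and are not part of the supplied scripts; adapt them to however you implemented your constraints.

    -- Generate a DISABLE statement for every foreign key constraint in the current schema
    SELECT 'ALTER TABLE ' || table_name || ' DISABLE CONSTRAINT ' || constraint_name || ';'
    FROM   user_constraints
    WHERE  constraint_type = 'R';

    -- Generate the matching ENABLE statements to run after the load completes
    SELECT 'ALTER TABLE ' || table_name || ' ENABLE CONSTRAINT ' || constraint_name || ';'
    FROM   user_constraints
    WHERE  constraint_type = 'R';

Run the DISABLE statements generated by the first query before the load (step 3) and the ENABLE statements generated by the second query after the load (step 8).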

SQL Loader command line

  1. Unzip the required csv file.

  2. Run the SQL Loader utility from the command line; an example invocation is shown after this procedure.

  3. Check the log files for any errors.

  4. Check the discard file for any records discarded due to failing the WHEN clause.

    Each discard file is expected to contain three rows beginning with C, I and C respectively; these are comment rows and are meant to be discarded. Any rows beginning with D are data rows and are expected to load successfully.

  5. Check the bad file for any rows not able to be loaded.

    A row usually fails to load because of a column data type mismatch, or a mismatch in the number of columns, between the data in the csv file and the structure of the table in your target database.
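
For step 2, the command below is a minimal sketch of a SQL Loader invocation; the connection details and file names are placeholders, and it assumes you have a control file describing the target table.

    sqlldr userid=<user>/<password>@<database> control=<table>.ctl data=<unzipped_file>.csv log=<table>.log bad=<table>.bad discard=<table>.dsc

The log, bad and discard parameters determine where the files checked in steps 3 to 5 are written. The WHEN clause mentioned in step 4 is defined in the control file; it accepts the D (data) rows and sends the C and I comment rows to the discard file.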

Using the supplied BCP Wrapper scripts

  1. Load the monthly DVD into a drive or copy the files to a directory.

  2. In the target schema, disable the Foreign Key constraints if you have implemented any (see the SQL sketch after this procedure).

  3. In the DATA directory, unzip the csv files.

  4. Save the <VISIBILITYID>_RUN_CONVERT_<YYYYMM>.bat script to a directory where you have write access and open it for editing.

  5. In the script, locate the configurable settings (such as the LOGDIR entry), make the necessary changes and save the file.

  6. On the command line, set the working directory (using CD) to the directory containing the <VISIBILITYID>_RUN_CONVERT_<YYYYMM>.bat script and run the script.

  7. At the location specified by the LOGDIR entry set in the <VISIBILITYID>_RUN_CONVERT_<YYYYMM>.bat, check the produced log files for any file conversion errors.

  8. In the <VISIBILITYID>_RUN_BCP_<YYYYMM>.bat script, locate the configurable settings (such as the LOGDIR entry), make the necessary changes and save the file.

  9. On the command line, set the working directory (using CD) to the directory containing the <VISIBILITYID>_RUN_BCP_<YYYYMM>.bat script and run the script.

  10. At the location specified by the LOGDIR entry set in the <VISIBILITYID>_RUN_BCP_<YYYYMM>.bat, check the produced log files for any errors.

  11. In the target schema, re-enable the Foreign Key constraints if you disabled any in step 2.
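
For steps 2 and 11, the statements below are a minimal sketch of one way to toggle foreign key constraints in a SQL Server target database; the table name is a placeholder, the statements are not part of the supplied scripts, and they need to be repeated for each table that carries constraints.

    -- Disable the foreign key (and check) constraints on a table before the load
    ALTER TABLE dbo.<table_name> NOCHECK CONSTRAINT ALL;

    -- Re-enable and re-validate the constraints after the load
    ALTER TABLE dbo.<table_name> WITH CHECK CHECK CONSTRAINT ALL;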

BCP command line

  1. In the DATA directory, unzip the required csv file.

  2. Run the BCPT.EXE file conversion utility against the unzipped csv file.

  3. Check the log files and error files for any errors.

  4. Run the SQL Server BCP utility to load the converted file; an example invocation is shown after this procedure.

  5. Check the log files and error files for any errors.

  6. Check the bad file for any rows not able to be loaded.

    A row usually fails to load because of a column data type mismatch, or a mismatch in the number of columns, between the data in the csv file and the structure of the table in your target database.
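
For step 4, the command below is a minimal sketch of a BCP invocation; the server, database, table and file names are placeholders, and the field terminator must match the output produced by the BCPT.EXE conversion in step 2.

    bcp <database>.dbo.<table_name> in <converted_file> -S <server> -U <user> -P <password> -c -t "," -e <table_name>.err -o <table_name>.log

The -o parameter sets the log file checked in step 5, and with this invocation the rows that cannot be loaded (checked in step 6) are written to the error file given by -e.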