Code Extraction
Instructions to extract Teradata code.
The first step in a migration is getting the code that you need to migrate. There are many ways to extract code from your database; however, we recommend using the extraction scripts provided by Mobilize.Net.
All the source code for these scripts is available on GitHub.

Prerequisites

    Access to a server with a Teradata database.
    Permissions to run shell scripts with access to the server.
    The Teradata utilities `bteq` and `tpt`.
    Python 3.x

Installing the scripts

Install from source

You can install from source.
Go to the terminal and run:
```shell
curl -H 'Cache-Control: no-cache' -L https://git.io/JcziL | bash
```

Install from PyPI

This is the preferred installation method. To use this method you need to have Python 3.x installed. Go to the terminal and run:
```shell
pip3 install snowconvert-tera-exporter --upgrade
```
NOTE: these tools must be executed in a Bash shell on a Linux environment with access to the `bteq`/`tpt` utilities. If you need more detailed installation instructions, please refer to this installation guide.
After completing the installation, the `sc-tera-export` tool will be available.
To validate that the tool is installed, open a new terminal and run:
```shell
sc-tera-export -h
```
If the tool is properly installed you will see an output like:
```
usage: sc-tera-export [-h] -S -U -P

Mobilize.NET Teradata Code Export Tools
Version X.X.X

optional arguments:
  -h, --help       show this help message and exit
  -S , --server    Server address. For example: 127.0.0.1
  -U , --user      Login ID for server. Usually it will be the DBC user
  -P , --password  The password for the given user.
```
`sc-tera-export` is a simple tool that helps you export Teradata code so it can be upgraded to Snowflake using the SnowConvert tool.
`sc-tera-export` can be used to generate extraction scripts that, when run, produce the data definition language (DDL) for database objects in Teradata.
Those output scripts can then be used as input for the SnowConvert tool.
`sc-tera-export` also bundles another tool called `sc-tera-split-ddl`, which can be used to split large DDLs into one file per object.
All the source code for this tool is open source and is available on GitHub.
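To illustrate what per-object splitting means, here is a generic shell sketch — it is not the `sc-tera-split-ddl` tool itself, and `all_ddls.sql` is a made-up sample file. It assumes each object's DDL starts with `CREATE` at the beginning of a line, which may not hold for every real export.

```shell
# Illustrative sketch only -- not the sc-tera-split-ddl tool.
# Hypothetical sample input for demonstration:
cat > all_ddls.sql <<'EOF'
CREATE TABLE t1 (c1 INTEGER);
CREATE TABLE t2 (c2 INTEGER);
EOF

mkdir -p split_ddls
# Start a new output file every time a line begins with CREATE,
# then append each subsequent line to the current file.
awk '/^CREATE/ { n++; f = sprintf("split_ddls/object_%03d.sql", n) }
     n { print > f }' all_ddls.sql
```

In practice, prefer the bundled tool; the sketch only shows the idea of one file per object.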

Run the extraction process

Once you have installed the scripts and verified the installation, your next step is to create a folder for your extraction. For example:
```shell
mkdir TeradataExport
cd TeradataExport
```

Run the sc-tera-export tool

To run the tool you will need your server address, plus a user and password with sufficient privileges.
For example:
```shell
sc-tera-export -S 127.0.0.1 -U DBC -P DBC
```
The tool will ask for confirmation before writing the scripts. Type `y` and press ENTER.
Two new folders will be created:
    bin
    scripts

Modify create_ddls_config.sh

The script will open `create_ddls.sh` in an editor. Review the script and check the parameters. For more information about this script, read this documentation. When you are ready, save the script and exit the editor.
Add your settings in `create_ddls_config.sh`. In that file, you can specify which database(s) to export and which elements to include.
For example, you can set the parameters like this to export the AdventureWorksDW database and exclude the other databases.
```shell
DBS="'AdventureWorksDW'"
DBS_EXCLUDE="'SYS_CALENDAR','ALL','CONSOLE','CRASHDUMPS','DBC','DBCMANAGER','DBCMNGR','DEFAULT','EXTERNAL_AP','EXTUSER','LOCKLOGSHREDDER','PDCRADM','PDCRDATA','PDCRINFO','PDCRSTG','PDCRTPCD','PUBLIC','SQLJ','SYSDBA','SYSADMIN','SYSBAR','SYSJDBC','SYS_MGMT','SYSLIB','SYSSPATIAL','SYSTEMFE','SYSUDTLIB','SYSUIF','TD_SERVER_DB','TD_SYSFNLIB','TD_SYSGPL','TD_SYSXML','TDMAPS','TDPUSER','TDQCD','TDSTATS','TDWM','VIEWPOINT'"
...
TABLES_INCLUDE="'%'"
...
JOININDEX_INCLUDE="'%'"
...
VIEWS_INCLUDE="'%'"
...
FUNCTIONS_INCLUDE="'%'"
...
MACROS_INCLUDE="'%'"
...
PROCEDURES_INCLUDE="'%'"
```
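Note the nested quoting in these values: each is a shell string whose contents are a SQL-quoted, comma-separated list, presumably so the scripts can substitute it directly into an `IN (...)` clause. A quick sketch of how that substitution behaves (the `WHERE` clause shown is illustrative, not the scripts' actual query):

```shell
# The outer double quotes are consumed by the shell; the inner single
# quotes survive, producing a ready-made SQL list.
DBS="'AdventureWorksDW'"
echo "WHERE DatabaseName IN (${DBS})"
# prints: WHERE DatabaseName IN ('AdventureWorksDW')
```

Keep both layers of quotes when editing the config, or the generated SQL will be malformed.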

Run create_ddls.sh

Next, run the `create_ddls.sh` script to extract the DDL scripts.
There is no need to specify the host, user, or password, since they were added to the script earlier.
```shell
cd bin
./create_ddls.sh
cd ..
```

Go to the output folder

After running `create_ddls.sh`, the following files will be generated:
    DDL_Databases.sql
    DDL_Schemas.sql
    DDL_Tables.sql
    DDL_Views.sql
    DDL_Procedures.sql
    DDL_Functions.sql
    DDL_Macros.sql
    DDL_Join_Indexes.sql
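As an optional sanity check before packaging, you can verify that every expected file was generated. This is a hypothetical helper, not part of the toolkit, and it assumes the files land in an `output` folder — adjust the path if your export writes elsewhere:

```shell
# Count expected DDL files that were not generated.
missing=0
for f in DDL_Databases.sql DDL_Schemas.sql DDL_Tables.sql DDL_Views.sql \
         DDL_Procedures.sql DDL_Functions.sql DDL_Macros.sql DDL_Join_Indexes.sql; do
  if [ ! -f "output/$f" ]; then
    echo "missing: $f"
    missing=$((missing + 1))
  fi
done
echo "$missing file(s) missing"
```

If any files are reported missing, re-check the include/exclude settings in the config before re-running the extraction.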
Important: After a successful run, delete the `bin` and `scripts` folders.

Package the results

When the script is done, the output folder will contain all the DDLs for the migration. You can then compress this folder to use it with SnowConvert.
For example, run:
```shell
zip -r output.zip ./output
```