How to programmatically dump a subset of records from a table

Posted by hutcj on 17-Oct-2019 14:41

What: Looking for a way to dynamically dump a subset of records from a table. The EXPORT statement does not support dynamic buffers, and the tables have too many fields to try putting all the data into a single variable to then EXPORT. Additionally, I can't just EXPORT each BUFFER-FIELD individually because that would add a new line after each field, and it needs to look like a normal .d file (fields separated by spaces, records separated by new lines) in order to load properly.

Why: I want to generate a set of .d files that can be used to build a test database for testing the application against a localhost database. Obviously, dumping the entire database is not realistic because it is too large, so for tables greater than a certain size I want to dump only a portion of the records (prodict/dump_d.p works great for whole-table dumps).

Any ideas? How would you typically handle the maintenance of a localized test database?

All Replies

Posted by frank.meulblok on 17-Oct-2019 14:58

Use the "BUFFER <name>:HANDLE" to get handles to static buffers. Then you can add those static buffers to a dynamic(ally prepared) query to fetch the records you need, and use the EXPORT statement to write them to file.

Alternatively, if performance of the dump isn't much of a concern, start with the code in knowledgebase.progress.com/.../P4410 and adapt it to your needs.
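
The fully dynamic approach looks roughly like this (this is not the knowledgebase code itself, just a sketch of the idea in a standalone .p; the table name is passed in, and array extents, LOB fields, date/decimal formats and the .d trailer still need extra handling):

/* rough sketch: dynamic buffer + PUT/QUOTER to approximate the space-delimited .d layout */
DEFINE INPUT PARAMETER pcTable AS CHARACTER NO-UNDO.

DEFINE VARIABLE hBuffer AS HANDLE  NO-UNDO.
DEFINE VARIABLE hQuery  AS HANDLE  NO-UNDO.
DEFINE VARIABLE hField  AS HANDLE  NO-UNDO.
DEFINE VARIABLE iField  AS INTEGER NO-UNDO.

DEFINE STREAM sDump.
OUTPUT STREAM sDump TO VALUE(pcTable + ".d").

CREATE BUFFER hBuffer FOR TABLE pcTable.
CREATE QUERY hQuery.
hQuery:SET-BUFFERS(hBuffer).
hQuery:QUERY-PREPARE("FOR EACH " + pcTable + " NO-LOCK").
hQuery:QUERY-OPEN().

REPEAT WHILE hQuery:GET-NEXT():
    DO iField = 1 TO hBuffer:NUM-FIELDS:
        hField = hBuffer:BUFFER-FIELD(iField).
        /* quote character values, write the rest as-is (arrays/LOBs not handled here) */
        PUT STREAM sDump UNFORMATTED
            (IF iField > 1 THEN " " ELSE "")
            (IF hField:DATA-TYPE = "character"
             THEN QUOTER(hField:BUFFER-VALUE)
             ELSE STRING(hField:BUFFER-VALUE)).
    END.
    PUT STREAM sDump UNFORMATTED SKIP.
END.

hQuery:QUERY-CLOSE().
DELETE OBJECT hQuery.
DELETE OBJECT hBuffer.
OUTPUT STREAM sDump CLOSE.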

Posted by hutcj on 17-Oct-2019 15:57

I had come across that article in initial searches but wasn't able to get it working - I'll take a closer look.

On your solution using static buffers, that would require writing out the name of each table, right? I want to use this for multiple databases with hundreds of tables between them, so I want to avoid hard-coding any direct references to tables (this would also help if we drop or add tables later, for example).

Posted by Peter Judge on 17-Oct-2019 16:08

Static buffers would require writing or generating code, and compiling it (so you'd need the appropriate license). That may be a factor in your decision.
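
Roughly like this (the customer table, the 1000-record cap and the generated file names are only placeholders; the RUN compiles the generated .p on the fly, which is where the license requirement comes in):

DEFINE VARIABLE cTable AS CHARACTER NO-UNDO INITIAL "customer".
DEFINE VARIABLE cProc  AS CHARACTER NO-UNDO.

cProc = "dump_" + cTable + ".p".

/* generate a tiny per-table procedure that uses a static buffer + EXPORT */
OUTPUT TO VALUE(cProc).
PUT UNFORMATTED
    "DEFINE VARIABLE iCount AS INTEGER NO-UNDO." SKIP
    "OUTPUT TO VALUE(" QUOTER(cTable + ".d") ")." SKIP
    "FOR EACH " cTable " NO-LOCK WHILE iCount < 1000:" SKIP
    "    EXPORT " cTable "." SKIP
    "    iCount = iCount + 1." SKIP
    "END." SKIP
    "OUTPUT CLOSE." SKIP.
OUTPUT CLOSE.

RUN VALUE(cProc).   /* compiled on the fly, hence the license requirement */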
 
 

Posted by Richard.Kelters on 17-Oct-2019 21:43

Way back, Tony Lavinio and Peter van Dam wrote a dynamic export program. And would you believe it, here it is:

www.futureproofsoftware.com/.../index.php

You will need to adapt it to your needs and your current OpenEdge version because it was written when Progress 9.1 was out!

Posted by Patrick Tingen on 18-Oct-2019 10:37

The DataDigger can do just this for you.

In addition, the DataDigger can also generate a local database for you that has the same schema as your remote database.
Right-click on the table browse and choose "clone this database".

Choose the folder where the database should be created.

You end up with a local, empty database with the same structure as your production db. 

You can dump data from your database in various formats. Select the records you want and choose the download button.

Then choose the format you like.

If you choose the XML format, you can simply dump the records, switch to your local database, and drag the XML file onto the DataDigger window. It will then import the XML file.

If you have a lot of tables, this may not be the best approach, since you have to go table by table. If the number of tables is limited, I think this is the smoothest solution.

The DataDigger is free and open source, and available here.

(disclaimer: I am the author)

Posted by hutcj on 18-Oct-2019 16:15

Thanks Richard! I will take a look and see if I can modernize it a bit ;)

Posted by hutcj on 18-Oct-2019 16:17

Patrick, we do have hundreds of tables, so probably not a full solution for us. However, it might be useful as a supplemental tool for when you need to test against a specific set of data.

Posted by hutcj on 18-Oct-2019 16:24

I think eventually we will have to create ProDataSets, etc., for all the tables so we end up with meaningful data (parent/child relationships, some of this type of customer, some of that type, etc.). I saw a suggestion elsewhere to use the JSON methods for a TEMP-TABLE for import/export. It sounds like a good idea except for the extra buffering overhead (which might be inevitable anyway if we're going to ProDataSets).
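
For reference, the temp-table JSON route I have in mind would look roughly like this (just a sketch; the customer table, the record cap and the file name are placeholders, and the BUFFER-COPY loop is the extra buffering overhead I mentioned):

DEFINE VARIABLE hTempTable AS HANDLE  NO-UNDO.
DEFINE VARIABLE hTTBuffer  AS HANDLE  NO-UNDO.
DEFINE VARIABLE hDbBuffer  AS HANDLE  NO-UNDO.
DEFINE VARIABLE hQuery     AS HANDLE  NO-UNDO.
DEFINE VARIABLE iCount     AS INTEGER NO-UNDO.

CREATE BUFFER hDbBuffer FOR TABLE "customer".

/* build a temp-table with the same schema as the database table */
CREATE TEMP-TABLE hTempTable.
hTempTable:CREATE-LIKE(hDbBuffer).
hTempTable:TEMP-TABLE-PREPARE("ttCustomer").
hTTBuffer = hTempTable:DEFAULT-BUFFER-HANDLE.

CREATE QUERY hQuery.
hQuery:SET-BUFFERS(hDbBuffer).
hQuery:QUERY-PREPARE("FOR EACH customer NO-LOCK").
hQuery:QUERY-OPEN().

/* copying into the temp-table is the extra buffering step */
REPEAT WHILE hQuery:GET-NEXT() AND iCount < 1000:
    hTTBuffer:BUFFER-CREATE().
    hTTBuffer:BUFFER-COPY(hDbBuffer).
    iCount = iCount + 1.
END.

hTempTable:WRITE-JSON("file", "customer.json", TRUE).
/* and against the local database: hTempTable:READ-JSON("file", "customer.json", "empty"). */

hQuery:QUERY-CLOSE().
DELETE OBJECT hQuery.
DELETE OBJECT hDbBuffer.
DELETE OBJECT hTempTable.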

For now, just to get started, I modified the prodict/dump_d.p source to add an option to pull just the first X records off the top of the table. That way I don't have to worry about proper formatting or the "PSC filename=table..." trailer at the end of the .d file.

This thread is closed