This solution is not database specific
Most databases can generate DDL for any object, but few of them allow generation of INSERT statements for the table data. The usual workaround is to use ETL tools for transferring data across servers. However, there is still a need to generate INSERT statements from tables for porting data. The simplest example: when a small or large amount of data needs to be taken out on removable storage media and copied to a remote location, INSERT VALUES statements come in handy.
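As a minimal sketch of what such generation involves (the table and column names below are made up for illustration, and real tools must handle many more data types), a few lines of Python can turn rows into portable INSERT VALUES statements:

```python
# Sketch: turn rows into INSERT ... VALUES statements.
# Table and column names are hypothetical examples.

def quote(value):
    """Render a Python value as a SQL literal (very simplified)."""
    if value is None:
        return "NULL"
    if isinstance(value, (int, float)):
        return str(value)
    # Escape embedded single quotes by doubling them
    return "'" + str(value).replace("'", "''") + "'"

def insert_statements(table, columns, rows):
    """Yield one INSERT statement per row."""
    col_list = ", ".join(columns)
    for row in rows:
        values = ", ".join(quote(v) for v in row)
        yield f"INSERT INTO {table} ({col_list}) VALUES ({values});"

rows = [(1, "O'Brien", 250.0), (2, None, 99.5)]
for stmt in insert_statements("customers", ["id", "name", "balance"], rows):
    print(stmt)
```

The resulting script is plain text, so it can be copied to removable media and replayed on any server that accepts standard INSERT syntax.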
There are a number of scripts available to perform this data transformation task. The problem with those scripts is that all of them are database specific, and they do not work with text files.
Advanced ETL Processor gives you a universal solution for creating Insert statements. It works with any database or file.
Advanced ETL Processor generating insert statements
A lot of our customers find it difficult to generate a table creation script when providing information to the support team.
Quite often we need to be able to reproduce the problem in the office.
In order to do that we need additional information from the customer
You can now generate a table creation script by simply clicking on the data.
And here is the SQL script:
create table TableName ([COMPANY NAME] CHAR(29),[YEAR] CHAR(10),[MONTHID] CHAR(9),[PRODUCT ID] CHAR(12),[AMOUNT] CHAR(12))
It is not ideal, but it is a very good start.
This update applies to both Visual Importer ETL and Advanced ETL Processor
Load data into Microsoft SQL Server in just a few clicks. There is no need to create any scripts or learn command-line tools: just point and click.
There are several ways to load data into a Microsoft SQL Server database.
This article describes loading data via BCP.
BCP is the fastest way of loading data into SQL Server
Check constraints: ensure that any constraints on the destination table are checked during the bulk copy operation. By default, constraints are ignored.
Keep identity: specify that there are values in the data file for an identity column.
Keep NULLs: specify that any columns containing a null value should be retained as null values, even if a default value was specified for that column in the destination table.
Batch size: specify the number of rows in a batch. The default is the entire data file. The following values for the Batch size property have these effects:
- If you set Batch size to zero, the data is loaded in a single batch. The first row that fails will cause the entire load to be cancelled, and the step fails.
- If you set Batch size to one, the data is loaded one row at a time. Each row that fails is counted as a one-row failure. Previously loaded rows are committed.
- If you set Batch size to a value greater than one, the data is loaded one batch at a time. Any row that fails in a batch fails that entire batch; loading stops and the step fails. Rows in previously loaded batches are either committed or, if the step has joined the package transaction, provisionally retained in the transaction, subject to later commitment or rollback.
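The batch-size rules can be sketched in Python (a simplified model of the commit units, not BCP itself):

```python
# Sketch: how the Batch size setting groups rows into commit units.
# 0 -> one batch (all or nothing), 1 -> row by row, N -> chunks of N rows.

def make_batches(rows, batch_size):
    """Split rows into batches according to the Batch size rules."""
    if batch_size == 0:
        # Single batch: the first failing row cancels the whole load
        return [rows]
    # Each batch is committed (or failed) independently
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]

rows = list(range(10))
print(len(make_batches(rows, 0)))   # 1  - everything in one batch
print(len(make_batches(rows, 1)))   # 10 - each row is its own batch
print(len(make_batches(rows, 4)))   # 3  - batches of 4, 4 and 2 rows
```

A failure inside a batch only loses that batch; with batch size 1 a single bad row costs one row, while with batch size 0 it costs the entire load.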
There are several ways to connect to Excel files and run SQL against them:
Both Advanced ETL Processor and Visual Importer ETL can use ODBC to connect to Excel.
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\MyExcel.xls;Extended Properties="Excel 8.0;HDR=Yes;IMEX=1";
"HDR=Yes;" indicates that the first row contains column names, not data. "HDR=No;" it does not
This is a very useful feature when, as in the Excel file below, you want to retrieve just the Invoice Number:
SQL syntax for ODBC and OLE DB is slightly different.
SELECT * FROM [Sheet1$]
SELECT * FROM MyRange
To specify an unnamed range of cells as your record source, append standard Excel row/column notation to the end of the sheet name in the square brackets. For example:
SELECT * FROM [Sheet1$A1:B10]
ADO/ODBC must determine the data type for each column in your Excel worksheet or range (this is not affected by Excel cell formatting settings). This is done by scanning the number of rows defined by the registry setting TypeGuessRows (the default value is 8). Quite often there are numeric values mixed with text values in the same column; for example, sorted financial coding structures often have numbers at the beginning of the list (001-999) followed by text (AAA-XXX). Both the Jet and the ODBC provider return the data of the majority type and return NULL (empty) values for the minority data type. If the two types are equally mixed in the column, the provider chooses numeric over text. Plus, when the first rows are less than 255 characters long, the driver will truncate all data to 255 characters, even if cell values below are longer.

One of the ways of avoiding this problem is using Import Mode (IMEX=1). This forces mixed data to be converted to text. However, it only works when the first TypeGuessRows rows have mixed values. If all values are numeric, then setting IMEX=1 will not convert the default data type to text; it will remain numeric. The best combination to avoid problems is TypeGuessRows=0 plus IMEX=1. Setting TypeGuessRows=0 forces the driver to read all the data to determine the field type. Unfortunately, our own experience shows that quite often it does not work, and when it does work it slows everything down. So the only solution is not to use mixed values and to be prepared for the data being truncated to 255 characters.
As you can see, there is plenty of room for improvement for Microsoft. The first version of Microsoft Office was released a very long time ago, and it is still not possible to read an Excel file correctly. Why don't they just add another setting to the ODBC driver, EverythingIsText=1? It is so simple and would solve all the problems.
All our products work correctly with Excel because we do not use OLE DB or ODBC.
Visual Importer ETL is capable of running SQL before and after loading data into the target database.
Typical usage of this feature would be truncating the table before loading the data and running some calculations after the load.
It is possible to run any SQL statement here, including stored procedures. All statements must be separated by a delimiter.
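As a sketch of the delimiter idea (the actual parsing inside the tools may differ, and a real splitter must also respect delimiters inside quoted strings), a before/after-load script can be broken into individual statements like this:

```python
# Sketch: split a before/after-load SQL script into individual statements
# using a configurable delimiter. Statement text is a made-up example.

def split_statements(script, delimiter=";"):
    """Split a script on the delimiter, dropping empty fragments."""
    parts = [s.strip() for s in script.split(delimiter)]
    return [p for p in parts if p]

before_load = "TRUNCATE TABLE sales_staging; EXEC recalc_totals;"
print(split_statements(before_load))
```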
All our ETL tools have no limitations, which means you can load your data today and solve all the data quality problems right now.
A new version of Database Browser is available for download
In this release, we have greatly enhanced SQL Execution log functionality.
This SQL script will create and populate the Time dimension for a PostgreSQL-based data warehouse.
CREATE TABLE time_dim
(
  time_key integer NOT NULL,
  time_value character(5) NOT NULL,
  hours_24 character(2) NOT NULL,
  hours_12 character(2) NOT NULL,
  hour_minutes character(2) NOT NULL,
  day_minutes integer NOT NULL,
  day_time_name character varying(20) NOT NULL,
  day_night character varying(20) NOT NULL,
  CONSTRAINT time_dim_pk PRIMARY KEY (time_key)
)
WITH (OIDS=FALSE);

COMMENT ON TABLE time_dim IS 'Time Dimension';
COMMENT ON COLUMN time_dim.time_key IS 'Time Dimension PK';

insert into time_dim
SELECT cast(to_char(minute, 'hh24mi') as numeric) time_key,
       to_char(minute, 'hh24:mi') AS time_value,
       -- Hour of the day (0 - 23)
       to_char(minute, 'hh24') AS hour_24,
       -- Hour of the day (0 - 11)
       to_char(minute, 'hh12') hour_12,
       -- Hour minute (0 - 59)
       to_char(minute, 'mi') hour_minutes,
       -- Minute of the day (0 - 1439)
       extract(hour FROM minute)*60 + extract(minute FROM minute) day_minutes,
       -- Names of day periods
       case when to_char(minute, 'hh24:mi') BETWEEN '06:00' AND '08:29'
            then 'Morning'
            when to_char(minute, 'hh24:mi') BETWEEN '08:30' AND '11:59'
            then 'AM'
            when to_char(minute, 'hh24:mi') BETWEEN '12:00' AND '17:59'
            then 'PM'
            when to_char(minute, 'hh24:mi') BETWEEN '18:00' AND '22:29'
            then 'Evening'
            else 'Night'
       end AS day_time_name,
       -- Indicator of day or night
       case when to_char(minute, 'hh24:mi') BETWEEN '07:00' AND '19:59' then 'Day'
            else 'Night'
       end AS day_night
FROM (SELECT '0:00'::time + (sequence.minute || ' minutes')::interval AS minute
      FROM generate_series(0,1439) AS sequence(minute)
      GROUP BY sequence.minute) DQ
ORDER BY 1
Based on the information provided here.
+ Up to 2 times faster data extraction from SQL Server
+ Up to 2 times faster data extraction from ODBC sources
+ Up to 40 percent faster loading of data into SQL Server
+ Up to 40 percent faster loading of data into ODBC
+ Up to 10 percent faster QVX file creation
+ Up to 10 percent faster loading of data into PostgreSQL
- Various bug fixes and improvements
Based on user feedback, in the latest versions of Advanced ETL Processor and Visual Importer ETL we have made it possible to edit Insert/Update/Delete/Count SQL statements manually.
To enable this option:
To regenerate the SQL statements, clear them and open the editor again. Keep the field order the same, and if there are any changes to the target table structure, regenerate everything again.
Copyright © 2022 ETL-tools.com. All Rights Reserved.