Automation

  • Based on customer feedback, in the latest version of Advanced ETL Processor and Visual Importer ETL we have introduced Agent Email Notifications.

    This option helps you monitor the state of the repository connection.

    For example, someone may decide to reboot the database server where the repository is hosted, or the repository database may run out of disk space.

    While the agent is up and running it constantly checks the repository connection. In case of failure it sends an email notification and aborts all tasks currently being executed.

    Once the connection to the repository is re-established, the agent sends another email and starts executing scheduled tasks again.
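    The notification logic above can be sketched as a small state machine: probe the connection, and fire an email on every state change. This is an illustrative sketch, not the agent's actual code; `watch_repository` and its arguments are hypothetical names.

```python
def watch_repository(probe_results, notify):
    """Walk a sequence of connection probes (True = repository reachable)
    and fire a notification on every state change, as the agent does."""
    connected = True                # the agent starts with a working connection
    for ok in probe_results:        # the real agent polls in an endless loop
        if connected and not ok:
            notify("failure")       # send the error email, abort running tasks
            connected = False
        elif not connected and ok:
            notify("reconnected")   # send the reconnection email, resume work
            connected = True

events = []
watch_repository([True, False, False, True], events.append)
# events is now ["failure", "reconnected"]: one email per state change
```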

    Agent Email Notifications

    Example of Error Email:

    Error Message: [DBNETLIB][ConnectionWrite (send()).]General network error. Check your network documentation
    Computer Name: COMPUTER
    User Name: bill
    Os Version: Windows 8.1 (Version 6.3, Build 9600, 64-bit Edition)
    Number Of Threads: 11
    Repository: MS Sql Server Repository
    Type: MS Sql Server
    Server: COMPUTER\SQLEXPRESS2008
    Database: REPOSITORY
    User: sa

    Example of Reconnection Email

    Error Message: Reconnected!
    Computer Name: COMPUTER
    User Name: bill
    Os Version: Windows 8.1 (Version 6.3, Build 9600, 64-bit Edition)
    Number Of Threads: 11
    Repository: MS Sql Server Repository
    Type: MS Sql Server
    Server: COMPUTER\SQLEXPRESS2008
    Database: REPOSITORY
    User: sa

    Example of Agent Log

    The agent log is located in the same folder as the agent itself.

    Download Advanced ETL Processor Enterprise Download Visual Importer ETL Enterprise
  • Visual Importer ETL provides extended support for automating compression and decompression operations. It supports a wide range of archive file formats, such as:

    Zip archive *.zip;*.jar;*.ear;*.war;*.cbz;*.apk;*.wsz;*.wal;
    *.xpi;*.crx;*.dfsz;*.pcv;*.bsz;*.mskin;*.wmz;*.ipa;*.docx;
    *.xlsx;*.pptx;*.sxw;*.sxi;*.sxt;*.sxd;*.sxc;*.sxm;*.sxg;*.stw;
    *.sti;*.std;*.stc;*.odh;*.odd;*.odt;*.odm;*.ods;*.ots;*.odg;
    *.otg;*.odp;*.otp;*.odf;*.odb
    BZip2 archive *.bz2;*.bzip2;*.tbz2;*.tbz
    7z archive *.7z
    Tar archive *.tar
    GZip archive *.gz;*.gzip;*.tgz;*.tpz
    Swf archive *.swf

    Archive dialog

    About data compression

    Data compression is the process of encoding files so that they take fewer bytes. Smaller files occupy less storage space and are easier to transfer over a network, be it the Internet, an intranet or a LAN. Automating file compression can be of great help here.

    Compression works by applying an algorithm that removes redundant bits from the data. Redundancy is very common in most types of computer files, and the Visual Importer compression mechanism gets rid of it, thus shrinking the file. Extremely large files can also be split into smaller parts before being sent over a network; these parts are then reassembled at the receiving end to recover the original data.
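    The idea is easy to demonstrate with Python's standard gzip module (GZip being one of the formats listed above): redundant input shrinks dramatically, and decompression restores it byte for byte.

```python
import gzip

text = b"redundant " * 1000            # highly redundant input, 10000 bytes
packed = gzip.compress(text)           # the redundancy is encoded away
restored = gzip.decompress(packed)     # lossless: the original comes back

assert len(packed) < len(text)
assert restored == text
```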

    Visit Support Forum Learn More Download Buy It
  • The latest version of our ETL software supports reading and posting Telegram messages using the Bot API. This opens a new window of opportunity for chat automation.

    Usage example:

    Note: The user must talk to the Bot first
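    For readers curious what the Bot API call looks like underneath, here is a minimal sketch of a sendMessage request built with the Python standard library. The token and chat id are placeholders: the token comes from @BotFather, and the chat id is only usable after the user has messaged the bot.

```python
import json
import urllib.request

def build_send_message_request(token, chat_id, text):
    """Build a Telegram Bot API sendMessage request (not yet sent)."""
    url = "https://api.telegram.org/bot%s/sendMessage" % token
    payload = json.dumps({"chat_id": chat_id, "text": text}).encode("utf-8")
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"})

# urllib.request.urlopen(build_send_message_request(token, chat_id, "Load finished"))
# would perform the actual HTTP POST
```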

    Telegram package

    Upgrade procedure

    You will need to run the following SQL against the repository to enable Telegram connections:

    Insert into OBJECT_TYPES
    (OBJECT_TYPE,OBJECT_TYPE_NAME)
    Values
    ('911','Telegram Connections');

    Insert into OBJECT_TYPES
    (OBJECT_TYPE,OBJECT_TYPE_NAME)
    Values
    ('912','Telegram Connection');

    INSERT INTO OBJECTS_TREE
    (OBJECT_ID,
    PARENT_ID,
    NAME,
    OBJECT_TYPE
    )
    Select
    OBJECT_ID*-6 as OBJECT_ID,
    PARENT_ID,
    'Telegram' as NAME,
    911 as OBJECT_TYPE
    From
    OBJECTS_TREE
    where OBJECTS_TREE.OBJECT_ID=16;

    We would like to thank Vassalli from Galvolux for providing us with useful feedback.

    Download Advanced ETL Processor Enterprise now!
  • This article was created to assist our customers with data synchronization.

    In our examples we will be synchronizing between different physical SQL Servers; the same methodology can be used for any database.

    Here is the table creation script:

    CREATE TABLE TBL_SOURCE
    (
      PRIMARY_KEY int NOT NULL IDENTITY (1,1),
      DATA varchar(50) NULL,
      UPDATE_FLAG varchar(5) NULL
    ) ON [PRIMARY]
    GO

    ALTER TABLE TBL_SOURCE ADD CONSTRAINT PK_TBL_SOURCE PRIMARY KEY CLUSTERED
      (PRIMARY_KEY)
      WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
            ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    GO

    CREATE TABLE TBL_TARGET
    (
      PRIMARY_KEY int NOT NULL,
      DATA varchar(50) NULL,
      UPDATE_FLAG varchar(5) NULL
    ) ON [PRIMARY]
    GO

    ALTER TABLE TBL_TARGET ADD CONSTRAINT PK_TBL_TARGET PRIMARY KEY CLUSTERED
      (PRIMARY_KEY)
      WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
            ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    GO

    Objective

    Load new records only into the target table.
    Once loading is completed, the source records must be marked as updated.

    Assumptions

    Data is only inserted into the source table
    There are no changes to the source records
    There are no deletions from the source records

    Implementation

    Synchronization1
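    The two-step logic (load new records, then flag them) can be sketched in portable form. This is an illustrative sketch using an in-memory SQLite database standing in for the two SQL Servers, with UPDATE_FLAG left NULL for unprocessed rows (an assumption about the flag convention).

```python
import sqlite3

con = sqlite3.connect(":memory:")   # stand-in for the two SQL Servers
con.execute("CREATE TABLE TBL_SOURCE "
            "(PRIMARY_KEY INTEGER PRIMARY KEY, DATA TEXT, UPDATE_FLAG TEXT)")
con.execute("CREATE TABLE TBL_TARGET "
            "(PRIMARY_KEY INTEGER PRIMARY KEY, DATA TEXT, UPDATE_FLAG TEXT)")
con.executemany("INSERT INTO TBL_SOURCE (DATA, UPDATE_FLAG) VALUES (?, NULL)",
                [("a",), ("b",), ("c",)])

# 1. load only the new records (UPDATE_FLAG is still NULL)
con.execute("""INSERT INTO TBL_TARGET
               SELECT PRIMARY_KEY, DATA, UPDATE_FLAG
               FROM TBL_SOURCE WHERE UPDATE_FLAG IS NULL""")
# 2. mark the source records as processed
con.execute("UPDATE TBL_SOURCE SET UPDATE_FLAG = 'Y' WHERE UPDATE_FLAG IS NULL")
```

    Running the pair again moves nothing, because no source rows are left unflagged.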

  • The Problem

    Quite often it is necessary to make sure that only one copy of a package is executed at any given time.
    For example, the package below is executed every 2 minutes, but it might take more than 2 minutes to complete.
    The package creates text files, and it is not possible to open the same file twice for writing.
    So we have to check whether the same package is already being executed and, if it is, abort the execution.

    Check Object Execution1

    The Solution

    This can be easily done by using the SQL Data Check Action.
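    The check itself boils down to one query against the repository's queue. The sketch below illustrates the idea with SQLite; the QUEUE column names and the 'R' (running) status code are assumptions for illustration, not the repository's documented schema.

```python
import sqlite3

def already_running(con, object_id):
    """SQL data check: is another copy of this package still executing?
    (the 'R' status code here is an assumption for illustration)"""
    count = con.execute(
        "SELECT COUNT(*) FROM QUEUE WHERE OBJECT_ID = ? AND STATUS = 'R'",
        (object_id,)).fetchone()[0]
    return count > 0

con = sqlite3.connect(":memory:")   # stand-in for the repository database
con.execute("CREATE TABLE QUEUE (OBJECT_ID INTEGER, STATUS TEXT)")
con.execute("INSERT INTO QUEUE VALUES (126, 'R')")   # package 126 is running
```

    If the check returns true, the package aborts instead of opening the same output files twice.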

  • The latest versions of Advanced ETL Processor, Visual Importer ETL, Data Exchange Wizard and the QlikView connector now support extracting data from Active Directory.

    Extracting Data from Active Directory

    The following data can be extracted from Active Directory:

    List of Users:

    • distinguishedName
    • cn
    • instanceType
    • objectCategory
    • objectClass
    • objectSid
    • sAMAccountName
    • title
    • Name
    • givenName
    • UserPrincipalName
    • description
    • mail
    • physicaldeliveryofficename
    • telephoneNumber
    • facsimileTelephoneNumber
    • mobile
    • company
    • department
    • useraccountcontrol
    • lastLogon
    • pwdLastSet
    • whenCreated
    • whenChanged

    List of Groups:

    • distinguishedName
    • cn
    • groupType
    • instanceType
    • objectCategory
    • objectClass
    • objectSid
    • sAMAccountName
    • description
    • whenCreated
    • whenChanged

    List of Computers:

    • distinguishedName
    • cn
    • instanceType
    • objectCategory
    • objectClass
    • objectSid
    • sAMAccountName
    • description
    • useraccountcontrol
    • operatingSystem
    • lastLogon
    • whenCreated
    • whenChanged

    Professional and Enterprise versions to follow

    Learn More Download Buy It
  • Improvements to Active Directory data processing

    It is now possible to extract:

    • List of group(s) members
    • List of groups the user(s) belongs to

    active directory datasource

    We recommend using a filter at design time to reduce the load on the server.

    active directory connection

    When no user name/password is provided, the current user is used.

    Run the following SQL against the repository to enable Active Directory connections:

    Insert into OBJECT_TYPES
     (OBJECT_TYPE,OBJECT_TYPE_NAME)
    Values
    ('312','Active Directories')

    Insert into OBJECT_TYPES
     (OBJECT_TYPE,OBJECT_TYPE_NAME)
    Values
    ('313','Active Directory')

    INSERT INTO OBJECTS_TREE
    (OBJECT_ID,
     PARENT_ID,
     NAME,
     OBJECT_TYPE
    )
    Select
      OBJECT_ID*-6 as OBJECT_ID,
      PARENT_ID,
      'Active Directory' as NAME,
      312 as OBJECT_TYPE
    From
      OBJECTS_TREE
    where OBJECTS_TREE.OBJECT_ID=16

    Visit Support Forum Learn More Download Buy It
  • Here is a very basic example of email automation.

    An important feedback message is stored in a file and must be passed to the supplier.


    Automation Package


    Script Package Action


    begin
     SetVariable('<FileData>',FileToString('C:\tmp\text.txt'));
     Result:=True;
    end;

    Send Email Action


    Variable Value after script execution is completed


    About Variables

    Variables are used to pass parameters between various objects.
    For example, an SQL script has failed and you would like to email the SQL to the developer.
    Insert <Sql Script> into the email message and it will be replaced with the actual SQL.

    Note:

    There are several ways to create variables:
    1. Using script action
    2. Using transformation action
    3. Using calculation object within transformation action or package
    4. Using Set Variable Action
    5. Manually using Global Variables
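    The substitution mechanism itself is simple to illustrate: placeholders in angle brackets are replaced by their current values. This is a hypothetical sketch of the idea, not the product's actual implementation; `apply_variables` is an invented name.

```python
def apply_variables(template, variables):
    """Expand <Name> placeholders with their current values, mirroring how
    variables are substituted into messages and actions."""
    for name, value in variables.items():
        template = template.replace("<%s>" % name, value)
    return template

message = apply_variables("Script failed:\n<Sql Script>",
                          {"Sql Script": "SELECT 1 FROM DUAL"})
# message now contains the actual SQL instead of the placeholder
```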

    Download Advanced ETL Processor Enterprise Download Visual Importer ETL Enterprise
  • We are a small accounting company based in Melbourne. Over the last year our customer base has doubled, and it is still growing. We spend a lot of time thinking about how we can optimise our business processes. The first thing we did was create standard Excel templates for everything. For example, a standard expenses form consists of customer id, employee id, expense date, category and amount. Using standard forms saved us a lot of time, but once all the forms were filled in they had to be processed manually. Our people were constantly under pressure and we had to employ agency staff on a temporary basis.

    Loading Data into General Ledger

    Not any more.

    During a routine Google search we discovered Advanced ETL Processor Ent, which quickly became a cornerstone of our business process automation.

    By using Advanced ETL Processor Ent we were able to eliminate most of our manual tasks.

    It is like a spider sitting in the middle, connected to all our systems.

    For example, for the expenses form the process is as follows:

    • A customer emails us an expenses form
    • An Advanced ETL Processor Ent package connects to the mail server and downloads the attachment,
    • Loads the Excel file into our accounting system,
    • Runs an SQL script to validate the submitted data,
    • Emails a report back to the person who submitted the expenses form
    • At the end of the month a summary report is emailed to the company automatically.

    Carl Barret,
    Accountant

    Download Advanced ETL Processor Enterprise now!

     

  • Quite often it is necessary to go through a list of values and perform a specific action, for example load files from a directory and process them one by one.

    In the latest version of Advanced ETL Processor and Visual Importer ETL we added the Loop Action.

    Now it is possible to loop through a list of values:

      1. Loaded from a text file
      2. Loaded from a database
      3. Entered by the user
      4. Numeric values "From To" and "Down To"

    Example:

    Looping through the list of the files:

    Loop1

    Saving the list of files into a file

    Creating List of files

    Looping through the list of files created at the previous step

    Looping through list of files

    <Loop Value> holds the current file name

    Copy File
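    The same steps, written out in plain Python: build the list of files, then loop over it, with each entry playing the role of <Loop Value>. This is a sketch of the concept using temporary directories, not the package itself.

```python
import os
import shutil
import tempfile

src = tempfile.mkdtemp()               # stand-in for the source directory
dst = tempfile.mkdtemp()               # stand-in for the target directory
for name in ("a.txt", "b.txt"):
    with open(os.path.join(src, name), "w") as f:
        f.write("data")

# build the list of files, then loop: each name plays the role of <Loop Value>
for loop_value in sorted(os.listdir(src)):
    shutil.copy(os.path.join(src, loop_value), os.path.join(dst, loop_value))
```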

    Download Free Trial Learn More About Our ETL Software
  • Question from the customer

    I have 350+ text files in the same folder.

    1. There is no pattern in the file names, so they cannot be grouped together.
    2. All CSV files have certain columns that are found in all of them (Store Name, Address, City, State, Zip Code, Phone Number)
    3. Field order is not guaranteed
    4. Some CSV files will have extra columns such as Fax Number, Store Website, Store Hours....

    I would like to combine all 350 CSV files and line up the common columns, but also include the extra columns.
    Also, I would like to add a new column to each record holding the file name.
    Also, I would like a log of the operation that would identify the problem CSV files (ones unable to import/combine because of format errors).

    Proposed Solution

    The first thing we need to do is create the list of fields. This can be easily done with help from the validator, pivot and deduplication objects.

    As usual, the list of fields is on the first line of each file, so we need to generate file line numbers and reject all lines except the first one.

    Creating List of Fields step by step guide

    Generating File Line Number

    Creating List Of Fields 1

    Filtering First Line

    Creating List Of Fields 2

    Unpivot the Data

    Creating List Of Fields Using Unpivot

    Deduplicate the data

    Creating List Of Fields Using Deduplicator

    The result

    Creating List Of Fields Result
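    The unpivot-then-deduplicate idea can be sketched in a few lines of Python: take the header of every file, flatten the names into one stream, and keep each name the first time it appears. The file contents here are invented stand-ins for the 350 CSV files.

```python
import csv
import io

files = {   # small stand-ins for the 350 CSV files
    "store_a.csv": "Store Name,Address,Fax Number\nA,1 Main St,555-0100\n",
    "store_b.csv": "Address,Store Name,Store Hours\n2 High St,B,9-5\n",
}

# take the first line of every file ("unpivot" the headers),
# then deduplicate while preserving the order of first appearance
fields = []
for text in files.values():
    header = next(csv.reader(io.StringIO(text)))
    for name in header:
        if name not in fields:
            fields.append(name)
# fields == ['Store Name', 'Address', 'Fax Number', 'Store Hours']
```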

     

    Visit Support Forum Learn More Download Buy It
  • Now that we have the list of fields, we can process the data.

    The first thing we need to do is place the list of fields next to the data.

    Source File example

    SourceFile1

    Desired Result

    Step1

    This can be easily done with the Keep Value transformation function; plus, we use the Metadata function to get the source file name.

    Using Keep value transformation function

    The next step is to get rid of the first line of every file using the validator + Is Equal validation function

    Removing first line using validator

    And the last step is to extract the field values based on the field name position

    Extracting Field Name
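    The end-to-end effect of these steps can be sketched with csv.DictReader: every record is aligned on the shared field list, missing columns become empty strings, and the source file name is appended. The sample data is invented for illustration.

```python
import csv
import io

def combine(files, fields):
    """Align every record on the shared field list, fill missing columns
    with '' and append the source file name to each row."""
    rows = []
    for fname, text in files.items():
        for rec in csv.DictReader(io.StringIO(text)):
            rows.append([rec.get(f, "") for f in fields] + [fname])
    return rows

files = {
    "a.csv": "Store Name,Address\nA,1 Main St\n",
    "b.csv": "Address,Store Name,Fax Number\n2 High St,B,555-0100\n",
}
rows = combine(files, ["Store Name", "Address", "Fax Number"])
# rows == [['A', '1 Main St', '', 'a.csv'], ['B', '2 High St', '555-0100', 'b.csv']]
```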

    Feedback from the customer: Thanks for your help. Your solution was the only one to achieve this (tried all your competitors)

    Visit Support Forum Learn More Download Buy It
  • Our partner Albatross Financial Solutions uses Advanced ETL Processor Enterprise as part of iPARS – Integrated Patient Acuity Recording System

    iPARS is designed to provide on-line real time recording of Patient Acuity information by nursing staff on wards. The system architecture consists of the following 7 main elements:

    HL7 Processing

    1 Data Sources

    iPARS is designed with the ability to extract Patient Ward Stay information from a variety of data sources. It can interface with text files, receive HL7 messages and link to SQL\Oracle databases. The basic requirement for these data sources is the availability of up-to-date Patient Ward Stay information.

    2 iPARS ETL Tool

    The iPARS ETL Tool is based on the Advanced ETL Processor (AETL) software application supplied by DB Software Laboratory (http://www.etl-tools.com/etl-tools/advanced-etl-processor-enterprise/overview.html ). This system is configured to extract information from the ward stay text file. It is scheduled to run automatically every night at 4 o’clock. The information is loaded from the text file directly into the iPARS SQL database.  The same application is used to receive and process all the HL7 messages.

    Advanced ETL Processor Package example

    Advanced ETL Processor Exporting All Tables From Act

    3 SQL Database

    The iPARS SQL database stores all the Patient Ward Stay information in the DS_WARDSTAY table. The information is saved with Patient ID, Ward Code and Date as the primary key. In addition to the Patient Ward Stay table, the SQL database contains the main WARD_STATS and BED_STATS tables, which hold Acuity data for each patient-ward-date-shift (3 shifts per day) combination. The iPARS SQL database also holds a variety of lookup tables, including Ward Lookup, Acuity Levels description, etc. Another very important part of the database is the security table USERS. It holds user names and passwords. Each user can access information only for a specific set of wards; the link between user name and ward is stored in the USERS table as well.

    4 System Administrator Unit

    The iPARS System Administrator unit is based on the Active Table Editor (ATE) software application supplied by DB Software Laboratory.

    5 Web Form

    The iPARS Web form is designed using ASP .NET technology and is split into two separate parts. Part 1 is a user login facility which filters the ward list based on the user name. Part 2 enables the user to record Acuity information, ward data (number of admissions, absences, transfers, etc), patient notes, etc. At present the form is designed and tested for PCs, but in the future the plan is to create additional versions for iPads and tablet PCs.

    6 iPARS Reports

    This element of the iPARS solution is still in the design stage. At present all the acuity reports and analysis are delivered using QlikView. This platform provides virtually unlimited flexibility in designing analysis and will bring the system in line with the overall front-end reporting technology to be deployed across the NHS Trusts.

    7 PLICS Interface

    Brings Patient Acuity information into the existing cost allocation process

    Contact Albatross FS

  • Several of our customers have asked us recently how to run Advanced ETL Processor or Visual Importer ETL Package from Active Table Editor or any other application.

    All of these customers use the agent for execution.

    The agent reads the QUEUE table several times per minute; if there is something to execute, it runs it.

    The QUEUE table has a lot of fields and it is not very clear which values to use.

    Here is a basic SQL Server stored procedure:

    CREATE PROCEDURE [dbo].[EXECUTE_OBJECT]
    (
      @LogDirectory NVARCHAR(255) = '',
      @ComputerToExecuteOn NVARCHAR(255) = '',
      @ObjectID Int,
      @UseAgent Int = 1,            -- 1 = True, 0 = False
      @ExecuteOnComputer Int = 1
    )
    AS
    BEGIN
      BEGIN TRAN
      -- Calculating the next queue ID
      DECLARE @ID Int
      SELECT @ID = MAX(ID) + 1 FROM ID_GENERATOR WHERE ID_TYPE = 2
      UPDATE ID_GENERATOR SET ID = @ID WHERE ID_TYPE = 2

      INSERT INTO QUEUE
      (QUEUE_ID, OBJECT_ID, OBJECT_TYPE, OBJECT_NAME, LOG_FILE,
       SUBMITED, STATUS, USE_AGENT, EXECUTE_ON_COMPUTER,
       COMPUTER_TO_EXECUTE_ON, COMPUTER_NAME, OSUSER)
      SELECT @ID AS QUEUE_ID,
             OBJECT_ID,
             OBJECT_TYPE,
             NAME AS OBJECT_NAME,
             @LogDirectory + '\Package_' + CAST(@ID AS VARCHAR) + '.log' AS LOG_FILE,
             GETDATE() AS SUBMITED,
             'P' AS STATUS,
             @UseAgent AS USE_AGENT,
             @ExecuteOnComputer AS EXECUTE_ON_COMPUTER,
             @ComputerToExecuteOn AS COMPUTER_TO_EXECUTE_ON,
             @@SERVERNAME AS COMPUTER_NAME,
             USER AS OSUSER
      FROM OBJECTS_TREE
      WHERE OBJECT_ID = @ObjectID
      COMMIT
    END
    GO

    Execution example:

    EXEC [EXECUTE_OBJECT] 'C:\TEMP','DBSLCOMPUTER',126,1,0
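    The core of the procedure, allocating the next id from ID_GENERATOR and inserting a pending ('P') queue row, can be mirrored in portable form. This sketch uses SQLite and a trimmed-down QUEUE table purely for illustration; the real table has many more columns, as shown above.

```python
import sqlite3
from datetime import datetime

def execute_object(con, object_id, log_directory):
    """Allocate the next queue id and insert a pending ('P') row,
    mirroring the stored procedure above in portable form."""
    cur = con.cursor()
    cur.execute("SELECT MAX(ID) + 1 FROM ID_GENERATOR WHERE ID_TYPE = 2")
    new_id = cur.fetchone()[0]
    cur.execute("UPDATE ID_GENERATOR SET ID = ? WHERE ID_TYPE = 2", (new_id,))
    cur.execute(
        "INSERT INTO QUEUE (QUEUE_ID, OBJECT_ID, LOG_FILE, SUBMITED, STATUS) "
        "VALUES (?, ?, ?, ?, 'P')",
        (new_id, object_id,
         "%s\\Package_%d.log" % (log_directory, new_id),
         datetime.now().isoformat()))
    con.commit()
    return new_id

con = sqlite3.connect(":memory:")   # stand-in for the repository
con.execute("CREATE TABLE ID_GENERATOR (ID_TYPE INTEGER, ID INTEGER)")
con.execute("INSERT INTO ID_GENERATOR VALUES (2, 100)")
con.execute("CREATE TABLE QUEUE (QUEUE_ID INTEGER, OBJECT_ID INTEGER, "
            "LOG_FILE TEXT, SUBMITED TEXT, STATUS TEXT)")
queue_id = execute_object(con, 126, r"C:\TEMP")
# queue_id == 101; the agent would pick up the pending row on its next poll
```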

    Parameters:

    Log Directory:

    Log Directory

    Object ID:

    The rest of the parameters are self-explanatory.

    Active Table Editor Execute Button

  • In the latest version of Advanced ETL Processor and Visual Importer ETL we have introduced support for running Python scripts.

    At the moment they can only be executed from the package Script object.

    PythonScript

    The following functions can be used:

    • WriteToLog
    • SetVariable
    • GetVariable

    Python integration is at a very early stage, and we are still working on making it better.

    About Python

    Python is an interpreted, object-oriented, high-level programming language with dynamic semantics. Its high-level built in data structures, combined with dynamic typing and dynamic binding, make it very attractive for Rapid Application Development, as well as for use as a scripting or glue language to connect existing components together. Python's simple, easy to learn syntax emphasizes readability and therefore reduces the cost of program maintenance. Python supports modules and packages, which encourages program modularity and code reuse. The Python interpreter and the extensive standard library are available in source or binary form without charge for all major platforms, and can be freely distributed.

    More information:

    https://www.python.org/doc/essays/blurb/

    Using Python

    Python and Google drive

    https://developers.google.com/drive/v3/web/quickstart/python

    Python and Google documents

    https://developers.google.com/gdata/articles/python_client_lib#introduction

    Python and Dropbox

    https://www.dropbox.com/developers-v1/core/docs/python

    Python and Amazon AWS

    https://aws.amazon.com/sdk-for-python/

    Shopify

    https://docs.shopify.com/api/guides/supported-libraries

    Big Commerce

    https://developer.bigcommerce.com/api/clients

    Practical Example

    PythonExample

    Visit Support Forum Learn More Download Buy It
