Automation

  • Agent Email Notifications

    Based on customer feedback, the latest version of Advanced ETL Processor and Visual Importer ETL introduces Agent Email Notifications.

    This option makes it possible to monitor the state of the repository connection.

    For example, IT may decide to reboot the database server where the repository is hosted, or
    the repository database may run out of disk space.

    While the agent is up and running it constantly checks the repository connection. In case of failure, it sends an email notification and aborts all tasks currently being executed.

    Once the connection to the repository is re-established, the agent sends another email and starts executing scheduled tasks again.
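
    The agent implements this out of the box; purely as an illustration, the watchdog pattern it follows looks roughly like the sketch below (assuming a pyodbc DSN for the repository and hypothetical SMTP relay and addresses):

    import smtplib
    import time
    from email.message import EmailMessage

    import pyodbc

    def notify(subject, body):
        # Hypothetical relay and addresses; the agent takes these from its settings
        msg = EmailMessage()
        msg["Subject"] = subject
        msg["From"] = "agent@example.com"
        msg["To"] = "admin@example.com"
        msg.set_content(body)
        with smtplib.SMTP("mail.example.com") as smtp:
            smtp.send_message(msg)

    connected = True
    while True:
        try:
            pyodbc.connect("DSN=REPOSITORY", timeout=5).close()
            if not connected:
                notify("Reconnected!", "Repository connection restored")
                connected = True
        except pyodbc.Error as error:
            if connected:                      # first failure: alert, abort running tasks
                notify("Repository connection lost", str(error))
                connected = False
        time.sleep(30)                         # re-check twice a minute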


    Example of Error Email:

    Error Message: [DBNETLIB][ConnectionWrite (send()).]General network error. Check your network documentation
    Computer Name: COMPUTER
    User Name: bill
    Os Version: Windows 8.1 (Version 6.3, Build 9600, 64-bit Edition)
    Number Of Threads: 11
    Repository: MS SQL Server Repository
    Type: MS SQL Server
    Server: COMPUTER\SQLEXPRESS2008
    Database: REPOSITORY
    User: sa

    Example of Reconnection Email

    Error Message: Reconnected!
    Computer Name: COMPUTER
    User Name: bill
    Os Version: Windows 8.1 (Version 6.3, Build 9600, 64-bit Edition)
    Number Of Threads: 11
    Repository: MS SQL Server Repository
    Type: MS SQL Server
    Server: COMPUTER\SQLEXPRESS2008
    Database: REPOSITORY
    User: sa

    Example of Agent Log

    The agent log is located in the same folder as the agent itself.

  • Automate FTP Uploads and Downloads

    FTP protocol is used daily by most businesses around the world. Our ETL Software makes it easy to automate FTP uploads and downloads.
    Complete FTP Automation

    Users can create and delete directories on the FTP server, delete files, move files and produce a list of files. Here is a basic example of an FTP workflow.
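
    In the product these are package actions and no coding is required; as a minimal sketch, the same operations with Python's standard ftplib (hypothetical host, credentials and paths):

    from ftplib import FTP

    with FTP("ftp.example.com") as ftp:
        ftp.login("user", "password")
        ftp.mkd("/inbox/archive")             # create a directory
        print(ftp.nlst("/inbox"))             # produce a list of files
        with open("report.csv", "rb") as f:   # upload a file
            ftp.storbinary("STOR /inbox/report.csv", f)
        with open("copy.csv", "wb") as f:     # download a file
            ftp.retrbinary("RETR /inbox/report.csv", f.write)
        ftp.rename("/inbox/report.csv", "/inbox/archive/report.csv")  # move a file
        ftp.delete("/inbox/archive/report.csv")                       # delete a file
        ftp.rmd("/inbox/archive")             # remove a directory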

    ftp automation workflow example

     

    Download free trial of Advanced ETL Processor

    We also offer an FTP monitor which constantly checks FTP servers for new file arrivals. When a new file is detected, it executes a predefined action.

    ftp monitor 1

    ftp monitor 2

    ftp monitor 3

    Making Life Of End Users And Developers Easier

    We are constantly working on making our software better. This dialogue is a fully functioning FTP client: it allows users to see the files on the FTP server. But we did not stop there. Users can also download and upload files, and create and delete remote directories.

    ftp package action

    Supported FTP Protocols
    • FTP – File Transfer Protocol
    • FTPS – FTP over implicit TLS/SSL
    • FTPES – FTP over explicit TLS/SSL (require TLS)
    • FTPES – FTP over explicit TLS/SSL
    • SFTP – SSH File Transfer Protocol

    ftp connection dialogue

    What is FTP
    File Transfer Protocol (FTP) is a standard network protocol used to copy a file from one host to another over a TCP/IP-based network, such as the Internet. FTP is built on a client-server architecture and utilizes separate control and data connections between the client and server applications, which solves the problem of different end-host configurations (e.g. operating system, file names). FTP is used with user-based password authentication or with anonymous user access.

    FTP Automation with Visual  Importer ETL
    How FTP works

    A client makes a connection to the server on TCP port 21. This connection, called the control connection, remains open for the duration of the session, with a second connection, called the data connection, opened on port 20 as required to transfer file data. The control connection is used to send administrative data (e.g. commands, identification, passwords). Commands are sent by the client over the control connection in ASCII and terminated by a carriage return and line feed. For example, "RETR filename" would transfer the specified file from the server to the client. Due to this two-port structure, FTP is considered out-of-band, as opposed to an in-band protocol such as HTTP.

    The server responds on the control connection with three-digit status codes in ASCII with an optional text message; for example, "200" (or "200 OK.") means that the last command was successful. The numbers represent the code and the optional text represents an explanation (e.g. <OK>) or needed parameters (e.g. <Need account for storing file>). A file transfer in progress over the data connection can be aborted using an interrupt message sent over the control connection.

    FTP can be run in active mode or passive mode, which controls how the data connection is opened. In active mode, the client sends the server the IP address and port number that the client will listen on for the data connection, and the server opens the connection. Passive mode was devised for use where the client is behind a firewall and unable to accept incoming TCP connections: the server sends the client an IP address and port number, and the client opens the connection to the server.
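
    The two modes are easy to observe with the debug output of Python's ftplib (hypothetical host and credentials):

    from ftplib import FTP

    ftp = FTP("ftp.example.com")   # control connection on port 21
    ftp.set_debuglevel(1)          # print commands and replies on the control channel
    ftp.login("user", "password")
    ftp.set_pasv(True)             # passive mode: the client opens the data connection
    ftp.retrlines("LIST")          # a data connection is opened for the transfer
    ftp.set_pasv(False)            # active mode: the server connects back to the client
    ftp.retrlines("LIST")
    ftp.quit()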

    Security considerations

    FTP has no built-in encryption, meaning all transmissions are in clear text: usernames, passwords, FTP commands and transferred files can be read by anyone sniffing the network. This is a problem common to many Internet protocol specifications written prior to the creation of SSL, such as HTTP, SMTP and Telnet. The common solution to this problem is to use either SFTP (SSH File Transfer Protocol) or FTPS (FTP over SSL), which adds SSL or TLS encryption to FTP.

    Anonymous FTP

    A host that provides an FTP service may additionally provide anonymous FTP access. Users typically log in to the service with an 'anonymous' account when prompted for a username. Although users are commonly asked to send their email address in lieu of a password, no verification is actually performed on the supplied data.

  • Automatic File Compression

    Visual Importer ETL provides extended support for automating compression and decompression operations. It supports a wide range of archive file formats:

    • Zip archive: *.zip;*.jar;*.ear;*.war;*.cbz;*.apk;*.wsz;*.wal;
      *.xpi;*.crx;*.dfsz;*.pcv;*.bsz;*.mskin;*.wmz;*.ipa;*.docx;
      *.xlsx;*.pptx;*.sxw;*.sxi;*.sxt;*.sxd;*.sxc;*.sxm;*.sxg;*.stw;
      *.sti;*.std;*.stc;*.odh;*.odd;*.odt;*.odm;*.ods;*.ots;*.odg;
      *.otg;*.odp;*.otp;*.odf;*.odb
    • BZip2 archive: *.bz2;*.bzip2;*.tbz2;*.tbz
    • 7z archive: *.7z
    • Tar archive: *.tar
    • GZip archive: *.gz;*.gzip;*.tgz;*.tpz
    • Swf archive: *.swf

    Archive dialog

    About data compression

    Data compression refers to the process of encoding large files in order to reduce the number of bytes. Smaller files occupy less memory space and are easier to transfer over a network, which could be the Internet, an intranet or a LAN. Automating file compression can be of great help here.

    Compression involves applying an algorithm to the data that removes redundant bits. Redundancy is very common in most types of computer files, and the Visual Importer compression mechanism gets rid of it, thus compressing the file. Extremely large files can also be broken into smaller parts and sent over a network; the parts are then reassembled at the receiving end to recover the original data.
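
    For instance, the zip part of this maps onto a few lines with Python's standard zipfile module (hypothetical file names):

    import zipfile

    # Compress: redundancy in the input is what makes the archive smaller
    with zipfile.ZipFile("reports.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
        zf.write("january.csv")
        zf.write("february.csv")

    # Decompress: reassemble the original data
    with zipfile.ZipFile("reports.zip") as zf:
        zf.extractall("unpacked")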

  • Creating Telegram bots

    The latest version of our ETL Software supports reading and posting Telegram messages using the Bot API. This opens a new window of opportunity for chat automation.

    Usage example:

    Note: The user must talk to the Bot first
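
    A minimal sketch of the same flow using the Bot API directly (hypothetical token; the requests library stands in for the product's actions):

    import requests

    TOKEN = "123456789:REPLACE_ME"   # hypothetical token issued by @BotFather
    API = "https://api.telegram.org/bot" + TOKEN

    # Read incoming messages; this is why the user must talk to the bot first:
    # getUpdates only returns messages already sent to the bot
    for update in requests.get(API + "/getUpdates").json().get("result", []):
        message = update.get("message")
        if message and "text" in message:
            # Post a reply into the same chat
            requests.post(API + "/sendMessage",
                          json={"chat_id": message["chat"]["id"],
                                "text": "Received: " + message["text"]})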

    Telegram package

    Useful links

    We would like to thank Vassalli from Galvolux for providing us with useful feedback

  • Data Synchronization part 1

    This article was created to assist our customers with data synchronization.

    In our examples we will be synchronizing data between different physical SQL servers; the same methodology can be used for any database.

    Here is the table creation script:
    CREATE TABLE TBL_SOURCE
    (
      PRIMARY_KEY int NOT NULL IDENTITY (1,1),
      DATA varchar(50) NULL,
      UPDATE_FLAG varchar(5) NULL
    ) ON [PRIMARY]
    GO

    ALTER TABLE TBL_SOURCE ADD CONSTRAINT PK_TBL_SOURCE PRIMARY KEY CLUSTERED
    (PRIMARY_KEY)
    WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    GO

    CREATE TABLE TBL_TARGET
    (
      PRIMARY_KEY int NOT NULL,
      DATA varchar(50) NULL,
      UPDATE_FLAG varchar(5) NULL
    ) ON [PRIMARY]
    GO

    ALTER TABLE TBL_TARGET ADD CONSTRAINT PK_TBL_TARGET PRIMARY KEY CLUSTERED
    (PRIMARY_KEY)
    WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    GO
    Objective

    • Load new records only into the target table
    • Once loading is completed, source records must be marked as updated

    Assumptions

    • Data is only inserted into the source table
    • There are no changes to the source records
    • There are no deletions from the source records

    Implementation

    Synchronization1
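
    As a standalone sketch of the objective under the stated assumptions (using pyodbc with hypothetical DSNs; in the product this is a transformation plus an SQL action):

    import pyodbc

    src = pyodbc.connect("DSN=SOURCE_SERVER")   # hypothetical ODBC DSNs
    tgt = pyodbc.connect("DSN=TARGET_SERVER")

    # Load new records only: rows not yet flagged as updated
    rows = src.execute(
        "SELECT PRIMARY_KEY, DATA FROM TBL_SOURCE WHERE UPDATE_FLAG IS NULL"
    ).fetchall()

    if rows:
        tgt.cursor().executemany(
            "INSERT INTO TBL_TARGET (PRIMARY_KEY, DATA) VALUES (?, ?)",
            [(r.PRIMARY_KEY, r.DATA) for r in rows])
        tgt.commit()

        # Once loading is completed, mark the source records as updated
        # (safe under the stated assumptions: inserts only, no updates or deletes)
        src.execute("UPDATE TBL_SOURCE SET UPDATE_FLAG = 'Y' WHERE UPDATE_FLAG IS NULL")
        src.commit()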

  • Example of data warehouse workflow

    Background:

    This article was created as an answer to one of our ETL forum posts; we hope it will help users in a similar situation.

    "The scenario is based on files landing in a monitored directory. At this stage, I am not worried about volume or frequency. But will be handling each file individually for internal reasons. Each file has a unique name but a common format of xxx-yyyymmddhhmmssnnnn.ext

    "xxx" represents a client prefix (eg. abc, bcd, cde). The format of each file for the same client will be the same. But the format is different for each client. The file extension '.ext' will generally be .csv, .xml, or even .xls.

    The file transformations between clients will be different. However, the output is always the same format and the target is a SQL Server database table.

    I am developing each of the transformations in isolation, but with the same basic structure.

    My idea is to have a single monitor looking at a single folder for *.* That then executes a single Package based on the creation of a file. The Monitor passes the file to the Package that then does some environment management, put the source file in a staging location, then based on the file prefix conditionally (Probably through a CASE action) call the associated transformation and load to the SQL server and then clean up."

    Notes:

    The monitor (Directory monitor) is a Windows service that constantly monitors a directory for changes. When a change is detected, it adds a predefined action to the execution queue. The monitor passes the file name as a variable into the action; this variable name is <Directory Monitor File Name>.

    Solution:

    Example of data warehouse workflow

    "Get ClientID" script

     is a starting point, it extracts the first three characters of the file name. The Result is stored in '<ClientID>' variable.

    var
      FileName: String;
      ClientID: String;
    begin
      FileName := GetVariable('<Directory Monitor File Name>');
      ClientID := UpperCase(SubString(FileName, 1, 3));
      SetVariable('<ClientID>', ClientID);
      Result := True;
    end;

    Check Client directory

    Checks if the client directory exists; if it does not, it is created together with its subdirectories.

     Every transformation has an ID. We use this ID to execute a specific transformation for a specific customer.

    Table Transformations

    Holds the mapping between ClientID and TransformationID:

    CREATE TABLE [dbo].[Transformations]
    (
      [ClientID] [char](3) NOT NULL,
      [TransformationID] [int] NULL,
      CONSTRAINT [PK_Transformations] PRIMARY KEY CLUSTERED ([ClientID] ASC)
      WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO

    Lookup

    "Insert Into Transformations"

    Sql script inserts a record for missing clients:

    insert into Transformations(
    [ClientID],
    [TransformationID])
    values ('<ClientID>',0)

    GO

    Once the file is ready for processing, it is moved into the Staging Area.

    Get transformation ID

    Executes the following SQL:

     Select TransformationID
    from Transformations
    where ClientID='<ClientID>'

    The result is saved into the <TransformationID> variable.

    Execute Transformation script

    begin
      ExecuteObject(GetVariable('<TransformationID>'));
      Result := True;
    end;

    Upon completion, the file is moved into the Processed Area.
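
    Putting the pieces together, the dispatch logic can be sketched in standalone Python (hypothetical DSN; in the product this work is done by the monitor, the scripts and the actions above):

    import pyodbc

    def dispatch(file_name):
        # "Get ClientID": first three characters of xxx-yyyymmddhhmmssnnnn.ext
        client_id = file_name[:3].upper()
        repo = pyodbc.connect("DSN=REPOSITORY")
        row = repo.execute(
            "SELECT TransformationID FROM Transformations WHERE ClientID = ?",
            client_id).fetchone()
        if row is None:
            # "Insert Into Transformations": register the missing client
            repo.execute("INSERT INTO Transformations (ClientID, TransformationID) "
                         "VALUES (?, 0)", client_id)
            repo.commit()
            return None                  # no transformation assigned yet
        return row.TransformationID      # ID passed to "Execute Transformation"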

    Directory Structure

     DirectoryStructure

    Variable as Mask

    Mask

    Error message

    Acts as a default error handler

  • Executing single copy of the package

    The Problem

    Quite often it is necessary to make sure that only one copy of the package is being executed at a given time. For example, the package below is being executed every 2 minutes but it might take more than 2 minutes to complete.
    The package creates text files, and it is not possible to open the same file twice for writing. So we have to check whether the same package is already being executed and, if it is, abort the execution.

    Check Object Execution1

    The Solution

    This can be easily done by using the SQL Data Check Action.
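
    Conceptually the check is a single query against the repository QUEUE table. A hedged sketch of the idea (hypothetical DSN and object ID; status values may differ between repository versions, 'P' marks a pending entry):

    import pyodbc

    repo = pyodbc.connect("DSN=REPOSITORY")
    (queued,) = repo.execute(
        "SELECT COUNT(*) FROM QUEUE WHERE OBJECT_ID = ? AND STATUS = 'P'",
        126).fetchone()
    if queued > 0:
        raise SystemExit("Another copy is already queued or running - aborting")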

  • Extracting Data from Active Directory

    Great news: all our ETL products now support extracting data from Active Directory.

    Extracting Data from Active Directory

    The following data can be exported from Active Directory:
    List of Users:
    • distinguishedName
    • cn
    • instanceType
    • objectCategory
    • objectClass
    • objectSid
    • sAMAccountName
    • title
    • Name
    • givenName
    • UserPrincipalName
    • description
    • mail
    • physicaldeliveryofficename
    • telephoneNumber
    • facsimileTelephoneNumber
    • mobile
    • company
    • department
    • useraccountcontrol
    • lastLogon
    • pwdLastSet
    • whenCreated
    • whenChanged
    List of Groups:
    • distinguishedName
    • cn
    • groupType
    • instanceType
    • objectCategory
    • objectClass
    • objectSid
    • sAMAccountName
    • description
    • whenCreated
    • whenChanged
    List of Computers:
    • distinguishedName
    • cn
    • instanceType
    • objectCategory
    • objectClass
    • objectSid
    • sAMAccountName
    • description
    • useraccountcontrol
    • operatingSystem
    • lastLogon
    • whenCreated
    • whenChanged
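
    For readers curious what such an extraction involves under the hood, here is a rough equivalent using the ldap3 library (hypothetical server, credentials and base DN; the product needs no coding for this):

    from ldap3 import Server, Connection, ALL

    server = Server("dc.example.com", get_info=ALL)
    conn = Connection(server, "EXAMPLE\\bill", "password", auto_bind=True)
    conn.search("dc=example,dc=com",
                "(&(objectCategory=person)(objectClass=user))",
                attributes=["distinguishedName", "cn", "sAMAccountName",
                            "mail", "department", "whenCreated"])
    for entry in conn.entries:
        print(entry.sAMAccountName, entry.mail)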

    Professional and Enterprise versions to follow

  • File operation actions and Data integration

    The File Operation Action can be used to perform different types of operations on files and directories, such as create, move and delete, and also to set attributes for files and folders. The operations which can be performed by the File Operation Action are shown in the snippet below.

    1. Copy File(s) can be used to copy a file from one location to another. You will need to configure both the source and destination when using this operation of the File Operation Action.
    2. Move File(s) can be used to move a file from one location to another.
    3. Rename File can be used to rename a file available within a specified location.
    4. Create Directory can be used to create a folder. You will need to specify the source location where the folder needs to be created.
    5. Delete File(s) can be used to delete a user-specified file from the specified folder.
    6. Merge Files can be used to append one file to another.
    7. Set Attributes can be used to set attributes to files and folders. Different types of attributes that can be set are hidden, archive, read-only and system.

    One important benefit to note is that the File Operation Action can use a mask to work with multiple files, so there is no need to use loops.
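
    As a plain-Python illustration of the mask idea (hypothetical paths), one pattern expands to all matching files, which is what saves you an explicit loop action in the package:

    import glob
    import shutil

    # copy every CSV matching the mask in one operation
    for path in glob.glob(r"C:\inbox\*.csv"):
        shutil.copy(path, r"C:\staging")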

    File Operation Action

  • Getting List of Active Directory group(s) members

    Improvements to Active Directory data processing

    It is now possible to extract:
    • List of group(s) members
    • List of groups the user(s) belongs to

    active directory datasource

    We recommend using a filter at design time to reduce the load on the server.

    active directory connection

    When no user name/password is provided, the current user is used.

    Run the following SQL against the repository to enable the Active Directory connection:

    Insert into OBJECT_TYPES
     (OBJECT_TYPE,OBJECT_TYPE_NAME)
    Values
    ('312','Active Directories')

    Insert into OBJECT_TYPES
     (OBJECT_TYPE,OBJECT_TYPE_NAME)
    Values
    ('313','Active Directory')

    INSERT INTO OBJECTS_TREE
    (OBJECT_ID,
     PARENT_ID,
     NAME,
     OBJECT_TYPE
    )
    Select
      OBJECT_ID*-6 as OBJECT_ID,
      PARENT_ID,
      'Active Directory' as NAME,
      312 as OBJECT_TYPE
    From
      OBJECTS_TREE
    where OBJECTS_TREE.OBJECT_ID=16

  • Inserting content of the file inside email message text

    Here is a very basic example of email automation

    This is an important feedback message stored in a file, which must be passed to the supplier:

     Insert file1

    Automation Package

    Insert file2

    Script Package Action

    Insert file4

    begin
      SetVariable('<FileData>', FileToString('C:\tmp\text.txt'));
      Result := True;
    end;
    Send Email Action

    Insert file5

    Variable Value after script execution is completed

    Insert file3

    About Variables

    Variables are used to pass parameters between various objects.
    For example, an SQL script has failed and you would like to email the SQL to the developer:
    insert <Sql Script> into the email message and it will be replaced with the actual SQL.

    Note:

    There are several ways to create variables
    1. Using script action
    2. Using transformation action
    3. Using calculation object within transformation action or package
    4. Using Set Variable Action
    5. Manually using Global Variables

  • Loading Excel files from emails - questions from the customer

    I've been looking at your Advanced ETL Processor for taking data from Excel sheets into a MySQL database. We receive regular Excel sheets from about 20-25 suppliers quoting rates for services; each supplier has a different format (although some commonalities between them).

    Some questions:
    • Can loading definitions be set up to run automatically / semi-automatically? i.e. when a particular supplier provides their new Excel sheet can the previous loading definition set up for this supplier be accessed and used to load data via some schedule?

      Answer: Yes, it is possible. You can filter incoming emails using the sender email or subject and execute the appropriate data transformation script.
      Business Automation Package
    • Can it take data from different worksheets within a single Excel file and join to load into MySQL tables?

      Answer:
      Yes, it is possible. Every Excel sheet, print area or filter is treated as a separate table, so you can write an SQL statement to join the tables.
      Excel File
      Data Reader
    • Can it be configured to ignore a defined number of header lines? Or look for a specified text to identify the next line to start importing data from?

      Answer:
      Yes, it is possible. You can use a validator to filter header/footer records out.

       
    • Can it handle logic to identify records that shouldn't be imported? e.g. if one column contains a numeric currency value and it finds "NA" can it ignore these records? i.e. not attempt to load them

      Answer:
      Yes, it is possible. Again you can use a validator to filter records out
      Validation Editor
      Is Equal Properties
    • Some Excel sheets contain a record where a cell contains a list of values (generally comma separated)... can it break this up and store multiple records? i.e. cell A may contain "John", cell B "$0.015" and cell C "A, X, 56, Z".... can this be imported to store four records:
      (John, $0.015, A) (John, $0.015, X) (John, $0.015, 56) (John, $0.015, Z)

      Answer:
      Yes, it can be done using an UnPivot object (see the sketch after this list).
      UnPivot Properties
      UnPivot Results
    • Sometimes have a variation on the above, e.g. cell A may contain "Susan", cell B "$0.017" and cell C may be empty this time.... can this be imported to store one record:
      (Susan, $0.017, NULL)

      Answer:
      Yes, it is possible. You can use a combination of unpivot and validator to filter null records out
    • Another variation on the above #5 is that the multi-cell content may exist on another worksheet, e.g. on worksheet X, cell A may contain "Simon", cell B "$0.03" and then on worksheet Y, cell A would contain "Simon" and cell B "T, U, P".... can this be imported to store three records:
      (Simon, $0.03, T) (Simon, $0.03, U) (Simon, $0.03, P)

      Answer:
      Yes, it can be done using separate transformations
      Data Transformation
    • Excel sheets can also contain footnotes; i.e. records at the bottom of the data requiring loaded that should be ignored. Can a footer definition be set up to be ignored? e.g. text to specify the start of the footer or a certain number of blank rows in the worksheet to define a point at which import stops?

      Answer:
      Sure, use the validator to achieve that.
    • Can a timestamp be added by the import tool and stored with all records as an extra field into the MySQL table? This timestamp may either be sysdate or taken from a specified fixed cell in the Excel sheet - can both of these options be supported?

      Answer:
      Yes, it can be done by using a transformation object.
    • As a variation to the above #9, can the timestamp come from different locations depending upon the value of a field within a row being imported? e.g. if cell C was "Increase" on the row currently being imported then obtain a timestamp from cell A3 otherwise take from cell A4. Then as each row is imported, the timestamp value being stored comes from either A3 or A4 depending upon content.

      Answer:
      Yes, it is possible. It can be done in several ways: for example, you can validate the timestamp format and, if there is something wrong with it, use the current date and time instead; or you can write your own calculation transformation and use any logic you wish.
      Is Date Properties
      calculation properties
    • Can it read in content within a single cell such as "18:00-07:59" and break into two fields as "18:00" and "07:59" at times?

      Answer:
      Yes, you can achieve this by using a splitter object.
      Splitter Properties
      Transformation Editor
    • Is any modification/preparation of the Excel file required before your product can process it?

      Answer: In some cases, it might be necessary. It is important to keep the format the same, so no future modifications would be required.
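
    Question 5 (and its variation in question 6) maps neatly onto a small pandas sketch; this only illustrates the unpivot idea, not how the product implements it (column names are hypothetical):

    import pandas as pd

    df = pd.DataFrame({"name": ["John", "Susan"],
                       "rate": ["$0.015", "$0.017"],
                       "codes": ["A, X, 56, Z", None]})
    df["codes"] = df["codes"].str.split(", ")   # empty cells stay NaN
    print(df.explode("codes"))
    # John expands to four rows (A, X, 56, Z); Susan keeps one row with NaN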

    Additional information:

    What you are trying to achieve is very interesting, and we would be happy to assist you in difficult cases. If some functionality is missing, we will add it for you.

  • Loading Financial Data

    We are a small accounting company based in Melbourne. Over the last year our customer base has doubled, and it is still growing. We spend a lot of time thinking about how we can optimise our business processes. The first thing we did was create standard Excel templates for everything. For example, the standard expenses form consists of customer id, employee id, expense date, category and amount. Using standard forms saved us a lot of time, but once all the forms were filled in they had to be processed manually. Our people were constantly under pressure and we had to employ agency staff on a temporary basis.

    Loading Data into General Ledger

    Not any more.

    During a routine "Google search" we discovered Advanced ETL Processor Ent, which quickly became a cornerstone for automating our business processes.

    By using Advanced ETL Processor Ent we were able to eliminate most of our manual tasks

    It is like a spider that sits in the middle, connected to all our systems.

    For example, here is our expenses form process:
    • A customer emails us an expenses form
    • An Advanced ETL Processor Ent package connects to the mail server and downloads the attachment,
    • Loads the Excel file into our accounting system,
    • Runs an SQL script to validate the submitted data,
    • Emails a report back to the person who submitted the expenses form,
    • At the end of the month a summary report is emailed to the company automatically.

    Carl Barret,
    Accountant

  • Looping through the list of values

    Quite often it is necessary to go through a list of values and perform a specific action, for example load files from a directory and process them one by one.

    In the latest version of Advanced ETL Processor and Visual Importer ETL, we added Loop Action.

    Now it is possible to loop through a list of values:

      1. Loaded from a text file
      2. Loaded from a database
      3. Entered by the user
      4. Numeric values "From To" and "Down To"
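
    For comparison, the first option (values loaded from a text file) takes only a few lines in plain Python (hypothetical path):

    with open(r"C:\temp\files.txt") as f:          # list created at a previous step
        for loop_value in (line.strip() for line in f):
            print("processing", loop_value)        # the action run per <Loop Value>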

    Example:

    Looping through the list of the files:

    Loop1

    Saving the list of files into a file

    Creating List of files

    Looping through the list of files created at the previous step

    Looping through list of files

    <Loop Value> holds the current filename

    Copy File

  • Processing files with different field positions part 1

    Question from the customer

    I have 350+ text files in the same folder.

    1. There is no pattern in file names so they cannot be grouped together.
    2. All CSV files have certain columns that are found in all (Store Name, Address, City, State, Zip Code, Phone Number)
    3. Fields order is not guaranteed
    4. Some CSV file will have extra columns such as Fax Number, Store Website, Store Hours...

    I would like to combine all 350 CSV files and line up the common columns but also include the extra columns.
    Also, I would like to add a new column to each record which would be the file name.
    Also, a log of the operation that would identify the problem CSVs (ones unable to import/combine because of format errors) and add the file name and record name to the output file.

    Proposed Solution

    The first thing we need to do is create the list of fields; this can be easily done with the help of the validator, unpivot and deduplication objects.

    As usual, the list of fields is on the first line of the file, so we need to generate file line numbers and reject all lines except the first one.

    Creating a List of Fields step by step guide
    Generating File Line Number

    Creating List Of Fields 1

    Filtering First Line

    Creating List Of Fields 2

    Unpivot the Data

    Creating List Of Fields Using Unpivot

    Deduplicate the data

    Creating List Of Fields Using Deduplicator

    The result

    Creating List Of Fields Result
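
    The same field-list step can be sketched in plain Python to make the logic concrete (hypothetical folder; the product does this with the objects shown above):

    import csv
    import glob

    fields = []
    for path in glob.glob(r"C:\csv\*.csv"):
        with open(path, newline="") as f:
            header = next(csv.reader(f), [])   # keep the first line only
            for name in header:                # unpivot the header...
                if name not in fields:         # ...and deduplicate
                    fields.append(name)
    print(fields)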

     

  • Processing files with different field positions part 2

    Now that we have the list of fields, here is how we can process the data.

    The first thing we need to do is place the list of fields next to the data.

    Source File example

    SourceFile1

    Desired Result

    Step1

    This can be easily done with the keep value transformation function; in addition, we use the Metadata function to get the source file name.

    Using Keep value transformation function

    The next step is to get rid of the first line of every file using the validator with the Is Equal validation function.

    Removing first line using validator

    And the last step is to extract field values based on the field name position.

    Extracting Field Name
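
    For reference, the whole realignment can be sketched in plain Python (hypothetical paths; 'fields' is the combined list built in part 1):

    import csv
    import glob

    fields = ["Store Name", "Address", "City", "State", "Zip Code",
              "Phone Number", "Fax Number"]        # union built in part 1

    with open(r"C:\out\combined.csv", "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=["FileName"] + fields,
                                restval="", extrasaction="ignore")
        writer.writeheader()
        for path in glob.glob(r"C:\csv\*.csv"):
            with open(path, newline="") as f:
                for row in csv.DictReader(f):      # header consumed per file
                    row["FileName"] = path         # add the file name column
                    writer.writerow(row)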

    Feedback from the customer: Thanks for your help. Your solution was the only one to achieve this (tried all your competitors)

  • Processing Healthcare Data

    Our partner Albatross Financial Solutions uses Advanced ETL Processor Enterprise as part of iPARS – Integrated Patient Acuity Recording System

    iPARS is designed to provide on-line real-time recording of Patient Acuity information by nursing staff. The system architecture consists of the following 7 main elements:

    HL7 Processing

    1 Data Sources

    iPARS is designed with the ability to extract Patient Ward Stay information from a variety of data sources. It can interface with text files, receive messages from HL7 and link to SQL\Oracle databases. The basic requirement of these data sources is the availability of up-to-date Patient Ward Stay information.

    2 iPARS ETL Tool

    The iPARS ETL Tool is based on the Advanced ETL Processor (AETL) software application supplied by DB Software Laboratory (http://www.etl-tools.com/etl-tools/advanced-etl-processor-enterprise/overview.html ). This system is configured to extract information from the ward stay text file. It is scheduled to run automatically every night at 4 o’clock. The information is loaded from the text file directly into the iPARS SQL database.  The same application is used to receive and process all the HL7 messages.

    Advanced ETL Processor Package example

    Advanced ETL Processor Exporting All Tables From Act

    3 SQL Database

    The iPARS SQL database stores all the Patient Ward Stay information in the DS_WARDSTAY table. The information is saved with Patient ID, Ward Code and Date as the primary keys. In addition to the Patient Ward Stay table, the SQL database contains the main WARD_STATS and BED_STATS tables, which hold Acuity data for each patient-ward-date-shift (3 shifts per day) combination. In addition to these main tables, the iPARS SQL database also holds a variety of lookup tables including Ward Lookup, Acuity Levels description, etc. Another very important part of the database is the security table USERS. It holds the user name and password. Each user can access information only for a specific set of wards; the link between user name and ward is stored in the USERS table as well.

    4 System Administrator Unit

    The iPARS System Administrator unit is based on the Active Table Editor (ATE) software application supplied by DB Software Laboratory.

    5 Web Form

    The iPARS Web form is designed using ASP .NET technology and is split into two separate parts. Part 1 is a user login facility that filters ward lists based on the user name. Part 2 enables the user to record Acuity information, ward data (number of admissions, absences, transfers, etc), patient notes, etc. At present the form is designed and tested for PC technology, but in the future the plan is to create additional versions for iPads and tablet PCs.

    6 iPARS Reports

    This element of the iPARS solution is still in the design stage. At present all the acuity reports and analysis are delivered using QlikView. This platform provides virtually unlimited flexibility in designing analysis and will bring the system in line with the overall front-end reporting technology to be deployed across the NHS Trusts.

    7 PLICS Interface

    Brings Patient Acuity information into the existing cost allocation process

  • Running ETL Packages from Active Table Editor

    Several of our customers have asked us recently how to run Advanced ETL Processor or Visual Importer ETL Package from Active Table Editor or any other application.

    All of these customers use the agent for execution.

    The agent reads the QUEUE table several times per minute; if there is something to execute, it runs it.

    The QUEUE table has a lot of fields and it is not very clear which values to use.

    Here is the basic SQL Server stored procedure:

    CREATE PROCEDURE [dbo].[EXECUTE_OBJECT]
    (
      @LogDirectory NVARCHAR(255) = '',
      @ComputerToExecuteOn NVARCHAR(255) = '',
      @ObjectID Int,
      @UseAgent Int = 1,          -- 1 = True, 0 = False
      @ExecuteOnComputer Int = 1
    )
    AS
    BEGIN
      BEGIN TRAN
      -- Calculating ID
      DECLARE @ID Int
      SELECT @ID = max(ID) + 1 FROM ID_GENERATOR WHERE id_type = 2
      UPDATE ID_GENERATOR SET ID = @ID WHERE id_type = 2

      INSERT INTO QUEUE
      (QUEUE_ID, OBJECT_ID, OBJECT_TYPE, OBJECT_NAME, LOG_FILE,
       SUBMITED, STATUS, USE_AGENT, EXECUTE_ON_COMPUTER,
       COMPUTER_TO_EXECUTE_ON, COMPUTER_NAME, OSUSER)
      SELECT @ID as QUEUE_ID,
             OBJECT_ID,
             OBJECT_TYPE,
             NAME as OBJECT_NAME,
             @LogDirectory + '\Package_' + Cast(@ID as VARCHAR) + '.log' as LOG_FILE,
             getdate() as SUBMITED,
             'P' as STATUS,
             @UseAgent as USE_AGENT,
             @ExecuteOnComputer as EXECUTE_ON_COMPUTER,
             @ComputerToExecuteOn as COMPUTER_TO_EXECUTE_ON,
             @@SERVERNAME as COMPUTER_NAME,
             user as OSUSER
      FROM objects_tree
      WHERE object_id = @ObjectID
      COMMIT
    END
    GO

    Execution example:

    EXEC [EXECUTE_OBJECT] 'C:\TEMP','DBSLCOMPUTER',126,1,0
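
    The same call can be issued from any language with ODBC access; for example, a hedged Python sketch (hypothetical DSN):

    import pyodbc

    # Queue object 126 for execution, equivalent to the EXEC above
    repo = pyodbc.connect("DSN=REPOSITORY", autocommit=True)
    repo.execute("{CALL EXECUTE_OBJECT (?, ?, ?, ?, ?)}",
                 ("C:\\TEMP", "DBSLCOMPUTER", 126, 1, 0))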

    Parameters:

    Log Directory:

    Log Directory

    Object ID:

    The rest of the parameters are self-explanatory.

    Active Table Editor Execute Button

  • Using Python for business process automation

    In the latest version of Advanced ETL Processor and Visual Importer ETL, we have introduced support for running Python scripts.

    At the moment they can only be executed from the package script object.

    PythonScript

    The following functions can be used:
    • WriteToLog
    • SetVariable
    • GetVariable
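
    A hedged sketch of what such a script might look like (assuming the three functions above are injected into the script's namespace by the host application):

    # file name arrives from the monitor; the client id is passed on
    file_name = GetVariable('<Directory Monitor File Name>')
    SetVariable('<ClientID>', file_name[:3].upper())
    WriteToLog('ClientID set to ' + file_name[:3].upper())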

    Python integration is at a very early stage and we are still working on making it better.

    About Python

    Python is an interpreted, object-oriented, high-level programming language with dynamic semantics. Its high-level built-in data structures, combined with dynamic typing and dynamic binding, make it very attractive for Rapid Application Development, as well as for use as a scripting or glue language to connect existing components together. Python's simple, easy-to-learn syntax emphasizes readability and therefore reduces the cost of program maintenance. Python supports modules and packages, which encourages program modularity and code reuse. The Python interpreter and the extensive standard library are available in source or binary form without charge for all major platforms and can be freely distributed.

    More information:

    https://www.python.org/doc/essays/blurb/

    Using Python
    Python and Google drive

    https://developers.google.com/drive/v3/web/quickstart/python

    Python and Google documents

    https://developers.google.com/gdata/articles/python_client_lib#introduction

    Python and Dropbox

    https://www.dropbox.com/developers-v1/core/docs/python

    Python and Amazon AWS

    https://aws.amazon.com/sdk-for-python/

    Shopify

    https://docs.shopify.com/api/guides/supported-libraries

    Big Commerce

    https://developer.bigcommerce.com/api/clients

    Practical Example

    PythonExample

 
