NABU Extract-Transform-Load

A suite of applications which run alongside your legacy systems, continually building up-to-date Data Marts to feed into your Data Warehouse.


ETL Extract & ETL Share

ETL Extract

  • Extracts data from a legacy database into a local file repository.
  • Extracts all or parts of tables either as scripts or as separated files.
  • Can generate create scripts, similar to mysqldump.
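As an illustration only (not NABU's actual implementation), the two extract styles above can be sketched in Python, using `sqlite3` as a stand-in for a legacy source; `iterdump()` produces a mysqldump-style script, and the `csv` module produces a separated file. All function names here are hypothetical:

```python
import csv
import io
import sqlite3

def extract_to_script(conn):
    """Dump the whole database as a CREATE + INSERT script, mysqldump-style."""
    return "\n".join(conn.iterdump())

def extract_to_separated(conn, table, delimiter=","):
    """Dump one table as delimiter-separated text with a header row."""
    cur = conn.execute(f'SELECT * FROM "{table}"')
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter)
    writer.writerow([col[0] for col in cur.description])  # header from cursor metadata
    writer.writerows(cur)                                  # then the data rows
    return buf.getvalue()

# Stand-in legacy database with one table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO employee VALUES (?, ?)", [(1, "Ada"), (2, "Lin")])
conn.commit()

script = extract_to_script(conn)                       # script-style extract
employee_csv = extract_to_separated(conn, "employee")  # separated-file extract
```

In practice the output would be written to the local file repository rather than held in memory.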

ETL Share

  • Monitors files and folders and determines how to share their contents.
  • Can share via email, either as links to the source files or as attachments.
  • Can share via FTP/sFTP upload for remote sources.
  • Can share via file copy onto a mapped network drive or UNC path.
  • Supports file and folder compression.
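Two of the share mechanisms above, file copy and compression, can be sketched with the standard library (a minimal, illustrative sketch, not the product's code; the email and FTP/sFTP paths are omitted, and the "mapped drive" is simulated with a temporary folder):

```python
import pathlib
import shutil
import tempfile
import zipfile

def share_by_copy(src, dest_dir):
    """Share via file copy onto a mapped drive / UNC path (here a temp dir)."""
    dest_dir = pathlib.Path(dest_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    return shutil.copy2(src, dest_dir / pathlib.Path(src).name)

def compress(src_paths, zip_path):
    """Compress one or more files into a single zip archive before sharing."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for p in src_paths:
            zf.write(p, arcname=pathlib.Path(p).name)
    return zip_path

# Demo in a throwaway workspace standing in for the monitored folder.
work = pathlib.Path(tempfile.mkdtemp())
extract_file = work / "extract.csv"
extract_file.write_text("id,name\n1,Ada\n")

shared = share_by_copy(extract_file, work / "mapped_drive")
archive = compress([extract_file], work / "extract.zip")
```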



ETL Load

  • Loads the output from the ETL Extract program into the destination database.
  • Looks for specific files in specific folders, either scripts or separated files.
  • Where scripts exist, it optionally drops/creates the destination tables as necessary.
  • Data within the scripts/separated files is iterated and inserted into the destination DB.
  • Source scripts/separated files may then optionally be deleted automatically.
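The load cycle above — find scripts in a folder, run them against the destination, then clean up the sources — can be sketched as follows (illustrative only, with `sqlite3` standing in for the destination database and a hypothetical `load_scripts` helper):

```python
import pathlib
import sqlite3
import tempfile

def load_scripts(folder, conn, delete_after=True):
    """Find .sql scripts in a folder, run them against the destination DB,
    then optionally delete the source files."""
    loaded = []
    for path in sorted(pathlib.Path(folder).glob("*.sql")):
        conn.executescript(path.read_text())  # drop/create/insert as scripted
        if delete_after:
            path.unlink()                     # optional automatic cleanup
        loaded.append(path.name)
    return loaded

# A script such as ETL Extract might produce, dropped into the watched folder.
work = pathlib.Path(tempfile.mkdtemp())
(work / "employee.sql").write_text(
    "DROP TABLE IF EXISTS employee;"
    "CREATE TABLE employee (id INTEGER, name TEXT);"
    "INSERT INTO employee VALUES (1, 'Ada');"
)

dest = sqlite3.connect(":memory:")
loaded = load_scripts(work, dest)
rows = dest.execute("SELECT * FROM employee").fetchall()
```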



ETL Fetch

  • Process to fetch files from external sources.
  • Can retrieve files from HTTP/HTTPS servers; the link is parsed and the file downloaded automatically into a local folder.
  • Can pull files from emails, whether as UNC file references, FTP links, HTTP links or attachments, all downloaded into a local folder.
  • Can download automatically from FTP and sFTP servers, picking up produced files into a local folder.
  • Can pull files from other network shares, UNC drives, etc. into a local folder.
  • Can connect to Web Services, retrieve response data automatically and store it in a local folder.
  • Can connect to Web Pages, extract HTML tables and store them in a local folder as separated files for later automated loading.
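The last capability above, turning an HTML table into a separated file, can be sketched with the standard library's `html.parser` (a minimal sketch of the technique, not NABU's parser; it handles only the first simple table and fetches nothing over the network):

```python
import csv
import io
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Collect the rows of <table> markup as lists of cell strings."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], None, False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

def html_table_to_csv(html):
    """Extract table rows from an HTML fragment and emit CSV text."""
    parser = TableExtractor()
    parser.feed(html)
    buf = io.StringIO()
    csv.writer(buf).writerows(parser.rows)
    return buf.getvalue()

page = ("<table><tr><th>id</th><th>name</th></tr>"
        "<tr><td>1</td><td>Ada</td></tr></table>")
table_csv = html_table_to_csv(page)
```

In a real fetch, the page would first be downloaded from its URL and the resulting CSV written into the local folder for the load step.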



ETL Stage

  • Loads structured files into a staging database, effectively converting a file into a database table.
  • Can also stage data from legacy database sources, effectively copying all or parts of databases into a known structure within the staging DB; complex joins can be flattened into a single table in the staging DB.
  • Supports CSV, PSV, Excel (.xls and .xlsx), HTML, other separated-value formats and fixed-width files.
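Converting a separated file into a staging table can be sketched like this (illustrative only; `sqlite3` stands in for the staging database, the `stage_file` helper is hypothetical, and every column is staged as TEXT for simplicity):

```python
import csv
import io
import sqlite3

def stage_file(conn, table, text, delimiter=","):
    """Load separated-values text into a staging table, one column per header."""
    rows = list(csv.reader(io.StringIO(text), delimiter=delimiter))
    header, data = rows[0], rows[1:]
    cols = ", ".join(f'"{c}" TEXT' for c in header)
    conn.execute(f'DROP TABLE IF EXISTS "{table}"')
    conn.execute(f'CREATE TABLE "{table}" ({cols})')
    placeholders = ", ".join("?" * len(header))
    conn.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', data)
    return len(data)

staging = sqlite3.connect(":memory:")
n = stage_file(staging, "timesheets", "employee,hours\nAda,7.5\nLin,8\n")
staged = staging.execute("SELECT * FROM timesheets").fetchall()
```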



ETL Transform

  • Transforms data originating in either a legacy database or the staging database into dimension tables and fact/measure tables.
  • Dimension tables represent data which provides context to the facts being collected; examples include a date, a time, a product, an employee, etc.
  • Facts represent the measures being collected, for example the hours recorded by an employee at a particular site on a particular date.
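The dimension/fact split above can be sketched in SQL, here run through `sqlite3` (an illustrative star-schema sketch using the document's own timesheet example; table and column names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staging_timesheet (employee TEXT, work_date TEXT, site TEXT, hours REAL);
INSERT INTO staging_timesheet VALUES
  ('Ada', '2024-01-05', 'North', 7.5),
  ('Ada', '2024-01-06', 'North', 8.0),
  ('Lin', '2024-01-05', 'South', 6.0);

-- Dimensions: the context for the facts (who, when, where).
CREATE TABLE dim_employee (employee_key INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, day TEXT UNIQUE);
INSERT INTO dim_employee (name) SELECT DISTINCT employee  FROM staging_timesheet;
INSERT INTO dim_date (day)      SELECT DISTINCT work_date FROM staging_timesheet;

-- Facts: the measures, keyed by the dimensions.
CREATE TABLE fact_hours AS
SELECT e.employee_key, d.date_key, s.site, s.hours
FROM staging_timesheet s
JOIN dim_employee e ON e.name = s.employee
JOIN dim_date d     ON d.day  = s.work_date;
""")

fact_count = conn.execute("SELECT COUNT(*) FROM fact_hours").fetchone()[0]
total_hours = conn.execute("SELECT SUM(hours) FROM fact_hours").fetchone()[0]
```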



ETL Aggregate

  • Runs specific queries to create data aggregations. For example, there might be a dimension for employee, one for date and another for site, providing a rich view of who, when and where; in many cases, however, the business simply requires the total spend across all employees at a specific site on a specific date.
  • The aggregate process runs these queries to generate aggregate facts, which then speed up the delivery of reports to the client user experience.
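The kind of aggregation described — total spend per site per date, across all employees — is a GROUP BY materialised as a table. A minimal sketch via `sqlite3` (illustrative names and figures only):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fact_spend (employee_key INTEGER, date_key INTEGER, site TEXT, spend REAL);
INSERT INTO fact_spend VALUES
  (1, 10, 'North', 120.0),
  (2, 10, 'North',  80.0),
  (1, 11, 'North',  50.0);

-- Pre-computed aggregate fact: total spend per site per date, across employees.
CREATE TABLE agg_site_date_spend AS
SELECT site, date_key, SUM(spend) AS total_spend
FROM fact_spend
GROUP BY site, date_key;
""")

totals = dict(conn.execute(
    "SELECT date_key, total_spend FROM agg_site_date_spend WHERE site = 'North'"))
```

Reports then read the small aggregate table instead of scanning the full fact table.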



ETL Analyse

  • Automates the production of dashboards by prefetching results into a database cache to improve performance in the user experience.
  • Can generate static HTML reports into a file server cache, speeding up the generation of common reports using HTML5 templates.
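Rendering a pre-computed aggregate into a static HTML file in a cache folder can be sketched with `string.Template` (illustrative only; NABU's templates and cache layout are not shown here, and the report name is invented):

```python
import pathlib
import tempfile
from string import Template

REPORT = Template("""<!DOCTYPE html>
<html><body><h1>$title</h1>
<table><tr><th>Site</th><th>Total</th></tr>$rows</table>
</body></html>""")

def render_report(title, totals, cache_dir):
    """Render pre-computed totals into a static HTML file in the cache."""
    rows = "".join(f"<tr><td>{site}</td><td>{total}</td></tr>"
                   for site, total in totals.items())
    out = pathlib.Path(cache_dir) / "spend_report.html"
    out.write_text(REPORT.substitute(title=title, rows=rows))
    return out

cache = tempfile.mkdtemp()
report = render_report("Spend by site", {"North": 250.0, "South": 90.0}, cache)
```

Serving the pre-rendered file avoids re-running the underlying queries for every viewer.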