The Future of ETL and ELT

Tuesday, August 8, 2017 - 15:03

The data engineering industry has been through heavy disruption over the last five years. The increasing adoption of Hadoop, NoSQL databases, columnar analytic databases, and cloud platforms has changed data integration strategy. One change driven by these new platforms has been a move from the traditional ETL (Extract, Transform, Load) design to an ELT (Extract, Load, Transform) design.

What is ELT?

ELT and ETL are similar in that both are high-level strategies for moving data from one system to another. The difference is where the transformation processing occurs. In ETL, the transformation is done in an external application such as Pentaho, Talend, or Informatica, with the results loaded into the target. In ELT, the transformation happens in the target system: the data is first loaded into the target exactly as it looks in the source, and then scripts are run inside the target platform to perform the necessary transformations.
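
As a minimal sketch of the pattern on a SQL platform (all table and column names are hypothetical, and exact bulk-load syntax varies by database; the COPY form below is Redshift/Postgres-like):

    -- Step 1: Extract and Load. Copy the raw source data into a
    -- staging table exactly as it looks in the source.
    COPY stg_orders FROM 's3://my-bucket/orders.csv' CSV;

    -- Step 2: Transform. Run SQL inside the target platform to
    -- shape the staged data into its final form.
    INSERT INTO dim_orders (order_id, customer_id, order_total, order_date)
    SELECT order_id,
           customer_id,
           CAST(order_total AS DECIMAL(12,2)),
           CAST(order_date AS DATE)
    FROM stg_orders
    WHERE order_status <> 'CANCELLED';

In an ETL design, by contrast, the SELECT-style logic in step 2 would run inside an external engine before the data ever reached the target.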

ELT is not a new concept. It has been around for years and is a common approach. However, in many situations, it introduces difficult challenges that make ETL a more attractive option. Some of these challenges include:

  • ELT increases load on the target system during data loads. ETL isolates the processing power required to transform the data outside the target database. ELT puts all of this load on the target database, which can cause contention with active analytic queries.
  • ELT processes usually require scripting. For databases this means writing SQL queries. For Hadoop this means writing Spark, Pig, Hive, or MapReduce jobs. These scripts can be very complex and require significant expertise to maintain. Think 100+ line SQL queries (the sketch after this list gives a small taste).
  • Although the Transform is done by scripts inside the target platform, a solution is still required for Extract, Load, and overall job orchestration. This generally requires custom scripts or the use of traditional ETL tools such as Pentaho, Talend, or Informatica. (And if you’re using a traditional ETL tool for Extract and Load, the temptation to leverage its Transformation capabilities grows!)
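
To hint at why these scripts grow so large: even a modest in-database transform tends to accumulate joins, conditional logic, and window functions in a single statement. The fragment below is a hypothetical illustration (all names invented); real versions routinely run well past 100 lines:

    -- Deduplicate staged records, join to a lookup, and derive
    -- business fields in one pass (illustrative only).
    INSERT INTO fct_sales (sale_id, customer_key, channel, net_amount)
    SELECT s.sale_id,
           c.customer_key,
           CASE WHEN s.channel = 'W' THEN 'WEB' ELSE 'STORE' END,
           s.amount - COALESCE(s.discount, 0)
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY sale_id
                                  ORDER BY load_ts DESC) AS rn
        FROM stg_sales
    ) s
    JOIN dim_customer c ON c.customer_id = s.customer_id
    WHERE s.rn = 1;  -- keep only the latest record per sale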


Why ELT?

Performance. Data engineers are always under pressure to load data into the target as quickly as possible. The faster the data can be loaded, the sooner it can be acted upon, improving the speed of business. Performance differences between ELT and ETL on classic relational databases such as Oracle, SQL Server, MySQL, and Postgres have been negligible. However, the introduction of Hadoop and analytic databases has tipped the scales, making ELT orders of magnitude faster than ETL on these platforms.

Hadoop solved the system performance problem by massively scaling out processing. Because MapReduce and Spark move the processing to the data, and because of the sheer volume of data involved, ELT is the obvious choice on Hadoop. Analytic databases, on the other hand, degrade significantly under ETL-style processing: they are very slow at handling a multitude of record-level updates and inserts, and in some cases a single record insert can take seconds. What these databases excel at is set-based operations, so users should bulk load data into the database and then execute SQL scripts that manipulate (i.e., Transform) the required records in batch.
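
The contrast is easy to see in SQL. A hedged illustration (names hypothetical; UPDATE ... FROM and COPY syntax here are Redshift/Postgres-style): the row-at-a-time pattern is what analytic databases handle poorly, while the set-based pattern applies the same changes in a single statement:

    -- Slow on analytic databases: one statement per record,
    -- each insert or update potentially taking seconds.
    --   INSERT INTO customers VALUES (1001, 'Acme', 'ACTIVE');
    --   UPDATE customers SET status = 'ACTIVE' WHERE customer_id = 1001;
    --   ... repeated once per record ...

    -- Fast: bulk load the whole batch, then transform it with
    -- one set-based statement.
    COPY stg_customers FROM 's3://my-bucket/customers.csv' CSV;

    UPDATE customers
    SET status = s.status
    FROM stg_customers s
    WHERE customers.customer_id = s.customer_id;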

ELT is faster for these types of new systems, but as discussed, it still requires hand-written scripts, limiting the audience able to maintain it. Inquidia recognizes this problem and uses a couple of new products on the market to help solve it.

ELT Technology Options

For the cloud-based analytic databases Amazon Redshift, Snowflake, and Google BigQuery, Inquidia has been working with Matillion, a cloud-based visual ELT environment. There is no software license for Matillion; you simply pay a small upcharge on an EC2 instance and are billed for Matillion by the hour. Matillion uses the same concepts as traditional ETL tools, with orchestration jobs and transformation steps that read and reshape data in a visual development environment. The difference is that Matillion compiles the transformations into SQL that runs inside the database rather than in an external engine. Users familiar with traditional ETL tools will find the Matillion development interface easy to learn.

For ELT on Hadoop, Inquidia finds Pentaho’s adaptive execution layer to be a powerful solution. This layer allows users to run ordinary Pentaho transformations as Spark jobs on a Cloudera Hadoop cluster. Existing Pentaho users can take advantage of parallel processing on Hadoop using Spark with no additional training. The transformations are exactly the ones Pentaho developers are already used to writing, so they retain the same maintainability as traditional ETL transformations.

The Future of ETL and ELT

It’s clear that new data management solutions are upsetting the traditional ways of transporting and organizing data. With every release, traditional ETL providers are adding capabilities to support evolving data management environments, and new vendors are rapidly entering the market with purpose-built tools to solve these new challenges. As these new data management solutions mature, the purpose-built tools will continue to alter the traditional ETL and ELT landscape. With all of these changes, expect best practices for data management to evolve along with them. Inquidia can help you stay up to date with the latest data management solutions and designs.

Contact us today to find out how Inquidia can help you collect, integrate, and enrich your data. We do data. You can, too.
