The ETL process determines how smoothly and accurately your raw data is converted into usable information. There are a number of ways to approach this process. Before you devote your resources to a particular path, however, it is key to develop an overall business picture of your ETL goals and future needs. This article will help you ask the right questions to get started and offer a high-level illustration of the options for ETL processing based on your needs.

Table of Contents:

  1. 3 ETL Process Considerations

  2. 3 ETL Process Configurations

  3. Building ETL with Integrate.io

3 ETL Process Considerations

Your ETL process depends a lot on what you are trying to accomplish. The good news is that these processes are easily tailored to your specific data set and your ultimate goals. Before implementing an ETL process, it is important to take stock of your current situation with regard to your data and what you want to achieve.

Why Do You Need Your Data?

There are a number of potential uses for the same set of data, and how you plan to use your data will affect how you implement the transformation process. You may have a very clear set of expectations for customer transaction data, for example, and can foresee the long-term use of that data. In this case, you may simply need to build a repository of customer activity and contact information. Or maybe you want to safeguard the raw data for more complex analysis down the line that goes beyond identification characteristics into actionable marketing, sales, and business intelligence, which will require a more complex ETL. Having a clear picture of what you ultimately need from your data will in many ways define your ETL.

How and Where Do You Get Your Data?

After you determine what information you need, you then must determine where your data is coming from and how it will be fed into the ETL process. When it comes to extraction (the "E" in ETL), you have two choices -- physical or logical.

  • Physical: This type of extraction can be done either online or offline. In an online physical extraction, the data moves directly from the source into the ETL pipeline; in an offline physical extraction, it is staged first and then fed separately into the downstream processes.
  • Logical: This type of extraction may require a full extraction of all source data. Otherwise, you'll need to track changes in the source so the data can be extracted incrementally (see the sketch after this list).
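As a rough illustration of the logical, change-tracking approach, here is a minimal Python sketch that pulls only rows modified since the previous run by keeping a timestamp watermark. The `orders` table, its columns, and the SQLite source are all assumptions for the example; a real pipeline would persist the watermark durably between runs.

```python
import sqlite3

def extract_incremental(conn: sqlite3.Connection, last_watermark: str):
    """Pull only rows changed since the last run (logical extraction).

    'orders' and 'updated_at' are hypothetical names for this sketch.
    """
    cursor = conn.execute(
        "SELECT id, payload, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    )
    rows = cursor.fetchall()
    # Advance the watermark to the newest row seen, so the next run
    # picks up where this one left off instead of re-reading everything.
    new_watermark = rows[-1][2] if rows else last_watermark
    return rows, new_watermark
```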

Cleaning Your Data

The next step is to clean the data automatically. Sometimes, data cleansing is part of the ETL process itself -- specifically, at the "T," or "transformation," stage, the data is scrubbed for duplications, errors, and irrelevant pieces. This makes the data ready to become actionable business intelligence, whatever form that intelligence takes. In the early days of data conversion, cleansing was often done manually, with a set of human eyes scanning the information for obvious mistakes. Now, more efficient automated systems handle this process for massive amounts of data, identifying errors that humans miss, and doing so in a fraction of the time.
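To make the transformation-stage scrubbing concrete, here is a minimal sketch of what automated cleansing can look like. The field names (`customer_id`, `email`, `amount`) are hypothetical and stand in for whatever your target schema actually requires.

```python
def clean_records(records: list[dict]) -> list[dict]:
    """Scrub a batch of raw records: drop duplicates, obvious errors,
    and fields that aren't needed downstream."""
    seen_ids = set()
    cleaned = []
    for record in records:
        # Skip records missing a usable identifier (an obvious error).
        if not record.get("customer_id"):
            continue
        # Skip duplicates of a record we've already kept.
        if record["customer_id"] in seen_ids:
            continue
        seen_ids.add(record["customer_id"])
        # Keep only the fields relevant to the target schema,
        # normalizing as we go.
        cleaned.append({
            "customer_id": record["customer_id"],
            "email": record.get("email", "").strip().lower(),
            "amount": record.get("amount"),
        })
    return cleaned
```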

Advanced Reading: The 6 Building Blocks of ETL Architecture

3 ETL Process Configurations

When it comes to how your ETL moves and treats your data, you have a few options. The following three configurations are the most effective for the majority of organizations. In a nutshell, they span the range from the automatic conversion of raw data as it arrives to a closer inspection of that data in small chunks, or batches, before it moves from its source to the target database.

Cloud-Based Data Warehouse

This option is a bit of a misnomer, as it claims to bypass the ETL process completely. While on the surface this kind of solution may look fully automated and worry-free, it is only appropriate for organizations that can gather, extract, and transform data according to preset analytical requirements. Furthermore, with such limited functionality, it is not meant for organizations that want to use raw data for multiple analytical purposes. For any organization that has a huge amount of data, or diverse uses for it, the following two processes might be more appropriate.

Stream Processing

Some organizations process data as it comes in. E-commerce sites, for example, may shuttle user data into the ETL process immediately after a transaction. That way it is cleaned up, transformed, and made into analytic data without any delay. The data is modified according to preset conditions and established processes. Although you can take that information and store it in a data lake -- available to send on to another series of target locations -- it is nonetheless restructured into usable information immediately and sent to specific locations.
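As a rough sketch of this per-event pattern, the example below transforms each record the moment it arrives and hands it straight to a loader. The event fields, the `load_to_target` stub, and the list standing in for a real message stream are all assumptions for illustration.

```python
import json

def load_to_target(record: dict) -> None:
    # Stand-in for writing to the real target (warehouse, data lake, etc.).
    print("loaded:", record)

def handle_event(raw_event: str) -> dict:
    """Clean and reshape one event the moment it arrives."""
    event = json.loads(raw_event)
    return {
        "user_id": event["user_id"],
        "total_cents": round(event["amount"] * 100),  # normalize currency units
        "channel": event.get("channel", "web"),
    }

def run_stream(source) -> None:
    # 'source' is any iterable of raw events: a queue consumer in
    # production, or a plain list here for demonstration.
    for raw in source:
        load_to_target(handle_event(raw))

run_stream(['{"user_id": 1, "amount": 19.99}'])
```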

Batch Processing

In batch processing, data sits in a kind of stasis before it enters the ETL process; when it does enter, it does so together with a group of other data. Typically, batches are created using reference codes of a specific type (e.g., data that shares the same location code or inventory number). From there, each batch is converted into a common file type, cleaned up, and loaded into the target database. Despite the delay in the movement of data compared to stream processing, batch processing may give some organizations more flexibility and control over how they manage their raw data.
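A minimal sketch of that grouping step might look like the following, assuming a hypothetical `location_code` field as the shared reference code.

```python
from collections import defaultdict

def batch_by_reference(records: list[dict]) -> dict[str, list[dict]]:
    """Group accumulated records by a shared reference code,
    so each batch is converted and loaded together."""
    batches = defaultdict(list)
    for record in records:
        batches[record["location_code"]].append(record)
    return dict(batches)

def process_batches(records: list[dict]) -> None:
    for code, batch in batch_by_reference(records).items():
        # Each batch moves through transform-and-load as one unit,
        # giving operators a natural checkpoint between groups.
        print(f"loading batch {code}: {len(batch)} records")

process_batches([
    {"location_code": "NYC", "sku": "A1"},
    {"location_code": "NYC", "sku": "B2"},
    {"location_code": "SFO", "sku": "C3"},
])
```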

Building ETL with Integrate.io

Unlike many other tools on the market, Integrate.io focuses squarely on data pipelines, which means you can develop the information route that works best for your data. With a simple, easy-to-navigate interface, Integrate.io lets you create a unique ETL process that is transparent, functional, and tailored precisely to the needs of your organization. There's no need to build an ETL process entirely from scratch -- you can use Integrate.io's tool to make the transformation intuitive and efficient, regardless of the volume of data you need to access and use. To experience the Integrate.io platform for yourself, contact us to schedule a demo.