Most businesses have data stored in a variety of locations, from in-house databases to SaaS platforms. To get a full picture of their finances and operations, they pull data from all those sources into a data warehouse or data lake and run analytics against it. But they don't want to build and maintain their own data pipelines.

Fortunately, it’s not necessary to build everything in-house. We put together this ETL tool comparison guide to help you choose the product that’s the best fit for your business.

Overview

Apache Airflow, mParticle, and Stitch are all popular platforms for moving data. Here's a side-by-side look at how they stack up against each other.

| | Apache Airflow | mParticle | Stitch |
| --- | --- | --- | --- |
| Focus | Orchestration, scheduling, workflows | Mobile-focused event tracking and data ingestion | Data ingestion, ELT |
| Database replication | Only via plugins | No database sources | Full table; incremental via change data capture or SELECT/replication keys |
| SaaS sources | Only via plugins | About 150 | More than 100 |
| Ability for customers to add new data sources | Yes | Yes, via Events API | Yes |
| Connects to data warehouses? Data lakes? | Yes / Yes | Yes / Yes | Yes / Yes |
| Transparent pricing | Yes | No | Yes |
| G2 customer satisfaction | Not rated | 3.5/5 | 4.8/5 |
| Support SLAs | No | No | Available |
| Purchase process | Free to download and use | Requires a conversation with sales | Options for self-service or talking with sales; also available from the AWS store |
| Compliance, governance, and security certifications | None | GDPR, ISO/IEC 27001:2013, SOC 2 Type II | HIPAA, GDPR, SOC 2 |
| Data sharing | Yes, via plugins | No | Yes, through Talend Data Fabric |
| Vendor lock-in | Free to use | Annual contracts | Month-to-month or annual contracts; open source integrations |
| Developer tools | Experimental REST API | Several APIs and client SDKs | Import API, Stitch Connect API for integrating Stitch with other platforms, Singer open source project |

Let's dive into some of the details of each platform.

Transformations

Apache Airflow

Apache Airflow is a powerful tool for authoring, scheduling, and monitoring workflows as directed acyclic graphs (DAGs) of tasks. A DAG is a topological representation of the way data flows within a system. Airflow manages execution dependencies among jobs (tasks, in Airflow parlance) in the DAG, and programmatically handles job failures, retries, and alerting. Developers can write Python code to transform data as an action in a workflow.
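The dependency-resolution idea behind a DAG can be illustrated without Airflow itself. The sketch below is plain Python, not Airflow's API: it derives a valid run order from declared task dependencies, which is the same problem an Airflow scheduler solves for the tasks a DAG wires together with `>>`. The task names are hypothetical.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical pipeline: transform depends on extract,
# and load depends on transform.
dependencies = {
    "transform": {"extract"},
    "load": {"transform"},
}

def run_order(deps):
    """Return one valid execution order for the task graph."""
    return list(TopologicalSorter(deps).static_order())

print(run_order(dependencies))  # ['extract', 'transform', 'load']
```

Airflow layers scheduling, retries, and alerting on top of exactly this kind of ordering: a task runs only after everything upstream of it has succeeded.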

mParticle

mParticle lets users track events across multiple applications and send event data to multiple destinations. It's designed to collect and connect customer data from multiple marketing, analytics, and data warehousing tools. It's not primarily an ETL tool, but it does include connectivity to some SaaS data sources and data warehouse destinations. To the extent that it performs ETL operations, mParticle focuses on the E and L, extraction and loading. It extracts raw data from sources and loads it into destinations without allowing users to define arbitrary in-pipeline transformations, though mParticle itself transforms events and objects to conform to its standard as it sends them between different tools, including to the data warehouse. Users can, however, create preload transformations in a graphical interface and write custom rules in JavaScript.

Stitch

Stitch is an ELT product. Within the pipeline, Stitch does only transformations that are required for compatibility with the destination, such as translating data types or denesting data when relevant. Stitch is part of Talend, which also provides tools for transforming data either within the data warehouse or via external processing engines such as Spark and MapReduce. Transformations can be defined in SQL, Python, Java, or via a graphical user interface.
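To illustrate the kind of compatibility transformation mentioned above, the sketch below shows one common approach to denesting: a nested JSON record is split into a parent row and child rows destined for a separate table. This is a generic illustration of the technique, not Stitch's actual implementation, and all field names are hypothetical.

```python
def denest_order(order):
    """Split a nested order record into a parent row and child rows,
    as a loader might do before writing to relational destination tables."""
    parent = {k: v for k, v in order.items() if k != "line_items"}
    # Each child row carries the parent's id as a foreign key.
    children = [
        {"order_id": order["id"], **item}
        for item in order.get("line_items", [])
    ]
    return parent, children

order = {
    "id": 7,
    "customer": "acme",
    "line_items": [
        {"sku": "A-1", "qty": 2},
        {"sku": "B-9", "qty": 1},
    ],
}
parent, children = denest_order(order)
print(parent)    # {'id': 7, 'customer': 'acme'}
print(children)  # [{'order_id': 7, 'sku': 'A-1', 'qty': 2},
                 #  {'order_id': 7, 'sku': 'B-9', 'qty': 1}]
```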


Connectors: Data sources and destinations

Each of these tools supports a variety of data sources and destinations.

Apache Airflow

Airflow orchestrates workflows to extract, transform, load, and store data. It runs tasks, which are sets of activities, via operators, which are templates for tasks that can be Python functions or external scripts. Developers can create operators for any source or destination. In addition, Airflow supports plugins that implement operators and hooks — interfaces to external platforms. The Airflow community has built plugins for databases like MySQL and Microsoft SQL Server and SaaS platforms such as Salesforce, Stripe, and Facebook Ads.

mParticle

mParticle has a different focus than most ETL tools — it's designed to track events from applications, websites, and mobile apps and unify them into customer profiles. An event is a set of actions that represents a step in the funnel, such as "user invited," "signed up," or "order completed." Users can define events that make sense for their business use cases, based on key metrics that provide business value. You embed mParticle tracking into mobile apps or connected devices. The platform creates unique mobile identifiers and tracks things like device IDs, location data, and in-app activity. mParticle supports about 170 destinations, most of which are SaaS platforms for marketing, advertising, and analytics. It also supports data warehouses like Amazon Redshift, Google BigQuery, and Snowflake. Customers can use mParticle's developer tools to track new event sources, but no one outside of the mParticle team or its partners can build new cloud app sources.
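A custom event sent through a server-to-server integration like mParticle's Events API is essentially a structured JSON payload. The sketch below assembles one such payload; the field names and structure here are illustrative rather than copied from mParticle's documentation, so consult the official Events API reference before relying on them.

```python
import json

def build_event_payload(event_name, attributes, environment="development"):
    """Assemble an event payload in a server-to-server style.
    The structure is illustrative only; mParticle's Events API docs
    define the authoritative schema."""
    return {
        "environment": environment,
        "events": [
            {
                "event_type": "custom_event",
                "data": {
                    "event_name": event_name,
                    "custom_attributes": attributes,
                },
            }
        ],
    }

payload = build_event_payload("order completed",
                              {"order_id": "1234", "total": 59.90})
print(json.dumps(payload, indent=2))
```

In practice the payload would be POSTed to the Events API endpoint with an API key; the point here is simply that events are developer-defined, structured data rather than fixed schemas.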

Stitch

Stitch supports more than 100 database and SaaS integrations as data sources, and eight data warehouse and data lake destinations. Customers can contract with Stitch to build new sources, and anyone can add a new source to Stitch by developing it according to the standards laid out in Singer, an open source toolkit for writing scripts that move data. Singer integrations can be run independently, regardless of whether the user is a Stitch customer. Running Singer integrations on Stitch’s platform allows users to take advantage of Stitch's monitoring, scheduling, credential management, and autoscaling features.
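Singer integrations ("taps") are simply programs that write JSON-formatted messages, one per line, to stdout. A minimal tap following the Singer spec's SCHEMA and RECORD message types can be sketched with nothing but the standard library; the "users" stream and its fields below are made up for illustration.

```python
import json
import sys

def write_message(message):
    """Singer taps emit one JSON message per line on stdout."""
    sys.stdout.write(json.dumps(message) + "\n")

def schema_message(stream, schema, key_properties):
    """Describe a stream's shape before any records are sent."""
    return {"type": "SCHEMA", "stream": stream,
            "schema": schema, "key_properties": key_properties}

def record_message(stream, record):
    """Wrap one row of data for the named stream."""
    return {"type": "RECORD", "stream": stream, "record": record}

# Hypothetical "users" stream with two rows.
write_message(schema_message(
    "users",
    {"type": "object", "properties": {
        "id": {"type": "integer"},
        "email": {"type": "string"},
    }},
    ["id"],
))
for row in [{"id": 1, "email": "a@example.com"},
            {"id": 2, "email": "b@example.com"}]:
    write_message(record_message("users", row))
```

A Singer target (or Stitch itself) consumes this stream from stdin and loads it into a destination, which is what makes taps runnable both standalone and on Stitch's platform.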

Support, documentation, and training

Data integration tools can be complex, so vendors offer several ways to help their customers. Online documentation is the first resource users often turn to, and support teams can answer questions that aren't covered in the docs. Vendors of the more complicated tools may also offer training services.

Apache Airflow

The open source community provides Airflow support through a Slack channel. Documentation includes quick start and how-to guides. Other than a tutorial on the Apache website, there are no training resources.

mParticle

mParticle provides support via email. Documentation is comprehensive. Digital training materials are not available.

Stitch

Stitch provides in-app chat support to all customers, and phone support is available for Enterprise customers. Support SLAs are available. Documentation is comprehensive and is open source — anyone can contribute additions and improvements or repurpose the content. Stitch does not provide training services.

Pricing

Apache Airflow

Airflow is free and open source, licensed under Apache License 2.0.

mParticle

Pricing is not disclosed.

Stitch

Stitch has pricing that scales to fit a wide range of budgets and company sizes. All new users get an unlimited 14-day trial. After the trial, there's a free plan for smaller organizations and nonproduction workloads. Standard plans range from $100 to $1,250 per month depending on scale, with discounts for paying annually. Enterprise plans for larger organizations and mission-critical use cases can include custom features, data volumes, and service levels, and are priced individually.

Get started now

Which tool is best overall? That's something every organization has to decide based on its unique requirements, but we can help you get started. Sign up now for a free trial of Stitch.