Most businesses have data stored in a variety of locations, from in-house databases to SaaS platforms. To get a full picture of their finances and operations, they pull data from all those sources into a data warehouse or data lake and run analytics against it. But they don't want to build and maintain their own data pipelines.
Fortunately, it’s not necessary to code everything in-house; off-the-shelf tools can handle the work. Here's a head-to-head comparison of three of them: Apache Airflow, Segment, and Stitch.
About Apache Airflow
Apache Airflow is an open source project that lets developers orchestrate workflows to extract, transform, load, and store data.
About Segment
Segment has a different focus than most ETL tools — it's designed to collect and connect customer data from multiple marketing, analytics, and data warehousing tools and unify it into customer profiles. It's not primarily an ETL tool, but it does include connectivity to some SaaS data sources and data warehouse destinations.
In Segment, an event is a set of actions that represents a step in the funnel, such as user invited or signed up or order completed. Users can define events that make sense for their business use cases, based on key metrics that provide business value, and create tracking plans to document the events and properties they intend to collect across Segment sources.
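To make the tracking-plan idea concrete, here is a hypothetical sketch in Python: a plan is modeled as a mapping from event names to the properties each event should carry, with a check run before an event is sent. The event names, property names, and `validate_event` helper are illustrative, not part of Segment's API.

```python
# Hypothetical tracking plan: event names mapped to the properties
# each event is expected to include. Names are made up for illustration.
TRACKING_PLAN = {
    "Signed Up": {"plan", "referrer"},
    "Order Completed": {"order_id", "total"},
}

def validate_event(name, properties):
    """Check an event against the tracking plan before sending it.

    Returns a list of problems; an empty list means the event conforms.
    """
    problems = []
    if name not in TRACKING_PLAN:
        problems.append(f"unknown event: {name}")
        return problems
    missing = TRACKING_PLAN[name] - set(properties)
    for prop in sorted(missing):
        problems.append(f"missing property: {prop}")
    return problems

print(validate_event("Order Completed", {"order_id": "123", "total": 42.5}))
print(validate_event("Order Completed", {"order_id": "123"}))
```

A check like this is how tracking plans provide value: events that drift from the documented schema are caught before they pollute downstream profiles and reports.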
About Stitch
Stitch Data Loader is a cloud-based platform for ETL — extract, transform, and load. More than 3,000 companies use Stitch to move billions of records every day from SaaS applications and databases into data warehouses and data lakes, where the data can be analyzed with BI tools. Stitch is a Talend company and is part of the Talend Data Fabric.
| | Apache Airflow | Segment | Stitch |
|---|---|---|---|
| Focus | Orchestration, scheduling, workflows | Event tracking, data ingestion, ELT | Data ingestion, ELT |
| Database replication | Only via plugins | No database data sources | Full table; incremental via change data capture or SELECT/replication keys |
| SaaS sources | Only via plugins | About 35 | More than 100 |
| Ability for customers to add new data sources | Yes | For events but not cloud app sources (see explanation below) | Yes |
| Connects to data warehouses? Data lakes? | Yes / Yes | Yes / No | Yes / Yes |
| G2 customer satisfaction | Not rated | 4.6/5 | 4.8/5 |
| Purchase process | Free to download and use | Options for self-service and talking with sales | Options for self-service or talking with sales. Also available from the AWS store. |
| Compliance, governance, and security certifications | None | GDPR | HIPAA, GDPR, SOC 2 |
| Data sharing | Yes, via plugins | No | Yes, through Talend Data Fabric |
| Vendor lock-in | None; free and open source | Month to month or annual contracts. The company maintains several open source projects. | Month to month or annual contracts. Open source integrations |
Let's dive into some of the details of each platform.
Apache Airflow is a powerful tool for authoring, scheduling, and monitoring workflows as directed acyclic graphs (DAGs) of tasks. A DAG is a topological representation of the way data flows within a system. Airflow manages execution dependencies among jobs (known as operators in Airflow parlance) in the DAG, and programmatically handles job failures, retries, and alerting. Developers can write Python code to transform data as an action in a workflow.
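The core idea behind a DAG scheduler can be sketched in a few lines of plain Python. This is not Airflow's actual API (which involves operators, retries, and alerting); it only illustrates, using the standard library's `graphlib`, how a topological ordering guarantees that every task runs after the tasks it depends on. The task names are made up.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# A tiny ETL workflow expressed as a DAG: each task is mapped to the
# set of tasks it depends on. Task names are illustrative only.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform"},
}

# static_order() yields tasks so that dependencies always come first,
# which is the guarantee a scheduler like Airflow builds on.
run_order = list(TopologicalSorter(dag).static_order())
print(run_order)
```

In this ordering, both extract tasks always precede the transform, and the load is always last; a real scheduler layers failure handling and retries on top of the same dependency resolution.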
To the extent that it performs ETL operations, Segment focuses on the E and L, extraction and loading. It extracts raw data from sources and loads it into destinations without allowing users to define their own transformations — though Segment itself transforms events and objects to conform to its standard as it sends them between different tools, including to the data warehouse.
Stitch is an ELT product. Within the pipeline, Stitch does only transformations that are required for compatibility with the destination, such as translating data types or denesting data when relevant. Stitch is part of Talend, which also provides tools for transforming data either within the data warehouse or via external processing engines such as Spark and MapReduce. Transformations can be defined in SQL, Python, Java, or via graphical user interface.
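As a rough illustration of the kind of compatibility transformation described above, the sketch below denests a nested record into flat columns so it fits a relational destination. Stitch's actual denesting behavior (subtable creation, naming rules) is more involved; the `denest` function and the `__` separator here are assumptions for illustration.

```python
# Illustrative sketch of denesting: flatten a nested record into
# columns a relational destination can accept. The "__" separator and
# the sample record are assumptions, not Stitch's exact behavior.
def denest(record, prefix="", sep="__"):
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{sep}{key}" if prefix else key
        if isinstance(value, dict):
            # Recurse into nested objects, carrying the parent key
            # forward as a column-name prefix.
            flat.update(denest(value, prefix=name, sep=sep))
        else:
            flat[name] = value
    return flat

row = denest({"id": 7, "customer": {"name": "Ada", "address": {"city": "Paris"}}})
print(row)
```

The nested `customer.address.city` value becomes a single flat column, which is the shape a SQL data warehouse expects.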
Try Stitch for free for 14 days
- Unlimited data volume during trial
- Set up in minutes
Connectors: Data sources and destinations
Each of these tools supports a variety of data sources and destinations.
Airflow orchestrates workflows to extract, transform, load, and store data. It runs tasks, which are sets of activities, via operators, which are templates for tasks that can be Python functions or external scripts. Developers can create operators for any source or destination. In addition, Airflow supports plugins that implement operators and hooks — interfaces to external platforms. The Airflow community has built plugins for databases like MySQL and Microsoft SQL Server and SaaS platforms such as Salesforce, Stripe, and Facebook Ads.
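The "operator as template" idea can be sketched like this. The class names below are simplified stand-ins, not Airflow's actual classes (those live in `airflow.models` and the provider packages); the sketch only shows the pattern of a reusable template whose `execute()` method a scheduler calls.

```python
# Simplified sketch of the operator pattern: a base template with an
# execute() method, specialized for a particular kind of work. The
# class names here are illustrative, not Airflow's real API.
class BaseOperator:
    def __init__(self, task_id):
        self.task_id = task_id

    def execute(self):
        raise NotImplementedError

class PythonFunctionOperator(BaseOperator):
    """Runs an arbitrary Python callable, in the spirit of a
    Python-function operator."""
    def __init__(self, task_id, fn):
        super().__init__(task_id)
        self.fn = fn

    def execute(self):
        return self.fn()

task = PythonFunctionOperator("say_hello", lambda: "hello from a task")
print(task.execute())
```

A database or SaaS operator follows the same shape: the template captures connection and retry logic once, and each task instance fills in the specifics.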
Segment offers integrations for several different kinds of sources, including event sources and cloud app (SaaS) sources.
To understand the difference between the events and SaaS data sources, take the example of Salesforce, which is one of a handful of integrations supported in both categories. Event tracking would enable updates to Salesforce leads and contacts based on things those users did in a separate system, like a mobile app.
If you want to replicate all of the data in your Salesforce instance to your data warehouse, including information entered by your sales team, that’s only possible with a cloud app source.
Customers can use Segment's developer tools to track new event sources, but no one outside of the Segment team can build new cloud app sources.
Segment supports six data warehouse destinations.
Stitch supports more than 100 database and SaaS integrations as data sources, and eight data warehouse and data lake destinations. Customers can contract with Stitch to build new sources, and anyone can add a new source to Stitch by developing it according to the standards laid out in Singer, an open source toolkit for writing scripts that move data. Singer integrations can be run independently, regardless of whether the user is a Stitch customer. Running Singer integrations on Stitch’s platform allows users to take advantage of Stitch's monitoring, scheduling, credential management, and autoscaling features.
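A Singer tap is just a program that writes JSON messages to stdout. The sketch below shows the three core message types from the Singer spec — SCHEMA, RECORD, and STATE — for a made-up `users` stream; the stream name and record contents are illustrative.

```python
import json

# Minimal sketch of the Singer message format: a tap emits SCHEMA,
# RECORD, and STATE messages as JSON lines on stdout, and a target
# (or Stitch itself) consumes them. The message field names follow
# the Singer spec; the "users" stream and its data are made up.
messages = [
    {"type": "SCHEMA",
     "stream": "users",
     "schema": {"properties": {"id": {"type": "integer"},
                               "name": {"type": "string"}}},
     "key_properties": ["id"]},
    {"type": "RECORD", "stream": "users", "record": {"id": 1, "name": "Ada"}},
    # STATE records a bookmark so the next run can resume incrementally.
    {"type": "STATE", "value": {"bookmarks": {"users": {"id": 1}}}},
]

for message in messages:
    print(json.dumps(message))
```

Because the interface is just JSON over stdout, a tap written this way runs anywhere; running it on Stitch's platform adds the monitoring, scheduling, credential management, and autoscaling described above.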
Support, documentation, and training
Data integration tools can be complex, so vendors offer several ways to help their customers. Online documentation is the first resource users often turn to, and support teams can answer questions that aren't covered in the docs. Vendors of the more complicated tools may also offer training services.
The open source community provides Airflow support through a Slack channel. Documentation includes quick start and how-to guides. Other than a tutorial on the Apache website, there are no training resources.
Segment provides support through a ticketing system. Documentation is comprehensive. Segment does not provide training services.
Stitch provides in-app chat support to all customers, and phone support is available for Enterprise customers. Support SLAs are available. Documentation is comprehensive and is open source — anyone can contribute additions and improvements or repurpose the content. Stitch does not provide training services.
Pricing
Airflow is free and open source, licensed under Apache License 2.0.
Segment provides a 14-day free trial, and has a free Developer tier. Team accounts start at $120 per month. Pricing is based in part on the volume of monthly tracked users in data sources. Business plans for larger organizations are priced individually.
Stitch has pricing that scales to fit a wide range of budgets and company sizes. All new users get an unlimited 14-day trial. Standard plans range from $100 to $1,250 per month depending on scale, with discounts for paying annually. Enterprise plans for larger organizations and mission-critical use cases can include custom features, data volumes, and service levels, and are priced individually.
Get started now
Which tool is best overall? That's something every organization has to decide based on its unique requirements, but we can help you get started. Sign up now for a free trial of Stitch.