Most businesses have data stored in a variety of locations, from in-house databases to SaaS platforms. To get a full picture of their finances and operations, they pull data from all those sources into a data warehouse or data lake and run analytics against it. But they don't want to build and maintain their own data pipelines.

Fortunately, it’s not necessary to code everything in-house; data pipeline tools can do the heavy lifting. Here's a head-to-head comparison of two such tools, Apache Airflow and AWS Data Pipeline, alongside Stitch.

About Apache Airflow

Apache Airflow is an open source project that lets developers orchestrate workflows to extract, transform, load, and store data.

About AWS Data Pipeline

Amazon Web Services (AWS) has a host of tools for working with data in the cloud. Data Pipeline focuses on data transfer. It's one of two AWS tools for moving data from sources to analytics destinations; the other is AWS Glue, which is more focused on ETL.

Stitch and Talend partner with AWS. While this page details our products that have some overlapping functionality and the differences between them, we're more complementary than we are competitive. AWS offers lots of products beyond what's mentioned on this page, and we have thousands of customers who successfully use our solutions together.

About Stitch

Stitch Data Loader is a cloud-based platform for ETL: extract, transform, and load. More than 3,000 companies use Stitch to move billions of records every day from SaaS applications and databases into data warehouses and data lakes, where the data can be analyzed with BI tools. Stitch is a Talend company and is part of the Talend Data Fabric.

|  | Apache Airflow | AWS Data Pipeline | Stitch |
|---|---|---|---|
| Focus | Orchestration, scheduling, workflows | Data transfer | Data ingestion, ELT |
| Database replication | Only via plugins | Full table; incremental replication via timestamp field | Full table; incremental via change data capture or SELECT/replication keys |
| SaaS sources | Only via plugins | None | More than 100 |
| Ability for customers to add new data sources | Yes | No | Yes |
| Connects to data warehouses? Data lakes? | Yes / Yes | Yes / Yes | Yes / Yes |
| Transparent pricing | Yes | Yes | Yes |
| Support SLAs | No | Yes | Available |
| Purchase process | Free to download and use | Self-service | Options for self-service or talking with sales. Also available from the AWS Marketplace. |
| Compliance, governance, and security certifications | None | None | HIPAA, GDPR, SOC 2 |
| Data sharing | Yes, via plugins | Yes | Yes, through Talend Data Fabric |
| Vendor lock-in | Free to use | Month to month; no open source | Month to month or annual contracts; open source integrations |
| Developer tools | Experimental REST API | AWS Data Pipeline API gives programmatic control over most Data Pipeline operations; SDKs are available for several languages | Import API, Stitch Connect API for integrating Stitch with other platforms, Singer open source project |

Let's dive into some of the details of each platform.

Transformations

Apache Airflow

Apache Airflow is a powerful tool for authoring, scheduling, and monitoring workflows as directed acyclic graphs (DAGs) of tasks. A DAG is a topological representation of the way data flows within a system. Airflow manages execution dependencies among the jobs in a DAG (known as tasks in Airflow parlance), and programmatically handles job failures, retries, and alerting. Developers can write Python code to transform data as an action in a workflow.
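
For a sense of what that looks like in practice, here's a minimal sketch of a DAG with three PythonOperator tasks chained as extract, transform, and load. The task logic, schedule, and DAG name are placeholders, not a production pipeline.

```python
# Illustrative Airflow DAG: extract -> transform -> load as three Python tasks.
# The data, schedule, and names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull rows from a source system
    return [{"id": 1, "amount": "19.99"}]


def transform(ti, **context):
    # Read the upstream task's return value via XCom and cast types
    rows = ti.xcom_pull(task_ids="extract")
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]


def load(ti, **context):
    # Placeholder: write the transformed rows to a destination
    rows = ti.xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3  # execution dependencies form the DAG
```

Retries and alerting are configured per task through operator arguments such as retries and retry_delay, omitted here for brevity.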

AWS Data Pipeline

Data Pipeline supports preload transformations using SQL commands. You can create a pipeline graphically through a console, using the AWS command line interface (CLI) with a pipeline definition file in JSON format, or programmatically through API calls.
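
As a rough illustration of the programmatic route, the sketch below uses boto3, the AWS SDK for Python, to create a pipeline, push a simplified definition (a default object, a daily schedule, and an S3 data node), and activate it. The bucket, object IDs, and schedule are placeholders, and a working pipeline would also need an activity and a compute resource in its definition.

```python
# Illustrative only: create and define an AWS Data Pipeline with boto3.
# A real pipeline would also need an activity (e.g. CopyActivity) and a
# compute resource (e.g. Ec2Resource); names and paths here are placeholders.
import boto3

client = boto3.client("datapipeline", region_name="us-east-1")

pipeline = client.create_pipeline(name="example-pipeline", uniqueId="example-pipeline-001")
pipeline_id = pipeline["pipelineId"]

client.put_pipeline_definition(
    pipelineId=pipeline_id,
    pipelineObjects=[
        {   # Default configuration shared by all objects
            "id": "Default",
            "name": "Default",
            "fields": [
                {"key": "scheduleType", "stringValue": "cron"},
                {"key": "schedule", "refValue": "DailySchedule"},
                {"key": "pipelineLogUri", "stringValue": "s3://example-bucket/logs/"},
            ],
        },
        {   # Run once a day, starting at first activation
            "id": "DailySchedule",
            "name": "DailySchedule",
            "fields": [
                {"key": "type", "stringValue": "Schedule"},
                {"key": "period", "stringValue": "1 day"},
                {"key": "startAt", "stringValue": "FIRST_ACTIVATION_DATE_TIME"},
            ],
        },
        {   # An S3 location used as a data node (source or destination)
            "id": "S3Input",
            "name": "S3Input",
            "fields": [
                {"key": "type", "stringValue": "S3DataNode"},
                {"key": "directoryPath", "stringValue": "s3://example-bucket/input/"},
            ],
        },
    ],
)

client.activate_pipeline(pipelineId=pipeline_id)
```

The same objects could instead be written as a JSON definition file and pushed with the AWS CLI's put-pipeline-definition command.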

Stitch

Stitch is an ELT product. Within the pipeline, Stitch does only transformations that are required for compatibility with the destination, such as translating data types or denesting data when relevant. Stitch is part of Talend, which also provides tools for transforming data either within the data warehouse or via external processing engines such as Spark and MapReduce. Transformations can be defined in SQL, Python, Java, or via a graphical user interface.
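
To make that ELT division of labor concrete, here's a hedged sketch of a post-load transformation run inside the warehouse after the raw data has been loaded. It assumes a PostgreSQL-compatible destination and hypothetical raw.orders and analytics.daily_revenue tables; it is not Stitch or Talend code, just the general pattern.

```python
# Sketch of the ELT pattern: raw data is already in the warehouse, and the
# transformation runs there as SQL. Assumes a PostgreSQL-compatible
# destination; host, credentials, and table names are hypothetical.
import os

import psycopg2

conn = psycopg2.connect(
    host="warehouse.example.com",
    dbname="analytics",
    user="transform_user",
    password=os.environ.get("WAREHOUSE_PASSWORD"),
)

with conn, conn.cursor() as cur:
    # Rebuild a summary table from the raw, loader-managed table
    cur.execute("DROP TABLE IF EXISTS analytics.daily_revenue")
    cur.execute(
        """
        CREATE TABLE analytics.daily_revenue AS
        SELECT DATE_TRUNC('day', created_at) AS day,
               SUM(amount)                   AS revenue
        FROM raw.orders
        GROUP BY 1
        """
    )

conn.close()
```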

Try Stitch for free for 14 days

  • Unlimited data volume during trial
  • Set up in minutes

No credit card required

Connectors: Data sources and destinations

Each of these tools supports a variety of data sources and destinations.

Apache Airflow

Airflow orchestrates workflows to extract, transform, load, and store data. It runs tasks, which are sets of activities, via operators, which are templates for tasks that can be Python functions or external scripts. Developers can create operators for any source or destination. In addition, Airflow supports plugins that implement operators and hooks, which are interfaces to external platforms. The Airflow community has built plugins for databases like MySQL and Microsoft SQL Server and SaaS platforms such as Salesforce, Stripe, and Facebook Ads.
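
As a rough sketch of what adding a new source looks like, the example below defines a custom operator whose execute() method uses Airflow's HTTP hook (from the apache-airflow-providers-http package) to pull data from a hypothetical SaaS API. The connection ID, endpoint, and class name are all illustrative.

```python
# Illustrative custom Airflow operator. A hook (here Airflow's HTTP hook)
# handles the connection to an external platform, and the operator wraps it
# as a reusable task template. Connection ID and endpoint are placeholders.
from airflow.models import BaseOperator
from airflow.providers.http.hooks.http import HttpHook


class FetchInvoicesOperator(BaseOperator):
    """Pull invoices from a (hypothetical) SaaS API and return them via XCom."""

    def __init__(self, http_conn_id: str = "billing_api", endpoint: str = "/v1/invoices", **kwargs):
        super().__init__(**kwargs)
        self.http_conn_id = http_conn_id
        self.endpoint = endpoint

    def execute(self, context):
        hook = HttpHook(method="GET", http_conn_id=self.http_conn_id)
        response = hook.run(self.endpoint)
        # The return value is available to downstream tasks via XCom
        return response.json()
```

In practice the operator only needs to be importable from the DAG file; packaging it as a plugin or provider package is a way to share it more widely.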

AWS Data Pipeline

Data Pipeline supports four types of what it calls data nodes as sources and destinations: DynamoDB tables, SQL tables, Redshift tables, and Amazon S3 locations. Data Pipeline doesn't support any SaaS data sources.

Stitch

Stitch supports more than 100 database and SaaS integrations as data sources, and eight data warehouse and data lake destinations. Customers can contract with Stitch to build new sources, and anyone can add a new source to Stitch by developing it according to the standards laid out in Singer, an open source toolkit for writing scripts that move data. Singer integrations can be run independently, regardless of whether the user is a Stitch customer. Running Singer integrations on Stitch’s platform allows users to take advantage of Stitch's monitoring, scheduling, credential management, and autoscaling features.
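
A Singer tap is essentially a script that writes SCHEMA and RECORD messages as JSON to stdout. Here's a minimal sketch using the singer-python library; the stream, schema, and rows are made up.

```python
# Minimal sketch of a Singer tap: a script that emits SCHEMA, RECORD, and
# STATE messages on stdout. The stream and data below are hypothetical.
import singer

STREAM = "users"

SCHEMA = {
    "properties": {
        "id": {"type": "integer"},
        "email": {"type": "string"},
    },
}


def main():
    singer.write_schema(STREAM, SCHEMA, key_properties=["id"])
    # In a real tap these rows would come from an API or database query
    singer.write_records(STREAM, [
        {"id": 1, "email": "ada@example.com"},
        {"id": 2, "email": "grace@example.com"},
    ])
    singer.write_state({STREAM: {"last_id": 2}})


if __name__ == "__main__":
    main()
```

Piped into any Singer target, or run on Stitch's platform, the same output can be loaded into a supported destination.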

Support, documentation, and training

Data integration tools can be complex, so vendors offer several ways to help their customers. Online documentation is the first resource users often turn to, and support teams can answer questions that aren't covered in the docs. Vendors of the more complicated tools may also offer training services.

Apache Airflow

The open source community provides Airflow support through Slack. Documentation includes quick start and how-to guides. Other than a tutorial on the Apache website, there are no training resources.

AWS Data Pipeline

AWS provides online support through a ticketing system and a knowledgebase. Support tickets may get phone or chat responses. Documentation is comprehensive. Digital training materials are available.

Stitch

Stitch provides in-app chat support to all customers, and phone support is available for Enterprise customers. Support SLAs are available. Documentation is comprehensive and is open source — anyone can contribute additions and improvements or repurpose the content. Stitch does not provide training services.

Pricing

Apache Airflow

Airflow is free and open source, licensed under Apache License 2.0.

AWS Data Pipeline

Data Pipeline pricing is based on how often your activities and preconditions are scheduled to run and whether they run on AWS or on-premises.

Stitch

Stitch has pricing that scales to fit a wide range of budgets and company sizes. All new users get an unlimited 14-day trial. Standard plans range from $100 to $1,250 per month depending on scale, with discounts for paying annually. Enterprise plans for larger organizations and mission-critical use cases can include custom features, data volumes, and service levels, and are priced individually.

Get started now

Which tool is better overall? That's something every organization has to decide based on its unique requirements, but we can help you get started. Sign up now for a free trial of Stitch.

Give Stitch a try, on us

Select your integrations, choose your warehouse, and enjoy Stitch free for 14 days.

  • Set up in minutes
  • Unlimited data volume during trial