A glance at Gartner’s sizable list of analytics and business intelligence (BI) tools is all it takes to produce analysis paralysis. Spoiled for choice, business leaders or end users may find it challenging to select appropriate technologies for their reporting and analytical needs. As older platforms adapt to the changing data landscape, new tools surface rapidly.

Business intelligence is a category of applications and technology that turns raw enterprise data into visual reports or insights. BI tools enhance corporate decision-making, help inform strategy, and impart direct analytical power to users across the entire organization.

BI tools serve myriad purposes, from the most technical aspects of a data mining operation to a manager’s simple need to aggregate a cross-section of records and pull up an informative chart.

Although designed for a generally nontechnical audience, nearly all BI software requires some degree of training for proficiency. Knowing which stakeholders will utilize a BI system, and understanding these users’ capabilities and needs, are critical steps when selecting BI software.

An introduction to BI and analytics concepts

When assessing an organization’s need for BI infrastructure — and when evaluating actual BI tools — managers and users may encounter terms or concepts that, while familiar, may need further definition or context. Here are a few important ones:

Business intelligence refers to the discovery and communication of insights or actionable information in enterprise data. Managers use this information to improve business processes and aid in making decisions.

Analytics refers to the exploration and communication of patterns and knowledge in data, which might be expressed through simple statistical tests or predictive systems powered by machine learning.

Other phrases that are important when selecting a BI or analytics tool:

  • Big data may be the first phrase that comes to mind when one thinks of analytics or BI. It refers to the growing volume, variety, and velocity of data in the modern information age.
  • Data mining is a subset of analytics concerning the discovery of patterns in very large data sets. It's an older term, and it often is used to refer to any kind of large-scale data processing or analysis.
  • Data structure refers to the way data is organized, managed, and stored.
    • Structured data is defined by a schema and often is stored as tabular data in relational databases.
    • Unstructured data, such as collections of documents or other objects with no predefined schema, may be stored in data lakes.
  • A data warehouse stores enterprise data and makes it available for applications, users, and analytics. It contains both historical and current data. Data marts are smaller warehouses focused on individual business functions.
  • A data lake is a pool of raw data, structured or unstructured, whose purpose may not be defined until the data is needed.
  • Extract, transform, and load (ETL) is the process of moving data from sources, modifying the data so that it suits the analytics uses for which it's intended, ensuring the data is in a format supported by the destination, and then transferring the data to that destination.
    • Extract, load, transform (ELT) is a variation of this process that simplifies data replication by loading raw data without transformation. Once the data is in the destination, organizations can run any transformations they need. (A minimal sketch contrasting the two approaches follows this list.)
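To make the ETL/ELT distinction (and the structured vs. unstructured split above) concrete, here is a minimal sketch in Python. It is an illustration only: the raw order records, table names, and fields are invented, and an in-memory SQLite database stands in for a real warehouse such as Redshift, BigQuery, or Snowflake.

```python
import json
import sqlite3

# Hypothetical raw records from a source system (e.g., an orders API).
RAW_ORDERS = [
    {"id": 1, "amount": "19.99", "currency": "usd", "placed_at": "2020-03-01T10:15:00Z"},
    {"id": 2, "amount": "5.00", "currency": "eur", "placed_at": "2020-03-02T08:30:00Z"},
]

def extract():
    """Extract: pull raw records from the source (stubbed here)."""
    return RAW_ORDERS

def transform(records):
    """Transform: normalize types and currencies so the data is analysis-ready."""
    eur_to_usd = 1.1  # illustrative fixed rate
    rows = []
    for record in records:
        amount = float(record["amount"])
        if record["currency"] == "eur":
            amount *= eur_to_usd
        rows.append((record["id"], round(amount, 2), record["placed_at"]))
    return rows

def load(rows, destination):
    """Load (ETL): write structured, tabular rows to the destination."""
    destination.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount_usd REAL, placed_at TEXT)"
    )
    destination.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

def load_raw(records, destination):
    """Load (ELT): land the raw payloads untouched; transform later in the warehouse."""
    destination.execute("CREATE TABLE IF NOT EXISTS raw_orders (payload TEXT)")
    destination.executemany(
        "INSERT INTO raw_orders VALUES (?)", [(json.dumps(r),) for r in records]
    )

if __name__ == "__main__":
    warehouse = sqlite3.connect(":memory:")  # stand-in for a real warehouse
    records = extract()
    load(transform(records), warehouse)  # ETL: transform before loading
    load_raw(records, warehouse)         # ELT: load first, transform later in-warehouse
    warehouse.commit()
    print(warehouse.execute("SELECT * FROM orders").fetchall())
```

In an ELT setup, the transformation step runs later as SQL inside the warehouse rather than in the pipeline itself, which is why ELT simplifies the replication step.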


7 BI tools to know about

While there are dozens of BI and analytics tools, Stitch surveyed its customers to learn which ones they use. These seven tools were mentioned most often:

  • Tableau Desktop is a data visualization application that supports inputs across most types and structures, from tabular data to geospatial coordinates. It's the only desktop app on this list. Tableau allows users to create charts and real-time dashboards (based on multidimensional data) with drag-and-drop functionality and powerful cross-filtering. Tableau is ideal for teams that need collaborative visualization and dashboarding, and is appropriate for technical/semi-technical users.
  • Looker is a data exploration tool with a web-based interface designed to be intuitive for users. It features a modeling language called LookML that hands complex SQL generation off to the tool's engine. Looker is ideal for data discovery use cases where technical/semi-technical users must build reports quickly.
  • Power BI is Microsoft's business intelligence offering. It's similar to Tableau and Looker in features, but more geared to the average stakeholder. Power BI is appropriate for organizations that have both technical and nontechnical users who rely on tools in the Microsoft or Azure ecosystems.
  • Google Data Studio (GDS) expands Google's data stack with a visualization tool. It's one of the simplest and most visually pleasing graphing and dashboarding tools available. GDS is well suited for organizations that use Google Cloud Platform and BigQuery, and for teams with minimal BI requirements and semi- or nontechnical end users.
  • Chartio is a web-based dashboarding tool with options for both drag-and-drop and SQL querying. It suits advanced users with SQL expertise as well as semi-technical users running fast, one-time, ad-hoc analyses. Chartio is appropriate for organizations seeking to empower data analysts with BI functionality.
  • Similarly, Mode is a web-based analytics platform that combines powerful SQL, Python, and reporting capabilities to produce dashboards, charts, and data visualizations (a minimal sketch of this SQL-plus-Python pattern follows the list). A key feature of Mode is easy collaboration: with a shared link, the entire organization can contribute to a report.
  • Periscope Data is another SQL-first BI tool, with optional drag-and-drop functionality and support for detailed analysis in Python and R programming languages. Periscope is best for technical teams and those who need a data visualization platform. It stands out with a particularly sophisticated data governance module.
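To illustrate the SQL-plus-Python pattern that Mode and Periscope Data support, here is a minimal, notebook-style sketch. The warehouse connection string, the orders table, and its columns are hypothetical, and it uses the generic pandas and SQLAlchemy libraries rather than any vendor-specific API.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical warehouse connection; in Mode or Periscope the SQL step runs
# against a connected warehouse and hands its result set to Python.
engine = create_engine("postgresql://analyst:secret@warehouse.example.com/analytics")

# SQL does the heavy lifting: aggregate orders by week inside the warehouse.
query = """
    SELECT date_trunc('week', placed_at) AS week,
           SUM(amount_usd)               AS revenue
    FROM orders
    GROUP BY 1
    ORDER BY 1
"""
weekly = pd.read_sql(query, engine)

# Python takes over for lightweight analysis and charting on the small result set.
weekly["revenue_4wk_avg"] = weekly["revenue"].rolling(4).mean()
weekly.plot(x="week", y=["revenue", "revenue_4wk_avg"], title="Weekly revenue")
```

The division of labor is the point: the warehouse handles the heavy aggregation in SQL, while Python works on the small result set for statistics and visualization.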

These tools are just a part of a much larger analytics universe that includes tools for most use cases and budgets.

How to choose the right BI tool

Anyone tasked with selecting the most appropriate BI tool for their organization should review both market research and product features, and match the applications with their business requirements to find the best fit.

An enterprise's size and growth are also important factors. For example, lightweight tools tend to be cheaper and are best suited for small companies with less data variety and volume to manage.

Use case is a crucial consideration. A logistics company interested in optimizing routes and preventing driver churn would prioritize different BI features than, say, a digital marketing company more interested in sourcing information or measuring user engagement.

A simple tool with fewer features might be easier to learn, and more cost-effective, when training beginner users. A more sophisticated tool might be appropriate if end users are expected to be well-versed in the software, or in analytics generally.

Other considerations: data quality and processes

One challenge many organizations encounter when analyzing their data is the quality and level of integration of their data warehouse and underlying ETL/ELT processes. The centralized repository must be a consolidated, accessible, and accurate source of data. In fact, the most important step when deploying BI and analytics is not necessarily choosing the right tool or implementation, but rather verifying at the outset that the underlying data systems are functional, trusted, and dependable.

For example, AMARO, an online fashion retailer that wanted to build intelligence to understand customers better, found that simplifying ingestion and carefully integrating data sources were key steps to building out its BI infrastructure.

On the other hand, Oodle's journey is a great example of how business intelligence can become an unintentional misnomer if the tools or the implementation are bad. A digital marketing agency searching for a better ETL solution, Oodle paid for one bundled inside a BI platform. Unfortunately, the data ingestion portion of this offering was weak: after just two months, major (planned, public) changes to external APIs went unsupported, rendering the tool unusable.

Stitch provides fast time to insights

The lessons here are: 1) BI tools should provide actual business value, and 2) data ingestion and ETL/ELT are key components of data pipelines serving available, high-quality, and secure data. Optimized data ingestion is the fuel for better BI, if not a foundational requirement for effective analytic deployments.

Stitch supports dozens of sources and all leading destinations for enterprise data, as well as integration with a full complement of BI/data analysis tools. If your business is ready to build out its BI infrastructure, sign up for Stitch for free and begin optimizing a powerful, fast, available data pipeline.

Give Stitch a try, on us

Stitch streams all of your data directly to your analytics warehouse.

Set up in minutes
Unlimited data volume during trial