Integrate usage data into your product analytics strategy

Web applications emit a wealth of metadata and user interaction information that's critical to understanding user behavior. However, parsing this data to find what is most relevant to your product analytics project can be challenging: what one product analyst might find useful, another might consider unnecessary noise. Factor in the wide variety of analytical lenses available, from longitudinal cohort studies to snapshots of individual sessions, and determining which type of data best suits your needs becomes even more difficult.

In order to effectively analyze usage data, you need to devise a product analytics strategy that covers everything from ingestion to analysis, including identifying the most useful data for your project, verifying and normalizing it, and precisely querying it. Despite its broad scope, this strategy can be broken down into these basic steps:

Organize and ingest your usage data

While there are many approaches to developing a data analytics strategy, a subtractive model enables you to ensure that you collect the data you need with minimal overhead. This model involves ingesting all the usage data your app emits up front, then filtering and querying that data afterwards to answer key questions. This approach gives you more flexibility than an additive model, which involves ingesting only the data you've decided you need. Project goals may evolve over the course of analysis, or you may realize you need certain data you didn't anticipate. A subtractive model enables you to quickly adapt to these changes.
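As a minimal sketch of the subtractive approach, assuming a hypothetical raw event dump (the event names and columns below are placeholders), here's how filtering after ingestion might look in Python with pandas:

```python
import pandas as pd

# Subtractive model: ingest every event the app emits up front.
# This raw event dump is hypothetical; the names are placeholders.
all_events = pd.DataFrame({
    "event_name": ["page_view", "add_to_cart", "purchase", "scroll", "hover"],
    "user_id": [1, 1, 1, 2, 2],
})

# Answer questions later by filtering out what you don't need. A new
# question requires only a new filter, not re-instrumenting the app.
checkout_events = all_events[all_events["event_name"].isin(["add_to_cart", "purchase"])]
print(checkout_events)
```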

If you're sending raw data from your hosts to a database, you'll likely want to organize usage data into a hierarchical data taxonomy. Taxonomies enable you to separate the data into logical categories and subcategories, which helps analysts more easily link and query data. You'll also want to establish clear primary keys for each of your tables, enabling you to join your event data to external reference tables for contextual insights, such as details about your customers' organizations. In a typical taxonomy diagram, the primary key appears at the top of each table.
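To illustrate how these keys support joins, here's a minimal Python sketch using pandas. The table names, keys, and fields (events, organizations, org_id, plan_tier) are hypothetical placeholders for whatever your taxonomy defines:

```python
import pandas as pd

# Hypothetical event table: event_id is the primary key, org_id a foreign key.
events = pd.DataFrame({
    "event_id": [1, 2, 3],
    "event_name": ["add_to_cart", "purchase", "signup"],
    "org_id": ["org-a", "org-a", "org-b"],
})

# Hypothetical reference table keyed on org_id, holding customer context.
organizations = pd.DataFrame({
    "org_id": ["org-a", "org-b"],
    "plan_tier": ["enterprise", "free"],
    "region": ["us-east", "eu-west"],
})

# Joining on the shared key enriches each event with organizational context.
enriched = events.merge(organizations, on="org_id", how="left")
print(enriched)
```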

Verify and normalize your data

While looking at raw usage data may provide superficial insights into user behavior, you'll need to export it to an observability platform to perform deeper analysis on a larger scale. Regardless of which platform you choose, it's crucial to verify that the data has been ingested correctly. A few conditions you'll want to confirm include the following (a code sketch of these checks appears after the list):

  • Names are consistent and spelled correctly
  • Data isn't duplicated or miscategorized
  • Parameters are correctly associated with events
  • Every field you want to capture is available for querying
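Here's a minimal Python sketch of what these checks might look like against a pandas DataFrame; the column names, event names, and expected fields below are all hypothetical:

```python
import pandas as pd

# Hypothetical ingested events; all column and event names are placeholders.
df = pd.DataFrame({
    "event_id": [101, 102, 103, 103],
    "event_name": ["add_to_cart", "Add To Cart", "purchase", "purchase"],
    "product_category": ["books", "books", None, None],
})

# 1. Names are consistent: flag casing/spacing variants of the same event.
normalized = df["event_name"].str.lower().str.replace(" ", "_", regex=False)
if normalized.nunique() != df["event_name"].nunique():
    print("Inconsistent event names detected")

# 2. Data isn't duplicated: look for repeated primary keys.
duplicates = df[df["event_id"].duplicated(keep=False)]
print(f"{len(duplicates)} rows share an event_id")

# 3. Parameters are correctly associated: find events missing expected fields.
missing = df[normalized.eq("add_to_cart") & df["product_category"].isna()]
print(f"{len(missing)} add-to-cart events lack a product_category")

# 4. Every field you want to capture is available for querying.
expected = {"event_id", "event_name", "product_category", "user_id"}
print("Missing columns:", expected - set(df.columns))
```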

Once you've identified potential inconsistencies or gaps in your ingested data, data normalization comes into play. While some data analysis platforms execute certain normalization processes automatically, performing basic normalization procedures manually helps you catch any nuances that might be specific to your organization or project. For example, you might need to check that unique abbreviations in field names aren't duplicated with full names elsewhere in your data (e.g., "PA" vs. "Product Analytics").

Standardizing elements such as spelling, capitalization, and character usage (including underscores and spaces) across your field names doesn't just help you avoid repetition within your data that can make it difficult to query. It also helps you catalog the events and parameters that your apps are generating. You can also further define the relationships among your data points, as well as eliminate redundancy and misleading dependencies, by ensuring that your data follows normal form rules.
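As a sketch of what basic field-name standardization might look like, the following Python function lowercases names, collapses separators into underscores, and expands a hypothetical abbreviation map so that variants like "PA" and "Product Analytics" resolve to a single field:

```python
import re

# Hypothetical abbreviation map; extend it with terms specific to your org.
ABBREVIATIONS = {"pa": "product_analytics"}

def normalize_field_name(name: str) -> str:
    """Standardize casing and separators, then expand known abbreviations."""
    # Lowercase and collapse spaces and hyphens into underscores.
    key = re.sub(r"[\s\-]+", "_", name.strip().lower())
    # Expand abbreviations so "PA" and "Product Analytics" become one field.
    return ABBREVIATIONS.get(key, key)

print(normalize_field_name("PA"))                 # product_analytics
print(normalize_field_name("Product Analytics"))  # product_analytics
print(normalize_field_name("Device-Type"))        # device_type
```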

Effectively query product data

To effectively query the normalized data, you need to identify which pieces of information are the most useful for your project. Product data consists of two aspects: the event and the parameters. The event is an action in the system (e.g., an item is added to a cart, a page is loaded, a sign-up button is clicked), and the parameters are the metadata associated with the event. These parameters may consist of details about the user or the system they're using, such as location data, device types, or user IDs. Other times, the parameters might contain information specific to the event they're associated with. Some examples of possible event and parameter combinations for a shopping web app are listed below:

| Event | Parameter | Purpose |
| --- | --- | --- |
| Add to Cart | Product Category | Tells you what kind of product the user has just added to their cart |
| Purchase | Use Discount | Helps you determine whether the user applied a discount code when making their purchase |
| Signup | Has Purchased | Shows you whether a user signing up for your loyalty program has made a purchase previously |
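One simple way to model this event-plus-parameters structure is a record type that pairs an action name with a dictionary of metadata. The following Python sketch expresses the rows from the table above; the field names are illustrative, not a required schema:

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class UsageEvent:
    """An action in the system plus the metadata describing it."""
    name: str
    parameters: dict[str, Any] = field(default_factory=dict)

# The rows from the table above, expressed as events with parameters.
events = [
    UsageEvent("add_to_cart", {"product_category": "books"}),
    UsageEvent("purchase", {"use_discount": True}),
    UsageEvent("signup", {"has_purchased": False}),
]

for event in events:
    print(event.name, event.parameters)
```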

Deciding what data to query consists of two parts: figuring out which events correspond with the user interactions you want to analyze, and determining which parameters will provide the most useful context for these events. For example, let's say you're evaluating whether your team's new email campaign has resulted in an increase in page visits from loyal, returning users. In addition to analyzing referral link click events, you'll also likely want to consider parameters that might impact conversion, such as user age or location. Alternatively, let's say you're executing a more technical project: maybe you want to study whether time spent on your website is consistent across mobile devices. Here, parameters such as device type, OS, and screen size will probably be more relevant.
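As a minimal sketch of that second scenario, the following Python snippet (using pandas, with entirely hypothetical session data and column names) narrows the data to mobile sessions and then compares time on site across operating systems:

```python
import pandas as pd

# Hypothetical session-level data; every column name is a placeholder.
sessions = pd.DataFrame({
    "device_type": ["mobile", "mobile", "desktop", "mobile"],
    "os": ["iOS", "Android", "Windows", "iOS"],
    "session_seconds": [120, 95, 300, 140],
})

# First narrow to the sessions of interest (the "event" side of the decision),
# then group by the parameters that give the most useful context.
mobile = sessions[sessions["device_type"] == "mobile"]
print(mobile.groupby("os")["session_seconds"].mean())
```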

Visualize your product usage data with Datadog

Once the data has been ingested into the monitoring platform of your choice, you can visualize and query it to identify usage trends. Datadog Product Analytics provides a variety of visualizations, including funnels, path exploration, user segmentation, and cohort analysis, that enable you to identify trends within your user behavior. Product Analytics also helps you easily enrich your data with context from reference tables, as well as perform natural language queries to answer granular questions.
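To make the funnel concept concrete, here's a generic Python sketch (an illustration of the idea, not Datadog's implementation) that counts how many distinct users reach each step of a hypothetical checkout funnel, ignoring event ordering for simplicity:

```python
import pandas as pd

# Hypothetical per-user event log; names are placeholders.
log = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3],
    "event": ["page_view", "add_to_cart", "purchase",
              "page_view", "add_to_cart", "page_view"],
})

# Count the distinct users who reach each step of the funnel.
steps = ["page_view", "add_to_cart", "purchase"]
for step in steps:
    users = log.loc[log["event"] == step, "user_id"].nunique()
    print(f"{step}: {users} users")
```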

To ensure that the data within Product Analytics is useful and accurate, Datadog also makes ingestion and verification easy. You can send your raw event data to Product Analytics with no additional setup required. Alternatively, you can import your data from a third-party service by using any of our more than 800 integrations.

Once your data has been ingested, you can view it in Datadog RUM to ensure that your data is labeled logically and accurately. RUM enables you to easily filter your data to analyze how a specific subset of your sessions is categorized. For a closer look, you can also click on a single session to view a list of associated tags and values.

Start analyzing your usage data today

Usage data is the foundation of any product analytics project: it enables you to ground your UX insights in concrete trends that point toward who your customers are and how they're interacting with your app. This data is not the be-all and end-all, and it's always good to supplement large-scale usage trends with more targeted practices like user interviews and A/B testing, but it gives you a critical starting point for your project.

By strategically ingesting data, normalizing and organizing it via a taxonomy, and verifying it after ingestion, you can help ensure that your analyses are built on accurate, well-organized data. To learn how to start analyzing your product usage data in Datadog, see our Product Analytics documentation. If you're new to Datadog, you can sign up for a 14-day free trial.