API Analytics - Bulk Export

Bulk export is an enterprise feature that provides direct access to your data stored in Moesif. With bulk export, you can quickly load hundreds of millions of events into data warehouses like Snowflake, Redshift, and BigQuery.

You can trigger a new bulk export job within the Moesif UI or through the Management API, for example to set up recurring exports.

Bulk export is designed to move large amounts of your raw events for archival and data warehousing purposes. If you want to import a chart’s dataset into a tool like Microsoft Excel or Google Sheets, you may want to use the Export Chart button instead.

Export Types

Moesif supports bulk export for three types of entities:

  • Events: Export your API calls and user actions in Moesif with their associated fields like HTTP headers.
  • Users: Export your users stored in Moesif and their associated metadata.
  • Companies: Export your companies stored in Moesif and their associated metadata.

File Types

  • CSV: CSV has some of the widest support among popular tools, which makes it ideal for exporting data like users and companies to your CRM and more.
  • JSON: JSON is ideal for parsing data quickly in scripts. Many fields in Moesif are deeply nested, which JSON represents naturally.
  • Parquet: A Parquet file is column-oriented, which can make analysis on a subset of columns faster than loading the entire file into memory. The schema is embedded directly in the file, which helps SQL-warehouse-style tools that require strict schema enforcement.
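As a concrete illustration of working with a JSON export, the sketch below flattens a nested event record into dot-separated CSV columns. The event fields shown are hypothetical, not the exact schema of a Moesif export:

```python
import csv
import json

def flatten(record, prefix=""):
    """Flatten nested dicts into dot-separated column names."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{name}."))
        else:
            flat[name] = value
    return flat

# A hypothetical event from a JSON export (field names are illustrative).
event = json.loads("""
{
  "uri": "/items",
  "response": {"status": 404, "headers": {"Content-Type": "application/json"}}
}
""")

row = flatten(event)
# row == {"uri": "/items", "response.status": 404,
#         "response.headers.Content-Type": "application/json"}

with open("events.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=sorted(row))
    writer.writeheader()
    writer.writerow(row)
```

This kind of flattening is handy when a downstream tool expects flat CSV columns but the source export is deeply nested JSON.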

Export schema

Data representing API calls is naturally quite sparse, and Moesif heavily partitions data for performance (for example, certain HTTP headers may be set by only a few of your customers). To keep exports fast while reducing file size, Moesif only includes columns that have at least one defined value in the dataset.

If you’re using any automated ETL tooling, ensure it’s set up to handle schema evolution, since the set of exported columns can change from one export to the next.
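Because each export only includes columns that have at least one value, an ETL step may need to union the column sets across batches before loading. A minimal sketch with the standard library, using illustrative column names:

```python
import csv

# Two hypothetical export batches with differing sparse columns:
# the first has an "X-Trace-Id" header column, the second does not.
batch_a = [{"uri": "/a", "status": 200, "X-Trace-Id": "abc"}]
batch_b = [{"uri": "/b", "status": 500}]

# Union the column sets so batches with new or missing columns still load.
columns = sorted({key for row in batch_a + batch_b for key in row})

with open("combined.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=columns, restval="")
    writer.writeheader()
    writer.writerows(batch_a + batch_b)
```

Setting `restval=""` fills in an empty value for any column a given row lacks, which keeps the combined file rectangular.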

Some tools, like BigQuery, have very strict schema requirements yet commonly misclassify data during automatic schema detection. For these tools, it’s recommended to use a format like Parquet, which can help overcome these challenges.

How to start an events export

To export events, go to Live Event Stream under the Events menu, then add your filter criteria. The time it takes to export your data is directly proportional to both the number of events and the number of columns you select.

If you want to trigger export jobs via API, contact your account manager or reach out to support for instructions.

Filtering Events

First, add any filters you want applied before exporting. For example, if you want to export all API calls from the last 24 hours that had a status code of 400 or greater, set up your filter criteria like below.

Selecting Data to Export

Then click the export button as shown above.

Selecting columns

If you don’t want all columns, you can select specific columns of interest before exporting. To do so, click the Select Columns button as shown below. This opens a pop up where you can choose the columns to include.

Selecting Columns to Export

Exporting data

Once you have your event stream configured with the right criteria, click the export button. This opens the pop up shown below, where you select the file type and whether to export all columns or only the previously selected ones.

Bulk Events Export

You should also specify the email address where the download link should be sent. The email will include a link to download a file from an Azure Storage account.
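If you want to fetch the exported file programmatically once the email arrives, a minimal sketch with the standard library follows. The URL shown is hypothetical; the real link is unique per export and points at an Azure Storage account:

```python
import urllib.request

def download_export(url, dest):
    """Stream an export file to disk without loading it all into memory."""
    with urllib.request.urlopen(url) as response, open(dest, "wb") as out:
        while chunk := response.read(64 * 1024):
            out.write(chunk)

# Hypothetical usage with a link copied from the export email:
# download_export(
#     "https://example.blob.core.windows.net/exports/events.json",
#     "events.json",
# )
```

Streaming in chunks matters for bulk exports, since the files can be large enough that reading them fully into memory is impractical.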

How to start a users/companies export

Exporting users or companies works very similarly and can be done with the Export button as shown below:

Bulk export users and companies

This will open a pop up where you can select the file type and enter the email address that will receive the export.