Loading Data

One of the most common integration use cases is data ingestion from other systems. In Contrail, we refer to this process as "loading." Contrail can load different types of data, including Items, Colors, Custom Entities, and Project structures.

One of the main advantages to loading data using the loader framework (rather than using complex logic and endless HTTP calls) is the speed gained from running the data load within the VibeIQ cloud environment. Once you upload your load file, all the operations take place within our infrastructure, alleviating internet bandwidth and quota concerns.

The loader framework is exposed through a combination of our CLI and TypeScript SDK.

Loader Process Schema

Loader Entity:

{
  fileLocation: string;
  loadType: LoadType[]; // ITEM, PROJECT_ITEM, ASSORTMENT, COLOR, CUSTOM_ENTITY, SIZE_RANGE_TEMPLATE
  federatedMappings?: any;
  conditionalColumns?: ColumnDefinition[];
  assortmentSplit?: AssortmentSplit;
  workspaceIdentifier?: string;
}

fileLocation:

The ID of the File entity that the CSV was uploaded to.

loadType:

The type of load to perform.
This is an array. An empty array is treated as passing [ITEM, ASSORTMENT].

Individual Load types:

  • ITEM
  • PROJECT_ITEM
  • ASSORTMENT
  • COLOR
  • CUSTOM_ENTITY
  • SIZE_RANGE_TEMPLATE

federatedMappings:

Used to create a new column from an existing column, using the exact value of the existing column. (NOTE: federated mappings will not overwrite existing properties.)

ex.
"federatedMappings": {"name": "seq#"} - a new property 'name' will be created from the existing property 'seq#' (as long as 'name' does not already exist).

conditionalColumns:

Used to create a new column based on conditions - existing column data can be used here, but does not have to be.
Data from the row can be used in the conditions and the default by putting the property name in curly braces (shown in the example below).
NOTE: federated mappings will not overwrite existing properties - conditionalColumns will.

conditionalColumns schema:

ColumnDefinition: {
  fromProperty?: string;
  toProperty: string;
  conditions?: ConditionalValues[];
  default?: any;
}

toProperty:

The property we will write to.

fromProperty:

Can be used if we want to do a straight copy from an existing property to a new one (like federated mappings, but will overwrite) - this is only used if no conditions are passed in.

conditions:

Holds a list of conditionals to check (allows for almost any check); the value of the first condition that hits will be used.

default:

Used if:

  • there are no conditions and no fromProperty.
  • there are no conditions and fromProperty's value is null.
  • there are conditions but none of them hit.

ex.

{
  "conditions": [
      {
       "conditional": "{strategy} === 'aggressive'",
       "value": "rubber"
      }
    ],
  "toProperty": "treadType",
  "default": "{treadTypeLegacy}"
}
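The evaluation order described above (first matching condition, then default, with `{property}` tokens resolved from the row) can be sketched as follows. This is an assumed reading of the semantics, not the loader's actual code:

```typescript
// Sketch of the assumed conditionalColumns evaluation, not the loader's code.
type Row = Record<string, any>;

interface ConditionalValue { conditional: string; value: any; }
interface ColumnDefinition {
  fromProperty?: string;
  toProperty: string;
  conditions?: ConditionalValue[];
  default?: any;
}

// Replace {property} tokens with the row's values, quoted as literals.
function interpolate(template: string, row: Row): string {
  return template.replace(/\{(\w+)\}/g, (_, prop) => JSON.stringify(row[prop]));
}

// Resolve a default that may itself reference a row property, e.g. "{treadTypeLegacy}".
function resolveDefault(def: any, row: Row): any {
  const m = typeof def === 'string' ? def.match(/^\{(\w+)\}$/) : null;
  return m ? row[m[1]] : def;
}

function applyConditionalColumn(row: Row, col: ColumnDefinition): Row {
  const result: Row = { ...row };
  if (col.conditions?.length) {
    // Use the value of the first condition that hits; otherwise fall back to default.
    const hit = col.conditions.find(c => eval(interpolate(c.conditional, row)) === true);
    result[col.toProperty] = hit ? hit.value : resolveDefault(col.default, row);
  } else if (col.fromProperty && row[col.fromProperty] != null) {
    // Straight copy; unlike federated mappings, this overwrites.
    result[col.toProperty] = row[col.fromProperty];
  } else {
    result[col.toProperty] = resolveDefault(col.default, row);
  }
  return result;
}

const treadColumn: ColumnDefinition = {
  toProperty: 'treadType',
  conditions: [{ conditional: "{strategy} === 'aggressive'", value: 'rubber' }],
  default: '{treadTypeLegacy}',
};
applyConditionalColumn({ strategy: 'aggressive' }, treadColumn);
// → treadType is 'rubber' (the condition hit)
applyConditionalColumn({ strategy: 'mild', treadTypeLegacy: 'foam' }, treadColumn);
// → treadType is 'foam' (no condition hit, default resolves from the row)
```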

assortmentSplit:

Used if we want to split a single CSV across multiple assortments based on a property. (NOTE: If this is not set, the loader will look for a property 'assortmentId' on the first item loaded and load all the items to that one assortment.)

assortmentSplit schema:

AssortmentSplit: {
  fieldToSplitOn: string;
  values: ValueToAssortment[];
}
ValueToAssortment: {
  value: string;
  assortmentId?: string;
  assortmentIdentifier?: string;
}

ex.

 "assortmentSplit": {
     "fieldToSplitOn": "sizeTypes",
     "values": [
        {
          "value": "kids",
          "assortmentId": "l6FuOu6JVfgsd"
        },
        {
          "value": "mens",
          "assortmentIdentifier": "2023 Summer Mens"
        }
      ]
  }
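In other words, each row is routed to the assortment whose entry matches the row's value for `fieldToSplitOn`. A small sketch of that assumed routing (illustrative only, not the loader's implementation):

```typescript
// Sketch of the assumed assortmentSplit routing, not the loader's code.
type Row = Record<string, any>;

interface ValueToAssortment {
  value: string;
  assortmentId?: string;
  assortmentIdentifier?: string;
}

interface AssortmentSplit {
  fieldToSplitOn: string;
  values: ValueToAssortment[];
}

// Group rows by which assortment their split-field value points at.
function splitRows(rows: Row[], split: AssortmentSplit): Map<string, Row[]> {
  const groups = new Map<string, Row[]>();
  for (const row of rows) {
    const target = split.values.find(v => v.value === row[split.fieldToSplitOn]);
    if (!target) continue; // rows that match no entry are skipped in this sketch
    const key = target.assortmentId ?? target.assortmentIdentifier ?? '';
    groups.set(key, [...(groups.get(key) ?? []), row]);
  }
  return groups;
}

const split: AssortmentSplit = {
  fieldToSplitOn: 'sizeTypes',
  values: [
    { value: 'kids', assortmentId: 'l6FuOu6JVfgsd' },
    { value: 'mens', assortmentIdentifier: '2023 Summer Mens' },
  ],
};
splitRows([{ sizeTypes: 'kids' }, { sizeTypes: 'mens' }, { sizeTypes: 'kids' }], split);
// → two 'kids' rows grouped under 'l6FuOu6JVfgsd', one 'mens' row under '2023 Summer Mens'
```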

workspaceIdentifier:

This is a string used to identify the workspace we would like to load data into. It is only used on a project-item load when there is no other way to know which project we are working in, i.e. no assortmentSplit and no assortmentId column in the data.

Loading With The CLI

The Contrail CLI contains all the functions needed to successfully load a CSV file into the VibeIQ ecosystem. There are two commands that can be used to kick off a load job with a new or existing VibeIQ File.

Loading with a local file

To load with a local file you need a CSV file to load and a JSON or YML configuration file.

The CLI command to run is

contrail loader-process upload-and-load <File Path> <Config Path>

In the command above, <File Path> should be the location and name of the CSV file you want to upload into a File entity, i.e. upload.csv if the file is at the root of where you run the command, or data/files/upload.csv for a file in a nested folder.

In the command above, <Config Path> should be the location and name of the JSON or YML file you want to use as the load configuration. You DO NOT need to set the loadFileId; this will be filled in with the newly created file.
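For reference, a minimal config file might look like the following. The fields follow the loader process schema above; the mapping values here are purely illustrative, and fileLocation is omitted because the CLI fills it in from the uploaded file:

```json
{
  "loadType": ["ITEM", "ASSORTMENT"],
  "federatedMappings": {
    "name": "seq#"
  }
}
```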

Loading with an existing file

To load with an existing file you just need a JSON or YML configuration file.

The CLI command to run is

contrail loader-process load <Config Path>

In the command above, <Config Path> should be the location and name of the JSON or YML file you want to use as the load configuration, i.e. config.json if the file is at the root of where you run the command, or data/files/config.json for a file in a nested folder.

You DO NOT need to set the loadFileId when loading from an existing file.

Loading With The SDK

Logging In

The first thing we need to do is set up your login credentials so we can make authenticated calls to VibeIQ. To do this we pull in the 'login' package from the Contrail SDK. When using the login package you may use either a VibeIQ-provided app token or a username and password. The login function takes a 'LoginCredentials' parameter. orgSlug is required in all cases; apiKey is required for the app token; email and password are required for user credentials.

LoginCredentials {
    orgSlug: string,
    email?: string,
    password?: string,
    apiKey?: string,
}
code example:
import { login } from '@contrail/sdk';
async function bootstrap() {
  const loginCredentials: LoginCredentials = {
    orgSlug: '<ENTER ORG SLUG>', 
    apiKey:  '<ENTER API TOKEN. i.e. app:123123>'
  }
  await login(loginCredentials);
}
await bootstrap();

Uploading Your Data File

The first thing we need to do when uploading a file is read in the data file we want to upload. To do this we will use the Node package fs. Using the readFileSync function we can read the CSV file like this:

import * as fs from 'fs';

let csvContent = fs.readFileSync('uploadFile.csv', 'utf-8'); // change the name to your data file name
Now that we have our data file read, we will upload it via the contrail SDK.

We will use the Files library to create a VibeIQ file, setting the name, file type, and ttl.

This will look something like this:

const fileName = 'uploadFile.csv'; // the name to give the created File entity
const csvFile = Buffer.from(csvContent, 'utf-8');
const createdFile = await new Files().createAndUploadFileFromBuffer(
  csvFile,
  'text/csv',
  fileName,
  null,
  5 * 60 * 60 * 1000 // ttl in milliseconds (5 hours). If you never want the file deleted, do not set a ttl
);
console.log("createdFile: ", JSON.stringify(createdFile));
console.log('VibeIQ file ID: ', createdFile.id);

(Optional) Uploading Your Config File

Every data load accepts a load configuration. This load configuration can be set on the load job directly (see Starting A Loader Job), or it can reference a pre-defined configuration. These pre-defined configurations can be useful when re-using existing configuration on a number of different load files.

To set up a pre-defined loader configuration file, simply create a loader-configuration entity like the one below. The fields are kept empty in this example, but they follow the same loader process schema.

const loaderConfig = await entities.create(
  {
    entityName: 'loader-configuration',
    object: {
      "federatedMappings": {
        "name": "",
        "itemFamilyFederatedId": "",
        "optionName": "",
        "itemOptionFederatedId": ""
      },
      "workspaceIdentifier": "",
      "assortmentSplit": {},
      "propertiesToRemove": [],
      "conditionalColumns": [],
      "name": "Winter 2024"
    }
  }
)
console.log("loader config id:", loaderConfig.id)

Starting A Loader Job

A loader job is run on a file in the VibeIQ system via its file id, so we will need the file id from the previous step. We will use the Entities package to kick off a loader process with the desired configuration. To see more on the loader configuration, see: Loader Service

The code for this should look something like this:

// start a load job referencing the newly created file
const loadJob = await entities.create(
  {
    entityName: 'loader-process',
    object: {
      fileLocation: createdFile.id,
      loadType: ['ITEM', 'ASSORTMENT'],
      // loaderConfigurationId: loaderConfig.id // if using a pre-defined loader config
      federatedMappings: {
        name: "",
        itemFamilyFederatedId: "",
        optionName: "",
        itemOptionFederatedId: ""
      }
    }
  });

console.log('Load has been kicked off. loader-process ID: ', loadJob.id);