Historic Data Push

Data from events that occurred in the past can be used to run ad hoc campaigns targeting specific audiences.

Historic data can be sent to Gamooga through several channels, depending on where the data resides and in what form it is available.

1. Bulk Imports

Historical data (orders, events etc.) can be uploaded into Gamooga in the form of events. Typically this is a list of users with their email IDs, mobile numbers, or any other past data useful for segmenting or targeting them.


  1. The data has to be in .csv format.
  2. There should be a timestamp column reflecting the actual date of the event. If no timestamp is available, Gamooga uses the time of upload as the timestamp.
  3. A unique event name is to be provided for the database query. Avoid spaces in the event name: “event_name” is a valid name, but “event name” is not. The same rule applies to property names, where spaces are also not to be used.
  4. The same unique identifier should be maintained across all bulk imports.
  5. Select “event” to upload the properties only as event properties, “visitor” to upload them only as visitor properties, or “Both” to upload them as both visitor and event properties.
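The rules above can be sketched with a short script that prepares a valid bulk-import file. The column names, values, and file name here are illustrative assumptions, not a Gamooga-mandated schema; note the timestamp column and the absence of spaces in the property names.

```python
import csv

# Hypothetical past-order events to be uploaded as a bulk import.
# "user_id" plays the role of the unique identifier kept consistent
# across all bulk imports; "timestamp" reflects when the event occurred.
rows = [
    {"user_id": "U1001", "email": "a@example.com", "order_value": "1499",
     "timestamp": "2021-03-15 10:30:00"},
    {"user_id": "U1002", "email": "b@example.com", "order_value": "899",
     "timestamp": "2021-03-16 18:05:00"},
]

with open("past_orders.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(
        f, fieldnames=["user_id", "email", "order_value", "timestamp"]
    )
    writer.writeheader()   # header row: property names with no spaces
    writer.writerows(rows)
```

A file like this would then be uploaded under an event name such as “past_orders” (again, no spaces).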

2. Database Queries

Past data, as well as data that will appear in a database view in the future, can be sent to Gamooga periodically using Database Queries. With a database view, the required data for a particular event can be queried based on the timestamp, and the query can be scheduled to run periodically (once a day, week, etc.) as required.

To set up the connection between the database server and Gamooga, the database settings (host, port, database name, username, password, and database type) are to be added on the panel.

The required columns can be selected from the table in the respective schema, along with the unique identifier. A unique event name is to be provided for each import. Avoid spaces in the event name: “event_name” is a valid name, but “event name” is not.

If you wish to add the columns to the users’ visitor properties, select the column name from the dropdown list.

If a one-time import from the database table is required, click “Make One Time Query”. To schedule the data pull on a regular basis, the table/view must contain a column recording when each row was last updated. Select this column in the limit field and its format in the “limit field format” field. For example, if a row was updated on 2001-01-01, the “updated_date” column for that row should contain that same date, 2001-01-01.
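The scheduled pull described above amounts to an incremental query on the limit field: each run fetches only rows whose update date is newer than the previous run. A minimal sketch of that logic, using SQLite and hypothetical table and column names:

```python
import sqlite3

# In-memory stand-in for the customer's database table/view.
# "updated_date" is the limit field recording when each row changed.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (user_id TEXT, amount REAL, updated_date TEXT)"
)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("U1", 250.0, "2001-01-01"), ("U2", 120.0, "2001-01-05")],
)

# Watermark stored after the previous scheduled run; only rows updated
# since then are picked up on this run.
last_run = "2001-01-02"
new_rows = conn.execute(
    "SELECT user_id, amount, updated_date FROM orders "
    "WHERE updated_date > ?",
    (last_run,),
).fetchall()
```

Here only the row updated on 2001-01-05 is fetched, which is why the limit-field column must accurately reflect each row’s update date.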

3. API call

Events that occurred in the past can be pushed to Gamooga via the event push API, using a unique identifier (such as user_id or customer_id). The API call first resolves the unique identifier to a visitor ID in Gamooga’s backend, and then pushes the event mapped to the fetched visitor ID.

https://evbk.gamooga.com/evwid/?c=<company id>&u=<unique id>&e=<event name>&
ky=<property name 1>&vl=<property value 1>&tp=<property datatype 1>&
ky=<property name 2>&vl=<property value 2>&tp=<property datatype 2>&tm=<epoch>

:- “epoch (tm)” - The tm parameter takes Unix time in milliseconds. The date and time at which the event occurred should be converted to epoch format and passed in the “tm” parameter.
:- “Company id (c)” - The company ID, which can be found in the account section.
:- “Unique id (u)” - The unique identifier of the visitor.
:- “Event name (e)” - The event name to be used in the campaign. An event can have multiple property sets.
:- “Property set (ky, vl, tp)” - Each property set defines a single property of the event, and multiple sets can be added as shown above. “ky” is the name of the property, “vl” its value, and “tp” its datatype. The acceptable datatypes are string (s), numeric (n), and boolean (b).
:- “Response” - A response status of 200 means a successful event push.
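The parameters above can be assembled into the request URL with any HTTP client. A short sketch using only the standard library; the company ID, unique ID, event name, and property values are hypothetical placeholders:

```python
import time
from urllib.parse import urlencode

BASE = "https://evbk.gamooga.com/evwid/"

# Repeated (ky, vl, tp) triples form the property sets of the event.
params = [
    ("c", "123456"),            # company id (hypothetical)
    ("u", "U1001"),             # unique identifier of the visitor
    ("e", "past_purchase"),     # event name, no spaces
    ("ky", "amount"), ("vl", "1499"), ("tp", "n"),   # numeric property
    ("ky", "currency"), ("vl", "INR"), ("tp", "s"),  # string property
    ("tm", str(int(time.time() * 1000))),            # epoch in milliseconds
]
url = BASE + "?" + urlencode(params)
```

The resulting URL can then be requested with any HTTP client; a 200 status in the response indicates a successful push.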

4. Scheduled Datasets

Datasets are external data that can be queried to fetch personalized content for use in a campaign journey. Datasets can be updated automatically from a CSV hosted on an FTP/HTTP server.

Configuration is as follows:

Resources > Datasets > Create New > Name: dataset name > Alert Email: address to be notified if an upload fails > Import from link: URL where the CSV is hosted > Save > Select the appropriate data type for each column > Index the columns used in querying/fetching for faster processing > Save

You can check the upload status by refreshing the page. “Imported n rows. Ready” indicates that the file was uploaded successfully, where “n” is the number of rows in the uploaded file.

Rules to be followed for the dataset upload:

  1. The column names should contain only alphanumeric characters and underscores.
  2. The datatype of columns should be appropriate. For example, product_name as string and price as numeric.
  3. The file should be UTF-8 encoded CSV.
  4. There should not be any empty columns or rows.
  5. The file can contain any number of columns like product_id, product_name, URL, image_link, availability (in stock/out of stock), price etc.
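The upload rules above can be checked before uploading. A minimal sketch of such a pre-upload validation, with a hypothetical file name and columns:

```python
import csv
import re


def validate_dataset(path):
    """Check a dataset CSV against the upload rules; return the row count."""
    with open(path, encoding="utf-8") as f:   # rule 3: UTF-8 encoded CSV
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    # rule 1: column names contain only alphanumerics and underscores
    assert all(re.fullmatch(r"[A-Za-z0-9_]+", col) for col in header)
    # rule 4: no empty columns or rows
    assert all(col.strip() for col in header)
    assert all(any(cell.strip() for cell in row) for row in data)
    return len(data)


# Build a small example file and validate it.
with open("products.csv", "w", newline="", encoding="utf-8") as f:
    w = csv.writer(f)
    w.writerow(["product_id", "product_name", "price", "availability"])
    w.writerow(["P1", "Shoes", "1999", "in stock"])

n = validate_dataset("products.csv")
```

Datatype appropriateness (rule 2) is still configured on the panel after upload, when the data type of each column is selected.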

The dataset can also be updated automatically on a daily/weekly/monthly basis. This is configured as follows:

Settings > Scheduler > Create New > Type: Dataset > Name: select the dataset to be updated from the drop-down > Set the alert email address to be notified on every update > Set the time at which the upload should run from the link configured in the dataset > Select Day/Month/Weekday based on the required update interval > Save