Data Destinations
This document defines the different destinations where the importer can push the data uploaded by the users. The following is the list of supported destinations.
  1. None
  2. API
  3. Amazon S3
  4. MySQL
  5. SQL Server
  6. Google Sheets
  7. PostgreSQL
  8. FTP Server
  9. Bubble.io
  10. Airtable
  11. Zapier
  12. Notion
Only one destination can be selected per sheet at a time.

None

The user-uploaded data will not be pushed anywhere. The files will, however, be available for download via the csvbox.io admin.

API / Webhook

The data will be pushed to a webhook endpoint as configured in the sheet settings. You can choose between JSON, XML, and form data formats for receiving data at your webhook. The data will be pushed in chunks of rows. The number of rows per chunk can be configured in the sheet settings.

Sample JSON POST to your API:

[
  {
    "import_id": 79418895,
    "sheet_id": 55,
    "sheet_name": "Products",
    "row_number": 1,
    "row_data": {
      "Name": "TP-Link TL-WN822N Wireless N300 High Gain USB Adapter",
      "SKU": "AS-100221",
      "Price": "33.00",
      "Quantity": "3",
      "Image URL": "https://cdn.shopify.com/s/files/1/1491/9536/products/31jJOj1DS5L_070b4893-b7af-482f-8a15-d40f5e06760d.jpg?v=1521803806"
    },
    "custom_fields": {
      "user_id": "1a2b3c4d5e6f",
      "team_id": "sales2"
    }
  },
  {
    "import_id": 79418895,
    "sheet_id": 55,
    "sheet_name": "Products",
    "row_number": 2,
    "row_data": {
      "Name": "EPower Technology EP-600PM Power Supply 600W ATX12V 2.3 Single 120mm Cooling Fan Bare",
      "SKU": "AS-103824",
      "Price": "95.35",
      "Quantity": "8",
      "Image URL": "https://cdn.shopify.com/s/files/1/1491/9536/products/71pRC5VjF-L_8f840eb9-6a47-407f-999c-490f7814159d.jpg?v=1521803806"
    },
    "custom_fields": {
      "user_id": "1a2b3c4d5e6f",
      "team_id": "sales2"
    }
  }
]
The data will come in as HTTP POST requests. Each request will have an array of rows based on the chunk size defined in the sheet settings. You can set the chunk size to 1 to receive 1 record per HTTP request.
If you want to jump in and get started, we recommend testing with webhook.site to get a webhook URL. For testing on your local machine, we recommend using ngrok.
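On the receiving side, each POST body is simply a JSON array of row objects in the shape of the sample above. A minimal sketch in Python of parsing one chunk (the function name and the print statement are illustrative, not part of the importer):

```python
import json

def handle_webhook_chunk(body: str) -> int:
    """Parse one webhook chunk and process each row.

    `body` is the raw JSON string of an HTTP POST; returns the
    number of rows processed.
    """
    rows = json.loads(body)  # one chunk = an array of row objects
    for row in rows:
        data = row["row_data"]       # the mapped spreadsheet columns
        meta = row["custom_fields"]  # attributes set via setUser
        # Replace this with your own persistence logic.
        print(f"import {row['import_id']} row {row['row_number']}: "
              f"{data['SKU']} for user {meta['user_id']}")
    return len(rows)
```

Plug this into whatever web framework handles your webhook route; the chunk size set in the sheet settings determines how many rows each call receives.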

Amazon S3

The files uploaded by the users can be pushed to the AWS S3 Bucket of your choice. You simply need to select the destination type as 'Amazon S3' and provide the AWS credentials, bucket/folder name, and access policy for storing the files.
The data will be stored as S3 objects with the name {{import_id}}_{{user_id}}.csv, where user_id is the custom user attribute that you reference via the setUser method while installing the importer code. The other 4 custom user attributes will be saved as the user-defined metadata of the S3 object.
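The naming convention can be expressed as a one-liner (a sketch; the example values are taken from the webhook sample above):

```python
def s3_object_key(import_id: int, user_id: str) -> str:
    # Mirrors the {{import_id}}_{{user_id}}.csv naming convention.
    return f"{import_id}_{user_id}.csv"
```

For example, import 79418895 performed by user 1a2b3c4d5e6f is stored as 79418895_1a2b3c4d5e6f.csv.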
The AWS credentials need, at a minimum, the following 3 permissions for uploading files to S3:
  1. ListBucket is required for testing the connection.
  2. PutObject is required to add objects to S3.
  3. PutObjectTagging is required to add the tags (metadata) to the uploaded objects.
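An IAM policy granting just these three actions might look like the following (a sketch; `your-bucket-name` is a placeholder — scope the resources to your own bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::your-bucket-name"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:PutObjectTagging"],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```

Note that ListBucket applies to the bucket ARN itself, while the object-level actions apply to the objects inside it (the `/*` resource).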

MySQL

Import CSV files and Excel sheets directly into your MySQL tables. How it works:
  • Select the destination type as 'MySQL Database'.
  • Connect your MySQL database by providing the credentials.
  • Specify the table name where you want the data to be pushed.
  • Map the sheet columns to the table columns.
  • You can also map custom attributes to table columns.
The user CSV data will then be appended directly to the MySQL table.
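Conceptually, each chunk of mapped rows becomes one batched, parameterized INSERT. A sketch of that idea in Python (table and column names are illustrative; this is not the importer's actual implementation):

```python
def build_insert(table: str, columns: list[str],
                 rows: list[dict]) -> tuple[str, list[tuple]]:
    """Build a parameterized INSERT plus value tuples for executemany()."""
    placeholders = ", ".join(["%s"] * len(columns))
    sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"
    values = [tuple(row[c] for c in columns) for row in rows]
    return sql, values
```

The resulting statement and values can be handed to the `executemany()` cursor method of a MySQL driver so the whole chunk lands in one round trip.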

SQL Server

Import CSV files and Excel sheets directly into your SQL Server tables. How it works:
  • Select the destination type as 'SQL Server Database'.
  • Connect your SQL Server database by providing the credentials.
  • Specify the table name where you want the data to be pushed.
  • Click the 'Test Connection' button.
  • If the connection is successful, then click the 'Map Columns' button and match the sheet template columns to the SQL Server table columns.
  • You can also map custom attributes to table columns.
  • Select between the following 2 operations:
    • Insert - The importer will always push the incoming CSV data as new records in the database.
    • Upsert - The importer will check if the record exists in the database. If the record exists, then it will be updated with the incoming data from the CSV. If the record does not exist, then a new record will be inserted. The record check will be done based on the index keys specified in the mapping modal.
Define Unique Key for Upsert Operation
The Upsert operation is significantly slower than the Insert operation: Insert can push records in chunks, whereas Upsert processes only one record at a time and requires multiple queries per record.
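The insert-vs-update decision behind Upsert can be sketched with an in-memory stand-in for the table (purely illustrative; the importer itself issues SQL against your database, keyed on the index columns you pick in the mapping modal):

```python
def upsert(table: dict, key_cols: tuple, row: dict) -> str:
    """Update the row if its unique key already exists, else insert it.

    `table` maps key tuples to rows; returns which action was taken.
    """
    key = tuple(row[c] for c in key_cols)
    action = "update" if key in table else "insert"
    table[key] = row
    return action
```

The existence check is why Upsert cannot be chunked: every incoming row costs at least one lookup before the write.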

Google Sheets

Import CSV files and Excel sheets directly into Google Sheets. Here is how it works:
  • Select the destination type as 'Google Sheets'.
  • Connect your Google account by clicking the Google button and accepting the relevant permissions.
The importer requires permission to view the list of Google sheets in your account and edit sheet data.
  • Provide the Google sheet name.
  • Specify the worksheet name where you want the data to be pushed.
  • Map the template columns to the Google sheet columns.
  • You can also map custom attributes to sheet columns.
The user CSV data will then be added directly to the Google sheet.

Bubble

Import user CSV files and Excel sheets directly into your Bubble app. More information here.

Notion

Import user CSV files and Excel sheets directly into your Notion databases. More information here.

PostgreSQL

Import CSV files and Excel sheets directly into your PostgreSQL tables. How it works:
  • Select the destination type as 'PostgreSQL'.
  • Connect your PostgreSQL database by providing the credentials.
  • Specify the table name where you want the data to be pushed.
  • Map the sheet columns to the table columns.
  • You can also map custom attributes to table columns.
The user CSV data will then be appended directly to the PostgreSQL table.

Airtable

Import CSV files and Excel sheets directly into your Airtable. Here is how it works:
  • Select the destination type as 'Airtable'.
  • Connect your Airtable by providing the credentials.
Steps to get the Airtable API Key are mentioned here.
Steps to get the Base ID are mentioned here.
  • Specify the table name where you want the data to be pushed.
  • Map the sheet columns to the Airtable table columns.
  • You can also map custom attributes to table columns.
The user CSV data will then be appended directly to the Airtable table.
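Under the hood, pushing rows to Airtable means POSTing records to its REST API, which accepts bodies of the form `{"records": [{"fields": {...}}, ...]}` with at most 10 records per request. A sketch of shaping mapped rows into that payload (field names are illustrative):

```python
def airtable_payload(rows: list[dict]) -> dict:
    """Shape mapped rows into the body of an Airtable create-records call."""
    assert len(rows) <= 10, "Airtable accepts at most 10 records per request"
    return {"records": [{"fields": row} for row in rows]}
```

Larger imports are therefore split into batches of 10 before being sent to the table endpoint.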

Zapier

Import user CSV files and Excel sheets to Zapier. More information here.

FTP Server

The files uploaded by the users can be pushed to your FTP server. You simply need to select the destination type as 'FTP' and provide the connection details and the folder name for storing the files.
The data will be stored as CSV files with the name {{import_id}}_{{user_id}}.csv, where user_id is the custom user attribute that you reference via the setUser method while installing the importer code.