How To
Jul 28, 2023

Use the Gigasheet API to Integrate with Salesforce

Hey, data enthusiasts! Ready to dive into the epic world of Gigasheet and Salesforce integration? Hold on tight, because we're about to embark on a wild data adventure! So, here's the deal: Gigasheet is like a spreadsheet on steroids, designed to handle massive datasets. It's a game-changer for exploring and analyzing big data.

But wait, there's more! Gigasheet also has this awesome API that lets you connect it with other applications (you can even use Zapier), including the mighty Salesforce. We'll start by exporting a report from Salesforce and using the Gigasheet API like a pro to seamlessly integrate it with Gigasheet.

Looking for a no-code approach? Schedule data exports from Salesforce in the Gigasheet UI

In the end, we'll have the entire Salesforce report appended to a Gigasheet spreadsheet. This integration empowers you to unleash the full potential of Salesforce while enjoying the flexibility and versatility of Gigasheet's spreadsheet interface. It saves you time and significantly boosts your productivity. Get ready to join the Gigasheet party by signing up. Yes, Gigasheet's community plan is free!

We aim for a seamless workflow, so we'll create a Python script to accomplish this. I'd like to use the simple_salesforce library, a basic Salesforce REST API client. Run the following pip command to download and install it:

pip install simple_salesforce

Now, we have the amazing Simple Salesforce library at our disposal. We have a few more libraries and methods to import for integrating Python with Salesforce:

from simple_salesforce import Salesforce
import requests
import pandas as pd
from io import StringIO

Let's now create an object (sf) to interact with various Salesforce resources. This object acts as a gateway between our Python code and our Salesforce account, allowing us to access and manipulate data within our Salesforce organization. Please keep your passwords and keys secure by setting environment variables for these credentials rather than hard-coding them in your script.

sf = Salesforce(username='REDACTED', password='REDACTED', security_token='REDACTED')
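To follow the environment-variable advice above, here is a minimal sketch. The variable names (SF_USERNAME, SF_PASSWORD, SF_SECURITY_TOKEN) are our own choices, not anything Salesforce mandates:

```python
import os

# Hypothetical variable names -- set these in your shell, not in the script
sf_username = os.environ.get("SF_USERNAME", "")
sf_password = os.environ.get("SF_PASSWORD", "")
sf_token = os.environ.get("SF_SECURITY_TOKEN", "")

# Then authenticate without credentials ever appearing in source control:
# sf = Salesforce(username=sf_username, password=sf_password, security_token=sf_token)
```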

In addition to the Salesforce username, password, and security token, we also need to provide the Salesforce Instance URL and the ID of the report we want to download. These details are essential for the successful execution of our code. Here's a tip: you can spot the report ID within the Salesforce URL.

sf_instance = ''
reportId = '00O4x000005op7lEAA'
Fictitious Salesforce Data Report Summarizing Sample Sales & Marketing Data

Before we continue, I want to be clear that the names and personal information in this report are not real. They are fictitious and have been created for illustrative purposes only. Need proof? Check out the 4th entry with the name Admiral Kassius Konstantine! I mean, seriously, who signs up for a name like that? Are there any Star Wars fans in the house?

Let's move on! In order to build the URL to download this Salesforce report as a CSV file, we need to add some query parameters:

sf_URL = f'{sf_instance}{reportId}?isdtp=p1&export=1&enc=UTF-8&xf=csv'

Here, isdtp with the value p1 tells Salesforce to render the report for direct download rather than in the normal UI. export=1 specifies that the report should be exported. enc (encoding) specifies the encoding of the report file; UTF-8 is the standard encoding for Unicode text. xf (export format) specifies the format of the exported file; the value csv saves the report as a CSV file.
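If you prefer not to assemble the query string by hand, the same URL can be built from a parameter dict. The instance URL below is a placeholder, not a real org:

```python
from urllib.parse import urlencode

sf_instance = "https://example.my.salesforce.com/"  # placeholder instance URL
reportId = "00O4x000005op7lEAA"

# Same parameters as the hand-built string, expressed as a dict
params = {"isdtp": "p1", "export": 1, "enc": "UTF-8", "xf": "csv"}
sf_URL = f"{sf_instance}{reportId}?{urlencode(params)}"
```

This keeps the parameters readable and lets urlencode handle any characters that need escaping.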

Sure, we can stop the automation here and do it the old-fashioned way. That is, we can manually save the Salesforce report and upload it to Gigasheet. But be warned, this practice is as outdated as wearing socks with sandals. Just imagine the hours you'll save by automating this process. With the Gigasheet API, you'll be able to spend all that extra time doing the things you love... like sleeping, watching Netflix, or browsing Reddit.

Wave Goodbye to Manual Downloads and Uploads between Platforms    

Here, we would like to say farewell to the necessity of downloading and uploading files across different platforms. To achieve a seamless integration, let's take care of some initial tasks. The following commands are designed to read the .csv file and convert it into a DataFrame, providing an opportunity for further data manipulation within the Gigasheet API.

response = requests.get(sf_URL, headers=sf.headers, cookies={'sid': sf.session_id})
sf_report = response.content.decode('utf-8')
data_frame = pd.read_csv(StringIO(sf_report))

# Add the line below if you don't want errors from NaN values
data_frame = data_frame.fillna('')
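Before parsing, it can pay to confirm the download actually succeeded. A small helper like the sketch below (the function name is ours, not from the post) fails loudly on a bad HTTP status instead of handing pandas an HTML error page:

```python
import pandas as pd
from io import StringIO

def csv_to_frame(status_code: int, body: str) -> pd.DataFrame:
    """Parse a downloaded CSV body, failing loudly on a bad HTTP status."""
    if status_code != 200:
        raise RuntimeError(f"Report download failed: HTTP {status_code}")
    # fillna('') mirrors the step above, avoiding NaN values downstream
    return pd.read_csv(StringIO(body)).fillna("")

# e.g. data_frame = csv_to_frame(response.status_code, response.content.decode("utf-8"))
```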

In his blog post titled "Automate Data Delivery Using the Gigasheet API", Steve discussed the 'Upload from URL' API. Here, we'll introduce a shiny new API ready to join your existing toolbox: the 'Append to a sheet' API, which adds rows to the bottom of an existing spreadsheet. Check out the black-shaded panel on the right side of the docs page! It shows the Gigasheet API call in Python format. All we need to do is hook it up with the Salesforce DataFrame we generated earlier. It's a piece of cake!

Leveraging the 'Append to a Sheet' API for Efficient Row Addition

We simply need to include a JSON-formatted payload keyed by column. For example, to append a value into the first column, the payload uses the column key B. Our report has 23 columns, so we use almost the whole alphabet, from B through X.
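Rather than typing every letter by hand, the keys can be generated from the column count. A small sketch (the helper is our own, not part of the Gigasheet API):

```python
import string

def column_keys(n_columns: int, start: int = 1) -> list:
    """Spreadsheet-style letter keys for n columns, starting at 'B' (index 1)."""
    return list(string.ascii_uppercase[start:start + n_columns])

# Pairing generated keys with row values avoids spelling out each letter:
# record = dict(zip(column_keys(len(data_frame.columns)), row))
```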

# Gigasheet sheet handle
sheet_handle = '825683d3_937e_4155_a156_08b1b253684c'

# Base URL below is our assumption -- confirm the exact endpoint in the Gigasheet API docs
url = f"https://api.gigasheet.com/dataset/{sheet_handle}/append"

headers = {
    "accept": "application/json",
    "content-type": "application/json",
    "X-Api-Key": "REDACTED"  # your Gigasheet API key
}

for index, row in data_frame.iterrows():
    payload = {"records": [{
        "B": row['First Name'],
        "C": row['Last Name'],
        "D": row['Title'],
        "E": row['Company / Account'],
        "F": row['Email'],
        "G": row['Lead Source'],
        "H": row['Street'],
        "I": row['Rating'],
        "J": row['Lead Owner'],
        "K": row['Lead ID'],
        "L": row['Universe'],
        "M": row['Testing Category'],
        "N": row['Salutation'],
        "O": row['Description'],
        "P": row['Industry'],
        "Q": row['Street Line 1'],
        "R": row['City'],
        "S": row['State/Province'],
        "T": row['Zip/Postal Code'],
        "U": row['Country'],
        "V": row['Phone'],
        "W": row['Email Bounced Reason'],
        "X": row['Email Bounced Date']
    }]}

    response = requests.post(url, json=payload, headers=headers)
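One HTTP request per row gets slow on large reports. If the endpoint accepts multiple records per call (check the Gigasheet docs before relying on this), a simple batching helper cuts the number of round-trips; the helper below is our own sketch:

```python
def chunks(items: list, size: int):
    """Yield successive batches of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Assuming the "records" list may hold many rows per request:
# for batch in chunks(records, 100):
#     requests.post(url, json={"records": batch}, headers=headers)
```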

Voila! The end result is truly remarkable as you can sit back and effortlessly witness the seamless and swift copying of the Salesforce report to the Gigasheet spreadsheet.

Final Thoughts on Using the Gigasheet API to Integrate with Salesforce

We totally nailed it! The Gigasheet and Salesforce integration using the Gigasheet API is up and running flawlessly. Now, when we execute it, the Salesforce report magically appears in the Gigasheet spreadsheet without a hitch. We can access the resulting Gigasheet spreadsheet and enjoy the flawless and effortless flow of everything in this ultra-efficient workflow. You are welcome to download the Python script from the GitHub repository without any restrictions. Don't miss out on the chance to try Gigasheet - sign up today and see for yourself!
