
Accessing your cloud servers with Python made easy

Updated: Jan 10, 2020

A few weeks ago, we had to move a client's architecture from Azure SQL (A-SQL) on the Microsoft Azure platform to Google BigQuery (BQ) on the Google Cloud Platform. We created the architecture on Google, then reworked the Python scripts used as the back-end to perform the same actions with BQ instead of A-SQL. In this post, we will go over side-by-side comparisons of the functions to use in order to:

  1. Connect to those cloud servers

  2. Read a table from those cloud servers

  3. Create a table on those cloud servers

  4. Append to a table in those cloud servers

Prerequisites

You will need to have Python installed on your machine. The rest should flow naturally in the post. We are using the following version of Python:

Python 3.7.2 [MSC v.1916 64 bit (AMD64)] on win32

Python - Azure SQL



Azure SQL - Connect to those cloud servers

In order to start interacting with your Azure SQL server, we first need to install a couple of packages that make the script much easier to write and follow. Here are the libraries we need to properly perform the tasks we are focusing on in this post:
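A sketch of the imports this side of the post relies on — the package names are the usual ones for this stack, and are assumptions rather than the post's original list:

```python
# pip install pyodbc sqlalchemy pandas
import urllib.parse      # to URL-encode the ODBC connection string

import pandas as pd      # data frames for reading and writing tables
import pyodbc            # ODBC driver bindings for SQL Server
import sqlalchemy        # engine layer that pandas talks to
```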

Using this function will allow us to connect to the designated server and database:


server: name of the server.

db: name of the database in the server.

user: username that you wish to use.

password: password linked to the user.
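A minimal sketch of such a connection helper, assuming SQLAlchemy with the pyodbc driver. The parameter names follow the description above; the driver name and the `.database.windows.net` suffix are assumptions you may need to adjust for your setup:

```python
import urllib.parse

from sqlalchemy import create_engine  # pip install sqlalchemy pyodbc


def build_odbc_string(server, db, user, password,
                      driver="ODBC Driver 17 for SQL Server"):
    """Build the raw ODBC connection string for an Azure SQL database."""
    return (
        f"DRIVER={{{driver}}};"
        f"SERVER={server}.database.windows.net;"
        f"DATABASE={db};"
        f"UID={user};"
        f"PWD={password}"
    )


def connect_to_azure_sql(server, db, user, password):
    """Return a SQLAlchemy engine connected to the given server/database."""
    # URL-encode the ODBC string so it can be embedded in the engine URL.
    params = urllib.parse.quote_plus(
        build_odbc_string(server, db, user, password))
    return create_engine(f"mssql+pyodbc:///?odbc_connect={params}")
```

The returned engine can be passed straight to pandas' `read_sql` and `to_sql`.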


🚀 Bonus 🚀

If kept as is, requests to and from the server are going to be quite slow. In order to make our script more efficient, here is an additional function to add to your script:
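One common speed-up with this stack is pyodbc's `fast_executemany` flag, which sends inserts in batches instead of row by row. A sketch, assuming the SQLAlchemy engine from the connection step:

```python
from sqlalchemy import event


def enable_fast_executemany(engine):
    """Turn on pyodbc's fast_executemany for bulk inserts on this engine."""
    @event.listens_for(engine, "before_cursor_execute")
    def _set_fast_executemany(conn, cursor, statement, params,
                              context, executemany):
        # Only relevant for executemany-style (multi-row) statements.
        if executemany:
            cursor.fast_executemany = True
```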

Azure SQL - Read a table

Now that we are connected, we can read any table and return a pandas data frame:
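A sketch of such a read helper, assuming the SQLAlchemy engine from the connection step. Note the table name is interpolated directly into the query, so it should come from trusted code, not user input:

```python
import pandas as pd


def read_table(engine, table):
    """Read an entire table from the server into a pandas data frame."""
    return pd.read_sql(f"SELECT * FROM {table}", con=engine)
```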

Azure SQL - Create a table

Next, we can create a table in this server using a pandas data frame in memory:
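A sketch of the create step; `if_exists="replace"` means re-running the function overwrites the table instead of failing:

```python
def create_table(df, engine, table):
    """Create (or overwrite) a table on the server from a data frame.

    df: in-memory pandas data frame to push to the cloud server.
    """
    df.to_sql(table, con=engine, if_exists="replace", index=False)
```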

Azure SQL - Append to a table

And finally we can append to an already existing table:
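The append case is the same call with a different `if_exists` mode:

```python
def append_to_table(df, engine, table):
    """Append a data frame to an already existing table on the server."""
    # if_exists="append" inserts rows without touching the existing schema.
    df.to_sql(table, con=engine, if_exists="append", index=False)
```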


Python - Google Cloud Platform


Google Big Query - Connect to those cloud servers

In order to start interacting with your BigQuery server, we first need to install a couple of packages that make the script much easier to write and follow. Here are the libraries we need to properly perform the tasks we are focusing on in this post:

The next step is to download a JSON key file when creating a service account under APIs & Services in the Google Cloud Console (https://console.cloud.google.com/apis). We can then point our script at this JSON file to authenticate the client:

Google Big Query - Read a table

Now that we are connected, we can read any table and return a pandas data frame:

client: connection obtained in the first step.

server: name of the server.

db: name of the database in the server.

table_id: name of the table.
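A sketch of such a read helper using the parameters above; in BigQuery terms, `server` plays the role of the project and `db` the dataset:

```python
def read_table(client, server, db, table_id):
    """Read a BigQuery table into a pandas data frame."""
    full_id = f"{server}.{db}.{table_id}"
    # client.query returns a job; to_dataframe() waits for it to finish
    # and materialises the result as a pandas data frame.
    return client.query(f"SELECT * FROM `{full_id}`").to_dataframe()
```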


Google Big Query - Create a table

Next, we can create a table in this server using a pandas data frame in memory:

client: connection obtained in the first step.

db: name of the database in the server.

table_id: name of the table you wish to create.

source_file: dataframe you wish to push to the cloud server.


Google Big Query - Append to a table

And finally we can append to an already existing table.

Note: this will only work if the schema of the new data frame is identical to the schema of the table you are appending to.

client: connection obtained in the first step.

db: name of the database in the server.

table_id: name of the table you wish to append to.

source_file: dataframe you wish to push to the cloud server.


Want more information?

Please contact us at ask@simply-bi.com
