In this tutorial, we will learn how to create a SQL database using Python and the MySQL relational database management system.
You can easily save data to a MySQL database from Python using the PyMySQL package.
Getting Started
Before you can get started, you need to do two things:
- Install Python
- Install MySQL using XAMPP
Make sure that you have Apache and MySQL activated.
You can access your database using PHPMyAdmin.
To do so, open a browser and type localhost/phpmyadmin.
Create Your First MySQL Database Using Python
Now that MySQL and PHPMyAdmin are installed on your computer, you can create your first MySQL Database with Python using the pymysql package.
Install and Load Libraries
The first step is to install and load the pymysql library.
To install the library in Anaconda, open the Anaconda Prompt.
Type this command at the prompt:
conda install -c anaconda pymysql
Press “Y” when you are asked to do so.
The package should be downloaded successfully.
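If you are not using Anaconda, the package can also be installed with pip (the PyPI name is PyMySQL, but it is imported as pymysql). A quick way to confirm the install worked is to print the version tuple:
pip install pymysql
python -c "import pymysql; print(pymysql.VERSION)"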
Then, in your Python script, import the package.
import pymysql.cursors
Create the MySQL Database
Now, we are going to establish a connection with the MySQL server.
connection = pymysql.connect(host='localhost',
                             user='root',
                             port=3306,
                             password='')
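To make sure the connection works before going further, a quick optional sketch like this prints the MySQL server version:
with connection.cursor() as cursor:
    cursor.execute('SELECT VERSION()')
    print(cursor.fetchone())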
Then, we need to execute the “CREATE DATABASE” MySQL statement using cursor().
Here, I create a “gsc_db” database because I want to back up Google Search Console data.
try:
    with connection.cursor() as cursor:
        cursor.execute('CREATE DATABASE gsc_db')
finally:
    connection.close()
Remember that it is always good to close the connection after the operation using the close() method.
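Note that CREATE DATABASE will raise an error if you run the script a second time, because gsc_db already exists by then. A small variation of the same block avoids that by adding IF NOT EXISTS:
try:
    with connection.cursor() as cursor:
        cursor.execute('CREATE DATABASE IF NOT EXISTS gsc_db')
finally:
    connection.close()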
Let’s see if the database was successfully created.
Go back to PHPMyAdmin.
Alright, it is there. We can now create a table in the database.
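Alternatively, if you would rather verify from Python than from PHPMyAdmin, a quick sketch like this reconnects and lists the databases on the server:
connection = pymysql.connect(host='localhost', user='root', password='')
try:
    with connection.cursor() as cursor:
        cursor.execute('SHOW DATABASES')
        print(cursor.fetchall())
finally:
    connection.close()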
Create the Table in MySQL
To create a new table in my gsc_db database, I will need to re-establish the connection that I closed earlier. This time, I will connect directly to the gsc_db database.
connection = pymysql.connect(host='localhost',
                             user='root',
                             port=3306,
                             password='',
                             db='gsc_db',
                             cursorclass=pymysql.cursors.DictCursor)
The cursorclass=pymysql.cursors.DictCursor argument tells pymysql to return query results as dictionaries keyed by column name instead of tuples.
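A minimal sketch to see the difference: with DictCursor, fetching a row returns a dictionary rather than a tuple.
with connection.cursor() as cursor:
    cursor.execute('SELECT DATABASE() AS current_db')
    print(cursor.fetchone())  # {'current_db': 'gsc_db'} instead of ('gsc_db',)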
Then, I will create the backup_gsc table. I will create it only if it does not already exist, using the CREATE TABLE IF NOT EXISTS statement.
try:
    with connection.cursor() as cursor:
        sqlQuery = '''CREATE TABLE IF NOT EXISTS backup_gsc(Date DATE,
                      Page TEXT,
                      Query TEXT,
                      Clicks INT,
                      Impressions INT,
                      Ctr DECIMAL(10,2),
                      Position DECIMAL(3,2))'''
        cursor.execute(sqlQuery)
finally:
    connection.close()
Here, I set up the columns of the table and gave them data types. This set-up is based on the data that I want to extract from Google Search Console.
If you don’t know which data type you should give to your columns, look at the cheat sheet.
The DECIMAL(3,2) data type means a number with 3 digits in total, 2 of which come after the decimal point (so values up to 9.99).
Congratulations, your table should be created.
To see if it worked, go to PHPMyAdmin and click on Structure.
You should have your empty table created using the columns you specified.
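You can also inspect the table from Python. A short sketch like this prints each column with its data type:
connection = pymysql.connect(host='localhost', user='root', password='', db='gsc_db',
                             cursorclass=pymysql.cursors.DictCursor)
try:
    with connection.cursor() as cursor:
        cursor.execute('DESCRIBE backup_gsc')
        for column in cursor.fetchall():
            print(column['Field'], column['Type'])
finally:
    connection.close()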
Full Python Code
import pymysql.cursors
# Create a mySQL Database
# Establish connection
connection = pymysql.connect(host='localhost',
                             user='root',
                             port=3306,
                             password='')
# Run the CREATE DATABASE statement
try:
    with connection.cursor() as cursor:
        cursor.execute('CREATE DATABASE gsc_db')
finally:
    connection.close()
# Create a table
connection = pymysql.connect(host='localhost',
                             user='root',
                             port=3306,
                             password='',
                             db='gsc_db',
                             cursorclass=pymysql.cursors.DictCursor)
try:
    with connection.cursor() as cursor:
        sqlQuery = '''CREATE TABLE IF NOT EXISTS backup_gsc(Date DATE,
                      Page TEXT,
                      Query TEXT,
                      Clicks INT,
                      Impressions INT,
                      Ctr DECIMAL(10,2),
                      Position DECIMAL(3,2))'''
        cursor.execute(sqlQuery)
finally:
    connection.close()
Load Data To Your Table
Next, we are going to see how you can save your Google Search Console data in the MySQL table that you just created.
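As a quick preview, saving a row follows the same cursor pattern: run an INSERT with parameter placeholders, then commit the transaction. The values below are made-up examples only; the real ones would come from your Search Console data:
connection = pymysql.connect(host='localhost', user='root', password='', db='gsc_db',
                             cursorclass=pymysql.cursors.DictCursor)
try:
    with connection.cursor() as cursor:
        sql = '''INSERT INTO backup_gsc (Date, Page, Query, Clicks, Impressions, Ctr, Position)
                 VALUES (%s, %s, %s, %s, %s, %s, %s)'''
        # Example values only
        cursor.execute(sql, ('2020-01-01', 'https://www.example.com/', 'example query', 3, 150, 0.02, 4.5))
    connection.commit()
finally:
    connection.close()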