Week 4: Data Gotta Go Somewhere Right?
After making ScrapSaver public, the next step is connecting my localhost server to an external database. This will allow me to analyze customer data, store user information safely, and ensure quick retrieval!
Like the last blog post, I’ll break down this post into five major steps:
- Install PostgreSQL And pgAdmin
- Create A Server
- Create A Database Within That Server
- Connect The Database In Django Settings
- Migrate Existing Data
Step 1: Installing PostgreSQL and pgAdmin. PostgreSQL is an open-source (free for me to access and use!) object-relational database system that combines the SQL language with many user-friendly features. Object-relational databases are well organized: they follow the same structure as a relational database (often consisting of tables and lists) but are also object-oriented, which lets qualities be inherited and shared across tables and classes. pgAdmin is a Graphical User Interface (GUI) manager that I can use to communicate with Postgres and its databases on local and remote servers.
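Django doesn’t talk to Postgres directly; it goes through a driver, typically psycopg2. As a quick sanity check after installing everything, a sketch like this confirms the driver is importable (the `psycopg2-binary` package name in the hint is the common pip install target, not something from this post):

```python
import importlib.util

# Look up the psycopg2 driver without importing it outright.
driver = importlib.util.find_spec("psycopg2")

if driver:
    print("psycopg2 is installed — Django can reach Postgres")
else:
    print("missing driver — try: pip install psycopg2-binary")
```

If the driver is missing, Django will raise an `ImproperlyConfigured` error the moment it tries to load a Postgres-backed `DATABASES` setting, so it’s worth checking early.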
Step 2/3: Create a Server. After logging in through pgAdmin, I created a server group called ScrapSaver and a specific server to store my database (elephant emoji). Then I created a database, which generated the following connection information:
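The connection information pgAdmin generates boils down to a handful of parameters: host, port, database name, user, and password. As a rough sketch, with hypothetical placeholder values standing in for the real ones, those parameters combine into a standard libpq-style connection URL:

```python
# Hypothetical connection parameters — placeholders, not real credentials.
params = {
    "user": "postgres",
    "password": "example-password",
    "host": "localhost",      # the local server created in pgAdmin
    "port": 5432,             # Postgres's default port
    "dbname": "scrapsaver",   # the database created inside that server
}

# libpq URL format: postgresql://user:password@host:port/dbname
url = "postgresql://{user}:{password}@{host}:{port}/{dbname}".format(**params)
print(url)
```

These are exactly the pieces Django will need in the next step, just split into separate settings fields instead of one URL.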
Step 4: Connect the database in Django settings. First, I configured the ScrapSaver database within pgAdmin:
Then I simply carried that information into Django, adding the specified parameters into ScrapSaver’s settings:
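For reference, here’s a minimal sketch of what that `DATABASES` block in `settings.py` typically looks like — the name, user, and port are assumed placeholders, and the password is pulled from an environment variable rather than hard-coded:

```python
import os

# Sketch of the DATABASES setting for a local Postgres database.
# All names and credentials below are placeholders; the real password
# should come from the environment, never be committed to source control.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",  # Postgres backend
        "NAME": "scrapsaver",                       # database created in pgAdmin
        "USER": "postgres",
        "PASSWORD": os.environ.get("DB_PASSWORD", ""),
        "HOST": "localhost",
        "PORT": "5432",
    }
}
```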
Step 5: Making the migrations. Migrations let me push changes I make to my models, like adding or deleting a field, into the database. To make the migrations, I ran:
```
python manage.py makemigrations
```
To migrate them into the Postgres database, I ran:
```
python manage.py migrate
```
As you can see, the model data is now inside the database:
Next week, I’ll connect the official public website to an AWS-hosted Postgres database!