Getting Started
In Part 1 we walked through the process of setting up a board game recommendation system using FastAPI and PostgreSQL. In Part 2 we continue this project and show how to deploy it to a cloud service, in this case Render, to make it accessible to users.
To make this a reality, we'll be working on setting up our PostgreSQL database on Render, populating it with our data, Dockerizing our FastAPI application, and finally deploying it as a Render web application.
Table of Contents
- Deploying a PostgreSQL database on Render
- Deploying a FastAPI app as a Render Web Application
– Dockerizing our application
– Pushing the Docker image to DockerHub
– Pulling from DockerHub to Render
Tooling Used
- Render
- Docker Desktop
- Docker Hub
Deploying on Render
We now have a PostgreSQL database and a FastAPI application that work locally, and it's time to deploy on a cloud service that can be accessed by a front-end application or end user (via Swagger). For this project, we'll use Render; Render is a cloud platform that, for small projects, offers a more straightforward setup experience than larger cloud providers like AWS and Azure.
To get started, navigate to Render and create a new account; then you can create a new project by selecting the 'New Project' button shown below. Note: as of this writing, Render has a trial period that should allow you to follow along at no cost for the first month. We're calling this project fastapi-test, and we navigate into the project after it's created.
Each project contains everything required for that project to work in a self-contained environment. In this case, we need two components: a database and a web server for our FastAPI application. Let's start by creating our database.

This is quite simple: we select 'Create New Service' as shown in Figure 3 and then select 'Postgres'. We're then taken to the screen shown in Figure 4 to set up the database. We name our database "fastapi-database" and select the free tier to get started. Render only allows you to use the free-tier database for a limited time, but it will be fine for this example, and if you needed to maintain a database longer term, the pricing is very reasonable.

After entering our database information and selecting 'Create', it will take a minute to set up the database, and you'll then be presented with the screen shown in Figure 5. We'll save the Internal Database URL and External Database URL variables in our .env file, as we'll need these to connect from our FastAPI application. We can then test our connection to the database using the External Database URL variable (connecting from our local machine is external to the Render environment) and create the tables from our local machine before moving on to setting up our FastAPI application.
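A minimal .env sketch with these two variables might look like the following; the hostnames and credentials below are placeholders, not real values (copy your actual URLs from the Render dashboard):

```shell
# .env — connection strings copied from the Render dashboard (placeholder values shown)
External_Database_Url=postgresql://fastapi_user:YOUR_PASSWORD@dpg-example-a.oregon-postgres.render.com/fastapi_database
Internal_Database_Url=postgresql://fastapi_user:YOUR_PASSWORD@dpg-example-a/fastapi_database
```

Note that the internal URL has no region suffix on the hostname because it only resolves inside Render's private network.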

We then run our test database connection script, which attempts to connect to our database using the External_Database_Url variable as the connection string and create a test table. Note that External_Database_Url is the full connection string for the database, so we can pass it as our single input. A successful run should result in a printout as shown in Figure 6.

import os
import socket
import sys
from urllib.parse import urlparse

from dotenv import load_dotenv
import psycopg2

# Load environment variables from .env file (override=True reloads changed values)
load_dotenv(override=True)

# Load the external database URL
database_url = os.environ.get("External_Database_Url")
if not database_url:
    print("❌ External_Database_Url not found in environment variables")
    print("Please check your .env file contains: External_Database_Url=your_render_postgres_url")
    sys.exit(1)

print(f"Database URL loaded: {database_url[:50]}...")

# Parse the database URL to extract components for testing
def parse_database_url(url):
    """Parse database URL to extract connection components"""
    parsed = urlparse(url)
    return {
        'host': parsed.hostname,
        'port': parsed.port or 5432,
        'database': parsed.path.lstrip('/'),
        'username': parsed.username,
        'password': parsed.password
    }

db_params = parse_database_url(database_url)

def test_network_connectivity():
    """Test network connectivity to the Render PostgreSQL endpoint"""
    print("\n=== Network Connectivity Tests ===")
    # 1. Test DNS resolution
    try:
        ip_address = socket.gethostbyname(db_params['host'])
        print("✅ DNS resolution successful")
    except socket.gaierror as e:
        print(f"❌ DNS resolution failed: {e}")
        return False
    # 2. Test port connectivity
    try:
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(10)  # 10 second timeout
        result = sock.connect_ex((db_params['host'], int(db_params['port'])))
        sock.close()
        if result == 0:
            print(f"✅ Port {db_params['port']} is accessible")
            return True
        else:
            print(f"❌ Port {db_params['port']} is NOT accessible")
            print("   This might indicate a network connectivity issue")
            return False
    except Exception as e:
        print(f"❌ Port connectivity test failed: {e}")
        return False

# Run connectivity tests
network_ok = test_network_connectivity()
if not network_ok:
    print("\n🔍 TROUBLESHOOTING STEPS:")
    print("1. Check your internet connection")
    print("2. Verify the Render PostgreSQL URL is correct")
    print("3. Ensure your Render PostgreSQL instance is active")
    print("4. Check if there are any Render service outages")
    sys.exit(1)

print("\n=== Attempting Database Connection ===")
# Connect to the database using psycopg2
try:
    conn = psycopg2.connect(
        host=db_params['host'],
        database=db_params['database'],
        user=db_params['username'],
        password=db_params['password'],
        port=db_params['port'],
        connect_timeout=30  # 30 second timeout
    )
    # If the connection succeeds, we can perform database operations
    cursor = conn.cursor()
    # Example: execute a simple query
    cursor.execute("SELECT version();")
    db_version = cursor.fetchone()
    print(f"✅ PostgreSQL Database Version: {db_version[0]}")
    # Test creating a simple table to verify permissions
    cursor.execute("CREATE TABLE IF NOT EXISTS connection_test (id SERIAL PRIMARY KEY, test_time TIMESTAMP DEFAULT NOW());")
    conn.commit()
    print("✅ Database permissions verified - can create tables")
    cursor.close()
    conn.close()
    print("✅ psycopg2 connection successful!")
except psycopg2.OperationalError as e:
    print(f"❌ Database connection failed: {e}")
    if "timeout" in str(e).lower():
        print("\n🔍 TIMEOUT TROUBLESHOOTING:")
        print("- Check your internet connection")
        print("- Verify the Render PostgreSQL URL is correct")
        print("- Check if the Render service is experiencing issues")
    elif "authentication" in str(e).lower():
        print("\n🔍 AUTHENTICATION TROUBLESHOOTING:")
        print("- Verify the database URL contains correct credentials")
        print("- Check if your Render PostgreSQL service is active")
        print("- Ensure the database URL hasn't expired or changed")
    sys.exit(1)
except Exception as e:
    print(f"❌ Unexpected error: {e}")
    sys.exit(1)

# If we get here, the connection was successful
print("\n✅ All tests passed! Render PostgreSQL connection is working.")
print(f"✅ Connected to database: {db_params['database']}")
print("✅ Ready for use in your application!")
Loading the Database
Now that we've verified that we can connect to our database from our local machine, it's time to set up our database tables and populate them. To load our database, we'll use our src/load_database.py file; we walked through the individual pieces of this script at the start of this article, so we won't go into further detail on it here. The only notable points are that we're again using External_Database_Url as our connection string, and at the end we're using the test_table function defined as part of our DatabaseHandler class. This function attempts to connect to the table name passed to it and returns the number of rows in that table.
Running this script should result in output like that shown in Figure 11, where each of the tables is created, and at the end we verify that we can return data from them and show that the output row counts match the input row counts.

import os
import uuid

import pandas as pd
from dotenv import load_dotenv

from utils.db_handler import DatabaseHandler

# Load environment variables from .env file
load_dotenv(override=True)

# PostgreSQL connection URL for Render
URL_database = os.environ.get("External_Database_Url")

# Initialize DatabaseHandler with the connection URL
engine = DatabaseHandler(URL_database)

# Load initial user data
users_df = pd.read_csv("Data/steam_users.csv")
games_df = pd.read_csv("Data/steam_games.csv")
user_games_df = pd.read_csv("Data/steam_user_games.csv")
user_recommendations_df = pd.read_csv("Data/user_recommendations.csv")
game_tags_df = pd.read_csv("Data/steam_game_tags.csv")

# Define queries to create tables
user_table_creation_query = """CREATE TABLE IF NOT EXISTS users (
    id UUID PRIMARY KEY,
    username VARCHAR(255) UNIQUE NOT NULL,
    password VARCHAR(255) NOT NULL,
    email VARCHAR(255) NOT NULL,
    role VARCHAR(50) NOT NULL
)
"""
game_table_creation_query = """CREATE TABLE IF NOT EXISTS games (
    id UUID PRIMARY KEY,
    appid VARCHAR(255) UNIQUE NOT NULL,
    name VARCHAR(255) NOT NULL,
    type VARCHAR(255),
    is_free BOOLEAN DEFAULT FALSE,
    short_description TEXT,
    detailed_description TEXT,
    developers VARCHAR(255),
    publishers VARCHAR(255),
    price VARCHAR(255),
    genres VARCHAR(255),
    categories VARCHAR(255),
    release_date VARCHAR(255),
    platforms TEXT,
    metacritic_score FLOAT,
    recommendations INTEGER
)
"""
user_games_query = """CREATE TABLE IF NOT EXISTS user_games (
    id UUID PRIMARY KEY,
    username VARCHAR(255) NOT NULL,
    appid VARCHAR(255) NOT NULL,
    shelf VARCHAR(50) DEFAULT 'Wish_List',
    rating FLOAT DEFAULT 0.0,
    review TEXT
)
"""
recommendation_table_creation_query = """CREATE TABLE IF NOT EXISTS user_recommendations (
    id UUID PRIMARY KEY,
    username VARCHAR(255),
    appid VARCHAR(255),
    similarity FLOAT
)
"""
game_tags_creation_query = """CREATE TABLE IF NOT EXISTS game_tags (
    id UUID PRIMARY KEY,
    appid VARCHAR(255) NOT NULL,
    category VARCHAR(255) NOT NULL
)
"""

# Drop any existing tables so we start from a clean slate
engine.delete_table('user_recommendations')
engine.delete_table('user_games')
engine.delete_table('game_tags')
engine.delete_table('games')
engine.delete_table('users')

# Create tables
engine.create_table(user_table_creation_query)
engine.create_table(game_table_creation_query)
engine.create_table(user_games_query)
engine.create_table(recommendation_table_creation_query)
engine.create_table(game_tags_creation_query)

# Ensure each row of each dataframe has a unique ID
if 'id' not in users_df.columns:
    users_df['id'] = [str(uuid.uuid4()) for _ in range(len(users_df))]
if 'id' not in games_df.columns:
    games_df['id'] = [str(uuid.uuid4()) for _ in range(len(games_df))]
if 'id' not in user_games_df.columns:
    user_games_df['id'] = [str(uuid.uuid4()) for _ in range(len(user_games_df))]
if 'id' not in user_recommendations_df.columns:
    user_recommendations_df['id'] = [str(uuid.uuid4()) for _ in range(len(user_recommendations_df))]
if 'id' not in game_tags_df.columns:
    game_tags_df['id'] = [str(uuid.uuid4()) for _ in range(len(game_tags_df))]

# Populate the five tables with data from the dataframes
engine.populate_table_dynamic(users_df, 'users')
engine.populate_table_dynamic(games_df, 'games')
engine.populate_table_dynamic(user_games_df, 'user_games')
engine.populate_table_dynamic(user_recommendations_df, 'user_recommendations')
engine.populate_table_dynamic(game_tags_df, 'game_tags')

# Test that the tables were created and populated correctly
print(engine.test_table('users'))
print(engine.test_table('games'))
print(engine.test_table('user_games'))
print(engine.test_table('user_recommendations'))
print(engine.test_table('game_tags'))
Deploying a FastAPI Application on Render
We now have the first half of our project deployed on Render, and it's time to set up our FastAPI application. To do this, we're going to use Render's Web Application hosting service, which will allow us to deploy our FastAPI app as a web application that can be accessed by external services. If we wanted to build a full-stack application, we could then allow our front end to send requests to the FastAPI application on Render and return data to the user. However, because we're not interested in building a front-end component at the moment, we'll instead interact with our app through the Swagger docs.
Containerizing our Application with Docker
We've set up our FastAPI project in a local environment, but now we need to transfer it, with all the code, dependencies, and environment variables, to a container on Render. This could be a daunting challenge. Fortunately, Docker handles all the complicated pieces and allows us to do exactly that with a simple configuration file and a few commands. For those who haven't used Docker, there's a great tutorial here. The brief overview is that Docker is a tool that simplifies the process of deploying and managing applications by allowing us to package our application with all its dependencies as an image and then deploy that image to a service like Render. In this project, we use DockerHub as our image repository, which serves as a central version-controlled storage area for our image, which we can then pull into Render.
Our overall flow for this project can be thought of like this: FastAPI app running locally → a 'snapshot' is taken with Docker and saved as a Docker image → that image is pushed to DockerHub → Render pulls this image and uses it to spin up a container that runs the application on a Render server. Getting started with this process, which we'll walk through next, requires having Docker Desktop installed. Docker has a straightforward installation process, which you can get started on here: https://www.docker.com/products/docker-desktop/
Additionally, if you don't have one already, you'll need a Docker Hub account, as this will serve as the repository to save Docker images to and then pull them into Render. You can create a Docker Hub account here: https://hub.docker.com/.
Building a Docker Image
To create a Docker image for our project, first make sure that Docker Desktop is running; if it isn't, you'll likely get an error when trying to create a Docker image. To make sure it's running, open the Docker Desktop application from your search bar or desktop, click on the three dots in the bottom left as shown below, and confirm you see the green dot followed by 'Docker Desktop is running'.

Next, we need to tell Docker how to build our image, which is done by defining a Dockerfile. Our Dockerfile can be seen in Figure 9. We save it in our top-level directory, and it provides the instructions that tell Docker how to package our application into an image that can be deployed on a different piece of hardware. Let's walk through this file to understand what it's doing.
- FROM: Choosing the base image: The first line in our Dockerfile specifies what base image we want to extend for our application. In this case, we're using the python:3.13-slim-bullseye image, which is a lightweight Debian-based image that will serve as the base for our application.
- WORKDIR: Changing the working directory: here we're setting the default directory inside our container to /app
- RUN: Checking for updates to system dependencies
- COPY: Copying the requirements.txt file; it's essential that requirements.txt is up to date and contains all libraries required for the project, or the image won't run correctly when we try to spin it up
- RUN: Installing from our requirements.txt file
- COPY: Copying our entire project from our local directory to /app, which we created in step 2
- RUN: Creating a logs directory at /app/logs
- EXPOSE: Documenting that the port we'll be exposing is port 8000
- ENV: Setting our Python path to /app
- CMD: Running our FastAPI app using Uvicorn, with our app set to the one defined in src.main:app, served on port 8000
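Putting the steps above together, a Dockerfile matching that description might look like the following sketch; it is reconstructed from the bullet points rather than copied from the project, so treat the exact commands (e.g. the apt-get line) as assumptions:

```dockerfile
# Base image: lightweight Debian-based Python
FROM python:3.13-slim-bullseye

# Default working directory inside the container
WORKDIR /app

# Check for updates to system dependencies
RUN apt-get update && rm -rf /var/lib/apt/lists/*

# Copy and install Python requirements first (better layer caching)
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the full project into /app
COPY . /app

# Directory for application logs
RUN mkdir -p /app/logs

# Document the port the app listens on
EXPOSE 8000

# Make the project importable as src.*
ENV PYTHONPATH=/app

# Launch the FastAPI app via Uvicorn
CMD ["uvicorn", "src.main:app", "--host", "0.0.0.0", "--port", "8000"]
```

Installing requirements before copying the rest of the project means Docker can reuse the cached dependency layer when only application code changes.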

With our Dockerfile defined, we now have a set of instructions we can give Docker to containerize our application into an image that we can then push to Docker Hub. We can do this with a few commands from the VS Code terminal, shown below. Each of these lines needs to be run individually in the VS Code terminal from the top directory of your project.
- First, we build our Docker image, which will likely take a minute or two. In this case, we're naming our image 'recommendersystem'
- Next, we tag our image; the syntax here is image_name user_name/docker_hub_folder:image_name_on_dockerhub
- Finally, we push our image to DockerHub, again specifying user_name/docker_hub_folder:image_name_on_dockerhub
docker build -t recommendersystem .
docker tag recommendersystem seelucas/fastapi_tutorial:fastapi_on_render
docker push seelucas/fastapi_tutorial:fastapi_on_render
After this is done, we should be able to log in to DockerHub, navigate to our project, and see that we have an image whose name matches what we gave it in the previous three commands, in this case fastapi_on_render.

Pulling the Docker Image into Render
Now we have our Docker image on DockerHub, and it's time to deploy that image on Render. This can be done by navigating to the same project we created our database in, "fastapi-test", selecting "New" in the top right, and then selecting "Web Service", as our FastAPI app will be deployed as a web application.
Because we're deploying our image from DockerHub, we specify that our Source Code is an Existing Image, and as shown in Figure 11, we paste the DockerHub path to the image we want to deploy into 'Image URL' in Render. We then get a notification that this is a private image, which means we'll need to create a DockerHub access token that we can then use to securely pull the image from DockerHub into Render.

Fortunately, creating a DockerHub access token is easy; we navigate to our DockerHub account → Settings → Personal Access Token. The screen should look like Figure 12. We provide an access token name, expiration date, and permissions. Since we're pulling the image into Render, we only need read access rather than write or delete, so we select that.

Finally, selecting 'Generate' will generate our token, which we then need to copy over to Render and enter as shown in Figure 13.

Once we've selected 'Add Credential' as shown above, it will load for a minute as the credentials are saved. We'll then be taken back to the previous screen, where we can select the credentials to use to connect to DockerHub. In this case, we'll use the tutorial credentials we just created and select Connect. We'll then have established a connection we can use to pull our Docker image from DockerHub to Render for deployment.

On the next page, we continue setting up our Render web application by selecting the free option and then, importantly, under Environment Variables, we copy and paste our .env file. While we don't use all the variables in this file, we do use 'Internal_Database_Url', which is the URL that FastAPI will look for in our main.py file. Without this, we won't be able to connect to our database, so it's essential that we provide it. Note: for testing, we previously used 'External_Database_Url' because we were running the script from our local machine, which is external to our Render environment; however, here both the database and web server are in the same Render environment, so we use Internal_Database_Url in main.py.
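Switching between the two URLs can be as simple as reading whichever variable applies to where the code is running. A minimal sketch follows; `get_database_url` is a hypothetical helper for illustration, not code from the project:

```python
import os

def get_database_url(on_render: bool) -> str:
    """Return the right connection string: the internal URL when the app
    runs inside Render's network, the external URL when running locally."""
    # Hypothetical helper; variable names match the article's .env file
    key = "Internal_Database_Url" if on_render else "External_Database_Url"
    url = os.environ.get(key)
    if url is None:
        raise RuntimeError(f"{key} is not set; add it to your environment variables")
    return url
```

Locally the value comes from the .env file loaded with python-dotenv; on Render it comes from the environment variables pasted into the web service settings.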
After entering our environment variables, we then choose 'Deploy Web Service'.

The service will take a few minutes to deploy, but then you should get a notification like the one below that the service has deployed, with a Render link on top where we can access it.

Navigating to this link will take us to the Hello World method; if we add /docs to the end of it, we'll be taken to the Swagger docs in Figure 17. Here we can test that our FastAPI web application is connected to our database by using the Fetch All Users method. We can see below that this does indeed return data.

Finally, we want to check whether our user recommendation system is dynamically updating. In our earlier API call, we can see that there's a user 'user_username' in our database. Using the Fetch Recommended Game method with this username, we can see the top match is appid = B08BHHRSPK.

We update our user's liked games by choosing a random one from our games, appid = B0BHTKGN7F, which turns out to be 'The Elder Scrolls: Skyrim Boardgame', and using our user_games POST method.

Adding a game to our user games table is supposed to automatically trigger the recommender pipeline to rerun for that user and generate new recommendations. If we navigate to our console, we can see that this appears to have happened, as we get the new user recommendations generated message shown below.

If we navigate back to our Swagger docs, we can try the fetch recommendation method again, and we see in Figure 21 that we do indeed have a different list of recommendations than before. Our recommender pipeline is now automatically updating as users add more data and is accessible beyond our local environment.

Wrapping Up
In this project, we've shown how to set up and deploy a recommendation system using a FastAPI interaction layer with a PostgreSQL database to generate intelligent board game recommendations for our users. There are further steps we could take to make this system more robust, like implementing a hybrid recommendation system as we collect more user data or enabling user tagging to capture more features. Additionally, although we didn't cover it, we did use a GitHub workflow to rebuild and push our Docker image whenever there's a new update to our main branch, and this code is available in .github/workflows. This greatly sped up development, as we didn't have to manually rebuild our Docker image whenever we made a small change.
I hope you enjoyed reading and that this helps you build and deploy your projects with FastAPI.
LinkedIn: https://www.linkedin.com/in/lucas-see-6b439188/
Email: [email protected]
Figures: All images, unless otherwise noted, are by the author.
Links:
- GitHub Repository for Project: https://github.com/pinstripezebra/recommender_system
- FastAPI Docs: https://fastapi.tiangolo.com/tutorial/
- Docker Tutorial: https://www.youtube.com/watch?v=b0HMimUb4f0
- Docker Desktop Download: https://www.docker.com/products/docker-desktop/
- Docker Hub: https://hub.docker.com/