FastAPI is a modern, fast (high-performance), web framework for building APIs with Python 3.6+ based on standard Python type hints. It is built on top of Starlette for the web parts, and Pydantic for the data parts. FastAPI is known for its speed, simplicity, and ease of use in creating APIs.
AWS Lambda is a serverless computing service provided by Amazon Web Services. Using AWS Lambda, you can run your code without provisioning or managing servers. AWS Lambda automatically scales your application by running code in response to each trigger, which helps you to build highly scalable applications.
In this tutorial, we will learn how to deploy a FastAPI application on AWS Lambda in just 9 minutes. Let’s get started!
Prerequisites:
- A basic understanding of Python and FastAPI.
- An AWS account with the necessary permissions to create AWS Lambda functions.
Step 1: Install required libraries
First, let’s install the required libraries for our FastAPI application:
pip install fastapi uvicorn
Step 2: Create a FastAPI application
Next, let’s create a simple FastAPI application. Create a file named main.py and add the following code:
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"Hello": "World"}
This code creates a simple FastAPI application with a single endpoint that returns a JSON response.
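Additional routes follow the same pattern. As a purely illustrative sketch (this extra endpoint is not needed for the rest of the tutorial), a route with a path parameter could look like this:

@app.get("/items/{item_id}")
def read_item(item_id: int):
    # FastAPI parses and validates item_id as an int from the URL path
    return {"item_id": item_id}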
Step 3: Test the FastAPI application locally
Before deploying our FastAPI application on AWS Lambda, let’s test the application locally. Run the following command to start the FastAPI application using Uvicorn:
uvicorn main:app --reload
Visit http://127.0.0.1:8000 in your browser and you should see the JSON response {"Hello": "World"}.
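You can also check the endpoint from a second terminal with curl; it should return the same JSON:

curl http://127.0.0.1:8000/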
Step 4: Package the FastAPI application for AWS Lambda
To deploy our FastAPI application on AWS Lambda, we need to package it along with all its dependencies into a ZIP file, and we need an adapter that translates the events Lambda receives from API Gateway into requests our ASGI application understands. We will use Mangum for this. Create a file named lambda_function.py and add the following code:
from mangum import Mangum

from main import app

# Mangum wraps the FastAPI (ASGI) app so API Gateway events can invoke it as a Lambda handler
handler = Mangum(app)
Next, install the Mangum adapter:
pip install mangum
Now, create a ZIP file containing main.py, lambda_function.py, and the required dependencies. Lambda expects the dependencies at the root of the archive, so install them into a local folder first and then zip everything together:
pip install --target ./package fastapi mangum
cd package
zip -r ../lambda.zip .
cd ..
zip -g lambda.zip main.py lambda_function.py
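Note: if you build the ZIP on macOS or Windows, compiled dependencies (for example pydantic-core) may fail to import on Lambda because the wheels were built for your local platform rather than Lambda's Linux environment. One workaround sketch, assuming an x86_64 Lambda function on the Python 3.8 runtime, is to ask pip for Linux-compatible wheels explicitly before zipping:

pip install fastapi mangum --target ./package --platform manylinux2014_x86_64 --implementation cp --python-version 3.8 --only-binary=:all: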
Step 5: Create an AWS Lambda function
Now, let’s create an AWS Lambda function for our FastAPI application. Follow these steps:
- Sign in to the AWS Management Console and open the Lambda console.
- Click on the Create function button.
- Choose Author from scratch, give your Lambda function a name, and select the Python 3.8 runtime.
- Upload the lambda.zip file that we created in the previous step.
- Set the handler to lambda_function.handler.
- Create or choose an existing execution role with sufficient permissions.
- Click on Create function to create the Lambda function.
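If you prefer the command line, later code updates can also be pushed with the AWS CLI instead of re-uploading through the console (a sketch assuming the function already exists and is named my-fastapi-app, which is a placeholder):

aws lambda update-function-code --function-name my-fastapi-app --zip-file fileb://lambda.zip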
Step 6: Configure API Gateway
To create a public endpoint for our FastAPI application, we need to configure API Gateway. Follow these steps:
- Click on Add trigger in the Lambda function.
- Choose API Gateway as the trigger type.
- In the Create a new API dropdown, choose HTTP API.
- Click on Add to create the API Gateway trigger.
Step 7: Deploy the API Gateway
To make the API Gateway accessible publicly, we need to deploy the API. Follow these steps:
- Go to the API Gateway service in the AWS Management Console.
- Select the API that is connected to your Lambda function.
- Click on Actions and choose Deploy API.
- Select the deployment stage (e.g., prod) and click on Deploy.
Step 8: Test the deployed FastAPI on AWS Lambda
Now that our FastAPI application is deployed on AWS Lambda, we can test it by hitting the public endpoint provided by API Gateway. Copy the URL of the deployed API and paste it into your browser. You should see the JSON response {"Hello": "World"}.
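You can also test it from a terminal with curl (the URL below is only a placeholder; use the invoke URL that API Gateway shows for your API, including any stage or route path it appends):

curl https://your-api-id.execute-api.us-east-1.amazonaws.com/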
Congratulations! You have successfully deployed a FastAPI application on AWS Lambda. You can now expand your application, add more endpoints, and integrate it with other AWS services.
I hope this tutorial was helpful in guiding you through the process of deploying FastAPI on AWS Lambda in just 9 minutes. Thank you for reading!
Comments:
Does this work with StreamingResponse?
Hi Eric, the way you showed to deploy the code using zip looks easy, but in production and working in companies is there any professional way, like using Docker or something else, so we can directly run some command in Lambda to install the requirements?
Once I tried to include another router in main, Lambda kept giving me an internal server error.
Do I need to enable CORS for FastAPI, or when I deploy it on AWS will it allow me to send external requests?
Why would you deploy FastAPI on top of Lambda instead of using API Gateway and Lambda functions that run only the required logic? It's like forcing a use case in a service that is not designed for that purpose.
Fantastic tutorial! I really appreciate your work.
Thanks, it is a great tutorial. In a short video, you explained the process nicely.
Great Video
I'd recommend using the FastAPI version shown in this video, otherwise things might not work; it didn't work in my case.
How do I set restrictions so that nobody can hit my API too many times and I end up with a big bill from AWS?
Kindly make one on S3 uploads and triggers using Lambda and FastAPI, bro.
I get an internal server error despite changing my handler details. What could be the reason?
Really great tutorial. Thank you so much for sharing. FYI – to people who may be reading the comments – if you set up your FastAPI with .env locally and are running with dotenv, you will need to change all those os.getenv() calls to ENV_VAR = os.environ['NAME'] in Lambda.
Mangum is not maintained, so I don't think we should use it.
This actually worked, Gemini was giving me wrong information.
Nice!
The Lambda function worked, but when I uploaded the zip file, I got an "internal server error" which I traced to "errorMessage": "Unable to import module 'main': No module named 'pydantic_core._pydantic_core' ". Any ideas?
Hey hey @codingwithroby, after a few months – would you still consider Mangum a good solution? It has no support for Python 3.11 and 3.12 and seems to be becoming an abandoned project, so starting to use it today may result in problems tomorrow. If you agree – do you know any alternative that is as easy to use as Mangum?
hello my friend
what theme are you using in vscode?
nice video, man, really helpful
pip3 install -r requirements.txt --platform manylinux2014_x86_64 --target=dependencies --implementation cp --python-version 3.10 --only-binary=:all: --upgrade openai
Command updated (change the python version)