17th December 2020
When it comes to RESTful APIs, Python isn't the first option that comes to mind, is it? Of course, it's possible; I've used Flask to build a couple and it works quite well. But when it comes to speed and latency, you're probably better off with Node.js/Express or Go.
What happens, though, when you have to expose a service from a system that is best built with Python? ML models? Bots? I've never had to deploy a model, but I've always wondered what a good way to do so would be. Looking into this, I came across FastAPI, which pretty much blows the competition out of the water.
FastAPI, if it wasn't obvious already, is fast. It's lightweight as well. More importantly, it has native asynchronous support, unlike many established Python web frameworks. This is what helps it perform on par with Node.js and Go.
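For example, a route handler can be declared with `async def` and await non-blocking work directly. A minimal sketch (the endpoint and the simulated I/O call are just illustrative):

import asyncio

from fastapi import FastAPI

app = FastAPI()

@app.get("/ping")
async def ping():
    # awaiting yields control to the event loop instead of blocking a worker,
    # standing in here for an async database or HTTP call
    await asyncio.sleep(0.1)
    return {"ping": "pong"}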
Not only is it fast, but it's also very easy to set up and get going with. Minimal code. Automatic interactive API documentation to play around with your endpoints. Easy data validation.
This is all the code you need to get a GET and POST endpoint running:
from fastapi import FastAPI

app = FastAPI()

items = {
    0: "Banana",
    1: "Orange",
    2: "Kiwi",
}


# GET routes
@app.get("/")
def index():
    return items


@app.get("/items/{item_id}")
def get_item(item_id: int):
    return {
        item_id: items[item_id]
    }


# POST route
@app.post("/items")
def create_item(item_name: str):
    item_id = max(items.keys()) + 1
    items[item_id] = item_name
    return {
        item_id: item_name
    }
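Assuming the file is saved as `main.py`, you can serve it with an ASGI server like Uvicorn by running `uvicorn main:app --reload`, then hit the endpoints at `http://127.0.0.1:8000`.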
The decorators above the functions specify the HTTP method of each endpoint while defining its path and any parameters.
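Speaking of parameters, the easy data validation mentioned earlier comes from Pydantic: declare a model for the request body and FastAPI parses and validates incoming JSON against it. A quick sketch (the Item model here is just illustrative, not part of the example above):

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float
    in_stock: bool = True  # optional field with a default

@app.post("/items")
def create_item(item: Item):
    # item arrives already parsed and validated against the Item model;
    # a request missing the name or with a non-numeric price gets a 422 response
    return item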
Probably my favourite feature of FastAPI is the automatically generated API documentation page found at the `/docs` endpoint. It shows you the different endpoints available, as shown below. Clicking on one of them reveals the expected fields and schema. Even better, the endpoints can also be tested with custom input right from the page.
Setting up an API shouldn't be a hassle. FastAPI makes sure of that so you can get on with more important work. I've only scratched the surface of what you can do with it. Most importantly, it's simple, Pythonic and fast, making it perfect for lots of use cases. A definite go-to if you're trying to serve an ML model. Sorry, Flask!