Flama
Flama is a Python library that establishes a standard framework for the development and deployment of APIs, with a special focus on machine learning (ML). The main aim of the framework is to make the deployment of ML APIs ridiculously simple, reducing the entire process (when possible) to a single line of code. The name stands for Framework for the development of Lightweight Applications and Machine-learning Automation.
The library builds on Starlette and provides an easy-to-learn philosophy that speeds up the building of highly performant GraphQL, REST and ML APIs. It is also an ideal solution for developing asynchronous, production-ready services, offering automatic deployment of ML models.
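To give a flavour of the API, below is a minimal sketch of a Flama application; the metadata values and module name are illustrative. Since a Flama app is a standard ASGI application, it can be served with any ASGI server such as Uvicorn.

```python
from flama import Flama

app = Flama(
    title="Hello-Flama",            # illustrative metadata
    version="1.0.0",
    description="My first Flama API",
)

@app.route("/")
def home():
    """Return a greeting message as JSON."""
    return {"message": "Hello, Flama"}

# Run with an ASGI server, e.g.:  uvicorn main:app
```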
Some remarkable characteristics:
- Generic classes for API resources, with the convenience of standard CRUD methods over SQLAlchemy tables.
- A schema system (based on Pydantic, Marshmallow, or Typesystem) which allows the inputs and outputs of endpoints to be declared very easily, with the convenience of reliable, automatic data-type validation (see the first sketch after this list).
- Dependency injection, which eases the handling of parameters needed in endpoints via the use of Components. Flama ASGI objects such as Request, Response and Session are defined as Components, ready to be injected into your endpoints (see the second sketch after this list).
- Components as the base of the plugin ecosystem, allowing you to create custom ones or use those already defined, injected into your endpoints as parameters.
- Auto-generated API schema following the OpenAPI standard.
- Auto-generated documentation, served at a Flama docs endpoint.
- Automatic handling of pagination, with several methods at your disposal, such as limit-offset and page numbers, to name a few.
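As an illustration of the schema system, the sketch below declares a Pydantic model and uses it to annotate an endpoint so that the request body and the response are validated against it. The model and route names are made up, and the exact annotation style can differ between Flama versions, so treat this as a sketch and consult the official documentation for the canonical form.

```python
from pydantic import BaseModel
from flama import Flama

app = Flama()

# Illustrative schema; a Marshmallow or Typesystem schema could play the same role.
class Puppy(BaseModel):
    name: str
    age: int

@app.route("/puppies/", methods=["POST"])
def create_puppy(puppy: Puppy) -> Puppy:
    """Echo the validated payload back; input and output validation
    is driven by the type annotations."""
    return puppy
```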
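The dependency-injection mechanism can be sketched along these lines: a custom Component knows how to build a value (here a hypothetical Greeter helper), and any endpoint that annotates a parameter with that type gets the value injected. The names are illustrative, and the Component interface shown (a resolve method registered via the components argument) follows the pattern used in Flama's documentation; treat it as a sketch rather than the definitive API.

```python
from flama import Flama, Component

# Hypothetical helper object to be injected into endpoints.
class Greeter:
    def greet(self, name: str) -> str:
        return f"Hello, {name}!"

# The Component tells Flama how to build a Greeter whenever an endpoint asks for one.
class GreeterComponent(Component):
    def resolve(self) -> Greeter:
        return Greeter()

app = Flama(components=[GreeterComponent()])

@app.route("/greet/{name}")
def greet(name: str, greeter: Greeter):
    # `name` comes from the path; `greeter` is injected by GreeterComponent.
    return {"message": greeter.greet(name)}
```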
Now that you know the main advantages of Flama, you can learn how to install it and run your first API here.