
How to Create CRUD REST API with AWS Chalice

Learn how to build a book database API with AWS Chalice serverless technology.

May 26, 2021

TL;DR: In this article, you will learn how to build serverless Python applications with AWS Chalice. You will build a book database REST API to store book entries and perform CRUD operations on the API. The code for this tutorial is available here: AWS Chalice API sample.

Introduction

AWS Chalice is a serverless Python framework developed by Amazon. Its syntax and semantics are very similar to Flask's. Like other serverless technologies, Chalice lets developers concentrate on application development without having to manage servers. Under the hood, it uses AWS Lambda and the Amazon API Gateway.

You will use Chalice to build a book database REST API. Your users will be able to store book entries and perform CRUD operations on the API. Entries will be stored in DynamoDB, a NoSQL database service offered by Amazon that stores data in the form of tables, items, and attributes.

Finally, you will see how to integrate Auth0 authentication into your AWS Chalice applications. You will also use Auth0 for authorization, which lets you distinguish the endpoints that the general public can access from the ones reserved for authenticated users of the application.

Prerequisites

  1. Python 3.7 or later versions
  2. pip
  3. venv
  4. AWS account
  5. AWS CLI

Ensure you have Python and pip installed on your machine. You can download Python for your operating system from the official Python website. You may install pip by following these instructions.

venv is a module for creating isolated environments for individual projects on a machine. If you installed Python 3.7 or a later version, venv comes bundled with the standard library.

You need an AWS account. You may sign up for one. Check the AWS CLI v2 user guide if you do not have AWS CLI installed on your machine.

1. Configure AWS Credentials

To build applications with Chalice, you need to set up your AWS credentials on the AWS CLI. This will allow you to use the Amazon API Gateway and AWS Lambda.

If you installed AWS CLI successfully, then you can use the command below to configure your AWS credentials:

aws configure

You will be prompted for the AWS Access Key ID, AWS Secret Access Key, and default region name. Supply your AWS access keys and a region of your choice; you can use any available AWS region. Finally, you can leave the default output format blank to keep the default option, None:

AWS Access Key ID [None]: ****************ABCD
AWS Secret Access Key [None]: ****************abCd
Default region name [None]: us-west-2
Default output format [None]:

If you don't have AWS credentials yet, you will need to set up an IAM user in the IAM Console. Visit Creating IAM users for a guide on how to create IAM Users. Once you have created the IAM user, you can then go through the instructions for setting up the AWS Access keys.

To check whether you set up your credentials correctly, run the following command:

aws ec2 describe-regions

You should see the list of all regions where EC2 is available.

2. Set up Project Dependencies

To start, you will need a virtual environment for your project. Create a new directory, e.g., chalice-sample, enter it, create the virtual environment, and activate it:

mkdir chalice-sample
cd chalice-sample
python -m venv env
source env/bin/activate
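
If you are on Windows, activate the environment with the activation script in the Scripts folder instead:

env\Scripts\activate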

Now, install boto3, the AWS SDK for Python. It will allow you to perform the DynamoDB database operations needed for the project.

pip install boto3

Then, install AWS Chalice:

pip install chalice

Next, set up a new AWS Chalice project with the command below:

chalice new-project chalice-api-sample

In the command above, the new-project argument tells Chalice to create a new project, and chalice-api-sample is the name you chose for it.

If you check the new project structure generated by the chalice command, it will look like the following:

chalice-api-sample
├── app.py
├── .chalice
│   └── config.json
├── .gitignore
└── requirements.txt

Let's look at what each of the generated files means:

  • app.py: this is where the application logic for AWS Lambda lives.
  • .chalice: this folder holds the application configuration; later, you will also place the IAM policy and database template here.
  • .gitignore: the list of files that won't be pushed to the remote repository if you track the project with Git.
  • requirements.txt: the list of app dependencies that Chalice packages for deployment (see the sketch right after this list).
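
As a rough sketch, a requirements.txt for this project could list just boto3; the version pin below is illustrative rather than taken from the project, and boto3 also ships with the AWS Lambda Python runtime, so pinning it mainly keeps your local and deployed versions aligned:

boto3>=1.17.0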

3. Database Configuration and Deployment Setup

You have to define the necessary deployment and database configuration for the project.

To start, edit the config.json file located in the .chalice directory. The config.json file contains the deployment configuration.

{
  "version": "2.0",
  "app_name": "chalice-api-sample",
  "stages": {
    "dev": {
      "api_gateway_stage": "api",
      "autogen_policy": false
    }
  }
}

The configuration above defines a deployment stage named dev. Chalice assumed that stage name for you by default, so you can rename it to anything you like. Note that in AWS API Gateway, a stage refers to a deployment of your API.

  • api_gateway_stage is the URL prefix for the API.
  • autogen_policy is a setting that specifies if Chalice should automatically set up an IAM policy based on the application code. If it is set to false, it will check the policy-<stage-name>.json file for the IAM policy that you defined.

Next, create a new file called policy-dev.json inside the .chalice directory. It will contain the policy for reading and writing from the DynamoDB database.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*",
      "Effect": "Allow"
    },
    {
      "Action": [
        "dynamodb:PutItem",
        "dynamodb:DeleteItem",
        "dynamodb:UpdateItem",
        "dynamodb:GetItem",
        "dynamodb:Scan",
        "dynamodb:Query"
      ],
      "Resource": ["arn:aws:dynamodb:*:*:table/my-demo-table"],
      "Effect": "Allow"
    }
  ]
}

In the file above, you allowed the application to create log groups and log streams and to write log events. You also granted the DynamoDB actions needed to put, get, update, delete, scan, and query items in the database. Finally, you named your DynamoDB table my-demo-table.

You will use AWS CloudFormation to create and set up the DynamoDB database. CloudFormation is a tool for specifying the resources and dependencies of your AWS project and provisioning them for you. A CloudFormation template, in JSON or YAML format, describes the resources that make up your stack; CloudFormation uses that template to create and configure those resources.

Thus, create a new file called dynamodb_cf_template.yaml inside the .chalice directory. The file will be the template that CloudFormation will use to create the database.

AWSTemplateFormatVersion: "2010-09-09"
Resources:
  chaliceDemo:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: my-demo-table
      AttributeDefinitions:
        - AttributeName: "id"
          AttributeType: "S"
        - AttributeName: "author"
          AttributeType: "S"
      KeySchema:
        - AttributeName: "id"
          KeyType: "HASH"
        - AttributeName: "author"
          KeyType: "RANGE"

      ProvisionedThroughput:
        ReadCapacityUnits: "5"
        WriteCapacityUnits: "5"

Outputs:
  TableName:
    Value: !Ref "chaliceDemo"
    Description: Name of the newly created DynamoDB table

The template above defines the key attributes of the DynamoDB table to be created: the id of a book and the author attribute, which stores the name of the book's author. The KeySchema attributes make up the primary key of the table: id is the partition (HASH) key and author is the sort (RANGE) key. You do not need to include non-key attributes in the KeySchema and AttributeDefinitions arrays. You may check the official AWS documentation for more information on attribute definitions when creating DynamoDB tables.

Then, run the following command from the .chalice directory to create the DynamoDB database that you defined above.

aws cloudformation deploy\
 --template-file dynamodb_cf_template.yaml\
 --stack-name "my-stack"

The above command allows you to use CloudFormation to set up your stack called my-stack in AWS with the resources defined in the template file.
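
Stack creation can take a short while. To confirm that the stack, and with it the table, finished creating, you can optionally check its status with the AWS CLI; it should report CREATE_COMPLETE:

aws cloudformation describe-stacks --stack-name "my-stack" --query "Stacks[0].StackStatus"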

Now, you may run and test the setup on the localhost. Run the command below:

chalice local

You should get a terminal output as follows:

Found credentials in shared credentials file: ~/.aws/credentials
Serving on http://127.0.0.1:8000

4. Implement CRUD Features

You will now define the CRUD functionality in app.py. You will create endpoints for creating book entries, fetching all books, fetching a particular book by id, and updating and deleting books in the database.

Navigate to the app.py file of the project. Make necessary imports and create a function to connect to the database as shown below:

from chalice import Chalice, Response
import boto3
from boto3.dynamodb.conditions import Key

app = Chalice(app_name='chalice-api-sample')


def get_app_db():
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table('my-demo-table')
    return table

The code above consists of the library imports and the definition of the app name. It also contains a function called get_app_db, which uses boto3, the AWS SDK for Python, to get a reference to the DynamoDB table and specifies my-demo-table as the table the app will use.
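
If you want a quick sanity check that the app can reach the table before you wire up any routes, you can run a snippet like the one below in a local Python shell. It assumes your AWS credentials and default region are already configured, as described earlier:

# Run from the project directory in a Python shell; assumes AWS credentials are configured.
from app import get_app_db

# table_status should print "ACTIVE" if the CloudFormation stack created the table.
print(get_app_db().table_status)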

Create a Book Entry

Now, you will add an endpoint for creating a book entry using the POST method in the app.py file:

@app.route('/book', methods=['POST'])
def add_book():
    data = app.current_request.json_body
    try:
        get_app_db().put_item(Item={
            'id': data['id'],
            "title": data['title'],
            "author": data['author']
        })
        return {'message': 'ok - CREATED', 'status': 201, "id": data['id'], "title": data['title'], "author": data['author']}
    except Exception as e:
        return {'message': str(e)}

In the code above, you did the following:

  • Specified the URL route for the endpoint as /book with the POST action
  • In the line data = app.current_request.json_body, you parsed the JSON request body sent to the API; the endpoint also returns its response as a JSON body
  • Allowed the addition of a book entry with attributes for the id, title, and author of the book
  • Included a return statement to return the attributes of a newly added book

You may use the HTTPie command-line tool to test your API. It can be installed with pip install httpie. If you have HTTPie already, try adding a book to the database:

http POST 127.0.0.1:8000/book id=123  title="Javascript Know It All" author="Chukwuma Obinna"

You should get a response like this:

HTTP/1.1 200 OK
Content-Length: 110
Content-Type: application/json
Date: Fri, 30 Apr 2021 03:36:22 GMT
Server: BaseHTTP/0.6 Python/3.7.7

{
    "author": "Chukwuma Obinna",
    "id": "123",
    "message": "ok - CREATED",
    "status": 201,
    "title": "Javascript Know It All"
}

Then, add another book entry:

http POST 127.0.0.1:8000/book id=456 title="Python for Primary School" author="Iwobi Peter"

The response will be:

{
    "author": "Iwobi Peter",
    "id": "456",
    "message": "ok - CREATED",
    "status": 201,
    "title": "Python for Primary School"
}

Fetch all the Books in the Database

To get all the books in the database, add the function below to scan the table and get all books.

@app.route('/', methods=['GET'])
def index():
    response = get_app_db().scan()
    data = response.get('Items', None)
    return {'data': data}

In the code above, the root route was added with a GET action. Then, the scan method was applied to the table, and all the items in the table were returned.

Use the following command to access the endpoint:

http GET 127.0.0.1:8000/

The output you get should be like the following:

HTTP/1.1 200 OK
Content-Length: 155
Content-Type: application/json
Date: Fri, 30 Apr 2021 03:52:47 GMT
Server: BaseHTTP/0.6 Python/3.7.7

{
    "data": [
        {
            "author": "Iwobi Peter",
            "id": "456",
            "title": "Python for Primary School"
        },
        {
            "author": "Chukwuma Obinna",
            "id": "123",
            "title": "Javascript Know It All"
        }
    ]
}

Fetch a Book by Id

Now, you will add a route to get a particular book by specified id:

@app.route('/book/{id}', methods=['GET'])
def get_book(id):
    response = get_app_db().query(
        KeyConditionExpression=Key("id").eq(id)
    )
    data = response.get('Items', None)
    return {'data': data}

In the code above:

  • The /book/{id} route accepts a specific book id and has a GET action
  • The get_book function accepts the id parameter
  • The KeyConditionExpression queries the DynamoDB table for items whose id key matches the id parameter
  • The matching table items are then returned

Using HTTPie to test, fetch a particular book whose id is 123:

http GET 127.0.0.1:8000/book/123

The output should look like the following:

HTTP/1.1 200 OK
Content-Length: 110
Content-Type: application/json
Date: Fri, 30 Apr 2021 03:36:22 GMT
Server: BaseHTTP/0.6 Python/3.7.7

{
    "data": [
        {
            "author": "Chukwuma Obinna",
            "id": "123",
            "title": "Javascript Know It All"
        }
    ]
}

Update a Book Entry

At this point, you will add an endpoint to update the title of a book. You will keep the same /book/{id} route used for fetching a book by id, though you could use a new route URL entirely. You will use the PUT method, which REST APIs conventionally use for updating items, as shown below:

@app.route('/book/{id}', methods=['PUT'])
def update_book(id):
    data = app.current_request.json_body
    try:
        get_app_db().update_item(Key={
            "id": data['id'],
            "author": data['author']
        },
            UpdateExpression="set title=:r",
            ExpressionAttributeValues={
            ':r': data['title']
        },
            ReturnValues="UPDATED_NEW"
        )
        return {'message': 'ok - UPDATED', 'status': 201}
    except Exception as e:
        return {'message': str(e)}

In the code above, the update_book function accepts the id parameter. Furthermore, you provided the Key element consisting of the attributes specified in the KeySchema when creating the DynamoDB table. Then, you used UpdateExpression and ExpressionAttributeValues to specify the attribute to update and its new value.

To see the update functionality in effect, update the title of the book with id 123:

http PUT 127.0.0.1:8000/book/123 id=123 title="Chalice Book" author="Chukwuma Obinna"

You should get an output like this:

HTTP/1.1 200 OK
Content-Length: 39
Content-Type: application/json
Date: Fri, 30 Apr 2021 12:22:50 GMT
Server: BaseHTTP/0.6 Python/3.7.7

{
    "message": "ok - UPDATED",
    "status": 201
}

Now, check to see the updated entry:

http GET 127.0.0.1:8000/book/123

The output will look like this:

{
    "data": [
        {
            "author": "Chukwuma Obinna",
            "id": "123",
            "title": "Chalice Book"
        }
    ]
}

Delete a Book Entry

Now, you will add the code to delete a book entry. You will add a route that accepts the id parameter and uses the DELETE method. Add the following code to the app.py file:

@app.route('/book/{id}', methods=['DELETE'])
def delete_book(id):
    data = app.current_request.json_body
    try:
        response = get_app_db().delete_item(
            Key={
                "id": data['id'],
                "author": data['author']
            }
        )
        return {'message': 'ok - DELETED', 'status': 201}

    except Exception as e:
        return {'message': str(e)}

The delete_book function above accepts the id parameter in the /book/{id} route. Then, the delete_item method is applied to the DynamoDB table. The Key attributes are specified for the delete_item method to delete the specified item in the database.

Then, you can try to delete a book entry in the database:

http DELETE 127.0.0.1:8000/book/456 id=456 author="Iwobi Peter"

The command-line output is as follows:

{
    "message": "ok - DELETED",
    "status": 201
}

5. Deploy to AWS

Before you deploy your code to AWS, make sure you have the necessary permissions set for your IAM account as defined in the AWS user guide. Now, with a single command, chalice deploy, you can deploy your Chalice application to AWS.

chalice deploy

You should get the API URL endpoint and a Lambda ARN on your terminal, which you can use to interact with the API.

Creating deployment package.
Creating IAM role: chalice-api-sample-dev-api_handler
Creating lambda function: chalice-api-sample-dev
Creating Rest API
Resources deployed:
  - Lambda ARN: arn:aws:lambda:us-west-2:xxxxxxxxxxxx:function:chalice-api-sample-dev
  - Rest API URL: https://vvyngxvyag.execute-api.us-west-2.amazonaws.com/api/

Now, if you look at the folder structure, you will notice a few new, automatically generated files and folders created to accommodate the deployment:

chalice-api-sample
├── app.py
├── .chalice
│   ├── config.json
│   ├── deployed
│   └── deployments
├── .gitignore
└── requirements.txt

6. Testing the Deployed Chalice API

Now, you can test the REST API URL generated by Chalice:

http GET https://vvyngxvyag.execute-api.us-west-2.amazonaws.com/api/

The output would look like the following:

HTTP/1.1 200 OK
Connection: keep-alive
Content-Length: 73
Content-Type: application/json
Date: Sun, 02 May 2021 00:32:35 GMT
Via: 1.1 xxxxxxxxxx.cloudfront.net (CloudFront)
X-Amz-Cf-Id: xxxxxxxxxxxxxxxx==
X-Amz-Cf-Pop: AMS1-C1
X-Amzn-Trace-Id: Root=1-xxxxxxxxx-xxxxxxxxxxxxxx;Sampled=0
X-Cache: Miss from cloudfront
x-amz-apigw-id: xxxxxxxxxxxxxx=
x-amzn-RequestId: xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxxxx

{
    "data": [
        {
            "author": "Chukwuma Obinna",
            "id": "123",
            "title": "Chalice Book"
        }
    ]
}

Congratulations! You have built and tested your REST API with AWS Chalice. If you would like to delete your application, you can use the following command:

chalice delete

Don't forget to delete the DynamoDB table that you created too:

aws cloudformation delete-stack --stack-name my-stack
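
Stack deletion runs asynchronously. If you want to wait until the table and the rest of the stack are fully removed, you can optionally run:

aws cloudformation wait stack-delete-complete --stack-name my-stack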

7. Adding Authentication

By now, you have your Chalice application set up with a REST API and DynamoDB. You could consider adding authentication and authorization features to your application. Auth0 enables you to authenticate your users. It also enables you to provide authorization access to the endpoints in your REST APIs. To use Auth0 authentication, you need an Auth0 account. If you don't have an account yet, you can sign up for a free Auth0 account here.

After signing up or signing in, go to your dashboard and open the Applications section. Select Create Application, supply a name for your app, and choose Regular Web Applications. After creating the app, navigate to its Settings tab to find the information needed to connect your Chalice application with Auth0. Also, add the URL of your Chalice app to the Allowed Callback URLs and Allowed Logout URLs fields in the app settings. Save the changes.

After adding the URLs, navigate to the app.py file of your Chalice project so that you can connect the Chalice application with Auth0. Make necessary imports as shown below and add your Auth0 app credentials, the domain, and audience:

from chalice import Chalice, Response # imported Response
import boto3
from boto3.dynamodb.conditions import Key
from functools import wraps # new line
import json # new line
from urllib.request import urlopen # new line
from jose import jwt # new line - JWT handling (assumes the python-jose package)


AUTH0_DOMAIN = 'dev-xxxxxxxx.us.auth0.com'
API_AUDIENCE = 'https://chalice-demo/'
ALGORITHMS = ["RS256"]
...
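
The token verification code below relies on the python-jose package for decoding and validating JWTs; this is an assumption about the JWT library, but it matches the jwt calls used in the code. Install it and add it to requirements.txt so that it gets packaged on deployment:

pip install python-jose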

Then add the following code to the app.py file to handle errors:

class AuthError(Exception):
    def __init__(self, error, status_code):
        self.error = error
        self.status_code = status_code


def handle_auth_error(ex):
    # Return the error details as a JSON response using Chalice's Response class
    return Response(body=ex.error,
                    status_code=ex.status_code,
                    headers={'Content-Type': 'application/json'})

Next, add the following function to get the JSON Web Token (JWT) access token from the authorization header in the request made to the API:

def get_token_auth_header():
    """Obtains the Access Token from the Authorization Header
    """
    request = app.current_request
    auth = request.headers.get("Authorization", None)
    if not auth:
        raise AuthError({"code": "authorization_header_missing",
                         "description":
                         "Authorization header is expected"}, 401)

    parts = auth.split()

    if parts[0].lower() != "bearer":
        raise AuthError({"code": "invalid_header",
                         "description":
                         "Authorization header must start with"
                         " Bearer"}, 401)
    elif len(parts) == 1:
        raise AuthError({"code": "invalid_header",
                         "description": "Token not found"}, 401)
    elif len(parts) > 2:
        raise AuthError({"code": "invalid_header",
                         "description":
                         "Authorization header must be"
                         " Bearer token"}, 401)

    token = parts[1]
    return token

Add the following code. The requires_auth function below is a decorator that determines whether the access token obtained by get_token_auth_header is valid. It verifies the access token against the JWKS (JSON Web Key Set) of your Auth0 tenant.

def requires_auth(f):
    """Determines if the Access Token is valid
    """
    @wraps(f)
    def decorated(*args, **kwargs):
        token = get_token_auth_header()
        jsonurl = urlopen("https://"+AUTH0_DOMAIN+"/.well-known/jwks.json")
        jwks = json.loads(jsonurl.read())
        unverified_header = jwt.get_unverified_header(token)
        rsa_key = {}
        for key in jwks["keys"]:
            if key["kid"] == unverified_header["kid"]:
                rsa_key = {
                    "kty": key["kty"],
                    "kid": key["kid"],
                    "use": key["use"],
                    "n": key["n"],
                    "e": key["e"]
                }
        if rsa_key:
            try:
                payload = jwt.decode(
                    token,
                    rsa_key,
                    algorithms=ALGORITHMS,
                    audience=API_AUDIENCE,
                    issuer="https://"+AUTH0_DOMAIN+"/"
                )
            except jwt.ExpiredSignatureError:
                raise AuthError({"code": "token_expired",
                                 "description": "token is expired"}, 401)
            except jwt.JWTClaimsError:
                raise AuthError({"code": "invalid_claims",
                                 "description":
                                 "incorrect claims,"
                                 "please check the audience and issuer"}, 401)
            except Exception:
                raise AuthError({"code": "invalid_header",
                                 "description":
                                 "Unable to parse authentication"
                                 " token."}, 401)

            app.current_request.context.update(payload)
            return f(*args, **kwargs)
        raise AuthError({"code": "invalid_header",
                         "description": "Unable to find appropriate key"}, 401)
    return decorated

Then, you can add authorization to specific endpoints by using the @requires_auth decorator, as shown below.

@app.route('/book', methods=['POST'])
@requires_auth
def add_book():
    data = app.current_request.json_body
    ...

Therefore, if a request does not carry a valid access token issued by Auth0, the requires_auth decorator raises an AuthError, and the caller cannot use the endpoints protected by the decorator.
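
For example, assuming you have obtained an access token for your API from Auth0 (ACCESS_TOKEN below is a placeholder, and the book data is just illustrative), a request to the protected endpoint with HTTPie would look like this:

http POST 127.0.0.1:8000/book "Authorization: Bearer ACCESS_TOKEN" id=789 title="Chalice and Auth0" author="Jane Doe"

Without the Authorization header, get_token_auth_header raises an AuthError with the authorization_header_missing code instead of creating the book.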

Conclusion

In this article, you learned about AWS Chalice serverless technology. You built a REST API for a book database application with Chalice. You integrated DynamoDB and CRUD functionalities with the API. You also learned how to use Auth0 to put in place authorization for some endpoints.

Let us know your thoughts in the comments section below. We hope you enjoyed reading the article.
