
Creating a Node gRPC Service Using Mali

Learn about the engineering principles, technology, and goals behind the gRPC system by setting up a service.

Jan 16, 2019 · 16 min read

At Auth0 we’re responsible for responding to a large volume of queries within a relatively small period of time. For each API call that’s made, we must perform a variety of tasks: authentication, rate limiting, data access, payload validation, and external API calls, to name a few.

Hitting target response metrics while performing all of these calls requires a fair amount of thought with regard to service design. As part of a recent feature introduction, we added a new call into our authentication flow. Given the criticality of this flow, we needed to ensure that the new functionality had minimal impact on response times and success rates.

During discovery, we analyzed which technologies would be the most effective for transforming and persisting our data, performing application logic, and transporting requests to and from the service.

When starting off with an API, it’s easy to go with the defaults: HTTP/1.x and JSON, and these are entirely reasonable selections. HTTP is a well-supported, well-tooled, and observable protocol, and JSON is presently the de-facto standard for transferring payloads between services on the internet.

During the initial research for our new project, we looked into different technologies to see which fit our needs best. One of the technologies we looked into was gRPC, because our peers had reported promising results suggesting that gRPC could provide the performance we required for our use case.

In this post, we are going to take a quick look at how to get a gRPC service up and running using the Mali framework while explaining some of the engineering principles, technology, and goals behind the gRPC system. Let's get started.

What Is gRPC?

gRPC is a framework originally created by Google for internal use that has since been open sourced. It leverages HTTP/2 as its transport and Protocol Buffers as its service definition language (these are the defaults and can be changed).

One of the key characteristics of gRPC is that it supports a compressed, full-duplex stream over HTTP/2. As Google notes, HTTP/2 was designed to enable a more efficient use of network resources and reduce latency by introducing header field compression and allowing multiple concurrent exchanges on the same connection. Once a client is connected to a server, either side is free to send and receive messages at will. This always-connected, full-duplex connection is ideal for our service, as it allows the calling service to utilize a pre-established stream to call into the downstream service without delay.

HTTP/2 connection multiplexing (Source: Google)

Read What is HTTP/2 All About? for more details on HTTP/2.

Services are defined in a definition language called Protocol Buffers. Google defines Protocol Buffers as a "language-neutral, platform-neutral, extensible mechanism for serializing structured data". They are positioned as a smaller, faster, and simpler alternative to XML. Protocol Buffers allow for strongly typed services (endpoints), messages (payloads), and fields (properties). These definitions enable messages to be marshaled very efficiently in binary format over an existing connection, and they let the consumer of a message avoid guesswork when unmarshalling it.

HTTP/1.x, in contrast, is oriented toward discrete request/response pairs. Even more significant than the transport, though, is the overhead of JSON: serializing and parsing a strictly defined binary message is considerably cheaper than doing the same with a payload whose format is undefined, which adds up to significant performance improvements.

Why Should You Use It?

It’s important to call out that we’re not suggesting you use gRPC for all of your services, or even any of them. However, when performance is absolutely critical, and you have the ability to influence the service implementation, consumption, and infrastructure, then gRPC is something you may want to look into.

Setting Up a gRPC App Using Node

First, we’re going to get our project configured:

mkdir grpc-demo
cd $_ && npm init -y
npm i "grpc"@"^1.15.1" "mali"@"^0.9.1" "@grpc/proto-loader"@"~0.3.0"
# Initialize project

These commands create a directory for our project, initialize NPM, and install the packages that we’ll be using.

To facilitate the implementation of our gRPC service, we are going to use the Mali gRPC microservice framework, as it offers the following benefits:

  • It is designed to use modern JavaScript asynchronous constructs such as Promises and async/await.
  • It supports header, trailer, status, and error metadata.
  • The minimal core of Mali can be extended to add features by composing and cascading middleware.
  • Just a few lines of code give you a fully operational server.


Service Definition

Our server will consume a .proto file which contains our service definition, so we'll add a file at the path of protos/hello.proto:

mkdir protos
touch protos/hello.proto

Populate the hello.proto file with this content:

syntax = "proto3";

service Hello {
    rpc Echo (EchoRequest) returns (EchoResponse) {};
}

message EchoRequest {
    string message = 1;
}

message EchoResponse {
    string message = 1;
    int64 timestamp = 2;
}

Let's understand what's happening in this .proto file. As stated above, gRPC uses Protocol Buffers for its service definition language. hello.proto is a Protocol Buffer file, which contains our service definition along with the messages that our service will be using. Let's break it down piece by piece.

syntax = "proto3";

This line declares the syntax that the file is using. There are multiple versions of Protocol Buffer syntax; this file uses proto3, the latest version available at the time of writing.

service Hello {
    rpc Echo (EchoRequest) returns (EchoResponse) {};
}

This section defines a service, which is what gRPC will use to expose a set of RPC endpoints. RPC stands for remote procedure call, and it refers to a computer program causing a procedure to execute in a different address space. An RPC endpoint, then, is a location or path where the RPC can be called.

This example only exposes the Echo RPC, which accepts a request message of EchoRequest and returns a response message of EchoResponse. In gRPC parlance, a call that consists of a single request and a single response is referred to as unary. There are other kinds of RPCs that allow server streaming, client streaming, or full-duplex (bidirectional) streaming too.
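
To illustrate the difference, here is a hedged sketch (not part of this demo; the HelloStreams service and its RPC names are hypothetical) of how those streaming variants are declared in proto3:

service HelloStreams {
    // Server streaming: one request, a stream of responses.
    rpc StreamEchoes (EchoRequest) returns (stream EchoResponse) {};

    // Client streaming: a stream of requests, one response.
    rpc CollectEchoes (stream EchoRequest) returns (EchoResponse) {};

    // Bidirectional (full-duplex) streaming: both sides stream at will.
    rpc Chat (stream EchoRequest) returns (stream EchoResponse) {};
}

Back in our hello.proto, the two messages used by the Echo RPC are defined as follows.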

message EchoRequest {
    string message = 1;
}

message EchoResponse {
    string message = 1;
    int64 timestamp = 2;
}

EchoRequest is a message containing only one field, message. message is a string (other types will be rejected by the server and client), and it has a field number of 1. Field numbers indicate the unique index of a field within a message; they are used by the server and client to serialize and deserialize the message to and from the binary format.

EchoResponse is a message too, but it has two fields: message and timestamp. You may notice that message has the same field number, 1, as we used in EchoRequest. Field numbers only need to be unique within a single message, so this is not a problem for us. We also add the new field timestamp, which we will use to return the timestamp at which the message was received.
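
As a side note (a standard proto3 feature, not something this demo relies on), retired field numbers can be marked as reserved so that a later revision of a message can't accidentally reuse them with a different meaning; the EchoRequestV2 message below is purely hypothetical:

message EchoRequestV2 {
    // Field number 2 and the name "old_field" were used by a removed field;
    // reserving them prevents accidental reuse in future revisions.
    reserved 2;
    reserved "old_field";

    string message = 1;
}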

With our service in place, we are now ready to implement our server.

Server Implementation

Inside of our new directory, we'll create a file called server.js, and wire up a very simple gRPC service.

// server.js

const path = require("path");
const Mali = require("mali");

// Defines the path to a proto file that will hold the service definition
const PROTO_PATH = path.resolve(__dirname, "./protos/hello.proto");

/**
 * Handler for the Echo RPC.
 * @param {object} ctx The request context provided by Mali.
 * @returns {Promise<void>}
 */
const echo = async ctx => {
  // Log that we received the request
  console.log("Received request.");

  // Set the response on the context
  ctx.res = {
    // Define the message, and time
    message: ctx.request.req.message,
    timestamp: Date.now()
  };
};

/**
 * Define the main entry point for the application.
 * From here, we stand up the server and do some light logging.
 */
const main = () => {
  /**
   * Create a new instance of the Mali server.
   * We pass in the path to our Protocol Buffer definition,
   * and provide a friendly name for the service.
   * @type {Mali}
   */
  const app = new Mali(PROTO_PATH, "Hello", {
    // These are gRPC native options that Mali passes down
    // to the underlying gRPC loader.
    defaults: true
  });

  // Create a listener for the Echo RPC using the echo function
  // as the handler.
  app.use({ echo });

  // Start listening on localhost
  app.start("127.0.0.1:50051");

  // Log out that we're listening and ready for connections
  console.log("Listening...");
};

// Start the service and listen for connections
main();

This file implements a very simple gRPC server using the Mali framework. Mali keeps the implementation concise, with a design that's similar to how Koa handles HTTP services.

After our require's, we define the path where our proto file resides.

// server.js

const path = require("path");
const Mali = require("mali");

// Defines the path to a proto file that will hold the service definition
const PROTO_PATH = path.resolve(__dirname, "./protos/hello.proto");

Next, we create a function called echo. This is an async function that receives a ctx argument. ctx contains a request property. To read from the request, we use ctx.request.req.<field name>.

// require's...
// PROTO_PATH ...

const echo = async ctx => {
  // Log that we received the request
  console.log("Received request.");

  // Set the response on the context
  ctx.res = {
    // Define the message, and time
    message: ctx.request.req.message,
    timestamp: Date.now()
  };
};

In our implementation of echo, we're taking the provided input of ctx.request.req.message, and setting it as ctx.res.message, effectively echoing the input. We're also returning the current time to the caller via timestamp by calling Date.now().

There's no need to return from the echo function; setting ctx.res with the fields you desire is all that's needed for the response to be delivered to the caller.

We use the main function as the entry point for our server. It's fairly simple: we create a new instance of Mali dynamically, providing the path to our service definition, the name of our service, and some other configuration that's passed to the underlying Node gRPC library.

// require's...
// PROTO_PATH ...
// echo function definition

const main = () => {
  const app = new Mali(PROTO_PATH, "Hello", {
    defaults: true
  });
};

All options for the loader are defined within the gRPC Protobuf Loader document.
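
For reference, here is a hedged sketch of the more commonly used loader options (these are documented @grpc/proto-loader options; whether you need any of them depends on your proto definitions):

const app = new Mali(PROTO_PATH, "Hello", {
  // These options are forwarded to @grpc/proto-loader.
  keepCase: true, // preserve snake_case field names instead of camelCasing them
  longs: String,  // represent 64-bit integers (such as int64) as strings
  enums: String,  // represent enum values as their string names
  defaults: true, // populate unset fields with their default values
  oneofs: true    // add virtual oneof properties to decoded messages
});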

We then tell Mali to use the echo function we defined earlier. The Mali use() method defines middleware and handlers; it maps the names of functions that serve as handlers to RPC endpoints.

// require's...
// PROTO_PATH ...
// echo function definition

const main = () => {
  // app...

  app.use({ echo });
};

Using JavaScript object shorthand, app.use({ echo }), we tell Mali to map our echo() function to an echo property that is stored within a handler dictionary. When our server receives a request for the echo path, Mali will use this dictionary to resolve the handler for that path and trigger the correct RPC endpoint.
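
Because use() also accepts Koa-style middleware, cross-cutting concerns can be composed around handlers. As a hedged sketch (assuming Mali's async (ctx, next) middleware signature and its ctx.name context property), a simple request logger might look like this:

// A hypothetical logging middleware; register it before the handlers.
app.use(async (ctx, next) => {
  const start = Date.now();

  // Hand control to the next middleware or handler in the chain.
  await next();

  // ctx.name should hold the name of the RPC that was invoked.
  console.log(`${ctx.name} handled in ${Date.now() - start}ms`);
});

// Handlers are still registered the same way afterward.
app.use({ echo });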

In contrast to the REST architectural style, in gRPC there is no concept of "endpoint methods" such as GET or POST. Instead, we'd use descriptive RPC names such as getUser, createUser, or updateUser.
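
For example, a hypothetical user service (not part of this demo; every name below is our own invention) might be defined like this. Note that RPC names are conventionally PascalCase in the .proto file, while the Node client and Mali handlers expose them camelCased, just as Echo became echo:

syntax = "proto3";

service Users {
    rpc GetUser (GetUserRequest) returns (User) {};
    rpc CreateUser (CreateUserRequest) returns (User) {};
    rpc UpdateUser (User) returns (User) {};
}

message GetUserRequest {
    string id = 1;
}

message CreateUserRequest {
    string email = 1;
}

message User {
    string id = 1;
    string email = 2;
}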

As a last step within the main function, we listen locally on port 50051.

// require's...
// PROTO_PATH ...
// echo function definition

const main = () => {
  // app...
  // handler mapping ...

  app.start("127.0.0.1:50051");

  console.log("Listening...");
};

Finally, we need to call our main() function to kickstart the app:

// require's...
// PROTO_PATH ...
// echo function definition
// main

main();

To run the server, we don't need to create an NPM start script within package.json: when no start script is defined, npm start defaults to running node server.js, which is exactly the file we created. We'll also set the main field to server.js. Open package.json and update it as follows:

{
  "name": "grpc-demo",
  "version": "1.0.0",
  "description": "",
  "main": "server.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "@grpc/proto-loader": "^0.3.0",
    "grpc": "^1.17.0",
    "mali": "^0.9.2"
  }
}

That's it. We're now able to start listening for service requests. Next, let's learn how to run the server and make API calls.

Calling the API

At this point, we have enough to begin calling our Echo RPC endpoint.

To do this easily, we'll use grpcc, a flexible command-line client for any gRPC server, designed to test APIs quickly without much setup. grpcc can be installed locally in your project, installed globally on your system, or run without installation by using npx.

Installing grpcc

grpcc can be installed in different ways.

Global Installation

To install the client globally, run the command:

npm i -g grpcc
# Global installation

Local Installation

Run the following command:

npm i -D grpcc
# Local installation

i is the short form of the install command. -D is the short form of the --save-dev flag, which saves the packages as devDependencies.

With a local installation, you'd need to create an NPM script within package.json that allows you to run the local package. More on what this script looks like in a later section.

NPX

As an alternative, we can use npx to run the grpcc command. npx is an NPM package runner that ships with npm v5.2 and above. It executes a command either from a local node_modules/.bin folder or from a central cache. Thus, without having to install the package locally or globally, npx grpcc pulls the latest version of grpcc from the NPM registry and runs it. It's a great tool for testing and running one-off commands.

Learn more about everything you can do with npx.

Running the Server

To run the server, open a terminal window and, from the project directory, execute the following command:

npm start

npm will automatically run server.js for us (node server.js is npm's default start command when no start script is defined). The terminal should now show Listening... as the output.

Calling the Service RPC Endpoint

Follow one of the following steps, depending on your chosen grpcc installation method.

Global Installation

In another terminal, again from the project directory, run the following command:

grpcc -i -p protos/hello.proto -a 127.0.0.1:50051
# Running grpcc using the global install

Local Installation

Open package.json and create the following call NPM script:

{
  "name": "grpc-demo",
  "version": "1.0.0",
  "description": "",
  "main": "server.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "call": "grpcc -i -p protos/hello.proto -a 127.0.0.1:50051"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "@grpc/proto-loader": "^0.3.0",
    "grpc": "^1.17.0",
    "mali": "^0.9.2"
  },
  "devDependencies": {
    "grpcc": "^1.1.3"
  }
}

In another terminal, again from the project directory, run the following command:

npm run call
# Running grpcc using the local install

npx Installation

In another terminal, again from the project directory, run the following command:

npx grpcc -i -p protos/hello.proto -a 127.0.0.1:50051
# Use npx to pull grpcc from the NPM cloud

Using the grpcc Client

Let's break down the flags that are being used with the grpcc command:

  • -i stands up an insecure client (we need this because our service is listening insecurely).
  • -p is the path to the proto definition.
  • -a specifies the address (host and port) that the service is listening on.

Running grpcc will open a REPL environment that uses Node's repl module. Feel free to use any of the built-in features such as save/restore history. We are going to execute commands within this environment to run a client that calls our gRPC service.

The REPL will give you Hello@127.0.0.1:50051> as the CLI prompt.

Within the REPL environment, run the following statement:

client.echo({message:"Hello"}, printReply)

The statement we provide to grpcc takes advantage of a few variables within the test client. client is an instantiated client of the service we provided earlier. In this case, client represents the Hello service. We use client and call the echo function, which is converted to an RPC and sent to the server.

The first argument of the echo function is an object containing the named fields of EchoRequest. As you can see, we provide message, which is the only field in EchoRequest. The final argument is printReply, which is a callback provided by grpcc. It prints the reply of the RPC to the terminal so that we can observe the output.

If everything worked correctly, you should see this output from grpcc:

{
  "message": "Hello",
  "timestamp": "1546892703355"
}

Here, we see the response provided by our server. Just like we expected, we can see the two fields that we defined within EchoResponse: message and timestamp. The timestamp field returns the current time, and the message field contains the message that we provided within our request.
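
If you'd rather call the service from code instead of the grpcc REPL, here is a minimal client sketch using the grpc and @grpc/proto-loader packages we already installed. The file name client.js and the output handling are our own choices, not part of the original setup:

// client.js

const path = require("path");
const grpc = require("grpc");
const protoLoader = require("@grpc/proto-loader");

const PROTO_PATH = path.resolve(__dirname, "./protos/hello.proto");

// Load the same service definition the server uses.
const packageDefinition = protoLoader.loadSync(PROTO_PATH, { defaults: true });
const proto = grpc.loadPackageDefinition(packageDefinition);

// Create an insecure client, matching the server's insecure listener.
const client = new proto.Hello(
  "127.0.0.1:50051",
  grpc.credentials.createInsecure()
);

// Call the Echo RPC; method names are exposed camelCased on the client.
client.echo({ message: "Hello" }, (err, response) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(response);
});

Running node client.js against the server started earlier should print a response with the same message and timestamp fields we saw in the grpcc output.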


Conclusion

At Auth0, one of our core values is learning. We often refer to this value as N + 1 > N to denote the benefits of constantly building ourselves. Within our engineering organization, we foster an environment that promotes learning, experimentation, and innovation. We are constantly trying out new technologies, improving our engineering design and processes, and challenging our product, all without fear of failing as we learn from our mistakes.

As such, we'll continue experimenting with technologies such as gRPC to find the best tool for the job. Right now, we need to respond to a massive volume of queries in a short time. We'll be using benchmarks to measure tool performance and analyzing the migration cost of any tool that requires a shift from our existing architecture. That's another part of our values: we run data-driven experiments to gain insights and make informed decisions.

We'll see if our innovation projects lead us to integrate gRPC as part of our architecture. Stay tuned for more engineering insights from our team and future updates through this blog.

About Auth0

Auth0 by Okta takes a modern approach to customer identity and enables organizations to provide secure access to any application, for any user. Auth0 is a highly customizable platform that is as simple as development teams want, and as flexible as they need. Safeguarding billions of login transactions each month, Auth0 delivers convenience, privacy, and security so customers can focus on innovation. For more information, visit https://auth0.com.