Version your proto definitions for stability

Using semver and package managers to stabilize gRPC changes.

When you use gRPC across microservices, it is easy to forget to update a protocol buffers file and break a server call when the specs do not align.

If the client and server do not match the proto definition exactly, the request will fail, so it is important to set up a pipeline for your models that can scale throughout your application development.

How to structure protobuf files?

How you set up your proto definitions can vary from company to company, or app to app, depending on the requirements.

The gRPC protocol was designed around protocol buffers: .proto files define a schema from which a server and client can be generated that send messages following the defined formats. At first you might set up a .proto file in two locations, across two different GitHub repos or folders that do not share common modules, and struggle to get the pieces to connect. At some point you may end up with multiple service definitions and messages duplicated across the repos. With protocol buffers you can import the definitions you need across files to prevent that duplication. On that note, it helps to set up one central location to scaffold the files and publish them to a dependency manager like npm, cargo, or composer for your apps to consume.
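As a minimal sketch of sharing definitions via imports (the package and message names here are hypothetical, not taken from a real repo), a shared file can hold the common messages and a service file can import it instead of redefining them:

```proto
// types.proto — shared messages, published once to a central package
syntax = "proto3";

package shared;

message Page {
  string url = 1;
  bool cdn_connected = 2;
}
```

```proto
// website.proto — imports the shared message instead of duplicating it
syntax = "proto3";

package website;

import "types.proto";

service WebsiteService {
  // both the request and response reuse the shared Page message
  rpc Scan(shared.Page) returns (shared.Page);
}
```

When both files ship together in one published package, every consuming repo resolves the same `shared.Page` definition instead of drifting copies.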

An example of this can be found on npm (the node package manager) by running npm i @a11ywatch/protos. You will see the definition files inside node_modules.

Example of the proto files inside node_modules after installing with `npm i @a11ywatch/protos` in a shell.

Versioning benefits of following semver.

When you follow semver (semantic versioning) with gRPC, it helps define a compatibility layer that holds at various levels. For the most part gRPC is non-breaking across updates, except when you do the following.
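For example, once the proto package follows semver, a consumer can pin it to a compatible range in its package.json (the version shown matches the one installed later in this post):

```json
{
  "dependencies": {
    "@a11ywatch/protos": "^0.1.5"
  }
}
```

Note that for pre-1.0 versions, npm treats `^0.1.5` as `>=0.1.5 <0.2.0`, so only patch-level updates are pulled in automatically; a breaking change to the contract should land as a new minor (pre-1.0) or major version so consumers opt in explicitly.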

Binary breaking changes

The following changes are non-breaking at a gRPC protocol level, but the client needs to be updated if it upgrades to the latest .proto contract or client gRPC builder. Binary compatibility is important if you plan to publish a gRPC library across package managers.

  • Removing a field - Values from a removed field are deserialized to a message's unknown fields. This isn't a gRPC protocol breaking change, but the client needs to be updated if it upgrades to the latest contract. It's important that a removed field number isn't accidentally reused in the future. To ensure this doesn't happen, specify deleted field numbers and names on the message using Protobuf's reserved keyword.
  • Renaming a message - Message names aren't typically sent on the network, so this isn't a gRPC protocol breaking change. The client will need to be updated if it upgrades to the latest contract. One situation where message names are sent on the network is with Any fields, when the message name is used to identify the message type.
  • Nesting or un-nesting a message - Message types can be nested. Nesting or un-nesting a message changes its message name. Changing how a message type is nested has the same impact on compatibility as renaming.
  • Changing namespace - Changing namespace will change the namespace of generated language types. This isn't a gRPC protocol breaking change, but the client needs to be updated if it upgrades to the latest contract.
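The `reserved` keyword mentioned above can be sketched like this (the message, field names, and numbers are illustrative, not the actual package contents): after deleting a field, reserve its number and name so neither can be accidentally reused.

```proto
syntax = "proto3";

message IssuesInfo {
  // field 2 ("ada_score") was removed in a previous release;
  // reserving it makes the compiler reject any future reuse
  // of the number or the name
  reserved 2;
  reserved "ada_score";

  int32 total_issues = 1;
  int32 error_count = 3;
}
```

Attempting to declare a new field numbered 2 (or named `ada_score`) in this message now fails at compile time, which protects old clients that still deserialize the removed field into unknown fields.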

Doing cool things now that we have a central versioning system.

Now that we have the important pieces in place, we can take advantage of automated tools that improve gRPC workflows and productivity.

```dockerfile
FROM pseudomuto/protoc-gen-doc AS generator
WORKDIR /usr/src/app
RUN apk add npm
RUN npm i @a11ywatch/protos@0.1.5
RUN mkdir ./doc && cp -R node_modules/@a11ywatch/protos proto
RUN protoc --doc_out=./doc --doc_opt=html,index.html proto/*.proto
```
Creating documentation in Docker from your .proto files in a multi-stage build.

That will then generate a nice doc that can be served from a REST endpoint.

Table of contents generated example.
IssuesInfo — info used to gather all stats for the issues on the page.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| possibleIssuesFixedByCdn | int32 | | possible issues that may be fixed using the cdn |
| totalIssues | int32 | | all of the page issues |
| issuesFixedByCdn | int32 | | how many issues are fixed using the cdn |
| errorCount | int32 | | errors on the page |
| warningCount | int32 | | warnings on the page |
| noticeCount | int32 | | notices on the page, mainly used for info purposes |
| adaScore | int32 | | rough accessibility score |
| issueMeta | IssueMeta | | extra data on the issue |

Page — page model of all helpful insight.

| Field | Type | Label | Description |
| --- | --- | --- | --- |
| domain | string | | the domain for the request |
| url | string | | the url of the request with http or https |
| cdnConnected | bool | | is the cdn for accessibility fixes connected on the page |
| pageLoadTime | PageLoadTime | | page load time |
| insight | google.protobuf.Struct | | the json details from lighthouse |
| issuesInfo | IssuesInfo | | issues on the page |
| lastScanDate | string | | the last date of the scan |

Table displaying useful information like fields and descriptions.

You can view the repo here.

Jeff Mendez

My name is Jeff and I am the founder and creator of A11yWatch.