
protobuf-management


Commits

List of commits on branch master (all unverified, committed by ggorzell):

  • 91dc858f7dcb7b91e8520b4534203059ffd7e77f: Augment source tasks and add publishing configuration. (6 years ago)
  • 6f853fcf0042eadef60202fced45468223ccc759: Upgrade Gradle and plugin versions. (6 years ago)
  • 26e58eb3466d23a0756ceeaae57f7151ef9294d9: Update README for clarity. (6 years ago)
  • f69654618c8fbbf99849dc6eef0a9fbe9d002364: Merge branch 'master' of github.com:gorzell/protobuf-management (6 years ago)
  • 77adc7db6105969b582f7b7e4c75436ba9e89acc: Document dependencies. (6 years ago)
  • 1211443ba620d352ec6c77048770c138d35fd650: Updates to README to reflect changes in the prototype and clarify the approach. (7 years ago)

README


Protocol Buffer (and gRPC) Management

This is a proposed set of guidelines, practices and tooling for managing proto files, as well as for creating and distributing the artifacts generated from them across multiple languages.

Why?

I work at a large organization where multiple groups are moving towards using both protobufs and gRPC. When I looked at how different groups were integrating these tools into their workflows and how their schemas were being consumed, there did not seem to be a consistent solution. Some teams were simply vendoring an upstream repo and generating sources for their language themselves; others had built custom tooling to help their consumers (in a subset of possible languages) integrate their schemas. Neither of these solutions felt optimal to me, and not having a consistent solution did not feel particularly scalable.

Instead, it seemed like there should be a consistent set of practices, tooling and automation for managing both the definitions and the artifacts that they produce. Given the popularity of both protobuf and gRPC, I assumed that there must be some well-documented best practices already in existence. However, when I started looking for prior art on how other companies manage these things, I found only one blog post[1] and one git repo[2] that seemed applicable. I was really hoping to find some industry best practices that we could simply adopt. When that did not work out, I decided to write down a proposal and prototype it.

Requirements

I started with a simple set of requirements that I drew from what I considered pretty standard best practices in software development:

  • Semantic Versioning
    • This gets a bit tricky because of protobuf's backwards-compatibility semantics, but still seems worthwhile.
  • CI Builds
  • CI Testing
  • Publishing Artifacts
  • Easy/Native Integration

Approach

  1. Store and manage the proto files in their own module/directory/repository so that you can treat them as you would any other library and they can be easily consumed by multiple other projects.
  2. Generate the source code for each desired language as part of a CI process in that repo.
  3. Lint and test the proto files to the extent possible.
  4. Build all the way to a standard artifact for that language.
  5. Test against the artifacts as much as possible in the CI process.
  6. Publish artifacts for each language in their "standard" way, including the proto files themselves (in tar/zip files).
  7. Dependent projects should consume the artifacts rather than the repository or code. Even if you need to depend on and include the proto files in other protobuf projects, you should be able to consume artifact bundles.
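The generate, build and publish steps above can be sketched as a single Gradle build file using the protobuf-gradle-plugin; the plugin and dependency versions, and the extra zip task, are illustrative assumptions rather than this repo's actual configuration:

```groovy
// build.gradle -- a minimal sketch using the protobuf-gradle-plugin (versions are assumptions).
plugins {
    id 'java-library'
    id 'com.google.protobuf' version '0.8.17'
    id 'maven-publish'
}

repositories {
    mavenCentral()
}

dependencies {
    // Generated Java sources need the protobuf runtime on the compile classpath.
    api 'com.google.protobuf:protobuf-java:3.19.1'
}

protobuf {
    protoc {
        // Fetch protoc from Maven Central so CI does not need a local install.
        artifact = 'com.google.protobuf:protoc:3.19.1'
    }
}

// Bundle the raw .proto files too, so other schema projects can depend on them (step 6).
task protoZip(type: Zip) {
    archiveClassifier = 'proto'
    from 'src/main/proto'
}

publishing {
    publications {
        maven(MavenPublication) {
            from components.java
            artifact protoZip
        }
    }
}
```

With this layout, proto files live under src/main/proto, and a single build produces both a JAR of compiled classes and a zip of the schemas themselves for publication.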

Implementation

Prototype

I have put together a prototype version of the above process in this repo. It leverages Gradle to do the proto build, because Gradle seemed to have the richest feature set for both generating source across multiple languages and compiling and packaging the Java artifacts. It also supports downloading artifacts that contain proto files and using them as dependencies, which is pretty nice.

Currently this just uses the default Gradle directory layout; however, the Gradle build can be configured with a source and build directory structure that makes sense for your project.

The prototype generates and packages code as both Java JAR files and Ruby gem files. It generates the gemspec on the fly, and uses rbenv to load a sandboxed copy of the appropriate Ruby version.
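Generating a gemspec on the fly can be done with a small Gradle task that writes the file at build time; the gem name, summary and file globs below are hypothetical, not the prototype's actual values:

```groovy
// A sketch of a task that writes a gemspec for the generated Ruby sources (names are hypothetical).
task generateGemspec {
    def gemspecFile = file("${buildDir}/ruby/my-protos.gemspec")
    outputs.file gemspecFile
    doLast {
        gemspecFile.parentFile.mkdirs()
        // Interpolate the project version so the gem tracks the schema version.
        gemspecFile.text = """Gem::Specification.new do |s|
  s.name    = 'my-protos'
  s.version = '${project.version}'
  s.summary = 'Generated protobuf/gRPC classes'
  s.files   = Dir['lib/**/*.rb']
end
"""
    }
}
```

A gem build step, run under the rbenv-selected Ruby, can then consume this file to produce the .gem artifact.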

Finally, there is a Google Container Builder config file that will build, lint and package everything up. Eventually this can be extended to also publish the results.
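A Container Builder config along these lines could drive the lint, build and package steps; the builder images, Gradle tasks and bucket name are assumptions for illustration, not the repo's actual file:

```yaml
# cloudbuild.yaml -- a sketch; builder images and the artifact bucket are assumptions.
steps:
  # Lint the proto files and run any tests.
  - name: 'gcr.io/cloud-builders/gradle'
    args: ['check']
  # Generate sources and build the per-language artifacts (JARs, gems).
  - name: 'gcr.io/cloud-builders/gradle'
    args: ['build']
# Upload the built artifacts; publishing to real registries could replace this later.
artifacts:
  objects:
    location: 'gs://my-artifact-bucket'
    paths: ['build/libs/*.jar']
```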

Dependencies

Future Features

  • More variables to control things like the version, artifact names, etc. for the Gradle build.
  • Package all of the build work into a Dockerfile, so that users would only need a Makefile and a cloudbuild.yaml.
  • Publish versioned tar for proto files themselves.
  • Tooling to ping/list a running gRPC service?

Related Projects

Footnotes

[1] How We Build gRPC Services At Namely
[2] prototool