How Teal Keeps Their API, Tests and Documentation in Sync

October 8, 2020

There's a common problem in web development: creeping divergence between API documentation and API functionality. It's so easy to update functionality and forget to update the documentation! We can also simply fail to align the two right from the start. The fix is a tool that tests the API and generates documentation at the same time. In the Rails ecosystem, we have the Rswag gem, which almost prevents divergence:

Rswag extends rspec-rails "request specs" with a Swagger-based DSL for describing and testing API operations. You describe your API operations with a succinct, intuitive syntax, and it automatically runs the tests. Once you have green tests, run a rake task to auto-generate corresponding Swagger files and expose them as YAML or JSON endpoints.

I said Rswag almost prevents divergence because it does not enforce a few key items needed to eradicate divergence entirely. This blog post shows what is really necessary to solve the problem, and offers a few patterns to help keep the code DRY and clean.

Part 1: What RSwag gives you out of the box, and the problems that still exist

Part 2: A simple solve using Ruby JSON Schema Validator and the problems that still exist

Part 3: A robust solve that uses Json Schema Builder along with some patterns

Part 1: What RSwag gives you out of the box, and the problems that still exist

Let's assume Rswag is installed and you have a basic Article model, an associated route, and a controller, as follows:
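A minimal setup might look like this (the Article fields and controller shape here are illustrative assumptions, not the post's exact code):

```ruby
# app/models/article.rb
class Article < ApplicationRecord
  validates :title, :body, presence: true
end

# config/routes.rb
Rails.application.routes.draw do
  resources :articles, only: [:create]
end

# app/controllers/articles_controller.rb
class ArticlesController < ApplicationController
  def create
    article = Article.new(article_params)
    if article.save
      render json: article, status: :created
    else
      render json: { errors: article.errors }, status: :unprocessable_entity
    end
  end

  private

  def article_params
    params.require(:article).permit(:title, :body)
  end
end
```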

Let's also assume you have a basic RSwag test set up for the above endpoint:
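Such a test might look like the following sketch, using Rswag's request-spec DSL (operation description and payload fields are assumptions):

```ruby
# spec/requests/articles_spec.rb -- sketch of a basic Rswag test
require 'swagger_helper'

RSpec.describe 'Articles API' do
  path '/articles' do
    post 'Creates an article' do
      consumes 'application/json'
      # The request schema as it will appear in the Swagger docs
      parameter name: :params, in: :body, schema: {
        type: :object,
        properties: {
          title: { type: :string },
          body:  { type: :string }
        },
        required: %w[title body]
      }

      response '201', 'article created' do
        # The request object the test actually sends
        let(:params) { { title: 'Hello', body: 'World' } }
        run_test!
      end
    end
  end
end
```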

When we run the test, it passes, and per the RSwag promise we can generate some Swagger documentation with rails rswag, start up our server and navigate to localhost:3000/api-docs to see the docs.

The Problem

But what is actually getting tested here? The response status code, given the params. That's all.

Given the goal of preventing creeping divergence, this is a massive issue: There is no coupling that ensures that the request schema is what is actually being used in the test! (and similarly, there is nothing testing that the response object matches the response schema)

If you look at the code above, you can see that the request object defined by let(:params) could differ from the schema defined in the request specification, and the tests would still pass. This completely breaks the promised coupling between tests and documentation.

We want some sort of validation that the test's object passed to the controller and the request schema match.

We want something like:
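In plain Ruby terms, the desired check is something along these lines (a standalone sketch; the schema and params are illustrative):

```ruby
# The request schema as it would appear in the Swagger documentation
request_schema = {
  type: :object,
  properties: { title: { type: :string }, body: { type: :string } },
  required: %i[title body]
}

# The params the test actually sends to the controller
params = { title: 'Hello', body: 'World' }

# Fail loudly if the test's request object and the documented schema diverge
extra   = params.keys - request_schema[:properties].keys
missing = request_schema[:required] - params.keys
unless extra.empty? && missing.empty?
  raise "params diverge from schema: extra=#{extra} missing=#{missing}"
end
```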


Part 2: A simple solve using Ruby JSON Schema Validator and the problems that still exist

We can achieve this with the Ruby JSON Schema Validator gem, which offers a simple way to check if a JSON blob adheres to a JSON schema.

With the gem installed we can re-write the actual tests:
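A sketch of the re-worked test, using the json-schema gem's JSON::Validator (strict mode makes undocumented extra params fail validation; the schema constants and field names are assumptions):

```ruby
# spec/requests/articles_spec.rb -- sketch using the json-schema gem
require 'swagger_helper'
require 'json-schema'

ARTICLE_REQUEST_SCHEMA = {
  type: :object,
  properties: { title: { type: :string }, body: { type: :string } },
  required: %w[title body]
}.freeze

ARTICLE_RESPONSE_SCHEMA = {
  type: :object,
  properties: { id: { type: :integer }, title: { type: :string }, body: { type: :string } },
  required: %w[id title body]
}.freeze

RSpec.describe 'Articles API' do
  path '/articles' do
    post 'Creates an article' do
      consumes 'application/json'
      parameter name: :params, in: :body, schema: ARTICLE_REQUEST_SCHEMA

      response '201', 'article created' do
        let(:params) { { title: 'Hello', body: 'World' } }

        run_test! do |response|
          # The request object used by the test must match the documented schema
          JSON::Validator.validate!(ARTICLE_REQUEST_SCHEMA, params, strict: true)
          # The actual response body must match the documented response schema
          JSON::Validator.validate!(ARTICLE_RESPONSE_SCHEMA, JSON.parse(response.body))
        end
      end
    end
  end
end
```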

Now if we decide to add a parameter to the test's request object but forget to change the request schema, we get a failing test! This means there can never be divergence between the API functionality, the test suite, and the docs. For example, if we change the params being sent to the controller to include the param extra, the test will fail with a validation error:
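Something like this (sketch; the exact error message depends on the gem version):

```ruby
# Adding an undocumented param to the test's request object...
let(:params) { { title: 'Hello', body: 'World', extra: 'oops' } }

# ...now causes JSON::Validator.validate!(schema, params, strict: true)
# to raise JSON::Schema::ValidationError, failing the test.
```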


Are we done? No!
There are still a few problems remaining, which really reveal themselves as you try to apply the above to an entire application. The first is that with the method above, we will start to accumulate duplicate schemas as we build out various update or index endpoints. Furthermore, if we create an Author class, and an author has multiple articles, does that mean that when we test our author GET show endpoint (which contains an array of articles), we need to copy/paste the article schema into the author schema?? Yikes. That doesn't feel very DRY. We are getting into the world of building and composing schemas, and that means it's time for some more help.

Part 3: A robust solve that uses Json Schema Builder along with some patterns

Enter the Json Schema Builder. With this gem, we are going to do a few things. First, we are going to consolidate our schemas into a spec/schemas directory. While we lose the visual convenience of having the schemas and tests in the same file, placing schemas in a common directory makes schema composition much easier. Second, since this gem comes with its own validation feature, we can stop using (and remove) the aforementioned Ruby JSON Schema Validator gem.

We are going to add some lines to the top of the swagger_helper.rb file so we know where to find our awesome schema definitions:
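Something like the following (the directory path is an assumption; adjust to wherever you keep your schemas):

```ruby
# spec/swagger_helper.rb
require 'rails_helper'

# Load every schema definition in spec/schemas so the tests and the
# Swagger generator share a single source of truth
Dir[Rails.root.join('spec/schemas/**/*.rb')].sort.each { |f| require f }
```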


With this in place we can create our schemas:
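A sketch of what a schema file might look like. The DSL shown follows the json-schema_builder gem's general shape, but the exact method names and options are assumptions worth checking against the gem's README:

```ruby
# spec/schemas/article_schema.rb
require 'json/schema_builder'

class ArticleSchema
  include JSON::SchemaBuilder

  # Schema for the request body of a create/update
  def request
    object do
      string :title, required: true
      string :body,  required: true
    end
  end

  # Schema for the serialized article in a response
  def response
    object do
      integer :id,   required: true
      string :title, required: true
      string :body,  required: true
    end
  end
end
```

Composition is the payoff here: an AuthorSchema can embed ArticleSchema's response inside an array instead of copy/pasting it.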


A shoutout here to the schema builder gem: its DSL is fantastic and allows for very readable schemas.

We can now update our integration test to use these brand new schemas:
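A sketch of the updated response block; the validate! calls are the gem's own validation feature mentioned above, though the exact method names are assumptions:

```ruby
# sketch: the spec now references the shared schema objects
response '201', 'article created' do
  let(:params) { { title: 'Hello', body: 'World' } }

  run_test! do |response|
    schema = ArticleSchema.new
    # Request object used by the test must match the documented request schema
    schema.request.validate!(params)
    # Actual response body must match the documented response schema
    schema.response.validate!(JSON.parse(response.body))
  end
end
```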


This is starting to look better, but we still have a bit more to do. Right now there is a lot of duplication that will show up as we add more integration tests. What we really want is to wrap everything up in a reusable, shared test that we can apply to various endpoints. The three checks above (request structure, response structure, and status code) should be included in every endpoint test, so let's move them to a shared example.

First, let's make sure our shared examples are required. In your swagger_helper.rb file, add the following line:
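For example (the directory path is an assumption):

```ruby
# spec/swagger_helper.rb
Dir[Rails.root.join('spec/shared_examples/**/*.rb')].sort.each { |f| require f }
```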


Then you can define a shared example:
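A sketch of such a shared example (the example name and file path are assumptions; `params` is expected to be defined by a `let` in the including spec):

```ruby
# spec/shared_examples/documented_endpoint.rb
RSpec.shared_examples 'an endpoint that matches its docs' do |schema, status|
  response status.to_s, 'expected response' do
    run_test! do |response|
      # Request structure: the test's request object matches the docs
      schema.request.validate!(params)
      # Response structure: the actual response body matches the docs
      schema.response.validate!(JSON.parse(response.body))
      # Status code is asserted by run_test! via the response block above
    end
  end
end
```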


And finally, in your spec, you can utilize the shared example:
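Usage might look like this sketch inside the Rswag path block (names carried over from the assumed examples above):

```ruby
path '/articles' do
  post 'Creates an article' do
    consumes 'application/json'
    parameter name: :params, in: :body,
              schema: ArticleSchema.new.request.as_json

    let(:params) { { title: 'Hello', body: 'World' } }

    # One line per endpoint: request schema, response schema, and status
    it_behaves_like 'an endpoint that matches its docs', ArticleSchema.new, 201
  end
end
```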

Now when you make another endpoint, your API functionality and documentation will remain in sync with just a few lines of code! What will you do with all of your reclaimed spare time? I hear a lot of people are learning to bake sourdough bread these days...

Frequently Asked Questions

What strategies does Teal employ to maintain consistency between their API and its documentation?

Teal's Engineering Team employs an integrated approach where documentation is treated as a core component of the API development process. They use automated tools to ensure that changes in the API are simultaneously reflected in the documentation, maintaining a single source of truth and preventing divergence.

How does Teal ensure that their API tests remain up-to-date with API changes?

Teal implements a continuous integration and testing system where tests are automatically run against new code commits. This ensures that any changes to the API are immediately tested, and discrepancies are caught early. The tests are written to reflect the current API specifications, ensuring they are always in sync.

Can you describe a specific tool or method Teal uses to keep their API, tests, and documentation aligned?

Teal utilizes a documentation generator that is directly tied to the API's source code, such as Swagger or OpenAPI Specification. This tool automatically updates the documentation whenever the API is modified, ensuring that the API's functionality and the documentation provided to users are always consistent.

Dave Fano

Founder and CEO of Teal, Dave is a serial entrepreneur with 20+ years of experience building products & services to help people leverage technology and achieve more with less.
