Types of scans in API Scan

You can use API Scan to run different types of scans that focus on different aspects of your API implementation. In all scans, the underlying scan engine is the same, but the scan configuration that the engine uses includes different instructions on what kind of requests to generate for the scan.

Different scan types are available only with the Scan v2 engine. The Scan v1 engine can only run conformance scans.

API Scan can run the following types of scans:

  • Conformance scan: A design-time scan that ensures you do not inadvertently introduce vulnerabilities and that your code and API implementation match the documented contract in your API definition
  • Drift scan: A lightweight scan on deployed and operational APIs to ensure they continue to work the expected way

When viewing the list of scan configurations, each configuration shows a label indicating what type of scan can be run with it.

An example screenshot showing the Pixi API with six different scan configurations.

By default, each scan configuration is of a single type and runs only that type of scan. However, as you edit and enhance a scan configuration, it may come to combine different scan types. In that case, you can assign additional scan types to your configuration as needed.

Conformance scan

Conformance scan is the traditional scan that API Scan runs to check how well your API implementation conforms to the contract set in the OpenAPI definition of the API, and to find out if there are any mismatches between the API definition describing your API and what the API actually does. If API Scan finds any discrepancies, it reports the issues clearly so that you can fix them. This helps you avoid introducing discrepancies and vulnerabilities in the first place, when designing and developing your API implementation.

Fuzzing in conformance scan

The key function in conformance scan is fuzzing. Once API Scan has run the happy path tests to establish a baseline for what a successful response to a particular API operation looks like, it is ready to test whether the API implementation correctly catches and handles malformed or erroneous requests. Conformance tests are based on the happy path requests and are designed to uncover issues in the error handling of your API implementation, and places where it does not conform to what the OpenAPI definition of the API declares.

Fuzzing means that each test marked for fuzzing includes an intentionally tweaked element (header, body, parameters, HTTP method) so that the request no longer matches what the API expects. The tweaks that API Scan does in the conformance tests fall into three categories:

  • Omit: API Scan omits an element that the scanned API definition requires from the request
  • Add: API Scan adds an extra element that is not included in the scanned API definition into the request
  • Fuzz: API Scan changes an element of the request so that it no longer matches what is defined in the scanned API definition (for example, using a value that does not match the schema constraints)

The implementation of the API should catch this and respond accordingly.
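As a minimal sketch of the three tweak categories, the following derives intentionally broken requests from a happy path request. The request model and field names here are illustrative only, not API Scan internals:

```python
# Hypothetical sketch of the Omit / Add / Fuzz categories, assuming a
# simple dict-based request model (not API Scan's real data structures).

def make_conformance_tests(baseline: dict) -> list[dict]:
    """Derive intentionally broken requests from a known-good happy path request."""
    tests = []

    # Omit: drop an element that the API definition requires.
    for name in baseline["params"]:
        params = {k: v for k, v in baseline["params"].items() if k != name}
        tests.append({"category": "omit", "element": name,
                      "request": {**baseline, "params": params}})

    # Add: inject an extra element that the API definition does not declare.
    params = {**baseline["params"], "undeclared": "x"}
    tests.append({"category": "add", "element": "undeclared",
                  "request": {**baseline, "params": params}})

    # Fuzz: replace a value with one that violates the schema constraints.
    for name in baseline["params"]:
        params = {**baseline["params"], name: "\x00" * 1024}  # schema expects a short string
        tests.append({"category": "fuzz", "element": name,
                      "request": {**baseline, "params": params}})

    return tests

baseline = {"method": "GET", "path": "/user/{id}", "params": {"id": "42"}}
tests = make_conformance_tests(baseline)
```

Each resulting test request deviates from the baseline in exactly one way, so a failure pinpoints which element's error handling is at fault.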

Response validation in conformance scan

API Scan tests how the API implementation handles, for example, requests to operations not defined in the OpenAPI definition at all, or misconfigured requests to existing operations. How the API responds to the crafted test requests in the scan determines whether or not it conforms to the contract it sets out in its API definition. To catch issues, API Scan validates the response from the API and considers, for example, the following:

  • Is the provided HTTP status code defined in the response object in the API definition?
  • Are all headers in the response defined in the response object in the API definition and do they match the defined schema? Are all required headers present in the response?
  • Is the returned response body too big?
  • Should the response even have a body (method HEAD or returned status code HTTP 204)?
  • Does the Content-Type of the returned response match the types defined in the content map in the API definition, and does it match the defined schema?
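An illustrative version of the checks above can be sketched as follows, assuming a minimal OpenAPI-like "responses" mapping; the size limit and issue names are our own, not API Scan's:

```python
# Sketch of response validation against a (simplified) OpenAPI responses object.
# Limits and issue labels are illustrative assumptions.

def validate_response(method: str, status: int, headers: dict, body: bytes,
                      responses: dict, max_body: int = 10_000_000) -> list[str]:
    issues = []
    spec = responses.get(str(status)) or responses.get("default")

    # Is the returned HTTP status code defined in the response object?
    if spec is None:
        return ["status-not-defined"]

    # Are all required headers present in the response?
    for name, header in spec.get("headers", {}).items():
        if header.get("required") and name not in headers:
            issues.append(f"missing-header:{name}")

    # Should the response even have a body (method HEAD, or HTTP 204)?
    if body and (method == "HEAD" or status == 204):
        issues.append("unexpected-body")

    # Is the returned response body too big?
    if body and len(body) > max_body:
        issues.append("body-too-big")

    # Does the Content-Type match the content map in the definition?
    content = spec.get("content", {})
    if content and headers.get("Content-Type") not in content:
        issues.append("unexpected-content-type")

    return issues
```

For example, an undefined 404 yields a single `status-not-defined` issue, while a defined 200 with the wrong media type and a missing required header yields both issues at once.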

Response validation is done in two parts:

  1. Response code: Did error handling work? Did the received HTTP status code in the API response match what API Scan expected? Or in the worst case, did the intentionally malformed request not raise an error at all?
  2. Contract conformity: Did the received response match what is defined in the OpenAPI definition of the API?

From a security risk perspective, incorrect error handling poses a bigger risk to the API implementation than response content does, and therefore API Scan focuses on it first.

  • For requests to non-existing operations, API Scan expects the API to respond with HTTP 405 Method Not Allowed. Any other response is considered to be wrong.
  • For misconfigured requests to existing operations:
    • The returned HTTP status code must be equal to or greater than HTTP 400 to indicate an error.
    • The returned HTTP status code (or a default response) must be defined in the OpenAPI definition of the API.

Based on this, API Scan splits the received response codes into three classes:

  • Incorrect: The API did not raise an error, but responded with a success to a malformed request. This means that the API implementation does not catch and handle the error at all, indicating serious problems.
  • Unexpected: The received response code does not match what API Scan expected for the test request, but the API implementation still raised an error, even if not the correct one. This means that there are some problems in the error handling of the API implementation, but at least the issue is caught.
  • Expected: The received response code matches what API Scan expected for the intentionally malformed request, and the API implementation raised the error correctly. This means that the API behavior is good.
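The three-way classification above can be sketched in a few lines. The inputs here are illustrative: the status code the scan expected for the test, and the set of response codes defined in the OpenAPI definition:

```python
# Sketch of the incorrect / unexpected / expected classification.
# Input shapes are assumptions for illustration, not API Scan's real model.

def classify_response_code(status: int, expected: int, defined: set[str]) -> str:
    # Incorrect: the malformed request succeeded, so the error was not caught at all.
    if status < 400:
        return "incorrect"
    # Expected: the error matches the prediction and is declared in the contract.
    if status == expected and (str(status) in defined or "default" in defined):
        return "expected"
    # Unexpected: an error was raised, just not the one the scan expected.
    return "unexpected"
```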

However, even if response codes match what API Scan expects, it does not mean all is well. API Scan could also uncover discrepancies between the API contract set out in the OpenAPI definition and the backend API implementation:

  • The returned response body must match what is defined in the API definition for the returned HTTP status code.
  • The returned response headers must match what is defined in the API definition (if response headers are defined).

Based on analyzing the response bodies and headers, the received responses are flagged as either conformant or non-conformant with respect to the API definition.
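A deliberately small body-conformance check can illustrate the idea, under the assumption that the API definition supplies JSON Schema "type" and "required" keywords; real validation covers far more (formats, nested schemas, header schemas, and so on):

```python
# Minimal body-conformance sketch against a JSON Schema fragment.
# Only top-level "type" and "required" are checked; this is not full validation.
import json

_TYPES = {"object": dict, "array": list, "string": str,
          "integer": int, "number": (int, float), "boolean": bool}

def body_conforms(body: str, schema: dict) -> bool:
    try:
        data = json.loads(body)
    except ValueError:
        return False  # not valid JSON at all
    if not isinstance(data, _TYPES.get(schema.get("type", "object"), object)):
        return False  # wrong top-level type
    if isinstance(data, dict):
        # every required property must be present
        return all(key in data for key in schema.get("required", []))
    return True
```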

By default, API Scan does not follow redirects (HTTP 3XX) in API responses to analyze the final response, but instead analyzes the received redirect. Depending on your API, this could result in a conformance failure if the response definition in your API describes the expected final response that the redirects would lead to. You can change this behavior in the scan settings if needed, but we do not recommend it, as it may prevent the scan from completing: the final response from a redirect is often in an unsuitable format, resulting in an error. By not following redirects, API Scan can complete and successfully evaluate the main concern: whether the error handling of your API implementation works as it should.

Conformance scan report

Running a conformance scan produces a report that focuses on how well your API implementation matches the contract set out in the API definition of your API.

Drift scan

Although it is only natural for APIs to change as new business needs arise and further iterations are made, unmanaged changes can lead to problems. API drift happens when APIs start deviating over time from their initial documented API contract because, for whatever reason, additional development happens in an uncontrolled manner.

API drift can lead to high technical debt, with constant fixing of bugs, as well as dissatisfied and confused API consumers who cannot figure out how to use your API. API drift is not a mere documentation issue: APIs that no longer work the way they were supposed to can also fall out of alignment with core business needs or with the operational infrastructure and setup, and surprise breaking changes can break backward compatibility. It can also be outright dangerous, allowing attacks through malicious input or unauthorized access to potentially sensitive data, possibly extending even beyond your organization if your API integrates with 3rd party services.

This is where a drift scan comes in. Drift scan is a light-weight, non-invasive scan that API Scan runs to detect API drift and ensure that your operational application infrastructure does not change without notice behind your back. This allows you to make sure that the documented API contract continues to evolve hand-in-hand with the overall API implementation.

Today, APIs are rarely self-contained. Responses are often shaped by downstream dependencies across a complex supply chain of external services, each introducing potential vulnerabilities or even malicious content. This is why drift scan also extends to 3rd party APIs your APIs integrate with, so that you have better visibility into your API dependencies and supply chain to detect unscheduled or unexpected changes that may cause breaking changes for your own APIs and applications. Drift scan is based on a subset of conformance scan, but because the scan involves 3rd party APIs, there is a crucial difference: a drift scan only sends GET requests and thus does not do any real fuzzing, to avoid causing issues to operational systems that you do not own or control.

Drift scan should be considered a monitoring tool for your existing live API deployments, not a qualitative design-time tool. Drift scan is not supported in the IDE.

Drift scan report

Running a drift scan produces a report that focuses on whether all API endpoints are still working as documented in the API contract or whether something has changed. Because drift scan is meant as a monitoring tool, it is geared more towards creating automated alerts on downstream services, for example, based on scan logs or scripting, than towards a report consumed from the platform UI. However, for transparency, the report can still be viewed on the platform UI, and it is very similar to the conformance scan report.
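One way to script automated alerts from drift-scan results, as suggested above, is to filter the result records and raise an alert on anything that is not the expected outcome. The result-record fields here are hypothetical, not a documented format:

```python
# Hypothetical alerting script over drift-scan result records.
# Field names ("outcome", "code", "expected") are illustrative assumptions.

def drift_alerts(results: list[dict]) -> list[str]:
    alerts = []
    for r in results:
        if r["outcome"] != "expected":  # anything else signals drift
            alerts.append(f"DRIFT on {r['method']} {r['path']}: "
                          f"got HTTP {r['code']}, expected {r['expected']}")
    return alerts
```

In a real setup, the returned messages would feed a notification channel or monitoring system rather than be printed.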

We will continue to adapt and improve the drift scan report in future releases.

Because drift scan aims to establish that the API implementation continues to work as documented, its focus is on happy path tests, and the report is filtered by default to show the happy path results. Just like in a conformance scan, the happy path requests are built directly from the API contract documented in the API definition, and the expected response codes are the same as in conformance scan. However, in addition, a drift scan intentionally sends out another happy path request without the required authentication information and specifically expects to receive HTTP 401. This allows you to monitor that the authentication measures on endpoints continue to protect the API implementation from unauthorized access.
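The test pair described above can be sketched as follows: for each GET operation, one authenticated happy path request plus the same request without credentials that specifically expects HTTP 401. The operation shape is illustrative:

```python
# Sketch of drift-scan test generation: authenticated + unauthenticated pairs,
# GET operations only. The operation dict shape is an assumption for illustration.

def drift_tests(operations: list[dict]) -> list[dict]:
    tests = []
    for op in operations:
        if op["method"] != "GET":  # drift scan only sends GET requests
            continue
        tests.append({"path": op["path"], "auth": True,
                      "expected": op["expected_codes"]})
        tests.append({"path": op["path"], "auth": False, "expected": [401]})
    return tests
```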

Clicking an issue in the result list provides further details on it, such as whether the received response matched what the scan expected, and where the issue occurred. The request and response tabs provide more details on the request that the scan sent and the response it received. In contrast to a conformance scan, by default a drift scan only tests GET requests, so other operations, even if present in the API definition, are not tested in the scan.