API Conformance Scan
API Conformance Scan is a dynamic runtime analysis of your API to check that the implementation behind your API and the behavior of the backend service match the contract set out in the OpenAPI (formerly known as Swagger) definition of the API.
You can run a scan on an API you have imported to 42Crunch Platform and deployed to find out if there are any mismatches between the API definition describing your API and what it actually does. If Conformance Scan testing finds any discrepancies, it reports the issues clearly so that you can fix them.
The scan generates real traffic to the selected API endpoint and could incur costs depending on your setup.
For best results, make sure that your OpenAPI definition is valid and well-formatted before you scan it. The API must be deployed so that the API endpoint is live, and the backend server your API uses must be accessible to Conformance Scan. Otherwise the API cannot be scanned.
If your account belongs to the free community organization, you cannot scan APIs in 42Crunch Platform, but you can still use the on-premises version of the scan.
Conformance Scan can have side effects: APIs can throw exceptions or fail, and data can be affected. As per our terms and conditions, you must only scan APIs that you own, and only against non-production systems and non-production data. Do not use Conformance Scan in a production environment!
Scan v1 and Scan v2
We have introduced a new version of Conformance Scan, referred to as Scan v2. For backward compatibility, and so that adopting the new version does not disrupt your day-to-day work, we have retained the previous version, Scan v1, and you can choose to continue using it for now. Results from Scan v1 continue to be used to represent the scan statistics of the API on the API summary page and on the list of APIs in an API collection.
Both versions of Conformance Scan share the same core features and operation, but the new Scan v2 offers additional features and more flexibility, such as:
- Multiple scan configurations for a single API that you can edit and iterate on
- New scan settings
- Optimizing scan report size, for example, by skipping curl requests in the report
- Validating just the response code, not the response, for happy path requests
- Only fields defined as `required` are validated in happy path requests
- Automated credential generation
Where applicable, the difference between the scan versions is clearly indicated in this documentation. Scan configurations and tokens are specific to a scan version: you cannot run Scan v2 using a Scan v1 scan configuration or scan token, and vice versa. When running a scan, make sure you specify the right scan token for the scan version you are using, otherwise Conformance Scan cannot use the associated scan configuration and fails to run.
You can run Scan v1 in 42Crunch Platform or on premises as a Docker image. Scan v2 is currently available only as a Docker image for on-premises scans.
Scan v2 does not yet support customization rules.
What you can scan
Both OpenAPI Specification v2 and v3 are supported. The file size of your API definition should not exceed 10 MB.
By default, Conformance Scan limits the maximum length of the strings in the requests it sends during the scan to 4096 characters. If the properties `minLength` or `maxLength`, or the length limits in a regular expression that you have defined for an API operation in your API definition, conflict with this limit, it causes issues during the scan.
If the minimum length required is longer than the string length limit allowed in Conformance Scan, the scan cannot create the happy path request for that operation to establish a baseline. If the maximum length allowed in the API is longer than the allowed string length limit in Conformance Scan, the scan can create the happy path request but not the actual request during the scan.
In both cases, the operation is shown as a skipped operation in the scan report, but for different reasons. You must fix the operation in your API definition before it can be successfully scanned.
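The two skip scenarios above can be sketched as a small check. This is an illustrative sketch only, not the actual 42Crunch implementation; the function name `skip_reason` and the cap constant are assumptions based on the 4096-character default described above.

```python
# Illustrative sketch (not 42Crunch code): deciding why an operation would be
# skipped when its string length constraints conflict with the scanner's cap.
SCAN_MAX_STRING_LENGTH = 4096  # assumed default cap, per the text above

def skip_reason(schema):
    """Return why the operation would be skipped, or None if lengths are fine."""
    min_len = schema.get("minLength", 0)
    max_len = schema.get("maxLength")
    if min_len > SCAN_MAX_STRING_LENGTH:
        # No valid value can be built at all, so no happy path baseline.
        return "happy path impossible: minLength exceeds the scan limit"
    if max_len is not None and max_len > SCAN_MAX_STRING_LENGTH:
        # A baseline exists, but the long fuzzed values cannot be built.
        return "test requests impossible: maxLength exceeds the scan limit"
    return None

print(skip_reason({"minLength": 5000}))
print(skip_reason({"minLength": 2, "maxLength": 10}))
```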
Conformance Scan does not support operations that require request bodies with the content type `multipart/form-data`. Only request bodies with the content types `application/json` and `application/x-www-form-urlencoded` are supported.
How Conformance Scan works
- Preparation: Conformance Scan checks the defined scan configuration and prepares the pieces required for the scan:
- Checks that the scan configuration you provided is valid.
- Parses the OpenAPI definition of your API, generating default values for the parameters your API operations require.
- Checks that any certificates provided for authentication to the API endpoint are valid.
- Tests the designated endpoint to check that the server is available.
- Happy path requests: Conformance Scan generates and sends a happy path request to all operations in your API to establish a successful benchmark. Any operations where the happy path request fails are skipped in the scan.
- Generating tests: The scan generates the test requests for the API operations in the scan based on the happy path requests. Each test request includes an intentionally tweaked element (header, body, parameters, HTTP method) so that the request no longer matches what the API expects. The implementation of the API should catch this and respond accordingly.
- Scan: Conformance Scan sends test requests at a constant flow rate to the live API endpoint.
- Conformance Scan waits up to 30 seconds for the API to respond before it raises a timeout error for the request in the scan logs.
- When the API responds, Conformance Scan analyzes the received response to determine if the API conforms to the contract it sets out in its OpenAPI definition.
Unlike the static testing in API Security Audit, Conformance Scan is dynamic testing and variable by nature. To better simulate real API traffic and more reliably test the API's behavior, the requests and parameter values that Conformance Scan generates are random, as is the order in which the requests are sent to the API. As a result, the API responses and the outcome of the scan can also vary. So do not be alarmed if you get slightly different results for your API in scans, that is completely normal.
You can customize how Conformance Scan behaves by creating scan rules and applying them to the APIs you want using tags. For more details, see Customizations.
Generating values for parameters
To successfully call the API operations in your API, Conformance Scan must follow the OpenAPI definition of the API and provide the required parameters in the calls. For this, when Conformance Scan loads the API definition in memory, it generates default values for each schema and parameter in the OpenAPI definition, and uses these values in the requests it sends. Because Conformance Scan does not generate any responses itself (it only validates the responses coming directly from the API), response schemas are excluded.
For the malformed test requests, Conformance Scan can simply generate random values, intentionally disregarding the constraints for schemas and parameters. However, for the happy path requests, the generated values must match all defined constraints.
Some formats are not random at all, but follow a standard pattern as defined in the OpenAPI Specification (OAS). Conformance Scan uses a default generator to match the standard constraints of formats like:
- Date and time (uses the current date and time by default, formats as defined by RFC 3339)
- Email addresses
- Hostnames
- IP addresses (both IPv4 and IPv6)
- URIs
- JSON pointers
- UUIDs
If the data format itself does not set a standard pattern, Conformance Scan uses the constraints set in your OpenAPI definition. If you provide a regular expression for a schema or parameter, Conformance Scan uses that to generate the value. Otherwise, Conformance Scan generates the value on its own.
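A format-driven generator of this kind can be sketched as follows. This is a hypothetical illustration in the spirit of the behavior described above, not the actual 42Crunch generator; the specific fallback values (`user@example.com`, the documentation IP address, and so on) are assumptions.

```python
# Hypothetical sketch of format-driven value generation for standard OAS
# formats (not the actual 42Crunch generator).
import datetime
import uuid

def sample_for_format(fmt):
    if fmt == "date-time":
        # RFC 3339 timestamp; the text above says the current time is used
        return datetime.datetime.now(datetime.timezone.utc).isoformat()
    if fmt == "date":
        return datetime.date.today().isoformat()
    if fmt == "email":
        return "user@example.com"
    if fmt == "hostname":
        return "api.example.com"
    if fmt == "ipv4":
        return "203.0.113.10"  # documentation range address
    if fmt == "uuid":
        return str(uuid.uuid4())
    # No standard pattern: fall back to the constraints in the definition
    return None

print(sample_for_format("email"))
print(sample_for_format("uuid"))
```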
Providing examples
It may be very difficult to create valid values for some things, like some object schemas, or strings with a very specific pattern. To ensure best performance, if you have complicated schemas in your API, we recommend including examples for these kinds of values directly in the OpenAPI definition.
There are several properties you can use for this, in order of preference:
- `x-42c-sample`
- `default`
- `enum`
- `example`
If the property `x-42c-sample` contains a value that is not valid against the schema, Conformance Scan tries to load the value from the property `default`, and so on, until it finds a sample value it can use. As a last resort, or if no samples are provided at all, Conformance Scan generates a value from scratch. If Conformance Scan cannot generate a default value that an API operation requires, that operation is skipped in the scan.
For more details on the vendor extension, see x-42c-sample. For more details on the other properties, see the OpenAPI Specification (OAS).
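The documented preference order can be sketched like this. The sketch is only illustrative: the validity check here is simplified to type checking, whereas the real scan validates candidates against the full schema.

```python
# Sketch of the documented sample-value preference order. Validity is
# simplified to a type check; the real scan validates against the schema.
SAMPLE_KEYS = ["x-42c-sample", "default", "enum", "example"]

def pick_sample(schema):
    """Return the first usable sample value, or None to generate from scratch."""
    expected = {"string": str, "integer": int, "number": (int, float),
                "boolean": bool}.get(schema.get("type"))
    for key in SAMPLE_KEYS:
        if key not in schema:
            continue
        value = schema[key]
        if key == "enum":
            value = value[0] if value else None  # take the first enum member
        if value is not None and (expected is None or isinstance(value, expected)):
            return value  # first valid candidate wins
    return None  # no usable sample: generate a value from scratch

# An invalid x-42c-sample (wrong type) falls through to default:
print(pick_sample({"type": "string", "x-42c-sample": 42, "default": "ok"}))
```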
Conflicts from regular expressions
When generating values, Conformance Scan considers the properties `minLength`, `maxLength`, and `pattern` separately. This means that if the string limitations in the regular expression in `pattern` do not match the `minLength` and `maxLength` values, Conformance Scan may not be able to generate a valid value. To prevent this, if Security Audit detects a conflict between these string properties, it raises an issue that prevents scanning the API until the conflict has been resolved.
There are several reasons why the conflict could happen. For example, instead of defining an exact length in the regular expression, you could have used `+` or `*`:

```json
"example": {
  "type": "string",
  "minLength": 2,
  "maxLength": 5,
  "pattern": "^[a-z]+$"
}
```
We recommend that instead of using `+` or `*`, you properly specify the length in the regular expression. If the regular expression is simple, you can specify the length both in the pattern and in `minLength` and `maxLength`, as long as the values do not conflict:

```json
"example": {
  "type": "string",
  "minLength": 2,
  "maxLength": 5,
  "pattern": "^[a-z]{2,5}$"
}
```
If you have a complex regular expression with multiple segments, the lengths of all segments and `minLength` and `maxLength` must match. In this case, it is probably better to specify the length limits in the regular expression itself and omit `minLength` and `maxLength`:

```json
"example": {
  "type": "string",
  "pattern": "^(https?:\\/\\/)?(www\\.)?[-a-zA-Z0-9@:%._\\+~#=]{2,256}\\.[a-z]{2,6}\\b([-a-zA-Z0-9@:%_\\+.~#?&//=]*)$"
}
```
Remember to include the anchors `^` and `$` in your regular expression, otherwise the overall length of the pattern could be considered infinite. If you include the anchors in the regular expression and the pattern only has fixed or constant quantifiers (like `{10,64}`, for example), you do not have to define the property `maxLength` separately for the object, as the length is fully constrained by the pattern. However, if the regular expression does not include the anchors or its quantifiers are not fixed (like in `^a.*b$`), it can be considered to be just a part of a longer string, and the property `maxLength` is required to constrain the length.
In both cases, it is always beneficial to provide an example value that Conformance Scan can use when generating happy path requests.
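One rough way to see whether a pattern and the length bounds can agree is to probe the pattern with strings of each allowed length. This brute-force sketch is purely illustrative (the real Security Audit check is static analysis, and the lowercase probe string is an assumption that only suits letter-based patterns):

```python
# Illustrative brute-force probe: can the pattern accept any all-lowercase
# string whose length lies within [min_len, max_len]?
import re

def lengths_conflict(pattern, min_len, max_len):
    """True if no lowercase probe of an allowed length matches the pattern."""
    rx = re.compile(pattern)
    return not any(rx.fullmatch("a" * n) for n in range(min_len, max_len + 1))

# "^[a-z]{2,5}$" agrees with minLength=2 / maxLength=5 ...
print(lengths_conflict(r"^[a-z]{2,5}$", 2, 5))   # False: no conflict
# ... but "^[a-z]{6,9}$" cannot produce any value of length 2..5.
print(lengths_conflict(r"^[a-z]{6,9}$", 2, 5))   # True: conflict
```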
Types of requests
Conformance Scan sends different types of requests for different purposes. Happy path requests and test requests are part of both Scan v1 and Scan v2, but unhappy path requests and custom requests can only be configured for the new Scan v2, not for Scan v1.
Happy path requests
Conformance Scan needs a benchmark to determine if the incorrect behavior of the API was caused by the test request or some other failure. To establish this benchmark, Conformance Scan first sends a happy path request to the operations in the API before it starts the actual scan.
A happy path request is a valid request generated directly from the OpenAPI definition of your API, designed and expected to always succeed. Conformance Scan generates and sends this request to each operation defined in your API and validates the responses it receives.
For a happy path request to be a success, the response must be either successful or expected: the received HTTP status code must be `200`–`399`, or `404` (because the likelihood that the scan manages to generate a value that matches an existing ID is vanishingly small). Otherwise, the happy path request fails. If a happy path request fails, the operation in question is skipped in the scan, because any results for it would be inconclusive without a successful benchmark.
A failing happy path request often indicates significant issues in your OpenAPI definition that you must fix before Conformance Scan can scan the failing operation. Running Security Audit and checking what kind of issues it raises helps you find the culprits.
The happy path request can also fail because the connection to the defined endpoint was broken, or the API took too long (over 30 seconds) to respond.
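The success rule above is simple enough to express directly. A minimal sketch (the function name is an illustration, not part of the product):

```python
# Sketch of the documented happy path success rule: a status code in
# 200-399, or 404, counts as a usable baseline; anything else means the
# operation is skipped in the scan.
def happy_path_ok(status_code):
    return 200 <= status_code <= 399 or status_code == 404

print(happy_path_ok(201))  # True
print(happy_path_ok(404))  # True: unlikely-to-exist IDs are tolerated
print(happy_path_ok(500))  # False: operation would be skipped
```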
Unhappy path requests
This applies to Scan v2 only.
Happy path requests, as their name implies, cover the cases where everything goes well, which in most cases is how things should go. But sometimes you might want to test the implementation behind the HTTP response status codes for errors (such as `4XX` and `5XX`). Because happy path requests consider these response codes unsuccessful, they cannot provide the baseline for what the received response for error codes should look like. This is where unhappy path requests come into play.
To see how the API responses from your error handling fare in Conformance Scan, you can configure unhappy path requests that define what to normally expect for the specific error responses you want to test. Conformance Scan runs the unhappy path requests alongside the normal happy path requests to establish the baseline for the normal API responses from error handling, so that it can run the actual test requests against these endpoints during the scan.
Conformance requests
Once Conformance Scan has the baseline for what a success for a particular API operation looks like, it is ready to test if the API implementation correctly catches and handles malformed or erroneous requests. These conformance requests are the actual test requests that Conformance Scan runs, designed to uncover issues in the error handling of your API implementation and where it does not conform to what the OpenAPI definition of the API declares.
Each test request includes an intentionally tweaked element (header, body, parameters, HTTP method) so that the request no longer matches what the API expects. The implementation of the API should catch this and respond accordingly. The tweaks that Conformance Scan does in the test requests fall into three categories:
- Omit: Conformance Scan omits an element that the scanned API definition requires from the test request
- Add: Conformance Scan adds an extra element that is not included in the scanned API definition into the test request
- Fuzz: Conformance Scan changes an element of the test request so that it no longer matches what is defined in the scanned API definition (for example, using a value that does not match the schema constraints)
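The three tweak categories can be illustrated on a request's header map. The header names and helper functions below are made up for illustration; the real scan mutates many elements (headers, body, parameters, HTTP method), not just headers.

```python
# Illustrative examples of the omit/add/fuzz tweak categories applied to a
# request's headers (hypothetical names; not the actual scan internals).
def omit(headers, name):
    mutated = dict(headers)
    mutated.pop(name, None)   # drop an element the definition requires
    return mutated

def add(headers, name, value):
    mutated = dict(headers)
    mutated[name] = value     # inject an element the definition does not declare
    return mutated

def fuzz(headers, name, bad_value):
    mutated = dict(headers)
    mutated[name] = bad_value  # replace with a non-conforming value
    return mutated

base = {"Content-Type": "application/json", "X-API-Key": "secret"}
print(omit(base, "X-API-Key"))
print(add(base, "X-Extra-Header", "unexpected"))
print(fuzz(base, "Content-Type", "not-a-media-type"))
```

Each mutated request is then sent to the API, which should reject it with a documented error response.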
Custom requests
This applies to Scan v2 only.
Sometimes you might want to test a very specific thing in your API implementation and the normal conformance tests cannot catch that. In this case, you can write your own custom test request that captures the issue you want to test. The custom test can be as complex or as simple (for example, just checking that an authorization scenario works with the correct credentials) as you want and need.
Scan configuration
To successfully scan an API, Conformance Scan needs some basic information on what it is supposed to do:
- What API to scan?
- Which endpoint to send the requests to?
- How to authenticate to the API, if that is required?
For this, Conformance Scan needs a scan configuration, a JSON file that captures these details for each API that you want to scan.
Scan configurations and tokens are specific to a scan version: you cannot run Scan v2 using a Scan v1 scan token, and vice versa. When running a scan, make sure you specify the right scan token for the scan version you are using, otherwise Conformance Scan cannot use the associated scan configuration and fails to run.
You can quickly create a basic scan configuration in 42Crunch Platform by providing some basic information, or if a more complex scan configuration is needed, you may choose to work on it outside the platform in an editor of your choice and upload the finished configuration to the platform. You can also update your existing configurations later. Scan configurations are stored encrypted in 42Crunch Platform.
All available scan configurations for an API are listed on the Conformance Scan page. How many scan configurations a single API can have depends on the version of the scan you want:
- For Scan v1, you can have one scan configuration for running Conformance Scan on 42Crunch Platform and another for running it on premises. You can update these configurations as needed, but you cannot create more scan configurations.
Scan v1 configurations do not state what percentage of the API operations in the API they cover, or the estimated number of tests that could be run with them. This information is provided only for Scan v2 configurations.
- For Scan v2, you can have as many scan configurations as needed.
The details that you can view on a scan configuration depend on whether the configuration is for Scan v1 or Scan v2.
API endpoints
By default, Conformance Scan lists endpoint URLs that are parsed directly from your OpenAPI definition. However, if you want to use a URL that is not listed, you can also enter it when you configure the scan settings for Scan v1 or edit the configuration for Scan v2.
If you want to override the API endpoint defined in your API definition and scan a different endpoint, make sure you specify a valid host for the paths in your API definition. For example, if your API follows OAS v2 and uses a `basePath`, make sure you include the `basePath` in the URL you enter, otherwise the `basePath` is ignored in the scan configuration.
The URL you enter for the API endpoint must fulfill the following criteria:
- It is a public URL.
- It specifies either `http` or `https` (for example, `http://255.255.255.255` or `https://api.example.com/v2`).
- It is not an internal address (for example, `http://localhost` or `255.255.255.255`).
- It does not include parameters (for example, `http://www.example.com/products?id=1&page=2`).
- It does not include anchors (for example, `http://www.example.com#up`).
Authentication
If you have defined authentication for your API or its operations, Conformance Scan must provide the required details to authenticate when it sends requests to the API. You must therefore configure the authentication details for the security schemes in your API that you want to use in the scan. If you have not defined any security requirements in the API definition, no authentication is required.
Conformance Scan currently supports the following authentication types:
- Basic authentication
- API keys in headers, query strings, or cookies
- Bearer token
- OAuth 2
- OpenID Connect (OAS v3 only)
To configure OAuth2 authentication, you must first manually obtain an access token that Conformance Scan can use to authenticate to your API. In the scan configuration wizard, authentication with an OAuth token is configured like a bearer token. For more details on OAuth2, see RFC 6749.
If needed, you can also configure mutual TLS for client authentication. The client certificate must be in `p12` format. For more details, see Scan API conformance.
How authentication details are configured is different for Scan v1 and Scan v2:
- Scan v1 configuration: The configuration wizard shows all security schemes defined in the OpenAPI definition of your API. Fill in the details for the security schemes you want to use in the scan. You can leave the security schemes that you do not want to use in the scan empty and Conformance Scan will ignore these schemes. Any API operations that use only these security schemes for authentication are skipped in the scan.
- Scan v2 configuration: The basic scan configuration is generated automatically directly from your OpenAPI definition, including any authentication methods listed. The authentication tab of the configuration provides an at-a-glance summary of the available authentication methods, while the JSON file of the configuration lists the full details of each authentication method, as well as the environment variables (for example, `SCAN42C_SECURITY_OAUTH2`) that are used to provide the credentials in your Docker command when running the scan. You only need to provide credential details for the authentication methods that your API actually uses. If you have defined a method in your OpenAPI definition but your API does not actually use it, you do not need to provide credentials for it.
If you run Scan v1 in 42Crunch Platform, the authentication details are only used in the current scan and are not stored anywhere; for on-premises scan, the authentication details are stored encrypted as part of the scan configuration in the platform. The authentication details are not retrievable and credentials are hidden. For Scan v2, the authentication details are only provided in the Docker command and not stored anywhere.
With on-premises Scan v1, instead of hard-coding the authentication details in the scan configuration, you can use environment variables. See Using environment variables in Scan v1.
Additional settings
When you create a scan configuration for running Conformance Scan, there are various settings that you can choose to configure. Configuring these settings is entirely optional and not needed in most cases: the default settings that Conformance Scan uses are usually enough. However, for more advanced use, the settings let you tweak some aspects of the scan, such as memory limits.
The available settings may vary depending on whether you are running Scan v1 or Scan v2, and whether you run it in 42Crunch Platform or on premises. For the full list of available settings, see API Conformance Scan settings.
Scan environments
This applies to Scan v2 only.
The scan configuration for Scan v2 lets you use a single configuration file for scanning a single API deployed in different environments.
For example, you might have an API that is live in both your development and testing environments, but the required host or authentication details differ. In this case, you can simply edit the scan configuration to add another set of environment variables that are used when running the scan in the other environment. This way, you can reuse the same configuration without having to recreate the parts that stay the same.
Scan configuration always includes definitions for at least one environment, but you can expand the configuration as needed.
Scan token
Creating a scan configuration also produces a scan token. The token indicates to Conformance Scan which API it should scan and with which settings. If running Conformance Scan on premises in a Docker container, the scan token is passed in the environment variable `SCAN_TOKEN`. When Conformance Scan starts, it connects to 42Crunch Platform and fetches the scan configuration that matches the specified scan token. This ensures that the on-premises scan runs the correct configuration for your API. If running Conformance Scan in 42Crunch Platform, you do not have to provide the scan token separately.
Scan configurations and tokens are specific to a scan version: you cannot run Scan v2 using a Scan v1 scan token, and vice versa. When running a scan, make sure you specify the right scan token for the scan version you are using, otherwise Conformance Scan cannot use the associated scan configuration and fails to run.
When the on-premises scan starts, it establishes a two-way, HTTP/2 gRPC connection to 42Crunch Platform at the address `services.<your hostname>` and the port `8001`. Make sure that your network configuration (like your network firewall) authorizes these connections. The on-premises scan uses this connection to verify the scan token and to download the scan configuration you have created. During runtime, the on-premises scan uses the connection to send the scan report and logs to the platform.
If you are a user in the free Community organization and access 42Crunch Platform at `https://platform.42crunch.com`, your hostname is `42crunch.com`, and the endpoint you must enable is `services.42crunch.com`.
If you are an enterprise customer not accessing 42Crunch Platform at `https://platform.42crunch.com`, your hostname is the same as in your platform URL.
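To verify from the scan host that the firewall allows this connection, you can attempt a plain TCP connection to the services endpoint on port 8001. This probe is only a connectivity sketch (it does not speak gRPC), and the hostname in the commented example is the Community-cloud endpoint; substitute your own platform hostname.

```python
# Connectivity probe for the gRPC endpoint: can we open a TCP connection
# to services.<your hostname> on port 8001? (Sketch only; not gRPC.)
import socket

def can_reach(host, port, timeout=5):
    """Return True if a TCP connection to host:port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (requires network access; substitute your own platform hostname):
# print(can_reach("services.42crunch.com", 8001))
```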
Scan configuration validation report (SCVR)
This applies to Scan v2 only.
By default, the basic scan configuration is based directly on the OpenAPI definition of the API, and thus is always valid for that API. However, if you edit the basic configuration further, or create a more complex scan configuration with additional features, you might inadvertently introduce discrepancies or errors into the scan configuration, so that it contradicts the OpenAPI definition and Conformance Scan can no longer successfully scan the API.
To let you detect such errors before you run Conformance Scan, when you create, update, or upload a scan configuration, Conformance Scan checks if the configuration is valid for the OpenAPI definition of the API in question. If Conformance Scan finds any discrepancies that would mean that the scan could not successfully run with that configuration, these errors are flagged to you so you can fix them. You cannot run a scan using a scan configuration that is not valid for the API in question.
Response validation
Conformance Scan tests how the API implementation handles, for example, requests to operations not defined in the OpenAPI definition at all, or misconfigured requests to existing operations. How the API responds to the crafted test requests in the scan determines whether or not it conforms to the contract it sets out in its API definition. To catch issues, Conformance Scan validates the response from the API and considers, for example, the following:
- Is the provided HTTP status code defined in the `response` object in the API definition?
- Are all headers in the response defined in the `response` object in the API definition, and do they match the defined `schema`? Are all required headers present in the response?
- Is the returned response body too big?
- Should the response even have a body (method `HEAD` or returned status code `HTTP 204`)?
- Does the `Content-Type` of the returned response match the types defined in the content map in the API definition, and does it match the defined `schema`?
If the response body in the response from the API exceeds 8 KB, it is truncated.
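Two of the checks in the list above are simple enough to sketch directly: whether a body is allowed at all, and whether the response's `Content-Type` appears in the operation's content map. The helper names are illustrative, not the actual scan internals.

```python
# Illustrative response validation helpers in the spirit of the checks
# listed above (hypothetical names; not the actual scan internals).
def body_allowed(method, status):
    """A HEAD response or an HTTP 204 response should have no body."""
    return method.upper() != "HEAD" and status != 204

def content_type_defined(content_type, content_map):
    """Is the returned media type among those defined for this response?"""
    # Strip parameters such as "; charset=utf-8" before comparing
    return content_type.split(";")[0].strip() in content_map

print(body_allowed("HEAD", 200))                                   # False
print(body_allowed("GET", 200))                                    # True
print(content_type_defined("application/json; charset=utf-8",
                           {"application/json"}))                  # True
```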
Response validation is done in two parts:
- Response code: Did error handling work? Did the received HTTP status code in the API response match what Conformance Scan expected? Or in the worst case, did the intentionally malformed request not raise an error at all?
- Contract conformity: Did the received response match what is defined in the OpenAPI definition of the API?
From security risk perspective, incorrect error handling poses a bigger risk for the API implementation than response content, and therefore Conformance Scan focuses on it first.
- For requests to non-existing operations, Conformance Scan expects the API to respond with `HTTP 405 Method not allowed`. Any other response is considered to be wrong.
- For misconfigured requests to existing operations:
  - The returned HTTP status code must be equal to or greater than `HTTP 400` to indicate an error.
  - The returned HTTP status code (or a `default` response) must be defined in the OpenAPI definition of the API.
Based on this, Conformance Scan splits the received response codes into three classes:
- Incorrect: The API did not raise an error, but responded with a success to a malformed request. This means that the API implementation does not catch and handle the error at all, indicating serious problems. This is the worst case and the result is shown in red.
- Unexpected: The received response code does not match what Conformance Scan expected for the test request, but the API implementation still raised an error, even if not the correct one. This means that there are some problems in the error handling in the API implementation, but at least the issue is caught. The result is shown in amber.
- Expected: The received response code matches what Conformance Scan expected for the intentionally malformed request, and the API implementation raised the error correctly. This means that the API behavior is good and the result is shown in green.
However, even if response codes match what Conformance Scan expects, it does not mean all is well. Conformance Scan could also uncover discrepancies between the API contract set out in the OpenAPI definition and the backend API implementation:
- The returned response body must match what is defined in the API definition for the returned HTTP status code.
- The returned response headers must match what is defined in the API definition (if response headers are defined).
Based on analyzing the response bodies and headers, the received responses are flagged either as failures or successes in contract conformity.
By default, Conformance Scan does not follow redirects (`HTTP 3XX`) in API responses to analyze the final response, but instead analyzes the received redirect. Depending on your API, this could result in a conformance failure if the response definition in your API is the expected final response that the redirects would lead to. You can change this behavior in the scan settings if needed, but we do not recommend it, as it may prevent the scan from completing: the final response from a redirect can often be in an unsuitable format, resulting in an error. By not following redirects, Conformance Scan can complete and successfully evaluate the main concern: whether the error handling of your API implementation is working as it should.
Scan report
Conformance Scan produces a scan report that provides valuable information on how well your API conforms to its API definition. The report summarizes how many tests were run, what was scanned, and how the scan went. You can also check how severe the found problems were.
For Scan v1, you can have a total of two reports available for a single API: one from the latest scan run in 42Crunch Platform and another from the latest scan run on premises. For Scan v2, each scan configuration you have retains the report from the latest scan where that configuration was used.
In a scan report, the tests and their findings are listed by path and operation; one operation can have multiple tests run against it, in which case the results for all of them are listed. Above the list of found issues, you have a "Critical to Success" filter bar that you can use to home in on the scan results:
- OWASP vulnerabilities: Which OWASP API Security Top 10 vulnerabilities Conformance Scan found in your API.
- Operations not tested: Which operations could not be tested in the scan and why.
- Test results: How the received API responses were classified in response validation. The results are grouped in the order of severity, starting with the most critical issues on the left. As you address the issues, they move to the right towards the other end of the scale where everything is good.
The filter bar also shows the distribution (in percentage) of the test results between different result classes.
To see this latest version of the scan report with the "Critical to Success" filter bar, you need to scan your API again. Otherwise, you see the old-style scan report with the bar charts for paths above the list of found issues. If running Conformance Scan on premises, you need `42crunch/scand-agent:v1.16.0` or later to get the new-style scan report.
You can click on the filters to view only the results you are interested in. These filters are not cumulative: clicking on one changes the listing completely, it does not add more results to the existing view. However, within these main filters, you can further refine the current result list using the additional dropdown filters.
The filter sidebar on the left lets you filter the results by path, and also shows how many issues were found in each path, as well as which operations the scan had to skip, for example, because the happy path request failed.
Clicking an issue in the result list provides further details on it, such as the description and ID of the test that the scan performed, the URL the scan called, the response time of the API, and the size and content type of the response. The issue details also show which response codes Conformance Scan expected to receive as well as where that expectation comes from.
To make it easier to reproduce the results, the report also provides the cURL requests the scan used to detect each issue.
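The exact shape of the cURL commands in a report depends on the test that produced them, but they can be replayed or dissected programmatically. As an illustration, a minimal parser sketch for a simple report-style cURL command (the command, URL, and token below are hypothetical, not taken from an actual report):

```python
import shlex

def parse_curl(curl_cmd: str):
    """Split a simple cURL command into method, URL, and headers
    so that the request can be replayed programmatically."""
    tokens = shlex.split(curl_cmd)
    method, url, headers = "GET", None, {}
    i = 1  # skip the leading "curl"
    while i < len(tokens):
        tok = tokens[i]
        if tok == "-X":
            i += 1
            method = tokens[i]
        elif tok == "-H":
            i += 1
            name, _, value = tokens[i].partition(":")
            headers[name.strip()] = value.strip()
        elif not tok.startswith("-"):
            url = tok
        i += 1
    return method, url, headers

# Hypothetical example resembling a scan-report request:
# parse_curl("curl -X POST -H 'Authorization: Bearer abc123' https://api.example.com/pets")
```

This sketch handles only the `-X` and `-H` flags; real report commands may include other options, such as request bodies.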
You can download a copy of the report, for example, to share the results outside the platform.
Like with the audit score, you can also see how the number of issues that Conformance Scan found in your API has changed over time in the Conformance Scan trends chart on the API summary page.
Expected and unexpected HTTP status codes
HTTP status codes are a crucial part of API traffic: they communicate the status of a request back to its sender, whether that is a backend service responding to clients and API consumers, or microservices communicating with other microservices within the same architecture. Especially in the latter case, sending back a wrong response code could have serious and unforeseen consequences down the line, which is why response code analysis is a critical part of Conformance Scan.
The issue details in the scan report show both which HTTP status codes Conformance Scan received and which it expected to receive for any given test. This helps you decide how to fix the possible discrepancies.
Conformance Scan also shows the source of its expectation for a particular HTTP code:
- 42Crunch default expectations: These are HTTP status codes that Conformance Scan expects to receive based on standards, such as RFC 7231 or RFC 7235.
- Customization rules: These are HTTP status codes that have been defined as expected response codes in the scan rules applied to the scanned API.
We recommend using HTTP status codes as defined in RFCs as much as possible to avoid any accidental mismatches between the sending and the receiving end.
Running Conformance Scan on premises
Running Conformance Scan in 42Crunch Platform is quick and straightforward, but in some cases it might not offer enough options for your needs, or you might want to store the data from scans in your own system. In this case, you can deploy and run Conformance Scan locally as a Docker image.
For users in the free Community organization, and currently for Scan v2, running the scan on premises is the only option available.
To run the on-premises scan, you create a scan configuration in 42Crunch Platform, and then pull and run the Conformance Scan Docker image from Docker Hub, providing the scan token of the configuration you created in your Docker command. Scan configurations and tokens are specific to a scan version: you cannot run Scan v2 using a Scan v1 scan token, and vice versa. When running a scan, make sure that you specify the right scan token for the scan version you are using, otherwise Conformance Scan cannot use the associated scan configuration and fails to run.
When the on-premises scan starts, it establishes a two-way HTTP/2 gRPC connection to 42Crunch Platform at the address services.<your hostname> and the port 8001. Make sure that your network configuration (like your network firewall) authorizes these connections. The on-premises scan uses this connection to verify the scan token and to download the scan configuration you have created. During runtime, the on-premises scan uses the connection to send the scan report and logs to the platform.
If you are a user in the free Community organization and access 42Crunch Platform at https://platform.42crunch.com, your hostname is 42crunch.com, and the endpoint you must enable is services.42crunch.com.
If you are an enterprise customer not accessing 42Crunch Platform at https://platform.42crunch.com, your hostname is the same as in your platform URL.
Regular users can only create scan configurations and run the on-premises scan on APIs in their own API collections. Organization administrators can create scan configurations and run the on-premises scan on all APIs in their organization.
You can check where (in 42Crunch Platform or on premises) the latest scan was run on the API summary page.
For more details on how to run the on-premises scan and customize the scan configuration, see Scan API conformance.
Using environment variables in Scan v1
When running Conformance Scan on premises, you can use environment variables and supply values for them in your Docker command when you run the scan. This way you can easily test the different authentication methods for your API.
When configuring the authentication for an on-premises scan configuration, you can enter an environment variable in any field instead of hard-coding a value. The environment variable can be called anything you want, as long as it fulfills the following criteria:
- Must be inside curly brackets ({})
- Must start with $
- Cannot contain special characters other than -, _, and .
- Must be longer than one character (just {$} is not a valid environment variable)
Environment variables are currently not supported for mutual TLS password.
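The criteria above can be captured in a single pattern. A sketch of such a check (not official 42Crunch validation logic, just an illustration of the rules; letters and digits are assumed to be the allowed non-special characters):

```python
import re

# {$NAME} where NAME uses only letters, digits, "-", "_", or "."
# and is at least one character long, so a bare {$} is rejected
ENV_VAR_PATTERN = re.compile(r"^\{\$[A-Za-z0-9_.\-]+\}$")

def is_valid_scan_variable(value: str) -> bool:
    """Return True if the value is a well-formed scan environment variable."""
    return ENV_VAR_PATTERN.fullmatch(value) is not None
```

For example, {$ACCESS_TOKEN} and {$api.key-1} pass the check, while {$}, a bare name without brackets, or a name containing a space do not.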
When you run the on-premises scan, you provide the values for the environment variables in your run command. The variables must have the prefix SECURITY_ added before them, for example:
docker run -e SCAN_TOKEN=<your scan token> -e SECURITY_ACCESS_TOKEN='<the access token value you want to use>' 42crunch/scand-agent:latest
Providing authentication details for Scan v2
For Scan v2, the basic scan configuration is generated automatically from the OpenAPI definition of the API. The credential values are not hardcoded into the configuration; instead, the configuration lists the credentials and all found authentication methods with their corresponding environment variables (for example, SCAN42C_SECURITY_OAUTH2), and you provide the credential values you want to use in your Docker command when running the scan.
When you run Scan v2, you provide the values for the variables in your run command:
docker run -e SCAN_TOKEN=<your scan token> -e PLATFORM_SERVICE=services.42crunch.com:8001 -e SCAN42C_SECURITY_OAUTH2='<the token value you want to use>' 42crunch/scand-agent:v2.0.0
If you do not provide credentials to all required authentication methods listed in your scan configuration, Conformance Scan cannot run tests on all API operations because it cannot authenticate to your API.
Scan logs
As part of its operation, Conformance Scan produces logs. When running the scan in 42Crunch Platform, the logs are stored in the platform, but when running Conformance Scan on premises, you have more options.
By default, Conformance Scan run on premises writes all log levels as standard output (STDOUT) to the console and your terminal, but only uploads ERROR and CRITICAL level scan logs to 42Crunch Platform. You can also direct the STDOUT logs to be consumed by downstream services (see the documentation for your environment for how this is done), or write the logs to a volume that you mount as part of your run command. The scan writes logs to the file <your local filepath>/opt/scand/log/<task ID>-<epoch timestamp>-scand.log.
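For example, to keep the log file after the container exits, you could mount a host directory over the container's log path. A sketch (the host directory name and image tag are illustrative; substitute your own scan token):

```shell
# Mount a host directory over /opt/scand/log so that the scan's
# log file survives after the container exits.
docker run -e SCAN_TOKEN=<your scan token> \
  -v "$(pwd)/scan-logs:/opt/scand/log" \
  42crunch/scand-agent:latest
```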
The default size limit for the log file is 100 MB, but you can lower the limit further, if needed. When you create the scan configuration for the on-premises scan, you can also choose how many issues you want the scan to report (the default is 1000). Decreasing the maximum number of reported issues also decreases the maximum possible size of a single scan report. Large response bodies do not inflate the logs: response bodies over 8 KB are truncated.
If the log file size exceeds the size of the volume you mounted, the scan raises errors, but Conformance Scan continues to run normally. These errors are also uploaded to 42Crunch Platform, but the rest of the DEBUG and INFO level logs are only written as STDOUT, not to file.
Errors in Conformance Scan
Occasionally, Conformance Scan might fail to scan your API. The reason for this could be, for example:
- Invalid OpenAPI definition: Your API definition has critical errors that are preventing Conformance Scan from running. For example, the structure of your API might not conform to the OAS. Use API Security Audit to check your API definition and fix any found issues in Security Editor, then try Conformance Scan again.
- Invalid scan configuration: The configuration you set up for the scan does not match your API definition and thus is not valid. For example, you might have chosen an authentication method that does not match the ones defined in your API definition. Try configuring and running Conformance Scan again, making sure the authentication details match your API definition.
- Scan cannot reach API endpoint: Conformance Scan tried to run the scan but failed to reach the API endpoint you had selected for the scan. The API host could be down, or there could be an error in the URL, especially if you entered a custom URL. Check the URL and the host of your API and try again.
- Timeout: The scan took longer than the maximum scan duration (3600 seconds).