
Input Schema

Overview

Schemas are declarative documents that define the structure, data types and constraints of inputs being scanned. Trivy provides certain schemas out of the box as seen in the explorer here. You can also find the source code for the schemas here.

It is not required to pass schemas in order to scan inputs with Trivy, but they are required if type checking is needed.

Checks can be defined with custom schemas that allow inputs to be verified against them. Adding an input schema enables Trivy to show more detailed error messages when an invalid input is encountered.

Unified Schema

One of the unique advantages of Trivy is its ability to take a variety of inputs, such as IaC files (e.g. CloudFormation, Terraform, etc.) as well as live cloud scanning (e.g. the Trivy AWS plugin), and normalize them into a standard structure, as defined by the schema.

An example of such an application would be scanning AWS resources. You can scan them prior to deployment via the Trivy misconfiguration scanner, and also scan them after they have been deployed in the cloud with Trivy AWS scanning. Both scan methods should yield the same result, as resources are gathered into a unified representation as defined by the Cloud schema.
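As an illustration, a check written against the unified Cloud schema might look like the following sketch. The field names used here (aws.s3.buckets, encryption.enabled) are assumptions that mirror the general shape of the built-in cloud checks; refer to the Cloud schema in the explorer for the authoritative structure.

Example

# METADATA
# schemas:
# - input: schema["cloud"]
package user.cloudexample

# Flag S3 buckets without encryption enabled, regardless of whether the
# bucket came from an IaC file or a live AWS scan (illustrative only).
deny[res] {
    bucket := input.aws.s3.buckets[_]
    not bucket.encryption.enabled.value
    res := result.new("Bucket encryption is not enabled", bucket)
}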

Supported Schemas

Currently out of the box the following schemas are supported natively:

  1. Docker
  2. Kubernetes
  3. Cloud
  4. Terraform Raw Format

You can interactively view these schemas with the Trivy Schema Explorer.

Example

As mentioned earlier, amongst other built-in schemas, Trivy offers a built-in schema for scanning Dockerfiles; it is available here. Without input schemas, a check would be as follows:

Example

# METADATA
package mypackage

deny {
    input.evil == "foo bar"
}

If this check is run against an offending Dockerfile, no issues will be reported because the check will fail to evaluate. Although the check's failure to evaluate is legitimate, it should not result in a passing result for the scan.

For instance, if we have a check that looks for misconfigurations in a Dockerfile, we could define the schema as follows:

Example

# METADATA
# schemas:
# - input: schema["dockerfile"]
package mypackage

deny {
    input.evil == "foo bar"
}

Here input: schema["dockerfile"] points to a schema that expects a valid Dockerfile as input. An example of this can be found here.

Now, when this check is evaluated, a more descriptive error is reported to help fix the problem.

1 error occurred: testcheck.rego:8: rego_type_error: undefined ref: input.evil
        input.evil
              ^
              have: "evil"
              want (one of): ["Stages"]

Custom Checks with Custom Schemas

You can also bring a custom check that defines one or more custom schemas.

Example

# METADATA
# schemas:
# - input: schema["fooschema"]
# - input: schema["barschema"]
package mypackage

deny {
    input.evil == "foo bar"
}

The checks can be placed in a directory structure as follows:

Example

/Users/user/my-custom-checks
├── my_check.rego
└── schemas
    ├── fooschema.json
    └── barschema.json
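The schema files themselves are ordinary JSON Schema documents. Since the check above references input.evil, a minimal illustrative fooschema.json that would allow it to type-check might look like this:

Example

{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "properties": {
    "evil": { "type": "string" }
  }
}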

To use such a check with Trivy, use the --config-check flag that points to the check file or to the directory where the schemas and checks are contained.

$ trivy --config-check=/Users/user/my-custom-checks <path/to/iac>

For more details on how to define schemas within Rego checks, please see the OPA guide.

Scan arbitrary JSON and YAML configurations

By default, scanning JSON and YAML configurations is disabled, since Trivy does not contain built-in checks for these configurations. To enable it, pass json or yaml to --misconfig-scanners. Trivy will pass each file as-is to the check's input.

Example

$ cat iac/serverless.yaml
service: serverless-rest-api-with-pynamodb

frameworkVersion: ">=2.24.0"

plugins:
  - serverless-python-requirements
...

$ cat serverless.rego
# METADATA
# title: Serverless Framework service name not starting with "aws-"
# description: Ensure that Serverless Framework service names start with "aws-"
# schemas:
#   - input: schema["serverless-schema"]
# custom:
#   id: SF001
#   severity: LOW
package user.serverless001

deny[res] {
    not startswith(input.service, "aws-")
    res := result.new(
        sprintf("Service name %q is not allowed", [input.service]),
        input.service
    )
}

$ trivy config --misconfig-scanners=json,yaml --config-check ./serverless.rego --check-namespaces user ./iac
serverless.yaml (yaml)

Tests: 4 (SUCCESSES: 3, FAILURES: 1)
Failures: 1 (UNKNOWN: 0, LOW: 1, MEDIUM: 0, HIGH: 0, CRITICAL: 0)

LOW: Service name "serverless-rest-api-with-pynamodb" is not allowed
═════════════════════════════════════════════════════════════════════════════════════════════════════════
Ensure that Serverless Framework service names start with "aws-"

Note

In the case above, the custom check specified has a metadata annotation for the input schema input: schema["serverless-schema"]. This allows Trivy to type check the input IaC files provided.

Optionally, you can also pass schemas using the --config-file-schemas flag. Trivy will use these schemas for file filtering and type checking in Rego checks.

Example

$ trivy config --misconfig-scanners=json,yaml --config-check ./serverless.rego --check-namespaces user --config-file-schemas ./serverless-schema.json ./iac

If the --config-file-schemas flag is specified, Trivy ensures that each input IaC config file being scanned is type-checked against the schema. If an input file does not match any of the passed schemas, it will be ignored.
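The contents of serverless-schema.json are not shown above; a minimal illustrative sketch that covers the input.service field used by the check might look like this:

Example

{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "properties": {
    "service":          { "type": "string" },
    "frameworkVersion": { "type": "string" },
    "plugins":          { "type": "array" }
  }
}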

If the schema is specified in the check metadata and is in the directory specified in the --config-check argument, it will be automatically loaded as specified here, and will only be used for type checking in Rego.

Note

If a user specifies the --config-file-schemas flag, Trivy ensures that all input IaC config files pass type checking. Passing an input schema is not required if type checking is not needed. This is helpful for scenarios where you simply want to write a Rego check and pass IaC input to it, such as scanning a new service that Trivy might not support just yet.

Tip

It is also possible to specify multiple input schemas with the --config-file-schemas flag, as it accepts a comma-separated list of file paths or a directory as input. If multiple schemas are specified, all of them will be evaluated against all the input files.
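For example, assuming the fooschema.json and barschema.json files from the earlier layout:

Example

$ trivy config --config-check ./my-custom-checks --config-file-schemas ./my-custom-checks/schemas/fooschema.json,./my-custom-checks/schemas/barschema.json <path/to/iac>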