How could Steampipe and Turbot help with setting Cloud Security Baselines?

Greg
4 min read · Feb 4, 2024

I previously wrote a few thoughts about Cloud Security Baselines. The idea started in a spreadsheet, used to describe and tailor the controls for the most impactful areas first (resources that can be publicly accessible and may expose data), and expanded from there.

With the end of the Zero Interest Rate Policy era, investments in tooling face greater scrutiny and must demonstrate a return. As part of exploring different options, I have looked beyond the CNAPP tools to consider other approaches.

I have spent a couple of hours exploring Turbot Pipes and Steampipe as tools to help automate and scale up the approach.

Takeaways

The hosted version of Steampipe, Turbot Pipes, is an effective, low-cost way of getting up and running to measure against your baselines across different cloud providers.

To make full use of the features and tailor the baselines to your organisation, it is necessary to customise the Steampipe “Mods”. Simply fork the Mods, maintain your customisations, and stay on top of the latest industry best practices as the Steampipe community adds them to the upstream version.

The bigger gap is the process of working with your organisation to make changes to your current cloud infrastructure. Steampipe has a Jira add-on so that you can publish snapshots of the baselines into your different team workspaces, but these require building and customising before you can get measurable improvements made to your cloud infrastructure.
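One way to start closing that gap is to turn failing checks into Jira issues automatically. A minimal sketch follows; the input row shape (dicts with `control`, `resource`, `status`, `reason` keys) and the project key `SEC` are illustrative assumptions, not from the Steampipe docs, while the payload shape matches Jira's standard create-issue REST API:

```python
# Sketch: turn failing baseline checks into Jira issue payloads.
# The results shape and the "SEC" project key are assumptions for
# illustration; the payload structure follows Jira's create-issue API.

def alarms_to_jira_payloads(results, project_key="SEC"):
    """Build one Jira issue payload per check result in 'alarm' status."""
    payloads = []
    for r in results:
        if r["status"] != "alarm":
            continue  # only failing checks become tickets
        payloads.append({
            "fields": {
                "project": {"key": project_key},
                "summary": f"[{r['control']}] {r['resource']}",
                "description": r["reason"],
                "issuetype": {"name": "Task"},
            }
        })
    return payloads
```

Each payload could then be POSTed to the Jira create-issue endpoint by whatever pipeline publishes the snapshots.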

Exploring the features of Turbot and Steampipe

Turbot Pipes is an intelligence, automation & security platform built specifically for DevOps.

Steampipe is an open-source zero-ETL engine to instantly query cloud APIs using SQL.

Turbot monitors the configuration of infrastructure and services via the native cloud APIs.

  • It has integrations with existing tooling such as GitHub and the GCP and AWS cloud providers.
  • There are maintained frameworks of checks to query the services against industry best practice.
  • We can build our own checks and queries, as it uses a Postgres-compatible interface, and can integrate it with our Looker BI tooling for reporting.
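Because the interface is Postgres-compatible, any standard Postgres driver (e.g. psycopg2 pointed at the local Steampipe service) can run these queries. A minimal sketch over a generic DB-API connection; the `aws_s3_bucket` table and `bucket_policy_is_public` column come from Steampipe's AWS plugin:

```python
# Sketch: run a Steampipe query over any DB-API 2.0 Postgres connection
# (e.g. one opened with psycopg2 against the local Steampipe service).
# aws_s3_bucket / bucket_policy_is_public are from Steampipe's AWS plugin.

PUBLIC_BUCKETS_SQL = """
    select name, region
    from aws_s3_bucket
    where bucket_policy_is_public
"""

def fetch_public_buckets(conn):
    """Return (name, region) rows for buckets whose policy is public."""
    cur = conn.cursor()
    cur.execute(PUBLIC_BUCKETS_SQL)
    return cur.fetchall()
```

The same rows could feed a Looker view, since Looker can read from any Postgres-compatible source.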

“Workspaces” are used to isolate the data and mods in the tooling. https://turbot.com/pipes/docs/workspaces

  • Workspaces can run on schedules.
  • Workspaces auto-disable after 4 days of no access into a “stopped” state that can be “woken”.

Connections can be shared between workspaces and identities.

Aggregators — summarise data in a workspace.

Datatank — stores the results in a workspace as a snapshot.

Turbot uses a cloud-managed database handled by the Steampipe team, so rather than managing the underlying databases yourself, the Turbot platform does it for you.

Using Steampipe to check AWS benchmarks as a PoC

Steps

I installed it into an AWS account by granting a cross-account read-only role that is used to query the account's resources:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::XXXXXXXXXX:root"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "u_cXXXXXXXXXXXXX:XXXXXXXXX"
        }
      }
    }
  ]
}

The AWS CIS Compliance v2.0.0 dashboard was set up after a few minutes. For production, using infrastructure as code with CloudFormation or Terraform is a much better choice; Turbot provides templates for these too.
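The trust policy above can equally be generated programmatically before feeding it into IaC. A small sketch; the account ID and external ID arguments stand in for the redacted placeholders above, and in practice Turbot Pipes supplies the real values during setup:

```python
import json

def build_turbot_trust_policy(turbot_account_id, external_id):
    """Build a cross-account trust policy like the one shown above.

    Both arguments are placeholders: Turbot Pipes supplies the real
    account ID and per-workspace external ID during setup.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"AWS": f"arn:aws:iam::{turbot_account_id}:root"},
                "Action": "sts:AssumeRole",
                "Condition": {
                    "StringEquals": {"sts:ExternalId": external_id}
                },
            }
        ],
    }

# Serialise for pasting into a CloudFormation/Terraform template.
policy_json = json.dumps(
    build_turbot_trust_policy("123456789012", "u_example:workspace"), indent=2
)
```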

The reporting is helpful for understanding how the accounts are configured, but out of the box it lacks the ability to customise the framework and document the controls.

To customise the controls, you will likely need to fork the compliance Mod on GitHub and update the checks that are run for that framework.

The Mods themselves are open source.

The details of the individual checks that are run are documented in the Mod documentation. The AWS Compliance Mod has 536 checks: https://hub.steampipe.io/mods/turbot/aws_compliance/queries

As an example, for a CloudTrail check the query is defined once and then referenced from the relevant frameworks.

These are linked to the compliance frameworks and the remediation guidance for that issue.

https://hub.steampipe.io/mods/turbot/aws_compliance/controls/control.cis_v200_3_2?context=benchmark.cis_v200/benchmark.cis_v200_3

Tags are used to describe the resources, following a taxonomy for Steampipe.
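Those tags make it possible to slice a framework programmatically, e.g. selecting only the CIS level 1 automated checks for a tailored baseline. A sketch; the list-of-dicts shape for controls is an illustrative assumption, while the tag keys mirror those used in the AWS Compliance Mod:

```python
# Sketch: select controls from a mod by their tags. The list-of-dicts
# shape is an assumption for illustration; the tag keys (cis_level,
# cis_type, ...) mirror those used in the AWS Compliance Mod.

def filter_controls(controls, **wanted_tags):
    """Keep controls whose tags match every key/value in wanted_tags."""
    return [
        c for c in controls
        if all(c.get("tags", {}).get(k) == v for k, v in wanted_tags.items())
    ]
```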

How could the reports be operationalised in an organisation?

  1. Customise the framework checks (if necessary).
  2. Create a snapshot of the configuration in the workspace.
  3. Share snapshots to other workspaces in the organisation. https://steampipe.io/docs/snapshots/batch-snapshots#sharing-snapshots
  4. Maintain a security workspace where the queries and reports are customised and built.
  5. Create an org-level workspace to publish validated snapshots into, which can be used to work with teams.

How could dashboards be customised?

Using the AWS Compliance Mod as an example, the frameworks are described in markdown.

  • The query pulls in the data for the associated checks for that control.
  • Tags are defined for associating the data into sections and for querying.

For a custom dashboard, different checks can be included or omitted and documented as code in a forked version of the Mod.

control "cis_v200_3_2" {
  title         = "3.2 Ensure CloudTrail log file validation is enabled"
  description   = "CloudTrail log file validation creates a digitally signed digest file containing a hash of each log that CloudTrail writes to S3. These digest files can be used to determine whether a log file was changed, deleted, or unchanged after CloudTrail delivered the log. It is recommended that file validation be enabled on all CloudTrails."
  query         = query.cloudtrail_trail_validation_enabled
  documentation = file("./cis_v200/docs/cis_v200_3_2.md")

  tags = merge(local.cis_v200_3_common_tags, {
    cis_item_id = "3.2"
    cis_level   = "2"
    cis_type    = "automated"
    service     = "AWS/CloudTrail"
  })
}

The query is then defined in the conformance_pack maintained by the Steampipe community:

query "cloudtrail_trail_validation_enabled" {
  sql = <<-EOQ
    select
      arn as resource,
      case
        when log_file_validation_enabled then 'ok'
        else 'alarm'
      end as status,
      case
        when log_file_validation_enabled then title || ' log file validation enabled.'
        else title || ' log file validation disabled.'
      end as reason
      ${local.tag_dimensions_sql}
      ${local.common_dimensions_sql}
    from
      aws_cloudtrail_trail
    where
      region = home_region;
  EOQ
}

It uses ok or alarm to describe the outcome of the checks. https://github.com/turbot/steampipe-mod-aws-compliance/blob/main/conformance_pack/cloudtrail.sp#L383
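Since every check resolves to one of those statuses, summarising a benchmark run is straightforward once the results are exported. A sketch; the row shape is an assumption about the export rather than a documented Steampipe format:

```python
from collections import Counter

# Sketch: summarise exported check results by status. The row shape
# ({"resource", "status", "reason"}) is an assumption about the export,
# not a documented Steampipe format; the statuses follow the ok/alarm
# convention used in the query above.

def summarise(results):
    """Return a status -> count mapping, e.g. {'ok': 12, 'alarm': 3}."""
    return dict(Counter(r["status"] for r in results))
```

A tally like this is enough to track an account's baseline score over time in a BI dashboard.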


Greg

Security addict, 17+ years in industry making systems more secure and finding those that aren’t