SBoMs Are Growing Up: Reflections on CISA's SBoM Conference

July 31, 2023 Ramiro Salas

Last month, the Cybersecurity and Infrastructure Security Agency (CISA) organized a one-day software bill of materials (SBoM)-centric conference, both amusingly and aptly entitled SBoM-a-Rama. It was a hybrid event allowing for both in-person and remote participation; I chose the latter.

As a longtime security practitioner, I’ve been observing the development of this field with great enthusiasm, but always from the periphery. So participating in this conference seemed like the perfect opportunity to gain a fresh, direct perspective on the market landscape, delivered by the very practitioners, stakeholders, and pioneers who have been elbow-deep in the subject since it first appeared.

By now, any software producer aiming to make a sale to the U.S. government should be intimately familiar with Executive Order 14028 on Improving the Nation’s Cybersecurity. A significant stipulation of this order is the requirement for federal agencies to ensure their software product and service providers are capable of generating SBoMs that are in accordance with both the executive order and the National Telecommunications and Information Administration’s report, The Minimum Elements For a Software Bill of Materials. Naturally, this stipulation sent the industry into an almost complete tailspin as everyone scrambled to meet these mandates ahead of the initially provided deadlines.

However, as explored during the conference, the subject of SBoMs isn't as straightforward as running a command and bundling the resulting output file with your product. That complexity is precisely why the following themes emerged as central discussion points during the conference.

Quality

What happens when your product leverages multiple components sourced from various suppliers and/or open source projects? What if the standards employed in their build process are inconsistent? Or if those processes have differing life cycles and vulnerability management processes? How can software consumers ensure that these components are unaltered from the moment the SBoM was produced until the point of deployment?

The topic of SBoM quality emerged as a primary concern among both the speakers and the general audience at the conference. It was discussed that this issue is exacerbated by process inconsistencies across vendors, an absence of quality benchmarks, and a nebulous understanding of what a "good" SBoM should look like. In essence, the burning question was, "Can we trust a vendor's SBoM?" Furthermore, are there any available verification mechanisms? SBoMs are perceived as assertions that, without a consistent attestation mechanism, risk adding minimal value in the long run.
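To make the quality question a bit more concrete, here is a minimal sketch of the kind of self-serve check a consumer might run against a vendor's SBoM today, assuming a CycloneDX-style JSON file and using a handful of the NTIA minimum elements as heuristics; the specific checks are illustrative only and are no substitute for a real attestation mechanism.

```python
import json
import sys

# Minimal, illustrative quality gate: check a CycloneDX-style JSON SBoM
# for a few of the NTIA minimum elements. A sketch, not a verification
# or attestation mechanism.

REQUIRED_COMPONENT_FIELDS = ("name", "version", "purl")  # purl as the unique identifier (assumption)

def check_sbom(path):
    with open(path) as f:
        sbom = json.load(f)

    problems = []

    # Author and timestamp of the SBoM itself
    metadata = sbom.get("metadata", {})
    if "timestamp" not in metadata:
        problems.append("missing metadata.timestamp")
    if not (metadata.get("authors") or metadata.get("tools")):
        problems.append("missing author/tool information")

    # Per-component minimum elements
    for i, component in enumerate(sbom.get("components", [])):
        for field in REQUIRED_COMPONENT_FIELDS:
            if not component.get(field):
                problems.append(f"component #{i} missing '{field}'")

    # Dependency relationships
    if not sbom.get("dependencies"):
        problems.append("no dependency relationships declared")

    return problems

if __name__ == "__main__":
    issues = check_sbom(sys.argv[1])
    print("\n".join(issues) if issues else "basic checks passed")
```

Even checks like these only tell you what an SBoM claims; the conference discussion made clear that, without attestations, the claims themselves still have to be taken on faith.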

The Internet Engineering Task Force's (IETF) Supply Chain Integrity, Transparency, and Trust (SCITT) working group was mentioned a few times throughout the event for its promising work. Given their commitment to transparency, integrity, and accountability in this area, coupled with the IETF's proven track record, their efforts are worth looking into. According to the working group, its aim is to standardize the technical flows for providing information about a software supply chain (including firmware), covering the essential building blocks that make up the architecture.

Ironically, though, not a single mention of blockchain technology was made during the entire discussion of this topic (at least to my recollection), even though this could have been the one use case where a distributed ledger would have been a reasonable approach (in conjunction with a distributed reputation model). The key takeaway, however, is that the industry is finally aligning on the most critical needs and setting priorities appropriately.

Package naming was another recurring topic. The software industry is rife with varying naming conventions across multiple ecosystems and stacks, many of which appear similar yet are distinctly different. The PURL standard seems to be gaining traction, as does BOM-Link, used in the CycloneDX format. However, it's premature to declare a clear winner, or even to say whether one is needed. Even if the industry does eventually converge on a format, it will likely take some time to align all package managers to a common standard.
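As a rough illustration of the naming problem PURL tries to solve, the snippet below lists identifiers for packages from different ecosystems and pulls them apart; the packages and versions are arbitrary examples, and the parsing is deliberately simplistic.

```python
# Illustrative only: the same "which package is this?" question answered
# with PURLs from different ecosystems. Packages and versions are
# arbitrary examples.
purls = [
    "pkg:npm/lodash@4.17.21",                                # npm
    "pkg:pypi/requests@2.31.0",                              # PyPI
    "pkg:maven/org.apache.logging.log4j/log4j-core@2.17.1",  # Maven (namespace/name)
    "pkg:golang/github.com/sirupsen/logrus@v1.9.0",          # Go modules
]

for purl in purls:
    # General shape: pkg:<type>/[<namespace>/]<name>@<version>
    body = purl.removeprefix("pkg:")
    path, _, version = body.partition("@")
    ecosystem, _, name = path.partition("/")
    print(f"{ecosystem:8} {name:40} {version}")
```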

Non-disclosure agreements and secrecy

Security by obscurity has long been considered an outdated approach, found to cause more harm than good in many scenarios. However, the devil is in the details, and there are many sectors where confidentiality reigns supreme and lawyers wield more power and influence than security practitioners (until they require our help, that is).

This tension was on display during the healthcare industry discussion, where supply chain secrecy was depicted as a matter of life and death. Literally.

While this impacts several sectors, the conversation largely focused on the firmware supply chain for medical devices, a critical area where undisclosed components might contain vulnerabilities that could potentially be exploited by malicious actors, posing a threat to patients.

Trust is the fundamental cornerstone of the patient–doctor relationship. Software supply chain security is rarely an explicit topic when a medical device enters that relationship, yet trust operates as a transitive construct, like the unspoken safety agreement between you and the restaurant where you eat.

Despite these challenges, security practitioners in the medical field are making strides towards transparency. For example, NewYork-Presbyterian showcased Daggerboard, an open source project that translates SBoM files into human-readable output, with a focus on comprehensive risk evaluation.

Other presentations, such as the one on the automotive industry, underscored that software vulnerabilities in their supply chains also carry life-and-death consequences, even in a heavily regulated industry.

Vulnerability Exploitability eXchange

I was pleased to see the topic of Vulnerability Exploitability eXchange (VEX) receiving considerable attention at this conference.

VEX is a machine-readable format conceived out of the necessity to communicate the status of a vulnerability to software consumers. This empowers users to make appropriate risk assessments, apply workarounds if available, or, depending on the tangible risk posed by the vulnerability, implement additional safeguards. VEX also allows software providers to make assertions about the status of specific Common Vulnerabilities and Exposures (CVEs) and to map all affected components.

Other methods do exist, such as the Vulnerability Disclosure Report (VDR), but their approach differs, and in some instances they are complementary. The Open Worldwide Application Security Project (OWASP) has already created a superb breakdown of the differences between VDR and VEX, so if you're intrigued by this subject, their article should be your top reference.

CISA, in partnership with the industry, has taken an important lead in better framing VEX, specifying the minimum elements required for VEX to be meaningful and useful and describing its use cases in detail. What’s more, CISA has also detailed the proper triaging workflow, the steps that should be considered at each stage, and the language (tags) that should be used when describing the outcome of a triaging operation (specifically, status and justification).
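To give a flavor of what such an assertion can look like, here is a minimal sketch of a single VEX statement built as a Python dictionary, loosely modeled on the OpenVEX layout discussed below; the document identifier, author, and product are placeholders, and a real document would carry additional metadata required by the specification.

```python
import json
from datetime import datetime, timezone

# Minimal sketch of a VEX statement, loosely modeled on the OpenVEX layout.
# The document @id, author, CVE, and product PURL below are placeholders.
vex_document = {
    "@context": "https://openvex.dev/ns",
    "@id": "https://example.com/vex/example-2023-001",  # placeholder identifier
    "author": "Example Vendor Product Security",        # placeholder author
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "version": 1,
    "statements": [
        {
            "vulnerability": {"name": "CVE-2021-44228"},
            "products": [{"@id": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1"}],
            # CISA-defined status values: not_affected, affected, fixed, under_investigation
            "status": "not_affected",
            # A justification is expected when the status is not_affected
            "justification": "vulnerable_code_not_in_execute_path",
        }
    ],
}

print(json.dumps(vex_document, indent=2))
```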

At the time of writing, CycloneDX, CSAF 2.0 Profile 5, and OpenVEX are the available VEX formats, all of which CISA recognizes as acceptable. Looking ahead, SPDX 3.0, currently in its release candidate stage, is also expected to incorporate VEX-related fields. CycloneDX has also demonstrated how it can accommodate eight of the nine use cases defined by CISA.

However, these formats are not identical: some fields appear in one but not in others, their workflows differ somewhat, and each has a different degree of adoption across the industry. It's too early to predict whether one of these formats will emerge as the dominant choice, whether a new one will surface, or whether tools will simply adopt a multi-format approach.

Some SBoM formats, like CycloneDX or the upcoming SPDX 3.0, can include VEX information, but the lifecycle of a VEX statement is inherently different from that of an SBoM. VEX content is expected to keep evolving as new vulnerabilities are discovered, reassessed, reclassified, and triaged, while the SBoM may remain static. Of course, you can still choose to keep them in separate files.

Regardless, given the graph nature of the relationships between SBoMs and VEX statements and the complexity of the various release processes across the industry, it's likely that the formats themselves will become less important over time, with the focus shifting to the tooling and workflows that ensure the outcome is delivered. In the end, as I mentioned before, both SBoMs and VEX statements are ultimately about sharing information that allows customers to properly manage their risk.

Other topics discussed

The following are other highlights covered at the conference.

International adoption

Presenters from both the EU and Japan showcased their current work, which maps very well to the efforts under way in the U.S. The discussion made clear that consistent international standards will help ensure every company eventually participates in these efforts, making the entire global software industry more secure and accountable.

Service transparency

The first draft of the service transparency advisory was announced, and the CISA SBoM Cloud Working Group will be moving the effort forward, with a focus on risk management, service availability, and data governance. This is a particularly interesting area given the increasing prevalence of software as a service (SaaS). CycloneDX already has a model that addresses SaaS, called SaaSBoM. It will be interesting to explore parallels and differences as this area develops further.
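As a rough sketch of the idea, the snippet below describes a hypothetical hosted service alongside the usual component data, modeled on the services concept that CycloneDX defines; the service, endpoint, and data classifications are made up for illustration, and the field names reflect the CycloneDX conventions as I understand them.

```python
import json

# Hypothetical example of describing a hosted service in an SBoM-like
# document, modeled on the CycloneDX "services" concept. The service,
# endpoint, and data classifications are invented for illustration.
service_entry = {
    "bom-ref": "service-billing-api",  # arbitrary local reference
    "name": "billing-api",
    "endpoints": ["https://api.example.com/v1/billing"],
    "authenticated": True,
    "data": [
        {"flow": "inbound", "classification": "PII"},
        {"flow": "outbound", "classification": "public"},
    ],
}

print(json.dumps({"services": [service_entry]}, indent=2))
```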

SBoMs for machine learning and AI

Given the increasing importance of machine learning (ML) systems and, more recently, large language models (LLMs) in the software industry, and how different they are from traditional executable programs, how should we reason about SBoMs for them? The architecture and execution model may be completely different, but some of the same questions that regular software is trying to address via SBoMs also apply to LLMs and ML models. How was the model trained? What’s the provenance of the data it was trained on? If synthetic data was used, should we consider its source a component of an SBoM? Should the model’s weights be part of an SBoM, given how they affect the output? Many of these questions came up towards the end of the Q&A session, and no clear answers were given. Rather, there was a general recognition that this is something we will eventually have to tackle.

Interestingly, CycloneDX once again has a taxonomy addressing the components of what it calls an ML-BoM.
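To make that slightly less abstract, here is a purely hypothetical sketch of how a model might be represented as a component, borrowing the machine-learning-model component type that CycloneDX defines; the provenance-related property names are my own invention for illustration, not part of any standard.

```python
import json

# Purely hypothetical sketch of an ML model described as an SBoM component,
# borrowing the "machine-learning-model" component type from CycloneDX.
# The provenance-related property names are invented for illustration.
ml_component = {
    "type": "machine-learning-model",
    "name": "fraud-detection-classifier",  # hypothetical model
    "version": "2.3.0",
    "properties": [
        # Invented property names: where did the training data come from,
        # and which weights were produced from it?
        {"name": "example:training-data-source", "value": "internal transactions, 2019-2022"},
        {"name": "example:synthetic-data-used", "value": "true"},
        {"name": "example:weights-digest", "value": "sha256:<placeholder>"},
    ],
}

print(json.dumps({"components": [ml_component]}, indent=2))
```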

Final thoughts

I came away with the impression that SBoMs, VEX, and all related technologies aimed at enhancing transparency in the software building process have matured and blossomed, and are beginning to reach into additional realms such as IoT and ML.

Far from being a burden, the executive order has proven to be the necessary stick in the absence of enough carrots. I can foresee only goodness stemming from this in the long run. I look forward to seeing how the entire industry coalesces around standards, processes, best practices, and software models—all in the name of bolstering our collective security.

I highly recommend you familiarize yourself with all the current and upcoming standards, initiatives, and proposals in the community, several of which are linked in this post. 

Join the practitioners: attend upcoming conferences, participate in working groups, and contribute your ideas, expertise, and use cases alongside others. A robust and secure software supply chain will help everyone.
