Open Source Compliance in the context of containers

Modern software development, especially in microservice architectures, is hard to imagine without container technologies such as Docker. The promises of ease and flexibility through containers prove true. But what does this flexibility mean for open source compliance?

It’s wonderful: just take an image from Docker Hub as a base (FROM instruction), e.g. Alpine Linux – a very lightweight Linux distribution – then add some source files from your own repository (COPY) and build the target application with Maven (RUN). Quickly add a web server with apk add (RUN) and some configuration (another COPY). The Java service is ready for delivery. Thanks to Docker!
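The recipe just described could look roughly like the following sketch. The base image tag, package names, paths and the jar name are illustrative assumptions, not taken from a real project:

```dockerfile
# Base: a very lightweight Alpine Linux (illustrative tag)
FROM alpine:3.11

# Pull build tooling and a web server from the Alpine package repository
RUN apk add --no-cache openjdk11 maven nginx

# Copy our own sources into the image
COPY . /app
WORKDIR /app

# Build the target application with Maven
RUN mvn -q package

# Add some configuration
COPY nginx.conf /etc/nginx/nginx.conf

CMD ["java", "-jar", "target/app.jar"]
```

Seven instructions suffice, yet the FROM line and each apk or Maven step pull thousands of third-party files into the image.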

It seems like very little: a Dockerfile with six or seven lines, plus a handful of your own classes, maybe 300-500 lines of code. Nevertheless, in total half a million lines of code are created. 99.9% of it is open source, pulled in additionally from the net when building the image. Irrespective of any security concerns, this is how many of the services used today are created, especially in the context of current microservice architectures.

March 11, 2020 | In Knowledge | By Jan Thielscher

A considerable share of almost all services is based on Open Source

This description is not intended to diminish the achievements of service developers, nor is it universally applicable. However, it roughly shows the extent to which open source has spread in today’s software development. A significant part of the efficiency gain in software production is based on the free availability of infrastructure solutions as open source.

Irrespective of this, however, this prevalence also brings new challenges. In the commercial environment in particular, the use of open source must be well documented, and not only for security reasons. There are also legal reasons:

Whenever an author’s work is used by a third party, the user must secure the rights of use. This is done by an agreement – even a tacit one – between the author (copyright holder) and the user. Usually the author places his code under a “license”. In the context of open source, these licenses typically grant the rights to commercial or scientific use, modification and distribution in source or binary form, etc. There are hundreds of these licenses, some more permissive, some more restrictive; see the OSI license list or the official SPDX list.

Know the conditions of Open Source

What is very important for users of open source, however, is that many of the comparatively permissive licenses also impose conditions on their validity. If these conditions are not fulfilled, the right of use is void!

The following excerpt from the well-known Apache 2.0 license, for example, clearly ties the right of distribution to compliance with additional conditions. These include, among others, providing the license text, naming the authors, and retaining copyright information in the source code. Where applicable, these must also be reproduced in a “NOTICE” file. If any changes have been made to the software, they must be declared as well. If even one of these conditions is not fulfilled, the right to distribute the software expires!

Containers add a special aspect to this already demanding list of requirements. With their help, essential parts of the runtime environment can be delivered preconfigured. This extends the scope of the delivery considerably. Where previously a ZIP, EAR or WAR file with your own source code was delivered, a Linux with a preinstalled Tomcat, Wildfly or other open source components can now be delivered in the container, ready for use.

If not only the Dockerfile – i.e. the recipe for the build – is delivered but the finished image, all infrastructure components become part of the delivery. Depending on the type of license, this can have different consequences, especially in the area of documentation, but also with regard to intellectual property. In any case, it becomes critical if one of the license conditions is not fulfilled. Unauthorized commercial distribution is no trivial offence.

The majority of open source licenses require at least the naming of the author, if not the provision of the license text together with the delivered solution. However, if you look at the images on Docker Hub, you will find neither bills of material nor corresponding metadata, let alone license texts. Rarely is there even a hint or link to the terms and conditions attached to the Dockerfile or image.

4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions:
  1. You must give any other recipients of the Work or Derivative Works a copy of this License; and
  2. You must cause any modified files to carry prominent notices stating that You changed the files; and
  3. You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and
  4. If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License.

Accepting documentation as a challenge

Woe betide the person who now has to produce the compliance documentation for a dockerized product assembled across several value creation stages and partners! As described above, the image finally used can originate from the most diverse sources. Especially critical are the manually added files: they can contain anything imaginable and elude virtually any form of analysis.

Furthermore, the author of an image can copy files into the image from public or private sources, load source code, install and execute packages or software, or even build software inside the image. The latter is also possible with the help of package managers, which in turn resolve and fetch the necessary dependencies. In short: in each layer, a Docker image offers its author the opportunity to do everything he could do on a server as well.

Tools and support

Fortunately, there are several open source tools that can help here. Not all of them were created for this purpose, but some of them can be used. There is tern on the one hand and Clair on the other.

Tern is developed by a small team at VMware to decompose Docker images and collect the license information of the components contained in them. Its focus is on analysis for legal purposes.

Clair was created with a focus on vulnerability analysis and pays less attention to the legal side. However, Clair also indexes the components of a container and could therefore serve as a supplier of BOM information. If only the legal aspect is in focus, though, Clair would be using a sledgehammer to crack a nut (see below).

tern

Tern can perform its analysis on the basis of a Dockerfile as well as a Docker image. To do this, tern uses Docker itself, as well as the Linux “find” command in conjunction with the extended attributes of the Linux file system. In newer Linux distributions the required attr library is already available; in older ones it may have to be installed separately.

Since tern invokes the find command via a simple bash script and analyzes the files on mounted host volumes, the Docker version of tern is unfortunately currently not executable on Mac or Windows. It runs well on a Linux system, however. tern is developed in Python 3 and can also be installed via pip.
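A typical invocation might look like the following; the image name is only an example, and the exact options may differ between tern versions:

```shell
# Analyze a published image (requires a running Docker daemon)
tern report -i debian:buster -o report.txt

# Or analyze a Dockerfile before the image is even built
tern report -d ./Dockerfile -o report.txt
```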

As a result, tern prints a list of the components it has found in each layer of the image (see example).

A Docker container is created from a Docker image; the Dockerfile is, so to speak, the building regulation according to which the Docker daemon assembles the image. You can imagine the construction as analogous to a 3D print: each command line of the Dockerfile creates its own layer. These layers are separated by namespaces and are therefore more independent of each other than on a real system.
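These layers can be made visible for any local image with Docker's own tooling; `docker history` lists one line per layer together with the instruction that created it:

```shell
# Show the layers of an image and the Dockerfile instruction behind each one
docker history --no-trunc ubuntu:18.04
```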

Thus an existing image can be loaded with the help of the FROM command. It comes with everything its author put into it – and what exactly that is can hardly be determined afterwards. This is one of the reasons why you should only obtain images from trustworthy sources or, preferably, build them from scratch!

Images should always come from trustworthy sources or be built from scratch!


------------------------------------------------

Layer: 9a78064319:

info: Found 'Ubuntu 18.04.4 LTS' in /etc/os-release.

info: Layer created by commands: /bin/sh -c #(nop) ADD file:91a750fb184711fde03c9172f41e8a907ccbb1bfb904c2c3f4ef595fcddbc3a9 in / 

info: Retrieved by invoking listing in command_lib/base.yml

Packages found in Layer: adduser-3.116ubuntu1, apt-1.6.12, base-files-10.1ubuntu2.8, base-passwd-3.5.44, bash-4.4.18-2ubuntu1.2, bsdutils-1:2.31.1-0.4ubuntu3.5, bzip2-1.0.6-8.1ubuntu0.2, coreutils-8.28-1ubuntu1, dash-0.5.8-2.10, debconf-1.5.66ubuntu1, debianutils-4.8.4, diffutils-1:3.6-1, (…) , zlib1g-1:1.2.11.dfsg-0ubuntu2

Licenses found in Layer:  None

------------------------------------------------

Layer: dd59018c47:

(…)

If things go well, tern also supplies metadata such as licenses for the identified packages. The tool still has some difficulties with the analysis of chained commands (&&) in the Dockerfile and – inevitably – with the contents of COPY statements. To overcome the latter, the tern team has created a way to integrate other tools, such as Scancode (https://github.com/nexB/scancode-toolkit) from nexB.

In addition, we are currently working on extending tern with an interface to TrustSource so that the results of the analysis can be transferred directly to TrustSource for further processing. The image would become a separate project, the layers found would be interpreted as TrustSource modules, and the components found within a layer as components. This would make it possible to combine the identified components with TrustSource metadata and vulnerability information, thus integrating the container into a compliance process and automating the documentation as far as possible.

Clair

On the other hand there is Clair, written in Go and originally developed by Quay.io – now part of Red Hat via CoreOS. As described above, Clair focuses on the identification of vulnerabilities in Docker images. The heart of Clair is a database which is continuously fed with vulnerability information.

This database can be accessed via an API and can thus be integrated with the container registry used in your own project. Scans then do not have to be done manually but can be integrated into the CI/CD chain after image creation.

In the course of the scan, a bill of materials of the image is generated – analogous to tern’s procedure – and transferred to Clair. There the components found are compared with the collected vulnerability information. If necessary, Clair can trigger notifications to draw attention to identified vulnerabilities.

In principle, this is a sensible setup. Especially where there are a large number of image producers and consumers, it is highly recommended to explicitly check the images for vulnerabilities again.

However, the use of image-specific whitelists, as envisaged by Clair, is quite critical in this context: vulnerability 1 may be harmless in use case A and can therefore be placed on the whitelist, but could be dangerous in context B. Image-specific whitelists would therefore be rather counterproductive. As in our TrustSource solution, whitelists must be context-specific.

Another limitation of Clair is the time of the scan, which only takes place after the image has been built and transferred to the repository. Actually, no image with vulnerabilities should reach a repository at all! Errors should already be found in the CI/CD process, and the image should not even be built, let alone stored.

For this reason Armin Coralic (@acoralic) has extracted a scanner version based on the work of the CoreOS team which can be integrated into the CI/CD process, bringing the check forward, so to speak (see GitHub). In its current form, however, this version only displays the vulnerabilities found, not the entire bill of materials, although it must be able to determine it. It would therefore be desirable to output the component list available in Clair directly, which would require appropriate adjustments to the Clair project.

TrustSource Platform

We generally recommend shifting all checks as far forward in the software development process as possible. This enables developers to query the status of a library or component, or its possible vulnerabilities, at the moment they install it. Should the component prove to be too old, too faulty or legally unsuitable, the developer saves time and unnecessary integration work.

TrustSource uses the Check-API for this purpose. This enables developers to perform these checks from the command line. The TrustSource scanners already mentioned above (see here) can also cover this task automatically in the CI/CD pipeline for a large number of languages. This is already _after_ integration, but still _before_ further processing by others. 

TrustSource can also integrate runtime components as infrastructure modules in the respective project context and enrich them with meta data by means of crawlers that run independently in the background. This collection of information reduces the documentation effort and, by concentrating the information in one place, allows you to automatically monitor appropriately released compositions throughout the life cycle. Detailed information on the individual TrustSource services and how they support compliance tasks can be found at https://www.trustsource.io.

Conclusion

The preceding explanations show that by using container technologies such as Docker, runtime components can quickly become part of the “distribution” in addition to the pure source code. This increases the urgency of precise, systematic software composition analysis. Although there are a handful of open source tools that make it possible to approach this challenge, they are still quite isolated and only loosely integrated into a comprehensive, life-cycle-oriented process.

With TrustSource, EACG offers process-oriented documentation support that is designed for the life cycle of an application and integrates many tools for the different tasks. TrustSource itself is available as open source in a community edition for in-house use, but can also be subscribed to as SaaS.

EACG also offers consulting in the area of Open Source strategy as well as in the design of Open Source governance and compliance processes. We develop the TrustSource solution, which is also available as open source, and thus help companies to securely exploit the advantages of open source for their own value creation and to avert the associated risks. 




TrustSource v1.9 updated

TrustSource v1.9.6 updated to minor release v1.9.16

There are still a few steps to take before we will be able to launch our long-awaited v2.0.0. But we already had a cool set of enhancements and features available that we did not want to hide any longer, so we decided to provide them with this interim update.

New Features:

  • Deep-Links have been added to DeepScan results

The deep link allows you to jump directly to the file in the scanned repository, so that you may review the findings with your own eyes.

  • DeepScan Feedback loop

You may edit the findings of DeepScan in place and thus enhance or modify what was identified automatically. We will use your input to add the cases to our learning model.

  • New Approval-Screen for Compliance Managers

Compliance managers get a new screen, “Approvals”, allowing them to manage all approval requests from one place.

  • Version age indicator introduced

As we have a good understanding of all existing versions, we have added an hourglass symbol in the module details view. It moves into warning (yellow) and critical (red) states as the delta between the used and the latest version grows. You may define the sensitivity of this measure in the module settings or mute the alarm in case you require a specific version.

Improvements:

  • New legal questionnaire

As many users reported problems completing the legal questionnaire, we have simplified it. Out of seven questions, we were able to convert three into options to switch on or off (Want that protection? yes/no). The remaining questions are simple enough to answer, so no more obstacles to completing the questionnaire should remain. (Please let us know if you think differently!)

Fixes:

  • Treatment of “no-license” cases in the Notice file amended: the license “no license” was handled like a regular license when generating a Notice file. Such a component will now no longer appear in the Notice file, and a warning will be issued instead.
  • Some spelling mistakes have been corrected.
  • A few layout fixes where long names/values broke the design.

Enjoy the improved version and feel free to comment / feedback your opinion.


TrustSource v1.9 release available!

We are happy to announce our latest release of TrustSource, version 1.9 to arrive Sept. 10th 2019!

With v1.8 we introduced a more detailed versioning schema, allowing a closer matching between the code repositories and the scan results by introducing support for branches and tags. In this release, one of the major updates is the new approval process, which allows you to specify the combination of versions that shall be taken for an approval. But besides this, there is even more:

New Features:

  • Improved Approval Process:
    The new approval is started as in the former version, but a wizard will guide you through the steps. You will be able to give the approval a particular name and select the modules and analysis versions to apply. When you are confident with your selection and issue the approval request, TrustSource will generate all relevant documentation (Notice file, if required a SOUP file – see below – and a new compliance report) and route the request to the compliance manager(s).
    They will find the information in their inbox and can immediately dive into the new compliance report.
  • Compliance Report:
    The new compliance report gives a comprehensive summary of the project situation, including legal and vulnerability status. It also lists the required documentation and shows the degree of completion. In addition, the changes in component status a project manager has made are outlined, so manual status changes become immediately visible, comments on muted vulnerabilities are available for review, and more…
  • DeepScan:
    How often do you rely on the declared license? How many of us really dig deep to ensure that there is no little snippet of code carrying another license? I probably don’t want to know. However, we proudly present DeepScan. This new service allows you to specify the URL of a component repository, which DeepScan will then assess for you in the background. It searches for all sorts of data that relate to a license specification and presents the findings to you. Give it a try! You might be surprised…
  • Support for Medical Device Directive Documentation:
    In 2020 the European Medical Device Directive will require developers of software related to medical devices to clearly document all third-party software and to register each change (UDI-ID). We are preparing to support this procedure and already offer automatically generated SOUP lists as well as COTS component management. Further steps will follow – stay tuned via our newsletter.

Improvements:

  • Improved sorting and selections from lists
  • Multi-license cases will be marked as warning as long as no decision about the license to use has been made, even if both licenses would result in no obligations. Thus all required decisions will be easy to identify.
  • We were able to identify some inputs from scanners being misinterpreted, e.g. “>=”-version conditions, resulting in difficulties with the correct version matching. These cases will be properly resolved now.


Visual Studio plugin extension for .Net-framework available

Today we are proud to announce the availability of the new Visual Studio plugin, allowing you to scan .Net-Core and .Net-Framework projects with one plugin only. The simplest way to obtain the plugin is through the Visual Studio marketplace. As the plugin itself is open source, you will also find the code on GitHub.

Continuing our initiative to bring TrustSource – the modern art of open source management – to the Microsoft developer community, we have combined both worlds in one plugin, allowing you to handle both frameworks in one solution.

Our next step on this road will be to extend the command-line capability of the combined solution, allowing you to integrate the scanning into your CI/CD chain as well. We expect this to be available by the end of March.

Feel free to contact us any time to clarify questions or to give feedback on usage experiences or issues you encounter during installation. We are happy to hear from you!


New release v1.7 introduces Notice-file-Generator

We are proud to announce the long-awaited Notice File Generator. With this tool, the time-consuming digging through files and collecting of license information belongs to the past. With the Notice File Generator, it is available in only one click!

For all scanned projects, TrustSource evaluates the resulting obligations depending on the project context. For each license-component combination, the resulting documentation requirements are derived and the information is assembled into the Notice file. Wherever change notices or author credits are required, TrustSource will try to fill in the information from its knowledge base or flag a to-do, so the compliance manager's work can focus on closing the gaps. Thanks to our shared open source component base, digging for authorship or copyright information does not have to be repeated where another customer has already done the job. This reduces clearing time a lot!

In addition, we renovated and extended some of our plugins. With most of them it is now possible to break your build based on the results of a scan, e.g. on violations. This extends your compliance control over the CI/CD chains. Proxy capability has also been added, so that use behind corporate firewalls is possible as well.

The new user management allows free accounts to log in using GitHub or LinkedIn accounts. The new identity management also introduces the option to assign multiple roles to a single user inside the same company, which increases the flexibility of your role design. Starting with the corporate editions, multi-factor authentication is available, as is integration of LDAP and other identity providers.

Read more about the features in our Knowledge base.


.Net-Core Support available

We are proud to announce new tools to support scanning of .Net-Core projects today. You will find the platform independent tool in our public repository on Github.

The tool scans .Net-Core projects and transfers the list of dependencies to our platform for further analysis. There, the identified components are researched for license and vulnerability information, and the legal obligations depending on the usage context are derived.

It consists of two parts: the first is the scanner itself, which takes care of the dependency resolution and the assembly of all relevant data. The second is the console app, which allows command-line interaction with the scanner. If executed from within the project path, an execution might be triggered with a statement as simple as

$ dotnet TS-NetCore-Scanner.dll -user "user@domain.com" -key "TrustSource Key"

While the console application itself is written in .Net-Core and thus more or less platform independent, the corresponding Visual Studio plugin is available for the Windows platform only. If you feel like the Visual Studio Plugin for .Net-Core should be available on other platforms as well, please let us know!

Finally, we close this relevant gap in our tooling landscape. Together with the new NuGet crawler we have provided, .Net-Core developers can now experience the same level of quality other developers have enjoyed for quite a while. But this is not enough. There is more to come:

In a next step we will extend the plugin with the capability to scan .Net-Framework projects as well. This will bind the solution to the Windows platform, which is where we see most Visual Studio developers delivering.

We hope those of you enjoying the Microsoft development world will now start enjoying TrustSource as well. Please feel free to reach out with questions or ideas on how we may improve the plugins! We are happy to learn about your usage experiences.


New Release v1.6 available

We are happy to announce the availability of v1.6! It again comes with massive new features, focused on process improvements. Read more:

New Features

  • Vulnerability alert - It took us quite a bit to get the matching to a reasonable quality, but we managed it after all. TrustSource will now notify you if new vulnerabilities appear for components that you are using in your most recent build.
  • "Action required" items in inbox - Especially for our compliance managers we provide an in box on the dashboard listing all open approval requests. This allows you to immediately see, where action is required.
  • Dependency graph - The so-called dependency list is a flat list of all components entering the project, including through transitive dependencies. To allow a better understanding of the impact a component has, the graphical display lets you actually _see_ its position within the dependency graph. You may modify the appearance and expand or collapse single nodes for better visibility.

Improvements

  • Improved rule sets - Based on customer feedback and our own research, we were able to improve the analysis results of several licenses.
  • Improved Maven plugin - The Maven plugin has been extended to support the check functionality, allowing you to verify components on the dev desktop without the need to push a scan.
  • Improved Jenkins plugin - The Jenkins plugin has also been extended to use the transient version of the check-API.

Fixes

  • Add name in register form - Changing your name when logging in for the first time after having been invited is now possible.
  • Propagate deletion of all members - Changing the members of a project, or of all modules within a project at once, was introduced a while ago. However, the propagation of an empty member list did not take effect immediately. This has been fixed now.

Our next version v1.7 will focus on security and extend the login capabilities. We will introduce alternative ways to authenticate and simplify corporate SSO. The given role model has been reviewed and will be tuned towards more flexibility.

If you want an overview or some insights into our roadmap, feel free to contact our sales team! They will be happy to present the upcoming steps to you.

If you feel there should be features you do not see on the horizon, please let us know! Our business development team or your engagement manager will be happy to hear about your ambitions.


Understanding the most important vulnerability acronyms

Since the Equifax incident, vulnerability management has gained a lot of attention. But what does "vulnerability" or "known vulnerability" mean? How should such information be handled? And why is this particularly important for open source components?

To answer these questions, we will publish a short series of articles. First we dive into the goals of vulnerability management and its basic concepts. Then we turn our attention to the process of vulnerability management, and finally we show how TrustSource may support you in performing these tasks.

The goal of vulnerability management

The goal of vulnerability management should be to MINIMIZE THE POSSIBLE ATTACK SURFACE of your environment, where "the environment" may be any scope of software you define (a SaaS, a software package or a complete enterprise). Minimizing the attack surface on the one hand means to accept that there will remain a risk. On the other hand, it means to know all possible attack vectors, assess the risk (maximal loss) associated and derive measures well suited to reduce the attack surface to a financially acceptable risk. This said, it is absolutely essential to assess your environment applying the most recent knowledge - known vulnerabilities - and to address all critical aspects.

The acronyms and concepts

This goal sounds ambitious. But don't panic! There are some concepts out there to help you do the job. Before we step into the process, we advise you to familiarize yourself with the following few abbreviations, terms and concepts:

  • CVSS = Common Vulnerability Scoring System

has been introduced to measure the impact of a vulnerability. It has a scale of 0-10, with 10 the highest, most critical. Everything above 7.5 may be considered critical. CVSS is currently in v3; however, vulnerabilities reported prior to 2016 will have been reported in CVSS v2.
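Using the 7.5 threshold mentioned above, a build pipeline could flag critical findings with a few lines of code. The threshold and the component names are this article's rule of thumb and illustrative data, not an official CVSS severity mapping:

```python
CRITICAL_THRESHOLD = 7.5  # rule of thumb from the text, not the official CVSS bands

def is_critical(cvss_score: float) -> bool:
    """Treat everything above 7.5 on the 0-10 CVSS scale as critical."""
    if not 0.0 <= cvss_score <= 10.0:
        raise ValueError("CVSS scores range from 0.0 to 10.0")
    return cvss_score > CRITICAL_THRESHOLD

# Example: collect the components whose score demands immediate action
scores = {"libfoo": 9.8, "libbar": 5.3}  # hypothetical components
critical = [name for name, score in scores.items() if is_critical(score)]
```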

  • CAV = Common Attack Vector

describes the identified attack, covering aspects such as the prerequisites to execute it and the impact and effect it will have. In v3 this comprises attack vector (AV), attack complexity (AC), privileges required (PR) and user interaction (UI). In version 2 you will see attack vector (AV), access complexity (AC) and authentication (Au), as well as the impacts on confidentiality (C), integrity (I) and availability (A).

The standardized description of the attack vector is a great help when it comes to understanding the impact a potential threat or vulnerability may have.

  • CVE = Common Vulnerability and Exposure

The CVE is a key identifying a particular vulnerability. The key consists of the three letters CVE, the year, and a counter assigned by an assigning authority. The counter has no meaning other than to distinguish the particular vulnerability and exposure entries; it is assigned the moment it is requested, and no evidence is required to request a number. However, several weeks or months may pass between the assignment of an ID and its confirmation.
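The structure of the key is simple enough to validate mechanically. The sketch below uses the CVE of the Apache Struts flaw behind the Equifax case mentioned earlier as an example:

```python
import re

# Three letters "CVE", a four-digit year, and a counter of four or more digits
CVE_PATTERN = re.compile(r"^CVE-(\d{4})-(\d{4,})$")

def parse_cve(cve_id: str) -> tuple:
    """Split a CVE identifier into its year and counter."""
    match = CVE_PATTERN.match(cve_id)
    if match is None:
        raise ValueError(f"not a valid CVE id: {cve_id!r}")
    year, counter = match.groups()
    return int(year), int(counter)

# CVE-2017-5638: the Struts 2 vulnerability exploited in the Equifax breach
year, counter = parse_cve("CVE-2017-5638")
```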

  • CPE = Common Platform Enumeration

To provide a sound capability to match a vulnerability with the components concerned, the CPE has been introduced. Each component that has a vulnerability assigned receives a CPE - currently following specification v2.3. A CPE is a unique identifier of a product that allows referring back from the vulnerability to the product. The CPE (v2.3) consists of a type (h = hardware, o = operating system, a = application), vendor, product and version information. A central directory, the CPE dictionary, contains all CPEs ever assigned.
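
A CPE 2.3 formatted string is colon-separated, so its named fields can be pulled apart quite simply. A naive sketch (a full parser must also handle escaped colons, which this one ignores):

```python
def parse_cpe23(cpe: str) -> dict:
    """Naively split a CPE 2.3 formatted string into its named fields.

    Assumes no escaped colons ("\\:") appear in the string.
    """
    fields = ("part", "vendor", "product", "version", "update", "edition",
              "language", "sw_edition", "target_sw", "target_hw", "other")
    prefix, _, rest = cpe.partition("cpe:2.3:")
    if prefix or not rest:
        raise ValueError("not a CPE 2.3 formatted string")
    return dict(zip(fields, rest.split(":")))

c = parse_cpe23("cpe:2.3:a:apache:log4j:2.14.1:*:*:*:*:*:*:*")
print(c["vendor"], c["product"], c["version"])  # apache log4j 2.14.1
```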

However, the matching of CVEs and their assigned CPEs to real-life components is critical. Wrong matches lead to false positives, setting the cat among the pigeons; missing matches leave vulnerabilities untreated. That is why we spend so much attention on accuracy here.
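
To illustrate why matching is error-prone, consider a naive matcher that ignores the vendor field (the component names below are hypothetical):

```python
def naive_match(component: dict, cpe_fields: dict) -> bool:
    """Naive CPE match on product and version only (illustrative).

    Ignoring the vendor makes this matcher prone to false positives:
    two different vendors may ship products with the same name.
    """
    return (component["product"] == cpe_fields["product"]
            and cpe_fields["version"] in ("*", component["version"]))

cpe = {"vendor": "apache", "product": "struts", "version": "*"}
ours = {"vendor": "example", "product": "struts", "version": "1.0"}
print(naive_match(ours, cpe))  # True - a likely false positive
```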

  • CWE = Common Weakness Enumeration

This is a list of weaknesses found in applications. It is a community approach led by MITRE and the SANS Institute, supported by a huge number of technology heavyweights, listing all kinds of weaknesses and outlining their inner workings, exploit code and more details. CVEs may link to the corresponding weaknesses. The list is a great resource for security experts as well as wanna-be hackers. It helps to understand how attacks are created as well as what causes attacks to be successful.

This information supports the understanding of the impact a vulnerability really may have on an individual application.

So what?

Having read all this, you may want to turn your back on the topic and say, "Well, sounds good. Seems all settled. Why bother?". Yes, a lot of structural work has been done. But these structures have only been created to allow you to do the job efficiently. The job still needs to be done. In our next post on vulnerabilities, we will cover the process of how to really assess a vulnerability and derive useful action.


How TrustSource supports OpenChain compliance

OpenChain in a nutshell

OpenChain is a Linux Foundation project with the goal to improve trust in open source software. To achieve this, OpenChain has identified a set of requirements each organization should meet to become a valuable and trustworthy open source user and producer.

„You may trust Open Source components delivered by an OpenChain certified organization!“

The OpenChain specification defines six goals. The first four focus on the organization of open source usage inside the company, while the fifth goal addresses contributions to open source projects.

Finally, the sixth goal requests taking responsibility and declaring conformity with the OpenChain rules by getting the organization certified. The following will outline the goals and show how TrustSource supports you in achieving them.

G1: Know your responsibilities

As a first step, it is important to know and understand the tasks and obligations within your organization. This typically results in an Open Source Policy, a manual on how to handle open source software correctly. Such a policy describes roles and responsibilities as well as the processes and procedures for dealing with the different use cases.

TrustSource provides pre-defined standard policies and procedures for this purpose. You may use and adapt them for your own needs.

OpenChain requires you not only to provide a policy but also to ensure that your staff is aware of it, so that it can be assumed the policy will take effect. To meet that requirement, a documented procedure is required to prove that at minimum 85% of staff have knowledge of it.

Starting from v1.7, TrustSource comes with a learning management system including a set of online trainings and video materials that help foster the spread of OpenChain behavior and knowledge, including learning success metrics.

In addition, it is required that procedures are defined to identify and catalogue the applied open source components, including the determination of the corresponding rights and obligations.

TrustSource provides a wizard to document the project context, which is essential to derive the obligations. Changes in the context, e.g. a changed commercial model, will be documented, and new obligations may then be determined.

G2: Assign Responsibilities

Unfortunately, the current state of documentation in the open source space leaves room for improvement. Thus, it is likely that questions from downstream users will occur concerning options of usage or contained components. To answer these questions in a professional manner, the specification requires the assignment of a „FOSS liaison“ that has to be publicly announced.
The specification requires you to standardize and organize the communication following a clearly defined set of rules. This is especially relevant in case of a legal dispute.

With TrustSource you may delegate this task. You create a mail alias, and all incoming requests will be routed to our TrustSource help desk. They will be structured, documented and worked on based on a procedure defined together with you.

To meet the requirements of this process, it is essential that sufficient legal expertise is available. The expertise may be internal or external.

G3: Review and approve FOSS

The third goal focuses on the documentation of the generated software and the artifacts used. A process for creating the bill of materials (BoM) should be qualified and documented, where „qualified“ means the process should be suitable to really identify all used components.
This should not happen only once; it shall happen on a continuous basis. Especially in continuous deployment environments, extraordinary requirements arise concerning the currency of BoMs and the need to archive older versions. A purely manual approach is hardly imaginable anymore.

TrustSource supports this optimally. Due to the integration with build tools, you have the most recent information available after each build. Each single version may be saved and archived. The „freeze release“ mechanism allows exporting specific versions via API or SPDX, thus allowing you to provide the most recent BoM with one click.

The specification does not define any requirements concerning the contents of a BoM. But within the Linux Foundation, the SPDX working group - Software Package Data Exchange - has created a specification and a machine- as well as human-readable format for license documentation. It does not solve the problem in total but provides a sound basis for technical documentation.
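
For illustration, the human-readable SPDX tag-value form can be assembled from little more than a package list. A simplified sketch (a complete SPDX document requires further mandatory fields, e.g. creation info and relationships):

```python
def spdx_tag_value(doc_name: str, packages: list) -> str:
    """Render a minimal SPDX 2.2 tag-value fragment (simplified sketch)."""
    lines = [
        "SPDXVersion: SPDX-2.2",
        "DataLicense: CC0-1.0",
        "SPDXID: SPDXRef-DOCUMENT",
        f"DocumentName: {doc_name}",
    ]
    for i, (name, version, license_id) in enumerate(packages):
        lines += [
            "",
            f"PackageName: {name}",
            f"SPDXID: SPDXRef-Package-{i}",
            f"PackageVersion: {version}",
            f"PackageLicenseConcluded: {license_id}",
        ]
    return "\n".join(lines)

print(spdx_tag_value("demo-service", [("log4j", "2.14.1", "Apache-2.0")]))
```

The package names above are illustrative only; in practice the list would come from the build tool integration.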
However, the provisioning of a BoM alone does not make for fully compliant treatment of FOSS. Depending on the use case, the clauses of the meanwhile more than 396 known licenses may trigger different obligations. Manner of use, commercialization type, form of distribution and other parameters will impact the identification of the resulting obligations.
This is especially important because some of the licenses terminate the right of use if the obligations are not met. In other words: you will not have the right to use the component. Use of a component without that right is a criminal offense.
Therefore OpenChain requires a legal examination that discovers all obligations concerning the respective use case, to ensure a legally compliant application for that particular scenario.

Here TrustSource presents one of its dominant unique capabilities: due to its knowledge of the conditions, rights and obligations of several hundred licenses and its structured capture of the legal context, the TrustSource legal engine can resolve the conditions of the application in a completely automated and case-specific way. As a result, a list of all required obligations is provided to the project staff.

G4: Deliver FOSS artifacts

The fourth goal aims at the delivery of the created compliance artifacts together with the software in a single dispatch.

TrustSource supports this by providing an export of an SPDX document per module or project. In a future version, a complete notice file will be generated and delivered. All this can be done within the application or through the API.

Another legal requirement is audit acceptability. This means older versions must be available until it is ensured that no old version is in use anymore. This is required to retain the ability to answer questions concerning older versions based on solid documentation.

TrustSource stores the complete history of information and thus allows invoking data of older versions at any time.

On this basis it is always possible to identify which components have been used in which module and under which circumstances (legal context). On the one hand, documentation makes you vulnerable, because it also documents what has not been done. On the other hand, documentation also creates security, because it can confirm that all possible actions have been taken to achieve legal compliance. Such an approach is well suited to discovering potentially missing aspects or actions.

G5: Understand Community

Open source should not be a one-way street. You should not only take from the community, but understand that you are invited to support it and give back. This typically happens through participation in open source projects, so-called „contributions“.
Depending on the engagement, a contribution may range from occasional code fixes up to the leadership of projects. In some cases individuals contribute on an occasional basis only; in other cases complete teams are dedicated to the development of specific modules or complete projects. The influence on a project grows with the input provided. Some companies even create their own open source initiatives.
Whether occasional or targeted, whatever form is selected, the commitment itself needs to be clearly regulated. This comprises questions concerning work time versus spare time, claims or rights concerning the created software, as well as potential claims against the company arising from providing the contributions.
The fifth goal addresses this form of giving back, the contributions to the open source community. It requires that a policy exists that clarifies the handling of such contributions. As with the policy on open source use, this policy must not only exist but also be distributed and well known.

To present the policy on open source contributions, the same TrustSource mechanisms as for the support of G1 may be used. TrustSource also provides a sample policy that can be used as a baseline.

It is possible that the policy you choose prohibits contribution to open source projects. In this case only the prohibition needs to be propagated. Based on our experience, however, a strict ban is not a suitable answer. If you are using open source seriously, you must accept that you will come across errors. Not being able to repair them because a policy forbids contribution would be negligent.
Such contributions also require a governance procedure. Ideally it does not differ much from the governance process for your own products. This will ease understanding as well as handling for all participants.

TrustSource provides a uniform platform for both use cases providing a broad set of tools for the process support.

G6: Certify adherence with OC requirements

Goal number six requests the willingness of the organization to certify its adherence to OpenChain goals one to five, respectively the corresponding requirements. This is currently possible using a self-audit. The website of the OpenChain project offers a questionnaire, and OpenChain partner organizations support you in implementing the required organizational changes.

TrustSource provides a most suitable platform to ensure process conformity. This is well known and understood by all involved parties and therefore will support all further clarifications (e.g. upcoming certifications, when OpenChain advances from a specification to an official standard).

Summary

In conclusion, it can be said that OpenChain conformity will most likely require your organization to change processes and mindsets. This is always a time-consuming effort, requiring dedication and focus. But an enterprise-wide application of TrustSource will remove a huge part of the complexity and effort the transformation towards OpenChain conformity will cost.
We suggest precisely defining the starting scope. It might be a good idea to start locally instead of globally, focusing on one entity first, creating a success story and then transferring these experiences. Good stories sell well, and adoption will spread more easily.

This approach is also supported by TrustSource. The multi-entity feature allows differentiating several entities or business areas within one account. It is possible to manage them together but operate them on an isolated basis. They may have specific policies or black- and whitelists, but operate one LDAP, for example.

Companies or business units that already use a scanning tool retain the freedom to operate it on their own. The option to transfer scanning results to the TrustSource API, or to import SPDX documents of single modules, allows using TrustSource as the linking platform to ensure sound and conformant legal and security analysis as well as a harmonized delivery platform, thus ensuring process conformity and harmonized governance.