Security is one of the top topics of the cloud native community, as Chris Aniszczyk, CTO of the Cloud Native Computing Foundation (CNCF), emphasized at KubeCon + CloudNativeCon in Amsterdam, which drew over 10,000 participants. The seventh edition of the Global DevSecOps Report, which GitLab presented at the conference, provides further evidence of this. Under the motto “Security without Sacrifices”, the study highlights the growing need for stronger security, more efficient processes and the targeted use of artificial intelligence in software development.
With a little help from my team
In DevOps organizations, developers are increasingly taking on responsibility for security in the software supply chain. A good 70 percent of the security experts surveyed for the GitLab study stated that developers currently identify around a quarter of all vulnerabilities; in the previous year, only about half of the respondents said so. Another key finding is that the traditional silos in companies are breaking up: teams take ownership and divide the necessary tasks among themselves. In line with a more holistic security approach, GitLab wants to support this trend and expand its platform so that teams receive targeted help in making the right decisions and sharing responsibility appropriately.
The company can draw on data and experience gathered over many years to determine which remediation measures are best suited to which part of the supply chain, as Michael Friedrich, Senior Developer Evangelist at GitLab, told heise Developer in an interview. Among other things, the SLSA framework (Supply Chain Levels for Software Artifacts), which the OpenSSF has just released in version 1.0, can help defend against attacks on the build system – as the 2020 SolarWinds case showed.
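SLSA v1.0 builds on the in-toto attestation format: a build system attaches a signed provenance statement to each artifact, so that consumers can verify who built it and how. As a rough illustration (not GitLab's implementation), the following Python sketch assembles such a provenance statement; the field names follow the published SLSA v1.0 predicate, while the builder ID, build type and repository URLs are placeholders:

```python
import hashlib
import json

def make_provenance(artifact_name: str, artifact_bytes: bytes) -> dict:
    """Assemble a minimal SLSA v1.0 provenance statement (in-toto format).

    Builder ID, buildType and repository are illustrative placeholders;
    real build systems publish their own URIs and sign the statement.
    """
    # The subject binds the provenance to a concrete artifact via its digest.
    digest = hashlib.sha256(artifact_bytes).hexdigest()
    return {
        "_type": "https://in-toto.io/Statement/v1",
        "subject": [{"name": artifact_name, "digest": {"sha256": digest}}],
        "predicateType": "https://slsa.dev/provenance/v1",
        "predicate": {
            "buildDefinition": {
                # Placeholder buildType; identifies the build process template
                "buildType": "https://example.com/build-types/container@v1",
                "externalParameters": {"repository": "https://example.com/org/app"},
            },
            "runDetails": {
                # The builder identity lets verifiers check who produced the artifact
                "builder": {"id": "https://example.com/builders/ci@v1"},
            },
        },
    }

statement = make_provenance("app-v1.0.tar.gz", b"example artifact contents")
print(json.dumps(statement, indent=2))
```

Verifying such statements along the pipeline is what makes tampering with the build system – the SolarWinds attack vector – detectable.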
With a little help from AI
While 66 percent of survey participants would like to reduce the number of tools they use in order to boost productivity, developers can also expect help from machine learning and AI-based automation. According to the report, almost two-thirds now use AI tools for code testing, compared to around half just a year ago. GitLab wants to make greater use of AI wherever the technology directly increases developers' efficiency. This also applies, for example, to onboarding junior staff, who can be brought up to speed in their new roles more quickly.
According to this year’s Global DevSecOps Report, AI tools are used for code testing considerably more widely than a year ago.
(Image: GitLab)
AI promises valuable help in making the right decisions, especially when it comes to detecting vulnerabilities and dealing with them. To ensure that AI is used responsibly, GitLab attaches great importance to training its ML models only on internal data or on data that has been explicitly released for that purpose, Friedrich says.
Study details
On behalf of GitLab, the market research company Savanta surveyed more than 5,000 software professionals from various industries in March 2023 – including IT managers, CISOs (Chief Information Security Officers) and software developers. The survey was conducted via social media channels and an email distribution list provided by GitLab. To avoid bias in the sample, Savanta additionally carried out panel sampling. GitLab provides further details on the methodology as well as the complete report free of charge.
(map)