CNCF Archives - SD Times
https://sdtimes.com/tag/cncf/

Using certifications to level up your development career
https://sdtimes.com/softwaredev/using-certifications-to-level-up-your-development-career/ | Wed, 06 Nov 2024

Building a career as a software developer can be rewarding, but the field can be competitive to break into, especially in 2024, when over 130,000 layoffs have already occurred at tech companies. While not all 130,000 of those workers were software engineers, developers have not been immune from the cuts.

One way developers can set themselves up for better opportunities is to pursue certifications for skills that are relevant to their careers. A certification offers developers a way to show others that they have a particular skill: it’s one thing to list Kubernetes as a core competency on a resume, and another to say you’ve passed the exam for one of the CNCF’s Kubernetes certifications.

“People are really happy by taking a certification, because it is the validation of some knowledge,” said Christophe Sauthier, head of CNCF certifications and trainings, in a recent episode of our What the Dev? podcast. “It is something that we feel is really important because anybody can say that they know something, but proving that usually makes a real difference.”

A 2023 CompTIA report found that 80% of US HR professionals surveyed relied on technical certifications during the hiring process. Sauthier said the CNCF has conducted a survey looking into the impact of certifications as well, and has also seen that people who obtain them generally benefit. 

“More than half the people who answered the survey said that taking some training or certification helped them get a new job,” said Sauthier. “It is a way for people to be more recognized for what they know, and also to usually get better pay. And when I say a lot of people get better pay, it was about one third of the people who answered our survey who said that they had a higher pay because of taking training or certifications.”

Another survey from CompTIA in 2022 showed that IT professionals who obtained a new certification saw an average $13,000 increase in salary.

How to select a certification

In order to see these benefits, it’s important for anyone pursuing a certification to think about which one will best suit their needs, because they come in all shapes and sizes.

Sauthier says he recommends starting with an entry-level certification first, as this can enable someone to get used to what it means to take a certification. 

Then, it might make sense to move on to more advanced certifications. For instance, the CNCF’s Certified Kubernetes Security Specialist (CKS) certification is “quite tough,” he said. However, its difficulty is what appeals to people.

“People are really attracted by it because it really proves something,” he said. “You need to actually solve real problems to be able to pass it. So we give you an environment and we tell you, ‘okay, there is this issue,’ or ‘please implement that,’ and we are then evaluating what you did.”

Sauthier did note that difficulty alone shouldn’t be a deciding factor. “When I’m looking at the various certifications, I am more interested in looking at something which is widely adopted and which is not opinionated,” he said. Having it not be opinionated, or not tied to a specific vendor, will ensure that the skills are more easily transferable. 

“Many vendors from our community are building their bricks on top of the great project we have within the CNCF, but the certifications we are designing are targeting those bricks so you will be able to reuse that knowledge on the various products that have been created by the vendors,” he said.

He went on to explain how this informs the CNCF’s process of certification development. He said that each question is approved by at least two people, which ensures that there is wide agreement. 

“That is something that is really important so that you are sure when you’re taking a certification from us that the knowledge that you will validate is something that you will be able to use with many vendors and many products over our whole community,” he said. “That’s really something important for us. We don’t want you to be vendor locked with the knowledge you have when you take one of our certifications. So that’s really the most important thing for me, and not the difficulty of the certification itself.”

The CNCF recently took its certification program a step further by introducing Kubestronaut, an achievement people can get for completing all five of its Kubernetes certifications. Currently, there are 788 Kubestronauts, who get added benefits like a private Slack channel, coupons for other CNCF certifications, and a discount on CNCF events, like KubeCon. 

Linux Foundation announces several new subgroups during Open Source Summit Day 1
https://sdtimes.com/os/linux-foundation-announces-several-new-subgroups-during-open-source-summit-day-1/ | Mon, 16 Sep 2024

The Linux Foundation had a lot of news to share during the first day of its Open Source Summit in Vienna, Austria. Several new subgroups have been formed within the organization to support popular technologies and practices.

Here are some highlights from Day 1:

Announcing the Developer Relations Foundation 

The organization announced plans to form the Developer Relations Foundation (DRF), which will focus on driving interest in and awareness of the importance of the developer relations (DevRel) role within software development.

Its steering committee will include global DevRel leaders, and according to the Linux Foundation, the new group already has support from DevRel communities, including Aerospike, Ant Group, Hookdeck, MoonGift, SUSE, and TraceLink.

“Establishing the Developer Relations Foundation is a pivotal moment for our community,” said Stacey Kruczek, director of developer relations and community at Aerospike. “This foundation will create a unified, supportive ecosystem where knowledge, resources, and best practices are shared. By endorsing this initiative, Aerospike underscores the importance of empowering developers, enhancing their experiences, and advancing the entire tech industry. We deeply appreciate the support and guidance from the Linux Foundation. Together, we can drive innovation and foster a more inclusive and dynamic developer relations community.”

RELATED: Open Model Initiative now hosted by Linux Foundation

AWS donates OpenSearch to Linux Foundation

OpenSearch is a search engine for enterprise data that enables companies to search, analyze, and visualize their data. It was created in 2021 and has been downloaded more than 700 million times.
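As a rough sketch of what that looks like in practice, the snippet below builds a search request in OpenSearch’s Elasticsearch-compatible query DSL. The host, index, and field names are made up for illustration, and nothing is actually sent over the network.

```python
# Hypothetical example of constructing an OpenSearch search request.
# The {"query": {"match": ...}} form is the standard query DSL shape;
# the cluster address, index, and field below are illustrative only.

import json

def build_search_request(index: str, field: str, text: str):
    # Search requests are POSTed to /<index>/_search on the cluster.
    url = f"https://localhost:9200/{index}/_search"  # assumed local cluster
    body = {
        "query": {"match": {field: text}},  # full-text match on one field
        "size": 10,                         # return at most 10 hits
    }
    return url, json.dumps(body)

url, payload = build_search_request("logs", "message", "timeout error")
print(url)
print(payload)
```

A real client would send this with an HTTP library and authentication configured for the cluster; the point here is only the shape of the request.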

To support it, the Linux Foundation announced that it is forming the OpenSearch Software Foundation to drive the project’s growth and maintenance.

At launch, the new foundation has support from premier Linux Foundation members AWS, SAP, and Uber, and general members Aiven, Aryn, Atlassian, Canonical, DigitalOcean, Eliatra, Graylog, NetApp Instaclustr, and Portal26.

“The Linux Foundation is excited to provide a neutral home for open and collaborative development around open source search and analytics,” said Jim Zemlin, Executive Director of the Linux Foundation. “Search is something we all rely on every day, for both business and consumer purposes, and we look forward to supporting the OpenSearch community and helping them provide powerful search and analytics tools for organizations and individuals around the world.”

Linux Foundation Decentralized Trust forms

Another new group under the Linux Foundation umbrella that was announced today is LF Decentralized Trust, which will build a foundation for open source decentralized technology ecosystems. 

It has support from over 100 Linux Foundation members and will launch with 17 total projects, including the entire Hedera codebase, which will live at the Linux Foundation under the name Hiero. 

“Around the world, decentralized technologies are modernizing critical systems and core infrastructure,” said Daniela Barbosa, general manager of decentralized technologies at the Linux Foundation and executive director of LF Decentralized Trust. “LF Decentralized Trust is the new home for collaboratively developing the ecosystems that will make these trusted systems and applications. We are launching with a powerful mix of ledger, identity, cryptography, interoperability, and implementation projects and a diverse, committed membership base. And we are just getting started. Our mission is to drive collective innovation that delivers transparency, reliability, security, and efficiency on a global scale.”

Linux Foundation, CNCF, and Unified Patents continue the fight against patent trolls

The Linux Foundation and the Cloud Native Computing Foundation (CNCF) also announced that they are expanding their existing work with Unified Patents to protect open source projects from non-practicing entities (NPEs), or “patent trolls,” which buy up patents and then seek to profit from them.

As a part of the partnership, Linux Foundation and CNCF members will get access to benefits like annual NPE risk analysis, patent portfolio analysis, Unified Patents’ PATROLL prior art bounty program, updates on NPE activities, specialized events, and the ability to sponsor PATROLL contests and participate in the royalty-free licenses granted in resulting settlements. 

“The expansion of our partnership with the Linux Foundation and CNCF is a significant step forward in the ongoing battle against patent trolls,” said Shawn Ambwani, chief operating officer at Unified Patents. “By combining our expertise and resources, we are better equipped to protect the open source community from those who seek to exploit the system for profit without contributing to innovation.”

Open source in 2024: Tackling challenges related to security, AI, and long-term sustainability
https://sdtimes.com/os/open-source-in-2024-tackling-challenges-related-to-security-ai-and-long-term-sustainability/ | Fri, 22 Mar 2024

The first piece of open source code was published just over 70 years ago, and now open-source software finds itself in almost every application that exists today. 

A 2024 report from Synopsys found that the average application has over 500 open source components in it, and most recent industry reports show that over 95% of codebases contain open source software. 

Chris Aniszczyk, CTO of the Cloud Native Computing Foundation and VP of developer relations at the Linux Foundation, says that while open source has largely been used in the technology sector, in recent years it has been expanding into nearly every industry, such as agriculture and pharma. The Linux Foundation also recently announced OS-Climate to tackle climate change problems.

Given the pervasiveness of open source software, let’s look at some of the trends we’ve been seeing across the last year and what we can expect from the open source community this year. 

Open source security is now being tackled by governments

In general, open source software has been under more of a microscope lately, due to several major security issues over the past decade involving open source components, such as the Log4Shell vulnerability in Log4J. 

Both the United States and European Union are now acting to improve the security of open source projects. Within the U.S., President Joe Biden signed an executive order on improving cybersecurity, and a part of that is improving open source security. CISA also has several initiatives tackling this issue. 

In the EU, the Cyber Resilience Act places stricter security requirements on software. While it doesn’t target open source software specifically, Mike Milinkovich, executive director of the Eclipse Foundation, says “there’s really no way that you can regulate the software industry without regulating open source as some sort of a first order side effect.”

The executive order has made people start thinking more about things like software bills of materials (SBOMs), vulnerability management, and license management, said Michele Rosen, research director at IDC.

“If you’re installing a package that three dependencies deep is using some sort of GPL software, and you’re now building software on it, that can be a big legal risk for a company,” she said. “So one of the things that they’re finding is that SBOM management systems can help with not only managing the vulnerabilities, but also managing the licenses of the underlying code.”
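A toy illustration of the transitive-license problem Rosen describes: the dependency data and license classification below are entirely hypothetical, and a real SBOM tool would work from actual package metadata rather than a hard-coded table.

```python
# Hypothetical sketch: walking a dependency tree to surface copyleft
# licenses buried several levels deep. Package names, licenses, and
# the dependency graph are invented for illustration.

COPYLEFT = {"GPL-2.0", "GPL-3.0", "AGPL-3.0"}

# Each entry: name -> (license, [direct dependencies])
PACKAGES = {
    "webapp":    ("MIT",        ["http-lib", "orm-lib"]),
    "http-lib":  ("Apache-2.0", ["codec-lib"]),
    "orm-lib":   ("MIT",        []),
    "codec-lib": ("GPL-3.0",    []),  # three levels deep from webapp
}

def license_risks(root, depth=0, seen=None):
    """Yield (package, license, depth) for every copyleft dependency."""
    seen = seen if seen is not None else set()
    if root in seen:
        return
    seen.add(root)
    lic, deps = PACKAGES[root]
    if lic in COPYLEFT:
        yield root, lic, depth
    for dep in deps:
        yield from license_risks(dep, depth + 1, seen)

for pkg, lic, depth in license_risks("webapp"):
    print(f"{pkg}: {lic} (depth {depth})")
```

The depth value is what makes the legal risk easy to miss: a developer who audits only direct dependencies never sees `codec-lib` at all.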

According to Aniszczyk, this regulation and push for transparency makes sense, because when we go to the grocery store, for example, we want to know exactly what is in the food we’re buying. Until now, there hasn’t really been an incentive to do that with software.

“We just have so much choice in open source land and developers just use what they find on GitHub or GitLab, or all over the internet,” said Aniszczyk. “And there’s just not this maturity that you would find in industries like manufacturing or so on where there’s like a little bit more scrutiny on the supply chain.”

Milinkovich is hopeful that a side effect of this regulation is that it entices larger corporations to contribute back to open source more.

“There is absolutely no incentive in any part of that relationship for the companies in particular that are using open source to contribute anything back,” said Milinkovich. “There’s no reason to; it’s like ‘thanks for the free stuff.’ And then we’re going to put it into our applications in our internal systems. And that’s great. But regulation changes that equation somewhat. So with regulation, now, they might have a requirement to be able to produce SBOMs, they might have a requirement to demonstrate that the software components that they’re using in their products that they’re selling to the US government have to follow the NIST SSDF capabilities.”

Open source may win the AI race

A leaked memo from a Google staffer last May titled “We Have No Moat And Neither Does OpenAI” explored the idea that as Google was busy trying to compete with OpenAI, they realized the possibility that neither company would win the AI race: open source could.

“The moats memo was basically saying open source guys are getting similar results, or in some ways, even better results. And they’re advancing at a pace that’s faster, even with much smaller datasets,” said Milinkovich.

The memo states: “Plainly put, they are lapping us. Things we consider ‘major open problems’ are solved and in people’s hands today … Open-source models are faster, more customizable, more private, and pound-for-pound more capable. They are doing things with $100 and 13B params that we struggle with at $10M and 540B. And they are doing so in weeks, not months.”

Some of the large companies are even starting to open source their models, and open source makers are also striking deals with the larger companies, said Rosen.

For instance, Meta has partially open sourced Llama, and Mistral, the French startup producing open source models, recently made a deal with Microsoft.

“So I think it’s pretty clear that open models are going to play a part in this whole AI space one way or the other … there was a question I would say last year where some people were implying that network effects being what they are, we were all going to sort of converge on a single model and I don’t see that happening at all, I think there’s going to be a proliferation,” she said.

Another thing to keep an eye on when it comes to AI is how contributions made using AI will be handled, given that the contributor might not actually be the author of the code, said Milinkovich.

He believes that it will become more popular to use tools that check for plagiarism. “There’s some options in Copilot, where it will check to see if the code that it has produced is almost identical to code that went into its training data,” he said. “If there’s something that would be interpreted by a human as looking like plagiarism, you need to try to use those tools to avoid that.”

Rosen says “the problem is that particularly with an open source model, it’s very hard to know how to apply those licenses to let’s say the training data set or the architecture or even the system prompt or something like that.”

The impact of tech layoffs on open source

According to Rosen, about half of open source contributors are paid in some way to contribute to open source. That’s why when Google decided to lay off its open source division last year, it made some waves.

Google wasn’t the only one; according to Crunchbase’s layoff tracker, 191,000 tech workers lost their jobs in 2023, and as of March 8, another 31,000 had already been laid off this year.

However, despite the layoffs, data from the Open Source Contributor Index reveals the number of active contributors from top tech companies (including Google) went up every single month in 2023. 

“It’s true that obviously some of the open source, commercial software leaders were subject to layoffs,” said Rosen. “And even though we know that there must have been some developers laid off who were contributing to open source projects, it’s important to put those layoffs in context. The losses represented a relative minority of the hiring that had taken place for the two or three previous years, so the overall impact, it’s not something that I’ve seen or that I have a sense that there has been a drain.”

How to sustain open-source projects long-term

Long-term sustainability of open source projects is another thing that has gotten more attention over the past few years, with several popular projects changing their license or business model in the last year. For instance, HashiCorp switched Terraform from MPL v2 to the Business Source License last year, and earlier this year, Buoyant announced that stable Linkerd releases would only go out to enterprise users. Red Hat had also previously announced that RHEL source code would only be publicly available through CentOS Stream, which upset many in the open source community.

These aren’t isolated incidents, however; a number of other open source projects have changed their licenses over the years, including Akka, CockroachDB, Elasticsearch, MongoDB, Redis, and more.

Aniszczyk believes that because of the backlash companies faced, this isn’t going to be a common occurrence for open-source projects. “I think that’s going to happen less because of how much pain it caused them, like they lost a lot of community trust,” he said, speaking of HashiCorp. 

Rosen says that she believes companies are starting to think more about the long-term strategy of a project than they used to.

“[They’re] maybe being a little bit more active in diversifying the management and really trying to think about a longer term strategy,” she said. “Whereas I think a lot of open source projects are launched sort of in the innovation mindset, and maybe don’t think about longer term governance. If this project becomes successful, how are we going to maintain it, what’s going to happen?”

A paper published in January by the Harvard Business School revealed that 96% of the value of open source is generated by 5% of developers. 

“We have a relatively small population of people that, frankly, society is depending upon,” said Milinkovich. “And, you know, how do we make sure that those people don’t burn out? … How do we make sure those developers are sustained, but also how are they replaced as they retire and the next generation has to come back in behind them and pick up the mantle of some of these core pieces of infrastructure.” 

The value of open source

It’s an important problem to solve, because that same Harvard Business School paper valued the demand side of open source software at $8.8 trillion and supply side at $4.15 billion.

“We find that firms would need to spend 3.5 times more on software than they currently do if OSS did not exist,” the researchers stated in the report. 

Milinkovich believes Harvard’s numbers are an underestimate of the value because they only measured websites and not operating systems. 

“Some of the headlines I’ve seen make me think they didn’t actually read the paper, because it’s like, you know, ‘open source is worth $8.8 trillion?’ No, they only measured a fraction of the open source ecosystem, right? They only measured websites, and they specifically excluded operating systems. So basically, the economic value of all of the web infrastructure around the planet that we use every day, and open source’s contributions to that is about $8.8 trillion, but that excludes other uses. It excludes operating systems. So it’s obviously in fact, much, much higher than that.”

OpenFeature feature flagging API becomes a CNCF incubating project
https://sdtimes.com/feature-flags/openfeature-feature-flagging-api-becomes-a-cncf-incubating-project/ | Wed, 27 Dec 2023

The CNCF Technical Oversight Committee (TOC) has recently approved OpenFeature as an incubating project within the CNCF. OpenFeature is an open specification designed to provide a vendor-neutral, community-driven API specifically for feature flagging. 

Feature flagging is a method used in software development where teams can switch features or code paths on or off, or modify their behavior, without changing the source code.

The introduction of OpenFeature as a standard for feature flags aims to combine different tools and vendors under a unified interface. This approach is intended to prevent vendor lock-in at the code level and offers a structure for developing extensions and integrations. These can then be shared throughout the community.
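The shape of that unified interface can be sketched as follows. This is not the OpenFeature SDK itself; all class and method names here are illustrative stand-ins for the idea of evaluating flags through one API while keeping the backing provider swappable.

```python
# Sketch of a vendor-neutral flag evaluation interface, in the spirit
# of (but not identical to) the OpenFeature model: application code
# talks to one client, and providers can be swapped without touching
# call sites. All names are illustrative.

from typing import Protocol

class FlagProvider(Protocol):
    def boolean_value(self, key: str, default: bool) -> bool: ...

class InMemoryProvider:
    """A trivial provider; a vendor SDK would sit behind the same interface."""
    def __init__(self, flags: dict):
        self._flags = flags
    def boolean_value(self, key: str, default: bool) -> bool:
        return self._flags.get(key, default)

class FlagClient:
    def __init__(self, provider: FlagProvider):
        self._provider = provider
    def boolean_value(self, key: str, default: bool = False) -> bool:
        return self._provider.boolean_value(key, default)

client = FlagClient(InMemoryProvider({"new-checkout": True}))
if client.boolean_value("new-checkout"):
    print("serving new checkout flow")
```

Swapping `InMemoryProvider` for a commercial vendor’s provider changes nothing in the application code, which is the lock-in protection the specification is after.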

“Specifications fill a unique place in cloud native. They allow adopters to experience consistent development and integration patterns to achieve uniform functionality across platforms. However they have more challenges in adoption due to the need for a reference implementation,” said Emily Fox, TOC Sponsor for OpenFeature and senior principal software engineer at Red Hat. “OpenFeature taps into its talented contributor pool who manage community-developed SDKs for reference implementations that provide adopters with various options to meet their needs. Their commitment to collaboration for improving and expanding the specification will continue to allow the project to gain momentum as it begins its journey towards Graduation.”

OpenFeature is now focused on driving further standardization. Building on its existing definition of a flag evaluation SDK, the project is exploring two further standards: a wire protocol for remote flag evaluation and a standard flag definition format.


CNCF’s Notary and Notation projects get major update
https://sdtimes.com/security/cncfs-notary-and-notation-projects-get-major-update/ | Mon, 28 Aug 2023

Notary, the CNCF project that provides cross-industry standards for supply chain security, has announced a major release. 

This brings both the Notary Project and Notation Project to version 1.0.0. Notation is a sub-project that implements Notary specifications. 

Included in this release are an OCI signature specification, OCI COSE signature envelope, OCI JWS signature envelope, OCI signing and verification workflow, signing scheme, trust store and trust policy, and a plugin specification for Notation.

The team also revealed what it’s working on next: the ability to sign and verify arbitrary blobs, integration with GitHub Actions, a HashiCorp Vault plugin, plugin lifecycle management, timestamping support, and the ability to manage trust policies using CLI commands.

“As containers and cloud native artifacts become common deployment units, users want to make sure that they are authentic in their environments. The Notary Project is a set of specifications and tools intended to provide cross-industry standards for securing software supply chains through signing and verification, signature portability, and key/certificate management,” the project maintainers wrote in a blog post.
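Conceptually, the signing-and-verification flow looks like the sketch below. Real Notation signatures use X.509 certificates and the COSE/JWS envelope formats listed above; a symmetric HMAC key is used here purely to keep the digest-then-sign-then-verify shape visible in a few lines, so treat this as an analogy rather than the Notary Project’s actual scheme.

```python
# Conceptual sketch of the sign-then-verify flow for container
# artifacts. Real implementations use asymmetric X.509-based
# signatures; HMAC with a shared key stands in here for simplicity.

import hashlib
import hmac

KEY = b"demo-signing-key"  # stand-in for a real signing key

def artifact_digest(blob: bytes) -> str:
    # Artifacts are addressed by content digest, as in OCI registries.
    return "sha256:" + hashlib.sha256(blob).hexdigest()

def sign(digest: str) -> bytes:
    return hmac.new(KEY, digest.encode(), hashlib.sha256).digest()

def verify(digest: str, signature: bytes) -> bool:
    return hmac.compare_digest(sign(digest), signature)

image = b"container image layers..."
digest = artifact_digest(image)
sig = sign(digest)

print(verify(digest, sig))                        # authentic artifact
print(verify(artifact_digest(b"tampered"), sig))  # tampered artifact
```

The key property is the second check: any change to the artifact changes its digest, so an old signature no longer verifies.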

Why the world needs OpenTelemetry
https://sdtimes.com/monitor/why-the-world-needs-opentelemetry/ | Thu, 02 Mar 2023

Observability has really taken off in the past few years, and while in some ways observability has become a bit of a marketing buzzword, one of the main ways companies are implementing observability is not with any particular company’s solution, but with an open-source project: OpenTelemetry.

Since 2019, it has been incubating at the Cloud Native Computing Foundation, but the project has its origins in two different open-source projects: OpenCensus and OpenTracing, which were merged into one to form OpenTelemetry. 

“It has become now the de facto in terms of how companies are willing to instrument their applications and collect data because it gives them flexibility back and there’s nothing proprietary, so it helps them move away from data silos, and also helps connect the data end to end to offer more effective observability,” said Spiros Xanthos, SVP and general manager of observability at Splunk.

OpenTelemetry is one of the most successful open-source projects, depending on what you measure. According to Austin Parker, head of DevRel at Lightstep and a maintainer of OpenTelemetry, it is the second-highest velocity project within the CNCF in terms of contributions and improvements, behind only Kubernetes.

According to Parker, one of the reasons why OpenTelemetry has just exploded in use is that cloud native development and distributed systems have “eaten the world.” This in turn leads to increased complexity. And what do you need when complexity increases? Observability, visibility, a way to understand what is actually going on in your systems. 

RELATED ARTICLE: How to ensure open-source longevity 

Parker feels that for the past few decades, a real struggle companies have run into is that everyone has a different tool for each part of observability. They have a tool for tracing, something for handling logs, something to track metrics, etc. 

“There’s scaling issues, lack of data portability, lack of vendor agnosticism, and a lack of ability to easily correlate these things across different dimensions and across different signal types,” said Parker. “OpenTelemetry is a project whose time has come in terms of providing a single, well-supported, vendor-agnostic solution for making telemetry a built-in part of cloud native systems.” 
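To make the tracing signal concrete, here is a hand-rolled toy tracer. This is not the OpenTelemetry SDK; real code would use the OTel API and export spans to any compatible backend, but the named, nested, timestamped spans below show the kind of data that lets you correlate work across a request.

```python
# Toy illustration of the tracing signal OpenTelemetry standardizes:
# spans with names, timestamps, and parent/child links. A real SDK
# would export these to a backend rather than store them in a list.

import time
from contextlib import contextmanager

SPANS = []   # finished spans; a real SDK would export these
_STACK = []  # tracks the currently active span

@contextmanager
def span(name: str):
    record = {"name": name,
              "parent": _STACK[-1]["name"] if _STACK else None,
              "start": time.monotonic()}
    _STACK.append(record)
    try:
        yield record
    finally:
        record["end"] = time.monotonic()
        _STACK.pop()
        SPANS.append(record)

with span("handle-request"):
    with span("query-db"):
        pass  # database work would happen here

for s in SPANS:
    print(s["name"], "child of", s["parent"])
```

The parent links are what turn isolated timings into a trace: given many such spans from many services, a backend can reassemble the full path of a single request.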

Morgan McLean, director of product management at Splunk and co-founder of OpenTelemetry, has seen first-hand how the project has exploded in use as it becomes more mature. He explained that a year ago, he was having conversations with prospective users who at the time felt like OpenTelemetry didn’t meet all of their needs. Now with a more complete feature set, “it’s become a thing that organizations are now much more comfortable and confident using,” McLean explained.

Today when he meets with someone to tell them about OpenTelemetry, often they will say they’re already using it. 

“OpenTelemetry is maybe the best starting point in that it has universal support from all vendors,” said Xanthos. “It’s a very robust set of, let’s say, standards and open source implementation. So first of all, I know that it will be something that will be around for a while. It is, let’s say, the state of the art on how to instrument applications and collect data. And it’s supported universally. So essentially, I’m betting on something that is a standard accepted across the industry, that is probably going to be around for a while, and gives me control over the data.”

It’s not just the enterprise that has jumped on board with OpenTelemetry; the open-source community as a whole has also embraced it. 

Now there are a number of web frameworks, programming languages, and libraries stating their support for OpenTelemetry. For example, OpenTelemetry is now integrated into .NET, Parker explained. 

Having a healthy open-source ecosystem is crucial to success

There are a lot of vendors in the observability space, and OpenTelemetry "threatens the moat around most of the existing vendors in the space," said Parker. It has taken a lot of work to build a community that brings in people who work for those companies and have them say "hey, here's what we're going to do together to make this a better experience for our end users, regardless of which commercial solution they might pick, or which open-source project they're using," said Parker. 

According to Xanthos, the reason an open-source standard, rather than a vendor's offering, has become the de facto choice is demand from end users. 

"End users essentially are asking vendors to have open-source standards-based data collection, so that they can have more effective observability tools, and they can have control over the data," said Xanthos. "So because of this demand from end users, essentially all vendors either decided or were forced to support OpenTelemetry. So essentially, there is no major vendor in observability that doesn't support it today."

OpenTelemetry’s governance committee seats are tied to people, not companies, which is the case for some other open-source projects as well. 

“We try to be cognizant of the fact that we all work for people that have commercial interests here, but at the end of the day, we’re people and we are not avatars of our corporate overlords,” said Parker. 

For example, Morgan and Parker work for two separate companies that are direct competitors, but in the OpenTelemetry space they come together to do things for the project like forming end-user working groups and running events. 

“It doesn’t matter who signs the paycheck,” Parker said. “We are all in this space for a reason. It’s because we believe that by enabling observability for our end users through OpenTelemetry, we are going to make their professional lives better, we’re going to help them work better, and make that world of work better.”

What’s next?

OpenTelemetry has a lot planned for the future, and recently published an official project roadmap.

The original promise of OpenTelemetry back when it was first announced was to deliver capabilities to allow people to capture distributed traces and metrics from applications and infrastructure, then send that data to a backend analytics system for processing. 

The project has largely achieved that, which presents the opportunity to sit down and ask what comes next. 

For example, logging is something important to a large portion of the community so that is one focus. “We want to be able to capture logs as an adjacent signal type to distributed traces and to metrics,” said Morgan.

Another long-term focus will be capturing profiles from applications so that developers can delve into the performance of their code.

The maintainers are also working on client instrumentation. They want OpenTelemetry to be able to extract data from web, mobile, and desktop applications. 

“OpenTelemetry is very focused on back end infrastructure, back end services, the stuff that people run inside of AWS or Azure or GCP,” Morgan explained. “There’s also a need to monitor the performance and get crash reports from their client applications, like front end websites or mobile applications or desktop applications, so they can judge the true end to end performance of everything that they’ve built, not just the parts that are running in various data centers.”

The promise of unified telemetry

At the end of the day, it’s important to remember the main goal of the project, which is to unify telemetry. Developers and operators are dealing with increasing amounts of data, and OpenTelemetry’s purpose is to unify those streams of data and be able to do something with it. 
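To make the idea of unification concrete, here is a deliberately simplified, library-free sketch (all names here are hypothetical, not OpenTelemetry APIs) of how a shared trace ID lets spans, logs, and metrics from one request be viewed together:

```python
# Toy model: different signal types become correlatable once they share
# a trace ID, instead of living in three disconnected tools.
from dataclasses import dataclass
import uuid

@dataclass
class Signal:
    """One piece of telemetry: a span, a log line, or a metric point."""
    trace_id: str   # shared ID tying all signals from one request together
    kind: str       # "span", "log", or "metric"
    body: dict

def correlate(signals, trace_id):
    """Return every signal emitted for a single request, across all types."""
    return [s for s in signals if s.trace_id == trace_id]

tid = uuid.uuid4().hex
signals = [
    Signal(tid, "span", {"name": "checkout", "duration_ms": 42}),
    Signal(tid, "log",  {"message": "payment accepted"}),
    Signal(uuid.uuid4().hex, "log", {"message": "some other request"}),
]

request_view = correlate(signals, tid)  # the span and log from one request
```

Real systems do this with OpenTelemetry's SDKs and the OTLP wire protocol, but the payoff is the same: one query surfaces everything a single request produced.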

Parker noted the importance of using this data to deliver great user experiences. Customers don’t care whether you’re using Kubernetes or OpenTelemetry, he said. 

“Am I able to buy this PS5? Am I able to really easily put my shopping list into this app and order my groceries for the week?” According to Parker this is what really matters to customers, not what technology is making this happen. 

“OpenTelemetry is a foundational component of tying together application and system performance with end user experiences,” said Parker. “That is going to be the next generation of performance monitoring for everyone. This isn’t focused on just the enterprise; this isn’t a particular vertical. This, to me, is going to be a 30 year project almost, in terms of the horizon, where you can definitely see OpenTelemetry being part of how we think about these questions for many years to come.” 

The post Why the world needs OpenTelemetry appeared first on SD Times.

Report: Majority of open source contributors wish their company paid for their contributions https://sdtimes.com/open-source/report-majority-of-open-source-contributors-wish-their-company-paid-for-their/ Wed, 18 Jan 2023 20:52:45 +0000 https://sdtimes.com/?p=50095 Nearly four-fifths of developers (79%) believe that contributing to open source projects has helped further their careers, but 78% still say that companies should pay them for the time spent on contributions.  This is according to a new survey from the CNCF and TAG Contributor Strategy (TAG CS), which is a group within the CNCF … continue reading

The post Report: Majority of open source contributors wish their company paid for their contributions appeared first on SD Times.

Nearly four-fifths of developers (79%) believe that contributing to open source projects has helped further their careers, but 78% still say that companies should pay them for the time spent on contributions. 

This is according to a new survey from the CNCF and TAG Contributor Strategy (TAG CS), which is a group within the CNCF that helps the organization’s projects build and maintain sustainable contributor strategies. 

The top career benefits from working on open-source projects included learning from collaboration with others and advancing technical skills. 

In addition, 65% of respondents say they work on bug fixes, 62% write documentation, and 55% work on writing new features, which are all transferable skills in software development. 

Fifty-nine percent of respondents to the survey contribute to projects on work time and 25% contribute full time. 

The survey also found that over half (53%) were “regular, frequent, or high-volume” contributors and a third were maintainers. 

The majority of respondents (59%) said they plan to increase their contributions to CNCF projects in the next year. Twenty-eight percent of respondents will keep their contributions at the same amount and less than five percent said they’ll contribute less. 

Security platform Kubescape accepted into CNCF Sandbox https://sdtimes.com/kubernetes/security-platform-kubescape-accepted-into-cncf-sandbox/ Wed, 11 Jan 2023 17:53:59 +0000 https://sdtimes.com/?p=50049 Cybersecurity company Armo has announced that an open-source project it developed is being donated to the Cloud Native Computing Foundation (CNCF) as a Sandbox project. Kubescape is an end-to-end security platform for Kubernetes, and is the first security scanner under the CNCF umbrella, according to Armo. “ARMO’s commitment to open source means ensuring Kubescape is … continue reading

The post Security platform Kubescape accepted into CNCF Sandbox appeared first on SD Times.

Cybersecurity company Armo has announced that an open-source project it developed is being donated to the Cloud Native Computing Foundation (CNCF) as a Sandbox project. Kubescape is an end-to-end security platform for Kubernetes, and is the first security scanner under the CNCF umbrella, according to Armo.

“ARMO’s commitment to open source means ensuring Kubescape is free, open and always improving to become the end-to-end open-source Kubernetes security platform of choice,” said Shauli Rozen, co-founder and CEO of ARMO. “I’m proud that Kubescape’s acceptance by the CNCF cements this commitment. ARMO remains dedicated to making Kubescape the best open source Kubernetes security platform, and ARMO Platform the best enterprise version for Kubescape.  We strive to provide the best and simplest option for organizations to get the benefits of Kubescape with enterprise-level service support and features, to ensure the most complete security experience.”

Armo will continue to lead development of Kubescape and remains committed to making "Kubernetes security a simple and trustworthy DevOps-first experience." 

Key features of Kubescape include risk analysis, security compliance, and misconfiguration scanning. 

It can scan clusters, YAML files, and Helm charts against a number of frameworks, such as NSA-CISA, MITRE ATT&CK, and the CIS Benchmark. 

To verify Kubernetes objects it also uses Open Policy Agent, which is another CNCF project. 

Scan results can be output in a number of formats, including JSON, JUnit XML, HTML, and PDF, or submitted to a cloud service. 

In addition to this news, the company also announced the launch of Armo Platform, which is an enterprise offering of Kubescape, providing full enterprise-grade support, maintenance and additional features.

Linux Foundation, CNCF, and Ethical Intelligence partner on new ethics in open-source course https://sdtimes.com/os/linux-foundation-cncf-and-ethical-intelligence-partner-on-new-ethics-in-open-source-course/ Wed, 18 May 2022 17:56:18 +0000 https://sdtimes.com/?p=47602 The Linux Foundation, Cloud Native Computing Foundation (CNCF), and Ethical Intelligence have all partnered up to create a free online course on ethics in open-source development.  It is designed for developers looking to apply ethics to their coding practice, and for product managers looking to incorporate ethics-by-design technology into their workflows.  According to the Linux … continue reading

The post Linux Foundation, CNCF, and Ethical Intelligence partner on new ethics in open-source course appeared first on SD Times.

The Linux Foundation, Cloud Native Computing Foundation (CNCF), and Ethical Intelligence have all partnered up to create a free online course on ethics in open-source development.

It is designed for developers looking to apply ethics to their coding practice, and for product managers looking to incorporate ethics-by-design technology into their workflows. 

According to the Linux Foundation, developers aren’t always thinking through how a piece of code could be used by a bad actor or how an algorithm might affect different classes of people.  

They explained that this is why it is important to include ethical principles like transparency and accessibility in open source. 

After this course, students should be able to assess technology for ethical blind spots, apply ethical critical thinking techniques, understand the Ethics Journey Cycle in open-source development, and utilize ethics as a decision-making tool for risk mitigation. They will also be prepared for roles like a Responsible Technologist or Ethics Developer Lead. 

The course includes two to three hours of material, including videos and quizzes. It was developed by Olivia Gambelin, CEO of Ethical Intelligence; Rahaf Albalkhi, a member of the IEEE P7003 Algorithmic Bias Considerations working group; Dr. Michael Klenk, a former management consultant; and Rand Hirmiz, a philosophy Ph.D. candidate at York University specializing in the ethics of AI in healthcare. 

Registration for the course is now open.

CNCF accepts Backstage as incubating project https://sdtimes.com/softwaredev/cncf-accepts-backstage-as-incubating-project/ Tue, 15 Mar 2022 18:20:20 +0000 https://sdtimes.com/?p=46925 The Cloud Native Computing Foundation (CNCF) has voted to accept a new platform for building developer portals as an incubating project. Backstage enables developers to bring together their organization’s tooling, services, apps, data, and documentation into a single UI.  The project has its origins at Spotify. In 2016 the company was growing quickly and struggling … continue reading

The post CNCF accepts Backstage as incubating project appeared first on SD Times.

The Cloud Native Computing Foundation (CNCF) has voted to accept a new platform for building developer portals as an incubating project. Backstage enables developers to bring together their organization’s tooling, services, apps, data, and documentation into a single UI. 

The project has its origins at Spotify. In 2016 the company was growing quickly and struggling to onboard engineers fast enough to meet its needs. Creating Backstage enabled Spotify engineers to be more productive and work faster.

After using it internally for a few years, Spotify open-sourced the project in March 2020, and in September of that year it entered the CNCF Sandbox. 

Since joining the CNCF, Backstage has seen growth across its core components and features. Maintainers have focused their efforts on updating, refining, documenting, deprecating, and stabilizing core components ahead of the still-upcoming 1.0 release of the Core Framework, which includes the Software Catalog, Software Templates, TechDocs, and API Reference. 

Backstage is currently in use publicly at over 100 companies, including American Airlines, HelloFresh, Netflix, Wayfair, and more. 

“It’s amazing to see organizations from vastly different industries adopting Backstage as their development platform and working together at making the developer experience better for everyone,” said Johan Haals, senior site reliability engineer at Spotify. “Our community has grown tremendously this past year, and I’m excited for the efforts the community is bringing to increase the project’s velocity.”  
