
Seventeen years later, code quality is more relevant than ever


Olivier Gaudin

Founder


Nearly two decades ago, as a tech lead at a small company in France, I was frustrated that there wasn’t a tool to let us software engineers ensure the consistency and quality of code as we created it.

So I teamed up with two friends equally passionate about software engineering, and we launched an open source project to create a code quality review tool: something that could help make coding more consistent and predictable. We didn't initially set out to start a business, but it soon became obvious that we should. So, on November 13, 2008 (17 years ago today), we incorporated the company that is now Sonar.

What started as a free and open source tool to solve our own problem has grown to become SonarQube, a product now used by more than 7 million developers around the world to review and improve the quality and security of over 750 billion lines of code every day.

In many ways, though, I’m still that frustrated software engineer. Our industry has evolved, awareness is much higher, and hundreds of thousands of organizations are using Sonar. But code review has not become the mainstream and systematic practice it should be. 

And in a world where coding has emerged as a "killer app" for AI, which now writes billions of lines of code a day, organizations are increasingly discovering that coding speed is no longer the bottleneck. Instead, verifying the code generated by LLMs has become the new challenge, as AI-generated code often contains all sorts of issues.

If there’s one thing that my time at Sonar has confirmed to me, it’s that low-quality code unavoidably slows teams down, increases business risk, and eventually leads to rewriting applications. 

So to celebrate our 17th anniversary, I decided to look back at some of the biggest achievements we’ve made over the years in pursuit of improved code quality.

Sonar’s early days, providing developers access to the data

In the beginning, the three of us operated out of a small office in Geneva. We made one pitch to an investor and decided right after that we should focus on doing what we knew best: developing a product. We bootstrapped the company, even though none of us had ever been an entrepreneur before. We paid ourselves when we could, and waited to hire employees until we began seeing regular revenue.

We were working from our core strengths. First, we deeply understood software engineering, that is, the state of the art of building software and its evolution. Second, we were building a product for developers, and we were using it every day for our own needs (in other words, dogfooding it). Third, we chose to open source our product, which quickly gave us access to a vibrant Sonar community and enabled us to operate on a "release early, release often" approach, fueled by community feedback. Last but not least, we had a vision that after software configuration management (SCM), continuous integration (CI), and issue tracking, the next area of the DevOps transformation would be testing, and we wanted to be ready for it.

We started from the belief that developers care about code quality, and that if the code they deliver misses the mark, it is because they did not have access to its quality information while writing it. So our goal was to provide that information, initially by integrating existing OSS tools and building two features: a single configuration to drive multiple tools, and a database to store historical information.

We quickly realized that the quality and depth of the data was key, and we could not rely on external tools. We started to write an analyzer on top of existing parsers, and eventually owned the entire stack, building symbolic execution, semantic analysis, and dataflow analysis to become what you know today. 

From very early on, it was clear that we were onto something. We received feedback from developers who had been trying to build something similar; when they saw Sonar, they would drop those efforts and adopt it instead. We quickly shipped the analysis as a Codehaus Maven plugin (a "mojo"), enabling our users to run the magical mvn sonar:sonar command on their projects.
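For readers who have never seen it, here is a minimal sketch of what invoking that analysis from Maven looks like today; the project key, server URL, and token below are placeholders for your own setup, not values from this story.

```shell
# Build the project, then run SonarQube analysis via the Maven plugin.
# The server URL and token are assumptions: substitute your own instance.
mvn clean verify sonar:sonar \
  -Dsonar.projectKey=my-project \
  -Dsonar.host.url=https://sonarqube.example.com \
  -Dsonar.token=$SONAR_TOKEN
```

Running the analysis after `verify` lets the scanner pick up compiled classes and test reports, which improves the accuracy of the results.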

Then a week after we released our first paid feature, we sold it to a Fortune 500 company (and we had no idea they were already using Sonar). We did not even have a process to sell, and when we sent them the invoice, we numbered it F0000242 to pretend they were not our first customer.

A year later, we were able to pay ourselves decently and also hire our first two employees. A few years after that, success brought a different problem: hiring enough people to maintain and develop the product. 🙂

From startup to scaleup, crossing the chasm

Our industry continued its transformation, and we began to face some new challenges.

For example, as CI became standard in many organizations, delivery was becoming (generally) more linear, and the definition of "done" clearer. This meant that verifying code quality only once a day, as part of a build, was no longer good enough.

Indeed, developers (including our own) started to complain that receiving a notification about a failing quality gate after they had already moved on to a different topic meant painful context switching. We realized we needed to shift some of this code quality review left, further upstream, by moving into the integrated development environment (IDE). Thus SonarQube IDE (called SonarLint at the time) was born.

When I first tested an early version of our IDE-integrated tool, though, it was taking about 30 seconds to provide feedback on just-written code in Eclipse. I asked the team who was using SonarLint, and the answer was: only Julien. (Julien was the only SonarLint developer at the time…)

We made a rule right then and there: everything had to happen in less than 300 milliseconds in the IDE. To achieve this, we needed to make massive changes to the architecture of SonarQube, the biggest being that analyzers should no longer access the database directly. This was done in SonarQube 5.1, which took us seven months to release (drama…), versus the usual every-other-month release schedule. That delay was the price of creating what is now the most widely adopted IDE extension for code quality and security.

In the meantime, we also started to see a shift in the industry whereby customers no longer wanted to build their own environment from best-of-breed tools. Instead, they sought an industry-standard integrated suite where they only needed to add the missing pieces. We became obsessed with integration, striving for a seamless experience in products like GitHub, GitLab, Bitbucket, TFS & VSTS (!), Jenkins…

Because we could not fight on all fronts, we also decided not to be a platform for managing quality in general, but instead to be the best product for managing code quality. We went through an interesting period during which we removed all sorts of APIs and generic features from the platform product. We did this together with the community. It was not easy, but it remains a high point for me, because it eventually generated a lot of value.

Leading with value

As our product became more and more popular, we next faced a challenge with internal growth and organizational structure. I would claim that at some stage we were the most sub-organized $100M+ revenue company in the world.

I believe our fast growth came from our first-principles approach to everything. We wanted everyone to understand our approach to the product, and therefore the product should sell itself. (Our demo system's password was actually sells1tself, not to upset anyone, but because we genuinely believed it.) Because we began as a bootstrapped company, we had no choice but to deliver value that people wanted to pay for if we were going to succeed.

This also meant that we had to honestly challenge ourselves when the product was missing the mark and adjust quickly, which was helped by the fact that we were always dogfooding our own product and seeing its successes and failures firsthand.

In retrospect, I believe that leading with product value gave us a critical sales and marketing advantage that most companies would envy.

Sonar today, meeting the AI moment 

Something we realized a while ago is that if "software rules the world," then a codebase is a company's most valuable asset, and companies should take good care of it. With AI entering the picture, this remains true. But the entropy of codebases is accelerating, and that makes code quality all the more critical.

While LLMs continue to get more powerful each day, we are still nowhere close to the day when AI writes "good enough" code. In fact, an independent, standardized code verification layer becomes even more important as human developers become more removed from code creation.

I’m proud of all that our team has accomplished over the past 17 years, establishing Sonar as a de facto industry standard for code quality. In many ways, our roots from when the open source project was first created are still visible. We are still developer-first, product-led, and focused on providing value for our customers and users.

Code quality is more relevant than ever, and so is Sonar!
