Welcome to the final installment of our .NET developer's guide to SonarQube.
Throughout this series, we have journeyed from the individual developer's IDE (Part 1) to a local Docker setup (Part 2), implemented automated CI/CD pipelines (Part 3), and learned to interpret the software qualities dashboard (Part 4).
If you have followed along, you likely have a functioning pipeline enforcing a "focus on new code" strategy. However, you may have noticed a few gaps. Perhaps your "coverage" tile is sitting at 0.0%, or you are dealing with the rapid influx of code generated by AI assistants like GitHub Copilot.
In this concluding article, we will tackle the advanced topics that separate a basic setup from a professional, enterprise-grade implementation: integrating code coverage, detecting complex security vulnerabilities via taint analysis, and maintaining rigorous standards in the age of AI.
1. Closing the loop: Integrating code coverage
SonarQube Cloud serves as a visualization hub for test coverage, but it does not execute the tests itself. It relies on your build pipeline to execute a test runner, generate a report file, and provide that file to the scanner.
For modern .NET development, the industry standard for this task is Coverlet.
Step 1: The NuGet package
Ensure every test project in your solution references the collector package. This is included by default in most modern .NET templates, but verification is recommended.
<PackageReference Include="coverlet.collector" Version="6.0.0" />

Step 2: The pipeline update
You must instruct the dotnet test command to output the results in the OpenCover format so SonarQube can understand them.
Azure DevOps update: Modify your .yaml file to add the --collect argument.
- task: DotNetCoreCLI@2
  displayName: 'Run Tests with Coverage'
  inputs:
    command: 'test'
    projects: '**/*.Tests.csproj'
    # The argument below generates OpenCover XML reports
    arguments: '--configuration Release --collect:"XPlat Code Coverage" -- DataCollectionRunSettings.DataCollectors.DataCollector.Configuration.Format=opencover'

GitHub Actions update: The logic is identical for the CLI command.
- name: Test
  run: dotnet test --configuration Release --collect:"XPlat Code Coverage" -- DataCollectionRunSettings.DataCollectors.DataCollector.Configuration.Format=opencover

Step 3: Telling the scanner
Finally, you must explicitly tell the scanner where to find the report. This is achieved by adding a property to the extraProperties section of your SonarCloudPrepare task (Azure) or to the begin command (GitHub).
# Add this to your scanner configuration
sonar.cs.opencover.reportsPaths=**/TestResults/**/coverage.opencover.xml
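For Azure DevOps, the property goes inside the extraProperties input of the prepare task. Here is a minimal sketch; the task version, service connection name, and organization/project keys are assumptions, so substitute the values from your own pipeline from Part 3:

- task: SonarCloudPrepare@1
  displayName: 'Prepare analysis on SonarQube Cloud'
  inputs:
    SonarCloud: 'SonarCloud-Connection'   # hypothetical service connection name
    organization: 'my-org'                # hypothetical organization key
    scannerMode: 'MSBuild'
    projectKey: 'my-org_my-project'       # hypothetical project key
    # Point the scanner at the Coverlet OpenCover reports:
    extraProperties: |
      sonar.cs.opencover.reportsPaths=**/TestResults/**/coverage.opencover.xml

On GitHub Actions, pass the same property to the begin step instead, for example as /d:sonar.cs.opencover.reportsPaths=... on the dotnet sonarscanner begin command.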
Once this configuration is committed, your next pull request analysis will populate the Coverage metric. Under the "focus on new code" strategy, your quality gate will typically require 80% coverage on new code.
2. Deep security: Taint analysis
Standard static analysis identifies defects within the scope of a single file. However, sophisticated security vulnerabilities often involve data flowing across multiple files, methods, and architectural layers—for example, user input entering via a controller, flowing through a service, and being concatenated into a SQL query in a repository.
This detection capability is known as taint analysis.
If you are using SonarQube Cloud (or SonarQube Server Developer Edition+), this engine executes automatically. It detects complex injection flaws that standard linters cannot identify.
NOTE: Taint Analysis issues cannot be identified by the IDE extension alone; they require the full context of the server-side scan.
Example: The hidden SQL injection (rule S3649)

Consider the following code snippet:
public void DeleteRecord(string recordId)
{
    // 'recordId' originated from an API endpoint (Source)
    // and is used in a raw SQL command (Sink).
    _db.Execute($"DELETE FROM Records WHERE Id = '{recordId}'");
}

SonarQube's taint analysis engine tracks the recordId variable from the moment it enters the application. It recognizes that no sanitization occurred along the execution path and flags this as a critical security vulnerability (SQL injection).
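The standard remediation is to break the tainted flow by binding the input as a query parameter instead of concatenating it into the SQL string. A minimal sketch, assuming the same hypothetical _db helper as above with a Dapper-style Execute(sql, parameters) overload:

public void DeleteRecord(string recordId)
{
    // 'recordId' is now bound as a parameter, so the database treats it
    // strictly as data; it can no longer rewrite the SQL statement itself.
    _db.Execute(
        "DELETE FROM Records WHERE Id = @Id",
        new { Id = recordId });
}

Because the tainted value never becomes part of the SQL text, the engine should no longer report S3649 on this path.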
Action item: Regularly review the "Security" tab on your dashboard.
3. Customizing the analysis: Roslyn & quality profiles
In some scenarios, the default rules (the "Sonar way") may not align perfectly with the needs of your organization.
Modifying the quality profile
You can create a new quality profile that extends the default one. To do so:
- Navigate to Quality Profiles > C#.
- Select the gear icon next to "Sonar way" and choose Extend.
- Name the new profile (e.g., "Corporate.NET Way").
- In this child profile, you can deactivate rules that generate noise for your team, or activate stricter rules that are disabled by default.
Roslyn integration
The SonarScanner for .NET is built on top of the Microsoft Roslyn compiler platform. This architecture means the scanner automatically imports issues from other Roslyn analyzers you may have installed (such as the xUnit analyzers or StyleCop), although most issues will come from the analyzers that ship with the .NET SDK and those built by Sonar. If you would rather not import these external issues, you can disable the behavior with additional scanner properties.
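As a sketch, the property below is the one I would reach for; treat the exact key as an assumption and confirm it against the SonarScanner for .NET documentation for your version before relying on it:

# Assumed property name – verify against the SonarScanner for .NET docs.
# When enabled, issues raised by third-party Roslyn analyzers are not imported.
sonar.cs.roslyn.ignoreIssues=true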
If you install a third-party analyzer via NuGet:
<PackageReference Include="StyleCop.Analyzers" Version="1.1.118" />

The issues detected by StyleCop will appear in your build logs and automatically be imported into the SonarQube dashboard, providing a "single pane of glass" for all code quality metrics.
Conclusion: The journey to high-quality code
This series has provided a comprehensive roadmap for integrating SonarQube into a professional .NET environment.
- Part 1 introduced SonarQube for IDE for instant, local feedback.
- Part 2 demonstrated the setup of a local server and the mechanics of the scanner.
- Part 3 automated the workflow with CI/CD pipelines.
- Part 4 focused on interpreting software qualities and applying the "focus on new code" strategy.
- Part 5 concluded with advanced integrations for coverage, taint analysis, and custom quality profiles.
Implementing SonarQube is more than a tooling decision; it is a commitment to a culture of quality. By enforcing consistency, intentionality, adaptability, and responsibility, you ensure that your software remains a valuable asset rather than a liability, regardless of whether it is written by humans or machines.
