
The .NET Developer’s Guide to SonarQube Part 2: Analyzing Your First Project


In Part 1, we explored how to catch issues in real time using the SonarQube for IDE extension. That tool covers the "inner loop" of development, providing immediate feedback on the specific file you are editing. However, to truly understand the health of a codebase, we need to look at the broader picture.

In this installment, we will move from local, single-file linting to performing a comprehensive static analysis of an entire .NET solution. We will explore the architecture of the SonarScanner for .NET, set up a local analysis environment using Docker, and walk through the specific workflow required to analyze managed code.

Provisioning a local analysis server

To understand how SonarQube processes and stores data, you can spin up a temporary instance and see firsthand the value it can provide to you and your development teams. While production environments typically involve dedicated servers and external databases, developers can replicate this architecture locally using Docker. This allows for a consistent, isolated environment for testing analysis configurations.

We will use Docker Compose to orchestrate two services: a SonarQube Community instance (the free edition of SonarQube for open source projects) and a PostgreSQL database to persist the analysis reports.

Create a file named docker-compose.yml in an empty directory and add the following configuration:

services:
  sonarqube:
    image: sonarqube:community
    hostname: sonarqube
    container_name: sonarqube
    read_only: true
    depends_on:
      db:
        condition: service_healthy
    environment:
      SONAR_JDBC_URL: jdbc:postgresql://db:5432/sonar
      SONAR_JDBC_USERNAME: sonar
      SONAR_JDBC_PASSWORD: sonar
    volumes:
      - sonarqube_data:/opt/sonarqube/data
      - sonarqube_extensions:/opt/sonarqube/extensions
      - sonarqube_logs:/opt/sonarqube/logs
      - sonarqube_temp:/opt/sonarqube/temp
    ports:
      - "9000:9000"
    networks:
      - ${NETWORK_TYPE:-ipv4}
  db:
    image: postgres:17
    healthcheck:
      test: [ "CMD-SHELL", "pg_isready -d $${POSTGRES_DB} -U $${POSTGRES_USER}" ]
      interval: 10s
      timeout: 5s
      retries: 5
    hostname: postgresql
    container_name: postgresql
    environment:
      POSTGRES_USER: sonar
      POSTGRES_PASSWORD: sonar
      POSTGRES_DB: sonar
    volumes:
      - postgresql:/var/lib/postgresql
    networks:
      - ${NETWORK_TYPE:-ipv4}

volumes:
  sonarqube_data:
  sonarqube_temp:
  sonarqube_extensions:
  sonarqube_logs:
  postgresql:

networks:
  ipv4:
    driver: bridge
    enable_ipv6: false
  dual:
    driver: bridge
    enable_ipv6: true
    ipam:
      config:
        - subnet: "192.168.2.0/24"
          gateway: "192.168.2.1"
        - subnet: "2001:db8:2::/64"
          gateway: "2001:db8:2::1"
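
The networks block defines two options: an IPv4-only bridge (ipv4, selected by default through the ${NETWORK_TYPE:-ipv4} substitution) and a dual-stack bridge (dual). If you want the dual-stack network, set the variable before starting the stack, for example:

# Use the dual-stack (IPv4 + IPv6) network for a single run
NETWORK_TYPE=dual docker compose up -d

# Or persist the choice in a .env file next to docker-compose.yml
echo "NETWORK_TYPE=dual" > .env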

NOTE: The above Docker Compose file targets macOS and Linux environments. To run your SonarQube instance on Windows with bind mounts instead of named volumes, update the volume entries to use Windows filesystem paths, for example:

      - C:\dev\sonarqube-server\data:/opt/sonarqube/data
      - C:\dev\sonarqube-server\extensions:/opt/sonarqube/extensions
      - C:\dev\sonarqube-server\logs:/opt/sonarqube/logs
      - C:\dev\sonarqube-server\postgresql:/var/lib/postgresql

To start the environment, run the following command in your terminal:

docker compose up -d

Once the containers initialize, SonarQube will be accessible at http://localhost:9000. You can log in using the default credentials (username: admin, password: admin).

You'll be prompted to change your password, and that's it. You now have a persistent, fully functional SonarQube instance running locally.
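
If you prefer to verify readiness from the terminal, SonarQube exposes a system status Web API endpoint that reports "UP" once startup has finished (no authentication is required for it in a default setup):

curl http://localhost:9000/api/system/status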

When you first connect to your instance, you will be prompted to create a project. We will do that in a moment.

Understanding .NET static code analysis architecture

Static analysis works differently depending on the language being analyzed. For interpreted languages like JavaScript or Python, a scanner simply reads the source files and parses the text.

For .NET, the process is more integrated. The SonarScanner for .NET leverages the Roslyn compiler platform. Rather than reading files externally, the scanner hooks directly into the MSBuild pipeline. This allows it to analyze the code using the full semantic model provided by the compiler, understanding type definitions, references, and method overloads with the same accuracy as the compiler itself.

This architecture dictates how we run the analysis. We cannot simply point the scanner at a folder; we must wrap the standard build process in specific "Begin" and "End" steps.

Prerequisite: Installing the Scanner

The scanner is distributed as a .NET Global Tool. To install it, run:

dotnet tool install --global dotnet-sonarscanner
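
To confirm the installation (or update the tool later), the standard .NET global tool commands apply:

# List installed global tools; dotnet-sonarscanner should appear
dotnet tool list --global

# Update to the latest released version
dotnet tool update --global dotnet-sonarscanner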

The code analysis lifecycle

To analyze a project, we must execute a three-step sequence. This sequence ensures that the Roslyn analyzers are injected into the build and that the results are correctly aggregated and uploaded.

Step 0: Project Setup

Before we can scan, we need to tell our SonarQube instance about the project.

  1. On your dashboard (http://localhost:9000), click the “Create a local project” button.
  2. Provide a unique Project key (e.g., my-awesome-app) and a Display name.
  3. Choose “Follows the instance’s default” and click the “Create project” button.
  4. For Analysis Method, choose “Locally”.
  5. On the next screen, generate an analysis token. Copy this token and store it somewhere safe; you will not be shown it again.
  6. Finally, choose “.NET” as your project type, which will display the steps needed to run the SonarScanner.

Now that we have a configured project, we can run the SonarScanner from the command line to analyze our local codebase.

Step 1: The Begin Command

The begin step downloads the Quality Profile (the active set of rules) from SonarQube and configures MSBuild to use the SonarQube analyzers.

# Run from the root folder of your solution
dotnet sonarscanner begin /k:"my-awesome-app" \
    /d:sonar.host.url="http://localhost:9000" \
    /d:sonar.login="<YOUR_TOKEN>"
  • /k: Defines the Project Key, a unique identifier for this project on SonarQube.
  • /d:sonar.host.url: Points the scanner to your local Docker instance.
  • /d:sonar.login: Authenticates the request (newer SonarQube and scanner versions also accept /d:sonar.token as the preferred equivalent).
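
The begin step also accepts optional analysis parameters through additional /d: flags. As one illustration, the documented sonar.exclusions property can keep generated code out of the results (the Migrations pattern below is just an assumed example; adjust it to your solution):

dotnet sonarscanner begin /k:"my-awesome-app" \
    /d:sonar.host.url="http://localhost:9000" \
    /d:sonar.login="<YOUR_TOKEN>" \
    /d:sonar.exclusions="**/Migrations/**"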

Step 2: The Build

Next, run your standard build command. Because of the begin step, MSBuild will now execute the SonarQube analyzers alongside the standard compilation tasks. You may notice warnings appearing in the build output; these are the issues detected by the Roslyn analyzers, which are included in the analysis by default. Be aware that this extra analysis can increase the project's build time.

dotnet build
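
One caveat: the analyzers can only inspect code that is actually compiled during this step, so an incremental build that skips unchanged projects may yield an incomplete report. If in doubt, force a full rebuild:

dotnet build --no-incremental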

Step 3: The End Command

The end step is critical. It stops the analysis hook, collects the results produced during the build, and uploads the final report to the SonarQube instance.

dotnet sonarscanner end /d:sonar.login="<YOUR_TOKEN>"
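
If you find yourself running these steps repeatedly, they are easy to wrap in a small script. Below is a minimal sketch, assuming the project key from earlier and a SONAR_TOKEN environment variable holding your analysis token:

#!/usr/bin/env bash
set -euo pipefail

# Assumes: export SONAR_TOKEN=<YOUR_TOKEN> has been run in this shell
dotnet sonarscanner begin /k:"my-awesome-app" \
    /d:sonar.host.url="http://localhost:9000" \
    /d:sonar.login="$SONAR_TOKEN"

# Full rebuild so the analyzers see every project
dotnet build --no-incremental

# Collect the results and upload the report to SonarQube
dotnet sonarscanner end /d:sonar.login="$SONAR_TOKEN"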

Interpreting the results

Once the end command completes, navigate to your project dashboard at http://localhost:9000.

Unlike the local IDE view, this dashboard provides a holistic view of the application's health:

  • Quality Gate: This is the most important metric. It indicates whether the project meets the defined release standards (Pass/Fail).
  • New Code Definition: By default, the dashboard focuses on issues introduced in recent changes (New Code), helping teams prevent technical debt from growing without getting overwhelmed by legacy issues.
  • Separation of Concerns: Issues are categorized into Reliability (Bugs), Security (Vulnerabilities), and Maintainability (Code Smells), allowing developers to prioritize critical fixes.
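
Everything shown on the dashboard is also available through SonarQube's Web API, which is handy for scripting checks outside the UI. For example, the documented project_status endpoint returns the Quality Gate result for a project (authenticate by passing your analysis token as the basic-auth username; the exact response fields may vary between versions):

curl -u "<YOUR_TOKEN>:" \
    "http://localhost:9000/api/qualitygates/project_status?projectKey=my-awesome-app"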

Summary

In this article, we established a local analysis environment and learned the specific begin-build-end workflow required to analyze .NET solutions. By integrating with Roslyn, SonarQube provides deep, compiler-level insights into your code.

While running these commands manually is useful for learning and ad-hoc debugging, a robust development workflow relies on automation. In the next part of this series, we will apply these concepts to a Continuous Integration (CI) pipeline, automating the analysis for every pull request using Azure DevOps and GitHub Actions.