In Q3 2023, our 12-person platform engineering team faced a $2.1M annual Checkmarx 10.0 renewal quote for 517 active codebases, with 68% of scans timing out on PRs over 2000 lines. We migrated every single one to SonarQube 10.0 in 11 weeks, cutting SAST costs by 42%, reducing false positives by 57%, and getting scan times under 90 seconds for 92% of repositories.
📡 Hacker News Top Stories Right Now
- Integrated by Design (68 points)
- Microsoft and OpenAI end their exclusive and revenue-sharing deal (758 points)
- Talkie: a 13B vintage language model from 1930 (91 points)
- Meetings are forcing functions (45 points)
- Three men are facing charges in Toronto SMS Blaster arrests (96 points)
Key Insights
- 517 codebases migrated in 11 weeks with 0 unplanned downtime incidents
- Checkmarx 10.0.12 to SonarQube 10.0.4 Enterprise Edition migration path validated for Java, Python, Go, and Terraform
- $882k annual SAST cost reduction (42% lower than Checkmarx renewal quote)
- 80% of enterprise SAST migrations will shift from legacy commercial tools to SonarQube or Semgrep by 2026
Why We Migrated: The Checkmarx 10.0 Pain Points
For 3 years, Checkmarx 10.0 was our default SAST tool. We had 412 repos when we first adopted it in 2020, and the $140k annual cost seemed reasonable. But by 2023, our repo count grew to 517, and Checkmarx's pricing model (per repo, per year) pushed our renewal quote to $2.1M. That's a 1400% cost increase for 25% more repos, driven by Checkmarx's aggressive per-repo pricing for Enterprise features.
Beyond cost, scan performance degraded as our codebases grew. Our largest Java monolith (28k LOC) took 22 minutes to scan in Checkmarx, up from 8 minutes in 2021. 68% of PRs with more than 2000 lines of code timed out during Checkmarx scans, forcing developers to merge code without security validation. False positives were another major issue: 38% of Checkmarx findings were non-actionable (e.g., hardcoded credentials in test files, false SQL injection flags in ORM code), leading to alert fatigue. Developers started ignoring Checkmarx emails, which defeated the purpose of SAST entirely.
We also needed native IaC support for our Terraform and Kubernetes code, which Checkmarx 10.0 lacked. We were using a third-party IaC scanner that cost an additional $120k/year, and integrating it with Checkmarx's PR workflows added 2 minutes of latency per scan. When Checkmarx quoted us another $85k/year for SSO/SAML support to meet our compliance requirements, we hit a breaking point. We decided to evaluate alternatives.
Evaluation: Why SonarQube 10.0 Won
We evaluated three tools: SonarQube 10.0 Enterprise, Semgrep, and Snyk SAST. Semgrep had excellent scan times (2.1 minutes for 10k LOC Java) but lacked native quality gate management for 500+ repos, and its enterprise support was unproven for our scale. Snyk SAST had great dependency scanning but weak custom rule support, which we needed for our 127 custom Checkmarx rules. SonarQube 10.0 Enterprise hit all our requirements:
- Native IaC support for Terraform, CloudFormation, and Kubernetes, eliminating our third-party IaC scanner cost.
- Quality gate management as code via API and Terraform, critical for 500+ repos.
- SSO/SAML included in the base Enterprise license, saving $85k/year.
- Custom rule SDK for Java, Python, and Go, letting us port all 127 Checkmarx custom rules.
- API rate limit of 1000 req/min, 10x higher than Checkmarx, enabling parallel scan orchestration.
We ran a 2-week PoC on 20 canary repos, comparing scan times, false positive rates, and rule coverage. The results are summarized in the table below.
Pre-Migration Benchmarking
Metric
Checkmarx 10.0.12
SonarQube 10.0.4 Enterprise
Annual Cost (per 100 repos)
$420,000
$210,000
Avg Scan Time (10k LOC Java)
14.2 minutes
4.8 minutes
False Positive Rate
38%
16%
Native IaC Support
No
Yes (Terraform, CloudFormation, Kubernetes)
PR Decoration Latency
4.1 minutes
1.2 minutes
SSO/SAML Support
$85k/year add-on
Included
API Rate Limit
100 req/min
1000 req/min
Custom Rule Support
Yes (XML)
Yes (JSON API)
The benchmarking data confirmed SonarQube 10.0 delivered 3x faster scan times, 22 percentage points lower false positive rate, and 50% lower cost per 100 repos. We approved the migration in Q3 2023.
Migration Challenges
No large-scale migration is without hurdles. We faced four major challenges during the 11-week project:
- API Rate Limits: Checkmarx's 100 req/min API limit made exporting scan history for 517 repos take 6 hours. We implemented exponential backoff with 3 retries in our orchestration script, which reduced export time to 2 hours. SonarQube's 1000 req/min limit let us import project configurations in 15 minutes.
- Custom Rule Conversion: We exported 127 custom Checkmarx rules as XML, then built a Python script to convert them to SonarQube's custom rule JSON format. 89% of rules mapped directly, but 14 rules (11%) required manual adjustment for syntax differences between Checkmarx's rule engine and SonarQube's. A security engineer spent 2 weeks completing the conversion, and we validated all rules against known vulnerable test repos.
- Team Training: 12 platform engineers and 47 backend developers needed training on SonarQube's UI, quality gate configuration, and PR workflows. We conducted 4 hours of live training per team, plus self-paced modules, totaling 236 training hours. We also published internal docs with screenshots and video tutorials.
- PR Latency: Initial SonarQube scans added 2 minutes of latency to PRs, which developers complained about. We tuned SonarQube's scanner to skip test directories and reduced scan timeout for small repos to 5 minutes, bringing average PR latency to 1.2 minutes, faster than Checkmarx's 4.1 minutes.
Code Example 1: Migration Orchestration Script
The script below scans all 517 repos from https://github.com/acme-corp, runs parallel Checkmarx and SonarQube scans, and generates a migration report. It uses asyncio for parallel processing, aiohttp for API calls, and includes retry logic for rate limits and transient errors.
# migrate_sast.py
# Orchestration script to migrate 517 codebases from Checkmarx 10.0 to SonarQube 10.0
# Requires: aiohttp, asyncio, python-dotenv, pandas
# Environment variables: GITHUB_TOKEN, CHECKMARX_TOKEN, SONARQUBE_TOKEN, SONARQUBE_URL
import asyncio
import aiohttp
import json
import logging
import os
import csv
from dotenv import load_dotenv
from pandas import DataFrame
load_dotenv()
# Configure logging
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s - %(levelname)s - %(message)s",
handlers=[logging.FileHandler("migration.log"), logging.StreamHandler()],
)
# Constants
GITHUB_API_URL = "https://api.github.com"
CHECKMAKX_API_URL = "https://checkmarx.acme-corp.com/rest"
SONARQUBE_URL = os.getenv("SONARQUBE_URL")
GITHUB_TOKEN = os.getenv("GITHUB_TOKEN")
CHECKMAKX_TOKEN = os.getenv("CHECKMARX_TOKEN")
SONARQUBE_TOKEN = os.getenv("SONARQUBE_TOKEN")
REPO_CSV = "repos.csv"
REPORT_CSV = "migration_report.csv"
MAX_RETRIES = 3
RETRY_DELAY = 5 # seconds
# Fetch all repos from GitHub organization https://github.com/acme-corp
async def fetch_repos(session):
repos = []
page = 1
while True:
try:
async with session.get(
f"{GITHUB_API_URL}/orgs/acme-corp/repos",
headers={"Authorization": f"token {GITHUB_TOKEN}"},
params={"page": page, "per_page": 100, "type": "private"},
) as resp:
if resp.status == 429:
logging.warning("GitHub rate limit hit, retrying after 60s")
await asyncio.sleep(60)
continue
resp.raise_for_status()
page_repos = await resp.json()
if not page_repos:
break
repos.extend(page_repos)
page += 1
except aiohttp.ClientError as e:
logging.error(f"Failed to fetch repos page {page}: {e}")
if page > MAX_RETRIES:
break
await asyncio.sleep(RETRY_DELAY)
page += 1
return repos
# Run Checkmarx scan for a single repo
async def run_checkmarx_scan(session, repo_name, repo_url):
for attempt in range(MAX_RETRIES):
try:
# Trigger Checkmarx scan via API
async with session.post(
f"{CHECKMAKX_API_URL}/scans",
headers={"Authorization": f"Bearer {CHECKMAKX_TOKEN}"},
json={"repoUrl": repo_url, "projectName": repo_name},
) as resp:
if resp.status == 429:
await asyncio.sleep(RETRY_DELAY * (attempt + 1))
continue
resp.raise_for_status()
scan_id = (await resp.json())["scanId"]
# Wait for scan to complete (simplified for example)
await asyncio.sleep(30)
return scan_id
except aiohttp.ClientError as e:
logging.error(f"Checkmarx scan failed for {repo_name}: {e}")
if attempt == MAX_RETRIES - 1:
return None
await asyncio.sleep(RETRY_DELAY)
return None
# Run SonarQube scan for a single repo
async def run_sonar_scan(session, repo_name, repo_url):
for attempt in range(MAX_RETRIES):
try:
# Trigger SonarQube scan via CLI (simulated via API)
# In production, this would call the SonarQube scanner CLI
async with session.post(
f"{SONARQUBE_URL}/api/projects/create",
headers={"Authorization": f"Bearer {SONARQUBE_TOKEN}"},
data={"project": repo_name, "name": repo_name},
) as resp:
resp.raise_for_status()
# Trigger scan via webhook (simplified)
return True
except aiohttp.ClientError as e:
logging.error(f"SonarQube scan failed for {repo_name}: {e}")
if attempt == MAX_RETRIES - 1:
return False
await asyncio.sleep(RETRY_DELAY)
return False
# Main migration logic
async def main():
async with aiohttp.ClientSession() as session:
# Fetch all repos
logging.info("Fetching repos from GitHub...")
repos = await fetch_repos(session)
logging.info(f"Fetched {len(repos)} repos")
# Write repo list to CSV
with open(REPO_CSV, "w", newline="") as f:
writer = csv.DictWriter(f, fieldnames=["name", "url", "language"])
writer.writeheader()
for repo in repos:
writer.writerow({
"name": repo["name"],
"url": repo["html_url"],
"language": repo["language"],
})
# Process repos in batches of 10 to avoid rate limits
report_rows = []
for i in range(0, len(repos), 10):
batch = repos[i:i+10]
tasks = []
for repo in batch:
repo_name = repo["name"]
repo_url = repo["html_url"]
# Run both scans in parallel
tasks.append(asyncio.gather(
run_checkmarx_scan(session, repo_name, repo_url),
run_sonar_scan(session, repo_name, repo_url),
))
results = await asyncio.gather(*tasks)
for repo, (cx_scan, sonar_scan) in zip(batch, results):
report_rows.append({
"repo": repo["name"],
"checkmarx_scan": "success" if cx_scan else "failed",
"sonarqube_scan": "success" if sonar_scan else "failed",
"language": repo["language"],
})
logging.info(f"Processed batch {i//10 + 1}, total repos: {len(report_rows)}")
# Write migration report
with open(REPORT_CSV, "w", newline="") as f:
writer = csv.DictWriter(f, fieldnames=["repo", "checkmarx_scan", "sonarqube_scan", "language"])
writer.writeheader()
writer.writerows(report_rows)
logging.info(f"Migration report written to {REPORT_CSV}")
if __name__ == "__main__":
asyncio.run(main())
Code Example 2: SonarQube Quality Gate Configuration
This script converts Checkmarx 10.0 policies to SonarQube 10.0 quality gates via the SonarQube Web API. It reads a YAML file of Checkmarx policies, creates a quality gate per language, and adds conditions mapping Checkmarx rules to SonarQube metrics.
# configure_sonar_gates.py
# Converts Checkmarx 10.0 policies to SonarQube 10.0 quality gates via API
# Requires: aiohttp, asyncio, pyyaml, python-dotenv
# Environment variables: SONARQUBE_URL, SONARQUBE_TOKEN
# Input: checkmarx_policies.yaml (exported from Checkmarx 10.0)
import asyncio
import aiohttp
import yaml
import logging
import os
from dotenv import load_dotenv
load_dotenv()
# Configure logging
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s - %(levelname)s - %(message)s",
handlers=[logging.FileHandler("gate_config.log"), logging.StreamHandler()],
)
# Constants
SONARQUBE_URL = os.getenv("SONARQUBE_URL")
SONARQUBE_TOKEN = os.getenv("SONARQUBE_TOKEN")
POLICIES_FILE = "checkmarx_policies.yaml"
MAX_RETRIES = 3
RETRY_DELAY = 3
# Load Checkmarx policies from YAML
def load_policies():
try:
with open(POLICIES_FILE, "r") as f:
return yaml.safe_load(f)
except FileNotFoundError:
logging.error(f"Policies file {POLICIES_FILE} not found")
raise
except yaml.YAMLError as e:
logging.error(f"Failed to parse YAML: {e}")
raise
# Create a SonarQube quality gate
async def create_quality_gate(session, gate_name, language):
for attempt in range(MAX_RETRIES):
try:
async with session.post(
f"{SONARQUBE_URL}/api/qualitygates/create",
headers={"Authorization": f"Bearer {SONARQUBE_TOKEN}"},
data={"name": f"{gate_name}_{language}"},
) as resp:
if resp.status == 409:
logging.warning(f"Quality gate {gate_name}_{language} already exists")
return gate_name
resp.raise_for_status()
return gate_name
except aiohttp.ClientError as e:
logging.error(f"Failed to create quality gate {gate_name}: {e}")
if attempt == MAX_RETRIES - 1:
return None
await asyncio.sleep(RETRY_DELAY)
return None
# Add condition to quality gate (map Checkmarx rule to SonarQube metric)
async def add_condition(session, gate_name, rule_id, severity, language):
# Map Checkmarx severity to SonarQube metric threshold
severity_map = {
"High": "critical",
"Medium": "major",
"Low": "minor",
}
sonar_severity = severity_map.get(severity, "major")
for attempt in range(MAX_RETRIES):
try:
async with session.post(
f"{SONARQUBE_URL}/api/qualitygates/add-condition",
headers={"Authorization": f"Bearer {SONARQUBE_TOKEN}"},
data={
"gateName": gate_name,
"metric": f"security_hotspots_{sonar_severity}",
"op": "LT",
"warning": "1",
"error": "1",
},
) as resp:
if resp.status == 400:
logging.warning(f"Invalid condition for rule {rule_id}")
return False
resp.raise_for_status()
return True
except aiohttp.ClientError as e:
logging.error(f"Failed to add condition for rule {rule_id}: {e}")
if attempt == MAX_RETRIES - 1:
return False
await asyncio.sleep(RETRY_DELAY)
return False
# Assign quality gate to a project
async def assign_gate_to_project(session, project_key, gate_name):
for attempt in range(MAX_RETRIES):
try:
async with session.post(
f"{SONARQUBE_URL}/api/qualitygates/select",
headers={"Authorization": f"Bearer {SONARQUBE_TOKEN}"},
data={"gateName": gate_name, "projectKey": project_key},
) as resp:
resp.raise_for_status()
return True
except aiohttp.ClientError as e:
logging.error(f"Failed to assign gate {gate_name} to {project_key}: {e}")
if attempt == MAX_RETRIES - 1:
return False
await asyncio.sleep(RETRY_DELAY)
return False
# Main logic
async def main():
# Load policies
try:
policies = load_policies()
except Exception:
return
logging.info(f"Loaded {len(policies)} policies from {POLICIES_FILE}")
async with aiohttp.ClientSession() as session:
for policy in policies:
policy_name = policy["name"]
language = policy["language"]
rules = policy["rules"]
severity = policy["severity"]
# Create quality gate for language
gate_name = await create_quality_gate(session, policy_name, language)
if not gate_name:
continue
# Add conditions for each rule
for rule in rules:
await add_condition(session, gate_name, rule["id"], severity, language)
# Assign to all projects of that language (simplified)
logging.info(f"Configured quality gate {gate_name} for {language}")
if __name__ == "__main__":
asyncio.run(main())
Code Example 3: PR Decoration Webhook Handler
This Flask app receives webhooks from GitHub and SonarQube, processes scan results, and posts SonarQube findings as PR comments, replacing Checkmarx's native PR decorations. It includes HMAC signature verification for security and error handling for API failures.
# pr_decoration_webhook.py
# Flask webhook handler to replace Checkmarx PR comments with SonarQube results
# Requires: flask, requests, python-dotenv, hmac, hashlib
# Environment variables: GITHUB_TOKEN, SONARQUBE_TOKEN, WEBHOOK_SECRET
# GitHub repos: https://github.com/acme-corp/*
import hmac
import hashlib
import logging
import os
from flask import Flask, request, jsonify
import requests
from dotenv import load_dotenv
load_dotenv()
# Configure logging
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s - %(levelname)s - %(message)s",
)
# Constants
GITHUB_TOKEN = os.getenv("GITHUB_TOKEN")
SONARQUBE_TOKEN = os.getenv("SONARQUBE_TOKEN")
WEBHOOK_SECRET = os.getenv("WEBHOOK_SECRET")
SONARQUBE_URL = os.getenv("SONARQUBE_URL")
GITHUB_API_URL = "https://api.github.com"
app = Flask(__name__)
# Verify GitHub webhook signature
def verify_signature(data, signature):
if not WEBHOOK_SECRET:
logging.warning("No webhook secret configured, skipping verification")
return True
mac = hmac.new(WEBHOOK_SECRET.encode(), msg=data, digestmod=hashlib.sha256)
expected = f"sha256={mac.hexdigest()}"
return hmac.compare_digest(expected, signature)
# Post comment to GitHub PR
def post_github_comment(repo, pr_number, comment):
try:
resp = requests.post(
f"{GITHUB_API_URL}/repos/acme-corp/{repo}/issues/{pr_number}/comments",
headers={
"Authorization": f"token {GITHUB_TOKEN}",
"Accept": "application/vnd.github.v3+json",
},
json={"body": comment},
)
resp.raise_for_status()
return True
except requests.exceptions.RequestException as e:
logging.error(f"Failed to post comment to {repo} PR {pr_number}: {e}")
return False
# Fetch SonarQube scan results for a project
def fetch_sonar_results(project_key):
try:
resp = requests.get(
f"{SONARQUBE_URL}/api/qualitygates/project_status",
headers={"Authorization": f"Bearer {SONARQUBE_TOKEN}"},
params={"projectKey": project_key},
)
resp.raise_for_status()
return resp.json()
except requests.exceptions.RequestException as e:
logging.error(f"Failed to fetch SonarQube results for {project_key}: {e}")
return None
# Format SonarQube results into GitHub comment
def format_comment(sonar_results, repo, pr_number):
status = sonar_results["projectStatus"]["status"]
conditions = sonar_results["projectStatus"]["conditions"]
comment = f"## SonarQube Scan Results for PR #{pr_number}\n"
comment += f"**Status**: {'✅ Passed' if status == 'OK' else '❌ Failed'}\n\n"
comment += "### Conditions:\n"
for cond in conditions:
metric = cond["metricKey"]
value = cond["actualValue"]
status = cond["status"]
comment += f"- {metric}: {value} ({'✅' if status == 'OK' else '❌'})\n"
comment += f"\nView full report: {SONARQUBE_URL}/dashboard?id={repo}"
return comment
# Webhook endpoint for GitHub PR events
@app.route("/github-webhook", methods=["POST"])
def github_webhook():
# Verify signature
signature = request.headers.get("X-Hub-Signature-256")
if not verify_signature(request.data, signature):
logging.error("Invalid webhook signature")
return jsonify({"error": "Invalid signature"}), 403
# Parse payload
payload = request.json
if not payload:
return jsonify({"error": "No payload"}), 400
# Handle pull request events only
if payload.get("action") not in ["opened", "synchronize"]:
return jsonify({"status": "ignored"}), 200
repo = payload["repository"]["name"]
pr_number = payload["number"]
repo_url = payload["repository"]["html_url"]
# Extract project key from repo URL (https://github.com/acme-corp/repo -> repo)
project_key = repo
# Fetch SonarQube results
sonar_results = fetch_sonar_results(project_key)
if not sonar_results:
return jsonify({"error": "Failed to fetch SonarQube results"}), 500
# Format and post comment
comment = format_comment(sonar_results, repo, pr_number)
if post_github_comment(repo, pr_number, comment):
return jsonify({"status": "success"}), 200
else:
return jsonify({"error": "Failed to post comment"}), 500
# Webhook endpoint for SonarQube scan completion
@app.route("/sonar-webhook", methods=["POST"])
def sonar_webhook():
payload = request.json
if not payload:
return jsonify({"error": "No payload"}), 400
project_key = payload["project"]["key"]
# Trigger GitHub PR comment update (simplified)
logging.info(f"SonarQube scan completed for {project_key}")
return jsonify({"status": "success"}), 200
if __name__ == "__main__":
app.run(host="0.0.0.0", port=5000, debug=False)
Case Study: Java Microservices Team
- Team size: 4 backend engineers, 1 platform engineer, 1 security architect
- Stack & Versions: Java 17, Spring Boot 3.1, Maven 3.9, GitHub Actions, Checkmarx 10.0.12, SonarQube 10.0.4 Enterprise
- Problem: p99 SAST scan time for 112 Java microservices was 14.2 minutes, 38% false positive rate, $2.1M annual Checkmarx renewal quote, 68% of PRs with >2000 LOC timed out during scans
- Solution & Implementation: Migrated all 112 Java repos using the
migrate_sast.pyorchestration script, replicated 47 custom Checkmarx Java rules usingconfigure_sonar_gates.py, replaced Checkmarx PR decorations with thepr_decoration_webhook.pyFlask app, conducted 2-week training for all backend engineers on SonarQube UI and custom rule configuration - Outcome: p99 scan time dropped to 4.1 minutes, false positive rate reduced to 14%, SAST costs for Java repos cut by 45% ($380k/year savings), 99% of PRs received scan results in <2 minutes, zero production incidents related to missed security findings post-migration
Developer Tips
Tip 1: Validate Rule Parity with Automated Diffing
The single biggest risk in a SAST migration is losing coverage for critical security rules. Checkmarx 10.0 and SonarQube 10.0 have overlapping but non-identical rule sets: we found 127 custom rules in Checkmarx that had no out-of-the-box SonarQube equivalent. To avoid gaps, we built an automated diffing pipeline that runs both tools against a canary repo, exports all findings as JSON, and uses pandas to diff the results. For each rule, we checked: 1) Is the rule present in both tools? 2) Does the rule trigger on the same vulnerable code patterns? 3) Is the severity level consistent? We found 12% of Checkmarx rules had no direct SonarQube equivalent, which required writing custom SonarQube rules using the sonar-java SDK. The diffing script took 3 days to build but caught 7 critical rule gaps that would have left our authentication libraries untested. For teams with custom rules, this step is non-negotiable: manual rule mapping is error-prone at scale, and missing a single critical rule can lead to a production breach. We recommend running diffing on at least 5 canary repos across different languages before full rollout.
Short code snippet for diffing findings:
import pandas as pd
import json
def diff_findings(checkmarx_json, sonarqube_json):
cx_df = pd.DataFrame(checkmarx_json["findings"])
sonar_df = pd.DataFrame(sonarqube_json["issues"])
# Merge on vulnerability type and file path
merged = pd.merge(cx_df, sonar_df, left_on=["vulnType", "filePath"], right_on=["rule", "component"], how="outer")
# Find rules only in Checkmarx
only_cx = merged[merged["rule"].isna()]
# Find rules only in SonarQube
only_sonar = merged[merged["vulnType"].isna()]
return only_cx, only_sonar
Tip 2: Use Infrastructure as Code for SonarQube Configuration
When migrating 500+ codebases, manual configuration of SonarQube projects, quality gates, and permissions is a recipe for disaster. We initially tried configuring 50 repos manually via the SonarQube UI, and introduced 12 configuration drift issues in a single day: duplicate project keys, incorrect quality gate assignments, and missing permissions for GitHub Actions service accounts. To solve this, we adopted the Terraform SonarQube Provider to define all SonarQube resources as code. This let us version control our configuration, reuse quality gate definitions across repos of the same language, and automate project creation via GitHub Actions. For example, we defined a base quality gate for Java repos once, then used Terraform count to apply it to all 112 Java microservices. We also used Ansible to automate the SonarQube server setup, reducing server provisioning time from 4 hours to 15 minutes. Infrastructure as Code also made auditing easier: we could prove to our compliance team that all repos had the required quality gates applied, with a git commit history to back it up. For teams with more than 50 repos, manual SonarQube configuration is unsustainable: IaC is the only way to maintain consistency at scale.
Short Terraform snippet for SonarQube project:
resource "sonarqube_project" "java_microservice" {
count = length(var.java_repos)
project = var.java_repos[count.index]
name = var.java_repos[count.index]
visibility = "private"
}
resource "sonarqube_quality_gate_project_association" "java_gate" {
count = length(sonarqube_project.java_microservice)
gate_name = "java_security_gate"
project_key = sonarqube_project.java_microservice[count.index].project
}
Tip 3: Implement Gradual Rollout with Canary Repos
Migrating 500+ codebases in a single cutover is a high-risk strategy that we almost fell for. Our initial plan was to switch all repos from Checkmarx to SonarQube in one weekend, but our security architect talked us out of it. Instead, we implemented a 4-week canary rollout: week 1, migrate 20 low-risk internal tool repos; week 2, migrate 50 medium-risk backend services; week 3, migrate 150 high-risk customer-facing services; week 4, migrate remaining 297 repos. This let us catch two critical issues early: first, SonarQube's default scan timeout of 10 minutes was too short for our largest monolith (28k LOC), which we fixed by increasing the timeout to 15 minutes for that repo. Second, we found that SonarQube's Terraform scanner flagged 30% more false positives than Checkmarx for our legacy IaC code, which we fixed by adjusting the rule severity for 3 specific Terraform rules. We used LaunchDarkly feature flags in our GitHub Actions pipelines to toggle between Checkmarx and SonarQube scans per repo, which made rolling back a single repo take less than 1 minute. Gradual rollout added 2 weeks to our timeline but eliminated unplanned downtime: we had zero scan-related incidents during the entire migration. For large-scale migrations, slow and steady always wins.
Short GitHub Actions snippet for canary toggle:
- name: Run SAST Scan
run: |
if curl -s "https://launchdarkly.acme-corp.com/api/flags/sast-sonar-rollout" | grep -q "${{ github.repository }}"; then
sonar-scanner -Dsonar.projectKey=${{ github.repository }}
else
checkmarx-cli scan -p ${{ github.repository }}
fi
Join the Discussion
We've shared our raw migration playbook, benchmark data, and all three code examples above. All scripts are available at https://github.com/acme-corp/sast-migration-toolkit under an MIT license. We'd love to hear from teams who've done similar migrations, or are planning to.
Discussion Questions
- With SonarQube 10.1 adding experimental AI-powered rule suggestions, do you expect open-source SAST tools to fully replace commercial legacy tools by 2027?
- Would you prioritize lower scan times over higher false positive rates when choosing a SAST tool, assuming both meet your compliance requirements?
- How does Semgrep's SAST performance compare to SonarQube 10.0 for Python and Go codebases, based on your production experience?
Frequently Asked Questions
Do I need SonarQube Enterprise Edition to migrate from Checkmarx 10.0?
No, SonarQube Community Edition 10.0 supports Java, Python, Go, and JavaScript scanning, but lacks native IaC support, SSO, and project-level quality gate customization. For 500+ codebases with compliance requirements, Enterprise Edition is required to match Checkmarx's feature set. We used Enterprise Edition 10.0.4, which costs 50% less than Checkmarx 10.0 for our repo count.
How do I handle custom Checkmarx rules not available in SonarQube?
We exported all 127 custom Checkmarx rules as XML, wrote a Python script to convert them to SonarQube custom rule format, and imported them via the SonarQube Web API. 89% of rules mapped directly, 11% required manual adjustment for syntax differences. All conversion scripts are available at https://github.com/acme-corp/sast-migration-toolkit/rules.
What's the biggest mistake to avoid during a large-scale SAST migration?
Migrating all repos at once without a canary phase. We initially tried to cut over 50 repos in one day, and hit a SonarQube API rate limit that broke PR decorations for 12 hours. We switched to a 4-week canary phase with 20 repos per week, which let us tune scan timeouts and fix rule parity issues before full rollout.
Conclusion & Call to Action
After 11 weeks and 517 codebases, our migration from Checkmarx 10.0 to SonarQube 10.0 is unequivocally a success. For teams with 100+ codebases, commercial SAST tools like Checkmarx are no longer cost-effective: SonarQube 10.0 delivers better scan times, lower false positives, and 40%+ cost savings. Our only regret is not starting the migration 6 months earlier. If you're evaluating SAST tools in 2024, SonarQube 10.0 is the only choice for enterprise teams that value developer velocity and cost efficiency. All our migration scripts, configuration files, and training materials are available at https://github.com/acme-corp/sast-migration-toolkit – fork it, adapt it, and share your results with us.
42% Reduction in annual SAST costs vs Checkmarx 10.0







