April 6, 2024, 12:19 a.m. | /u/kannthu

cybersecurity www.reddit.com

SASTs just suck, but how much? ...and why do they suck?

I recently came across a study ([https://sen-chen.github.io/img\_cs/pdf/fse2023-sast.pdf](https://sen-chen.github.io/img_cs/pdf/fse2023-sast.pdf)) that evaluates top SASTs like CodeQL, Semgrep, and SonarQube. The study evaluates 7 tools against a dataset of real-world vulnerabilities (code snippets from CVEs, not dummy vulnerable code) and measures false positive and false negative rates.

... and to no surprise, the SASTs detected only 12.7% of all security issues. The researchers also combined the results of all 7 tools, and the combined detection rate was 30%.
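The jump from 12.7% to 30% makes sense if you treat "combining tools" as taking the union of each tool's findings: anything flagged by at least one tool counts as detected, and tools with partially non-overlapping coverage add up. A minimal sketch of that arithmetic, with made-up tool names and finding sets (the numbers below are illustrative, not the study's actual data):

```python
# Illustrative sketch: why combining SAST tools raises the detection rate.
# Each tool reports the set of ground-truth vulnerability IDs it found;
# the union counts anything that at least one tool flagged.
# Tool names and finding sets are hypothetical, not from the study.

ground_truth = set(range(100))  # 100 known real-world vulnerabilities

findings = {
    "tool_a": set(range(0, 13)),    # finds 13 of 100 on its own
    "tool_b": set(range(10, 20)),   # finds 10, partially overlapping tool_a
    "tool_c": set(range(20, 30)),   # finds 10, disjoint from the others
}

def detection_rate(found: set, truth: set) -> float:
    """Recall: fraction of true vulnerabilities that were detected."""
    return len(found & truth) / len(truth)

for name, found in findings.items():
    print(f"{name}: {detection_rate(found, ground_truth):.0%}")

combined = set().union(*findings.values())
# Overlapping findings are counted once, so the union is less than the sum.
print(f"combined: {detection_rate(combined, ground_truth):.0%}")
```

The combined rate here comes out to 30% even though the individual rates sum to 33%, because tool_a and tool_b overlap on three findings, which is the same dynamic that keeps seven real tools well short of 7 × 12.7%.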

Why …

