Jan. 23, 2022, 2:06 p.m. | /u/red_shrike

cybersecurity www.reddit.com

In cybersecurity, risk should be calculated with probability as a determining factor. If a threat has near zero probability of being applicable, then probability (and thus risk) should be infinitesimal.

Risk = Threat x Vulnerability x Impact x Probability (not the true risk equation, but for conversation purposes)
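As a rough sketch of the point (hypothetical weights and a toy 0-to-1 scale, not a real risk model), here's what it looks like when probability scales the score instead of being ignored:

```python
def risk_score(threat: float, vulnerability: float,
               impact: float, probability: float) -> float:
    """Toy risk score on a 0..1 scale: probability multiplies the
    whole product, so a near-zero probability drives risk toward
    zero regardless of how scary the threat sounds."""
    return threat * vulnerability * impact * probability

# Severe-sounding but wildly improbable threat:
hypothetical = risk_score(threat=0.9, vulnerability=0.9,
                          impact=1.0, probability=0.001)

# Moderate threat actually being exploited in the wild:
observed = risk_score(threat=0.6, vulnerability=0.5,
                      impact=0.7, probability=0.8)

print(hypothetical, observed)  # the observed threat dominates
```

The numbers are made up; the takeaway is only the shape of the formula, where "probable" beats "hypothetical" even at lower severity.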

The problem is that some security folks treat hypothetical threats as probable threats without any statistics to back that up, or any evidence that the threat is being exploited in the wild against a given architecture.

"I saw …

