Yes, “AI” will compromise your information security posture. No, not through some mythical self-aware galaxy-brain entity magically cracking your passwords in seconds or “autonomously” exploiting new vulnerabilities.
It’s way more mundane.
When immensely complex, poorly understood systems get hurriedly integrated into your toolset and workflow, or deployed in your infrastructure, what inevitably follows are leaks, compromises, downtime, and a whole lot of grief.
Complexity means cost and risk
LLM-based systems are insanely complex, both conceptually and in implementation. Complexity carries real cost and introduces very real risk. With LLMs, these costs and risks are enormous, poorly understood – and usually just hand-waved away. As Suha Hussain puts it in a video I’ll discuss a bit later: