Original research on cognitive privacy, algorithmic influence, and the defense of independent thought.
AI Algorithms Can Read Your Mind
Traditional privacy frameworks depend on a controllable perimeter where data sharing requires active user participation. AI-integrated wearables have dissolved this boundary through "biometric psychography"—the extraction of psychological profiles from involuntary physiological signals. I examine the "pre-consent collection problem" and how our mere presence in public space is being converted into a product of algorithmic inference.
The Architecture of Total Capture
Hesitation before typing, deleted drafts, and pauses mid-thought are no longer private. They're data points. The Architecture of Total Capture introduces the concept of cognitive privacy and documents four convergent threats: pre-upload scanning, biometric sentiment tracking, analytic atrophy, and cognitive prompt injection.
Most AI audits evaluate data privacy and cybersecurity but ignore what the system does to human thinking. The Cognitive Privacy Impact Assessment (CPIA) introduces six critical domains for evaluating how AI systems capture, infer, and influence cognitive processes. It is the framework every responsible AI audit should adopt to protect human agency.