US researchers have found that popular AI browser extensions are collecting sensitive user data, including medical records and Social Security numbers, posing significant privacy risks. The study was published in August 2025.
The research, conducted by experts from University College London, Mediterranea University of Reggio Calabria, and University of California, Davis, analyzed various AI-assisted browser extensions. These extensions serve as interfaces for models like OpenAI’s ChatGPT, Google’s Gemini, and Meta’s Llama, injecting content scripts into webpages to enable autonomous data scraping. The study focused on extensions such as ChatGPT for Google, Sider, Monica, Merlin, MaxAI, Perplexity, HARPA, TinaMind, and Microsoft’s Copilot. Researchers simulated browsing activities in both private and public contexts, including reading news, watching YouTube videos, viewing pornography, and completing tax forms.
The findings revealed that these extensions captured sensitive information, including images and text such as medical diagnoses, Social Security numbers, and dating app preferences. For instance, Merlin transmitted banking details and health records, while Sider AI recorded user activity even in private browsing modes. Analysis of decrypted traffic showed that data was being transmitted to company servers and third-party trackers. Sider and TinaMind shared user prompts and IP addresses with Google Analytics, facilitating cross-site tracking. Microsoft’s Copilot retained chat histories across sessions in browser storage.
The study also found that several extensions, including Google, Copilot, Monica, ChatGPT, and Sider, profiled users by age, gender, income, and interests to provide personalized responses over multiple sessions. However, Perplexity emerged as the most privacy-respecting among the tested tools, as it does not recall prior interactions and its servers avoid personal data from private spaces. Nevertheless, Perplexity still processes page titles and user location.
OpenAI’s Atlas, released after the study, selectively analyzes content but processes all website images and text. It includes optional memory features that store browsing history elements to customize user experiences. Users can disable settings that permit OpenAI to use webpage data from chatbot queries for ChatGPT training and incorporate full browsing history into training. OpenAI anonymizes data before using it for training, although specifics on boundaries remain limited.
The practice of repurposing stored user data for large language model training without explicit consent is common among AI companies. This issue is compounded by security vulnerabilities, including prompt injection attacks that allow hackers to embed malicious content in browser backends. These attacks enable phishing and theft of credentials, banking details, and personal data. A Brave study in October 2025 described prompt injections as a systemic challenge for AI browsers, increasing phishing risks. LayerX Security reported that Perplexity Comet users face 85 percent higher vulnerability to such attacks compared to Chrome users.
The USENIX researchers tested extensions in controlled scenarios to measure data capture. The results showed that extensions logged article text and images while browsing news, captured video thumbnails and comments on YouTube, recorded images and preferences on pornography sites, and exposed Social Security numbers and financial details during tax form simulations. Decrypted traffic from Merlin showed plaintext transmission of health records and banking logins. Sider AI’s packets included IP addresses paired with prompts containing personal identifiers.
The market context shows that while Chrome holds a 70 percent share of the global browser market, AI browsers have gained traction since their introduction in 2025. McKinsey & Company forecasts the browser industry will generate $750 billion in revenue by 2028. As AI browsers continue to challenge established players, concerns regarding privacy and security remain paramount. OpenAI fulfilled 105 U.S. government data requests in the first half of 2025, highlighting the need for greater transparency and regulation in the industry.




