ChatGPT, OpenAI, and 8 other AI assistants secretly track users’ data that should remain private

Cryptopolitan · 2025/08/14 01:00
By Enacy Mapakame

AI assistants built into web browsers, including ChatGPT, are scooping up sensitive personal information from sites that many people assume are private, new research from teams in the UK and Italy shows.

The researchers examined ten popular AI-enabled browsers and extensions, among them OpenAI’s ChatGPT, Microsoft Copilot, and Merlin AI for Google Chrome, and put them through tests on both open websites and password-protected portals such as a university’s health records system.

Private sites exposed information to ChatGPT and similar tools

The study's results were striking: nine of the ten tools collected and transmitted private data, including medical histories, banking details, academic transcripts, and even social security numbers.

Perplexity AI was the only tool that did not appear to collect such data.

“These assistants have a level of access to our online activity that’s unlike anything we’ve seen before,” said Anna Maria Mandalari, the study’s senior author and an assistant professor at University College London.

“They make things quicker and easier, but our evidence shows that sometimes this comes at the expense of privacy, and in some cases, may break the law.”

To run the tests, the team mimicked everyday browsing (shopping online, checking medical results, logging into bank accounts) and then asked the assistants follow-up questions such as: What was the reason for the most recent medical visit?

By intercepting and decoding the data moving between the user’s browser, the AI company’s servers, and third-party trackers, the researchers found that some assistants collected and transmitted full-page content even from supposedly secure sites.
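
The article does not name the researchers’ tooling, but interception of this kind is typically done with a man-in-the-middle proxy. The minimal mitmproxy addon below sketches the general idea; the watched domains, keywords, and logging behaviour are illustrative assumptions, not the study’s actual setup.

```python
# Minimal sketch of traffic interception with mitmproxy (an assumption:
# the study describes interception but does not name its tooling here).
# Run with:  mitmdump -s assistant_logger.py
from mitmproxy import http

# Hypothetical watch lists for this sketch, not the researchers' real targets.
ASSISTANT_HOSTS = ("openai.com", "getmerlin.in", "sider.ai")
TRACKER_HOSTS = ("google-analytics.com",)
SENSITIVE_KEYWORDS = ("diagnosis", "iban", "social security")


class AssistantTrafficLogger:
    """Flag outbound requests to assistant or analytics hosts that appear
    to carry sensitive page content."""

    def request(self, flow: http.HTTPFlow) -> None:
        host = flow.request.pretty_host
        if not host.endswith(ASSISTANT_HOSTS + TRACKER_HOSTS):
            return
        body = (flow.request.get_text(strict=False) or "").lower()
        hits = [kw for kw in SENSITIVE_KEYWORDS if kw in body]
        if hits:
            print(f"[possible leak] {host}{flow.request.path} matched {hits}")


addons = [AssistantTrafficLogger()]
```

Routing the browser through the proxy (with mitmproxy’s CA certificate installed) then makes it possible to read, in plain text, what each extension ships off-device.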

In Merlin’s case, the sensitive data the researchers discovered included health records, banking details, exam results, and taxpayers’ social security numbers.

Sider’s AI assistant and TinaMind, meanwhile, were seen sending user prompts to Google Analytics, along with identifying information such as IP addresses. This data could be used for targeted advertising and cross-site tracking.

According to the researchers, other assistants like Copilot and Monica quietly kept complete chat logs in the browser even after sessions ended.

When accessed via certain browser integrations, OpenAI-made ChatGPT profiled users based on their perceived age, income level, gender, and interests, then tailored its answers accordingly.

“There’s simply no clear way for users to know where this information ends up once it’s been collected,” Mandalari warned.

Recently, OpenAI CEO Sam Altman warned users about privacy concerns, saying they should be cautious when using chatbots like ChatGPT for certain purposes, since those conversations don’t carry the same privacy safeguards as conversations with a real doctor or lawyer, for instance.

Could AI tools be breaching the law?

The research was carried out in the United States, but the team concluded that some AI assistants were likely in breach of both American and European privacy laws. In the US, certain cases appeared to violate rules protecting medical information, while in the EU, the findings suggested potential breaches of the General Data Protection Regulation (GDPR), which has strict limits on the storage and sharing of personal data.

Even where companies publish privacy notices, the fine print can be startling. Merlin’s EU and UK policy, for instance, lists names, contact details, login credentials, transaction records, payment information, and any typed input as data it may collect. It says this may be used for personalisation, customer support, or legal compliance.

Sider makes similar disclosures, adding that user data can be analysed for “insights” or to help develop new services. It names Google, Cloudflare, and Microsoft as possible data recipients, while assuring that partners are bound by contracts to safeguard personal information.

OpenAI’s own terms confirm that data from UK and EU users is stored outside those regions, though the company says user rights are unaffected.

“These products are pitched as making web use faster and smarter,” Mandalari said. “But what’s happening under the hood is often a detailed recording of your private life online.”

With regulators tightening data protection rules and tech firms rushing to embed AI into every corner of the internet, scrutiny of these tools is likely to grow.

For now, the researchers recommend caution. While Perplexity AI avoided the privacy pitfalls in their testing, most others did not. “If you let an AI see everything you do online,” Mandalari said, “you should assume that somewhere, somehow, that information is being stored, and maybe even shared.”
