A sudden full-screen prompt stops your scroll, asks you to wait, and hints that automated systems find your browsing suspicious.
Website defences are getting tougher, and they now bite even when you only want to read the news. Across major publishers, readers face short tests that separate human eyes from automated tools. The message that blocked your page comes from News Group Newspapers, which runs the Sun, and it shows how fast the rules have shifted.
What triggered the warning on your screen
News Group Newspapers (NGN) says its systems detected behaviour that looked automated. Fast clicks, unusual page loads, or privacy tools can all set off alarms. The company bans bots from accessing, collecting, or mining articles without permission. That block now extends to AI training, machine learning workflows, and large language models.
NGN prohibits automated access, collection, and text or data mining of its content, including for AI, machine learning, and LLMs, under its terms.
Sometimes the filter catches real people. When that happens, NGN offers readers a support route. The company also provides a path for businesses that want a licence to use its content commercially.
If you’re a human visitor who got flagged, contact the Sun’s support team at [email protected]. For commercial licences, write to [email protected].
Why normal browsing can look robotic
Anti-bot tools scan patterns and compare them with known attack signatures. They track speed, order of actions, and device signals. A short burst of odd behaviour can tip you into a grey zone and trigger a challenge.
- Very fast scrolling or clicking that beats typical human reaction times.
- Multiple page requests in seconds, especially from the same IP address.
- Blocked JavaScript, disabled cookies, or strict tracking protection.
- VPN, proxy, or Tor usage that hides your location or rotates IPs.
- Headless browsers, automation scripts, or extension conflicts.
- Time drift on your device clock, which breaks security handshakes.
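The signals above feed a risk score. As a rough illustration, here is a minimal sketch of how a heuristic scorer might weigh them; the signal names, thresholds, and weights are all hypothetical, not any vendor's real formula.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    requests_per_minute: int    # page loads seen from one IP
    avg_click_interval_ms: int  # time between clicks
    js_enabled: bool
    cookies_enabled: bool
    uses_proxy_or_vpn: bool
    clock_skew_seconds: int     # drift from internet time

def bot_score(s: Signals) -> int:
    """Return a 0-100 risk score; higher looks more automated."""
    score = 0
    if s.requests_per_minute > 30:       # burst of page requests
        score += 30
    if s.avg_click_interval_ms < 100:    # faster than human reaction
        score += 25
    if not s.js_enabled:                 # blocked JavaScript
        score += 15
    if not s.cookies_enabled:            # disabled cookies
        score += 10
    if s.uses_proxy_or_vpn:              # hidden location
        score += 10
    if abs(s.clock_skew_seconds) > 300:  # clock drift breaks handshakes
        score += 10
    return min(score, 100)

# A typical reader scores low; a scripted fetcher piles up points.
reader = Signals(4, 900, True, True, False, 2)
scraper = Signals(120, 40, False, False, True, 600)
print(bot_score(reader), bot_score(scraper))
```

Note how no single privacy choice trips the filter on its own; it is the combination of signals that pushes a visitor into the grey zone.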
The bigger story: publishers vs. scraping, AI training and ad fraud
Publishers face two costly problems. First, industrial scraping copies journalism at scale to feed AI models or rival services. Second, fake traffic drains ad budgets and pollutes audience metrics. Both push news groups to harden their sites and to demand licences.
In the UK and Europe, text and data mining sits in a legal grey area that shifts with purpose and consent. Research projects may claim narrow exemptions. Commercial use sits on firmer ground for publishers, who can restrict access through terms and technical measures. Many have turned away AI crawlers and issued new guidance that draws a sharp line between reading and harvesting.
Pay for what you use: that is the new deal publishers want from AI firms and bulk collectors.
Expect more checks on busy news days, during breaking stories, or when a site spots high-volume requests. The aim is not to punish readers. The aim is to keep costly bots out, keep servers stable, and protect value in the archive.
What happens during those 12 seconds
Most challenges run a handful of quick tests before the page loads. You may not even see a puzzle. The system looks at your browser’s behaviour, confirms it runs scripts correctly, and assigns a score. If the score looks human, access follows. If not, you see a visible test.
| Check | Typical time | What it uses | Your control |
|---|---|---|---|
| Background risk test | 2–5 seconds | Script execution, timing, page interactions | Enable JavaScript and cookies |
| Network trust check | 1–3 seconds | IP reputation, region, rate of requests | Avoid rapid refreshes; pause VPN if safe |
| Visible challenge | 5–12 seconds | Simple puzzles or one-click confirmation | Complete the prompt; keep the tab active |
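The table's three outcomes can be read as a decision flow: a background score decides whether you see nothing, a visible challenge, or a block. A hypothetical sketch, with invented thresholds:

```python
def gate(risk_score: int) -> str:
    """Map a 0-100 risk score to the reader-facing outcome."""
    if risk_score < 30:
        return "pass"       # background test only; page loads normally
    if risk_score < 70:
        return "challenge"  # visible puzzle or one-click confirmation
    return "block"          # full-screen stop; contact support

print(gate(10), gate(50), gate(90))
```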
How to get through the gate without losing your cool
You do not need to give up privacy to pass. Small tweaks usually fix the block fast. These steps help you look more like a real reader and less like an automated fetcher.
- Refresh once, not ten times. Rapid reloads raise suspicion.
- Turn on JavaScript and allow first-party cookies for the site.
- Pause VPN or anti-tracker extensions for a minute if the page fails to load.
- Keep your device clock in sync with internet time.
- Avoid opening dozens of tabs at once from the same domain.
- If the message persists, write to [email protected] with the time, your browser, and a screenshot.
- For business use or AI training, request a licence at [email protected].
What data these checks may touch, and why that matters to you
Verification tools do not need your name to work. They rely on signals. That includes your IP address, browser type, and the way your page responds to scripts. The system may store a token to mark your device as trusted for a short period. Settings differ by publisher and vendor, and each site’s privacy notice defines the guardrails.
Most bot filters rely on transient signals and short-lived tokens; they aim to distinguish behaviour, not identity.
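A short-lived trust token can work without knowing who you are: the site signs an opaque device identifier plus an expiry time, then checks the signature on later visits. This is a simplified sketch using Python's standard library; the secret, identifier, and TTL are illustrative, not any publisher's actual scheme.

```python
import hashlib
import hmac
import time

SECRET = b"publisher-side secret"  # hypothetical; real vendors manage keys

def issue_token(device_id: str, ttl_seconds: int = 3600) -> str:
    """Mark a device as trusted until the expiry timestamp."""
    expires = int(time.time()) + ttl_seconds
    payload = f"{device_id}:{expires}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{sig}"

def verify_token(token: str) -> bool:
    """Accept only unexpired tokens with a valid signature."""
    try:
        device_id, expires, sig = token.rsplit(":", 2)
    except ValueError:
        return False
    payload = f"{device_id}:{expires}"
    good = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, good):
        return False
    return int(expires) > time.time()

token = issue_token("browser-abc")
print(verify_token(token))        # valid within the TTL
print(verify_token(token + "x"))  # tampered signature fails
```

The token carries no name or account, only proof that this device recently passed a check, which is why the filter distinguishes behaviour rather than identity.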
Readers want two things: fast access and respectful tracking. Publishers want two different things: resilient sites and fair value for their work. The present wave of checks tries to balance those goals. Expect minor delays, most often between three and twelve seconds, and occasional false alarms when traffic surges.
The money angle: who pays for the gatekeeping
Every challenge costs time and server resources. Publishers justify that spend by pointing to losses from scraping and invalid traffic. Advertisers also push for cleaner audiences. They expect fewer bots, stronger verification, and transparent metrics. That pressure shapes the experience you feel at the page level.
Key terms and a quick scenario to test at home
Text and data mining means copying content at scale to extract patterns. AI training pipelines rely on that process to teach models how language works. A licence sets the price and conditions for that use. Without one, companies risk legal claims and reputational blowback.
Run a simple simulation. Open a news page in a normal browser and wait for it to load. Then try the same page with scripts blocked and a VPN switched on. Watch for the difference in load time and challenges. The first case often passes without any prompt. The second case often triggers a visible check, and sometimes a block. The result shows what these systems look for: predictable human interactions from a consistent device.
Risks, advantages and what readers should weigh up
- Risks: short delays, rare lockouts, and friction for privacy-heavy setups.
- Advantages: fewer spam comments, less ad fraud, and better site stability during major news events.
- Trade-off: you give minimal technical signals so publishers can keep bots out; in return, pages load more reliably.
If you rely on strict privacy tools, set per-site rules. Allow scripts and cookies for trusted news outlets while keeping your wider protections. If you research or build software that needs article access at scale, seek a commercial licence first. That route reduces legal risk, keeps servers stable, and ensures journalists get paid for their work.


