How we work
At ovr.news, we believe you deserve honesty about how your news is made. So here's how we work.
What we do
Mainstream news amplifies cognitive biases: negativity bias, learned helplessness, declinism. These biases make the world seem worse than it is. We don't counter that with cheerleading. We use evidence-based lenses that correct each bias by surfacing what it hides.
Where does the news come from?
We gather news from over 1,400 sources worldwide, via public RSS feeds, news websites, and public APIs. Think science journalism, solutions-focused media, universities, and international organizations.
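Since our code is open source, here is a minimal sketch of what reading a public RSS feed looks like. The feed content and field names below are illustrative only, not actual ovr.news sources or code; real feeds vary in structure.

```python
# Minimal sketch: extracting items from an RSS 2.0 feed with the
# standard library. The sample feed below is made up for illustration.
import xml.etree.ElementTree as ET

def parse_rss_items(rss_xml: str) -> list[dict]:
    """Extract title, link, and summary from each <item> in an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        items.append({
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
            # RSS feeds often carry only a short excerpt here
            "summary": item.findtext("description", default=""),
        })
    return items

sample = """<rss version="2.0"><channel>
  <title>Example feed</title>
  <item>
    <title>Wetlands recovering</title>
    <link>https://example.org/wetlands</link>
    <description>A short excerpt...</description>
  </item>
</channel></rss>"""

for entry in parse_rss_items(sample):
    print(entry["title"], entry["link"])
```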
Important: We respect copyright and paywalls. We only read what's publicly available, and always link to the original source so you can read more.
RSS feeds often contain only a brief excerpt. To properly evaluate and summarize an article, we attempt to retrieve the full text from the source website. If a website blocks or denies access, we respect that: no retries, no workarounds. Articles where we cannot retrieve sufficient content are excluded from our selection.
For publishers: Want your content removed from ovr.news? Use our contact form or see our publisher page. We remove sources within 48 hours.
How does AI select stories?
Every day we collect thousands of articles. Each one passes through multiple lenses, and each lens looks for a different kind of evidence that mainstream news tends to miss:
- Thriving: People thriving, health improving, lives getting better
- Belonging: Community bonds, rootedness, intergenerational connection
- Recovery: Ecosystems healing, species returning, nature bouncing back
- Solutions: Technology and policy solutions working at scale
- Discovery: Cultural heritage, traditions, cross-cultural connections
AI scores every article on multiple dimensions specific to each lens. These scores determine which articles appear on the site and in what order. The selection is driven by AI, not a human editor. The best from all lenses appear in Breakthroughs.
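The scoring-and-ranking idea above can be sketched like this. The dimension names, weights, and threshold are made-up placeholders; only the lens names come from the list above.

```python
# Illustrative sketch of lens-based scoring and ranking. The dimensions
# and scores here are invented examples, not ovr.news's real criteria.
LENS_DIMENSIONS = {
    "Thriving": ["wellbeing", "evidence_strength"],
    "Recovery": ["ecosystem_impact", "evidence_strength"],
}

def lens_score(scores: dict[str, float], lens: str) -> float:
    """Average the AI's per-dimension scores (0..1 each) for one lens."""
    dims = LENS_DIMENSIONS[lens]
    return sum(scores[d] for d in dims) / len(dims)

def rank_articles(articles: list[dict], lens: str, top_n: int = 2) -> list[str]:
    """Order articles by lens score; the scores, not a human, decide placement."""
    ranked = sorted(articles, key=lambda a: lens_score(a["scores"], lens),
                    reverse=True)
    return [a["title"] for a in ranked[:top_n]]

articles = [
    {"title": "Wolves return to the valley",
     "scores": {"ecosystem_impact": 0.9, "evidence_strength": 0.8}},
    {"title": "River cleanup shows early signs",
     "scores": {"ecosystem_impact": 0.6, "evidence_strength": 0.5}},
]
print(rank_articles(articles, "Recovery"))  # highest-scoring first
```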
Each article shows why it was selected, in a one-sentence rationale that makes our editorial process transparent.
Summary and translation
Articles come from sources worldwide. We use AI to create clear English summaries and Dutch translations, so readers quickly understand what each story is about. Translation uses professional services (DeepL) and AI.
Note: AI can make mistakes. That's why we always link to the original, so you can verify.
What AI doesn't do
AI is a tool, not an editor. There are things we deliberately don't leave to AI:
- Whether numbers in an article are accurate
- Whether a source is reliable
- Whether a claim is proven
We leave that judgment to you. We give you the context to decide for yourself.
Scores are visible
For each article, you can see how the AI evaluated it. That transparency is intentional. We want you to understand why a story was selected. You don't have to agree with us.
Open source
Our code is public. You can see exactly how everything works. No black box, no secrets.
View the source code on GitHub →
Why we tell you this
Not because we have to. Because we believe honesty is the foundation of trust. And trust is what we want to build, story by story.
Questions?
Have questions about how we work? Get in touch via our GitHub project.
Last updated: March 2026