
Written by Travis Avery

Founder, SaaS Builder, & Father

The Automated AI Newsroom: How I Built a 24/7 News Scout with Google AntiGravity

PUBLISHED: DEC 2025 // READ TIME: 5 MIN // TOPIC: AI
Automated AI Newsroom Cover

In software, "gravity" is the heavy, invisible force of maintenance. It's the manual data entry, the daily content updates, and the repetitive tasks that keep a solo founder tethered to their laptop when they should be building.

For my community site, Postal Soup, gravity was becoming a problem. I wanted to keep my audience of 300,000+ mail carriers updated on the latest contract negotiations and safety alerts. But writing daily news articles? That’s a full-time job I didn't have.

I needed a way to defy that gravity.

Enter Google AntiGravity (Google's agent-first IDE) and the concept of "Agentic Workflows." Instead of hiring a team, I built a serverless newsroom that runs on autopilot. Here’s how I did it.

The Architecture of an AI Newsroom

The goal was simple: Automate the discovery and drafting of news, but keep the approval human. I didn't want a "spam bot" flooding the internet with hallucinated garbage. I wanted a News Scout that reports to me.

The system has three core components:

  1. The Scout: A script that continuously scans the horizon for signal.
  2. The Editor: An AI agent that filters noise and pitches stories.
  3. The Writer: An agent that drafts the final copy upon approval.

Step 1: The News Scout (`scan-news.js`)

Everything starts with data. I built a Node.js script that acts as my 24/7 field reporter. It doesn’t just blindly scrape; it listens to specific frequencies.

// scan-news.js snippet
const SOURCES = {
    googleNews: 'https://news.google.com/rss/search?q=USPS+contract...',
    reddit: 'https://www.reddit.com/r/USPS/top.json?t=day&limit=10'
};

async function scanAndReport() {
    // Fetch Reddit's JSON feed (the Google News RSS feed goes through an RSS parser)
    const res = await fetch(SOURCES.reddit, { headers: { 'User-Agent': 'postal-soup-scout' } });
    const { data } = await res.json();

    // Keep only "breaking" signals: recent posts with real traction (thresholds illustrative)
    const ONE_DAY = 24 * 60 * 60; // seconds
    return data.children
        .map((child) => child.data)
        .filter((post) => post.ups >= 50 && Date.now() / 1000 - post.created_utc < ONE_DAY);
}

The script pulls from Google News RSS feeds and the top posts on r/USPS. But raw data is noisy. A meme getting 500 upvotes isn't news. A contract update getting 50 upvotes is.
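
One way to encode that heuristic before anything ever reaches the AI Editor is to weight topical keywords above raw upvote counts. The keyword list and thresholds below are illustrative, not the exact rules behind Postal Soup:

// Illustrative pre-filter: topical keywords matter more than raw upvotes
const SIGNAL_KEYWORDS = ['contract', 'arbitration', 'nalc', 'apwu', 'grievance', 'safety'];

function looksLikeNews(post) {
    const text = `${post.title} ${post.selftext || ''}`.toLowerCase();
    const mentionsSignal = SIGNAL_KEYWORDS.some((kw) => text.includes(kw));

    // A contract update at 50 upvotes clears the bar; a meme at 500 does not
    return mentionsSignal && post.ups >= 50;
}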

Postal Soup Dispatch Index Page showing automated news aggregation
The "News Scout" automatically aggregates potential stories into a Dispatch Index for review.
View Postal Soup Live

Step 2: The "Human-in-the-Loop" Editor

This is where Google AntiGravity shines. I used it to design an intelligent filter powered by OpenAI's GPT-4. The script sends every candidate story to the LLM with a specific prompt:

"You are the Editor of 'Postal Soup'. You only approve stories that are URGENT news for mail carriers. Reject generic advice, memes, or customer complaints."

If the AI Editor approves a story, it doesn't publish it. It creates a GitHub Issue tagged pending-review.
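
Stripped to its essentials, the Editor is two API calls: one to GPT-4 for the verdict, one to GitHub to file the pitch. Here's a minimal sketch using the openai and @octokit/rest Node packages; the owner/repo values and the APPROVE/REJECT reply convention are placeholders, not the exact setup behind Postal Soup:

// editor sketch: classify a candidate story, then pitch it as a GitHub Issue
import OpenAI from 'openai';
import { Octokit } from '@octokit/rest';

const openai = new OpenAI();
const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

async function pitchStory(candidate) {
    const verdict = await openai.chat.completions.create({
        model: 'gpt-4',
        messages: [
            { role: 'system', content: "You are the Editor of 'Postal Soup'. You only approve stories that are URGENT news for mail carriers. Reject generic advice, memes, or customer complaints. Reply APPROVE or REJECT." },
            { role: 'user', content: `${candidate.title}\n${candidate.url}` }
        ]
    });

    if (!verdict.choices[0].message.content.includes('APPROVE')) return;

    // Approved stories become GitHub Issues awaiting a human decision, not published posts
    await octokit.rest.issues.create({
        owner: 'travis-avery',        // placeholder owner/repo
        repo: 'postal-soup',
        title: `News Candidate: ${candidate.title}`,
        body: candidate.url,
        labels: ['pending-review']
    });
}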

This was my "Eureka" moment. I didn't need to build a complex admin dashboard. GitHub is my CMS. My phone buzzes, I see a new Issue titled "News Candidate: NALC Contract Arbitration Update," and I have two choices: close it (reject) or label it "Approve."

Step 3: The Writer Agent (`generate-post.js`)

Once I click "Approve," the gravity really turns off. A GitHub Action triggers the second agent: The Writer.

This script reads the approved issue, researches the original source URL, and drafts a full news article. But it doesn't just guess at the tone. I fed it a "Style Guide" (`news-skills.md`) that teaches it to write like a veteran journalist—inverted pyramid structure, factual tone, no fluff.
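
Here's a rough sketch of that step, assuming the GitHub Action (fired when the "Approve" label lands) passes the issue number to the script and that news-skills.md sits in the repo root; the owner/repo values are placeholders:

// writer sketch: turn an approved Issue into a drafted article
import fs from 'node:fs/promises';
import OpenAI from 'openai';
import { Octokit } from '@octokit/rest';

const openai = new OpenAI();
const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

async function draftArticle(issueNumber) {
    // The approved Issue carries the headline and the original source URL
    const { data: issue } = await octokit.rest.issues.get({
        owner: 'travis-avery', repo: 'postal-soup',   // placeholder owner/repo
        issue_number: issueNumber
    });

    // The style guide enforces the tone: inverted pyramid, factual, no fluff
    const styleGuide = await fs.readFile('news-skills.md', 'utf8');

    const draft = await openai.chat.completions.create({
        model: 'gpt-4',
        messages: [
            { role: 'system', content: styleGuide },
            { role: 'user', content: `Write a news article from this approved pitch:\n${issue.title}\n${issue.body}` }
        ]
    });

    return draft.choices[0].message.content;
}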

It even handles the visuals. Using DALL-E 3, it generates a photojournalistic header image to match the story, avoiding the cartoony "AI look" by requesting "documentary photography" styles.

// generate-post.js
import OpenAI from 'openai';
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Ask for documentary-style photography to avoid the cartoony "AI look"
const imageResponse = await openai.images.generate({
    model: 'dall-e-3',
    prompt: `Professional news photo: ${article.imagePrompt}. Style: Documentary photography...`,
});
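
DALL-E 3 returns a hosted URL rather than the file itself, so the last step is pulling the image down into the site repo. A minimal version of that, assuming a hypothetical article.slug field and a public/images/ directory:

// Download the generated header image into the site repo
import fs from 'node:fs/promises';

const imageUrl = imageResponse.data[0].url; // DALL-E 3 responds with a hosted URL by default
const imageBytes = await (await fetch(imageUrl)).arrayBuffer();
await fs.writeFile(`public/images/${article.slug}.png`, Buffer.from(imageBytes));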

The Result: Logic Over Labor

Before this system, keeping Postal Soup updated took 5-10 hours a week. Now, it takes about 5 minutes of clicking "Approve" on GitHub.

This is the promise of Google AntiGravity. It allows developers to lift the heavy burdens of maintenance so we can focus on high-level architecture. We aren't just writing code anymore; we're choreographing agents to do the work for us.

Postal Soup Blog Page with automated articles
Fully automated news releases and blog posts, live on the site.
Visit postalsoup.com

The gravity is gone. The sky is the limit.

Conclusion

Building this automated newsroom proved that we can now treat content generation as an engineering problem. With tools like AntiGravity, we can architect solutions that are resilient, scalable, and most importantly, autonomous. If you're still manually curating content in 2026, it might be time to let the agents take over.

Looking to build your own agentic workflows?

I'm available for projects and consulting.

Work with me at Thriving AI

Image Generation Prompts

Here are the prompts I used to generate the visuals for this article:

  • The "AntiGravity" Hero:
    "A cinematic, surreal 3D render of a glowing, futuristic 'Newsroom' desk floating weightlessly in a clean, dark void. Papers and digital screens are hovering in mid-air, organized by invisible forces. The lighting is moody, with accents of neon purple and cool blue. Represents 'AntiGravity' automation."
  • The "Scout" Diagram:
    "A clean, high-tech isometric diagram on a dark background. Shows a central glowing node labeled 'AI Brain' connected to satellite nodes labeled 'Google News', 'Reddit', and 'GitHub'. Data streams (represented by light particles) flow from the satellites into the center."
  • The "Human-in-the-Loop":
    "A split-screen composition. On the left, a chaotic mess of binary code, newspapers, and noise. On the right, a clean, organized, single 'Approve' button glowing green. A human hand is poised to press the button. Symbolizes clarity from chaos."