
Website to Reddit Posts Automation

Build an automation that crawls any website and generates 15 authentic, community-friendly Reddit posts for founder subreddits. Perfect for sharing product updates without manual content adaptation.

20 min read · Intermediate · December 26, 2025
Website to Reddit Posts Workflow: 15 Reddit posts per URL, ~60s processing time

Two-Layer Architecture

Flow Playground

The crawling and content generation engine. Analyzes websites, extracts key pages, and generates Reddit posts.

We'll build this first

App Playground

The user interface. Simple URL input that displays generated posts in a clean gallery.

We'll build this second

Part 1: The Website Crawler & Reddit Generator Flow

This flow crawls a website, identifies the 3 most marketing-relevant pages, and generates 5 unique Reddit posts for each page (15 total). Let's see the complete flow first.

Complete Flow (Left to Right)

• API Input: receives url (string), the entry point
• Crawler: depth 1, max 3 pages, same-site mapping
• Extract URLs: AI analysis picks the top 3 pages
• Split: fans out into 3 parallel threads
• Fetch HTML: gets each page (3x parallel)
• Extract Text: cleans the HTML into text (3x parallel)
• Generate Posts: 5 AI-written posts per page (3x parallel)
• Flatten: combines all results into one array
• Merge: joins the parallel branches
• Iterator: loops over each post
• Dataset (Write): saves posts to storage
Data flows left to right. The flow crawls → analyzes → splits → fetches → generates → stores.

Breaking Down Each Node

API Input
Field:
• url (string, required)
What it does:

Receives website URL from the app. Starting point for the entire process.

Example:
{ "url": "https://yourproduct.com" }
Crawler
Settings:
Mode: Auto
Depth: 1
Max pages: 3
Same site only: Yes
What it does:

Crawls the website and maps its structure with page metadata.

Why these settings:
  • Depth 1: homepage + 1 level
  • Max 3 pages: fast results
  • Same site only: stay in the domain
Output:

Tree structure of all URLs with metadata.
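The tutorial doesn't show the crawler's exact output, but a tree like the following (field names are illustrative, not Evaligo's real schema) gives a feel for what the next node receives:

```python
# Hypothetical shape of the Crawler node's output: a small tree of
# same-site URLs with metadata, capped by the "max pages: 3" setting.
crawl_tree = {
    "url": "https://yourproduct.com",
    "title": "Home",
    "children": [
        {"url": "https://yourproduct.com/features", "title": "Features", "children": []},
        {"url": "https://yourproduct.com/pricing", "title": "Pricing", "children": []},
    ],
}

def count_pages(node: dict) -> int:
    """Counts every page in the tree (root plus descendants)."""
    return 1 + sum(count_pages(c) for c in node["children"])

print(count_pages(crawl_tree))  # 3, within the "max pages: 3" limit
```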

AI Prompt
Extract URLs
Model: GPT-4.1
Temp: 0.7
Output: Top 3 URLs
What it does:

AI analyzes the crawl tree and picks the 3 most marketing-relevant pages.

Looks for:
✓ Include:
• Product features
• Documentation
• Announcements
• Updates
✗ Exclude:
• Terms, privacy
• Signup/login
• Utility pages
Output:

Array of 3 URLs with summaries.
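As a concrete illustration, the prompt's output might look like this (URLs, field names, and summaries are hypothetical):

```python
# Hypothetical output of the "Extract URLs" prompt: three
# marketing-relevant pages, each with a one-line summary.
top_pages = [
    {"url": "https://yourproduct.com/features", "summary": "Core product features"},
    {"url": "https://yourproduct.com/changelog", "summary": "Latest release announcements"},
    {"url": "https://yourproduct.com/docs", "summary": "Developer documentation"},
]

# Note what's absent: no terms, privacy, signup, or login pages,
# per the prompt's exclude list. The Array Splitter fans these
# three entries out into one thread each.
print(len(top_pages))  # 3
```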

Array Splitter
Input: Array of 3 URLs
Creates 3 parallel threads
What it does:

Splits 3 URLs into separate threads. Processes all pages simultaneously.

Sequential: 3 × 20s = 60s
Parallel: ~20s total → 3x faster!
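The speedup is the standard fan-out pattern. A minimal Python sketch with simulated 0.1s fetches (stand-ins for real ~20s page loads) shows why three threads finish in roughly the time of one:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url: str) -> str:
    """Stand-in for a real HTTP fetch: sleeps to simulate latency."""
    time.sleep(0.1)
    return f"<html>{url}</html>"

urls = ["https://a.example", "https://b.example", "https://c.example"]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=3) as pool:
    pages = list(pool.map(fetch, urls))
elapsed = time.perf_counter() - start

# The three 0.1s fetches overlap, so wall time is ~0.1s, not ~0.3s.
print(f"{elapsed:.2f}s for {len(pages)} pages")
```

Threads work here because the task is I/O-bound (waiting on the network), which is exactly the situation the Array Splitter exploits.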
Fetch HTML
Gets raw HTML for each URL
HTML Text Extractor
Extracts clean text content
What they do:

Fetch HTML → Extract clean text. Removes tags, scripts, styles.

Why:
  • Raw HTML confuses AI
  • Clean text = better posts
  • Lower token costs
Output:

500-2000 words of clean text per page.
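If you wanted to reproduce the clean-up step outside the visual builder, a minimal sketch with Python's standard-library `html.parser` (not Evaligo's actual extractor) would strip tags, scripts, and styles like so:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())

def extract_text(html: str) -> str:
    """Returns only the visible text of an HTML document."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)

print(extract_text("<p>Hello</p><script>var x=1;</script><p>World</p>"))
```

The AI sees "Hello World", never `var x=1;`, which is the whole point: fewer tokens, less noise, better posts.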

AI Prompt
Generate Reddit Posts
Model: GPT-4.1
Temp: 0.7
Output: 5 posts per page
3x parallel
What it does:

Creates 5 authentic Reddit posts per page. Runs 3x in parallel = 15 total posts.

Input:
• html: clean page text
• url: source page URL
Output:
• Subreddit name
• Post title
• Post body
Key instructions:
✓ Personal tone
✓ Show emotions
✓ Short paragraphs
✗ No marketing speak
✗ Not too perfect
✗ Avoid AI tone
Why it works: Explicit anti-perfection instructions + temp 0.7 = authentic founder voice
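The tutorial lists the instructions but not the full prompt text; a hypothetical template that encodes those dos and don'ts (wording entirely illustrative) might look like:

```python
# Illustrative prompt template. The tutorial's actual prompt text is
# not shown; this only encodes the listed instructions.
PROMPT_TEMPLATE = """You are a startup founder sharing on Reddit.
From the page text below, write 5 posts. For each one, pick a
fitting subreddit, a title, and a body.

Do: personal tone, show emotions, short paragraphs.
Don't: marketing speak, overly polished writing, AI-sounding phrasing.

Source URL: {url}

Page text:
{html}
"""

settings = {"model": "GPT-4.1", "temperature": 0.7}

prompt = PROMPT_TEMPLATE.format(
    url="https://yourproduct.com/features",
    html="(clean page text here)",
)
```

Temperature 0.7 keeps wording varied across the 5 posts without drifting off the page content.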
Array Flatten
Combines nested arrays into single array
What it does:

Combines 3 arrays into one. [[5], [5], [5]] → [15 posts]

Why:

Iterator needs a flat array, not nested ones.
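In code terms, the flatten step is a one-liner; this sketch mirrors the [[5], [5], [5]] → [15 posts] shape:

```python
# Each of the 3 parallel branches returns a list of 5 posts.
nested = [[{"title": f"Post {b}-{i}"} for i in range(5)] for b in range(3)]

# Flatten [[5], [5], [5]] into a single list of 15 posts.
flat = [post for branch in nested for post in branch]

print(len(flat))  # 15: one flat list the Iterator can walk
```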

Iterator
Loops through all 15 posts one by one
What it does:

Iterates over the flattened array, handing each of the 15 posts to the Dataset node in turn.

Why sequential:

Dataset writes are fast (~50ms). 15 writes = 1 second. No need to parallelize.

Dataset (Write)
Post Collector
Saves each post with title, body, subreddit
What it does:

Saves all 15 posts to cloud storage.

Schema:
{ "title": "string", "body": "string", "subreddit": "string" }
Why:
  • Display in App UI gallery
  • Post library for reference
  • Filter by subreddit
  • History of all content
Flow Summary

Input (URL) → Crawler (map site) → Extract URLs (top 3 pages) → Split (3 threads) → Fetch + Extract (per page) → Generate Posts (5 per page, 15 total) → Flatten → Iterator → Dataset (save all posts)

Time: ~60s with parallel processing
Output: 15 posts, ready for Reddit

Part 2: The App Playground (User Interface)

Now that we have the flow built, let's create the simple UI that lets users input a URL and view generated posts.

Complete App Structure

Input Layer
• Text Input: field for the website address
Flow Execution Layer
• Flow Execution: "Crawler & Generator", calls the Part 1 flow
Data Layer
• Dataset Connection: the "Post Collector" dataset
• Data Display: gallery that shows the posts

How It Works Together:

1. User Input: User pastes a website URL (e.g., their product landing page)
2. Flow Execution: Triggers the crawler & generator flow from Part 1
3. Processing: Flow crawls site, analyzes pages, generates 15 Reddit posts
4. Storage: Posts are saved to "Post Collector" dataset
5. Display: Dataset Connection feeds posts to Data Display component
6. Result: User sees all 15 posts in a clean gallery with title, body, and target subreddit
Complete User Journey:
  1. Paste URL: "https://myproduct.com"
  2. Click "Create reddit posts" button
  3. Wait ~60 seconds (progress indicator shows)
  4. View 15 generated Reddit posts
  5. Copy posts to share in r/entrepreneur, r/SideProject, etc.

Deployment

REST API

Deploy as an API endpoint

POST api.evaligo.com/app/your-app-id
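A minimal, standard-library sketch of calling the deployed endpoint from Python. The `https://` scheme, the `{"url": ...}` payload (matching the flow's API Input field), and the absence of auth headers are assumptions; check your deployment settings:

```python
import json
import urllib.request

def build_request(app_url: str, site_url: str) -> urllib.request.Request:
    """Builds the POST request for a deployed app endpoint.

    Payload shape ({"url": ...}) mirrors the flow's API Input field;
    auth headers and response format depend on your deployment.
    """
    body = json.dumps({"url": site_url}).encode("utf-8")
    return urllib.request.Request(
        app_url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("https://api.evaligo.com/app/your-app-id",
                    "https://yourproduct.com")
# resp = urllib.request.urlopen(req)  # then read the 15 generated posts
```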

Embed Widget

Embed the App UI in your website

<iframe src="app.evaligo.com/embed/..." />

Ready to Automate Your Reddit Presence?

Start building with Evaligo's visual workflow builder. No coding required.

No credit card · Free tier · Deploy in 1 click
