In this hands-on tutorial, you'll build an automated website content analyzer that discovers all pages on a website, extracts their content, and generates AI-powered insights. This is a practical example of combining multiple nodes to create a powerful workflow.
What You'll Build
By the end of this tutorial, you'll have a flow that:
- Maps all pages on a website
- Scrapes content from each page
- Cleans HTML into readable text
- Analyzes each page with AI
- Generates a comprehensive site summary
- Saves all results to a dataset
Prerequisites
- Evaligo account (free tier works fine)
- Basic familiarity with the platform interface
- No prompts created yet (we'll build them together in Step 1)
Step 1: Create Your Prompts
First, we need to create the prompts we'll use in our flow.
Prompt 1: Page Analyzer
1. Go to Playground: navigate to the Prompt Playground.
2. Create a new prompt: click "New Prompt".
3. Enter the prompt template: copy the template below.
4. Test it: try it with sample content.
5. Save it as "Page Analyzer" for use in the flow.
```text
Analyze this webpage content and provide a structured summary:

URL: {{url}}
Content: {{content}}

Provide:
1. Main topic or purpose
2. Key points (3-5 bullet points)
3. Target audience
4. Content type (article, product page, documentation, etc.)

Be concise and factual.
```
Prompt 2: Site Summarizer
Create a second prompt for summarizing all pages:
```text
Analyze these page summaries from a website and create a comprehensive overview:

{{pageAnalyses}}

Provide:
1. Overall purpose of the website
2. Main content categories
3. Target audience
4. Key themes and topics
5. Recommendations for content improvement

Total pages analyzed: {{pageCount}}
```
Save this as "Site Summarizer".
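To build intuition for what the Prompt nodes do with these templates, here is a minimal plain-Python sketch of `{{variable}}` substitution. This is not Evaligo's actual rendering engine — just an assumed, simplified equivalent.

```python
# Minimal sketch of how {{variable}} placeholders in the two prompt
# templates are filled at run time. Evaligo's Prompt node does this
# internally; this version is only for intuition.
import re

def render(template: str, variables: dict) -> str:
    """Replace every {{name}} in the template with its value."""
    def sub(match):
        name = match.group(1).strip()
        if name not in variables:
            raise KeyError(f"missing template variable: {name}")
        return str(variables[name])
    return re.sub(r"\{\{(.*?)\}\}", sub, template)

page_analyzer = "URL: {{url}}\nContent: {{content}}"
print(render(page_analyzer, {"url": "https://example.com", "content": "Hello"}))
```

If a mapped variable is missing at run time, the node has nothing to substitute — which is why the mapping steps in Step 2 matter.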
Step 2: Build the Flow
Now let's build the actual workflow.
Add API Input Node
1. Open the Flow Playground: navigate to AI Flows.
2. Drag an API Input node from the node palette.
3. Configure the input: add a field "websiteUrl" (string, required).
Add Website Mapper Node
1. Drag a Website Mapper node onto the canvas.
2. Connect API Input to Website Mapper: drag from API Input's output handle.
3. Map the variable: map `_input.websiteUrl` to `url`.
4. Configure settings: set max pages to 10 (for testing).
Add Array Splitter
1. Drag an Array Splitter node: this will process each URL individually.
2. Connect the Website Mapper output to the Array Splitter input.
3. Map the array: map `out.urls` to the splitter.
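The split-then-collect pattern at the heart of this flow can be sketched in a few lines of Python. The function names below are illustrative, not Evaligo APIs: the Array Splitter fans a list out into one run per item, downstream nodes process each item, and the Array Flatten node (added later) gathers the results back into one array.

```python
# Sketch of the splitter/flatten pattern: fan out a list, process each
# item, collect results. Evaligo's nodes do this for you; these function
# names are stand-ins for illustration only.

def split(urls):
    # Array Splitter: one downstream execution per URL.
    for url in urls:
        yield url

def analyze(url):
    # Stand-in for the Page Scraper -> HTML Text Extractor -> Prompt chain.
    return {"url": url, "summary": f"summary of {url}"}

def flatten(results):
    # Array Flatten: collect every branch result into one array.
    return list(results)

analyses = flatten(analyze(u) for u in split(["https://a.example", "https://b.example"]))
print(len(analyses))  # one analysis per mapped URL
```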
Add Page Scraper
1. Drag a Page Scraper node after the Array Splitter.
2. Connect the nodes: each split URL will be scraped.
3. Map the URL: map `out` (the URL) to `url`.
Add HTML Text Extractor
1. Drag an HTML Text Extractor to clean the scraped HTML.
2. Connect the Page Scraper output to the extractor.
3. Map the HTML: map `out.html` to `html`.
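For intuition about what "cleaning HTML into readable text" means, here is a stdlib-only sketch of the kind of work the HTML Text Extractor node does: drop tags, scripts, and styles, keep the visible text. Evaligo's node is the real implementation; this is an assumed, simplified equivalent.

```python
# Minimal sketch of HTML-to-text extraction using only the Python stdlib.
# Not Evaligo's implementation -- an illustrative equivalent.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # depth inside <script>/<style> blocks

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        # Keep visible, non-empty text only.
        if not self._skip and data.strip():
            self.parts.append(data.strip())

def html_to_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)

print(html_to_text("<html><script>x()</script><body><h1>Hi</h1><p>There</p></body></html>"))
# -> Hi There
```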
Add Prompt Node (Page Analyzer)
1. Drag a Prompt node onto the canvas.
2. Select "Page Analyzer" from your saved prompts.
3. Map the variables: map `out.text` to `content`, and map the original URL to `url`.
Add Array Flatten
1. Drag an Array Flatten node to collect all page analyses.
2. Connect the Prompt node's output: the flatten node automatically collects all results.
Add Second Prompt (Site Summarizer)
1. Drag another Prompt node below the Array Flatten.
2. Select "Site Summarizer" from your saved prompts.
3. Map the flattened results: map `out` (the array) to `pageAnalyses`, and `out.count` to `pageCount`.
Add Dataset Sink
1. Drag a Dataset Sink to save the results.
2. Connect the Site Summarizer output to it.
3. Create a new dataset named "Website Analyses".
4. Map the fields: map the summary and metadata to dataset columns.
Add API Output
1. Drag an API Output node to define what the API returns.
2. Map the output fields:
   - `summary`: from the Site Summarizer
   - `pagesAnalyzed`: from the Array Flatten count
   - `websiteUrl`: from the original input
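The final payload assembled by the API Output node can be sketched as a small function. The field names come from the mapping above; the exact JSON envelope Evaligo wraps around them is an assumption here.

```python
# Sketch of the response shape the API Output mapping produces. Field
# names match the mapping above; the surrounding envelope is assumed.
import json

def build_api_output(site_summary: str, analyses: list, website_url: str) -> dict:
    return {
        "summary": site_summary,
        "pagesAnalyzed": len(analyses),
        "websiteUrl": website_url,
    }

payload = build_api_output(
    "A developer tools site.",
    [{"url": "https://a.example"}],
    "https://a.example",
)
print(json.dumps(payload, indent=2))
```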
Step 3: Test Your Flow
Run in Playground
1. Click "Run Flow" in the top toolbar.
2. Enter a test URL: try "https://evaligo.com" or your own site.
3. Watch the execution: nodes light up as they process.
4. Review the results: click nodes to inspect their outputs.
Verify Output
Check that you receive:
- Individual page analyses in Array Flatten
- Comprehensive summary in Site Summarizer
- Data saved in Dataset Sink
- Structured JSON in API Output
Step 4: Deploy as API
1. Click "Deploy" in the top toolbar.
2. Name your API "Website Content Analyzer".
3. Copy the API endpoint: you'll use this to call your flow.
4. Copy the API key for authentication.
Test API Call
```shell
curl -X POST https://api.evaligo.com/flows/your-flow-id/execute-async \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "websiteUrl": "https://example.com"
  }'
```
Step 5: Monitor and Improve
Review Results
- Check the "Website Analyses" dataset
- Review individual page analyses for quality
- Check site summary for accuracy
- Note execution time and costs
Optimize Performance
Consider these improvements:
- Enable parallel processing for Array Splitter
- Add caching for frequently analyzed sites
- Filter out certain page types (login, search)
- Increase max pages for full site analysis
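The "parallel processing" option is worth a quick illustration: page scraping and analysis are network-bound, so running several URLs at once cuts wall-clock time roughly in proportion to the worker count. Evaligo's Array Splitter setting handles this internally; the stdlib sketch below only shows the idea, with `analyze_page` as a hypothetical stand-in for the scrape-extract-prompt chain.

```python
# Sketch of parallel fan-out over URLs using the Python stdlib.
# analyze_page is a hypothetical stand-in for the per-page node chain.
from concurrent.futures import ThreadPoolExecutor

def analyze_page(url: str) -> dict:
    # Real work here would be network-bound (scrape + LLM call),
    # which is where thread-based parallelism pays off.
    return {"url": url, "summary": f"summary of {url}"}

def analyze_all(urls, max_workers=4):
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order in its results.
        return list(pool.map(analyze_page, urls))

results = analyze_all(["https://a.example", "https://b.example", "https://c.example"])
print([r["url"] for r in results])
```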
Enhance Analysis
- Add sentiment analysis
- Extract structured data (prices, dates)
- Compare changes over time
- Generate SEO recommendations
Common Issues
Empty Results
Problem: No pages found or scraped
Solution:
- Verify URL is accessible
- Check Website Mapper output
- Try a different website
- Increase timeout settings
Timeout Errors
Problem: Flow times out
Solution:
- Reduce max pages temporarily
- Use async execution mode
- Enable parallel processing
Poor Analysis Quality
Problem: AI summaries are generic
Solution:
- Refine prompts with more specific instructions
- Test prompts in Playground first
- Add examples to prompt template
- Use structured output schema
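If you do attach a structured output schema to the Page Analyzer, it would mirror the four items the prompt asks for. The tutorial doesn't show Evaligo's exact schema format, so treat this JSON Schema shape as an illustrative assumption.

```python
# Illustrative JSON Schema for the Page Analyzer's four requested items.
# The schema format Evaligo accepts may differ; this is an assumed shape.
import json

page_analysis_schema = {
    "type": "object",
    "properties": {
        "mainTopic": {"type": "string"},
        "keyPoints": {
            "type": "array",
            "items": {"type": "string"},
            "minItems": 3,
            "maxItems": 5,
        },
        "targetAudience": {"type": "string"},
        "contentType": {
            "type": "string",
            "enum": ["article", "product page", "documentation", "other"],
        },
    },
    "required": ["mainTopic", "keyPoints", "targetAudience", "contentType"],
}

print(json.dumps(page_analysis_schema, indent=2))
```

Constraining `keyPoints` to 3-5 items and `contentType` to a fixed enum is what pushes the model away from generic free-text answers.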
Next Steps
Extend This Flow
- Add competitor analysis (compare multiple sites)
- Schedule regular scans to detect changes
- Generate reports in different formats
- Build a dashboard with historical data
Learn More
- Try the SEO Audit Tutorial
- Explore batch processing patterns
- Set up monitoring and alerts
- Integrate with your existing tools