The Problem

James runs a five-person online electronics shop. They compete with several larger retailers on roughly 250 SKUs — mostly consumer audio, cables, and accessories. Every morning, he or someone on his team would open a spreadsheet, manually visit four competitor websites, find matching products, and update the sheet with their current prices.

When we spoke to him he'd been doing this for two years. He knew it was a waste of time. He also knew that on the days nobody got round to it — bank holidays, busy periods — they were flying blind. Twice in six months they'd missed a competitor running a weekend flash sale and watched their conversion rate tank without understanding why until Monday morning.

"The data exists," he said. "I just can't see it fast enough to act on it."

Proper repricing tools exist, but they're expensive, built for Amazon, and overkill for a shop his size. He didn't need dynamic repricing. He just needed to know what was happening so he could make a decision.

What He Actually Needed

After scoping the problem, the requirements were simple: no repricing, no complex rules engine. Just "here's what changed, here's where you're exposed, here's where you have room to move."

What We Built

The full pipeline looks like this:

price monitor pipeline — n8n, runs daily at 06:00

Cron Trigger (06:00 daily)
   ──▶ 🕷️ Scraper ×4 (HTTP + CSS selectors)
   ──▶ 🛒 Shopify API (own prices)
   ──▶ 🔍 Compare + Flag (risk / opportunity)
   ──▶ 📊 Log to Sheets (historic record)
   ──▶ 💬 Slack Digest (formatted summary)
↳ Team wakes up to a Slack message: top 10 price gaps, any new competitor drops, and a "clear" signal if nothing material changed

Each morning at 6 AM, n8n wakes up and hits four competitor sites using HTTP request nodes with CSS selector scraping. No Puppeteer, no headless browser — these sites render prices in the HTML, so a plain fetch is fast and reliable. If a site goes down or changes its markup, n8n flags it in the digest rather than silently failing.
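To make the extraction step concrete, here's a minimal sketch in plain JavaScript. In the real workflow this is an n8n HTTP Request node plus CSS selectors; the snippet reduces the selector to a regex so it's self-contained, and the sample markup and function name are ours, not the production setup.

```javascript
// Sketch of the extraction step: pull a price out of static HTML.
// The regex stands in for a CSS selector; the markup is invented
// for illustration.

function extractPrice(html, pattern) {
  const match = html.match(pattern);
  if (!match) return null;              // selector miss → surfaces in the digest
  // Strip currency symbols and thousands separators, keep digits and the dot.
  const cleaned = match[1].replace(/[^0-9.]/g, "");
  const price = parseFloat(cleaned);
  return Number.isFinite(price) ? price : null;
}

const sampleHtml = '<span class="product-price">£1,049.99</span>';
console.log(extractPrice(sampleHtml, /class="product-price">([^<]+)</)); // 1049.99
```

The `null` return is the important part: a failed match is how the workflow knows a site changed its markup, which is what feeds the "flag it in the digest rather than silently failing" behaviour.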

The scraped prices are matched against a mapping table (a Google Sheet James maintains with competitor URLs mapped to his own SKUs) and then compared with live Shopify prices pulled via the API. The logic is straightforward: items where a competitor is more than 3% cheaper get flagged as "⚠️ exposure," items where he's cheaper than all competitors get tagged "✅ room to move."
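The comparison rule itself fits in a few lines. A sketch, assuming the 3% threshold described above (the function name, field shapes, and the "ok" label are illustrative):

```javascript
// Sketch of the compare-and-flag rule: "exposure" if any competitor
// undercuts us by more than 3%, "room to move" if we're cheaper than
// every competitor. Labels match the digest emojis described above.

function flagItem(ownPrice, competitorPrices) {
  const cheapest = Math.min(...competitorPrices);
  if (cheapest < ownPrice * 0.97) {
    return "⚠️ exposure";      // a competitor is >3% cheaper
  }
  if (ownPrice < cheapest) {
    return "✅ room to move";  // we undercut every competitor
  }
  return "ok";                 // within the 3% band
}

console.log(flagItem(100, [95, 102, 110])); // "⚠️ exposure"
console.log(flagItem(89, [99, 95, 120]));   // "✅ room to move"
```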

Everything gets written to a rolling Google Sheet — one row per SKU per day. James can now see a 90-day price history on any product with a filter. That alone was something he'd never had before.
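The append-only structure is what makes the 90-day filter trivial: one row per SKU per day, so history is just a filter on the SKU column. A sketch of the row shape (column names and sample values are assumptions, not the actual sheet):

```javascript
// Sketch of the rolling-log row: one row per SKU per day.
// Column layout is illustrative; the real sheet follows whatever
// James's mapping table defines.

function buildRow(date, sku, ownPrice, competitorPrices, flag) {
  return [
    date,                           // ISO date, e.g. "2024-05-02"
    sku,
    ownPrice,
    Math.min(...competitorPrices),  // cheapest competitor seen today
    flag,                           // "⚠️ exposure" / "✅ room to move" / "ok"
  ];
}

console.log(buildRow("2024-05-02", "CAB-HDMI-2M", 12.99, [11.49, 13.5], "⚠️ exposure"));
```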

The Slack message lands at 6:30 AM (the scrape and comparison take about 20 minutes across all SKUs). It shows the top flagged items, any changes since yesterday, and a footer summary. On quiet days it's two lines. On busy days — like when a major competitor ran a 15% flash sale on audio gear — the message is longer and actionable.
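The digest formatting follows directly from that description: a one-line all-clear when nothing changed, a capped list of gaps otherwise. A sketch (message wording and item shape are ours):

```javascript
// Sketch of the digest: short on quiet days, a top-10 list when there
// are flagged items. The exact wording and fields are illustrative.

function buildDigest(flagged) {
  if (flagged.length === 0) {
    return "✅ Price monitor: all clear. No material changes since yesterday.";
  }
  const lines = flagged
    .slice(0, 10)  // top 10 price gaps, as in the real digest
    .map(f => `${f.flag} ${f.sku}: us £${f.own} vs them £${f.competitor}`);
  return `Price monitor: ${flagged.length} item(s) flagged:\n` + lines.join("\n");
}

console.log(buildDigest([]));
console.log(buildDigest([
  { flag: "⚠️", sku: "SPK-BT-01", own: 59.99, competitor: 49.99 },
]));
```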

The Edge Cases Worth Mentioning

A few things we handled that aren't obvious until you're building it:

The Result

Measured results, six weeks post-launch:

The margin improvement is the headline number. By identifying products where they were cheaper than all four competitors — often by 10–20% — James raised prices on 42 SKUs over the first three weeks. Conversion didn't move. Revenue did.

"I'd been leaving money on the table for two years without knowing it," he told us. "The monitoring bot paid for itself in the first week."

The flash sale catch was a bonus. A competitor dropped prices across a category at 7 PM on a Thursday. The n8n pipeline runs a second pass at midnight; it caught the drop, posted to Slack, and the team matched prices before Friday morning. Previously that would have gone unnoticed until someone happened to check over the weekend.

What This Costs to Build

We scoped this as a single automation under our Starter plan. The build took about two and a half days: one day for the scraper and Shopify integration, one day for the comparison logic and Google Sheets logging, and half a day for testing and tuning the Slack output format.

James got the full n8n workflow JSON, the Google Sheets mapping template, and a short doc explaining how to add new competitors or update SKU mappings himself. No ongoing dependency on us.

If you're running an e-commerce store and spending any manual time on competitor price research, this is one of the highest-ROI automations you can build. Get in touch — the call is free.