The Problem
James runs a five-person online electronics shop. They compete with several larger retailers on roughly 250 SKUs — mostly consumer audio, cables, and accessories. Every morning, he or someone on his team would open a spreadsheet, manually visit four competitor websites, find matching products, and update the sheet with their current prices.
When we spoke to him he'd been doing this for two years. He knew it was a waste of time. He also knew that on the days nobody got round to it — bank holidays, busy periods — they were flying blind. Twice in six months they'd missed a competitor running a weekend flash sale and watched their conversion rate tank without understanding why until Monday morning.
"The data exists," he said. "I just can't see it fast enough to act on it."
Proper repricing tools exist, but they're expensive, built for Amazon, and overkill for a shop his size. He didn't need dynamic repricing. He just needed to know what was happening so he could make a decision.
What He Actually Needed
After scoping the problem, the requirements were simple:
- Check four competitor sites daily across ~250 matched SKUs
- Compare against his own Shopify prices automatically
- Flag where he's priced above competitors (risk) and below (opportunity)
- Deliver a summary before the team starts work — no login required
- Keep a rolling history so trends are visible over time
No repricing. No complex rules engine. Just: "here's what changed, here's where you're exposed, here's where you have room to move."
What We Built
The full pipeline looks like this:
Each morning at 6 AM, n8n wakes up and hits four competitor sites using HTTP Request nodes with CSS-selector scraping. No Puppeteer, no headless browser — these sites render prices in the HTML, so a plain fetch is fast and reliable. If a site goes down or changes its markup, n8n flags it in the digest rather than silently failing.
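The extraction itself is plain n8n nodes (HTTP Request plus HTML Extract), but the normalisation step is worth sketching. A minimal version of the price parsing, as it might look in an n8n Code node (the function name is illustrative, and UK-style number formatting is assumed):

```javascript
// Sketch of the per-site normalisation step (names illustrative).
// The HTML Extract node hands over the raw text of the price element,
// e.g. "£12.99", "£1,299.00", or "" when the element is missing.
function parsePrice(rawText) {
  if (!rawText) return null;                  // missing element -> unknown, not zero
  const cleaned = rawText.replace(/[^0-9.,]/g, '');
  if (!cleaned) return null;
  // Assumes UK-format sites: commas are thousands separators.
  const value = parseFloat(cleaned.replace(/,/g, ''));
  return Number.isFinite(value) ? value : null;
}

console.log(parsePrice('£12.99'));    // 12.99
console.log(parsePrice('£1,299.00')); // 1299
console.log(parsePrice(''));          // null
```

Returning `null` rather than `0` for anything unparseable matters later: it keeps out-of-stock items from dragging down the comparison.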
The scraped prices are matched against a mapping table (a Google Sheet James maintains with competitor URLs mapped to his own SKUs) and then compared with live Shopify prices pulled via the API. The logic is straightforward: items where a competitor is more than 3% cheaper get flagged as "⚠️ exposure," items where he's cheaper than all competitors get tagged "✅ room to move."
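The threshold and labels come straight from the write-up; the function shape is a sketch of how the comparison might look in a Code node, not the actual workflow:

```javascript
// Sketch of the comparison logic (3% threshold and flag labels from the
// write-up; function shape illustrative). competitorPrices may contain
// nulls for out-of-stock items; those are ignored, not treated as zero.
function flagSku(ownPrice, competitorPrices, thresholdPct = 3) {
  const known = competitorPrices.filter((p) => p != null);
  if (known.length === 0) return 'no data';
  const cheapest = Math.min(...known);
  if (cheapest < ownPrice * (1 - thresholdPct / 100)) return '⚠️ exposure';
  if (known.every((p) => p > ownPrice)) return '✅ room to move';
  return 'in line';
}

console.log(flagSku(100, [95, 110]));     // ⚠️ exposure
console.log(flagSku(100, [105, 110]));    // ✅ room to move
console.log(flagSku(100, [null, null]));  // no data
```

Note the asymmetry: one undercutting competitor is enough to flag exposure, but "room to move" requires being cheaper than every competitor with a known price.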
Everything gets written to a rolling Google Sheet — one row per SKU per day. James can now see a 90-day price history on any product with a filter. That alone was something he'd never had before.
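The append step is a single Google Sheets node; the only design decision is the row shape. A sketch of how a day's row might be assembled (the column layout here is an assumption, not the actual sheet):

```javascript
// Builds one sheet row per SKU per day (column order illustrative).
// Unknown competitor prices become blank cells, not 0, so sheet-side
// MIN/AVERAGE formulas stay honest.
function toSheetRow(date, sku, ownPrice, competitorPrices, flag) {
  return [
    date,                                     // e.g. "2024-05-01"
    sku,
    ownPrice,
    ...competitorPrices.map((p) => p ?? ''),  // blank, not zero, for unknowns
    flag,
  ];
}
```

One row per SKU per day is deliberately denormalised: it makes the 90-day history a simple filter on the SKU column, with no lookups.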
The Slack message lands at 6:30 AM (the scrape and comparison take about 20 minutes across all SKUs). It shows the top flagged items, any changes since yesterday, and a footer summary. On quiet days it's two lines. On busy days — like when a major competitor ran a 15% flash sale on audio gear — the message is longer and actionable.
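The "two lines on a quiet day" behaviour is just conditional formatting before the Slack node fires. A sketch of how the digest body might be built (message format and field names are illustrative):

```javascript
// Sketch of the digest builder (format illustrative; the real message is
// assembled in an n8n node before the Slack post). `flagged` is the list
// of SKUs the comparison step marked; capped at 10 to keep the message short.
function buildDigest(flagged, changedSinceYesterday) {
  const lines = [];
  if (flagged.length === 0) {
    lines.push('✅ No competitor undercuts today.');
  } else {
    for (const item of flagged.slice(0, 10)) {
      lines.push(`${item.flag} ${item.sku}: you ${item.own}, cheapest rival ${item.cheapest}`);
    }
  }
  lines.push(`${flagged.length} flagged, ${changedSinceYesterday} price changes since yesterday`);
  return lines.join('\n');
}
```

On a quiet day the output collapses to the "all clear" line plus the footer, which matches the two-line behaviour described above.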
The Edge Cases Worth Mentioning
A few things we handled that aren't obvious until you're building it:
- Temporarily out-of-stock items — competitors sometimes remove the price when a product is OOS. We treat missing prices as "unknown" rather than zero, so they don't pollute the comparison.
- Bundled pricing — one competitor lists some items in multi-packs. The mapping table includes a unit-price multiplier so the comparison stays apples-to-apples.
- Markup changes — we added a simple checksum of the page's markup structure (ignoring the text content, which changes legitimately as prices move). If the checksum changes — usually a sign of a site redesign — n8n sends an alert before the next run rather than returning garbage data.
- Rate limiting — the scraper adds a randomised 2–6 second delay between requests per site and uses a rotating user-agent. Nothing aggressive, just enough to be a polite bot.
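The rate-limiting settings above can be sketched in a few lines. The delay range comes from the write-up; the user-agent list and helper names are illustrative, and n8n would apply these via expressions on the HTTP Request node:

```javascript
// Sketch of the polite-bot settings (2–6 s delay from the write-up;
// user-agent strings and helper names illustrative).
const USER_AGENTS = [
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64)',
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)',
  'Mozilla/5.0 (X11; Linux x86_64)',
];

// Randomised pause between consecutive requests to the same site.
function politeDelayMs() {
  return 2000 + Math.floor(Math.random() * 4000); // 2000–5999 ms
}

// Rotate through the pool, one user-agent per request.
function nextUserAgent(requestCount) {
  return USER_AGENTS[requestCount % USER_AGENTS.length];
}
```

Randomising the delay (rather than a fixed interval) avoids a perfectly regular request signature, which is both politer and less likely to trip naive bot detection.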
The Result
- ~2 hours/day saved — zero manual checking since launch
- 12% average margin improvement — identified 40+ SKUs where they were underpriced vs. all competitors
- Flash sale caught overnight — the midnight run flagged a competitor promotion hours after it went live, and prices were matched before the next business day
- 90-day price history now available on every product
- Runs on the same $12/month VPS as their other n8n workflows
The margin improvement is the headline number. By identifying products where they were cheaper than all four competitors — often by 10–20% — James raised prices on 42 SKUs over the first three weeks. Conversion didn't move. Revenue did.
"I'd been leaving money on the table for two years without knowing it," he told us. "The monitoring bot paid for itself in the first week."
The flash sale catch was a bonus. A competitor dropped prices across a category at 7 PM on a Thursday. The n8n pipeline runs again at midnight — it caught it, Slack'd the team, and they matched prices before Friday morning. Previously that would have gone unnoticed until someone happened to check over the weekend.
What This Costs to Build
We scoped this as a single automation under our Starter plan. The build took about two and a half days: one day for the scraper and Shopify integration, one day for the comparison logic and Google Sheets logging, and half a day for testing and tuning the Slack output format.
James got the full n8n workflow JSON, the Google Sheets mapping template, and a short doc explaining how to add new competitors or update SKU mappings himself. No ongoing dependency on us.
If you're running an e-commerce store and spending any manual time on competitor price research, this is one of the highest-ROI automations you can build. Get in touch — the call is free.