How We Synced 4 E-commerce Platforms in Real-Time and Eliminated Overselling
Key Takeaways
- Real-time e-commerce inventory sync requires a centralised single source of truth — not platform-to-platform connections
- Each major platform (Shopify, Amazon, Etsy, WooCommerce) has a completely different API structure and rate limiting behaviour
- A queue system using BullMQ is essential for managing rate limits across multiple platforms simultaneously
- Overselling incidents dropped from 23 per week to zero within 60 days of the system going live
- The architecture generalises to any multi-platform retail operation, regardless of platform combination
Who Is This For?
This technical case study is for e-commerce retailers, technical leads, and CTOs dealing with overselling problems caused by selling the same inventory across multiple platforms. It covers the full technical architecture of the solution we built — including the specific API challenges of each platform and the queue management approach that makes real-time sync viable at scale.
Real-time e-commerce inventory sync across multiple selling platforms is one of the most technically demanding problems in modern retail operations. The business consequence of getting it wrong — overselling — is among the most damaging outcomes a retailer can face. You sell a product, take payment, send the customer a confirmation email, and then discover the item is out of stock because a different platform sold the last unit 20 minutes earlier. The result is a refund, a negative review, a damaged customer relationship, and in some marketplaces, a policy strike that affects your seller standing.
The client in this case study was a mid-size UK retailer selling across four platforms simultaneously: Shopify (their primary D2C store), Amazon Seller Central (their largest revenue channel), Etsy (for a specific product category), and a WooCommerce store (their legacy channel, retained for a specific customer segment). Each platform maintained its own inventory count, updated independently, with no synchronisation between them. During peak periods — particularly Black Friday and the Christmas selling season — this architecture was generating 23 overselling incidents per week, each requiring a manual refund, a customer service interaction, and a negative review risk.
This post documents the complete technical architecture of the real-time inventory sync system we built, the specific API challenges we encountered on each platform, the queue management approach we used to handle rate limiting, and the results measured at 30 and 60 days post-launch. The architecture we describe is not specific to this client's platform combination — it generalises to any multi-platform retail operation.
Why Multi-Platform Inventory Sync Is Technically Hard
The naive approach to multi-platform inventory sync is to connect platforms directly to each other: when Shopify records a sale, update Amazon; when Amazon records a sale, update Shopify. This point-to-point architecture fails at scale for two reasons. First, it creates race conditions: if Shopify and Amazon each receive an order for the last unit within seconds of each other, both platforms may update the other simultaneously, producing conflicting inventory states. Second, it scales poorly: with four platforms, you need 12 directional connections to synchronise all pairs — and each connection has its own error handling and retry logic to maintain.
The correct architecture is hub-and-spoke: a central inventory service that maintains the single authoritative inventory count, with all four platforms connecting to that central service rather than to each other. When any platform records a sale, it notifies the central service, which decrements the master count and pushes the updated quantity to all other platforms. There is one source of truth, four connections to maintain, and no race condition because all inventory mutations flow through a single, atomic update process.
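The fan-out rule at the heart of hub-and-spoke is simple: the platform that reported the sale already has the correct count, so the update goes to everyone else. A minimal sketch (function name illustrative):

```typescript
// Hub-and-spoke fan-out: when one platform reports a sale, every *other*
// platform receives the updated quantity. Platform names match this case
// study; the function name is illustrative.
const PLATFORMS = ["shopify", "amazon", "etsy", "woocommerce"] as const;
type Platform = (typeof PLATFORMS)[number];

function fanOutTargets(origin: Platform): Platform[] {
  // The originating platform decremented its own count when it took the
  // order, so it is excluded from the push.
  return PLATFORMS.filter((p) => p !== origin);
}
```

With four platforms this is four connections to maintain instead of twelve, and adding a fifth platform adds one connection rather than eight.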
The second dimension of complexity is that each of the four platforms has a completely different API design. Shopify provides webhooks for inventory level changes and a straightforward REST API for inventory updates. Amazon Seller Central uses a Selling Partner API with OAuth2 authentication, strict rate limits, and asynchronous feed submission for bulk inventory updates. Etsy's API has its own authentication model and different endpoints for listing quantity updates. WooCommerce uses a REST API but hosted on the client's own server, requiring a different network path and authentication approach. None of them share a common data model or event format.
The Central Inventory Service Architecture
We built the central inventory service on a Node.js backend with a PostgreSQL database. PostgreSQL's transaction isolation model was the critical technical reason for this choice: when two orders arrive simultaneously for the same SKU on different platforms, the database's row-level locking ensures that both inventory decrements are processed atomically and sequentially — not concurrently — which prevents the race condition that causes overselling. This guarantee is not available in eventually-consistent NoSQL databases, which would be inappropriate for an inventory system requiring strict consistency.
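The decrement logic can be sketched as follows. This is a deliberately minimal in-memory stand-in: in production the read and write run inside a PostgreSQL transaction, with `SELECT ... FOR UPDATE` taking a row-level lock so a second concurrent order for the same SKU blocks until the first commits. Names and wiring here are illustrative.

```typescript
type Inventory = Map<string, number>; // sku -> master_quantity

function decrementInventory(inv: Inventory, sku: string, qty: number): number {
  // In PostgreSQL this read would be:
  //   SELECT master_quantity FROM products WHERE sku = $1 FOR UPDATE
  // which serialises concurrent decrements for the same SKU.
  const current = inv.get(sku);
  if (current === undefined) throw new Error(`unknown SKU: ${sku}`);
  if (current < qty) throw new Error(`insufficient stock for ${sku}`);
  const next = current - qty;
  // In PostgreSQL: UPDATE products SET master_quantity = $2 WHERE sku = $1
  inv.set(sku, next);
  return next; // caller then queues push jobs for the other platforms
}

const inv: Inventory = new Map([["SKU-001", 5]]);
decrementInventory(inv, "SKU-001", 2); // leaves 3 units
```

The important property is that the read, the check, and the write are one indivisible unit; two orders for the last unit resolve as one success and one clean rejection, never two successes.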
The database schema is straightforward: a `products` table with a `master_quantity` field for each SKU, an `order_events` table that logs every order received from any platform, and a `sync_log` table that records every inventory update pushed to each platform — including the response code, the quantity sent, and any error state. This audit log is essential for debugging sync failures and for answering questions like 'why does Amazon show 3 units but Shopify shows 4?' — a discrepancy that, before the system was built, required manual investigation across four separate admin interfaces.
The service exposes a webhook endpoint for each platform. When Shopify processes an order, its webhook fires to our `/webhooks/shopify/order` endpoint. The handler validates the webhook signature (using Shopify's HMAC-SHA256 signature scheme), extracts the SKU and quantity from the order payload, and calls the core `decrementInventory(sku, quantity)` function. That function runs a PostgreSQL transaction that atomically reads the current master quantity, subtracts the ordered quantity, and writes the new quantity back — then queues update tasks for all other platforms.
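The signature check can be sketched with Node's built-in `crypto` module. Shopify sends a base64-encoded HMAC-SHA256 of the raw request body in the `X-Shopify-Hmac-Sha256` header, computed with the app's shared secret; the comparison should be constant-time. The function name and framework wiring are illustrative.

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Validate a Shopify webhook: recompute the HMAC over the raw body and
// compare it to the value Shopify sent in the X-Shopify-Hmac-Sha256 header.
function verifyShopifyWebhook(rawBody: string, headerHmac: string, secret: string): boolean {
  const digest = createHmac("sha256", secret).update(rawBody, "utf8").digest("base64");
  const a = Buffer.from(digest);
  const b = Buffer.from(headerHmac);
  // timingSafeEqual throws on mismatched lengths, so guard first.
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Note that the HMAC must be computed over the raw body bytes, not a re-serialised JSON object; parsing and re-stringifying the payload before hashing is a common cause of spurious signature failures.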
Building a Multi-Platform E-commerce Integration?
We have built inventory sync systems, platform integrations, and custom e-commerce backends for UK retailers. Book a free technical consultation to discuss your specific platform combination and requirements.
Rate Limiting and Queue Management: The Hardest Part
The most technically challenging aspect of the build was not the inventory sync logic itself — that was relatively straightforward once the central service architecture was in place. The challenge was managing the rate limits imposed by each platform's API while still pushing inventory updates fast enough to prevent overselling during peak order periods. Amazon's Selling Partner API is the most restrictive: the inventory update endpoint allows only two requests per second with a burst allowance of 30 requests. During a flash sale generating 200+ orders per hour, naive sequential API calls would create an update queue hours long.
We implemented a queue system using BullMQ — a Node.js message queue built on Redis — to manage all outbound API calls to the four platforms. When the central service decrements inventory and needs to push updates to the other platforms, it adds jobs to four separate BullMQ queues — one per platform — rather than making API calls directly. Each queue has a worker with platform-specific rate limit configuration: the Shopify worker can process updates at 2 per second, the Amazon worker at 2 per second with a 30-request burst allowance, the Etsy worker at 10 per second, and the WooCommerce worker at 5 per second.
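The per-platform throughput caps map naturally onto BullMQ's worker limiter option, which caps jobs processed per time window. A sketch of the configuration, with values mirroring the limits above (note that BullMQ's steady-state limiter does not itself model Amazon's 30-request burst allowance, which needs separate handling):

```typescript
// BullMQ limiter shape: at most `max` jobs per `duration` milliseconds.
type LimiterOpts = { max: number; duration: number };

const PLATFORM_LIMITS: Record<string, LimiterOpts> = {
  shopify:     { max: 2,  duration: 1000 },
  amazon:      { max: 2,  duration: 1000 }, // burst allowance handled separately
  etsy:        { max: 10, duration: 1000 },
  woocommerce: { max: 5,  duration: 1000 },
};

// Each entry would configure one worker, e.g. (assuming a Redis connection
// object and a processor function per platform):
//   new Worker("amazon-sync", processAmazonUpdate,
//              { connection, limiter: PLATFORM_LIMITS.amazon });
```

Keeping one queue and one worker per platform means a slow or throttled platform backs up only its own queue; a stall on Amazon's API never delays Shopify updates.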
We also implemented priority queuing for high-velocity SKUs. The top 20% of SKUs by weekly order volume receive priority in the queue, meaning their updates process before lower-velocity items. During a period of peak demand, this ensures that the products most likely to oversell receive their inventory updates first. Low-velocity SKUs that turn over once per week can safely wait a few extra seconds for their update without meaningful overselling risk.
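The prioritisation can be sketched as a volume threshold plus a per-job priority flag (BullMQ runs lower-numbered priorities first). The percentile computation and priority values here are illustrative:

```typescript
// Find the weekly-volume cut-off for the top 20% of SKUs.
function velocityThreshold(weeklyVolumes: number[]): number {
  const sorted = [...weeklyVolumes].sort((a, b) => b - a);
  const cutIndex = Math.max(0, Math.ceil(sorted.length * 0.2) - 1);
  return sorted[cutIndex];
}

// BullMQ treats lower numbers as higher priority.
function jobPriority(skuWeeklyVolume: number, threshold: number): number {
  return skuWeeklyVolume >= threshold ? 1 : 10;
}

// Jobs would then be enqueued as e.g.:
//   queue.add("push-update", { sku, quantity },
//             { priority: jobPriority(volume, threshold) });
```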
Handling Platform-Specific API Challenges
Each platform presented distinct challenges beyond rate limiting. Amazon's asynchronous feed submission model was the most significant departure from the other platforms' synchronous API patterns. Rather than accepting a single inventory update via a REST call, Amazon requires batch inventory updates to be submitted as XML or JSON feed files, which are then processed asynchronously — with a processing delay that can range from a few minutes to several hours. We handled this by implementing a reconciliation job that runs every 15 minutes to compare the master inventory count against each platform's reported quantity and push corrective updates where discrepancies exist.
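The core of the reconciliation pass is a straightforward diff between the master counts and what a platform reports, emitting corrective updates for any mismatch. Data shapes and names here are illustrative:

```typescript
type Correction = { sku: string; correctQuantity: number };

// Compare the master inventory against one platform's reported quantities
// and return the corrective pushes needed. A SKU the platform does not
// report at all also produces a correction.
function reconcile(
  master: Map<string, number>,
  platformReported: Map<string, number>,
): Correction[] {
  const corrections: Correction[] = [];
  for (const [sku, correctQuantity] of master) {
    if (platformReported.get(sku) !== correctQuantity) {
      corrections.push({ sku, correctQuantity });
    }
  }
  return corrections;
}
```

In the real system the corrections feed back into the same per-platform queues as ordinary updates, so they are subject to the same rate limiting and retry behaviour.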
WooCommerce, being self-hosted, introduced a network reliability challenge absent from the other platforms. Cloud-hosted platforms like Shopify and Amazon have essentially perfect uptime. The client's WooCommerce server, hosted on a shared hosting environment, experienced occasional downtime and elevated latency. We addressed this with exponential backoff retry logic in the WooCommerce queue worker: failed updates are retried at 30 seconds, 2 minutes, 10 minutes, and 1 hour, with an alert triggered if a SKU update fails all four retry attempts. The alert notifies the operations team via Slack, allowing manual intervention before the inventory discrepancy causes an overselling incident.
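The retry schedule above (30 seconds, 2 minutes, 10 minutes, 1 hour) can be expressed as a simple delay lookup; the alerting wiring is not shown:

```typescript
// Exponential-backoff schedule for failed WooCommerce updates.
const RETRY_DELAYS_MS = [30_000, 120_000, 600_000, 3_600_000];

// `attempt` is 1-based: attempt 1 failed -> wait 30 s before attempt 2, etc.
// Returns null once all four retries are exhausted, at which point the
// Slack alert fires and the SKU is flagged for manual intervention.
function retryDelay(attempt: number): number | null {
  return attempt >= 1 && attempt <= RETRY_DELAYS_MS.length
    ? RETRY_DELAYS_MS[attempt - 1]
    : null;
}
```

The widening gaps match the failure modes observed: a transient blip recovers on the 30-second retry, while a genuine server outage is surfaced to a human within roughly an hour rather than retried forever.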
Results After 60 Days
The system went live on a Thursday morning during a lower-traffic period to allow the team to monitor behaviour before the weekend peak. In the first 24 hours, the system processed 847 inventory update events across the four platforms without a single error. The sync log confirmed that all updates were completing within 8 seconds of an order being placed on any platform — well within the window required to prevent simultaneous sales of the same last unit.
At the 30-day mark, overselling incidents had dropped from 23 per week to 2 — both attributable to a brief WooCommerce server outage that prevented updates from reaching that platform during a flash sale. Both incidents were caught by the reconciliation job within 15 minutes and resolved before the affected orders were shipped. At the 60-day mark, overselling incidents were at zero. Customer service tickets related to inventory issues dropped 78%. Negative reviews mentioning out-of-stock fulfilment failures dropped to zero.
The financial return was clear within the first quarter. Overselling incidents had been costing the client approximately £4,200 per month in refunds, customer service time, and marketplace penalty risk. The engineering investment was recouped in under four months. The ongoing infrastructure cost — Redis, the Node.js service, and PostgreSQL — runs at approximately £180 per month. To discuss a similar integration for your e-commerce operation, visit our development services page or get in touch directly.
Dream Code Labs
Web Development & Automation Agency · 7+ years experience
Dream Code Labs is a remote-first development and automation agency specialising in custom websites, AI-powered tools, and workflow automation for marketing agencies and growing SMEs across the UK, US, Canada, and Australia. We have delivered 50+ projects that produce measurable, real-world results.
Frequently Asked Questions
How do you sync inventory across multiple e-commerce platforms in real time?
The correct architecture is a centralised inventory service — a single database that holds the authoritative inventory count — with all platforms connecting to it via webhooks and API calls. When any platform records an order, it triggers an update to the central service, which decrements the master count and pushes the updated quantity to all other platforms. Point-to-point connections between platforms create race conditions and scale poorly beyond two platforms.
What causes overselling across multiple e-commerce platforms?
Overselling occurs when two or more platforms each receive an order for the same last unit before either platform has been notified of the other's sale. It is caused by latency in inventory updates between platforms — even a 30-second delay between a sale on Platform A and the corresponding inventory reduction on Platform B creates an overselling window during high-traffic periods. The solution is a centralised inventory service that processes all sales atomically and pushes updates to all platforms within seconds.
Which e-commerce platforms support real-time inventory sync?
Shopify, WooCommerce, and Etsy all provide REST APIs and webhook support for near-real-time inventory synchronisation. Amazon Seller Central uses a more complex asynchronous feed submission model with longer processing delays, requiring a reconciliation layer to maintain accuracy. BigCommerce, Squarespace Commerce, and most other major platforms also provide API access suitable for inventory sync, though implementation complexity varies.
How long does it take to build a multi-platform inventory sync system?
A custom inventory sync system covering three to four platforms typically takes 8–12 weeks to build, test, and deploy to production. The development timeline depends primarily on the complexity of the platform combination — Amazon's asynchronous API requires significantly more engineering work than Shopify or Etsy — and the volume and complexity of the product catalogue being managed.
What is the cost of building a custom e-commerce inventory sync system?
A custom multi-platform inventory sync system covering three to four platforms typically costs £15,000–£35,000 to build depending on platform complexity and product catalogue scale. Ongoing infrastructure costs are typically £150–£400 per month for hosting, Redis, and database services. Most retailers generating significant revenue across multiple platforms recover the build investment within 6–12 months through eliminated overselling costs and reduced customer service overhead.
Last updated: 20 Apr 2025




