Introduction
Automatic product data gathering can occasionally be thwarted by advanced firewalls, AI-based scraper-blocking technology, and other countermeasures. This evolving technology can sometimes require that price and stock status updates be performed manually. MetaLocator monitors crawling failures to identify URLs that are otherwise valid but fail to yield product data.
In these cases, MetaLocator will usually reach out to the retailer, sometimes in cooperation with the manufacturer, to seek whitelisting of our crawler, access to a product data feed, an API, or another alternative.
When automatic updates fail, MetaLocator can enqueue the failed product update for manual review. We then deliver the price update request to up to three separate (human) workers. The workers update the price using a triple-blind review model, in which each worker's update is automatically compared to the others. If all three updates are identical, they are accepted as accurate. If one update disagrees and the other two concur, we accept the two that concur as accurate. If no two updates agree, the task is assigned to three new workers and the process repeats.
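For illustration only, the sketch below shows the kind of majority check the triple-blind review model describes, assuming submitted prices are compared for exact equality. The function name and data types are hypothetical and do not reflect MetaLocator's internal implementation.

```python
from collections import Counter
from decimal import Decimal


def resolve_submissions(submissions):
    """Resolve three independent price submissions for one product.

    Returns the accepted price, or None when no two submissions agree,
    in which case the task is re-queued to three new workers.
    """
    if len(submissions) != 3:
        raise ValueError("triple-blind review expects exactly three submissions")

    # Accept any price submitted by at least two of the three workers.
    price, votes = Counter(submissions).most_common(1)[0]
    return price if votes >= 2 else None


# All three agree: accepted outright.
print(resolve_submissions([Decimal("19.99")] * 3))                                   # 19.99
# Two agree, one disagrees: the concurring pair is accepted.
print(resolve_submissions([Decimal("19.99"), Decimal("19.99"), Decimal("18.99")]))   # 19.99
# No two agree: None signals re-assignment to three new workers.
print(resolve_submissions([Decimal("19.99"), Decimal("18.99"), Decimal("17.99")]))   # None
```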
Manual price updates are relatively expensive and are performed only after repeated attempts to obtain the data automatically have failed. Retailers that thwart all methods of automated data gathering are not supported and will be removed from the solution.
Configuring Manual Updates
The number of failed automatic crawl attempts that triggers a manual update can be managed on the retailer profile, as shown below under Manual Update Threshold.
The retailer must also be set to allow manual updates, as shown in the checkbox above.
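As a rough illustration of how these two settings interact, the sketch below checks a hypothetical retailer profile before queuing a manual review; the field and function names are illustrative and are not part of MetaLocator's actual configuration API.

```python
from dataclasses import dataclass


@dataclass
class RetailerProfile:
    # Illustrative stand-ins for the settings shown on the retailer profile.
    allow_manual_updates: bool      # the "allow manual updates" checkbox
    manual_update_threshold: int    # the "Manual Update Threshold" value


def should_enqueue_manual_update(profile, consecutive_failures):
    """Return True when a failed crawl should be queued for manual review."""
    return (
        profile.allow_manual_updates
        and consecutive_failures >= profile.manual_update_threshold
    )


# Example: with a threshold of 3, the third consecutive failure triggers review.
profile = RetailerProfile(allow_manual_updates=True, manual_update_threshold=3)
print(should_enqueue_manual_update(profile, consecutive_failures=3))  # True
```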