When Reservation Bots Steal Your Favorite Table
Want a table at that trendy restaurant? Oh, you can book it online! What? There are no available times at all? Well, how about in a week? No? A month? No? How about two months? Is this a joke? How can this be possible?
This kind of conversation with your browser might not be that unusual. The core question: how is it possible that every available time gets booked? The answer may be that it is not an actual person making the bookings.
Coders can use bots to grab available tables, leaving other diners feeling frustrated.
The bot waits for tables to become available, then reserves them before humans can even react. Does this sound like an unfair advantage? Probably.
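In principle, such a bot is just a tight polling loop: check availability over and over, and the instant a slot appears, book it. Here is a minimal sketch of that idea in Python. The booking site's interface is replaced with a stubbed feed, since all names here (`check_availability`, `snipe_table`, the slot strings) are illustrative assumptions, not any real site's API:

```python
import time

# Hypothetical availability source: each call returns the currently open slots.
# A real bot would make an HTTP request to the booking site's API here.
def check_availability(feed):
    return feed.pop(0) if feed else []

def snipe_table(feed, poll_interval=0.0):
    """Poll until a slot opens, then 'book' the first one immediately."""
    while True:
        slots = check_availability(feed)
        if slots:
            return slots[0]           # grab the first open slot
        if not feed:
            return None               # nothing ever opened up
        time.sleep(poll_interval)     # real bots poll as fast as rate limits allow

# Simulated feed: two empty polls, then a cancellation frees up a table.
feed = [[], [], ["Saturday 19:00"]]
print(snipe_table(feed))  # → Saturday 19:00
```

The loop is trivial; the "arms race" described below is entirely about how fast and how often that loop can run compared to everyone else's.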
How about doing something about it? Being proactive? Maybe deploying your own bot to even the score? It is definitely possible; I mean, you can Google it. Oh look, I found a scraping tool with a step-by-step video tutorial. That sounds easy enough; I'm going to give it a shot!
Welcome to the bot wars, where the most optimized bot wins! You're competing against people with experience levels ranging from "I code bots for a living" to "anyone with an Internet connection". This kind of competition means you have to think about every aspect of your bot. It might even mean placing your servers closer to the server running the restaurant's website, just to give your bot a microsecond advantage.
I mean, do you want a table, or do you want a table? Of course, for the novice scraper, the first reaction will be:
“How can this be, why must it be so complicated and time consuming? Ah, the frustration! All I wanted was a table for Saturday night!”
At ScrapeSentry we see the topic of the legality of scraping come up whenever a new industry is put in the spotlight. However, all industries are affected to some extent; it is only a question of how much the information is worth. We regularly read about companies in the news that sue over scraped aggregated data, like QVC, Ticketmaster, Facebook, Craigslist and eBay.
Large companies that have the resources can simply send out cease-and-desist letters left and right, quite effectively clamping down on dining-reservation websites and often forcing them out of business.
Unfortunately, not every company has the resources necessary to battle scrapers on the legal front in this juggernaut fashion. Luckily, there are alternatives.
ScrapeSentry deals with scrapers in a unique way. We have the same automation most anti-scraping companies do, but we also have the 24/7/365 manpower that other companies don't, keeping your traffic safe by monitoring users and blocking scrapers.
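To make "monitoring users and blocking scrapers" a little more concrete, one basic building block of automated detection is rate limiting over a sliding window: a client that fires far more requests per second than a human plausibly could gets flagged. The sketch below is purely illustrative (the threshold values and function names are assumptions, not ScrapeSentry's actual system):

```python
import time
from collections import defaultdict, deque

# Illustrative thresholds: more than 20 requests in any 10-second window
# is treated as bot-like traffic.
WINDOW_SECONDS = 10
MAX_REQUESTS = 20

request_log = defaultdict(deque)  # client IP -> timestamps of recent requests

def is_scraper(client_ip, now=None):
    """Record one request and report whether this client exceeds the rate limit."""
    now = now if now is not None else time.time()
    log = request_log[client_ip]
    log.append(now)
    # Drop timestamps that have fallen out of the sliding window.
    while log and now - log[0] > WINDOW_SECONDS:
        log.popleft()
    return len(log) > MAX_REQUESTS

# A human browsing pace stays under the threshold; a bot's burst trips it.
human = any(is_scraper("10.0.0.1", now=t) for t in range(0, 60, 3))   # ~1 req / 3 s
bot = any(is_scraper("10.0.0.2", now=t * 0.1) for t in range(100))    # 10 req / s
print(human, bot)  # → False True
```

Real-world systems layer many more signals on top of raw request rate (headers, navigation patterns, IP reputation), which is where the human analysts come in: distinguishing an aggressive bot from a busy office network is often a judgment call.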