“You agree that your abusive use of the Site may cause damage and harm to us, including impaired goodwill, lost sales and increased expenses. You also agree that monetary damages for your abusive use of the Site are difficult to determine and that if you, or others acting with you, request more than 1,000 pages of the Site or make more than 800 reserve requests on the Site in any 24-hour period, you, and those acting with you, will be jointly and severally liable for liquidated damages in the amount of twenty-five cents ($0.25) for each page request or reserve request made during that 24-hour period which exceeds those limits.”
If the court rules that Higs agreed to these terms by booking tickets, it will make ticket scraping in the US considerably harder. Ticketmaster claims Higs makes up to 350,000 page requests per day; at $0.25 for each request over the 1,000-page limit, that works out to roughly $87,000 in liquidated damages per day of abuse.
We see customers getting hit millions of times per day by scrapers, so a ruling in Ticketmaster's favor could definitely be a welcome tool against the abusive web scrapers that plague many web sites today. The ticketing business is somewhat unique in that scrapers are fairly easy to identify because the tickets themselves are traceable; most other businesses affected by web scraping do not have this advantage. It is, however, often possible to identify even scrapers hiding behind layers of open proxy servers or anonymizers by seeding the data they are after with unique values.

Seeding is a practice first used by cartographers hundreds of years ago to catch people copying their maps. By adding non-existent streets or houses to a map, a cartographer could later spot these invented features on a copy and thereby prove it was a copy rather than an original.
Together with a positive ruling for Ticketmaster, seeding could become a very powerful tool to stop web scrapers.