Incapsula Study Highlights Data Privacy Danger from Web Bots

If you want to have a serious Web presence, you need to deal with Internet crawler bots. But those bots also present increasingly heavy burdens for servers, as well as security and data privacy challenges. So says security vendor Incapsula, which has released a study of the bots, good and bad, that trawl the Web today.

Christopher Tozzi, Contributing Editor

December 17, 2014


Many bots, or automated programs that visit websites, do lots of useful things on the Internet, including indexing websites for search engines and collecting data for RSS feeds. Others are designed with nastier intent, aiming to launch DDoS attacks or sniff out private data, for example.

And according to Incapsula — which makes its money, in part, by helping to secure clients' sites against DDoS attacks and other disruptions — the bad bots have grown smarter in the last year. Although the overall number of bots crawling the Internet has shrunk since 2013, the proportion of "impersonator" bots — those that pretend to come from a legitimate service, like Google, in order to get past security checks — has grown, making it more difficult for filters to catch them at the gateway.

New CAPTCHA-solving technology, which Incapsula's report says can answer modern CAPTCHA challenges accurately in up to 90 percent of cases, adds to the danger posed by bots, since it is now easier for them to pass tests that only real, live humans are supposed to be able to complete.

On top of all this, Incapsula says, even non-malicious bots come at a cost for website owners. One of the study's major findings was that, on sites with relatively few visitors, bots account for as much as four-fifths of total Web traffic — which means a large chunk of what organizations pay for bandwidth and server time may be going to serve non-human visitors. The figure is less extreme for larger websites, but even on sites that receive more than 100,000 daily visits, bots can account for more than half of Web traffic, according to Incapsula's research.

All of this may come as upsetting news if you're running a website, and there's currently no good answer to the conundrum. Still, it's worth noting as a reminder that traditional security threats remain relevant in the cloud age — and that the bots are much, much smarter now than they were when spelling out the "at" and "dot" in your email address was enough to protect your data.


About the Author

Christopher Tozzi

Contributing Editor

Christopher Tozzi started covering the channel for The VAR Guy on a freelance basis in 2008, with an emphasis on open source, Linux, virtualization, SDN, containers, data storage and related topics. He also teaches history at a major university in Washington, D.C. He occasionally combines these interests by writing about the history of software. His book on this topic, “For Fun and Profit: A History of the Free and Open Source Software Revolution,” is forthcoming with MIT Press.
