Companies should also remember that some bots are created just to obtain information that might be easier to get with a company-provided API, said Michael Facemire, a principal analyst with Forrester.
"If I find some information that is useful to me right now but also would be useful over time, as a developer, the first thing I do is see if there’s an API to get that information," he said. "If the answer to that is ‘no’, the next easiest way to get it is to write a bot or crawler to regularly scrape the site for that information."
Because crawlers can degrade a website's performance, it's important to use analytics: first to see which pages are being pulled, and then to decide whether a public API could expose some of that data, he said.
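One way to see which pages bots are pulling is to scan web server access logs for requests whose user-agent strings look automated. The sketch below is a minimal illustration, not a production detector: the sample log lines, the `BOT_HINTS` keywords, and the `bot_request_counts` helper are all hypothetical, and real traffic analysis would also weigh request rates and IP behavior.

```python
import re
from collections import Counter

# Illustrative sample in combined log format; not real traffic.
SAMPLE_LOG = """\
203.0.113.5 - - [10/Oct/2023:13:55:36 +0000] "GET /prices HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 10.0)"
198.51.100.7 - - [10/Oct/2023:13:55:37 +0000] "GET /prices HTTP/1.1" 200 512 "-" "python-requests/2.31.0"
198.51.100.7 - - [10/Oct/2023:13:55:38 +0000] "GET /prices HTTP/1.1" 200 512 "-" "python-requests/2.31.0"
198.51.100.7 - - [10/Oct/2023:13:55:39 +0000] "GET /inventory HTTP/1.1" 200 1024 "-" "Scrapy/2.11"
"""

# Capture the requested path and the user-agent string from each line.
LOG_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+" \d+ \d+ "[^"]*" "(?P<agent>[^"]*)"'
)

# Substrings that commonly appear in automated clients' user agents.
BOT_HINTS = ("bot", "crawl", "spider", "scrapy", "python-requests", "curl", "wget")

def bot_request_counts(log_text):
    """Count requests per path whose user agent contains a bot-like hint."""
    counts = Counter()
    for line in log_text.splitlines():
        m = LOG_RE.search(line)
        if m and any(hint in m.group("agent").lower() for hint in BOT_HINTS):
            counts[m.group("path")] += 1
    return counts

if __name__ == "__main__":
    for path, n in bot_request_counts(SAMPLE_LOG).most_common():
        print(f"{path}: {n} bot-like requests")
```

Pages that rank high in such a tally are candidates for the kind of public API Facemire describes, since the data is evidently worth scraping.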
Ultimately, it's a game of cat and mouse, said analyst Roger Kay, president of Endpoint Technologies Associates.
"The bad guys always devise a workaround, and the good guys do the best they can under the latest assault to filter out extraneous traffic," Kay said.