Bots Compose 42% of Overall Web Traffic; Nearly Two-Thirds Are Malicious, Reports Akamai
Akamai Technologies (NASDAQ: AKAM) has released a new State of the Internet report revealing that bots make up 42% of overall web traffic, with 65% of these being malicious. The report, titled 'Scraping Away Your Bottom Line: How Web Scrapers Impact Ecommerce,' highlights the severe impact of undetected web scraping bots on ecommerce. These bots are used for activities like competitive intelligence, inventory hoarding, and imposter site creation, adversely affecting revenue and customer experience. The report underlines the challenges posed by AI botnets and headless browser technology, which require sophisticated mitigation strategies to manage. Key findings show that bots can degrade website performance, pollute site metrics, and increase compute costs. The report includes mitigation strategies and a case study demonstrating improved website performance post-defense implementations.
- Akamai identifies that bots make up 42% of web traffic, highlighting the need for advanced security measures.
- The report provides valuable insights and mitigation strategies for companies to protect against web scraping bots.
- Case study included shows significant improvement in website performance after implementing bot defenses.
- 65% of bots are malicious, posing significant threats to ecommerce by impacting revenue and customer experience.
- AI botnets and headless browsers are making it harder for companies to detect and manage malicious bots.
- Technical impacts include website performance degradation, increased compute costs, and compromised credentials.
Undetected web scraping bots severely impact ecommerce
With its reliance on revenue-generating web applications, the ecommerce sector has been most affected by high-risk bot traffic. Although some bots are beneficial to business, web scraper bots are being used for competitive intelligence and espionage, inventory hoarding, imposter site creation, and other schemes that hurt both the bottom line and the customer experience. No existing laws prohibit the use of scraper bots, and the rise of artificial intelligence (AI) botnets has made them harder to detect. Even so, there are steps companies can take to mitigate them.
"Bots continue to present massive challenges resulting in multiple pain points for app and API owners," said Patrick Sullivan, CTO, Security Strategy at Akamai. "This includes scraping that can steal web data and produce brand impersonation sites. The scraper landscape is also changing due to advancements like headless browser technology, requiring organizations to take an approach to managing this type of bot activity that is more sophisticated than other JavaScript-based mitigations."
Key findings from the report include:
- AI botnets can discover and scrape unstructured data and content stored in less consistent formats or locations. They can also enhance decision-making with actual business intelligence by collecting, extracting, and then processing the data they gather.
- Scraper bots can be leveraged to generate more sophisticated phishing campaigns by grabbing product images, descriptions, and pricing information to create counterfeit storefronts or phishing sites aimed at stealing credentials or credit card information.
- Bots can be used to facilitate new account opening abuse, which, according to recent research, composes up to 50% of fraud losses.
- Technical impacts that organizations face as a result of being scraped, whether the scraping was done with malicious or beneficial intentions, include website performance degradation, site metric pollution, credential compromise attacks from phishing sites, increased compute costs, and more.
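To make the counterfeit-storefront finding concrete, the hypothetical sketch below shows how little code a scraper needs to lift product names and prices from static HTML using only the standard library (the CSS class names are invented for illustration):

```python
from html.parser import HTMLParser

class ProductScraper(HTMLParser):
    """Collect (field, value) pairs from elements tagged with product classes.
    The class names "product-name" and "product-price" are hypothetical."""
    def __init__(self):
        super().__init__()
        self.products = []
        self._field = None  # field to assign to the next text node, if any

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if "product-name" in cls:
            self._field = "name"
        elif "product-price" in cls:
            self._field = "price"

    def handle_data(self, data):
        if self._field:
            self.products.append((self._field, data.strip()))
            self._field = None

# Example: scraping a fragment of a product listing page
page = '<span class="product-name">Widget</span><span class="product-price">$9.99</span>'
scraper = ProductScraper()
scraper.feed(page)
```

The ease of harvesting such data at scale is what makes imposter sites cheap to build, and why the report treats scraping as a precursor to phishing rather than a standalone nuisance.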
The Scraping Away Your Bottom Line research report offers mitigation strategies against scraper bots and features a case study showing how websites operate faster and more efficiently once defenses against these bots are in place. In addition, the research addresses compliance considerations that must be taken into account in light of these increasing attacks.
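The report's specific mitigation strategies are not reproduced here, but per-client rate limiting is one common generic defense against scraping traffic. A minimal token-bucket sketch, assuming requests are keyed by a client identifier such as an IP address:

```python
import time
from collections import defaultdict

class TokenBucket:
    """Per-client token bucket: allows `rate` requests/second on average,
    with bursts of up to `capacity` requests."""
    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = defaultdict(lambda: capacity)   # start each client full
        self.last = defaultdict(time.monotonic)        # last-seen timestamp

    def allow(self, client_id: str) -> bool:
        now = time.monotonic()
        elapsed = now - self.last[client_id]
        self.last[client_id] = now
        # Refill tokens proportionally to elapsed time, capped at capacity.
        self.tokens[client_id] = min(self.capacity,
                                     self.tokens[client_id] + elapsed * self.rate)
        if self.tokens[client_id] >= 1:
            self.tokens[client_id] -= 1
            return True
        return False
```

Rate limiting alone slows rather than stops a distributed botnet, which is why it is typically layered with the behavioral and fingerprint-based detections that reports like this one discuss.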
This year marks the 10th anniversary of Akamai's State of the Internet (SOTI) reports. The SOTI series provides expert insights on the cybersecurity and web performance landscapes, based on data gathered from Akamai Connected Cloud.
About Akamai
Akamai powers and protects life online. Leading companies worldwide choose Akamai to build, deliver, and secure their digital experiences — helping billions of people live, work, and play every day. Akamai Connected Cloud, a massively distributed edge and cloud platform, puts apps and experiences closer to users and keeps threats farther away. Learn more about Akamai's cloud computing, security, and content delivery solutions at akamai.com and akamai.com/blog, or follow Akamai Technologies on X, formerly known as Twitter, and LinkedIn.
Contact
Jim Lubinskas
Akamai Media Relations
703.907.9103
jlubinsk@akamai.com
View original content to download multimedia: https://www.prnewswire.com/news-releases/bots-compose-42-of-overall-web-traffic-nearly-two-thirds-are-malicious-reports-akamai-302180377.html
SOURCE Akamai Technologies, Inc.