15,000 bad bot requests per minute were causing slowdowns and a degraded user experience for global real estate platform Lamudi.
Founded in Berlin, Germany, Lamudi operates a global property platform focused on emerging markets, available in more than 30 countries across Asia, Africa, the Middle East and Latin America.
It provides sellers, buyers, landlords and renters a secure and easy-to-use platform to find or list properties online.
Lamudi was using IP blocking techniques to block bad bots across different regions of their web application infrastructure.
Chief Technology Officer Oliver Feige says Amazon Web Services’ (AWS) Web Application Firewall (WAF) was used in Latin American markets such as Mexico, Colombia and Peru, whereas a built-in Linux firewall was used in other regions.
“But a growing scraping and spam problem prompted the IT team to look for a more sophisticated bot detection and mitigation solution,” Feige says.
“Our normal traffic is about 1,000 requests per minute, but suddenly we were getting more than 15,000 requests,” Feige says. “It was obvious bots were scraping our site.”
Not only was new business in jeopardy; the bad bots also threatened to cause slowdowns and downtime.
“At the beginning we went offline as a result of the web scraping,” Feige says. “Later in the year we were able to answer each request, but response times were slow and it was getting expensive to manage.”
Slower performance gave competitors an argument that customers should turn to rival sites. The scrapers were stealing information, spamming listing agent contact forms and analysing listing inventory to try to gain a competitive advantage.
“Our business model is selling leads,” Feige says. “We offer listings on our platform, and buyers search our platform to find them. When they see a property they’re interested in, they contact the listing agent through our website.”
But competitors and spammers were using bots to send unwanted messages to Lamudi's listing agents.
“The bots were sending thousands of spammy messages through our listing agent contact forms,” Feige explains. “Manually blocking IP addresses was taking a toll on operations. Typically, two full-time employees were dedicated to manually blocking the bad bots, which proved to be an expensive, ineffective approach.”
Feige’s team tried to throw hardware at the problem, wrote scripts to monitor the web logs and analysed bots to find offending IP addresses.
“When we found them, we’d block them from the firewall, but as you can imagine, there are many IP addresses out there, and scrapers simply use another AWS account to start scraping again,” Feige says.
“Plus, in countries such as Bangladesh or Pakistan, there aren’t any fixed IP addresses. Instead, they use an EFM modem or something similar, and each time they log in, they get a different address. So blocking IP addresses based on number of requests was blocking our local offices and legitimate users.”
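The rate-based approach the team describes can be sketched in a few lines (a minimal illustration, not Lamudi’s actual tooling; the threshold and access-log format are assumptions):

```python
from collections import Counter

# Hypothetical threshold: flag any IP exceeding this many requests per minute.
REQUESTS_PER_MINUTE_LIMIT = 1000

def find_offenders(log_lines):
    """Count requests per client IP (assumed to be the first field of each
    access-log line) and return the set of IPs exceeding the limit."""
    counts = Counter(line.split()[0] for line in log_lines if line.strip())
    return {ip for ip, n in counts.items() if n > REQUESTS_PER_MINUTE_LIMIT}

# The flaw Feige describes: on networks without fixed IPs, a scraper simply
# reconnects to get a fresh address, while an office NAT that funnels many
# legitimate users through a single IP trips the threshold and gets blocked.
```

The design choice that fails here is treating one IP as one actor, which holds for neither dynamic consumer connections nor shared office gateways.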
Lamudi needed a solution that went beyond IP blocking – one that could whitelist the same URLs across each account and each domain. Additionally, the solution had to integrate well with Akamai.
“Akamai uses the Dynamic Site Accelerator (DSA) protocol, which routes packets optimally to accelerate performance,” Feige says. “We leverage this technology, so whatever solution we chose would have to accommodate a complex configuration that connected our AWS instances to Akamai.”
Feige’s team deployed Distil Networks’ software and knew they’d made the right call.
Distil’s multi-faceted approach to bot blocking includes whitelisting IP addresses from Lamudi’s offices with dedicated IP addresses, as well as whitelisting their performance monitoring tools.
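A whitelist of this kind amounts to a network-membership check, which can be sketched as follows (the office and monitoring ranges shown are made-up documentation addresses, not Lamudi’s real ones):

```python
import ipaddress

# Hypothetical allowlisted ranges: office egress IPs and monitoring tools.
ALLOWLIST = [
    ipaddress.ip_network("203.0.113.0/24"),   # example office range (TEST-NET-3)
    ipaddress.ip_network("198.51.100.7/32"),  # example monitoring probe
]

def is_allowlisted(client_ip: str) -> bool:
    """Return True if the client IP falls inside any allowlisted network,
    so it bypasses bot-detection rules entirely."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWLIST)
```

Whitelisting by dedicated range rather than by observed behaviour is what keeps offices and synthetic-monitoring probes from being caught by the same rules that stop scrapers.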
Implementation took about a month and a half, and since then Distil’s solution has been live in all 30 countries Lamudi serves.
Lamudi has deployed Distil’s Cloud CDN in most regions, and the Distil Appliance in Asia.
“In Latin America, Africa and the Middle East, the public cloud has no latency issues, but in East Asia, we decided to co-locate the Distil Appliance next to our AWS infrastructure to provide the performance we needed,” Feige explains.
He adds Distil allows Lamudi to create various automated tests to help identify clients eligible for whitelisting.
“We increased website performance by eliminating bad bot traffic. The biggest benefit of Distil is that the bad bots are gone. No more slowdowns, and it’s easier to monitor and plan out our infrastructure requirements for optimal website performance.
“We don't have a bot problem anymore,” he says. “Distil is the best anti-bot and anti-scraper protection solution available, hands down.”
Feige says his team has been able to reclaim time spent addressing listing agent complaints related to form spam.
“The bots spamming our customers were out of control,” he says.
“Before Distil, we were spending 20 to 30 hours of IT operations time investigating and blocking bots in our Web Application Firewalls.
Now, it’s all automated. Distil also eliminates the time we would spend getting systems up and running again after an application denial of service caused by a flood of bad bot traffic.”