Mars Bot – The Internet’s Most Active (and Stupid) Brute Force Robot

Brute force attacks are repetitive attempts to guess the login credentials of a website’s user accounts or administrative backend. Once an attacker gains access to an account, they may be able to hijack the website, steal valuable data, or alter user-facing content.

As you might expect, guessing a username and password combination can require thousands or millions of attempts before being successful. For this reason attackers don’t execute brute force attacks by hand; instead they deploy scripts – known as bots – to automatically cycle through combinations day and night until finally guessing correctly. Compared to other modern cyber attacks, brute force scripts are simplistic, with no advanced mathematics or AI, but they nevertheless pose a significant risk to unsecured websites.

At Access Watch we’re tracking all types of robotic traffic across our network of websites. Brute force attacks are one of the most common threats to websites – and the Mars bot is currently the most active of them all! Amazingly, Mars was present on 54% of the websites plugged into Access Watch in August. That level of activity beats out bots from common and harmless services like Facebook, Yahoo and Twitter.

What exactly is Mars doing?

Across different websites Mars displays a fairly consistent pattern of behavior. Initially, only 2 to 3 requests are made as it assesses the login form, without the bot carrying out any attempts at username and password combinations. These initial checks are done at scale across a very large volume of websites.

Following the initial probe, Mars will begin to brute force a page’s login form. However, not all websites it first visits are targeted for attack: in August, actual Mars attacks were measured on 43% of the websites it reached. Brute force attempts are spread across hundreds or thousands of IP addresses, with an average of 4 attempts made from one IP before the bot moves on. This strategy is a simplistic attempt to evade IP blacklists, login security rules and filters.
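One consequence of this distribution trick is that counting failures per IP misses the attack entirely (4 attempts per IP looks harmless), while counting failures per targeted account still catches it. A minimal sketch of that per-account idea, with hypothetical names and thresholds – not a production lockout system:

```javascript
// Track failed logins per account within a sliding time window,
// regardless of which IP the attempt came from.
class AccountLockout {
  constructor(maxFailures = 10, windowMs = 60 * 60 * 1000) {
    this.maxFailures = maxFailures;
    this.windowMs = windowMs;
    this.failures = new Map(); // username -> array of failure timestamps
  }

  recordFailure(username, now = Date.now()) {
    // Keep only failures still inside the window, then add this one.
    const recent = (this.failures.get(username) || [])
      .filter((t) => now - t < this.windowMs);
    recent.push(now);
    this.failures.set(username, recent);
    return recent.length;
  }

  isLocked(username, now = Date.now()) {
    const recent = (this.failures.get(username) || [])
      .filter((t) => now - t < this.windowMs);
    return recent.length >= this.maxFailures;
  }
}
```

With this approach, a thousand IPs making 4 attempts each against `admin` still trips the same counter, which is exactly the case per-IP blacklists miss.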

In most cases the brute force attack is sustained for a few hours. If Mars has failed to guess the correct combination by then, the bot moves on to its next target.

What happens if you’re hacked by Mars?

Brute force robots attempt to take over websites or user accounts. By correctly guessing the admin username and password for a website, Mars can immediately adjust user permissions, thereby hijacking the website.

  • Alter content & spam – with administrative access the bot may be free to change content and features on your website, severely damaging your brand’s reputation or user experience.
  • Steal data – depending on the type of account accessed, private or sensitive data belonging to your company or users can easily be stolen.
  • Virus distribution – an attack can be set to trigger downloads of viruses or malware by your users.
  • Redirect – a new URL can be set as the destination for traffic intending to land on your webpage.

Additionally, many people reuse login credentials across multiple websites and applications. A correctly guessed username and password combination can subsequently be used to gain access to additional accounts or websites.

Brute force bots targeting the accounts of your customers or users also pose a significant risk to your business. Private and valuable data is often stolen when a bot forces its way into these accounts. This can include personal identity information, credit card, bank and other payment-related data.

Prevent brute force attacks.

Access Watch provides a security layer against brute force attacks. Threats identified in your website traffic are used to form a complete robot signature. From there, the entire entity is prevented from carrying out its attack. As opposed to simple IP address blacklisting, this method is far more effective at defending against brute force attacks distributed over thousands of IP addresses.
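To illustrate the difference (this is a toy sketch, not Access Watch’s actual implementation): a signature is derived from request traits the whole botnet shares, so one block covers every IP the bot uses. The field names below are assumptions for the example.

```javascript
// Build a simple behavioral signature from traits a distributed bot
// tends to share, rather than keying on the (ever-changing) IP.
function signatureOf(req) {
  return [req.userAgent, req.path, req.method].join('|');
}

const blockedSignatures = new Set();

// Blocking one observed attack request blocks the whole entity.
function blockEntity(req) {
  blockedSignatures.add(signatureOf(req));
}

function isBlocked(req) {
  return blockedSignatures.has(signatureOf(req));
}
```

A per-IP blacklist would need thousands of entries (and would always lag behind fresh IPs); the signature approach stops the same bot arriving from an address it has never used before.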

Brute force bots are a threat to nearly all websites today. At Access Watch, we’re happy to help. Sign up now and start efficiently blocking these attacks. Let us know what you think!

 

Access Watch teams up with Sematext!

We’re happy to announce Sematext as a launch partner for Reveal, our embeddable traffic intelligence, initially targeting ELK & other log management platforms.

We started discussing the idea of Reveal in our networks and the community. Early on, we got some great feedback from the Sematext team and quickly decided to partner on bringing the Access Watch intelligence to Logagent, a cool new open source, JavaScript-based log shipper they’ve initiated.

Reveal + Logagent

Access Watch’s Reveal easily plugs into your existing log setup to augment your log data with robot detection and traffic intelligence.

The first version of Reveal is based on the Access Watch API and is available as a plugin (or integration). Each time a new identity is detected in the logs, the API is queried to get the data necessary to augment the logs. To preserve performance, data is cached.
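The cache-then-query pattern described above can be sketched in a few lines of JavaScript. Here `fetchIdentity` stands in for a call to the Access Watch API; its name and the shape of the log entries are assumptions for the sketch, not the real client.

```javascript
// Wrap an identity-lookup function with a cache so each distinct
// identity triggers at most one API call, however many log lines it has.
function makeAugmenter(fetchIdentity) {
  const cache = new Map(); // identity key -> augmentation data

  return async function augment(logEntry) {
    const key = `${logEntry.ip}|${logEntry.userAgent}`;
    if (!cache.has(key)) {
      // Only a previously unseen identity reaches the API.
      cache.set(key, await fetchIdentity(logEntry));
    }
    return { ...logEntry, identity: cache.get(key) };
  };
}
```

Since high-traffic robots generate huge numbers of requests from a small set of identities, the cache absorbs almost all lookups and keeps the log pipeline fast.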

If you’re already using Logagent, or planning to, activation is easy, as you can see in our setup instructions: get an API key, add the source configuration and the destination configuration, and that’s all.

Once set up correctly, you will end up with augmented requests looking like this:

 

After that, you’ll be able to configure dashboards in Kibana to capitalize on the newly added data.

 

About Sematext

Sematext Cloud is an all-in-one logging tool with infrastructure monitoring as well as APM with transaction tracing. Get alerts, including anomaly detection, via popular services such as PagerDuty, Slack, HipChat, OpsGenie and more. For those needing an on-premises solution, Sematext Cloud has a cousin named Sematext Enterprise that keeps all data local.

About Access Watch

Access Watch provides robot detection and traffic intelligence services for websites and online businesses, helping them increase security and decrease risk and costs associated with bad and malicious traffic.

Introducing Reveal, our new product to augment web traffic logs

Over the last few months, we received a lot of feedback asking how Access Watch could plug into existing log infrastructures. Many teams have effective strategies in place with tools like ELK (Elasticsearch / Logstash / Kibana) to index their access logs which can be further leveraged for valuable insights.

This vast amount of existing log data is an often underutilized source for robot detection and intelligence. A core reason for this: the logs are indexed but not analysed. Many questions remain for teams to solve themselves: What is behind each request? Is it a robot or a human? Is it malicious or suspicious? What is it trying to achieve, and how does that affect my website or application?

Similar to popular services such as GeoIP and User Agent parsing, Access Watch’s Reveal easily plugs into your log infrastructure to augment your logs with robot detection and traffic intelligence.

Because this product is quite different from our dashboard, we’ve decided to use a new name for it, finally settling on “Reveal”, because with this service, Access Watch reveals what’s really behind each request.

How does it work?

The first version of Reveal is based on the Access Watch API and is available as a plugin (or integration) for log management solutions. Each time a new identity is detected in the logs, the API is queried to get the data necessary to augment the logs. To preserve performance, data is cached. In this setup, logs stay in your infrastructure and are not communicated to us.

Reveal delivers robot detection, request reputation, threat analysis and access to the most exhaustive robot database on the market.

Next Steps

In the days and weeks to come we look forward to integrating with many more log management solutions in the space. We’ll be optimizing our strategy for caching and API calls to ensure the service scales for larger-volume customers. Our team has also been working on new Kibana visualizations for our robot data, and we’ll be open sourcing those for the community very soon!

If you have any questions, or there is anything you’d like to see added to Reveal, make us happy and get in touch with your feedback!

Get started now with Reveal.

A world of good and bad robots. Introducing our Robot Database!

A few weeks ago at Access Watch, we quietly launched our Robot Database. This section of our website is an opportunity for us to help you understand how important robot activity has become, what these robots do as well as their effect on websites and web APIs.

What are the key insights from our Robot Database?

More than 80% robots

Today, on average across the 1,000+ websites using Access Watch, 81.95% of the traffic is made by robots. Who thought websites were only meant to be used by humans?


Introducing Access Watch!

Since the beginning of the year I have been quietly working on something new, and it’s finally time to introduce it.

Web traffic data is critical for many businesses, but very few have a clue what is really happening. When I say things like 80% of traffic is made by robots and 50% by bad robots, people are surprised. However, this is something we commonly see in the wild. The key problem here is that the people building and operating websites are stuck with tools based on outdated paradigms.

That’s a shame, because when you look deeply at your web traffic, you can easily improve things like performance, reliability, security and costs. I founded Access Watch to fix that, analyzing web traffic data from anywhere and allowing our customers to see and touch something that was previously inaccessible.

But let’s go back to where it started.

Inception

In my spare time, I still operate websites, one of them being the venerable BlogMarks, a social bookmarking application that launched 15 years ago. One day, the performance of BlogMarks became problematic. We had a modest user base and were operating from a single server, but somehow we had enough traffic to bring the site down. How could that be? I dove into the data, and this is what I saw:

The current experience for many website administrators: Access logs that are hard to read and impossible to decipher.
