Since the beginning of the year I have been quietly working on something new, and it’s finally time to introduce it.
Web traffic data is critical for many businesses, but very few have a clear picture of what is really happening. When I say that 80% of traffic comes from robots, and that half of those are bad robots, people are surprised. Yet this is something we commonly see in the wild. The key problem is that the people building and operating websites are stuck with tools based on outdated paradigms.
That’s a shame, because when you look deeply at your web traffic, you can easily improve performance, reliability, security and costs. I founded Access Watch to fix that: we analyze web traffic data from anywhere and let our customers see and touch something that was previously inaccessible.
But let’s go back to where it started.
In my spare time, I’m still operating websites, one of them being the venerable BlogMarks, a social bookmarking application launched 15 years ago. One day, the performance of BlogMarks became problematic. We had a modest user base and were operating from a single server, yet somehow we had enough traffic to bring the site down. How could that be? I dove into the data, and this is what I saw:
The current experience for many website administrators: Access logs that are hard to read and impossible to decipher.
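To make the problem concrete, here is a minimal sketch of what working with raw access logs involves: parsing one line in the common "combined" format (Apache or nginx) and applying a naive robot check on the User-Agent. The sample log line and the `BOT_HINTS` keyword heuristic are illustrative assumptions, not how Access Watch actually classifies traffic.

```python
import re

# Regex for one line in the Apache/nginx "combined" log format.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Naive, illustrative keyword list; real bot detection is far more involved.
BOT_HINTS = ("bot", "crawler", "spider", "curl", "python-requests")

def parse_line(line):
    """Return a dict of fields from one combined-format log line, or None."""
    match = LOG_PATTERN.match(line)
    if not match:
        return None
    entry = match.groupdict()
    agent = entry["agent"].lower()
    # Flag entries whose User-Agent contains a common robot keyword.
    entry["is_robot"] = any(hint in agent for hint in BOT_HINTS)
    return entry

sample = ('203.0.113.7 - - [10/Oct/2017:13:55:36 +0000] '
          '"GET /index.html HTTP/1.1" 200 2326 '
          '"-" "Mozilla/5.0 (compatible; Googlebot/2.1)"')

entry = parse_line(sample)
print(entry["ip"], entry["status"], entry["is_robot"])
```

Even this toy version shows the pain: every question about your traffic starts with regexes over gigabytes of text, which is exactly the outdated paradigm described above.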