Website security and prevention must include log file analysis!

Without website log file analysis, you may never know for sure how your site was infected!

Our methods

Our log file analysis is handled by "Buns". Buns isn't a person, but a bank of servers running our software.

We learned early on that you need to know how a website was infected before you can protect it. That knowledge comes from the log files. When you see the time frame during which a website was infected and match it with the traffic coming to the site over that same period, it's relatively easy to determine how it happened.
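
As a rough illustration of that correlation step (this is a minimal sketch, not our actual tooling), the Python below pulls the requests from a standard combined-format access log that fall inside a suspected infection window. The file name, the window times, and the log format are assumptions made purely for the example.

```python
import re
from datetime import datetime, timezone, timedelta

# A typical "combined" access log line looks like:
# 203.0.113.7 - - [10/Oct/2023:13:55:36 +0000] "POST /wp-login.php HTTP/1.1" 200 1839 "-" "Mozilla/5.0"
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" (?P<status>\d{3})'
)

def parse_time(raw):
    # Access log timestamps look like 10/Oct/2023:13:55:36 +0000
    return datetime.strptime(raw, "%d/%b/%Y:%H:%M:%S %z")

def requests_in_window(log_path, window_start, window_end):
    """Yield parsed entries whose timestamp falls inside the suspected infection window."""
    with open(log_path, encoding="utf-8", errors="replace") as handle:
        for line in handle:
            match = LOG_PATTERN.match(line)
            if not match:
                continue
            when = parse_time(match.group("time"))
            if window_start <= when <= window_end:
                yield when, match.group("ip"), match.group("request"), match.group("status")

if __name__ == "__main__":
    # Hypothetical infection window, e.g. inferred from file modification times.
    end = datetime(2023, 10, 10, 14, 0, tzinfo=timezone.utc)
    start = end - timedelta(hours=2)
    for entry in requests_in_window("access.log", start, end):
        print(entry)
```

Once the traffic is narrowed to that window, the suspicious requests are usually a much shorter list to reason about.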

Without this log file correlation, you're putting your site back online (after an infection) without knowing how it happened. Therefore, you have no idea what to do to protect it.

And more importantly, without knowing how it happened, it will happen again. That's a re-infection, and we hate re-infections!

Log file analysis is not easy, especially when it's performed by automation. However, this company was not started because it was easy. It was started because someone had to help the millions of website owners with infected websites.

This is where our experience of cleaning - not just monitoring, but actually cleaning - over 300,000 websites really pays dividends for you.

Our systems know the difference between a failed login and a successful login to a website. This is important because oftentimes hackers will attempt hundreds or thousands of logins, but maybe only one is successful. If they bury that successful login among hundreds or thousands of lines in a log file, you may never see it. However, with an automated process, it never escapes our watchful eye.
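
To make the idea concrete (this is a simplified sketch, not Buns itself), the example below assumes a WordPress-style login where a failed POST to /wp-login.php typically comes back with HTTP 200 (the form re-renders) while a successful one redirects with 302. Real analysis has to cover many more login mechanisms and edge cases.

```python
from collections import Counter

def find_successful_logins(entries):
    """
    Scan parsed log entries of the form (time, ip, request, status) and
    separate login attempts into failures and the rare successes hiding
    among them.

    Assumption for this sketch: a failed login POST to /wp-login.php
    returns 200, while a successful one responds with a 302 redirect.
    """
    failures = Counter()
    successes = []
    for when, ip, request, status in entries:
        if "POST /wp-login.php" not in request:
            continue
        if status == "302":
            successes.append((when, ip))
        else:
            failures[ip] += 1
    return failures, successes

# Usage idea: feed it the output of requests_in_window() from the earlier
# sketch, then flag any IP that failed hundreds of times before succeeding.
```

An IP with a long run of failures followed by a single success is exactly the kind of needle that disappears in a manual read but stands out instantly to an automated pass.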

This is just one example of what gets analyzed by Buns.

We've invested a lot of time and resources in the website log file analysis portion of our service.

We believe it's the reason we can boast about our 0.047% re-infection rate. You see, when any site under our service is infected after we've removed the malware, we take it very personally.

After physically reading thousands of lines of log files, you quickly realize that you're no longer very effective. So we developed an automated process and unleashed it on all the log files we had saved. We created rules for reading these entries, classifying them, and then analyzing them to determine which traffic is malicious and which is just normal traffic.

Our system also had to determine which attacks were successful and which were not. Obviously, you want to fully analyze the successful attacks - which is what we do.
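
Purely as an illustration of that kind of rule-based classification (our production rule set is far larger and constantly updated), a sketch might tag requests against a few known attack patterns and then use the response status to guess whether the attempt got through. The patterns and the status logic here are assumptions made for the example, not our actual rules.

```python
import re

# A few illustrative attack signatures; a real rule set would be much
# larger and continuously maintained.
RULES = [
    ("sql_injection", re.compile(r"union\s+select|sleep\(\d+\)", re.IGNORECASE)),
    ("path_traversal", re.compile(r"\.\./\.\./")),
    ("shell_upload", re.compile(r"\.php\?cmd=|eval\(base64_decode", re.IGNORECASE)),
]

def classify(entries):
    """Label each request as normal or as a named attack, and note whether it appears to have succeeded."""
    for when, ip, request, status in entries:
        label = "normal"
        for name, pattern in RULES:
            if pattern.search(request):
                label = name
                break
        # Assumption: a 2xx or 3xx response to an attack-pattern request is
        # worth a closer look; 4xx/5xx usually means the attempt was rejected.
        suspected_success = label != "normal" and status.startswith(("2", "3"))
        yield when, ip, label, suspected_success
```

The requests flagged as suspected successes are the ones worth the deepest analysis, because they are the ones most likely to explain how the infection actually happened.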