Analyzing websites is a major part of what I do for a living. The wonderful Screaming Frog crawler and Ahrefs have been trusty tools in my software toolbox.
Yet despite having these powerful tools, I found myself finishing lengthy website audits with too many unanswered questions.
A few questions I often get from clients, and often ask myself, are:
- How should I organize these pages?
- How should I link pages to other pages?
- How are the pages related to each other?
If you have 20 pages in a spreadsheet, answering these by hand is fairly easy. With 2,000 pages it becomes challenging. With 20,000 pages, it's impossible within most budgets.
Seeking answers to these questions is why I decided to build a piece of software to help with website analysis.
What my software helps with
It's a content analysis application that helps me understand the role context plays on your website.
In short, it crawls your website and uses hand-coded logic and machine learning to organize your pages based on what they say. It also helps map and optimize internal link structures using a variety of metrics, on top of the standard SEO checks you'd find in a crawler like Screaming Frog. It then exports everything to a spreadsheet.
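The actual grouping logic in my software is a mix of hand-coded rules and machine learning, but the core idea of "organize pages by what they say" can be illustrated with a minimal sketch. The example below is purely hypothetical (the page texts and threshold are made up, and my tool does not work exactly this way): it tokenizes each page's text, measures word overlap with Jaccard similarity, and greedily groups pages that resemble each other.

```python
import re


def tokens(text):
    """Lowercase word set for a page's text."""
    return set(re.findall(r"[a-z]+", text.lower()))


def jaccard(a, b):
    """Word-overlap similarity between two token sets (0..1)."""
    return len(a & b) / len(a | b) if a | b else 0.0


def group_pages(pages, threshold=0.3):
    """Greedy grouping: add each page to the first group whose seed
    page it resembles, otherwise start a new group."""
    groups = []  # each group is a list of (url, token_set) pairs
    for url, text in pages.items():
        t = tokens(text)
        for group in groups:
            if jaccard(t, group[0][1]) >= threshold:
                group.append((url, t))
                break
        else:
            groups.append([(url, t)])
    return [[url for url, _ in group] for group in groups]


# Toy crawl output: URL -> extracted page text (illustrative only).
pages = {
    "/red-widgets":  "buy red widgets online best red widget prices",
    "/blue-widgets": "buy blue widgets online best blue widget prices",
    "/about-us":     "our company history team and mission statement",
}

print(group_pages(pages))
# → [['/red-widgets', '/blue-widgets'], ['/about-us']]
```

In practice you'd swap the toy texts for real crawled content and the Jaccard measure for something stronger (TF-IDF or embeddings), but even this crude version shows how the two widget pages cluster together while the about page stands apart.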
The best part is that as I spot new trends and learn from website audits, I add new features to the software.
My end goal is to get the software to the point where it can replace an SEO by exporting an easy-to-understand spreadsheet you can use to optimize your website.