
4 Ways Independent Website Analysis Can End Up Misleading You

In the quest to get the best website possible, we often turn to independent website analysts. While these companies will provide you with an audit of your site and a grade for how it performs against specific criteria, the results are less helpful than they seem. This information, devoid of context, can mislead as often as it informs, leaving you with a distorted idea of how effective your site is. Here are four ways that website analysis can end up misleading you.

1. They see everything in black and white

Website analysis companies use automated scans to audit pages. This means that every criterion is judged on a pass/fail basis, with absolutely no room for degrees. It’s a binary proposition: either a site adheres to a specific standard and it passes, or it doesn’t and it fails. Any web designer or SEO specialist will tell you that this isn’t how reality works.

It’s a commonly accepted “rule” that websites have to load in three seconds. While it is true that a site’s bounce rate rises rapidly after the first three seconds, it is emphatically not true that this is a brick wall. Imagine if, as soon as a load time hit three seconds, 100% of visitors bounced. That would make a site that loaded nanoseconds longer as bad as one that didn’t load at all, which of course is utter nonsense. Still, most automated website audit tools give a site that loaded in 3.02 seconds a 0% rating. While technically accurate for the criterion they are judging, it’s an incredibly misleading representation of the overall quality of the site. These sorts of assessments can end up fleecing a dealership into spending hundreds of dollars to fix something that wasn’t broken in the first place.
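To make the difference concrete, here is a minimal sketch of the two scoring philosophies. The function names, the 3-second threshold, and the 20-points-per-second decay rate are all invented for illustration; no audit vendor publishes its actual formula.

```python
def binary_audit_score(load_time_seconds: float) -> float:
    """How a pass/fail scan grades load time: all or nothing."""
    return 100.0 if load_time_seconds <= 3.0 else 0.0


def graded_audit_score(load_time_seconds: float) -> float:
    """A graded alternative: the score decays smoothly past the threshold."""
    if load_time_seconds <= 3.0:
        return 100.0
    # Assumed decay rate: lose 20 points per second beyond three seconds.
    return max(0.0, 100.0 - 20.0 * (load_time_seconds - 3.0))


for t in (2.9, 3.02, 4.5, 8.0):
    print(f"{t:>5}s  binary: {binary_audit_score(t):5.1f}  graded: {graded_audit_score(t):5.1f}")
```

Under the binary scheme, the 3.02-second site scores 0.0; under the graded scheme it scores 99.6, which is a far more honest picture of a site that is, for all practical purposes, fine.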

2. They view customer preferences as bugs rather than features

Any web design company fundamentally answers to its customers. Therefore, it’s possible, even probable, that a specific site won’t follow best practices according to a website analysis company’s audit because a customer insisted on a non-optimized feature. Maybe that feature works better in their particular market, maybe the rule in question is based on outdated or flawed data, or maybe the customer is flat-out wrong. It’s only the place of the design company to advise. As soon as the customer insists on a feature, it’s going in.

Remember, the scans are automated, so they don’t look for context or preference. For example, they will check to see if VDPs (vehicle detail pages) have a section for fees, taxes, delivery, and scheduling. If they do, the analysis spits out a perfect score. If not, it’s the dreaded zero. However, this binary approach ignores the fact that many dealers don’t want fees and taxes listed, while others prefer a cleaner look for their site and omit other sections entirely. So what amounts to a personal choice by an individual customer gets a single pass/fail grade.
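A sketch of what a preference-aware check could look like, contrasted with the pass/fail version. The section names and the dealer-preference set are hypothetical; real audit tools don’t expose their checks this way.

```python
VDP_SECTIONS = ("fees", "taxes", "delivery", "scheduling")


def binary_vdp_score(page_sections: set) -> float:
    """The pass/fail approach: every section present, or zero."""
    return 100.0 if all(s in page_sections for s in VDP_SECTIONS) else 0.0


def preference_aware_vdp_score(page_sections: set, dealer_wants: set) -> float:
    """Only grade the sections the dealer actually asked for."""
    if not dealer_wants:
        return 100.0  # nothing requested, nothing to fail
    present = sum(1 for s in dealer_wants if s in page_sections)
    return 100.0 * present / len(dealer_wants)


# A dealer who deliberately omits fees and taxes for a cleaner look:
page = {"delivery", "scheduling"}
print(binary_vdp_score(page))                                        # 0.0
print(preference_aware_vdp_score(page, {"delivery", "scheduling"}))  # 100.0
```

The difference isn’t a smarter algorithm so much as a willingness to ask what the dealer actually wanted before declaring the page a failure.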

The presence of two chat buttons can also trip this bug. While an analysis company would give that a zero rating for violating what is essentially an arbitrary rule of design, the customer who demanded the two chat buttons (and the increased loading time that came with them) would be more than happy with the result. The same is true if a dealership happens not to want an amenities page. It’s an option that’s available, but if it’s not selected, it’s not going to be present.

The result is that the analysis company is holding a website to a standard that runs counter to the customer’s demands and, more than likely, their market.

3. They lack transparency

Sometimes a company’s criteria for passing or failing a specific test are baffling. They don’t share their methods with customers, so it ends up being left to guesswork, not to mention that individual auditors often hold websites to different standards.

Some web analysis companies rate homepage banners on their relevancy to the viewer. What determines relevancy? Your guess is as good as mine. There’s no attempt to explain it either before or after the audit.

As dealership sites are fundamentally marketplaces, independent website analysts will rate sales content. The way they do this is through MSLPs, or model-specific landing pages. However, even identifying the presence of MSLPs can go wrong: pages can be missed in the crawl, producing a score of zero when a perfect score is warranted.

So even if a site fails in some regard, it’s unclear why. Without this information, it’s impossible to act to improve those factors, which makes the grades pretty arbitrary.

4. They don’t take into account the existing landscape of SEO

There are two templates for searches that will lead to a dealership’s site. The first is “[make] [model] near me,” and the second is “[year] [make] [model] [city].” The first is always going to be taken up by big-box listing sites like Carfax and Autotrader. The second is where a single dealership’s SEO practices have a real chance of reaching the top. Guess which phrasing most audits test? This penalizes a dealership for not outranking the big-box sites, and/or tricks them into thinking that another design company could “fix” these “ranking issues,” leading dealerships to jump from provider to provider in search of a solution to a problem they don’t actually have.

These automated scans wouldn’t be bad on their own if they were approached as one of several tools, bolstering a human auditor’s own research and critical thinking. Instead, they tend to be used as panaceas, holding sites not only to unattainable standards, but to standards that the customers themselves might actively oppose.

Author Justin Robinson-Prickett

Justin Robinson-Prickett is a content writer from Los Angeles with over a decade of experience in the auto industry under his belt. When not working, he enjoys fencing, re-editing dialogue in old movies to remove articles, and playing with his two dogs James Westphal and Dr. Kenneth Noisewater.


