
6 Ways Web Developers Can Improve SEO

The relationship between web developers and search engine optimization teams can be contentious. Seemingly unrelated technical decisions by developers can affect organic search traffic.

But there are huge SEO benefits to working with development staff and plugging into their release planning and test cycles. In 13 years, I have met only one developer who refused to take SEO recommendations into account.

When SEO practitioners sit down with developers to discuss the site’s revenue-generating opportunities, amazing things can happen. In this article, I will describe six ways web developers can improve SEO.

How Web Developers Can Improve SEO

SEO self-sufficiency. Identify ways in which the SEO team can become self-reliant, freeing developers from mundane work. Can you edit canonical tags yourself via bulk upload instead of asking developers to do it? Can you 301 redirect individual pages yourself as needed when promotions expire?

Look for the small tasks that take developers away from more strategic work. Ask if a solution could be worked out to allow a member of your team to perform the task instead. This will likely get those tasks done faster, since you won’t have to wait for a spot in a release cycle.
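One lightweight way to make this possible, for example, is to let the SEO team maintain a plain file of overrides that the application loads, so routine canonical and redirect changes never need a code release. Here is a minimal sketch, assuming a Node-with-TypeScript stack; the file name and CSV columns are hypothetical and would need to be adapted to your own platform.

```typescript
// Minimal sketch: load an SEO-maintained CSV of overrides so routine
// canonical and redirect changes don't require a developer or a release.
// The file path and column layout below are assumptions for illustration.
import { readFileSync } from "node:fs";

// seo-overrides.csv (hypothetical), one rule per line:
//   /old-promo-url,redirect,/current-category
//   /product/widget,canonical,https://www.example.com/product/widget
type Rule = { kind: "redirect" | "canonical"; target: string };

export function loadSeoOverrides(path = "seo-overrides.csv"): Map<string, Rule> {
  const rules = new Map<string, Rule>();
  for (const line of readFileSync(path, "utf8").split("\n")) {
    const [url, kind, target] = line.split(",").map((s) => s.trim());
    if (url && (kind === "redirect" || kind === "canonical") && target) {
      rules.set(url, { kind, target });
    }
  }
  return rules;
}
```

The application would consult this map when serving a page, issuing the 301 or overriding the canonical tag as instructed, while the SEO team simply edits and re-uploads the file.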

JavaScript links. Search engines are increasingly adept at crawling JavaScript. Google claims it can crawl anything you throw at it, although it still avoids URL fragments (hashes) and form fields.

Still, if you want to make sure that search engines can crawl your site and associate relevance and authority signals with every page, have JavaScript developers pair the destination URL with the anchor text in a real anchor element. This creates a link that behaves much like plain HTML, passing along all of the SEO signals you want.
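To make the point concrete, here is a minimal sketch of the difference, assuming a browser environment; the category URL and anchor text are made up for illustration.

```typescript
// Not crawler-friendly: the destination URL lives only inside a click
// handler, so there is nothing for a crawler to follow or to pass
// authority through.
const divLink = document.createElement("div");
divLink.textContent = "Waterproof jackets";
divLink.addEventListener("click", () => {
  window.location.href = "/category/waterproof-jackets";
});

// Crawler-friendly: a real <a href> carries both the destination URL and
// the anchor text, so link signals flow just as they would in plain HTML.
const anchor = document.createElement("a");
anchor.href = "/category/waterproof-jackets";
anchor.textContent = "Waterproof jackets";
anchor.addEventListener("click", (event) => {
  event.preventDefault();
  // Hand off to your client-side router here if you use one; the href
  // above still gives crawlers a real URL to follow.
});

document.body.append(divLink, anchor);
```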

301 redirects. Of utmost importance when migrating content or redesigning a site, 301 redirects are also necessary for day-to-day changes. For example, when you change a category name from singular to plural, the keyword in the URL probably changes as well.

Unfortunately, 301 redirects are painful for developers to write and test. Look for an automatic redirect solution for small changes like this so that every modified URL gets redirected instantly. You won’t have to remember to ask for it, developers won’t have to implement it, and it will automatically protect your organic search performance.
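One way to automate this is to keep a stable ID in the URL and compare the requested slug against the current one, redirecting whenever they differ. The sketch below assumes an Express application; the route shape, IDs, and slugs are hypothetical.

```typescript
// Minimal sketch of automatic 301s for renamed categories, assuming an
// Express app and a stable category ID kept in the URL
// (e.g. /category/12/waterproof-jackets). The data store is hypothetical.
import express from "express";

const app = express();

// Hypothetical store of current slugs, keyed by category ID.
const currentSlug = new Map<string, string>([["12", "waterproof-jackets"]]);

app.get("/category/:id/:slug", (req, res) => {
  const slug = currentSlug.get(req.params.id);
  if (!slug) {
    res.status(404).send("Not found");
    return;
  }
  if (slug !== req.params.slug) {
    // The category was renamed: send crawlers and visitors to the new URL
    // and preserve the old URL's accumulated authority.
    res.redirect(301, `/category/${req.params.id}/${slug}`);
    return;
  }
  res.send(`Category page for ${slug}`);
});

app.listen(3000);
```

With a pattern like this in place, renaming a category automatically redirects the old URL without anyone filing a ticket.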

Crawl errors. Errors occur on every site. The number of “normal” errors depends on the type of error and the size of the site. For example, if Amazon had 1,000 404 file-not-found errors, that would be a drop in the bucket compared to the enormous size of that site. But for a small ecommerce site with 100 products, even 20 errors would be a big problem.

Excessive errors lower search engines’ algorithmic assessment of a site’s quality. If your site has a high percentage of errors relative to its size, the chances are higher that a searcher will land on an error page from the search results. That makes both the search engine and your site look bad, and search engines don’t want to take that risk. So the more errors there are on your site, the less likely search engines are to rank it.

Site quality is understandably a sensitive point for developers. They want a clean site, too, and reporting errors can put them on the defensive. Avoid subjective statements such as “The site has a lot of errors” and focus on the details. Show up at the table with data from Google Search Console or your web crawler that shows which pages are producing errors, so you can move quickly to solutions.
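If it helps to verify what was reported before the meeting, a short script can re-check the URLs and print their current status codes. This sketch assumes Node 18 or later for the built-in fetch; the input file name is a placeholder for whatever export you have on hand.

```typescript
// Minimal sketch: re-check a list of URLs (for example, exported from a
// Google Search Console report or a crawl) and print any that still
// return an error status. Assumes Node 18+ for global fetch.
import { readFileSync } from "node:fs";

async function checkUrls(path = "reported-errors.txt"): Promise<void> {
  const urls = readFileSync(path, "utf8")
    .split("\n")
    .map((u) => u.trim())
    .filter(Boolean);

  for (const url of urls) {
    try {
      const res = await fetch(url, { method: "HEAD", redirect: "manual" });
      if (res.status >= 400) console.log(`${res.status}  ${url}`);
    } catch {
      console.log(`UNREACHABLE  ${url}`);
    }
  }
}

checkUrls().catch(console.error);
```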

Duplicate content. Search engine bots have a crawl budget: they spend a limited amount of time on your site. If your site is full of duplicate content, bots will waste that time crawling redundant URLs, and search engines will have a hard time identifying which pages to rank. They will also be slower to discover new content.

There are many ways to resolve duplicate content. Each method has risks and benefits. Choosing a solution that is good for SEO and easily implementable by your development team requires discussion.

An added benefit of this discussion is that your developers will likely be able to tell you what is causing the duplicate content in the first place. You may be able to fix the problem at its source.
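One common source, for example, is tracking parameters creating many URLs for the same page. Below is a minimal sketch of normalizing those URLs to a single canonical form; the parameter list is an assumption and should be adjusted to whatever your analytics setup actually appends.

```typescript
// Minimal sketch of one common fix: compute a single canonical form of a
// URL by dropping tracking parameters and normalizing case, then emit it
// in a rel="canonical" tag. The TRACKING_PARAMS list is a placeholder.
const TRACKING_PARAMS = ["utm_source", "utm_medium", "utm_campaign", "gclid"];

export function canonicalUrl(raw: string): string {
  const url = new URL(raw);
  url.hostname = url.hostname.toLowerCase();
  for (const p of TRACKING_PARAMS) url.searchParams.delete(p);
  url.searchParams.sort();                                 // stable parameter order
  url.pathname = url.pathname.replace(/\/+$/, "") || "/";  // trailing slash
  return url.toString();
}

export function canonicalTag(raw: string): string {
  return `<link rel="canonical" href="${canonicalUrl(raw)}">`;
}

// Both variants collapse to the same canonical URL.
console.log(canonicalUrl("https://www.Example.com/shoes/?utm_source=mail"));
console.log(canonicalUrl("https://www.example.com/shoes?gclid=abc123"));
```

Whether you consolidate with canonical tags, redirects, or parameter handling is exactly the kind of decision worth making together with the development team.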

Speed and security. Google values site speed and security; both are part of the ranking algorithm. And SEO practitioners need to rely on development teams to improve these areas. Remember that speed and security are also customer experience issues. Your development team might already have them on the roadmap.

In the case of site speed, developers probably have people pestering them from across the company. Adding your voice to the mix is unlikely to produce instant results. Instead, ask about their plans to improve the situation.

Also educate yourself on what it takes to implement these changes, since they are not quick fixes, so you can discuss them in an informed manner. For example, contributor Hamlet Batista wrote an excellent article on the switch to HTTPS. And all kinds of tools, like Google’s PageSpeed Insights, can recommend how to improve loading times.
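If you want to track loading times over time rather than checking a web tool by hand, the PageSpeed Insights API can be queried from a short script. This sketch assumes Node 18+ for the built-in fetch; the response field names reflect my reading of the v5 API and should be verified against Google’s documentation.

```typescript
// Minimal sketch: pull a couple of headline numbers from the PageSpeed
// Insights API (v5). Field names are my understanding of the v5 response;
// verify against the official docs. An API key is optional for light use.
const ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";

async function pageSpeed(url: string): Promise<void> {
  const res = await fetch(
    `${ENDPOINT}?url=${encodeURIComponent(url)}&strategy=mobile`
  );
  const data = await res.json();
  const lh = data.lighthouseResult;
  console.log(`Performance score: ${Math.round(lh.categories.performance.score * 100)}`);
  console.log(`Largest Contentful Paint: ${lh.audits["largest-contentful-paint"].displayValue}`);
}

pageSpeed("https://www.example.com/").catch(console.error);
```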