Do you know why most web developers drown? Think! Think! Because they have “too many anchors.” I cracked up so hard when one of my friends said this. You won’t believe it, but he really meant this. And then we went on to discuss the miserable lives of web developers, including his, and of course, mine. 😉
While we were sharing our ideas, I touched the aching veins, the inflamed ones that were actually making web developers’ lives miserable. How often do you feel you’ve submitted your best work, but the project doesn’t quite perform for your clients? And it’s disappointing when you have high hopes for the quality of work you’ve delivered.
“But my projects always deliver for clients,” did you just say that to yourself? Yeah, yeah! Great if they do. But most of us belong to the same school of thought, my… dear… fellow… web… developing… brethren. You’re most probably making the mistakes most of us make, unconsciously.
Anyway, we discussed how a client’s business depends on the way we work on their projects. Miss a few specific points and factors, and bam! The demolition begins!
So you must already be wondering what on the fricking earth we were talking about. Think hard! It’s something most web developers miss! I mean… most of us.
SEO! Yes. We conveniently overlook these SEO factors because we’re web developers.
We aren’t usually the ones that handle basic SEO because we’re web developers. Right? Wrong! There’s hardly any debate left on whether web developers should know about SEO factors.
You should, if you want to lay a strong optimization foundation for your clients. Trust me; the argument was buried long ago. Not just rotting dead, but real dry-bone dusty dead.
So what is SEO supposed to mean for us? Well, it’s mostly on-page SEO. Since we’re developing a website, we have to take care of the on-page SEO elements.
I’m sure you’d want to get over these aching veins. Don’t you? Here are the SEO factors my friend and I discussed, and we want you to listen closely. Implementing them in your websites will help you avoid technical hiccups. Ready to know them? Keep scrolling!
SEO factor #1: The robots.txt file hack
I was telling my friend how vital the teeny tiny text file was. It’s ironic that web developers looking to boost on-page SEO overlook robots.txt completely, that little gatekeeper of crawl access.
Not that these text files are hard to work with! I think web developers often slip up on easy things because wrestling with complex problems has become their forte; robots.txt feels like threading a needle to them. So, coming back to the point: why do these text files matter?
A robots.txt file lets you talk to search engine crawlers in their native language. Let’s say you want to keep Google’s bots from crawling your site. How would you do it?
“What? Why would I want to keep them away? Isn’t it what I’m looking for?”
Well, right. But here’s a secret (read this in a soft, whispering voice): there are times when you don’t want them sneaking onto your website, mostly when it’s under construction.
So we use the text file to tell the bots which pages we want them to crawl and which pages we don’t want them to visit.
When a web spider is about to visit a page, it checks the robots.txt file first. Here’s one simple example of a robots.txt file.
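The classic two-line skeleton:

```text
User-agent: *
Disallow: /
```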
This is the basic skeleton of robots.txt: the asterisk means the instruction applies to all web robots, and the forward slash after Disallow tells crawlers not to visit any page.
There are times when your website has a lot of webpages. Don’t believe me? Just hop around your website once; you’ll be surprised. With that many pages, crawlers take longer to navigate the website, which eventually hurts your ranking.
Search engine bots have budget and time constraints too! Just as you can’t wander every street of LA in a day, crawlers don’t want to visit too many pages at once. Basically, you want them going through the most valuable pages, if not all of them. Robots.txt also helps you define which pages you want the bots to skim.
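A slightly more selective sketch: the paths and domain here are illustrative, but the idea is to fence off low-value sections and hand the bots a sitemap of the pages that matter.

```text
User-agent: *
Disallow: /cart/
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```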
And yeah, don’t forget to let the bots crawl once your site is complete. You want the house inspected before the guests arrive, don’t you?
SEO factor #2: Mobile-first development
So my friend said he once championed making desktop websites. Getting traffic, leads, and conversions was hardly a task.
But suddenly, the websites’ rankings started to drop; the ratings dived to a new low, as if the desktop agents were off the payroll.
And then came the realization. He was slipping on the trendiest thing: mobile-first development. He was unaware that search engines had predominantly switched to mobile-first indexing for indexing and ranking.
You may also be living in the past like my friend; I mean… you haven’t gotten out of your desktop habits yet. I hope you’re not an Internet Explorer guy! That’s still a decade behind. #Facepalm
So my point is: the majority of users have migrated to mobile devices, and they run their Google searches there. Google’s smartphone agent crawls your pages, and indexing and ranking are based primarily on the mobile version. So how do you comply with the bots?
- Make sure Googlebot can access and render your mobile content.
- Keep the meta robots tags the same on mobile and desktop.
- Don’t lazy-load primary content that only appears on user interaction.
- Let the crawlers access your resources (CSS, JavaScript, images).
- Ensure the content on mobile and desktop is the same.
- Check your structured data: use correct URLs in it, and keep it consistent across both versions.
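The parity basics can be sketched in two lines of HTML; the meta robots value here is just an example:

```html
<!-- Responsive viewport so the smartphone agent renders the page properly -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- Keep meta robots identical on the mobile and desktop versions -->
<meta name="robots" content="index, follow">
```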
SEO factor #3: Write meaningful URLs
URLs are as useless as the “ay” in “okay.” That’s what both of us used to think. Who cares what the URL is? Crawlers are machines anyway! Whether we keep it as “marketing gimmick slash slash blah blah” or “jhwshhpsnhsvbgd slash slash hhbsijaba,” who cares? They’d read it anyway. No? False!
We’re not doing all this just to appease the crawlers. We have another entity called customers, if you remember. And if you don’t please the customers, you’ll never succeed in appealing to the crawlers. So as a web developer, you have to optimize URLs for a better customer experience.
The webpage URL should be descriptive enough to be enticing and informative before visitors even click through. Consider it a compact yet compelling piece of copy, slashes, dashes, and all.
M-E-A-N-I-N-G… it’s the effing meaning you have to give the URLs for your customers. So why is giving sense to URLs a vital SEO factor? Because an accurate, enticing, well-structured URL enhances the visitor experience and helps search engines make decisions. And how do you achieve it? Keep reading!
- Describe the content… in the URL properly, so users can guess the webpage’s content accurately.
- Use keywords… in the URL. Try to fit the primary keyword near the start of the link.
- Remember dashes? I meant hyphens. You could use underscores (_), but search engines treat hyphens (-) as word separators.
- Put the caps off… yeah, I know you want to draw the bot’s attention. People who use uppercase either want to make a point or don’t know how to turn off Caps Lock. But URLs can be case-sensitive, so uppercase and lowercase versions of the same path may count as duplicate URLs. To avoid duplicates, keep it lowercase.
- Short URLs… are easy to remember, type, and share. Keep ’em crisp. Bots can mistake too many words for keyword stuffing. Result? Bye-bye, rank.
- Static URLs… are the truly SEO-friendly URLs. Avoid confusing parameters like “?,” “=,” and “&.” Static URLs are readable by both users and crawlers.
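The bullets above boil down to a few mechanical rules, so here’s a tiny hedged sketch of them in Python; `slugify` is a made-up helper name, not part of any framework:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a short, lowercase, hyphen-separated URL slug."""
    slug = title.lower()                     # lowercase avoids duplicate-URL casing issues
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # anything that isn't a letter or digit becomes a hyphen
    return slug.strip("-")                   # no leading or trailing hyphens

print(slugify("10 SEO Factors Web Developers Miss!"))  # -> 10-seo-factors-web-developers-miss
```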
SEO factor #4: Unnecessarily large images
So my friend and I were taking a selfie when my phone groaned in despair: “Insufficient storage, please clear some memory.” Quite ironic, since we had just discussed how large images affect page ranking.
Since I was a self-proclaimed cottage photographer, I thought putting up quality images did the job. Well, they do, provided there’s no barrier to entry for the search engines and viewers.
Would you stick around on a cumbersome website? I mean… if the web developers had thrown up barriers with massive images, how would you get into the site? With a slow page load time, you’d definitely reconsider your options. Right?
Technically, the bots consider the size of your images while weighing the SEO value of your page. No matter how goodly-woodly your intent, massive images drag the website down. And that eventually frustrates everyone, including the “non-emotional” Google.
So we have to fix the images to optimize the page load and overall website performance. How to do it?
- The size of the image… is dicey. Too high a quality slows the page load; too low makes the image fuzzy. Getting the right balance is the key! Keeping images below roughly 500 KB is usually a safe, effective call.
- Again, the size of the image… literally. A solid default for mobile is 640 by 320 pixels. You can exercise other dimensions, but sticking to a consistent, mobile-friendly size gives the best results.
- Consistency in image sizing… is a must. The display size and the actual size of the image should be almost the same; otherwise you’re shipping extra bytes and hurting your site speed unnecessarily.
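To keep the display size and the file honest with each other, a minimal HTML sketch (the filename and dimensions are illustrative):

```html
<!-- Explicit width/height matching the display size prevents layout shift
     and keeps the browser from scaling an oversized file down -->
<img src="/images/storefront-hero.jpg"
     width="640" height="320"
     alt="Storefront hero banner">
```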
SEO factor #5: Select a lightweight development theme
One thing my friend and I had in common was our love for complex themes. Who doesn’t love unique development themes? They make clients stand out from the crowd.
But here’s the catch: not all development themes think of our sales. Yes, aesthetically, they may lead the pack. But performance-wise, they’re a big no-no.
Fortunately, both of us had realized this a few months ago, along with the distaste these themes create in Google’s bots. And as WooCommerce developers, we had both been through the same pain of broken SEO! So what did I do to avoid this?
Nothing much. I used themes built on research and best practices. One of them was Shoptimizer, a fast WooCommerce theme with tons of advanced features that don’t disrupt page load.
Such is the theme’s greatness that it delivers a healthy Google PageSpeed Insights score, better SEO ranking, and better conversion. Here’s why you… yes, you… need themes like Shoptimizer as a web developer.
- Minified and critical CSS… the theme automatically minifies its CSS, making your client’s website fast.
- Perfect performance score… in your hands. Run a Shoptimizer website through a Google Lighthouse audit and watch your score improve.
- Focus on checkouts… the theme removes the header, the footer, and the sidebar for a distraction-free checkout.
- Better SEO spidering… with superior product-category SEO.
Check out a website Huptech Web developed on Shoptimizer: Medixcbd.com
SEO factor #6: Use 301 redirects appropriately
So when my friend and I didn’t like the leafy steak salad, we decided to move and continue our discussion elsewhere. We went to another restaurant and resumed cataloguing these misery-striking SEO factors.
Fortunately, we didn’t have to inform anyone about shifting to another place. Nah! No one at all! Because no one cares where we go.
But that’s not the scene with webpages. As a web developer, you might have to change a URL or the URL structure if your clients don’t like it. But you can’t permanently move a web page from one location to another without informing anyone.
You have to use 301 redirects, which many web developers miss, not out of ignorance but out of laziness. A 301 redirect is a kind of symbolic gesture where you tell the browser: “This page has moved permanently to another address. Here’s the new location, and we don’t think we’ll move back to the previous one.”
The browser, in its language, responds: “Alright! I’ll send users to this address from now on. Cheers!” Remember, not redirecting a page that has already moved comes with hefty Google fines. And these cops don’t warn you or send memos; they penalize your search ranking.
Now you might be wondering: what if you don’t have a relevant page to redirect to? It’s quite common. In such cases, most web developers simply redirect the page to the home page. Be careful with that, though: Google may treat a pile of irrelevant home-page redirects as soft 404s, so redirect to the closest relevant page whenever you can.
So you’d set up 301 redirects by:
- Editing your site’s .htaccess file (on Apache servers).
- If you don’t have one, creating it with TextEdit (Mac) or Notepad (Windows).
Then add a line like: Redirect 301 /old-page.html /new-page.html
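Assuming an Apache server (where .htaccess applies), a short sketch; the paths are purely illustrative:

```apache
# .htaccess: permanent (301) redirects via mod_alias

# A single page that moved permanently
Redirect 301 /old-page.html /new-page.html

# A whole renamed section, preserving the rest of the path
RedirectMatch 301 ^/old-blog/(.*)$ /blog/$1
```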
SEO factor #7: Name image files correctly, and use alt text and titles
As the evening went by, the two of us got into the tiny details. After all, the devil is in the details. Being a web developer was tough for both of us. Laziness was the weed that often caught us, especially when we had to name files. But that needed addressing too!
I was so wrong about naming. I would use names like “1.jpg” or “madcoder.jpg.” No relevance, no accuracy, no information. Glad I’m not a neologist; I would have goofed up the language big time.
But these tiny details affect SEO ranking. As Google says, build your website to help visitors, not crawlers; do the first job well and the second is already done. So the search engine wants filenames and alt text that are relevant, short, and descriptive. Here are the key things you’d want to do with images:
- Keywords… shouldn’t be missed. Keep ’em in the image name.
- JPG over PNG… for photos, since JPEG handles many color shades well. PNG suits small graphics with limited colors.
- Specify alternative text… for images that can’t be displayed. Screen readers rely on it for visually impaired users too.
- Use keyword phrases… in the alt text too. Make sure it sounds descriptive and informative.
- Avoid long alt text… because the bots may consider it spammy. Brevity is the key; keep the tag concise.
- Capitalize on descriptions and titles… they give you more space to place keywords for ranking. They’re bang-on great for optimization. And as mentioned, we’re already using the best keywords in URLs.
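Putting the naming bullets together, a small hedged example (the filename and text are invented for illustration):

```html
<!-- Descriptive, keyword-bearing filename; short, informative alt; optional title -->
<img src="/images/chocolate-chip-cookie-recipe.jpg"
     alt="Stack of homemade chocolate chip cookies on a cooling rack"
     title="Homemade chocolate chip cookies">
```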
SEO factor #8: Use canonical tags for duplicate content
Done with our dinner, we took a short walk, which brought us to the real pain point of web developers: confusion! Well… we may have been good at building websites, but confusion? We sucked at eliminating it. Especially identical content appearing on multiple URLs.
Not that the two of us couldn’t have afforded to fix it, but duplicate URLs creep in from different sources. When URL content looks more or less the same, bots get confused, and you know how these mad crawlers behave then. Don’t ya?
So it was a critical SEO factor we struggled to fix despite knowing “the penalties” and “the repercussions” and “the threats” from the bots.
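For instance, every one of these addresses (the domain is invented for illustration) can serve the exact same page:

```text
https://example.com/shoes
https://www.example.com/shoes
http://example.com/shoes/
https://example.com/shoes?utm_source=newsletter
```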
All these links may point to the same page. For us, they’re a single webpage, but for crawlers, every single one of these links is unique yet duplicate.
If your website has similar-looking URL content: ALERT! Your clients may not be pleased, because their business might be losing its hold on the search engine results.
Thankfully, we have canonical tags. A canonical tag is a loud-and-clear way of signaling to Google that “this URL” is the master URL, so to speak, and the one it should display in the search results. Not adding a canonical tag invites three problems.
- First, if the search crawlers have to hop around duplicate URL content, when will they get to your unique content? Chances are high they’ll miss the unique content you made.
- Second, large-scale duplication is bad for your website. It dilutes your ranking chances.
- Third, even when your content ranks, search engines will have a hard time deciding which is the master page. Chances are they’ll pick the wrong URL as the “master URL.”
So what can you do to avoid all these messes?
- Self-reference the canonical tag on every page.
- Canonicalize your home page, real proactively; home-page duplicates are extremely common.
- Spot-check your URLs and any dynamically generated canonical tags.
- Stay careful while canonicalizing near-duplicates.
- Don’t forget to canonicalize cross-domain duplicates.
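The fix itself is one line in the head of each duplicate or variant page; the URL here is illustrative:

```html
<!-- Tell crawlers which URL is the master copy -->
<link rel="canonical" href="https://example.com/shoes">
```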
Ok, so what did we learn? Can you repeat it real quick?
You learned… that robots.txt is a vital crawl-control tool. It’s not as hard as you think: a few tweaks here and there, and you’ve fixed one of the notable SEO factors in a snap of the fingers.
You learned… that search engines now index mobile-first. I know it’s difficult to rise above your desktop habits, but optimizing your website for mobile is the real deal, for now and for the long future ahead.
You learned… that URLs are not just useless strings. They can be as informative as titles, descriptions, and copy. So do yourself this favor: don’t ignore these dudes.
You learned… that quality images beat fuzzy images, but lightweight images beat heavyweight ones. Mix and match both factors, reach a middle ground, and use images in a way that lets your website sprint.
You learned… that aesthetics-laden development themes look good but often underperform. You don’t need a beautiful exterior hiding an ugly spider den inside; get performance and aesthetics over “only aesthetics” with a theme like Shoptimizer.
You learned… that moving your web pages to different URL addresses is easy. You just have to inform the browser about the move with a 301 redirect. Rest assured, the browser is smart enough to send your guests to the redirected URLs.
You learned… that image names, alt text, and titles should make sense. Of course, you’re entitled to name them however you want. But then Google is entitled to de-rank you. Simple bargain. No?
You learned… that the less you confuse Google’s bots, the better you perform. To stop them from getting confused and picking the wrong URLs as the “original,” use canonical tags wisely.