There’s a common misconception that search engine optimization is all about written content. Keyword incorporation and long-tail phrases both play a significant role in a website’s SEO ranking, but regular use of popular search terms is just one of many attributes used to calculate a site’s ranking results.
These are some of the search engine optimization techniques to consider:
Meta descriptions

Search engines typically display a snippet of text beneath each result. Often truncated at around 160 characters, these text tags are known as meta descriptions. They're designed to provide a concise summary of the page, which means meta description text effectively becomes an organic advert. It's also a great place to deploy targeted keywords, convey the benefits of your brand and encourage a sense of urgency around time-limited offers.
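In HTML, the meta description sits in the page's head section. A minimal illustration (the wording and product are invented for this example):

```html
<head>
  <!-- The description tag a search engine may show beneath the listing -->
  <meta name="description" content="Hand-poured soy candles with free next-day delivery. Order by midnight for 20% off your first purchase.">
</head>
```

Note that search engines treat the description as a suggestion; if it doesn't match the query, they may generate their own snippet from the page text instead.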
Image alt text

Incorporating images into a web page is a neat way to sneak in some extra search engine optimization. Every image can carry an alt attribute – a brief description of what's being displayed. This alt text is read aloud by screen readers for visually impaired audiences, while the separate title attribute supplies the caption that appears when the mouse hovers over the image. These descriptions give search engines a better indication of the page's topic, which is great for boosting keyword representation throughout the site. They're also valuable for increasing the prevalence of product names on ecommerce sites.
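The distinction looks like this in practice (the filename and product details are invented):

```html
<!-- alt is indexed by crawlers and read aloud by screen readers;
     title produces the tooltip shown on mouse hover -->
<img src="blue-ceramic-mug.jpg"
     alt="Blue ceramic coffee mug with a 350ml capacity"
     title="Blue ceramic coffee mug">
```

Descriptive, keyword-relevant alt text serves both accessibility and SEO; stuffing it with unrelated keywords does neither.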
XML sitemaps

The online equivalent of a floor plan in a real estate listing, an XML sitemap helps search engines understand a website's structure and the links between its pages. Designed to be read by web crawlers rather than humans, sitemaps ensure every page can be discovered and indexed – even URLs that wouldn't otherwise be found by following links. They're especially valuable on larger platforms with complex structures, or where dynamic pages are generated for ecommerce purposes. Many web design platforms will automatically create an XML sitemap on your behalf, in the format preferred by crawlers like Googlebot.
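A bare-bones sitemap following the standard sitemaps.org format looks like this (the example.com URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Each url entry lists a page's address, and the optional lastmod date hints to crawlers which pages have changed since their last visit.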
A robots.txt file
This is another snippet of text designed to make life easier for search engines. Site owners create this plain-text file to tell crawlers which pages they should (and shouldn't) visit. Saved in the web server's top-level directory, a robots.txt file can disallow pages whose indexing might harm the site's overall performance. Typical candidates include duplicated content, intranets and live development sites that aren't ready to be publicly indexed.
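A simple robots.txt might read as follows (the paths are illustrative):

```text
# Served from the site root, e.g. https://www.example.com/robots.txt
User-agent: *
Disallow: /intranet/
Disallow: /staging/
Sitemap: https://www.example.com/sitemap.xml
```

The wildcard User-agent line applies the rules to all crawlers, and the Sitemap line points them at the XML sitemap. Bear in mind these directives are advisory: reputable crawlers honor them, but they're not an access control mechanism.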
Links from external sites
Third-party links to a website suggest it's an important resource. Nobody drives traffic towards an external site without good reason, so the number of inbound links acts as a leading indicator of a site's value. Link building can be reciprocal, paid for or earned through the provision of quality content. Guest blogging is a popular way of generating third-party links, while directories and advertising platforms also carry a degree of weight. Association with low-quality link farms badly damages a website's SEO ranking, as do excessive links in blog comments.
Social media SEO
Expanding on the last point, links from social media platforms often rank highly in search results, since platforms like Facebook and Twitter are seen as reputable content providers. YouTube is the second most popular search engine behind parent company Google, and every social post offers scope for incorporating keywords and links. That means uploading SEO-friendly content regularly and making full use of image tags and titles. Finally, search engine optimization also improves with greater visitor volumes, so aim to maximize traffic levels and the amount of time people spend on your site.