Having worked in the SEO industry for several years, it’s safe to say I’ve conducted my fair share of SEO audits for clients’ websites. From small websites with fewer than a hundred pages to blue-chip organisations’ websites with several thousand pages, I’ve seen it all and remained a “white hat” SEO all the way!
Over the years I’ve compiled a comprehensive checklist that covers the crucial aspects of a technical SEO audit. Different SEOs will plan a technical SEO audit differently, and often a full audit of a particular site isn’t needed because it’s already getting things right in some areas.
One of the main things people get wrong is the accessibility of their website’s content. If Google can’t get to your content, neither can a user, so you’ll be taking a double hit: lost visits and lost conversions.
Whether it’s your own site or a client’s, make sure it’s connected to a Google Webmaster Tools account. If you or your client doesn’t have Google Webmaster Tools set up, well, that’s the first issue to resolve.
Following are the key areas of a website I look at when I conduct a website accessibility audit.
#1 Error Handling
By logging in to your Google Webmaster Tools account you can find out whether Google’s crawlers have come across mistyped URLs, pages that don’t exist returning ‘not found’ (404) or soft 404 error pages, server errors (500) where your server is rejecting crawlers’ access to some pages, or any other type of accessibility issue.
If Google has reported any issues while accessing your site, do not ignore them! Make sure you address them ASAP by putting effective redirects in place for all the broken links, and investigate the cause of any server errors.
#2 Robots.txt
Robots.txt is a text file that is normally placed at the root of a website. Review your site’s robots.txt file to check whether it is blocking search engine crawlers’ access to important pages.
It is also crucial to block secure pages and directories on the site, e.g. the shopping basket or login page, so that search engines don’t crawl pages that carry no extra value for search users and populate their index with junk pages.
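As an illustration, a robots.txt file blocking crawler access to basket and login pages (the paths and domain here are made up for the example) might look like this:

```text
User-agent: *
Disallow: /basket/
Disallow: /login/

Sitemap: https://www.example.com/sitemap.xml
```

The `User-agent: *` line applies the rules to all crawlers, and listing the sitemap here helps search engines find it.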
#3 Robots Meta Tag
Search engine crawlers can be further guided on a page level using Robots Meta tags.
Customise the robots meta tag for the different areas of the website where some pages are not to be indexed, or their links not to be followed.
I recommend using a robots meta tag if you have the odd page in indexed folders that you want to block; but generally, if most of your non-indexed content sits in one or more folders, use robots.txt to block the entire folder instead.
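For example, a robots meta tag placed in a page’s head that keeps the page out of the index while still letting crawlers follow its links would be:

```html
<meta name="robots" content="noindex, follow">
```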
#4 Nofollow, Noindex Tag
Use the nofollow attribute on links on pages that don’t hold much value, like review form pages. “Nofollow” prevents the passing of PageRank through a particular link to the linked-to page.
You can also use noindex on these same pages. The noindex tag prevents a search engine robot from indexing a particular page on a site.
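As a sketch (URL and wording illustrative), a review form page could carry a page-level meta tag blocking both indexing and link-following, while an individual link elsewhere can be nofollowed with the rel attribute:

```html
<!-- Page level: keep this page out of the index and don't follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Link level: don't pass PageRank through this particular link -->
<a href="https://www.example.com/review-form" rel="nofollow">Leave a review</a>
```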
#5 XML Sitemap
An XML sitemap is a file that carries a complete map of all the URLs that exist on a website. It helps search engine crawlers index all of the website’s pages more efficiently, as it can contain useful information such as the desired crawl frequency of each page, the date a page was last modified (showing which pages have recently changed), and an appropriate priority for each page.
Ensure that an XML sitemap is present at the root of the website and submitted to Google and Bing webmaster tools to help them crawl your website. XML sitemaps are even more useful, and a ‘must have’, when the website is large.
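A minimal sitemap with a single entry (URL, dates and values are illustrative) follows the sitemaps.org protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/products/widget-a</loc>
    <lastmod>2013-05-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```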
#6 Latency (page load speed)
There can be latency issues if the HTML code is not efficient enough, or if the pages are not compressed. Efficient code and compression will allow the website to load faster.
Test your website’s loading speed using tools like gtmetrix.com and Google’s PageSpeed tools. These are free to use and not only give you a list of page speed issues on your website, but also possible solutions to resolve them.
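As one example of the compression mentioned above, gzip compression can be enabled on an Apache server (this assumes mod_deflate is loaded; other servers have their own equivalents) so that text-based assets download faster:

```apache
# Compress HTML, CSS and JavaScript responses before sending them
AddOutputFilterByType DEFLATE text/html text/css application/javascript
```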
#7 Content Duplication
One of the fundamental causes of poor organic search performance is the presence of duplicate content on a website. Pages that present the same content to search engines via multiple URLs cause content duplication: any difference between one URL and another (even a seemingly minor difference such as a capitalised letter) will cause search engines to treat the URLs as separate pages.
There are many types of duplication issues that can exist in a site and I’ve listed them as separate points below to look out for and address:
#8 Duplication: Click and session tracking
Websites use sessions to track customer behaviour on the site, particularly ecommerce websites that like to store visitors’ shopping cart activities, so that when visitors return to the site they can see the items they previously added to the cart. Some websites do this by putting Session IDs in the URL. As a result, every internal link on the website gets that Session ID appended to the URL, and because the Session ID is unique to that session, it creates a new URL, and thus duplicate content.
#9 Duplication: Architecture
Duplication can occur when a product is placed into more than one category, and the architecture of the website is not correctly structured for multiple category products.
#10 Duplication: HTTP vs. HTTPS
A website can be duplicated through secured HTTPS sessions; HTTPS pages are often used for payment transactions, logins and shopping baskets to provide an encrypted, secure connection. Make sure to use HTTPS only on pages that require this security protocol, and use the canonical tag on HTTPS pages to refer to the non-HTTPS version of the URL.
#11 Duplication: Capitalisation
Capitalised URLs can create duplication issues if the server resolves them to the same content. They should be redirected to the correct (lowercase) version of the URL.
#12 Redirects: 301
301 redirects should be used when a page becomes irrelevant, e.g. a discontinued product page permanently redirected to a category page or to a new, similar product. The 301 redirect passes link credibility (link juice) to the redirect destination.
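On an Apache server (assuming mod_alias is enabled; the paths and domain are illustrative), a discontinued product page could be permanently redirected to its category page like this:

```apache
# 301: permanent redirect from a discontinued product to its category page
Redirect 301 /products/discontinued-widget https://www.example.com/products/widgets/
```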
#13 Redirects: 302
302 redirects should only be used for temporary moves. The 302 redirect doesn’t pass any link credibility to the redirect destination.
#14 Redirects: Chained Redirects
I often come across websites that have multiple redirects, where page A redirects to page B and page B is then redirected to page C. Avoid chained redirects like this, and redirect page A directly to page C, to preserve maximum link credibility from one URL to the other.
#15 Canonical Tag
If there is duplicate content on the website that cannot be redirected, use the rel=canonical tag on pages that require it.
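For example, a duplicate URL (say, one generated with tracking parameters appended) can point search engines at the preferred version with a canonical link element in its head; the URL here is illustrative:

```html
<link rel="canonical" href="https://www.example.com/products/widget-a">
```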
#16 Pagination
Pagination occurs when a website divides its content across multiple pages, for example an ecommerce site splitting its product or category listings into multiple pages.
Ensure pagination has been implemented correctly so that it’s not causing any duplication. Google recommends implementing rel=“next” and rel=“prev” for pagination.
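On page 2 of a paginated category listing, that implementation would look like this in the page’s head (URLs illustrative):

```html
<link rel="prev" href="https://www.example.com/widgets?page=1">
<link rel="next" href="https://www.example.com/widgets?page=3">
```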
#17 URL Structure
To get the best search and usability performance from your website, keep its structure sound, avoid superfluous folders, and keep the architecture as flat as possible. When it comes to subfolders, search engines assume that content living in folders far from the root domain is less important.
My suggestion would be to limit folder depth to three levels, and to keep all important content within three clicks of the home page. For example, a product at example.com/product-a will be getting much more credibility than one buried at example.com/shop/category/sub-category/product-b.
#18 HTML Validation
Websites should be cleaned up and have correct (X)HTML formatting applied according to the recommended standards. To check whether your website is up to standard, you can use the free W3C tool at validator.w3.org.
#19 Links: Nofollow
Use the rel=”nofollow” attribute when a link should not be considered valuable, e.g. in comments, so that link strength is not diluted.
#20 Links: Image links
Search engines can’t read images, so it is important to use the alt attribute to provide descriptive text about each image, describing the link with relevant keywords.
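An image link with descriptive alt text might look like this (paths, filename and wording are illustrative):

```html
<a href="/products/widget-a">
  <img src="/images/widget-a.jpg" alt="Widget A stainless steel bottle opener">
</a>
```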
#21 Customised 404 Page
The 404 (broken link) page often gets ignored by webmasters. They don’t realise that a good 404 page is perhaps as important as having great content. It may not be your fault that visitors land on error pages, but communicating the error professionally gives you a second chance to re-engage the visitor. That is why it is recommended to have a branded 404 page that not only links back to the homepage but also offers other engaging features, such as a search option.
#22 Structured Data (Rich Snippets)
Structured data allows search engines to better understand a page’s content and present users with richer, more relevant results across their various vertical markets. Make the most of structured data for improved click-through rates in the SERPs.
Structured data, or rich snippets, can be used to mark up the following elements of a website:
Event, Person, Place, LocalBusiness, Product, Offer and Review etc.
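As a sketch, a product with an aggregate review rating could be marked up with schema.org microdata like this (the product name and rating values are made up for the example):

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Widget A</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">27</span> reviews
  </div>
</div>
```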
Google Webmaster Tools has introduced a new feature called ‘Data Highlighter’ (under the Optimisation tab), which allows you to set up structured data for events on your site without any prior coding knowledge.
At Receptional, our core concern is to enhance the performance of your site in search. We’ve helped many clients improve the structure of their sites and conversion rates, so get in touch with us and we’ll start helping you today!
Author: Syed Shehzad
Courtesy of www.receptional.com
The longer I work in internet marketing, the more I wonder if we’re often missing the point. We tend to think primarily in terms of “link-juice,” conversion rates, or increasing traffic, and when it comes to accountability, we usually look to Google as some omnipotent deity, hoping it will bestow favor upon us. But with so much emphasis on these inanimate statistics, we’ve forgotten something really important:
The internet is not about Google.
The internet is not about ads.
The internet is not about traffic.
The internet is about PEOPLE.
Shocker, right? Amidst all the hubbub centered around how to best optimize X, or increase Y, or reduce Z, we overlook the simple, raw purpose of the internet.
At its core, the internet was created as a tool for sharing information and connecting networks of people. Websites were developed to be a place for users to engage with unique content, and Google was designed to help people find this content. The internet has always been about making things easier for people.
But in our hurry to “rise to the top,” we often forget that there are living, breathing humans behind the mice and keyboards, and we hyper-focus on marketing tactics instead of creating engaging and helpful experiences for our users.
Usability First. Optimization Second.
When designing a piece of content for your users, it’s really important that you don’t let yourself elevate the role of tactics. Yes, optimization is important, but it’s more important that you focus on the user’s experience. People are really good at sniffing out marketing strategies, and so you have to ask yourself whether or not your piece is actually bringing value to your users. At the end of the day, it doesn’t really matter where your page ranks if no one is interested in what you have to offer.
Skip the Jargon
As marketers, it’s really easy to over-optimize content and spend too much time talking about our brand(s). But here’s the deal: no one likes an egocentric company. So cut everyone a break and quit talking about yourself. One of the most common mistakes is when websites/creative pieces assume others are interested in their products. Don’t assume anything. Instead, convince your user that you actually have something valuable to offer.
With regard to design, there are a few things to keep track of. First, be aware of how you use your calls to action. We’ve all seen sites that inundate their readers with “buy now” or “find out more” buttons scattered throughout the page. Instead, try to hone in on a clear navigational path and then use your resources to show your users what you have to offer.
Refine Your Content
We’ve all read the studies about how much more distracted we’re becoming, and yet so many websites still rely too heavily on text. While there’s nothing wrong with well-written copy (you are reading an article after all!), it’s usually better to distil your content down to its bare bones and implement visual elements like icons, charts and video. Focus on creating easily skim-able columns of content with a clear option for the user to read more.
There’s a common dilemma between designers and marketers regarding how content should be displayed. Designers typically want things stripped completely down, usually only showcasing images and video. Marketers, on the other hand, typically like to fill pages with search-friendly content that can often create a cluttered design. Fight the desire to swing to an extreme. You’ll run the danger of creating a beautiful site that no one can find, or an easily accessible site that nobody wants to use.
DO Sweat the Small Stuff
In the age of agile, we’ve grown used to shipping projects quickly. And while this works really well for a whole host of tasks, don’t let it make you lazy when testing your projects. When it comes to user experience, the beauty is in the details. Take the time to sift through your contact forms, navigate to deep pages and assess the quality of your design elements. While it may not seem pressing to have a universal system for all your colors and images, it’s details like these that will go a long way in allowing your user to engage with your content.
Meet People Where They’re At
In a world where anything can be accessed from anywhere, it’s vital that your content has the chops to meet users in their current contexts. And with a growing mobile market, your content can’t just slide by on smaller screens. Instead, it has to be just as beautiful on a 4-inch display as on a 27-inch monitor.
While there’s a lot of discussion around whether to utilize responsive design or build a separate mobile site, make sure to keep your user’s interests at the forefront of your decision process; there isn’t a one-size-fits-all solution. Regardless of your decision, when it comes to designing for mobile, it’s important that you recognize the mobile experience not as inferior to its desktop alternative, but as a completely new medium in itself. While there are a ton of different techniques, the bigger theme is to focus on making it a pleasant experience for your users to engage with your content on mobile platforms.
Amidst so many tactics and strategies, it can be easy to forget the simple importance of building quality user experiences. Social shares, rankings, and traffic are only as good as the user engagement. Ultimately, we can’t look at our users as mindless lemmings who will click through our pages. Instead, we have to recognize the common motivations, desires and expectations of our audience and try to meet these needs in the most engaging and humanlike manner.
Author: Luke Clum
Courtesy of www.distilled.net
At the foundation of an SEO campaign are the keywords that are targeted. These keywords are implemented throughout the website and included in any content that is published throughout the web on behalf of the business, whether it’s a guest blog post, a company profile on an industry site, or a local directory submission. Therefore, it’s imperative that the proper keywords are targeted. Targeting the wrong keywords can result in website visitors that aren’t going to convert and lead to missed opportunities. In order to select the proper keywords, you need to find out how people are searching for what you have to offer. Many website owners make the mistake of skipping over this step, thinking that they know their target audience so well that they know how they are searching. This often isn’t the case, which is why data is important.
To ensure that you target the right keywords, it’s necessary to follow these steps:
1. Stay organized
Organization is an important element of any business activity today and SEO is no exception. For keyword research activities, create a spreadsheet that includes the URL of every page of the website and two sections per URL. The first section is for keywords that are currently generating visitors and the second is for the new keyword research that will be conducted.
2. Look at current keyword data
Keyword research should be conducted page by page. If you have Google Analytics set up on the site (which every site should!) go in and pull the keyword data for each page. You will be able to see what keywords are currently bringing visitors to each page of the site from organic search. This can serve as a starting point. If these keywords make sense, you can use them throughout the rest of the process. If they are way off base, it may mean that the content itself is off base and may need revision.
3. Brainstorm potential keywords
Carefully read through the content on the page and come up with a list of potential keywords that make sense, and that you’d like to target.
4. Conduct keyword research
This is the part that website owners sometimes skip over, even though it’s the most important part! Using a keyword research tool (like the free one that Google provides), type in the potential keywords to target from the list of keywords that are already generating visitors and the additional ones that have been brainstormed. The keyword tool will generate a list of related potential keywords along with their search volume and level of competitiveness.
5. Scrub the list
The keyword tool can give you a very large list of potential keywords, but they probably aren’t all going to be quite on target. Go through and remove any that aren’t relevant or are too broad. The keywords that you are left with can be targeted on the site and throughout additional SEO content that is published elsewhere on the web.
Use these five steps as a framework for keyword research for your website and you’ll soon be back on track with proper keyword selection.
Author: Nick Stamoulis
Courtesy of www.searchmarketingstandard.com