SEO is a big part of modern business but, despite its popularity, mistakes are still being made.
If you want to give yourself the best chance of success then our SEO services in London can be a great help. Don’t just take our word for it though – take a look at these three common mistakes which SEO services in UK regions can help you avoid.
1. Wrong indexing
If your main website is not indexed, or your dev site is indexed, then you have committed a cardinal SEO sin. Luckily, it’s fairly straightforward to fix: you just need to block the content on your dev site and leave your main site fully visible.
To do this you can use a robots directive which blocks all dev/staging content or a meta noindex tag on each dev page.
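As a sketch of the first option, assuming a staging site on a hypothetical dev.example.com host, the robots directive is a robots.txt served only from that host:

```
# robots.txt on the dev/staging host only (never on the live site)
User-agent: *
Disallow: /
```

The per-page alternative is a meta noindex tag in the head of each dev page:

```html
<meta name="robots" content="noindex">
```

Either approach keeps staging content out of search results while leaving the main site untouched.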
Professional SEO services in London can help you understand this in a little more detail if necessary.
2. URL changes
Chances are there has been a time when you’ve changed the destination URL for a specific page or online content. No big deal right? Well, what about its impact on SEO?
Changing URLs can lose you significant web traffic for extended periods of time so if it can be avoided then that’s what we’d recommend. If you do move or change URLs then do everything in your power to keep them visible.
Implementing permanent redirects on relevant pages (those with link equity) is also highly recommended and will prevent any disruption to user experience too.
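A minimal sketch of such a permanent (301) redirect, assuming an Apache server and hypothetical paths:

```
# .htaccess -- permanently redirect the old URL to its new home,
# passing link equity on to the new page
Redirect 301 /old-page https://www.example.com/new-page
```

A permanent redirect tells both browsers and search engines that the page has moved for good, so visitors and rankings follow the content to its new URL.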
3. New strategies
Finally, changing strategies or brand messaging can result in huge disruptions to your SEO.
You may decide your new message is more important than previous keyword targeting but just changing the wording of your website without any further consideration can really dent your traffic.
Instead, speak to cheap SEO services in UK territories for more information on how this might affect your page rankings, and identify which webpages currently rank highest and attract the most visitors. See which keywords are targeted there and try to integrate them with your new messaging and strategy.
Whatever you do, remember that these tips are designed to help you get the most from your website but are best implemented by professionals. The best way to avoid common SEO mistakes and get the website your business deserves is by looking for the best SEO services in London and taking advantage of their expertise.
"Remember what it was like to search in 1998? You'd sit down and boot up your bulky computer, dial up on your squawky modem, type in some keywords, and get 10 blue links to websites that had those words," Singhal wrote in a separate blogpost.
"The world has changed so much since then: billions of people have come online, the Web has grown exponentially, and now you can ask any question on the powerful little device in your pocket."
Page and Brin set up shop in the garage of Susan Wojcicki -- now a senior Google executive -- in September 1998, around the time they incorporated their company. This week marks the 15th anniversary of their collaboration.
Amit Singhal, senior vice president of search, told reporters on Thursday that the company launched its latest "Hummingbird" algorithm about a month ago and that it currently affects 90 percent of worldwide searches via Google.
Google is trying to keep pace with the evolution of Internet usage. As search queries get more complicated, traditional "Boolean" or keyword-based systems begin deteriorating because of the need to match concepts and meanings in addition to words.
"Hummingbird" is the company's effort to match the meaning of queries with that of documents on the Internet, said Singhal from the Menlo Park garage where Google founders Larry Page and Sergey Brin conceived their now-ubiquitous search engine.
When’s the last time Google replaced its algorithm this way?
Google struggled to recall when any major change like this last happened. In 2010, the “Caffeine” update was a huge change, but that was mostly meant to help Google better gather information (indexing) rather than sort through it. Google search chief Amit Singhal said that perhaps 2001, when he first joined the company, was the last time the algorithm was so dramatically rewritten.
Does it mean that “PageRank” algorithm is dead?
No. PageRank is one of over 200 major “ingredients” that go into the Hummingbird recipe. Hummingbird looks at PageRank — how important links to a page are deemed to be — along with other factors like whether Google believes a page is of good quality, the words used on it and many other things.
What type of “new” search activity does Hummingbird help?
“Conversational search” is one of the biggest examples Google gave. People, when speaking searches, may find it more useful to have a conversation.
“What’s the closest place to buy the iPhone 5s to my home?”
A traditional search engine might focus on finding matches for words — finding a page that says “buy” and “iPhone 5s,” for example.
Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you’ve shared that with Google. It might understand that “place” means you want a brick-and-mortar store. It might get that “iPhone 5s” is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words.
In particular, Google said that Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.
By the way, another term for the “meaning” connections that Hummingbird does is “entity search,” and we have an entire panel on that at our SMX East search marketing show in New York City, next week. The Coming “Entity Search” Revolution session is part of an entire “Semantic Search” track that also gets into ways search engines are discovering meanings behind words.
On June 24, 2013, in a series of letters to major search engines such as Google, Yahoo, and Ask.com, as well as to specialized search engines, the FTC issued updated guidance on maintaining clear disclosures to the public regarding paid advertisements in search results.
Advertising disclosures need to be clear and prominent.
Consumers assume that search results reflect the most relevant results. When results appear because the advertiser has paid the search engine for, say, prominent placement, that placement could be deceptive to consumers if they are unaware of the commercial relationship between the advertiser and the search engine.
They’re giving the search engines some leeway with this, and they state that “any method may be used, so long as it is noticeable and understandable to consumers.” This however doesn’t prevent them from giving some tips.
Here is how the FTC thinks that Google, Yahoo! and Bing should do things differently.
Provide clearer visual cues surrounding their ads. In the FTC’s opinion, the current shading is too light, making it difficult to differentiate ads from organic results. Below are their recommended specifications for a proper visual offset:
Advertising shouldn’t be set off simply by shading and an outline. Advertisements should have easily distinguishable text labels.
The FTC asserts that text labels must be used in addition to whatever visual cues a search engine uses to distinguish advertising.
The FTC is basically letting the search engines know that their shading and text labels have become too nondescript, and it has realized that consumers are less likely to notice when an ad is an ad.
There’s also a footnote in the letter that has to be somewhat validating for those who’ve been preaching the gospel of mobile.
It looks like the year of mobile is finally here! If the FTC is recognizing the value of a mobile campaign, it’s a safe bet that mobile has made it to the main stage. It seems like mobile devices are a bigger offender in their eyes than desktops, so look for changes in that platform. It’ll be an interesting few months for mobile advertising, what with campaigns getting enhanced and the shading of the ads almost certainly set to change.
In designing web pages, search engines … should ensure that any visual cues used to distinguish advertising, such as background shading, are sufficiently visible on both mobile devices and desktop computers. A search engine can indeed at a minimum determine whether a web page will be displayed on a mobile device as opposed to desktop computer… Consequently, we believe that search engines should consider using web pages of different luminosities for mobile devices and desktop computers.
Before closing the letter, the FTC offers a bit of shaming to the engines for their increasingly relaxed emphasis on ensuring users know when an ad is an ad.
Search engines have reduced the font size of some text labels to identify top ads and other advertising and often locate these labels in the top right-hand corner of the shaded area or ‘ad block,’ as is the case with top ads. Consumers may not as readily notice the labels when placed in the top right-hand corner, especially when the labels are presented in small print and relate to more than one result.
Ads are their revenue stream, and they’re going to do what they can to maximize that. But things are going to look different.
How this is going to affect us:
The FTC cites something in their letter that we already know to be true. People just don’t like ads. The clearer the indication that something is an ad, the higher the likelihood that someone with distaste for marketing professionals will avoid clicking. I think that PLAs are doing so well right now because it’s easy to overlook that they’re ads. They just look like regular listings at first pass.
We’re already at a disadvantage with these devices, as only the top two placements show above the organic listings. What’s going to happen to historically higher CTRs in mobile if the layout changes drastically? That seems like the area where the FTC was the most concerned, so I anticipate that’s where there’s going to be the most upheaval. I mentioned it before, but it’s worth stressing again: mobile will most likely be affected in a big way.
Different ad real estate
Will Google shift where they place their ad blocks? Could bottom ads go away at some point? Bing’s practice of showing the same ad above and below organic results will surely be affected in some way. Now that ads are supposed to be clearly demarcated, what are the engines going to do to goose CTR as much as possible? Google always seems to be playing with display URLs and where to place them, but could more radical ad formats be in our future? What can they do to entice clicks on what makes them money while complying with the new guidelines?
Every website owner tries to impress clients with attractive web designs, stylish themes and easy-to-navigate websites to gain their trust. They spruce it up with rich content to make people more aware of their company & their products or services. As we all know, content has always been considered the king of internet marketing, and this has encouraged website owners to publish more & more content on their websites.
Content on a website not only provides information about the company and its products/services but also tells visitors more about the “thing” they are searching for on the internet.
10 important things to consider while putting any content on a website:
1. Know your audience
2. Decide why you are writing
3. A short & descriptive headline
4. Avoid using UPPER CASES
5. Choose the right words
6. Avoid acronyms or jargons
7. Punctuations & grammar
8. Effective hyperlinking
9. Right length
10. Proof reading
In their rush to increase the number of webpages on their websites, webmasters forgot an important aspect of publishing content: keeping it original. In their haste to rank on top of various search engines, they inadvertently piled up a lot of duplicate content on their websites.
Though this provided an increase in traffic for many websites with a good amount of content to offer visitors, it also made it difficult for search engines to choose the best content, since many sites carried the same plagiarized material.
Google has always aimed to provide the best search results to anyone who searches on its platform. With the increase in duplicate content, Google’s crawlers started treating such websites as low quality. This took the form of an algorithmic update called the Panda update, which tried to judge websites the way real humans would, in terms of the originality & relevance of their content, rather than by purely mechanical signals.
After the release of this update, websites with plagiarized content were penalized. Their webpages were de-indexed, resulting in a fall in their rankings for certain keywords. Interestingly, the Panda update reportedly affected the rankings of almost 12 percent of all search results.
Positioning itself as the saviour of the digital world, Google recommended that webmasters either re-write those pages or block them from being indexed by Google or any other search engine. In return, this update brought additional value to websites with original content.
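For the blocking option, a brief sketch with hypothetical URLs: a duplicate page can be kept out of the index with a meta noindex tag, or (a closely related fix Google documents for duplicate content) it can point crawlers at the preferred version with a canonical link:

```html
<!-- on the duplicate page: keep it out of search indexes -->
<meta name="robots" content="noindex">

<!-- or tell crawlers which version is the preferred one -->
<link rel="canonical" href="https://www.example.com/original-article">
```

Either way, the original page keeps its visibility while the duplicate stops competing with it.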
Things did get back into shape over time for the penalized websites that adhered to the SEO guidelines. In due course, though, many webmasters & website owners changed their strategy for ranking on top of search results from content farming to link building.
As we all know, backlinks to a website are an important factor in calculating its PageRank. People started doing rigorous link building & promoting their keywords far beyond the permissible limits set by Google’s SEO guidelines.
With the help of black hat techniques like keyword stuffing, cloaking, spam link submissions, deliberately creating excessive anchor text on a page, etc., websites gained good rankings on SERPs until they were once again caught by Google.
And so Google released another update, called the Penguin update, that aimed to penalize websites using black hat techniques to promote themselves & rank well for their keywords. As the update focused on the quality of backlinks, its effect varied from website to website. Google also called out doorway pages, built only to attract search engine traffic, as being against its webmaster guidelines.
Many websites saw a decrease in their PageRank post-Penguin, which meant they had backlinks that weren’t built on best SEO practices.
The Penguin update affected approximately 3.1% of search queries in English, about 3% of queries in languages like German, Chinese, and Arabic, and an even bigger percentage of them in "highly spammed" languages.
After this update, Google specifically mentioned 5 bullet points to keep in mind while running a link building campaign for any website:
1. Keyword research for link building
2. Quality of the linking page
3. Landing page where you desire your visitors to land
4. Use of correct Anchor text
5. Prefer manual submissions over bot-submissions
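On point 4, a quick illustration with hypothetical URLs: descriptive anchor text tells both users and Google what the landing page is about, while repetitive keyword-rich anchors are exactly the pattern Penguin targeted:

```html
<!-- natural, descriptive anchor text -->
<a href="https://www.example.com/link-building-guide">our guide to link building</a>

<!-- the kind of keyword-stuffed anchor Penguin penalizes at scale -->
<a href="https://www.example.com/link-building-guide">best cheap SEO services UK</a>
```

A backlink profile made up mostly of the second kind is a strong spam signal.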
Now, with the succession of Panda & Penguin version updates, webmasters & website owners are compelled to adhere to SEO guidelines when optimizing their websites.
Remarkably, there has also been a noticeable change in approach from SEO experts, who now rank their websites while working within the limits of white hat techniques.
Considering & evaluating the importance of links & content for a website, it is fair to say that they both contribute to each other’s performance & at the same time operate individually as ranking parameters.
We should have good quality content on our website, describing our products/services in a suitable way to give visitors the best user experience, while inbound links help increase referral traffic and bring more visitors to the website.
As they say, a compelling marketing plan can only succeed if we have the best product/service to offer. Similarly, link building is our marketing plan, creating awareness of our brand among online users, while content gives customers a great user experience as long as we maintain its originality & relevance.
Following its declaration of war against spammy link building, Google has officially launched a search update targeting spammy queries such as [payday loans], pornographic queries and others.
While search queries that are spammy in character, such as [payday loans] or some adult queries, were previously less of a focus for Google’s spam team, Matt Cutts said Google is now more likely to look at this area in the long run. He made it clear that these requests are coming from outside Google, and that Google wants to deal with the issues around these kinds of queries.
At SMX Advanced, Cutts explained that this goes after unique link schemes, many of which are illegal. He also added that this is a worldwide update, not just being rolled out in the U.S. but rolled out globally.
This update affected approximately 0.3% of U.S. queries, but he said it went as high as 4% for Turkish queries, where web spam is generally heavier.
Matt Cutts hinted that a new version of the site speed ranking factor is in the process of being rolled out. Site speed has been a ranking factor since 2010; similarly, this time there will soon be an update for mobile search.
Google announced demotion factors for mobile sites, but site speed was not among them. That is because the feature is not yet live; it is coming soon, and Matt wants webmasters and SEOs to prepare.
He described a tool at Google I/O for observing mobile website load times. Outlier websites with really slow load times will see a demotion in Google search. I believe mobile websites will have to be even quicker than the outlier threshold used for desktop speed.
Earlier, on 13th May 2013, Matt Cutts published a new post on his blog titled “What to expect in SEO in the coming months”. This gave many webmasters a clear indication of what was coming their way. At the last SMX conference, there was huge speculation about social media optimization and many campaign activities were under the scanner, though nothing has happened on that front until now.
And, on 23rd May 2013 – Matt Cutts tweeted
It was revealed that the off-page update was finally switched on following weeks of speculation as to when it would be actioned, and sources show that Google’s Matt Cutts said an expected 2.3% of English-based queries would be affected by the Penguin 2.0 rollout.
Google Penguin has been an integral part of the war against spam that Google has pledged in order to eliminate websites driven by black-hat techniques from its SERPs. Combined with the rolling updates of Google Panda (the on-page algorithm), the fight for a cleaner search engine is firmly in motion; however, there is much more to come if a previous Webmaster Help video is anything to go by.
Although the new generation of Google Penguin is now in place, the video seems to hint at further filtering alterations that the search giant is working on, and from what we can see in the search results at the moment, many of those are still to come.
Many site owners expressed the opinion that the Google Penguin update would help clean up spam link traffic across the web. However, some of the results that still feature in search are spam-produced sites that have clearly used large volumes of automated inbound links to manipulate the rankings.
But it looks like a shake-up is still waiting to happen, and it will come in the near future. Meanwhile, make sure you are keeping your backlinks and marketing initiatives as fresh as possible, because there is no way Google has finished the fight against spam linking yet.
Rich Snippets offer a way for websites with certain kinds of content to improve their listings in Google’s search results. All a website has to do is add some HTML markup (in the form of microformats or RDFa) to structure that information in its webpages so the search engine can identify it. Yahoo! supported microformats in SearchMonkey back in 2008, even before Google began providing these snippets in search results.
So what does it look like? The snippet shows up as an extra line in a search engine listing, placed between the headline and the descriptive text.
The Rich Snippet can contain a summary of reviews, a price range, or other specific details, intended to give visitors practical information about that webpage at a glance.
In other words, a Rich Snippet increases the size and value of your listing on a SERP by showing additional, user-targeted details.
As a result, you can attract more eyes, more clicks, and more conversions to your site.
It seems Google itself is letting webmasters promote their websites through a few lines of HTML markup that improve their listings on search engine result pages.
It’s an opportunity to improve conversion rates through greater exposure in the search engines. As Tim O’Reilly noted when Google first rolled out Rich Snippets (excerpts from the release):
If Google is aware of the content on your webpages, we can create rich snippets: detailed information intended to help users with specific queries. For example, the snippet for a restaurant might show the average review score and price range; the snippet for a recipe page might show the total preparation time, a photo, and the recipe’s review rating; and the snippet for a music album could list songs along with a link to play each one. These Rich Snippets help users recognize when your site is relevant to their search, and may result in more clicks to your webpages.
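To make the recipe example concrete, here is a minimal sketch of structured markup for a hypothetical recipe page, using schema.org microdata (one of several vocabularies search engines accept for marking up this kind of data):

```html
<div itemscope itemtype="https://schema.org/Recipe">
  <h1 itemprop="name">Classic Pancakes</h1>
  <img itemprop="image" src="pancakes.jpg" alt="A stack of pancakes">
  <!-- total preparation time, machine-readable in ISO 8601 duration format -->
  <span itemprop="totalTime" content="PT25M">25 minutes</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="https://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span> by
    <span itemprop="reviewCount">120</span> reviewers
  </div>
</div>
```

The page still renders as normal text for visitors; the itemprop attributes simply label each piece of data so a crawler can pull the rating, review count, and prep time into the snippet.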
Rich Snippets are still relatively rare in Google’s search results. This creates an opportunity for a competitive advantage and improved exposure that can benefit any website containing the type of content users are looking for. I suggest it’s an opportunity not to be missed.