Google has begun to take page layout very seriously when it comes to search results. They are now incorporating changes into their algorithm which will examine what appears “above the fold” on a webpage.
The algorithm will detect how much content is contained within the top portion of a webpage. It will not be looking for a specific number of ads; it will be looking at how ads are displayed and whether they affect content. Here’s what Matt Cutts over at Google had to say:
“It’s not a numbers game. Google hasn’t written their algorithm to punish sites with, say, 20 ads above the fold, as opposed to the site owner who only has 19 showing. No, from the Cutts/Google perspective, the algorithm alteration inspects pages to see how the space, especially above the fold, is being used.
In fact, Google isn’t concerned about the number of ads at all. Instead, they just don’t want these ads — however many are appearing above the fold — taking up too much space.”
A site displaying too many ads above the fold will probably start to see a drop in its search traffic. As far as what exactly to avoid, we don’t have many particulars as of yet, but a number of large sites have already seen drops in traffic as a result. Here are some of the sites mentioned (list provided by SearchMetrics):
Here is the rest of the list:
These don’t seem like ad-heavy sites to a lot of people, but they have nonetheless reportedly been hit by this latest algorithm change (at least according to SearchMetrics). Curiously, article directories like EzineArticles and ArticleBase, which took big traffic hits after Panda, haven’t seen drops this time, even though both run a fair amount of AdSense on their pages. Some people have suggested that excessive use of AdSense is getting dinged the hardest, but these article directories haven’t seen a drop in traffic; at least not a severe one. Using AdSense in small doses probably won’t affect your search results (it is a Google product, after all), but running multiple instances and taking up valuable real estate on a page might hurt you.
As always, Google is a little cryptic about what exactly is being looked at here. The best method is trial and error: if you’ve seen a drop in traffic, try placing your ads in a different section of your website. If you don’t have ads at all, you probably don’t need to worry.
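Since Google hasn’t published any thresholds, the best you can do is estimate for yourself how much of your above-the-fold space ads are eating. Below is a purely hypothetical sketch: given ad element bounding boxes (which you could collect from your browser’s developer tools), it computes the fraction of the above-the-fold area they cover. The 600px fold height is an assumption, not anything Google has confirmed.

```python
FOLD_HEIGHT = 600  # assumed viewport fold height in px; not a Google figure

def ad_coverage(viewport_width, ad_boxes, fold_height=FOLD_HEIGHT):
    """Return the fraction of the above-the-fold area occupied by ads.

    ad_boxes: list of (x, y, width, height) tuples for ad elements,
    with y measured from the top of the page. This naive version sums
    clipped areas and does not de-duplicate overlapping ads.
    """
    fold_area = viewport_width * fold_height
    covered = 0
    for x, y, w, h in ad_boxes:
        # Clip each box to the visible above-the-fold region.
        visible_w = max(0, min(x + w, viewport_width) - max(x, 0))
        visible_h = max(0, min(y + h, fold_height) - max(y, 0))
        covered += visible_w * visible_h
    return covered / fold_area

# Example: a 728x90 leaderboard plus a 300x250 rectangle on a 1024px-wide page
boxes = [(148, 0, 728, 90), (700, 100, 300, 250)]
print(round(ad_coverage(1024, boxes), 3))  # prints 0.229
```

Whatever number you get, treat it as a relative gauge for your own experiments, not a pass/fail score.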
Yesterday Google unveiled their latest “improvement” to social networking: Google+. Here is the info (pasted from my AdSense account) straight from Google on how the new system works: “Adding the +1 button to your pages allows users to recommend your content to friends and contacts on Google search. As a result, you could get more and better qualified traffic.” It seems like a good idea, but ultimately this could be a major misstep for Google. This system, unless closely monitored, is ripe for manipulation (just like Facebook and YouTube) and will ultimately prove unreliable in providing better search results.
Now before you start going crazy about how Google+ is so much more than a glorified “like” system and how it’s supposed to be the new Facebook, I’m not referring to those features. I’m referring to the fact that Google has specifically stated that these ratings and opinions might, and most likely will, have an impact on their actual search results. My critique is aimed at this angle, not at the content Google+ provides in its pseudo-wannabe-Facebook-anti-wall system. This is purely about search results and how easily they can be manipulated.
Look at YouTube: the site has quality content, but more often than not the top videos for any search query are not the best quality or the most relevant; they are just the most heavily promoted. When you search for any given topic on YouTube, especially trending topics, you are bound to get mislabeled and poorly made advertisement videos, stolen videos with affiliate links, and videos with the most “views” or “likes”. The truth is you can spend $5 to promote a video by hiring outsourcers to watch and rate it with their YouTube accounts. How do I know this? I’ve done it, but for free.
When Michael Brea, an actor from the series Ugly Betty, killed his mother with a sword on November 23, 2010, the news story was trending like crazy. I had seen a technique for getting a video to the top of YouTube when a trending topic begins, so I decided to try it out. I simply turned on the news, took out my video camera, and filmed the TV. I then uploaded the story, used some simple manipulation, and received 15,000 views in the first day. It cost me $0. The video now has nearly 27,000 views and shows up near the top when you search for Michael Brea on YouTube. Want proof? Michael Brea kills mom with sword. I’ve since sold off that account, but the hits keep coming.
So what’s my point? Well, if Google+ works anything like YouTube, it will be easily manipulated and gamed by marketers and advertisers. No trending topic is safe and no organic search using Google+ can be trusted; the system simply hasn’t been refined enough to warrant using it for search purposes. The same thing rings true on Facebook. I don’t get updates from friends; I get updates from pages I never signed up for or “liked” (though thankfully no more Farmville or Mafia Wars updates). The truth is, Facebook is now dominated by advertisements, and I didn’t even sign up for them! These posts have the most “likes” and comments, and thus shoot to the top of your update list. Marketers can also pay outsourcers, just like on YouTube, to like and comment on their posts and push them up in your update list. Facebook has gotten so out of control you can literally just pay for friends now!
This system has failed. It is merely a way of getting more advertisements into your social media. It seems like a great idea, and sometimes on Facebook and YouTube it works, but more often than not it’s a gamed and flawed results system; it’s too easily manipulated to be trusted. When Google rolled out the Panda update, they effectively killed content farms and article directories because those sites were providing bad results and people were complaining. How long before your entire search results are merely paid Google+ manipulations and people start complaining about that? Just look at the SEO game before Panda came out: it was a joke. I could write an article and have it pop into the top 5 results for a competitive keyword within 72 hours.
While this won’t be the utter disaster that Google Wave was, it will ultimately fail to produce the results Google wants. In the rush to make everything social-media-ized, companies like Google and Facebook keep forgetting their reason to exist. People sought refuge on Facebook from MySpace and its relentless advertising, only to be inundated with ads on Facebook; it makes you wonder whether we will eventually have to switch to yet another social media platform. All people want from Google is the best results possible for a given keyword. They don’t want ads, and they don’t care whether someone “plused” a page; they want results. With this approach, Google is getting further and further away from what made them so popular in the first place: great search results. If the technology and know-how were there to make this ranking system successful, I’d be all for it, but at this point the technology and ability to improve results is not there.
But if there is one thing that was made abundantly clear in 2009, it’s that Microsoft is serious about competing with Google. After striking its search deal with Yahoo, Microsoft is claiming a much larger portion of online searches and ad revenue.
Currently Google handles about 70% of all online searches, Yahoo claims around 16%, and Bing about 10% (source: Hitwise). One interesting thing to note is that Bing’s numbers have actually dropped since its launch buzz wore off; it was initially seeing about 12% of searches (as of June 2009).
Despite the drop in numbers, I think most people would characterize Bing as a success. Google didn’t start out a giant; it grew into its position by innovating with a more advanced and refined algorithm than its competitors, by not focusing on ad-based results (which is definitely not the case now), and with a unique look.
Bing doesn’t offer much in the way of an improved algorithm, less of a focus on ad-based results, or looks (Bing definitely used Google as a source of inspiration on all of these). Microsoft’s game plan is to set itself apart by expanding search related services, integrating with social networks, and generally spending lots of money.
In terms of search results, Microsoft really doesn’t offer any improvement over the Google algorithm. The results are similar, with nothing especially different or more useful. I think Google is a better and friendlier search engine than Bing: I get much more traffic from Google, their business listings are much easier to manage than Bing’s, and Google’s suite of Webmaster Tools, Analytics, and PPC advertising is top notch.
Microsoft was the first to announce a partnership with Twitter for real-time search results (although Google quickly responded with Twitter search of their own). And with the introduction of real-time Facebook results in searches, it’s clear that both search engines view social networking as a burgeoning frontier. It’s still unclear whether and how this integration will affect online marketing, but it’s exciting to see social networking expand.
Things started to heat up when Rupert Murdoch spoke out against Google for “stealing stories”. Microsoft answered the call and began offering publishers money for search results. With stories like these, it’s easy to tell that the search landscape will continue to change in 2010, for better or worse.
While Microsoft seems intent on pressuring Google to spend money in order to compete, it’s unclear whether this tactic will work. Google hasn’t said whether they would pay for search results, but it seems contrary to their nature. Google is known as a search engine where the best site wins; by opening up search results to paying large publishers, they could effectively phase out smaller news sources (like yours truly) from the recognition we currently receive through organic search results.
Overall, the fight continues. Microsoft will no doubt try to claim more of the search market in 2010 (they still have plenty of fight, and money, left before they back down). The big question that remains is whether Google will set itself apart from Bing. At this point, the two seem to be mirroring each other’s innovations and services rather than offering anything unique. I think once the Facebook and Twitter search buzz fizzles out, the focus will be on content. Will Google pay for results? Will Microsoft? I think this will be the key issue dominating search engine news for the majority of 2010. If one thing’s clear, it’s that Microsoft has an uphill battle in front of them, and they will need to provide something unique from Google to stand out and ultimately succeed.
A recent article from CNN.com reveals that Microsoft is in talks with NewsCorp to “de-list” its news sites from Google. This is interesting news coming off the recent interview with Rupert Murdoch (NewsCorp chairman) in which he spoke out against Google.
Financial Times has reported that Microsoft has contacted other big name online publishers in hopes of persuading them to “de-list” their websites from Google as well.
“This is all about Microsoft hurting Google’s margins,” said a web publisher familiar with the plan.
Are we about to see a bidding war between Google and Microsoft for content?
This could be good news for the newspaper industry, which has yet to develop an online business model that can offset its declining print and advertising revenues.
Matt Brittin, Google’s UK director, told a Society of Editors conference that Google did not need news content to survive. “Economically it’s not a big part of how we generate revenue.”
After Microsoft’s launch of Bing (which reportedly receives around 10% of search traffic, according to comScore), they have been actively trying to steal users from Google. By striking deals with Twitter, Yahoo, and now possibly large news publishers, Microsoft is actively paying for content in order to gain users.
Do you think Google, which currently holds about 60% of the total searches online, would suffer from a “de-listing” from large news sources?
More importantly, how would this affect our current search results? Would paid content get priority?
Are you gaining traffic to your site from Twitter? Are you measuring your traffic through an Analytics program like Google Analytics or Omniture? Well, according to Stan Pugsley, director of business intelligence for iCrossing, up to 70% of referral traffic from Twitter goes unnoticed by analytics applications.
“The problem is not with the web analytics tools, but with the Twitter applications like Tweetdeck and Twhirl that are not based in an Internet Browser,” says Pugsley. Apparently links within these programs are counted as direct traffic rather than referrals. Pugsley explains, “When a user clicks through a link in a tweet, those applications do not register a referring URL that can be picked up by the destination website. It appears that they are coming directly to the site. According to TweetStats, only 31.7% of tweets originate from twitter.com, and those are the visitors that can be tracked back to tweets.”
Is Twitter providing more quality traffic than we originally thought? Since there is no way to break direct traffic down by source, not that I know of at least, we can’t accurately analyze whether these third-party applications really are providing any sort of quality traffic. I wonder how popular programs like TweetDeck and Twhirl really are, and whether they send quality visitors.
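One common workaround for clicks that arrive without a referrer is to tag the links you share with campaign (UTM) parameters, which analytics tools read from the landing-page URL instead of the referrer header. The sketch below builds such a link with Python’s standard library; the parameter values are illustrative, not an official naming scheme.

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def tag_link(url, source="twitter", medium="social", campaign="tweets"):
    """Append utm_source/utm_medium/utm_campaign to a URL, preserving
    any query parameters the URL already has."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_link("http://example.com/post?id=42"))
# http://example.com/post?id=42&utm_source=twitter&utm_medium=social&utm_campaign=tweets
```

A click on a tagged link shows up in Google Analytics under the named campaign even when the client (TweetDeck, Twhirl, a desktop mail reader) sends no referrer at all.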
An interesting YouTube video with Google’s Matt Cutts reveals that Google might change your page title in its search results. “We’ve been willing to show the titles that are most useful,” says Cutts. “Suppose the title of your page is ‘Untitled’, or if there is no title. If that’s the case we try to show a useful relevant title.” This seems like a good idea, but how often do you see a high-ranking page without a title? I don’t think this is something we will notice much in the SERPs.
Cutts explains later in the video that repeated meta tags or title tags might also be altered in Google’s results: “we reserve the right to try to figure out what’s a better title, what’s a more descriptive snippet.” Again, this seems like a good move on Google’s part, but how often are well-ranked pages using duplicate titles and meta tags? Anyone familiar with Google’s Webmaster Tools is well aware of the potential SERPs hit for duplicate titles and tags. This again seems like something we wouldn’t really notice.
Cutts also provides an interesting comment on longer titles: “If you have a title that’s really really long and has a bunch of different words in it, we may still use that in our scoring but when we’re ready to show the snippet to the user we may try to find a better title.” This is interesting news because it’s long been considered good SEO practice to keep page titles under 70 characters. Do Google’s rankings take into account titles longer than 70 characters? It would be interesting to find out whether there is any hidden truth behind this comment.
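If you want to audit your own pages against that 70-character rule of thumb, a small standard-library script will do it. This is just a sketch; the 70-character limit is the conventional SEO figure, not anything Google has confirmed as a cutoff.

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text inside the page's <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def check_title(html, limit=70):
    """Return (title, over_limit) for a page's HTML."""
    parser = TitleParser()
    parser.feed(html)
    title = parser.title.strip()
    return title, len(title) > limit

page = "<html><head><title>My Page Title</title></head><body></body></html>"
print(check_title(page))  # ('My Page Title', False)
```

Feed it the HTML of each page on your site (fetched however you like) and flag the titles that come back with `over_limit` set to `True`.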
Below is the full video:
Yahoo has decided to place its news service in the hands of a human editor, Anthony Moor. Moor currently works as the deputy managing editor for interactive news at the Dallas Morning News. He has been credited with increasing the web traffic of DallasNews.com by 186% and was given the Edward R. Murrow Award for best non-broadcast-affiliated website in 2008.
It’s clear Yahoo is making some big changes. Their homepage re-design has been a success, they’ve pretty much given their search results to Bing (read about it here), and now they’ve decided to get serious about their news presentation.
I think this is a great decision by Yahoo. The one gripe I’ve had with their news service in the past is the fluctuations in the quality of articles. Some are very well written, while others are riddled with typographical errors and poor insight. Placing the news in the hands of a capable editor seems like just the way to fix this problem.
Over on the YouTube Blog, Billy Biggs broke the news that YouTube will be supporting 1080p Videos:
“We’re excited to say that support for watching 1080p HD videos in full resolution is on its way.” Starting this week, “YouTube’s HD mode will add support for viewing videos in 720p or 1080p, depending on the resolution of the original source, up from our maximum output of 720p today.”
YouTube is by far the largest video site on the web, and its transition to full 1080p HD will likely expand its reach even further. As reported by Internet News, Google has been pursuing deals with various entertainment companies to bring their content to YouTube and place ads within the videos, similar to Hulu.
But what about videos already uploaded to YouTube in 1080p?
“Those of you who have already uploaded in 1080p, don’t worry. We’re in the process of re-encoding your videos so we can show them the way you intended.” says Billy.
As long as the ads stick to commercial content only, this is good news. It’s great to see entertainment companies streaming content online; it’s proven to be a great platform for gaining new viewers.
What do you think? Do you like the idea of YouTube becoming a giant DVR?
Rupert Murdoch, no stranger to controversy, is coming out against Google and other search engine giants:
“We’d rather have fewer people coming to our website, but paying.” In reference to “search people,” he next added, “They don’t suddenly become loyal readers of our content.” (source WebProNews)
Murdoch’s position makes sense. He builds all his media outlets with profits in mind. He took the Fox News Network to the top of cable news by creating more entertaining content than his competitors (I’m not so sure about the “fair and balanced” part, though) and providing a point of view that was previously unheard on cable TV. He also lifted the New York Post from near bankruptcy to a successful right-wing newspaper through creative content.
Even more surprising (or not so surprising, considering his background), when asked why NewsCorp doesn’t block search engines, he stated, “Well, I think we will…”
While the possibility of NewsCorp removing all its articles from search engines would be a dream come true for small news outlets, I’m not sure this will be the case. In the interview, Murdoch does cite the Wall Street Journal’s online approach: offering samples in search results and the full articles as part of a subscription service. This seems the more likely path for Murdoch and NewsCorp; why turn away copious amounts of traffic?
What do you think Murdoch and NewsCorp will do?
Here is the video of the full interview with Sky News:
I recently read an article over at WebProNews suggesting that Google might start using the speed of your webpage as a factor in search results. They haven’t made any changes yet, but a WebProNews interview with Google’s Matt Cutts reveals that could change in 2010.
This is nothing but good news; I think sites that load quickly should be rewarded in the SERPs. Good hosting and programming provide a better experience for visitors. It’s becoming increasingly clear that SEO and website design are becoming the same industry: if you’re not building a website with SEO in mind, you’re not building it right.
Even though Google hasn’t changed anything yet, they have already created a Google Code resource center aimed at making the web “faster”. This seems to be a clear indication that speed will be a bigger issue in SEO over the next year.
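In the meantime, you can start keeping an eye on your own load times. Here is a rough sketch that times a full page fetch and sorts the result into buckets; the thresholds are my own assumptions for illustration, since Google has not published any cutoff values.

```python
import time
from urllib.request import urlopen

def fetch_seconds(url):
    """Download the full page body and return elapsed wall-clock seconds."""
    start = time.perf_counter()
    with urlopen(url) as response:
        response.read()
    return time.perf_counter() - start

def speed_bucket(seconds):
    """Classify a load time using assumed, illustrative thresholds."""
    if seconds < 0.5:
        return "fast"
    if seconds < 2.0:
        return "acceptable"
    return "slow"

# Usage (requires a network connection):
# print(speed_bucket(fetch_seconds("http://example.com/")))
```

Run it a few times and average the results; a single measurement includes network noise that has nothing to do with your hosting or code.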
What are your thoughts?