

Google still warns webmasters to avoid hidden text and keyword stuffing, even though these practices are not as common today as they once were.

When you mention hidden text and keyword stuffing, people often think of 50 lines of white text on a white background in the footer of a webpage. It essentially means placing text on a webpage for search engines only, not for users. This was used broadly a decade ago, although it has evolved to include things such as using CSS to position text off the page, placing it underneath another element already on the page, or simply setting the text's visibility to hidden.

Keyword stuffing means repeating your keyword phrase many times within a webpage; in many cases the unnecessary keywords make your content far less worthwhile. This is done so that the keywords are visible to search engines, with the idea that the page will rank better in Google.

Matt Cutts makes a point of saying that using things like JavaScript to create mouseover menus and other user-friendly ways of showing more text is generally OK.

Cutts very specifically brought up the point that many spinner programs – programs that will essentially take content already available on the web and “spin it” to create new content – often don’t pass the keyword spamming test. The output is often gibberish and nonsensical.

It is good to put your keyword into your content, but excessive use of keywords or keyword phrases makes your content sound unnatural. To check whether your content reads naturally, read it aloud; if it sounds forced, you have probably used your keywords excessively, and you should rework the content so that it is both Search Engine Optimization friendly and user friendly.

Why is this a problem? When users search for certain keywords and end up on your site, they want to see those keywords on the page in a useful format. They don't want to end up there due to hidden text located in the footer. Hidden text and keyword stuffing often make for a very poor user experience for the user coming in from Google, which is why Google takes such a strong stance on the issue.

If you get the warning for hidden text or keyword stuffing, the solution is pretty easy: simply remove it. This type of SEO is still utilized today by some of the scammier SEO companies, so if you are unsure where the offending text is, start looking at your actual source code; it is generally either close to the top or close to the bottom of the markup.
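If you're unsure what to look for, here is a hedged sketch of the kinds of patterns involved; the markup, styles, and keywords below are made up purely for illustration:

    <!-- White text on a white background -->
    <div style="color: #ffffff; background-color: #ffffff;">
      buy widgets cheap widgets best widgets discount widgets
    </div>

    <!-- Text hidden outright via CSS -->
    <div style="display: none;">buy widgets cheap widgets</div>

    <!-- Text positioned off the visible page -->
    <div style="position: absolute; left: -9999px;">buy widgets</div>

Any of these patterns appearing in your source code without a legitimate design reason is a red flag.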

Cutts said that occasionally it is a case of a hacked site. Hidden text and keyword stuffing are most common on WordPress sites, so if you get the warning on a WordPress site, your first step should be to check whether you've been hacked; if so, upgrade your WordPress and all of your plug-ins, and then begin your cleanup process.

Read more…

Many people have been ranting in the forums for the past several years about the “unfair advantage” big brands have in Google’s search results. Ever since the “Vince Update” of February 2009, things have been very different in Search Engine Optimization.

I suggest that understanding Vince is perhaps the most important way to get your Search Engine Optimization programs on track.

Google’s Vince Update: What Happened?

When Vince was first announced, Matt Cutts of Google called it a “minor change”. Many smart folks in our industry discounted that assertion.

The first discussion on Vince began on WebmasterWorld. A member had posed the question as to whether others had noticed more preference toward “big brands” in Google’s search results.

The first respondent was, not surprisingly, Ted “Tedster” Ulle (who sadly passed away on June 28 – you can read a “tribute” that I had written for him here). Here’s what Ulle wrote in 2009:

I also have the sense that there is a change in this direction. In 2008 Eric Schmidt made some comments that brands were more important. My only question is whether the influence is from offline or possible some other factor – such as unlinked brand mentions, or social media buzz.

Aaron Wall wrote a great post on Vince shortly thereafter.

The truth is, big brands already had a lot of the “good stuff” baked into their web presence that others would have to fight like hell to achieve. The challenge had been that many of those big brands had no idea how to build a search engine friendly website.

Google has done a pretty good job of getting around those hurdles, as I had written about in my 2009 series on American Express and “credit cards” rankings on Google.

Check out the rankings for “credit cards” on Google, now. More brands. Fewer affiliates. Not at all surprising.

What Vince Taught Us

You must try to build a brand online, just like you had to build a brand in the “old days”: by marketing the business using multiple channels.

Coca-Cola didn’t just buy television ads to become one of the world’s best-known brands. They used a multi-channel approach.

Some ways to build a brand online include:

  • Video
  • Public relations
  • Blog content that goes viral
  • Infographics

Once you’ve built a brand, you’ve established “trust” (with your target audience, as well as the search engines). Again, this isn’t achieved simply by writing some title tags and buying some links. Those days are, gladly, done and over.
Google may be trying to evolve past a (mostly) link-based algorithm. It’s pretty safe to assume that they can monitor for any mentions of your brand in social media (and elsewhere) and that they can probably attribute value to mentions of your domain/URL, even when it isn’t hyperlinked.

Once you have trust, you can get away with a lot more than other websites may be able to get away with.
Links Still Matter (For Now?)
If you represent a brand new website, it takes a lot of time and effort to build this trust (whether that be link equity, or what have you). In fact, if you employ some of the Search Engine Optimization tactics that I still see today from large brands, you could be in for a world of hurt.

Case in point, this blog comment: “Wow that was unusual. I just wrote an incredibly long comment but after I clicked submit my comment didn’t show up. Grrrr… well I’m not writing all that over again. Anyway, just wanted to say superb blog!”
And the user’s name for this blog comment? “Keyword”.

The really scary thing about this is that this blog comment spam was posted 19 weeks ago. This is current stuff. And this is a big company. And, this company is “kicking my (hopefully soon-to-be) client’s butt” in this category.
So, if you were a “common SEO”, you might say to said client, “well, we know that this works for your competitor. We believe you should do this, too!”

Read more

The summer of 2013 has been full of Google updates. The last major update Google released was the Penguin 2.0 update. On July 18th, Google released their newest Panda update. In a statement from Google, we learn, “In the last few days we’ve been pushing out a new Panda update that incorporates new signals so it can be more finely targeted.”

In other words, this update incorporates new signals that should serve to help sites that were affected by the original Panda update. Google will now be rolling out Panda updates on a monthly basis, with each rollout taking place over about a 10-day period. What has been the result of this update so far?

Here are some of the experiences some webmasters have had so far:

  • Increased impressions but roughly the same number of click-throughs (log into your Google Webmaster Tools account for this information)
  • The rankings of informational sites such as Wikipedia and About.com have been heavily impacted.
  • Authority sites will be getting more prominence in the SERPs
  • Sites using Google + are being rewarded

What types of things should you take action on?

  • Start using Google+ if you haven’t already.
  • Enable Google Authorship (a markup sketch follows this list).
  • Stay within the guidelines set forth by Google and focus on quality.
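For context, enabling Authorship at the time meant tying your content to a Google+ profile with rel=author markup; here is a minimal sketch, where the profile URL is a placeholder you would swap for your own:

    <!-- In the <head> of the article page; the href is a placeholder
         for the author's own Google+ profile URL. -->
    <link rel="author" href="https://plus.google.com/your-profile-id"/>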

Read more

Google’s Maile Ohye spoke about Search Engine Optimization best practices for the technical implementation of mobile sites, based on how Google crawls, indexes, and ranks mobile content and presents it to searchers on mobile devices.

Responsive Design, Dynamic Content, or Mobile URLs?

If you’re designing the mobile experience from scratch, this question is the first place to start. If you already have a mobile experience set up, then you can just jump to the section that applies to your site. All three options work well for users and for Google, so use the best implementation based on your infrastructure, content, and audience.

The three implementations at a glance:

  • Responsive design: one URL for both desktop and mobile. The page serves basically the same content to all users but detects the device and screen size and builds the layout accordingly. As the screen size gets smaller, the page may show fewer images, less text, or a simplified navigation.
  • Dynamic serving: one URL for both desktop and mobile. The page serves different content to users of different devices.
  • Mobile URLs: different URLs for desktop and mobile. The mobile and desktop experience might be completely different.

Responsive Design

Using responsive design that detects the device and adjusts the layout accordingly can be a great one-size-fits-all implementation. You just have one URL for any type of device, and the layout adjusts. This works great for smartphones, tablets, laptops, huge monitors, and the dashboard of your flying car. The crawl is efficient, users don’t experience the slowdowns that redirects bring, and search engines have just one page to index and rank.
Users love it; Google loves it; everyone’s happy.
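As a minimal sketch of the mechanics, assuming a 640px breakpoint and hypothetical class names, a responsive page declares a viewport and adjusts the layout with CSS media queries:

    <!-- Scale the page to the device width. -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* On small screens, drop the sidebar and let the main content
         fill the full width. */
      @media only screen and (max-width: 640px) {
        .sidebar { display: none; }
        .content { width: 100%; }
      }
    </style>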

Google recommends that you don’t block crawling of resources such as CSS and JavaScript as they need to be able to construct the responsive page elements.
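For instance, robots.txt rules like these hypothetical ones would hide the very resources Google needs to understand the responsive layout, so rules like this should be removed rather than added:

    # Problematic: blocks the resources Google needs to render the page
    # (the paths are illustrative).
    User-agent: *
    Disallow: /css/
    Disallow: /js/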

Dynamic Serving

With this set up, the server detects the device before returning content and serves the response on a single URL (as above with responsive design). The difference is that the content that’s loaded onto that URL may be totally different depending on the device type.
This is a good option if loading the full content from the desktop version would slow the mobile page down, but it can be more complicated to implement.

Mobile URLs

Google’s mobile Web index stores pages built with feature phone markup, and feature phone users can search through them (yes, still). Mobile XML Sitemaps are for listing these types of pages.

But if your site has separate mobile URLs in these futuristic days (of flying cars and free espresso everywhere), it’s unlikely those pages are using one of these markups. It’s probably just a page you’ve constructed differently to better be used on a smaller screen.

Since Google sees different URLs as different pages, you can do several things to ensure that Google understands the relationship between your desktop and mobile pages so that your site is as visible to mobile searchers as it is to desktop searchers.

Google serves both desktop and smartphone users from a single index, and in cases where both a desktop and a mobile page exist, clusters them together and serves the appropriate version. (See more about this in the ranking section below.)

This implementation is still a great choice, despite the newer options available. It can be a lot easier to keep track of technically and, as long as you follow the tips below, it works well for both users and search engines. In particular, if the content you’re serving mobile users is fairly different from what you’re serving desktop users, this option makes a lot of sense.

Mobile URLs & Redirect Mapping

The first and best thing you can do for both search engines and users is to ensure that both your mobile and desktop pages redirect appropriately. Mobile user-agents that access the desktop pages should be redirected to the mobile versions and desktop user-agents that access the mobile pages should be redirected to the desktop versions. Sounds so simple. So many sites don’t do it.
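As a rough sketch of one half of that mapping, here is what the desktop-to-mobile redirect might look like in an Apache .htaccess file; the user-agent check is deliberately simplified, and the m.example.com host is a placeholder:

    # Redirect mobile user-agents from desktop URLs to the matching
    # mobile URL. Real-world device detection needs a far more
    # thorough user-agent list than this.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (android|iphone|ipod|blackberry) [NC]
    RewriteRule ^(.*)$ http://m.example.com/$1 [R=302,L]

The mobile host would need the mirror-image rules sending desktop user-agents back to the desktop URLs.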

If a desktop user lands on the mobile page and the site doesn’t redirect to the desktop version, that user is stuck with the mobile page, which in addition to not being a great experience, may not make the site any money if the mobile version doesn’t serve any ads.

You don’t need to do anything special for Googlebot-Mobile, as it crawls as a mobile browser, so both it and the regular Googlebot will be redirected correctly if these redirects are in place.

It’s bad enough to not redirect based on device type, but you know what’s even worse? Redirecting mobile users to the home page. If you don’t have a mobile equivalent and a mobile user accesses the desktop page, let them see the desktop page! Accessing a page on a mobile device that’s not designed for that screen isn’t great, but it’s better than being redirected away to a completely irrelevant page and not being able to access the information at all.

Mobile URLs & Adding Meta Data

Google uses a single index for serving content to desktop and mobile users, but clusters the desktop and mobile pages together and serves the appropriate version. In addition to redirects between them, you can add meta data to send signals to Google to make this mapping clear.

Rel=canonical

Use the desktop URL as the rel=canonical value on both the mobile and desktop versions. This consolidates indexing and ranking signals (such as external links) and prevents confusion about potential duplicate content.
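For example, both versions of a page would carry the same tag; the URLs here are placeholders:

    <!-- On BOTH http://www.example.com/page-1 and its mobile
         counterpart http://m.example.com/page-1: -->
    <link rel="canonical" href="http://www.example.com/page-1">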

Rel=alternate media

This attribute enables you to map the desktop and mobile URLs. Use this attribute on the desktop page to specify the mobile version. (You don’t include this attribute on the mobile version to specify the desktop version.)
On the desktop page, include markup like the following (where max-width is whatever you’ve set the page to support; the URLs below are placeholders):
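    <link rel="alternate"
          media="only screen and (max-width: 640px)"
          href="http://m.example.com/page-1">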

You can also specify the alternate in the XML Sitemap.
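Following Google's documented sitemap annotation format (again with placeholder URLs), the entry for a desktop URL would look something like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>http://www.example.com/page-1</loc>
        <!-- The same rel=alternate annotation as in the page markup. -->
        <xhtml:link rel="alternate"
                    media="only screen and (max-width: 640px)"
                    href="http://m.example.com/page-1"/>
      </url>
    </urlset>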
Make sure you specify the canonical version of the mobile URL (don’t just dynamically echo whatever URL is in the browser address bar, which might include optional parameters).

Rel=next/prev

If the site includes paginated content, you would also include the Rel=next and Rel=prev attributes. However, keep in mind that if the number of items listed per page is different on the mobile vs. desktop version, you can’t use Rel=alternate media to cluster the corresponding pages together since the content doesn’t match.
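For example, page 2 of a paginated category might carry links like these (placeholder URLs):

    <!-- On page 2 of a paginated series: -->
    <link rel="prev" href="http://www.example.com/category?page=1">
    <link rel="next" href="http://www.example.com/category?page=3">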

Vary: User-Agent HTTP Header

Whether the site redirects based on device type or simply shows different content (dynamic serving), configure the server to return the “Vary: User-Agent” HTTP response header (see more on this above in the dynamic serving section).
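A response from such a server would then include the header, roughly like this (other headers trimmed):

    HTTP/1.1 200 OK
    Content-Type: text/html; charset=UTF-8
    Vary: User-Agent

This tells caches, and Google, that the content returned for this URL can differ by user-agent.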

Rankings & Mobile Devices

When someone searches Google from a smartphone, they are searching through the same index as they would from a desktop. Because Google clusters the desktop and mobile pages, the following happens in results:

  • Searchers see the desktop version of the URL listed
  • When the searcher clicks, Google loads the mobile version, not the desktop version (this improves the user experience because the page loads faster).

Google uses different ranking signals depending on all kinds of things (the type of query, the location of the searcher, the type of device the searcher is using). In the case of mobile searchers, those signals include the mobile user experience of the page.

Read more…

Good news for webmasters who have been waiting for more flexibility over which deep links are shown in Bing’s search results: Bing has announced changes to deep link management.

You can now log into Bing Webmaster Tools and remove specific deep links that the Bing algorithm has automatically generated and selected to appear under the main search result for a website.

There have been issues in the past with which particular site links Bing has selected to show. Last year, Bing was selecting seemingly random news stories to appear as a deep link to an online newspaper, rather than selecting a much more appropriate section of the website, such as sports or obituaries (though this can be an issue on Google as well).

Now, website owners who notice issues such as this with their own deep links in Bing’s search results simply need to log in to their Bing Webmaster Tools account and use the deep links tool to block any particular deep URL from being displayed in this fashion.

You can also use the tool to prevent a deep link from showing only to specific countries or regions. This would be especially helpful for websites that serve an international audience but are still very country or region centric for actual visitors.

Read more…

Google has announced its next-generation algorithm, dubbed Penguin 2.0, to help combat web spam. It affects almost 2.3% of search queries in the English language, and thousands of websites have been affected by the update.

We have compiled a list of ways in which you can help your site avoid being penalized:

1: Monitor vigilantly for changes:

Whenever you receive news that Google has updated an algorithm, whether it is a Panda update or a Penguin update, you should monitor the keyword ranking fluctuations that follow. Noticing these fluctuations early is essential if you want to recover from any setback in the shortest time span possible.

2: Quality and only quality content

As we all know, content is king. If your site has genuine, unique, and informative content, then you are largely safe from Penguin 2.0 penalization.

3: Check your inbound and outbound links:

If your website still has not recovered from Google’s update and has not reverted to its original rankings, then maybe it is time to snoop around a bit. The first place to look is your inbound and outbound links.

To check the bad inbound links pointing at your website, perform a Google search with the link: operator (link:www.yourdomain.com), or use the “Links to Your Site” report in Google Webmaster Tools. This will surface the links directed at your website. Check each and every link and disavow the ones that carry spam-related attributes. Such sites are easy to recognize: link farms, paid links, etc.

To check for bad outbound links, cross-check each and every page of your website that links out to other websites. If your website links to a low-quality website, there is a high chance that your website has fallen prey to the algorithm.

4: Create High Quality Back Links:

To rid your website of the Penguin 2.0 plague, you need to generate high-quality backlinks. The backlinks you build should be natural, not artificial. Avoid paid directories, and try actively participating in good forum discussions to become recognized before you start earning these backlinks.

Google Penguin 2.0 is just an algorithm update, not a penalty program that Google launched to punish your website. Apply these tips and regain your position.

Please share your views about the post. Thank you!

Good news for webmasters who have been struggling to identify the links and pages that have been triggering warnings in Google Webmaster Tools: Google will now include example URLs in the emails warning webmasters about manual spam actions.

There are many cases where webmasters know a manual web spam action has been taken against their site, but they can’t figure out what is triggering it, or they mistakenly believe the wrong thing has caused the manual spam action.

In a Webmaster Help video, Google’s Matt Cutts details the types of cases where people struggle to identify problems on their site, as part of how Google Webmaster Tools is trying to provide more concrete, actionable information in its emails to webmasters.

“For example, we’ve seen a couple sites that had millions of pages that had manual web spam action taken on just a very small number of pages, or in one case, just one page. But they got a message saying ‘Hey you need to look out because some of your content have been defaced’, and they didn’t know exactly where to look.”

Don’t think Google will hold your hand and include every single thing that is wrong with your website, but getting a couple of very specific URLs to show what is wrong with your site, which you can then use to help identify other pages with the same issue, definitely helps take some of the guesswork out of the cleanup equation.

“Now we won’t be able to show every single thing that we think is wrong for a couple reasons,” Cutts said. “Number one, it might help the spammers. Number two, if there are a lot of bad pages, we could be sending out emails that are 50 MB long. But we do think that it’s helpful if we can include a small number of example URLs that will help you as a webmaster know whenever you try to fix things and clean the site back up.”

Read more…

As we know, Matt Cutts announced that Google Penguin 2.0 began rolling out on May 22, 2013. Here are some key points that you should know about Penguin 2.0.

1. Penguin 2.0 affected 2.3% of all English-US queries

Lest 2.3% sound to you like a smallish number, keep in mind that there are an estimated 5 billion Google searches per day; 2.3% of 5 billion is roughly 115 million queries a day. The impact is bigger than a little decimal number may suggest.

2. Other language queries are also affected by Penguin 2.0

Although the vast majority of Google queries are conducted in English, there are hundreds of millions of queries conducted in other languages. Google’s algorithmic impact extends to these other languages, putting a bigger kibosh on webspam on a global level. Languages with higher percentages of webspam will be affected more.

3. There will be more Penguins

We haven’t heard the last of Penguin. We expect additional adjustments of the algorithm, as Google has done with every single algorithmic change that they’ve ever performed. Algorithms evolve with the ever-changing web environment.

Matt Cutts mentioned,“We can adjust the impact but we wanted to start at one level and then we can modify things appropriately.” One commenter on his blog asked specifically about whether Google would be “denying value upstream for link spammers,” and Mr. Cutts replied, “that comes later.”

Over time, the algorithm eventually catches up with webspam. There may still be some ways to game the system, but the games come to a screeching halt when a Panda or a Penguin walks onto the ball field. It’s always best to obey the rules of the game.

Check if you are affected by Penguin 2.0

If you’re wondering whether Penguin 2.0 has affected you, you can perform your own analysis.

  • Check your keyword rankings. If they have declined substantially beginning on May 22, there is a good chance that your site is affected.
  • Analyze the pages that have received the most link building focus, for example your home page, conversion pages, category pages, or landing pages. If traffic has declined drastically, this is a sign of a Penguin 2.0 impact.
  • Track your organic traffic deep and wide. Google Analytics is your friend as you study your site, and then recover from any impact. Pay special attention to the percentage of organic traffic, and do so across all of your major site pages.

If you’ve been affected by Penguin 2.0, here’s what you need to do:

How to Recover from Penguin 2.0

  • Identify and remove spammy or low-quality pages from your website.
  • Identify and remove spammy inbound links. To identify which links could be bringing down your rankings and causing you to be affected by Penguin 2.0, you’ll need to perform an inbound link profile audit (or have a professional do it for you).
  • After you’ve identified which links need to be removed, attempt to remove them by emailing the webmasters and asking them politely to remove the link to your website. After you’ve completed your removal requests, be sure to disavow them as well, using Google’s Disavow Tool (a sample disavow file follows this list).
  • Engage in a new inbound link building campaign. You need to prove to Google that your website is worthy of ranking at the top of search results. To do so, you’ll need some trustworthy votes of confidence from credible third parties. These votes come in the form of inbound links from other publishers that Google trusts.
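As referenced above, a disavow file is just a plain text list in Google's documented syntax; the domains below are made up:

    # Lines starting with # are comments.
    # Disavow one specific spammy page:
    http://spam-site-example.com/paid-links.html
    # Disavow every link from an entire domain:
    domain:link-farm-example.com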

Focus on powerful content, and work only with reputable Search Engine Optimization agencies with a proven record of helping sites to succeed.

Much has been written about the value of title tags, the most important of all SEO attributes. What’s often highlighted are the dos and don’ts in terms of syntax. From a pure conversion and marketing perspective, however, there is a lot of additional value to be squeezed out of title tags.

Ecommerce businesses have already begun to differentiate themselves within the search engine results pages to steal more clicks from the competition. To do so, they have looked at title tags as another marketing platform: to convey offers, to detail what’s inside (the site), to reinforce the brand’s value proposition, and to position by price point.

1. Keyword Placement – Exact Match, Listed First

Keywords should be placed as an exact match, starting with the most important term and ending with the brand. (If the brand is in the domain, this is already going to be a known match for the query.) Titles should be kept under 70 characters; terms beyond that length may be ignored.

Search engines read terms exactly as they are written; this means singular and plural are considered different terms, and word order matters. By writing out key terms as an exact match (“emerald dresses” vs. “casual emerald cotton dresses”) in the title tag, the page is more likely to match the broader term.

In doing so, the site is giving the page better chances of being listed higher on the page in the results, which correlates with increased click-through rates. This also reassures searchers of the likelihood they will find product matching their query.
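Putting those rules together, a hypothetical category page title might look like this, with the exact-match term first, the brand last, and the whole tag under 70 characters:

    <title>Emerald Dresses - Casual and Cocktail Styles | ExampleBrand</title>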

In a search for “popcorn maker,” the first result is a company’s name, followed by an Amazon category. (Unfortunately beating out Amazon in results is very difficult.) The first ecommerce result is Best Buy, with a title tag matching “popcorn maker” exactly. Search engines and consumers get a very clear sign what this result will return.

2. Free Shipping and Other Offers Included in Title Tag

Consumers have choices, not only about where to shop, but about which search engine listing to click. Offering free shipping has increasingly become necessary for all ecommerce businesses; consumers have now come to expect it, based on their experiences with Amazon, Zappos, and other sites.

3. Reinforcing the Brand Value Proposition

For some pure play ecommerce sites, the search results offer another way to reinforce the brand value proposition. Bluefly, a designer discount site, highlights the potential savings of shopping the site, right within the title tag. Shop Designer Jeans here, and you might save 70 percent off full retail.

4. For Niche Categories, Optimize for all Keyword Variations

For niche terms and/or categories, optimizing for multiple lower volume terms can offer the potential of greater traffic in aggregate.

5. Action Words – Convey Search Intent

Though consumers are now used to shopping online, it still isn’t a given that you can buy every type of item this way. For example, not all stores sell furniture online, or if they do, the delivery or shipping costs are prohibitively expensive. In a search for “buy couch online,” the only optimized title tag result found is for Overstock.

Summary

Based on title tags alone, it seems that the first steps of commerce are beginning to take place right within the search engine result page.

A consumer can now get a better sense of offers and pricing before even leaving a search; this is a course charted by Google’s design and is more readily apparent with the newer paid Shopping results. And Google’s algorithm orders results based on what will give the searcher (here, the consumer) what he/she wants the fastest.

Source – Search Engine Watch

Webmasters have been watching for Penguin 2.0 to hit the Google search results since Google’s Distinguished Engineer Matt Cutts first announced that there would be the next generation of Penguin in March. Cutts officially announced that Penguin 2.0 is rolling out late Wednesday afternoon on “This Week in Google”.

“It’s gonna have a pretty big impact on web spam,” Cutts said on the show. “It’s a brand new generation of algorithms. The previous iteration of Penguin would essentially only look at the home page of a site. The newer generation of Penguin goes much deeper and has a really big impact in certain small areas.”

In a new blog post, Cutts added more details on Penguin 2.0, saying that the rollout is now complete and affects 2.3 percent of English-U.S. queries, and that it affects non-English queries as well. Cutts wrote:

We started rolling out the next generation of the Penguin webspam algorithm this afternoon (May 22, 2013), and the rollout is now complete. About 2.3% of English-US queries are affected to the degree that a regular user might notice. The change has also finished rolling out for other languages world-wide. The scope of Penguin varies by language, e.g. languages with more webspam will see more impact.

This is the fourth Penguin-related launch Google has done, but because this is an updated algorithm (not just a data refresh), we’ve been referring to this change as Penguin 2.0 internally. For more information on what SEOs should expect in the coming months, see the video that we recently released.

Webmasters first got a hint that the next generation of Penguin was imminent when back on May 10 Cutts said on Twitter, “we do expect to roll out Penguin 2.0 (next generation of Penguin) sometime in the next few weeks though.”

Read More on Search Engine Watch
