Hello John. Moved your question to a new thread. The Google Penguin updates are all about link quality and your "linkscape", i.e. does it appear natural or artificial. If you have been penalised, then take a look at the way you have been building links. If you've been taking shortcuts or using dubious techniques, then you need to re-evaluate and address any issues. Look in your Webmaster Tools and see if you have any messages, or use something like Open Site Explorer to try and spot any issues. If you have been concentrating on keyword anchor text and you have countless links with the same anchor text, then this can be an issue. Remove any links from bad neighbourhoods if possible.
Not getting at the OP here, but why do so many people come into forums asking how to recover from these updates instead of simply preventing what they threaten to do? Perhaps people just sit tight and hope for the best, doing nothing and thinking "well, if I get hit then I'll do something about it". indizine
Problem is, spamming, blackhat and shortcuts have worked so well for so long, it's not surprising so many people are caught up in it. I've had a few directory and large site owners ping me asking if we had also been hit hard by various updates, but other than a bit of keyword churn, overall (touch wood) we've weathered Panda, Penguin and the like pretty well. But if you game the system, at some point you're going to crash. No doubt we've all done a bit of keyword anchor text, but overall I think we've played it pretty straight over the years, and the only time we've generated dubious quality links is when I've had a go at outsourcing some link building, which 9 times out of 10 has been an expensive waste of time. I only hope that people who have been bitten by poor SEO techniques don't now get suckered into social optimisation by buying thousands of likes and +1's, as they will no doubt get penalised and fall flat on their face in about 2 years' time....
“....and the only time we've generated dubious quality links is when I've had a go at outsourcing some link building, which 9 times out of 10 has been an expensive waste of time” Tell us about the one successful time, please.
“If you have been concentrating on keyword anchor text and you have countless links with the same anchor text, then this can be an issue.” Any thoughts on how many is too many, please? 10-20? 30-50? Can you mollify Google by very, very slightly varying your anchor text (e.g. graduate careers advice, graduate career advice, etc.)? Also, I heard that Google is now ranking website pages according to the length of visits made by visitors using the appropriate keywords. My stats for very similar keywords on a page vary from around 7 minutes to 0 seconds, so how does Google make sense of it all? Ho hum. Linda CareersPartnershipUK
“Any thoughts on how many is too many, please? 10-20? 30-50? Can you mollify Google by very, very slightly varying your anchor text (e.g. graduate careers advice, graduate career advice, etc.)?”

I can only go on my interpretation, so I may be talking cobblers, but I think it's more of a ratio than an actual number. I.e. if 70% of your links had the exact same anchor text, then that looks unnatural and could be an issue. Likewise, if you have hundreds or thousands of links but from the same domain, then these will be discounted as duplicates, so not an issue (e.g. blog rolls, footers, forums etc.). I had over 2 million links from one of my domains pointing to the other with the same keyword phrase. Probably not best practice, but we didn't seem to get penalised. In reality, they will be disregarded as dupes. I have now changed it just in case! Varying your anchor text is the way you should do it; it makes the profile look more natural.

Re Google ranking on visitor time, I'm more cynical. I've been told in the past they look at bounce ratios, length of visit etc., but if these are signals, they are minor ones and not worth losing sleep over. As a directory, we have a high bounce rate, as people arrive mid-site from Google after looking for a business or phone number, get it, then leave. Whereas something like a forum is a lot more sticky, so it will have longer page times and a lower bounce rate. But they serve different audiences, so it's hard to write a one-size-fits-all rule. I still think content is a deciding factor, and how it is presented (i.e. number of adverts and amount of non-content above the fold etc.) rather than things like length of visit. Like I say, I can only talk from my experience.
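The "ratio, not a number" idea above can be sketched in code. This is a hypothetical audit script, not anything Google publishes: it assumes you have exported your backlinks (e.g. from Webmaster Tools or Open Site Explorer) as (source URL, anchor text) pairs, counts each domain/anchor combination only once to mirror the point about sitewide footer and blogroll links being discounted as duplicates, and then reports what share of the remaining links each anchor holds.

```python
from collections import Counter
from urllib.parse import urlparse

def anchor_text_ratios(links):
    """links: iterable of (source_url, anchor_text) pairs.
    Returns {anchor: share of deduplicated links}, counting each
    (domain, anchor) combination once so sitewide links from a
    single domain don't skew the ratio."""
    seen = set()
    counts = Counter()
    for url, anchor in links:
        domain = urlparse(url).netloc.lower()
        key = (domain, anchor.strip().lower())
        if key in seen:
            continue  # same anchor from the same domain: treat as a dupe
        seen.add(key)
        counts[key[1]] += 1
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.items()}

# Illustrative data only (example.* domains are placeholders).
links = [
    ("http://blog-a.example/post1", "graduate careers advice"),
    ("http://blog-a.example/post2", "graduate careers advice"),  # dupe
    ("http://blog-b.example/page", "graduate careers advice"),
    ("http://forum-c.example/t/1", "CareersPartnershipUK"),
    ("http://dir-d.example/listing", "www.example.com"),
]
ratios = anchor_text_ratios(links)
# "graduate careers advice" ends up at 50% of deduplicated links,
# which by the 70%-looks-unnatural rule of thumb would still be high.
```

The threshold itself is a judgment call; the point is to look at the distribution rather than a raw count.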
“Problem is, spamming, blackhat and shortcuts have worked so well for so long, it's not surprising so many people are caught up in it. ... I only hope that people that have been bitten by poor SEO techniques now don't get suckered in to Social Optimisation by buying 1000's of likes and +1's as they will no doubt get penalised and fall flat on their face in about 2 years time....” Yep, so agree with you. Websites using black hat SEO have been penalised because Google keeps updating its algorithm. It's after quality content now rather than spam; it's much smarter these days. caleb23
Penguin 2 is all about web spam and black hat practices, such as exact-match anchor text. You can take the following actions to identify and recover from it:
CBIL360
I've seen quite a few cases of penalties, and in almost every single case it was cheaper, easier, and faster to start from scratch with a new domain. I know that's the absolute worst thing any website owner wants to hear, but it's true. I just haven't seen disavowing links like Google suggests do much at all. The only case I saw where it actually worked was when the site had a handful of spammy paid links, a handful of links on a few blog networks, and maybe 100 spam forum profiles. Those weren't too difficult to clean up. As far as anchor text distribution goes, I'd recommend only a small fraction of all your links contain keywords. Like less than 10%, perhaps less than 5%. Co-citation is quickly becoming more important anyway. Case in point: I own a local SEO shop and one of our clients is a roofing company ranked #1 in their local area for the term "roofing companies", and not a single anchor is exact match. John Crenshaw
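For anyone who does go the disavow route mentioned above rather than starting fresh, Google's disavow tool takes a plain-text file: `#` lines are comments, `domain:example.com` lines disavow a whole domain, and bare URLs disavow individual pages. Here is a minimal sketch that assembles such a file from an audit; the input lists and domain names are hypothetical examples, not a recommendation of what to disavow.

```python
def build_disavow_file(bad_urls, bad_domains):
    """Assemble the plain-text format Google's disavow tool accepts:
    '#' comment lines, 'domain:...' lines for whole domains, and
    bare URLs for individual pages."""
    lines = ["# Disavow file generated from link audit"]
    for d in sorted(set(bad_domains)):
        lines.append("domain:" + d)
    for u in sorted(set(bad_urls)):
        lines.append(u)
    return "\n".join(lines) + "\n"

# Hypothetical audit results.
disavow_txt = build_disavow_file(
    bad_urls=["http://spam-forum.example/profile/12345"],
    bad_domains=["spammy-directory.example"],
)
# Write disavow_txt to a .txt file and upload it via Search Console.
```

Disavowing whole domains rather than individual URLs is generally the safer choice for obviously spammy sites, since new pages on the same domain stay covered.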