
Rich77ard

New Member
  • Content count: 10
  • Joined
  • Last visited
  • Days Won: 1

Rich77ard last won the day on November 23 2016

Rich77ard had the most liked content!

About Rich77ard

  • Rank: New Member
  • Birthday: 07/03/1972

Profile Information

  • Gender: Male
  • Location: Australia
  • Interests: Search engine optimisation and SEO software.
  1. Rich77ard

    WordPress SEO – 2016 Checklist

    Yes, a lot of that stuff always gets overlooked and people wonder why they don't rank. Let's not forget:
    • Schema markup
    • Verifying sites with webmaster tools (including Bing)
    • Other quick verification gains like Norton, Trend Micro, DMCA links and Web of Trust
    • Clear business credentials in the footer, i.e. address, opening hours, etc.
    All basic but powerful trust signals for Google.
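    To show what the Schema markup and footer-credentials points can look like in practice, here's a minimal sketch (not from the original post) that prints a JSON-LD LocalBusiness block mirroring the address and opening hours you'd show in the footer. The business name, address, phone number and hours are placeholder values.

```typescript
// Hypothetical example: a JSON-LD LocalBusiness block matching the footer
// credentials (address, opening hours). All values below are placeholders.
const localBusinessSchema = {
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  name: "Example Plumbing Co",        // placeholder business name
  url: "https://www.example.com",     // placeholder site URL
  telephone: "+61 2 5550 0000",       // placeholder phone number
  address: {
    "@type": "PostalAddress",
    streetAddress: "1 Example Street",
    addressLocality: "Sydney",
    addressRegion: "NSW",
    postalCode: "2000",
    addressCountry: "AU",
  },
  openingHours: ["Mo-Fr 09:00-17:00", "Sa 09:00-12:00"],
};

// The script tag a theme or SEO plugin would emit into the page <head>.
const jsonLdTag =
  `<script type="application/ld+json">${JSON.stringify(localBusinessSchema, null, 2)}</script>`;

console.log(jsonLdTag);
```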
  2. Yes, it's a fantastic update. I've seen sites recover back to the first page for tough keywords without doing anything at all to recover. It's as though Google simply lifted the over-optimisation penalty and they bounced back to where they should have been without the spammy links behind them: the over-optimised links obviously weren't helping them much anyway, so they just settled at the position they would have held without the spam.
  3. To hide the links (i.e. block the Ahrefs bot) it needs to be done on both sites, the target site and the linking site, and I doubt all 10 of those sites are doing that. Also, even if the exact match anchor text density is below 1%, Ahrefs still shows it as 1% as long as there is at least one exact match link.

     That's how I try to rank my sites all the time: with 0% exact match anchor text density. I've found that using the keyword right next to some generic anchor text works really well at the moment. For example: your keyword 'click here', where 'click here' is the hyperlink.

     For other search terms, anchor text is still running riot. For example, for the keyword 'Seo Seattle', the site in first place has a 31% exact match ratio (1.06k links) and the site in third place has a 75% exact match ratio (with 6.22k links)!

     The latest Penguin seems to be more about the quality of traffic, dwell time on site and the overall quality of your entire link profile, not whether you have a certain anchor text density. If it were purely about anchor text over-optimisation, those two sites would have been penalised.
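     For anyone unsure how exact match ratios like those are arrived at, here is a minimal sketch of the calculation from a backlink export. The keyword and backlink list are made-up placeholders rather than data from the post, and a real tool like Ahrefs may round or group anchors differently.

```typescript
// Hypothetical backlink export: one entry per link with its anchor text.
// These anchors are placeholders, not real data from the post.
interface Backlink {
  fromUrl: string;
  anchorText: string;
}

const keyword = "seo seattle"; // the exact match phrase being measured

const backlinks: Backlink[] = [
  { fromUrl: "https://siteA.example/post", anchorText: "seo seattle" },
  { fromUrl: "https://siteB.example/blog", anchorText: "click here" },
  { fromUrl: "https://siteC.example/page", anchorText: "best seo seattle company" }, // partial match
  { fromUrl: "https://siteD.example/dir", anchorText: "https://mysite.example" },    // naked URL
];

// Exact match = anchor text equals the keyword (case-insensitive).
const exactMatches = backlinks.filter(
  (link) => link.anchorText.trim().toLowerCase() === keyword
).length;

const density = (exactMatches / backlinks.length) * 100;

// One exact match out of four links prints 25.0%; a single exact match link
// among thousands would still give a small non-zero figure, which is why a
// report can never show a true 0% once even one such link exists.
console.log(`Exact match anchor density: ${density.toFixed(1)}%`);
```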
  4. I consistently see top 10 results for some pretty big keywords where none of the sites have a single exact match anchor text link according to Ahrefs. Yes, they could be blocking the Ahrefs bot (though they would have to do it on both the money site and the site the link is coming from; a one-sided block doesn't work and Ahrefs can still see the link), but that's highly unlikely as I'm seeing it across a wide variety of niches.

     Here are two big keywords as an example:
     1. SEO New York
     2. SEO New Orleans
     Every site in the top 10 results for these two keywords has 0% exact match anchor text density for those keywords. Pretty surprising, right? It just goes to show that you don't need exact match links to rank anymore; partial match, generic anchor text and naked links do just fine. There's also nothing better than knowing you're ranking in 1st place without any exact match anchors behind you.
  5. What's the perfect anchor text density post-Penguin for 2016? The proof is hidden in plain sight, right there in the SERPs, and the results are counter-intuitive to all of the regurgitated propaganda you hear online.

     I just did a search for 'web design' and it's giving me results for Sydney (Australia), as that's where I'm searching from. That's a pretty difficult keyword on its own. For that keyword:
     • the site in 1st place has 10% exact match anchors
     • 5th place has 24% exact match anchors
     • 8th place has 41% exact match anchors
     • 9th place has 12% exact match anchors!

     The sites successfully ranking in 5th and 8th have huge exact match anchor text ratios. They also have a lot of links: 1.35K and 163K links respectively. (Local search volume for that keyword is 6,600 per month at $14 per click.)

     Let's try another hard keyword, 'seo sydney':
     • 4th place has 24% exact match (8k links)
     • 5th place has 20% exact match (465 links)

     By traditional Penguin theory all of those sites should have been hit, but they're fine. The algorithm and its tolerances are different for every niche; what you can get away with in one niche will bury you in another.

     What you'll notice as you start to dig into those sites is that they have some really good links in their profiles. They're not just full of junk, which is why they can get away with such high exact match ratios. Over-optimisation of anchor text isn't the main problem; the real issue is having exact match links coming from rubbish websites. Exact match links coming from great sites will never be a problem, no matter how high your anchor text density is. After all, a truly good link is meant to be a link you have no control over, so why would Google really care if everyone chose to link to you with the same anchor text?
  6. Why aren't you building links to Google local snack pack listings? I did a quick test case study of this last year and got some fantastic results. I built a bunch of links to a Google business page and rocketed it into the top three for a really competitive term. The verified address was nowhere near the centroid of the city, but a lot of the other top three spots weren't either. There was definitely no shortage of place listings, the local keyword search volume was over 2,400 per month, and the competition was medium/high as the service yielded in excess of $1,000 per job.

     Because it's a Google property that already has a DA of 100, business pages are very sensitive to links: you can achieve great results with very few of them, and they can absorb spam with very little negative effect (although I wouldn't recommend it, as there's no need to do it like that).

     Want to know how I took the business page from position 15 to position 2 in under 21 days? Just 14 (spun) blog comments on half-decent sites. That's it! This strategy still works great, yet no one talks about it.

     Here are the types of links that work when you point them at your Google business page:
     1) Good on-topic PBN links using partial match anchors only.
     2) Softly weighted 301s: a topically relevant (topical trust flow) expired domain with no more than 20 links behind it, so you're building 20 links in one go (a minimal redirect sketch follows after this post).
     3) Half-decent blog comments work like gold as well, with partial match anchors.
     4) Even though they're nofollow, send links from all of your other legitimate profiles about the same business back to your Google+ business page.
     5) An in-content contextual link from your homepage.
     6) Of course, the usual citation links from directories. Sometimes I build citations and list the main website link as the Google business page URL instead of the website URL, so you get a proper link back to the Google business page.

     This strategy works beautifully when complemented by a proper Google business page that has been fleshed out correctly with a few posts, some images and videos. Whenever you build new links to the page, update it with a few more posts, as Google loves fresh content on its own properties. There's no excuse for not being in the snack pack anymore with your Google business page, especially if you're already in the top 10.
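     Mechanically, the 'softly weighted 301' in point 2 boils down to every URL on the expired domain answering with a permanent redirect to the target page. This is my own minimal sketch, not from the original post: it assumes the expired domain is pointed at a small Node/TypeScript server, and the target URL is a placeholder. In practice the same thing is usually done with a rewrite rule at the web server or hosting level rather than custom code.

```typescript
import { createServer } from "node:http";

// Placeholder target: the page the expired domain's link equity is pointed at
// (a Google business page URL in the strategy described above).
const TARGET_URL = "https://www.google.com/maps/place/your-business-placeholder";

// Answer every request on the expired domain with a 301 (permanent) redirect,
// so each of its existing backlinks effectively becomes a link to the target.
const server = createServer((_req, res) => {
  res.writeHead(301, { Location: TARGET_URL });
  res.end();
});

// Port 8080 for the sketch; a real deployment would sit behind port 80/443.
server.listen(8080, () => {
  console.log("301 redirect server running for the expired domain");
});
```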
  7. Rich77ard

    My SEO Experiment

    Firstly, congratulations on your success so far. It seems as though you put 'taking action' as a higher priority than 'over-analysing' and procrastinating. Keep going! (Your question: 'I want to know if the following things have an effect on SEO or not.')

    1) Exact Match Domain = it does help a little if used correctly.

    2) Keyword In The Domain = yes, that still helps; in my opinion a partial match domain is actually better for the long term than an EMD.

    3) Ranking HomePage = doesn't make a difference. Google ranks quality pages with good links going to them; it doesn't matter where that page sits on a domain.

    4) Do Not Post Spun Articles On Your Own Domain (I have 90%+, most of which are just spun articles; they make very little sense but each of them is at least 95% unique) = perhaps they just haven't caught up with you yet. With a lot of unreadable low-quality content you'll eventually get hit by Panda. If your content is diverse enough and you're not continually posting about exactly the same subject, you may get away with it for longer; if all of your posts are roughly about the same keywords, that's a disaster, but if the post topics are very diversified, who knows, you may get away with it for years. Bear in mind that if those inner pages start ranking and you start getting traffic, the analytics metrics would tank the site as well, because no one will stay on a page and read content that doesn't make sense. Content uniqueness has very little to do with it, actually: even if you copy content from another website, as long as 50% of it is unique, Google will still index it and treat it as unique content.

    5) Length Of The Article Should Be Between 400 To 700 (my article is 11,000+ words long) = if these articles are just there to fatten up the content on your domain, then I guess they're serving their purpose. If you actually want those articles to rank, the right word count to aim for is the average of the top 10 sites already ranking for the keyword you're trying to rank for (a quick sketch of that calculation is below).
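    To make that last point concrete, here's a rough sketch of the 'average of the top 10' idea. The word counts below are placeholder numbers, not real measurements; in practice you'd count the words on the pages actually ranking for your keyword.

```typescript
// Placeholder word counts for the current top 10 ranking pages for a keyword.
// These are made-up numbers purely to illustrate the calculation.
const topTenWordCounts: number[] = [
  1850, 2200, 1600, 2450, 1900, 2100, 1750, 2300, 2000, 1850,
];

// Target length = simple average of the competitors already ranking.
const averageWordCount =
  topTenWordCounts.reduce((sum, count) => sum + count, 0) / topTenWordCounts.length;

console.log(`Aim for roughly ${Math.round(averageWordCount)} words`); // ~2000 with these numbers
```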
  8. Rich77ard

    Gray Hat SEO?

    For me, anything that is illegal is black hat; think hacking sites, etc. Everything else is pretty much fair game. Not that I condone it, but there are all sorts of dodgy grey hat strategies that work beautifully, and they continue to work, which is why so many white hats complain about them. That said, good, strong white hat SEO (think proper content marketing and distribution) is still very, very hard to beat in the SERPs with shortcut grey hat strategies.
  9. Yes, it's a great feature that very few website owners leverage properly. It has to be used sparingly, though: many site owners preload too many pages, or they preload the wrong page. When you use visitor behaviour profiling tools like Mouseflow you can learn which links the majority of your visitors click on, and then apply the rel=preload hint to those links only. It doesn't actually improve your site's speed test scores, but it does give the user a faster experience when they click through to the next page, as it has already been fetched.
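     A minimal sketch of what that looks like on the page. This is my illustration rather than anything from the post, and note that for fetching a likely next page ahead of the click, browsers use the rel="prefetch" hint (rel="preload" is intended for resources needed by the current page). The URL is a placeholder standing in for whichever link your behaviour data shows most visitors click next.

```typescript
// Placeholder: the page your behaviour profiling data says most visitors
// navigate to next. This URL is an assumption for illustration only.
const mostClickedNextPage = "/pricing";

// Add a prefetch hint so the browser fetches that page during idle time,
// making the eventual click-through feel instant.
function prefetchLikelyNextPage(url: string): void {
  const link = document.createElement("link");
  link.rel = "prefetch";
  link.href = url;
  document.head.appendChild(link);
}

// Add the hint only after the current page has finished loading, so the
// prefetch never competes with resources the visitor needs right now.
window.addEventListener("load", () => prefetchLikelyNextPage(mostClickedNextPage));
```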
  10. Yes, I'm seeing quite a bit of fluctuation across my 50+ sites. Sites that have been stuck in first place for years thanks to rock-solid on-site SEO have dropped to the bottom of the first page for the first time in a very long time, while the sites that used off-site linking strategies all got a nice little boost. The new update seems to favour off-site links again. As far as I can tell this is pretty good news: with more weight back on off-site links, rankings are much easier for us to manipulate.