TerryKyle

Shoemoney Vs Rand Fishkin (Round 1)

Recommended Posts

The main problem I have with the idea is where he says

“with the eventual goal of classifying, identifying and removing/limiting link juice passed from sites/pages”

Maybe I am misunderstanding what he is saying here, but since when does SEOmoz have any input whatsoever on how sites/pages rank? Umm... duh... It's not like Google relies on SEOmoz data for anything. In that respect he came across to me as some kind of “super ego d*ckhead”, which I find very distasteful.

Setting that part of the statement aside and considering the idea on its own, I think there is market demand for a tool that measures how closely connected a page/site is with spam and how likely a site/page is to be considered spam.

If I were him, I would not bundle it with SEOmoz or even make it an SEOmoz service. I think it has too much bad juju associated with it. People will be royally pissed if you accuse their site of being spammy in any way, so in that respect I think it would be really, really bad for the SEOmoz brand, which as of now is in pretty respectable standing.

So my advice to him would be to move forward with the project, but build a completely separate website/business around it that just uses data from SEOmoz but is in no other way directly tied to SEOmoz.

I would create a toolbar, linked to the data, that offers various metrics such as:

1) Inbound links from probable spam domains

2) Inbound links from probable spam pages

3) Outbound links from this domain to probable spam domains or pages

4) Total percentage of inbound links originating from likely manipulative sources (not necessarily considered spam)
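As a rough illustration of how a metric like (4) might be computed, here is a minimal sketch, assuming you already have a list of inbound-link source domains and some classifier for "probable spam". The function name, the classifier, and the data below are all invented for illustration; none of this is SEOmoz's actual API or methodology:

```python
# Hedged sketch: a toolbar metric like (4) boils down to classifying each
# inbound link's source and reporting the flagged fraction as a percentage.

def spam_link_percentage(inbound_domains, is_probable_spam):
    """Return the percentage of inbound links whose source domain
    the classifier flags as a likely manipulative source."""
    if not inbound_domains:
        return 0.0
    flagged = sum(1 for d in inbound_domains if is_probable_spam(d))
    return 100.0 * flagged / len(inbound_domains)

# Toy classifier: flag domains found on a known-spam list.
KNOWN_SPAM = {"spamfarm.example", "linkwheel.example"}
classifier = lambda domain: domain in KNOWN_SPAM

links = ["spamfarm.example", "nytimes.com", "linkwheel.example", "bbc.co.uk"]
print(spam_link_percentage(links, classifier))  # → 50.0
```

In practice the hard part is the classifier itself, not the arithmetic — which is exactly the part such a service would keep proprietary.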

Whether Rand chooses to develop this service or not, eventually someone will, because there is demand for it in the marketplace. And now that he has suggested the idea, the demand for such a service will increase dramatically. Maybe some other data cruncher like Alexa or Majestic will someday develop and offer a similar service.

Heck, even Google could offer a metric such as “Total percentage of inbound links likely originating from low-quality sources” as part of Webmaster Tools. If they want to keep their methodology a secret, they could accomplish that by simply pushing the data out to webmasters once or twice per year. That would make it really hard to run any meaningful tests on which domains/pages are considered low quality. Yet it would also give webmasters some kind of big-picture overview of their site according to Google.


Although I think this "spam research project" by Rand is utter BS, I'm actually glad he's doing it. I use SEOmoz metrics all the time, and if his system works as advertised, this will help clean up the scores a bit and spit out fewer false positives. I'm tired of seeing domains with a PA of 75 and nothing but spam.


Who really decides what is spam and what isn't?

I wonder how long until some of these "ratings" begin to have legal implications? Let's say SEOmoz slams a domain as "spam" and "junk". The domain loses traffic, business and $$. Then the domain owner, with deep financial pockets, sues SEOmoz for slander and loss of business, and SEOmoz spends a couple of years and many $$$ in legal fees.

Now that would be interesting.

Edited by Mike608

I use SEOmoz metrics all the time

Really? I don't consider SEOmoz / OSE metrics to be worth using for anything. Mostly because...

I'm tired of seeing domains with PA of 75 with nothing but spam.

...which indicates these clowns have very little idea of what is "authoritative" in Google's / Microsoft's / Yandex's / Baidu's eyes.


Here's what I've never understood about Rand Fishkin. On the one hand, he promotes himself as a white-hat SEO. On the other, he sells tools that are specifically designed to help grey- and black-hat campaigns (why does a white-hat SEO need to analyse the backlinks of a third-party site?). The services he offers go against the Google guidelines that say we should be building and marketing sites as if search engines didn't exist. If we believed Rand's white-hat blog pieces, using OSE shouldn't be necessary — yet he's happy to keep making money from it. I suppose money makes a hypocrite out of all of us...

Edited by colourofspring


Really? I don't consider SEOmoz / OSE metrics to be worth using as anything. Mostly because...

...which indicates these clowns have very little idea of what is "authoritative" in Google's / Microsoft's / Yandex's / Baidu's eyes.

It's no different from a spammy site showing a PageRank of 8 even though it's practically worthless. There will always be ways to game these metrics, but that doesn't make them worthless. When you look at a backlink profile and don't see any spam links, a Page Authority of 60 beats a Page Authority of 19 any day of the week.

And actually, Page Authority correlates quite well with SERPs:

[Image: pagerank-common-metrics.gif]
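The correlation claim is, at least in principle, checkable: given a metric score and the observed SERP position for a sample of results, a rank correlation tells you how well the metric tracks rankings. A minimal sketch with invented numbers (a useful metric should show a strongly negative Spearman rho, since higher scores should go with lower, i.e. better, positions):

```python
# Hedged sketch: measuring how well a metric (e.g. Page Authority) tracks
# SERP position using Spearman rank correlation. All data is invented.

def spearman_rho(xs, ys):
    """Spearman rank correlation coefficient (assumes no tied values)."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

pa = [75, 60, 44, 30, 19]      # hypothetical Page Authority scores
position = [1, 2, 4, 3, 5]     # observed rank in the results page
print(round(spearman_rho(pa, position), 2))  # → -0.9
```

Of course, a strong correlation on a hand-picked sample proves nothing about the index as a whole — which is exactly the objection raised below.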


I can appreciate where you're coming from, and having access to some metrics is certainly better than trying to assess things with no metrics.

Backing up your argument by quoting an image from the company's own self-promoting post from two years ago, in which the author doesn't state which 4,000 pages of the more than 1,000,000,000,000 in Google's index were used for the analysis, is giggle-worthy though.

This topic is now closed to further replies.
