Author Topic: Has SEO ruined the web?  (Read 9219 times)
superboyac
Charter Member
***
Posts: 5,520


Is your software in my list?

« Reply #25 on: June 03, 2010, 04:54:43 PM »

Quote
The answer to the thread title is yes. A good example to illustrate it is something I was doing just yesterday.
I was searching for an online stream of a soccer game not available on TV (putting aside the illegalities of streaming). Google's search results gave me pages and pages of "team x vs team y June 2 2010". I will let you all guess how many of those pages actually had the stream for the game.

That's precisely what I'm talking about.  You can do the same for just about anything.  Unless you already kind of sort of know where you're trying to go, good luck.

Renegade
Charter Member
***
Posts: 10,361



Tell me something you don't know...

« Reply #26 on: June 03, 2010, 05:20:20 PM »

Quote
Okay, so noting this minefield, what advice would be offered to someone trying to get a website positioned correctly for the topic its content is relevant to?

(Bear with me, I know this is only vaguely on topic)

I ask because I'm currently in the process of redoing the company website, which we've never really tried drawing attention to... because it's hideous. The new site, which will (hopefully) be non-hideous and will feature online shopping while showcasing the company (yada yada yada), will need to be created with the SEO madness in mind.

So... any advice on what I should/should not do/be doing?

Too broad a request. Check out the SEO Book blog and just write properly using the right tags. Get off-site links in -- that's crucial.
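"Write properly using the right tags" can be made concrete. The sketch below is my own illustration (not from the SEO Book blog): a stdlib-only Python checker for the on-page elements most basic SEO checklists start with -- a `<title>`, a meta description, and exactly one `<h1>`. The class and function names (`SeoTagAudit`, `audit`) and the specific rules are my assumptions, not an authoritative audit.

```python
from html.parser import HTMLParser

class SeoTagAudit(HTMLParser):
    """Collects the on-page elements basic SEO checklists care about."""
    def __init__(self):
        super().__init__()
        self.title = None
        self.meta_description = None
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        # Accumulate text only while inside <title>
        if self._in_title:
            self.title = (self.title or "") + data

def audit(html):
    """Return a list of rule-of-thumb on-page SEO problems found in html."""
    p = SeoTagAudit()
    p.feed(html)
    problems = []
    if not p.title:
        problems.append("missing <title>")
    if not p.meta_description:
        problems.append("missing meta description")
    if p.h1_count != 1:
        problems.append(f"expected one <h1>, found {p.h1_count}")
    return problems
```

These checks are just the usual rules of thumb; real crawlers weigh far more signals (links in, content quality, freshness) than any tag audit can capture.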

Slow Down Music - Where I commit thought crimes...

Freedom is the right to be wrong, not the right to do wrong. - John Diefenbaker
Paul Keith
Member
**
Posts: 1,965


« Reply #27 on: June 03, 2010, 10:33:01 PM »

Quote
Google is just one more sad example of how there has never been a system so respected, useful, or worthwhile that somebody didn't eventually show up and start gaming it for their own advantage.

It's even sadder when the people who created the system start doing it themselves.

And many do. undecided

That raises the question, though: was Google ever so respected, useful and worthwhile?

In the limited space of a fad or a well-reviewed item, sure, but does it hold up as a classic?

If anything, Google merely superseded Yahoo, and that sets the precedent for everything else.

I think Wikipedia did a better job of uprooting the mystique of Britannica, despite having a shorter life span; even today it isn't expected to be the better model, but merely an alternative model.

Google, though, was merely a search engine. If it had started filtering out search results in the beginning, it would not have penetrated through and beaten out Yahoo.

I think there is more evidence that humanity has never been united enough to counter the social problems brought by technology, as opposed to the technical ones.

For everyone out there who works at an antivirus company, you're most likely going to stumble upon spam methods, either from an acquaintance or from being exposed to the internet long enough on your own.

In this same sense, for every customizable blacklist and whitelist feature of a browser, or one instituted by a government, there is rarely the unity that favors individual preferences over technological pragmatism.

What I mean by this is that, from a non-techie perspective, if someone doesn't go out there and actually change or create something, the knowledgeable users of the internet, who already possess the knowledge to bypass many of those annoyances, won't go out of their way to make it easier for everyone else.

Yes, they will create malware blocklists and parental controls, but they won't attempt to create a "you're better off with these sites than these Google first-page search results" site for people other than themselves. On the other side of the issue, it's because there's no truly solid, culturally bound society that also thinks "this is such a bad problem that we all need to unite to help these blacklist/whitelist makers better understand which sites we deem should be blocked."

Even worse, if a community does get set up, it often becomes overtaken by a censorship philosophy rather than one in the name of progress. That is, to use guns and drugs as an analogy, it's much more tempting and effective to divide the issue between pro-gun/drug and anti-gun/drug than between pro-education and anti-education about said things.

Such separation therefore allows gaming to prosper, not because human nature exists but because human unity is not dedicated enough to producing a counter-gaming mentality. On a smaller, more pop-culture scale, it would be like crying that 3D gaming ruined the demand for 2D gaming but not saving 2D gaming by going beyond a mere social network for 2D gamers, or by actually providing superior 2D games that edge out 3D games.

It's also in human nature to provide better alternatives and to keep going against the flow. After all, that is how Google started in the first place. The dilemma, though, is that it seems rude, maybe ignorant or idealistic, to mention this stuff, because for the most part we don't want to hear "if you have a better idea, pursue it".

...Or, to be more realistic: the more talented and knowledgeable people do pursue it, but they pursue it to rid themselves of the problem. They become better or equally competent gamers of search results rather than pooling their knowledge to upgrade the current model, which leaves us back at the same model of waiting until a competitor of Google develops something radical enough to beat out Google and repeat, in a more modern context, what Google did to Yahoo.
« Last Edit: June 03, 2010, 10:36:41 PM by Paul Keith »

<reserve space for the day DC can auto-generate your signature from your personal PopUp Wisdom quotes>
Paul Keith
Member
**
Posts: 1,965


« Reply #28 on: June 03, 2010, 10:47:54 PM »

I apologize if the above was just a repeat of what I said (I didn't recheck); it's just that I'm currently reading this article, and it made me want to post something like that in reply to your (40hz's) post:

Quote
The case of South Korea also illustrates another peculiarity of successful modernity. When the Communists grabbed control of North Korea, this looked like a case of historical bad luck. Korean families were divided and the North was turned into a withdrawn and menacing state that enslaved its citizens. Yet, as Machiavelli observed, virtue is what you make of your fortune – whether that fortune happens to be good or bad.13 What makes a society virtuous in Machiavelli’s sense – able to master fortune and ride its ups and downs – is strong culture. Sometimes strong culture expresses itself through the medium of art, sometimes through philosophy, and sometimes through religion. It was Hegel who observed the crucial stimulating role that art, religion and philosophy play in highly dynamic societies.

What allowed South Korea to capitalize (literally) on its (bad) fortune? Calvinism imported from America played a part; so did Christianity more generally. Korean Christians took a leading role in the resistance to the Japanese Occupation, and became a major social force after World War II.14 Around 20 per cent of South Koreans are Protestants and ten per cent are Catholic. There is a clear relationship between the extraordinarily rapid spread of Christianity in Korea after 1945 and the emergence of a highly energetic Korean modernity. What is being suggested here is not that the Protestant ethic equals capitalism, but rather that great and dynamic societies have an enigmatic culture core.15 Protestantism in South Korea in part provides this because of the Protestant metaphysic in which individual conscience and free will are combined with a powerful sense of predestination and necessity.

<reserve space for the day DC can auto-generate your signature from your personal PopUp Wisdom quotes>
Paul Keith
Member
**
Posts: 1,965


« Reply #29 on: June 06, 2010, 07:43:41 AM »

Not sure if this is any more enlightening, but IMO this seems like the simplest standard implementation of SEO right now:



http://www.lancescoular.com/smTPo4.html

<reserve space for the day DC can auto-generate your signature from your personal PopUp Wisdom quotes>
higherstate
Participant
*
Posts: 27


« Reply #30 on: June 10, 2010, 07:41:55 AM »

Hehe, this is a can of worms.

Personally, I don't know what I would do without search engines. It wasn't that long ago that if you wanted to do any kind of research, you had to make a trip to a library, hope they had the relevant books, and then hope those had a decent index (or read the whole book) to find what you wanted.

Now we have the luxury of just going to Google, doing some searches and, in an instant, finding what we want. I would say it is extremely rare that I can't find exactly what I want when doing research (this of course depends on knowing exactly what you want).

If you are not getting the results you want in Google, then use the advanced search options and be very specific. Google is constantly changing its algorithms and recently completed a major update, called Caffeine, that it says updates much quicker and is more relevant than the old version. Who knows if that is true, but you certainly have more options.

I would also like to make the point that Wikipedia is one of the most SEO'd sites out there. If you want to know how to set up your website to be loved by the search engines, just take a look at how they do it.

Also, a forum like this is, by its very nature, heavily SEO'd. Google loves large sites with ever-changing, always-updated content, i.e. a forum, and the software this forum uses will have SEO built into it.

SEO is just like advertising in the offline world: if you don't advertise, then no one will know your product exists. That is not to say the world is better or worse for it; it is just business. The difference is that you don't need to have a large amount of money, and David can take on Goliath. It has levelled the playing field to a large extent.

I think that at the end of the day, the best sites will always come out on top -- think Facebook, Wikipedia, newspaper sites, etc. I would also note that running these kinds of heavy content sites takes a huge amount of time, effort and staff, i.e. they need to be making money in some way. If the content is free, then that way is usually advertising. YouTube lost something like $400 million last year; this year I suspect it will make $400 million. The difference is they have implemented advertising -- one is sustainable, one is not.

My Antivirus Firewall Software blog & advice.
superboyac
Charter Member
***
Posts: 5,520


Is your software in my list?

« Reply #31 on: June 10, 2010, 09:10:34 AM »

I guess I learned a term here: scraping.  It's not SEO that ruined the web, it's the scraping sites -- the ones that copy other content into a new website.  That's the worst.  That's the problem I'm referring to.  I'm fine with SEO, in theory.

Paul Keith
Member
**
Posts: 1,965


« Reply #32 on: June 10, 2010, 03:38:32 PM »

Wikipedia is a poor example because it's automatically given precedence over anything else.

To understand how badly Wikipedia is over-ranked, test it on a pop-culture entry that has a more detailed Wikia entry. Wikipedia still wins out.

Imagine if these were more scholar-level entries. You'd have to split your searches between Scholar and the main search to narrow the content down to a simplified-but-educated link, most of the time, just to get a casual understanding of a topic you know nothing about; hence Wikipedia is the easy cop-out.

Any other personal page that copies Wikipedia's model is bound to fail, simply because it doesn't have as much leverage on the copping-out issue. Example: HubPages and Squidoo pages, etc.

In the end, it goes back to authority + fame. In that sense, it's much easier to work around the model of Twitter, Facebook, LinkedIn or YouTube than it is to copy Wikipedia's model, because no new player in the SEO arena is going to usurp the tried-and-true reputation of an encyclopedia the way Wikipedia did, no matter the quality of their content. Even highly respected and well-written websites can't match up with the Wikipedia model unless they are already linked to someone or some concept with prestige, like a web service, dictionary, professional magazine, online newspaper, etc.

@superboyac,

I don't really follow your conclusion.

The Superior Software List (for Windows) is in itself a scraping site. The only questionable area is how much is being copied.

Still, you're copying the content of a software's title or the theme of a general review site. How then can you conclude that scraping is the worst at ruining the web when it is only slightly different from what your site is doing?

I can understand if you say blatant plagiarism is bad, but scraping?

Not only does that get penalized by Google if it's a blatant copy, but scraping helps people better gauge the notability and quality of anything on the web, as you yourself tried to demonstrate with your site.

<reserve space for the day DC can auto-generate your signature from your personal PopUp Wisdom quotes>
superboyac
Charter Member
***
Posts: 5,520


Is your software in my list?

« Reply #33 on: June 10, 2010, 06:16:14 PM »

I don't know what you are saying, PK.  Maybe I'm not using the word scraping well.  My site is definitely not scraping anything.  Everything there is my own content that I have put a lot of thought into.

Scraping is when I search for something on google, like the top ten movies of 2010, and I get 5 pages of different websites with pretty much the same content.  The same movies in the same order with the same paragraphs, just in different website addresses.  That's what I'm talking about.

Or the hundreds of software review sites that list tons and tons of software, with generic descriptions that are generated automatically somehow.  And they don't help the user at all in finding what he is looking for.  The categories for them are not consistent.  Oftentimes you are looking for a particular kind of video software, for example, but it just lists all the software that has anything to do with video, and no matter what, VLC will be at the top.  Stuff like that is what I'm talking about.  It's ruining the web because it's impossible to find anything good.

Paul Keith
Member
**
Posts: 1,965


« Reply #34 on: June 10, 2010, 06:55:24 PM »

No, that's much closer to plagiarism or backlinking (and Google penalizes both, although the latter gets fixed so slowly that anyone can keep SEO'ing off it for much longer).

Scraping, from my understanding, is just that: taking content and posting it to another website as a collection, set, or link collection. (The amount, the quantity, the content is really up to the person's taste.)

Quote
Or the hundreds of software review sites that list tons and tons of software, with generic descriptions that are generated automatically somehow.  And they don't help the user at all in finding what he is looking for.  The categories for them are not consistent.  Oftentimes you are looking for a particular kind of video software, for example, but it just lists all the software that has anything to do with video, and no matter what, VLC will be at the top.  Stuff like that is what I'm talking about.  It's ruining the web because it's impossible to find anything good.

See, the problem with that definition is that it relies on your opinion of what is generic.

Admittedly, the low-quality sites are pretty obvious, but how do you differentiate The Superior Software List from Download.com or FileForum at the generic categorical level that search engine spiders play at?

From a personal user level, or even a review level, it's very easy. From the grand macro level of search value, however, they're almost identical.

You could, for example, take two different pieces of content talking about the same program, but at the generic level the end result is just to convince the searcher that you are among the hundreds of software reviewers who praise this specific program name.

Sure, the content can be unique in the sense that you wrote it, but if the general theme is the same as every other positive review on the internet, it's really no less generic than having someone's spidered description praising the very same program.

...And it gets more generic as the program becomes more popular.

Let's use simplespark.com as a more concrete example to reference.

Is Simple Spark useful or useless?

Well, before that: does it fit your definition of a scraping site with generic descriptions? The answer would be yes.

But is it useful?

The answer is also yes, especially earlier on.

Why?

Because there aren't that many copies of Web 2.0 search engine services, even today.

It's easy to spot the popular services, but the rarer ones, like Protopage, you often get to much earlier, before the blogs start reviewing them and scraping them into "alternative services to Netvibes" lists.

It is only ruining the web via SEO in the sense that there have been lots of copycats.

...but among those copycats, there are still sites that aim for a much more useful goal, like your site does.

The question is, how do you separate the fluff from the value when the value of a search engine ranking is itself so vague?

Admittedly, Google could do a better manual job of fixing things, but generally it's not just Google. It's DuckDuckGo. It's Bing. It's even pseudo-human-powered search engines like Mahalo.

At the end of the day, though, if there isn't at least one semi-credible scrape site, it ruins the web more, because it becomes much harder to discover these cool, lesser-known quality apps.

By that very same token, these scrape sites are no more ruining the web than malware sites are -- even less, because they're the easiest to filter out. Sure, they are still suckering and annoying casual surfers, but it's also much easier now to see that sites like Digg, Reddit, Mixx, Propeller, Delicious, Diigo, Twitter, Facebook... even Wikipedia... are slightly less bogged-down scraping sites for discovering things, because the idea of a scraping site evolved: it gained up/down vote buttons, wiki-like pages that anyone can improve upon, and individually filled public bookmarks.

In the grand scheme of things, they don't go so far as to ruin the web, precisely because they are scraping sites where all the crap is littered into one URL, instead of all those marketing- and SEO-dominant crap sites with tons of backlinks towards their own self-made little Twitter/Facebook/Squidoo/HubPages/YouTube/linkbait set of channels.

<reserve space for the day DC can auto-generate your signature from your personal PopUp Wisdom quotes>
JavaJones
Review 2.0 Designer
Charter Member
***
Posts: 2,514



« Reply #35 on: June 11, 2010, 03:35:35 PM »

I think the definitions are getting kind of muddied up here. Scraping sites (as opposed to "web scraping" in a generic sense, which is just a technique that *scraping sites* use) are pretty much universally bad because they get their content in whole or in large part from other sites, almost exclusively without permission, thus being not only a duplication of other content, but also a copyright issue:
http://en.wikipedia.org/wiki/Scraper_site
This is different than a content aggregator like Google News which links back to its sources, or a directory system like Open Directory, or a search engine like Bing. Scrapers are pretty much virtueless. Wink
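The scraper-versus-aggregator distinction above can be caricatured in code. This is a hypothetical heuristic of my own (`classify_republisher` is not how any search engine actually classifies sites): it just checks whether a republished page links back to its source, which is the one virtue aggregators like Google News have and scraper sites lack.

```python
import re

def classify_republisher(page_html, source_url):
    """Crude heuristic for the scraper/aggregator distinction: an
    aggregator links back to the page it took content from, while a
    scraper usually does not. Real duplicate-origin detection is far
    more involved; this only inspects href attributes."""
    links = re.findall(r'href=["\']([^"\']+)["\']', page_html)
    return "aggregator" if source_url in links else "scraper"
```

For example, a page that reproduces a paragraph but carries `<a href="...">` back to the original would classify as "aggregator"; a wholesale copy with no credit comes out "scraper".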

- Oshyan

The New Adventures of Oshyan Greene - A life in pictures...
superboyac
Charter Member
***
Posts: 5,520


Is your software in my list?

« Reply #36 on: June 11, 2010, 03:43:04 PM »

Quote
I think the definitions are getting kind of muddied up here. Scraping sites (as opposed to "web scraping" in a generic sense, which is just a technique that *scraping sites* use) are pretty much universally bad because they get their content in whole or in large part from other sites, almost exclusively without permission, thus being not only a duplication of other content, but also a copyright issue:
http://en.wikipedia.org/wiki/Scraper_site
This is different than a content aggregator like Google News which links back to its sources, or a directory system like Open Directory, or a search engine like Bing. Scrapers are pretty much virtueless. Wink

- Oshyan
Thanks JJ, that's a great description.

superboyac
Charter Member
***
Posts: 5,520


Is your software in my list?

« Reply #37 on: June 11, 2010, 04:02:04 PM »

Man, Paul, you always have like twenty different thoughts going on at the same time.  It's hard to follow what you're trying to say.

Anyway, let's put aside all the technicalities.  I think it's pretty clear what I'm trying to say, whether or not I'm using the exact correct words.

So here is what I'm talking about: fine, I will accept that a computer algorithm like Google's has a hard time distinguishing my software reviews from the hundreds of meaningless software scraping sites (I don't care if I'm using the term incorrectly).  But when you go to my site (and if you are a somewhat experienced internet user), you can tell almost immediately that my articles are my own and unique to my site.  Now, if you go to download.com or one of the more worthless sites like freedownloadscenter.com, you can tell immediately that it is a piece-of-crap software site that is not going to be of much help or use.

But here is what I say to that: baloney.  Google has shown that it can do some mind-blowing stuff, to the point of virtually mind-reading the user.  So if they really wanted to, they could easily fix the problem of distinguishing sites like mine from the stupid scraper sites.  So I don't buy the excuse that Google can't distinguish the sites.  I don't buy it at all.

I mean, think about it.  In Photoshop CS5, they've added the unbelievably amazing "Content-Aware Fill" feature, where the computer will actually seamlessly fill in parts of a photo by analyzing the rest of the photo.  It can automatically remove trees, remove background clouds, etc.  It's quite shocking... it floored me.  If we can do that with computer technology, I'm certain Google can fix all these SEO things I'm talking about.

But they won't, because they don't want to.  Why don't they want to?  I don't know for sure, but it definitely has to do with money.  SEO is about search results.  There is cutthroat competition to be at the top of those search results.  How do you get there?  Ads, clicking, all that stuff we hate.  But all that stuff is what makes Google 99% of their billions of dollars.  So there is absolutely no motivation for Google to do something to that system that would drastically reduce their profits.  Now, it's not Google's fault per se; it's the fault of all the scraper-site authors who are taking advantage of the rules Google has set in place.  But if Google really wanted to, they could fix it -- just like antivirus companies can constantly and instantly update their databases as new threats emerge.

So it's not that Google can't fix it...it's that they won't fix it.
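For what it's worth, the claim that scraped copies are technically detectable is plausible: near-duplicate detection via word shingles is a well-known search-engine technique. Here's a toy sketch (illustrative only, and certainly not Google's actual pipeline; the names and the 0.8 threshold are my assumptions):

```python
def shingles(text, k=3):
    """k-word shingles: the set of overlapping word n-grams of a document."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets; values near 1.0 mean near-duplicate."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def looks_scraped(doc, corpus, threshold=0.8):
    """Flag doc if it is a near-copy of anything already seen in corpus."""
    s = shingles(doc)
    return any(jaccard(s, shingles(other)) >= threshold for other in corpus)
```

Pairwise comparison like this doesn't scale to the whole web, which is why real systems hash the shingles (MinHash and similar schemes) -- but the underlying idea of flagging high shingle overlap is the same.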

Paul Keith
Member
**
Posts: 1,965


« Reply #38 on: June 11, 2010, 07:52:37 PM »

The thing is, though, those scraper sites -- the ones fitting the article that JavaJones linked to -- are actually pretty rare nowadays, relative to everything else that can be said to be ruining the web via SEO.

Even the article admits these kinds of sites are being eliminated by search engines.

The thing is, though, there are the bigger fish: "aggregators", and whatever the fine line is between aggregators and scraping sites.

I hate to say this, but your experience with the internet and your technical knowledge may be blinding you a bit. To a casual surfer, especially one who doesn't know the de facto popular websites, the only difference is that your site seems to have fewer programs than Download.com.

That holds even if you include experienced internet users.

The only difference between experienced internet users and non-experienced ones is that they have found out through experience which sites to avoid. They don't know which sites are good; they know which sites have a bad reputation.

If experienced internet users truly knew the distinction, search engine ranking would in itself be unnecessary, since all anyone would need to check are the top sites in an aggregator -- but it's not that simple.

This is how things like Download.com and freedownloadscenter.com stay afloat. They toe the line, but they aren't scraper sites in the context of what JavaJones is referring to, because they link back to the publishing site.

It's a huge gray area, but as far as ruining the web goes, it still comes down to experience teaching you which sites to avoid, while inexperienced users don't really know the difference and fall back on whichever link appears first. Scraper sites nowadays mostly only show up for extremely niche search terms, and that's largely because people don't know that aggregators are much better for finding reliable rare software -- and that beyond that point, things should be checked via several forums rather than a search engine, unless it's a document search.

As for having twenty different thoughts, I apologize. Maybe it's just my desire to clarify, or to verify my own lack of knowledge, but I feel the only way a better alternative can come about is if people discuss these issues at this level of clarification -- especially from an over-simplified philosophical stance that even inexperienced internet surfers can easily pick up if they stumble upon the topic.




<reserve space for the day DC can auto-generate your signature from your personal PopUp Wisdom quotes>
superboyac
Charter Member
***
Posts: 5,520


Is your software in my list?

« Reply #39 on: June 11, 2010, 08:47:43 PM »

I am not able to follow what is happening here.  I don't even remember what my original question was.

Tinman57
Charter Member
***
Posts: 1,697



Duck! It's another MicroSoft Patch!

« Reply #40 on: July 15, 2010, 08:09:59 PM »

I just read the other day that Google changed their algorithm.  The article also went on to say that they manually changed the rankings for some reason.  I figure it's for the almighty $$$: move their paid advertisers' rankings to the top to keep them happy and make more money.....

I stopped using Google years ago, and have been using Bing since it started up.  The only time I use Google is if I need to do an HTTPS search......

((((TINMAN))))
Tinman57
Charter Member
***
Posts: 1,697



Duck! It's another MicroSoft Patch!

« Reply #41 on: July 15, 2010, 08:39:20 PM »

Google admits that employees change index rankings
http://www.zdnet.com/blog...hange-index-rankings/1420

((((TINMAN))))
superboyac
Charter Member
***
Posts: 5,520


Is your software in my list?

« Reply #42 on: July 15, 2010, 09:26:50 PM »

Quote
I just read the other day that Google changed their algorithm.  The article also went on to say that they manually changed the rankings for some reason.  I figure it's for the almighty $$$: move their paid advertisers' rankings to the top to keep them happy and make more money.....

I stopped using Google years ago, and have been using Bing since it started up.  The only time I use Google is if I need to do an HTTPS search......

Is Bing really better?  I tried it a couple of times and wasn't that impressed.  Can you describe why you like it better?

Emma Morales
Participant
*
Posts: 4


« Reply #43 on: July 15, 2010, 09:58:05 PM »

Quote
I just read the other day that Google changed their algorithm.  The article also went on to say that they manually changed the rankings for some reason.  I figure it's for the almighty $$$: move their paid advertisers' rankings to the top to keep them happy and make more money.....

I stopped using Google years ago, and have been using Bing since it started up.  The only time I use Google is if I need to do an HTTPS search......

Yeah, they changed the algorithm; more precisely, they upgraded it. This stage of the algorithm is now called "Google Caffeine".
I think they are trying to fix the SEO problem by giving more relevancy and trust to social media and the websites mentioned on them, since you can't fake the opinion of a huge number of people the way you could with a bunch of scraper sites carrying the same content... Non-genuine content is now penalized by Google....

Indeed, old SEO did partially ruin the web with keyword stuffing, content copying, etc., but now that everything is monitored by almighty Google, you can't evade "Google hell" if you make such stupid and useless mistakes.

Just create a genuine website with lots of useful resources and content, don't be afraid to post some links to other websites (link juice will not leak out smiley), promote it on social media, and Google will love you  cheesy
JavaJones
Review 2.0 Designer
Charter Member
***
Posts: 2,514



« Reply #44 on: July 15, 2010, 10:14:15 PM »

For god's sake, how does it even make sense for Google to high-rank sites in *organic search* that pay them for ads that compete with organic results? If their organic results are working well, why bother paying?

- Oshyan

The New Adventures of Oshyan Greene - A life in pictures...
Tinman57
Charter Member
***
Posts: 1,697



Duck! It's another MicroSoft Patch!

« Reply #45 on: July 16, 2010, 08:42:55 PM »


Quote
Is Bing really better?  I tried it a couple of times and wasn't that impressed.  Can you describe why you like it better?

It wasn't so much that Bing is better than Google; it was the fact that Google has the tendency to spy on everyone and keep everyone's search queries for years without letting anyone know.  I'm pretty sure they sold these saved searches to marketers and such, but either way, I didn't like it.
On the other hand, Bing has some pretty neat features that I like, just not HTTPS like the new Google offers.  The only thing I don't like about Bing is the big picture that has to be loaded up.

((((TINMAN))))
Emma Morales
Participant
*
Posts: 4


« Reply #46 on: September 28, 2010, 05:18:37 AM »

Yeah, Bing should work a bit on HTML optimization (by Google or Yahoo standards). YSpeed is great for optimizing websites (someone should tell the Bing guys about that :)

Google and paid listings are a gray area: they tell us one thing and do something else... Fortunately, for now, featured websites are marked, but what if they removed any distinction between paid and organic searches...? Then we are screwed  cheesy
DonationCoder.com | About Us
DonationCoder.com Forum | Powered by SMF