Author Topic: What went wrong at Digg: publishers, PR hacks and power users ruled the roost  (Read 3228 times)

mouser

  • First Author
  • Administrator
  • Joined in 2005
  • Posts: 34,968
From BoingBoing today, comes a short piece on the collapse of the website digg.com.

Note: I wrote a longish piece attacking the digg model in 2006, back in its heyday, and I wasn't the only one; see How Digg Gets Everything Backwards.. And How to Fix It.

Quote
The only way to consistently get stuff on the home page was to work at it like a job. ... everyday users were realizing that nothing they submitted ever even had a chance in hell of going to the front page. They weren't empowered netizens visiting from the future, but chumps who were being played by Digg and a bunch of "social-media consultants."



from http://boingboing.ne...g-dug-own-grave.html

IainB

  • Supporting Member
  • Joined in 2008
  • Posts: 5,430
  • Slartibartfarst
You could well have been right in what you said in that older (2006-09-07) DCF discussion that you kicked off.
I don't feel qualified to comment, though: although Digg, StumbleUpon, and the other all-the-samers looked intriguingly interesting to me at first glance, I could never really understand what they were actually supposed to be, nor see the point/use of them for me or other users. Same with Google's Wave or Buzz or Google+, for example. They seemed nebulous.
Don't get me wrong - I did have at least some understanding of their fly-paper-like business objectives, but that was probably all.

When I read that theatlantic.com article though, I thought it was so busy analysing and criticising Digg - in hindsight and after the fact - that it could well have missed the point about the lack of definition (nebulousness) of Digg.

I would approach that from a theoretical perspective (because theory tends to stand up in practice):
  • Function and purpose: Any business generally exists merely to do something in such a way as to enable it to make a profit from what it does, to maintain/increase shareholder value. (Conventional business model.)
  • Process: What a business does can be defined as a process. If you are unable to define it as a process, then you don't know what it is that the business is doing. (W.E. Deming.)
  • Process capability maturity: If you do define it as a process - for example, using the GAO-preferred IDEF0/3 methodology (GAO BPR Assessment Guide 1997), with ICOMs (Inputs, Controls, Outputs, Mechanisms) - but the ICOMs vary in the short term, then the Quality of the Output will tend to be subject to inherent statistical variability/inconsistency. (W.E. Deming.)
  • CMM Levels: This state of variability was defined in the 5-level Humphrey CMM (Capability Maturity Model) as Level 1 (Ad hoc) or Level 2 (Repeatable). Level 3 (Defined) was where you operated the processes consistently in a defined manner. Level 4 (Managed) was where you deliberately managed the processes (as distinct from managing the business), and Level 5 (Optimised) was where you optimised those managed processes. (Managing the Software Process, 1989, Watts Humphrey.)
  • Processes in statistical control: These are processes where the process performance (e.g., the quality of a process Output) can be seen to be in statistical control and subject to Common Causes inherent in the process itself. There are no trends, the mean is flat, performance is consistent and repeatable, and any variability beyond normal Upper or Lower performance bounds is thus attributable to Special Causes. (W.A. Shewhart.)
  • Processes out of statistical control: These are processes where the process performance (e.g., the quality of a process Output) can be seen to be out of statistical control and subject to Special Causes. There are trends, the mean moves all over the place, performance is inconsistent and not easily repeatable, and process performance is almost entirely attributable to Special Causes (e.g., the process changes). (W.A. Shewhart.)
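Shewhart's in-control/out-of-control distinction above can be sketched numerically: compute the mean and the conventional 3-sigma control limits from a series of process measurements, and flag any point beyond those limits as a candidate Special Cause. This is a minimal illustration only (a real individuals chart would estimate sigma from the moving range, and would also apply run rules), and the sample data are invented:

```python
import statistics

def control_limits(samples):
    """Shewhart-style limits: mean +/- 3 standard deviations.
    Plain sample stdev is used here to keep the sketch short."""
    mean = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return mean - 3 * sigma, mean + 3 * sigma

def special_causes(samples):
    """Return the points lying outside the control limits."""
    lcl, ucl = control_limits(samples)
    return [x for x in samples if x < lcl or x > ucl]

# Invented daily defect counts: stable except for one spike.
daily_defects = [12, 14, 11, 13, 12, 15, 13, 12, 40, 14, 13, 12]
print(special_causes(daily_defects))  # -> [40]
```

Everything inside the limits is attributed to Common Causes; only the flagged points warrant a hunt for a Special Cause.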

Deming showed (in "Out of the Crisis: Quality, Productivity and Competitive Position", 1982) that where a business's core business processes were out of Statistical Control (which would typically place them in Humphrey's CMM Level 1 or 2), then the business would progressively be likely to:
  • Fail to make sales.
  • Lose customers.
  • Lose market share.
  • Make a net loss.
  • Go bust.

If you look at Digg and other recent innovative Internet-based businesses that seem to have failed (including those within Google), you are likely to find that they generally seemed to involve a poorly-defined business process and/or a process in a dynamic state of change. This would place them as business processes out of Statistical Control, in Level 1 or 2 of the CMM.
With a dreadful statistical certainty, it seems that they must fail.

There could be other examples in progress - e.g., interestingly, to an onlooker, Facebook might seem as though it is making all the right moves to follow suit.
Generally, if a business has a single major core business process, then such a process failure is likely to be a recipe for disaster. If they are not yet core processes, and are being exposed to the market as β-tests in a sandbox (e.g., like Google Wave, Buzz, etc.), then the costs of failure would probably be written off as necessary market trialling or R&D costs, well before they started to adversely affect overall corporate profitability or shareholder value.
« Last Edit: July 13, 2012, 01:15:35 PM by IainB »

TaoPhoenix

  • Supporting Member
  • Joined in 2011
  • Posts: 4,372
Wow Iain, that post is incredible.

IainB
Wow Iain, that post is incredible.
Hahaha - does that mean you don't believe it?    ;D
(Don't bother answering. It's a rhetorical Q. I don't ask for or expect belief.)

My training for improving or re-engineering business processes typically involves:
(a) gaining an understanding of the process by applying relevant theory to the process, then
(b) actions in line with that theory, to improve the process.
Quote
"Action which is not based on sound theory or "best"/good practice is irrational by definition." (WE Deming)

What I learned from studying Deming et al was that, for improving existing processes, if you focussed on the ICOMs and the cost-effectiveness of process design, by modelling the AS-IS and TO-BE processes and applying (say) ABC (Activity-Based Costing) or Linear Programming/Optimisation (where the latter might be feasible/relevant), then you could usually streamline your process design on "paper" (I use computer-based models) before spending a lot of money introducing risky ad hoc changes to the actual process. This saves time and money, and reduces the potential risk of a costly screw-up.
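The AS-IS vs TO-BE comparison described above can be sketched as a toy Activity-Based Costing model: cost each activity as driver volume times unit cost, total the two designs, and compare on "paper" before touching the real process. All the activity names, volumes, and unit costs below are invented for illustration; a real model would come from the process decomposition:

```python
def process_cost(activities):
    """Total ABC cost = sum over activities of (driver volume x unit cost)."""
    return sum(volume * unit_cost for volume, unit_cost in activities.values())

as_is = {                    # activity: (driver volume, cost per driver)
    "receive order":   (1000, 2.50),
    "manual re-entry": (1000, 4.00),   # duplicated data entry
    "approve order":   (1000, 1.50),
}
to_be = {
    "receive order":   (1000, 2.50),
    "approve order":   (1000, 1.50),   # re-entry step eliminated
}

saving = process_cost(as_is) - process_cost(to_be)
print(f"AS-IS: {process_cost(as_is):.2f}  TO-BE: {process_cost(to_be):.2f}  saving: {saving:.2f}")
```

The point is only that the candidate redesign can be costed and compared before any money is spent changing the live process.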

Of course, one piece of theory to apply is about processes in CMM Level 1 or 2. That is, you should try to avoid potentially wasting your time by putting any work into a process until it has been brought under Statistical Control (CMM Level 3 or above). It is axiomatic that only when an AS-IS process is in Statistical Control can you rationally develop a plan to improve its design.

So, when I read about "Where XYZ business went wrong", my approach is to focus on the ICOMs in a model of the processes involved, and not what I think/feel should be done about symptomatic or organisational problems. Theory is a powerful thinking tool for discovering truth, and it invariably leads us to the realisation that the causal problems in business are generally to be found within a business process. Every time a coconut.

TaoPhoenix
Wow Iain, that post is incredible.
Hahaha - does that mean you don't believe it?    ;D


Of course I believe it - it and the follow-up were at a high level, higher than most posts I've seen anywhere. Looks like you took some management training, either at work or elsewhere.

It's especially funny if you look at the thread starter pic.  ;D

IainB
...It's especially funny if you look at the thread starter pic.  ;D
Yes indeed! That pic makes me cringe - a frighteningly true image - but it was a good choice, quite funny.    ;D

Paul Keith

  • Member
  • Joined in 2008
  • Posts: 1,982
This was news back when mouser wrote that piece, but (no offense to IainB's effort) I think any good analysis would be better off predicting the death of Reddit, or why Mixx might be wrong/right/would still fall despite turning into Chime.In, using the death of Digg for reference.

It's not like movies like The Social Network didn't already hint at how most startups are designed to sell big before failing rather than being successful. Considering the amount of cash Digg generated for its owners, it didn't really go wrong.

Edit:

I guess it's not really fair for me to post this, but I've been frustrated by the direction of online communities, and I'd rather most of these pretend-gutsy analyses share risky predictions instead of explaining away stuff with hindsight from the safety of their case-study tower.

IainB
I don't want to hijack this thread, but I would like to make an explanatory response to the 3 statements/points below:
This was news back when mouser wrote that piece...
...most startups are designed to sell big before failing rather than being successful.
...share risky predictions instead of explaining away stuff...

1. Old news and hindsight: Yes, as I wrote:
When I read that theatlantic.com article though, I thought it was so busy analysing and criticising Digg - in hindsight and after the fact - that it could well have missed the point about the lack of definition (nebulousness) of Digg.
- and...
2. Startups collapsing: Whilst I am not sure whether it is true - at least, not since the Dot.com bubble burst - that:
Quote
"...most startups are designed to sell big before failing rather than being successful."
- (because that could presumably suggest some kind of fraudulent intent behind an IPO), it does nevertheless seem...

3. Prediction of collapse: ...to have been borne out in practice that the simple application of the theoretical approach of process Statistical Control described enables the user to predict - given knowledge of the business processes involved - with some accuracy, whether a business is likely to go down the progressive path towards collapse that Deming described:
Quote
  • Fail to make sales.
  • Lose customers.
  • Lose market share.
  • Make a net loss.
  • Go bust.

I apologise if I wrote what I did in a confusing way, but this 3rd point was what I was working towards with:
With a dreadful statistical certainty, it seems that they must fail.
There could be other examples in progress - e.g., interestingly, to an onlooker, Facebook might seem as though it is making all the right moves to follow suit.
And you would do this by the application of simple Statistical Control process charts.
Statistical Process Control charts won't tell you whether the IPO was a fraud, but, if you have knowledge of the business processes involved, then they will tell you something about the degree of statistical control of the business processes, and that would point the likely way ahead. It would also give you some idea of what you might need to do to repair the situation if the process was on the path towards collapse that Deming described.
So the tool enables both prediction and diagnosis. That can be very constructive.
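The predictive side of the control-chart tool mentioned above can be hinted at with one of the classic run rules: a long run of consecutive points on one side of the process mean signals a sustained shift or drift, even when no single point breaches the 3-sigma limits (eight-in-a-row is a common textbook threshold). This is a simplified sketch with invented figures, not a full rule set:

```python
def longest_run_one_side(samples):
    """Length of the longest run of consecutive points that sit on the
    same side of the overall mean - a simple shift/trend signal from
    the classic run-rule family."""
    mean = sum(samples) / len(samples)
    longest = current = 0
    prev_side = None
    for x in samples:
        side = x > mean          # which side of the mean this point is on
        current = current + 1 if side == prev_side else 1
        prev_side = side
        longest = max(longest, current)
    return longest

# Invented process figures: a stable run, then a sustained downward shift.
figures = [10, 10, 10, 10, 10, 10, 10, 10, 2, 2, 2, 2]
print(longest_run_one_side(figures))  # -> 8, i.e. a run long enough to flag
```

A process throwing up runs like that is drifting under a Special Cause, which is exactly the early warning the prediction/diagnosis point above relies on.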

In Facebook's case, they seem to have been and to still be going through dynamic change of their business processes. A process that is in a state of dynamic change (e.g., process change thrashing) cannot be in statistical control, nor can it (by definition) be categorised as anything higher than CMM Level 1 or 2. There's a lot they could do - and probably are doing - to get back on a stable business process track.
_______________________________________________________________

Off-topic:
Spoiler
Quote
I recall Deming saying something to the effect that he had spent a good deal of his life translating something he had found very complex to understand - the mathematical/statistical theory relating to processes in Statistical Control, which his teacher (Shewhart) had taught him - into simple terms that people could understand, and then communicating that to business.
He said that it was very simple, but that people seemed to find it difficult to understand.
In my case at least, he was right on both counts.    :-[

There's an interesting reference to this in Wikipedia, here:
Quote
Dr. Shewhart's boss, George D. Edwards, recalled: "Dr. Shewhart prepared a little memorandum only about a page in length. About a third of that page was given over to a simple diagram which we would all recognize today as a schematic control chart. That diagram, and the short text which preceded and followed it, set forth all of the essential principles and considerations which are involved in what we know today as process quality control."[1]
Shewhart lived from March 18, 1891 to March 11, 1967, but the profound statistical theory of the process control chart approach seems to have been little-used and not generally taught in business schools.
Deming lived from October 14, 1900 to December 20, 1993, and was only listened to by Western industrialists because he was regarded (by the Japanese) as having laid the foundation for Japan's rise from a shattered post-war economy to huge economic prosperity. That is, "He must have done something right!"
I am not sure to what extent Deming/Shewhart is currently taught or even understood in business schools, but I have seen plenty of evidence to suggest that it's not widely understood.

It is ironic that when I had the opportunity to attend one of Deming's 4-day seminars, when he was 84 years old, what he was talking about to the 300 or so attendees was, for most of them, completely new.
Furthermore, they were ignorant of it, and some of them resisted accepting what he said because it killed some sacred MBA cows and thus went against their paradigms and beliefs. Though Deming was not new to me (I had read about him and watched TV documentaries about him, and had asked to be sent to the seminar by my employer), I was one of those ignorant ones, and I am ashamed to admit that the penny only started to drop on the morning of the 3rd day.    :-[