Back before I started blogging about IT-related stuff, I used to write about pseudo-science on my other (very quiet) blog Pro-science, and before I had that blog, I used to comment a lot on a number of science and skeptic blogs.
First commenting on, and later blogging about, science and pseudo-science has helped me become better at detecting woo and pseudo-science in my daily life. Woo is a term used in the skeptic community for something which is based on anti-science and has no scientific validity or data to back it up.
Woo and pseudo-science exist everywhere, and I have found that the field of IT is absolutely no exception. Rather, there are certain areas which are rife with pseudo-science, often bordering on woo.
One such area is the field of Agile.
The field of Agile is not a well-defined field, but rather a loose collection of methodologies, techniques, and tools used for systems development. It is commonly understood as having formed as a result of the Agile Manifesto, but in reality it started developing before then, and would probably have come into existence as a field even if the manifesto had never been written.
As a software developer, I use Agile methodologies, techniques, and tools on a daily basis. I also go to conferences and meetups about Agile frequently.
Due to that, I can't help noticing that there is not a whole lot of science behind Agile. Not only that, many of the arguments people use to promote Agile are not exactly based in science or data, but rather on pseudo-science, often in the form of popular science.
This summer, I went to a conference which had a talk on Agile and NLP (Neuro-Linguistic Programming). I didn't go to the talk, so I can't say anything about the actual content, but let's be perfectly clear: NLP is pseudoscience (.pdf) - see also Thirty-Five Years of Research on Neuro-Linguistic Programming. NLP Research Data Base. State of the Art or Pseudoscientific Decoration? by Tomasz Witkowski (.pdf).
If you do a search on the words "Agile" and "NLP", you'll find articles like NLP Patterns and Practices for High-Performance Teams and Achievers by J.D. Meier. If you look at the blog post, and are in any way familiar with the writing of people pushing pseudo-science, you'll recognize the patterns there: appeals to effectiveness and popularity, but no actual data to back those claims up.
It is highly likely that Meier actually believes that NLP helps, but there is no science to back it up - neither in the form of an explanation of how the NLP mechanisms work, nor in the form of data backing up the claims of improvement.
This is, unfortunately, widespread in the field, and not just when it relates to NLP.
Agile (like all things related to systems development) is very much dependent on people, and since most practitioners realize this, there is a lot of focus on this aspect.
As a result, many speakers on the subject of Agile will pepper their talks with references to neuroscience, but rather than looking at the actual science papers, they get their knowledge from popular science books by people like Malcolm Gladwell.
In 2012, Steven Poole wrote an article for the New Statesman on the subject of popular science books on neuroscience, called Your brain on pseudoscience: the rise of popular neurobollocks. While he, in many people's opinion, painted with too broad a brush, there was an important message in the article which we would do well to remember: popular science books often have an agenda, they are simplistic, and, in many cases, they get the science completely wrong.
As a rule, for example, one should expect anything written by Malcolm Gladwell to be wrong.
I understand why speakers and authors tend to use popular science books as sources, rather than the real science articles. With popular science books, there is a good chance that the audience has already read them, and the science is presented in a form which is easy to digest and regurgitate.
It is just a pity that the science is often presented simplistically at best and wrongly at worst.
Some speakers, such as Linda Rising and Kevlin Henney, instead read the original science papers and refer to those. Given that those two are among my favorite speakers, I hardly think it takes anything away from their talks. It just makes them more accurate.
Having addressed the use of pseudo-science and popular science, I think it is also important to address the lack of data on which to evaluate things.
This is obviously connected to the other problems: lack of data leads to bad conclusions.
A simple example of the lack of data is this: How do we know Agile works?
Or, put another way: on what basis do we conclude that Agile works better than what came before?
As far as I've been able to find out, there is no real data to support such claims.
Yes, there have been studies that indicate that Agile works better, but those are somewhat doubtful, since there was no baseline to compare against, no clear definition of what came before Agile, or even of what Agile is.
People tend to compare Agile to Waterfall, but as Laurent Bossavit wrote in his excellent book The Leprechauns of Software Engineering, there is no clear evidence that Waterfall was ever used - at least not in the form that people commonly think of when talking about the Waterfall method.
Personally, I believe there is great value in using Agile, but I am not willing to make any claims without data to back me up. In order to have that, we need to stop basing the field on pseudo-science and misrepresentations in popular science, and instead try to use real science, and to collect and evaluate data.
Not everything fits into the scientific method, and in a field that is so heavily dependent on people, there will be times when personal preferences lead the way, but this doesn't mean that we can't do better.
Think of it this way: we would not accept a new UX design without an A/B test or some other empirical data to back it up. So why would we accept it when it comes to our methodologies, techniques, and tools?
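To make "empirical data" concrete: comparing two variants usually comes down to something as simple as a significance test on the observed results. The sketch below is a minimal, hypothetical example (the function name and the conversion numbers are made up for illustration) of how such an A/B comparison might be evaluated in Python.

```python
import math

def two_proportion_z_test(conversions_a, n_a, conversions_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_pool = (conversions_a + conversions_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conversions_b / n_b - conversions_a / n_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: variant B converts 58/1000 visitors vs. variant A's 50/1000
z, p = two_proportion_z_test(50, 1000, 58, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")  # judge against a significance level chosen before the test
```

The point is not the particular statistic, but that the same habit of measuring before and after a change could be applied when we adopt a new methodology, technique, or tool.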