Sunday, 18 January 2015

Inadvertent Local Optimisation

I saw a tweet the other day, followed swiftly by another.
This got me thinking about a process I am often involved in. 

Reviews of 'testing capability and maturity' are a common product offered by many lone consultants & consultancies. I myself have done them on a regular basis, creating a number of (I believe) thoughtful strategies and recommendations that a client can implement themselves and/or in conjunction with a partner.

I like to think I've probably done some good too, looking for root causes over the dreaded 'low-hanging fruit' that so many consultants recommend (see: drug dealers).

When I really reflect, though, I'm pretty sure most things I have recommended are inadvertent local optimisation. Conducting a review into testing is a classic misdirection, as a great many problems in how testing is done in an organisation are symptoms of wider problems. A subconscious misdirection, but a misdirection nonetheless.

Next time, before engaging, I'll ask:

  • Where does attending to the needs of your people come on your priority list?
  • Does work get done based on value or who can shout loudest?
  • Have you got too much work in progress?
  • Do you have teams with all the skills needed to deliver and autonomy to solve their problems?

Once we've dealt with these root causes, then we can talk about testing. Or we might not need to; perhaps dealing with what is systemic will clear up those symptoms.

Maybe Doctors and Consultants should share the same oath:
"First, do no harm." 
 

Tuesday, 30 December 2014

Hard Skills > Culture Fit


So here's a little bit of hiring-people logic for you. I've expressed it as pseudo-code for all those technical people who insist on exclusively hard-skill hiring, despite the long-term pain of it all.


// Illustrative: Handler, Employee and the scores are stand-ins.
double hardSkills;     // weight given to technical ability
double cultureFit;     // weight given to fit with the actual culture
double numberOfEffectiveEmployees = n;

if (hardSkills > cultureFit) {
    Handler handler = new Handler();    // someone has to tidy up the mess
    numberOfEffectiveEmployees -= 0.5;  // every poor fit drags the team down
} else {
    Employee effectiveEmployee = new Employee();
    numberOfEffectiveEmployees++;       // a good fit adds a whole person
}

What?

My code probably doesn't explain itself, so here goes.

So, a handler is the person (who may well have done the hiring, if there is any justice) who tidies up the mess of a hire that doesn't fit the culture that exists at your organisation. Not the public-facing culture either; the actual one. Effectively, for every poor cultural fit hired, you reduce the effectiveness of your remaining employees by a bit. Probably a fair bit. I went for half. Arbitrary. I harangue technical hires for this mainly as a sweeping generalisation, but it happens all over really.

Handler?

You can tell you're a handler when:

-The same person is in front of you all the time.
-You exhaust a repertoire of approaches to people management and problem solving that have served you quite well in a successful career thank you.
-Other people talk to you about that person all the time, or the conversation always goes that way.
-The organisation can't figure out what it wants from that person.

Handlee?

You can tell you're a handlee when:

-You are always in front of your manager.
-Your manager appears irrational and changes approaches at seemingly random intervals.
-You appear to be the subject of conversation regularly in contexts that are probably nothing to do with you.
-The organisation can't figure out what to do with you.

Huh?

This is not a blame thing. Both handler and handlee are doing what comes naturally to them. Both are perfectly effective, just not right now. The problem is the culture black hole which exists between them, very, very slowly drawing them both in. Or very fast. I forget which way time dilation and black holes work.

So?

A black hole is a good analogy. As soon as you are committed to the hire and the initial honeymoon period is done, the culture shock kicks in. And there are few ways to escape once the gravity well takes hold. None of them particularly pleasant.

Next time you think 'hey, this person is a Perl wizard', also ask: 'will this person systematically alienate the rest of the humans around them?' You'll thank me for it.

Thursday, 11 December 2014

Train the Trainer - Course Retrospective


What's up with that then?

So, I've been charged with becoming a trainer within my organisation.

Just to set expectations here, I know a tiny amount about how to furnish humans with new knowledge, skills and attitudes. Make no mistake, if this field is an ocean, I am a puddle by comparison. I have dabbled with coaching, but much learning from me will probably have been via proximity and osmosis.

Personally, if I'm going to do something, I want to use my whole arse to do so, not just half of it. I want a set of models to apply in context and (more importantly) a strong paradigm, so when I discuss, create and iterate on training material and courses, I have a starting position to challenge/be challenged on. So, I attended the Train the Trainer course to complement my own buccaneering learning tendencies.

What did you learn that t'internet didn't know for free?

The internet probably knows some of this stuff but here is a bunch of stuff I have learnt over the last few days:

  • I was pretty worried about creating material, how much time it would take and how I would fit everything else in. It turns out the angle of my thinking wasn't right. Instead of 'how can I create course material?' I should have been thinking 'how can I create exercises which transfer the onus onto the participant to learn?' It will still be hard, but it feels better.
  • Bloom's Taxonomy - A method of classification for learning objectives, split into knowledge, skills and attitudes. If done really well, they will form your assessment too. It turns out my paradigms for knowledge, skill and attitude were a bit wonky too, especially with reference to the difference between skills and knowledge and how to *really* tell them apart. Here goes:
    • Knowledge - I know how to do something
    • Skill - I can practically apply my knowledge of that something
    • Attitude - I have a belief or a will to do something
    • Simple maybe, but it's what I'll take forward with me! Have a look at Bloom's; it's fascinating stuff.
    • Excellent lexicon for objectives too, useful in many contexts.
  • My expectations - It turns out I don't need to try and impart all my knowledge and skills within a certain time period. The same goes for my expectations of others after a training course. They might not need to be geniuses. They might need to recall some things, recognise patterns in others, and be able to apply others still.
  • Fluidity - training courses are not an iron-clad military exercise. They provide a scaffolding which allows room for manoeuvre and the ability to flex on what really matters to the participants. Simple questions at the beginning of a topic, like 'what is your experience of X?', can help to frame a session, streamlining as appropriate to meet needs.
  • Objectives linked to activity are key. The opportunity to learn, reflect, add to our theoretical knowledge and apply that knowledge should be embedded in each activity. Whether that is simple matching of paired subjects or attempts to build competence in complex modelling techniques, I really appreciate the set of heuristics the course furnished me with to assist.
  • Me - I'm a pushy so-and-so. If you are not careful, I'll be in there, taking over the whole show and happily reshaping things in my own glorious image. I shouldn't do that anyway. I really, really shouldn't do that in a training context. I'm not creating Cybermen; I must curb my natural tendencies. I think this will be good for me.
It was worthy of the investment. Now, I look forward to getting the sharp nail of experience through my foot and the associated tetanus jab. Time to apply that knowledge, the real test one might argue.

And finally an external view on 'IT bods'.....

It was wonderful to spend time with people from backgrounds whose primary focus isn't technology. Ours can be a closeted world, and it certainly challenged my ability to explain the fundamentals of testing and agility in context!

Oh, and those guys from different career paths and domains still perceive all 'IT projects' to be late, of poor quality and rarely solving the original problem. Or the problem doesn't exist any more by the time we are done. Or the company doesn't. So far still to go.

Tuesday, 7 October 2014

The Procrustean Bed of ISO29119


The old stories can teach us a great deal. Every once in a while I see the parallels between antiquity and the present, shown through the lens of one of these stories.

The tale of Procrustes (first introduced to me by the work of Nassim Nicholas Taleb, who writes with great skill and knowledge) and the introduction of the "ISO29119 standard" resonate with each other in my mind.

The Tale of Procrustes in a Nutshell......
"Procrustes kept a house by the side of the road where he offered hospitality to passing strangers, who were invited in for a pleasant meal and a night's rest in his very special bed. Procrustes described it as having the unique property that its length exactly matched whomsoever lay down upon it. What Procrustes didn't volunteer was the method by which this "one-size-fits-all" was achieved, namely as soon as the guest lay down Procrustes went to work upon him, stretching him on the rack if he was too short for the bed and chopping off his legs if he was too long."
(Source : mythweb.com)

So, let's adapt this for our ISO29119 situation:
"The "ISO29119 standard" purports to be the only internationally-recognized and agreed set of standards for software testing, which will provide your organization with a high-quality approach to testing that can be communicated throughout the world. Advocates describe it as having the unique property that it offers a set of standards which can be used in any software development life cycle. What the advocates don't volunteer is that your business problem will need to be stretched or trimmed to meet the new standard. So rather than testing solving your business problem, the focus will be on delivering to the standard."
Who will be Theseus for the Craft of Testing?

In the end, Theseus (as part of his tests) dealt with Procrustes using his own vicious device. However, this will most likely not be the case here; I believe most thinking testers are advocating the opposite, continuing to champion the principles of Context Driven Testing. Rightly so: merely rubbishing standards is only one half of the argument. I sincerely hope our community of minds will be our Theseus, but time will tell. The uptake of the "ISO29119 standard" is an unknown; my concerns are mostly about large organisations and government, where group (and double) think can be prevalent. These are the soft targets for peddlers of the cult of the same.

However, all over the development world we desperately and continuously strive to leap into Procrustean Beds, taking shallow solace in "standards" as a proxy for thought, which humans have been doing for a long, long time. Once you jump into a Procrustean Bed, you never emerge quite the same.......

Consider investigating..................
http://www.amazon.co.uk/The-Bed-Procrustes-Philosophical-Practical/dp/0241954096
http://www.ipetitions.com/petition/stop29119
http://www.professionaltestersmanifesto.org
http://www.softwaretestingstandard.org
http://www.ministryoftesting.com/2014/08/iso-29119-debate


Friday, 3 October 2014

N things I want from a Business Analyst....

  
Business Analysts. I give them a hard time. I really do. I love them really but I couldn't eat a whole one.

Is something I used to say.

I even went to a Business Analyst meetup once and asked them whether they thought they should still exist in our "agile" world, or whether they are being washed away by the tidal wave. Looks can really hurt; in fact they can be pretty pointy.

I wouldn't do that now though, I think I've grown up a bit. Like any great programmer or tester they can really add to a team. And, conversely, like a really poor programmer or tester they can really do some damage. It was unfair to single them out and very possibly bandwagon jumping of the worst kind.

In addition, I fell into a common trap. I was full of hot air when it came to what was bad about Business Analysts, but could not articulate what might make them great.

So here goes............
  • I want a vivid (preferably visual) description of the problem or benefit - let's face it, none of us are George Orwell. We can't describe with clarity and economy all the complex concepts present in our lives. However, we can deploy many techniques to bring flat material to life. Elevator pitches, mind maps, product boxes, models, personas and the like are your buddies.
  • I want you to shape a backlog, not provide a shopping list - hearing a backlog described as a shopping list leads me down a path of despair. A backlog is a themed beastie, which needs to be shaped. Delivering the stories in a backlog will not implicitly solve the problem, any more than a lump of marble and a hammer and chisel constitute a statue. Items in a backlog are raw materials. They need sculpting with care to achieve their goals.
  • I want you to work in thirds - for you lucky so and so's who are trying to figure out how on earth to cope in the agile tsunami which is enveloping the world, here's a rule of thumb for you. One third, current sprint, one third, next sprint, one third, the future. The remaining 1% is up to you.
  • I want you to be technically aware but not necessarily technically proficient - technical awareness is a beautiful thing, many testers are a good way down this path. Knowing the strengths and weaknesses of given technology helps you to realise business benefits, because you can appreciate the whole picture, the need, the constraints, the potential.
  • I want you to really, really try with para-functional requirements - This is in two parts: the response times/capacity/scalability the business needs for the real world, coupled with the constraints of the technology deployed. The answer will be somewhere in the middle. If there is anything I have learnt, about performance testing especially, it is that there are few absolutes; para-functional requirements should reflect that subtlety.
  • I want you to be experts in change - in fact you guys should love change deeply, being able to extol its benefits and risks. Helping teams to help their stakeholders realise the value of change in their marketplace. Not snuffing it out to protect business goals which time has rendered of dubious value. 
  • I want you to distinguish between needed and desired - this burns me deeply. The old chestnut about a small percentage of the product actually being used (linky) is serious business. By not determining the difference between what is needed and what is desired, products are being happily helped to fall silently on swords forged by Business Analysts who struggle to articulate this critical difference.
  • I want you to recognise that stories/use cases/whatever are inventory - Imagine the backlogs as a factory, piles of stuff everywhere that our brains are trying to navigate around, winding a path through these piles of stuff trying to find what we need. This takes time and steals from flow, which we can't afford to lose. Before you add it, stop and consider for a moment, whether or not you need it right now.
  • I want you to challenge really technical people to justify value - "Well, we'll need a satellite server configured with Puppet to centralise our upgrade process." Huh? We will? What value does that give the business? Is what I want you to ask. Anything worth building, should be worth articulating as a value proposition.
  • I want you to take ownership of the product goddammit - There ARE decisions you can and should make. If you wish to survive the agile tsunami, it's time to embrace that change is king, and that means decisions. Big and small, narrow and wide, they are there to be made. By you. YOU.
  • I want you to continuously improve and I'll be watching - I would never want you to do 10 things to improve yourselves. 'N' things please, ever changing in focus to ensure you are delivering value in the contexts you find yourselves.
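The "work in thirds" rule of thumb above lends itself to a sketch. This is illustrative only: the class, method and item names are my own assumptions, assuming a backlog of plain strings and an even split, not anything the rule prescribes.

```java
import java.util.List;

// A rough sketch of the "work in thirds" rule of thumb.
public class BacklogThirds {

    // Split a backlog into current sprint / next sprint / future,
    // roughly a third each; any remainder falls into "the future".
    public static List<List<String>> split(List<String> backlog) {
        int third = backlog.size() / 3;
        return List.of(
                backlog.subList(0, third),
                backlog.subList(third, 2 * third),
                backlog.subList(2 * third, backlog.size()));
    }

    public static void main(String[] args) {
        List<String> backlog = List.of(
                "Login page", "Password reset", "Audit log",
                "Reporting", "Dark mode", "API v2");

        List<List<String>> thirds = split(backlog);
        System.out.println("Current sprint: " + thirds.get(0));
        System.out.println("Next sprint:    " + thirds.get(1));
        System.out.println("The future:     " + thirds.get(2));
    }
}
```

Any remainder after the even split lands in "the future", which seems the safest place for it.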
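And on para-functional requirements: one way to reflect that subtlety is to express a response-time target as a percentile rather than an absolute. A minimal sketch using nearest-rank percentiles; the class name, sample values and the 500 ms p95 target are all invented for illustration.

```java
import java.util.Arrays;

// A hedged sketch of a percentile-based response-time requirement.
public class LatencyCheck {

    // Value at the given percentile (0-100) of the samples,
    // using the nearest-rank method on a sorted copy.
    public static long percentile(long[] samplesMs, double pct) {
        long[] sorted = samplesMs.clone();
        Arrays.sort(sorted);
        int idx = (int) Math.ceil(pct / 100.0 * sorted.length) - 1;
        return sorted[Math.max(idx, 0)];
    }

    public static void main(String[] args) {
        long[] samplesMs = {120, 95, 310, 140, 105, 180, 99, 450, 130, 110};
        long p95 = percentile(samplesMs, 95);
        // "95% of requests within 500 ms" admits the odd slow outlier,
        // where "every request within 200 ms" would not.
        System.out.println("p95 = " + p95 + " ms, meets target: " + (p95 <= 500));
    }
}
```

The design choice here is the point: a percentile target acknowledges that real systems have outliers, where an absolute ("every request under X ms") pretends they don't.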

Basically I want you guys to be superhuman. I think you can be.

Some say being a Business Analyst is old hat. I say it is a gift. But only if you embrace it.

Wednesday, 30 July 2014

The 'Just Testing Left' Fallacy

I am mindful that many of my blogs are descending into mini tirades against the various fallacies and general abuse of the lexicon of software development.

Humour me, for one last time (that's not true, by the way).

In meetings, at the Scrum of Scrums, in conversation, I keep hearing it.

    "There's just testing left to do"
And then I read this:

http://bigstory.ap.org/article/social-security-spent-300m-it-boondoggle

An all too familiar software development tale of woe.

I thought; 'I bet everyone on that project is saying it too.' Next to Water Coolers, Coffee Machines, at the Vending Machine, in meetings and corridors.

At first, it gnawed at me a little.

Then a lot.

Then more than that.

I have three big problems with it:

  1. It's just not true. There is not 'just testing left.' What about understanding, misunderstanding, clarifying, fixing, discussing, showing, telling, checking, configuring, analysing, deploying, redeploying, building, rebuilding and all the small cycles that exist within. Does that sound like there is 'just testing left?' When I challenge back and say, "You mean there's 'just getting it done left?'" I get an array of raised eyebrows. 
  2. It's an interesting insight into how an organisation feels about testing. The implication of such statements about testing might be extensions of: end of the process, tick in the box, holding us up, not sure what the fuss is, my bit is done, it's over the fence. Most affecting for me is the inferred: "We are not sure what value testing is adding."
  3. On a personal level, it's not 'just testing.' It's what I do. And I'm good at it. It involves skill, thought, empathy and technical aptitude. I'm serious about it. As serious as you are about being a Project Manager, Programmer, Sys Admin and the rest.

I wouldn't want to ignore the flipside of this argument (my latest neurosis).

What about testers who say:

    "I'm just waiting for the development to finish before I can get started"
What are the implications here, then? Perhaps there is less understanding of how damned hard it is to get complicated things to JUST WORK. Never mind solve a problem. I used to make statements like this. Until I learnt to program. Then I found that even the seemingly simple can be fiendish. And people are merciless in their critique. Absolutely merciless. Not only the testers, but also senior managers who used to be technical and can't understand why it takes so long to build such a 'simple' system (mainly because they have forgotten how complicated it can get, filtering out their own troubled past).
 

And if I start hearing; 'there's just QA left'...................

Sunday, 13 July 2014

The name of the thing is not the thing


I often ponder the question 'should we care about what we call things in the software development world?' One could argue that as long as everyone has a common understanding, then it shouldn't matter, right? I rarely see a common understanding (which is good and bad in context), suggesting that we care enough to name things but sometimes not enough to care about how much precision those names have.

Gerald Weinberg writes in the excellent 'Secrets of Consulting' that 'the name of the thing is not the thing.' As a tester (and critical thinker) this represents a useful message for us. The name given to a thing is not the thing itself; it's a name, and we shouldn't be fooled by it. This is a useful device, as I believe the name is an important gateway to both understanding and misunderstanding, and names take root and spread.....

Testing is not Quality Assurance

There are probably a great many blogs about this, but I hear/see this every day, so it needs to be said again (and again, and again).

The rise of the phrase 'QA' when someone means 'testing' continues unabated. Those of us who have the vocabulary to express the difference are in a constant correction loop, considered pedants at best, obstructive at worst.

What is at the root of this? Terms used interchangeably and carelessly (where there is no paradigm for either side of the equation and/or a belief that there is no distinction), then wonderment at how expectations have not been met.

So how do I respond to this misnomer?

(Counts to ten, gathers composure) 

Superficially - 'Testing cannot assure quality, but it can give you information about quality.'

If someone digs deeper?

Non superficially - 'When I tested this piece of functionality, I discovered its behaviours. Some are called 'bugs' which may or may not have been fixed. These behaviours were communicated to someone who matters. They then deemed that the information given was enough to make a decision about its quality.'

This feels like a long journey, but one worth making. I will continue to correct, cajole, inform and vehemently argue when I need to. If the expectations of your contribution are consistently misunderstood, then will your contribution as a tester be truly valued?

Test Management Tools Don't Manage Testing

On a testing message board the other day (and on other occasions) I spotted a thread containing the question; 'Which 'Test Management Tool' is best (free) for my situation?' There are many different flavours, with varying levels of cost (monetary and otherwise) accompanying their implementation.

I thought about this question, and came to the conclusion that I dislike the phrase 'Test Management Tool' intensely. In fact, it misleads on a grand scale; its name does not describe it very well at all. It offers no assistance on which tests in what form suit the situation, when testing should start or end, who should do it, with what priority, with which persona. I'm not sure such a tool manages anything at all.

So what name describes it accurately? For me, at best it is a 'Test Storage Tool': a place to put tests, data and other trappings to be interacted with asynchronously. Like many other electronic tools, at worst it is an 'Important Information Hiding Place.' To gauge this, put yourself in another's shoes. If you knew little about testing and you were confronted with this term, what would you believe? Perhaps that there is a tool that manages testing? Rather than a human.

So what.....?

So, what's the impact here? I can think of a few, but one springs to mind.

If we unwittingly mislead (or perpetuate myths) by remaining quiet when faced with examples like the above, how do we shape a culture which values and celebrates testing? Saying nothing while what testing is and the value it adds are diluted, misrepresented and denigrated certainly helps to shape that culture. Into something you might not like.