Friday, December 20, 2013

Ancient History - History of the BACCM

I was thumbing through old notebooks looking for lyrics to songs I wrote years ago, when I came across these pictures. I let out a bit of a whoop - I'd forgotten the careful drawing I'd done while attempting to formulate the model now called the Business Analysis Core Concept Model. The first picture - with the KAs embedded in it - was from 2011.

As it was first conceived, the model was called the 'Change Framework', and had just three core concepts: Need, Change, and Solution. Stakeholders were noted as having some relationship to a need - but that was all. Our original thinking had a significant process focus, showing the rough timing of Tasks in the various KAs.

Over the months, this evolved. We dropped the process ideas and focussed on the concepts, fleshing them out, adding Value and then Context.

We tried several configurations - not just the turtle diagram - including considering Change as a central, or primary concept. Eventually we realized that broke the model, and we settled back on the turtle.

The dice at the bottom are the latest version of the BACCM, showing all six core concepts and their relationships. They were a big hit at BBC 2013 - and they're useful too.

Improving Seth Godin's "Understanding [of] Luxury Goods"

This is a somewhat tongue-in-cheek title, but a serious topic. I think Seth Godin is both brilliant and very good at what he does -- and he demonstrates a desire to learn.

A while back he wrote about "Understanding Luxury Goods". His writing is usually brief, bold, and right on the mark. This one rambles a bit, possibly because the nature of value is not as clear as we assume. This is my attempt to clarify the discussion. After you read "Understanding Luxury Goods" come back for a few clarifying ideas.

Monday, October 21, 2013

Enterprise Architecture: A Specialization of Business Analysis

A Green Paper for IIBA Research and Innovation has taken over my life for the last month (Improving Business Analysis Performance: A Model for Empirical Testing) so blogging has fallen to second place in my writing. (You'll be hearing more about the paper soon.) Still, there are occasions where I find myself writing a lot on LinkedIn and other people's blogs - and some of those comments should be posts here.

Here's one. A thread I started in the IIBA LinkedIn Group (Systems Thinking Techniques) has inspired a lot of discussion. Duane Banks (Holistic Analysis and Solutioning) has been adding a lot to that discussion, but his view of what is and is not business analysis is limited. For example, Duane says,
"No, the BA is not organization oriented, assuming organization pertains to initiatives at the business unit or line of business level."
"But it's too much to ask for that BA to also have responsibility "to know how the one small function they are working on fits in with the big, *the really big*, picture."
He also quoted Nick Malik, who said:

"Enterprise Architecture is a keeper of executive integrity.
Enterprise Architecture is the only profession (that I know of) that is focused on making sure that the strategy announced by an executive actually comes to pass. Enterprise Architects exist to make sure that the needed programs are created, and executed well, keeping in mind the end goals all along the way. EA’s go where angels fear to tread: to execute strategies and produce the desired results if they can be produced."
I had the pleasure of meeting Nick at a FEAPO summit a few months before he published that post. (It really was a pleasure - he's a great guy to hang out with, argue with, and laugh with.) At that meeting we (the FEAPO Member Organizations) talked about the role of the BA as defined by the Tasks in the BABOK Guide. Several FMOs acknowledged that Business Analysis Tasks and Enterprise Architecture Tasks are the same Tasks, performed in different contexts. In this sense, Nick's argument from ignorance is unconvincing; IIBA makes the claim that Business Analysis is the profession that holds decision makers to account - and here's why: you can substitute 'EA', 'executive', and 'programs' for other terms and the statement is still true.

Profession / Professional: Enterprise Architecture / Enterprise Architects
Decision Maker: Executive
Resulting Quote: "Enterprise Architecture is the only profession that is focused on making sure that the strategy announced by an Executive actually comes to pass. Enterprise Architecture exists to make sure that the needed Programs are created, and executed well, keeping in mind the end goals all along the way. Enterprise Architects go where angels fear to tread: to execute strategies and produce the desired results if they can be produced."

Profession / Professional: Business Analysis / Business Analyst
Decision Maker: Sponsor
Resulting Quote: "Business Analysis is the only profession that is focused on making sure that the strategy announced by a Sponsor actually comes to pass. Business Analysis exists to make sure that the needed Projects are created, and executed well, keeping in mind the end goals all along the way. Business Analysts go where angels fear to tread: to execute strategies and produce the desired results if they can be produced."
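The substitution can even be shown mechanically. This is a sketch only - the template and placeholder names are mine, paraphrasing Nick's statement, not anything IIBA or Nick published:

```python
# Nick's claim as a template; the placeholders are my own paraphrase.
template = (
    "{profession} is the only profession that is focused on making sure "
    "that the strategy announced by {decision_maker} actually comes to pass. "
    "{profession} exists to make sure that the needed {initiatives} are "
    "created, and executed well, keeping in mind the end goals all along the way."
)

ea = template.format(profession="Enterprise Architecture",
                     decision_maker="an Executive", initiatives="Programs")
ba = template.format(profession="Business Analysis",
                     decision_maker="a Sponsor", initiatives="Projects")

# Normalize the role-specific nouns away: the two claims are
# structurally identical - only the labels differ.
ea_norm = (ea.replace("Enterprise Architecture", "X")
             .replace("an Executive", "Y")
             .replace("Programs", "Z"))
ba_norm = (ba.replace("Business Analysis", "X")
             .replace("a Sponsor", "Y")
             .replace("Projects", "Z"))
assert ea_norm == ba_norm
```

If the statement survives the substitution unchanged, it can't be evidence that one profession is unique.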

This hints that systems thinking must be as applicable to Business Analysis as it is to Enterprise Architecture - because they seem to be the same thing, at some level. But what level?

The idea that systems thinking is not BA work appears to be based on a false assumption (A) and a false statement (B), both noted below.
A) The systems that projects deal with are less complex (have a smaller number of interacting components) than the systems that programs deal with.
B) Business Analysis Tasks are performed only in IT, in Projects, or in IT Projects.

Disproving Assumption (A) - System Complexity

Imagine a system S with N components. Some of these components are subsystems (N'). Some are just things (N" = N - N').

Now imagine a system T with M components. As with S, some of these components are subsystems and some are just things. In this case let's choose T such that it has the same number of subsystems and 'just things' as S. This means N = M, N' = M', and N" = M" (the counts, not the actual components). We can go further, and choose T such that the relationships and interactions among M' and M" are of similar complexity to the relationships and interactions among N' and N".

Now, imagine that T is in the set of N': T is a subsystem of S. We have a system and a subsystem with the same number of components. The same logic applies to the relationships between components and other factors that drive complexity.

Conclusion: Assumption (A) is false. The complexity of a subsystem is independent of the complexity of the system it is a component of.
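The argument can be illustrated with a toy model. This is a sketch under stated assumptions - the `System` class, the particular counts, and the use of component count as a complexity proxy are all illustrative choices of mine, not part of the original argument:

```python
from dataclasses import dataclass, field

@dataclass
class System:
    name: str
    subsystems: list = field(default_factory=list)  # the N' (or M') components
    things: int = 0                                 # the N" (or M") components

    def component_count(self) -> int:
        # One simple proxy for complexity: the count of direct components.
        return len(self.subsystems) + self.things

# T: two subsystems and five 'just things' -> M = 7 components.
t = System("T", subsystems=[System("T1"), System("T2")], things=5)

# S: built so that T is one of its subsystems, yet S has the same
# counts as T (N = M, N' = M', N" = M").
s = System("S", subsystems=[t, System("U")], things=5)

# The subsystem is exactly as 'complex' as the system that contains it.
assert t.component_count() == s.component_count() == 7
```

The same construction works for any complexity measure based on counts of components or relationships: nothing forces a subsystem to be simpler than its parent.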

Statement (B) - False by Definition

If it is true that IT, Projects, or IT Projects are the full span of the domain of Business Analysis then BABOK Guide v2 misrepresented the profession and v3 will be worse. IIBA has made it clear time and again that the scope of business analysis is not limited to this range. A Guide to the Business Analysis Body of Knowledge® (BABOK® Guide) v2, section 1.2 states that "...Business analysis practitioners include not only people with the job title of business analyst, but ... any other person who performs the tasks described in the BABOK® Guide." It says a lot more - including calling out Enterprise Analysts and Business Architects as examples - but the fundamental point is simple: either EA is a specialization of Business Analysis, or there are Tasks that are performed in Enterprise Architecture that are not found in the BABOK Guide.

If these EA-only Tasks exist, no one has yet found them. Certainly, v2 has an IT and Project bias in the writing - but the point remains.

By the way, that bias in v2 of the BABOK® Guide is based on evidence, not just the opinion of an expert in one domain. The content was guided by a dozen volunteers, who led many dozens of writers. That community-developed content was reviewed by practitioners and experts, and (after rewriting) it was reviewed by the public - and then rewritten again before publication. Every bit of formal feedback got a formal response, too, following the ISO standard for Standards. (The development of v3 followed the same process: it represents the combined insight of hundreds of professionals so far, and will have inputs from thousands before it is published.)

In addition to this the Role Delineation Study for v2 indicated that the content was very much a reflection of what BAs really do. Extensive surveys of commonly used Techniques were also used to drive the selection of content.

So, if you accept the evidence that the contents of the BABOK® Guide v2 are a reasonable reflection of the business analysis profession, then you must either accept the claim that Enterprise Architecture is a specialization of the profession, or find a Task that is unique to EA.

Many have looked. No one has identified one. That's not to say no such EA Task exists - but there is a lot of absence of evidence.

End Note

Is a surgeon a medical doctor? Is a heart specialist a surgeon? Specializations are powerful, useful, and everywhere. Business Analysis has specializations that are powerful and useful. Enterprise Architecture is one of them. At least, that's what the evidence we have so far shows us.

Sunday, October 20, 2013

Another Astonishing Automation

From 2009/07:
So... How will Star Trek: The Next Generation tech affect your job as a BA?

Summary Article: Automatic speaker tracking in audio recordings (Science Daily, retrieved 2013-10-23)

Years ago it was clear that human speech would be completely comprehensible to computers. Now, a team of researchers has made a big advance in the field. It is one thing to interpret one person in a quiet room. Several people in conversation is a different task entirely.

Part of my joy in reading this on a sleepy Sunday comes from the realization that there is an entirely different kind of geometry that I had never heard of.


Journal Reference:
  1. Stephen H. Shum, Najim Dehak, Reda Dehak, James R. Glass. "Unsupervised Methods for Speaker Diarization: An Integrated and Iterative Approach." IEEE Transactions on Audio, Speech, and Language Processing, 2013; 21 (10): 2015. DOI: 10.1109/TASL.2013.2264673

Thursday, September 19, 2013

GTC East - GovGirl Lunch Keynote

Note: Edited to correct the Twitter information for GovGirl - she doesn't have that account, and can be found @KristyFifelski.

GTC East is a 25 year old conference for senior state government workers. I'm here to talk about Managing Business Requirements, later this afternoon.

Right now we're listening to GovGirl - known as Kristy Fifelski when offline - our lunchtime keynote speaker. She is exploring whether 'Government Can Be Cool'. She's an interesting and funny speaker, with interesting and entertaining ideas.

Her first topic is to explore what 'cool' means. It turns out that 'rebel' cool is not really what people like. People who are helpful and share their expertise - they're rated as 'cool' by their peers. We don't know how to define 'cool', but we know it when we see it.

Cory Booker is her next example. His online audience thought he was cool for the things he's done - saving a woman from a burning building, for example - to the extent that they promote him with hilarious CoryBookerisms.

GovGirl believes that government workers have the opportunity to make work cooler - even meetings and emails. For example, try using FaceTime to have a remote meeting with your boss.

She also believes that the cool stuff that government workers do gets killed by Government Speak. The passion that people have as individuals - "Look how we're improving hundreds of thousands of lives!" - gets lost in dry, technical, bland copy. This is a problem for BAs too. It's not a jargon problem; it's logorrhea. Why use four words when you can use a flurry of words? Is it 

"Recipients of this notification are informed
that the interactive working session will be held
on the last business day of this week."


"The meeting is on Friday."

A big lesson is that effective use of social media is, well, social. It's people following and interacting with people, not legal entities trading information via internet interfaces. Social Media is a way to create relationships, not just a way to broadcast information. It is particularly powerful in a crisis - as long as the relationships already exist. The day of the earthquake is not the day to open the Facebook account to reach the people affected.

So how do you make government cooler?

Consider the White House petition site. Someone petitioned the government to build a Death Star by 2016. Instead of responding with "That's a silly idea," the response was titled "This is not the Petition Response You're Looking For" and it went viral. It was funny, fun, simple to understand, and connected with the people who cared about the petition.

There are many opportunities for Business Analysts to help our stakeholders connect to the information we need to relay. Sure, we have to be clear, precise, and boring with some information. An interface specification is unlikely to be an opportunity for dramatic prose. On the other hand, the people who need to build and use that interface will need to understand the purpose and context for that interface - and there you may find a story to be told.

GovGirl also recommends connecting a project to something that is hot - to ride the same wave of popularity. NASA did this last year with a parody of Gangnam Style. The video features students, astronauts, engineers, hilarious moves, and a clear simple message. She's playing it for the GTCEast audience right now, and there is a lot of laughter.

You don't need to use videos to take advantage of a trending topic. The CDC crashed their site with a simple blog post, in their Zombie Preparedness Campaign.

My own advice, atop this, is to be certain that the people putting together the message have a real interest in the trending topic. If GrumpyCat doesn't make you laugh, you're not the right person to use GrumpyCat to share information about Social Security. If your message is supposed to connect with people it has to be personal. That authenticity is powerful - and when it's not there the internet can turn on you.

Reinventing yourself is another way to engage with the people you serve. In the case of Libraries, they don't have a choice: it's adapt or die. Some are becoming coffee-shops. There is even a bookless library in Texas. Some librarians are using tools like Pinterest to post images of their archives.

This is a question close to my heart: what is it about Business Analysis that makes our relationships so hard? (See What's Wrong With Us? on the IIBA LinkedIn Group.) Willingness to take a leadership role is a big part of this, I think, but I don't have an answer to the question.

Kristy is wrapping up now, and giving advice. On one point I heartily agree: do it on a small scale first (perhaps as a volunteer effort) and demonstrate the ROI - in terms of value (time, effort, engagement, reputation, etc.) and in terms of money.

Bottom Line: A clear, authentic approach to relationships is the way to be cool. Good advice, online and real life. 

Wednesday, September 18, 2013

I'm a Requirementortionist!

From The Guardian:
David Morris wrote an article called "We're All Designers Now" that is worth reading. My concern with this approach isn't that it's wrong or misguided. I was the primary advocate for integrating the concepts of 'requirement' and 'design' in the Business Analysis Core Concept Model (BACCM) and ultimately in the BABOK Guide v3 Draft. What I'm saying is that we can confidently state that IIBA has taken the position that requirements and designs are different points of view on the same artifacts.

We can also confidently state that IIBA has not taken the position that Business Analysts (role) are Designers (role), or that Business Analysis (discipline) is Design (discipline).

I think a lot of the confusion in this question comes from the way the disciplines are named. Perhaps not surprisingly, the people who call themselves Designers named their profession based on the artifact they create. Also not surprisingly, Business Analysts have named their profession based on the reason we create artifacts. That should sound pretty familiar.

The design-side of the requirements-design spectrum is solution-oriented, exploring specific ways to realize potential value. The requirements-side of the requirements-design spectrum is need-oriented, exploring the nature of potential value that could be realized.

What I'm saying is that Designers need someone like David to write an article called "We're All Requirementortionists Now", to explain how 'design thinking' is very much what we call 'business analysis'.

Also, I'd like to change my title to 'Requirementortionist' because reasons.

Saturday, September 14, 2013

New Eponyms?

If an "odyssey" now means a long and eventful journey because the man who had them was named Odysseus, perhaps we have some new meanings to add to a few existing words. Consider "fording": an ongoing public saga that comes from arrogant, belligerent, and prideful ignorance. In a traditional fording, everyone around the protagonist is embarrassed. The protagonist is not.

Now that I think of it, there could be a few related eponyms. Artists and entertainers could "miley" while politicians could "ford". We could also coin the term "harping" for shutting off debate and discussion - a nice counterpoint to the other meaning of the word.

Thursday, September 12, 2013

Design Or Not Design

Meet Andy Kowalewski. He thought that the requirements vs. designs discussion started by David Morris could use some up-culturing*, so he drew from the Bard and tailored 'To be or not to be...'

FWIW here's my reading of it.

The Dangerous Question - Design or Not Design

* My choice of silly words, not his.

Friday, August 30, 2013

What are Needs and Requirements?

There is a very active discussion about Agile and Waterfall over on the IIBA LinkedIn group. Many characters have been typed in that thread by many people (I have perpetrated quite a few). An important-but-separate topic has emerged there, on the nature of requirements.

Shamim Islam noted that the IEEE definition of a requirement was:

  1. a condition or capability needed by a user to solve a problem or achieve an objective 
  2. a condition or capability that must be met or possessed by a system or system component to satisfy a contract, standard, specification, or other formally imposed document
  3. a documented representation of a condition or capability as in (1) or (2) 

Update: This post has been updated based on new information. The definition above is still widely quoted and used across the internet, but in the IIBA LinkedIn group an updated version has been noted. The quote below is excerpted from a Yahoo Requirements Engineering group. I don't have access to the source standards document today, so I can't verify it - yet. The content is from Paul R. Croll (Chair, IEEE Software and Systems Engineering Standards Committee); I'll be contacting him about this.
The latest IEEE definitions for system and software requirements are contained in the new ISO/IEC/IEEE 29148:2011, Requirements engineering, which replaces IEEE 830 and 1233.

Requirement: statement which translates or expresses a need and its associated constraints and conditions 
NOTE Requirements exist at different tiers and express the need in high-level form (e.g. software component requirement). 
Further, "Defining requirements begins with stakeholder intentions (referred to as needs, goals, or objectives), that evolve into a more formal statement before arriving as valid stakeholder requirements. Initial stakeholder intentions do not serve as stakeholder requirements, since they often lack definition, analysis and possibly consistency and feasibility."
The old IEEE definition of a requirement is riddled with problems, as is the IIBA definition based on it. IIBA replaced 'system or system component' with 'solution or solution component', which clarified the role that requirements play in a change to an organizational system: "system" has become a synonym for "computerized or automated". That is a far cry from the intent of the IEEE definition, but one that the team writing BABOK Guide v2 had to take practical action to address.

The Core Team working on BABOK Guide v3 did a lot of analysis of these definitions (not knowing about the revision above) as part of the development of the Business Analysis Core Concept Model (BACCM). (I have written a lot about this model in the BA Connection Newsletter, starting in October 2012. The big BACCM Overview is in the November 2012 issue.) Over the course of about two years, the team came to some conclusions.

The old IEEE and BABOK Guide v2 definitions:
  1. Conflate "Needs" with the ways that needs are represented.
  2. Imply that requirements can exist whether or not they have been represented at all.
  3. Overly constrain the things that can be called a requirement. 
  4. Inappropriately limit "Value" to "things someone can do" and "things that are". 
This led to two simpler definitions that are broader than the previous definitions in many respects. They subsume the IIBA BABOK Guide v2 and old IEEE definitions - and are consistent with the new IEEE 29148:2011 definition:
    • Need: a problem, opportunity, or constraint with potential value to a stakeholder. 
    • Requirement: a usable representation of a need.
    From the BACCM view,
    • Part 1 (needed by a user) describes problems and opportunities; 
    • Part 2 (met or possessed) covers constraints of various sorts;
    • Part 3 (documented) is covered by the new definition of requirements.
    We'll take these in turn.

    "A Usable Representation Of A Need"

    The first two concerns that the core team identified are related, and addressed together by separating the meaning of 'need' from the ways that needs are represented. This approach recognizes that needs exist whether or not they are represented or even known. People needed vitamin C to live before vitamin C was known to exist. 'Requirement' is the term we use to mean the description of a need in any usable form. Requirements exist to be used by someone to fulfill a need - to do something to deliver that potential value.

    These definitions mean that requirements can not (by definition) be 'unrepresented'. Needs can be unrepresented. Requirements are representations of needs. This is a practical approach: businesses don't exist to make perfect models of needs. They exist to create / store / consume / move value among stakeholders. If the need is not represented then no action can be taken - so there's nothing a business can do about it.
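The distinction can be seen in a small model. This is a sketch only - the class names and the example requirement text are mine, not IIBA's:

```python
from dataclasses import dataclass

@dataclass
class Need:
    # A need exists whether or not anyone has represented it.
    description: str

@dataclass
class Requirement:
    # A requirement cannot be constructed without the need it
    # represents and a usable form for that representation.
    need: Need
    representation: str  # user story, prototype, test case, conversation notes...

# The need existed long before anyone knew about it or represented it.
vitamin_c = Need("humans need vitamin C to live")

# Only once the need is captured in some usable form is there a requirement.
req = Requirement(vitamin_c, "ship's stores must include citrus rations")

assert req.need is vitamin_c  # every requirement traces back to a need
```

A `Need` can stand alone; a `Requirement` without a need or without a representation is not constructible - which is exactly the point of the new definitions.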

    Specific Representations

    A few years ago, a need could not be represented using a wiki: the technology didn't exist. Often, requirements are not 'documented' in any traditional sense - nor should they be. In an Agile environment or an innovation shop a good conversation may be enough. Needs are still represented in some way, whether through speech, video, prototypes, working code, automated test cases, etc. Documents are in the set of representations - but representations include a lot more.

    The list of 'formally imposed documents' in the IEEE/BABOK Guide v2 definitions is therefore far too narrow. The new definitions are easily applied at any span of the organization, and at any strata. A business case can be understood as a requirements document mixed in with a design document, financial planning, and some other things. A project charter has requirements in it, too.

    Value - Features, Characteristics, Experiences

    The nature of value is a passion of mine. I have written about it a lot, both in those BACCM articles and in an older series called 'The Future of Business in the Internet Economy' from 2010 and 2011. It is relevant to this discussion because the current definitions of 'requirement' limit our understanding of value, and frame our thinking in ways that make us blind to many opportunities.

    The mindset of the engineers who defined 'requirement' for IEEE was eminently practical. They were interested in what the system would be able to do once constructed and operational. These capabilities are often described in terms of features or functionality.

    Features have value - but they are not the whole story. The features of a cloth grocery bag are quite similar to those of a designer purse, and a knock-off 'Guoci' bag could be identical to a 'Gucci' bag. These bags have very different valuations, however. This means that features have value - but value is determined by more than just features.

    In the case of the bags, value is based on the characteristics that the product confers to the owner. The person with the Gucci bag is seen to be different from the person with the Guoci bag. In the case of an iPod "1000 songs in your pocket" was not valuable as a feature; it was an experience the owner of an iPod could have. Features are valuable because they affect what you can do. Characteristics and experiences are valuable because they affect who you are.

    The new definitions of requirements and needs encompass these concepts. They make it coherent to define requirements for the experience a person should have, or the way a product should affect social interactions among product owners. This is not just the realm of marketing or social scientists. It is also the realm of the most successful businesses, and of business analysis.

    Saturday, August 17, 2013

    Evidence Based HiPPOs

    A post over on Andrew's blog has me feeling uneasy, which is strange given my job and mandate (Head of Research and Innovation for IIBA, tasked to "transform business analysis into an evidence based profession and practice"). After some consideration, I think the source of my discomfort lies in the implication that opinion is never a good option. The comment below is posted to his blog, but I thought it worth noting here, as well.



    As Andrew noted, the evidence we have does show us that more evidence usually improves decision making and performance. But the evidence has to be good evidence. It must be:

    - Significant: the signal is clearly distinguished from the noise.
    - Accurate: the measures reflect reality with useful fidelity.
    - Unbiased: the data cover all relevant aspects of reality (not just the parts we like).

    Collecting more evidence addresses the first factor directly: a certain volume is necessary for good evidence, but volume is not sufficient for good evidence. But learning to create and collect accurate evidence takes training and practice - a.k.a. experience. Your writing demonstrates that you know that bias is insidious and pervasive, and can be controlled (to a degree) through rigor, discipline, and focus - but these are also characteristics developed through experience.

    This leaves us in an uncomfortable situation. Good decisions* depend on good evidence - but good evidence depends on good decisions, and good decisions require experience (or expertise, or both).

    So who has the experience needed to make good decisions with the available evidence? Perhaps it is a HiPPO (Highest Paid Person's Opinion) who lives by a heuristic:

    "Evidence trumps reason trumps experience."**

    This evidence-based HiPPO goes with experience only in emergencies, when there is no time for reasoning and no evidence available. Even in a crisis, the evidence-based HiPPO uses some rational approach to guide a decision, based on the situation: from argument and debate in some cases, decision tables in others. In all other conditions, the evidence-based HiPPO makes decisions based on good evidence.

    The intersection of 'leadership skills', 'evidence-based decision making', and 'HiPPO' may be limited, but it exists, and is worth expanding. The HiPPO might be an endangered species on Vulcan, but they will never be rare on human worlds. This position is part of the way humans - irrational social animals that demand leadership - construct intentional collective endeavors. The power of evidence-based practice is enormous, but it's not enough.***

    * We're talking about systems that change based on decisions here and not systems that change over time without decisions (evolutionary systems, being an obvious example). Our species demands agency in most events, and finds it very hard to accept the idea that 'things just happen'. This makes changes that avoid agency appear to be wasteful in ways that make them hard to adopt in human endeavors. A/B/N testing is a rare exception in the business world.

    **The pithier "evidence trumps experience" is usually how I express this idea, as one of four principles that guide how I live my life.

    *** Summed up in another personal principle: "emotion motivates; data doesn't."

    Sunday, August 11, 2013

    Make the Problem (Space) Bigger

    Ron Ross has a post titled "Four Big Questions: Why Aren’t We Acting Like We’re in a Knowledge Economy?!" (I was particularly taken by the use of an interrobang.) I posted a long response, which I invite you to thrash over on Ron's blog. His questions do highlight a general systems-thinking technique that I use, and you may find useful. I call it 'Make The Problem Bigger' (after an infamous quote that I can't attribute at the moment). The more accurate name is 'Expand The Problem Space'. Here's how it works.

    Questions, by their nature, imply or define a problem space. "Have you stopped stealing from the office?" is an example of a question that demands a binary, 'yes/no' response, but which likely can not be truthfully answered by either option. The form of the question limits the problem space in ways that make a satisfactory answer impossible.

    This is common to all kinds of human endeavor: we try to simplify humans, or at least human irrationality, out of the system. I'm reminded of a phrase I have heard echoed through many roles in many industries: "This job would be so easy if it weren't for the damn [humans]!" (Substitute the stakeholder of your choice - customers, users, managers, vendors, programmers, and so on.) That means it is the first place I look for more problem space. I try out "because, humans?" as an answer to almost any intractable problem, and see where it leads.

    Do you have a heuristic for expanding the problem space? What is it? Why does it work for you?

    Wednesday, August 7, 2013

    Bezos Buys Washington Post - A BA View

    +Jeff Jarvis is one of my favorite positive curmudgeons. He's not afraid of being passionate, outspoken, opinionated, angry, gleeful, and excited. He's also not afraid to be wrong and to learn and to be public about it.

    With those credentials, here's a quote from the BBC News Magazine:
    Jeff Jarvis, author of What Would Google Do?, says he hopes Bezos will shake things up at the Post and help it adapt to a post-print world. 
    "In some ways it has to be a philanthropic act," says Jarvis of the purchase. 
    "Bezos is trying to protect an American institution. But I hope he doesn't just pay its bills. 
    "My only fear is that he's famously secretive. It's part of his business. Whatever innovation he does at the Washington Post, it will help us all if it is done openly."
    I haven't heard Jeff weigh in on This Week in Google (TWiG) yet - very curious! Before then, I have a thought about how Bezos thinks, and what that might mean for the Post.

    From what I have read and seen, a big part of the genius of Amazon was the ability to break the business down into reusable, scalable parts. It's in their supply chain, their websites, and their web services. Some of these have become large businesses on their own as a result.

    Purpose is another hallmark of his leadership. Amazon has created a lot of businesses, each with clear direction and focus. Right from the start it was a business based on a vision, and that hasn't changed.

    If these behaviours are part of his nature, Bezos is looking at the Post in terms of its parts and in terms of its purpose. First things first - the Post isn't a newspaper. A newspaper is a way to mass-distribute information when distribution is hard. The Washington Post, at its core, does three things.

    1. Find news.
    2. Curate the news (for importance and relevance).
    3. Distribute the news (to the people who find it important and relevant).

    Somewhere among these three activities, revenue must be collected and profit made. Right now the Post (and other news organizations) mostly monetise the third part. Ads and subscriptions are the traditional methods. Getting paid to find news invalidates the newsworthiness of the news, in the same way paying for love invalidates that love; non-monetary markets will remain a prime mover here. That leaves the money in curation.

    What do you think?

    Sunday, July 28, 2013

    Reporting Requirements - A Tale of Discovery

    Imagine the horror I felt as a young BA as I elicited my way into this scenario.

    Picture a dozen senior managers of various sorts arguing at a conference table. They were complaining about an "FTE" report that they all needed, but no one liked (FTE=Full Time Equivalent).

    I was a minor player - a low-level employee, and the facilitator. The room was getting bitter and acrimonious, so I pulled out the most powerful tool in my BA toolkit: the Village Idiot.

    "I'm lost here," I said. "I get that this report isn't doing what you need it to do - but I'm not familiar enough with your work to understand what it should do. Could you help me understand what you're trying to do with this information?"

    One of the managers piped up immediately. "We use it to figure out how much work we've gotten out of the staff. We expect some people to have to work more than 40 hours during project implementations or critical fixes, but we don't want to have the whole team running large amounts of overtime all the time. They'll burn out, and their work will suffer."

    Nine other people nodded their heads. Someone added, "If one team is consistently working at more than 1.25 FTE, we look at what's going on there, and if we need to move staff around."

    This made sense to me. On the whiteboard, I scrawled out this equation:

    FTE = ( # hours worked ) / ( 40 hours per person-week )

    "So if a team of ten works 400 hours, the manager has 10 FTE," I said. "If the same team of ten works 600 hours, the manager has 15 FTE." Ten people nodded.

    Ten nodded, but there were a dozen people there, and two of them were aghast. (This is not hyperbole. These people were shocked speechless.) Their mouths were hanging open and they were shaking their heads in stunned disbelief.

    "That's insane," one of them sputtered.

    "That calculation doesn't tell us anything!" the other declared. "How much were they paid?"

    The other ten people were now confused and angered. Several appeared to fall into sputtering shock. Faces drew dark with anger.

    As a baby-BA, I had just uncovered a facilitation nightmare.

    After some discussion, the group determined that the finance people had to know the number of hours worked at a given pay rate, not just the number of hours. Consider a pay scale that works like this:

    • 1 to 40 hours: normal pay rate
    • 41 to 60 hours: time-and-a-half (1.5 * normal pay rate)
    • over 60 hours: double time (2 * normal pay rate)
    Ten people working 600 hours in a week could earn as little as 17.5 salaries, but could also end up with more than 20 salaries paid out. This overtime had a significant effect on the cashflow for the group, and had to be accounted for. Here's an example calculation:

    3 people worked normal hours
    = (3 * 40h) * 1.0 rate/h
    = 120 hours of pay

    4 people worked 60 hours
    = ( (4 * 40h) * 1.0 rate/h ) + ( (4 * 20h) * 1.5 rate/h )
    = 160h + (80h * 1.5)
    = 160h + 120h
    = 280 hours of pay

    3 people worked 80 hours
    = ( (3 * 40h) * 1.0 rate/h ) + ( (3 * 20h) * 1.5 rate/h ) + ( (3 * 20h) * 2.0 rate/h )
    = (120h * 1.0 rate/h) + (60h * 1.5 rate/h) + (60h * 2.0 rate/h)
    = 120 + 90 + 120 hours of pay
    = 330 hours of pay

    Total Hours Paid: 120 + 280 + 330 = 730 hours
    Team FTE: 730 hours / 40 hours per person-week = 18.25 FTE

    If one person did a normal week, seven people worked 80 hours each, and the other two went on vacation, payroll would owe 20.25 salaries of pay.
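    To make the two meanings of 'FTE' concrete, here is a minimal Python sketch of both calculations, assuming the tiered pay scale above (the function names are mine, not anything the group used):

```python
def pay_hours(worked: float) -> float:
    """Convert hours worked in one week into hours of pay, under a
    tiered overtime scale: 1-40h normal, 41-60h at 1.5x, over 60h at 2x."""
    normal = min(worked, 40)
    time_and_a_half = min(max(worked - 40, 0), 20)
    double = max(worked - 60, 0)
    return normal * 1.0 + time_and_a_half * 1.5 + double * 2.0

def team_fte(hours_per_person: list, convert=None) -> float:
    """FTE = total hours / 40 hours per person-week.
    Pass convert=pay_hours for the financial view, or leave it as
    None for the work-effort view."""
    convert = convert or (lambda h: h)
    return sum(convert(h) for h in hours_per_person) / 40.0

# The meeting's example: 3 people at 40h, 4 at 60h, 3 at 80h
team = [40] * 3 + [60] * 4 + [80] * 3
print(team_fte(team))             # work-effort FTE: 600 / 40 = 15.0
print(team_fte(team, pay_hours))  # financial FTE: 730 / 40 = 18.25
```

    For the meeting's example team, the work-effort view reports 15 FTE while the financial view reports 18.25 FTE - the same people, two entirely different numbers, and two entirely different decisions.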

    Reporting Requirements Lessons

    1. Establish the Action or Decision First

    The senior managers in the room didn't know that they were using 'FTE' to mean two entirely different things, because they didn't know that there were two entirely different sets of decisions to be made. This led to serious tension and distrust between groups of managers, since the 'other guys' were "demanding stupid changes" to the reports, which "stop me from doing my job." (If you're familiar with my other ideas you'll recognize the 'stupid/evil' heuristic at work here.)

    2. Establish the Actor or Decision Maker Second

    When the dual purpose for the report was exposed it was instantly obvious that the functional managers needed entirely different information from the finance managers. There were two stakeholders, making two different decisions. This helped to short-circuit the bitterness and acrimony between the groups, and set us up to define a new format for the report.

    3. Define the Information Third

    Clearly, this report needed to show two sets of information: FTE Hours, and FTE Pay. There was some discussion about the way the information should be sliced and diced - per week or per pay period, per team or per set of teams, and so on. This discussion led the managers to realise that they were making other decisions based on this information, so we circled back to step 1, and defined those decisions. For example, financial and functional managers both wanted to see any individual or team that was an outlier - past a certain threshold for hours or dollars. I asked why this was important; what actions might be triggered by this information. Functional managers wanted to find teams that were not functioning as teams - where one person was doing too much and the others were coasting. Financial managers wanted to control costs.

    4. Define the Format Last

    In this case, the report was segmented into three sections. In the first, a per team summary of FTE-Pay and FTE-Hours was built, along with exceptions beyond the hour or dollar thresholds the managers set. The second section had the detailed breakdown of the financial FTE calculations, and the third had the detailed breakdown of the work-effort FTE calculations. There were other options for organising the information, including separate reports. Everyone felt that the "other side" should get a better understanding of the measures that "we need to make the organization work". (Yes, they did use the same words to describe each other's ignorance.)

    Sunday, June 30, 2013

    Requirements Measures

    I'm taking a short pause on the Changing Change series, as a large number of other questions have come up recently, and they deserve more immediate attention. Most of these are related to measures and metrics of various sorts.

    Over on the IIBA LinkedIn group, Tammy Goslant asked if anyone could share an industry standard template for non-functional requirements (NFR). I have never seen one, and suspect this is because there is no best way to represent needs: it's always situational. Every organization finds itself modifying someone else's template to address the organization's industry, change methodology, maturity, and BA competencies (across all BAs).

    Still, all templates for NFRs should do a few things really well. The most important, to my mind, is to ensure that measures for non-functional requirements make it easy to account for the ranges of acceptable operation for the solution.

    Requirements Measures

    In general, there is a spectrum of requirements measures, related to the type of value that the requirements are supposed to deliver:

    • Functional Requirements tend to describe some potential gain to be sought.
    • Non-Functional Requirements tend to describe some potential loss of value to avoid.

    Note the use of 'tend to' in this discussion. This is a spectrum, not a set of absolutes, and all requirements represent some mixture of potential gain and avoidance of potential loss.

    Any kind of requirement measure can be placed in some range on the spectrum above.

    • Measures for potential gains tend to be relatively absolute. At the most basic level of individual features, the measure is 'works / doesn't work'. The functional requirements for a login process at the ATM are measured this way.
    • Measures for avoidable losses tend to be a range of acceptable - and unacceptable - behaviours. The ATM might log me in, but the response time should be
      • within 5 seconds 90% of the time,
      • within 10 seconds 99% of the time,
      • within 15 seconds 99.999% of the time,
      • within 20 seconds 100% of the time.
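    A range-based measure like this is easy to check mechanically. As a sketch (not any standard template), here is one way to express the percentile thresholds above in Python; the function name and the sample latencies are hypothetical:

```python
def meets_nfr(response_times_s, thresholds):
    """Check measured response times against percentile thresholds.

    thresholds is a list of (max_seconds, required_fraction) pairs;
    (5, 0.90) means 90% of responses must complete within 5 seconds.
    Returns a dict mapping each threshold to pass/fail."""
    n = len(response_times_s)
    results = {}
    for max_s, required in thresholds:
        fraction = sum(1 for t in response_times_s if t <= max_s) / n
        results[max_s] = fraction >= required
    return results

# Hypothetical ATM login response times, in seconds
samples = [1.2, 2.5, 4.8, 3.1, 6.0, 4.4, 2.2, 9.5, 3.3, 4.1]
atm_nfr = [(5, 0.90), (10, 0.99), (15, 0.99999), (20, 1.0)]
print(meets_nfr(samples, atm_nfr))
```

    With these ten samples the 5-second/90% threshold fails (only 8 of 10 responses are fast enough) while the rest pass - exactly the kind of graded result a binary 'works / doesn't work' measure can't express.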

    Many requirements should be measured based on both kinds of measures - the potential gain, and the avoidable loss. The broader the scope of the requirement, the more common it is for both kinds of measures to be relevant. For example, business objectives generally involve a range of potential benefits and a range of avoidable losses (or ideal operating conditions).

    Risk-Benefit vs Loss-Gain

    Another way to consider this spectrum is to think in terms of solution benefits and solution risks:

    • functional requirements tend to focus on potential solution benefits.
    • non-functional requirements tend to focus on solution risks.*

    Unlike change risks, which are managed as part of the process of making the change, solution risks should be managed as part of the solution. This may take the form of feedback or control loops, error trapping, performance reporting, or other mechanisms - but something needs to be built to address the risk. If you're an IIBA member, you can see a detailed, peer-reviewed discussion of this in the article CARRDS: Constraints, Assumptions, Risks, Requirements, and Dependencies in the Best Practices for Better Business Analysis (BP4BBA) publication; a shorter version appeared for the public in the December 2012 BA Connection Newsletter, under the same title.

    These terms can be useful, especially when discussing non-functional requirements with stakeholders who are sceptical of their importance.

    Acceptable Ranges

    In the ATM example, the lower limit of the acceptable range was zero seconds. In many solutions there is both a minimum and a maximum to the acceptable range. Jakob Nielsen mentions one such case in his article, Response Times: The 3 Important Limits. Nielsen says,

    "Normally, response times should be as fast as possible, but it is also possible for the computer to react so fast that the user cannot keep up with the feedback. For example, a scrolling list may move so fast that the user cannot stop it in time for the desired element to remain within the available window."
    Servers should operate in a temperature range. Supply chains should be optimized so outputs of one process happen at a rate very similar to the inputs for the next process. Sales demand that drastically exceeds the capacity of the manufacturing facility can lead to angry customers and lost profits.


    There may be no industry standard template for the recording of non-functional requirements - but there are some things that all templates should have. Coherent, clear, useful measures are at the top of my list.

    * From a Business Analysis Core Concept Model (BACCM) point of view, the terms 'risk' and 'potential loss event' are essentially the same, so 'solution risks' can be rephrased as 'events with the potential to cause a loss of solution value'. 

    Thursday, June 20, 2013

    Automation And You

    Last week I mentioned that there are some trends in human behaviour that go back to before recorded history - perhaps before we were human. This week we're going to look at the one that I find most personally fascinating: automation.

    Cyborgs are humans with tools integrated into their physical forms and behaviours. Glasses. Pacemakers. Pencils. Cars. We're all cyborgs.

    Automatons are machines with humanity integrated into their physical forms and behaviours. Lighters. Printers. Plumbing. Phones. Automation is everywhere.

    Notice a similarity?

    For hundreds of thousands of years humans (and our progenitors) have been automating ourselves into tools, and integrating our tools into ourselves. Other animals create and use tools or integrate them into themselves; we do both on a scale unparalleled in the rest of nature. We do this to preserve life and quality of life: artificial hips are one thing, and implanted artificial lenses that allow you to see the ultraviolet are another (thank you SGU!). Our dependency on technologies and tools dates back far further than most people consider. Only one primate survives with a digestive system that is tiny and weak.* Only one animal can throw at all.**

    Humans are not just tool-using apes. We are apes made of tools, physical and mental. On the physical side, our brains are built to integrate the tools we use into our sense of self. When a driver says "I can feel the road" or an athlete can "feel the puck" it's not a metaphor or imagination. (For a really deep dive into this check out "Natural Born Cyborgs" by Andy Clark, published by Oxford University Press in 2003. The Brain Science Podcast is also a great resource.)


    Automation and cyborization are important because they are both part of our nature. It bears repeating: we are not just tool-using animals; we are animals that integrate tools into ourselves and ourselves into our tools.

    Fire and ballistic weapons are baked into our DNA, and show up in our guts and our shoulders. In the first case, the capacity to cook food liberated so many calories that our physiology was transformed. In the second case, the capacity to accurately throw random objects with force likely did something similar.

    At the most fundamental level humans are animals with two extreme traits. We'll talk about the social trait another time; auto-cybo-mati-rg-on has more territory to cover.


    What does this have to do with being a Business Analyst?

    Many of the things that BAs do today are hard to automate - but at some point in the past everything was hard to automate. The integration of humans and tools isn't over. It can't be. Firestarter was a job. Printer was a profession. Computer was a profession. Those were hard, yet they were turned into matches and boxes that sit on your desk. Humans are not humans without tools.

    I describe this in three Automation Axioms of human behaviour:

    • AA1: Humans automate every behaviour we can using tools. 
    • AA2: Humans amplify every capacity we have, through tools. 
    • AA3: Humans invent new capabilities to have, with tools. 
    To be clear - when I use 'axiom' it is because I have not been able to discover behaviours, capacities, or capabilities that we have not tried to automate, or do not want to automate. If you can take any or all of these as a hypothesis and test it - do! Point out research that falsifies any or all of these assumptions.

    So why does this matter to BAs? It matters because the automation of entire industries, and the job displacements that resulted - they're not over. Knowledge workers are next.

    I was able to dictate this sentence into my computer, with no effort whatsoever. Transcriptions were once a purely human job. What will happen when a cell phone on the desk can record everything that you do in an elicitation session? What happens when that phone can translate all the diagrams on the whiteboard, all the words that were spoken, everything that was written down - and turn it all into a document that is indexed and useful? What happens when the cell phone can ask the questions, project the diagrams, and record the responses, without human intervention?

    It is going to happen. Business Analysis is hard work that requires expertise, subtlety, and knowledge, but betting that our profession won't be automated is betting against human nature - and that's a sucker's bet.


    This does not deny the pleasure we feel performing tasks "from scratch." My wife and I have recently started making (excellent, if I must say so) bread at home. I am the bread machine (the one kneading the dough), and my shoulders are getting nicely toned (we do like the heavier loaves). Still... "from scratch"?

    We don't grow the flour.
    We don't milk the cows.
    We don't press the mixing bowl.
    We don't build the tabletop.
    We don't mine the natural gas for the oven.
    We don't build the oven.
    We don't mine or smelt the metal ore for the oven.
    We don't assemble the tools to mine or smelt the ore for the oven.
    We don't blow the glass for the flour canister.

    We don't ... get the idea.

    If humanity (not individual humans) is engaged in a long-term integration with our technologies, it is easy to fear the dystopian future where we are subjugated or subservient to our own creations. This concerns me, in the same way that house fires and car accidents concern me. I take reasonable precautions, and then get on with baking and driving.

    In part, I'm not afraid of the future because there are things that can not be automated, in the same sense that there are things that can not be bought. We have a special name for the act of buying love, and for a reason. Humans are not rational animals, and the intent behind certain interactions has meaning.

    When your mother kissed your cheek good-night - that isn't something that can be automated. Sure, a machine could plant a kiss, but if the automaton can share the emotional interaction involved in a good-night tuck-in, it can't really be classified as a machine. In the episode "Measure of a Man" from Star Trek: The Next Generation, the claim that "Data is a toaster!" was clearly not true. But why? R2D2, HAL - these were not really machines in that sense. They were sentient beings clad in strange shapes and colours. They were people.

    Conclusion: Automation and You

    This long term trend of human-tool interaction should be of special interest to BAs. Automation will continue to enable our role in fantastically powerful ways - which is the same as saying that it erodes the things that make our role difficult. Creating androids that we relate to as people is a very challenging technical problem, and one that won't be solved quickly.

    Your star power as a BA is rarely based on your virtuosity with a tool or a technique. Sure, Word Wizardry can help - but WhiteBoard Wizardry can change lives too. Consider the parts of your role that machines won't be able to do any time soon. Most of them have to do with human contact.

    Start a video chat.
    Pick up the phone.
    Shake hands.
    Build relationships.

    Automatons can do a lot, but they can't be human - yet.


    * The one that mastered fire.***
    ** The one that mastered spears.***
    *** I am aware that other primates, such as Neanderthals, had some throwing ability. Not like us, though.

    Sunday, June 9, 2013

    What is Innovation?

    In last week's article, we covered four major advances in how people perform changes. The first three have been around in various forms for thousands of years: the Making itself, Testing for quality and variance, and Controlling to coordinate people and resources. Modern project management is only about a century old, and can in many ways be traced to the building of the Panama Canal. The latest advance in our approach to making changes is Business Analysis. This discipline has appeared in a myriad of forms, ranging from enterprise architectural disciplines, to design thinking, to the community-driven standards that the International Institute of Business Analysis® (IIBA) manages and sustains.

    Being part of the latest - and perhaps even the greatest - transformation of change brings me great pleasure. As a Business Analyst, it is also clear to me that "latest" is not the same as "last." Entirely new categories of industry are appearing at an accelerating rate; the complexity of change is increasing; technological advances are not slowing down. "Greatest" is a value judgement that depends on the context, and our context is arguably more dynamic than any time in history, excepting world-wide* catastrophes, such as war, famine, and pestilence.**

    Image Source: I-BADD Keynote by Julian Sammy, Head of R&I, IIBA
    Business Analysis works as long as the needs are understandable. There are things which we can not know and can not anticipate. What do we do about disruptive, unpredictable change?


    In its most basic form, innovation can be represented as a simple equation:

    Innovation ≥ Invention + Delivery

    It is not enough to create something new, nor is it enough to deliver something. The combination of invention and delivery is greater than the sum of the parts.

    Lightbulbs are the iconic innovation story: Edison innovated, where others invented. He took an invention that had been proven as possible almost 100 years before, and made a practical product. It wasn't just the bulb that was the big deal, either: the whole supply chain for electrical power distribution was rather important too.
    In Edison's time, there was time to operationalize an invention. Now new categories of technology appear every few years, as do new business models. These both disrupt older models. In this environment, it isn't enough to see an opportunity in the market, and solve the problem first. Innovation includes the capacity to adapt to environments that don't exist at all, and that may be unimaginable.

    Business Analysis breaks when you reach the edges of the imagination; it is, after all, an analytical discipline, and uses analytical tools. It is also slow. This is an asset to the profession: taking considered action is almost always more effective than instant reaction.

    So what do we do about conditions that are inconceivable? What if reactions have to happen faster than considered thought to have any meaning at all? Can any organization develop the capacity for immediate reactions to unimaginable conditions?

    One approach to this situation is found throughout nature: plant a huge set of seeds, each with different characteristics. Some will excel in local conditions; some will not. The seeds that are pre-adapted to the conditions that exist when they germinate are the ones that survive.

    This is an important idea: the seeds that grow are adapted to conditions that do not exist when the seeds are planted, because they lie in the future. When the seed starts to grow, the conditions may match the seed's ideal conditions, or the conditions may kill it.

    One characteristic of modern innovation is the capacity to plant seeds that can grow (and be profitable) in conditions that are unimaginable today. In business, this means a persistent, deep-seated corporate culture that demands many small changes be attempted. These small proofs of concept will have a high failure rate, but should be examined carefully (something that BAs can do, and do well). Figure out what conditions would have made that seed a success, and bank it just in case those conditions come to pass.

    Another characteristic of modern innovation is the capacity to nurture the seeds that are growing, while avoiding the sunk cost fallacy. This is a very hard problem, particularly because it requires a culture that values the past while relentlessly discarding it. Google and Apple are at the forefront of this behaviour today.

    Next Steps

    In the next few weeks we'll look at several key ideas that play into this modern concept of innovation. This will include delving into the way that operational disciplines have developed over time: practices like specialized people, assembly lines, and just-in-time supply chains have changed the world. We will also explore some trends that are rooted deeply in fundamentals of human nature and span thousands of years: increasing complexity of change practices is one; automation is another. A propeller-head post on the nature of change will also be coming - though that one involves set theory and weird mathematical notation, so I may post it as a separate series.


    * Historically relevant and broad uses of 'world-wide'.
    ** Often provoked by climate change or ecological collapse.
    *** I suspect there is another set of disruptive transformations occurring in these practices as well.

    Sunday, June 2, 2013

    Makers were first. Who will be last?

    Humans make changes to how we make changes - but first and foremost, we make changes to the world around us. When Dan Pink talks about the science of motivation, he's talking about this drive to affect the world.

    So how do we make changes?


    Image Source: SXSW - MIY: The Maker Movement.
    A maker is an individual contributor who explores the world, fiddles about, and makes a change. A Maker may be an artisan, artist, scientist, engineer, inventor... the list is long. All makers have common threads: curiosity, expertise, and a preferred medium or form (which may change over time). Musicians make music. Teachers are makers too: they work with minds and bodies.

    Makers can be contrasted with Doers, who use what's been made. Everyone is some combination of both; it is a spectrum, not opposites or an isolated, forced choice.

    Without Doers, organisations can not be sustained.

    Without Makers, there is no "controlled transformation of an organisation"... or of anything else.
    This mode of change works as long as it's a change one maker can create alone. It breaks the moment that more than one person is making one change.


    As soon as two people are working on one thing, the quality of the output will change. Maybe it goes up, maybe it goes down, but it will be different. And what if two makers are working on different components of one thing? Can the stonemason build the door for the house? Individual contribution breaks down as soon as the work requires more than one human; Testers ensure that the quality of that work is high enough. Consistency, repeatability, quality - the Tester makes sure that Makers working together produce results that meet a certain standard. (Testers also work with Doers on sustaining an organisation - but that's a topic for another day.)

    This mode of change works as long as the change is simple enough to coordinate spontaneously. It breaks as soon as logistics get complicated.


    What happens first? After that? And then? What if it doesn't? Do we have the resources (people, processes, tools, information) to make it happen? Where is that input coming from? When? For how much money?

    Testing the quality of a change is necessary, but it only matters if the change is completed. Controllers and coordinators - often called Project Managers in the modern day - deal with the logistics of a change. Controllers make it possible to coordinate tens of thousands of people for years, to resolve a single question or solve a single problem.

    This mode of change works as long as the need is easy to understand and describe.


    Image Source: Foundations of Software Engineering, Kenneth M. Anderson
    What are we trying to achieve? Why are we working toward this outcome? Will it matter if the context changes? Who gets value from this? What kind of value? What about needs that can't be stated in 21 words?

    Controlling a change is only valuable if it's a valuable change; it only matters if the purpose of the change is understood. When the purpose is simple, Control is enough - even when the solution is astoundingly complex. The Apollo program is a brilliant example.

    "...the goal, before this decade is out, of landing a man on the Moon and returning him safely to the Earth."

    A change that some had imagined but was at that time impossible, and the purpose articulated in 21 words. It was realised through project management, testing, and making. In the end, the Makers built something insanely complex and with shockingly high quality. Controllers made it happen in the right order - and made it happen at all.

    This mode of change works as long as the change can be described in a few dozen words. Small words. Quoting from Galaxy Quest (one of my favourite movies of all time), "Explain as if to a child."

    In case it isn't obvious - the change discipline that is founded in understanding is Business Analysis.

    This mode of change works as long as the needs can be understood, at least in principle. Information theory, mathematics, and basic logic all tell us that there are things which we can not know, however. What do we do about needs that can not be directly anticipated? What do we do about disruptive, unpredictable change?


    ...and that's what we'll talk about next week.

    Update 2013-06-08 - I just found out about the integration of comments with G+, so I'm updating this post and resharing.

    Saturday, May 25, 2013

    Changing Change: A Series About The Future

    The Subject

    One day soon - in our lifetimes - most of what you do as a change agent will be done by machines. It is inevitable: humans have been integrating the skills and knowledge of experts into machines since long before recorded history, and we're not going to stop just because your job is hard or because your job takes expertise. Gutenberg automated publishing, and put monks out of business. Eventually printers were automated - integrated into their machines - and now they sit on desks. Go back far enough and 'firestarter' was a profession.

    Automation of expertise and extension of human abilities is one trend you shouldn't bet against. But how can you prepare? Our world is undergoing ever more frequent disruptions, the pace of change is exponential, and we may be on the cusp of changing the very nature of what it means to be human (not through genetic engineering - through regular old natural selection). In light of this, what role is safe? Can any career be future-proof? Can yours?

    As with most things, the business analysts' favourite answer is the most appropriate one: It depends.

    This Series

    In each article we will touch on some aspect of change, including how the approaches to change have themselves changed. This is a broad topic, so the discussion will range far: information theory, evolution, processes, roles of change agents, behavioural psychology, automation, technological advances, and even the event horizon and singularity. Work by people like +Jeff Jarvis, +Kevin Kelly, +Dan Pink, +Daniel Kahneman, +Dan Ariely, +Chris Anderson, +Clay Shirky, and +Steven Pinker will be referenced. It turns out that understanding how organisations change - how we create, alter, and destroy organisational systems - is a very large problem space.

    All of this work was first brought together in a keynote I was honoured to present at the recent I-BADD 2013 conference, called "Changing Change". Since then, I have had several conversations on the topic. In each case, everyone learned something new; it's time to have the conversation on a larger scale.

    The Schedule

    Given that this is a big subject, the hardest part is breaking it down into manageable chunks. I'll aim to make one post per week, generally on Saturdays.

    The Request

    Please comment, critique, argue, and disagree!