Background

I was at the Software Architect conference in London back in October and saw Rob Smallshire (@robsmallshire) give a talk on Conducting an Architecture Review. Rather than presenting techniques for reviewing an architecture, he focused instead on the things that go on around a review that you need to get right in order to be successful: how to get stakeholders involved, how to get the right buy-in, how to set expectations with the project owner, and how to get the right people to participate and how they should contribute. It was a good talk - you should check out the video in the link.

Chatting with people in the break, it was clear that whilst lots of people do some kind of architecture / design review in their work, they aren’t very familiar with established techniques and methodologies for performing them. There was actually an industry study (Babar2009 [pdf]) a few years ago which showed that most organisations use home-grown, informal, ad-hoc techniques. And of the approximately 40% that claim to use a structured approach, very few have heard of or use any established technique - despite how long such techniques have existed and been successfully used.

THE GIST OF THIS SERIES

This is a short blog series about software architecture review techniques:

  • The ones that exist
  • The differences between them
  • How to choose between them based on your context

In this first post I’ll start with the simplest technique that can help improve your architecture reviews, and then get into more detail later on.

HOW TO PERFORM AN ARCHITECTURE REVIEW

Let’s start with a simple overview. As with many topics in software architecture, Grady Booch has provided a short and to-the-point description of how to perform an architecture review (Booch2010):

  • Identify the forces on the system.
  • Grok the system’s essential architecture.
  • Generate scenarios that exercise the relevant forces against the architecture.
  • Throw the essential architecture against those scenarios, then evaluate how they land relative to the relevant forces.
  • Wash, rinse, repeat.

That list of steps is everything you need in a nutshell. Pretty much all other sources of information are just an elaboration of one or more of those steps in varying degrees of formality.
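
To make that loop a little more concrete, here is a minimal sketch (in Python, purely for illustration) of how you might capture the forces and scenarios as material you bring into the review and then iterate over them. All the type names, fields, and example values below are hypothetical - they are not from Booch’s article - and the evaluation itself is of course a discussion with the designers, not something the code computes.

```python
from dataclasses import dataclass, field

@dataclass
class Force:
    """A pressure on the system: a quality requirement, constraint, or goal."""
    name: str
    description: str = ""

@dataclass
class Scenario:
    """A concrete situation that exercises one or more forces."""
    stimulus: str                    # what happens to the system
    expected_response: str           # what the architecture should make possible
    forces: list = field(default_factory=list)  # names of the forces exercised

def review_iteration(scenarios):
    """One pass of the loop: walk each scenario through the architecture with
    the designers and record findings to feed into the next iteration."""
    findings = []
    for s in scenarios:
        findings.append(
            f"Walk through: {s.stimulus} -> expected: {s.expected_response} "
            f"(forces: {', '.join(s.forces)}) -- note risks and open questions"
        )
    return findings

# Example usage (invented values)
availability = Force("availability", "must survive the loss of a single node")
scenarios = [Scenario("an application server dies at peak load",
                      "users see no errors and no lost state",
                      forces=[availability.name])]
print("\n".join(review_iteration(scenarios)))
```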

A sidenote: This reference (Booch2010) is to an IEEE Software article. Although IEEE SW is an excellent source of information, it is behind a paywall and therefore it may as well not exist for the vast majority of software architects that I know. Luckily they provide a free podcast where Grady Booch reads the article for you. For all the references mentioned in these blog posts I’ll try to also link publicly accessible sources for further reading (podcast index, mp3).

Where to start?

Let’s begin with a very basic technique. Later posts will look at the structure of more detailed reviews, and at the aspects of your context that help you decide which parts of a detailed review structure are necessary for you.

Q: What’s the simplest thing I could do that would be useful?

A: Active design reviews

Active Design Reviews (Parnas1985 [pdf]) is one of the original discussions on the topic - and it’s still excellent. The principles are just as applicable now as they were back then.

Parnas' article begins with a description of how ad-hoc "design reviews" are performed in practice and why this makes it difficult for the review to produce a useful result. I’m going to repeat those observations here because, even though these design review anti-patterns were described back in the 80s, I still see them practiced today on large-scale projects with budgets in the order of 50-100M euros.

Events in an ad-hoc design review

Here’s the sequence of events Parnas describes in an ad-hoc review:

  1. A massive quantity of highly detailed design documentation is delivered to the reviewers three to four weeks before the review.
  2. The designated reviewers, many of them administrators who are not trained in software development, read as much of the documentation as is possible in the time allowed. Often this is very little.
  3. During the review, a tutorial presentation of the design is given by the design team. During this presentation, the reviewers ask any questions they feel may help them to understand the design better.
  4. After the design tutorial, a round-table discussion of the design is held. The result is a list of suggestions for the designers.

Consequences

  1. The reviewers are swamped with information, much of which is not necessary to understand the design. Decisions are hidden in a mass of implementation details.
  2. Most reviewers are not familiar with all of the goals of the design and the constraints placed on it.
  3. All reviewers may try to look at all of the documentation, with no part of the design receiving a concentrated examination.
  4. Reviewers who have a vague idea of their responsibilities or the design goals, or who feel intimidated by the review process, can avoid potential embarrassment by saying nothing.
  5. Detailed discussions of specific design issues become hard to pursue in a large, all-encompassing design review meeting.
  6. People who are mainly interested in learning the status of the project, or who are interested in learning about the purpose of the system, may turn the review into a tutorial.
  7. Reviewers are often asked to examine issues beyond their competence.
  8. There is no systematic review procedure and no prepared set of questions to be asked about the design.
  9. As a result of unstated assumptions, subtle design errors may be implicit in the design documentation and go unnoticed.

Sound familiar? What I don’t understand is this: we were aware of these problems, and how to overcome them, 30+ years ago, and yet this still happens in many places. ...but that’s a topic for another rant…

We'll go into detail about how to address these problems when we look at more recent techniques.

Be active

Let’s start with the most important part: The reviewer has to take an active stance. That involves two things:

  • Prepare checklists and scenarios
    As a reviewer you must prepare in advance checklists and scenarios that you want to test for the system qualities that you are interested in.
  • Ask active, open questions
    For those scenarios you must ask active, open questions - that is, questions that require a descriptive answer rather than a simple «yes/no». You want the designer to explain how the solution will solve a particular problem, or how the design would need to change for a proposed future need. Never ask questions such as «is it highly available?». Instead, ask «which state is maintained between calls?», «what are the consequences if that server goes down?», etc. (a small sketch of such prepared scenarios and questions follows this list).
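
For illustration, here is one hypothetical way to write that preparation down before the session: one entry per system quality, each with the scenarios to walk through and the active, open questions to ask. The qualities, scenarios, and most of the questions are invented for the example (two of them are the questions quoted above).

```python
# Review preparation per system quality: scenarios to walk through and the
# active, open questions to ask about each. All content here is illustrative;
# replace it with the qualities and scenarios that matter for your system.
review_prep = {
    "availability": {
        "scenarios": ["a single application server is lost during peak traffic"],
        "active_questions": [
            "Which state is maintained between calls?",
            "What are the consequences if that server goes down?",
            "How is a failed node detected, and what brings it back?",
        ],
    },
    "modifiability": {
        "scenarios": ["a new external payment provider must be integrated"],
        "active_questions": [
            "Which modules would have to change to support this?",
            "How would the design need to change for this future need?",
        ],
    },
}

# During the session, record the designers' explanations next to each question
# rather than collecting yes/no answers.
for quality, prep in review_prep.items():
    print(f"== {quality} ==")
    for question in prep["active_questions"]:
        print(" -", question)
```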

Some other useful principles are also part of Parnas’ basic approach to running a review:

  • Take an iterative approach to the review
    You’ll need to run a number of sessions: start with the overall architecture, then use subsequent iterations to dig into more detail or into specific areas of concern.
  • Identify the most important qualities of the design
    You shouldn’t approach the task as a single, all-encompassing, «let’s just review everything» workshop. Instead, you need to identify the particular aspects of the design - the qualities of the system - that are most important. You will also most likely need multiple people on the review team, and they should each focus on particular qualities. When working with your typical enterprise applications there is usually a security expert who has a specific area of focus, but other system qualities such as modifiability, performance, usability, testing, operations, etc., can also be assigned to specific people on the team (a sketch of such a review plan follows this list).
  • Make sure there is a design representation that is actually reviewable
    That doesn’t mean you need 300 pages with inch-perfect UML specifications and mathematical proofs. It also doesn’t mean a random bunch of box and line pictures with no description of what those boxes and lines are supposed to represent. Identify the views that you need in order to depict the system qualities that are important. Then use a notation that other people understand. UML, Archimate or whatever.
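
To show how those principles could come together, here is a hypothetical review plan: an iterative structure that starts broad and then narrows, qualities assigned to specific reviewers, and the design views that must exist before each session can run. All the names, roles, and views are invented for the example.

```python
# An illustrative (not prescriptive) iterative review plan. Each session has a
# focus, the qualities under review mapped to the reviewer responsible for
# them, and the design views that must exist before the session.
review_plan = [
    {
        "iteration": 1,
        "focus": "overall architecture and context",
        "qualities": {"all": "whole review team"},
        "required_views": ["system context diagram", "component overview"],
    },
    {
        "iteration": 2,
        "focus": "security and operations",
        "qualities": {"security": "security specialist", "operability": "ops lead"},
        "required_views": ["deployment view", "network zones"],
    },
    {
        "iteration": 3,
        "focus": "performance and modifiability",
        "qualities": {"performance": "reviewer A", "modifiability": "reviewer B"},
        "required_views": ["runtime views for the key usage scenarios"],
    },
]

# Print the plan so everyone knows what must be ready before each session.
for session in review_plan:
    print(f"Iteration {session['iteration']}: {session['focus']} "
          f"(required views: {', '.join(session['required_views'])})")
```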

THE SIMPLEST THING: ACTIVE DESIGN REVIEWS

If this is the only thing you get from this article series, then just focusing on these takeaways will help you perform better reviews. In fact, whenever you are in a review and you hear someone ask a question such as, «Is the response time good enough?», «is it secure?», «is it service oriented?», «can that be reused?», I recommend that you print out the Active Design Reviews article, roll it up, and give them a good whack with it - and then make them read it before they are allowed to take part in another review.

Up next

Before we get into specific review techniques, it’s worth looking at the structure of a comprehensive review (in a little more detail than Booch’s short list of steps) so that it’s easier to compare the techniques. That will be the topic for the next post:

Part 2 - building on our collective experience

Thanks to colleagues Mario, Morten M., Lisbeth, and Pär for reviewing a draft of this article.

This article is syndicated from the author’s own blog - http://swarchitectonics.blogspot.no/

Published 16.12.2014 by

Jason Baragry