Tag Archives: charts

Agile ’05 Conference Report

Part 1

The Agile '05 conference was July 24-29, 2005, in Denver, Colorado, USA. There were ten or twelve tracks running at all times, so this report necessarily covers only a small slice. Usually I teach, but this time I was an organizer, so I got to be more like an attendee in some ways.

Brian Marick and Bob Martin Keynote: Where were we, where are we, where are we going?

Brian Marick: We have different disciplines. We become multi-specialists. "A team of multi-specialists can knock the socks off a team of specialists." But there's no career path for multi-specialists – what conference would you go to?

Brian invited Jim Highsmith up to announce the formation of the Agile Project Leadership Network (APLN). Its focus is on applying agile principles to all projects, not just software. Relative to the Agile Alliance, this organization has consistent principles and intends to collaborate; it just has a different focus.

Bob Martin: "Value the flow of value."

The software industry: it used to be either no process or waterfall. Now we're recognizing the failure of waterfall and the need to measure in-process work. It's time to do "software at home." Where we're going is to resolve this challenge, with short cycles, feedback, testing, and craftsmanship.

The agile community: The Scrum paper in '95, XP in '99: resonated with the community. But we were fractured into many agile methods; branding. Now, industry is learning to pull and mix practices; we're at the moment of convergence. Future: head away from brands, strong focus on project management. At the end, we'll have a definition of agile.

Users: They were small teams with a programming focus, limited acceptance tests, and project managers who felt "lost." We are a community that has grown. We often see 80-person teams mixing Scrum and XP. 300-person teams exist. Companies of thousands are doing transitions. Future: we'll grow. Agile project management will happen. Automated acceptance tests will be regarded as the measure of progress.

Brian Marick: There was a group of English cyberneticians. They made discoveries about performance, adaptation, and surprise. But oops… no (single) institutional home, no acknowledged base body of techniques, no grooming of successors. Their tradition is gone. To avoid this, "we gotta take over a university department." He announced the Gordon Pask Award, to honor people whose recent contributions demonstrate their potential to be leaders of the field.

Open Space

There were a lot of topics; see the web site. One was "XP Rituals", where we identified a number of fun things:

  • Breath mints
  • Haiku written on card before starting a task
  • Hacky sack at standup
  • Traffic light for builds
  • Darts when a pair finishes a task
  • "Story starting" gong
  • Story completion frog (noise-maker) or bell
  • Daily progress chart – two thermometers, showing days used and points locked in
  • Automatic break program
  • Smiley stickers for story cards

Metaphors, by Dave West

In 2001, metaphor was listed as an official XP practice (part of architecture); in 2005, it was not. Kent has said it's redundant as a practice, since you can't help using it. (Dave argues it's a learned skill.)

Metaphor has a lifecycle from poetry through use. Lakoff and Johnson say all thought is based on metaphor. Dave argues:

  • If you don't choose consciously, you get one unconsciously.
  • Metaphor selection has a real effect on software design (architecture).
  • Metaphor is a skill you can develop.
  • Therefore, it should be an agile practice.

We have many default metaphors: machine, org chart, entity, dualism, computer science, software engineering, product manufacturing. There are alternatives: e.g., object as person.

To use metaphors well: reflect, experiment, read widely, study the liberal arts, and consider point of view.

Tim Lister: XP Denver Invited Talk

  • Managers are lonely – they don't have a collegial environment.
  • QA is a bottleneck, and no wonder: it's done late and serially rather than in parallel.
  • Getting agreement on requirements is not neat. "Gathering requirements" sounds like they're Easter eggs in the yard. But really, most requirements have to be invented.
  • Problem: people believe the customer.
  • It's messy, so model and prototype. Don't prototype solutions – prototype wrong things on purpose (so they can laugh at it).
  • Estimating is overwhelmed by expectations: skew (never early). Separate goals from estimates.
  • Software in general is uninteresting. "One right way" is anti-intellectual.
  • "Never lose the satisfying feeling of making something valuable, together with your colleagues."

 

Part 2

Delivering APIs in an Agile Context, John Major

John Major described a project that was doing custom programming of lab workflows as part of the Human Genome Project. This was an environment with lots of churn: biology, instruments, J2EE.

Tensions:

  • Stable APIs vs. agile change
  • General features vs. "Do the simplest thing that could possibly work"
  • Frameworks vs. global refactoring
  • Framework design vs. features

The result was a platform API, a mix of data-driven (configuration) specification and custom code. The team had the attitude toward process, "We'll try anything for a month." There was lots of testing. They used Jemmy with Swing.
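For flavor, here's a minimal sketch of Jemmy driving a Swing application; the application class, frame title, and component labels are hypothetical, not from John's project.

    import org.netbeans.jemmy.ClassReference;
    import org.netbeans.jemmy.operators.JButtonOperator;
    import org.netbeans.jemmy.operators.JFrameOperator;
    import org.netbeans.jemmy.operators.JTextFieldOperator;

    // Launch a Swing application and drive it through Jemmy operators,
    // which locate live components by title or label.
    public class WorkflowSmokeTest {
        public static void main(String[] args) throws Exception {
            // Hypothetical application under test.
            new ClassReference("com.example.lab.WorkflowApp").startApplication();

            JFrameOperator frame = new JFrameOperator("Lab Workflow");
            new JTextFieldOperator(frame, 0).typeText("SAMPLE-42");
            new JButtonOperator(frame, "Start Run").push();

            // A crude check: the main window survived the action.
            if (!frame.isVisible()) {
                throw new AssertionError("main window closed unexpectedly");
            }
        }
    }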

Unique practices:

  • Internal support: old and new features. Used the idea of "office hours" for support. Provided training. Tried having platform developers work on applications, but it didn't work so well.
  • Lightweight code reviews. Rather than pairing, they required at least a 20-minute weekly code review with an office-mate.
  • Agile database schemas. Problem: RDBMS is rigid, but schemas evolve. Solution: build agile schema features and tools to manage change.

Lessons learned: Good:

  • Agile practices help keep technical debt low.
  • Build tools to support RDBMS and large codebase.
  • Pull in senior people to help design and adoption.

Bad:

  • Cost of absorbing API changes is paid by the application teams, but benefits accrue to the business.
  • It's hard to get feature design right (to balance flexibility and focus).
  • The business had trouble understanding the costs of release management. (Branches made the whole thing even crazier; he described a "London subway" map they created to show all the paths.)

Ugly:

  • People issues – don't go into denial.
  • Weren't able to tell QA what the test suites did, so there was overlap between the automated tests and manual testing.
  • Be humble – the platform needs the app and vice versa.

Summary:

  • Build reusable platform technology
  • Use agile practices to cope with change
  • Work with "eyes wide open"

Part 3

Rachel Davies and Mike Hill's Workshop on Informative Workspaces

Informative workspaces:

  • Team memory
  • Visible status (keep it fresh)
  • Automation – light and sound
  • Hawthorne effect
  • Track "puzzles"
  • Root cause analysis
  • Positive focus

Themes:

Ownership: Own the space; collective ownership of communication, accountability.

Transparency: Be honest with yourself – show where you really are; peer pressure; reveal hidden/unknown problems.

Keep it Fresh: Drop stale charts. Let anybody update. Automation (e.g., build server status, test status). May use a projector or status monitor.

Intra-/Inter-Team Communication: Visual progress => motivation. Consider which teams see it.

Hokey-ness comes from lack of ownership or formality. Build ownership – lead the team to the charts. Put a time limit on new things. Cool is fun, but it must be cool to the team.

Kent Beck Open Space on Renewing the Fire

"The edge of knowing and not knowing" – Troy Frever.

What helps keep the fire? A lot of discussion on being in the zone, in flow – but that's only part of it. Many people crave novelty, mentoring, flow, living on the edge.

There's a "game face" you put on.

Pollyanna Pixton Open Space on Organizational Change

There's a "practice" level, for developers, managers, product owners. But there's also an organizational change level.

"Continuous retrospective" – collect cards all week.

Leadership "versus" self organization.

Kent Beck Open Space on XP for Beginning Teams

He uses an Appreciative Inquiry approach. Take a practice – what does it mean for you? Acts as a "practice inkblot".

An AI formulation:

  1. Remember a good (peak) situation
  2. Explore the circumstances that made it possible
  3. What is the meaning now?

We broke into pairs and did some mind-mapping of a practice.

Research Paper: A Case Study on the Impact of Scrum on Overtime and Customer Satisfaction – Mann and Maurer

Described a team that originally had variable sprints, then shifted its sprint length from 20 days to 11 at the customers' request. The research tried to address, "Does Scrum provide a sustainable pace?" They found there was less overtime (in both mean and variance) after Scrum was introduced in this organization. Customers were more satisfied. (Planning closer to delivery led to less misdirected development; playing with the product in the release cycle let them tweak its direction.) Scrum gave them better control and visibility.

Research Paper: An Environment for Collaborative Iteration Planning (Liu, Erdogmus, Maurer)

They used a traditional-looking table with a projected image, and wireless tablet PCs. The thought was that people would create and edit stories on a tablet, then organize and prioritize stories using fingers and pens. This would give real-time information, as well as persistence.

The display was nice. One neat trick was that you could "push" a card toward somebody and it would glide across at a realistic-looking rate (but never fall onto the floor).

Existing tools are either cards, which are easy to work with but lack persistence, or planning software, which creates an unbalanced environment (somebody controls the keyboard) but which does have persistence.

The research effort is just beginning; they want to evaluate, "Is it useful? Is it usable? How does it work compared to existing tools?"


Part 4

Jeff Sutherland on Advanced Scrum

"Better, faster, cooler." If this is interesting, see Jeff's paper. I made very quick notes but it's an interesting extension of the Scrum work and I plan to give it more study.

Scrum is out of development and into the whole company. This approach is for experienced ScrumMasters/developers only. See Lawrence Leach: "Eight Secrets to Supercharge Project Performance", Adv. Project, Inc.

Productivity is measured as "features/$100K".

How can we (agile?) win?

One study shows outsourcing cuts costs by only 20% – significant, but not quite the same as cutting 80% or more of costs as some have promised.

The new approach: anticipatory, requires accurate analysis of process and automatic monitoring.

Type A, B, and C Scrum – A: isolated cycles (breaks between sprints), B: overlapping iterations (no handoffs), C: all at once. Consider the sprint level. Need to do pre-staging work for the next iteration inside this one (or else we'll be tempted to fall back to type A).

Advanced Scrum has multiple concurrent sprints. All sprints release live software.

Changes:

  • MetaScrum for release planning
  • Variable-length sprints
  • Overlapping sprints for one team
  • Pre-stage product backlog
  • Daily scrum of scrums
  • Integrate product backlog and sprint backlog
  • Paperless project management and realtime reporting
  • Administrative overhead of 60 seconds/day/developer and 10 minutes/day/ScrumMaster.

One of the big challenges is having items in the backlog be ready. In their case, that means "intuitive for a doctor" – usable within one hour. This means stories need enough detail, and must have been prototyped with doctors. It forces the product owners to work with the developers to prototype things.

MetaScrum: weekly meeting of stakeholders. Led by the Product Owner. Use the three questions. All product decisions are made here. Release status. All decisions are communicated that day. Damage control plans are executed in the same day.

Iterations: three-month major product releases (eliminate bugs from the portfolio); one-month sprints for new customers/upgrades (eliminate high-priority bugs); one-week sprints for critical issues. This gives them 45 production releases a year. Assumes good practices: they're using one code base, working at the tip.

Pre-staging: Overlap sprints, with no break between them. Work only on product backlog that is ready to go in the sprint. Pre-staging can double throughput in sprints.

Prioritizing sprints: Constraint theory tells us we must have slack. This lets you optimize multiple overlapping sprints. The backlog is fully automated, with priority levels for the different-length sprints (week, month, quarter). Developers focus on one task at a time, in priority order. Completing weekly/monthly tasks lets them get to the "fun" quarterly tasks.

Sprint backlog: Set to embrace change. Every sprint releases. Customer satisfaction is the top priority. Can restart – a MetaScrum decision.

Automated backlog: Via a bug-tracking tool. For each task touched today, the developer is asked: 1. How much time is invested in the task? 2. What percent is done? This provides enough data for management "micro-accounting".
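As a rough illustration of the arithmetic this enables (my extrapolation, not Jeff's actual tool), those two answers are enough to project the remaining work on each task:

    // Hypothetical sketch of the "micro-accounting" arithmetic: given
    // hours invested and percent complete, project the hours remaining.
    public class TaskProjection {
        static double hoursRemaining(double hoursInvested, int percentDone) {
            if (percentDone <= 0) {
                throw new IllegalArgumentException("need some progress to extrapolate");
            }
            // If 6 hours bought 75% of the task, the whole task is
            // 6 / 0.75 = 8 hours, leaving 2 to go.
            double projectedTotal = hoursInvested / (percentDone / 100.0);
            return projectedTotal - hoursInvested;
        }

        public static void main(String[] args) {
            System.out.println(hoursRemaining(6.0, 75)); // prints 2.0
        }
    }

Summing that projection over the backlog gives the kind of realtime reporting described above.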


Part 5

Random notes from Open Space

Overheard: "Whenever I've been late, I've been yelled at by management. One time, we actually pulled it together and finished early. The president called me – not to praise me, but to say, 'You sandbagged me.'"

Is XP Sustainable? Some things people do:

  • Daily: variety
  • Medium-term: cleanup projects
  • Gold cards (explicitly scheduled 'free time')
  • Slack

Rick Mugridge, Domain-Driven Design Patterns and Fit. Examples: queries vs. operations, entities vs. value objects, aggregates (key root objects), repository (turns into a query or iterator). You can use a different test harness with the same fixture and same code, controlling via configuration whether to connect to a "real" database.
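One way to picture the configuration-controlled harness (a sketch under my own assumptions, not Rick's actual fixtures): the fixture asks a factory for a repository, and a property decides whether that's an in-memory fake or a real database.

    import java.util.HashMap;
    import java.util.Map;

    // The same fixture code can run against an in-memory repository or a
    // "real" one, chosen by configuration rather than by editing the test.
    interface OrderRepository {
        void save(String id, String order);
        String find(String id);
    }

    class InMemoryOrderRepository implements OrderRepository {
        private final Map<String, String> store = new HashMap<>();
        public void save(String id, String order) { store.put(id, order); }
        public String find(String id) { return store.get(id); }
    }

    class RepositoryFactory {
        static OrderRepository create() {
            // Hypothetical switch; a JDBC-backed implementation would be
            // returned here when the property is set.
            if ("real".equals(System.getProperty("test.repository"))) {
                throw new UnsupportedOperationException("database repository omitted from this sketch");
            }
            return new InMemoryOrderRepository();
        }
    }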

A test-writing pattern: create a setup, do some action, then repeat the setup but show what has changed.
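In xUnit terms (my reading of the pattern, not Rick's code), that looks like: show the state, act, then re-examine the same state and call out only what changed.

    import junit.framework.TestCase;

    // The pattern: establish state, do some action, then repeat the
    // "setup" view of the state, showing only what has changed.
    public class TransferTest extends TestCase {
        public void testTransferMovesMoney() {
            Account checking = new Account(100);
            Account savings = new Account(50);

            checking.transferTo(savings, 30);

            // Same view as the setup, with the expected differences.
            assertEquals(70, checking.balance());
            assertEquals(80, savings.balance());
        }

        // Minimal supporting class, just enough to run the example.
        static class Account {
            private int balance;
            Account(int balance) { this.balance = balance; }
            int balance() { return balance; }
            void transferTo(Account other, int amount) {
                balance -= amount;
                other.balance += amount;
            }
        }
    }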

Norm Kerth: The Secrets of Leading from a Position of No Power

He showed clips of the film Gandhi, then debriefed to explore how Gandhi led. He points out that we can learn about leadership from studying history. Gandhi: "Without a journal of some kind, you cannot unite a community."

Some principles: transparency, persistence, courage. An unjust law: don't resist, but don't comply either.

How did he do it? "He decided something needed to change – something important, and worthy of his effort." He let go of assumed authority.

Inauthentic power is anointed power ("I'm powerful because I'm a manager") – a weak form of power, since it goes away if the position does. Authentic power comes from inside yourself.

Gandhi accepted the cost of punishment. "Better to lose your job for doing something, or for doing nothing?" He was respectful, sought commonality. He knew his support system: the law. He started small, finding people of like mind.

Characteristics of change agents:

  1. Ability to articulate a vision of where you are going (though it can change)
  2. Persistence
  3. Confidence – inner energy for the right thing
  4. Optimism – energy that you lend to someone else so they can gain confidence.

You can cultivate these!

"The most effective coach is sitting next to you, involved from the beginning to the end."

Gandhi: it paid to advertise. Peaceful revolutions are the lasting ones – they're really an evolution. Pick your battles – you don't need to do everything yourself. Know your real mission.

Joshua Kerievsky: Commoditizing Agility

According to Josh, we're right at the edge of Moore's chasm, and need to make it easier to make that move.

He had statistics from a project showing a productivity increase by a factor of 2.5, achieved in one year with an XP team.

"The agile community is thriving, but transitions are too slow and expensive." Why? We're not agile enough in transitions. Problem: serialized knowledge transfer. The shift is repetitive and exhausting.

A couple of models: 16 days of coaching for a 15-person community. Or, an in-house workshop with 3-6 months of full-time coaching for a 20-30 person community. Then try to stretch to other communities, transferring experts out and future experts in. The problem: people are trained technically, but not so much in coaching.

The basics are taught manually. This gives inconsistent content and it doesn't scale. Books don't go far enough. Basic knowledge is fragmented. Books require dedicated study.

This marginalizes coaching time: the basics take away from the hard stuff. It burns out the coaches. So we need to commoditize the basics. Experts are in short supply. How many coaches have 3+ years experience? Internal experts are "too indispensable" to share.

Commoditization

"Products and services are so standardized that attributes are roughly the same." A commodity market has declining prices and profit margins, more competition, and lower barriers to entry.

Ian Murdock, "Open source and the commoditization of software": Commoditization happens eventually. It's unstoppable, but it's good for all. There's a cycle: innovation leads to standardization (making it accessible), which leads to commoditization (making it affordable), which in turn leads to new innovation. Innovation builds on a custom platform.

Commoditization can go two ways: Decommoditization is an incompatible innovation, or customization. But be careful with decommoditization – it can leave you out of the innovation loop.

What can be commoditized: basics, stories, planning, chartering, retrospectives, … What can't: advanced practices, specialized things, "living agile values", people issues.

Josh showed an example of Husqvarna's sales training – a sort of animated comic book, with a model of the sales process built in. He showed a demo he made, using a screen capture tool.

Bottom line: Tomorrow we can have parallel processing for the basics, leaving more quality time for advanced topics. We can get quality, speedy, consistent knowledge transfer.

Workshop: TDD and Beyond, Astels and Marick

Brian Marick emphasized customer-facing tests that help with project communication. We don't want to do all the tests at once. "Doing Fit tests in a blob creates a bottleneck at the customer/QA level. We don't need all tests; we just need one to get started." He had the image of "programmers as baby birds" [feed me].

Rick Mugridge: We need to move toward declarative tests. "Procedural tests are a smell."
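To illustrate the distinction (a Fit-flavored sketch with an invented domain, not Rick's example): a declarative test states inputs and expected outputs as table rows, and the fixture hides the mechanics.

    import fit.ColumnFixture;

    // Declarative style: each table row supplies an "orderTotal" input and
    // checks the "discount()" calculation; no steps are scripted in the test.
    public class DiscountFixture extends ColumnFixture {
        public double orderTotal;     // input column

        public double discount() {    // calculated column
            return orderTotal >= 100 ? orderTotal * 0.05 : 0;
        }
    }

The equivalent procedural test would script every step – open an order, add items, read the discount field – burying the business rule the test is meant to express.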

Questions: "Do you have to have normal (JUnit) TDD in place before STDD?" "Do you need a strong customer to make it work?"

Roadblocks: Fit is a challenge because you need programmers and testers to make it work.

Fit Unification Summit

Friday (after the conference), a number of Fit implementors met for a Fit unification summit. If you're interested in that, look for the fit-devl mailing list.

[Published in 5 parts, Aug. 23-27, 2005]


OOPSLA ’04 Trip Report

I'm always struck by how everybody goes to a different conference. This was mine…

10-24-04 – Sunday, and 10-25-04 – Monday

"Usage-Centered Design in Agile Development", by Jeff Patton. This tutorial used a series of exercises to simulate how UCD works.

"Dungeons and Patterns", "Test-Driven Development Workout" – Steve Metsker and I offered our tutorials on patterns and TDD. We also did a session on Framegames for the Educator's Symposium.


10-26-04 – Tuesday

"The Future of Programming", by Richard Rashid. He described several interesting bits of research. One system created a "black box for humans", capturing video every few seconds. SPOT is Small Personal Object Technology, e.g., very smart watches. There will be a kit available 1Q05. He also described research in development tools, for better testing and better modeling.

"Mock Roles, not Objects" by Steve Freeman and Tim MacKinnon. This left me once again aware of how different the mock object approach is from how I do TDD. The design seems more conscious. I don't know how much that's good or bad. It does make dependency injection more natural.

"Systems of Names and other tools of the not-quite-tangible", by Ward Cunningham. He reviewed the idea of mining experiences for patterns. He used System of Names as an example of this, with a very simple Problem => Solution form. He also likes the idea of leaving room for new things: the wiki has a prompting statement for new pages. Finally, Ward reminded us of the importance of being receptive to discovery and integration of new ideas.

"Methodology Work is Ontology Work", by Brian Marick. Ontology refers to the kind of things that exist (philosophically). Brian highlighted Lakatos' philosophy, and suggested that the result is that it's rational to produce a program that seems exciting and spins off results (regardless of its "truth"). (To be fair, Brian pointed out that Lakatos would hate this attitude.)

So:

  • Have a hard core of 3-6 postulates.
  • Work out the consequences, and merrily ignore counterexamples.
  • Prefer novel confirmations.
  • Keep throwing off new results.

Brian described a second "trick": use perception to provoke action and reinforce ontology. For example, have Big Visible Charts that show a team where it is; have monitors that go red when tests break.

"Agile Customer Panel" (various).

  • "Customer is not an administrative role" (?)
  • "Customer interaction patterns are simple but difficult" (Linda Rising)
  • "How do we know what has value?" Put it in ridiculous order, and let the customer rearrange it. Tie groups of features to business value, favoring early deployment as proof.
  • Customer prioritization is hard but has the best opportunity for creating high value.

"First courses in Computing Should be Child's Play", Alan Kay. Changing the bulk of people requires a contagion model. Flow as a balance of challenge and ability.


10-27-04 – Wednesday

"Code Complete", Steve McConnell. There are plenty of bad ideas, but there have been advances: higher-level design, daily build and smoke test, standard libraries, Visual Basic, Open Source Software, the web for research, incremental development, test-first development, refactoring as a discipline, faster computers. But – software's essential tensions remain: rigid plans vs. improvisation, discipline vs. flexibility, etc.

"JMock Demo".

"Wiki BOF". Seeding can be important: seeded pages with incomplete ideas, invited guests, no passwords, compelling questions, etc.


10-28-04 – Thursday

"Amazon Web Services", by Allan Vermeulen. There will be computer-to-computer "grid computing" superseding the person-to-computer web computing era. He demonstrated a variety of tools that you can use with Amazon to make this work.

"Outsourcing – How will your job change?" (panel). It's clear there's fear of outsourcing, but it can work. Approaches built around the idea that "they" aren't just as smart as "we" are are misguided and doomed.

"Exocumputing", Jaron Lanier. He tried to suggest different approaches to computing. Computers as built today are very brittle. Perhaps we can try new ways inspired by biology.


Overall, I enjoyed the conference. But it was a lot heavier on philosophy than technique. The thing I'm most inspired to do is investigate what's happening in the Amazon "grid service" space.

Coaching Charts Exercise – Answers

This page has answers for the coaching charts exercise developed by Ron Jeffries and Bill Wake.

Don't peek at this page unless you want to see answers.

The Graphs

 

1. Velocity

Velocity

This is a very artificial-looking velocity curve. It's hard to believe this is happening randomly; there must be something going on. Here are some possibilities; most are bad news.

  • The team is purposely controlling its velocity.
  • The team is supporting two different projects, and gives each one emphasis on alternate weeks.
  • The team developed the habit of taking it easy on alternate iterations.
  • The team is releasing on alternate iterations, and is spending too much time preparing for the release (and release isn't counted into velocity).
  • The team is delivering a lot one iteration, but then spending a lot of the next iteration cleaning up neglected refactorings or fixing bugs.

 

2. Lines of Code

Address two cases:
Case 1: Velocity is about the same each iteration
Case 2: Velocity has a curve similar to this one

Lines of code

Case 1: Velocity is about the same each iteration.
This could be a reasonable curve for a team that's doing refactoring:
– when they add features in new areas, the code size increases
– when they add code in existing areas, the code size increases, but more slowly
– they occasionally get a major insight that lets them drastically reduce the system's size.

That they're sustaining their velocity even when deleting code is a good sign.

Case 2: Velocity tracks LOC.
That sounds like a team that earns lots of points when it's adding code, and few points when it's refactoring to remove code. That suggests that refactoring is piling up; perhaps the team "crashes" and has to ask for time to clean up so they can make more progress. 
 

 3. Velocity

The team appears to be generally improving, though there is a lot of fluctuation. Will the velocity keep trending upward?
 

4. Acceptance Tests

Iteration Max Passing
1 E A,B,C,D,E
2 F A,C,E,F
3 G A,C,D,E,G
4 I A,C,E,F,G,H,I
5 J A,C,D,F,G,I,J
6 K A,C,E,F,G,H,I,J
7 K A,B,C,D,E,F,G,H,I,J,K

 

The most noticeable thing about this chart is its opacity: it doesn't present its data in an interesting way. See the later acceptance-tests chart for the same data presented better (and discussion of the data itself).
 

5. Checkins
  

Iteration Mon Tue Wed Thu Fri
1 xxxxx xxx xxxx xxxxx xxxx xxxxx xx xxxxx xxx
2 xxxxx xxxx x xxxxx xxxxx xxxxx xx xxxxx x xxxxx xx
3 xxxxx xxx xx xxxxx xxxxx xx xxxxx xxx xxxxx x
4 xxxxx xxxxx xx xxxxx xxxx xxxxx xxx xxxxx xxxxx x

It's clear that Tuesdays are the day when the least is getting checked in, and Wednesday seems to try to catch up a little. Is the planning meeting (or something else) on Tuesdays?

The team is otherwise fairly consistent from day to day and week to week. How many people are checking in? If it's two or three pairs, then each is checking in 3 or 4 times a day.

The pattern is clear; does the chart still help the team?

 

6. Tasks

This snapshot was taken Wednesday, halfway through the iteration.

Tasks

Story 1
    Task A
    Task B

Story 2
    Connect frobbles
    Persistence
    Darnagle the froogles

Story 3
    Lorem ipsit
    Quantius maximus

Story 4
    Hopp galoppe
    Coniunctirae prillin
    Bloddius rank

Story 5
    Trillin exertes
    Postulo mio
    Agricanka lama
    Needhle pind

The team is done with half of the tasks, but none of the stories. Are they cooperating well, or do we have one developer per story? It's hard to tell whether the iteration is in jeopardy – if the stories are all incomplete for the same reason, we may have a real problem.

The team is treating all stories as equal priority. I'd definitely push the team to focus on getting the most important story completed first.
 

7. Acceptance Tests

 

This is the earlier acceptance test data, recorded in a more understandable form.

Overall, we see that this is not a team that keeps tests passing once they've passed the first time. (I prefer the ratchet approach: once it passes, it's kept green.) Note that the team is adding tests each iteration, but not very many.
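A minimal sketch of that ratchet (my construction, not part of the exercise): record which tests have ever passed, and treat any later failure of one of those as build-breaking.

    import java.util.HashSet;
    import java.util.Set;

    // Test "ratchet": once a test has passed, it may never fail again
    // without breaking the build.
    public class Ratchet {
        private final Set<String> everPassed = new HashSet<>();

        void record(String testName, boolean passed) {
            if (passed) {
                everPassed.add(testName);
            } else if (everPassed.contains(testName)) {
                throw new AssertionError("Regression: " + testName + " had passed before");
            }
            // A test that has never passed is merely "not done yet."
        }
    }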

Only the first and last iteration had all tests green. What will the next iteration be like? (Did someone just declare victory on the tests, or are they really all working right?)

Test B is clearly a problem: it's never passing. Why not? Why hasn't the team addressed this problem?

Test D is also interesting: it's passing on alternate runs. Sometimes this indicates that a test isn't properly cleaning up after itself. Or it may be a symptom of other fragility: we fix the problem, but then the next change breaks it again. In any case, the team needs to work on this test too.

Are these tests being run only once per iteration? Maybe more frequent reporting would help the team keep them green.
 

8. When can we ship?

When to ship?

The trend line is in a good direction: down. It looks like the team will be shipping in about two iterations.

The jog upwards at the start of each iteration represents growth in the number of points remaining, either due to re-estimates or due to added stories. But notice that this, too, is getting smaller each iteration. It feels like this team is in control.

Thanks

Thanks to the attendees of the class Coaching Agile Software Teams for participating in this exercise.

[Written September, 2004.]

Coaching Charts Exercise

Charts can help a team see itself better.

This is an exercise for coaches, used in the course Coaching Agile Software Teams at XP Agile Universe '04, taught by Ron Jeffries and Bill Wake.

We ran the exercise using Thiagi's ENVELOPES framegame. In this format, each envelope has a problem on it, and teams try to come up with a solution to the problem over several rounds. Finally, a team compares the solutions and chooses the one that best exemplifies the answer.

In this online version, we'll use a different approach. We'll show each graph, and you can write down your impression of what it's telling you. Finally, you can compare your answer to expert answers.

The Graphs

1. Velocity

Velocity keeps bouncing up and down

 

2. Lines of Code

Address two cases:
Case 1: Velocity is about the same each iteration
Case 2: Velocity has a curve similar to this one

Lines of code increases but then stays steady overall

 

3. Velocity

Continually increasing velocity

 

4. Acceptance Tests
 

Iteration Max Passing
1 E A,B,C,D,E
2 F A,C,E,F
3 G A,C,D,E,G
4 I A,C,E,F,G,H,I
5 J A,C,D,F,G,I,J
6 K A,C,E,F,G,H,I,J
7 K A,B,C,D,E,F,G,H,I,J,K

5. Checkins

Iteration Mon Tue Wed Thu Fri
1 xxxxx xxx xxxx xxxxx xxxx xxxxx xx xxxxx xxx
2 xxxxx xxxx x xxxxx xxxxx xxxxx xx xxxxx x xxxxx xx
3 xxxxx xxx xx xxxxx xxxxx xx xxxxx xxx xxxxx x
4 xxxxx xxxxx xx xxxxx xxxx xxxxx xxx xxxxx xxxxx x

 

6. Tasks

This snapshot was taken Wednesday, halfway through the iteration.

Tasks

Story 1
    Task A
    Task B

Story 2
    Connect frobbles
    Persistence
    Darnagle the froogles

Story 3
    Lorem ipsit
    Quantius maximus

Story 4
    Hopp galoppe
    Coniunctirae prillin
    Bloddius rank

Story 5
    Trillin exertes
    Postulo mio
    Agricanka lama
    Needhle pind

 

7. Acceptance Tests  

Acceptance tests

8. When can we ship?

When can we ship?

 

Expert Answers

Visit the answer page.

Further Study

Several of the XP books (Explained, Installed, Explored, and others) have examples of Big Visible Charts. In Agile Software Development, Alistair Cockburn frames them as information radiators. View (and contribute to!) the room and chart gallery for more examples.

[Written September, 2004.]

A Gallery of Team Rooms and Charts

This is a gallery of team rooms, and the Big Visible Charts/Information Radiators they use. Please send email to William.Wake@acm.org and have yours included!

Jump to the oldest photos at the bottom of the page.
 


18. Team room picture, from Ilja Preuß (Posted Feb. '09, pictures from April '08)

"Our team (see 11. at http://xp123.com/articles/a-gallery-of-team-rooms-and-charts/) moved into new rooms some time ago, and I finally got around shooting photos. Attached is a panorama photomontage of the room. Feel free to publish it on your website.

There are eight developers working in the room – a ninth will join us in a few weeks. Many of the workstations are equipped with a second mouse and keyboard for pair programming. Two can even switch their input devices and monitors to connect to the same PC, making it possible to pair program without leaving their tables.

On the very left you can see a monitor, mouse, keyboard and speakers being connected to a server in a different room, running some custom made monitoring and "telemetrics" software. In the corner, you see the traffic light showing the state of the continuous build.

Behind the walls on the right is a smaller room with a flip chart, whiteboard, a single workstation, and a sofa. It is used as a meeting room, for discussions with customers over the telephone, and for power napping.

The blue, lengthy bag on the right contains a foldable hammock – our emergency fallback when the sofa isn't available… ;)"
(Click to see full-sized picture)


17. Team room pictures, "Oxygen" (Posted Feb. '09, pictures from Nov. '07)

Originals at http://judykat.com/ken/2007/11/24/our-team-room/ (with some descriptive text).

I love the Space Invaders paint job.

(Click to see full-sized picture)

(Click to see full-sized picture)


16. Team room pictures, courtesy of Bryan Nehl (Oct., 2006)

"At the end of the open space we had a large whiteboard, bulletin board and team calendar.

The whiteboard was used to sketch out ideas and arrange user stories/tasks that were initially created by brainstorming with sticky pads.

The bulletin board had the team's 6-month goals, weekly goals, a sample story card, and cards in progress or future cards.  Also, you'll see some process diagrams, the "XP movers" cartoon, a list of overall project priorities, and a list of releases to production.

The Calendar was used for planning and coordinating time off.

We also had a meeting table in the middle of the open space where we could all come together for team meetings and collaboration."

(Click to see full-sized picture)

 

"At the station pictured you'll see the dual monitor, dual mouse/keyboard set up.  The laptop off to the side was for email and other production account use.  Development machines were on a separate domain."
(Click to see full-sized picture)

"We also had a user testing station in our area.

After having switched the team from a cubicle farm with silo'ed developers to XP with an open workspace all of the developers have said they would never want to do it any other way again."

(Click to see full-sized picture)
 

 


15. Ternary Software (Oct., 2006)

 

A status board showing who plans to do what today, and where people expect to be.
(Click to see full-sized picture)

A different team room.
(Click to see full-sized picture)

A progress chart. The part in the middle is from an old project (and is just "background" at this time). The orange sticky notes show progress (moving left as stories are done). The dots in the corners of the story cards represent priority.
(Click to see full-sized picture)

A task board for the system administrator, showing stories and priorities. The sticky notes on the cards show who will do the work and how long they expect it to take.
(Click to see full-sized picture)

 

"Permission for use and re-use for non-commercial purposes granted by Ternary Software, Inc."

 


14. Iteration progress chart.

 

Courtesy of Geoffrey Slinker, who says, "Items progress in the bottom boxes from right to left, then up into 'this iteration,' then progress across the top, then land in the finished box. I use sticky notes, and I stack them and keep them so that when performance review time comes along I can refresh my mind on everything I have accomplished." (May, 2006)

 

(Click to see full-sized picture)


13. Build toy at THI Insurance.
Jeff Morgan writes:

An XP team at THI Insurance has converted a trailer hitch into a build failure notification. When the build breaks, the deer flaps its front legs.

See a video (2M, AVI).

  (Click to see full-sized picture)


12. Build light at http://www.touchclarity.com
Courtesy of Chris Smith, who writes:

"This is a real, four foot high British traffic light, which is hooked up to our build machine.  It was a scoop at only £40 from ebay, but I didn't envy my colleague who had to bring it into work on a crowded London train.

The build machine also plays sounds when the build status changes:
Red: compilation failure; plays a submarine style siren sound
Amber: test failure; plays a 'wrong answer' sound from a game show
Green: build success; plays a 'Hallelujah!' sound

We also have a secret Friday night mode that plays disco music and alternates combinations of lights to the beat, but we haven't shown that to the Customer yet :-)

So, do we have the coolest build status lights?"

Chris adds, "you and others are free to use the photo wherever you like."
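The status mapping Chris describes is simple enough to sketch; the enum and the lamp/sound hooks below are hypothetical stand-ins for whatever drives the real hardware.

    // Build-status mapping as described: red for compilation failures,
    // amber for test failures, green for success.
    enum BuildStatus { COMPILE_FAILURE, TEST_FAILURE, SUCCESS }

    class BuildLight {
        static String lampFor(BuildStatus status) {
            switch (status) {
                case COMPILE_FAILURE: return "red";    // submarine siren
                case TEST_FAILURE:    return "amber";  // 'wrong answer' sound
                case SUCCESS:         return "green";  // 'Hallelujah!'
            }
            throw new IllegalStateException("unknown status: " + status);
        }
    }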


11. Cadenza team from disy Informationssysteme GmbH (www.disy.net). January, 2006.

Pictures provided by Ilja Preuß. There are more on their egroup, at http://de.groups.yahoo.com/group/agile-ka/ (German).


10. More pictures from William Pietri (Sept., 2005)

This is a small selection from his wonderful page at http://www.scissor.com/resources/teamroom/ (Used with permission)

The full story planning board.

 

The current stories.



9. Mountain Goat Software (2005)

 

Here's Mike Cohn's generic taskboard, then a couple "live" ones. These pictures are from http://www.mountaingoatsoftware.com/taskboard.php; used with permission.

 

Mike tracks by story. The other cards represent tasks, and show what state they're in.


8. Emergys
(March, 2004)
"What we do trying to expend less money as possible, we set a wireless access point in the room and we can move furniture all around whatever we like."  –Marco Marquez


7. Software team doing a fixed-price conversion project
(December, 2003)

 

The two columns show status for each test both pre-conversion and post-conversion.

Green squares show tests that pass.
Red squares show tests that failed.
Blue squares show tests that failed for reasons outside the team's control.
X's show tests not yet run.

Numbers in the squares correspond to items on the issue list at the lower left.

The graph in the lower right shows tests over time, and should move to "all green."


6. www.binaryessentials.com  – An XP Workshop
An Indian organization for XP coaching and consulting. More photos from this workshop are available.


5. Key Technology, Medford, OR

"Normally we pair at separate computers (side by side) using a shared VNC session.  It takes a bit of getting used to since everyone has a keyboard and mouse.  It's easy to click or type and mess up someone else.  On the plus side, it takes even less time to switch roles.

The monitor you can see in the back corner does double duty for our build server and our Windows box."

(Picture is damaged.)

 


4. An XP team coached by William Pietri
Build state shown by an Ambient Orb (TM): "In the photo, the build is broken."

Story cards: "Each row is a week of effort. Cards with a big checkmark are done; blue post-its contain customer-written outlines of test cases; yellow stickies contain unresolved questions for the customer. The cards are held to the glass with little stick-on hooks. Down at the bottom right is a little pouch with blank cards and a pencil."


3. Ken Boucher's team room (fairlygoodpractices.com)



2. XPlayers Team (P. Bossi)

Remote pairing in progress.


 


(Link to PowerPoint picture of team room)

 


1. An XP Team (eidogen.com)

(September, 2003)

[Begun September, 2003.]