

Focused Review Guidelines

Effective reviews have a purpose. This document suggests a series of focused reviews leading to product delivery that matches the criteria set by the users of an application.

It is important to remain in the user language as long as possible to maximize the interaction with users, program champions and architects of the system.

It is important to assign a moderator and the other review roles. Review findings should be written up as specific additions and action items.

A focused review checks the system against one aspect at a time, which allows the reviewers to concentrate on the salient requirements. As the project matures, the focus shifts from gathering requirements to the exploration of performance.

Remember: the earlier errors are detected, the easier they are to fix.

o Get requirements clear early on.
o Catch design errors early on.

Gathering Requirements
Early reviews add general requirements and explore the users of the system. The roles of external systems and other events become clear in this phase.

User Interface
Middle reviews fill in specifics and drive to known inputs and outputs. What the user will see is shown in these reviews. Once the user interface is known, it may be checked by a user interface expert for the proper look and feel.

Identify Iterative and Incremental Steps
Later reviews define what must be done first. This narrowing of the actual project into palatable pieces makes it manageable and measurable. Out of the narrowing process come prototype needs, user acceptance criteria, user documentation, and functional test documentation, as well as the iterative and incremental plan of software releases.

At this point the project leader should be identifying the risks associated with this project so they can be addressed early.

Identify Usage Patterns
Final reviews explore usage patterns which indicate measures of performance. From these reviews testing is extended to include volume, stress and bench testing. This style of testing shakes out each software release and forms a foundation to measure the software from release to release.


Requirements Review

Populate a use case diagram while asking key questions to identify the users of the software and the roles they play as Actors. An Actor can also be an external system.

Some systems are driven by events, so events may be the initiators of use cases.

Check a book on UML standards for use case diagramming information.

The focus of this review is:

o Follow through each role to check it.
o Do I have the essential parts?
o Am I missing any parts?

Also:

o Keep use case words in User Language.
o Identify user needs in general terms.
o Identify "extends" and "uses" relationships.
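As a concrete illustration of the checks above, the use case model can be held as plain data and examined mechanically. The following Python sketch is hypothetical (the Actor and use case names are invented, not from any particular system); it simply answers the "essential parts" and "missing parts" questions.

    # Hypothetical use case model held as plain Python data.
    actors = {"Customer", "Clerk", "Billing System"}  # an external system is an Actor too

    # Use case -> the Actors that participate in it.
    use_cases = {
        "Place Order":      {"Customer", "Clerk"},
        "Place Rush Order": {"Customer"},
        "Bill Customer":    {"Billing System"},
        "Browse Catalog":   set(),                    # deliberately missing its Actor
    }

    # "extends" and "uses" relationships between use cases.
    extends = {("Place Rush Order", "Place Order")}
    uses    = {("Place Order", "Bill Customer")}

    # Do I have the essential parts?  Am I missing any parts?
    covered      = set().union(*use_cases.values())
    idle_actors  = actors - covered                   # Actors with no use case
    orphan_cases = {uc for uc, who in use_cases.items() if not who}

    print("Actors with no use case:", idle_actors)    # set()
    print("Use cases with no Actor:", orphan_cases)   # {'Browse Catalog'}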


Requirements Walkthrough Review

Use the use case diagrams as a checkpoint for the use cases. As you walk through each use case scenario by finger pointing, check off each touched use case. Complicated or unclear scenarios should be noted as action items for the next review.

See Use Case Workshops.

The focus of this review is:

o Check each use case against the Actor roles.
o Do I have the essential parts?
o Am I missing any parts?
o Are the user scenarios understood?

Also:

o Keep use case words in User Language.
o Identify user needs in general terms.
o Identify frequency, volatility, errors.
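The check-off itself can be made mechanical. A minimal sketch, again with invented scenario and use case names: walk each scenario, mark the use cases it touches, and anything never touched becomes an action item.

    use_cases = {"Place Order", "Bill Customer", "Cancel Order", "Return Goods"}

    # Scenario -> the use cases the finger-pointing walk passes through.
    scenarios = {
        "Happy path order":      ["Place Order", "Bill Customer"],
        "Customer changes mind": ["Place Order", "Cancel Order"],
    }

    touched   = {uc for steps in scenarios.values() for uc in steps}
    unknown   = touched - use_cases    # walked, but missing from the diagram
    untouched = use_cases - touched    # in the diagram, but never walked

    print("Missing from the diagram:", unknown)        # set()
    print("Action items, never walked:", untouched)    # {'Return Goods'}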


User Interface Review

Develop a network of dialog maps. Use the scenarios, if needed, to walk through each dialog map. Check off every function on each dialog map.

The focus of this review is:

o Is there a use case for each dialog map function?
o Is this an acceptable look and feel for dialog maps?

Also:

o Identify scenarios that need documentation.
o Identify missing use cases.
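The dialog map check lends itself to the same treatment. A small sketch with invented screen and use case names: each screen in the network lists its functions, and every function should trace back to a use case.

    # Screen -> the functions offered on that screen (a full dialog map
    # network would also record which screen each function leads to).
    dialog_map = {
        "Main Menu":  ["Browse Catalog", "Place Order"],
        "Order Form": ["Submit Order", "Save Draft"],
    }

    use_cases = {"Browse Catalog", "Place Order", "Submit Order"}

    for screen, functions in dialog_map.items():
        for fn in functions:
            if fn not in use_cases:
                print(f"{screen}: no use case for '{fn}'")  # flags 'Save Draft'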


Business Rules Review

A series of small sessions will identify the rules and logic that define how things work, as opposed to the work or process flow.

A final review of these business rules with the whole group may help, since the information is often scattered across the minds of individual members.
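One way to keep the identified rules reviewable is to capture each as a small predicate, separate from the process flow that consults it. A sketch with an invented credit rule (not from the original text):

    def credit_ok(order_total, credit_limit, balance):
        """Rule: accept an order only if it fits the remaining credit."""
        return order_total <= credit_limit - balance

    # The process flow merely consults the rule; the rule itself stays in
    # one reviewable place rather than being buried in the flow.
    if credit_ok(order_total=120.0, credit_limit=500.0, balance=450.0):
        print("accept order")
    else:
        print("refer to credit department")   # only 50.0 of credit remains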

Often a prototype has been decided upon and is in the works as a proof of concept by this point. This sets the stage for implementation of a more robust model.


Release Review

The focus of this review is to reach agreement on what constitutes a release. The goal should be a minimal set of use cases whose implementation constitutes a system. From that base there can be incremental releases which iterate on this design to encourage stepwise refinement.

The focus of this review is:

o What is the minimal set of things to implement?
o What determines an acceptable release?

Also:

o Identify sanity tests.
o Identify feature / function tests.
o Identify exception tests.
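A minimal sketch of the three kinds of test named above, using Python's standard unittest module against an invented place_order function (the system under test is hypothetical):

    import unittest

    def place_order(qty):
        """Invented system under test, for illustration only."""
        if qty <= 0:
            raise ValueError("quantity must be positive")
        return {"qty": qty, "status": "accepted"}

    class ReleaseTests(unittest.TestCase):
        def test_sanity(self):                    # sanity test: does it run at all?
            self.assertEqual(place_order(1)["status"], "accepted")

        def test_feature_quantity(self):          # feature / function test
            self.assertEqual(place_order(3)["qty"], 3)

        def test_exception_bad_quantity(self):    # exception test
            with self.assertRaises(ValueError):
                place_order(0)

    if __name__ == "__main__":
        unittest.main()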


Design Review

See Responsibility Driven Design Walkthrough as one example of a design review.

The focus is:

o Does this design work? (catch errors early)
o Is there a better way?
o Does performance match the frequency of usage?

Also:

o Identify the risks of the design.
o Break out the estimated completion.


Performance Review

This review sets the tone for measurability across each release. This may happen after an actual first release when performance is better understood.

The focus of this review is:

o What limits need to be tested?
o What volume should be applied to those limits?
o What stress should be applied to those limits?
o What fundamental bench measures this application?
o What benchmark can apply across releases?
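One way to make such a bench measure repeatable across releases is a small timing harness. A sketch, with an invented stand-in for the real workload:

    import time

    def representative_operation(n):
        """Stand-in for a real, frequently used operation of the system."""
        return sorted(range(n, 0, -1))

    # Apply increasing volume to the same operation; rerun per release and
    # compare the recorded numbers release to release.
    for volume in (1_000, 10_000, 100_000):
        start = time.perf_counter()
        representative_operation(volume)
        elapsed = time.perf_counter() - start
        print(f"volume={volume:>7}: {elapsed:.4f}s")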


Post Mortem Review

Any project which winds down from a release can benefit from looking back with all members and asking fundamental questions. Starting from a time line of the project, all members will have a perspective on what went right and what went wrong. This course correction is a healthy beginning to each new incremental release.

See Post Mortem Agenda.


AUTHOR

David Scott David_Scott@Sequana.com October 1997
