Presentations and Papers

O X F O R D   S O F T W A R E   E N G I N E E R I N G
C o n s u l t i n g   S o f t w a r e   E n g i n e e r s










From time to time we are invited to give talks. These give us an opportunity to develop and present our ideas on software development, improvement and measurement. Our presentations are listed here. If the slides aren’t available, contact us and we’ll send them to you, together with the paper if there is one. Some of these have developed into workshops, tutorials and training courses. If any of these topics are of particular interest to you, contact us.








36. ‘The evolution of a super agile scalable software capability’, September 2015

Off to Warsaw for a day or two, with a chance to talk about bwin’s great software process engineering work. The presentation contains little that is technically new, but the passage of time has given it a different perspective.








35. ‘Exploring the Creation Spectrum’, June 2015

Eventually managed to get this presentation on innovation together after revisiting some of our ideas on exploiting innovation in software development.








34. ‘Quantification’, March 2015

A last-minute invitation to speak at the CMMI Institute’s conference in London gave me an opportunity to speak about software measurement again. It was interesting to hear what is going on, and to meet the CMMI people too.








33. ‘Defect Measurement and Analysis’, March 2015

A rather disorganized three-hour workshop where the materials were only finalized at the last moment. My materials had to be put to one side and we had to work with a random subset, randomly ordered. So, while they were not used (yet), here are the slides we intended to use. Better luck next time, I suppose.








32. ‘Requirement: The Edge of Fear’, June 2014

For Tom Gilb’s seminar this year, simply a presentation trying to identify the issues encountered in requirements activities. The presentation looks at some of the problems I have encountered and suggests some remedies and theories, including the idea that requirements are an illusion! Not particularly well received, as you’d expect. (This year’s gathering was characterised by a fairly downbeat set of talks and an outstanding new graphic from IBM, Japan: the ‘Yukikogram’.)








31. ‘The evaluation of management (and other) methods and tools’, June 2013

A slightly different presentation for Tom Gilb’s ‘GilbFest’ this time. No new research or exploration, simply recovering material from work done some time ago. The evaluation of methods and tools used to be routine: evaluating the features, the attributes and, critically, the effects on the development process and system qualities. This talk describes the range of evaluation techniques and proposes that a deductive approach is the best.








30. ‘<x>DD’, March 2013

Listening to FP people at UKSMA 2012 discussing how to establish an industry-standard ‘work rate’ prompted this presentation (given at the excellent LLKD13) as a plea to recognise software development as a creative, design activity. With the increasing interest in production engineering methods and the ‘industrialization’ of development driving the joy out of software work, as well as doing damage to ‘productivity’, it is time to take a stand.








29. ‘Agile Software Measurement’, November 2012

As a last-minute stand-in for Brian Wells I prepared this presentation for the 2012 UKSMA conference. It is a call to move software measurement back into software development as a tool for software developers, not just as a stick for managers and administrators to beat developers with. Technology is providing developers (and others) with great opportunities for novel forms of analysis and better decision making, but it will take time to remove the stigma that software measurement has acquired through misuse.








28. ‘Hunting for Anti-Patterns’, June 2012

The idea that anti-patterns were interesting arose at Tom Gilb’s 2011 seminar, and the ideas were developed for this presentation in 2012. There is an opportunity to make significant improvements by stopping doing the wrong things, which is significantly more powerful than just doing more of the right ones, but it is an extreme sport.








27. ‘Case Study: Testing in a Super Agile Environment’, November 2011

This is the first time we have presented at the Next Generation Testing conference. We were slightly worried about presenting in front of the Dons of the Tester Nostra, but needn’t have been; all went well. A joint presentation by Gerald Grossmeyer and Clifford Shelley outlining how bwin developed their super agile development capability and how they designed their testing capability to fit the requirements for high-quality systems and extreme delivery predictability and responsiveness. The slides are here.








26. ‘Software Development Analytics’, November 2011

Here is the presentation delivered at the 2011 UKSMA/COSMIC conference in November. It was a last minute entry when one of the intended presenters found they would not be allowed to deliver their paper. This ‘analytics’ presentation makes the case for recognizing and analysing the data we already have to give us an unbiased and informative understanding of our software and software development activity.








25. ‘The Evolution of a Super Agile, Scalable Software Capability’, September 2011

Here is a draft paper outlining the achievements of bwin in developing a new software development capability combining the best of agile software development and the process ideas of CMMI. It is a draft, so comments and corrections would be welcomed.








24. ‘Designing Designing’, presented at Tom Gilb’s ‘Solution Engineering’ seminar, June 2011

Back exhausted from Tom Gilb’s 2011 ‘Solution Engineering’ seminar. The slides from my presentation are here. They describe some of the essential elements of a good software design process and what software organizations know about their own particular design process, which it turns out isn’t very much, oddly.








23. bwin’s GQM/SPI Case Study, October 2010

Here are the slides for the presentation at this year’s UKSMA conference. It shows how bwin made major improvements in delivery predictability to give themselves a completely novel (to my knowledge) capability. They did this by combining agile development practices (super scrummy) with process discipline similar to that expected in high maturity organizations. This work took bwin to the finals of the European Software Excellence awards. However, neither the agile community nor the process people seem to know what to make of this; it doesn’t fit into their preconceived models.








22. Energizing CMMI, April 2010

Sometime last year a discussion on the poor state of much SPI work prompted our Energizing CMMI service and ten ‘rules’ of SPI. This developed into the webinar ‘Outcome Based Software Process Improvement’ and a presentation ‘Energizing CMMI’ at the BCS SPIN last year. Here is the paper that elaborates on the topics discussed; your comments would be very welcome.








21. Fagan Inspection: The Silver Bullet No-One Wants to Fire, March 2010

A passing remark at a recent meeting prompted some discussion about why Fagan Inspection, one of the most effective quality controls, is not more widely used. This presentation is the result. We will be looking at this topic at the next BCS meeting in June this year.








20. eXtreme Measurement, October 2009

Here are the paper and the slides for the presentation at this year’s UKSMA conference. It discusses Austin’s model of measurement dysfunction, the limitations this places on software measurement, and what can be (and has been) done about it. Considering that most metrics people don’t really want to know about the limitations of their favourite topic, it was quite well received; at least I didn’t have to make a run for the exit.








19. Changing Culture, June 2009

Another presentation for Tom Gilb’s seminar. This one looks at understanding, or mapping, a software culture. We have developed a mapping tool that places the culture of interest into its context and enables it to be described ‘as is’ and ‘to be’. The tool derives some of its features from a measurement definition method that requires the building of an attribute list that is then used as a checklist. This seems to have a number of advantages, perhaps the most important of which is that it avoids any possibility of scoring. A number of culture change tools are also described, but these really do need to be treated with care. The paper is here.








18. Imagining Managing Risk, June 2008

Another presentation for Tom Gilb’s 2008 seminar. Most software projects have a remarkably similar approach to managing risk; it is pretty much the industry de facto standard. While it can work well it does have a number of weaknesses and often makes assumptions about the role of risk that can be misleading. This presentation takes a look at the state of practice and proposes some straightforward changes that increase the value of this fundamental practice. The paper is here. (During the seminar several of the ideas presented here received qualified validation and we also learned a lot. Our risk management practice is now being revised to incorporate these new ideas.)








17. CMMI & Metrics, British Computer Society SPIN SG, February 2008

Measurement is a problem and usually delivers little of value. Here we looked at how CMMI expects measurement to be used. The model requirements are well thought out but do need to be interpreted to give sufficient value to those developing the measures and collecting and using the data.








16. Goal Question Metric – The Foundations of Measurement, UKSMA October 2007

The ability to develop and use measurement information is a fundamental but difficult software engineering skill. Our tutorial uses GQM as the framework for demonstrating the measurement fundamentals. It is structured as a walkthrough of GQM, cast as a procedure with methods, tools, guidance and templates provided for each of the steps.
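The top-down refinement that GQM prescribes can be sketched as a simple data structure: each goal is refined into questions, and each question into the metrics that answer it. This is an illustrative sketch only, not material from the tutorial; the goal, questions and metric names below are invented for the example.

```python
# Minimal sketch of a Goal/Question/Metric hierarchy. All names here are
# hypothetical examples, not metrics from the tutorial itself.
from dataclasses import dataclass, field


@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)  # metric names that answer this question


@dataclass
class Goal:
    purpose: str
    questions: list = field(default_factory=list)


# Example goal, refined top-down into questions and then metrics
goal = Goal(
    purpose="Improve delivery predictability of releases",
    questions=[
        Question("How often do releases slip?",
                 metrics=["planned vs actual release date"]),
        Question("Where is schedule variance introduced?",
                 metrics=["estimation error per phase", "defect rework hours"]),
    ],
)

# Walking the tree yields the complete list of metrics the goal requires,
# which is what makes each measure traceable back to a stated goal.
all_metrics = [m for q in goal.questions for m in q.metrics]
```

The value of holding the structure explicitly is traceability: every metric collected can be justified by pointing back up the tree to the question and goal it serves.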








15. Smart Decision Making June 2007

Another venture into the lion’s den with this paper and presentation on smart decision making. This is my second paper for one of Tom Gilb’s seminars. It looks at the nature of decision making: what makes it smart, and what is a smart decision-making strategy? It shows how these can be presented and analysed, together with an analysis of decision-making ‘dysfunction’.








14. Additional Data from Software Inspections, UKSMA October 2006

The idea of profects was identified in an earlier paper but developed here in a paper for the UKSMA conference. The idea of Inspections (and other reviews) as an intelligent process was explored, and the opportunity to measure and analyse design excellence was developed with the idea of ‘profects’: recognizable elements of exceptional design.








13. Analysis of Defect (and other) Data, May 2005

This presentation was developed for my first presentation at one of Tom Gilb’s seminars. It was prompted by irritation at the unthinking misapplication of statistical process control to software development. In this presentation the limitations of SPC are examined with a series of questions that those attempting to use SPC should be able to answer convincingly. Appropriate, robust methods to deal with software engineering’s messy data are proposed, together with some ways of presenting them. The idea of ‘profects’ also appears for the first time in this presentation. This presentation was also given at the UK SPIN meeting and the UKSMA conference. The accompanying paper is here.
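One way to see why classical SPC sits badly with messy defect data is to compare its mean-and-sigma control limits with robust equivalents based on the median and MAD. This sketch is our own illustration, not an example from the paper; the data are invented.

```python
# Hypothetical defect counts per release, with one anomalous release.
# A single outlier inflates the classical mean +/- 3-sigma control limit,
# while the median/MAD-based limit stays close to the typical data.
import statistics

defects_per_release = [4, 5, 3, 6, 4, 5, 48]

mean = statistics.mean(defects_per_release)
stdev = statistics.stdev(defects_per_release)
classical_upper = mean + 3 * stdev            # dragged upward by the outlier

median = statistics.median(defects_per_release)
mad = statistics.median(abs(x - median) for x in defects_per_release)
robust_upper = median + 3 * 1.4826 * mad      # 1.4826 scales MAD to ~sigma for normal data
```

With these numbers the classical upper limit lands far above the robust one, so the anomalous release is flagged by the robust limit but effectively masks itself under the classical one. This is one illustration of the general point, not the specific methods the paper proposes.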








12. Six Sigma – and its application to software development, May 2003

This presentation was developed to explore the potential of the six sigma approach to process improvement. The technical origins of six sigma were reviewed and the development of effective tools for beneficial change discussed. There are surprises for both those who think six sigma is not applicable to software development, and for those who do!








11. SPI Approaches – Successful Implementation, February 2003
This presentation was given at the UK SPIN in February 2003 as the basis of a workshop. OSEL first started to look at the different approaches to SPI back in 1996, to identify what was successful and why. This work has continued, and our conclusion is not just that some approaches work and others don't, but that some approaches work in some types of organisation where other approaches would be less successful. This has enabled us to identify the different types of organisation and therefore select the approaches that have the greatest probability of delivering benefits.








10. Metrics - Beyond Numbers, September 2002
This presentation was given for the first time at a SPIN meeting in September 2002. We take another look at the nature of measurements and their relationship to the software products or processes they are measuring. Prompted by observations of the way children learn to count a simple 'Measurement Maturity Model' has been developed. This model is described. It maps levels of sophistication and types of valid software measures to the level of empirical understanding of the product or process to be measured. The role of graphics and the careful display of data as synthetic graphics are also discussed together with some speculations on the future development of software metrics and display mechanisms.








9. Software Project Management: Monitoring and Control,
First presented in April 2001
We were invited to speak at a meeting hosted by Agilent in the spring of 2001. We took the opportunity to develop this presentation to review the ideas of monitoring and control from first principles. We developed a simple model of project control, identifying the essential characteristics of effective monitoring and control, and worked this through to practical advice and checklists. Much of the thinking behind this presentation was derived from our experience of designing and implementing PODS - our project information manager. It provides the information necessary to monitor (in real time) and control projects in a consistent and reliable manner. The presentation was delivered again at a SPIN UK meeting in 2001.








8. eXtreme Programming:
experiences with a new approach to software development
February 2001
This presentation describes the eXtreme Programming approach and our experiences of using some XP techniques in developing our software information management tool, PODS, which provides an organisation with real-time status of all of their programmes and projects. Analogies between the philosophy of XP and that of RPI (see 7 - below) are made.

Our experience with XP is described on the IEEE

This tends to give a less positive interpretation of XP than we intended. Since the Dynabook posting our opinion of XP has risen as our product has matured and more XP practices have come into play.








7. Rapid Process Improvement,
First presented in June 1999
Rapid Process Improvement has been developed from a number of sources. Prompted by the need to capture our own expertise, we have developed a number of discrete tools for making and managing changes to the software process. During this process we presented our ideas to colleagues at Lockheed Martin under the umbrella title 'Software Production Engineering', where the role of a Software Production Engineer, analogous to a Production Engineer, was discussed. Then, with the response to the SPI Asset Repository presentation (see 5, below), we began the development of a simple SPI tool taxonomy and populated it with the tools that we have developed and discovered in use elsewhere. This process improvement tool-set has been under continuous development and has matured into what we believe to be an important set of assets for the SPI community.

The RPI workshop, which describes the need for RPI and the tool-set, has now been presented more than a dozen times, both publicly and as in-house training workshops, and is continually updated.








6. It's not the model - it's what you do, February 1999
We take a look at the different types of software process models, the different approaches to SPI, and the different tools that can be used. This presentation looks at the strategies used to make change - rather than describing the models themselves. The advantages of the various strategies and some of the tools used to implement these strategies are described.








5. SPI Asset Repository for SPIN - UK
September 1998
This was the original proposal for the establishment of a UK SPIN SPI tools repository available for all. The response to this proposal was interesting. In general the idea was received with some caution. On being prompted no one appeared able or willing to describe any SPI tools. This has prompted us to identify the SPI tool-set described in the RPI presentation.








4. Metrics in the context of CMM/SPICE
September 1998
Software Metrics remains a popular topic. This presentation describes what measurement is and looks at the way measurement is incorporated into two of the most influential software process models: the CMM and SPICE.








3. SPI How to get started - and keep going
February 1998
This presentation was developed to provide an opportunity to discuss the do's and don'ts of SPI. Given the poor track record of the majority of SPI initiatives and the startling success of others this presentation identifies those SPI characteristics that improve the probability of success.








2. Software Process Improvement and the Capability Maturity Model
March 1996.
This presentation was designed to introduce the ideas of process improvement and the SEI's Capability Maturity Model to the construction industry. A large number of points of recognition and similarity were identified, resulting in the UK construction industry's SPICE project (not ISO 15504), managed by the University of Salford, which is developing a CMM-style maturity model to direct construction industry process improvement and innovation initiatives.








1. Metrics Led Software Process Improvement
November 1995.
This presentation gives an overview of the three elements found in successful software measurement activities: an understanding of the software development environment - its capability and needs; the tools and techniques of sound measurement; and the human factor. Attributes for initiating and sustaining a successful measurement programme are described along with some common mistakes. The material covered in this presentation is discussed in more detail in chapter 4 'Making Software Measurement Work' of 'Implementing a Quality Management System', edited by D.N.Wilson, ISBN 1853125938









Copyright OSEL 1998-2015
This site was updated on 22/09/15
Comments about this website to