What’s New?

OXFORD SOFTWARE ENGINEERING
Consulting Software Engineers


As well as funded work and tool support we have several internal projects and other events that may be of interest to software developers and the SPI community. This page lists some of our activities and interests. Contact us at shelley@osel.netkonect.co.uk if you would like to know more about these.

Friday 21 October 2016: BCS SPIN SG – Autumn Software Process Engineering Workshop….

You are invited to the autumn BCS SPIN Workshop

Title:    BCS SPIN Autumn Software Process Engineering Workshop

Theme:    'Change in Design Intensive and Knowledge Working Environments'

Programme: The day's programme is a mix of workshop, presentations (including a report of an organization's transformation of its development capability), and discussion. The objectives are the shared exploration and increased understanding of change in software development organizations, and the exchange of information and experiences of individual participants.

We have scheduled a number of 'soapbox' sessions after lunch where attendees can have five minutes to share ideas, experiences or views, or can request advice or help with problems and challenges.

There will be plenty of opportunities to quiz the experts and network with others engaged in this challenging work.

 

Who we are: SPIN (Software Process Improvement Network) is a network of software professionals engaged in improving or transforming software development capability. For nearly thirty years it has supported and reported the evolution of s/w development practice, from TQM, ISO 9000 and CMM, to Six Sigma (briefly), lean development, agile and DevOps.

 

Speakers include: Lisa Hughes, Simon Hodges, Clifford Shelley and Ian Seward.

Date: Friday, 21 October 2016

Venue: BCS London Offices, Southampton Street

Time: 9.30 - 16:30

Attendance: Free

Sustenance: tea, coffee and lunch will be provided

Registration: Spaces are limited and registration is required:

To book online go to http://www.bcs.org/content/ConWebDoc/56551

9 October 2015:  BCS SPIN SG – Invitation to participate in the Autumn Workshop….

Title: Tools, Techniques and Tactics for Transforming or Tuning the Software Development Process

Description: Contemporary software development organizations now have a large choice of software development models: agile software development has matured and frameworks with origins in manufacturing have made a real impact. We have a range of excellent software development resources on which to base our development capability.

But there is surprisingly little guidance describing which models best suit particular organizational needs, how to select the right one, or the right components. And there is little that shows how these models can be put in place. Technology transfer too often is limited to 'coaching and mentoring' and certifications of uncertain value.

There is a tendency to a 'one size fits all' approach, and for the model pundits and vendors to promote a complete framework. And an assumption that a better development capability means discarding current practices and replacing them with something more popular.

This can be unwise. An organization's ability to develop and deliver software is a hard won, valuable, but frequently undervalued asset. It is a subtle mix of tools and technologies, processes and practices, and people's, often tacit, domain and application 'know how', fitted to its particular context. It should not be discarded lightly.

However, deciding how to approach changing ways of working to better meet business needs is difficult. Transform or tune? Replace or refine? And how to do this?

This one day workshop brings together experienced software process professionals to show how to make real improvements to your software development capability.

From an evaluation of a software development organization's circumstances and needs through to verifying the implementation and operation of the new or refined capability, we will be surveying and discussing the tools, techniques and tactics that make changing your organization's software development capability predictable, low risk and cost effective.

Speakers include: Roger Gamage, Ally Gill, Alan Kelly, Peter Leeson, and Ian Seward. All have wide ranging experience of transforming organizations' software development capabilities and will be sharing their experience throughout the day.

Who should attend: Software developers searching for fixes to their development practices or better ways of working; Technical staff and managers responsible for the care and maintenance of their organization's software development capability; Managers responsible for transforming their existing software development capability or considering or introducing a new one.

Whether you are intending to install a new capability to meet new business requirements, adapt a model to fit within your existing capability, or replicate or scale an existing way of working, you will find this one day workshop of value.

Date: Friday, 9 October 2015

Venue: BCS London Offices, Southampton Street

Time: 9.30 - 16:30

Meeting Format: Interactive Presentations

Attendance: Free

Sustenance: tea, coffee and lunch will be provided

Registration: Spaces are limited and registration is required:

To book online go to http://www.bcs.org/content/ConWebDoc/55079

June 2015: Fixing the future: mode b to mode a….

 

At a recent seminar there was a very compelling talk by Nik Silver about the problems with poor quality risk registers. Lacklustre mitigation and contingency plans, unconvincing evaluations of probabilities, uncertainty about the inclusion of events that were certain to occur... all these can make the task of managing risks a chore of dubious value. The situations Nik described seemed to strike a chord with his audience.

And similar problems have been encountered with project estimation too – project estimates that lack credibility and will not gain commitment; that appear to be the result of half-hearted estimation undertaken by those somehow not really engaged, simply required to expend nominal effort.

This may be due to the way people approach these future-oriented estimation or risk evaluation tasks. There are two ways of perceiving the future:

The first is familiar: to view the future as a number of possible futures. Depending on events and decisions, a particular future is made from these possible futures. They may be considered as a number of streams, perhaps interlinked as some sort of braid. Our decisions and actions determine which of these possible futures actually happens.

The alternative view is simpler. It is that there is only one future, just as there is one past, and that we cannot know what it is. And that the one future will, in time, become the past. This is the Cassandra view of the fatalist and the fortune teller: 'whatever will be will be', we just have to wait and see what it is.

People may have a preferred way of thinking about the future, but it need not be exclusive. Someone of a fatalist tendency will nevertheless make active choices, happily selecting, say, a meal from a restaurant menu, or perhaps spending a pleasant afternoon choosing a destination for a vacation.

Alternatively, someone who tends to view the future as many-branched may, when faced with situations where choices do not appear clear or easy, simply resign themselves to whatever happens.

The many futures way of thinking, let's call it 'mode a', is perceived as more positive and conducive to proactive ways of thinking than the fatalistic, one future, 'mode b', way of thinking.

To illustrate these modes consider a simple toss of a coin. Asked to place a wager on the probability of the coin coming down heads, few people will have difficulty. It is generally assumed that a fair coin has an even chance of landing either heads or tails. The coin is tossed, placed on the back of a hand and hidden from view by the other hand. Those that have wagered are again asked what the odds are. Most will be content to continue saying the odds are 50/50 – but a few will notice that while the wager is still fair the situation has changed. There is no probability now. The coin is either heads or it is tails. What has changed is that the uncertainty is no longer about the probability of a future state but about the actual state of the coin. Before the coin toss the information available was the best possible; after the toss it is not. Awareness of this can change thinking from mode a to mode b.
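The coin toss illustration lends itself to a few lines of code. This is a purely illustrative sketch (the function names and seed values are our own, not part of the original argument), assuming Python:

```python
import random

def estimate_heads_probability(trials: int = 10_000, seed: int = 1) -> float:
    """Mode a: before the toss, the best available information is the
    long-run frequency of a fair coin; simulation recovers roughly 0.5."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(trials))
    return heads / trials

# Mode b: once the coin has been tossed and concealed, it is already
# heads or tails. The fair 50/50 wager is unchanged, but the uncertainty
# is now about an existing state, not about a future event.
rng = random.Random(2)
concealed = "heads" if rng.random() < 0.5 else "tails"

print(estimate_heads_probability())
```

Before the toss, the simulated frequency is the best information anyone can have; after it, no amount of simulation adds anything – the state of the coin is simply unknown.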

Project estimation and risk workshop type activities are concerned with evaluations (literally) of a possible future. Estimation activities are not simply attempting to determine numbers (durations, or costs) for possible futures but considering alternatives and exploring 'what ifs' in order to envisage the optimum future from among the possibilities.

The dissatisfaction often encountered with project estimations and risk registers may be due to estimation and risk evaluation requiring 'mode a' thinking, but actually being undertaken by those in (the more common?) 'mode b'. That is, in mode a people will tend to be proactively reviewing options and possible futures and making plans accordingly, with the best possible information. But for those in the fatalistic, one future, mode b state, the invitation to evaluate risks or estimate some aspect of a planned project is an invitation to reveal their ignorance of the future – with the prospect of others revealing their ignorance too. The impact of this on estimates or risk evaluations may or may not introduce some form of quantitative bias – but it does seem to limit the enthusiasm for these activities, and the value of any results that end up in estimates or risk registers. This is perhaps what is being perceived by Nik, and by others frustrated with moribund risk registers and unsatisfactory estimates.

So, as a first step it may be useful at least to be aware of these modes. And when undertaking estimation or risk activities, to attempt them only in mode a.

 

Whether or not it is possible to flip from mode b to mode a, at will, remains to be seen.

June 2015: Innovation Seminar at Deutsche Bank

 

Back from Deutsche Bank and the Innovation seminar. Great time – I’ve seen the future and it is at Intel. Robot/avatar thingies are roaming the corridors of Intel and they really seem to work. Absolutely uncanny. What sold it for me was the reactions of people meeting a thingy rolling towards them – they say hello. I want one – or two.

 

If last year was Japan’s year – the Yukikogram made it worthwhile for me, backed up by the excellent process work by Sony/IBM – then this year it was the US that made it worthwhile. Really good forward thinking from Intel and a great set of ideas from Mark Kennaley, who let us have a copy of his really, really interesting ‘Value Stream’ book. This is the first time I’ve seen these ideas articulated – I don’t agree with all of them but they really need wider discussion. And clear thinking from Seattle. Nearer home some thought provoking talk from Nik S – see above – and lots of sensible discussion too.

 

Apart from being colourfully sick on the Underground it was a really good week. After a lot of difficulty (and far too many pictures of light bulbs from the web – this is my favourite – not sure if it’s dumb or ironic) I managed to get a talk about innovation together by revisiting some earlier ideas I’d forgotten about. Some of which could do with reworking.

June 2015: Unwisely revisiting SAFe

 

A chance comment in a forum prompted me to revisit SAFe – and I wish I hadn’t. Initially annoyed by it all over again (some sympathy for KS here) and at the presumption of the model, I nevertheless found myself metaphorically peering through my fingers and thinking I could use that bit, and this…. My reaction seems to be pretty much par for the course. Irritation at the engulfing of useful small scale models that had been a reaction to such models, but also a recognition that such models will be used, badly mostly, along with some cynicism at the motivations of the developers of SAFe. Given that the model does exist, I’d suggest the best way to make use of it is to avoid ‘official’ interpretations and certified SAFe vendors and to simply pillage it for the good bits of this generic reference model – having thought carefully about what the good bits might be – and use them as aids and components for the development of the organization’s all important and unique operational model, which each organization has to figure out and put in place for itself.

June 2015:  Innovation Seminar at Deutsche Bank

 

I’ll try to blag my way in to this year’s Gilbfest at Deutsche Bank. The theme is important and I’m sure that there will be an interesting debate. Still working on my ideas – just a heap of notes and books to re-read at present. Currently racing through Neil Postman’s ‘Technopoly’, so I’ll have to find some more upbeat stuff before I settle on my innovation story.

March 2015: CMMI Institute visits UK

 

Had a chance to meet people from the CMMI Institute (CM2I2?) who were here for a conference (run faultlessly by Unicom – as you’d expect) in London. Interesting to hear what they intend to do with CMMI. Hopefully it will raise awareness of CMMI as a useful tool for almost all software organizations. They were rather hard on themselves, and the model – but better that than overselling. And they do seem to have some blind spots too – perhaps too focussed on their reference model rather than the all important operational models? But the SEI had the same problem; after all, their asset is the reference model. I also had a chance to present, due to people dropping out, so I tried to raise awareness of measurement in general and measurement in CMMI with this presentation.

 

( Unfortunately there appeared to be little time for questions or discussion. I’d have liked to reveal my favourite measure of software size, since there seemed to be an interest in s/w measurement. It’s the King James Bible. It’s about 100,000 lines of prose – remarkable, lyrical prose, but prose of a similar line length to typical lines of code – so a KJB is roughly equivalent to 100,000 lines of code (a lakh loc!). This is useful because we can visualize a bible, both in terms of comprehensibility (about a brainful?) and in terms of the labour to produce it. So: visualize or estimate code, or count extant code, in terms of KJBs. Most of us have a reasonable ability to estimate how long it would take to read, understand or edit, but perhaps not produce. )
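Numerically the KJB is just a unit conversion; a hypothetical Python helper sketches it (the constant and function name are our own illustrative choices, not an established metric):

```python
KJB_LINES = 100_000  # approximate length of the King James Bible in lines

def loc_to_kjb(lines_of_code: int) -> float:
    """Express a line count as a multiple of one King James Bible."""
    return lines_of_code / KJB_LINES

print(loc_to_kjb(250_000))  # a 250 kloc system is about 2.5 'bibles'
```

The point of the unit is not precision but visualizability: a codebase of 'two and a half bibles' is easier to picture, and to cost, than '250,000 lines'.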

5 March 2015:  BCS SGIST, Defect Measurement and Analysis Workshop

 

Just back from the testing conference – good to see some old friends and hear what’s going on in the testing world – which is quite a lot at the moment – factions forming and splitting, to keep people on their toes, and technology is driving testing and software quality in new directions too. Interesting talk from Chris Ambler – a curate’s egg, but a good proposal for a new BA/Tester role. (Not new really – good analysts have been working with testers for ages, to great effect too – but I’ve never heard of this being acknowledged before.) Chris’s proposal is a good one – but a terrible name; ‘Business Transformist’, for heaven’s sake! Prefer BA/Tester (or BAT?).

 

The workshop was useful, but problematic. Always good to hear what’s going on at the pointy end – lots of useful ideas (from the delegates) and a few home truths too. Not sure we were particularly useful, but it was an interesting challenge having to adapt my presentation to a random subset of my slides, in random order, in half the expected time. Oh well….. Just for the record here is my set of slides – in the order I intended, too….

3 March 2015 Software Measurement and Analytics course schedule

And again, just updated the course schedule – running one every month now – if anyone wants one of our other courses then get in touch and we can set something up.

18 September 2014  Software Measurement and Analytics courses

We’ve just updated the course schedule for the rest of the year and are now working through the materials to include the latest thinking and refine the remainder. (The work we’ve been doing on the UKSMA defect manual has also prompted some thinking on the analytics side of things – perhaps we’ll post more about this in the next few days…. )

 

(Good luck Scotland! – not sure what the right decision is, maybe not that important as borders mean less and less….)

9 August 2014 Tom Gilb’s seminar in June

Fortunate enough to be at Tom’s annual seminar, hosted by Deutsche Bank again. The theme was high level requirements so I decided to present my experiences – many of which are quite mixed – see www.osel.co.uk/presentations/Youstartcoding.pdf . It seems I wasn’t alone in having mixed experience either. Many there related issues arising mostly from human factors – power, fear, etc. – so in general much of what we had to say was quite downbeat. The striking contrast was the contributions from Japan. We were fortunate to have people over from Sony and IBM Japan. Perhaps they were off topic but it was great to hear how quite evidently world class organizations address the issues of software development. Interesting ideas in root cause analysis – with some cultural oddities, at least to a European eye. And one of the highlights was a brilliant graphic from Yukiko of IBM – the ‘Yukikogram’ – that revealed project activity brilliantly, but in a way I’m not allowed to share yet.

9 June 2014:  BCS SPIN SG – Invitation to participate in the Autumn Workshop….

 

BCS SPIN SG

One Day SPI and Software Process Engineering Methods and Tools workshop

9th October 2014

BCS Offices, Southampton Street

 

The autumn meeting of the BCS SPIN SG will be about sharing experiences of methods, techniques and tools for software process engineering and software process improvement.

All sectors of the software industry have a need to develop and adapt working methods to better meet technical or commercial needs. SPI is well established in mature software engineering organizations, although success is still hard won and prone to distraction by the drive to comply with various models. The commercial sector is gaining experience with contemporary development methods that often include feedback loops. Perhaps the most challenging area is the public sector with conservative cultures generally resistant to change. All have something to contribute and all can learn from the experience of others.

(We don't want to hear about generic models like CMMI or kanban or lean. These are well known. We do want to hear about the tools and techniques - that may well be parts of these models or help implement these models - that have enabled change and improvement.)

The meeting will be structured around a number of forty minute sessions where practitioners can describe the experiences they have had with process engineering/improvement tools. Each session will be divided into two parts: twenty minutes presentation, then twenty minute questions and answers and discussion.

We are particularly keen to hear from practitioners working within their organization, rather than from consultants (which rules me out). Presentations providing evidence of the value of the tools will be given priority.

If you would like to share your experiences please email me Clifford Shelley at shelley@osel.netkonect.co.uk with a brief description of your talk.

28 March 2014:  Two software measurement notes

 

Two papers I wish I’d had time to talk through at the recent metrics workshop – firstly a position paper presented by Barbara Kitchenham in 1993 discussing the principal reasons why software measurement data is not comparable, and a note on software measurement ‘maturity’.

17 March 2014:  Software Measurement and Software Development Analytics courses in May

 

We never did finalize the contents of the Estimation workshop last week – which made for a few sleepless nights. Nevertheless most of the people that attended seemed to get some value from it. I presented sections from recent conference presentations and parts of the principles and practice of software measurement course. As usual I tried to do too much and probably hurried the material. And I had to leave out some material I particularly wanted to cover – which was annoying. Now we’re revisiting the PAPOSM and SDA course materials in preparation for the courses in May – see our events page to find out more.

4 February 2014:  Upcoming events…

 

The BCS SPIN SG is hosting a joint event with the UK Software Metrics Association. We are having a one day estimation workshop on 14 March at the BCS Offices at Covent Garden. Draft details can be found at the UKSMA web site www.uksma.co.uk – on the workshops page – and we’ll be posting details on the BCS website, LinkedIn and here when we know more about who will be doing what.

 

And we will be presenting our Software Measurement and Software Development Analytics courses on 22 and 23 of May – see our events page.

15 October 2013:  We’ve posted our Measurement and Analytics training schedule – and are starting to plan the next BCS SPIN SG workshop…

 

We’ve just published our schedule of analytics and measurement courses for the first half of next year – see our events page. And we’re starting to think about another BCS SPIN workshop sometime early next year too. Last year the workshop was presented by two consultants, but as everyone knows the really good stuff comes from those that are doing SPI within their own organizations - so we are looking for some of those people to share their know how and experience. If you are one of them email me at shelley@osel.netkonect.co.uk.

22 August 2013:  Office closed for a few days

 

We’re off to the sun for a few days – back in the office on September 2nd – we’ll pick up emails and messages then….

6 July 2013:  UKSMA Summer Seminar at Canterbury Christchurch

 

A really interesting day at Canterbury: UKSMA has decided to do more than just the annual conference, and Canterbury was the venue for the first summer seminar. A diverse audience getting their first taste of ‘software measurement’ reacted in unexpected ways – they recognized the issues raised by my eXtreme Measurement talk and reacted really, surprisingly, positively to my ‘planning poker’ demo. (Another example of a technique that reifies the seemingly intangible and abstract?) And many thanks to Frederico for introducing me to the ideas of Karl Weick – his ‘loose coupling’ explains many of the things we see in software development. A good day – we should do it again.

June 2013:  GilbFest ’13 at Deutsche Bank

 

Another week of fun at Deutsche Bank with G’13. The usual variety of thought provoking and provocative thinking from some of the regulars, together with opportunities to hear from some of the software world’s genuine stars. All great stuff, plenty of material to absorb and process. It was good to hear ideas on contemporary dev from within Intel – a bit out of my class – but it did help me understand why kanban boards are so ridiculously popular. My contribution to this year’s theme of ‘Management Methods’ was to recover some of my earlier work on the evaluation of software development (and management) methods and tools. My talk identified work done by DESMET and proposed a more focussed, chicken-like approach to evaluation that delivers the most useful and actionable results – you’ll have to look at the slides.

May 2013:  Software Visualization

 

I’ve been helping with the University of Hertfordshire’s ‘Multi Dimensional Timelines’ project to develop visualizations of software projects – more to follow – and have been prompted to hunt down other beautiful visualizations. Just for fun, here – http://vimeo.com/1093745 – is a visualization of the development of Python using code_swarm. And here http://code.google.com/p/gource/wiki/Videos are some Gource videos showing code growing. D3’s CodeFlower is a development of these and I may see if this can be developed further as part of our analytics work…

May 2013:  The next BCS SPIN SG Meeting on 6 June : ‘The First UK SEMAT Workshop’

 

You are invited to the first UK SEMAT Workshop

 

Title:     The First UK SEMAT Workshop

 

Description:

 

The SEMAT (Software Engineering Method and Theory) 'call to action'

published by Ivar Jacobson, Bertrand Meyer and Richard Soley expressed a desire to improve our understanding of software engineering:

 

 Software engineering is gravely hampered today by immature practices.

Specific problems include:

 

 - The prevalence of fads more typical of fashion industry than of an engineering discipline.

 - The lack of a sound, widely accepted theoretical basis.

 - The huge number of methods and method variants, with differences little understood and artificially magnified.

 - The lack of credible experimental evaluation and validation.

 - The split between industry practice and academic research.

 

We support a process to refound software engineering based on a solid theory, proven principles and best practices that:

 

 - Include a kernel of widely-agreed elements, extensible for specific uses

 - Addresses both technology and people issues

 - Are supported by industry, academia, researchers and users

 - Support extension in the face of changing requirements and technology

 

 

The SEMAT initiative is an ambitious attempt to establish a common ground; to 'refound' software engineering, providing a rational, communicable and evidence based model for SE.

 

This free one day workshop is intended for both software professionals and academics to explore the rationale for SEMAT, the SEMAT model and the SEMAT 'kernel' – an actionable, extensible and practical model of 'the things that matter'.

 

We will begin the day by surveying the SE landscape that led to the SEMAT call for action, then exploring the SEMAT model and the kernel itself, and finding out how to use them.

 

We believe this is the first time this model has been presented in the UK and that it is an opportunity not to be missed by software process engineers.

 

 

Speakers:           Ian Spence

 

Our principal speaker will be Ian Spence. He is the European CTO and Chief Scientist of Ivar Jacobson International and a co-author of the recently published 'The Essence of Software Engineering: Applying the SEMAT Kernel'.

 

 …and we have Dr Paul Ralph to explore the GTSE side….

 

 

 

Date: Thursday, 6 June 2013

Venue: BCS London Offices, Southampton Street

Time: 9.30 - 16:30

Meeting Format: Interactive Workshop with some Presentations

Attendance: Free

Sustenance: Drinks only (you will need to provide your own lunch)

Registration: Spaces are limited and registration is required:

To book online go to http://www.bcs.org/content/ConWebDoc/50516

April  2013:  Deep Kanban visualization

 

Having been rude about the deep kanban visualization I saw at LLKD13 recently, I’ve put my money where my mouth is and had a go at a different visualization. I quite like it – it’s unconstrained and adaptable, and should limit comparison. See it here.

March 2013:  ‘Getting Started’

 

Just found this in the archives. Presented at a British SPIN meeting, it tried to describe how to do real SPI in the face of increasing numbers of organizations simply getting a ‘badge of maturity’. And we were promoting a couple of our SPI tools.

 

Obviously we failed – CMM/CMMI appraisals have become audits, but the presentation is still interesting – and useful for those looking to adopt contemporary models of better ways of working.

March 2013:  BCS LLKD13 at Southampton Street

Great day – really useful stuff. I was traumatised when I realised it was to be on a Saturday but managed to get in on time for a coffee before Mike B. eased me into current kanban thinking – good, useful start. The first keynote was interesting (I wasn't aware of this work, but like the ideas – as, quite obviously, does DS) but too many big words too early in the day. My brain filled up and tripped out before we got to the punchline. I'll have to find out more later. Two things struck me during the day – the interest in making these ideas operational – putting the stuff into action, especially the occasional nods to experimenting and 'doing science' - and the relaxed approach to 'mix 'n match'. Zealotry well under control. Really nicely grounded – not too much nonsense.

There is a bucketful of PhDs to be had from investigating info radiators/boards/post-its. Something odd is happening – the appeal of a determinedly low tech tool, how they're used and how well they are liked needs to be understood better. Not sure if this is transitional tech – the electronic boards just aren't convincing, but surely we can't go on using post-its? Strange.

And I really want to hear more from Spotify – really good talk. I'd love to know how they got to be that way, and how they're going to deal with dependencies – they're at the finger in the dyke stage at the moment – and are going to need some serious tools to manage dependencies in the future.

And the final keynote was a hoot - some entertaining flim flam and a truly epic graphics atrocity that will keep students of metrics malpractice happy for a while.

Our presentation on the perils of adopting manufacturing techniques for design intensive work seemed to go down quite well, which was a relief. This was the first time we’d presented in this community and it wasn’t clear how it would be received. In fact it was not as contrarian as I thought it might be. And I was astonished that no one showed any interest in the fact that the costs of exploratory, trial and error software process improvement – as ‘Action Research’ – can be reimbursed, 100%. People just don’t seem to believe it. The slides for this are here.

Many thanks to all the organizers – including the BCS who are getting really good at making this sort of thing work.

February 2013:  A new duplex supplier evaluation service

 

Check out our new duplex supplier evaluation service. This unique approach recognizes that there are two parties to a commercial relationship and that the success of both of them depends on both of them and their ability to work well and communicate with each other. Think of it as B2B matchmaking – more information on our supplier evaluation page.

February 2013: The first BCS SPIN SG meeting of 2013

 

The SPIN SG meeting of 2013 went well – good group of people, good discussion, no sales pitches, and some useful material. We’ll do it again. Notes from the meeting can be found here.

December 2012: The first BCS SPIN SG meeting of 2013

 

The SPIN SG meeting of 2013 is a free one-day SPI workshop. It will be taking place at the BCS London Offices at Southampton Street on 5 February, with arrival from 9.00 for a 9.30 start, finishing at about 5.00.

This workshop is designed for those responsible for, or actively engaged in, software process improvement, process engineering, or organizational transformation – including the transition to agile software development – who are seeking to improve their effectiveness and to share their own ideas and experiences.

This wide-ranging, interactive workshop will include:

- the origins of and original motivations for SPI, giving insights into the strengths and weaknesses of process improvement models;

- the 'ten rules of SPI', inspired by recent discussions of SPI practice (and malpractice), and work to develop an 'SPI manifesto';

- the 'Energizing CMMI' seminar, which describes how 'Big SPI' programmes can be recast to deliver genuine benefits in short timescales;

- 'Forget Process, Focus on People', which will present new ideas on a frequently neglected aspect of SPI: in design intensive and knowledge working environments the fundamental resource is people – how can they give their best?

- the top five SPI tools: tools are a powerful but often overlooked resource. We take a look at these: which ones work best? What tools should your SPI tool set contain?

- agile approaches to SPI: there is growing interest in applying agile approaches to SPI work. This exciting new development is returning SPI to its roots. This talk will survey contemporary agile PI approaches and compare them with proven models and tools for agile process improvement;

- SPI traps: SPI work is notorious for its high (but rarely reported) failure rate. We will identify the major traps that can derail SPI work, and how to avoid them;

…and we will be reviewing the most useful SPI resources – books, references and other readily accessible materials.

Details will be posted on the BCS SPIN web page soon. To book now please email me at: shelley@osel.netkonect.co.uk

December 2012: The first Software Measurement and Analytics Courses of 2013…

 

The Principles and Practice of Software Measurement and Software Development Analytics courses will be taking place for the first time in 2013 in mid-January (Thursday the 17th and Friday the 18th) in Central London. Take a look at the course details by clicking on the links above or by visiting our events page.

 

And…

 

…the next BCS SPIN SG meeting will be held on 5 February at Southampton Street. This is going to be a free one-day workshop on SPI itself. We will be taking time to look in depth at SPI practices, identifying what works and what doesn't, slaughtering a few sacred cows, and having some fun.

 

Details will be posted here and on the BCS SPIN web site in the next few days. If you’re interested email me to reserve a place.

November 2012:  OSEL’s Software Development Analytics and Micro-Analytics services at zero cost for most EU clients!

 

We are delighted to announce that our clients in the UK and most parts of the EU are now able to take advantage of our analytics and micro-analytics services (and some of our other services) at zero cost!

 

Email us at shelley@osel.netkonect.co.uk to find out more.

November 2012:  UKSMA Software Measurement Conference.

 

Just back from the UKSMA conference, having presented on ‘Agile Software Measurement’. I wasn’t expecting this to be well received – it criticises the current state of software measurement (and offers a way forward) – and it wasn’t, but not for the reasons expected. My presentation was a late entry, replacing Brian Wells who couldn’t make it. I found myself presenting in the FP stream to the FP specialists, who, while not unduly hostile, clearly had zero interest in s/w development practice, software measurement for developers, or software for that matter. This community appears divorced from the software producers. It appears to provide administrative services to, typically, large, publicly funded organizations, providing information on the costing of large software systems. They undertake analysis of FP data, providing evaluations of s/w productivity in terms of FPs per man-day.

 

It was disturbing to see such a disconnect between software development and those involved in software measurement. (In the UK software measurement is seen as almost synonymous with function point counting.) It is also alarming to see how primitive analytical models of productivity (FPs per man-day, etc.) are transformed into damaging performance measures, apparently wilfully ignoring the realities of software development and the value delivered by software. Some of the FP presentations did provide genuinely interesting analyses of the variability of s/w ‘productivity’, but these analyses took no notice at all of the error-proneness and bias in the data – uncertainties were dismissed without discussion – or of the value of the FPs delivered. And there are serious errors in the perception that a s/w product’s ‘size’ can be estimated from outline requirements.
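The error-proneness point can be illustrated with a hedged sketch (all numbers are invented for illustration; nothing here comes from the conference data): even if every team had exactly the same true productivity, plausible noise in FP counting and effort recording alone would produce a wide spread of apparent FP-per-day figures – the kind of ‘variability’ that then gets analysed as if it were a real performance difference.

```python
import random

random.seed(1)

# Hypothetical illustration: every team has the SAME true productivity
# (10 FP per person-day), but FP counts and effort records are noisy.
TRUE_FP_PER_DAY = 10.0

def observed_productivity():
    true_fp = random.uniform(200, 2000)                  # true size of a project
    true_days = true_fp / TRUE_FP_PER_DAY                # true effort
    counted_fp = true_fp * random.gauss(1.0, 0.15)       # ~15% counting error
    recorded_days = true_days * random.gauss(1.0, 0.20)  # ~20% timesheet error
    return counted_fp / recorded_days

rates = sorted(observed_productivity() for _ in range(1000))
print(f"median 'productivity':  {rates[500]:.1f} FP/day")
print(f"10th-90th percentile:   {rates[100]:.1f} - {rates[900]:.1f} FP/day")
# The spread here is pure measurement noise - ranking or paying
# suppliers on these figures would reward noise, not performance.
```

Changing the two noise figures shows how quickly the apparent spread grows; the point is only that an analysis of productivity variability that ignores measurement uncertainty can find ‘differences’ where none exist.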

 

The model of s/w development shared by the FP ‘cost accountants’ appears to be one where greedy s/w suppliers operate a ‘black box’ software factory extruding  FPs at a given rate and cost, with homogenous, interchangeable  ‘resources’ working in this black box. The role of the FP community appears to be to ensure that the ignorant client is not fleeced by the greedy supplier. Attention is focussed on theological discussion about the best type of FP counting mechanism and the search for industry standard productivity figures to apply to this nightmare model.  I believe the impact of the FP community’s analytical metrics when transformed into operational performance metrics, in the course of contract negotiations, and then used for monitoring and control of the s/w producing ‘black box’ is simply not understood. I am beginning to understand how ‘productivity’ measures enable analysis and comparison in the large, but the damage they do when used as operational or performance metrics doesn’t seem to be acknowledged at all.  Perhaps this ignorance is our best hope of repairing this dreadful model. If the damage analytical FP productivity measures can do when they become performance figures can be recognized and understood then there is hope of building Chinese walls, or finding some other mechanism to prevent their unintentional misuse. It would be good to share understanding of where the value of software actually lies. There need to be ‘reconciliation talks’ to enable the various perspective to begin to share understanding – perhaps as agile s/w development is adopted by the UK government such opportunities will arise?

 

Consider this. We could evaluate the activities of surgeons by measuring the length of the incisions they make, perhaps augmented by measures of the amount of tissue they remove. Busy, productive surgeons will cut more and remove more, obviously. Perhaps the measures should be the number of cuts, and the degree of disease or damage to the tissue, but essentially we have useful proxy measures – it’s what surgeons do. And maybe consider successful outcomes and deaths too. An analysis of such surgeon ‘productivity’ data is feasible, if odd.

 

What would happen if these measures were to be used to assess future surgeon performance and remuneration?

 

It’s similar for software – FPs and LOC produced can be, and are, evaluated and analysed to compare s/w productivity – it’s what s/w people do: make software. And such analyses can be useful, really – no irony here. But what happens when these measures become performance measures? Like the surgeons’ productivity measures, they induce dysfunction.
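A second hedged sketch of the dysfunction (again, invented numbers, purely illustrative): once FP/day becomes a performance target, any effort diverted into inflating the count raises the reported figure while the value actually delivered falls – a Goodhart’s-law effect in miniature.

```python
# Illustrative model: a team splits its day between real feature work
# and 'measure gaming' (inflating the counted FPs without adding value).
# All numbers are invented for illustration.

def reported_and_value(gaming_fraction, fp_per_day=10.0, gaming_boost=3.0):
    real_days = 1.0 - gaming_fraction
    real_fp = fp_per_day * real_days                           # FPs that carry value
    inflated_fp = fp_per_day * gaming_fraction * gaming_boost  # counted, but valueless
    reported_productivity = real_fp + inflated_fp              # FP/day as measured
    delivered_value = real_fp                                  # value actually delivered
    return reported_productivity, delivered_value

for g in (0.0, 0.2, 0.4):
    reported, value = reported_and_value(g)
    print(f"gaming {g:.0%}: reported {reported:.1f} FP/day, value {value:.1f} FP/day")
```

In this toy model, with 40% of effort spent gaming the count, reported productivity rises by 80% while delivered value falls by 40% – the measure and the thing it was meant to track move in opposite directions.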

 

I fear that the current inability to distinguish analysis from performance measurement is leading us to follow in the footsteps of the pioneer statisticians whose enthusiasm for biometrics led to eugenics.

 

Fortunately I was not alone in my recognition of the failings of current software measurement practice and in attempting to bring attention to the true, contingent, design intensive, human character of software development and that the value of software is only loosely correlated to the amount of software created.

 

I believe that these nightmarish models, so disjoint, not ‘orthogonal’, but disjoint, from the realities of s/w development are one of the reasons large, publicly funded software projects are so prone to terrible overruns and failure, and are, too often, such horrible places to work.

 

It was good to see other presenters at the conference attempting to present a more useful role for measurement and a more humane approach to software development – in particular Steve Parry’s ‘Grant Rule memorial lecture’ was excellent and very timely – Grant would have been pleased.

October 2012:  Winter Software Measurement and Analytics Courses.

 

The Principles and Practice of Software Measurement and Software Development Analytics courses will be taking place early next month in Central London during the same week as the UKSMA conference. Courses will be on Tuesday, 6 November and Wednesday, 7 November.

 

Take a look at the course detail by clicking on the links above or by visiting our events page.

September 2012:  The Next BCS SPIN meeting – A review and reworking of the BCS SPIN Manifesto

You are invited to the BCS SPIN SG evening meeting in London on Tuesday evening, 25 September 2012.

This meeting will be devoted to reviewing and reworking the SPIN manifesto (yes, we have one too; it’s good and predates that one by many years).

The BCS SPIN SG is the first non-US Software Process Improvement Network (SPIN), established after its founders had visited the US to help ratify the first release of the CMM, encountered US SPINs, and thought they were a great way of sharing technology transfer know-how.

 

Early SPIN interests were dominated by emerging ISO software standards, including ISO 9000-3 and ISO 9126, and the TQM toolset, with CMM only becoming popular later, despite being the inspiration for the founding of SPIN. We have also maintained a long-term interest in software measurement and quantitative methods, including six sigma, although so far these have not delivered on their promise – though things appear to be changing there.

 

We were one of the first organizations to recognize the value of agile development with early interest in DSDM and XP. And SPIN also provided the stimulus to the development of agile process improvement methods and tools.

 

 Since its founding the world has changed, with the massive expansion of software development worldwide, an increased understanding of the nature of software and systems development and development cultures, and the remarkable developments in software development and collaboration technologies. It is time to take stock, and, perhaps, redirect and refocus our attention and ways of sharing.

 

 Much will change but some things will not: We value sharing of real evidence, and especially understandable, credible data. We welcome scepticism, constructive criticism and challenges to the status quo. Our focus is on real world software development and management, not research or theory, and as a BCS SG, we try to be a hype free and sales pitch free zone.

 

 This meeting is intended for those with a real interest in understanding and developing the way we make software. It will begin with a look at SPIN’s work and the changes we have seen.  An open discussion of contemporary software and systems development opportunities and challenges will then be followed by a reworking of our manifesto and plans.

 

Date: Tuesday, 25 September 2012

Venue: BCS London Offices, Southampton Street

Time: 17.30 for 18:00 start - finishes at 21:00

Attendance: Free

Sustenance: Free coffee, courtesy of the BCS.

 

Registration is required. To register click here: https://events.bcs.org/book/385/

September 2012:  Autumn Software Measurement and Analytics Courses.

 

The Principles and Practice of Software Measurement and Software Development Analytics courses will be taking place in the next couple of weeks, on Thursday the 14th and Friday the 15th of September.

 

Take a look at the course detail by clicking on the links above or by visiting our events page.

June 2012:  Gilb 2012

 

Looking forward to Tom Gilb’s seminar later this month. The theme is ‘Principles, proverbs, practices, paradigms, patterns and heuristics’. (Not sure how heuristics crept in – should be pheuristics, which is my conflation of heuristics and furore and means: ‘to proceed by means of fads, crazes and fashion’ :) It’s being hosted by Deutsche Bank – thank you Paul – and it looks like there will be more people than ever this year. It should be good. I’ll be talking about software development antipatterns.

April 2012:  A Software Development Analytics paper from Microsoft….

An interesting and encouraging paper from Buse and Zimmermann looking at the need for, and potential of, software development analytics. Not sure I agree with everything they say, but clearly we are looking at the same thing, with similarly high expectations.

20 March 2012:   The 23rd UKSMA conference – call for papers…

The UK Software Metrics Association (UKSMA)

 

CALL FOR PAPERS

 

The 23rd Annual UKSMA Conference is to be held on Thursday November 8th 2012, in Central London.

 

The conference affords the opportunity to share experiences, ideas, case studies, and strategies for widening the relevance, appeal and benefit of measurement to the management of software development and maintenance. 

 

The theme of the conference this year, 21st Century Metrics, is wide ranging, reflecting the increasingly diverse nature of software measurement and changes in the software development and maintenance community.

 

We are keen to receive submissions/abstracts on the topics of:

·         innovative and original approaches to software measurement;

·         introduction of measurement to software development and maintenance environments;

·         the presentation of measurement data to aid decision making;

·         best practice in measurement;

·         estimation;

·         measurement use in contracts or outsourced arrangements;

·         the use and benefits of measurement data;

·         measurement in CMMI, ITIL, Prince2 and ISO contexts.

       

Case studies of these topics that provide a balanced view of the development and use of measurement would be particularly welcome.

 

You are invited to submit papers or presentations to  conferences@uksma.co.uk.  

 

Please submit a short description (a précis) summarising your presentation.  Each presenter has 40 minutes, including 10 minutes for questions.

 

It would be appreciated if you would indicate your intention to submit a paper as soon as possible, including a provisional title, even if a précis is not yet available, so that we can gauge the response. 

 

Target Audience

  • Practitioners who measure software activities and products and who use the data for project and product quality control, performance improvement and estimating
  • Managers who rely on measurements and estimates for decision-making

 

Paper Submission

Preference will be given to papers that have not been published elsewhere or presented at other conferences. At least one of the authors of each accepted paper must register for the conference (for a very low conference fee) and present the paper.


Timetable

Issue Call for Papers: mid February

Précis/short description: to be submitted by 14th May

Author Notification: Authors of accepted papers will be informed by 1st June

Final Presentation/paper: to be submitted by 28th September

Information about UKSMA can be found at www.uksma.co.uk.

February 2012  The Next BCS SPIN meeting….

You are invited to the BCS SPIN SG evening meeting in London on Monday evening, 5 March 2012.

For the first meeting of 2012 we are adopting a different format. We will have some fun with a formal debate, and the topic promises to make it very lively. Originally conceived as either a diagnostic of software development capability or, alternatively, a formal audit of that capability, the appraisal has developed, or degenerated, over the years, as have the expectations of those commissioning them, performing them, or otherwise entangled in them. This debate will explore the evolution of the formal appraisal and the value it delivers.

This is a free event with limited numbers, so places are on a first come, first served basis. The details are:

Title: “The Formal Appraisal considered harmful?”

Topic: A debate and vote on the motion “This house considers that formal appraisals are a waste of resources and are detrimental to the organization”

The speakers:

Proposing the motion:  Peter Leeson has many years of experience in formal appraisals and improvement models and is a visiting scientist with the Software Engineering Institute, having conducted appraisals and facilitated process improvement efforts around the world. He is a recognized speaker and will propose the motion that formal appraisals are detrimental to the organization being appraised.

Opposing the motion: Kieran Doyle is an SEI certified Lead Appraiser who leads all types of CMMI appraisals and works closely with organisations to guide their improvement activities towards achievement of their improvement goals. In the past two years he has worked extensively with the CMMI for Services model.

Date: Monday, 5 March 2012

Venue: BCS London Offices, Southampton Street

Time: 17.30 for 18:00 start - finishes at 21:00

Meeting Format: A debate. The motion will be proposed and a vote taken. The speakers will propose and oppose the motion and then debate. Questions and points will be taken from the floor. At the conclusion of the debate we will vote again.

Attendance: Free

Sustenance: Free coffee, courtesy of the BCS.

Registration: Spaces are limited and registration is required: To book online go to the BCS SPIN website and click on 'forthcoming events' or https://events.bcs.org/book/217/

November 2011  ACCU: First Contact

 

After joining ACCU earlier this year I attended a meeting of the Oxford Group. Excellent – interesting presentation, good discussion, and well grounded in the realities and practicalities of software development. I can recommend ACCU, check it out.

November 2011  Reference and Operational Software Models

 

At a recent meeting it was fascinating to hear the speakers coming to an unconscious consensus on the use of software development and process improvement models. The widespread failure of these models to deliver the benefits expected of them – despite best efforts, occasional successes, and the apparent soundness and usefulness of the models – led the participants to begin identifying the difference between generic ‘reference’ models, like CMMI and Scrum, and an organization’s unique ‘operational’ model that is the foundation for day-to-day development practices and the baseline for improvement. While the relationship between these two categories of models was never really made clear, let alone how they should be treated or used, it was encouraging to see a growing awareness of the distinction. For a discussion of how these should be used see: http://www.osel.co.uk/papers/energizingcmmi.pdf  p6.

November 2011  Metronos Project Scope agreed

 

The scope of the UKSMA Metronos project has been agreed – see ‘metronos body of knowledge – scope – final.pdf’ in metronos.pbworks.com

November 2011  RPI to become Agile Process Development

 

After several years of watching the relentless march of the agile movement we may be caving in and renaming our rapid process improvement toolset. RPI is the natural low risk way to transform a development capability (to agile development if you’d like) and it has most of the characteristics of agile software development itself – frequent small iterations, frequent feedback and validation, low risk, and fun.  Initial thoughts were to rename RPI ‘Agile Process Engineering’ but this has an unfortunate acronym so we’ll settle for Agile Process Development or APD. See our RPI page before it transmogrifies into APD.

November 2011  Two conferences in November

 

Back in the office after a busy couple of weeks.  We presented our ideas and experiences of ‘Software Data Analytics’ at the joint UKSMA/COSMIC conference after a last minute drop out. Interesting to get a fix on two measurement communities and to watch function pointers sniping at each other.  Now I have to analyse the feedback form data….

 

And – presenting for the first time at the 2011 ‘Next Generation Testing’ Conference. Talking again about the excellent work done by bwin, with slides co-authored by their test manager Gerald Grossmayer.

30 August 2011  The Next BCS SPIN SG Meeting

 

The next SPIN meeting is on 13 September 2011. We have two talks and it is the AGM. Details below….

 

You are invited to the BCS SPIN SG evening meeting in London on Tuesday evening, 13 September 2011. This is a free event with limited numbers, so it is on a first come, first served basis. We have two talks...

The first Talk....

Title: Forget Process Focus on People, (or “Do you know what your problem really is?”)

Description: Some 80% of all process improvement efforts fail – usually because they focus on the wrong thing. It is time that we start thinking about what we are actually trying to achieve before slavishly following some model or theory that was put together by potentially very smart people, in some academic context.

This talk aims to remind people what it means to work in a successful context, and what things are needed or expected before the level of quality that many organizations expect can actually be delivered.

Speaker: Peter Leeson

and the second talk....

Title: An Agile Case Study

Description: This case study reports how a European software organization, working between 2007 and 2009, transformed its already agile s/w development capability into a new business tool of strategic importance by using exemplary SPI practices to produce a highly predictable, scalable and concurrent development capability. It is believed that this capability is unique. Data showing how the development capability evolved will be presented and offered for analysis.

Speaker: Clifford Shelley

The Speakers:

Peter Leeson is a CMMI lead appraiser, instructor and consultant with 17 years’ experience in assisting international organizations with their improvement programmes and appraisals, and over 35 years in the software industry. He is an SEI visiting scientist. However, his focus is not on process, models, conformity or ratings, but is firmly based on doing what is necessary to produce quality for your customers. Models, etc. are interesting tools, but they are no more the solution than your configuration management tool. He does not believe in the one-size-fits-all solution that so many consultants seem to offer, but focuses clearly on using (and adapting) a variety of solutions and approaches to the specifics of the problem at hand.

Clifford Shelley is a consulting software engineer with experience of software development across various industry sectors and diverse development environments. He has been involved in all phases of software development and been responsible for the development of software systems and products. He is happy investigating and resolving software development problems in organizations working within severe resource constraints. He has a long standing interest in the software design process and managing software quality.

Date: Tuesday, 13 September 2011

Venue: BCS London Offices, Southampton Street

Time: 17:30, with an 18:00 start, finishing at 21:00

Meeting Format: Presentations and discussion, 'til 8.40, followed by AGM

Attendance: Free

Sustenance: Just tea and coffee this time (budget cuts, I'm afraid).

Registration: Spaces are limited and registration is required: To book online go to www.bcs.org/events/registration

August 2011: Next Generation Testing Conference , 2- 3 November 2011

 

We will be having fun at Unicom’s NG Testing conference in November. We will be talking about bwin’s approach to testing in its ‘state-of-the-art’ agile delivery process….

Case Study:  Testing in a Super Agile Environment

This case study describes how a European software organization evolved their already agile dev/test capability to meet the organization's needs for a dependable and rapid implementation capability ready to meet unpredictable commercial and legal requirements.

The way in which new ways of testing were identified and refined is described, showing how dedicated testers, embedded in agile delivery teams, supported the emergence of this novel, possibly unique, dev/test capability.

Data collected during this transformation from 2007 to 2010 are presented. They show how delivery teams became highly responsive and predictable, enabling cross-team synchronization and scaling of systems delivery, whilst systems quality continued to improve.

July 2011: Tom on Gerry

 

I was pointed to this item on Gerry Weinberg’s blog by Tom Gilb. It’s a fascinating description of how we adopt new technologies. At the moment there is a mania for agile development that is beyond reason. Yes, it is a very valuable approach – we’ve used it, support it, and promote it, where it will be useful – but it is difficult to talk sensibly about it, or to manage the transition effectively or appropriately, while the industry is in this mood. And ironically most ‘agile’ isn’t, but this is nothing new….

 

"Monday, June 06, 2011

 

Beyond Agile Programming

 

After being in the computing business now for more than half a century, one thing worries me more than almost anything else: our lack of a sense of history. In order to contribute my bit to addressing that problem, I've posted this essay—one that's sure to infuriate many of my readers, including some of my best friends. So first let me tell you how it came about.

 

While reformatting my book, Rethinking Systems Analysis and Design, for e-booking, I noticed a few places that might have needed updating to present realities. The version I was using was more than 20 years old, from just after the peak of excitement about "structured programming." In particular, there was a whole section entitled, "Beyond Structured Programming." As I contemplated updating that section, it dawned on me that I could almost update completely by substituting the name of any more recent "movement" (or fad) for the word "structured."

 

 I also knew how smart most of my readers are, so I figured they would see the same possibility without my updating a thing. Instead of changing the book, I decided to update the section and publish it on this blog. Why? Because I think it shows an important pattern—a script where only the names have changed over at least five decades. So, here is the section with "agile" substituted for "structured," just as "structured" had been substituted for some other fad a generation earlier.

 

The Restructured Essay

Before I proceed further with the task of rethinking systems analysis and design, I'd like to express myself on the subject of another great "rethinking" in programming—the agile programming revolution. Although this essay was written a generation ago (now two generations), and the agile programming "revolution" is now an exhausted fad (for most programmers), most of what this essay says still applies—though to the next rethinking fad, and the next, and the next. I believe it will still apply long after I'm no longer writing new editions. Why? Because our industry seems to require a new fad every decade to keep itself from being bored. So, just apply the lessons to whatever fad happens to be dominating the computer press at the time you're reading this.

 

Before anyone becomes overly enthusiastic about what the rest of this book says, I want to take stock of what this great agile rethinking has done. I don't claim to be starting a new revolution of the magnitude most of the fads claim, so I'd like people to realize how slow and how small the agile programming movement has been, in case they think this book is going to make much difference.

 

My own personal stock-taking on the subject of agile programming is based on visits to some forty installations on two continents over the past ten years, plus a few hundred formal and informal discussions with programmers, analysts, managers, and users during the same period. Because of the conditions under which these visits and interviews took place, I would estimate the sample is quite heavily biased toward the more progressive organizations. By "progressive," I mean those organizations more likely to:

• Send staff to courses

• Hire outside consultants, other than in panic mode

• Encourage staff to belong to professional organizations, and to attend their meetings.

Consequently, my stock-taking is likely to be rather optimistic about the scope and quality of the effects of agile programming.

 

The first conclusion I can draw from my data is this:

Much less has been done than the press would have you believe.

I interpret the word "press" very loosely, including such sources as:

• Enthusiastic upper management

• The trade press

• The vendors and their advertising agencies

• The universities, their public relations staffs, and their journals

• The consulting trade.

Although this may be the most controversial of my observations, it is the most easily verified. All you need do is ask for examples of agile programming—not anecdotes, but actual examples of agile behaviour and agile-produced code. If you're given any examples at all, you can peruse them for evidence of following the "rules" of agile programming. Generally, you will find:

 

a. Five percent can be considered thoroughly agile.

 

b. Twenty percent can be considered to follow agile practices sufficiently to represent an improvement over the average code of 1990.

 

c. Fifty percent will show some evidence of some attempt to follow some "agile rules," but without understanding and with little, if any, success.

 

d. Twenty-five percent will show no evidence of influence by any ideas about programming (not just agile) from the past twenty years.

 

Please remember: these percentages apply to the code and behaviour you will actually see in response to your request. If you ask software organizations at random for "agile examples," about two-thirds will manage to avoid giving you anything. We can merely speculate what they do, and what their code contains.

 

My second conclusion:

There are rather many conceptions of what agile programming ought to look like, all of which are reasonably equivalent if followed consistently.

The operative clause in this observation seems to be "if followed consistently." Some of these conceptions are marketed in books and/or training courses. Some are purely local to a single installation, or even to one team in an installation. Most are mixtures of some "patented" method and local adaptations.

 

 

My third observation:

Methods representing thoughtful adaptations of "patented" and "local" ideas on agile programming are far more likely to be followed consistently.

In other words, programmers seem disinclined to follow an agile methodology when it is either:

1. Blind following of "universal rules"

2. Blind devotion to the concept: anything "not invented here" must be worthless.

 

My fourth observation:

I have other observations to make, but now I must pause and relate the effect these observations have on many readers, perhaps including you. I recall a story about a little boy who was playing in the schoolyard rather late one evening. A teacher who had been working late noticed the boy and asked if he knew what time it was.

"I'm not sure," the boy said, "but I know it isn't six o'clock yet."

"And how do you know that?" the teacher asked.

"Because I'm supposed to be home at six, and I'm not home."

When I make my first three observations about agile programming, I get a similar reaction—something like this:

"These can't be right, because if they were right, why would there be so much attention to agile programming?"

 

In spite of its naive tone, the question deserves answering. The answer can serve as my fourth observation:

 

Agile programming has received so much attention for the following reasons:

• The need is very great for some help in programming.

• To people who don't understand programming at all, it seems chaotic, so the term "agile" sounds awfully promising.

 

• The approach actually works, when it is successfully applied, so there are many people willing to give testimonials, even though their percentages among all programmers may not be great.

 

• The computer business has always been driven by marketing forces, and marketing forces are paid to be optimistic, and not to distinguish between an idea and its practical realization.

 

In other words, the phrase "agile programming" is similar to the phrase "our latest computer," because each phrase can be used interchangeably in statements such as these:

 

• "If you are having problems in information processing, you can solve them by installing our latest computer."

 

• "Our latest computer is more cost effective and easier to use."

 

• "Your people will love our latest computer, although you won't need so many people once our latest computer has been installed."

 

• "Conversion? No problem! With our latest computer, you'll start to realize savings in a few weeks, at most."

 

So actually, the whole agile programming pitch was pre-adapted for the ease of professionals, who have always believed "problems" had "solutions" which could be mechanically applied.

 

My final observation is related to all of the others:

Those installations and individuals who have successfully realized the promised benefits of agile programming tend to be the ones who don't buy the typical hardware or software pitch, but who listen to the pitch and extract what they decide they need for solving their problems. They do their own thinking, which includes using the thoughts of others, if they're applicable. By and large, they were the most successful problem solvers before agile programming, and are now even more successful.

 

There's yet another lesson in all this that's much bigger than agile programming or any new hardware or software or process:

 

Our profession contains few, if any, easy solutions. Success in problem solving comes to those who don't put much faith in the latest "magic," but who are willing to try ideas out for themselves, even when those ideas are presented in a carnival of public relations blather.

Based on this lesson, I'd like to propose a new "programming religion," a religion based on the following articles of faith:

 

• There's no consistent substitute for a thorough understanding of your problem, though sometimes people get lucky.

 

• There's no solution applicable to every problem, and what may be the best approach in one circumstance may be precisely the worst in another.

 

• There are many useful approaches applicable to more than one problem, so it pays to become familiar with what has worked before.

 

• The trick to problem solving is not just "know-how," but "know-when"—which lets you adapt the solution method to the problem, and not vice versa.

 

• No matter how much you know how or know when, some problems won't yield to present knowledge, and some aspects of the problem nobody currently understands, so humility is always in order.

 

I realize writing a book is not the most humble thing a person can do, but it's what I do best, and how I earn my living. I'd be embarrassed if anyone took this book too seriously. We don't need another "movement" just now, unless it is something analogous to a bowel movement—something to flush our system clean of waste material we've accumulated over the years.

 

Where to read the original

If you want to check on my historical work, you can find the original essay (and many others) in Rethinking Systems Analysis and Design, available as an ebook on Smashwords (where you can probably see it in the free sample), Kindle, and Barnes and Noble.

June 2011:  Gilb Seminar

 

Back from an excellent seminar with Tom Gilb, hosted by the people at Deutsche Bank (here we all are).  Nominally themed around ‘solution engineering’, the day covered a wide range of topics: war gaming (sort of), the heuristics of engineering method (I think), rather too much psychology, and looking to other areas, sales for example, for means of identifying and evaluating stakeholder needs.  My presentation, ‘Designing designing’, about the software design process and how we end up with the one we do, is here. If I develop a paper from this I’ll post it here too.

February 2011: The next BCS SPIN SG meeting….

 

Title:                        CMMI Version 1.3: The Good, the Bad, the Ugly

 

Description:   The Software Engineering Institute has published a new version of the CMMI, version 1.3. This is a minor upgrade (wait for version 2!), but includes a significant number of changes in all three “constellations” (Development, Services and Acquisition), as well as in the corresponding training and in the SCAMPI appraisal methodology. Naturally, the SEI believes that all these changes are positive and progressive, but are they?

 

This presentation and debate will discuss the more important changes (as well as some of the minor ones) and will offer a frank opinion as to which changes are genuine improvements, and why, and which may not be as positive as they sound. Finally, a discussion will be led on whether this version will help the faltering process improvement industry after the credit crunch, or may just finish it off.

 

Speaker:    Peter Leeson

 

Peter Leeson has been involved in the software industry for 35 years, and has been a process improvement consultant for nearly twenty. He is a certified lead appraiser, a CMMI instructor, an SEI visiting scientist and the director of Q:PIT Ltd. Peter is also a recognised public speaker who has regularly been voted one of the best speakers at the annual SEPG-Europe conference.

 

His approach to process improvement is always to focus first on the business needs of the organization rather than on adherence to the model, combined with a pragmatic approach to quality improvement. He is currently working on a people-based approach to quality improvement, seeking to demonstrate the relationship between corporate objectives, the support provided to staff and the quality of the outcomes, rather than focusing on the systematic improvement of tools without consideration of their true purpose.

 

Date:                         Wednesday, 13 April 2011

 

Venue:                        BCS London Offices, Southampton Street

 

Time:                         18:00 - 21:00

 

Meeting Format:               Presentation and discussion afterwards

 

Attendance:                   Free

 

Sustenance:                   Sandwiches and refreshments provided from 17:30, with an 18:00 start.

 

Registration:                 Spaces are limited and registration is required. To book online go to www.bcs.org/events/registration

 

Membership of the Institute is encouraged for continued involvement and event attendance. To find out more about the Software Process Improvement Network (SPIN-UK) Specialist Group and other Member Groups please visit www.bcs.org/membergroups

February 2011:   Metronos Kick Off Meeting

 

UKSMA is now (finally) planning the kick-off meeting for the Metronos project. It is scheduled for 16 March at Experimentus’ offices in London. The initiative was seeded by this. The project’s terms of reference are here, a (very) draft agenda is here, and an early view (guess) of the structure of the metronomicon (?) is here. We are looking for enthusiastic software measurement experts willing to commit time and resources to this project. If you are interested then contact me.

February 2011: The next BCS SPIN SG meeting….

 

 

Title:                                                                      Overcoming Organizational Stupidity: Understanding the Requirements of Organizational Intelligence

 

 

Description:     

 

Organizational intelligence is a critical measure of the management capacity of an organization in a demanding competitive environment. Organizational intelligence can be blocked in various ways, including fragmented sociotechnical systems and dysfunctional management. Along with removing these blocks, the intelligence of an organization may be helped by appropriate technologies used in appropriate ways to tackle the inherent complexity of modern business in a dynamic environment. Relevant technologies include business intelligence, event processing, knowledge management, process improvement and social networking.

 

In his talk, Richard will present his framework for improving the intelligence of your organization and understanding the requirements for sociotechnical change.

 

 

 

Speaker:          Richard Veryard

 

Richard Veryard is well-known as an independent industry analyst. He spent many years working with the CBDI Forum as an expert on SOA and enterprise architecture, and was previously a Senior Member of Technical Staff at Texas Instruments, where he pioneered methods for component-based business, technology change management and business excellence. He runs workshops for Unicom on Organizational Intelligence. http://unicom.co.uk/orgintelligence

 

 

Date:                         Tuesday, 15 February 2011

 

Venue:                        BCS London Offices, Davidson Building, 5 Southampton Street, London WC2E 7HA

 

                              (see map at http://www.bcs.org/server.php?show=nav.14724).

 

Time:                         18:00 for 18:30, to 20:30

 

Meeting Format:               Presentation and discussion afterwards

 

Attendance:                   Free

 

Sustenance:                   Sandwiches and refreshments provided from 18:00, with an 18:30 start.

 

Registration:                 Spaces are limited and registration is required. To book online go to www.bcs.org/events/registration

January 2011

 

We were pointed to this paper by Prashanth Harish. It’s a really good summary of the issues s/w measurement faces. The challenge now is to raise awareness of these – and to find solutions.

December 2010

 

A new course for 2011

 

Data Analysis and Statistics for Software Developers and Managers

 

We are completing the materials for our new course, to be presented for the first time in early 2011. This course complements ‘The Principles and Practice of Software Measurement’, essentially picking up where that leaves off. It looks at ways of selecting, assembling and analysing complex and (usually) messy data to find out what it is trying to tell you. We’ve picked approaches that fit the nature of the data, are simple and quick, and deliver insights that you can act on. For further details go to the course page.

3 November 2010

 

Have just heard from Pat O’Toole that Watts Humphrey, founder of the SEI’s Software Process program and recipient of the National Medal of Technology, died on October 28, 2010 at his home in Sarasota, Florida.  He was 83.

 

To read about Humphrey’s legacy, see videos, read samples of his published work, and share your own story, visit www.sei.cmu.edu/watts.

October 2010: Rules for Management…

 

For interest…..

Kelly Johnson's 14 Rules of Management

1. The Skunk Works manager must be delegated practically complete control of his program in all aspects. He should report to a division president or higher.

2. Strong but small project offices must be provided both by the military and industry.

3. The number of people having any connection with the project must be restricted in an almost vicious manner. Use a small number of good people (10% to 25% compared to the so-called normal systems).

4. A very simple drawing and drawing release system with great flexibility for making changes must be provided.

5. There must be a minimum number of reports required, but important work must be recorded thoroughly.

6. There must be a monthly cost review covering not only what has been spent and committed but also projected costs to the conclusion of the program. Don't have the books ninety days late and don't surprise the customer with sudden overruns.

7. The contractor must be delegated and must assume more than normal responsibility to get good vendor bids for subcontract on the project. Commercial bid procedures are very often better than military ones.

8. The inspection system as currently used by the Skunk Works, which has been approved by both the Air Force and Navy, meets the intent of existing military requirements and should be used on new projects. Push more basic inspection responsibility back to subcontractors and vendors. Don't duplicate so much inspection.

9. The contractor must be delegated the authority to test his final product in flight. He can and must test it in the initial stages. If he doesn't, he rapidly loses his competency to design other vehicles.

10. The specifications applying to the hardware must be agreed to well in advance of contracting. The Skunk Works practice of having a specification section stating clearly which important military specification items will not knowingly be complied with and reasons therefore is highly recommended.

11. Funding a program must be timely so that the contractor doesn't have to keep running to the bank to support government projects.

12. There must be mutual trust between the military project organization and the contractor with very close cooperation and liaison on a day-to-day basis. This cuts down misunderstanding and correspondence to an absolute minimum.

13. Access by outsiders to the project and its personnel must be strictly controlled by appropriate security measures.

14. Because only a few people will be used in engineering and most other areas, ways must be provided to reward good performance by pay not based on the number of personnel supervised.

September 2010:  New Software Measurement Course for October

 

We have been busy developing and are now ready to present our updated software measurement course:

 

The Principles and Practice of Software Measurement

 

… next month, the day after the UKSMA conference. Go to the events page to find out more and to book a place.

June 2010:  Hybrid Agile/CMMI Software Development

 

This page is called ‘What’s New ?’. This note may be the first entry that really is about something completely new. We have been very fortunate to work with an Austrian software organization, bwin, which has done some remarkable work combining agile software development (Scrum) with the principles embodied in CMMI, supported by good tools and technology. By following an exemplary approach to process improvement, using the various models to meet business need, and being willing to adapt them as needed (to the dismay of the model disciples, both agile and process), bwin has managed to develop a highly predictable process used by all of its delivery teams. This unusually high predictability makes it possible to scale development, with teams working and communicating concurrently. We have never seen anything like this before. This work was reported at Tom Gilb’s seminar recently and again at the SEPG conference in Porto. We believe this may be a breakthrough.  - CCS

14 April 2010: Energizing CMMI

 

In July last year we presented the ‘Outcome Based Process Improvement’ webinar, which described some of the issues with current SPI practice and suggested ten ‘rules of SPI’. It was followed by the ‘Energizing CMMI’ presentation at a BCS SPIN SG meeting. Here is the paper that expands upon that webinar and presentation. It describes how SPI drifted into the thrall of the most influential and useful model for process improvement (CMMI) and the problems this causes. Approaches that mitigate some of these problems are proposed.

25 March 2010

 

A passing comment at a recent SPIN meeting, to the effect that Fagan Inspections are rarely used, triggered a debate and a renewed interest in the subject. Fagan Inspections are a real discovery by software engineering. They are the most effective software quality control we have and can deliver a 10:1 return on investment – yet they are increasingly less well known, with their use limited to a few engineering-oriented organizations. What is happening? Together with Chris Gerrard and Marilyn Bush we have produced a presentation to look at inspections and begin exploring the reasons for this curious neglect. We will be presenting ‘Fagan Inspections: The Silver Bullet No-one Wants to Fire’ on 25 March and later in the year at the next BCS SPIN meeting, which will be devoted to exploring this topic.

February 2010:  BCS SPIN SG

 

Details of the next SPIN meeting...

 

 

Title:                        Supporting a Process Oriented Requirement Method

 

Description:

 

Process modelling as a way to inform requirements has seen a resurgence in recent years, particularly in those methods that use Role Activity Diagrams. By using these models within the requirements phase, client needs can be captured effectively in a notation that makes sense to the business user, whilst also providing a rigorous description.

 

However, the move to specification still has pitfalls, notably in ensuring that the understanding gained from process modelling can be transferred effectively to the specification so that alignment of business need and software system is maintained.

 

This talk outlines issues and potential solutions in ensuring such alignment, and incorporates our recent experiences in attempting to provide tool support.

 

In particular, we describe tool support for model driven development (as part of the collaborative EC project VIDE), and the use of process mashups, including work undertaken at SAP research. In both cases our focus has been on providing sets of notations and tools which are accessible to a variety of users, often including those stakeholders who are not IT experts.

 

Mashups are a relatively new approach, and use web 2.0 technologies to combine data from different sources to create valuable information, principally for data aggregation applications. This utilises the potential of the internet and related technologies, to allow users to process tasks collaboratively, and form communities among those with similar interests. We present currently available mashup platforms and demonstrate how situational enterprise applications can be built by combining social software, feeds, widgets, web services, open APIs, and so on. 

 

The session does not require any specific technical skills, though experienced process engineers will have the opportunity to share their views.

 

Participants will learn:

 

·     How to use simple role based process models

·     Issues in moving from process model to specification

·     How to use simple notational devices to ensure alignment

·     How development tools can help, with a particular focus on the use of mashups

·     Current process mashup approaches and future directions.

 

Speakers:   Keith Phalp, Sherry Jeary, Dr Lai Xu

 

Keith Phalp is Associate Dean and Head of Software Systems and Psychology at Bournemouth University. He is particularly interested in the early, most crucial, phases of software projects, and in how best to produce software that meets the needs of its sponsors, stakeholders and users. This involves a variety of topics including: understanding business needs (strategic and operational), process modelling, software requirements, business and IT alignment, and software modelling. Dr Phalp has published extensively in these areas, has experience of process consultancy, including work with public sector bodies, both in the UK and Europe, and has taught process modelling to undergraduate, post graduate and industrial audiences. He was Principal Investigator on the EC funded framework 6 project VIDE, where Bournemouth led work to produce accessible models and interface, so that non-technical users could understand and be involved in requirements and specification of software. He has also published across a wide range of other computing disciplines, including software quality, process improvement, metrics and web methods.

 

Sherry Jeary is co-Director of the Software Systems Research Centre at Bournemouth University and has research interests that span Web Development Methods, Web Systems and the alignment of Information Technology with business (strategy and process). She had several years of management and systems experience across a variety of domains before moving into academia. She was the BU Project Manager for the EU Commission funded VIDE project on Model driven development and is currently investigating the production of useful models in complex enterprises.

 

 

Dr Lai Xu is a lecturer in Business and IT Alignment within the Software Systems Research Centre at Bournemouth University. Previously she was a Senior Researcher at SAP research, Switzerland, a Senior Research Scientist at CSIRO ICT Centre, Australia, a post-doctoral researcher at the Organisation and Information Group of the Institute of Information and Computing Sciences of the Utrecht University and at the Artificial Intelligence group of the Department of Computer Science of the Free University Amsterdam. She received her Ph.D. in Computerized Information Systems from Tilburg University in 2004. Her research interests include enterprise systems, support for business process collaboration, virtual enterprises and process mashups.

 

Date:                         Thursday, 4 March 2010

 

Venue:                        BCS London Offices, Southampton Street

 

Time:                         18:00 - 21:00

 

Meeting Format:               Presentation and discussion afterwards

 

Attendance:                   Free

 

Sustenance:                   Sandwiches and refreshments provided from 17:30, with an 18:00 start.

 

Registration:                 Spaces are limited and registration is required. To book online go to www.bcs.org/events/registration

 

Membership of the Institute is encouraged for continued involvement and event attendance. To find out more about the Software Process Improvement Network (SPIN-UK) Specialist Group and other Member Groups please visit www.bcs.org/membergroups

November 2009:  BCS SPIN SG

 

 

Details of the next SPIN meeting...

 

 

Title:                        A Review of the Draft Software Process Improvement Manifesto

 

Description:

 

In September this year software process professionals from around the world met to begin the development of a Software Process Improvement Manifesto.

 

This manifesto of 4 values and 14 principles is designed to ‘give expression to state-of-the-art SPI knowledge’ … ‘grounded on hundreds of [man]years of SPI experience’.

 

We will review these values and principles to explore their scope and content, compare them with our own knowledge – and perhaps get involved in the evolution of this document.

 

 

Date:                         Monday, 14 December 2009

 

Venue:                        BCS London Offices, Southampton Street

 

Time:                         17:30 - 20:00

 

Meeting Format:               Walkthrough of SPI Manifesto and Workshop

 

Attendance:                   Free

 

Sustenance:                   Sandwiches and refreshments provided from 17:30, with an 18:00 start.

 

Registration:                 Delegates must register; please go to

 

http://www.bcs.org/server.php?show=nav.9416.

16 October 2009:  UKSMA 2009

Back from the UKSMA conference unscathed.  My eXtreme Measurement paper didn’t appear to annoy too many people (paper and slides are on the presentations page), and there were some excellent presentations from others there – Jeremy Gardiner’s work on benchmarking using high-volume automated testing sounds promising, and Carl Bideau’s work for Capgemini is a model for developing good estimation models. I missed the IGQM presentation but will try to pick it up on the UKSMA website when the papers are posted there.

 

Sadly there weren’t enough takers for the tutorial day so I didn’t get to do my measurement patterns and anti-patterns tutorial – if there are any takers for this in house do get in touch.

October 2009:  Rule 11

A while ago I posted a list of ten rules of SPI (below). I’ve spoilt it by coming up with another ten (the other ten rules of SPI), and another rule (rule 11), which sounds like an unimportant tactic but may be one of the most important rules of all. I’ve just come across another example of how well it can work, which makes a pleasing change from all the instances of it being ignored.

24 August 2009

 

Here are the details of the Autumn SPIN meeting...

 

Title:                        Outcome Based Process Improvement and the Ten Rules of SPI

 

Description:

 

Software Process Improvement is intended to benefit the business but is often deflected by other concerns. This session has been developed from the presenter's 'Energizing CMMI' article and the recent webinar 'An Outcome-based Approach to Process Improvement' that was developed from it.

 

We will look at the pressures that have developed to distort process improvement and look at the fundamental SPI practices and approaches that return SPI to its true purpose: to help those doing the work to do a better job.

 

Ten rules for process improvement will be presented for discussion and development.

 

 

Date:                         Tuesday, 8 September 2009

 

Speaker:                      Clifford Shelley

 

Clifford has a background in software development. He has been helping organizations increase the effectiveness of their development and test capability for more than twenty years. He has particular interests in emerging development practices, software quality and the problems of software measurement.

 

Venue:                        BCS London Offices, Southampton Street

 

Time:                         18:00 - 20:00

 

Meeting Format:               Presentation and discussion afterwards

 

Attendance:                   Free

 

Sustenance:                   Sandwiches and refreshments provided from 18:00, with an 18:30 start.

 

Registration:                 Delegates must register, please send notification to: Mandy Bauer: mandy.bauer@hq.bcs.org.uk  or Clifford Shelley: shelley@osel.netkonect.co.uk

 

(Dietary requirements: Please notify Mandy Bauer)

23 July 2009:  Realigning SPI for difficult times

 

It has become increasingly clear that the SPI community is facing the same problems as the rest of the industry as budgets are reviewed and investments reconsidered. In-house process engineers are feeling vulnerable and vendors are having to rethink their offerings. The default long-term ‘faster, cheaper, better’ is no longer sufficient, and undirected performance improvement also misses the point. There are two very particular areas that software organizations are giving considerable attention to, and that the SPI community should be well equipped to help with – right now. After all, if it’s not their job, whose is it?

 

The two areas are, firstly, cost control, of course, and secondly, increased value. Cost control is naturally receiving the most attention. Difficult but easy-to-understand decisions do make an immediate difference to the budget. But without balancing cost control with increasing the value delivered, major damage can be inflicted on capability as intellectual and process assets are damaged or lost. And without increasing the value delivered, cost control may become irrelevant as customers disappear.

 

It is ironic that the SPI community has the tools to help with cost control and increasing customer value, but too frequently is either unaware of this or lacks the confidence to demonstrate it – and falls victim to cost control itself. The SPI community has a wide selection of tools for managing cost, but has perhaps grown complacent over time, with the focus shifting to compliance work and SPI ‘vanity projects’. And value delivered is often a poor runner-up to conformance issues.

 

Now is the time for the SPI community to show what it can do – what it is really meant for. It is time to help our organizations and customers and to select and use, in anger, those tools for reducing costs and increasing value.

 

If you would like help in getting your SPI team to focus on the current business realities, or help selecting or acquiring the tools to deliver value and reduce costs, email me at shelley@osel.netkonect.co.uk

8 July 2009:  Energizing CMMI

 

Here are the slides from the webinar on 23 June. The title was changed to ‘An outcome based approach to SPI’ to make it clear that good SPI is good SPI and not limited to CMMI frameworks.

6 July 2009: Tom Gilb’s 2009 Seminar – Changing Culture

 

This year the seminar concentrated on changing software culture (see the paper and slides below). There were plenty of insights and ideas, but perhaps the most surprising thing to emerge was the remarkable accuracy with which a culture can be described. While you would expect generic recognition points and values, it appears that behaviours can be described and predicted with uncanny accuracy. What also emerged is that no one really knows what to do with this understanding. The usual tools for change were identified, and a few new ones too, together with the promotion of favoured methods and tools, but there was nothing exploiting this capacity for acute observation and characterization.

 

We also had the opportunity to meet Jeff Sutherland and hear about his work. He was impressive, and as you would expect he had clear views of the direction software development should be heading. While this is understandable, it is a worry that, with a few notable exceptions, the big names in this industry promote a subset of the models or tools that software developers need. While promotion of a favourite, and, to be fair, useful, approach is of course to be expected, its promotion as the answer, rather than an answer, cannot be right. (I may set up a specialist group to promote the waterfall approach to restore some balance – anyone interested?) The time and effort required to compare, contrast and select from the many contemporary development models – DSDM, XP, Scrum, FDD, MDD, TDD, AMDD… – could be far better used. And the implicit presumption that because this is right, that must be wrong, is damaging. Other disciplines tend to progress by the gradual accumulation and dissemination of knowledge, but software development seems to be driven by cycles of fashion: packaging, re-badging and promoting a favoured subset, and dismissing prior, hard-won knowledge.

A notable exception to this habit was a presentation from one of the Construx folk. Construx (Steve McConnell’s organization) could certainly package and promote proprietary methods, but doesn’t. They take a clear-sighted and balanced view of the knowledge that software developers and managers should have access to – which I like and find more impressive. I’m sure that in the longer term this is of more value than fashion-driven development.

 

 

 

 

 

 

 

26 May 2009:  The Ten Rules of Software Process Improvement

 

Recently we have been involved in several discussions where people (including us) have expressed discontent and concern about the current state of software process improvement (SPI). This has prompted the drafting of a list to capture the essence of ‘good’ SPI. Our list of rules looks like this at the moment:

 

In no particular order….

 

 

 1. Improvements are owned by those affected by them.

 

2. Focus on fixing real problems getting in the way of business goals - if you aren't, have a d****d good reason.

 

3. Require rapid feedback (results) on the effect of changes...

 

4. ...and evaluate and act on them.

 

5. Use a model to provide a conceptual framework and scope (actually experience shows that two are better), know how to use it and who's in charge - don’t let model compliance become the primary objective.

 

6. Don't manage SPI as a project.

 

7. Measure progress by results, not schedule.

 

8. Tactics determine strategy. Strategies are valueless until you know what you can change in practice.

 

9. SPI is exploratory; many improvement efforts will fail. But these failures are offset by those improvements that work well.

 

10. SPI must pay for itself. Demonstrate this or stop.

 

 

Comments anyone?   After drafting these we found another set on the web by Yingxu Wang and Graham King.  Similar but different …

Rule 1: Software process improvement is complicated system engineering.

Rule 2: Software process improvement itself is a goal-driven and continuous process.

Rule 3: Software process improvement is an experiment process.

Rule 4: Software process improvement is risk-prone.

Rule 5: Software process improvement is a time varying system.

Rule 6: Software process improvement is a random system dominated by human factors.

Rule 7: Software process improvement has preconditions.

Rule 8: Process improvement is based on process system reengineering.

Rule 9: Software process improvement achievement is cumulative.

They’re good and cover much of the same ground, albeit with bigger words. And they pre-date ours by ten years. But they only have nine rules and we’ve got ten, so we win :  )

 

 

 

 

 

 

 

19 May 2009:  A visitor at OSEL

The web site has been given another tidy up by a visitor to the office. Chris Shelley has been here on a ‘go to work’ day and has been reviewing and editing the web site. He also spent time listening in on a ‘webinar’ - but was not impressed - old hat to a thirteen year old.

 

 

 

 

 

 

 

18 May 2009:  Changing Culture

I’m intending to present at Tom Gilb’s seminar again this year. The topic is ‘Changing Culture’. While I’ve been a close student of software culture (you have to be if you want to have any influence at all) I’m not sure that I can claim to know how to change it. The paper describes software cultures and a way of mapping a culture in order to better understand it. It also lists a number of tools that can be used to change cultures. These are in no way complete or easy to use, but they do seem to work. And here are the slides.

 

 

 

 

 

 

 

18 May 2009… ‘Energizing CMMI’

There has been some interest in the ‘Energizing CMMI’ page - we seem to have hit on something that matters to a lot of people. And following on from this interest we’ve been asked to do a webinar. It’s scheduled for 23 June. If you’d like to log on and hear more about SPI, CMMI and getting it to work then let me know and I’ll get you registered for it.

 

 

 

 

 

 

 

4 April 2009:  New Requirement Book

Attended the launch of Ian Alexander and Ljerka Beus-Dukic’s new requirements book: ‘Discovering Requirements: How to Specify Products and Services’. It is described as a set of techniques and methods for eliciting (or discovering, as the authors say) a good set of requirements. Review to follow in due course. CCS

 

 

 

 

 

 

 

March 2009:  Next SPIN meeting

 

The next SPIN meeting will be on the evening of Monday 1st June.

 

Adopting Agile: Traps and Pitfalls

Simon Whittington & Chris Cooper-Bland

 

Overview

Over the years we have all developed a comprehensive portfolio of best practices to help us avoid the common problems on projects. The techniques include:
• Collaborative working
• Risk-Based Prioritisation
• Visual modelling
• Iterative working
• Change control
• Best practices and use of Tools
• Clear traceability: Linking scope & features to cost & value

However, when adopting agile processes many projects encounter difficulties or fail, and a large number of these failures can be attributed to bad practice in the application of processes. When you analyse the causes of failure a number of common pitfalls emerge, which can be further developed by looking at the problems, their symptoms, and ways to avoid or mitigate them.

Using experience gleaned from many projects, this session will present an overview of how the techniques apply to the adoption of agile processes and our ideas on how to use the techniques to avoid the pitfalls as a series of anti-patterns. This will lead into an interactive session where the group can discuss whether they agree with the techniques and pitfalls, and also propose new anti-patterns which they have observed on projects.

 

Audience Background

The session does not require any specific technical skills, although some knowledge of patterns would be helpful. Experienced process engineers will have the opportunity to share their views.

 

Benefits of participating

Participants will learn how to:
• Spot the rot and share experiences with the group on how best practices have been applied in agile projects
• Determine the best way to apply these best practices to avoid common project problems
• Share problems commonly encountered by participants and how they were solved

 

email me for details or to book a place

 

 

 

 

 

 

 

March 2009:  SPIN goes online - BCS SPIN Forum:

The British SPIN (Software Process Improvement Network – a BCS SG) has been exploring, reporting and promoting software process improvement for nearly twenty years now, looking at what works - what actually improves software development and testing - and what doesn't.

From early interests in using ISO 9000 and TQM we have reported on the development and experiences with CMM, SPICE and later CMMI. (SPIN owes its existence to the development of CMM.) We have tracked the development and use of measurement for software engineering and management, and have learned from the SPI experiences of world class software organizations. We have also made an occasional foray into the more academic aspects of software development. SPIN was one of the first to recognize and take a look at the possibilities of agile development; we take a continuing interest in emerging development practices.  The value of six sigma, and other sets of methods for making changes in software development environments have also been, and continue, to be explored.

We now have an online forum. If you are actively involved in software process improvement or technology change, or simply want to know more about it, email me for an invitation to join this independent forum. You will then be able to exchange experiences and opinions, ask questions, post papers, and collaborate with your colleagues in SPI…

 

 

 

…and…

 

…over this time we have been directed by the needs and interests of our members. Now, with the introduction of our forum (huddle) we have a great way of finding out where we should direct our attention now and in the future. What's next? Show us where the best ideas are. What direction should SPIN be going?  (And not clockwise, or anticlockwise please, or up or down.)

 

 

 

 

 

 

 

February 2009:  BCS SPIN – next meeting…

 

The CMMI and Innovation: A Lost Promise or the Coming Thing?

 

This year's SPIN evening meetings are scheduled for:

 

        -       Tuesday, 24 February,

        -       Tuesday, 26 May,

        -       Tuesday, 25 August,

        -       Tuesday, 24 November, all at the usual time and venue - 6.00 pm at the BCS London office in Southampton Street.

 

As an antidote to the current fashion for CMMI compliance we have a talk about real process improvement. Marilyn Bush is taking a critical look at CMMI and proposing refinements to enable and encourage innovation in software development:

 

Title:

The CMMI and Innovation: A Lost Promise or the Coming Thing?

 

Abstract:

Modern companies are always looking ahead. Innovation, though, means more than aspiration. It corresponds to a choreographed and repeatable process - the foundations of the CMMI and before it the CMM. Yet over the years those emphases have been displaced in favour of top-down discipline. This paper will examine the evolution of the CMMI and the links between the CMMI and the kind of organizational values that in successful innovative companies foster systematic creativity.

 

Presenter:

Marilyn Bush (and Charley W. Bush). Marilyn is one of the authors of the original CMM and a CMMI assessor and trainer with a particular interest in innovation in software development.

 

 

Venue:                  BCS London Office, Southampton Street.

 

Time:                   18:00 - 20:00, 24 February 2009

 

Meeting Format:         Presentation and discussion afterwards

 

Attendance:             Free

 

Sustenance:            Sandwiches and refreshments provided from 18:00, with 18.30 start.

 

Registration:           Delegates must register; please send notification to: Mandy Bauer mandy.bauer@hq.bcs.org.uk

 

or Clifford Shelley shelley@osel.netkonect.co.uk

 

 

 

 

 

 

 

 

January 2009: A domino model of software and system processes

 

The increasing interest in statistical modelling requires the development of useful models to which the statistics can be applied. The development of these models is perhaps the most important aspect of any statistical or simulation modelling: a poor or incorrect model can be worse than no model at all. But the models that tend to be used at the moment are either simplistic and traditional, but unhelpful, or so complicated that they are inflexible, very difficult to validate and difficult to have much confidence in. A simple model-building technique is needed.

 

The domino model may provide that technique. Everyone is familiar with the idea of falling dominos, and will have seen elaborate patterns of falling dominos. This appears to be a useful way of looking at processes too. In software management the idea that a task is done or not done - not 90% done - is generally accepted good practice, as is the idea of partitioning large pieces of work into small tasks. The analogy to dominos is clear. Domino models can be built of most software processes, with opportunities to design for speed, low risk, flexibility and other required software development outcomes. Some issues need development - designing domino models for synchronization is not yet clear - but the domino model does appear to hold promise as an easy-to-use approach for understanding and reasoning about software development models, and as a ‘substrate’ onto which to graft statistical and probabilistic models.   CCS January 2009
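As an illustrative sketch of the idea - not an OSEL tool, and with task times, stall probabilities and function names that are purely my own assumptions - a linear chain of done/not-done tasks can be simulated, and a probabilistic model grafted on by sampling completion times:

```python
import random

def simulate_chain(task_times, stall_prob=0.1, rework_factor=1.5):
    """Simulate one run of a linear 'domino chain' of tasks.

    Each domino (task) either falls in its nominal time or stalls
    (with probability stall_prob) and needs rework before the next
    domino can fall. A task is done or not done - never 90% done.
    """
    total = 0.0
    for t in task_times:
        total += t
        if random.random() < stall_prob:
            total += t * rework_factor  # rework delays the whole chain
    return total

def monte_carlo(task_times, runs=10000):
    """Estimate the distribution of chain completion time by sampling."""
    samples = sorted(simulate_chain(task_times) for _ in range(runs))
    return {
        "mean": sum(samples) / len(samples),
        "p50": samples[len(samples) // 2],
        "p90": samples[int(len(samples) * 0.9)],
    }

if __name__ == "__main__":
    # Five chained tasks of 2-5 days each (illustrative figures only).
    print(monte_carlo([2, 3, 5, 4, 2]))
```

The appeal of the substrate is visible even in this toy: changing the chain's topology or the stall distribution changes the completion-time distribution directly, without rebuilding the statistical machinery.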

 

 

 

 

 

 

 

October 2008:  Metronomicon

 

Some years ago Sarah Sheard wrote a paper called ‘The Frameworks Quagmire’. It attempted to describe the increasing number of software engineering standards, models and frameworks in use, and included a diagram showing the relationships between these models and standards - explaining the title of the paper. (Since then the situation has got worse: models are products and tend to proliferate and get ‘improved’, and it is often far easier to invent your own model than to invest in an extant one.) The software measurement community is in a similar situation. Despite the relative simplicity and abstract nature of software measurement, its application spans different domains and software cultures: engineering, commercial, financial and agile (who seem to be doing a reasonable job of using measurement), plus the academic community (very keen on measures of software structure). Add the influence of business culture and organizational performance measurement, the strong influence (or distortion) of a pervasive project management orientation, and the diverse needs of the different users of measurement, the collectors and verifiers of data, and the designers of measures and analyses, and the result is a metrics maze every bit as tangled and confused as the frameworks quagmire. Buried in this maze are a more or less complete set of methods and tools, important lessons on what works and, perhaps more important, what doesn’t. But identifying this good stuff and filtering out the worthless or obsolete is very difficult, especially for practising engineers and managers simply looking to use measurement, not wanting to embark on a time-consuming quest for knowledge.

 

The UK’s Software Metrics Association (UKSMA), prompted by its Chairman, Rob Radcliff, has initiated the METRONOS project to address this. METRONOS has two objectives. The first is to use UKSMA’s and the software measurement community’s experience to review software measurement knowledge - especially the almost unknown software measurement standards - and organize and categorize it in a way that makes it easy for the non-specialist to access and navigate around: the metronomicon. The second, when the first is achieved, is to design qualifications for those undertaking software measurement, in much the same way as the software testing community has developed qualifications for testers. METRONOS is not inventing or developing anything new. Everything needed for effective software measurement is already available; it simply needs organizing and promulgating. This is an ambitious project depending on good experience and judgement. If you would like to help with this work contact me, or any member of the UKSMA committee. CCS October 2008

 

 

 

 

 

 

 

October 2008:  Monte Carlo Methods for Software Projects and Processes

 

The use of statistical modelling is becoming increasingly popular, and an expected part of software organizations’ planning and risk management. However, the correct application of statistical modelling methods to complex and subtle software engineering environments can be problematic, leading to defective or misleading estimation, planning and management information, or loss of confidence in the delivered information. OSEL can provide informed and objective guidance on the development and application of robust and accountable methods, based on wide-ranging experience of software development and testing across most industry sectors, together with a background in the application of statistical models and methods. If you are considering using Monte Carlo simulations or other statistical modelling tools within your organization and would like help - from guidance, to model development, to the identification of appropriate distributions and running your simulations - then contact us now at info@osel.co.uk for more information.
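To show the kind of simulation involved, here is a minimal sketch of a Monte Carlo effort estimate; the tasks, the three-point figures and the use of a triangular distribution are illustrative assumptions of mine, not OSEL's method - choosing appropriate distributions is exactly the part that benefits from experience:

```python
import random

def simulate_project(tasks, runs=10000):
    """Monte Carlo estimate of total project effort.

    Each task is (optimistic, most_likely, pessimistic) in person-days.
    Effort per run is a sum of draws from triangular distributions -
    a common first choice when only three-point estimates exist.
    Returns the sorted sample of total efforts.
    """
    totals = []
    for _ in range(runs):
        totals.append(sum(random.triangular(lo, hi, mode)
                          for lo, mode, hi in tasks))
    totals.sort()
    return totals

# Illustrative three-point estimates for three tasks.
tasks = [(3, 5, 10), (2, 4, 9), (5, 8, 20)]
totals = simulate_project(tasks)
p80 = totals[int(len(totals) * 0.8)]  # effort we'd exceed only ~20% of the time
```

Reading off a percentile of the sampled totals, rather than adding up single-point estimates, is what makes the result usable for risk management: it states a commitment with an explicit confidence level.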

 

 

 

 

 

 

 

July 2008:  More than Projects

 

The usual way of describing software development work is as a project. It is a convenient shorthand, recognized and supposedly well understood across the industry, and built into models of software development and management. When a piece of work is identified as a project it triggers the use of project management tools and roles and provides a framework for the work. The downside of this is that much software development work is not suited to being treated as a project. The work may be simple, short term or repetitive, or in other ways unlike a project needing planning, tracking, project boards etc., etc. And by invoking the project model, routine work may become inflexible, unresponsive and overloaded with management and administrative tools that increase costs, obstruct and delay the work, and discredit project management techniques. What are the alternatives? And why do we not have a choice? Remarkably few words for packages of work that deliver a result or value are used in software development. Some, like Scrum, have a following (including me), but, despite its many strengths, a proprietary term like this will not attract a wide following and may fall from favour in the longer term. Terms like release and fix are understood but are often bundled into the project framework with all that implies in terms of management overheads. It is worth taking a look at other professions to see what terms they use for packages of work: case, engagement, brief, operation, commission, exercise, revision, value stream, service, mission, sortie, study… These sound odd when applied to software development or support but they do provide a variety of models or metaphors, rather than the largely unsuccessful project monoculture that software developers tend to be stuck with. However, to be genuinely useful, some way of categorizing the work to be performed and its context is needed to enable the correct model to be selected and built.
First is the character of the problem to be solved. Manfred Bundschuh’s excellent categorization of problems - Interpolation (simple, puzzle-like, ‘production’ oriented), Synthesis (a real problem, interesting, engaging, containing ‘unknowns’, and ‘project like’), and Dialectic (research, exploratory, with ‘unk-unks’, and rather worrying) - is really useful for this. Then there is the scale: how big is the task? Man-days, weeks, years? This requires more than categorization into little project, project, or programme; there are qualitative differences too. The culture and organization (or organizations) within which the work is to be performed will also exert an influence and should be described and categorized. And so on. With a useful categorization and selection criteria, models for the work can then be evaluated for their suitability. There will always be projects, but these will be real projects requiring project management, not arbitrary bundles of work mislabelled as projects. There will be value streams for routine processing of software changes or defects, and perhaps small packages of work directed by no more than ‘five paragraph field orders’ (the military may be a very fruitful source of models - they are experts at delivering results in uncertain situations). We are currently working to identify a set of useful models for software development - borrowing from other professions - and to develop a simple methodology to enable the optimum model to be selected and implemented from patterns or pattern sets, providing the optimum approach to software development and support. If you would like to help us with this work, or to trial some of the models and patterns, do get in touch. CCS.

 

 

 

 

 

 

 

10 July 2008

 

BCS Software Process Improvement Network (SPIN) Specialist Group

 

CMMI Workshop

 

 20 Years of CMM and CMMI – Where are we now, where next?

 

-

 

21-22 October 2008, Oxford

 

 

CMMI, and its predecessors, SPA and CMM, have been in widespread use for twenty years. They have exerted a major influence on the way software development is managed worldwide. The British SPIN is conducting a two-day workshop this autumn to assess experiences with these models and to identify and propose improvements to the CMMI model and the way it is used. This is an opportunity for experienced users of CMMI to compare their experiences and contribute improvement proposals for these influential tools.

 

There will be two aspects to the workshop:

 

The Model:

Areas of interest include:

 

The Scope of CMMI: CMMI addresses a wide range of software development and management activities in terms of ‘process areas’ (PAs). Are these PAs correctly placed within the model? Are there too many PAs (at ML3)? Could the emphasis be shifted within or across PAs? Are some PAs missing?

The Structure of CMMI: The structure of CMM and CMMI has evolved over their lifetime, from the high profile maturity levels to the detailed practices. What aspects of CMMI’s structure are most useful, and what are the least? Is the structure sufficiently general to enable widespread and useful interpretation?

Best and Worst: CMMI is expensive. Which parts of the model give best return on investment and which the worst?

High Maturity: Maturity Levels 4 and 5 focus on quantitative aspects of software development. Does the focus on process control deliver the benefits expected? What is the value of the quantitative process models required of high maturity organizations? Is the drive for control driving out excellence?

 

Using CMMI:

Areas of interest include:

 

Patterns of Use: How is the model being interpreted and used to improve software development and management, are there better ways? How is CMMI best adapted for different types of organization?

Appraisals:  What works and what doesn’t. Are appraisals now audits? What influence do appraisals have on process improvement and innovation? Is the SCAMPI process now too onerous?

Ethics and conflicts of interest: Lead Appraisers are often placed in difficulties as they advise organizations on process improvements and then lead assessments. How should potential conflicts of interest be managed?

Perspectives: What perspectives is CMMI intended to be viewed from, and what perspectives are not intended, or ineffective? How should senior managers, software staff and customers view and use CMMI? How do their interests influence the way CMMI is used?

Other Models:  How should CMMI relate to emerging approaches to software development and management?  Is CMMI capable of coexisting with these contemporary approaches?

 

 

Users of CMMI are invited to submit a concise statement of an issue, concern or opportunity for improvement of the model or its use, based on evidence or experience, together with a description of its resolution. (The statement of issue and resolution can be from one to five pages in length. Guidelines for the statements will be provided on the SPIN website, or with this invitation.) Space at the workshop is limited and preference will be given to statements received early. Send your statement to any of the organizing committee:

 

ccb@badgerscroft.com

shelley@osel.netkonect.co.uk

LCHUGHES@qinetiq.com

m.w.bush@ieee.org

 

The closing date for submission is 15 September.

 

Selected workshop participants will be able to present their statement, contribute to workshop panels, and to work with other workshop participants in the development of a set of experiences and proposals for refinements to the CMMI model and its patterns of use. (Participants may submit more than one statement but will only be able to present one.)

 

It is intended that the products of the workshop will be refined, published and made widely available to the CMMI community.

 

Attendance at the workshop is free, but participants will be responsible for their own travel and accommodation arrangements, and  their expenses.

 

 

 

 

 

 

 

 

 

 

June 2008

Managing Risk and Uncertainty

 

This year Tom Gilb’s seminar was concerned with risk and uncertainty. There were plenty of ideas - in particular I was surprised by the extent to which risk is dealt with implicitly, by designing the project and the software to minimize risk, rather than explicitly with traditional risk management techniques. I spoke about the customary, de facto standard technique - its strengths and weaknesses - and proposed some modifications to make it more effective and credible, including the need to introduce some carefully selected risks into projects to inject life and value into them. The paper is here and the slides here. And I collected sufficient new ideas (thanks in particular to Russ and Matthew), and confirmation of existing ones, to begin rebuilding our standard project risk management tools.

 

 

 

 

 

 

 

 

May 2008

 

The International Conference on Agile Processes and eXtreme Programming, XP2008, will be in Limerick from 12 to 14 June - details below….

XP2008 - http://www.lero.ie/xp2008

Time: 9am - 6pm
Date: 12th - 14th June
Venue: University of Limerick, Ireland

Don't miss out on Early Registration extended until 15th May.

The International Conference on Agile Processes and eXtreme Programming in Software Engineering, XP2008, is the leading world event on the topics of agility in software and information systems development.

Keynote Speakers:
Kati Vilkki - Nokia Siemens Networks

Sean Hanly - co-founder and CTO of Exoftware
Dave Snowden - Founder and Chief Scientific Officer of Cognitive Edge

Philippe Kruchten -senior member of IEEE CS and co-founder of Agile Vancouver

Case Study:


"Inflight Agile Enablement in HBOS Retail IT"

Co-presented by Neil Munro, HBOS and Brian Hanly, Exoftware.

Some of the workshops include:
Experience the Human Side of Agile
Agile in the Large
Agile Testing and Assessment
Exploring Agile Coaching

The ninth conference (XP2008) will be hosted by Lero – the Irish Software Engineering Research Centre at the University of Limerick on the west coast of Ireland. The conference brings together both industrial practitioners and researchers in the fields of information systems and software engineering, and focuses specifically on theory, practical applications and implications of agile methods.

Register Here - http://www.lero.ie/XP2008/Registration.html

 

 

 

 

 

 

 

 

April 2008

 

BCS SPIN SG

Gradual Process Improvement using Agile Retrospectives  Tuesday 20 May 2008

Speaker:  Rachel Davies - Rachel provides consultancy and coaching to teams in agile software development.

She has been applying agile approaches since 2000 and has experience of a range of agile methods including XP, Scrum, Lean and DSDM. Rachel is a well-known presenter at, and organizer of, industry conferences and a long-serving director of the non-profit Agile Alliance.

Venue:                          BCS London Office

Time:                           18:00 - 20:00

Meeting Format:         Presentation and discussion afterwards

Attendance:             Free

Sustenance:             Sandwiches and refreshments provided from 18:00, with 18.30 start.

Registration:           Delegates must register; please send notification to: Mandy Bauer mandy.bauer@hq.bcs.org.uk

or Clifford Shelley shelley@osel.netkonect.co.uk

(Dietary requirements: Please notify Mandy Bauer)

Full details of how to get to the BCS London office can be found here:

http://www.epsg.org.uk/locations/bcsss-guide.html

courtesy of the BCS Electronic Publishing group (these are by far the best location instructions I have seen anywhere).

Note: Delegates need to be BCS members. The BCS has a special offer open until the 30 April 2008 where SPIN members can join the BCS free of charge. If non-BCS members wish to take advantage of this offer, then please contact Clifford Shelley so that he can register you as a SPIN member with the BCS. For further information, please consult ‘Joining a Group’ on the SPIN website: http://www.bcs.org/server.php?show=nav.7057.

 

 

 

 

 

 

 

 

February 2008

BCS SPIN SG

CMMI and Metrics, 19 February 2008

 

This is a free event with limited numbers, so places are allocated on a first come, first served basis. The details are:

 

Title:                        CMMI and Metrics (A review of the application of measurement for CMMI)

 

Date:                   Tuesday 19 February 2008

Speaker:                Dr Clifford Shelley

Venue:                  BCS London Office

Time:                   18:00 - 20:00

Meeting Format:   Presentation and discussion afterwards

Attendance:             Free

Sustenance:             Sandwiches and refreshments provided from 18:00, with 18.30 start.

 

Registration:           Delegates must register; please send notification to: Mandy Bauer mandy.bauer@hq.bcs.org.uk

or Clifford Shelley shelley@osel.netkonect.co.uk

 

(Dietary requirements: Please notify Mandy Bauer)

 

Full details of how to get to the BCS London office can be found here:

 

http://www.epsg.org.uk/locations/bcsss-guide.html

 

courtesy of the BCS Electronic Publishing group (these are by far the best location instructions I have seen anywhere)

 

As you are probably aware there has not been much activity within the group for some time. In the past we have held full-day meetings and charged for attendance, but recently it has proved hard to get enough attendees to make this worthwhile. So we have decided on a change of format. We are still looking for a lively debate and will value your input into the future of the group.

 

 

We would also like to draw your attention to another event being run by the Young Professionals Group, details are here:

 

http://www.bcs.org/server.php?show=ConWebDoc.16908

  

 

 

 

 

 

 

August 2007

Introduction to CMMI v1.2  Oxford October 2007…

 

Marilyn Bush is presenting an SEI  licensed ‘Introduction to CMMI v1.2’ course in October on the 10th to 12th, in Oxford. Marilyn is one of the contributors to the original S/W CMM and one of the most experienced and insightful CMMI assessors and trainers; if you are planning to attend an Intro to CMMI course this is the one. Course and booking details are here.

 

 

 

 

 

 

 

2 August 2007

The next SPIN meeting is on 14 August at the BCS’s London Offices. Visit http://www.bcs.org/server.php?show=nav.9416 for details and a booking form

 

 

 

 

 

 

 

July 2007

 

Smart Decision Making

 

Tom Gilb’s annual symposium in London was concerned with ‘smart decision making’ this year. My talk covered the anatomy of decisions and looked at decision dysfunction (why organizations force smart people to make dumb decisions). A paper was also presented.

 

 

 

 

 

 

 

29 March 2007

 

Another new training course for 2007…

 

Formal Technical Reviews: From Inspection to Walkthrough: A one day workshop that presents the principles and practice of Formal Technical Reviews (FTRs). Participants will be able to plan what and when to review and how to perform reviews, in line with industry best practice, to reduce project risks and improve software quality to the required level. Call or email for details.

 

 

 

 

 

 

 

29 March 2007

The UK Software Metrics Association is issuing a call for papers for their autumn conference:

 

The UK Software Metrics Association (UKSMA)

CALL FOR PAPERS

The 18th Annual UKSMA Conference is to be held
on Tuesday 16th October 2007, in Central London.  

You are invited to submit papers or presentations to  conferences@uksma.co.uk.  

Please submit a short description (a précis) of what you offer to present.  Each presenter has 45 minutes, including 10 minutes for questions.

The deadline for précis submission is 30 April 2007.

Notification of acceptance by 31 May 2007.
 
The conference affords the opportunity to share experiences, ideas, case studies, and strategies for widening the relevance, appeal and benefit of measurement to the management of software development and maintenance.

The theme of the conference this year is wide ranging to reflect the increasingly diverse nature and usage of software measurement.  

We are keen to receive submissions/abstracts on the topics of:
-        innovative or original approaches to software measurement;
-        introduction of measurement to software development environments;
-        the presentation of measurement data to aid decision making;
-        best practice in measurement;
-        estimation;
-        measurement use in contracts or outsourced arrangements;
-        the use and benefits of measurement data;
-        measurement in CMMI, Prince2 and ISO contexts.
       
Case studies would be particularly welcome.

It would be appreciated if you would indicate your intention to submit a paper as soon as possible even if a précis is not yet available, so that we can gauge the response.  

Information about UKSMA can be found at www.uksma.co.uk
16 January 2007

Two pieces of work are nearing completion. We have a GQM procedure that we would like independently reviewed – based on the GQM course, it presents GQM as a practical tool for use by individuals or teams, taking measurement from initial needs through to validation of measurement data. Supported by models and templates, we think this takes the best of the GQM ideas and presents them in a useful and practical package usable by anyone. The other piece of work is a handbook of software reviews. Again aimed at individuals and teams looking for practical advice, this handbook shows how to perform reviews for best effect, managing the insidious ‘email review’ and ‘expert review’ problems.
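The GQM idea mentioned above – tracing each metric back through a question to a goal – can be sketched as a simple data structure. This is a minimal illustration only; the goal, questions and metrics below are invented and are not taken from the OSEL procedure.

```python
# A minimal sketch of a Goal/Question/Metric (GQM) breakdown.
# The goal, questions and metrics are hypothetical, for illustration.
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    unit: str

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    questions: list = field(default_factory=list)

goal = Goal(
    purpose="Reduce rework caused by defects found in review",
    questions=[
        Question("How many defects are found per review?",
                 [Metric("defects_per_review", "count")]),
        Question("How much effort does rework consume?",
                 [Metric("rework_effort", "person-hours")]),
    ],
)

# Every metric traces back to a question, and every question to the goal.
for q in goal.questions:
    for m in q.metrics:
        print(f"{goal.purpose} <- {q.text} <- {m.name} ({m.unit})")
```

The point of the structure is the traceability: a metric with no question, or a question with no goal, is a warning sign that measurement is being collected for its own sake.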
16 January 2007

It is time to get the British Software Process Improvement Network (SPIN) functioning again. SPIN is a BCS specialist group for software people looking for better ways of working, from both management and technical perspectives. If you have an interest in software process improvement – and particularly if you have ideas for SPIN: how it could be more effective, what you would like from it – or would like to help the SPIN group, then get in touch with Clifford Shelley using the email contact above.
15 January 2007

The UK Software Metrics Association is beginning to think about the next measurement conference this autumn. If you have ideas for a theme let us know. (I would like the theme to be measurement in other disciplines – other areas struggle with measurement too, and some have come up with useful ideas the software community could use. I’d also like to see some attention paid to measurement in the small. Too much attention is given to ‘measurement programmes’, with little that is new, useful or successful.)
24 October 2006

Just back from the UK Software Metrics Association (UKSMA) conference. Good material, but the pace of development for software measurement needs to accelerate: the world is changing and approaches to software measurement need to keep up. OSEL’s paper on enhancing the inspection process to recognize design excellence – not just defects – was well received. Here are the paper and slides.
12 September 2006

 

New training courses for 2006 and 2007…

 

Four new courses are being completed for delivery in late 2006 / early 2007. These courses have been designed for software developers and managers working to meet changing business conditions, or to respond sensibly to the increasing pressure to conform to new governance models and standards. The courses are:

 

TCM: The Innovation Management Tool.  A half day workshop introducing and using the people and process oriented methods that make PDCA and continuous process improvement a practical reality.

 

Goal/Question/Metric. Measurements are increasingly required to demonstrate the value and integrity of software development, but defining and using software measures is notorious for its expense and ineffectiveness. This half day course distils the learning from the best software measurement methods and makes software measurement quick, simple and effective.

 

Six Sigma for Software Developers. A one day workshop specifically for software developers and managers that gives an overview of the six sigma approach and then selects and describes the tools that work best in software organizations.

 

Lean Software Development.  An intensive one day workshop introducing software developers to the ideas of lean software development and describing the thinking tools that reduce timescales and cut through waste.

 

We will be posting more details of these courses in due course. If you would like to find out more now please email us at info@osel.co.uk
28 July 2006

The UK SPIN group should be reforming again soon. At a meeting yesterday a provisional programme of meetings was discussed. Dates are 25 September, 4 December, 26 February ’07, and 21 March ’07, with topics covering process improvement failure modes, off-shoring, and managing change. The meetings will be held in London at the BCS offices in Southampton Street. The folk at Lamri are offering to help, so expect SPIN to start presenting a more interesting and professional face to the world. If you want to know what process improvement is really like, and where it’s heading (without the spin – so to speak), or would like to present a paper, do get in touch – either here or email Andrew Griffiths at Lamri.
8 June 2006

Developed from the note below, this draft paper discusses the ‘opposite of a defect’ and why it matters. I submitted this to EuroSTAR with no luck, but the perceptive people at UKSMA have asked me to present it (a little more developed) this autumn. I’ll have copies with me at the SEPG in Amsterdam next week. I’m keen to hear comments on these ideas – especially if they have been developed elsewhere already.
1 February 2006

 

Peter Leeson is running the popular three day ‘Introduction to the CMMI’ course in April in Milton Keynes:

Course Description

This three-day course introduces systems and software engineering managers and practitioners, appraisal team members, and engineering process group (e.g., SEPG, EPG) members to Capability Maturity Model® Integration (CMMI) fundamental concepts. CMMI models are tools that help organizations improve their ability to develop and maintain quality products and services. CMMI models are an integration of best practices from proven discipline-specific process improvement models, including the CMM® for Software, EIA 731, and the Integrated Product Development CMM.

This course incorporates concepts from both of the original “Introduction to CMMI” courses. The course also includes improvements to slides, exercises, and other course materials identified in other change requests submitted by the community. Those who have already taken one of the original "Introduction to CMMI" courses (staged and continuous) do not need to take the new course unless they feel they need a refresher.

Cost

Participant cost is £1000.00 (plus VAT) per person. This includes the training material, a light lunch on site and registration with the SEI. A non-refundable deposit of £100.00 is required upon registration. Full payment is required by the first day of the workshop.

Contact  Q:PIT for details.
6 January 2006

The excellent BCS Quality SiG (North West) is having a process modelling and definition seminar on 26 January. Details here.
4 January 2006 – happy new year!

The BCS’s SPIN group is planning its first meeting of the new year on 23 February at the BCS offices at Southampton St, London. The topic is metrics. The speakers are Grant Rule, Rob Radcliff, Graham Thomas and Norman Fenton. Go to http://www.spin.bcs.org/events.htm for further details (in the near future). Alternatively email me at the address above.
December 2005

Here is a paper describing an elaboration of the usual review process. Three-stage formal reviews are not new, but they are not well known either. They are mostly found in engineering and defence organizations – and they deserve to be better known.
16 June 2005

What’s the opposite of a defect?  Identifying the mirror images of software defects may be useful for improving software development and development processes:

 

The data of primary interest from software inspections is defect data. Analysed with care, it can provide information about the software, and about the projects and processes that produced it. The value of defect data to software engineering is not surprising. Defect data – whether as defect counts or as measures of deviations from a norm – is also the primary source of information for process control and revision in production engineering.

However there is a difference. Like manufacturing defects, software defects give rise to rework, delay, increased cost, system failures and customer dissatisfaction. But the uniformity, consistency and conformance to standard so valued in manufacturing’s essentially replicative processes are not the only criteria by which software is judged: software development is design, and, as in manufacturing design, excellence is the driver, not uniformity.

Excellence is not assessed solely by the number of defects. A good design will have a certain number of defects, but an excellent design is not defined by fewer. There is something else. Whatever this is, it isn’t assessed by quality control. But software quality control performed early in the development cycle, particularly inspections, is not quality control alone. It is an aware process performed by people who, as well as recognizing defects, may also recognize excellence: ingenious solutions to difficult problems; prior solutions; opportunities; and the elimination of complexity. This is recognized as one of the additional benefits of software inspections but does not appear to be formally managed or measured. In fact current practice may actively obstruct the capture of this information. Discussion of design options is discouraged, the focus being fixed on detecting defects alone. This may be a reflection of the origins of inspection as a hardware defect detection technique, used to improve consistency and uniformity.

Software inspections can become even more valuable when adapted for use in a design process; when they are supplemented by a requirement to recognize, acknowledge and record those rare points of excellence that surprise and please (and give rise to the occasional twinge of envy). Like defects these points should be recorded, counted and analysed.

Similar counts and analyses of ingenuity could identify the nature and perhaps the sources of excellence. (There doesn’t appear to be a usable antonym for defect.) The analysis of points of excellence (undefects? proeffects? profects perhaps?) would be similar to that of defects but, unlike defects, the emphasis would be on amplifying or reusing them to produce ‘profective’ products and processes.

The removal of defects is a major contributor to the delivery of high quality software, but is bounded by the lower limit of zero defects; the amplification of excellence, by dissemination of profects, has no limit.
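The proposal above – record, count and analyse points of excellence alongside defects – amounts to a small extension of the inspection record. A sketch of such a tally (the record format, category names and counts are all hypothetical, invented for illustration):

```python
from collections import Counter

# Hypothetical inspection findings: each is tagged as a 'defect' or a
# 'profect' (a recorded point of excellence), plus a category label.
findings = [
    ("defect", "logic"), ("defect", "interface"), ("defect", "logic"),
    ("profect", "elegant solution"), ("defect", "data"),
    ("profect", "complexity eliminated"), ("profect", "elegant solution"),
]

defects = Counter(cat for kind, cat in findings if kind == "defect")
profects = Counter(cat for kind, cat in findings if kind == "profect")

print("Defects: ", dict(defects))   # analysed for removal, as usual
print("Profects:", dict(profects))  # analysed for amplification and reuse
```

The analysis machinery is identical for both tallies; only the action taken differs – defects are driven towards zero, while profects are disseminated, with no upper bound.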
3 June 2005

Many thanks to Jaap Bloem who, in response to the defect analysis paper, pointed to the work of Professor Chris Verhoef who has been using elegant graphical techniques to reveal insights from unpromising looking software data. Professor Verhoef’s paper 'Quantifying SPI' is a model of the use of graphical techniques to analyse complex software data. The techniques are notable for their simplicity and power. These approaches really should be distinguished from the complex, obscuring, and often inappropriate statistical analyses that so much software data is subject to. (It’s tempting to add a ‘G’ for ‘Graphics’ to Basili’s GQM giving GQMGTuTu  - with the second G directed by Tukey and guided by Tufte.)
13 May 2005

Here is the draft of a paper about analysing software defect data that I will be presenting later this year. Your comments would be welcome. It suggests that SPC techniques are not as useful as might be thought and that simpler defect analysis techniques can do more and open up opportunities for improvement. It also proposes using software inspections for more than identifying defects.
3 May 2005

Download an evaluation copy of PODS v1.9 for Lotus Notes (R5 and later) with full functionality: defect tracker, change index, risk register, action tracker, test logger and much more!  The 814k zip file contains complete PODS, the PODS manual, and installation notes. Available on the PODS page.
18 April 2005

Unexpected validation for the rpi approach to process improvement (see www.osel.co.uk/rpi/rpi.htm): The emphasis with rpi is on problem solving – ‘if you aren’t solving a problem question what you are doing’, and ‘upstreaming’ (T  - 1). These are recommendations from the ‘theory of constraints’ (Goldratt, Theory of Constraints) which, in a nutshell, suggests that improvement work be focussed on removing bottlenecks. Optimisation in other areas does nothing but increase queues at the bottlenecks. This is precisely what we have been learning and promoting with rpi. Nice to know we’re getting it right.
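The bottleneck argument can be made concrete: the throughput of a pipeline is set by its slowest stage, so speeding up any other stage only grows the queues in front of the bottleneck. A toy illustration (stage names and rates are invented):

```python
# Toy development pipeline: throughput is limited by the slowest stage.
# Stage names and rates (items/day) are invented for illustration.
rates = {"specify": 12, "design": 9, "code": 5, "test": 8}

throughput = min(rates.values())        # 5/day - set by the 'code' stage
print("Throughput:", throughput)

rates["design"] = 20                    # optimise a non-bottleneck stage
print("After speeding up design:", min(rates.values()))  # unchanged

rates["code"] = 10                      # remove the bottleneck instead
print("After speeding up code:", min(rates.values()))    # improves
```

This is the rpi point in miniature: work on anything other than the constraint changes nothing downstream.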
6 April 2005

We are planning two public courses later in the year:

 

Rapid Process Improvement. A one day tutorial on the use of the RPI toolset, Central London, 8 September 2005, £285.

 

An Introduction to CMMI – Staged and Continuous,

A three day training course, Central London, 20-22 September 2005, £745.

 

Details of these courses will be posted on our events page in due course. If you would like details now please do email us.
24 March 2005

This is an observation that I’m not aware of having been made before. I’ve noticed that those organizations making progress in software process improvement – whether it is measured in terms of conformance to a standard, CMM levels perhaps, or improved performance – tend to be using two software process models: the operational model and the reference model. The operational model is the description of how work is performed, and may be an established lifecycle model or an acquired and tailored commercial model – RUP, say, or even a toolset. The reference model is used to direct and guide the development of the operational model; in many cases this is CMM(I), although it can be something else, including models used by others as operational models.

The value appears to be in the models complementing each other: the operational model provides an explicit and coherent framework, with tools to support working practice, and is an interpretation of the reference model; the reference model provides direction and validation to the operational model. Where this occurs it appears to have happened quite unconsciously, and it contrasts with organizations where the new improved way of working (whatever it may be) is introduced as both the operational and reference model, overwriting existing practices, with predictable consequences.

This two-model approach is implied in CMM(I)’s OPD, where architectures and lifecycles reside, but the relationship between these and the CMM(I) itself, especially with respect to the effectiveness of SPI, appears obscure and tends to be ignored when the ‘implementation’ of CMM(I) is discussed. (I have never liked the term ‘CMM(I) implementation’ – maybe this is why.) Has this operational/reference, two-model approach been noticed or reported elsewhere?
7 March 2005

I’ve been pointed to this and found it referenced in the lean software development book – see library. It’s not exactly an easy read but it may just open a door to understanding software processes a little better. S/w development is intimately connected to software management which is ‘difficult’. Professor Koskela’s paper may shed some light on this. Distinguishing service models from project models as alternatives for software and systems development (see the RPI slides) is not popular; the project being taken as the default regardless of its appropriateness. This paper may help justify non project oriented software management approaches.
28 February 2005

Thinking about the CMM again it occurred to me that you hear little about the many different ways it can be used for process improvement. Here are some suggestions:

 

1.       Use the CMM (or CMMI) to tell you what not to try. This was how we originally used it. When the CMM first appeared we were undertaking ambitious and diverse process improvement efforts, in particular software measurement. Some things worked, others didn’t. The CMM made it very clear for the first time that some things come first and others, dependent on them, come later. Use the CMM to understand your capability and make that better – don’t launch process improvements to become more mature, use them to become more effective and more efficient. Work to be a really good <whatever level you are, even L1> organization, and understand your organization and measure the effect of your work before even thinking about changing level.

 

2.       Start at the top. Look to Levels 4 and 5 for toolsets for improvement, not as distant maturity levels. The tools described at levels 4 and 5 can be used (carefully and locally) whatever your ‘official’ maturity rating. (A TQM organization not developing software, or just starting software development, would be level 1, but it could be equipped with the L4 and L5 processes to make it more effective – why not try that too?)

 

3.       Think of the maturity levels not as levels but as types: project oriented, process oriented, TQM oriented…. Some organizations are intrinsically project oriented, others process oriented. An organization may be a really good, intrinsically L2 type organization, or it could be naturally L3, but with holes. Which? Don’t spoil the good L2 organization by making it mimic an L3 organization, but the ventilated organization could really benefit from a drive to patch the holes. Trying to be the wrong type of organization – process oriented when project oriented, or vice versa – is like wearing shoes on the wrong feet: uncomfortable and unhelpful. Decide what type of organization you are and work to be really good at that. Don’t be a ‘cargo cult’ organization.

 

4.       Put the levels and process areas to one side and look at the really important features of the CMM: the common features. The common features describe all the process areas at all the levels in the s/w CMM. Why? Because they are the key to understanding, managing and improving processes. Invest in understanding and putting in place your own common features and you will be unable to stop improvements in performance. (The common features persist in CMMI, but tend to be overshadowed by the model’s size and complexity.)

 

If you have found good ways of using the CMM  -  beyond formal assessment, ‘gap analysis’ then attempting to be the next level (or two) up by <scheduled date> - let us know and we can add them to the list.
17 January 2005

Peter Leeson is running a three day ‘Introduction to the CMMI’ course on March 29 - 31, from 9.00 to 5.00, in Milton Keynes in the UK. This course fulfils the prerequisite requirement for any course requiring an official SEI Introductory CMMI course. Email us at info@osel.co.uk or call on 01993 700878 for more details.

 

The SEI Introduction to CMMI®
(Staged and Continuous)

 

This three-day course introduces systems and software engineering managers and practitioners, appraisal team members, and engineering process group (e.g., SEPG, EPG) members to Capability Maturity Model Integration (CMMI) fundamental concepts. CMMI models are tools that help organizations improve their ability to develop and maintain quality products and services. CMMI models are an integration of best practices from proven discipline-specific process improvement models, including the CMM for Software, EIA 731 and the Integrated Product Development CMM.

 

In response to community requests, this course is an upgrade to the existing Introduction to CMMI, Staged Representation and Continuous Representation courses, incorporating concepts from both. The course also includes improvements to slides, exercises, and other course materials identified in other change requests submitted by the community.

 

The course is composed of lectures and class exercises with ample opportunity for participant questions and discussions. After attending the course, participants will be able to describe the components of CMMI models and their relationships, discuss the process areas in CMMI models, and locate relevant information in the models.
7 December 2004

Just received this from the British Computer Society…

 

Following an exploratory meeting yesterday (25 November), a proposal is being developed for the formation of a BCS Open Source Specialist Group. Subject to approval of its formation by the Specialist Groups' Executive Committee, this SG will have a wide remit and will also include health/informatics.

If any member is interested in being a member of this group (and especially if anyone would be interested in serving on an interim Committee to get the group up and running), please contact Peter Murray, Chair, BCS HI Nursing SG, by 15 December on peter@open-nurse.info

Regards

John Stephens
Specialist Groups' Officer
BCS
1 Sanford Street
SWINDON
SN1 1HJ
e-mail: jstephens@hq.bcs.org.uk
Tel (direct): 01793 417631
6 December 2004

We have added a dedicated RPI page where we will be trialling access to the RPI toolset assets prior to moving them across to the SPIN website in March 2005. If you want something on the page that is not available yet, email us for it, but take a look at the RPI slides – below – to make sure you really do want it first.
2 December 2004

The US Sarbanes-Oxley Act seems to have the potential to stir things up in the software process community. The requirements imposed by this act may trigger increased management interest in visibility and accountability in software development and support – and that can’t be bad.
(4 November 2004

The web site has been given a tidy up by a visitor to the office. Eleanor Shelley has been here on a ‘go to work’ day and has been making herself useful by conducting some usability tests on PODS and editing some of our web pages.)
October 2004

We have drafted a procedure called ‘Process Workshop’. It has been developed to provide a simple, template procedure to put in place changes to working practices and engage developers and managers in process improvement. Its value is as an explicit and teachable process that can be used by everyone and ‘officially’ recognized and measured. It encourages consensus building and provides publicity for small, well scoped changes. This is in contrast to TCM, a tool for larger scale process improvements used for identifying and agreeing the problem and then formulating the fix. Comments on the PW draft documents would be very welcome -  procedure.pdf, procedure diagram.pdf. It would be useful to us to know if you already do something similar, or if this would be useful to you. Notes on errors and omissions, except typos, also welcome.
October 2004

I’ve just been sent this  - radice.pdf.  It is a fascinating case study on the use and value of software inspections by Ron Radice (see book reviews). I’m not too sure if I should be posting this paper here - note it is copyrighted – but it does include links to the author’s web site.
September 2004

We have been contributing to SPICE (Structured Process Improvement for Construction Enterprises) since its inception. SPICE is a model for process improvement, inspired by the SEI’s CMM, intended for the construction industry. SPICE III is now preparing to launch the Level 3 definition. See www.scri.salford.ac.uk or email us for details.
September 2004

Don’t forget the UKSMA conference in September  - http://www.uksma.co.uk/ – we will be there speaking on Six Sigma ( paper  slides ) and presenting the Rapid Process Improvement (RPI) tutorial ( slides ).
July 2004

Just returned from the SCRUM course. SCRUM is a ‘wrapper’ for agile software development, or perhaps more accurately, an agile wrapper for software development. It’s a deceptively simple set of ideas – 30 day ‘sprints’ to develop and deliver workable code of value to the business – with daily team meetings (scrums) during sprints to keep development on course by the self-managing team. In essence it is a refined and practical take on the iterative/incremental development approach.

SCRUM has a lot of potential – it meets a real need, it’s simple, and it’s technically easy to implement – but like anything involving new ways of working, it is potentially career limiting. It’s designed to break the software development log jam and get value out of software development fast. It sits neatly between the agile development methods and the business that must get value from them. It’s not just process – like XP there’s some attitude in this too.

There’s a SCRUM book (by Ken Schwaber and Mike Beedle, SCRUM’s originators – ISBN 0-13-067634-9) but it doesn’t do credit to SCRUM itself. If you can, get yourself on a Certified ScrumMaster course. The presenters at our 2 day course, Ken Schwaber and Joseph Pelrine, clearly know what they’re talking about and were able to hold our attention with great ideas, clear thinking and some good stories too. I don’t agree with everything they said, but like any good idea it’s made me think and question what I believe. As an extra goodie, it’s intended to keep SCRUM open so everyone can get access and contribute to its development. Find out more at http://www.controlchaos.com/.

 

If you would like a SCRUM tutorial or SCRUM training we now have a Certified ScrumMaster - contact us.
June 2004

SPIN has just started up again – kick started by Tom Gilb’s excellent seminar on Agile Inspection. At the next SPIN meeting this autumn we intend to start up two SPIN initiatives (if the committee will let us). Firstly restart the SPI tools repository to acquire and give SPI practitioners access to simple effective SPI tools. (This was tried in ’98 but while the need was there few tools were forthcoming. This has now changed.) Secondly start a SPIN project to develop ‘fault grids’ invented by Jeremy Dick. Fault grids appear to be the most effective way of analysing and presenting defect data. They have considerable potential but have had very little exposure in the software development community. This project hopes to change that.
26 May 2004

I’ve just received this. Short notice but could be interesting if you can make it…

 

The Thames Valley Agile Special Interest Group would like to invite you to our first seminar at Oxford Brookes University on the 1st June 2004.

 

Topics:    An introduction to Agile Software Development / Extreme Programming (XP) explained 

 

Speaker: Tim Bacon - Thoughtworks

 

Date:                        Tuesday 1st June 2004

 

Time:                        19.00 - approx 21.00

 

Cost:                        Free

 

Location:                 Oxford Brookes University

                                Headington Campus

                                Gipsy Lane

                                Oxford

                                OX3 0BP

                                (Arrive at main reception from where you will see signs to the Agile User Group).

                                 

Summary:

Agile Software Development has had a dramatic impact on the software development industry over the past few years. Since the release of Kent Beck's book, 'Extreme Programming Explained - Embrace Change', agile SD has gone from strength to strength. As more and more organisations reap the advantages of using these disciplined yet lightweight approaches to deliver customer focused software faster, it is becoming imperative for software teams to be able to utilise these 'agile' skills to keep up in the market place.
April 2004

UKSMA – the software measurement group – has issued a call for papers for its autumn conference:

 

The UK Software Metrics Association (UKSMA)

 

CALL FOR PAPERS

 

The 15th Annual UKSMA Conference is to be held

on 15 September 2004, at the University of  Wolverhampton. 

 

The theme of the conference will be Software Measurement in Practice.

 

You are invited to submit papers on the subject of software measurement to conferences@uksma.co.uk.  

Please submit a short description summarising your paper.  Each presenter has 45 minutes, including 10 minutes for questions.

It would be appreciated if you would indicate your intention to submit a paper as soon as possible even if the description is not yet ready, so that we can gauge the response. 

 

Deadline for papers is 7th May 2004 - Notification of acceptance by 28th May 2004.
March 2004

In passing – we’ve helped another one of our clients to CMM L2. They have just had a registered CBA IPI and have achieved L2 – in very short timescales and with clearly improved performance.
January 2004

The value of quality controls in all stages of software development and maintenance is becoming increasingly evident – not just as quality controllers identifying defects, but also as management and communications tools. This is leading us to find, define and categorize the best of these techniques and processes. It looks like a big job. If you would like to help with this, or simply contribute some views, email us.
September 2003

We’ve come across some interesting new ideas – dynamic configuration of systems: update your systems while they’re running. This potentially makes the traditional release process simpler and less risky.
May 2003
OSEL has presented to the UK SPIN on ‘six sigma – its applicability to software development’.
April 2003
OSEL is pleased to announce that it will be collaborating with Satyam to deliver first class process consultancy services.

Satyam is a major global consulting and IT services company, assessed at CMM Level 5. Satyam has a wealth of software process and process improvement knowledge.

OSEL has detailed knowledge of UK and European software process capabilities and software process improvement programmes. It also has highly effective process improvement tools.

We will be working closely together with Satyam to maximise our capabilities by making world class performance available - effectively - to organisations requiring improved understanding and performance of their software processes.
March 2003
The Microsoft PODS tool is now being tested - like the Notes version, it manages:
 - project status
 - reporting
 - reviews
 - risks
 - actions
 - changes
 - information
 - tests
 - defects
 - resources (requests, assignments, and skills).

The integration of the P2 metrics tool with the Microsoft PODS is underway.

Following increasing interest from industry and also a request at the UKSPIN we have prepared an overview and comparison on Six Sigma and its applicability to software development.
February 2003
We hosted the UKSPIN meeting on "The SPI Approaches that Work" and distributed our Tactical Change Method (TCM) to attendees - one of a suite of tools that deliver improvements.

The growing interest in the Unified Process and experiences of PRINCE2 has shown a need to integrate these two methods. We are currently working on integrating the management method and the software process.
January 2003
We have submitted a proposal to present our Rapid Process Improvement tutorial - describing the RPI toolset - at the European SEPG meeting this summer.

PODS is now being ported to VB. It is intended that this variant of mPODS will replace the current Access based mPODS towards the end of this year. The new mPODS will retain all the lPODS functionality, include P2, and be capable of working from most databases.

We have begun a study to see how closely the PODS tool-set maps onto the PRINCE2 project management method. PODS was deliberately developed independently of any particular method to ensure maximum applicability, but the obvious synergies between PODS and PRINCE2 have prompted us to consider a PODS variant that explicitly supports PRINCE2.

We are offering lPODS 1.8(d) (running on Lotus Notes R5) free of charge, with a view to extending the user base. The distribution is of a design-protected PODS, but this may change - see below. Contact shelley@osel.netkonect.co.uk for an lPODS pack.

OSEL is reviewing its PODS distribution options. Should we take PODS open source? The impact of the open source movement is impressive and opportunities for wider use and improvement of PODS are tempting. Any views?

November 2002
PEA has been revised to be more usable, extended to include risk profiles, and renamed P2.

The Tactical Change Management (TCM) method is being revised. The current issue (1.4) is rather 'procedural' in feel; we are revising it to look less intimidating and to be easier to learn.

September 2002
The GSP (Good Software Practices) document is being revised. It is being extended to cover the management and exploitation of internal company, national, and international standards (broadly comparable to the CMM's Level 3 OPD and OPF key process areas), and the entire set is being reworked to make it more useful as a baseline document for FQA (Focussed Quality Assurance).

June 2002
The Rapid Process Improvement tool-set has been extended and further tools have been identified. The Software Process Installation ('flat pack') tool is being revised.

May 2002
Users of PODS had commented on the need to capture metrics data - something we had consciously avoided during PODS's early development. In particular, the presentation of time-series data was requested. To meet this need we have developed the PEA (PODS Extraction and Analysis) tool. This is a simple stand-alone tool that periodically takes the running totals from PODS and presents them as data points on traditional time-series graphs. Graphs currently include:
· cost management - centred on Earned Value
· schedule adherence
· slippage
· cumulative changes
· cumulative defects
· work analysis (this is an ongoing measure of cost of quality)
· resource utilization
(An interesting side effect of PEA's development was the demonstration that defining and implementing software measures is remarkably straightforward when the processes delivering the data are already well defined.)
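To illustrate the general idea behind a PEA-style extraction, the sketch below turns periodic snapshots of running project totals into time-series points carrying the standard Earned Value indices (CPI = EV/AC, SPI = EV/PV). This is a minimal, hypothetical example: the field names and weekly sampling are assumptions for illustration, not PEA's actual schema or behaviour.

```python
from datetime import date

# Hypothetical weekly snapshots of running totals (in person-days).
# Field names are illustrative only.
snapshots = [
    {"date": date(2002, 4, 1),  "planned_value": 10.0, "earned_value": 8.0,  "actual_cost": 9.0},
    {"date": date(2002, 4, 8),  "planned_value": 20.0, "earned_value": 17.0, "actual_cost": 19.0},
    {"date": date(2002, 4, 15), "planned_value": 30.0, "earned_value": 28.0, "actual_cost": 30.0},
]

def earned_value_series(points):
    """Convert running totals into time-series points with the
    standard Earned Value indices: CPI = EV/AC, SPI = EV/PV."""
    series = []
    for p in points:
        series.append({
            "date": p["date"],
            "cpi": p["earned_value"] / p["actual_cost"],
            "spi": p["earned_value"] / p["planned_value"],
        })
    return series

for point in earned_value_series(snapshots):
    print(point["date"], f"CPI={point['cpi']:.2f}", f"SPI={point['spi']:.2f}")
```

A CPI or SPI below 1.0 indicates the project is over budget or behind schedule respectively at that sample point; plotting the series over time gives the kind of trend view the graphs above provide.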


© Copyright OSEL 1998-2016
This page was updated on 19/09/2016
Comments about this website to shelley@osel.netkonect.co.uk