Written by Mike Cottmeyer Saturday, 15 October 2011 01:21
How far ahead should we plan? It depends on what you are building, when you need to have it done… and if you aren't going to get done, how soon you need to know about it. If your goal is to build the highest value features possible, deliver continuously to market, and get real-time feedback… you might be able to get away with planning a sprint or two out… maybe less. If your goal is to deliver a specific set of predefined features, all of which need to be done by the end of the quarter, you may want to have all three months laid out. It's not that we wouldn't inspect and adapt and deal with reality, it's just that we need to know whether our velocity is trending such that everything is going to get done. If we don't know how we are doing against done, we don't know what tradeoffs we need to make along the way.
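The "is velocity trending toward done" check above is really just arithmetic. Here's a minimal sketch in Python; all the numbers are invented for illustration, and a real team would pull them from its tracking tool.

```python
# Hypothetical numbers: a team checking whether a fixed-scope backlog
# will fit in the sprints remaining before a quarter-end deadline.

def sprints_needed(remaining_points, velocity):
    """Sprints required to burn down the remaining backlog at the
    current velocity, rounded up to whole sprints."""
    return -(-remaining_points // velocity)  # ceiling division

def on_track(remaining_points, velocity, sprints_left):
    """True if the backlog should be done within the sprints we have."""
    return sprints_needed(remaining_points, velocity) <= sprints_left

# 120 points left, velocity of 20, 5 sprints until the quarter ends:
print(on_track(120, 20, 5))  # 6 sprints needed, only 5 left -> False
```

The point isn't the code, it's the conversation: the moment this check comes back False, you know you need to negotiate scope, not in the last week of the quarter.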
I've worked with several clients recently that were trying to operate as if the software they were building was emergent. It wasn't. They were being asked to deliver a specific outcome, with a pre-defined set of time and cost constraints. For these guys, it was absolutely silly to only plan their backlog two weeks at a pop. They had no idea how they were doing against the expectations of the business. They had no idea whether they were on track, or how they should approach the business to negotiate scope trade-offs. They had no means to determine if their approach was trending toward an acceptable outcome. The reality was that they were going to work really hard, probably deliver a great working product, and still have their stakeholders upset with them.
Having a plan doesn't mean that we have to have a death march. Having a plan means that we have a baseline to measure against… some way to determine if we are making the progress necessary to achieve our goals. Remember that line in the Agile Manifesto? We value responding to change over following a plan? While there is value in the items on the right, we value the items on the left more? The plan isn't the problem… it's the failure to respond to change, to deal with reality, that is the problem. If I have a fixed time, fixed cost, fixed scope project… I damn well better be delivering incrementally using an agile approach… it's the only way of knowing if I've got a shot in hell of being successful. It's the only way we can confidently let our stakeholders know whether we are on track.
Not every team needs a project manager… but I think many could benefit from some really good project management. I've been an agile project management guy from the beginning, but I am becoming increasingly convinced that we need to be teaching teams not just how to self-organize, but how to effectively manage delivery… product or project delivery, I don't care which. Self-organized teams need to have everything necessary to deliver an increment of value… and in my opinion, everything necessary to deliver an increment of working product includes someone who knows how to manage risk, validate assumptions, communicate with stakeholders, assess progress against the goal, and recognize when things are off track. That can be the PO, the ScrumMaster, or someone else on the team… again, it doesn't matter.
What matters is that project management is happening… no matter who does it.
Written by Mike Cottmeyer Monday, 3 October 2011 11:44
If you haven’t had a chance to fill out the annual VersionOne Agile Survey, time is running out. The V1 Survey has proven to be a great source of information about how we are doing as a community, and how far Agile has made its way into the mainstream. The guys at VersionOne would really appreciate it if you’d take a minute to head over and give them your feedback.
Here is a link to the site. Go fill it out now… seriously, get moving!
Written by Mike Cottmeyer Monday, 26 September 2011 11:37
Just wanted to let you guys know that I am running my second PMI-ACP Certification Prep Course on October 17-19, 2011. The class is going to be held at the VersionOne Headquarters in Alpharetta, GA… just north of Atlanta. We had 10 folks at the first course and got some really good feedback. The second course should be even better!
The course I'm teaching was co-developed by Dennis Stevens, Ahmed Sidky, Mike Griffiths, Sally Elatta, and yours truly… all of us were deeply involved with the creation of the PMI-ACP certification. The course is very comprehensive, designed to provide a solid foundation in Agile Project Management and get you prepared to sit for the PMI-ACP exam.
We have enough confirmed participants to run the course… this will not be cancelled. If you are interested in joining us, you can use the code OCT2011BLOG to get a 20% discount off the $1695 three-day price. Click here for more information about the course, or go here to register now!
Looking forward to seeing you there.
Written by Mike Cottmeyer Monday, 26 September 2011 06:00
Okay, so the beef with estimating seems to go like this… we all agree developers are bad at estimating, we all know that managers will misuse any estimates we give them, and we all know the team will have to go on a death march to meet an unreasonable date the manager set. Since we don’t like managers and we don’t like death marches, we conclude the creation of software is an unpredictable process, that estimates are bad, and we should never estimate at all. Is it possible that we don’t really have an estimating problem? Is it possible that we have a bad management problem? What if we decided to keep estimating and just decided to stop having bad managers? Would that solve the problem?
We could always just create an approach that doesn't have managers at all. Oh, wait… we've done that. It's called Scrum.
All kidding aside, I don’t see any reason to stop estimating. In fact, unless your business model supports totally emergent outcomes, chances are you have some sort of goal, some sort of business outcome you are tracking toward that is tied to a hard, somewhat pre-defined deliverable. Chances are you’ve sold something to some customer and now you have to make good on that commitment. These are not emergent outcomes, they are convergent… we are trying to manage the process to optimally hit some predefined target. Inspecting and adapting is about optimizing our chances of creating, not just any successful business outcome, but a specific business outcome. We can debate this business model all day long, but if that’s your reality, you need estimates.
But we have to remember, estimates are just that… estimates. We know they are wrong by their very nature. That said, I have personally coached teams that have gotten very good at relative estimation, stabilized their velocity, and become extremely reliable predicting 3-6 months out. Their estimates are actually pretty good. Here's the deal… estimates have to have ranges and probabilities. Assumptions have to be managed and risks have to be mitigated. We have to be able to measure how wrong our estimates are so that we can forecast that error forward. Is it possible that estimates aren't the problem, but the failure to manage the estimates is? We have to constantly adapt our plan to deal with our current reality… no matter how difficult that might be, no matter how much we might want the outcome to be different.
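"Forecasting the error forward" can be sketched concretely: measure how wrong past estimates were, then apply that observed range to what's left. This is a hedged illustration with invented data, not a prescription; a real team would use its own history of actuals versus estimates.

```python
# Ratio of actual effort to estimated effort for recently completed
# work items (hypothetical numbers for illustration).
error_ratios = [1.1, 0.9, 1.4, 1.2, 1.0, 1.3]

remaining_estimate = 200  # estimated points of backlog left (hypothetical)

# Project the remaining work as a range, not a single number.
low = remaining_estimate * min(error_ratios)
expected = remaining_estimate * (sum(error_ratios) / len(error_ratios))
high = remaining_estimate * max(error_ratios)

print(f"Forecast: {low:.0f} to {high:.0f} points, expected ~{expected:.0f}")
```

A single-point "200" becomes "somewhere between 180 and 280, probably around 230"… which is exactly the kind of range-and-probability conversation a manager can actually plan against.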
And therein may lie the real problem with bad estimates, and bad management in relation to bad estimates. We want our reality to be different, we need our reality to be different, our reality just HAS to be different… so we ignore the facts… we ignore what the data is telling us. We ignore the team’s actual performance against the estimate. There is so much competitive pressure to deliver, so much pressure to add those extra features by the end of next month so we can make the sale, so much pressure to deliver those features our sales team committed to win that big deal. So much pressure that we ignore reality. We promise against the most optimistic possible outcome. We have to have everything yesterday so we don’t buffer or plan for the unexpected. We don’t deal with reality when it presents itself.
The problem with estimating isn't that estimates are bad. We know they are bad going in. Most organizations I work with can at least get them somewhat in control and a little bit consistent. Having teams that stay together and establish a stable velocity levels out many of the remaining bumps and makes them a useful approximation. Given a known estimated backlog, we can manage deviation and change and keep the business informed about how things are going. All we are really looking for is a baseline to measure against. The real problem is that, far too often, we oversell the organization's ability to deliver. The real problem lies in creating a 'you have to deliver at all costs' culture that doesn't respect the team's established capacity to build working, tested software.
More often than not, the root of this problem is selling features we don’t have in order to close business. Selling features on the chance that the team can deliver them on time, that’s where it all starts to break down. If we only sold what was available, or on the near term product roadmap, I think we’d be having a very different conversation about the value of estimates. We could use them to give us a rough idea of what’s possible and use them as a management baseline to measure where we were against where we hoped to be. Bad estimates become a problem, bad management becomes a problem, in the face of unyielding pressure to deliver against unreasonable expectations and inflexible project schedules and commitments.
If your company is in a position where it has to sell features it doesn't have to stay viable, you are taking a huge gamble that you'll be able to deliver. Sometimes that gamble is the right thing to do… I'm not arguing that. But you have to realize the risk and have a strategy to deal with it. Hoping that the developer estimates are going to be accurate is false hope. Failing to have a strategy for dealing with estimates when they are wrong is bad management. Having a well-groomed backlog, keeping teams together over time, aligning teams with business units or products, and holding teams accountable for establishing a stable velocity will all help make estimates more accurate over time. But nothing is going to help if we can't deal with the reality of an oversubscribed product delivery team.
Written by Mike Cottmeyer Friday, 23 September 2011 03:42
I’d like to see if we can generate a little discussion around the idea of metrics, goals, and improvement.
Yesterday I was exploring with a client the notion of quarterly objectives. Specifically, we were discussing the kinds of metrics that would make sense in an agile environment to properly incentivize the kinds of outcomes we are really going for.
Out of that conversation came the best quote of the week, maybe the best quote of the year… "If you don't want people to game the numbers, don't make the numbers a game." I wish I could take credit for that one, but credit goes to the client.
So… here is the question. What are some ways to measure team performance without creating an incentive to game the system? What kinds of things do you guys measure and track that allow you to baseline, track, and reward authentic improvement?
I think it would be neat if we could get an inventory of ideas in the comments to this post. I'll add mine… you add yours… and let's see what we come up with. If you guys hurry, maybe I'll share your feedback with the folks at the Agile Coach Camp tomorrow!