PICList Thread
'[TECH] Agile programming (was Re: [P|C] Banksel)'
2009\01\28@225126 by Vitaliy

Rolf wrote:
> The best analogy I can think of is 'Agile Programming'... sure, it
> works, and it works well for some people, but it is not for everyone,
> and you can get programs that work just as well using other more
> traditional ways. Still, it has advantages and disadvantages.

I'm curious -- what is your experience with Agile? Can you address the
points you made, in more detail? Especially the "disadvantages" part?

Vitaliy

2009\01\29@002829 by Rolf

Vitaliy wrote:
> Rolf wrote:
>  
>> The best analogy I can think of is 'Agile Programming'... sure, it
>> works, and it works well for some people, but it is not for everyone,
>> and you can get programs that work just as well using other more
>> traditional ways. Still, it has advantages and disadvantages.
>>    
>
> I'm curious -- what is your experience with Agile? Can you address the
> points you made, in more detail? Especially the "disadvantages" part?
>
> Vitaliy
>
>  
Sure, I have experience with 'diluted' Agile development. I work in the
financial software industry (... while Alan B. Pearce was sending code
to the moon, I was feeding code to the financial crisis... ) and we
often get threatened with 'Agile Development'. In fact, in many ways we
do some form of Agile development, but it is more a cherry-picked
version where the company culture has developed in such a way that
certain practices are compatible with the Agile concept, but it is not
in any way a formal 'Agile' strategy.

I am solely responsible for certain functionality, and for other aspects
I work in a team as we together develop and maintain a single complex
middleware application. On a company level we deliver a complete suite
of financial risk management software components that interact/live
together. Further, I get the 'client side' gig when there are 'serious'
issues on a client site (related to the middleware). It's the sort of
role where I focus on one thing for a few months, littered with
smaller diversions into the other responsibilities. Our shop is small
enough to be able to get deep into niche areas of the code, yet big
enough that you have to be able to play along with other departments,
teams, etc.

Further, the company I work for likes to believe they have the best
programmers (and I like to agree ;-). We are given a lot of freedom to
do things 'our way', but the payback is that we have to be flexible,
multi-skilled, and be willing to re-prioritize things on a management
whim. I am one of many people in the company who have specialist/niche
technical and business knowledge, and a broader overlapping system
knowledge.

When we investigated agile programming we found that it was likely not
going to fit with our 'culture' because it would actually reduce
flexibility as we have to regularly switch from development to support
roles, interspersed with customer interaction and other diversions.

On the other hand, we have a couple of 'Agile' teams that were put
together to fulfill specific client functionality where there was no
existing support for that in the current business solutions. These Agile
teams were assembled and then sequestered from the rest of us so as not
to be distracted (actually, a couple of them play opera music to keep
the pressure down, and we kicked them out because we don't like opera...
;-). Still the one team is still going after 2 years, hardly 'Agile'.
And worse, the product they developed has won all sorts of awards, and
is well regarded in the financial industry, unfortunately it is a bugger
to integrate with the rest of the system. It will probably be
stand-alone for a while.

In the end I guess you could say the company has a split personality.
Development is scheduled from a business-functionality perspective (e.g.
we need to build a Credit Risk Assessment strategy to calculate Risk
Metrics on 'Toxic Mortgage Backed Assets'). The Finance Gurus will
build models of what calculations need to happen (about 30% of our
company has a PhD, and many have 2 - not bad for 'programmers'), and the
process will be broken down into what data needs to be available, etc.
From that point on a project manager will be assigned the
responsibility of ensuring the software can produce the correct metrics,
and will then start the battle of getting dozens of teams to schedule
time to actually ensure that the databases, middleware, reporting
frontends, calculation engines, etc. can actually process the data
coherently.

Agile development would put together a smallish team to work on the
whole thing from beginning to end with mini milestones, etc. This would
not work for us because we often need to get many dozens of people
co-ordinated to get the functionality available in many components that
need to be backward compatible for hundreds of other Risk-Management
processes.

Basically the scale of development, testing, integration, and regression
management preclude any one (small) team of people from having the time
to learn what they need to know to get the job done.

While I am sure that I am replaceable (the whole hit by a bus thing), I
also know that in the scheme of things, I have taken years to get to the
point where I know the financial models, the computing models, the data
models, and the development models well enough in our company to be able
to produce top-notch and cost effective software. While there are people
in the company that could replace me (and I them), a 'fresh' Agile team
would take years to gain the collective experience of a bunch of people
to get the task done, and not also run in to all sorts of compatibility,
regulatory and other issues.

An agile team will fail if it is required to disintegrate routinely to
work on unrelated maintenance tasks. That's basically why it would not
(does not) fly really well in the areas I work.

Still, there are times when we need to build (small) new programs that do
whizz-bang things, and we get to do the especially fun stuff of rapid
prototype development, and then it is close to 'Agile'.

For the record, we got a new 'chief' recently up top at work, and his
philosophy is that maintenance is not much fun, we should all be doing
agile type development, and we should outsource the maintenance to
Estonia. Personally I think it would be a mistake because maintenance
should be the responsibility of the developer if only to ensure that
they develop maintainable code ... ;-).

Rolf

2009\01\29@020638 by Nate Duehr


On Jan 28, 2009, at 10:27 PM, Rolf wrote:

> I am solely responsible for certain functionality, and for other  
> aspects

That is surprising in financial software.  You could add the famous  
"shave off all the rounded numbers into my Swiss bank account"  
function in your code, and no one would audit it?

We can't even do that in "answer the phone line" code in telco --
hearing that there's ever only ONE engineer in charge of a function in
financial code doesn't give me warm fuzzies.

(Yeah, I know they'd catch the above example, but you get the idea.)

> I work in a team as we together develop and maintain a single complex
> middleware application. On a company level we deliver a complete suite
> of financial risk management software components that interact/live

I guess it doesn't manage risk all that well -- or your customers all  
didn't lose 30% last year like everyone else?  Hahahaha...

Well, you did say you were "feeding code to the financial crisis", and  
everything I've heard is that the mortgage stuff was quite heavily  
affected by "computer models".  Not that I don't hold the dolts who  
BELIEVED the models responsible, mind you... but it's interesting stuff.

> Further, the company I work for likes to believe they have the best
> programmers (and I like to agree ;-). We are given a lot of freedom to

That right there says something.  The best rarely think they are.  :-)

> do things 'our way', but the payback is that we have to be flexible,
> multi-skilled, and be willing to re-prioritize things on a management
> whim. I am one of many people in the company who have specialist/niche
> technical and business knowledge, and a broader overlapping system
> knowledge.

If they're reprioritizing on a "management whim", the managers aren't  
very good and don't have very good goal-setting skills in the first  
place.  So I wouldn't put too much stock in that "best programmers"  
thing, since they're not very good managers to begin with.

> And worse, the product they developed has won all sorts of awards, and
> is well regarded in the financial industry, unfortunately it is a  
> bugger
> to integrate with the rest of the system. It will probably be
> stand-alone for a while.
>
> In the end I guess you could say the company has a split  personality.

One that writes award-winning software, and one that doesn't?  Heh heh.

{Quote hidden}

And meanwhile the bank using the software should have just turned the  
idiot with the nose-ring and his girlfriend wearing the "I'm with  
Stupid" T-shirt down for the loan... hah.

{Quote hidden}

Isn't your job as you described it -- as an individual -- already a  
"small team" within that bigger whole that gets those big projects  
done?  You do things you say the others can't, you have your own
deadlines and goals... things change, you adapt... sounds "Agile" to me!

{Quote hidden}

This is why (and I don't like this model, but it's still commonly  
done) many companies break up "Product Engineering", and "Continuation  
Engineering" into different departments that don't co-mingle.

{Quote hidden}

How about pushing for code that needs LESS maintenance from the  
start?  Write once, use.  :-)

(Everyone says it's impossible, but we've all seen small projects  
where it somehow worked out and the "thing" hasn't been touched for  
years, and still works.  I'm not a programmer by trade, but it seems  
to me like Agile is all about just scaling that "small win" type  
situation up into the building blocks for a bigger project, isn't it?)

It *is* possible to write almost perfect code, after all...

http://www.fastcompany.com/node/28121/print

:-)

Nate

2009\01\29@022006 by Vitaliy

"Rolf" wrote:
[snip]

Rolf, thank you for the background, it is very interesting.

What I know about Agile comes from an Agile "bootcamp" I attended about two
years ago, the books on Agile that I've read since, and (somewhat limited)
experience applying it at work. IANAE, but I do know a thing or two about
Agile, and it sounds like when you say "agile", you mean something else.


> When we investigated agile programming we found that it was likely not
> going to fit with our 'culture' because it would actually reduce
> flexibility as we have to regularly switch from development to support
> roles, interspersed with customer interaction and other diversions.

Switching between roles is not a problem for Agile. It is expected that
interruptions will occur, and an attempt is made to make the weekly/biweekly
estimates take them into account. Even combining part time/full time people
is not a problem.

Customer interaction is a must for Agile. Otherwise, you risk delivering a
product that does not meet the customer's needs.


> On the other hand, we have a couple of 'Agile' teams that were put
> together to fulfill specific client functionality where there was no
> existing support for that in the current business solutions. These Agile
> teams were assembled and then sequestered from the rest of us so as not
> to be distracted (actually, a couple of them play opera music to keep
> the pressure down, and we kicked them out because we don't like opera...
> ;-).

You say the Agile team was "sequestered" -- do you mean that before, you were
all in the same room? That's actually what agilists recommend: instead of
every programmer having their own office, they should work as close together
as possible, preferably in a "bullpen" environment. The goal is to reduce
the cost of interaction.


> Still the one team is still going after 2 years, hardly 'Agile'.

I believe this is one of many Agile myths. There's nothing inherent about
Agile that limits the duration or scope of the project.


> And worse, the product they developed has won all sorts of awards, and
> is well regarded in the financial industry,

This is consistent with a survey that DDJ did in 2007: teams using Agile
report a higher success rate than traditional teams.


> unfortunately it is a bugger
> to integrate with the rest of the system.

What do you think makes it so?


{Quote hidden}

What you describe sounds like the traditional waterfall model: you gather
the requirements, then you do the design (on paper), then you implement, and
the final step (not mentioned above) is to test.

Sounds good in theory (to some, to others not so [1]), but as far as I know,
the waterfall model never works in practice.

I'm going to make a few guesses about the way things work at your company.

- For example, I bet that the Finance Gurus don't simply build the models,
and throw them over the wall. They interact with the people who are actually
going to implement the models, right up to the end of the project.

- Also, you don't build the whole thing all at once, you build it in pieces,
and test each piece. Then you make more changes, and test the pieces again.

- You routinely revise your assumptions and requirements throughout the
duration of the project.

Am I right?

My point is (in case anyone missed it :) that waterfall development
methodology does not work, and that Agile tends to fit the  "natural" model
of how programmers work, better.


> Agile development would put together a smallish team to work on the
> whole thing from beginning to end with mini milestones, etc. This would
> not work for us because we often need to get many dozens of people
> co-ordinated to get the functionality available in many components that
> need to be backward compatible for hundreds of other Risk-Management
> processes.

- The first statement is another myth. Agile is scalable, even though best
results are achieved in small teams. I've read about projects which were
completed by dozens of geographically dispersed Agile teams (hundreds of
people). It is possible to have several "layers", where for example "product
owners", one from each team, have their own weekly meeting to coordinate the
higher-level efforts. The way it works is the same way you would naturally
break down a large project: by system, by component, by subcomponent, by
feature.

- "Milestones" are not an agile concept, it is something you hear about a
lot in waterfall-type projects. Agile has the concept of "iteration": a
short, fixed length of time, at the end of which you must have working
software. Working software, because "code does not lie". Documentation on
the progress of the project often does.


> Basically the scale of development, testing, integration, and regression
> management preclude any one (small) team of people from having the time
> to learn what they need to know to get the job done.

Imagine this: you have your team of, say, ten people working on a small
subset of functionality. There are several other teams that are working on
other subsets. You have short meetings (30 minutes to an hour) once every
two weeks to look at and prioritize the task list. Your team has a "product
owner" -- basically, a delegate from your team who meets with delegates from
the other teams to coordinate the higher level task list. If necessary,
this team of product owners can send its own delegate to yet another
higher level meeting.

You may say this sounds ridiculous, but consider that with a team size of 10,
the three levels can coordinate the work of 1000 programmers (10 x 10 x 10).
If your company only has 100 people, you only need one extra "layer". Most
small companies only need one.


> While there are people
> in the company that could replace me (and I them), a 'fresh' Agile team
> would take years to gain the collective experience of a bunch of people
> to get the task done, and not also run in to all sorts of compatibility,
> regulatory and other issues.

Your assumption is that you can't use existing people who share this
collective experience, to form Agile teams. Are there any reasons for this,
besides people's natural resistance to "new" ideas?


> An agile team will fail if it is required to disintegrate routinely to
> work on unrelated maintenance tasks. That's basically why it would not
> (does not) fly really well in the areas i work.

Has this been tried at your company? I don't understand why an agile team
would be different from any other team in this regard.


> Still, there are times when we need to build (small) new programs that do
> whizz-bang things, and we get to do the especially fun stuff of rapid
> prototype development, and then it is close to 'Agile'.

Size does not seem to matter, agile processes still turn out to be more
efficient.


> For the record, we got a new 'chief' recently up top at work, and his
> philosophy is that maintenance is not much fun, we should all be doing
> agile type development, and we should outsource the maintenance to
> Estonia. Personally I think it would be a mistake because maintenance
> should be the responsibility of the developer if only to ensure that
> they develop maintainable code ... ;-).

I agree with you: you have to eat your own dog food, not feed the Estonians.
I would further suggest that your company should adopt the Test Driven
Development (TDD) approach (another "Agile" thing).

With TDD, you create a scaffolding around your code, so that if you break
something while fixing something else, you know it right away. TDD people
say that you're supposed to write the test first, run the program to make
sure the test fails, then write the code that the test will test. Not too
long ago Timothy J Weber and I talked about ways to make it work for
embedded development:

[EE] Test driven design for embedded applications
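
To make the cycle concrete, here's a minimal sketch in plain C (ASSERT_EQ
and saturating_add() are made-up names purely for illustration, not from
any particular test framework):

#include <stdio.h>

/* Tiny hand-rolled harness: count failing checks. */
static int failures = 0;
#define ASSERT_EQ(expected, actual)                                        \
    do {                                                                   \
        if ((long)(expected) != (long)(actual)) {                          \
            printf("FAIL %s:%d: expected %ld, got %ld\n",                  \
                   __FILE__, __LINE__, (long)(expected), (long)(actual));  \
            failures++;                                                    \
        }                                                                  \
    } while (0)

/* Step 2: the code under test -- written only after the test below
   existed and failed (against a stub that returned 0). */
static unsigned char saturating_add(unsigned char a, unsigned char b)
{
    unsigned int sum = (unsigned int)a + (unsigned int)b;
    return (sum > 255u) ? 255u : (unsigned char)sum;
}

/* Step 1: the test, written first. */
static void test_saturating_add(void)
{
    ASSERT_EQ(30,  saturating_add(10, 20));   /* normal case          */
    ASSERT_EQ(255, saturating_add(250, 10));  /* must clamp, not wrap */
    ASSERT_EQ(0,   saturating_add(0, 0));     /* boundary             */
}

int main(void)
{
    test_saturating_add();
    if (failures)
        printf("%d check(s) failed\n", failures);
    else
        printf("all checks passed\n");
    return failures ? 1 : 0;
}

The point is the order: the test exists (and fails) before the real
function does, and it keeps running on every build afterwards as a cheap
regression check.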

I'll admit I haven't tried it yet, although recent bug reports give me
plenty of reasons why I should. :)


In conclusion, if you haven't really looked at Agile, you definitely should.
It can save you tons of wasted effort, and help you produce a quality
product with features that your customers actually find useful.

Vitaliy



[1]  http://en.wikipedia.org/wiki/Agile_software_development#cite_note-0

Gerald M. Weinberg: "We were doing incremental development as early as
1957, in Los Angeles, under the direction of Bernie Dimsdale [at IBM's
Service Bureau Corporation]. He was a colleague of John von Neumann, so
perhaps he learned it there, or assumed it as totally natural. I do remember
Herb Jacobs (primarily, though we all participated) developing a large
simulation for Motorola, where the technique used was, as far as I can tell,
indistinguishable from XP. [. . .] All of us, as far as I can remember,
thought waterfalling of a huge project was rather stupid, or at least
ignorant of the realities. I think what the waterfall description did for us
was make us realize that we were doing something else, something unnamed
except for 'software development'." Quoted in Larman, Craig; Basili, Victor R.
(June 2003). "Iterative and Incremental Development: A Brief History" (PDF),
Computer 36 (6): 47-56.

2009\01\29@022725 by Vitaliy

"Nate Duehr" wrote:
> It *is* possible to write almost perfect code, after all...
>
> http://www.fastcompany.com/node/28121/print
>
> :-)

This has been brought up once before... :)

Unless you have unlimited funds and no deadlines, DO NOT follow the Space
Shuttle's team approach. They spend 99% of their time on bureaucracy, and 1%
on writing code. I believe the first time someone referenced this article, I
did a calculation on the cost in $ per line of code, and it was
astronomical*.

Vitaliy

*Which makes sense: after all, they work for NASA...

2009\01\29@024020 by Nate Duehr


On Jan 29, 2009, at 12:18 AM, Vitaliy wrote:

> You say the Agile team was "sequestered", do you mean that before  
> you were
> all in the same room? That's actually what agilists recommend:  
> instead of
> every programmer having their own office, they should work as close  
> together
> as possible, preferably in a "bullpen" environment. The goal is to  
> reduce
> the cost of interaction.

There's at least one software business owner who wholeheartedly  
disagrees with the "bullpen" approach.  Have you read "Joel on  
Software"?

He posted this to his blog on January 13th as supporting evidence:

http://www.joelonsoftware.com/items/2009/01/13.html

I'm in "tech support" and since about 1994 have worked in "bullpen"  
environments.  I've adapted, but I dislike them.

I'm not as productive listening to all the crap that's not my  
responsibility going on around me, so I've developed a very good  
ability to ignore everything.  Sometimes when co-workers walk up and  
just want to chat, they marvel at the fact that I don't even know (nor  
care) that they're behind me, even if they address me.   My DSP filter  
in my brain keeps them below consciousness level unless it senses that  
they are asking something work related.  It also perks me up if I hear  
similar bugs/problems to new ones I'm working coming from a nearby desk.

But otherwise, I'd rather have a closed door and quiet.  I'd be more  
productive.  I'm ALWAYS more productive when weather or other  
circumstances have me working from (a quiet) home.  (If my wife's  
home, my productivity goes in the toilet, unless I excuse myself and  
go to another room on another floor of the house.)

And back waaaay back when I worked for Texaco, I had one... a desk and  
a door that could be closed.  I miss that.  I was just a peon "college  
summer hire" and I had a real office.

Interestingly, I started noticing how often the modular furniture is  
re-arranged and talked a bit with the facilities people about how much  
that costs to bring in electricians, re-wire the data cabling, etc.    
Not long after, I read an article that debunked the myth that modular  
furniture is "cheaper", since managers always seem to want to re-
arrange it every year or so.  Drywall and doors and a floor plan that  
doesn't move, is actually cheaper in the long run.  I then talked  
about this with facilities and they agreed... no moves/changes would  
have saved our company tens of thousands a year...

Oh well.  Sometimes common sense isn't so common, I guess.

The last time we had to move the modular furniture, they made it  
taller -- now we regularly see people doing dangerous and stupid  
things to talk to co-workers on the other side.  I just dial their  
extension and if they don't answer, leave a message.

The most AWESOME piece of equipment I have is my wireless headset.  We  
all have them, they don't seem to ever interfere with one another,  
they cost over $300 a pop, and they're the best thing for a tech  
support team since sliced bread.  I will probably buy my own for work
if I ever leave the company -- it's that useful.   Any tech support
manager who balks when their staff asks for them because of the cost,  
should have their head examined.  I'd rather have a slower PC, or any  
number of other things taken away, before I would drop the cordless  
headset from GNI Netcom.

Nate

2009\01\29@025438 by Nate Duehr


On Jan 29, 2009, at 12:26 AM, Vitaliy wrote:

{Quote hidden}

I didn't say it was cheap.  I said the code worked.

If companies REALLY knew how much it would cost -- including all  
bugfixes and later releases and patches -- to do many of the things we  
do with computers today, when they STARTED the project -- they'd have  
kept the paper, pens, and filing cabinets.  Seriously.

Every trouble ticketing system I've EVER seen deployed, for example,  
is over-budget, missing critical features that then require paper or e-
mail workarounds, and never hits the mark 100%.

The cleanest and simplest system I've ever seen used was  
RequestTracker from BestPractical Software.  Their system keeps the  
crap down to a minimum, doesn't try to integrate to 20 other modules  
that run half the company, and has EMAIL INTEGRATION built in,  
something the $500K Siebel deployment at work doesn't even do.  And it
takes a FLEET of people to maintain Siebel... I ran my own RT system
from a Pentium 3 in my basement for 10 years, for a group who accessed
it world-wide, and then migrated it to a VPS in Dallas where it still
runs for that same group today.

That was a ticket system that didn't GET IN MY WAY as a support guy.
Everything else I've used did get in the way, and made the whole job
of tracking customers' issues harder than necessary.

I can do (and have done) a better job of tracking customer issues
currently being worked with a small whiteboard and a notebook carried
in my pocket.

Trouble ticketing software and projects are an utter nightmare.  They  
almost NEVER ask the "customer" (the techs) what information they want  
to see displayed, what information they can get readily from a  
customer, and what information they don't care about.  Just a little  
UI work would go a long way -- but it's usually the "Business  
Development" group, or some Tiger Team of project managers and people  
who've never done tech support who set the screen layout for the large  
company systems I've worked on.

My favorite thing about our current system at work is that it causes  
pop-ups for every text entry box in a browser-based system.  You know  
how slow that is?  Incredibly stupid software design.  And like I  
said, I've heard it cost ... in total... about 1/2 a million bucks.

Probably anyone on the list could write a better web-based system,  
even if they're not coders (me included).  But I hear the CEO's  
neighbor runs or is otherwise in the higher ranks of Siebel.

(And now you know how real software decisions get made...)

Even if it's not true -- some salesperson got to him and said Siebel  
was the way to go... and that salesperson was wrong.

So yeah, many companies DO seem to have unlimited budgets and  
unlimited time to screw around with internal systems -- so why not do  
them right?

(And as a disclaimer, I'm not complaining about my employer really --  
I haven't seen a ticket system that worked right in five company name/
management changes, and four other companies, in my career.  But I've  
also never seen a company ASK the techs about any of it.  The best I  
saw was a single tech, from a single location, sat on a "multi-
functional" team and had one vote, versus 9 or 10 others, in one  
company.  That system sucked too.)

Ironically, ticket systems are SIMPLE things.  RT proves it.  I can  
work and notate and complete five tickets with RT in the same amount  
of time as I can do it in Siebel.  The reason?  The e-mail  
integration.  I can send an EMAIL to add a note to a ticket, OR to  
have the system send my comments back to the customer and LOG them.  
Or if I want to see the full history or whatever, I can pop open a web  
browser.  But e-mail's always open in a tech support department  
anyway... so... might as well use it!

Nate

2009\01\29@032830 by Tony Vandiver

>
> Unless you have unlimited funds and no deadlines, DO NOT follow the Space
> Shuttle's team approach. They spend 99% of their time on bureaucracy, and 1%
> on writing code. I believe the first time someone referenced this article, I
> did a calculation on the cost in $ per line of code, and it was
> astronomical*.
>
> Vitaliy
>
> *Which makes sense: after all, they work for NASA...
>  
and the article is a little skewed when it mentions that a commercial
piece of software of the same size would have 5000 errors in it.  If
there are 420,000 lines of code, that's an error every 84 lines for
commercial code.  Can you imagine how much sleep you'd get if you
thought your software that made it to production had total_lines/84
errors in it?  I can imagine telling my customers that "in order for
your application to be bug free", I'm going to need to hire 10 more
programmers full-time to maintain your 10,000 lines of code.  You don't
mind an extra $1M per year to support a product that grosses $0.5M per
year do you?  Or imagine getting them to maintain this contract even
after there has been a bug discovered (and consistently discovered year
after year as the article mentions albeit at a slower pace).  I
especially like the guy that leaves to the real world and comes back
complaining that the customer is always wrong.  Don't get me wrong, I'm
glad there are guys with this mentality working on things that could
easily screw up and kill someone, but I wouldn't do it.  I'd be the guy
on the launch pad trying to see if I could rig the system to a 16 bit
processor to make it simpler so that I could control the whole thing
with an RC airplane remote.  Hey guys, what happens if we just forget
about all this timing crap and just try to react to real world inputs to
make the system stable.  We'll start by sending a 1/10 scale model to
the moon at 1/100 the cost, and when we've done that till we're sick of
it, we'll start scaling up.  Now, where's my diet Dr. Pepper.  Somebody
order a pizza, it's going to be a long night.

Tony




2009\01\29@092008 by Rolf



Nate Duehr wrote lots of things.... :
> On Jan 28, 2009, at 10:27 PM, Rolf wrote:
>
>  

Just as an over-all response to your mail:

1. the software we write does not do anything transactional (i.e. our
software cannot change dollars and cents; there is no risk that anyone
could program our software to embezzle).
2. financial code is full of bugs. You should not have warm and fuzzies
for even the best financial institutions. Always check your statements,
etc. Still, the risk in finance of software errors causing you losses is
insignificant compared to direct human manipulation. 99.9% (a number I
have just invented) of money losses from clients/investors in financial
institutions will be related to errors other than software. Further,
software errors are typically discovered and fixed and the financial
consequence corrected. Human errors are seldom dealt with in such a
manner because your contract with the financial institution will have a
disclaimer about that....
3. Even though I say we write risk management software it does not mean
that our programs manage risk... our programs provide risk-management
data to risk-management people. These people understand the models we
use to calculate risk metrics, and they understand the flaws in the
models we use. The whole idea of financial modeling is that you simplify
the real-world in ways that make the problem more easy to understand and
quantify. The user of these models has to be aware of where the model is
deficient, etc. This is something you should be aware of in any
engineering task. Still, for the record, our clients have fared
relatively well in the past. Then again, people who are willing to spend
millions on licensing fees for our software are typically risk averse
anyway and likely would make similar decisions without our software.
Given that more than half of the world's largest banks use our software
and we have not lost a client yet, I guess our clients are survivors.
4. as for 'the best', well, it is a subjective thing. The company I work
for has earned the respect of the entire industry, and has a reputation
of excellence. I am a part of that. Take it for what it's worth.
5. as for management whims.... well, they are. When a client wants to
model a new type of financial instrument, and offers lots of money to
make it happen, the whims of managers can change. Suddenly that
performance problem you were engrossed in is not so important.
6. As for award winning software, the software we all write is
award-winning too. In fact, in most financial risk categories our suite
is acknowledged industry wide to be the best.
7. I guess all nose-ring wielding people are financially irresponsible?
What, African Americans too? How about tattooed people? That comment is
despicable! Your spouse needs that tee-shirt.
8. The whole concept of Agile development can be applied in bits and
pieces to any development. It is hard to determine what exactly makes a
development process agile or not. Not everything that looks/smells/feels
agile is. That is why I said I had experience with a diluted form of it.
In fact, my understanding of agile is as fuzzy as the concept itself.
9. I believe software development is like a marriage... the two
'partners' are the product and the developers. Between the two of those
you have to come up with a system that works, and, since every developer
is different, and every product is different, every 'marriage' will be
different too. Different things work for different 'marriages'. Agile
development is a process that can be used to model the 'marriage' on,
but it is not a one-size-fits-all thing. Still, there are many
similarities between one set of developers and another, and one product
and another. As a consequence, most 'marriages' end up looking, for the
most part, very similar.
10. Why not use agile for our stuff? Well, we have 25 years of
investment in our product, and parts of it are written in Cobol, C, C++, C#,
Java, web applications (asp, jsp, html, perl/CGI, javascript, ajax,
etc.), Windows GUI's, X GUI's, command-line, batch, interactive. The
biggest challenge is sometimes deciding whether to re-use a component or
to re-write. In most cases, re-using makes most sense.
11. The same types of processes and people that launched the shuttle also
launched the Mars Climate Orbiter:
http://en.wikipedia.org/wiki/Mars_Climate_Orbiter

Overall I found your response to be more emotional than I would have
expected. Is everything OK?

Rolf

{Quote hidden}

2009\01\29@102146 by Rolf

See comments inline....

Vitaliy wrote:
> "Rolf" wrote:
> [snip]
>
> Rolf, thank you for the background, it is very interesting.
>  
welcome ;-)
> What I know about Agile comes from an Agile "bootcamp" I attended about two
> years ago, the books on Agile that I've read since, and (somewhat limited)
> experience applying it at work. IANAE, but I do know a thing or two about
> Agile, and it sounds like when you say "agile", you mean something else.
>
>
>  
This is a common misconception about Agile; I know that I have
misconceptions of it as well. You can see that. I also know that "the
company" has investigated the concept and applied some of the concepts
in certain ways, but overall we have not become an agile development
software house.
{Quote hidden}

I'm not going to comment on the details of Agile computing - I don't
know them well enough...
{Quote hidden}

We have a mostly open-plan office: 6 floors of 'loft' style work areas
with a few offices on each floor for various people who routinely have
to deal with confidential matters (reviews, conference calls, etc.). The
one 'agile' project team moved into the corner conference room to all
be together, and we now insist they shut the door when they play opera!
They mostly stick to themselves, we mostly stick to ourselves....
>> Still the one team is still going after 2 years, hardly 'Agile'.
>>    
>
> I believe this is one of many Agile myths. There's nothing inherent about
> Agile that limits the duration or scope of the project.
>
>
>  
Point taken.
>> And worse, the product they developed has won all sorts of awards, and
>> is well regarded in the financial industry,
>>    
>
> This is consistent with a survey that DDJ did in 2007: teams using Agile
> report a higher success rate than traditional teams.
>
>  
our non-agile software process has also garnered many awards, and
continues to do so.
>  
>> unfortunately it is a bugger
>> to integrate with the rest of the system.
>>    
>
> What do you think makes it so?
>
>
>  
The rest of the suite has a common look and feel (return codes on batch
processes, command-line arguments, color schemes in GUI's, etc.), which
makes documentation/integration/education much easier. This particular
project uses different programming languages, different interfaces, and
different report formats. While it does its job fine, it is also
different enough to become problematic.
{Quote hidden}

As I say, I believe our development cycle has aspects of Agile in it,
but it has aspects of other methodologies as well, waterfall being part
of it, and there is rapid prototyping in places too.
> I'm going to make a few guesses about the way things work at your company.
>
> - For example, I bet that the Finance Gurus don't simply build the models,
> and throw them over the wall. They interact with the people who are actually
> going to implement the models, right up to the end of the project.
>
>  
Yes, but not for the reason you insinuate.
> - Also, you don't build the whole thing all at once, you build it in pieces,
> and test each piece. Then you make more changes, and test the pieces again.
>
>  
Partly, but subtly different enough for your characterization to be
misleading.
> - You routinely revise your assumptions and requirements throughout the
> duration of the project.
>
>  
Yes, but again your characterization is misleading.
> Am I right?
>
>  
Good try.
> My point is (in case anyone missed it :) that waterfall development
> methodology does not work, and that Agile tends to fit the  "natural" model
> of how programmers work, better.
>
>
>  
Sure, Agile may fit the 'natural' model of how programmers work, but try
to make a big bank look like a programmer!
{Quote hidden}

Fine, I'll not comment on the details of the Agile concept....
> - "Milestones" are not an agile concept, it is something you hear about a
> lot in waterfall-type projects. Agile has the concept of "iteration": a
> short, fixed length of time, at the end of which you must have working
> software. Working software, because "code does not lie". Documentation on
> the progress of the project often does.
>  
Fine, I'll not comment on the details of the Agile concept....

{Quote hidden}

Fine, I'll not comment on the details of the Agile concept.... I can see
the advantages to the above sort of approach.
{Quote hidden}

No, my assumption is that building an agile team with all the
expertise required to fulfill a typical business requirement of ours
would need too many people to form an effective Agile team. This
assumption may be wrong, but, doing a quick in-the-head analysis of
recent things I have worked on, they have required the specialist input
of many dozens of people, and since there are many concurrent projects
we all work on, we would all have to be part of dozens of Agile teams
to make things work.... and then you just end up with dozens of Agile
teams competing for access to the resources instead of a more orderly
scheduling/directing/management process. Again, my understanding of
Agile is fuzzy, so I may have just expressed many myths, but right now
I have about 6 distinct issues 'in the air' that each in the grand
scheme could be an Agile project.
>> An agile team will fail if it is required to disintegrate routinely to
>> work on unrelated maintenance tasks. That's basically why it would not
>> (does not) fly really well in the areas i work.
>>    
>
> Has this been tried at your company? I don't understand why an agile team
> would be different from any other team in this regard.
>
>
>  
As I understand it, an Agile team is dedicated to particular business
logic building. In some ways the company has tried it (with some
success), but in other ways, in my area of expertise, it has not been tried.
>> Still, there are times when we need to build (small) new programs that do
>> whizz-bang things, and we get to do the especially fun stuff of rapid
>> prototype development, and then it is close to 'Agile'.
>>    
>
> Size does not seem to matter, agile processes still turn out to be more
> efficient.
>
>
>  
Size matters. Typically when the client is involved it is a big project.
When the client is 'internal', the project is small. When we have a real
client we are less 'Agile'.
{Quote hidden}

We do that often (unit tests first, code to make the test pass, then
implement continuous regression testing so that it never later breaks,
etc.).
{Quote hidden}

Writing the tests first does not reduce bugs, it just means the bugs are
in the tests, *and* the code, and thus sometimes harder to find.
> In conclusion, if you haven't really looked at Agile, you definitely should.
> It can save you tons of wasted effort, and help you produce a quality
> product with features that your customers actually find useful.
>
> Vitaliy
>
>  
We have looked.... and learned some things, and discarded others.


I think, in conclusion, that the thing which makes Agile programming
hard for us is the concept of our client relationship. Take a typical
project....

1. an international regulatory body decides that financial institutions
need to be able to report certain risk metrics (this sort of thing
happens surprisingly often).
2. many of our clients are affected by this, and have a relatively
limited release schedule (couple of years) to get the numbers available.
3. our financial gurus investigate the regulation and discover various
models that could solve the issues (takes a few months). They then go to
the clients, regulators, and other experts and discuss the models
required, and what limits are reasonable on the expectations of the
regulations. Typically there are issues that need months to resolve,
some issues take longer, much longer, and some issues are never resolved.
4. the gurus then put together an overall plan of what data is
required, what data is not yet available, and what computational modeling
needs to be developed.
5. this gets assigned to a project manager who keeps tabs on the whole
exercise. They arrange for access to the resources they need (back
to the gurus, to business analysts who determine where the data should
come from, to testers who need to develop test data, test routines, and
other infrastructure, etc., to the programmers from multiple disciplines,
etc.). This is all really done at the management level, i.e. the project
manager will determine that development needs to happen on component X, and
ensure that the manager for the X component is aware of the
requirements.
6. there is a lot of back and forth as the component managers try to
shuffle their competing requirements to get a viable project plan with
the available resources, and schedule it all in the available timelines
without shifting existing projects, etc.
7. Development happens in earnest with various teams producing their
respective support at various times. There is often interdependence in
the components such that you have to wait for some functionality from
one component before you can return the favour to them with your
functionality. There is some guessing and imagination involved because
you are still waiting for some of the details to be ironed out at the
top design level.
8. Finally we get close to a completed product, and the gurus come back
and hammer away at the system to ensure that all the known functionality
is implemented, etc.
9. various strategies are used to validate the models from a number of
perspectives, including financial validation through regression testing,
unit testing, back testing (an art in itself), etc.
10. the software gets shipped to the clients, who put it into a test
environment for 6 months and run extreme testing of their own because,
fundamentally, they are responsible for the results...

Getting back to your earlier points where you guessed the way things
work at my work: requirements change over time, and regulations and
models that take years to develop can change too, with effects that are
significant in a lot of places. Time is limited to get things to the
client (in a 2 year delivery schedule, the first 6 months can be
requirement analysis, the next year can be development, testing, and
integration, and the last 6 months it has to be at the client and
working reliably). There is no time to wait for the requirements to
settle before starting the development cycle.

Basically, the way I see it is that our clients can not be 'Agile', and
this feeds back the whole way through the development cycle.

Rolf

2009\01\29@155921 by Nate Duehr

Hi Rolf,

As far as my being "emotional", I'm just getting tired of crappy software.
It happens after almost two decades in tech support, I suppose.  Looking at
leaving the industry in a few years -- just because it never really gets any
better.  I could go manage a support team, but I've been there, done that,
and would only enjoy it in certain-sized organizations.

By the time the bugs are worked out of something, it's scrapped and a whole
new thing full of bugs gets shipped to the customers.  Not picking on my
current employer -- this has been true at ALL of them.  Software is seen as
something that SHOULD be replaced regularly, like air filters -- not
critical infrastructure like highways and bridges to get somewhere.  This is
sad.

The bugs get old, since you THINK they hire top-notch engineers who have
plenty of experience and know better, but then in every product there are
stupid programming mistakes, pointers falling off the end of stacks, memory
allocation screw-ups, the usual.
years are "interesting", they're all the same old programmer screw-ups,
every year.  Different developers, all good people, release the same bugs.
You'd think someone would notice and find a way to stop those.

As I read more and more of the BS from the software industry giving
lip-service to actually trying to find ways to stop these things from
happening over and over again, the promised fixes never come to pass.
Companies find the
solution is "too expensive", "too time consuming", or worse, you get those
responses like I got here in the list yesterday from the engineer, "Boy,
programming like that just wouldn't be any fun!"

My favorite false-premise in software development is that of "code re-use".
To be honest, PICsters seem to actually do it more than most.  But in
companies?  Nah... you're paid to code... so you code... even if there was
already code to do X checked into the CVS tree somewhere.

Let me whip out the world's smallest violin for that guy who said highly
disciplined coding wouldn't be any fun, and play it.  It's not SUPPOSED to
be fun, it's supposed to be a PROFESSION, and the software is supposed to
get BETTER over time.  Not be the same old re-hashed errors.

Would it be better if the software releases just had new features, and
weren't also bug-fix "service packs"?  Yep.  But no one cares anymore...
customers, software vendors, doesn't matter... everyone's just "used to" it
all sucking most of the time.  

Developers rant and rave about the 80/20 rule, but in reality, people want
100% of the things in their software to WORK.  There's no "80" that use all
the same features.  Ask any tech support department if they want the 20%
with bugs, fixed... and how much pain and overtime and late nights that 20%
causes.  

The software industry is just so full of excuses these days, it's sad.
Imagine if you bought a car and it behaved as badly as most software does.
Or a house.  You'd go looking for a better product.  "I'm sorry sir, but
you're attempting to use the 20% of your car/house/whatever that most people
don't use.  I'll turn in a trouble ticket and if enough people attempt to
use that portion of the product, we might not "de-scope" the bug report out
of the next version in lieu of new features, since new features bring in
more revenue than fixing your problem."

Getting tired of saying that, but in politically correct ways.  Really
tired.  You bought something from us, but you're calling me telling me it
doesn't work, and I have no ability to fix it for you.

This is PARTICULARLY bad on one particular product I currently work on,
because it's being phased out.  Development of ANYTHING on it is literally
decided on a case by case, ticket by ticket, basis -- and "end of sale" has
already been announced.  It had potential to be rock-solid, but
Engineering/Product Management is moving on to other things.  There's no
direct replacement in the new product lines.  It's extra painful for the
support team.

So I apologize if I got emotional and challenged you a bit.  "Software never
works right the first time" is so ingrained in the culture now, that it'll
be virtually impossible to eradicate.

The only good news is?  It means both you and I have permanent jobs, so to
speak...

If the engineers at any company could put the tech support people at that
company out of business, they'd have accomplished something.  So far, every
company I've worked for has shown quarter-over-quarter growth in
people paying for service contracts.

I immensely enjoy reading http://joelonsoftware.com -- that guy seems to
"get it", but how many companies do?  

I can read that site for hours, and have tried his software, and it *is*
measurably better than most.  I also love his discussion of how his tech
support people are hired on a MANAGEMENT track, and required (at the
company's expense) to go get a Master's degree in technology management.
(And the perks for his developers are amazing, too -- but I'm not a
developer.)  The guy knows how to create his own culture and success, and
does it.  His company does all that and still turns a profit making
software.  Pretty impressive.

He speaks highly of Apple, as do I, too.  Their stuff "just works" most of
the time, for me anyway...

His article on shutting down a Windows machine:
http://www.joelonsoftware.com/items/2006/11/21.html

Contrasted with the fact that I just close the lid on my MacBook, and it
does what it's supposed to.  

If I need it "All the way off" I press the power button and have three
choices, the default one is "turn all the way the hell OFF".  Power button,
enter.  Done.  Or just close the lid, if sleep mode is appropriate.

I wish the rest of the software industry could figure that type of design
and elegance out.  

Have I run into Apple bugs?  Yes.  They're usually esoteric and weird, and
rarely found in the "80%" part of the code, and once reported -- they
usually DO get fixed.  It usually takes a while but they're gone in the next
major release.  Are they still infuriating?  Perhaps more-so.  You get used
to Apple's stuff "just working" and get more frustrated than usual when you
find a real bug in a feature you wanted to use.  :-)

Nate

{Original Message removed}

2009\01\29@215222 by iansmith

On Thu, 29 Jan 2009, Nate Duehr wrote:
> My favorite false-premise in software development is that of "code re-use".
> To be honest, PICsters seem to actually do it more than most.  But in
> companies?  Nah... you're paid to code... so you code... even if there was
> already code to do X checked into the CVS tree somewhere.

This I can somewhat forgive.  For me, and most programmers I know, writing
new code is WAY easier than trying to understand and use code that is
already written.  This includes code you wrote yourself!

Even with careful coding and good comments, you just can't always
duplicate that mental state of understanding a complex problem in the same
way.

However... I have to agree 100% that the software industry is awful.
Companies tend to go bankrupt when they fail to ship products, so they
ship crap knowing they will be better off than if they spent more time to
fix bugs.  There just really isn't any punishment for writing buggy
software like there is for designing a building that collapses.

Someone once said that if the car industry was run like the computer
industry, we would all be driving BMW's that got a thousand miles to the
gallon and cost $80 but would randomly explode after a few years, killing
everyone inside.

--
Ian Smith
http://www.ian.org

2009\01\30@035356 by Alan B. Pearce

>However... I have to agree 100% that the software industry
>is awful. Companies tend to go bankrupt when they fail to
>ship products, so they ship crap knowing they will be better
>off than if they spent more time to fix bugs.  There just
>really isn't any punishment for writing buggy software
>like there is for designing a building that collapses.

Trouble is they are able to, as the defects are hidden. It is not until you
attempt to do something useful that you find the software has problems, and
then you cannot be sure that it is not something you are doing in the way
you are using it, until you find that a heap of other people have the same
problem ...

>Someone once said that if the car industry was run like the
>computer industry, we would all be driving BMW's that got a
>thousand miles to the gallon and cost $80 but would randomly
>explode after a few years, killing everyone inside.

or would be supplied new with the fan missing from the heater, or only two
tyres, but 4 wheel rims, or ... some other defect that matches with the way
software seems to get shipped with important problems not fixed.

2009\01\30@084243 by Jake Anderson

Alan B. Pearce wrote:
{Quote hidden}

I have been trying to work out why it is we can't make reliable
software, but we can (generally) build a bridge that is going to last
100 years.
It comes down to margins. When you are building something you say, well,
this beam has a load of X, so I'll put a beam in that will take 2X as a
load without breaking. Then, down the track, when it turns out the builder
is using particularly heavy bricks and that beam sees 1.3X the load it was
meant to see, everything is still handled.

What I am yet to see is how you can apply "margin" to software design;
it's too "perfect" a system in many respects. If you miss one check to
see if a number is zero before doing a divide, you're hosed. Sure, you can
try and "handle" those kinds of errors, but that seems more like
propping up the building after the balcony has fallen off.
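
One small-scale analogue (purely an illustrative sketch in C; the names
and numbers are invented for the example) is to give every arithmetic
step a defined, safe behaviour for inputs that "should never happen",
so a bad value degrades the output instead of taking the system down:

#include <stdio.h>

/* Hypothetical example: scale a sensor reading by a calibration divisor.
 * The "margin" is that the function still returns a defined, safe answer
 * for a zero or negative divisor, rather than trusting every caller to
 * have checked first. */
static long scaled_reading(long raw, long divisor)
{
    if (divisor <= 0) {
        /* Out-of-spec input: fall back to a known-safe default and carry
         * on, the way an over-built beam carries an unexpectedly heavy
         * brick. */
        divisor = 1;
    }
    return raw / divisor;
}

int main(void)
{
    printf("%ld\n", scaled_reading(1000, 4)); /* normal case: 250            */
    printf("%ld\n", scaled_reading(1000, 0)); /* bad input: no crash, just a */
                                              /* degraded answer (1000)      */
    return 0;
}

It isn't a 2X structural margin, of course -- more like designing the
failure mode up front instead of hoping it never happens.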

You can apply margin to things like speed and the like but the only
program/system I'm aware of that feels like it has that "2x" safety
factor was the lunar module guidance computer.
http://klabs.org/history/apollo_11_alarms/eyles_2004/eyles_2004.htm


Anybody have any thoughts on "margin" as it applies to software?

2009\01\30@100154 by Carl Denk

flavicon
face
>>> There just
>>> really isn't any punishment for writing buggy software
>>> like there is for designing a building that collapses.
>>>

Remember the Hyatt Regency in Kansas City, where the walkway
collapsed. The Structural Engineer of record (the one that affixed his
seal and signature to the drawings) was tried for manslaughter. And
there are others too.

> we can (generally) build a bridge that is going to last
> 100 years.

The I-35 bridge, and a few years ago I-90 in Eastern Ohio, had a similar
problem; fortunately it was caught when a sudden bump appeared
in the bridge deck a few years ago. In the early 1900's, the Quebec
Bridge across the St. Lawrence failed twice during construction, but it
is still in use today. The trend is to retire people early, where the
experience is. Bridge design and construction is today well defined;
except for some exceptional projects, following good design and
construction practices results in a good bridge. In some instances, in
particular in some foreign lands where well-established building codes
are not followed or corruption exists, it's a disaster waiting to
happen. :(

2009\01\30@100946 by Michael Rigby-Jones



> -----Original Message-----
> From: piclist-bounces@mit.edu [mailto:piclist-bounces@mit.edu] On Behalf
> Of iansmith
> Sent: 30 January 2009 02:52
> To: Microcontroller discussion list - Public.
> Subject: RE: [TECH] Agile programming (was Re: [P|C] Banksel)
>
> However... I have to agree 100% that the software industry is awful.
> Companies tend to go bankrupt when they fail to ship products, so they
> ship crap knowing they will be better off than if they spent more time to
> fix bugs.  There just really isn't any punishment for writing buggy
> software like there is for designing a building that collapses.
>
> Someone once said that if the car industry was run like the computer
> industry, we would all be driving BMW's that got a thousand miles to the
> gallon and cost $80 but would randomly explode after a few years, killing
> everyone inside.

A quick example: a colleague was considering buying a Jaguar XK, so we
were looking at a UK car review website (http://www.honestjohn.co.uk) where
you can check common faults and recalls.  I was slightly concerned to see
this one!

"April 2004: Recall because 6-speed automatic of latest cars can slip
into reverse at high speed due to computer software problem (per 'Auto
Express' 21-4-2004)"

Regards

Mike


2009\01\30@104833 by Vitaliy

"Rolf" wrote:
[after generous snipping]

[Rolf, please try to format your posts so that they're not all one long
block of text.. please]

> 8. The whole concept of Agile development can be applied in bits and
> pieces to any development.

True, but it is also true that when you combine these bits and pieces, you
get more of the sum.

> it is hard to determine what exactly makes a
> development process agile or not. Not everything that looks/smells/feels
> agile is. That is why I said I had experience with a diluted form of it.
> In fact, my understanding of agile is as fuzzy as the concept itself.

From where I stand, Agile is certainly not "fuzzy". After you've had
experience with it, you start to develop an intuitive understanding of
what is Agile and what is not -- and you would be able to explain why.


> 9. I believe software development is like a marriage...

I've heard many software analogies before ("building software", "growing
software", etc) but "marrying software" is a first! :-D


> 10. Why not use agile for our stuff? Well, we have 25 years of
> investment in our product, and it is written parts in Cobol, C, C++, C#,
> Java, web applications (asp, jsp, html, perl/CGI, javascript, ajax,
> etc.), Windows GUI's, X GUI's, command-line, batch, interactive. The
> biggest challenge is sometimes deciding whether to re-use a component or
> to re-write. In most cases, re-using makes most sense.

I fail to see your point. It doesn't matter what the project is, or the
language, or how many years you've spent on it, or any of the other things
you mentioned. The only thing that changes with Agile is how you approach
software development. Agile addresses two big fallacies of project
management: (1) that men and months are interchangeable, and (2) that
requirements never change.

Agile accepts the fact that software is written by people (for people), and
that change happens. Most of the other principles flow out of these two
realizations. Have you read the Agile Manifesto, and the twelve principles?

http://agilemanifesto.org/
http://agilemanifesto.org/principles.html

I can see people drawing biblical parallels already. :) However, if you want
to have a real discussion about Agile, these are the points you should be
addressing, not what you *think* Agile is. I'm starting to get tired of
explaining what Agile is *not*.


> 11. The same type of processes and people that launched the shuttle also
> launched the Mars Climate orbiter:
> http://en.wikipedia.org/wiki/Mars_Climate_Orbiter

Maybe with Agile, they would have launched it for less money, and ahead of
schedule?

Vitaliy

2009\01\30@110719 by Vitaliy

"Nate Duehr" wrote:
>> did a calculation on the cost in $ per line of code, and it was
>> astronomical*.
>
> I didn't say it was cheap.  I said the code worked.

In a free market, everything has a maximum price that the market will
bear -- the "break-even point". If software costs more than the users are
willing to pay, the project is a failure.


> If companies REALLY knew how much it would cost -- including all
> bugfixes and later releases and patches -- to do many of the things we
> do with computers today, when they STARTED the project -- they'd have
> kept the paper, pens, and filing cabinets.  Seriously.

Well, it's not that bad. As I mentioned, I worked in a call center, and we
used very crappy software. But at my company we do lots of things with
computers, and it's working pretty well. Most of the things would be very
hard to do without a computer, and some impossible.


> Every trouble ticketing system I've EVER seen deployed, for example,
> is over-budget, missing critical features that then require paper or e-
> mail workarounds, and never hits the mark 100%.

We've been using Kayako eSupport since 2005 at work, I've used it myself and
it works very well. Certainly beats email.


> Trouble ticketing software and projects are an utter nightmare.  They
> almost NEVER ask the "customer" (the techs) what information they want
> to see displayed, what information they can get readily from a
> customer, and what information they don't care about.  Just a little
> UI work would go a long way -- but it's usually the "Business
> Development" group, or some Tiger Team of project managers and people
> who've never done tech support who set the screen layout for the large
> company systems I've worked on.

When we were shopping for an inventory tracking and invoicing system, the
future end users attended every presentation, and ultimately had the
decision-making power. The CS and warehouse managers, who are themselves end
users, did the preliminary research. The software that we ended up buying is
not perfect, but it certainly improved the way we do things. And the staff
bought into it -- so there was no resistance to overcome.

Another example is the shopping cart. Our web developer did the research,
but there were several meetings with the end users to gather requirements,
and then a meeting where the shopping cart was presented and its features
checked against these requirements.


> My favorite thing about our current system at work is that it causes
> pop-ups for every text entry box in a browser-based system.  You know
> how slow that is?  Incredibly stupid software design.  And like I
> said, I've heard it cost ... in total... about 1/2 a million bucks.

At Getronics, we were running the support app over what looked like a Citrix
connection. Dialog boxes took several seconds to update, and you could not
see in real time what you were typing.


[snip]
> Ironically, ticket systems are SIMPLE things.  RT proves it.  I can
> work and notate and complete five tickets with RT in the same amount
> of time as I can do it in Siebel.  The reason?  The e-mail
> integration.  I can send an EMAIL to add a note to a ticket, OR to
> have the system send my comments back to the customer and LOG them.
> Or if I want to see the full history or whatever, I can pop open a web
> browser.  But e-mail's always open in a tech support department
> anyway... so... might as well use it!

eSupport captures emails, and converts them into tickets. They don't work in
parallel the way you're describing, however.

How does RT forward customer requests to email? Does each customer get
assigned to a particular tech?

Vitaliy

2009\01\30@110720 by Vitaliy

Correction:

> True, but it is also true that when you combine these bits and pieces, you
> get more of the sum.

Should read: "...more than the sum".

2009\01\30@130722 by Rolf

Vitaliy wrote:
> "Rolf" wrote:
> [after generous snipping]
>
> [Rolf, please try to format your posts so that they're not all one long
> block of text.. please]
>
>  
not quite sure what I did wrong, but anyways ....

[snip]
> From where I stand, Agile is certainly not "fuzzy". After you've had
> experience with it, you start to develop an intuitive understanding of
> what is Agile and what is not -- and you would be able to explain why.
>
>
>  
As I have said, I don't yet 'grok' the concept of Agile...
>> 9. I believe software development is like a marriage...
>>    
>
> I've heard many software analogies before ("building software", "growing
> software", etc) but "marrying software" is a first! :-D
>
>
>  
You're welcome.
>> 10. Why not use agile for our stuff? Well, we have 25 years of
>> investment in our product, and it is written parts in Cobol, C, C++, C#,
>> Java, web applications (asp, jsp, html, perl/CGI, javascript, ajax,
>> etc.), Windows GUI's, X GUI's, command-line, batch, interactive. The
>> biggest challenge is sometimes deciding whether to re-use a component or
>> to re-write. In most cases, re-using makes most sense.
>>    
>
> I fail to see your point.
maybe I just got lost in rambling... and there is no valid point.


snip
>
> I can see people drawing biblical parallels already. :) However, if you want
> to have a real discussion about Agile, these are the points you should be
> addressing, not what you *think* Agile is. I'm starting to get tired of
> explaining what Agile is *not*.
>
>
>  
I am not trying to advocate for or against Agile. You asked me how
things happen where I work. I tried to answer, trying also, within my
limited comprehension, to identify what I thought were or were not agile
traits. I am not particularly interested in learning more about Agile
right now (My brain is full of copyright and finance stuff at the
moment...), I was more trying to provide you with answers to your questions.
{Quote hidden}

All people make mistakes. Hopefully they are not too harmful, and
hopefully we learn something. Then hopefully we can move on.

Rolf


2009\01\30@160339 by Gerhard Fiedler

Vitaliy wrote:

> The only thing that changes with Agile, is how you approach software
> development. Agile addresses two big fallacies with project
> management: (1) men and months are interchangeable and (2)
> requirements never change.

I've been programming professionally for some 25 years now, on software
projects, hardware projects, and embedded firmware projects, and neither of
these has been a premise/fallacy of the management of any project I've
worked on. So this can't be something that "changes with Agile".

You need to have some experience with the many different ways projects
and people can be managed to know whether something is really new.

> Agile accepts the fact that software is written by people (for
> people), and that change happens.

Many other ways do that, too. No bible and no religion (not even an
agile one :) necessary for this...

Gerhard

2009\01\30@161826 by Gerhard Fiedler

Rolf wrote:

> Basically, the way I see it is that our clients can not be 'Agile',
> and this feeds back the whole way through the development cycle.

This is one thing that often gets lost in discussions around Agile: the
contract with the client, and its influence on the process.

Gerhard

2009\01\30@180833 by Nate Duehr

In RT you build groups of techs that would handle specific types of
problems, and can assign each "queue" its own e-mail address for direct
e-mail to ticket support.  All techs can be notified of new tickets in the
queue via e-mail, or you could just have techs monitoring the web interface
24/7, as desired.  (I like e-mail, but different company cultures might like
different things, or not like the "spam" of being e-mailed for everything...
you can crank up the e-mails or crank them down for every type of "event".)

In most companies, that would equate to a web page with "topics" that, when
clicked, would open your mailer via a mailto: link, I suppose.  The
system I built is a single-queue system -- we're all fixing stuff in the one
queue, since the load is light.

There can also be a "catch-all" queue, and a tech could easily reassign a
ticket in about three mouse clicks to another queue in a drop-down from that
"master" queue or send in commands via e-mail to reassign it.

Each queue can have automated status messages sent back to the customer
(again, up to the admin as to how many and at which "events") during the
process.  "Ticket created, here's your URL to watch the status, and we'll
continue to send you e-mails", "Ticket Taken/Assigned to X", "Ticket status
change", "Ticket Resovled, if you think we're looney, reply to this e-mail
and it'll automatically re-open."  

Stuff like that.  Pretty powerful, and easy enough to set up that it was
worth doing in my "free time" for a group almost 10 years ago, instead of the
mish-mash of e-mails to a group alias/distribution list we'd been using
prior to installing it.  It didn't require much in the way of hardware for a
low-volume queue, but it is Perl/MySQL, so it could get CPU- and RAM-
intensive with thousands and thousands of tickets.

The other pretty decent system I've seen used (at work, but limited to
Engineering tickets -- not the system the front-line techs use) is called
Jira.  I believe it's available openly, too.  It's quite a bit more complex
than RT, and has no e-mail integration of any kind, but it does integrate
with code repositories, which is an interesting feature: a bug tracker that
helps the coders find the source that got checked in for a given issue in
the source repository.  (I believe it supports multiple repositories in a
business environment, and multiple common types... CVS is still "spoken"
here, mostly... for better or for worse.  I'd prefer Subversion... or
something with a better "offline/road warrior" mode.)

Nate

{Original Message removed}

2009\01\31@010336 by Sean Breheny

I have read that article before and it is fascinating. What astounds
me is that the very sort of thing which seems to work very well for
NASA (managing everything obsessively down to the last detail) seems
NOT to work (and indeed to backfire horribly) when applied to nearly
every other endeavor, whether business or government.

For example, the article claims that the software team for the space
shuttle software keeps a record of every bug they have ever
encountered, every change to each line of code and why it was made,
and every little hanging detail. It seems to me that such a thing is
not humanly possible. If you try to do that, you usually bog down in
layer upon layer of "what-ifs" and concerns which are extremely low
probability but which distract from solving the ones which are far
more likely.

Anyone have any idea how they manage to make this work with actual,
fallible, human programmers and managers?

Sean


On Thu, Jan 29, 2009 at 2:06 AM, Nate Duehr <nate@natetech.com> wrote:
>
> It *is* possible to write almost perfect code, after all...
>
> http://www.fastcompany.com/node/28121/print
>
> :-)
>
> Nate
> -

2009\01\31@014342 by Vitaliy

"Sean Breheny" wrote:
>> It *is* possible to write almost perfect code, after all...
>>
>> http://www.fastcompany.com/node/28121/print
>
> Anyone have any idea how they manage to make this work with actual,
> fallible, human programmers and managers?

I have no idea how they get anything done if they really do things the way
they're described in the article. If your question is "why doesn't their
project fail, given the vast amounts of red tape and bureaucracy?", the
answer is: a virtually unlimited budget (money and time), and low enough
stress that team burnout is kept to a minimum (people don't leave in
droves, and knowledge is retained).

FWIW, I bet that not a single member of their team read the specs in their
entirety.

The feeling I get from reading about the Apollo project, is that back then,
things were done in a much more agile way. They broke down the project into
small steps, had a lot of "first drafts" and simulations, and let themselves
make plenty of mistakes. Notice that the Apollo project achieved all of its
goals, and is considered a resounding success. Meanwhile, the Space Shuttle
project did not achieve its objectives (whether they were realistic is
another subject), and is considered by many to be a huge waste of tax
dollars.

Vitaliy

2009\01\31@023348 by Vitaliy

"Rolf" wrote:
> We have a mostly open-plan office (6 floors of 'loft' style work areas
> with a few offices on each floor for various people who routinely have
> to deal with confidential matters (reviews, conference calls, etc.).

Sounds like a good arrangement.


> The
> one 'agile' project team moved in to the corner conference room to all
> be together, and we now insist they shut the door when they play opera!
> They mostly stick to themselves, we mostly stick to ourselves....

That's funny. :)


>> This is consistent with a survey that DDJ did in 2007: teams using Agile
>> report a higher success rate than traditional teams.
>>
> our non-agile software process has also garnered many awards, and
> continues to do so.

Right, the ratio was something like a 65%/75% success rate
(traditional/agile). There are other factors to consider, though -- the
quality of the end product, how well it meets customer needs -- which are
said to be in favor of Agile.

Scott Ambler writes the "Agile Edge" column for Dr. Dobb's Journal. If you
have access to the magazine, I would highly recommend reading his
articles.

{Quote hidden}

It sounds like the problems you describe have little to do with development
methodology.


[snip]
> Sure, Agile may fit the 'natural' model of how programmers work, but try
> to make a big bank look like a programmer!

I think the nature of the project is irrelevant. If you have a set of
practices that result in better software, why would it matter whether this
software is used to create financial models, control a spacecraft, or run a
website?


[snip]
{Quote hidden}

I think I understand where you're coming from. Let me ask you this: right
now, who decides which teams/people get access to the resources? How many
people direct/manage the various projects?

It seems to me that regardless of the development methodology, the amount of
available resources (time, people, equipment) remains the same. It is
possible to multitask in an Agile environment just as well as (or better
than) you would in a traditional environment.

However, regardless of the chosen project methodology, it is a good idea to
keep the number of concurrent projects to a minimum. There are two reasons
for this:

1. Every time you switch between projects, time gets wasted, because it
takes time to "get in the zone" and to remember where one left off.

2. Switching pushes the completion date of both projects into the future
(makes them both late). In a business environment, getting a product to
market sooner has great benefits. In other words, if you have a choice:

- Release Project A in three months, then release Project B three months
after that, or
- Switch b/w Proj A & B, and release both six months later,

The first choice is usually preferred. This is of course simplified, but in
general if you have many projects, doing them serially rather than in
parallel is usually more efficient.


> Writing the tests first does not reduce bugs, it just means the bugs are
> in the tests, *and* the code, and thus sometimes harder to find.

Hm, having limited experience w/ TDD, I'll try to abstain from arguing, but
it certainly sounds counter-intuitive.


> Getting back to your earlier points where you guessed the way things
> work at my work, well, requirements change over time, regulation and
> models that take years to develop can change too, and the effects can be
> significant in a lot of places, and time is limited to get things to the
> client (in a 2 year delivery schedule, the first 6 months can be
> requirement analysis, the next year can be development, testing,
> integration, and the last 6 months it has to be at the client and
> working reliably). There is no time to wait for the requirements to
> settle before starting the development cycle.

Agile would be perfect for you, then.

From the Manifesto: "Responding to change over following a plan."

From the Principles:

"We follow these principles:

[...]

"Welcome changing requirements, even late in
development. Agile processes harness change for
the customer's competitive advantage. "


> Basically, the way I see it is that our clients can not be 'Agile', and
> this feeds back the whole way through the development cycle.


It's actually one of the Agile FAQs: "How do you interact with the
customer when he has not bought into Agile?"  My answer is -- where there is
a will, there is a way. :) After all, if you show the customer the benefits
of the Agile approach (faster time to market, lower risk, better quality),
it is really a no-brainer. If you don't believe in it yourself, of course,
there is nothing to talk about.

I have no personal interest in converting you to Agile -- I'm not a
consultant, and you and I have never even met. I just think that the
principles are sound, that they do in fact work, and that they are therefore
at least worth considering. If it makes you and your company a little more
productive, everybody wins.

Vitaliy

2009\01\31@024829 by Vitaliy

"Gerhard Fiedler" wrote:
>> Basically, the way I see it is that our clients can not be 'Agile',
>> and this feeds back the whole way through the development cycle.
>
> This is one thing that often gets lost in discussions around Agile: the
> contract with the client, and its influence on the process.

Let me cite the Manifesto in its entirety:

---------------------
Manifesto for Agile Software Development

We are uncovering better ways of developing software by doing it and helping
others do it. Through this work we have come to value:

- Individuals and interactions over processes and tools
- Working software over comprehensive documentation
- Customer collaboration over contract negotiation
- Responding to change over following a plan

That is, while there is value in the items on the right, we value the items
on the left more.
---------------------

Did you catch this -- "Customer collaboration over contract negotiation"?

The benefits to the customer are legion. IMHO you just need to understand
them yourself, and be able to articulate them to get the customer's buy-in.

Vitaliy


2009\01\31@030530 by Vitaliy

"Gerhard Fiedler" wrote:
>> The only thing that changes with Agile, is how you approach software
>> development. Agile addresses two big fallacies with project
>> management: (1) men and months are interchangeable and (2)
>> requirements never change.
>
> I'm programming professionally for some 25 years now, on software
> projects, hardware projects, embedded firmware projects, and neither of
> these has been a premise/fallacy of the management of any project I've
> worked on. So this can't be something that "changes with Agile".
>
> You need to have some experience with the many different ways projects
> and people can be managed to know whether something is really new.

I never said that Agile was new. Many people have been doing things the
"Agile" way since the dawn of computing (1957, if you believe the guy).
However, the waterfall model (along with the Gantt chart) is still what they
teach in project management courses in college, and from time to time I hear
people advocating the "traditional" (fixed requirements, heaps of
documentation) way of programming -- including here on the PICList.

We agree on many things when it comes to software development (HLL vs
assembly, meaningful names vs comments, to name a couple). Perhaps you've
been an agilist all your life, and are blissfully unaware of the fact.
That's fine with me, I don't care what you call it or whether you even have
a name for it. :)


>> Agile accepts the fact that software is written by people (for
>> people), and that change happens.
>
> Many other ways do that, too. No bible and no religion (not even an
> agile one :) necessary for this...

Sure, but the prevailing wisdom is that you have to plan everything in
detail in advance, fix requirements at the beginning of the project, and put
provisions in the contract that deter the customer from making changes after
the project's been started. Am I wrong?

Vitaliy

2009\01\31@051629 by Gerhard Fiedler

Nate Duehr wrote:

> The other pretty decent system I've seen used (at work, but limited to
> Engineering tickets -- not the system the front line techs use) is
> called Jira.  I believe it's available openly, too.  

Jira is good, but AFAIK commercial license only. Mantis is also pretty
good, and open source.

Gerhard

2009\01\31@065429 by Gerhard Fiedler

Vitaliy wrote:

>>> The only thing that changes with Agile, is how you approach software
>>> development. Agile addresses two big fallacies with project
>>> management: (1) men and months are interchangeable and (2)
>>> requirements never change.
>>
>> [...] neither of these has been a premise/fallacy of the management
>> of any project I've worked on. So this can't be something that
>> "changes with Agile".
>
> I never said that Agile was new.

I didn't say you did. You said that something changed with Agile (note
the capital A). According to the Agile Manifesto, it emerged in 2001.
Many people did what you described above before that date, so it can't
have changed with Agile -- it was already there. Redefining what was
done before the Agile Manifesto as Agile doesn't help anybody. Maybe
what those people did is among the roots of Agile, but it's not a
consequence of it.

Programmers never had a problem with requirements change. Heck,
programmers like to change the requirements as they program. The problem
always was with the question who pays for the changed requirements --
and the Agile Manifesto is suspiciously quiet about this :)


> However, the waterfall model (along with the Gantt chart) is still
> what they teach in project management courses in college, and from
> time to time I hear people advocating the "traditional" (fixed
> requirements, heaps of documentation) way of programming -- including
> here on the PICList.

Gantt charts are a tool that can make interdependencies visible before
they catch you, and this is something that can be useful in certain
situations. And it's not something that is opposed to an Agile approach;
they can work together. If you have dependencies, you better consider
them as early as possible. Part of my manifesto: Use your tools.

I don't care too much about what is taught in management courses in (US)
colleges -- I've been largely unaffected by them (and by the ones who
took them). It's also not really relevant for the issue. Part of my
manifesto: Don't take your education literally.

Fixed requirements are a fact of life. If you outsourced the development
of PC software to interface with one of your products, I bet there would
be some "fixed requirements". Speaking against "fixed requirements"
shows that you're not looking at the bigger picture. The problem is not
fixed requirements per se (they are just a fixed requirement :); the
problem is requirements trying to define what can't be defined ahead of
time -- which is a different issue. It's not a question of whether or
not to have fixed requirements, it's how to determine what the fixed
requirements are, and what is flexible. Part of my manifesto: "Something Is
Evil" smells like religion (and religion is not about solving
programming problems).

In order to understand some issues, you just need heaps of
documentation. I know that there are many people out there who never
read a manual of any of the things they own. I also know that there are
many programmers who never read a manual ("RTFM" is common in programmer
circles :). But I still think that "RTM" is the appropriate (and only
workable) solution for some cases -- and some complex requirements
require a lot of RTM, and heaps of documentation. Try getting a bunch of
programmers to write a decent transmission controller without reading
heaps of documentation. They'd probably iteratively wreck quite a number
of transmissions (and possibly cars, and maybe drivers) before they know
through iterative experience what they could've known by reading through
a few heaps of documentation :) Part of my manifesto: There's never too
much good documentation.

As for the waterfall model... maybe because I never worked in big
companies with "professional managers" I never had to suffer "management
abuse". But then, this is a personal choice, and maybe the ones who
complain about it have the same choice... Part of my manifesto: If you
don't like what you're doing, the problem is not with whatever it is
you're doing, the problem is with you doing it.

> Sure, but the prevailing wisdom is that you have to plan everything in
> detail in advance, fix requirements at the beginning of the project,
> and put provisions in the contract that deter the customer from
> making changes after the project's been started. Am I wrong?

Yes. It happened, in my career, but it wasn't prevailing.

Gerhard

2009\01\31@081221 by Gerhard Fiedler

Vitaliy wrote:

> The benefits to the customer are legion. IMHO you just need to
> understand them yourself, and be able to articulate them to get the
> customer's buy-in.

I'd like to see you selling programming services to customers, saying
something like: "In a month we expect you to give us $50k for the work
done up to then. At that point you will get something that 'works'. It
will do 'something', but what it is that it will do I can't disclose or
define up front, as this would be against our methodology. We will
listen to all your input, and I assure you that we will do our best to
consider it. Now trust me and sign here at the dotted line, please."

Come on... this works in situations where the customer hires per hour,
trusts the programmers (and their managers), and is willing to pay for
whatever he gets, trusting that it will be as much as possible in
alignment with what he wants. Which is a possible scenario, but not very
common.

A more common scenario is a fixed price deal. In a fixed price deal, you
can of course work inside the deal with Agile, but you can't really let
the customer dictate the outcome as you go. Ever heard of featuritis? If
you don't put contractual brakes on the cart, the customer will usually
want more. You show him one prototype, he'll think of a dozen additional
things he'd like to see. You show him the next prototype (that features
a first draft of five of them), and boom -- there is another dozen new
features he'd like to have and hasn't thought of before. You see where
this goes without contractual definitions of what it is that you have to
deliver to get your money. You won't ever get your money if you don't
limit the deliverable -- which is a requirements document that defines
the deliverable.

The contractual conversations are quite often about finding the right
balance between what to do and what to pay. Many clients want to know up
front what they will get for their money -- and there you have the
requirements. Also, often there is the case that if they can't get to
point X with their budget, it doesn't make sense to start at all, so
they have to make sure that they can get at least to point X if they
don't want to waste money. Which has a definition, aka requirements.

There is/was another thread where we talked about incentives and how
important they are for the outcome. The programmers need to have an
incentive to get where the customer wants to go. One of the most common
incentives is payment. If this is not linked to some definition of the
desired result, it can become a very weak incentive. "Trust them to get
the job done"... hm. Can work, but can also go (expensively) wrong,
especially if it's not you who is hiring and managing (motivating) them.

There is also the idea of "working software", mentioned several times in
the Agile Manifesto. What does "working" mean? There needs to be some
definition, right? Wouldn't that be a requirement? And wouldn't it have
to be established up front? (With "up front" I don't necessarily mean
"fixed at the beginning of everything", but "at the beginning of an
iteration". If what "working" means is not at all clear at the beginning
of an iteration, I doubt that the iteration will be all-around called a
success. Which then is suspiciously close to a milestone... :)

Don't get me wrong... I don't think the principles of the Agile
Manifesto are wrong. But they are not /everything/ -- and they are not
that new. Maybe they were in certain circles before 2001, but they
weren't in others. They are a collection of some common sense principles
and unproven (and unprovable) axioms, nicely worded. It also seems to me
that it is targeted at a specific subset of programming settings; it is
easy for me to come up with situations where some of the principles
simply don't apply. Maybe the creators just didn't consider these
situations, because they were outside their target or experience. But
that doesn't make them any less real. There are also scenarios where one
principle contradicts another. So it's all not that easy.

Gerhard

2009\01\31@082702 by Gerhard Fiedler

Vitaliy wrote:

>> Sure, Agile may fit the 'natural' model of how programmers work, but
>> try to make a big bank look like a programmer!
>
> I think the nature of the project is irrelevant. If you have a set of
> practices that result in better software, why would it matter whether
> this software is used to create financial models, control a
> spacecraft, or run a website?

IMO this is exactly one of the major management fallacies. The nature of
the project does matter, the nature of the client does matter, the
nature of (each of) the programmers does matter... heck, the nature of
the manager does matter. It all matters, there is no silver bullet. The
most productive teams (IME) are not the ones that follow an ideology, no
matter how good the ideology, but the ones that efficiently adapt to the
nature of everything involved.

The set of practices that works well in one case may not (and probably
will not) work well in another case. Priorities are different,
constraints are different, goals are different... that's just how it is.

I'm not sure, but the way I read the principles in the Agile Manifesto,
this is something you could read into it: that the nature of the project
/is/ relevant, and needs to be considered when creating the practices
for that project.

>> Writing the tests first does not reduce bugs, it just means the bugs
>> are in the tests, *and* the code, and thus sometimes harder to find.
>
> Hm, having limited experience w/ TDD, I'll try to abstain from
> arguing, but it certainly sounds counter-intuitive.

It isn't that counter-intuitive when you consider that for non-trivial
programs, the tests are also non-trivial -- and often more complex than
the target program itself. So if you expect the target program to have
bugs (and that's exactly why you write the test program), you can expect
the test program to have /more/ bugs.

To get out of this recursion, you need to have a means to write target
programs without writing test programs first. Otherwise, you'd first
need a test program for the test program, and a test program for that
test program, and so on... :) And if you have a means to write a program
without writing a test program first... why not use it in the first
place?

Mind you, I'm not against writing test programs before the target
program. But this method has its quirks, too.

Gerhard

2009\01\31@115605 by Nate Duehr


On Jan 31, 2009, at 12:46 AM, Vitaliy wrote:

> Did you catch this -- "Customer collaboration over contract  
> negotiation"?

As a non-programmer, but someone who's been involved in business
purchase decisions, this seems to me to be Agile's Achilles' heel.

Customers want to know what they're entitled to before they pay.  If  
they start paying a company that's "Agile" to "collaborate" and the  
collaboration slows down or becomes bogged down in some set of  
details, what recourse do they have?   In the more traditional RFP/
Requirements/Then Build environment, if the software doesn't meet the  
requirements, they have legal recourse to sue.

If you've convinced them that they can just "collaborate" with your
team, and then -- say -- you decide you want to downsize the team
(effectively making delivery dates stretch out longer), what can they
do about it if they've been paying and collaborating all along?

Agile seems to be idealistic about software relationships between  
companies being a "forever" thing.  That only works (in business) in  
giant companies that are never going to restructure and have plenty of  
customers buying "products" from the Agile team.

Additionally, how do you handle it when one customer wants to  
completely change what the software does or how to interact with it,  
and the majority of customers want it another way?  Agile never  
addresses that.  They assume "all customers are equal".  We all know  
they're not.  Change this example to "your largest customer who brings  
in 25% of your revenue" and it starts to become very difficult to  
smash Agile development up against the cold reality of customer wants/
desires, doesn't it?

Maybe someone on an Agile team that has lots of customers can  
explain.  Do you just end up having to sales-pitch the smaller  
customers into liking what the largest "collaborators" want?

Nate


'[TECH] Agile programming (was Re: [P|C] Banksel)'
2009\02\02@022916 by Vitaliy
"Gerhard Fiedler" wrote:
>> I never said that Agile was new.
>
> I didn't say you did. You said that something changed with Agile (note
> the capital A).

For me personally, and for the way we do things at work, things changed
when we learned about Agile and started implementing some of the concepts.


> According to the Agile Manifesto, it emerged in 2001.
> Many people did what you described above before that date, so it can't
> have changed with Agile -- it was already there. Redefining what was
> done before the Agile Manifesto as Agile doesn't help anybody. Maybe
> what those people did is among the roots of Agile, but it's not a
> consequence of it.

Well, that's the point: most of what the agile methodologies are about was
known for a long, long time. Over time, the patterns emerged, were used,
and were written about by different people. These same people also realized
that the traditional approaches to programming are based on false
assumptions. So these folks got together to share their ideas, and came up
with the Agile Manifesto.

I'm usually the first one to smirk when people appeal to authority, but in
fact these guys are industry leaders; they have been in the business of
software development for a long time, and I therefore think that one should
at least consider what they have to say.


> Programmers never had a problem with requirements change.

Gerhard, I don't know how you can say this, considering your experience.

Changing requirements is the #1 complaint I get from new engineers (after a
while, they get used to how we do things). "But last month we decided..."
Who cares what we decided last month. We're not slaves to our past
decisions. The situation has changed, we learned something new, Joe
discovered a better way to implement the feature.


> Heck,
> programmers like to change the requirements as they program.

Not around here they don't. :) I thought it was common knowledge that
programmers are supposed to hate change.


> The problem
> always was with the question who pays for the changed requirements --
> and the Agile Manifesto is suspiciously quiet about this :)

The goal of software development is not to avoid expenses that result from
changing requirements. The goal of software development is to produce
business value.


{Quote hidden}

I can only tell you what I know from experience, and my personal experience
is that Gantt charts are a waste of time. Also, I regularly visit a local
university, and we've been sponsoring student teams for almost a year now.
The story is the same every time: the students put together an elaborate
Gantt chart, but end up throwing it out the window once the schedule slips
and/or requirements change.

Have you used Gantt charts for your projects? Was your experience different?


> I don't care too much about what is taught in management courses in (US)
> colleges -- I've been largely unaffected by them (and by the ones who
> took them). It's also not really relevant for the issue. Part of my
> manifesto: Don't take your education literally.

The material point is that the traditional waterfall approach is considered
to be the industry standard. This is what they teach future engineers.


> Fixed requirements are a fact of life. If you outsourced the development
> of a PC software to interface to one of your products, I bet there would
> be some "fixed requirements".

We've tried outsourcing a couple of times, providing heaps of documentation
and demanding strict adherence to the requirements. None of the attempts was
successful.

If we were to try outsourcing again, it would definitely follow an agile
approach.


> Speaking against "fixed requirements"
> shows that you're not looking at the bigger picture. The problem are not
> fixed requirements per se (they are just a fixed requirement :), the
> problem are requirements trying to define what can't be defined ahead of
> time -- which is a different issue.

I agree. Which leads me to believe that you misunderstood what I was talking
about. :-)


[snip]
{Quote hidden}

There is a difference b/w project documentation, and user documentation (see
one of my previous posts). I'm all for user documentation.


> As for the waterfall model... maybe because I never worked in big
> companies with "professional managers" I never had to suffer "management
> abuse". But then, this is a personal choice, and maybe the ones who
> complain about it have the same choice... Part of my manifesto: If you
> don't like what you're doing, the problem is not with whatever it is
> you're doing, the problem is with you doing it.

I like your manifesto.


>> Sure, but the prevailing wisdom is that you have to plan everything in
>> detail in advance, fix requirements at the beginning of the project,
>> and put provisions in the contract that deter the customer from
>> making changes after the project's been started. Am I wrong?
>
> Yes. It happened, in my career, but it wasn't prevailing.

What was prevailing, then?

Vitaliy

2009\02\02@034707 by Vitaliy

Gerhard Fiedler wrote:
>> The benefits to the customer are legion. IMHO you just need to
>> understand them yourself, and be able to articulate them to get the
>> customer's buy-in.
>
> I'd like to see you selling programming services to customers, saying
> something like: "In a month we expect you to give us $50k for the work
> done up to then. At that point you will get something that 'works'. It
> will do 'something', but what it is that it will do I can't disclose or
> define up front, as this would be against our methodology. We will
> listen to all your input, and I assure you that we will do our best to
> consider it. Now trust me and sign here at the dotted line, please."
>
> Come on... this works in situations where the customer hires per hour,
> trusts the programmers (and their managers), and is willing to pay for
> whatever he gets, trusting that it will be as much as possible in
> alignment with what he wants. Which is a possible scenario, but not very
> common.

Gerhard, please don't get me wrong, and don't take offense at what I say. I
think that you really should learn more about Agile, because your statement
reveals your ignorance about the way it really works. The problem of getting
the customer's buy-in has been addressed numerous times, to the point where
it has become almost a cookbook solution.

Fixed-Price contracts in an agile organization
(sort of an FAQ):
http://tinyurl.com/2fw9rh

Selling Agile (you can skip to page 14)
http://tinyurl.com/c4zxld

Contracting Agile Projects
http://tinyurl.com/au7tss

While customers find comfort in fixed price (meaning fixed price, scope, and
timeline) agreements, you know that you are lying when you're giving the
estimate to the customer, because you have no idea what the project will
actually cost.

Lean Development & the Predictability Paradox
http://www.poppendieck.com/pdfs/Predictability_Paradox.pdf


{Quote hidden}

I've been on the customer end, and I understand why customers prefer fixed
price deals. I also know that most consultants/contractors want to charge on
a per-hour basis (some people we requested quotes from refused to sign
the NDA as soon as they heard that the fixed-price requirement was
non-negotiable).


[snip]
> There is also the idea of "working software", mentioned several times in
> the Agile Manifesto. What does "working" mean? There needs to be some
> definition, right? Wouldn't that be a requirement?

"Working software" means software that has been tested and debugged, and
that does something useful. This is in contrast with the traditional method,
where the software (almost by definition) is basically broken for most of
its lifetime, until the integration and testing phases are complete.


> And wouldn't it have
> to be established up front? (With "up front" I don't necessarily mean
> "fixed at the beginning of everything", but "at the beginning of an
> iteration". If what "working" means is not at all clear at the beginning
> of an iteration, I doubt that the iteration will be all-around called a
> success. Which then is suspiciously close to a milestone... :)

You pretty much answered your own question. You described how an agile team
would approach this. :)

Of course, you have requirements for the project, in the form of the project
backlog. The tasks (features) on this project backlog come from the
customer. In practice, you can have stickie notes, one per task. The
customer then tells the developers which tasks are the most important. The
programmers pick enough tasks off the top of the backlog for one iteration
(aka "sprint"), write more detailed requirements if necessary, and start
coding.

At the end of the sprint, the programmers end up with working software. It
is presented to the customer, the customer provides feedback, together they
look at the tasks, add new ones/reprioritize the backlog, and the process is
repeated.

Here are some benefits of this method, compared to the traditional
"waterfall" approach:

1. At the end of each sprint, the customer gets working software. It may
even be useful enough to be deployed, even before the remaining features are
implemented.

2. As development progresses, the customer can eliminate some features
based on changing external factors, or on the feedback they get from using
the software. With the traditional approach, this would require contract
renegotiation, so what you get in the end is a product with features that
no one uses.

3. At any point, the customer may decide they have enough useful
functionality, and terminate the project. In traditional environment, they
may end up with nothing but broken code, and heaps of paperwork (and
perhaps, a lawsuit).


> Don't get me wrong... I don't think the principles of the Agile
> Manifesto are wrong. But they are not /everything/ -- and they are not
> that new. Maybe they were in certain circles before 2001, but they
> weren't in others.

As I explained earlier, neither I nor the original signatories claim that
the principles are new.


> They are a collection of some common sense principles
> and unproven (and unprovable) axioms, nicely worded.

Which ones do you have in mind?


> It also seems to me
> that it is targeted at a specific subset of programming settings; it is
> easy for me to come up with situations where some of the principles
> simply don't apply. Maybe the creators just didn't consider these
> situations, because they were outside their target or experience. But
> that doesn't make them any less real. There are also scenarios where one
> principle contradicts another. So it's all not that easy.

For example?

Vitaliy

2009\02\02@035846 by Vitaliy

Gerhard Fiedler wrote:
>>> Sure, Agile may fit the 'natural' model of how programmers work, but
>>> try to make a big bank look like a programmer!
>>
>> I think the nature of the project is irrelevant. If you have a set of
>> practices that result in better software, why would it matter whether
>> this software is used to create financial models, control a
>> spacecraft, or run a website?
>
> IMO this is exactly one of the major management fallacies. The nature of
> the project does matter, the nature of the client does matter, the
> nature of (each of) the programmers does matter... heck, the nature of
> the manager does matter. It all matters, there is no silver bullet. The
> most productive teams (IME) are not the ones that follow an ideology, no
> matter how good the ideology, but the ones that efficiently adapt to the
> nature of everything involved.

Consider the possibility that we're both right. :)

"Plan is not important, planning is important" (Agile proverb). ;)


> I'm not sure, but the way I read the principles in the Agile Manifesto,
> this is something you could read into it: that the nature of the project
> /is/ relevant, and needs to be considered when creating the practices
> for that project.

I think of it as "adapting" the practices to a given project. All agile
projects share some basic features, it is what makes them "agile". They
follow from the principles, and do not change -- doing the project in
successive iterations, delivering working software at the end of each
iteration, etc. You could, however, adjust the duration of each iteration to
better suit your project.


{Quote hidden}

Have you personally experienced this?

It seems to me that the test function is almost always simpler than the
function being tested. The classical example is a function that uses a
complex formula, and returns a value. You don't reproduce the formula in the
test function. It simply calls the user function, gets a value, and compares
it to a hand-calculated value. If they don't match, the test fails.
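
To make that concrete, here is a minimal sketch in C -- the function and the
figures are invented for illustration only. The test does not re-derive the
annuity formula; it just calls the function and compares the result against a
value worked out by hand on a calculator.

#include <assert.h>
#include <math.h>
#include <stdio.h>

/* Function under test: monthly payment on a loan (ordinary annuity formula). */
static double monthly_payment(double principal, double annual_rate, int months)
{
    double r = annual_rate / 12.0;                    /* monthly rate */
    return principal * r / (1.0 - pow(1.0 + r, -months));
}

static void test_monthly_payment(void)
{
    /* Hand-calculated: $10,000 at 6%/year over 12 months is about $860.66. */
    double expected = 860.66;
    double actual = monthly_payment(10000.0, 0.06, 12);
    assert(fabs(actual - expected) < 0.01);           /* allow rounding slack */
}

int main(void)
{
    test_monthly_payment();
    printf("all tests passed\n");
    return 0;
}

If someone later "optimizes" the formula and its behaviour drifts, the
hand-calculated value catches it, without the test ever containing the
formula itself.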


> To get out of this recursion, you need to have a means to write target
> programs without writing test programs first. Otherwise, you'd first
> need a test program for the test program, and a test program for that
> test program, and so on... :) And if you have a means to write a program
> without writing a test program first... why not use it in the first
> place?

I think you just created a straw man. :) I've never heard anyone recommend
writing a test program for the test program. What you have is the program
under test (the part that the users use), and the tests.

Vitaliy

2009\02\02@074404 by Gerhard Fiedler

Vitaliy wrote:

> Well that's the point, most of what the agile methodologies are about,
> was known for a long long time. Over time, the patterns emerged and
> were used, and written about by different people. These same people
> also realized that the traditional approaches to programming are
> based on false assumptions.

If those ideas have been known and used for a long time, how can they
not be part of the "traditional" approaches? Note that I come from a
background with relatively small companies and relatively few "studied"
managers; most managers I've worked with or for were engineers,
scientists, domain specialists rather than "professional" managers. They
tended to do things in a way that made sense to them rather than
following anything they learned. This may give me a different view of
what is "traditional", but this is what it is for me.

> [...] and I think therefore that one should at least consider what
> they have to say.

I do, I've read the Manifesto, I know about the movement and many if not
most techniques, I consider them where I think they are appropriate.
Most of the Manifesto I agree with. But to me it's nothing
overwhelmingly new.


>> Programmers never had a problem with requirements change.
>
> Gerhard, I don't know how you can say this, considering your
> experience.

It's my experience, and I say this considering my experience. The main
problem, IME, has always been that continuing towards the earlier set
goal would cost less, and continuing towards the new goal will cost
more, and who is going to pick up that tab. (I've never found a
programmer on a fixed price contract complain about a requirements
change that takes out previously required features -- but this is not
the normal case.)

>> Heck, programmers like to change the requirements as they program.
>
> Not around here they don't. :) I thought it was common knowledge that
> programmers are supposed to hate change.

This phrase specifically was a little bit tongue in cheek... I meant
that there are programmers who will tend to implement things a bit
different from the requirements, for a number of reasons. Which amounts
to "changing the requirements".


>> The problem always was with the question who pays for the changed
>> requirements -- and the Agile Manifesto is suspiciously quiet about
>> this :)
>
> The goal of software development is not to avoid expenses that result
> from changing requirements. The goal of software development is to
> produce business value.

Of course, but quite often there are different (e.g. two) businesses
involved that need to get value out of this. How to distribute the cost
and the value is an important question, and often this question can't be
addressed iteration by iteration.

A thought here: changing direction in mid-course where you /could/ have
gone in the required direction from the start is not creating business
value, it's creating (unnecessary) cost. This is why it makes sense to
have a requirements gathering phase, during which you find out what you
can and need to know about the project, separate the hard requirements
from the soft ones, and document at least the hard ones. I know that
documentation doesn't replace more direct communication, but it is also
my experience that verbal communication can't replace written
communication. Both have their place.

Helping the client to understand the scope of their project before
spending a lot of money on it is a good thing. This is not about
programming; this is about understanding. I know that this is not always
high on the priority list, but for me it is. More than once I've seen
projects change direction drastically during this phase, because of
previously unthought-of consequences. If someone had started coding
before this, it would all have been wasted. No business value created,
just the feeling of "doing" something, a paycheck for the programmer and
cost for the customer.


> Have you used Gantt charts for your projects? Was your experience
> different?

I don't use Gantt charts to create a detailed schedule of programming
projects. I do occasionally use Gantt charts to get a handle on the
interdependencies in situations with complex dependencies between
different tasks.

When you think "Before I go to LA next week Monday I need to have my
clutch checked. The mechanic said if there was a problem, it'll take him
a day to get the parts and another to fix it. I want to have a day
slack, so I better get my car to the mechanic by Wednesday max." you are
creating a Gantt chart in your head. A simple one, and you probably
won't draw it, but this is what you're doing.

If there are more tasks and the dependencies are more complex, actually
creating it to see how things /look/ helps me. "A picture is worth a
thousand words" or so they say, and a Gantt chart is basically a picture
of the schedule dependencies. For me it's easier to see what's going on
than having a dozen people fill my ears about what they need done before
they can do their thing; I'm not good in keeping such details in my
head, and a textual representation doesn't help me much. I generally
don't spend the time updating such charts as the project advances, at
least not regularly; for the projects I've been working on I didn't feel
this was worth the effort. But I did create some, and when I did, I
didn't feel that it was wasted effort.


> The material point is that the traditional waterfall approach is
> considered to be the industry standard. This is what they teach
> future engineers.

In fact, while I've heard a lot about the waterfall model, I've never
seen it implemented (maybe because the pure model is too far away from
reality). I've seen something similar once, in that a company spent a
few man-years gathering requirements before starting to write any code,
but I tend to think that this was a good thing in that case. It was not
only the program's requirements that they documented, it was the
company's billing procedures; they didn't have them documented up to
that point, completely and in one place. That seemed crazy to me, and
documenting them seemed to me a necessity that goes beyond the scope of
the programming project within which this was done, so the documentation
alone created business value (as I see it), independently of the program
that was based on this.

For me, this was a typical case of an extensive requirements gathering
phase.


{Quote hidden}

What went wrong, and how would you change it?


> There is a difference b/w project documentation, and user
> documentation (see one of my previous posts). I'm all for user
> documentation.

In many cases, the programmer is the "user" of something else (for
example when programming a custom controller). In many cases, this
"something else" is not sufficiently documented in order to write the
desired program. These are cases where "gathering requirements" before
writing code makes sense; for me, this is the original purpose of the
requirements gathering phase of the waterfall model. You seemed to
argue that gathering requirements before writing code is a waste of
time, period. I'm not saying that there are no people out there wasting
time by trying to define things at a point where whatever they try to
define can't be known. But I stated some examples where it makes sense
to me to extensively gather requirements.

I try to understand as much as reasonably possible (and not more!) what
I'm going to program before programming it, and I call this generally
"gathering requirements" when talking about it. This term may come from
the dreaded (and considered outdated) waterfall model, but I don't
really care about this. I do it when and as much as it makes sense to me
-- which is until I feel that spending more time on it would be
inefficient. Where this point is, exactly, is IMO a matter of
experience; there can't be a rule. I balance the (estimated) effect of
the next so many hours thinking about and researching the project
against the alternatives (like writing code). As long as this balance is
positive, I continue to research, plan, document (which is nothing more
and nothing less than keeping notes of the results of my researching and
planning). For me, this is creating more business value than writing
code that is based on wrong assumptions.


>>> Sure, but the prevailing wisdom is that you have to plan everything
>>> in detail in advance, fix requirements at the beginning of the
>>> project, and put provisions in the contract that deter the customer
>>> from making changes after the project's been started. Am I wrong?
>>
>> Yes. It happened, in my career, but it wasn't prevailing.
>
> What was prevailing, then?

Non-ideologic, pragmatic approaches, more influenced by the individual's
experiences and preferences and the constraints of the situation than by
any school of thought.

Gerhard

2009\02\02@082330 by Rolf

Vitaliy wrote:
> Gerhard Fiedler wrote:
>  
And I wrote stuff before....
{Quote hidden}

Thinking of two examples:

how do you test the accuracy of your new whizz-bang ultra-accurate
pressure sensor ..... somehow you have to build an even more accurate
pressure container...

how do you test a software implemented queue that is accessed in a
multi-threaded environment for both adding to and removing from the queue...

in each of the above cases the testing system is far more complex than
the device under test....

Rolf

2009\02\02@095723 by olin piclist

Vitaliy wrote:
> 1. At the end of each sprint, the customer gets working software. It
> may even be useful enough to be deployed, even before the remaining
> features are implemented.

Making that a goal sounds like a bad idea for many projects because it
promotes the slap it together versus careful architecture approach.  Slap
and evolve can work for some types of software like GUIs, but will be a
disaster for many others.  The failures I've seen have generally been due to
a lack of thought-through architecture up front, usually because of immature
or incompetent programmers that needed quick gratification.  Unsophisticated
customers and bad managers don't help either because they want to see quick
results too.

Early on you may get what look like quick results.  However these don't have
any depth.  As you try to slap on more features the lack of coherent
architecture gets in the way.  Because nobody wants to throw out the
existing work, it gets kludged and patched to add more features.  In the end
you have a ball of bandaids if you ever get to the end at all.

Every consultant's worst customer is the one that says "We've got it 80%
working, we just need you to finish the remaining 20%".  And of course they
expect you to take only 1/4 of the time they have already spent.  Guess how
they got into this mess, and why you should get out of there immediately?
I'm sure everyone here that's done consulting for more than 2 years has
heard this and knows exactly what I'm talking about.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\02\02@100507 by olin piclist

Vitaliy wrote:
> It seems to me that the test function is almost always simpler than
> the function being tested.

I heartily disagree.  There are definitely cases where testing and trying to
figure out whether the software is doing the best thing with the data is at
least as complicated as the operational software itself.  These cases occur
a lot more than "almost never".

> The classical example is a function that
> uses a complex formula, and returns a value. You don't reproduce the
> formula in the test function. It simply calls the user function, gets
> a value, and compares it to a hand-calculated value.

For small values of "tested".  Many functions are too complicated to be
verified by a few simple input/output sets that can be generated by hand.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\02\02@102252 by olin piclist

Gerhard Fiedler wrote:
>> What was prevailing, then?
>
> Non-ideologic, pragmatic approaches, more influenced by the
> individual's experiences and preferences and the constraints of the
> situation than by any school of thought.

That's been my experience of "prevailing" too.

********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\02\02@104858 by Rolf

Olin Lathrop wrote:
> Gerhard Fiedler wrote:
>  
>>> What was prevailing, then?
>>>      
>> Non-ideologic, pragmatic approaches, more influenced by the
>> individual's experiences and preferences and the constraints of the
>> situation than by any school of thought.
>>    
>
> That's been my experience of "prevailing" too.
>
>  

Ditto.

Vitaliy

Typically effective managers have effective solutions for solving
problems effectively... right? Managers who are able to adapt their
style to the problem at hand, the team available, and the other
constraints imposed on them.

Like most things, having the right tool for the job makes things much
easier, but, it is remarkable how much a sledge hammer can solve....
Good managers have a good tool-box full of good tools to solve many
problems easily, and the best managers know exactly when to use what
tool. The ultimate managers have every tool available, and are able to
finesse any problem by applying just the right amount of force with just
the right tool.

The same applies for programmers, accountants, engineers of all sorts,
parents, kids, teachers, and so on.

Knowing what to do, what to use, and when to stop is the hallmark of a
successful person.

I get the impression that you consider Agile programming to be the
sledge hammer in an otherwise limitless arsenal of tools, and you
are trying to turn every problem into something a sledge hammer will
fix. The old adage that "When all you have is a hammer, every problem
becomes a nail..." seems to come to mind.

Rolf

2009\02\03@085555 by Gerhard Fiedler

Vitaliy wrote:

>>> I think the nature of the project is irrelevant. [...]
>>
>> IMO this is exactly one of the major management fallacies. The
>> nature of the project does matter, [...]
>
> Consider the possibility that we're both right. :)

I'd like to. Can you explain this WRT these two statements?


>> I'm not sure, but the way I read the principles in the Agile
>> Manifesto, this is something you could read into it: that the nature
>> of the project /is/ relevant, and needs to be considered when
>> creating the practices for that project.
>
> I think of it as "adapting" the practices to a given project. All
> agile projects share some basic features, it is what makes them
> "agile". They follow from the principles, and do not change [...]

This is exactly my problem with /any/ school of management (be that
Agile or waterfall or whatever): once people subscribe to one, they tend
to apply the limited set of practices of that school to any given
project, rather than choose from whatever tools are available to
approach a given problem. Like e.g. using a Gantt chart to make complex
task dependencies visible. Like finding out how the transmission I'm
supposed to control actually works before I have damaged five of them by
iterating through a learning process.


{Quote hidden}

I am experiencing this, on an ongoing basis. Try to write a really
useful test for a non-trivial application, and you'll experience it,
too.

> It seems to me that the test function is almost always simpler than
> the function being tested. The classical example is a function that
> uses a complex formula, and returns a value. You don't reproduce the
> formula in the test function. It simply calls the user function, gets
> a value, and compares it to a hand-calculated value. If they don't
> match, the test fails.

This sort of test exists, but it is rarely useful. Running such a test
on a single value in a single-threaded calculation is very likely to
only catch the defects where the whole application would act "weird" and
the defect would be rather obvious, even without the test.

The tests that are really useful as tests do much more. First, a single
value usually doesn't cut it; one needs a distribution of values across
the possible inputs. This means that you need to either reproduce the
formula (that is, create a second implementation, ideally done by a
programmer who is not the one who did the target implementation, and who
doesn't know the target implementation) -- typically done for technical
problems --, simulate the environment the program interacts with, or
create a sufficiently large case base with all inputs and outputs
defined (e.g. for a billing system). All three are considerable work.

However, this is only the start; it tests the "normal" operation. Then
come all the limit cases. Then come the out-of-bounds cases. And if the
target (sub)system happens to be multi-threaded or has any form of
real-time behavior, this all needs to be embedded in a test harness that
creates and checks different timings; again the "normal" cases, the
limit cases, the out-of-bounds cases.

You get into many combinations here, and running an exhaustive test is
impossible for all but the most trivial programs. So you need to create
something like a Monte Carlo test strategy.
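
As a rough illustration of what I mean (the function, the reference
implementation and the tolerance are all invented for the example), such a
test might combine explicit limit cases with a randomized sweep against an
independent reference implementation:

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* Hypothetical function under test. */
static double target_calc(double x)
{
    return x * x + 2.0 * x + 1.0;
}

/* Independent reference implementation, ideally written by someone else
   from the same spec (here simply the same polynomial, factored). */
static double reference_calc(double x)
{
    return (x + 1.0) * (x + 1.0);
}

/* Compare target against reference with a relative tolerance. */
static int check(double x)
{
    double t = target_calc(x);
    double r = reference_calc(x);
    if (fabs(t - r) > 1e-9 * (1.0 + fabs(r))) {
        printf("FAIL at x=%g: target=%g reference=%g\n", x, t, r);
        return 1;
    }
    return 0;
}

int main(void)
{
    int fails = 0;

    /* Limit and special cases first... */
    double limits[] = { -1e9, -1.0, -1e-12, 0.0, 1e-12, 1.0, 1e9 };
    for (size_t i = 0; i < sizeof limits / sizeof limits[0]; i++)
        fails += check(limits[i]);

    /* ...then a Monte Carlo sweep over the input range, since an
       exhaustive test is not feasible. */
    srand(42);
    for (int i = 0; i < 100000; i++)
        fails += check((double)rand() / RAND_MAX * 2000.0 - 1000.0);

    if (fails) printf("FAILED (%d cases)\n", fails);
    else       printf("all tests passed\n");
    return fails != 0;
}

Even this sketch glosses over the hard parts -- choosing the input
distribution, the tolerances, and above all producing a reference
implementation you can trust -- and that is exactly where the effort goes.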

Writing a good test (that actually tests what you need, not a small
subset of it) is in many cases as much or more work than writing the
target. And it can only be done after a good collection of requirements;
I don't see a way around this.

Test yourself: take a useful program, and try to design a test, thinking
like this: would the program be useful (that is, would I pay money for
it) if it /only/ satisfied my test -- everything else being undefined
(that is, with everything that's not part of the test the program could
do or not do whatever the programmer wants)? Thinking like this, testing
something simple like Windows Notepad becomes a major undertaking.


>> To get out of this recursion, you need to have a means to write
>> target programs without writing test programs first. Otherwise,
>> you'd first need a test program for the test program, and a test
>> program for that test program, and so on... :) And if you have a
>> means to write a program without writing a test program first... why
>> not use it in the first place?
>
> I think you just created a straw man. :) I've never heard anyone
> recommend writing a test program for the test program.

No straw man. Think about this for a minute -- why did you never hear
anyone recommend this? Because it's obviously recursive, of course, and
nobody wants to state something obviously recursive. So they don't say
it this way, but this doesn't mean that it isn't real.

Given that a useful test is a quite complex program, and if we assume
that complex programs should be created by writing the tests first...
How do you distinguish a complex program that is a test program from
another complex program that is not a test program? How can someone
propose a coding strategy that doesn't work for half the programs I have
to write? (If I take the "write the test before the target" seriously,
at least half of the programs I write are tests.)

Also, think about this... I've been contracted to write a test program
-- only the test program. And I did write a test program for the test
program :)


> What you have is the program under test (the part that the users use),
> and the tests.

Where do the tests come from? The ones I write are usually complex
programs that I need to code, using my normal coding procedures and
paradigms. I wouldn't want to use a coding paradigm that I can't use for
writing test programs, given the fact that this is part of my normal
coding life. I write these in the same way I write the target programs.
I can (and do) do both, and my coding paradigms work for both. I don't
have to have a "coding mode" for "normal" programs and another one for
test programs.

In some ways, the test program for the test program is the target
program. But this means that you need to debug both, observe both, have
a means to verify results that is outside of both. (This may be data
files produced that you manually verify occasionally, this may be a
scope hooked up to the inputs and outputs where you manually verify the
timing of certain events, etc.)

This all is not to say that tests are not a good thing. This is to say
that they are a complex and potentially expensive undertaking. And
again, there are tradeoffs... while you spend all the time writing an
exhaustive test, you could do other work. What produces more business
value depends on many factors (of which the definition of "business
value" is not the least important).

If (or where) customers put more emphasis on stability and reliability
than on features, more tests get written. Unfortunately, though, there
are few areas where the balance doesn't tend strongly towards
features. IME this is in most cases not programmer or method driven;
it is customer driven.

Gerhard

2009\02\03@092323 by olin piclist

Gerhard Fiedler wrote:
> In some ways, the test program for the test program is the target
> program. But this means that you need to debug both, observe both,
> have a means to verify results that is outside of both. (This may be
> data files produced that you manually verify occasionally, this may
> be a scope hooked up to the inputs and outputs where you manually
> verify the timing of certain events, etc.)

That's what I see most of the time.  The test and target are developed
independently by different people from the same spec.  Each will of course
do some basic testing using stubs or simple data or whatever, but the real
testing starts when they get together and try to make the combination work.
That's when scopes and protocol analysers and the like get used.  Of course
the data gets looked at too, to the extent a human can look at it and tell
bad from good.

On one project I worked on many years ago, we had a team of people designing
the real hardware, one person creating an automated test framework for it,
several others designing test suites for the automated framework to run, and
someone else creating a simulator of the hardware to independently verify
the tests and provide a reference for the hardware.  The system was too
complex to easily specify the desired result for each test.  Instead the
test framework would run a test on the functional simulator, then on the
gate-level simulated hardware and compare the two.  Discrepancies were
analyzed by humans.  Sometimes the functional simulator got it wrong,
sometimes the gate level design was wrong.  That's all part of testing.  In
the end the simulator turned out to be useful for the software people to
test their code on before real hardware was available and to get more
diagnostic information than the real hardware supplied.

The important part binding all the efforts together was the functional spec.
Everybody referred to it as the reference on how the system was supposed to
function.  It was a critical document that initially took a full time person
to maintain.  There was no way you could have jumped into the middle, hacked
up something that partially worked, then refined it from there without
having a clear idea of how you were going to get to the full picture.  We
had three people initially working on the architecture before anyone started
laying gates.  The early versions of the functional simulator were used a
test bed for different architectures.  Most of the engineers that were to
design the final hardware weren't even hired until we had a decent idea of
the overall architecture.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\02\05@032123 by Vitaliy

Olin Lathrop wrote:
>> It seems to me that the test function is almost always simpler than
>> the function being tested.
>
> I heartily disagree.  There are definitely cases where testing and trying
> to figure out whether the software is doing the best thing with the data
> is at least as complicated as the operational software itself.  These
> cases occur a lot more than "almost never".

You're the expert. :)


>> The classical example is a function that
>> uses a complex formula, and returns a value. You don't reproduce the
>> formula in the test function. It simply calls the user function, gets
>> a value, and compares it to a hand-calculated value.
>
> For small values of "tested".  Many functions are too complicated to be
> verified by a few simple input/output sets that can be generated by hand.

Perhaps we're talking about different things? The tests I have in mind
normally test the high and low limits, plus some random value. For example,
the following functions could be tested using this method (a rough example
follows the list):

- CRC/checksum
- Currency conversion
- ASCII to Unicode
- String to Hex
- Etc
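
To make this concrete, here is a minimal sketch of the kind of spot check I
mean (the checksum routine and the hand-worked values are just an invented
stand-in):

#include <stdint.h>
#include <stdio.h>

/* Stand-in for the kind of routine listed above: a simple 8-bit
   additive checksum (the real CRC/conversion code would go here). */
static uint8_t checksum8(const uint8_t *buf, size_t len)
{
    uint8_t sum = 0;
    while (len--)
        sum += *buf++;
    return sum;
}

/* The test just calls the function and compares against hand-worked
   values: the low limit (empty buffer), a wrap-around case, and one
   ordinary known answer. */
int main(void)
{
    const uint8_t wrap[] = { 0xFF, 0x02 };     /* 0xFF + 0x02 wraps to 0x01 */
    const uint8_t some[] = { 1, 2, 3, 4, 5 };  /* sums to 15 = 0x0F */
    int fails = 0;

    fails += (checksum8(NULL, 0) != 0x00);
    fails += (checksum8(wrap, sizeof wrap) != 0x01);
    fails += (checksum8(some, sizeof some) != 0x0F);

    if (fails) printf("FAIL: %d case(s)\n", fails);
    else       printf("PASS\n");
    return fails;
}

The test ends up shorter and simpler than most of the functions it would
exercise, which is the point I'm trying to make.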

What types of functions do you have in mind?

Vitaliy

2009\02\05@034524 by William \Chops\ Westfield


On Feb 5, 2009, at 12:19 AM, Vitaliy wrote:
>
> What types of functions do you have in mind?

Pick an output interface and next-hop network encapsulation for an  
incoming IP packet, based on a configuration file exceeding 100kbyte  
in size...

I was always really unimpressed with the "provably correct" software  
efforts that were in vogue back when I was in school.  It always  
seemed "obvious" (to me, anyway) the the most interesting software was  
the software with outside input whose range was so large that it would  
be really hard to prove anything USEFUL.  I haven't heard much about  
that effort recently, though it's hard to tell whether they gave up,  
or whether they were just overwhelmed by the microprocessor revolution  
needing to start over and do everything AGAIN, if they got around to  
it...

BillW

2009\02\05@041121 by Vitaliy

"Rolf" wrote:
> Vitaliy
>
[snip]
> I get the impression that you consider Agile programming to be the
> sledge hammer in an otherwise limitless arsenal of tools, and you
> are trying to turn every problem into something a sledge hammer will
> fix. The old adage that "When all you have is a hammer, every problem
> becomes a nail..." seems to come to mind.

Rolf, perhaps you're right, but no one has suggested a viable alternative so
far. What I'm hearing from you guys (you, Gerhard, Olin) is, "Every project
is different. What works for one project may not work for another project.
What doesn't work for one project may work for another project. There are no
rules, you must adapt."

This makes me very uncomfortable, because the implication is that experience
is irrelevant. I have to approach each project in a brute force fashion,
since each project is unique, and I don't know what will or will not work.
Everything is trial and error.

In addition, the statements you guys made reveal that you don't really know
what Agile is. If you don't know what it is, how can you make judgements
about its effectiveness?

When I started learning about lean development, I found that applying its
principles and techniques made me a more efficient programmer. I can
develop better code, faster. I can tackle more complex projects, because I
learned ways to effectively manage complexity. I waste less time on things
that do not produce value, and have a clear strategy and a set of principles
that can be applied to any project. The change was really drastic.

The reason I started this thread was to try to clear up the confusion about
what Agile is not. New concepts, especially ones that have a catchy name
that begins with a capital letter, usually meet with resistance -- it is to
be expected.

If what you have works for you, great -- there's no reason to change if you
are happy where you are. If, on the other hand, you are dissatisfied with
the status quo (like I was three years ago), I would encourage you to
explore Agile.

Best regards,

Vitaliy

2009\02\05@051325 by Vitaliy

Gerhard Fiedler wrote:
>> Well that's the point, most of what the agile methodologies are about,
>> was known for a long long time. Over time, the patterns emerged and
>> were used, and written about by different people. These same people
>> also realized that the traditional approaches to programming are
>> based on false assumptions.
>
> If those ideas have been known and used for a long time, how can they
> not be part of the "traditional" approaches?

Good question. I don't know why, I think they were practiced by a small
minority, and the majority was educated and encouraged to follow the
waterfall approach, and the likes of CMM or other "scientific" approaches to
software engineering.


>  Note that I come from a
> background with relatively small companies and relatively few "studied"
> managers; most managers I've worked with or for were engineers,
> scientists, domain specialists rather than "professional" managers. They
> tended to do things in a way that made sense to them rather than
> following anything they learned. This may give me a different view of
> what is "traditional", but this is what it is for me.

That's what I figured. I think of Agile as a set of practices that "make
sense". It's only fault as far as I can tell, is the fact that it has a
name. :)


>> [...] and I think therefore that one should at least consider what
>> they have to say.
>
> I do, I've read the Manifesto, I know about the movement and many if not
> most techniques, I consider them where I think they are appropriate.
> Most of the Manifesto I agree with.

I'm glad to hear that.


> But to me it's nothing
> overwhelmingly new.

So much the better, I like time-proven things. I'm also glad that some
people cared enough to write down this common sense wisdom, so that others
could take advantage of it.


>> The goal of software development is not to avoid expenses that result
>> from changing requirements. The goal of software development is to
>> produce business value.
>
> Of course, but quite often there are different (e.g. two) businesses
> involved that need to get value out of this. How to distribute the cost
> and the value is an important question, and often this question can't be
> addressed iteration by iteration.

I think it can. In fact, I think deciding it upfront is impossible --  
because too many assumptions would have to be made.


> A thought here: changing direction in mid-course where you /could/ have
> gone in the required direction from the start is not creating business
> value, it's creating (unnecessary) cost. This is why it makes sense to
> have a requirements gathering phase, during which you find out what you
> can and need to know about the project, separate the hard requirements
> from the soft ones, and document at least the hard ones.

Definitely. Agile development does not preclude requirements gathering. Have
you ever heard of "Iteration Zero"? The big difference is that Agile gathers
*just enough* requirements to get started, without trying to capture every
single detail upfront.


> I know that
> documentation doesn't replace more direct communication, but it is also
> my experience that verbal communication can't replace written
> communication. Both have their place.

Yes, absolutely.


> Helping the client to understand the scope of their project before
> spending a lot of money on it is a good thing. This is not about
> programming; this is about understanding. I know that this is not always
> high on the priority list, but for me it is. More than once I've seen
> projects change direction drastically during this phase, because of
> previously unthought-of consequences. If someone had started coding
> before this, it would all have been wasted. No business value created,
> just the feeling of "doing" something, a paycheck for the programmer and
> cost for the customer.

You see, it is statements like this that make me feel that you don't really
know how Agile works (again, no offense). Before a single line of code is
written, the project team has a meeting with the customer, where the scope
and cost are discussed. It is a highly interactive process: generally the
customer writes down the "stories" (features/usage scenarios) and the
team provides an estimate for each one. There is a lot of back-and-forth to
make sure both sides have a good understanding of the requirements and the
costs. Usually this is contrasted with a situation where the customer
basically hands off a spec to the developers, and says "get back to me with
a quote next week".


{Quote hidden}

I think this is a great example of "plans are not important, planning is
important". To me, the biggest drawback of Gantt charts is that they make
you plan things way in advance, forcing you to make too many assumptions. A
couple of weeks into the project, the dates become meaningless.

Have you heard about Goldratt's "Theory of Constraints"? The charts that it
produces are far more useful, IMHO -- although the problem with dates still
remains.


{Quote hidden}

How were things done before the software was written? Wouldn't it have been
possible to pick a few high-value features, and implement them?

I am of the opinion that there don't exist any projects where 100% of the
requirements must be gathered before the implementation work can begin. You
can always start with a small piece, and build the software gradually.


>> We've tried outsourcing a couple of times, providing heaps of
>> documentation and demanded strict adherence to the requirements. None
>> of the attempts were successful.
>>
>> If we were to try outsourcing again, it would definitely follow an
>> agile approach.
>
> What went wrong, and how would you change it?

There were two major factors that contributed to the failure:

- We tried to document every detail upfront, literally spending several
man-months producing nothing but paper.
- We left the contractor out of the requirements gathering process. We
basically handed him the specs, and said "implement it exactly as we tell
you to." What's probably worse, we asked him to document everything he was
going to implement, before he implemented it.

The way I would do it now, is:

- I would put together high-level requirements, spending not more than a
couple of days.
- Have a meeting with the developer, explain and prioritize the requirements
- Have the developer pick a number of features "off the top" that he can
implement in two weeks
- Get regular (daily) software updates, and provide feedback to the
developer
- Get working (see earlier definition) software at the end of the iteration

We could have launched this project when only about 60% of the functionality
was completed.

>> There is a difference b/w project documentation, and user
>> documentation (see one of my previous posts). I'm all for user
>> documentation.
>
> In many cases, the programmer is the "user" of something else (for
> example when programming a custom controller). In many cases, this
> "something else" is not sufficiently documented in order to write the
> desired program. These are cases where "gathering requirements" before
> writing code makes sense;

Sure. However, this can still be done in several iterations.


> for me, this is the original purpose of the
> requirements gathering phase of the waterfall model.

Not according to the definition:

http://en.wikipedia.org/wiki/Waterfall_model


> You seemed to have
> argued that gathering requirements before writing code is a waste of
> time, period.

I'm sorry you got this impression, it is completely false.


{Quote hidden}

I don't have a problem with gathering requirements.


> I do it when and as much as it makes sense to me
> -- which is until I feel that spending more time on it would be
> inefficient. Where this point is, exactly, is IMO a matter of
> experience; there can't be a rule.

In Agile, the rule is simple: you gather enough requirements to do one
iteration.

In some instances, especially for very complex projects, there is
what's called an "Iteration 0". This is when the high-level, architectural
requirements are gathered.


> I balance the (estimated) effect of
> the next so many hours thinking about and researching the project
> against the alternatives (like writing code). As long as this balance is
> positive, I continue to research, plan, document (which is nothing more
> and nothing less than keeping notes of the results of my researching and
> planning). For me, this is creating more business value than writing
> code that is based on wrong assumptions.

As long as you do this throughout the project (iteratively), your method
appears to be not much different from that advocated by agilists. The whole
point of limiting the requirements gathering phase to what's needed for one
iteration, is to minimize the number of assumptions.


{Quote hidden}

It sounds like you and Olin have been spared the "scientific development"
approaches, then. Consider yourself lucky. :)

Vitaliy

2009\02\05@053310 by Vitaliy

Olin Lathrop wrote:
>> 1. At the end of each sprint, the customer gets working software. It
>> may even be useful enough to be deployed, even before the remaining
>> features are implemented.
>
> Making that a goal sounds like a bad idea for many projects because it
> promotes the slap it together versus careful architecture approach.
[snip]

You must have missed the explanation of "working software"  -- there is
nothing "slap it together" about it. The goal is to end up with code that
works, at the end of each iteration -- debugged, tested, and doing something
useful.

Deploying the software early is not a goal in itself, but it is a
possibility with Agile projects. The customer may decide that it has enough
features to be useful, and release it early. With waterfall and similar
methodologies, you cannot do this.


> Early on you may get what look like quick results.  However these don't
> have any depth.  As you try to slap on more features the lack of coherent
> architecture gets in the way.  Because nobody wants to throw out the
> existing work, it gets kludged and patched to add more features.  In the
> end you have a ball of bandaids if you ever get to the end at all.

Lean methodologies have been around for many years now, and obvious concerns
like this have been addressed a hundred times.

You seem to assume that "careful architecture approach" guarantees that you
don't end up with a ball of bandaids. The fallacy of this argument is that
you claim to predict the future. By definition, a project has many unknowns.
Design upfront means design based on assumptions (aka bad information).

Agile tackles the issue of uncertainty by splitting the project into short
iterations. At the end of each iteration, the team has learned something and
can make decisions based on facts, rather than assumptions.

One way to keep a project from becoming a ball of bandaids, is by continuous
refactoring. It is way better to rewrite the code several times, than be
locked into a rigid architecture that was based on wrong assumptions.


> Every consultant's worst customer is the one that says "We've got it 80%
> working, we just need you to finish the remaining 20%".  And of course
> they expect you to take only 1/4 of the time they have already spent.
> Guess how they got into this mess, and why you should get out of there
> immediately?  I'm sure everyone here that's done consulting for more than
> 2 years has heard this and knows exactly what I'm talking about.

I agree with you, but I don't see how this is relevant.

Vitaliy


2009\02\05@053702 by Vitaliy

Rolf wrote:
>> I think you just created a straw man. :) I've never heard anyone
>> recommend
>> writing a test program for the test program. What you have is the program
>> under test (the part that the users use), and the tests.
>>
>> Vitaliy
>>
>>
>
> Thinking of two examples:
>
> how do you test the accuracy of your new whizz-bang ultra-accurate
> pressure sensor ..... somehow you have to build an even more accurate
> pressure container...

I think this is like comparing apples and banana pies.


> how do you test a software implemented queue that is accessed in a
> multi-threaded environment for both adding to and removing from the
> queue...

You write functions that put something in the queue, and other functions
that pull stuff from the queue. What's complicated about it?

Vitaliy

2009\02\05@061702 by Vitaliy

Gerhard Fiedler wrote:
>>>> I think the nature of the project is irrelevant. [...]
>>>
>>> IMO this is exactly one of the major management fallacies. The
>>> nature of the project does matter, [...]
>>
>> Consider the possibility that we're both right. :)
>
> I'd like to. Can you explain this WRT these two statements?

Projects have similarities and differences. There are things that will
always be true (or at least, most of the time). There are other things that
will be different.

I was saying that for the things that are always true, the nature of the
project is irrelevant.



{Quote hidden}

First of all, I thought we established that the waterfall model doesn't work
in the real world.

Second, Agile does not preclude one from using any tools or techniques, as
long as they don't contradict the principles. And I thought you said you
agree with the Manifesto? :)


> Like e.g. using a Gantt chart to make complex
> task dependencies visible.

As long as you don't waste time projecting the deadlines six months into the
future, and creating a monster chart that has every single trivial task
listed, I don't have a problem with it. Like I said, I found Gantt charts to
be a waste of time. There are other, more effective ways to make
dependencies visible.


> Like finding out how the transmission I'm
> supposed to control actually works before I have damaged five of them by
> iterating through a learning process.

I think that's a given.


> I am experiencing this, on an ongoing basis. Try to write a really
> useful test for a non-trivial application, and you'll experience it,
> too.

I'm seriously considering it. On a few occasions, I got bit by "small"
changes that I made that broke the code in subtle ways. I'm also tired of
running manual tests.


{Quote hidden}

Today I fixed a function with a small bug which only manifested itself under
certain conditions. I can remember several other similar instances where
small bugs resulted in intermittent problems, and were not entirely obvious.


[snip]
> Test yourself: take a useful program, and try to design a test, thinking
> like this: would the program be useful (that is, would I pay money for
> it) if it /only/ satisfied my test -- everything else being undefined
> (that is, with everything that's not part of the test the program could
> do or not do whatever the programmer wants)? Thinking like this, testing
> something simple like Windows Notepad becomes a major undertaking.

Sure, writing a suite of tests that tests *everything* is a waste of time.
But it probably makes sense to automate the testing of some functions. The
law of diminishing returns says that at some point writing more tests would
have a net negative impact on productivity.


>> I think you just created a straw man. :) I've never heard anyone
>> recommend writing a test program for the test program.
>
> No straw man. Think about this for a minute -- why did you never hear
> anyone recommend this? Because it's obviously recursive, of course, and
> nobody wants to state something obviously recursive. So they don't say
> it this way, but this doesn't mean that it isn't real.

You lost me. Of course it's not real, nobody does it this way. People
understand that you don't write tests to test the tests.


> Given that a useful test is a quite complex program, and if we assume
> that complex programs should be created by writing the tests first...
> How do you distinguish a complex program that is a test program from
> another complex program that is not a test program? How can someone
> propose a coding strategy that doesn't work for half the programs I have
> to write? (If I take the "write the test before the target" seriously,
> at least half of the programs I write are tests.)

Gerhard, are you playing the devil's advocate? :) You write tests for your
software, so you must know the answer to this question.

The only difference is the order: traditionally, tests are written after the
user functions have been implemented. TDD advocates writing the tests before
the user functions. It's the same amount of work, with the following
benefits:

- It establishes clear requirements for the target function
- Programmer can't "forget" to write a test
- The user function will be designed for testability


> Also, think about this... I've been contracted to write a test program
> -- only the test program. And I did write a test program for the test
> program :)

You mean, you wrote a program that the test program was testing? Kind of
like creating test data?


>> What you have is the program under test (the part that the users use),
>> and the tests.
>
> Where do the tests come from?

They are based on the requirements. For example, you could have a test
function that tests the return value of an encryption function. You write
the test first: it calls the encryption function, provides the plain text
argument, and compares the return value with the expected value (obtained
by other means -- hand calculated, using a calculator, or another program).
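
A minimal sketch of what I mean, with a trivial XOR "cipher" standing in for
the real encryption routine (all names and values here are invented for the
example):

#include <stdio.h>
#include <string.h>
#include <stdint.h>

/* Hypothetical function under development; in TDD the test below is
   written first, and this body is filled in until the test passes.
   (A trivial XOR "cipher" stands in for the real encryption routine.) */
static void encrypt(const char *plain, uint8_t key, uint8_t *out, size_t len)
{
    for (size_t i = 0; i < len; i++)
        out[i] = (uint8_t)plain[i] ^ key;
}

/* The test: call the function with a fixed plaintext and key, and compare
   the result with an expected value obtained by other means (here worked
   out by hand). Returns 1 on pass, 0 on fail. */
static int test_encrypt(void)
{
    const char *plain = "PIC";
    const uint8_t expected[] = { 0x70, 0x69, 0x63 };  /* 'P'^0x20, 'I'^0x20, 'C'^0x20 */
    uint8_t out[3];

    encrypt(plain, 0x20, out, 3);
    return memcmp(out, expected, 3) == 0;
}

int main(void)
{
    printf("test_encrypt: %s\n", test_encrypt() ? "PASS" : "FAIL");
    return 0;
}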

The tests I'm talking about are internal, often they reside in the same
modules as the functions under test.

I have also written external test programs. For example, there is a Delphi
program I wrote that checks that the PIC inside a device was correctly
programmed and functions properly. It automated what was previously a
manual process, saving time and eliminating human error. Its operation is
extremely simple: it sends out messages to the PIC, and compares the
responses to a list of expected responses. Same idea, "distributed"
implementation.
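
Very roughly like this -- the real program is Delphi and talks over a serial
port, so take this C sketch, with its fake device and invented commands,
purely as an illustration of the idea:

#include <stdio.h>
#include <string.h>

/* Stub standing in for the real serial link to the PIC; the actual tester
   sends each command over the port and reads the reply back. */
static const char *query_device(const char *cmd)
{
    if (strcmp(cmd, "*IDN?") == 0)    return "DEVICE v1.2";
    if (strcmp(cmd, "SELFTEST") == 0) return "OK";
    if (strcmp(cmd, "READ0?") == 0)   return "0";
    return "ERR";
}

/* Table of commands and the replies a correctly programmed unit must give;
   the test walks the table and compares. */
static const struct { const char *cmd; const char *expect; } cases[] = {
    { "*IDN?",    "DEVICE v1.2" },
    { "SELFTEST", "OK" },
    { "READ0?",   "0" },
};

int main(void)
{
    for (size_t i = 0; i < sizeof cases / sizeof cases[0]; i++) {
        const char *reply = query_device(cases[i].cmd);
        if (strcmp(reply, cases[i].expect) != 0) {
            printf("FAIL: %s -> \"%s\" (expected \"%s\")\n",
                   cases[i].cmd, reply, cases[i].expect);
            return 1;
        }
    }
    printf("device PASSED\n");
    return 0;
}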


[snip]
> This all is not to say that tests are not a good thing. This is to say
> that they are a complex and potentially expensive undertaking.

Simple tests don't cost much, and can be useful.


{Quote hidden}

You once again prove that we agree more often than not. :)

Vitaliy

2009\02\05@073600 by olin piclist

Vitaliy wrote:
> Perhaps we're talking about different things? The tests I have in mind,
> normally test the high and low limits, plus some random value. For
> example, the following functions could be tested using this method:
>
> - CRC/checksum
> - Currency conversion
> - ASCII to Unicode
> - String to Hex

Those are all very simple function.  You'd probably stuff in a few manually
created test cases and manually check the result.  They are so trivial that
testing is hardly a issue and isn't where much of overall testing time goes
in a large software project.  For that reason I wasn't even considering
these in the discussion.

Here's a example from one of my current projects:  On this project I'm
currently only doing host software.  The problem is to find the location of
a active RF tag given the signal strength it was received with by a number
of receivers at known but arbitrary locations.  In large installations there
could be 100s to small 1000s of receivers and 1000s of tags.  The tags all
have to be tracked in near real time.  I don't want to get into the details
of the system beyond what is publicly said about it, but trying to figure
out how well the system is working, comparing location resolution
algorithms, regression testing, and time performance testing is not trivial.
One problem is that there are many sources of noise, both on direct
measurement and the understanding of when they occurred.  Some of this noise
is random, some is systematic, some depends on particular use patterns.
Then there are various tweak factors and algorithms to deal with certain
special cases.  There are currently more people and more overall time spent
on testing, verification, creating automated test suites, and
interpreting their results than on creating the software that is being
tested.

Even though there is a crunch right now and I'm actually working on new and
clever algorithms, I spent most of Tuesday writing code to visualize some
intermediate results by writing images with annotation.  The data is too
complicated and too large to understand by looking at lists of numbers or
even simple plots.  Despite the crunch, this was time well spent and the
resulting images have already been quite useful.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\02\05@080039 by Rolf

Vitaliy wrote:
{Quote hidden}

That's a fair assessment of what I feel, and the way I read Gerhard and
Olin's replies too. The last sentence is too broad though... there are
rules, good rules, but like most rules, there are times to break them
too. So, I guess it would be more accurate to say "there are guidelines,
but all guidelines have an appropriate context".

[snip]
> In addition, the statements you guys made reveal that you don't really know
> what Agile is. If you don't know what it is, how can you make judgements
> about its effectiveness?
>  
I freely admit to not knowing much about Agile at all. Further, all I
have really said is that the way we adapt/do things at work is fairly
flexible, and we do all sorts of different things, some 'Agile' in
nature, and some not. Also, I never suggested that Agile is not
effective, only that it is not effective for what we/I do at work.

[more snipped]
> The reason I started this thread was to try to clear up the confusion about
> what Agile is not. New concepts, especially ones that have a catchy name
> that begins with a capital letter, usually meet with resistance -- it is to
> be expected.
>  
I have no idea why you started this particular thread, but I recall you
asking me how things happen at my work. I obliged and gave some insight
on how things happen. I expected you to use that information for
something, but I am not comfortable with you then trying to convince me
that what we do is wrong. I feel somewhat 'used' if you only asked me
how things work just so that you can tell me I am wrong. Go find someone
else to convert ;-) But, suggesting that what I am doing is somehow
inadequate, and then implying that I am somewhat 'slow' or 'entrenched'
because I don't want to change is insulting too. Where change has
realistic and significant benefits I am very swift to change course.
> If what you have works for you, great -- there's no reason to change if you
> are happy where you are. If, on the other hand, you are dissatisfied with
> the status quo (like I was three years ago), I would encourage you to
> explore Agile.
>
> Best regards,
>
> Vitaliy
>
>  
You know, Vitaliy, there are a lot of things happening in life. The
industry I work in is struggling (finance and software both), and people
are getting laid off all over the place. I am married, have a 3yr old
and a 5yr old. They are lots of fun, and take a lot of my time. I do
electronics, woodworking, and photography as a hobby, and also do a
bunch of other things. At work I am exploring all sorts of new areas in
both finance (credit risk), and technology (linux blade clusters, new
database systems, etc.). I have a system that gets things done as well
as prioritizes what needs to happen and when. My system  works for me,
and keeps me sane. There are enough stress points in my life that
introducing another one just to experience change seems daft.

I have no intention of ignoring my other demands and passions just so
that I can indulge your encouragement to try new things. Perhaps I am
set in my ways in some ways, but I am happy with things that work. Just
because you may not work the same way, and what works for me may not
work for you, does not make your systems any better (or worse).

For the record, I live in Canada. It is great, and is good for everyone.
If where you are works for you -- there's no reason to change if you are
happy where you are. If, on the other hand, you are dissatisfied with
the status quo (like I was before I came to Canada), I would encourage
you to emigrate. New places, especially ones that have a catchy name
that begins with a capital letter, usually meet with resistance -- it is
to be expected.

See, it just sounds belittling ....

Best regards

Rolf


2009\02\05@085215 by Rolf

Vitaliy wrote:
{Quote hidden}

Your original assertion was that the tests are simpler than the
program. Why is this comparing apples and banana pies?
{Quote hidden}

I can see how at face value it may seem simple; perhaps the example is
trivial. How about if I said "how do you test a 'savings account' where
interest is calculated/adjusted with each deposit/withdrawal in a
multi-threaded environment"?... Then again, perhaps my industry-specific
bias makes that 'simple' operation a whole lot more complex to get
right.... It's like telling an EE to just get a 16-bit ADC for an analog
signal... they freak when you have no idea about noisy ground
planes/power supplies, jitter and bandwidth.... and you say "What's
complicated about it?".

For the record, in a multi-threaded environment it is not unusual for
the test mechanisms to subtly (and sometimes not so subtly) alter the
process flow, in such a way that the code behaves differently without the
tests than it does with the tests...
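
To give an idea of what such a harness involves, here is a minimal sketch
(POSIX threads assumed; the trivial mutex-protected queue merely stands in
for the real implementation under test): several producers and consumers
hammer the queue at once, and the test checks that everything that went in
comes out. Build with something like "gcc -pthread".

#include <pthread.h>
#include <sched.h>
#include <stdio.h>

#define QSIZE    64
#define NPROD     4
#define NCONS     4
#define PER_PROD 10000          /* NPROD*PER_PROD must be divisible by NCONS */

/* Trivial mutex-protected ring buffer; in a real test this part is the
   implementation under test, not part of the harness. */
static int q[QSIZE];
static int q_head, q_tail, q_count;
static pthread_mutex_t q_lock = PTHREAD_MUTEX_INITIALIZER;

static int q_put(int v)         /* 0 = ok, -1 = full */
{
    int ok = -1;
    pthread_mutex_lock(&q_lock);
    if (q_count < QSIZE) {
        q[q_tail] = v; q_tail = (q_tail + 1) % QSIZE; q_count++; ok = 0;
    }
    pthread_mutex_unlock(&q_lock);
    return ok;
}

static int q_get(int *v)        /* 0 = ok, -1 = empty */
{
    int ok = -1;
    pthread_mutex_lock(&q_lock);
    if (q_count > 0) {
        *v = q[q_head]; q_head = (q_head + 1) % QSIZE; q_count--; ok = 0;
    }
    pthread_mutex_unlock(&q_lock);
    return ok;
}

/* Each producer pushes the values 1..PER_PROD. */
static void *producer(void *arg)
{
    (void)arg;
    for (int i = 1; i <= PER_PROD; i++)
        while (q_put(i) != 0)
            sched_yield();      /* queue full: retry */
    return NULL;
}

/* Each consumer pulls its share of the items and sums what it saw. */
static long sums[NCONS];

static void *consumer(void *arg)
{
    long *sum = arg;
    int v;
    for (int i = 0; i < (NPROD * PER_PROD) / NCONS; i++) {
        while (q_get(&v) != 0)
            sched_yield();      /* queue empty: retry */
        *sum += v;
    }
    return NULL;
}

int main(void)
{
    pthread_t prod[NPROD], cons[NCONS];

    for (int i = 0; i < NCONS; i++)
        pthread_create(&cons[i], NULL, consumer, &sums[i]);
    for (int i = 0; i < NPROD; i++)
        pthread_create(&prod[i], NULL, producer, NULL);

    for (int i = 0; i < NPROD; i++)
        pthread_join(prod[i], NULL);
    for (int i = 0; i < NCONS; i++)
        pthread_join(cons[i], NULL);

    /* The sum of everything consumed must match the sum of everything
       produced. */
    long expect = (long)NPROD * PER_PROD * (PER_PROD + 1) / 2;
    long got = 0;
    for (int i = 0; i < NCONS; i++)
        got += sums[i];

    printf("%s (expected %ld, got %ld)\n",
           got == expect ? "PASS" : "FAIL", expect, got);
    return got != expect;
}

Even this toy harness only checks a total, so it would miss ordering bugs or
a lost item compensated by a duplicate, and its extra locking and yielding
already change the timing -- exactly the kind of alteration of the process
flow I mentioned above. A real harness adds per-producer sequence checks and
deliberately varied timings, and that is where the effort balloons.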

Rolf

2009\02\05@091147 by olin piclist

Vitaliy wrote:
>> Early on you may get what look like quick results.  However these don't
>> have any depth.  As you try to slap on more features the lack of coherent
>> architecture gets in the way.  Because nobody wants to throw out the
>> existing work, it gets kludged and patched to add more features.  In
>> the end you have a ball of bandaids if you ever get to the end at all.
>
> Lean methodologies have been around for many years now, and obvious
> concerns like this have been addressed a hundred times.
>
> You seem to assume that "careful architecture approach" guarantees that
> you don't end up with a ball of bandaids.

Careful architecture doesn't guarantee anything, but it does decrease the
chance of ending up with a ball of bandaids.

> The fallacy of this argument
> is that you claim to predict the future. By definition, a project has
> many unknowns. Design upfront means design based on assumptions (aka
> bad information).

So you have two choices.  Architect with some idea of the future or none at
all.  In many cases the "some idea" will be good enough to be useful.

{Quote hidden}

It is bad architecture, or usually the lack of even considering the overall
architecture that got them into this mess in the first place.  Anyone can
make the first 20% of features work.  With a bad architecture, once you get
to the 80% level or so, adding new features breaks old ones to a point where
the project stalls and the consultant is called to "finish" the last 20%.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\02\05@094607 by Rolf

Olin Lathrop wrote:
> Vitaliy wrote:
>  
>>> Early on you may get what look like quick results.  However these don't
>>> have
>>> any depth.  
snip...
{Quote hidden}

and yet again I find myself agreeing with Olin here. Some of my biggest
learning experiences have been when the project is 'nearly' finished and
a last moment requirement change, though 'simple', invalidates much of
the early 'infrastructure' development, or worse, when I realise that I
made a mistake early in the cycle that makes the last part difficult.
Heck, I have been trapped in this cycle with a recent PIC program...
inexperience caught me out.

I have seen it on massive scales too, where inexperience and unknowns
have combined to cause huge project overruns and failures.

There are books written about these problems.

Yet again, it comes down to how you tackle the project at a strategic
level; how you set the foundations of the project determines how it
will look at the end. The actual details of how the bricks get laid are
important, but the overall architecture/design and build strategy are
essential too.

Perhaps Agile has a 'manifesto' about where to prioritize the
design/architecture process as well as the development process....

I think, more relevant to this discussion, it takes an experienced/wise
manager to determine what the first tasks should be so that the
difficult things are accomplished first. Like putting together a jigsaw
puzzle, if you get the process going right, it gets easier and easier as
you get fewer and fewer pieces left to place, and a good manager will
ensure that the last pieces fit, instead of being left with gaps when
pieces do not arrive, or are the wrong size...

Rolf

2009\02\05@141114 by Gerhard Fiedler

Vitaliy wrote:

>>> What was prevailing, then?
>>
>> Non-ideologic, pragmatic approaches, more influenced by the
>> individual's experiences and preferences and the constraints of the
>> situation than by any school of thought.
>
> It sounds like you and Olin have been spared the "scientific
> development" approaches, then. Consider yourself lucky. :)

Yup. Managers that are sworn in to a specific school of management seem
to be a major pain -- from what people tell me :)

Gerhard

2009\02\05@145704 by Bob Blick


On Thu, 05 Feb 2009 09:45:35 -0500, "Rolf" <.....learrKILLspamspam.....rogers.com> said:

> and yet again I find myself agreeing with Olin here.
>
> Perhaps Agile has a 'manifesto' about where to prioritize the
> design/architecture process as well as the development process....
>
> i think, more relevant to this discussion, it takes an experienced/wise
> manager to determine what the first tasks should be so that the
> difficult things are accomplished first. Like putting together a jigsaw
> puzzle, if you get the process going right, it gets easier and easier as
> you get fewer and fewer pieces left to place, and a good manager will
> ensure that the last pieces fit, instead of being left with gaps when
> pieces do not arrive, or are the wrong size...

Wow, I'm agreeing with both Olin and Rolf :)

Here's a great article about how Agile misses the point, it's good
people that make things work, not the framework:

http://thedailywtf.com/Articles/The-Great-Pyramid-of-Agile.aspx

Cheerful regards,

Bob

--
http://www.fastmail.fm - And now for something completely different…

2009\02\05@155332 by Gerhard Fiedler

Rolf wrote:

>>> I get the impression that you consider Agile programming to be the
>>> sledge hammer in an otherwise limitless arsenal of tools, and
>>> you are trying to turn every problem into something a sledge
>>> hammer will fix. The old adage that "When all you have is a hammer,
>>> every problem becomes a nail..." seems to come to mind.
>>
>> Rolf, perhaps you're right, but no one has suggested a viable
>> alternative so far. What I'm hearing from you guys (you, Gerhard,
>> Olin) is, "Every project is different. What works for one project
>> may not work for another project. What doesn't work for one project
>> may work for another project. There are no rules, you must adapt."

This is a good rule. It means we need to pay attention, even when using
Agile-approved methods. It means we need to pay attention to the
conditions and results. It means we need to pay attention to where we start
making compromises, not because something better suited would not be
available, but because what's available is not a "recognized technique"
by whatever authority.


> That's a fair assessment of what I feel, and the way I read Gerhard
> and Olin's replies too. The last sentence is too broad though...
> there are rules, good rules, but like most rules, there are times to
> break them too. So, I guess it would be more accurate to say "there
> are guidelines, but all guidelines have an appropriate context".

Right... a context and limits.

For example, you posted a link earlier about documentation. This guy had
a diagram on his page, where both richness and efficiency of
documentation increase from paper through audio to video. Utter BS, this
diagram, if you start to analyze it.

First, he left out electronic written formats in the documentation graph
in his diagram. This is too strange, given that this guy is writing
about software. There is a huge difference between paper documentation
and (searchable) electronic documentation, even if it's the same content
and even the same layout. It's also probably the most commonly used form
of documentation, at least in electronics and software. So why would he
leave it out? Where would he put the richness/efficiency spot of the
most commonly used documentation medium?

Secondly... efficiency? Can you imagine the "efficiency" of PIC
datasheets in video? Give me a break. Some people say that there is a
whole generation of near-analphabetics growing up, with an attention
span that goes from one commercial break to the next and not much
further, but this doesn't mean that written material is not efficient if
you can use it. I generally can read (and understand) an issue in about
a tenth or a fifth of the time it would take me to work through a video,
and probably most people with decent self-learning skills are similar in
this respect. FWIW, whoever is not good with written stuff shouldn't be
writing software in the first place. So audio and video in this
respect is just silly. (And note that I'm not talking about the second
graph in his diagram, which is about something else, something more
interactive, "modeling" IIRC. I'm talking about the one he calls
"documentation", which is supposedly about documentation.)

No doubt for /some/ documentation situations a video is more
"efficient", but for issues related to software? IME rarely. So there is
definitely missing some context about the efficiency of different media,
and I think applying the usual software documentation context, for me at
least video and audio (even worse) are very inefficient media.


>> In addition, the statements you guys made reveal that you don't
>> really know what Agile is. If you don't know what it is, how can you
>> make judgements about its effectiveness?

In this thread, I commented almost exclusively on your comments (which
includes what you say that Agile is for you), not about Agile in itself.
I don't make judgments about Agile, but comment on specific
affirmations, independently of where they come from. (Like you said that
the Manifesto changed the management world, and I said IME it didn't
because the stuff written there sounds nice but not that new in the
context of the managers I've worked with and for. This doesn't say much
about what I think of Agile and what are considered Agile techniques.)


> For the record, I live in Canada. It is great, and is good for
> everyone. If where you are works for you -- there's no reason to
> change if you are happy where you are. If, on the other hand, you are
> dissatisfied with the status quo (like I was before I came to
> Canada), I would encourage you to emigrate. New places, especially
> ones that have a catchy name that begins with a capital letter,
> usually meet with resistance -- it is to be expected.

Ah, and Brazil... ever thought about moving here? :)

Gerhard

2009\02\05@161046 by Gerhard Fiedler

Rolf wrote:

> Vitaliy wrote:
>> Rolf wrote:
>>  
>>>> I think you just created a straw man. :) I've never heard anyone
>>>> recommend writing a test program for the test program. What you
>>>> have is the program under test (the part that the users use), and
>>>> the tests.
>>>>
>>> Thinking of two examples:
>>>
>>> how do you test the accuracy of your new whizz-bang ultra-accurate
>>> pressure sensor ..... somehow you have to build an even more
>>> accurate pressure container...
>>
>> I think this is like comparing apples and banana pies.
>>  
> Your original assertion was that the tests are simpler than the
> program. Why is this comparing apples and banana pies?

I agree with Rolf. This is a good example of the test environment being
more complex than the DUT. And it is such a common scenario.


>>> how do you test a software implemented queue that is accessed in a
>>> multi-threaded environment for both adding to and removing from the
>>> queue...
>>
>> You write functions that put something in the queue, and other
>> functions that pull stuff from the queue. What's complicated about
>> it?

Ouch... :) The problem is not in the simple case of just putting into
and pulling out of the queue -- in a single-threaded environment, this
is a task for students in an early semester. The problem is testing that
it works well in a multi-threaded environment, with all the different timings.
This is far from trivial. Try it sometime... create a queue that works
in a multi-threaded environment and a test harness for it that tests
whether it works correctly in a multi-threaded environment. I'm sure
you'll find that the test requires much more effort and analysis. It's
not only preventing harmful situations (which is what you need to think
about when creating the queue), it is about creating them -- and
creating all sorts of potentially tricky situations, not only the ones
that are tricky for this specific implementation, but for the problem
per se. (The test shouldn't depend on specific implementation details,
it should test the correctness of the DUT based on the requirements, not
the implementation.)
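
To make that concrete, here is a minimal sketch in plain Python (all names
invented for illustration) of what such a harness has to do: spawn competing
producers and consumers against a small locking queue, then check
requirement-level invariants. Even this stripped-down version is larger than
the queue itself, and it deliberately leaves out a proper FIFO-ordering
check, which takes considerably more care to get right:

import threading
from collections import Counter

class LockingQueue:                      # the DUT: deliberately minimal
    def __init__(self):
        self._items, self._lock = [], threading.Lock()
    def put(self, item):
        with self._lock:
            self._items.append(item)
    def get(self):
        with self._lock:
            return self._items.pop(0) if self._items else None

def test_concurrent_queue(n_producers=4, n_consumers=4, per_producer=1000):
    q = LockingQueue()
    consumed, consumed_lock = [], threading.Lock()
    producers_done = threading.Event()

    def producer(pid):
        for i in range(per_producer):
            q.put((pid, i))              # tagged items make loss/duplication checkable

    def consumer():
        while True:
            item = q.get()
            if item is not None:
                with consumed_lock:
                    consumed.append(item)
            elif producers_done.is_set():
                return                   # no producer can add anything after this point

    workers = [threading.Thread(target=producer, args=(p,)) for p in range(n_producers)]
    workers += [threading.Thread(target=consumer) for _ in range(n_consumers)]
    for t in workers:
        t.start()
    for t in workers[:n_producers]:      # producers were added to the list first
        t.join()
    producers_done.set()
    for t in workers[n_producers:]:
        t.join()

    # Requirement-level checks, independent of how the queue is implemented:
    assert len(consumed) == n_producers * per_producer    # nothing lost
    assert max(Counter(consumed).values()) == 1           # nothing duplicated

if __name__ == "__main__":
    test_concurrent_queue()
    print("completeness checks passed (FIFO ordering not checked -- that's the hard part)")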


> I can see how at face value it may seem simple, perhaps the example is
> trivial.

It may seem trivial, but it isn't. You know that :)

Gerhard

2009\02\05@164546 by Gerhard Fiedler

Vitaliy wrote:

> Gerhard Fiedler wrote:
>>>>> I think the nature of the project is irrelevant. [...]
>>>>
>>>> IMO this is exactly one of the major management fallacies. The
>>>> nature of the project does matter, [...]
>>>
>>> Consider the possibility that we're both right. :)
>>
>> I'd like to. Can you explain this WRT these two statements?
>
> Projects have similarities and differences. There are things that
> will always be true (or at least, most of the time). There are other
> things that will be different.
>
> I was saying that for the things that are always true, the nature of
> the project is irrelevant.

What are the things that are always true? Really, I haven't found much
in this category. People come and go, and schools of thought, too --
this is probably one :)


> First of all, I thought we established that the waterfall model
> doesn't work in the real world.

You may have done that, but not me. I agree that the "pure" waterfall
model usually doesn't work well. But I also think that there are
projects where it is good to have an almost formal requirements
gathering phase before doing any coding, and there are situations in
general where completing one thing before starting another is highly
recommended. Which is the basic principle of the waterfall model
(according to me :).

> Second, Agile does not preclude one from using any tools or
> techniques, as long as they don't contradict the principles.

See, this is what I have a problem with. I could care less whether a
tool or technique contradicts the principles a few guys came up with. If
I have good reason to think it works (and I wouldn't think that it does
if I didn't think I had good reason :), I'm going to use it -- and not
waste a second of thought whether or not this contradicts somebody's
principles.

> And I thought you said you agree with the Manifesto? :)

I said I can agree with most that's written in the Manifesto. But the
Manifesto is rather generic; you can bring it in alignment with a lot --
seemingly with more than most people would consider "Agile".


>> Like e.g. using a Gantt chart to make complex task dependencies
>> visible.
>
> As long as you don't waste time projecting the deadlines six months
> into the future, and creating a monster chart that has every single
> trivial task listed, I don't have a problem with it.

I'm glad to hear that you don't have a problem with me using Gantt
charts. Seriously... :)

> Like I said, I found Gantt charts to be a waste of time.

That may be due to the way you used them, or just a matter of personal
preference.

> There are other, more effective ways to make dependencies visible.

Like for example?


>> Like finding out how the transmission I'm supposed to control
>> actually works before I have damaged five of them by iterating
>> through a learning process.
>
> I think that's a given.

This is what I'm talking about. This looks like a requirement gathering
phase, before even starting to think about code.

>> I am experiencing this, on an ongoing basis. Try to write a really
>> useful test for a non-trivial application, and you'll experience it,
>> too.
>
> I'm seriously considering it. On a few occasions, I got bit by
> "small" changes that I made that broke the code in subtle ways. I'm
> also tired of running manual tests.

Yes, I know what you're talking about. But there is a reason that it is
not so common: it takes a lot of time and effort to do it in a useful
way. Think about it... every time you add a feature to or change one in
your target, you need to update two applications -- and the test apps
are by no means simpler than the target apps. While they add to the
overall stability, they also add a lot to the schedule.


> Sure, writing a suite of tests that tests *everything* is a waste of
> time. But it probably makes sense to automate the testing of some
> functions. The law of diminishing returns says that at some point
> writing more tests would have a net negative impact on productivity.

Exactly. But you also need to put some considerable effort into it
before you start getting a productivity result. I don't know where the
balance is, and it probably depends a lot on the specific project and
the team. (You're getting tired of hearing this, right? :) But neither
testing nor finding the balance is generally trivial IMO.


{Quote hidden}

(Side note: Some do, some don't. As I said, I've been contracted before
to write a test software, and I did write tests for my test software. So
it is being done, in certain cases. Olin also provided an example.)

Just explain this to me:

Let's say that it can be shown (scientifically nonetheless) that it
results in better software when the software (of any kind) is written so
that the test software is written before the target software. Since this
applies to software of any kind, it also applies to test software. So if I
don't write test software with tests for it first, does it mean that the
test software is of inferior quality? If it isn't, what does this mean
for the premise? If it is, what does this mean for the reliability of
the tests?


> The only difference is the order: traditionally, tests are written
> after the user functions have been implemented.

Again this "traditionally". You seem to know an awful lot about what is
tradition in the business. I don't seem to know even a small percentage of
it.

For example, IME tests are "traditionally" rarely written in an
organized, complete manner. Most complex software has some test
harnesses, some test cases, but rarely a complete unit test/overall
function test environment.

IMO it matters more that the tests are written by a team different from the
one writing the target code than when exactly this happens. Think about it... in
both cases, what you need is a clear picture of the requirements.
Whether you get this writing a test or writing the target doesn't really
matter all that much. What you have to do in the end is debugging /both/
when you bring them together... there's nothing that guarantees that
your test application will be correct. (And believe me, in all but the
most trivial cases it won't be.)


>> Also, think about this... I've been contracted to write a test
>> program -- only the test program. And I did write a test program for
>> the test program :)
>
> You mean, you wrote a program that the test program was testing? Kind
> of like creating test data?

No, I wrote a program to test some other program. I was the "other" guy;
they didn't want the test program to be written by the same guy who did
the target program. So even though my program was a test program, I
treated it like any other program: I wrote tests for parts of it. I mention
this, together with what I wrote above, to illustrate that test programs
are not any different from other programs -- so whether you write them
first or second doesn't really make much of a difference.


>>> What you have is the program under test (the part that the users
>>> use), and the tests.
>>
>> Where do the tests come from?
>
> They are based on the requirements.

So we do gather requirements before we write code? Aren't we now getting
dangerously close to a (modified) waterfall model: gather requirements
before writing tests, then writing tests before writing target code,
then writing target code?


> Simple tests don't cost much, and can be useful.

Of course. This is a perfect application of the "it depends" principle
:)

Gerhard

2009\02\05@170446 by Sean Breheny

Hi Bill,

I'm not sure what type of "provable" you mean. There is very much an
ongoing effort in CS and engineering circles to have programming
languages for which one can automatically prove that functions do what
they are intended to do for all inputs. In other words, if you have a
programming language which is restrictive enough (for example, does
not allow intercommunication between functions except by explicit
parameter passing), you can then state the proposition that function
"CountWidgets(x)" always returns the correct value for any x in the
set of integers. This proposition can then be fed to an automatic
theorem prover program which can produce a mathematical proof that the
proposition is true (or false, or perhaps unable to determine).
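
As a rough illustration of the shape of such a proposition (a sketch only,
with invented names; it is not a proof, and real tools in this space, such
as Dafny, SPARK or Frama-C, work symbolically rather than by testing): the
specification is stated separately from the implementation, and the prover
is asked to show that the two agree for every possible input.

def count_widgets(items):
    """Implementation under scrutiny: count the widgets in 'items'."""
    n = 0
    for x in items:
        if x == "widget":
            n += 1
    return n

def count_widgets_spec(items):
    """Independent statement of what 'correct' means for count_widgets."""
    return sum(1 for x in items if x == "widget")

# A prover would be asked: for *all* lists 'items',
#     count_widgets(items) == count_widgets_spec(items)
# Below, the same proposition is only sampled over a few inputs, not proven:
for items in ([], ["widget"], ["bolt", "widget", "widget"], ["bolt"] * 3):
    assert count_widgets(items) == count_widgets_spec(items)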

It is also true, though, that one can prove that certain vital aspects
of a program cannot be proven true in general. For example, the
halting problem:

http://en.wikipedia.org/wiki/Halting_problem

which says that for important classes of programs and possible inputs,
it is not possible to create a general algorithm which shows whether
they terminate or loop forever.
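
For what it's worth, the core of that argument can be sketched in a few
lines of Python; halts() below is the hypothetical oracle (it cannot
actually be implemented, and is stubbed here only so the sketch parses):

def halts(f, arg):
    """Hypothetical oracle: True iff f(arg) would terminate.
    Stubbed so the file parses -- the point is that it cannot be written."""
    raise NotImplementedError

def paradox(f):
    if halts(f, f):      # ask the oracle about f applied to itself
        while True:      # if f(f) would halt, loop forever instead...
            pass
    return               # ...and if it would loop forever, halt immediately

# Whatever halts(paradox, paradox) would answer, paradox(paradox) does the
# opposite, so no correct, fully general halts() can exist.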

Sean


On Thu, Feb 5, 2009 at 3:44 AM, William Chops Westfield <EraseMEwestfwspam_OUTspamTakeThisOuTmac.com> wrote:
{Quote hidden}


2009\02\05@171528 by Rolf

Bob Blick wrote:
{Quote hidden}

Hey, it's a PicList Love-Fest. Who's for a hug?

The blog entry is a good one. I must learn more about Agile "Mantras" so
I can be more critical about arguing both for and against the concepts.

People at work I respect took it upon themselves to investigate the
concept, and, from that, 'borrowed' some concepts, but declared Agile as
a whole to be a non-starter for us on a company-wide scale.

I simply trusted their judgment on it, but maybe I will spend some time
sometime, and look deeper.

Rolf

2009\02\05@235206 by Vitaliy

"Bob Blick" wrote:
> Here's a great article about how Agile misses the point, it's good
> people that make things work, not the framework:
>
> http://thedailywtf.com/Articles/The-Great-Pyramid-of-Agile.aspx

What a bunch of nonsense. :)  I guess that's what one should expect from a
website with that kind of name.

For those who like to think for themselves and are not simply tempted to
join the chorus of nay-sayers: this article has it backwards. It is the
traditional methodologies (the likes of CMM) that assume it is possible to
use rules, procedures, and processes to force mediocre programmers to
produce good code.

Vitaliy

2009\02\05@235652 by Vitaliy

Rolf wrote:
> The blog entry is a good one. I must learn more about Agile "Mantras" so
> I can be more critical about arguing both for and against the concepts.
>
> People at work I respect took it upon themselves to investigate the
> concept, and, from that, 'borrowed' some concepts, but declared Agile as
> a whole to be a non-starter for us on a company-wide scale.
>
> I simply trusted their judgment on it, but maybe I will spend some time
> sometime, and look deeper.

Rolf, if you really do that, I would feel that my time in this thread was
well spent. :)

Seriously, it's very hard to have a meaningful conversation with people who
have preconceived notions about Agile that they acquired through hearsay. I
got kind of tired of explaining what Agile is not, and explaining the basics
of the methodology.

Vitaliy

2009\02\06@001446 by Bob Blick

Vitaliy wrote:

> For those who like to think for themselves

I think that all the nay-sayers in this thread ARE the ones thinking for
themselves. You drank the Kool-Aid.

I'm not saying Agile doesn't have some good practices. If it makes you
feel good, then maybe it's good for you to implement it. But if you had
Olin, Gerhard, Rolf and me as your employees, think how we'd respond if
you laid this out as the new plan. Did I say respond? I meant revolt.

Simple rules don't fit when humans are involved.

Cheerful regards,

Bob

2009\02\06@024633 by Vitaliy

Bob Blick wrote:
> I think that all the nay-sayers in this thread ARE the ones thinking for
> themselves. You drank the Kool-Aid.

Well, I can only say I'm glad that at least you didn't specify what I
smoked. :)


> I'm not saying Agile doesn't have some good practices. If it makes you
> feel good, then maybe it's good for you to implement it. But if you had
> Olin, Gerhard, Rolf and me as your employees, think how we'd respond if
> you laid this out as the new plan. Did I say respond? I meant revolt.

Olin already quit:
> Seeya.  So long.  Get someone else to do this job.

Agile is about adapting to change. So if you guys really did work for me,
you would adapt. If you couldn't adapt, you would be swiftly looking for
employment elsewhere. :)

Seriously though, I think that once you've seen the practices in action, you
would change your mind. Unless you're just stubborn. In which case I'd fire
you. ;)


> Simple rules don't fit when humans are involved.

Agile is *not* about rules.

Did I mention that I'm tired of arguing about what Agile is not?

If you want to have a meaningful debate, you can start by reading the Agile
Manifesto (takes about 30 seconds to read in its entirety). Then, tell me
which part of it you disagree with, and why. Brownie points for doing the
same with the principles. Later we can move on to the practical matters of
organizing the team, collaborating with customers, and managing an agile
project.

Vitaliy

2009\02\06@024906 by Vitaliy

"Gerhard Fiedler" wrote:
>>> Non-ideologic, pragmatic approaches, more influenced by the
>>> individual's experiences and preferences and the constraints of the
>>> situation than by any school of thought.
>>
>> It sounds like you and Oline have been spared the "scientific
>> development" approaches, then. Consider yourself lucky. :)
>
> Yup. Managers that are sworn in to a specific school of management seem
> to be a major pain -- from what people tell me :)

Oh yeah? :)  Which people?

FWIW, I'm not "sworn into" anything. With one exception:

http://en.wikipedia.org/wiki/Oath_of_citizenship_(United_States)

Vitaliy

2009\02\06@032912 by Vitaliy

"Rolf"
>> Rolf, perhaps you're right, but no one has suggested a viable alternative
>> so
>> far. What I'm hearing from you guys (you, Gerhard, Olin) is, "Every
>> project
>> is different. What works for one project may not work for another
>> project.
>> What doesn't work for one project may work for another project. There are
>> no
>> rules, you must adapt."
>>
>>
>>
> That's a fair assessment of what I feel, and the way I read Gerhard and
> Olin's replies too. The last sentence is too broad though... there are
> rules, good rules, but like most rules, there are times to break them
> too. So, I guess it would be more accurate to say "there are guidelines,
> but all guidelines have an appropriate context".

Agile does not have "rules". There is the Manifesto (a set of values), and
the agile principles. Nobody seems to have a problem with either, but so far
everyone has chosen to tilt at windmills.


{Quote hidden}

OK, I better address this, point-by-point. :)

- The reason I started this thread, was because of the statements you've
made about Agile, which I felt were inaccurate.
- I asked you about how things are done at your place of work, because I am
genuinely interested in learning about how software development works in the
real world.
- I am by no means an expert on Agile, or software development. Therefore I
held my objections, until I've had a chance to read about your experience,
and why Agile made a negative impression on you.
- I'm sorry that I made you feel uncomfortable and "used", it was not at all
my intent. I don't think I ever attacked you personally?
- I wasn't expecting to convert any of you. My intent was simply to correct
statements which I know to be wrong.
- You're not slow. Most people resist change, but I wouldn't call anyone
'entrenched' just because they're skeptical about Agile.
- I'm sorry I made you feel insulted. As I said before, I meant no offense,
and tried to attack the idea and not the person -- but I know that sometimes
I can be too direct. I'm sorry.


>> If what you have works for you, great -- there's no reason to change if
>> you
>> are happy where you are. If, on the other hand, you are dissatisfied with
>> the status quo (like I was three years ago), I would encourage you to
>> explore Agile.
>>
> You know, Vitaliy, there are a lot of things happening in life. The
> industry I work in is struggling (finance and software both), and people
> are getting laid off all over the place.

Our industry is suffering as well. Since September, our company lost 8
people (half of our staff) to attrition and layoffs.


>  I am married, have a 3yr old
> and a 5yr old. They are lots of fun, and take a lot of my time.

I'm happy for you. I'm also married, and have a six year old girl and a six
month old boy.


> I do
> electronics, woodworking, and photography as a hobby, and also do a
> bunch of other things. At work I am exploring all sorts of new areas in
> both finance (credit risk), and technology (linux blade clusters, new
> database systems, etc.). I have a system that gets things done as well
> as prioritizes what needs to happen and when. My system  works for me,
> and keeps me sane. There are enough stress points in my life that
> introducing another one just to experience change seems daft.

Would you mind telling me about your system? I primarily rely on my paper
planner, and reminders in my cell phone. I also make lots of checklists.

I really wish we could set aside the ideological differences, and focus on
sharing the practical things that work.


> I have no intention of ignoring my other demands and passions just so
> that I can indulge your encouragement to try new things. Perhaps I am
> set in my ways in some ways, but I am happy with things that work. Just
> because you may not work the same way, and what works for me may not
> work for you, does not make your systems any better (or worse).

I meant what I said above. If your system works for you, great. My system
did not work for me, so I desperately looked for ways that were better. Lean
development had a drastic impact on the way I manage projects and write
software.


> For the record, I live in Canada. It is great, and is good for everyone.
> If where you are works for you -- there's no reason to change if you are
> happy where you are. If, on the other hand, you are dissatisfied with
> the status quo (like I was before I came to Canada), I would encourage
> you to emigrate. New places, especially ones that have a catchy name
> that begins with a capital letter, usually meet with resistance -- it is
> to be expected.
>
> See, it just sounds belittling ....

Not to me. :)

I actually emigrated once already, to the United States. I lived in a state
bordering Canada, then moved to a state bordering Mexico.

I am indeed very happy where I am.

Vitaliy

2009\02\06@035604 by Vitaliy

"Gerhard Fiedler" wrote:
>>> "Every project is different. What works for one project
>>> may not work for another project. What doesn't work for one project
>>> may work for another project. There are no rules, you must adapt."
>
> This is a good rule. It means we need to pay attention, even when using
> Agile-approved methods. It means we need to pay attention to the
> conditions and results. It means we need to pay attention where we start
> making compromises, not because something better suited would not be
> available, but because what's available is not a "recognized technique"
> by whatever authority.

What are you attacking with this, Gerhard? Do you expect me to argue that
with Agile, you don't need to pay attention to conditions and results? You
build strawmen, and force me to explain every time, what Agile is not. For
example, your statement above implies that Agile prohibits one from using
"unrecognized techniques".


>> That's a fair assessment of what I feel, and the way I read Gerhard
>> and Olin's replies too. The last sentence is too broad though...
>> there are rules, good rules, but like most rules, there are times to
>> break them too. So, I guess it would be more accurate to say "there
>> are guidelines, but all guidelines have an appropriate context".
>
> Right... a context and limits.

Fine, give me some contexts/limits where the values from the Agile Manifesto
would not be applicable.


> For example, you posted a link earlier about documentation. This guy had
> a diagram in his page, where both richness and efficiency of
> documentation increases from paper over audio to video. Utter BS, this
> diagram, if you start to analyze it.

I can't find the document you are referring to, but I'm pretty sure I know
which diagram you're talking about, and I think you misunderstood what it
was showing. The diagram was grading the efficiency of the forms of
_communication_, not _documentation_.


>>> In addition, the statements you guys made reveal that you don't
>>> really know what Agile is. If you don't know what it is, how can you
>>> make judgements about its effectiveness?
>
> In this thread, I commented almost exclusively on your comments (which
> includes what you say that Agile is for you), not about Agile in itself.
> I don't make judgments about Agile, but comment on specific
> affirmations, independently of where they come from.

That's not true. You keep making statements about the vision of Agile that
you have created.

You also said at one point, that the points in Agile manifesto contradict
themselves. I'm still waiting for you to identify the specific points in
question, and explain that statement.


> (Like you said that
> the Manifesto changed the management world,

When did I say this?!!


>> For the record, I live in Canada. It is great, and is good for
>> everyone. If where you are works for you -- there's no reason to
>> change if you are happy where you are. If, on the other hand, you are
>> dissatisfied with the status quo (like I was before I came to
>> Canada), I would encourage you to emigrate. New places, especially
>> ones that have a catchy name that begins with a capital letter,
>> usually meet with resistance -- it is to be expected.
>
> Ah, and Brazil... ever thought about moving here? :)

Why don't you

Vitaliy

2009\02\06@044438 by Vitaliy

Nate Duehr wrote:
>> Did you catch this -- "Customer collaboration over contract
>> negotiation"?
>
> As a non-programmer, but someone who's been involved in business
> purchase decisions, this seems to be Agile's achilles heel.

Glad you decided to join the conversation.


> Customers want to know what they're entitled to before they pay.  If
> they start paying a company that's "Agile" to "collaborate" and the
> collaboration slows down or becomes bogged down in some set of
> details, what recourse do they have?   In the more traditional RFP/
> Requirements/Then Build environment, if the software doesn't meet the
> requirements, they have legal recourse to sue.

This is a valid concern, but one that has been addressed. I posted links to
three documents that describe in detail, how it works. See for example:

   Contracting Agile Projects
   http://tinyurl.com/au7tss

Lawsuits are expensive, and the customer must prove in the court of law,
that the company is in breach of contract.

Agile promotes customer collaboration, which helps avoid lawsuits.
Everything is transparent; at the end of each iteration the customer can see
for himself what has been done.

In the more traditional environment you alluded to, they get assurances in
the form of progress reports. Lawsuits happen when the customer feels he has
been lied to. Working code does not lie.


> If you've convinced them that they can just "collaborate" with your
> team, and then say -- decide you want to downsize the team
> (effectively making delivery dates stretch out longer), what can they
> do about it if they've been paying and collaborating all along.

They can walk away as soon as they feel they're not getting the results they
want, at a reasonable pace. Every iteration is only a couple of weeks long,
and the customer gets working software at the end of each one.


> Agile seems to be idealistic about software relationships between
> companies being a "forever" thing.

This couldn't be farther from the truth. In fact, because the customer gets
working software at the end of each iteration, they can theoretically decide
in the middle of the project that they have enough functionality, and
terminate the project. Try doing that with a traditional project, and you
will end up with a bunch of untested code, or worse.


> Additionally, how do you handle it when one customer wants to
> completely change what the software does or how to interact with it,
> and the majority of customers want it another way?  Agile never
> addresses that.  They assume "all customers are equal".  We all know
> they're not.  Change this example to "your largest customer who brings
> in 25% of your revenue" and it starts to become very difficult to
> smash Agile development up against the cold reality of customer wants/
> desires, doesn't it?

Another straw man. :)

Now, why would you say that "Agile assumes that all customers are equal"? Is
this something you read in the Agile Manifesto? Or maybe, it's one of the
agile principles?  :)

Agile teams have a concept of "Product Owner". The job of the Product Owner
is to act as a liaison between the customers and the team, and to represent
the interests of the customers on a day-to-day basis.

This also goes back to "customer collaboration": in order for the agile
process to work, customer must be a part of this process. That's what the
sprint meetings are for -- once every iteration, the customer gets working
software, and is expected to provide feedback to the team, and help
prioritize the task list for the next sprint.


> Maybe someone on an Agile team that has lots of customers can
> explain.  Do you just end up having to sales-pitch the smaller
> customers into liking what the largest "collaborators" want?

We rarely work with outside customers on software projects, so I probably
don't have enough credibility to really answer your question. I do assume
the role of product owner from time to time, as do other developers, and
AFAIK what I described above is considered standard practice.

The bottom line is, business relationships, like any other form of
relationships, are based on mutual benefit. When one company contracts
another company to develop software, they are both interested in a
successful outcome. So to me, it makes sense to put the bulk of the effort
into making sure the project succeeds, rather than on negotiating a
contract.

Vitaliy

2009\02\06@051530 by Vitaliy

Olin Lathrop wrote:
>> You seem to assume that "careful architecture approach" guarantees that
>> you don't end up with a ball of bandaids.
>
> Careful architecture doesn't guarantee anything, but it does decrease the
> chance of ending up with a ball of bandaids.

This sounds reasonable to me.


>> The fallacy of this argument
>> is that you claim to predict the future. By definition, a project has
>> many unknowns. Design upfront means design based on assumptions (aka
>> bad information).
>
> So you have two choices.  Architect with some idea of the future or none
> at
> all.  In many case the "some idea" will be good enough to be useful.

It's not black-and-white. While jumping right into coding without any
planning is a terrible idea, it's also possible to architect a project to
death. It's the proverbial "paralysis by analysis".

I'm not advocating against planning per se, only excessive, overly detailed
planning that looks too far into the future, and is therefore largely based
on assumptions. That's why I like the iterative approach, where planning is
followed by writing actual code. You gain useful information after every
iteration.


> It is bad architecture, or usually the lack of even considering the
> overall
> architecture that got them into this mess in the first place.  Anyone can
> make the first 20% of features work.  With a bad architecture, once you
> get
> to the 80% level or so, adding new features breaks old ones to a point
> where
> the project stalls and the consultant is called to "finish" the last 20%.

I've been there. A few months ago I described my experience with refactoring
(you remember, I made the famous statement about keeping functions under 10
lines), but sometimes the mess is so bad you just have to start from
scratch.

Vitaliy

2009\02\06@052457 by Vitaliy

"Rolf"
> Yet again, it comes down to how you tackle the project from a strategic
> level, how you set the foundations of the project that determines how it
> will look at the end. The actual details of how the bricks get laid is
> important, but the overall architecture/design and build strategy are
> essential too.

I'm all for requirements gathering, design, or planning. However, I know
from experience that initial assumptions are often proven wrong. Change
happens, it's a fact of life. It makes sense to accept it and adapt to it,
instead of trying to fight it.


> Perhaps Agile has a 'manifesto' about where to prioritize the
> design/architecture process as well as the development process....

I'm not sure I understand the question 100%. How can you prioritize a
process?


> i think, more relevant to this discussion, it takes an experienced/wise
> manager to determine what the first tasks should be so that the
> difficult things are accomplished first. Like putting together a jigsaw
> puzzle, if you get the process going right, it gets easier and easier as
> you get fewer and fewer pieces left to place, and a good manager will
> ensure that the last pieces fit, instead of being left with gaps when
> pieces do not arrive, or are the wrong size...

IMO, it makes more sense to focus on things with the most business value.

Vitaliy

2009\02\06@053420 by Vitaliy

"Rolf" wrote:
>>> how do you test the accuracy of your new whizz-bang ultra-accurate
>>> pressure sensor ..... somehow you have to build an even more accurate
>>> pressure container...
>>>
>>
>> I think this is like comparing apples and banana pies.
>>
>>
>>
> Your original assertion was that the tests are simpler than the
> program. Why is this comparing apples and banana pies?

Because software is different from physical objects. So much so that most
software analogies brought in from "the real world" are not very useful.
Prime example is comparing software to building a house, and yet Wouter et
al had a fun discussion recently about the merits of top-down, bottom-up,
and middle-out development approaches.


{Quote hidden}

You and Gerhard are both right, I don't have enough expertise to make
qualified statements about your examples. I already said I'm new to TDD, but
the concept makes sense to me, and I find the simple "sanity check"
tests useful.

Vitaliy

2009\02\06@065104 by Vitaliy

Gerhard Fiedler wrote:
>> I was saying that for the things that are always true, the nature of
>> the project is irrelevant.
>
> What are the things that are always true?

Does this mean you finally want to go over the manifesto and the 12
principles, point-by-point? :)


> Really, I haven't found much
> in this category. People come and go, and schools of thought, too --
> this is probably one :)

Only time will tell, right? :)  It's been eight years since the manifesto
was drafted. Agile is by and large accepted by the software industry
(according to a DDJ survey), and is trickling down to the embedded software
industry. I predict that it won't be long until you have a personal
encounter with it.


{Quote hidden}

You accused me earlier, of redefining terms. Now look at what you're doing.
:)

There's nothing vague about the definition of the waterfall model. You have
your phases, and you do them in the prescribed order:

http://upload.wikimedia.org/wikipedia/commons/5/51/Waterfall_model.png

http://en.wikipedia.org/wiki/Waterfall_model

Once you're done with a phase, you can't go back. Once you start coding, you
can't go back to revise the specs. You don't test the code as you write it
(that's what the Verification phase is for, and as the chart shows, it comes
after Implementation).

"Usually doesn't work well" is putting it mildly. How about "never works"?


>> Second, Agile does not preclude one from using any tools or
>> techniques, as long as they don't contradict the principles.
>
> See, this is what I have a problem with. I could care less whether a
> tool or technique contradicts the principles a few guys came up with. If
> I have good reason to think it works (and I wouldn't think that it does
> if I didn't think I had good reason :), I'm going to use it -- and not
> waste a second of thought whether or not this contradicts somebody's
> principles.

Your statement is so shocking that I was temporarily at a loss for words
(an unusual occurrence for me). :)

Say you worked with a younger engineer, who insisted on making his own
mistakes, and trusting only his experience. So one day, he disregarded your
advice, and ruined a transmission. Would it be fair to say that he is dumb,
because he did not listen to someone with more experience (you)?

Or how about an architect, who designs buildings based on what "makes sense"
to him, disregarding the collective experience of hundreds of thousands of
architects who went before him?

What makes software development so fundamentally different from other
engineering disciplines, that other people's (and even one's own) experience
is irrelevant?

Do you not read any books on software development?


>> And I thought you said you agree with the Manifesto? :)
>
> I said I can agree with most that's written in the Manifesto. But the
> Manifesto is rather generic; you can bring it in alignment with a lot;
> it seems with more than most would consider "Agile".

Gerhard, you're very slippery, you know that? :) Give me one example.


>>> Like e.g. using a Gantt chart to make complex task dependencies
>>> visible.
>>
>> As long as you don't waste time projecting the deadlines six months
>> into the future, and creating a monster chart that has every single
>> trivial task listed, I don't have a problem with it.
>
> I'm glad to hear that you don't have a problem with me using Gantt
> charts. Seriously... :)

I'm glad that you're glad. :)


>> There are other, more effective ways to make dependencies visible.
>
> Like for example?

http://en.wikipedia.org/wiki/Critical_Chain_Project_Management

The information density used in this type of chart is higher, it doesn't
force you to assign dates to tasks, and it's "agile" in the sense that it
doesn't lock you into a timeline. In other words, it's more conducive to
getting a project done early.


[snip]
{Quote hidden}

Have you ever heard of JUnit, CUnit, or DUnit? I don't think it makes sense
for me to spend my time basically paraphrasing the basics of TDD. Tests and
the software under test are different things.


{Quote hidden}

You must have misunderstood what I meant. IIRC, you yourself expressed
surprise that TDD requires that tests be written *before* the function that
is being tested. Or is this how it's normally done, in your experience?


> IMO it matters more that the tests are written by a team different from the
> one writing the target code than when exactly this happens. Think about it... in
> both cases, what you need is a clear picture of the requirements.
> Whether you get this writing a test or writing the target doesn't really
> matter all that much. What you have to do in the end is debugging /both/
> when you bring them together... there's nothing that guarantees that
> your test application will be correct. (And believe me, in all but the
> most trivial cases it won't be.)

I guess one can only hope that the wrong results won't match. :)


{Quote hidden}

FWIW, TDD tends to assume that the tests reside inside the program under
test, preferably in the same module/class that the tests are testing.


{Quote hidden}

Not if you do it iteratively. :)  You don't spend three months gathering
requirements, then three months writing tests, and another six months
writing the target code. What you do is define what a function should
do, write a test for it (which should fail initially), then write the
function. It may only take you five minutes (we can call it a
"mini-iteration").

Mind you, I myself don't do the "write the test" part yet, I went as far as
reading a book on TDD and playing with the DUnit. Then I jumped back into
embedded development, where ready-made TDD frameworks do not exist. Timothy
Weber gave me some great ideas last time we talked about TDD, which I'm
planning to put to good use...


>> Simple tests don't cost much, and can be useful.
>
> Of course. This is a perfect application of the "it depends" principle
> :)

I prefer the agile principles to the "every project is different, you're on
your own, start from scratch" kind of advice. ;-)

Show me something better, and we can get together for a public burning of
the Agile Manifesto.

Vitaliy

2009\02\06@083756 by Gerhard Fiedler

Vitaliy wrote:

> "Gerhard Fiedler" wrote:
>>>> "Every project is different. What works for one project may not
>>>> work for another project. What doesn't work for one project may
>>>> work for another project. There are no rules, you must adapt."
>>
>> This is a good rule. It means we need to pay attention, even when
>> using Agile-approved methods. It means we need to pay attention to
>> the conditions and results. It means we need to pay attention where
>> we start making compromises, not because something better suited
>> would not be available, but because what's available is not a
>> "recognized technique" by whatever authority.
>
> What are you attacking with this, Gerhard?

Nothing. I'm agreeing with (and expanding on) what Rolf wrote. Is this
an attack on anything? Didn't you start all this because (and this is
again quoting from memory) you wanted to hear what we think is
important? Here you have (some of) it. I happen to think that this is
more important than to know whether or not a given technique is
Agile-compliant.

> Do you expect me to argue that with Agile, you don't need to pay
> attention to conditions and results?

No. What makes you think I would?

> You build strawmen, and force me to explain every time, what Agile is
> not.

This is not about Agile. This is about the above affirmation from Rolf.

> For example, your statement above implies that Agile prohibits one
> from using "unrecognized techniques".

This is not about Agile, it's about what people wrote. You wrote earlier
in this thread: "Agile does not preclude one from using any tools or
techniques, as long as they don't contradict the principles." This is
pretty much synonymous with "Agile precludes the use of tools or
techniques that contradict the principles."

As I wrote before, when I think a tool or technique is useful in a given
situation, I couldn't care less whether "Agile" (whoever or whatever
that is) "precludes" that or not. I'll use it.

Again, I'm generally not writing about Agile (or what I think it is).
I'm writing about what people write here in this discussion.


>> Right... a context and limits.
>
> Fine, give me some contexts/limits where the values from the Agile
> Manifesto would not be applicable.

"Our highest priority is to satisfy the customer through early and
continuous delivery of valuable software."

There are cases where IMO "early delivery" and "valuable software" don't
go together, meaning that whatever software you will be delivering early
is not valuable and a waste of resources. (Note that I'm not saying that
this is generally wrong, just that this rule has limits and needs a
context.)


"Welcome changing requirements, even late in development. Agile
processes harness change for the customer's competitive advantage."

There are projects where limiting the change of requirements is
important. Sometimes the project manager needs to help the customer to
focus, or else the product never becomes ready -- and there's no
competitive advantage in that. (Note that I'm not saying that this is
generally wrong, just that this rule has limits and needs a context.)


"Business people and developers must work together daily throughout the
project."

A good thing, but only in certain contexts. There are situations and
environments where this would be highly inefficient. Also, not all
programming projects are in the business domain; maybe not even most. I
would also claim that there are projects where it is helpful for the
progress of the project if business people are kept away as much as
possible. (No smiley here...)


"Build projects around motivated individuals. Give them the environment
and support they need, and trust them to get the job done."

Good thing, but the project manager doesn't always have the position to
choose the involved individuals. I'd say in most bigger projects, no one
manager has the position to choose all of them, so this rule has some
limitations. Sometimes you can't build the project around motivated
individuals; you may have to work with what you've got, and the issue
may not be how to transform John into a motivated individual (because
that may be impossible), but how to manage his limited motivation so
that it doesn't harm the progress. Which may include a good dose of
supervision (as opposed to trust).


"The most efficient and effective method of conveying information to and
within a development team is face-to-face conversation."

This clearly needs a context. I have clients thousands of miles away,
and face-to-face conversation with them is nice, but not efficient for
most of the time (that is, when I'm here). /In this situation/ this
"rule" is obviously wrong, so there is a context missing.


"Working software is the primary measure of progress."

Didn't you say earlier that the measure of success is business value?
Working software is not necessarily business value, more working
software is not necessarily more business value. This is probably
downright wrong. (Which is not to say that working software is not /a/
measure of progress, but probably not the primary one.)


"Agile processes promote sustainable development."

This is not a rule, it is a claim. It may or may not be true; this
depends on the definition of the day of what is and what is not an
"Agile process" and on the specific situation where it is applied. It
also doesn't claim that Agile processes promote sustainable development
more than non-Agile processes, so it could even be bad (in the light of
this claim) to use Agile processes in a given situation. This needs a
/lot/ of context to be of any use.


"The sponsors, developers, and users should be able to maintain a
constant pace indefinitely."

This may suit some people, but not others. I used to work in "sprints",
followed by longer periods of, so-to-speak, non-commercial activities. I
liked that better at the time. I don't know why I should have done this
differently, just because the Agile Manifesto says so.


"Simplicity--the art of maximizing the amount of work not done--is
essential."

It seems to me that this is not meant in the way Wally (of Dilbert)
would understand it, so it definitely needs some context.


"The best architectures, requirements, and designs emerge from
self-organizing teams."

This is a rather broad claim, and I don't see how they could
substantiate it. Anybody can claim something like this, but that doesn't
make it true. By the same token, I could claim that the best
architectures, requirements, and designs emerge from teams with good
leadership. Now what?


"At regular intervals, the team reflects on how to become more
effective, then tunes and adjusts its behavior accordingly."

While this would be nice, reality is that there are always some humans
on the teams -- at least that's my experience. And among humans it seems
to be common to have difficulty adjusting certain behavior traits in a
timeframe relevant for a programming project. So just a word of caution:
this may not be achievable when working with humans, so there is
definitely a context missing here.


In general, I think there are situations where this collection applies
rather well, but there are also many situations where it doesn't. So
knowing the (not stated) limitations and the (not stated) context is
quite important when applying the techniques that are based on these
principles, rules and claims.


>> For example, you posted a link earlier about documentation. This guy
>> had a diagram in his page, where both richness and efficiency of
>> documentation increases from paper over audio to video. Utter BS,
>> this diagram, if you start to analyze it.
>
> I can't find the document you are referring to, but I'm pretty sure I
> know which diagram you're talking about, and I think you
> misunderstood what it was showing. The diagram was grading the
> efficiency of the forms of _communication_, not _documentation_.

<http://www.agilemodeling.com/essays/communication.htm>

Look at the diagram again, and tell me how "Documentation Options" is
not about documentation (the author says that these are the "options for
when you are documenting"). As I wrote before, I commented on the graph
"Documentation Options", not the graph "Modeling Options". Maybe read my
earlier comments again, now with the graph in front of you. (And note
that I'm not talking about the relative position of the documentation
dots on the "richness" scale, but on the "effectiveness" scale.)

I was wrong about him not including electronic media; in the text he
writes "paper includes electronic media such as HTML that could be
rendered to paper". To me this is not any less strange than leaving it
out. Electronic documentation, for me at least, is much more effective
than paper documentation: it's searchable. Whoever hasn't understood the
importance (and effectiveness) of being able to (electronically) search
documentation IMO didn't understand much of what documentation is about.
Which kind of reflects on everything else he says about it. And placing
electronic (searchable) documentation at the same effectiveness spot as
paper documentation shows a clear lack of understanding of this crucial
feature.


>> In this thread, I commented almost exclusively on your comments
>> (which includes what you say that Agile is for you), not about Agile
>> in itself. I don't make judgments about Agile, but comment on
>> specific affirmations, independently of where they come from.
>
> That's not true. You keep making statements about the vision of Agile
> that you have created.

Where?

> You also said at one point, that the points in Agile manifesto
> contradict themselves. I'm still waiting for you to identify the
> specific points in question, and explain that statement.

You never asked me before. In this message you asked me, and you got it.


>> (Like you said that the Manifesto changed the management world,
>
> When did I say this?!!

"Agile addresses two big fallacies with project management: (1) men and
months are interchangeable and (2) requirements never change."

Not exactly the same, so I slightly misquoted out of memory. But in
essence, this is what you wrote; addressing two big fallacies with
project management results in changing the management world.


{Quote hidden}

Not sure what you wanted to write here (it seems unfinished), but I hope
you are aware that (IMO) Rolf wrote this kind of tongue-in-cheek, and so
did I. Responding to a similar paragraph of yours (which I snipped).

Gerhard

2009\02\06@084952 by olin piclist

Gerhard Fiedler wrote:
> See, this is what I have a problem with. I could care less whether a
> tool or technique contradicts the principles a few guys came up with.
> If I have good reason to think it works (and I wouldn't think that it
> does if I didn't think I had good reason :), I'm going to use it --
> and not waste a second of thought whether or not this contradicts
> somebody's principles.

I totally agree with Gerhard on this.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\02\06@085216 by Gerhard Fiedler

Vitaliy wrote:

> That's why I like the iterative approach, where planning is followed
> by writing actual code. You gain useful information after every
> iteration.

This is, in principle and in many cases, a good thing. Nobody has ever
claimed that it isn't. But it needs a context and has limits.

There are projects where there is no use in trying to deliver a first
iteration as soon as the Agile Manifesto suggests. Sometimes, the first
iteration takes /much/ longer than all others, because until we get to
the first iteration, we need to gather requirements (for the whole
project), we need to architect (the whole or large parts of the
project)... this all costs time. So treating the first iteration the
same as the others WRT its duration doesn't make sense in many projects.
Which means that this part of Agile doesn't apply to these projects.

Gerhard

2009\02\06@085645 by olin piclist

Sean Breheny wrote:
> I'm not sure what type of "provable" you mean. There is very much an
> ongoing effort in CS and engineering circles to have programming
> languages for which one can automatically prove that functions do what
> they are intended to do for all inputs. In other words, if you have a
> programming language which is restrictive enough (for example, does
> not allow intercommunication between functions except by explicit
> parameter passing), you can then state the proposition that function
> "CountWidgets(x)" always returns the correct value for any x in the
> set of integers. This proposition can then be fed to an automatic
> theorem prover program which can produce a mathematical proof that the
> proposition is true (or false, or perhaps unable to determine).

All you're doing is pushing the human error to whatever input the prover
needs.  At some point a human is going to have to describe what he wants,
and that's the point at which screwups will always happen.

In your example, you may prove that CountWidgets does indeed perform the
stated function correctly.  But what if the error is that you're only
supposed to count blue widgets, not all of them, or the super sized widgets
are supposed to be counted double, etc, etc, etc?

All this correctness proving seems so pointless because it can only prove
the correctness of trivial code, and then only what you said it's supposed
to do, not what it's really supposed to do.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\02\06@090742 by olin piclist

Vitaliy wrote:
> Seriously, it's very hard to have a meaningful conversation with
> people who have preconceived notions about Agile that they acquired
> through hearsay.

That's because you started talking about it but never bothered to explain
it.  No, I'm not going to chase down some web site or follow a link.  You
should be able to explain the basics in a couple of paragraphs.  Any details
beyond that would be beyond this discussion anyway.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\02\06@095218 by Gerhard Fiedler

Vitaliy wrote:

> Gerhard Fiedler wrote:
>>> I was saying that for the things that are always true, the nature of
>>> the project is irrelevant.
>>
>> What are the things that are always true?
>
> Does this mean you finally want to go over the manifesto and the 12
> principles, point-by-point? :)

No, it doesn't. It was a question for you to answer. (FWIW, I responded
to the first request where you asked me about my opinion on the
principles, just a few minutes ago. You never asked this before.)


>> Really, I haven't found much in this category. People come and go,
>> and schools of thought, too -- this is probably one :)
>
> Only time will tell, right? :)  It's been eight years since the
> manifesto was drafted. Agile is by and large accepted by the
> software industry (according to a DDJ survey), and is trickling down
> to the embedded software industry. I predict that it won't be long
> until you have a personal encounter with it.

DDJ notwithstanding, I'm currently working in the software industry and
I don't see it. Going by the quick survey here of people working in the
industry, you seem to be the only one accepting it. Doesn't look like
it's "by and large" accepted.

But I've had my encounters with it, possibly before you. It's been
around, as you noted, since 2001. The fact (of experience) is that it
doesn't work as
claimed in many situations, and that's why my encounters with it were
more peripheral -- as seems to be the case for almost everybody I know
in the industry.

It probably did have some influence, though. But that's a different
story.

I think your picture that the ones who don't believe in it just don't
know enough about it may not always be correct.


{Quote hidden}

I didn't redefine, I defined what I'm talking about. That's different.

> There's nothing vague about the definition of the waterfall model.
> You have your phases, and you do them in the prescribed order:

This is the Waterfall Model (capital letters). I was talking about the
"basic principle" of it. Which is something different. You have a Yageo
resistor, and the basic principle of a resistor; there is a difference.

> "Usually doesn't work well" is putting it mildly. How about "never
> works"?

Right. You could have noticed that I try to abstain from absolutisms; I
think they are (mostly :) evil :)  

But the Waterfall Model is different from the basic principle of it.


{Quote hidden}

You're mixing domains. We are talking about managing people. You are
talking about the design of a transmission controller or a building.
These examples are analog to the architecture of a program, not analog
to the way the programming team is managed. Managing a programming team
and a builder crew is not that different, but of course designing a
program and designing a building is different. But then again, there is
a similar difference between managing a builder crew and designing a
building on one side and between managing a programming team and
designing a program on the other side. But managing and designing are
two different domains, with different rules.

> Do you not read any books on software development?

I do. Again, "development" and "managing" are not the same thing.
Software development includes managing and architecture and coding and
life cycle and maintenance and lots of things. We're talking here about
a management style.

Talking about management, I don't consider only the Waterfall School and
the Agile School. There are others, and there are techniques that are
not really part of any specific school.

Managing people is less "objective" than controlling transmissions or
designing houses or architecting software. While there are certain
rather objective principles that determine the static properties of a
building, there is much less objective knowledge about managing the
people involved in building it.


>> I said I can agree with most that's written in the Manifesto. But the
>> Manifesto is rather generic; you can bring it in alignment with a
>> lot; it seems with more than most would consider "Agile".
>
> Gerhard, you're very slippery, you know that? :) Give me one example.

Just sent you a few.


>>> There are other, more effective ways to make dependencies visible.
>>
>> Like for example?
>
> en.wikipedia.org/wiki/Critical_Chain_Project_Management
>
> The information density used in this type of chart is higher,

When you first mentioned Goldratt, I already checked this out, but I
didn't see (and still don't see) a chart there. So I don't really see
what you're talking about.


> Have you ever heard of JUnit, CUnit, or DUnit?

I know them (well, not DUnit). Have you used them?

Single-threaded unit test is one thing, multi-threaded unit test is
another thing, but testing the real problem (which is integrating
several units, or the whole application) is a completely different
animal.

Besides, you almost never can base a unit test on project requirements
(because the units are way below the project requirements), so if you
want to base tests on project requirements, you need to focus on a
higher level of testing.
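
For reference, a single-threaded unit test of the kind these frameworks
support is only a few lines. This is a hedged sketch using CUnit's basic
interface; the fifo_init/fifo_push/fifo_pop API and "fifo.h" header are
hypothetical stand-ins for the unit under test.

#include <CUnit/Basic.h>
#include "fifo.h"                 /* hypothetical unit under test */

/* Checks one architectural assumption: what goes in comes out in order. */
static void test_push_pop_order(void)
{
    fifo_t q;
    int v;

    fifo_init(&q);
    CU_ASSERT_TRUE(fifo_push(&q, 1));
    CU_ASSERT_TRUE(fifo_push(&q, 2));
    CU_ASSERT_TRUE(fifo_pop(&q, &v));
    CU_ASSERT_EQUAL(v, 1);
    CU_ASSERT_TRUE(fifo_pop(&q, &v));
    CU_ASSERT_EQUAL(v, 2);
}

int main(void)
{
    CU_initialize_registry();
    CU_pSuite suite = CU_add_suite("fifo", NULL, NULL);
    CU_add_test(suite, "push/pop order", test_push_pop_order);
    CU_basic_run_tests();
    CU_cleanup_registry();
    return 0;
}

Note that, as argued above, nothing in such a test traces back to a
project requirement; it only pins down the unit's own contract.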

> I don't think it makes sense for me to spend my time basically
> paraphrasing the basics of TDD. Tests and the software under test,
> are different things.

If you say so...


> You must have misunderstood what I meant. IIRC, you yourself expressed
> surprise [...]

No. No surprise anywhere. Just some disagreement. And other than just
stating that I'm wrong, you didn't really come up with an explanation
why (much less a convincing one).

Again, just because I don't completely agree with TDD, everything that
has been claimed about it (including by you) doesn't mean I don't know
it.

> [...] that TDD requires that tests be written *before* the function
> that is being tested. Or is this how it's normally done, in your
> experience?

I don't quite understand the question, especially because it seems to be
based on a "surprise" that I didn't have. It may help if you didn't
always assume that we (who don't agree with everything you write) just
don't do so because we don't know enough about Agile, TDD, whatever.


{Quote hidden}

It happens. That's why still more verification (outside of both DUT and
test) is necessary.


> FWIW, TDD tends to assume that the tests reside inside the program
> under test, preferably in the same module/class that the tests are
> testing.

Again, this has some merit, but is very limited, especially in
multi-threaded and real-time applications. Another major limitation of
this is that it generally doesn't test project specifications, as these
by definition don't apply to any single unit, but to the overall
program.


>>>> Where do the tests come from?
>>>
>>> They are based on the requirements.

Which requirements? It is statements like these that made me think that
you didn't talk about unit tests. You test architectural assumptions
with unit tests, not project requirements. Go back to the example of the
multi-threaded queue... the test doesn't have much to do with project
requirements.
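
To make the multi-threaded queue example concrete, a typical test for it
looks roughly like the sketch below (POSIX threads, with the same
hypothetical fifo API as above, assumed here to be thread-safe). Again,
nothing in it maps onto a project requirement; it only exercises an
architectural assumption, and even then a single run may not expose a
race.

#include <assert.h>
#include <pthread.h>
#include "fifo.h"                 /* hypothetical thread-safe queue */

#define N_ITEMS 10000

static fifo_t q;

static void *producer(void *arg)
{
    (void)arg;
    for (int i = 0; i < N_ITEMS; i++)
        while (!fifo_push(&q, i))   /* retry if momentarily full */
            ;
    return NULL;
}

int main(void)
{
    pthread_t t;

    fifo_init(&q);
    pthread_create(&t, NULL, producer, NULL);

    /* Consume in the main thread; items must arrive intact and in order. */
    for (int i = 0; i < N_ITEMS; i++) {
        int v;
        while (!fifo_pop(&q, &v))   /* spin until the producer catches up */
            ;
        assert(v == i);
    }

    pthread_join(t, NULL);
    return 0;
}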


>>> Simple tests don't cost much, and can be useful.
>>
>> Of course. This is a perfect application of the "it depends" principle
>> :)
>
> I prefer the agile principles to the "every project is different, you're on
> your own, start from scratch" kind of advice. ;-)

"Start from scratch" and "you're on your own" are additions by you.
Nobody else claimed these. They don't have much to do with "every
project is different".


> Show me something better, and we can get together for a public burning
> of the Agile Manifesto.

Public burnings are something for fanatics. Whoever burns today the
Agile Manifesto because the Static Manifesto says so will burn tomorrow
the Static Manifesto because the Floating Manifesto says so. I'd rather
read all of them, use what I learn from them at my judgment, and move on
with what's important in life :)

Gerhard

2009\02\06@105256 by Rolf

Gerhard Fiedler wrote:
{Quote hidden}

[ Whole bunch of fascinating stuff snipped..... ]

>
> Gerhard
>  


Oh, holy cow batman. Thanks Gerhard. Your post inspired me to read that
website. I could hardly believe that all those things you quoted were
the Agile 'mantra'. They sound like things written by poor advertising
agencies.

So, some quick responses to Agile development now that I have spent (a
little) time reading up on it.... in response to the 13 "Best practices
of AMDD"

1. Active Stakeholder Participation  - Our company has many clients
(hundreds) that all purchase and use the same software.... now what? Our
development happens in 8 countries and our clients are in many language
regions (from Colombia to Japan). Again, now what? "Active Stakeholder
Participation" using "Inclusive tools and techniques". OK, I go to
client meetings on occasion, and we have whiteboards, but I don't think
I can take a ball of string to my next meeting at Citibank. Then again,
we have whole teams of people who communicate regularly with clients and
determine the 'functionality voids' that become requirements for our
systems. They get this from discussions and on-site needs assessments
with the clients, and prioritize things, and we deliver at some point.
This process is all about prioritization and compromise.

2. Architecture Envisioning - they admit that this is really what I
would call 'good project design'... choose the right
environments/people/teams up front because you really don't want to
re-write everything later when your client uses UNIX instead of Windows,
or some other problem. This is where my 'Wise/Experienced' Manager comes
in - they instinctively know how a project will have to unfold in order
to be successful. This setup phase is critical for the success of the
project...

3. Document Late - I am surprised that Wiki pages do not appear in the
things I read. I find them invaluable for maintaining a close
relationship between the specifications and compromises that inevitably
happen during the development of the code. The formal documentation
people can then use the wikis to keep their work in line with reality.
It also allows the documentation process to be started even before the
programming is, and then the documentation can be comprehensive about
the requirements, can be reactive to known issues, and can even be
completed at pretty much the same time as the program itself. Why
document late when you can get better delivery by documenting
concurrently and keeping the documentation up to date with useful tools
like e-mail and wikis? A huge tool for our documentation guys is also
the bug-tracking software we have, because it is intricately linked with
a lot of things and all issues can be documented and the process is
comprehensive. The bug-tracking software has specific components
designed to be used by documentation people so that we (the developers)
can keep the documentation up to date with changing features/known
issues. Documentation is a process, not a product. Well, that's how I
feel anyway. If you leave it all to the last minute you get incomplete
documentation. Sure, it is hard to keep documentation current with the
code, but they don't pay me to slack off on the important stuff.
Documentation/comments/bug-tracking and prioritization are at least as
important as the program itself. Programming alone is only a quarter of
the job.

4. Executable Specification - we have another department of "Financial
Engineers". They determine how to model financial concepts, and part of
that process is the development of meaningful test data so that we can
develop with some form of certainty that we will hit the mark with the
final product. So, sure, I agree that "Executable Tests" are important.
We also have another department that takes the test data and warps and
breaks it in such a way that we ensure that our code can process broken
data and responds appropriately, so we also have an executable
ex-specification too.

5. Iteration Modelling - an Agile iteration is a 'short time' (couple of
weeks or so). This tells me that the developer should know where they
are going to aim for in the next 'iteration' - what they want to get
done. Sure, anything else is wrong, really. All this says is that if you
want to get to place X at the end of an iteration, then you have to know
where X is, and how to get there... I guess the alternative is to not do
iteration modelling, and just program to no 'model', and maybe you will
get somewhere useful. I must be missing something else because this
concept appears so common-sensical.... Perhaps the agile 'Iteration' is
not what I understand, but, if I had to spend some time implementing a
financial feature, I would first determine what my data requirements
were, where that data was to be stored, how I was going to process it
(like in the database, or a separate program), and how I was going to
present the results to the next in line process/report, etc. Break that
down in to 'iterations', and each can be modeled in more detail.

6. Just Barely Good Enough Artifacts - Do only as much as needed for the
situation at hand - I'll comment on this one with the next one....

7. Model a bit ahead - this is a contradiction of 6. I thought that you
should only do as much as needed for the situation at hand...? Now we
need to model/plan/think ahead 'to reduce overall risk'.

8. Model storming - "model on a just-in-time basis" ... "to think
through a design issue". Again, contradictory, do I model a bit ahead,
or do I model only when needed?

9. Multiple models - no idea if this makes sense. The only way it makes
sense to me in the context it is written is if the type of "Model" that
practice 9 refers to is different to the type of model that 5, 6, 7, and
8 refer to.

10. Prioritized Requirements - according to 'stakeholder' interests -
this makes no sense... The stakeholder wants results. Results are the
consequence of a process, and have a logical progression where there are
dependencies (perhaps co-dependencies) that need to be accomplished
before a result is achieved. In order for this to make sense, you have
to break the 'final product' down in to sensible sub-products that make
sense to the client, and then focus on the delivery of a logical subset
of the functionality. This is called "compromising" and is common to
many projects I do. Not all functionality is delivered with the first
version, but will be expanded/completed later. This is part of a client
negotiation process, but, after that is done, then the order of
development is not determined by the client, but by the logical
progression required to accomplish the reduced goal (with the added
complexity of developing with the full product in mind so that, when the
remainder of the product is implemented, it needs as little re-work of
the initial deliverable as possible).

11. Requirements Envisioning - this is just a souped up way of saying
"know what your customer wants". I notice that it explicitly says "at
the beginning of an Agile project". So much for welcoming requirement
changes at the end.....

12. Single Source Information - keep information in one place and one
place only. Get real. I have a bridge to sell anyone who believes this
can be done. Data is unprocessed information. Information is data that
is taken in a context. The context of the data is as important as the
data itself. What the programmer considers to be information is
different from what the end user considers to be information. The same
data has to be represented in different ways to make the data become
information for the different parties. Saying that information should be
kept in one place is diminishing the value of data. Data has to be
expressed in different ways to accomplish different goals.

13. Test Driven Design - No problem with this. I use the technique often
- in context (both big and small scale). But, remember that the tests
need validation too.




All in all, I have to agree with my colleagues at work. Agile is a
non-starter for our company. It seems to be most appropriate for
software projects with clients of a certain nature, and not the system I
have at work. Given that I take issue with most of the 13 best practices
of AMDD, I guess I misunderstand things.

My impression of the 'best practices' is that it is a collection of
compressed 'sound bites' that have lost the weight of the argument in
the compression.

Anyways, I like number 6 the most: "Just Barely Good Enough Artifacts".
Sounds like it can be applied to anything, including agile development
itself....

... I hereby claim to be an Agile programmer because my "programming
methodology" is Just Barely Good Enough! ;-)  ;-)

Finally, the way the whole site is written is not like a 'must-do'
thing. It is more a case of 'Zen Buddhism' and 'third person' stuff. I
have a mental image of Yoda when I read the page.
"The Agility .. you must embrace ... or a Jedi you are not!" "A
disturbance in Agility, your document is too early".

Rolf

2009\02\06@110929 by William "Chops" Westfield


On Feb 6, 2009, at 3:50 AM, Vitaliy wrote:

> the Agile Manifesto.

So back in 1992 Dave Clark said “We reject: kings, presidents, and  
voting. We believe in: rough consensus and running code,”
Would you categorize that as fundamentally "agile" in nature?

BillW

2009\02\16@055145 by Vitaliy

I don't have enough time to respond to everything, but at the same time I
feel I need to bring this thread to some sort of logical conclusion, so...
:)

Rolf wrote:
> Oh, holy cow batman. Thanks Gerhard. Your post inspired me to read that
> website. I could hardly believe that all those things you quoted were
> the Agile 'mantra'. They sound like things written by poor advertising
> agencies.

I posted links, and referenced the manifesto and the principles throughout
this thread. I think it's very typical that people choose to argue until
they're blue in the face, but don't even bother to find out exactly what it
is they're arguing against. It takes less than a minute to read both the
manifesto and the principles, in their entirety.


Gerhard wrote:
>> Why don't you
>
> Not sure what you wanted to write here (it seems unfinished), but I hope
> you are aware that (IMO) Rolf wrote this kind of tongue-in-cheek, and so
> did I. Responding to a similar paragraph of yours (which I snipped).

It was late, I was tired, and I was going to make a joke by inviting you
both to Arizona (it would be a change with a capital "A"). The joke was
lame, but I forgot to delete the beginning of the sentence.

It wasn't my intent to convert any of you (what's in it for me, anyway?). I
simply wanted to set things straight, because Rolf made some statements that
I know are not true. I probably confused the issue even more, for which I
apologize.

I learned a lot nonetheless, and maybe more than anything, this thread
confirmed what I already knew -- for example, that people resist change
(Olin admitted that the problem with Agile is that it has a name).

I don't know where I'm going to go from here. I think I'll still share some
of what I learned from time to time; there are techniques that are useful
whether or not you subscribe to a certain development methodology (although
there is synergy when several techniques are used together).

I don't like to think of programming as a black art, I think we can and must
learn how to do it well.

Vitaliy

2009\02\16@060450 by Vitaliy

Olin Lathrop wrote:
>> Seriously, it's very hard to have a meaningful conversation with
>> people who have preconceived notions about Agile that they acquired
>> through hearsay.
>
> That's because you started talking about it but never bothered to explain
> it.  No, I'm not going to chase down some web site or follow a link.  You
> should be able to explain the basics in a couple of paragraphs.  Any
> details
> beyond that would be beyond this discussion anyway.

Olin, I explained it many times over, but I spent the bulk of my time
talking about what it's not, because you all have your preconceived notions.

You routinely tell people to RTFM. If you can't spend two minutes of your
time to click a link and read the Agile Manifesto and the 12 principles
(the essence of what Agile is about), you are just lazy.

Agile is not difficult to understand, and it's not that "weird", you just
have to open your mind a bit.

Vitaliy

2009\02\16@060745 by Vitaliy

"William "Chops" Westfield wrote:
>> the Agile Manifesto.
> So back in 1992 Dave Clark said “We reject: kings, presidents, and
> voting. We believe in: rough consensus and running code,”
> Would you categorize that as fundamentally "agile" in nature?

No, not really.

2009\02\16@192445 by Gerhard Fiedler

Vitaliy wrote:

> I don't like to think of programming as a black art, I think we can
> and must learn how to do it well.

That's (part of) the point. Agile is "art" -- whether black or white is
in the eye of the beholder.

Gerhard

2009\02\17@005405 by Vitaliy

Gerhard Fiedler wrote:
>> I don't like to think of programming as a black art, I think we can
>> and must learn how to do it well.
>
> That's (part of) the point. Agile is "art" -- whether black or white is
> in the eye of the beholder.

You may be OK with programming being an art, but I find comfort in knowing
that I have tools that give me a degree of predictability that make
programming more of a craft.

Vitaliy

2009\02\17@010002 by Vitaliy

Olin Lathrop wrote:
>> See, this is what I have a problem with. I could care less whether a
>> tool or technique contradicts the principles a few guys came up with.
>> If I have good reason to think it works (and I wouldn't think that it
>> does if I didn't think I had good reason :), I'm going to use it --
>> and not waste a second of thought whether or not this contradicts
>> somebody's principles.
>
> I totally agree with Gerhard on this.

I already addressed this at least once. If an auto mechanic's apprentice
decides that he's going to make his own rules, just because he doesn't care
what a bunch of old geezers think is the proper way to replace a
transmission, I think the outcome would be quite predictable.

However, when it comes to programming, somehow other people's experience
suddenly becomes irrelevant.

Sounds like a double standard.

Vitaliy

2009\02\17@010445 by Vitaliy

Olin Lathrop wrote:
> All this correctness proving seems so pointless because it can only prove
> the correctness of trivial code, and then only what you said it's supposed
> to do, not what it's really supposed to do.

I'm working on a project where trivial checks would be quite useful. For
example, I'm constantly messing up timings when I'm making unrelated changes
to the code logic. Manual testing is proving to be very tedious (I have to
check several scenarios), so I'm seriously considering writing a test
utility that would make sure that the changes I make don't break the
timings.

Vitaliy
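
A minimal sketch of what such a timing check can look like when the code
(or a host build of it) can run on a PC; the routine under test, the
header, and the budget are hypothetical, and on a real PIC the same
check is more likely done by counting timer ticks or simulator stopwatch
cycles than by calling clock_gettime().

#include <assert.h>
#include <stdio.h>
#include <time.h>
#include "device_logic.h"         /* hypothetical: code under test */

/* Hypothetical budget for one pass of the control loop, in microseconds;
 * in a real project this comes from the actual timing requirement. */
#define LOOP_BUDGET_US  500.0

static double elapsed_us(struct timespec a, struct timespec b)
{
    return (b.tv_sec - a.tv_sec) * 1e6 + (b.tv_nsec - a.tv_nsec) / 1e3;
}

int main(void)
{
    struct timespec t0, t1;
    double us;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    run_control_loop_once();      /* hypothetical routine under test */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    us = elapsed_us(t0, t1);
    printf("control loop: %.1f us (budget %.1f us)\n", us, LOOP_BUDGET_US);
    assert(us <= LOOP_BUDGET_US);
    return 0;
}

Run once per scenario as part of the build, and a broken timing shows up
as a failed check instead of a tedious manual measurement.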

2009\02\17@014725 by Vitaliy

Gerhard Fiedler wrote:
>> Does this mean you finally want to go over the manifesto and the 12
>> principles, point-by-point? :)
>
> No, it doesn't. It was a question for you to answer. (FWIW, I responded
> to the first request where you asked me about my opinion on the
> principles, just a few minutes ago. You never asked this before.)

Sent: Monday, February 02, 2009 01:45
{Quote hidden}

I don't think you ever responded to this message. I will try to find your
message where you said you address the points...


> DDJ notwithstanding, I'm currently working in the software industry and
> I don't see it. Going by the quick survey here of people working in the
> industry, you seem to be the only one accepting it. Doesn't look like
> it's "by and large" accepted.

So it would seem. The embedded community has been known for lagging behind.
;-)


> I think your picture that the ones who don't believe in it just don't
> know enough about it may not always be correct.

I only said that you don't know what Agile is, based on the statements
you've made, and the questions you've raised. I'm sure you've been in a
situation where someone was talking about something, and it was obvious to
you that the other person didn't really understand the subject.


{Quote hidden}

This is getting absurd. :)


>> There's nothing vague about the definition of the waterfall model.
>> You have your phases, and you do them in the prescribed order:
>
> This is the Waterfall Model (capital letters).

There is no such thing. Read it for yourself, it's all lowercase:

http://en.wikipedia.org/wiki/Waterfall_model


> I was talking about the
> "basic principle" of it.

If you start reassigning meaning to words, you could be talking about Agile,
and calling it "waterfall model". Absurd.


{Quote hidden}

You basically evaded my question. Call it "software engineering", if you
like -- what makes it different from engineering a building, in the sense of
applying prior experience?


>> Do you not read any books on software development?
>
> I do. Again, "development" and "managing" is not the same thing.
> Software development includes managing and architecture and coding and
> life cycle and maintenance and lots of things. We're talking here about
> a management style.

I disagree. Lean development methodologies cover all of the things you
mentioned. It's more about creating a paradigm of how a team organizes and
works, than pure "management". Refactoring, pair programming, iterative
development have little to do with management.


{Quote hidden}

For some reason, I can't find any good critical chain graphs. But basically,
the reason it's higher density is simple: with a Gantt chart, time
increments are fixed, so a 1-day task is visually (physically) 100 times
shorter than a task that takes 100 days.

In a critical chain diagram, you have circles (or bubbles) that are
connected by arrows. Each arrow has the number of days required to complete
the task. One or 100, doesn't matter (visually).
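
A purely illustrative sketch (tasks and durations made up) of the kind
of dependency network being described, where the duration rides on the
arrow and nothing is drawn to a time scale:

    (Spec) --5d--> (Firmware) --20d--> (Integration) --3d--> (Release)
       |                                    ^
       +--------2d--> (Test rig) ---4d------+

A 1-day arrow and a 100-day arrow take the same space on the page, which
is where the higher information density relative to a Gantt chart comes
from.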

I can send you the book, if you promise you'll read it. :)


>> Have you ever heard of JUnit, CUnit, or DUnit?
>
> I know them (well, not DUnit). Have you used them?

Yes, I played with DUnit a little bit. As I mentioned before, I'm
totally new to TDD, so I'll skip the rest of the TDD comments (you may very
well be 100% correct, and I 100% wrong).


>> I prefer the agile principles to the "every project is different, you're on
>> your own, start from scratch" kind of advice. ;-)
>
> "Start from scratch" and "you're on your own" are additions by you.
> Nobody else claimed these. They don't have much to do with "every
> project is different".

They were implied. I asked you to share your experience, you said you can't
because every project is different. If you can't make any generalizations
(even conditional ones) about what constitutes good development practices,
it's as if you haven't learned anything from your previous experiences
(obviously not the case, I know you have).


>> Show me something better, and we can get together for a public burning
>> of the Agile Manifesto.
>
> Public burnings are something for fanatics. Whoever burns today the
> Agile Manifesto because the Static Manifesto says so will burn tomorrow
> the Static Manifesto because the Floating Manifesto says so. I rather
> read all of them, use what I learn from them at my judgment, and move on
> with what's important in life :)

What's important in life? :)

Vitaliy

2009\02\17@024334 by Vitaliy

Gerhard Fiedler wrote:
>> For example, your statement above implies that Agile prohibits one
>> from using "unrecognized techniques".
>
> This is not about Agile, it's about what people wrote. You wrote earlier
> in this thread: "Agile does not preclude one from using any tools or
> techniques, as long as they don't contradict the principles." This is
> pretty much synonymous with "Agile precludes the use of tools or
> techniques that contradict the principles."
>
> As I wrote before, when I think a tool or technique is useful in a given
> situation, I couldn't care less whether "Agile" (whoever or whatever
> that is) "precludes" that or not. I'll use it.

I think of it more in terms of "natural laws", like the law of gravity. Or
more trivial: if I do A, B happens (if you step on my toe, I will scream in
pain).

I'm sure you have your own set of principles that you follow, "because they
make sense."

If a procedure places more value on documentation than on working code, I
will scrap it -- because it contradicts a principle which I consider sound.


{Quote hidden}

How is it a waste of resources? Why?


> "Welcome changing requirements, even late in development. Agile
> processes harness change for the customer's competitive advantage."
>
> There are projects where limiting the change of requirements is
> important. Sometimes the project manager needs to help the customer to
> focus, or else the product never becomes ready -- and there's no
> competitive advantage in that. (Note that I'm not saying that this is
> generally wrong, just that this rule has limits and needs a context.)

Every iteration starts with a meeting where the product owner (usually, the
customer) is asked to identify the priorities. That's your focus.


> "Business people and developers must work together daily throughout the
> project."
>
> A good thing, but only in certain contexts. There are situations and
> environments where this would be highly inefficient.

I asked you to be specific, remember? :)


> Also, not all
> programming projects are in the business domain; maybe not even most. I
> would also claim that there are projects where it is helpful for the
> progress of the project if business people are kept away as much as
> possible. (No smiley here...)

Give me some real world examples. By the way, I can't speak for the authors
of the manifesto, but I read "business people" to mean "customers" in
general: scientists, FBI agents, etc.


{Quote hidden}

I read somewhere that Sony's engineering is based on self-organizing teams.
The engineers pick the teams they want to work on, make arrangements with
their manager-to-be, and only then tell their current manager. They also
don't hire students based on their GPA, but based on their interests and
non-academic accomplishments, especially projects they did outside of
school. So this principle is applicable to big organizations.

Sure, there are challenges and managers can't always pick the people for
their team. That's too bad, but it doesn't make the principle any less
valid.


> Which may include a good dose of
> supervision (as opposed to trust).

Never worked for me in my entire career as a manager. Did it ever work for
you?

Eventually, I realized that whenever I lose trust in a person, it's time for
us to go our separate ways. Over time, we ended up with a team of highly
motivated, trustworthy individuals. I don't have to worry about employees
slacking off when I'm not there.


> "The most efficient and effective method of conveying information to and
> within a development team is face-to-face conversation."
>
> This clearly needs a context. I have clients thousands of miles away,
> and face-to-face conversation with them is nice, but not efficient for
> most of the time (that is, when I'm here). /In this situation/ this
> "rule" is obviously wrong, so there is a context missing.

Your statement does not invalidate the principle. It may not be possible for
you to have a face-to-face conversation, but it still is the most efficient
and effective method of conveying information. It is something to strive
for.


> "Working software is the primary measure of progress."
>
> Didn't you say earlier that the measure of success is business value?
> Working software is not necessarily business value, more working
> software is not necessarily more business value. This is probably
> downright wrong. (Which is not to say that working software is not /a/
> measure of progress, but probably not the primary one.)

I think you "misunderstand" on purpose. :) Working software is the only
thing that has business value. There is no business value in documentation
per se.


> "Agile processes promote sustainable development."
>
> This is not a rule, it is a claim. It may or may not be true; this
> depends on the definition of the day of what is and what is not an
> "Agile process" and on the specific situation where it is applied. It
> also doesn't claim that Agile processes promote sustainable development
> more than non-Agile processes, so it could even be bad (in the light of
> this claim) to use Agile processes in a given situation. This needs a
> /lot/ of context to be of any use.

I can see how it can be misread. I can paraphrase it as:

"Agile processes *are supposed to* promote sustainable development"

This usually means a reasonable work schedule, without back-to-back
marathons. It keeps people from burning out.


> "The sponsors, developers, and users should be able to maintain a
> constant pace indefinitely."
>
> This may suit some people, but not others. I used to work in "sprints",
> followed by longer periods of, so-to-speak, non-commercial activities. I
> liked that better at the time. I don't know why I should have done this
> differently, just because the Agile Manifesto says so.

As long as you could maintain this pace (frantic development/rest)
indefinitely (i.e., without burning out), and other people are OK with it,
that's fine.


> "Simplicity--the art of maximizing the amount of work not done--is
> essential."
>
> It seems to me that this is not meant in the way Wally (of Dilbert)
> would understand it, so it definitely needs some context.

"The amount of time is limited, do only what's important."


> "The best architectures, requirements, and designs emerge from
> self-organizing teams."
>
> This is a rather broad claim, and I don't see how they could
> substantiate it. Anybody can claim something like this, but that doesn't
> make it true. By the same token, I could claim that the best
> architectures, requirements, and designs emerge from teams with good
> leadership. Now what?

There are more of them, than of you. :)  The authors made the observation
(based on their experience) that people do better work when they are allowed
to organize themselves, and pick their own leaders (instead of having
leaders imposed on them).


{Quote hidden}

Do you not do a "post mortem" after a project is done, or at least reflect
on what went well, and what did not?

This just says that this is a good practice, and that it needs to be done
throughout the project, not at the end of it (which is often too late).


{Quote hidden}

"Documentation Options" is in the context of "Communication Options" (the
title of the chart).

"As the richness of your communication channel cools you lose physical
proximity and the conscious and subconscious clues that such proximity
provides.  You also lose the benefit of multiple modalities, the ability to
communicate through techniques other than words such as gestures and facial
expressions.  The ability to change vocal inflection and timing is also
lost, people not only communicate via the words they say but how they say
those words.  Cockburn points out that a speaker may emphasize what they are
saying, thus changing the way they are communicating, by speeding up,
slowing down, pausing, or changing tones.  Finally, the ability to answer
questions in real time, the point that distinguishes the modeling options
curve from the documentation options curve, are important because questions
provide insight into how well the information is being understood by the
listener. "



>>> In this thread, I commented almost exclusively on your comments
>>> (which includes what you say that Agile is for you), not about Agile
>>> in itself. I don't make judgments about Agile, but comment on
>>> specific affirmations, independently of where they come from.
>>
>> That's not true. You keep making statements about the vision of Agile
>> that you have created.
>
> Where?

Go back and count how many times you used the word "Agile". :)  "Redefining
what was done before the Agile Manifesto as Agile doesn't help anybody.",
etc (as if Agile claims to redefine something that was done before).

Although Rolf is of course more guilty of propagating the myths.

Vitaliy

2009\02\17@024815 by Vitaliy

Olin Lathrop wrote:
{Quote hidden}

Thanks for sharing!


> Even though there is a crunch right now and I'm actually working on new
> and
> clever algorithms, I spent most of Tuesday writing code to visualize some
> intermediate results by writing images with annotation.  The data is too
> complicated and too large to understand by looking at lists of numbers or
> even simple plots.  Despite the crunch, this was time well spent and the
> resulting images have already been quite useful.

Would you say the process of creating the images was as important as the
resulting images?

Vitaliy

2009\02\17@075332 by Gerhard Fiedler

Vitaliy wrote:

> Gerhard Fiedler wrote:
>>> I don't like to think of programming as a black art, I think we can
>>> and must learn how to do it well.
>>
>> That's (part of) the point. Agile is "art" -- whether black or white is
>> in the eye of the beholder.
>
> You may be OK with programming being an art, ...

Nope. I'm not (and you're not either) talking about programming. We are
talking about management of programming. Agile is /not/ about
programming; it's about management of programming.

OTOH, it may make some uncomfortable, but the difference between
("just") solid programming and good programming /is/ art. The difference
between ("just") solid engineering and good engineering is art. The
difference between doing anything solidly and doing it well is art; it's
something that can't (predictably) be taught (even though some are more
inspiring than others) and a feature of the individual.


> ... but I find comfort in knowing that I have tools that give me a
> degree of predictability that make programming more of a craft.

What do you consider the difference between "art" and "craft"
(predictability)? If predictability, predictability of what? What tools
are you talking about (Agile techniques)? And why do you think you can
know that they (Agile techniques, if that's what you mean) increase
predictability?

FWIW and IMO, the relatively frequent refactoring required by Agile
techniques  makes most (solid, "crafty") programmers uncomfortable. This
works better with a team of good (imaginative, "artsy") programmers.

Gerhard

2009\02\17@080029 by olin piclist

Vitaliy wrote:
>> Even though there is a crunch right now and I'm actually working on new
>> and
>> clever algorithms, I spent most of Tuesday writing code to visualize
>> some intermediate results by writing images with annotation.  The data
>> is too complicated and too large to understand by looking at lists of
>> numbers or even simple plots.  Despite the crunch, this was time well
>> spent and the resulting images have already been quite useful.
>
> Would you say the process of creating the images was as important as
> the resulting images?

I'm not sure what you are getting at, but the process itself was largely
irrelevant.  Being able to see the data via the images however was very
useful, so the process to make that possible was necessary and therefore
important to perform.  Nobody here is faulting me for having spent time
writing the software to make these images.


********************************************************************
Embed Inc, Littleton Massachusetts, http://www.embedinc.com/products
(978) 742-9014.  Gold level PIC consultants since 2000.

2009\02\17@102049 by Gerhard Fiedler

Vitaliy wrote:

> I think of it more in terms of "natural laws", like the law of
> gravity. Or more trivial: if I do A, B happens (if you step on my
> toe, I will scream in pain).

IMO there are no "natural laws" in this area. At least not as simple
ones as the Agile Manifesto.

> I'm sure you have your own set of principles that you follow, "because
> they make sense."

No, not really.

> If a procedure places more value on documentation than on working
> code, I will scrap it -- because it contradicts a principle which I
> consider sound.

See, I'm not working towards principles. If documentation is what I
need, that's what I put value in. If it is code, that's what I focus on.


{Quote hidden}

I wrote "there are cases". Delivering something that can be used by a
user costs time and effort. I must make sure that it runs (which may
really not be related to what I'm working on right now), I may have to
make it installable (say, install and configure a database and a bank
communication simulator), I may have to explain (and document!) how it
works, what the limits are, etc. There can be lots of work involved with
delivering even an intermediate result if it is supposed to do something
that makes sense. If that intermediate result by its very definition
isn't of value to the customer, this effort may be wasted.


{Quote hidden}

A customer may not have enough knowledge of the programming process to
be able to identify priorities iteration by iteration. That's often why
they outsource the programming in the first place. I regularly have to
help the customer understand his own priorities -- exactly by
invoking the greater picture, by making clear in simple terms the
consequences of certain priorities, and so on. This often changes the
priorities drastically, and often in the way of "if there is not a
reasonable chance to get feature A within (overall) budget, there is no
need for feature B." So even though he may have chosen feature B as
priority for this iteration, looking at the greater picture, planning
ahead, estimating and budgeting, we come to the conclusion that since
it's not probable that we'll be getting to feature A within the next
five iterations, we won't implement feature B in this iteration. This
all involves quite a bit of planning ahead. Change is not always to the
customer's competitive advantage, as change costs time and money.

Also, in many cases, they don't want to spend the time to decide how to
proceed through the development process. They don't understand it, they
don't want to understand it (it's not their business, they have other
things to do), and so they hire me to do it.


>> "Business people and developers must work together daily throughout
>> the project."
>>
>> A good thing, but only in certain contexts. There are situations and
>> environments where this would be highly inefficient.
>
> I asked you to be specific, remember? :)

Depends on the people involved. Ask anybody here who works in a big
company what they think about marketing people being too closely
involved with product development. There simply are people who need to
be controlled, because they don't see that change is expensive. (I won't
mention any names, if this is what you mean. :)

>> Also, not all programming projects are in the business domain; maybe
>> not even most. I would also claim that there are projects where it
>> is helpful for the progress of the project if business people are
>> kept away as much as possible. (No smiley here...)
>
> Give me some real world examples.

As I wrote above. Some "business people" don't understand much about the
development process and can't see that change is expensive -- after all
"it's all software". (I've read this before here... where was it? :) If
you give them too much influence, the project never finishes. I know
that your premise is to deliver useful software with each iteration, but
this just isn't possible with every project.

With the billing system I wrote about a few times, unless it's complete,
it's just wasted bits. (The initially created documentation, after it
was completed, already had a value without the software. But the
software only had a value after it was almost completely working. There
were no meaningful intermediate deliveries. Well, there were... but only
after some 80% or more were done.) Any substantial change in
requirements before that would have cost tons of money without
delivering any value. And there are "business people" who would try to
make such a change, just because they are the new boss of the department
and think they need to change a few decisions of their predecessor.

> By the way, I can't speak for the authors of the manifesto, but I read
> "business people" to mean "customers" in general: scientists, FBI
> agents, etc.

I don't think that this is what "business people" means. I'm not a
native English speaker, but maybe someone else jumps in here and tells
us what "business people" means. I think the use of this term for
"clients" (and not "clients") reflects the environment the people who
created the Agile Manifesto come from.

Did you ever consider the possibility that the Manifesto was written
from a rather specific background, with a set of specific situations in
mind that don't necessarily reflect the whole spectrum of development
situations?


{Quote hidden}

Either it is a principle that I can apply to every project, or it's not.
If it's not, I need to make a judgment call whether or not to apply it.
With which we are back to the initial point in question: it's a question
of your judgment.

>> Which may include a good dose of supervision (as opposed to trust).
>
> Never worked for me in my entire career as a manager. Did it ever
> work for you?

Yes, and well.

> Eventually, I realized that whenever I lose trust in a person, it's
> time for us to go our separate ways.

Trust is not a Boolean variable. It's not about being able to trust or
not being able to trust. You always can trust people with some things,
but not with others. There are people I can trust to come up with a
suitable architecture, which even if it's not the one I'd have chosen I
can trust is a suitable and good one. But I may not be able to trust
them with keeping an eye on all the details involved with managing a
release (branching and tagging in the repository, creating the right
builds, update all release documentation, etc). And there are people
for whom just the opposite is true. There are other people who are
generally good at what they do, but the quality of what they do is
highly dependent on their mood, which may depend on the situation at
home, so I need to be aware of that and may "trust" that person with
something important some times and not others.

> Over time, we ended up with a team of highly motivated, trustworthy
> individuals. I don't have to worry about employees slacking off when
> I'm not there.

This is good for you. But I'm sure part of this is that you know with
/what/ to trust each one.

Also, as I said, there are many situations where the manager of a
project can't choose all team members. Especially you generally don't
choose the representatives of the client. And they may not be
"trustworthy" in delivering on time, they may not be "trustworthy" in
understanding everything well enough to be able to make the right
decisions for the client, and so on. Note that if the rep of the client
screws up and the project fails, it's generally not that rep's problem,
but mine. That I "covered my ass" and can show that it wasn't a problem
on my end may help sometimes, others not that much. So being able to
assess the amount and type of trust I can put into every involved person
and put suitable controls in place can be really important, to the point
of being crucial for the success of the project.


{Quote hidden}

No, it's not. It /is/ possible, it is just not efficient in this
situation because it costs too much (time and money). In order for it to
be the most efficient method, you need to restrict the definition to only
those situations where the cost of the method is outweighed by its
advantage over alternative methods. But then, duh.

The challenge is not to find the best way in an already ideal situation,
but to find the best way in a real project. And that means to consider
all conditions. There are situations where face-to-face is /not/ the
most efficient way to communicate things -- among others, that's why we
are participating in this mailing list. It would be a lot more efficient
(considering only the communication part, not any other effects) if the
few who are interested in any particular issue just got together and
talked it over. This would definitely be possible if we were willing to
spend the time and resources. But considering our other priorities, it's
not as efficient as doing it this way -- and that's why we are doing it
this way.


{Quote hidden}

There can be business value in documentation that is created as part of
a programming project. I have mentioned a typical example in the first
few messages about this subject, and you have ignored this example ever
since. I also have created many estimates about what it would cost to
get to a certain product that helped many business owners to make the
decision whether or not to do it or how to approach it. Without someone
helping them see the bigger picture, they wouldn't be able to do so.
These estimates (which are a sort of documentation) in some cases had
tremendous business value, before and independently of any line of code
written.

Conversely, there is much more that has business value. Didn't you just
claim in another thread that marketing has lots of business value?
Working software, if it does the wrong thing or not enough, may not have
much value -- or none at all. For pretty much all products, and software
also, there is a threshold where it starts to have value. Where this
threshold is depends very much on the specific product.


{Quote hidden}

It's still a claim and not more. It doesn't even say whether this is a
good thing. (You could say this is implied, but it's definitely not
spelled out.) And it doesn't say whether any specific process actually
does promote it. So what does it say?

{Quote hidden}

It may be fine for you, and it was definitely fine for me, but I don't
think it can be characterized as "maintain a constant pace". There was
no constant pace, there was an extreme change of pace.

So while it was fine, it was something completely different from
"constant pace". And as history shows, I didn't maintain this pace
indefinitely. (For that matter, I'd like to see any programmer maintain
any non-zero pace indefinitely... :) But despite not being a "constant
pace" and "indefinitely", it still was fine, you say. So where leaves
that this statement?


>> "Simplicity--the art of maximizing the amount of work not done--is
>> essential."
>>
>> It seems to me that this is not meant in the way Wally (of Dilbert)
>> would understand it, so it definitely needs some context.
>
> "The amount of time is limited, do only what's important."

Right. Which leaves us with the art of determining what is important.

Important can be, for example, to plan enough ahead to avoid unnecessary
refactorings. As I already mentioned before, unnecessary refactorings
can kill a project. Determining how much to plan ahead to avoid
unnecessary refactoring without wasting time planning the unplannable is
a real art.

Important is also to refactor early when warranted. As it is important
to put off refactoring if there are more important things to tend to.
Having the wisdom to decide which one applies in a given situation is an
art that comes with experience :)


{Quote hidden}

At this point it's a matter of belief, not that different from how
people believe in religion. Always when somebody claims an authority
with some sort of unsubstantiated higher wisdom, I get suspicious. If
they are human, they are fallible. (They're probably fallible even if
they are not human.) I can make the exact same claim about Waterfall
that you make about Agile -- the people who created it (and, according
to you, teach it) outnumber you, and they have much more collective
experience than you and I together. This argument, however, doesn't
convince you in the case of the Waterfall method. Why should it convince
me in the case of Agile?

Also, I'm sure you don't really follow that in your own company in all
cases. I'm sure that in some cases "picking a different leader" would
mean "leaving the team", right?


{Quote hidden}

I do, but I don't base my going forward on the assumption that everybody
on the team "adjusts accordingly".

> This just says that this is a good practice, and that it needs to be
> done throughout the project, not at the end of it (which is often too
> late).

No, this is not what it says. This is part of what it says. If this were
what it said, it would be your phrase in the Manifesto and not theirs.
It says very clearly "the team reflects ... and adjusts its behavior
accordingly". Which may not work with every team.


{Quote hidden}

As I said before, /every/ documentation /is/ communication. It is about
communicating certain things to the reader. Some does it well, some not
that well.

If you think that documentation is not communication, look at Wikipedia
under that aspect. It's a huge pile of documentation, all created and
maintained through sometimes intense communication.

{Quote hidden}

Again, let's say Microchip wanted to communicate to you the features of
their chips in video form, the richest channel in that documentation
options curve. I'm sure that this communication channel is much richer
than the dry datasheets. I'm also sure that you'd get lots of insight
through the conscious and subconscious clues that listening to the chip
developers collectively talking about the chip gives you, including the
subtle details of the timing when they speak. You might even get
valuable insights on errata and their state.

But I'm sure a majority here would not want to trade in datasheets for
such videos -- I wouldn't. Not because I wouldn't appreciate the
richness, but because I don't have time to lose. I want to find quickly
what I'm searching for, without listening for hours to people exchanging
ideas on a rich medium. There is tons of project documentation in the
same realm. Video may be rich, but it is not always (time-)efficient.

What they say is nice and true, but only for specific documentation in
specific situations. In other situations it just doesn't apply. Now
distinguishing where it applies and where not is... you guessed it, an
art and comes with experience.

> ... Finally, the ability to answer questions in real time, the point
> that distinguishes the modeling options curve from the documentation
> options curve, are important because questions provide insight into
> how well the information is being understood by the listener. "

Here they explain that they are really talking about documentation in
the documentation options curve. They clearly say that documentation is
about documentation, as opposed to (direct) communication. That's why
they don't mention any of the interactive channels in the documentation
options curve.



{Quote hidden}

That's not relevant. But you're welcome to pick every mention of "Agile"
I made in this email and explain to me how I was making a statement
about a "vision of Agile". I'm mostly talking about what they write in
their Manifesto, without creating any vision, while you're mostly
talking about the vision you created based on this (which in some cases
seems to diverge substantially from what's written there).

> "Redefining what was done before the Agile Manifesto as Agile doesn't
> help anybody.",

This doesn't make a statement about the vision of Agile. I just said
that if people did something before the Agile Manifesto that is similar
to what people now call "Agile" doesn't make "Agile" what they did back
then -- Agile wasn't born yet. It doesn't say anything about my vision
of what is or is not Agile (now, that it exists).

> etc (as if Agile claims to redefine something that was done before).

I didn't say that Agile claims to redefine something that was done
before. IIRC, you said something to that effect, and this is the answer
to /you/; it's not about Agile or my vision of it. (You really shouldn't
confuse me writing about what you write with me writing about what I
think of Agile. These are two completely different issues.) If you want
to go into more detail on this issue, I'd have to go back and find what
you wrote that prompted me to write this. But I know that it wasn't
about Agile; it was about something you wrote.

So the question still stands: where do I make statements about a vision
of Agile that I created?

Gerhard

2009\02\17@220237 by Gerhard Fiedler

picon face
Vitaliy wrote:

> Gerhard Fiedler wrote:
>>> Does this mean you finally want to go over the manifesto and the 12
>>> principles, point-by-point? :)
>>
>> No, it doesn't. It was a question for you to answer. (FWIW, I responded
>> to the first request where you asked me about my opinion on the
>> principles, just a few minutes ago. You never asked this before.)
>
> Sent: Monday, February 02, 2009 01:45
>>> They are a collection of some common sense principles
>>> and unproven (and unprovable) axioms, nicely worded.
>>
>> Which ones do you have in mind?
>> [...]

> I don't think you ever responded to this message. I will try to find
> your message where you said you address the points...

This is indeed a message that I have tagged to respond to later, because
I wanted to spend a bit more time on some of the issues. So yes, whatever
you wrote in there has gone unanswered for now. Sorry for missing this.

But I did respond to this specific request in the meantime, so I guess
it's somewhat of a moot point.


>> DDJ notwithstanding, I'm currently working in the software industry and
>> I don't see it. Given by the quick survey here of people working in the
>> industry, you seem to be the only one accepting it. Doesn't look like
>> it's "by and large" accepted.
>
> So it would seem. The embedded community has been known for lagging
> behind. ;-)

I'm not sure you understood that I don't see it in the software industry
either. The embedded community may not be lagging behind.

>> I think your picture that the ones who don't believe in it just don't
>> know enough about it may not always be correct.
>
> I only said that you don't know what Agile is, based on the
> statements you've made, and the questions you've raised.

I never talked about what Agile is, only about your claims of what it
does. That's two completely different things. Whether a technique or
process is claimed to be Agile or not -- I let you be the judge. But
whether the technique or process does what you claim, I can very well
comment on, if I have some experience with it. If this doesn't agree
with what you think it does, that doesn't mean that I don't understand
Agile; it just means that we may have different experiences. And
if you postulate that your experience is always valid, then my
experience invalidates your postulate. And all this without me ever
saying anything about Agile.

So from what I wrote you can conclude whether and to what degree I agree
or disagree with your assessment of the efficiency of some Agile
techniques, but not what Agile is or isn't. I'm not talking about Agile
-- this is too diverse, and many people have different views of what it
is or should be. Even "practicing Agilists". So I try to stick to
talking about what you say doesn't work.

For example, when you say that documentation is never useful, I can say
"it has been quite useful in a few projects I've participated in". This
doesn't say anything about what I think Agile is, it just disagrees with
your claim.


{Quote hidden}

Would you say that "completing one thing before starting another" is not
one of the basic principles of the waterfall model? Based on what?
What's so absurd about this? What am I redefining here?


>> I was talking about the "basic principle" of it.
>
> If you start reassigning meaning to words, you could be talking about
> Agile, and calling it "waterfall model". Absurd.

What would you say is the basic principle of the waterfall model? The
Wikipedia article says, among other things: "... one proceeds from one
phase to the next in a purely sequential manner." And "those who use
such methods do not always formally distinguish between the pure
waterfall model and the various modified waterfall models." That doesn't
sound so different from what I wrote, to me at least.

What would you say is the basic principle of the waterfall model, and
how does this difference affect the above statement of mine?


{Quote hidden}

No, I'm not evading them. Agile is not really about software
engineering; it is about project management. Software engineering is
much more than project management, and many aspects of project
management are quite similar across other engineering disciplines.
People tend to mix up the concepts, because the architects are often
also the team leads, but the two domains are very different -- just as
different as they are in building or electronic engineering.


>>> Do you not read any books on software development?
>>
>> I do. Again, "development" and "managing" is not the same thing.
>> Software development includes managing and architecture and coding
>> and life cycle and maintenance and lots of things. We're talking
>> here about a management style.
>
> I disagree. Lean development methodologies cover all of the things
> you mentioned.

While the management style somewhat influences the architecture choices,
knowing Agile well doesn't help you one bit in deciding on an
architecture. You
need to know other things, and it is that knowledge that allows you to
make sound architecture choices. There is nothing in Agile (or
waterfall, for that matter) that helps you decide on language choices,
framework choices, architecture choices and so on. There is /nothing/
about architecture in the Agile Manifesto, for example.

> It's more about creating a paradigm of how a team organizes and works,
> than pure "management".

What's the difference? Making a team work is management, not coding, not
architecture.

> Refactoring, pair programming, iterative development have little to do
> with management.

Huh? Deciding when and how much to refactor is a typical project
management decision. Deciding whether or not to program in pairs is a
clear project management decision. Deciding what the project iterations
are is a typical project management decision. On every project where I
was project manager, these were among my project management
responsibilities. Even when there was an architect on the team and
architecting was not among my functions as project manager.


> For some reason, I can't find any good critical chain graphs. But
> basically, the reason it's higher density is simple: with a Gantt
> chart, time increments are fixed, so a 1-day task is visually
> (physically) 100 times shorter than a task that takes 100 days.
>
> In a critical chain diagram, you have circles (or bubbles) that are
> connected by arrows. Each arrow has the number of days required to
> complete the task. One or 100, doesn't matter (visually).

Not sure it's the same, but I've seen charts that fit your description.
I agree that in situations where you have huge differences in task
durations and the tasks themselves are more important than their
(approximate) durations, they can be more useful. But that's just a
detail. Either one documents task relationships. When I use Gantt
charts, the timeline and its visual representation are important to me.
One day or a hundred does matter. It doesn't matter that the 1-day task
is just a spec; what's important is whether it's holding up another task
-- and the dependency arrows from one task to the others are the same
size for all tasks, no matter the duration.
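
For what it's worth, the kind of chart you describe boils down to a
small dependency graph in which duration is just an attribute of a task
instead of the visual length of a bar. Here is a minimal sketch in C;
the task names, durations and dependencies are entirely made up for
illustration and not taken from any real project or tool:

#include <stdio.h>

#define NTASKS 5

struct task {
    const char *name;
    int duration_days;      /* 1 day or 100 days -- same size "bubble" */
    int deps[NTASKS];       /* indices of prerequisite tasks */
    int ndeps;
};

int main(void)
{
    /* Hypothetical project: spec -> (code, docs) -> test -> ship */
    struct task tasks[NTASKS] = {
        { "spec", 1,   {0},    0 },
        { "code", 100, {0},    1 },
        { "docs", 10,  {0},    1 },
        { "test", 20,  {1, 2}, 2 },
        { "ship", 1,   {3},    1 },
    };
    int finish[NTASKS];

    /* Tasks are listed in dependency order, so one pass computes the
       earliest finish of each task: start after the latest prerequisite. */
    for (int i = 0; i < NTASKS; i++) {
        int start = 0;
        for (int d = 0; d < tasks[i].ndeps; d++) {
            int dep = tasks[i].deps[d];
            if (finish[dep] > start)
                start = finish[dep];
        }
        finish[i] = start + tasks[i].duration_days;
        printf("%-4s starts day %3d, finishes day %3d\n",
               tasks[i].name, start, finish[i]);
    }
    printf("Project length (longest dependency chain): %d days\n",
           finish[NTASKS - 1]);
    return 0;
}

The 1-day spec and the 100-day coding task take up the same amount of
space here, yet the computed finish dates still reflect the real
durations and dependencies -- which is essentially the trade-off between
the two chart styles.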

> I can send you the book, if you promise you'll read it. :)

Thanks, but no. Priorities...


>>> I prefer the agile principles, to the "every project is different,
>>> you're on your own, start from scratch" kind of advice. ;-)
>>
>> "Start from scratch" and "you're on your own" are additions by you.
>> Nobody else claimed these. They don't have much to do with "every
>> project is different".
>
> They were implied. I asked you to share your experience, you said you
> can't because every project is different.

I don't know why you're repeating this. I'm sharing lots of experience
here (and so are others). You just insist that we're mostly wrong in how
we see things, each in his own way. We may be, but that's
different from not sharing -- you just choose to think our advice isn't
good advice.

> If you can't make any generalizations (even conditional ones) about
> what constitutes good development practices, it's as if you haven't
> learned anything from your previous experiences (obviously not the
> case, I know you have).

If you go back and take all of the responses you got in this and the
associated threads (by Olin, Rolf, Nate, me, others), it seems to me
there is lots of sharing and lots of helpful advice. It just doesn't fit
what you seem to want to hear, for some reason. But it's there, if you
want it. (It doesn't have a name, though :)


> What's important in life? :)

Being happy?

Gerhard

2009\02\18@043139 by Vitaliy

flavicon
face
Gerhard,

I think we're at the far end of diminishing returns; neither I nor you
nor anyone else is benefiting from the conversation, and I think we both
have more productive things to spend our time on (like being happy :).

If it's any consolation, I read all of your responses so far.

Vitaliy

2009\02\18@091711 by Gerhard Fiedler

picon face
Vitaliy wrote:

>>> See, this is what I have a problem with. I could care less whether a
>>> tool or technique contradicts the principles a few guys came up with.
>>> If I have good reason to think it works (and I wouldn't think that it
>>> does if I didn't think I had good reason :), I'm going to use it --
>>> and not waste a second of thought whether or not this contradicts
>>> somebody's principles.
>>
>> I totally agree with Gerhard on this.
>
> I already addressed this at least once. If an auto mechanic's
> apprentice decides that he's going to make his own rules, just
> because he doesn't care what a bunch of old geezers think is the
> proper way to replace a transmission, I think the outcome would be
> quite predictable.

First off, I think I'm a bit more than an apprentice.

Second, transmission replacement is a well-documented procedure with
clear (and documented!) rules. Managing people lacks such a document for
the people involved. A more appropriate analogy is that in the absence
of the manufacturer manual, you seem to have decided to rely on a single
3rd party repair book, whereas we say "before messing with it, get a few
more repair books that are out there about this transmission, and form
your own picture -- don't just follow a single source; they sometimes
miss things".

It's also easily possible that a bunch of old geezers transfer their
experience with automatic (hydraulic) transmissions to the new automatic
(double-clutch, non-hydraulic) transmissions and make serious mistakes
when doing so. And the apprentice actually read the repair manual for
that new transmission and is correct about whatever the dispute is.

So either way, I don't think the outcome is quite predictable. Not in
the analogy, and not in real life.


> However, when it comes to programming, somehow other people's
> experience suddenly becomes irrelevant.

No, they don't. This is a straw man that you put up from the beginning,
just so that you can classify all our experience as irrelevant. You
asked here about other people's experiences. You got lots of feedback.
And it is you who considers much of it irrelevant (seemingly because it
doesn't match some of the Agile creators' experiences).

See, experience is a very personal thing. I'm not saying that the guys
who wrote the Agile Manifesto didn't have experiences that make the
Manifesto seem sensible to them. But I have had experiences too, and
they show me that the statements in the Manifesto have limits and
specific contexts, outside of which they don't hold up well -- because
some of my experiences were outside of those limits and contexts. This
experience may be irrelevant to you, but it is relevant to me, and I
won't throw it out just because someone put up a Manifesto.


> Sounds like a double standard.

Not at all. I just don't throw my own experience overboard for
the experiences of others. I may /add/ others' experiences (and what
they learned from them) to mine, but mine still stands -- and when someone
comes and says that my experience is "wrong", then that's his problem,
not mine.

Gerhard

2009\02\18@092839 by Gerhard Fiedler

picon face
Vitaliy wrote:

> Olin Lathrop wrote:
>> All this correctness proving seems so pointless because it can only
>> prove the correctness of trivial code, and then only what you said
>> it's supposed to do, not what it's really supposed to do.
>
> I'm working on a project where trivial checks would be quite useful.

Perfect. Implement those checks, if you think it's time well-spent.

Nobody said it can't be useful; sometimes it is, sometimes it isn't. The
thing is deciding whether it is, and that is -- again -- one of the
things that depend a lot on the specific case.

Just put unit tests, application tests and test rigs into your toolbox;
get a feeling for how expensive those tools are to use, what kinds of
errors they help avoid, and how costly and frequent those errors are
under which circumstances; and you slowly gain the means to decide when
to use a given tool and when not.
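
Just to make the "trivial checks" idea concrete, here is a minimal,
made-up example in C. The function under test and its expected values
are hypothetical (a BCD-to-binary helper of the kind you find in
embedded code); the point is only to show how small such a check can be:

#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical function under test. */
static uint8_t bcd_to_bin(uint8_t bcd)
{
    return (uint8_t)((bcd >> 4) * 10u + (bcd & 0x0Fu));
}

int main(void)
{
    /* A handful of cheap checks, run on the host with every build. */
    assert(bcd_to_bin(0x00) == 0);
    assert(bcd_to_bin(0x09) == 9);
    assert(bcd_to_bin(0x10) == 10);
    assert(bcd_to_bin(0x59) == 59);
    assert(bcd_to_bin(0x99) == 99);
    puts("bcd_to_bin: all checks passed");
    return 0;
}

A check like this costs a few minutes to write and milliseconds to run,
so for code like this the trade-off is usually in its favor; for
timing-dependent or hardware-dependent code, the cost side of the same
equation can look very different.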

That's really what it boils down to. Every tool and every procedure has
a cost and a benefit. Both the cost and the benefit depend a lot on many
specific circumstances. With internal projects of your own company, you
are in control of many of those circumstances and can keep them
constant, or are easily aware of when they change, or can control their
change. This helps a lot. But this is not the only real-world situation.
Most program for someone else, in a multitude of scenarios and
dependencies. And the appropriate tools and procedures change
accordingly.

When talking about tools and procedures for managing programming
projects, IMO it's not about whether something is good or bad. It is
about learning what the trade-offs are. Of all the reasonable tools and
procedures, nothing is (or maybe few things are) always bad or always
good; everything has trade-offs, and their value depends on the
situation. Rather than focusing on whether something is "good" or "bad",
it's much more useful to learn about the involved trade-offs, so that
you can start to decide whether something is "more good" or "more bad"
in a given situation.

Gerhard

2009\02\24@040620 by Vitaliy

flavicon
face
Gerhard Fiedler wrote:
>> I already addressed this at least once. If an auto mechanic's
>> apprentice decides that he's going to make his own rules, just
>> because he doesn't care what a bunch of old geezers think is the
>> proper way to replace a transmission, I think the outcome would be
>> quite predictable.
>
> First off, I think I'm a bit more than an apprentice.

I think you need to pay more attention. :) I said "if you (you, Gerhard) had
an apprentice.."


>> However, when it comes to programming, somehow other people's
>> experience suddenly becomes irrelevant.
>
> No, they don't. This is a straw man that you put up from the beginning,
> just so that you can classify all our experience as irrelevant. You
> asked here about other people's experiences. You got lots of feedback.
> And it is you who considers much of it irrelevant (seemingly because it
> doesn't match some of the Agile creators' experiences).

Gerhard, stop! When did I say I consider your experience irrelevant?

What I said was, if every project is really unique (bearing *no*
similarities to other projects), then it follows that your experience is
irrelevant. Your experience is obviously relevant, so any given project is
not really unique, and is similar in some way to other projects. This seems
self-evident to me, yet this concept encounters strong resistance from you
and Olin.


> See, experience is a very personal thing. I'm not saying that the guys
> who wrote the Agile Manifesto didn't have experiences that make the
> Manifesto seem sensible to them. But I have had experiences too, and
> they show me that the statements in the Manifesto have limits and
> specific contexts, outside of which they don't hold up well -- because
> some of my experiences were outside of those limits and contexts. This
> experience may be irrelevant to you, but it is relevant to me, and I
> won't throw it out just because someone put up a Manifesto.

I never asked you to throw out your experience. I started this thread simply
to try to set some things straight. You are entitled to express your
opinion, and I am entitled to challenge your opinion if I think it is
wrong.

You (and Rolf, and Olin, and a few others) at various points made fun of
Agile concepts, compared them to the Jedi religion, etc. This does not offend
me in any way, at least not any more than if you made similar statements
about Windows or cordless drills. To me, Agile is simply a tool that I use
to get things done.

A fool can shoot himself in the face with a nail gun. Is it the nail gun's
fault?

Vitaliy

2009\02\24@043331 by Vitaliy

flavicon
face
Gerhard Fiedler wrote:
> That's really what it boils down to. Every tool and every procedure has
> a cost and a benefit. Both the cost and the benefit depend a lot on many
> specific circumstances. With internal projects of your own company, you
> are in control of many of those circumstances and can keep them
> constant, or are easily aware of when they change, or can control their
> change. This helps a lot. But this is not the only real-world situation.
> Most people program for someone else, in a multitude of scenarios and
> dependencies. And the appropriate tools and procedures change
> accordingly.

Humor me, do a little thought experiment. :)

Say you have a procedure A; in some cases the benefit of using this
procedure outweighs the costs. You also have procedures B, C, and D.

I think it's fair to say that different procedures will have different costs
of implementation and success rates. We define "success rate" as "benefit
outweighed the cost in X% of projects".

So imagine that procedures have the following overall success rates:

A: 95%
B: 50%
C: 40%
D: 1%

Would we be able to draw some conclusions based on these statistics?

To me, the iterative approach has very few drawbacks, and lots of benefits.
Same goes for removing communication barriers (there are plenty of options),
using a versioning system, and frequent reviews of goals and priorities.

You can also expand this to statements like "Under conditions X, success
rate of B increases to 100%".

In reality, you don't even need the numbers. For you, it is intuitive
(according to Alden, you are a Level 3). :)


> When talking about tools and procedures for managing programming
> projects, IMO it's not about whether something is good or bad. It is
> about learning what the trade-offs are. Of all the reasonable tools and
> procedures, nothing is (or maybe few things are) always bad or always
> good; everything has trade-offs, and their value depends on the
> situation. Rather than focusing on whether something is "good" or "bad",
> it's much more useful to learn about the involved trade-offs, so that
> you can start to decide whether something is "more good" or "more bad"
> in a given situation.

Sure! Gerhard, this is exactly what I was asking about. You said you shared
your experience, but so far it seems that it all boiled down to "every
project is different, there are no rules". When talking about various things
you've used to help you complete a project, you invariably concluded with
"this technique worked under some circumstances, but will not work under
others", without specifying what the circuimstances were (well, except in
your example of the billing system), or the circumstances under which it
would not work. Such statements are hardly useful.

Olin shared details of a recent project, which provided a glimpse into how
he approaches projects -- now, that was useful. It would be nice if you
could talk about the project that required one year's worth of upfront
documentation, and why you feel it was justified.

Vitaliy

2009\02\24@110607 by Gerhard Fiedler

picon face
Vitaliy wrote:

> What I said was, if every project is really unique (bearing *no*
> similarities to other projects), then it follows that your experience
> is irrelevant.

No, it doesn't.

> Your experience is obviously relevant, so any given project is not
> really unique, and is similar in some way to other projects. This
> seems self-evident to me, yet this concept encounters strong
> resistance from you and Olin.

The overwhelming experience seems to be that no single rule (or set of
rules) captures the possible spread of projects. This /is/ valuable
experience, and it is possible to use it. Another is that every
(reasonable) rule has a range of validity, and it's as important to know
this range as it is to know the rule.


> You (and Rolf, and Olin, and a few others) at various points made fun
> of Agile concepts, compared it to the Jedi religion, etc.

FWIW, I didn't.

> A fool can shoot himself in the face with a nail gun. Is it the nail
> gun's fault?

Nope. But anybody trying to describe the nail gun as a suitable device
for all types of reconstructive facial surgery may hear some things :)

Gerhard

2009\02\24@120414 by Gerhard Fiedler

picon face
Vitaliy wrote:

> Say you have a procedure A; in some cases the benefit of using this
> procedure outweighs the costs. You also have procedures B, C, and D.
>
> I think it's fair to say that different procedures will have
> different costs of implementation and success rates. We define
> "success rate" as "benefit outweighed the cost in X% of projects".
>
> So imagine that procedures have the following overall success rates:
>
> A: 95%
> B: 50%
> C: 40%
> D: 1%
>
> Would we be able to draw some conclusions based on these statistics?

Not much that's really useful. Averages are useful for some things, but not for
individual decisions where better information is available.

If you know more than the averages, you can maybe bring up the success
rate of D to 100%, because it is an excellent procedure where
appropriate but disastrous where not, and you know when to use it and
when not.

> To me, the iterative approach has very few drawbacks, and lots of
> benefits. Same goes for removing communication barriers (there are
> plenty of options), using a versioning system, and frequent reviews
> of goals and priorities.

Exactly. We always have been in agreement about this.

> You can also expand this to statements like "Under conditions X,
> success rate of B increases to 100%".

Again, exactly. Same for A, C, D. And under the "just right" conditions,
even A may become a disaster.

If you can't afford to be a victim of the averages, you'd better know how to
skew them -- by knowing when and how A, B, C and D work and when and
where not, and use them accordingly. Then all of a sudden all four are
up in the nineties.
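
To put made-up numbers on that (purely illustrative, not measured from
anything): say D fits only 1% of projects, almost always works there,
and fails everywhere else. A blind average over all projects then reads
about 1%, while a team that only reaches for D where it fits sees about
99%. Spelled out in a few lines of C:

#include <stdio.h>

int main(void)
{
    /* Hypothetical numbers, chosen only to illustrate the argument. */
    double total_projects   = 1000.0;
    double fitting_projects = 10.0;  /* 1% of projects suit procedure D */
    double d_success_fit    = 0.99;  /* D almost always works there     */
    double d_success_rest   = 0.0;   /* ... and fails everywhere else   */

    /* Success rate averaged blindly over all projects. */
    double blind = (fitting_projects * d_success_fit +
                    (total_projects - fitting_projects) * d_success_rest)
                   / total_projects;

    /* Success rate for a team that uses D only where it fits. */
    double informed = d_success_fit;

    printf("blind average:   %4.1f%%\n", blind * 100.0);
    printf("informed choice: %4.1f%%\n", informed * 100.0);
    return 0;
}

Same procedure, same projects -- only the knowledge of when to apply it
changes, and the "success rate" moves from the bottom of your table to
the top.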

> In reality, you don't even need the numbers. For you, it is intuitive
> (according to Alden, you are a Level 3). :)

The numbers don't say much (at least not the averages). They pretend to,
but really don't.


> When talking about various things you've used to help you complete a
> project, you invariably concluded with "this technique worked under
> some circumstances, but will not work under others", without
> specifying what the circuimstances were (well, except in your example
> of the billing system), or the circumstances under which it would not
> work.

I have in various places stressed the importance of good and adequate
documentation, for example. I think it is a powerful tool that many
people don't know how to use well. Encourage people to put stuff that's
valid and helpful for more than a short time up on the wiki, instead of
just telling their colleagues face-to-face.

I have also stressed the importance of observing the participating
individuals and how they "function", and of making this work in the
context of the project. This is also something that is often overlooked;
a project
management plan without considering the project members doesn't work
well. (This, in this form, of course doesn't apply that much to your
situation, as you seem to be doing it the other way round: you can
select the people that fit your style. This also works, when you can do
it -- which I mostly couldn't.)

Hierarchical structures also help, but it's not easy to tell at what
point they start to help. Sometimes a team of 4 or 5 is the maximum that
works, and when you need more people, you split up into several such
smaller teams. Sometimes a single team of ten may work better... it
depends on the way the project can be split up, on the people, and on
other aspects of the situation. Or take how to create those teams... you
may have 5 locals and 5 who are close to each other but far away from
you. It may make sense to have the 5 locals work as a team, and the 5
remotes too -- more efficient in terms of team work, but it may create a
split in the project, a lack of overall communication, two sub-projects
with different styles; it may introduce an "us vs. them" mentality, etc.
So even though the individual team situations are not as good, the
overall project situation may be better if you mix them and have a few
locals and a few remotes on each team.

The thing is that I've worked in such different situations that every
situation has its own very specific particularities, and it is difficult to
come up with catchy phrases or simple solutions that are generically
valid.

Gerhard

2009\02\25@030806 by Vitaliy

flavicon
face
Gerhard Fiedler wrote:
>> To me, the iterative approach has very few drawbacks, and lots of
>> benefits. Same goes for removing communication barriers (there are
>> plenty of options), using a versioning system, and frequent reviews
>> of goals and priorities.
>
> Exactly. We always have been in agreement about this.

I don't know; you kept saying that the waterfall approach has its place... :)

So, at what point are we going to decide that the subject is exhausted, and
that we'll just have to agree to disagree? :-)

Vitaliy
