Steven,
Maybe we all should take some cold meds - you seem very coherent to me; get well soon. The only thing is that after settling on the shorter time to completion, you still spend a fraction of your programming effort/time getting the computer to test your work. It is that fraction that I am trying to quantify.

Bill

________________________________________
From: [hidden email] On Behalf Of Steven Baker
Sent: Sunday, February 27, 2011 3:04 PM
To: [hidden email]
Subject: Re: [Pharo-project] Good reference on time on unit testing?

I've always felt that "test driving" code actually results in negative time spent on "testing". I spend a lot less time writing code test-first than I ever would writing the same code without the tests first (and so does everyone I know who tests well). I also spend far less time (almost none) manually testing functionality. So TDD results in a net negative time difference.

(Apologies if I'm incoherent, the cold and flu drugs are strong in this one.)

-Steven

On Sun, Feb 27, 2011 at 11:51 AM, Norbert Hartl <[hidden email]> wrote:
>
> On 27.02.2011, at 13:58, Schwab,Wilhelm K wrote:
>
>> Norbert,
>>
>> Excellent points - I take exception with only one: you assume that all developers test - that is sadly not true. I am involved with a group who seem to think that a handful of tests added at the last minute will somehow magically fix their problems.
>>
> It seems I wasn't clear on this one. I'm trying to make the point that the distinction is not between "testing" and "no testing" but between "testing" and "writing tests". When you compile code you test it for syntax and language quirks. When a developer runs the program he is developing, he is already testing. From that point of view, running a program and testing are the same thing: he supplies input and expects certain output. That is testing. The point is just that developers do it manually, and that is not reproducible. So there is no real difference between manual testing all the time and a written test case - except that repeating the testing procedure is boring and is therefore best left to a machine. If developers could see that they just need to put the work they are already doing into a test, it would ease their work without changing much. They would just be changing style and saving time. That's it. Talking about 100% test coverage and holistic views of what perfect testing could be doesn't help.
>
>> I have no problem arguing that testing (if done well) can/will reduce overall development time; the question is how much of that time one should expect to devote to writing and maintaining unit and acceptance tests?
>>
> I understand your intention, but the view is inappropriate. Coding and testing aren't two separate things; they belong together (if you have changed your coding style). That is why it is hard to estimate a time that should be spent. To me it is more comparable to this: we all know collections are great. Imagine someone asking you "Can anyone recommend a good reference on the amount of time one should expect to spend writing code that uses collections?" What would you say? There is no answer. That doesn't mean you can't solve the problem, but you won't solve it by saying "It is 38.345% of the time". If people don't get it, you have to convince and/or teach them. Several times I have been able to show a developer that he gains time from doing it. That works. Everything else is targeted at Excel manipulators.
>
> Norbert
>
>> ________________________________________
>> From: [hidden email] On Behalf Of Norbert Hartl
>> Sent: Sunday, February 27, 2011 5:41 AM
>> To: [hidden email]
>> Subject: Re: [Pharo-project] Good reference on time on unit testing?
>>
>> Hi,
>>
>> On 27.02.2011, at 04:52, Schwab,Wilhelm K wrote:
>>
>>> Hello all,
>>>
>>> Can anyone recommend a good reference on the amount of time one should expect to spend writing tests? I will have to be the messenger (will be wearing running shoes just in case...), but I want the message to come from a solid source.
>>
>> I find that really hard to answer. To me the problem is the question itself. I have heard "...the amount of time one should expect to spend writing tests?" so often in companies, and I think it was always exactly this phrase. While the question is valid, it gives the impression that there is something that deducts time from your "normal" development work. The people asking it are often managers who have read something about code quality and want to apply _this_ to their teams.
>> Defining it in terms of time is troublesome, too. Testing is not easy. Everything you read about testing gives you the impression that everyone knows how to test and that those developers are just too lazy. That is not true. Most developers I have met had trouble seeing what testing is all about. They don't see interfaces as provable promises, etc. So if you tell any of these developers to spend one hour a day on testing, you will get tests that are counter-productive. My favorite example is the one where you have some composite object with an add method: the test adds something via the interface and then reaches into the internal array to check that it is of size 1. To me it is the same as with documentation: I prefer less documentation over useless documentation.
>>
>> So every developer is testing in some way. You either test and debug along the way in an unstructured form, or you write tests. To me, writing tests is not an add-on; it is a change in working style. From this point of view I would state that the time I need to spend _additionally_ on testing is negative. I grew up with the print statement as the ultimate debugging tool. A print/debugging statement is added at the time of debugging and probably removed once the error seems to be fixed. That can lead to a situation where you do this multiple times. Write the same thing as a test (which also forces you to produce more fine-grained code) and you save time and gain a little assurance against regression. Regression debugging (debugging the same thing again later) costs much more time, because you have to fix the error _and_ you have to focus on the problem all over again (which takes most of the time). So the point of writing tests is not to spend time but to save time.
>>
>> The amount of time one should expect to spend writing tests is less than the time you would otherwise need to spend on testing. :)
>>
>> Norbert
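The counter-productive test Norbert describes is easy to picture in SUnit. Here is a minimal sketch - the class ThingBox and its #add:/#size/#includes: protocol are invented purely for illustration, and ThingBoxTest is assumed to be a TestCase subclass - contrasting the internals-peeking test with one that stays at the public interface:

    "Counter-productive: exercises #add: but then asserts against the internal
     storage, so any change to the representation breaks the test."
    ThingBoxTest >> testAddStoresOneElementInternally
        | box |
        box := ThingBox new.
        box add: 42.
        self assert: (box instVarNamed: 'items') size equals: 1

    "Better: asserts only what the public interface promises."
    ThingBoxTest >> testAddMakesElementVisible
        | box |
        box := ThingBox new.
        box add: 42.
        self assert: box size equals: 1.
        self assert: (box includes: 42)

Both tests take about the same effort to write; only the second survives a change of internal representation, which is the "interfaces as provable promises" point.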
In reply to this post by Peter Hugosson-Miller
Let's keep the physical descriptions out of it, OK? :) But you've got the point: I am looking for published numbers to back up what we have all learned from experience.
And it has been surprisingly hard to find.
________________________________________
From: [hidden email] On Behalf Of Peter Hugosson-Miller
Sent: Sunday, February 27, 2011 3:11 PM
To: [hidden email]
Subject: Re: [Pharo-project] Good reference on time on unit testing?

Steven, you're making perfect sense to me, and I think almost everyone here agrees with you.

Bill's problem (if I've understood him correctly) is that he needs to convince some pointy-haired-boss-type person, by directing him or her to a well-respected "official" statistic that "proves" what we all know to be true from experience. Sadly, I think it will be hard to find :-(

--
Cheers,
Peter

On Sun, Feb 27, 2011 at 9:04 PM, Steven Baker <[hidden email]> wrote:
> I've always felt that "test driving" code actually results in negative time spent on "testing". I spend a lot less time writing code test-first than I ever would writing the same code without the tests first, and I spend far less time (almost none) manually testing functionality. So TDD results in a net negative time difference.
>
> -Steven
In reply to this post by Steven R. Baker
Great, an idealist :)
In this case, the problem is flipped: they want tests, but expect that a hurried effort will fix all of their problems.
________________________________________
From: [hidden email] On Behalf Of Steven Baker
Sent: Sunday, February 27, 2011 3:31 PM
To: [hidden email]
Subject: Re: [Pharo-project] Good reference on time on unit testing?

Oh! I never worry about "proving" it.

When I've worked for clients who insisted that I leave the testing out because they didn't want to "pay extra" for the time spent, I simply didn't push the tests to their repo. I test for myself, because I have a personal commitment to do the best job possible.

-Steven

On Sun, Feb 27, 2011 at 12:11 PM, Peter Hugosson-Miller <[hidden email]> wrote:
> Steven, you're making perfect sense to me, and I think almost everyone here agrees with you.
>
> Bill's problem (if I've understood him correctly) is that he needs to convince some pointy-haired-boss-type person, by directing him or her to a well-respected "official" statistic that "proves" what we all know to be true from experience. Sadly, I think it will be hard to find :-(
>
> --
> Cheers,
> Peter
In reply to this post by Schwab,Wilhelm K
On 27.02.2011, at 23:19, Schwab,Wilhelm K wrote:

> Norbert,
>
> Have you ever read of shops where people throw code at features and never much bother to test what they are doing, because it is "testing's" job to catch the bugs? I had read about it too...
>
Yes, I have. Nearly all of the teams/developers I have had to deal with were exactly like this, and I myself, 12 years ago, wasn't much better (before I encountered XP and started to think about it). All of what I have written is the outcome of a long history of pain - let's call it experience. I know how hard it is to convince people that testing is useful. As long as testing is seen as additional effort, you won't convince developers to do the extra work and you won't convince managers to pay the extra money. That makes two problems. It seems that proposing to spend more work on something in order to do less work overall is considered black magic, and the natural reaction to that is disbelief.

> I understand what you are saying.
>
I know you do. I would be really glad to be helpful, but I don't have a recipe for getting it going by arguing. I have a good and an evil way of treating it: the good one is to be a role model in the daily work; the evil one is to write tests for others that are red :)

Norbert
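The "evil way" Norbert mentions - handing a colleague a failing test instead of a prose bug report - might look like the sketch below in SUnit. The Invoice class, its #addLineItemCosting: and #total protocol, and the VAT behaviour are invented here purely for illustration:

    "A deliberately red test that documents a defect: it encodes the expected
     behaviour, fails against the current code, and stays green once fixed."
    InvoiceTest >> testTotalIncludesVat
        | invoice |
        invoice := Invoice new.
        invoice addLineItemCosting: 100.
        "Red today: #total currently answers 100, ignoring the 19% VAT."
        self assert: invoice total equals: 119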