Johan,
Thanks for the excellent feedback ... You have touched on some issues that I am aware of and appreciate your suggestions ... you have also touched on some issues that I hadn't thought about ...

I want to let your feedback marinate a bit ... you have provided enough detail that I think I understand where the improvements are needed, but I'm not sure I know the right way to address them ... yet.

Please continue to let us know about your learning adventure:)

Dale

On 04/22/2011 02:36 PM, Johan Fabry wrote:
> Hi all,
>
> I'd like to give a bit of feedback on the Metacello browser. First some context: I am aware of the core concepts of configuration management, but for lack of time I never got around to learning Metacello. So I thought this browser would be ideal for me, I gave it a go, and I'd like to report on my experience.
>
> The first try did not work at all for me, because I started with a configuration that is too complex. So let's start with the second try, something simpler, and first get that sorted out before we do the complex one. Here we go: I have a SqueakSource project TraitsApplication with one package TraitsApplication, which contains TraitsApplication and TraitsApplication-Tests categories (and the former contains a TraitsApplication class :-P ). Following the prompts in the browser I was able to make a configuration and save it to SqueakSource, without any real problems. So the cool thing is, I made my first Metacello config today and it even works :-)
>
> One issue is how to load my creation. I am used to a big do-it that gets the config via Gofer and contains '(Smalltalk at: #ConfigurationOfFoo) perform: #loadDefault'. Many configs do it like this; as a user I considered it a standard convention. But this turns out not to work, because no loadDefault has been defined. By chance I remembered how I had to load MetacelloBrowser, '(ConfigurationOfMetacelloBrowser project version: #stable) load', and adapting it to TraitsApplication (not forgetting version: #development) made it work. So, the question: why is there no #loadDefault generated, nor an obvious way to specify what the default should be? (At least for simple cases of the configuration, like this one.)
>
> The second issue is the UI. It is confusing at times; there are multiple issues, and I list them in random order:
> - Why can I not click on '+Configuration' when there is no configuration selected in the left hand side?
> - What do the unit-test icons at the left of the configs and baselines-versions-... mean? I turn them green by double-clicking on them, which also gets me an inspector on an empty collection. Huh?
> - The context menu is too big and confusing. It should be subdivided and ordered by group. Submenus are in order, I think.
> - What are groups? The help menu does not enlighten me.
>
> Next up: making a config for AspectMaps (the first thing I tried and that did not work). But this is for another day ...
>
> --
> Johan Fabry
> [hidden email] - http://dcc.uchile.cl/~jfabry
> PLEIAD Lab - Computer Science Department (DCC) - University of Chile
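For readers following along, the two load styles Johan contrasts might look like this; this is a minimal sketch using the names from his example, and the class-side #loadDefault shown at the end is a hand-written convention that some config authors add themselves, not something Metacello generates:

```smalltalk
"Style 1: the 'big do-it' convention -- fetch the config with Gofer,
then send a convenience message. This only works if the config author
defined a class-side #loadDefault themselves."
Gofer new
	squeaksource: 'TraitsApplication';
	package: 'ConfigurationOfTraitsApplication';
	load.
(Smalltalk at: #ConfigurationOfTraitsApplication) perform: #loadDefault.

"Style 2: the plain Metacello API, which always works once the
configuration class is present -- name a symbolic version explicitly."
((Smalltalk at: #ConfigurationOfTraitsApplication)
	project version: #development) load.

"A hand-written class-side method that would supply the missing
convention for style 1:

ConfigurationOfTraitsApplication class >> loadDefault
	^ (self project version: #development) load"
```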
No problem ... the icons are there to indicate the validation status ... When you select the 'validate' menu item, each of the versions is validated, and the icon is green if there are no issues or red/orange if there are issues. Clicking on the icon will return the list of validation issues ...

I put this feature in early on, because I was validating a bunch of configurations and I needed to be able to tell which versions had issues ...

In recent weeks I have wondered if I should drop them altogether, and then I run into a validation problem and am glad that they are there:)

Right now I think that if they are to stay, they should be automatically updated (in which case we could drop a couple of menu items off of the list:)

Dale

On 04/22/2011 03:23 PM, Johan Fabry wrote:
> Hi Dale,
>
> my pleasure :-)
>
> In the mean time could you satisfy my curiosity, and tell me what the unit-test icons at the left of the configs and baselines-versions-... mean? (Idem for groups?)
>
> Thanks in advance :-)
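For the curious, the same validation the icons reflect can also be run by hand; if I remember the API right it goes through the Metacello tool box (selector name from memory -- treat it as an assumption, and the configuration name is from Johan's example):

```smalltalk
"Validate a configuration programmatically. An empty result collection
corresponds to a green icon; otherwise inspect the issues.
(API from memory: MetacelloToolBox class >> validateConfiguration:)"
| issues |
issues := MetacelloToolBox validateConfiguration:
	(Smalltalk at: #ConfigurationOfTraitsApplication).
issues isEmpty
	ifTrue: [ Transcript show: 'no validation issues'; cr ]
	ifFalse: [ issues inspect ]
```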
Dale,

I was envisioning a 'run tests' command to run all the tests installed by a configuration (we can then decide whether it is recursive or not). I think this would consolidate the development process, since running the tests before committing is preferable. In that case, the validation LED may be misleading, no?

I have the feeling that validation is not something an end user does much. I remember you introduced it when you upgraded a bunch of configurations with symbols. This is not something many do today.

Cheers,
Alexandre

--
_,.;:~^~:;._,.;:~^~:;._,.;:~^~:;._,.;:~^~:;._,.;:
Alexandre Bergel  http://www.bergel.eu
^~:;._,.;:~^~:;._,.;:~^~:;._,.;:~^~:;._,.;:~^~:;.
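A 'run tests' command along these lines could be sketched roughly as follows. This is only a sketch, not the browser's actual implementation: it assumes the tests live in categories matching the project name (as in Johan's TraitsApplication-Tests example) rather than walking the configuration's package specs:

```smalltalk
"Load the configuration, then run every concrete TestCase subclass
found in the matching *-Tests categories and report to the Transcript."
| suite result |
((Smalltalk at: #ConfigurationOfTraitsApplication)
	project version: #development) load.
suite := TestSuite named: 'TraitsApplication tests'.
(SystemOrganization categoriesMatching: 'TraitsApplication-Tests*')
	do: [ :cat |
		(SystemOrganization classesInCategory: cat)
			do: [ :cls |
				((cls inheritsFrom: TestCase) and: [ cls isAbstract not ])
					ifTrue: [ suite addTests: cls suite tests ] ] ].
result := suite run.
Transcript show: result printString; cr
```

A fuller version would presumably enumerate the packages actually brought in by the version spec instead of matching on category names, which is where the recursive-or-not decision Alexandre mentions would come in.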
Alexandre,
I think the 'run tests' command is a great idea ...

Validation will be important as long as folks edit configs by hand, so validation needs to be part of the tool set (whether automatic or manual) ... Just because people don't do it doesn't mean that they shouldn't be doing it ... it is very easy to introduce an error that won't show up until you try to use the configuration, so validation is important ...

I would prefer to see it automatic, which then raises the question of how to notify the user that there is a validation issue and which configuration/version has the issue ... presumably we have a set of feedback options that we can use ...

Dale
I agree with you. The browser is not yet a replacement for manual editing. The validation mechanism has to be present.
Maybe the LED could be traded for a red bold font on the project name. The meaning of the red font is clear and unambiguous. The LED can then instead be used for the tests. How does that sound?

I will work on the tests soon. This weekend, maybe.

Alexandre
For now, I'd say go ahead and use the icon for the test ... we'll worry about how to indicate validation errors when and if we automate the validation...
BTW, don't forget to do your development in a branch ... branches for real work are welcome in the repository:) We'll see how it works.

Dale
Hi Dale,
> BTW, don't forget to do your development in a branch ... branches for real work are welcome in the repository:) We'll see how it works.

Just to double check. There are three additions that I would like to make:
- the possibility to add a new repository when loading a configuration
- adding a test command
- fixing issue #126

Shall I create a branch for each of these contributions? Each branch would have a #development version that contains the different versions implementing the contribution. Since these contributions are independent from what is in 1.60, I can create the branches from 1.59.2. So, 3 branches? We then merge them later into the master branch.

Cheers,
Alexandre
On Apr 23, 2011, at 3:19 PM, Alexandre Bergel wrote:
Yes, let's try it with three branches ... we're experimenting here, so let's give different things a try, see how they feel and what trouble we run into, and discover what functionality we need to make it work smoothly ...

Dale
> Yes, let's try it with three branches ... we're experimenting here so let's give different things a try and see how they feel and what trouble we run into and discover what functionality we need to make it work smoothly ...
+1

I will let you know how it goes ...

Alexandre