Each of the items I, II, III, IV and V is worth 10 points. The total is out of 7 points (total / 50 * 7).

MedicAid:

I. Design diagram and design quality: 10
See comments in the next item.

II. Design reasoning and discussions: 7
The description in section 3.2.1 suggests operations on the control class that should have been the entity's job in the first place. In the modified design, it is not clear what role the control now plays versus the entity. Section 3.2.2 suggests that the control class actually functions as an entity; it is not clear why that would be the case. It appears that wrong design decisions are being made just so the design can be refined later. That is not iterative development.

III. Development approaches: 8
Not enough coverage of this, apart from a high-level mention.

IV. Details on problems faced: 5
Not enough coverage of this.

V. Report overall: 10
Minor point: it is strange that the table of contents was manually typed and not auto-generated!

Total: 40 (5.6)

Student Testing System:

I. Design diagram and design quality: 8
It is not clear where a student's response is stored or how it is modeled.

II. Design reasoning and discussions: 10
This section has gotten reasonable coverage, I think.

III. Development approaches: 8
Lacks more detail on task allocation and team organization.

IV. Details on problems faced: 10

V. Report overall: 10
Excellent report! Very well written. Good job. And you were at least 30 hours ahead of schedule in delivering your report. Impressed. You do seem a little low on the number of test cases, though; I would have expected to see a lot more. Maybe I am missing something.

Total: 46 (6.44)

SyndiSnag:

I. Design diagram and design quality: 10

II. Design reasoning and discussions: 7
You say "We came up with the user stories in the following components: 1. GUI controls 2. Business logic". What does that mean? Do you mean that user stories were divided into tasks that were implemented in these component tiers?
The usability concerns are valid, and the efforts to address them look good.

The report is written with "I" instead of "we". Who is "I": the person making the decision for that particular problem, or the person writing the report?

The discussion of the context-sensitive menu identifies the problem; that is good. However, it looks like no effort is being made to solve it. It reads more like "so what?". Are there ways to solve the problem?

III. Development approaches: 7
I see the argument that test-first coding was not effective for UI development. However, was it used for the other tiers at all? The report appears to be the narration of the one team member who wrote it (and who was likely involved with the UI tier), not that of the team. A later section talks about NUnit, but an earlier section contradicts this. Not enough information about how it was used effectively.

IV. Details on problems faced: 5
Apart from design decisions, no mention of these can be found.

V. Report overall: 10
Remember to list the names of team members in your report. Your report was submitted barely in time, and your zip file submission was 8 minutes late (and was not accepted; the first submission is what is used for grading). The follow-up report was not adequate and was not copied to the rest of the team in the email. Team members complained that they did not receive the follow-up report and had to ask several times to get a copy.

Total: 39 (5.46)

Bug-base:

I. Design diagram and design quality: 10
The class diagram in section 8.2.2 leads me to believe that those classes were identified in iteration 0; that seems like an awful lot of classes to identify in one iteration. Maybe I am not reading this right. What is operator_1?

II. Design reasoning and discussions: 7
Reasoning as to why certain decisions were made, say for selecting tools for testing, etc., is hard to find. (OK, I finally found it in a later section.) The reasoning for going with a desktop-based approach does not state any cons.
It increases the installation cost, which a web-based system might have eliminated! Redundant information describing XP is present. The report lacks the "why"s of the design.

III. Development approaches: 7
The report contains more description of what the practices and tools are than of how they were effectively used for your specific project. The details of each iteration are fairly well documented.

IV. Details on problems faced: 10
I expected to see details of some of the problems mentioned above as missing.

V. Report overall: 10

Total: 44 (6.16)

Wheels.com:

I. Design diagram and design quality: 10
Good steps taken towards the design, and an adequate description of it as well.

II. Design reasoning and discussions: 7
I like the honest assessment of the lack of test cases; however, the reasoning is not very convincing. You should be developing your application in layers or tiers. Why can't test cases be developed before coding the middle and lower tiers? Then the UI tier can be written independently and easily as well.

The section on code review does not convey what you are really doing. Are you saying that the person who writes the code is also the one who reviews it?

Red flag on integration. The code must be in source code control from the first hour of development. Not a reasonable explanation.

Good realization that validation has to be moved out of the UI.

The class diagram shows the Handlers depending on the GUI classes. Has this been implemented? I am curious how this works. If you are using the Handlers to dictate which page gets displayed next, I may go along with it. Otherwise, I am not sure where this is going.

III. Development approaches: 8
You need to get up to speed on applying some good practices and tools.

IV. Details on problems faced: 7
The issues in the design have been addressed fairly effectively. However, other problems, and especially the efforts to address them, were not effectively handled and/or stated.

V. Report overall: 10
You need to turn in one cohesive document, not a hodgepodge of pdf and doc files.

Total: 42 (5.88)
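A note for all teams on the test-first point raised above (for SyndiSnag and Wheels.com): tests for a middle-tier class can be written before the code, with no UI involved at all. The sketch below is purely illustrative; it uses Python's unittest in place of NUnit, and the class ReservationService and its API are hypothetical names, not taken from any team's report.

```python
import unittest

# Hypothetical middle-tier class; the name and API are illustrative only.
class ReservationService:
    def __init__(self):
        self._reservations = {}  # vehicle_id -> customer

    def reserve(self, vehicle_id, customer):
        # The business rule lives in the middle tier, not in the UI.
        if vehicle_id in self._reservations:
            raise ValueError("vehicle already reserved")
        self._reservations[vehicle_id] = customer
        return True

# These tests are written first, against the middle tier alone.
class ReservationServiceTest(unittest.TestCase):
    def test_reserve_available_vehicle(self):
        svc = ReservationService()
        self.assertTrue(svc.reserve("V42", "alice"))

    def test_double_booking_rejected(self):
        svc = ReservationService()
        svc.reserve("V42", "alice")
        with self.assertRaises(ValueError):
            svc.reserve("V42", "bob")
```

Run with `python -m unittest`. In NUnit the same tests would be `[Test]` methods on a test fixture; the point is the same in either framework: the business tier needs no UI in order to be testable, so "test-first did not work for the UI" is not a reason to skip it elsewhere.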