Thursday, October 30, 2008

Code review: Learning from others

Introduction
DueDates version 1.1 gets reviewed by the other development teams in class. Team Purple was assigned to review Team Blue's DueDates and assess their version 1.1 code and documentation. The class was also asked to evaluate Google Project Hosting's code review tool, because of issues that surfaced during the previous code review.

Team Blue's code
Arthur and Erin did some nice things with their code, like the user interface that I mentioned in the review of version 1.1. Another thing I liked was their Comparator, which handles the data structure used to store each book's due date and information, and also takes care of sorting the values. One modification I thought was needed is to handle the due date as a Date type rather than a String. Strings are sorted lexicographically by each character's Unicode value, which can cause a problem: when sorting 11/11/2008 and 4/4/2008, the sort will place 11/11/2008 ahead of 4/4/2008 because '1' comes before '4', even though April 4 is the earlier date. A sketch of the kind of change I mean follows their code below.


  /**
   * A Comparator that allows BorrowedItem objects to be compared according to their
   * due dates. The comparator is defined as an anonymous nested class.
   */
  public static final Comparator<BorrowedItem> dueDateComparator = new Comparator<BorrowedItem>() {
    public int compare(BorrowedItem item1, BorrowedItem item2) {
      int comp = (item1.yyyymmddDueDate).compareTo(item2.yyyymmddDueDate);
      if (comp != 0) {
        return comp;
      }
      // Compare the rest of the fields so that two BorrowedItem objects compare to zero
      // if and only if all their fields are equal.
      String item1Info = item1.libraryName.concat(item1.formattedResults);
      String item2Info = item2.libraryName.concat(item2.formattedResults);
      comp = item1Info.compareTo(item2Info);
      return comp;
    }
  };
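
To show what I was suggesting, here is a minimal sketch (mine, not Team Blue's code) of comparing due dates as Date objects instead of Strings. It assumes the due date comes in as something like "4/4/2008", and the class and method names are made up for illustration:

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

/**
 * Sketch of the suggested change: parse the due date string into a Date
 * before comparing, so the ordering is chronological rather than lexicographic.
 * The date pattern and names here are assumptions, not Team Blue's actual code.
 */
public class DueDateParsing {

  /** Parses a due date such as "4/4/2008" into a Date object. */
  public static Date parseDueDate(String dueDate) throws ParseException {
    SimpleDateFormat format = new SimpleDateFormat("MM/dd/yyyy");
    format.setLenient(false);
    return format.parse(dueDate);
  }

  /** Compares two due date strings chronologically instead of as plain text. */
  public static int compareDueDates(String date1, String date2) throws ParseException {
    return parseDueDate(date1).compareTo(parseDueDate(date2));
  }
}

With this approach, compareDueDates("4/4/2008", "11/11/2008") returns a negative value (April sorts before November), while the plain String comparison would put 11/11/2008 first.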

One area where I thought Team Blue could use improvement was their unit tests. A programmer needs to know how a system will respond to invalid values, such as null or an integer where a string is expected. Thorough tests will catch most errors and bugs before a user gets a chance to wreak havoc on the application. I suggested that Team Blue test their Comparator to make sure it sorts the way they expect. Another test that should be performed is one that verifies exceptions are thrown at the right time. Also, kudos to Team Blue for their homepage; job well done.
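
As a sketch of the kind of Comparator test I had in mind (the BorrowedItem constructor and its argument order here are guesses on my part, not Team Blue's actual API), a JUnit test could build a list in the wrong order, sort it with the comparator, and assert that the item due first ends up at the front; a similar test could check that the expected exceptions are thrown for invalid input:

import static org.junit.Assert.assertEquals;

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

import org.junit.Test;

/**
 * Sketch of a unit test for the due date comparator.
 */
public class TestDueDateComparator {

  @Test
  public void testSortsByDueDate() {
    // Hypothetical constructor: library name, formatted results, due date (yyyymmdd).
    BorrowedItem later = new BorrowedItem("UH Manoa", "Java Puzzlers", "20081111");
    BorrowedItem earlier = new BorrowedItem("UH Manoa", "Effective Java", "20081104");

    List<BorrowedItem> items = new ArrayList<BorrowedItem>();
    items.add(later);
    items.add(earlier);
    Collections.sort(items, BorrowedItem.dueDateComparator);

    // The item due first should now be at the front of the list.
    assertEquals("Earliest due date should sort first", earlier, items.get(0));
  }
}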

Team Purple's review
Team Violet was assigned to review our system, and honestly I expected to get blasted in the review (I always expect the worst with reviews), but Anthony and Vincent were not too critical. Vincent did point out the expandability issue, which concerns me a great deal: adding more libraries to DueDates as it stands now will turn Team Purple's version into one huge class (a rough sketch of an alternative follows below). I also thought Anthony's assessment of our code was insightful and will assist us in developing a more robust system. The review of Team Purple's system showed deficiencies in our code that will be problematic when performing a build. Both Vincent and Anthony had problems verifying our DueDates due to a JUnit error, and Vincent could not successfully import Team Purple's code into Eclipse because of problems with HttpUnit; I think it's a build path issue.
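
To make the expandability concern concrete, here is a rough sketch of one direction we could take (not our actual design; the interface and method names are made up): give each library its own class behind a small interface, so that adding a library means adding a class instead of growing one giant one.

import java.io.IOException;
import java.util.List;

/**
 * Sketch of one way to keep DueDates from turning into a single huge class:
 * each supported library implements this interface, and the rest of the
 * system only talks to the interface. Names here are hypothetical.
 */
public interface Library {

  /** Returns the display name of this library. */
  String getName();

  /** Logs in and returns the items currently checked out by the given patron. */
  List<BorrowedItem> getBorrowedItems(String patronId, String password) throws IOException;
}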

Emma coverage tool:
[concat] Emma Coverage summary
[concat] class: 67% (2/3)
[concat] method: 44% (15/34)
[concat] block: 18% (156/874)
[concat] line: 18% (36/199)

The Emma coverage does not concern me, because version 1.1 still needs more testing, and adding those tests should improve the coverage. Overall, Team Violet gave us valuable information that will help us improve the system.

Google Project Hosting
In the review of DueDates version 1.0, some flaws in Google's code review tool were discovered during the assessment of the other DueDates implementations. Google has made some modifications to the code reviewer, but I think it needs more improvement because it is still difficult to locate a review if the reviewer does not provide a link. Although someone doing an assessment should notify the development team of a pending review, sometimes the person may forget to send a notice or assume the analysis will be easily found. Perhaps a feature could be added so that, upon publishing a review, a notice is automatically sent to the team.

Conclusion
I am hoping that my review of Team Blue's system was insightful, so they can make improvements to their code. Unless I hear differently from the development teams I review, I am not sure what kind of changes I should make to how I look at a system. But I suppose one area I can improve on is explaining how a team should modify their program; I feel that I am not giving a complete picture of what I am trying to convey.