Tuesday, November 24, 2009

Eco-Depot Code Review

I performed a review of the Eco-Depot system. My summary is at the bottom if you don't wish to read through all the details. If you have suggestions about the quality of my review, I'd like to hear them.


Eco-Depot Review
reviewer: Remy Baumgarten


A. Review the build

The first part of any review is to verify that the system builds and passes automated quality assurance.

A1. Download the system, build it, and run any automated quality assurance checks (use "ant -f verify.build.xml"). If any errors occur, note these in your review.

BUILD SUCCESSFUL

B. Review system usage

If the system builds, the next step is to play with it.

B1. Run the system. Exercise as much of its functionality as possible. Does it implement all of the required features?

I ran the application with the date 11/23/2009 and it worked as it is supposed to.

B2. Try to break the system by providing it with unexpected input. Can you make the system crash or generate a stack dump? If so, note the input that caused the failure.

Deleting the date and clicking submit defaults the display to 11/24/2009 rather than today's date (11/23/2009) or reporting an error.

B3. Evaluate the user interface to the system. If it's a web app, is the interface self-explanatory? Does the system require unnecessary scrolling to see the data, or can it all fit in a single screen? Does it seem "professional", or does it look "amateurish"?

The data fits on a single page. Decimal places should be eliminated. The key is good; it lets us know what we are looking at. A bit of CSS/JS styling would be nice.
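For the decimal places, the values could be rounded for display. A minimal sketch of the idea follows; the class name, method name, and units are hypothetical, invented for illustration rather than taken from the Eco-Depot source:

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class IntensityFormat {
  // Format a value for display with no decimal places and a thousands
  // separator. The real page would format whatever values the
  // WattDepot client returns; this name and unit are assumptions.
  static String display(double lbsCarbon) {
    return new DecimalFormat("#,##0",
        DecimalFormatSymbols.getInstance(Locale.US)).format(lbsCarbon);
  }

  public static void main(String[] args) {
    System.out.println(display(29959.37));  // 29,959
    System.out.println(display(493.0));     // 493
  }
}
```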

C. Review the JavaDocs.

Download the system and generate the javadocs (use "ant -f javadoc.build.xml"). Navigate to the build/javadoc folder and click on index.html to display the JavaDocs in a browser. Read through the JavaDocs and assess the following:

C1. Does the System Summary (provided in an overview.html file) provide a high-level description of the purpose of the system? Does it explain how the packages in the system relate to each other? Is the first sentence self-contained?

The overview currently reads only "A Wicket application of EcoDepot build system." This should be expanded to briefly describe what the application does.

C2. Do the Package Summaries (provided in package.html files) provide a high-level description of the purpose of the package? Do they explain how the classes in the package relate to each other? Is the first sentence self-contained?

Yes. This is very helpful in understanding what each package's purpose is.

C3. Do the Class Summaries (provided at the top of each .java file) provide a high-level description of the purpose of the class? Does it provide sample code for clients of the class, if useful? Is the first sentence self-contained?

There is a high-level description, but there is no sample code. Sample code would be helpful for anyone wanting to use your classes.
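As an illustration of what such sample code might look like in a class summary, here is a small, self-contained example; the class and method names are invented for this sketch and are not taken from the actual source:

```java
/**
 * Parses user-supplied date strings for a display page.
 * (Class and method names are hypothetical, for illustration only.)
 *
 * A short usage sample in the class summary helps clients get started:
 *
 * <pre>
 *   DateParser parser = new DateParser();
 *   String date = parser.normalize("11/23/2009");
 * </pre>
 */
public class DateParser {
  /** Returns the trimmed date string, or "today" if the input is empty. */
  public String normalize(String input) {
    return (input == null || input.trim().isEmpty()) ? "today" : input.trim();
  }
}
```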

C4. Do the Method Summaries (provided before each method) explain, from a client-perspective, what the method does? Do they avoid giving away internal implementation details that could change? Do they document any side-effects of the method invocation? (Note that you can click on the method name to see the source code for the method, which is helpful to assessing the correctness and quality of the javadoc.)

The methods provide what I found to be sufficient documentation; setThresholds is a good example of this.

C5. Please review Chapter 4 of the Elements of Java Style for additional JavaDoc best practices, and check for compliance.

Very good.

D. Review the names

One of the most important (if not the most important) forms of documentation in any software system is the choice of names for program elements, such as packages, classes, methods, instance variables, and parameters. Due to evolution in requirements and design changes, the name originally chosen for a program element may no longer be appropriate or optimal. An important goal of review is to ensure that the names of program elements are well suited to their function. Due to the refactoring capabilities in modern IDEs such as Eclipse, renaming need not be a burden.

D1. Do another pass through the JavaDocs, this time concentrating on the names of packages, classes, and methods. Are these names well chosen? Do they conform to the best practices in Elements of Java Style, Chapter 3? Can you propose better names?

I would propose using names that are not prefixed with the project name; since these classes are already in the project's package, the prefix is redundant.

D2. Once you have reviewed the names displayed in the JavaDocs, review the source code for internal names in the same way.

Looking through the source, I found that the names clearly reflect each element's purpose. Easy to follow.

E. Review the testing.

The system should provide a useful set of test cases.

E1. Run Emma or some other code coverage tool on the system ("ant -f emma.build.xml"). Look at the uncovered code in order to understand one aspect of the limitations of the testing.

Emma reports roughly 65-85% coverage. There are four test cases. They test for various conditions, not strictly happy paths. A few more tests to increase the coverage, and separating the tests into their own classes, would be nice.

E2. Review the test cases. Is each component of the system exercised by a corresponding test class? Do the test cases exercise more than "happy path" behaviors?

Described above.

E3. Review the test output. Under default conditions, the test cases should not generate any output of their own. If output is desired for debugging purposes, it should be controlled by (for example) a System property.

There is no extraneous output.
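A sketch of the property-controlled pattern the template describes follows; the property name "debug" is an assumption, and any project-specific name works the same way:

```java
public class DebugLog {
  // Emit debugging output only when -Ddebug=true is supplied on the
  // command line, so test runs stay silent by default.
  static boolean enabled() {
    return Boolean.getBoolean("debug");
  }

  static void log(String message) {
    if (enabled()) {
      System.out.println(message);
    }
  }

  public static void main(String[] args) {
    log("parsed 24 hourly readings");  // silent unless -Ddebug=true
  }
}
```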

F. Review the package design

The JavaDoc review is focused on whether the system design is correctly explained. In this section, you start to look at whether the system design is itself correct.

F1. Consider the set of packages in the system. Does this reflect a logical structure for the program? Are the contents of each package related to each other, or do some packages contain classes with widely divergent function? Can you think of a better package-level structure for the system?

After reviewing this requirement, I found that the classes are logically divided, but they are not organized into packages.

G. Review the class design

Examine each class implementation with respect to at least the following issues.

G1. Examine its internal structure in terms of its instance variables and methods. Does the class accomplish a single, well-defined task? If not, suggest how it could be divided into two or more classes.

Yes, classes are logically divided.

G2. Are the set of instance variables appropriate for this class? If not, suggest a better way to organize its internal state.

Yes, the instance variables are related to the class.

G3. Does the class present a useful, but minimal interface to clients? In other words, are methods made private whenever possible? If not, which methods should be made private in order to improve the quality of the class interface to its clients?

There are public methods, but they need to be public because other classes depend on them. However, restructuring the class could leave a single public method that delegates to internal (private) methods.
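As a hypothetical sketch of that restructuring (the class and method names are invented for illustration, not from the actual source), a single public entry point can delegate the intermediate steps to private helpers:

```java
public class GridReport {
  // One public entry point; the intermediate steps stay private,
  // keeping the client-facing interface minimal.
  public String render(double[] hourlyMw) {
    return header() + body(hourlyMw);
  }

  private String header() {
    return "Hour,MW\n";
  }

  private String body(double[] hourlyMw) {
    StringBuilder sb = new StringBuilder();
    for (int hour = 0; hour < hourlyMw.length; hour++) {
      sb.append(hour).append(',').append(hourlyMw[hour]).append('\n');
    }
    return sb.toString();
  }
}
```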

H. Review the method design

Examine each method implementation with respect to at least the following issues.

H1. Does the method accomplish a single thing? If not, suggest how to divide it into two or more methods.

Yes. However, I suggest refactoring toward the Wicket way of generating HTML: use components such as Labels and ListViews/Repeaters (for tables) instead of building display strings on the index page, and keep CSS out of the Java code.
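For concreteness, a ListView-based table might look roughly like the sketch below. This is a sketch only, not compiled here; it assumes the Wicket library, a matching HTML template with wicket:id="hours" and wicket:id="flag", and a hypothetical HourReading class:

```java
// Sketch only -- depends on the Wicket library and matching markup;
// the component ids and the HourReading class are hypothetical.
add(new ListView<HourReading>("hours", readings) {
  @Override
  protected void populateItem(ListItem<HourReading> item) {
    HourReading reading = item.getModelObject();
    item.add(new Label("flag", reading.getFlagColor()));
  }
});
```

The idea is that Wicket builds the repeated table rows from the markup, so no HTML strings need to be concatenated in Java.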

H2. Is the method simple and easy to understand? Is it overly long? Does it have an overly complicated internal structure (branching and looping)? If so, suggest how to refactor it into a more simple design.

Displayed above.

H3. Does the method have a large number of side-effects? (Side effects are when the result of the method's operation is not reflected purely in its return value. Methods have side-effects when they alter the external environment through changing instance variables or other system state. All "void" methods express the results of their computation purely through side-effect.) In general, systems in which most methods have few or zero side-effects are easier to test, understand, and enhance. If a method has a large number of side-effects, try to think about ways to reduce them. (Note that this may involve a major redesign of the system in some cases.)

The methods are mainly self-contained.
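As a hypothetical illustration of the low-side-effect style (names invented for this example), a computation expressed as a pure function of its inputs is easier to test than a void method that stores its result in an instance variable:

```java
public class CarbonMath {
  // A pure function: the flag color depends only on the arguments,
  // so each case can be tested directly with no setup or teardown.
  // Thresholds and names here are assumptions for illustration.
  static String flagFor(double intensity, double low, double high) {
    if (intensity < low) {
      return "green";
    }
    return (intensity < high) ? "yellow" : "red";
  }
}
```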

I. Check for common look and feel

I1. Is the code implemented consistently throughout the system, or do different sections look like they were implemented by different people? If so, provide examples of places with inconsistencies.

The code looks consistent.

J. Review the documentation

J1. Does the project home page provide a concise and informative summary of what the system accomplishes? Does it provide a screen dump that illustrates what the system does?

There is neither a screenshot nor a clear description on the home page.

J2. Is there a UserGuide wiki page? Does it explain how to download an executable binary of the system? Does it explain the major functionality of the system and how to obtain it? Are there screen images when appropriate to guide use? Try following the instructions: do they work?

The UserGuide is detailed, and it explains exactly what the application's purpose is.

J3. Is there a DeveloperGuide wiki page? Does it explain how to download the sources and build the system? Does it provide guidance on how to extend the system? Try following the instructions: do they work?

Yes, this was well written. It even describes how to set up the development environment and includes a link to the formatting XML configuration.

K. Review the Software ICU data

For this step, you must ask the Hackystat project owner to add you as a spectator to their project so you can invoke the Software ICU for their system. Run the Software ICU on the project.

K1. Is the Software ICU gathering data consistently and reliably? If not, what data appears to be missing, and why might it be missing?

Yes, the data is present, though there is not much of it.

K2. If data is present, what does it tell you about the quality of the source code? Is the current level high, medium, or low? What are the trends in quality?




Coverage improved, complexity increased, coupling improved, and churn dropped sharply, probably because of several large commits. DevTime shows only one spike, as do Commit and Build. Judging from these vitals, it looks like this project was done at the last minute.

K3. If data is present, what does it tell you about the group process? Are all members contributing equally and consistently?

It looks like three members contributed and one did not build at all, though this could be due to improperly configured sensors.

L. Review Issue management

Go to the project home page, then click on the Issues tab. Next, search for "All Issues" to retrieve a page containing all issues, both open and closed. Next, select "Grid" view, and select "Owner" for the rows. The result should be a page of issues in a grid layout, where each row shows all of the issues (both open and closed) for a particular project member.

L1. Does the issue management page indicate that the project members are doing planning? In other words, have they broken down the project into a reasonable set of issues (i.e., tasks taking one day or longer are represented by an issue)?

There are no issues present.

L2. Does the issue management page indicate that each member is contributing to the project? Does each member have at least one current "open" task? Has each member completed a reasonable number of tasks?

There are no issues present.

M. Review continuous integration

Go to the Hudson server, login, and select the continuous integration job associated with the project.

M1. Is the system being built regularly due to commits? Are there long periods (i.e. longer than a day) when there are no commits?

When the group started, there was a five-day period without any commits, followed by activity toward the end.

M2. If the system build fails, are there situations in which the system stayed in a failed state for a long period of time (i.e. longer than an hour)?

Failed builds were fixed immediately.

M3. Is there a continuous integration job? If so, check the configuration. Is it correct? Will it be triggered by a commit? Check the console output from one of the invocations. Are the appropriate Ant tasks (i.e. checkstyle, pmd, junit, etc.) being executed?

Yes; it polls SVN every five minutes and runs the appropriate Ant tasks on each triggered build.
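For reference, a five-minute poll in Hudson's "Poll SCM" field is expressed in cron-style syntax; a schedule matching what I observed would look roughly like this (a config sketch, not taken from the actual job):

```
# Hudson "Poll SCM" schedule: check the repository every five minutes
*/5 * * * *
```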

M4. Is there a daily build job? If so, check the configuration. Is it correct? Will the job be run regularly once a day? Check the console output from one of the invocations. Are the appropriate Ant tasks for a daily build job (i.e. coverage, coupling, complexity, size, etc.) being executed?

Yes. Based on the console output all the tasks are being run.

Summary

Based upon the above sections, provide an overall summary of your perspective on the project. Does it fulfill the requirements, and if not, why not? Is the code of reasonable quality based upon its design, implementation, and quality assurance techniques? If not, what should be improved? Does the group appear to be functioning well based upon Software ICU, Issue, and Continuous Integration data? If not, what appears to be wrong and what should they do to improve?

Good job on this project. Spending time now learning the Wicket way of doing things will make later work easier and avoid wasted time rewriting code. A little CSS would help, as would organizing the classes into proper packages. There are some great tutorials at http://w3schools.com/ that will help with CSS. I think this project has a lot of potential given time and effort. Good work.

Monday, November 23, 2009

WattWaiter 1.0 Released

My software development group has been working on a Wicket application that queries a server for data from the Oahu power grid and displays an interface for a particular day's power consumption or carbon intensities. We named this program WattWaiter.

Here is a brief overview of what the project is about:

Wattwaiter is a web application that provides a visual display of carbon intensities for a given day. This information can be used by consumers to adjust their electrical usage and become “smart” consumers. By observing carbon intensities, consumers can determine which times during the day are most efficient for electrical usage. This benefits the consumer by allowing usage of lower costing electricity and the electric company by balancing their load.

Carbon intensities are displayed in hourly increments with green, yellow, and red flags corresponding to low, moderate, and high carbon intensities respectively. Users are now able to adjust their electricity usage to coincide with lower intensity periods. A sample Wattwaiter display is shown below.

Here is a screenshot of the early stages of WattWaiter:

Wicket is a bit rough to learn, but reading the Wicket book and the wiki on Wicket helped a lot. I was soon able to program the Wicket way. Coming from a PHP, MySQL background, I found this a bit overkill, but I am sure that this kind of setup will pay off when the application grows.

Working on this team is really nice because I like to see all the assigned tasks come together. Communication between members is also really fun, because we got to develop strategies, gain others' perspectives, and work with others' code.

Honestly, I am quite happy with the way our system was designed, but not being an expert or even novice at Wicket at this point, I can guess that there will be a lot of room for improvement and better ways to do things.

Here is a screenshot of the software ICU (intensive care unit) as described in a previous post. This will give you an idea of how well our project is performing.




As you can see, we need to commit more often (churn). The problem is that when there is a big overhaul or a redesign of part of the project, large commits are hard to avoid, because you are not going to commit broken code mid-rewrite. We also need to develop more knowledge of the Wicket testing framework; we are having some issues developing proper tests for it.

Our Project Page is hosted here: http://code.google.com/p/wattwaiter/

Our distribution is here: http://wattwaiter.googlecode.com/files/wattwaiter-1.0.1122.zip

Our Wiki, User Guides and Dev Guides are here: http://code.google.com/p/wattwaiter/w/list

Sunday, November 15, 2009

WattDepot Version 2.0 Released

WattDepot v2.0 implements a few more query options and employs an updated API for the wattdepotclient class. The necessary changes noted by our code reviewers were also addressed. Since we already had the design correct, the changes were minimal. We have a lot of unit tests, but it seems we probably need more, since our code coverage is at about 82%.

Teamwork in version two was just as good as it was during v1.0's development. We met regularly on IRC, and with the assistance of a Python bot I wrote, we got commit logs posted directly into the channel. We discussed the design and divided the work there and on the phone.

What was really interesting was the opportunity to employ a new tool called Hackystat to measure the health of our project.

Here is a screenshot:



Hackystat is a great way to monitor your project as it computes various statistics about how your project is performing.
"Hackystat is an open source framework for collection, analysis, visualization, interpretation, annotation, and dissemination of software development process and product data."
Hackystat works by relying on the user to install sensors in their project as well as their IDE. It dispatches data to the Hackystat server every 5 minutes (from the IDE) and every time Ant is invoked. The information transmitted relates to the files you're working on and the time you spend on them.

This application can answer questions like these:
  • What day and time during the month was Oahu energy usage at its highest? How many MW was this?
November [2,3,4,5,6,9,10,11,12,13,16,17,18,19,20,23,24,25,26,27] at 995MW
  • What day and time during the month was Oahu energy usage at its lowest? How many MW was this?
November [2,3,4,5,6,9,10,11,12,13,16,17,18,19,20,23,24,25,26,27] at 493MW
  • What day during the month did Oahu consume the most energy? How many MWh was this?
There is no data for this at the moment.
  • What day during the month did Oahu consume the least energy? How many MWh was this?
There is no data for this at the moment.
  • What day during the month did Oahu emit the most carbon (i.e. the "dirtiest" day)? How many lbs of carbon were emitted?
November [4,5,16,17,30] at 29,959lbs

  • What day during the month did Oahu emit the least carbon (i.e. the "cleanest" day)? How many lbs of carbon were emitted?
November [7,8] at 22,908lbs

You can download our distribution here: WattDepot CLI v2.0

Wednesday, November 11, 2009

1-on-1 Code Reviews

The Elua Branch had one-on-one code reviews today. I found them mildly useful in that both parties were able to elaborate a little more on the reports. However, I found the written reports much more detailed, and therefore more useful. I think code reviews should be about going over every line of the code after the reviewer has read and understood the program, perhaps better than the author has.


The reviews from the written reports were very helpful, simply because when someone else looks at your code, they notice things you didn't. For the next project I am hoping to get more time with the reviewer, and I hope they read the code instead of glancing at it looking for specific conventions.

Sunday, November 8, 2009

Code Review

It was my task to review the code of two watt-depot branches. I found that the projects were not completed to specification and hard to review because of this. However, I'll attempt to give insight and advice on what needs to be done, in terms of error handling and design.

Ewalu - The commands are all implemented and working; good job. I think this code needs a little more time reorganizing, and then it will be a really good project. Exception handling needs work, and error messages need to be given: there are uncaught NumberFormatExceptions and ArrayIndexOutOfBoundsExceptions. There are also design issues to address, namely restructuring the project into logically divided packages and classes; currently the source is mostly in one file. The test cases should also check how the program responds to invalid input and exceptions.
Click for Report

Ehiku - The Ehiku branch needs a bit more time as well, to implement the remaining commands and convert the project into a more organized structure of classes and packages. It seems the transition is already underway in the way the commands are parsed and dispatched; I'd like to see this design completed. There are some mishandled exceptions in the code as well (IllegalArgumentException, ResourceNotFoundException), and the code leaves '[]' on screen after a result is printed. There are no test cases either. But as I said, this branch can easily clean all this up; it just needs a little more time.
Click for Report

Wednesday, November 4, 2009

Elua Branch Experience

Kevin and I have been working on the Elua branch every day for about a week now. It has been a wonderful learning experience. The programming itself was not difficult, but the skills with the tools we had learned previously really came into use. By using quality assurance tools and continuous integration, I really feel we developed a strong command line interface for the WattDepot project.

I think the most interesting part of this experience was working in a team. I gained more experience coordinating tasks which was especially useful when working with two interoperable methods. We both worked on methods that complemented each other and continually committed to make sure that they integrated. We also found that dividing the work up into methods and tasks was useful, but we queried each other when there was a decision to be made.

The most interesting part, which I really want to learn more about is design patterns. The only design pattern I have a lot of experience with is MVC (Model View Controller) which I use extensively in PHP and a bit in Ruby on Rails.

We finished the entire specification, to my knowledge, although it seems we can always make something better. I would never call a project finished, because that simply never happens; a project is always in flux, and once you stop working on it, it deteriorates. I have to admit I've grown a bit attached to this project because of the work I've put into it, and I hope it can evolve into something greater in the future.

Download and test our CLI for WattDepot; I'd like to know what you think.

SVN Checkout:

svn checkout http://wattdepot-cli.googlecode.com/svn/branches/elua wattdepot-cli-read-only

Monday, November 2, 2009

Hudson for Continuous Integration


I found Hudson a really interesting and useful tool for continuous integration. I was surprised how slick the interface is and how easy it is to setup. It includes AJAX feedback when you are configuring your project and lets you know if you are setting it up incorrectly.

Hudson also came in really handy because when you are making a lot of changes, and committing code every 30 minutes or so, sometimes you forget to run ant -f verify.build.xml. Hudson will send you an email within 5 minutes (the time we configured Hudson to check our SVN repository) and let you know that your build failed. If it continuously fails then the little icon for your project will turn from a sun to a lightning storm (something that makes it fun to avoid!).

When I was converting my project from one big monolithic class into smaller classes, I tried to do the conversion all in one shot, without running verify or committing to let Hudson complain. That was a mistake. I had about 100 Checkstyle issues to go through and fix, and it took a long time. It would have been easier to convert one class at a time and edit the following classes accordingly, so Checkstyle never had a chance to complain.

All in all it was a great experience and I wouldn't go without Hudson and all the tools we are using now for any other project.