Our final solution is an augmented reality system centered on goggles that project plant images, soil information, and an overall compatibility rating with the ground and surrounding plants.
The gardener "plants" sensor stems into the ground where he intends to place each plant. On top of each stem he chooses a leaf-tag that corresponds to the intended plant. The stem has soil sensors to detect moisture and pH, which are the most critical pieces of information to gather.
The goggles read the unique tag on each stem, identify the intended plant, and project an image of it to show the gardener what could grow there. If changes are needed, the gardener can rearrange and adjust the garden's layout and composition by swapping tags and seeing the results in real time.
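A minimal sketch of how the goggles might map a recognized tag to a plant image and a compatibility check against the stem's soil reading; the tag IDs, catalog entries, and pH rule are invented for illustration.

```python
# Hypothetical catalog linking leaf-tag IDs to plant data.
PLANT_CATALOG = {
    "tag-001": {"name": "Tomato", "ph_range": (6.0, 6.8), "image": "tomato_mature.png"},
    "tag-002": {"name": "Lavender", "ph_range": (6.5, 7.5), "image": "lavender_mature.png"},
}

def overlay_for(tag_id: str, soil_ph: float) -> dict:
    """Return the image to project and whether the soil suits the plant."""
    plant = PLANT_CATALOG[tag_id]
    lo, hi = plant["ph_range"]
    return {
        "name": plant["name"],
        "image": plant["image"],
        "compatible": lo <= soil_ph <= hi,
    }
```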
By default, the user sees all plants mature at the same time to best assist with physical arrangement. However, the goggles themselves have a control wheel which, when moved from its rest position, changes the view to a timeline mode where the gardener can see future months in which the plants will bloom. Because plants bloom at different times, this information is crucial, whether for a flower garden to look its best or for a vegetable garden to be ready at the right times.
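A sketch of how the wheel's position could map to the timeline view, assuming the wheel reports an angle from its rest position; the degrees-per-month step is an invented design parameter.

```python
DEGREES_PER_MONTH = 30  # invented step size: one month per 30 degrees of travel

def months_ahead(wheel_angle_deg: float) -> int:
    """Rest position shows the default all-mature view (0 months ahead)."""
    if abs(wheel_angle_deg) < DEGREES_PER_MONTH / 2:
        return 0
    return max(0, round(wheel_angle_deg / DEGREES_PER_MONTH))
```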
The only other control on the goggles is a single touch-sensitive button on top. Pressing it takes a still image of the garden as the gardener sees it and records all the plants in view. The image and plant information are emailed to the user, who can take them to the store to buy plants or supporting supplies (plant food, tools, etc.)
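A sketch of the emailing step using Python's standard email and smtplib modules; the addresses, SMTP host, and the assumption that the goggles send mail directly are placeholders for illustration.

```python
import smtplib
from email.message import EmailMessage

def send_snapshot(image_bytes: bytes, plants_in_view: list[str],
                  user_email: str) -> None:
    """Email the captured garden image plus the list of plants in view."""
    msg = EmailMessage()
    msg["Subject"] = "Your garden snapshot"
    msg["From"] = "goggles@example.com"       # placeholder sender
    msg["To"] = user_email
    msg.set_content("Plants in view:\n" + "\n".join(plants_in_view))
    msg.add_attachment(image_bytes, maintype="image", subtype="jpeg",
                       filename="garden.jpg")
    with smtplib.SMTP("smtp.example.com") as server:   # placeholder host
        server.send_message(msg)
```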
By creating a storyboard, we were able to work through many detailed issues with the system, including its controls and the user's likely expectations at certain points in the experience. The storyboard also served as a scene-by-scene guide for the photo shoot that eventually became our video sketch.
The final prototype is composed of physical artifacts and a user interface. The following images show the physical artifacts: the glasses, the leaf tags, and the sensor stems.
Below is a video sketch explaining how the glasses' interface informs the user.