Impressive! NavCook, while still in beta, touches upon the great potential that we all dreamed about when we first envisioned Glass!
The website offers 3 different demo recipes to try. I chose this one, and within a few seconds my Glass sounded a dong to indicate the recipe had arrived. A tap to open the bundle and:
Ingredients are listed on card 1 and photographed on card 2. Wow!
Each step has instructions and a picture. As they say, a picture is worth 1,000 words, but the picture is much faster and easier to understand.
These are a few more pics from the app. So that’s how you batter an avocado!
If you tap on Glass, it gives you the option to read that step aloud. You can even have the instructions read aloud as you look at the corresponding photo.
All in all, I found this to be a very impressive app! The website notes that all recipes are from tastykitchen.com. I looked at tastykitchen, and while it had a lot of recipes, they did not include these fabulous step-by-step photographs. Even the recipe for this particular salad on tastykitchen did not include any of them, which leads me to believe the makers of NavCook created the content we see in this app. That’s a lot of work! I wonder what the long-term plan is for creating mass content.
I got a clue in my communication with Tejas Lagvankar, creator of NavCook, when he explained that the app was still a demo and “no recipes can be added by the user yet”. Perhaps he is planning to rely on user-created content…? If so, I can imagine some proud chefs documenting the step-by-step process of making their favorite recipes. However, I cannot imagine that they would do as good a job as Lagvankar has done with these demo recipes. The photography was excellent, the instructions were well worded, and that is not easy to accomplish!
Note – Lagvankar explained that tastykitchen has a section of step-by-step instructions and gave me this link to the recipe: http://tastykitchen.com/blog/2013/02/crispy-avocado-bacon-and-tomato-salad/. So that answers that, but I’ll still be interested in seeing how user-created content compares and what the end product will be like :-)
One make-or-break feature is voice command. Yes, I know we developers can’t access the Glass voice command functions yet, but it is an obvious must-have for this app. When your hands are covered with batter and breading and you want to know how high the stove should be set, you can’t be tapping and swiping at Glass and getting all that gunk in your hair. It would be better to wake up Glass with a nod and say “previous” and “next” to move through the steps. Or maybe, instead of voice commands, Glass’s accelerometer could sense head movement, a glance or nod to the right or left, to move forward or back in the timeline.
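Just to sketch that accelerometer idea (the class and names here are hypothetical, and Glass’s GDK isn’t public yet, so this is plain Java showing only the detection logic a real app would feed from Android’s SensorEventListener):

```java
// Hypothetical sketch: detect a sideways head nod from accelerometer samples.
// In a real Glassware app, onSample() would be called from
// SensorEventListener.onSensorChanged() with the lateral axis value.
public class NodDetector {
    public enum Nod { NONE, LEFT, RIGHT }

    private final float threshold;  // lateral acceleration (m/s^2) that counts as a nod
    private final long cooldownMs;  // ignore samples for a while after firing
    private long lastFired;

    public NodDetector(float threshold, long cooldownMs) {
        this.threshold = threshold;
        this.cooldownMs = cooldownMs;
        this.lastFired = -cooldownMs;  // allow an event immediately at startup
    }

    // x = lateral acceleration sample; timestampMs = when it was taken.
    public Nod onSample(float x, long timestampMs) {
        if (timestampMs - lastFired < cooldownMs) return Nod.NONE;
        if (x > threshold)  { lastFired = timestampMs; return Nod.RIGHT; }
        if (x < -threshold) { lastFired = timestampMs; return Nod.LEFT; }
        return Nod.NONE;
    }
}
```

A RIGHT event could advance to the next recipe card and LEFT go back; the cooldown keeps one vigorous nod from skipping several steps at once.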
Currently, when Glass goes to sleep while using NavCook, you have to wake it up, swipe back through the timeline just to get to NavCook, then tap and swipe through all of the cards to get to the step you were reading before Glass went to sleep. There is no easy way to get back to where you were. But those UI issues are something every Glass developer is dealing with right now. Beyond that, NavCook is a first-class piece of Glassware!