Future Google Glass Features Buried Deep Inside Its Software Code

Thanks to Android Police, who are always one step ahead when it comes to coding, rooting and teardowns, we now know about some new features of Google Glass. Apparently, parts of the software contain code for capabilities that Google hasn't told anyone about, not even the developers working on customized software. Here's what the team at Android Police found.

Order your cab

Nowadays, calling a cab is easier than ever. You can order one by text, online, by phone call or through an app. Cab companies have invested a lot in making the service as easy as possible for the customer. Soon you may be able to hail one just by telling the Glass, "Glass, call a cab," and have a car at your location within minutes. This would certainly come in handy when you're walking back from the supermarket with your hands full, or wearing gloves that make a smartphone awkward to use. Yes, the Glass could make it even easier.

3D Model for 3D printing

Even though 3D printing is still a young technology, building the models is a huge part of its evolution. Considering that the Glass reportedly has a 3D camera, it could be easy to capture a model that can be edited or rearranged before it goes to the printer. After all, 3D printing will matter a great deal in the future, in more than one domain.

Bookmark your favorite recipes or search a recipe

Do you remember the last time you cooked a meal from memory, without looking up at least some instructions or ingredients online? I don't, and I'm sure a lot of you are in the same position. Once Google enables this feature, cooking could get a whole lot easier, because the Glass would walk you through preparing a dish step by step, in real time.

Command the Glass by eye movement

We all know by now that the Glass can be controlled entirely by voice: to get the device to do something, all you have to do is say "OK Glass" followed by the command. But Google has stayed completely silent on control via eye movement. The Android Police report mentions that there is code for this feature as well, so the Glass could be told to take a photo by blinking or winking. More precisely, the device could be set to take a shot when its owner double-blinks or winks.
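Purely for illustration, here is a rough Java sketch of how an app might wire such a wink event to a photo capture, using the standard Android camera intent. The WinkListener interface is a hypothetical stand-in for the undocumented eye-gesture hooks hinted at in the teardown; nothing below comes from Google's actual Glass code.

    import android.app.Activity;
    import android.content.Intent;
    import android.provider.MediaStore;

    // Sketch only: WinkListener is a hypothetical callback standing in for the
    // undocumented eye-gesture hooks found in the teardown.
    public class WinkShutterActivity extends Activity {

        private static final int REQUEST_IMAGE_CAPTURE = 1;

        // Hypothetical interface; a real implementation would be registered with
        // whatever eye-gesture service Glass eventually exposes.
        interface WinkListener {
            void onWink();
        }

        private final WinkListener winkListener = new WinkListener() {
            @Override
            public void onWink() {
                // Fire the stock Android capture intent as the "take a photo" action.
                Intent capture = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
                if (capture.resolveActivity(getPackageManager()) != null) {
                    startActivityForResult(capture, REQUEST_IMAGE_CAPTURE);
                }
            }
        };
    }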

Learn a Song

The name of this code entry leaves room for a lot of possibilities. For instance, the Glass might show guitar tutorials on its screen, with comments and cues while you play the notes on your own guitar, or do the same for karaoke songs. Or maybe it could simply recognize a song you're listening to. Each of these possibilities seems worth exploring.

Glass Music Player

This is another improvement Google is apparently planning for the Glass: a music player built around its bone-conduction sound system, which would transmit audio through the bones of the skull. While we don't know for sure what this would feel like, we can't wait to try this new listening experience.

Organize a Round of Golf

Yes, the system could guide you around a golf course or toward a nearby golf club. This is certainly an interesting addition, and we are sure people will use it. The Glass would also incorporate a digital caddy.

Go for a Run/Go for a Bike Ride

This option could come in handy for newcomers to big cities, especially those who like exploring new places and meeting new people. The Glass would provide its owner with navigation routes, directions and everything else needed to reach a destination.

Translate Feature

We've heard about this feature before. Using the camera and voice commands, the Glass could render the text it sees or the speech it hears into a language familiar to its owner, via Google Translate. This would be very useful for frequent travelers. No more language barriers!

Tuning an Instrument

Of course, if you play an instrument, or even repair one, this feature could be quite useful, or at least worth exploring. There is no finer sensor than the ear when it comes to tuning an instrument, but we can't wait to see what Google will be able to do in this area.
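For the curious, most software tuners work the same basic way: record a short snippet from the microphone and estimate its dominant pitch. The sketch below is generic Android Java, not anything pulled from the Glass firmware; the SimpleTuner name, the sample rate and the pitch range are our own illustrative assumptions, and it uses a crude autocorrelation rather than whatever method Google might ship.

    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;

    // Crude pitch estimator: records roughly a quarter second from the microphone
    // and picks the lag with the strongest autocorrelation as the period.
    // Requires the RECORD_AUDIO permission. Illustrative sketch only.
    public class SimpleTuner {
        private static final int SAMPLE_RATE = 44100;

        public static double estimatePitchHz() {
            // getMinBufferSize returns bytes; we work in 16-bit samples (2 bytes each).
            int minBufBytes = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            int numSamples = Math.max(minBufBytes / 2, SAMPLE_RATE / 4); // ~250 ms of audio
            short[] samples = new short[numSamples];

            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, numSamples * 2);
            recorder.startRecording();
            recorder.read(samples, 0, samples.length);
            recorder.stop();
            recorder.release();

            // Search lags corresponding to roughly 60 Hz - 1000 Hz.
            int minLag = SAMPLE_RATE / 1000;
            int maxLag = SAMPLE_RATE / 60;
            int bestLag = minLag;
            double bestScore = Double.NEGATIVE_INFINITY;
            for (int lag = minLag; lag <= maxLag; lag++) {
                double score = 0;
                for (int i = 0; i + lag < samples.length; i++) {
                    score += (double) samples[i] * samples[i + lag];
                }
                if (score > bestScore) {
                    bestScore = score;
                    bestLag = lag;
                }
            }
            return (double) SAMPLE_RATE / bestLag; // e.g. close to 440 Hz for concert A
        }
    }

A real tuner would then compare the estimated frequency against the nearest note (440 Hz for concert A, for example) and show how far sharp or flat you are.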