The launch of Google Glass will be a global event that marks a major shift in the mobile tech sector. With Google Glass we will see the world through augmented reality in a way that changes how we behave, and it should simplify many of the tasks we currently perform with various mobile devices such as smartphones, tablets, and GPS navigators. The launch of Android was a revolution in mobile technology, and it seems Google Glass could produce an equally radical change in how we do the everyday things many of us already do with our phones and tablets. The question is: will we be able to adapt to this change of mindset?

Until now, when we wanted to check the weather, reply to a comment on Facebook, or navigate somewhere with GPS, we had to take our attention away from our surroundings. With Google Glass, however, we will be able to do all this and more while still watching what is happening around us. Sharing what we are doing will also be more convenient, and often we won't need our hands at all, because voice commands will handle many tasks.
Video showing how Google Glass works
Google has released a video in which you can see the Google Glass interface as your eyes will see it, and it also shows how you will be able to control the glasses manually with the touchpad built into their right arm. Keep in mind that this video is just an example of what Google Glass can do; I'm sure the final version will include many improvements, since Google is receiving plenty of feedback from developers, enough to add new options and fix any current problems. Well, here's the video …
After watching this video, more than a few of you will surely be waiting eagerly for the day Google officially puts these long-awaited glasses on sale. At least that's my case :-)