Virtual assistance

Will AI help us free up more time?

Everyone thinks they need more help. Busy days and a desire to pack as much into life as possible can leave people searching for extra time in the day. The emergence of virtual assistants might just help provide it.

This will all sound pretty familiar to adept users of everything from note-taking applications and web chat services to Skype, but as the development of artificial intelligence (AI) expands potential uses, some applications are learning to think for themselves.

It’s this ability to predict the wants and needs of the user that will set the most successful services or software apart. Tech companies including Apple, Facebook, Google and Microsoft have all made significant investments in proprietary virtual assistants, with mixed success. Some, like Apple’s Siri, are aimed at launching other software through voice commands; others, such as Google Now, point to information from third parties that users may need or find useful.

However the technology continues to develop, the level of interest in it points to the income potential developers suspect it could support: imagine the price companies may pay to be the first thing suggested. As Facebook rolls out tests of its own service, called ‘M’ and being developed as part of its stand-alone Messenger app, the company is hoping it will be able to find information and complete tasks on a user’s behalf.

While the Facebook offering is powered by AI, it has the added bonus of being trained and supervised by real people. This may speed its development and ultimately help fine-tune its usefulness. But it also points to a key challenge in making AI consumer-friendly: getting it to think like a person.

Much of the pitch for these services is reducing the time people have to spend on simple, mundane tasks. That sounds great, but all are still some way from doing more than pre-empting a web search and providing some novelty value.

Eye of Horus team

Eye of Horus is an open source platform for controlling any device just by looking at it. The device could help physically handicapped people with their daily tasks. The system combines eye tracking with a frontal camera to determine where the user is looking. The target devices are identified using light beacons (similar to LiFi technology) and controlled with wireless protocols.

We have developed a simple and low-cost solution to detect and identify the objects in our surroundings. Infrared LEDs are used as light beacons (similar to LiFi technology), emitting pulses at a different frequency for each device (PC, camera, TV, microwave…). Our device’s frontal camera detects this light, distinguishing between the objects and communicating with them when you look at them.
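The frequency-coded beacon scheme described above can be sketched in a few lines. The idea is to sample the brightness of the camera pixels around the gaze point over a short window, find the dominant pulse frequency with an FFT, and look it up in a table of known devices. The beacon frequencies, device names and sampling rate below are illustrative assumptions, not values from the project:

```python
import numpy as np

# Illustrative beacon frequencies in Hz; the project's actual values are not published.
DEVICE_BEACONS = {5.0: "TV", 8.0: "PC", 12.0: "microwave"}

def identify_device(brightness, sample_rate, tolerance=0.5):
    """Identify which beacon is pulsing, given brightness samples taken
    from the camera pixels around the user's gaze point."""
    samples = np.asarray(brightness, dtype=float)
    samples -= samples.mean()                  # remove ambient (DC) light level
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    peak = freqs[np.argmax(spectrum)]          # dominant pulse frequency
    for beacon_freq, device in DEVICE_BEACONS.items():
        if abs(peak - beacon_freq) <= tolerance:
            return device
    return None                                # no known beacon in view

# Simulate two seconds of a beacon pulsing at 8 Hz, sampled at 60 frames/s.
t = np.arange(0, 2, 1.0 / 60)
signal = (np.sin(2 * np.pi * 8.0 * t) > 0).astype(float)  # square-wave pulses

print(identify_device(signal, sample_rate=60))  # → PC
```

In practice the camera's frame rate caps the detectable pulse frequencies (the Nyquist limit), which is one reason low pulse rates like these are plausible for a wearable with an ordinary camera.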

The challenge is to design and build a wearable accessory that could be useful for physically handicapped people in their daily activities. The device would facilitate their work through a natural interface, letting them do different things without using their hands. For instance, they could interact with a distant device just by looking at it.

Given the increasing breadth of the Internet of Things (IoT), whereby internet connectivity now reaches beyond traditional devices like computers, smartphones and tablets to a diverse range of everyday objects, the Eye of Horus could allow people to control the light level in their kitchens or switch on machines using only their gaze. The device must therefore have built-in wireless capabilities, the foundation of the IoT.

[Figure: Eye of Horus prototype]

We have decided to manufacture our product with 3D printers so that it is durable. This technology is cheap and accessible, so the device could be replicated worldwide, or even in space. We also intend to keep the project open source, both hardware and software.

AI is helping to bring the reality of self-driving robot cars closer to the road

While self-driving cars are still very much in the ‘learner’ phase, artificial intelligence is rapidly helping them find a route to the roads.

German truck manufacturer Mercedes-Benz is among the first companies looking to deploy self-driving technology in commercial applications.

The company is testing and developing autopilot systems for its heavy trucks. Although these are some years from hitting the roads in earnest, the intention is to give long-haul truckers a concentration break on the highway rather than to take over completely. Onboard systems will monitor speed and distance from other vehicles, while also keeping the steering on the straight and narrow.

Consumer versions of the technology are seen in the company’s ‘intelligent drive’ systems, which are gradually packaging together a growing number of driver assists that allow current production cars to manage more aspects of city driving, such as the ever-tricky parallel park.

Google meanwhile is applying its self-drive technology to existing vehicles, as well as developing new prototypes intended to be self-driving from the ground up. The company’s test fleet has already put more than a million miles under its wheels in locations around the US. While the technologies that make self-driving cars possible are gradually being brought together, it may still be a while before we can just take our hands off the wheel.

IEEE

IEEE is the world’s largest professional association dedicated to advancing technological innovation and excellence for the benefit of humanity. IEEE and its members inspire a global community through IEEE’s highly cited publications, conferences, technology standards, and professional and educational activities.