Wednesday, January 29, 2014

Intelligent Augmented Reality glasses designed for professors


The proposed system (Augmented Lecture Feedback System – ALFs) seeks to improve communication between students and professors during large lecture classes like those frequently given at universities. The way it works is quite intuitive: the professor wears a pair of augmented reality glasses that enable him/her to see symbols above each student; the symbols indicate each student's state during the class. "These symbols are activated by the students via their cell phones and are used to tell the professor that they don't understand the explanation, or that they have understood it, to ask the professor to go more slowly, or to say whether or not they know the answer to the question that the professor has just asked the class," explains one of the researchers from UC3M's Grupo de Sistemas Interactivos (Interactive Systems Group), Telmo Zarraonandia. This way, the professor knows, simply by looking at the symbol a student has displayed over his/her head, exactly what that student wishes to communicate to him/her. In addition, in the upper part of the glasses' display, the system shows a diagram with the aggregate of the answers given by the students, which can be particularly useful in large groups.
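To make the interaction concrete, the sketch below (in Python) shows the kind of symbol vocabulary and class-wide summary the article describes; the symbol names and data structures are illustrative assumptions, not the researchers' actual implementation.

```python
from collections import Counter
from enum import Enum

class Symbol(Enum):
    """Illustrative set of states a student might signal (names are assumptions)."""
    UNDERSTOOD = "understood the explanation"
    NOT_UNDERSTOOD = "did not understand"
    SLOW_DOWN = "please go more slowly"
    KNOWS_ANSWER = "knows the answer"
    DOES_NOT_KNOW_ANSWER = "does not know the answer"

def aggregate(symbols_by_student: dict[str, Symbol]) -> Counter:
    """Build the class-wide summary that could be shown in the upper part of the display."""
    return Counter(symbols_by_student.values())

# Example: three students signal their current state from their phones.
current = {"student_01": Symbol.NOT_UNDERSTOOD,
           "student_02": Symbol.SLOW_DOWN,
           "student_03": Symbol.NOT_UNDERSTOOD}
print(aggregate(current))  # NOT_UNDERSTOOD: 2, SLOW_DOWN: 1
```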

[Image: Intelligent Augmented Reality glasses designed for professors]

The main advantage of this device is that students have a new way to communicate that enables them to be in contact with the professor both immediately and privately, and without interrupting the class. "The channel that we've created will help overcome the problems of timidity or fear of speaking in front of the class that some students have", points out one of the researchers, Ignacio Aedo, a tenured professor in UC3M's Computer Science department. This way, the professor has a source of immediate information on what the students are grasping from his/her presentation. "The hope is that this system will make for more effective lecture classes, because receiving greater feedback, continuously, will allow the professor to adapt the class based on the students' actual knowledge and understanding, giving extra examples, varying the rhythm or skipping those parts of the lesson that the students indicate that they already know or remember," concludes Aedo. Moreover, through the glasses, the system allows the professor to visualize notes or comments that s/he doesn't want to forget to mention at specific moments, and which s/he can introduce in the system prior to the class.

Education of the future

The architecture of the system is described in a scientific article published in the British Journal of Educational Technology in a special monographic edition dedicated to the education of the future. The prototype that these researchers have developed is controlled by gestures, captured with a Microsoft Kinect; using these gestures, the professor selects the support slide for an explanation, or activates predetermined questions to which the students respond by displaying a variety of symbols that they select using their cell phones. The system can identify the students using facial recognition (by previously loading their photographs to a database) or, in larger groups, by using a positioning system based on markers.
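The article does not detail the data structures involved, but the following sketch illustrates the core idea of the architecture: combining student identification (from face recognition or marker-based positioning) with the symbols selected on phones, so that each symbol is drawn above the right person in the professor's view. All names below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class StudentOverlay:
    """One symbol rendered above a student's head in the professor's view (illustrative)."""
    student_id: str
    symbol: str     # e.g. "SLOW_DOWN"
    x: float        # position of the student in the glasses' field of view
    y: float

def build_overlays(positions: dict[str, tuple[float, float]],
                   symbols: dict[str, str]) -> list[StudentOverlay]:
    """Pair each identified student (via face recognition or markers) with the
    symbol he/she last selected on the phone, so it can be drawn above that student."""
    overlays = []
    for student_id, (x, y) in positions.items():
        symbol = symbols.get(student_id)
        if symbol is not None:
            overlays.append(StudentOverlay(student_id, symbol, x, y))
    return overlays
```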

In order for the students to be able to select the symbols, they just have to connect their cell phones to the server where the system is installed. The professor, on the other hand, just needs a pair of augmented reality glasses. "Because of their ability to display information on the user's field of vision, these devices have the potential to change the way in which we carry out many of our daily tasks, as well as offering many interesting possibilities from a research point of view," comments Telmo Zarraonandia. Currently, the various models of augmented reality glasses are costly and not very ergonomic because they are too heavy and make it difficult for the professor to move, but "it is hoped that in the next few years new models will come onto the market and these will be suitable for use in class, as might be the case with the new Google glasses, which could be adapted to this system," points out Ignacio Aedo.
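The release does not specify the communication protocol, but a minimal sketch of the student-side interaction, assuming a simple HTTP endpoint on the classroom server (the URL and field names below are hypothetical), might look like this:

```python
import json
import urllib.request

SERVER_URL = "http://lecture-server.example/api/symbol"  # hypothetical endpoint

def send_symbol(student_id: str, symbol: str) -> int:
    """Send the symbol selected on a student's phone to the classroom server."""
    payload = json.dumps({"student_id": student_id, "symbol": symbol}).encode("utf-8")
    request = urllib.request.Request(SERVER_URL, data=payload,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        return response.status  # e.g. 200 if the server accepted the update

# Example (not executed here, since the endpoint is hypothetical):
# send_symbol("student_01", "SLOW_DOWN")
```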
###

This research is part of TIPEx (Information Technologies for Planning and Training in Emergencies), a project funded by the Ministerio de Economía y Competitividad (Ministry of Economy and Competitiveness) in which researchers from the Universidad Politécnica de Valencia and the Universidad Pablo de Olavide also participate; the project examines how augmented reality and other technologies can be applied to the area of emergency management.

Further information:

Title: An augmented lecture feedback system to support learner and teacher communication
Authors: Telmo Zarraonandia, Ignacio Aedo, Paloma Díaz, Álvaro Montero
Journal: British Journal of Educational Technology, Volume 44, Number 4, July 2013 (published online 4 June 2013)
DOI: 10.1111/bjet.12047
URL: http://onlinelibrary.wiley.com/doi/10.1111/bjet.2013.44.issue-4/issuetoc

VIDEO: http://youtu.be/sNHFhL034F4

News Release Source : Intelligent glasses designed for professors


Wednesday, January 8, 2014

Metaio to integrate 3D Augmented Reality into Intel® RealSense™ Computing SDK



MUNICH and SAN FRANCISCO, Jan. 6, 2014 /PRNewswire/ -- Metaio, the world leader in augmented reality (AR) software and solutions, today announced at the 2014 Consumer Electronics Show (CES) the planned integration of their patented 3D augmented reality tracking software with the Intel® RealSense™ Software Development Kit (SDK).




Winner of multiple awards, including the ISMAR Tracking Competition and the recent 2013 Volkswagen Tracking Competition, Metaio's augmented reality tracking technology recognizes real-world images, objects and environments in order to attach relevant digital or virtual information, in real time. Tracking is perhaps the most important aspect of any augmented reality experience. Now, with the Intel® RealSense™ 3D camera being integrated into 2 in 1, Ultrabook, notebook, and all-in-one (AIO) devices, real and virtual objects and environments will interact with each other in practical as well as entertainment applications. For example, someone could accurately map a room in their house and virtually rearrange the furniture on their computing device.
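The press release shows no code, but the essence of "attaching" digital content to a recognized object is applying the pose the tracker reports to the virtual content. The generic Python/NumPy sketch below illustrates that step; it is not Metaio's or Intel's API.

```python
import numpy as np

def place_virtual_object(object_pose: np.ndarray, local_points: np.ndarray) -> np.ndarray:
    """Transform a virtual object's local 3D points into the camera frame using the
    4x4 pose a tracker reports for a recognized real-world object (generic illustration)."""
    homogeneous = np.hstack([local_points, np.ones((local_points.shape[0], 1))])
    return (object_pose @ homogeneous.T).T[:, :3]

# Example: anchor a virtual chair footprint to a tracked table pose.
pose = np.eye(4)
pose[:3, 3] = [0.5, 0.0, 1.2]   # tracked object 1.2 m in front of the camera, 0.5 m to the right
chair_corners = np.array([[0, 0, 0], [0.4, 0, 0], [0.4, 0.4, 0], [0, 0.4, 0]], dtype=float)
print(place_virtual_object(pose, chair_corners))
```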




[Image: Metaio to integrate 3D Augmented Reality into Intel® RealSense™ Computing SDK]

"Intel's vision is to make computing more immersive and enable human-like natural interactions with our devices," said Mark Yahiro, Managing Director, New Business, Perceptual Computing, Intel Corporation.  "Using Intel RealSense 3D camera technology in combination with Metaio's augmented reality tools, we look forward to blurring the virtual and real worlds further than ever before. For example, children will be able to play with their favorite toys and customize their experiences with digital interactions in creative new ways."


The Intel RealSense Software Development Kit (SDK) will be released in the first half of 2014. It will be an evolution of the Intel® Perceptual Computing SDK, which was released in 2012 and has been downloaded over twenty-five thousand times by developers worldwide.  The Intel® RealSense SDK will include voice recognition in more than 9 languages; background subtraction, a capability that enables developers to add green-screen-like functionality to applications; close-range hand and finger tracking that permits users to control their computing devices with mid-air hand and finger gestures; and face analysis, which identifies users and tracks their facial features across the camera's field of view. Once the addition of the 3D tracking and recognition engine by Metaio is completed, the Intel RealSense Computing SDK will offer developers advanced augmented reality features using depth data from the integrated Intel RealSense 3D camera in computing devices.
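As a rough illustration of one listed capability, background subtraction with a depth camera can be approximated by keeping only the pixels closer than a depth threshold; the sketch below is a generic example in Python/NumPy, not the RealSense SDK API.

```python
import numpy as np

def subtract_background(color: np.ndarray, depth_mm: np.ndarray,
                        max_depth_mm: float = 1200.0) -> np.ndarray:
    """Green-screen-like effect from depth data: keep pixels closer than the threshold
    and paint everything farther away (or with invalid depth 0) a solid green."""
    mask = (depth_mm > 0) & (depth_mm < max_depth_mm)
    output = np.zeros_like(color)
    output[:] = (0, 255, 0)        # replacement background colour
    output[mask] = color[mask]     # keep the near (foreground) pixels
    return output

# Example with synthetic frames: a 480x640 colour image and a matching depth map in millimetres.
color = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
depth = np.random.randint(400, 4000, (480, 640), dtype=np.uint16)
foreground_only = subtract_background(color, depth)
```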


This anticipated integration brings new levels of interactivity to the SDKs of both Metaio and Intel. The 65,000 developers on Metaio's AR platform will have access to advanced human-computer interaction features offered by Intel. Developers will be able to create experiences that utilize hand gestures or facial recognition; manipulate backgrounds and environments in real time; and utilize voice recognition, all for new augmented reality applications that bridge the physical and digital worlds for natural, intuitive and immersive experiences.


"Developers need the best tools," said Metaio CTO Peter Meier. "Metaio's collaboration with Intel for the Intel RealSense SDK with depth camera integration will allow developers to push the boundaries on creativity and use of technology in completely new ways of human interactions with computing devices."


To learn more about the Metaio SDK and download it for free, please visit http://metaio.com/products


About Metaio

The worldwide leader in Augmented Reality (AR) research and technology, Metaio develops software products for visually interactive solutions between the real and the virtual world. Based on the Metaio Augmented Reality platform, digital and 3D content can be integrated seamlessly into the real world through the user's camera view. Metaio serves over 65,000 developers, powers over 1,000 apps for enterprise, marketing, retail, publishing and industrial use cases, and its AR software reaches over 30 million consumers. Learn more at www.metaio.com


SOURCE Metaio

RELATED LINKS
http://www.metaio.com

News Release Source : http://www.prnewswire.com/news-releases/metaio-to-integrate-3d-augmented-reality-into-intel-realsense-computing-sdk-238960841.html