Wednesday, August 19, 2015

Computer Scientists Introduce a New Software Modeling Program

UMass Amherst Computer Scientists Introduce New Graphics Software


August 11, 2015

AMHERST, Mass. – Computer scientists from the University of Massachusetts Amherst led by Evangelos Kalogerakis today unveiled a new software modeling program that uses sophisticated geometric matching and machine learning to successfully mimic the human perception of style, giving users powerful new tools to compare the style similarity of three-dimensional (3D) objects.

Kalogerakis and his doctoral student Zhaoliang Lun in the College of Information and Computer Sciences at UMass Amherst, with Alla Sheffer from the University of British Columbia, presented their new algorithm at one of the world’s largest computer graphics conferences, the annual Association for Computing Machinery’s (ACM) Special Interest Group on Computer Graphics and Interactive Techniques (SIGGRAPH) 2015, going on this week in Los Angeles.

Kalogerakis is an expert in developing computational techniques that analyze and synthesize visual content, focusing on machine learning algorithms that help people create 3D models. To develop the new software, he and his colleagues drew on observations about style similarity in the art history and appraisal literature, which identifies geometric elements such as shape, proportion and lines, along with visual motifs, as key indicators of stylistic similarity.

They also used crowdsourcing to present style comparisons of objects to more than 2,500 people, including artists, who volunteered via Amazon Mechanical Turk. This yielded more than 50,000 responses across seven structurally diverse categories: buildings, furniture, lamps, coffee sets, architectural pillars, cutlery and dishes. On average, the human respondents agreed with one another on style similarity 85 percent of the time.

As for the software tool, the researchers evaluated it by comparing its responses to the human evaluations and found that it achieves “a surprising agreement rate” of 90 percent, Kalogerakis reports, “making it your next-to-best alternative style expert for providing you with suggestions of objects to populate your home, dining table, or virtual reality environment.”
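To make the agreement-rate evaluation concrete, the sketch below shows one common way such a number can be computed from crowdsourced comparisons: take the majority vote per query as ground truth and count how often a respondent (or the algorithm) matches it. All data, names and the exact scoring scheme here are hypothetical illustrations, not details from the UMass Amherst study.

```python
from collections import Counter

# Hypothetical crowdsourced data: for each triplet query, workers vote whether
# object "A" or object "B" is closer in style to a reference object.
votes = {
    "query1": ["A", "A", "B", "A", "A"],
    "query2": ["B", "B", "B", "A", "B"],
    "query3": ["A", "B", "A", "A", "B"],
}

# Hypothetical answers produced by a style-similarity algorithm.
predictions = {"query1": "A", "query2": "B", "query3": "B"}

def majority(ballot):
    """Return the most common answer for one triplet query."""
    return Counter(ballot).most_common(1)[0][0]

def agreement_rate(preds, ballots):
    """Fraction of queries where a prediction matches the crowd majority."""
    hits = sum(preds[q] == majority(b) for q, b in ballots.items())
    return hits / len(ballots)

print(agreement_rate(predictions, votes))  # 2 of 3 queries match the majority
```

The same scoring can be applied to each human respondent in turn, which is how a figure like the 85 percent inter-respondent agreement can be obtained.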

As he explains, computer graphics algorithms help people create movies, visual effects, games, virtual and augmented reality environments. They are also useful in manufacturing real objects and designing architectural scenery. More generally, the new algorithm is expected to be useful to those exploring databases of digital representations of buildings, pillars and other objects according to style attributes for designing virtual or real environments, creating content for a computer game, and populating an augmented reality environment with virtual objects.

Computer algorithms also run in the background on many devices, as well, he says, such as spell and grammar checkers, programs that deblur photographs or focus on faces. Robots run computer algorithms to recognize their environment to move around and pick up objects. Online search engines run computer algorithms to help users find documents, pictures and videos.

SIGGRAPH members include researchers, developers and users from the technical, academic, business and art communities who use computer graphics and interactive techniques. ACM is the world’s largest educational and scientific computing society for educators, researchers and professionals to inspire dialogue, share resources and address the field’s challenges.

The following link gives more information about the new software modeling program:

Elements of Style: Learning Perceptual Shape Style Similarity  (PDF)


Friday, August 7, 2015

First Eye Tracking Upgrade for Augmented Reality Glasses

SMI Shows First Eye Tracking Upgrade for Augmented Reality Glasses


SensoMotoric Instruments (SMI) adds eye tracking to the Epson Moverio BT-200 for hands-free interaction, gaze-sensitive on-demand content, and personalized visualization in augmented reality.

BOSTON and TELTOW, Germany, August 6, 2015 /PRNewswire/ --

At the 2015 Siggraph Conference, SensoMotoric Instruments (SMI) is showing the world's first eye tracking integration for augmented reality, based on Epson's Moverio BT-200 see-through head-mounted display and on SMI's mobile eye tracking platform. The new solution brings unprecedented quality and efficiency to personalized visualization and to multi-modal, hands-free interaction with context-sensitive displays. For the first time, professionals and researchers can integrate gaze and visual attention into AR applications.


SMI will demonstrate the solution at Siggraph in Los Angeles, booth 1022, August 11-13, 2015.

One critical component for a good AR user experience is the ability to accurately overlay and add content to objects, as if pinning it to them. To achieve this, precise knowledge about the users' physiognomy and gaze is required. SMI's advanced eye tracking algorithms offer this quality and add continuous adaption of the visualization to the user's biometric information, replacing current cumbersome and error-prone display calibration routines.

Visual orientation, search and inspection procedures are often crucial, yet error-prone, tasks in manufacturing, logistics, technical service, medical and other professional fields. Gaze-sensitive information processing supports tasks by exploiting the user's attention, e.g. by providing "in-the-loop" gaze guidance during procedures that are quality-sensitive and visually complex. This can be done by structuring, guiding and monitoring the search, and the visual intake progress. Eye tracking enables objective assessment and documentation of what is seen. Gaze overlay also allows for transfer of expertise and can make expert knowledge and procedures explicit and transparent to others.

With eye tracking in AR displays, context-sensitive information can be provided to the user exactly when and where it is needed. Superimposing information and instructions onto real objects gives the user access to instant 'at-a-glance' and on-demand information, such as the condition of cargo, the health data of patients, or the performance data of machines. SMI's eye tracking integration into multimodal interaction helps to realize convenient, hands-free device and application operation in these challenging situations and environments.

"I believe that eye-tracking is one of the last remaining critical interaction mechanisms that will enable digital eyewear to be intuitive for end users and SMI is the clear leader in this space," says Pete Wassell, CEO and Co-Founder of Augmate.

Professor Dr. Thomas Schack, who leads the research group "Neurocognition and Action - Biomechanics" (NCA) at the Cluster of Excellence Cognitive Interaction Technology (CITEC) at Bielefeld University, says: "SMI's AR eye tracking is the technology platform that helps us realize a unique research project at CITEC, funded by the German Federal Ministry for Education and Research. ADAMAAS focuses on the development of a mobile adaptive assistance system in the form of intelligent glasses."

"SMI Eye Tracking for Augmented Reality Glasses completes our broad range of innovative eye tracking solutions for researchers and developers," says Eberhard Schmidt, Managing Director at SMI. "Many AR applications need and will greatly benefit from eye tracking for visualization and interaction purposes. With this latest addition, our eye tracking platform spans all relevant display categories from desktop over handheld and mobile to HMD and Smartglasses for VR and AR. According to market researchers, the AR smartglass market will grow tremendously to 100 million units in 2018 and it will be as big as the smartphone market in 10 years. We think that eye tracking will be of critical importance to equip AR devices and applications with the high-quality user experience to reach such numbers."

The SMI AR Eye Tracking upgrade package to the Moverio BT-200 comes with an SDK for easy creation of and integration with applications for the rapidly growing smart glasses and augmented reality applications market. Among other functionalities, the SMI SDK allows access to real-time streaming of gaze data synchronized with the AR display content.
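SMI does not document its SDK interface in this release, so the following Python sketch only illustrates the general pattern behind such an API: consuming timestamped gaze samples and mapping normalized gaze coordinates onto display pixels. Every class, field and function name here is invented for illustration and does not reflect the actual SMI SDK.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    """One hypothetical sample from a real-time gaze stream."""
    timestamp_us: int   # capture time in microseconds
    x: float            # normalized horizontal gaze position, 0.0 to 1.0
    y: float            # normalized vertical gaze position, 0.0 to 1.0

def to_display_pixels(sample, width=960, height=540):
    """Map a normalized gaze sample onto display coordinates
    (the Moverio BT-200 projects a 960x540 image per eye)."""
    px = int(sample.x * (width - 1))
    py = int(sample.y * (height - 1))
    return px, py

# A synchronized stream would deliver samples alongside display frames;
# here we simulate two samples roughly one 60 Hz frame apart.
stream = [GazeSample(0, 0.5, 0.5), GazeSample(16_667, 0.25, 0.75)]
for s in stream:
    print(s.timestamp_us, to_display_pixels(s))
```

An application would use such pixel coordinates to decide which on-screen element the user is looking at and reveal the matching on-demand content.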

- Cross reference: Picture is available at AP Images (http://www.apimages.com)

About SMI

SensoMotoric Instruments (SMI) is a world leader in eye tracking technology, developing and marketing eye & gaze tracking solutions for scientists and professionals, OEM and medical solutions for a wide range of applications. Find out more at www.smivision.com. Follow @SMIeyetracking on Facebook, Flickr, YouTube and Twitter.

News Release Source : SMI Shows First Eye Tracking Upgrade for Augmented Reality Glasses
Image Credit : SensoMotoric Instruments (SMI)