Google Patent | Gesture Component with Gesture Library

Patent: Gesture Component with Gesture Library

Publication Number: 20190011989

Publication Date: 2019-01-10

Applicants: Google

Abstract

A gesture component with a gesture library is described. The gesture component is configured to expose operations for execution by application of a computing device based on detected gestures. In one example, an input is detected using a three dimensional object detection system of a gesture component of the computing device. A gesture is recognized by the gesture component based on the detected input through comparison with a library of gestures maintained by the gesture component. An operation is then recognized that corresponds to the gesture by the gesture component using the library of gestures. The operation is exposed by the gesture component via an application programming interface to at least one application executed by the computing device to control performance of the operation by the at least one application.

Background

This background description is provided for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, material described in this section is neither expressly nor impliedly admitted to be prior art to the present disclosure or the appended claims.

Gestures continue to evolve as one of the primary ways in which users interact with an ever-increasing variety of computing devices. For example, gesture detection has expanded from trackpads to touchscreen functionality and even to detection of objects and motion in three dimensions. As the ways in which gestures may be detected have expanded, so too has use of these gestures expanded across an ever-increasing variety of computing devices, from laptops to mobile phones, tablets, televisions, game consoles, and other devices as part of the “Internet of Things.”

However, this implementation by conventional computing devices lacks consistency both in which gestures are supported and in which operations of the computing devices are controlled by these gestures. For example, a gesture usable to control volume on a television may differ from a gesture usable to control volume on a game console. As a result, users may have difficulty determining which operations of a computing device are controlled with which gestures. This is especially problematic due to the very nature of gestures. For example, it may be difficult for a user to determine which gestures are available to control operations across this variety of computing devices, as opposed to conventional input devices that use buttons and other hardware to initiate operations that may be readily identified on the button itself.

Summary

This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter.

A gesture component with a gesture library is described. The gesture component is configured to expose operations for execution by application of a computing device based on detected gestures. In one example, an input is detected using a three dimensional object detection system of a gesture component of the computing device. A gesture is recognized by the gesture component based on the detected input through comparison with a library of gestures maintained by the gesture component. An operation is then recognized that corresponds to the gesture by the gesture component using the library of gestures. The operation is exposed by the gesture component via an application programming interface to at least one application executed by the computing device to control performance of the operation by the at least one application.
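The flow described above can be sketched in code: a gesture library maps stored gesture templates to operations, and a gesture component matches detected input against that library and exposes the resulting operation to subscribing applications. This is an illustrative sketch only; all class names, the signature representation, and the nearest-template matching are assumptions, since the publication does not specify an implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional

# Hypothetical sketch of the described pipeline: detect input,
# recognize a gesture via a library of templates, resolve the
# corresponding operation, and expose it to applications via an API.

@dataclass
class Gesture:
    name: str
    template: List[float]  # simplified motion signature (assumed form)

class GestureLibrary:
    def __init__(self) -> None:
        self._gestures: Dict[str, Gesture] = {}
        self._operations: Dict[str, str] = {}  # gesture name -> operation id

    def register(self, gesture: Gesture, operation: str) -> None:
        self._gestures[gesture.name] = gesture
        self._operations[gesture.name] = operation

    def match(self, signature: List[float], tolerance: float = 0.5) -> Optional[str]:
        # Illustrative nearest-template comparison; the patent only says
        # the input is compared against the library, not how.
        best, best_dist = None, tolerance
        for g in self._gestures.values():
            dist = sum(abs(a - b) for a, b in zip(g.template, signature)) / len(signature)
            if dist < best_dist:
                best, best_dist = g.name, dist
        return best

    def operation_for(self, gesture_name: str) -> str:
        return self._operations[gesture_name]

class GestureComponent:
    """Exposes operations to applications through a callback-style API."""

    def __init__(self, library: GestureLibrary) -> None:
        self.library = library
        self._subscribers: Dict[str, List[Callable[[], None]]] = {}

    def subscribe(self, operation: str, handler: Callable[[], None]) -> None:
        # An application registers interest in an operation.
        self._subscribers.setdefault(operation, []).append(handler)

    def on_input_detected(self, signature: List[float]) -> None:
        # Input from the (assumed) 3D object detection system arrives here.
        gesture = self.library.match(signature)
        if gesture is None:
            return
        operation = self.library.operation_for(gesture)
        for handler in self._subscribers.get(operation, []):
            handler()

# Usage: one application subscribes to "volume_down"; a detected input
# close to the registered "pinch" template triggers it.
lib = GestureLibrary()
lib.register(Gesture("pinch", [0.1, 0.2, 0.3]), "volume_down")
component = GestureComponent(lib)
events: List[str] = []
component.subscribe("volume_down", lambda: events.append("volume_down"))
component.on_input_detected([0.12, 0.21, 0.28])
print(events)  # ['volume_down']
```

Routing through operation identifiers rather than gesture names lets multiple gestures map to one operation and keeps applications insulated from how gestures are recognized, which matches the exposure-via-API framing in the summary.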
