WO2012120520A1 - Interaction gestuelle - Google Patents
- Publication number
- WO2012120520A1 (PCT application PCT/IN2011/000136)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- command
- gearshift
- user interface
- interactive
- graphical user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Definitions
- Gestural interactive systems are computing systems that sense the movements of a user and translate those movements into commands; the commands then cause an attached computing device to execute code based on the command.
- These gestural interactive systems may be used in situations wherein a user may desire a more intimate, efficient, and meaningful interface with a computing device, and, more specifically, within gaming, media playback and browsing, and remote control applications, for example.
- the graphical user interfaces are designed for productivity applications that the user must concentrate on and excel in using in order to properly perform desired tasks. Further, in using these graphical user interfaces, the goal is to memorize the gestures rather than to rely on a teaching system. Thus, with these types of GUIs, a novice user cannot utilize the gestural interactive system because he or she is unfamiliar with the techniques and gestures used to control the system through the GUI.
- FIG. 1 is a diagram of a gestural interactive system, according to one example of the principles described herein.
- Fig. 2 is a diagram of the computing device utilized within the gestural interactive system of Fig. 1, according to one example of the principles described herein.
- FIG. 3 is a diagram of a gearshift graphical user interface (GUI) processed by a processor and displayed on a display device, according to one example of the principles described herein.
- FIG. 4 is a diagram of a gearshift graphical user interface (GUI) in which a user initiates selection of a play/pause command, according to one example of the principles described herein.
- FIG. 5 is a diagram of a gearshift graphical user interface (GUI) in which a user completes the selection of a play/pause command, according to one example of the principles described herein.
- FIG. 6 is a diagram of a gearshift graphical user interface (GUI) in which a user initiates selection of a fast forward command, according to one example of the principles described herein.
- Fig. 7 is a diagram of the gearshift graphical user interface (GUI) of Fig. 6 in which a user initiates selection of a fast forward command but partially deviates from a channel in the gearshift graphical user interface (GUI), according to one example of the principles described herein.
- Fig. 8 is a diagram of the gearshift graphical user interface (GUI) of Fig. 6 in which a user initiates selection of a fast forward command, but fully deviates from a channel in the gearshift GUI, and terminates the selection of the fast forward command, according to one example of the principles described herein.
- Fig. 9 is a diagram of the gearshift graphical user interface (GUI) of Fig. 6 in which a cursor returns to a neutral position after the user fully deviates from a channel in the gearshift graphical user interface (GUI) and terminates the selection of the fast forward command, according to one example of the principles described herein.
- Fig. 10 is a flowchart showing an exemplary method of browsing media using single stroke gestural interaction, according to one example of the principles described herein.
- identical reference numbers designate similar, but not necessarily identical, elements.
- An individual unfamiliar with a gestural interactive system may find it difficult to interact with the system because the gestural interactive system has a particular mode of interaction, such as, for example, specific gestures to be performed to induce certain commands and a method by which the user is to interact with the system, of which the user may not have foreknowledge.
- utilization of a familiar metaphor such as, for example, a gearshift graphical user interface provides a user with a familiar setting in which the user may instruct the gestural interactive system to perform a command.
- the graphical user interface is analogized with shifting gears in an automobile; an action generally familiar to all users.
- the gestural interactive system should make two things immediately understandable to a user in one visualization: all available commands, and how to perform a given command.
- In contrast to the systems and methods of the present application, other menuing systems are not immediately understandable to a novice user.
- the gearshift graphical user interface and an associated sensor and display device of the present application simultaneously provide for all available commands to be displayed to a user. Further, the user can instinctively know or quickly learn how to perform a given command based on the displayed gearshift graphical user interface.
- gestural interactive system is meant to be understood broadly as any system that interprets and utilizes the gestures of a user to command a processor to execute code.
- Some examples in which a gestural interactive system is used may comprise computer gaming systems, computing systems in which a mouse and/or keyboard is replaced with gesture interaction, remote control of media devices such as televisions and media playback devices, and robotics in which the gesture of a user is used to control a robotic device, among others.
- the terms "gearshift graphical user interface” or “gearshift GUI” are meant to be understood broadly as any graphical user interface that utilizes a single action to interact with a gestural interactive system.
- the graphical user interface may be presented to a user on a display device in the form of a gearshift pattern in which the terminals of the gearshift pattern represent a number of functions to be performed via the graphical user interface.
- the term "a number of" or similar language is meant to be understood broadly as any positive number comprising 1 to infinity; zero not being a number, but the absence of a number.
- Fig. 1 is a diagram of a gestural interactive system (100), according to one example of the principles described herein.
- the gestural interactive system (100) with which a number of users (120) interact may comprise a computing device (115), a display device (105) communicatively coupled to the computing device (115), and a sensor (110) communicatively coupled to the computing device (115).
- the display device (105) may be any device from which the user (120) receives visual feedback while operating the gestural interactive system (100).
- the display (105) may be a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light emitting diode (OLED) display, a plasma display panel, a television, a computer monitor, a high-definition television, a cathode ray tube (CRT) display, or a display that utilizes a projector system, among others.
- the sensor (110) of Fig. 1 may be any device that detects the presence and motion of a number of users (120).
- the sensor (110) comprises an imaging device that captures full-body three dimensional (3D) motion of each user (120).
- the sensor (110) comprises a sensor bar that detects the presence and motion of a number of handheld remotes with which the users (120) interact with the gestural interactive system (100).
- the sensor (110) comprises a sensor bar that detects the presence and motion of a number of glove remotes with which the users (120) interact with the gestural interactive system (100).
- the computing device (115), the display device (105), and the sensor (110) are separate devices communicatively coupled to each other.
- the principles set forth in the present specification extend equally to any alternative configuration in which the computing device (115), the display device (105), and the sensor (110) are configured as one device, or as two devices with one device comprising one of these elements and the other device comprising two of these elements.
- other examples include configurations in which the computing device (115), the display device (105), and the sensor (110) are implemented by the same computing device; in which the functionality of the computing device (115) is implemented by multiple interconnected computers, for example, a server in a data center and a user's client machine; and in which the computing device (115), the display device (105), and the sensor (110) communicate directly through a bus without intermediary network devices.
- the gestural interactive system (100) further comprises a computing device (115).
- the computing device (115) will now be described in connection with Fig. 2.
- the computing device (115) of the present example determines the selection of a command such as, for example, a playback function by the user based on the data or information detected by the sensor (110). In the present example, this is accomplished by the computing device (115) receiving data from the sensor (110), determining the position of a cursor within a gearshift graphical user interface (GUI), and, based on the position of the cursor, executing the function or command.
- the computing device (115) includes various hardware components. These hardware components may comprise a processor (125), a number of data storage devices (130), and peripheral device adapters (135), among others. These hardware components may be interconnected through the use of one or more busses and/or network connections. In one example, the processor (125), data storage device (130), and peripheral device adapters (135) are communicatively coupled via bus (107).
- the processor (125) may include the hardware architecture for retrieving executable code from the data storage device (130) and executing the executable code.
- the executable code, when executed by the processor (125), causes the processor (125) to implement at least the functionality of determining the selection of a command by the user (120) based on the data or information detected by the sensor (110), and executing that command as described herein.
- the processor (125) may receive input from and provide output to one or more of the remaining hardware units.
- the computing device (115), and, specifically, the processor (125) receives data from the sensor (110); the data being indicative of the position of a cursor relative to the gearshift GUI.
- the sensor (110) captures the position of a hand of a user or a handheld remote used by the user (120) relative to the gearshift GUI displayed on the display device (105).
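The mapping from a sensed hand position to cursor coordinates within the gearshift GUI might be sketched as follows. This is a minimal illustration rather than the patent's implementation; the sensor frame and GUI dimensions are assumptions.

```python
# Hypothetical sketch: scaling a hand position captured in the sensor's frame
# to cursor coordinates in the gearshift GUI. Frame and GUI sizes are
# illustrative assumptions, not values from the specification.

def hand_to_cursor(hand_x, hand_y, sensor_w=640, sensor_h=480,
                   gui_w=800, gui_h=600):
    """Scale a hand position in the sensor frame to GUI coordinates."""
    cursor_x = hand_x / sensor_w * gui_w
    cursor_y = hand_y / sensor_h * gui_h
    return cursor_x, cursor_y
```

A hand at the center of the sensor frame then maps to the center of the GUI, which would correspond to the neutral position of the cursor.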
- the data storage device (130) may store data such as executable code as discussed above. This executable code is processed and produced by the processor (125).
- the data storage device (130) may include various types of memory devices, including volatile and nonvolatile memory.
- the data storage device (130) of the present example includes Random Access Memory (RAM), Read Only Memory (ROM), and Hard Disk Drive (HDD) memory, among others.
- the present specification contemplates the use of many varying types of memory in the data storage device (130) as may suit a particular application of the principles described herein.
- different types of memory in the data storage device (130) may be used for different data storage needs.
- the processor (125) may boot from Read Only Memory (ROM), maintain nonvolatile storage in the Hard Disk Drive (HDD) memory, and execute program code stored in Random Access Memory (RAM).
- the data storage device (130) may comprise a computer readable storage medium.
- the data storage device (130) may be, but is not limited to, an electronic, magnetic, or optical storage medium.
- a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- peripheral device adapters (135) in the computing device (115) enable the processor (125) to interface with various other hardware elements, external and internal to the computing device (115).
- peripheral device adapters (135) may provide an interface to input/output devices, such as, for example, the display device (105) and the sensor (110), a keyboard, and a mouse, among others, to create a user interface and/or access external sources of memory storage, for example.
- Fig. 3 is a diagram of a gearshift graphical user interface (GUI) (300) processed by the processor (125) and displayed on the display device (105).
- the gearshift GUI (300) may comprise a number of channels (302).
- the channels comprise a number of terminals (304) that terminate the channels (302).
- the gearshift GUI (300) further comprises a number of commands (305 through 330) that instruct the gestural interactive system (100) to perform those commands, respectively.
- the commands (305 through 330) are located juxtaposed to a respective terminal (304), indicating to a user that if such a terminal (304) is selected, then the gestural interactive system (100) will perform the command juxtaposed to that terminal (304).
- the commands may comprise any function of the gestural interactive system (100).
- the commands comprise the following media playback and browsing commands: stop (305), pause/play (310), fast forward (325), rewind (330), next (song/photo/chapter) (320), and previous (song/photo/chapter) (315), among others.
- the commands (305 through 330) will cause the gestural interactive system (100) to affect the playback of various forms of media including, for example, music, video, and pictures and the browsing of these forms of media.
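The layout described above (a neutral position, channels radiating to terminals, and a command bound to each terminal) might be represented as follows. The coordinates, channel names, and hit tolerance are illustrative assumptions; the specification does not prescribe a data structure or a geometry.

```python
# Hypothetical sketch of the gearshift GUI layout: a neutral position, and a
# command at the terminal of each channel. All coordinates and the tolerance
# are assumptions for illustration only.

NEUTRAL = (0.0, 0.0)

# channel name -> (terminal position, command)
CHANNELS = {
    "up":       ((0.0, 1.0), "stop"),
    "down":     ((0.0, -1.0), "pause/play"),
    "right":    ((1.0, 0.0), "fast forward"),
    "left":     ((-1.0, 0.0), "rewind"),
    "up-right": ((0.7, 0.7), "next"),
    "up-left":  ((-0.7, 0.7), "previous"),
}

def command_at_terminal(cursor, tolerance=0.1):
    """Return the command whose terminal the cursor has reached, if any."""
    for terminal, command in CHANNELS.values():
        if (abs(cursor[0] - terminal[0]) <= tolerance
                and abs(cursor[1] - terminal[1]) <= tolerance):
            return command
    return None
```

With such a table, rendering the GUI and testing for a selection reduce to iterating over the channel entries.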
- the gearshift GUI (300) further comprises a cursor (340).
- the cursor (340) is a circle.
- the cursor (340) may be represented as any shape.
- the cursor (340) provides feedback to the user regarding the position of the user's hand with respect to the sensor, display device, or a combination of these.
- the cursor (340) is visible within the displayed gearshift GUI (300) when the user's hand or controlling device such as a handheld remote is visible to the gestural interactive system (100), and, specifically, the sensor (110).
- the cursor (340) is visible within the displayed gearshift GUI (300) when the user is interacting with the gestural interactive system (100). In this example, the cursor (340) appears when the user moves his or her hand or controlling device.
- the cursor (340) is located in a neutral position (Fig. 4, 303). While in the neutral position (Fig. 4, 303), the cursor (340) does not indicate the selection of a command (305 through 330). In the physical world, this is equivalent to the user (120) not gesturing in any direction, or not interacting with the gestural interactive system (100).
- the cursor (340) is presented as a relatively larger rendering of the cursor (340) when in the neutral position (Fig. 4, 303) with respect to when the cursor (340) is not in the neutral position (Fig. 4, 303).
- the cursor (340) is presented as a relatively smaller rendering of the cursor when not in the neutral position (Fig. 4, 303).
- Figs. 4 and 5 show a diagram of a gearshift graphical user interface (GUI) (300) in which a user selects a pause/play command (310).
- the user (120) begins to move his or her hand, for example, in a downward direction.
- the sensor (110) identifies and tracks the movement of the user's hand
- the processor (125) and display device (105) provide feedback within the gearshift GUI (300) displayed on the display device (105) by moving the cursor (340) in a downward direction relative to the movement of the user's hand.
- the processor (125) of the gestural interactive system (100) receives this input and executes the pause/play command (310).
- the gestural interactive system (100) provides for a single action or gesture to be used in interacting with the gestural interactive system to implement a command.
- FIG. 6 is a diagram of a gearshift graphical user interface (GUI) (300) in which a user (120) initiates selection of a fast forward command (325), according to one example of the principles described herein.
- the user (120) moves his or her hand or handheld device to the right in order to select the fast forward command (325) on the gearshift GUI (300).
- the display device (105) displays the movement of the cursor (340) to the right and towards the fast forward command (325) of the gearshift GUI (300).
- the user (120) receives visual feedback from the gearshift GUI (300) that the cursor (340) is in the channel (302) that bridges the neutral position (303) and the terminal (304) juxtaposed to the fast forward command (325).
- Fig. 7 is a diagram of the gearshift graphical user interface (GUI) (300) of Fig. 6 in which a user (120) initiates selection of a fast forward command (325) but partially deviates from the channel (302) that bridges the neutral position (303) and the terminal (304) juxtaposed to the fast forward command (325).
- if the cursor (340) deviates from a channel (302) of the gearshift GUI (300) to a predetermined degree, then the cursor (340) returns to the neutral position (303).
- the user may cancel a gesture before a command (305 through 330) is selected by causing the cursor (340) to move perpendicular or approximately perpendicular to a channel (302). This method of canceling a gesture is described in more detail in connection with Figs. 8 and 9.
- Fig. 8 is a diagram of the gearshift graphical user interface (GUI) (300) of Fig. 6 in which a user (120) initiates selection of a fast forward command (325), but fully deviates from a channel (302) in the gearshift GUI (300), and terminates the selection of the fast forward command (325), according to one example of the principles described herein.
- the cursor (340) is moved substantially away from the channel (302) that bridges the neutral position (303) and the terminal (304) juxtaposed to the fast forward command (325). In this manner, the gesture that the user (120) initiated is now canceled, and the fast forward command (325) is not selected.
- full deviation of the cursor (340) from a channel (302) causes the cancelation of a gesture.
- in another example, if a portion of the cursor (340) deviates from a channel (302), then the gesture is canceled.
- the amount of deviation of the cursor (340) from a channel (302) that implements a cancelation of the gesture is determined by a user (120) as a user definable parameter.
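The cancelation rule described above might be sketched as a perpendicular-distance test against a user-definable threshold. The geometry and the threshold value are assumptions for illustration; the specification leaves the "predetermined degree" of deviation open.

```python
# Hypothetical sketch: the gesture is canceled when the cursor's perpendicular
# deviation from the channel (the line from the neutral position to a terminal)
# exceeds a threshold. The threshold value is an illustrative assumption and,
# as the description notes, could be a user-definable parameter.

import math

def perpendicular_deviation(cursor, neutral, terminal):
    """Distance from the cursor to the line running from neutral to terminal."""
    nx, ny = neutral
    tx, ty = terminal
    cx, cy = cursor
    dx, dy = tx - nx, ty - ny
    length = math.hypot(dx, dy)
    # cross-product magnitude divided by the channel length gives the distance
    return abs(dx * (cy - ny) - dy * (cx - nx)) / length

def should_cancel(cursor, neutral, terminal, threshold=0.25):
    return perpendicular_deviation(cursor, neutral, terminal) > threshold
```

Moving perpendicular to the channel grows this deviation fastest, which matches the described cancelation gesture.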
- Fig. 10 is a flowchart showing an exemplary method of browsing media using single stroke gestural interaction, according to one example of the principles described herein.
- the method of Fig. 10 may begin by the processor (125) displaying a gearshift GUI (300) to a user (120) on the display device (105) (block 1005).
- the imaging device (110) images the scene including the user (120) to sense the movements of the user (120) (block 1010).
- the movements of the user (120) are transmitted to the processor (125).
- the movements of the user are then translated into movement of the cursor (340), and displayed as feedback to the user (120) on the display device (105) (block 1020).
- the processor (125) continually monitors the user's movements and implements the display of these movements on the display device (105).
- the processor (125) determines, based on the imaged movements of the user (120) by the imaging device (110), if the movements are perpendicular or approximately perpendicular to a channel (302) of the gearshift GUI (300) (block 1025) indicating the user's instructions to cancel the gesture.
- if the processor (125) determines that the movements are perpendicular or approximately perpendicular to a channel (302) of the gearshift GUI (300) (block 1025, Determination YES), then the gesture is canceled at block 1030. If, however, the processor (125) determines that the movements are not perpendicular or approximately perpendicular to a channel (302) of the gearshift GUI (300) (block 1025, Determination NO), then the processor continues to monitor whether the movements are perpendicular or approximately perpendicular to a channel (302) (block 1035).
- block 1035 may comprise continually performing the method of block 1025.
- the processor (125) executes the command (305 through 330). In this manner, the user (120) can interact with the gestural interactive system (100) to select a playback function, for example.
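The flow of Fig. 10 (blocks 1005 through 1035) can be sketched as a loop over sensed cursor positions. This is a simplified, hypothetical rendering: the terminal coordinates, deviation threshold, and hit radius are assumptions, and the sensor and display hardware are replaced by a plain list of positions.

```python
# Hypothetical sketch of the method of Fig. 10: for each sensed cursor
# position, treat perpendicular deviation from the nearest channel as a
# cancelation (blocks 1025/1030), and reaching a terminal as selection of
# the corresponding command. Coordinates and thresholds are assumptions.

import math

TERMINALS = {"fast forward": (1.0, 0.0), "rewind": (-1.0, 0.0)}
DEVIATION_THRESHOLD = 0.25   # analogous to the user-definable parameter
HIT_RADIUS = 0.1

def nearest_channel(cursor):
    """Pick the channel (by terminal) the cursor is currently traveling toward."""
    return min(TERMINALS, key=lambda name: math.dist(cursor, TERMINALS[name]))

def deviation_from_channel(cursor, terminal):
    """Perpendicular distance from the line running from neutral to terminal."""
    tx, ty = terminal
    length = math.hypot(tx, ty)  # channel runs from neutral (0, 0) to terminal
    return abs(tx * cursor[1] - ty * cursor[0]) / length

def run_gesture_loop(cursor_positions):
    for cursor in cursor_positions:          # blocks 1010/1020: sensed cursor
        channel = nearest_channel(cursor)
        if deviation_from_channel(cursor, TERMINALS[channel]) > DEVIATION_THRESHOLD:
            continue                         # blocks 1025/1030: gesture canceled
        if math.dist(cursor, TERMINALS[channel]) < HIT_RADIUS:
            return channel                   # terminal reached: execute command
    return None
```

A straight stroke toward a terminal selects its command, while a stroke that strays perpendicular to the channel is ignored, mirroring the cancelation behavior of blocks 1025 and 1030.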
- the above-described system and method, although described in the context of media playback and browsing, may be applied in any command scenario.
- the above described gearshift GUI (300) and associated method of utilizing the gearshift GUI may be applied in any environment wherein a user (120) interactively selects commands via a gestural interactive system (100).
- the above-described system and method may be used in connection with computer gaming environments, computing systems in which a mouse and/or keyboard is replaced with gesture interaction, remote control of media devices such as televisions, and robotics in which the gesture of a user is used to control a robotic device, among others.
- a gearshift graphical user interface (300) is presented to a user (120) on a display device (105) that provides intuitive know-how regarding the operation thereof.
- the gearshift user interface (300) comprises a number of interactive commands (305 through 330).
- a sensor (110) detects the position of a user's hand, and the position of the user's hand relative to the command (305 through 330) of the gearshift user interface (300) is represented on the gearshift graphical user interface (300). If the representation of the user's hand indicates selection of a command (305 through 330), then a processor (125) implements the selected command (305 through 330).
- This single stroke gestural interaction may have a number of advantages, including the following: (1) the entire set of commands understood by the gestural interactive system, and instructions for performing them, are always visible to the user; (2) the gearshift GUI is operated by tracking the coarse position of a single hand; (3) feedback about hand position is provided constantly, and it is possible to cancel out of a gesture at any time simply by moving perpendicular to the channel or groove; and (4) the system is more robust with respect to user and system errors.
- Regarding the fourth advantage: if a user moves his hand to the left, for example, but does not move it sufficiently upwards, he may end up rewinding the movie rather than going to the previous chapter. This contrasts with having distinct gestures for the different commands, where the consequences of user or system errors may be unpredictable.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A method of gestural interaction comprises presenting a gearshift graphical user interface on a display device, the gearshift graphical user interface comprising a number of interactive commands; detecting, with a sensor, the position of a user's hand relative to the graphical user interface; representing on the gearshift graphical user interface, with a cursor, the position of the user's hand relative to the interactive commands; and, if the cursor indicates the selection of a command, then executing, with a processor, the selected command.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/IN2011/000136 WO2012120520A1 (fr) | 2011-03-04 | 2011-03-04 | Interaction gestuelle |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/IN2011/000136 WO2012120520A1 (fr) | 2011-03-04 | 2011-03-04 | Interaction gestuelle |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012120520A1 true WO2012120520A1 (fr) | 2012-09-13 |
Family
ID=46797561
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IN2011/000136 Ceased WO2012120520A1 (fr) | 2011-03-04 | 2011-03-04 | Interaction gestuelle |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2012120520A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103019562A (zh) * | 2012-12-07 | 2013-04-03 | 东莞宇龙通信科技有限公司 | 终端和控制托盘配置方法 |
| GB2513200A (en) * | 2013-04-21 | 2014-10-22 | Biogaming Ltd | Kinetic user interface |
| CN104536556A (zh) * | 2014-09-15 | 2015-04-22 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070046643A1 (en) * | 2004-08-06 | 2007-03-01 | Hillis W Daniel | State-Based Approach to Gesture Identification |
| US20090102800A1 (en) * | 2007-10-17 | 2009-04-23 | Smart Technologies Inc. | Interactive input system, controller therefor and method of controlling an appliance |
| CN101621609A (zh) * | 2008-07-03 | 2010-01-06 | 深圳华为通信技术有限公司 | 一种在机顶盒中进行操作的方法、系统及机顶盒 |
| CN201689407U (zh) * | 2010-03-31 | 2010-12-29 | 北京播思软件技术有限公司 | 一种利用用户手势替代终端设备退出键和确认键的装置 |
| CN101943947A (zh) * | 2010-09-27 | 2011-01-12 | 鸿富锦精密工业(深圳)有限公司 | 交互显示系统 |
- 2011-03-04: WO PCT/IN2011/000136 patent/WO2012120520A1 (not active: ceased)
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070046643A1 (en) * | 2004-08-06 | 2007-03-01 | Hillis W Daniel | State-Based Approach to Gesture Identification |
| US20090102800A1 (en) * | 2007-10-17 | 2009-04-23 | Smart Technologies Inc. | Interactive input system, controller therefor and method of controlling an appliance |
| CN101621609A (zh) * | 2008-07-03 | 2010-01-06 | 深圳华为通信技术有限公司 | 一种在机顶盒中进行操作的方法、系统及机顶盒 |
| CN201689407U (zh) * | 2010-03-31 | 2010-12-29 | 北京播思软件技术有限公司 | 一种利用用户手势替代终端设备退出键和确认键的装置 |
| CN101943947A (zh) * | 2010-09-27 | 2011-01-12 | 鸿富锦精密工业(深圳)有限公司 | 交互显示系统 |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103019562A (zh) * | 2012-12-07 | 2013-04-03 | 东莞宇龙通信科技有限公司 | 终端和控制托盘配置方法 |
| CN103019562B (zh) * | 2012-12-07 | 2016-04-06 | 东莞宇龙通信科技有限公司 | 终端和控制托盘配置方法 |
| GB2513200A (en) * | 2013-04-21 | 2014-10-22 | Biogaming Ltd | Kinetic user interface |
| CN104536556A (zh) * | 2014-09-15 | 2015-04-22 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9009594B2 (en) | Content gestures | |
| US10261742B2 (en) | Visual focus-based control of couples displays | |
| US9836146B2 (en) | Method of controlling virtual object or view point on two dimensional interactive display | |
| Forlines et al. | Hybridpointing: fluid switching between absolute and relative pointing with a direct input device | |
| TWI553540B (zh) | 突顯顯示器上之物件 | |
| US8866781B2 (en) | Contactless gesture-based control method and apparatus | |
| US9041649B2 (en) | Coordinate determination apparatus, coordinate determination method, and coordinate determination program | |
| EP2656193B1 (fr) | Interface de lancement d'applications pour de multiples modes | |
| US9007299B2 (en) | Motion control used as controlling device | |
| US20220326967A1 (en) | Devices, methods, systems, and media for an extended screen distributed user interface in augmented reality | |
| US20130061180A1 (en) | Adjusting a setting with a single motion | |
| US8605219B2 (en) | Techniques for implementing a cursor for televisions | |
| US20140178047A1 (en) | Gesture drive playback control for chromeless media players | |
| US20130314320A1 (en) | Method of controlling three-dimensional virtual cursor by using portable electronic device | |
| US20120229392A1 (en) | Input processing apparatus, input processing method, and program | |
| US8769409B2 (en) | Systems and methods for improving object detection | |
| CN106796351A (zh) | 通过视线控制的头戴式显示装置及其控制方法、用于控制该装置的计算机程序 | |
| JP7252252B2 (ja) | 手の位置に基づいたモーダル制御の開始 | |
| Ryu et al. | GG Interaction: a gaze–grasp pose interaction for 3D virtual object selection | |
| CN104685461A (zh) | 使用来自被控制的设备的输入模式数据的输入设备 | |
| EP2960763A1 (fr) | Systèmes et procédés informatisés de montage en cascade des animations d'élément d'interface utilisateur | |
| CN104182035A (zh) | 一种操控电视应用程序的方法和系统 | |
| WO2012120520A1 (fr) | Interaction gestuelle | |
| Suay et al. | Humanoid robot control using depth camera | |
| EP2698697A2 (fr) | Procédé de recherche d'emplacement de lecture d'applications multimédia et son dispositif électronique |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11860295 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 11860295 Country of ref document: EP Kind code of ref document: A1 |