MY147059A - Apparatus and method for multiple-touch spatial sensors - Google Patents

Apparatus and method for multiple-touch spatial sensors

Info

Publication number
MY147059A
MY147059A (application MYPI20072085A)
Authority
MY
Malaysia
Prior art keywords
spatial
image data
dimensional
cameras
sensors
Prior art date
Application number
MYPI20072085A
Inventor
Hon Hock Woon
Tan Shern Shiou
Original Assignee
Mimos Berhad
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
2007-11-23
Publication date
2012-10-15
Application filed by Mimos Berhad filed Critical Mimos Berhad
Priority to MYPI20072085A priority Critical patent/MY147059A/en
Priority to PCT/MY2008/000164 priority patent/WO2009066998A2/en
Publication of MY147059A publication Critical patent/MY147059A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/366 Image reproducers using viewer tracking
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to an apparatus and a method for multiple-touch three-dimensional contactless control for spatial sensing. The apparatus comprises two cameras (101, 102) having spatial sensors to capture an object's position in the form of an image, a register for registering the spatial directions sensed by the spatial sensors, a data processing unit, and a computer for computing object point derivation and blob analysis. The method for multiple-touch three-dimensional contactless control for spatial sensing comprises the steps of: positioning first and second cameras (101, 102) and capturing image data of an object (107) with the cameras (101, 102); passing the captured image data of the object (107) through background and foreground segmentation using an image processing function; determining spatial positions of each of the image data of the object (107) and deriving a three-dimensional spatial position of the point; and processing the captured image data through blob analysis.
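The abstract describes a three-stage stereo pipeline: background/foreground segmentation, blob analysis to locate each touch point in each camera image, and derivation of the point's three-dimensional position from the two views. The patent text gives no implementation, so the sketch below is only a minimal illustration of those steps; the threshold, the function names, and the camera parameters (`focal_px`, `baseline_mm`) are assumptions chosen here, and the depth step assumes the simplest rectified-stereo model, Z = f·B / disparity.

```python
from collections import deque

def segment_foreground(frame, background, thresh=30):
    """Background/foreground segmentation: a pixel is foreground when it
    differs from the reference background image by more than `thresh`."""
    return [[abs(p - b) > thresh for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

def blob_centroids(mask):
    """Blob analysis via 4-connected component labelling; returns the
    (row, col) centroid of each foreground blob."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not seen[r][c]:
                # Breadth-first flood fill collects one connected blob.
                pixels, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids

def triangulate_depth(x_left, x_right, focal_px, baseline_mm):
    """Rectified-stereo depth of a matched point: Z = f * B / disparity."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    return focal_px * baseline_mm / disparity
```

For example, a point seen at column 120 in the left image and column 100 in the right image, with an assumed 800 px focal length and 60 mm baseline, triangulates to 800 × 60 / 20 = 2400 mm. A real system would instead use calibrated camera matrices and a robust segmentation model, but the data flow matches the steps named in the abstract.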

Priority Applications (2)

Application Number Priority Date Filing Date Title
MYPI20072085A MY147059A (en) 2007-11-23 2007-11-23 Apparatus and method for multiple-touch spatial sensors
PCT/MY2008/000164 WO2009066998A2 (en) 2007-11-23 2008-11-24 Apparatus and method for multiple-touch spatial sensors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
MYPI20072085A MY147059A (en) 2007-11-23 2007-11-23 Apparatus and method for multiple-touch spatial sensors

Publications (1)

Publication Number Publication Date
MY147059A 2012-10-15

Family

ID: 40668031

Family Applications (1)

Application Number Title Priority Date Filing Date
MYPI20072085A MY147059A (en) 2007-11-23 2007-11-23 Apparatus and method for multiple-touch spatial sensors

Country Status (2)

Country Link
MY (1) MY147059A (en)
WO (1) WO2009066998A2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2947348B1 (en) * 2009-06-25 2011-08-19 Immersion DEVICE FOR HANDLING AND VISUALIZING A VIRTUAL OBJECT
CN107945172A * 2017-12-08 2018-04-20 博众精工科技股份有限公司 Character detection method and system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3795647B2 (en) * 1997-10-29 2006-07-12 株式会社竹中工務店 Hand pointing device
US7242388B2 (en) * 2001-01-08 2007-07-10 Vkb Inc. Data input device
US7277599B2 (en) * 2002-09-23 2007-10-02 Regents Of The University Of Minnesota System and method for three-dimensional video imaging using a single camera
KR100815159B1 (en) * 2005-12-08 2008-03-19 한국전자통신연구원 3D input apparatus by hand tracking using multiple cameras and its method

Also Published As

Publication number Publication date
WO2009066998A2 (en) 2009-05-28
WO2009066998A3 (en) 2009-10-15

Similar Documents

Publication Title
TWI708216B (en) Method and system for calibrating vision system in environment
CN104626169B (en) Robot part grabbing method based on vision and mechanical comprehensive positioning
WO2007025300A8 (en) Capturing and processing facial motion data
TW201915943A (en) Method, apparatus and system for automatically labeling target object within image
GB201119501D0 (en) An apparatus, method and system
ATE514055T1 (en) COMPUTER ARRANGEMENT AND METHOD FOR COMPUTING MOTION VECTORS USING DISTANCE DATA
WO2015134794A2 (en) Method and system for 3d capture based on structure from motion with simplified pose detection
JP2010539557A5 (en)
CN101799717A (en) Man-machine interaction method based on hand action catch
CN103530599A (en) Method and system for distinguishing real face and picture face
EP2846308A3 Pointing direction detecting device and its method, program and computer-readable medium
WO2006015236A3 (en) Audio-visual three-dimensional input/output
CN104700385B (en) Binocular vision positioning device based on FPGA
TW201600829A (en) System and method for measuring an object
WO2017215351A1 (en) Method and apparatus for adjusting recognition range of photographing apparatus
WO2009112761A3 (en) System for measuring clearances and degree of flushness and corresponding method
CN111488775A (en) Apparatus and method for determining gaze degree
WO2008123462A1 (en) Image processing device, control program, computer-readable recording medium, electronic device, and image processing device control method
WO2009125132A3 (en) Method for determining a three-dimensional representation of an object using a sequence of cross-section images, computer program product, and corresponding method for analyzing an object and imaging system
CN104376323B Method and device for determining target range
CN112146589A (en) A three-dimensional topography measurement system and method based on ZYNQ platform
CN102044034A (en) Commodity catalog display system and method
MY147059A (en) Apparatus and method for multiple-touch spatial sensors
CN101561248A (en) Position measurement device and measuring method
TW201234235A (en) Method and system for calculating calibration information for an optical touch apparatus