WO2019206129A1 - Data processing method and apparatus, electronic device, and computer-readable storage medium - Google Patents
Data processing method and apparatus, electronic device, and computer-readable storage medium
- Publication number
- WO2019206129A1 (PCT/CN2019/083854)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- processing unit
- image
- laser
- controller
- bus
- Prior art date
- Legal status: Ceased
Classifications
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2513—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B5/00—Measuring arrangements characterised by the use of mechanical techniques
- G01B5/0011—Arrangements for eliminating or compensation of measuring errors due to temperature or weight
- G01B5/0014—Arrangements for eliminating or compensation of measuring errors due to temperature or weight due to temperature
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three-dimensional [3D] modelling for computer graphics
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/145—Illumination specially adapted for pattern recognition, e.g. using gratings
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional [3D] objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
- H05B47/175—Controlling the light source by remote control
- H05B47/18—Controlling the light source by remote control via data-bus transmission
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- the present application relates to the field of computer technologies, and in particular, to a data processing method, a data processing device, an electronic device, and a computer readable storage medium.
- the electronic device can emit laser light through a laser such as a laser lamp, and collect a face image illuminated by the laser through a camera, and construct a 3D face through the structured light.
- in the related art, the control circuitry used by the electronic device to control the laser, the camera, and the like is complicated and costly.
- Embodiments of the present application provide a data processing method, a data processing device, an electronic device, and a computer readable storage medium.
- the data processing method of the embodiment of the present application includes: when a first processing unit receives an image acquisition instruction sent by a second processing unit, controlling at least one of a floodlight and a laser lamp to be turned on, and controlling a laser camera to acquire a target image; and processing, by the first processing unit, the target image, and transmitting the processed target image to the second processing unit.
- the data processing apparatus of the embodiment of the present application includes a control module and a processing module.
- the control module is configured to control at least one of the floodlight and the laser light to be turned on when the first processing unit receives the image capturing instruction sent by the second processing unit, and control the laser camera to acquire the target image.
- the processing module is configured to process the target image by using the first processing unit, and send the processed target image to the second processing unit.
- the electronic device of the embodiment of the present application includes a first processing unit and a second processing unit.
- the first processing unit is configured to: when the first processing unit receives the image acquisition instruction sent by the second processing unit, control at least one of the floodlight and the laser to be turned on, and control the laser camera to acquire the target image; process the target image; and transmit the processed target image to the second processing unit.
- the computer readable storage medium of the embodiment of the present application stores a computer program; the data processing method described above is implemented when the computer program is executed by a processor.
- FIGS. 1 and 2 are schematic flowcharts of a data processing method according to some embodiments of the present application.
- FIGS. 3 to 5 are application scenario diagrams of a data processing method according to some embodiments of the present application.
- FIGS. 6 to 13 are schematic flowcharts of a data processing method according to some embodiments of the present application.
- FIGS. 14 and 15 are application scenario diagrams of a data processing method according to some embodiments of the present application.
- FIGS. 16 and 17 are schematic flowcharts of a data processing method according to some embodiments of the present application.
- FIG. 18 is a block diagram of an electronic device of some embodiments of the present application.
- FIGS. 19 to 22 are block diagrams of a data processing apparatus of some embodiments of the present application.
- a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
- The data processing method includes: when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, controlling at least one of the floodlight 104 and the laser light 106 to be turned on, and controlling the laser camera 102 to acquire the target image; and processing the target image by the first processing unit 110 and sending the processed target image to the second processing unit 120.
- the data processing method of the present application can be applied to the electronic device 100.
- the electronic device 100 includes a laser camera 102, a floodlight 104, a laser light 106, a first processing unit 110, and a second processing unit 120.
- the first processing unit 110 and the second processing unit 120 are connected.
- step 001, controlling at least one of the floodlight 104 and the laser light 106 to be turned on when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120 and controlling the laser camera 102 to acquire the target image, includes steps 011 and 012.
- 011: When the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, send a control instruction to the controller through the bidirectional two-wire synchronous serial I2C bus, where the control instruction is used to control at least one of the floodlight 104 and the laser lamp 106 to be turned on.
- when an application needs face data, it may send a data acquisition request to the second processing unit 120, where the face data may include, but is not limited to, data for face verification in scenarios such as face unlocking and face payment, and face depth information.
- the second processing unit 120 may send an image acquisition instruction to the first processing unit 110.
- the first processing unit 110 may be an MCU module, and the second processing unit 120 may be a CPU module.
- the electronic device 100 can also include a controller 130 that can be coupled to the floodlight 104 and the laser light 106, respectively, and the floodlight 104 and the laser light 106 can be controlled by the same controller 130.
- the controller 130 controls the floodlight 104 and the laser light 106, which may include controlling the floodlight 104 or the laser light 106 to be turned on, controlling the switching between the floodlight 104 and the laser light 106, controlling the emission power of the floodlight 104 and the laser light 106, and the like.
- the first processing unit 110 can be connected to the controller 130 through an I2C bus, and the I2C bus can realize data transmission between the devices connected to the I2C bus through one data line and one clock line.
- the control instruction can be sent to the controller 130 through the I2C bus.
- after receiving the control instruction, the controller 130 turns on at least one of the floodlight 104 and the laser lamp 106 according to the control instruction.
- 012: A pulse is transmitted to the controller 130 by the pulse width modulation (PWM) module 112 to illuminate at least one of the turned-on floodlight 104 and laser light 106, and the target image is acquired by the laser camera 102.
- the first processing unit 110 can be coupled to the controller 130 via the PWM module 112. If the first processing unit 110 is to illuminate at least one of the floodlight 104 and the laser light 106, the PWM module 112 may send a pulse to the controller 130 to illuminate at least one of the turned-on floodlight 104 and laser light 106. Optionally, the PWM module 112 can continuously send a pulse signal of a certain voltage amplitude at a certain time interval to the controller 130 to illuminate at least one of the floodlight 104 and the laser lamp 106.
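- As a minimal sketch of steps 011 and 012 (a control instruction over the I2C bus, then a PWM pulse train): the bus and PWM wrapper objects, the controller address, and the command codes below are illustrative assumptions, not interfaces defined by this application.

```python
# Hypothetical sketch of steps 011 and 012. The bus/PWM wrappers, the
# controller address, and the command codes are assumed for illustration.

CONTROLLER_ADDR = 0x40      # assumed I2C address of the controller 130
CMD_OPEN_FLOODLIGHT = 0x01  # assumed control-instruction codes
CMD_OPEN_LASER = 0x02

def acquire_target_image(i2c_bus, pwm, laser_camera, image_type):
    """Open the required light source over I2C, strobe it with PWM pulses,
    and collect the target image with the laser camera."""
    # Step 011: send the control instruction to the controller over I2C.
    cmd = CMD_OPEN_FLOODLIGHT if image_type == "infrared" else CMD_OPEN_LASER
    i2c_bus.write(CONTROLLER_ADDR, bytes([cmd]))

    # Step 012: send a pulse train of fixed amplitude at a fixed interval
    # so the opened floodlight or laser lamp is illuminated.
    pwm.send_pulses(duty_cycle=0.5, interval_ms=10)

    # The reflected light is collected as the target image
    # (infrared image under the floodlight, speckle image under the laser).
    return laser_camera.capture()
```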
- the first processing unit 110 may acquire a target image through the laser camera 102, and the target image may include an infrared image, a speckle image, and the like. If the floodlight 104 is turned on, the PWM module 112 can send a pulse to the controller 130 to illuminate the floodlight 104. The floodlight 104 can be a surface light source that illuminates uniformly in all directions; when it is illuminated, it emits infrared light, and the laser camera 102 can collect the light reflected back from the face to obtain an infrared image. If the laser light 106 is turned on, the PWM module 112 can send a pulse to the controller 130 to illuminate the laser light 106.
- the emitted laser light can be diffracted by a lens and a DOE (diffractive optical element) to produce a pattern with speckle particles, and the pattern is projected onto the target object. Because the distances between the points on the target object and the electronic device 100 differ, the speckle particles in the pattern are offset, and the laser camera 102 collects the pattern after the speckle particles are offset to obtain a speckle image.
- 002: The target image is processed by the first processing unit 110, and the processed target image is sent to the second processing unit 120.
- the laser camera 102 can transmit the acquired target image to the first processing unit 110, and the first processing unit 110 can process the target image.
- the target image may include an infrared image, a speckle image, and the like.
- the target image corresponding to the image type may be acquired according to the determined image type, and the target image is processed correspondingly.
- the number of PWM modules 112 may be one or more. When the number of PWM modules 112 is plural, the PWM module 112 may include a first PWM module and a second PWM module.
- the number of controllers 130 may also be one or more. When the number of the controllers 130 is plural, the controller 130 may include the first controller and the second controller.
- the first processing unit 110 may send a pulse to the first controller through the first PWM module to illuminate the floodlight 104 and collect the infrared image through the laser camera 102, and the first processing unit 110 processes the infrared image to obtain an infrared disparity map.
- the first processing unit 110 may send a pulse to the second controller through the second PWM module to illuminate the laser lamp 106 and collect the speckle image through the laser camera 102, and the first processing unit 110 processes the speckle image to obtain a speckle disparity map.
- the first processing unit 110 may acquire a speckle image and process the acquired speckle image to obtain a depth disparity map.
- the first processing unit 110 may perform a correction process on the target image. The correction process refers to correcting offsets of the image content caused by the internal and external parameters of the laser camera 102 and the RGB camera 108, for example, offsets caused by the deflection angle of the laser camera 102 and by the placement positions of the laser camera 102 and the RGB camera 108.
- after the correction process, a disparity map of the target image can be obtained.
- for example, the infrared image is corrected to obtain an infrared disparity map, and the speckle image is corrected to obtain a speckle disparity map or a depth disparity map, and the like.
- the first processing unit 110 performs a correction process on the target image, and can prevent a situation in which an image finally presented on the screen of the electronic device 100 is ghosted.
- the first processing unit 110 processes the target image, and can transmit the processed target image to the second processing unit 120.
- the second processing unit 120 can obtain a desired image according to the processed target image, such as an infrared image, a speckle image, a depth image, and the like.
- the second processing unit 120 can further process the desired image according to the requirements of the application.
- the second processing unit 120 may perform face detection on the obtained desired image or the like, wherein the face detection may include face recognition, face matching, and living body detection.
- Face recognition refers to the recognition of whether there is a face in the desired image.
- Face matching refers to matching a face in a desired image with a pre-existing face.
- Living body detection refers to verifying whether the human face in the desired image is biologically alive. If the application needs to acquire the depth information of the face, the second processing unit 120 may upload the generated depth image to the application, and the application may perform beauty processing, three-dimensional modeling, and the like according to the received depth image.
- in the data processing method above, when the first processing unit 110 receives the image acquisition instruction, the control instruction is sent to the controller 130 through the I2C bus to control at least one of the floodlight 104 and the laser lamp 106 to be turned on, a pulse is sent to the controller 130 through the PWM module 112 to illuminate at least one of the turned-on floodlight 104 and laser lamp 106, the target image is collected, and the target image is then processed.
- since a single controller 130 can control both the floodlight 104 and the laser light 106, the complexity of controlling the floodlight 104 and the laser light 106 is reduced and cost is saved.
- FIG. 3 is an application scenario diagram of the data processing method of the embodiment shown in FIG. 2.
- the electronic device 100 includes a laser camera 102, a floodlight 104, a laser lamp 106, a first processing unit 110, a second processing unit 120, and a controller 130.
- the first processing unit 110 may be an MCU (Microcontroller Unit) module or the like, and the second processing unit 120 may be a CPU (Central Processing Unit) module or the like.
- the first processing unit 110 can be connected to the laser camera 102 and the second processing unit 120, and the first processing unit 110 can be connected to the controller 130 through an I2C bus.
- the first processing unit 110 can include a PWM (Pulse Width Modulation) module 112 and is connected to the controller 130 through the PWM module 112.
- the controller 130 can be connected to the floodlight 104 and the laser lamp 106, respectively.
- when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, it sends a control instruction to the controller 130 through the I2C bus, and the control instruction can be used to control at least one of the floodlight 104 and the laser light 106 to be turned on.
- the first processing unit 110 may send a pulse to the controller 130 through the PWM module 112, illuminate at least one of the turned-on floodlight 104 and the laser light 106, and acquire a target image through the laser camera 102.
- the first processing unit 110 may process the target image and transmit the processed target image to the second processing unit 120.
- FIG. 4 is another application scenario diagram of the data processing method of the embodiment shown in FIG. 2.
- the electronic device 100 can include a camera module 101, a second processing unit 120, and a first processing unit 110.
- the second processing unit 120 can be a CPU module.
- the first processing unit 110 can be an MCU module or the like.
- the first processing unit 110 is connected between the second processing unit 120 and the camera module 101.
- the first processing unit 110 can control the laser camera 102, the floodlight 104, and the laser light 106 in the camera module 101.
- the second processing unit 120 can control the RGB camera 108 in the camera module 101.
- the camera module 101 includes a laser camera 102, a floodlight 104, an RGB camera 108, and a laser light 106.
- the laser camera 102 can be an infrared camera for acquiring infrared images.
- the floodlight 104 is a surface light source that emits infrared light.
- the laser lamp 106 is a point light source that emits laser light forming a pattern. When the floodlight 104 emits infrared light, the laser camera 102 can acquire an infrared image according to the reflected light.
- when the laser light 106 emits laser light, the laser camera 102 can acquire a speckle image based on the reflected light.
- the speckle image is an image in which the patterned laser light emitted by the laser lamp 106 is reflected and the pattern is deformed.
- the second processing unit 120 may include a CPU core running in a TEE (Trusted Execution Environment) environment and a CPU core running in a REE (Rich Execution Environment) environment.
- the TEE environment and the REE environment are operating modes of an ARM (Advanced RISC Machines) module.
- the security level of the TEE environment is high, and there is only one CPU core in the second processing unit 120 that can run in the TEE environment at the same time.
- operations with a high security level in the electronic device 100 need to be performed in the CPU core in the TEE environment, while operations with a lower security level can be performed in the CPU core in the REE environment.
- the first processing unit 110 includes a PWM module 112, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) interface 114, a RAM (Random Access Memory) module 116, and a depth engine 118.
- the first processing unit 110 can be connected through the PWM module 112 to the controller 130 (shown in FIG. 3) of the floodlight 104 and the laser light 106. The controller 130 can be connected to the floodlight 104 and the laser light 106, respectively, so that the floodlight 104 and the laser light 106 are controlled.
- the first processing unit 110 can also be coupled to the controller 130 via an I2C bus to control the floodlight 104 or the laser light 106 to be turned on via the I2C bus.
- the PWM module 112 can emit pulses to the camera module 101 to illuminate the turned-on floodlight 104 or laser light 106.
- the first processing unit 110 may acquire an infrared image or a speckle image through the laser camera 102.
- the SPI/I2C interface 114 is configured to receive an image acquisition instruction sent by the second processing unit 120.
- the depth engine 118 can process the speckle image to obtain a depth disparity map.
- the image acquisition instruction can be sent to the first processing unit 110 by the CPU core running in the TEE environment.
- the first processing unit 110 can send a control command to the controller 130 through the I2C bus to control the floodlight 104 in the camera module 101 to be turned on, then transmit a pulse to the controller 130 through the PWM module 112 to illuminate the floodlight 104, and control the laser camera 102 through the I2C bus to acquire an infrared image.
- the first processing unit 110 can also send a control command to the controller 130 through the I2C bus to control the laser light 106 in the camera module 101 to be turned on, then transmit a pulse to the controller 130 through the PWM module 112 to illuminate the laser light 106, and control the laser camera 102 through the I2C bus to acquire a speckle image.
- the camera module 101 can transmit the collected infrared image and the speckle image to the first processing unit 110.
- the first processing unit 110 may process the received infrared image to obtain an infrared disparity map, and may also process the received speckle image to obtain a speckle disparity map or a depth disparity map.
- the processing of the infrared image and the speckle image by the first processing unit 110 refers to correcting the infrared image or the speckle image, and removing the influence of the internal and external parameters of the camera module 101 on the image.
- the first processing unit 110 can also be set to different modes, and images output by different modes are different.
- when the first processing unit 110 is set to the speckle pattern mode, the first processing unit 110 processes the speckle image to obtain a speckle disparity map, and the target speckle image can be obtained according to the speckle disparity map; when the first processing unit 110 is set to the depth map mode, the first processing unit 110 processes the speckle image to obtain a depth disparity map, and a depth image can be obtained from the depth disparity map, where the depth image refers to an image with depth information.
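- The two output modes can be pictured with a small dispatch sketch; the two processing bodies below are placeholders standing in for the real pipeline (block matching and depth calculation are described later), and the mode names are assumptions.

```python
# Minimal sketch of the first processing unit's two output modes.

def to_speckle_disparity(speckle_image, reference_speckle):
    # Placeholder: block matching against the reference speckle image.
    return speckle_image

def to_depth_disparity(speckle_image, reference_speckle):
    # Placeholder: disparity plus per-pixel depth from the reference depth.
    return speckle_image

def process_speckle(speckle_image, reference_speckle, mode):
    if mode == "speckle_pattern":
        # A target speckle image can be derived from this disparity map.
        return to_speckle_disparity(speckle_image, reference_speckle)
    if mode == "depth_map":
        # A depth image (an image with depth information) can be derived.
        return to_depth_disparity(speckle_image, reference_speckle)
    raise ValueError(f"unknown mode: {mode!r}")
```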
- the first processing unit 110 may send the infrared disparity map and the speckle disparity map to the second processing unit 120, and the first processing unit 110 may also send the infrared disparity map and the depth disparity map to the second processing unit 120.
- the second processing unit 120 may acquire a target infrared image according to the infrared disparity map, and acquire a depth image according to the depth disparity map. Further, the second processing unit 120 may perform face recognition, face matching, living body detection, and acquiring depth information of the detected face according to the target infrared image and the depth image.
- Communication between the first processing unit 110 and the second processing unit 120 is through a fixed security interface to ensure the security of the transmitted data.
- the data sent by the second processing unit 120 to the first processing unit 110 passes through the SECURE SPI/I2C 130, and the data sent by the first processing unit 110 to the second processing unit 120 passes through the SECURE MIPI (Mobile Industry Processor Interface) 140.
- the first processing unit 110 may also acquire the target infrared image according to the infrared disparity map and calculate the depth image according to the depth disparity map, and then send the target infrared image and the depth image to the second processing unit 120.
- the step of acquiring the target image by the laser camera 102 includes: controlling the laser camera 102 to acquire the target image through the I2C bus.
- the first processing unit 110 can be connected to the laser camera 102 through an I2C bus, and control the laser camera 102 to acquire a target image through the connected I2C bus.
- the first processing unit 110, the laser camera 102, and the controller 130 can be connected to the same I2C bus.
- the floodlight 104 or the laser light 106 can be controlled to be turned on through the I2C bus, a pulse is sent to the controller 130 through the PWM module 112 to illuminate the turned-on floodlight 104 or laser lamp 106, and the laser camera 102 is controlled through the connected I2C bus to collect a target image such as an infrared image or a speckle image.
- the first processing unit 110 can address the controller 130 through the connected I2C bus and send the control instruction to the controller 130 to control the floodlight 104 or the laser light 106 to be turned on, and then address the laser camera 102 through the same connected I2C bus to control the laser camera 102 to collect the target image, so that the connected I2C bus is multiplexed at different times, saving resources.
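- The time-shared use of one bus might look as follows; the device addresses and command bytes are assumptions for illustration only.

```python
# Hypothetical sketch of multiplexing a single I2C bus: the controller 130
# and the laser camera 102 sit at different addresses on the same bus and
# are addressed at different times. Addresses and commands are assumed.

CONTROLLER_ADDR = 0x40   # assumed address of the controller 130
CAMERA_ADDR = 0x36       # assumed address of the laser camera 102
CMD_START_CAPTURE = 0x10

def capture_on_shared_bus(i2c_bus, pwm, light_cmd):
    # 1) Address the controller and send the control instruction.
    i2c_bus.write(CONTROLLER_ADDR, bytes([light_cmd]))
    # 2) Strobe the opened light source through the PWM module.
    pwm.send_pulses(duty_cycle=0.5, interval_ms=10)
    # 3) Reuse the same bus at a later time to address the camera and
    #    trigger the capture; no dedicated control line per device is needed.
    i2c_bus.write(CAMERA_ADDR, bytes([CMD_START_CAPTURE]))
```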
- FIG. 5 is a schematic diagram of the first processing unit 110, the laser camera 102, and the controller 130 of one embodiment connected to the same I2C bus.
- the electronic device 100 includes a laser camera 102, a floodlight 104, a laser lamp 106, a first processing unit 110, a second processing unit 120, and a controller 130.
- the first processing unit 110 can be connected to the second processing unit 120.
- the PWM module 112 can be included in the first processing unit 110 and connected to the controller 130 through the PWM module 112.
- the controller 130 can be connected to the floodlight 104 and the laser lamp 106, respectively.
- the laser camera 102, the first processing unit 110, and the controller 130 can be connected to the same I2C bus.
- the control command may be sent to the controller 130 through the I2C bus to control the floodlight 104 or the laser light 106 to be turned on, a pulse is sent to the controller 130 through the PWM module 112 to illuminate the turned-on floodlight 104 or laser light 106, and the laser camera 102 is controlled through the connected I2C bus to acquire a target image such as an infrared image or a speckle image.
- the I2C bus is multiplexed by controlling the floodlight 104, the laser lamp 106, and the laser camera 102 through the same I2C bus, which can reduce the complexity of the control circuit and reduce the cost.
- the step of transmitting the control command to the controller 130 through the I2C bus includes the following steps:
- 0111: Determine the type of the acquired image according to the image acquisition instruction.
- if the acquired image type is an infrared image, the first processing unit 110 sends a first control instruction to the controller 130 through the I2C bus, where the first control instruction is used to instruct the controller 130 to turn on the floodlight 104.
- the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, and may determine the type of the acquired image according to the image acquisition instruction, where the image type may be one or more of an infrared image, a speckle image, and a depth image.
- the image type may be determined according to the face data required by the application; after receiving the data acquisition request, the second processing unit 120 may determine the image type according to the data acquisition request and send an image acquisition instruction including the image type to the first processing unit 110.
- for example, the second processing unit 120 may determine that the image type is an infrared image and a speckle image; when face depth information is required, it may further determine that the image type is a depth image, or the like, but is not limited thereto.
- the first processing unit 110 may send the first control instruction to the controller 130 through the connected I2C bus, and the controller 130 may turn on the floodlight 104 according to the first control instruction.
- the first processing unit 110 can transmit a pulse to the controller 130 through the PWM module 112 to illuminate the floodlight 104.
- the first processing unit 110 may address the controller 130 through the I2C bus and send the first control instruction to the controller 130.
- if the acquired image type is a speckle image or a depth image, the first processing unit 110 sends a second control instruction to the controller 130 through the I2C bus, where the second control instruction is used to instruct the controller 130 to turn on the laser light 106.
- the first processing unit 110 may send the second control instruction to the controller 130 through the connected I2C bus, and the controller 130 can turn on the laser light 106 according to the second control instruction.
- the first processing unit 110 can transmit a pulse to the controller 130 through the PWM module 112 to illuminate the laser lamp 106.
- the first processing unit 110 determines an image type according to an image acquisition instruction, and the image type may include at least two types, for example, the image type may include the first type and the second type at the same time.
- for example, the image type includes both the infrared image and the speckle image, or both the infrared image and the depth image; in either case, the camera module 101 needs to acquire both an infrared image and a speckle image.
- the first processing unit 110 can control the camera module 101 to collect the infrared image first or the speckle image first; the acquisition order is not limited.
- for example, the first processing unit 110 may first send the first control command to the controller 130 through the I2C bus to turn on the floodlight 104, send a pulse to the controller 130 through the PWM module 112 to illuminate the floodlight 104, and then control the laser camera 102 through the I2C bus to capture an infrared image.
- then the second control command may be sent to the controller 130 through the I2C bus to turn on the laser lamp 106, a pulse is transmitted to the controller 130 through the PWM module 112 to illuminate the laser lamp 106, and the laser camera 102 is controlled through the I2C bus to collect the speckle image.
- alternatively, the first processing unit 110 may first send the second control instruction to the controller 130 through the I2C bus to turn on the laser light 106, emit a pulse to the controller 130 through the PWM module 112 to illuminate the laser lamp 106, and control the laser camera 102 through the I2C bus to collect the speckle image; then the first control command may be sent to the controller 130 through the I2C bus to turn on the floodlight 104, a pulse is emitted to the controller 130 through the PWM module 112 to illuminate the floodlight 104, and the laser camera 102 is controlled through the I2C bus to collect the infrared image.
- the first processing unit 110 may send the first control instruction and the second control instruction to the controller 130 at different times, and the time interval between the time when the first control instruction is sent and the time when the second control instruction is sent may be less than a time threshold; the laser camera 102 then collects the speckle image within a time interval less than the time threshold after the infrared image is acquired, so that the image contents of the acquired infrared image and speckle image are more consistent, which facilitates subsequent face detection and other processing.
- the time threshold can be set according to actual needs, such as 20 milliseconds, 30 milliseconds, and the like.
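- A sketch of collecting the two image types back to back within the time threshold; it reuses the assumed constants from the earlier sketch, and the 30 ms value is one of the examples given in the text.

```python
import time

TIME_THRESHOLD_MS = 30  # e.g. 20 or 30 milliseconds, per the text

def capture_infrared_then_speckle(i2c_bus, pwm, camera):
    # First control instruction: open and strobe the floodlight, capture IR.
    t_first = time.monotonic()
    i2c_bus.write(CONTROLLER_ADDR, bytes([CMD_OPEN_FLOODLIGHT]))
    pwm.send_pulses(duty_cycle=0.5, interval_ms=10)
    infrared = camera.capture()

    # The second control instruction must follow within the time threshold
    # so that both images show nearly the same scene.
    elapsed_ms = (time.monotonic() - t_first) * 1000.0
    if elapsed_ms >= TIME_THRESHOLD_MS:
        raise RuntimeError("second control instruction sent too late")
    i2c_bus.write(CONTROLLER_ADDR, bytes([CMD_OPEN_LASER]))
    pwm.send_pulses(duty_cycle=0.5, interval_ms=10)
    speckle = camera.capture()
    return infrared, speckle
```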
- switching and control of the floodlight 104 and the laser lamp 106 can be realized by a single controller 130, which can reduce the complexity of the control circuit and reduce the cost.
- in some embodiments, step 002, processing the target image by the first processing unit 110 and sending the processed target image to the second processing unit 120, includes the following steps:
- in the camera coordinate system, the line perpendicular to the imaging plane and passing through the center of the lens is the Z axis. If the coordinates of an object in the camera coordinate system are (X, Y, Z), the Z value is the depth information of the object in the camera imaging plane. If an application needs to obtain the depth information of the face, a depth image containing the face depth information needs to be collected.
- the first processing unit 110 can control the laser light 106 to be turned on through the I2C bus, and control the laser camera 102 to collect the speckle image through the I2C bus.
- a reference speckle image may be pre-stored in the first processing unit 110, and the reference speckle image carries reference depth information; the depth information of each pixel point included in the collected speckle image may be acquired according to the collected speckle image and the reference speckle image.
- the first processing unit 110 may select, centered in turn on each pixel point included in the collected speckle image, a pixel block of a preset size, for example 31 pixels × 31 pixels, and search the reference speckle image for a block that matches the selected pixel block.
- from the selected pixel block and the matched block in the reference speckle image, the first processing unit 110 may find two points, one in the collected speckle image and one in the reference speckle image, that lie on the same laser light path.
- the speckle information of two points on the same laser beam path is consistent, and two points on the same laser beam path can be identified as corresponding pixel points.
- in the reference speckle image, the depth information of the points on each laser light path is known.
- the first processing unit 110 may calculate the offset between the two pixel points on the same laser light path in the collected speckle image and the reference speckle image, and calculate the depth information of each pixel in the collected speckle image according to the offset.
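- The block search can be sketched as below. The 31 × 31 window comes from the text; the horizontal-only search and the normalized-correlation score are assumptions, since this application does not fix the matching criterion.

```python
import numpy as np

def pixel_offset(speckle, reference, y, x, half=15, max_search=64):
    """Horizontal offset (in pixels) of the 31x31 block centered at (y, x)
    in the collected speckle image, relative to the reference image.
    Assumes (y, x) lies at least `half` pixels from the image border."""
    block = speckle[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    best_dx, best_score = 0, -np.inf
    for dx in range(-max_search, max_search + 1):
        cx = x + dx
        if cx - half < 0 or cx + half + 1 > reference.shape[1]:
            continue  # candidate window would fall off the image edge
        cand = reference[y - half:y + half + 1,
                         cx - half:cx + half + 1].astype(float)
        # Normalized correlation as an assumed matching score.
        score = float(np.sum(block * cand)) / (
            np.linalg.norm(block) * np.linalg.norm(cand) + 1e-9)
        if score > best_score:
            best_dx, best_score = dx, score
    return best_dx
```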
- the first processing unit 110 calculates the offset between the collected speckle image and the reference speckle image, and calculates the depth information of each pixel point included in the speckle image according to the offset, using the relation:
- Z_D = (L × f × Z_0) / (L × f − P × Z_0)
- where Z_D represents the depth information of the pixel point, that is, the depth value of the pixel point; L is the distance between the laser camera 102 and the laser (i.e., the laser lamp 106); f is the focal length of the lens in the laser camera 102; Z_0 is the depth value of the reference plane relative to the laser camera 102 of the electronic device 100; and P is the offset between corresponding pixels in the collected speckle image and the reference speckle image. P can be obtained by multiplying the pixel offset between the target speckle pattern and the reference speckle pattern by the actual distance of one pixel. P is a negative value when the distance between the target object and the laser camera 102 is less than the distance between the reference plane and the laser camera 102, and P is a positive value when the distance between the target object and the laser camera 102 is greater than the distance between the reference plane and the laser camera 102.
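- A small numeric helper makes the relation and its sign convention concrete; this is a sketch with illustrative parameter names and values, not code from this application.

```python
def depth_from_offset(offset_px, pixel_pitch, L, f, Z0):
    """Per-pixel depth from the speckle offset, using
    Z_D = (L * f * Z0) / (L * f - P * Z0).

    offset_px   signed pixel offset between matched blocks (see sign rule)
    pixel_pitch actual distance covered by one pixel
    L           distance between the laser camera and the laser lamp
    f           focal length of the lens in the laser camera
    Z0          depth of the reference plane from the laser camera
    """
    P = offset_px * pixel_pitch  # offset converted to a physical distance
    return (L * f * Z0) / (L * f - P * Z0)

# Sanity check of the sign rule: a negative offset (object closer than the
# reference plane) must yield a depth smaller than Z0.
assert depth_from_offset(-2, 1e-5, L=0.05, f=0.004, Z0=0.5) < 0.5
```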
- 0143: Generate a depth disparity map according to the reference depth information and the matching result, send the depth disparity map to the second processing unit 120, and process the depth disparity map by the second processing unit 120 to obtain a depth map.
- after obtaining the depth information of each pixel point included in the collected speckle image, the first processing unit 110 can perform correction processing on the collected speckle image to correct the offset of the image content caused by the internal and external parameters of the laser camera 102 and the RGB camera 108.
- the first processing unit 110 may generate a depth disparity map according to the corrected speckle image and the depth value of each pixel in the speckle image, and send the depth disparity map to the second processing unit 120.
- the second processing unit 120 may obtain a depth map according to the depth disparity map, and the depth map may include depth information of each pixel.
- the second processing unit 120 may upload the depth map to the application, and the application may perform beauty processing, three-dimensional modeling, and the like according to the depth information of the face in the depth map.
- the second processing unit 120 can also perform the living body detection according to the depth information of the face in the depth map, and can prevent the collected face from being a two-dimensional plane face or the like.
- the second processing unit 120 in the electronic device 100 may include two operating modes. The first operating mode may be TEE, a trusted execution environment with a high security level; the second operating mode may be REE, a rich execution environment with a lower security level.
- the image acquisition instruction may be sent to the first processing unit 110 through the first operation mode.
- when the second processing unit 120 is a single-core CPU, the single core can be switched directly from the second operating mode to the first operating mode; when the second processing unit 120 is multi-core, one core can be switched from the second operating mode to the first operating mode while the other cores still run in the second operating mode, and the image acquisition instruction is sent to the first processing unit 110 by the core running in the first operating mode.
- the processed target image may be sent to the kernel running in the first operation mode, so that the first processing unit 110 can always be operated in a trusted operating environment.
- the second processing unit 120 can obtain a desired image according to the processed target image in the kernel running in the first operation mode, and process the desired image according to the requirements of the application. For example, the second processing unit 120 can perform face detection on a desired image in a kernel running in the first mode of operation.
- the image acquisition instruction is sent to the first processing unit 110 by the high-security kernel of the second processing unit 120, which ensures that the first processing unit 110 operates in a high-security environment and improves data security.
- since the kernel running in the first operating mode is unique, the second processing unit 120 performs face detection on the target images in the TEE environment in a serial manner, performing face recognition, face matching, and living body detection on one target image at a time.
- the second processing unit 120 may perform face recognition on the desired image first and, when a face is recognized, match the face included in the desired image with a pre-stored face to determine whether they are the same face. If they are the same face, living body detection is performed on the face according to the desired image, which prevents the captured face from being a two-dimensional planar face or the like. When no face is recognized, face matching and living body detection are not performed, which alleviates the processing pressure on the second processing unit 120.
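- The serial order (recognition, then matching, then living body detection, with early exit) can be sketched as below; the three callables stand in for the second processing unit's algorithms, which this application does not specify.

```python
def detect_face(target_infrared, depth_image, recognize, match, is_live):
    """Serial face detection, one target image at a time."""
    if not recognize(target_infrared):
        # No face: skip matching and liveness, easing the processing load.
        return False
    if not match(target_infrared):
        # Not the same face as the pre-stored one.
        return False
    # Reject two-dimensional planar faces using the depth information.
    return is_live(depth_image)
```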
- the depth information of the acquired image can be accurately obtained by the first processing unit 110, the data processing efficiency is high, and the accuracy of the image processing is improved.
- in some embodiments, step 001, controlling at least one of the floodlight 104 and the laser light 106 to be turned on when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120 and controlling the laser camera 102 to acquire the target image, includes steps 021 and 022.
- when an application needs face data, it may send a data acquisition request to the second processing unit 120, where the face data may include, but is not limited to, data for face verification in scenarios such as face unlocking and face payment, and face depth information.
- the second processing unit 120 may send an image acquisition instruction to the first processing unit 110, where the first processing unit 110 may be an MCU module, and the second processing unit may be a CPU module.
- the first processing unit 110 and the laser camera 102, the floodlight 104, and the laser lamp 106 in the camera module 101 can be connected to the same I2C bus.
- the I2C bus can transfer data between devices connected to the I2C bus through a data line and a clock line.
- the first processing unit 110 may send a control command through the I2C bus to the floodlight 104 and/or the laser lamp 106 connected to the same I2C bus, to control at least one of the floodlight 104 and the laser light 106 to be turned on.
- the first processing unit 110 may determine, according to the image acquisition instruction, whether the floodlight 104 or the laser light 106 is currently to be controlled. If the floodlight 104 is required to be turned on, the first processing unit 110 can address the floodlight 104 connected to the I2C bus through the I2C bus and then send a control command to the floodlight 104 to control it to be turned on. If the laser light 106 is required to be turned on, the first processing unit 110 can address the laser light 106 connected to the I2C bus through the I2C bus and then send a control command to the laser light 106 to control it to be turned on.
- the first processing unit 110 controls the laser camera 102 to acquire a target image through an I2C bus.
- after the first processing unit 110 controls at least one of the floodlight 104 and the laser lamp 106 to be turned on through the I2C bus, the laser camera 102 can be controlled through the I2C bus to acquire a target image.
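- In this arrangement the light sources sit on the bus themselves, so no separate controller is addressed; the sketch below mirrors steps 021 and 022, with all addresses and command bytes assumed for illustration.

```python
# Hypothetical sketch of steps 021 and 022: the floodlight, the laser lamp,
# and the laser camera are all addressed directly on one shared I2C bus.

FLOODLIGHT_ADDR = 0x52  # assumed bus address of the floodlight 104
LASER_ADDR = 0x53       # assumed bus address of the laser lamp 106
CAMERA_ADDR = 0x36      # assumed bus address of the laser camera 102
CMD_TURN_ON = 0x01
CMD_START_CAPTURE = 0x10

def collect_target_image(i2c_bus, image_type):
    # Step 021: address the required light source and turn it on.
    light = FLOODLIGHT_ADDR if image_type == "infrared" else LASER_ADDR
    i2c_bus.write(light, bytes([CMD_TURN_ON]))
    # Step 022: address the camera on the same bus and trigger the capture.
    i2c_bus.write(CAMERA_ADDR, bytes([CMD_START_CAPTURE]))
```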
- the target image can include an infrared image, a speckle image, and the like.
- the first processing unit 110 can control the floodlight 104 in the camera module 101 to be turned on through the I2C bus and control the laser camera 102 to collect infrared images through the I2C bus. The floodlight 104 can be a surface light source that illuminates uniformly in all directions; the light it emits may be infrared light, and the laser camera 102 may collect the infrared light reflected by the human face to obtain an infrared image.
- the first processing unit 110 can control the laser light 106 in the camera module 101 to be turned on through the I2C bus, and control the laser camera 102 to collect a speckle image or the like through the I2C bus.
- the emitted laser light can be diffracted by a lens and a DOE (diffractive optical element) to produce a pattern with speckle particles, and the pattern is projected onto the target object. Because the distances between the points on the target object and the electronic device differ, the speckle particles in the pattern are offset, and the laser camera 102 collects the pattern after the speckle particles are offset to obtain a speckle image.
- the first processing unit 110 addresses the floodlight 104 or the laser light 106 connected to the I2C bus through the I2C bus and sends a control command to the floodlight 104 or the laser light 106 to control it to be turned on; the laser camera 102 connected to the I2C bus can then be addressed through the I2C bus and sent a control command to control the laser camera 102 to acquire an infrared image or a speckle image.
- the target image is processed by the first processing unit 110, and the processed target image is sent to the second processing unit 120.
- the laser camera 102 can transmit the acquired target image to the first processing unit 110, and the first processing unit 110 can process the target image.
- the first processing unit 110 can be set to different modes, and different modes can collect different target images, perform different processing on the target image, and the like.
- for example, the first processing unit 110 can control the floodlight 104 to be turned on through the I2C bus, control the laser camera 102 to collect infrared images through the I2C bus, and process the infrared image to obtain an infrared disparity map.
- the first processing unit 110 can control the laser light 106 to be turned on through the I2C bus, control the laser camera 102 to collect the speckle image through the I2C bus, and process the speckle image to obtain a speckle disparity map.
- the first processing unit 110 can also control the laser light 106 to be turned on through the I2C bus, control the laser camera 102 to collect the speckle image through the I2C bus, and process the speckle image to obtain a depth disparity map.
- the first processing unit 110 may perform a correction process on the target image. The correction process refers to correcting offsets of the image content caused by the internal and external parameters of the laser camera 102 and the RGB camera 108, for example, offsets caused by the deflection angle of the laser camera 102 and by the placement positions of the laser camera 102 and the RGB camera 108.
- after the correction process, a disparity map of the target image can be obtained.
- for example, the first processing unit 110 performs correction processing on the infrared image to obtain an infrared disparity map, and corrects the speckle image to obtain a speckle disparity map or a depth disparity map, and the like.
- the first processing unit 110 performs a correction process on the target image, and can prevent a situation in which an image finally presented on the screen of the electronic device 100 is ghosted.
- the first processing unit 110 processes the target image, and can transmit the processed target image to the second processing unit 120.
- the second processing unit 120 can obtain a desired image according to the processed target image, such as an infrared image, a speckle image, a depth image, and the like.
- the second processing unit 120 can process the desired image according to the needs of the application.
- the second processing unit 120 may perform face detection on the obtained desired image or the like, wherein the face detection may include face recognition, face matching, and living body detection.
- Face recognition refers to the recognition of whether there is a face in the desired image.
- Face matching refers to matching a face in a desired image with a pre-existing face.
- Living body detection refers to verifying whether the human face in the desired image is biologically alive. If the application needs to acquire the depth information of the face, the second processing unit 120 may upload the generated depth image to the application, and the application may perform beauty processing, three-dimensional modeling, and the like according to the received depth image.
- in the data processing method above, the laser camera 102, the floodlight 104, the laser lamp 106, and the first processing unit 110 are connected to the same I2C bus; the first processing unit 110 controls at least one of the floodlight 104 and the laser lamp 106 to be turned on through the I2C bus and controls the laser camera 102 through the I2C bus to collect the target image. Controlling the floodlight 104, the laser lamp 106, and the laser camera 102 through the same I2C bus multiplexes the I2C bus, which can reduce the complexity of the control circuit and reduce the cost.
- FIG. 5 is an application scenario diagram of a data processing method of the embodiment shown in FIG. 8.
- the electronic device 100 includes a laser camera 102, a laser lamp 106, a floodlight 104, a first processing unit 110, a second processing unit 120, and a controller 130.
- the first processing unit 110 may be an MCU (Microcontroller Unit) module or the like, and the second processing unit 120 may be a CPU (Central Processing Unit) module or the like.
- the first processing unit 110 can be coupled to the laser camera 102, the laser light 106, the floodlight 104, and the second processing unit 120.
- the controller 130 can be connected to the laser lamp 106 and the floodlight 104, respectively, and the controller 130 can control the laser lamp 106 and the floodlight 104.
- the laser camera 102, the controller 130, and the first processing unit 110 are connected to the same I2C (Inter-Integrated Circuit) bus.
- when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, at least one of the floodlight 104 and the laser lamp 106 can be turned on through the I2C bus.
- the first processing unit 110 can send a control command to the controller 130 connected to the I2C bus. After receiving the control command, the controller 130 can control at least one of the floodlight 104 and the laser light 106 to be turned on according to the control command.
- the first processing unit 110 can illuminate the floodlight 104 and the laser lamp 106 by a PWM (Pulse Width Modulation) module 112.
- the first processing unit 110 can control the laser camera 102 to acquire a target image through the I2C bus.
- the first processing unit 110 processes the acquired target image, and may transmit the processed target image to the second processing unit 120.
- FIG. 4 is another application scenario diagram of the data processing method of the embodiment shown in FIG. 8.
- the electronic device 100 can include a camera module 101, a second processing unit 120, and a first processing unit 110.
- the second processing unit 120 can be a CPU module.
- the first processing unit 110 can be an MCU module or the like.
- the first processing unit 110 is connected between the second processing unit 120 and the camera module 101.
- the first processing unit 110 can control the laser camera 102, the floodlight 104 and the laser light 106 in the camera module 101, and the second processing Unit 120 can control RGB camera 108 in camera module 101.
- the camera module 101 includes a laser camera 102, a floodlight 104, an RGB camera 108, and a laser lamp 106.
- the laser camera 102 can be an infrared camera for acquiring infrared images.
- the floodlight 104 is a surface light source that emits infrared light.
- the laser lamp 106 is a point light source that emits laser light forming a pattern. When the floodlight 104 emits infrared light, the laser camera 102 can acquire an infrared image according to the reflected light; when the laser light 106 emits laser light, the laser camera 102 can acquire a speckle image based on the reflected light.
- the speckle image is an image in which the patterned laser light emitted by the laser lamp 106 is reflected and the pattern is deformed.
- the laser camera 102, the floodlight 104, the laser lamp 106, and the first processing unit 110 can be coupled to the same I2C bus.
- the second processing unit 120 may include a CPU core running in a TEE (Trusted Execution Environment) environment and a CPU core running in a REE (Rich Execution Environment) environment.
- the TEE environment and the REE environment are operating modes of an ARM (Advanced RISC Machines) module.
- the security level of the TEE environment is high, and there is only one CPU core in the second processing unit 120 that can run in the TEE environment at the same time.
- operations with a high security level in the electronic device 100 need to be performed in the CPU core in the TEE environment, while operations with a lower security level can be performed in the CPU core in the REE environment.
- the first processing unit 110 includes a PWM module 112, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) interface 114, a RAM (Random Access Memory) module 116, and a depth engine 118.
- the first processing unit 110 can control the floodlight 104 or the laser light 106 through a connected I2C bus, and the PWM module 112 can emit a pulse to the camera module 101 to illuminate the turned-on floodlight 104 or the laser light 106.
- the first processing unit 110 can control the laser camera 102 to acquire an infrared image or a speckle image through the I2C.
- the SPI/I2C interface 114 is configured to receive an image acquisition instruction sent by the second processing unit 120.
- the depth engine 118 can process the speckle image to obtain a depth disparity map.
- the image acquisition instruction can be sent to the first processing unit 110 by the CPU core running in the TEE environment.
- the first processing unit 110 can turn on the floodlight 104 in the camera module 101 through the I2C bus, emit a pulse wave through the PWM module 112 to illuminate the floodlight 104, and control the laser camera 102 to collect an infrared image through the I2C bus; it can likewise turn on the laser lamp 106 in the camera module 101 through the I2C bus and control the laser camera 102 to collect a speckle image through the I2C bus.
- the camera module 101 can transmit the collected infrared image and the speckle image to the first processing unit 110.
- the first processing unit 110 may process the received infrared image to obtain an infrared parallax map, and process the received speckle image to obtain a speckle disparity map or a depth disparity map.
- the processing of the infrared image and the speckle image by the first processing unit 110 refers to correcting the infrared image or the speckle image, and removing the influence of the internal and external parameters of the camera module 101 on the image.
- the first processing unit 110 can be set to different modes, and images output by different modes are different.
- when the first processing unit 110 is set to the speckle map mode, the first processing unit 110 processes the speckle image to obtain a speckle disparity map, from which a target speckle image can be obtained; when the first processing unit 110 is set to the depth map mode, the first processing unit 110 processes the speckle image to obtain a depth disparity map, from which a depth image can be obtained, the depth image referring to an image with depth information.
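- A minimal sketch of these two output modes follows; the placeholder disparity functions stand in for the processing described here and are assumptions, not part of the disclosure.

```python
# Sketch of the first processing unit's output modes. The two `to_*` helpers
# are stubs for the disparity computations described in this document.
def to_speckle_disparity(speckle_image):
    return speckle_image  # stub: would compute a speckle disparity map

def to_depth_disparity(speckle_image):
    return speckle_image  # stub: would compute a depth disparity map

def process_speckle(speckle_image, mode: str):
    if mode == "speckle":   # speckle map mode -> target speckle image later
        return to_speckle_disparity(speckle_image)
    if mode == "depth":     # depth map mode -> depth image later
        return to_depth_disparity(speckle_image)
    raise ValueError(f"unsupported mode: {mode}")
```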
- the first processing unit 110 may send the infrared disparity map and the speckle disparity map to the second processing unit 120, and the first processing unit 110 may also send the infrared disparity map and the depth disparity map to the second processing unit 120.
- the second processing unit 120 may acquire a target infrared image according to the infrared disparity map, and acquire a depth image according to the depth disparity map. Further, the second processing unit 120 may perform face recognition, face matching, living body detection, and acquiring depth information of the detected face according to the target infrared image and the depth image.
- Communication between the first processing unit 110 and the second processing unit 120 is through a fixed security interface to ensure the security of the transmitted data.
- the data sent by the second processing unit 120 to the first processing unit 110 passes through the SECURE SPI/I2C 130, and the data sent by the first processing unit 110 to the second processing unit 120 passes through the SECURE MIPI (Mobile Industry Processor Interface) 140.
- the first processing unit 110 may also acquire the target infrared image according to the infrared disparity map, calculate the acquired depth image according to the depth disparity map, and then send the target infrared image and the depth image to the second processing unit 120.
- the step of controlling at least one of the floodlight 104 and the laser light 106 to be turned on through the I2C bus includes the following steps:
- the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, and may determine the type of the acquired image according to the image acquisition instruction, where the image type may be one or more of an infrared image, a speckle image, and a depth image.
- the image type may be determined according to the face data required by the application; after receiving the data acquisition request, the second processing unit 120 may determine the image type according to the data acquisition request and send an image acquisition instruction including the image type to the first processing unit 110. For example, if data for face unlocking is required, the image type may be determined as an infrared image and a speckle image; if face depth information is required, the image type may be determined as a depth image, and so on, but is not limited thereto.
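- As a hedged illustration of this mapping, a small lookup table could associate request types with image types; all names below are assumptions rather than part of the disclosure.

```python
# Hypothetical sketch: mapping an application's data acquisition request to the
# image types to embed in the image acquisition instruction.
INFRARED, SPECKLE, DEPTH = "infrared", "speckle", "depth"

IMAGE_TYPES = {
    "face_unlock": {INFRARED, SPECKLE},   # face verification needs IR + speckle
    "face_payment": {INFRARED, SPECKLE},
    "face_depth": {DEPTH},                # depth information needs a depth image
}

def build_acquisition_instruction(data_request: str) -> set:
    """Return the set of image types for the image acquisition instruction."""
    return IMAGE_TYPES.get(data_request, {INFRARED})
```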
- the first processing unit 110 sends a first control instruction to the controller 130 through the I2C bus, where the first control instruction is used to instruct the controller 130 to turn on the floodlight 104.
- the electronic device 100 can be provided with a controller 130.
- the floodlight 104 and the laser light 106 can share the same controller 130.
- the controller 130 can be connected to the floodlight 104 and the laser light 106, respectively.
- controlling the floodlight 104 and the laser light 106 includes controlling the floodlight 104 or the laser light 106 to be turned on, controlling the switching between the floodlight 104 and the laser light 106, controlling the emission power of the floodlight 104 and the laser light 106, and the like.
- the controller 130 can be connected to the same I2C bus with the laser camera 102 and the first processing unit 110.
- the first processing unit 110 may send a first control instruction to the controller 130 through the connected I2C bus, and the controller 130 may turn on the floodlight 104 according to the first control instruction.
- the first processing unit 110 can transmit a pulse to the controller 130 through the PWM module 112 to illuminate the floodlight 104.
- the first processing unit 110 may address the controller 130 through the I2C bus and send the first control instruction to the controller 130.
- the first processing unit 110 sends a second control instruction to the controller 130 through the I2C bus, and the second control instruction is used to instruct the controller 130 to turn on the laser lamp 106.
- the first processing unit 110 may send a second control instruction to the controller 130 through the connected I2C bus, and the controller 130 may turn on the laser lamp 106 according to the second control instruction.
- the first processing unit 110 can transmit a pulse to the controller 130 through the PWM module 112 to illuminate the laser lamp 106.
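- For illustration only, a hedged sketch of the first and second control instructions as I2C register writes; the smbus2 library is real, but the controller address, register, and command codes are assumptions not stated in the disclosure.

```python
# Hypothetical sketch of the first processing unit sending control
# instructions to the shared controller 130 over the I2C bus.
from smbus2 import SMBus

CTRL_ADDR = 0x40          # assumed 7-bit I2C address of controller 130
CTRL_REG = 0x01           # assumed control register
CMD_FLOODLIGHT_ON = 0x01  # "first control instruction"
CMD_LASER_ON = 0x02       # "second control instruction"

def send_control_instruction(bus: SMBus, command: int) -> None:
    # Addressing the controller and writing the command selects which
    # light source the subsequent PWM pulse will illuminate.
    bus.write_byte_data(CTRL_ADDR, CTRL_REG, command)

with SMBus(1) as bus:
    send_control_instruction(bus, CMD_FLOODLIGHT_ON)  # turn on floodlight 104
    # the PWM module 112 would now pulse the controller to light the floodlight
```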
- the image type may include multiple types: it may include both an infrared image and a speckle image, both an infrared image and a depth image, or an infrared image, a speckle image, and a depth image together.
- the first processing unit 110 can respectively control the floodlight 104 to be turned on to collect an infrared image, and control the laser light 106 to be turned on to collect a speckle image.
- the first processing unit 110 can control the laser camera 102 to first collect the infrared image, and can also control the laser camera 102 to first collect the speckle image, and does not limit the sequential acquisition sequence.
- the first processing unit 110 may first send the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104, control the laser camera 102 to collect the infrared image through the I2C bus, then send the second control instruction to the controller 130 through the I2C bus to turn on the laser lamp 106, and control the laser camera 102 to collect the speckle image through the I2C bus.
- the first processing unit 110 may also first send the second control instruction to the controller 130 through the I2C bus to turn on the laser lamp 106, control the laser camera 102 to collect the speckle image through the I2C bus, then send the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104, and control the laser camera 102 to collect the infrared image through the I2C bus.
- time division multiplexing of the same I2C bus can reduce the complexity of the control circuit and reduce the cost.
- the first processing unit 110 can implement switching and control of the floodlight 104 and the laser lamp 106 through one controller 130, which can further reduce the complexity of the control circuit and reduce the cost.
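- A minimal sketch of this time-division multiplexed sequence, assuming the hypothetical command codes from the previous sketch and stubbed bus/camera primitives.

```python
# Time-division multiplexing sketch: one I2C bus serves the controller 130 and
# the laser camera 102 in turn. All helpers are stubs standing in for the bus
# transactions described above.
CMD_FLOODLIGHT_ON, CMD_LASER_ON = 0x01, 0x02

def send_control_instruction(cmd: int) -> None: ...  # stub: I2C write to controller 130
def pulse_pwm() -> None: ...                          # stub: PWM module 112 pulse
def capture_frame() -> bytes: return b""              # stub: trigger laser camera 102

def acquire(image_types: set) -> dict:
    frames = {}
    if "infrared" in image_types:
        send_control_instruction(CMD_FLOODLIGHT_ON)  # first control instruction
        pulse_pwm()                                  # illuminate floodlight 104
        frames["infrared"] = capture_frame()
    if {"speckle", "depth"} & image_types:
        send_control_instruction(CMD_LASER_ON)       # second control instruction
        pulse_pwm()                                  # illuminate laser lamp 106
        frames["speckle"] = capture_frame()
    return frames
```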
- step 002, in which the first processing unit 110 processes the target image and sends the processed target image to the second processing unit 120, includes the following steps:
- in the camera coordinate system, the line perpendicular to the imaging plane and passing through the center of the mirror is the Z axis. If the coordinates of an object in the camera coordinate system are (X, Y, Z), the Z value is the depth information of the object in the camera's imaging plane. If an application needs to obtain the depth information of a face, it is necessary to collect a depth image containing the face depth information.
- the first processing unit 110 can control the laser light 106 to be turned on through the I2C bus, and control the laser camera 102 to collect the speckle image through the I2C bus.
- a reference speckle image may be pre-stored in the first processing unit 110; the reference speckle image may carry reference depth information, and the depth information of each pixel included in the speckle image may be acquired according to the collected speckle image and the reference speckle image.
- the first processing unit 110 may sequentially select a pixel block of a preset size, for example 31 pixels × 31 pixels, centered on each pixel point included in the collected speckle image, and search the reference speckle image for a block that matches the selected pixel block.
- the first processing unit 110 may find, from the matched blocks, two points in the collected speckle image and the reference speckle image that lie on the same laser optical path.
- the speckle information of two points on the same laser beam path is consistent, and two points on the same laser beam path can be identified as corresponding pixel points.
- the depth information of the points on each laser light path is known.
- the first processing unit 110 may calculate the offset between two corresponding pixels of the same laser light path in the target speckle image and the reference speckle image, and calculate the depth information of each pixel in the acquired speckle image according to the offset.
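- A hedged NumPy sketch of the block matching and offset search described above; the 31 × 31 block size comes from the text, while the row-wise search range and the normalized cross-correlation similarity measure are illustrative assumptions. The caller is assumed to pass interior pixel coordinates.

```python
import numpy as np

def match_offset(speckle: np.ndarray, reference: np.ndarray,
                 y: int, x: int, half: int = 15, search: int = 30) -> int:
    """Offset P (in pixels) of the block around (y, x) relative to the reference image."""
    block = speckle[y - half:y + half + 1, x - half:x + half + 1].astype(np.float64)
    best_offset, best_score = 0, -np.inf
    for dx in range(-search, search + 1):
        cx = x + dx
        if cx - half < 0 or cx + half + 1 > reference.shape[1]:
            continue  # candidate block would fall outside the reference image
        cand = reference[y - half:y + half + 1, cx - half:cx + half + 1].astype(np.float64)
        b, c = block - block.mean(), cand - cand.mean()
        denom = np.sqrt((b * b).sum() * (c * c).sum())
        score = (b * c).sum() / denom if denom else -np.inf  # normalized cross-correlation
        if score > best_score:
            best_score, best_offset = score, dx
    return best_offset
```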
- the first processing unit 110 calculates the offset between the collected speckle image and the reference speckle image, and calculates the depth information of each pixel point included in the speckle image according to the offset; the calculation can be as shown in equation (2):
- Z_D = (L × f × Z_0) / (L × f − P × Z_0)    (2)
- where Z_D represents the depth information of a pixel point, that is, the depth value of the pixel point; L is the distance between the laser camera 102 and the laser (that is, the laser lamp 106); f is the focal length of the lens in the laser camera 102; Z_0 is the depth value from the reference plane to the laser camera 102 of the electronic device 100; and P is the offset between corresponding pixel points in the acquired speckle image and the reference speckle image, which can be obtained by multiplying the pixel offset between the target speckle image and the reference speckle image by the actual distance represented by one pixel. When the distance between the target object and the laser camera 102 is less than the distance between the reference plane and the laser camera 102, P is a negative value; otherwise, P is a positive value.
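- As a hedged sketch, equation (2) above can be transcribed directly into code, assuming all quantities are expressed in consistent units (here millimetres); the example values are illustrative only.

```python
def depth_from_offset(p_mm: float, l_mm: float, f_mm: float, z0_mm: float) -> float:
    """Z_D from equation (2): depth of a pixel given its speckle offset P.

    p_mm  - offset P (negative when the object is closer than the reference plane)
    l_mm  - distance L between laser camera 102 and laser lamp 106
    f_mm  - focal length f of the lens in laser camera 102
    z0_mm - depth Z_0 of the reference plane from laser camera 102
    """
    return (l_mm * f_mm * z0_mm) / (l_mm * f_mm - p_mm * z0_mm)

# Sanity check: with P = 0 the object lies on the reference plane, so Z_D == Z_0.
assert depth_from_offset(0.0, 40.0, 3.0, 400.0) == 400.0
```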
- generate a depth disparity map according to the reference depth information and the matching result, send the depth disparity map to the second processing unit 120, and process the depth disparity map by the second processing unit 120 to obtain a depth map.
- after the first processing unit 110 obtains the depth information of each pixel point included in the collected speckle image, it can perform correction processing on the collected speckle image, correcting image content offsets caused by the internal and external parameters of the laser camera 102 and the RGB camera 108.
- the first processing unit 110 may generate a depth disparity map according to the corrected speckle image and the depth value of each pixel in the speckle image, and send the depth disparity map to the second processing unit 120.
- the second processing unit 120 may obtain a depth map according to the depth disparity map, and the depth map may include depth information of each pixel.
- the second processing unit 120 may upload the depth map to the application, and the application may perform beauty, three-dimensional modeling, and the like according to the depth information of the face in the depth map.
- the second processing unit 120 can also perform the living body detection according to the depth information of the face in the depth map, and can prevent the collected face from being a two-dimensional plane face or the like.
- with the data processing method above, the depth information of the acquired image can be accurately obtained by the first processing unit 110; the data processing efficiency is high, and the accuracy of image processing is improved.
- the data processing method further includes the following steps:
- the temperature of the laser lamp 106 is collected every acquisition time period, and the reference speckle image corresponding to the temperature is acquired by the second processing unit 120.
- the electronic device 100 may be provided with a temperature sensor beside the laser lamp 106, and collect the temperature of the laser lamp 106 or the like by the temperature sensor.
- the second processing unit 120 may acquire the temperature of the laser light 106 collected by the temperature sensor every acquisition time period, wherein the collection time period may be set according to actual needs, for example, 3 seconds, 4 seconds, etc., but is not limited thereto. Since the camera module 101 may be deformed when the temperature of the laser lamp 106 changes, the internal and external parameters of the laser lamp 106 and the laser camera 102 are affected. The effects on the camera module 101 are different at different temperatures, and therefore, different temperatures may correspond to different reference speckle images.
- the second processing unit 120 may acquire a reference speckle image corresponding to the temperature, and process the speckle image acquired at the temperature according to the reference speckle image corresponding to the temperature to obtain a depth map.
- the second processing unit 120 may preset a plurality of different temperature intervals, such as 0°C (Celsius) to 30°C, 30°C to 60°C, and 60°C to 90°C, but is not limited thereto; different temperature intervals can correspond to different reference speckle images. After collecting the temperature, the second processing unit 120 may determine the temperature interval in which the temperature lies and acquire the reference speckle image corresponding to that interval.
- after the second processing unit 120 acquires the reference speckle image corresponding to the collected temperature, it may determine whether the reference speckle image acquired this time is consistent with the reference speckle image stored in the first processing unit 110. The reference speckle image may carry an image identifier, which may be composed of one or more of numbers, letters, characters, and the like.
- the second processing unit 120 may read the image identifier of the stored reference speckle image from the first processing unit 110, and identify the image identifier of the reference speckle image acquired this time and the image identifier read from the first processing unit 110. Compare.
- if the two image identifiers are inconsistent, it indicates that the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit 110, and the second processing unit 120 can write the reference speckle image acquired this time into the first processing unit 110.
- the first processing unit 110 may store the newly written reference speckle image and delete the previously stored reference speckle image.
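- The update logic above can be summarized in a short sketch; the interval bounds mirror the examples in this document, while the identifier scheme and the read/write helpers on the first processing unit are hypothetical.

```python
# Hedged sketch of the temperature-driven reference speckle update.
REFERENCE_BY_INTERVAL = {
    (0, 30): ("ref_0_30", b"..."),    # (image identifier, image data)
    (30, 60): ("ref_30_60", b"..."),
    (60, 90): ("ref_60_90", b"..."),
}

def select_reference(temperature_c: float):
    for (lo, hi), ref in REFERENCE_BY_INTERVAL.items():
        if lo <= temperature_c < hi:
            return ref
    raise ValueError("temperature outside configured intervals")

def update_if_changed(temperature_c: float, first_unit) -> None:
    ref_id, ref_image = select_reference(temperature_c)
    if first_unit.read_stored_id() != ref_id:          # identifiers inconsistent
        first_unit.write_reference(ref_id, ref_image)  # overwrite stored reference
```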
- the data processing method of the embodiment shown in FIG. 11 can obtain a reference speckle image corresponding to the temperature according to the temperature of the laser lamp 106, and reduce the influence of the temperature on the final output depth map, so that the obtained depth information is more accurate.
- the data processing method further includes step 0261.
- the step of transmitting the processed target image to the second processing unit includes step 0262.
- the image acquisition instruction is sent to the first processing unit 110 by the kernel running in the first operation mode in the second processing unit 120, where the first operation mode is a trusted operation environment.
- the second processing unit 120 in the electronic device 100 may include two operating modes: the first operating mode may be a TEE, which is a trusted execution environment with a high security level; the second operating mode may be a REE, which is a rich execution environment with a lower security level.
- the image acquisition instruction may be sent to the first processing unit 110 through the first operation mode.
- when the second processing unit 120 is a single-core CPU, the single core can be directly switched from the second operating mode to the first operating mode; when the second processing unit 120 is a multi-core CPU, one core can be switched from the second operating mode to the first operating mode while the other cores remain running in the second operating mode, and the image acquisition instruction is sent to the first processing unit 110 by the kernel running in the first operating mode.
- the first processing unit 110 sends the processed target image to the kernel running in the first operating mode in the second processing unit 120.
- the processed first image may be sent to the kernel running in the first operating mode, so that the first processing unit 110 always runs in a trusted operating environment, improving security.
- the second processing unit 120 may obtain a target image according to the processed first image in the kernel running in the first operation mode, and process the target image according to the requirements of the application. For example, the second processing unit 120 may perform face detection on the target image in the kernel running in the first operation mode.
- when performing face detection on the target image in the TEE environment, the second processing unit 120 can perform face recognition, face matching, and living body detection on the target image in a serial manner.
- the second processing unit 120 may first perform face recognition on the target image; when a face is recognized, it matches the face included in the target image with a pre-stored face to determine whether they are the same face; if they are, living body detection is performed on the face according to the target image to prevent the captured face from being a two-dimensional plane face. When no face is recognized, face matching and living body detection are not performed, which relieves the processing pressure of the second processing unit 120.
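- The serial order described above (recognize, then match, then living body detection) can be sketched as follows; the three detector functions are stubs, not the actual algorithms.

```python
def detect_face(image):                     # stub: face recognition, returns a face or None
    return None

def match_face(face, stored_face) -> bool:  # stub: face matching
    return False

def is_live(face, depth_image) -> bool:     # stub: living body detection
    return False

def process_face(target_image, depth_image, stored_face) -> bool:
    face = detect_face(target_image)
    if face is None:
        return False          # no face: skip matching and liveness checks
    if not match_face(face, stored_face):
        return False          # not the same face: skip liveness check
    return is_live(face, depth_image)
```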
- the image acquisition instruction is sent to the first processing unit 110 by the high-security kernel of the second processing unit 120, which ensures that the first processing unit 110 is in a high-security environment and improves data security.
- step 011, in which the first processing unit 110, upon receiving the image acquisition instruction sent by the second processing unit 120, controls at least one of the floodlight 104 and the laser light 106 to be turned on and controls the laser camera 102 to acquire the target image, includes step 031, step 032, and step 033.
- when an application requires face data, it may send a data acquisition request to the second processing unit 120, where the face data may include, but is not limited to, data for face verification in scenarios such as face unlocking and face payment, and face depth information.
- the second processing unit 120 may send an image acquisition instruction to the first processing unit 110, where the first processing unit 110 may be an MCU module, and the second processing unit 120 may be a CPU module.
- the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, and may determine the type of the acquired image according to the image acquisition instruction, where the image type may be one or more of an infrared image, a speckle image, and a depth image.
- the image type may be determined according to the face data required by the application; after receiving the data acquisition request, the second processing unit 120 may determine the image type according to the data acquisition request and send an image acquisition instruction including the image type to the first processing unit 110.
- for example, if data for face unlocking is required, the second processing unit 120 may determine that the image type is an infrared image and a speckle image; if face depth information is required, the image type may be determined as a depth image, and so on, but is not limited thereto.
- if the image type is the first type, turn on the floodlight 104 in the camera module 101, send a pulse to the first controller 131 through the first pulse width modulation (PWM) module 1121 to illuminate the floodlight 104, and then collect a target image corresponding to the first type through the laser camera 102 in the camera module 101.
- the first processing unit 110 may send a control instruction to the first controller 131, where the control instruction may be used to enable the floodlight 104 in the camera module 101. The first processing unit 110 may then send a pulse signal for controlling the floodlight 104 to the first controller 131 through the first PWM module 1121 to illuminate the floodlight 104.
- the first PWM module 1121 can control the illumination of the floodlight 104 by continuously sending pulse signals to the floodlight 104 at a certain voltage amplitude and a certain time interval.
- the floodlight 104 can be a surface light source that uniformly illuminates in all directions. When the floodlight 104 is illuminated, infrared light can be emitted, and the laser camera 102 can collect infrared light reflected by the human face to obtain an infrared image.
- if the image type is the second type, turn on the laser light 106 in the camera module 101, send a pulse to the second controller 132 through the second PWM module 1122 to illuminate the laser light 106, and then collect a target image corresponding to the second type through the laser camera 102 in the camera module 101.
- the first processing unit 110 may send a control instruction to the second controller 132, and the control instruction may be used to enable the laser light 106 in the camera module 101.
- the first processing unit 110 can send a pulse signal to the second controller 132 for controlling the laser lamp 106 through the second PWM module 1122 to illuminate the laser lamp 106.
- the second PWM module 1122 can control the illumination of the laser lamp 106 by continuously sending pulse signals to the laser lamp 106 at a certain voltage amplitude and a certain time interval.
- when the laser lamp 106 is illuminated, the emitted laser light can be diffracted by a lens and a DOE (diffractive optical element) to produce a pattern with speckle particles, which is projected onto the target object. Because the distances between the points on the target object and the electronic device 100 differ, the speckle pattern shifts, and the laser camera 102 collects the shifted pattern to obtain a speckle image.
- the target image is processed by the first processing unit 110, and the processed target image is sent to the second processing unit 120.
- the laser camera 102 can transmit the acquired target image to the first processing unit 110, and the first processing unit 110 can process the target image, wherein the target image can include an infrared image, a speckle image, and the like.
- the target image corresponding to the image type may be acquired according to the determined image type, and the target image is processed correspondingly.
- if the image type is an infrared image, the first processing unit 110 may send a pulse to the first controller 131 through the first PWM module 1121 to illuminate the floodlight 104, collect the infrared image through the laser camera 102, and then process the infrared image to obtain an infrared disparity map.
- the first processing unit 110 may send a pulse to the second controller 132 through the second PWM module 1122 to illuminate the laser lamp 106, collect the speckle image through the laser camera 102, and then process the speckle image to obtain a speckle disparity map.
- the first processing unit 110 may acquire a speckle image and process the acquired speckle image to obtain a depth disparity map.
- the first processing unit 110 may perform correction processing on the target image, where the correction processing refers to correcting image content offsets caused by the internal and external parameters of the laser camera 102 and the RGB camera 108, for example offsets caused by the deflection angle of the laser camera 102 and by the placement positions of the laser camera 102 and the RGB camera 108.
- after the correction processing, a disparity map of the target image can be obtained: the infrared image is corrected to obtain an infrared disparity map, and the speckle image is corrected to obtain a speckle disparity map or a depth disparity map, and the like.
- the first processing unit 110 performs a correction process on the target image, and can prevent a situation in which an image finally presented on the screen of the electronic device 100 is ghosted.
- the first processing unit 110 processes the target image, and may transmit the processed target image to the second processing unit 120.
- the second processing unit 120 can obtain a desired image according to the processed target image, such as an infrared image, a speckle image, a depth image, and the like.
- the second processing unit 120 can process the desired image according to the needs of the application.
- the second processing unit 120 may perform face detection on the obtained desired image or the like, wherein the face detection may include face recognition, face matching, and living body detection.
- Face recognition refers to the recognition of whether there is a face in the desired image.
- Face matching refers to matching a face in a desired image with a pre-existing face.
- living body detection refers to verifying whether the face in the desired image is biologically active. If the application needs to obtain the depth information of the face, the generated depth image can be uploaded to the application, and the application can perform beauty processing, three-dimensional modeling, and the like according to the received depth image.
- in the data processing method above, the image type is determined according to the image acquisition instruction. If the image type is the first type, a pulse is sent to the first controller 131 through the first PWM module 1121 to illuminate the floodlight 104, and the target image corresponding to the first type is collected through the laser camera 102. If the image type is the second type, a pulse is sent to the second controller 132 through the second PWM module 1122 to illuminate the laser light 106, and the target image corresponding to the second type is collected through the laser camera 102. Since the floodlight 104 and the laser light 106 are controlled separately by the two PWM modules, real-time switching is not required, which reduces the data processing complexity and relieves the processing pressure of the first processing unit 110.
- FIG. 14 is an application scenario diagram of the data processing method of the embodiment shown in FIG. As shown in FIG. 14, the data processing method can be applied to an electronic device 100.
- the electronic device 100 can include a laser camera 102, a floodlight 104, a laser light 106, a first processing unit 110, a second processing unit 120, a first controller 131, and a second controller 132.
- the first processing unit 110 can be connected to the laser camera 102 and the second processing unit 120, wherein the first processing unit 110 can be an MCU (Microcontroller Unit) module or the like, and the second processing unit 120 can be a CPU (Central Processing Unit) module or the like.
- the first controller 131 is connected to the floodlight 104, and the second controller 132 is connected to the laser light 106.
- the first processing unit 110 may include a first PWM (Pulse Width Modulation) module 1121 and a second PWM module 1122.
- the first processing unit 110 is connected to the first controller 131 through the first PWM module 1121, and is connected to the second controller 132 through the second PWM module 1122.
- the image type is determined according to the image acquisition instruction. If the image type is the first type, the floodlight 104 is turned on, a pulse is sent to the first controller 131 through the first PWM module 1121 to illuminate the floodlight 104, and the target image corresponding to the first type is then collected by the laser camera 102. If the image type is the second type, the laser light 106 is turned on, a pulse is sent to the second controller 132 through the second PWM module 1122 to illuminate the laser light 106, and the target image corresponding to the second type is then collected by the laser camera 102. The first processing unit 110 may process the target image acquired by the laser camera 102 and transmit the processed target image to the second processing unit 120.
- FIG. 4 is another application scenario diagram of the data processing method of the embodiment shown in FIG.
- the electronic device 100 can include a camera module 101 , a second processing unit 120 , and a first processing unit 110 .
- the second processing unit 120 can be a CPU module.
- the first processing unit 110 can be an MCU module or the like.
- the first processing unit 110 is connected between the second processing unit 120 and the camera module 101.
- the first processing unit 110 can control the laser camera 102, the floodlight 104, and the laser light 106 in the camera module 101, and the second processing unit 120 can control the RGB camera 108 in the camera module 101.
- the camera module 101 includes a laser camera 102, a floodlight 104, an RGB camera 108, and a laser light 106.
- the laser camera 102 can be an infrared camera for acquiring infrared images.
- the floodlight 104 is a surface light source that emits infrared light.
- the laser lamp 106 is a point light source capable of emitting laser light that forms a pattern. When the floodlight 104 emits infrared light, the laser camera 102 can acquire an infrared image according to the reflected light; when the laser light 106 emits laser light, the laser camera 102 can acquire a speckle image according to the reflected light.
- the speckle image is an image in which the pattern of the laser light emitted by the laser lamp 106 is deformed after being reflected.
- the second processing unit 120 may include a CPU core running in a TEE (Trusted Execution Environment) environment and a CPU core running in a REE (Rich Execution Environment) environment.
- the TEE environment and the REE environment are operating modes of an ARM (Advanced RISC Machines) module.
- the security level of the TEE environment is high, and there is only one CPU core in the second processing unit 120 that can run in the TEE environment at the same time.
- operations with a high security level in the electronic device 100 need to be performed in the CPU core running in the TEE environment, while operations with a lower security level can be performed in the CPU core running in the REE environment.
- the first processing unit 110 includes a PWM module 112, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) interface 114, a RAM (Random Access Memory) module 116, and a depth engine 118.
- the PWM module 112 can include a first PWM module 1121 and a second PWM module 1122.
- the first PWM module 1121 can be connected to the controller 131 of the floodlight 104 to control the floodlight 104 to be turned on and to send a pulse to the floodlight 104.
- the second PWM module 1122 can be coupled to the controller 132 of the laser light 106 to control the laser light 106 to be turned on and to send a pulse to the laser light 106.
- the SPI/I2C interface 114 is configured to receive an image acquisition instruction sent by the second processing unit 120.
- the depth engine 118 can process the speckle image to obtain a depth disparity map.
- the image acquisition instruction can be sent to the first processing unit 110 by the CPU core running in the TEE environment.
- the first processing unit 110 may emit a pulse wave through the first PWM module 1121 in the PWM module 112 to illuminate the floodlight 104 and acquire an infrared image through the laser camera 102, and may emit a pulse wave through the second PWM module 1122 in the PWM module 112 to illuminate the laser lamp 106 and collect a speckle image through the laser camera 102.
- the camera module 101 can transmit the collected infrared image and the speckle image to the first processing unit 110.
- the first processing unit 110 may process the received infrared image to obtain an infrared parallax map, and process the received speckle image to obtain a speckle disparity map or a depth disparity map.
- the processing of the infrared image and the speckle image by the first processing unit 110 refers to correcting the infrared image or the speckle image, and removing the influence of the internal and external parameters in the camera module 101 on the image.
- the first processing unit 110 can be set to different modes, and images output by different modes are different.
- when the first processing unit 110 is set to the speckle map mode, the first processing unit 110 processes the speckle image to obtain a speckle disparity map, from which a target speckle image can be obtained; when the first processing unit 110 is set to the depth map mode, the first processing unit 110 processes the speckle image to obtain a depth disparity map, from which a depth image can be obtained, the depth image referring to an image with depth information.
- the first processing unit 110 may send the infrared disparity map and the speckle disparity map to the second processing unit 120, and the first processing unit 110 may also send the infrared disparity map and the depth disparity map to the second processing unit 120.
- the second processing unit 120 may acquire a target infrared image according to the infrared disparity map, and acquire a depth image according to the depth disparity map. Further, the second processing unit 120 may perform face recognition, face matching, living body detection, and acquiring depth information of the detected face according to the target infrared image and the depth image.
- Communication between the first processing unit 110 and the second processing unit 120 is through a fixed security interface to ensure the security of the transmitted data.
- the data sent by the second processing unit 120 to the first processing unit 110 passes through the SECURE SPI/I2C 130, and the data sent by the first processing unit 110 to the second processing unit 120 passes through the SECURE MIPI (Mobile Industry Processor Interface) 140.
- the first processing unit 110 may also acquire the target infrared image according to the infrared disparity map, calculate the acquired depth image according to the depth disparity map, and then send the target infrared image and the depth image to the second processing unit 120.
- the method further includes: when detecting that the camera module 101 is activated, the second processing unit 120 configures the floodlight 104 and the laser lamp 106 respectively through the I2C bus.
- the camera module 101 can be activated and images can be acquired by the camera module 101.
- the second processing unit 120 can respectively configure the floodlight 104 and the laser light 106 through the I2C bus, where the I2C bus can achieve data transfer between the devices connected to it through one data line and one clock line.
- the second processing unit 120 may first read the configuration file and configure the floodlight 104 and the laser light 106 according to the parameters included in the configuration file.
- the configuration file may record parameters such as the emission power and the emission current of the floodlight 104 and the laser lamp 106, but is not limited thereto, and may be other parameters.
- the second processing unit 120 can configure parameters such as the emission power, the emission current, and the like of the floodlight 104 and the laser lamp 106 according to the parameters in the configuration file.
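- A minimal sketch of this start-up configuration follows, assuming a JSON configuration file and assumed I2C addresses and registers; none of these specifics appear in the disclosure.

```python
# Hypothetical sketch: the second processing unit reads emission power and
# emission current parameters from a configuration file and writes them to
# the floodlight and laser lamp over the I2C bus.
import json
from smbus2 import SMBus

DEVICES = {"floodlight": 0x41, "laser": 0x42}  # assumed I2C addresses
REG_POWER, REG_CURRENT = 0x10, 0x11            # assumed registers

def configure_lights(config_path: str = "light_config.json") -> None:
    with open(config_path) as fh:
        cfg = json.load(fh)  # e.g. {"floodlight": {"power": 5, "current": 20}, ...}
    with SMBus(1) as bus:
        for name, addr in DEVICES.items():
            bus.write_byte_data(addr, REG_POWER, cfg[name]["power"])
            bus.write_byte_data(addr, REG_CURRENT, cfg[name]["current"])
```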
- the second processing unit 120 can be connected to the floodlight 104 and the laser lamp 106 through the same I2C bus; that is, the floodlight 104, the laser lamp 106, and the second processing unit 120 can be connected to the same I2C bus.
- when the second processing unit 120 configures the floodlight 104 and the laser light 106, it can first address the floodlight 104 through the I2C bus and configure it, and then address the laser light 106 through the I2C bus and configure it; alternatively, the second processing unit 120 may first address the laser lamp 106 through the I2C bus and configure it, and then address the floodlight 104 through the I2C bus and configure it.
- the complexity of the control circuit can be reduced, resources are saved, and cost is reduced.
- the second processing unit 120 can also be connected to the floodlight 104 and the laser lamp 106 through two I2C buses: the second processing unit 120 can be connected to the floodlight 104 through one I2C bus and to the laser lamp 106 through another I2C bus.
- when the second processing unit 120 configures the floodlight 104 and the laser light 106 in this case, it can address the floodlight 104 through the I2C bus connected to the floodlight 104 and configure it, while addressing the laser lamp 106 through the I2C bus connected to the laser lamp 106 and configuring it.
- the floodlight 104 and the laser light 106 can be configured in parallel to improve the data processing speed.
- FIG. 15 is a schematic diagram of an example in which the second processing unit 120 is connected to the floodlight 104 and the laser lamp 106 via an I2C bus.
- the second processing unit 120 connects the floodlight 104 and the laser lamp 106 via the same I2C bus.
- in another example, the second processing unit 120 is connected to the floodlight 104 and the laser lamp 106 via two I2C buses, respectively: the second processing unit 120 can be connected to the floodlight 104 through one I2C bus and to the laser light 106 through another I2C bus.
- in the data processing method above, the second processing unit 120 can configure the floodlight 104 and the laser light 106 through the I2C bus when the camera module 101 is started, so that image collection can be controlled more accurately, improving data processing efficiency.
- the timing at which the first PWM module 1121 sends a pulse to the first controller 131 may differ from the timing at which the second PWM module 1122 sends a pulse to the second controller 132, and the time difference between the two is less than a time threshold.
- the first processing unit 110 determines an image type according to an image acquisition instruction, and the image type may include at least two types, for example, the image type may include the first type and the second type at the same time.
- when the image type contains an infrared image and a speckle image, or contains an infrared image and a depth image, it is necessary to acquire both an infrared image and a speckle image.
- the first processing unit 110 can simultaneously send a pulse to the first controller 131 through the first PWM module 1121, and send a pulse to the second controller 132 through the second PWM module 1122 to illuminate the floodlight 104 and the laser lamp 106.
- the timing at which the first PWM module 1121 sends a pulse to the first controller 131 and the timing at which the second PWM module 1122 sends a pulse to the second controller 132 may be different, thereby illuminating the floodlight 104 and the laser lamp 106 at different times.
- the first processing unit 110 may acquire an infrared image through the laser camera 102 at the timing when the first PWM module 1121 sends a pulse to the first controller 131, and may collect a speckle image through the laser camera 102 at the timing when the second PWM module 1122 sends a pulse to the second controller 132.
- when the time interval between the time when the first PWM module 1121 sends a pulse to the first controller 131 and the time when the second PWM module 1122 sends a pulse to the second controller 132 is less than the time threshold, the laser camera 102 can collect the infrared image and the speckle image within a time interval less than the time threshold, so that the image contents of the acquired infrared image and speckle image are more consistent, facilitating subsequent processing such as face detection.
- the time threshold can be set according to actual needs, such as 20 milliseconds, 30 milliseconds, and the like.
- the first processing unit 110 can collect infrared images and speckle images at different times through the laser camera 102 while ensuring that the image contents of the collected infrared image and speckle image are consistent, improving the accuracy of subsequent face detection.
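- The timing constraint above can be checked in a few lines; the 20 ms default mirrors the example threshold in this document, and the timestamp capture points are assumptions.

```python
# Small sketch checking that the two PWM pulses are issued within the time
# threshold, so the infrared and speckle frames show nearly the same scene.
import time

THRESHOLD_MS = 20.0  # e.g. 20 milliseconds, settable according to actual needs

def within_threshold(t_floodlight_pulse: float, t_laser_pulse: float,
                     threshold_ms: float = THRESHOLD_MS) -> bool:
    return abs(t_laser_pulse - t_floodlight_pulse) * 1000.0 <= threshold_ms

t0 = time.monotonic()  # assumed timestamp: pulse to first controller 131
t1 = time.monotonic()  # assumed timestamp: pulse to second controller 132
assert within_threshold(t0, t1)
```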
- step 002, in which the first processing unit 110 processes the target image and sends the processed target image to the second processing unit 120, includes the following steps:
- in the camera coordinate system, the line perpendicular to the imaging plane and passing through the center of the mirror is the Z axis. If the coordinates of an object in the camera coordinate system are (X, Y, Z), the Z value is the depth information of the object in the camera's imaging plane. If an application needs to obtain the depth information of a face, it is necessary to collect a depth image containing the face depth information.
- the first processing unit 110 can control the laser light 106 to be turned on through the I2C bus, and control the laser camera 102 to collect the speckle image through the I2C bus.
- a reference speckle image may be pre-stored in the first processing unit 110; the reference speckle image may carry reference depth information, and the depth information of each pixel included in the speckle image may be acquired according to the collected speckle image and the reference speckle image.
- the first processing unit 110 may sequentially select a pixel block of a preset size, for example 31 pixels × 31 pixels, centered on each pixel point included in the collected speckle image, and search the reference speckle image for a block that matches the selected pixel block.
- the first processing unit 110 may find, from the matched blocks, two points in the collected speckle image and the reference speckle image that lie on the same laser optical path.
- the speckle information of two points on the same laser beam path is consistent, and two points on the same laser beam path can be identified as corresponding pixel points.
- the depth information of the points on each laser light path is known.
- the first processing unit 110 may calculate the offset between two corresponding pixels of the same laser light path in the target speckle image and the reference speckle image, and calculate the depth information of each pixel in the acquired speckle image according to the offset.
- the first processing unit 110 calculates the offset between the collected speckle image and the reference speckle image, and calculates the depth information of each pixel point included in the speckle image according to the offset; the calculation can be as shown in equation (2):
- Z_D = (L × f × Z_0) / (L × f − P × Z_0)    (2)
- where Z_D represents the depth information of a pixel point, that is, the depth value of the pixel point; L is the distance between the laser camera 102 and the laser (that is, the laser lamp 106); f is the focal length of the lens in the laser camera 102; Z_0 is the depth value from the reference plane to the laser camera 102 of the electronic device 100; and P is the offset between corresponding pixel points in the acquired speckle image and the reference speckle image, which can be obtained by multiplying the pixel offset between the target speckle image and the reference speckle image by the actual distance represented by one pixel. When the distance between the target object and the laser camera 102 is less than the distance between the reference plane and the laser camera 102, P is a negative value; otherwise, P is a positive value.
- after the first processing unit 110 obtains the depth information of each pixel point included in the collected speckle image, it can perform correction processing on the collected speckle image, correcting image content offsets caused by the internal and external parameters of the laser camera 102 and the RGB camera 108.
- the first processing unit 110 may generate a depth disparity map according to the corrected speckle image and the depth value of each pixel in the speckle image, and send the depth disparity map to the second processing unit 120.
- the second processing unit 120 may obtain a depth map according to the depth disparity map, and the depth map may include depth information of each pixel.
- the second processing unit 120 may upload the depth map to the application, and the application may perform beauty, three-dimensional modeling, and the like according to the depth information of the face in the depth map.
- the second processing unit 120 can also perform the living body detection according to the depth information of the face in the depth map, and can prevent the collected face from being a two-dimensional plane face or the like.
- the data processing method of the embodiment shown in FIG. 16 can accurately obtain the depth information of the collected image through the first processing unit 110, and the data processing efficiency is high, and the accuracy of the image processing is improved.
- the data processing method further includes the following steps:
- the temperature of the laser lamp 106 is collected every acquisition time period, and the reference speckle image corresponding to the temperature is acquired by the second processing unit 120.
- the electronic device 100 may be provided with a temperature sensor beside the laser lamp 106 and collect the temperature of the laser lamp 106 through the temperature sensor.
- the second processing unit 120 may acquire the temperature of the laser light 106 collected by the temperature sensor every acquisition time period, wherein the collection time period may be set according to actual needs, for example, 3 seconds, 4 seconds, etc., but is not limited thereto. Since the camera module 101 may be deformed when the temperature of the laser lamp 106 changes, the internal and external parameters of the laser lamp 106 and the laser camera 102 are affected. The effects on the camera module 101 are different at different temperatures, and therefore, different reference speckle images can be corresponding at different temperatures.
- the second processing unit 120 may acquire a reference speckle image corresponding to the temperature, and process the speckle image acquired at the temperature according to the reference speckle image corresponding to the temperature to obtain a depth map.
- the second processing unit 120 may preset a plurality of different temperature intervals, such as 0°C (Celsius) to 30°C, 30°C to 60°C, and 60°C to 90°C, but is not limited thereto; different temperature intervals can correspond to different reference speckle images.
- after the second processing unit 120 collects the temperature, the temperature interval in which the temperature lies may be determined, and a reference speckle image corresponding to that interval is acquired.
- after the second processing unit 120 acquires the reference speckle image corresponding to the collected temperature, it may determine whether the reference speckle image acquired this time is consistent with the reference speckle image stored in the first processing unit 110. The reference speckle image may carry an image identifier, which may be composed of one or more of numbers, letters, characters, and the like.
- the second processing unit 120 may read the image identifier of the stored reference speckle image from the first processing unit 110, and identify the image identifier of the reference speckle image acquired this time and the image identifier read from the first processing unit 110. Compare.
- if the two image identifiers are inconsistent, it indicates that the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit 110, and the second processing unit 120 can write the reference speckle image acquired this time into the first processing unit 110.
- the first processing unit 110 may store the newly written reference speckle image and delete the previously stored reference speckle image.
- the data processing method of the embodiment shown in FIG. 17 can obtain a reference speckle image corresponding to the temperature according to the temperature of the laser lamp 106, and reduce the influence of the temperature on the final output depth map, so that the depth information obtained is more accurate.
- the data processing method provided by the application includes the following steps:
- the target image is processed by the first processing unit 110, and the processed target image is sent to the second processing unit 120.
- step 001 includes steps 011 and 012.
- Step 011: when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, it sends a control instruction to the controller 130 through the bidirectional two-wire synchronous serial I2C bus, and the control instruction is used to control at least one of the floodlight 104 and the laser light 106 to be turned on.
- step 001 includes: determining the type of the image to be acquired according to the image acquisition instruction; if the image type is the first type, the first processing unit 110 sends a first control instruction to the controller 130 through the I2C bus, where the first control instruction is used to instruct the controller 130 to turn on the floodlight 104; if the image type is the second type, the first processing unit 110 sends a second control instruction to the controller 130 through the I2C bus, where the second control instruction is used to instruct the controller 130 to turn on the laser light 106.
- the data processing method further includes: when the image type includes the first type and the second type, the first processing unit 110 sends the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104; after acquiring the target image corresponding to the first type through the laser camera 102, the second control instruction is sent to the controller 130 through the I2C bus to turn on the laser light 106.
- the data processing method further includes: when the image type includes the first type and the second type, the first processing unit 110 sends the second control instruction to the controller 130 through the I2C bus to turn on the laser lamp 106; after acquiring the target image corresponding to the second type through the laser camera 102, the first control instruction is sent to the controller 130 through the I2C bus to turn on the floodlight 104.
- a time interval between a time when the first processing unit 110 sends the first control instruction and a time when the second control instruction is sent is less than a time threshold.
- a pulse is transmitted to the controller 130 by the pulse width modulation PWM module 112 to illuminate the turned-on one of the floodlight 104 and the laser light 106, and the target image is acquired by the laser camera 102.
- the first processing unit 110, the controller 130, and the laser camera 102 are connected to the same I2C bus; the step of acquiring the target image by the laser camera comprises: controlling the laser camera 102 to acquire the target image through the I2C bus.
- Step 002 the target image is processed by the first processing unit 110, and the processed target image is sent to the second processing unit 120.
- the target image includes a speckle image
- step 002 includes: acquiring a stored reference speckle image, the reference speckle image carrying reference depth information; matching the reference speckle image with the speckle image to obtain a matching result; generating a depth disparity map according to the reference depth information and the matching result; sending the depth disparity map to the second processing unit 120; and processing the depth disparity map by the second processing unit 120 to obtain a depth map.
- in the data processing method above, the control instruction is sent to the controller 130 through the I2C bus to control at least one of the floodlight 104 and the laser light 106 to be turned on; controlling the floodlight 104 and the laser lamp 106 through the controller 130 reduces the complexity of the control circuit and saves cost.
- step 001 includes steps 021 and 022.
- the data processing method of the embodiment is applied to the electronic device 100.
- the electronic device 100 includes a camera module 101, a first processing unit 110, and a second processing unit 120; the first processing unit 110 is connected to the second processing unit 120 and the camera module 101.
- the camera module 101 includes a laser camera 102, a floodlight 104, and a laser lamp 106.
- the laser camera 102, the floodlight 104, the laser lamp 106, and the first processing unit 110 are connected to the same bidirectional two-wire synchronous serial I2C bus.
- Step 021 When the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, at least one of the floodlight 104 and the laser lamp 106 is turned on by the I2C bus.
- the electronic device 100 further includes a controller 130 for controlling the floodlight 104 and the laser light 106, and the controller 130 is connected to the I2C bus.
- Step 021 includes: determining the type of the image to be acquired according to the image acquisition instruction; if the image type is an infrared image, the first processing unit 110 sends a first control instruction to the controller 130 through the I2C bus, where the first control instruction is used to instruct the controller 130 to turn on the floodlight 104; if the image type is a speckle image or a depth image, the first processing unit 110 sends a second control instruction to the controller 130 through the I2C bus, where the second control instruction is used to instruct the controller 130 to turn on the laser light 106.
- the data processing method further includes: when the image type includes the infrared image and the speckle image, or includes the infrared image and the depth image, the first processing unit 110 sends the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104 and controls the laser camera 102 through the I2C bus to collect the infrared image, and then sends the second control instruction to the controller 130 through the I2C bus to turn on the laser lamp 106 and controls the laser camera 102 through the I2C bus to collect a speckle image.
- the data processing method further includes: when the image type includes the infrared image and the speckle image, or includes the infrared image and the depth image, the first processing unit 110 sends the second control instruction to the controller 130 through the I2C bus to turn on the laser lamp 106 and controls the laser camera 102 through the I2C bus to collect the speckle image, and then sends the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104 and controls the laser camera 102 through the I2C bus to collect the infrared image.
- Step 022: the first processing unit 110 controls the laser camera 102 through the I2C bus to acquire a target image.
- Step 002: the target image is processed by the first processing unit 110, and the processed target image is sent to the second processing unit 120.
- Step 002 includes: acquiring a stored reference speckle image, the reference speckle image carrying reference depth information; matching the reference speckle image with the speckle image to obtain a matching result; generating a depth disparity map according to the reference depth information and the matching result; and sending the depth disparity map to the second processing unit 120, which processes the depth disparity map to obtain a depth map.
- the data processing method further includes: collecting the temperature of the laser lamp 106 at each acquisition time interval and acquiring, by the second processing unit 120, a reference speckle image corresponding to the temperature; when the currently acquired reference speckle image is inconsistent with the reference speckle image stored in the first processing unit 110, the second processing unit 120 writes the currently acquired reference speckle image into the first processing unit 110 (a sketch of this update loop follows).
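The update loop described above might look like the following Python sketch. The sampling period, the temperature-keyed reference store, and the callback names are assumptions, not values from this disclosure.

```python
# Illustrative sketch of the temperature-driven reference update.
import time

def closest_reference(temperature: float, references: dict) -> bytes:
    """Pick the stored reference speckle image calibrated nearest to the
    current laser lamp temperature."""
    calib_temp = min(references, key=lambda t: abs(t - temperature))
    return references[calib_temp]

def temperature_update(read_temp, references, write_to_first_unit,
                       period_s=1.0, cycles=10):
    """Every acquisition interval, re-select the reference and write it to
    the first processing unit only when it differs from the stored one."""
    stored = None
    for _ in range(cycles):
        ref = closest_reference(read_temp(), references)
        if ref != stored:             # inconsistent with the stored reference
            write_to_first_unit(ref)  # second unit writes into first unit
            stored = ref
        time.sleep(period_s)

# Usage with hypothetical calibration temperatures:
refs = {20.0: b"ref@20C", 35.0: b"ref@35C"}
temperature_update(lambda: 33.0, refs, lambda r: print("wrote", r),
                   period_s=0.0, cycles=3)
```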
- the data processing method further includes: sending an image acquisition instruction to the first processing unit 110 through a kernel running in the first operating mode in the second processing unit 120, where the first operating mode is a trusted execution environment.
- Step 002 includes: the first processing unit 110 sends the processed target image to the kernel running in the first operating mode in the second processing unit 120.
- the laser camera 102, the floodlight 104, the laser lamp 106, and the first processing unit 110 are connected to the same I2C bus, and the first processing unit 110 turns on at least one of the floodlight 104 and the laser lamp 106 through the I2C bus and controls the laser camera 102 through the same I2C bus to collect the target image. Controlling the floodlight 104, the laser lamp 106, and the laser camera 102 over one multiplexed I2C bus reduces the complexity of the control circuit and reduces cost (see the sketch below).
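The following sketch illustrates the multiplexing idea: because I2C selects devices by address, the controller and the laser camera can share the same two wires and simply be addressed at different moments. The addresses and register bytes are invented for illustration.

```python
# Sketch of multiplexing one I2C bus across the controller and the laser
# camera. Device addresses are invented; a real design would use the
# parts' datasheet addresses.

CONTROLLER_ADDR = 0x40   # assumed 7-bit address of controller 130
CAMERA_ADDR = 0x36       # assumed 7-bit address of laser camera 102

class SharedI2CBus:
    """One data line + one clock line; devices are selected by address,
    so transactions to different devices simply happen at different times."""
    def write(self, addr: int, payload: bytes) -> None:
        print(f"addr=0x{addr:02x} payload={payload.hex()}")

def collect_over_shared_bus(bus: SharedI2CBus) -> None:
    bus.write(CONTROLLER_ADDR, b"\x01")   # turn a light source on
    bus.write(CAMERA_ADDR, b"\x10\x01")   # assumed 'start capture' register write
    # No second bus is needed: the same two wires carried both transactions.

collect_over_shared_bus(SharedI2CBus())
```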
- Step 001 includes Step 031, Step 032, and Step 033.
- Step 031: when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, the image type is determined according to the image acquisition instruction.
- the second processing unit 120 is connected to the floodlight 104 and the laser lamp 106 respectively through a bidirectional two-wire synchronous serial I2C bus; before Step 031, the method further includes: when detecting that the camera module 101 is activated, the second processing unit 120 configures the floodlight 104 and the laser lamp 106 respectively through the I2C bus.
- in some examples, the second processing unit 120 is connected to the floodlight 104 and the laser lamp 106 through the same I2C bus.
- alternatively, the second processing unit 120 is connected to the floodlight 104 through one I2C bus and to the laser lamp 106 through another I2C bus.
- Step 032: if the image type is the first type, the floodlight 104 in the camera module 101 is turned on, a pulse is sent to the first controller 131 through the first pulse width modulation (PWM) module 1121 to light the floodlight 104, and the target image corresponding to the first type is then collected through the laser camera 102 in the camera module 101.
- Step 033: if the image type is the second type, the laser lamp 106 in the camera module 101 is turned on, a pulse is sent to the second controller 132 through the second PWM module 1122 to light the laser lamp 106, and the target image corresponding to the second type is then collected through the laser camera 102 in the camera module 101.
- the moment at which the first PWM module 1121 sends a pulse to the first controller 131 is different from the moment at which the second PWM module 1122 sends a pulse to the second controller 132, and the time interval between these two moments is less than a time threshold (see the timing sketch below).
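The timing constraint can be pictured with the following sketch. The 30 ms threshold and 10 ms gap are assumed example values, consistent with the thresholds mentioned later in the description (for example 20 or 30 milliseconds), not values fixed by this disclosure.

```python
# Sketch: the two PWM pulses are sent at different moments, but within a
# configurable threshold, so both captures see nearly the same scene.
import time

TIME_THRESHOLD_S = 0.030  # assumed threshold (e.g. 30 ms)

def staggered_capture(pwm1_pulse, pwm2_pulse, capture, gap_s=0.010):
    assert gap_s < TIME_THRESHOLD_S, "pulses must fall within the threshold"
    pwm1_pulse()                      # first PWM module lights the floodlight
    infrared = capture("infrared")
    time.sleep(gap_s)                 # different moment, below the threshold
    pwm2_pulse()                      # second PWM module lights the laser lamp
    speckle = capture("speckle")
    return infrared, speckle

ir, sp = staggered_capture(lambda: print("pwm1"), lambda: print("pwm2"),
                           lambda kind: f"{kind} image")
print(ir, sp)
```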
- Step 002: the target image is processed by the first processing unit 110, and the processed target image is sent to the second processing unit 120.
- the target image includes a speckle image.
- Step 002 includes: acquiring a stored reference speckle image, the reference speckle image carrying reference depth information; matching the reference speckle image with the speckle image to obtain a matching result; generating a depth disparity map according to the reference depth information and the matching result; and sending the depth disparity map to the second processing unit 120, which processes the depth disparity map to obtain a depth map.
- the data processing method further includes: collecting the temperature of the laser lamp 106 at each acquisition time interval and acquiring, by the second processing unit 120, a reference speckle image corresponding to the temperature; when the currently acquired reference speckle image is inconsistent with the reference speckle image stored in the first processing unit 110, the second processing unit 120 writes the currently acquired reference speckle image into the first processing unit 110.
- the image type is determined according to the image acquisition instruction. If the image type is the first type, the floodlight 104 is lit through the first PWM module 1121 and the target image corresponding to the first type is collected through the laser camera 102; if the image type is the second type, the laser lamp 106 is lit by sending a pulse to the second controller 132 through the second PWM module 1122 and the target image corresponding to the second type is collected through the laser camera 102. Because the floodlight 104 and the laser lamp 106 are controlled through two separate PWM modules, no real-time switching is required, which reduces data processing complexity and relieves the processing pressure of the first processing unit 110.
- although the steps in the various flowcharts described above are displayed sequentially as indicated by the arrows, these steps are not necessarily performed in the order indicated. Unless explicitly stated herein, the execution order of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in the flowcharts above may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different moments, and their execution order is not necessarily sequential; they may be performed in turn or alternately with at least a portion of other steps, or of the sub-steps or stages of other steps.
- the electronic device 100 includes a processor 20, a memory 30, a display screen 40, and an input device 50 that are coupled by a system bus 60.
- the memory 30 may include a non-volatile storage medium 32 and an internal memory.
- the non-volatile storage medium 32 of the electronic device 100 stores an operating system and a computer program, which is executed by the processor 20 to implement any of the data processing methods provided in the embodiments of the present application.
- the processor 20 is configured to provide computing and control capabilities to support operation of the entire electronic device 100.
- the internal memory in the electronic device 100 provides an environment for running the computer program stored in the non-volatile storage medium 32.
- the display screen 40 of the electronic device 100 may be a liquid crystal display or an electronic ink display screen.
- the input device 50 may be a touch layer covering the display screen 40, or may be a button, a trackball, or a touchpad provided on the outer casing of the electronic device 100, or an external keyboard, trackpad, or mouse.
- the electronic device 100 can be a cell phone, a tablet, a personal digital assistant, a wearable device, or the like. It will be understood by those skilled in the art that the structure shown in FIG. 18 is only a block diagram of the part of the structure related to the solution of the present application and does not limit the electronic device 100 to which the solution is applied; a specific electronic device 100 may include more or fewer components than shown in the figure, combine some components, or arrange the components differently.
- the present application further provides an electronic device 100.
- the electronic device 100 includes a first processing unit 110 and a second processing unit 120.
- the first processing unit 110 is configured to: when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, control at least one of the floodlight 104 and the laser lamp 106 to be turned on, and control the laser camera 102 to collect a target image; and process the target image and send the processed target image to the second processing unit 120.
- the electronic device 100 includes a camera module 101 , a first processing unit 110 , a second processing unit 120 , and a controller 130 .
- the first processing unit 110 and the second processing unit 120 are respectively connected to the camera module 101.
- the first processing unit 110 is coupled to the controller 130 via an I2C bus.
- the camera module 101 includes a laser camera 102, a floodlight 104, and a laser lamp 106.
- the floodlight 104 and the laser lamp 106 are connected to the controller 130, respectively.
- the first processing unit 110 includes a PWM module 112 and is coupled to the controller 130 via the PWM module 112.
- the first processing unit 110 is configured to, when receiving the image acquisition instruction sent by the second processing unit 120, send a control instruction to the controller 130 through the I2C bus, the control instruction being used to control at least one of the floodlight 104 and the laser lamp 106 to be turned on; send a pulse to the controller 130 through the pulse width modulation PWM module 112 to light up whichever of the floodlight 104 and the laser lamp 106 has been turned on; collect the target image through the laser camera 102; process the target image; and send the processed target image to the second processing unit 120.
- the first processing unit 110, the controller 130, and the laser camera 102 are connected on the same I2C bus.
- the first processing unit 110 is further configured to control the laser camera 102 to acquire a target image through an I2C bus.
- multiplexing the I2C bus can reduce the complexity of the control circuit and reduce the cost.
- the first processing unit 110 is further configured to determine, according to the image acquisition instruction, the type of image to be acquired. If the image type is the first type, the first processing unit 110 sends a first control instruction to the controller 130 through the I2C bus, the first control instruction being used to instruct the controller 130 to turn on the floodlight 104; if the image type is the second type, the first processing unit 110 sends a second control instruction to the controller 130 through the I2C bus, the second control instruction being used to instruct the controller 130 to turn on the laser lamp 106.
- the first processing unit 110 is further configured to: when the image type includes the first type and the second type, the first processing unit 110 sends the first control instruction to the controller 130 through the I2C bus, and turns on the floodlight 104. After the target image corresponding to the first type is acquired by the laser camera 102, the second control command is sent to the controller 130 through the I2C bus to turn on the laser lamp 106.
- the first processing unit 110 is further configured to: when the image type includes the first type and the second type, the first processing unit 110 sends a second control instruction to the controller 130 through the I2C bus, turning on the laser light 106, After the laser camera 102 acquires the target image of the second type, the first control command is sent to the controller 130 through the I2C bus to turn on the floodlight 104.
- the time interval between the moment at which the first processing unit 110 sends the first control instruction and the moment at which it sends the second control instruction is less than a time threshold.
- switching and control of the floodlight 104 and the laser lamp 106 can be realized by a single controller 130, which reduces the complexity of the control circuit and reduces cost.
- the target image includes a speckle image.
- the first processing unit 110 is further configured to acquire the stored reference speckle image, match the reference speckle image with the speckle image to obtain a matching result, generate a depth disparity map according to the reference depth information and the matching result, and send the depth disparity map to the second processing unit 120.
- the reference speckle image is provided with reference depth information.
- the second processing unit 120 is further configured to process the depth disparity map to obtain a depth map. In this way, the depth information of the acquired image can be accurately obtained by the first processing unit 110, the data processing efficiency is high, and the accuracy of the image processing is improved.
- the control instruction is sent to the controller 130 through the I2C bus, and at least one of the floodlight 104 and the laser lamp 106 is controlled to be turned on.
- controlling both the floodlight 104 and the laser lamp 106 through a single controller 130 reduces the complexity of controlling them and saves cost.
- the electronic device 100 includes a camera module 101 , a first processing unit 110 , and a second processing unit 120 .
- the first processing unit 110 can be connected to the second processing unit 120 and the camera module 101, respectively.
- the camera module 101 can include a laser camera 102, a floodlight 104, a laser lamp 106, and the like.
- the laser camera 102, the floodlight 104, the laser lamp 106, and the first processing unit 110 are connected to the same two-wire serial I2C bus.
- the first processing unit 110 is configured to, when receiving the image acquisition instruction sent by the second processing unit 120, control at least one of the floodlight 104 and the laser lamp 106 to be turned on through the I2C bus, control the laser camera 102 through the I2C bus to acquire the target image, process the target image, and transmit the processed target image to the second processing unit 120.
- the electronic device 100 further includes a controller 130.
- the controller 130 can be respectively connected to the floodlight 104 and the laser light 106.
- the controller 130 is used to control the floodlight 104 and the laser lamp 106, and the controller 130 is connected to the I2C bus.
- the first processing unit 110 is further configured to determine the type of image to be acquired according to the image acquisition instruction. If the image type is an infrared image, a first control instruction is sent to the controller 130 through the I2C bus, the first control instruction being used to instruct the controller 130 to turn on the floodlight 104; if the image type is a speckle image or a depth image, a second control instruction is sent to the controller 130 through the I2C bus, the second control instruction being used to instruct the controller 130 to turn on the laser lamp 106.
- the first processing unit 110 is further configured to: when the image type includes the infrared image and the speckle image, or includes the infrared image and the depth image, send the second control instruction to the controller 130 through the I2C bus to turn on the laser lamp 106 and control the laser camera 102 through the I2C bus to collect the speckle image, and then send the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104 and control the laser camera 102 through the I2C bus to collect the infrared image.
- the switching and control of the floodlight 104 and the laser lamp 106 can be realized by one controller 130, which can further reduce the complexity of the control circuit and reduce the cost.
- the first processing unit 110 is further configured to acquire the stored reference speckle image, match the reference speckle image with the speckle image to obtain a matching result, generate a depth disparity map according to the reference depth information and the matching result, and send the depth disparity map to the second processing unit 120; the reference speckle image carries reference depth information.
- the second processing unit 120 is configured to process the depth disparity map to obtain a depth map. In this way, the depth information of the acquired image can be accurately obtained by the first processing unit 110, the data processing efficiency is high, and the accuracy of the image processing is improved.
- the second processing unit 120 is further configured to collect the temperature of the laser lamp 106 at each acquisition time interval and acquire a reference speckle image corresponding to the temperature; when the currently acquired reference speckle image is inconsistent with the reference speckle image stored in the first processing unit 110, the currently acquired reference speckle image is written into the first processing unit 110.
- the reference speckle image corresponding to the temperature can be obtained according to the temperature of the laser lamp 106, and the influence of the temperature on the final output depth map can be reduced, so that the obtained depth information is more accurate.
- the second processing unit 120 is further configured to send an image acquisition instruction to the first processing unit 110 through a kernel running in the first operating mode in the second processing unit 120, where the first operating mode is a trusted execution environment.
- the first processing unit 110 is further configured to send the processed target image to the kernel running in the first operating mode in the second processing unit 120.
- the laser camera 102, the floodlight 104, the laser lamp 106, and the first processing unit 110 are connected to the same I2C bus, and the first processing unit 110 turns on at least one of the floodlight 104 and the laser lamp 106 through the I2C bus and controls the laser camera 102 through the same I2C bus to collect the target image. Controlling the floodlight 104, the laser lamp 106, and the laser camera 102 over one multiplexed I2C bus reduces the complexity of the control circuit and reduces cost.
- the electronic device 100 includes a camera module 101, a first processing unit 110, and a second processing unit 120. The first processing unit 110 and the second processing unit 120 are respectively connected to the camera module 101.
- the camera module 101 includes a laser camera 102, a floodlight 104, and a laser lamp 106.
- the floodlight 104 is coupled to a first controller 131, and the laser lamp 106 is coupled to a second controller 132.
- the first processing unit 110 includes a first PWM module 1121 and a second PWM module 1122.
- the first processing unit 110 is connected to the first controller 131 through the first PWM module 1121, and the first processing unit 110 is connected to the second controller 132 through the second PWM module 1122.
- the first processing unit 110 is configured to, when receiving the image acquisition instruction sent by the second processing unit 120, determine the image type according to the image acquisition instruction; if the image type is the first type, turn on the floodlight 104 in the camera module 101, send a pulse to the first controller 131 through the first pulse width modulation PWM module 1121 to light the floodlight 104, and collect the target image corresponding to the first type through the laser camera 102 in the camera module 101; if the image type is the second type, turn on the laser lamp 106 in the camera module 101, send a pulse to the second controller 132 through the second PWM module 1122 to light the laser lamp 106, and collect the target image corresponding to the second type through the laser camera 102 in the camera module 101; and process the target image and send the processed target image to the second processing unit 120.
- the second processing unit 120 is connected to the floodlight 104 and the laser lamp 106 via a bidirectional two-wire synchronous serial I2C bus, respectively.
- the second processing unit 120 is further configured to configure the floodlight 104 and the laser light 106 respectively through the I2C bus when detecting that the camera module 101 is activated.
- the second processing unit 120 is connected to the floodlight 104 and the laser lamp 106 via the same I2C bus.
- the second processing unit 120 is coupled to the floodlight 104 via an I2C bus and to the laser light 106 via another I2C bus. In this way, the second processing unit 120 can configure the floodlight 104 and the laser lamp 106 through the I2C bus when the camera module 101 is activated, which can more accurately control image acquisition and improve data processing efficiency.
- the moment at which the first PWM module 1121 sends a pulse to the first controller 131 is different from the moment at which the second PWM module 1122 sends a pulse to the second controller 132, and the time interval between these two moments is less than a time threshold.
- in this way, the first processing unit 110 can collect the infrared image and the speckle image separately at different moments through the laser camera 102, which ensures that the image content of the collected infrared image and speckle image is more consistent and improves the accuracy of subsequent face detection.
- the first processing unit 110 is further configured to acquire the stored reference speckle image, match the reference speckle image with the speckle image to obtain a matching result, generate a depth disparity map according to the reference depth information and the matching result, and send the depth disparity map to the second processing unit 120; the reference speckle image carries reference depth information.
- the second processing unit 120 is further configured to process the depth disparity map to obtain a depth map. In this way, the depth information of the acquired image can be accurately obtained by the first processing unit, the data processing efficiency is high, and the accuracy of the image processing is improved.
- the second processing unit 120 is further configured to collect the temperature of the laser lamp 106 at each acquisition time interval and acquire a reference speckle image corresponding to the temperature; when the currently acquired reference speckle image is inconsistent with the reference speckle image stored in the first processing unit 110, the second processing unit 120 writes the currently acquired reference speckle image into the first processing unit 110.
- the reference speckle image corresponding to the temperature can be obtained according to the temperature of the laser lamp 106, and the influence of the temperature on the final output depth map can be reduced, so that the obtained depth information is more accurate.
- the image type is determined according to the image acquisition instruction. If the image type is the first type, the floodlight 104 is lit through the first PWM module 1121 and the target image corresponding to the first type is collected through the laser camera 102; if the image type is the second type, the laser lamp 106 is lit by sending a pulse to the second controller 132 through the second PWM module 1122 and the target image corresponding to the second type is collected through the laser camera 102. Because the floodlight 104 and the laser lamp 106 are controlled through two separate PWM modules, no real-time switching is required, which reduces data processing complexity and relieves the processing pressure of the first processing unit 110.
- the data processing device 80 includes a control module 801 and a processing module 802.
- the control module 801 is configured to control at least one of the floodlight 104 and the laser light 106 to be turned on when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, and control the laser camera 102 to acquire the target image.
- the processing module 802 is configured to process the target image by the first processing unit 110 and send the processed target image to the second processing unit 120.
- the control module 801 includes an instruction transmitting unit 811 and a first pulse transmitting unit 812.
- the instruction sending unit 811 is configured to, when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, send a control instruction to the controller 130 through the bidirectional two-wire synchronous serial I2C bus, the control instruction being used to control at least one of the floodlight 104 and the laser lamp 106 to be turned on.
- the first pulse transmitting unit 812 is configured to send a pulse to the controller 130 through the pulse width modulation PWM module 112, illuminate at least one of the turned-on floodlight 104 and the laser light 106, and acquire a target image through the laser camera 102.
- the processing module 802 is configured to process the target image by the first processing unit 110 and send the processed target image to the second processing unit 120.
- the first processing unit 110, the controller 130, and the laser camera 102 are connected on the same I2C bus.
- the first pulse transmitting unit 812 is further configured to control the laser camera 102 to acquire a target image through the I2C bus.
- multiplexing the I2C bus can reduce the complexity of the control circuit and reduce the cost.
- the instruction sending unit 811 includes a first type determining subunit, a first sending subunit, and a second sending subunit.
- the first type determining subunit is configured to determine the type of image acquired based on the image acquisition instructions.
- the first sending subunit is configured to send a first control instruction to the controller 130 through the I2C bus if the image type is the first type, and the first control instruction is used to instruct the controller 130 to turn on the floodlight 104.
- the second sending subunit is configured to send a second control instruction to the controller 130 through the I2C bus if the image type is the second type, and the second control instruction is used to instruct the controller 130 to turn on the laser light 106.
- the first sending subunit is further configured to: when the image type includes the first type and the second type, send the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104.
- the second transmitting subunit is further configured to: after collecting the target image corresponding to the first type by the laser camera 102, send a second control instruction to the controller 130 through the I2C bus to turn on the laser light 106.
- the second sending subunit is further configured to: when the image type includes the first type and the second type, send the second control instruction to the controller 130 through the I2C bus, and turn on the laser light 106.
- the first sending subunit is further configured to, after the target image of the second type is acquired by the laser camera 102, send the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104.
- the time interval between the moment at which the first processing unit sends the first control instruction and the moment at which it sends the second control instruction is less than a time threshold.
- the processing module 802 includes a first image acquiring unit, a first matching unit, and a first generating unit.
- the first image acquisition unit is configured to acquire the stored reference speckle image, and the reference speckle image is provided with reference depth information.
- the first matching unit is configured to match the reference speckle image with the speckle image to obtain a matching result.
- the first generating unit is configured to generate a depth disparity map according to the reference depth information and the matching result, and send the depth disparity map to the second processing unit 120, and process the depth disparity map by the second processing unit 120 to obtain a depth map. In this way, the depth information of the acquired image can be accurately obtained by the first processing unit 110, the data processing efficiency is high, and the accuracy of the image processing is improved.
- the control instruction is sent to the controller 130 through the I2C bus, and at least one of the floodlight 104 and the laser lamp 106 is controlled to be turned on.
- controlling both the floodlight 104 and the laser lamp 106 through a single controller 130 reduces the complexity of controlling them and saves cost.
- the control module 801 includes a first control unit 821 and a second control unit 822.
- the data processing device in this embodiment is applicable to the electronic device 100.
- the electronic device 100 includes a camera module 101, a first processing unit 110, and a second processing unit 120.
- the first processing unit 110 and the second processing unit 120 are respectively connected to the camera module 101.
- the camera module 101 includes a laser camera 102, a floodlight 104, and a laser lamp 106.
- the laser camera 102, the floodlight 104, the laser lamp 106, and the first processing unit 110 are connected to the same two-wire serial I2C bus.
- the first control unit 821 is configured to control at least one of the floodlight 104 and the laser lamp 106 to be turned on by the I2C bus when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120.
- the second control unit 822 is configured to control the laser camera 102 to acquire a target image through the I2C bus.
- the processing module 802 is configured to process the target image by the first processing unit 110 and send the processed target image to the second processing unit 120.
- the electronic device 100 further includes a controller 130 for controlling the floodlight 104 and the laser light 106, and the controller 130 is connected to the I2C bus.
- the first control unit 821 includes a second type determination subunit and an instruction transmission subunit.
- the second type determining subunit is configured to determine the type of image acquired based on the image acquisition instructions.
- the command sending subunit is configured to send a first control command to the controller 130 through the I2C bus if the image type is an infrared image, and the first control command is used to instruct the controller 130 to turn on the floodlight 104.
- the command sending subunit is further configured to send a second control command to the controller 130 through the I2C bus if the image type is a speckle image or a depth image, and the second control command is used to instruct the controller 130 to turn on the laser light 106.
- the first control unit 821 is further configured to: when the image type includes the infrared image and the speckle image, or includes the infrared image and the depth image, send the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104 and control the laser camera 102 through the I2C bus to collect the infrared image, and then send the second control instruction to the controller 130 through the I2C bus to turn on the laser lamp 106 and control the laser camera 102 through the I2C bus to collect the speckle image.
- the first control unit 821 is further configured to: when the image type includes the infrared image and the speckle image, or includes the infrared image and the depth image, send the second control instruction to the controller 130 through the I2C bus to turn on the laser lamp 106 and control the laser camera 102 through the I2C bus to collect the speckle image, and then send the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104 and control the laser camera 102 through the I2C bus to collect the infrared image.
- the switching and control of the floodlight 104 and the laser lamp 106 can be realized by one controller 130, which can further reduce the complexity of the control circuit and reduce the cost.
- the processing module 802 includes a second image acquiring unit, a second matching unit, and a second generating unit.
- the second image acquisition unit is configured to acquire the stored reference speckle image, and the reference speckle image is provided with reference depth information.
- the second matching unit is configured to match the reference speckle image with the speckle image to obtain a matching result.
- the second generating unit is configured to generate a depth disparity map according to the reference depth information and the matching result, and send the depth disparity map to the second processing unit 120, and process the depth disparity map by the second processing unit 120 to obtain a depth map. In this way, the depth information of the acquired image can be accurately obtained by the first processing unit 110, the data processing efficiency is high, and the accuracy of the image processing is improved.
- the data processing device 80 of the embodiment includes a first temperature collecting module and a first writing module, in addition to the control module 801 and the processing module 802.
- the first temperature collecting module is configured to collect the temperature of the laser lamp 106 every acquisition time period, and acquire a reference speckle image corresponding to the temperature through the second processing unit 120.
- the first writing module is configured to, when the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit 110, write the reference speckle image acquired this time into the first processing unit 110 through the second processing unit 120.
- the reference speckle image corresponding to the temperature can be obtained according to the temperature of the laser lamp 106, and the influence of the temperature on the final output depth map can be reduced, so that the obtained depth information is more accurate.
- the data processing device 80 of the embodiment further includes a first sending module, in addition to the control module 801, the processing module 802, the first temperature collecting module, and the first writing module.
- the first sending module is configured to send an image acquisition instruction to the first processing unit 110 through a kernel running in the first operating mode in the second processing unit 120, where the first operating mode is a trusted execution environment.
- the processing module 802 is further configured to send the processed target image to the kernel running in the first operating mode in the second processing unit 120 by using the first processing unit 110.
- the first processing unit 110 can be ensured to be in a highly secure environment, thereby improving data security.
- the laser camera 102, the floodlight 104, the laser lamp 106, and the first processing unit 110 are connected to the same I2C bus, and the first processing unit 110 turns on at least one of the floodlight 104 and the laser lamp 106 through the I2C bus and controls the laser camera 102 through the same I2C bus to collect the target image. Controlling the floodlight 104, the laser lamp 106, and the laser camera 102 over one multiplexed I2C bus reduces the complexity of the control circuit and reduces cost.
- the control module 801 includes a type determining unit 831, a second pulse transmitting unit 832, and a third pulse transmitting unit 833.
- the type determining unit 831 is configured to determine an image type according to the image capturing instruction when the first processing unit 110 receives the image capturing instruction sent by the second processing unit 120.
- the second pulse sending unit 832 is configured to, if the image type is the first type, turn on the floodlight 104 in the camera module 101, send a pulse to the first controller 131 through the first pulse width modulation PWM module 1121 to light the floodlight 104, and collect the target image corresponding to the first type through the laser camera 102 in the camera module 101.
- the third pulse sending unit 833 is configured to, if the image type is the second type, turn on the laser lamp 106 in the camera module 101, send a pulse to the second controller 132 through the second PWM module 1122 to light the laser lamp 106, and then collect the target image corresponding to the second type through the laser camera 102 in the camera module 101.
- the processing module 802 is configured to process the target image by the first processing unit 110 and send the processed target image to the second processing unit 120.
- the image type is determined according to the image acquisition instruction. If the image type is the first type, the floodlight 104 is lit through the first PWM module 1121 and the target image corresponding to the first type is collected through the laser camera 102; if the image type is the second type, the laser lamp 106 is lit by sending a pulse to the second controller 132 through the second PWM module 1122 and the target image corresponding to the second type is collected through the laser camera 102. Because the floodlight 104 and the laser lamp 106 are controlled through two separate PWM modules, no real-time switching is required, which reduces data processing complexity and relieves the processing pressure of the first processing unit 110.
- the second processing unit 120 is connected to the floodlight 104 and the laser lamp 106 via a bidirectional two-wire synchronous serial I2C bus, respectively.
- the data processing device 80 of this embodiment includes a configuration module in addition to the control module 801 and the processing module 802.
- the configuration module is configured to configure the floodlight 104 and the laser light 106 respectively through the I2C bus when detecting that the camera module 101 is activated.
- the second processing unit 120 is connected to the floodlight 104 and the laser lamp 106 via the same I2C bus.
- the second processing unit 120 is coupled to the floodlight 104 via an I2C bus and to the laser light 106 via another I2C bus. In this way, the second processing unit 120 can configure the floodlight 104 and the laser lamp 106 through the I2C bus when the camera module 101 is activated, which can more accurately control image acquisition and improve data processing efficiency.
- the moment at which the first PWM module 1121 sends a pulse to the first controller 131 is different from the moment at which the second PWM module 1122 sends a pulse to the second controller 132, and the time interval between these two moments is less than a time threshold.
- in this way, the first processing unit 110 can collect the infrared image and the speckle image separately at different moments through the laser camera 102, which ensures that the image content of the collected infrared image and speckle image is more consistent and improves the accuracy of subsequent face detection.
- the processing module 802 includes a third image acquiring unit, a third matching unit, and a third generating unit.
- the third image acquisition unit is configured to acquire the stored reference speckle image, and the reference speckle image is provided with reference depth information.
- the third matching unit is configured to match the reference speckle image with the speckle image to obtain a matching result.
- the third generating unit is configured to generate a depth disparity map according to the reference depth information and the matching result, and send the depth disparity map to the second processing unit 120, which processes the depth disparity map to obtain a depth map. In this way, the depth information of the acquired image can be obtained accurately through the first processing unit 110, the data processing efficiency is high, and the accuracy of image processing is improved.
- the data processing device 80 of the embodiment further includes a second temperature collecting module and a second writing module, in addition to the control module 801, the processing module 802, and the configuration module.
- the second temperature collecting module is configured to collect the temperature of the laser lamp 106 every acquisition time period, and acquire a reference speckle image corresponding to the temperature through the second processing unit 120.
- the second writing module is configured to, when the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit 110, write the reference speckle image acquired this time into the first processing unit 110 through the second processing unit 120. In this way, the reference speckle image corresponding to the temperature can be obtained according to the temperature of the laser lamp 106, and the influence of temperature on the final output depth map is reduced, so that the obtained depth information is more accurate.
- the present application also provides a computer readable storage medium having stored thereon a computer program that, when executed by a processor, implements the data processing method of any of the above embodiments.
- the present application also provides a computer program product including a computer program that, when run on a computer device, causes the computer device to perform the data processing method described in any one of the above embodiments.
- the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or the like.
- Non-volatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
- Volatile memory can include random access memory (RAM), which acts as an external cache.
- RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
Abstract
A data processing method, a data processing apparatus (80), an electronic device (100), and a computer-readable storage medium. The data processing method includes: (001) when a first processing unit (110) receives an image acquisition instruction sent by a second processing unit (120), controlling at least one of a floodlight (104) and a laser lamp (106) to be turned on, and controlling a laser camera (102) to collect a target image; and (002) processing the target image by the first processing unit (110) and sending the processed target image to the second processing unit (120).
Description
Priority Information

This application claims priority to and the benefit of Chinese Patent Applications No. 201810402998.6, No. 201810402999.0, and No. 201810401326.3, filed with the China National Intellectual Property Administration on April 28, 2018, the entire contents of which are incorporated herein by reference.
This application relates to the field of computer technology, and in particular to a data processing method, a data processing apparatus, an electronic device, and a computer-readable storage medium.

3D (three-dimensional) faces play an important role in application scenarios such as face recognition, beautification, and 3D model building. An electronic device can emit laser light through a laser such as a laser lamp, collect images of a face illuminated by the laser through a camera, and construct a 3D face using structured light. In the traditional approach, the control circuitry with which the electronic device controls the laser, the camera, and the like is relatively complex and costly.
Summary of the Invention

Embodiments of the present application provide a data processing method, a data processing apparatus, an electronic device, and a computer-readable storage medium.

The data processing method of the embodiments of the present application includes: when a first processing unit receives an image acquisition instruction sent by a second processing unit, controlling at least one of a floodlight and a laser lamp to be turned on, and controlling a laser camera to collect a target image; and processing the target image by the first processing unit and sending the processed target image to the second processing unit.

The data processing apparatus of the embodiments of the present application includes a control module and a processing module. The control module is configured to, when a first processing unit receives an image acquisition instruction sent by a second processing unit, control at least one of a floodlight and a laser lamp to be turned on and control a laser camera to collect a target image. The processing module is configured to process the target image by the first processing unit and send the processed target image to the second processing unit.

The electronic device of the embodiments of the present application includes a first processing unit and a second processing unit. The first processing unit is configured to: when the first processing unit receives an image acquisition instruction sent by the second processing unit, control at least one of a floodlight and a laser lamp to be turned on and control a laser camera to collect a target image; and process the target image and send the processed target image to the second processing unit.

The computer-readable storage medium of the embodiments of the present application stores a computer program that, when executed by a processor, implements the data processing method described above.

Additional aspects and advantages of the embodiments of the present application will be set forth in part in the following description, and in part will become apparent from the following description or be learned by practice of the present application.
The above and/or additional aspects and advantages of the present application will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:

FIG. 1 and FIG. 2 are schematic flowcharts of data processing methods according to some embodiments of the present application.

FIG. 3 to FIG. 5 are application scenario diagrams of data processing methods according to some embodiments of the present application.

FIG. 6 to FIG. 13 are schematic flowcharts of data processing methods according to some embodiments of the present application.

FIG. 14 and FIG. 15 are application scenario diagrams of data processing methods according to some embodiments of the present application.

FIG. 16 and FIG. 17 are schematic flowcharts of data processing methods according to some embodiments of the present application.

FIG. 18 is a block diagram of an electronic device according to some embodiments of the present application.

FIG. 19 to FIG. 22 are block diagrams of data processing apparatuses according to some embodiments of the present application.
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present application and are not intended to limit it.

It can be understood that the terms "first", "second", and the like used in the present application may be used herein to describe various elements, but these elements are not limited by these terms. These terms are used only to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
Referring to FIG. 1 and FIG. 2, the present application provides a data processing method. The data processing method includes:

001: when the first processing unit 110 receives an image acquisition instruction sent by the second processing unit 120, controlling at least one of the floodlight 104 and the laser lamp 106 to be turned on, and controlling the laser camera 102 to collect a target image; and

002: processing the target image by the first processing unit 110, and sending the processed target image to the second processing unit 120.

The data processing method of the present application can be applied to the electronic device 100. The electronic device 100 includes the laser camera 102, the floodlight 104, the laser lamp 106, the first processing unit 110, and the second processing unit 120. The first processing unit 110 is connected to the second processing unit 120.
Referring to FIG. 2 to FIG. 4, in one embodiment, Step 001, in which at least one of the floodlight 104 and the laser lamp 106 is controlled to be turned on and the laser camera 102 is controlled to collect a target image when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, includes Step 011 and Step 012.

011: when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, sending a control instruction to the controller through the bidirectional two-wire synchronous serial I2C bus, the control instruction being used to control at least one of the floodlight 104 and the laser lamp 106 to be turned on.

When an application in the electronic device 100 needs to obtain face data, it can send a data acquisition request to the second processing unit 120, where the face data may include, but is not limited to, data used for face verification in scenarios such as face unlocking and face payment, as well as face depth information. After receiving the data acquisition request, the second processing unit 120 can send an image acquisition instruction to the first processing unit 110. The first processing unit 110 may be an MCU module, and the second processing unit 120 may be a CPU module.

The electronic device 100 may further include a controller 130, which can be connected to the floodlight 104 and the laser lamp 106 respectively, so that both light sources are controlled by the same controller 130. The control exercised by the controller 130 over the floodlight 104 and the laser lamp 106 may include turning on the floodlight 104 or the laser lamp 106, switching between the floodlight 104 and the laser lamp 106, controlling the emission power of the floodlight 104 and the laser lamp 106, and the like. The first processing unit 110 can be connected to the controller 130 through the I2C bus; the I2C bus enables data transmission between the devices connected to it through one data line and one clock line. When the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, it can send a control instruction to the controller 130 through the I2C bus; after receiving the control instruction, the controller 130 turns on at least one of the floodlight 104 and the laser lamp 106 according to the control instruction.
012: sending a pulse to the controller 130 through the pulse width modulation (PWM) module 112 to light up whichever of the floodlight 104 and the laser lamp 106 has been turned on, and collecting the target image through the laser camera 102.

The first processing unit 110 can be connected to the controller 130 through the PWM module 112. To light up at least one of the floodlight 104 and the laser lamp 106, the first processing unit 110 can send a pulse to the controller 130 through the PWM module 112. Optionally, the PWM module 112 can continuously send pulse signals to the controller 130 at a certain voltage amplitude and at certain time intervals to light up at least one of the floodlight 104 and the laser lamp 106.

The first processing unit 110 can collect the target image through the laser camera 102; the target image may include an infrared image, a speckle image, and the like. If the floodlight 104 is turned on, the PWM module 112 sends a pulse to the controller 130 to light the floodlight 104. The floodlight 104 can be a surface light source that illuminates uniformly in all directions; when lit, it emits infrared light, and the laser camera 102 collects the infrared light reflected back by the face to obtain an infrared image. If the laser lamp 106 is turned on, the PWM module 112 sends a pulse to the controller 130 to light the laser lamp 106. When the laser lamp 106 is lit, the emitted laser light is diffracted by a lens and a DOE (diffractive optical element) to produce a pattern of speckle particles. After the pattern is projected onto a target object, the speckle particles in the pattern shift because the points on the target object are at different distances from the electronic device 100, and the laser camera 102 collects the shifted pattern to obtain a speckle image.
002: processing the target image by the first processing unit 110, and sending the processed target image to the second processing unit 120.

The laser camera 102 can send the collected target image to the first processing unit 110, which can process the target image. The target image may include an infrared image, a speckle image, and the like. After determining the image type according to the image acquisition instruction, the first processing unit 110 can collect a target image corresponding to the determined image type and process it accordingly. The number of PWM modules 112 may be one or more; when there are multiple PWM modules 112, they may include a first PWM module and a second PWM module. The number of controllers 130 may also be one or more; when there are multiple controllers 130, they may include a first controller and a second controller. When the image type is an infrared image, the first processing unit 110 can send a pulse to the first controller through the first PWM module to light the floodlight 104, collect an infrared image through the laser camera 102, and process the infrared image to obtain an infrared disparity map. When the image type is a speckle image, the first processing unit 110 can send a pulse to the second controller through the second PWM module to light the laser lamp 106, collect a speckle image through the laser camera 102, and process the speckle image to obtain a speckle disparity map. When the image type is a depth image, the first processing unit 110 can collect a speckle image and process it to obtain a depth disparity map.
Further, the first processing unit 110 can perform correction processing on the target image. Correction processing means correcting image content offsets in the target image caused by the internal and external parameters of the laser camera 102 and the RGB camera 108, for example offsets caused by the deflection angle of the laser camera 102 or by the relative placement of the laser camera 102 and the RGB camera 108. After correcting the target image, the first processing unit 110 obtains a disparity map of the target image; for example, correcting an infrared image yields an infrared disparity map, and correcting a speckle image yields a speckle disparity map or a depth disparity map. Correcting the target image prevents ghosting in the image finally presented on the screen of the electronic device 100.

After processing the target image, the first processing unit 110 can send the processed target image to the second processing unit 120. The second processing unit 120 can obtain the required image, such as an infrared image, a speckle image, or a depth image, from the processed target image, and can further process the required image according to the needs of the application.

For example, when the application needs to perform face verification, the second processing unit 120 can perform face detection on the obtained required image, where face detection may include face recognition, face matching, and liveness detection. Face recognition means identifying whether a face is present in the required image. Face matching means matching the face in the required image against a pre-stored face. Liveness detection means detecting whether the face in the required image is biologically live. If the application needs the depth information of the face, the second processing unit 120 can upload the generated depth image to the application, and the application can perform beautification, three-dimensional modeling, and the like according to the received depth image.

In the data processing method of the embodiment shown in FIG. 2, when the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, it sends a control instruction to the controller 130 through the I2C bus to turn on at least one of the floodlight 104 and the laser lamp 106, sends a pulse to the controller 130 through the PWM module 112 to light up the turned-on light source, collects the target image, and then processes it. Control of both the floodlight 104 and the laser lamp 106 is realized through a single controller 130, which reduces the complexity of controlling the floodlight 104 and the laser lamp 106 and saves cost.
FIG. 3 is an application scenario diagram of the data processing method of the embodiment shown in FIG. 2. As shown in FIG. 3, the electronic device 100 includes the laser camera 102, the floodlight 104, the laser lamp 106, the first processing unit 110, the second processing unit 120, and the controller 130. The first processing unit 110 may be an MCU (microcontroller unit) module or the like, and the second processing unit 120 may be a CPU (central processing unit) module or the like. The first processing unit 110 is connected to the laser camera 102 and the second processing unit 120, and is connected to the controller 130 through the I2C bus. The first processing unit 110 may include a PWM (pulse width modulation) module 112 and be connected to the controller 130 through the PWM module 112, and the controller 130 is connected to the floodlight 104 and the laser lamp 106 respectively.

When the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, it sends a control instruction to the controller 130 through the I2C bus; the control instruction can be used to turn on at least one of the floodlight 104 and the laser lamp 106. The first processing unit 110 can send a pulse to the controller 130 through the PWM module 112 to light up whichever light source has been turned on, and collect the target image through the laser camera 102. The first processing unit 110 can process the target image and send the processed target image to the second processing unit 120.
FIG. 4 is another application scenario diagram of the data processing method of the embodiment shown in FIG. 2. As shown in FIG. 4, the electronic device 100 may include a camera module 101, the second processing unit 120, and the first processing unit 110. The second processing unit 120 may be a CPU module, and the first processing unit 110 may be an MCU module or the like. The first processing unit 110 is connected between the second processing unit 120 and the camera module 101; the first processing unit 110 can control the laser camera 102, the floodlight 104, and the laser lamp 106 in the camera module 101, and the second processing unit 120 can control the RGB camera 108 in the camera module 101.

The camera module 101 includes the laser camera 102, the floodlight 104, the RGB camera 108, and the laser lamp 106. The laser camera 102 may be an infrared camera for obtaining infrared images. The floodlight 104 is a surface light source capable of emitting infrared light; the laser lamp 106 is a point light source capable of emitting laser light that forms a pattern. When the floodlight 104 emits infrared light, the laser camera 102 obtains an infrared image from the reflected light. When the laser lamp 106 emits laser light, the laser camera 102 obtains a speckle image from the reflected light; the speckle image is an image in which the pattern formed by the laser light emitted by the laser lamp 106 is deformed after reflection.
The second processing unit 120 may include a CPU core running in a TEE (trusted execution environment) and a CPU core running in a REE (rich execution environment). The TEE and the REE are both operating modes of an ARM module (Advanced RISC Machines, advanced reduced instruction set processor). The security level of the TEE is higher, and one and only one CPU core in the second processing unit 120 can run in the TEE at a time. Generally, operations with a higher security level in the electronic device 100 need to be executed in the CPU core in the TEE, while operations with a lower security level can be executed in the CPU core in the REE.

The first processing unit 110 includes the PWM module 112, an SPI/I2C (serial peripheral interface/inter-integrated circuit) interface 114, a RAM (random access memory) module 116, and a depth engine 118. The first processing unit 110 can be connected, through the PWM module 112, to the controller 130 (shown in FIG. 3) of the floodlight 104 and the laser lamp 106; the controller 130 is connected to the floodlight 104 and the laser lamp 106 respectively and controls them. The first processing unit 110 can also be connected to the controller 130 through the I2C bus, so as to turn on the floodlight 104 or the laser lamp 106 through the I2C bus. The PWM module 112 can emit pulses to the camera module 101 to light up the turned-on floodlight 104 or laser lamp 106. The first processing unit 110 can collect infrared images or speckle images through the laser camera 102. The SPI/I2C interface 114 is used to receive the image acquisition instruction sent by the second processing unit 120. The depth engine 118 can process speckle images to obtain a depth disparity map.
When the second processing unit 120 receives a data acquisition request from an application, for example when the application needs to perform face unlocking or face payment, it can send an image acquisition instruction to the first processing unit 110 through the CPU core running in the TEE. After receiving the image acquisition instruction, the first processing unit 110 can send a control instruction to the controller 130 through the I2C bus to turn on the floodlight 104 in the camera module 101, emit a pulse wave to the controller 130 through the PWM module 112 to light the floodlight 104, and control the laser camera 102 through the I2C bus to collect an infrared image. The first processing unit 110 can likewise send a control instruction to the controller 130 through the I2C bus to turn on the laser lamp 106 in the camera module 101, emit a pulse wave to the controller 130 through the PWM module 112 to light the laser lamp 106, and control the laser camera 102 through the I2C bus to collect a speckle image. The camera module 101 can send the collected infrared image and speckle image to the first processing unit 110. The first processing unit 110 can process the received infrared image to obtain an infrared disparity map, and process the received speckle image to obtain a speckle disparity map or a depth disparity map. Here, processing the infrared image and the speckle image means correcting them to remove the influence of the internal and external parameters of the camera module 101 on the images. The first processing unit 110 can be set to different modes, and different modes output different images. When the first processing unit 110 is set to speckle-image mode, it processes the speckle image to obtain a speckle disparity map, from which a target speckle image can be obtained; when set to depth-map mode, it processes the speckle image to obtain a depth disparity map, from which a depth image, that is, an image carrying depth information, can be obtained. The first processing unit 110 can send the infrared disparity map and the speckle disparity map, or the infrared disparity map and the depth disparity map, to the second processing unit 120. The second processing unit 120 can obtain a target infrared image from the infrared disparity map and a depth image from the depth disparity map. Further, the second processing unit 120 can perform face recognition, face matching, and liveness detection, and obtain the depth information of the detected face, according to the target infrared image and the depth image.

Communication between the first processing unit 110 and the second processing unit 120 takes place over fixed secure interfaces to ensure the security of the transmitted data. As shown in FIG. 4, the data sent by the second processing unit 120 to the first processing unit 110 passes through SECURE SPI/I2C 130, and the data sent by the first processing unit 110 to the second processing unit 120 passes through SECURE MIPI (mobile industry processor interface) 140.

Optionally, the first processing unit 110 may itself obtain the target infrared image from the infrared disparity map and compute the depth image from the depth disparity map, and then send the target infrared image and the depth image to the second processing unit 120.
For the data processing method of the embodiment shown in FIG. 2, referring to FIG. 3 and FIG. 4, optionally, the step of collecting the target image through the laser camera 102 includes: controlling the laser camera 102 through the I2C bus to collect the target image.

The first processing unit 110 can be connected to the laser camera 102 through the I2C bus and control the laser camera 102 through that I2C bus to collect the target image. In one example, the first processing unit 110, the laser camera 102, and the controller 130 are connected to the same I2C bus. After the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, it can turn on the floodlight 104 or the laser lamp 106 through that I2C bus, send a pulse to the controller 130 through the PWM module 112 to light up the turned-on light source, and then control the laser camera 102 through the same I2C bus to collect a target image such as an infrared image or a speckle image.

For the data processing method of the embodiment shown in FIG. 2, referring to FIG. 3 and FIG. 4, optionally, the first processing unit 110 can address the controller 130 through the connected I2C bus and send it a control instruction to turn on the floodlight 104 or the laser lamp 106, and then address the laser camera 102 through the same I2C bus and control it to collect the target image, thereby multiplexing the same connected I2C bus at different moments and saving resources.

FIG. 5 is a schematic diagram of an embodiment in which the first processing unit 110, the laser camera 102, and the controller 130 are connected to the same I2C bus. As shown in FIG. 5, the electronic device 100 includes the laser camera 102, the floodlight 104, the laser lamp 106, the first processing unit 110, the second processing unit 120, and the controller 130. The first processing unit 110 is connected to the second processing unit 120. The first processing unit 110 may include the PWM module 112 and be connected to the controller 130 through the PWM module 112, and the controller 130 is connected to the floodlight 104 and the laser lamp 106 respectively. The laser camera 102, the first processing unit 110, and the controller 130 are connected to the same I2C bus. After the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, it can send a control instruction to the controller 130 through this I2C bus to turn on the floodlight 104 or the laser lamp 106, send a pulse to the controller 130 through the PWM module 112 to light up the turned-on light source, and then control the laser camera 102 through the same I2C bus to collect a target image such as an infrared image or a speckle image. In the embodiment shown in FIG. 5, the floodlight 104, the laser lamp 106, and the laser camera 102 are controlled through one I2C bus; multiplexing the I2C bus reduces the complexity of the control circuit and reduces cost.
For the data processing method of the embodiment shown in FIG. 2, referring to FIG. 3, FIG. 4, and FIG. 6, optionally, the step of sending a control instruction to the controller 130 through the I2C bus includes the following steps:

0111: determining the type of image to be collected according to the image acquisition instruction.

0112: if the image type is the first type, the first processing unit 110 sends a first control instruction to the controller 130 through the I2C bus, the first control instruction being used to instruct the controller 130 to turn on the floodlight 104.

Upon receiving the image acquisition instruction sent by the second processing unit 120, the first processing unit 110 can determine the type of image to be collected according to the instruction; the image type may be one or more of an infrared image, a speckle image, a depth image, and the like. The image type can be determined according to the face data needed by the application: after receiving the data acquisition request, the second processing unit 120 can determine the image type from the request and send the first processing unit 110 an image acquisition instruction containing that image type. For example, if the application needs data for face unlocking, the second processing unit 120 can determine that the image types are an infrared image and a speckle image; if face depth information is needed, it can further determine that the image type is a depth image, but it is not limited thereto.

If the image type is the first type, which in this embodiment may be an infrared image, the first processing unit 110 can send the first control instruction to the controller 130 through the connected I2C bus, and the controller 130 can turn on the floodlight 104 according to the first control instruction. The first processing unit 110 can emit a pulse to the controller 130 through the PWM module 112 to light the floodlight 104. Optionally, the first processing unit 110 can address the controller 130 over the I2C bus and send it the first control instruction.
0113: if the image type is the second type, the first processing unit 110 sends a second control instruction to the controller 130 through the I2C bus, the second control instruction being used to instruct the controller 130 to turn on the laser lamp 106.

If the image type is the second type, which in this embodiment may be a speckle image or a depth image, the first processing unit 110 can send the second control instruction to the controller 130 through the connected I2C bus, and the controller 130 can turn on the laser lamp 106 according to the second control instruction. The first processing unit 110 can emit a pulse to the controller 130 through the PWM module 112 to light the laser lamp 106.

The image type that the first processing unit 110 determines from the image acquisition instruction may include at least two types; for example, it may include both the first type and the second type. When the image type includes both an infrared image and a speckle image, or both an infrared image and a depth image, the camera module 101 needs to collect both an infrared image and a speckle image. The first processing unit 110 can control the camera module 101 to collect the infrared image first or the speckle image first; the collection order is not limited. The first processing unit 110 can first send the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104, emit a pulse to the controller 130 through the PWM module 112 to light the floodlight 104, and then control the laser camera 102 through the I2C bus to collect the infrared image. After the first processing unit 110 controls the laser camera 102 to collect the target image corresponding to the first type, it can send the second control instruction to the controller 130 through the I2C bus to turn on the laser lamp 106, emit a pulse to the controller 130 through the PWM module 112 to light the laser lamp 106, and then control the laser camera 102 through the I2C bus to collect the speckle image. Alternatively, when the image type includes both the first type and the second type, the first processing unit 110 can first send the second control instruction to the controller 130 through the I2C bus to turn on the laser lamp 106, emit a pulse through the PWM module 112 to light it, and control the laser camera 102 through the I2C bus to collect the speckle image; after collecting the target image corresponding to the second type, it can send the first control instruction to the controller 130 through the I2C bus to turn on the floodlight 104, emit a pulse through the PWM module 112 to light it, and control the laser camera 102 through the I2C bus to collect the infrared image.

Optionally, the first processing unit 110 can send the first control instruction and the second control instruction to the controller 130 at different moments, and the time interval between the moment the first control instruction is sent and the moment the second control instruction is sent can be less than a time threshold. After collecting the infrared image, the laser camera 102 can collect the speckle image within a time interval smaller than the time threshold, so that the image content of the collected infrared image and speckle image is relatively consistent, which facilitates subsequent processing such as face detection. The time threshold can be set according to actual requirements, for example 20 milliseconds or 30 milliseconds. Ensuring that the image content of the collected infrared image and speckle image is consistent improves the accuracy of subsequent face detection. In this embodiment, switching and control of the floodlight 104 and the laser lamp 106 are realized through a single controller 130, which reduces the complexity of the control circuit and reduces cost.
For the data processing method of the embodiment shown in FIG. 2, referring to FIG. 3, FIG. 4, and FIG. 7, optionally, Step 002, in which the target image is processed by the first processing unit 110 and the processed target image is sent to the second processing unit 120, includes the following steps:

0141: acquiring a stored reference speckle image, the reference speckle image carrying reference depth information.

In the camera coordinate system, take the line perpendicular to the imaging plane and passing through the center of the mirror as the Z axis; if the coordinates of an object in the camera coordinate system are (X, Y, Z), then the Z value is the depth information of the object in the camera's imaging plane. If an application needs the depth information of a face, a depth image containing the face depth information needs to be collected. The first processing unit 110 can turn on the laser lamp 106 through the I2C bus and control the laser camera 102 through the I2C bus to collect a speckle image. A reference speckle image carrying reference depth information can be stored in the first processing unit 110 in advance, and the depth information of each pixel contained in the collected speckle image can be obtained from the collected speckle image and the reference speckle image.

0142: matching the reference speckle image with the speckle image to obtain a matching result.

The first processing unit 110 can take each pixel contained in the collected speckle image in turn as a center, select a pixel block of a preset size, for example 31 pixels by 31 pixels, and search the reference speckle image for a block that matches the selected pixel block. From the selected pixel block in the collected speckle image and the matching block in the reference speckle image, the first processing unit 110 can find the two points in the speckle image and the reference speckle image that lie on the same laser light path; the speckle information of two points on the same laser light path is consistent, and two points on the same laser light path can be identified as corresponding pixels. In the reference speckle image, the depth information of the points on each laser light path is known. The first processing unit 110 can calculate the offset between the two corresponding pixels of the target speckle image and the reference speckle image on the same laser light path, and calculate the depth information of each pixel contained in the collected speckle image from the offset.
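A minimal version of this block-matching search, assuming 8-bit grayscale NumPy arrays and a horizontal-only search window, could look like the sketch below. The 31 by 31 block size comes from the text, while the search range and the sum-of-squared-differences cost are assumptions.

```python
# Illustrative block-matching sketch, not the patent's matching engine.
import numpy as np

def match_offset(speckle: np.ndarray, reference: np.ndarray,
                 y: int, x: int, half: int = 15, search: int = 30) -> int:
    """Return the horizontal pixel offset of the block centered at (y, x)
    in the collected speckle image relative to the reference image."""
    block = speckle[y - half:y + half + 1, x - half:x + half + 1]
    best_dx, best_err = 0, np.inf
    for dx in range(-search, search + 1):
        cx = x + dx
        if cx - half < 0 or cx + half + 1 > reference.shape[1]:
            continue  # candidate window would fall off the reference image
        cand = reference[y - half:y + half + 1, cx - half:cx + half + 1]
        err = np.sum((block.astype(np.int32) - cand.astype(np.int32)) ** 2)
        if err < best_err:
            best_dx, best_err = dx, err
    return best_dx  # points on the same laser light path give the best match
```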
在一个例子中,第一处理单元110将采集的散斑图像与参考散斑图进行偏移量的计算,根据偏移量计算得到散斑图像中包含的各个像素点的深度信息,其计算公式可如式(1)所示:
Z_D = (L × f × Z_0) / (L × f + P × Z_0)    式(1)
其中,Z_D表示像素点的深度信息,也即像素点的深度值;L为激光摄像头102与激光器(即镭射灯106)之间的距离;f为激光摄像头102中透镜的焦距;Z_0为参考散斑图像采集时参考平面距离电子设备100的激光摄像头102的深度值;P为采集的散斑图像与参考散斑图像中对应像素点之间的偏移量。P可由目标散斑图与参考散斑图中像素点偏移的像素量乘以一个像素点的实际距离得到。当目标物体与激光摄像头102之间的距离大于参考平面与激光摄像头102之间的距离时,P为负值,当目标物体与激光摄像头102之间的距离小于参考平面与激光摄像头102之间的距离时,P为正值。
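下面给出一段示意性的C代码(并非本申请的实际实现),按上式(依据正文变量定义重构的形式)由偏移量P计算像素点深度Z_D;函数名与示例参数均为假设值:

```c
/* 示意性代码:由偏移量P计算像素点深度Z_D(公式为依据正文定义重构的形式)。 */
#include <stdio.h>

/* L:激光摄像头与镭射灯之间的基线距离;f:透镜焦距;z0:参考平面深度;
 * p:对应像素点偏移量(已换算为实际距离,目标远于参考平面时为负) */
static double depth_from_offset(double L, double f, double z0, double p)
{
    return (L * f * z0) / (L * f + p * z0);
}

int main(void)
{
    /* 示例参数:基线5cm、焦距4mm、参考平面0.6m、偏移-0.1mm */
    double z = depth_from_offset(0.05, 0.004, 0.6, -0.0001);
    printf("Z_D = %f m\n", z);  /* P为负值时,Z_D大于参考平面深度Z_0 */
    return 0;
}
```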
0143:根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元120,通过第二处理单元120对深度视差图进行处理得到深度图。
第一处理单元110得到采集的散斑图像中包含的各个像素点的深度信息,可对采集的散斑图像进行校正处理,校正采集的散斑图像由于激光摄像头102及RGB摄像头108的内外参数等造成的图像内容偏移。第一处理单元110可根据校正后的散斑图像,以及散斑图像中各个像素点的深度值,生成深度视差图,并将深度视差图发送给第二处理单元120。第二处理单元120可根据深度视差图得到深度图,深度图中可包含各个像素点的深度信息。第二处理单元120可将深度图上传至应用程序,应用程序可根据深度图中人脸的深度信息进行美颜、三维建模等。第二处理单元120也可根据深度图中人脸的深度信息进行活体检测,可防止采集的人脸是二维的平面人脸等。
对于图2所示实施例的数据处理方法,请结合图3及图4,可选地,电子设备100中第二处理单元120可包括两种运行模式,其中,第一运行模式可以为TEE,TEE为可信运行环境,安全级别高;第二运行模式可以为REE,REE为自然运行环境,REE的安全级别较低。当第二处理单元120接收到应用程序发送的数据获取请求后,可通过第一运行模式向第一处理单元110发送图像采集指令。当第二处理单元120为单核的CPU时,可直接将上述单核由第二运行模式切换到第一运行模式;当第二处理单元120为多核时,可将一个内核由第二运行模式切换到第一运行模式,其他内核仍运行在第二运行模式中,并通过运行在第一运行模式下的内核向第一处理单元110发送图像采集指令。
第一处理单元110对采集的目标图像进行处理后,可将处理后的目标图像发送给该运行在第一运行模式下的内核,可保证第一处理单元110一直在可信运行环境下运行,提高安全性。第二处理单元120可在该运行在第一运行模式下的内核中,根据处理后的目标图像得到所需的图像,并根据应用程序的需求对所需的图像进行处理。比如,第二处理单元120可在运行在第一运行模式下的内核中对所需的图像进行人脸检测。通过第二处理单元120安全性高的内核向第一处理单元110发送图像采集指令,可保证第一处理单元110处于安全性高的环境中,提高数据的安全。
在一个实施例中,由于运行在第一运行模式的内核是唯一的,第二处理单元120在TEE环境下对目标图像进行人脸检测,可采用串行的方式逐一对目标图像进行人脸识别、人脸匹配和活体检测等。第二处理单元120可先对所需的图像进行人脸识别,当识别到人脸时,再将所需的图像中包含的人脸与预先存储的人脸进行匹配,判断是否为同一人脸。若为同一人脸再根据所需的图像对人脸进行活体检测,防止采集的人脸是二维的平面人脸等。当没有识别到人脸时,可不进行人脸匹配和活体检测,可减轻第二处理单元120的处理压力。
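下面给出一段示意性的C代码(并非本申请的实际实现),演示上述串行执行人脸识别、人脸匹配与活体检测,且未识别到人脸时提前返回的流程;三个检测函数均为示意桩函数:

```c
/* 示意性代码:串行的人脸识别→人脸匹配→活体检测流水线,
 * 任一环节失败即提前返回,减轻后续处理压力。 */
#include <stdio.h>
#include <stdbool.h>

static bool detect_face(const void *img)    { (void)img; return true; }
static bool match_face(const void *img)     { (void)img; return true; }
static bool liveness_check(const void *img) { (void)img; return true; }

int process_face(const void *img)
{
    if (!detect_face(img))     /* 未识别到人脸:不再做匹配与活体检测 */
        return -1;
    if (!match_face(img))      /* 与预存人脸不一致 */
        return -2;
    if (!liveness_check(img))  /* 非活体(如二维平面人脸) */
        return -3;
    return 0;                  /* 验证通过 */
}

int main(void)
{
    printf("result=%d\n", process_face(NULL));
    return 0;
}
```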
在本实施例中,通过第一处理单元110可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
请参阅图4、图5和图8,在另一个实施例中,步骤011当第一处理单元110接收到第二处理单元120发送的图像采集指令时,控制泛光灯104和镭射灯106中的至少一个开启,并控制激光摄像头102采集目标图像包括步骤021和步骤022。
021:当第一处理单元110接收到第二处理单元120发送的图像采集指令时,通过I2C总线控制开启泛光灯104和镭射灯106中的至少一个。
当电子设备100中的应用程序需要获取人脸数据时,应用程序可向第二处理单元120发送数据获取请求,其中,人脸数据可包括但不限于人脸解锁、人脸支付等场景下需要进行人脸验证的数据以及人脸深度信息等。第二处理单元120接收数据获取请求后,可向第一处理单元110发送图像采集指令,其中,第一处理单元110可以是MCU模块,第二处理单元120可以是CPU模块。
第一处理单元110和摄像头模组101中的激光摄像头102、泛光灯104、镭射灯106可与同一个I2C总线连接。I2C总线可通过一根数据线和一根时钟线实现连接于I2C总线上的各个器件之间的数据传输。第一处理单元110接收第二处理单元120发送的图像采集指令后,可通过I2C总线向同时连接在该I2C总线上的泛光灯104和/或镭射灯106发送控制指令,控制泛光灯104和镭射灯106中的至少一个开启。
在一个例子中,第一处理单元110接收图像采集指令后,可根据图像采集指令判断当前需要控制的是泛光灯104还是镭射灯106。若需要控制泛光灯104开启,则第一处理单元110可通过I2C总线对连接在该I2C总线上的泛光灯104进行寻址,然后向泛光灯104发送控制指令,控制泛光灯104开启。若需要控制镭射灯106开启,则第一处理单元110可通过I2C总线对连接在该I2C总线上的镭射灯106进行寻址,然后向镭射灯106发送控制指令,控制镭射灯106开启。
022:第一处理单元110通过I2C总线控制激光摄像头102采集目标图像。
第一处理单元110通过I2C总线控制泛光灯104和镭射灯106中的至少一个开启,可通过该I2C总线控制激光摄像头102采集目标图像,目标图像可包括红外图像、散斑图像等。第一处理单元110可通过I2C总线控制摄像头模组101中的泛光灯104开启,并通过该I2C总线控制激光摄像头102采集红外图像,其中,泛光灯104可为一种向四面八方均匀照射的面光源,泛光灯104发射的光线可为红外光,激光摄像头102可采集被人脸反射回的红外光得到红外图像。第一处理单元110可通过I2C总线控制摄像头模组101中的镭射灯106开启,并通过该I2C总线控制激光摄像头102采集散斑图像等。镭射灯106被点亮时,发出的激光可由透镜和DOE(diffractive optical elements,衍射光学元件)进行衍射产生带散斑颗粒的图案,带散斑颗粒的图案投射到目标物体后,带散斑颗粒的图案因为目标物体上的各点与电子设备的距离不同会产生散斑颗粒的偏移,激光摄像头102对散斑颗粒偏移后的图案进行采集得到散斑图像。
在一个例子中,第一处理单元110通过I2C总线对连接在该I2C总线上的泛光灯104或镭射灯106进行寻址,并向泛光灯104或镭射灯106发送控制指令,控制泛光灯104或镭射灯106开启之后,可通过I2C总线对连接在该I2C总线上的激光摄像头102进行寻址,并向该激光摄像头102发送控制指令,控制激光摄像头102采集红外图像或散斑图像。
002:通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
激光摄像头102可将采集的目标图像发送给第一处理单元110,第一处理单元110可对目标图像进行处理。第一处理单元110可设置成不同的模式,不同模式可采集不同的目标图像,并对目标图像进行不同的处理等。当第一处理单元110设置为红外模式时,第一处理单元110可通过I2C总线控制泛光灯104开启,并通过I2C总线控制激光摄像头102采集红外图像,可对红外图像进行处理得到红外视差图。当第一处理单元110设置为散斑图像模式时,第一处理单元110可通过I2C总线控制镭射灯106开启,并通过I2C总线控制激光摄像头102采集散斑图像,可对散斑图像进行处理得到散斑视差图。当第一处理单元设置为深度图模式时,第一处理单元110可通过I2C总线控制镭射灯106开启,通过I2C总线控制激光摄像头102采集散斑图像,并对散斑图像进行处理得到深度视差图。
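结合上述三种模式,下面给出一段示意性的C代码(并非本申请的实际实现),以枚举加分支的方式演示按第一处理单元110的工作模式输出不同视差图的调度逻辑;各处理函数均为示意桩函数:

```c
/* 示意性代码:按第一处理单元的工作模式对采集图像做不同处理,
 * 输出对应的视差图;各处理函数为示意桩函数。 */
#include <stdio.h>

typedef enum { MODE_IR, MODE_SPECKLE, MODE_DEPTH } mcu_mode_t;

static void out_ir_disparity(void)      { puts("infrared disparity map"); }
static void out_speckle_disparity(void) { puts("speckle disparity map"); }
static void out_depth_disparity(void)   { puts("depth disparity map"); }

void process_by_mode(mcu_mode_t m)
{
    switch (m) {
    case MODE_IR:      out_ir_disparity();      break; /* 红外模式:输出红外视差图 */
    case MODE_SPECKLE: out_speckle_disparity(); break; /* 散斑图像模式:输出散斑视差图 */
    case MODE_DEPTH:   out_depth_disparity();   break; /* 深度图模式:输出深度视差图 */
    }
}

int main(void) { process_by_mode(MODE_DEPTH); return 0; }
```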
进一步地,第一处理单元110可对目标图像进行校正处理,进行校正处理是指校正目标图像由于激光摄像头102及RGB摄像头108的内外参数等造成的图像内容偏移,例如由于激光摄像头102偏转角度、激光摄像头102和RGB摄像头108之间的摆放位置等引起的图像内容偏移等。对目标图像进行校正处理后,可得到目标图像的视差图,例如,第一处理单元110对红外图像进行校正处理得到红外视差图,对散斑图像进行校正可得到散斑视差图或深度视差图等。第一处理单元110对目标图像进行校正处理,可以防止最终在电子设备100的屏幕上呈现的图像出现重影的情况。
第一处理单元110对目标图像进行处理,可将处理后的目标图像发送给第二处理单元120。第二处理单元120可根据处理后的目标图像得到所需的图像,比如红外图像、散斑图像及深度图像等。第二处理单元120可根据应用程序的需求对所需的图像进行处理。
例如,应用程序需要进行人脸验证时,则第二处理单元120可对得到的所需的图像等进行人脸检测,其中,人脸检测可包括人脸识别、人脸匹配和活体检测。人脸识别是指识别所需的图像中是否存在人脸。人脸匹配是指将所需的图像中人脸与预存的人脸进行匹配。活体检测是指检测所需的图像中人脸是否具有生物活性等。若应用程序需要获取人脸的深度信息,则第二处理单元120可将生成的深度图像上传至应用程序,应用程序可根据接收到的深度图像进行美颜处理、三维建模等。
图8所示实施例的数据处理方法中,激光摄像头102、泛光灯104、镭射灯106和第一处理单元110与同一I2C总线连接,第一处理单元110通过该I2C总线控制开启泛光灯104和镭射灯106中的至少一个,并通过该I2C总线控制激光摄像头102采集目标图像,通过同一个I2C总线控制泛光灯104、镭射灯106和激光摄像头102,对I2C总线进行复用,可以降低控制电路的复杂度,并减少成本。
图5为图8所示实施例的数据处理方法的一个应用场景图。如图5所示,电子设备100包括激光摄像头102、镭射灯106、泛光灯104、第一处理单元110、第二处理单元120和控制器130。第一处理单元110可为MCU(Microcontroller Unit,微控制单元)模块等,第二处理单元120可为CPU(Central Processing Unit,中央处理器)模块等。第一处理单元110可与激光摄像头102、镭射灯106、泛光灯104和第二处理单元120连接。控制器130可分别与镭射灯106及泛光灯104连接,控制器130可对镭射灯106和泛光灯104进行控制。激光摄像头102、控制器130和第一处理单元110与同一I2C(Inter-Integrated Circuit,双向二线制同步串行)总线连接。
当第一处理单元110接收到第二处理单元120发送的图像采集指令时,可通过I2C总线控制开启泛光灯104和镭射灯106中的至少一个。第一处理单元110可向连接在该I2C总线的控制器130发送控制指令,控制器130接收到控制指令后,可根据控制指令控制泛光灯104和镭射灯106中的至少一个开启,第一处理单元110可通过PWM(Pulse Width Modulation,脉冲宽度调制)模块112对泛光灯104和镭射灯106进行点亮。第一处理单元110可通过该I2C总线控制激光摄像头102采集目标图像。第一处理单元110对采集的目标图像进行处理,可将处理后的目标图像发送给第二处理单元120。
图4为图8所示实施例的数据处理方法的另一个应用场景图。如图4所示,电子设备100可包括摄像头模组101、第二处理单元120和第一处理单元110。第二处理单元120可为CPU模块。第一处理单元110可为MCU模块等。其中,第一处理单元110连接在第二处理单元120和摄像头模组101之间,第一处理单元110可控制摄像头模组101中激光摄像头102、泛光灯104和镭射灯106,第二处理单元120可控制摄像头模组101中的RGB摄像头108。
摄像头模组101中包括激光摄像头102、泛光灯104、RGB摄像头108和镭射灯106。激光摄像头102可为红外摄像头,用于获取红外图像。泛光灯104为可发射红外光的面光源。镭射灯106为可发射激光且发射的激光可形成图案的点光源。其中,当泛光灯104发射红外光时,激光摄像头102可根据反射回的光线获取红外图像。当镭射灯106发射激光时,激光摄像头102可根据反射回的光线获取散斑图像。散斑图像是镭射灯106发射的形成图案的激光被反射后图案发生形变的图像。激光摄像头102、泛光灯104、镭射灯106和第一处理单元110可与同一个I2C总线连接。
第二处理单元120可包括在TEE(Trusted execution environment,可信运行环境)环境下运行的CPU内核和在REE(Rich Execution Environment,自然运行环境)环境下运行的CPU内核。其中,TEE环境和REE环境均为ARM模块(Advanced RISC Machines,高级精简指令集处理器)的运行模式。TEE环境的安全级别较高,第二处理单元120中有且仅有一个CPU内核可同时运行在TEE环境下。通常情况下,电子设备100中安全级别较高的操作行为需要在TEE环境下的CPU内核中执行,安全级别较低的操作行为可在REE环境下的CPU内核中执行。
第一处理单元110包括PWM模块112、SPI/I2C(Serial Peripheral Interface/Inter-Integrated Circuit,串行外设接口/双向二线制同步串行接口)接口114、RAM(Random Access Memory,随机存取存储器)模块116和深度引擎118。第一处理单元110可通过连接的I2C总线控制泛光灯104或镭射灯106,上述PWM模块112可向摄像头模组101发射脉冲,点亮开启的泛光灯104或镭射灯106。第一处理单元110可通过I2C控制激光摄像头102采集红外图像或散斑图像。SPI/I2C接口114用于接收第二处理单元120发送的图像采集指令。深度引擎118可对散斑图像进行处理得到深度视差图。
当第二处理单元120接收到应用程序的数据获取请求时,例如,当应用程序需要进行人脸解锁、人脸支付时,可通过运行在TEE环境下的CPU内核向第一处理单元110发送图像采集指令。当第一处理单元110接收到图像采集指令后,可通过I2C总线控制开启摄像头模组101中的泛光灯104,再通过PWM模块112发射脉冲波点亮泛光灯104,并通过I2C总线控制激光摄像头102采集红外图像,可通过I2C总线控制开启摄像头模组101中的镭射灯106,并通过I2C总线控制激光摄像头102采集散斑图像。摄像头模组101可将采集到的红外图像和散斑图像发送给第一处理单元110。第一处理单元110可对接收到的红外图像进行处理得到红外视差图,以及对接收到的散斑图像进行处理得到散斑视差图或深度视差图。其中,第一处理单元110对红外图像和散斑图像进行处理是指对红外图像或散斑图像进行校正,去除摄像头模组101的内外参数对图像的影响。其中,第一处理单元110可设置成不同的模式,不同模式输出的图像不同。当第一处理单元110设置为散斑图模式时,第一处理单元110对散斑图像处理得到散斑视差图,根据散斑视差图可得到目标散斑图像;当第一处理单元110设置为深度图模式时,第一处理单元110对散斑图像处理得到深度视差图,根据上述深度视差图可得到深度图像,深度图像是指带有深度信息的图像。第一处理单元110可将红外视差图和散斑视差图发送给第二处理单元120,第一处理单元110也可将红外视差图和深度视差图发送给第二处理单元120。第二处理单元120可根据红外视差图获取目标红外图像、根据深度视差图获取深度图像。进一步的,第二处理单元120可根据目标红外图像、深度图像来进行人脸识别、人脸匹配、活体检测以及获取检测到的人脸的深度信息。
第一处理单元110与第二处理单元120之间通信是通过固定的安全接口,用以确保传输数据的安全性。如图4所示,第二处理单元120发送给第一处理单元110的数据是通过SECURE SPI/I2C 130,第一处理单元110发送给第二处理单元120的数据是通过SECURE MIPI(Mobile Industry Processor Interface,移动产业处理器接口)140。
可选地,第一处理单元110也可根据红外视差图获取目标红外图像、根据深度视差图计算获取深度图像,再将目标红外图像、深度图像发送给第二处理单元120。
对于图8所示实施例的数据处理方法,请一并参阅图4、图5和图9,可选地,步骤通过I2C总线控制开启泛光灯和镭射灯中的至少一个包括以下步骤:
0221:根据图像采集指令确定采集的图像类型。
第一处理单元110接收第二处理单元120发送的图像采集指令,可根据图像采集指令确定采集的图像类型,其中,图像类型可以是红外图像、散斑图像及深度图像等中的一种或多种。图像类型可根据应用程序所需的人脸数据进行确定,第二处理单元120接收数据获取请求后,可根据数据获取请求确定图像类型,并向第一处理单元110发送包含该图像类型的图像采集指令。例如,当需要用于人脸解锁的数据时,可确定图像类型为红外图像及散斑图像;当需要人脸深度信息时,可确定图像类型为深度图像等,但不限于此。
0222:若图像类型为红外图像,则第一处理单元110通过I2C总线向控制器130发送第一控制指令,第一控制指令用于指示控制器130开启泛光灯104。
电子设备100中可设置有控制器130,泛光灯104和镭射灯106可共用同一个控制器130,该控制器130可分别与泛光灯104和镭射灯106连接,控制器130用于对泛光灯104和镭射灯106进行控制,可包括控制泛光灯104或镭射灯106开启,控制泛光灯104和镭射灯106之间的切换,控制泛光灯104和镭射灯106的发射功率等。控制器130可与激光摄像头102、第一处理单元110连接于同一个I2C总线。
若图像类型为红外图像,第一处理单元110可通过连接的I2C总线向控制器130发送第一控制指令,控制器130可根据第一控制指令控制泛光灯104开启。第一处理单元110可通过PWM模块112向控制器130发射脉冲,点亮泛光灯104。可选地,第一处理单元110可通过I2C总线对控制器130进行寻址,并向控制器130发送第一控制指令。
0223:若图像类型为散斑图像或深度图像,则第一处理单元110通过I2C总线向控制器130发送第二控制指令,第二控制指令用于指示控制器130开启镭射灯106。
若图像类型为散斑图像或深度图像,第一处理单元110可通过连接的I2C总线向控制器130发送第二控制指令,控制器130可根据第二控制指令控制镭射灯106开启。第一处理单元110可通过PWM模块112向控制器130发射脉冲,点亮镭射灯106。
可选地,图像类型可包括多种,可能同时包括红外图像和散斑图像,或是同时包括红外图像和深度图像,或是同时包括红外图像、散斑图像和深度图像等。第一处理单元110可以分别控制泛光灯104开启以采集红外图像、控制镭射灯106开启以采集散斑图像。第一处理单元110可控制激光摄像头102先采集红外图像,也可控制激光摄像头102先采集散斑图像,并不限定先后采集顺序。
可选地,当图像类型包括红外图像和散斑图像,或包括红外图像和深度图像时,第一处理单元110可先通过I2C总线向控制器130发送第一控制指令,开启泛光灯104,并通过I2C总线控制激光摄像头102采集红外图像,然后通过I2C总线向控制器130发送第二控制指令,开启镭射灯106,并通过I2C总线控制激光摄像头102采集散斑图像。
可选地,当图像类型包括红外图像和散斑图像,或包括红外图像和深度图像时,第一处理单元110也可先通过I2C总线向控制器130发送第二控制指令,开启镭射灯106,并通过I2C总线控制激光摄像头102采集散斑图像,然后通过I2C总线向控制器130发送第一控制指令,开启泛光灯104,并通过I2C总线控制激光摄像头102采集红外图像。
如此,对同一个I2C总线进行分时复用,可以降低控制电路的复杂度,并减少成本。
在图9所示实施例的数据处理方法中,第一处理单元110可通过一个控制器130实现泛光灯104和镭射灯106的切换和控制,可以进一步降低控制电路的复杂度,并减少成本。
对于图8所示实施例的数据处理方法,请一并参阅图4、图5和图10,可选地,步骤002通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120包括以下步骤:
0241:获取存储的参考散斑图像,参考散斑图像带有参考深度信息。
在摄像机坐标系中,以垂直成像平面并穿过镜头中心的直线为Z轴,若物体在摄像机坐标系的坐标为(X,Y,Z),那么其中的Z值即为物体在该摄像机成像平面的深度信息。若应用程序需要获取人脸的深度信息,则需要采集包含人脸深度信息的深度图像。第一处理单元110可通过I2C总线控制镭射灯106开启,并通过I2C总线控制激光摄像头102采集散斑图像。第一处理单元110中可预先存储有参考散斑图,参考散斑图可带有参考深度信息,可根据采集的散斑图像及参考散斑图像获取散斑图像中包含的各个像素点的深度信息。
0242:将参考散斑图像与散斑图像进行匹配,得到匹配结果。
第一处理单元110可依次以采集的散斑图像中包含的各个像素点为中心,选择一个预设大小的像素块,例如31pixel(像素)*31pixel大小,在参考散斑图像上搜索与选择的像素块相匹配的块。第一处理单元110可从采集的散斑图像中选择的像素块和参考散斑图像相匹配的块中,找到散斑图像及参考散斑图像中分别在同一条激光光路上的两个点,同一激光光路上的两个点的散斑信息一致,在同一条激光光路上的两个点可认定为对应的像素点。参考散斑图像中,每一条激光光路上的点的深度信息都是已知的。第一处理单元110可计算目标散斑图像与参考散斑图像在同一条激光光路上的两个对应的像素点之间的偏移量,并根据偏移量计算得到采集的散斑图中包含的各个像素点的深度信息。
在一个例子中,第一处理单元110将采集的散斑图像与参考散斑图进行偏移量的计算,根据偏移量计算得到散斑图像中包含的各个像素点的深度信息,其计算公式可如式(2)所示:
Z_D = (L × f × Z_0) / (L × f + P × Z_0)    式(2)
其中,Z_D表示像素点的深度信息,也即像素点的深度值;L为激光摄像头102与激光器(即镭射灯106)之间的距离;f为激光摄像头102中透镜的焦距;Z_0为参考散斑图像采集时参考平面距离电子设备100的激光摄像头102的深度值;P为采集的散斑图像与参考散斑图像中对应像素点之间的偏移量。P可由目标散斑图与参考散斑图中像素点偏移的像素量乘以一个像素点的实际距离得到。当目标物体与激光摄像头102之间的距离大于参考平面与激光摄像头102之间的距离时,P为负值,当目标物体与激光摄像头102之间的距离小于参考平面与激光摄像头102之间的距离时,P为正值。
0243:根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元120,通过第二处理单元120对深度视差图进行处理得到深度图。
第一处理单元110得到采集的散斑图像中包含的各个像素点的深度信息,可对采集的散斑图像进行校正处理,校正采集的散斑图像由于激光摄像头102及RGB摄像头108的内外参数等造成的图像内容偏移。第一处理单元110可根据校正后的散斑图像,以及散斑图像中各个像素点的深度值,生成深度视差图,并将深度视差图发送给第二处理单元120。第二处理单元120可根据深度视差图得到深度图,深度图中可包含各个像素点的深度信息。第二处理单元120可将深度图上传至应用程序,应用程序可根据深度图中人脸的深度信息进行美颜、三维建模等。第二处理单元120也可根据深度图中人脸的深度信息进行活体检测,可防止采集的人脸是二维的平面人脸等。
图10所示实施例的数据处理方法,通过第一处理单元110可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
对于图8所示实施例的数据处理方法,请一并参阅图4、图5及图11,可选地,数据处理方法在步骤0241获取存储的参考散斑图像之前,还包括以下步骤:
0251:每隔采集时间段采集镭射灯106的温度,并通过第二处理单元120获取与温度对应的参考散斑图像。
电子设备100可在镭射灯106旁设置有温度传感器,并通过温度传感器采集镭射灯106等的温度。第二处理单元120可每隔采集时间段获取温度传感器采集的镭射灯106的温度,其中,采集时间段可根据实际需求进行设定,例如3秒、4秒等,但不限于此。当镭射灯106的温度发生变化时,可能会使摄像头模组101发生形变,影响镭射灯106和激光摄像头102的内外参数。不同温度下对摄像头模组101的影响不同,因此,不同的温度可对应不同的参考散斑图像。
第二处理单元120可获取与温度对应的参考散斑图像,并根据与温度对应的参考散斑图像对在该温度下采集的散斑图像进行处理,得到深度图。可选地,第二处理单元120可预先设定多个不同的温度区间,比如0℃(摄氏度)~30℃,30℃~60℃,60℃~90℃等,但不限于此,不同温度区间可对应不同的参考散斑图像。第二处理单元120采集温度后,可确定该温度所处的温度区间,并获取与该温度区间对应的参考散斑图像。
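下面给出一段示意性的C代码(并非本申请的实际实现),演示按预设温度区间选择对应参考散斑图像的查表做法;其中区间划分与图像标识均为示例:

```c
/* 示意性代码:按预设温度区间查表,得到对应的参考散斑图像标识。 */
#include <stdio.h>

static const struct { int lo, hi; const char *ref_id; } ranges[] = {
    {  0, 30, "ref_speckle_0_30"  },
    { 30, 60, "ref_speckle_30_60" },
    { 60, 90, "ref_speckle_60_90" },
};

static const char *ref_for_temp(int t)
{
    for (unsigned i = 0; i < sizeof(ranges) / sizeof(ranges[0]); i++)
        if (t >= ranges[i].lo && t < ranges[i].hi)
            return ranges[i].ref_id;
    return NULL;  /* 温度超出预设区间 */
}

int main(void)
{
    printf("%s\n", ref_for_temp(42));  /* 输出 ref_speckle_30_60 */
    return 0;
}
```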
0252:当本次获取的参考散斑图像与第一处理单元110中存储的参考散斑图像不一致时,通过第二处理单元120将本次获取的参考散斑图像写入第一处理单元110。
第二处理单元120获取与采集的温度对应的参考散斑图像后,可判断本次获取的参考散斑图像与第一处理单元110中存储的参考散斑图像是否一致,参考散斑图像中可携带有图像标识,图像标识可以由数字、字线及字符等中的一种或多种组成。第二处理单元120可从第一处理单元110中读取存储的参考散斑图像的图像标识,并将本次获取的参考散斑图像的图像标识与从第一处理单元110读取的图像标识进行比较。若两个图像标识不一致,则可说明本次获取的参考散斑图像与第一处理单元110中存储的参考散斑图像不一致,则第二处理单元120可将本次获取的参考散斑图像写入第一处理单元110。第一处理单元110可存储新写入的参考散斑图像,并删除之前存储的参考散斑图像。
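下面给出一段示意性的C代码(并非本申请的实际实现),演示比较图像标识并在不一致时写入新参考散斑图像的流程;读写均以桩函数示意:

```c
/* 示意性代码:比较本次获取的参考散斑图像标识与已存标识,
 * 不一致时写入新的参考散斑图像并覆盖旧标识。 */
#include <stdio.h>
#include <string.h>

static char stored_id[32] = "ref_speckle_0_30";  /* 模拟第一处理单元中已存标识 */

static void write_reference(const char *id)
{
    /* 实际实现中:将新的参考散斑图像写入第一处理单元并删除旧图像 */
    strncpy(stored_id, id, sizeof(stored_id) - 1);
    printf("written new reference: %s\n", id);
}

void sync_reference(const char *current_id)
{
    if (strcmp(current_id, stored_id) != 0)
        write_reference(current_id);  /* 标识不一致,更新参考散斑图像 */
    else
        printf("reference up to date\n");
}

int main(void)
{
    sync_reference("ref_speckle_30_60");
    return 0;
}
```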
图11所示实施例的数据处理方法,可根据镭射灯106的温度获取与温度对应的参考散斑图像,减少温度对最后输出的深度图的影响,使得到的深度信息更为精准。
对于图8所示实施例的数据处理方法,请一并参阅图4、图5及图12,可选地,在步骤021之前,数据处理方法还包括步骤0261。步骤002中将处理后的目标图像发送给第二处理单元120的步骤包括步骤0262。
0261:通过第二处理单元120中运行在第一运行模式的内核向第一处理单元110发送图像采集指令,第一运行模式为可信运行环境。
电子设备100中第二处理单元120可包括两种运行模式,其中,第一运行模式可以为TEE,TEE为可信运行环境,安全级别高;第二运行模式可以为REE,REE为自然运行环境,REE的安全级别较低。当第二处理单元120接收到应用程序发送的数据获取请求后,可通过第一运行模式向第一处理单元110发送图像采集指令。当第二处理单元120为单核的CPU时,可直接将上述单核由第二运行模式切换到第一运行模式;当第二处理单元120为多核时,可将一个内核由第二运行模式切换到第一运行模式,其他内核仍运行在第二运行模式中,并通过运行在第一运行模式下的内核向第一处理单元110发送图像采集指令。
0262:第一处理单元110将处理后的目标图像发送给第二处理单元120中运行在第一运行模式的内核。
第一处理单元110对采集的目标图像进行处理后,可将处理后的目标图像发送给该运行在第一运行模式下的内核,可保证第一处理单元110一直在可信运行环境下运行,提高安全性。第二处理单元120可在该运行在第一运行模式下的内核中,根据处理后的目标图像得到所需的图像,并根据应用程序的需求对所需的图像进行处理。比如,第二处理单元120可在运行在第一运行模式下的内核中对所需的图像进行人脸检测。
在一个例子中,由于运行在第一运行模式的内核是唯一的,第二处理单元120在TEE环境下对目标图像进行人脸检测,可采用串行的方式逐一对目标图像进行人脸识别、人脸匹配和活体检测等。第二处理单元120可先对目标图像进行人脸识别,当识别到人脸时,再将目标图像中包含的人脸与预先存储的人脸进行匹配,判断是否为同一人脸。若为同一人脸再根据目标图像对人脸进行活体检测,防止采集的人脸是二维的平面人脸等。当没有识别到人脸时,可不进行人脸匹配和活体检测,可减轻第二处理单元120的处理压力。
图12所示实施例的数据处理方法,通过第二处理单元120安全性高的内核向第一处理单元110发送图像采集指令,可保证第一处理单元110处于安全性高的环境中,提高数据的安全。
请一并参阅图4、图13和图14,在又一个实施例中,步骤011当第一处理单元110接收到第二处理单元120发送的图像采集指令时,控制泛光灯104和镭射灯106中的至少一个开启,并控制激光摄像头102采集目标图像包括步骤031、步骤032和步骤033。
031:当第一处理单元110接收到第二处理单元120发送的图像采集指令时,根据图像采集指令确定图像类型。
当电子设备100中的应用程序需要获取人脸数据时,应用程序可向第二处理单元120发送数据获取请求,其中,人脸数据可包括但不限于人脸解锁、人脸支付等场景下需要进行人脸验证的数据以及人脸深度信息等。第二处理单元120接收数据获取请求后,可向第一处理单元110发送图像采集指令,其中,第一处理单元110可以是MCU模块,第二处理单元120可以是CPU模块。
第一处理单元110接收第二处理单元120发送的图像采集指令,可根据图像采集指令确定采集的图像类型,其中,图像类型可以是红外图像、散斑图像及深度图像等中的一种或多种。图像类型可根据应用程序所需的人脸数据进行确定,第二处理单元120接收数据获取请求后,可根据数据获取请求确定图像类型,并向第一处理单元110发送包含该图像类型的图像采集指令。例如,当应用程序需要用于人脸解锁的数据时,第二处理单元120可确定图像类型为红外图像及散斑图像;当需要人脸深度信息时,可进一步确定图像类型为深度图像等,但不限于此。
032:若图像类型为第一类型,则开启摄像头模组101中的泛光灯104,并通过第一脉冲宽度调制PWM模块1121向第一控制器131发送脉冲,点亮泛光灯104,再通过摄像头模组101中的激光摄像头102采集与第一类型对应的目标图像。
若图像类型为第一类型,在本实施例中,第一类型可以是红外图像,则第一处理单元110可向第一控制器131发送控制指令,该控制指令可用于开启摄像头模组101中的泛光灯104。第一处理单元110可通过第一PWM模块1121向用于控制泛光灯104的第一控制器131发送脉冲信号,点亮泛光灯104。可选地,第一PWM模块1121可按照一定电压幅度、一定时间间隔向泛光灯104连续发出脉冲信号,控制点亮泛光灯104。泛光灯104可为一种向四面八方均匀照射的面光源,当泛光灯104被点亮时,可发射红外光,激光摄像头102可采集被人脸反射回的红外光得到红外图像。
033:若图像类型为第二类型,则开启摄像头模组101中的镭射灯106,并通过第二PWM模块1122向第二控制器132发送脉冲,点亮镭射灯106,再通过摄像头模组101中的激光摄像头102采集与第二类型对应的目标图像。
若图像类型为第二类型,在本实施例中,第二类型可以是散斑图像或深度图像等,则第一处理单元110可向第二控制器132发送控制指令,该控制指令可用于开启摄像头模组101中的镭射灯106。第一处理单元110可通过第二PWM模块1122向用于控制镭射灯106的第二控制器132发送脉冲信号,点亮镭射灯106。可选地,第二PWM模块1122可按照一定电压幅度、一定时间间隔向镭射灯106连续发出脉冲信号,控制点亮镭射灯106。镭射灯106被点亮时,发出的激光可由透镜和DOE(diffractive optical elements,衍射光学元件)进行衍射产生带散斑颗粒的图案,带散斑颗粒的图案投射到目标物体后,带散斑颗粒的图案因为目标物体上的各点与电子设备100的距离不同会产生散斑图案的偏移,激光摄像头102采集散斑颗粒偏移后的图案得到散斑图像。
002:通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
激光摄像头102可将采集的目标图像发送给第一处理单元110,第一处理单元110可对目标图像进行处理,其中,目标图像可包括红外图像、散斑图像等。第一处理单元110根据图像采集指令确定图像类型后,可根据确定的图像类型采集与图像类型对应的目标图像,并对目标图像进行相应的处理。当图像类型为红外图像时,第一处理单元110可通过第一PWM模块1121向第一控制器131发送脉冲,点亮泛光灯104,并通过激光摄像头102采集红外图像,再对红外图像进行处理得到红外视差图。当图像类型为散斑图像时,第一处理单元110可通过第二PWM模块1122向第二控制器132发送脉冲,点亮镭射灯106,并通过激光摄像头102采集散斑图像,再对散斑图像进行处理得到散斑视差图。当图像类型为深度图像时,第一处理单元110可采集散斑图像,并对采集的散斑图像进行处理得到深度视差图。
进一步地,第一处理单元110可对目标图像进行校正处理,进行校正处理是指校正目标图像由于激光摄像头102及RGB摄像头108的内外参数等造成的图像内容偏移,例如由于激光摄像头102偏转角度、激光摄像头102和RGB摄像头108之间的摆放位置等引起的图像内容偏移等。第一处理单元110对目标图像进行校正处理后,可得到目标图像的视差图,例如,对红外图像进行校正处理得到红外视差图,对散斑图像进行校正可得到散斑视差图或深度视差图等。第一处理单元110对目标图像进行校正处理,可以防止最终在电子设备100的屏幕上呈现的图像出现重影的情况。
第一处理单元110对目标图像进行处理,可将处理后的目标图像发送给第二处理单元120。第二处理单元120可根据处理后的目标图像得到所需的图像,比如红外图像、散斑图像及深度图像等。第二处理单元120可根据应用程序的需求对所需的图像进行处理。
例如,应用程序需要进行人脸验证时,则第二处理单元120可对得到的所需的图像等进行人脸检测,其中,人脸检测可包括人脸识别、人脸匹配和活体检测。人脸识别是指识别所需的图像中是否存在人脸。人脸匹配是指将所需的图像中的人脸与预存的人脸进行匹配。活体检测是指检测所需的图像中人脸是否具有生物活性等。若应用程序需要获取人脸的深度信息,则可将生成的深度图像上传至应用程序,应用程序可根据接收到的深度图像进行美颜处理、三维建模等。
图13所示实施例的数据处理方法中,当第一处理单元110接收到第二处理单元120发送的图像采集指令时,根据图像采集指令确定图像类型,若图像类型为第一类型,通过第一PWM模块1121点亮泛光灯104并通过激光摄像头102采集与第一类型对应的目标图像,若图像类型为第二类型,则通过第二PWM模块1122向第二控制器132发送脉冲,点亮镭射灯106并通过激光摄像头102采集与第二类型对应的目标图像,通过两个PWM模块分别控制泛光灯104和镭射灯106,无需进行实时切换,可以降低数据处理复杂度,并减轻第一处理单元110的处理压力。
图14为图13所示实施例的数据处理方法的一个应用场景图。如图14所示,该数据处理方法可应用于电子设备100,电子设备100可包括激光摄像头102、泛光灯104、镭射灯106、第一处理单元110、第二处理单元120、第一控制器131和第二控制器132。第一处理单元110可分别与激光摄像头102和第二处理单元120连接,其中,第一处理单元110可为MCU(Microcontroller Unit,微控制单元)模块等,第二处理单元120可为CPU(Central Processing Unit,中央处理器)模块等。第一控制器131与泛光灯104连接,第二控制器132与镭射灯106连接。第一处理单元110可包括第一PWM(Pulse Width Modulation,脉冲宽度调制)模块1121和第二PWM模块1122,第一处理单元110通过第一PWM模块1121与第一控制器131连接,第一处理单元110通过第二PWM模块1122与第二控制器132连接。
当第一处理单元110接收到第二处理单元120发送的图像采集指令时,根据图像采集指令确定图像类型。若图像类型为第一类型,则开启泛光灯104,并通过第一PWM模块1121向第一控制器131发送脉冲,点亮泛光灯104,再通过激光摄像头102采集与第一类型对应的目标图像。若图像类型为第二类型,则开启镭射灯106,并通过第二PWM模块1122向第二控制器132发送脉冲,点亮镭射灯106,再通过激光摄像头102采集与第二类型对应的目标图像。第一处理单元110可对激光摄像头102采集的目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
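下面给出一段示意性的C代码(并非本申请的实际实现),基于Linux sysfs的PWM接口演示用两路独立PWM分别向第一、第二控制器发送脉冲;其中PWM通道需预先导出(export),通道号、周期与占空比均为假设值:

```c
/* 示意性代码:两路独立PWM分别驱动第一控制器/泛光灯与第二控制器/镭射灯。
 * 使用前需先将通道号写入 /sys/class/pwm/pwmchip0/export。 */
#include <stdio.h>

static int pwm_write(const char *path, const char *val)
{
    FILE *f = fopen(path, "w");
    if (!f) return -1;
    fprintf(f, "%s", val);
    fclose(f);
    return 0;
}

/* chan 0 → 第一PWM/泛光灯,chan 1 → 第二PWM/镭射灯(通道分配为假设) */
static int pwm_pulse(int chan)
{
    char path[64];
    snprintf(path, sizeof(path), "/sys/class/pwm/pwmchip0/pwm%d/period", chan);
    if (pwm_write(path, "1000000")) return -1;  /* 周期1ms(假设值) */
    snprintf(path, sizeof(path), "/sys/class/pwm/pwmchip0/pwm%d/duty_cycle", chan);
    if (pwm_write(path, "500000")) return -1;   /* 占空比50%(假设值) */
    snprintf(path, sizeof(path), "/sys/class/pwm/pwmchip0/pwm%d/enable", chan);
    return pwm_write(path, "1");
}

int main(void)
{
    if (pwm_pulse(0) == 0)  /* 第一PWM:点亮泛光灯,随后采集红外图像 */
        puts("flood light pulsed");
    if (pwm_pulse(1) == 0)  /* 第二PWM:点亮镭射灯,随后采集散斑图像 */
        puts("laser light pulsed");
    return 0;
}
```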
图4为图13所示实施例的数据处理方法的另一个应用场景图。如图4所示,电子设备100可包括摄像头模组101、第二处理单元120和第一处理单元110。第二处理单元120可为CPU模块。第一处理单元110可为MCU模块等。其中,第一处理单元110连接在第二处理单元120和摄像头模组101之间,第一处理单元110可控制摄像头模组101中的激光摄像头102、泛光灯104和镭射灯106,第二处理单元120可控制摄像头模组101中的RGB摄像头108。
摄像头模组101包括激光摄像头102、泛光灯104、RGB摄像头108和镭射灯106。激光摄像头102可为红外摄像头,用于获取红外图像。泛光灯104为可发射红外光的面光源。镭射灯106为可发射激光且发射的激光可形成图案的点光源。其中,当泛光灯104发射红外光时,激光摄像头102可根据反射回的光线获取红外图像。当镭射灯106发射激光时,激光摄像头102可根据反射回的光线获取散斑图像。散斑图像是镭射灯106发射的形成图案的激光被反射后图案发生形变的图像。
第二处理单元120可包括在TEE(Trusted execution environment,可信运行环境)环境下运行的CPU内核和在REE(Rich Execution Environment,自然运行环境)环境下运行的CPU内核。其中,TEE环境和REE环境均为ARM模块(Advanced RISC Machines,高级精简指令集处理器)的运行模式。TEE环境的安全级别较高,第二处理单元120中有且仅有一个CPU内核可同时运行在TEE环境下。通常情况下,电子设备100中安全级别较高的操作行为需要在TEE环境下的CPU内核中执行,安全级别较低的操作行为可在REE环境下的CPU内核中执行。
第一处理单元110包括PWM模块112、SPI/I2C(Serial Peripheral Interface/Inter-Integrated Circuit,串行外设接口/双向二线制同步串行接口)接口114、RAM(Random Access Memory,随机存取存储器)模块116和深度引擎118。PWM模块112可包括第一PWM模块1121和第二PWM模块1122,其中,第一PWM模块1121可与泛光灯104的第一控制器131连接,控制泛光灯104开启,并向泛光灯104发送脉冲点亮泛光灯104;第二PWM模块1122可与镭射灯106的第二控制器132连接,控制镭射灯106开启,并向镭射灯106发送脉冲点亮镭射灯106。SPI/I2C接口114用于接收第二处理单元120发送的图像采集指令。深度引擎118可对散斑图像进行处理得到深度视差图。
当第二处理单元120接收到应用程序的数据获取请求时,例如,当应用程序需要进行人脸解锁、人脸支付时,可通过运行在TEE环境下的CPU内核向第一处理单元110发送图像采集指令。当第一处理单元110接收到图像采集指令后,可通过PWM模块112中的第一PWM模块1121发射脉冲波点亮泛光灯104,并通过激光摄像头102采集红外图像,可通过PWM模块112中的第二PWM模块1122发射脉冲波点亮镭射灯106,并通过激光摄像头102采集散斑图像。摄像头模组101可将采集到的红外图像和散斑图像发送给第一处理单元110。第一处理单元110可对接收到的红外图像进行处理得到红外视差图,以及对接收到的散斑图像进行处理得到散斑视差图或深度视差图。其中,第一处理单元110对红外图像和散斑图像进行处理是指对红外图像或散斑图像进行校正,去除摄像头模组101中的内外参数对图像的影响。第一处理单元110可设置成不同的模式,不同模式输出的图像不同。当第一处理单元110设置为散斑图模式时,第一处理单元110对散斑图像处理得到散斑视差图,根据散斑视差图可得到目标散斑图像;当第一处理单元110设置为深度图模式时,第一处理单元110对散斑图像处理得到深度视差图,根据深度视差图可得到深度图像,深度图像是指带有深度信息的图像。第一处理单元110可将红外视差图 和散斑视差图发送给第二处理单元120,第一处理单元110也可将红外视差图和深度视差图发送给第二处理单元120。第二处理单元120可根据红外视差图获取目标红外图像、根据深度视差图获取深度图像。进一步的,第二处理单元120可根据目标红外图像、深度图像来进行人脸识别、人脸匹配、活体检测以及获取检测到的人脸的深度信息。
第一处理单元110与第二处理单元120之间通信是通过固定的安全接口,用以确保传输数据的安全性。如图4所示,第二处理单元120发送给第一处理单元110的数据是通过SECURE SPI/I2C 130,第一处理单元110发送给第二处理单元120的数据是通过SECURE MIPI(Mobile Industry Processor Interface,移动产业处理器接口)140。
可选地,第一处理单元110也可根据红外视差图获取目标红外图像、根据深度视差图计算获取深度图像,再将目标红外图像、深度图像发送给第二处理单元120。
对于图13所示实施例的数据处理方法,请结合图4和图14,可选地,在步骤031当第一处理单元110接收到第二处理单元120发送的图像采集指令时,根据图像采集指令确定图像类型之前,还包括:当检测到摄像头模组101启动时,第二处理单元120通过I2C总线分别对泛光灯104和镭射灯106进行配置。
当电子设备100的应用程序需要通过摄像头模组101采集所需的图像数据时,可启动摄像头模组101,并通过摄像头模组101采集图像。当电子设备100检测到摄像头模组101启动时,第二处理单元120可通过I2C总线分别对泛光灯104和镭射灯106进行配置,其中,I2C总线可通过一根数据线和一根时钟线实现连接于I2C总线上的各个器件之间的数据传输。第二处理单元120可先读取配置文件,并根据配置文件中包含的参数对泛光灯104和镭射灯106进行配置。配置文件中可记录有泛光灯104和镭射灯106的发射功率、发射电流等参数,但不限于此,也可以是其他参数。第二处理单元120可根据配置文件中的参数对泛光灯104和镭射灯106的发射功率、发射电流等参数进行配置。
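下面给出一段示意性的C代码(并非本申请的实际实现),演示第二处理单元读取配置文件中的发射功率、发射电流等参数并分别配置泛光灯与镭射灯的流程;配置文件格式、器件地址均为假设,I2C写入以桩函数示意:

```c
/* 示意性代码:摄像头模组启动时,读取配置文件并依次配置泛光灯与镭射灯。
 * 文件名、文件格式与器件地址均为假设值。 */
#include <stdio.h>

struct light_cfg { int power; int current; };  /* 发射功率/发射电流(示例单位) */

static int i2c_cfg_write(int addr, const struct light_cfg *c)
{
    /* 实际实现中:通过I2C总线寻址器件addr并写入配置寄存器 */
    printf("dev 0x%02x: power=%d current=%d\n", addr, c->power, c->current);
    return 0;
}

int main(void)
{
    struct light_cfg flood = { 0 }, laser = { 0 };
    FILE *f = fopen("light.cfg", "r");  /* 配置文件名为假设 */
    if (!f || fscanf(f, "%d %d %d %d", &flood.power, &flood.current,
                     &laser.power, &laser.current) != 4) {
        fprintf(stderr, "bad config\n");
        if (f) fclose(f);
        return 1;
    }
    fclose(f);
    i2c_cfg_write(0x41, &flood);  /* 先寻址并配置泛光灯(地址为假设) */
    i2c_cfg_write(0x42, &laser);  /* 再寻址并配置镭射灯(地址为假设) */
    return 0;
}
```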
对于图13所示实施例的数据处理方法,可选地,第二处理单元120可通过同一个I2C总线分别与泛光灯104和镭射灯106连接,泛光灯104、镭射灯106和第二处理单元120可连接在同一个I2C总线上。第二处理单元120对泛光灯104和镭射灯106进行配置时,可先通过I2C总线对泛光灯104进行寻址,并对泛光灯104进行配置,再通过I2C总线对镭射灯106进行寻址,并对镭射灯106进行配置。可选地,第二处理单元120也可先通过I2C总线对镭射灯106进行寻址,并对镭射灯106进行配置,再通过I2C总线对泛光灯104进行寻址,并对泛光灯104进行配置。通过对连接的同一个I2C总线进行分时复用,可以降低控制电路的复杂度,节省资源,降低成本。
对于图13所示实施例的数据处理方法,可选地,第二处理单元120还可通过两个I2C总线分别与泛光灯104和镭射灯106连接,第二处理单元120可通过一个I2C总线与泛光灯104连接,并通过另一个I2C总线与镭射灯106连接。当第二处理单元120对泛光灯104和镭射灯106进行配置时,可通过与泛光灯104连接的I2C总线对泛光灯104进行寻址,并对泛光灯104进行配置,同时可通过与镭射灯106连接的I2C总线对镭射灯106进行寻址,并对镭射灯106进行配置。通过两个I2C总线分别连接泛光灯104和镭射灯106,可以并行对泛光灯104和镭射灯106进行配置,提高数据处理速度。
图15为一个例子的第二处理单元120通过I2C总线与泛光灯104和镭射灯106连接的示意图。如图15所示,在(1)中,第二处理单元120通过同一个I2C总线分别连接泛光灯104和镭射灯106。在(2)中,第二处理单元120通过两个I2C总线分别与泛光灯104和镭射灯106连接,第二处理单元120可通过一个I2C总线与泛光灯104连接,并通过另一个I2C总线与镭射灯106连接。
图15所示实施例的数据处理方法,第二处理单元120可在摄像头模组101启动时,通过I2C总线对泛光灯104和镭射灯106进行配置,可以更加精准地控制图像采集,并提高了数据处理效率。
对于图13所示实施例的数据处理方法,可选地,请参阅图14,第一PWM模块1121向第一控制器131发送脉冲的时刻与第二PWM模块1122向第二控制器132发送脉冲的时刻不同,第一PWM模块1121向第一控制器131发送脉冲的时刻与第二PWM模块1122向第二控制器132发送脉冲的时刻之间的时间间隔小于时间阈值。
第一处理单元110根据图像采集指令确定图像类型,图像类型可包含有至少两种,比如,图像类型可同时包含第一类型和第二类型。当图像类型包含红外图像和散斑图像,或是包含红外图像和深度图像时,需要同时采集红外图像和散斑图像。第一处理单元110可同时通过第一PWM模块1121向第一控制器131发送脉冲,通过第二PWM模块1122向第二控制器132发送脉冲,对泛光灯104和镭射灯106进行点亮。第一PWM模块1121向第一控制器131发送脉冲的时刻与第二PWM模块1122向第二控制器132发送脉冲的时刻可不同,从而在不同的时刻点亮泛光灯104和镭射灯106。第一处理单元110可在第一PWM模块1121向第一控制器131发送脉冲的时刻通过激光摄像头102采集红外图像,可在第二PWM模块1122向第二控制器132发送脉冲的时刻通过激光摄像头102采集散斑图像。
可选地,第一PWM模块1121向第一控制器131发送脉冲的时刻与第二PWM模块1122向第二控制器132发送脉冲的时刻之间的时间间隔小于时间阈值,激光摄像头102在采集完红外图像后可在小于时间阈值的时间间隔中采集到散斑图像,使得采集的红外图像与散斑图像的图像内容较为一致,方便后续进行人脸检测等处理。时间阈值可根据实际需求进行设定,例如20毫秒、30毫秒等。在本实施例中,第一处理单元110可通过激光摄像头102在不同时刻分别采集红外图像及散斑图像,且可保证采集的红外图像与散斑图像的图像内容较为一致,提高后续人脸检测的准确性。
对于图13所示实施例的数据处理方法,请结合图14及图16,可选地,步骤002通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120,包括以下步骤:
0341:获取存储的参考散斑图像,参考散斑图像带有参考深度信息。
在摄像机坐标系中,以垂直成像平面并穿过镜头中心的直线为Z轴,若物体在摄像机坐标系的坐标为(X,Y,Z),那么其中的Z值即为物体在该摄像机成像平面的深度信息。若应用程序需要获取人脸的深度信息,则需要采集包含人脸深度信息的深度图像。第一处理单元110可通过I2C总线控制镭射灯106开启,并通过I2C总线控制激光摄像头102采集散斑图像。第一处理单元110中可预先存储有参考散斑图,参考散斑图可带有参考深度信息,可根据采集的散斑图像及参考散斑图像获取散斑图像中包含的各个像素点的深度信息。
0342:将参考散斑图像与散斑图像进行匹配,得到匹配结果。
第一处理单元110可依次以采集的散斑图像中包含的各个像素点为中心,选择一个预设大小的像素块,例如31pixel(像素)*31pixel大小,在参考散斑图像上搜索与选择的像素块相匹配的块。第一处理单元110可从采集的散斑图像中选择的像素块和参考散斑图像相匹配的块中,找到散斑图像及参考散斑图像中分别在同一条激光光路上的两个点,同一激光光路上的两个点的散斑信息一致,在同一条激光光路上的两个点可认定为对应的像素点。参考散斑图像中,每一条激光光路上的点的深度信息都是已知的。第一处理单元110可计算目标散斑图像与参考散斑图像在同一条激光光路上的两个对应的像素点之间的偏移量,并根据偏移量计算得到采集的散斑图中包含的各个像素点的深度信息。
在一个例子中,第一处理单元110将采集的散斑图像与参考散斑图进行偏移量的计算,根据偏移量计算得到散斑图像中包含的各个像素点的深度信息,其计算公式可如式(3)所示:
Z_D = (L × f × Z_0) / (L × f + P × Z_0)    式(3)
其中,Z_D表示像素点的深度信息,也即像素点的深度值;L为激光摄像头102与激光器(即镭射灯106)之间的距离;f为激光摄像头102中透镜的焦距;Z_0为参考散斑图像采集时参考平面距离电子设备100的激光摄像头102的深度值;P为采集的散斑图像与参考散斑图像中对应像素点之间的偏移量。P可由目标散斑图与参考散斑图中像素点偏移的像素量乘以一个像素点的实际距离得到。当目标物体与激光摄像头102之间的距离大于参考平面与激光摄像头102之间的距离时,P为负值,当目标物体与激光摄像头102之间的距离小于参考平面与激光摄像头102之间的距离时,P为正值。
0343:根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元120,通过第二处理单元120对深度视差图进行处理得到深度图。
第一处理单元110得到采集的散斑图像中包含的各个像素点的深度信息,可对采集的散斑图像进行校正处理,校正采集的散斑图像由于激光摄像头102及RGB摄像头108的内外参数等造成的图像内容偏移。第一处理单元110可根据校正后的散斑图像,以及散斑图像中各个像素点的深度值,生成深度视差图,并将深度视差图发送给第二处理单元120。第二处理单元120可根据深度视差图得到深度图,深度图中可包含各个像素点的深度信息。第二处理单元120可将深度图上传至应用程序,应用程序可根据深度图中人脸的深度信息进行美颜、三维建模等。第二处理单元120也可根据深度图中人脸的深度信息进行活体检测,可防止采集的人脸是二维的平面人脸等。
图16所示实施例的数据处理方法,通过第一处理单元110可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
对于图13所示实施例的数据处理方法,请参阅图14和图17,可选地,数据处理方法在步骤0341获取存储的参考散斑图像之前,还包括以下步骤:
0351:每隔采集时间段采集镭射灯106的温度,并通过第二处理单元120获取与温度对应的参考散斑图像。
电子设备100可在镭射灯106旁设置有温度传感器,并通过温度传感器采集镭射灯106等的温度。第二处理单元120可每隔采集时间段获取温度传感器采集的镭射灯106的温度,其中,采集时间段可根据实际需求进行设定,例如3秒、4秒等,但不限于此。当镭射灯106的温度发生变化时,可能会使摄像头模组101发生形变,影响镭射灯106和激光摄像头102的内外参数。不同温度下对摄像头模组101的影响不同,因此,不同的温度可对应不同的参考散斑图像。
第二处理单元120可获取与温度对应的参考散斑图像,并根据与温度对应的参考散斑图像对在该温度下采集的散斑图像进行处理,得到深度图。可选地,第二处理单元可预先设定多个不同的温度区间,比如0℃(摄氏度)~30℃,30℃~60℃,60℃~90℃等,但不限于此,不同温度区间可对应不同的参考散斑图像。第二处理单元120采集温度后,可确定该温度所处的温度区间,并获取与该温度区间对应的参考散斑图像。
0352:当本次获取的参考散斑图像与第一处理单元110中存储的参考散斑图像不一致时,通过第二处理单元120将本次获取的参考散斑图像写入第一处理单元110。
第二处理单元120获取与采集的温度对应的参考散斑图像后,可判断本次获取的参考散斑图像与第一处理单元110中存储的参考散斑图像是否一致,参考散斑图像中可携带有图像标识,图像标识可以由数字、字母及字符等中的一种或多种组成。第二处理单元120可从第一处理单元110中读取存储的参考散斑图像的图像标识,并将本次获取的参考散斑图像的图像标识与从第一处理单元110读取的图像标识进行比较。若两个图像标识不一致,则可说明本次获取的参考散斑图像与第一处理单元110中存储的参考散斑图像不一致,则第二处理单元120可将本次获取的参考散斑图像写入第一处理单元110。第一处理单元110可存储新写入的参考散斑图像,并删除之前存储的参考散斑图像。
图17所示实施例的数据处理方法,可根据镭射灯106的温度获取与温度对应的参考散斑图像,减少温度对最后输出的深度图的影响,使得到的深度信息更为精准。
本申请提供的数据处理方法包括以下步骤:
001:当第一处理单元110接收到第二处理单元120发送的图像采集指令时,控制泛光灯104和镭射灯106中的至少一个开启,并控制激光摄像头102采集目标图像;和
002:通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
在一个实施例中,步骤001包括步骤011和步骤012。步骤011,当第一处理单元110接收到第二处理单元120发送的图像采集指令时,通过双向二线制同步串行I2C总线向控制器130发送控制指令,控制指令用于控制开启泛光灯104和镭射灯106中的至少一个。
可选地,步骤001包括:根据图像采集指令确定采集的图像类型;若图像类型为第一类型,则第一处理单元110通过I2C总线向控制器130发送第一控制指令,第一控制指令用于指示控制器130开启泛光灯104;若图像类型为第二类型,则第一处理单元110通过I2C总线向控制器130发送第二控制指令,第二控制指令用于指示控制器130开启镭射灯106。
可选地,在步骤根据图像采集指令确定采集的图像类型之后,数据处理方法还包括:当图像类型包括第一类型和第二类型时,第一处理单元110通过I2C总线向控制器130发送第一控制指令,开启泛光灯104;在通过激光摄像头102采集与第一类型对应的目标图像之后,通过I2C总线向控制器130发送第二控制指令,开启镭射灯106。
可选地,在步骤根据图像采集指令确定采集的图像类型之后,数据处理方法还包括:当图像类型包括第一类型和第二类型时,第一处理单元110通过I2C总线向控制器130发送第二控制指令,开启镭射灯106;在通过激光摄像头102采集与第二类型对应的目标图像之后,通过I2C总线向控制器130发送第一控制指令,开启泛光灯104。
可选地,第一处理单元110发送第一控制指令的时刻与发送第二控制指令的时刻之间的时间间隔小于时间阈值。
步骤012,通过脉冲宽度调制PWM模块112向控制器130发送脉冲,点亮开启的泛光灯104和镭射灯106中的至少一个,并通过激光摄像头102采集目标图像。
可选地,第一处理单元110、控制器130和激光摄像头102连接在同一个I2C总线上;步骤通过激光摄像头采集目标图像包括:通过I2C总线控制激光摄像头102采集目标图像。
步骤002,通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
可选地,目标图像包括散斑图像;步骤002包括:获取存储的参考散斑图像,参考散斑图像带有参考深度信息;将参考散斑图像与散斑图像进行匹配,得到匹配结果;根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元,通过第二处理单元对深度视差图进行处理得到深度图。
在本实施例中,当第一处理单元110接收到第二处理单元120发送的图像采集指令时,通过I2C总线向控制器130发送控制指令,控制开启泛光灯104和镭射灯106中的至少一个,并通过PWM模块112向控制器130发送脉冲,点亮开启的泛光灯104和镭射灯106中的至少一个,采集目标图像后再对目标图像进行处理,通过一个控制器130即可实现对泛光灯104和镭射灯106的控制,可以降低控制泛光灯104和镭射灯106等的复杂度,且节约成本。
在另一个实施例中,步骤001包括步骤021和步骤022。本实施例的数据处理方法应用于电子设备100,电子设备100包括摄像头模组101、第一处理单元110和第二处理单元120,第一处理单元110分别与第二处理单元120和摄像头模组101相连;摄像头模组101包括激光摄像头102、泛光灯104和镭射灯106。激光摄像头102、泛光灯104、镭射灯106和第一处理单元110与同一双向二线制同步串行I2C总线连接。
步骤021,当第一处理单元110接收到第二处理单元120发送的图像采集指令时,通过I2C总线控制开启泛光灯104和镭射灯106中的至少一个。
可选地,电子设备100还包括控制器130,控制器130用于控制泛光灯104和镭射灯106,控制器130与I2C总线连接。步骤021包括:根据图像采集指令确定采集的图像类型;若图像类型为红外图像,则第一处理单元110通过I2C总线向控制器130发送第一控制指令,第一控制指令用于指示控制器130开启泛光灯104;若图像类型为散斑图像或深度图像,则第一处理单元110通过I2C总线向控制器130发送第二控制指令,第二控制指令用于指示控制器130开启镭射灯106。
可选地,在步骤根据图像采集指令确定采集的图像类型之后,数据处理方法还包括:当图像类型包括红外图像和散斑图像,或包括红外图像和深度图像时,第一处理单元110通过I2C总线向控制器130发送第一控制指令,开启泛光灯104,并通过I2C总线控制激光摄像头102采集红外图像,然后通过I2C总线向控制器130发送第二控制指令,开启镭射灯106,并通过I2C总线控制激光摄像头102采集散斑图像。
可选地,在步骤根据图像采集指令确定采集的图像类型之后,数据处理方法还包括:当图像类型包括红外图像和散斑图像,或包括红外图像和深度图像时,第一处理单元110通过I2C总线向控制器130发送第二控制指令,开启镭射灯106,并通过I2C总线控制激光摄像头102采集散斑图像,然后通过I2C总线向控制器130发送第一控制指令,开启泛光灯104,并通过I2C总线控制激光摄像头102采集红外图像。
步骤022,第一处理单元110通过I2C总线控制激光摄像头102采集目标图像。
步骤002,通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
可选地,步骤002包括:获取存储的参考散斑图像,参考散斑图像带有参考深度信息;将参考散斑图像与散斑图像进行匹配,得到匹配结果;根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元120,通过第二处理单元120对深度视差图进行处理得到深度图。
可选地,在步骤获取存储的参考散斑图像之前,数据处理方法还包括:每隔采集时间段采集镭射灯106的温度,并通过第二处理单元120获取与温度对应的参考散斑图像;当本次获取的参考散斑图像与第一处理单元110中存储的参考散斑图像不一致时,通过第二处理单元120将本次获取的参考散斑图像写入第一处理单元110。
可选地,在步骤021之前,数据处理方法还包括:通过第二处理单元120中运行在第一运行模式的内核向第一处理单元110发送图像采集指令,第一运行模式为可信运行环境。步骤002包括:第一处理单元110将处理后的目标图像发送给第二处理单元120中运行在第一运行模式的内核。
在本实施例中,激光摄像头102、泛光灯104、镭射灯106和第一处理单元110与同一I2C总线连接,第一处理单元110通过该I2C总线控制开启泛光灯104和镭射灯106中的至少一个,并通过该I2C总线控制激光摄像头102采集目标图像,通过同一个I2C总线控制泛光灯104、镭射灯106和激光摄像头102,对I2C总线进行复用,可以降低控制电路的复杂度,并减少成本。
在又一个实施例中,步骤001包括步骤031、步骤032和步骤033。
步骤031,当第一处理单元110接收到第二处理单元120发送的图像采集指令时,根据图像采集指令确定图像类型。
可选地,第二处理单元120通过双向二线制同步串行I2C总线分别与泛光灯104和镭射灯106连接;在步骤031之前,还包括:当检测到摄像头模组101启动时,第二处理单元120通过I2C总线分别对泛光灯104和镭射灯106进行配置。
可选地,第二处理单元120通过同一个I2C总线分别与泛光灯104和镭射灯106连接。
可选地,第二处理单元120通过一个I2C总线与泛光灯104连接,并通过另一个I2C总线与镭射灯106连接。
步骤032,若图像类型为第一类型,则开启摄像头模组101中的泛光灯104,并通过第一脉冲宽度调制PWM模块1121向第一控制器131发送脉冲,点亮泛光灯104,再通过摄像头模组101中的激光摄像头102采集与第一类型对应的目标图像。
步骤033,若图像类型为第二类型,则开启摄像头模组101中的镭射灯106,并通过第二PWM模块1122向第二控制器132发送脉冲,点亮镭射灯106,再通过摄像头模组101中的激光摄像头102采集与第二类型对应的目标图像。
可选地,第一PWM模块1121向第一控制器131发送脉冲的时刻与第二PWM模块1122向第二控制器132发送脉冲的时刻不同,第一PWM模块1121向第一控制器131发送脉冲的时刻与第二PWM模块1122向第二控制器132发送脉冲的时刻之间的时间间隔小于时间阈值。
步骤002,通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
可选地,目标图像包括散斑图像;步骤002包括:获取存储的参考散斑图像,参考散斑图像带有参考深度信息;将参考散斑图像与散斑图像进行匹配,得到匹配结果;根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元120,通过第二处理单元120对深度视差图进行处理得到深度图。
可选地,在步骤获取存储的参考散斑图像之前,数据处理方法还包括:每隔采集时间段采集镭射灯106的温度,并通过第二处理单元120获取与温度对应的参考散斑图像;当本次获取的参考散斑图像与第一处理单元110中存储的参考散斑图像不一致时,通过第二处理单元120将所述本次获取的参考散斑图像写入第一处理单元110。
在本实施例中,当第一处理单元110接收到第二处理单元120发送的图像采集指令时,根据图像采集指令确定图像类型,若图像类型为第一类型,通过第一PWM模块1121点亮泛光灯104并通过激光摄像头102采集与第一类型对应的目标图像,若图像类型为第二类型,则通过第二PWM模块1122向第二控制器132发送脉冲,点亮镭射灯106并通过激光摄像头102采集与第二类型对应的目标图像,通过两个PWM模块分别控制泛光灯104和镭射灯106,无需进行实时切换,可以降低数据处理复杂度,并减轻第一处理单元110的处理压力。
应该理解的是,虽然上述各个流程示意图中的各个步骤按照箭头的指示依次显示,但是这些步骤并不是必然按照箭头指示的顺序依次执行。除非本文中有明确的说明,这些步骤的执行并没有严格的顺序限制,这些步骤可以以其它的顺序执行。而且,上述各个流程示意图中的至少一部分步骤可以包括多个子步骤或者多个阶段,这些子步骤或者阶段并不必然是在同一时刻执行完成,而是可以在不同的时刻执行,这些子步骤或者阶段的执行顺序也不必然是依次进行,而是可以与其它步骤或者其它步骤的子步骤或者阶段的至少一部分轮流或者交替地执行。
图18为一个实施例中电子设备100的框图。如图18所示,该电子设备100包括通过系统总线60连接的处理器20、存储器30、显示屏40和输入装置50。其中,存储器30可包括非易失性存储介质32及内存储器。电子设备100的非易失性存储介质32存储有操作系统及计算机程序,该计算机程序被处理器20执行时实现本申请实施例中提供的任意一种数据处理方法。该处理器20用于提供计算和控制能力,支撑整个电子设备100的运行。电子设备100中的内存储器为非易失性存储介质32中的计算机程序的运行提供环境。电子设备100的显示屏40可以是液晶显示屏或者电子墨水显示屏等,输入装置50可以是显示屏40上覆盖的触摸层,也可以是电子设备100外壳上设置的按键、轨迹球或触控板,也可以是外接的键盘、触控板或鼠标等。该电子设备100可以是手机、平板电脑、个人数字助理或穿戴式设备等。本领域技术人员可以理解,图18中示出的结构,仅仅是与本申请方案相关的部分结构的框图,并不构成对本申请方案所应用于其上的电子设备100的限定,具体的电子设备100可以包括比图中所示更多或更少的部件,或者组合某些部件,或者具有不同的部件布置。
请参阅图4,本申请还提供一种电子设备100。电子设备100包括第一处理单元110和第二处理单元120。第一处理单元110可用于:当第一处理单元110接收到第二处理单元120发送的图像采集指令时,控制泛光灯104和镭射灯106中的至少一个开启,并控制激光摄像头102采集目标图像;对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
请结合图3和图4,在一个实施例中,电子设备100包括摄像头模组101、第一处理单元110、第二处理单元120和控制器130,第一处理单元110分别与第二处理单元120和摄像头模组101相连。第一处理单元110通过I2C总线与控制器130连接。摄像头模组101包括激光摄像头102、泛光灯104和镭射灯106,泛光灯104和镭射灯106分别与控制器130连接。第一处理单元110包括PWM模块112,第一处理单元110通过PWM模块112与控制器130连接。第一处理单元110用于当收到第二处理单元120发送的图像采集指令时,通过I2C总线向控制器130发送控制指令,控制指令用于控制开启泛光灯104和镭射灯106中的至少一个,通过脉冲宽度调制PWM模块112向控制器130发送脉冲,点亮开启的泛光灯104和镭射灯106中的至少一个,并通过激光摄像头102采集目标图像,对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
可选地,第一处理单元110、控制器130和激光摄像头102连接在同一个I2C总线上。第一处理单元110还用于通过I2C总线控制激光摄像头102采集目标图像。如此,通过同一个I2C总线控制泛光灯104、镭射灯106和激光摄像头102,对I2C总线进行复用,可以降低控制电路的复杂度,并减少成本。
可选地,第一处理单元110还用于根据图像采集指令确定采集的图像类型,若图像类型为第一类型,则第一处理单元110通过I2C总线向控制器130发送第一控制指令,第一控制指令用于指示控制器130开启泛光灯104,若图像类型为第二类型,则第一处理单元110通过I2C总线向控制器130发送第二控制指令,第二控制指令用于指示控制器130开启镭射灯106。
可选地,第一处理单元110还用于当图像类型包括第一类型和第二类型时,第一处理单元110通过I2C总线向控制器130发送第一控制指令,开启泛光灯104,在通过激光摄像头102采集与第一类型对应的目标图像之后,通过I2C总线向控制器130发送第二控制指令,开启镭射灯106。
可选地,第一处理单元110还用于当图像类型包括第一类型和第二类型时,第一处理单元110通过I2C总线向控制器130发送第二控制指令,开启镭射灯106,在通过激光摄像头102采集与第二类型对应的目标图像之后,通过I2C总线向控制器130发送第一控制指令,开启泛光灯104。
可选地,第一处理单元110发送第一控制指令的时刻与发送第二控制指令的时刻之间的时间间隔小于时间阈值。如此,可通过一个控制器130实现泛光灯104和镭射灯106的切换和控制,可以降低控制电路的复杂度,并减少成本。
可选地,目标图像包括散斑图像。第一处理单元110还用于获取存储的参考散斑图像,将参考散斑图像与散斑图像进行匹配,得到匹配结果,根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元120,参考散斑图像带有参考深度信息。第二处理单元120还用于对深度视差图进行处理得到深度图。如此,通过第一处理单元110可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
在本实施例中,当第一处理单元110接收到第二处理单元120发送的图像采集指令时,通过I2C总线向控制器130发送控制指令,控制开启泛光灯104和镭射灯106中的至少一个,并通过PWM模块112向控制器130发送脉冲,点亮开启的泛光灯104和镭射灯106中的至少一个,采集目标图像后再对目标图像进行处理,通过一个控制器130即可实现对泛光灯104和镭射灯106的控制,可以降低控制泛光灯104和镭射灯106等的复杂度,且节约成本。
请结合图3和图4,在另一个实施例中,电子设备100包括摄像头模组101、第一处理单元110和第二处理单元120。第一处理单元110可分别与第二处理单元120和摄像头模组101相连。摄像头模组101可包括激光摄像头102、泛光灯104和镭射灯106等,激光摄像头102、泛光灯104、镭射灯106和第一处理单元110与同一两线式串行I2C总线连接。第一处理单元110用于当接收到第二处理单元120发送的图像采集指令时,通过I2C总线控制开启泛光灯104和镭射灯106中的至少一个,通过I2C总线控制激光摄像头102采集目标图像,对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
可选地,电子设备100还包括控制器130,控制器130可分别与泛光灯104和镭射灯106连接,控制器130用于控制泛光灯104和镭射灯106,控制器130与I2C总线连接。第一处理单元110还用于根据图像采集指令确定采集的图像类型,若图像类型为红外图像,则通过I2C总线向控制器130发送第一控制指令,第一控制指令用于指示控制器130开启泛光灯104,若图像类型为散斑图像或深度图像,则通过I2C总线向控制器130发送第二控制指令,第二控制指令用于指示控制器130开启镭射灯106。
可选地,第一处理单元110还用于当图像类型包括红外图像和散斑图像,或包括红外图像和深度图像时,通过I2C总线向控制器130发送第一控制指令,开启泛光灯104,并通过I2C总线控制激光摄像头102采集红外图像,然后通过I2C总线向控制器130发送第二控制指令,开启镭射灯106,并通过I2C总线控制激光摄像头102采集散斑图像。
可选地,第一处理单元110还用于当图像类型包括红外图像和散斑图像,或包括红外图像和深度图像时,通过I2C总线向控制器130发送第二控制指令,开启镭射灯106,并通过I2C总线控制激光摄像头102采集散斑图像,然后通过I2C总线向控制器130发送第一控制指令,开启泛光灯104,并通过I2C总线控制激光摄像头102采集红外图像。如此,可通过一个控制器130实现泛光灯104和镭射灯106的切换和控制,可以进一步降低控制电路的复杂度,并减少成本。
可选地,第一处理单元110还用于获取存储的参考散斑图像,并将参考散斑图像与散斑图像进行匹配,得到匹配结果,根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元120,参考散斑图像带有参考深度信息。第二处理单元120用于对深度视差图进行处理得到深度图。如此,通过第一处理单元110可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
可选地,第二处理单元120还用于每隔采集时间段采集镭射灯106的温度,并获取与温度对应的参考散斑图像,当本次获取的参考散斑图像与第一处理单元110中存储的参考散斑图像不一致时,将本次获取的参考散斑图像写入第一处理单元110。如此,可根据镭射灯106的温度获取与温度对应的参考散斑图像,减少温度对最后输出的深度图的影响,使得到的深度信息更为精准。
可选地,第二处理单元120还用于通过第二处理单元120中运行在第一运行模式的内核向第一处理单元110发送图像采集指令,第一运行模式为可信运行环境。第一处理单元110还用于将处理后的目标图像发送给第二处理单元120中运行在第一运行模式的内核。如此,通过第二处理单元120安全性高的内核向第一处理单元110发送图像采集指令,可保证第一处理单元110处于安全性高的环境中,提高数据的安全。
在本实施例中,激光摄像头102、泛光灯104、镭射灯106和第一处理单元110与同一I2C总线连接,第一处理单元110通过该I2C总线控制开启泛光灯104和镭射灯106中的至少一个,并通过该I2C总线控制激光摄像头102采集目标图像,通过同一个I2C总线控制泛光灯104、镭射灯106和激光摄像头102,对I2C总线进行复用,可以降低控制电路的复杂度,并减少成本。
请结合图4和图14,在又一个实施例中,电子设备100包括摄像头模组101、第一处理单元110和第二处理单元120,第一处理单元110分别与第二处理单元120和摄像头模组101相连。摄像头模组101包括激光摄像头102、泛光灯104和镭射灯106,泛光灯104与第一控制器131连接,镭射灯106与第二控制器132连接。第一处理单元110包括第一PWM模块1121和第二PWM模块1122,第一处理单元110通过第一PWM模块1121与第一控制器131连接,第一处理单元110通过第二PWM模块1122与第二控制器132连接。第一处理单元110用于当接收到第二处理单元120发送的图像采集指令时,根据图像采集指令确定图像类型,若图像类型为第一类型,则开启摄像头模组101中的泛光灯104,并通过第一脉冲宽度调制PWM模块1121向第一控制器131发送脉冲,点亮泛光灯104,再通过摄像头模组101中的激光摄像头102采集与第一类型对应的目标图像;若图像类型为第二类型,则开启摄像头模组101中的镭射灯106,并通过第二PWM模块1122向第二控制器132发送脉冲,点亮镭射灯106,再通过摄像头模组101中的激光摄像头102采集与第二类型对应的目标图像;对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
可选地,第二处理单元120通过双向二线制同步串行I2C总线分别与泛光灯104和镭射灯106连接。第二处理单元120还用于当检测到摄像头模组101启动时,通过I2C总线分别对泛光灯104和镭射灯106进行配置。
可选地,第二处理单元120通过同一个I2C总线分别与泛光灯104和镭射灯106连接。或者,第二处理单元120通过一个I2C总线与泛光灯104连接,并通过另一个I2C总线与镭射灯106连接。如此,第二处理单元120可在摄像头模组101启动时,通过I2C总线对泛光灯104和镭射灯106进行配置,可以更加精准地控制图像采集,并提高了数据处理效率。
可选地,第一PWM模块1121向第一控制器131发送脉冲的时刻与第二PWM模块1122向第二控制器132发送脉冲的时刻不同,第一PWM模块1121向第一控制器131发送脉冲的时刻与第二PWM模块1122向第二控制器132发送脉冲的时刻之间的时间间隔小于时间阈值。如此,第一处理单元110可通过激光摄像头102在不同时刻分别采集红外图像及散斑图像,且可保证采集的红外图像与散斑图像的图像内容较为一致,提高后续人脸检测的准确性。
可选地,第一处理单元110还用于获取存储的参考散斑图像,将参考散斑图像与所述散斑图像进行匹配,得到匹配结果,根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元120,参考散斑图像带有参考深度信息。第二处理单元120还用于对深度视差图进行处理得到深度图。如此,通过第一处理单元可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
可选地,第二处理单元120还用于每隔采集时间段采集镭射灯106的温度,并通过第二处理单元120获取与温度对应的参考散斑图像,当本次获取的参考散斑图像与第一处理单元110中存储的参考散斑图像不一致时,通过第二处理单元120将本次获取的参考散斑图像写入第一处理单元110。如此,可根据镭射灯106的温度获取与温度对应的参考散斑图像,减少温度对最后输出的深度图的影响,使得到的深度信息更为精准。
在本实施例中,当第一处理单元110接收到第二处理单元120发送的图像采集指令时,根据图像采集指令确定图像类型,若图像类型为第一类型,通过第一PWM模块1121点亮泛光灯104并通过激光摄像头102采集与第一类型对应的目标图像,若图像类型为第二类型,则通过第二PWM模块1122向第二控制器132发送脉冲,点亮镭射灯106并通过激光摄像头102采集与第二类型对应的目标图像,通过两个PWM模块分别控制泛光灯104和镭射灯106,无需进行实时切换,可以降低数据处理复杂度,并减轻第一处理单元110的处理压力。
请参阅图4和图19,本申请提供一种数据处理装置80。数据处理装置80包括控制模块801和处理模块802。控制模块801用于当第一处理单元110接收到第二处理单元120发送的图像采集指令时,控制泛光灯104和镭射灯106中的至少一个开启,并控制激光摄像头102采集目标图像。处理模块802用于通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
请参阅图4、图5和图20,在一个实施例中,控制模块801包括指令发送单元811和第一脉冲发送单元812。指令发送单元811用于当第一处理单元110接收到第二处理单元120发送的图像采集指令时,通过双向二线制同步串行I2C总线向控制器130发送控制指令,控制指令用于控制开启泛光灯104和镭射灯106中的至少一个。第一脉冲发送单元812用于通过脉冲宽度调制PWM模块112向控制器130发送脉冲,点亮开启的泛光灯104和镭射灯106中的至少一个,并通过激光摄像头102采集目标图像。处理模块802用于通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
可选地,第一处理单元110、控制器130和激光摄像头102连接在同一个I2C总线上。第一脉冲发送单元812还用于通过I2C总线控制激光摄像头102采集目标图像。如此,通过同一个I2C总线控制泛光灯104、镭射灯106和激光摄像头102,对I2C总线进行复用,可以降低控制电路的复杂度,并减少成本。
可选地,指令发送单元811包括第一类型确定子单元、第一发送子单元及第二发送子单元。第一类型确定子单元用于根据图像采集指令确定采集的图像类型。第一发送子单元用于若图像类型为第一类型,则通过I2C总线向控制器130发送第一控制指令,第一控制指令用于指示控制器130开启泛光灯104。第二发送子单元用于若图像类型为第二类型,则通过I2C总线向控制器130发送第二控制指令,第二控制指令用于指示控制器130开启镭射灯106。
可选地,第一发送子单元还用于当图像类型包括第一类型和第二类型时,通过I2C总线向控制器130发送第一控制指令,开启泛光灯104。第二发送子单元还用于在通过激光摄像头102采集与第一类型对应的目标图像之后,通过I2C总线向所述控制器130发送第二控制指令,开启镭射灯106。
可选地,第二发送子单元还用于当图像类型包括第一类型和第二类型时,通过I2C总线向控制器130发送第二控制指令,开启镭射灯106。
第一发送子单元还用于在通过激光摄像头102采集与第二类型对应的目标图像之后,通过I2C总线向控制器130发送第一控制指令,开启泛光灯104。
可选地,第一处理单元发送第一控制指令的时刻与发送第二控制指令的时刻之间的时间间隔小于时间阈值。如此,可通过一个控制器130实现泛光灯104和镭射灯106的切换和控制,可以降低控制电路的复杂度,并减少成本。
可选地,处理模块802包括第一图像获取单元、第一匹配单元及第一生成单元。第一图像获取单元用于获取存储的参考散斑图像,参考散斑图像带有参考深度信息。第一匹配单元用于将参考散斑图像与散斑图像进行匹配,得到匹配结果。第一生成单元用于根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元120,通过第二处理单元120对深度视差图进行处理得到深度图。如此,通过第一处理单元110可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
在本实施例中,当第一处理单元110接收到第二处理单元120发送的图像采集指令时,通过I2C总线向控制器130发送控制指令,控制开启泛光灯104和镭射灯106中的至少一个,并通过PWM模块112向控制器130发送脉冲,点亮开启的泛光灯104和镭射灯106中的至少一个,采集目标图像后再对目标图像进行处理,通过一个控制器130即可实现对泛光灯104和镭射灯106的控制,可以降低控制泛光灯104和镭射灯106等的复杂度,且节约成本。
请参阅图4和图21,在另一个实施例中,控制模块801包括第一控制单元821和第二控制单元822。本实施例中的数据处理装置适用于电子设备100,该电子设备100包括摄像头模组101、第一处理单元110和第二处理单元120,第一处理单元110分别与第二处理单元120和摄像头模组101相连。摄像头模组101包括激光摄像头102、泛光灯104和镭射灯106,激光摄像头102、泛光灯104、镭射灯106和第一处理单元110与同一两线式串行I2C总线连接。第一控制单元821用于当第一处理单元110接收到第二处理单元120发送的图像采集指令时,通过I2C总线控制开启泛光灯104和镭射灯106中的至少一个。第二控制单元822用于通过I2C总线控制激光摄像头102采集目标图像。处理模块802用于通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
可选地,电子设备100还包括控制器130,控制器130用于控制泛光灯104和镭射灯106,控制器130与I2C总线连接。第一控制单元821包括第二类型确定子单元及指令发送子单元。第二类型确定子单元用于根据图像采集指令确定采集的图像类型。指令发送子单元用于若图像类型为红外图像,则通过I2C总线向控制器130发送第一控制指令,第一控制指令用于指示控制器130开启泛光灯104。指令发送子单元还用于若图像类型为散斑图像或深度图像,则通过I2C总线向控制器130发送第二控制指令,第二控制指令用于指示控制器130开启镭射灯106。
可选地,第一控制单元821还用于当图像类型包括红外图像和散斑图像,或包括红外图像和深度图像时,通过I2C总线向控制器130发送第一控制指令,开启泛光灯104,并通过I2C总线控制激光摄像头102采集红外图像,然后通过I2C总线向控制器130发送第二控制指令,开启镭射灯106,并通过I2C总线控制激光摄像头102采集散斑图像。或者,第一控制单元821,还用于当图像类型包括红外图像和散斑图像,或包括红外图像和深度图像时,通过I2C总线向控制器130发送第二控制指令,开启镭射灯106,并通过I2C总线控制激光摄像头102采集散斑图像,然后通过I2C总线向控制器130发送第一控制指令,开启泛光灯104,并通过I2C总线控制激光摄像头102采集红外图像。如此,可通过一个控制器130实现泛光灯104和镭射灯106的切换和控制,可以进一步降低控制电路的复杂度,并减少成本。
可选地,处理模块802包括第二图像获取单元、第二匹配单元及第二生成单元。第二图像获取单元用于获取存储的参考散斑图像,参考散斑图像带有参考深度信息。第二匹配单元用于将参考散斑图像与散斑图像进行匹配,得到匹配结果。第二生成单元用于根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元120,通过第二处理单元120对深度视差图进行处理得到深度图。如此,通过第一处理单元110可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
可选地,本实施例的数据处理装置80除了包括控制模块801及处理模块802,还包括第一温度采集模块及第一写入模块。第一温度采集模块用于每隔采集时间段采集镭射灯106的温度,并通过第二处理单元120获取与温度对应的参考散斑图像。第一写入模块用于当本次获取的参考散斑图像与第一处理单元中存储的参考散斑图像不一致时,通过第二处理单元120将本次获取的参考散斑图像写入第一处理单元110。如此,可根据镭射灯106的温度获取与温度对应的参考散斑图像,减少温度对最后输出的深度图的影响,使得到的深度信息更为精准。
可选地,本实施例的数据处理装置80除了包括控制模块801、处理模块802、第一温度采集模块及第一写入模块,还包括第一发送模块。第一发送模块用于通过第二处理单元120中运行在第一运行模式的内核向第一处理单元110发送图像采集指令,第一运行模式为可信运行环境。
处理模块802还用于通过第一处理单元110将处理后的目标图像发送给第二处理单元120中运行在第一运行模式的内核。如此,通过第二处理单元120安全性高的内核向第一处理单元110发送图像采集指令,可保证第一处理单元110处于安全性高的环境中,提高数据的安全。
在本实施例中,激光摄像头102、泛光灯104、镭射灯106和第一处理单元110与同一I2C总线连接,第一处理单元110通过该I2C总线控制开启泛光灯104和镭射灯106中的至少一个,并通过该I2C总线控制激光摄像头102采集目标图像,通过同一个I2C总线控制泛光灯104、镭射灯106和激光摄像头102,对I2C总线进行复用,可以降低控制电路的复杂度,并减少成本。
请参阅图4、图14及图22,在又一个实施例中,控制模块801包括类型确定单元831、第二脉冲发送单元832、第三脉冲发送单元833。类型确定单元831用于当第一处理单元110接收到第二处理单元120发送的图像采集指令时,根据图像采集指令确定图像类型。第二脉冲发送单元832用于若图像类型为第一类型,则开启摄像头模组101中的泛光灯104,并通过第一脉冲宽度调制PWM模块1121向第一控制器131发送脉冲,点亮泛光灯104,再通过摄像头模组101中的激光摄像头102采集与第一类型对应的目标图像。第三脉冲发送单元833用于若图像类型为第二类型,则开启摄像头模组101中的镭射灯106,并通过第二PWM模块1122向第二控制器132发送脉冲,点亮镭射灯106,再通过摄像头模组101中的激光摄像头102采集与第二类型对应的目标图像。处理模块802用于通过第一处理单元110对目标图像进行处理,并将处理后的目标图像发送给第二处理单元120。
在本实施例中,当第一处理单元110接收到第二处理单元120发送的图像采集指令时,根据图像采集指令确定图像类型,若图像类型为第一类型,通过第一PWM模块1121点亮泛光灯104,并通过激光摄像头102采集与第一类型对应的目标图像,若图像类型为第二类型,则通过第二PWM模块1122向第二控制器132发送脉冲,点亮镭射灯106并通过激光摄像头102采集与第二类型对应的目标图像,通过两个PWM模块分别控制泛光灯104和镭射灯106,无需进行实时切换,可以降低数据处理复杂度,并减轻第一处理单元110的处理压力。
可选地,第二处理单元120通过双向二线制同步串行I2C总线分别与泛光灯104和镭射灯106连接。本实施例的数据处理装置80,除了包括控制模块801及处理模块802,还包括配置模块。配置模块用于当检测到摄像头模组101启动时,通过I2C总线分别对泛光灯104和镭射灯106进行配置。
可选地,第二处理单元120通过同一个I2C总线分别与泛光灯104和镭射灯106连接。或者,第二处理单元120通过一个I2C总线与泛光灯104连接,并通过另一个I2C总线与镭射灯106连接。如此,第二处理单元120可在摄像头模组101启动时,通过I2C总线对泛光灯104和镭射灯106进行配置,可以更加精准地控制图像采集,并提高了数据处理效率。
可选地,第一PWM模块1121向第一控制器131发送脉冲的时刻与第二PWM模块1122向第二控制器132发送脉冲的时刻不同,第一PWM模块1121向第一控制器131发送脉冲的时刻与第二PWM模块1122向第二控制器132发送脉冲的时刻之间的时间间隔小于时间阈值。如此,第一处理单元110可通过激光摄像头102在不同时刻分别采集红外图像及散斑图像,且可保证采集的红外图像与散斑图像的图像内容较为一致,提高后续人脸检测的准确性。
可选地,处理模块802包括第三图像获取单元、第三匹配单元及第三生成单元。第三图像获取单元用于获取存储的参考散斑图像,参考散斑图像带有参考深度信息。第三匹配单元用于将参考散斑图像与散斑图像进行匹配,得到匹配结果。第三生成单元,用于根据参考深度信息和匹配结果生成深度视差图,并将深度视差图发送给第二处理单元120,通过第二处理单元120对深度视差图进行处理得到深度图。如此,通过第一处理单元110可精准得到采集的图像的深度信息,数据处理效率高,且提高了图像处理的精准性。
可选地,本实施例的数据处理装置80除了包括控制模块801、处理模块802及配置模块,还包括第二温度采集模块及第二写入模块。
第二温度采集模块用于每隔采集时间段采集镭射灯106的温度,并通过第二处理单元120获取与温度对应的参考散斑图像。第二写入模块用于当本次获取的参考散斑图像与第一处理单元110中存储的参考散斑图像不一致时,通过第二处理单元120将本次获取的参考散斑图像写入第一处理单元110。如此,可根据镭射灯106的温度获取与温度对应的参考散斑图像,减少温度对最后输出的深度图的影响,使得到的深度信息更为精准。
本申请还提供一种计算机可读存储介质,其上存储有计算机程序,该计算机程序被处理器执行时实现上述任意一个实施例所述的数据处理方法。
本申请还提供一种包含计算机程序的计算机程序产品,当其在计算机设备上运行时,使得计算机设备执行上述任意一个实施例所述的数据处理方法。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机程序来指令相关的硬件来完成,所述的程序可存储于一非易失性计算机可读取存储介质中,该程序在执行时,可包括如上述各方法的实施例的流程。其中,所述的存储介质可为磁碟、光盘、只读存储记忆体(Read-Only Memory,ROM)等。
如此处所使用的对存储器、存储、数据库或其它介质的任何引用可包括非易失性和/或易失性存储器。合适的非易失性存储器可包括只读存储器(ROM)、可编程ROM(PROM)、电可编程ROM(EPROM)、电可擦除可编程ROM(EEPROM)或闪存。易失性存储器可包括随机存取存储器(RAM),它用作外部高速缓冲存储器。作为说明而非局限,RAM以多种形式可得,诸如静态RAM(SRAM)、动态RAM(DRAM)、同步DRAM(SDRAM)、双数据率SDRAM(DDR SDRAM)、增强型SDRAM(ESDRAM)、同步链路(Synchlink)DRAM(SLDRAM)、存储器总线(Rambus)直接RAM(RDRAM)、直接存储器总线动态RAM(DRDRAM)、以及存储器总线动态RAM(RDRAM)。
以上所述实施例的各技术特征可以进行任意的组合,为使描述简洁,未对上述实施例中的各个技术特征所有可能的组合都进行描述,然而,只要这些技术特征的组合不存在矛盾,都应当认为是本说明书记载的范围。
以上所述实施例仅表达了本申请的几种实施方式,其描述较为具体和详细,但并不能因此而理解为对发明专利范围的限制。应当指出的是,对于本领域的普通技术人员来说,在不脱离本申请构思的前提下,还可以做出若干变形和改进,这些都属于本申请的保护范围。因此,本申请专利的保护范围应以所附权利要求为准。
Claims (49)
- 一种数据处理方法,其特征在于,包括:当第一处理单元接收到第二处理单元发送的图像采集指令时,控制泛光灯和镭射灯中的至少一个开启,并控制激光摄像头采集目标图像;和通过所述第一处理单元对所述目标图像进行处理,并将处理后的所述目标图像发送给所述第二处理单元。
- 根据权利要求1所述的数据处理方法,其特征在于,所述当第一处理单元接收到第二处理单元发送的图像采集指令时,控制泛光灯和镭射灯中的至少一个开启,并控制激光摄像头采集目标图像,包括:当所述第一处理单元接收到所述第二处理单元发送的所述图像采集指令时,通过双向二线制同步串行I2C总线向控制器发送控制指令,所述控制指令用于控制开启所述泛光灯和所述镭射灯中的至少一个;和通过脉冲宽度调制PWM模块向所述控制器发送脉冲,点亮开启的所述泛光灯和所述镭射灯中的至少一个,并通过所述激光摄像头采集所述目标图像。
- 根据权利要求2所述的数据处理方法,其特征在于,所述第一处理单元、所述控制器和所述激光摄像头连接在同一个I2C总线上;所述通过所述激光摄像头采集所述目标图像,包括:通过所述I2C总线控制所述激光摄像头采集所述目标图像。
- 根据权利要求3所述的数据处理方法,其特征在于,所述通过双向二线制同步串行I2C总线向控制器发送控制指令,包括:根据所述图像采集指令确定采集的图像类型;若所述图像类型为第一类型,则所述第一处理单元通过所述I2C总线向所述控制器发送第一控制指令,所述第一控制指令用于指示所述控制器开启所述泛光灯;若所述图像类型为第二类型,则所述第一处理单元通过所述I2C总线向所述控制器发送第二控制指令,所述第二控制指令用于指示所述控制器开启所述镭射灯。
- 根据权利要求4所述的数据处理方法,其特征在于,在所述根据所述图像采集指令确定采集的图像类型之后,所述数据处理方法还包括:当所述图像类型包括所述第一类型和所述第二类型时,所述第一处理单元通过所述I2C总线向所述控制器发送所述第一控制指令,开启所述泛光灯;和在通过所述激光摄像头采集与所述第一类型对应的所述目标图像之后,通过所述I2C总线向所述控制器发送所述第二控制指令,开启所述镭射灯。
- 根据权利要求4所述的数据处理方法,其特征在于,在所述根据所述图像采集指令确定采集的图像类型之后,所述数据处理方法还包括:当所述图像类型包括所述第一类型和所述第二类型时,所述第一处理单元通过所述I2C总线向所述控制器发送所述第二控制指令,开启所述镭射灯;和在通过所述激光摄像头采集与所述第二类型对应的所述目标图像之后,通过所述I2C总线向所述控制器发送所述第一控制指令,开启所述泛光灯。
- 根据权利要求5或6所述的数据处理方法,其特征在于,所述第一处理单元发送所述第一控制指令的时刻与发送所述第二控制指令的时刻之间的时间间隔小于时间阈值。
- 根据权利要求2所述的数据处理方法,其特征在于,所述目标图像包括散斑图像;所述通过所述第一处理单元对所述目标图像进行处理,并将处理后的所述目标图像发送给所述第二处理单元,包括:获取存储的参考散斑图像,所述参考散斑图像带有参考深度信息;将所述参考散斑图像与所述散斑图像进行匹配,得到匹配结果;和根据所述参考深度信息和所述匹配结果生成深度视差图,并将所述深度视差图发送给所述第二处理单元,通过所述第二处理单元对所述深度视差图进行处理得到深度图。
- 根据权利要求1所述的数据处理方法,其特征在于,所述数据处理方法应用于电子设备,所述电子设备包括摄像头模组、所述第一处理单元和所述第二处理单元,所述第一处理单元分别与所述第二处理单元和所述摄像头模组相连;所述摄像头模组包括所述激光摄像头、所述泛光灯和所述镭射灯,所述 激光摄像头、所述泛光灯、所述镭射灯和所述第一处理单元与同一双向二线制同步串行I2C总线连接;所述当第一处理单元接收到第二处理单元发送的图像采集指令时,控制泛光灯和镭射灯中的至少一个开启,并控制激光摄像头采集目标图像,包括:当所述第一处理单元接收到所述第二处理单元发送的所述图像采集指令时,通过所述I2C总线控制开启所述泛光灯和所述镭射灯中的至少一个;和所述第一处理单元通过所述I2C总线控制所述激光摄像头采集目标图像。
- 根据权利要求9所述的数据处理方法,其特征在于,所述电子设备还包括控制器,所述控制器用于控制所述泛光灯和所述镭射灯,所述控制器与所述I2C总线连接;所述通过所述I2C总线控制开启所述泛光灯和所述镭射灯的至少一个,包括:根据所述图像采集指令确定采集的图像类型;若所述图像类型为红外图像,则所述第一处理单元通过所述I2C总线向所述控制器发送第一控制指令,所述第一控制指令用于指示所述控制器开启所述泛光灯;若所述图像类型为散斑图像或深度图像,则所述第一处理单元通过所述I2C总线向所述控制器发送第二控制指令,所述第二控制指令用于指示所述控制器开启所述镭射灯。
- 根据权利要求10所述的数据处理方法,其特征在于,在所述根据所述图像采集指令确定采集的图像类型之后,所述数据处理方法还包括:当所述图像类型包括所述红外图像和所述散斑图像,或包括所述红外图像和所述深度图像时,所述第一处理单元通过所述I2C总线向所述控制器发送所述第一控制指令,开启所述泛光灯,并通过所述I2C总线控制所述激光摄像头采集所述红外图像,然后通过所述I2C总线向所述控制器发送所述第二控制指令,开启所述镭射灯,并通过所述I2C总线控制所述激光摄像头采集所述散斑图像。
- 根据权利要求10所述的数据处理方法,其特征在于,在所述根据所述图像采集指令确定采集的图像类型之后,所述数据处理方法还包括:当所述图像类型包括所述红外图像和所述散斑图像,或包括所述红外图像和所述深度图像时,所述第一处理单元通过所述I2C总线向所述控制器发送所述第二控制指令,开启所述镭射灯,并通过所述I2C总线控制所述激光摄像头采集所述散斑图像,然后通过所述I2C总线向所述控制器发送所述第一控制指令,开启所述泛光灯,并通过所述I2C总线控制所述激光摄像头采集所述红外图像。
- 根据权利要求9所述的数据处理方法,其特征在于,所述目标图像包括散斑图像;所述通过所述第一处理单元对所述目标图像进行处理,并将处理后的所述目标图像发送给所述第二处理单元,包括:获取存储的参考散斑图像,所述参考散斑图像带有参考深度信息;将所述参考散斑图像与所述散斑图像进行匹配,得到匹配结果;和根据所述参考深度信息和所述匹配结果生成深度视差图,并将所述深度视差图发送给所述第二处理单元,通过所述第二处理单元对所述深度视差图进行处理得到深度图。
- 根据权利要求13所述的数据处理方法,其特征在于,在所述获取存储的参考散斑图像之前,所述数据处理方法还包括:每隔采集时间段采集所述镭射灯的温度,并通过所述第二处理单元获取与所述温度对应的参考散斑图像;和当本次获取的所述参考散斑图像与所述第一处理单元中存储的所述参考散斑图像不一致时,通过所述第二处理单元将本次获取的所述参考散斑图像写入所述第一处理单元。
- 根据权利要求9至14任一项所述的数据处理方法,其特征在于,在所述当所述第一处理单元接收到所述第二处理单元发送的所述图像采集指令时,通过所述I2C总线控制开启所述泛光灯和所述镭射灯中的至少一个之前,所述数据处理方法还包括:通过所述第二处理单元中运行在第一运行模式的内核向所述第一处理单元发送所述图像采集指令,所述第一运行模式为可信运行环境;所述将处理后的所述目标图像发送给所述第二处理单元,包括:所述第一处理单元将处理后的所述目标图像发送给所述第二处理单元中运行在所述第一运行模式的内核。
- 根据权利要求1所述的数据处理方法,其特征在于,所述当第一处理单元接收到第二处理单元 发送的图像采集指令时,控制泛光灯和镭射灯中的至少一个开启,并控制激光摄像头采集目标图像,包括:当所述第一处理单元接收到所述第二处理单元发送的所述图像采集指令时,根据所述图像采集指令确定图像类型;若所述图像类型为第一类型,则开启摄像头模组中的所述泛光灯,并通过第一脉冲宽度调制PWM模块向第一控制器发送脉冲,点亮所述泛光灯,再通过所述摄像头模组中的所述激光摄像头采集与所述第一类型对应的所述目标图像;若所述图像类型为第二类型,则开启所述摄像头模组中的镭射灯,并通过第二PWM模块向第二控制器发送脉冲,点亮所述镭射灯,再通过所述摄像头模组中的所述激光摄像头采集与所述第二类型对应的所述目标图像。
- 根据权利要求16所述的数据处理方法,其特征在于,所述第二处理单元通过双向二线制同步串行I2C总线分别与所述泛光灯和镭射灯连接;在所述当所述第一处理单元接收到所述第二处理单元发送的所述图像采集指令时,根据所述图像采集指令确定图像类型之前,所述数据处理方法还包括:当检测到所述摄像头模组启动时,所述第二处理单元通过所述I2C总线分别对所述泛光灯和所述镭射灯进行配置。
- 根据权利要求17所述的数据处理方法,其特征在于,所述第二处理单元通过同一个I2C总线分别与所述泛光灯和所述镭射灯连接。
- 根据权利要求17所述的数据处理方法,其特征在于,所述第二处理单元通过一个I2C总线与所述泛光灯连接,并通过另一个I2C总线与所述镭射灯连接。
- 根据权利要求16所述的数据处理方法,其特征在于,所述第一PWM模块向所述第一控制器发送脉冲的时刻与所述第二PWM模块向所述第二控制器发送脉冲的时刻不同,所述第一PWM模块向所述第一控制器发送脉冲的时刻与所述第二PWM模块向所述第二控制器发送脉冲的时刻之间的时间间隔小于时间阈值。
- 根据权利要求16所述的数据处理方法,其特征在于,所述目标图像包括散斑图像;所述通过所述第一处理单元对所述目标图像进行处理,并将处理后的所述目标图像发送给所述第二处理单元,包括:获取存储的参考散斑图像,所述参考散斑图像带有参考深度信息;将所述参考散斑图像与所述散斑图像进行匹配,得到匹配结果;和根据所述参考深度信息和所述匹配结果生成深度视差图,并将所述深度视差图发送给所述第二处理单元,通过所述第二处理单元对所述深度视差图进行处理得到深度图。
- 根据权利要求21所述的数据处理方法,其特征在于,在所述获取存储的参考散斑图像之前,所述数据处理方法还包括:每隔采集时间段采集所述镭射灯的温度,并通过所述第二处理单元获取与所述温度对应的参考散斑图像;和当本次获取的所述参考散斑图像与所述第一处理单元中存储的所述参考散斑图像不一致时,通过所述第二处理单元将本次获取的所述参考散斑图像写入所述第一处理单元。
- 一种数据处理装置,其特征在于,包括:控制模块,用于当第一处理单元接收到第二处理单元发送的图像采集指令时,控制泛光灯和镭射灯中的至少一个开启,并控制激光摄像头采集目标图像;和处理模块,用于通过所述第一处理单元对所述目标图像进行处理,并将处理后的所述目标图像发送给所述第二处理单元。
- 根据权利要求23所述的数据处理装置,其特征在于,所述控制模块包括:指令发送单元,用于当所述第一处理单元接收到所述第二处理单元发送的所述图像采集指令时,通过双向二线制同步串行I2C总线向控制器发送控制指令,所述控制指令用于控制开启所述泛光灯和所述镭射灯中的至少一个;和脉冲发送单元,用于通过脉冲宽度调制PWM模块向所述控制器发送脉冲,点亮开启的所述泛光灯和所述镭射灯中的至少一个,并通过所述激光摄像头采集所述目标图像。
- 根据权利要求23所述的数据处理装置,其特征在于,所述数据处理装置应用于电子设备,所述 电子设备包括摄像头模组、所述第一处理单元和所述第二处理单元,所述第一处理单元分别与所述第二处理单元和所述摄像头模组相连;所述摄像头模组包括所述激光摄像头、所述泛光灯和所述镭射灯,所述激光摄像头、所述泛光灯、所述镭射灯和所述第一处理单元与同一两线式串行I2C总线连接;所述控制模块包括:第一控制单元,用于当所述第一处理单元接收到所述第二处理单元发送的所述图像采集指令时,通过所述I2C总线控制开启所述泛光灯和所述镭射灯中的至少一个;和第二控制单元,用于通过所述I2C总线控制所述激光摄像头采集所述目标图像。
- 根据权利要求23所述的数据处理装置,其特征在于,所述控制模块包括:类型确定单元,用于当所述第一处理单元接收到所述第二处理单元发送的所述图像采集指令时,根据所述图像采集指令确定图像类型;第一脉冲发送单元,用于若所述图像类型为第一类型,则开启摄像头模组中的所述泛光灯,并通过第一脉冲宽度调制PWM模块向第一控制器发送脉冲,点亮所述泛光灯,再通过所述摄像头模组中的所述激光摄像头采集与所述第一类型对应的所述目标图像;第二脉冲发送单元,用于若所述图像类型为第二类型,则开启所述摄像头模组中的所述镭射灯,并通过第二PWM模块向第二控制器发送脉冲,点亮所述镭射灯,再通过所述摄像头模组中的所述激光摄像头采集与所述第二类型对应的所述目标图像。
- 一种电子设备,其特征在于,包括第一处理单元和第二处理单元;所述第一处理单元用于:当所述第一处理单元接收到所述第二处理单元发送的图像采集指令时,控制泛光灯和镭射灯中的至少一个开启,并控制激光摄像头采集目标图像;和对所述目标图像进行处理,并将处理后的所述目标图像发送给所述第二处理单元。
- 根据权利要求27所述的电子设备,其特征在于,所述电子设备还包括摄像头模组和控制器,所述第一处理单元分别与所述第二处理单元和所述摄像头模组相连,所述第一处理单元通过双向二线制同步串行I2C总线与所述控制器连接;所述摄像头模组包括所述激光摄像头、所述泛光灯和所述镭射灯,所述泛光灯和镭射灯分别与所述控制器连接;所述第一处理单元包括PWM模块,所述第一处理单元通过所述PWM模块与所述控制器连接;所述第一处理单元还用于:当收到所述第二处理单元发送的所述图像采集指令时,通过所述I2C总线向所述控制器发送控制指令,所述控制指令用于控制开启所述泛光灯和所述镭射灯中的至少一个,通过所述脉冲宽度调制PWM模块向所述控制器发送脉冲,点亮开启的所述泛光灯和所述镭射灯中的至少一个,并通过所述激光摄像头采集目标图像。
- 根据权利要求28所述的电子设备,其特征在于,所述第一处理单元、所述控制器和所述激光摄像头连接在同一个所述I2C总线上;所述第一处理单元还用于通过所述I2C总线控制所述激光摄像头采集所述目标图像。
- 根据权利要求29所述的电子设备,其特征在于,所述第一处理单元还用于:根据所述图像采集指令确定采集的图像类型;若所述图像类型为第一类型,则所述第一处理单元通过所述I2C总线向所述控制器发送第一控制指令,所述第一控制指令用于指示所述控制器开启所述泛光灯;若所述图像类型为第二类型,则所述第一处理单元通过所述I2C总线向所述控制器发送第二控制指令,所述第二控制指令用于指示所述控制器开启所述镭射灯。
- 根据权利要求30所述的电子设备,其特征在于,当所述图像类型包括所述第一类型和所述第二类型时,所述第一处理单元通过所述I2C总线向所述控制器发送所述第一控制指令,开启所述泛光灯,在通过所述激光摄像头采集与所述第一类型对应的所述目标图像之后,通过所述I2C总线向所述控制器发送所述第二控制指令,开启所述镭射灯。
- 根据权利要求30所述的电子设备,其特征在于,当所述图像类型包括所述第一类型和所述第二类型时,所述第一处理单元通过所述I2C总线向所述控制器发送所述第二控制指令,开启所述镭射灯,在通过所述激光摄像头采集与所述第二类型对应的所述目标图像之后,通过所述I2C总线向所述控制器发送所述第一控制指令,开启所述泛光灯。
- 根据权利要求31或32所述的电子设备,其特征在于,所述第一处理单元发送所述第一控制指 令的时刻与发送所述第二控制指令的时刻之间的时间间隔小于时间阈值。
- 根据权利要求28所述的电子设备,其特征在于,所述目标图像包括散斑图像;所述第一处理单元还用于获取存储的参考散斑图像,将所述参考散斑图像与所述散斑图像进行匹配,得到匹配结果,根据所述参考深度信息和所述匹配结果生成深度视差图,并将所述深度视差图发送给所述第二处理单元,所述参考散斑图像带有参考深度信息;第二处理单元还用于对所述深度视差图进行处理得到深度图。
- 根据权利要求27所述的电子设备,其特征在于,所述电子设备还包括摄像头模组,所述第一处理单元分别与所述第二处理单元和所述摄像头模组相连;所述摄像头模组包括所述激光摄像头、所述泛光灯和所述镭射灯,所述激光摄像头、所述泛光灯、所述镭射灯和所述第一处理单元与同一双向二线制同步串行I2C总线连接;所述第一处理单元还用于当接收到所述第二处理单元发送的所述图像采集指令时,通过所述I2C总线控制开启所述泛光灯和所述镭射灯中的至少一个,通过所述I2C总线控制所述激光摄像头采集所述目标图像。
- 根据权利要求35所述的电子设备,其特征在于,所述电子设备还包括控制器,所述控制器可分别与所述泛光灯和所述镭射灯连接,所述控制器用于控制所述泛光灯和所述镭射灯,所述控制器与所述I2C总线连接;所述第一处理单元还用于根据所述图像采集指令确定采集的图像类型,若所述图像类型为红外图像,则通过所述I2C总线向所述控制器发送第一控制指令,所述第一控制指令用于指示所述控制器开启所述泛光灯,若所述图像类型为散斑图像或深度图像,则通过所述I2C总线向所述控制器发送第二控制指令,所述第二控制指令用于指示所述控制器开启所述镭射灯。
- 根据权利要求36所述的电子设备,其特征在于,所述第一处理单元还用于当所述图像类型包括所述红外图像和所述散斑图像,或包括所述红外图像和所述深度图像时,通过所述I2C总线向所述控制器发送所述第一控制指令,开启所述泛光灯,并通过所述I2C总线控制所述激光摄像头采集所述红外图像,然后通过所述I2C总线向所述控制器发送所述第二控制指令,开启所述镭射灯,并通过所述I2C总线控制所述激光摄像头采集所述散斑图像。
- 根据权利要求36所述的电子设备,其特征在于,所述第一处理单元还用于当所述图像类型包括所述红外图像和所述散斑图像,或包括所述红外图像和所述深度图像时,通过所述I2C总线向所述控制器发送所述第二控制指令,开启所述镭射灯,并通过所述I2C总线控制所述激光摄像头采集所述散斑图像,然后通过所述I2C总线向所述控制器发送所述第一控制指令,开启所述泛光灯,并通过所述I2C总线控制所述激光摄像头采集所述红外图像。
- 根据权利要求35所述的电子设备,其特征在于,所述第一处理单元还用于获取存储的参考散斑图像,并将所述参考散斑图像与所述散斑图像进行匹配,得到匹配结果,根据所述参考深度信息和所述匹配结果生成深度视差图,并将所述深度视差图发送给所述第二处理单元,所述参考散斑图像带有参考深度信息;所述第二处理单元用于对所述深度视差图进行处理得到深度图。
- 根据权利要求39所述的电子设备,其特征在于,所述第二处理单元还用于每隔采集时间段采集所述镭射灯的温度,并获取与所述温度对应的参考散斑图像,当本次获取的所述参考散斑图像与所述第一处理单元中存储的所述参考散斑图像不一致时,将本次获取的所述参考散斑图像写入所述第一处理单元。
- 根据权利要求39所述的电子设备,其特征在于,所述第二处理单元还用于通过所述第二处理单元中运行在第一运行模式的内核向所述第一处理单元发送所述图像采集指令,所述第一运行模式为可信运行环境;所述第一处理单元还用于将处理后的所述目标图像发送给所述第二处理单元中运行在所述第一运行模式的内核。
- 根据权利要求27所述的电子设备,其特征在于,所述电子设备还包括摄像头模组,所述第一处理单元分别与所述第二处理单元和所述摄像头模组相连;所述摄像头模组包括所述激光摄像头、所述泛光灯和所述镭射灯,所述泛光灯与第一控制器连接,所述镭射灯与第二控制器连接;所述第一处理单元包括第一PWM模块和第二PWM模块,所述第一处理单元通过所述第一PWM模块与所述第一控制器连接,所述第一处理单元通过所述第二PWM模块与所述第二控制器连接;所述第一处理单元还用于:当接收到所述第二处理单元发送的所述图像采集指令时,根据所述图像采集指令确定图像类型;若所述图像类型为第一类型,则开启所述摄像头模组中的所述泛光灯,并通过所述第一PWM模块向所述第一控制器发送脉冲,点亮所述泛光灯,再通过所述摄像头模组中的所述激光摄像头采集与所述第一类型对应的所述目标图像;若所述图像类型为第二类型,则开启所述摄像头模组中的所述镭射灯,并通过所述第二PWM模块向所述第二控制器发送脉冲,点亮所述镭射灯,再通过所述摄像头模组中的所述激光摄像头采集与所述第二类型对应的所述目标图像。
- 根据权利要求42所述的电子设备,其特征在于,所述第二处理单元通过双向二线制同步串行I2C总线分别与所述泛光灯和所述镭射灯连接;所述第二处理单元用于当检测到所述摄像头模组启动时,通过所述I2C总线分别对所述泛光灯和所述镭射灯进行配置。
- 根据权利要求43所述的电子设备,其特征在于,所述第二处理单元通过同一个I2C总线分别与所述泛光灯和所述镭射灯连接。
- 根据权利要求43所述的电子设备,其特征在于,所述第二处理单元通过一个I2C总线与所述泛光灯连接,并通过另一个I2C总线与所述镭射灯连接。
- 根据权利要求42所述的电子设备,其特征在于,所述第一PWM模块向所述第一控制器发送脉冲的时刻与所述第二PWM模块向所述第二控制器发送脉冲的时刻不同,所述第一PWM模块向第一控制器发送脉冲的时刻与所述第二PWM模块向所述第二控制器发送脉冲的时刻之间的时间间隔小于时间阈值。
- 根据权利要求42所述的电子设备,其特征在于,所述第一处理单元还用于获取存储的参考散斑图像,将所述参考散斑图像与所述散斑图像进行匹配,得到匹配结果,根据所述参考深度信息和所述匹配结果生成深度视差图,并将所述深度视差图发送给所述第二处理单元,所述参考散斑图像带有参考深度信息;所述第二处理单元还用于对所述深度视差图进行处理得到深度图。
- 根据权利要求47所述的电子设备,其特征在于,所述第二处理单元还用于每隔采集时间段采集所述镭射灯的温度,并通过所述第二处理单元获取与所述温度对应的参考散斑图像,当本次获取的所述参考散斑图像与所述第一处理单元中存储的所述参考散斑图像不一致时,通过所述第二处理单元将本次获取的所述参考散斑图像写入所述第一处理单元。
- 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求1至22任一项所述的数据处理方法。
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| ES19792981T ES2938471T3 (es) | 2018-04-28 | 2019-04-23 | Método de procesamiento de datos, dispositivo electrónico y medio de almacenamiento legible por ordenador |
| EP19792981.3A EP3672223B1 (en) | 2018-04-28 | 2019-04-23 | Data processing method, electronic device, and computer-readable storage medium |
| US16/743,533 US11050918B2 (en) | 2018-04-28 | 2020-01-15 | Method and apparatus for performing image processing, and computer readable storage medium |
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201810402999.0 | 2018-04-28 | ||
| CN201810402998.6A CN108810516B (zh) | 2018-04-28 | 2018-04-28 | 数据处理方法、装置、电子设备及计算机可读存储介质 |
| CN201810401326.3A CN108833887B (zh) | 2018-04-28 | 2018-04-28 | 数据处理方法、装置、电子设备及计算机可读存储介质 |
| CN201810402998.6 | 2018-04-28 | ||
| CN201810402999.0A CN108696682B (zh) | 2018-04-28 | 2018-04-28 | 数据处理方法、装置、电子设备及计算机可读存储介质 |
| CN201810401326.3 | 2018-04-28 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/743,533 Continuation US11050918B2 (en) | 2018-04-28 | 2020-01-15 | Method and apparatus for performing image processing, and computer readable storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019206129A1 true WO2019206129A1 (zh) | 2019-10-31 |
Family
ID=68293796
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2019/083854 Ceased WO2019206129A1 (zh) | 2018-04-28 | 2019-04-23 | 数据处理方法、装置、电子设备及计算机可读存储介质 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US11050918B2 (zh) |
| EP (1) | EP3672223B1 (zh) |
| ES (1) | ES2938471T3 (zh) |
| WO (1) | WO2019206129A1 (zh) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111239729A (zh) * | 2020-01-17 | 2020-06-05 | 西安交通大学 | 融合散斑和泛光投射的ToF深度传感器及其测距方法 |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111815695B (zh) * | 2020-07-09 | 2024-03-15 | Oppo广东移动通信有限公司 | 深度图像获取方法、装置、移动终端及存储介质 |
| CN112543285A (zh) * | 2020-12-02 | 2021-03-23 | 维沃移动通信有限公司 | 图像处理方法、装置、电子设备及可读存储介质 |
| US20220394221A1 (en) * | 2021-06-08 | 2022-12-08 | Himax Technologies Limited | Structured-light scanning system with thermal compensation |
| CN113936050B (zh) * | 2021-10-21 | 2022-08-12 | 合肥的卢深视科技有限公司 | 散斑图像生成方法、电子设备及存储介质 |
| CN117011366A (zh) * | 2022-04-28 | 2023-11-07 | 北京小米移动软件有限公司 | 校验散斑编码图案的方法、装置、电子设备及介质 |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102970548A (zh) * | 2012-11-27 | 2013-03-13 | 西安交通大学 | 一种图像深度感知装置 |
| CN105407610A (zh) * | 2015-12-09 | 2016-03-16 | 深圳天珑无线科技有限公司 | 一种控制双闪光灯的方法及装置 |
| CN107341481A (zh) * | 2017-07-12 | 2017-11-10 | 深圳奥比中光科技有限公司 | 利用结构光图像进行识别 |
| CN108564032A (zh) * | 2018-04-12 | 2018-09-21 | Oppo广东移动通信有限公司 | 图像处理方法、装置、电子设备和计算机可读存储介质 |
| CN108573170A (zh) * | 2018-04-12 | 2018-09-25 | Oppo广东移动通信有限公司 | 信息处理方法和装置、电子设备、计算机可读存储介质 |
| CN108696682A (zh) * | 2018-04-28 | 2018-10-23 | Oppo广东移动通信有限公司 | 数据处理方法、装置、电子设备及计算机可读存储介质 |
| CN108810516A (zh) * | 2018-04-28 | 2018-11-13 | Oppo广东移动通信有限公司 | 数据处理方法、装置、电子设备及计算机可读存储介质 |
| CN108833887A (zh) * | 2018-04-28 | 2018-11-16 | Oppo广东移动通信有限公司 | 数据处理方法、装置、电子设备及计算机可读存储介质 |
Family Cites Families (29)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003050757A (ja) * | 2001-08-08 | 2003-02-21 | Toshiba Corp | 通信アプリケーションの環境設定情報の提供方法 |
| JP2009262344A (ja) | 2008-04-22 | 2009-11-12 | Ricoh Co Ltd | 画像形成装置、及び画像補正方法 |
| US9019503B2 (en) * | 2010-04-19 | 2015-04-28 | The United States Of America, As Represented By The Secretary Of The Navy | MEMS microdisplay optical imaging and sensor systems for underwater and other scattering environments |
| US10456209B2 (en) * | 2010-10-13 | 2019-10-29 | Gholam A. Peyman | Remote laser treatment system with dynamic imaging |
| CN102438111A (zh) | 2011-09-20 | 2012-05-02 | 天津大学 | 一种基于双阵列图像传感器的三维测量芯片及系统 |
| CN103139331B (zh) | 2011-12-02 | 2015-05-27 | 青岛海信移动通信技术股份有限公司 | 一种手机拍照电路及具有拍照电路的移动终端 |
| CN102622591B (zh) | 2012-01-12 | 2013-09-25 | 北京理工大学 | 3d人体姿态捕捉模仿系统 |
| US9462253B2 (en) | 2013-09-23 | 2016-10-04 | Microsoft Technology Licensing, Llc | Optical modules that reduce speckle contrast and diffraction artifacts |
| CN103971405A (zh) | 2014-05-06 | 2014-08-06 | 重庆大学 | 一种激光散斑结构光及深度信息的三维重建方法 |
| US20160240057A1 (en) * | 2015-02-13 | 2016-08-18 | IPVideo Corporation | System and method for providing alerts regarding occupancy conditions |
| JP6531823B2 (ja) * | 2015-04-02 | 2019-06-19 | 株式会社ニコン | 撮像システム、撮像装置、撮像方法、及び撮像プログラム |
| EP3286916A1 (en) * | 2015-04-23 | 2018-02-28 | Ostendo Technologies, Inc. | Methods and apparatus for full parallax light field display systems |
| WO2016172960A1 (en) * | 2015-04-30 | 2016-11-03 | SZ DJI Technology Co., Ltd. | System and method for enhancing image resolution |
| US20170034456A1 (en) | 2015-07-31 | 2017-02-02 | Dual Aperture International Co., Ltd. | Sensor assembly with selective infrared filter array |
| CN105513221B (zh) | 2015-12-30 | 2018-08-14 | 四川川大智胜软件股份有限公司 | 一种基于三维人脸识别的atm机防欺诈装置及系统 |
| DE112017001209T5 (de) * | 2016-03-10 | 2018-12-13 | Hamamatsu Photonics K.K. | Laserlicht-Bestrahlungsvorrichtung und Laserlicht-Bestrahlungsverfahren |
| CN106331453A (zh) | 2016-08-24 | 2017-01-11 | 深圳奥比中光科技有限公司 | 多图像采集系统及图像采集方法 |
| US10089738B2 (en) * | 2016-08-30 | 2018-10-02 | Microsoft Technology Licensing, Llc | Temperature compensation for structured light depth imaging system |
| CN106454287B (zh) | 2016-10-27 | 2018-10-23 | 深圳奥比中光科技有限公司 | 组合摄像系统、移动终端及图像处理方法 |
| CN106973251B (zh) | 2017-03-23 | 2020-11-03 | 移康智能科技(上海)股份有限公司 | 图像数据传输方法及装置 |
| CN107105217B (zh) | 2017-04-17 | 2018-11-30 | 深圳奥比中光科技有限公司 | 多模式深度计算处理器以及3d图像设备 |
| CN107424187B (zh) * | 2017-04-17 | 2023-10-24 | 奥比中光科技集团股份有限公司 | 深度计算处理器、数据处理方法以及3d图像设备 |
| CN106851927B (zh) | 2017-04-19 | 2018-06-12 | 慈溪锐恩电子科技有限公司 | 一种语音识别的多路调光调色led驱动电路 |
| CN107169483A (zh) | 2017-07-12 | 2017-09-15 | 深圳奥比中光科技有限公司 | 基于人脸识别的任务执行 |
| CN107450591A (zh) | 2017-08-23 | 2017-12-08 | 浙江工业大学 | 基于英伟达tx2处理器的无人机运动跟踪系统 |
| CN107370966B (zh) | 2017-08-24 | 2018-05-04 | 珠海安联锐视科技股份有限公司 | 一种智能红外控制电路及其控制方法 |
| CN107371017A (zh) | 2017-08-29 | 2017-11-21 | 深圳市度信科技有限公司 | 一种mipi摄像头信号长距离传输系统及方法 |
| CN107749949B (zh) | 2017-11-02 | 2020-08-18 | 奇酷互联网络科技(深圳)有限公司 | 摄像头自适应方法、摄像头自适应装置和电子设备 |
| CN107944422B (zh) | 2017-12-08 | 2020-05-12 | 业成科技(成都)有限公司 | 三维摄像装置、三维摄像方法及人脸识别方法 |
2019
- 2019-04-23 WO PCT/CN2019/083854 patent/WO2019206129A1/zh not_active Ceased
- 2019-04-23 EP EP19792981.3A patent/EP3672223B1/en active Active
- 2019-04-23 ES ES19792981T patent/ES2938471T3/es active Active

2020
- 2020-01-15 US US16/743,533 patent/US11050918B2/en active Active
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102970548A (zh) * | 2012-11-27 | 2013-03-13 | 西安交通大学 | 一种图像深度感知装置 |
| CN105407610A (zh) * | 2015-12-09 | 2016-03-16 | 深圳天珑无线科技有限公司 | 一种控制双闪光灯的方法及装置 |
| CN107341481A (zh) * | 2017-07-12 | 2017-11-10 | 深圳奥比中光科技有限公司 | 利用结构光图像进行识别 |
| CN108564032A (zh) * | 2018-04-12 | 2018-09-21 | Oppo广东移动通信有限公司 | 图像处理方法、装置、电子设备和计算机可读存储介质 |
| CN108573170A (zh) * | 2018-04-12 | 2018-09-25 | Oppo广东移动通信有限公司 | 信息处理方法和装置、电子设备、计算机可读存储介质 |
| CN108696682A (zh) * | 2018-04-28 | 2018-10-23 | Oppo广东移动通信有限公司 | 数据处理方法、装置、电子设备及计算机可读存储介质 |
| CN108810516A (zh) * | 2018-04-28 | 2018-11-13 | Oppo广东移动通信有限公司 | 数据处理方法、装置、电子设备及计算机可读存储介质 |
| CN108833887A (zh) * | 2018-04-28 | 2018-11-16 | Oppo广东移动通信有限公司 | 数据处理方法、装置、电子设备及计算机可读存储介质 |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3672223A4 |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111239729A (zh) * | 2020-01-17 | 2020-06-05 | 西安交通大学 | 融合散斑和泛光投射的ToF深度传感器及其测距方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3672223A1 (en) | 2020-06-24 |
| EP3672223A4 (en) | 2020-10-14 |
| US20200154033A1 (en) | 2020-05-14 |
| ES2938471T3 (es) | 2023-04-11 |
| EP3672223B1 (en) | 2022-12-28 |
| US11050918B2 (en) | 2021-06-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN108419017B (zh) | 控制拍摄的方法、装置、电子设备及计算机可读存储介质 | |
| WO2019206129A1 (zh) | 数据处理方法、装置、电子设备及计算机可读存储介质 | |
| CN108696682B (zh) | 数据处理方法、装置、电子设备及计算机可读存储介质 | |
| WO2019205887A1 (zh) | 控制拍摄的方法、装置、电子设备及计算机可读存储介质 | |
| CN108650472B (zh) | 控制拍摄的方法、装置、电子设备及计算机可读存储介质 | |
| CN108668078A (zh) | 图像处理方法、装置、计算机可读存储介质和电子设备 | |
| CN108833887B (zh) | 数据处理方法、装置、电子设备及计算机可读存储介质 | |
| CN111523499B (zh) | 图像处理方法、装置、电子设备和计算机可读存储介质 | |
| CN112565733B (zh) | 基于多相机同步拍摄的三维成像方法、装置及拍摄系统 | |
| CN108924426A (zh) | 图像处理方法和装置、电子设备、计算机可读存储介质 | |
| TWI708192B (zh) | 影像處理方法、電子設備、電腦可讀儲存媒體 | |
| CN108810516B (zh) | 数据处理方法、装置、电子设备及计算机可读存储介质 | |
| WO2020015403A1 (zh) | 图像处理方法、装置、计算机可读存储介质和电子设备 | |
| CN108846310A (zh) | 图像处理方法、装置、电子设备和计算机可读存储介质 | |
| CN109120846B (zh) | 图像处理方法和装置、电子设备、计算机可读存储介质 | |
| CN108965716B (zh) | 图像处理方法和装置、电子设备、计算机可读存储介质 | |
| CN108986153B (zh) | 图像处理方法和装置、电子设备、计算机可读存储介质 | |
| CN109064503B (zh) | 图像处理方法和装置、电子设备、计算机可读存储介质 | |
| WO2020024603A1 (zh) | 图像处理方法和装置、电子设备、计算机可读存储介质 | |
| HK40025029B (zh) | 数据处理方法、电子设备及计算机可读存储介质 | |
| HK40025029A (zh) | 数据处理方法、电子设备及计算机可读存储介质 | |
| HK40025204B (zh) | 控制拍摄的方法、电子设备及计算机可读存储介质 | |
| HK40025204A (zh) | 控制拍摄的方法、电子设备及计算机可读存储介质 | |
| CN116524073A (zh) | 对象同步方法、装置、设备及存储介质 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19792981 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2019792981 Country of ref document: EP Effective date: 20200318 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |