WO2012146984A2 - Integrated system for three-dimensional image capture, processing and representation - Google Patents
Integrated system for three-dimensional image capture, processing and representation
- Publication number
- WO2012146984A2 WO2012146984A2 PCT/IB2012/001246 IB2012001246W WO2012146984A2 WO 2012146984 A2 WO2012146984 A2 WO 2012146984A2 IB 2012001246 W IB2012001246 W IB 2012001246W WO 2012146984 A2 WO2012146984 A2 WO 2012146984A2
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- camera
- plenoptic
- calculation
- line
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single two-dimensional [2D] image sensor
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/571—Depth or shape recovery from multiple images from focus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—Three-dimensional [3D] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
- G06T15/205—Image-based rendering
Definitions
- the capture takes place from a multitude of points of view; those points of view can be captured with a single camera using an array of microlenses, which is known as a plenoptic camera.
- Plenoptic cameras are devices designed to capture 3D images. These cameras are based on an idea presented more than 100 years ago by Lippmann for recording 3D information on a 2D sensor. In particular, plenoptic cameras record on the sensor a series of elementary images that carry information on the position and inclination of the rays emitted by the 3D sample. This information allows the calculation of stereoscopic pairs for projection on a stereoscopic monitor.
- this type of projection has two essential disadvantages. The first is that, by generating only two images, the inherent advantages of plenoptic systems, namely multi-perspective visualization and vertical parallax, are lost. The second, and more important, lies in the fact that observing stereoscopic images entails a strong decoupling between the binocular convergence and accommodation mechanisms. This conflict generates strong discomfort and visual fatigue, which prevents prolonged observation.
- the capture becomes a plenoptic image, which must be calibrated to be processed by the algorithms described here.
- a calibration system is provided.
- this plenoptic image captured with the proposed system can be projected on a display like the one described, thus avoiding the visual discomfort typical of prolonged observation on today's conventional 3D displays, both stereoscopic and autostereoscopic.
- Stage 1. Acquisition of the plenoptic image.
- Image acquisition can be done by a plenoptic camera or by a plenoptic lens.
- the proposed plenoptic objective is an optical system composed of four clearly differentiated elements.
- the images formed by the multi-composite system or array are collected by an imaging optical relay that constitutes the third optical element and works in a near-object mode; the light incident on this subsystem, once collected, is gathered by the fourth element.
- the complete optical system includes a real variable-aperture optical diaphragm for adapting the received energy to the sensitivity of the detector which, in turn, ensures the spatial sampling of the scene projected on the lens array or multi-composite system, so that aliasing in the optical system is avoided by adapting the transmitted spatial frequencies to the Nyquist frequency of the complete system.
- Another diaphragm, placed next to the array of lenses or multicomponent system, correctly limits the maximum extent of the projected scene, allowing homogeneous illumination free of vignetting.
- the complete optical system allows the optical or mechanical adjustment of the first system to project a sharp image in the array or multi-composite system.
- the focal plane includes a focusing mechanism that allows both axial and transverse adjustment of the CCD or CMOS or, if necessary, perform rotations in the focal plane that ensure the highest quality of the image formed.
- An exemplary embodiment of this device is shown in Figure 1.
- Stage 2. Image calibration procedure in the plenoptic camera.
- the plenoptic camera is based on the use of an array of microlenses located at the focus of an imaging system, or in its vicinity, so that a set of images of the entrance pupil, in a multiplicity of directions, is obtained on the detector.
- This physical arrangement of the elements allows an effective sampling of the "light field" of the space in front of the camera, from which many properties of the visual field can be obtained, notably the depths of the objects and the phase of the wavefront.
- the physical construction of a plenoptic camera requires the location of an array of microlenses aligned with an image sensor.
- the alignment error is modeled as the combination of a rotation and a lateral displacement; it is also necessary to obtain a measure of the pitch of the microlenses expressed in fractional units of a sensor pixel, which may vary slightly depending on the imaging system used (zoom).
- the parameters that identify the calibration of a plenoptic camera are: a) The pitch (repetitive spatial frequency) of the microlenses in the sensor image, measured in pixels. It will be assumed to be identical for the horizontal and vertical directions, although hexagonally distributed microlenses, or any other structure, could also be admitted. As an exemplary embodiment, the microlenses will be considered to be manufactured with a Cartesian distribution.
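The calibration parameters listed above can be grouped in a simple container. This is only an illustrative sketch: the field names and the rotation/offset parameters are assumptions drawn from the surrounding text, not names defined by the patent.

```python
from dataclasses import dataclass

# Hypothetical container for the calibration parameters described above.
# Only the pitch is spelled out in this excerpt; the rotation and the
# lateral offset are assumed from the surrounding alignment discussion.
@dataclass
class PlenopticCalibration:
    pitch_px: float        # microlens pitch on the sensor, in pixels
    rotation_rad: float    # residual rotation of the array vs. the sensor
    offset_px: tuple       # lateral (dx, dy) misalignment, in pixels

cal = PlenopticCalibration(pitch_px=16.25, rotation_rad=0.002,
                           offset_px=(0.4, -0.1))
print(cal.pitch_px)
```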
- the algorithm to be used must always be completely automatic, based on the acquired optical image, without requiring any operator intervention.
- the minimization performed exploits the information available a priori: the array of microlenses is manufactured with precision, that is, with a single pitch valid both horizontally and vertically and with all microlenses aligned except for a single rotation of the assembly. This assumption compensates for the possible erroneous response of some microlens, and allows the method to be used on images that do not necessarily correspond to a uniform scene.
- the general procedure is as follows: a) The modulus of the two-dimensional Fourier transform of the image is calculated, which has very sharp peaks at the spatial frequency that is the inverse of the pitch of the microlens array. To improve the accuracy of its calculation, the image is bordered with pixels of zero value until the initial number of pixels is quadrupled, then a quadratic fit is performed based on the maximum-modulus frequency value and the two adjacent ones. This peak inspection is carried out both to the left and to the right of the zero frequency, averaging both results to obtain the pitch.
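The step above (zero-padding to four times the size, locating the Fourier peak, and refining it with a quadratic fit through the peak and its two neighbours) can be sketched in one dimension as follows. This is a minimal illustration, not the patent's implementation; the function name and the use of a row-summed profile are assumptions.

```python
import numpy as np

def measure_pitch(img):
    """Estimate the microlens pitch (in pixels) from the FFT peak of a
    plenoptic image's column-sum profile, with quadratic sub-bin
    interpolation. Hypothetical helper; details are illustrative."""
    n = img.shape[1]
    profile = img.sum(axis=0) - img.sum(axis=0).mean()  # remove DC
    padded = np.zeros(4 * n)            # zero-pad to 4x for finer bins
    padded[:n] = profile
    spec = np.abs(np.fft.rfft(padded))
    k = spec[1:].argmax() + 1           # skip the residual DC bin
    # Quadratic fit through the peak bin and its two neighbours
    a, b, c = spec[k - 1], spec[k], spec[k + 1]
    delta = 0.5 * (a - c) / (a - 2 * b + c)
    freq = (k + delta) / (4 * n)        # cycles per pixel
    return 1.0 / freq                   # pitch in pixels

# Synthetic image with a known pitch of 16 pixels
x = np.arange(512)
img = np.tile(1 + np.cos(2 * np.pi * x / 16), (64, 1))
print(round(measure_pitch(img), 2))
```

On real images the averaging of the left and right peaks described in the text would further stabilise the estimate.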
- Procedure A: one-dimensional phase correlation. a) A simulated plenoptic image is generated in which the light of each microlens follows a sinusoidal profile both vertically and horizontally, complying with the known pitch and inclination values. This process uses the usual property that the centers of the microlens images are generally much brighter than the interstices, and that the variation of the light inside the microlens image is smooth.
- the algorithm uses the trigonometric identity stating that the product of a sine by a cosine can be expressed as half the sum of the sine of the difference angle and the sine of the sum angle: sin(α)·cos(β) = ½[sin(α − β) + sin(α + β)].
- the term in the difference of angles will be proportional to half of said difference, and the sum term will correspond to a double frequency that can be filtered or considered zero as long as the length of the line corresponds exactly to an integer number of wavelengths.
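The identity above can be demonstrated numerically: multiplying a measured line by reference sinusoids and averaging over an integer number of wavelengths cancels the double-frequency term and leaves a value from which the phase offset is recovered. The values below are illustrative, not taken from the patent.

```python
import numpy as np

# Sketch of the one-dimensional phase-correlation idea: the product
# sin(a)cos(b) = [sin(a-b) + sin(a+b)]/2 averaged over an integer number
# of periods keeps only the difference term.
pitch = 20.0                      # microlens pitch in pixels (example)
w = 2 * np.pi / pitch
n = int(10 * pitch)               # exactly 10 wavelengths
x = np.arange(n)
phi = 0.3                         # unknown lateral phase to recover

line = np.sin(w * x + phi)        # measured (noise-free) profile
ref_cos = np.cos(w * x)
ref_sin = np.sin(w * x)

s = 2 * np.mean(line * ref_cos)   # ~ sin(phi): sum term averages to zero
c = 2 * np.mean(line * ref_sin)   # ~ cos(phi)
phi_est = np.arctan2(s, c)
print(round(phi_est, 3))
```

Because the line spans exactly ten wavelengths, the double-frequency term averages to zero exactly, as the text requires.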
- Procedure B: two-dimensional phase correlation. a) A vertical line is taken from the central part of the image, taking into account the inclination measured previously.
- a simulated line is generated by means of a sinusoid with zero initial phase at the measured spatial frequency (pitch), with amplitude normalized to the maximum value of the image line.
- the length of this line must be a multiple of the measured pitch.
- a third method is the calculation of the lateral displacement by linear correlation, which can be performed using the cross-correlation of a vertical image line and a simulated line, but in only one dimension. In that way, acceptable results are obtained with great savings in calculation.
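The one-dimensional cross-correlation just described can be sketched as below. This is an assumed minimal version: a synthetic line with a known displacement is matched against a simulated line of the same pitch, and the lag of the correlation peak recovers the displacement (modulo the pitch, since the pattern is periodic).

```python
import numpy as np

# Minimal sketch of lateral-displacement estimation by 1D cross-
# correlation, far cheaper than a full 2D correlation.
pitch = 16
n = 8 * pitch
x = np.arange(n)
shift_true = 5                                    # pixels, to recover
image_line = 1 + np.cos(2 * np.pi * (x - shift_true) / pitch)
simulated = 1 + np.cos(2 * np.pi * x / pitch)

# Subtract the mean so the correlation peak reflects the sinusoid only
corr = np.correlate(image_line - 1, simulated - 1, mode="full")
lag = corr.argmax() - (n - 1)                     # lag of best match
print(lag % pitch)                                # displacement modulo pitch
```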
- Procedure C: direct detection. a) A vertical line is taken from the central part of the image, taking into account the inclination measured previously.
- a simulated line is generated by a sinusoid at the measured spatial frequency (pitch), with amplitude normalized to the maximum value of the image line. The length of this line must be a multiple of the measured pitch.
- a simulated plenoptic image must be generated.
- the simulated plenoptic image is composed by adding two cosines, one horizontal and one vertical, where the period of these cosines is the pitch of the image.
- N and M are the horizontal and vertical dimensions of the image.
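The simulated plenoptic image described above (the sum of a horizontal and a vertical cosine whose period is the pitch) can be sketched directly; N and M follow the text's naming for the horizontal and vertical dimensions, and the sizes used are illustrative.

```python
import numpy as np

# Sketch of the simulated plenoptic image: sum of a horizontal and a
# vertical cosine with period equal to the measured pitch.
def simulated_plenoptic(N, M, pitch):
    x = np.arange(N)                      # horizontal coordinate
    y = np.arange(M)                      # vertical coordinate
    return (np.cos(2 * np.pi * x / pitch)[None, :] +
            np.cos(2 * np.pi * y / pitch)[:, None])

img = simulated_plenoptic(N=256, M=128, pitch=16)
print(img.shape)            # rows x columns = (M, N)
print(round(img[0, 0], 1))  # both cosines equal 1 at the origin -> 2.0
```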
- Gaussian interpolation: "Improving FFT Frequency Measurement Resolution by Parabolic and Gaussian Spectrum Interpolation", M. Gasior, J.L. González, CERN, CH-1211 Geneva 23, Switzerland.
- Stage 3. Generation of the set of refocused images with super-resolution, distance calculation, generation of viewpoints and display of the result on a 3D display, focusing on a single plane or with the scene completely in focus. Once the image is calibrated, and prior to its projection, this stage becomes necessary. First of all, it is necessary to form images refocused at different depths from the previously captured and calibrated plenoptic image.
- the array of microlenses is characterized by containing Ny rows of Nx microlenses equal to each other, of focal length f, where the image formed by the microlenses is collected by a sensor located perpendicular to the optical axis at distance f behind the microlenses, each microlens image occupying M times M pixels.
- the image can be considered an indexable digital signal with 4 discrete parameters, L(x, y, u, v), with x varying between 0 and Nx-1, y between 0 and Ny-1, u between 0 and M-1, and v between 0 and M-1.
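The 4D indexing above can be illustrated by reshaping a raw sensor image into L(x, y, u, v), where (x, y) selects the microlens and (u, v) the pixel behind it. The reshaping below is an assumed sketch of that bookkeeping, not the patent's code; the sizes are illustrative.

```python
import numpy as np

# Reshape a raw plenoptic sensor image (Ny*M rows, Nx*M columns) into
# the 4D light field L[x, y, u, v] with M x M pixels per microlens.
Nx, Ny, M = 10, 8, 5          # example sizes; M is prime as required
sensor = np.arange(Ny * M * Nx * M).reshape(Ny * M, Nx * M)

# Rows group as (y, v) and columns as (x, u); reorder axes to (x, y, u, v)
L = sensor.reshape(Ny, M, Nx, M).transpose(2, 0, 3, 1)

x, y, u, v = 3, 2, 4, 1
assert L[x, y, u, v] == sensor[y * M + v, x * M + u]
print(L.shape)
```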
- regular sampling is assumed in the dimensions of the microlenses and of the pixels behind each microlens. It is required that the number of pixels M behind each microlens be prime for this first description, and double a prime number for the second use described.
- Let I be the continuous image and L the continuous plenoptic signal.
- the integral becomes a line integral over a 2D space; however, this integral over a continuous signal has to be modified to take into account that what is sensed with the plenoptic camera is a discrete version of it.
- the proposed method makes a choice of discretization that differs from those previously found in the literature.
- the parameters of the output image can be considered parts of the same dimension: x = m_x * M + p_x, and the same for y: y = m_y * M + p_y.
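The decomposition above splits a sensor coordinate into a microlens index and a pixel offset; in code it is simply a quotient and remainder. A tiny sketch with illustrative values:

```python
# The decomposition x = m_x * M + p_x, implemented with divmod: a
# sensor column splits into a microlens index m_x and a pixel offset
# p_x (and likewise for y). Values are illustrative.
M = 9                      # pixels per microlens (example)
x = 4 * M + 7              # sensor column 43
m_x, p_x = divmod(x, M)
print(m_x, p_x)
```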
- a weighting function can be introduced into these formulas, multiplying the factors L(), to reflect the fact that the pixels can pick up
- a first image results from considering only those pixels behind the microlenses where the dimension p_x varies between 0 and M/2-1, while p_y varies between 0 and M/2-1; another quadrant results from considering p_x varying between M/2 and M-1 while p_y does so between 0 and M/2-1; another from evaluating p_x between 0 and M/2-1 while p_y does so between M/2 and M-1; and finally working with p_x between M/2 and M-1 while p_y varies between M/2 and M-1.
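The four-quadrant selection above can be sketched as a slice over the pixel offsets behind each microlens. This is an illustrative helper under the assumption that the light field is stored as a (Nx, Ny, M, M) array with M even, as the text's "double the prime number" case requires; the function name is hypothetical.

```python
import numpy as np

# Sketch of the four-quadrant selection: each output image keeps only
# the pixels whose offsets (p_x, p_y) fall in one quadrant of the
# M x M cell behind every microlens.
def quadrant(L, qx, qy):
    """L has shape (Nx, Ny, M, M); qx, qy in {0, 1} pick the half-range
    of p_x and p_y respectively. Hypothetical helper."""
    M = L.shape[2]
    h = M // 2
    return L[:, :, qx * h:(qx + 1) * h, qy * h:(qy + 1) * h]

Nx, Ny, M = 6, 4, 10
L = np.zeros((Nx, Ny, M, M))
q = quadrant(L, 0, 1)      # p_x in [0, M/2), p_y in [M/2, M)
print(q.shape)
```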
- the plenoptic image becomes an integral image using the SPOC algorithm (H. Navarro, R. Martínez-Cuenca, G. Saavedra, M. Martínez-Corral, and B. Javidi, "3D integral imaging display by smart pseudoscopic-to-orthoscopic conversion," Opt. Express 18, 25573-25583 (2010)).
- This algorithm allows the plenoptic image to be adapted to the monitor characteristics (monitor size, number of microlenses, number of pixels), and also to the type of image to be projected (image scale, image distance to the monitor).
- the proposed plenoptic monitor can be small in size (such as telephony terminals, mini video game consoles, electronic organizers, or digital photo frames), of intermediate size (such as digital tablets or netbooks), of large size (such as television screens or computer monitors) or larger (such as video scoreboards of sports stadiums, large advertising panels or large wall monitors).
- the structure of the monitor is similar.
- the integral image is projected onto the electronic digital display device (which can be of LCD or LED type in small, intermediate or large monitors, or composed of LEDs or ultra-bright bulbs in the case of wall monitors).
- the focal length of the lenses is of the order of four times the diameter of the lenses.
- the spatial resolution of the monitor is fixed by the size of the lenses.
- the number of perspectives is determined by the number of pixels present in the elementary cell behind each lens.
- the size of the lenses must be such that their angular size, as seen by the observer viewing the monitor from a normal distance (for example, 0.5 m for a computer monitor), is equal to or less than 10^-3 rad.
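The angular-size criterion above fixes the maximum lens diameter for a given viewing distance, since a lens of diameter d viewed from distance D subtends approximately d/D radians. A quick illustrative check (the function name is an assumption):

```python
# Small-angle approximation: angular size ~ diameter / distance, which
# must stay at or below 1e-3 rad per the criterion above.
def max_lens_diameter(viewing_distance_m, limit_rad=1e-3):
    return viewing_distance_m * limit_rad

d = max_lens_diameter(0.5)     # computer monitor viewed from 0.5 m
print(d * 1000, "mm")          # maximum lens diameter in millimetres
```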
- With respect to the number of pixels, to provide a sense of continuous perspective, the number of pixels per microlens must be greater than 12.
- the microlens matrix consists of refractive lenses. In wall-type monitors, to avoid the large weight of large-diameter lenses, it is more convenient to use Fresnel lens sheets or kinoform-type lens arrays.
- orthoscopic 3D images can be projected, with total parallax, and without the disadvantages of visual fatigue inherent in stereoscopic monitors.
Description of the figures
- Figure 1 Optical assembly of the proposed plenoptic objective.
- Figure 2 Plenoptic image from which the pitch and inclination are calculated by applying the Fourier transform.
- Figure 3 Schematic representation illustrating the correspondence of the pixels of the sensor with the boxes of the scheme, in an area of the (x, u) space. In the scheme, the projection of all the pixels belonging to a certain microlens x0 has been highlighted, in this case x0 = 1. The projection follows the inclination that causes the upper left end of the pixel area (x0, M-1) to project over the lower left corner of the pixel (x0-k, 0), in this case k = 1.
- Figure 6 Illustration of the generation of 4 different viewpoints from a plenoptic image.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
The present invention relates to an integrated system for three-dimensional image capture, processing and representation for interchangeable-lens cameras, which makes it possible to turn any interchangeable-lens camera into a 3D camera and which shows the results in such a way that the user does not experience visual fatigue. The system comprises a portable system coupled to a conventional interchangeable-lens camera, together with a method for calibrating the acquired image; a method for generating the set of refocused images with super-resolution, calculating distances and generating viewpoints; and a method for adapting the previous results by means of appropriate algorithms and their subsequent projection on a monitor based on the positioning of microlenses in front of a conventional screen.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| ES201100485 | 2011-04-28 | ||
| ES201100485A ES2391185B2 (es) | 2011-04-28 | 2011-04-28 | Integrated system for three-dimensional image capture, processing and representation. |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| WO2012146984A2 true WO2012146984A2 (fr) | 2012-11-01 |
| WO2012146984A3 WO2012146984A3 (fr) | 2012-12-27 |
| WO2012146984A8 WO2012146984A8 (fr) | 2013-06-06 |
Family
ID=47072829
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2012/001246 Ceased WO2012146984A2 (fr) | 2012-05-09 | Integrated system for three-dimensional image capture, processing and representation |
Country Status (2)
| Country | Link |
|---|---|
| ES (1) | ES2391185B2 (fr) |
| WO (1) | WO2012146984A2 (fr) |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7697839B2 (en) * | 2006-06-30 | 2010-04-13 | Microsoft Corporation | Parametric calibration for panoramic camera systems |
| TWI314832B (en) * | 2006-10-03 | 2009-09-11 | Univ Nat Taiwan | Single lens auto focus system for stereo image generation and method thereof |
| US8189065B2 (en) * | 2008-01-23 | 2012-05-29 | Adobe Systems Incorporated | Methods and apparatus for full-resolution light-field capture and rendering |
| ATE551841T1 (de) * | 2009-04-22 | 2012-04-15 | Raytrix Gmbh | Digital imaging method for synthesizing an image using data recorded with a plenoptic camera |
-
2011
- 2011-04-28 ES ES201100485A patent/ES2391185B2/es not_active Expired - Fee Related
-
2012
- 2012-05-09 WO PCT/IB2012/001246 patent/WO2012146984A2/fr not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| ES2391185B2 (es) | 2013-06-19 |
| ES2391185A1 (es) | 2012-11-22 |
| WO2012146984A8 (fr) | 2013-06-06 |
| WO2012146984A3 (fr) | 2012-12-27 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10897608B2 (en) | Capturing light-field images with uneven and/or incomplete angular sampling | |
| ES2349279T3 (es) | Method and scaling unit for scaling a three-dimensional model, and display apparatus | |
| CN108353188 (zh) | Method for encoding light field content | |
| JP6292327 (ja) | Method and computer program for calibrating the depth and disparity mapping of a plenoptic imaging system | |
| Hong et al. | Full parallax three-dimensional display from Kinect v1 and v2 | |
| CN107092096 (zh) | Naked-eye 3D ground sand table display system and method | |
| CN106154567 (zh) | Imaging method and device for a three-dimensional light field display system | |
| CN106526843 (zh) | Simulation method and device for a naked-eye 3D grating | |
| Diebold et al. | Light-field camera design for high-accuracy depth estimation | |
| JP2012084105 (ja) | Stereoscopic image generation device and program thereof | |
| CN115406371 (zh) | DIC-based flexible fingertip deformation measurement method | |
| US10909704B2 (en) | Apparatus and a method for generating data representing a pixel beam | |
| Shin et al. | Improved Viewing Quality of 3‐D Images in Computational Integral Imaging Reconstruction Based on Lenslet Array Model | |
| WO2012146984A2 (fr) | Integrated system for three-dimensional image capture, processing and representation | |
| Su et al. | Calibrating the orientation between a microlens array and a sensor based on projective geometry | |
| ES2987469T3 (es) | Optical method and system for acquiring the tomographic distribution of wavefronts of electromagnetic fields | |
| WO2020244273A1 (fr) | Dual-camera three-dimensional stereoscopic imaging system and processing method | |
| Pérez et al. | A fast and memory-efficient discrete focal stack transform for plenoptic sensors | |
| Reichel et al. | Reliable and good camera calibration based on machine learning inspired workflow: parameter study and experimental results | |
| Zhao et al. | Removal of parasitic image due to metal specularity based on digital micromirror device camera | |
| Bazeille et al. | Light-field image acquisition from a conventional camera: design of a four minilens ring device | |
| ES2622485T3 (es) | Stereoscopic measurement system and method | |
| Li et al. | Calibrating a camera focused on a long shot using a calibration plate and defocused corner points | |
| Riou et al. | Interests of refocused images calibrated in depth with a multi-view camera for control by vision | |
| Drazic et al. | Optimal design and critical analysis of a high-resolution video plenoptic demonstrator |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12776837 Country of ref document: EP Kind code of ref document: A2 |
|
| DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 12776837 Country of ref document: EP Kind code of ref document: A2 |