WO2022018791A1 - Image processing device and image processing method - Google Patents
Image processing device and image processing method
- Publication number
- WO2022018791A1 (PCT/JP2020/028065)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- images
- collapse
- observation
- existence
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
- G06T5/30—Erosion or dilatation, e.g. thinning
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
- G01S13/90—Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
- G01S13/9021—SAR image post-processing techniques
- G01S13/9027—Pattern recognition for feature extraction
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/254—Analysis of motion involving subtraction of images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/761—Proximity, similarity or dissimilarity measures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10044—Radar image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20036—Morphological image processing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
- G06T2207/30184—Infrastructure
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/07—Target detection
Definitions
- The present invention relates to an image processing device and an image processing method for generating an image in which regions of one image that have changed relative to another image can be identified.
- Synthetic aperture radar (SAR) is a technique that transmits and receives electromagnetic waves while a platform such as an artificial satellite or aircraft moves, thereby obtaining an image equivalent to one acquired by an antenna with a large aperture (hereinafter also referred to as a SAR image). Synthetic aperture radar is used, for example, to process signals reflected from the ground surface and to analyze ground-surface displacement.
- In the following, images taken by artificial satellites and the like are referred to as observation images.
- Observation images include optical images and SAR images.
- Each of the two images is referred to as an object existence image or an object map, and the two images together may be referred to as an image pair.
- An image in which the difference between two images can be identified, based on the result of comparing them, may be referred to as a change map or a synthetic change map.
- FIG. 14 is an explanatory diagram showing a method of creating the synthetic change map 333 described in Non-Patent Document 1.
- the first image 331 corresponds to the object map which is the first input image.
- the second image 332 corresponds to the object map which is the second input image.
- By combining the first image 331 and the second image 332 while allowing a deviation of several pixels between them, the synthetic change map 333 is generated.
- In the synthetic change map 333, the difference between the first image 331 and the second image 332 appears.
- Patent Document 1 describes a method of creating a classifier (trained model) using, as training data, two types of images (image pairs) generated from interferometric SAR images and correct answer data.
- the ground surface change is determined using the trained model.
- FIG. 15 is a block diagram showing a configuration of a general system that generates a change map using a trained model.
- The learning model is trained by machine learning 403 using the image pair 401 and the correct answer data (for example, a correct answer change map) 402 as training data.
- a trained model 410 is created.
- the change map 412 is generated from the image pair 411 using the trained model 410.
- the correct answer change map is a change map used as correct answer data.
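For concreteness, the following is a minimal sketch of this training and inference flow, assuming PyTorch and a toy convolutional model; the architecture, names, and loss are illustrative assumptions, not taken from Patent Document 1 or FIG. 15.

```python
import torch
import torch.nn as nn

# Toy stand-in for a change-detection model: the concatenated image pair
# (2 channels) goes in, a per-pixel change logit (1 channel) comes out.
model = nn.Sequential(
    nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, kernel_size=3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

def train_step(image_a, image_b, correct_change_map):
    """One supervised step: the image pair 401 is the input and the
    correct answer change map 402 is the target (cf. FIG. 15)."""
    pair = torch.cat([image_a, image_b], dim=1)  # shape (N, 2, H, W)
    loss = loss_fn(model(pair), correct_change_map)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def predict_change_map(image_a, image_b):
    """Inference: apply the trained model 410 to a new image pair 411."""
    with torch.no_grad():
        return torch.sigmoid(model(torch.cat([image_a, image_b], dim=1)))
```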
- The correct answer data is created manually, as described in paragraph [0019] of Patent Document 1. Therefore, it takes time to obtain a large amount of correct answer data.
- Moreover, correct answer data created by one creator may differ from correct answer data created by another creator. That is, the objectivity of the correct answer data is not guaranteed. In other words, correct answer data that reflects individual differences may be generated.
- In the method of Non-Patent Document 1, the first image 331 and the second image 332, on which the synthetic change map 333 is based, are created manually. Even if a synthetic change map 333 usable as correct answer data is generated automatically from the first image 331 and the second image 332, the synthetic change map 333 may differ substantially from a change map obtained from an actual observation image, because the original first image 331 and second image 332 are artificial. As a result, when the synthetic change map 333 is used as the correct answer change map, it may be far from a correct answer change map obtained from an actual observation image.
- It is therefore an object of the present invention to provide an image processing apparatus and an image processing method that can generate, in a short time and without being affected by individual differences, an image in which the difference between two input images can be identified, while avoiding results that are far from images obtained from actual observation images.
- The image processing apparatus according to the present invention includes: an image deformation means that deforms, based on the observation angle of each of two observation images and the size of objects appearing in each of the two observation images, the object existence regions in two object existence images, which are images obtained from each of the two observation images and in which a plurality of objects exist, to generate two deformed images; and an image generation means that synthesizes the two deformed images to generate a composite image, determines changes in objects between the two object existence images using the composite image, and generates an image in which the determined changes can be identified.
- The image processing method according to the present invention deforms, based on the observation angle of each of two observation images and the size of objects appearing in each of the two observation images, the object existence regions in two object existence images, which are images obtained from each of the two observation images and in which a plurality of objects exist, to generate two deformed images; synthesizes the two deformed images to generate a composite image; determines changes in objects between the two object existence images using the composite image; and generates an image in which the determined changes can be identified.
- The image processing program according to the present invention causes a computer to execute: a process of deforming, based on the observation angle of each of two observation images and the size of objects appearing in each of the two observation images, the object existence regions in two object existence images, which are images obtained from each of the two observation images and in which a plurality of objects exist, to generate two deformed images; and a process of synthesizing the two deformed images to generate a composite image, determining changes in objects between the two object existence images using the composite image, and generating an image in which the determined changes can be identified.
- According to the present invention, an image in which the difference between two input images can be identified can be generated in a short time without being affected by individual differences, and divergence from images obtained from actual observation images can be avoided.
- FIG. 1 is a block diagram showing the main components in the embodiment of the image processing apparatus.
- the image processing device 1 shown in FIG. 1 includes an object map generation means 10 and a correct answer change map generation means 20.
- a set of observation images is input to the object map generation means 10.
- The object map generation means 10 extracts, from each of the observation images, an image (an object existence image) including an object existence region in which an object subject to change detection exists. That is, the object map generation means 10 generates a set of object maps.
- the set of object maps corresponds to the image pairs described above.
- The object map generation means 10 extracts, for example, a predetermined region from the observation image; alternatively, the region may be extracted manually from the observation image.
- the observation angle (azimuth angle and incident angle) of each observation image and the size (height and width) of the object are input to the correct answer change map generation means 20.
- the size of the object is predetermined according to the object to be detected for change.
- The correct answer change map generation means 20 deforms each object map based on the observation angle of the corresponding observation image and the size of the object. Further, the correct answer change map generation means 20 synthesizes the deformed object maps to generate a composite image, and from it generates an image showing regions where objects have changed between the two object maps, that is, a change map. The change map generated by the correct answer change map generation means 20 is output as the correct answer change map.
- a SAR image will be taken as an example as an observation image.
- an automobile is taken as an example as an object.
- FIG. 2 is an explanatory diagram showing an example of a method of generating a correct answer change map.
- the artificial satellite 100 passing through the orbit A photographs the area including the parking lot 120 at time t1.
- the artificial satellite 100 passing through the orbit B different from the orbit A photographs the area including the parking lot 120 at the time t2 different from the time t1.
- Time t2 is a time after time t1.
- The first object map 111 (image A), obtained from the observation image at time t1, is shown in the upper center of FIG. 2.
- Image A has three automobiles 91, 92, and 93.
- The second object map 121 (image B), obtained from the observation image at time t2, is also shown in FIG. 2.
- Image B has two automobiles 93 and 94.
- The automobiles 91 and 92 that existed at time t1 have disappeared by time t2.
- A new automobile 94 appears. That is, automobile 94 appeared between time t1 and time t2.
- The first object map 111 and the second object map 121 correspond to images of the parking lot 120.
- the correct answer change map generation means 20 generates the correct answer change map 150 using the image A and the image B.
- In the correct answer change map 150, the ellipse drawn with a solid line indicates the region where the automobile 93, which did not change from time t1 to time t2, exists. That is, it indicates a region with no change.
- The black ellipse indicates the region where the newly appeared automobile 94 exists.
- The ellipses drawn with broken lines indicate the regions where the disappeared automobiles 91 and 92 existed. That is, the black ellipse and the ellipses drawn with broken lines indicate change regions.
- The change regions and the non-change regions may be distinguished by an expression different from that illustrated in FIG. 2.
- For example, the change regions and the non-change regions may be made distinguishable by a difference in color.
- FIG. 3 is an explanatory diagram for explaining the incident angle and the azimuth angle (range azimuth angle) of the electromagnetic wave.
- FIG. 3 shows a first observation image 101 obtained in the orbit A and a second observation image 102 obtained in the orbit B different from the orbit A.
- The incident angles θ A and θ B correspond to the angles from the zenith direction to the direction of the artificial satellite 100.
- The range azimuths α A and α B correspond to the angles of the range direction with respect to a reference direction (for example, north).
- FIG. 4 is an explanatory diagram for explaining the collapse amount (layover) l A. Assuming that the height of the object (in this example, the automobile) is h and the incident angle of the electromagnetic wave is θ A, the collapse amount l A is expressed by equation (1).
- The collapse amount in image A is denoted l A.
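As a concrete illustration, a collapse parameter (amount plus direction) could be computed as sketched below. Equation (1) itself does not appear in the text above, so the sketch assumes the standard SAR layover relation l = h / tan(θ), which matches the stated dependence on the object height h and the incident angle θ; treat that formula, along with the function and field names, as assumptions.

```python
import math

def collapse_parameter(incident_angle_deg, range_azimuth_deg, object_height_m):
    """Compute the collapse (layover) parameter for one observation image.

    Assumption: collapse amount l = h / tan(theta), a standard layover
    relation; the patent's equation (1) is not quoted in the text.
    """
    theta = math.radians(incident_angle_deg)
    amount_m = object_height_m / math.tan(theta)  # ground-range layover length
    # Per the description, the collapse direction equals the range azimuth
    # (or, under the opposite convention, the range azimuth +/- 180 degrees).
    direction_deg = range_azimuth_deg % 360.0
    return {"amount_m": amount_m, "direction_deg": direction_deg}

# Example: incident angle 35 degrees, range azimuth 100 degrees, car height 1.5 m.
print(collapse_parameter(35.0, 100.0, 1.5))
```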
- FIG. 5 is an explanatory diagram for explaining the expansion process executed by the correct answer change map generation means 20.
- The correct answer change map generation means 20 performs expansion processing on image A (the first object map 111), which is based on the first observation image 101 (see FIG. 3) obtained in orbit A.
- Specifically, the correct answer change map generation means 20 expands each object (in this example, each automobile) appearing in image A in the collapse direction of the corresponding object in image B, by a length corresponding to the collapse amount of the object in image B.
- As a result, image A after the expansion processing (the first object map 112, in which the objects are expanded) is obtained.
- the correct answer change map generation means 20 performs expansion processing on the image B (second object map 121) based on the second observation image 102 (see FIG. 3) obtained in the orbit B.
- That is, each object appearing in image B is expanded in the collapse direction of the corresponding object in image A, by a length corresponding to the collapse amount of the object in image A.
- the image B after the expansion process (second object map 122 in which the object is expanded) is obtained.
- In FIG. 5, the black regions indicate the expanded (dilated) portions.
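The expansion processing itself can be pictured as a directional dilation of a binary object map, as in the sketch below (a minimal illustration using NumPy; the function name, the coordinate convention, and the pixel-based amount are assumptions).

```python
import numpy as np

def dilate_along_direction(object_map, direction_deg, amount_px):
    """Smear every object pixel `amount_px` pixels along `direction_deg`.

    `direction_deg` is measured clockwise from north (the negative row
    axis), matching the reference-direction convention in FIG. 3.
    """
    rad = np.deg2rad(direction_deg)
    dx, dy = np.sin(rad), -np.cos(rad)  # x: columns, y: rows (down is +)
    out = object_map.astype(bool).copy()
    h, w = object_map.shape
    ys, xs = np.nonzero(object_map)
    for step in range(1, int(amount_px) + 1):
        sy = np.clip(np.round(ys + dy * step).astype(int), 0, h - 1)
        sx = np.clip(np.round(xs + dx * step).astype(int), 0, w - 1)
        out[sy, sx] = True
    return out
```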
- FIG. 6 is an explanatory diagram for explaining changes in the object.
- the correct answer change map generation means 20 superimposes the image A after the expansion process, that is, the first object map 112, and the image B after the expansion process, that is, the second object map 122.
- FIG. 6 schematically shows the composite image (change map) 140 after superposition.
- The observation image from which image B is derived was obtained later in time than the observation image from which image A is derived.
- the area [F, B] indicates a region in which the object is present in the first object map 112 but not in the second object map 122. That is, the region [F, B] indicates a region where the object has disappeared.
- the region [F, F] indicates a region where the object exists in the first object map 112 and exists in the second object map 122. That is, the region [F, F] indicates a region in which no change has occurred.
- the region [B, F] indicates a region where the object does not exist in the first object map 112 but exists in the second object map 122. That is, the region [B, F] indicates a region where a new object appears.
- Regions [B, B] indicate regions where the object does not exist in either the first object map 112 or the second object map 122. That is, the region [B, B] indicates a region in which no change has occurred.
- The correct answer change map generation means 20 generates the change map 140 based on the classification illustrated in FIG. 6. Specifically, the correct answer change map generation means 20 generates a change map 140 in which the change regions (regions where an object has disappeared or appeared) and the non-change regions can be distinguished.
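In code, the classification of FIG. 6 reduces to a pixel-wise comparison of the two dilated maps, for example as follows (a sketch; the label values are illustrative choices):

```python
import numpy as np

def make_change_map(map_a, map_b):
    """Classify each pixel of two superimposed (dilated) object maps.

    0: [B, B] background in both   (no change)
    1: [F, F] object in both       (no change)
    2: [F, B] object only in A     (disappeared)
    3: [B, F] object only in B     (newly appeared)
    """
    a, b = map_a.astype(bool), map_b.astype(bool)
    change_map = np.zeros(a.shape, dtype=np.uint8)
    change_map[a & b] = 1    # unchanged object
    change_map[a & ~b] = 2   # disappeared between t1 and t2
    change_map[~a & b] = 3   # appeared between t1 and t2
    return change_map
```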
- FIG. 7 is an explanatory diagram for explaining the noise removal process.
- In FIG. 7, the black regions correspond to the regions [B, F] illustrated in FIG. 6. That is, the black regions indicate regions where a new object has appeared.
- The regions surrounded by broken lines correspond to the regions [F, B] illustrated in FIG. 6. That is, they indicate regions where an object has disappeared.
- The regions surrounded by solid lines correspond to the regions [F, F] or [B, B] illustrated in FIG. 6. That is, they indicate regions where no change has occurred.
- the correct answer change map generation means 20 performs noise removal processing on the change map 140.
- The noise removal processing removes regions smaller than an object, regarding them as noise.
- the correct answer change map generation means 20 performs an opening process on the change map 140.
- The opening processing is a combination of erosion (contraction) and dilation (expansion).
- When executing the erosion in the opening processing, the correct answer change map generation means 20 erodes by a number of pixels that depends on the size of the object.
- In the following, the change map from which noise has been removed is referred to as the correct answer change map 150; however, the change map 140 before the noise removal processing, although noise remains in it, may also be used as the correct answer change map.
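A hedged sketch of this noise removal, using the morphological opening from scipy.ndimage (the per-label handling and the function name are assumptions):

```python
import numpy as np
from scipy import ndimage

def remove_small_regions(change_labels, label, erosion_iterations=2):
    """Open one change class (erosion then dilation) to drop small blobs.

    With two erosion iterations, blocks narrower than 3 pixels (that is,
    2 pixels or less) disappear, matching the example given later in the
    description of step S19.
    """
    mask = change_labels == label
    opened = ndimage.binary_opening(mask, iterations=erosion_iterations)
    cleaned = change_labels.copy()
    cleaned[mask & ~opened] = 0  # demote removed pixels to "no change"
    return cleaned
```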
- FIG. 8 is a block diagram showing a specific configuration example of the correct answer change map generation means 20.
- The correct answer change map generation means 20 shown in FIG. 8 includes a first collapse parameter calculation means 21, a second collapse parameter calculation means 22, a first expansion means 31, a second expansion means 32, a change map generation means 41, and a noise removing means 51.
- The range azimuth, the incident angle, and the height of the object for image A are input to the first collapse parameter calculation means 21.
- The first collapse parameter calculation means 21 calculates the collapse amount of the object in image A using the incident angle and the height of the object. Further, the first collapse parameter calculation means 21 determines the collapse direction of the object in image A using the range azimuth. The collapse direction is the same as the direction indicated by the range azimuth α A.
- the first collapse parameter calculation means 21 outputs the first collapse parameter to the second expansion means 32.
- the first collapse parameter includes at least data indicating the collapse amount of the object and data indicating the collapse direction of the object.
- The range azimuth, the incident angle, and the height of the object for image B are input to the second collapse parameter calculation means 22.
- The second collapse parameter calculation means 22 calculates the collapse amount of the object in image B using the incident angle and the height of the object. Further, the second collapse parameter calculation means 22 determines the collapse direction of the object in image B using the range azimuth. The collapse direction is the same as the direction indicated by the range azimuth α B.
- the second collapse parameter calculation means 22 outputs the second collapse parameter to the first expansion means 31.
- the second collapse parameter includes at least data indicating the collapse amount of the object and data indicating the collapse direction of the object.
- Depending on the direction convention, the first collapse parameter calculation means 21 may instead calculate the direction indicated by the range azimuth α A + 180 degrees (or the range azimuth α A − 180 degrees) as the collapse direction in the first collapse parameter.
- Likewise, the second collapse parameter calculation means 22 may calculate the direction indicated by the range azimuth α B + 180 degrees (or the range azimuth α B − 180 degrees) as the collapse direction in the second collapse parameter.
- the image A and the second collapse parameter are input to the first expansion means 31.
- the first expansion means 31 expands the object in the image A using the second collapse parameter to generate an image A (first object map 112) in which the object is expanded.
- the first expansion means 31 outputs the first object map 112 to the change map generation means 41.
- the image B and the first collapse parameter are input to the second expansion means 32.
- the second expansion means 32 expands the object in the image B using the first collapse parameter to generate the image B (second object map 122) in which the object is expanded.
- the second expansion means 32 outputs the second object map 122 to the change map generation means 41.
- The change map generation means 41 superimposes the first object map 112 and the second object map 122. That is, the change map generation means 41 synthesizes the first object map 112 and the second object map 122. Then, the change map generation means 41 determines the change (disappearance or appearance) between each object in the first object map 112 and the corresponding object in the second object map 122. The change map generation means 41 converts the composite image in which the first object map 112 and the second object map 122 are superimposed into an image in which the change regions and the non-change regions can be distinguished, and outputs it to the noise removing means 51 as the change map 140.
- the noise removing means 51 performs an opening process on the change map 140, and outputs an image from which noise has been removed as a correct answer change map.
- The object map generation means 10 (not shown in FIG. 8) extracts, from each observation image in the input set of observation images, an image including an object existence region in which an object subject to change detection exists (an object existence image).
- The two extracted object existence images form a set of object maps (step S11).
- the two observation images constituting the set of observation images are, for example, SAR images based on images taken from the artificial satellite 100 in different orbits at different times.
- the object map generated in the process of step S11 corresponds to the first object map 111 and the second object map 121 shown in FIG.
- the meta information of one of the observed images is input to the first collapse parameter calculation means 21.
- the meta information of the other observed image is input to the second collapse parameter calculation means 22.
- Generally, available observation images are accompanied by meta information (metadata) such as the shooting time, the shooting point (for example, the latitude and longitude of the center of the observation image), and the electromagnetic wave irradiation direction (observation direction).
- The first collapse parameter calculation means 21 extracts the range azimuth α A and the incident angle θ A from the meta information of one observation image.
- The second collapse parameter calculation means 22 extracts the range azimuth α B and the incident angle θ B from the meta information of the other observation image (step S12).
- In this example, the first collapse parameter calculation means 21 and the second collapse parameter calculation means 22 extract the range azimuth and the incident angle from the meta information.
- However, a means other than the first collapse parameter calculation means 21 and the second collapse parameter calculation means 22 may extract the range azimuth and the incident angle from the meta information. In that case, that means supplies the extracted range azimuth and incident angle to the first collapse parameter calculation means 21 and the second collapse parameter calculation means 22.
- Data indicating the height h of the object is input to the first collapse parameter calculation means 21 and the second collapse parameter calculation means 22 (step S13).
- The processing order of steps S11 to S13 is arbitrary. That is, steps S11 to S13 do not necessarily have to be executed in the order shown in FIG. 9. Further, the height h of the object is set in advance. For example, when the object is an automobile, a typical automobile height, or that value plus a margin, is input as the height h of the object.
- the first collapse parameter calculation means 21 and the second collapse parameter calculation means 22 calculate the collapse parameter (step S14).
- The first collapse parameter calculation means 21 calculates the collapse amount l A of the object in image A according to equation (1) above, using the incident angle θ A obtained in step S12 and the height h of the object.
- Further, the first collapse parameter calculation means 21 sets the range azimuth α A obtained in step S12 as the collapse direction of the object.
- The first collapse parameter calculation means 21 uses the obtained collapse amount and collapse direction as the first collapse parameter.
- When a plurality of objects are present in image A, the first collapse parameter calculation means 21 determines the collapse amount and collapse direction of each object, and includes each collapse amount and collapse direction in the first collapse parameter.
- Similarly, in step S14, the second collapse parameter calculation means 22 calculates the collapse amount l B of the object in image B according to equation (1) above, using the incident angle θ B obtained in step S12 and the height h of the object. Further, the second collapse parameter calculation means 22 sets the range azimuth α B obtained in step S12 as the collapse direction of the object. The second collapse parameter calculation means 22 uses the obtained collapse amount and collapse direction as the second collapse parameter. When a plurality of objects are present in image B, the second collapse parameter calculation means 22 determines the collapse amount and collapse direction of each object, and includes each collapse amount and collapse direction in the second collapse parameter.
- As noted above, the first collapse parameter calculation means 21 may instead determine the direction 180 degrees from the range azimuth α A as the collapse direction in the first collapse parameter.
- Likewise, the second collapse parameter calculation means 22 may determine the direction 180 degrees from the range azimuth α B as the collapse direction in the second collapse parameter.
- Next, the first expansion means 31 and the second expansion means 32 expand the objects in the object maps (image A and image B) (step S15).
- Specifically, the first expansion means 31 expands each object in image A in the collapse direction included in the second collapse parameter, by the collapse amount l B.
- The second expansion means 32 expands each object in image B in the collapse direction included in the first collapse parameter, by the collapse amount l A.
- The change map generation means 41 superimposes image A with its objects expanded (the first object map 112; see FIG. 5) and image B with its objects expanded (the second object map 122; see FIG. 5) (step S16).
- The change map generation means 41 determines whether each object has changed based on the degree of overlap of the objects in the composite image created in step S16. For example, the change map generation means 41 determines whether an object has changed by comparing the first object map 112 and the second object map 122 pixel by pixel. Then, as illustrated in FIG. 6, the change map generation means 41 determines that an object that exists in image A but does not exist in image B is a disappeared object (a changed object). Further, the change map generation means 41 determines that an object that does not exist in image A but exists in image B is a newly appearing object (a changed object). The change map generation means 41 determines that all other objects are unchanged.
- The change map generation means 41 generates the change map 140 (see FIG. 7) by reflecting the results of these change determinations in the composite image created in step S16 (step S17).
- Data indicating the width of the object is input to the noise removing means 51 (step S18).
- The width of the object is preset. For example, when the object is an automobile, the width of a typical automobile, or that value plus a margin, is input to the noise removing means 51 as the width of the object.
- The process of step S18 need not be executed at the timing shown in FIG. 9; it suffices that the width of the object has been input by the time execution of step S19 starts.
- the noise removing means 51 performs an opening process on the change map 140, and outputs an image from which noise has been removed as a correct answer change map (step S19).
- In the erosion step of the opening processing, the noise removing means 51 erodes by a number of pixels that depends on the size (specifically, the width) of the object.
- The number of pixels to erode is predetermined according to the size of the object. That is, the number of pixels below which a region should be judged not to be an object is set as removable.
- For example, the noise removing means 51 executes the erosion twice in the opening processing so that blocks smaller than 3 pixels, that is, 2 pixels or less, are removed.
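Putting the pieces together, the flow of steps S12 to S19 could be sketched end to end as follows, reusing the illustrative helpers defined above; all names, the metadata keys, and the pixel-spacing conversion are assumptions, not the patent's API.

```python
def generate_correct_change_map(object_map_a, object_map_b, meta_a, meta_b,
                                object_height_m, object_width_px,
                                pixel_spacing_m=1.0):
    """End-to-end sketch: collapse parameters, cross-dilation, change
    classification, then opening-based noise removal."""
    # Steps S12/S14: collapse parameters from each image's metadata.
    p_a = collapse_parameter(meta_a["incident_angle_deg"],
                             meta_a["range_azimuth_deg"], object_height_m)
    p_b = collapse_parameter(meta_b["incident_angle_deg"],
                             meta_b["range_azimuth_deg"], object_height_m)
    # Step S15: expand each map using the *other* image's collapse parameter,
    # converting the metric collapse amount to pixels.
    px_b = int(round(p_b["amount_m"] / pixel_spacing_m))
    px_a = int(round(p_a["amount_m"] / pixel_spacing_m))
    a_dil = dilate_along_direction(object_map_a, p_b["direction_deg"], px_b)
    b_dil = dilate_along_direction(object_map_b, p_a["direction_deg"], px_a)
    # Steps S16/S17: superimpose and classify changes.
    change = make_change_map(a_dil, b_dil)
    # Steps S18/S19: opening per change class; the erosion count is tied
    # to the object width (assumption: roughly half the width in pixels).
    iterations = max(1, object_width_px // 2)
    for label in (2, 3):  # disappeared, appeared
        change = remove_small_regions(change, label, iterations)
    return change
```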
- As described above, the image processing device of the present embodiment generates, based on actual observation images, a change map used as correct answer training data for machine learning. Therefore, the change map can be generated in a short time, without the individual differences that arise when change maps are created manually. In addition, divergence of the change map from images obtained from actual observation images can be avoided.
- The image processing apparatus is preferably configured to expand the object existence region in the first object map 111 according to the collapse direction and collapse amount of the object in the second object map 121, and to expand the object existence region in the second object map 121 according to the collapse direction and collapse amount of the object in the first object map 111.
- Further, the image processing device may be configured to remove, from the composite image, regions smaller than a predetermined value determined based on the width of the object.
- With this configuration, even if a small region in the composite image would otherwise be determined to be a change region, the finally obtained change map (the change map used as the correct answer change map) does not include change regions other than objects. Therefore, the reliability of the correct answer change map can be increased.
- the image processing device of the above embodiment can be configured by hardware, but it can also be realized by a computer program.
- FIG. 10 is a block diagram showing a configuration example of an information processing device capable of realizing the functions of the image processing device of the above embodiment.
- The information processing apparatus shown in FIG. 10 includes one or more processors such as CPUs (Central Processing Units), a program memory 1002, and a memory 1003.
- FIG. 10 illustrates an information processing apparatus having a single processor 1001.
- the program memory 1002 is, for example, a non-transitory computer readable medium.
- Non-transitory computer-readable media include various types of tangible storage media.
- a semiconductor storage medium such as a flash ROM (Read Only Memory) or a magnetic storage medium such as a hard disk can be used.
- The program memory 1002 stores an image processing program for realizing the functions of the blocks in the image processing apparatus of the above embodiment (the object map generation means 10, the correct answer change map generation means 20, the first collapse parameter calculation means 21, the second collapse parameter calculation means 22, the first expansion means 31, the second expansion means 32, the change map generation means 41, and the noise removing means 51).
- the processor 1001 realizes the function of the image processing device by executing the processing according to the image processing program stored in the program memory 1002.
- When a plurality of processors are provided, they can work together to realize the functions of the image processing device.
- The memory 1003 is, for example, a RAM (Random Access Memory).
- The memory 1003 stores temporary data and the like generated while the image processing apparatus is executing processing. It is also possible to transfer the image processing program to the memory 1003 and have the processor 1001 execute processing based on the image processing program in the memory 1003.
- the program memory 1002 and the memory 1003 may be integrated.
- FIG. 11 is a block diagram showing a main part of the image processing device.
- The image processing apparatus 60 shown in FIG. 11 includes an image deformation unit (image deformation means) 61 (in the embodiment, realized by the first expansion means 31 and the second expansion means 32) that deforms, based on the observation angle of each of two observation images (for example, the range azimuth and the incident angle) and the size of the object appearing in each of the two observation images (for example, the height of the object), the object existence regions in two object existence images (for example, the first object map 111 and the second object map 121), which are images obtained from each of the two observation images and in which one or more objects exist, to generate two deformed images; and an image generation unit (image generation means) 62 (in the embodiment, realized by the change map generation means 41) that synthesizes the two deformed images to generate a composite image, determines changes in objects between the two object existence images using the composite image, and generates an image (for example, the change map 140) in which the determined changes can be identified.
- The image processing device 60 may further include a collapse parameter determination unit (collapse parameter determination means) 63 (in the embodiment, realized by the first collapse parameter calculation means 21 and the second collapse parameter calculation means 22) that calculates the collapse amount using the observation angle included in the metadata of the two observation images and the height of the object.
- The image processing apparatus 60 may further include a removal unit (removal means) 64 (in the embodiment, realized by the noise removing means 51) that removes regions smaller than a predetermined value determined based on the width of the object.
- (Appendix 1) An image processing device comprising: an image deformation means that deforms, based on the observation angle of each of two observation images and the size of an object appearing in each of the two observation images, the object existence regions in two object existence images, which are images obtained from each of the two observation images and in which one or more objects exist, to generate two deformed images; and an image generation means that synthesizes the two deformed images to generate a composite image, determines a change in the object between the two object existence images using the composite image, and generates an image in which the determined change can be identified.
- (Appendix 2) The image processing apparatus of Appendix 1, wherein the image deformation means expands the object existence region by a predetermined amount in each of the two object existence images.
- (Appendix 3) The image processing apparatus of Appendix 2, wherein the image deformation means expands the object existence region in the first object existence image of the two object existence images according to the collapse direction and collapse amount of the object in the second object existence image of the two object existence images, and expands the object existence region in the second object existence image according to the collapse direction and collapse amount of the object in the first object existence image.
- (Appendix 4) The image processing apparatus of Appendix 3, further comprising a collapse parameter determination means that calculates the collapse amount using the observation angle included in the metadata of the two observation images and the height of the object, and determines the collapse direction based on the observation direction included in the metadata of the observation images.
- (Appendix 5) The image processing apparatus according to any one of Appendices 1 to 4, further comprising a removal means that removes a region smaller than a predetermined value determined based on the width of the object.
- (Appendix 6) An image processing method comprising: deforming, based on the observation angle of each of two observation images and the size of an object appearing in each of the two observation images, the object existence regions in two object existence images, which are images obtained from each of the two observation images and in which one or more objects exist, to generate two deformed images; synthesizing the two deformed images to generate a composite image; determining a change in the object between the two object existence images using the composite image; and generating an image in which the determined change can be identified.
- (Appendix 7) The image processing method of Appendix 6, wherein the object existence region is expanded by a predetermined amount in each of the two object existence images.
- (Appendix 8) The image processing method of Appendix 7, wherein the object existence region in the first object existence image of the two object existence images is expanded according to the collapse direction and collapse amount of the object in the second object existence image of the two object existence images, and the object existence region in the second object existence image is expanded according to the collapse direction and collapse amount of the object in the first object existence image.
- (Appendix 9) The image processing method according to any one of Appendices 6 to 8, wherein a region smaller than a predetermined value determined based on the width of the object is removed.
- (Appendix 10) A computer-readable recording medium storing an image processing program, the image processing program causing a computer to execute: a process of deforming, based on the observation angle of each of two observation images and the size of an object appearing in each of the two observation images, the object existence regions in two object existence images, which are images obtained from each of the two observation images and in which one or more objects exist, to generate two deformed images; and a process of synthesizing the two deformed images to generate a composite image, determining a change in the object between the two object existence images using the composite image, and generating an image in which the determined change can be identified.
- (Appendix 11) The recording medium of Appendix 10, wherein the image processing program causes the computer to execute a process of expanding the object existence region by a predetermined amount in each of the two object existence images.
- (Appendix 12) The recording medium of Appendix 11, wherein the image processing program causes the computer to execute a process of expanding the object existence region in the first object existence image of the two object existence images according to the collapse direction and collapse amount of the object in the second object existence image of the two object existence images, and expanding the object existence region in the second object existence image according to the collapse direction and collapse amount of the object in the first object existence image.
- (Appendix 13) The recording medium according to any one of Appendices 10 to 12, wherein the image processing program causes the computer to execute a process of removing a region smaller than a predetermined value determined based on the width of the object.
- (Appendix 14) An image processing program that causes a computer to execute: a process of deforming, based on the observation angle of each of two observation images and the size of an object appearing in each of the two observation images, the object existence regions in two object existence images, which are images obtained from each of the two observation images and in which one or more objects exist, to generate two deformed images; and a process of synthesizing the two deformed images to generate a composite image, determining a change in the object between the two object existence images using the composite image, and generating an image in which the determined change can be identified.
- (Appendix 15) The image processing program of Appendix 14 that causes the computer to execute a process of expanding the object existence region by a predetermined amount in each of the two object existence images.
- (Appendix 16) The image processing program of Appendix 15 that causes the computer to execute a process of expanding the object existence region in the first object existence image of the two object existence images according to the collapse direction and collapse amount of the object in the second object existence image of the two object existence images, and expanding the object existence region in the second object existence image according to the collapse direction and collapse amount of the object in the first object existence image.
- (Appendix 17) The image processing program according to any one of Appendices 14 to 16 that causes the computer to execute a process of removing a region smaller than a predetermined value determined based on the width of the object.
- (Appendix 18) An image processing program for realizing the image processing method according to any one of Appendices 6 to 9.
- 1 Image processing device
- 10 Object map generation means
- 20 Correct answer change map generation means
- 21 First collapse parameter calculation means
- 22 Second collapse parameter calculation means
- 31 First expansion means
- 32 Second expansion means
- 41 Change map generation means
- 51 Noise removal means
- 60 Image processing device
- 61 Image deformation unit
- 62 Image generation unit
- 63 Collapse parameter determination unit
- 64 Removal unit
- 100 Artificial satellite
- 1001 Processor
- 1002 Program memory
- 1003 Memory
Claims (13)
- 1. An image processing device comprising: an image deformation means that deforms, based on the observation angle of each of two observation images and the size of an object appearing in each of the two observation images, the object existence regions in two object existence images, which are images obtained from each of the two observation images and in which one or more objects exist, to generate two deformed images; and an image generation means that synthesizes the two deformed images to generate a composite image, determines a change in the object between the two object existence images using the composite image, and generates an image in which the determined change can be identified.
- 2. The image processing device according to claim 1, wherein the image deformation means expands the object existence region by a predetermined amount in each of the two object existence images.
- 3. The image processing device according to claim 2, wherein the image deformation means expands the object existence region in the first object existence image of the two object existence images according to the collapse direction and collapse amount of the object in the second object existence image of the two object existence images, and expands the object existence region in the second object existence image according to the collapse direction and collapse amount of the object in the first object existence image.
- 4. The image processing device according to claim 3, further comprising a collapse parameter determination means that calculates the collapse amount using the observation angle included in the metadata of the two observation images and the height of the object, and determines the collapse direction based on the observation direction included in the metadata of the observation images.
- 5. The image processing device according to any one of claims 1 to 4, further comprising a removal means that removes a region smaller than a predetermined value determined based on the width of the object.
- 6. An image processing method comprising: deforming, based on the observation angle of each of two observation images and the size of an object appearing in each of the two observation images, the object existence regions in two object existence images, which are images obtained from each of the two observation images and in which one or more objects exist, to generate two deformed images; and synthesizing the two deformed images to generate a composite image, determining a change in the object between the two object existence images using the composite image, and generating an image in which the determined change can be identified.
- 7. The image processing method according to claim 6, wherein the object existence region is expanded by a predetermined amount in each of the two object existence images.
- 8. The image processing method according to claim 7, wherein the object existence region in the first object existence image of the two object existence images is expanded according to the collapse direction and collapse amount of the object in the second object existence image of the two object existence images, and the object existence region in the second object existence image is expanded according to the collapse direction and collapse amount of the object in the first object existence image.
- 9. The image processing method according to any one of claims 6 to 8, wherein a region smaller than a predetermined value determined based on the width of the object is removed.
- 10. A computer-readable recording medium storing an image processing program, wherein the image processing program causes a computer to execute: a process of deforming, based on the observation angle of each of two observation images and the size of an object appearing in each of the two observation images, the object existence regions in two object existence images, which are images obtained from each of the two observation images and in which one or more objects exist, to generate two deformed images; and a process of synthesizing the two deformed images to generate a composite image, determining a change in the object between the two object existence images using the composite image, and generating an image in which the determined change can be identified.
- 11. The recording medium according to claim 10, wherein the image processing program causes the computer to execute a process of expanding the object existence region by a predetermined amount in each of the two object existence images.
- 12. The recording medium according to claim 11, wherein the image processing program causes the computer to execute a process of expanding the object existence region in the first object existence image of the two object existence images according to the collapse direction and collapse amount of the object in the second object existence image of the two object existence images, and expanding the object existence region in the second object existence image according to the collapse direction and collapse amount of the object in the first object existence image.
- 13. The recording medium according to any one of claims 10 to 12, wherein the image processing program causes the computer to execute a process of removing a region smaller than a predetermined value determined based on the width of the object.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022538501A JP7501637B2 (ja) | 2020-07-20 | 2020-07-20 | Image processing device and image processing method |
| US18/015,886 US20230245287A1 (en) | 2020-07-20 | 2020-07-20 | Image processing device and image processing method |
| EP20946445.2A EP4184209A4 (en) | 2020-07-20 | 2020-07-20 | IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD |
| PCT/JP2020/028065 WO2022018791A1 (ja) | 2020-07-20 | 2020-07-20 | Image processing device and image processing method |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2020/028065 WO2022018791A1 (ja) | 2020-07-20 | 2020-07-20 | Image processing device and image processing method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022018791A1 true WO2022018791A1 (ja) | 2022-01-27 |
Family
ID=79728581
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/028065 Ceased WO2022018791A1 (ja) | 2020-07-20 | 2020-07-20 | 画像処理装置および画像処理方法 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US20230245287A1 (ja) |
| EP (1) | EP4184209A4 (ja) |
| JP (1) | JP7501637B2 (ja) |
| WO (1) | WO2022018791A1 (ja) |
Family Cites Families (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| SE9900692L (sv) | 1999-02-26 | 2000-03-27 | Foersvarets Forskningsanstalt | Method of detecting, with a SAR radar, objects that change over time |
| US8243997B2 (en) * | 2008-10-16 | 2012-08-14 | The Curators Of The University Of Missouri | Detecting geographic-area change using high-resolution, remotely sensed imagery |
| US9146312B1 (en) * | 2011-05-25 | 2015-09-29 | Sandia Corporation | Pre-processing SAR image stream to facilitate compression for transport on bandwidth-limited-link |
| WO2014171988A2 (en) * | 2013-01-29 | 2014-10-23 | Andrew Robert Korb | Methods for analyzing and compressing multiple images |
| US9405974B2 (en) * | 2013-11-13 | 2016-08-02 | Xerox Corporation | System and method for using apparent size and orientation of an object to improve video-based tracking in regularized environments |
| JP6349937B2 (ja) * | 2014-05-09 | 2018-07-04 | 日本電気株式会社 | 変動検出装置、変動検出方法および変動検出用プログラム |
| US10032077B1 (en) * | 2015-10-29 | 2018-07-24 | National Technology & Engineering Solutions Of Sandia, Llc | Vehicle track identification in synthetic aperture radar images |
| US10839211B2 (en) * | 2017-08-08 | 2020-11-17 | Spaceknow Inc. | Systems, methods and computer program products for multi-resolution multi-spectral deep learning based change detection for satellite images |
| US10553020B1 (en) * | 2018-03-20 | 2020-02-04 | Ratheon Company | Shadow mask generation using elevation data |
| US11280911B2 (en) * | 2018-09-28 | 2022-03-22 | Echo Ridge, Llc | System and method for synthetic aperture based position, navigation, and timing |
| US10852421B1 (en) * | 2019-01-24 | 2020-12-01 | Descartes Labs, Inc. | Sparse phase unwrapping |
| US11094114B2 (en) * | 2019-02-08 | 2021-08-17 | Ursa Space Systems Inc. | Satellite SAR artifact suppression for enhanced three-dimensional feature extraction, change detection, and visualizations |
| CN110058236B (zh) * | 2019-05-21 | 2023-04-07 | 中南大学 | InSAR and GNSS weighting method for three-dimensional surface deformation estimation |
| US11333753B2 (en) * | 2019-10-14 | 2022-05-17 | The Boeing Company | Stripmap synthetic aperture radar (SAR) system utilizing direct matching and registration in range profile space |
| EP3896482B1 (en) * | 2020-04-15 | 2024-08-14 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Method for the computer-implemented generation of a synthetic data set for training a convolutional neural network for an interferometric sar |
- 2020-07-20: WO application PCT/JP2020/028065 filed (WO2022018791A1, Ceased)
- 2020-07-20: EP application EP20946445.2 filed (EP4184209A4, Withdrawn)
- 2020-07-20: JP application JP2022538501A filed (JP7501637B2, Active)
- 2020-07-20: US application US18/015,886 filed (US20230245287A1, Pending)
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0772244A (ja) * | 1993-06-14 | 1995-03-17 | Nec Corp | Interferometric synthetic aperture radar device and topography change observation system |
| JP2000275338A (ja) * | 1999-03-25 | 2000-10-06 | Mitsubishi Electric Corp | Target identification device and target identification method |
| JP2003044996A (ja) * | 2001-07-31 | 2003-02-14 | Matsushita Electric Ind Co Ltd | Obstacle detection device |
| JP2010278963A (ja) * | 2009-06-01 | 2010-12-09 | Tokyo Gas Co Ltd | Image processing program, image processing device, and image processing method for monitoring system |
| JP2014164579A (ja) * | 2013-02-26 | 2014-09-08 | Oki Electric Ind Co Ltd | Information processing device, program, and information processing method |
| JP2016057092A (ja) * | 2014-09-05 | 2016-04-21 | 国立研究開発法人情報通信研究機構 | Method for forming three-dimensional topographic maps from SAR images |
| WO2017126547A1 (ja) * | 2016-01-22 | 2017-07-27 | 日本電気株式会社 | Image processing device, image processing method, and image processing program |
| JP2018194404A (ja) | 2017-05-16 | 2018-12-06 | 株式会社パスコ | Machine learning method and ground surface change determination method |
Non-Patent Citations (2)
| Title |
|---|
| BRUZZONE, L ET AL.: "A novel framework for the design of change-detection systems for very- High-resolution remote sensing images", PROCEEDINGS OF THE IEEE, vol. 101, no. 3, March 2013 (2013-03-01), pages 609 - 630, XP011494139, DOI: 10.1109/JPROC.2012.2197169 * |
| M. A. LEBEDEV ET AL.: "CHANGE DETECTION IN REMOTE SENSING IMAGES USING CONDITIONAL ADVERSARIAL NETWORKS", THE INTERNATIONAL ARCHIVES OF THE PHOTOGRAMMETRY, REMOTE SENSING AND SPATIAL INFORMATION SCIENCES, vol. XLII-2, 2018 |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2024142804A1 (ja) * | 2022-12-28 | 2024-07-04 | 日本電気株式会社 | Information processing device, detection device, information processing method, and recording medium |
| JPWO2024142804A1 (ja) | 2022-12-28 | 2024-07-04 | | |
| JP7845509B2 (ja) | 2022-12-28 | 2026-04-14 | 日本電気株式会社 | Information processing device, detection device, information processing method, and program |
| WO2025058056A1 (ja) * | 2023-09-15 | 2025-03-20 | 住友電気工業株式会社 | Information processing device, information generation method, and computer program |
Also Published As
| Publication number | Publication date |
|---|---|
| US20230245287A1 (en) | 2023-08-03 |
| JP7501637B2 (ja) | 2024-06-18 |
| EP4184209A4 (en) | 2023-08-09 |
| EP4184209A1 (en) | 2023-05-24 |
| JPWO2022018791A1 (ja) | 2022-01-27 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 20946445; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2022538501; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2020946445; Country of ref document: EP; Effective date: 20230220 |
| | WWW | WIPO information: withdrawn in national office | Ref document number: 2020946445; Country of ref document: EP |