WO2025204792A1 - Information processing method, information processing device, and program - Google Patents
Information processing method, information processing device, and program
- Publication number
- WO2025204792A1 (PCT/JP2025/008708)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- self
- information
- location estimation
- location
- distance information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/242—Means based on the reflection of waves generated by the vehicle
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/246—Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/40—Control within particular dimensions
- G05D1/43—Control of position or course in two dimensions [2D]
Definitions
- Patent Document 1 discloses a self-propelled robot equipped with a height-adjustable range sensor; in a warehouse where shelves of different heights are lined up, the robot sets the sensor height to match the shelf boards in each area and then uses the measurement results to estimate its own position.
- the information processing device disclosed herein is an information processing device that includes a distance information generation unit that generates first distance information and second distance information from sensor data output by a distance measurement sensor mounted on a moving object; a self-location estimation unit that performs a first self-location estimation process based on the first distance information and a second self-location estimation process based on the second distance information as self-location estimation processes for the moving object; and a self-location information output unit that outputs self-location information that represents the self-location of the moving object based on the processing results of the first self-location estimation process and the second self-location estimation process.
- the program disclosed herein causes a computer to generate first distance information and second distance information from sensor data output by a distance measurement sensor mounted on a mobile object, and as a self-location estimation process for the mobile object, execute a first self-location estimation process based on the first distance information and a second self-location estimation process based on the second distance information, and output self-location information representing the self-location of the mobile object based on the results of the first self-location estimation process and the second self-location estimation process.
- first distance information and second distance information are generated from sensor data output by a distance measurement sensor mounted on a moving body, and as a self-location estimation process for the moving body, a first self-location estimation process based on the first distance information is executed, and a second self-location estimation process based on the second distance information is executed, and self-location information representing the self-location of the moving body is output based on the processing results of the first self-location estimation process and the second self-location estimation process.
- FIG. 1 is a diagram illustrating a first example of a self-location estimation process according to the technology of the present disclosure.
- FIG. 10 is a diagram illustrating a second example of a self-location estimation process according to the technology of the present disclosure.
- FIG. 2 is a block diagram showing an example of the configuration of a moving body.
- FIG. 2 is a block diagram illustrating an example of a functional configuration of an information processing unit.
- FIG. 10 is a diagram illustrating updating/recreating an environment map based on a score.
- FIG. 10 is a diagram illustrating position correction between submaps.
- FIG. 10 is a flowchart illustrating a multi-layer self-location estimation process.
- FIG. 10 is a block diagram showing another example of the functional configuration of the information processing unit.
- FIG. 10 is a flowchart illustrating a layer search process.
- FIG. 10 is a block diagram illustrating yet another example of the functional configuration of the information processing unit.
- FIG. 2 is a block diagram illustrating an example of the hardware configuration of a computer.
- SLAM: Simultaneous Localization and Mapping
- LiDAR: Light Detection and Ranging
- 3D LiDAR SLAM uses point cloud information obtained by sensing three-dimensional structures (such as loaded cargo) indicated by the dashed line R1 in the figure, so the impact of environmental changes is significant.
- point cloud information obtained by 3D LiDAR is divided into layers in the height direction, and self-location estimation processing is performed on multiple layers. Layers with large environmental changes are avoided (the processing results of the self-location estimation processing are not used), enabling stable output of self-location information.
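The layer division described above can be sketched as a simple height filter on the 3D point cloud. This is a minimal illustration; the function and bound names are assumptions, not from the publication:

```python
import numpy as np

def split_into_layers(points, layer_bounds):
    """Split an (N, 3) point cloud into height layers.

    points: (N, 3) array of (x, y, z) points from a 3D LiDAR.
    layer_bounds: list of (z_min, z_max) tuples, one per layer.
    Names are illustrative, not from the publication.
    """
    layers = []
    for z_min, z_max in layer_bounds:
        # Keep only the points whose height falls inside this layer.
        mask = (points[:, 2] >= z_min) & (points[:, 2] < z_max)
        layers.append(points[mask])
    return layers

# Example: a floor-level layer (little environmental change) and a
# layer up among the loaded cargo (large environmental change).
cloud = np.array([[1.0, 0.0, 0.2],
                  [2.0, 1.0, 0.3],
                  [1.5, 0.5, 2.6]])
low, high = split_into_layers(cloud, [(0.0, 0.5), (2.5, 3.0)])
```

Each resulting layer can then be fed to its own 2D SLAM instance, so that a layer disturbed by restacked cargo can simply be ignored.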
- self-location estimation processing is performed in two layers: 2D LiDAR SLAM using 2D point cloud information of the subject at height L11, and 2D LiDAR SLAM using 2D point cloud information of the subject at height L12.
- self-location estimation processing may be performed in two layers: 2D LiDAR SLAM using 2D point cloud information of the subject at height L21, and 3D LiDAR SLAM using 3D point cloud information of the subject in height range L22.
- self-location estimation processing may be performed in three or more layers, including either 2D LiDAR SLAM or 3D LiDAR SLAM.
- <Configuration of moving body> FIG. 5 is a block diagram showing an example configuration of the mobile object 1.
- the mobile object 1 is configured as a mobile robot capable of autonomously moving in various environments, such as a transport robot that transports luggage in a warehouse or an inspection robot that patrols a construction site and collects images.
- the mobile object 1 may be configured as a cleaning robot that comes into close contact with glass surfaces or walls of a building or house and cleans the surface on which it moves.
- the mobile object 1 is composed of a sensor 10, a drive unit 20, a communication unit 30, a storage unit 40, and an information processing unit 100.
- Sensor 10 is configured to include a distance sensor, an RGB camera, a collision prevention sensor, a speed sensor, an acceleration sensor, etc.
- the sensor data acquired by sensor 10 is supplied to the information processing unit 100.
- the drive unit 20 is configured as a motor that rotates the wheels of the mobile object 1.
- the drive unit 20 operates under the control of the information processing unit 100, allowing the mobile object 1 to move.
- the communication unit 30 is a wireless communication module that performs wireless communication with external devices.
- the communication unit 30 supplies information received via wireless communication to the information processing unit 100, and transmits information supplied from the information processing unit 100 via wireless communication.
- the storage unit 40 is composed of volatile memory such as DRAM (Dynamic Random Access Memory).
- the storage unit 40 stores various data obtained by calculations performed by the information processing unit 100.
- the information processing unit 100 is composed of processors such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit).
- the information processing unit 100 controls each part of the mobile object 1.
- the configuration of the information processing unit 100 in an embodiment of the present disclosure will be described below.
- FIG. 6 is a block diagram illustrating an example of the functional configuration of the information processing unit 100 according to the first embodiment of the present disclosure.
- the information processing unit 100 in FIG. 6 can perform the following processing by executing a program stored in memory (not shown).
- the information processing unit 100 generates first distance information and second distance information from sensor data output by the ranging sensor 111 included in the sensor 10.
- the ranging sensor 111 is composed of a 3D sensor that can acquire three-dimensional distance information in the environment, such as a 3D LiDAR or 3D depth sensor.
- the information processing unit 100 can extract first distance information of a first layer and second distance information of a second layer from the sensor data (three-dimensional distance information) output by a single ranging sensor 111.
- the information processing unit 100 performs a first self-location estimation process based on the first distance information, and a second self-location estimation process based on the second distance information, as self-location estimation processes for the moving body 1. That is, the information processing unit 100 performs self-location estimation processes in multiple layers.
- self-location estimation processes in multiple layers will also be referred to as multi-layer self-location estimation processes. Note that, in the following, the multiple layers will be described as layers that differ in the height direction relative to the horizontal plane, but they may also be layers that differ in the horizontal direction (left and right direction) or layers that differ in the diagonal direction.
- Self-location estimation unit 131-1 uses a pre-prepared first layer environment map MP1 to perform self-location estimation processing based on distance information for that layer.
- Self-location estimation unit 131-2 uses a pre-prepared second layer environment map MP2 to perform self-location estimation processing based on distance information for that layer.
- Self-location estimation unit 131-3 uses a pre-prepared third layer environment map MP3 to perform self-location estimation processing based on distance information for that layer.
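Each self-location estimation unit matches its layer's point cloud against that layer's environment map and scores the result. The publication does not fix a scoring formula; one common reliability measure, sketched here with illustrative names, is the fraction of scan points that land on occupied cells of the layer's map:

```python
def matching_score(points, occupied_cells, resolution=0.1):
    """Fraction of 2D scan points that fall on occupied grid cells of
    a layer's environment map (hypothetical score; the publication
    leaves the reliability measure unspecified)."""
    hits = 0
    for x, y in points:
        # Quantize the point onto the occupancy grid.
        cell = (round(x / resolution), round(y / resolution))
        if cell in occupied_cells:
            hits += 1
    return hits / max(len(points), 1)
```

A low score signals a large environmental change in that layer, which is the kind of information the score determination units 132 can use to decide whether a layer's processing result should be discarded.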
- the self-location estimation process of each layer is executed as a loosely coupled process, but the self-location estimation process of each layer can also be executed as a tightly coupled process.
- the positions of the corresponding submaps SM21 to SM24 are corrected based on the positional relationships between the submaps SM11 to SM14 that make up the environment map MP1.
- consistency between the environment maps for each layer can be improved.
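The submap correction can be illustrated with a translation-only sketch: the offsets between the reference layer's submap origins (SM11 to SM14 in environment map MP1) are propagated to the other layer's submaps (SM21 to SM24). The exact correction is not specified in the publication, so this is an assumption:

```python
import numpy as np

def correct_submap_positions(ref_origins, target_origins):
    """Correct one layer's submap origins so that their relative
    offsets match the positional relationships of the reference
    layer's submaps (translation-only sketch; names illustrative)."""
    ref_deltas = np.diff(np.asarray(ref_origins, dtype=float), axis=0)
    corrected = [np.asarray(target_origins[0], dtype=float)]
    for delta in ref_deltas:
        # Propagate the reference layer's submap-to-submap offsets.
        corrected.append(corrected[-1] + delta)
    return np.array(corrected)

ref = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.5]]       # SM11..SM13 origins
drifted = [[0.0, 0.0], [1.3, -0.1], [2.4, 0.2]]  # SM21..SM23, drifted
aligned = correct_submap_positions(ref, drifted)
```

A full implementation would also correct rotations, but even this simplified version conveys how consistency between the per-layer maps is improved.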
- the multi-layer self-location estimation process executed by the information processing unit 100 in FIG. 6 will be described with reference to the flowchart in FIG. 9.
- the process in FIG. 9 is initiated, for example, when the mobile object 1, which is a mobile robot, starts up in the environment.
- step S20 the self-location integration unit 140 extracts, from the score determination units 132 of the corresponding layers, the processing results whose scores are greater than the threshold. Then, in step S16, the processing results are integrated based on the scores, and in step S17, the self-location information is output.
- step S22 the map update unit 133 for the other layer updates or recreates the environment map stored in the environment map storage unit 134 for that layer.
- step S17 the processing result with the highest score among all scores smaller than the threshold is output as the self-location information. This allows the self-location estimation process to continue in the layer with the least environmental change, even if there is an environmental change in all layers, and makes it possible to avoid unexpected stops of the moving body 1.
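Steps S16, S17, and S20 above can be sketched as follows. The score-weighted averaging (including the naive averaging of the heading angle) is an assumption; the publication states only that the results are integrated based on their scores:

```python
def integrate_self_location(results, threshold):
    """Integrate per-layer self-location estimates by score.

    results: list of (score, (x, y, theta)) pairs, one per layer.
    Estimates scoring above `threshold` are combined with score
    weighting; if every layer scores below the threshold
    (environmental change in all layers), the single best-scoring
    estimate is output so the moving body does not stop unexpectedly.
    Sketch only: the weighting and naive theta averaging (no
    wrap-around handling) are assumptions.
    """
    good = [(score, pose) for score, pose in results if score > threshold]
    if not good:
        # Fall back to the layer with the least environmental change.
        return max(results, key=lambda r: r[0])[1]
    total = sum(score for score, _ in good)
    return tuple(
        sum(score * pose[i] for score, pose in good) / total
        for i in range(3)
    )
```

The fallback branch corresponds to step S17's behavior when all scores are below the threshold.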
- the information processing unit 100 of FIG. 10 differs from the information processing unit 100 of FIG. 6 in that it newly includes a layer search self-position estimation unit 210 and a score comparison unit 220.
- the layer search self-position estimation unit 210 uses the environment map stored in the environment map storage unit (not shown) to perform self-position estimation processing based on the third distance information (point cloud information of the search layer), and calculates a score for the processing result. The calculated score is supplied to the score comparison unit 220.
- the score comparison unit 220 obtains the scores resulting from the self-location estimation process for each layer in the multi-layer self-location estimation unit 130, and compares them with the score resulting from the self-location estimation process for the search layer from the layer search self-position estimation unit 210.
- the score comparison results are supplied to the sensor data division unit 120.
- the sensor data division unit 120 changes the layer into which the sensor data output by the ranging sensor 111 is divided, based on the score comparison result from the score comparison unit 220. In other words, the sensor data division unit 120 changes the layer into which the sensor data output by the ranging sensor 111 is divided, based on the score of the processing result of the self-position estimation process based on the third distance information (point cloud information of the search layer).
- the self-position estimation process executed by the layer search self-position estimation unit 210 can be said to be a layer search self-position estimation process for searching for a layer that has a specific environmental change (or a layer that has no environmental change).
- the layer search process executed by the information processing unit 100 in FIG. 10 will be described with reference to the flowchart in FIG. 11.
- the process in FIG. 11 is executed in parallel with the multi-layer self-location estimation process described with reference to the flowchart in FIG. 9.
- step S21 the layer search self-position estimation unit 210 reads the environment map from the environment map storage unit (not shown), performs self-position estimation processing for layer search based on the distance information (point cloud information) of the search layer, and calculates a score for the processing result.
- step S22 the score comparison unit 220 determines whether the score (reliability) of the processing result of the layer-search self-location estimation process is higher than the scores of the processing results of the self-location estimation processes for each layer in the multi-layer self-location estimation unit 130. If it is not higher, the process returns to step S21 and the layer-search self-location estimation process is repeated.
- If the score of the processing result of the layer-search self-location estimation process is higher than the score of the processing result of the self-location estimation process for the first layer, the first layer is replaced by the search layer as a layer into which the sensor data is divided.
- If it is higher than the score of the processing result of the self-location estimation process for the second layer, the second layer is replaced by the search layer as a layer into which the sensor data is divided.
- If it is higher than the scores of the processing results of the self-location estimation processes for all layers, the layer division of the sensor data is changed so that at least the search layer is included.
- the above process allows the multi-layer self-localization process to autonomously search for layers with large environmental changes and switch to layers with less environmental changes, thereby further improving robustness against environmental changes.
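The switching decision can be sketched as replacing the worst-scoring current layer with the search layer when the search layer scores higher. The replacement policy here is a simplification of the behavior described above, and the names are illustrative:

```python
def update_layer_division(layers, scores, search_layer, search_score):
    """Replace the worst-scoring current layer with the search layer
    when the search layer's estimation score is higher (a simplified
    sketch of the layer-search switch; names are illustrative).

    layers: list of (z_min, z_max) height ranges the sensor data is
    divided into; scores: the matching score of each layer's estimate.
    """
    worst = min(range(len(scores)), key=lambda i: scores[i])
    if search_score > scores[worst]:
        layers = list(layers)
        layers[worst] = search_layer  # switch to the layer with less change
    return layers
```

Run continuously alongside the multi-layer estimation, this lets the system drift away from layers disturbed by environmental change without operator intervention.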
- the information processing unit 100 in FIG. 12 differs from the information processing unit 100 in FIG. 6 in that it has sensor data acquisition units 310-1 and 310-2 instead of the sensor data division unit 120.
- the information processing unit 100 in FIG. 12 generates first distance information and second distance information from the sensor data output by distance measurement sensors 111-1 and 111-2 mounted at different heights on the moving body 1.
- Each of the distance measurement sensors 111-1 and 111-2 may be configured as a 3D sensor capable of acquiring three-dimensional distance information in the environment, or may be configured as a 2D sensor capable of acquiring two-dimensional distance information in the environment, such as a 2D LiDAR or 2D depth sensor.
- the sensor data acquisition unit 310-1 acquires sensor data output by the ranging sensor 111-1 and generates first distance information. If the ranging sensor 111-1 is configured as a 3D sensor, the sensor data acquisition unit 310-1 extracts first distance information of a first layer, which is set based on, for example, environmental information, from the sensor data (three-dimensional distance information) output by the ranging sensor 111-1, and supplies it to the multi-layer self-position estimation unit 130. If the ranging sensor 111-1 is configured as a 2D sensor, the sensor data acquisition unit 310-1 acquires sensor data (two-dimensional distance information) output by the ranging sensor 111-1 and supplies it to the multi-layer self-position estimation unit 130.
- the sensor data acquisition unit 310-2 acquires sensor data output by the ranging sensor 111-2 and generates second distance information. If the ranging sensor 111-2 is configured as a 3D sensor, the sensor data acquisition unit 310-2 extracts second distance information of a second layer, which is set based on, for example, environmental information, from the sensor data (three-dimensional distance information) output by the ranging sensor 111-2, and supplies this to the multi-layer self-position estimation unit 130. If the ranging sensor 111-2 is configured as a 2D sensor, the sensor data acquisition unit 310-2 acquires sensor data (two-dimensional distance information) output by the ranging sensor 111-2 and supplies this to the multi-layer self-position estimation unit 130.
- When one step includes multiple processes, the multiple processes included in that one step can be executed by one device, or they can be shared and executed by multiple devices.
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The present invention relates to an information processing method, an information processing device, and a program that facilitate improving the robustness of self-position estimation processing. An information processing unit of the present invention generates first distance information and second distance information from sensor data output by a ranging sensor mounted on a moving body; executes, as self-position estimation processing for the moving body, first self-position estimation processing based on the first distance information and second self-position estimation processing based on the second distance information; and outputs self-position information representing the self-position of the moving body based on the processing results of the first and second self-position estimation processing. The present disclosure can be applied to a moving body that estimates its own position in an environment where three-dimensional structures change significantly.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024-054502 | 2024-03-28 | ||
| JP2024054502 | 2024-03-28 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025204792A1 (fr) | 2025-10-02 |
Family
ID=97219585
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2025/008708 Pending WO2025204792A1 (fr) | 2024-03-28 | 2025-03-10 | Procédé de traitement d'informations, dispositif de traitement d'informations et programme |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025204792A1 (fr) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014056506A (ja) * | 2012-09-13 | 2014-03-27 | Toyota Central R&D Labs Inc | 障害物検出装置及びそれを備えた移動体 |
| JP2017102705A (ja) * | 2015-12-02 | 2017-06-08 | 株式会社リコー | 自律移動装置及び自律移動装置システム |
| JP2021043082A (ja) * | 2019-09-11 | 2021-03-18 | 株式会社東芝 | 位置推定装置、移動体制御システム、位置推定方法およびプログラム |
- 2025-03-10: WO PCT/JP2025/008708 patent application WO2025204792A1 (fr), status: Pending
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25775198; Country of ref document: EP; Kind code of ref document: A1 |