WO2019082926A1 - Sensing System and Vehicle - Google Patents
Sensing System and Vehicle
- Publication number
- WO2019082926A1 (PCT/JP2018/039485)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lidar
- vehicle
- unit
- control unit
- lidar unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0015—Planning or execution of driving tasks specially adapted for safety
- B60W60/0017—Planning or execution of driving tasks specially adapted for safety of other traffic participants
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/894—Three-dimensional [3D] imaging with simultaneous measurement of time-of-flight at a two-dimensional [2D] array of receiver pixels, e.g. time-of-flight cameras or flash lidar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4817—Constructional features, e.g. arrangements of optical elements relating to scanning
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/484—Transmitters
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/865—Combination of radar systems with lidar systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9316—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles combined with communication equipment with other vehicles or with base stations
Definitions
- the present disclosure relates to a sensing system.
- the present disclosure relates to a sensing system provided to a vehicle capable of traveling in an automatic driving mode.
- the present disclosure relates to a vehicle capable of traveling in an automatic driving mode and provided with a sensing system.
- in the automatic driving mode, the vehicle system automatically controls the traveling of the vehicle. Specifically, the vehicle system automatically performs at least one of steering control (control of the traveling direction of the vehicle), brake control, and accelerator control (braking, acceleration, and deceleration of the vehicle) based on information indicating the surrounding environment of the vehicle (surrounding environment information) obtained from sensors such as a camera or a radar (for example, a laser radar or a millimeter wave radar).
- the driver controls the travel of the vehicle, as is the case with many conventional vehicles.
- traveling of the vehicle is controlled in accordance with a driver's operation (steering operation, braking operation, accelerator operation), and the vehicle system does not automatically perform steering control, brake control, and accelerator control.
- the driving mode of a vehicle is not a concept that exists only in some vehicles, but a concept that exists in all vehicles, including conventional vehicles that do not have an automatic driving function, and is classified according to, for example, the vehicle control method.
- autonomous driving vehicles: vehicles traveling in the automatic driving mode on public roads
- manual driving vehicles: vehicles traveling in the manual driving mode
- Patent Document 1 discloses an automatic follow-up traveling system in which a following vehicle automatically follows a preceding vehicle.
- each of the preceding vehicle and the following vehicle is equipped with a lighting system, and character information for preventing other vehicles from cutting in between the preceding vehicle and the following vehicle is displayed on the lighting system of the preceding vehicle.
- character information indicating that the vehicle is following automatically is displayed on the illumination system of the following vehicle.
- when a LiDAR unit is used, the electronic control unit (ECU) can acquire surrounding environment information of the vehicle (for example, information related to objects present around the vehicle) using the point cloud data acquired from the LiDAR unit.
- to improve the accuracy of the surrounding environment information, it is conceivable to increase the scanning resolution of the LiDAR unit; however, as the scanning resolution of the LiDAR unit increases, the calculation load of the electronic control unit that processes the point cloud data increases dramatically.
- while the detection area (detection angle range) of a LiDAR unit in the horizontal direction of the vehicle is sufficiently wide, its detection area in the vertical direction of the vehicle is considerably narrow. For this reason, there is room to further improve the recognition accuracy of the surrounding environment of the vehicle by widening the detection area of the LiDAR unit in the vertical direction of the vehicle.
- a first object of the present disclosure is to provide a sensing system and a vehicle capable of improving the accuracy of surrounding environment information while suppressing the calculation load of the electronic control unit.
- a second object of the present disclosure is to provide a sensing system and a vehicle capable of improving the recognition accuracy of the environment around the vehicle.
- a sensing system according to one aspect of the present disclosure is provided to a vehicle capable of traveling in an automatic driving mode, and includes a LiDAR unit configured to acquire point cloud data indicating the environment around the vehicle, and a LiDAR control unit configured to specify information associated with an object present around the vehicle based on the point cloud data acquired from the LiDAR unit.
- the LiDAR control unit is configured to control the LiDAR unit such that the scanning resolution of the LiDAR unit in a first angle area in which the target exists in the detection area of the LiDAR unit is increased.
- the scanning resolution of the LiDAR unit in the first angle region where the object (for example, a pedestrian or the like) is present in the detection region of the LiDAR unit is increased.
- the calculation load of the LiDAR control unit is suppressed by not increasing the scanning resolution in the detection area other than the first angle area.
- the accuracy of the information related to the object can be improved. Therefore, it is possible to provide a sensing system capable of improving the accuracy of the surrounding environment information while suppressing the calculation load of the electronic control unit.
- when the LiDAR control unit cannot specify the attribute of the object based on the point cloud data acquired from the LiDAR unit, the LiDAR control unit may control the LiDAR unit so that the scanning resolution of the LiDAR unit in the first angle area is increased.
- the scanning resolution of the LiDAR unit in the first angle region in which the object exists is increased.
- the attributes of the object can be identified with certainty.
- the LiDAR control unit may be configured to control the LiDAR unit so that the scanning resolution of the LiDAR unit in the first angle region is gradually increased until the attribute of the object can be specified.
- since the scanning resolution of the LiDAR unit in the first angle region is gradually increased until the attribute of the object can be identified, the attribute of the object can be identified with certainty.
- the LiDAR control unit may be configured to update the position of the object based on point cloud data newly acquired from the LiDAR unit, and to update the first angle region based on the updated position of the object.
- in this configuration, the first angle area is updated based on the updated position of the object.
- the scanning resolution of the LiDAR unit can be increased in the first angle area where the moving object is present.
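- As an illustration of this angle-region update, the following Python sketch recomputes the first angle region from the latest tracked object position each frame. It is a minimal sketch: the function name, the 0.6 m pedestrian width, the margin ratio, and the sample positions are all hypothetical, not taken from the disclosure.

```python
import math

def angle_region(obj_x: float, obj_y: float, extent: float, margin_ratio: float = 0.5):
    """Return (min_angle, max_angle) in degrees of the first angle region
    covering an object at (obj_x, obj_y) in the vehicle frame, where
    `extent` is the object's approximate width in metres."""
    center = math.degrees(math.atan2(obj_y, obj_x))          # bearing of the object
    dist = math.hypot(obj_x, obj_y)
    half_width = math.degrees(math.atan2(extent / 2, dist))  # half of the angular extent
    margin = margin_ratio * 2 * half_width                   # extra margin around the object
    return center - half_width - margin, center + half_width + margin

# Each new frame, the tracked position is updated and the region recomputed:
position = (10.0, 2.0)                       # hypothetical pedestrian position (m)
for new_position in [(9.5, 2.2), (9.0, 2.4)]:  # updated detections per frame
    position = new_position
    lo, hi = angle_region(*position, extent=0.6)
    print(f"first angle region: {lo:.2f} deg .. {hi:.2f} deg")
```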
- a vehicle capable of traveling in an automatic driving mode and including the above sensing system may be provided.
- according to another aspect of the present disclosure, a sensing system is provided to a vehicle capable of traveling in an automatic driving mode, and includes: a LiDAR unit configured to acquire point cloud data indicating the surrounding environment of the vehicle; a LiDAR control unit configured to acquire surrounding environment information indicating the surrounding environment of the vehicle based on the point cloud data acquired from the LiDAR unit; an actuator configured to change the tilt angle of the LiDAR unit with respect to the vertical direction of the vehicle; and an actuator control unit configured to control driving of the actuator.
- the detection region (detection angle range) of the LiDAR unit in the vertical direction can be expanded.
- a sensing system capable of improving the recognition accuracy of the surrounding environment of the vehicle.
- if the tilt angle of the LiDAR unit is a first tilt angle, the LiDAR unit may acquire a first frame of the point cloud data; if the tilt angle of the LiDAR unit is a second tilt angle different from the first tilt angle, the LiDAR unit may acquire a second frame of the point cloud data.
- the LiDAR control unit may acquire the surrounding environment information based on the first frame and the second frame.
- when the tilt angle of the LiDAR unit is the first tilt angle, the first frame of the point cloud data is acquired, and when the tilt angle of the LiDAR unit is the second tilt angle, the second frame of the point cloud data is acquired. Thereafter, the surrounding environment information is acquired based on the acquired first and second frames.
- since the vertical detection area of the LiDAR unit when the first frame is acquired differs from the vertical detection area of the LiDAR unit when the second frame is acquired, combining the first frame and the second frame expands the detection area of the LiDAR unit in the vertical direction.
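- A minimal sketch of this frame combination, assuming each point-cloud sample carries its horizontal angle, vertical angle, and range, and that the LiDAR tilt simply offsets the vertical angle. The function name and the plus/minus 3 degree tilt values are illustrative only.

```python
def merge_tilted_frames(frame1, tilt1_deg, frame2, tilt2_deg):
    """Merge two point-cloud frames taken at different LiDAR tilt angles.

    Each frame is a list of (horizontal_deg, vertical_deg, distance_m)
    samples in the LiDAR frame; adding the tilt to the vertical angle
    expresses every sample in the common vehicle frame."""
    merged = []
    for tilt, frame in ((tilt1_deg, frame1), (tilt2_deg, frame2)):
        for h, v, d in frame:
            merged.append((h, v + tilt, d))
    return merged

# Two hypothetical three-sample frames, tilted -3 deg and +3 deg:
f1 = [(0.0, -1.0, 12.0), (0.0, 0.0, 15.0), (0.0, 1.0, 30.0)]
f2 = [(0.0, -1.0, 11.5), (0.0, 0.0, 14.0), (0.0, 1.0, 28.0)]
cloud = merge_tilted_frames(f1, -3.0, f2, +3.0)
v_angles = [v for _, v, _ in cloud]
print(f"vertical coverage: {min(v_angles)} deg to {max(v_angles)} deg")
```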
- the actuator control unit need not drive the actuator during a first scanning period of the LiDAR unit in which the first frame is acquired or during a second scanning period of the LiDAR unit in which the second frame is acquired.
- in this case, the scanning lines of the LiDAR unit are not tilted along with the tilt of the LiDAR unit in the vertical direction, so the calculation load of the LiDAR control unit that processes the point cloud data can be reduced.
- alternatively, the actuator control unit may drive the actuator during the first scanning period of the LiDAR unit in which the first frame is acquired and during the second scanning period of the LiDAR unit in which the second frame is acquired.
- in this case, since the actuator is driven during the first and second scanning periods, the scanning lines of the LiDAR unit are tilted, but a large decrease in the update rate of the surrounding environment information based on the point cloud data can be avoided. In this way, the detection area of the LiDAR unit in the vertical direction can be widened while avoiding a large decrease in the update rate of the surrounding environment information based on the point cloud data.
- the actuator control unit may be configured to determine whether to drive the actuator according to the current position of the vehicle.
- whether to drive the actuator is determined according to the current position of the vehicle. In other words, it is determined whether to tilt the LiDAR unit in the vertical direction according to the current position of the vehicle. In this way, it is possible to acquire the optimum surrounding environment information according to the current position of the vehicle.
- the actuator control unit may be configured to determine the maximum value of the tilt angle of the LiDAR unit according to the current speed of the vehicle.
- the maximum value of the tilt angle of the LiDAR unit is determined according to the current speed of the vehicle. In this way, it is possible to acquire the optimum surrounding environment information according to the current speed of the vehicle.
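- A sketch of one possible speed-dependent policy is shown below; the disclosure only states that the maximum tilt angle is determined according to the current speed, so the function shape and every threshold here are assumptions.

```python
def max_tilt_angle_deg(speed_kmh: float) -> float:
    """Hypothetical speed-to-maximum-tilt mapping: tilt widely at low speed,
    where near-field vertical coverage matters, and keep the unit nearly
    level at high speed, where long range ahead matters. All thresholds
    are illustrative, not taken from the patent."""
    if speed_kmh < 30.0:
        return 10.0
    if speed_kmh < 80.0:
        return 5.0
    return 2.0

for v in (20.0, 60.0, 100.0):
    print(v, "km/h ->", max_tilt_angle_deg(v), "deg")
```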
- the actuator control unit may be configured to drive the actuator in response to detection of a pedestrian present around the vehicle.
- the actuator is driven according to the detection of the pedestrian present in the periphery of the vehicle.
- the LiDAR unit tilts in the vertical direction.
- in this way, the accuracy of the information related to the object (for example, the attribute information of the object) can be improved.
- the actuator control unit may gradually change the tilt angle of the LiDAR unit at a first angle interval within a predetermined angle region with respect to the horizontal direction of the vehicle, and
- may gradually change the tilt angle of the LiDAR unit at a second angle interval larger than the first angle interval outside the predetermined angle region.
- in this configuration, the tilt angle of the LiDAR unit is gradually changed at the first angle interval within the predetermined angle region, while outside the predetermined angle region it is gradually changed at the second angle interval, which is larger than the first angle interval.
- the scanning resolution of the LiDAR unit can be enhanced within a predetermined angle region, and the detection region of the LiDAR unit in the vertical direction can be expanded.
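- The two-interval tilt schedule can be pictured with the following sketch, which steps finely inside a predetermined angle region around horizontal and coarsely outside it. All numeric values are illustrative assumptions.

```python
def tilt_schedule(limit_deg: float, fine_region_deg: float,
                  fine_step_deg: float, coarse_step_deg: float):
    """Generate the sequence of tilt angles: a fine first angle interval
    inside +/- fine_region_deg around horizontal, and a coarser second
    interval outside it, up to +/- limit_deg."""
    angles, a = [], -limit_deg
    while a <= limit_deg:
        angles.append(round(a, 3))
        # fine step while the current angle lies inside the predetermined region
        a += fine_step_deg if -fine_region_deg <= a < fine_region_deg else coarse_step_deg
    return angles

# Illustrative values: 1 deg steps within +/-3 deg, 3 deg steps outside, +/-9 deg total.
print(tilt_schedule(9.0, 3.0, 1.0, 3.0))
```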
- the LiDAR unit may include a first LiDAR unit and a second LiDAR unit. In a top view, the first LiDAR unit and the second LiDAR unit may be arranged so as to overlap with each other.
- the actuator may include a first actuator configured to change the tilt angle of the first LiDAR unit with respect to the vertical direction, and a second actuator configured to change the tilt angle of the second LiDAR unit with respect to the vertical direction.
- a vehicle capable of traveling in an automatic driving mode, comprising the above-mentioned sensing system.
- FIG. 1 is a top view of a vehicle provided with a vehicle system according to a first embodiment of the present invention.
- FIG. 2 is a block diagram showing the vehicle system according to the first embodiment.
- FIG. 3 is a diagram showing the functional blocks of the control unit of the left front illumination system.
- FIG. 4 is a diagram for explaining the detection area of the camera, the detection area of the LiDAR unit, and the detection area of the millimeter wave radar in the left front illumination system.
- FIG. 5 is a flowchart for explaining the control method of the LiDAR unit according to the first embodiment.
- FIG. 6 is a diagram showing a state in which a pedestrian is present in the detection area of the LiDAR unit.
- Further drawings include a schematic view of the LiDAR unit and the actuator viewed from the right side, a schematic view of the LiDAR unit viewed from the front, and a diagram relating to the detection area of the LiDAR unit.
- “left and right direction”, “front and back direction”, and “upper and lower direction” will be referred to as appropriate. These directions are relative directions set for the vehicle 1 shown in FIG.
- the “front-rear direction” is a direction including the “front direction” and the “rear direction”.
- the “left-right direction” is a direction including the “left direction” and the “right direction”.
- the “vertical direction” is a direction including “upper direction” and “lower direction”.
- the "horizontal direction" is also referred to as appropriate; the "horizontal direction" is a direction perpendicular to the "vertical direction" and includes the "left-right direction" and the "front-rear direction".
- FIG. 1 is a schematic view showing a top view of a vehicle 1 provided with a vehicle system 2.
- the vehicle 1 is a vehicle (automobile) that can travel in an automatic driving mode, and includes a vehicle system 2.
- the vehicle system 2 includes a vehicle control unit 3, a left front illumination system 4a (hereinafter simply referred to as "illumination system 4a"), a right front illumination system 4b (hereinafter simply referred to as "illumination system 4b"), a left rear illumination system 4c (hereinafter simply referred to as "illumination system 4c"), and a right rear illumination system 4d (hereinafter simply referred to as "illumination system 4d").
- the illumination system 4a is provided on the left front side of the vehicle 1.
- the lighting system 4a includes a housing 24a installed on the left front side of the vehicle 1 and a light transmitting cover 22a attached to the housing 24a.
- the illumination system 4b is provided on the right front side of the vehicle 1.
- the lighting system 4b includes a housing 24b installed on the right front side of the vehicle 1, and a light transmitting cover 22b attached to the housing 24b.
- the illumination system 4c is provided on the left rear side of the vehicle 1.
- the lighting system 4c includes a housing 24c installed on the left rear side of the vehicle 1 and a light transmitting cover 22c attached to the housing 24c.
- the illumination system 4d is provided on the right rear side of the vehicle 1.
- the illumination system 4d includes a housing 24d installed on the right rear side of the vehicle 1 and a light transmitting cover 22d attached to the housing 24d.
- FIG. 2 is a block diagram showing a vehicle system 2 according to the present embodiment.
- the vehicle system 2 includes a vehicle control unit 3, illumination systems 4a to 4d, a sensor 5, an HMI (Human Machine Interface) 8, a GPS (Global Positioning System) 9, a wireless communication unit 10, and a storage device 11.
- the vehicle system 2 includes a steering actuator 12, a steering device 13, a brake actuator 14, a brake device 15, an accelerator actuator 16, and an accelerator device 17.
- the vehicle system 2 also includes a battery (not shown) configured to supply power.
- the vehicle control unit 3 is configured to control the traveling of the vehicle 1.
- the vehicle control unit 3 is configured of, for example, at least one electronic control unit (ECU: Electronic Control Unit).
- the electronic control unit may include at least one microcontroller including one or more processors and one or more memories, and other electronic circuitry including active and passive elements such as transistors.
- the processor is, for example, a central processing unit (CPU), a micro processing unit (MPU), a graphics processing unit (GPU), and / or a tensor processing unit (TPU).
- the CPU may be configured by a plurality of CPU cores.
- the GPU may be configured by a plurality of GPU cores.
- the memory includes a ROM (Read Only Memory) and a RAM (Random Access Memory).
- a vehicle control program may be stored in the ROM.
- the vehicle control program may include an artificial intelligence (AI) program for autonomous driving.
- the AI program is a program constructed by supervised or unsupervised machine learning using a neural network such as deep learning.
- the RAM may temporarily store a vehicle control program, vehicle control data, and / or surrounding environment information indicating a surrounding environment of the vehicle.
- the processor may be configured to expand a program specified from the vehicle control program stored in the ROM on the RAM, and execute various processes in cooperation with the RAM.
- the electronic control unit may be configured by at least one integrated circuit such as an application specific integrated circuit (ASIC) or a field-programmable gate array (FPGA). Furthermore, the electronic control unit may be configured by a combination of at least one microcontroller and at least one integrated circuit (such as an FPGA).
- the illumination system 4a (an example of a sensing system) further includes a control unit 40a, an illumination unit 42a, a camera 43a, a LiDAR (Light Detection and Ranging) unit 44a (an example of a laser radar), and a millimeter wave radar 45a.
- the control unit 40a, the illumination unit 42a, the camera 43a, the LiDAR unit 44a, and the millimeter wave radar 45a are disposed in a space Sa formed by the housing 24a and the light transmitting cover 22a, as shown in FIG. 1.
- the control unit 40a may be disposed at a predetermined place of the vehicle 1 other than the space Sa.
- the control unit 40a may be configured integrally with the vehicle control unit 3.
- the control unit 40a is configured by, for example, at least one electronic control unit (ECU).
- the electronic control unit may include at least one microcontroller including one or more processors and one or more memories, and other electronic circuits (eg, transistors, etc.).
- the processor is, for example, a CPU, an MPU, a GPU and / or a TPU.
- the CPU may be configured by a plurality of CPU cores.
- the GPU may be configured by a plurality of GPU cores.
- the memory includes a ROM and a RAM.
- the ROM may store a surrounding environment specifying program for specifying the surrounding environment of the vehicle 1.
- the peripheral environment identification program is a program constructed by supervised or unsupervised machine learning using a neural network such as deep learning.
- the RAM may temporarily store the surrounding environment identification program, image data acquired by the camera 43a, three-dimensional mapping data (point cloud data) acquired by the LiDAR unit 44a, and/or detection data acquired by the millimeter wave radar 45a.
- the processor may be configured to expand a program specified from the peripheral environment specifying program stored in the ROM on the RAM and execute various processing in cooperation with the RAM.
- the electronic control unit (ECU) may be configured by at least one integrated circuit such as an ASIC or an FPGA.
- the electronic control unit may be configured by a combination of at least one microcontroller and at least one integrated circuit (such as an FPGA).
- the illumination unit 42a is configured to form a light distribution pattern by emitting light toward the outside (forward) of the vehicle 1.
- the illumination unit 42a has a light source for emitting light and an optical system.
- the light source may be configured by, for example, a plurality of light emitting elements arranged in a matrix (for example, N rows × M columns, N > 1, M > 1).
- the light emitting element is, for example, a light emitting diode (LED), a laser diode (LD), or an organic EL element.
- the optical system includes at least one of a reflector configured to reflect light emitted from the light source toward the front of the lighting unit 42a, and a lens configured to refract light emitted directly from the light source or light reflected by the reflector.
- the lighting unit 42a is configured to form a light distribution pattern for the driver (for example, a low beam light distribution pattern or a high beam light distribution pattern) in front of the vehicle 1. In this way, the lighting unit 42a functions as a left headlamp unit.
- the lighting unit 42a may be configured to form a light distribution pattern for a camera in front of the vehicle 1.
- the control unit 40a may be configured to separately supply an electric signal (for example, a PWM (Pulse Width Modulation) signal) to each of the plurality of light emitting elements provided in the lighting unit 42a.
- the control unit 40a can individually select the light emitting elements to which the electric signal is supplied, and can adjust the duty ratio of the electric signal for each light emitting element. That is, the control unit 40a can select a light emitting element to be turned on or off among the plurality of light emitting elements arranged in a matrix, and can determine the luminance of the light emitting element that is turned on.
- the control unit 40a can change the shape and brightness of the light distribution pattern emitted forward from the lighting unit 42a.
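- Conceptually, this per-element control amounts to building a duty-ratio table over the N × M matrix, as in the hedged Python sketch below; the 4 × 6 matrix size, the function name, and the 0.8 duty value are invented for illustration.

```python
# Hypothetical 4x6 LED matrix: True = on. The duty ratio per element sets brightness.
N_ROWS, M_COLS = 4, 6

def pattern_signals(on_mask, duty):
    """Build the per-element PWM duty-ratio table the control unit would
    supply: 0.0 for elements that are off, duty[r][c] otherwise."""
    return [[duty[r][c] if on_mask[r][c] else 0.0 for c in range(M_COLS)]
            for r in range(N_ROWS)]

on_mask = [[True] * M_COLS for _ in range(N_ROWS)]
on_mask[0][0] = False                       # switch one element off
duty = [[0.8] * M_COLS for _ in range(N_ROWS)]
signals = pattern_signals(on_mask, duty)
print(signals[0])                           # first row: [0.0, 0.8, 0.8, ...]
```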
- the camera 43a is configured to detect the surrounding environment of the vehicle 1.
- the camera 43a is configured to transmit image data to the control unit 40a after acquiring image data indicating the environment around the vehicle 1.
- the control unit 40a specifies the surrounding environment information based on the transmitted image data.
- the surrounding environment information may include information on an object present outside the vehicle 1.
- the surrounding environment information may include information on the attribute of an object present outside the vehicle 1 and information on the distance and position of the object relative to the vehicle 1.
- the camera 43a includes an imaging element such as a CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) sensor.
- the camera 43a may be configured as a monocular camera or may be configured as a stereo camera.
- when the camera 43a is a stereo camera, the control unit 40a can specify, using parallax, the distance between the vehicle 1 and an object existing outside the vehicle 1 (for example, a pedestrian) based on two or more pieces of image data acquired by the stereo camera.
- although one camera 43a is provided in the illumination system 4a in the present embodiment, two or more cameras 43a may be provided in the illumination system 4a.
- the LiDAR unit 44a is configured to detect the surrounding environment of the vehicle 1.
- the LiDAR unit 44a is configured to transmit point cloud data (3D mapping data) indicating the peripheral environment of the vehicle 1 to the control unit 40a.
- the control unit 40a specifies the surrounding environment information based on the transmitted point cloud data.
- the surrounding environment information may include information related to an object present outside the vehicle 1.
- the surrounding environment information may include information on the attribute of an object present outside the vehicle 1, information on the distance and position of the object with respect to the vehicle 1, and information on the moving direction of the object.
- the LiDAR unit 44a acquires information on the time of flight (TOF) ΔT1 of the laser light (light pulse) at each emission angle (horizontal angle θ, vertical angle φ) of the laser light. Based on the information on the time of flight ΔT1, the LiDAR unit 44a can acquire information on the distance D between the LiDAR unit 44a (vehicle 1) and an object existing outside the vehicle 1 at each emission angle (horizontal angle θ, vertical angle φ).
- the time of flight ΔT1 can be calculated, for example, as follows:
- time of flight ΔT1 = (time t1 at which the laser light (light pulse) returned to the LiDAR unit) − (time at which the LiDAR unit emitted the laser light (light pulse))
- the LiDAR unit 44a can acquire point cloud data (3D mapping data) indicating the environment around the vehicle 1.
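- The distance computation implied by the ΔT1 formula is the standard round-trip time-of-flight relation D = c · ΔT1 / 2, sketched below; the function name and the 200 ns example are illustrative.

```python
C = 299_792_458.0  # speed of light in m/s

def distance_from_tof(t_emit_s: float, t_return_s: float) -> float:
    """D = c * dT1 / 2: the pulse travels to the object and back,
    so half the round-trip time gives the one-way distance."""
    dt1 = t_return_s - t_emit_s
    return C * dt1 / 2.0

# A pulse returning 200 ns after emission corresponds to roughly 30 m:
print(f"{distance_from_tof(0.0, 200e-9):.2f} m")  # ~29.98 m
```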
- the LiDAR unit 44a includes, for example, a laser light source configured to emit laser light, an optical deflector configured to scan the laser light in the horizontal direction and the vertical direction, an optical system such as a lens, and a light receiving unit configured to receive the laser light reflected by an object.
- the central wavelength of the laser light emitted from the laser light source is not particularly limited.
- the laser light may be invisible light whose center wavelength is around 900 nm.
- the light deflector may be, for example, a micro electro mechanical systems (MEMS) mirror or a polygon mirror.
- the light receiving unit is, for example, a photodiode.
- the LiDAR unit 44a may acquire point cloud data without scanning the laser light with the optical deflector.
- the LiDAR unit 44a may acquire point cloud data in a phased array method or a flash method.
- although one LiDAR unit 44a is provided in the illumination system 4a in the present embodiment, two or more LiDAR units 44a may be provided in the illumination system 4a.
- in this case, one LiDAR unit 44a may be configured to detect the surrounding environment in the front area of the vehicle 1, and the other LiDAR unit 44a may be configured to detect the surrounding environment in the side area of the vehicle 1.
- the LiDAR unit 44a may increase the scanning resolution (that is, reduce the angular pitch) in a predetermined angle area in which an object is present.
- in the present embodiment, the "horizontal direction" and the "vertical direction" of the LiDAR unit 44a are assumed to coincide with the "horizontal direction" and the "vertical direction" of the vehicle 1, but they do not have to coincide.
- the millimeter wave radar 45a is configured to detect the surrounding environment of the vehicle 1.
- the millimeter wave radar 45a is configured to transmit detection data to the control unit 40a after acquiring detection data indicating the surrounding environment of the vehicle 1.
- the control unit 40a specifies the surrounding environment information based on the transmitted detection data.
- the surrounding environment information may include information on an object present outside the vehicle 1.
- the surrounding environment information may include, for example, information on the attribute of an object present outside the vehicle 1, information on the position of the object relative to the vehicle 1, and information on the speed of the object relative to the vehicle 1.
- the millimeter wave radar 45a can acquire the distance D between the millimeter wave radar 45a (vehicle 1) and an object existing outside the vehicle 1 by a pulse modulation method, an FM-CW (Frequency Modulated Continuous Wave) method, or a two-frequency CW method.
- when the pulse modulation method is used, the millimeter wave radar 45a acquires information on the time of flight ΔT2 of the millimeter wave at each emission angle of the millimeter wave, and can then acquire, based on the information on the time of flight ΔT2, information on the distance D between the millimeter wave radar 45a (vehicle 1) and an object present outside the vehicle 1 at each emission angle.
- the time of flight ΔT2 can be calculated, for example, as follows:
- time of flight ΔT2 = (time t3 at which the millimeter wave returned to the millimeter wave radar) − (time at which the millimeter wave radar emitted the millimeter wave)
- further, the millimeter wave radar 45a can acquire information on the relative velocity V of an object existing outside the vehicle 1 with respect to the millimeter wave radar 45a (vehicle 1), based on the frequency f0 of the millimeter wave emitted from the millimeter wave radar 45a and the frequency f1 of the millimeter wave returned to the millimeter wave radar 45a.
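- For the frequency-based velocity measurement, a common reading is the Doppler relation V = c · (f1 − f0) / (2 · f0); the disclosure does not give the formula, so the sketch below is an assumption consistent with the f0/f1 description, and the 76.5 GHz example value is likewise invented.

```python
C = 299_792_458.0  # speed of light in m/s

def relative_velocity(f0_hz: float, f1_hz: float) -> float:
    """Relative velocity from the Doppler shift between the emitted
    frequency f0 and the returned frequency f1 (assumed relation):
    V = c * (f1 - f0) / (2 * f0). Positive V means the object approaches."""
    return C * (f1_hz - f0_hz) / (2.0 * f0_hz)

# A 76.5 GHz millimeter wave returning 5.1 kHz higher -> about 10 m/s closing speed:
print(f"{relative_velocity(76.5e9, 76.5e9 + 5100.0):.2f} m/s")
```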
- the illumination system 4a may include a short-range millimeter wave radar 45a, a medium-range millimeter wave radar 45a, and a long-range millimeter wave radar 45a.
- the illumination system 4b further includes a control unit 40b, an illumination unit 42b, a camera 43b, a LiDAR unit 44b, and a millimeter wave radar 45b.
- the control unit 40b, the illumination unit 42b, the camera 43b, the LiDAR unit 44b, and the millimeter wave radar 45b are disposed in a space Sb formed by the housing 24b and the light transmission cover 22b, as shown in FIG. .
- the control unit 40b may be disposed at a predetermined place of the vehicle 1 other than the space Sb.
- the control unit 40b may be configured integrally with the vehicle control unit 3.
- the control unit 40b may have the same function and configuration as the control unit 40a.
- the lighting unit 42b may have the same function and configuration as the lighting unit 42a.
- the lighting unit 42a functions as a left headlamp unit, while the lighting unit 42b functions as a right headlamp unit.
- the camera 43b may have the same function and configuration as the camera 43a.
- the LiDAR unit 44b may have the same function and configuration as the LiDAR unit 44a.
- the millimeter wave radar 45b may have the same function and configuration as the millimeter wave radar 45a.
- the illumination system 4c further includes a control unit 40c, an illumination unit 42c, a camera 43c, a LiDAR unit 44c, and a millimeter wave radar 45c.
- the control unit 40c, the illumination unit 42c, the camera 43c, the LiDAR unit 44c, and the millimeter wave radar 45c are disposed in a space Sc (light chamber) formed by the housing 24c and the light transmitting cover 22c, as shown in FIG. 1.
- the control unit 40c may be disposed at a predetermined place of the vehicle 1 other than the space Sc.
- the control unit 40c may be configured integrally with the vehicle control unit 3.
- the control unit 40c may have the same function and configuration as the control unit 40a.
- the illumination unit 42c is configured to form a light distribution pattern by emitting light toward the outside (rear) of the vehicle 1.
- the illumination unit 42c has a light source for emitting light and an optical system.
- the light source may be configured by, for example, a plurality of light emitting elements arranged in a matrix (for example, N rows × M columns, N > 1, M > 1).
- the light emitting element is, for example, an LED, an LD or an organic EL element.
- the optical system includes at least one of a reflector configured to reflect light emitted from the light source toward the front of the lighting unit 42c, and a lens configured to refract light emitted directly from the light source or light reflected by the reflector.
- when the driving mode of the vehicle 1 is the manual driving mode or the driving support mode, the lighting unit 42c may be turned off. On the other hand, when the driving mode of the vehicle 1 is the advanced driving support mode or the fully automatic driving mode, the lighting unit 42c may be configured to form a light distribution pattern for a camera behind the vehicle 1.
- the camera 43c may have the same function and configuration as the camera 43a.
- the LiDAR unit 44c may have the same function and configuration as the LiDAR unit 44a.
- the millimeter wave radar 45c may have the same function and configuration as the millimeter wave radar 45a.
- the illumination system 4d further includes a control unit 40d, an illumination unit 42d, a camera 43d, a LiDAR unit 44d, and a millimeter wave radar 45d.
- the control unit 40d, the illumination unit 42d, the camera 43d, the LiDAR unit 44d, and the millimeter wave radar 45d are disposed in a space Sd (light chamber) formed by the housing 24d and the light transmitting cover 22d, as shown in FIG. 1.
- the control unit 40d may be disposed at a predetermined place of the vehicle 1 other than the space Sd.
- the control unit 40d may be configured integrally with the vehicle control unit 3.
- the control unit 40d may have the same function and configuration as the control unit 40c.
- the lighting unit 42d may have the same function and configuration as the lighting unit 42c.
- the camera 43d may have the same function and configuration as the camera 43c.
- the LiDAR unit 44d may have the same function and configuration as the LiDAR unit 44c.
- the millimeter wave radar 45d may have the same function and configuration as the millimeter wave radar 45c.
- the sensor 5 may include an acceleration sensor, a speed sensor, a gyro sensor, and the like.
- the sensor 5 is configured to detect the traveling state of the vehicle 1 and to output traveling state information indicating the traveling state of the vehicle 1 to the vehicle control unit 3.
- the sensor 5 may further include a seating sensor that detects whether the driver is sitting in the driver's seat, a face direction sensor that detects the direction of the driver's face, an external weather sensor that detects the external weather condition, and a human sensor that detects whether a person is inside the vehicle.
- the sensor 5 may include an illuminance sensor configured to detect the brightness (illuminance etc.) of the surrounding environment of the vehicle 1.
- the illuminance sensor may determine the brightness of the surrounding environment according to, for example, the magnitude of the photocurrent output from the photodiode.
- the HMI (Human Machine Interface) 8 includes an input unit that receives an input operation from the driver, and an output unit that outputs traveling state information and the like to the driver.
- the input unit includes a steering wheel, an accelerator pedal, a brake pedal, an operation mode switching switch for switching the operation mode of the vehicle 1 and the like.
- the output unit includes a display and the like configured to display the traveling state information, the surrounding environment information, and the lighting state of the lighting system 4.
- the GPS (Global Positioning System) 9 is configured to acquire current position information of the vehicle 1 and to output the acquired current position information to the vehicle control unit 3.
- the wireless communication unit 10 is configured to receive information related to other vehicles around the vehicle 1 (for example, traveling information of the other vehicles) from the other vehicles, and to transmit information on the vehicle 1 (for example, traveling information of the own vehicle) to the other vehicles (vehicle-to-vehicle communication). The wireless communication unit 10 is also configured to receive infrastructure information from infrastructure equipment such as traffic lights and marker lamps, and to transmit the traveling information of the vehicle 1 to the infrastructure equipment (road-to-vehicle communication). In addition, the wireless communication unit 10 is configured to receive information related to a pedestrian from a portable electronic device (a smartphone, a tablet, a wearable device, etc.) carried by the pedestrian, and to transmit the own-vehicle traveling information of the vehicle 1 to the portable electronic device (pedestrian-to-vehicle communication).
- the vehicle 1 may communicate directly with other vehicles, infrastructure equipment or portable electronic devices in an ad hoc mode, or may communicate via an access point.
- the wireless communication standard is, for example, Wi-Fi (registered trademark), Bluetooth (registered trademark), ZigBee (registered trademark), LPWA or Li-Fi.
- the vehicle 1 may communicate with other vehicles, infrastructure equipment or portable electronic devices using a fifth generation mobile communication system (5G).
- the storage device 11 is an external storage device such as a hard disk drive (HDD) or a solid state drive (SSD).
- the storage device 11 may store 2D or 3D map information and / or a vehicle control program.
- 3D map information may be configured by point cloud data.
- the storage device 11 is configured to output map information and a vehicle control program to the vehicle control unit 3 in response to a request from the vehicle control unit 3.
- the map information and the vehicle control program may be updated via the wireless communication unit 10 and a communication network such as the Internet.
- when the vehicle 1 travels in the automatic driving mode, the vehicle control unit 3 automatically generates at least one of a steering control signal, an accelerator control signal, and a brake control signal based on the traveling state information, the surrounding environment information, the current position information, and/or the map information.
- the steering actuator 12 is configured to receive a steering control signal from the vehicle control unit 3 and to control the steering device 13 based on the received steering control signal.
- the brake actuator 14 is configured to receive a brake control signal from the vehicle control unit 3 and control the brake device 15 based on the received brake control signal.
- the accelerator actuator 16 is configured to receive an accelerator control signal from the vehicle control unit 3 and to control the accelerator device 17 based on the received accelerator control signal. As described above, in the automatic driving mode, the traveling of the vehicle 1 is automatically controlled by the vehicle system 2.
- when the vehicle 1 travels in the manual driving mode, the vehicle control unit 3 generates a steering control signal, an accelerator control signal, and a brake control signal according to the driver's manual operation of the accelerator pedal, the brake pedal, and the steering wheel. In this way, in the manual driving mode, the steering control signal, the accelerator control signal, and the brake control signal are generated by the driver's manual operation, so the traveling of the vehicle 1 is controlled by the driver.
- the operation mode includes an automatic operation mode and a manual operation mode.
- the automatic driving mode includes a completely automatic driving mode, an advanced driving support mode, and a driving support mode.
- in the fully automatic driving mode, the vehicle system 2 automatically performs all of the steering control, brake control, and accelerator control, and the driver is not in a state capable of driving the vehicle 1.
- in the advanced driving support mode, the vehicle system 2 automatically performs all of the steering control, brake control, and accelerator control, and the driver does not drive the vehicle 1 although the driver is in a state capable of driving the vehicle 1.
- the driving support mode the vehicle system 2 automatically performs traveling control of a part of steering control, brake control and accelerator control, and the driver drives the vehicle 1 under the driving support of the vehicle system 2.
- the manual operation mode the vehicle system 2 does not automatically perform travel control, and the driver drives the vehicle 1 without driving assistance from the vehicle system 2.
- the operation mode of the vehicle 1 may be switched by operating the operation mode switch.
- the vehicle control unit 3 switches the driving mode of the vehicle 1 among the four driving modes (fully automatic driving mode, advanced driving support mode, driving support mode, and manual driving mode) according to the driver's operation of the driving mode switching switch.
- alternatively, the driving mode of the vehicle 1 may be automatically switched based on information on travelable sections where autonomous driving vehicles are allowed to travel and travel-prohibited sections where autonomous driving vehicles are prohibited from traveling, or based on information on the external weather condition. In this case, the vehicle control unit 3 switches the driving mode of the vehicle 1 based on these pieces of information.
- the driving mode of the vehicle 1 may be automatically switched by using a seating sensor, a face direction sensor, or the like. In this case, the vehicle control unit 3 may switch the operation mode of the vehicle 1 based on output signals from the seating sensor and the face direction sensor.
- FIG. 3 is a diagram showing functional blocks of the control unit 40a of the illumination system 4a.
- the control unit 40a is configured to control the operations of the illumination unit 42a, the camera 43a, the LiDAR unit 44a, and the millimeter wave radar 45a.
- the control unit 40a includes an illumination control unit 410a, a camera control unit 420a, a LiDAR control unit 430a, a millimeter wave radar control unit 440a, and a surrounding environment information fusion unit 450a.
- the illumination control unit 410a is configured to control the illumination unit 42a so that the illumination unit 42a emits a predetermined light distribution pattern toward the front area of the vehicle 1.
- the illumination control unit 410a may change the light distribution pattern emitted from the illumination unit 42a according to the driving mode of the vehicle 1.
- the camera control unit 420a is configured to control the operation of the camera 43a and to generate surrounding environment information of the vehicle 1 in the detection area S1 of the camera 43a (see FIG. 4) (hereinafter referred to as surrounding environment information I1) based on the image data output from the camera 43a.
- the LiDAR control unit 430a is configured to control the operation of the LiDAR unit 44a and to generate surrounding environment information of the vehicle 1 in the detection area S2 of the LiDAR unit 44a (see FIG. 4) (hereinafter referred to as surrounding environment information I2) based on the point cloud data output from the LiDAR unit 44a.
- the millimeter wave radar control unit 440a is configured to control the operation of the millimeter wave radar 45a and to generate surrounding environment information of the vehicle 1 in the detection area S3 of the millimeter wave radar 45a (see FIG. 4) (hereinafter referred to as surrounding environment information I3) based on the detection data output from the millimeter wave radar 45a.
- the surrounding environment information fusion unit 450a is configured to merge the surrounding environment information I1, I2, and I3 to generate the fused surrounding environment information If.
- as shown in FIG. 4, the surrounding environment information If may include information related to objects existing outside the vehicle 1 in a detection area Sf obtained by combining the detection area S1 of the camera 43a, the detection area S2 of the LiDAR unit 44a, and the detection area S3 of the millimeter wave radar 45a.
- for example, the surrounding environment information If may include information on the attribute of an object, the position of the object with respect to the vehicle 1, the distance between the vehicle 1 and the object, the speed of the object with respect to the vehicle 1, and/or the moving direction of the object.
- the surrounding environment information fusion unit 450a is configured to transmit the surrounding environment information If to the vehicle control unit 3.
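- A toy sketch of the fusion step, assuming each sensor's surrounding environment information is a list of (attribute, position) detections; the naive union stands in for whatever association and deduplication the fusion unit actually performs, which the disclosure does not specify.

```python
from dataclasses import dataclass, field

@dataclass
class EnvInfo:
    """Simplified surrounding environment information: a list of
    (attribute, position) detections within one sensor's detection area."""
    source: str
    objects: list = field(default_factory=list)

def fuse(i1: EnvInfo, i2: EnvInfo, i3: EnvInfo) -> EnvInfo:
    """Merge camera, LiDAR, and radar information into fused info If
    covering the combined detection area Sf (naive union for illustration)."""
    fused = EnvInfo(source="fused")
    for info in (i1, i2, i3):
        fused.objects.extend(info.objects)
    return fused

i1 = EnvInfo("camera", [("pedestrian", (10.0, 2.0))])
i2 = EnvInfo("lidar", [("pedestrian", (10.1, 2.1))])
i3 = EnvInfo("radar", [("vehicle", (40.0, -1.0))])
print(fuse(i1, i2, i3).objects)
```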
- FIG. 5 is a flowchart for explaining a control method of the LiDAR unit 44a according to the present embodiment.
- FIG. 6 is a diagram showing a state in which a pedestrian P (an example of an object) is present in the detection area S2 of the LiDAR unit 44a.
- FIG. 7 is a diagram showing the angular area Sx in which the pedestrian P is present. In FIGS. 6 and 7, for convenience of explanation, the detection areas of sensors other than the LiDAR unit 44a are not shown.
- in the following, the control method of the LiDAR unit 44a is described, but it is also applicable to the LiDAR units 44b to 44d. That is, the control units 40b to 40d may control the LiDAR units 44b to 44d by the same method as the control method of the LiDAR unit 44a.
- in step S1, the LiDAR control unit 430a determines, based on the point cloud data acquired from the LiDAR unit 44a, whether an object (for example, a pedestrian or another vehicle) is present in the peripheral area of the vehicle 1 (specifically, the detection area S2 of the LiDAR unit 44a).
- the LiDAR unit 44a scans laser light at a predetermined angular pitch Δθ in the horizontal direction of the vehicle 1 and at a predetermined angular pitch Δφ in the vertical direction of the vehicle 1.
- the LiDAR unit 44a can generate point cloud data by scanning laser light at the predetermined angular pitches Δθ and Δφ. The smaller the predetermined angular pitches, the higher the spatial resolution of the point cloud data.
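- The pitch-to-resolution trade-off can be quantified with a small sketch: the sample count per frame, and hence the ECU processing load, grows as the angular pitches shrink. The field-of-view and pitch values below are illustrative only.

```python
def scan_points(h_fov_deg, v_fov_deg, pitch_h_deg, pitch_v_deg):
    """Number of samples per frame for a given field of view and angular
    pitches; halving a pitch roughly doubles the resolution (and the
    processing load) on that axis."""
    return (int(h_fov_deg / pitch_h_deg) + 1) * (int(v_fov_deg / pitch_v_deg) + 1)

# Illustrative values: 100 deg x 10 deg field of view.
print(scan_points(100.0, 10.0, 0.5, 1.0))   # 0.5 deg / 1.0 deg pitches
print(scan_points(100.0, 10.0, 0.1, 1.0))   # finer horizontal pitch: ~5x the points
```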
- instead of the LiDAR control unit 430a, the vehicle control unit 3 may determine in step S1 whether an object exists in the peripheral area of the vehicle 1 based on the surrounding environment information If transmitted from the control unit 40a.
- next, in step S2, the LiDAR control unit 430a determines whether the attribute of the object present in the peripheral area of the vehicle 1 can be identified based on the point cloud data. For example, when the object is a pedestrian (or a bicycle), the attribute of the object is a pedestrian (or a bicycle); when the object is another vehicle, the attribute of the object is a vehicle. In the present embodiment, as shown in FIG. 6, since the pedestrian P is present in the detection area S2 of the LiDAR unit 44a, the attribute of the object is a pedestrian. If the determination result in step S2 is YES, the series of processes shown in FIG. 5 ends.
- on the other hand, when the LiDAR control unit 430a determines that the attribute of the object cannot be identified (NO in step S2), the LiDAR control unit 430a executes the process of step S3. Note that, instead of the LiDAR control unit 430a, the vehicle control unit 3 may determine whether the attribute of the object can be identified based on the surrounding environment information If.
- the LiDAR control unit 430a specifies the position of the pedestrian P (target object) based on the point cloud data (step S3).
- the position of the pedestrian P may be the relative position (coordinates) of the pedestrian P with respect to the vehicle 1, or the absolute position (coordinates) of the pedestrian P in geographic space.
- as the position of the pedestrian P, the LiDAR control unit 430a may specify information regarding the distance between the vehicle 1 and the pedestrian P and information regarding the angle of the pedestrian P with respect to the vehicle 1.
- note that, instead of the LiDAR control unit 430a, the vehicle control unit 3 may specify the position of the pedestrian P based on the surrounding environment information If.
- next, the LiDAR control unit 430a increases the scanning resolution of the LiDAR unit 44a only in the angular region Sx (see FIG. 7) in which the pedestrian P (object) is present (step S4). Specifically, the LiDAR control unit 430a first determines the angular region Sx (an example of a first angle area) based on the position information of the pedestrian P.
- here, the angular region Sx is an angular region that covers the entire pedestrian P. For example, when the angular range of the area occupied by the pedestrian P in the horizontal direction of the vehicle 1 is θ1, the angular range of the angular region Sx in the horizontal direction of the vehicle 1 is (θ1 + α) (α > 0).
- the angle α may satisfy, for example, 0 < α < θ1. In this case, the angular range of the angular region Sx is larger than θ1 and smaller than 2θ1.
- next, the LiDAR control unit 430a controls the LiDAR unit 44a so that the scanning resolution of the LiDAR unit 44a in the angular region Sx is increased. For example, if the horizontal angular pitch Δθ in the detection area S2 is 0.5°, the LiDAR control unit 430a may control the LiDAR unit 44a so that the horizontal angular pitch Δθ in the angular region Sx becomes 0.1°. Thus, the LiDAR control unit 430a can increase the horizontal scanning resolution of the LiDAR unit 44a in the angular region Sx. Also, the LiDAR control unit 430a may increase the vertical scanning resolution of the LiDAR unit 44a in the angular region Sx.
- for example, the LiDAR control unit 430a may control the LiDAR unit 44a so that the vertical angular pitch Δφ in the angular region Sx becomes 1°. In this way, the LiDAR control unit 430a can increase the vertical scanning resolution in the angular region Sx.
- next, the LiDAR unit 44a newly acquires point cloud data indicating the surrounding environment of the vehicle 1 (the next frame of point cloud data) in a state where the scanning resolution of the LiDAR unit 44a is increased only in the angular region Sx.
- in the point cloud data newly acquired by the LiDAR unit 44a (the next frame of point cloud data), the spatial resolution in the angular region Sx is higher than the spatial resolution in the portion of the detection area S2 other than the angular region Sx. For this reason, information (especially attribute information) related to the object (pedestrian P) present in the angular region Sx can be acquired with high accuracy.
- next, the LiDAR control unit 430a determines whether the attribute of the object can be identified based on the point cloud data newly acquired from the LiDAR unit 44a (step S5). If it is determined that the attribute of the object can be identified based on the point cloud data, the LiDAR control unit 430a ends the series of processes shown in FIG. 5. On the other hand, if it is determined that the attribute of the object cannot be identified based on the point cloud data, the LiDAR control unit 430a executes the processes of steps S3 and S4 again.
- in step S3, the LiDAR control unit 430a updates the position of the pedestrian P (object) based on the point cloud data newly acquired from the LiDAR unit 44a. Thereafter, the LiDAR control unit 430a updates the angular region Sx based on the updated position information of the pedestrian P, and further increases the scanning resolution of the LiDAR unit 44a only in the angular region Sx.
- for example, the LiDAR control unit 430a may control the LiDAR unit 44a so that the horizontal angular pitch Δθ in the angular region Sx becomes 0.05°. In this way, the LiDAR control unit 430a can gradually increase the horizontal scanning resolution of the LiDAR unit 44a. Likewise, the LiDAR control unit 430a may gradually increase the vertical scanning resolution of the LiDAR unit 44a. Thereafter, the LiDAR control unit 430a newly acquires point cloud data indicating the surrounding environment of the vehicle 1 with the scanning resolution of the LiDAR unit 44a in the angular region Sx further increased, and determines whether the attribute of the object can be identified based on the newly acquired point cloud data. These processes of steps S3 and S4 are repeated as long as the determination result of step S5 is NO.
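- The loop over steps S3 to S5 can be sketched as follows; the scanner and classifier objects and the pitch schedule are hypothetical stand-ins, not APIs from the disclosure:

```python
# Rough sketch of steps S3-S5 of FIG. 5: refine the pitch inside Sx only,
# frame by frame, until the object's attribute can be identified.

PITCH_SCHEDULE_DEG = [0.1, 0.05]  # progressively finer horizontal pitch in Sx

def identify_attribute(lidar, classifier, region):
    for pitch in PITCH_SCHEDULE_DEG:
        region = lidar.locate_object(region)     # step S3: update position / Sx
        lidar.set_region_pitch(region, pitch)    # step S4: raise resolution in Sx only
        frame = lidar.next_frame()
        attribute = classifier(frame, region)    # step S5: try to classify
        if attribute is not None:
            return attribute                     # e.g. "pedestrian"
    return None                                  # attribute still unidentified
```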
- as described above, according to the present embodiment, the scanning resolution of the LiDAR unit 44a is increased in the angular region Sx of the detection area S2 in which the pedestrian P is present. By increasing the scanning resolution of the LiDAR unit 44a in the angular region Sx while not increasing the scanning resolution in the portion of the detection area S2 other than the angular region Sx, the accuracy of the information related to the pedestrian P can be improved while the computational load of the LiDAR control unit 430a (ECU) is suppressed. Accordingly, it is possible to provide the illumination system 4a capable of improving the accuracy of the surrounding environment information while suppressing the computational load of the LiDAR control unit 430a.
- further, when it is determined that the attribute of the object cannot be identified, the LiDAR control unit 430a controls the LiDAR unit 44a so that the scanning resolution of the LiDAR unit 44a in the angular region Sx is increased.
- in particular, the LiDAR control unit 430a controls the LiDAR unit 44a so that the scanning resolution of the LiDAR unit 44a in the angular region Sx gradually increases until the attribute of the object can be identified (that is, until the determination result in step S5 becomes YES).
- since the scanning resolution in the angular region Sx in which the pedestrian P exists gradually increases in this way, the attribute of the object can be reliably identified.
- further, each time the position of the pedestrian P is updated, the angular region Sx is updated based on the updated position of the pedestrian P.
- therefore, even when the pedestrian P is moving, it is possible to increase the scanning resolution of the LiDAR unit 44a in the angular region Sx in which the moving pedestrian P is present.
- in the present embodiment, the pedestrian P is shown as an example of the object, but the object may be another vehicle (including a two-wheeled or three-wheeled vehicle), a traffic infrastructure facility, an obstacle, or the like.
- further, when a plurality of angular regions Sx exist, the LiDAR control unit 430a may increase the scanning resolution of the LiDAR unit 44a in each of the plurality of angular regions Sx.
- in the following description of the present embodiment, the “left-right direction”, the “front-rear direction”, and the “vertical direction” will be referred to as appropriate. These directions are relative directions set for the vehicle 101 shown in FIG. 8.
- the “front-rear direction” is a direction including the “front direction” and the “rear direction”.
- the “left-right direction” is a direction including the “left direction” and the “right direction”.
- the “vertical direction” is a direction including “upper direction” and “lower direction”.
- in the present description, the “horizontal direction” is also referred to as appropriate; the “horizontal direction” is a direction perpendicular to the “vertical direction” and includes the “left-right direction” and the “front-rear direction”.
- FIG. 8 is a schematic view showing a top view of a vehicle 101 provided with a vehicle system 102.
- the vehicle 101 is a vehicle (car) that can travel in an automatic driving mode, and includes a vehicle system 102.
- the vehicle system 102 includes a vehicle control unit 103, a front left illumination system 104a (hereinafter simply referred to as “illumination system 104a”), a front right illumination system 104b (hereinafter simply referred to as “illumination system 104b”), a left rear illumination system 104c (hereinafter simply referred to as “illumination system 104c”), and a right rear illumination system 104d (hereinafter simply referred to as “illumination system 104d”).
- the illumination system 104a is provided on the left front side of the vehicle 101.
- the illumination system 104a includes a housing 124a installed on the left front side of the vehicle 101, and a light transmission cover 122a attached to the housing 124a.
- the illumination system 104b is provided on the right front side of the vehicle 101.
- the illumination system 104b includes a housing 124b installed on the right front side of the vehicle 101, and a light transmission cover 122b attached to the housing 124b.
- the illumination system 104c is provided on the left rear side of the vehicle 101.
- the illumination system 104c includes a housing 124c installed on the left rear side of the vehicle 101, and a light transmission cover 122c attached to the housing 124c.
- the illumination system 104d is provided on the right rear side of the vehicle 101.
- the illumination system 104d includes a housing 124d installed on the right rear side of the vehicle 101, and a light transmission cover 122d attached to the housing 124d.
- FIG. 9 is a block diagram showing a vehicle system 102 according to the present embodiment.
- the vehicle system 102 includes a vehicle control unit 103, illumination systems 104a to 104d, a sensor 5, an HMI 8, a GPS 9, a wireless communication unit 10, and a storage device 11.
- the vehicle system 102 includes a steering actuator 12, a steering device 13, a brake actuator 14, a brake device 15, an accelerator actuator 16, and an accelerator device 17.
- Vehicle system 102 also includes a battery (not shown) configured to provide power.
- the vehicle control unit 103 is configured to control the traveling of the vehicle 101.
- the vehicle control unit 103 is configured, for example, by at least one electronic control unit (ECU).
- the electronic control unit may include at least one microcontroller including one or more processors and one or more memories, and other electronic circuitry including active and passive elements such as transistors.
- the processor is, for example, a CPU, an MPU, a GPU and / or a TPU.
- the CPU may be configured by a plurality of CPU cores.
- the GPU may be configured by a plurality of GPU cores.
- the memory includes a ROM and a RAM.
- a vehicle control program may be stored in the ROM.
- the vehicle control program may include an artificial intelligence (AI) program for autonomous driving.
- An AI program is a program constructed by supervised or unsupervised machine learning using a neural network such as deep learning.
- the RAM may temporarily store a vehicle control program, vehicle control data, and / or surrounding environment information indicating a surrounding environment of the vehicle.
- the processor may be configured to load a program designated from the vehicle control programs stored in the ROM onto the RAM, and to execute various processes in cooperation with the RAM.
- the electronic control unit may be configured by at least one integrated circuit such as an ASIC or an FPGA. Furthermore, the electronic control unit may be configured by a combination of at least one microcontroller and at least one integrated circuit (such as an FPGA).
- the illumination system 104a (an example of a sensing system) further includes a control unit 140a, an illumination unit 142a, a camera 143a, a LiDAR unit 144a (an example of a laser radar), a millimeter wave radar 145a, and an actuator 146a.
- the control unit 140a, the illumination unit 142a, the camera 143a, the LiDAR unit 144a, the millimeter wave radar 145a, and the actuator 146a are arranged in the space Sa formed by the housing 124a and the light transmission cover 122a.
- the control unit 140a may be disposed at a predetermined place of the vehicle 101 other than the space Sa.
- the control unit 140a may be configured integrally with the vehicle control unit 103.
- the control unit 140a is configured by, for example, at least one electronic control unit (ECU).
- the electronic control unit may include at least one microcontroller including one or more processors and one or more memories, and other electronic circuits (eg, transistors, etc.).
- the processor is, for example, a CPU, an MPU, a GPU and / or a TPU.
- the CPU may be configured by a plurality of CPU cores.
- the GPU may be configured by a plurality of GPU cores.
- the memory includes a ROM and a RAM.
- a peripheral environment specifying program for specifying the peripheral environment of the vehicle 101 may be stored in the ROM.
- the peripheral environment identification program is a program constructed by supervised or unsupervised machine learning using a neural network such as deep learning.
- the RAM may temporarily store the peripheral environment identification program, image data acquired by the camera 143a, three-dimensional mapping data (point cloud data) acquired by the LiDAR unit 144a, and/or detection data acquired by the millimeter wave radar 145a.
- the processor may be configured to load a program designated from the peripheral environment identification programs stored in the ROM onto the RAM, and to execute various processes in cooperation with the RAM.
- the electronic control unit may be configured by at least one integrated circuit such as an ASIC or an FPGA.
- the electronic control unit may be configured by a combination of at least one microcontroller and at least one integrated circuit (such as an FPGA).
- the illumination unit 142a is configured to form a light distribution pattern by emitting light toward the outside (forward) of the vehicle 101.
- the illumination unit 142a has a light source for emitting light and an optical system.
- the light source may be configured by, for example, a plurality of light emitting elements arranged in a matrix (for example, N rows ⁇ M columns, N> 1, M> 1).
- the light emitting element is, for example, an LED, an LD or an organic EL element.
- the optical system includes at least one of a reflector configured to reflect light emitted from the light source toward the front of the lighting unit 142a, and a lens configured to refract light emitted directly from the light source or light reflected by the reflector.
- the lighting unit 142a is configured to form a light distribution pattern for the driver (for example, a low beam light distribution pattern or a high beam light distribution pattern) in front of the vehicle 101. Thus, the lighting unit 142a functions as a left headlamp unit.
- the lighting unit 142a may be configured to form a light distribution pattern for a camera in front of the vehicle 101.
- the control unit 140a may be configured to individually supply an electrical signal (for example, a PWM signal) to each of the plurality of light emitting elements provided in the lighting unit 142a. As described above, the control unit 140a can individually select the light emitting elements to which the electric signal is supplied, and can adjust the duty ratio of the electric signal for each light emitting element. That is, the control unit 140a can select a light emitting element to be turned on or off among the plurality of light emitting elements arranged in a matrix, and can determine the luminance of the light emitting element that is turned on. Therefore, the control unit 140a can change the shape and brightness of the light distribution pattern emitted forward from the illumination unit 142a.
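- A minimal sketch of such per-element PWM control (the driver API and example pattern are assumptions, not the patent's interface):

```python
# Illustrative sketch: select which light emitting elements of the N x M
# matrix are lit and set a PWM duty ratio (brightness) for each lit element.

def set_pattern(driver, on_mask, duty):
    """on_mask[r][c]: whether the element is lit; duty[r][c]: 0.0 to 1.0."""
    for r, row in enumerate(on_mask):
        for c, lit in enumerate(row):
            driver.write_pwm(r, c, duty[r][c] if lit else 0.0)

# Example: a 2 x 3 matrix with the upper row dimmed.
on = [[True, True, True], [True, False, True]]
dt = [[0.3, 0.3, 0.3], [0.8, 0.0, 0.8]]
# set_pattern(led_driver, on, dt)  # led_driver is a hypothetical driver object
```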
- the camera 143a is configured to detect the surrounding environment of the vehicle 101.
- the camera 143a is configured to transmit image data to the control unit 140a after acquiring image data indicating the environment around the vehicle 101.
- the control unit 140a specifies the surrounding environment information based on the transmitted image data.
- the surrounding environment information may include information on an object present outside the vehicle 101.
- the surrounding environment information may include information on the attribute of an object present outside the vehicle 101, and information on the distance and position of the object relative to the vehicle 101.
- the camera 143a is configured by, for example, an imaging device such as a CCD or a CMOS.
- the camera 143a may be configured as a single-eye camera or may be configured as a stereo camera.
- when the camera 143a is configured as a stereo camera, the control unit 140a can specify, using parallax, the distance between the vehicle 101 and an object existing outside the vehicle 101 (for example, a pedestrian) based on two or more pieces of image data acquired by the stereo camera. Further, although one camera 143a is provided in the illumination system 104a in the present embodiment, two or more cameras 143a may be provided in the illumination system 104a.
- the LiDAR unit 144a is configured to detect the surrounding environment of the vehicle 101.
- the LiDAR unit 144a is configured to transmit point cloud data to the control unit 140a after acquiring point cloud data (3D mapping data) indicating the surrounding environment of the vehicle 101.
- the control unit 140a specifies the surrounding environment information based on the transmitted point cloud data.
- the surrounding environment information may include information related to an object existing outside the vehicle 101.
- the surrounding environment information may include information on the attribute of the object present outside the vehicle 101, information on the distance and position of the object relative to the vehicle 101, and information on the moving direction of the object.
- more specifically, the LiDAR unit 144a acquires information on the time of flight (TOF) ΔT1 of the laser light (light pulse) at each emission angle (horizontal angle θ, vertical angle φ) of the laser light. Then, based on the information on the time of flight ΔT1, the LiDAR unit 144a can acquire information on the distance D between the LiDAR unit 144a (the vehicle 101) and an object existing outside the vehicle 101 at each emission angle (horizontal angle θ, vertical angle φ).
- the flight time ⁇ T1 can be calculated, for example, as follows.
- Time of flight ⁇ T1 time at which laser light (light pulse) returned to LiDAR unit t1—time at which LiDAR unit emitted laser light (light pulse)
- the LiDAR unit 144a can acquire point cloud data (3D mapping data) indicating the environment around the vehicle 101.
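- The distance computation implied by the flight time ΔT1 is the standard out-and-back relation D = c × ΔT1 / 2; a minimal sketch:

```python
# Sketch of the time-of-flight distance computation described above.
C = 299_792_458.0  # speed of light [m/s]

def tof_distance(t_emit_s: float, t_return_s: float) -> float:
    """Distance D from flight time dT1 = t1 - t0 (out-and-back path)."""
    dt1 = t_return_s - t_emit_s
    return C * dt1 / 2.0

print(tof_distance(0.0, 200e-9))  # ~30 m for a 200 ns round trip
```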
- the LiDAR unit 144a includes, for example, a light emitting unit configured to emit laser light, an optical deflector configured to scan the laser light in the horizontal direction and the vertical direction, an optical system such as a lens, and a light receiving unit configured to receive the laser light reflected by an object.
- the central wavelength of the laser light emitted from the laser light source is not particularly limited.
- the laser light may be invisible light whose center wavelength is around 900 nm.
- the light deflector may be, for example, a MEMS mirror or a polygon mirror. When the light deflector is a polygon mirror, the LiDAR unit 144a always scans the laser light along a predetermined direction. On the other hand, when the light deflector is a MEMS mirror, the LiDAR unit 144a scans the laser light in the direction opposite to the predetermined direction after scanning the laser light along the predetermined direction (that is, the laser Light travels back and forth).
- the light receiving unit is, for example, a photodiode.
- the LiDAR unit 144a may acquire point cloud data without scanning the laser light by the light deflector.
- the LiDAR unit 144a may acquire point cloud data in a phased array method or a flash method.
- although one LiDAR unit 144a is provided in the illumination system 104a in the present embodiment, two or more LiDAR units 144a may be provided in the illumination system 104a.
- for example, one LiDAR unit 144a may be configured to detect the surrounding environment in the front area of the vehicle 101, and the other LiDAR unit 144a may be configured to detect the surrounding environment in the side area of the vehicle 101.
- the LiDAR unit 144a may scan the laser light at a predetermined angular pitch (a predetermined scanning resolution in the horizontal direction) in the horizontal direction and a predetermined angular pitch (a predetermined scanning resolution in the vertical direction) in the vertical direction.
- the millimeter wave radar 145a is configured to detect the surrounding environment of the vehicle 101.
- the millimeter wave radar 145a is configured to transmit detection data to the control unit 140a after acquiring detection data indicating the peripheral environment of the vehicle 101.
- the control unit 140a specifies the surrounding environment information based on the transmitted detection data.
- the surrounding environment information may include information on an object present outside the vehicle 101.
- the surrounding environment information may include, for example, information on the attribute of an object present outside the vehicle 101, information on the position of the object relative to the vehicle 101, and information on the speed of the object relative to the vehicle 101.
- for example, the millimeter wave radar 145a can acquire the distance D between the millimeter wave radar 145a (the vehicle 101) and an object existing outside the vehicle 101 using a pulse modulation method, an FM-CW method, or a two-frequency CW method.
- the millimeter wave radar 145a acquires information on the time of flight ⁇ T2 of the millimeter wave at each emission angle of the millimeter wave, and then the millimeter wave radar at each emission angle based on the information on the time of flight ⁇ T2. It is possible to obtain information on the distance D between the vehicle 145a (the vehicle 101) and an object present outside the vehicle 101.
- the flight time ⁇ T2 can be calculated, for example, as follows.
- further, the millimeter wave radar 145a can acquire information on the relative velocity V of an object existing outside the vehicle 101 with respect to the millimeter wave radar 145a (the vehicle 101), based on the frequency f0 of the millimeter wave emitted from the millimeter wave radar 145a and the frequency f1 of the millimeter wave returned to the millimeter wave radar 145a.
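- A minimal sketch of the underlying Doppler relation (a textbook formula, not quoted from the disclosure), where V = c × (f1 − f0) / (2 × f0):

```python
# Sketch: recover relative velocity V from the Doppler shift between the
# emitted frequency f0 and the returned frequency f1 (CW radar principle).
C = 299_792_458.0  # propagation speed [m/s]

def relative_velocity(f0_hz: float, f1_hz: float) -> float:
    """Positive V means the object is approaching."""
    return C * (f1_hz - f0_hz) / (2.0 * f0_hz)

# 77 GHz radar with a 10.27 kHz Doppler shift -> ~20 m/s closing speed.
print(relative_velocity(77e9, 77e9 + 10_270))
```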
- the illumination system 104a may include a short-range millimeter wave radar 145a, a medium-range millimeter wave radar 145a, and a long-range millimeter wave radar 145a.
- the actuator 146a is configured to change the inclination angle ⁇ of the LiDAR unit 144a with respect to the vertical direction of the vehicle 101.
- an example of the actuator 146a used in the present embodiment includes an electromagnetic solenoid 462 and a shaft 463 connected to the electromagnetic solenoid 462.
- the electromagnetic solenoid 462 is configured to convert electrical energy into mechanical energy, and can move the shaft 463 in the back and forth direction.
- the illumination system 104b further includes a control unit 140b, an illumination unit 142b, a camera 143b, a LiDAR unit 144b, a millimeter wave radar 145b, and an actuator 146b.
- the control unit 140b, the illumination unit 142b, the camera 143b, the LiDAR unit 144b, the millimeter wave radar 145b, and the actuator 146b are arranged in the space Sb formed by the housing 124b and the light transmission cover 122b.
- the control unit 140b may be disposed at a predetermined place of the vehicle 101 other than the space Sb.
- the control unit 140b may be configured integrally with the vehicle control unit 103.
- the control unit 140b may have the same function and configuration as the control unit 140a.
- the lighting unit 142b may have the same function and configuration as the lighting unit 142a.
- however, while the lighting unit 142a functions as a left headlamp unit, the lighting unit 142b functions as a right headlamp unit.
- the camera 143b may have the same function and configuration as the camera 143a.
- the LiDAR unit 144b may have the same function and configuration as the LiDAR unit 144a.
- the millimeter wave radar 145b may have the same function and configuration as the millimeter wave radar 145a.
- the actuator 146b may have the same function and configuration as the actuator 146a.
- the illumination system 104c further includes a control unit 140c, an illumination unit 142c, a camera 143c, a LiDAR unit 144c, a millimeter wave radar 145c, and an actuator 146c.
- as shown in FIG. 8, the control unit 140c, the illumination unit 142c, the camera 143c, the LiDAR unit 144c, and the millimeter wave radar 145c are arranged in the space Sc (light chamber) formed by the housing 124c and the light transmission cover 122c.
- the control unit 140c may be disposed at a predetermined place of the vehicle 101 other than the space Sc.
- the control unit 140c may be configured integrally with the vehicle control unit 103.
- the control unit 140c may have the same function and configuration as the control unit 140a.
- the illumination unit 142c is configured to form a light distribution pattern by emitting light toward the outside (rear) of the vehicle 101.
- the illumination unit 142c has a light source for emitting light and an optical system.
- the light source may be configured by, for example, a plurality of light emitting elements arranged in a matrix (for example, N rows ⁇ M columns, N> 1, M> 1).
- the light emitting element is, for example, an LED, an LD or an organic EL element.
- the optical system includes at least one of a reflector configured to reflect light emitted from the light source toward the front of the illumination unit 142c, and a lens configured to refract light emitted directly from the light source or light reflected by the reflector.
- the lighting unit 142c may be turned off.
- the lighting unit 142c may be configured to form a light distribution pattern for a camera behind the vehicle 101.
- the camera 143c may have the same function and configuration as the camera 143a.
- the LiDAR unit 144c may have the same function and configuration as the LiDAR unit 144a.
- the millimeter wave radar 145c may have the same function and configuration as the millimeter wave radar 145a.
- the actuator 146c may have the same function and configuration as the actuator 146a.
- the illumination system 104d further includes a control unit 140d, an illumination unit 142d, a camera 143d, a LiDAR unit 144d, a millimeter wave radar 145d, and an actuator 146d.
- the control unit 140d, the illumination unit 142d, the camera 143d, the LiDAR unit 144d, and the millimeter wave radar 145d are arranged in the space Sd (light chamber) formed by the housing 124d and the light transmission cover 122d.
- the control unit 140d may be disposed at a predetermined place of the vehicle 101 other than the space Sd.
- the control unit 140d may be configured integrally with the vehicle control unit 103.
- the control unit 140d may have the same function and configuration as the control unit 140c.
- the lighting unit 142d may have the same function and configuration as the lighting unit 142c.
- the camera 143d may have the same function and configuration as the camera 143c.
- the LiDAR unit 144d may have the same function and configuration as the LiDAR unit 144c.
- the millimeter wave radar 145d may have the same function and configuration as the millimeter wave radar 145c.
- the actuator 146d may have the same function and configuration as the actuator 146c.
- FIG. 10 is a diagram showing functional blocks of the control unit 140a of the illumination system 104a.
- the control unit 140a is configured to control the operations of the illumination unit 142a, the camera 143a, the LiDAR unit 144a, the millimeter wave radar 145a, and the actuator 146a.
- the control unit 140a includes an illumination control unit 1410a, a camera control unit 1420a, a LiDAR control unit 1430a, a millimeter wave radar control unit 1440a, an actuator control unit 1460a, and a surrounding environment information fusion unit 1450a.
- the lighting control unit 1410a is configured to control the lighting unit 142a so that the lighting unit 142a emits a predetermined light distribution pattern toward the front area of the vehicle 101.
- the illumination control unit 1410a may change the light distribution pattern emitted from the illumination unit 142a according to the operation mode of the vehicle 101.
- the camera control unit 1420a is configured to control the operation of the camera 143a and to generate, based on the image data output from the camera 143a, surrounding environment information of the vehicle 101 in the detection area S10 of the camera 143a (see FIG. 11) (hereinafter referred to as the surrounding environment information I1).
- the LiDAR control unit 1430a is configured to control the operation of the LiDAR unit 144a and to generate, based on the point cloud data output from the LiDAR unit 144a, surrounding environment information of the vehicle 101 in the detection area S12 of the LiDAR unit 144a (see FIG. 11) (hereinafter referred to as the surrounding environment information I2).
- the millimeter wave radar control unit 1440a is configured to control the operation of the millimeter wave radar 145a and to generate, based on the detection data output from the millimeter wave radar 145a, surrounding environment information of the vehicle 101 in the detection area S13 (see FIG. 11) of the millimeter wave radar 145a (hereinafter referred to as the surrounding environment information I3).
- the surrounding environment information fusion unit 1450a is configured to fuse the surrounding environment information I1, I2, and I3 to generate the fused surrounding environment information Ig.
- as shown in FIG. 11, the surrounding environment information Ig may include information related to objects existing outside the vehicle 101 in a detection area Sg obtained by combining the detection area S10 of the camera 143a, the detection area S12 of the LiDAR unit 144a, and the detection area S13 of the millimeter wave radar 145a.
- the surrounding environment information Ig may include information on the attribute of the object, the position of the object relative to the vehicle 101, the distance between the vehicle 101 and the object, the speed of the object relative to the vehicle 101, and/or the moving direction of the object.
- the surrounding environment information fusion unit 1450a is configured to transmit the surrounding environment information Ig to the vehicle control unit 103.
- the actuator control unit 1460a is configured to control the drive of the actuator 146a.
- the actuator control unit 1460a can determine the inclination angle of the LiDAR unit 144a with respect to the vertical direction of the vehicle 101 by controlling the drive of the actuator 146a.
- FIG. 12A is a schematic view of the LiDAR unit 144a and the actuator 146a viewed from the right.
- FIG. 12B is a schematic view of the LiDAR unit 144a viewed from the front.
- the LiDAR unit 144a includes a LiDAR unit main body 143 and a housing 140 for housing the LiDAR unit main body 143.
- the LiDAR unit main body 143 includes three light emitting units E1 to E3 configured to emit laser light (light pulses) toward the outside of the vehicle 101, and three light receiving units R1 to R3 configured to receive the laser light reflected or scattered by an object existing in front of the vehicle 101.
- since the LiDAR unit main body 143 is rotationally driven, the LiDAR unit 144a can scan the laser light in the horizontal direction.
- Each of the three light emitting units E1 to E3 may be configured to emit a laser beam (light pulse) at the same timing.
- the three light emitting units E1 to E3 may be configured to emit laser light at different angles in the vertical direction.
- the angular difference in the vertical direction between the emission angle of the laser light emitted from the light emitting unit E1 and the emission angle of the laser light emitted from the light emitting unit E2 is, for example, 3°.
- likewise, the angular difference in the vertical direction between the emission angle of the laser light emitted from the light emitting unit E2 and the emission angle of the laser light emitted from the light emitting unit E3 is, for example, 3°.
- in the present embodiment, the number of light emitting units and light receiving units is three, but the number of light emitting units and light receiving units is not limited to three. Likewise, the angular difference between the emission angles of the laser light is not limited to 3°.
- the upper end surface 141 of the housing 140 is connected to the fulcrum 70 of the frame body 72 via the upper support shaft 73, and the lower end surface 142 of the housing 140 is connected to the shaft 463 via the lower support shaft 75.
- the upper support shaft 73 is rotatably fixed to the fulcrum 70.
- the inclination angle ⁇ 2 is the maximum inclination angle (maximum value of the inclination angle) of the LiDAR unit 144a in the forward direction.
- the actuator control unit 1460a transmits a control signal (electric signal) corresponding to the inclination angle ⁇ 2 to the actuator 146a.
- the electromagnetic solenoid 462 of the actuator 146a moves the shaft 463 to a position corresponding to the inclination angle ⁇ 2 based on the control signal received from the actuator control unit 1460a.
- the LiDAR unit 144a is inclined forward about the fulcrum 70 by the inclination angle ⁇ 2.
- the inclination angle ⁇ 3 is the maximum inclination angle (maximum value of the inclination angle) of the LiDAR unit 144a in the rear direction.
- the actuator control unit 1460a transmits a control signal (electric signal) corresponding to the inclination angle ⁇ 3 to the actuator 146a.
- the electromagnetic solenoid 462 of the actuator 146a moves the shaft 463 to a position corresponding to the inclination angle ⁇ 3 based on the control signal received from the actuator control unit 1460a.
- the LiDAR unit 144a is inclined rearward about the fulcrum 70 by the inclination angle ⁇ 3.
- when the LiDAR unit 144a is inclined forward by the angle θ2, the detection area S12 of the LiDAR unit 144a is inclined upward by the angle θ2.
- likewise, when the LiDAR unit 144a is inclined rearward by the angle θ3, the detection area S12 of the LiDAR unit 144a is inclined downward by the angle θ3.
- in this manner, by inclining the LiDAR unit 144a with respect to the vertical direction using the actuator 146a, the actuator control unit 1460a can widen the detection area S12 of the LiDAR unit 144a in the vertical direction.
- the inclination angle ⁇ 2 is the maximum inclination angle of the LiDAR unit 144a in the front direction
- the inclination angle ⁇ 3 is the maximum inclination angle of the LiDAR unit 144a in the rear direction.
- in the following, the detection area of the LiDAR unit 144a obtained by the tilt control of the LiDAR unit 144a (that is, the detection area S12 expanded in the vertical direction) is referred to as a detection area S120.
- since the LiDAR unit 144a gradually inclines in the vertical direction, the detection area S12 gradually inclines upward and downward, thereby forming the detection area S120.
- FIG. 14A shows the detection area S12 and the detection area S120 (the detection area S12 expanded in the vertical direction) in the horizontal direction.
- FIG. 14B shows the detection areas S12 and S120 in the vertical direction.
- as shown in FIG. 14A, the angular range of the detection area S12 and the angular range of the detection area S120 coincide with each other in the horizontal direction.
- on the other hand, as shown in FIG. 14B, the angular range θt of the detection area S120 in the vertical direction is larger than the angular range θ1 of the detection area S12 in the vertical direction.
- the inclination angle ⁇ 2 indicates the maximum inclination angle of the LiDAR unit 144a in the front direction
- the inclination angle ⁇ 3 indicates the maximum inclination angle of the LiDAR unit 144a in the rear direction. Therefore, the angle range ⁇ t of the detection area S 120 Is defined as the following equation (1).
- Angle range ⁇ t ⁇ 1 + ⁇ 2 + ⁇ 3 (1)
- the angular range ⁇ t of the detection area S 120 in the vertical direction is increased by ( ⁇ 2 + ⁇ 3) than the angular range ⁇ 1 of the detection area S 12 in the vertical direction.
- thus, since information (for example, attribute information) related to an object existing around the vehicle 101 can be acquired with high accuracy, it is possible to provide the illumination system 104a (sensing system) capable of improving the recognition accuracy of the surrounding environment of the vehicle 101.
- FIG. 15 is a diagram for explaining the relationship between the scan time of the LiDAR unit 144a and the drive time of the actuator 146a.
- FIG. 16 is a diagram showing angle changes of three scan lines L1 to L3 between frames F1 to F10 of point group data.
- as shown in FIG. 15, the actuator control unit 1460a does not drive the actuator 146a during the scan time in which the LiDAR unit 144a is scanning the laser light (the scan time of the LiDAR unit 144a).
- in other words, the actuator control unit 1460a does not drive the actuator 146a during the scan time of the LiDAR unit 144a in which a frame Fn (n is a natural number) of the point cloud data is acquired.
- for example, after the frame F2 of the point cloud data is acquired, the actuator 146a tilts the LiDAR unit 144a with respect to the vertical direction, and thereafter the LiDAR unit 144a scans the laser light to acquire the next frame F3.
- the actuator control unit 1460a controls the drive of the actuator 146a so that the scan time of the LiDAR unit 144a and the drive time of the actuator 146a do not overlap.
- the drive time of the actuator 146a may be set to N times (N is a natural number) of the scan time of the LiDAR unit 144a.
- the frame F1 is not limited to the frame of point cloud data initially acquired by the LiDAR unit 144a.
- Frames F1, F5, and F9 are frames of point cloud data acquired when the LiDAR unit 144a is not inclined in the vertical direction.
- note that the scan time of the LiDAR unit 144a shown in FIG. 15 does not include the time of the signal processing required to generate the point cloud data after the light receiving units R1 to R3 receive the light.
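- The non-overlapping schedule of FIG. 15 can be sketched as an alternating loop; the lidar/actuator objects and their methods are hypothetical:

```python
# Sketch of the scheduling rule in FIG. 15: the actuator is driven only in the
# gaps between scans, so no frame is acquired while the unit is moving.

def acquire_frames(lidar, actuator, tilt_schedule_deg):
    frames = []
    for tilt in tilt_schedule_deg:
        actuator.move_to(tilt)         # drive time: no scanning in progress
        actuator.wait_until_settled()
        frames.append(lidar.scan())    # scan time: actuator held still
    return frames
```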
- FIG. 16 shows scan lines L1 to L3 projected on a virtual screen Sc virtually installed in front of the vehicle 101.
- the virtual screen Sc is installed vertically to the horizontal direction of the vehicle 101.
- the scanning line is a locus of a laser light spot formed on the virtual screen by the scanning of the laser light.
- the scanning line L1 is a scanning line of laser light emitted from the light emitting unit E1 (see FIG. 12).
- the scan line L2 is a scan line of the laser light emitted from the light emitting unit E2.
- the scanning line L3 is a scanning line of the laser light emitted from the light emitting unit E3.
- the angular interval in the vertical direction between the scanning line L1 and the scanning line L2 is 3°.
- likewise, the angular interval in the vertical direction between the scanning line L2 and the scanning line L3 is 3°.
- accordingly, the angular difference in the vertical direction between the scanning line L1 and the scanning line L3 is 6°.
- further, the angular range of the detection area S12 in the vertical direction is assumed to be 6°.
- note that these angular differences and the angular range of the detection area S12 are merely examples.
- in the present embodiment, since the LiDAR unit 144a includes the three light emitting units E1 to E3, three scan lines L1 to L3 are projected on the virtual screen Sc.
- when the LiDAR unit includes N light emitting units, N scan lines are projected on the virtual screen Sc.
- the frame F1 is a frame of point cloud data acquired when the LiDAR unit 144a is not inclined in the vertical direction.
- the scan line L2 when acquiring the frame F1 coincides with the reference line.
- the reference line is a line indicating zero degree in the vertical direction.
- between the frames F4 and F5, the actuator 146a moves the LiDAR unit 144a forward by 1.5° from the state in which the LiDAR unit 144a is inclined 1.5° rearward.
- as a result, when the frame F5 is acquired, the scanning line L2 (that is, the detection area S12) is not inclined in the vertical direction.
- next, between the frames F5 and F6, the actuator 146a moves the LiDAR unit 144a forward by 1.5°.
- as a result, when the frame F6 is acquired, the scanning line L2 (the detection area S12) is inclined upward by 1.5°.
- next, between the frames F6 and F7, the actuator 146a further moves the LiDAR unit 144a forward by 4.5°.
- as a result, when the frame F7 is acquired, the scanning line L2 (the detection area S12) is inclined upward by 6°.
- next, between the frames F7 and F8, the actuator 146a moves the LiDAR unit 144a rearward by 4.5° from the state in which the LiDAR unit 144a is inclined 6° forward.
- as a result, when the frame F8 is acquired, the scanning line L2 (the detection area S12) is inclined upward by 1.5°.
- finally, between the frames F8 and F9, the actuator 146a moves the LiDAR unit 144a rearward by 1.5° from the state in which the LiDAR unit 144a is inclined 1.5° forward.
- as a result, the scanning line L2 (the detection area S12) is again not inclined in the vertical direction.
- the frame F9 is a frame of point cloud data acquired when the LiDAR unit 144a is not inclined in the vertical direction.
- the drive control of the actuator 146a that is performed in the acquisition period of the frames F1 to F8 is repeatedly performed.
- tilt control of the LiDAR unit 144a in the acquisition period of the frames F1 to F8 is repeatedly performed.
- the angular positions of the scanning lines L1 to L3 in the vertical direction when the frame F9 is acquired correspond to the angular positions of the scanning lines L1 to L3 in the vertical direction when the frame F1 is acquired.
- the angular positions of the scanning lines L1 to L3 in the vertical direction when the frame F10 is acquired correspond to the angular positions of the scanning lines L1 to L3 in the vertical direction when the frame F2 is acquired.
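- Under the reconstruction above, one period of the FIG. 16 tilt schedule can be written out as follows (a hypothetical reading of the figure, with positive values meaning forward tilt):

```python
# Reconstructed tilt schedule for frames F1-F8: 1.5 deg steps near the
# reference line, 4.5 deg steps further away; F9 then repeats F1.
TILT_SCHEDULE_DEG = [0.0, -1.5, -6.0, -1.5, 0.0, 1.5, 6.0, 1.5]

for n, tilt in enumerate(TILT_SCHEDULE_DEG, start=1):
    print(f"frame F{n}: scanning line L2 at {tilt:+.1f} deg")
```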
- the LiDAR control unit 1430a generates the surrounding environment information I2 based on the frames F1 to F8 of the point cloud data. Specifically, the LiDAR control unit 1430a generates the surrounding environment information I2 by combining the frames F1 to F8.
- by combining the frames in this way, the detection area of the LiDAR unit 144a in the vertical direction can be expanded (that is, the detection area S120 can be obtained).
- in the present embodiment, while the angular range of the detection area S12 in the vertical direction is 6°, the angular range of the detection area S120 corresponding to the detection area S12 expanded by the tilt control of the LiDAR unit 144a is 18°.
- further, the scan lines L1 to L3 move up and down in steps of 1.5°, which is smaller than the 3° angular interval between adjacent scan lines.
- in the detection area S120, since the angular interval between adjacent scan lines is narrowed, the spatial resolution of the point cloud data in the vertical direction can be increased.
- the accuracy of the surrounding environment information obtained from the point cloud data can be improved.
- further, as already described, the actuator control unit 1460a controls the drive of the actuator 146a so that the scan time of the LiDAR unit 144a and the drive time of the actuator 146a do not overlap with each other. For this reason, as shown in FIG. 16, the scan lines L1 to L3 of the LiDAR unit 144a do not incline. In particular, there is no vertical angular difference between one end (start point) and the other end (end point) of each scan line. For example, in the scan line L1 at the time of acquiring the frame F2, the angular position does not change in the vertical direction. Since the scan lines L1 to L3 do not incline in this way, the computational load of the LiDAR control unit 1430a (ECU) that processes the point cloud data can be reduced.
- in the present embodiment, the actuator 146a moves the LiDAR unit 144a at two types of angular intervals (1.5° and 4.5°) between adjacent frames, but the present embodiment is not limited to this.
- the actuator 146a may move the LiDAR unit 144a by one type of angular spacing (eg, 1.5 ° or 4.5 °) between adjacent frames.
- the actuator 146a moves the LiDAR unit 144a at 1.5 ° intervals between adjacent frames, the spatial resolution in the vertical direction of the point cloud data can be further improved.
- the actuator 146a moves the LiDAR unit 144a at an interval of 4.5 ° between adjacent frames, the detection region of the LiDAR unit 144a in the vertical direction can be further expanded.
- for example, the actuator 146a may be configured to gradually change the tilt angle of the LiDAR unit 144a at a first angular interval (for example, 1.5°) within a predetermined angular range from the reference line (for example, within 5° downward and 5° upward), and to gradually change the tilt angle of the LiDAR unit 144a at a second angular interval (for example, 4.5°) larger than the first angular interval outside the predetermined angular range.
- here, the predetermined angular range from the reference line means a predetermined angular range with respect to the horizontal direction.
- in this case, the tilt angle of the LiDAR unit 144a is gradually changed at the first angular interval within the predetermined angular range, and is gradually changed at the second angular interval outside the predetermined angular range. Therefore, the scanning resolution of the LiDAR unit 144a can be enhanced within the predetermined angular range, while the detection area of the LiDAR unit 144a in the vertical direction can be enlarged.
- the LiDAR unit 144a moves by 1.5 ° between the frame F1 and the frame F2, while the LiDAR unit 144a moves by 4.5 ° between the frame F2 and the frame F3. Therefore, as shown in FIG. 15, the drive time Ta1 of the actuator 146a between the frame F1 and the frame F2 and the drive time Ta2 of the actuator 146a between the frame F2 and the frame F3 are different.
- note that the actuator control unit 1460a may drive the actuator 146a such that the drive time Ta1 and the drive time Ta2 are the same. In this case, the moving speed of the shaft 463 during the drive time Ta2 is higher than the moving speed of the shaft 463 during the drive time Ta1.
- FIG. 17 is a diagram showing angle changes of scanning lines L1 to L3 between frames F1 to F6 of point group data.
- the frame F1 is a frame of point cloud data acquired when the LiDAR unit 144a is not inclined in the vertical direction.
- the scan line L2 coincides with a reference line indicating zero degrees in the vertical direction.
- when the frame F2 is acquired, the scanning line L2 is inclined downward by 3°, so that the scanning line L1 coincides with the reference line.
- when the frame F3 is acquired, the scanning line L2 again coincides with the reference line.
- when the frame F4 is acquired, the scanning line L2 is inclined upward by 3°, so that the scanning line L3 coincides with the reference line.
- when the frame F5 is acquired, the scanning line L2 coincides with the reference line.
- the drive control of the actuator 146a which is executed in the acquisition period of the frames F1 to F4 is repeatedly executed.
- the tilt control of the LiDAR unit 144a in the acquisition period of the frames F1 to F4 is repeatedly performed.
- the angular positions of the scanning lines L1 to L3 in the vertical direction when the frame F5 is acquired correspond to the angular positions of the scanning lines L1 to L3 in the vertical direction when the frame F1 is acquired.
- the angular positions of the scanning lines L1 to L3 in the vertical direction when the frame F6 is acquired correspond to the angular positions of the scanning lines L1 to L3 in the vertical direction when the frame F2 is acquired.
- in this manner, the scan lines L1 to L3 move vertically by a predetermined angle between frames while one of the three scan lines L1 to L3 always coincides with the reference line.
- since the reference line is thus always scanned by the laser light, the accuracy of the information related to an object present in the vicinity of the reference line can be improved.
- FIG. 18A is a diagram for explaining the relationship between the scan time of the LiDAR unit 144a and the drive time of the actuator 146a.
- FIG. 18B is a diagram showing three scan lines L1 to L3 of the LiDAR unit 144a in the tilt control of the LiDAR unit 144a shown in FIG. 18A.
- the actuator control unit 1460a drives the actuator 146a during the scanning time in which the LiDAR unit 144a is scanning the laser light. For example, while the LiDAR unit 144a is scanning laser light to acquire the frame F2 of point cloud data, the actuator 146a tilts the LiDAR unit 144a with respect to the vertical direction. Thus, unlike the example shown in FIG. 15, the actuator control unit 1460a controls the drive of the actuator 146a so that the scanning time of the LiDAR unit 144a and the driving time of the actuator 146a overlap each other.
- in this case, since the actuator 146a inclines the LiDAR unit 144a with respect to the vertical direction while the laser light is being scanned, the scan lines L1 to L3 incline as shown in FIG. 18B.
- that is, a vertical angular difference occurs between one end (start point) and the other end (end point) of each scan line.
- on the other hand, since the actuator 146a is driven during the scan time of the LiDAR unit 144a, the scan lines L1 to L3 incline, but the update rate (Hz) of the surrounding environment information I2 obtained by combining a plurality of frames of the point cloud data can be prevented from being greatly reduced. As described above, it is possible to widen the detection area of the LiDAR unit 144a in the vertical direction while preventing the update rate of the surrounding environment information I2 based on the point cloud data from being greatly reduced.
- FIG. 19 is a flowchart for explaining an example of a process of determining whether to drive the actuator 146a according to the current position of the vehicle 101.
- in step S21, the vehicle control unit 103 (see FIG. 9) acquires information indicating the current position of the vehicle 101 (current position information) using the GPS 9. Next, in step S22, the vehicle control unit 103 acquires map information from the storage device 11. Thereafter, the vehicle control unit 103 transmits the current position information and the map information to the actuator control unit 1460a of the control unit 140a. Next, the actuator control unit 1460a determines whether to drive the actuator 146a based on the received current position information and map information (step S23).
- specifically, the actuator control unit 1460a determines whether the vehicle 101 is positioned at a place where many objects (pedestrians and the like) exist (for example, an intersection or a downtown area).
- when the actuator control unit 1460a determines that the vehicle 101 is positioned at a place where many objects exist (YES in step S23), the actuator control unit 1460a drives the actuator 146a to tilt the LiDAR unit 144a with respect to the vertical direction (step S24).
- on the other hand, when the actuator control unit 1460a determines that the vehicle 101 is not positioned at a place where many objects exist (NO in step S23), the actuator control unit 1460a does not drive the actuator 146a (step S25).
- that is, in step S25, the LiDAR unit 144a scans the laser light in a state in which the LiDAR unit 144a is not tilted in the vertical direction.
- in this way, when the vehicle 101 is positioned at a place where many objects exist, the tilt control of the LiDAR unit 144a using the actuator 146a is executed, so that the detection area of the LiDAR unit 144a in the vertical direction can be enlarged. Therefore, information related to objects present around the vehicle 101 can be acquired with high accuracy.
- alternatively, in step S23, the actuator control unit 1460a may determine whether the vehicle 101 is positioned on a motorway (such as an expressway) based on the current position information and the map information.
- when the actuator control unit 1460a determines that the vehicle 101 is positioned on a motorway (NO in step S23), the actuator control unit 1460a does not drive the actuator 146a (step S25).
- on the other hand, when the actuator control unit 1460a determines that the vehicle 101 is not positioned on a motorway (YES in step S23), the actuator control unit 1460a drives the actuator 146a to incline the LiDAR unit 144a in the vertical direction (step S24).
- in this way, when the vehicle 101 is traveling on a motorway, the tilt control of the LiDAR unit 144a using the actuator 146a is not executed, so that the update rate (Hz) of the surrounding environment information I2 based on the point cloud data can be maintained.
- that is, during high-speed traveling, maintaining the update rate of the surrounding environment information I2 is prioritized over expanding the detection area in the vertical direction.
- whether or not to drive the actuator 146a is determined according to the current position of the vehicle 101. Therefore, it is possible to acquire optimum surrounding environment information according to the current position of the vehicle 101. Note that the series of processes shown in FIG. 19 may be repeatedly performed in a predetermined cycle.
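- A compact sketch of the decision flow of FIG. 19 (steps S21 to S25); the GPS/map helpers and area labels are hypothetical:

```python
# Sketch of FIG. 19: drive the actuator only where many objects are expected.

def update_actuator(gps, hd_map, actuator_ctrl):
    position = gps.current_position()             # step S21
    area = hd_map.area_type(position)             # step S22
    if area in ("intersection", "downtown"):      # step S23: many objects likely
        actuator_ctrl.enable_tilt()               # step S24: widen vertical detection area
    else:                                         # e.g. motorway
        actuator_ctrl.disable_tilt()              # step S25: keep the update rate
```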
- FIG. 20 is a flowchart for describing an example of processing for determining whether to drive the actuator 146a in accordance with a pedestrian present around the vehicle 101.
- in step S30, the LiDAR control unit 1430a determines, based on the point cloud data acquired from the LiDAR unit 144a, whether a pedestrian exists around the vehicle 101 (in particular, in the detection area S12 of the LiDAR unit 144a).
- when the LiDAR control unit 1430a determines that a pedestrian is present around the vehicle 101 (YES in step S30), the LiDAR control unit 1430a transmits information indicating that a pedestrian is present to the actuator control unit 1460a.
- thereafter, the actuator control unit 1460a tilts the LiDAR unit 144a in the vertical direction by driving the actuator 146a according to the information indicating that a pedestrian is present (step S31).
- on the other hand, when the LiDAR control unit 1430a determines that no pedestrian is present around the vehicle 101 (NO in step S30), the LiDAR control unit 1430a transmits information indicating that no pedestrian is present to the actuator control unit 1460a. Thereafter, the actuator control unit 1460a does not drive the actuator 146a according to the information indicating that no pedestrian is present (step S32). That is, the LiDAR unit 144a scans the laser light in a state in which it is not tilted in the vertical direction.
- in this way, the actuator 146a is driven according to the presence of a pedestrian around the vehicle 101, and the tilt control of the LiDAR unit 144a is executed accordingly.
- in the present embodiment, the LiDAR control unit 1430a determines the presence of a pedestrian, but the camera control unit 1420a may instead determine the presence of a pedestrian based on the image data acquired from the camera 143a.
- the millimeter wave radar control unit 1440a may determine the presence of a pedestrian based on the detection data acquired from the millimeter wave radar 145a.
- FIG. 21 is a flowchart for explaining an example of a process of determining the maximum value of the tilt angle of the LiDAR unit 144a in the vertical direction according to the current speed V of the vehicle 101.
- FIG. 22A is a diagram showing a detection area S 120 of the LiDAR unit 144a in the vertical direction when the current speed V of the vehicle 101 is high.
- FIG. 22B is a diagram showing a detection area S 120 of the LiDAR unit 144a in the vertical direction when the current speed V of the vehicle 101 is low.
- in step S40, the vehicle control unit 103 specifies the current speed V of the vehicle 101 based on the data transmitted from the sensor 5 (vehicle speed sensor).
- next, the vehicle control unit 103 determines whether the current speed V is equal to or lower than the threshold speed Vth (step S41).
- the threshold speed Vth can be appropriately set according to the type of the vehicle 101 and the area (country, etc.) in which the vehicle 101 is traveling.
- the threshold speed Vth is, for example, 60 km/h.
- when the vehicle control unit 103 determines that the current speed V is higher than the threshold speed Vth (NO in step S41), the vehicle control unit 103 transmits information indicating that the current speed V is high to the actuator control unit 1460a of the control unit 140a.
- next, the actuator control unit 1460a sets the maximum value of the tilt angle of the LiDAR unit 144a to θmax1 according to the information indicating that the current speed V is high (step S42).
- the angular range ⁇ t1 of the detection area S 120 of the LiDAR unit 144a in the vertical direction is defined as in the following formula (2).
- the maximum value of the inclination angle of the LiDAR unit 144a in the front direction is set to ⁇ max1, and the maximum value of the maximum inclination angle of the LiDAR unit 144a in the rear direction is also set to ⁇ max1.
- the angle range of the detection area S 12 is set to .theta.1.
- Angle range ⁇ t1 ⁇ 1 + 2 ⁇ max1 (2)
- the actuator control unit 1460a drives and controls the actuator 146a such that the angle range of the detection range S21 in the vertical direction is ⁇ t1 (step S43).
- step S41 when vehicle control unit 103 determines that current speed V is equal to or lower than threshold speed Vth (YES in step S41), information indicating that current speed V is low is transmitted to actuator control unit 1460a.
- the actuator control unit 1460a sets the maximum value of the tilt angle of the LiDAR unit 144a to ⁇ max2 (> ⁇ max1) according to the information indicating that the current speed V is low (step S44).
- In this case, the angular range θt2 of the detection area S120 of the LiDAR unit 144a in the vertical direction is defined as in the following formula (3).
- Angular range θt2 = θ1 + 2θmax2 … (3)
- Next, the actuator control unit 1460a drives and controls the actuator 146a such that the angular range of the detection range S21 in the vertical direction becomes θt2 (> θt1) (step S45).
- As described above, according to the present embodiment, when the current speed V of the vehicle 101 is equal to or lower than the threshold speed Vth (that is, when the vehicle is traveling at low speed), the angular range of the detection area S120 of the LiDAR unit 144a in the vertical direction can be widened by increasing the maximum value of the tilt angle of the LiDAR unit 144a (that is, by tilting the LiDAR unit 144a largely with respect to the vertical direction).
- On the other hand, when the current speed V is higher than the threshold speed Vth (that is, when the vehicle is traveling at high speed), the scanning resolution of the LiDAR unit 144a can be increased by reducing the maximum value of the tilt angle of the LiDAR unit 144a.
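The speed-dependent selection of the maximum tilt angle (steps S40 to S45) together with formulas (2) and (3) can be summarized in the following minimal sketch. All numeric values other than the 60 km/h example threshold are assumptions chosen for illustration, not values taken from the disclosure.

```python
# Sketch of the speed-dependent tilt limits of FIG. 21 (steps S40-S45).
THETA_1 = 7.0      # θ1: base vertical angular range of detection area S12 (deg, assumed)
THETA_MAX_1 = 2.0  # θmax1: max tilt at high speed (deg, assumed)
THETA_MAX_2 = 5.0  # θmax2 (> θmax1): max tilt at low speed (deg, assumed)
V_TH = 60.0        # threshold speed Vth in km/h (example value from the text)


def max_tilt_angle(current_speed_kmh: float) -> float:
    # Step S41: compare the current speed V with the threshold speed Vth.
    if current_speed_kmh <= V_TH:
        # Steps S44-S45: at low speed, allow a large tilt to widen the
        # vertical detection area (angular range θt2 = θ1 + 2·θmax2).
        return THETA_MAX_2
    # Steps S42-S43: at high speed, keep the tilt small so the scanning
    # resolution stays high (angular range θt1 = θ1 + 2·θmax1).
    return THETA_MAX_1


def vertical_angular_range(current_speed_kmh: float) -> float:
    # Formulas (2)/(3): θt = θ1 + 2·θmax, since the unit tilts by at most
    # θmax both in the front direction and in the rear direction.
    return THETA_1 + 2.0 * max_tilt_angle(current_speed_kmh)


print(vertical_angular_range(80.0))  # high speed -> θ1 + 2·θmax1 = 11.0
print(vertical_angular_range(40.0))  # low speed  -> θ1 + 2·θmax2 = 17.0
```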
- FIG. 23A is a diagram showing the detection areas S22 and S23 in the horizontal direction of two LiDAR units (a first LiDAR unit 147a and a second LiDAR unit 148a) arranged in the vertical direction.
- FIG. 23B is a diagram showing the detection areas S22 and S23 in the vertical direction of the two LiDAR units 147a and 148a.
- FIG. 24 is a diagram showing functional blocks of the control unit 240a of the illumination system 204a.
- the illumination system 204a differs from the illumination system 104a already described in that two LiDAR units and two actuators are provided.
- the two LiDAR units 147a and 148a are arranged to overlap each other in top view.
- the two LiDAR units 147a and 148a are arranged side by side in the vertical direction.
- The whole of the first LiDAR unit 147a may be arranged to overlap the second LiDAR unit 148a, or only a part of the first LiDAR unit 147a may be arranged to overlap the second LiDAR unit 148a.
- The detection area S22 of the first LiDAR unit 147a is a detection area enlarged in the vertical direction, obtained by the tilt control of the first LiDAR unit 147a using the first actuator 149a.
- Likewise, the detection area S23 of the second LiDAR unit 148a is a detection area enlarged in the vertical direction, obtained by the tilt control of the second LiDAR unit 148a using the second actuator 150a.
- The angular ranges of the detection areas S22 and S23 in the horizontal direction coincide with each other.
- The angular ranges of the detection areas S22 and S23 in the vertical direction may coincide with each other or may differ.
- The control unit 240a is configured to control the respective operations of the illumination unit 142a, the millimeter wave radar 145a, the camera 143a, the first LiDAR unit 147a, the second LiDAR unit 148a, the first actuator 149a, and the second actuator 150a.
- the control unit 240a includes an illumination control unit 1410a, a millimeter wave radar control unit 1440a, a camera control unit 1420a, a LiDAR control unit 1435a, an actuator control unit 1465a, and a surrounding environment information fusion unit 1450a.
- the LiDAR control unit 1435a is configured to control the operation of the first LiDAR unit 147a and the second LiDAR unit 148a.
- The LiDAR control unit 1435a is configured to generate surrounding environment information in the detection area S22 of the first LiDAR unit 147a based on the point cloud data output from the first LiDAR unit 147a.
- Likewise, the LiDAR control unit 1435a is configured to generate surrounding environment information in the detection area S23 of the second LiDAR unit 148a based on the point cloud data output from the second LiDAR unit 148a.
- the first actuator 149a is configured to change the inclination angle of the first LiDAR unit 147a with respect to the vertical direction.
- the second actuator 150a is configured to change the inclination angle of the second LiDAR unit 148a with respect to the vertical direction.
- the two actuators 149a, 150a may have the same configuration as the actuator 146a shown in FIG. 12, and may include an electromagnetic solenoid and a shaft connected to the electromagnetic solenoid.
- the two LiDAR units 147a and 148a and the two actuators 149a and 150a are disposed in the space Sa formed by the housing 124a and the light transmitting cover 122a, as shown in FIG.
- the actuator control unit 1465a is configured to control the drive of the first actuator 149a and the second actuator 150a.
- the actuator control unit 1465a can determine the inclination angle of the first LiDAR unit 147a in the vertical direction by controlling the driving of the first actuator 149a.
- the actuator control unit 1465a can determine the tilt angle of the second LiDAR unit 148a with respect to the vertical direction by controlling the drive of the second actuator 150a.
- According to the present embodiment, the detection area in the vertical direction can be expanded by using the two LiDAR units 147a and 148a arranged to overlap each other in top view.
- Even when the vertical detection area of a single LiDAR unit is not wide enough, a sufficient detection area in the vertical direction can be secured by using the two detection areas S22 and S23.
- The detection areas S22 and S23 may partially overlap each other in the vertical direction.
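The following sketch illustrates, under assumed angle values, how the two vertical detection areas S22 and S23 combine into wider overall coverage when they partially overlap. Representing each detection area as a simple (low, high) angle interval is an illustrative simplification, not how the disclosure defines the areas.

```python
# Sketch: combined vertical coverage of two stacked LiDAR units.
def combined_vertical_coverage(area_1, area_2):
    """Total vertical coverage of two (lo, hi) angle intervals in degrees,
    counting any overlap only once."""
    lo1, hi1 = area_1
    lo2, hi2 = area_2
    overlap = max(0.0, min(hi1, hi2) - max(lo1, lo2))
    return (hi1 - lo1) + (hi2 - lo2) - overlap


# First LiDAR unit 147a aimed slightly upward, second unit 148a slightly
# downward; the two areas partially overlap in the vertical direction.
s22 = (-2.0, 8.0)   # detection area S22 after tilt control by actuator 149a (assumed)
s23 = (-8.0, 2.0)   # detection area S23 after tilt control by actuator 150a (assumed)
print(combined_vertical_coverage(s22, s23))  # 16.0 degrees in this example
```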
- Similarly, each of the right front lighting system, the left rear lighting system, and the right rear lighting system may have two LiDAR units and two actuators.
- FIG. 25A is a diagram showing an example of the detection area S25 of the LiDAR unit 152a when the vehicle 101 enters an intersection.
- FIG. 25B is a diagram showing the detection area S25 of the LiDAR unit 152a while the vehicle 101 is traveling straight ahead.
- the illumination system 104a comprises two LiDAR units.
- One of the two LiDAR units is a LiDAR unit 144a configured to detect the surrounding environment in the front area of the vehicle 101.
- the other of the two LiDAR units is a LiDAR unit 152a configured to detect the surrounding environment in the side area of the vehicle 101.
- An actuator (not shown) is configured to rotate the LiDAR unit 152a in the horizontal direction.
- The actuator control unit 1460a (see FIG. 10) can determine the horizontal direction of the exit surface of the LiDAR unit 152a by controlling the drive of the actuator.
- That is, by controlling the drive of the actuator, the actuator control unit 1460a can move the detection area S25 of the LiDAR unit 152a.
- the two LiDAR units 144a and 152a are disposed in the space Sa formed by the housing 124a and the light transmitting cover 122a, as shown in FIG.
- While the vehicle 101 is traveling straight ahead (FIG. 25B), the actuator control unit 1460a controls the actuator such that the exit surface (and hence the detection area S25) of the LiDAR unit 152a faces the left rear side of the vehicle 101.
- In this case, the LiDAR unit 152a can detect the surrounding environment of the vehicle 101 in the left rear area of the vehicle 101.
- On the other hand, when the vehicle 101 enters an intersection (FIG. 25A), the actuator control unit 1460a controls the actuator such that the exit surface (and hence the detection area S25) of the LiDAR unit 152a faces the left front side of the vehicle 101.
- In this case, the LiDAR unit 152a can detect the surrounding environment of the vehicle 101 in the left front area of the vehicle 101. When the vehicle 101 turns left, the surrounding environment information in the left front area of the vehicle 101 becomes more important, so the actuator is controlled such that the exit surface of the LiDAR unit 152a faces the left front side of the vehicle 101.
- In this way, it is possible to acquire optimal surrounding environment information according to the situation of the vehicle 101.
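A minimal sketch of this orientation control follows. The maneuver classification and the `set_horizontal_direction` interface are assumptions made for illustration; the disclosure only specifies that the actuator rotates the LiDAR unit 152a in the horizontal direction under the control of the actuator control unit 1460a.

```python
# Illustrative sketch of orienting the side LiDAR unit 152a (FIGS. 25A/25B).
from enum import Enum, auto


class Maneuver(Enum):
    STRAIGHT = auto()
    ENTER_INTERSECTION_LEFT = auto()


def orient_side_lidar(maneuver: Maneuver, actuator) -> None:
    if maneuver is Maneuver.ENTER_INTERSECTION_LEFT:
        # When turning left at an intersection, information on the left
        # front area matters most, so face detection area S25 left-forward.
        actuator.set_horizontal_direction("left_front")
    else:
        # While traveling straight, monitor the left rear area instead.
        actuator.set_horizontal_direction("left_rear")
```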
- Similarly, the lighting system 104b may have a LiDAR unit configured to detect the surrounding environment in the front area of the vehicle 101 and a LiDAR unit configured to detect the surrounding environment in the side area of the vehicle 101.
- Likewise, each of the lighting systems 104c and 104d may be provided with a LiDAR unit configured to detect the surrounding environment in the rear area of the vehicle 101 and a LiDAR unit configured to detect the surrounding environment in the side area of the vehicle 101.
- In the above embodiments, the driving mode of the vehicle has been described as including the fully automatic driving mode, the advanced driving support mode, the driving support mode, and the manual driving mode, but the driving mode of the vehicle should not be limited to these four modes. The classification of the driving mode of the vehicle may be changed as appropriate in accordance with the laws or regulations relating to automatic driving in each country. Similarly, the definitions of "fully automatic driving mode", "advanced driving support mode", and "driving support mode" described in the present embodiments are merely examples, and these definitions may be changed as appropriate in accordance with the laws or regulations relating to automatic driving in each country.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Transportation (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Mathematical Physics (AREA)
- Computing Systems (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Signal Processing (AREA)
- Medical Informatics (AREA)
- Traffic Control Systems (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Lighting Device Outwards From Vehicle And Optical Signal (AREA)
Abstract
Description
The LiDAR control unit is configured to control the LiDAR unit such that the scanning resolution of the LiDAR unit increases in a first angular region, within the detection area of the LiDAR unit, in which the object is present.
The LiDAR control unit may be configured to:
update the position of the object based on point cloud data newly acquired from the LiDAR unit, and
update the first angular region based on the updated position of the object.
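As a rough sketch of this adaptive-resolution behavior, assuming a hypothetical LiDAR interface (the names `set_angular_resolution` and `full_detection_area` are invented for illustration and are not part of the disclosure), the high-resolution region can be re-centered on the object each time its position is updated:

```python
# Sketch: concentrate scanning resolution in the first angular region.
def update_high_resolution_region(lidar, object_azimuth_deg, half_width_deg=5.0):
    """Re-center the high-resolution angular region on the detected object.

    `object_azimuth_deg` is the object's direction as seen from the LiDAR
    unit; the region spans ±half_width_deg around it.
    """
    region = (object_azimuth_deg - half_width_deg,
              object_azimuth_deg + half_width_deg)
    # Coarse scan over the whole detection area first...
    lidar.set_angular_resolution(lidar.full_detection_area(), step_deg=0.5)
    # ...then densify the first angular region where the object sits.
    lidar.set_angular_resolution(region, step_deg=0.1)
    return region
```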
The sensing system includes:
a LiDAR unit configured to acquire point cloud data indicating the surrounding environment of the vehicle;
a LiDAR control unit configured to acquire, based on the point cloud data acquired from the LiDAR unit, surrounding environment information indicating the surrounding environment of the vehicle;
an actuator configured to change the tilt angle of the LiDAR unit with respect to the vertical direction of the vehicle; and
an actuator control unit configured to control the drive of the actuator.
When the tilt angle of the LiDAR unit is a second tilt angle different from the first tilt angle, the LiDAR unit may acquire a second frame of the point cloud data.
The LiDAR control unit may acquire the surrounding environment information based on the first frame and the second frame.
The tilt angle of the LiDAR unit may be gradually changed at a first angular interval within a predetermined angular region with respect to the horizontal direction of the vehicle, and
gradually changed at a second angular interval larger than the first angular interval outside the predetermined angular region.
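A sketch of such a two-pitch tilt schedule follows; all angle values are assumptions chosen for illustration. Fine steps near the horizontal concentrate measurement points where distant objects appear, while coarse steps cover the rest of the tilt range quickly.

```python
# Sketch: two-pitch tilt schedule (fine near horizontal, coarse elsewhere).
def tilt_schedule(theta_max, boundary, fine_step, coarse_step):
    """Yield tilt angles from -theta_max to +theta_max, stepping by
    fine_step inside ±boundary of the horizontal and coarse_step outside."""
    angle = -theta_max
    while angle <= theta_max:
        yield angle
        step = fine_step if abs(angle) < boundary else coarse_step
        angle += step


# Example: within ±2° of horizontal, move in 0.5° increments (the first
# angular interval); outside it, in 1.5° increments (the second, larger one).
print(list(tilt_schedule(theta_max=6.0, boundary=2.0,
                         fine_step=0.5, coarse_step=1.5)))
```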
A first LiDAR unit and a second LiDAR unit may be provided.
In top view, the first LiDAR unit and the second LiDAR unit may be arranged so as to overlap each other.
The actuator may include:
a first actuator configured to change the tilt angle of the first LiDAR unit with respect to the vertical direction, and
a second actuator configured to change the tilt angle of the second LiDAR unit with respect to the vertical direction.
Hereinafter, a first embodiment of the present disclosure will be described with reference to the drawings. For convenience of description, members having the same reference numerals as members already described in the description of the present embodiment will not be described again. In addition, for convenience of description, the dimensions of the members shown in the drawings may differ from the actual dimensions of the members.
Time of flight ΔT1 = time t1 at which the laser beam (light pulse) returned to the LiDAR unit − time t0 at which the LiDAR unit emitted the laser beam (light pulse)
In this way, the LiDAR unit 44a can acquire point cloud data (3D mapping data) indicating the surrounding environment of the vehicle 1.
Time of flight ΔT2 = time t3 at which the millimeter wave returned to the millimeter wave radar − time t2 at which the millimeter wave radar emitted the millimeter wave
In addition, the millimeter wave radar 45a can acquire information on the relative speed V, with respect to the millimeter wave radar 45a (the vehicle 1), of an object existing outside the vehicle 1, based on the frequency f0 of the millimeter wave emitted from the millimeter wave radar 45a and the frequency f1 of the millimeter wave returned to the millimeter wave radar 45a.
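For illustration, the distance then follows as (speed of light) × ΔT1 / 2, and the relative speed from the two-way Doppler shift between f0 and f1. The following sketch assumes the standard non-relativistic formulas, which the disclosure itself does not spell out.

```python
# Worked sketch of the time-of-flight and Doppler relations described above.
C = 299_792_458.0  # speed of light in m/s


def tof_distance(t_emit: float, t_return: float) -> float:
    """Distance from flight time ΔT1 = t_return - t_emit; the light travels
    to the object and back, hence the factor 1/2."""
    delta_t = t_return - t_emit
    return C * delta_t / 2.0


def doppler_relative_speed(f0: float, f1: float) -> float:
    """Relative speed of the object from the emitted frequency f0 and the
    returned frequency f1, using the non-relativistic two-way Doppler shift
    (positive = approaching)."""
    return C * (f1 - f0) / (2.0 * f0)


# Example: an echo arriving 400 ns after emission lies about 60 m away.
print(tof_distance(0.0, 400e-9))  # ≈ 59.96 m
```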
These may have similar functions and configurations.
Hereinafter, a second embodiment of the present disclosure will be described with reference to the drawings. For convenience of description, members having the same reference numerals as members described in the first embodiment will not be described again. In addition, for convenience of description, the dimensions of the members shown in the drawings may differ from the actual dimensions of the members.
Time of flight ΔT1 = time t1 at which the laser beam (light pulse) returned to the LiDAR unit − time t0 at which the LiDAR unit emitted the laser beam (light pulse)
In this way, the LiDAR unit 144a can acquire point cloud data (3D mapping data) indicating the surrounding environment of the vehicle 101.
Time of flight ΔT2 = time t3 at which the millimeter wave returned to the millimeter wave radar − time t2 at which the millimeter wave radar emitted the millimeter wave
In addition, the millimeter wave radar 145a can acquire information on the relative speed V, with respect to the millimeter wave radar 145a (the vehicle 101), of an object existing outside the vehicle 101, based on the frequency f0 of the millimeter wave emitted from the millimeter wave radar 145a and the frequency f1 of the millimeter wave returned to the millimeter wave radar 145a.
Angular range θt = θ1 + θ2 + θ3 … (1)
In this way, the angular range θt of the detection area S120 in the vertical direction is larger than the angular range θ1 of the detection area S12 in the vertical direction by (θ2 + θ3).
Angular range θt1 = θ1 + 2θmax1 … (2)
Next, the actuator control unit 1460a drives and controls the actuator 146a such that the angular range of the detection range S21 in the vertical direction becomes θt1 (step S43).
Angular range θt2 = θ1 + 2θmax2 … (3)
Next, the actuator control unit 1460a drives and controls the actuator 146a such that the angular range of the detection range S21 in the vertical direction becomes θt2 (> θt1) (step S45).
Claims (15)
- A sensing system provided in a vehicle capable of traveling in an automatic driving mode, the sensing system comprising:
a LiDAR unit configured to acquire point cloud data indicating the surrounding environment of the vehicle; and
a LiDAR control unit configured to specify, based on the point cloud data acquired from the LiDAR unit, information associated with an object present around the vehicle,
wherein the LiDAR control unit is configured to control the LiDAR unit such that the scanning resolution of the LiDAR unit increases in a first angular region, within the detection area of the LiDAR unit, in which the object is present. - The sensing system according to claim 1, wherein the LiDAR control unit controls the LiDAR unit such that the scanning resolution of the LiDAR unit in the first angular region increases when the attribute of the object cannot be specified based on the point cloud data acquired from the LiDAR unit.
- The sensing system according to claim 2, wherein the LiDAR control unit is configured to control the LiDAR unit such that the scanning resolution of the LiDAR unit in the first angular region gradually increases until the attribute of the object becomes specifiable.
- The sensing system according to any one of claims 1 to 3, wherein the LiDAR control unit is configured to:
update the position of the object based on point cloud data newly acquired from the LiDAR unit; and
update the first angular region based on the updated position of the object. - A vehicle capable of traveling in an automatic driving mode, comprising the sensing system according to any one of claims 1 to 4.
- A sensing system provided in a vehicle capable of traveling in an automatic driving mode, the sensing system comprising:
a LiDAR unit configured to acquire point cloud data indicating the surrounding environment of the vehicle;
a LiDAR control unit configured to acquire, based on the point cloud data acquired from the LiDAR unit, surrounding environment information indicating the surrounding environment of the vehicle;
an actuator configured to change the tilt angle of the LiDAR unit with respect to the vertical direction of the vehicle; and
an actuator control unit configured to control the drive of the actuator. - The sensing system according to claim 6, wherein, when the tilt angle of the LiDAR unit is a first tilt angle, the LiDAR unit acquires a first frame of the point cloud data,
when the tilt angle of the LiDAR unit is a second tilt angle different from the first tilt angle, the LiDAR unit acquires a second frame of the point cloud data, and
the LiDAR control unit acquires the surrounding environment information based on the first frame and the second frame. - The sensing system according to claim 7, wherein the actuator control unit does not drive the actuator during a first scanning time of the LiDAR unit in which the first frame is acquired and during a second scanning time of the LiDAR unit in which the second frame is acquired.
- The sensing system according to claim 7, wherein the actuator control unit drives the actuator during a first scanning time of the LiDAR unit in which the first frame is acquired and during a second scanning time of the LiDAR unit in which the second frame is acquired.
- The sensing system according to any one of claims 6 to 9, wherein the actuator control unit is configured to determine, according to the current position of the vehicle, whether to drive the actuator.
- The sensing system according to any one of claims 6 to 10, wherein the actuator control unit is configured to determine the maximum value of the tilt angle of the LiDAR unit according to the current speed of the vehicle.
- The sensing system according to any one of claims 6 to 11, wherein the actuator control unit is configured to drive the actuator in response to detection of a pedestrian present around the vehicle.
- The sensing system according to any one of claims 6 to 12, wherein the actuator is configured to:
gradually change the tilt angle of the LiDAR unit at a first angular interval within a predetermined angular region with respect to the horizontal direction of the vehicle; and
gradually change the tilt angle of the LiDAR unit at a second angular interval larger than the first angular interval outside the predetermined angular region. - The sensing system according to any one of claims 6 to 9, wherein the LiDAR unit comprises:
a first LiDAR unit and a second LiDAR unit,
the first LiDAR unit and the second LiDAR unit being arranged so as to overlap each other in top view, and
wherein the actuator comprises:
a first actuator configured to change the tilt angle of the first LiDAR unit with respect to the vertical direction; and
a second actuator configured to change the tilt angle of the second LiDAR unit with respect to the vertical direction. - A vehicle capable of traveling in an automatic driving mode, comprising the sensing system according to any one of claims 6 to 14.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/758,459 US11554792B2 (en) | 2017-10-26 | 2018-10-24 | Sensing system and vehicle |
| EP18870978.6A EP3703031B1 (en) | 2017-10-26 | 2018-10-24 | Sensing system and vehicle |
| JP2019551198A JP7152413B2 (ja) | 2017-10-26 | 2018-10-24 | センシングシステム及び車両 |
| US18/077,334 US12077187B2 (en) | 2017-10-26 | 2022-12-08 | Sensing system and vehicle |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017-207499 | 2017-10-26 | ||
| JP2017207499 | 2017-10-26 | ||
| JP2017207500 | 2017-10-26 | ||
| JP2017-207500 | 2017-10-26 |
Related Child Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/758,459 A-371-Of-International US11554792B2 (en) | 2017-10-26 | 2018-10-24 | Sensing system and vehicle |
| US18/077,334 Division US12077187B2 (en) | 2017-10-26 | 2022-12-08 | Sensing system and vehicle |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019082926A1 (ja) | 2019-05-02 |
Family
ID=66247867
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/039485 Ceased WO2019082926A1 (ja) | 2017-10-26 | 2018-10-24 | センシングシステム及び車両 |
Country Status (5)
| Country | Link |
|---|---|
| US (2) | US11554792B2 (ja) |
| EP (1) | EP3703031B1 (ja) |
| JP (1) | JP7152413B2 (ja) |
| CN (1) | CN109709530A (ja) |
| WO (1) | WO2019082926A1 (ja) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2021179388A (ja) * | 2020-05-15 | 2021-11-18 | 日立Astemo株式会社 | 物体検知装置 |
| JP2022013133A (ja) * | 2020-07-03 | 2022-01-18 | トヨタ自動車株式会社 | レーザレーダ取付構造 |
| WO2022219959A1 (ja) * | 2021-04-15 | 2022-10-20 | 株式会社デンソー | センシングモジュール、センシングシステム |
| WO2022219961A1 (ja) * | 2021-04-15 | 2022-10-20 | 株式会社デンソー | センシング制御装置、センシング制御方法、センシング制御プログラム |
| JP2023005500A (ja) * | 2021-06-29 | 2023-01-18 | コイト電工株式会社 | 検知装置 |
| CN116331229A (zh) * | 2023-03-13 | 2023-06-27 | 东南大学 | 一种智能网联电动汽车底盘系统平台及运行方法 |
| US12413700B2 (en) | 2021-05-13 | 2025-09-09 | Maxell, Ltd. | Image display apparatus and image display system |
Families Citing this family (24)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7136716B2 (ja) * | 2019-02-08 | 2022-09-13 | 日立Astemo株式会社 | 電子制御装置、判定方法 |
| WO2020195376A1 (ja) * | 2019-03-27 | 2020-10-01 | 日本電気株式会社 | 監視装置、不審オブジェクト検出方法、および記録媒体 |
| US11204417B2 (en) * | 2019-05-08 | 2021-12-21 | GM Global Technology Operations LLC | Selective attention mechanism for improved perception sensor performance in vehicular applications |
| US11315431B2 (en) * | 2019-10-09 | 2022-04-26 | Uatc, Llc | Systems and methods for autonomous vehicle controls |
| CN112651266B (zh) * | 2019-10-11 | 2024-08-06 | 阿波罗智能技术(北京)有限公司 | 行人检测方法和装置 |
| US11436876B2 (en) * | 2019-11-01 | 2022-09-06 | GM Global Technology Operations LLC | Systems and methods for diagnosing perception systems of vehicles based on temporal continuity of sensor data |
| US11493922B1 (en) | 2019-12-30 | 2022-11-08 | Waymo Llc | Perimeter sensor housings |
| US11557127B2 (en) | 2019-12-30 | 2023-01-17 | Waymo Llc | Close-in sensing camera system |
| JP7207366B2 (ja) * | 2020-05-19 | 2023-01-18 | トヨタ自動車株式会社 | 車載表示システム |
| KR20220082551A (ko) * | 2020-12-10 | 2022-06-17 | 현대자동차주식회사 | 차량 |
| US20240168135A1 (en) * | 2021-03-26 | 2024-05-23 | Pioneer Corporation | Sensor device, control device, control method, program, and storage medium |
| CN115201804B (zh) * | 2021-04-12 | 2025-08-08 | 武汉智行者科技有限公司 | 目标速度估计方法、装置以及存储介质 |
| CN114375406B (zh) * | 2021-04-12 | 2025-05-06 | 武汉智行者科技有限公司 | 目标速度估计方法、装置以及存储介质 |
| US11709260B2 (en) * | 2021-04-30 | 2023-07-25 | Zoox, Inc. | Data driven resolution function derivation |
| CN113276846B (zh) * | 2021-05-25 | 2022-11-01 | 华域汽车系统股份有限公司 | 一种后方十字交通报警系统及方法 |
| CN115691153A (zh) * | 2021-07-30 | 2023-02-03 | 武汉万集光电技术有限公司 | 一种感知系统的感知方法、感知系统、设备及存储介质 |
| WO2023093981A1 (en) * | 2021-11-24 | 2023-06-01 | Volkswagen Aktiengesellschaft | Method to organize data traffic affecting a vehicle |
| CN114348018A (zh) * | 2021-12-17 | 2022-04-15 | 际络科技(上海)有限公司 | 商用车辆的自动驾驶系统及方法 |
| US12114084B2 (en) | 2022-03-08 | 2024-10-08 | Nec Corporation Of America | Image based localization |
| US12067805B2 (en) | 2022-03-08 | 2024-08-20 | Nec Corporation Of America | Facial gesture recognition in SWIR images |
| US12198375B2 (en) * | 2022-03-08 | 2025-01-14 | Nec Corporation Of America | Image analysis for controlling movement of an object |
| US12181695B2 (en) | 2022-03-08 | 2024-12-31 | Nec Corporation Of America | Retroreflector |
| CN114779252A (zh) * | 2022-03-09 | 2022-07-22 | 苏州豪米波技术有限公司 | 毫米波雷达多模式极近距离探测系统以及探测方法 |
| CN116794666A (zh) * | 2022-03-17 | 2023-09-22 | 北京图森智途科技有限公司 | 数据采集装置和用于确定传感器的姿态的方法 |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH09277887A (ja) | 1996-04-16 | 1997-10-28 | Honda Motor Co Ltd | 自動追従走行システム |
| JP2012063230A (ja) * | 2010-09-15 | 2012-03-29 | Ricoh Co Ltd | レーザレーダ装置 |
| JP2013029375A (ja) * | 2011-07-27 | 2013-02-07 | Ihi Corp | 障害物検出方法及び障害物検出装置 |
| JP2015175644A (ja) * | 2014-03-13 | 2015-10-05 | 株式会社リコー | 測距システム、情報処理装置、情報処理方法及びプログラム |
| JP2016522886A (ja) * | 2013-04-11 | 2016-08-04 | グーグル インコーポレイテッド | 車載センサを用いて気象状態を検出する方法及びシステム |
| JP2017207499A (ja) | 2005-12-22 | 2017-11-24 | アボツト・モレキユラー・インコーポレイテツド | 肺がんへの傾向についてのスクリーニングのための方法およびマーカー組合せ |
| JP2017207500A (ja) | 2015-01-30 | 2017-11-24 | 株式会社東京精密 | 三次元座標測定装置 |
Family Cites Families (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9383753B1 (en) * | 2012-09-26 | 2016-07-05 | Google Inc. | Wide-view LIDAR with areas of special attention |
| US9632210B2 (en) | 2013-05-07 | 2017-04-25 | Google Inc. | Methods and systems for detecting weather conditions using vehicle onboard sensors |
| US8983705B2 (en) | 2013-04-30 | 2015-03-17 | Google Inc. | Methods and systems for detecting weather conditions including fog using vehicle onboard sensors |
| US9207323B2 (en) | 2013-04-11 | 2015-12-08 | Google Inc. | Methods and systems for detecting weather conditions including wet surfaces using vehicle onboard sensors |
| US9025140B2 (en) | 2013-05-07 | 2015-05-05 | Google Inc. | Methods and systems for detecting weather conditions including sunlight using vehicle onboard sensors |
| US10247854B2 (en) | 2013-05-07 | 2019-04-02 | Waymo Llc | Methods and systems for detecting weather conditions using vehicle onboard sensors |
| DE102015201317B4 (de) | 2015-01-27 | 2025-03-20 | Bayerische Motoren Werke Aktiengesellschaft | Vermessen einer Abmessung auf einer Oberfläche |
| US10509110B2 (en) * | 2015-12-29 | 2019-12-17 | The Boeing Company | Variable resolution light radar system |
| WO2017168576A1 (ja) * | 2016-03-29 | 2017-10-05 | パイオニア株式会社 | 光制御装置、光制御方法およびプログラム |
| KR101877388B1 (ko) * | 2016-07-21 | 2018-07-11 | 엘지전자 주식회사 | 차량용 라이다 장치 |
| US20180136314A1 (en) * | 2016-11-15 | 2018-05-17 | Wheego Electric Cars, Inc. | Method and system for analyzing the distance to an object in an image |
| CN106596856A (zh) * | 2016-12-09 | 2017-04-26 | 山东理工大学 | 一种基于激光雷达和摄影测量的车辆污染物排放量实时监测方法 |
| CN209471245U (zh) * | 2017-10-26 | 2019-10-08 | 株式会社小糸制作所 | 传感系统及车辆 |
| CN209471244U (zh) * | 2017-10-26 | 2019-10-08 | 株式会社小糸制作所 | 传感系统及车辆 |
-
2018
- 2018-10-18 CN CN201811213608.7A patent/CN109709530A/zh active Pending
- 2018-10-24 US US16/758,459 patent/US11554792B2/en active Active
- 2018-10-24 WO PCT/JP2018/039485 patent/WO2019082926A1/ja not_active Ceased
- 2018-10-24 JP JP2019551198A patent/JP7152413B2/ja active Active
- 2018-10-24 EP EP18870978.6A patent/EP3703031B1/en active Active
-
2022
- 2022-12-08 US US18/077,334 patent/US12077187B2/en active Active
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH09277887A (ja) | 1996-04-16 | 1997-10-28 | Honda Motor Co Ltd | 自動追従走行システム |
| JP2017207499A (ja) | 2005-12-22 | 2017-11-24 | アボツト・モレキユラー・インコーポレイテツド | 肺がんへの傾向についてのスクリーニングのための方法およびマーカー組合せ |
| JP2012063230A (ja) * | 2010-09-15 | 2012-03-29 | Ricoh Co Ltd | レーザレーダ装置 |
| JP2013029375A (ja) * | 2011-07-27 | 2013-02-07 | Ihi Corp | 障害物検出方法及び障害物検出装置 |
| JP2016522886A (ja) * | 2013-04-11 | 2016-08-04 | グーグル インコーポレイテッド | 車載センサを用いて気象状態を検出する方法及びシステム |
| JP2015175644A (ja) * | 2014-03-13 | 2015-10-05 | 株式会社リコー | 測距システム、情報処理装置、情報処理方法及びプログラム |
| JP2017207500A (ja) | 2015-01-30 | 2017-11-24 | 株式会社東京精密 | 三次元座標測定装置 |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3703031A4 |
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2021179388A (ja) * | 2020-05-15 | 2021-11-18 | 日立Astemo株式会社 | 物体検知装置 |
| JP7489231B2 (ja) | 2020-05-15 | 2024-05-23 | 日立Astemo株式会社 | 物体検知装置 |
| JP2022013133A (ja) * | 2020-07-03 | 2022-01-18 | トヨタ自動車株式会社 | レーザレーダ取付構造 |
| JP7294256B2 (ja) | 2020-07-03 | 2023-06-20 | トヨタ自動車株式会社 | レーザレーダ取付方法 |
| WO2022219959A1 (ja) * | 2021-04-15 | 2022-10-20 | 株式会社デンソー | センシングモジュール、センシングシステム |
| WO2022219961A1 (ja) * | 2021-04-15 | 2022-10-20 | 株式会社デンソー | センシング制御装置、センシング制御方法、センシング制御プログラム |
| US12413700B2 (en) | 2021-05-13 | 2025-09-09 | Maxell, Ltd. | Image display apparatus and image display system |
| JP2023005500A (ja) * | 2021-06-29 | 2023-01-18 | コイト電工株式会社 | 検知装置 |
| JP7811092B2 (ja) | 2021-06-29 | 2026-02-04 | コイト電工株式会社 | 検知装置 |
| CN116331229A (zh) * | 2023-03-13 | 2023-06-27 | 东南大学 | 一种智能网联电动汽车底盘系统平台及运行方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP7152413B2 (ja) | 2022-10-12 |
| EP3703031A4 (en) | 2021-12-08 |
| CN109709530A (zh) | 2019-05-03 |
| US12077187B2 (en) | 2024-09-03 |
| EP3703031B1 (en) | 2025-08-13 |
| US11554792B2 (en) | 2023-01-17 |
| JPWO2019082926A1 (ja) | 2020-12-03 |
| US20230105832A1 (en) | 2023-04-06 |
| US20200255030A1 (en) | 2020-08-13 |
| EP3703031A1 (en) | 2020-09-02 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7152413B2 (ja) | センシングシステム及び車両 | |
| JP7222892B2 (ja) | 車両用照明システム、車両システム及び車両 | |
| JP7235659B2 (ja) | 車両用照明システム及び車両 | |
| EP3888965B1 (en) | Head-up display, vehicle display system, and vehicle display method | |
| JP6970612B2 (ja) | 車両用照明システム及び車両 | |
| US20220073035A1 (en) | Dirt detection system, lidar unit, sensing system for vehicle, and vehicle | |
| US20190248281A1 (en) | Vehicle illumination system and vehicle | |
| JP7331083B2 (ja) | 車両用センシングシステム及び車両 | |
| CN113557386A (zh) | 车辆用灯具及车辆 | |
| US20250020304A1 (en) | Road surface drawing lamp and road surface drawing lamp system | |
| US20220009406A1 (en) | Vehicle lighting system | |
| CN110271480B (zh) | 车辆系统 | |
| CN209471245U (zh) | 传感系统及车辆 | |
| CN209471244U (zh) | 传感系统及车辆 | |
| CN113454418A (zh) | 车辆用传感系统及车辆 | |
| JP7340607B2 (ja) | 車両用照明システム、車両システム及び車両 | |
| JP2020082748A (ja) | 車両用照明システム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18870978; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2019551198; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2018870978; Country of ref document: EP; Effective date: 20200526 |
| | WWG | Wipo information: grant in national office | Ref document number: 2018870978; Country of ref document: EP |