WO2013134956A1 - Navigation system and method for different mobility modes - Google Patents
Navigation system and method for different mobility modes
- Publication number: WO2013134956A1
- Application number: PCT/CN2012/072465
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- navigation data
- navigation
- mobility mode
- vehicle
- directions
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
- G01C21/3423—Multimodal routing
Definitions
- Figure 4 is a flowchart illustrating a navigation technique 400.
- the direction logic 300 receives (410), with the transmission logic 330, from the computer 112 or other network 100 node, an origination, a destination and IDs 134 and 144.
- the direction logic 300 calculates (420) navigation data based on mobility mode and transmits (430), with the transmission logic 330, the navigation data to the navigation unit 132 and/or the mobile device 140.
- the transmission (430) can start at vehicle 130 ignition or in advance (e.g., after the receiving (410)).
- the navigation unit 132 starts to detect the driving condition (e.g., location) when navigation is started. In one embodiment, the navigation unit 132 starts to navigate automatically when the user starts the vehicle 130. In another embodiment, the navigation unit 132 asks the user whether he/she wants to navigate with the preselected route and destination, e.g., by providing a digital option button on the screen of the navigation unit 132 or an acoustic inquiry. The navigation unit 132 determines the driving condition by receiving data from sensors in different parts of the vehicle 130.
- in case the vehicle 130 is parked at a current point (CP) before the destination point (DP), the navigation unit 132 will automatically resume navigating the route from the current point (CP) to the destination point (DP) when the vehicle 130 is restarted.
- the navigation unit 132 is further configured to ask whether the user wants to navigate the remaining route from the current point (CP) to the destination point (DP) by providing a digital option button on the screen or an acoustic inquiry.
- the mode logic 320 then optionally receives (440) or determines a mobility mode change and, optionally, a new origination.
- the direction logic 300 then calculates (450) navigation data based on the changed mobility mode and transmits (460) it to the mobile device 140.
- the method 400 then ends.
- parts of the technique 400 can be carried out in a different order or substantially simultaneously. For example, navigation data for the different mobility modes can be calculated together and transmitted at approximately the same time to the navigation unit 132 and the mobile device 140. Further, directions can be calculated and transmitted by the direction logic 300 and/or by the devices 132 and/or 140 themselves.
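The flow of technique 400 above can be sketched as a short driver function. This is a minimal illustration only; the numbered comments mirror the reference numerals (410–460), and all function and variable names are assumptions, not taken from the patent:

```python
# Minimal sketch of technique 400. Collaborator callables stand in for
# the direction logic 300, transmission logic 330 and mode logic 320.

def technique_400(receive, calculate, transmit, wait_for_mode_change):
    origin, dest, ids = receive()                              # (410)
    transmit(ids, calculate("driving", origin, dest))          # (420)-(430)
    change = wait_for_mode_change()                            # (440), optional
    if change is not None:
        new_mode, new_origin = change
        transmit(ids, calculate(new_mode, new_origin, dest))   # (450)-(460)

# Stub collaborators standing in for the network 100 nodes.
sent = []
technique_400(
    receive=lambda: ((0.0, 0.0), (1.0, 1.0), ["nav-134", "mob-144"]),
    calculate=lambda mode, o, d: f"{mode} directions from {o} to {d}",
    transmit=lambda ids, data: sent.append((tuple(ids), data)),
    wait_for_mode_change=lambda: ("walking", (0.9, 0.9)),
)
```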
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Description
NAVIGATION SYSTEM AND METHOD FOR DIFFERENT MOBILITY MODES
FIELD OF THE INVENTION
[0001] At least one embodiment of the present invention pertains to navigation, and more particularly, to a system and method for providing seamless navigation across various devices as a function of mobility mode.
BACKGROUND
[0002] Conventional navigation systems provide navigation solutions based on a single mobility mode and not for transitions between mobility modes. For example, a conventional navigation system may provide directions for driving a car but not for walking, or not for a trip that is partly driving and partly walking. Directions for walking can differ significantly from driving directions, since routes may be available to a pedestrian that are unavailable to cars. For example, a pedestrian can walk on footpaths, or against vehicular traffic on a sidewalk adjacent to a one-way street.
[0003] Accordingly, a new system and method may be needed to provide navigation across multiple mobility modes.
SUMMARY
[0004] A system and method enable calculation of navigation data as a function of mobility mode and transmission of the calculated data to the appropriate devices, thereby helping a person reach his/her final desired location. In an embodiment, the method calculates first navigation data based on a first mobility mode at a first device; receives an indication of a mobility mode change to a second mobility mode; calculates second navigation data based on the second mobility mode; and then transmits the second navigation data to a second device. The first navigation data calculation can be performed between the first device and a server. The second navigation data can be derived directly from the server.
[0005] In an embodiment, the system comprises mode change indication logic ("mode logic"), direction logic and transmission logic. The mode logic receives an indication of a mobility mode change from a first mobility mode to a second mobility mode. The direction logic calculates first navigation data based on the first mobility mode and second navigation data based on the second mobility mode. The transmission logic transmits the second navigation data to a second device.
[0006] The calculations can be done at the first device (e.g., a navigation system installed in a vehicle), the second device (e.g., a mobile device), and/or at a third device (e.g., server). The navigation data can include directions, a destination, and/or an origination. The first navigation data may include driving directions while the second navigation data may include walking directions. The first device can be installed in a vehicle while the second device may be a mobile device (e.g., mobile phone). The mobility mode change indication can include an indication that the vehicle has been placed in Park and/or that the first device has been powered off (e.g., no further signals are received from the first device).
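The two-leg flow summarized above can be sketched roughly as follows. The patent does not specify an implementation; the `MODE_*` constants, `NavigationData` class and all function names here are illustrative assumptions:

```python
from dataclasses import dataclass

MODE_DRIVING = "driving"
MODE_WALKING = "walking"

@dataclass
class NavigationData:
    mode: str
    origin: tuple
    destination: tuple
    directions: list

def calculate_navigation_data(mode, origin, destination):
    """Placeholder direction calculation as a function of mobility mode."""
    verb = "drive" if mode == MODE_DRIVING else "walk"
    return NavigationData(mode, origin, destination,
                          [f"{verb} from {origin} toward {destination}"])

def on_mode_change(first_nav, new_mode, new_origin):
    """Recalculate when a mobility mode change is indicated, keeping the
    original final destination for the second leg."""
    return calculate_navigation_data(new_mode, new_origin,
                                     first_nav.destination)

# The first device (in-vehicle unit) gets the driving leg ...
driving = calculate_navigation_data(MODE_DRIVING, (39.90, 116.40), (39.95, 116.46))
# ... the vehicle is parked, and the walking leg goes to the second device.
walking = on_mode_change(driving, MODE_WALKING, (39.94, 116.45))
```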
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] One or more embodiments of the present invention are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements.
[0008] Figure 1 is a diagram illustrating a network according to an embodiment of the invention.
[0009] Figure 2 is a high-level block diagram showing an example of the architecture of a client, server and/or navigation unit of Figure 1.
[0010] Figure 3 is a block diagram showing contents of the direction system of Figure 1.
[0011] Figure 4 is a flowchart illustrating a navigation technique.
DETAILED DESCRIPTION
[0012] References in this description to "an embodiment", "one embodiment", or the like, mean that the particular feature, function, structure or characteristic being described
is included in at least one embodiment of the present invention. Occurrences of such phrases in this specification do not necessarily all refer to the same embodiment. On the other hand, such references are not necessarily mutually exclusive either.
[0013] Figure 1 is a diagram illustrating a network 100 according to an embodiment of the invention. The network 100 includes a server 110, a computer 112, a network (cloud) 120, a vehicle (e.g., automobile) 130, and a mobile device 140. The server 110 includes a direction system 111 that receives origination and destination data, as well as network 100 node identifiers, and transmits navigation information to the navigation unit 132 and/or the mobile device 140. In an embodiment, the direction system 111 resides, instead or in addition, on the navigation unit 132 and/or the mobile device 140. The vehicle 130 includes a navigation unit 132 that is coupled to the vehicle 130 (e.g., installed in or detachably coupled to the vehicle 130). In other embodiments, the vehicle 130 can be another type of vehicle, such as an aircraft, ship, motorcycle, submersible, etc. The navigation unit 132 includes a nav ID 134, such as a MAC address and/or other identifier. The mobile device 140, which can be a laptop, mobile phone, etc., includes a mob ID 144, such as a MAC address and/or other identifier. Note that the network 100 can include other and/or additional nodes.
[0014] The cloud 120 can be, for example, a local area network (LAN), wide area network (WAN), metropolitan area network (MAN), global area network such as the Internet, a Fibre Channel fabric, or any combination of such interconnects. Each of the server 110, the computer 112, and the navigation unit 132 may be, for example, a conventional personal computer (PC), server-class computer, workstation, handheld computing/communication device, or the like.
[0015] During operation of the network 100, the direction system 111 receives an origination, a destination and identifiers (134 and 144) from one of the other nodes on the network 100. For example, a user can enter the data on the computer 112, which then transmits the data to the direction system 111 via the cloud 120. The direction system 111 then calculates navigation data based on mobility mode (e.g., driving directions for the navigation unit 132 to the closest point available to the destination and walking directions from that closest point to the actual destination; alternatively, just the originations and destinations, with the directions calculated by the receiving units 132 and 140) and transmits the data to the navigation unit 132 and mobile device 140, respectively. The navigation unit 132 and mobile device 140 can then output (visually, aurally, etc.) their respective received and/or calculated data (e.g., directions) to the user.
[0016] In another embodiment, the direction system 111 calculates navigation data in real-time upon receiving a mode change indication and a new origination from the navigation unit 132 and/or the mobile device 140. For example, a user may have parked his/her vehicle 130, triggering the navigation unit 132 to determine its position using a positioning system (e.g., GPS, Beidou, Glonass, Galileo, Loran, etc.) and to transmit a mobility mode change indication and the position, as a new origination, to the direction system 111. The direction system 111 then calculates navigation information based on the new mobility mode and new origination and transmits the newly calculated navigation data to the mobile device 140, the vehicle 130 and/or the navigation unit 132, which then outputs (visually, aurally, etc.) it to the user.
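The real-time recalculation described in paragraph [0016] can be sketched as a simple message exchange: parking triggers a mobility-mode change message whose position becomes the new origination. The message fields and function names are illustrative assumptions, not the patent's protocol:

```python
# In-vehicle side: build the indication the navigation unit sends when
# a park event is detected.
def build_mode_change_message(nav_id, position):
    return {"id": nav_id, "event": "mode_change",
            "new_mode": "walking", "new_origin": position}

# Direction-system side: recalculate for the new mode and origination,
# keeping the previously known final destination.
def handle_message(message, destination):
    if message.get("event") != "mode_change":
        return None  # not a mobility mode change; nothing to recalculate
    return {"mode": message["new_mode"],
            "origin": message["new_origin"],
            "destination": destination}

msg = build_mode_change_message("nav-134", (39.94, 116.45))
walking_data = handle_message(msg, destination=(39.95, 116.46))
```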
[0017] Figure 2 is a high-level block diagram showing an example of an architecture 200 of the server 110, the computer 112, the navigation unit 132 or mobile device 140 of Figure 1. The architecture 200 includes one or more processors 210 and memory 220 coupled to an interconnect 260. The interconnect 260 shown in Figure 2 is an abstraction that represents any one or more separate physical buses, point-to-point connections, or both, connected by appropriate bridges, adapters, or controllers. The interconnect 260, therefore, may include, for example, a system bus, a form of Peripheral Component Interconnect (PCI) bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, also called "Firewire", and/or any other suitable form of physical connection.
[0018] The processor(s) 210 is/are the central processing unit (CPU) of the architecture 200 and, thus, control the overall operation of the architecture 200. In certain embodiments, the processor(s) 210 accomplish this by executing software or firmware
stored in memory 220. The processor(s) 210 may be, or may include, one or more programmable general-purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application specific integrated circuits (ASICs), programmable logic devices (PLDs), or the like, or a combination of such devices.
[0019] The memory 220 is or includes the main memory of the architecture 200. The memory 220 represents any form of random access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such devices. In use, the memory 220 may contain, among other things, software or firmware code for use in implementing at least some of the embodiments of the invention introduced herein.
[0020] Also connected to the processor(s) 210 through the interconnect 260 is a communications interface 240, such as, but not limited to, a network adapter, one or more output device(s) 230 and one or more input device(s) 250. The network adapter 240 provides the architecture 200 with the ability to communicate with remote devices over the network cloud 120 and may be, for example, an Ethernet adapter or Fibre Channel adapter. The input device 250 may include a touch screen, keyboard, and/or mouse, etc. The output device 230 may include a screen and/or speakers, etc. In an embodiment, the architecture 200 includes a receiving device (e.g., antenna) to receive satellite or other signals needed to calculate location.
[0021] The techniques introduced herein can be implemented by programmable circuitry programmed/configured by software and/or firmware, or entirely by
special-purpose circuitry, or by a combination of such forms. Such special-purpose circuitry (if any) can be in the form of, for example, one or more application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), etc.
[0022] Software or firmware to implement the techniques introduced here may be stored on a machine-readable storage medium and may be executed by one or more general-purpose or special-purpose programmable microprocessors. A
"machine-readable medium", as the term is used herein, includes any mechanism that
can store information in a form accessible by a machine (a machine may be, for example, a computer, network device, cellular phone, personal digital assistant (PDA),
manufacturing tool, any device with one or more processors, etc.). For example, a machine-accessible medium includes recordable/non-recordable media (e.g., read-only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; etc.), etc.
[0023] The term "logic", as used herein, means: a) special-purpose hardwired circuitry, such as one or more application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), or other similar device(s); b) programmable circuitry programmed with software and/or firmware, such as one or more programmed general-purpose microprocessors, digital signal processors (DSPs) and/or microcontrollers, or other similar device(s); or c) a combination of the forms mentioned in a) and b).
[0024] Note that any and all of the embodiments described above can be combined with each other, except to the extent that it may be stated otherwise above or to the extent that any such embodiments might be mutually exclusive in function and/or structure.
[0025] Figure 3 is a block diagram showing contents of the direction system 111 of Figure 1. The direction system 111 includes direction logic 300, map data 310, mode logic 320 and transmission logic 330. The direction system 111 receives, with the transmission logic 330, an origination, a destination and IDs 134 and 144 of network 100 nodes. The IDs 134 and 144 are associated with a mobility mode (walking, driving, etc.) indicated in a database (not shown) in the direction system 111 and/or by the IDs themselves. The direction system 111 then calculates navigation data, using the map data 310 if required, and transmits the calculated navigation data, with the transmission logic 330, to the nodes associated with the IDs as a function of mobility mode.
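The ID-to-mobility-mode association in paragraph [0025] can be sketched as a lookup-and-dispatch step. A small in-memory dict stands in for the patent's database (not shown); the ID strings and table contents are illustrative assumptions:

```python
# Association of node IDs with mobility modes, standing in for the
# database in the direction system 111.
ID_TO_MODE = {
    "nav-134": "driving",   # nav ID of the in-vehicle navigation unit 132
    "mob-144": "walking",   # mob ID of the mobile device 140
}

def dispatch_navigation_data(node_ids, data_by_mode):
    """Route, to each known node ID, the navigation data calculated for
    that node's mobility mode; unknown IDs are skipped."""
    return {node_id: data_by_mode[ID_TO_MODE[node_id]]
            for node_id in node_ids if node_id in ID_TO_MODE}

payload = dispatch_navigation_data(
    ["nav-134", "mob-144"],
    {"driving": "drive to the terminus",
     "walking": "walk from the terminus to the destination"},
)
```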
[0026] The calculated navigation data may include driving directions for the navigation unit 132 or the vehicle 130 and walking directions (from the terminus of the driving directions to the actual destination) for the mobile device 140. Alternatively, the navigation data may include only the origination, the destination and the driving directions terminus, and the devices 132 and 140 can then calculate directions as needed. In an embodiment, the navigation data includes only the destination and origination, and the driving directions terminus is calculated in real time by the devices 132 and 140.
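The division of a single route into a driving leg that ends at a terminus and a walking leg that continues from that terminus to the actual destination can be illustrated by a minimal, hypothetical sketch (the waypoint names are invented for illustration):

```python
# Illustrative split of one ordered route into a driving leg ending at a
# terminus (e.g., a parking location) and a walking leg that starts there.
def split_route(waypoints, terminus_index):
    """waypoints: ordered list from origination to destination."""
    driving_leg = waypoints[: terminus_index + 1]   # ends at the terminus
    walking_leg = waypoints[terminus_index:]        # starts at the terminus
    return driving_leg, walking_leg

route = ["origin", "highway_exit", "parking_lot", "plaza", "destination"]
driving_leg, walking_leg = split_route(route, terminus_index=2)
```

Both legs share the terminus waypoint, mirroring the hand-off from the navigation unit 132 to the mobile device 140.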
[0027] When the vehicle 130 is parked at a current point (CP) before reaching the destination point (DP), the navigation unit 132 sends a notification to the central server 110 that the distance between the current point (CP) and the destination point (DP) is less than a predetermined value, e.g. 3 km, preferably 1 km, and more preferably 0.5 km.
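The proximity condition can be illustrated, under the assumption of WGS84 latitude/longitude coordinates and a great-circle (haversine) distance, by the following non-limiting sketch (the sample coordinates are invented for illustration):

```python
import math

# Illustrative proximity check: notify the central server once the
# remaining distance to the destination drops below a predetermined
# threshold (3 km, 1 km, or 0.5 km in the description above).
def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_notify(current, destination, threshold_km=0.5):
    return haversine_km(*current, *destination) < threshold_km

# Two points roughly 0.3 km apart: below the 0.5 km threshold,
# but not below a 0.1 km threshold.
cp = (39.9042, 116.4074)
dp = (39.9060, 116.4100)
```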
[0028] The direction logic 300 then sends the walking directions from the current point (CP) to the destination point (DP). In a first embodiment, the mobile device 140 receives a short text message containing the directions from the current point (CP) to the destination point (DP). In a second embodiment, the mobile device 140 receives a multimedia message containing acoustic directions, which the user can follow to reach the destination. In a third embodiment, the direction logic 300 sends the navigation data for the directions from the current point (CP) to the destination point (DP) to the user's mobile device 140 if the mobile device 140 has the same navigation software as the vehicle 130. Additionally, instead of receiving the walking direction data directly, the user may receive a short text message asking whether to receive it.
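The three delivery embodiments can be sketched as a hypothetical dispatcher that selects a channel from the capabilities of the mobile device 140 (the capability flags and function name are illustrative assumptions, not part of the described system):

```python
# Illustrative channel selection for the three delivery embodiments.
def deliver_walking_directions(directions, device):
    """Pick a delivery channel for walking directions (sketch only)."""
    if device.get("same_nav_software"):
        # Third embodiment: full navigation data when the mobile device
        # runs the same navigation software as the vehicle.
        return ("nav_data", directions)
    if device.get("supports_multimedia"):
        # Second embodiment: multimedia message with acoustic directions.
        return ("multimedia", directions)
    # First embodiment: plain short text message.
    return ("sms", " then ".join(directions))

steps = ["exit parking lot", "turn left", "walk 300 m"]
```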
[0029] The map data 310 includes the street, path, sidewalk, etc. data needed for calculating directions. The mode logic 320 determines or receives an indication of a change of mobility mode so that the direction logic 300 can send the appropriate data to the appropriate network 100 node. For example, the mode logic 320 can receive an indication of a switch from the navigation unit 132 (driving) to the mobile device 140 (walking) when the navigation unit 132 is powered off; the vehicle 130 is placed in park; the vehicle 130 reaches the determined driving directions terminus; a user transmits an indication of the switch; the engine is turned off; a key is removed from the ignition; and/or a door is opened. The transmission logic 330 works with the other logic 300 and 320 to receive and transmit data to other network 100 nodes via the cloud 120.
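The triggers listed above can be sketched as a hypothetical detector in which any one active vehicle-side signal indicates a switch from the driving mode to the walking mode (the signal names are invented for illustration):

```python
# Illustrative mode-change detection from vehicle-side signals.
DRIVING_TO_WALKING_TRIGGERS = (
    "navigation_unit_off",   # navigation unit 132 powered off
    "gear_park",             # vehicle 130 placed in park
    "terminus_reached",      # driving directions terminus reached
    "user_switch_request",   # user transmits an indication of the switch
    "engine_off",            # engine turned off
    "key_removed",           # key removed from the ignition
    "door_opened",           # a door is opened
)

def mode_changed(signals):
    """signals: set of currently active vehicle signal names."""
    return any(trigger in signals for trigger in DRIVING_TO_WALKING_TRIGGERS)
```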
[0030] Figure 4 is a flowchart illustrating a navigation technique 400. In an embodiment, the direction logic 300 receives (410), with the transmission logic 330, from the computer 112 or another network 100 node, an origination, a destination and the IDs 134 and 144. The direction logic 300 then calculates (420) navigation data based on mobility mode and transmits (430), with the transmission logic 330, the navigation data to the navigation unit 132 and/or the mobile device 140. The transmission (430) can start at vehicle 130 ignition or in advance (e.g., after the receiving (410)).
[0031] The navigation unit 132 starts to detect the driving condition (e.g., location) when navigation starts. In one embodiment, the navigation unit 132 starts to navigate automatically when the user starts the vehicle 130. In another embodiment, the navigation unit 132 asks the user whether to navigate with the preselected route and destination, e.g., by providing a digital option button on the screen of the navigation unit 132 or an acoustic inquiry. The navigation unit 132 determines the driving condition by receiving data from sensors in different parts of the vehicle 130.
[0032] In another embodiment, if the vehicle 130 is parked at the current point (CP) before reaching the destination point (DP), the navigation unit 132 automatically resumes navigating the route from the current point (CP) to the destination point (DP) when the vehicle 130 is restarted. Alternatively, the navigation unit 132 is further configured to ask whether the user wants to navigate the remaining route from the current point (CP) to the destination point (DP) by providing a digital option button on the screen or an acoustic inquiry.
[0033] The mode logic 320 then optionally receives (440) or determines a mobility mode change and, optionally, a new origination. The direction logic 300 then calculates (450) navigation data based on the changed mobility mode and transmits (460) it to the mobile device 140. The technique 400 then ends. In an embodiment, parts of the technique 400 can be carried out in a different order or substantially simultaneously. For example, navigation data for both mobility modes can be calculated together and transmitted at approximately the same time to the navigation unit 132 and the mobile device 140. Further, directions can be calculated and transmitted by the direction logic 300 and/or by the devices 132 and/or 140 themselves.
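The overall flow of technique 400 — receive (410), calculate (420), transmit (430), receive a mode change (440), recalculate (450), transmit again (460) — can be sketched in a purely illustrative, non-limiting form (the function and argument names are hypothetical):

```python
# Illustrative end-to-end sketch of technique 400 from Figure 4.
def technique_400(origination, destination, walking_origin):
    steps = []
    steps.append(410)  # receive origination, destination and device IDs
    first = {"mode": "driving", "from": origination, "to": destination}
    steps.append(420)  # calculate navigation data for the first mobility mode
    steps.append(430)  # transmit to the navigation unit
    steps.append(440)  # receive mobility mode change (and new origination)
    second = {"mode": "walking", "from": walking_origin, "to": destination}
    steps.append(450)  # calculate navigation data for the second mobility mode
    steps.append(460)  # transmit to the mobile device
    return first, second, steps

first, second, steps = technique_400("Home", "Office", "parking_lot")
```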
[0034] Although the present invention has been described with reference to specific exemplary embodiments, it will be recognized that the invention is not limited to the embodiments described, but can be practiced with modification and alteration within the spirit and scope of the appended claims. Accordingly, the specification and drawings are to be regarded in an illustrative sense rather than a restrictive sense.
Claims
1. A method, comprising:
calculating first navigation data based on a first mobility mode at a first device;
receiving a mobility mode change indication to a second mobility mode;
calculating second navigation data based on the second mobility mode; and
transmitting the second navigation data to a second device;
wherein the first navigation data calculation is performed between the first device and a server, and the second navigation data is derived directly from the server.
2. The method of claim 1, wherein the first navigation data includes driving directions, and further comprising transmitting the calculated first navigation data to a vehicle.
3. The method of claim 2, wherein the second navigation data includes walking directions and wherein the transmitting transmits via short text message.
4. The method of claim 3, wherein the walking directions originate from a terminus of the driving directions.
5. The method of claim 3, wherein the walking directions originate at a position of the second device.
6. The method of claim 1, wherein the receiving the mobility mode change indication includes receiving an indication that a vehicle has been placed in park.
7. The method of claim 1, wherein the receiving the mobility mode change indication includes receiving an indication that the first device is within a predetermined distance between a current position and a destination position.
8. A system, comprising:
mode logic configured to receive a mobility mode change indication from a first mobility mode to a second mobility mode;
direction logic configured to calculate first navigation data based on the first mobility mode and to calculate second navigation data based on the second mobility mode; and
transmission logic configured to transmit the second navigation data to a second device.
9. The system of claim 8, wherein the first navigation data includes driving directions and the transmission logic is further configured to transmit the calculated first navigation data to a vehicle.
10. The system of claim 9, wherein the second navigation data includes walking directions and wherein the transmission logic configuration is for transmission via short text message.
11. The system of claim 10, wherein the walking directions originate from a terminus of the driving directions.
12. The system of claim 10, wherein the walking directions originate at a position of the second device.
13. The system of claim 8, wherein the mobility mode change indication includes an indication that a vehicle has been placed in park.
14. The system of claim 8, wherein the mobility mode change indication includes an indication that a first device is within a predetermined distance between a current position and a destination position.
15. A method to perform navigation among a server, a navigation unit of a vehicle, and a mobile device, comprising:
calculating first navigation data based on a first mobility mode at the navigation unit of the vehicle via a direction system in the server;
transmitting the calculated first navigation data to the navigation unit of the vehicle;
receiving a mobility mode change indication to a second mobility mode by the server;
calculating second navigation data based on the second mobility mode at the server; and
transmitting the second navigation data to the mobile device by the server;
wherein the first navigation data includes driving directions and the second navigation data includes walking directions.
16. The method of claim 15, wherein the direction system receives data including an origination, destination and device identifiers.
17. The method of claim 16, wherein a user can enter the received data on a personal computer or mobile device connected to a network, which then transmits the data to the direction system via the network.
18. The method of claim 16, wherein a user can enter the received data on the navigation unit of the vehicle, which then transmits the data to the direction system via a network.
19. The method of claim 15, wherein the walking directions originate at a current position of the navigation unit of the vehicle.
20. The method of claim 15, wherein the walking directions originate at a position of the mobile device.
21. The method of claim 15, wherein receiving the mobility mode change indication includes receiving an indication that the navigation unit of the vehicle is within a predetermined distance between a current position and a destination position.
Priority Applications (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP12871101.7A EP2825846A4 (en) | 2012-03-16 | 2012-03-16 | Navigation system and method for different mobility modes |
| PCT/CN2012/072465 WO2013134956A1 (en) | 2012-03-16 | 2012-03-16 | Navigation system and method for different mobility modes |
| US14/385,498 US9207085B2 (en) | 2012-03-16 | 2012-03-16 | Navigation system and method for different mobility modes |
| CN201280001252.5A CN104321618A (en) | 2012-03-16 | 2012-03-16 | Navigation system and method for different mobility modes |
| IN8344DEN2014 IN2014DN08344A (en) | 2012-03-16 | 2012-03-16 | |
| TW101140292A TWI540311B (en) | 2012-03-16 | 2012-10-31 | Navigation system and method for different mobility modes |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2012/072465 WO2013134956A1 (en) | 2012-03-16 | 2012-03-16 | Navigation system and method for different mobility modes |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2013134956A1 true WO2013134956A1 (en) | 2013-09-19 |
Family
ID=49160246
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2012/072465 Ceased WO2013134956A1 (en) | 2012-03-16 | 2012-03-16 | Navigation system and method for different mobility modes |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US9207085B2 (en) |
| EP (1) | EP2825846A4 (en) |
| CN (1) | CN104321618A (en) |
| IN (1) | IN2014DN08344A (en) |
| TW (1) | TWI540311B (en) |
| WO (1) | WO2013134956A1 (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1385673A (en) * | 2001-05-15 | 2002-12-18 | 松下电器产业株式会社 | Navigation system |
| JP2005003526A (en) * | 2003-06-12 | 2005-01-06 | Denso Corp | Navigation system |
| CN102200443A (en) * | 2010-03-23 | 2011-09-28 | 神达电脑股份有限公司 | Method for automatically selecting navigation route in personal navigation device |
| JP2011209169A (en) * | 2010-03-30 | 2011-10-20 | Panasonic Corp | Navigation device |
Application Events
- 2012-03-16: CN application CN201280001252.5A (CN104321618A) — Pending
- 2012-03-16: IN application IN8344DEN2014 (IN2014DN08344A) — status unknown
- 2012-03-16: WO application PCT/CN2012/072465 (WO2013134956A1) — Ceased
- 2012-03-16: EP application EP12871101.7A (EP2825846A4) — Withdrawn
- 2012-03-16: US application US14/385,498 (US9207085B2) — Expired (fee related)
- 2012-10-31: TW application TW101140292A (TWI540311B) — IP Right Cessation
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP2825846A4 * |
| US10856809B2 (en) | 2016-03-24 | 2020-12-08 | Bragi GmbH | Earpiece with glucose sensor and system |
| US10334346B2 (en) | 2016-03-24 | 2019-06-25 | Bragi GmbH | Real-time multivariable biometric analysis and display system and method |
| US11799852B2 (en) | 2016-03-29 | 2023-10-24 | Bragi GmbH | Wireless dongle for communications with wireless earpieces |
| US10313781B2 (en) | 2016-04-08 | 2019-06-04 | Bragi GmbH | Audio accelerometric feedback through bilateral ear worn device system and method |
| US10015579B2 (en) | 2016-04-08 | 2018-07-03 | Bragi GmbH | Audio accelerometric feedback through bilateral ear worn device system and method |
| US10747337B2 (en) | 2016-04-26 | 2020-08-18 | Bragi GmbH | Mechanical detection of a touch movement using a sensor and a special surface pattern system and method |
| US10169561B2 (en) | 2016-04-28 | 2019-01-01 | Bragi GmbH | Biometric interface system and method |
| US10013542B2 (en) | 2016-04-28 | 2018-07-03 | Bragi GmbH | Biometric interface system and method |
| US10582328B2 (en) | 2016-07-06 | 2020-03-03 | Bragi GmbH | Audio response based on user worn microphones to direct or adapt program responses system and method |
| US11085871B2 (en) | 2016-07-06 | 2021-08-10 | Bragi GmbH | Optical vibration detection system and method |
| US11770918B2 (en) | 2016-07-06 | 2023-09-26 | Bragi GmbH | Shielded case for wireless earpieces |
| US10448139B2 (en) | 2016-07-06 | 2019-10-15 | Bragi GmbH | Selective sound field environment processing system and method |
| US11497150B2 (en) | 2016-07-06 | 2022-11-08 | Bragi GmbH | Shielded case for wireless earpieces |
| US10888039B2 (en) | 2016-07-06 | 2021-01-05 | Bragi GmbH | Shielded case for wireless earpieces |
| US10216474B2 (en) | 2016-07-06 | 2019-02-26 | Bragi GmbH | Variable computing engine for interactive media based upon user biometrics |
| US10470709B2 (en) | 2016-07-06 | 2019-11-12 | Bragi GmbH | Detection of metabolic disorders using wireless earpieces |
| US10045110B2 (en) | 2016-07-06 | 2018-08-07 | Bragi GmbH | Selective sound field environment processing system and method |
| US11781971B2 (en) | 2016-07-06 | 2023-10-10 | Bragi GmbH | Optical vibration detection system and method |
| US10201309B2 (en) | 2016-07-06 | 2019-02-12 | Bragi GmbH | Detection of physiological data using radar/lidar of wireless earpieces |
| US12178027B2 (en) | 2016-07-06 | 2024-12-24 | Bragi GmbH | Shielded case for wireless earpieces |
| US10555700B2 (en) | 2016-07-06 | 2020-02-11 | Bragi GmbH | Combined optical sensor for audio and pulse oximetry system and method |
| US10045736B2 (en) | 2016-07-06 | 2018-08-14 | Bragi GmbH | Detection of metabolic disorders using wireless earpieces |
| US10158934B2 (en) | 2016-07-07 | 2018-12-18 | Bragi GmbH | Case for multiple earpiece pairs |
| US10516930B2 (en) | 2016-07-07 | 2019-12-24 | Bragi GmbH | Comparative analysis of sensors to control power status for wireless earpieces |
| US10469931B2 (en) | 2016-07-07 | 2019-11-05 | Bragi GmbH | Comparative analysis of sensors to control power status for wireless earpieces |
| US10165350B2 (en) | 2016-07-07 | 2018-12-25 | Bragi GmbH | Earpiece with app environment |
| US10621583B2 (en) | 2016-07-07 | 2020-04-14 | Bragi GmbH | Wearable earpiece multifactorial biometric analysis system and method |
| US10587943B2 (en) | 2016-07-09 | 2020-03-10 | Bragi GmbH | Earpiece with wirelessly recharging battery |
| US10397686B2 (en) | 2016-08-15 | 2019-08-27 | Bragi GmbH | Detection of movement adjacent an earpiece device |
| US11620368B2 (en) | 2016-08-24 | 2023-04-04 | Bragi GmbH | Digital signature using phonometry and compiled biometric data system and method |
| US12437049B2 (en) | 2016-08-24 | 2025-10-07 | Bragi GmbH | Digital signature using phonometry and compiled biometric data system and method |
| US12001537B2 (en) | 2016-08-24 | 2024-06-04 | Bragi GmbH | Digital signature using phonometry and compiled biometric data system and method |
| US10977348B2 (en) | 2016-08-24 | 2021-04-13 | Bragi GmbH | Digital signature using phonometry and compiled biometric data system and method |
| US10104464B2 (en) | 2016-08-25 | 2018-10-16 | Bragi GmbH | Wireless earpiece and smart glasses system and method |
| US10409091B2 (en) | 2016-08-25 | 2019-09-10 | Bragi GmbH | Wearable with lenses |
| US11573763B2 (en) | 2016-08-26 | 2023-02-07 | Bragi GmbH | Voice assistant for wireless earpieces |
| US11861266B2 (en) | 2016-08-26 | 2024-01-02 | Bragi GmbH | Voice assistant for wireless earpieces |
| US10313779B2 (en) | 2016-08-26 | 2019-06-04 | Bragi GmbH | Voice assistant system for wireless earpieces |
| US11200026B2 (en) | 2016-08-26 | 2021-12-14 | Bragi GmbH | Wireless earpiece with a passive virtual assistant |
| US12574669B2 (en) | 2016-08-26 | 2026-03-10 | Bragi GmbH | Earpiece for audiograms |
| US11086593B2 (en) | 2016-08-26 | 2021-08-10 | Bragi GmbH | Voice assistant for wireless earpieces |
| US12182474B2 (en) | 2016-08-26 | 2024-12-31 | Bragi GmbH | Wireless earpiece with a passive virtual assistant |
| US10887679B2 (en) | 2016-08-26 | 2021-01-05 | Bragi GmbH | Earpiece for audiograms |
| US12265757B2 (en) | 2016-08-26 | 2025-04-01 | Bragi GmbH | Voice assistant for wireless earpieces |
| US10200780B2 (en) | 2016-08-29 | 2019-02-05 | Bragi GmbH | Method and apparatus for conveying battery life of wireless earpiece |
| US12245873B2 (en) | 2016-08-31 | 2025-03-11 | Bragi GmbH | Disposable sensor array wearable device sleeve system and method |
| US11490858B2 (en) | 2016-08-31 | 2022-11-08 | Bragi GmbH | Disposable sensor array wearable device sleeve system and method |
| US10598506B2 (en) | 2016-09-12 | 2020-03-24 | Bragi GmbH | Audio navigation using short range bilateral earpieces |
| US10580282B2 (en) | 2016-09-12 | 2020-03-03 | Bragi GmbH | Ear based contextual environment and biometric pattern recognition system and method |
| US11294466B2 (en) | 2016-09-13 | 2022-04-05 | Bragi GmbH | Measurement of facial muscle EMG potentials for predictive analysis using a smart wearable system and method |
| US11675437B2 (en) | 2016-09-13 | 2023-06-13 | Bragi GmbH | Measurement of facial muscle EMG potentials for predictive analysis using a smart wearable system and method |
| US10852829B2 (en) | 2016-09-13 | 2020-12-01 | Bragi GmbH | Measurement of facial muscle EMG potentials for predictive analysis using a smart wearable system and method |
| US12045390B2 (en) | 2016-09-13 | 2024-07-23 | Bragi GmbH | Measurement of facial muscle EMG potentials for predictive analysis using a smart wearable system and method |
| US11956191B2 (en) | 2016-09-27 | 2024-04-09 | Bragi GmbH | Audio-based social media platform |
| US12463926B2 (en) | 2016-09-27 | 2025-11-04 | Bragi GmbH | Audio-based social media platform |
| US11627105B2 (en) | 2016-09-27 | 2023-04-11 | Bragi GmbH | Audio-based social media platform |
| US11283742B2 (en) | 2016-09-27 | 2022-03-22 | Bragi GmbH | Audio-based social media platform |
| US10460095B2 (en) | 2016-09-30 | 2019-10-29 | Bragi GmbH | Earpiece with biometric identifiers |
| US10049184B2 (en) | 2016-10-07 | 2018-08-14 | Bragi GmbH | Software application transmission via body interface using a wearable device in conjunction with removable body sensor arrays system and method |
| US11947874B2 (en) | 2016-10-31 | 2024-04-02 | Bragi GmbH | Input and edit functions utilizing accelerometer based earpiece movement system and method |
| US10942701B2 (en) | 2016-10-31 | 2021-03-09 | Bragi GmbH | Input and edit functions utilizing accelerometer based earpiece movement system and method |
| US10698983B2 (en) | 2016-10-31 | 2020-06-30 | Bragi GmbH | Wireless earpiece with a medical engine |
| US12321668B2 (en) | 2016-10-31 | 2025-06-03 | Bragi GmbH | Input and edit functions utilizing accelerometer based earpiece movement system and method |
| US10771877B2 (en) | 2016-10-31 | 2020-09-08 | Bragi GmbH | Dual earpieces for same ear |
| US11599333B2 (en) | 2016-10-31 | 2023-03-07 | Bragi GmbH | Input and edit functions utilizing accelerometer based earpiece movement system and method |
| US10455313B2 (en) | 2016-10-31 | 2019-10-22 | Bragi GmbH | Wireless earpiece with force feedback |
| US10117604B2 (en) | 2016-11-02 | 2018-11-06 | Bragi GmbH | 3D sound positioning with distributed sensors |
| US10617297B2 (en) | 2016-11-02 | 2020-04-14 | Bragi GmbH | Earpiece with in-ear electrodes |
| US10062373B2 (en) | 2016-11-03 | 2018-08-28 | Bragi GmbH | Selective audio isolation from body generated sound system and method |
| US12400630B2 (en) | 2016-11-03 | 2025-08-26 | Bragi GmbH | Selective audio isolation from body generated sound system and method |
| US10896665B2 (en) | 2016-11-03 | 2021-01-19 | Bragi GmbH | Selective audio isolation from body generated sound system and method |
| US12226696B2 (en) | 2016-11-03 | 2025-02-18 | Bragi GmbH | Gaming with earpiece 3D audio |
| US11908442B2 (en) | 2016-11-03 | 2024-02-20 | Bragi GmbH | Selective audio isolation from body generated sound system and method |
| US10821361B2 (en) | 2016-11-03 | 2020-11-03 | Bragi GmbH | Gaming with earpiece 3D audio |
| US11325039B2 (en) | 2016-11-03 | 2022-05-10 | Bragi GmbH | Gaming with earpiece 3D audio |
| US11417307B2 (en) | 2016-11-03 | 2022-08-16 | Bragi GmbH | Selective audio isolation from body generated sound system and method |
| US11806621B2 (en) | 2016-11-03 | 2023-11-07 | Bragi GmbH | Gaming with earpiece 3D audio |
| US10205814B2 (en) | 2016-11-03 | 2019-02-12 | Bragi GmbH | Wireless earpiece with walkie-talkie functionality |
| US10225638B2 (en) | 2016-11-03 | 2019-03-05 | Bragi GmbH | Ear piece with pseudolite connectivity |
| US10063957B2 (en) | 2016-11-04 | 2018-08-28 | Bragi GmbH | Earpiece with source selection within ambient environment |
| US10398374B2 (en) | 2016-11-04 | 2019-09-03 | Bragi GmbH | Manual operation assistance with earpiece with 3D sound cues |
| US10045117B2 (en) | 2016-11-04 | 2018-08-07 | Bragi GmbH | Earpiece with modified ambient environment over-ride function |
| US10045112B2 (en) | 2016-11-04 | 2018-08-07 | Bragi GmbH | Earpiece with added ambient environment |
| US10681450B2 (en) | 2016-11-04 | 2020-06-09 | Bragi GmbH | Earpiece with source selection within ambient environment |
| US10681449B2 (en) | 2016-11-04 | 2020-06-09 | Bragi GmbH | Earpiece with added ambient environment |
| US10397690B2 (en) | 2016-11-04 | 2019-08-27 | Bragi GmbH | Earpiece with modified ambient environment over-ride function |
| US10058282B2 (en) | 2016-11-04 | 2018-08-28 | Bragi GmbH | Manual operation assistance with earpiece with 3D sound cues |
| US10506327B2 (en) | 2016-12-27 | 2019-12-10 | Bragi GmbH | Ambient environmental sound field manipulation based on user defined voice and audio recognition pattern analysis system and method |
| US10405081B2 (en) | 2017-02-08 | 2019-09-03 | Bragi GmbH | Intelligent wireless headset system |
| US10582290B2 (en) | 2017-02-21 | 2020-03-03 | Bragi GmbH | Earpiece with tap functionality |
| US10771881B2 (en) | 2017-02-27 | 2020-09-08 | Bragi GmbH | Earpiece with audio 3D menu |
| US12299479B2 (en) | 2017-03-22 | 2025-05-13 | Bragi GmbH | Load sharing between wireless earpieces |
| US11710545B2 (en) | 2017-03-22 | 2023-07-25 | Bragi GmbH | System and method for populating electronic medical records with wireless earpieces |
| US10575086B2 (en) | 2017-03-22 | 2020-02-25 | Bragi GmbH | System and method for sharing wireless earpieces |
| US11694771B2 (en) | 2017-03-22 | 2023-07-04 | Bragi GmbH | System and method for populating electronic health records with wireless earpieces |
| US11380430B2 (en) | 2017-03-22 | 2022-07-05 | Bragi GmbH | System and method for populating electronic medical records with wireless earpieces |
| US12354715B2 (en) | 2017-03-22 | 2025-07-08 | Bragi GmbH | System and method for populating electronic health records with wireless earpieces |
| US12087415B2 (en) | 2017-03-22 | 2024-09-10 | Bragi GmbH | System and method for populating electronic medical records with wireless earpieces |
| US11544104B2 (en) | 2017-03-22 | 2023-01-03 | Bragi GmbH | Load sharing between wireless earpieces |
| US10708699B2 (en) | 2017-05-03 | 2020-07-07 | Bragi GmbH | Hearing aid with added functionality |
| US11116415B2 (en) | 2017-06-07 | 2021-09-14 | Bragi GmbH | Use of body-worn radar for biometric measurements, contextual awareness and identification |
| US12226199B2 (en) | 2017-06-07 | 2025-02-18 | Bragi GmbH | Use of body-worn radar for biometric measurements, contextual awareness and identification |
| US11013445B2 (en) | 2017-06-08 | 2021-05-25 | Bragi GmbH | Wireless earpiece with transcranial stimulation |
| US11911163B2 (en) | 2017-06-08 | 2024-02-27 | Bragi GmbH | Wireless earpiece with transcranial stimulation |
| US10344960B2 (en) | 2017-09-19 | 2019-07-09 | Bragi GmbH | Wireless earpiece controlled medical headlight |
| US11272367B2 (en) | 2017-09-20 | 2022-03-08 | Bragi GmbH | Wireless earpieces for hub communications |
| US12069479B2 (en) | 2017-09-20 | 2024-08-20 | Bragi GmbH | Wireless earpieces for hub communications |
| US11711695B2 (en) | 2017-09-20 | 2023-07-25 | Bragi GmbH | Wireless earpieces for hub communications |
| US11245463B2 (en) | 2019-10-14 | 2022-02-08 | Volkswagen Aktiengesellschaft | Wireless communication device and corresponding apparatus, method and computer program |
| US11336405B2 (en) | 2019-10-14 | 2022-05-17 | Volkswagen Aktiengesellschaft | Wireless communication device and corresponding apparatus, method and computer program |
| US11438110B2 (en) | 2019-10-14 | 2022-09-06 | Volkswagen Aktiengesellschaft | Wireless communication device and corresponding apparatus, method and computer program |
| US11553462B2 (en) | 2019-10-14 | 2023-01-10 | Volkswagen Aktiengesellschaft | Wireless communication device and corresponding apparatus, method and computer program |
Also Published As
| Publication number | Publication date |
|---|---|
| IN2014DN08344A (en) | 2015-05-08 |
| EP2825846A4 (en) | 2015-12-09 |
| US20150081218A1 (en) | 2015-03-19 |
| TW201339540A (en) | 2013-10-01 |
| CN104321618A (en) | 2015-01-28 |
| US9207085B2 (en) | 2015-12-08 |
| TWI540311B (en) | 2016-07-01 |
| EP2825846A1 (en) | 2015-01-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9207085B2 (en) | Navigation system and method for different mobility modes |
| JP7710015B2 (en) | Parking assistance measures | |
| JP5795078B2 (en) | Vehicle side system | |
| EP3190385B1 (en) | Route searching system, route searching method, and computer program | |
| US20110018736A1 (en) | Geographically specific emergency notification | |
| US20150304817A1 (en) | Mobile communication device and communication control method | |
| JP2014228546A (en) | System and method for storing and recalling location data | |
| WO2017068897A1 (en) | Navigation system | |
| US10899348B2 (en) | Method, apparatus and computer program product for associating map objects with road links | |
| CN104183116A (en) | Taxi scheduling system, vehicle-mounted navigation terminal and scheduling server | |
| CN110239544A (en) | Controller of vehicle, control method for vehicle and storage medium | |
| JP2019061414A (en) | Estimation device | |
| JP2019100763A (en) | Passing-each-other difficulty section avoidance system, server device, information display device, and passing-each-other difficulty section avoidance method | |
| CN113393071A (en) | Car dispatch service device, method, and computer-readable medium having program recorded thereon | |
| WO2015112752A1 (en) | Automated navigation and configuration systems and methods for limited-access vehicles | |
| CN105956171B (en) | A method and device for real-time data sharing | |
| JP2019125167A (en) | Onboard equipment, server, navigation system, map display program, and map display method | |
| CN110832563A (en) | Information communication device and location management system | |
| JP5472039B2 (en) | Guidance information providing system | |
| KR20220023683A (en) | Method and Apparatus for Providing Multi-Modal Service Using Personal Mobility | |
| US11081003B2 (en) | Map-providing server and map-providing method | |
| JP2007040711A (en) | In-vehicle device | |
| EP4382865A1 (en) | Method, apparatus, and computer program product for intelligent trajectory configurations within mobility data using junctions inferred by features of the mobility data | |
| JP5076617B2 (en) | Car navigation system | |
| KR20070019442A (en) | Personal Navigation System Using Public Transportation Information and Its Method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12871101; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 14385498; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | REEP | Request for entry into the european phase | Ref document number: 2012871101; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2012871101; Country of ref document: EP |