WO2001069380A2 - System and method for enabling dynamically adaptable user interfaces for electronic devices - Google Patents
- Publication number
- WO2001069380A2 WO2001069380A2 PCT/US2001/008151 US0108151W WO0169380A2 WO 2001069380 A2 WO2001069380 A2 WO 2001069380A2 US 0108151 W US0108151 W US 0108151W WO 0169380 A2 WO0169380 A2 WO 0169380A2
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- adaptation
- user
- capability
- preference
- electronic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2807—Exchanging configuration information on appliance services in a home automation network
- H04L12/2814—Exchanging control software or macros for controlling appliance services in a home automation network
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2807—Exchanging configuration information on appliance services in a home automation network
- H04L12/2809—Exchanging configuration information on appliance services in a home automation network indicating that an appliance service is present in a home automation network
Definitions
- Assistive technology can be used to help a person who is hindered in some way from interacting with a system in conventional ways.
- However, these assistive technology solutions are not very extensible and do not adapt to the changing needs of a user.
- Another aspect of the invention includes a preference object residing in a memory of an electronic device, the preference object comprising: at least two preference functions, wherein each of the at least two preference functions has an associated type, the type being one of entry, control, presentation, and authorization; and at least one preference rating for each of the preference functions, wherein the preference rating preferentially orders the at least two preference functions.
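The claimed preference-object structure can be sketched as a small data model. This is an illustrative interpretation, not the patent's literal encoding: the class names, the `rating` field, and the "lower rating = more preferred" convention are all assumptions.

```python
from dataclasses import dataclass, field

# Allowed preference-function types, per the claim language above.
ALLOWED_TYPES = {"entry", "control", "presentation", "authorization"}

@dataclass
class PreferenceFunction:
    name: str
    type: str    # one of ALLOWED_TYPES
    rating: int  # lower value = more preferred (assumed convention)

    def __post_init__(self):
        if self.type not in ALLOWED_TYPES:
            raise ValueError(f"unknown preference function type: {self.type}")

@dataclass
class PreferenceObject:
    name: str
    functions: list = field(default_factory=list)

    def ordered(self):
        """Return the preference functions in preferential order."""
        return sorted(self.functions, key=lambda f: f.rating)

# A user who prefers voice entry over visual presentation:
po = PreferenceObject("example", [
    PreferenceFunction("visual-presentation", "presentation", rating=2),
    PreferenceFunction("voice-entry", "entry", rating=1),
])
```

The rating gives the adaptation engine an unambiguous fallback order when the top preference cannot be satisfied.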
- Another aspect of the invention includes a system for generating a preference object in an electronic device, the system comprising means for receiving a user description object that defines user characteristic information, wherein the user characteristic information is selected from the group comprising: situational information, environmental information, behavior information, and context information, means for storing the user description in a database, means for determining the conditions of the user, and means for generating, in response to a user accessing an electronic device, a preference object that defines one or more user preferences, wherein the content of the preference object is based at least in part upon the content of the user description object and the determined conditions.
- Yet another aspect of the invention includes a capability object residing in a memory of an electronic device, the capability object comprising a descriptor which identifies an electronic device or software application that is associated with the capability object and which identifies whether the electronic device or software application is directly accessible by the user or, alternatively, an information source that is accessible by the electronic device or software application, and at least one capability function, the capability function comprising a type descriptor that defines the type of the capability object as being either (1) an electronic device or software application or (2) an information source.
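The capability object described above can be sketched the same way. Again, the field and class names are assumptions chosen to mirror the claim language, not the patent's actual serialization:

```python
from dataclasses import dataclass, field

@dataclass
class CapabilityFunction:
    name: str
    # Per the claim: the type descriptor distinguishes a device/software
    # application from an information source.
    type_descriptor: str  # "device_or_application" or "information_source"

@dataclass
class CapabilityObject:
    descriptor: str            # identifies the device, application, or source
    directly_accessible: bool  # True if the user can access it directly
    functions: list = field(default_factory=list)

# Illustrative capability object for an ATM touchscreen:
atm_cap = CapabilityObject(
    descriptor="ATM touchscreen",
    directly_accessible=True,
    functions=[CapabilityFunction("touch-entry", "device_or_application")],
)
```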
- the user input description module 124 is also capable of stimulating the generation of preference objects and capability objects through direct interaction with the static UDO 142 and the dynamic UDO 144 whenever changes in conditions of the user 120, the user experience device 120, or the user local system 132 are significant enough to warrant an adaptation.
- the adaptation engine 162 is an active collaborator in the session between the user and the information source 166.
- the preference objects and capability objects are cached in the adaptation engine 162 in a preference object cache 212 and a capability object cache 216 (shown in Figure 2).
- the user experience device 128 presents to the user 120 information that is received by the user local system 132 from the information source 166. Furthermore, the user experience device 128 receives control information from the user 120 regarding services or information that are provided by the information source 166.
- the user experience device 128 can render and receive visual, aural, tactile, haptic, gestural and other modes, and specifically includes assistive and accessor devices that accommodate a disability of a user.
- the user experience device 128 can include a keyboard, a mouse, a monitor and speakers as devices, and a graphical user interface (GUI) for the operating system and applications.
- the user experience device 128 could include a screen reader.
- the user experience device 128 also includes: ATMs, kiosks, smartphones, set-top boxes, smart appliances, smartcards, Java™ rings, and RF tags.
- The User Local System: the user local system 132 includes an electronic device that the human user may use to retrieve information or request services from the information source 166.
- the user local system 132 may also include an operating system, applications, and peripheral devices including assistive and accessor devices.
- the user local system 132 can include any electronic device. Depending on the embodiment, the user local system 132 may be integrated with the user experience device 128.
- The Static UDO
- the user's preferences can be entirely independent of any qualifying conditions that are specified.
- the preferences are for the same user described above, but they could as easily be for any user with no disability who chooses to operate their "accessor" by voice control and entry, but who views the visual presentation on the accessor's screen.
- a lower priority choice states that aural presentation in lieu of visual presentation is acceptable. This might be chosen when the person is engaged in some activity where they cannot see the screen, or should not be looking at it, such as while driving a car.
- This user has also expressed a preference for using the accessor as a control and entry device for systems external to the accessor ("extra"), while using that system's visual display, if it has one.
- FIG. 2 is a block diagram illustrating certain sub-components of the adaptation engine 162.
- the adaptation engine 162 includes the following modules: a communications interface 204, a capabilities registry 208, a preference object cache 212, a capability object cache 216, an adaptation event manager 220, an adaptation manager 224, a plurality of adapter service units 228, an interpretive consolidator 232, an adaptation service registry 236, an adaptation object assembler 240, a learning engine 244, a session manager 248, and an accounting log 252.
- the communication interface 204 includes protocols and services for receiving preference objects and capability objects and for transmitting adaptation objects to appropriate recipients.
- the capabilities registry 208 provides services to transmit capability objects and preference objects.
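A minimal wiring of these sub-components might look like the following sketch. The attribute names mirror the reference numerals above, but the composition and the two receive methods are illustrative assumptions, not the patent's implementation:

```python
class AdaptationEngine:
    """Illustrative composition of the adaptation engine 162; each
    attribute stands in for the module with the matching numeral."""
    def __init__(self):
        self.capabilities_registry = {}     # 208
        self.preference_object_cache = {}   # 212
        self.capability_object_cache = {}   # 216
        self.adapter_service_units = []     # 228
        self.accounting_log = []            # 252

    def receive_preference_object(self, name, obj):
        """Cache a preference object arriving on the comms interface 204."""
        self.preference_object_cache[name] = obj
        self.accounting_log.append(("preference", name))

    def receive_capability_object(self, name, obj):
        """Cache a capability object arriving on the comms interface 204."""
        self.capability_object_cache[name] = obj
        self.accounting_log.append(("capability", name))

engine = AdaptationEngine()
engine.receive_preference_object("jadamsP1", {"type": "presentation"})
engine.receive_capability_object("atm-cap", {"type": "Entry"})
```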
- the abstraction filter 150 classifies the user information into functional groups, including Control (manipulation and activation of interface elements)
- the Control Function includes the modes:
- Navigation e.g., mouse, arrow key, TAB key, voice, etc.
- Aural e.g., level, bandwidth, voice features, other sounds, etc.
- the proxy 136 is an agent of the user 120 which may be employed when the user local system 132 cannot make the necessary changes in the user experience device 128 or the information source 166 on its own. For example, the user local system 132 may not have the capability of converting text to speech or speech to text (this would have been expressed in a capability object to the adaptation engine 162).
- the function of the user's proxy 136 is to execute an adaptation object transmitted by the adaptation engine 162.
- although the proxy 136 is an agent of the user and is shown in the user domain 104, it could reside in several different physical locations.
- the information source 166 may include: a computer; a personal appliance; an ATM; a kiosk; a handheld device; a smart appliance; a network of devices; a networked federation of computers; a game system; electronic instrumentation; an automobile; a television; a telephone; a lamp; an air conditioning system; a sprinkler system; an elevator; or a monitoring and control system for a room, such as a family room, an office, or an elevator.
- the information source 166 is integrated with the user local system 132.
- a preference object is made up of three types of object, P1, P2 and P3, and there may be one or more occurrences of objects of type P2.
- the objects appearing in this syntax have referent names (e.g., ID, Preference Object Function and Adaptation Session) which are defined in similar syntax subsequent to this definition.
- the Name is a unique handle by which the preference object will be identified.
- the Source identifies the entity that generated the preference object.
- the Adaptation Session identifies the session with which the Preference Object is associated.
- Preference Object Function → PF1 PF2 PF3+ PF4* PF5, where PF1 = Type, PF2 = Name, PF3 = Mode, PF4 = External Reference, PF5 = …
- the Name field stores a unique handle that is used for correlating the Preference Object Function with Capability Object Functions and other entities.
- the Mode field identifies the mode of the Preference Object Function, e.g., visual for presentation, activation for Entry, selection for Control. When a preference is identified, it should be in the context of a specific device, e.g., a monitor. Thus, a component of the Reference object should point to the capability object where this device is described.
- the Preference Object contains a priority for the preference with respect to other preferences. Furthermore, the Preference Object may describe characteristics of the preference.
- the Name field of the Preference is used to refer to it within a selected session.
- the Descriptor field contained in the Characteristics object is not further defined here. Capability Object
- the Name field is a unique handle for the capability object.
- the Source field identifies the entity that generated the capability object; there may be multiple capability objects generated. Furthermore, the Source field identifies the type of the capability object, i.e., a user capability object or an information source capability object. Since capability objects may be related to several concurrent sessions, the Adaptation Session field contains references to these.
- the Type field of the Capability specifies the type of entity that the Capability refers to, e.g., a mode for an interface function, when the Capability Object Function Type is one of Entry, Control, Presentation or Authorization, or a performance feature, such as throughput on the communications interface, when the Capability Object Function Type is Communications.
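The two interpretations of the Type field can be captured in a small helper. This is a sketch of the rule as stated above; the string encodings (`"mode:…"`, `"performance:…"`) are assumptions for illustration:

```python
# Function types whose Capability Type names an interface mode.
INTERFACE_FUNCTIONS = {"Entry", "Control", "Presentation", "Authorization"}

def interpret_capability_type(function_type: str, type_field: str) -> str:
    """Classify what a Capability's Type field refers to.

    For interface functions, the Type names a mode (e.g., activation for
    Entry); for Communications, it names a performance feature (e.g.,
    throughput on the communications interface).
    """
    if function_type in INTERFACE_FUNCTIONS:
        return f"mode:{type_field}"
    if function_type == "Communications":
        return f"performance:{type_field}"
    raise ValueError(f"unknown capability function type: {function_type}")
```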
- the Executables field can contain classes that embody the capability, rather than a description of characteristics, which might not be complete enough or may be too complex to be usable.
- the Characteristics field contains the relevant capability descriptions of the entity.
- the Adaptation Object Listener field identifies an entity that is expecting to receive any adaptation object that is generated using the contents of the capability object.
- the Adaptation Object Listener field might reference an application on the user local system that is expecting to be transformed in some way, i.e., an adaptable application.
- the proxy 136 is another example of an Adaptation Object Listener that could be designated.
- the Adaptation Object Listener field is associated with a capability, since more than one entity may be needed to act on an adaptation object.
- the Action field is used to direct the adaptation engine 162 to take some action, e.g., to "get" a set of capability characteristics from a previously cached capability object or from a remote source.
- Typical Constraint Descriptions include maximum, minimum and preferred values. However, not all characteristics can be limited in this manner. For example, with red-green colorblindness, colors should be restricted to blue-yellow. However, the maximum blue is white and the minimum blue is black, and similarly for yellow. Moreover, some blue-greens are considered to be blues, may even contain red components, and still be distinguishable by a person who is red-green colorblind.
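A numeric Constraint Description of the maximum/minimum/preferred kind can be sketched as follows. As the passage notes, not every characteristic fits this model (color restrictions for red-green colorblindness, for instance, are not a simple min/max range), so this sketch covers only the numeric case; the class and method names are assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConstraintDescription:
    """Numeric constraint with optional maximum, minimum and preferred values."""
    minimum: Optional[float] = None
    maximum: Optional[float] = None
    preferred: Optional[float] = None

    def clamp(self, value: float) -> float:
        """Force a candidate value into the constrained range."""
        if self.minimum is not None:
            value = max(value, self.minimum)
        if self.maximum is not None:
            value = min(value, self.maximum)
        return value

# Hypothetical font-size constraint in points:
font = ConstraintDescription(minimum=12, maximum=32, preferred=18)
```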
- the Weight object is a set of values that could be used as parameters in the adaptation engine learning and decision functions, such as neural nets and genetic algorithms.
- the dynamic UDO generates UDO objects based upon the provided information.
- the abstraction filter 150 transmits the interface function descriptions to the preference/capability assembly 154.
- the preference/capability assembly 154 generates the preference object and retrieves the user capability objects.
- the capability object is generated using information that is transmitted by the user local system 132.
- the adaptation engine 162 transmits, depending on the content of the adaptation object, the adaptation object to either the proxy 136 or the user local system 132.
- all adaptation objects are sent by default to the user local system 132. However, if the user local system 132 cannot perform adaptation specified by the adaptation object, the adaptation object is transmitted to the proxy 136. Continuing to a state 472, the adaptation object is used to transform the human factor information that is transmitted to the user 120 or, alternatively, the control information that is received by the client with respect to the accessed electronic device or the information source 166.
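The default-then-fallback routing just described can be sketched as a single function. The `can_perform`/`execute` hooks and the stand-in target class are illustrative assumptions:

```python
class AdaptationTarget:
    """Stand-in for the user local system 132 or the proxy 136."""
    def __init__(self, name: str, capable: bool):
        self.name = name
        self.capable = capable

    def can_perform(self, adaptation_obj) -> bool:
        return self.capable

    def execute(self, adaptation_obj) -> str:
        return f"{self.name} executed {adaptation_obj}"

def route_adaptation_object(adaptation_obj, user_local_system, proxy):
    """Send the adaptation object to the user local system by default;
    fall back to the proxy when the local system cannot perform the
    specified adaptation."""
    if user_local_system.can_perform(adaptation_obj):
        return user_local_system.execute(adaptation_obj)
    return proxy.execute(adaptation_obj)

# A local system lacking text-to-speech forces the proxy path:
local = AdaptationTarget("user-local-system", capable=False)
proxy = AdaptationTarget("proxy", capable=True)
result = route_adaptation_object("text-to-speech", local, proxy)
```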
- the following illustrates one exemplary function of the adaptation system 100. It is noted that the adaptation system 100 can be used in a multitude of other contexts and be embodied in other electronic devices than those that are described below. Jennifer Adams uses a wheelchair for mobility since she lost the use of all extremities in a diving accident.
- the adaptation engine 162 pairs the ATM registered capability object with this preference object by assigning the adaptation session number, e.g., "12345", to a copy of the capability object. This pairing triggers the adaptation sequence.
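The pairing step, assigning the session number to a copy of the registered capability object so the registry copy stays untouched, can be sketched like this; the dict layout is an assumption:

```python
import copy

def pair_capability_with_session(capability: dict, session_id: str) -> dict:
    """Assign an adaptation session number to a copy of a registered
    capability object, leaving the registered original unchanged."""
    paired = copy.deepcopy(capability)
    paired["adaptation_session"] = session_id
    return paired

registered = {"name": "atm-touchscreen", "adaptation_session": None}
paired = pair_capability_with_session(registered, "12345")
```

Copying first matters: the same registered capability object may be paired into many concurrent sessions.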
- the first step in the adaptation sequence involves obtaining a preference object for Jennifer's situation at establishment of the session.
- the adaptation engine 162 passes the authentication from the preference object to the query interface 158, which verifies Jennifer's identity.
- Section of Jennifer's static UDO (Secured Information)
- the structure of the static UDO is such that any faculty which is not impaired (presumably for some time that is "long", e.g., compared to a mean session length) does not have an entry. Thus, Jennifer's vision, which is not impaired, does not appear in an entry in the UDO.
- the abstraction filter 150 first deduces from the UDO that Jennifer has no basic visual disabilities, since there is no FACULTY entry for vision in her static UDO. It also sees that her physical condition does not constrain head movement (only her arms, hand and legs are affected), so her field of vision is not going to be limited by not being able to move her head. [Although this is not an issue here, if Jennifer's head were restricted in its movement and her effective field of view thereby constrained, this would have to be taken into account in the accommodation, perhaps through a field-of-view parameter.] Consequently, the preference for using the default visual presentation can be used without any limitation.
- the other interface functions of Entry and Control will be mediated through the accessor over an ATMSecureRF channel, among other possible means of interconnecting.
- Jennifer's accessor is passed to the preference/capability assembly 154, where a new preference object and a capability object are created. These are identified by the adaptation session number previously sent by the ATM to the accessor and by unique names. The accessor also assigns itself a unique handle to identify itself in the adaptation session. The ATM will assign an ATMSecureRF port address to accompany the accessor's name. For the example, the name can be "jadams". These are sent on to the adaptation engine 162, where they are analyzed by the adaptation manager 224 and found to be consistent with the capability object previously registered by the ATM. This results in an adaptation object that essentially verifies this consistency and provides the necessary setup for the transaction interaction.
- the foregoing adaptation object is essentially a confirmation of the mapping from the ATM's touchscreen inputs to the accessor via the ATMSecureRF channel, using a protocol called ATMtransaction, which will allow Jennifer to use her voice commands to the accessor to provide the transaction inputs.
- the ID "transaction" entries on the Function, Configuration and Setup tie all these fields together.
- This adaptation object is sent to the ATM, which then carries out the mapping. Information on the configuration and setup are sent to the accessor as well and the ATM notifies the accessor to begin the transaction using the ATMSecureRF.
- the accessor transaction application is now running, controlled by her voice.
- the preference object sent earlier caused a capability object from the accessor manufacturer to be accessed and cached in the adaptation engine, with a session number. This same session number is used in the new preference object to cross-reference it with the accessor capability object.
- the storage of this item in the dynamic UDO is triggered by Jennifer's command to the accessor to have the ATM direct its visual display to the accessor.
- a constraint message will be forwarded to the static UDO repository to be combined with Jennifer's information there to form a preference object and the necessary capability objects to complete the preference transaction.
- the constraint message is cached in the dynamic UDO at least until the session ends and thereafter on a least-recently-used caching discipline.
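The least-recently-used caching discipline mentioned here is a standard structure; a minimal sketch follows (the class name and capacity are assumptions, and the real dynamic UDO would also honor the until-session-end rule):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache, evicting the stalest entry
    once capacity is exceeded."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self._items = OrderedDict()

    def put(self, key, value):
        if key in self._items:
            self._items.move_to_end(key)
        self._items[key] = value
        if len(self._items) > self.capacity:
            self._items.popitem(last=False)  # drop least recently used

    def get(self, key):
        if key not in self._items:
            return None
        self._items.move_to_end(key)  # mark as recently used
        return self._items[key]

cache = LRUCache(2)
cache.put("msg-a", "constraint A")
cache.put("msg-b", "constraint B")
cache.get("msg-a")                   # touch A so it is most recent
cache.put("msg-c", "constraint C")   # capacity exceeded: msg-b evicted
```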
- PREFERENCE TYPE "supplemental" > capability object: jadamsC1;
- This preference object repeats the identification information and adds a preference for the visual presentation to be delivered to the accessor, using the transaction setup already in place.
- the Preference shows a tag not defined formally earlier, the LOCUS tag. This will eventually indicate where the visual presentation will arrive for display.
- the "supplemental" preference points to a capability object, described below, containing additional information regarding the accessor and Jennifer's preferred GUI.
- This capability object makes reference to the accessor's characteristics, which the adaptation engine 162 should have already cached from a previous retrieval, and to the GUI that Jennifer uses. Some of the GUI characteristics are in fact preferences that Jennifer has specified, but are considered to be capabilities because she does not permit altering them. Note also that the preference object "jadamsPV", above, is cross-referenced.
- the ATM has previously provided the adaptation engine 162 with a capability object that describes the visual interface of the ATM.
- the adaptation engine 162 uses this interface description and the accessor GUI description to render the ATM display for use on the accessor. This assumes that there is a generic, or abstract, description of the ATM visual interface available. In the case where there is not such a description, the ATM and accessor can revert to another standard interface.
- the preference object and capability object above are transferred to the adaptation engine 162, where the adaptation procedure begins with the adaptation manager 224 examining the preference object and the capability objects to determine which adaptation service units 228 should be assigned. This can be done by examining the attributes in the tags of the preference object and capability objects. For example, the following characteristics taken from the accessor's capability information indicate that size and resolution of the accessor's display screen will be involved.
- an adaptation service unit that can deal with display scaling is selected.
- the adaptation service unit uses the preference object and capability object information to begin building adaptation protocols, which are made up of adaptation concepts and adaptation constraints. Since the adaptation manager 224 does not yet have access to the full description of the ATM's current interface, the adaptation service unit can only build "approaches" and "outlines" of what is to be done: the adaptation concepts and constraints. Further detailing to build concrete adaptation objects will be provided by the adaptation service, assumed to have access to the description of the ATM's current interface, or by the interpretive consolidator 232. The adaptation concepts and constraints produced by the adaptation service units are collected into adaptation protocols, which are then presented to the interpretive consolidator 232 for coordination and resolution of conflicts, among other services that it can provide.
- the interpretive consolidator 232 is called upon to coordinate these adaptation protocols and to resolve any conflicts detected. It can also bring into the activity any knowledge that it might have about previous adaptation transactions that might have been requested by Jennifer during her session and even those of other people who might have interacted with the ATM prior to Jennifer. Because the example is fragmentary for tutorial purposes, there are no apparent conflicts to resolve. However, the interpretive consolidator 232 already has some knowledge about the ATM's display from a capability object that was submitted on its behalf:
- the APPLY attribute in the CONCEPT tag indicates the Adaptation concept within the same Adaptation protocol to which the adjustment will apply.
- the present system uses "self-descriptions" of information systems to provide flexibility. This information is used to determine operating conditions and configurations for a given session, and to support dynamic changes in these conditions and configurations during a session. These operating conditions and configurations are the result of negotiation, either by one of the communicants (such as a user or a machine that is associated with the user) or by a neutral party.
- the self-descriptions can be used to negotiate a communication paradigm with an information system.
- the negotiations may be simple, e.g., matching attributes, or complex, e.g., matching an offer to a range (e.g., similarity), weighted decision models and counter-offers.
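The range-matching ("similarity") flavor of negotiation can be sketched as a scoring function. The attribute names and the fraction-of-hits scoring rule are illustrative assumptions; weighted decision models and counter-offers would layer on top of something like this:

```python
def match_offer(offer: dict, acceptable: dict) -> float:
    """Score an offer against acceptable ranges.

    Each `acceptable` entry maps an attribute to a (low, high) range;
    the score is the fraction of required attributes that the offer
    satisfies. 1.0 means a full match.
    """
    if not acceptable:
        return 1.0
    hits = sum(
        1 for key, (low, high) in acceptable.items()
        if key in offer and low <= offer[key] <= high
    )
    return hits / len(acceptable)

# Hypothetical negotiation over display attributes:
score = match_offer(
    {"font_size": 18, "contrast": 0.9},
    {"font_size": (12, 32), "contrast": (0.7, 1.0)},
)
```

Simple attribute matching is the degenerate case where every range collapses to a single value.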
- the self-descriptions, or "profiles", include: information about the user's working context; about the task that is to be performed; what kinds of constraints are placed on how the task is accomplished, e.g., by the working context or by limitations of the user; what solutions the user prefers to apply to ameliorate the constraints; platform and communications capacities; device capabilities and limitations.
- a profile might only state that for all contexts, a selected user prefers the head tracker as a pointing device to be used for control and data entry, and that visual presentation is preferred in all contexts.
- the foregoing and still other benefits of the adaptation system 100 include: 1) communications processes that permit conveying user preference and capability information to an adaptation service; 2) software for use as an adaptation service to generate transformations of user interface and content of applications and any other information necessary for operational configuration and setup of the user's system on the basis of a user's preferences, capabilities and situation; 3) communications for conveying said generated transformation and operational information to the user's system or to another system acting as proxy for the user's system; 4) a protected storage system for holding users' personal data related to capabilities and preferences; 5) an extraction mechanism which abstracts a user's personal data to protect privacy, for producing a statement of preferences and capabilities in a current situation to be conveyed to an adaptation service; 6) a mechanism for learning about a user's preferences and capabilities and using this information for optimizing adaptation processes.
- the invention employs several aspects of prior art in novel ways and introduces new functionality.
- the invention provides an end-to-end process, beginning with gathering user information, continuing through generating transformations based upon that information, and ending with conveying the instructions for implementing the changes to the user's system.
- a disability may be irrelevant to a given condition in a given context.
- a blind person who can read Braille is generally able to read whether it is dark or light, quiet or noisy in the surrounding environment.
- a disability might be irrelevant in one interaction function but crucial in another.
- a person who has only a motor nerve difficulty (e.g., quadriplegia) might not be constrained by this disability in visual or aural presentations, but might be considerably constrained when control has to be actuated through the visual presentation, for example, navigating and selecting hypertext links in small font on the browser screen.
- a disability, by its nature, may also cause temporary impairment of an ability that is not otherwise constrained.
- collaboration among the various modal accessing elements can be performed.
- An example of this is a military command post where speech, visual, audio and gestural controls are coordinated. Each officer will have preferences based upon rank, experience, specialty and mission assignment, in addition to personal effects.
- The adaptation system can handle collaborations among two or more users. If the adaptation system is not powerful enough to execute the transformations by itself, a proxy can be used to handle this execution for the adaptation system. Consequently, configuration information may accompany the instructions for interface transformations. Similarly, additional communications or protocols may be used to accomplish the execution. For example, if a proxy is used, an identifier and locator may be used to provide access to it, and possibly another protocol invoked between the proxy and the user system. Consequently, some setup information may also need to accompany the transformation instructions.
- the present system allows for the generation of self-descriptions that define user preferences and the capabilities of components interacting on some task and their operating environments and conditions.
- the adaptation engine 162 enables negotiations based on those descriptions and conditions, and the construction of configurations and interface transformations according to the negotiations for accomplishing the intended task.
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Automation & Control Theory (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Selective Calling Equipment (AREA)
Abstract
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2001247422A AU2001247422A1 (en) | 2000-03-14 | 2001-03-14 | A system and method for enabling dynamically adaptable user interfaces for electronic devices |
Applications Claiming Priority (8)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18919100P | 2000-03-14 | 2000-03-14 | |
| US60/189,191 | 2000-03-14 | ||
| US61018600A | 2000-07-05 | 2000-07-05 | |
| US61017900A | 2000-07-05 | 2000-07-05 | |
| US61018100A | 2000-07-05 | 2000-07-05 | |
| US09/610,186 | 2000-07-05 | ||
| US09/610,181 | 2000-07-05 | ||
| US09/610,179 | 2000-07-05 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2001069380A2 true WO2001069380A2 (fr) | 2001-09-20 |
| WO2001069380A3 WO2001069380A3 (fr) | 2003-03-27 |
Family
ID=27497790
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2001/008151 Ceased WO2001069380A2 (fr) | 2000-03-14 | 2001-03-14 | System and method for enabling dynamically adaptable user interfaces for electronic devices |
Country Status (2)
| Country | Link |
|---|---|
| AU (1) | AU2001247422A1 (fr) |
| WO (1) | WO2001069380A2 (fr) |
Cited By (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| FR2852707A1 (fr) * | 2003-03-17 | 2004-09-24 | Eric Joye | Universal control interface |
| EP1551153A1 (fr) * | 2004-01-05 | 2005-07-06 | Microsoft Corporation | User interface configuration |
| SG113437A1 (en) * | 2001-11-13 | 2005-08-29 | Ntt Docomo Inc | Service information providing system, service information providing method, and control station |
| EP1308856A3 (fr) * | 2001-11-01 | 2006-01-18 | Matsushita Electric Industrial Co., Ltd. | Information providing system, information providing server device for use therein, information terminal, and information providing method depending on a user profile |
| EP1679828A1 (fr) * | 2005-01-07 | 2006-07-12 | Samsung Electronics Co., Ltd. | Method and system for prioritizing tasks made available by a device in a network |
| EP1528464A3 (fr) * | 2003-09-05 | 2007-01-31 | Samsung Electronics Co., Ltd. | Proactive user interface with evolving agent |
| WO2007074537A1 (fr) * | 2005-12-27 | 2007-07-05 | Matsushita Electric Works, Ltd. | Systems and methods for providing distributed user interfaces to configure client devices |
| EP1570374A4 (fr) * | 2002-10-16 | 2010-06-02 | Korea Electronics Telecomm | Method and system for adaptive transformation of visual content according to a user's characteristic low-vision symptoms and presentation preferences |
| US8028283B2 (en) | 2006-03-20 | 2011-09-27 | Samsung Electronics Co., Ltd. | Method and system for automated invocation of device functionalities in a network |
| US8069422B2 (en) | 2005-01-10 | 2011-11-29 | Samsung Electronics, Co., Ltd. | Contextual task recommendation system and method for determining user's context and suggesting tasks |
| US8099313B2 (en) | 2004-09-22 | 2012-01-17 | Samsung Electronics Co., Ltd. | Method and system for the orchestration of tasks on consumer electronics |
| EP2434444A1 (fr) * | 2010-09-28 | 2012-03-28 | Honeywell International, Inc. | Adaptive and automatic configurable system and method |
| US8185427B2 (en) | 2004-09-22 | 2012-05-22 | Samsung Electronics Co., Ltd. | Method and system for presenting user tasks for the control of electronic devices |
| US8205013B2 (en) | 2005-05-02 | 2012-06-19 | Samsung Electronics Co., Ltd. | Method and system for aggregating the control of middleware control points |
| EP2242208A3 (fr) * | 2002-10-02 | 2012-10-03 | Mitsubishi Electric Corporation | Communication adapter apparatus, communication adapter, method of writing data to a non-volatile memory, and electric apparatus and ROM writer used by the method |
| US8412554B2 (en) | 2004-09-24 | 2013-04-02 | Samsung Electronics Co., Ltd. | Method and system for describing consumer electronics using separate task and device descriptions |
| US8990688B2 (en) | 2003-09-05 | 2015-03-24 | Samsung Electronics Co., Ltd. | Proactive user interface including evolving agent |
| WO2019005341A1 (fr) * | 2017-06-28 | 2019-01-03 | Microsoft Technology Licensing, Llc | Enhancing the user experience based on device hardware capabilities |
| USRE47908E1 (en) | 1991-12-23 | 2020-03-17 | Blanding Hovenweep, Llc | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
| USRE48056E1 (en) | 1991-12-23 | 2020-06-16 | Blanding Hovenweep, Llc | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
| US20230297986A1 (en) * | 2022-03-18 | 2023-09-21 | Wincor Nixdorf International Gmbh | Self-Service Terminal and Method |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB9508283D0 (en) * | 1995-02-07 | 1995-06-14 | British Telecomm | Information services provision and management |
| WO1999009658A2 (fr) * | 1997-08-15 | 1999-02-25 | Inergy Online, Inc. | Server-side operating system and Internet-independent platform and application suite |
| US6509913B2 (en) * | 1998-04-30 | 2003-01-21 | Openwave Systems Inc. | Configurable man-machine interface |
2001
- 2001-03-14 WO PCT/US2001/008151 patent/WO2001069380A2/fr not_active Ceased
- 2001-03-14 AU AU2001247422A patent/AU2001247422A1/en not_active Abandoned
Cited By (29)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| USRE47908E1 (en) | 1991-12-23 | 2020-03-17 | Blanding Hovenweep, Llc | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
| USRE48056E1 (en) | 1991-12-23 | 2020-06-16 | Blanding Hovenweep, Llc | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
| USRE49387E1 (en) | 1991-12-23 | 2023-01-24 | Blanding Hovenweep, Llc | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
| EP1308856A3 (fr) * | 2001-11-01 | 2006-01-18 | Matsushita Electric Industrial Co., Ltd. | Information providing system, information providing server device for use therein, information terminal, and user-profile-dependent information providing method |
| SG113437A1 (en) * | 2001-11-13 | 2005-08-29 | Ntt Docomo Inc | Service information providing system, service information providing method, and control station |
| EP2242208A3 (fr) * | 2002-10-02 | 2012-10-03 | Mitsubishi Electric Corporation | Communication adapter apparatus, communication adapter, method of writing data to a non-volatile memory, and electric apparatus and ROM writer used by the method |
| EP1570374A4 (fr) * | 2002-10-16 | 2010-06-02 | Korea Electronics Telecomm | Method and system for adaptively transforming visual content according to a user's characteristic low-vision symptoms and presentation preferences |
| WO2004086330A3 (fr) * | 2003-03-17 | 2005-08-25 | Eric Joye | Universal control interface |
| FR2852707A1 (fr) * | 2003-03-17 | 2004-09-24 | Eric Joye | Universal control interface |
| EP1528464A3 (fr) * | 2003-09-05 | 2007-01-31 | Samsung Electronics Co., Ltd. | Proactive user interface having an evolving agent |
| EP1522918A3 (fr) * | 2003-09-05 | 2007-04-04 | Samsung Electronics Co., Ltd. | Proactive user interface |
| US8990688B2 (en) | 2003-09-05 | 2015-03-24 | Samsung Electronics Co., Ltd. | Proactive user interface including evolving agent |
| EP1551153A1 (fr) * | 2004-01-05 | 2005-07-06 | Microsoft Corporation | User interface configuration |
| US8196044B2 (en) | 2004-01-05 | 2012-06-05 | Microsoft Corporation | Configuration of user interfaces |
| US8185427B2 (en) | 2004-09-22 | 2012-05-22 | Samsung Electronics Co., Ltd. | Method and system for presenting user tasks for the control of electronic devices |
| US8099313B2 (en) | 2004-09-22 | 2012-01-17 | Samsung Electronics Co., Ltd. | Method and system for the orchestration of tasks on consumer electronics |
| US8412554B2 (en) | 2004-09-24 | 2013-04-02 | Samsung Electronics Co., Ltd. | Method and system for describing consumer electronics using separate task and device descriptions |
| EP1679828A1 (fr) * | 2005-01-07 | 2006-07-12 | Samsung Electronics Co., Ltd. | Method and system for prioritizing tasks made available by a device in a network |
| US8510737B2 (en) | 2005-01-07 | 2013-08-13 | Samsung Electronics Co., Ltd. | Method and system for prioritizing tasks made available by devices in a network |
| US8069422B2 (en) | 2005-01-10 | 2011-11-29 | Samsung Electronics, Co., Ltd. | Contextual task recommendation system and method for determining user's context and suggesting tasks |
| US8205013B2 (en) | 2005-05-02 | 2012-06-19 | Samsung Electronics Co., Ltd. | Method and system for aggregating the control of middleware control points |
| US8806347B2 (en) | 2005-12-27 | 2014-08-12 | Panasonic Corporation | Systems and methods for providing distributed user interfaces to configure client devices |
| JP2008522248A (ja) * | 2005-12-27 | 2008-06-26 | Matsushita Electric Works, Ltd. | Systems and methods for providing distributed user interfaces to configure client devices |
| WO2007074537A1 (fr) * | 2005-12-27 | 2007-07-05 | Matsushita Electric Works, Ltd. | Systems and methods for providing distributed user interfaces to configure client devices |
| US8028283B2 (en) | 2006-03-20 | 2011-09-27 | Samsung Electronics Co., Ltd. | Method and system for automated invocation of device functionalities in a network |
| EP2434444A1 (fr) * | 2010-09-28 | 2012-03-28 | Honeywell International, Inc. | Adaptive and automatic configurable system and method |
| WO2019005341A1 (fr) * | 2017-06-28 | 2019-01-03 | Microsoft Technology Licensing, Llc | Enhancing the user experience based on device hardware capabilities |
| US10586389B2 (en) | 2017-06-28 | 2020-03-10 | Microsoft Technology Licensing, Llc | Device panel capabilities and spatial relationships |
| US20230297986A1 (en) * | 2022-03-18 | 2023-09-21 | Wincor Nixdorf International Gmbh | Self-Service Terminal and Method |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2001069380A3 (fr) | 2003-03-27 |
| AU2001247422A1 (en) | 2001-09-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2001069380A2 (fr) | System and method for validating dynamically adaptable user interfaces for electronic devices | |
| US7167142B2 (en) | Multi-user display system | |
| US7454526B2 (en) | Method and system for providing browser functions on a web page for client-specific accessibility | |
| JP4897844B2 (ja) | Personalization of a hospital intranet website |
| US20070055938A1 (en) | Server-based method for providing internet content to users with disabilities | |
| JP2003526136A (ja) | System and method for displaying computerized patient records across a network |
| Yang et al. | Enhancing pervasive Web accessibility with rule-based adaptation strategy | |
| EP1685679B1 (fr) | Compatible user interface front-end system for remote user interfaces |
| EP1357723A2 (fr) | Interaction interface for an environment with a plurality of devices |
| US10366135B2 (en) | Zero footprint application virtualization | |
| Stephanidis et al. | Self-adapting web-based systems: Towards universal accessibility | |
| Vandervelpen et al. | Light-weight distributed web interfaces: preparing the web for heterogeneous environments | |
| CN109767847A (zh) | Telemedicine consultation method, apparatus, and computer-readable storage medium |
| Burzagli et al. | Design for All in action: An example of analysis and implementation | |
| Meyer et al. | Literature review of computer tools for the visually impaired: a focus on search engines | |
| JP2003281030A (ja) | Information providing server and information providing method |
| KR20190079092A (ko) | System and method for identifying and authenticating users based on unstructured information in a non-logged-in state |
| Mankoff et al. | Domisilica: Providing ubiquitous access to the home | |
| Záruba et al. | CONNECT: A personal remote messaging and monitoring system to aid people with disabilities | |
| JP4761702B2 (ja) | System and method for privacy-aware personalization |
| Fink et al. | Towards a user-adapted information environment on the Web | |
| Ferri et al. | The HMI digital ecosystem: Challenges and possible solutions | |
| Jimenez-Mixco et al. | A new approach for accessible interaction within smart homes through virtual reality | |
| Emiliani | Perspectives on Accessibility: From Assistive Technologies to Universal Access and Design for All |
| Oh et al. | What is the Key Difference Between Legal Accessibility Guidelines and Real Users’ Experience? |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: COMMUNICATION PURSUANT TO RULE 69 EPC (EPO FORM 2524 OF 161203) |
|
| 122 | Ep: pct application non-entry in european phase | ||
| NENP | Non-entry into the national phase |
Ref country code: JP |