"Haptic Effects With Proximity Sensing " eliminates lag time between touch and vibration!This fantastically cool tech from IMMR eliminates the 50ms lag time between touching the screen and receiving vibration confirmation (caused by the time it takes the vibration motor to ramp up). It's one small step for technology, one giant leap for virtual reality. But wait, there's more! Later in the document the language turns to fascinating haptic gesture control without touching the screen. The haptics are felt by the hand holding the portable device. This really ices IMMR's dominance in the touch-screen feedback arena. If any company wants ZERO lag for touch-screen interactions or magical gesture control haptics they'll have to go through IMMR. Excerpts from the patent application: [United States Patent Application 20090106655 April 23, 2009 Haptic Effects With Proximity Sensing Abstract A method of generating haptic effects on a device includes detecting the presence of an object near an input area of the device and generating a haptic effect on the device in response to the presence detection. Inventors: Grant; Danny A.; (Quebec, CA); Gregorio; Pedro; (Quebec, CA); Lacroix; Robert A.; (Quebec, CA) Assignee Name: Immersion Corporation Filed: May 4, 2007 [0005]Increasingly, portable devices are moving away from physical buttons in favor of touchscreen-only interfaces. This shift allows increased flexibility, reduced parts count, and reduced dependence on failure-prone mechanical buttons and is in line with emerging trends in product design. When using the touchscreen input device, a mechanical confirmation on button press or other user interface action can be simulated with haptics. [0006]For portable devices, cost is an important driving factor. Therefore, to generate haptic effects a single low cost actuator is generally used, for example an eccentric rotating mass ("ERM") motor or an electromagnetic motor. 
These actuators are able to produce strong-magnitude haptic outputs. However, they also require a certain amount of time to achieve their peak haptic output (e.g., approximately 50 ms). These actuators are also used to provide feedback to the user when operating a touch-sensitive input of a touchscreen device. For example, when the user presses a button on a touchscreen, a haptic effect is output to give the sensation of pressing a mechanical button. It is desired to output the haptic effect at the same time the user has selected the button in the interface. However, due to the time it takes for the actuator to reach a desired magnitude, the haptic effect lags behind the button-press event. If this lag becomes too long, the user will not perceive the button press and the haptic effect as a single event.

[0007] Based on the foregoing, there is a need for an improved system and method for generating haptic effects for a touchscreen.

DETAILED DESCRIPTION

[0012] One embodiment is a portable device that includes a haptic feedback system with proximity sensing. The haptic system initiates the haptic feedback before a user actually touches a touchscreen or other input area, based on the proximity information.

[0018] Proximity sensor 14 may use any technology that allows the proximity of a finger or other object to touchscreen 13 to be sensed. For example, it may be based on sensing technologies including capacitive, electric field, inductive, Hall effect, reed, eddy current, magnetoresistive, optical shadow, optical visual light, optical IR, optical color recognition, ultrasonic, acoustic emission, radar, heat, sonar, conductive or resistive, and the like.

[0019] In one embodiment, proximity sensor 14 includes one or more proximity sensors that each generate a sensing field above touchscreen 13 and that produce signals when an object disturbs or intercepts the sensing field(s). Each sensing field typically generates its own signals when disturbed.
In one embodiment, a single sensing field is used to cover the entire touchscreen 13 surface. In another embodiment, a single sensing field covers only a portion of the touchscreen 13 surface. In another embodiment, multiple sensing fields are used to cover the entire touchscreen 13 surface. Any number of sensing fields may be used. In some cases, in order to perform tracking, the sensing fields may even be distributed as a pixelated array of nodes.

[0022] At 104, proximity sensor 14 determines the position, speed, and/or acceleration of the finger relative to the surface of touchscreen 13. This enables processor 12 to determine whether the user's finger will actually contact touchscreen 13. For example, if the proximity signal is increasing at a certain rate, it is highly likely that the user will contact touchscreen 13 and press a button.

[0023] At 106, based on the determination at 104, processor 12 can calculate when the finger is expected to contact touchscreen 13. In anticipation of this contact, processor 12 initiates the haptic effect before the actual contact, thus avoiding the lag time caused by actuator 18. Processor 12 may use the acceleration of the finger and the starting time required by actuator 18 to determine how far in advance to initiate the haptic effect and energize actuator 18. Therefore, the haptic effect will be implemented at approximately the exact time the finger actually contacts touchscreen 13, resulting in better synchrony of the haptic effect with the button-press event. In another embodiment, processor 12 may initiate the haptic effect upon sensing the mere presence of the finger at 102.]

Gesture control section:

[[0024] In typical use of cell phones or PDAs, the user generally holds the device in one hand and uses the other hand to interact with the user interface, such as touchscreen 13.
For handheld haptic devices with proximity sensing, this means that the user can sense the haptic feedback with the hand holding the device even though the finger has not yet touched the surface. Therefore, useful haptic effects can be created as a function of proximity even when a finger never touches touchscreen 13.

[0025] In one embodiment, if a user is hovering a finger over the touchscreen and moving over a grid of displayed buttons, a first haptic effect can be played when the user moves from over one button to over the next. The first haptic effect can be a short, soft haptic effect in order to simulate the feel of moving over the edge of one button to the next. This first haptic effect gives the user an indication of the button locations without the user activating the buttons. A second haptic effect can be played when the user actually touches the screen and acts to select the button. The second haptic effect can be a strong haptic effect simulating a sharp button click.]

http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=1&u=/netahtml/PTO/search-bool.html&r=1&f=G&l=50&co1=AND&d=PG01&s1=20090106655&OS=20090106655&RS=20090106655
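For the engineers lurking here: paragraph [0019] mentions that the sensing fields can be distributed as a pixelated array of nodes for tracking. One common way to get a finger position out of an array like that is a signal-weighted centroid. This is just my own illustration in Python, not anything from the patent; the function name and data layout are assumptions.

```python
def estimate_position(nodes):
    """Estimate the hovering finger's (x, y) position from an array of
    proximity-sensing nodes.

    nodes: list of (x, y, signal) tuples, one per sensing-field node,
    where signal is the disturbance strength measured at that node.
    Returns the signal-weighted centroid, or None if nothing is sensed.
    (Illustrative sketch only -- not Immersion's implementation.)
    """
    total = sum(s for _, _, s in nodes)
    if total <= 0:
        return None  # no field is disturbed; no finger nearby
    x = sum(xi * s for xi, _, s in nodes) / total
    y = sum(yi * s for _, yi, s in nodes) / total
    return (x, y)
```

With a finger disturbing two adjacent nodes equally, the estimate lands halfway between them, which is exactly why a pixelated array buys you tracking rather than just presence detection.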
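The clever part is the timing math in [0022]-[0023]: fire the actuator early enough that its ~50 ms ramp-up finishes right as the finger lands. Here's a back-of-the-envelope Python sketch of how that anticipation could be computed from the sensed distance, speed, and acceleration. The function names and the constant-acceleration model are my assumptions, not Immersion's actual algorithm.

```python
import math

ACTUATOR_RAMP_S = 0.050  # ~50 ms actuator ramp-up time, per [0006]

def time_to_contact(distance, velocity, accel):
    """Estimate seconds until the finger reaches the screen.

    distance: metres above the surface; velocity and accel are positive
    when moving toward the screen. Solves distance = v*t + 0.5*a*t^2.
    Returns None if contact is not expected. (Illustrative sketch.)
    """
    if abs(accel) > 1e-9:
        disc = velocity ** 2 + 2.0 * accel * distance
        if disc < 0:
            return None  # decelerating; finger stops short of the screen
        t = (-velocity + math.sqrt(disc)) / accel
        return t if t >= 0 else None
    if velocity <= 0:
        return None  # hovering or moving away at constant speed
    return distance / velocity

def should_fire(distance, velocity, accel):
    """True once the haptic effect must start now so that the actuator
    reaches peak output at the predicted moment of contact."""
    t = time_to_contact(distance, velocity, accel)
    return t is not None and t <= ACTUATOR_RAMP_S
```

So a finger 2 mm out and closing at 5 cm/s (40 ms to contact) triggers the pre-fire, while the same finger 2 cm out does not. That's the whole trick: the lag doesn't disappear, it's hidden inside the approach time.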
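And the gesture section [0025] boils down to a tiny state machine: a soft pulse when the hovering finger crosses from one button to the next, a sharp click on actual touch. A minimal Python sketch, assuming a uniform button grid; the class and effect names are mine, purely for illustration.

```python
class HoverHaptics:
    """Two-level haptics over a uniform grid of on-screen buttons,
    in the spirit of [0025]. Illustrative sketch, not Immersion's code."""

    SOFT = "soft_pulse"    # crossing a button edge while hovering
    STRONG = "sharp_click" # actually selecting a button by touch

    def __init__(self, button_w, button_h):
        self.button_w = button_w
        self.button_h = button_h
        self._current = None  # grid cell the finger last hovered over

    def _button_at(self, x, y):
        return (int(x // self.button_w), int(y // self.button_h))

    def on_hover(self, x, y):
        """Called from proximity tracking while the finger hovers.
        Returns the effect to play, or None."""
        btn = self._button_at(x, y)
        changed = self._current is not None and btn != self._current
        self._current = btn
        return self.SOFT if changed else None

    def on_touch(self):
        """Called when the finger actually contacts the screen."""
        return self.STRONG
```

The point of the two distinct effects is that the user can feel out the button layout without ever committing to a press, then gets unmistakable confirmation on contact.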
Msg #45229 | Re: "Haptic Effects With Proximity Sensing" eliminates lag time between touch and vibration! | stormado | 0 recs | 4/23/2009 9:06:44 AM