Embedded Computing, May 2018, page 12
PCAP Touch displays -
what does the future hold?
By Markus Hell,
DATA MODUL
In this article the author compares
three different touch technologies
and examines their suitability for
industrial applications.
The triumph of PCAP Touch technology began in the consumer market of the 2000s. Displays with PCAP Touch have since reached professional products. Operating concepts with touch functions (single-finger and multi-touch input, swipes, and so on) are widely known in industrial applications, and their functional principles and design possibilities are well established. Development departments continuously work on demand-driven, application-specific and potentially revolutionary touch methods and technologies. Which current enhancements and further developments are promising or upgradeable? For which applications and industries are certain specific technologies useful, possible or even necessary? What could the touch panels of the (near) future look like?
PCAP Touch technology is a familiar element of everyday life as well as of industrial products across all industries. Technical features such as operation under water or with gloves, an extended temperature range, EMC conformity and various assembly options for different requirements (SITO and OGS touch displays, film/film and glass touch displays) are today part of the standard scope of supply of a PCAP Touch control unit. The technologically demanding optical bonding is offered by various suppliers, such as Data Modul, in differing quality. Three demand-focused operating concepts that build on or enhance the existing projected capacitive technology are currently in focus: PCAP with haptic feedback, with gesture control and with Force/3D Touch.
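Force/3D Touch adds a pressure dimension to the touch report, so one contact point can trigger different actions depending on how hard it is pressed. A minimal sketch of this idea follows; the function name, the normalized pressure scale and the threshold values are illustrative assumptions, not values from any specific controller.

```python
# Illustrative only: maps a normalized pressure reading from a
# Force/3D Touch controller to interaction tiers. The thresholds
# are hypothetical, not taken from any particular sensor.

def classify_press(pressure: float) -> str:
    """Classify a touch by its normalized pressure (0.0 to 1.0)."""
    if pressure <= 0.0:
        return "none"        # no contact reported
    if pressure < 0.3:
        return "light-tap"   # ordinary PCAP touch
    if pressure < 0.7:
        return "press"       # deliberate press
    return "force-press"     # hard press triggers a secondary action
```

In a real device the pressure value would come from the touch controller's report, and the tier boundaries would be tuned to the cover-glass stack.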
Touch panels with haptic feedback confirm the user's interaction on the touch sensor by transferring a force directly back to the user; eye contact is therefore not strictly necessary. The user can ascertain the position of their finger on the sensor from the tangible feedback alone, and the interaction is thus confirmed. An extension of this kind is conceivable in applications where the user must stay focused on an object, a patient or an action near the screen; fields of application include the medical, automotive and entertainment industries. In many applications, however, touch operation without eye contact is not (yet) required, so the benefits of haptic feedback are initially limited: as a general rule, device users regard simply touching the surface as sufficient tactile confirmation of an interaction. Integrating this additional function into an existing application is complex and expensive. The entire mechanical concept must be adapted, because the tactile providers of feedback, the mechanical actuators (vibration motors, piezo elements, linear drives), also have to be integrated. To enable the feedback at all, the surface has to be mounted in the casing on a floating basis, as otherwise no vibrations or similar movements can be transferred to the user. The moving medium (usually the touch and cover glass) is a further factor to consider: in industrial applications the cover glass that has to move is over 2 mm thick, which means that requirements regarding system stability, lifespan, power consumption or the force exerted on connecting elements cannot be met to optimum effect.
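The control logic behind such a panel can be reduced to mapping touch events onto actuator pulses. A minimal sketch, with hardware access abstracted behind a callable so the logic runs without an actuator; all names and the pulse duration are illustrative assumptions:

```python
# Sketch: fire one short haptic pulse per touch-down event.
# The actuator (piezo, vibration motor, linear drive) is passed
# in as a callable, so this logic is hardware-independent.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class TouchEvent:
    kind: str   # "down", "move" or "up"
    x: int
    y: int

def drive_haptics(events: List[TouchEvent],
                  pulse: Callable[[int], None],
                  duration_ms: int = 20) -> int:
    """Pulse the actuator on each touch-down; return the pulse count."""
    count = 0
    for ev in events:
        if ev.kind == "down":
            pulse(duration_ms)   # e.g. energize a piezo element briefly
            count += 1
    return count
```

On real hardware, `pulse` would drive a motor controller or piezo amplifier; here it can simply be `list.append` for testing.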
Hover gesture refers to touch-free interaction through gestures in a defined space above the sensor, along given axes (X, Y, Z). Gestures are identified either via an electromagnetic field supplementing the touch surface, or entirely on a camera basis. The GUI of the display is not covered by the fingers, so the view of the screen remains unimpaired; the surface is almost completely unaffected by soiling, and interaction with the touch sensor can take place without eye contact.
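The principle of evaluating gestures over the X, Y and Z axes can be sketched with a toy classifier: a finger track within hover range (Z below a cutoff) that travels mostly along X is read as a swipe. The thresholds and units are illustrative assumptions, not values from any real gesture controller.

```python
# Toy hover-gesture classifier over (x, y, z) samples in a defined
# space above the sensor. Thresholds and units are hypothetical.

from typing import List, Tuple

def classify_hover(samples: List[Tuple[float, float, float]],
                   z_max: float = 50.0,
                   min_travel: float = 30.0) -> str:
    """Return 'swipe-right', 'swipe-left' or 'none' for a finger track."""
    # Keep only samples inside the hover detection space (z <= z_max).
    track = [(x, y) for x, y, z in samples if z <= z_max]
    if len(track) < 2:
        return "none"
    dx = track[-1][0] - track[0][0]
    dy = track[-1][1] - track[0][1]
    # A swipe needs enough travel, predominantly along the X axis.
    if abs(dx) >= min_travel and abs(dx) > abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "none"
```

A production system would of course track velocity, filter noise and support a richer gesture set, but the core task is the same: segmenting a 3D track and matching it against known patterns.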
User behavior, however, is shaped by familiar operating concepts: on a screen, a user mostly falls back on learned operating patterns. The user experience that is crucial for product success is therefore unfamiliar, because the gestures first need to be learned. To adapt HMI systems to gesture control, wide-ranging adaptations in the GUI are required.
Figure 1. Example of a touch panel with Force Touch in a medical application