Getting Started With The Wacom Feel™ Multi-Touch API
What do I need to start developing software for the Wacom Feel™ Multi-Touch API?
You will need a Wacom tablet that supports touch input. You will also need to install the Wacom tablet driver, which includes the Wacom Feel™ Multi-Touch framework. A few files must be included alongside your source code; the details can be found in Multi-Touch Framework - Basics.
What do I need to use Wacom Feel™ Multi-Touch APIs? Do I need to include any SDK or library with my own project?
Support for the Wacom Feel™ Multi-Touch API is part of the Wacom Tablet Driver software itself. Installing the driver supplies all of the system components needed to communicate with the tablets via the API. When you build your app against the SDK-supplied headers, you can dynamically link to the installed framework. See https://github.com/Wacom-Developer/wacom-device-kit-macos-multi-touch.
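For orientation, here is a minimal initialization sketch. It assumes the WacomMultiTouch.h header and the WacomMTInitialize/WacomMTQuit entry points used by the sample code linked above; verify the names against your copy of the header.

```c
// Minimal initialization sketch (names follow the sample-code header;
// check them against your SDK copy).
#include <stdio.h>
#include "WacomMultiTouch.h"

int main(void)
{
    // Tell the driver which API version this code was built against.
    WacomMTError err = WacomMTInitialize(WACOM_MULTI_TOUCH_API_VERSION);
    if (err != WMTErrorSuccess)
    {
        fprintf(stderr, "Multi-Touch API unavailable (error %d). Is the tablet driver installed?\n", (int)err);
        return 1;
    }

    // ... register callbacks and run your event loop here ...

    WacomMTQuit();   // release the connection to the driver
    return 0;
}
```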
Does the Wacom Feel™ Multi-Touch API work for all Wacom tablets?
No, you will need a Wacom tablet that supports touch input.
Where can I find Wacom Feel™ Multi-Touch API sample applications?
Sample code can be found on https://github.com/Wacom-Developer/wacom-device-kit-macos-multi-touch.
How can I download the SDK for the Feel™ Multi-Touch API?
The runtime dependencies for the Feel™ Multi-Touch API are provided by the Wacom Tablet Driver; they are always installed when the end user installs the tablet driver on their system. The compile-time dependencies are detailed in Multi-Touch Framework - Basics.
What kind of touch data can I get from the tablet via the Wacom Feel™ Multi-Touch API?
All touch-capable tablets will report at least the following for each finger:
- Finger state (none, up, down, hold)
- Finger touch location (x, y)
- Finger size (width and height)
For supported tablets, additional data types are available:
- Blob data – irregularly shaped regions where one or more touch contacts were detected
- Raw data – all data from the sensor, including areas of non-contact along with contact areas
See the Data Read Functions section of the Multi-Touch Framework - Basics page for a full review of capabilities.
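As a rough illustration of the per-finger fields listed above, here is a sketch of a finger callback. The structure and field names (WacomMTFingerCollection, FingerCount, Fingers, TouchState, X, Y, Width, Height) are taken from the header used by the sample code and should be checked against your SDK copy.

```c
// Sketch: reading the per-finger data in a finger callback.
#include "WacomMultiTouch.h"

int MyFingerCallback(WacomMTFingerCollection *packet, void *userData)
{
    for (int i = 0; i < packet->FingerCount; ++i)
    {
        WacomMTFinger *finger = &packet->Fingers[i];

        // Finger state: none, up, down, or hold.
        WacomMTFingerState state = finger->TouchState;

        // Touch location and contact size.
        float x = finger->X;
        float y = finger->Y;
        float w = finger->Width;
        float h = finger->Height;

        // ... feed (state, x, y, w, h) into your own tracking logic ...
        (void)state; (void)x; (void)y; (void)w; (void)h;
    }
    return 0;   // return quickly; do the real work elsewhere
}
```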
Can I use the Wacom Feel™ Multi-Touch API to design touch functionality or gestures for a web app?
No, we currently do not offer browser support for the Wacom Feel™ Multi-Touch API.
Understanding Multi-Touch Data
What can I do with this touch data?
You can use the data to create features that enable direct interaction between the user and on-screen content, touching objects on the screen to trigger actions on them. Or you can use it to create gestures that drive the application, delivering an immersive, hands-on experience without a mouse or keyboard. The Wacom Feel™ Multi-Touch screen responds to many points of contact simultaneously, not just one or two fingers.
What are the primary operating modes supported by the Wacom Feel™ Multi-Touch API?
Consumer mode – In Consumer mode, the frontmost Wacom Feel™ Multi-Touch API-enabled application is the only process to receive touch data. The data is not passed on for gesture processing, to the system, or to any other multi-touch application.
Observer mode – In Observer mode, the multi-touch application is configured to receive touch data, which it can parse as needed. The touch data is also sent to the tablet driver gesture recognition algorithms as well as on to the system itself (for cursor navigation, for example). All running Wacom Feel™ Multi-Touch API enabled applications will receive the touch data. If the application chooses to pass the touch data through to the driver in Observer mode, then the tablet driver will interpret touch data and recognize gestures as appropriate for the tablet and operating system.
Passthrough mode – In Passthrough mode, touch data is sent to the system if it falls within a specified hit rectangle (hitrect). This is useful for Consumer-mode applications that want to preserve exclusive use of the touch data for drawing or object manipulation while still letting users interact with application controls (e.g., buttons, menus). Passthrough regions (hitrects) can be used in conjunction with the other two modes.
Which operating mode should I use in my application?
Consumer mode is useful if you are using the touch data to define your own gestures or other interactions and do not want any interference from the OS. It may require more work in your application because the OS does not move the cursor or provide the ability to interact with menus or other dialogs.
Observer mode gives the application the ability to interact with the touch data. The data is then passed on to the OS, which does the work of moving the cursor and interacting with items outside the scope of your window. Note that the Wacom Tablet Driver will continue to process touch data and apply its gesture recognition algorithms.
Passthrough mode is useful for setting up click regions or button box areas in your application. It is recommended to combine this with Consumer mode.
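As a reference point, here is a hedged sketch of registering a finger callback for a chosen mode. The function and enum names (WacomMTRegisterFingerReadCallback, WMTProcessingModeNone for Consumer, WMTProcessingModeObserver for Observer) are taken from the sample-code header and should be verified against your copy; Passthrough hit rectangles are registered with a separate call described on the Multi-Touch Framework - Reference page.

```c
// Sketch: registering a finger callback for a device in a chosen processing
// mode. Passing NULL for the hit rectangle requests data from the whole
// tablet surface.
#include "WacomMultiTouch.h"

WacomMTError RegisterForTouch(int deviceID, WacomMTProcessingMode mode)
{
    return WacomMTRegisterFingerReadCallback(deviceID,
                                             NULL,              // whole tablet surface
                                             mode,              // e.g. WMTProcessingModeNone (Consumer)
                                                                //  or  WMTProcessingModeObserver (Observer)
                                             MyFingerCallback,  // callback shown earlier
                                             NULL);             // user data
}
```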
How does palm rejection work? What is the "touch confidence bit"?
Palm rejection is the ability for an application to ignore touch data at times when it would cause interference (such as leaving smudges or streaks) in the application with which the user is interacting. The API lets an application build in palm rejection through the use of "confidence bits". A confidence bit is a flag on a single finger's touch data that indicates whether the tablet driver thinks the touch is intentional (valid) or accidental (invalid).
A common use of the confidence bit is for applications that incorporate both pen and touch input. Touches that are in the vicinity of the pen location would be deemed non-confident (for right-handed users, if the touch occurs primarily to the right of the pen location). Another use of confidence bits is to flag when a touch contact is too large, where "too large" is approximately anything over the size of a normal finger contact. Although confidence bits come with all finger touch data, the application is free to ignore them and build in its own criteria for deciding when touches are valid or not.
If your application needs to match down and up events, it is important to inspect the non-confident data; most down events happen in a non-confident state.
The use of palm rejection adds some delay, as it can take several frames for a contact to be accepted.
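One possible way to honor the confidence bit while still matching down and up events is sketched below. TrackContactState and ApplyTouchToCanvas are hypothetical application hooks, not Wacom API calls; the WacomMTFinger field names are taken from the sample-code header.

```c
// Sketch: filtering on the confidence bit without losing state transitions.
#include "WacomMultiTouch.h"

// Hypothetical application hooks (not part of the Wacom API).
void TrackContactState(int fingerID, WacomMTFingerState state);
void ApplyTouchToCanvas(const WacomMTFinger *finger);

void HandleFinger(const WacomMTFinger *finger)
{
    // Always track state transitions, even for non-confident contacts,
    // because a contact often goes down before it is marked confident.
    TrackContactState(finger->FingerID, finger->TouchState);

    // Only let confident contacts draw or manipulate objects.
    if (finger->Confidence)
    {
        ApplyTouchToCanvas(finger);
    }
}
```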
How can I create extended and customizable gestures? Does Wacom provide a tool to create them?
Developers can use the Wacom Feel™ Multi-Touch data to create their own gestures in place of the standard OS gestures, or to extend them in a unique way. Wacom does not provide a tool for this.
How does the Wacom gesture engine work in relation to the API?
The Wacom Feel™ Multi-Touch API does not provide any gesture events. The gesture events generated by the driver are sent in the form of keystroke and mouse wheel events. The output of some gestures can be modified by the user via the Wacom Tablet Properties application. If the application registers for Consumer mode touch data then these events are not generated. If an application wishes to receive tablet driver gesture support as well as handle touch data input, it should register for Observer mode touch data.
How can my application support the system-wide "user configurable gestures" from Wacom, and also my custom application-specific touch features?
To do this you would need to use Observer mode and respond to the events from the OS as needed by your application.
When resting my palm on the tablet and touching with fingers for gestures or finger data input, can the Wacom Feel™ Multi-Touch API identify and reject the palm?
Most touch-capable tablets can use the confidence bits to indicate a false (reject) condition when a palm is resting on the tablet. For tablets that do not support confidence bits, all touches are interpreted as valid and are reported. Note that if the multi-touch app is set up in Observer mode, touch data is also passed to the tablet driver gesture recognizers, which may or may not use the touch confidence bits when determining gestures.
Why can't I get the model number to use when designing features for Wacom tablets?
Designing touch interactions based on the features reported in the touch capabilities structure is the best way to ensure future compatibility. We make sure that the same features in future tablets work with the API. Design decisions based on model numbers may break when future tablet models are released.
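A sketch of capability-based feature detection follows. The function and field names (WacomMTGetAttachedDeviceIDs, WacomMTGetDeviceCapabilities, FingerMax) follow the sample-code header and may differ between SDK versions, so verify them against your copy.

```c
// Sketch: choosing features from the capabilities structure rather than a
// model number.
#include "WacomMultiTouch.h"

void ConfigureTouchFeatures(void)
{
    int deviceIDs[16] = {0};
    int deviceCount = WacomMTGetAttachedDeviceIDs(deviceIDs, sizeof(deviceIDs));

    for (int i = 0; i < deviceCount && i < 16; ++i)
    {
        WacomMTCapability caps = {0};
        if (WacomMTGetDeviceCapabilities(deviceIDs[i], &caps) == WMTErrorSuccess)
        {
            // Scale the feature set to what this tablet reports, e.g. limit
            // your gesture set to the number of fingers it can track.
            int maxFingers = caps.FingerMax;
            (void)maxFingers;

            // Check the capability flags before enabling blob- or raw-data
            // features (flag names vary; see the Reference page).
        }
    }
}
```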
As the API changes and improves, will my application still work with older versions of the API?
Generally yes. Maintaining backward compatibility is a high priority. We will strive to maintain support for previous versions of the API as new versions come out. Please check the return value from the WacomMTInitialize function for version compatibility issues.
What is frame count and how do I use it?
The frame count value is incremented every time a new packet of contacts is detected. Since the rate of packet delivery is fixed, the frame count can be used to measure the rate at which a contact moves without relying on an internal timer. It can also be used to identify when a frame is dropped. The property returns zero for unsupported tablets.
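A sketch of one way to use the frame count follows. It assumes the frame count is exposed as a FrameNumber field on WacomMTFingerCollection (check your header for the exact name) and uses the approximately 100 frames-per-second report rate mentioned in the performance answer below.

```c
// Sketch: detecting dropped frames and estimating elapsed time between
// packets from the frame count, without a timer.
#include "WacomMultiTouch.h"

static int gLastFrame = 0;

int FrameAwareCallback(WacomMTFingerCollection *packet, void *userData)
{
    if (gLastFrame != 0)
    {
        int delta = packet->FrameNumber - gLastFrame;   // field name assumed
        if (delta > 1)
        {
            // One or more frames were dropped between callbacks.
        }

        // Approximate elapsed time since the previous packet at ~100 Hz.
        double seconds = delta / 100.0;
        (void)seconds;
    }
    gLastFrame = packet->FrameNumber;
    return 0;
}
```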
Fixing Common Issues and Problems
Why am I not receiving touch data?
- Check that the drivers are running in the Activity Monitor.
| Process Name | %CPU | CPU Time | Idle Wake Up | %GPU |
| --- | --- | --- | --- | --- |
| Wacom Touch Driver | 0.0 | 5.90 | 0 | 0.0 |
| Wacom Tablet Driver | 0.6 | 2:25.97 | 7 | 0.0 |
| Tablet Driver | 0.0 | 0.23 | 0 | 0.0 |
- Make sure the entitlements are added to the project. See the Entitlements section of the Multi-Touch Framework - Basics page.
- Check the return value from WacomMTInitialize. See Multi-Touch Framework - Reference.
My app is getting overwhelmed with touch callbacks; the callback is lagging and the app is becoming unresponsive. How can I fix this?
The tablet touch data is high resolution and reports 100 times per second for an optimal user experience. This data stream includes the finger location and whether the finger is in the "down", "up", or "hold" state. The Wacom Feel™ Multi-Touch API cannot anticipate the performance requirements of the application and will report at full speed. For the smoothest control and best-quality experience, skipping data points is not recommended. For optimal performance, the app is advised to manage the touch data on a separate thread, alleviating load on the main thread.
There are two methods for reading data: developers can either register a read callback or register a window ID to receive a message when data is ready. If performance is an issue, it may be better to use the callback method to queue the data and process it on a different thread. The processing thread can then peek at the data, decide whether further processing is needed, and skip some of it.

If you want to skip frames for performance reasons, one possibility is to look at the contact state of each finger in the frame and process, for example, every other frame unless there is a change in finger state (an up or a new finger down). This way you never miss an up or a down, but you also do not need to process every packet in between. It is important not to spend much time in the callback function, and you cannot rely on the data persisting after the callback returns. It is best to maintain your own internal queue of contacts and process them on a separate thread.
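Below is a sketch of that queue-and-worker-thread pattern. The queue helpers (EnqueueCopyOfPacket, DequeuePacket, FreePacketCopy, ContainsStateChange, ProcessPacket) are hypothetical application code, not part of the Wacom API; run TouchWorkerThread on a worker thread (for example via pthread_create).

```c
// Sketch: copy packets out of the callback and process them on a worker
// thread, skipping intermediate frames but never up/down transitions.
#include <stdbool.h>
#include <stddef.h>
#include "WacomMultiTouch.h"

// Hypothetical application hooks (not part of the Wacom API).
void EnqueueCopyOfPacket(const WacomMTFingerCollection *packet);  // deep-copies Fingers[]
WacomMTFingerCollection *DequeuePacket(void);                     // blocks; NULL on shutdown
void FreePacketCopy(WacomMTFingerCollection *packet);
bool ContainsStateChange(const WacomMTFingerCollection *packet);  // any finger up or new down?
void ProcessPacket(const WacomMTFingerCollection *packet);

int QueueingFingerCallback(WacomMTFingerCollection *packet, void *userData)
{
    // The packet is not guaranteed to persist after the callback returns, so
    // copy it before handing it to another thread, then return quickly.
    EnqueueCopyOfPacket(packet);
    return 0;
}

void *TouchWorkerThread(void *arg)
{
    bool processThisFrame = true;
    for (WacomMTFingerCollection *packet = DequeuePacket();
         packet != NULL;
         packet = DequeuePacket())
    {
        // Always process frames that contain an up or a new down so no
        // contact transition is lost; otherwise process every other frame.
        if (ContainsStateChange(packet) || processThisFrame)
        {
            ProcessPacket(packet);
        }
        processThisFrame = !processThisFrame;
        FreePacketCopy(packet);
    }
    return NULL;
}
```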