Multitouch gesture support for Zest graphs

When I read Jan Köhnlein’s post about multitouch gesture support in their graph viewer, I thought these gestures would be a nice addition to the Zest visualization library. It could also be my first new feature as a Zest committer, so I opened bug 371152.

So I looked into the multitouch API in SWT (by the way, it is really easy to use; check Snippet 353). Luckily, only a gesture listener had to be added, and that is already supported by the Zest graph widget.

I thought about which gestures could be universally supported in the Zest graph: scrolling is already supported, magnification is easy to implement, and for rotation I also found a possible use.
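Magnification maps naturally to zooming. As a sketch of how the cumulative scale factor of a magnify gesture could drive a clamped zoom level (ZoomHelper and its constants are illustrative, not Zest API; SWT reports the total magnification since the gesture began, not a per-event delta):

```java
public final class ZoomHelper {

    // Clamp zoom to a sane range so repeated pinch gestures cannot
    // shrink or blow up the graph indefinitely. Bounds are arbitrary.
    static final double MIN_ZOOM = 0.25;
    static final double MAX_ZOOM = 4.0;

    /**
     * Combines the zoom level captured at SWT.GESTURE_BEGIN with the
     * cumulative magnification reported by the gesture event, and
     * clamps the result to [MIN_ZOOM, MAX_ZOOM].
     */
    public static double zoomFor(double zoomAtGestureStart, double magnification) {
        double zoom = zoomAtGestureStart * magnification;
        return Math.min(MAX_ZOOM, Math.max(MIN_ZOOM, zoom));
    }
}
```

A gesture listener would remember the current zoom when the gesture begins and call zoomFor on every magnify event.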

Today I finished the implementation and created a short video that shows the features in action. See the YouTube video: http://youtu.be/cVVxOIwHN7s

Additionally, the built-in gestures can be replaced by creating the graph with a specific style bit and adding your own gesture listeners manually, as below:

// Disable the default gestures
Graph graph = new Graph(parent, ZestStyles.GESTURES_DISABLED);
// Register a custom gesture listener
graph.addGestureListener(new GestureListener() {

  public void gesture(GestureEvent e) {
    switch (e.detail) {
      case SWT.GESTURE_MAGNIFY:
        // custom zoom handling goes here
        break;
      default:
        // ignore other gestures
        break;
    }
  }
});

I believe the base implementation of the gesture support is useful, but I am open to changing it if someone has a better idea.

Author: Zoltán Ujhelyi

I am an Eclipse Technology Expert at IncQuery Labs Ltd. and a regular contributor to open-source projects, most importantly the VIATRA project at eclipse.org. Furthermore, I am in the process of finishing my PhD in computer science at the Budapest University of Technology and Economics, focusing on analysis techniques for model queries and transformations.

8 thoughts on “Multitouch gesture support for Zest graphs”

  1. Cool! I am happy my post has inspired some people to really think about higher-level interaction patterns in modern graphical editors, not just drag & drop and lots of dialogs. I am sure that leads to much more intuitive applications. We can still learn a lot from mobile touch devices here.

  2. @jan I do believe in transferring ideas between tasks. Your code demonstrated that it is easy to implement, so it was worth trying.

    Now that a basic version is implemented, it is time to test it in the real world. 😀

  3. Hi, I just found that I call setTouchEnabled(true) in my code, which blocks the gesture events. This is quite inconvenient, as I cannot implement my own gesture recognizer that works simultaneously with the embedded one. It took me nearly two hours to locate this bug.

  4. @Davy: Well, the Javadoc of the SWT control explicitly says the following:

    NOTE: If setTouchEnabled(true) has previously been invoked on the receiver then setTouchEnabled(false) must be invoked on it to specify that gesture events should be sent instead of touch events.

    This means the SWT gesture events were defined this way; Zest just follows the platform's lead.
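    The rule Davy ran into can be illustrated with a few lines of plain Java. This is a toy model of the behavior quoted from the Javadoc, not SWT code; none of these names are SWT API:

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of the quoted rule: a control delivers either raw touch
// events or gesture events, never both at the same time.
class GestureDispatchModel {
    private boolean touchEnabled = false;
    final List<String> delivered = new ArrayList<>();

    void setTouchEnabled(boolean enabled) {
        touchEnabled = enabled;
    }

    // A platform pinch arrives: it is surfaced as raw touch events when
    // touch handling is enabled, and as a magnify gesture otherwise.
    void pinch() {
        delivered.add(touchEnabled ? "TouchEvent" : "GESTURE_MAGNIFY");
    }
}
```

    With touch events enabled the model, like the real widget, never delivers a gesture, so a gesture listener silently stops firing; calling setTouchEnabled(false) restores gesture delivery.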

    However, I am not sure I understood what you are trying to achieve; if you could elaborate a bit, maybe I could help you work around the issue.

  5. @Jan Köhnlein:
    This was only submitted to the then-Zest2, now GEF4 version of Zest. It would be possible to add it to the older Zest versions, given a new enough SWT dependency (if I recall correctly, 3.7). However, considering that Zest 1.x and GEF 3.x work with very old SWT releases, and that the GEF4 version is designed as the place where everything new goes, this was not backported into Zest 1.x.

    Is this feature required for some specific reason? If yes, I am not completely against backporting the change (it should be simple enough), but we should discuss this on some GEF-specific forum.
