When I read Jan Köhnlein’s post about multitouch gesture support in their graph viewer, I thought these gestures would be a nice addition to the Zest visualization library. It could also be my first new feature as a Zest committer, so I opened bug 371152.
So I looked into the multitouch API in SWT (by the way, it is really easy to use; check snippet 353). Luckily, only a gesture listener had to be added, and that is already supported by the Zest graph widget.
I thought about which gestures could be universally supported in the Zest graph, and found that scrolling is already supported, magnification is easy to implement, and for rotation I found a possible use as well.
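As a sketch of how these gesture types can be distinguished in SWT (this uses the standard GestureListener/GestureEvent API; the handling shown is illustrative, not the actual Zest implementation, and `control` is assumed to be an existing SWT Control):

```java
import org.eclipse.swt.SWT;
import org.eclipse.swt.events.GestureEvent;
import org.eclipse.swt.events.GestureListener;

// Dispatch on the gesture type SWT reports in e.detail.
control.addGestureListener(new GestureListener() {
    public void gesture(GestureEvent e) {
        switch (e.detail) {
        case SWT.GESTURE_MAGNIFY:
            // e.magnification is the scale factor since the gesture began
            System.out.println("Zoom factor: " + e.magnification);
            break;
        case SWT.GESTURE_ROTATE:
            // e.rotation is the rotation angle in degrees
            System.out.println("Rotate by: " + e.rotation);
            break;
        case SWT.GESTURE_PAN:
            // e.xDirection and e.yDirection give the pan direction
            System.out.println("Pan: " + e.xDirection + ", " + e.yDirection);
            break;
        }
    }
});
```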
Today I finished the implementation and created a short video that shows the features in action. See the YouTube video: http://youtu.be/cVVxOIwHN7s
Additionally, it is possible to replace the built-in gestures by creating the graph with a specific style bit and adding other gesture listeners manually, as below:
// Disabling default gestures
Graph graph = new Graph(parent, ZestStyles.GESTURES_DISABLED);
// Adding own gesture listener
graph.addGestureListener(new GestureListener() {
    public void gesture(GestureEvent e) {
        switch (e.detail) {
        // Do nothing
        }
    }
});
I believe the base implementation of the gesture support is useful, but I am open to change it if someone has a better idea.
Cool! I am happy my post has inspired some people to really think about modern, higher-level interaction patterns in graphical editors, not just drag & drop and lots of dialogs. I am sure that leads to much more intuitive applications. We can still learn a lot from mobile touch devices here.
@Jan I do believe in transferring ideas between projects. You created some code that demonstrated it is easy to implement, so it was worth trying.
Now that a basic version is implemented, it is time to test it in the real world. 😀
Hi, I just found that I call setTouchEnabled(true) in my code, which blocks the gesture events. This is inconvenient, because I cannot implement my own gesture recognizer that works simultaneously with the embedded recognizer. It took me nearly 2 hours to locate this bug.
@Davy: Well, the Javadoc of the SWT control explicitly says the following:
NOTE: If setTouchEnabled(true) has previously been invoked on the receiver then setTouchEnabled(false) must be invoked on it to specify that gesture events should be sent instead of touch events.
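In other words (a minimal sketch of the platform behaviour the Javadoc describes; `graph` is assumed to be an already-created Zest Graph):

```java
// Touch events and gesture events are mutually exclusive in SWT:
// enabling raw touch events suppresses the synthesized gesture events.
graph.setTouchEnabled(true);   // TouchListeners fire, GestureListeners do not
graph.setTouchEnabled(false);  // gesture events are delivered again
```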
This means the SWT gesture events were defined this way; Zest just follows the platform's lead.
However, I am not sure I understood what you are trying to achieve – maybe if you could elaborate a bit, I could help you work around the issue.
Hasn’t this made it into Zest? Just tried 1.5.100 but it doesn’t seem to be there.
@Jan Köhnlein:
This was only submitted to the then-Zest 2, now GEF4 version of Zest. It would be possible to add it to the older Zest versions, given a new enough SWT dependency (if I recall correctly, 3.7). However, considering that Zest 1.x and GEF 3.x work with very old SWT releases, and that the GEF4 version is designed as the place where everything new goes, this was not backported to Zest 1.x.
Is this feature required for a specific reason? If yes, I am not completely against backporting the change (it should be simple enough), but we should discuss it on a GEF-specific forum.