Multi-touch Gesture Handling for Zooming with AIS_MouseGesture_Zoom on 3D View

Hello OCCT Community,

I would like to support multi-touch screens in our Qt project. I've seen that the CAD Assistant application works very well with multi-touch screens, and I've investigated some of the multi-touch methods on the OCCT side. However, I couldn't overcome the multi-touch problems in our project. I would like to zoom in/out on multi-touch screens.

First, I added the line below so that Qt can catch touch events:

setAttribute(Qt::WA_AcceptTouchEvents, true);

Then, I have overridden some Open CASCADE methods such as AddTouchPoint, RemoveTouchPoint and UpdateTouchPoint. Most likely my usage below is wrong, but I don't know exactly what the correct approach should be.

void OCCTVisualizerWidget::AddTouchPoint(Standard_Size theId, const Graphic3d_Vec2d &thePnt, Standard_Boolean theClearBefore)
{
    AIS_ViewController::AddTouchPoint(theId, thePnt, theClearBefore);
    if (myToAskNextFrame) //myToAskNextFrame: OCCT specific flag, indicating that another frame should be drawn right after this one
    {
        // ask more frames for animation
        updateView();
    }
}

bool OCCTVisualizerWidget::RemoveTouchPoint(Standard_Size theId, Standard_Boolean theClearSelectPnts)
{
   auto result = AIS_ViewController::RemoveTouchPoint(theId, theClearSelectPnts);
   if (myToAskNextFrame) //myToAskNextFrame: OCCT specific flag, indicating that another frame should be drawn right after this one
   {
       // ask more frames for animation
       updateView();
   }
   return result;
}

void OCCTVisualizerWidget::UpdateTouchPoint(Standard_Size theId, const Graphic3d_Vec2d &thePnt)
{
   AIS_ViewController::UpdateTouchPoint(theId, thePnt);
   if (myToAskNextFrame) //myToAskNextFrame: OCCT specific flag, indicating that another frame should be drawn right after this one
   {
       // ask more frames for animation
       updateView();
   }
}

Lastly, I've tried to manage it with some mouse gesture mapping methods, but I couldn't figure out the correct implementation yet:

void OCCTVisualizerWidget::setMouseGestureMappingOptions()
{
    // clear map from defaults
    ais_mouse_gesture_map_.Clear();

    ais_mouse_gesture_map_.Bind (Aspect_VKeyMouse_RightButton,  AIS_MouseGesture_RotateOrbit);
    ais_mouse_gesture_map_.Bind (Aspect_VKeyMouse_LeftButton,   AIS_MouseGesture_Pan);

    // For combined inputs (e.g. to zoom, hold the CTRL key and drag with the right mouse button)
    ais_mouse_gesture_map_.Bind (Aspect_VKeyMouse_RightButton | Aspect_VKeyFlags_CTRL,  AIS_MouseGesture_Zoom);

    ais_mouse_gesture_map_.Bind (Aspect_VKeyMouse_LeftButton | Aspect_VKeyFlags_ALT | Aspect_VKeyFlags_SHIFT, AIS_MouseGesture_SelectRectangle);
}

How can I use "AIS_MouseGesture_Zoom" with multi-touch events?

Thank you in advance! Best regards,

Nezihe

mr soup:

Hey, did you find an answer for this? We are using the same methods (AddTouchPoint, UpdateTouchPoint and RemoveTouchPoint), but the result is nowhere near as good as CAD Assistant.

Le Pharaon:

You don't actually need to map multi-touch to AIS_MouseGesture_Zoom directly. In OCCT, pinch-zoom is handled through the touch point API, and the view controller figures out zooming automatically once it receives two active touch points that move apart or together. In short: if touch events are coming in correctly, pinch-zoom "just works" without binding a gesture. A few practical tips:

1. Forward raw Qt touch events directly

Don't override the Add/Update/Remove methods unless you need extra logic. Instead, in your event() handler, send touch events straight to the AIS_ViewController:

bool OCCTVisualizerWidget::event(QEvent* e)
{
    if (e->type() == QEvent::TouchBegin
     || e->type() == QEvent::TouchUpdate
     || e->type() == QEvent::TouchEnd)
    {
        const QTouchEvent* touch = static_cast<QTouchEvent*>(e);
        for (const QTouchEvent::TouchPoint& p : touch->touchPoints())
        {
            const Standard_Size id = p.id();
            const Graphic3d_Vec2d pos(p.pos().x(), p.pos().y());
            switch (p.state())
            {
            case Qt::TouchPointPressed:
                myViewController->AddTouchPoint(id, pos, false);
                break;
            case Qt::TouchPointMoved:
                myViewController->UpdateTouchPoint(id, pos);
                break;
            case Qt::TouchPointReleased:
                myViewController->RemoveTouchPoint(id, false);
                break;
            default:
                break;
            }
        }

        updateView();
        return true;
    }
    return QWidget::event(e);
}

If you do only this, pinch zoom should already work.

2. Remove the gesture mapping from touch events

AIS_MouseGesture_Zoom only applies to mouse-based inputs, not touch. Multi-touch doesn’t go through the mouse gesture map.

So you can keep your mouse gesture mapping for desktop users, but it’s unrelated to touch handling.
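As a side note on how those map bindings match: the key is a single value that combines the mouse-button bits with the modifier-flag bits, so a binding like RightButton | CTRL fires only on that exact combination. A minimal plain-C++ sketch of that matching idea (the numeric constants below are illustrative placeholders, not the real Aspect_VKeyMouse / Aspect_VKeyFlags values):

```cpp
#include <cassert>
#include <cstdint>

// Placeholder bit values -- NOT the real OCCT enum constants; they only
// illustrate how a combined gesture-map key is formed and matched.
enum : uint32_t
{
  VKeyMouse_LeftButton  = 0x0001,
  VKeyMouse_RightButton = 0x0002,
  VKeyFlags_CTRL        = 0x0100,
  VKeyFlags_SHIFT       = 0x0200
};

// A map keyed on (button | modifiers) matches only the exact combination.
static bool matchesGestureKey (uint32_t theBoundKey,
                               uint32_t theButtons,
                               uint32_t theModifiers)
{
  return theBoundKey == (theButtons | theModifiers);
}
```

So binding VKeyMouse_RightButton | VKeyFlags_CTRL does not fire for a plain right-button drag; the modifier state has to match too.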

3. Make sure Qt actually generates multi-touch events

Some systems require:

setAttribute(Qt::WA_AcceptTouchEvents);
grabGesture(Qt::PinchGesture);

The pinch gesture isn't required for OCCT, but it confirms Qt is receiving correct data.

4. No need to manually trigger zoom logic

As long as OCCT gets two or more touch points and their deltas, it computes scaling internally.
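For intuition, the core of the pinch math can be modeled as the ratio of the distances between the two touch points before and after a move. This is a simplified sketch of that idea in plain C++, not OCCT's actual implementation:

```cpp
#include <cmath>

// Simplified model of pinch-zoom: the zoom factor is the ratio of the new
// finger spread to the old finger spread.
struct Vec2d { double x, y; };

static double distance (const Vec2d& a, const Vec2d& b)
{
  return std::hypot (b.x - a.x, b.y - a.y);
}

// oldP1/oldP2: the two touch points at the previous frame;
// newP1/newP2: the same two touch points after the move.
static double pinchZoomFactor (const Vec2d& oldP1, const Vec2d& oldP2,
                               const Vec2d& newP1, const Vec2d& newP2)
{
  return distance (newP1, newP2) / distance (oldP1, oldP2);
}
```

A factor above 1.0 means the fingers moved apart (zoom in); below 1.0 means they moved together (zoom out). The view controller applies this kind of scaling for you from the touch-point deltas.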

gkv311 n:

Le Pharaon wrote:

Make sure Qt actually generates multi-touch events setAttribute(Qt::WA_AcceptTouchEvents);

Thanks for sharing the code snippet.

In addition to that, some systems (Windows, or maybe touch panel drivers) emulate mouse events and pass them in parallel with touch events.

This would cause overlapping inputs to be passed to AIS_ViewController and misbehavior. You'll need to filter out such mouse events, which can be detected via QMouseEvent::source() == Qt::MouseEventSynthesizedBySystem after enabling multi-touch events.

Notice that some drivers handle extra gestures on their own, regardless of the application handling multi-touch input. These cannot be disabled at the application level.
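To illustrate the filtering rule above: the decision is simply to drop mouse events whose source reports system synthesis. A plain-C++ model of that check (the MouseEventSource enum below only mirrors Qt::MouseEventSource for illustration; it is not the real Qt type):

```cpp
#include <cassert>

// Mirrors the values of Qt::MouseEventSource for illustration purposes.
enum class MouseEventSource
{
  NotSynthesized,           // genuine mouse hardware
  SynthesizedBySystem,      // OS-emulated from touch -- should be filtered out
  SynthesizedByQt,
  SynthesizedByApplication
};

// Returns true when the mouse event should still be forwarded to
// AIS_ViewController; false when it duplicates a touch event.
static bool shouldForwardMouseEvent (MouseEventSource theSource)
{
  return theSource != MouseEventSource::SynthesizedBySystem;
}
```

In the actual Qt mouse-event handler this corresponds to an early return when theEvent->source() == Qt::MouseEventSynthesizedBySystem, before translating the event for the view controller.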

mr soup:

Thanks for the replies! The only thing that isn't working correctly is rotation: its behavior changes in touch mode compared to mouse mode. I would assume it should use the same logic as the mouse? I didn't change the AIS_RotationMode, so it's still in "AIS_RotationMode_BndBoxActive" mode.

gkv311 n:

mr soup wrote:

The only thing that isn't working correctly is rotation: its behavior changes in touch mode compared to mouse mode. I would assume it should use the same logic as the mouse? I didn't change the AIS_RotationMode, so it's still in "AIS_RotationMode_BndBoxActive" mode.

The rotation mode is shared between mouse and touch gestures. What exactly do you experience? How is rotation different from the mouse?