This repository has been archived by the owner on Jul 18, 2023. It is now read-only.

API. Events

Cyrille Rossant edited this page Feb 26, 2014 · 1 revision

Event system API

This document describes the full specification of the event system API, which is a generic callback registration system. It is most commonly used to process user and window-system events.

Overview

The event system allows some basic callback management with features inspired by Qt's signal/slot mechanism and pyglet's event system:

  • Instances of the EventEmitter class contain a list of callbacks. When the instance itself is called, each of the callbacks is invoked in sequence. The EventEmitter instances are stored in publicly-accessible locations with the intention that any other part of the application may register a callback with the EventEmitter, and thus receive notifications when the emitter is called. EventEmitter is analogous to PyQt's Signal class, and calling an emitter is analogous to signal.emit().
  • Callbacks can be registered either by calling emitter.connect(callback) or by using @emitter.connect as a decorator to a callback function.
  • EventEmitters pass only a single argument to their callbacks: an Event instance. When events occur such as user interaction, windowing system changes, or anything else that may be of global interest, an Event instance is generated with information about the event, then passed to an EventEmitter, which in turn passes it on to its callbacks.
  • There are subclasses for each kind of event (e.g. MouseEvent, KeyEvent, ResizeEvent). Each subclass defines a particular set of attributes with information relevant to the event. Among other things, this ensures that an event's attributes are always complete.
  • EventEmitters can be used on their own, but they are most commonly grouped together within an EmitterGroup object, which helps to organize and manage large numbers of emitters. By convention, objects which make use of multiple EventEmitters will have an 'events' attribute which references an EmitterGroup. The EmitterGroup has one attribute referencing each emitter. For example: visual.events.mouse_down, canvas.events.resize, and timer.events.timeout.
  • Objects that make use of an EmitterGroup will also have default callback methods for each emitter. For example, the visual.events.mouse_down emitter is automatically connected to visual.on_mouse_down. This mimics the behavior of the event system in Qt and pyglet.
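The bullets above can be condensed into a minimal, self-contained sketch. This is a hypothetical simplification for illustration, not the real vispy implementation; only the `Event` and `EventEmitter` names come from the text, everything else is assumed:

```python
class Event:
    """Carries information about one occurrence; extra attributes
    come from keyword arguments (a simplified stand-in)."""
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)


class EventEmitter:
    """Holds a list of callbacks; calling the emitter invokes each one
    in sequence, analogous to signal.emit() in PyQt."""
    def __init__(self):
        self._callbacks = []

    def connect(self, callback):
        self._callbacks.append(callback)
        return callback  # returning it allows use as a decorator

    def __call__(self, event):
        for cb in self._callbacks:
            cb(event)


# usage: register a callback, then "emit" by calling the emitter
emitter = EventEmitter()
received = []

@emitter.connect
def on_event(event):
    received.append(event.pos)

emitter(Event(pos=(10, 20)))
```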

Callback registration examples

There are a few different ways to register an event callback.

  1. Function decorators:

     canvas = Canvas()
     
     @canvas.events.paint.connect
     def paint(event):
         glClear(...)
         # paint scene here
         
     @canvas.events.mouse_press.connect
     def mouse_press(event):
         print("Clicked on:", event.pos)
    
  2. This is equivalent to calling emitter.connect:

     canvas = Canvas()
     
     def paint(event):
         glClear(...)
         # paint scene here
     canvas.events.paint.connect(paint)
    
  3. Similarly, one can make use of the default callbacks by creating a subclass of Canvas:

     class MyCanvas(Canvas):
         def on_paint(self, event):
             ...
             
         def on_mouse_press(self, event):
             ...
    
  4. Or by simply re-assigning the same methods:

     canvas = Canvas()
     
     def paint(event):
         ...
     canvas.on_paint = paint
    
  5. The previous example is possible because, in that case, the emitter uses a symbolic connection: rather than connecting to a function, it is connected to an object and a method name, so the callback is retrieved from the object each time the emitter is invoked. Symbolic callbacks are registered by calling connect with an (object, method_name) tuple:

     class MyCanvas(Canvas):
         def __init__(self):
             Canvas.__init__(self)
             self.events.mouse_press.connect((self, 'mouse_handler'))
             self.events.mouse_release.connect((self, 'mouse_handler'))
             self.events.mouse_move.connect((self, 'mouse_handler'))
             
         def mouse_handler(self, event):
             # handles all types of mouse input events
             ...
    

    If the method referenced by a symbolic callback does not exist, it is ignored silently. This allows many default callbacks to exist without incurring the penalty of invoking empty callback methods for each one (constructing and tearing down stack frames is expensive).
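The symbolic-connection behavior described above can be sketched as follows. This is an assumed implementation based on the text: the (object, method_name) tuple is stored as-is, the method is looked up with getattr at emit time, and a missing method is silently skipped:

```python
class SymbolicEmitter:
    """Sketch of an emitter supporting symbolic (object, method_name)
    callbacks in addition to plain callables."""
    def __init__(self):
        self._callbacks = []

    def connect(self, callback):
        self._callbacks.append(callback)

    def __call__(self, event):
        for cb in self._callbacks:
            if isinstance(cb, tuple):            # symbolic: (obj, name)
                cb = getattr(cb[0], cb[1], None)  # resolved per-emit
                if cb is None:                    # method absent: ignore
                    continue
            cb(event)


class Target:
    def __init__(self):
        self.seen = []

    def mouse_handler(self, event):
        self.seen.append(event)


t = Target()
em = SymbolicEmitter()
em.connect((t, 'mouse_handler'))    # resolved each time em() is called
em.connect((t, 'missing_method'))   # silently ignored at emit time
em('press')
```

Because resolution happens at call time, re-assigning `t.mouse_handler` later (as in example 4) changes which function is invoked without reconnecting.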

Disabling emitters, disconnecting callbacks, and claiming events

  • Many event systems include a mechanism that allows one registered callback to prevent others from seeing the event. We are taking a slightly different approach: registered callbacks may modify the event and mark it as being handled; other callbacks are encouraged to check whether each event has been handled before acting on it. The reasoning for this approach is that, in general, one event handler cannot predict whether other handlers will be interested in receiving the event. For example, consider a callback whose only function is to record all of the events it receives; an upstream event handler should not be able to block it.

  • Callbacks may be disconnected individually or in batches:

    visual.events.mouse_down.disconnect(callback)  # remove a single callback
    visual.events.mouse_down.disconnect()  # remove all callbacks for this event type
    visual.events.disconnect()  # remove all callbacks on this visual
    
  • Emitters may be temporarily blocked/unblocked:

    # move a visual but make sure nobody receives notification that this has happened
    with visual.events.moved.blocker():
        visual.set_pos(x, y)
    # later, we can send a different move event manually:
    visual.events.moved(...)
    

    This is useful in a couple of situations: 1) An event would normally be emitted multiple times but only needs to be handled the last time. 2) A loop of connected events needs to be temporarily broken to prevent infinite looping.
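The blocker() context manager shown above could be implemented along these lines. This is an assumed sketch, not the real vispy code: while blocked, calling the emitter is simply a no-op, and the finally clause guarantees unblocking even if the body raises:

```python
from contextlib import contextmanager


class BlockableEmitter:
    """Sketch of an emitter whose emission can be temporarily suppressed."""
    def __init__(self):
        self._callbacks = []
        self._blocked = False

    def connect(self, callback):
        self._callbacks.append(callback)

    def __call__(self, event):
        if self._blocked:
            return              # blocked: drop the event silently
        for cb in self._callbacks:
            cb(event)

    @contextmanager
    def blocker(self):
        self._blocked = True
        try:
            yield
        finally:
            self._blocked = False   # always unblock, even on exceptions


moved = BlockableEmitter()
log = []
moved.connect(log.append)

with moved.blocker():
    moved('silent move')    # suppressed: no callback sees this
moved('visible move')       # delivered normally
```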

Emitter groups

The EmitterGroup class mainly provides organization for large numbers of EventEmitters. It is usually found as the 'events' attribute on Canvas, Timer, and Visual instances. Additionally, EmitterGroup provides several other useful features:

  • EmitterGroup is itself a subclass of EventEmitter. Connecting a callback to an EmitterGroup causes the callback to receive all events for any of the emitters in the group.
  • Handles setting up default callbacks. For example, canvas.events.resize is automatically connected to canvas.on_resize.
  • Gives each emitter default 'source' and 'name' values (see Event generation below).
  • Provides methods for blocking / unblocking / disconnecting all emitters at once.
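A minimal EmitterGroup could look like the following sketch (a hypothetical simplification: the attribute-per-emitter layout and group-wide connect/block features come from the text; the implementation details are assumed):

```python
class Event:
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)


class EventEmitter:
    def __init__(self):
        self._callbacks = []
        self.blocked = False

    def connect(self, callback):
        self._callbacks.append(callback)

    def __call__(self, event):
        if not self.blocked:
            for cb in self._callbacks:
                cb(event)


class EmitterGroup:
    """One named EventEmitter per event type, exposed as attributes,
    plus group-wide operations."""
    def __init__(self, *names):
        self._emitters = {}
        for name in names:
            em = EventEmitter()
            self._emitters[name] = em
            setattr(self, name, em)   # e.g. group.resize, group.mouse_down

    def connect(self, callback):
        # connecting to the group means receiving every emitter's events
        for em in self._emitters.values():
            em.connect(callback)

    def block_all(self):
        for em in self._emitters.values():
            em.blocked = True


events = EmitterGroup('resize', 'mouse_down')
seen = []
events.connect(lambda e: seen.append(e.name))
events.resize(Event(name='resize'))
events.mouse_down(Event(name='mouse_down'))
```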

Event generation

When an EventEmitter is called, it invokes each of its callbacks in sequence. Each callback receives a single Event argument which is either 1) created manually and passed to the emitter:

event = Event(name='resize', size=(w,h), ...)
visual.events.resize(event)

Or 2) the emitter uses all of the arguments passed to it to construct an Event:

visual.events.resize(name='resize', size=(w,h), ...)

To generate events, emitters use the class specified in their 'event_class' attribute. By default, this is set to the base Event class.

EventEmitters also have a special attribute 'defaults', which is a simple dictionary with string keys. Every time an event is emitted, the emitter modifies the event's attributes to contain the items specified in emitter.defaults. For example, most EventEmitters have at least two entries in their defaults: 'name' and 'source'. When these emitters are invoked, the event they send will have 'name' and 'source' attributes set:

@visual.events.mouse_press.connect
def press(event):
    print("mouse was clicked over visual:", event.source)

In the example above, event.source is provided by the EventEmitter, rather than the originator of the event.
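The two generation paths and the defaults mechanism can be sketched together. This is an assumed implementation following the text: if no Event is passed, one is built from the keyword arguments, and the emitter's defaults are stamped onto the event before emission:

```python
class Event:
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)


class EventEmitter:
    def __init__(self, **defaults):
        self._callbacks = []
        self.defaults = defaults        # e.g. {'name': ..., 'source': ...}

    def connect(self, callback):
        self._callbacks.append(callback)

    def __call__(self, event=None, **kwargs):
        if event is None:                   # case 2: build the Event here
            event = Event(**kwargs)
        for key, value in self.defaults.items():
            setattr(event, key, value)      # stamp name/source onto the event
        for cb in self._callbacks:
            cb(event)


visual = object()   # stand-in for a real visual
press = EventEmitter(name='mouse_press', source=visual)
sources = []
press.connect(lambda e: sources.append(e.source))
press(pos=(5, 5))   # no Event passed: constructed from kwargs, then stamped
```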

Some events only make sense at the widget level (such as resize events). For mouse events, however, the event should be emitted at the object that the cursor is over.

To detect what object is under the mouse, we need a picking mechanism. OpenGL has a built-in picking system, but it has some limitations (I can't remember what exactly). You can also have a pre-render pass, in which all objects draw themselves in a specific color. During mouse interaction, you can then query the pixel color and look up the corresponding object. Other approaches may be possible (especially in 2D).
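The color-based picking idea can be illustrated without any actual OpenGL. In this assumed scheme, each object's ID is encoded as a unique 24-bit RGB color for the pre-render pass, and the pixel color under the cursor is decoded back to the object (the dict-based "framebuffer" is just a stand-in):

```python
def id_to_color(obj_id):
    # pack a 24-bit object id into an (r, g, b) triple
    return ((obj_id >> 16) & 0xFF, (obj_id >> 8) & 0xFF, obj_id & 0xFF)


def color_to_id(color):
    # unpack an (r, g, b) triple back into the object id
    r, g, b = color
    return (r << 16) | (g << 8) | b


objects = {1: 'line_visual', 2: 'image_visual'}

# pretend framebuffer: pixel -> color written during the picking pass
framebuffer = {(40, 30): id_to_color(2)}

# during mouse interaction: read the pixel under the cursor, decode it
picked_color = framebuffer[(40, 30)]
picked = objects[color_to_id(picked_color)]
```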

When you know the object that the mouse is over, you can emit the event from that object. The event will then propagate (see below) and have the effects that it should have.

There are some details to take into account. Firstly, an object should receive a mouse release for every mouse press it received, even if the mouse is no longer over the object. You could even say that an object receives a mouse release only if it received a mouse press first (not sure about this though).

Secondly, it is important to distinguish dragging from clicking. A click can therefore be defined as a mouse down followed by a mouse up event, with little motion in between. Vispy could define a MouseClick event for convenience. Similarly, mouse dragging can be defined as a mouse press followed by a predefined minimum motion.

In other words: the MousePressEvent is rather primitive, and in most use cases you should use the MouseClick or MouseDrag event instead; a mouse action is either a drag or a click (or something else), but not both, even though both start with a mouse press.
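The click-versus-drag distinction boils down to a motion threshold. A minimal sketch, with a hypothetical 5-pixel threshold (the actual value would need to be configurable):

```python
import math

DRAG_THRESHOLD = 5.0   # pixels of motion before a press becomes a drag


def classify(press_pos, release_pos):
    """Classify a press/release pair as a click or a drag based on
    how far the cursor moved in between."""
    dx = release_pos[0] - press_pos[0]
    dy = release_pos[1] - press_pos[1]
    moved = math.hypot(dx, dy)
    return 'drag' if moved >= DRAG_THRESHOLD else 'click'


a = classify((100, 100), (101, 102))   # barely moved: a click
b = classify((100, 100), (140, 160))   # clearly dragged
```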

Event propagation

Following the Qt way of thinking, we can distinguish two kinds of events: 1) real events, which need to be handled; and 2) signals, which are just a way of notifying listeners that are interested in something that occurred. Qt uses two separate mechanisms for these; we want to have just one, but both patterns are still relevant, and making them work in one system requires careful design.

The current idea is that an event is passed to all registered handlers. After that, the event is propagated to the parent object, etc. In other words, all handlers are always called. This behavior corresponds with the signal-mechanism. Further, a handler can say that it has "handled" the event (i.e. taken appropriate action on it) by calling event.accept(). Other handlers that want to take action on the event (which is different from being just interested in it as a signal) should check whether the event is already handled before taking action.
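The accept()/handled pattern described above can be sketched as follows (an assumed API shape, not the real vispy code): every callback always runs, pure signal-style listeners ignore the flag, and handlers that act on the event check it first:

```python
class Event:
    def __init__(self, name):
        self.name = name
        self.handled = False

    def accept(self):
        self.handled = True   # mark the event as acted upon


log = []


def recorder(event):
    # a signal-style listener: interested in every event, handled or not
    log.append(('recorded', event.name))


def primary_handler(event):
    if not event.handled:     # nobody acted yet: take the action
        event.accept()
        log.append(('acted', event.name))


def secondary_handler(event):
    if not event.handled:     # already accepted upstream, so skipped
        log.append(('acted-too', event.name))


ev = Event('key_press')
for cb in (recorder, primary_handler, secondary_handler):
    cb(ev)   # all callbacks run; only the flag coordinates them
```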

The event object allows storing useful information, such as who handled the event. Such information may be convenient to allow creating advanced event mechanisms. Although it might also make things overly complex :)

Generation of synthetic events

(From discussion between LC and AK on G-chat)

Custom EventEmitters may be created to generate new events based on raw events. They listen to the raw events and may generate (and emit) a new event or series of delayed events if a certain situation occurs. For instance the ClickEventEmitter may listen to mouse down and mouse up, and generate mouse click.

These high level emitters can be added per visual. To do so call:

visual.events.mouse_click = ClickEventEmitter(visual)

This makes for a simple yet flexible way of adding custom events. Another common use for this is filtering mouse / touch input to generate a stream of similar events with momentum or smoothing. For example, see the SmoothWheelFilter class in experimental/plot_lines.py.
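A ClickEventEmitter along the lines described above could look like this sketch (the class name comes from the text; the implementation, method names, and the 5-pixel threshold are assumptions):

```python
class ClickEventEmitter:
    """Listens to raw press/release and emits a synthetic 'click' when
    a release follows a press with little motion in between."""
    THRESHOLD = 5   # max pixels of motion for a click (hypothetical value)

    def __init__(self):
        self._callbacks = []
        self._press_pos = None

    def connect(self, callback):
        self._callbacks.append(callback)

    def on_mouse_press(self, pos):
        self._press_pos = pos          # remember where the press happened

    def on_mouse_release(self, pos):
        if self._press_pos is None:
            return                     # release without a matching press
        dx = abs(pos[0] - self._press_pos[0])
        dy = abs(pos[1] - self._press_pos[1])
        if max(dx, dy) <= self.THRESHOLD:
            for cb in self._callbacks:
                cb(pos)                # emit the synthetic click
        self._press_pos = None


clicks = []
click_emitter = ClickEventEmitter()
click_emitter.connect(clicks.append)

click_emitter.on_mouse_press((10, 10))
click_emitter.on_mouse_release((12, 11))   # small motion: a click
click_emitter.on_mouse_press((10, 10))
click_emitter.on_mouse_release((80, 90))   # large motion: not a click
```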

Example use cases

  • Basic Canvas event handling. We have 1) a Canvas backend which generates window-system and input events, such as resize notifications, 2) a Canvas instance that receives these notifications, and 3) other objects that wish to receive and respond to these notifications as well.

      class Canvas(object):
          def on_resize(self, event):
              """This method is called when the Canvas is resized. The *event*
              argument will have a 'size' attribute indicating the new size."""
    
          def on_paint(self, event):
              """This method is called when all or part of the Canvas needs to be 
              repainted. The *event* argument will have a 'region' attribute 
              indicating the sub-region of the canvas needing a repaint."""
    
      class ScenegraphCanvas(Canvas):
          """Class encapsulating a Canvas which automatically renders the contents of
          a scenegraph."""
          def __init__(self):
              Canvas.__init__(self)
              self.scenegraph = Scenegraph()
    
              # directly connect the resize event to the corresponding event on the 
              # scenegraph
              self.resize_event.connect(self.scenegraph.resize_event)
    
          def on_paint(self, event):
              glClear(GL_COLOR_BUFFER, ...)
              self.scenegraph.render()
    
    
      canvas = ScenegraphCanvas()
    
      @canvas.resize_event.connect
      def my_resize_handler(event):
          print("Canvas was resized")
    
  • The EventEmitter class can be reimplemented to provide a modified event stream (such as smoothing mouse input events)

      class SmoothEventEmitter(EventEmitter):
          def input_event(self, event):
              # A raw motion event has occurred; move the 'target' position 
              # and re-start the timer if needed
    
          def on_timer(self):
              # emit new event with modified position
    
      canvas = Canvas()
      canvas.smooth_mouse_move_event = SmoothEventEmitter()
      canvas.mouse_move_event.connect(canvas.smooth_mouse_move_event.input_event)
    
  • Zoomable ViewBox: A ScenegraphCanvas (see above) is used with a ViewBox visual contained inside the scenegraph. The job of the ViewBox is to provide a coordinate transformation for its child visuals and allow the user to modify that transformation by mouse interaction (panning / zooming). Dragging with the left mouse button pans the view, while dragging the right button scales the view (the exact button bindings are just an example; this must be made user-configurable).

    In this example, mouse events pass from the backend to the Canvas, then from the Canvas to the Scenegraph. The Scenegraph then makes some decision about which object(s) in the scene should be notified of the mouse events and in what order (traditionally, the visual under the mouse pointer at the time of button-press receives all mouse events until the button is released again).

    The ViewBox visual, just like Canvas, has its own set of EventEmitters connected to default handler methods. One of these methods handles mouse events and determines when a mouse drag is occurring and which button(s) are pressed. The event handler updates the transformation of the ViewBox, then requests a repaint via the scenegraph.

  • A ViewBox (as above) contains an Image visual which just displays an image. Since the view is zoomable, it is possible to zoom such that the image occupies the entire view. Right-clicking on the Image raises a context menu, defined by Image.on_mouse_click. When dragging over the Image with the right mouse button, the mouse events are instead handled by the ViewBox to keep the expected zooming behavior. (Please note this is not a trivial example--deciding which visuals should be event recipients requires a lot of organization and communication between visuals and the scenegraph)

  • A ViewBox (as above) contains two orthogonal lines which can be moved by dragging with the left mouse button. When the mouse hovers over a line (with a margin of 5px or so), it changes color to indicate to the user that it can be dragged. If the mouse hovers over the intersection between the two lines, two different outcomes are possible:

    • Both lines highlight, and dragging will accordingly move both lines.
    • Only one line will highlight, and the highlighted one will be the only one that responds to mouse drag events (note this is also not trivial--we must guarantee that, as long as one line is highlighted, it will be the recipient of future left-drag events).

    Regardless of the position of the cursor, right-dragging the mouse always causes the view to zoom. This is particularly tricky because the scenegraph needs to deliver left-drag events to the line(s) and right-drag events to the view.

Interface

Full specification
