Editors#

An editor is a viewport-facing tool that receives user interaction events such as mouse and keyboard input.

Use an editor when the user should act directly in the 3D view: create objects, select them, deform them, transform them, or perform guided interactive workflows.

Editors are shown in the editor area on the left side of the viewport.

Note

Exactly one editor is active at any time.

[Figure: Editor example]

Mental model#

An editor is the main entry point for interactive tools. The core class is SBGEditor.

Choose an editor when the tool behavior is driven by user gestures in the viewport rather than by a standalone settings window. Editors are often used together with other parts of the interaction model, such as controllers and the data graph.

The flow of events#

Qt events reaching the viewport are processed in this order:

  • If Shift is pressed, the event operates the active camera, the Qt event is accepted, and no further processing occurs.
  • If Shift is not pressed, the Qt event is passed to the active editor. If the active editor accepts the event, no further processing occurs.
  • If the active editor ignores the Qt event, a data graph node is picked in the viewport. If the picked node has a corresponding SBGDataGraphNode GUI class, the Qt event is passed to that GUI.

This order matters because it explains why an editor should accept only the events it truly handles. If it consumes everything, object-level interaction never gets a chance to run.

When an editor is the right choice#

Editors work well for:

  • creation tools
  • manipulation tools
  • selection tools
  • interaction modes that stay active until the user switches away

Typical examples:

  • a nanotube or graphene builder
  • a lasso or rectangle selector
  • a rigid transform tool
  • a custom domain-specific modeling instrument

Practical guidance#

When implementing an editor, define clearly:

  • how it becomes active
  • which gestures it handles
  • when it should accept or ignore an event
  • whether it creates or activates controllers
  • how it updates the data graph

That makes the editor predictable and easier to combine with the rest of the SAMSON interaction model.

Example scenario#

Suppose you want to implement an editor that creates a polymer path from viewport clicks:

  • the editor becomes the active tool
  • each click adds or moves a control point
  • keyboard modifiers change the creation mode
  • the editor creates or updates nodes in the data graph as the interaction progresses

That is editor-driven interaction because the workflow is stateful and gesture-based.