To synchronize the camera position across multiple render windows, we need a synchronization concept.
Options that should be modifiable:
- camera zoom
- camera pan (x, y)
- image slice walk (z)
- view direction (axial, sagittal, coronal)
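The options above can be collected into a small value type per render window. This is only a sketch; the names (`CameraState`, `ViewDirection`) are assumptions, not existing MITK classes.

```cpp
// Hypothetical sketch: the synchronizable subset of a render window's
// camera state, matching the options listed above. Not MITK API.
enum class ViewDirection { Axial, Sagittal, Coronal };

struct CameraState {
  double zoom = 1.0;                              // camera zoom factor
  double panX = 0.0;                              // camera pan in x
  double panY = 0.0;                              // camera pan in y
  int slice = 0;                                  // image slice position (z)
  ViewDirection direction = ViewDirection::Axial; // per-window view direction
};
```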
The view direction can already be changed individually for each render window (as demonstrated in the render window manager).
Since the camera position belongs to a render window rather than to a data node / image, we will not use the property mechanism.
Instead, we want to use an event system.
What has to be considered concerning such an event system?
- context information to provide group-specific events?
- rules to define what happens on synchronization?
- std::function for arbitrary rule
- how does the mouse event system work?
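As a starting point for the first three questions, a minimal event bus could carry a group identifier as context and store arbitrary rules as `std::function` callbacks. This is a sketch under assumptions; all names (`CameraEvent`, `CameraEventBus`, `Subscribe`, `Publish`) are hypothetical and not part of MITK.

```cpp
#include <functional>
#include <map>
#include <string>
#include <utility>
#include <vector>

// Hypothetical sketch of a group-aware camera event bus (not MITK API).
struct CameraEvent {
  std::string group; // context: only subscribers of this group react
  double zoom;       // example payload; pan/slice would be added similarly
};

class CameraEventBus {
public:
  // A rule is an arbitrary callable deciding what happens on synchronization.
  using Rule = std::function<void(const CameraEvent&)>;

  // Register a rule for a synchronization group.
  void Subscribe(const std::string& group, Rule rule) {
    m_Rules[group].push_back(std::move(rule));
  }

  // Notify every rule registered for the event's group.
  void Publish(const CameraEvent& event) const {
    auto it = m_Rules.find(event.group);
    if (it == m_Rules.end())
      return;
    for (const auto& rule : it->second)
      rule(event);
  }

private:
  std::map<std::string, std::vector<Rule>> m_Rules;
};
```

Each render window would subscribe with a rule that applies the received event to its own camera, so windows in different groups stay independent and per-window settings such as the view direction remain untouched.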