This game demo showcases the idea of affective camera control. Several computational models of affect were trained on data from previous players. Metrics collected during play are fed to these models, which direct the camera behavior so as to steer the player's experience according to the selected mode.
- A 3D prey/predator game in which the player controls a ball inside a maze, trying to collect pellets while avoiding enemies.
- The virtual camera, which determines how the game world is displayed on screen, is controlled by the game engine.
- Artificial Neural Network models trained on historical player data estimate different aspects of player experience (fun, challenge, and frustration) and influence them by adapting the virtual camera's behavior.
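The adaptation loop described above can be sketched in a few lines: a small feedforward network maps gameplay metrics to a predicted experience score, and that score drives a camera parameter. This is a minimal illustration only; the metric names, network weights, and camera mapping below are invented placeholders, not the demo's actual model or feature set.

```python
import numpy as np

# Hypothetical gameplay metrics for one play window, normalized to [0, 1]
# (illustrative names, not the demo's real feature set).
metrics = np.array([0.6,   # pellets collected
                    0.3,   # time spent near predators
                    0.8])  # average movement speed

# Tiny feedforward ANN with fixed example weights, standing in for a
# model trained on previous players' data.
W1 = np.array([[ 0.5, -0.2, 0.1],
               [-0.3,  0.8, 0.4]])
b1 = np.array([0.1, -0.1])
W2 = np.array([[0.7, -0.5]])
b2 = np.array([0.2])

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def predict_fun(x):
    """Map a metrics vector to a predicted 'fun' score in (0, 1)."""
    h = np.tanh(W1 @ x + b1)
    return float(sigmoid(W2 @ h + b2)[0])

def adapt_camera_distance(fun, lo=4.0, hi=10.0):
    """Example camera policy: pull the camera in when predicted fun is
    low (to raise tension) and zoom out when it is high."""
    return lo + (hi - lo) * fun

fun = predict_fun(metrics)
distance = adapt_camera_distance(fun)
```

In a real adaptive loop, the engine would recompute the metrics every few seconds and smoothly interpolate the camera toward the new target parameters rather than snapping to them.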
G. N. Yannakakis, H. P. Martinez, and A. Jhala, "Towards Affective Camera Control in Games," User Modeling and User-Adapted Interaction, Springer, 2010.