COMPUTER GAME TECHNOLOGIES * VIRTUAL REALITY * VIRTUAL CHARACTERS AND CROWDS * URBAN PERCEPTION

The images above are from student projects that I have supervised in courses I teach or am affiliated with, including DD1354, DH2323, DT2350, DD1392 MVK, SA104X, DD2463 (Advanced Individual Course), DD3336 and Exjobb (Master thesis). I have a wide variety of potential projects available to KTH students of all levels and backgrounds, relating to real-time computer graphics and animation, computer game technologies, perceptual computing and procedural generation. Application areas range from architecture and urban modelling (using VR technologies such as the Oculus Rift) to pedestrian behaviour and interaction with full-sized virtual humanoids using devices such as the Kinect and Leap Motion. Contact me if you would like to investigate possibilities for project work; some ideas are below. Note that many of these project areas include the opportunity to collaborate with research groups both within and outside KTH. You may obtain a further idea of the types of project areas that I typically supervise from the blogs here, here and here.

PROJECT IDEAS

Here are some examples of general project areas/ideas (with links to further references):

1) Scheduling algorithm for simulating the routines of autonomous pedestrians in a virtual KTH campus.
2) Emotion recognition and mirroring game using virtual characters' facial expressions and body motions for learning scenarios between robots, agents and children.
3) Inverse procedural generation techniques for the automatic generation of stylised virtual urban environments.
4) Machine learning and data-driven approaches from mobile robotics applied to the simulation of small groups of virtual agents.
5) A (crowd-sourcing) tool using computer game technologies to investigate the perception of urban environments, with a view to enhancing human well-being.
6) Automatic variation of the graphical appearance of virtual characters for rendering plausible-looking crowd scenarios using limited numbers of models.
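To give a flavour of project idea 1, here is a minimal sketch of a routine scheduler for autonomous pedestrians. All names and the priority-queue design are illustrative assumptions on my part, not code from an actual project:

```python
import heapq

# Illustrative sketch: agents book timed activities (lectures, lunch,
# walking between buildings) and a scheduler dispatches them in time order.

class Scheduler:
    def __init__(self):
        self._queue = []  # min-heap of (start_time, agent, activity)

    def book(self, start_time, agent, activity):
        heapq.heappush(self._queue, (start_time, agent, activity))

    def run_until(self, end_time):
        """Pop and return all activities due up to end_time, in time order."""
        dispatched = []
        while self._queue and self._queue[0][0] <= end_time:
            dispatched.append(heapq.heappop(self._queue))
        return dispatched

sched = Scheduler()
sched.book(9.0, "student_1", "attend lecture in D1")
sched.book(8.5, "student_1", "walk to campus")
sched.book(12.0, "student_2", "lunch at restaurant")

for t, agent, act in sched.run_until(10.0):
    print(f"{t:05.2f} {agent}: {act}")
```

A real campus simulation would layer path planning and steering beneath this, with the scheduler only deciding *what* each pedestrian should be doing *when*.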
Christopher Peters
Associate Professor (Docent)
Computational Science and Technology Department (CST)
School of Computer Science and Communication (CSC)
KTH Royal Institute of Technology
Sweden
Real-time gaze-based interaction with virtual agents in the Unity game engine. The pink dot that can be seen moving in the video above is the gaze position of the user. It is used to drive the behaviours of virtual characters in the environment and to alter the volume of the audio from objects that are being attended to.

High density crowds via fluid simulation
Richard Ristic and Johan Berglund (SA104X)
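The gaze-driven audio behaviour described in the caption above could be sketched roughly as follows. This is my own simplified illustration (the actual project is in Unity); the function name, the distance-based falloff and the ambient level are all assumptions:

```python
import math

# Illustrative sketch: map a 2D gaze point to nearby on-screen objects
# and raise the audio volume of attended objects while muting the rest.

def attend(gaze, objects, radius=0.15):
    """Return per-object volumes in [0, 1] given a gaze point.

    gaze: (x, y) in normalised screen coordinates.
    objects: dict of name -> (x, y) screen position.
    Objects within `radius` of the gaze point grow louder as the gaze
    approaches; all others drop to a low ambient level.
    """
    volumes = {}
    for name, (ox, oy) in objects.items():
        d = math.hypot(gaze[0] - ox, gaze[1] - oy)
        if d < radius:
            volumes[name] = 1.0 - d / radius  # louder when closer
        else:
            volumes[name] = 0.1               # ambient level
    return volumes

vols = attend((0.5, 0.5), {"radio": (0.52, 0.5), "tv": (0.9, 0.1)})
```

In Unity the same idea would map the eye tracker's gaze point onto scene objects (e.g. via a raycast) and set each attended object's audio source volume accordingly.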
Real-time procedural urban walk-through in the Unity game engine. The purpose of this application is for experiments relating to urban perception and well-being. Some of the parameters that can be altered procedurally include building height, the amount of greenery in the scene and the footpath (sidewalk) width.

Obscurus
Daniel Nyberg (DD3336)
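The parameter-driven generation described above might look, in a heavily simplified form, like the sketch below. The function, field names and distributions are my own illustrative assumptions, not the project's generator:

```python
import random

# Illustrative sketch: generate one street side from a few perceptual
# parameters (building height, greenery, footpath width), so that
# experiment conditions can be produced by varying only these inputs.

def generate_street(n_buildings, mean_height, greenery, footpath_width, seed=0):
    """Return a list of block descriptions for one street side.

    mean_height: average building height in storeys.
    greenery: probability in [0, 1] of placing a tree per block.
    footpath_width: sidewalk width in metres, constant along the street.
    """
    rng = random.Random(seed)  # seeded for reproducible experiment stimuli
    street = []
    for i in range(n_buildings):
        street.append({
            "building": i,
            "height": max(1, round(rng.gauss(mean_height, 1.5))),
            "tree": rng.random() < greenery,
            "footpath_width": footpath_width,
        })
    return street

# e.g. a leafy low-rise street with wide sidewalks:
blocks = generate_street(10, mean_height=3, greenery=0.8, footpath_width=4.0)
```

Seeding the generator matters for perception experiments: two participants shown "the same street" must see identical geometry, with only the controlled parameter varying between conditions.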
GALLERY

Gaze-based interaction with virtual agents
Group Bismarck (MVK14, Client: Tobii Technology)
High density crowd simulation in the Unity game engine, described in this blog. The algorithm is based on a unilaterally incompressible fluid simulation. The crowd rendering aspects of the project are based on two previous student projects from the DH2323 and DH2320 courses.
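The core intuition behind unilateral incompressibility is that the crowd behaves like a fluid that resists compression but not expansion: pressure corrections push agents out of over-dense cells, and never pull agents together in under-dense ones. The 1D grid sketch below is my own rough illustration of that one-sided constraint; the constants, the single relaxation pass and the neighbour heuristic are assumptions, not the project's actual solver:

```python
# Illustrative 1D sketch of unilateral incompressibility for crowds:
# density is accumulated on a grid, and velocity corrections are applied
# only in cells whose density exceeds a maximum (the one-sided constraint).

MAX_DENSITY = 4.0  # agents per cell before pushing apart (arbitrary choice)
CELL = 1.0         # cell width

def density_grid(positions, n_cells):
    """Count agents per grid cell."""
    grid = [0.0] * n_cells
    for x in positions:
        grid[min(int(x / CELL), n_cells - 1)] += 1.0
    return grid

def apply_unilateral_pressure(positions, velocities, n_cells, k=0.5):
    """Nudge agents out of over-dense cells; under-dense cells are untouched."""
    grid = density_grid(positions, n_cells)
    for i, x in enumerate(positions):
        c = min(int(x / CELL), n_cells - 1)
        excess = grid[c] - MAX_DENSITY
        if excess > 0:  # unilateral: act only when compressed
            # push towards the emptier neighbour (walls count as full)
            left = grid[c - 1] if c > 0 else float("inf")
            right = grid[c + 1] if c < n_cells - 1 else float("inf")
            direction = -1.0 if left < right else 1.0
            velocities[i] += k * excess * direction
    return velocities
```

A full solver works in 2D, solves for pressure globally rather than per-cell, and blends the corrected velocity with each agent's preferred velocity, which is what lets very dense crowds flow without interpenetration.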
Miguel Ramos Carretero, Adam Qureshi, Christopher Peters, "Evaluating the perception of group emotion from full body movements in the context of virtual crowds". ACM Symposium on Applied Perception 2014, pp. 7-14.
MASTER THESES

Real-time Depth Sorting of Transparent Fragments, AXEL LEWENHAUPT, expected 2016 (in collaboration with Fatshark AB)
Unified Volumetric Lighting with Clustered Shading, PHILIP SKOLD, expected 2016 (in collaboration with Fatshark AB)
The Effect of Facial Expressions Valence on the Perception of Expressed Body Motions, ROBIN PALMBERG, expected 2016
Perception of Trustworthiness and Valence of Emotional Expressions in Virtual Characters, NIKLAS BLOMQVIST, 2016
Investigating Urban Perception Using Procedural Street Generation and Virtual Reality, OSCAR FRIBERG, 2016
Concealing Rendering Simplifications Using Gaze Contingent Depth of Field, TIM LINDEBERG, 2016 (in collaboration with Tobii AB)
An Evaluation of Interactors' Gaze-to-Object Mapping Performance in 3D Virtual Environments, MARTIN SCHON, 2016 (in collaboration with Tobii AB)
Collision Detection Between Dynamic Objects and Static Displacement Mapped Surfaces in Computer Games, FANGKAI YANG, 2015 (in collaboration with Avalanche Studios)
Volumetric Terrain Generation on the GPU: A Modern GPGPU Approach to Marching Cubes, LUDWIG PETHRUS ENGSTROM, 2015
Expression of Emotion in Virtual Crowds: Investigating Emotion Contagion and Perception of Emotional Behaviour in Crowd Simulation, MIGUEL RAMOS CARRETERO, 2014
Obscurus is an immersive game experience created by Daniel Nyberg using the Bitsquid game engine and the Oculus Rift virtual reality headset. Daniel subsequently entered and won the prestigious CAwards! More about the development of Obscurus can be found in his blog, here.
PUBLICATIONS AUTHORED BY MASTER STUDENTS
This study investigates how the emotional expressions of a task-irrelevant background crowd affect the perception of emotion from a task-relevant foreground group, based on their full-body motions.
PROJECTS: ENGINEERING PHYSICS

Group Simulation of Three Agents using Unity3D, VINCENT WONG and MAX TURPEINEN, 2016
Simulating Group Formations using RVO, MARTIN FUNKQVIST and STAFFAN SANDBERG, 2016
High-density Real-time Virtual Crowds via Unilaterally Incompressible Fluid Simulation, RICHARD RISTIC and JOHAN BERGLUND, 2015