As an Associate Professor and the head of the DIG Lab, I am dedicated to mentoring students to become skilled researchers and collaborators. Allow me to introduce our team.


I work on consistent 3D texture generation with geometry-aware relighting, focusing on multi-view texture optimization and dental imaging applications using the Mitsuba 3 renderer.




I am interested in medium-aware 3D Gaussian Splatting reformulations for active digital twins, focusing on forward rendering optimizations and fragment-level gradient accumulation in overlapping Gaussian scenarios.




I work on real-time graphics engine optimization and physically based rendering systems, with expertise in DirectX 12 pipeline enhancement and voxel-based global illumination (VXGI) for production-quality rendering engines.




I work on controllable hair editing using sketch-guided diffusion models, focusing on context-robust generation and matte-gated ControlNet architectures in DiT frameworks.




I lead our Genesis-based Real2Sim project, which spans state and trajectory-to-state mappers, inverse-dynamics calibration, residual reinforcement learning (BC + RL), and large-scale multi-agent combat simulation.




I focus on terrain-aware inverse-dynamics calibration between the Blender and Genesis simulators, exploring 3D terrain generalization for sim-to-sim transfer.




I am researching residual reinforcement learning approaches for path-following control, with an emphasis on covariate shift mitigation and sim-to-real transfer.




I work on rig-constrained on-the-fly 3D Gaussian Splatting for real-time novel view synthesis with multi-camera systems, specializing in rotation-only rig optimization and incremental pose estimation.




Alumni

Juwon Kim (BS, 2023.2)   
Jonghoon Kim (BS, 2023.2)   
Taemin Jung (BS, 2024.2)   
Donguk Kim (BS, 2024.2)   
Sihun Jin (BS, 2024.2)   
Haneul Kim (BS, 2024.2)   
Minkyo Kim (BS, 2024.2)   
Eunyoung Choi (MS)