Jang-Hyun Kim
Hello! I am a machine learning researcher.
I recently earned my PhD in computer science at Seoul National University, advised by Hyun Oh Song.
I was a visiting scholar at New York University in 2024, hosted by Kyunghyun Cho.
Before that, I interned at NAVER AI in 2018, where I worked on speech enhancement.
I completed my BSc in Mathematics at Seoul National University in 2019. My studies were supported by the Korea Foundation for Advanced Studies (PhD) and the Presidential Science Scholarship (BSc).
I'll be joining Apple's AI/ML Foundation Models team (NYC) as a Research Scientist!
Email | CV | Scholar | Github | Twitter
Research
Future AI systems will require long-term interaction, continual personalization, large-database
inference, and streamed input processing. I believe that efficient and effective context
management is the key to enabling these capabilities. My current research focuses on improving
contextual memory in AI models, addressing challenges in attention mechanisms
(Context Memory), KV caching (KVzip), and tokenization.
Previously, I led ML research projects in synthetic data generation, dataset cleaning, causal discovery for human genes, and speech enhancement.
KVzip: Query-Agnostic KV Cache Compression with Context Reconstruction
Jang-Hyun Kim, Jinuk Kim, Sangwoo Kwon, Jae W. Lee, Sangdoo Yun, Hyun Oh Song
NeurIPS, 2025 - Oral Presentation (77/21575=0.35%)
Paper | Code | Blog | Bibtex
Large-Scale Targeted Cause Discovery via Learning from Simulated Data
Jang-Hyun Kim, Claudia Skok Gibbs, Sangdoo Yun, Hyun Oh Song, Kyunghyun Cho
TMLR, 2025
Paper | Code | LM podcast | Bibtex
Compressed Context Memory For Online Language Model Interaction
Jang-Hyun Kim, Junyoung Yeom, Sangdoo Yun*, Hyun Oh Song*
ICLR, 2024
Paper | Code | Project Page | Bibtex
Neural Relation Graph: A Unified Framework for Identifying Label Noise and Outlier Data
Jang-Hyun Kim, Sangdoo Yun, Hyun Oh Song
NeurIPS, 2023
Paper | Code | Bibtex
Dataset Condensation via Efficient Synthetic-Data Parameterization
Jang-Hyun Kim, Jinuk Kim, Seong Joon Oh, Sangdoo Yun, Hwanjun Song, Joonhyun Jeong, Jung-Woo Ha, Hyun Oh Song
ICML, 2022
Paper | Code | Bibtex
Uncertainty-Based Offline Reinforcement Learning with Diversified Q-Ensemble
Gaon An*, Seungyong Moon*, Jang-Hyun Kim, Hyun Oh Song
NeurIPS, 2021
Paper | Code | Bibtex
Co-Mixup: Saliency Guided Joint Mixup with Supermodular Diversity
Jang-Hyun Kim, Wonho Choo, Hosan Jeong, Hyun Oh Song
ICLR, 2021 - Oral Presentation (53/2997=1.7%)
Paper | Code | Bibtex
Spherical Principal Curves
Jongmin Lee*, Jang-Hyun Kim*, Hee-Seok Oh
TPAMI, 2021 | R Journal, 2022
Paper | R Journal | Code | Bibtex
Puzzle Mix: Exploiting Saliency and Local Statistics for Optimal Mixup
Jang-Hyun Kim, Wonho Choo, Hyun Oh Song
ICML, 2020
Paper | Code | Bibtex
Phase-Aware Speech Enhancement with Deep Complex U-Net
Hyeong-Seok Choi, Jang-Hyun Kim, Jaesung Huh, Adrian Kim, Jung-Woo Ha, Kyogu Lee
arXiv, 2019
Paper | Bibtex
Multi-Domain Processing via Hybrid Denoising Networks for Speech Enhancement
Jang-Hyun Kim*, Jaejun Yoo*, Sanghyuk Chun, Adrian Kim, Jung-Woo Ha
arXiv, 2018
Paper | Code | Bibtex | Demo
Google's Speaker Verification
Code | Kaggle
Caricature Generation
Code
Image Mosaic via Mixed Integer Programming
Code
Dissertation
Data Optimization for Efficient Deep Learning, PhD Dissertation, 2025 | Paper
Mathematical Backgrounds for Machine Learning, Undergraduate Dissertation (in Korean), 2019 | Paper
Academic Services
Workshop Program Committee / Reviewer
- Curated Data for Efficient Learning (ICCV 2025) | Website
- Interpolation Regularizers and Beyond (NeurIPS 2022) | Website
- ImageNet: Past, Present, and Future (NeurIPS 2021) | Website
Reviewing Activities
- NeurIPS (2021-), ICLR (2022-), ICML (2022-), TMLR (2022-)