Jang-Hyun Kim

Hello! I am a PhD student in computer science at Seoul National University, advised by Hyun Oh Song. I am currently a visiting scholar at New York University, hosted by Kyunghyun Cho (until August 2024). I previously interned at NAVER AI in 2018 and completed my BSc in Mathematics at Seoul National University in 2019. My PhD studies are supported by the Korea Foundation for Advanced Studies (KFAS).

Email / CV / Scholar / GitHub / Twitter


Research

My research focuses on developing robust and efficient machine learning systems through a data-centric approach, particularly by leveraging the interplay between data and models. I address real-world challenges across domains including image, speech, and language.

My key research observations are:
  • [Data Generation] During training, models learn to locate the informative parts of the data for inference. Saliency-guided data augmentation can reinforce this ability [Puzzle Mix, Co-Mixup]; see the first sketch after this list.
  • [Data Compression] Trained models can compress datasets or contexts, improving the efficiency of both training and inference [Context Memory, Data Parameterization, Spherical Principal Curves]; see the second sketch below.
  • [Data Identification] Trained models can be used to infer relationships within data, which helps characterize problematic examples in large datasets [Neural Relation Graph]; see the third sketch below.
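
The first observation, in code: a minimal PyTorch sketch of the saliency-guided mixing idea, assuming a classifier model and an image batch (x, y). This is an illustration only; Puzzle Mix and Co-Mixup optimize the mixing mask (using local statistics and supermodular diversity, respectively), whereas this hypothetical saliency_mix simply weights each pixel by gradient-based saliency.

    import torch
    import torch.nn.functional as F

    def saliency(model, x, y):
        # Gradient-magnitude saliency: how strongly each pixel affects the loss.
        x = x.clone().requires_grad_(True)
        loss = F.cross_entropy(model(x), y)
        grad, = torch.autograd.grad(loss, x)
        return grad.abs().sum(dim=1, keepdim=True)  # (B, 1, H, W)

    def saliency_mix(model, x, y, temperature=1.0):
        # Mix each example with a shuffled partner, favoring the more
        # salient source at every pixel (a simplification, not the papers' method).
        idx = torch.randperm(x.size(0))
        s1 = saliency(model, x, y)
        s2 = saliency(model, x[idx], y[idx])
        mask = torch.sigmoid((s1 - s2) / temperature)  # per-pixel weight in (0, 1)
        x_mix = mask * x + (1 - mask) * x[idx]
        lam = mask.mean(dim=(1, 2, 3))                 # per-example label weight
        return x_mix, y, y[idx], lam

The training loss then mixes the two label terms per example, e.g. lam * CE(out, y) + (1 - lam) * CE(out, y[idx]) with per-example reduction.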
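The second observation, in code: the sketch below shows perhaps the simplest form of dataset compression, learning a few synthetic feature vectors whose class-wise means match those of the real data. This toy mean matching is assumed for illustration; the Data Parameterization paper condenses images via an efficient multi-formation parameterization, and Context Memory compresses attention key/value states instead.

    import torch

    def condense_by_mean_matching(feats, labels, n_classes, n_per_class,
                                  steps=500, lr=0.1):
        # Learn a small synthetic set whose per-class feature means match
        # the real data's; a crude instance of distribution matching.
        syn = torch.randn(n_classes * n_per_class, feats.size(1),
                          requires_grad=True)
        syn_labels = torch.arange(n_classes).repeat_interleave(n_per_class)
        opt = torch.optim.SGD([syn], lr=lr)
        for _ in range(steps):
            loss = sum((feats[labels == c].mean(0) - syn[syn_labels == c].mean(0))
                       .pow(2).sum() for c in range(n_classes))
            opt.zero_grad()
            loss.backward()
            opt.step()
        return syn.detach(), syn_labels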
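The third observation, in code: a toy sketch of the relation-graph idea for label-noise detection. Signed edges combine feature similarity with label agreement, and an example whose edges are mostly negative (similar features, different labels) is flagged. The actual relation function and scoring in the paper differ; the function name here is hypothetical.

    import torch
    import torch.nn.functional as F

    def label_noise_scores(feats, labels):
        # feats: (N, D) embeddings from a trained model; labels: (N,) class ids.
        z = F.normalize(feats, dim=1)
        sim = (z @ z.T).clamp(min=0)   # nonnegative cosine similarity
        sim.fill_diagonal_(0)
        agree = (labels[:, None] == labels[None, :]).float() * 2 - 1  # +1 / -1
        return -(sim * agree).sum(dim=1)  # high score = suspicious label

    # Usage: rank by score and inspect the top examples as noise candidates,
    # e.g. suspects = label_noise_scores(feats, labels).topk(50).indices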

Publications
Compressed Context Memory for Online Language Model Interaction
Jang-Hyun Kim, Junyoung Yeom, Sangdoo Yun, Hyun Oh Song
ICLR, 2024
Paper | Code | Project Page | Bibtex

Neural Relation Graph: A Unified Framework for Identifying Label Noise and Outlier Data
Jang-Hyun Kim, Sangdoo Yun, Hyun Oh Song
NeurIPS, 2023
Paper | Code | Bibtex

Dataset Condensation via Efficient Synthetic-Data Parameterization
Jang-Hyun Kim, Jinuk Kim, Seong Joon Oh, Sangdoo Yun, Hwanjun Song,
Joonhyun Jeong, Jung-Woo Ha, Hyun Oh Song
ICML, 2022
Paper | Code | Bibtex

Uncertainty-Based Offline Reinforcement Learning with Diversified Q-Ensemble
Gaon An*, Seungyong Moon*, Jang-Hyun Kim, Hyun Oh Song
NeurIPS, 2021
Paper | Code | Bibtex

Co-Mixup: Saliency Guided Joint Mixup with Supermodular Diversity
Jang-Hyun Kim, Wonho Choo, Hosan Jeong, Hyun Oh Song
ICLR (Oral Presentation), 2021
Paper | Code | Bibtex

Spherical Principal Curves
Jongmin Lee*, Jang-Hyun Kim*, Hee-Seok Oh (*: equal contribution)
TPAMI, 2021 | R Journal, 2022
Paper | R Journal | Code | Bibtex

Puzzle Mix: Exploiting Saliency and Local Statistics for Optimal Mixup
Jang-Hyun Kim, Wonho Choo, Hyun Oh Song
ICML, 2020
Paper | Code | Bibtex

Phase-Aware Speech Enhancement with Deep Complex U-Net
Hyeong-Seok Choi, Jang-Hyun Kim, Jaesung Huh, Adrian Kim,
Jung-Woo Ha, Kyogu Lee
arXiv, 2019
Paper | Bibtex

Multi-Domain Processing via Hybrid Denoising Networks for Speech Enhancement
Jang-Hyun Kim*, Jaejun Yoo*, Sanghyuk Chun, Adrian Kim, Jung-Woo Ha
arXiv, 2018
Paper | Code | Bibtex | Demo

Projects
Google's Speaker Verification
Code | Kaggle

Caricature Generation
Code

Image Mosaic via Mixed Integer Programming
Code


Notes

Mathematical Backgrounds for Machine Learning, an undergraduate dissertation (in Korean, 2018), Paper

Academic Services

Workshop Program Committee
  • First Workshop on Interpolation Regularizers and Beyond (NeurIPS 2022), Website
  • Workshop on ImageNet: Past, Present, and Future (NeurIPS 2021), Website

Reviewing Activities
  • NeurIPS (2021-), ICLR (2022-), ICML (2022-), TMLR (2022-)

Template based on Jon Barron's website.