Youyeon Joo
Ph.D. student

This will be updated soon.

If you have any questions, please feel free to contact me.


Education
  • Seoul National University
    Department of Electrical and Computer Engineering
    Ph.D. Student (Advisor: Yunheung Paek)
    Mar. 2022 - present
  • Ewha Womans University
    B.S. in Cyber Security (Cum Laude)
    Mar. 2018 - Feb. 2022
Honors & Awards
  • Excellence Paper Award of ACK 2024 (granted by KIPS)
    2024
  • Excellence Paper Award of ASK 2023 (granted by KIPS)
    2023
  • Academic Excellence Scholarship (granted by Ewha Womans University)
    2020-2022
News
2025
"Efficient Keyset Design for Neural Networks Using Homomorphic Encryption" is accepted to MDPI Sensors.
Jul 08
"SLOTHE: Lazy Approximation of Non-Arithmetic Neural Network Functions over Encrypted Data" is accepted at USENIX Security 2025.
Jun 06
Homepage is created!
Jun 05
Selected Publications (view all)
SLOTHE: Lazy Approximation of Non-Arithmetic Neural Network Functions over Encrypted Data

Kevin Nam*, Youyeon Joo*, Seungjin Ha, Yunheung Paek† (* equal contribution, † corresponding author)

USENIX Security Symposium (USENIX Sec), 2025

Existing works adopt an eager approximation (EA) strategy to approximate non-arithmetic functions (NAFs), statically replacing each NAF with a fixed polynomial, which locks in computational errors and limits optimization opportunities. We propose SLOTHE, a lazy approximation (LA) solution that recursively decomposes NAF code into arithmetic and non-arithmetic sub-functions, selectively approximating only the non-arithmetic components when required.
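The lazy-approximation idea can be sketched in a few lines of plain Python. This is an illustrative toy, not the SLOTHE implementation: the helper names, the fitting method, and the degree/interval choices are all assumptions. GELU(x) = x · Φ(x) contains one arithmetic operation (the product) and one non-arithmetic sub-function (the Gaussian CDF Φ). An eager approach would fit a polynomial to GELU as a whole; a lazy approach decomposes it and approximates only Φ.

```python
import numpy as np
from math import erf

def poly_approx(f, lo, hi, degree):
    # Least-squares polynomial fit on [lo, hi]; a simple stand-in for the
    # minimax-style approximations used over encrypted data.
    xs = np.linspace(lo, hi, 512)
    return np.poly1d(np.polyfit(xs, f(xs), degree))

# Gaussian CDF Phi: the only non-arithmetic sub-function of GELU.
phi = lambda x: 0.5 * (1.0 + np.vectorize(erf)(np.asarray(x) / np.sqrt(2.0)))

phi_poly = poly_approx(phi, -4.0, 4.0, 11)   # approximate only the NAF part
gelu_lazy = lambda x: x * phi_poly(x)        # the product stays exact
gelu_exact = lambda x: x * phi(x)

xs = np.linspace(-3.0, 3.0, 61)
max_err = float(np.max(np.abs(gelu_lazy(xs) - gelu_exact(xs))))
```

Because the multiplication is left exact, all approximation error comes from the single non-arithmetic factor, and each decomposed sub-function can be re-approximated independently when a tighter error budget is needed.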

LOHEN: Layer-wise Optimizations for Neural Network Inferences over Encrypted Data with High Performance or Accuracy

Kevin Nam, Youyeon Joo, Dongju Lee, Seungjin Ha, Hyunyoung Oh, Hyungon Moon†, Yunheung Paek† († corresponding author)

USENIX Security Symposium (USENIX Sec), 2025

When FHE is applied to neural networks (NNs), we have observed that the distinct layered architecture of NN models opens the door to performance improvements through layer-wise Ciphertext Configurations (CCs), because a globally chosen CC may not be the best possible CC for every layer individually. This paper introduces LOHEN, a technique crafted to attain high NN inference performance by enabling efficient use of layer-wise CCs.
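The core trade-off can be sketched with a toy selection problem. The layer names, latency numbers, and precision requirements below are illustrative assumptions, not data from the paper: each layer can run under several candidate CCs, a global choice must satisfy every layer at once, while a layer-wise choice picks the cheapest CC meeting each layer's own requirement.

```python
# (cc_name, latency_ms, precision_bits) per layer; all values illustrative.
candidates = {
    "conv1": [("cc_fast", 10, 18), ("cc_safe", 25, 30)],
    "conv2": [("cc_fast", 12, 20), ("cc_safe", 28, 30)],
    "fc":    [("cc_fast", 5,  25), ("cc_safe", 9,  30)],
}
required_bits = {"conv1": 16, "conv2": 24, "fc": 20}

def pick_layerwise(candidates, required_bits):
    # For each layer, take the fastest CC meeting that layer's precision need.
    plan, total = {}, 0
    for layer, ccs in candidates.items():
        ok = [c for c in ccs if c[2] >= required_bits[layer]]
        name, lat, _ = min(ok, key=lambda c: c[1])
        plan[layer] = name
        total += lat
    return plan, total

def pick_global(candidates, required_bits):
    # One CC must satisfy every layer simultaneously; return the fastest such CC.
    names = {c[0] for ccs in candidates.values() for c in ccs}
    best = None
    for name in names:
        lat, ok = 0, True
        for layer, ccs in candidates.items():
            match = [c for c in ccs if c[0] == name]
            if not match or match[0][2] < required_bits[layer]:
                ok = False
                break
            lat += match[0][1]
        if ok and (best is None or lat < best[1]):
            best = (name, lat)
    return best

plan, lw_latency = pick_layerwise(candidates, required_bits)
glob = pick_global(candidates, required_bits)
```

In this toy instance only one layer actually needs the high-precision CC, so the layer-wise plan runs faster than any single global CC that is valid for all layers, which is the intuition the abstract describes.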

All publications