DeepLearning(11)
-
[Paper Review] GANomaly: Semi-Supervised Anomaly Detection via Adversarial Training
GANomaly: Semi-supervised anomaly detection via adversarial training (2018) Authors: Samet Akcay, Amir Atapour-Abarghouei, and Toby P. Breckon Conference: Asian Conference on Computer Vision (pp. 622--637) Publisher: Springer Summary: Semi-supervised anomaly detection Architecture: Conditional GAN, an adversarial autoencoder within an Encoder-Decoder-Encoder pipeline Anomalies are detected when t..
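As summarized above, GANomaly's generator is an encoder-decoder-encoder pipeline, and anomalies are flagged when the latent code of the input and the latent code of its reconstruction disagree. A minimal sketch of that scoring idea follows; the fully connected layers, layer sizes, and module names are illustrative assumptions, not the paper's exact convolutional configuration.

```python
import torch
import torch.nn as nn

class GanomalyGenerator(nn.Module):
    """Encoder-decoder-encoder pipeline (toy fully connected version;
    the paper uses convolutional sub-networks)."""
    def __init__(self, in_dim=784, latent_dim=100):
        super().__init__()
        self.encoder1 = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                      nn.Linear(256, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                     nn.Linear(256, in_dim), nn.Tanh())
        self.encoder2 = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                      nn.Linear(256, latent_dim))

    def forward(self, x):
        z = self.encoder1(x)          # latent code of the input
        x_hat = self.decoder(z)       # reconstruction
        z_hat = self.encoder2(x_hat)  # latent code of the reconstruction
        return x_hat, z, z_hat

def anomaly_score(model, x):
    """Large distance between the two latent codes => likely anomalous."""
    _, z, z_hat = model(x)
    return torch.norm(z - z_hat, p=1, dim=1)
```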
2020.08.10 -
[Paper Review] f-AnoGAN: Fast unsupervised anomaly detection with generative adversarial networks
f-AnoGAN Authors: Thomas Schlegl, Philipp Seeböck, Sebastian M. Waldstein, Georg Langs, Ursula Schmidt-Erfurth Journal: Medical Image Analysis (Impact Factor: 11.148 in 2020) Summary: To overcome the slow iterative optimization that the original AnoGAN requires at inference time, the paper proposes an encoder-based architecture, along with several schemes for training the encoder. Architecture: Wasserstein GAN + Encoder Encoder-based Anomaly Detection Encoder enables a fast lear..
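The core idea from the summary is to replace AnoGAN's per-sample latent search with a single encoder forward pass; scoring then combines an image-space residual with a residual in the discriminator's feature space. The sketch below is only an illustration of that scheme: the module names, tensor shapes, and the weighting factor kappa are assumptions, with the generator, encoder, and discriminator feature extractor taken to be pretrained modules.

```python
import torch

def f_anogan_score(x, encoder, generator, discriminator_features, kappa=1.0):
    """f-AnoGAN-style scoring: one encoder pass instead of AnoGAN's
    per-sample latent optimization. The three network arguments are
    assumed to be pretrained torch.nn.Module objects (placeholder names)."""
    with torch.no_grad():
        z = encoder(x)            # map image to latent code
        x_rec = generator(z)      # reconstruct from latent
        # image-space residual (assumes x is a batch of images, N x C x H x W)
        res_img = torch.mean((x - x_rec) ** 2, dim=[1, 2, 3])
        # residual in the discriminator's intermediate feature space
        f_x, f_rec = discriminator_features(x), discriminator_features(x_rec)
        res_feat = torch.mean((f_x - f_rec) ** 2, dim=1)
    return res_img + kappa * res_feat
```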
2020.08.09 -
[Paper Review] StyleGAN2
Analyzing and Improving the Image Quality of StyleGAN Introduction Title: Analyzing and Improving the Image Quality of StyleGAN Authors: Tero Karras, Samuli Laine, Miika Aittala, Janne Hellsten, Jaakko Lehtinen, Timo Aila Organization: NVIDIA Summary: Analyzes the problems of StyleGAN and improves the model architecture and training method Improved generator architecture Redesign generator normalization: resolves the "droplet artifacts" problem Revisit progressive growing: "phase" artifacts, Shift-..
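The "redesign generator normalization" point refers to replacing per-feature-map normalization with weight modulation and demodulation, which removes the droplet artifacts. Below is a rough, simplified sketch of that operation for a single convolution layer; the tensor shapes, the eps value, and the grouped-convolution trick are my assumptions for illustration, not the official implementation.

```python
import torch
import torch.nn.functional as F

def modulated_conv2d(x, weight, style, eps=1e-8, demodulate=True):
    """StyleGAN2-style modulated convolution (simplified sketch).
    x:      (N, C_in, H, W) activations
    weight: (C_out, C_in, k, k) base conv weights
    style:  (N, C_in) per-sample scales produced from the latent w
    """
    n = x.shape[0]
    # modulate: scale the input channels of the weights per sample
    w = weight.unsqueeze(0) * style.view(n, 1, -1, 1, 1)   # (N, C_out, C_in, k, k)
    if demodulate:
        # demodulate: normalize each output feature map's expected scale,
        # replacing instance normalization (the source of droplet artifacts)
        d = torch.rsqrt(w.pow(2).sum(dim=[2, 3, 4], keepdim=True) + eps)
        w = w * d
    # grouped-convolution trick: fold the batch dimension into groups
    x = x.reshape(1, -1, *x.shape[2:])
    w = w.reshape(-1, *w.shape[2:])
    out = F.conv2d(x, w, padding=weight.shape[-1] // 2, groups=n)
    return out.reshape(n, -1, *out.shape[2:])
```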
2020.07.28 -
[Paper Review] StyleGAN
StyleGAN: A Style-Based Generator Architecture for GANs We propose an alternative generator architecture for generative adversarial networks, borrowing from style transfer literature. The new architecture leads to an automatically learned, unsupervised separation of high-level attributes (e.g., pose and identity when trained on human faces) and stochastic variation in the generated images (e.g.,..
2020.07.27 -
Small Batch Size in Deep Learning
How to Choose the Batch Size? In general, the batch size can be chosen from the three regimes below, and its size has a large effect on deep-learning performance. Batch (deterministic) Gradient Descent Mini-Batch Gradient Descent Stochastic (Online) Gradient Descent Large Batch vs. Small Batch: accurate estimate of the gradient (low variance) vs. noisy estimate of the gradient (high variance); high computation cost per iteration vs. low computation cost per iteration; Hi..
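The three regimes above differ only in how many samples contribute to each gradient step. A toy sketch (the dataset, model, learning rate, and epoch count are made up purely for illustration) showing that the same training loop covers all three cases just by changing batch_size:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# toy linear-regression data (illustrative only)
X = torch.randn(1024, 10)
y = X @ torch.randn(10, 1) + 0.1 * torch.randn(1024, 1)
dataset = TensorDataset(X, y)

def train(batch_size, epochs=5, lr=0.01):
    """batch_size == len(dataset)      -> batch (deterministic) GD
       1 < batch_size < len(dataset)   -> mini-batch GD
       batch_size == 1                 -> stochastic (online) GD"""
    model = nn.Linear(10, 1)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
    for _ in range(epochs):
        for xb, yb in loader:
            opt.zero_grad()
            loss = nn.functional.mse_loss(model(xb), yb)
            loss.backward()  # gradient estimated from this batch only
            opt.step()
    return model

train(batch_size=len(dataset))  # batch GD: accurate gradient, costly steps
train(batch_size=32)            # mini-batch GD: the usual compromise
train(batch_size=1)             # SGD: noisy gradient, cheap steps
```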
2020.07.23