Yury Nahshan
Post training 4-bit quantization of convolutional networks for rapid-deployment
R Banner, Y Nahshan, D Soudry
Advances in Neural Information Processing Systems 32, 2019
Cited by 633
Accurate post training quantization with small calibration sets
I Hubara, Y Nahshan, Y Hanani, R Banner, D Soudry
International Conference on Machine Learning, 4466-4475, 2021
Cited by 204
Loss aware post-training quantization
Y Nahshan, B Chmiel, C Baskin, E Zheltonozhskii, R Banner, ...
Machine Learning 110 (11), 3245-3262, 2021
Cited by 134
Robust quantization: One model to rule them all
B Chmiel, R Banner, G Shomron, Y Nahshan, A Bronstein, U Weiser
Advances in Neural Information Processing Systems 33, 5308-5317, 2020
Cited by 71
Linear Log-Normal Attention with Unbiased Concentration
Y Nahshan, J Kampeas, E Haleva
https://openreview.net/pdf?id=5nM2AHzqUj, 2023
Rotation Invariant Quantization for Model Compression
J Kampeas, Y Nahshan, H Kremer, G Lederman, S Zaloshinski, Z Li, ...
arXiv preprint arXiv:2303.03106, 2023
ACIQ: Analytical Clipping for Integer Quantization of Neural Networks
R Banner, Y Nahshan, E Hoffer, D Soudry
arXiv preprint arXiv:1810.05723, 2018
Supplementary Material: Accurate Post Training Quantization With Small Calibration Sets
I Hubara, Y Nahshan, Y Hanani, R Banner, D Soudry