| Title | Authors | Venue | Cited by | Year |
| --- | --- | --- | --- | --- |
| Q8BERT: Quantized 8Bit BERT | O Zafrir, G Boudoukh, P Izsak, M Wasserblat | 2019 Fifth Workshop on Energy Efficient Machine Learning and Cognitive …, 2019 | 522 | 2019 |
| How to train BERT with an academic budget | P Izsak, M Berchansky, O Levy | arXiv preprint arXiv:2104.07705, 2021 | 95 | 2021 |
| Transformer language models without positional encodings still learn positional information | A Haviv, O Ram, O Press, P Izsak, O Levy | arXiv preprint arXiv:2203.16634, 2022 | 74 | 2022 |
| Cloud-enabled, distributed and high-availability system with virtual machine checkpointing | B Hudzia, S Walsh, R Tell, A Shribman, P Izsak | US Patent 9,563,452, 2017 | 33 | 2017 |
| Term set expansion based NLP Architect by Intel AI Lab | J Mamou, O Pereg, M Wasserblat, A Eirew, Y Green, S Guskin, P Izsak, ... | arXiv preprint arXiv:1808.08953, 2018 | 29 | 2018 |
| Exploring the boundaries of low-resource BERT distillation | M Wasserblat, O Pereg, P Izsak | Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language …, 2020 | 17 | 2020 |
| System and method for text normalization in noisy channels | H Weisman, P Izsak, I Achlow, V Shafran | US Patent 10,803,241, 2020 | 14 | 2020 |
| The search duel: a response to a strong ranker | P Izsak, F Raiber, O Kurland, M Tennenholtz | Proceedings of the 37th International ACM SIGIR Conference on Research …, 2014 | 12 | 2014 |
| Term set expansion based on multi-context term embeddings: an end-to-end workflow | J Mamou, O Pereg, M Wasserblat, I Dagan, Y Goldberg, A Eirew, Y Green, ... | arXiv preprint arXiv:1807.10104, 2018 | 8 | 2018 |
| Training compact models for low resource entity tagging using pre-trained language models | P Izsak, S Guskin, M Wasserblat | 2019 Fifth Workshop on Energy Efficient Machine Learning and Cognitive …, 2019 | 7 | 2019 |
| SetExpander: End-to-end term set expansion based on multi-context term embeddings | J Mamou, O Pereg, M Wasserblat, I Dagan, Y Goldberg, A Eirew, Y Green, ... | Proceedings of the 27th International Conference on Computational …, 2018 | 7 | 2018 |
| Determination of prominent phrases in multi-channel interactions by multi-feature evaluations | H Weisman, P Izsak, V Shafran | US Patent 9,953,048, 2018 | 5 | 2018 |
| Optimizing retrieval-augmented reader models via token elimination | M Berchansky, P Izsak, A Caciularu, I Dagan, M Wasserblat | arXiv preprint arXiv:2310.13682, 2023 | 4 | 2023 |
| CoTAR: Chain-of-Thought Attribution Reasoning with Multi-level Granularity | M Berchansky, D Fleischer, M Wasserblat, P Izsak | arXiv preprint arXiv:2404.10513, 2024 | 1 | 2024 |
| RAG Foundry: A Framework for Enhancing LLMs for Retrieval Augmented Generation | D Fleischer, M Berchansky, M Wasserblat, P Izsak | arXiv preprint arXiv:2408.02545, 2024 | | 2024 |
| Reduction of latency in retriever-reader architectures | M Berchansky, P Izsak | US Patent App. 17/957,456, 2023 | | 2023 |
| Remote redundant array of inexpensive memory | A Shribman, P Izsak, B Hudzia, R Tell | US Patent App. 13/646,433, 2014 | | 2014 |
| Leveraging memory mirroring for transparent memory scale-out with zero-downtime failover of remote hosts | R Tell, P Izsak, A Shribman, S Walsh, B Hudzia | 2013 IEEE Symposium on Computers and Communications (ISCC), 384-390, 2013 | | 2013 |
| Data Intensive Enterprise Applications | P Izsak, A Shribman | Data Intensive Storage Services for Cloud Environments, 158-165, 2013 | | 2013 |
| 2019 Fifth Workshop on Energy Efficient Machine Learning and Cognitive Computing - NeurIPS Edition (EMC2-NIPS) | R Appuswamy, JV Arthur, D Bablani, D Badawi, A Ball, J Beu, T Bluche, ... | IEEE, 2019 | | 2019 |