Aaron Wang
Advisor: Rick Cavanaugh
Mentor: Jennifer Ngadiuba
Undergraduate: University of Washington (Physics)
Graduate: University of Illinois Chicago (Physics)
Project: Efficient Transformers
Exploring efficient transformer architectures on particle physics data sets and implementing well-performing models in hls4ml. Transformers are a revolutionary deep learning architecture used widely in natural language processing. Their ability to process long sequences of information makes them well suited to the long sequences of particles produced in physics jets. Despite the power of the transformer model, its self-attention mechanism scales quadratically with the input sequence length, creating a major computational bottleneck. I am exploring new models that reduce this bottleneck in order to speed up the inference time of transformer models.
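As a minimal sketch (not from the project itself) of where the quadratic cost comes from: standard scaled dot-product attention forms an n×n score matrix from Q and K, so compute and memory grow as n². The sequence lengths and embedding dimension below are illustrative assumptions, not values from this work.

```python
# Illustrative only: standard self-attention builds an (n, n) score
# matrix, so doubling the sequence length n quadruples its size.
import numpy as np

def attention(Q, K, V):
    """Standard scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # shape (n, n): the quadratic term
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V

d = 64                                # embedding dimension (assumed for illustration)
for n in (128, 512, 2048):            # e.g. number of jet constituents (assumed)
    Q = K = V = np.random.randn(n, d)
    out = attention(Q, K, V)
    print(f"n={n:5d}  score-matrix entries: {n*n:>9,d}  output shape: {out.shape}")
```

Efficient-transformer variants replace or approximate this full n×n score matrix so that inference cost grows more slowly with sequence length.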