I am a PhD candidate at Purdue University, where I work on Natural Language Understanding with Dr. Julia Taylor Rayz at the AKRaNLU Lab. I am particularly interested in characterizing the semantic knowledge available to computational models that learn only from textual exposure. I work closely with Dr. Allyson Ettinger and her lab at UChicago. I am also affiliated with CERIAS, Purdue's center for research and education in information security.
In 2018, I was fortunate to be awarded the Purdue Research Foundation fellowship (now known as the Ross-Lynn Graduate Student Fellowship). I then taught database fundamentals to sophomore-level undergraduates for three semesters. I am currently funded by an NSF EAGER grant focused on using artificial intelligence techniques to develop entertainment-education materials for social-engineering research.
My email is [my-first-name] @ purdue [dot] edu. [why is it like that?]
PhD, Natural Language Understanding, current
MS, Computer Information Technology, 2020
BS, Computer Information Technology, 2018
October 2021: Passed my prelim examination!
September 2021: Submitted a (tentative) thesis summary to the AAAI-2022 doctoral consortium!
July 2021: Presented my paper on whether language models learn typicality at CogSci 2021!
May 2021: Our NAFIPS 2021 paper received an honorable mention for the Best Student Paper award.