Welcome! I am a researcher and quantitative engineer who enjoys creating novel solutions to big-data and machine-learning problems, exploring diverse research topics, transferring ideas across domains, and ensuring the results are practically effective.
I am currently on the 2025 job market. Please contact me if you would like to discuss potential openings or collaborations!
Professional Experience
- Graduate Researcher at UofT: I built personalized and group-wise differentially private federated learning algorithms to meet heterogeneous privacy needs, and engineered fault-tolerant distributed systems that mitigate stragglers.
- Privacy Research Scientist Intern at CISPA: I designed time-adaptive, personalized privacy budget mechanisms to enhance learning performance under anonymization constraints.
- Machine Learning Research Associate Intern at Huawei: I developed quantization error bounds to optimize hardware-aware training of deep neural networks.
- Information Theory Research Intern at CUHK: I explored fundamental problems in network information theory to model reliability trade-offs in communication channels.
Academic Background
I earned my Ph.D. and MASc. in Electrical and Computer Engineering at the University of Toronto, advised by Prof. Stark C. Draper, working at the intersection of distributed systems, machine learning, and data privacy. Before that, I received my B.Sc. in Electrical Engineering (digital systems) with a minor in Economics from Sharif University of Technology.
Areas of Expertise
My work spans:
- Privacy in Machine Learning: I developed differentially private training methods and implemented them in Python using Opacus.
- Optimization: I tuned algorithm parameters and analyzed convergence of training algorithms under privacy and utility constraints.
- Federated Machine Learning: My research pioneered the design of private grouped federated learning under personalization constraints.
- Algorithm Design: I designed randomized, numerically stable, and approximate matrix multiplication algorithms for straggler-resilient distributed systems.
- Error-Correction Codes: I applied polynomial coding and interpolation to introduce efficient redundancy and address stragglers in distributed computing.
- Quantization in Machine Learning: I analyzed the error introduced by reduced-precision weights and activations to trade off accuracy for speed and memory efficiency.
- Implementation: I built differentially private ML pipelines in PyTorch with Opacus, and tested distributed computing on Amazon EC2 and the SciNet HPC cluster using Python and mpi4py.
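To give a flavor of the differentially private training mentioned above, here is a minimal pure-Python sketch of the clip-and-noise step at the heart of DP-SGD: clip each per-example gradient to a fixed L2 norm, sum, add Gaussian noise scaled to that norm, and average. This is a toy illustration only; it does not use Opacus, and the function name `dp_average_gradients` is mine, not an API from my work or any library.

```python
import math
import random

def dp_average_gradients(grads, clip_norm, noise_mult, rng=None):
    """Clip-and-noise aggregation of per-example gradients (DP-SGD style).

    Each gradient is clipped to L2 norm <= clip_norm, the clipped gradients
    are summed, Gaussian noise with std = noise_mult * clip_norm is added,
    and the noisy sum is averaged over the batch.
    """
    rng = rng or random.Random(0)
    total = [0.0] * len(grads[0])
    for g in grads:
        norm = math.sqrt(sum(x * x for x in g))
        # Scale down any gradient whose norm exceeds the clipping threshold.
        scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
        total = [t + x * scale for t, x in zip(total, g)]
    # Add noise to the sum, then average over the batch.
    return [(t + rng.gauss(0.0, noise_mult * clip_norm)) / len(grads)
            for t in total]

# Two per-example gradients; the first gets clipped from norm 5 to norm 1.
grads = [[3.0, 4.0], [0.0, 0.0]]
avg = dp_average_gradients(grads, clip_norm=1.0, noise_mult=1.1)
```

With `noise_mult=0` this reduces to a plain clipped average, which is a handy sanity check when debugging the clipping logic.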
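The straggler-mitigation bullets above can likewise be illustrated with a toy polynomial-coding example: split a matrix into k row blocks, encode them into n > k coded tasks by evaluating a matrix polynomial at distinct points, and recover the full product A @ b by interpolation from any k worker results, so up to n - k stragglers can be ignored. This is a minimal sketch under floating-point arithmetic; the names `encode`, `decode`, and `solve` are illustrative, not taken from any specific paper.

```python
def encode(blocks, x):
    # Coded block at evaluation point x: sum_i blocks[i] * x**i, entrywise.
    k, rows, cols = len(blocks), len(blocks[0]), len(blocks[0][0])
    return [[sum(blocks[i][r][c] * x ** i for i in range(k))
             for c in range(cols)] for r in range(rows)]

def mat_vec(M, v):
    # Plain matrix-vector product (what each worker computes locally).
    return [sum(a * b for a, b in zip(row, v)) for row in M]

def solve(V, y):
    # Gauss-Jordan elimination for the small k x k linear system V c = y.
    k = len(V)
    M = [row[:] + [yi] for row, yi in zip(V, y)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(k):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * p for a, p in zip(M[r], M[col])]
    return [M[i][k] / M[i][i] for i in range(k)]

def decode(points, results, k):
    # Interpolate p(x) = sum_i (A_i @ b) x**i from k evaluations, then
    # concatenate the recovered block products A_0 b, ..., A_{k-1} b.
    V = [[x ** i for i in range(k)] for x in points]
    per_coord = [solve(V, [res[t] for res in results])
                 for t in range(len(results[0]))]
    return [per_coord[t][i] for i in range(k) for t in range(len(per_coord))]

A = [[1, 2], [3, 4], [5, 6], [7, 8]]
b = [1, 1]
blocks = [A[:2], A[2:]]                 # k = 2 row blocks
points = [1, 2, 3, 4]                   # n = 4 coded tasks
results = [mat_vec(encode(blocks, x), b) for x in points]
fast = [0, 3]                           # pretend workers 1 and 2 straggle
prod = decode([points[j] for j in fast],
              [results[j] for j in fast], k=2)   # recovers A @ b
```

The design choice this illustrates: because the coded results are evaluations of a degree-(k-1) polynomial, any k of the n workers determine it uniquely, so the decoder never waits for the slowest machines.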
Selected Publications
News
- Mar 2025: I successfully passed my Ph.D. Final Oral Exam and submitted my Ph.D. thesis, which was accepted *as is* (with no required revisions)! Huge thanks to my committee members Prof. Stark Draper, Prof. Mohammad Ali Maddah-Ali, Prof. Azarakhsh Malekian, Prof. Dimitrios Hatzinakos, and Prof. Nicolas Papernot!
- Feb 2025: I was honored to receive the Farid and Diana Najm Graduate Fellowship and the Shahid U. H. Qureshi Memorial Scholarship (each awarded to one ECE student per year), as well as the PhD Graduate Award for the second time! Many thanks to the donors for their support and trust in my work!
- Jan 2025: Our paper, "Differentially Private Federated Learning with Time-Adaptive Privacy Spending", has been accepted to ICLR 2025!
- Dec 2024: I successfully passed my Ph.D. Departmental Oral Exam! Huge thanks to my committee members Prof. Stark Draper, Prof. Nicolas Papernot, and Prof. Dimitrios Hatzinakos.
Personal
Outside of research, I enjoy practicing creativity through cooking and experimenting with new dishes, hiking new trails, and trying to win at backgammon and other strategy games. When I'm not feeling lazy, I spend time gardening, and whenever possible I travel to explore new cultures.