
Sina Sheikholeslami


Doctoral student

Details

Address
KISTAGÅNGEN 16

About me

I am currently on the academic and industrial job market. In academia, I am looking for a researcher or postdoc position in fields such as distributed deep learning, data-intensive computing, and scalable systems and frameworks for reproducible scientific research. On the industry side, I am open to roles such as Research Engineer, AI/ML Engineer, Data Engineer, and R&D Engineer. You can find my latest CV here.

I am a PhD student in the Distributed Computing group at the Division of Software and Computer Systems (SCS), School of EECS, KTH, where I am advised by Vladimir Vlassov, Amir Payberah, and Jim Dowling. My main interest lies at the intersection of distributed systems, machine learning/deep learning, and data-intensive computing. Inspired by "I choose a lazy person to do a hard job, because a lazy person will find an easy way to do it", I have a particular interest in finding (relatively) simple and practical solutions to challenging questions in machine learning, deep learning, and scalable computing. I also strongly believe in open science and reproducible research, and to that end all of my work is publicly available with ready-to-execute scripts for reproducing the results.

During my PhD studies, I have worked on systems for machine learning and deep learning. In particular, we developed and released the first framework for automated, parallel ablation studies for deep learning, called AutoAblation, as part of Maggy; and we introduced a novel approach to dataset partitioning for data-parallel training of deep neural networks that takes the importance of individual dataset examples into account, named Importance-Aware DPT, which won the Best Artefact Award at DAIS 2023.
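To give a rough idea of importance-aware partitioning, here is a minimal, self-contained Python sketch. It is my own simplification, not the DAIS 2023 artifact: it assumes each example already has an importance score (for instance, a recent training loss) and greedily assigns examples to workers so that every partition gets a comparable share of the total importance. The function name importance_aware_partition and the greedy heuristic are illustrative assumptions.

import heapq
import random

def importance_aware_partition(importances, num_workers):
    # Greedy balancing: assign examples (largest importance first) to the
    # worker whose partition currently has the smallest total importance.
    partitions = [[] for _ in range(num_workers)]
    heap = [(0.0, w) for w in range(num_workers)]  # (accumulated importance, worker id)
    heapq.heapify(heap)
    for idx in sorted(range(len(importances)), key=lambda i: -importances[i]):
        total, w = heapq.heappop(heap)
        partitions[w].append(idx)
        heapq.heappush(heap, (total + importances[idx], w))
    return partitions

# Example: per-example importance scores, e.g. derived from recent losses.
scores = [random.random() for _ in range(10)]
print(importance_aware_partition(scores, num_workers=3))

The largest-first greedy assignment is just one possible balancing heuristic; the published method may weigh or group examples differently.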

Recently, I have become interested in the idea of "reusing" the computation results of one ML/DL pipeline stage in another stage. To that end, we showed that it makes sense to reuse the model weights of the winning hyperparameter tuning trial to initialize the model at the training stage. This work resulted in our most recent paper, "Deep Neural Network Weight Initialization from Hyperparameter Tuning Trials", which is to appear at ICONIP 2024, but you can already find our authors' version.
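As a rough illustration of the weight-reuse idea, here is a minimal sketch assuming PyTorch; build_model and run_trial are hypothetical helpers, not the paper's or Maggy's API. Each tuning trial trains briefly and returns its score, its configuration, and its weights; the training stage then rebuilds the winning configuration and loads the winning weights instead of starting from a fresh random initialization.

import copy
import torch
import torch.nn as nn

def build_model(hidden: int) -> nn.Module:
    # Toy regression network; the architecture is arbitrary for this sketch.
    return nn.Sequential(nn.Linear(10, hidden), nn.ReLU(), nn.Linear(hidden, 1))

def run_trial(hidden: int, lr: float):
    # One tuning trial: train briefly on synthetic data and return
    # (final loss, trial configuration, trained weights).
    model = build_model(hidden)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    x, y = torch.randn(256, 10), torch.randn(256, 1)
    for _ in range(20):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    return loss.item(), {"hidden": hidden, "lr": lr}, copy.deepcopy(model.state_dict())

# Tuning stage: evaluate a small grid and keep the best-scoring trial.
trials = [run_trial(h, lr) for h in (16, 32, 64) for lr in (0.1, 0.01)]
best_loss, best_config, best_weights = min(trials, key=lambda t: t[0])

# Training stage: rebuild the winning configuration and initialize it from
# the winning trial's weights rather than a fresh random initialization.
final_model = build_model(best_config["hidden"])
final_model.load_state_dict(best_weights)
# ... continue full training of final_model on the complete dataset ...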

Before starting my Ph.D. studies, I was an EIT Digital Master School scholar (2017-2019) and completed my M.Sc. studies at Eindhoven University of Technology (first year) and KTH (second year). My internship at Logical Clocks AB led to my Master's thesis, "Ablation Programming for Machine Learning".

Prior to that, I was a Big Data R&D Engineer at Digikala.com. I did my Bachelor of Science in Computer Software Engineering at Amirkabir University of Technology (Tehran Polytechnic), where I was the President of CEIT's Students' Scientific Chapter from January 2014 to March 2015.

In addition to my studies, I am currently a Steward of the PhD Chapter's Masters of Ceremonies Group. Before that, I was a Board Member and the Council Coordinator of KTH's PhD Chapter from January to July 2024. Prior to that, I was a member of the EECS PhD Student Council (January 2020 - December 2023), where I served as Vice-chair (January 2021 - December 2022) and as a representative in the New Faculty Appointment Committee, the Faculty Promotion Committee, the School Assembly of EECS, and the Council for Third-Cycle Education of EECS. I was also a member of the Nominating Committee of the PhD Chapter (May 2020 - December 2022).

Outside KTH, I have been Sweden's Local Representative for the EIT Digital Alumni Foundation since January 2020.


Courses

Data Mining (ID2222), assistant | Course web

Data Mining (FID3016), assistant | Course web

Data-Intensive Computing (ID2221), assistant, teacher | Course web

Operating Systems (ID1200), teacher | Course web

Operating Systems (ID1206), teacher | Course web