1. Problem Definition

Title: Designing Attention Mechanisms for Scalable Graph Transformers on Large Graphs

<aside> 📌

The aim of this dissertation is to design attention mechanisms for Graph Transformers that improve their computational efficiency and scalability on large graphs.

This work will address a key limitation of Graph Transformers: with dense self-attention, computational cost and memory usage grow quadratically with the number of nodes, which quickly becomes prohibitive as graph size increases. The goal is to make these models practical for real-world applications and to demonstrate the scalability and efficiency of the proposed techniques.

Specifically, the research will focus on enhancing attention mechanisms, such as sparse attention and localized attention, so that large-scale graphs can be handled without significant compromises in predictive performance (an illustrative sketch follows this callout).

</aside>
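To make the direction concrete, the sketch below illustrates one form of localized attention in which each node attends only to its immediate graph neighbours, so a layer costs O(|E|) rather than the O(|V|²) of dense self-attention. This is a minimal, hypothetical example (PyTorch assumed; the class name `LocalGraphAttention` and its interface are illustrative), not the mechanism this dissertation will ultimately propose.

```python
import torch
import torch.nn as nn


class LocalGraphAttention(nn.Module):
    """Single-head attention restricted to graph edges (localized attention).

    Each node attends only to its in-neighbours given by edge_index, so the
    cost per layer is O(|E|) instead of the O(|V|^2) of dense attention.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: [N, dim] node features; edge_index: [2, E] (source, target) pairs
        src, dst = edge_index
        q, k, v = self.q(x), self.k(x), self.v(x)

        # Unnormalised attention score for every edge (target attends to source)
        scores = (q[dst] * k[src]).sum(dim=-1) * self.scale        # [E]

        # Softmax over each target node's incoming edges
        # (global max-shift used for numerical stability in this sketch)
        weights = (scores - scores.max()).exp()                    # [E]
        denom = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, weights)
        alpha = weights / (denom[dst] + 1e-9)                      # [E]

        # Weighted aggregation of neighbour values into each target node
        out = torch.zeros_like(v).index_add_(0, dst, alpha.unsqueeze(-1) * v[src])
        return out


# Toy usage: 5 nodes with 16-dimensional features and 4 directed edges
x = torch.randn(5, 16)
edge_index = torch.tensor([[0, 1, 2, 3],
                           [1, 2, 3, 4]])
out = LocalGraphAttention(16)(x, edge_index)   # -> shape [5, 16]
```

Because the attention pattern follows the edge list, the memory and compute of this layer scale with the number of edges, which is the property the dissertation aims to exploit and evaluate on large graphs.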


Research Question: How can attention mechanisms in Graph Transformers be optimized to improve scalability and efficiency on large graphs while maintaining accuracy?


Key Areas of Focus:

- Sparse and localized attention mechanisms for Graph Transformers
- Reducing the computational cost and memory footprint of attention as graph size grows
- Scalability of training and inference to large-scale graphs
- Preserving model accuracy while improving efficiency
- Demonstrating the proposed techniques on practical, real-world graphs

2. Dissertation Progress Tracker

<aside> 🟣

Dec 2024 | Literature Review

</aside>

<aside> 🟡

Jan 2025 | Understanding Existing Approaches

</aside>

<aside> 🟠

Feb 2025 | Experimentation Setup & Baseline Model Implementation

</aside>

<aside> 🔵

Mar 2025 | Optimization-1 & Initial Testing

</aside>

<aside> 🔴

Apr 2025 | Optimization-2 & Scaling

</aside>

<aside> ⚫

May 2025 | Real-World Applications & Results Analysis

</aside>

<aside> 🟢

Jun 2025 | Final Report & Dissertation Writing

</aside>