An open API service providing repository metadata for many open source software ecosystems.

GitHub / fzhu0628 7 Repositories

PhD student @North Carolina State University | B.E. & M.E. from Fudan University | Working on Optimization, Federated Learning, and Theoretical (Multi-Agent) RL

fzhu0628/FedHSA---Tighter-Rates-for-Heterogeneous-Federated-Stochastic-Approximation-under-Markovian-Sampling

This is the second paper of my PhD. Working within the general stochastic approximation framework, it proposes a federated algorithm that finds the fixed point of an average of contractive operators.

Language: Python - Size: 25.4 KB - Last synced at: 23 days ago - Pushed at: 23 days ago - Stars: 0 - Forks: 0
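To make the objective concrete, here is a minimal sketch of federated stochastic approximation with scalar linear contractive operators. The operators, step size, round counts, and the function name `fed_sa` are all illustrative placeholders, not the paper's FedHSA algorithm:

```python
import numpy as np

# Hypothetical scalar linear operators T_i(x) = a_i * x + b_i (|a_i| < 1, so each
# is contractive). The federated goal is the fixed point of the *average*
# operator T(x) = mean_i T_i(x), i.e. x* with T(x*) = x*.

def fed_sa(a, b, rounds=200, local_steps=5, lr=0.1):
    """Sketch: each agent runs a few local SA steps toward its own operator,
    then the server averages the local iterates."""
    n = len(a)
    x_global = 0.0
    for _ in range(rounds):
        local_iterates = []
        for i in range(n):
            x = x_global
            for _ in range(local_steps):
                # SA update: move x toward T_i(x) = a[i] * x + b[i]
                x += lr * ((a[i] * x + b[i]) - x)
            local_iterates.append(x)
        x_global = float(np.mean(local_iterates))  # server averaging
    return x_global
```

For the average operator, the fixed point is `mean(b) / (1 - mean(a))`; with heterogeneous operators, multiple local steps introduce a small drift away from it, which is exactly the kind of effect a heterogeneity-aware analysis must control.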

fzhu0628/Fast-FedPG---Towards-Fast-Rates-for-Federated-and-Multi-Task-Reinforcement-Learning

This work is a conference paper published at IEEE CDC 2024. It seeks a single policy that maximizes the average of the long-term cumulative rewards across environments. The repository includes a brief introduction to our work, the poster for the AI Symposium at NCSU, and the slides for the CDC talk.

Size: 2.19 MB - Last synced at: 6 months ago - Pushed at: 6 months ago - Stars: 0 - Forks: 0
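The federated objective can be sketched in a few lines: each agent holds an environment with its own objective, and the server ascends the average gradient. The quadratic objectives and the name `fed_grad_ascent` are illustrative stand-ins, not the paper's policy-gradient method:

```python
import numpy as np

def fed_grad_ascent(grads, theta0, rounds=100, lr=0.1):
    """Sketch: maximize (1/N) * sum_i J_i(theta) by averaging the
    per-environment gradients each round (grads: list of gradient functions)."""
    theta = theta0
    for _ in range(rounds):
        g = np.mean([g_i(theta) for g_i in grads], axis=0)  # server averages
        theta = theta + lr * g  # gradient *ascent* on the average objective
    return theta
```

With toy objectives J_i(theta) = -(theta - mu_i)^2, the maximizer of the average objective is the mean of the mu_i, which the iteration recovers.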

fzhu0628/fzhu0628

Config files for my GitHub profile.

Size: 3.91 KB - Last synced at: 6 months ago - Pushed at: 6 months ago - Stars: 0 - Forks: 0

fzhu0628/DRAG-Divergence-Based-Adaptive-Aggregation-in-Federated-Learning-on-Non-IID-Data

In this work, we develop a novel algorithm, divergence-based adaptive aggregation (DRAG), to mitigate the client-drift effect in federated learning on non-IID data. Experiments also demonstrate DRAG's resilience against Byzantine attacks.

Size: 0 Bytes - Last synced at: 7 months ago - Pushed at: 7 months ago - Stars: 0 - Forks: 0
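A plausible sketch of divergence-based weighting, purely for illustration (the reference direction and weighting rule below are assumptions, not DRAG's actual update): client updates that diverge from a robust reference direction receive small aggregation weight, which is also why such schemes tolerate Byzantine clients.

```python
import numpy as np

def drag_style_aggregate(updates, ref=None):
    """Illustrative divergence-aware aggregation: weight each client update
    by its (clipped) cosine alignment with a reference direction, so
    strongly divergent updates are effectively ignored."""
    U = np.asarray(updates, dtype=float)
    if ref is None:
        ref = np.median(U, axis=0)  # coordinate-wise median as a robust reference
    sims = np.array([
        max(0.0, u @ ref / (np.linalg.norm(u) * np.linalg.norm(ref) + 1e-12))
        for u in U
    ])
    if sims.sum() == 0.0:
        return np.mean(U, axis=0)  # fall back to plain averaging
    w = sims / sims.sum()          # normalized alignment weights
    return w @ U                   # weighted aggregate
```

In a toy example, three honest updates near [1, 1] and one Byzantine update at [-100, -100] yield an aggregate close to [1, 1], since the attacker's negative alignment is clipped to zero weight.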

fzhu0628/STSyn---Speeding-up-local-SGD-with-straggler-tolerant-synchronization

This work is a journal paper published in the IEEE Transactions on Signal Processing in 2024, focusing on improving robustness to stragglers in distributed/federated learning with synchronous local SGD.

Size: 3.91 KB - Last synced at: 7 months ago - Pushed at: 7 months ago - Stars: 0 - Forks: 0
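A simplified timing-only simulation of straggler-tolerant synchronization (the speed model, parameters, and function name are illustrative assumptions, not the paper's STSyn protocol): the server waits only for the K fastest workers to finish their local updates, while slower workers still contribute whatever partial work they completed.

```python
import random

def straggler_tolerant_round(num_workers=5, target_steps=4, k=3, seed=0):
    """Simulate one round: workers run local steps at random speeds; the
    server synchronizes once the K fastest finish `target_steps` updates,
    and records how many steps every worker completed by that time."""
    rng = random.Random(seed)
    speeds = [rng.uniform(0.5, 2.0) for _ in range(num_workers)]  # steps per unit time
    finish_times = sorted(target_steps / s for s in speeds)
    t_sync = finish_times[k - 1]  # wait only for the k-th fastest worker
    # steps each worker completed by the synchronization time (capped)
    steps_done = [min(target_steps, int(s * t_sync + 1e-9)) for s in speeds]
    return t_sync, steps_done
```

The round time is set by the k-th fastest worker rather than the slowest, and stragglers' partial local steps are not wasted.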

fzhu0628/G-CADA---Adaptive-worker-grouping-for-communication-efficient-and-straggler-tolerant-distributed-SGD

This work was published at IEEE ISIT 2022. We propose a novel algorithm, G-CADA, that improves the time and communication efficiency of distributed learning systems through worker grouping and adaptive group selection.

Language: Python - Size: 11.7 KB - Last synced at: 7 months ago - Pushed at: 7 months ago - Stars: 0 - Forks: 0
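One way to picture adaptive selection with cached gradients, as a rough sketch only (the threshold rule and function name are assumptions, not G-CADA's actual criterion): a group uploads a fresh gradient only when it differs enough from the server's cached copy, saving communication when little has changed.

```python
import numpy as np

def adaptive_group_round(group_grads, cached, threshold=0.1):
    """Illustrative adaptive selection: each group communicates only if its
    fresh gradient moved more than `threshold` from the server's cache;
    the server aggregates fresh uploads together with stale caches."""
    agg = np.zeros_like(cached[0])
    selected = []
    for g, fresh in enumerate(group_grads):
        if np.linalg.norm(fresh - cached[g]) > threshold:
            cached[g] = fresh      # group uploads; server refreshes its cache
            selected.append(g)
        agg += cached[g]           # aggregate fresh and cached gradients alike
    return agg / len(group_grads), selected
```

Groups whose gradients are nearly unchanged stay silent, so per-round communication scales with how much the model is actually moving.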

fzhu0628/AgeSel---Communication-efficient-local-SGD-with-age-based-worker-selection

This work is a journal paper published in The Journal of Supercomputing in 2023, focusing on improving the communication efficiency of a distributed learning system via age-based worker selection.

Language: Python - Size: 1.95 KB - Last synced at: 7 months ago - Pushed at: 7 months ago - Stars: 0 - Forks: 0
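The core bookkeeping behind age-based selection can be sketched in a few lines (the exact selection rule here is an assumption for illustration, not AgeSel's): each worker's "age" counts rounds since it last communicated, and the server prioritizes the stalest workers so every worker's data keeps contributing.

```python
def select_by_age(ages, m):
    """Illustrative age-based selection: choose the m workers with the
    largest age, reset their ages to zero, and increment all others."""
    chosen = sorted(range(len(ages)), key=lambda i: ages[i], reverse=True)[:m]
    for i in range(len(ages)):
        ages[i] = 0 if i in chosen else ages[i] + 1
    return chosen
```

Because a skipped worker's age grows every round, no worker can be starved indefinitely: its age eventually dominates and it gets selected.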