Please use this identifier to cite or link to this item:
https://doi.org/10.1016/j.eswa.2023.122861
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Martínez Beltrán, Enrique Tomás | - |
dc.contributor.author | Perales Gómez, Ángel Luis | - |
dc.contributor.author | Feng, Chao | - |
dc.contributor.author | Sánchez Sánchez, Pedro Miguel | - |
dc.contributor.author | López Bernal, Sergio | - |
dc.contributor.author | Bovet, Gérôme | - |
dc.contributor.author | Gil Pérez, Manuel | - |
dc.contributor.author | Martínez Pérez, Gregorio | - |
dc.contributor.author | Huertas Celdrán, Alberto | - |
dc.contributor.other | Faculties, Departments, Services and Schools::UMU Departments::Ingeniería y Tecnología de Computadores | es |
dc.date.accessioned | 2024-06-28T08:09:12Z | - |
dc.date.available | 2024-06-28T08:09:12Z | - |
dc.date.issued | 2024-05-14 | - |
dc.identifier.citation | Expert Systems with Applications, 2024, Vol. 242: 122861 | es |
dc.identifier.issn | Print: 0957-4174 | - |
dc.identifier.issn | Electronic: 1873-6793 | - |
dc.identifier.uri | http://hdl.handle.net/10201/142732 | - |
dc.description | © 2023 The Author(s). This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/ This document is the Published version of a Published Work that appeared in final form in Expert Systems with Applications. To access the final edited and published work see https://doi.org/10.1016/j.eswa.2023.122861 | - |
dc.description.abstract | In 2016, Google proposed Federated Learning (FL) as a novel paradigm to train Machine Learning (ML) models across the participants of a federation while preserving data privacy. Since its birth, Centralized FL (CFL) has been the most used approach, where a central entity aggregates participants’ models to create a global one. However, CFL presents limitations such as communication bottlenecks, single point of failure, and reliance on a central server. Decentralized Federated Learning (DFL) addresses these issues by enabling decentralized model aggregation and minimizing dependency on a central entity. Despite these advances, current platforms training DFL models struggle with key issues such as managing heterogeneous federation network topologies, adapting the FL process to virtualized or physical deployments, and using a limited number of metrics to evaluate different federation scenarios for efficient implementation. To overcome these challenges, this paper presents Fedstellar, a novel platform designed to train FL models in a decentralized, semi-decentralized, and centralized fashion across diverse federations of physical or virtualized devices. Fedstellar allows users to create federations by customizing parameters like the number and type of devices training FL models, the network topology connecting them, the machine and deep learning algorithms, or the datasets of each participant, among others. Additionally, it offers real-time monitoring of model and network performance. The Fedstellar implementation encompasses a web application with an interactive graphical interface, a controller for deploying federations of nodes using physical or virtual devices, and a core deployed on each device, which provides the logic needed to train, aggregate, and communicate in the network. 
The effectiveness of the platform has been demonstrated in two scenarios: a physical deployment involving single-board devices such as Raspberry Pis for detecting cyberattacks and a virtualized deployment comparing various FL approaches in a controlled environment using MNIST and CIFAR-10 datasets. In both scenarios, Fedstellar demonstrated consistent performance and adaptability, achieving 91%, 98%, and 91.2% using DFL for detecting cyberattacks and classifying MNIST and CIFAR-10, respectively, while reducing training time by 32% compared to centralized approaches. | es |
dc.format | application/pdf | es |
dc.format.extent | 15 | - |
dc.language | eng | es |
dc.publisher | Elsevier | - |
dc.relation | This work has been partially supported by (a) 21629/FPI/21, Fundación Séneca, Región de Murcia (Spain), (b) the strategic project DEFENDER from the Spanish National Institute of Cybersecurity (INCIBE) and by the Recovery, Transformation and Resilience Plan, Next Generation EU, (c) the Swiss Federal Office for Defense Procurement (armasuisse) with the DEFENDIS and CyberForce projects (CYD-C-2020003), (d) the University of Zürich UZH, (e) MCIN/AEI/10.13039/501100011033, NextGenerationEU/PRTR, UE, under grant TED2021-129300B-I00, and (f) MCIN/AEI/10.13039/501100011033/FEDER, UE, under grant PID2021-122466OB-I00. | es |
dc.rights | info:eu-repo/semantics/openAccess | es |
dc.rights | Attribution-NonCommercial-NoDerivatives 4.0 International | * |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ | * |
dc.subject | Decentralized federated learning | - |
dc.subject | Deep learning | - |
dc.subject | Collaborative training | - |
dc.subject | Communication mechanisms | - |
dc.title | Fedstellar: a platform for decentralized federated learning | es |
dc.type | info:eu-repo/semantics/article | es |
dc.relation.publisherversion | https://www.sciencedirect.com/science/article/pii/S0957417423033638?via%3Dihub#d1e2723 | - |
dc.identifier.doi | https://doi.org/10.1016/j.eswa.2023.122861 | - |
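The abstract describes decentralized aggregation: each node merges its model with its neighbors' models instead of sending them to a central server. As an illustrative sketch only (not Fedstellar's actual API — the function `aggregate_neighbors` and its parameters are assumptions), a neighbor-weighted FedAvg step could look like this:

```python
import numpy as np

def aggregate_neighbors(own_params, own_size, neighbor_params, neighbor_sizes):
    """FedAvg-style aggregation at a single node of a DFL topology.

    Each parameter vector is weighted by its owner's dataset size, so
    participants with more data contribute more to the merged model.
    """
    params = [own_params] + list(neighbor_params)
    sizes = np.array([own_size] + list(neighbor_sizes), dtype=float)
    coeffs = sizes / sizes.sum()  # normalized aggregation weights
    return sum(c * p for c, p in zip(coeffs, params))

# Example: a node averages with one neighbor holding the same amount of data
local = np.array([1.0, 1.0])
neighbor = np.array([3.0, 3.0])
merged = aggregate_neighbors(local, 100, [neighbor], [100])
```

With equal dataset sizes the coefficients are both 0.5, so `merged` is the plain average of the two parameter vectors; no central entity is involved, which is the property DFL exploits to avoid a single point of failure.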
Appears in collections: | Artículos: Ingeniería y Tecnología de Computadores |
Files in this item:
File | Description | Size | Format | |
---|---|---|---|---|
1-s2.0-S0957417423033638-main.pdf | | 3.04 MB | Adobe PDF | View/Open |
This item is licensed under a Creative Commons License