OpenFL-XAI


OpenFL-XAI is an extension of the open-source Intel® OpenFL framework that provides user-friendly support for Federated Learning (FL) of Fuzzy Rule-Based Systems (FRBSs) as explainable-by-design models. As an extension of OpenFL, OpenFL-XAI enables several data owners, possibly located at different sites, to collaboratively train an eXplainable Artificial Intelligence (XAI) model while preserving the privacy of their raw data. An overview of the extensions to OpenFL that characterize OpenFL-XAI is reported in OpenFL_changelog.txt.
By supporting FL of highly interpretable FRBSs, OpenFL-XAI addresses two key requirements for trustworthy AI systems, namely privacy preservation and transparency.

The current version of the framework implements FL of Takagi-Sugeno-Kang (TSK) Fuzzy Rule-Based Systems for solving regression problems (please refer to Corcuera Bárcena et al., 2022 for more details). However, the framework is composed of a set of general classes that allow developers to design additional rule-based systems and new aggregation schemes.
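To give a flavour of why TSK-FRBSs are considered explainable by design, a rule of a first-order TSK-FRBS for regression takes a human-readable IF-THEN form such as the following (the linguistic terms and the coefficients are purely illustrative):

IF X1 IS Low AND X2 IS High THEN Y = 0.3 + 1.2*X1 - 0.5*X2

The antecedent is expressed through linguistic terms defined as fuzzy sets over the input variables, while the consequent is a linear model of the inputs: each prediction can thus be traced back to a small set of interpretable rules.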

This work has been developed by the Artificial Intelligence R&D Group at the Department of Information Engineering, University of Pisa, as part of the activities carried out within the Hexa-X Project, the European Union's flagship 6G project. OpenFL-XAI has supported research, development, and demonstration activities concerning the FL of XAI models, which was recently recognized as a key innovation by the EU Innovation Radar. This work has also been partially funded by the PNRR - M4C2 - Investimento 1.3, Partenariato Esteso PE00000013 - "FAIR - Future Artificial Intelligence Research" - Spoke 1 "Human-centered AI".


Table of Contents

  • Repository Structure
  • Prerequisites
  • Illustrative Example
  • Set up and run a new Federated Learning process
  • License
  • Contributors
  • Citations
  • Acknowledgments

Repository Structure

├── openfl-xai_workspaces                   # folder containing two OpenFL workspaces, namely:
│   ├── xai_frbs_generic                    # OpenFL-XAI workspace template containing all the customized classes that enable FL of XAI models.
│   ├── xai_tsk_frbs                        # workspace of a first-order TSK-FRBS, based on the OpenFL-XAI workspace template. This workspace is used in the Illustrative Example.
├── certificates                            # folder containing the certificates used by the Aggregator and the Collaborators to prove their identity.
├── data                                    # folder containing the private data of the Collaborators, used for local model training.
├── configuration.json                      # JSON file specifying the name of the XAI model to be used (see the sketch below).
├── docker-compose.yml                      # Docker Compose file used to build the images and run the OpenFL-XAI components as Docker containers.
├── Dockerfile.openfl_xai                   # Dockerfile used to build the base OpenFL-XAI image.
├── Dockerfile.xai_aggregator               # Dockerfile used to build the Aggregator image.
├── Dockerfile.xai_collaborator             # Dockerfile used to build the Collaborator image.
├── global_models                           # folder for storing the aggregated model.
├── logs                                    # folder for storing container logs.
├── terminal_interface.py                   # Python module implementing a command-line interface for executing the Illustrative Example.
├── OpenFL_changelog.txt                    # changes made in OpenFL-XAI as an extension of the OpenFL base components.
├── use_case_requirements.txt               # dependencies needed to execute the Illustrative Example. These requirements are installed in the Docker images.
├── Illustrative_Example.md                 # documentation: guide to execute an illustrative example of FL of a first-order TSK-FRBS.
├── Customization_Guide.md                  # documentation: guide to customize the FL process with your own models and settings.
└── images                                  # utility folder for documentation images.
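The snippet below sketches how the model name declared in configuration.json might be read. It is only an illustration: the key name ("model") is an assumption made for this example, and the actual schema is defined by the configuration.json file shipped with the repository.

import json

# Illustrative sketch: read the name of the XAI model declared in configuration.json.
# The key name "model" is assumed for this example; check the configuration.json in
# the repository for the actual schema.
with open("configuration.json") as config_file:
    config = json.load(config_file)

model_name = config["model"]  # e.g. "xai_tsk_frbs"
print("Selected XAI model:", model_name)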

Prerequisites

OpenFL-XAI requires:

  • Docker and Docker Compose, used to deploy the Aggregator and the Collaborators as containers.

In addition, the following Python packages are required to use the generated models and perform inference outside the Docker environment (see the sketch below):

  • NumPy >= 1.24.3
  • Simpful >= 2.11.0
  • scikit-learn >= 1.2.2
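As a sketch of such offline inference, the snippet below loads an aggregated model from global_models and queries it on a toy input. The file name, the pickle serialization, and the scikit-learn-style predict() interface are assumptions made for illustration; refer to Illustrative_Example.md for the actual inference workflow.

import pickle

import numpy as np

# Illustrative sketch: load an aggregated model produced by the federation.
# The file name and the pickle format are assumptions; adapt them to the
# artifacts actually stored in global_models/.
with open("global_models/aggregated_model.pkl", "rb") as model_file:
    model = pickle.load(model_file)

# Query the model on a toy input sample (assumes a scikit-learn-style predict()).
X_new = np.array([[0.2, 0.7, 0.1]])
y_pred = model.predict(X_new)
print("Predicted output:", y_pred)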

Illustrative Example

Check the Illustrative Example (Illustrative_Example.md) for a step-by-step guide on using OpenFL-XAI to learn a first-order TSK-FRBS in a federated fashion.

Set up and run a new Federated Learning process

Check the Customization Guide (Customization_Guide.md) for a step-by-step guide on customizing the FL process with your own models and settings.

License

This project is licensed under Apache License Version 2.0. By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

Contributors

Citations

  1. M. Daole, A. Schiavo, P. Ducange, A. Renda, J. L. Corcuera Bárcena, F. Marcelloni, "OpenFL-XAI: Federated Learning of Explainable Artificial Intelligence Models in Python", SoftwareX, Volume 23, 2023, 101505. DOI: 10.1016/j.softx.2023.101505
@article{openfl-xai_citation,
	author={Daole, Mattia and Schiavo, Alessio and Corcuera B{\'a}rcena, Jos{\'e} Luis and Ducange, Pietro and Marcelloni, Francesco and Renda, Alessandro},
	title={OpenFL-XAI: Federated Learning of Explainable Artificial Intelligence Models in Python},
	journal = {SoftwareX},
	volume = {23},
	pages = {101505},
	year = {2023},
	issn = {2352-7110},
	doi = {https://doi.org/10.1016/j.softx.2023.101505},
}
  2. J. L. Corcuera Bárcena, P. Ducange, A. Ercolani, F. Marcelloni, A. Renda, "An Approach to Federated Learning of Explainable Fuzzy Regression Models", in: 2022 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), IEEE, 2022, pp. 1-8. DOI: 10.1109/FUZZ-IEEE55066.2022.9882881
@INPROCEEDINGS{CorcueraBarcena-FedTSK,   
  author={Corcuera B{\'a}rcena, Jos{\'e} Luis and Ducange, Pietro and Ercolani, Alessio and Marcelloni, Francesco and Renda, Alessandro},   
  booktitle={2022 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE)},   
  title={An Approach to Federated Learning of Explainable Fuzzy Regression Models},   
  year={2022},   
  volume={},
  number={},
  pages={1-8},
  doi={10.1109/FUZZ-IEEE55066.2022.9882881}}
  3. P. Foley et al., "OpenFL: the open federated learning library", Physics in Medicine & Biology, 2022. DOI: 10.1088/1361-6560/ac97d9
@article{openfl_citation,
	author={Foley, Patrick and Sheller, Micah J and Edwards, Brandon and Pati, Sarthak and Riviera, Walter and Sharma, Mansi and Moorthy, Prakash Narayana and Wang, Shi-han and Martin, Jason and Mirhaji, Parsa and Shah, Prashant and Bakas, Spyridon},
	title={OpenFL: the open federated learning library},
	journal={Physics in Medicine \& Biology},
	url={http://iopscience.iop.org/article/10.1088/1361-6560/ac97d9},
	year={2022},
	doi={10.1088/1361-6560/ac97d9},
	publisher={IOP Publishing}
}

The authors would like to thank Intel® and, in particular, Ing. Dario Sabella for providing hardware support and for the fruitful discussions.

Acknowledgments

This work has been partly funded by the PON 2014-2021 "Research and Innovation", DM MUR 1062/2021, Project title: "Progettazione e sperimentazione di algoritmi di federated learning per data stream mining", PNRR - M4C2 - Investimento 1.3, Partenariato Esteso PE00000013 - "FAIR - Future Artificial Intelligence Research" - Spoke 1 "Human-centered AI" and the PNRR "Tuscany Health Ecosystem" (THE) (Ecosistemi dell’Innovazione) - Spoke 6 - Precision Medicine & Personalized Healthcare (CUP I53C22000780001) under the NextGeneration EU programme, and by the Italian Ministry of University and Research (MUR) in the framework of the FoReLab and CrossLab projects (Departments of Excellence).
