Differentiable logics for neural network training and verification
Ślusarz, Natalia, Komendantskaya, Ekaterina, Daggitt, Matthew L. and Stewart, Robert (2022) Differentiable logics for neural network training and verification. In Isac, Omri, Katz, Guy, Ivanov, Radoslav, Narodytska, Nina and Nenzi, Laura (eds.) Software Verification and Formal Methods for ML-Enabled Autonomous Systems - 5th International Workshop, FoMLAS 2022, and 15th International Workshop, NSV 2022, Proceedings. vol. 13466 LNCS, Springer, Cham, pp. 67-77. (doi:10.1007/978-3-031-21222-2_5)
Record type: Conference or Workshop Item (Paper)
Abstract
Neural network (NN) verification is a problem that has drawn the attention of many researchers. The specific nature of neural networks does away with the conventional assumption that a static program is given for verification: in the case of NNs, multiple models can be used, and if one fails, a new one can be trained. This leads to an approach called continuous verification, which refers to the loop between training and verification. One tactic for improving a network's performance is through "constraint-based loss functions": a method of using a differentiable logic (DL) to translate logical constraints into loss functions, which can then be used to train the network specifically to satisfy those constraints. In this paper we present a uniform way of defining a translation from logic syntax to a differentiable loss function, and then examine and compare the existing DLs. We explore the mathematical properties desired in such translations and discuss the design space, identifying possible directions for future work.
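To make the idea of constraint-based loss functions concrete, here is a minimal sketch of one such translation, assuming PyTorch and a product-style DL. The operator choices (a sigmoid comparison, the product t-norm, Reichenbach implication), the helper names (leq, conj, implies, loss_from_truth), the scale parameter, and the toy network are illustrative assumptions, not the exact translations defined in the paper.

import torch

# Sketch of a product-style differentiable logic: formulas are mapped
# to truth values in [0, 1], where 1 means "fully satisfied", and the
# loss is 1 minus the truth value. Operator choices are illustrative.

def leq(a: torch.Tensor, b: torch.Tensor, scale: float = 10.0) -> torch.Tensor:
    # Hypothetical soft truth value in (0, 1) for the atom a <= b.
    return torch.sigmoid(scale * (b - a))

def conj(p: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
    # Conjunction via the product t-norm.
    return p * q

def implies(p: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
    # Reichenbach implication: 1 - p + p*q.
    return 1.0 - p + p * q

def loss_from_truth(t: torch.Tensor) -> torch.Tensor:
    # Zero exactly when the translated constraint is fully satisfied.
    return 1.0 - t

# Example: a local-robustness-style constraint
#   ||x - x0|| <= eps  ==>  ||f(x) - f(x0)|| <= delta
# translated into a loss that can be added to the usual training
# objective (e.g. cross-entropy) during training.
f = torch.nn.Linear(4, 2)          # stand-in for the network under training
x0 = torch.randn(4)
x = x0 + 0.05 * torch.randn(4)     # perturbed input
eps, delta = 0.1, 0.5

premise = leq((x - x0).norm(), torch.tensor(eps))
conclusion = leq((f(x) - f(x0)).norm(), torch.tensor(delta))
loss = loss_from_truth(implies(premise, conclusion))
loss.backward()                    # gradients flow back to the weights

In a DL of this kind the loss vanishes exactly when the translated formula evaluates to fully true, so adding it to the standard training objective pushes the network towards satisfying the constraint; the paper compares how different DLs make these operator choices and what mathematical properties the resulting translations have.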
This record has no associated files available for download.
More information
Published date: 2022
Additional Information:
Funding Information:
Acknowledgement: The authors acknowledge the support of EPSRC grant AISEC EP/T026952/1 and NCSC grant "Neural Network Verification: in search of the missing spec".
Publisher Copyright:
© 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
Venue - Dates:
5th International Workshop on Software Verification and Formal Methods for ML-Enabled Autonomous Systems, FoMLAS 2022, and 15th International Workshop on Numerical Software Verification, NSV 2022, Haifa, Israel, 2022-08-11 - 2022-08-11
Identifiers
Local EPrints ID: 482741
URI: http://eprints.soton.ac.uk/id/eprint/482741
ISSN: 0302-9743
PURE UUID: 1f011908-ebc7-4692-980d-9ecc2833ca38
Catalogue record
Date deposited: 12 Oct 2023 16:38
Last modified: 05 Jun 2024 18:16
Contributors
Author: Natalia Ślusarz
Author: Ekaterina Komendantskaya
Author: Matthew L. Daggitt
Author: Robert Stewart
Editor: Omri Isac
Editor: Guy Katz
Editor: Radoslav Ivanov
Editor: Nina Narodytska
Editor: Laura Nenzi