Logic of differentiable logics: Towards a uniform semantics of DL

N Ślusarz, E Komendantskaya, ML Daggitt… - arXiv preprint arXiv:2303.10650, 2023 - arxiv.org
Differentiable logics (DL) have recently been proposed as a method of training neural networks to satisfy logical specifications. A DL consists of a syntax in which specifications are stated and an interpretation function that translates expressions in the syntax into loss functions. These loss functions can then be used during training with standard gradient descent algorithms. The variety of existing DLs and the differing levels of formality with which they are treated make a systematic comparative study of their properties and implementations difficult. This paper remedies this problem by suggesting a meta-language for defining DLs that we call the Logic of Differentiable Logics, or LDL. Syntactically, it generalises the syntax of existing DLs to FOL, and for the first time introduces a formalism for reasoning about vectors and learners. Semantically, it introduces a general interpretation function that can be instantiated to define the loss functions arising from different existing DLs. We use LDL to establish several theoretical properties of existing DLs, and to conduct an empirical study of them in neural network verification.
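To illustrate the core idea the abstract describes — an interpretation function mapping logical expressions to loss functions — the following is a minimal sketch, not the paper's actual LDL semantics. It uses one hypothetical DL2-style translation (a loss of 0 means the specification is satisfied); the function names `leq`, `conj`, and `disj` are illustrative, not from the paper.

```python
# Hypothetical sketch of a DL interpretation function: each logical
# construct is mapped to a non-negative, (sub)differentiable penalty.
# Convention assumed here: loss == 0.0 iff the formula is satisfied.

def leq(a: float, b: float) -> float:
    # Atomic constraint "a <= b": penalty is the amount of violation.
    return max(a - b, 0.0)

def conj(l1: float, l2: float) -> float:
    # Conjunction: both sub-losses must vanish; summing keeps
    # gradient signal from both conjuncts.
    return l1 + l2

def disj(l1: float, l2: float) -> float:
    # Disjunction: the product vanishes when either disjunct is satisfied.
    return l1 * l2

# Specification "(x <= 1) and (0 <= x)", evaluated at x = 2.5:
x = 2.5
loss = conj(leq(x, 1.0), leq(0.0, x))
# leq(2.5, 1.0) = 1.5 and leq(0.0, 2.5) = 0.0, so loss = 1.5
```

In a real DL the atoms would range over neural network outputs, so the resulting loss could be minimised by gradient descent alongside the usual training objective; different existing DLs correspond to different choices for `conj` and `disj` (e.g. other t-norms and t-conorms).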