Published: 28th March 2013 DOI: 10.4204/EPTCS.113 ISSN: 2075-2180
This document contains the proceedings of the Seventh International Workshop on Logical and Semantic Frameworks, with Applications, which was held on September 29 and 30, 2012, in Rio de Janeiro, Brazil. It contains 11 regular papers (9 long and 2 short) accepted for presentation at the meeting, as well as extended abstracts of invited talks by Torben Braüner (Roskilde University, Denmark), Maribel Fernández (King's College London, United Kingdom), Edward Hermann Haeusler (PUC-Rio, Brazil) and Alexandre Miquel (École Normale Supérieure de Lyon, France).
Each submission was reviewed by at least three referees, and an electronic Program Committee (PC) meeting was held using the EasyChair system.
We are very grateful to our invited speakers for kindly agreeing to give talks at LSFA 2012. We would also like to thank all the PC members for their active participation in the electronic PC discussion.
Last but not least, we thank the organizing committee of LSFA 2012: Christiano Braga (UFF), Renata de Freitas (UFF), Luiz Carlos Pereira (PUC-Rio, UFRJ), and Maurício Ayala-Rincón (UnB).
21st of February, 2013
Delia Kesner (Université Paris-Diderot, France)
Petrucio Viana (Universidade Federal Fluminense, Brazil)
Hybrid logic
Hybrid logics are obtained by adding further expressive power to ordinary modal logic.
The history of hybrid logics goes back to the philosopher Arthur Prior's work in the 1960s.
The most basic hybrid logic is obtained by extending ordinary modal logic with nominals, which are propositional symbols of a new sort. In the Kripke semantics a nominal is interpreted in a restricted way such that it is true at exactly one point (where the points represent possible worlds, times, locations, epistemic states, states in a computer, or something else). If the points are given a temporal reading, this enables the formalization of natural language statements that are true at exactly one time, for example
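As a toy illustration of this restricted interpretation (with made-up names, and only a minimal fragment of the language), the following Haskell sketch evaluates basic hybrid formulas over a finite Kripke model in which every nominal is assigned exactly one point:

```haskell
-- A toy finite Kripke model for a basic hybrid language. Points are integers;
-- a nominal denotes exactly ONE point, whereas an ordinary propositional
-- symbol may hold at any set of points.
import qualified Data.Map as M

data Form
  = Prop String          -- ordinary propositional symbol
  | Nom  String          -- nominal: true at exactly one point
  | Neg  Form
  | And  Form Form
  | Dia  Form            -- "possibly", along the accessibility relation
  deriving Show

data Model = Model
  { points  :: [Int]
  , rel     :: [(Int, Int)]          -- accessibility relation
  , valProp :: M.Map String [Int]    -- proposition -> points where it holds
  , valNom  :: M.Map String Int      -- nominal -> its unique point
  }

sat :: Model -> Int -> Form -> Bool
sat m w f = case f of
  Prop p  -> w `elem` M.findWithDefault [] p (valProp m)
  Nom n   -> Just w == M.lookup n (valNom m)   -- holds at exactly one point
  Neg g   -> not (sat m w g)
  And g h -> sat m w g && sat m w h
  Dia g   -> any (\v -> sat m v g) [v | (u, v) <- rel m, u == w]

-- Example with a temporal reading: three instants 0 -> 1 -> 2, where the
-- nominal "five_oclock" names instant 1 and so is true there and nowhere else.
example :: Model
example = Model [0, 1, 2] [(0, 1), (1, 2)]
                (M.fromList [("raining", [1, 2])])
                (M.fromList [("five_oclock", 1)])

main :: IO ()
main = do
  print [sat example w (Nom "five_oclock") | w <- points example]        -- [False,True,False]
  print (sat example 0 (Dia (And (Nom "five_oclock") (Prop "raining")))) -- True
```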
Access control is a fundamental aspect of computer security; it aims at protecting resources from non-authorized users.
The generalized use of access control in distributed computing environments has increased the need for high-level declarative languages that enable security administrators to specify and analyze a wide range of complex policies.
The first attempts to develop formal theories to define and validate security policies have used first-order theorem provers, purpose-built logics, or flow-analysis.
More recently, rewriting techniques have been fruitfully exploited in the context of security protocols (e.g., [2]), information leakage (e.g., [8]), and access control (e.g., [10, 5]). Rewriting systems present many advantages: they have a well-studied theory, and programming languages and tools are available for fast prototyping and analysis (e.g., Maude [6]).
Once the policies are defined, they can be integrated into a standard implementation, using for instance the weaving techniques described in [11].
Over the last few years, a variety of access control models have been developed, often motivated by particular applications.
The unifying metamodel for access control proposed in [1] identifies a core set of principles of access control; all the standard models can then be derived as instances of the metamodel. In [4], we use the same approach to define a metamodel of access control for distributed environments where each component of the system preserves its autonomy. The notion of distributed environment that we consider is related to the notion of federation developed in the context of database systems [9, 7].
We propose a formal framework for modeling (and enforcing) global access control policies that takes into account the local policies specified and maintained by each member of the federation.
In particular we ensure the coherence of a global access control decision w.r.t. local access control requirements by specifying in a tunable way how to integrate access authorizations resulting from the local policies.
We provide a rewrite-based operational semantics that deals in a uniform way with distributed systems where different access control policies are maintained locally. In this way, properties of access control policies can be proved in a modular way.
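As a rough illustration only (the names and the two combinators below are illustrative, not the rewrite specification of [3, 4]), the following Haskell sketch shows the kind of tunable integration of local authorizations that such a global policy performs:

```haskell
-- Rough sketch: a global decision obtained by combining the answers of the
-- autonomous local policies of a federation. The "tunable" part is the choice
-- of combinator passed to the global policy (names are illustrative only).
data Decision = Grant | Deny | NotApplicable
  deriving (Eq, Show)

type Request     = (String, String, String)   -- (principal, resource, action)
type LocalPolicy = Request -> Decision

-- Two possible combination strategies among many.
allMustGrant, anyMayGrant :: [Decision] -> Decision
allMustGrant ds
  | Deny `elem` ds                     = Deny
  | not (null ds) && all (== Grant) ds = Grant
  | otherwise                          = NotApplicable
anyMayGrant ds
  | Grant `elem` ds = Grant
  | Deny  `elem` ds = Deny
  | otherwise       = NotApplicable

-- The global policy consults every local policy and combines their answers.
globalPolicy :: ([Decision] -> Decision) -> [LocalPolicy] -> Request -> Decision
globalPolicy combine locals req = combine [p req | p <- locals]

main :: IO ()
main = do
  let site1 (_, res, _)  = if res  == "db1"   then Grant else NotApplicable
      site2 (prin, _, _) = if prin == "alice" then Grant else Deny
      locals = [site1, site2] :: [LocalPolicy]
  print (globalPolicy allMustGrant locals ("alice", "db1", "read"))  -- Grant
  print (globalPolicy anyMayGrant  locals ("bob",   "db1", "read"))  -- Grant: site1's authorization prevails
  print (globalPolicy allMustGrant locals ("bob",   "db1", "read"))  -- Deny:  site2's refusal prevails
```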
In particular, we are interested in consistency and totality properties of policies.
These properties guarantee that access requests will be treated as expected. A consistent access control policy specifies at most one answer for each access request
(i.e., an access request cannot be both granted and denied).
Totality guarantees that every access request will be given an answer.
We show how consistency and totality properties of access control policies can be derived from standard properties of the rewrite framework we use.
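As stated above, in the framework these properties follow from standard properties of the rewrite system; purely as an illustration of what the properties assert, the following sketch (illustrative names, small finite request space) checks them by brute force:

```haskell
-- Illustration only: consistency ("at most one answer per request") and
-- totality ("every request gets an answer"), checked by brute force over a
-- small finite request space. In the rewrite framework these are derived
-- from properties of the rules themselves rather than by enumeration.
import Data.List (nub)

data Answer = Granted | Denied deriving (Eq, Show)

type Request = (String, String)           -- (principal, action), kept tiny
type Rule    = (Request -> Bool, Answer)  -- (applicability test, answer)

answersFor :: [Rule] -> Request -> [Answer]
answersFor rules req = [a | (applies, a) <- rules, applies req]

-- Consistency: no request receives two different answers.
consistent :: [Rule] -> [Request] -> Bool
consistent rules = all (\r -> length (nub (answersFor rules r)) <= 1)

-- Totality: every request receives at least one answer.
total :: [Rule] -> [Request] -> Bool
total rules = all (not . null . answersFor rules)

main :: IO ()
main = do
  let rules = [ (\(p, _) -> p == "admin",  Granted)
              , (\(_, a) -> a == "delete", Denied) ]
      space = [(p, a) | p <- ["admin", "guest"], a <- ["read", "delete"]]
  print (consistent rules space)  -- False: ("admin","delete") gets both answers
  print (total rules space)       -- False: ("guest","read") gets no answer
```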
The results reported here are based on preliminary work presented in [3, 4].
References
[1]
S. Barker.
The next 700 access control models or a unifying meta-model?
In SACMAT 2009, 14th ACM Symposium on Access Control Models and Technologies, Stresa, Italy, June 3-5, 2009, Proceedings, pages 187-196. ACM Press, 2009. doi:10.1145/1542207.1542238
[2]
G. Barthe, G. Dufay, M. Huisman, and S. Melo de Sousa.
Jakarta: a toolset to reason about the JavaCard platform.
In Proceedings of e-SMART'01, number 2140 in Lecture Notes in Computer Science. Springer-Verlag, 2002. doi:10.1007/3-540-45418-7_2
[3]
C. Bertolissi and M. Fernández.
Category-based authorisation models: operational semantics and expressive power.
In Proceedings of the International Symposium on Engineering Secure Software and Systems, ESSOS 2010, Pisa, number 5965 in Lecture Notes in Computer Science. Springer, 2010.
doi:10.1007/978-3-642-11747-3_11
[4]
C. Bertolissi and M. Fernández.
Rewrite specifications of access control policies in distributed environments.
In Proc. of STM 2010: 6th Workshop on Security and Trust Management, Athens, Greece, 2010, number 6710 in Lecture Notes in Computer Science. Springer, 2011. doi:10.1007/978-3-642-22444-7_4
[5]
C. Bertolissi, M. Fernández, and S. Barker.
Dynamic event-based access control as term rewriting.
In Data and Applications Security XXI. Proceedings of DBSEC 2007, number 4602 in Lecture Notes in Computer Science. Springer-Verlag, 2007. doi:10.1007/978-3-540-73538-0_15
[6]
M. Clavel, F. Durán, S. Eker, P. Lincoln, N. Martí-Oliet, J. Meseguer, and C. Talcott.
The Maude 2.0 system.
In Rewriting Techniques and Applications (RTA 2003), number 2706 in Lecture Notes in Computer Science, pages 76-87. Springer-Verlag, 2003. doi:10.1007/3-540-44881-0_7
[7]
S. De Capitani di Vimercati and P. Samarati.
Authorization specification and enforcement in federated database systems.
J. Comput. Secur., 5:155-188, March 1997. doi:10.3233/JCS-1997-5205
[8]
R. Echahed and F. Prost.
Security policy in a declarative style.
In Proc. 7th ACM-SIGPLAN Symposium on Principles and Practice of Declarative Programming (PPDP'05). ACM Press, 2005. doi:10.1145/1069774.1069789
[9]
D. Jonscher and K.R. Dittrich.
An approach for building secure database federations.
In Proceedings of the 20th International Conference on Very Large Data Bases, VLDB '94, pages 24-35, San Francisco, CA, USA, 1994.
[10]
A. Santana de Oliveira.
Réécriture et Modularité pour les Politiques de Sécurité.
PhD thesis, Université Henri Poincaré, Nancy, France, 2008.
[11]
A. Santana de Oliveira, E. Ke Wang, C. Kirchner, and H. Kirchner.
Weaving rewrite-based access control policies.
In Proceedings of the 2007 ACM workshop on Formal methods in security engineering, FMSE 2007, Fairfax, VA, USA, November 2, 2007, pages 71-80. ACM, 2007. doi:10.1145/1314436.1314446
Classical (K) and Intuitionistic (I) propositional logics are well-known and very useful basic logics. On top of each of them many reasoning systems for AI and foundations of program constructions have been proposed.
As a matter of terminology, given a logic L, let Taut(L) = { α : ∀ A ∈ Models(L), A ⊨L α } and Sat(L) = { α : ∃ A ∈ Models(L), A ⊨L α }.
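For the classical case these sets can be computed directly by enumerating valuations; the following sketch (illustrative only; for I and M one would quantify over Kripke models instead) makes the Sat/Taut distinction concrete:

```haskell
-- Sat/Taut for classical propositional logic, computed by enumerating all
-- valuations of the finitely many variables of a formula (illustration only).
import Data.List (nub)
import qualified Data.Map as M

data Form = Var String | Bot | Neg Form | And Form Form | Or Form Form | Imp Form Form

vars :: Form -> [String]
vars (Var x)   = [x]
vars Bot       = []
vars (Neg a)   = vars a
vars (And a b) = nub (vars a ++ vars b)
vars (Or  a b) = nub (vars a ++ vars b)
vars (Imp a b) = nub (vars a ++ vars b)

eval :: M.Map String Bool -> Form -> Bool
eval v (Var x)   = M.findWithDefault False x v
eval _ Bot       = False
eval v (Neg a)   = not (eval v a)
eval v (And a b) = eval v a && eval v b
eval v (Or  a b) = eval v a || eval v b
eval v (Imp a b) = not (eval v a) || eval v b

-- All 2^n valuations over a list of variables.
valuations :: [String] -> [M.Map String Bool]
valuations xs = map (M.fromList . zip xs) (mapM (const [False, True]) xs)

inSat, inTaut :: Form -> Bool
inSat  a = any (`eval` a) (valuations (vars a))   -- some model satisfies a
inTaut a = all (`eval` a) (valuations (vars a))   -- every model satisfies a

main :: IO ()
main = do
  let peirce = Imp (Imp (Imp (Var "p") (Var "q")) (Var "p")) (Var "p")
  print (inTaut peirce)                          -- True: classically valid (though not in I or M->)
  print (inSat (And (Var "p") (Neg (Var "p"))))  -- False: unsatisfiable
```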
Minimal logic (M) is the logic obtained from I by removing the intuitionistic absurdity rule (from ⊢ ⊥ infer ⊢ B, for any B). Purely implicational minimal logic (M→) is the fragment of minimal logic dealing only with the → logical constant.
This work discusses the computational complexity of Sat(L) and Taut(L), for arbitrary propositional logics L, and how it is (possibly) hard to have an efficient mechanization of their respective proof procedures.
Sat(I) and Taut(I) are known to be PSPACE-complete, while Taut(K) is coNP-complete and Sat(K) is NP-complete. The proofs of these facts were obtained by model-theoretic arguments on the respective logics.
For example, Statman [6] provided a polynomial reduction of quantified boolean logic (Taut(QBL)) to Taut(I) that is based on the truth-value semantics of the former. Additionally, in the same article, he provided a proof-theoretical argument for a polynomial reduction of I to M→. As a consequence, the computational complexities of Sat(M→) and Taut(M→) are also known to be PSPACE-complete, given that coPSPACE = PSPACE.
It is worth noting that, although Statman's reduction of QBF to I is based on a truth-value semantics, he also uses a proof-theoretical result: Glivenko's theorem (from Γ ⊢K α it follows that Γ ⊢I ¬¬α).
There are two representative ways of normalizing Natural Deduction derivations.
The Prawitz strategy transforms arbitrary ⊥-classical applications in the derivation into atomic ⊥-classical applications. It works for the { ∧, →, ¬ } fragment. The Seldin strategy, on the other hand, transforms the derivation into a new derivation with at most one ⊥-classical application, namely the last rule application in it. Both strategies are concluded by the usual reductions for eliminating maximal (detour) formulas, as in the intuitionistic case, since classical ⊥ rules do not yield maximal formulas.
In [5] it is shown how these usual strategies to normalize classical derivations are strongly related to double-negation translations from classical to intuitionistic logic. The Prawitz and Seldin strategies are related to the Gödel and Kolmogorov translations, respectively. In fact, the Seldin strategy can be seen as an alternative proof of Glivenko's theorem. It is worth noting that both translations are polynomial in the size of the original derivation.
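To illustrate the formula-level translation in the Kolmogorov case (one common presentation; the derivation-level correspondence is the subject of [5]), the following sketch double-negates every subformula and shows that the output grows only linearly:

```haskell
-- Kolmogorov double-negation translation on propositional formulas: every
-- subformula is prefixed by a double negation. The result is linear in the
-- size of the input (each subformula contributes two extra negations).
data Form = Var String | Bot | Neg Form | And Form Form | Or Form Form | Imp Form Form
  deriving Show

kolmogorov :: Form -> Form
kolmogorov f = case f of
  Var x   -> dneg (Var x)
  Bot     -> dneg Bot
  Neg a   -> dneg (Neg (kolmogorov a))
  And a b -> dneg (And (kolmogorov a) (kolmogorov b))
  Or  a b -> dneg (Or  (kolmogorov a) (kolmogorov b))
  Imp a b -> dneg (Imp (kolmogorov a) (kolmogorov b))
  where dneg a = Neg (Neg a)

size :: Form -> Int
size (Var _)   = 1
size Bot       = 1
size (Neg a)   = 1 + size a
size (And a b) = 1 + size a + size b
size (Or  a b) = 1 + size a + size b
size (Imp a b) = 1 + size a + size b

main :: IO ()
main = do
  let peirce = Imp (Imp (Imp (Var "p") (Var "q")) (Var "p")) (Var "p")
  print (kolmogorov peirce)
  print (size peirce, size (kolmogorov peirce))  -- sizes 7 and 21: a factor of 3
```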
From the facts and results mentioned above, we observe that the sub-formula principle, which holds in classical, intuitionistic and minimal logics, is what allows one to polynomially relate derivations among these logics, performing a worst case analysis for Taut(I), Taut(M) and Taut(M→).
In this work, we extend the polynomial reduction of purely implicational logic to any propositional logic with a Natural Deduction presentation that satisfies the sub-formula principle.
More precisely we prove:
Theorem 1
Let L be a propositional logic that has a finitary, complete and sound Natural Deduction system satisfying the sub-formula principle.
Then Taut(L) is PSPACE-hard.
We remark that some well-known logics, such as Propositional Dynamic Logic (PDL) and S5Cn (n ≥ 2), for example, hardly satisfy the sub-formula property, unless EXPTIME = PSPACE. A feasible proof system for these logics can hardly exist. These logics are not polynomially bounded either, unless EXPTIME = NP. According to Theorem 1, any logic satisfying the sub-formula principle cannot be polynomially bounded either, unless NP = PSPACE.
It is generally accepted that a mechanizable logic should only have polynomially bounded proofs. A logic weaker than Taut(M→) might be polynomially bounded, but its practical use is necessarily more restricted given the facts already discussed.
An alternative position on what counts as a mechanizable logic is taken in [1]. A propositional proof system is called automatizable whenever it produces proofs in time polynomially bounded by the size of the smallest proof of the same conclusion. When these smallest proofs may be written in systems stronger than the original one, the system is called weakly automatizable. Under certain conditions, weak automatizability is equivalent to the feasible interpolation property (see [1]).
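One way to make the first notion precise (a paraphrase; see [1] for the exact definitions):

```latex
% Paraphrase of automatizability (precise definitions in [1]): a proof system
% S for the tautologies of a logic L is automatizable iff
\[
\exists\,\mathcal{A}\ \exists\,\text{polynomial } p\ \ \forall \tau \in \mathrm{Taut}(L):\quad
\mathcal{A}(\tau)\ \text{outputs an } S\text{-proof of } \tau
\ \text{in time at most}\ p\bigl(\lvert \pi^{S}_{\min}(\tau) \rvert\bigr),
\]
% where A is a deterministic algorithm and \pi^S_min(\tau) is a smallest
% S-proof of \tau. Weak automatizability allows the output proof to be in a
% (possibly stronger) system that polynomially simulates S.
```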
A proof system has the feasible interpolation property whenever, for every proof of α → β, there is a Craig interpolant γ that is polynomially bounded in the size of the proof of α → β.
Interpolants are useful lemmas that can be used to produce smaller proofs.
There are examples showing that an exponential gap may exist between proofs without lemmas and with lemmas.
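As a tiny worked illustration of the notion of interpolant (not an example from the cited papers):

```latex
% Take \alpha = p \land q and \beta = p \lor r; then \gamma = p is a Craig
% interpolant for the valid implication \alpha \to \beta:
\[
\models \alpha \to \gamma, \qquad \models \gamma \to \beta, \qquad
\mathrm{vars}(\gamma) = \{p\} \subseteq \mathrm{vars}(\alpha) \cap \mathrm{vars}(\beta) = \{p\}.
\]
```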
Since Taut(M→) satisfies the sub-formula principle, the Craig interpolation property must hold. However, the interpolants are, in most cases, either the conclusion β or the premise α, and although they are polynomially bounded in the size of the proof, they do not help in producing smaller proofs.
Although Taut(M→) satisfies the feasible interpolation property, this does not provide any clue about the definition of an efficient propositional proof system for it.
We conclude this work by commenting on some possible heuristics to provide short (polynomially bounded) proofs. The introduction of interpolants, or lemmata, is the most well-known heuristic. In [2] and [3] we show that it does not work at all, and we propose the use of substitution/unification together with a graph-based presentation in order to obtain short proofs. An example inspired by the Fibonacci series is shown. This example is exponentially sized, it has no non-trivial short interpolant, and a polynomially bounded proof can be obtained by applying the proposed heuristics.
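The construction itself is given in [2, 3]; as a loose analogy only (not the authors' proof-compression technique), the following sketch contrasts the exponentially sized recursion tree of a Fibonacci-shaped definition with the linearly sized graph obtained once repeated substructure is shared:

```haskell
-- Loose analogy only: a Fibonacci-shaped recursion has exponentially many
-- nodes when unfolded as a tree, but only linearly many distinct nodes once
-- equal subterms are shared, as in a graph/DAG representation.
import qualified Data.Map as M

-- Nodes of the full recursion tree of fib(n): exponential in n.
treeNodes :: Integer -> Integer
treeNodes 0 = 1
treeNodes 1 = 1
treeNodes n = 1 + treeNodes (n - 1) + treeNodes (n - 2)

-- Distinct subproblems when equal calls are shared: linear in n.
sharedNodes :: Integer -> Int
sharedNodes n = M.size (go n M.empty)
  where
    go k seen
      | k `M.member` seen = seen
      | k <= 1            = M.insert k () seen
      | otherwise         = go (k - 2) (go (k - 1) (M.insert k () seen))

main :: IO ()
main = mapM_ (\n -> print (n, treeNodes n, sharedNodes n)) [5, 10, 20]
-- e.g. (20, 21891, 21): exponential tree versus linear shared representation
```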
A previous version of this work was presented in [4].
References
[1]
A. Atserias and M. L. Bonet.
On the automatizability of resolution and related propositional proof systems.
Inf. Comput., 189:182-201, 2004. doi:10.1016/j.ic.2003.10.004
[2]
L. Gordeev, E. H. Haeusler, and V. G. Costa.
Proof compressions with circuit-structured substitutions.
J. Math. Sci., 158:645-658, 2009. doi:10.1007/s10958-009-9405-3
[3]
L. Gordeev, E.H. Haeusler and L.C. Pereira.
Propositional proof compressions and DNF logic.
Log. J. IGPL, 19:62-86, 2011. doi:10.1093/jigpal/jzq003
[4]
E.H. Haeusler.
On the computational complexity of Purely Implicational Logic: A proof-theoretical discussion.
In XV SLALM, 15th Latin American Symposium on Mathematical Logic, Bogota, Colombia, 2012, Book of Abstracts.
[5]
L.C. Pereira, E.H. Haeusler, and M. da P.N. Medeiros.
Some results on Classical Logic fragments containing the negation.
O Que Nos Faz Pensar, 23:105-111, 2008.
In Portuguese.
[6]
R. Statman.
Intuitionistic Propositional Logic is polynomial-space complete.
Theor. Comput. Sci., 9:67-72, 1979. doi:10.1016/0304-3975(79)90006-9
The theory of classical realizability was introduced by Krivine [8] in the middle of the 90's in order to analyze the computational contents of classical proofs, following the connection between classical reasoning and control operators discovered by Griffin [4].
More than an extension of the theory of intuitionistic realizability, classical realizability is a complete reformulation of the very principles of realizability based on a combination [12, 11] of Kleene's realizability [5] with Friedman's A-translation [3].
One of the most interesting aspects of the theory is that it is highly modular:
although it was originally developed for second order arithmetic, classical realizability naturally extends to more expressive frameworks such as Zermelo-Fraenkel set theory [6] or the calculus of constructions with universes [10].
Moreover, the underlying language of realizers can be freely enriched with new instructions in order to realize extra reasoning principles, such as the axiom of dependent choices [7].
More recently, Krivine [9] proposed an ambitious research program, whose aim is to unify the methods of classical realizability with Cohen's forcing [1, 2] in order to extract programs from proofs obtained by the forcing technique.
In this talk, I shall survey the methods and the main results of classical realizability, while presenting some of its specific problems, such as the specification problem.
I shall also discuss the perspectives of combining classical realizability with the technique of forcing, showing that this combination is not only interesting from a pure model theoretic point of view, but that it can also bring new insights about a possible widening of the proofs-as-programs correspondence beyond the limits of pure functional programming.
References
[1]
P. J. Cohen.
The independence of the Continuum Hypothesis.
P. Natl. Acad. Sci. USA., 50:1143-1148, 1963.
[2]
P. J. Cohen.
The independence of the Continuum Hypothesis II.
P. Natl. Acad. Sci. USA., 51:105-110, 1964.
[3]
H. Friedman.
Classically and Intuitionistically Provably Recursive Functions.
Lect. Notes Math., 688:21-28, 1978. doi:10.1007/BFb0103100
[4]
T. Griffin.
A Formulae-as-Types Notion of Control.
Principles Of Programming Languages (POPL'90), 47-58, 1990. doi:10.1145/96709.96714
[5]
S.C. Kleene.
On the interpretation of intuitionistic number theory.
J. Symb. Logic, 10:109-124, 1945.
[6]
J.-L. Krivine.
Typed lambda-calculus in classical Zermelo-Fraenkel set theory.
Arch. Math. Log., 40:189-205, 2001. doi:10.1007/s001530000057
[7]
J.-L. Krivine.
Dependent choice, 'quote' and the clock.
Theor. Comput. Sci., 308:259-276, 2003. doi:10.1016/S0304-3975(02)00776-4
[8]
J.-L. Krivine.
Realizability in classical logic.
Interactive models of computation and program behaviour, 27:197-229.
Société Mathématique de France, 2009.
[9]
J.-L. Krivine.
Realizability algebras: a program to well order R.
CoRR, abs/1005.2395, 2010.
[10]
A. Miquel.
Classical program extraction in the calculus of constructions.
Computer Science Logic (CSL'07), 313-327, 2007. doi:10.1007/978-3-540-74915-8_25
[11]
A. Miquel.
Existential witness extraction in classical realizability and via a negative translation.
Log. Meth. Comp. Sci., 7:1-47, 2011. doi:10.2168/LMCS-7(2:2)2011
[12]
P. Oliva and T. Streicher.
On Krivine's realizability interpretation of classical second-order arithmetic.
Fundam. Inform., 84:207-220, 2008.