OPUS 4 Latest Documents RSS Feed
Latest documents
http://publikationen.stub.uni-frankfurt.de/index/index/
Thu, 07 Aug 2014 15:04:26 +0200

Correctness of an STM Haskell implementation
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34440
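For context on the abstract below: the paper verifies an implementation of software transactional memory; a minimal sketch of the STM interface such an implementation provides, assuming the stm library that ships with GHC (the account example is illustrative, not from the paper):

```haskell
import Control.Concurrent.STM

-- Move n units between two transactional variables atomically.
-- If another transaction interferes, the runtime aborts and retries
-- this one -- the behaviour whose implementation the paper verifies.
transfer :: TVar Int -> TVar Int -> Int -> STM ()
transfer from to n = do
  a <- readTVar from
  check (a >= n)              -- retry until enough funds are available
  writeTVar from (a - n)
  modifyTVar' to (+ n)

main :: IO ()
main = do
  from <- newTVarIO 100
  to   <- newTVarIO 0
  atomically (transfer from to 30)
  balances <- atomically ((,) <$> readTVar from <*> readTVar to)
  print balances              -- (70,30)
```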
A concurrent implementation of software transactional memory in Concurrent Haskell using a call-by-need functional language with processes and futures is given. The description of the small-step operational semantics is precise and explicit, and employs an early abort of conflicting transactions. A proof of correctness of the implementation is given for a contextual semantics with may- and should-convergence. This implies that our implementation is a correct evaluator for an abstract specification equipped with a big-step semantics.
Manfred Schmidt-Schauß; David Sabel (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34440
Tue, 08 Jul 2014 15:04:26 +0200

Towards correctness of program transformations through unification and critical pair computation
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34439
Correctness of program transformations in extended lambda calculi with a contextual semantics is usually based on reasoning about the operational semantics, which is a rewrite semantics. A successful approach to proving correctness is the combination of a context lemma with the computation of overlaps between program transformations and the reduction rules, and then of so-called complete sets of diagrams. The method is similar to the computation of critical pairs for the completion of term rewriting systems. We explore cases where the computation of these overlaps can be done in a first-order way by variants of critical pair computation that use unification algorithms. As a case study we apply the method to a lambda calculus with recursive let-expressions and describe an effective unification algorithm to determine all overlaps of a set of transformations with all reduction rules. The unification algorithm employs many-sorted terms, the equational theory of left-commutativity modelling multi-sets, context variables of different kinds and a mechanism for compactly representing binding chains in recursive let-expressions.
Conrad Rau; Manfred Schmidt-Schauß (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34439
Tue, 08 Jul 2014 14:56:08 +0200

Reconstructing a logic for inductive proofs of properties of functional programs
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34438
David Sabel; Manfred Schmidt-Schauß (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34438
Tue, 08 Jul 2014 14:44:41 +0200

Reconstruction of a logic for inductive proofs of properties of functional programs
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34437
The interactive verification system VeriFun is based on a polymorphic call-by-value functional language and on a first-order logic with initial model semantics w.r.t. constructors. It is designed to perform automatic induction proofs and can also deal with partial functions. This paper provides a reconstruction of the corresponding logic and semantics using the standard treatment of undefinedness, which adapts and improves the VeriFun logic by allowing reasoning on nonterminating expressions and functions. Equality of expressions is defined as contextual equivalence based on observing termination in all closing contexts. The reconstruction shows that several restrictions of the VeriFun framework can easily be removed by natural generalizations: mutually recursive functions, abstractions in the data values, and formulas with arbitrary quantifier prefix can be formulated. The main result of this paper is that an extended set of deduction rules, usable in VeriFun under the adapted semantics, is proved correct, i.e. the rules respect observational equivalence in all extensions of a program. We also show that certain classes of theorems, such as universally quantified equations, are conservative under extensions, and we analyze further special classes of theorems for conservativity.
David Sabel; Manfred Schmidt-Schauß (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34437
Tue, 08 Jul 2014 14:35:53 +0200

Reconstruction of a logic for inductive proofs of properties of functional programs
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34436
The interactive verification system VeriFun is based on a polymorphic call-by-value functional language and on a first-order logic with initial model semantics w.r.t. constructors. This paper provides a reconstruction of the corresponding logic when partial functions are permitted. Typing is polymorphic for the definition of functions but monomorphic for terms in formulas. Equality of terms is defined as contextual equivalence based on observing termination in all contexts. The reconstruction also allows several generalizations of the functional language, like mutually recursive functions and abstractions in the data values. The main results are: correctness, for all extensions of a program, of several program transformations that have potential use in a deduction system. We also prove that universally quantified equations are conservative, i.e. if a universally quantified equation is valid w.r.t. a program P, then it remains valid if the program is extended by new functions and/or new data types.
David Sabel; Manfred Schmidt-Schauß (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34436
Tue, 08 Jul 2014 14:25:24 +0200

On correctness of buffer implementations in a concurrent lambda calculus with futures
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34435
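The abstract below relates concurrent buffers to handled futures. As context, a minimal sketch of the buffer interface itself in plain Concurrent Haskell, built on MVars (this only illustrates the interface; the paper's encodings live in a formal calculus, not in this code):

```haskell
import Control.Concurrent
import Control.Concurrent.MVar

-- A one-place concurrent buffer: put blocks while the buffer is full,
-- get blocks while it is empty. An empty MVar gives exactly this
-- blocking behaviour.
newtype Buffer a = Buffer (MVar a)

newBuffer :: IO (Buffer a)
newBuffer = Buffer <$> newEmptyMVar

put :: Buffer a -> a -> IO ()
put (Buffer m) = putMVar m

get :: Buffer a -> IO a
get (Buffer m) = takeMVar m

main :: IO ()
main = do
  buf <- newBuffer
  _ <- forkIO (put buf (42 :: Int))   -- producer thread
  v <- get buf                        -- consumer blocks until the value arrives
  print v
```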
Motivated by the question of correctness of a specific implementation of concurrent buffers in the lambda calculus with futures underlying Alice ML, we prove that concurrent buffers and handled futures can correctly encode each other. Correctness means that our encodings preserve and reflect the observations of may- and must-convergence, which as a consequence also yields soundness of the encodings with respect to a contextually defined notion of program equivalence. While these translations encode blocking into queuing and waiting, we also describe an adequate encoding of buffers in a calculus without handles, which is more low-level and uses busy-waiting instead of blocking. Furthermore we demonstrate that our correctness concept applies to the whole compilation process from high-level to low-level concurrent languages, by translating the calculus with buffers, handled futures and data constructors into a small core language without those constructs.
Jan Schwinghammer; David Sabel; Joachim Niehren; Manfred Schmidt-Schauß (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34435
Tue, 08 Jul 2014 14:12:40 +0200

Adequacy of compositional translations for observational semantics
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34433
We investigate methods and tools for analyzing translations between programming languages with respect to observational semantics. The behavior of programs is observed in terms of may- and must-convergence in arbitrary contexts, and adequacy of translations, i.e., the reflection of program equivalence, is taken to be the fundamental correctness condition. For compositional translations we propose a notion of convergence equivalence as a means for proving adequacy. This technique avoids explicit reasoning about contexts, and is able to deal with the subtle role of typing in implementations of language extensions.
Manfred Schmidt-Schauß; Joachim Niehren; Jan Schwinghammer; David Sabel (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34433
Tue, 08 Jul 2014 14:02:58 +0200

Adequacy of compositional translations for observational semantics
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34432
We investigate methods and tools for analyzing translations between programming languages with respect to observational semantics. The behavior of programs is observed in terms of may- and must-convergence in arbitrary contexts, and adequacy of translations, i.e., the reflection of program equivalence, is taken to be the fundamental correctness condition. For compositional translations we propose a notion of convergence equivalence as a means for proving adequacy. This technique avoids explicit reasoning about contexts, and is able to deal with the subtle role of typing in implementations of language extensions.
Manfred Schmidt-Schauß; Joachim Niehren; Jan Schwinghammer; David Sabel (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34432
Tue, 08 Jul 2014 13:55:58 +0200

Adequacy of compositional translations for observational semantics
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34431
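The abstract below concerns compositional translations between languages. As a toy illustration of what "compositional" means, here is a Church-style encoding of pairs from a small source language into plain lambda terms, where each case only recurses on subterms (the datatypes and names are illustrative, not taken from the paper):

```haskell
-- Source language with pairs; target is the pure lambda calculus.
data Src = SVar String | SLam String Src | SApp Src Src
         | SPair Src Src | SFst Src
  deriving (Eq, Show)

data Tgt = TVar String | TLam String Tgt | TApp Tgt Tgt
  deriving (Eq, Show)

-- Compositional: the image of a term is assembled from the images of
-- its immediate subterms.
translate :: Src -> Tgt
translate (SVar x)    = TVar x
translate (SLam x e)  = TLam x (translate e)
translate (SApp a b)  = TApp (translate a) (translate b)
translate (SPair a b) =                 -- \s -> s a' b'
  TLam "s" (TApp (TApp (TVar "s") (translate a)) (translate b))
translate (SFst e)    =                 -- e' (\a b -> a)
  TApp (translate e) (TLam "a" (TLam "b" (TVar "a")))

main :: IO ()
main = print (translate (SFst (SPair (SVar "x") (SVar "y"))))
```

Adequacy, in the abstract's sense, would then ask that contextual equivalence of translated terms in the target reflects back to the source; proving it via convergence equivalence avoids quantifying over all contexts directly.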
We investigate methods and tools for analysing translations between programming languages with respect to observational semantics. The behaviour of programs is observed in terms of may- and must-convergence in arbitrary contexts, and adequacy of translations, i.e., the reflection of program equivalence, is taken to be the fundamental correctness condition. For compositional translations we propose a notion of convergence equivalence as a means for proving adequacy. This technique avoids explicit reasoning about contexts, and is able to deal with the subtle role of typing in implementations of language extensions.
Manfred Schmidt-Schauß; Joachim Niehren; Jan Schwinghammer; David Sabel (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34431
Tue, 08 Jul 2014 13:48:09 +0200

Adequacy of compositional translations for observational semantics
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34430
We investigate methods and tools for analysing translations between programming languages with respect to observational semantics. The behaviour of programs is observed in terms of may- and must-convergence in arbitrary contexts, and adequacy of translations, i.e., the reflection of program equivalence, is taken to be the fundamental correctness condition. For compositional translations we propose a notion of convergence equivalence as a means for proving adequacy. This technique avoids explicit reasoning about contexts, and is able to deal with the subtle role of typing in implementations of language extensions.
Manfred Schmidt-Schauß; Joachim Niehren; Jan Schwinghammer; David Sabel (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34430
Tue, 08 Jul 2014 13:41:34 +0200

A finite simulation method in a non-deterministic call-by-need calculus with letrec, constructors and case
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34429
The paper proposes a variation of simulation for checking and proving contextual equivalence in a non-deterministic call-by-need lambda-calculus with constructors, case, seq, and a letrec with cyclic dependencies. It also proposes a novel method to prove its correctness. The calculus’ semantics is based on a small-step rewrite semantics and on may-convergence. The cyclic nature of letrec bindings, as well as nondeterminism, makes known approaches to prove that simulation implies contextual equivalence, such as Howe’s proof technique, inapplicable in this setting. The basic technique for the simulation as well as the correctness proof is called pre-evaluation, which computes a set of answers for every closed expression. If simulation succeeds in finite computation depth, then it is guaranteed to show contextual preorder of expressions.
Manfred Schmidt-Schauß; Elena Machkasova (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34429
Tue, 08 Jul 2014 13:30:21 +0200

On generic context lemmas for lambda calculi with sharing
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34428
This paper proves several generic variants of context lemmas and thus contributes to improving the tools for observational semantics of deterministic and non-deterministic higher-order calculi that use a small-step reduction semantics. The generic (sharing) context lemmas are provided for may- as well as two variants of must-convergence, and hold in a broad class of extended process and extended lambda calculi, if the calculi satisfy certain natural conditions. As a guideline, the proofs of the context lemmas are valid in call-by-need calculi, in call-by-value calculi if substitution is restricted to variable-by-variable, and in process calculi like variants of the π-calculus. For calculi employing beta-reduction using a call-by-name or call-by-value strategy or similar reduction rules, some iu-variants of ciu-theorems are obtained from our context lemmas. Our results reestablish several context lemmas already proved in the literature, and also provide some new context lemmas as well as some new variants of the ciu-theorem. To make the results widely applicable, we use a higher-order abstract syntax that allows untyped calculi as well as certain simple typing schemes. The approach may lead to a unifying view of higher-order calculi, reduction, and observational equality.
Manfred Schmidt-Schauß; David Sabel (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34428
Tue, 08 Jul 2014 13:09:22 +0200

On generic context lemmas for lambda calculi with sharing
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34427
This paper proves several generic variants of context lemmas and thus contributes to improving the tools to develop observational semantics that is based on a reduction semantics for a language. The context lemmas are provided for may- as well as two variants of must-convergence and a wide class of extended lambda calculi, which satisfy certain abstract conditions. The calculi must have a form of node sharing, e.g. plain beta reduction is not permitted. There are two variants: weakly sharing calculi, where beta-reduction is only permitted for arguments that are variables, and strongly sharing calculi, which roughly correspond to call-by-need calculi, where beta-reduction is completely replaced by a sharing variant. The calculi must obey three abstract assumptions, which are in general easily recognizable given the syntax and the reduction rules. The generic context lemmas have as instances several context lemmas already proved in the literature for specific lambda calculi with sharing. The scope of the generic context lemmas comprises not only call-by-need calculi, but also call-by-value calculi with a form of built-in sharing. Investigations into other, new variants of extended lambda-calculi with sharing, where the language or the reduction rules and/or strategy varies, will be simplified by our result, since specific context lemmas are immediately derivable from the generic context lemma, provided our abstract conditions are met.
Manfred Schmidt-Schauß; David Sabel (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34427
Tue, 08 Jul 2014 13:02:37 +0200

A call-by-need lambda-calculus with locally bottom-avoiding choice: context lemma and correctness of transformations
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34426
We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation and a non-deterministic operator amb that is locally bottom-avoiding. We use a small-step operational semantics in the form of a single-step rewriting system that defines a (nondeterministic) normal order reduction. This strategy can be made fair by adding resources for bookkeeping. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into any program context, their termination behaviour is the same, where we use a combination of may- as well as must-convergence, which is appropriate for non-deterministic computations. We show that we can drop the fairness condition for equational reasoning, since the valid equations w.r.t. normal order reduction are the same as for fair normal order reduction. We develop different proof tools for proving correctness of program transformations; in particular, a context lemma for may- as well as must-convergence is proved, which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. We also prove a standardisation theorem for fair normal order reduction. The structure of the ordering <=c is also analysed: Ω is not a least element, and <=c already implies contextual equivalence w.r.t. may-convergence.
David Sabel; Manfred Schmidt-Schauß (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34426
Tue, 08 Jul 2014 12:52:05 +0200

A call-by-need lambda-calculus with locally bottom-avoiding choice: context lemma and correctness of transformations
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34425
We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation and a non-deterministic operator amb that is locally bottom-avoiding. We use a small-step operational semantics in the form of a single-step rewriting system that defines a (nondeterministic) normal order reduction. This strategy can be made fair by adding resources for bookkeeping. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into any program context, their termination behaviour is the same, where we use a combination of may- as well as must-convergence, which is appropriate for non-deterministic computations. We show that we can drop the fairness condition for equational reasoning, since the valid equations w.r.t. normal order reduction are the same as for fair normal order reduction. We develop different proof tools for proving correctness of program transformations; in particular, a context lemma for may- as well as must-convergence is proved, which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. We also prove a standardisation theorem for fair normal order reduction. The structure of the ordering <=c is also analysed: Ω is not a least element, and <=c already implies contextual equivalence w.r.t. may-convergence.
David Sabel; Manfred Schmidt-Schauß (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34425
Tue, 08 Jul 2014 12:38:20 +0200

Equivalence of call-by-name and call-by-need for lambda-calculi with letrec
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34424
We develop a proof method to show that in a (deterministic) lambda calculus with letrec and equipped with contextual equivalence, the call-by-name and the call-by-need evaluation are equivalent, and also that the unrestricted copy-operation is correct. Given a let-binding x = t, the copy-operation replaces an occurrence of the variable x by the expression t, regardless of the form of t. This gives an answer to unresolved problems in several papers, adds a strong method to the tool set for reasoning about contextual equivalence in higher-order calculi with letrec, and enables a class of transformations that can be used as optimizations. The method can be used in different kinds of lambda calculi with cyclic sharing. Probably it can also be used in non-deterministic lambda calculi if the variable x is “deterministic”, i.e., has no interference with non-deterministic executions. The main technical idea is to use a restricted variant of the infinitary lambda-calculus, whose objects are the expressions unrolled w.r.t. let, to define the infinite developments as a reduction calculus on the infinite trees, and to show a standardization theorem.
Manfred Schmidt-Schauß (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34424
Tue, 08 Jul 2014 12:31:08 +0200

On equivalences and standardization in a non-deterministic call-by-need lambda calculus
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34451
The goal of this report is to prove correctness of a considerable subset of transformations w.r.t. contextual equivalence in an extended lambda-calculus LS with case, constructors, seq, let, and choice, with a simple set of reduction rules; and to argue that an approximation calculus LA is equivalent to LS w.r.t. the contextual preorder, which enables the proof tool of simulation. Unfortunately, a direct proof appears to be impossible.
The correctness proof is by defining another calculus L comprising the complex variants of copy, case-reduction and seq-reductions that use variable-binding chains. This complex calculus has well-behaved diagrams and allows a proof of correctness of transformations, and a proof that the simple calculus LS, the calculus L, and the calculus LA all have an equivalent contextual preorder.
Manfred Schmidt-Schauß; Matthias Mann (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34451
Tue, 08 Jul 2014 12:16:49 +0200

On equivalences and standardization in a non-deterministic call-by-need lambda calculus
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34450
The goal of this report is to prove correctness of a considerable subset of transformations w.r.t. contextual equivalence in an extended lambda-calculus LS with case, constructors, seq, let, and choice, with a simple set of reduction rules; and to argue that an approximation calculus LA is equivalent to LS w.r.t. the contextual preorder, which enables the proof tool of simulation. Unfortunately, a direct proof appears to be impossible.
The correctness proof is by defining another calculus L comprising the complex variants of copy, case-reduction and seq-reductions that use variable-binding chains. This complex calculus has well-behaved diagrams and allows a proof of correctness of transformations, and a proof that the simple calculus LS, the calculus L, and the calculus LA all have an equivalent contextual preorder.
Manfred Schmidt-Schauß; Matthias Mann (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34450
Tue, 08 Jul 2014 11:24:31 +0200

On conservativity of concurrent Haskell
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34312
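The abstract below notes that futures can be implemented in Concurrent Haskell with unsafeInterleaveIO. A minimal hedged sketch of such a lazy future (illustrative only; the paper works in the formal calculus CHF, not with this code): the action runs in its own thread, and unsafeInterleaveIO postpones waiting for its result until the value is actually demanded.

```haskell
import Control.Concurrent
import Control.Concurrent.MVar
import System.IO.Unsafe (unsafeInterleaveIO)

-- Start the action concurrently; the returned value, when demanded,
-- blocks until the worker thread has filled the result box.
future :: IO a -> IO a
future act = do
  box <- newEmptyMVar
  _ <- forkIO (act >>= putMVar box)
  unsafeInterleaveIO (readMVar box)

main :: IO ()
main = do
  v <- future (pure (6 * 7 :: Int))   -- returns immediately
  print v                             -- demanding v waits for the thread
```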
The calculus CHF models Concurrent Haskell extended by concurrent, implicit futures. It is a process calculus with concurrent threads, monadic concurrent evaluation, and includes a pure functional lambda-calculus which comprises data constructors, case-expressions, letrec-expressions, and Haskell’s seq. Futures can be implemented in Concurrent Haskell using the primitive unsafeInterleaveIO, which is available in most implementations of Haskell. Our main result is conservativity of CHF, that is, all equivalences of pure functional expressions are also valid in CHF. This implies that compiler optimizations and transformations from pure Haskell remain valid in Concurrent Haskell even if it is extended by futures. We also show that this is no longer valid if Concurrent Haskell is extended by the arbitrary use of unsafeInterleaveIO.
David Sabel; Manfred Schmidt-Schauß (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34312
Thu, 03 Jul 2014 14:49:03 +0200

Simulation in the call-by-need lambda-calculus with letrec, case, constructors, and seq
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34234
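The calculus LR described in the abstract below combines letrec, data constructors, case and seq. In Haskell these appear as recursive let, ordinary constructors with pattern matching, and the primitive seq, which forces its first argument to weak head normal form before returning the second; a small sketch of the first and last:

```haskell
-- A cyclic, letrec-style binding: xs is defined in terms of itself,
-- and sharing makes the resulting list a finite cyclic structure.
ones :: [Int]
ones = let xs = 1 : xs in xs

-- seq forces n before the pair is built.
forcedPair :: (Int, String)
forcedPair = let n = length [1, 2, 3]
             in n `seq` (n, "n was evaluated before the pair was built")

main :: IO ()
main = do
  print (take 3 ones)     -- [1,1,1]
  print (fst forcedPair)  -- 3
```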
This paper shows equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in LR, the deterministic call-by-need lambda calculus with letrec extended by data constructors, case-expressions and Haskell's seq-operator. LR models an untyped version of the core language of Haskell. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations.
The proof is by a fully abstract and surjective transfer of the contextual approximation into a call-by-name calculus, which is an extension of Abramsky's lazy lambda calculus. In the latter calculus equivalence of similarity and contextual approximation can be shown by Howe's method. Using an equivalent but inductive definition of behavioral preorder we then transfer similarity back to the calculus LR.
The translation from the call-by-need letrec calculus into the extended call-by-name lambda calculus is the composition of two translations. The first translation replaces the call-by-need strategy by a call-by-name strategy and its correctness is shown by exploiting infinite trees, which emerge by unfolding the letrec expressions. The second translation encodes letrec-expressions by using multi-fixpoint combinators and its correctness is shown syntactically by comparing reductions of both calculi. A further result of this paper is an isomorphism between the mentioned calculi, and also with a call-by-need letrec calculus with a less complex definition of reduction than LR.
Manfred Schmidt-Schauß; David Sabel; Elena Machkasova (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/34234
Thu, 03 Jul 2014 14:36:22 +0200

Applicative may- and should-simulation in the call-by-value lambda calculus with amb
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/33609
Motivated by the question whether sound and expressive applicative similarities for program calculi with should-convergence exist, this paper investigates expressive applicative similarities for the untyped call-by-value lambda-calculus extended with McCarthy's ambiguous choice operator amb. Soundness of the applicative similarities w.r.t. contextual equivalence based on may- and should-convergence is proved by adapting Howe's method to should-convergence. As usual for nondeterministic calculi, similarity is not complete w.r.t. contextual equivalence, which requires a rather complex counterexample as a witness. Also the call-by-value lambda-calculus with the weaker nondeterministic construct erratic choice is analyzed and sound applicative similarities are provided. This justifies the expectation that also for more expressive and call-by-need higher-order calculi there are sound and powerful similarities for should-convergence.
Manfred Schmidt-Schauß; David Sabel (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/33609
Tue, 24 Jun 2014 16:53:05 +0200

Contextual equivalence for the pi-calculus that can stop
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/33608
The pi-calculus is a well-analyzed model for mobile processes and mobile computations.
While a lot of other process and lambda calculi that are core languages of higher-order concurrent and/or functional programming languages use a contextual semantics observing the termination behavior of programs in all program contexts, traditional program equivalences in the pi-calculus are bisimulations and barbed testing equivalences, which observe the communication capabilities of processes under reduction and in contexts.
There is a distance between these two approaches to program equivalence which makes it hard to compare the pi-calculus with other languages. In this paper we contribute to bridging this gap by investigating a contextual semantics of the synchronous pi-calculus with replication and without sums.
To transfer contextual equivalence to the pi-calculus we add a process Stop as constant which indicates success and is used as the base to define and analyze the contextual equivalence which observes may- and should-convergence of processes.
We show as a main result that contextual equivalence in the pi-calculus with Stop conservatively extends barbed testing equivalence in the (Stop-free) pi-calculus. This implies that results on contextual equivalence can be directly transferred to the (Stop-free) pi-calculus with barbed testing equivalence.
We analyze the contextual ordering, prove some nontrivial process equivalences, and provide proof tools for showing contextual equivalences. Among them are a context lemma and new notions of sound applicative similarities for may- and should-convergence.
David Sabel; Manfred Schmidt-Schauß (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/33608
Tue, 24 Jun 2014 16:37:11 +0200

Observational program calculi and the correctness of translations
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/33607
Motivated by our experience in analyzing properties of translations between programming languages with observational semantics, this paper clarifies the notions, the relevant questions, and the methods, constructs a general framework, and provides several tools for proving various correctness properties of translations like adequacy and full abstractness. The presented framework can directly be applied to the observational equivalences derived from the operational semantics of programming calculi, and also to other situations, and thus has a wide range of applications.
Manfred Schmidt-Schauß; David Sabel; Joachim Niehren; Jan Schwinghammer (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/33607
Tue, 24 Jun 2014 16:18:13 +0200

Extending Abramsky's lazy lambda calculus: (non)-conservativity of embeddings
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/33606
Our motivation is the question whether the lazy lambda calculus, a pure lambda calculus with the leftmost-outermost rewriting strategy considered under observational semantics, or extensions thereof, is an adequate model for semantic equivalences in real-world purely functional programming languages, in particular for a pure core language of Haskell. We explore several extensions of the lazy lambda calculus: addition of a seq-operator, addition of data constructors and case-expressions, and their combination, focusing on conservativity of these extensions. In addition to untyped calculi, we study their monomorphically and polymorphically typed versions. For most of the extensions we obtain non-conservativity, which we prove by providing counterexamples. However, we prove conservativity of the extension by data constructors and case in the monomorphically typed scenario.
Manfred Schmidt-Schauß; Elena Machkasova; David Sabel (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/33606
Tue, 24 Jun 2014 16:06:24 +0200

Extending Abramsky's lazy lambda calculus: (non)-conservativity of embeddings
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/33605
Our motivation is the question whether the lazy lambda calculus, a pure lambda calculus with the leftmost-outermost rewriting strategy considered under observational semantics, or extensions thereof, is an adequate model for semantic equivalences in real-world purely functional programming languages, in particular for a pure core language of Haskell. We explore several extensions of the lazy lambda calculus: addition of a seq-operator, addition of data constructors and case-expressions, and their combination, focusing on conservativity of these extensions. In addition to untyped calculi, we study their monomorphically and polymorphically typed versions. For most of the extensions we obtain non-conservativity, which we prove by providing counterexamples. However, we prove conservativity of the extension by data constructors and case in the monomorphically typed scenario.
Manfred Schmidt-Schauß; Elena Machkasova; David Sabel (working paper)
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/33605
Tue, 24 Jun 2014 15:58:44 +0200
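The non-conservativity results in the abstract above hinge on seq observing more than pure contexts can: in the lazy lambda calculus a diverging term and a lambda abstraction with diverging body are contextually equivalent, but seq separates them. A small GHC demonstration, using undefined as a stand-in for a diverging term (in GHC, undefined raises an exception rather than looping, which lets the demo terminate):

```haskell
import Control.Exception (SomeException, evaluate, try)

-- Observe whether forcing t to weak head normal form succeeds;
-- "diverged" means seq hit the bottom-like term undefined.
observe :: a -> IO String
observe t = do
  r <- try (evaluate (t `seq` "converged")) :: IO (Either SomeException String)
  pure (either (const "diverged") id r)

main :: IO ()
main = do
  observe (undefined :: Int)               >>= putStrLn  -- diverged
  observe ((\_ -> undefined) :: () -> Int) >>= putStrLn  -- converged: a lambda is a value
```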