OPUS 4 Latest Documents RSS Feed: Latest documents
http://publikationen.stub.uni-frankfurt.de/index/index/
Wed, 19 Apr 2006 18:07:13 +0200

Deciding subset relationship of co-inductively defined set constants
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/3046
Static analysis of different non-strict functional programming languages makes use of set constants like Top, Inf, and Bot, denoting all expressions, all lists that never end in a Nil tail, and all non-terminating programs, respectively. We use a set language that permits union, constructors, and recursive definition of set constants with a greatest-fixpoint semantics. This paper proves decidability, in particular EXPTIME-completeness, of the subset relationship of co-inductively defined sets, using algorithms and results from tree automata. This shows decidability of the test for set inclusion, which is required by certain strictness analysis algorithms for lazy functional programming languages.

Authors: Manfred Schmidt-Schauß; David Sabel; Marko Schütz
Document type: working paper
Published: Wed, 19 Apr 2006 18:07:13 +0200

A complete proof of the safety of Nöcker's strictness analysis
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/4394
This paper proves the correctness of Nöcker's method of strictness analysis, implemented in the Clean compiler, which is an effective way of performing strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt on the correctness of the abstract reduction rules. Our method fully covers the cycle detection rules, which are the main strength of Nöcker's strictness analysis. Our algorithm SAL is a reformulation of Nöcker's strictness analysis algorithm in a higher-order call-by-need lambda-calculus with case, constructors, letrec, and seq, extended by set constants like Top or Inf, denoting sets of expressions. It is also possible to define new set constants by recursive equations with a greatest-fixpoint semantics. The operational semantics is a small-step semantics. Equality of expressions is defined by a contextual semantics that observes termination of expressions. Essentially, SAL is a non-termination checker. The proof of its correctness, and hence of Nöcker's strictness analysis, is based mainly on an exact analysis of the lengths of normal order reduction sequences; the main measure is the number of 'essential' reductions in a normal order reduction sequence. Our tools and results provide new insights into call-by-need lambda-calculi, into the role of sharing in functional programming languages, and into strictness analysis in general. The correctness result provides a foundation for Nöcker's strictness analysis in Clean, and also for its use in Haskell.

Authors: Manfred Schmidt-Schauß; Marko Schütz; David Sabel
Document type: working paper
Published: Tue, 14 Jun 2005 12:27:46 +0200

On the safety of Nöcker's strictness analysis
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/4594
This paper proves the correctness of Nöcker's method of strictness analysis, implemented for Clean, which is an effective way of performing strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt, which addresses the correctness of the abstract reduction rules. Our method also addresses the cycle detection rules, which are the main strength of Nöcker's strictness analysis. We reformulate Nöcker's strictness analysis algorithm in a higher-order lambda-calculus with case, constructors, letrec, and a non-deterministic choice operator used as a union operator. Furthermore, the calculus is expressive enough to represent abstract constants like Top or Inf. The operational semantics is a small-step semantics, and equality of expressions is defined by a contextual semantics that observes termination of expressions. The correctness of several reductions is proved using a context lemma and complete sets of forking and commuting diagrams. The proof is based mainly on an exact analysis of the lengths of normal order reductions. However, there remains a small gap: currently, the proof of the correctness of the strictness analysis requires the conjecture that our behavioural preorder is contained in the contextual preorder. The proof is valid without referring to the conjecture if no abstract constants are used in the analysis.

Authors: Manfred Schmidt-Schauß; Marko Schütz; David Sabel
Document type: working paper
Published: Thu, 12 May 2005 16:51:58 +0200

Efficient strictness analysis of Haskell in Haskell using abstract reduction
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/4603
The extraction of strictness information is an indispensable part of an efficient compilation of lazy functional languages like Haskell. Based on the method of abstract reduction, we have developed an efficient strictness analyser for a core language of Haskell. It is completely written in Haskell and compares favourably with known implementations. The implementation is based on the G#-machine, an extension of the G-machine that has been adapted to the needs of abstract reduction.

Authors: Marko Schütz; Manfred Schmidt-Schauß; Sven Eric Panitz
Document type: working paper
Published: Thu, 12 May 2005 16:43:32 +0200

Automatic extraction of context information from programs using abstract reduction
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/4602
This paper describes context analysis, an extension of strictness analysis for lazy functional languages. In particular, it extends Wadler's four-point domain and permits infinitely many abstract values. A calculus based on abstract reduction is presented which, given the abstract value of the result, automatically finds the abstract values of the arguments. The results of the analysis are useful for verification purposes and can also be used in compilers that require strictness information.

Authors: Marko Schütz; Manfred Schmidt-Schauß
Document type: working paper
Published: Thu, 12 May 2005 16:42:29 +0200

The G#-machine: efficient strictness analysis in Haskell
http://publikationen.stub.uni-frankfurt.de/frontdoor/index/index/docId/4597
Authors: Marko Schütz
Document type: working paper
Published: Thu, 12 May 2005 16:37:26 +0200
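To make the idea behind "Deciding subset relationship of co-inductively defined set constants" concrete, here is a minimal sketch in Python. It is a simplification, not the paper's algorithm: the set language is restricted to deterministic definitions (at most one alternative per constructor, no general union), for which a simple coinductive check, where cycles count as success under the greatest-fixpoint reading, decides inclusion; the paper's tree-automata construction handles the full language. The encoding and all names are assumptions made for this sketch.

```python
# Recursively defined set constants as deterministic tree grammars:
#   Top = Nil | Cons(Top, Top)   -- all lists
#   Inf = Cons(Top, Inf)         -- lists that never reach Nil
# Each constant maps a constructor to the tuple of argument constants.
DEFS = {
    "Top": {"Nil": (), "Cons": ("Top", "Top")},
    "Inf": {"Cons": ("Top", "Inf")},
}

def subset(a, b, assumed=None):
    """Coinductive check of a <= b: an inclusion pair encountered
    again on a cycle is taken to hold (greatest-fixpoint semantics)."""
    if assumed is None:
        assumed = set()
    if a == b or (a, b) in assumed:
        return True
    assumed = assumed | {(a, b)}
    for con, args_a in DEFS[a].items():
        args_b = DEFS[b].get(con)
        if args_b is None:
            return False  # b offers no alternative for this constructor
        if not all(subset(x, y, assumed) for x, y in zip(args_a, args_b)):
            return False
    return True

print(subset("Inf", "Top"))  # True: every infinite list is a list
print(subset("Top", "Inf"))  # False: Inf has no Nil alternative
```

Note the role of `assumed`: without treating revisited pairs as successes, the check on the cyclic definitions of Top and Inf would never terminate.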
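The strictness-analysis entries above all build on the idea of abstract reduction: evaluate a function body with an argument bound to an abstract value denoting the non-terminating expressions, and conclude strictness if the result is again that value. The Python sketch below illustrates only this core idea over a two-point domain; it is not Nöcker's algorithm (which adds set constants and cycle detection), and the tiny expression language and evaluator are assumptions made for illustration.

```python
# Two-point abstract domain: BOT = non-terminating, TOP = anything.
BOT, TOP = "Bot", "Top"

def aeval(expr, env):
    """Abstractly evaluate a tuple-encoded expression; env maps
    variable names to BOT or TOP."""
    tag = expr[0]
    if tag == "var":
        return env[expr[1]]
    if tag == "lit":                      # literals always terminate
        return TOP
    if tag == "add":                      # addition is strict in both operands
        a, b = aeval(expr[1], env), aeval(expr[2], env)
        return BOT if BOT in (a, b) else TOP
    if tag == "if":                       # strict in the condition only
        if aeval(expr[1], env) == BOT:
            return BOT
        # join the branches: BOT only if both branches diverge
        t, e = aeval(expr[2], env), aeval(expr[3], env)
        return BOT if t == BOT and e == BOT else TOP
    raise ValueError(f"unknown tag {tag}")

def strict_in(body, arg):
    """f is strict in arg iff f applied to a diverging argument diverges."""
    return aeval(body, {arg: BOT}) == BOT

plus_one = ("add", ("var", "x"), ("lit", 1))            # \x -> x + 1
guarded = ("if", ("lit", True), ("lit", 0), ("var", "x"))  # x only in one branch
print(strict_in(plus_one, "x"))  # True
print(strict_in(guarded, "x"))   # False
```

A compiler can use a `True` answer to evaluate the argument eagerly without changing the program's termination behaviour; the papers above are about proving that such answers are in fact safe.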
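The context-analysis entry extends Wadler's four-point domain for lazy lists; the sketch below shows what that base domain looks like and the kind of fact an abstract function records over it. The encoding as a chain of integers, the shorthand names, and the `abs_length` table are assumptions for illustration, not taken from the paper, which goes beyond this domain to infinitely many abstract values.

```python
# Wadler's four-point domain for lazy lists, a chain
#   BOT < INF < BOTMEM < TOPMEM, where (in my shorthand):
#   BOT    -- spine diverges immediately
#   INF    -- spines that never reach Nil
#   BOTMEM -- finite spines whose elements may be undefined
#   TOPMEM -- all lists
BOT, INF, BOTMEM, TOPMEM = range(4)

def leq(a, b):
    """Domain order; trivial because the domain is a chain."""
    return a <= b

def join(a, b):
    """Least upper bound in a chain is the maximum."""
    return max(a, b)

def abs_length(xs):
    """Abstract value of `length` into a two-point result domain
    (False = diverges, True = may terminate): length forces the whole
    spine but none of the elements, so it diverges on BOT and INF and
    may terminate from BOTMEM upward."""
    return xs >= BOTMEM

print([abs_length(v) for v in (BOT, INF, BOTMEM, TOPMEM)])
# -> [False, False, True, True]
```

Read backwards, the same table says: demanding the result of `length xs` demands at least a finite spine of `xs` while leaving its elements unevaluated, which is exactly the context information the analysis extracts.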