Programming in Lambda Prolog is structured around the definition of modules. In Teyjus, there are three files associated with a module. If the name of the module is <modname> then these files are <modname>.mod, <modname>.sig and <modname>.lp, and they correspond, respectively, to the source code, the signature and the bytecode file generated by the compiler for the module. The user of Teyjus is not concerned with the internal structure of the bytecode file and it suffices for him/her to know merely of its existence. It is necessary, however, to understand the structure and purpose of the signature and source code file in order to produce meaningful Lambda Prolog programs that can be compiled and executed under Teyjus. This chapter presents the structure of these two kinds of objects. Before looking at the syntactic aspects, however, it is useful to understand the intended roles of these two kinds of objects in programming.
Program Structure (Role of Signatures and Modules)
6.1 Characterization of Tokens in Teyjus
6.2 The Structure of a Signature File
6.3 The Structure of Source Code for a Module
6.4 Matching a Module with its Signature
6.5 Using Signatures to Certify Module Interactions
6.6 Builtin Sorts and Type Constructors, Constants and Predicates (A Catalogue of Teyjus Builtins)
6.7 Special Characteristics of the Teyjus Implementation (Some Language Level Idiosyncrasies)
Signatures are associated with a module to determine an external view for it. In particular, a signature identifies names of sorts, type constructors, constants and operators and also might indicate that certain predicates are to be used in the module without changes in their definitions or that a fixed definition for these predicates is to be exported from that module. Signatures are eventually employed in two ways: (a) when a module is being compiled, the signature is used to determine whether the module is "living up to its promise" and also to qualify some of the declarations in the module and (b) when some other module imports or accumulates this one, the signature determines what is externally visible and hence plays a role in certifying the module interactions.
Two technical terms are used in the above paragraph to describe possibilities for module interactions: importing and accumulating. We pause briefly to explain these kinds of interactions. Importation is a process by which the globally visible procedures defined in a particular module become dynamically available in solving the bodies of procedures in another module. Note that actual interactions of this kind manifest themselves only at loading and execution time. Modules that interact through importation are in fact compiled separately in Teyjus, with signatures being used to ensure compatibility and to determine load-time actions and run-time interactions. Accumulation, on the other hand, is mainly a compile-time process: it calls for the accretion of the code in other independently defined modules into a new, larger, module. The vision that accumulation is intended to evoke is that of static inlining of code. However, the inlining is not entirely naive. In Lambda Prolog it is possible to scope constant names and procedure definitions over modules. For names, this is achieved via the notion of a local constant: any constant that is not defined in the signature for the module is interpreted as being local. For procedures, this is manifest in the scopes of imports (and also in explicit implication goals). The inlining corresponding to accumulation is to be done in a way that respects scopes. In particular, locals in an accumulated module are to be treated as being distinct even if there is a coincidence in names with the constants in another accumulated module or in the accumulating module. Similarly, imports in an accumulated module have only the code originating in that module as their scope.
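For a schematic picture of these two forms of interaction, consider a module preamble of the kind sketched below; the module names editor, parser and symtab are invented purely for the illustration.

module editor.
import parser.       % the globally visible predicates of parser may be invoked in clause bodies here
accumulate symtab.   % the code of symtab is accreted into this module, with its local constants kept distinct
% ... the declarations and clauses of editor itself would follow ...
end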
The precise way in which signatures fulfill their intended roles is manifest in the notions of matching a module declaration with its signature and of certifying interactions via imports and accumulates. These notions are explained in more detail in this chapter after a presentation of the syntax of signatures and modules supported by Teyjus.
6.1 Characterization of Tokens in Teyjus
We describe below the rules for recognizing tokens in the different relevant categories. Two points are relevant to mention prior to this description. First, as is standard, the principle of longest match is used in tokenization. The Teyjus lexical analyzer is in fact generated by flex, based on a regular grammar presentation of the tokens and token categories below. Second, far fewer characters are treated as token separators than is usual: only the characters identified as punctuation symbols are interpreted in this manner, over and above `white space' characters. To take a specific example, a sequence such as X+Y is interpreted as a single token rather than as a sequence of three tokens.
6.1.1 Keywords (the reserved words in Teyjus)
6.1.2 Other Special Tokens in Teyjus
6.1.3 Punctuation Symbols (delimiters, grouping tokens and such)
6.1.4 Identifier and Variable Names
6.1.5 Integer Literals
6.1.6 Real Literals
6.1.7 String Literals
6.1.8 The Syntax for Comments
6.1.1 Keywords
The following sequences of characters are recognized as tokens with a special meaning within Teyjus.
module end import accumulate accum_sig local localkind closed sig kind type exportdef useonly infixl infixr infix prefix prefixr postfix postfixl :- \ -> ! |
6.1.2 Other Special Tokens in Teyjus
The following tokens are ones that, in addition to the keywords and punctuation symbols, cannot be redefined by user programs. Some of these symbols--in particular, the symbols pi, sigma, ;, &, =>, =, :: and nil--are constants and are like other identifiers except for the `no redefinition' clause. The remaining symbols are overloaded and so are treated in a special way by the parser. Finally, some symbols, in particular :: and nil, receive a special treatment at the abstract machine level.
pi sigma , ; & => + - * / ~ < > =< >= :: = nil |
6.1.3 Punctuation Symbols
The following symbols serve as delimiters, grouping symbols, etc. Notice that comma appears in the list of pseudo keywords as well; usage is determined by context.
. , ( ) : [ ] | |
6.1.4 Identifier and Variable Names
Any contiguous sequence of alphanumeric or sign characters different from the punctuation characters (i.e. any of the characters +-*/\^<>=`'~?@#$&!_~) that begins with something other than a digit or the sequence /* constitutes a name. A name that begins with anything other than the underscore character may be used as that of a variable bound by an abstraction. Names other than those that begin with the underscore (_) character or an uppercase letter and those included in the special tokens category may be identified as constants through type declarations or sorts or type constructors through kind declarations. Such names are also interpreted as those of (local, undeclared) constants if they are used in clauses without a previous type declaration and they are not bound by enclosing abstractions. A name that begins with the underscore character or with an uppercase letter and that is not bound by an abstraction is considered to be that of a free (implicitly existentially quantified) variable.
Three points should be noted with regard to this definition. First, as already observed and unlike in most other languages, certain characters (like -, *, etc) are not treated as token separators in addition to being tokens. Thus, X-Y is a single token and not a sequence of three tokens. Second, some token sequences such as => and -> appear to be ambiguous, qualifying to be either keywords or names. This ambiguity is resolved by deeming the keyword interpretation to take precedence in all such cases. Finally, some names, such as pi and ::, are treated as special (overloaded or predefined) tokens and these cannot be reinterpreted through type declarations.
6.1.5 Integer Literals
A sequence of digits is recognized as an integer constant. As usual, there are size limitations on integers. In the present implementation, the limitation is that the integer should fit in 25 bits.
6.1.6 Real Literals
A sequence of digits followed by a period and then a sequence of digits is recognized as a real constant. The sequence before the period may be empty but the sequence after the period must be nonempty. There are size and precision restrictions that arise in the present implementation first from the limitations of the implementation language (double in C) and then from those of the abstract machine. The latter are likely to be dominant and are the following: the exponent must fit in 7 bits and the mantissa in 15 bits.
6.1.7 String Literals
These are character sequences enclosed within double quotes. We permit a mix of SICStus Prolog and ML syntax for strings. In particular:
A string constant is a sequence, enclosed between double quotes, of zero or more printable characters (those numbered 33-126), spaces or escape sequences. All the escape sequences barring one start with the character \ and stand for a character sequence. The escape sequences are:
\a       Alert (ASCII 0x07)
\b       Backspace (ASCII 0x08)
\t       Horizontal tab (ASCII 0x09)
\n       Linefeed or newline (ASCII 0x0A)
\v       Vertical tab (ASCII 0x0B)
\f       Form feed (ASCII 0x0C)
\r       Carriage return (ASCII 0x0D)
\e       Escape (ASCII 0x1B)
\d       Delete (ASCII 0x7F)
\\       Backslash
\"       Double quote
\^c      A control character whose encoding is C mod 32, where C is the encoding of the character c, with C in the range [64,122].
\ddd     The character whose encoding is the number ddd, three decimal digits denoting an integer in the range [0,255].
\xhh     The character whose encoding is the number hh, two hexadecimal digits, with a-f (A-F) denoting the `digits' 10-15 respectively.
\f...f\  This sequence is ignored; f...f stands for a sequence of one or more formatting characters (currently restricted to horizontal and vertical tab, whitespace, newline and carriage return).
\c...    This sequence is ignored; ... stands for a sequence of formatting characters ending with a non formatting character.
In the escape sequences involving decimal or hexadecimal digits, the sequence of digits is taken to be the longest sequence of such characters.
The exceptional escape sequence is ""; this stands for a " included inside the string literal.
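For instance, the literal "one line\nand another" denotes a string containing an embedded newline, and "he said ""hello""" denotes a string in which the word hello is enclosed in double quote characters.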
6.1.8 The Syntax for Comments
Two styles for writing comments are supported. Multiline comments may be enclosed within a matched pair of /* and */; nesting of comments is legal. Single line comments are preceded by % and extend to the end of the line.
6.2 The Structure of a Signature File
The Teyjus parser for signatures consists of a part that is automatically generated by bison from a context-free grammar and of a post processor that manipulates the (pre) abstract syntax structures generated from the grammar. The post processing phase is needed to handle information such as arities of type constructors that is also gleaned through the parse. The grammar below represents the effect of the combined processing and, therefore, differs from the one presented to bison.
6.2.1 A BNF Specification of Signature Syntax
6.2.2 Commentary on Signature Syntax
6.2.1 A BNF Specification of Signature Syntax
<Signature>   ::= <SigHeader> <SigPreamble> <Sign-Decls> <SigEnd>
<SigHeader>   ::= 'sig' <NameToken> '.'
<SigEnd>      ::= 'end' | '$'        % $ stands for eof
<SigPreamble> ::= <Empty>
                | 'accum_sig' <NameTokens> '.' <SigPreamble>
<NameTokens>  ::= <NameToken> | <NameToken> ',' <NameTokens>
<Sign-Decls>  ::= <Empty> | <Sign-Decl> <Sign-Decls>
<Sign-Decl>   ::= 'kind' <Ids> <Kind> '.'
                | 'type' <Ids> <Type> '.'
                | <Fixity> <Ids> <SmallInt> '.'
                | 'exportdef' <Ids> '.'
                | 'exportdef' <Ids> <Type> '.'
                | 'useonly' <Ids> '.'
                | 'useonly' <Ids> <Type> '.'
<Ids>         ::= <Id> | <Id> ',' <Ids>
<Kind>        ::= 'type' | 'type' '->' <Kind>
<Type>        ::= <CType> '->' <Type> | <CType>
<CType>       ::= <NameToken> | <TyCIdent> <CTypes> | '(' <Type> ')'
<TyCIdent>    ::= <Id>
<CTypes>      ::= <CType> | <CType> <CTypes>
<Fixity>      ::= 'infixl' | 'infixr' | 'infix' | 'prefix'
                | 'prefixr' | 'postfix' | 'postfixl'
<NameToken>   ::= { Any Teyjus token distinct from keywords, pseudo-keywords,
                    integer, string and real literals }
<Id>          ::= { Like NameToken, except that the token should also begin with
                    something different from an uppercase letter and _ }
<SmallInt>    ::= { Integer value between 0 and 255; i.e. unsigned one byte number }
<Empty>       ::= { empty token sequence }
6.2.2 Commentary on Signature Syntax
The declarations in a signature file must satisfy certain properties in addition to being derivable from the grammar rules. For instance, a declaration of the form
kind foo type -> ... -> type
with n+1 occurrences of type identifies foo as a type constructor of arity n. (If n is 0 the object identified is a sort.) Note that there are restrictions built into the grammar rules on the syntax of tokens that can be identified as sorts and type constructors.
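As a concrete illustration (the names item and pair are invented for the purpose), the declarations

kind  item   type.
kind  pair   type -> type -> type.

identify item as a sort and pair as a type constructor of arity 2.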
Fixity declarations have the form

<Fixity> <Ids> <SmallInt> '.'
Such declarations identify relevant (term) constants as being infix, prefix or postfix operators that are not associative, left associative or right associative and having a specified precedence. Types of constants that are defined as operators must be consistent with their interpretation as such.
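The following is a small but complete signature of the kind these rules describe; it would be stored in a file named stack.sig, and all the names appearing in it are invented for the illustration.

sig stack.

kind  stack   type -> type.                        % stack is a unary type constructor
type  emp     stack A.                             % the empty stack
type  stk     A -> stack A -> stack A.             % adding an item to a stack
exportdef  push   A -> stack A -> stack A -> o.    % predicates whose definitions this module exports
exportdef  pop    A -> stack A -> stack A -> o.

end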
6.3 The Structure of Source Code for a Module
We describe here the structure of declarations constituting the source code for a module that is contained in a file called <modname>.mod. As already explained, this source code file goes hand-in-hand with the signature file for the module and requires the latter to be used in its interpretation.
The Teyjus parser for the source code file consists of a part automatically generated by bison from a context-free grammar and a post processor that manipulates the (pre) abstract syntax structures generated by the bison based parser. The post processing phase is needed to handle information such as arities of type constructors and operator fixities and precedence that is also gleaned through the parse. The grammar below represents the effect of the combined processing and differs for this reason from the one presented to bison.
6.3.1 A BNF Specification of Module Syntax
6.3.2 Commentary on Module Syntax
6.3.1 A BNF Specification of Module Syntax
The rules below describe the syntax of the declarations that may appear in a module file. These grammar rules build on the ones already presented while explaining the syntax of signatures in A BNF Specification of Signature Syntax. Thus, to understand this grammar, it may be necessary to consult also the grammar for signatures.
<Module>       ::= <ModHeader> <ModPreamble> <ModBody> <ModEnd>
<ModHeader>    ::= 'module' <NameToken> '.'
<ModEnd>       ::= '$'        % $ stands for eof
                 | 'end'
<ModPreamble>  ::= <Empty>
                 | 'import' <NameTokens> '.' <ModPreamble>
                 | 'accumulate' <NameTokens> '.' <ModPreamble>
                 | 'accum_sig' <NameTokens> '.' <ModPreamble>
<ModBody>      ::= <Empty>
                 | <ModSignDecl> <ModBody>
                 | <ModClause> <ModBody>
<ModSignDecl>  ::= <Sign-Decl>
                 | 'local' <Ids> '.'
                 | 'local' <Ids> <Type> '.'
                 | 'localkind' <Ids> '.'
                 | 'localkind' <Ids> <Kind> '.'
                 | 'closed' <Ids> '.'
                 | 'closed' <Ids> <Type> '.'
<ModClause>    ::= <Clause> '.'
<Clause>       ::= <Atom>
                 | <Clause> ':-' <Goal>
                 | <Goal> '=>' <Clause>
                 | <Clause> ',' <Clause>
                 | <Clause> '&' <Clause>
                 | <PiId> <TypedId> '\' <Clause>
                 | <PiId> <CTerm>
                 | '(' <Clause> ')'
<Goal>         ::= '!'
                 | <PiId> <Term>
                 | <PiId> <TypedId> '\' <Goal>
                 | <SigmaId> <Term>
                 | <SigmaId> <TypedId> '\' <Goal>
                 | <Goal> ';' <Goal>
                 | <Goal> ',' <Goal>
                 | <Goal> '&' <Goal>
                 | <Clause> '=>' <Goal>
                 | <Atom>
                 | <FlexAtom>
                 | '(' <Goal> ')'
<Atom>         ::= <Term>   { the type of the term must be boolean and its main functor must be
                              a constant or a variable bound by an enclosing essential universal
                              quantifier }
<FlexAtom>     ::= <TypedVar>
                 | <FlexAtom> <AppTerm>
                 | '(' <FlexAtom> ')'
<CTerm>        ::= <Term>   { the main functor of the term must be a constant or a variable bound
                              by an enclosing essential universal quantifier }
<Term>         ::= <Term> <TypedCIdent> <Term>
                 | <Term> <TypedCIdent>
                 | <TypedCIdent> <Term>
                 | <Term> <Term>
                 | <TypedId> '\' <Term>
                 | '[' ']'                          % Prolog list notation
                 | '[' <TermList> '|' <Term> ']'    % Prolog list notation
                 | <TypedCIdent>
                 | <TypedVar>
                 | <TypedId>
                 | '(' <Term> ')'
<TermList>     ::= <Term> | <Term> ',' <TermList>
<PiId>         ::= 'pi' | 'pi' ':' <Type> | '(' <PiId> ')'
<SigmaId>      ::= 'sigma' | 'sigma' ':' <Type> | '(' <SigmaId> ')'
<TypedId>      ::= <AbsToken> | <AbsToken> ':' <Type> | '(' <TypedId> ')'
<TypedVar>     ::= <VIdent> | <VIdent> ':' <Type> | '(' <TypedVar> ')'
<TypedCIdent>  ::= <SpecialOp> | <Id> | <Id> ':' <Type> | '(' <TypedCIdent> ')'
<SpecialOp>    ::= { A pseudo keyword corresponding to one of the overloaded operators }
<AbsToken>     ::= { A <NameToken> that does not begin with _ }
<VIdent>       ::= { A Teyjus token that begins with an uppercase letter or _ and that is not
                     the variable bound by an enclosing abstraction }
6.3.2 Commentary on Module Syntax
The module declaration must satisfy certain properties in addition to adhering to the syntax rules. In particular, in the form
<Term> <TypedCIdent> <Term>
that appears in the rules for <Term>, the identifier must have been declared as an infix operator. Similarly, in the form
<Term> <TypedCIdent>
that appears in the rule for <Term>, the identifier must have been declared as a postfix operator and in the form
<TypedCIdent> <Term> |
the identifier should have been declared as a prefix operator. The rule for <Term> is obviously ambiguous as written. Disambiguation is to be effected using the precedence and associativity of the relevant operators. In this context, it is also to be noted that abstraction and application are treated like (infix) operators: application is left associative and has higher precedence than all other operators, while abstraction is right associative and has lower precedence than all other operators. A further point concerns the comma, which serves both as a conjunction and as a separator of elements in the Prolog-style list notation; some care is therefore needed when terms of predicate type appear within lists. In a list written as
[ p,(q,r),s]
the middle comma is forced to be interpreted as conjunction. In the other direction, if p, q and r are of predicate type, we may want to construct a list that has these three as elements. This cannot be forced using the Prolog-style list notation; the list
[p, q, r]
will only be read as a singleton list. However, to depict such a list, we can use the :: and nil based representation instead.
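As an illustration of these rules, a module file stack.mod matching the (invented) signature sketched at the end of the commentary on signature syntax might read as follows.

module stack.

% push and pop are declared in stack.sig and so need no type declarations here.
push X S (stk X S).
pop  X (stk X S) S.

% isempty does not appear in stack.sig and is therefore visible only within this module.
local  isempty   stack A -> o.
isempty emp.

end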
6.4 Matching a Module with its Signature
The process of matching a signature with the declarations in the source file, i.e. the file with extension .mod, has three essential effects that emerge from the processing described below.
The broad manner of processing is the following. First, all type and kind declarations in a module are processed to extract lists of global and local constant, sort and type constructor declarations. The global lists are enhanced by the declarations in accumulated signatures and in the signatures of imported modules. The global lists are further enhanced by global declarations in accumulated modules. The locally or implicitly declared constants, kinds and type constructors in such modules are added as new objects of these categories distinct from any other symbols that are or will be encountered in processing. The global lists are now filtered by the lists obtained from the signature for this module, with the eliminated objects going to relevant local lists. At each stage in this processing, the declared type and arity of objects in a given category having identical names in any given context must match.
The simplified signature of global and local sorts, type constructors and constants is used to analyze the clauses in the module. In the process, new constants may be encountered and these may be added to the relevant local lists. The inference of types for undeclared constants may also be supported by this process. A check must be made in the course of processing clauses to ensure that useonly predicates do not have defining clauses either at the top-level or in the antecedents of implication goals.
6.5 Using Signatures to Certify Module Interactions
Part of the process of certifying module interactions involves checking for compatibility between symbols that are global to an imported module and the definitions of these symbols in the importing module or in the signatures of sibling imported modules. This aspect is covered under the topic of matching a module with its signature. The other part of the process concerns ensuring that restrictions imposed by closed, exportdef and useonly declarations are satisfied. This is done as follows:
We assume that we are given a list of closed predicates at the outset; these predicates are ones defined to be closed in some module within whose accumulate structure the present one appears embedded. We prune this list first by eliminating all the predicates that are either not global in the present module or are defined to be useonly. We then add to the list all the predicates defined to be closed within the present module. Modules accumulated into the present context are then to be processed in the same fashion as the one being described, except that the incoming list of closed predicates is the newly derived one. With regard to each imported module, the requirement is that if any of the predicates in the newly derived list is global to it, then it is also defined to be useonly.
One aspect to check is the compatibility of the `parent' module with those imported or accumulated by it. For this purpose, we (conceptually) collect all the exportdef predicates from the signatures of the imported or accumulated modules. We then check that the resulting list consists of only non-global constants or of global predicates that are also of the exportdef kind in the parent module. Further, none of the predicates in this list should have defining clauses in the parent module.
The other aspect to check is compatibility between sibling imported and accumulated modules. For this purpose, it is necessary to determine that the exportdef declaration of each predicate in one module is complemented by a useonly declaration for that predicate in a sibling module if that predicate is global there; a schematic illustration of this pairing appears at the end of this section.
The role of closed declarations from an outwards perspective is identical to that of exportdef declarations and processing relevant to it therefore follows the lines described in the preceding item.
Within the main module and each accumulated module, we check that there are no clauses defining useonly predicates as described in the section on matching a module with its signature. Then for each accumulated/imported module and useonly constant we check that the constant is either not global in that module or is defined to be useonly in it.
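To make the pairing of exportdef and useonly declarations concrete, suppose that a module imports two sibling modules arith and client, where arith provides the clauses for a predicate sum and client merely invokes it; all of these names are invented for the purpose. One arrangement consistent with the checks described above is for the two signatures to contain

sig arith.
exportdef  sum   int -> int -> int -> o.
end

sig client.
useonly    sum   int -> int -> int -> o.
end

The checks then ensure, among other things, that neither client nor the importing module contributes clauses for sum.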
6.6 Builtin Sorts and Type Constructors, Constants and Predicates
The builtin notions supported by Teyjus are sub-categorized and described by the material below.
6.6.1 Sorts and Type Constructors
6.6.2 Logical Constants and Primitive Goals
6.6.3 Pervasive Constants (Predefined Non Boolean Type Constants)
6.6.4 Builtin Comparison Predicates
6.6.5 Input/Output and other Side Effecting Predicates
6.6.1 Sorts and Type Constructors
Teyjus embodies an understanding of the following primitive types
int          for integers in an implementation determined range; Teyjus supports integers in the range (-2**25, 2**25)
real         for reals in an implementation determined range; Teyjus currently supports a representation based on a 15 bit mantissa and a 7 bit exponent
string       for the type of strings
in_stream    for input streams
out_stream   for output streams
o            for the type of propositions
Teyjus also supports the following type constructors
list         the unary type constructor for lists
6.6.2 Logical Constants and Primitive Goals
The following constants are recognized by the Teyjus parser as logical constants or primitive goals; the types and interpretations of these symbols are indicated alongside them. Some of the symbols listed are treated as keywords (see section Characterization of Tokens in Teyjus) and so cannot be redefined. Others can be redefined by type declarations in user programs.
true     of type o, representing the true proposition
!        of type o, representing the Prolog cut
fail     of type o, representing the goal that always fails
halt     of type o, representing a goal that terminates interaction
stop     of type o, representing a goal that terminates interaction
,        of type o -> o -> o, representing conjunction
&        of type o -> o -> o, representing conjunction
;        of type o -> o -> o, representing disjunction
:-       of type o -> o -> o, representing Prolog style implication
=>       of type o -> o -> o, representing implication written right
sigma    of type (A -> o) -> o, representing existential quantification
pi       of type (A -> o) -> o, representing universal quantification
not      of type o -> o, representing Prolog-style negation-by-failure
=        of type A -> A -> o, representing (Lambda Prolog) equality
The predicates , (comma), &, ; (semicolon), :- and => are all actually recognized as infix operators with the following characteristics: comma is left associative with precedence level 110, & is right associative with precedence level 120, semicolon is left associative with precedence level 100, :- is non associative with precedence level 0 and => is right associative with precedence level 130.
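As an illustration of how several of these constants work together, the clause below, a standard Lambda Prolog idiom rather than anything built into Teyjus, defines list reversal by introducing an auxiliary predicate with pi and scoping its clauses over a subgoal with =>.

reverse L K :-
  pi rv \ (
     (rv nil K) =>                                            % base case for the auxiliary predicate
     (pi X \ pi N \ pi M \ (rv (X::N) M :- rv N (X::M))) =>   % its recursive clause
     (rv L nil)).                                             % the goal solved under these assumptions

A goal such as (reverse (1::2::nil) R) would then bind R to (2::1::nil).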
6.6.3 Pervasive Constants
The following symbols are recognized as constants with the associated attributes. Those of these symbols that are not categorized as keywords (see section Characterization of Tokens in Teyjus) can be redefined by the user.
~
  overloaded prefix operator of type (int -> int) and (real -> real). Nonassociative and with higher precedence than everything else except application.
+
  overloaded infix left associative operator with precedence level 150 and type either (int -> int -> int) or (real -> real -> real).
-
  overloaded infix left associative operator with precedence level 150 and type either (int -> int -> int) or (real -> real -> real).
*
  overloaded infix left associative operator with precedence level 160 and type either (int -> int -> int) or (real -> real -> real).
div
  infix left associative operator with precedence level 160 and type int -> int -> int.
/
  infix left associative operator with precedence level 160 and type real -> real -> real.
mod
  infix left associative operator with precedence level 160 and type int -> int -> int.
int_to_real
  of type int -> real.
abs
  of type int -> int.
sqrt
  of type real -> real.
sin
  of type real -> real. Argument is in radians.
cos
  of type real -> real. Argument is in radians.
arctan
  of type real -> real. Result (when evaluation is forced) is in radians.
ln
  of type real -> real.
floor
  of type real -> int.
ceil
  of type real -> int.
truncate
  of type real -> int.
rabs
  of type real -> real.
^
  of type string -> string -> string, representing string concatenation.
size
  of type string -> int, representing the length function on strings.
chr
  of type int -> string.
string_to_int
  of type string -> int, representing the usual ord function on characters (encoded as single character strings).
substring
  of type string -> int -> int -> string. When evaluated, (substring s i j) returns the string starting at position i (zero indexed) in the string s and ending at position (i + j). The result is undefined if i < 0 or the length of s is less than i + j.
int_to_string
  of type int -> string. When evaluated, the result returned is a string rendition of the integer value of the first argument.
real_to_string
  of type real -> string. When evaluated, the result returned is a string rendition of the real value of the first argument.
std_in
  of type in_stream, representing the standard input stream
std_out
  of type out_stream, representing the standard output stream
std_err
  of type out_stream, representing the standard error stream
nil
  of type (list A), representing the (polymorphic) nil list
::
  of type (A -> (list A) -> (list A)), representing the usual list constructor.
6.6.4 Builtin Comparison Predicates
The following comparison operators are supported by Teyjus.
<   >   =<   >=
These operators are overloaded in an ad hoc way. In particular, they each have the type A -> A -> o, where A may be instantiated by the types int, real and string. All these operators are of the infix variety, none is associative and each has the precedence level of 130.
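For instance, 2 < 3, 2.5 >= 1.0 and "abc" =< "abd" are all well formed goals, whereas a goal such as 2 < 3.0 is ill typed, since the two arguments must instantiate A with the same type.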
6.6.5 Input/Output and other Side Effecting Predicates
The following additional predicates are treated as builtin ones in Teyjus. Their interpretation is, for the most part, true to that in Terzo, the ML based interpreter for Lambda Prolog. A small example that combines a few of these predicates appears at the end of this list.
is
  This is the evaluation predicate familiar from Prolog. It is actually defined as an infix, non associative operator with precedence level 130. This predicate is overloaded in an ad hoc way: it can be used at the types int -> int -> o, real -> real -> o and string -> string -> o. The determination of the specific type can be delayed to run-time and an illegal usage may result in a run-time type error.
time
  Predicate type: int -> int -> o. Calls the Unix 'gettimeofday' function. Instantiates the first and second arguments to the number of seconds and microseconds respectively, since 00:00 Universal Coordinated Time, January 1, 1970. The arguments must be uninstantiated variables.
getenv
  Predicate type: string -> string -> o. Calls the Unix 'getenv' function to return the value of environment variables. The first argument should be instantiated to the name of the environment variable. The second argument will be unified with the value of the variable.
open_in
  Predicate type: string -> in_stream -> o. Its first argument must be instantiated to a string literal and its second argument should be an uninstantiated variable. Its effect is to open a file with name given by the string literal as an input stream. It is an error if a file of the relevant name does not exist.
open_out, open_append
  Predicate type: string -> out_stream -> o. Its first argument must be instantiated to a string literal and its second argument should be an unbound variable. Its effect is to open a file with name given by the string literal as an output stream and to set the variable argument to this stream. If a file of the given name does not exist, then a new one is created. If such a file exists, then open_out initializes this file before it sets it up for writing whereas open_append begins writing at the end of the file.
open_string
  Predicate type: string -> in_stream -> o. Its first argument must be instantiated to a string literal and its second argument must be an unbound variable. Its effect is to bind the variable to a stream containing the text in the string argument.
close_in
  Predicate type: in_stream -> o. Its argument must be instantiated to an open input stream. The effect is to close this stream.
close_out
  Predicate type: out_stream -> o. Its argument must be instantiated to an open output stream. The effect is to close this stream.
term_to_string
  Predicate type: A -> string -> o. A character string representation of the first argument is constructed and is unified with the second argument.
string_to_term
  Predicate type: string -> A -> o. The first argument must be instantiated to a string. The effect is to parse the portion of the string up to a period token (that excludes the decimal point) into a term and to try to unify this with the second argument. This predicate may fail because of type disparities as well.
input
  Predicate type: in_stream -> int -> string -> o. (input strm i str) inputs at most i characters from strm and unifies this with str. This predicate blocks until i characters are available or EOF is reached. The second argument must be instantiated to a non-negative integer and the first to an open in_stream.
output
  Predicate type: out_stream -> string -> o. Outputs the string argument to the stream represented by the first argument. Both arguments must be instantiated, the first to an open output stream and the second to a string literal.
input_line
  Predicate type: in_stream -> string -> o. The first argument must be instantiated to an open input stream. The predicate blocks till a newline or EOF is present in the input stream, then reads all the characters up to and including the EOF or newline and unifies this with the string argument.
lookahead
  Predicate type: in_stream -> string -> o. The first argument must be instantiated to an open input stream. The next character in the stream is looked up (but not consumed) and the result is unified with the string argument. The predicate blocks till a character or EOF is available.
eof
  Predicate type: in_stream -> o. The argument must be instantiated to an open input stream. Succeeds if the end of stream has been reached and fails otherwise.
flush
  Predicate type: out_stream -> o. The argument must be instantiated to an open output stream. Any buffered output is written out to the stream.
print
  Predicate type: string -> o. This predicate is equivalent to (output std_out).
read
  Predicate type: A -> o. This predicate reads in the character sequence up to and including a newline or EOF from std_in, attempts to parse this as a period terminated term, and unifies the result with the argument. Note: This predicate flushes the input beyond the period in the current line.
printterm
  Predicate type: out_stream -> A -> o. The term constituting the second argument is written out to the out_stream given by the first argument. The first argument must be instantiated to an open out_stream.
readterm
  Predicate type: in_stream -> A -> o. The first argument must be instantiated to an open in_stream. A period terminated string is read in from the in_stream, this is parsed as a term and then unified with the second argument. Note: This predicate does not flush the input from the current line even when the input port is std_in.
open_socket
  Predicate type: string -> int -> in_stream -> out_stream -> o. Establishes a TCP socket. The first argument should be instantiated to a host name such as 'tulip.cs.umn.edu' or an IP address such as '128.101.35.21'. The second argument should be instantiated to the port number of the service. The third and fourth arguments should be uninstantiated variables that will be instantiated respectively to input and output streams associated with the socket. This predicate only works under Unix/Linux platforms.
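As a small usage sketch that combines a few of the predicates above (and assumes the names identified for them here), the following clause copies one line from the standard input to a file whose name is chosen purely for illustration.

copyline :-
  open_out "copy.txt" Out,       % open the output file
  input_line std_in Line,        % read a line, including its newline, from standard input
  output Out Line,               % write it to the file
  close_out Out.                 % close the output stream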
6.7 Special Characteristics of the Teyjus Implementation
Listed below are some special characteristics of the Teyjus system, knowledge of which may be relevant to programming. One such characteristic concerns the scope of the cut when it occurs inside a disjunction. Thus, in solving a goal of the form
G1, !, G2 ; G3 ; G4
the crossing of the cut removes alternative solution paths only within G1. If there is a failure in solving G2, the disjunctive goal may still be solved by solving G3 or G4.