2 results for non-standard lexical functions
in Boston University Digital Common
Abstract:
If every lambda-abstraction in a lambda-term M binds at most one variable occurrence, then M is said to be "linear". Many questions about linear lambda-terms are relatively easy to answer, e.g., they are all beta-strongly normalizing and all simply-typable. We extend the syntax of the standard lambda-calculus L to a non-standard lambda-calculus L^ satisfying a linearity condition that generalizes the notion in the standard case. Specifically, in L^ a subterm Q of a term M can be applied to several subterms R1, ..., Rk in parallel, which we write as (Q . R1 \wedge ... \wedge Rk). The appropriate notion of beta-reduction beta^ for the calculus L^ is such that, if Q is the lambda-abstraction (\lambda x.P) with m \geq 0 bound occurrences of x, the reduction can be carried out provided k = max(m,1). Every M in L^ is thus beta^-SN. We relate standard beta-reduction and non-standard beta^-reduction in several different ways, and draw several consequences, e.g., a new, simple proof of the fact that a standard term M is beta-SN iff M can be assigned a so-called "intersection" type (the "top" type disallowed).
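To make the side condition concrete: a lambda-abstraction binding m occurrences of x must be applied to exactly max(m, 1) arguments in parallel, one per occurrence (with one dummy argument when m = 0). The minimal Python sketch below checks that condition on a toy term representation; the class and function names (Var, Lam, ParApp, beta_hat_redex_ok) are our own illustrative choices, not notation from the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    var: str
    body: "Term"

@dataclass(frozen=True)
class ParApp:
    func: "Term"            # Q
    args: tuple             # (R1, ..., Rk), applied in parallel

def occurrences(term, name):
    """Count free occurrences of `name` in `term` (the m of the abstract)."""
    if isinstance(term, Var):
        return 1 if term.name == name else 0
    if isinstance(term, Lam):
        return 0 if term.var == name else occurrences(term.body, name)
    if isinstance(term, ParApp):
        return occurrences(term.func, name) + sum(
            occurrences(a, name) for a in term.args)
    raise TypeError(term)

def beta_hat_redex_ok(redex):
    """beta^ side condition: (\\lambda x.P) applied in parallel to R1,...,Rk
    may reduce only when k = max(m, 1), m = # bound occurrences of x in P."""
    if not (isinstance(redex, ParApp) and isinstance(redex.func, Lam)):
        return False
    m = occurrences(redex.func.body, redex.func.var)
    return len(redex.args) == max(m, 1)

# \x.(x . x) has m = 2 occurrences of x, so it takes exactly 2 parallel args.
dup = Lam("x", ParApp(Var("x"), (Var("x"),)))
print(beta_hat_redex_ok(ParApp(dup, (Var("a"), Var("b")))))  # True:  k = 2
print(beta_hat_redex_ok(ParApp(dup, (Var("a"),))))           # False: k = 1 != 2
```

Because every parallel application must supply exactly one argument per bound occurrence, no occurrence of a variable is ever duplicated by substitution, which is the intuition behind every term of L^ being beta^-strongly normalizing.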
Abstract:
We wish to construct a realization theory of stable neural networks and to use this theory to model the variety of stable dynamics apparent in natural data. Such a theory should have numerous applications to constructing specific artificial neural networks with desired dynamical behavior. The networks used in this theory should have well-understood dynamics yet be as diverse as possible, so as to capture natural diversity. In this article, we describe a parameterized family of higher-order, gradient-like neural networks which have known arbitrary equilibria with unstable manifolds of known, specified dimension. Moreover, any system with hyperbolic dynamics is conjugate to one of these systems in a neighborhood of the equilibrium points. Prior work on how to synthesize attractors using dynamical systems theory, optimization, or direct parametric fits to known stable systems is either non-constructive, lacks generality, or has unspecified attracting equilibria. More specifically, we construct a parameterized family of gradient-like neural networks with a simple feedback rule which will generate equilibrium points with a set of unstable manifolds of specified dimension. Strict Lyapunov functions and nested periodic orbits are obtained for these systems and used as a method of synthesis to generate a large family of systems with the same local dynamics. This work is applied to show how one can interpolate finite sets of data on nested periodic orbits.
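As a rough illustration of the kind of object this abstract describes (not the paper's actual parameterized family of higher-order networks), the sketch below builds a plain gradient system x' = -grad V(x) whose equilibrium x* has an unstable manifold of a chosen dimension k: taking V(x) = (1/2)(x - x*)^T D (x - x*) with a diagonal D having exactly k negative entries makes the linearization at x* hyperbolic with exactly k expanding directions. The function name and the Euler integration are illustrative assumptions.

```python
import numpy as np

def make_gradient_field(x_star, unstable_dim):
    """Gradient field -grad V for V(x) = 0.5 (x - x*)^T D (x - x*), where the
    diagonal D has `unstable_dim` negative entries.  The equilibrium x* is
    then hyperbolic with an unstable manifold of dimension `unstable_dim`."""
    n = len(x_star)
    assert 0 <= unstable_dim <= n
    d = np.ones(n)
    d[:unstable_dim] = -1.0          # negative entries -> unstable directions
    return (lambda x: -d * (x - x_star)), d

# A saddle in R^3 with a 1-dimensional unstable manifold at the origin.
x_star = np.zeros(3)
f, d = make_gradient_field(x_star, unstable_dim=1)

# Forward-Euler integration from a point slightly off the stable manifold:
x = np.array([1e-3, 1.0, -0.5])
for _ in range(2000):
    x = x + 0.01 * f(x)
print(x)  # first (unstable) component has grown; the other two have decayed
```

In this toy case V itself is a strict Lyapunov function away from the equilibrium along stable directions; the paper's construction additionally handles feedback terms, conjugacy to arbitrary hyperbolic local dynamics, and nested periodic orbits, none of which this sketch attempts.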