871 results for propositional linear-time temporal logic
Abstract:
During locomotion, turning is a common and recurring event that is largely neglected in current state-of-the-art ankle-foot prostheses, forcing amputees to use steering mechanisms different from those of non-amputees. A better understanding of the complexities surrounding lower limb prostheses will lead to increased health and well-being of amputees. The aim of this research is to develop a steerable ankle-foot prosthesis that mimics the mechanical properties of the human ankle. Experiments were conducted to estimate the mechanical impedance of the ankle and the ankle angles during straight walking and step turns. This information was then used in the design of a prototype powered steerable ankle-foot prosthesis with two controllable degrees of freedom. One possible approach in the design of prosthetic robots is to use the parameters of the human joints, especially their impedance. A series of experiments was conducted to estimate the stochastic mechanical impedance of the human ankle with the muscles fully relaxed and co-contracting antagonistically. A rehabilitation robot for the ankle, Anklebot, was employed to apply torque perturbations to the ankle. The experiments were performed in two configurations: one with relaxed muscles and one at 10% of maximum voluntary contraction (MVC). Surface electromyography (sEMG) was used to monitor muscle activation levels, and these sEMG signals were displayed to subjects who attempted to keep them constant. Time histories of ankle torques and angles in the medial/lateral (ML), inversion/eversion (IE), and dorsiflexion/plantarflexion (DP) directions were recorded. Linear time-invariant transfer functions between the measured torques and angles were estimated, providing an estimate of ankle mechanical impedance. High coherence was observed over a frequency range up to 30 Hz. The main effect of muscle activation was to increase the magnitude of ankle mechanical impedance in all degrees of freedom of the ankle. Another experiment compared the three-dimensional angles of the ankle during step turns and straight walking; these angles were measured for use in developing the control strategy of the ankle-foot prosthesis. An infrared camera system was used to track the trajectories and angles of the foot and leg. The combined phases of heel strike and loading response, mid-stance, and terminal stance and pre-swing were determined and used to measure the average angles at each combined phase. The range of motion (ROM) in IE increased during turning, while ML rotation decreased and DP changed the least. During the turning step, ankle displacement in DP started at angles similar to straight walking and progressively showed less plantarflexion. In IE, the ankle showed increased inversion, leaning the body toward the inside of the turn. ML rotation began with increased medial rotation during the step turn relative to the straight walk, transitioning to increased lateral rotation at toe-off. A prototype ankle-foot prosthesis capable of controlling both DP and IE using a cable-driven mechanism was developed and assessed as part of a feasibility study. The design can reproduce the angles required for straight walking and step turns, generates 712 N of lifting force in plantarflexion, and shows passive stiffness comparable to that of a non-load-bearing ankle. To evaluate the performance of the ankle-foot prosthesis, a circular treadmill was developed to mimic human gait during steering. Preliminary results show that the device can appropriately simulate human gait, loading and unloading the ankle joint, during gait along circular paths.
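For illustration, a minimal sketch (not the authors' code; the sampling rate and records are stand-ins) of how a linear time-invariant frequency response and its coherence might be estimated from torque and angle time histories:

```python
# Minimal sketch: H1 estimate of the transfer function between a torque
# perturbation and the resulting ankle angle, plus the coherence used to
# judge the linear fit; fs and the signals below are stand-ins.
import numpy as np
from scipy import signal

fs = 200.0                                            # assumed sampling rate [Hz]
rng = np.random.default_rng(0)
torque = rng.standard_normal(60 * int(fs))            # stand-in perturbation record
angle = signal.lfilter([0.02], [1.0, -0.95], torque)  # stand-in ankle response

f, Pxx = signal.welch(torque, fs=fs, nperseg=1024)        # input auto-spectrum
_, Pxy = signal.csd(torque, angle, fs=fs, nperseg=1024)   # cross-spectrum
_, Cxy = signal.coherence(torque, angle, fs=fs, nperseg=1024)

admittance = Pxy / Pxx           # angle per unit torque (H1 estimate)
impedance = 1.0 / admittance     # torque per unit angle: mechanical impedance

band = f <= 30.0                 # the abstract reports high coherence up to 30 Hz
print(np.abs(impedance[band]).round(2), Cxy[band].round(2))
```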
Abstract:
Pack ice in the Bellingshausen Sea contained moderate to high stocks of microalgal biomass (3-10 mg Chl a/m²) spanning the range of general sea-ice microalgal microhabitats (e.g., bottom, interior, and surface) during the International Polar Year (IPY) Sea Ice Mass Balance in the Antarctic (SIMBA) studies. Measurements of irradiance above and beneath the ice, as well as optical properties of the microalgae therein, demonstrated that absorption of photosynthetically active radiation (PAR) by particulates (microalgae and detritus) had a substantial influence on attenuation of PAR and irradiance transmission in areas with moderate snow cover (0.2-0.3 m), and more moderate effects in areas with low snow cover. Particulates contributed an estimated 25 to 90% of the attenuation coefficients for the first-year sea ice at wavelengths below 500 nm. Strong ultraviolet radiation (UVR) absorption by particulates was prevalent in the ice habitats where solar radiation was highest, with absorption coefficients of the ice algae often being as large as those of the sea ice itself. Strong UVR-absorption features were associated with an abundance of dinoflagellates and a general lack of diatoms, perhaps suggesting that UVR may be influencing the structure of some parts of the sea-ice microbial communities in the pack ice during spring. We also evaluated the time-varying changes in the spectra of under-ice irradiances in the austral spring and showed dynamics that could be attributed to coupled changes in ice thickness (mass balance) and microalgal biomass. All results are indicative of radiation-induced changes in the absorption properties of the pack ice and highlight the non-linear, time-varying, biophysical interactions operating within the Antarctic pack ice ecosystem.
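For illustration, a minimal sketch (assuming the standard Beer-Lambert relation; all values are stand-ins, not the study's data) of how spectral attenuation coefficients follow from irradiance measured above and beneath the ice:

```python
# Minimal sketch: diffuse attenuation coefficients K(lambda) from irradiance
# above and beneath the ice, using E_below = E_above * exp(-K * z).
import numpy as np

wavelength = np.arange(400, 701, 50)                              # [nm]
E_above = np.array([60.0, 90.0, 100.0, 95.0, 80.0, 60.0, 40.0])  # stand-ins
E_below = np.array([2.0, 6.0, 12.0, 14.0, 11.0, 7.0, 4.0])       # stand-ins

z = 1.0                                 # assumed first-year ice thickness [m]
K = np.log(E_above / E_below) / z       # diffuse attenuation [1/m]
print(dict(zip(wavelength, K.round(2))))

# If the particulate absorption spectrum a_p(lambda) is also measured, its
# share of the attenuation can be approximated as a_p / K (the 25-90% figure).
```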
Abstract:
Knowing the size of the terms to which program variables are bound at run-time in logic programs is required in a class of optimizations which includes granularity control and recursion elimination. Such size is difficult to even approximate at compile time and is thus generally computed at run-time by using (possibly predefined) predicates which traverse the terms involved. We propose a technique which has the potential of performing this computation much more efficiently. The technique is based on finding program procedures which are called before those in which knowledge regarding term sizes is needed and which traverse the terms whose size is to be determined, and transforming such procedures so that they compute term sizes "on the fly". We present a systematic way of determining whether a given program can be transformed in order to compute a given term size at a given program point without additional term traversal. Also, if several such transformations are possible, our approach allows finding minimal transformations under certain criteria. We also discuss the advantages and applications of our technique (specifically in the task of granularity control) and present some performance results.
Abstract:
Knowing the size of the terms to which program variables are bound at run-time in logic programs is required in a class of applications related to program optimization, such as recursion elimination and granularity analysis. Such size is difficult to even approximate at compile time and is thus generally computed at run-time by using (possibly predefined) predicates which traverse the terms involved. We propose a technique based on program transformation which has the potential of performing this computation much more efficiently. The technique is based on finding program procedures which are called before those in which knowledge regarding term sizes is needed and which traverse the terms whose size is to be determined, and transforming such procedures so that they compute term sizes "on the fly". We present a systematic way of determining whether a given program can be transformed in order to compute a given term size at a given program point without additional term traversal. Also, if several such transformations are possible, our approach allows finding minimal transformations under certain criteria. We also discuss the advantages and present some applications of our technique.
Abstract:
Knowing the size of the terms to which program variables are bound at run-time in logic programs is required in a class of applications related to program optimization, such as granularity analysis and selection among different algorithms or control rules whose performance may be dependent on such size. Such size is difficult to even approximate at compile time and is thus generally computed at run-time by using (possibly predefined) predicates which traverse the terms involved. We propose a technique based on program transformation which has the potential of performing this computation much more efficiently. The technique is based on finding program procedures which are called before those in which knowledge regarding term sizes is needed and which traverse the terms whose size is to be determined, and transforming such procedures so that they compute term sizes "on the fly". We present a systematic way of determining whether a given program can be transformed in order to compute a given term size at a given program point without additional term traversal. Also, if several such transformations are possible, our approach allows finding minimal transformations under certain criteria. We also discuss the advantages and applications of our technique and present some performance results.
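For illustration, the transformation idea rendered as a minimal Python sketch rather than a logic program (the function names are ours): a procedure that already traverses a term is changed so that it also returns the term's size, so a later consumer needs no extra traversal:

```python
# Minimal sketch of computing term sizes "on the fly".

def reverse(xs):
    # Original procedure: traverses the whole list but discards size info.
    if not xs:
        return []
    return reverse(xs[1:]) + [xs[0]]

def reverse_with_size(xs):
    # Transformed procedure: the same traversal also returns the size, so a
    # later test needs no separate size-computing traversal.
    if not xs:
        return [], 0
    rest, n = reverse_with_size(xs[1:])
    return rest + [xs[0]], n + 1

rev, size = reverse_with_size([3, 1, 2, 7])
if size > 2:   # e.g., a granularity-control threshold, obtained for free
    print(rev, size)
```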
Abstract:
Starting from the way inter-cellular communication takes place by means of protein channels, and also from the standard knowledge about neuron functioning, we propose a computing model called a tissue P system, which processes symbols in a multiset rewriting sense, in a net of cells similar to a neural net. Each cell has a finite state memory, processes multisets of symbol-impulses, and can send impulses ("excitations") to the neighboring cells. Such cell nets are shown to be rather powerful: they can simulate a Turing machine even when using a small number of cells, each of them having a small number of states. Moreover, in the case when each cell works in the maximal manner and can excite all the cells to which it can send impulses, one can easily solve the Hamiltonian Path Problem in linear time. A new characterization of the Parikh images of ET0L languages is also obtained in this framework.
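For illustration, a toy sketch (ours, not the paper's formal definition) of one synchronous step in a two-cell net, where each cell holds a finite state and a multiset of symbol-impulses and sends impulses to its neighbor:

```python
# Toy sketch: a rule consumes a multiset of symbols, moves the cell to a new
# state, and sends a multiset of impulses to the neighboring cell.
from collections import Counter

rules = {                        # state -> (consumed, next_state, sent)
    "s0": (Counter("a"), "s1", Counter("b")),
    "s1": (Counter("bb"), "s0", Counter("a")),
}
cells = [{"state": "s0", "contents": Counter("aab")},
         {"state": "s1", "contents": Counter("bb")}]
neighbor = {0: 1, 1: 0}

outbox = [Counter(), Counter()]
for i, cell in enumerate(cells):
    consumed, nxt, sent = rules[cell["state"]]
    if all(cell["contents"][s] >= k for s, k in consumed.items()):
        cell["contents"] -= consumed      # apply the multiset rewrite
        cell["state"] = nxt
        outbox[neighbor[i]] += sent       # impulses travel to the neighbor
for i, cell in enumerate(cells):
    cell["contents"] += outbox[i]
print(cells)
```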
Abstract:
Dislocation mobility, the relation between applied stress and dislocation velocity, is an important property for modeling the mechanical behavior of structural materials. These mobilities reflect the interaction between the dislocation core and the host lattice and, thus, atomistic resolution is required to capture their details. Because the mobility function is multiparametric, its computation is often highly demanding in terms of computational requirements. Optimizing how tractions are applied can be greatly advantageous in accelerating convergence and reducing the overall computational cost of the simulations. In this paper we perform molecular dynamics simulations of ½〈1 1 1〉 screw dislocation motion in tungsten using step and linear time functions for applying external stress. We find that linear functions over time scales of the order of 10–20 ps reduce fluctuations and speed up convergence to the steady-state velocity value by up to a factor of two.
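For illustration, a minimal sketch (ours; the values are stand-ins) of the two loading protocols compared: a step applies the full stress at once, while a linear ramp reaches it over a 10–20 ps time scale:

```python
# Minimal sketch: step vs. linear-ramp application of external stress.
import numpy as np

def applied_stress(t_ps, sigma, t_ramp_ps=15.0, mode="ramp"):
    if mode == "step":
        return sigma * (t_ps >= 0.0)                  # full stress instantly
    return sigma * np.clip(t_ps / t_ramp_ps, 0.0, 1.0)  # ramp, then hold

t = np.linspace(0.0, 40.0, 9)                         # [ps]
print(applied_stress(t, sigma=2.0, mode="step"))
print(applied_stress(t, sigma=2.0, mode="ramp"))
```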
Abstract:
In this paper we propose the use of Networks of Bio-inspired Processors (NBP) to model some biological phenomena within a computational framework. In particular, we propose the use of an extension of NBP named Network Evolutionary Processors Transducers to simulate chemical transformations of substances. Within a biological process, chemical transformations of substances are basic operations in the change of the state of the cell. Previously, it has been proved that NBP are computationally complete, that is, they are able to solve NP-complete problems in linear time, using massively parallel computations. In addition, we propose a multilayer architecture that will allow us to design models of biological processes related to cellular communication, as well as their implications for the metabolic pathways. Subsequently, these models can be applied not only to biological-cellular instances but possibly also to configure instances of interactive processes in many other fields, such as population interactions, ecological trophic networks, industrial ecosystems, etc.
Abstract:
In this paper we define the notion of an axiom dependency hypergraph, which explicitly represents how axioms are included in a module by the algorithm for computing locality-based modules. A locality-based module of an ontology corresponds to a set of connected nodes in the hypergraph, and atoms of an ontology to strongly connected components. Collapsing the strongly connected components into single nodes yields a condensed hypergraph that comprises a representation of the atomic decomposition of the ontology. To speed up the condensation of the hypergraph, we first reduce its size by collapsing the strongly connected components of its graph fragment using a linear-time graph algorithm. This approach helps to significantly reduce the time needed to compute the atomic decomposition of an ontology. We provide an experimental evaluation for computing the atomic decomposition of large biomedical ontologies. We also demonstrate a significant improvement in the time needed to extract locality-based modules from an axiom dependency hypergraph and its condensed version.
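For illustration, a minimal sketch (assuming the graph fragment is available as a plain directed graph; the axiom names are hypothetical) of collapsing strongly connected components into single nodes with a linear-time algorithm, here via networkx:

```python
# Minimal sketch: condensing the strongly connected components of a directed
# graph into single nodes (networkx uses a linear-time Tarjan-style SCC pass).
import networkx as nx

G = nx.DiGraph([("ax1", "ax2"), ("ax2", "ax1"),   # a 2-cycle -> one "atom"
                ("ax2", "ax3")])
C = nx.condensation(G)                            # SCCs become single nodes
for node, members in C.nodes(data="members"):
    print(node, sorted(members))                  # {ax1, ax2} and {ax3}
```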
Abstract:
Exercises applying the systematic procedure for deriving linear inequalities for logic expressions (Ejercicios de aplicación del método sistemático de obtención de restricciones lineales para expresiones lógicas).
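For context, the textbook linearization of Boolean connectives over 0-1 variables, one standard way such linear inequalities are derived (not necessarily the exact procedure exercised here):

```latex
% 0-1 variables x, y, z; z carries the truth value of the expression.
\begin{align*}
z = x \land y &:\quad z \le x,\quad z \le y,\quad z \ge x + y - 1,\\
z = x \lor y  &:\quad z \ge x,\quad z \ge y,\quad z \le x + y,\\
z = \lnot x   &:\quad z = 1 - x.
\end{align*}
```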
Abstract:
Radar target identification based on complex natural resonances is sometimes achieved by convolving a linear time-domain filter with a received target signature. The filter is constructed from measured or pre-calculated target resonances. The performance of the target identification procedure is degraded if the difference between the sampling rates of the target signature and the filter is ignored. The problem is investigated for the natural extinction pulse technique (E-pulse) for the case of identifying stick models of aircraft.
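For illustration, a minimal sketch (ours; the rates and signals are stand-ins) of accounting for the sampling-rate difference by resampling the filter to the signature's rate before convolving:

```python
# Minimal sketch: resample the E-pulse filter to the target signature's
# sampling rate before convolution, so the rate mismatch is not ignored.
import numpy as np
from scipy import signal

fs_signature, fs_filter = 8e9, 5e9               # assumed sampling rates [Hz]
rng = np.random.default_rng(1)
signature = rng.standard_normal(512)             # stand-in target signature
e_pulse = np.hanning(64)                         # stand-in E-pulse filter

n_new = int(round(len(e_pulse) * fs_signature / fs_filter))
e_pulse_at_sig_rate = signal.resample(e_pulse, n_new)

out = np.convolve(signature, e_pulse_at_sig_rate)  # late-time energy should be
print(out.shape)                                   # small for a matching target
```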
Abstract:
It has long been recognized that demographic structure within a population can significantly affect the likely outcomes of harvest. Many studies have focused on equilibrium dynamics and maximization of the value of the harvest taken. However, in some cases the management objective is to maintain the population at an abundance significantly below the carrying capacity. Achieving such an objective by harvest can be complicated by the presence of significant structure (age or stage) in the target population. In such cases, optimal harvest strategies must account for differences among age- or stage-classes of individuals in their relative contribution to the demography of the population. In addition, structured populations are also characterized by transient non-linear dynamics following perturbation, such that even under an equilibrium harvest the population may exhibit significant momentum, increasing or decreasing before cessation of growth. Using simple linear time-invariant models, we show that if harvest levels are set dynamically (e.g., annually), then transient effects can be as important as, or more important than, equilibrium outcomes. We show that setting appropriate harvest rates can be complicated by uncertainty about the demographic structure of the population, or by limited control over the structure of the harvest taken.
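For illustration, a toy linear time-invariant projection (ours, not the paper's model; all rates are stand-ins) showing how a stage-structured population under a fixed harvest keeps changing for several steps before settling, i.e., transient momentum:

```python
# Toy sketch: stage-structured projection with a fixed stage-specific harvest.
import numpy as np

A = np.array([[0.0, 1.2, 1.5],    # fecundities (stand-ins)
              [0.6, 0.0, 0.0],    # survival to stage 2
              [0.0, 0.7, 0.8]])   # survival to / within stage 3
H = np.diag([0.0, 0.1, 0.2])      # stage-specific harvest fractions
n = np.array([100.0, 20.0, 5.0])  # a non-equilibrium starting structure

for t in range(8):
    n = (np.eye(3) - H) @ (A @ n)      # grow, then harvest
    print(t, round(n.sum(), 1))        # total abundance keeps moving at first
```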
Abstract:
The Symbolic Analysis Laboratory (SAL) is a suite of tools for analysis of state transition systems. Tools supported include a simulator and four temporal logic model checkers. The common input language to these tools was originally developed with translation from other languages, both programming and specification languages, in mind. It is, therefore, a rich language supporting a range of type definitions and expressions. In this paper, we investigate the translation of Z specifications into the SAL language as a means of providing model checking support for Z. This is facilitated by a library of SAL definitions encoding the Z mathematical toolkit.
Abstract:
Experiments with simulators allow psychologists to better understand the causes of human errors and to build models of cognitive processes for use in human reliability assessment (HRA). This paper investigates an approach to task failure analysis based on patterns of behaviour, in contrast to more traditional event-based approaches. It considers, as a case study, a formal model of an air traffic control (ATC) system which incorporates controller behaviour. The cognitive model is formalised in the CSP process algebra. Patterns of behaviour are expressed as temporal logic properties. A model-checking technique is then used to verify whether the decomposition of the operator's behaviour into patterns is sound and complete with respect to the cognitive model. The decomposition is shown to be incomplete, and a new behavioural pattern is identified which appears to have been overlooked in the analysis of the data provided by the experiments with the simulator. This illustrates how formal analysis of operator models can yield fresh insights into how failures may arise in interactive systems.
Abstract:
Formal methods have significant benefits for developing safety-critical systems, in that they allow for correctness proofs, model checking of safety and liveness properties, deadlock checking, etc. However, formal methods do not scale very well and demand specialist skills when developing real-world systems. For these reasons, the development and analysis of large-scale safety-critical systems will require effective integration of formal and informal methods. In this paper, we use such an integrative approach to automate Failure Modes and Effects Analysis (FMEA), a widely used system safety analysis technique, using a high-level graphical modelling notation (Behavior Trees) and model checking. We inject component failure modes into the Behavior Trees and translate the resulting Behavior Trees to SAL code. This enables us to model check whether the system, in the presence of these faults, satisfies its safety properties, specified as temporal logic formulas. The benefit of this process is tool support that automates the tedious and error-prone aspects of FMEA.
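As an illustration (ours, not from the paper), a safety-style requirement of the kind checked after fault injection can be written as a linear-time temporal logic formula:

```latex
% "Whenever the injected failure mode is active, the hazardous state is
%  never entered, and an alarm is eventually raised."
\Box (\mathit{failure} \rightarrow \lnot \mathit{hazard}) \;\land\;
\Box (\mathit{failure} \rightarrow \Diamond \mathit{alarm})
```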