996 results for 2003-10-BS
Abstract:
Q. Shen and R. Jensen, 'Selecting Informative Features with Fuzzy-Rough Sets and its Application for Complex Systems Monitoring,' Pattern Recognition, vol. 37, no. 7, pp. 1351-1363, 2004.
Abstract:
Mike J. Wilkinson, Luisa J. Elliott, Joël Allainguillaume, Michael W. Shaw, Carol Norris, Ruth Welters, Matthew Alexander, Jeremy Sweet, David C. Mason (2003). Hybridization between Brassica napus and B. rapa on a national scale in the United Kingdom, Science, 302 (5644), 457-459. RAE2008
Abstract:
23 leaves: illustrations, photographs.
Abstract:
sermon text; MS Word document
Abstract:
National Science Foundation (CCR-998310); Army Research Office (DAAD19-02-1-0058)
Abstract:
Temporal structure in skilled, fluent action exists at several nested levels. At the largest scale considered here, short sequences of actions that are planned collectively in prefrontal cortex appear to be queued for performance by a cyclic competitive process that operates in concert with a parallel analog representation that implicitly specifies the relative priority of elements of the sequence. At an intermediate scale, single acts, like reaching to grasp, depend on coordinated scaling of the rates at which many muscles shorten or lengthen in parallel. To ensure success of acts such as catching an approaching ball, such parallel rate scaling, which appears to be one function of the basal ganglia, must be coupled to perceptual variables such as time-to-contact. At a finer scale, within each act, desired rate scaling can be realized only if precisely timed muscle activations first accelerate and then decelerate the limbs, to ensure that muscle length changes do not under- or overshoot the amounts needed for precise acts. Each context of action may require a different timed muscle activation pattern than similar contexts. Because context differences that require different treatment cannot be known in advance, a formidable adaptive engine, the cerebellum, is needed to amplify differences within, and continuously search, a vast parallel signal flow, in order to discover contextual "leading indicators" of when to generate distinctive patterns of analog signals. From some parts of the cerebellum, such signals control muscles. But a recent model shows how the lateral cerebellum may serve the competitive queuing system (frontal cortex) as a repository of quickly accessed long-term sequence memories. Thus different parts of the cerebellum may use the same adaptive engine design to serve the lowest and highest of the three levels of temporal structure treated. If so, no one-to-one mapping exists between levels of temporal structure and major parts of the brain. Finally, recent data cast doubt on network-delay models of cerebellar adaptive timing.
Abstract:
Much sensory-motor behavior develops through imitation, as during the learning of handwriting by children. Such complex sequential acts are broken down into distinct motor control synergies, or muscle groups, whose activities overlap in time to generate continuous, curved movements that obey an inverse relation between curvature and speed. The Adaptive Vector Integration to Endpoint (AVITEWRITE) model of Grossberg and Paine (2000) proposed how such complex movements may be learned through attentive imitation. The model suggests how frontal, parietal, and motor cortical mechanisms, such as difference vector encoding, under volitional control from the basal ganglia, interact with adaptively timed, predictive cerebellar learning during movement imitation and predictive performance. Key psychophysical and neural data about learning to make curved movements were simulated, including a decrease in writing time as learning progresses; generation of unimodal, bell-shaped velocity profiles for each movement synergy; size scaling with isochrony, and speed scaling with preservation of the letter shape and the shapes of the velocity profiles; an inverse relation between curvature and tangential velocity; and a Two-Thirds Power Law relation between angular velocity and curvature. However, the model learned from letter trajectories of only one subject, and only qualitative kinematic comparisons were made with previously published human data. The present work describes a quantitative test of AVITEWRITE through direct comparison of a corpus of human handwriting data with the model's performance when it learns by tracing human trajectories. The results show that model performance was variable across subjects, with an average correlation between the model and human data of 89 +/- 10%. The present data from simulations using the AVITEWRITE model highlight some of its strengths while focusing attention on areas, such as novel shape learning in children, where all models of handwriting and of learning other complex sensory-motor skills would benefit from further research.
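For reference, the Two-Thirds Power Law cited in this abstract is conventionally stated (following Lacquaniti, Terzuolo, and Viviani) as a power-law relation between angular velocity and curvature; the abstract itself gives no formula, so the standard form is shown here rather than the model's own notation:

```latex
% Standard form of the Two-Thirds Power Law (background, not taken from the paper):
% angular velocity A(t) scales with curvature C(t) to the power 2/3; since
% A = V * C, this is equivalent to tangential velocity V(t) scaling with the
% 1/3 power of the radius of curvature R = 1/C.
A(t) = k \, C(t)^{2/3}
\qquad\Longleftrightarrow\qquad
V(t) = k \, R(t)^{1/3}
```

The 2/3 exponent on curvature (equivalently 1/3 on radius) is where the law's name comes from; the gain k may vary between movement segments.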
Abstract:
There is much common ground between the areas of coding theory and systems theory. Fitzpatrick has shown that a Gröbner basis approach leads to efficient algorithms in the decoding of Reed-Solomon codes and in scalar interpolation and partial realization. This thesis simultaneously generalizes and simplifies that approach and presents applications to discrete-time modeling, multivariable interpolation and list decoding. Gröbner basis theory has come into its own in the context of software and algorithm development. By generalizing the concept of polynomial degree, term orders are provided for multivariable polynomial rings and free modules over polynomial rings. The orders are not, in general, unique, and this adds, in no small way, to the power and flexibility of the technique. As well as being generating sets for ideals or modules, Gröbner bases always contain an element which is minimal with respect to the corresponding term order. Central to this thesis is a general algorithm, valid for any term order, that produces a Gröbner basis for the solution module (or ideal) of elements satisfying a sequence of generalized congruences. These congruences, based on shifts and homomorphisms, are applicable to a wide variety of problems, including key equations and interpolations. At the core of the algorithm is an incremental step. Iterating this step lends a recursive/iterative character to the algorithm. As a consequence, not all of the input to the algorithm need be available from the start and different "paths" can be taken to reach the final solution. The existence of a suitable chain of modules satisfying the criteria of the incremental step is a prerequisite for applying the algorithm.
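As a small, generic illustration of the role of term orders (this example is not from the thesis): the same ideal generally has different Gröbner bases under different term orders, which is the flexibility referred to above. A minimal sketch using SymPy:

```python
# Illustration only (not the thesis algorithm): the same ideal yields different
# Groebner bases under different term orders, e.g. lexicographic ('lex') versus
# graded reverse lexicographic ('grevlex').
from sympy import groebner, symbols

x, y = symbols('x y')
ideal = [x**2 + y**2 - 1, x*y - 2]  # an arbitrary example ideal

G_lex = groebner(ideal, x, y, order='lex')          # elimination-friendly order
G_grevlex = groebner(ideal, x, y, order='grevlex')  # usually cheaper to compute

print(G_lex)      # basis under lex
print(G_grevlex)  # basis under grevlex
```

The choice of order trades structure (lex bases expose elimination ideals) against computational cost, which is one reason the non-uniqueness noted in the abstract is useful in practice.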
Abstract:
Real-time polymerase chain reaction (PCR) has recently been described as a new tool to measure and accurately quantify mRNA levels. In this study, we have applied this technique to evaluate cytokine mRNA synthesis induced by antigenic stimulation with purified protein derivative (PPD) or heparin-binding haemagglutinin (HBHA) in human peripheral blood mononuclear cells (PBMC) from Mycobacterium tuberculosis-infected individuals. Whereas PPD and HBHA optimally induced IL-2 mRNA after 8 h and 16 to 24 h of in vitro stimulation, respectively, longer in vitro stimulation times were necessary for optimal induction of interferon-gamma (IFN-gamma) mRNA: 16 to 24 h for PPD and 24 to 96 h for HBHA. IL-13 mRNA was optimally induced after 16 to 48 h of in vitro stimulation for PPD and after 48 to 96 h for HBHA. Comparison of antigen-induced Th1 and Th2 cytokines therefore appears valuable only if both cytokine types are analysed at their optimal time point of production, which, for a given cytokine, may differ for each antigen tested. Results obtained by real-time PCR for IFN-gamma and IL-13 mRNA correlated well with those obtained by measuring the cytokine concentrations in cell culture supernatants, provided these were high enough to be detected. We conclude that real-time PCR can be successfully applied to the quantification of antigen-induced cytokine mRNA and to the evaluation of the Th1/Th2 balance, but only if the kinetics of cytokine mRNA appearance are taken into account and evaluated for each cytokine measured and each antigen analysed.
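For background only (the abstract does not state which quantification scheme was used): a common way to report relative mRNA levels from real-time PCR threshold-cycle (C_T) data is the 2^(-ΔΔC_T) method of Livak and Schmittgen, in which target-gene expression is normalised to a reference gene and then to an unstimulated control:

```latex
% Relative quantification of real-time PCR data (generic background, not the paper's stated method):
\text{fold change} = 2^{-\Delta\Delta C_T},
\qquad
\Delta\Delta C_T =
\left(C_{T,\text{target}} - C_{T,\text{reference}}\right)_{\text{stimulated}}
-
\left(C_{T,\text{target}} - C_{T,\text{reference}}\right)_{\text{unstimulated}}
```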
Abstract:
SCOPUS: le.j
Abstract:
Direct chill (DC) casting is a core primary process in the production of aluminum ingots. However, its operational optimization is still under investigation with regard to a number of features, one of which is the issue of curvature at the base of the ingot. Analysis of these features requires a computational model of the process that accounts for fluid flow, heat transfer, the solidification phase change, and the associated thermomechanical behavior. This article describes an integrated approach to the modeling of all the preceding phenomena and their interactions.
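As generic background on what such a coupled model typically solves (these are standard governing equations for solidification with flow, not the article's specific formulation), the thermal part is usually an energy balance in which latent heat is released as the local solid fraction evolves, coupled to the velocity field from the flow solution:

```latex
% Representative energy equation for solidification modelling (background only):
% T temperature, u velocity field, k thermal conductivity, c_p specific heat,
% rho density, L latent heat of fusion, f_s local solid fraction.
\rho c_p \left( \frac{\partial T}{\partial t} + \mathbf{u}\cdot\nabla T \right)
= \nabla\cdot\left(k\,\nabla T\right) + \rho L \frac{\partial f_s}{\partial t}
```

The computed temperature and solid-fraction fields then drive a thermomechanical (stress/strain) calculation, which is where features such as the base curvature mentioned above emerge.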
Abstract:
The numerical modelling of magnetic levitation presented here couples the electromagnetic field, the change in liquid shape, the fluid velocities and the temperature field at every time step of the time-dependent simulation. A combination of AC and DC magnetic fields can be used to achieve high-temperature, stable levitation conditions. The oscillation frequency spectra are analysed for droplets levitated in various combinations of AC and DC magnetic fields. An electrically poorly conducting, diamagnetic droplet (e.g. water) can be stably levitated in a high-intensity gradient DC field by exploiting the dia- and paramagnetic properties of the sample material.
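For context on the oscillation spectra (this is the classical reference result, not necessarily the expression used in the paper): measured surface-oscillation frequencies of a levitated droplet are commonly compared with the Rayleigh frequencies of an inviscid, force-free droplet,

```latex
% Rayleigh frequencies of an unconstrained, inviscid droplet (background only):
% sigma surface tension, rho density, R droplet radius, l >= 2 the mode number.
\omega_l^2 = \frac{l\,(l-1)\,(l+2)\,\sigma}{\rho\,R^3}
```

with the l = 2 mode as the fundamental; levitation fields shift and split these modes, which is why the spectra depend on the particular AC/DC field combination.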
Abstract:
Dr Fuchen Jia, Dr Mayer Patel and Professor Edwin Galea explain how advanced fire models were used to unravel the secrets of Swissair Flight 111, which crashed off the coast of Canada in 1998.
Abstract:
Computer-based mathematical models describing the aircraft evacuation process have a vital role to play in the design and development of safer aircraft, in the implementation of safer and more rigorous certification criteria, in cabin crew training and in post-mortem accident investigation. As the risk of personal injury and the costs involved in performing full-scale certification trials are high, the development and use of these evacuation modelling tools are essential. Furthermore, evacuation models provide insight into the evacuation process that is impossible to derive from a single certification trial. The airEXODUS evacuation model has been under development since 1989 with support from the UK CAA and the aviation industry. In addition to describing the capabilities of the airEXODUS evacuation model, this paper describes the findings of a recent CAA project aimed at investigating model accuracy in predicting past certification trials. Furthermore, airEXODUS is used to examine issues related to the Blended Wing Body (BWB) and Very Large Transport Aircraft (VLTA). These radical new aircraft concepts pose considerable challenges to designers, operators and certification authorities. BWB concepts involving one or two decks with possibly four or more aisles offer even greater challenges. Can the largest exits currently available cope with passenger flow arising from four or five aisles? Do we need to consider new concepts in exit design? Should the main aisle be made wider to accommodate more passengers? In this paper we discuss various evacuation-related issues associated with VLTA and BWB aircraft and demonstrate how computer-based evacuation models can be used to investigate these issues through examination of aisle/exit configurations for BWB cabin layouts.
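The exit-versus-aisle question raised above is at heart a capacity comparison. The toy sketch below only illustrates that arithmetic; every rate in it is hypothetical and is neither a certification value nor an airEXODUS output:

```python
# Toy capacity check (illustrative only; all flow rates are made up, not
# certification figures or airEXODUS results).
def exits_can_cope(n_aisles, aisle_flow_ppm, n_exits, exit_flow_ppm):
    """Return True if the exits can absorb the combined aisle feed rate.

    aisle_flow_ppm: passengers per minute delivered by one aisle.
    exit_flow_ppm:  passengers per minute passed by one exit.
    """
    supply = n_aisles * aisle_flow_ppm   # passengers arriving at the exit zone
    capacity = n_exits * exit_flow_ppm   # passengers the exits can discharge
    return capacity >= supply

# Hypothetical example: five aisles feeding a pair of large exits.
print(exits_can_cope(n_aisles=5, aisle_flow_ppm=30, n_exits=2, exit_flow_ppm=60))
```

A full evacuation model of course adds congestion, exit choice, crew procedures and timing on top of this, which is exactly the insight such tools provide beyond a single trial.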