938 results for "Generation from examples"


Relevance: 30.00%

Abstract:

Objective: To explore the extent and nature of change in cognitive-motor interference (CMI) among rehabilitating stroke patients who showed dual-task gait decrement at initial assessment. Design: Experimental, within-subjects, repeated-measures design. Setting: Rehabilitation centre for adults with acquired, nonprogressive brain injury. Subjects: Ten patients with unilateral stroke, available for reassessment 1-9 months after their participation in a study of CMI after brain injury. Measures: Median stride duration; mean word generation. Methods: Two one-minute walking trials, two one-minute word generation trials and two one-minute trials of simultaneous walking and word generation; 10-metre walking time; Barthel ADL Scale score. Results: Seven out of ten patients showed a reduction over time in dual-task gait decrement. Three out of ten showed a reduction in cognitive decrement. Only one showed a concomitant reduction in both gait and word generation decrement. Conclusion: The extent of CMI during relearning to walk after a stroke reduced over time in the majority of patients. Effects were more evident in improved stride duration than in improved cognitive performance. Measures of multiple-task performance should be included in assessments of functional recovery.
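The dual-task decrement measured in such studies is commonly defined as the proportional change in performance between single-task and dual-task conditions. The sketch below uses that common definition with invented numbers; it is an illustrative assumption, not the study's exact scoring.

```python
# Hypothetical sketch: dual-task decrement as the proportional change from
# single-task to dual-task performance. The sign convention and all values
# are illustrative assumptions, not taken from the study protocol.

def decrement(single: float, dual: float) -> float:
    """Proportional performance change under dual-task conditions (%)."""
    return (dual - single) / single * 100.0

# Gait decrement: stride duration increases (worsens) under dual task.
gait = decrement(single=1.20, dual=1.50)    # stride 25% longer
# Cognitive decrement: word generation decreases under dual task.
words = decrement(single=14.0, dual=10.5)   # 25% fewer words
print(f"gait decrement: {gait:+.1f}%, word decrement: {words:+.1f}%")
```

A reduction over time in either quantity, as reported for seven of the ten patients, would correspond to these percentages moving towards zero on reassessment.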

Relevance: 30.00%

Abstract:

Two algorithms are presented for finding the point on a non-rational or rational Bezier curve whose normal vector passes through a given external point. The algorithms are based on the standard Bezier curve evaluation schemes: de Casteljau's algorithm for non-rational Bezier curves and Farin's recursion for rational Bezier curves, respectively. Orthogonal projections from the external point are used to guide the directional search in the proposed iterative algorithms. Using Lyapunov's method, each algorithm is shown to converge to a local minimum in both the non-rational and rational cases. It is also shown that, on convergence, the distance between the point on the curve and the external point reaches a local minimum for both approaches. Illustrative examples are included to demonstrate the effectiveness of the proposed approaches.
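The building block of both algorithms, de Casteljau evaluation, and the nearest-point problem itself can be sketched as follows. Note that the refinement loop below is a plain ternary search on the squared distance, standing in for the paper's Lyapunov-analysed, projection-guided iteration; the control points and external point are illustrative assumptions.

```python
import numpy as np

# de Casteljau evaluation of a non-rational Bezier curve: repeatedly blend
# adjacent control points until a single point remains.
def de_casteljau(ctrl, t):
    pts = np.asarray(ctrl, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

def closest_point(ctrl, p, iters=60):
    """Ternary-search refinement of the squared distance to p over t in [0,1].
    (A simple stand-in for the paper's projection-guided search.)"""
    p = np.asarray(p, dtype=float)
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        d1 = np.sum((de_casteljau(ctrl, m1) - p) ** 2)
        d2 = np.sum((de_casteljau(ctrl, m2) - p) ** 2)
        if d1 < d2:
            hi = m2
        else:
            lo = m1
    t = 0.5 * (lo + hi)
    return t, de_casteljau(ctrl, t)

# Symmetric quadratic arch: the point nearest (1, 3) lies at its apex.
ctrl = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]
t_star, q = closest_point(ctrl, p=(1.0, 3.0))
```

At the converged parameter the vector from the curve point to the external point is normal to the curve, which is exactly the condition the two algorithms solve for.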

Relevance: 30.00%

Abstract:

Uncertainties in sea-level projections for the 21st century have focused ice sheet modelling efforts on including the processes that are thought to be contributing to the recently observed rapid changes at ice sheet margins. This effort is still in its infancy, however, leaving us unable to make reliable predictions of ice sheet responses to a warming climate if such glacier accelerations were to increase in size and frequency. The geological record, however, has long identified examples of nonlinear ice sheet response to climate forcing (Shackleton NJ, Opdyke ND. 1973. Oxygen isotope and paleomagnetic stratigraphy of equatorial Pacific core V28–239, late Pliocene to latest Pleistocene. Geological Society of America Memoirs 145: 449–464; Fairbanks RG. 1989. A 17,000 year glacio-eustatic sea level record: influence of glacial melting rates on the Younger Dryas event and deep ocean circulation. Nature 342: 637–642; Bard E, Hamelin B, Arnold M, Montaggioni L, Cabioch G, Faure G, Rougerie F. 1996. Sea level record from Tahiti corals and the timing of deglacial meltwater discharge. Nature 382: 241–244), thus suggesting an alternative strategy for constraining the rate and magnitude of sea-level change that we might expect by the end of this century. Copyright © 2009 John Wiley & Sons, Ltd.

Relevance: 30.00%

Abstract:

This short contribution examines the difficulties that have not yet been fully overcome in the many developments made from the simplest (and original) tube model for entangled polymers. It is concluded that many more length scales have to be considered sequentially when deriving a continuum rheological model from molecular considerations than have been considered in the past. In particular, most unresolved issues of the tube theory are related to the length scale of the tube diameter, and molecular dynamics simulations are the perfect route to resolving them. The power of molecular simulations is illustrated by two examples: stress contributions from bonded and non-bonded interactions, and the inter-chain coupling that is usually neglected in the tube theory.

Relevance: 30.00%

Abstract:

In recent years nonpolynomial finite element methods have received increasing attention for the efficient solution of wave problems. As with their close cousin the method of particular solutions, high efficiency comes from using solutions to the Helmholtz equation as basis functions. We present and analyze such a method for the scattering of two-dimensional scalar waves from a polygonal domain that achieves exponential convergence purely by increasing the number of basis functions in each element. Key ingredients are the use of basis functions that capture the singularities at corners and the representation of the scattered field towards infinity by a combination of fundamental solutions. The solution is obtained by minimizing a least-squares functional, which we discretize in such a way that a matrix least-squares problem is obtained. We give computable exponential bounds on the rate of convergence of the least-squares functional that are in very good agreement with the observed numerical convergence. Challenging numerical examples, including a nonconvex polygon with several corner singularities, and a cavity domain, are solved to around 10 digits of accuracy with a few seconds of CPU time. The examples are implemented concisely with MPSpack, a MATLAB toolbox for wave computations with nonpolynomial basis functions, developed by the authors. A code example is included.
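The least-squares Trefftz idea underlying such methods can be illustrated without MPSpack: every basis function used (here, plane waves) satisfies the Helmholtz equation exactly, so only boundary data needs to be fitted by least squares, and accuracy improves rapidly as the basis grows. This is a minimal sketch under illustrative assumptions (geometry, wavenumber, target field); the paper's corner-adapted bases and fundamental-solution representation are not reproduced.

```python
import numpy as np

k = 5.0  # wavenumber (illustrative assumption)

def plane_wave(x, y, theta):
    """A plane wave: an exact solution of the Helmholtz equation."""
    return np.exp(1j * k * (x * np.cos(theta) + y * np.sin(theta)))

# Collocation points on the boundary of the unit square.
s = np.linspace(0.0, 1.0, 60, endpoint=False)
bx = np.concatenate([s, np.ones_like(s), 1 - s, np.zeros_like(s)])
by = np.concatenate([np.zeros_like(s), s, np.ones_like(s), 1 - s])

# Basis: 24 equispaced plane-wave directions; target: a plane wave
# travelling in a direction NOT in the basis.
n = 24
thetas = 2 * np.pi * np.arange(n) / n
target_dir = 0.3
A = np.stack([plane_wave(bx, by, th) for th in thetas], axis=1)
b = plane_wave(bx, by, target_dir)
coef, *_ = np.linalg.lstsq(A, b, rcond=None)  # matrix least-squares problem

# Because all basis functions solve the PDE, matching boundary data gives
# interior accuracy too; check at an arbitrary interior point.
xi, yi = 0.37, 0.62
approx = sum(c * plane_wave(xi, yi, th) for c, th in zip(coef, thetas))
err = abs(approx - plane_wave(xi, yi, target_dir))
```

Increasing `n` drives the boundary misfit, and hence the interior error, down at a rapid (in favourable cases exponential) rate, which is the mechanism the paper exploits and sharpens with singular corner bases.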

Relevance: 30.00%

Abstract:

A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to tackle this problem effectively. A new, simple preprocessing method is initially derived and applied to reduce the rule base, followed by a fine model detection process based on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new A-optimality experimental-design-based criteria were used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric; in the later stage, the A-optimality design criterion is incorporated into a new composite cost function that minimises model prediction error as well as penalising the model parameter variance. The use of NeuDeC leads to unbiased model parameters with low parameter variance and the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
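The A-optimality criterion referred to above is, for a linear-in-the-parameters model with design matrix X, the trace of (XᵀX)⁻¹, which is proportional to the summed variance of the parameter estimates. The sketch below shows it used as a subset selection metric on synthetic data; the data and candidate subsets are illustrative assumptions, not NeuDeC itself.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))
X[:, 5] = X[:, 0] + 0.05 * rng.normal(size=100)  # nearly collinear column

def a_optimality(Xs):
    """A-optimality criterion: trace((X^T X)^-1), i.e. summed parameter
    variance up to the noise variance sigma^2."""
    return np.trace(np.linalg.inv(Xs.T @ Xs))

# Comparing candidate regressor subsets: the subset containing the
# near-collinear pair inflates parameter variance and scores worse.
well_conditioned = a_optimality(X[:, [0, 1, 2]])
ill_conditioned = a_optimality(X[:, [0, 1, 5]])
```

Minimising this quantity alongside the prediction error is what yields the low-variance, parsimonious models the abstract describes.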

Relevance: 30.00%

Abstract:

In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting this data to transform it into valuable and easily comprehensible information are of utmost importance. One key topic in this area is the capability to deduce future system behaviour from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed and their extensions to state estimation and data fusion are derived. All these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work that has tied together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modelling, empirical data modelling, knowledge discovery, data mining, and data fusion.

Relevance: 30.00%

Abstract:

This study details validation of two separate multiplex STR systems for use in paternity investigations. These are the Second Generation Multiplex (SGM) developed by the UK Forensic Science Service and the PowerPlex 1 multiplex commercially available from Promega Inc. (Madison, WI, USA). These multiplexes contain 12 different STR systems (two are duplicated in the two systems). Population databases from Caucasian, Asian and Afro-Caribbean populations have been compiled for all loci. In all but two of the 36 STR/ethnic group combinations, no evidence was obtained to indicate inconsistency with Hardy-Weinberg (HW) proportions. Empirical and theoretical approaches have been taken to validate these systems for paternity testing. Samples from 121 cases of disputed paternity were analysed using established Single Locus Probe (SLP) tests currently in use, and also using the two multiplex STR systems. Results of all three test systems were compared and no non-conformities in the conclusions were observed, although four examples of apparent germ line mutations in the STR systems were identified. The data was analysed to give information on expected paternity indices and exclusion rates for these STR systems. The 12 systems combined comprise a highly discriminating test suitable for paternity testing. 99.96% of non-fathers are excluded from paternity on two or more STR systems. Where no exclusion is found, Paternity Index (PI) values of > 10,000 are expected in > 96% of cases.
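The way per-locus statistics combine across independent STR loci can be sketched with the standard identities: the combined Paternity Index is the product of the per-locus PIs, and the combined power of exclusion follows from the per-locus exclusion probabilities. These are textbook formulae, not this study's computations, and all numbers below are illustrative assumptions.

```python
from math import prod

# Hypothetical per-locus paternity indices for 12 independent STR loci.
locus_pi = [4.2, 3.1, 5.6, 2.8, 6.0, 3.9, 4.4, 2.5, 5.1, 3.3, 4.8, 2.9]
combined_pi = prod(locus_pi)  # product over independent loci

# Hypothetical per-locus probability of excluding a random non-father.
locus_excl = [0.55] * 12
# A non-father escapes exclusion only if he is excluded at NO locus:
combined_excl = 1.0 - prod(1.0 - e for e in locus_excl)
```

Even with modest per-locus figures, the products grow quickly, which is why a 12-locus panel can exclude well over 99.9% of non-fathers and yield very large combined PIs of the kind reported above.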

Relevance: 30.00%

Abstract:

An important goal in computational neuroanatomy is the complete and accurate simulation of neuronal morphology. We are developing computational tools to model three-dimensional dendritic structures based on sets of stochastic rules. This paper reports an extensive, quantitative anatomical characterization of simulated motoneurons and Purkinje cells. We used several local and global algorithms implemented in the L-Neuron and ArborVitae programs to generate sets of virtual neurons. Parameter statistics for all algorithms were measured from experimental data, thus providing a compact and consistent description of these morphological classes. We compared the emergent anatomical features of each group of virtual neurons with those of the experimental database in order to gain insights into the plausibility of the model assumptions, potential improvements to the algorithms, and non-trivial relations among morphological parameters. Algorithms mainly based on local constraints (e.g., branch diameter) were successful in reproducing many morphological properties of both motoneurons and Purkinje cells (e.g., total length, asymmetry, number of bifurcations). The addition of global constraints (e.g., trophic factors) improved the angle-dependent emergent characteristics (average Euclidean distance from the soma to the dendritic terminations, dendritic spread). Virtual neurons systematically displayed greater anatomical variability than real cells, suggesting the need for additional constraints in the models. For several emergent anatomical properties, a specific algorithm reproduced the experimental statistics better than the others did. However, relative performances were often reversed for different anatomical properties and/or morphological classes. Thus, combining the strengths of alternative generative models could lead to comprehensive algorithms for the complete and accurate simulation of dendritic morphology.

Relevance: 30.00%

Abstract:

A vision system for recognizing rigid and articulated three-dimensional objects in two-dimensional images is described. Geometrical models are extracted from a commercial computer aided design package. The models are then augmented with appearance and functional information which improves the system's hypothesis generation, hypothesis verification, and pose refinement. Significant advantages over existing CAD-based vision systems, which utilize only information available in the CAD system, are realized. Examples show the system recognizing, locating, and tracking a variety of objects in a robot work-cell and in natural scenes.

Relevance: 30.00%

Abstract:

It is generally assumed that the variability of neuronal morphology has an important effect on both the connectivity and the activity of the nervous system, but this effect has not been thoroughly investigated. Neuroanatomical archives represent a crucial tool to explore structure–function relationships in the brain. We are developing computational tools to describe, generate, store and render large sets of three-dimensional neuronal structures in a format that is compact, quantitative, accurate and readily accessible to the neuroscientist. Single-cell neuroanatomy can be characterized quantitatively at several levels. In computer-aided neuronal tracing files, a dendritic tree is described as a series of cylinders, each represented by diameter, spatial coordinates and the connectivity to other cylinders in the tree. This ‘Cartesian’ description constitutes a completely accurate mapping of dendritic morphology but it bears little intuitive information for the neuroscientist. In contrast, a classical neuroanatomical analysis characterizes neuronal dendrites on the basis of the statistical distributions of morphological parameters, e.g. maximum branching order or bifurcation asymmetry. This description is intuitively more accessible, but it only yields information on the collective anatomy of a group of dendrites, i.e. it is not complete enough to provide a precise ‘blueprint’ of the original data. We are adopting a third, intermediate level of description, which consists of the algorithmic generation of neuronal structures within a certain morphological class based on a set of ‘fundamental’, measured parameters. This description is as intuitive as a classical neuroanatomical analysis (parameters have an intuitive interpretation), and as complete as a Cartesian file (the algorithms generate and display complete neurons). The advantages of the algorithmic description of neuronal structure are immense. If an algorithm can measure the values of a handful of parameters from an experimental database and generate virtual neurons whose anatomy is statistically indistinguishable from that of their real counterparts, a great deal of data compression and amplification can be achieved. Data compression results from the quantitative and complete description of thousands of neurons with a handful of statistical distributions of parameters. Data amplification is possible because, from a set of experimental neurons, many more virtual analogues can be generated. This approach could allow one, in principle, to create and store a neuroanatomical database containing data for an entire human brain in a personal computer. We are using two programs, L-NEURON and ARBORVITAE, to investigate systematically the potential of several different algorithms for the generation of virtual neurons. Using these programs, we have generated anatomically plausible virtual neurons for several morphological classes, including guinea pig cerebellar Purkinje cells and cat spinal cord motor neurons. These virtual neurons are stored in an online electronic archive of dendritic morphology. This process highlights the potential and the limitations of the ‘computational neuroanatomy’ strategy for neuroscience databases.
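The algorithmic level of description can be illustrated with a toy generator in the spirit of L-NEURON and ARBORVITAE, though far simpler: each segment either bifurcates or terminates, with daughter branches tapering until a minimum diameter is reached. All parameter values below are illustrative assumptions, not measured statistics from any experimental database.

```python
import random

random.seed(1)

BRANCH_PROB = 0.7    # probability that a segment bifurcates (assumed)
TAPER = 0.7          # daughter/parent diameter ratio (assumed)
MIN_DIAMETER = 0.2   # growth stops below this diameter, in um (assumed)

def grow(diameter, depth=0, max_depth=12):
    """Recursively grow a binary dendritic tree; return its segment count."""
    if diameter < MIN_DIAMETER or depth >= max_depth:
        return 1
    if random.random() < BRANCH_PROB:
        d_child = diameter * TAPER
        return 1 + grow(d_child, depth + 1) + grow(d_child, depth + 1)
    return 1

# A population of virtual trees: a handful of parameters "amplifies" into
# many distinct morphologies, with variability emerging from the rules.
segments = [grow(diameter=2.0) for _ in range(50)]
mean_size = sum(segments) / len(segments)
```

Data compression and amplification are both visible here: three numbers generate an arbitrarily large population of trees, whose emergent statistics (such as `mean_size`) can then be compared against real anatomy.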

Relevance: 30.00%

Abstract:

Solar irradiance measurements from a new high-density urban network in London are presented. Annual averages demonstrate that central London receives 30 ± 10 W m-2 less solar irradiance than outer London at midday, equivalent to 9 ± 3% less than the London average. Particulate matter and AERONET measurements combined with radiative transfer modelling suggest that the direct aerosol radiative effect could explain 33 to 40% of the inner London deficit, and a further 27 to 50% could be explained by increased cloud optical depth due to the aerosol indirect effect. These results have implications for solar power generation and urban energy balance models. A new technique using ‘Langley flux gradients’ to infer aerosol column concentrations over clear periods of three hours has been developed and applied to three case studies. Comparisons with particulate matter measurements across London have been performed and demonstrate that the solar irradiance measurement network is able to detect the aerosol distribution across London and the transport of a pollution plume out of London.

Relevance: 30.00%

Abstract:

Over recent years there has been an increasing deployment of renewable energy generation technologies, particularly large-scale wind farms. As wind farm deployment increases, it is vital to gain a good understanding of how the energy produced is affected by climate variations, over a wide range of time-scales, from short (hours to weeks) to long (months to decades) periods. By relating wind speed at specific sites in the UK to a large-scale climate pattern (the North Atlantic Oscillation or "NAO"), the power generated by a modelled wind turbine under three different NAO states is calculated. It was found that the wind conditions under these NAO states may yield a difference in the mean wind power output of up to 10%. A simple model is used to demonstrate that forecasts of future NAO states can potentially be used to improve month-ahead statistical forecasts of monthly-mean wind power generation. The results confirm that the NAO has a significant impact on the hourly-, daily- and monthly-mean power output distributions from the turbine with important implications for (a) the use of meteorological data (e.g. their relationship to large scale climate patterns) in wind farm site assessment and, (b) the utilisation of seasonal-to-decadal climate forecasts to estimate future wind farm power output. This suggests that further research into the links between large-scale climate variability and wind power generation is both necessary and valuable.
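Why NAO-driven shifts in the wind-speed distribution translate into sizeable changes in mean power can be seen from an idealised turbine power curve (cut-in, rated and cut-out speeds, cubic in between). This is a generic sketch, not the paper's modelled turbine, and all parameter values and wind samples are illustrative assumptions.

```python
def power_kw(v, cut_in=3.0, rated_speed=12.0, cut_out=25.0, rated_kw=2000.0):
    """Idealised wind-turbine power curve: zero outside the operating band,
    cubic growth between cut-in and rated speed, constant at rated power."""
    if v < cut_in or v >= cut_out:
        return 0.0
    if v >= rated_speed:
        return rated_kw
    return rated_kw * (v**3 - cut_in**3) / (rated_speed**3 - cut_in**3)

# Mean power under two hypothetical NAO-conditioned wind regimes (m/s):
nao_pos = [9.0, 11.0, 13.0, 8.0, 10.0]   # windier sample (NAO+)
nao_neg = [7.0, 9.0, 11.0, 6.0, 8.0]     # calmer sample (NAO-)
mean_pos = sum(power_kw(v) for v in nao_pos) / len(nao_pos)
mean_neg = sum(power_kw(v) for v in nao_neg) / len(nao_neg)
```

Because power varies roughly with the cube of wind speed below rated speed, even a modest shift in the speed distribution between NAO states produces a disproportionate change in mean output, consistent with the up-to-10% difference reported above.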

Relevance: 30.00%

Abstract:

Waves with periods shorter than the inertial period exist in the atmosphere (as inertia-gravity waves) and in the oceans (as Poincaré and internal gravity waves). Such waves owe their origin to various mechanisms, but of particular interest are those arising either from local secondary instabilities or from spontaneous emission due to loss of balance. These phenomena have been studied in the laboratory, in both the mechanically forced and the thermally forced rotating annulus. Their generation mechanisms, especially in the latter system, have not yet been fully understood, however. Here we examine short-period waves in a numerical model of the rotating thermal annulus, and show how the results are consistent with those from earlier laboratory experiments. We then show how these waves are consistent with being inertia-gravity waves generated by a localised instability within the thermal boundary layer, the location of which is determined by regions of strong shear and downwelling at certain points within a large-scale baroclinic wave flow. The resulting instability launches small-scale inertia-gravity waves into the geostrophic interior of the flow. Their behaviour is captured in fully nonlinear numerical simulations in a finite-difference, 3D Boussinesq Navier-Stokes model. Such a mechanism has many similarities with those responsible for launching small- and meso-scale inertia-gravity waves in the atmosphere from fronts and local convection.
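The defining property of these waves, periods shorter than the inertial period, follows from the standard Boussinesq dispersion relation for inertia-gravity waves, ω² = (f²m² + N²k_h²)/(k_h² + m²), whose frequencies always lie between f and N. The parameter values below are illustrative assumptions, not taken from the annulus simulations.

```python
import math

f = 1.0e-4   # Coriolis parameter, s^-1 (illustrative assumption)
N = 1.0e-2   # buoyancy frequency, s^-1 (illustrative assumption)

def omega(kh, m):
    """Intrinsic frequency of a Boussinesq inertia-gravity wave with
    horizontal wavenumber kh and vertical wavenumber m."""
    return math.sqrt((f**2 * m**2 + N**2 * kh**2) / (kh**2 + m**2))

# A small-scale wave: 5 km horizontal, 1 km vertical wavelength.
w = omega(kh=2 * math.pi / 5.0e3, m=2 * math.pi / 1.0e3)
inertial_period = 2 * math.pi / f
wave_period = 2 * math.pi / w
```

Since f < ω < N for any wavenumbers, the wave period is necessarily shorter than the inertial period, which is the signature used to identify these waves in both the laboratory and the numerical model.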

Relevance: 30.00%

Abstract:

Electricity consumption in Ghana is estimated to be increasing by 10% per annum due to demand from the growing population. However, current sources of production (hydro and thermal facilities) generate only 66% of current demand. Given current trends, it is difficult to substantiate these basic facts because of the lack of information. As a result, research into the existing sources of electricity generation, electricity consumption and prospective projects has been performed. This was achieved using three key techniques: review of the literature, empirical studies and modelling. The results presented suggest that the current installed generation capacity (i.e. 1,960 MW) must be increased to 9,405.59 MW, assuming 85% plant availability. This would then be capable of coping with the growing demand, give the entire population access to electricity, and support the commercial and industrial activities needed for the growth of the economy. This research is also intended to present an academic research agenda for further exploration of the subject area, without which the growth of the country would stagnate.
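The capacity arithmetic above can be sketched in a few lines: demand compounds at 10% per annum, and installed capacity must exceed demand by a margin set by the 85% plant availability. This is a back-of-envelope illustration; the helper names and example horizon are assumptions, and the abstract's 9,405.59 MW target is not re-derived here.

```python
GROWTH = 0.10          # annual demand growth (from the abstract)
AVAILABILITY = 0.85    # assumed plant availability (from the abstract)

current_capacity_mw = 1960.0

def demand_after(years, demand_now):
    """Peak demand after compound annual growth of GROWTH."""
    return demand_now * (1.0 + GROWTH) ** years

def capacity_for(demand_mw):
    """Installed capacity needed so that available output meets demand."""
    return demand_mw / AVAILABILITY

# Output the current 1,960 MW fleet can actually deliver at 85% availability:
servable_now = current_capacity_mw * AVAILABILITY
```

For example, `capacity_for(demand_after(10, d))` gives the installed capacity needed a decade ahead for any assumed current demand `d`, which is the kind of projection the modelling component of the study performs.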