128 results for Generation from examples


Relevance: 30.00%

Abstract:

Olive pomace oil, also known as "orujo" olive oil, is a blend of refined-pomace oil and virgin olive oil, fit for human consumption. Maslinic acid, oleanolic acid, erythrodiol, and uvaol are pentacyclic triterpenes, found in the non-glyceride fraction of orujo oil, which have previously been reported to have anti-inflammatory properties. In the present work, we investigated the effect of these minor components on pro-inflammatory cytokine production by human peripheral blood mononuclear cells in six different samples. Uvaol, erythrodiol, and oleanolic acid significantly decreased IL-1β and IL-6 production in a dose-dependent manner. All three compounds significantly reduced TNF-α production at 100 μM; however, at 10 μM, uvaol and oleanolic acid enhanced the generation of TNF-α. In contrast, maslinic acid did not significantly alter the concentration of those cytokines, with the exception of a slight inhibitory effect at 100 μM. All four triterpenes inhibited production of I-309 at 50 μM and 100 μM. However, uvaol enhanced I-309 production at 10 μM. The triterpenic dialcohols had a similar effect on MIG production. In conclusion, this study demonstrates that pentacyclic triterpenes in orujo oil exhibit pro- and anti-inflammatory properties depending on chemical structure and dose, and may be useful in modulating the immune response. © 2006 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

This study was carried out to examine the effect of inulin (IN), fructooligosaccharide (FOS), polydextrose (POL) and isomaltooligosaccharides (ISO), alone and in combination, on gas production, gas composition and prebiotic effects. Static batch culture fermentation was performed with faecal samples from three healthy volunteers to study the volume and composition of the gas generated and changes in bacterial populations. The four carbohydrates, alone or mixed with one another (50:50), were examined. A prebiotic index (PI) was calculated and used to compare prebiotic effects. The high amount of gas produced by IN was reduced by mixing it with FOS. No reduction in gas generation was observed when POL and ISO were mixed with other substrates. The mixture of IN and FOS was effective in reducing the amount of gas produced while augmenting or maintaining the potential to support the growth of bifidobacteria in faecal batch culture, as the highest PI was achieved with FOS alone and with a mixture of FOS and IN. In contrast, a high volume of gas was generated in the presence of POL and ISO, and these had a lower prebiotic effect. The results of this study imply that a mixture of prebiotics could prove effective in reducing the amount of gas generated by the gut microflora. © 2007 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

Random number generation (RNG) is a functionally complex process that is highly controlled and therefore dependent on Baddeley's central executive. This study addresses this issue by investigating whether key predictions from this framework are compatible with empirical data. In Experiment 1, the effect of increasing task demands by increasing the rate of the paced generation was comprehensively examined. As expected, faster rates affected performance negatively because central resources were increasingly depleted. Next, the effects of participants' exposure were manipulated in Experiment 2 by providing increasing amounts of practice on the task. There was no improvement over 10 practice trials, suggesting that the high level of strategic control required by the task was constant and not amenable to any automatization gain with repeated exposure. Together, the results demonstrate that RNG performance is a highly controlled and demanding process sensitive to additional demands on central resources (Experiment 1) and is unaffected by repeated performance or practice (Experiment 2). These features render the easily administered RNG task an ideal and robust index of executive function that is highly suitable for repeated clinical use.
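For readers unfamiliar with the task: RNG performance is scored with summary statistics computed over the produced response sequence. One classic measure (not necessarily the one used in this study) is Evans' RNG index, which compares digram usage with single-response usage; a minimal sketch:

```python
import math
from collections import Counter

def rng_index(seq):
    """Evans' RNG index: 0 for maximally varied digram usage,
    1 for a fully stereotyped (cyclic) sequence."""
    pairs = Counter(zip(seq, seq[1:]))   # digram counts
    singles = Counter(seq[:-1])          # single-response counts
    num = sum(n * math.log(n) for n in pairs.values())
    den = sum(n * math.log(n) for n in singles.values())
    return num / den if den else 0.0

# A repetitive sequence scores higher (less random) than a varied one.
stereotyped = [1, 2, 1, 2, 1, 2, 1, 2, 1, 2]
varied = [1, 7, 3, 9, 2, 8, 4, 6, 5, 1]
print(rng_index(stereotyped), rng_index(varied))
```

Under faster pacing, sequences typically drift toward stereotyped digrams, so this index would be expected to rise, consistent with the rate effect reported in Experiment 1.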

Relevance: 30.00%

Abstract:

Objective: To explore the extent and nature of change in cognitive-motor interference (CMI) among rehabilitating stroke patients who showed dual-task gait decrement at initial assessment. Design: Experimental, within-subjects, repeated-measures design. Setting: Rehabilitation centre for adults with acquired, non-progressive brain injury. Subjects: Ten patients with unilateral stroke, available for reassessment 1-9 months following their participation in a study of CMI after brain injury. Measures: Median stride duration; mean word generation. Methods: Two one-minute walking trials, two one-minute word generation trials, and two one-minute trials of simultaneous walking and word generation; 10-metre walking time; Barthel ADL Scale score. Results: Seven out of ten patients showed a reduction over time in dual-task gait decrement. Three out of ten showed a reduction in cognitive decrement. Only one showed a concomitant reduction in both gait and word-generation decrement. Conclusion: The extent of CMI during relearning to walk after a stroke reduced over time in the majority of patients. Effects were more evident in improved stride duration than in improved cognitive performance. Measures of multiple-task performance should be included in assessments of functional recovery.

Relevance: 30.00%

Abstract:

Two algorithms are presented for finding the point on a non-rational/rational Bezier curve whose normal vector passes through a given external point. The algorithms are based on standard Bezier curve evaluation schemes: de Casteljau's algorithm for non-rational Bezier curves and Farin's recursion for rational Bezier curves, respectively. Orthogonal projections from the external point are used to guide the directional search in the proposed iterative algorithms. Using Lyapunov's method, it is shown that each algorithm converges to a local minimum in both the non-rational and rational cases. It is also shown that on convergence the distance between the point on the curve and the external point reaches a local minimum for both approaches. Illustrative examples are included to demonstrate the effectiveness of the proposed approaches.
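The geometric condition being solved for is that the vector from the curve point to the external point is parallel to the curve normal (equivalently, orthogonal to the tangent). A naive brute-force sketch of the problem setup, using de Casteljau evaluation (this is not the paper's Lyapunov-guided iteration, only an illustration of the objective it minimises):

```python
import numpy as np

def de_casteljau(ctrl, t):
    """Evaluate a non-rational Bezier curve at parameter t by
    repeated linear interpolation of the control points."""
    pts = np.asarray(ctrl, dtype=float)
    while len(pts) > 1:
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

def closest_point(ctrl, p, samples=2000):
    """Brute-force the parameter whose curve point is nearest to p.
    At the minimiser, (q - p) is orthogonal to the curve tangent,
    i.e. the curve normal at q passes through p."""
    ts = np.linspace(0.0, 1.0, samples)
    pts = np.array([de_casteljau(ctrl, t) for t in ts])
    d = np.linalg.norm(pts - np.asarray(p, dtype=float), axis=1)
    i = int(np.argmin(d))
    return ts[i], pts[i]

ctrl = [(0, 0), (1, 2), (3, 2), (4, 0)]   # a planar cubic Bezier curve
t_star, q = closest_point(ctrl, p=(2.0, 3.0))
print(t_star, q)   # by symmetry, the foot point is near t = 0.5
```

The paper's algorithms replace the dense sampling with an iterative parameter update driven by the orthogonal-projection error, which is far cheaper and provably convergent to a local minimum.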

Relevance: 30.00%

Abstract:

Uncertainties in sea-level projections for the 21st century have focused ice sheet modelling efforts to include the processes that are thought to be contributing to the recently observed rapid changes at ice sheet margins. This effort is still in its infancy, however, leaving us unable to make reliable predictions of ice sheet responses to a warming climate if such glacier accelerations were to increase in size and frequency. The geological record, however, has long identified examples of nonlinear ice sheet response to climate forcing (Shackleton NJ, Opdyke ND. 1973. Oxygen isotope and paleomagnetic stratigraphy of equatorial Pacific core V28-239, late Pliocene to latest Pleistocene. Geological Society of America Memoirs 145: 449-464; Fairbanks RG. 1989. A 17,000 year glacio-eustatic sea level record: influence of glacial melting rates on the Younger Dryas event and deep ocean circulation. Nature 342: 637-642; Bard E, Hamelin B, Arnold M, Montaggioni L, Cabioch G, Faure G, Rougerie F. 1996. Sea level record from Tahiti corals and the timing of deglacial meltwater discharge. Nature 382: 241-244), thus suggesting an alternative strategy for constraining the rate and magnitude of sea-level change that we might expect by the end of this century. Copyright © 2009 John Wiley & Sons, Ltd.

Relevance: 30.00%

Abstract:

This short contribution examines the difficulties that have not yet been fully overcome in the many developments made from the simplest (and original) tube model for entangled polymers. It is concluded that many more length scales have to be considered sequentially when deriving a continuum rheological model from molecular considerations than have been considered in the past. In particular, most unresolved issues of the tube theory are related to the length scale of the tube diameter, and molecular dynamics simulations are the perfect route to resolving them. The power of molecular simulations is illustrated by two examples: stress contributions from bonded and non-bonded interactions, and the inter-chain coupling, which is usually neglected in the tube theory.

Relevance: 30.00%

Abstract:

In recent years nonpolynomial finite element methods have received increasing attention for the efficient solution of wave problems. As with their close cousin the method of particular solutions, high efficiency comes from using solutions to the Helmholtz equation as basis functions. We present and analyze such a method for the scattering of two-dimensional scalar waves from a polygonal domain that achieves exponential convergence purely by increasing the number of basis functions in each element. Key ingredients are the use of basis functions that capture the singularities at corners and the representation of the scattered field towards infinity by a combination of fundamental solutions. The solution is obtained by minimizing a least-squares functional, which we discretize in such a way that a matrix least-squares problem is obtained. We give computable exponential bounds on the rate of convergence of the least-squares functional that are in very good agreement with the observed numerical convergence. Challenging numerical examples, including a nonconvex polygon with several corner singularities, and a cavity domain, are solved to around 10 digits of accuracy with a few seconds of CPU time. The examples are implemented concisely with MPSpack, a MATLAB toolbox for wave computations with nonpolynomial basis functions, developed by the authors. A code example is included.
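The least-squares matching described above can be illustrated, in miniature, by the closely related method of fundamental solutions: represent the scattered field as a combination of outgoing Helmholtz fundamental solutions and fit the boundary condition in the least-squares sense. The sketch below is not MPSpack, and uses a smooth circular scatterer rather than a polygon (so no corner-adapted basis is needed); it shows only the matrix least-squares step:

```python
import numpy as np
from scipy.special import hankel1

k = 5.0                      # wavenumber
n_bdry, n_src = 200, 60      # collocation points and source (charge) points

# Unit-circle scatterer; fundamental-solution sources on a circle inside it.
th_b = 2 * np.pi * np.arange(n_bdry) / n_bdry
th_s = 2 * np.pi * np.arange(n_src) / n_src
bdry = np.column_stack([np.cos(th_b), np.sin(th_b)])
src = 0.7 * np.column_stack([np.cos(th_s), np.sin(th_s)])

# Basis: outgoing fundamental solutions Phi(x, y_j) = (i/4) H0^(1)(k|x - y_j|).
dist = np.linalg.norm(bdry[:, None, :] - src[None, :, :], axis=2)
A = 0.25j * hankel1(0, k * dist)

# Sound-soft (Dirichlet) condition: u_scat = -u_inc on the boundary,
# for an incident plane wave travelling in the +x direction.
u_inc = np.exp(1j * k * bdry[:, 0])
coef, *_ = np.linalg.lstsq(A, -u_inc, rcond=None)

residual = np.linalg.norm(A @ coef + u_inc) / np.linalg.norm(u_inc)
print(f"relative boundary residual: {residual:.2e}")
```

For polygons, the paper's key additional ingredients are corner-singularity basis functions inside each element, which restore the exponential convergence that a plain fundamental-solution basis loses at re-entrant corners.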

Relevance: 30.00%

Abstract:

A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to tackle this problem effectively. A new, simple preprocessing method is initially derived and applied to reduce the rule base, followed by a fine model detection process based on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new A-optimality experimental-design-based criteria were used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric; in the later stage, the A-optimality design criterion is incorporated into a new composite cost function that minimises the model prediction error while penalising the model parameter variance. The use of NeuDeC leads to unbiased model parameters with low parameter variance and the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
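The second-stage structure detection mentioned above can be sketched as greedy forward selection: at each step, choose the candidate term whose orthogonalised component explains the most remaining output energy (the error-reduction-ratio criterion of forward OLS). A bare illustration, without the A-optimality composite cost that is the paper's contribution:

```python
import numpy as np

def forward_ols(X, y, n_terms):
    """Greedy forward orthogonal least squares term selection.
    Returns the indices of the selected candidate columns of X."""
    X = np.array(X, dtype=float)
    resid = np.array(y, dtype=float)
    selected = []
    for _ in range(n_terms):
        # Score each unselected column against the current residual.
        scores = {}
        for j in range(X.shape[1]):
            if j in selected:
                continue
            w = X[:, j]
            nw = w @ w
            if nw < 1e-12:       # column already absorbed by earlier terms
                continue
            scores[j] = (w @ resid) ** 2 / nw
        j_best = max(scores, key=scores.get)
        selected.append(j_best)
        # Orthogonalise the residual and remaining columns against the term.
        w = X[:, j_best] / np.linalg.norm(X[:, j_best])
        resid = resid - w * (w @ resid)
        X = X - np.outer(w, w @ X)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 3.0 * X[:, 2] - 2.0 * X[:, 7] + 0.01 * rng.normal(size=200)
print(forward_ols(X, y, 2))   # expected to recover columns 2 and 7
```

NeuDeC augments this selection metric with an A-optimality penalty on the parameter covariance, so that terms are chosen for low parameter variance as well as low prediction error.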

Relevance: 30.00%

Abstract:

In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting these data to transform them into valuable and easily comprehensible information are of utmost importance. One key topic in this area is the capability to deduce future system behavior from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed, and their extensions to state estimation and data fusion are derived. All these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work which has tied together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modelling, empirical data modelling, knowledge discovery, data mining, and data fusion.

Relevance: 30.00%

Abstract:

This study details validation of two separate multiplex STR systems for use in paternity investigations. These are the Second Generation Multiplex (SGM) developed by the UK Forensic Science Service and the PowerPlex 1 multiplex commercially available from Promega Inc. (Madison, WI, USA). These multiplexes contain 12 different STR systems (two are duplicated in the two systems). Population databases from Caucasian, Asian and Afro-Caribbean populations have been compiled for all loci. In all but two of the 36 STR/ethnic group combinations, no evidence was obtained to indicate inconsistency with Hardy-Weinberg (HW) proportions. Empirical and theoretical approaches have been taken to validate these systems for paternity testing. Samples from 121 cases of disputed paternity were analysed using established Single Locus Probe (SLP) tests currently in use, and also using the two multiplex STR systems. Results of all three test systems were compared and no non-conformities in the conclusions were observed, although four examples of apparent germ line mutations in the STR systems were identified. The data was analysed to give information on expected paternity indices and exclusion rates for these STR systems. The 12 systems combined comprise a highly discriminating test suitable for paternity testing. 99.96% of non-fathers are excluded from paternity on two or more STR systems. Where no exclusion is found, Paternity Index (PI) values of > 10,000 are expected in > 96% of cases.
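The reported figures combine per-locus likelihood ratios in the standard way: paternity indices from independent loci multiply (assuming linkage equilibrium), and the combined PI converts to a posterior probability given a prior. A small illustration with made-up per-locus values:

```python
import math

def combined_pi(locus_pis):
    """Paternity indices from independent loci multiply."""
    return math.prod(locus_pis)

def probability_of_paternity(pi, prior=0.5):
    """Bayesian posterior probability of paternity from the combined PI."""
    return pi * prior / (pi * prior + (1 - prior))

# Illustrative (made-up) per-locus PIs for 12 STR systems:
pis = [2.1, 3.4, 1.8, 2.9, 4.2, 2.5, 1.6, 3.0, 2.2, 2.8, 1.9, 3.6]
pi = combined_pi(pis)
print(pi, probability_of_paternity(pi))
```

Even with modest per-locus values like these, the product across 12 loci exceeds 10,000, which is why the abstract can report PI > 10,000 in more than 96% of non-excluded cases.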

Relevance: 30.00%

Abstract:

An important goal in computational neuroanatomy is the complete and accurate simulation of neuronal morphology. We are developing computational tools to model three-dimensional dendritic structures based on sets of stochastic rules. This paper reports an extensive, quantitative anatomical characterization of simulated motoneurons and Purkinje cells. We used several local and global algorithms implemented in the L-Neuron and ArborVitae programs to generate sets of virtual neurons. Parameter statistics for all algorithms were measured from experimental data, thus providing a compact and consistent description of these morphological classes. We compared the emergent anatomical features of each group of virtual neurons with those of the experimental database in order to gain insights into the plausibility of the model assumptions, potential improvements to the algorithms, and non-trivial relations among morphological parameters. Algorithms mainly based on local constraints (e.g., branch diameter) were successful in reproducing many morphological properties of both motoneurons and Purkinje cells (e.g., total length, asymmetry, number of bifurcations). The addition of global constraints (e.g., trophic factors) improved the angle-dependent emergent characteristics (average Euclidean distance from the soma to the dendritic terminations, dendritic spread). Virtual neurons systematically displayed greater anatomical variability than real cells, suggesting the need for additional constraints in the models. For several emergent anatomical properties, a specific algorithm reproduced the experimental statistics better than the others did. However, relative performances were often reversed for different anatomical properties and/or morphological classes. Thus, combining the strengths of alternative generative models could lead to comprehensive algorithms for the complete and accurate simulation of dendritic morphology.
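The comparison of emergent statistics between virtual and real neurons can be sketched with a two-sample test on a single morphometric. The data below are synthetic stand-ins, not the study's measurements; the over-dispersed sample mimics the excess variability the study reports for virtual neurons:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

# Stand-ins for one morphometric (e.g. total dendritic length, in um):
real = rng.lognormal(mean=8.0, sigma=0.3, size=500)          # "experimental"
virtual_ok = rng.lognormal(mean=8.0, sigma=0.3, size=500)    # matching model
virtual_wide = rng.lognormal(mean=8.0, sigma=0.6, size=500)  # too variable

# A two-sample Kolmogorov-Smirnov test flags the over-dispersed population.
p_ok = ks_2samp(real, virtual_ok).pvalue
p_wide = ks_2samp(real, virtual_wide).pvalue
print(f"matched p = {p_ok:.3f}, over-dispersed p = {p_wide:.2e}")
```

Applying such tests per morphometric and per algorithm is one way the relative performances of generative models described above can be ranked.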

Relevance: 30.00%

Abstract:

A vision system for recognizing rigid and articulated three-dimensional objects in two-dimensional images is described. Geometrical models are extracted from a commercial computer-aided design package. The models are then augmented with appearance and functional information, which improves the system's hypothesis generation, hypothesis verification, and pose refinement. Significant advantages over existing CAD-based vision systems, which utilize only information available in the CAD system, are realized. Examples show the system recognizing, locating, and tracking a variety of objects in a robot work-cell and in natural scenes.

Relevance: 30.00%

Abstract:

It is generally assumed that the variability of neuronal morphology has an important effect on both the connectivity and the activity of the nervous system, but this effect has not been thoroughly investigated. Neuroanatomical archives represent a crucial tool to explore structure-function relationships in the brain. We are developing computational tools to describe, generate, store and render large sets of three-dimensional neuronal structures in a format that is compact, quantitative, accurate and readily accessible to the neuroscientist. Single-cell neuroanatomy can be characterized quantitatively at several levels. In computer-aided neuronal tracing files, a dendritic tree is described as a series of cylinders, each represented by diameter, spatial coordinates and the connectivity to other cylinders in the tree. This 'Cartesian' description constitutes a completely accurate mapping of dendritic morphology, but it bears little intuitive information for the neuroscientist. In contrast, a classical neuroanatomical analysis characterizes neuronal dendrites on the basis of the statistical distributions of morphological parameters, e.g. maximum branching order or bifurcation asymmetry. This description is intuitively more accessible, but it only yields information on the collective anatomy of a group of dendrites, i.e. it is not complete enough to provide a precise 'blueprint' of the original data. We are adopting a third, intermediate level of description, which consists of the algorithmic generation of neuronal structures within a certain morphological class based on a set of 'fundamental', measured parameters. This description is as intuitive as a classical neuroanatomical analysis (parameters have an intuitive interpretation), and as complete as a Cartesian file (the algorithms generate and display complete neurons). The advantages of the algorithmic description of neuronal structure are immense.
If an algorithm can measure the values of a handful of parameters from an experimental database and generate virtual neurons whose anatomy is statistically indistinguishable from that of their real counterparts, a great deal of data compression and amplification can be achieved. Data compression results from the quantitative and complete description of thousands of neurons with a handful of statistical distributions of parameters. Data amplification is possible because, from a set of experimental neurons, many more virtual analogues can be generated. This approach could allow one, in principle, to create and store a neuroanatomical database containing data for an entire human brain in a personal computer. We are using two programs, L-NEURON and ARBORVITAE, to investigate systematically the potential of several different algorithms for the generation of virtual neurons. Using these programs, we have generated anatomically plausible virtual neurons for several morphological classes, including guinea pig cerebellar Purkinje cells and cat spinal cord motor neurons. These virtual neurons are stored in an online electronic archive of dendritic morphology. This process highlights the potential and the limitations of the 'computational neuroanatomy' strategy for neuroscience databases.
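The algorithmic description can be sketched as a recursive sampler: each branch draws its length from a measured distribution and bifurcates with a diameter-dependent probability, with daughter diameters tapering. A toy version with invented local rules (not L-NEURON's actual parameter set):

```python
import random

def grow_tree(diam, depth=0, max_depth=12, rng=None):
    """Recursively grow a dendritic subtree from simple stochastic rules.
    Returns a list of (diameter, length) segments."""
    rng = rng or random.Random(42)
    length = rng.gammavariate(2.0, 20.0)      # sampled branch length (um)
    segments = [(diam, length)]
    # Thicker branches are more likely to bifurcate; thin ones terminate.
    if depth < max_depth and rng.random() < min(1.0, diam / 2.0):
        for ratio in (0.8, 0.6):              # asymmetric daughter taper
            segments += grow_tree(diam * ratio, depth + 1, max_depth, rng)
    return segments

tree = grow_tree(diam=3.0)
total_len = sum(length for _, length in tree)
print(len(tree), "segments, total length", round(total_len, 1), "um")
```

Data amplification falls out directly: re-running the sampler with new random seeds yields arbitrarily many virtual analogues from one fitted parameter set, while the parameter set itself (a few distributions) is the compressed description.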

Relevance: 30.00%

Abstract:

Solar irradiance measurements from a new high density urban network in London are presented. Annual averages demonstrate that central London receives 30 ± 10 Wm-2 less solar irradiance than outer London at midday, equivalent to 9 ± 3% less than the London average. Particulate matter and AERONET measurements combined with radiative transfer modeling suggest that the direct aerosol radiative effect could explain 33 to 40% of the inner London deficit and a further 27 to 50% could be explained by increased cloud optical depth due to the aerosol indirect effect. These results have implications for solar power generation and urban energy balance models. A new technique using ‘Langley flux gradients’ to infer aerosol column concentrations over clear periods of three hours has been developed and applied to three case studies. Comparisons with particulate matter measurements across London have been performed and demonstrate that the solar irradiance measurement network is able to detect aerosol distribution across London and transport of a pollution plume out of London.