66 results for Linear and multilinear programming


Relevance: 100.00%

Abstract:

Background: Mothers' self-reported stroking of their infants over the first weeks of life modifies the association between prenatal depression and physiological and emotional reactivity at 7 months, consistent with animal studies of the effects of tactile stimulation. We now investigate whether the effects of maternal stroking persist to 2.5 years. Given animal and human evidence for sex differences in the effects of prenatal stress, we compare associations in boys and girls. Method: From a general population sample of 1233 first-time mothers recruited at 20 weeks gestation, we drew a random sample of 316 for assessment at 32 weeks, stratified by reported inter-partner psychological abuse, a risk indicator for child development. Of these mothers, 243 reported at 5 and 9 weeks how often they stroked their infants, and completed the Child Behavior Checklist (CBCL) at 2.5 years post-delivery. Results: There was a significant interaction between prenatal anxiety and maternal stroking in the prediction of CBCL internalizing (p = 0.001) and anxious/depressed scores (p < 0.001). The effects were stronger in females than in males, and the three-way interaction prenatal anxiety × maternal stroking × sex of infant was significant for internalizing symptoms (p = 0.003). The interactions arose from an association between prenatal anxiety and internalizing symptoms only in the presence of low maternal stroking. Conclusions: The findings are consistent with stable epigenetic effects, many of them sex-specific, reported in animal studies. While epigenetic mechanisms may underlie the associations, it remains to be established whether stroking affects gene expression in humans.
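
As a rough illustration of the moderation analysis described above, the sketch below fits a three-way interaction (prenatal anxiety x maternal stroking x infant sex) to internalizing scores by ordinary least squares. The file and variable names are hypothetical placeholders, not the study's data or code.

```python
# Hedged sketch: testing a three-way interaction of the kind reported
# above (anxiety x stroking x sex predicting CBCL internalizing).
# All names below (cbcl_followup.csv, anxiety, stroking, sex,
# internalizing) are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cbcl_followup.csv")  # hypothetical data file

# '*' expands to all main effects plus lower-order interactions,
# so the model contains the full anxiety x stroking x sex term.
model = smf.ols("internalizing ~ anxiety * stroking * C(sex)", data=df).fit()
print(model.summary())

# Simple-slopes style check: anxiety-symptom association at low vs.
# high stroking, computed separately for each sex.
for sex, sub in df.groupby("sex"):
    low = sub[sub.stroking <= sub.stroking.median()]
    high = sub[sub.stroking > sub.stroking.median()]
    print(sex,
          smf.ols("internalizing ~ anxiety", low).fit().params["anxiety"],
          smf.ols("internalizing ~ anxiety", high).fit().params["anxiety"])
```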

Relevance: 100.00%

Abstract:

This paper proposes a novel adaptive multiple-modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models, which are all linear. With data available in an online fashion, the performance of all candidate sub-models is monitored over the most recent data window, and the M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-model are adapted via the recursive least squares (RLS) algorithm, while the coefficients of the remaining sub-models are unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error over a recent data window and to apply a sum-to-one constraint to the combination parameters, leading to a closed-form solution so that maximal computational efficiency can be achieved. In addition, at each time step the model prediction is chosen from either the resultant multiple model or the best sub-model, whichever is better. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and computation time.
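
The combination step described above (minimising the window mean square error subject to the combination weights summing to one) has a simple closed form obtained with a single Lagrange multiplier. The sketch below illustrates only that step; the function name, the small ridge safeguard and the toy data are assumptions of this sketch, not the paper's implementation.

```python
# Hedged sketch of the sum-to-one constrained combination step: given
# predictions from the M selected linear sub-models over the most recent
# data window, find combination weights that minimise the window MSE
# subject to the weights summing to one (closed form via one Lagrange
# multiplier).
import numpy as np

def combine_weights(P, y, ridge=1e-8):
    """P: (N, M) sub-model predictions over the window; y: (N,) targets."""
    M = P.shape[1]
    R = P.T @ P + ridge * np.eye(M)          # window correlation matrix (regularised)
    p = P.T @ y
    ones = np.ones(M)
    Rinv_p = np.linalg.solve(R, p)
    Rinv_1 = np.linalg.solve(R, ones)
    lam = (ones @ Rinv_p - 1.0) / (ones @ Rinv_1)   # enforces the sum-to-one constraint
    return Rinv_p - lam * Rinv_1

# Toy usage: three sub-model prediction streams and a target sequence.
rng = np.random.default_rng(0)
P = rng.normal(size=(50, 3))
y = 0.6 * P[:, 0] + 0.4 * P[:, 1] + 0.01 * rng.normal(size=50)
w = combine_weights(P, y)
print(w, w.sum())   # combination weights; their sum is (numerically) one
```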

Relevance: 100.00%

Abstract:

Classical computer vision methods can only weakly emulate some of the multi-level parallelism in signal processing and information sharing that takes place in different parts of the primate visual system and enables it to accomplish many diverse functions of visual perception. One of the main functions of primate vision is to detect and recognise objects in natural scenes despite all the linear and non-linear variations of the objects and their environment. The superior performance of the primate visual system, compared with what machine vision systems have been able to achieve to date, motivates scientists and researchers to explore this area further in pursuit of more efficient vision systems inspired by natural models. In this paper, building blocks for an efficient hierarchical object-recognition model are proposed. Incorporating attention-based processing would lead to a system that processes the visual data in a non-linear way, focusing only on regions of interest and hence reducing the time needed to achieve real-time performance. Further, it is suggested that the visual cortex model for recognising objects be modified by adding non-linearities in the ventral path, consistent with earlier findings reported in the neurophysiology of vision.

Relevance: 100.00%

Abstract:

Milk supply from Mexican dairy farms does not meet demand and small-scale farms can contribute toward closing the gap. Two multi-criteria programming techniques, goal programming and compromise programming, were used in a study of small-scale dairy farms in central Mexico. To build the goal and compromise programming models, 4 ordinary linear programming models were also developed, which had objective functions to maximize metabolizable energy for milk production, to maximize margin of income over feed costs, to maximize metabolizable protein for milk production, and to minimize purchased feedstuffs. Neither multi-criteria approach was significantly better than the other; however, by applying both models it was possible to perform a more comprehensive analysis of these small-scale dairy systems. The multi-criteria programming models affirm findings from previous work and suggest that a forage strategy based on alfalfa, rye-grass, and corn silage would meet nutrient requirements of the herd. Both models suggested that there is an economic advantage in rescheduling the calving season to the second and third calendar quarters to better synchronize higher demand for nutrients with the period of high forage availability.
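
For readers unfamiliar with the techniques named above, the sketch below shows the general shape of a weighted goal-programming model solved as an ordinary LP with scipy. All activities, coefficients and targets are invented for illustration and are not the study's farm data.

```python
# Hedged, minimal sketch of the weighted goal-programming idea used in
# multi-criteria farm planning: choose activity levels x so that several
# goals (here income and metabolizable energy) come as close as possible
# to their targets, penalising under-achievement. Every coefficient and
# target below is an invented illustrative number, not the study's data.
from scipy.optimize import linprog

# Decision variables, in order:
#   x1, x2  = hectares of two hypothetical forage activities
#   n1, p1  = under/over-achievement of the income goal
#   n2, p2  = under/over-achievement of the energy goal
c = [0, 0, 1.0, 0.0, 0.5, 0.0]          # weights on under-achievement n1, n2

A_eq = [[400, 300, 1, -1, 0, 0],        # income:  400*x1 + 300*x2 + n1 - p1 = 5000
        [ 30,  45, 0,  0, 1, -1]]       # energy:   30*x1 +  45*x2 + n2 - p2 =  500
b_eq = [5000, 500]

A_ub = [[1, 1, 0, 0, 0, 0]]             # land constraint: x1 + x2 <= 12 ha
b_ub = [12]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 6, method="highs")
print(res.x[:2], res.fun)               # cropping plan and weighted total deviation
```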

Relevance: 100.00%

Abstract:

The purpose of Research Theme 4 (RT4) was to advance understanding of the basic science issues at the heart of the ENSEMBLES project, focusing on the key processes that govern climate variability and change, and that determine the predictability of climate. Particular attention was given to understanding linear and non-linear feedbacks that may lead to climate surprises, and to understanding the factors that govern the probability of extreme events. Improved understanding of these issues will contribute significantly to the quantification and reduction of uncertainty in seasonal to decadal predictions and projections of climate change. RT4 exploited the ENSEMBLES integrations (stream 1) performed in RT2A as well as undertaking its own experimentation to explore key processes within the climate system. It was working at the cutting edge of problems related to climate feedbacks, the interaction between climate variability and climate change, especially how climate change pertains to extreme events, and the predictability of the climate system on a range of time-scales. The statistical methodologies developed for extreme event analysis are new and state-of-the-art. The RT4-coordinated experiments, which have been conducted with six different atmospheric GCMs forced by common time-invariant sea surface temperature (SST) and sea-ice fields (removing some sources of inter-model variability), are designed to help to understand model uncertainty (rather than scenario or initial condition uncertainty) in predictions of the response to greenhouse-gas-induced warming. RT4 links strongly with RT5 on the evaluation of the ENSEMBLES prediction system and feeds back its results to RT1 to guide improvements in the Earth system models and, through its research on predictability, to steer the development of methods for initialising the ensembles.

Relevance: 100.00%

Abstract:

New experiments underpin the interpretation of the basic division in the crystallization behaviour of polyethylene in terms of whether or not there is time for the fold surface to order before the next molecular layer is added at the growth front. For typical growth rates, in Regime II, polyethylene lamellae form with disordered {001} fold surfaces and then transform, with lamellar thickening and twisting, towards the more-ordered condition found for slower crystallization in Regime I, in which lamellae form with and retain {201} fold surfaces. Several linear and linear-low-density polyethylenes have been used to show that, for the same polymer crystallized alone or in a blend, the growth rate at which the change in initial lamellar condition occurs is reasonably constant, thereby supporting the concept of a specific time for surfaces to attain the ordered {201} state. This specific time, in the range from milliseconds to seconds, increases with molecular length and, in linear-low-density polymer, with higher branch content.

Relevance: 100.00%

Abstract:

Monomer-sequence information in synthetic copolyimides can be recognised by tweezer-type molecules binding to adjacent triplet-sequences on the polymer chains. In the present paper, different tweezer-molecules are found to have different sequence-selectivities, as demonstrated in solution by 1H NMR spectroscopy and in the solid state by single-crystal X-ray analyses of tweezer-complexes with linear and macrocyclic oligo-imides. This work provides clear-cut confirmation of polyimide chain-folding and adjacent-tweezer-binding. It also reveals a new and entirely unexpected mechanism for sequence-recognition which, by analogy with a related process in biomolecular information processing, may be termed "frameshift-reading". The ability of one particular tweezer-molecule to detect, with exceptionally high sensitivity, long-range sequence-information in chain-folding aromatic copolyimides is readily explained by this novel process.

Relevance: 100.00%

Abstract:

Severe acute respiratory syndrome (SARS) coronavirus (SCoV) spike (S) protein is the major surface antigen of the virus and is responsible for receptor binding and the generation of neutralizing antibody. To investigate SCoV S protein, full-length and individual domains of S protein were expressed on the surface of insect cells and were characterized for cleavability and reactivity with serum samples obtained from patients during the convalescent phase of SARS. S protein could be cleaved by exogenous trypsin but not by coexpressed furin, suggesting that the protein is not normally processed during infection. Reactivity was evident by both flow cytometry and Western blot assays, but the pattern of reactivity varied according to assay and sequence of the antigen. The antibody response to SCoV S protein involves antibodies to both linear and conformational epitopes, with linear epitopes associated with the carboxyl domain and conformational epitopes associated with the amino terminal domain. Recombinant SCoV S protein appears to be a suitable antigen for the development of an efficient and sensitive diagnostic test for SARS, but our data suggest that assay format and choice of S antigen are important considerations.

Relevance: 100.00%

Abstract:

We previously reported sequence determination of neutral oligosaccharides by negative ion electrospray tandem mass spectrometry on a quadrupole-orthogonal time-of-flight instrument with high sensitivity and without the need for derivatization. In the present report, we extend our strategies to sialylated oligosaccharides for analysis of chain and blood group types together with branching patterns. A main feature in the negative ion mass spectrometry approach is the unique double glycosidic cleavage induced by 3-glycosidic substitution, producing characteristic D-type fragments which can be used to distinguish the type 1 and type 2 chains, the blood group related Lewis determinants, and 3,6-disubstituted core branching patterns, and to assign the structural details of each of the branches. Twenty mono- and disialylated linear and branched oligosaccharides were used for the investigation, and the sensitivity achieved is in the femtomole range. To demonstrate the efficacy of the strategy, we have determined a novel complex disialylated and monofucosylated tridecasaccharide that is based on the lacto-N-decaose core. The structure and sequence assignment were corroborated by methylation analysis and H-1 NMR spectroscopy.

Relevance: 100.00%

Abstract:

The conformation of a model peptide AAKLVFF, based on a fragment of the amyloid beta peptide A beta 16-20, KLVFF, is investigated in methanol and water via solution NMR experiments and molecular dynamics computer simulations. In previous work, we have shown that AAKLVFF forms peptide nanotubes in methanol and twisted fibrils in water. Chemical shift measurements were used to investigate the solubility of the peptide as a function of concentration in methanol and water. This enabled the determination of critical aggregation concentrations; the solubility was lower in water. In dilute solution, diffusion coefficients revealed the presence of intermediate aggregates; in concentrated solution, these coexist with NMR-silent larger aggregates, presumed to be beta-sheets. In water, diffusion coefficients did not change appreciably with concentration, indicating the presence mainly of monomers, coexisting with larger aggregates in more concentrated solution. Concentration-dependent chemical shift measurements indicated a folded conformation for the monomers/intermediate aggregates in dilute methanol, with unfolding at higher concentration. In water, an antiparallel arrangement of strands was indicated by certain ROESY peak correlations. The temperature-dependent solubility of AAKLVFF in methanol was well described by a van't Hoff analysis, providing a solubilization enthalpy and entropy. This pointed to the importance of solvophobic interactions in the self-assembly process. Molecular dynamics simulations constrained by NOE values from NMR suggested disordered reverse-turn structures for the monomer, with an antiparallel twisted conformation for dimers. To model the beta-sheet structures formed at higher concentration, possible arrangements of strands into beta-sheets with parallel and antiparallel configurations and different stacking sequences were used as the basis for MD simulations; two particular arrangements of antiparallel beta-sheets were found to be stable, one being linear and twisted and the other twisted in two directions. These structures were used to simulate circular dichroism spectra. The roles of aromatic stacking interactions and charge transfer effects were also examined. Simulated spectra were found to be similar to those observed experimentally (in water or methanol), which show a maximum at 215 or 218 nm due to pi-pi* interactions, when allowance is made for a 15-18 nm red-shift that may be due to light scattering effects.
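
For context, a van't Hoff treatment of temperature-dependent solubility is usually written in the standard form below (generic symbols, not the paper's notation); plotting ln x_s against 1/T then gives the solubilization enthalpy from the slope and the entropy from the intercept.

```latex
% Standard van't Hoff solubility relation (generic symbols):
%   x_s            mole-fraction solubility of the peptide
%   \Delta H_{sol} solubilization enthalpy
%   \Delta S_{sol} solubilization entropy
\ln x_s = -\frac{\Delta H_{sol}}{R\,T} + \frac{\Delta S_{sol}}{R}
```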

Relevance: 100.00%

Abstract:

In 1989, the computer programming language POP-11 was 21 years old. This book looks at the reasons behind its invention and traces its rise from an experimental language to a major AI language that played a major part in many innovative projects. There is a chapter on the inventor of the language, Robin Popplestone, and a discussion of the applications of POP-11 in a variety of areas. The efficiency of AI programming is covered, along with a comparison between POP-11 and other programming languages. The book concludes by reviewing the standardization of POP-11 into POP91.

Relevance: 100.00%

Abstract:

This paper analyzes the use of linear and neural network models for financial distress classification, with emphasis on the issues of input variable selection and model pruning. A data-driven method for selecting input variables (financial ratios, in this case) is proposed. A case study involving 60 British firms in the period 1997-2000 is used for illustration. It is shown that the use of the Optimal Brain Damage pruning technique can considerably improve the generalization ability of a neural model. Moreover, the set of financial ratios obtained with the proposed selection procedure is shown to be an appropriate alternative to the ratios usually employed by practitioners.
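
As a rough sketch of the pruning criterion referred to above, Optimal Brain Damage ranks weights by the saliency 0.5 * h_ii * w_i^2, where h_ii is the diagonal of the Hessian of the training loss. The snippet below uses a squared-gradient surrogate for that diagonal, which is an assumption of this sketch rather than the paper's exact procedure; in practice pruning is followed by retraining.

```python
# Hedged sketch of the Optimal Brain Damage (OBD) saliency criterion:
# s_i = 0.5 * h_ii * w_i**2, with h_ii the diagonal Hessian of the loss.
# The diagonal Hessian is approximated here by squared gradients (a
# Gauss-Newton / Fisher-style surrogate), an assumption of this sketch,
# not the paper's exact procedure.
import numpy as np

def obd_prune(weights, grads, prune_fraction=0.2):
    """weights, grads: 1-D arrays of network weights and per-weight
    gradients accumulated over the training set."""
    h_diag = grads ** 2                        # surrogate Hessian diagonal
    saliency = 0.5 * h_diag * weights ** 2     # OBD saliency per weight
    k = int(prune_fraction * weights.size)
    cut = np.argsort(saliency)[:k]             # indices of least-salient weights
    mask = np.ones_like(weights, dtype=bool)
    mask[cut] = False
    return weights * mask, mask                # pruned weights and keep-mask

# Toy usage: prune 20% of randomly initialised "weights".
rng = np.random.default_rng(1)
w, g = rng.normal(size=100), rng.normal(size=100)
w_pruned, keep = obd_prune(w, g)
print(keep.sum(), "weights kept out of", w.size)
```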

Relevance: 100.00%

Abstract:

This work presents two schemes for measuring the linear and angular kinematics of a rigid body using a kinematically redundant array of triple-axis accelerometers, with potential applications in biomechanics. A novel angular velocity estimation algorithm is proposed and evaluated that can compensate for angular velocity errors using measurements of the direction of gravity. Analysis and discussion of optimal sensor array characteristics are provided. A damped two-axis pendulum was used to excite all 6 DoF of a suspended accelerometer array through determined complex motion, and is the basis of both simulation and experimental studies. The relationship between accuracy and sensor redundancy is investigated for arrays of up to 100 triple-axis accelerometers (300 accelerometer axes) in simulation and 10 equivalent sensors (30 accelerometer axes) in the laboratory test rig. The paper also reports on the sensor calibration techniques and hardware implementation.
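
To make the estimation problem concrete, the sketch below uses the generic rigid-body relation a_i = a_o + alpha x r_i + omega x (omega x r_i), which is linear in the frame acceleration a_o and the angular acceleration alpha once omega is known, and solves it by least squares over a redundant array. This is a textbook formulation with invented toy numbers, not the paper's specific estimator.

```python
# Hedged sketch using the generic rigid-body relation
#   a_i = a_o + alpha x r_i + omega x (omega x r_i)
# for a redundant array of triple-axis accelerometers. Given omega and
# the sensor positions, the unknowns a_o and alpha are found by least
# squares.
import numpy as np

def skew(v):
    """Skew-symmetric matrix so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def solve_kinematics(r, a_meas, omega):
    """r: (K, 3) sensor positions; a_meas: (K, 3) accelerations; omega: (3,)."""
    rows, rhs = [], []
    for ri, ai in zip(r, a_meas):
        rows.append(np.hstack([np.eye(3), -skew(ri)]))         # unknowns [a_o, alpha]
        rhs.append(ai - np.cross(omega, np.cross(omega, ri)))  # remove centripetal term
    x, *_ = np.linalg.lstsq(np.vstack(rows), np.hstack(rhs), rcond=None)
    return x[:3], x[3:]                                        # a_o, alpha estimates

# Toy check with four redundant sensors on a rigid body.
rng = np.random.default_rng(2)
r = rng.normal(size=(4, 3))
omega = np.array([0.1, -0.2, 0.3])
alpha = np.array([1.0, 0.5, -0.4])
a_o = np.array([0.0, 0.0, 9.81])
a = np.array([a_o + np.cross(alpha, ri) + np.cross(omega, np.cross(omega, ri)) for ri in r])
print(solve_kinematics(r, a, omega))   # recovers a_o and alpha
```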

Relevance: 100.00%

Abstract:

Crystallization must occur in honey in order to produce set or creamed honey; however, the process must occur in a controlled manner in order to obtain an acceptable product. As a consequence, reliable methods are needed to measure the crystal content of honey (φ, expressed as kg crystal per kg honey), which can also be implemented with relative ease in industrial production facilities. Unfortunately, suitable methods do not currently exist. This article reports on the development of 2 independent offline methods to measure the crystal content in honey, based on differential scanning calorimetry and high-performance liquid chromatography. The 2 methods gave highly consistent results on the basis of a paired t-test involving 143 experimental points (P > 0.05, r² = 0.99). The crystal content also correlated with the relative viscosity, defined as the ratio of the viscosity of crystal-containing honey to that of the same honey when all crystals are dissolved, giving the following correlation: μ_r = 1 + 1398.8 φ^2.318. This correlation can be used to estimate the crystal content of honey in industrial production facilities. The crystal growth rate at a temperature of 14 °C (the normal crystallization temperature used in practice) was linear, and the growth rate also increased with the total glucose content of the honey.
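
As a small worked example of how the reported correlation might be used in practice, the snippet below inverts μ_r = 1 + 1398.8 φ^2.318 to estimate crystal content from a measured relative viscosity; the example viscosity values are invented for illustration.

```python
# Small numerical sketch: invert the reported correlation
#   mu_r = 1 + 1398.8 * phi**2.318
# to estimate crystal content phi (kg crystal per kg honey) from a
# measured relative viscosity. The example mu_r values are invented.
def crystal_content(mu_r):
    if mu_r < 1.0:
        raise ValueError("relative viscosity must be >= 1")
    return ((mu_r - 1.0) / 1398.8) ** (1.0 / 2.318)

for mu_r in (5.0, 50.0, 200.0):
    print(f"mu_r = {mu_r:6.1f}  ->  phi ~ {crystal_content(mu_r):.3f}")
```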