971 results for "non-linear optical phenomena"


Relevance: 100.00%

Abstract:

STUDY QUESTION: What are the long-term trends in the total (live births, fetal deaths, and terminations of pregnancy for fetal anomaly) and live birth prevalence of neural tube defects (NTD) in Europe, where many countries have issued recommendations for folic acid supplementation but a policy for mandatory folic acid fortification of food does not exist? METHODS: This was a population-based, observational study using data on 11 353 cases of NTD not associated with chromosomal anomalies, including 4162 cases of anencephaly and 5776 cases of spina bifida from 28 EUROCAT (European Surveillance of Congenital Anomalies) registries covering approximately 12.5 million births in 19 countries between 1991 and 2011. The main outcome measures were total and live birth prevalence of NTD, as well as anencephaly and spina bifida, with time trends analysed using random-effects Poisson regression models to account for heterogeneities across registries and splines to model non-linear time trends. SUMMARY ANSWER AND LIMITATIONS: Overall, the pooled total prevalence of NTD during the study period was 9.1 per 10 000 births. Prevalence of NTD fluctuated slightly but without an obvious downward trend, with the final estimate of the pooled total prevalence of NTD in 2011 similar to that in 1991. Estimates from Poisson models that took registry heterogeneities into account showed an annual increase of 4% (prevalence ratio 1.04, 95% confidence interval 1.01 to 1.07) in 1995-99 and a decrease of 3% per year in 1999-2003 (0.97, 0.95 to 0.99), with stable rates thereafter. The trend patterns for anencephaly and spina bifida were similar, but neither anomaly decreased substantially over time. The live birth prevalence of NTD generally decreased, especially for anencephaly. Registration problems or other data artefacts cannot be excluded as a partial explanation of the observed trends (or lack thereof) in the prevalence of NTD. WHAT THIS STUDY ADDS: In the absence of mandatory fortification, the prevalence of NTD has not decreased in Europe despite longstanding recommendations aimed at promoting peri-conceptional folic acid supplementation and the existence of voluntary folic acid fortification. FUNDING, COMPETING INTERESTS, DATA SHARING: The study was funded by the European Public Health Commission, EUROCAT Joint Action 2011-2013. HD and ML received support from the European Commission DG Sanco during the conduct of this study. No additional data available.
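The spline-based Poisson trend modelling described above can be illustrated with a short sketch. This is not the authors' code: the registry random effects are omitted, the data frame and column names (year, births, cases) are hypothetical toy data, and statsmodels/patsy are assumed to be available.

```python
# Hedged sketch of a Poisson trend model with a spline for calendar year.
# Toy data only; registry random effects are omitted for brevity.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "year":   np.arange(1991, 2012),
    "births": np.random.randint(400_000, 600_000, 21),   # births per year (toy)
    "cases":  np.random.randint(300, 600, 21),            # NTD cases per year (toy)
})

# Poisson regression of NTD counts on a B-spline of year, with log(births)
# as offset so the model describes prevalence rather than raw counts.
model = smf.glm(
    "cases ~ bs(year, df=4)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["births"]),
).fit()

# Fitted prevalence per 10 000 births for each year of the toy series.
prevalence = model.fittedvalues / df["births"] * 10_000
print(prevalence.round(2))
```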

Relevance: 100.00%

Abstract:

This paper reviews and extends our previous work to enable fast axonal diameter mapping from diffusion MRI data in the presence of multiple fibre populations within a voxel. Most of the existing microstructure imaging techniques use non-linear algorithms to fit their data models and, consequently, they are computationally expensive and usually slow. Moreover, most of them assume a single axon orientation, while numerous regions of the brain actually present more complex configurations, e.g. fibre crossings. We present a flexible framework, based on convex optimisation, that enables fast and accurate reconstructions of the microstructure organisation, not limited to areas where the white matter is coherently oriented. We show through numerical simulations the ability of our method to correctly estimate the microstructure features (mean axon diameter and intra-cellular volume fraction) in crossing regions.
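The convex formulation mentioned above can be sketched with a toy dictionary fit. This is only an illustration of the general idea (a linear dictionary plus a non-negative least-squares fit), not the paper's actual signal model; the diameters, "b-values" and mono-exponential atoms below are made-up placeholders.

```python
# Toy convex fit: express a measured signal as a non-negative combination of
# dictionary atoms (one per candidate axon diameter), then report the
# weighted mean diameter.  The signal model here is a placeholder.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

diameters = np.linspace(1.0, 10.0, 10)      # candidate diameters (um), hypothetical
bvalues = np.linspace(0.2, 3.0, 30)         # toy acquisition parameters

# Each column of A is the predicted signal for one candidate diameter.
A = np.exp(-np.outer(bvalues, 1.0 / diameters))

# Synthetic measurement from a known mixture plus noise.
w_true = np.zeros_like(diameters)
w_true[[2, 6]] = [0.7, 0.3]
y = A @ w_true + 0.01 * rng.standard_normal(len(bvalues))

# Non-negative least squares: a convex problem, so the fit is globally optimal.
w, _ = nnls(A, y)
mean_diameter = np.sum(w * diameters) / np.sum(w)
print(f"estimated mean axon diameter: {mean_diameter:.2f} um")
```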

Relevance: 100.00%

Abstract:

This study analyzed high-density event-related potentials (ERPs) within an electrical neuroimaging framework to provide insights regarding the interaction between multisensory processes and stimulus probabilities. Specifically, we identified the spatiotemporal brain mechanisms by which the proportion of temporally congruent and task-irrelevant auditory information influences stimulus processing during a visual duration discrimination task. The spatial position (top/bottom) of the visual stimulus was indicative of how frequently the visual and auditory stimuli would be congruent in their duration (i.e., context of congruence). Stronger influences of irrelevant sound were observed when contexts associated with a high proportion of auditory-visual congruence repeated and also when contexts associated with a low proportion of congruence switched. Context of congruence and context transition resulted in weaker brain responses at 228 to 257 ms poststimulus to conditions giving rise to larger behavioral cross-modal interactions. Importantly, a control oddball task revealed that both congruent and incongruent audiovisual stimuli triggered equivalent non-linear multisensory interactions when congruence was not a relevant dimension. Collectively, these results are well explained by statistical learning, which links a particular context (here: a spatial location) with a certain level of top-down attentional control that further modulates cross-modal interactions based on whether a particular context repeated or changed. The current findings shed new light on the importance of context-based control over multisensory processing, whose influences multiplex across finer and broader time scales.

Relevance: 100.00%

Abstract:

BACKGROUND: The most important adverse effect of BoNT-A is the systemic diffusion of the toxin. There is some evidence that the administration of high doses can increase the risk of systemic diffusion and the development of clinically evident adverse effects; however, an international consensus does not exist about its maximum dose. AIM: The aim of this study was to evaluate changes in autonomic heart drive induced by high doses (higher than 600 units) of incobotulinumtoxinA injection in spastic stroke patients. Moreover, the treatment safety was assessed by monitoring the occurrence of adverse events. DESIGN: Case-control study. POPULATION: Eleven stroke survivors with spastic hemiplegia. METHODS: Patients were treated with intramuscular focal injections of incobotulinumtoxinA (NT 201; Xeomin®, Merz Pharmaceuticals GmbH, Frankfurt, Germany). Doses were below 12 units/kg. Each patient underwent an ECG recording before injection and 10 days after treatment. Linear and non-linear heart rate variability (HRV) measures were derived from the ECGs with dedicated software. RESULTS: None of the variables considered showed statistically significant changes after BoNT-A injection. CONCLUSION: The use of incobotulinumtoxinA in adult patients at doses up to 12 units/kg seems to be safe regarding autonomic heart drive. CLINICAL REHABILITATION IMPACT: The use of incobotulinumtoxinA up to 600 units could be a safe therapeutic option in spastic hemiplegic stroke survivors.
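For context, the kind of linear and non-linear HRV indices referred to above can be computed from RR intervals in a few lines. This is a generic sketch (SDNN, RMSSD and Poincaré SD1/SD2), not the dedicated software used in the study, and the RR series is toy data.

```python
# Generic HRV sketch: linear time-domain measures (SDNN, RMSSD) and a simple
# non-linear descriptor (Poincare SD1/SD2) from RR intervals in milliseconds.
import numpy as np

rr = np.array([812, 800, 790, 805, 820, 815, 798, 810, 825, 808], dtype=float)  # toy RR series

sdnn = rr.std(ddof=1)                      # overall variability
diff = np.diff(rr)
rmssd = np.sqrt(np.mean(diff ** 2))        # beat-to-beat (short-term) variability

# Poincare plot descriptors: SD1 (short-term) and SD2 (long-term) spread.
sd1 = np.sqrt(np.var(diff, ddof=1) / 2.0)
sd2 = np.sqrt(2.0 * np.var(rr, ddof=1) - np.var(diff, ddof=1) / 2.0)

print(f"SDNN={sdnn:.1f} ms  RMSSD={rmssd:.1f} ms  SD1={sd1:.1f} ms  SD2={sd2:.1f} ms")
```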

Relevance: 100.00%

Abstract:

ISSUES: There have been reviews on the association between the density of alcohol outlets and harm, including studies published up to December 2008. Since then the number of publications has increased dramatically. This study reviews the more recent studies with regard to their utility to inform policy. APPROACH: A systematic review found more than 160 relevant studies (published between January 2009 and October 2014). The review focused on: (i) outlet density and assaultive or intimate partner violence; (ii) studies including individual-level data; or (iii) 'natural experiments'. KEY FINDINGS: Despite overall evidence for an association between density and harm, there is little evidence on causal direction (i.e. whether demand leads to more supply or increased availability increases alcohol use and harm). When outlet types (e.g. bars, supermarkets) are analysed separately, studies are too methodologically diverse and partly contradictory to permit firm conclusions besides those pertaining to high outlet densities in areas such as entertainment districts. Outlet density commonly had little effect on individual-level alcohol use, and the few 'natural experiments' on restricting densities showed little or no effect. IMPLICATIONS AND CONCLUSIONS: Although outlet densities are likely to be positively related to alcohol use and harm, few policy recommendations can be given as effects vary across study areas, outlet types and outlet cluster sizes. Future studies should examine outlet types in detail, compare different outcomes with different strengths of association with alcohol, analyse non-linear effects and compare different methodologies. Purely aggregate-level studies examining total outlet density only should be abandoned. [Gmel G, Holmes J, Studer J. Are alcohol outlet densities strongly associated with alcohol-related outcomes? A critical review of recent evidence. Drug Alcohol Rev 2015].

Relevance: 100.00%

Abstract:

OBJECTIVES: Different accelerometer cutpoints used by different researchers often yield vastly different estimates of moderate-to-vigorous intensity physical activity (MVPA). This is recognized as cutpoint non-equivalence (CNE), which reduces the ability to accurately compare youth MVPA across studies. The objective of this research is to develop a cutpoint conversion system that standardizes minutes of MVPA for six different sets of published cutpoints. DESIGN: Secondary data analysis. METHODS: Data from the International Children's Accelerometer Database (ICAD; Spring 2014), consisting of 43,112 Actigraph accelerometer data files from 21 worldwide studies (children 3-18 years, 61.5% female), were used to develop prediction equations for six sets of published cutpoints. Linear and non-linear modeling, using a leave-one-out cross-validation technique, was employed to develop equations to convert MVPA from one set of cutpoints into another. Bland-Altman plots illustrate the agreement between actual MVPA and predicted MVPA values. RESULTS: Across the total sample, mean MVPA ranged from 29.7 MVPA min d(-1) (Puyau) to 126.1 MVPA min d(-1) (Freedson 3 METs). Across conversion equations, median absolute percent error was 12.6% (range: 1.3 to 30.1) and the proportion of variance explained ranged from 66.7% to 99.8%. The mean difference for the best performing prediction equation (VC from EV) was -0.110 min d(-1) (limits of agreement (LOA), -2.623 to 2.402). The mean difference for the worst performing prediction equation (FR3 from PY) was 34.76 min d(-1) (LOA, -60.392 to 129.910). CONCLUSIONS: For six different sets of published cutpoints, the use of this equating system can assist individuals attempting to synthesize the growing body of literature on Actigraph accelerometry-derived MVPA.
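The conversion-equation idea can be illustrated with a minimal sketch: regress minutes of MVPA derived from one set of cutpoints on minutes derived from another, then summarise agreement Bland-Altman style. The data and coefficients below are simulated; the actual ICAD equations are not reproduced here.

```python
# Minimal sketch of a cutpoint conversion equation plus Bland-Altman agreement.
# All numbers are simulated; they are not the published ICAD equations.
import numpy as np

rng = np.random.default_rng(1)

mvpa_a = rng.uniform(20, 120, 500)                         # MVPA (min/day) from cutpoint set A
mvpa_b = 0.8 * mvpa_a + 5 + rng.normal(0, 3, 500)          # MVPA (min/day) from cutpoint set B

# Conversion equation: predict set-B minutes from set-A minutes.
slope, intercept = np.polyfit(mvpa_a, mvpa_b, 1)
pred_b = slope * mvpa_a + intercept

# Bland-Altman style agreement between predicted and actual set-B minutes.
diff = pred_b - mvpa_b
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
print(f"conversion: B = {slope:.3f} * A + {intercept:.3f}")
print(f"mean difference {bias:.3f} min/day, LOA {loa[0]:.2f} to {loa[1]:.2f}")
```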

Relevance: 100.00%

Abstract:

The least squares method is analyzed. The basic aspects of the method are discussed. Emphasis is given to procedures that allow simple memorization of the basic equations associated with the linear and non-linear least squares methods, polynomial regression and the multilinear method.
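For reference, the basic equations referred to above are the normal equations of least squares; in this standard textbook form (which may differ from the article's notation), polynomial regression is simply the linear case with powers of x as regressors, and multilinear regression uses one column per predictor.

```latex
% Least squares estimate for the linear model y = X\beta + \varepsilon
% (standard textbook form; design matrix shown for polynomial regression of degree m).
\hat{\beta}
  = \operatorname*{arg\,min}_{\beta}\,\lVert y - X\beta\rVert^{2}
  = (X^{\mathsf T}X)^{-1}X^{\mathsf T}y,
\qquad
X =
\begin{pmatrix}
1 & x_{1} & x_{1}^{2} & \cdots & x_{1}^{m}\\
\vdots & \vdots & \vdots & & \vdots\\
1 & x_{n} & x_{n}^{2} & \cdots & x_{n}^{m}
\end{pmatrix}
```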

Relevance: 100.00%

Abstract:

In this work, telescopic boom profiles were developed for an aerial work platform intended for fire and rescue use. The profiles were made of hot-rolled, ultra-high-strength weathering structural steel. Based on standards and design guidelines, a calculation worksheet was developed for examining the support reactions, bending and torsional moments, and shear and normal forces of the telescopic boom sections. The worksheet allows the directions of the different loads, the lateral outreach of the telescopic boom and the lifting angle to be varied. In the preliminary dimensioning of the profiles, standards and design guidelines that take local buckling into account were used. The properties of different cross-sections were compared and the profile was selected together with the target company. In connection with the preliminary dimensioning, an auxiliary program was created for the selected cross-section, with which the effect of different profile variables on, among other things, local buckling and stiffness could be studied. An optimisation routine was also included in the worksheet, with which the cross-sectional area, and thereby the mass of the profile, could be minimised. The final dimensioning was carried out with the finite element method. In this dimensioning, the local buckling of the preliminarily dimensioned profiles was studied on the basis of linear stability analysis and non-linear analysis. The stresses of the profiles were examined in more detail, for example by varying the loads and by decomposing the normal stresses of the elements. With the telescopic boom developed and analysed in this master's thesis, the weights of the sections could be reduced by 15-30%. At the same time, the lateral outreach improved by nearly 20% and the rated load increased by 25%.
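As a rough illustration of the optimisation routine mentioned above, the sketch below minimises the area of a rectangular hollow section under hypothetical bending-stress and plate-slenderness (local buckling) constraints. The loads, limits and bounds are made-up placeholders, not values from the thesis or from any standard.

```python
# Toy cross-section optimisation: minimise area of a rectangular hollow
# section subject to placeholder stress and width-to-thickness constraints.
import numpy as np
from scipy.optimize import minimize

M = 250e6            # design bending moment [N*mm], hypothetical
f_y = 960.0          # yield strength [MPa], typical of ultra-high-strength steel
slender_max = 45.0   # allowed flat-width-to-thickness ratio, placeholder

def area(x):
    b, h, t = x                      # width, height, wall thickness [mm]
    return b * h - (b - 2 * t) * (h - 2 * t)

def section_modulus(x):
    b, h, t = x
    I = (b * h**3 - (b - 2 * t) * (h - 2 * t)**3) / 12.0
    return I / (h / 2.0)

cons = [
    {"type": "ineq", "fun": lambda x: f_y - M / section_modulus(x)},             # bending stress
    {"type": "ineq", "fun": lambda x: slender_max - (x[0] - 2 * x[2]) / x[2]},   # flange slenderness
    {"type": "ineq", "fun": lambda x: slender_max - (x[1] - 2 * x[2]) / x[2]},   # web slenderness
]
bounds = [(100, 500), (100, 700), (4, 12)]

res = minimize(area, x0=np.array([300.0, 500.0, 8.0]), bounds=bounds, constraints=cons)
b, h, t = res.x
print(f"b={b:.0f} mm, h={h:.0f} mm, t={t:.1f} mm, A={area(res.x):.0f} mm^2")
```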

Relevance: 100.00%

Abstract:

One of the main problems in quantitative analysis of complex samples by x-ray fluorescence is related to interelemental (or matrix) effects. These effects appear as a result of interactions among sample elements, affecting the x-ray emission intensity in a non-linear manner. Basically, two main effects occur: intensity absorption and enhancement. The combination of these effects can lead to serious problems. Many studies have been carried out proposing mathematical methods to correct for these effects. Basic concepts and the main correction methods are discussed here.
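One widely used family of corrections expresses the matrix effect through empirical influence coefficients; the Lachance-Traill form is shown below as an illustration (the review discusses this and other methods). Here C_i is the mass fraction of analyte i, R_i its intensity relative to the pure element, and alpha_ij the influence coefficient of element j on i.

```latex
% Influence-coefficient (Lachance--Traill) matrix-effect correction
C_i = R_i\!\left(1 + \sum_{j \neq i} \alpha_{ij}\, C_j\right)
```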

Relevance: 100.00%

Abstract:

Learning of preference relations has recently received significant attention in the machine learning community. It is closely related to classification and regression analysis and can be reduced to these tasks. However, preference learning involves prediction of an ordering of the data points rather than prediction of a single numerical value, as in regression, or a class label, as in classification. Therefore, studying preference relations within a separate framework not only facilitates a better theoretical understanding of the problem, but also motivates the development of efficient algorithms for the task. Preference learning has many applications in domains such as information retrieval, bioinformatics and natural language processing. For example, algorithms that learn to rank are frequently used in search engines for ordering documents retrieved by a query. Preference learning methods have also been applied to collaborative filtering problems for predicting individual customer choices from the vast amount of user-generated feedback. In this thesis we propose several algorithms for learning preference relations. These algorithms stem from the well-founded and robust class of regularized least-squares methods and have many attractive computational properties. In order to improve the performance of our methods, we introduce several non-linear kernel functions. Thus, the contribution of this thesis is twofold: kernel functions for structured data that are used to take advantage of various non-vectorial data representations, and preference learning algorithms that are suitable for different tasks, namely efficient learning of preference relations, learning with large amounts of training data, and semi-supervised preference learning. The proposed kernel-based algorithms and kernels are applied to the parse ranking task in natural language processing, document ranking in information retrieval, and remote homology detection in bioinformatics. Training of kernel-based ranking algorithms can be infeasible when the size of the training set is large. This problem is addressed by proposing a preference learning algorithm whose computational complexity scales linearly with the number of training data points. We also introduce a sparse approximation of the algorithm that can be efficiently trained with large amounts of data. For situations where a small amount of labeled data but a large amount of unlabeled data is available, we propose a co-regularized preference learning algorithm. To conclude, the methods presented in this thesis address not only the problem of efficient training of the algorithms but also fast regularization parameter selection, multiple output prediction, and cross-validation. Furthermore, the proposed algorithms lead to notably better performance in many of the preference learning tasks considered.
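The regularized least-squares flavour of these methods can be conveyed with a tiny kernel ridge sketch: learn a scoring function from utility values and rank items by predicted score. This is only in the spirit of the thesis (it is not RankRLS or its sparse and co-regularized variants), and the data are synthetic.

```python
# Sketch of a regularized least-squares scorer with a Gaussian kernel,
# used for ranking; synthetic data, not the thesis' algorithms.
import numpy as np

rng = np.random.default_rng(2)

X = rng.normal(size=(60, 5))                                               # training items
y = X @ np.array([1.0, -0.5, 0.3, 0.0, 2.0]) + 0.1 * rng.normal(size=60)   # utility scores

def gaussian_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

lam = 1.0                                                # regularization parameter
K = gaussian_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)     # dual solution: (K + lam*I) a = y

X_new = rng.normal(size=(5, 5))                          # items to rank
scores = gaussian_kernel(X_new, X) @ alpha
print(np.argsort(-scores))                               # indices, best-ranked first
```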

Relevance: 100.00%

Abstract:

Rosin is a natural product from pine forests and it is used as a raw material in resinate syntheses. Resinates are polyvalent metal salts of rosin acids, and especially Ca- and Ca/Mg-resinates find wide application in the printing ink industry. In this thesis, analytical methods were applied to increase general knowledge of resinate chemistry, and the reaction kinetics was studied in order to model the non-linear solution viscosity increase during resinate syntheses by the fusion method. Solution viscosity in toluene is an important quality factor for resinates to be used in printing inks. The concept of critical resinate concentration, c_crit, was introduced to define an abrupt change in the dependence of viscosity on resinate concentration in the solution. The concept was then used to explain the non-linear solution viscosity increase during resinate syntheses. A semi-empirical model with two estimated parameters was derived for the viscosity increase on the basis of apparent reaction kinetics. The model was used to control the viscosity and to predict the total reaction time of the resinate process. The kinetic data from the complex reaction media were obtained by acid value titration and by FTIR spectroscopic analyses using a conventional calibration method to measure the resinate concentration and the concentration of free rosin acids. A multivariate calibration method was successfully applied to build partial least squares (PLS) models for monitoring acid value and solution viscosity in both the mid-infrared (MIR) and near-infrared (NIR) regions during the syntheses. The calibration models can be used for on-line resinate process monitoring. In the kinetic studies, two main reaction steps were observed during the syntheses. First a fast, irreversible resination reaction occurs at 235 °C, and then a slow thermal decarboxylation of rosin acids starts to take place at 265 °C. Rosin oil is formed during the decarboxylation reaction step, causing significant mass loss as the rosin oil evaporates from the system while the viscosity increases to the target level. The mass balance of the syntheses was determined based on the resinate concentration increase during the decarboxylation reaction step. A mechanistic study of the decarboxylation reaction was based on the observation that resinate molecules are partly solvated by rosin acids during the syntheses. Different decarboxylation mechanisms were proposed for the free and solvating rosin acids. The deduced kinetic model supported the analytical data of the syntheses over a wide resinate concentration region, over a wide range of viscosity values and at different reaction temperatures. In addition, the application of the kinetic model to the modified resinate syntheses gave a good fit. A novel synthesis method with the addition of decarboxylated rosin (i.e. rosin oil) to the reaction mixture was introduced. The conversion of rosin acid to resinate was increased to the level necessary to obtain the target viscosity for the product at 235 °C. Due to a lower reaction temperature than in the traditional fusion synthesis at 265 °C, thermal decarboxylation is avoided. As a consequence, the mass yield of the resinate syntheses can be increased from ca. 70% to almost 100% by recycling the added rosin oil.
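The PLS calibration step described above can be sketched generically with scikit-learn: fit a partial least squares model that predicts acid value from spectra and check it by cross-validation. The spectra and reference values below are simulated, not the authors' FTIR data.

```python
# Generic PLS calibration sketch (simulated spectra, not the authors' data):
# predict acid value from FTIR-like spectra and cross-validate the model.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

n_samples, n_points = 40, 300
acid_value = rng.uniform(20, 160, n_samples)             # toy reference values

# Toy spectra: two bands whose heights track the acid value, plus noise.
x = np.linspace(0, 1, n_points)
band = np.exp(-((x - 0.3) / 0.02) ** 2) + 0.5 * np.exp(-((x - 0.7) / 0.03) ** 2)
spectra = acid_value[:, None] * band[None, :] + rng.normal(0, 1.0, (n_samples, n_points))

pls = PLSRegression(n_components=3)
print("cross-validated R^2:", cross_val_score(pls, spectra, acid_value, cv=5).round(3))

pls.fit(spectra, acid_value)
print("predicted acid value, first sample:",
      round(float(np.ravel(pls.predict(spectra[:1]))[0]), 1))
```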

Relevance: 100.00%

Abstract:

The ability of biomolecules to catalyze chemical reactions is due chiefly to their sensitivity to variations of the pH in the surrounding environment. The reason for this is that they are made up of chemical groups whose ionization states are modulated by pH changes that are of the order of 0.4 units. The determination of the protonation states of such chemical groups as a function of the conformation of the biomolecule and the pH of the environment can be useful in the elucidation of important biological processes, from enzymatic catalysis to protein folding and molecular recognition. In the past 15 years, Poisson-Boltzmann theory has been successfully used to estimate the pKa of ionizable sites in proteins, yielding results that may differ by 0.1 unit from the experimental values. In this study, we review Poisson-Boltzmann theory from the perspective of its application to the calculation of pKa in proteins.
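For orientation, the central equation of the theory is reproduced below in one common convention (Gaussian units; the article's own notation may differ): epsilon(r) is the position-dependent dielectric constant, rho_f the fixed charge density of the biomolecule, and kappa-bar the Debye screening parameter, non-zero only in ion-accessible solvent. Linearizing the sinh term gives the linearized Poisson-Boltzmann equation; the pKa of a site then follows from the computed electrostatic free-energy difference between the protein and a model compound.

```latex
% Nonlinear Poisson--Boltzmann equation (one common Gaussian-unit convention)
\nabla\cdot\bigl[\varepsilon(\mathbf r)\,\nabla\phi(\mathbf r)\bigr]
  - \bar\kappa^{2}(\mathbf r)\,\frac{k_{B}T}{e}\,
    \sinh\!\left(\frac{e\,\phi(\mathbf r)}{k_{B}T}\right)
  = -4\pi\rho_{f}(\mathbf r)

% pKa shift relative to a model compound (sign convention: a larger
% electrostatic cost of deprotonation in the protein raises the pKa)
\mathrm{p}K_{a}^{\text{protein}}
  = \mathrm{p}K_{a}^{\text{model}}
  + \frac{\Delta\Delta G_{\text{deprot}}}{k_{B}T\,\ln 10}
```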

Relevance: 100.00%

Abstract:

Dynamic mechanical analysis (DMA) is widely used in materials characterization. In this work, we briefly introduce the main concepts related to this technique, such as linear and non-linear viscoelasticity, relaxation time, and the response of a material when it is subjected to a sinusoidal or other periodic stress. Moreover, the main applications of this technique to polymers and polymer blends are also presented. The discussion includes phase behavior and crystallization; the relaxation spectrum as a function of frequency or temperature; and the correlation between a material's damping and its acoustic and mechanical properties.
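The standard relations behind these concepts, for a material in its linear viscoelastic range under sinusoidal loading, are summarised below in textbook form: an imposed strain ε(t) = ε₀ sin(ωt) produces a phase-shifted stress, from which the storage modulus E′, the loss modulus E″ and the damping factor tan δ are defined.

```latex
% Linear viscoelastic response to sinusoidal strain (textbook form)
\sigma(t) = \sigma_{0}\sin(\omega t + \delta),
\qquad
E' = \frac{\sigma_{0}}{\varepsilon_{0}}\cos\delta,
\quad
E'' = \frac{\sigma_{0}}{\varepsilon_{0}}\sin\delta,
\quad
\tan\delta = \frac{E''}{E'}
```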

Relevance: 100.00%

Abstract:

It is a well-known phenomenon that the constant amplitude fatigue limit of a large component is lower than the fatigue limit of a small specimen made of the same material. In notched components the opposite occurs: the fatigue limit, defined as the maximum stress at the notch, is higher than that achieved with smooth specimens. These two effects have been taken into account in most design handbooks with the help of empirical formulas or design curves. The basic idea of this study is that the size effect can mainly be explained by the statistical size effect. A component subjected to an alternating load can be assumed to form a sample of initiated cracks at the end of the crack initiation phase. The size of the sample depends on the size of the specimen in question. The main objective of this study is to develop a statistical model for the estimation of this kind of size effect. It was shown that the size of a sample of initiated cracks should be based on the stressed surface area of the specimen. In the case of a varying stress distribution, an effective stress area must be calculated. It is based on the decreasing probability of equally sized initiated cracks at lower stress levels. If the distribution function of the parent population of cracks is known, the distribution of the maximum crack size in a sample can be defined. This makes it possible to calculate an estimate of the largest expected crack for any sample size. The estimate of the fatigue limit can then be calculated with the help of linear elastic fracture mechanics. In notched components another source of size effect has to be taken into account. If we consider two specimens which have a similar shape but different sizes, it can be seen that the stress gradient in the smaller specimen is steeper. If there is an initiated crack in both of them, the stress intensity factor at the crack in the larger specimen is higher. The second goal of this thesis is to create a calculation method for this factor, which is called the geometric size effect. The proposed method for the calculation of the geometric size effect is also based on the use of linear elastic fracture mechanics. It is possible to calculate an accurate value of the stress intensity factor in a non-linear stress field using weight functions. The calculated stress intensity factor values at the initiated crack can be compared to the corresponding stress intensity factor due to constant stress. The notch size effect is calculated as the ratio of these stress intensity factors. The presented methods were tested against experimental results taken from three German doctoral theses. Two candidates for the parent population of initiated cracks were found: the Weibull distribution and the log-normal distribution. Both of them can be used successfully for the prediction of the statistical size effect for smooth specimens. In the case of notched components, the geometric size effect due to the stress gradient must be combined with the statistical size effect. The proposed method gives good results as long as the notch in question is blunt enough. For very sharp notches, with a stress concentration factor of about 5 or higher, the method does not give sufficient results. It was shown that the plastic portion of the strain becomes quite high at the root of such notches. The use of linear elastic fracture mechanics therefore becomes questionable.
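The statistical argument can be sketched numerically: if initiated crack depths follow a parent distribution F(a), the largest of n cracks has distribution F(a)^n, with n taken proportional to the stressed surface area; the fatigue limit then follows from the LEFM threshold ΔK_th = Y·Δσ·√(πa). The Weibull parameters, threshold, geometry factor and crack density below are made-up placeholders, not the thesis' fitted values.

```python
# Toy statistical size effect: median largest initiated crack from F(a)**n,
# sample size proportional to stressed area, fatigue limit from the LEFM
# threshold.  All parameter values are hypothetical placeholders.
import numpy as np
from scipy.stats import weibull_min

shape, scale = 1.5, 0.05   # parent Weibull of initiated crack depth [mm], hypothetical
dK_th = 5.0                # threshold stress intensity range [MPa*sqrt(m)], hypothetical
Y = 0.73                   # geometry factor for a small surface crack, approximate

def median_max_crack(n, q=0.5):
    """Median of the largest crack among n initiated cracks: F(a)**n = q."""
    return weibull_min.ppf(q ** (1.0 / n), shape, scale=scale)

def fatigue_limit(area_mm2, crack_density_per_mm2=0.1):
    n = max(1, int(area_mm2 * crack_density_per_mm2))   # sample size ~ stressed area
    a = median_max_crack(n) * 1e-3                      # mm -> m
    return dK_th / (Y * np.sqrt(np.pi * a))             # MPa

for area in (100.0, 1_000.0, 10_000.0):                 # small specimen -> large component
    print(f"stressed area {area:8.0f} mm^2 -> fatigue limit {fatigue_limit(area):6.1f} MPa")
```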

Relevance: 100.00%

Abstract:

B3LYP/6-31G(d,p) calculations were used to determine the optimized geometries of the C2H4O-C2H2 and C2H4S-C2H2 heterocyclic hydrogen-bonded complexes. Results for the structural, rotational, electronic and vibrational parameters indicate that the hydrogen bonding is non-linear, owing to the pi bond of the acetylene interacting with the hydrogen atoms of the methyl groups of the three-membered rings. Moreover, the theoretical investigation showed that the non-linearity is even more intriguing, since there is a structural disjunction on the acetylene within the heterocyclic system.