950 results for Non-Linear Analysis


Relevance: 90.00%

Publisher:

Abstract:

In this work, we address different perspectives on the poetics, compositional strategies, and perceptual impact of time in the music of Gérard Grisey. In the first chapter, we examine the conception of time as durational unity and proportionality and its relation to other musical parameters. We then present three approaches to time that emerge from Grisey's poetics and from the analysis of his works: the break with durational proportionality and the relationship between time and sound, the concept of change of temporal scale, and the analogy between time and cosmos. In the second chapter, we propose three temporal categories based mainly on the concept of predictability: non-linear time, linear time, and processual time. In the third and final chapter, we set out the foundations of Information Theory, its relation to Grisey's discourse, and its method of application.

Relevance: 90.00%

Publisher:

Abstract:

Interaction effects are usually modeled by means of moderated regression analysis. Structural equation models with non-linear constraints make it possible to estimate interaction effects while correcting for measurement error. Of the various specifications, Jöreskog and Yang's (1996, 1998), likely the most parsimonious, has been chosen and further simplified. Up to now, only direct effects have been specified, thus wasting much of the capability of the structural equation approach. This paper presents and discusses an extension of Jöreskog and Yang's specification that can handle direct, indirect and interaction effects simultaneously. The model is illustrated by a study of the effects of an interactive style of use of budgets on both company innovation and performance.
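As a minimal illustration of the moderated-regression baseline this abstract starts from (not the authors' structural-equation specification), an interaction effect is estimated as the coefficient of a product term. The data below are simulated and all variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)                  # e.g. interactive style of budget use
x2 = rng.normal(size=n)                  # e.g. innovation
# simulated outcome with a true interaction coefficient of 0.4
y = 1.0 + 0.5 * x1 + 0.3 * x2 + 0.4 * x1 * x2 + rng.normal(scale=0.1, size=n)

# moderated regression: intercept, both main effects, and the product term
X = np.column_stack([np.ones(n), x1, x2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
interaction_effect = beta[3]
```

Note that this sketch ignores measurement error entirely, which is precisely the bias the structural-equation specification discussed in the paper is designed to remove.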

Relevance: 90.00%

Publisher:

Abstract:

Several methods have been suggested to estimate non-linear models with interaction terms in the presence of measurement error. Structural equation models eliminate measurement error bias, but require large samples. Ordinary least squares regression on summated scales, regression on factor scores and partial least squares are appropriate for small samples but do not correct measurement error bias. Two-stage least squares regression does correct measurement error bias, but the results strongly depend on the choice of instrumental variables. This article discusses the old disattenuated regression method as an alternative for correcting measurement error in small samples. The method is extended to the case of interaction terms and is illustrated on a model that examines the interaction effect of innovation and style of use of budgets on business performance. Alternative reliability estimates that can be used to disattenuate the estimates are discussed. A comparison is made with the alternative methods. Methods that do not correct for measurement error bias perform very similarly to one another, and considerably worse than disattenuated regression.
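The core of the disattenuation idea can be stated in a few lines; the sketch below shows Spearman's classic correction for attenuation (the reliabilities and the observed correlation are made-up numbers, not values from the study):

```python
import math

def disattenuate_correlation(r_xy, rel_x, rel_y):
    """Spearman's correction for attenuation: divide the observed
    correlation by the geometric mean of the two reliabilities."""
    return r_xy / math.sqrt(rel_x * rel_y)

def disattenuate_slope(b_yx, rel_x):
    """In simple regression, measurement error in x shrinks the slope
    towards zero by the reliability of x; dividing undoes the bias."""
    return b_yx / rel_x

# illustrative values only: observed r = 0.42, reliabilities 0.80 and 0.90
r_corrected = disattenuate_correlation(0.42, 0.80, 0.90)
```

Extending this to interaction terms, as the article does, additionally requires a reliability estimate for the product term itself.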

Relevance: 90.00%

Publisher:

Abstract:

BACKGROUND In previous meta-analyses, tea consumption has been associated with a lower incidence of type 2 diabetes. It is unclear, however, whether tea is inversely associated over the entire range of intake. Therefore, we investigated the association between tea consumption and the incidence of type 2 diabetes in a European population. METHODOLOGY/PRINCIPAL FINDINGS The EPIC-InterAct case-cohort study was conducted in 26 centers in 8 European countries and consists of a total of 12,403 incident type 2 diabetes cases and a stratified subcohort of 16,835 individuals from a total cohort of 340,234 participants with 3.99 million person-years of follow-up. Country-specific hazard ratios (HR) for incidence of type 2 diabetes were obtained after adjustment for lifestyle and dietary factors using a Cox regression adapted for a case-cohort design. Subsequently, country-specific HR were combined using a random effects meta-analysis. Tea consumption was studied as a categorical variable (0, >0-<1, 1-<4, ≥ 4 cups/day). The dose-response of the association was further explored by restricted cubic spline regression. Country-specific medians of tea consumption ranged from 0 cups/day in Spain to 4 cups/day in the United Kingdom. Tea consumption was inversely associated with incidence of type 2 diabetes; the HR was 0.84 [95%CI 0.71, 1.00] when participants who drank ≥ 4 cups of tea per day were compared with non-drinkers (p(linear trend) = 0.04). Incidence of type 2 diabetes already tended to be lower with tea consumption of 1-<4 cups/day (HR = 0.93 [95%CI 0.81, 1.05]). Spline regression did not suggest a non-linear association (p(non-linearity) = 0.20). CONCLUSIONS/SIGNIFICANCE A linear inverse association was observed between tea consumption and incidence of type 2 diabetes. People who drink at least 4 cups of tea per day may have a 16% lower risk of developing type 2 diabetes than non-tea drinkers.

Relevance: 90.00%

Publisher:

Abstract:

Raman spectroscopy has become an attractive tool for the analysis of pharmaceutical solid dosage forms. In the present study it is used to ensure the identity of tablets. The two main applications of this method are the release of final products in quality control and the detection of counterfeits. Twenty-five product families of tablets have been included in the spectral library and a non-linear classification method, Support Vector Machines (SVMs), has been employed. Two calibrations have been developed in cascade: the first identifies the product family while the second specifies the formulation. A product family comprises different formulations that have the same active pharmaceutical ingredient (API) but in different amounts. Once the tablets have been classified by the SVM model, API peak detection and correlation are applied in order to provide a specific identification method and, in the future, to allow discrimination of counterfeits from genuine products. This calibration strategy enables the identification of 25 product families without error and in the absence of prior information about the sample. Raman spectroscopy coupled with chemometrics is therefore a fast and accurate tool for the identification of pharmaceutical tablets.
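The two-stage ("cascade") classification described above can be sketched with scikit-learn's SVC. The toy "spectra" below are random vectors standing in for Raman spectra, and the numbers of families, formulations and spectral points are all illustrative, not those of the study:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_per, n_feat = 30, 50              # samples per class, "spectral" points
X, fam, form = [], [], []
for f in range(3):                  # 3 hypothetical product families
    for g in range(2):              # 2 formulations (API amounts) per family
        centre = rng.normal(size=n_feat)
        X.append(centre + 0.1 * rng.normal(size=(n_per, n_feat)))
        fam += [f] * n_per
        form += [f * 2 + g] * n_per
X = np.vstack(X)
fam = np.array(fam)
form = np.array(form)

# stage 1: one SVM identifies the product family
family_svm = SVC(kernel="rbf").fit(X, fam)
# stage 2: a per-family SVM identifies the formulation
formulation_svms = {f: SVC(kernel="rbf").fit(X[fam == f], form[fam == f])
                    for f in range(3)}

def classify(spectrum):
    f = int(family_svm.predict(spectrum.reshape(1, -1))[0])
    g = int(formulation_svms[f].predict(spectrum.reshape(1, -1))[0])
    return f, g
```

The cascade keeps each second-stage classifier small: it only ever has to separate the formulations within one family, mirroring the family/formulation split in the paper.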

Relevance: 90.00%

Publisher:

Abstract:

AIM: This study aims to investigate the clinical and demographic factors influencing gentamicin pharmacokinetics in a large cohort of unselected premature and term newborns and to evaluate optimal regimens in this population. METHODS: All gentamicin concentration data, along with clinical and demographic characteristics, were retrieved from medical charts in a Neonatal Intensive Care Unit over 5 years within the frame of a routine therapeutic drug monitoring programme. Data were described using non-linear mixed-effects regression analysis (NONMEM®). RESULTS: A total of 3039 gentamicin concentrations collected in 994 preterm and 455 term newborns were included in the analysis. A two-compartment model best characterized gentamicin disposition. The average parameter estimates, for a median body weight of 2170 g, were clearance (CL) 0.089 l h(-1) (CV 28%), central volume of distribution (Vc) 0.908 l (CV 18%), intercompartmental clearance (Q) 0.157 l h(-1) and peripheral volume of distribution (Vp) 0.560 l. Body weight, gestational age and post-natal age positively influenced CL. Dopamine co-administration had a significant negative effect on CL, whereas the influence of indomethacin and furosemide was not significant. Both body weight and gestational age significantly influenced Vc. Model-based simulations confirmed that, compared with term neonates, preterm infants need higher doses, above 4 mg kg(-1), at extended intervals to achieve adequate concentrations. CONCLUSIONS: This observational study conducted in a large cohort of newborns confirms the importance of body weight and gestational age for dosage adjustment. The model will serve to set up dosing recommendations and to elaborate a Bayesian tool for dosage individualization based on concentration monitoring.
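Using the population estimates quoted in the abstract, the two-compartment disposition model can be simulated directly; the bolus dose and the simulation grid below are illustrative assumptions, not part of the study:

```python
import numpy as np
from scipy.integrate import solve_ivp

# population estimates quoted above (median body weight 2170 g)
CL, Vc, Q, Vp = 0.089, 0.908, 0.157, 0.560   # l h-1, l, l h-1, l
dose = 4.0 * 2.17                            # mg: a hypothetical 4 mg/kg IV bolus

def two_compartment(t, a):
    a1, a2 = a                               # drug amounts: central / peripheral
    da1 = -(CL + Q) / Vc * a1 + (Q / Vp) * a2
    da2 = (Q / Vc) * a1 - (Q / Vp) * a2
    return [da1, da2]

times = np.linspace(0.0, 24.0, 241)
sol = solve_ivp(two_compartment, (0.0, 24.0), [dose, 0.0],
                t_eval=times, rtol=1e-8, atol=1e-10)
conc = sol.y[0] / Vc                         # central concentration, mg/l
```

Simulations like this (repeated across covariate values) are what underlie the abstract's conclusion that preterm infants need higher doses at extended intervals.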

Relevance: 90.00%

Publisher:

Abstract:

The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, whereas no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data. Finally, a separate uncertainty analysis was conducted that evaluated the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis provides an illustration of the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, both useful features for the Chernobyl fallout study.

Relevance: 90.00%

Publisher:

Abstract:

This comprehensive study aimed at understanding the reflections and contrasts between personal time and medical therapy protocol time in the life of a young woman with breast cancer. Addressed as a situational study and grounded in Beth's life story about getting sick and dying of cancer at age 34, the study's data collection process employed interviews, observation and medical record analysis. The construction of an analytic-synthetic chart based on the chronology of Beth's clinical progression, treatment phases and temporal perception of occurrences enabled us to point out a linear medical therapy protocol time identified by the diagnosis and treatment sequencing process. On the other hand, Beth's experienced time was marked by simultaneous and non-linear events that generated suffering resulting from the disease. Such comprehension highlights the need for healthcare professionals to take into account the time experienced by the patient, thus endowing the indispensable cancer therapeutic protocol with a personal character.

Relevance: 90.00%

Publisher:

Abstract:

In order to have references for discussing mathematical menus in political science, I review the most common types of mathematical formulae used in physics and chemistry, as well as some mathematical advances in economics. Several issues appear relevant: variables should be well defined and measurable; the relationships between variables may be non-linear; the direction of causality should be clearly identified and not assumed on a priori grounds. On these bases, theoretically-driven equations on political matters can be validated by empirical tests and can predict observable phenomena.

Relevance: 90.00%

Publisher:

Abstract:

This paper presents and estimates a dynamic choice model in the attribute space considering rational consumers. In light of the evidence of several state-dependence patterns, the standard attribute-based model is extended by considering a general utility function where pure inertia and pure variety-seeking behaviors can be explained in the model as particular linear cases. The dynamics of the model are fully characterized by standard dynamic programming techniques. The model presents a stationary consumption pattern that can be inertial, where the consumer only buys one product, or a variety-seeking one, where the consumer shifts among varied products. We run some simulations to analyze the consumption paths out of the steady state. Under the hybrid utility assumption, the consumer behaves inertially among the unfamiliar brands for several periods, eventually switching to a variety-seeking behavior when the stationary levels are approached. An empirical analysis is run using scanner databases for three different product categories: fabric softener, saltine cracker, and catsup. Non-linear specifications provide the best fit of the data, as hybrid functional forms are found in all the product categories for most attributes and segments. These results reveal the statistical superiority of the non-linear structure and confirm the gradual trend to seek variety as the level of familiarity with the purchased items increases.

Relevance: 90.00%

Publisher:

Abstract:

This paper presents a comparative analysis of linear and mixed models for short-term forecasting of a real data series with a high percentage of missing data. The data are the series of significant wave heights registered at regular periods of three hours by a buoy placed in the Bay of Biscay. The series is interpolated with a linear predictor which minimizes the forecast mean square error. The linear models are seasonal ARIMA models, and the mixed models have a linear component and a non-linear seasonal component. The non-linear component is estimated by a non-parametric regression of data versus time. Short-term forecasts, no more than two days ahead, are of interest because they can be used by the port authorities to notify the fleet. Several models are fitted and compared by their forecasting behavior.
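A minimal sketch of the mixed-model idea, assuming (as the abstract describes) a non-parametric seasonal component plus a linear component; the daily cycle, the noise level and the AR(1) choice for the linear part are all illustrative, not the paper's exact specification:

```python
import numpy as np

rng = np.random.default_rng(1)
period = 8                                      # 8 observations/day at 3 h intervals
t = np.arange(800)
season = 1.5 + np.sin(2 * np.pi * t / period)   # illustrative daily cycle
x = season + 0.3 * rng.normal(size=t.size)

# non-parametric seasonal component: Nadaraya-Watson kernel regression
# of the data versus the phase within the day
phase = t % period

def kernel_seasonal(p, h=0.8):
    d = np.minimum(np.abs(phase - p), period - np.abs(phase - p))  # circular distance
    w = np.exp(-0.5 * (d / h) ** 2)
    return np.sum(w * x) / np.sum(w)

season_hat = np.array([kernel_seasonal(p) for p in range(period)])

# linear component: AR(1) fitted to the deseasonalised residuals
r = x - season_hat[phase]
phi = np.sum(r[1:] * r[:-1]) / np.sum(r[:-1] ** 2)

# one-step-ahead forecast for the next 3 h period
forecast = season_hat[(t[-1] + 1) % period] + phi * r[-1]
```

The same decomposition extends to multi-step forecasts by iterating the AR component, which is what makes the two-day horizon mentioned above practical.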

Relevance: 90.00%

Publisher:

Abstract:

Erosion is deleterious because it reduces the soil's productivity capacity for growing crops and causes sedimentation and water pollution problems. Surface and buried crop residue, as well as live and dead plant roots, play an important role in erosion control. An efficient way to assess the effectiveness of such materials in erosion reduction is by means of decomposition constants as used within the Revised Universal Soil Loss Equation - RUSLE's prior-land-use subfactor - PLU. This was investigated using simulated rainfall on a 0.12 m m-1 slope, sandy loam Paleudult soil, at the Agriculture Experimental Station of the Federal University of Rio Grande do Sul, in Eldorado do Sul, State of Rio Grande do Sul, Brazil. The study area had been covered by native grass pasture for about fifteen years. By the middle of March 1996, the sod was mechanically mowed and the crop residue removed from the field. Late in April 1996, the sod was chemically desiccated with herbicide and, about one month later, the following treatments were established and evaluated for sod biomass decomposition and soil erosion, from June 1996 to May 1998, on duplicated 3.5 x 11.0 m erosion plots: (a) and (b) soil without tillage, with surface residue and dead roots; (c) soil without tillage, with dead roots only; (d) soil tilled conventionally every two-and-half months, with dead roots plus incorporated residue; and (e) soil tilled conventionally every six months, with dead roots plus incorporated residue. Simulated rainfall was applied with a rotating-boom rainfall simulator, at an intensity of 63.5 mm h-1 for 90 min, eight to nine times during the experimental period (about every two-and-half months). Surface and subsurface sod biomass amounts were measured before each rainfall test along with the erosion measurements of runoff rate, sediment concentration in runoff, soil loss rate, and total soil loss. Non-linear regression analysis was performed using an exponential and a power model. 
Surface sod biomass decomposition was better depicted by the exponential model, while subsurface sod biomass decomposition was better depicted by the power model. Subsurface sod biomass decomposed faster and more than surface sod biomass, with dead roots in untilled soil without residue on the surface decomposing more than dead roots in untilled soil with surface residue. Tillage type and frequency did not appreciably influence subsurface sod biomass decomposition. Soil loss rates increased greatly with both surface sod biomass decomposition and decomposition of subsurface sod biomass in the conventionally tilled soil, but they were minimally affected by subsurface sod biomass decomposition in the untilled soil. Runoff rates were little affected by the studied treatments. Dead roots plus incorporated residues were effective in reducing erosion in the conventionally tilled soil, while consolidation of the soil surface was important in no-till. The residual effect of the turned soil on erosion diminished gradually with time and ceased after two years.
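The exponential-versus-power model comparison described above can be reproduced in miniature with scipy's curve_fit. The decomposition data below are simulated from an exponential decay (all numbers are illustrative, and the power form is shifted by one day to avoid the singularity at t = 0):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
days = np.array([0, 75, 150, 225, 300, 375, 450, 525, 600], float)
# simulated residue biomass (kg ha-1) decaying exponentially, with 3% noise
mass = 4000 * np.exp(-0.004 * days) * (1 + 0.03 * rng.normal(size=days.size))

def exp_model(t, m0, k):
    """Exponential decay with decomposition constant k."""
    return m0 * np.exp(-k * t)

def power_model(t, m0, b):
    """Power decay, shifted by one day so t = 0 is defined."""
    return m0 * (1.0 + t) ** (-b)

p_exp, _ = curve_fit(exp_model, days, mass, p0=(4000, 0.001))
p_pow, _ = curve_fit(power_model, days, mass, p0=(4000, 0.5))

sse_exp = np.sum((mass - exp_model(days, *p_exp)) ** 2)
sse_pow = np.sum((mass - power_model(days, *p_pow)) ** 2)
```

Here the exponential fit should win because the data were generated from it; in the study, the exponential form fit the surface biomass better and the power form the subsurface biomass.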

Relevance: 90.00%

Publisher:

Abstract:

This study investigated the spatial, spectral, temporal and functional properties of functional brain connections involved in the concurrent execution of unrelated visual perception and working memory tasks. Electroencephalography data were analysed using a novel data-driven approach assessing source coherence at the whole-brain level. Three connections in the beta-band (18-24 Hz) and one in the gamma-band (30-40 Hz) were modulated by dual-task performance. Beta-coherence increased within two dorsofrontal-occipital connections in dual-task conditions compared to the single-task condition, with the highest coherence seen during low working memory load trials. In contrast, beta-coherence in a prefrontal-occipital functional connection and gamma-coherence in an inferior frontal-occipitoparietal connection was not affected by the addition of the second task and only showed elevated coherence under high working memory load. Analysis of coherence as a function of time suggested that the dorsofrontal-occipital beta-connections were relevant to working memory maintenance, while the prefrontal-occipital beta-connection and the inferior frontal-occipitoparietal gamma-connection were involved in top-down control of concurrent visual processing. The fact that increased coherence in the gamma-connection, from low to high working memory load, was negatively correlated with faster reaction time on the perception task supports this interpretation. Together, these results demonstrate that dual-task demands trigger non-linear changes in functional interactions between frontal-executive and occipitoparietal-perceptual cortices.
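Band-limited coherence of the kind analysed here can be computed with scipy.signal.coherence. The two synthetic "channels" below share a 21 Hz (beta-band) component; the sampling rate, noise level and segment length are illustrative, not the study's recording parameters:

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(3)
fs = 250.0                          # Hz, illustrative EEG sampling rate
t = np.arange(0, 20, 1 / fs)        # 20 s of data

# two synthetic "channels" sharing a 21 Hz (beta-band) component
shared = np.sin(2 * np.pi * 21 * t)
ch1 = shared + 0.8 * rng.normal(size=t.size)
ch2 = shared + 0.8 * rng.normal(size=t.size)

f, coh = coherence(ch1, ch2, fs=fs, nperseg=512)

beta = (f >= 18) & (f <= 24)        # beta band, as in the abstract
gamma = (f >= 30) & (f <= 40)       # gamma band, as in the abstract
beta_coh = coh[beta].mean()
gamma_coh = coh[gamma].mean()
peak_coh = coh[np.argmin(np.abs(f - 21.0))]
```

Because only the beta component is shared, coherence peaks near 21 Hz while the gamma band stays near the noise floor; the study performs this kind of comparison at the source level across conditions.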

Relevance: 90.00%

Publisher:

Abstract:

The nutritional state of the pineapple plant has a large effect on plant growth, on fruit production, and fruit quality. The aim of this study was to assess the uptake, accumulation, and export of nutrients by the irrigated 'Vitória' pineapple plant during and at the end of its development. A randomized block statistical design with four replications was used. The treatments were defined by different times of plant collection: at 270, 330, 390, 450, 510, 570, 690, 750, and 810 days after planting (DAP). The collected plants were separated into the following components: leaves, stem, roots, fruit, and slips for determination of fresh and dry matter weight at 65 ºC. After drying, the plant components were ground for characterization of the composition and content of nutrients taken up and exported by the pineapple plant. The results were subjected to analysis of variance, and non-linear regression models were fitted for the significant differences identified by the F test (p<0.01). The leaves and the stem were the plant components that showed the greatest accumulation of nutrients. For production of 72 t ha-1 of fruit, the macronutrient accumulation in the 'Vitória' pineapple exhibited the following decreasing order: K > N > S > Ca > Mg > P, which corresponded to 898, 452, 134, 129, 126, and 107 kg ha-1, respectively, of total accumulation. The export of macronutrients by the pineapple fruit was in the following decreasing order: K > N > S > Ca > P > Mg, which was equivalent to 18, 17, 11, 8, 8, and 5 %, respectively, of the total accumulated by the pineapple. The 'Vitória' pineapple plant exported 78 kg ha-1 of N, 8 kg ha-1 of P, 164 kg ha-1 of K, 14 kg ha-1 of S, 10 kg ha-1 of Ca, and 6 kg ha-1 of Mg by the fruit. 
The nutrient content exported by the fruit represents an important component of nutrient extraction from the soil, which needs to be restored, while the nutrients contained in the leaves, stems and roots can be returned to the soil within a program of crop residue recycling.

Relevance: 90.00%

Publisher:

Abstract:

Estimating the time since discharge of a spent cartridge or a firearm can be useful in criminal situations involving firearms. The analysis of volatile gunshot residue remaining after shooting using solid-phase microextraction (SPME) followed by gas chromatography (GC) was proposed to meet this objective. However, current interpretative models suffer from several conceptual drawbacks which render them inadequate to assess the evidential value of a given measurement. This paper aims to fill this gap by proposing a logical approach based on the assessment of likelihood ratios. A probabilistic model was thus developed and applied to a hypothetical scenario where alternative hypotheses about the discharge time of a spent cartridge found on a crime scene were forwarded. In order to estimate the parameters required to implement this solution, a non-linear regression model was proposed and applied to real published data. The proposed approach proved to be a valuable method for interpreting aging-related data.
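The likelihood-ratio logic can be sketched in a few lines. The linear "aging model", its parameters, and the measurement below are purely hypothetical placeholders, not the published non-linear regression:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a Gaussian measurement-error model."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def expected_signal(t_hours, intercept=5.0, slope=0.05):
    """Hypothetical aging curve: the GC signal decays with time since discharge."""
    return intercept - slope * t_hours

def likelihood_ratio(y, t1, t2, sigma=0.3):
    """LR for H1 (discharged t1 hours ago) vs H2 (discharged t2 hours ago),
    assuming Gaussian measurement error around the aging curve."""
    return normal_pdf(y, expected_signal(t1), sigma) / normal_pdf(y, expected_signal(t2), sigma)

# a measurement of 4.0 strongly favours a recent (20 h) over an old (60 h) discharge
lr = likelihood_ratio(4.0, 20.0, 60.0)
```

An LR above 1 supports the first hypothesis and below 1 the second; the paper's contribution is fitting the aging curve and its uncertainty from real published SPME-GC data rather than assuming them.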