930 results for Random effect model
Abstract:
Biotechnology, climate change, natural resources, and ecosystem management are all representative of the "new politics of nature" (Hajer 2003), a term encompassing issues marked by high scientific uncertainty and a regulatory framework ill-suited to new realities, which in turn generates unusually intense political conflict. Hoping to ease these tensions and produce consensual knowledge, many governments turn to ad hoc scientific institutions to inform policy-making and address stakeholders' concerns. But do these scientific assessments actually create a common understanding shared by such polarized political actors? While one might expect them to foster a unifying climate of collective learning, a conflictual political environment makes learning between opponents extremely unlikely. This research therefore documents the conciliatory potential of scientific assessments using the case of Quebec shale gas (2010-2014). In doing so, it mobilizes the literature on the political dimensions of knowledge and science to conceptualize the role of scientific assessments within a theory of scientific brokerage. A social network analysis (SNA) of the 5,751 references contained in the briefs submitted by 268 organizations participating in the 2010 and 2014 public consultations forms the core of the empirical demonstration. Specifically, it shows how a scientific broker can redirect the flow of information to counter the incompatibility between collective learning and political conflict. The argument draws on the cognitive mechanisms traditionally found in policy broker theory, but also introduces the power plays fundamental to the circulation of knowledge among political actors.
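A minimal sketch of how such a citation network could be assembled and a candidate broker flagged with standard SNA tools; the organizations, references, and ties below are invented for illustration and do not reproduce the thesis's data or procedure:

```python
import networkx as nx

# Organizations linked to the references they cite (invented two-mode network).
edges = [
    ("org_industry_A", "ref_1"), ("org_industry_A", "ref_2"),
    ("org_enviro_B", "ref_3"), ("org_enviro_B", "ref_4"),
    ("assessment_body", "ref_2"), ("assessment_body", "ref_3"),
]
G = nx.Graph(edges)

# Project onto organizations: two are tied when they share a cited reference.
orgs = {"org_industry_A", "org_enviro_B", "assessment_body"}
P = nx.bipartite.weighted_projected_graph(G, orgs)

# High betweenness flags actors positioned between otherwise disconnected
# coalitions -- candidate scientific brokers.
scores = nx.betweenness_centrality(P, weight="weight")
print(max(scores, key=scores.get))  # -> "assessment_body"
```

Projecting the two-mode organization-reference network onto organizations and ranking them by betweenness centrality is one standard way to identify actors positioned to relay information between otherwise disconnected coalitions.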
Abstract:
Rainfall data commonly exhibit variation, and estimating their spatial distribution is essential for agricultural and environmental planning. The objective of this work was to compare the weighted least squares estimation method for fitting models to the semivariogram with the trial-and-error method, using the "jack-knifing" self-validation technique, for mean annual rainfall data from the State of São Paulo. Rainfall observations from 1957 to 1997 were used for three hundred and seventy-nine (379) rain gauge stations covering the entire State of São Paulo, representing an area of approximately 248,808.8 km². The periodicity exhibited by the semivariograms was fitted with the "hole effect" model, whose parameters were estimated more precisely by the weighted least squares method than by trial and error. The "jack-knifing" self-validation method proved adequate for choosing the methods and models to be used for semivariances, and the procedure identified sixteen neighbors as the ideal number for kriging estimates of rainfall at unsampled locations in the State of São Paulo.
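A hedged sketch of the weighted least squares step described above, fitting one common form of the hole-effect semivariogram model with Cressie-style weights N(h)/γ(h)²; the lag distances, pair counts, and semivariances are synthetic, not the São Paulo data:

```python
import numpy as np
from scipy.optimize import curve_fit

def hole_effect(h, c0, c, a):
    """One common hole-effect form: nugget c0, sill contribution c, range parameter a."""
    x = h / a
    return c0 + c * (1.0 - np.sin(x) / x)

rng = np.random.default_rng(0)
h = np.arange(20.0, 260.0, 20.0)                 # lag distances (km)
n_pairs = np.linspace(900, 400, h.size).round()  # pairs per lag (invented)
gamma_hat = hole_effect(h, 100.0, 500.0, 40.0) + rng.normal(0.0, 15.0, h.size)

# Cressie-style weights N(h)/gamma(h)^2 enter curve_fit through sigma ~ 1/sqrt(w).
sigma = gamma_hat / np.sqrt(n_pairs)
params, _ = curve_fit(hole_effect, h, gamma_hat, p0=[50.0, 400.0, 30.0], sigma=sigma)
print(dict(zip(["nugget c0", "sill c", "range a"], params.round(1))))
```

The jack-knifing step would then delete each station in turn, krige it from its neighbors under the fitted model, and compare predicted with observed values.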
Abstract:
The papers included in this thesis deal with a few aspects of insurance economics that have seldom been dealt with in the applied literature. In the first paper I apply for the first time the tools of the economics of crime to study the determinants of fraud, using data on Italian provinces. The contributions to the literature are manifold:
- The price of insuring has a positive correlation with the propensity to defraud.
- Social norms constrain fraudulent behavior, but their strength is curtailed in economic downturns.
- I apply a simple extension of the Random Coefficient model, which allows for the presence of time-invariant covariates and asymmetries in the impact of the regressors.
The second paper assesses how the evolution of macroprudential regulation of insurance companies has been reflected in their equity prices. I employ a standard event study methodology, deriving the definition of the "control" and "treatment" groups from what is implied by the regulatory framework. The main results are:
- Markets care about the evolution of the legislation. Their perception has shifted from an initial positive assessment of a possible implicit "too big to fail" subsidy to a more negative one related to its cost in terms of stricter capital requirements.
- The size of this phenomenon is positively related to the leverage, size, and geographical location of the insurance companies.
The third paper introduces a novel methodology to forecast non-life insurance premiums and profitability as a function of macroeconomic variables, using the simultaneous equation framework traditionally employed in macroeconometric models and a simple theoretical model of insurance pricing to derive a long-term relationship between premiums, claims expenses, and short-term rates. The model is shown to provide better forecasts of premiums and profitability than the single-equation specifications commonly used in applied analysis.
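The second paper's event-study machinery can be illustrated with a minimal market-model sketch; the return series, window lengths, and test statistic below are generic placeholders, not the paper's data or exact specification:

```python
import numpy as np

rng = np.random.default_rng(0)
market = rng.normal(0.0, 0.01, 250)                # daily market returns
stock = 0.5 * market + rng.normal(0.0, 0.01, 250)  # one insurer's returns

est, event = slice(0, 200), slice(200, 211)        # estimation / event windows
beta, alpha = np.polyfit(market[est], stock[est], 1)

abnormal = stock[event] - (alpha + beta * market[event])
car = abnormal.sum()                               # cumulative abnormal return
t_stat = car / (abnormal.std(ddof=1) * np.sqrt(abnormal.size))
print(f"CAR = {car:.4f}, t = {t_stat:.2f}")
```

In the paper's setting, the event dates would be regulatory announcements and the cross-section of CARs would be compared between the "control" and "treatment" groups implied by the regulatory framework.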
Abstract:
Earthquake prediction is a complex task for scientists due to the rare occurrence of high-intensity earthquakes and their inaccessible depths. Despite this challenge, protecting infrastructure and populations living in areas of high seismic risk is a priority. Reliable forecasting requires comprehensive knowledge of seismic phenomena. This thesis presents the development, application, and comparison of both deterministic and probabilistic forecasting methods. On the deterministic side, it describes the implementation of an alarm-based method that uses the occurrence of strong (fore)shocks, widely felt by the population, as a precursor signal. This model is then applied to the retrospective prediction of earthquakes of magnitude M ≥ 5.0, 5.5, and 6.0 that occurred in Italy from 1960 to 2020. Retrospective performance testing is carried out using tests and statistics specific to deterministic alarm-based models. Regarding probabilistic models, this thesis focuses mainly on the EEPAS and ETAS models. Although the EEPAS model has previously been applied and tested in some regions of the world, it had never been used for forecasting Italian earthquakes. In the thesis, the EEPAS model is used to retrospectively forecast shallow Italian earthquakes of magnitude M ≥ 5.0 using new MATLAB software. The forecasting performance of the probabilistic models was compared to that of other models using CSEP binary tests. The EEPAS and ETAS models showed different characteristics for forecasting Italian earthquakes, with EEPAS performing better in the long term and ETAS in the short term. The FORE model, based on strong precursor quakes, is compared to EEPAS and ETAS using an alarm-based deterministic approach. All models perform better than a random forecasting model, with the ETAS and FORE models showing the best performance. However, to fully evaluate forecasting performance, prospective tests should be conducted. The lack of objective tests for evaluating deterministic models and comparing them with probabilistic ones was a challenge faced during the study.
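For orientation, a toy version of the ETAS conditional intensity referred to above; the thesis's implementation is in MATLAB, and the parameter values below are arbitrary placeholders, not the fitted Italian values:

```python
import numpy as np

def etas_intensity(t, events, mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.1, m0=5.0):
    """lambda(t) = mu + sum_i K * exp(alpha*(m_i - m0)) * (t - t_i + c)**(-p)."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * np.exp(alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
    return rate

past = [(0.0, 5.8), (2.5, 5.1)]   # (origin time in days, magnitude)
print(etas_intensity(3.0, past))  # expected event rate just after the cluster
```

The background rate mu dominates in quiet periods, while the Omori-type aftershock terms drive the short-term skill noted above.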
Abstract:
We present a method to simulate the Magnetic Barkhausen Noise using the Random Field Ising Model with long-range magnetic interaction. The method allows calculating the behavior of the magnetic flux density in particular sections of the lattice. The results show an internal demagnetizing effect that arises from the long-range magnetic interactions. This demagnetizing effect induces the appearance of a magnetic pattern in the region of magnetic avalanches. Compared with the traditional method, the proposed numerical procedure markedly reduces the computational cost of the simulation.
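A compact, zero-temperature sketch of a random field Ising chain with a mean-field demagnetizing term standing in for the long-range interaction; it illustrates the avalanche relaxation underlying Barkhausen noise but does not reproduce the paper's lattice geometry or numerical procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
N, J, R, k = 200, 1.0, 1.4, 0.1         # spins, coupling, disorder, demag strength
h_rand = rng.normal(0.0, R, N)          # quenched random fields
s = -np.ones(N)                         # start fully magnetized down

def local_field(i, H):
    nn = s[(i - 1) % N] + s[(i + 1) % N]          # periodic nearest neighbors
    return J * nn + h_rand[i] + H - k * s.mean()  # -k*M mimics demagnetization

for H in np.linspace(-3.0, 3.0, 300):   # slowly ramp the external field
    flipped = True
    while flipped:                      # relax all unstable spins (avalanches)
        flipped = False
        for i in range(N):
            if s[i] < 0 and local_field(i, H) > 0:
                s[i] = 1.0
                flipped = True
print("final magnetization:", s.mean())
```

Tracking the jumps in s.mean() as H is ramped gives the avalanche (Barkhausen) signal; the -k*M term is the internal demagnetizing feedback discussed above.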
Abstract:
The objective of the present study was to estimate genetic parameters for milk yield by applying random regression models and parametric correlation functions combined with a variance function to model animal permanent environmental effects. A total of 152,145 test-day milk yields from 7,317 first lactations of Holstein cows belonging to herds located in the southeastern region of Brazil were analyzed. Test-day milk yields were divided into 44 weekly classes of days in milk. Contemporary groups were defined by herd-test-day, comprising a total of 2,539 classes. The model included direct additive genetic, permanent environmental, and residual random effects. The following fixed effects were considered: contemporary group, age of cow at calving (linear and quadratic regressions), and the population average lactation curve, modeled by a fourth-order orthogonal Legendre polynomial. Additive genetic effects were modeled by random regression on orthogonal Legendre polynomials of days in milk, whereas permanent environmental effects were estimated using a stationary or nonstationary parametric correlation function combined with a variance function of varying order. The structure of residual variances was modeled using a step function containing 6 variance classes. The genetic parameter estimates obtained with the model using a stationary correlation function associated with a variance function to model permanent environmental effects were similar to those obtained with models employing orthogonal Legendre polynomials for the same effect. A model using a sixth-order polynomial for additive effects and a stationary parametric correlation function associated with a seventh-order variance function for permanent environmental effects would be sufficient to fit the data.
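The Legendre covariables used in such random regression models can be sketched as follows; the rescaling of days in milk to [-1, 1] is standard, while the interval bounds and the omission of the normalization factor often used in animal breeding applications are simplifications here:

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_covariables(dim, dim_min=5.0, dim_max=305.0, order=4):
    """Legendre basis (orders 0..order) evaluated at days in milk `dim`."""
    x = 2.0 * (dim - dim_min) / (dim_max - dim_min) - 1.0  # rescale to [-1, 1]
    return np.column_stack(
        [legendre.legval(x, np.eye(order + 1)[j]) for j in range(order + 1)]
    )

Z = legendre_covariables(np.linspace(5.0, 305.0, 44))  # 44 weekly classes
print(Z.shape)  # (44, 5): one covariable per polynomial order 0..4
```

These covariables form the design matrix of the random regression; raising the order (e.g., to the sixth order reported for additive effects) simply adds columns.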
Abstract:
Most cellular solids are random materials, while practically all theoretical structure-property results are for periodic models. To be able to generate theoretical results for random models, the finite element method (FEM) was used to study the elastic properties of solids with a closed-cell cellular structure. We computed the density (ρ) and microstructure dependence of the Young's modulus (E) and Poisson's ratio (PR) for several different isotropic random models based on Voronoi tessellations and level-cut Gaussian random fields. The effect of partially open cells is also considered. The results, which are best described by a power law E ∝ ρ^n (1 < n < 2), show the influence of randomness and isotropy on the properties of closed-cell cellular materials and are found to be in good agreement with experimental data.
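As a worked illustration of the reported power law, the exponent n in E ∝ ρ^n can be recovered from modulus-density pairs by log-log regression; the data below are synthetic, generated with n = 1.6 inside the reported 1 < n < 2 range:

```python
import numpy as np

rng = np.random.default_rng(2)
rho = np.linspace(0.05, 0.4, 20)                              # relative density
E = 3.0 * rho**1.6 * np.exp(rng.normal(0.0, 0.05, rho.size))  # noisy modulus

n, log_c = np.polyfit(np.log(rho), np.log(E), 1)              # slope = exponent
print(f"estimated exponent n = {n:.2f}")                      # close to 1.6
```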
Abstract:
We propose a new and clinically oriented approach to perform atlas-based segmentation of brain tumor images. A mesh-free method is used to model tumor-induced soft tissue deformations in a healthy brain atlas image, with subsequent registration of the modified atlas to a pathologic patient image. The atlas is seeded with a tumor position prior, and tumor growth simulating the tumor mass effect is performed with the aim of improving registration accuracy for patients with space-occupying lesions. We perform tests on 2D axial slices of five different patient data sets and show that the approach gives good results for the segmentation of white matter, grey matter, cerebrospinal fluid, and the tumor.
Abstract:
We present an automatic method to segment brain tissues from volumetric MRI brain tumor images. The method is based on non-rigid registration of an average atlas in combination with a biomechanically justified tumor growth model to simulate the soft-tissue deformations caused by the tumor mass effect. The tumor growth model, formulated as a mesh-free Markov Random Field energy minimization problem, ensures correspondence between the atlas and the patient image prior to the registration step. The method is non-parametric, simple, and fast compared to other approaches while maintaining similar accuracy. It has been evaluated qualitatively and quantitatively, with promising results, on eight datasets comprising simulated images and real patient data.
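A schematic of the energy-minimization idea (a data term plus a pairwise smoothness term, relaxed by iterated conditional modes) on a toy one-dimensional label field; this stands in for, and does not reproduce, the paper's mesh-free formulation:

```python
import numpy as np

obs = np.array([0.1, 0.2, 0.8, 0.7, 0.9, 0.2])  # per-site evidence for label 1
labels = (obs > 0.5).astype(int)                 # initial labeling
lam = 0.5                                        # smoothness weight

def site_energy(i, l):
    data = (obs[i] - l) ** 2                     # data-fidelity term
    nbrs = [j for j in (i - 1, i + 1) if 0 <= j < obs.size]
    smooth = sum(l != labels[j] for j in nbrs)   # Potts-style pairwise term
    return data + lam * smooth

for _ in range(5):                               # iterated conditional modes
    for i in range(obs.size):
        labels[i] = min((0, 1), key=lambda l: site_energy(i, l))
print(labels)
```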
Abstract:
The use of group-randomized trials is particularly widespread in the evaluation of health care, educational, and screening strategies. Group-randomized trials represent a subset of a larger class of designs often labeled nested, hierarchical, or multilevel and are characterized by the randomization of intact social units or groups, rather than individuals. The application of random effects models to group-randomized trials requires the specification of fixed and random components of the model. The underlying assumption is usually that these random components are normally distributed. This research is intended to determine whether the Type I error rate and power are affected when the assumption of normality for the random component representing the group effect is violated.

In this study, simulated data are used to examine the Type I error rate, power, bias, and mean squared error of the estimates of the fixed effect and the observed intraclass correlation coefficient (ICC) when the random component representing the group effect possesses distributions with non-normal characteristics, such as heavy tails or severe skewness. The simulated data are generated with various characteristics (e.g., number of schools per condition, number of students per school, and several within-school ICCs) observed in most small, school-based, group-randomized trials. The analysis is carried out using SAS PROC MIXED, Version 6.12, with random effects specified in a RANDOM statement and restricted maximum likelihood (REML) estimation. The results from the non-normally distributed data are compared to the results obtained from the analysis of data with similar design characteristics but normally distributed random effects.

The results suggest that violation of the normality assumption for the group component by a skewed or heavy-tailed distribution does not appear to influence the estimation of the fixed effect, the Type I error rate, or power. Negative biases were detected when estimating the sample ICC and increased dramatically in magnitude as the true ICC increased. These biases were not as pronounced when the true ICC was within the range observed in most group-randomized trials (i.e., 0.00 to 0.05). The normally distributed group effect also resulted in biased ICC estimates when the true ICC was greater than 0.05; however, this may be a result of higher correlation within the data.
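The simulation design can be sketched as follows: a skewed (gamma) group effect replaces the normal draw, and the sample ICC is computed with the one-way ANOVA estimator. The design constants are illustrative, and the original analysis used SAS PROC MIXED rather than this Python re-creation:

```python
import numpy as np

rng = np.random.default_rng(3)
schools, students, true_icc = 10, 30, 0.05   # per condition; total variance = 1
sig_b, sig_w = true_icc, 1.0 - true_icc      # between/within school variances

# Skewed group effect: a centered, rescaled gamma draw instead of a normal one.
shape = 2.0
b = (rng.gamma(shape, 1.0, 2 * schools) - shape) * np.sqrt(sig_b / shape)
y = b[:, None] + rng.normal(0.0, np.sqrt(sig_w), (2 * schools, students))

# One-way ANOVA estimator of the intraclass correlation coefficient.
msb = students * y.mean(axis=1).var(ddof=1)
msw = y.var(axis=1, ddof=1).mean()
icc_hat = (msb - msw) / (msb + (students - 1) * msw)
print(f"sample ICC = {icc_hat:.3f} (true = {true_icc})")
```

Repeating such draws many times and tabulating the bias of icc_hat against the true ICC mirrors the kind of comparison reported above.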
Abstract:
Improved strategies for synthesis make it possible to expand the range of glycopeptides available for detailed conformational studies. The glycopeptide 1 was synthesized using a new solid phase synthesis of carbohydrates and a convergent coupling to peptide followed by deprotection. Its conformational properties were subjected to NMR analysis and compared with a control peptide 2 prepared by conventional solid phase methods. Whereas peptide 2 fails to manifest any appreciable secondary structure, the glycopeptide 1 does show considerable conformational bias suggestive of an equilibrium between an ordered and a random state. The implications of this ordering effect for the larger issue of protein folding are considered.
Abstract:
We study theoretically the effect of a new type of blocklike positional disorder on the effective electromagnetic properties of one-dimensional chains of resonant, high-permittivity dielectric particles, where particles are arranged into perfectly well-ordered blocks whose relative position is a random variable. This creates a finite order correlation length that mimics the situation encountered in metamaterials fabricated through self-assembled techniques, whose structures often display short-range order between near neighbors but long-range disorder, due to stacking defects. Using a spectral theory approach combined with a principal component statistical analysis, we study, in the long-wavelength regime, the evolution of the electromagnetic response when the composite filling fraction and the block size are changed. Modifications in key features of the resonant response (amplitude, width, etc.) are investigated, showing a regime transition for a filling fraction around 50%.
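The principal component step can be illustrated on an ensemble of synthetic resonant spectra with jittered resonance positions; the Lorentzian line shapes and disorder strength below are invented stand-ins for the simulated electromagnetic responses:

```python
import numpy as np

rng = np.random.default_rng(4)
w = np.linspace(0.8, 1.2, 200)               # normalized frequency axis
spectra = np.array([
    1.0 / ((w - (1.0 + rng.normal(0.0, 0.02))) ** 2 + 0.01 ** 2)
    for _ in range(100)                       # one spectrum per disorder draw
])

X = spectra - spectra.mean(axis=0)            # center the ensemble
U, sv, Vt = np.linalg.svd(X, full_matrices=False)
explained = sv**2 / (sv**2).sum()
print("variance in first two components:", explained[:2].sum().round(3))
```

How the leading components and their weights shift as disorder parameters change is the kind of signature a principal component analysis of the resonant response can expose.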
Abstract:
The generation of bradykinin (BK; Arg-Pro-Pro-Gly-Phe-Ser-Pro-Phe-Arg) in blood and kallidin (Lys-BK) in tissues by the action of the kallikrein-kinin system has received little attention in non-mammalian vertebrates. In mammals, kallidin can be generated by the coronary endothelium and myocytes in response to ischemia, mediating cardioprotective events. The plasma of birds lacks two key components of the kallikrein-kinin system: the low molecular weight kininogen and a prekallikrein activator analogous to mammalian factor XII, but treatment with bovine plasma kallikrein generates ornitho-kinin [Thr6,Leu8]-BK. The possible cardioprotective effect of ornitho-kinin infusion was investigated in an anesthetized, open-chest chicken model of acute coronary occlusion. A branch of the left main coronary artery was reversibly ligated to produce ischemia followed by reperfusion, after which the degree of myocardial necrosis (infarct size as a percent of area at risk) was assessed by tetrazolium staining. The iv injection of a low dose of ornitho-kinin (4 µg/kg) reduced mean arterial pressure from 88 ± 12 to 42 ± 7 mmHg and increased heart rate from 335 ± 38 to 402 ± 45 bpm (N = 5). The size of the infarct was reduced by pretreatment with ornitho-kinin (500 µg/kg infused over a period of 5 min) from 35 ± 3 to 10 ± 2% of the area at risk. These results suggest that the physiological role of the kallikrein-kinin system is preserved in this animal model in spite of the absence of two key components, i.e., low molecular weight kininogen and factor XII.
Abstract:
Objective: The aim of this study was to assess the effects of 830 and 670 nm laser irradiation on malondialdehyde (MDA) concentration in random skin-flap survival. Background Data: Low-level laser therapy (LLLT) has been reported to be successful in stimulating the formation of new blood vessels and activating superoxide dismutase delivery, thus helping to inhibit free-radical action and consequently reducing necrosis. Materials and Methods: Thirty Wistar rats were used, divided into three groups of 10 rats each. A random skin flap was raised on the dorsum of each animal. Group 1 was the control group; group 2 received 830 nm laser radiation; and group 3 was submitted to 670 nm laser radiation. The animals underwent laser therapy at an energy density of 36 J/cm² immediately after surgery and on each of the 4 subsequent days. The laser radiation was applied at a single point, 2.5 cm from the flap's cranial base. The percentage of the skin-flap necrosis area was calculated 7 days postoperatively using the paper-template method, and a skin sample was collected immediately afterward to determine the MDA concentration. Results: Statistically significant differences were found between the necrosis percentages, with higher values in group 1 compared with groups 2 and 3. Groups 2 and 3 did not differ significantly (p > 0.05). Group 3 had a lower MDA concentration compared to the control group (p < 0.05). Conclusion: LLLT was effective in increasing random skin-flap viability in rats, and the 670 nm laser was efficient in reducing the MDA concentration.