948 results for Random Coefficient Autoregressive Model (RCAR(1))
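For orientation, the RCAR(1) model named in the query adds a random perturbation to the AR(1) coefficient at each step. A minimal simulation sketch (the parameter values below are illustrative, not taken from any of the papers listed):

```python
import random

def simulate_rcar1(n, phi=0.5, sigma_b=0.2, sigma_e=1.0, seed=42):
    """Simulate a Random Coefficient AR(1) process:
    X_t = (phi + b_t) * X_{t-1} + e_t,
    with b_t ~ N(0, sigma_b^2) and e_t ~ N(0, sigma_e^2)."""
    rng = random.Random(seed)
    x, path = 0.0, []
    for _ in range(n):
        b = rng.gauss(0.0, sigma_b)   # random perturbation of the AR coefficient
        e = rng.gauss(0.0, sigma_e)   # innovation
        x = (phi + b) * x + e
        path.append(x)
    return path

series = simulate_rcar1(500)
```

With `phi + b_t` staying below 1 in mean square, the process is stationary even though individual draws of the coefficient may exceed 1.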
Abstract:
The objective of this paper is to introduce a different approach, called the ecological-longitudinal, to carrying out pooled analysis in time series ecological studies. Because it gives a larger number of data points and, hence, increases the statistical power of the analysis, this approach, unlike conventional ones, allows features such as random effect models, lags, and interactions between pollutants and between pollutants and meteorological variables, which are hard to implement in conventional approaches. Design: the approach is illustrated by providing quantitative estimates of the short-term effects of air pollution on mortality in three Spanish cities, Barcelona, Valencia and Vigo, for the period 1992–1994. Because the dependent variable was a count, a Poisson generalised linear model was first specified. Several modelling issues are worth mentioning. Firstly, because the relations between mortality and explanatory variables were nonlinear, cubic splines were used for covariate control, leading to a generalised additive model (GAM). Secondly, the effects of the predictors on the response were allowed to occur with some lag. Thirdly, residual autocorrelation, arising from imperfect control, was controlled for by means of an autoregressive Poisson GAM. Finally, the longitudinal design demanded consideration of individual heterogeneity, requiring mixed models. Main results: the estimates of the relative risks obtained from the individual analyses varied across cities, particularly those associated with sulphur dioxide. The highest relative risks corresponded to black smoke in Valencia. These estimates were higher than those obtained from the ecological-longitudinal analysis.
Relative risks estimated from this latter analysis were practically identical across cities: 1.00638 (95% confidence interval 1.0002, 1.0011) for a black smoke increase of 10 μg/m3 and 1.00415 (95% CI 1.0001, 1.0007) for an increase of 10 μg/m3 of sulphur dioxide. Because the statistical power is higher than in the individual analyses, more interactions were statistically significant, especially those among air pollutants and meteorological variables. Conclusions: air pollutant levels were related to mortality in the three cities of the study, Barcelona, Valencia and Vigo. These results are consistent with similar studies in other cities and with other multicentric studies, and are coherent with both the previous individual analyses for each city and the multicentric studies for all three cities.
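The reported relative risks come from a log-linear (Poisson) model, in which a fitted coefficient translates into a relative risk per fixed pollutant increment. A small sketch of that mapping (the coefficient below is back-derived from the reported RR and is illustrative only):

```python
import math

def relative_risk(beta_per_unit, increase):
    """In a log-linear (Poisson) model, the relative risk for a given
    pollutant increment is exp(coefficient * increment)."""
    return math.exp(beta_per_unit * increase)

# hypothetical coefficient, back-derived from the reported RR of 1.00638
# per 10 ug/m3 of black smoke (illustrative only)
beta_bs = math.log(1.00638) / 10.0
rr_10 = relative_risk(beta_bs, 10.0)   # recovers ~1.00638
```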
Abstract:
Our research falls within the dynamic conception of intelligence, and specifically within the processes that shape cerebral processing in the information-integration model described by Das, Kirby and Jarman (1979). The two cerebral processes that constitute the basis of intelligent behaviour are simultaneous processing and sequential processing; they are the two main strategies of information processing. Any kind of stimulus can be processed either sequentially (seriation, verbal, analysis) or simultaneously (global, visual, synthesis). Building on the literature review, and convinced that by approaching the peculiarities of information processing we gain insight into the process that leads to intelligent behaviour, and hence to learning, we formulate the following working hypothesis: in preschool children (between three and six years of age) both types of processing will be present and will vary as a function of age, sex, attention, learning difficulties, language problems, bilingualism, sociocultural level, manual dominance, mental level and the presence of pathology. Any differences found will allow us to formulate criteria and guidelines for educational intervention. Our objectives are to measure processing in preschool children from the Girona counties, to verify the relation of each type of processing with the variables mentioned, to check whether, on the basis of our results, a parallel can be drawn between processing and the contributions of the localisationist conception of cerebral functions, and to derive guidelines for pedagogical intervention. As for the method, we selected a representative sample of the boys and girls enrolled in the public schools of the Girona counties during the 1992/93 school year, by means of stratified, cluster-based random sampling.
The actual sample size is two hundred and sixty-one subjects. The instruments used were the following: the Kaufman & Kaufman (1983) K-ABC test for the assessment of processing; a questionnaire addressed to the parents to collect the relevant information; interviews with the teachers; and the Goodenough Human Figure Drawing test. Regarding the results of our research, and in line with the proposed objectives, we note the following. In preschool children aged between three and six years, both types of cerebral processing are found to exist, with neither predominating over the other; the two act in an interrelated way. Both types of processing improve with age, but differences derived from mental level are observed: a normal mental level is associated with improvement in both types of processing, whereas with a deficient mental level essentially only sequential processing improves. Moreover, simultaneous processing is more related to complex cognitive functions and is more dependent on mental level than sequential processing. Both learning difficulties and language problems predominate in children with a significant imbalance between the two types of processing; learning difficulties are more related to a deficiency in simultaneous processing, whereas language problems are more related to a deficiency in sequential processing. Low sociocultural levels are associated with lower results in both types of processing. On the other hand, significant sequential processing is more frequent among bilingual children.
The Human Figure test behaves as a marker of simultaneous processing, and attentional level as a marker of the severity of the problem affecting processing, in the following order: deficient mental level, learning difficulties and language problems. Attentional deficiencies are linked to deficiencies in simultaneous processing and to the presence of pathology. As for manual dominance, no differences in processing are observed. Finally, with respect to sex, we can only report that when one of the two types of processing is deficient, and there is therefore an imbalance in processing, the number of boys affected significantly exceeds that of girls.
Abstract:
This thesis proposes a methodology for the probabilistic simulation of matrix failure in carbon fibre reinforced composite materials, based on the analysis of the random distribution of the fibres. The first chapters review the state of the art on the mathematical modelling of random materials, the computation of effective properties and transverse failure criteria for composite materials. The first step in the proposed methodology is the determination of the minimum size of a Statistical Representative Volume Element (SRVE). This determination is carried out by analysing the fibre volume fraction, the effective elastic properties, the Hill condition, the statistics of the stress and strain components, the probability density function and the statistical inter-fibre distance functions of microstructure models of different sizes. Once this minimum size has been determined, a periodic model and a random model are compared in order to assess the magnitude of the differences observed between them. A methodology is also defined for the statistical analysis of the fibre distribution in the composite, based on digital images of the transverse cross section. This analysis is applied to four different materials. Finally, a two-scale computational method is proposed for simulating the transverse failure of unidirectional plies, which makes it possible to obtain probability density functions for the mechanical variables. Some applications and possibilities of this method are described, and the simulation results are compared with experimental values.
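One of the descriptors mentioned above, the statistical inter-fibre distance function, can be sketched as a nearest-neighbour computation over randomly placed fibre centres (a pure-Python sketch with hypothetical parameters, not the thesis's actual algorithm):

```python
import math
import random

def nearest_neighbour_distances(n_fibres=200, size=1.0, seed=1):
    """One statistical descriptor used when sizing a statistical RVE:
    nearest-neighbour distances between randomly placed fibre centres
    in a square window of side `size`."""
    rng = random.Random(seed)
    pts = [(rng.uniform(0, size), rng.uniform(0, size)) for _ in range(n_fibres)]
    dists = []
    for i, (xi, yi) in enumerate(pts):
        # distance from fibre i to its closest neighbour
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(pts) if j != i)
        dists.append(d)
    return dists

d = nearest_neighbour_distances()
mean_d = sum(d) / len(d)
```

Comparing the empirical distribution of these distances across window sizes is one way to judge when a model is large enough to be statistically representative.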
Abstract:
QUAGMIRE is a quasi-geostrophic numerical model for performing fast, high-resolution simulations of multi-layer rotating annulus laboratory experiments on a desktop personal computer. The model uses a hybrid finite-difference/spectral approach to numerically integrate the coupled nonlinear partial differential equations of motion in cylindrical geometry in each layer. Version 1.3 implements the special case of two fluid layers of equal resting depths. The flow is forced either by a differentially rotating lid, or by relaxation to specified streamfunction or potential vorticity fields, or both. Dissipation is achieved through Ekman layer pumping and suction at the horizontal boundaries, including the internal interface. The effects of weak interfacial tension are included, as well as the linear topographic beta-effect and the quadratic centripetal beta-effect. Stochastic forcing may optionally be activated, to represent approximately the effects of random unresolved features. A leapfrog time stepping scheme is used, with a Robert filter. Flows simulated by the model agree well with those observed in the corresponding laboratory experiments.
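The leapfrog scheme with a Robert filter mentioned above can be sketched for a scalar ODE (a generic illustration of the scheme, not QUAGMIRE code):

```python
def leapfrog_robert(f, x0, dt, nsteps, alpha=0.1):
    """Leapfrog time stepping x_{n+1} = x_{n-1} + 2*dt*f(x_n), with a
    Robert(-Asselin) filter applied to the middle level to damp the
    computational mode: x_n <- x_n + alpha*(x_{n+1} - 2*x_n + x_{n-1})."""
    x_prev = x0
    x_curr = x0 + dt * f(x0)          # forward-Euler start-up step
    out = [x_prev, x_curr]
    for _ in range(nsteps - 1):
        x_next = x_prev + 2.0 * dt * f(x_curr)
        # Robert filter on the middle time level
        x_curr += alpha * (x_next - 2.0 * x_curr + x_prev)
        x_prev, x_curr = x_curr, x_next
        out.append(x_next)
    return out

# decay equation dx/dt = -x: the solution should shrink toward zero
traj = leapfrog_robert(lambda x: -x, 1.0, 0.01, 1000)
```

Without the filter, leapfrog applied to a dissipative equation slowly amplifies a spurious oscillating mode; the filter trades a little accuracy for stability.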
Abstract:
1. We compared the baseline phosphorus (P) concentrations inferred by diatom-P transfer functions and export coefficient models at 62 lakes in Great Britain to assess whether the techniques produce similar estimates of historical nutrient status. 2. There was a strong linear relationship between the two sets of values over the whole total P (TP) gradient (2–200 μg TP L-1). However, a systematic bias was observed, with the diatom model producing the higher values in 46 lakes (of which values differed by more than 10 μg TP L-1 in 21). The export coefficient model gave the higher values in 10 lakes (of which the values differed by more than 10 μg TP L-1 in only 4). 3. The difference between baseline and present-day TP concentrations was calculated to compare the extent of eutrophication inferred by the two sets of model output. There was generally poor agreement between the amounts of change estimated by the two approaches. The discrepancy in both the baseline values and the degree of change inferred by the models was greatest in the shallow and more productive sites. 4. Both approaches were applied to two lakes in the English Lake District where long-term P data exist, to assess how well the models track measured P concentrations since approximately 1850. There was good agreement between the pre-enrichment TP concentrations generated by the models. The diatom model paralleled the steeper rise in maximum soluble reactive P (SRP) more closely than the gradual increase in annual mean TP in both lakes. The export coefficient model produced a closer fit to observed annual mean TP concentrations for both sites, tracking the changes in total external nutrient loading.
5. A combined approach is recommended, with the diatom model employed to reflect the nature and timing of the in-lake response to changes in nutrient loading, and the export coefficient model used to establish the origins and extent of changes in the external load and to assess potential reductions in loading under different management scenarios. 6. However, caution must be exercised when applying these models to shallow lakes, where the export coefficient model TP estimate will not include internal P loading from lake sediments and where the diatom TP inferences may over-estimate TP concentrations because of the high abundance of benthic taxa, many of which are poor indicators of trophic state.
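The comparison in point 2, counting lakes where one model's baseline exceeds the other's by more than 10 μg TP L-1, can be sketched as follows (the TP values below are hypothetical):

```python
def compare_baselines(diatom_tp, export_tp, threshold=10.0):
    """Count lakes where the diatom-inferred baseline exceeds the
    export-coefficient one (and vice versa) by more than `threshold`."""
    diatom_higher = sum(1 for d, e in zip(diatom_tp, export_tp) if d - e > threshold)
    export_higher = sum(1 for d, e in zip(diatom_tp, export_tp) if e - d > threshold)
    return diatom_higher, export_higher

# hypothetical baseline TP values (ug TP / L) for five lakes
d_hi, e_hi = compare_baselines([55, 30, 120, 18, 60], [40, 32, 95, 35, 58])
```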
Abstract:
Heat waves are expected to increase in frequency and magnitude with climate change. The first part of a study to produce projections of the effect of future climate change on heat-related mortality is presented. Separate city-specific empirical statistical models that quantify significant relationships between summer daily maximum temperature (Tmax) and daily heat-related deaths are constructed from historical data for six cities: Boston, Budapest, Dallas, Lisbon, London, and Sydney. ‘Threshold temperatures’ above which heat-related deaths begin to occur are identified. The results demonstrate significantly lower thresholds in ‘cooler’ cities exhibiting lower mean summer temperatures than in ‘warmer’ cities exhibiting higher mean summer temperatures. Analysis of individual ‘heat waves’ illustrates that a greater proportion of mortality is due to mortality displacement in cities with less sensitive temperature–mortality relationships than in those with more sensitive relationships, and that mortality displacement is no longer a feature more than 12 days after the end of the heat wave. Validation techniques through residual and correlation analyses of modelled and observed values and comparisons with other studies indicate that the observed temperature–mortality relationships are represented well by each of the models. The models can therefore be used with confidence to examine future heat-related deaths under various climate change scenarios for the respective cities (presented in Part 2).
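The threshold-temperature idea can be sketched as a piecewise-linear relation: deaths are zero below the city-specific threshold and rise linearly above it (the threshold and slope below are hypothetical, not fitted values from the study):

```python
def heat_deaths(tmax, threshold, slope):
    """Piecewise-linear temperature-mortality relation: heat-related
    deaths begin only once Tmax exceeds a city-specific threshold."""
    return max(0.0, slope * (tmax - threshold))

# hypothetical city: threshold 24 C, 1.5 extra deaths per degree above it
daily = [heat_deaths(t, 24.0, 1.5) for t in [20, 24, 27, 31]]
```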
Abstract:
Global hydrological models (GHMs) model the land surface hydrologic dynamics of continental-scale river basins. Here we describe one such GHM, the Macro-scale Probability-Distributed Moisture model, version 09 (Mac-PDM.09). The model has undergone a number of revisions since it was last applied in the hydrological literature. This paper serves to provide a detailed description of the latest version of the model. The main revisions include the following: (1) the ability to run the model for n repetitions, which provides more robust estimates of extreme hydrological behaviour; (2) the ability to use a gridded field of the coefficient of variation (CV) of daily rainfall for the stochastic disaggregation of monthly precipitation to daily precipitation; and (3) the ability to force the model with daily as well as monthly input climate data. We demonstrate the effects that each of these three revisions has on simulated runoff relative to before the revisions were applied. Importantly, we show that when Mac-PDM.09 is forced with monthly input data, it produces a negative runoff bias relative to daily forcing for regions of the globe where the day-to-day variability in relative humidity is high. The runoff bias can be up to -80% for a small selection of catchments, although the absolute magnitude of the bias may be small. As such, we recommend that future applications of Mac-PDM.09 using monthly climate forcings acknowledge the bias as a limitation of the model. The performance of Mac-PDM.09 is evaluated by validating simulated runoff against observed runoff for 50 catchments. We also present a sensitivity analysis demonstrating that simulated runoff is considerably more sensitive to the method of PE calculation than to perturbations in the soil moisture and field capacity parameters.
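Revision (2), the stochastic disaggregation of a monthly total into daily amounts whose relative variability matches a prescribed CV, might look like the following under a gamma-distribution assumption (a sketch; Mac-PDM.09's actual procedure may differ):

```python
import random

def disaggregate_monthly(total, cv, ndays=30, seed=0):
    """Stochastically split a monthly precipitation total into daily
    amounts whose relative variability matches a prescribed coefficient
    of variation. For a gamma distribution, CV = 1/sqrt(shape)."""
    rng = random.Random(seed)
    k = 1.0 / (cv * cv)                      # gamma shape from the CV
    raw = [rng.gammavariate(k, 1.0) for _ in range(ndays)]
    scale = total / sum(raw)                 # rescale to conserve the monthly total
    return [r * scale for r in raw]

days = disaggregate_monthly(total=90.0, cv=1.2)
```

The rescaling step guarantees mass conservation, at the cost of slightly distorting the sample CV for short months.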
Abstract:
In this paper, the mixed logit (ML) model estimated by Bayesian methods was employed to examine willingness-to-pay (WTP) to consume bread produced with reduced levels of pesticides, so as to ameliorate environmental quality, using data generated by a choice experiment. Model comparison used the marginal likelihood, which is preferable for Bayesian model comparison and testing. Models containing constant and random parameters for a number of distributions were considered, along with models in ‘preference space’ and ‘WTP space’ as well as those allowing for misreporting. We found strong support for the ML estimated in WTP space; little support for fixing the price coefficient, a common practice advocated and adopted in the environmental economics literature; and weak evidence for misreporting.
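The contrast between 'preference space' and 'WTP space' hinges on WTP being a ratio of coefficients, WTP = -beta_attribute / beta_price. A Monte Carlo sketch of that ratio under assumed (purely illustrative) coefficient distributions:

```python
import random

def wtp_draws(n=10_000, seed=3):
    """Draws of the WTP implied by a mixed logit in preference space:
    WTP = -beta_attribute / beta_price. With a random price coefficient
    the ratio can be badly behaved when the denominator nears zero,
    one motivation for estimating directly in 'WTP space'."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        beta_attr = rng.gauss(1.0, 0.3)                   # taste for pesticide reduction
        beta_price = -rng.lognormvariate(0.0, 0.5)        # negative, bounded away from 0
        draws.append(-beta_attr / beta_price)
    return draws

w = wtp_draws()
mean_wtp = sum(w) / len(w)
```

Using a lognormal price coefficient keeps the denominator strictly nonzero; with a normal price coefficient the implied WTP distribution has no finite moments.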
Abstract:
The purpose of this study was to improve the prediction of the quantity and type of volatile fatty acids (VFA) produced from fermented substrate in the rumen of lactating cows. A model was formulated that describes the conversion of substrate (soluble carbohydrates, starch, hemi-cellulose, cellulose, and protein) into VFA (acetate, propionate, butyrate, and other VFA). Inputs to the model were observed rates of true rumen digestion of substrates, whereas outputs were observed molar proportions of VFA in rumen fluid. A literature survey generated data on 182 diets (96 roughage and 86 concentrate diets). Coefficient values that define the conversion of a specific substrate into VFA were estimated meta-analytically by regressing the model against observed VFA molar proportions using non-linear regression techniques. Coefficient estimates differed significantly, for acetate and propionate production in particular, between different types of substrate and between roughage and concentrate diets. Deviations of fitted from observed VFA molar proportions could be attributed entirely to random error. In addition to regression against observed data, simulation studies were performed to investigate the potential of the estimation method. Fitted coefficient estimates from simulated data sets appeared accurate, as did fitted rates of VFA production, although the model accounted for only a small fraction (at most 45%) of the variation in VFA molar proportions. The simulation results showed that the latter result was merely a consequence of the statistical analysis chosen and should not be interpreted as an indication of inaccuracy of the coefficient estimates. Deviations between fitted and observed values corresponded to those obtained in simulations. (c) 2005 Elsevier Ltd. All rights reserved.
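The coefficient-estimation step, regressing VFA output on substrate digestion rates, reduces in the single-substrate case to a closed-form least-squares fit (the data below are hypothetical, and the full model fits several substrates jointly):

```python
def fit_conversion_coefficient(substrate_rates, vfa_output):
    """Least-squares estimate of a single substrate-to-VFA conversion
    coefficient c in the linear model  vfa = c * substrate_rate,
    i.e. c = sum(x*y) / sum(x*x)."""
    sxy = sum(x * y for x, y in zip(substrate_rates, vfa_output))
    sxx = sum(x * x for x in substrate_rates)
    return sxy / sxx

# hypothetical digestion rates and acetate production (arbitrary units)
c_hat = fit_conversion_coefficient([1.0, 2.0, 3.0, 4.0], [0.6, 1.3, 1.7, 2.5])
```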
Abstract:
The formulation of a new process-based crop model, the general large-area model (GLAM) for annual crops, is presented. The model has been designed to operate on spatial scales commensurate with those of global and regional climate models. It aims to simulate the impact of climate on crop yield. Procedures for model parameter determination and optimisation are described, and demonstrated for the prediction of groundnut (i.e. peanut; Arachis hypogaea L.) yields across India for the period 1966-1989. Optimal parameters (e.g. extinction coefficient, transpiration efficiency, rate of change of harvest index) were stable over space and time, provided the estimate of the yield technology trend was based on the full 24-year period. The model has two location-specific parameters: the planting date and the yield gap parameter. The latter varies spatially and is determined by calibration. The optimal value varies slightly when different input data are used. The model was tested using a historical data set on a 2.5° x 2.5° grid to simulate yields. Three sites are examined in detail: grid cells from Gujarat in the west, Andhra Pradesh towards the south, and Uttar Pradesh in the north. Agreement between observed and modelled yield was variable, with correlation coefficients of 0.74, 0.42 and 0, respectively. Skill was highest where the climate signal was greatest, and correlations were comparable to or greater than correlations with seasonal mean rainfall. Yields from all 35 cells were aggregated to simulate all-India yield. The correlation coefficient between observed and simulated yields was 0.76, and the root mean square error was 8.4% of the mean yield. The model can easily be extended to any annual crop for the investigation of the impacts of climate variability (or change) on crop yield over large areas. (C) 2004 Elsevier B.V. All rights reserved.
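The two skill measures reported, the correlation coefficient and the root mean square error as a percentage of mean yield, can be computed as follows (the yield values are hypothetical):

```python
import math

def skill(observed, simulated):
    """Return (Pearson correlation, RMSE as % of mean observed value),
    the two skill measures used to evaluate simulated yields above."""
    n = len(observed)
    mo = sum(observed) / n
    ms = sum(simulated) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(observed, simulated))
    so = math.sqrt(sum((o - mo) ** 2 for o in observed))
    ss = math.sqrt(sum((s - ms) ** 2 for s in simulated))
    r = cov / (so * ss)
    rmse = math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated)) / n)
    return r, 100.0 * rmse / mo

# hypothetical observed vs simulated yields (t/ha)
r, rmse_pct = skill([1.0, 1.2, 0.9, 1.1], [1.05, 1.15, 0.95, 1.0])
```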
Hydrolyzable tannin structures influence relative globular and random coil protein binding strengths
Abstract:
Binding parameters for the interactions of pentagalloyl glucose (PGG) and four hydrolyzable tannins (representing gallotannins and ellagitannins) with gelatin and bovine serum albumin (BSA) have been determined from isothermal titration calorimetry data. Equilibrium binding constants determined for the interaction of PGG and isolated mixtures of tara gallotannins and of sumac gallotannins with gelatin and BSA were of the same order of magnitude for each tannin (in the range of 10^4-10^5 M^-1 for the stronger binding sites when using a binding model consisting of two sets of multiple binding sites). In contrast, isolated mixtures of chestnut ellagitannins and of myrabolan ellagitannins exhibited 3-4 orders of magnitude greater equilibrium binding constants for the interaction with gelatin (~2 x 10^6 M^-1) than for that with BSA (~8 x 10^2 M^-1). Binding stoichiometries revealed that the stronger binding sites on gelatin outnumbered those on BSA by a ratio of at least ~2:1 for all of the hydrolyzable tannins studied. Overall, the data revealed that relative binding constants for the interactions with gelatin and BSA are dependent on the structural flexibility of the tannin molecule.
Abstract:
Diebold and Lamb (1997) argue that since the long-run elasticity of supply derived from the Nerlovian model entails a ratio of random variables, it is without moments. They propose minimum expected loss estimation to correct this problem, but in so doing ignore the fact that a non-white-noise error is implicit in the model. We show that, as a consequence, the estimator is biased, and demonstrate that Bayesian estimation, which fully accounts for the error structure, is preferable.
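Diebold and Lamb's starting point, that a ratio of random variables whose denominator can be near zero has no finite moments, is easy to see by simulation (the distributions below are illustrative, not estimates from any Nerlovian model):

```python
import random

def long_run_elasticity_draws(n=100_000, seed=7):
    """Monte Carlo illustration: the long-run elasticity is a ratio of
    (roughly normal) estimates. Because the denominator has mass near
    zero, the ratio has no finite moments and occasional draws are huge."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        num = rng.gauss(0.8, 0.1)    # short-run elasticity estimate
        den = rng.gauss(0.3, 0.3)    # adjustment term, can land near zero
        draws.append(num / den)
    return draws

d = long_run_elasticity_draws()
extreme = max(abs(x) for x in d)
```

The sample mean of such draws never settles down as n grows, which is exactly why point estimators based on moments fail here.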
Abstract:
As a continuing effort to establish the structure-activity relationships (SARs) within the series of angiotensin II antagonists (sartans), a pharmacophoric model was built using novel TOPP 3D descriptors. Statistical values were satisfactory (PC4: r^2 = 0.96, q^2 (5 random groups) = 0.84; SDEP = 0.26) and encouraged the synthesis and subsequent biological evaluation of a series of new pyrrolidine derivatives. SAR together with a combined 3D quantitative SAR and high-throughput virtual screening showed that the newly synthesized 1-acyl-N-(biphenyl-4-ylmethyl)pyrrolidine-2-carboxamides may represent an interesting starting point for the design of new antihypertensive agents. In particular, biological tests performed on CHO-hAT(1) cells stably expressing the human AT(1) receptor showed that the length of the acyl chain is crucial for receptor interaction and that the valeric chain is the optimal one.