850 results for Random effects
Abstract:
Reliably representing both horizontal cloud inhomogeneity and vertical cloud overlap is fundamentally important for the radiation budget of a general circulation model. Here, we build on the work of Part One of this two-part paper by applying a pair of parameterisations that account for horizontal inhomogeneity and vertical overlap to global re-analysis data. These are applied both together and separately in an attempt to quantify the effects of poor representation of the two components on the radiation budget. Horizontal inhomogeneity is accounted for using the “Tripleclouds” scheme, which uses two regions of cloud in each layer of a gridbox as opposed to one; vertical overlap is accounted for using “exponential-random” overlap, which aligns vertically continuous cloud according to a decorrelation height. These are applied to a sample of scenes from a year of ERA-40 data. The largest radiative effect of horizontal inhomogeneity is found to be in areas of marine stratocumulus; the effect of vertical overlap is found to be fairly uniform, but with larger individual short-wave and long-wave effects in areas of deep, tropical convection. The combined effect of the two parameterisations is found to reduce the magnitude of the net top-of-atmosphere cloud radiative forcing (CRF) by 2.25 W m−2, with shifts of up to 10 W m−2 in areas of marine stratocumulus. The effects of the uncertainty in our parameterisations on the radiation budget are also investigated. It is found that the uncertainty in the impact of horizontal inhomogeneity is of order ±60%, while the uncertainty in the impact of vertical overlap is much smaller. This suggests an insensitivity of the radiation budget to the exact nature of the global decorrelation height distribution derived in Part One.
Abstract:
Brand competition is modelled using an agent-based approach in order to examine the long-run dynamics of market structure and brand characteristics. A repeated game is designed in which myopic firms choose strategies based on beliefs about their rivals and consumers. Consumers are heterogeneous and can observe neighbour behaviour through social networks. Although firms do not observe them, the social networks have a significant impact on the emerging market structure. The presence of networks tends to polarize market share and leads to higher volatility in brands, yet convergence in brand characteristics usually occurs whenever the market reaches a steady state. Scale-free networks accentuate the polarization and volatility more than small-world or random networks do. Unilateral innovations are less frequent under social networks.
Abstract:
Stochastic Diffusion Search is an efficient probabilistic best-fit search technique capable of transformation-invariant pattern matching. Although inherently parallel in operation, it is difficult to implement efficiently in hardware because it requires full inter-agent connectivity. This paper describes a lattice implementation which, while qualitatively retaining the properties of the original algorithm, restricts connectivity, enabling simpler implementation on parallel hardware. Diffusion times are examined for different network topologies, ranging from ordered lattices through small-world networks to random graphs.
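As a rough illustration of the topology range mentioned above, a Watts-Strogatz-style construction (an assumption here; the abstract does not specify how its networks were generated) interpolates from an ordered ring lattice (p = 0) through small-world networks to an essentially random graph (p = 1):

```python
import random

def small_world(n, k, p, seed=0):
    """Build a ring lattice of n nodes, each linked to its k nearest
    neighbours on one side (so n*k edges in total), then rewire each
    edge's far endpoint with probability p, Watts-Strogatz style.
    p=0 keeps the ordered lattice; p=1 yields a near-random graph."""
    rng = random.Random(seed)
    edges = [(i, (i + j) % n) for i in range(n) for j in range(1, k + 1)]
    rewired = set()
    for (u, v) in edges:
        if rng.random() < p:
            w = rng.randrange(n)
            # avoid self-loops and duplicate rewired edges
            while w == u or (u, w) in rewired or (w, u) in rewired:
                w = rng.randrange(n)
            rewired.add((u, w))
        else:
            rewired.add((u, v))
    return rewired
```

Diffusion times could then be compared across values of p, with agents restricted to communicating along these edges rather than with the full population.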
Acute effects of meal fatty acid composition on insulin sensitivity in healthy post-menopausal women
Abstract:
Postprandial plasma insulin concentrations after a single high-fat meal may be modified by the presence of specific fatty acids although the effects of sequential meal ingestion are unknown. The aim of the present study was to examine the effects of altering the fatty acid composition in a single mixed fat-carbohydrate meal on glucose metabolism and insulin sensitivity of a second meal eaten 5 h later. Insulin sensitivity was assessed using a minimal model approach. Ten healthy post-menopausal women underwent four two-meal studies in random order. A high-fat breakfast (40 g fat) where the fatty acid composition was predominantly saturated fatty acids (SFA), n-6 polyunsaturated fatty acids (PUFA), long-chain n-3 PUFA or monounsaturated fatty acids (MUFA) was followed 5 h later by a low-fat, high-carbohydrate lunch (5.7 g fat), which was identical in all four studies. The plasma insulin response was significantly higher following the SFA meal than the other meals after both breakfast and lunch (P<0.006) although there was no effect of breakfast fatty acid composition on plasma glucose concentrations. Postprandial insulin sensitivity (SI(Oral)) was assessed for 180 min after each meal. SI(Oral) was significantly lower after lunch than after breakfast for all four test meals (P=0.019) following the same rank order (SFA < n-6 PUFA < n-3 PUFA < MUFA) for each meal. The present study demonstrates that a single meal rich in SFA reduces postprandial insulin sensitivity with 'carry-over' effects for the next meal.
Abstract:
Background: Vagal stimulation in response to nutrients is reported to elicit an array of digestive and endocrine responses, including an alteration in postprandial lipid metabolism. Objective: The objective of this study was to assess whether neural stimulation could alter hormone and substrate metabolism during the late postprandial phase, with implications for body fat mobilization. Design: Vagal stimulation was achieved by using the modified sham feeding (MSF) technique, in which nutrients are chewed and tasted but not swallowed. Ten healthy subjects were studied on 3 separate occasions, 4 wk apart. Five hours after a high-fat breakfast (56 g fat), the subjects were given 1 of 3 test meals allocated in random order: water, a lunch containing a modest amount of fat (38 g), or MSF (38 g fat). Blood was collected for 3 h poststimulus for hormone and metabolite analyses. Results: Plasma insulin and pancreatic polypeptide concentrations peaked at 250% and 209% of baseline concentrations within 15 min of MSF. The plasma glucose concentration increased significantly (P = 0.038) in parallel with the changes observed in the plasma insulin concentration. The nonesterified fatty acid concentration was significantly suppressed (P = 0.006); maximum suppression occurred at a mean time of 114 min after MSF. This fall in nonesterified fatty acid was accompanied by a fall in the plasma glucagon concentration from 122 to 85 pmol/L (P = 0.018) at a mean time of 113 min after MSF. Conclusions: Effects on substrate metabolism after MSF in the postprandial state differ from those usually reported in the postabsorptive state. The effects of MSF were prolonged beyond the period of the cephalic response and these may be relevant for longer-term metabolic regulation.
Abstract:
In order to validate the reported precision of space‐based atmospheric composition measurements, validation studies often focus on measurements in the tropical stratosphere, where natural variability is weak. The scatter in tropical measurements can then be used as an upper limit on single‐profile measurement precision. Here we introduce a method of quantifying the scatter of tropical measurements which aims to minimize the effects of short‐term atmospheric variability while maintaining large enough sample sizes that the results can be taken as representative of the full data set. We apply this technique to measurements of O3, HNO3, CO, H2O, NO, NO2, N2O, CH4, CCl2F2, and CCl3F produced by the Atmospheric Chemistry Experiment–Fourier Transform Spectrometer (ACE‐FTS). Tropical scatter in the ACE‐FTS retrievals is found to be consistent with the reported random errors (RREs) for H2O and CO at altitudes above 20 km, validating the RREs for these measurements. Tropical scatter in measurements of NO, NO2, CCl2F2, and CCl3F is roughly consistent with the RREs as long as the effect of outliers in the data set is reduced through the use of robust statistics. The scatter in measurements of O3, HNO3, CH4, and N2O in the stratosphere, while larger than the RREs, is shown to be consistent with the variability simulated in the Canadian Middle Atmosphere Model. This result implies that, for these species, stratospheric measurement scatter is dominated by natural variability, not random error, which provides added confidence in the scientific value of single‐profile measurements.
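The abstract does not say which robust statistics were used to reduce the effect of outliers; a common choice for this purpose is the scaled median absolute deviation (MAD), sketched here as an assumed illustration:

```python
import statistics

def robust_sigma(values):
    """Estimate the 1-sigma scatter of a sample from the median absolute
    deviation (MAD), scaled by 1.4826 so it matches the standard deviation
    for Gaussian data. Because the median is insensitive to outliers, this
    estimate tracks the scatter of the 'core' of the sample even when a
    few retrievals are wild."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    return 1.4826 * mad
```

On a sample with one bad retrieval, `robust_sigma` stays near the core scatter while `statistics.stdev` is dominated by the outlier.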
Abstract:
Aim Species distribution models (SDMs) based on current species ranges underestimate the potential distribution when projected in time and/or space. A multi-temporal model calibration approach has been suggested as an alternative, and we evaluate this using 13,000 years of data. Location Europe. Methods We used fossil-based records of presence for Picea abies, Abies alba and Fagus sylvatica and six climatic variables for the period 13,000 to 1000 yr bp. To measure the contribution of each 1000-year time step to the total niche of each species (the niche measured by pooling all the data), we employed a principal components analysis (PCA) calibrated with data over the entire range of possible climates. Then we projected both the total niche and the partial niches from single time frames into the PCA space, and tested if the partial niches were more similar to the total niche than random. Using an ensemble forecasting approach, we calibrated SDMs for each time frame and for the pooled database. We projected each model to current climate and evaluated the results against current pollen data. We also projected all models into the future. Results Niche similarity between the partial and the total-SDMs was almost always statistically significant and increased through time. SDMs calibrated from single time frames gave different results when projected to current climate, providing evidence of a change in the species realized niches through time. Moreover, they predicted limited climate suitability when compared with the total-SDMs. The same results were obtained when projected to future climates. Main conclusions The realized climatic niche of species differed for current and future climates when SDMs were calibrated considering different past climates. Building the niche as an ensemble through time represents a way forward to a better understanding of a species' range and its ecology in a changing climate.
Abstract:
In order to examine metacognitive accuracy (i.e., the relationship between metacognitive judgment and memory performance), researchers often rely on by-participant analysis, where metacognitive accuracy (e.g., resolution, as measured by the gamma coefficient or signal detection measures) is computed for each participant and the computed values are entered into group-level statistical tests such as the t-test. In the current work, we argue that the by-participant analysis, regardless of the accuracy measure used, produces a substantial inflation of Type-1 error rates when a random item effect is present. A mixed-effects model is proposed as a way to effectively address the issue, and our simulation studies examining Type-1 error rates indeed showed superior performance of the mixed-effects model analysis compared with the conventional by-participant analysis. We also present real-data applications to illustrate further strengths of the mixed-effects model analysis. Our findings imply that caution is needed when using the by-participant analysis, and we recommend the mixed-effects model analysis.
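For reference, the per-participant resolution measure the abstract critiques, the Goodman-Kruskal gamma coefficient, can be computed as below (a minimal sketch of the by-participant step only; the mixed-effects alternative requires a full modelling package and is not shown):

```python
from itertools import combinations

def gamma(judgments, outcomes):
    """Goodman-Kruskal gamma for one participant: the excess of concordant
    over discordant (judgment, outcome) pairs, divided by their total,
    with tied pairs ignored. Ranges from -1 to 1; 1 means judgments
    perfectly track memory outcomes."""
    conc = disc = 0
    for (j1, o1), (j2, o2) in combinations(zip(judgments, outcomes), 2):
        s = (j1 - j2) * (o1 - o2)
        if s > 0:
            conc += 1
        elif s < 0:
            disc += 1
    return (conc - disc) / (conc + disc) if conc + disc else float("nan")
```

In the by-participant approach, one such value per participant is fed into a group-level t-test, which is exactly the step the paper argues inflates Type-1 error when items are random.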
Abstract:
Influences of inbreeding on daily milk yield (DMY), age at first calving (AFC), and calving intervals (CI) were determined on a highly inbred zebu dairy subpopulation of the Guzerat breed. Variance components were estimated using animal models in single-trait analyses. Two approaches were employed to estimate inbreeding depression: using individual increase in inbreeding coefficients or using inbreeding coefficients as possible covariates included in the statistical models. The pedigree file included 9,915 animals, of which 9,055 were inbred, with an average inbreeding coefficient of 15.2%. The maximum inbreeding coefficient observed was 49.45%, and the average inbreeding for the females still in the herd during the analysis was 26.42%. Heritability estimates were 0.27 for DMY and 0.38 for AFC. The genetic variance ratio estimated with the random regression model for CI ranged around 0.10. Increased inbreeding caused poorer performance in DMY, AFC, and CI. However, some of the cows with the highest milk yield were among the highly inbred animals in this subpopulation. Individual increase in inbreeding used as a covariate in the statistical models accounted for inbreeding depression while avoiding overestimation that may result when fitting inbreeding coefficients.
Abstract:
The Prospective and Retrospective Memory Questionnaire (PRMQ) has been shown to have acceptable reliability and factorial, predictive, and concurrent validity. However, the PRMQ has never been administered to a probability sample survey representative of all ages in adulthood, nor have previous studies controlled for factors that are known to influence metamemory, such as affective status. Here, the PRMQ was applied in a survey adopting a probabilistic three-stage cluster sample representative of the population of Sao Paulo, Brazil, according to gender, age (20-80 years), and economic status (n=1042). After excluding participants who had conditions that impair memory (depression, anxiety, used psychotropics, and/or had neurological/psychiatric disorders), in the remaining 664 individuals we (a) used confirmatory factor analyses to test competing models of the latent structure of the PRMQ, and (b) studied effects of gender, age, schooling, and economic status on prospective and retrospective memory complaints. The model with the best fit confirmed the same tripartite structure (general memory factor and two orthogonal prospective and retrospective memory factors) previously reported. Women complained more of general memory slips, especially those in the first 5 years after menopause, and there were more complaints of prospective than retrospective memory, except in participants with lower family income.
Abstract:
The fabrication of controlled molecular architectures is essential for organic devices, as is the case of emission of polarized light for the information industry. In this study, we show that optimized conditions can be established to allow layer-by-layer (LbL) films of poly(p-phenylene vinylene) (PPV)+dodecylbenzenesulfonate (DBS) to be obtained with anisotropic properties. Films with five layers and converted at 110 degrees C had a dichroic ratio delta = 2.3 and order parameter r = 34%, as indicated in optical spectroscopy and emission ellipsometry data. This anisotropy was decreased with the number of layers deposited, with delta = 1.0 for a 75-layer LbL PPV + DBS film. The analysis with atomic force microscopy showed the formation of polymer clusters in a random growth process with the normalized height distribution being represented by a Gaussian function. In spite of this randomness in film growth, the self-covariance function pointed to a correlation between clusters, especially for thick films. In summary, the LbL method may be exploited to obtain both anisotropic films with polarized emission and regular, nanostructured surfaces. (c) 2010 Wiley Periodicals, Inc. J Polym Sci Part B: Polym Phys 49: 206-213, 2011
Abstract:
We consider method-of-moments fixed effects (FE) estimation of technical inefficiency. When N, the number of cross-sectional observations, is large, it is possible to obtain consistent central moments of the population distribution of the inefficiencies. It is well known that the traditional FE estimator may be seriously upward biased when N is large and T, the number of time observations, is small. Based on the second central moment and a single-parameter distributional assumption on the inefficiencies, we obtain unbiased technical inefficiencies in large-N settings. The proposed methodology bridges traditional FE and maximum likelihood estimation: bias is reduced without the random effects assumption.
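A minimal sketch of the moment step, assuming the single-parameter distribution is half-normal (one common choice in the stochastic-frontier literature; the abstract does not name the distribution it uses):

```python
import math

def halfnormal_from_variance(m2):
    """Recover the half-normal scale sigma, and the implied mean
    inefficiency E[u], from the (consistent, large-N) second central
    moment m2 of the inefficiency distribution.
    For u = |N(0, sigma^2)|:  Var(u) = sigma^2 * (1 - 2/pi)
                              E(u)   = sigma * sqrt(2/pi)"""
    sigma = math.sqrt(m2 / (1.0 - 2.0 / math.pi))
    mean_u = sigma * math.sqrt(2.0 / math.pi)
    return sigma, mean_u
```

With sigma in hand, individual inefficiency estimates can be re-centred rather than measured against the (biased) best-in-sample firm.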
Abstract:
We consider methods for estimating causal effects of treatment in the situation where the individuals in the treatment and control groups are self-selected, i.e., the selection mechanism is not randomized. In this case, a simple comparison of treated and control outcomes will not generally yield valid estimates of causal effects. The propensity score method is frequently used for the evaluation of treatment effects. However, this method is based on some strong assumptions, which are not directly testable. In this paper, we present an alternative modeling approach to drawing causal inference using a shared random-effects model, together with a computational algorithm for likelihood-based inference with such a model. With small numerical studies and a real data analysis, we show that our approach not only gives more efficient estimates but is also less sensitive than the existing methods to the model misspecifications we consider.
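For context, the propensity-score benchmark the abstract contrasts with can be illustrated by the standard inverse-probability-weighting (IPW) estimator of the average treatment effect; the data and propensity scores below are hypothetical, and in practice the scores would themselves be estimated (e.g., by logistic regression):

```python
def ipw_ate(outcomes, treated, pscores):
    """Horvitz-Thompson / inverse-probability-weighted estimate of the
    average treatment effect. Each treated outcome is up-weighted by
    1/e(x) and each control outcome by 1/(1 - e(x)), where
    e(x) = P(T=1 | x) is the propensity score."""
    n = len(outcomes)
    return sum(t * y / e - (1 - t) * y / (1.0 - e)
               for y, t, e in zip(outcomes, treated, pscores)) / n
```

The estimator is unbiased only if the propensity model is correct and selection is on observables, which is the untestable assumption the paper's shared random-effects approach is designed to relax.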
Abstract:
Bounds on the distribution function of the sum of two random variables with known marginal distributions obtained by Makarov (1981) can be used to bound the cumulative distribution function (c.d.f.) of individual treatment effects. Identification of the distribution of individual treatment effects is important for policy purposes if we are interested in functionals of that distribution, such as the proportion of individuals who gain from the treatment and the expected gain from the treatment for these individuals. Makarov bounds on the c.d.f. of the individual treatment effect distribution are pointwise sharp, i.e. they cannot be improved in any single point of the distribution. We show that the Makarov bounds are not uniformly sharp. Specifically, we show that the Makarov bounds on the region that contains the c.d.f. of the treatment effect distribution in two (or more) points can be improved, and we derive the smallest set for the c.d.f. of the treatment effect distribution in two (or more) points. An implication is that the Makarov bounds on a functional of the c.d.f. of the individual treatment effect distribution are not best possible.
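The pointwise Makarov bounds themselves are straightforward to evaluate on a grid. A sketch for S = X + Y (the treatment-effect case follows by applying it to Y1 + (−Y0)); the grid search is an approximation, and the left-limit subtlety in the exact upper bound is ignored:

```python
def makarov_bounds(F_X, F_Y, grid, z):
    """Pointwise Makarov bounds on P(X + Y <= z) from the marginal
    c.d.f.s alone, with no assumption on the dependence between X and Y,
    approximated by searching split points x (with y = z - x) on a grid:
      lower(z) = sup_x max(F_X(x) + F_Y(z - x) - 1, 0)
      upper(z) = inf_x min(F_X(x) + F_Y(z - x), 1)"""
    lo = max(max(F_X(x) + F_Y(z - x) - 1.0, 0.0) for x in grid)
    hi = min(min(F_X(x) + F_Y(z - x), 1.0) for x in grid)
    return lo, hi
```

For two Uniform(0,1) marginals, for example, the bounds at z = 0.5 are [0, 0.5]: each endpoint is attainable under some dependence structure, which is the pointwise sharpness the paper shows does not extend uniformly across several points at once.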
Abstract:
Estimation of demand and supply in differentiated products markets is a central issue in Empirical Industrial Organization and has been used to study the effects of taxes, mergers, the introduction of new goods, and market power, among others. Logit and Random Coefficients Logit are examples of demand models used to study these effects. For the supply side, a Nash equilibrium in prices is generally assumed. This work presents a detailed discussion of these models of demand and supply as well as the estimation procedure. Lastly, an application is made to the Brazilian fixed-income fund market.
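The plain logit demand model mentioned here admits the standard Berry (1994) share inversion, sketched below; observed market shares and the outside-good share are assumed given, and in a full estimation the recovered mean utilities would be regressed on product characteristics and price:

```python
import math

def berry_inversion(shares, outside_share):
    """Invert observed market shares of a plain logit demand model into
    mean utilities: delta_j = ln(s_j) - ln(s_0), where s_0 is the share
    of the outside good (no purchase). Normalizes the outside good's
    mean utility to zero."""
    log_s0 = math.log(outside_share)
    return [math.log(s) - log_s0 for s in shares]
```

The inversion is exact for plain logit; the Random Coefficients Logit version requires a numerical fixed-point (contraction) step instead.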