898 results for Exponential Random Graph Model


Relevance: 30.00%

Abstract:

Joint generalized linear models and double generalized linear models (DGLMs) were designed to model outcomes for which the variability can be explained using factors and/or covariates. When such factors operate, the usual normal regression models, which inherently assume constant variance, will under-represent the variation in the data and hence may lead to erroneous inferences. For count and proportion data, such noise factors can generate a so-called overdispersion effect, and the use of binomial and Poisson models then underestimates the variability and, consequently, incorrectly indicates significant effects. In this manuscript, we propose a DGLM from a Bayesian perspective, focusing on the case of proportion data, where the overdispersion can be modeled using a random effect that depends on some noise factors. The joint posterior density function was sampled using Markov chain Monte Carlo algorithms, allowing inferences over the model parameters. An application to a data set on apple tissue culture is presented, for which it is shown that the Bayesian approach is quite feasible, even when limited prior information is available, thereby generating valuable insight for the researcher about the experimental results.
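The overdispersion effect this abstract describes can be illustrated with a small synthetic simulation (all parameters below are made up for illustration; this is not the paper's apple tissue-culture data or its DGLM): a unit-level random effect on the logit scale inflates the variance of proportion counts well beyond what a plain binomial model assumes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setting: m trials per experimental unit, with the success
# probability perturbed by a unit-level random effect on the logit scale
# (the "noise factor" that produces overdispersion).
n_units, m, p, sigma = 2000, 40, 0.3, 0.8

logit_p = np.log(p / (1 - p)) + rng.normal(0.0, sigma, n_units)
p_i = 1.0 / (1.0 + np.exp(-logit_p))   # unit-specific probabilities
y = rng.binomial(m, p_i)               # observed counts

binom_var = m * p * (1 - p)            # variance a plain binomial model assumes
print(f"empirical variance: {y.var():.1f}  vs  binomial variance: {binom_var:.1f}")
```

A binomial fit to such data would report standard errors that are far too small, which is exactly the "incorrectly indicates significant effects" problem the abstract points out.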

Relevance: 30.00%

Abstract:

A total of 152,145 weekly test-day milk yield records from 7317 first lactations of Holstein cows distributed in 93 herds in southeastern Brazil were analyzed. Test-day milk yields were classified into 44 weekly classes of days in milk (DIM). The contemporary groups were defined as herd-year-week of test day. The model included direct additive genetic, permanent environmental and residual effects as random effects, fixed effects of contemporary group, and age of cow at calving as a covariable (linear and quadratic effects). Mean trends were modeled by a cubic regression on orthogonal polynomials of DIM. Additive genetic and permanent environmental random effects were estimated by random regression on orthogonal Legendre polynomials. Residual variances were modeled using third- to seventh-order variance functions or a step function with 1, 6, 13, 17 and 44 variance classes. Results from Akaike's information criterion and Schwarz's Bayesian information criterion suggested that a model considering a 7th-order Legendre polynomial for the additive effect, a 12th-order polynomial for the permanent environmental effect and a step function with 6 classes for residual variances fitted best. However, a parsimonious model, with a 6th-order Legendre polynomial for additive effects and a 7th-order polynomial for permanent environmental effects, yielded very similar genetic parameter estimates. (C) 2008 Elsevier B.V. All rights reserved.
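The Legendre-polynomial covariables used in random regression models of this kind can be built directly with NumPy. The sketch below takes the 44 weekly DIM classes and the 7th-order fit from the abstract; everything else (the rescaling convention, variable names) is an illustrative assumption, not the authors' software.

```python
import numpy as np
from numpy.polynomial import legendre

# 44 weekly DIM classes, rescaled to [-1, 1], the interval on which
# Legendre polynomials are orthogonal.
dim = np.arange(1, 45)
x = 2 * (dim - dim.min()) / (dim.max() - dim.min()) - 1

order = 7  # e.g. the 7th-order polynomial for the additive genetic effect
# Column j of Phi holds the j-th Legendre polynomial evaluated at each class,
# so Phi is the covariable matrix for one animal's random regression.
Phi = legendre.legvander(x, order)
print(Phi.shape)  # (44, 8): intercept column plus orders 1..7
```

Each random effect (additive, permanent environmental) then gets its own coefficient vector multiplying a basis like `Phi`, possibly of a different order.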

Relevance: 30.00%

Abstract:

The detection of seizures in the newborn is a critical aspect of neurological research. Current automatic detection techniques are difficult to assess due to the problems associated with acquiring and labelling newborn electroencephalogram (EEG) data. A realistic model for newborn EEG would allow confident development, assessment and comparison of these detection techniques. This paper presents a model for newborn EEG that accounts for its self-similar and non-stationary nature. The model consists of background and seizure sub-models. The newborn EEG background model is based on the short-time power spectrum with a time-varying power law. The relationship between the fractal dimension and the power law of a power spectrum is utilized for accurate estimation of the short-time power-law exponent. The newborn EEG seizure model is based on a well-known time-frequency signal model. This model addresses all significant time-frequency characteristics of newborn EEG seizure, which include: multiple components or harmonics, piecewise-linear instantaneous frequency laws, and harmonic amplitude modulation. Estimates of the parameters of both models are shown to be random and are modelled using data from a total of 500 background epochs and 204 seizure epochs. The newborn EEG background and seizure models are validated against real newborn EEG data using the correlation coefficient. The results show that the output of the proposed models has a higher correlation with real newborn EEG than currently accepted models (a 10% and 38% improvement for the background and seizure models, respectively).
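The power-law structure of the background spectrum can be illustrated in simplified form. The sketch below is a generic log-log regression on a synthetic periodogram, not the fractal-dimension-based estimator the paper actually uses: it assumes a spectrum P(f) ~ f^(-gamma) with the usual chi-squared periodogram scatter and recovers gamma by fitting log-power against log-frequency.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic periodogram: true power law f^(-gamma) times chi-squared(2)/2
# scatter, the standard statistical behaviour of periodogram ordinates.
gamma_true = 2.0
freqs = np.fft.rfftfreq(4096)[1:]  # drop the DC bin
power = freqs ** (-gamma_true) * rng.chisquare(2, freqs.size) / 2

# A power law is a straight line in log-log coordinates; the (negated)
# slope is the exponent estimate.
slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
gamma_hat = -slope
print(f"true gamma: {gamma_true}, estimated: {gamma_hat:.2f}")
```

In the paper's short-time setting this kind of fit would be repeated per window, giving the time-varying exponent.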

Relevance: 30.00%

Abstract:

Spin glasses are magnetic systems with conflicting and random interactions between the individual spins. The dynamics of spin glasses, as with structural glasses, reflect their complexity. In both experimental and numerical work, the relaxation below the freezing temperature depends strongly on the annealing conditions (aging), and, above the freezing point, relaxation in equilibrium is slow and non-exponential. In this Forum, the observed characteristics of the dynamics were summarized and the physical models proposed to explain them were outlined. (C) 1998 Elsevier Science B.V. All rights reserved.

Relevance: 30.00%

Abstract:

A significant problem in the collection of responses to potentially sensitive questions, such as those relating to illegal, immoral or embarrassing activities, is non-sampling error due to refusal to respond or false responses. Eichhorn & Hayre (1983) suggested the use of scrambled responses to reduce this form of bias. This paper considers a linear regression model in which the dependent variable is unobserved, but for which its sum or product with a scrambling random variable of known distribution is known. The performance of two likelihood-based estimators is investigated, namely a Bayesian estimator obtained through a Markov chain Monte Carlo (MCMC) sampling scheme, and a classical maximum-likelihood estimator. These two estimators and an estimator suggested by Singh, Joarder & King (1996) are compared. Monte Carlo results show that the Bayesian estimator outperforms the classical estimators in almost all cases, and that the relative performance of the Bayesian estimator improves as the responses become more scrambled.
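A minimal simulation of multiplicative scrambling in the spirit of Eichhorn & Hayre (1983) shows why only the scrambler's distribution, not its individual draws, needs to be known. The distributions and parameters below are illustrative choices, not those studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Each respondent multiplies the sensitive value x by a private draw s from
# a known scrambling distribution and reports only z = x * s; the
# interviewer never sees x or s individually.
n = 5000
x = rng.gamma(shape=4.0, scale=2.0, size=n)  # unobserved sensitive values, mean 8
s = rng.uniform(0.5, 1.5, size=n)            # scrambler with known E[s] = 1.0
z = x * s                                    # all the interviewer observes

# Since E[z] = E[x] * E[s], the mean of x is estimable from z alone:
mean_hat = z.mean() / 1.0                    # divide by the known E[s]
print(f"estimated mean of sensitive variable: {mean_hat:.2f} (true mean 8.0)")
```

Privacy comes from the noise in `s`: any individual `z` is uninformative about that respondent's `x`, yet aggregate quantities remain recoverable.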

Relevance: 30.00%

Abstract:

A mixture model for long-term survivors has been adopted in various fields such as biostatistics and criminology where some individuals may never experience the type of failure under study. It is directly applicable in situations where the only information available from follow-up on individuals who will never experience this type of failure is in the form of censored observations. In this paper, we consider a modification to the model so that it still applies in the case where during the follow-up period it becomes known that an individual will never experience failure from the cause of interest. Unless a model allows for this additional information, a consistent survival analysis will not be obtained. A partial maximum likelihood (ML) approach is proposed that preserves the simplicity of the long-term survival mixture model and provides consistent estimators of the quantities of interest. Some simulation experiments are performed to assess the efficiency of the partial ML approach relative to the full ML approach for survival in the presence of competing risks.
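The defining feature of the long-term survivor mixture, a population survival curve that plateaus at the cured fraction, can be seen in a small simulation. The parametric choices below (exponential failure times, specific rates) are illustrative only and are not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(3)

# A fraction p_cure of individuals never experience the failure of interest;
# the remainder fail with exponential times (illustrative choice).
n, p_cure, lam = 20000, 0.3, 0.5
cured = rng.random(n) < p_cure
t = np.where(cured, np.inf, rng.exponential(1 / lam, n))

# The population survival function S(t) = p_cure + (1 - p_cure) * exp(-lam*t)
# flattens out at the cure fraction for large t:
big_t = 20.0
surv_at_big_t = np.mean(t > big_t)
print(f"empirical S({big_t:.0f}) = {surv_at_big_t:.3f}  (cure fraction {p_cure})")
```

The paper's modification concerns exactly the `cured` indicator: for some individuals it becomes known during follow-up, and ignoring that information biases the analysis.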

Relevance: 30.00%

Abstract:

A two-component survival mixture model is proposed to analyse a set of ischaemic stroke-specific mortality data. The survival experience of stroke patients after the index stroke may be described by a subpopulation of patients in the acute condition and another subpopulation of patients in the chronic phase. To adjust for the inherent correlation of observations due to random hospital effects, a mixture model of two survival functions with random effects is formulated. Assuming a Weibull hazard in both components, an EM algorithm is developed for the estimation of fixed effect parameters and variance components. A simulation study is conducted to assess the performance of the two-component survival mixture model estimators. Simulation results confirm the applicability of the proposed model in a small sample setting. Copyright (C) 2004 John Wiley & Sons, Ltd.
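A stripped-down version of the estimation idea, EM for a two-component mixture of survival times, can be sketched as follows. Exponential components are used here so the M-step stays in closed form; the paper itself fits Weibull components with random hospital effects, so this is a simplified illustration, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic data: 40% "acute" times (mean 0.5) and 60% "chronic" times (mean 4).
n = 4000
z = rng.random(n) < 0.4
t = np.where(z, rng.exponential(0.5, n), rng.exponential(4.0, n))

pi, r1, r2 = 0.5, 1.0, 0.2  # initial mixing weight and component rates
for _ in range(200):
    # E-step: posterior probability each observation came from component 1
    d1 = pi * r1 * np.exp(-r1 * t)
    d2 = (1 - pi) * r2 * np.exp(-r2 * t)
    w = d1 / (d1 + d2)
    # M-step: weighted maximum-likelihood updates (closed-form for exponentials)
    pi = w.mean()
    r1 = w.sum() / (w * t).sum()
    r2 = (1 - w).sum() / ((1 - w) * t).sum()

print(f"pi = {pi:.2f}, mean1 = {1/r1:.2f}, mean2 = {1/r2:.2f}")
```

With Weibull hazards the M-step loses its closed form (the shape parameter needs a numerical solve), and censoring and random effects complicate both steps further, which is what the paper works out.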

Relevance: 30.00%

Abstract:

Protein engineering is a powerful tool, which correlates protein structure with specific functions, both in applied biotechnology and in basic research. Here, we present a practical teaching course for engineering the green fluorescent protein (GFP) from Aequorea victoria by a random mutagenesis strategy using error-prone polymerase chain reaction. Screening of bacterial colonies transformed with random mutant libraries identified GFP variants with increased fluorescence yields. Mapping the three-dimensional structure of these mutants demonstrated how alterations in structural features such as the environment around the fluorophore and properties of the protein surface can influence functional properties such as the intensity of fluorescence and protein solubility.

Relevance: 30.00%

Abstract:

We prove that, once an algorithm of perfect simulation for a stationary and ergodic random field F taking values in S^(Z^d), with S a bounded subset of R^n, is provided, the convergence in the mean ergodic theorem occurs exponentially fast for F. Applications from (non-equilibrium) statistical mechanics and interacting particle systems are presented.

Relevance: 30.00%

Abstract:

The elevated plus-maze is a device widely used to assess rodent anxiety under the effect of several treatments, including pharmacological agents. The animal is placed at the center of the apparatus, which consists of two open arms and two arms enclosed by walls, and the number of entries and duration of stay in each arm are measured for a 5-min exposure period. The effect of an anxiolytic drug is to increase the percentage of time spent in and number of entries into the open arms. In this work, we propose a new measure of anxiety levels in the rat submitted to the elevated plus-maze. We represented the spatial structure of the elevated plus-maze in terms of a directed graph and studied the statistics of the rat's transitions between the nodes of the graph. By counting the number of times each transition is made and ordering them in descending frequency, we represented the rat's behavior in a rank-frequency plot. Our results suggest that the curves obtained under different pharmacological conditions can be well fitted by a power law with an exponent sensitive to both the drug type and the dose used. (C) 2009 Elsevier B.V. All rights reserved.
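The transition-counting and rank-frequency pipeline can be sketched as follows. The node labels and the uniform random walk below are purely illustrative: the paper's directed graph has a specific structure (arms connect through the center) that restricts which transitions are possible, and real trajectories are not uniform.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(5)

# Hypothetical node labels and a synthetic visit sequence standing in for
# the tracked trajectory of the animal on the maze graph.
nodes = ["center", "open1", "open2", "closed1", "closed2"]
walk = rng.choice(nodes, size=2000)

# Count each directed transition (from-node, to-node) along the sequence.
transitions = Counter(zip(walk[:-1], walk[1:]))

# Order transition counts in descending frequency for the rank-frequency plot.
freqs = np.array(sorted(transitions.values(), reverse=True), dtype=float)
ranks = np.arange(1, freqs.size + 1)

# A power-law rank-frequency relation f(r) ~ r^(-alpha) is a straight line
# in log-log coordinates; alpha is the (negated) fitted slope.
alpha = -np.polyfit(np.log(ranks), np.log(freqs), 1)[0]
print(f"fitted rank-frequency exponent: {alpha:.2f}")
```

The paper's finding is that `alpha` fitted to real trajectories shifts systematically with the drug and dose, making it usable as an anxiety measure.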

Relevance: 30.00%

Abstract:

This study describes the creation of a graphical representation based on the application of a questionnaire to evaluate the indicative factors of a sustainable telemedicine and telehealth center in São Paulo, Brazil. We categorized the factors into seven domain areas: institutional, functional, economic-financial, renewal, academic-scientific, partnerships, and social welfare, and plotted them in a graphical representation. The developed graph was shown to be useful when used in the same institution over a long period and complemented with secondary information from publications, archives, and administrative documents to support the numerical indicators. Its use may contribute toward monitoring the factors that define telemedicine and telehealth center sustainability. When systematically applied, it may also be useful for identifying the specific characteristics of the telemedicine and telehealth center, to support its organizational development.

Relevance: 30.00%

Abstract:

Background: Changes in the shape of the capnogram may reflect changes in lung physiology. We studied the effect of different ventilation/perfusion ratios (V/Q) induced by positive end-expiratory pressure (PEEP) and lung recruitment on the phase III slope (S(III)) of volumetric capnograms. Methods: Seven lung-lavaged pigs received volume-controlled ventilation at tidal volumes of 6 ml/kg. After a lung recruitment maneuver, open-lung PEEP (OL-PEEP) was defined at 2 cmH(2)O above the PEEP at the onset of lung collapse, as identified by the maximum respiratory compliance during a decremental PEEP trial. Thereafter, six distinct conditions, at OL-PEEP or 4 cmH(2)O above or below this level, each either with or without a prior lung recruitment maneuver, were applied in random order. Ventilation-perfusion distribution (using the multiple inert gas elimination technique), hemodynamics, blood gases and volumetric capnography data were recorded at the end of each condition (minute 40). Results: S(III) showed the lowest value whenever lung recruitment and OL-PEEP were jointly applied, and was associated with the lowest dispersion of ventilation and perfusion (Disp(R-E)), the lowest ratio of alveolar dead space to alveolar tidal volume (VD(alv)/VT(alv)) and the lowest difference between arterial and end-tidal pCO(2) (Pa-ETCO(2)). Spearman's rank correlation between S(III) and Disp(R-E) showed rho = 0.85, with a 95% CI (Fisher's Z-transformation) of 0.74-0.91, P < 0.0001. Conclusion: In this experimental model of lung injury, changes in the phase III slope of the capnograms were directly correlated with the degree of ventilation/perfusion dispersion.
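How a phase III slope is extracted from a volumetric capnogram can be illustrated on synthetic data: S(III) is the slope of the CO2 vs. expired-volume curve over its final, alveolar portion. All numbers below (volumes, CO2 values, the cut-off for the alveolar phase) are made up for illustration and are not from the study.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic single-breath volumetric capnogram: a schematic rise through
# phases I-II, then a near-linear phase III with slope 0.008 mmHg/ml.
volume = np.linspace(0, 500, 200)                  # ml expired
co2 = np.where(volume < 150,
               38 * (volume / 150) ** 3,           # phases I-II (schematic)
               38 + 0.008 * (volume - 150))        # phase III plateau
co2 = co2 + rng.normal(0, 0.1, volume.size)        # measurement noise

# Fit a straight line over the alveolar portion only to obtain S(III).
phase3 = volume > 300
s3, _ = np.polyfit(volume[phase3], co2[phase3], 1)
print(f"phase III slope: {s3:.4f} mmHg/ml")
```

In practice the slope is often normalized by the mean CO2 of the fitted segment so that breaths of different sizes are comparable.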

Relevance: 30.00%

Abstract:

The brain is a complex system that, in the normal condition, has emergent properties like those associated with activity-dependent plasticity in learning and memory, and that, in pathological situations, manifests abnormal long-term phenomena like the epilepsies. Data from our laboratory and from the literature were classified qualitatively as sources of complexity and emergent properties from the behavioral to the electrophysiological, cellular, molecular, and computational levels. We used such models as brainstem-dependent acute audiogenic seizures and forebrain-dependent kindled audiogenic seizures. Additionally, we used chemical or electrical experimental models of temporal lobe epilepsy that induce status epilepticus with behavioral, anatomical, and molecular sequelae such as spontaneous recurrent seizures and long-term plastic changes. Current computational neuroscience tools will help the interpretation, storage, and sharing of the exponential growth of information derived from those studies. These strategies are considered solutions to deal with the complexity of brain pathologies such as the epilepsies. (C) 2008 Elsevier Inc. All rights reserved.

Relevance: 30.00%

Abstract:

When the data consist of certain attributes measured on the same set of items in different situations, they would be described as a three-mode three-way array. A mixture likelihood approach can be implemented to cluster the items (i.e., one of the modes) on the basis of both of the other modes simultaneously (i.e., the attributes measured in different situations). In this paper, it is shown that this approach can be extended to handle three-mode three-way arrays where some of the data values are missing at random in the sense of Little and Rubin (1987). The methodology is illustrated by clustering the genotypes in a three-way soybean data set where various attributes were measured on genotypes grown in several environments.

Relevance: 30.00%

Abstract:

A generalised model for the prediction of single char particle gasification dynamics, accounting for multi-component mass transfer with chemical reaction, heat transfer, as well as structure evolution and peripheral fragmentation, is developed in this paper. Maxwell-Stefan analysis is uniquely applied to both micro- and macropores within the framework of the dusty-gas model to account for the bidisperse nature of the char, which differs significantly from the conventional models that are based on a single pore type. The peripheral fragmentation and random-pore correlation incorporated into the model enable prediction of structure/reactivity relationships. The occurrence of chemical reaction within the boundary layer reported by Biggs and Agarwal (Chem. Eng. Sci. 52 (1997) 941) has been confirmed through an analysis of the CO/CO2 product ratio obtained from model simulations. However, it is also quantitatively observed that the significance of the boundary layer reaction reduces notably with the reduction of oxygen concentration in the flue gas, operational pressure and film thickness. Computations have also shown that, in the presence of diffusional gradients, peripheral fragmentation occurs in the early stages on the surface, after which conversion quickens significantly due to the small particle size. Results of the early commencement of peripheral fragmentation at relatively low overall conversion, obtained from a large number of simulations, agree well with experimental observations reported by Feng and Bhatia (Energy & Fuels 14 (2000) 297). Comprehensive analysis of the simulation results is carried out based on well-accepted physical principles to rationalise the model predictions. (C) 2001 Elsevier Science Ltd. All rights reserved.
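The random-pore correlation the model incorporates is commonly written in the classical Bhatia-Perlmutter form r(X) = k (1 - X) sqrt(1 - psi ln(1 - X)). The sketch below evaluates this expression only to illustrate the structure/reactivity relationship it encodes; the values of k and psi are arbitrary and are not fitted to the chars studied in the paper.

```python
import numpy as np

def random_pore_rate(X, k=1.0, psi=4.0):
    """Classical random-pore-model reactivity as a function of conversion X.

    k is a rate constant and psi a structural parameter; for psi > 2 the
    reactivity first rises (pore surface grows) and then falls (pores merge).
    """
    return k * (1.0 - X) * np.sqrt(1.0 - psi * np.log(1.0 - X))

X = np.linspace(0.0, 0.95, 96)
r = random_pore_rate(X)
X_peak = X[np.argmax(r)]
print(f"conversion at peak reactivity: {X_peak:.2f}")
```

Analytically the peak sits at X = 1 - exp(1/psi - 1/2), about 0.22 for psi = 4, which is the kind of structure/reactivity prediction the correlation supplies to the full particle model.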