61 results for robust mean
Abstract:
In this paper we deal with robust inference in heteroscedastic measurement error models. Rather than the normal distribution, we postulate a Student t distribution for the observed variables. Maximum likelihood estimates are computed numerically. Consistent estimation of the asymptotic covariance matrices of the maximum likelihood and generalized least squares estimators is also discussed. Three test statistics are proposed for testing hypotheses of interest, with the asymptotic chi-square distribution guaranteeing correct asymptotic significance levels. Results of simulations and an application to a real data set are also reported. (C) 2009 The Korean Statistical Society. Published by Elsevier B.V. All rights reserved.
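The robustness mechanism behind a Student-t likelihood can be illustrated with the classic EM-type iteration for a t location estimate: each observation is down-weighted according to its current standardized distance. This is a generic sketch with a fixed scale and fixed degrees of freedom (nu = 3), not the paper's heteroscedastic measurement error model.

```python
import numpy as np

def t_location(x, nu=3.0, sigma=1.0, iters=50):
    """EM-type iteration for the location of a Student-t model
    with known scale: w_i = (nu+1)/(nu + r_i^2) down-weights
    observations far from the current estimate."""
    mu = np.median(x)  # robust starting point
    for _ in range(iters):
        r = (x - mu) / sigma
        w = (nu + 1.0) / (nu + r**2)
        mu = np.sum(w * x) / np.sum(w)
    return mu

x = np.array([0.2, 0.9, 1.4, 0.6, 1.1, 100.0])  # one gross outlier
print(t_location(x), x.mean())  # the t-based estimate stays near the bulk
```

Unlike the sample mean, which is dragged toward the outlier, the fixed point of the weighted average essentially ignores it, which is the same effect that makes the t-based inference in the paper robust.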
Abstract:
Linear mixed models were developed to handle clustered data and have been a topic of increasing interest in statistics for the past 50 years. Generally, the normality (or symmetry) of the random effects is a common assumption in linear mixed models, but it may sometimes be unrealistic, obscuring important features of among-subject variation. In this article, we utilize skew-normal/independent distributions as a tool for robust modeling of linear mixed models under a Bayesian paradigm. The skew-normal/independent distributions form an attractive class of asymmetric heavy-tailed distributions that includes the skew-normal, skew-t, skew-slash and skew-contaminated normal distributions as special cases, providing an appealing robust alternative to the routine use of symmetric distributions in this type of model. The methods developed are illustrated using a real data set from the Framingham cholesterol study. (C) 2009 Elsevier B.V. All rights reserved.
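As a concrete illustration of this class, draws from a skew-normal/independent distribution can be generated from the usual stochastic representation: a skew-normal variate divided by the square root of an independent positive mixing variable (a scaled chi-square in the skew-t case). The sketch below is ours, not the paper's code; the shape parameter `lam` and sample sizes are arbitrary.

```python
import numpy as np

def rskew_normal(n, lam, rng):
    """Skew-normal draws via Azzalini's stochastic representation:
    Z = delta*|U0| + sqrt(1 - delta^2)*U1, with U0, U1 iid N(0,1)."""
    delta = lam / np.sqrt(1.0 + lam**2)
    u0, u1 = rng.standard_normal(n), rng.standard_normal(n)
    return delta * np.abs(u0) + np.sqrt(1.0 - delta**2) * u1

def rskew_t(n, lam, nu, rng):
    """Skew-t draws: a skew-normal scaled by an independent
    sqrt(nu / chi2_nu) mixing factor, giving heavier tails."""
    w = rng.chisquare(nu, size=n) / nu
    return rskew_normal(n, lam, rng) / np.sqrt(w)

rng = np.random.default_rng(0)
z = rskew_normal(200_000, lam=2.0, rng=rng)
# For the standard skew-normal, E[Z] = delta * sqrt(2/pi)
delta = 2.0 / np.sqrt(5.0)
print(z.mean(), delta * np.sqrt(2.0 / np.pi))
```

The same mixing construction, applied to the random effects and errors, is what yields the skew-slash and skew-contaminated normal members of the class.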
Abstract:
A Hamiltonian system perturbed by two waves with particular wave numbers can present robust tori, which are barriers created by the vanishing of the perturbed Hamiltonian at some defined positions. When robust tori exist, any trajectory in phase space passing close to them is blocked by emergent invariant curves that prevent chaotic transport. Our results indicate that the particular solution considered for the two-wave Hamiltonian model exhibits plenty of robust tori blocking radial transport. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The non-twist standard map occurs frequently in many fields of science, especially in modelling the dynamics of the magnetic field lines in tokamaks. Robust tori, dynamical barriers that impede the radial transport among different regions of the phase space, are introduced in the non-twist standard map in a conservative fashion. The resulting non-twist standard map with robust tori is an improved model to study transport barriers in plasmas confined in tokamaks.
Abstract:
We present a non-linear symplectic map that describes the alterations of the magnetic field lines inside the tokamak plasma due to the presence of a robust torus (RT) at the plasma edge. This RT prevents the magnetic field lines from reaching the tokamak wall and reduces, in its vicinity, the destruction of islands and invariant curves due to resonant perturbations. The map describes the equilibrium magnetic field lines perturbed by resonances created by ergodic magnetic limiters (EMLs). We present the results obtained for twist and non-twist mappings derived for monotonic and non-monotonic plasma current density radial profiles, respectively. Our results indicate that the RT implementation would decrease the field line transport at the tokamak plasma edge. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Radial transport in the tokamap, which has been proposed as a simple model for the motion in a stochastic plasma, is investigated. A theory for previous numerical findings is presented. The new results are stimulated by the fact that the radial diffusion coefficient is space-dependent. The space-dependence of the transport coefficient has several interesting effects which have not been elucidated so far. Among the new findings are the analytical predictions for the scaling of the mean radial displacement with time and the relation between the Fokker-Planck diffusion coefficient and the diffusion coefficient from the mean square displacement. The applicability to other systems is also discussed. (C) 2009 WILEY-VCH GmbH & Co. KGaA, Weinheim
Abstract:
Purpose: To obtain cerebral perfusion territories of the left, the right, and the posterior circulation in humans with high signal-to-noise ratio (SNR) and robust delineation. Materials and Methods: Continuous arterial spin labeling (CASL) was implemented using a dedicated radio frequency (RF) coil, positioned over the neck, to label the major cerebral feeding arteries in humans. Selective labeling was achieved by flow-driven adiabatic fast passage and by tilting the longitudinal labeling gradient about the Y-axis by theta = +/- 60 degrees. Results: Mean cerebral blood flow (CBF) values in gray matter (GM) and white matter (WM) were 74 +/- 13 mL·100 g^-1·min^-1 and 14 +/- 13 mL·100 g^-1·min^-1, respectively (N = 14). There were no signal differences between left and right hemispheres when theta = 0 degrees (P > 0.19), indicating efficient labeling of both hemispheres. When theta = +60 degrees, the signal in GM on the left hemisphere, 0.07 +/- 0.06%, was 92% lower than on the right hemisphere, 0.85 +/- 0.30% (P < 1 x 10^-9), while for theta = -60 degrees, the signal in the right hemisphere, 0.16 +/- 0.13%, was 82% lower than on the contralateral side, 0.89 +/- 0.22% (P < 1 x 10^-10). Similar attenuations were obtained in WM. Conclusion: Clear delineation of the left and right cerebral perfusion territories was obtained, allowing discrimination of the anterior and posterior circulation in each hemisphere.
Abstract:
The count intercept is a robust method for the numerical analysis of fabrics (Launeau and Robin, 1996). It counts the number of intersections between a set of parallel scan lines and a mineral phase, which must be identified on a digital image. However, the method is only sensitive to boundaries and therefore supposes the user has some knowledge about their significance. The aim of this paper is to show that a proper grey-level detection of boundaries along scan lines is sufficient to calculate the two-dimensional anisotropy of grain or crystal distributions without any particular image processing. Populations of grains and crystals usually display elliptical anisotropies in rocks. When confirmed by the intercept analysis, a combination of a minimum of 3 mean-length intercept roses, taken on 3 more or less perpendicular sections, allows the calculation of 3-dimensional ellipsoids and the determination of their standard deviation in direction and intensity in 3 dimensions as well. The feasibility of this quick method is attested by numerous examples on theoretical objects deformed by active and passive deformation, on BSE images of synthetic magma flow, on drawings or direct analysis of thin-section pictures of sandstones, and on digital images of granites taken and measured directly in the field. (C) 2010 Elsevier B.V. All rights reserved.
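The core counting step can be sketched on a binary phase image: traverse scan lines in two orthogonal directions and count phase-boundary crossings, and the ratio of the two counts gives a crude 2-D anisotropy measure. This is a minimal illustration of the idea, not the authors' implementation, which builds full intercept roses over many scan-line orientations.

```python
import numpy as np

def boundary_intercepts(phase):
    """Count phase-boundary crossings of a binary image along
    horizontal and vertical scan lines."""
    horizontal = int(np.sum(phase[:, 1:] != phase[:, :-1]))  # along rows
    vertical = int(np.sum(phase[1:, :] != phase[:-1, :]))    # along columns
    return horizontal, vertical

# A synthetic "fabric" of vertical stripes: its boundaries are
# crossed only by horizontal scan lines, so the counts differ.
img = np.zeros((10, 10), dtype=int)
img[:, 2:5] = 1
h, v = boundary_intercepts(img)
print(h, v)  # 20 crossings horizontally, 0 vertically
```

Repeating the count over a fan of scan-line directions and plotting mean intercept length against direction produces the intercept rose from which the anisotropy ellipse is fitted.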
Abstract:
In this paper, a simple relation between the Leimkuhler curve and the mean residual life is established. The result is illustrated with several models commonly used in informetrics, such as exponential, Pareto and lognormal. Finally, relationships with some other reliability concepts are also presented. (C) 2010 Elsevier Ltd. All rights reserved.
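The kind of relation involved can be seen from a standard identity (our notation, not necessarily the paper's exact statement). Writing $S(x)=1-F(x)$ for the survival function, $\mu$ for the mean, and $e(x)=E[X-x \mid X>x]$ for the mean residual life:

```latex
\int_x^\infty t\,dF(t)
  = \int_x^\infty (t-x)\,dF(t) + x\,S(x)
  = S(x)\bigl(e(x)+x\bigr),
```

so the Leimkuhler curve $K(y)=\mu^{-1}\int_{x_y}^\infty t\,dF(t)$, evaluated at $y=S(x_y)$, takes the form $K(y)=y\bigl(x_y+e(x_y)\bigr)/\mu$, which makes the link between productivity concentration and the mean residual life explicit for any of the models mentioned (exponential, Pareto, lognormal).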
Abstract:
We discuss the estimation of the expected value of the quality-adjusted survival, based on multistate models. We generalize an earlier work by considering that the sojourn times in health states are not identically distributed, for a given vector of covariates. Approaches based on semiparametric and parametric (exponential and Weibull distributions) methodologies are considered. A simulation study is conducted to evaluate the performance of the proposed estimator, and the jackknife resampling method is used to estimate the variance of this estimator. An application to a real data set is also included.
Abstract:
Regression models for the mean quality-adjusted survival time are specified from hazard functions of transitions between two states, and the mean quality-adjusted survival time may be a complex function of covariates. We discuss a regression model for the mean quality-adjusted survival (QAS) time based on pseudo-observations, which has the advantage of directly modeling the effect of covariates on the QAS time. Both Monte Carlo simulations and a real data set are studied. Copyright (C) 2009 John Wiley & Sons, Ltd.
Abstract:
In clinical trials, it may be of interest to take into account physical and emotional well-being in addition to survival when comparing treatments. Quality-adjusted survival time has the advantage of incorporating information about both survival time and quality of life. In this paper, we discuss the estimation of the expected value of the quality-adjusted survival, based on multistate models for the sojourn times in health states. Semiparametric and parametric (with exponential distribution) approaches are considered. A simulation study is presented to evaluate the performance of the proposed estimator, and the jackknife resampling method is used to compute the bias and variance of the estimator. (C) 2007 Elsevier B.V. All rights reserved.
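The quantity underlying these quality-adjusted survival abstracts is a utility-weighted sum of sojourn times in health states; its expected value can be estimated by a sample mean, with the jackknife supplying a variance estimate. The sketch below is a deliberately simplified, fully observed (no censoring) version with made-up utility weights, not the semiparametric estimators of the papers.

```python
import numpy as np

def qas_times(sojourn, utility):
    """Quality-adjusted survival per subject: sum over health
    states of (utility weight) x (time spent in that state)."""
    return sojourn @ utility

def jackknife_variance(values):
    """Jackknife variance of the sample mean: recompute the
    estimate leaving one subject out at a time."""
    n = len(values)
    loo = np.array([np.delete(values, i).mean() for i in range(n)])
    return (n - 1) / n * np.sum((loo - loo.mean()) ** 2)

# Rows: subjects; columns: years spent in states (healthy, ill).
sojourn = np.array([[5.0, 1.0], [3.0, 2.0], [8.0, 0.5], [4.0, 4.0]])
utility = np.array([1.0, 0.5])  # hypothetical quality weights
q = qas_times(sojourn, utility)
estimate = q.mean()
variance = jackknife_variance(q)
```

For the plain sample mean the jackknife reproduces the usual variance estimate s^2/n; its value in the papers' setting is that it also applies to the more complex multistate estimators, where no closed-form variance is available.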
Abstract:
The immersed boundary method is a versatile tool for the investigation of flow-structure interaction. In a large number of applications, the immersed boundaries or structures are very stiff, and strong tangential forces on these interfaces induce a well-known, severe time-step restriction for explicit discretizations. This excessive stability constraint can be removed with fully implicit or suitable semi-implicit schemes, but at a seemingly prohibitive computational cost. While economical alternatives have been proposed recently for some special cases, there is a practical need for a computationally efficient approach that can be applied more broadly. In this context, we revisit a robust semi-implicit discretization introduced by Peskin in the late 1970s which has received renewed attention recently. This discretization, in which the spreading and interpolation operators are lagged, leads to a linear system of equations for the interface configuration at the future time, when the interfacial force is linear. However, this linear system is large and dense, and thus it is challenging to streamline its solution. Moreover, while the same linear system or one of similar structure could potentially be used in Newton-type iterations, nonlinear and highly stiff immersed structures pose additional challenges to iterative methods. In this work, we address these problems and propose cost-effective computational strategies for solving Peskin's lagged-operators type of discretization. We do this by first constructing a sufficiently accurate approximation to the system's matrix, and we obtain a rigorous estimate for this approximation. This matrix is expeditiously computed by using a combination of pre-calculated values and interpolation. The availability of a matrix allows for more efficient matrix-vector products and facilitates the design of effective iterative schemes.
We propose efficient iterative approaches to deal with both linear and nonlinear interfacial forces and simple or complex immersed structures with tethered or untethered points. One of these iterative approaches employs a splitting in which we first solve a linear problem for the interfacial force and then we use a nonlinear iteration to find the interface configuration corresponding to this force. We demonstrate that the proposed approach is several orders of magnitude more efficient than the standard explicit method. In addition to considering the standard elliptical drop test case, we show both the robustness and efficacy of the proposed methodology with a 2D model of a heart valve. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
We present a variable time step, fully adaptive in space, hybrid method for the accurate simulation of incompressible two-phase flows in the presence of surface tension in two dimensions. The method is based on the hybrid level set/front-tracking approach proposed in [H. D. Ceniceros and A. M. Roma, J. Comput. Phys., 205, 391-400, 2005]. Geometric, interfacial quantities are computed from front-tracking via the immersed-boundary setting, while the signed distance (level set) function, which is evaluated fast and to machine precision, is used as a fluid indicator. The surface tension force is obtained by employing the mixed Eulerian/Lagrangian representation introduced in [S. Shin, S. I. Abdel-Khalik, V. Daru and D. Juric, J. Comput. Phys., 203, 493-516, 2005], whose success in greatly reducing parasitic currents has been demonstrated. The use of our accurate fluid indicator together with effective Lagrangian marker control enhances this parasitic current reduction by several orders of magnitude. To resolve sharp gradients and salient flow features accurately and efficiently, we employ dynamic, adaptive mesh refinements. This spatial adaptation is used in concert with a dynamic control of the distribution of the Lagrangian nodes along the fluid interface and a variable time step, linearly implicit time integration scheme. We present numerical examples designed to test the capabilities and performance of the proposed approach as well as three applications: the long-time evolution of a fluid interface undergoing Rayleigh-Taylor instability, an example of bubble ascending dynamics, and a drop impacting on a free interface whose dynamics we compare with both existing numerical and experimental data.
Abstract:
We propose a likelihood ratio test (LRT) with Bartlett correction in order to identify Granger causality between sets of time series gene expression data. The performance of the proposed test is compared to a previously published bootstrap-based approach. The LRT is shown to be significantly faster and statistically powerful even under non-normal distributions. An R package named gGranger containing an implementation of both Granger causality identification tests is also provided.
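The idea of a likelihood-ratio Granger test can be sketched in a few lines: compare the residual sum of squares of an autoregression of y on its own past against one that also includes the past of x. The code below is our illustration in Python (the paper provides an R package, and the Bartlett correction is omitted); the simulated series and coefficients are made up.

```python
import numpy as np

def granger_lrt(x, y):
    """Likelihood-ratio statistic (Gaussian errors, lag 1) for
    'x Granger-causes y': restricted model y_t ~ y_{t-1} versus
    full model y_t ~ y_{t-1} + x_{t-1}.  Under H0 the statistic
    is asymptotically chi-square with 1 degree of freedom."""
    yt, ylag, xlag = y[1:], y[:-1], x[:-1]
    n = len(yt)

    def rss(design):
        coef, *_ = np.linalg.lstsq(design, yt, rcond=None)
        resid = yt - design @ coef
        return float(resid @ resid)

    ones = np.ones(n)
    rss_restricted = rss(np.column_stack([ones, ylag]))
    rss_full = rss(np.column_stack([ones, ylag, xlag]))
    return n * np.log(rss_restricted / rss_full)

rng = np.random.default_rng(42)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):  # y driven by past x: causality present
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.standard_normal()
stat = granger_lrt(x, y)
print(stat > 3.84)  # exceeds the 5% chi-square(1) critical value
```

The Bartlett correction studied in the paper rescales this statistic so that its finite-sample distribution is closer to the chi-square reference, which matters for the short series typical of gene expression data.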