955 results for standard package software


Relevance:

80.00%

Publisher:

Abstract:

Accelerated failure time models with a shared random component are described and used to evaluate the effect of explanatory factors and of different transplant centres on survival times following kidney transplantation. Different combinations of the distribution of the random effects and the baseline hazard function are considered, and the fit of such models to the transplant data is critically assessed. A mixture model that combines short- and long-term components of a hazard function is then developed, which provides a more flexible model for the hazard function. The model can incorporate different explanatory variables and random effects in each component. The model is straightforward to fit using standard statistical software and is shown to be a good fit to the transplant data. Copyright (C) 2004 John Wiley & Sons, Ltd.
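
As a hedged illustration of the kind of structure described (the notation below is ours, not necessarily the paper's), a two-component mixture with centre-level random effects for patient i in centre j might be written as

$$ S_{ij}(t) = p\,S_1(t \mid \mathbf{x}_{ij}, u_j) + (1-p)\,S_2(t \mid \mathbf{z}_{ij}, v_j), $$

where S_1 and S_2 are accelerated failure time survivor functions for the short- and long-term components, x_ij and z_ij are the (possibly different) covariate vectors of each component, u_j and v_j are the shared random components, and p is the mixing proportion, which could itself be modelled through, e.g., a logistic link.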

Relevance:

80.00%

Publisher:

Abstract:

We consider the issue of performing accurate small-sample likelihood-based inference in beta regression models, which are useful for modelling continuous proportions that are affected by independent variables. We derive small-sample adjustments to the likelihood ratio statistic in this class of models. The adjusted statistics can be easily implemented in standard statistical software. We present Monte Carlo simulations showing that inference based on the adjusted statistics we propose is much more reliable than that based on the usual likelihood ratio statistic. A real data example is presented.
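
For orientation only, the unadjusted test being corrected works as follows (a minimal sketch; the paper's model-specific adjustment term is not reproduced here):

```python
from scipy.stats import chi2

def lr_test(loglik_full, loglik_null, df):
    """Unadjusted likelihood ratio test: w = 2(l1 - l0), compared
    with a chi-squared distribution with df degrees of freedom."""
    w = 2.0 * (loglik_full - loglik_null)
    return w, chi2.sf(w, df)

# hypothetical maximized log-likelihoods from two nested beta regressions
w, p_value = lr_test(loglik_full=-41.2, loglik_null=-45.8, df=2)
print(w, p_value)  # small-sample adjustments rescale w before this step
```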

Relevance:

80.00%

Publisher:

Abstract:

Likelihood ratio tests can be substantially size-distorted in small and moderate-sized samples. In this paper, we apply Skovgaard's [Skovgaard, I.M., 2001. Likelihood asymptotics. Scandinavian Journal of Statistics 28, 3-32] adjusted likelihood ratio statistic to exponential family nonlinear models. We show that the adjustment term has a simple compact form that can be easily implemented in standard statistical software. The adjusted statistic is approximately chi-squared distributed with a high degree of accuracy. It is applicable in wide generality, since it allows both the parameter of interest and the nuisance parameter to be vector-valued. Unlike the modified profile likelihood ratio statistic obtained from Cox and Reid [Cox, D.R., Reid, N., 1987. Parameter orthogonality and approximate conditional inference. Journal of the Royal Statistical Society B 49, 1-39], the adjusted statistic proposed here does not require an orthogonal parameterization. A numerical comparison of likelihood-based tests of varying dispersion favors the test we propose and a Bartlett-corrected version of the modified profile likelihood ratio test recently obtained by Cysneiros and Ferrari [Cysneiros, A.H.M.A., Ferrari, S.L.P., 2006. An improved likelihood ratio test for varying dispersion in exponential family nonlinear models. Statistics and Probability Letters 76 (3), 255-265]. (C) 2008 Elsevier B.V. All rights reserved.
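
For reference, Skovgaard's (2001) adjusted statistic has the general form (a sketch of the cited result; the model-specific ingredients are what the article derives)

$$ w^{*} = w\left(1 - \frac{1}{w}\log \xi\right)^{2}, $$

where w is the usual likelihood ratio statistic and ξ is a correction factor built from observed and expected likelihood quantities; the contribution here is to show that, for exponential family nonlinear models, ξ takes a simple closed form.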

Relevance:

80.00%

Publisher:

Abstract:

The main purpose of this work is to study the behaviour of Skovgaard's [Skovgaard, I.M., 2001. Likelihood asymptotics. Scandinavian Journal of Statistics 28, 3-32] adjusted likelihood ratio statistic in testing simple hypotheses in a new class of regression models proposed here. The proposed class of regression models considers Dirichlet-distributed observations, and the parameters that index the Dirichlet distributions are related to covariates and unknown regression coefficients. This class is useful for modelling data consisting of multivariate positive observations summing to one, and it generalizes the beta regression model described in Vasconcellos and Cribari-Neto [Vasconcellos, K.L.P., Cribari-Neto, F., 2005. Improved maximum likelihood estimation in a new class of beta regression models. Brazilian Journal of Probability and Statistics 19, 13-31]. We show that, for our model, Skovgaard's adjusted likelihood ratio statistic has a simple compact form that can be easily implemented in standard statistical software. The adjusted statistic is approximately chi-squared distributed with a high degree of accuracy. Numerical simulations show that the modified test is more reliable in finite samples than the usual likelihood ratio procedure. An empirical application is also presented and discussed. (C) 2009 Elsevier B.V. All rights reserved.
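
As a hedged sketch of what fitting such a model involves (the log link and names below are illustrative choices, not necessarily the paper's parameterization):

```python
import numpy as np
from scipy.special import gammaln

def dirichlet_reg_loglik(beta, X, Y):
    """Log-likelihood of a Dirichlet regression in which each
    concentration parameter is linked to covariates, alpha_c = exp(X beta_c).
    X: (n, p) covariates; beta: (p, C) coefficients; Y: (n, C) compositions
    with rows of positive values summing to one."""
    alpha = np.exp(X @ beta)  # (n, C) concentration parameters
    return np.sum(
        gammaln(alpha.sum(axis=1))          # log Gamma(sum_c alpha_c)
        - gammaln(alpha).sum(axis=1)        # - sum_c log Gamma(alpha_c)
        + ((alpha - 1.0) * np.log(Y)).sum(axis=1)
    )
```

Maximizing this function (e.g., by minimizing its negative with scipy.optimize.minimize) yields the maximum likelihood estimates that enter the likelihood ratio statistic and its adjustment.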

Relevance:

80.00%

Publisher:

Abstract:

The present work investigates the navigation of the Pioneer 10 and 11 probes and what became known as the “Pioneer Anomaly”: the trajectories followed by the spacecraft did not match the ones retrieved with standard navigation software. The mismatch appeared as a linear drift in the Doppler data received from the spacecraft, which was ascribed to a constant sunward acceleration of about 8.5×10⁻¹⁰ m/s². The study presented hereafter tries to find a convincing explanation for this discrepancy. The research is based on the analysis of Doppler tracking data through the ODP (Orbit Determination Program) developed by NASA/JPL. The method can be summarized as follows: search for any physics affecting the dynamics of the spacecraft or the propagation of radiometric data that may not have been properly taken into account previously, and check whether it might rule out the anomaly. A major effort was put into building a thermal model of the spacecraft to predict the force due to anisotropic thermal radiation, since such a model is not natively included in the ODP. Tracking data encompassing more than twenty years of the Pioneer 10 interplanetary cruise, plus twelve years of Pioneer 11, have been analyzed in light of the results of the thermal model. Different orbit determination strategies have been implemented, including single-arc, multi-arc and stochastic filters, and their performance compared. Orbital solutions have been obtained without the need for any acceleration other than the thermal recoil acceleration, indicating it as responsible for the observed linear drift in the Doppler residuals. As further support, we checked that the inclusion of an additional constant acceleration does not improve the quality of the orbital solutions. All the tests performed lead to the conclusion that no anomalous acceleration is acting on the Pioneer spacecraft.
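
As a hedged back-of-the-envelope illustration of why thermal recoil is a plausible candidate (round figures of the right order of magnitude, not values from this work):

```python
# Recoil from anisotropically emitted thermal radiation: a = f * P / (m * c)
C = 299_792_458.0  # speed of light, m/s

def thermal_recoil_accel(emitted_power_w, anisotropy_fraction, mass_kg):
    """Acceleration from a fraction f of radiated power P emitted
    anisotropically by a spacecraft of mass m (illustrative only)."""
    return anisotropy_fraction * emitted_power_w / (mass_kg * C)

# a few kW of RTG/electrical waste heat, a few percent net anisotropy,
# and a ~250 kg spacecraft give an acceleration of order 10^-10 m/s^2
a = thermal_recoil_accel(2500.0, 0.026, 250.0)
print(f"{a:.2e} m/s^2")  # ~8.7e-10 m/s^2, the scale of the anomaly
```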

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this study was to evaluate whether measurements on conventional cephalometric radiographs are comparable with 3D measurements on 3D models of human skulls derived from cone beam CT (CBCT) data. A CBCT scan and a conventional cephalometric radiograph were made of 40 dry skulls. Standard cephalometric software was used to identify landmarks on both the 2D images and the 3D models. The same operator identified 17 landmarks on the cephalometric radiographs and on the 3D models. All images and 3D models were traced five times with a time interval of 1 week, and the mean value of the repeated measurements was used for further statistical analysis. Distances and angles were calculated. Intra-observer reliability was good for all measurements. The reproducibility of the measurements on the conventional cephalometric radiographs was higher than that of the measurements on the 3D models. For a few measurements, a clinically relevant difference between measurements on conventional cephalometric radiographs and on 3D models was found. Measurements on conventional cephalometric radiographs can differ significantly from measurements on 3D models of the same skull. The authors therefore recommend that 3D tracings not be used for longitudinal research in cases where only 2D records from the past are available.
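
A minimal sketch of the repeated-tracing summary described above (hypothetical numbers, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical: 5 repeated tracings of 17 distance measurements (mm)
tracings = 60.0 + rng.normal(scale=0.4, size=(5, 17))

means = tracings.mean(axis=0)             # values used for further analysis
within_sd = tracings.std(axis=0, ddof=1)  # intra-observer spread per landmark
print(means.round(1), within_sd.round(2))
```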

Relevance:

80.00%

Publisher:

Abstract:

The aim of this study was to evaluate whether measurements performed on conventional frontal radiographs are comparable to measurements performed on three-dimensional (3D) models of human skulls derived from cone beam computed tomography (CBCT) scans, and whether the latter can be used in longitudinal studies. CBCT scans and conventional frontal cephalometric radiographs were made of 40 dry human skulls. From each CBCT scan a 3D model was constructed. Standard cephalometric software was used to identify landmarks and to calculate ratios and angles. The same operator identified 10 landmarks on all radiographs and 3D models, five times with a time interval of 1 week. Intra-observer reliability was acceptable for all measurements. There was a statistically significant and clinically relevant difference between measurements performed on conventional frontal radiographs and on 3D CBCT-derived models of the same skull, in particular between the angular measurements. We therefore recommend that 3D models not be used for longitudinal research in cases where only two-dimensional (2D) records from the past are available.

Relevance:

80.00%

Publisher:

Abstract:

This study evaluated whether measurements on conventional frontal radiographs are comparable with measurements on frontal cephalometric radiographs constructed from cone beam computed tomography (CBCT) scans of dry human skulls. CBCT scans and conventional frontal cephalometric radiographs were made of 40 dry skulls. With I-Cat Vision(R) software, a cephalometric radiograph was constructed from each CBCT scan. Standard cephalometric software was used to identify landmarks and calculate ratios and angles. The same operator identified 10 landmarks on both types of cephalometric radiographs, on all images, 5 times with a time interval of 1 week. Intra-observer reliability was acceptable for all measurements. The reproducibility of the measurements on the frontal radiographs obtained from the CBCT scans was higher than that of the measurements on conventional frontal radiographs. There is a statistically significant and clinically relevant difference between measurements on conventional and constructed frontal radiographs, in particular between angular measurements, owing to the different positioning of the patient in the two devices. Positioning of the patient in the CBCT device appears to be an important factor in cases where a 2D projection of the 3D scan is made.

Relevance:

80.00%

Publisher:

Abstract:

To assess the effect of age and disease on mineral distribution at the distal third of the tibia, bone mineral content (BMC) and bone mineral density (BMD) were measured at the lumbar spine (spine), femoral neck (neck), and the diaphysis (Dia) and distal epiphysis (Epi) of the tibia in 89 healthy control women of different age groups (20-29, n = 12; 30-39, n = 11; 40-44, n = 12; 45-49, n = 12; 50-54, n = 12; 55-59, n = 10; 60-69, n = 11; 70-79, n = 9), in 25 women with untreated vertebral osteoporosis (VOP), and in 19 women with primary hyperparathyroidism (PHPT), using dual-energy x-ray absorptiometry (DXA; Hologic QDR 1000 with standard spine software). A soft-tissue simulator was used to compensate for the heterogeneity of soft-tissue thickness around the leg. The tibia was scanned over a length of 130 mm from the ankle joint, with the fibula excluded from the analysis. For BMC and BMD, ten sections of 13 mm each were analyzed separately and then pooled to define the epiphysis (Epi, 13-52 mm) and diaphysis (Dia, 91-130 mm) regions. Precision after repositioning was 1.9% and 2.1% for Epi and Dia, respectively. In the control group, at no site was there a significant difference between the 20-29 and 30-39 age groups, which were thus pooled to define the peak bone mass (PBM). (ABSTRACT TRUNCATED AT 250 WORDS)
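
For clarity, precision after repositioning is conventionally reported as a coefficient of variation; a minimal sketch (hypothetical values, not the study's data):

```python
import numpy as np

def precision_cv(repeated_scans):
    """Short-term precision as a coefficient of variation (%):
    CV = SD / mean * 100 over repeated scans with repositioning."""
    scans = np.asarray(repeated_scans, dtype=float)
    return scans.std(ddof=1) / scans.mean() * 100.0

# hypothetical repeated BMD scans (g/cm^2) of one tibial region
print(f"{precision_cv([0.812, 0.798, 0.805, 0.821]):.1f}%")
```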

Relevance:

80.00%

Publisher:

Abstract:

Due to the ongoing trend towards increased product variety, fast-moving consumer goods such as food and beverages, pharmaceuticals, and chemicals are typically manufactured through so-called make-and-pack processes. These processes consist of a make stage, a pack stage, and intermediate storage facilities that decouple these two stages. In operations scheduling, complex technological constraints must be considered, e.g., non-identical parallel processing units, sequence-dependent changeovers, batch splitting, no-wait restrictions, material transfer times, minimum storage times, and finite storage capacity. The short-term scheduling problem is to compute a production schedule such that a given demand for products is fulfilled, all technological constraints are met, and the production makespan is minimised. A production schedule typically comprises 500–1500 operations. Due to the problem size and complexity of the technological constraints, the performance of known mixed-integer linear programming (MILP) formulations and heuristic approaches is often insufficient. We present a hybrid method consisting of three phases. First, the set of operations is divided into several subsets. Second, these subsets are iteratively scheduled using a generic and flexible MILP formulation. Third, a novel critical path-based improvement procedure is applied to the resulting schedule. We develop several strategies for the integration of the MILP model into this heuristic framework. Using these strategies, high-quality feasible solutions to large-scale instances can be obtained within reasonable CPU times using standard optimisation software. We have applied the proposed hybrid method to a set of industrial problem instances and found that the method outperforms state-of-the-art methods.
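
The abstract does not name the optimisation software; as a hedged illustration of the kind of MILP at the core of the second phase, here is a toy disjunctive scheduling model using the open-source PuLP modeller (all data hypothetical; the actual formulation also covers changeovers, batch splitting, no-wait restrictions and finite storage):

```python
import pulp

dur = {"op1": 3, "op2": 2, "op3": 4}  # processing times (hypothetical)
M = sum(dur.values())                  # big-M constant for disjunctions

prob = pulp.LpProblem("toy_makespan", pulp.LpMinimize)
start = {o: pulp.LpVariable(f"s_{o}", lowBound=0) for o in dur}
cmax = pulp.LpVariable("Cmax", lowBound=0)
prob += cmax  # objective: minimise the makespan

# all three operations share one processing unit: pairwise no-overlap
ops = list(dur)
for i in range(len(ops)):
    for j in range(i + 1, len(ops)):
        a, b = ops[i], ops[j]
        y = pulp.LpVariable(f"y_{a}_{b}", cat="Binary")  # 1 if a before b
        prob += start[a] + dur[a] <= start[b] + M * (1 - y)
        prob += start[b] + dur[b] <= start[a] + M * y

for o in dur:
    prob += start[o] + dur[o] <= cmax  # makespan bounds every completion

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(cmax))  # 9.0 for this toy instance
```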

Relevance:

80.00%

Publisher:

Abstract:

Permanent displacements of a gas turbine founded on a fine, poorly graded, medium-density sand are studied. The amplitudes and modes of vibration are computed using Barkan's formulation, and the “High-Cycle Accumulation” (HCA) model is employed to account for deformations accumulated over a high number of cycles. The methodology is simple: it can be easily incorporated into standard mathematical software, and the HCA model parameters can be estimated from granulometry and index properties. Special attention is devoted to ‘transient’ situations at the equipment's start-up, during which a range of frequencies is traversed, including frequencies that can be close to the natural frequencies of the ground. Results show that such transient situations can be more restrictive than the stationary situations corresponding to normal operation. Therefore, checking the stationary situation alone might not be enough, and studying the influence of transient situations on the computed permanent displacements is needed to produce a proper foundation design.
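
For orientation, in the HCA model of Niemunis and co-workers the rate of strain accumulation is expressed as a product of empirical functions (a sketch from the HCA literature, not from this paper):

$$ \dot{\varepsilon}^{\mathrm{acc}} = f_{\mathrm{ampl}}\,\dot{f}_N\,f_e\,f_p\,f_Y\,f_\pi, $$

with factors accounting for the strain amplitude, the number of cycles, the void ratio, the average mean pressure, the average stress ratio and changes of the cyclic polarisation; several of the associated parameters are the ones that can be estimated from granulometry and index properties.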

Relevance:

80.00%

Publisher:

Abstract:

This study was conducted to develop a method, termed 'back analysis' (BA), for converting non-compartmental variables to compartment-model-dependent pharmacokinetic parameters for both one- and two-compartment models. A Microsoft Excel(R) spreadsheet was implemented with the use of Solver(R) and Visual Basic functions. The performance of the BA method in estimating pharmacokinetic parameter values was evaluated by comparing the parameter values obtained with those from a standard modelling software program, NONMEM, using simulated data. The results show that the BA method was reasonably precise and provided low bias in estimating fixed and random effect parameters for both one- and two-compartment models. The pharmacokinetic parameters estimated by the BA method were similar to those of the NONMEM estimation.
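
A hedged sketch of the idea behind such a conversion for the simplest case, a one-compartment model with IV bolus dosing (re-expressed in Python rather than the paper's Excel/Solver implementation, which also handles two-compartment models and random effects):

```python
def one_cpt_from_nca(dose, auc, lambda_z):
    """Recover one-compartment parameters from non-compartmental
    quantities: CL = Dose / AUC, k = lambda_z, V = CL / k."""
    cl = dose / auc   # clearance
    k = lambda_z      # elimination rate constant equals the terminal slope
    v = cl / k        # volume of distribution
    return {"CL": cl, "V": v, "k": k}

print(one_cpt_from_nca(dose=100.0, auc=50.0, lambda_z=0.1))
# {'CL': 2.0, 'V': 20.0, 'k': 0.1}
```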

Relevance:

80.00%

Publisher:

Abstract:

This project investigates why people in Chile acquired so much consumer debt in a context of material prosperity, and asks what role inequality and commodification play in this process. The case raises an important challenge to the literature. Insofar as existing accounts assume that the financialization of consumption occurs in contexts marked by wage stagnation and a general deterioration of the middle classes, they engender two contradictory explanations: while political economists argue that people use credit to smooth their consumption in the face of market volatility, economists maintain that the concentration of wealth at the top pushes middle-income consumers to emulate the expenditures of the rich and consume beyond their means. These explanations do not necessarily fit the reality of developing countries. Triangulating in-depth interviews with middle-class families, multivariate statistical analysis and secondary literature, the project shows that consumers in Chile use credit to finance “ordinary” forms of consumption that aim neither at coping with market instability nor at emulating and signaling status to others. Rather, Chileans use department store credit cards to acquire a standard package of “inconspicuous” goods that they feel entitled to have. From this point of view, the systematic indebtedness of consumers originates in a major concern with “rank”, “achievement” and “security” that, following De Botton, I call “status anxiety”. Status anxiety does not stem from the desire to emulate rich consumers, but from the impossibility of complying with normative expectations about what a middle-class family should be (and have), expectations that outweigh wage improvements. The project thus investigates the way in which “status anxiety” is systematically reproduced by means of two broad mechanisms that prompt people to acquire consumer debt. The first mechanism generating debt stems from an increase in real wages combined with high levels of inequality. It is explained by a general sociological principle known as relative deprivation, which points to the fact that satisfaction with one's income, possessions or status is assessed not in absolute terms, such as total income, but in relation to reference groups. In this sense, I explore the mechanisms that operate as catalysts of relative deprivation by making social inequalities explicit and distorting the perception of others' wealth. Despite upward mobility and economic improvement, Chileans share the perception of “falling behind”, which materializes in an “imaginary middle class” against which people compare their status, possessions and economic independence. Finally, I show that the commodification of education, health and pension funds does not directly prompt people to acquire consumer debt, but operates as an “income draining” mechanism that claims higher shares of middle-class families' discretionary income. In combination with relative deprivation, these income-draining mechanisms leave families with few options to perform their desired class identities other than learning how to bring resources from the future into the present with the help of department store credit cards.

Relevance:

80.00%

Publisher:

Abstract:

Context. In February-March 2014, the MAGIC telescopes observed the high-frequency-peaked BL Lac 1ES 1011+496 (z = 0.212) in a flaring state at very high energies (VHE, E > 100 GeV). The flux reached a level more than 10 times higher than any previously recorded flaring state of the source. Aims. We describe the characteristics of the flare, presenting the light curve and the spectral parameters of the night-wise spectra and of the average spectrum of the whole period. From these data we aim at detecting the imprint of the extragalactic background light (EBL) in the VHE spectrum of the source, in order to constrain its intensity in the optical band. Methods. We analyzed the gamma-ray data from the MAGIC telescopes using the standard MAGIC software to produce the light curve and the spectra. To constrain the EBL we implemented the method developed by the H.E.S.S. collaboration, in which the intrinsic energy spectrum of the source is modelled with a simple function (fewer than 4 parameters) and the EBL-induced optical depth is calculated using a template EBL model. The likelihood of the observed spectrum is then maximized, including a normalization factor for the EBL opacity among the free parameters. Results. The collected data allowed us to describe the flux changes night by night and to produce differential energy spectra for all nights of the observed period. The estimated intrinsic spectra of all the nights could be fitted by power-law functions. Evaluating the changes in the fit parameters, we conclude that the spectral shapes for most of the nights were compatible, regardless of the flux level, which enabled us to produce an average spectrum from which the EBL imprint could be constrained. The likelihood ratio test shows that the model with an EBL density 1.07 (-0.20, +0.24)_stat+sys, relative to the one in the tested EBL template (Domínguez et al. 2011), is preferred at the 4.6 σ level over the no-EBL hypothesis, under the assumption that the intrinsic source spectrum can be modelled as a log-parabola. This translates into a constraint on the EBL density in the wavelength range [0.24 μm, 4.25 μm], with a peak value at 1.4 μm of λF_λ = 12.27^(+2.75)_(-2.29) nW m^(-2) sr^(-1), including systematics.
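
A hedged sketch of the spectral-fitting idea (all numbers and the optical-depth template below are placeholders, and unlike the real analysis the intrinsic spectral parameters are held fixed rather than profiled):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
E = np.logspace(-1, 0.5, 12)  # energy grid, TeV

def tau(E):
    """Placeholder optical-depth template (stand-in for a tabulated
    EBL model such as Dominguez et al. 2011)."""
    return 0.6 * E**1.2

def model(E, alpha, f0=1e-10, a=2.2, b=0.1):
    """Intrinsic log-parabola attenuated by exp(-alpha * tau(E))."""
    intrinsic = f0 * E ** (-(a + b * np.log(E)))
    return intrinsic * np.exp(-alpha * tau(E))

# mock observed spectrum generated with alpha = 1 plus 5% noise
obs = model(E, alpha=1.0) * (1.0 + 0.05 * rng.normal(size=E.size))
err = 0.05 * obs

chi_square = lambda alpha: np.sum(((obs - model(E, alpha)) / err) ** 2)
best = minimize_scalar(chi_square, bounds=(0.0, 2.5), method="bounded")
print(best.x)  # recovered EBL opacity normalisation, close to 1
```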

Relevance:

50.00%

Publisher:

Abstract:

A new version of the TomoRebuild data reduction software package is presented for the reconstruction of scanning transmission ion microscopy tomography (STIMT) and particle-induced X-ray emission tomography (PIXET) images. First, we present a state of the art of the reconstruction codes available for ion beam microtomography. The algorithm proposed here brings several advantages. It is a portable, multi-platform code, designed in C++ with well-separated classes for easier use and evolution. Data reduction is separated into different steps, and the intermediate results may be checked if necessary. Although no additional graphics library or numerical tool is required to run the program from the command line, a user-friendly interface was designed in Java as an ImageJ plugin. All experimental and reconstruction parameters may be entered either through this plugin or directly in text-format files. A simple standard format is proposed for the input of experimental data. Optional graphic applications using the ROOT interface may be used separately to display and fit energy spectra. Regarding the reconstruction process, the filtered backprojection (FBP) algorithm, already present in the previous version of the code, was optimized so that it is now about 10 times faster. In addition, the Maximum Likelihood Expectation Maximization (MLEM) algorithm and its accelerated version, Ordered Subsets Expectation Maximization (OSEM), were implemented. A detailed user guide in English is available. A reconstruction example using experimental data from a biological sample is given. It shows the capability of the code to reduce noise in the sinograms and to deal with incomplete data, which opens new perspectives for tomography with a low number of projections or a limited angle.
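
For reference, the MLEM update implemented by such codes is the standard one (a generic sketch, not TomoRebuild's C++ code):

```python
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    """Maximum Likelihood Expectation Maximization for y ~ Poisson(A x):
        x <- x / (A^T 1) * A^T (y / (A x))
    OSEM applies the same update cyclically over subsets of the rows of A."""
    x = np.ones(A.shape[1])
    sensitivity = A.T @ np.ones(A.shape[0])  # A^T 1
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, eps)   # measured / forward-projected
        x *= (A.T @ ratio) / np.maximum(sensitivity, eps)
    return x

# toy 2x2 system as a stand-in for a real projection matrix and sinogram
A = np.array([[1.0, 1.0], [1.0, 0.0]])
print(mlem(A, y=A @ np.array([2.0, 3.0])))   # converges towards [2, 3]
```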