99 results for reversible jump Markov chain Monte Carlo


Relevance:

100.00%

Publisher:

Abstract:

The procedure for online process control by attributes consists of inspecting a single item for every m produced items. On the basis of the inspection result, it is decided whether the process is in control (the conforming fraction is stable) or out of control (the conforming fraction has decreased, for example). Most articles on online process control consider stopping the production process for an adjustment when the inspected item is non-conforming (production is then restarted in control; this is denominated here a corrective adjustment). Moreover, the articles on this subject do not present semi-economical designs (purely economic designs may yield high quantities of non-conforming items) and do not include a policy of preventive adjustments (in which case no item is inspected), which can be more economical, especially if the inspected item can be misclassified. In this article, the choice between a preventive and a corrective adjustment of the process is made at every m produced items. If a preventive adjustment is decided upon, no item is inspected. Otherwise, the m-th item is inspected; if it conforms, production continues; otherwise, an adjustment takes place and the process restarts in control. This approach is economically feasible in some practical situations, and the parameters of the proposed procedure are determined by minimizing an average cost function subject to some statistical restrictions (for example, to assure a minimal level, fixed in advance, of conforming items in the production process). Numerical examples illustrate the proposal.
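The decision rule and cost structure described above lend themselves to a direct Monte Carlo evaluation. The sketch below (Python) estimates the average cost per item of the corrective variant (or of an always-preventive variant) under an assumed cost structure, perfect classification of the inspected item, and a geometric shift mechanism; the parameters p0, p1, pi, c_insp, c_adj and c_nc are illustrative, not the paper's.

```python
import random

def average_cost_per_item(m, preventive, n_cycles=2_000, p0=0.98, p1=0.90,
                          pi=0.005, c_insp=0.5, c_adj=20.0, c_nc=4.0):
    """Monte Carlo estimate of the average cost per item of the sketched policy.

    p0, p1 : conforming fractions with the process in / out of control (assumed)
    pi     : per-item probability of a shift to the out-of-control state
    c_insp, c_adj, c_nc : inspection, adjustment and non-conforming-item costs
    (misclassification of the inspected item is not modeled here)
    """
    total_cost, total_items = 0.0, 0
    for _ in range(n_cycles):
        in_control, cost, produced = True, 0.0, 0
        while True:
            for _ in range(m):                        # produce one block of m items
                if in_control and random.random() < pi:
                    in_control = False                # process shifts out of control
                conforming = random.random() < (p0 if in_control else p1)
                if not conforming:
                    cost += c_nc
                produced += 1
            if preventive:                            # adjust without inspecting
                cost += c_adj
                break                                 # process restarts in control
            cost += c_insp                            # inspect the m-th item
            if not conforming:                        # non-conforming: corrective adjustment
                cost += c_adj
                break
        total_cost += cost
        total_items += produced
    return total_cost / total_items
```

Evaluating this function over a grid of m, and comparing preventive against corrective cycles, mimics the kind of cost minimization used to choose the design parameters (here without the paper's statistical restrictions).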

Relevance:

100.00%

Publisher:

Abstract:

This paper analyzes the complexity-performance trade-off of several heuristic near-optimum multiuser detection (MuD) approaches applied to the uplink of synchronous single/multiple-input multiple-output multicarrier code division multiple access (S/MIMO MC-CDMA) systems. Genetic algorithm (GA), short-term tabu search (STTS) and reactive tabu search (RTS), simulated annealing (SA), particle swarm optimization (PSO), and 1-opt local search (1-LS) heuristic multiuser detection algorithms (Heur-MuDs) are analyzed in detail, using a single-objective antenna-diversity-aided optimization approach. Monte Carlo simulations show that, after convergence, the performances reached by all near-optimum Heur-MuDs are similar. However, the computational complexities may differ substantially, depending on the system operation conditions. Their complexities are carefully analyzed in order to obtain a general complexity-performance comparison framework and to show that unitary Hamming distance search MuD (uH-ds) approaches (1-LS, SA, RTS and STTS) reach the best convergence rates and that, among them, the 1-LS-MuD provides the best trade-off between implementation complexity and bit error rate (BER) performance.
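Of the algorithms compared, the 1-opt local search (1-LS) is the simplest to state: starting from an initial hard decision, test all candidates at unitary Hamming distance and keep the best improving single-bit flip until no flip improves the metric. The sketch below illustrates this generically in Python; the two-user maximum-likelihood-style metric, correlation matrix R and matched-filter output y are toy assumptions, not the paper's S/MIMO MC-CDMA model.

```python
import numpy as np

def one_opt_local_search(fitness, b0, max_iter=100):
    """Unitary-Hamming-distance (1-opt) local search over a binary vector.

    fitness : callable mapping a {0,1} vector to a real score (higher is better)
    b0      : initial detected bit vector, e.g. a matched-filter hard decision
    """
    b = b0.copy()
    best = fitness(b)
    for _ in range(max_iter):
        improved = False
        for k in range(len(b)):
            cand = b.copy()
            cand[k] ^= 1                          # flip one bit (Hamming distance 1)
            score = fitness(cand)
            if score > best:
                b, best, improved = cand, score, True
        if not improved:
            break                                  # local optimum reached
    return b, best

# toy usage: maximize an ML-style metric for a hypothetical 2-user problem
if __name__ == "__main__":
    R = np.array([[1.0, 0.3], [0.3, 1.0]])         # assumed correlation matrix
    y = np.array([0.9, -1.1])                      # assumed matched-filter output
    def fitness(bits):
        s = 2.0 * bits - 1.0                       # map {0,1} -> {-1,+1}
        return float(2.0 * s @ y - s @ R @ s)      # standard ML-type metric (sketch)
    print(one_opt_local_search(fitness, np.array([1, 0])))
```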

Relevance:

100.00%

Publisher:

Abstract:

The recent claim that the exit probability (EP) of a slightly modified version of the Sznajd model is a continuous function of the initial magnetization is questioned. That result was obtained analytically and confirmed by Monte Carlo simulations, simultaneously and independently, by two different groups (EPL, 82 (2008) 18006; 18007). It stands at odds with an earlier result which yielded a step function for the EP (Europhys. Lett., 70 (2005) 705). The dispute is investigated by proving that the continuous shape of the EP is a direct outcome of a mean-field treatment of the analytical result. As such, it is most likely caused by finite-size effects in the simulations. The improbable alternative would be a signature of the irrelevance of fluctuations in this system. Indeed, evidence is provided in support of the stepwise shape as going beyond the mean-field level. These findings yield new insight into the physics of one-dimensional systems with respect to the validity of a true equilibrium state when using solely local update rules. The suitability and significance of performing numerical simulations in those cases are discussed. To conclude, a great deal of caution is required when applying update rules to describe any system, especially social systems. Copyright (C) EPLA, 2011
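The Monte Carlo side of such a dispute amounts to running many independent realizations of the chain to consensus and recording how often the +1 state wins as a function of the initial composition. A minimal sketch for a generic 1D Sznajd-type rule (an agreeing neighbouring pair convinces its outer neighbours) is given below; the exact update rule of the disputed model variant differs in detail, so this is purely illustrative.

```python
import random

def exit_probability(N, p_up, runs=200):
    """Monte Carlo estimate of the exit probability (consensus on +1) for a
    1D Sznajd-type chain with an initial fraction p_up of +1 spins."""
    hits = 0
    for _ in range(runs):
        s = [1 if random.random() < p_up else -1 for _ in range(N)]
        while abs(sum(s)) != N:                    # iterate until consensus
            i = random.randrange(N - 1)
            if s[i] == s[i + 1]:                   # agreeing pair convinces
                if i - 1 >= 0:
                    s[i - 1] = s[i]                #   its outer neighbours
                if i + 2 < N:
                    s[i + 2] = s[i]
        hits += s[0] == 1
    return hits / runs

print([exit_probability(50, p) for p in (0.2, 0.4, 0.5, 0.6, 0.8)])
```

Sweeping p_up over [0, 1] for increasing chain sizes N is how finite-size effects on the shape of the exit probability (step-like versus continuous) would be probed.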

Relevance:

100.00%

Publisher:

Abstract:

The aim of this paper is to present an economical design of an X chart for short-run production. The process mean starts equal to mu(0) (in control, State I) and at a random time shifts to mu(1) > mu(0) (out of control, State II). The monitoring procedure consists of inspecting a single item for every m produced items. If the measurement of the quality characteristic does not meet the control limits, the process is stopped, adjusted, and an additional (r - 1) items are inspected retrospectively. The probabilistic model was developed considering only shifts in the process mean. A direct search technique is applied to find the optimum parameters that minimize the expected cost function. Numerical examples illustrate the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
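The direct search mentioned above can be as simple as a grid search over the design parameters once an expected-cost expression is in hand. The sketch below searches over the sampling interval m and the control-limit width k for a single inspected item; the cost expression and its parameters (delta, p_shift, c_insp, c_false, c_defect) are an illustrative stand-in, not the cost model of the paper.

```python
from statistics import NormalDist
from itertools import product

Phi = NormalDist().cdf

def expected_cost(m, k, delta=1.0, p_shift=0.01,
                  c_insp=1.0, c_false=50.0, c_defect=5.0):
    """Illustrative expected cost per produced item (assumed structure)."""
    alpha = 2.0 * (1.0 - Phi(k))                        # false alarm, single item
    power = 1.0 - (Phi(k - delta) - Phi(-k - delta))    # detection of a shift delta
    items_exposed = m / power                           # items made before a signal
    return (c_insp / m                                  # inspection cost
            + alpha * c_false / m                       # false-alarm cost
            + p_shift * c_defect * items_exposed)       # cost of shifted production

best_m, best_k = min(product(range(5, 101, 5), [x / 10 for x in range(20, 41)]),
                     key=lambda mk: expected_cost(*mk))
print("optimum m, k:", best_m, best_k, expected_cost(best_m, best_k))
```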

Relevance:

100.00%

Publisher:

Abstract:

This article deals with the efficiency of fractional integration parameter estimators. The study was based on Monte Carlo experiments involving simulated stochastic processes with integration orders in the open interval (-1, 1). The evaluated estimation methods were classified into two groups: heuristic and semiparametric/maximum likelihood (ML). The study revealed that the comparative efficiency of the estimators, measured by the smaller mean squared error, depends on the stationary/non-stationary and persistence/anti-persistence conditions of the series. The ML estimator was shown to be superior for stationary persistent processes; the wavelet spectrum-based estimators were better for non-stationary, mean-reverting and invertible anti-persistent processes; and the weighted periodogram-based estimator was shown to be superior for non-invertible anti-persistent processes.
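As an illustration of the kind of Monte Carlo experiment described, the sketch below simulates an ARFIMA(0, d, 0) process by truncated MA(infinity) filtering and estimates d with the semiparametric GPH log-periodogram regression, then reports bias and mean squared error; the specific estimator, sample size and bandwidth are illustrative choices, not the paper's full design.

```python
import numpy as np

def arfima0d0(n, d, rng):
    """Simulate ARFIMA(0, d, 0) by MA(inf) truncation: x_t = sum_k psi_k e_{t-k}."""
    e = rng.standard_normal(2 * n)
    psi = np.empty(n)
    psi[0] = 1.0
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k       # (1-B)^{-d} MA coefficients
    return np.convolve(e, psi)[n:2 * n]             # drop the burn-in segment

def gph(x, m=None):
    """Geweke-Porter-Hudak log-periodogram estimate of d."""
    n = len(x)
    m = int(n ** 0.5) if m is None else m           # bandwidth: first m frequencies
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)   # periodogram
    slope = np.polyfit(np.log(4 * np.sin(freqs / 2) ** 2), np.log(I), 1)[0]
    return -slope

rng = np.random.default_rng(1)
d_true = 0.3
est = np.array([gph(arfima0d0(1024, d_true, rng)) for _ in range(200)])
print("bias:", est.mean() - d_true, "MSE:", np.mean((est - d_true) ** 2))
```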

Relevance:

100.00%

Publisher:

Abstract:

SETTING: Chronic obstructive pulmonary disease (COPD) is the third leading cause of death among adults in Brazil. OBJECTIVE: To evaluate mortality and hospitalisation trends due to COPD in Brazil during the period 1996-2008. DESIGN: We used the official health statistics system to obtain data on mortality (1996-2008) and morbidity (1998-2008) due to COPD and all respiratory diseases (tuberculosis: codes A15-A16; lung cancer: code C34; and all diseases coded J40-J47 in the 10th Revision of the International Classification of Diseases) as the underlying cause, in persons aged 45-74 years. We used the Joinpoint Regression Program log-linear Poisson regression model, which applies a Monte Carlo permutation test to identify points where trend lines change significantly in magnitude or direction, to verify peaks and trends. RESULTS: The annual per cent change in age-adjusted death rates due to COPD declined by 2.7% (95%CI -3.6 to -1.8) in men and by 2.0% (95%CI -2.9 to -1.0) in women; due to all respiratory causes, it declined by 1.7% (95%CI -2.4 to -1.0) in men and by 1.1% (95%CI -1.8 to -0.3) in women. Although hospitalisation rates for COPD are declining, the hospital admission fatality rate increased in both sexes. CONCLUSION: COPD is still a leading cause of mortality in Brazil despite the observed decline in mortality/hospitalisation rates for both sexes.
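The annual per cent change (APC) figures reported here come from log-linear Poisson trend models; the real analysis used the Joinpoint Regression Program with its Monte Carlo permutation test for change points, but the basic APC computation can be sketched as below on synthetic data (population sizes, baseline rate and the roughly -2.7%/year trend are made up for illustration).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
years = np.arange(1996, 2009)
pop = np.full(years.size, 1_000_000)                     # hypothetical population at risk
true_rate = 60e-5 * np.exp(-0.027 * (years - 1996))      # ~-2.7%/year, illustrative
deaths = rng.poisson(pop * true_rate)                    # synthetic death counts

# Poisson log-linear trend: log(rate) = b0 + b1 * (year - 1996), offset = log(pop)
X = sm.add_constant(years - years[0])
fit = sm.GLM(deaths, X, family=sm.families.Poisson(), offset=np.log(pop)).fit()
apc = 100 * (np.exp(fit.params[1]) - 1)                  # annual per cent change
lo, hi = 100 * (np.exp(fit.conf_int()[1]) - 1)
print(f"APC = {apc:.1f}% (95% CI {lo:.1f} to {hi:.1f})")
```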

Relevance:

100.00%

Publisher:

Abstract:

Radiation dose calculations in nuclear medicine depend on quantification of activity via planar and/or tomographic imaging methods. However, both methods have inherent limitations, and the accuracy of activity estimates varies with object size, background levels, and other variables. The goal of this study was to evaluate the limitations of quantitative imaging with planar and single photon emission computed tomography (SPECT) approaches, with a focus on activity quantification for use in calculating absorbed dose estimates for normal organs and tumors. To do this, we studied a series of phantoms of varying geometric complexity, with three radionuclides whose decay schemes varied from simple to complex. Four aqueous concentrations of (99m)Tc, (131)I, and (111)In (74, 185, 370, and 740 kBq mL(-1)) were placed in spheres of four different sizes in a water-filled phantom, with three different levels of activity in the surrounding water. Planar and SPECT images of the phantoms were obtained on a modern SPECT/computed tomography (CT) system. These radionuclide and concentration/background studies were repeated using a cardiac phantom and a modified torso phantom with liver and "tumor" regions containing the radionuclide concentrations and with the same varying background levels. Planar quantification was performed using the geometric mean approach, with attenuation correction (AC), and with and without scatter correction (SC and NSC). SPECT images were reconstructed using attenuation maps (AM) for AC; scatter windows were used to perform SC during image reconstruction. For spherical sources with corrected data, good accuracy was observed (generally within +/- 10% of known values) for the largest sphere (11.5 mL) with both planar and SPECT methods for (99m)Tc and (131)I, but results were poorest and deviated most from known values for smaller objects, most notably for (111)In. SPECT quantification was affected by the partial volume effect in smaller objects and generally showed larger errors than the planar results in these cases for all radionuclides. For the cardiac phantom, results were the most accurate of all of the experiments for all radionuclides. Background subtraction was an important factor influencing these results. The contribution of scattered photons was important in quantification with (131)I; if scatter was not accounted for, activity tended to be overestimated using planar quantification methods. For the torso phantom experiments, results show a clear underestimation of activity when compared to the previous experiments with spherical sources for all radionuclides. Despite some variations observed as the level of background increased, the SPECT results were more consistent across different activity concentrations. Planar or SPECT quantification on state-of-the-art gamma cameras with appropriate quantitative processing can provide accuracies better than 10% for large objects and modest target-to-background concentrations; however, for smaller objects, in the presence of higher background, and for nuclides with more complex decay schemes, SPECT quantification methods generally produce better results. Health Phys. 99(5):688-701; 2010
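Planar quantification by the geometric-mean (conjugate-view) approach used here reduces, in its textbook form, to a closed-form expression combining the anterior and posterior counts with an attenuation correction and an optional source self-attenuation factor. The sketch below shows that generic expression; the argument names and the calibration convention are placeholders, not the study's acquisition parameters.

```python
from math import sqrt, exp, sinh

def conjugate_view_activity(I_ant, I_post, mu, T, cal, t_src=0.0, mu_src=None):
    """Conjugate-view (geometric-mean) planar activity estimate.

    I_ant, I_post : background- and scatter-corrected count rates in the
                    anterior and posterior views (counts/s)
    mu, T         : effective linear attenuation coefficient (1/cm) and
                    patient/phantom thickness (cm) along the view axis
    cal           : system calibration factor (counts/s per MBq in air)
    t_src, mu_src : optional source-region thickness (cm) and attenuation,
                    used for the self-attenuation correction factor f
    """
    f = 1.0
    if t_src > 0.0:
        mu_s = mu if mu_src is None else mu_src
        f = (mu_s * t_src / 2.0) / sinh(mu_s * t_src / 2.0)
    return sqrt(I_ant * I_post) * exp(mu * T / 2.0) * f / cal

# toy usage with made-up numbers: ~5 cm source in a 20 cm thick phantom
print(conjugate_view_activity(I_ant=1200.0, I_post=800.0, mu=0.12, T=20.0,
                              cal=300.0, t_src=5.0))
```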

Relevance:

100.00%

Publisher:

Abstract:

The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the focus of many studies in neuroscience. The complex neural network structure and its correlations with brain functions have played a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for exploring and characterizing the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROI) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROI(s). In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method is to employ multiple eigen-time series in each ROI to avoid temporal information loss during identification of Granger causality. Such information loss is inherent in averaging (e.g., to yield a single "representative" time series per ROI) and may, in turn, lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping. Using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using a representative time series extracted from each ROI by signal averaging or first-principal-component estimation). The usefulness of the CGA approach with real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI. (c) 2010 Elsevier Inc. All rights reserved.
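The conventional baseline that CGA is compared against, a single representative time series per ROI (for example its first principal eigen-time series) followed by a pairwise Granger test, can be sketched compactly. The code below shows that baseline, not the proposed cluster Granger analysis; the lag order, ROI sizes and data are illustrative.

```python
import numpy as np
from scipy.stats import f as f_dist

def first_pc(roi):
    """First principal eigen-time series of an ROI (roi: time x voxels array)."""
    roi = roi - roi.mean(axis=0)
    u, s, vt = np.linalg.svd(roi, full_matrices=False)
    return u[:, 0] * s[0]

def granger_f_test(x, y, p=2):
    """Does x Granger-cause y? Compare AR(p) of y with and without lags of x."""
    T = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:T - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:T - k] for k in range(1, p + 1)])
    X_r = np.column_stack([np.ones(T - p), lags_y])            # restricted model
    X_f = np.column_stack([X_r, lags_x])                       # full model
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    dfn, dfd = p, T - p - X_f.shape[1]
    F = ((rss(X_r) - rss(X_f)) / dfn) / (rss(X_f) / dfd)
    return F, f_dist.sf(F, dfn, dfd)                           # F statistic, p value

# toy usage with surrogate ROI data (time x voxels)
rng = np.random.default_rng(0)
roi_a, roi_b = rng.standard_normal((200, 30)), rng.standard_normal((200, 25))
print(granger_f_test(first_pc(roi_a), first_pc(roi_b), p=2))
```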

Relevance:

100.00%

Publisher:

Abstract:

Background: At least for a subset of patients, the clinical diagnosis of mild cognitive impairment (MCI) may represent an intermediate stage between normal aging and dementia. Nevertheless, the patterns of transition of cognitive states between normal cognitive aging and MCI and on to dementia are not well established. In this study we address the pattern of transitions between cognitive states in patients with MCI and healthy controls prior to conversion to dementia. Methods: 139 subjects (78% women; mean age, 68.5 +/- 6.1 years; mean educational level, 11.7 +/- 5.4 years) were consecutively assessed in a memory clinic with a standardized clinical and neuropsychological protocol and classified as cognitively healthy (normal controls) or as having MCI (including subtypes) at baseline. These subjects underwent annual reassessments (mean duration of follow-up: 2.7 +/- 1.1 years), in which cognitive state was ascertained independently of prior diagnoses. The pattern of transitions of the cognitive state was determined by Markov chain analysis. Results: The transitions from one cognitive state to another varied substantially between MCI subtypes. Single-domain MCI (amnestic and non-amnestic) more frequently returned to a normal cognitive state upon follow-up (22.5% and 21%, respectively). Among subjects who progressed to Alzheimer's disease (AD), the most common diagnosis immediately prior to conversion was multiple-domain MCI (85%). Conclusion: The clinical diagnosis of MCI and its subtypes yields groups of patients with heterogeneous patterns of transition from one given cognitive state to another. The presence of more severe and widespread cognitive deficits, as indicated by multiple-domain amnestic MCI, may be a better predictor of AD than single-domain amnestic or non-amnestic deficits. These higher-risk individuals may be the best candidates for the development of preventive strategies and early treatment for the disease.
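The Markov chain analysis mentioned in the Methods amounts to estimating a transition probability matrix from the observed year-to-year diagnoses. A minimal sketch, with hypothetical state labels and follow-up sequences, is given below.

```python
import numpy as np

STATES = ["normal", "sd-aMCI", "sd-naMCI", "md-MCI", "dementia"]   # illustrative labels
IDX = {s: i for i, s in enumerate(STATES)}

def transition_matrix(sequences):
    """Maximum-likelihood transition matrix from annual state sequences.

    sequences: list of per-subject lists of state labels, one label per visit.
    """
    counts = np.zeros((len(STATES), len(STATES)))
    for seq in sequences:
        for a, b in zip(seq[:-1], seq[1:]):        # count observed transitions
            counts[IDX[a], IDX[b]] += 1
    with np.errstate(invalid="ignore"):
        P = counts / counts.sum(axis=1, keepdims=True)   # row-normalize
    return np.nan_to_num(P)                        # rows never visited stay zero

# toy example with two hypothetical follow-up sequences
print(transition_matrix([["normal", "sd-aMCI", "normal"],
                         ["md-MCI", "md-MCI", "dementia"]]))
```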