835 results for sampling methodology
Abstract:
A study of tin deposits from Priamurye (Russia) is performed to analyze the differences between them based on their origin and also on commercial criteria. A particular analysis based on their vertical zonality is also given for samples from the Solnechnoe deposit. All statistical analyses are made on the subcomposition formed by seven trace elements in cassiterite (In, Sc, Be, W, Nb, Ti and V), using Aitchison's methodology for the analysis of compositional data.
Abstract:
One of the key aspects of 3D-image registration is the computation of the joint intensity histogram. We propose a new approach that computes this histogram using uniformly distributed random lines to stochastically sample the overlapping volume between two 3D images. The intensity values are captured along the lines at evenly spaced positions, with a different initial random offset for each line. This method provides accurate, robust and fast mutual-information-based registration. Interpolation effects are drastically reduced, due to the stochastic nature of the line generation, and the alignment process is also accelerated. The results show that the introduced method outperforms the classic computation of the joint histogram.
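The line-sampling scheme described in this abstract can be sketched in Python. This is a minimal illustration (nearest-neighbour lookup instead of interpolation; all function and parameter names are ours, not the authors'): random lines are cast through the volume, intensity pairs are read off at evenly spaced positions with a per-line random offset, and mutual information is computed from the resulting joint histogram.

```python
import numpy as np

def mutual_information_line_sampling(vol_a, vol_b, n_lines=200, step=1.0,
                                     bins=32, rng=None):
    """Estimate mutual information between two co-registered volumes by
    sampling intensities along uniformly distributed random lines
    (simplified sketch of the approach described in the abstract)."""
    rng = np.random.default_rng(rng)
    shape = np.array(vol_a.shape, dtype=float)
    samples_a, samples_b = [], []
    for _ in range(n_lines):
        # Random starting point inside the volume, isotropic random direction.
        p0 = rng.uniform(0, shape - 1)
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)
        t = rng.uniform(0, step)            # random initial offset per line
        while True:
            p = p0 + t * d
            idx = np.round(p).astype(int)   # nearest-neighbour lookup
            if np.any(idx < 0) or np.any(idx >= vol_a.shape):
                break                       # line has left the volume
            samples_a.append(vol_a[tuple(idx)])
            samples_b.append(vol_b[tuple(idx)])
            t += step                       # evenly spaced positions
    # Joint histogram of the sampled intensity pairs, then MI.
    hist, _, _ = np.histogram2d(samples_a, samples_b, bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

Identical volumes yield a near-diagonal joint histogram and a high MI; independent volumes yield MI close to zero, which is the signal the registration optimizer climbs.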
Abstract:
Over the past decade, significant interest has been expressed in relating the spatial statistics of surface-based reflection ground-penetrating radar (GPR) data to those of the imaged subsurface volume. A primary motivation for this work is that changes in the radar wave velocity, which largely control the character of the observed data, are expected to be related to corresponding changes in subsurface water content. Although previous work has indeed indicated that the spatial statistics of GPR images are linked to those of the water content distribution of the probed region, a viable method for quantitatively analyzing the GPR data and solving the corresponding inverse problem has not yet been presented. Here we address this issue by first deriving a relationship between the 2-D autocorrelation of a water content distribution and that of the corresponding GPR reflection image. We then show how a Bayesian inversion strategy based on Markov chain Monte Carlo sampling can be used to estimate the posterior distribution of subsurface correlation model parameters that are consistent with the GPR data. Our results indicate that if the underlying assumptions are valid and we possess adequate prior knowledge regarding the water content distribution, in particular its vertical variability, this methodology allows not only for the reliable recovery of lateral correlation model parameters but also for estimates of parameter uncertainties. In the case where prior knowledge regarding the vertical variability of water content is not available, the results show that the methodology still reliably recovers the aspect ratio of the heterogeneity.
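The Markov chain Monte Carlo step referred to above can be illustrated with a generic random-walk Metropolis sampler. This is a toy sketch, not the authors' inversion code: the target here is a simple Gaussian-mean posterior rather than a posterior over correlation model parameters, but the accept/reject mechanics are the same.

```python
import numpy as np

def metropolis(log_post, theta0, n_steps=5000, prop_sd=0.1, rng=None):
    """Random-walk Metropolis sampler: draws from a posterior known only
    up to a constant via its log density (generic MCMC sketch)."""
    rng = np.random.default_rng(rng)
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    lp = log_post(theta)
    chain = np.empty((n_steps, theta.size))
    for i in range(n_steps):
        cand = theta + rng.normal(scale=prop_sd, size=theta.size)
        lp_cand = log_post(cand)
        # Accept with probability min(1, posterior ratio).
        if np.log(rng.uniform()) < lp_cand - lp:
            theta, lp = cand, lp_cand
        chain[i] = theta
    return chain

# Toy target: posterior of a Gaussian mean (unit variance, flat prior)
# given 50 synthetic observations centred on 2.0.
rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=50)
log_post = lambda th: -0.5 * np.sum((data - th[0]) ** 2)
chain = metropolis(log_post, [0.0], rng=1)
```

After burn-in, the chain's histogram approximates the posterior, so parameter uncertainties fall out of the same run as the point estimates, which is exactly the benefit the abstract highlights.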
Abstract:
BACKGROUND Only multifaceted, hospital-wide interventions have been successful in achieving sustained improvements in hand hygiene (HH) compliance. METHODOLOGY/PRINCIPAL FINDINGS Pre-post intervention study of HH performance at baseline (October 2007-December 2009) and during an intervention comprising two phases. Phase 1 (2010) included the multimodal WHO approach. Phase 2 (2011) added Continuous Quality Improvement (CQI) tools and was based on: a) increased placement of alcohol hand rub (AHR) dispensers (from 0.57 dispensers/bed to 1.56); b) increased frequency of audits (three days every three weeks: "3/3 strategy"); c) implementation of a standardized registration form for HH corrective actions; d) Statistical Process Control (SPC) as the time-series analysis methodology, through appropriate control charts. During the intervention period we performed 819 scheduled direct observation audits, which provided data from 11,714 HH opportunities. The most remarkable findings were: a) significant improvement in HH compliance with respect to baseline (25% mean increase); b) a sustained high level (82%) of HH compliance during the intervention; c) a significant increase in AHR consumption over time; d) a significant decrease in the rate of healthcare-acquired MRSA; e) small but significant improvements in HH compliance when comparing phase 2 to phase 1 [79.5% (95% CI: 78.2-80.7) vs 84.6% (95% CI: 83.8-85.4), p<0.05]; f) successful use of control charts to identify significant negative and positive deviations (special causes) in the HH compliance process over time ("positive": 90.1%, the highest HH compliance, coinciding with "World Hygiene Day"; "negative": 73.7%, the lowest HH compliance, coinciding with a statutory lay-off proceeding). CONCLUSIONS/SIGNIFICANCE CQI tools may be a key addition to the WHO strategy for maintaining good HH performance over time.
In addition, SPC proved to be a powerful methodology for detecting special causes (positive and negative) in HH performance and for helping to establish adequate feedback to healthcare workers.
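The SPC control charts used in this study can be sketched generically. A p-chart is the natural choice for per-audit compliance proportions: points outside the 3-sigma limits flag special causes, like the negative deviation the authors trace to the lay-off proceeding. The function below is our own illustrative sketch, not the study's implementation, and the audit numbers are invented.

```python
import numpy as np

def p_chart(compliant, opportunities):
    """Build a p-chart for per-audit compliance proportions: centre line,
    3-sigma control limits, and special-cause flags (generic SPC sketch)."""
    compliant = np.asarray(compliant, dtype=float)
    n = np.asarray(opportunities, dtype=float)
    p = compliant / n                                 # per-audit proportion
    pbar = compliant.sum() / n.sum()                  # centre line
    sigma = np.sqrt(pbar * (1 - pbar) / n)            # binomial sigma per point
    ucl = pbar + 3 * sigma
    lcl = np.clip(pbar - 3 * sigma, 0.0, None)
    special = (p > ucl) | (p < lcl)                   # special-cause signals
    return p, pbar, ucl, lcl, special

# Hypothetical audits: ~80% compliance with one sharp drop.
p, pbar, ucl, lcl, special = p_chart([82, 80, 84, 83, 55, 81], [100] * 6)
```

Only the 55% audit falls below the lower control limit, so it would be investigated as a special cause rather than dismissed as routine variation.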
Field optimisation of MosquiTRAP sampling for monitoring Aedes aegypti Linnaeus (Diptera: Culicidae)
Abstract:
A sticky trap designed to capture gravid Aedes (Stegomyia) aegypti mosquitoes, MosquiTRAP, has been evaluated for monitoring this species in Brazil. However, the effects of trap density on the capture rate of Ae. aegypti females and on the sensitivity of vector detection are still unknown. After a preliminary study identified areas of high and low female mosquito abundance, a set of experiments was conducted in four neighbourhoods of Belo Horizonte (state of Minas Gerais, Brazil) using densities of 1, 2, 4, 8, 16, 32 and 64 traps per block. Trap sensitivity (positive MosquiTRAP index) increased significantly when 1-8 MosquiTRAPs were installed per block, in both high- and low-abundance areas. A strong fit (r² = 0.994) was obtained for the total number of mosquitoes captured as a function of trap density using a non-linear (Box-Lucas) model, which exhibits saturation towards an equilibrium level. The capacity of the Mean Female Aedes Index to distinguish between areas of high and low Ae. aegypti abundance was also investigated; the achieved differentiation was shown to depend on MosquiTRAP density.
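The Box-Lucas model fitted above has the saturating form y = a(1 − e^(−bx)). A minimal fitting sketch with SciPy is shown below; the density values match the study's trap densities, but the capture numbers and parameter values are purely illustrative, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def box_lucas(x, a, b):
    """Box-Lucas saturation model: rises towards the asymptote a
    at a rate controlled by b."""
    return a * (1.0 - np.exp(-b * x))

# Illustrative capture counts generated from the model itself
# (a = 120 total captures at saturation, b = 0.12 per trap).
density = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)
captures = box_lucas(density, 120.0, 0.12)

# Non-linear least-squares fit recovers the two parameters.
params, _ = curve_fit(box_lucas, density, captures, p0=(100.0, 0.1))
```

The fitted asymptote a is the "equilibrium level" the abstract mentions: beyond it, adding traps per block yields diminishing returns in total captures.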
Abstract:
The method of sample recovery for trace detection and identification of explosives plays a critical role in many criminal investigations. After a bombing, there can be difficulties in sending large objects to a laboratory for analysis. Traces can also be searched for on large surfaces, on the hands of suspects or on surfaces where the explosive was placed during preparatory phases (e.g. places where an IED was assembled, vehicles used for transportation, etc.). In this work, triacetone triperoxide (TATP) was synthesized from commercial precursors following reported methods. Several portions of about 6 mg of TATP were then spread on different surfaces (e.g. floors, tables, etc.) or used in handling tests. Three different swabbing systems were used: a commercial swab pre-wetted with propan-2-ol (isopropanol) and water (7:3), dry paper swabs, and cotton swabs wetted with propan-2-ol. Paper and commercial swabs were also used to sample a metal plate where a small charge of about 4 g of TATP was detonated. Swabs were sealed in small glass jars with screw caps and Parafilm® M and sent to the laboratory for analysis. Swabs were extracted and analysed several weeks later by gas chromatography/mass spectrometry. All three systems gave positive results, but wetted swabs collected larger amounts of TATP. The developed procedure showed its suitability for use in real cases, allowing TATP detection in several simulations, including a situation in which people washed their hands after handling the explosive.
Abstract:
BACKGROUND Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in Alzheimer's Disease (AD) diagnosis. However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to lie within a predefined brain activation mask. To address the small sample-size problem, the dimension of the feature space was further reduced by: Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the latter two also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and Energy-based metrics were compared. RESULTS Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: i) a linear transformation of the PLS- or PCA-reduced data, ii) a feature reduction technique, and iii) a classifier (with Euclidean, Mahalanobis or Energy-based metrics). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS All the proposed methods turned out to be a valid solution for the presented problem.
One advance is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also, in combination with NMSE and PLS, makes this rate more stable. A further advance is the generalization ability of the methods, demonstrated by experiments on two image modalities (SPECT and PET).
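The overall pattern of this pipeline, reducing a high-dimensional voxel feature space before classifying with an SVM under k-fold cross-validation, can be sketched with scikit-learn. This is a simplified stand-in: it uses PCA+SVM on synthetic data, whereas the paper's best variant combines NMSE features, PLS and an LMNN transformation; the data shapes and numbers below are invented.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for image-derived features: 60 "subjects",
# 500 features, two classes separated along a few informative dimensions
# (the classic small-sample-size regime: features >> subjects).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))
y = np.repeat([0, 1], 30)
X[y == 1, :5] += 2.0          # class 1 shifted on 5 informative features

# Dimensionality reduction before the SVM mitigates overfitting in the
# small-sample-size regime; PCA here plays the role of PLS/LMNN above.
clf = make_pipeline(PCA(n_components=10), SVC(kernel="linear"))
acc = cross_val_score(clf, X, y, cv=5).mean()
```

Fitting the reducer inside the cross-validation pipeline (rather than on the full data set) is what keeps the reported accuracy honest, which is why pipelines are the idiomatic way to evaluate such systems.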
Abstract:
Distribution, abundance, feeding behaviour, host preference, parity status and human-biting and infection rates are among the medical entomological parameters evaluated when determining the vector capacity of mosquito species. To evaluate these parameters, mosquitoes must be collected using an appropriate method. Malaria is primarily transmitted by anthropophilic and synanthropic anophelines. Thus, collection methods must result in the identification of the anthropophilic species and efficiently evaluate the parameters involved in malaria transmission dynamics. Consequently, human landing catches would be the most appropriate method if not for their inherent risk. The choice of alternative anopheline collection methods, such as traps, must consider their effectiveness in reproducing the efficiency of human attraction. Collection methods lure mosquitoes by using a mixture of olfactory, visual and thermal cues. Here, we reviewed, classified and compared the efficiency of anopheline collection methods, with an emphasis on Neotropical anthropophilic species, especially Anopheles darlingi, in distinct malaria epidemiological conditions in Brazil.
Abstract:
BACKGROUND Missed, delayed or incorrect diagnoses are considered to be diagnostic errors. The aim of this paper is to describe the methodology of a study to analyse cognitive aspects of the process by which primary care (PC) physicians diagnose dyspnoea. It examines the possible links between the use of heuristics, suboptimal cognitive acts and diagnostic errors, using Reason's taxonomy of human error (slips, lapses, mistakes and violations). The influence of situational factors (professional experience, perceived overwork and fatigue) is also analysed. METHODS Cohort study of new episodes of dyspnoea in patients receiving care from family physicians and residents at PC centres in Granada (Spain). With an initial expected diagnostic error rate of 20%, and a sampling error of 3%, 384 episodes of dyspnoea are calculated to be required. In addition to filling out the electronic medical record of the patients attended, each physician fills out 2 specially designed questionnaires about the diagnostic process performed in each case of dyspnoea. The first questionnaire includes questions on the physician's initial diagnostic impression, the 3 most likely diagnoses (in order of likelihood), and the diagnosis reached after the initial medical history and physical examination. It also includes items on the physicians' perceived overwork and fatigue during patient care. The second questionnaire records the confirmed diagnosis once it is reached. The complete diagnostic process is peer-reviewed to identify and classify the diagnostic errors. The possible use of heuristics of representativeness, availability, and anchoring and adjustment in each diagnostic process is also analysed. Each audit is reviewed with the physician responsible for the diagnostic process. Finally, logistic regression models are used to determine if there are differences in the diagnostic error variables based on the heuristics identified. 
DISCUSSION This work sets out a new approach to studying the diagnostic decision-making process in PC, taking advantage of new technologies which allow immediate recording of the decision-making process.
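For reference, the classical (Cochran) formula for the sample size needed to estimate a proportion with a given margin of error is shown below. The parameter values are illustrative: the widely quoted figure of 384 corresponds to the most conservative case p = 0.5 with a 5% margin at 95% confidence, and the study's own calculation may include additional corrections (e.g. for expected attrition).

```python
import math

def sample_size_proportion(p, margin, z=1.96):
    """Cochran's sample-size formula for estimating a proportion:
    n = z^2 * p * (1 - p) / margin^2, at confidence given by z."""
    return z * z * p * (1.0 - p) / (margin * margin)

# Most conservative case: p = 0.5, 5% margin, 95% confidence.
n = sample_size_proportion(0.5, 0.05)
```

In practice the result is rounded up to the next whole participant, and inflated further if dropouts or clustering effects are expected.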
Abstract:
The impact of radial k-space sampling and water-selective excitation on a novel navigator-gated cardiac-triggered slab-selective inversion prepared 3D steady-state free-precession (SSFP) renal MR angiography (MRA) sequence was investigated. Renal MRA was performed on a 1.5-T MR system using three inversion prepared SSFP approaches: Cartesian (TR/TE: 5.7/2.8 ms, FA: 85 degrees), radial (TR/TE: 5.5/2.7 ms, FA: 85 degrees) SSFP, and radial SSFP combined with water-selective excitation (TR/TE: 9.9/4.9 ms, FA: 85 degrees). Radial data acquisition led to significantly reduced motion artifacts (P < 0.05). SNR and CNR were best using Cartesian SSFP (P < 0.05). Vessel sharpness and vessel length were comparable in all sequences. The addition of a water-selective excitation could not improve image quality. In conclusion, radial k-space sampling reduces motion artifacts significantly in slab-selective inversion prepared renal MRA, while SNR and CNR are decreased. The addition of water-selective excitation could not improve the lower CNR in radial scanning.
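The difference between Cartesian and radial k-space sampling can be made concrete with a small coordinate sketch. In a radial scheme, every readout is a spoke through the centre of k-space, so the low spatial frequencies are re-sampled on every spoke, which averages out motion-induced inconsistencies and is one intuition for the reduced motion artifacts reported above. The code is a simplified 2D analogue with names of our own choosing.

```python
import numpy as np

def radial_spokes(n_spokes, n_samples):
    """Generate 2D radial k-space sample coordinates: n_spokes diameters
    through the k-space centre at uniformly incremented angles
    (simplified 2D sketch of a radial trajectory)."""
    angles = np.arange(n_spokes) * np.pi / n_spokes   # spoke orientations
    r = np.linspace(-0.5, 0.5, n_samples)             # normalised radius
    kx = np.outer(np.cos(angles), r)                  # shape (spokes, samples)
    ky = np.outer(np.sin(angles), r)
    return kx, ky

kx, ky = radial_spokes(64, 128)
```

A Cartesian trajectory, by contrast, visits the k-space centre on only one phase-encode line, so motion during that single line corrupts the image contrast globally.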
Abstract:
PURPOSE: To investigate the potential of free-breathing 3D steady-state free precession (SSFP) imaging with radial k-space sampling for coronary MR-angiography (MRA), coronary projection MR-angiography and coronary vessel wall imaging. MATERIALS AND METHODS: A navigator-gated free-breathing T2-prepared 3D SSFP sequence (TR = 6.1 ms, TE = 3.0 ms, flip angle = 120 degrees, field-of-view = 360 mm(2)) with radial k-space sampling (384 radials) was implemented for coronary MRA. For projection coronary MRA, this sequence was combined with a 2D selective aortic spin tagging pulse. Coronary vessel wall imaging was performed using a high-resolution inversion-recovery black-blood 3D radial SSFP sequence (384 radials, TR = 5.3 ms, TE = 2.7 ms, flip angle = 55 degrees, reconstructed resolution 0.35 x 0.35 x 1.2 mm(3)) and a local re-inversion pulse. Six healthy volunteers (two for each sequence) were investigated. Motion artifact level was assessed by two radiologists. RESULTS: In coronary MRA, the coronary lumen was displayed with high signal and high contrast to the surrounding tissue. Projection coronary MRA demonstrated selective visualization of the coronary lumen while surrounding tissue was almost completely suppressed. In coronary vessel wall imaging, the vessel wall was displayed with a high signal when compared to the blood pool and the surrounding tissue. No visible motion artifacts were seen. CONCLUSION: 3D radial SSFP imaging enables coronary MRA, coronary projection MRA and coronary vessel wall imaging with a low motion artifact level.
Abstract:
It has been shown that the accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, with a dense breast drastically reducing detection sensitivity. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. Here, we describe the development of an automatic breast tissue classification methodology, which can be summarized in a number of distinct steps: 1) the segmentation of the breast area into fatty versus dense mammographic tissue; 2) the extraction of morphological and texture features from the segmented breast areas; and 3) the use of a Bayesian combination of a number of classifiers. The evaluation, based on a large number of cases from two different mammographic data sets, shows a strong correlation ( and 0.67 for the two data sets) between automatic and expert-based Breast Imaging Reporting and Data System (BI-RADS) mammographic density assessment.
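Step 3, the Bayesian combination of classifiers, can be sketched generically with the product rule: under an assumption of conditional independence between the classifiers, the combined posterior is proportional to the class prior times the product of each classifier's posterior divided by that prior. This is our own generic sketch, with invented numbers; the paper's exact combination scheme may differ.

```python
import numpy as np

def bayesian_combination(posteriors, priors=None):
    """Combine per-classifier posterior probabilities with the product
    rule: P(c | x_1..x_K) is proportional to
    P(c) * prod_k [ P_k(c | x_k) / P(c) ],
    assuming the classifiers are conditionally independent given c."""
    posteriors = np.asarray(posteriors, dtype=float)  # (K, n_classes)
    n_classes = posteriors.shape[1]
    if priors is None:
        priors = np.full(n_classes, 1.0 / n_classes)  # uniform class prior
    combined = priors * np.prod(posteriors / priors, axis=0)
    return combined / combined.sum()                  # renormalise

# Three hypothetical tissue classifiers voting over three density classes.
p = bayesian_combination([[0.6, 0.3, 0.1],
                          [0.5, 0.4, 0.1],
                          [0.7, 0.2, 0.1]])
```

The product rule rewards agreement: three moderately confident votes for the first class combine into a much more confident joint posterior than any single classifier held, whereas a simple average would stay moderate.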