811 results for Algorithm Calibration
Abstract:
OBJECTIVE: To review the available knowledge on the epidemiology and diagnosis of acute infections in children aged 2 to 59 months in primary care settings, and to develop an electronic algorithm for the Integrated Management of Childhood Illness (IMCI) that achieves optimal clinical outcomes and rational use of medicines. METHODS: A structured literature review in Medline, Embase and the Cochrane Database of Systematic Reviews (CDSR) looked for available estimates of disease prevalence in outpatients aged 2-59 months, and for available evidence on i) the accuracy of clinical predictors and ii) the performance of point-of-care tests for the targeted diseases. A new algorithm for the management of childhood illness (ALMANACH) was designed based on the evidence retrieved and on the results of a study on the etiologies of fever in Tanzanian pediatric outpatients. FINDINGS: The major changes in ALMANACH compared to IMCI (2008 version) are the following: i) assessment of 10 danger signs; ii) classification of non-severe children into febrile and non-febrile illness, the latter receiving no antibiotics; iii) classification of pneumonia based on a respiratory rate threshold of 50 breaths/min, assessed twice, for febrile children aged 12-59 months; iv) a malaria rapid diagnostic test performed for all febrile children. In the absence of an identified source of fever at the end of the assessment: v) a urine dipstick performed for febrile children <2 years to consider urinary tract infection; vi) classification of 'possible typhoid' for febrile children >2 years with abdominal tenderness; and lastly vii) classification of 'likely viral infection' in case of negative results. CONCLUSION: This smartphone-run algorithm, based on new evidence and two point-of-care tests, should improve the quality of care for children under 5 years and lead to more rational use of antimicrobials.
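The decision rules above translate naturally into branching logic. Below is a minimal Python sketch of the febrile-child portion of that logic, assuming simplified boolean and numeric inputs; the function name, argument names and the use of the lower of the two respiratory-rate readings are illustrative assumptions, not the authors' implementation.

    def classify_febrile_child(age_months, resp_rate_1, resp_rate_2,
                               malaria_rdt_positive, abdominal_tenderness,
                               urine_dipstick_positive=False):
        """Sketch of ALMANACH-style classification for a non-severe febrile child."""
        classifications = []

        # iii) pneumonia: respiratory rate threshold of 50, assessed twice,
        #      for febrile children aged 12-59 months (both readings must reach it)
        if 12 <= age_months < 60 and min(resp_rate_1, resp_rate_2) >= 50:
            classifications.append("pneumonia")

        # iv) malaria rapid diagnostic test performed for all febrile children
        if malaria_rdt_positive:
            classifications.append("malaria")

        # v)-vii) apply only when no source of fever was identified
        if not classifications:
            if age_months < 24 and urine_dipstick_positive:
                classifications.append("urinary tract infection")
            elif age_months >= 24 and abdominal_tenderness:
                classifications.append("possible typhoid")
            else:
                classifications.append("likely viral infection")

        return classifications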
Abstract:
Intravascular brachytherapy with beta sources has become a useful technique to prevent restenosis after cardiovascular intervention. In particular, the Beta-Cath high-dose-rate system, manufactured by Novoste Corporation, is a commercially available 90Sr/90Y source for intravascular brachytherapy that has achieved widespread use. Its dosimetric characterization has attracted considerable attention in recent years. Unfortunately, the short ranges of the emitted beta particles and the associated large dose gradients make experimental measurements particularly difficult. This circumstance has motivated the appearance of a number of papers addressing the characterization of this source by means of Monte Carlo simulation techniques.
Abstract:
Chemical analysis is a well-established procedure for the provenancing of archaeological ceramics. Various analytical techniques are routinely used, and large amounts of data have accumulated so far in data banks. However, in order to exchange results obtained by different laboratories, the respective analytical procedures need to be tested in terms of their inter-comparability. In this study, the schemes of analysis used in four laboratories that are involved in archaeological pottery studies on a routine basis were compared. The techniques investigated were neutron activation analysis (NAA), X-ray fluorescence analysis (XRF), inductively coupled plasma optical emission spectrometry (ICP-OES) and inductively coupled plasma mass spectrometry (ICP-MS). For this comparison, series of measurements on different geological standard reference materials (SRMs) were carried out and the results were statistically evaluated. An attempt was also made to establish calibration factors between pairs of analytical setups in order to smooth out the systematic differences among the results.
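As a concrete illustration of that last step, a pairwise calibration factor can be estimated per element from measurements of the same SRMs in two laboratories. The sketch below assumes each lab reports concentrations for the same materials and elements; the data and the simple ratio-of-means estimator are illustrative, not the study's actual procedure.

    import numpy as np

    # rows: SRM samples, columns: elements (e.g. Fe, Cr, Ce), in ppm
    lab_a = np.array([[52.1, 10.4, 66.0],
                      [48.7,  9.8, 61.5]])
    lab_b = np.array([[49.9, 11.1, 63.1],
                      [46.5, 10.3, 58.8]])

    # per-element factor mapping lab B results onto lab A's scale
    factors = lab_a.mean(axis=0) / lab_b.mean(axis=0)
    lab_b_calibrated = lab_b * factors
    print("calibration factors:", factors)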
Abstract:
Coherent anti-Stokes Raman scattering (CARS) is a powerful laser spectroscopy method in which significant successes have been achieved. However, the non-linear nature of CARS complicates the analysis of the measured spectra. The objective of this Thesis is to develop a new phase retrieval algorithm for CARS. It combines the maximum entropy method with a new wavelet approach for spectroscopic background correction of the phase function. The method was designed to be easily automated and applied to large numbers of spectra of different substances. The algorithm was successfully tested on experimental data.
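A wavelet decomposition separates the slowly varying background of a phase function from its sharper spectral features. The following sketch, assuming the PyWavelets package, keeps only the coarse approximation coefficients as a background estimate; the wavelet choice, decomposition level and synthetic signal are illustrative, not the thesis' actual settings.

    import numpy as np
    import pywt

    x = np.linspace(0, 1, 1024)
    phase = 0.5 * x**2 + 0.05 * np.sin(120 * np.pi * x)  # slow background + features

    # Decompose; keep only the coarse approximation as the background estimate
    coeffs = pywt.wavedec(phase, "sym8", level=6)
    coeffs_bg = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    background = pywt.waverec(coeffs_bg, "sym8")[: phase.size]

    corrected_phase = phase - background  # spectral features remain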
Abstract:
A thorough literature review of the current state of implementation of eye lens monitoring has been performed in order to provide recommendations regarding dosemeter types, calibration procedures and practical aspects of eye lens monitoring for interventional radiology personnel. The most relevant data and recommendations from about 100 papers have been analysed and classified into the following topics: current challenges in eye lens monitoring; conversion coefficients, phantoms and calibration procedures for eye lens dose evaluation; correction factors and dosemeters for eye lens dose measurements; and dosemeter position and the influence of protective devices. The major findings of the review can be summarised as follows: the recommended operational quantity for eye lens monitoring is Hp(3). At present, several dosemeters are available for eye lens monitoring, and calibration procedures are being developed. In practice, however, alternative methods are very often used to assess the dose to the eye lens. A summary of correction factors found in the literature for the assessment of the eye lens dose is provided. These factors can give an estimate of the eye lens dose when alternative methods, such as the use of a whole-body dosemeter, are employed. A wide range of values is found, indicating the large uncertainty associated with these simplified methods. Reduction factors for the most common protective devices, obtained experimentally and using Monte Carlo calculations, are presented. The paper concludes that a dosemeter placed at collar level outside the lead apron can provide a useful first estimate of the eye lens exposure. However, for workplaces where the estimated annual equivalent dose to the eye lens is close to the dose limit, specific eye lens monitoring should be performed. Finally, training of the involved medical staff on the risks of ionising radiation for the eye lens and on the correct use of protective systems is strongly recommended.
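In code, the "first estimate" strategy discussed above amounts to scaling a collar-level whole-body reading by a published correction factor. The sketch below is illustrative only: the factor value is a placeholder, and the review stresses that such factors span a wide range with large uncertainty.

    def estimate_eye_lens_dose(hp10_collar_msv, correction_factor=0.75):
        """Rough Hp(3) estimate from a collar-level Hp(10) reading.

        The correction factor is a placeholder, not a recommended value;
        the literature reports a wide range of such factors.
        """
        return correction_factor * hp10_collar_msv

    print(estimate_eye_lens_dose(12.0))  # first estimate in mSv, not a substitute
                                         # for dedicated eye lens monitoring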
Abstract:
Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or the same source, and thus to support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that using Hue and Edge filters, or their combination, to extract profiles from images, and then comparing the profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can easily be operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first triage step that helps target more resource-intensive profiling methods (based, for instance, on a visual, physical or chemical examination of documents). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (part II).
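The best-performing comparison step reported above (hue-based profiles scored with a Canberra distance) can be sketched as follows, assuming Pillow and SciPy. The file names and the row-mean profile definition are illustrative; the prototype's actual region-of-interest extraction is more elaborate.

    import numpy as np
    from PIL import Image
    from scipy.spatial.distance import canberra

    def hue_profile(path):
        """Mean hue per image row, used as a 1-D profile of the document."""
        hsv = np.asarray(Image.open(path).convert("HSV"), dtype=float)
        return hsv[:, :, 0].mean(axis=1)

    # Hypothetical scans; profiles must have equal length (same scan height)
    profile_a = hue_profile("doc_a.png")
    profile_b = hue_profile("doc_b.png")
    score = canberra(profile_a, profile_b)  # smaller score -> more similar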
Abstract:
Fetal MRI reconstruction aims at finding a high-resolution image given a small set of low-resolution images. It is usually modeled as an inverse problem where the regularization term plays a central role in the reconstruction quality. The literature has considered several regularization terms, such as Dirichlet/Laplacian energy [1], Total Variation (TV)-based energies [2,3] and, more recently, non-local means [4]. Although TV energies are quite attractive because of their edge-preserving ability, standard explicit steepest-gradient techniques have been applied to optimize fetal-based TV energies. The main contribution of this work lies in the introduction of a well-posed TV algorithm from the point of view of convex optimization. Specifically, our proposed TV optimization algorithm for fetal reconstruction is optimal w.r.t. the asymptotic and iterative convergence speeds O(1/n²) and O(1/√ε), while existing techniques are in O(1/n) and O(1/ε). We apply our algorithm to (1) clinical newborn data, considered as ground truth, and (2) clinical fetal acquisitions. Our algorithm compares favorably with the literature in terms of speed and accuracy.
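The O(1/n²) rate quoted above is characteristic of accelerated proximal-gradient (FISTA-style) schemes. The toy sketch below shows the acceleration mechanism on a least-squares problem, with soft-thresholding standing in for the TV proximal operator that the actual fetal reconstruction would require; it is not the authors' algorithm.

    import numpy as np

    def fista(A, b, lam, n_iter=200):
        """Accelerated proximal gradient for min 0.5*||Ax-b||^2 + lam*||x||_1."""
        L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1]); y = x.copy(); t = 1.0
        for _ in range(n_iter):
            z = y - A.T @ (A @ y - b) / L  # gradient step
            x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # prox step
            t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
            y = x_new + ((t - 1) / t_new) * (x_new - x)  # momentum gives O(1/n^2)
            x, t = x_new, t_new
        return x

    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 80))
    x_true = np.zeros(80); x_true[:5] = 1.0
    x_hat = fista(A, A @ x_true, lam=0.1)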
Abstract:
This paper describes Question Waves, an algorithm that can be applied to social search protocols, such as Asknext or Sixearch. In this model, queries propagate through the social network, with faster propagation through more trustable acquaintances. Question Waves uses local information to make decisions and to obtain an answer ranking. With Question Waves, the answers that arrive first are the most likely to be relevant, and we computed the correlation of answer relevance with the order of arrival to demonstrate this result. We obtained correlations equivalent to those of heuristics that use global knowledge, such as profile similarity among users or the expertise value of an agent. Because Question Waves is compatible with the social search protocol Asknext, it is possible to stop a search once enough relevant answers have been found; additionally, stopping the search early introduces only a minimal risk of not obtaining the best possible answer. Furthermore, Question Waves does not require a re-ranking algorithm because the results arrive already sorted.
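A minimal way to reproduce the propagation behaviour described above is a shortest-arrival-time simulation where each hop's delay shrinks with the trust placed in the acquaintance. The graph, trust values and the delay = 1/trust rule below are illustrative assumptions, not the paper's model.

    import heapq

    def propagate(network, source):
        """network: {node: {neighbor: trust in (0, 1]}}; returns nodes sorted by arrival time."""
        arrival = {source: 0.0}
        queue = [(0.0, source)]
        while queue:
            t, node = heapq.heappop(queue)
            if t > arrival.get(node, float("inf")):
                continue  # stale queue entry
            for neighbor, trust in network[node].items():
                t_new = t + 1.0 / trust  # higher trust -> faster hop
                if t_new < arrival.get(neighbor, float("inf")):
                    arrival[neighbor] = t_new
                    heapq.heappush(queue, (t_new, neighbor))
        return sorted(arrival.items(), key=lambda kv: kv[1])

    net = {"me": {"ana": 0.9, "bob": 0.4},
           "ana": {"carol": 0.8},
           "bob": {"carol": 0.3},
           "carol": {}}
    print(propagate(net, "me"))  # early arrivals approximate the relevance ranking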
Abstract:
Objective The authors sought to study the calibration of a clinical PKA meter (Diamentor E2) and a calibrator for clinical meters (PDC) at the Laboratory of Ionizing Radiation Metrology of the Instituto de Energia e Ambiente - Universidade de São Paulo. Materials and Methods Different qualities of both incident and transmitted beams were used under conditions similar to a clinical setting, analyzing the influence of the reference dosimeter, the distance between meters, the filtration and the average beam energy. Calibrations were performed directly against a standard 30 cm³ cylindrical chamber or a parallel-plate monitor chamber, and indirectly against the PDC meter. Results The lowest energy dependence was observed for transmitted beams. The cross calibration between the Diamentor E2 and the PDC presented the greatest propagation of uncertainties. Conclusion The calibration coefficient of the PDC meter proved to be more stable with voltage, while the Diamentor E2 calibration coefficient was more variable. On the other hand, the PDC meter presented greater uncertainty in readings (5.0%) than the monitor chamber (3.5%) used as a reference.
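As a worked illustration of the quantities compared above, a calibration coefficient is the ratio of the reference value to the instrument reading, and independent relative uncertainties combine in quadrature. The numbers below are illustrative, not the study's measured values.

    import math

    def calibration_coefficient(reference_pka, meter_reading):
        """N = reference PKA value / instrument reading."""
        return reference_pka / meter_reading

    def combined_rel_uncertainty(*rel_uncertainties):
        """Quadrature combination of independent relative uncertainties."""
        return math.sqrt(sum(u * u for u in rel_uncertainties))

    N = calibration_coefficient(reference_pka=101.3, meter_reading=97.8)  # Gy*cm^2
    u = combined_rel_uncertainty(0.050, 0.035)  # e.g. reading vs. reference terms
    print(f"N = {N:.3f}, combined relative uncertainty = {u:.1%}")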
Abstract:
This paper proposes a pose-based algorithm to solve the full SLAM problem for an autonomous underwater vehicle (AUV) navigating in an unknown and possibly unstructured environment. The technique incorporates probabilistic scan matching with range scans gathered from a mechanical scanning imaging sonar (MSIS) and the robot's dead-reckoning displacements estimated from a Doppler velocity log (DVL) and a motion reference unit (MRU). The proposed method uses two extended Kalman filters (EKFs). The first estimates the local path travelled by the robot while grabbing the scan, as well as its uncertainty, and provides position estimates for correcting the distortions that the vehicle motion produces in the acoustic images. The second is an augmented-state EKF that estimates and keeps the poses of the registered scans. The raw data from the sensors are processed and fused in-line. No prior structural information or initial pose is assumed. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, showing the viability of the proposed approach.
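Both filters follow the standard EKF predict/update cycle. The generic sketch below shows that cycle with placeholder motion and measurement models; the Jacobians and noise matrices of the actual DVL/MRU/MSIS models are not reproduced here.

    import numpy as np

    def ekf_step(x, P, u, z, f, h, F, H, Q, R):
        """One EKF cycle: f/h are the (nonlinear) models, F/H their Jacobians."""
        # Predict with the motion model
        x_pred = f(x, u)
        P_pred = F @ P @ F.T + Q
        # Update with the measurement model
        y = z - h(x_pred)                    # innovation
        S = H @ P_pred @ H.T + R             # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
        x_new = x_pred + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

    # Trivial 1-D usage with identity models, just to exercise the function
    I = np.eye(1)
    x, P = ekf_step(np.zeros(1), I, None, np.array([0.5]),
                    f=lambda x, u: x, h=lambda x: x,
                    F=I, H=I, Q=0.01 * I, R=0.1 * I)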
Abstract:
Image segmentation of natural scenes constitutes a major problem in machine vision. This paper presents a new proposal for the image segmentation problem based on the integration of edge and region information. The approach begins by detecting the main contours of the scene, which are later used to guide a concurrent set of growing processes. A preliminary analysis of the seed pixels permits adjustment of the homogeneity criterion to the region's characteristics during the growing process. Since the high variability of regions in outdoor scenes makes the classical homogeneity criteria useless, a new homogeneity criterion based on clustering analysis and convex hull construction is proposed. Experimental results have proven the reliability of the proposed approach.
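The growing processes described above can be sketched as seeded region growing with a running region mean. In the sketch below, a fixed intensity threshold stands in for the paper's clustering/convex-hull homogeneity criterion; the toy image and threshold are illustrative.

    import numpy as np
    from collections import deque

    def grow_region(image, seed, threshold=12.0):
        """4-connected region growing from a seed pixel on a grayscale image."""
        h, w = image.shape
        region = np.zeros((h, w), dtype=bool)
        region[seed] = True
        mean, count = float(image[seed]), 1
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < h and 0 <= nc < w and not region[nr, nc]
                        and abs(float(image[nr, nc]) - mean) < threshold):
                    region[nr, nc] = True
                    count += 1
                    mean += (float(image[nr, nc]) - mean) / count  # running mean
                    queue.append((nr, nc))
        return region

    img = np.tile(np.arange(0, 100, 10, dtype=float), (10, 1))  # toy gradient
    mask = grow_region(img, seed=(5, 0))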
Abstract:
This paper proposes a calibration method that can be used for the analysis of SEM images. The field of application of the developed method is the calculation of the surface potential distribution of a biased edgeless silicon detector. The suggested processing of the data collected by SEM consists of several stages and takes into account different aspects affecting the SEM image. The calibration method does not claim to be precise, but it nevertheless captures the basic features of the potential distribution when different bias voltages are applied to the detector.
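One simple form such a calibration can take is an interpolated curve mapping SEM gray levels, recorded at known bias voltages, to surface potentials. The gray-level/voltage pairs below are illustrative values, not measurements from the paper.

    import numpy as np

    # Calibration points: mean gray level observed at known bias voltages
    gray_levels = np.array([30.0, 80.0, 140.0, 200.0])
    potentials_v = np.array([0.0, 50.0, 150.0, 300.0])

    def gray_to_potential(gray_image):
        """Map each pixel's gray level to an estimated potential (V)."""
        return np.interp(gray_image, gray_levels, potentials_v)

    scan = np.array([[35.0, 90.0], [150.0, 190.0]])  # toy SEM image patch
    print(gray_to_potential(scan))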