Abstract:
The problem of monitoring digital television broadcasts across Europe in order to develop robust and reliable receivers is becoming increasingly significant; hence the need to automate the process of analysing and monitoring these signals. This project presents the software development of an application intended to solve one part of this problem. The application analyses, manages and captures digital television signals. This document first introduces the central subject matter, digital television and the information carried by television signals, specifically as defined by the "Digital Video Broadcasting" standard. The text then concentrates on explaining and describing the functionalities the application needs to cover, and introduces and explains each stage of a software development process. Finally, it summarises the advantages that this program brings to the automation of digital signal analysis, starting from an optimisation of resources.
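As a loose illustration of what analysing such signals involves (not the project's actual code), the following Python sketch parses the 4-byte header of the 188-byte MPEG-2 transport stream packets that carry DVB services; the capture file name is hypothetical.

```python
# Minimal sketch: inspecting the MPEG-2 transport stream packets that carry
# a DVB signal. Packet layout per ISO/IEC 13818-1; the file is hypothetical.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def parse_ts_header(packet: bytes) -> dict:
    """Extract the basic fields of a 4-byte transport stream header."""
    if len(packet) != TS_PACKET_SIZE or packet[0] != SYNC_BYTE:
        raise ValueError("not a valid TS packet")
    return {
        "transport_error": bool(packet[1] & 0x80),
        "payload_unit_start": bool(packet[1] & 0x40),
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],
        "continuity_counter": packet[3] & 0x0F,
    }

with open("capture.ts", "rb") as f:  # hypothetical capture file
    while len(packet := f.read(TS_PACKET_SIZE)) == TS_PACKET_SIZE:
        header = parse_ts_header(packet)
        if header["transport_error"]:
            print(f"corrupted packet on PID {header['pid']:#06x}")
```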
Abstract:
In recent years, traditional inequality measures have been used quite extensively to examine the international distribution of environmental indicators. One of their main characteristics is that each measure assigns different weights to changes occurring in different sections of the variable's distribution; consequently, the results they yield can be very different. Hence, we suggest the appropriateness of using a range of well-recommended measures to achieve more robust results. We also provide an empirical test of the comparative behaviour of several suitable inequality measures and environmental indicators. Our findings support the hypothesis that in some cases the measures differ in both the sign of the evolution and its size. JEL codes: D39; Q43; Q56. Keywords: international environmental factor distribution; Kaya factors; inequality measurement
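To make the weighting differences concrete, here is a minimal Python sketch of two commonly recommended inequality measures, computed over hypothetical values; the abstract does not name the specific measures the paper compares.

```python
# Two standard inequality measures that weight the distribution differently:
# the Gini coefficient is most sensitive to the middle of the distribution,
# while the Theil T index reacts more to its upper tail. The data values are
# hypothetical per-country environmental-indicator levels.
import math

def gini(x):
    """Mean absolute difference between all pairs, normalised by 2*mean."""
    n, mu = len(x), sum(x) / len(x)
    mad = sum(abs(a - b) for a in x for b in x) / n**2
    return mad / (2 * mu)

def theil(x):
    """Theil T index: average of (x/mu) * ln(x/mu)."""
    n, mu = len(x), sum(x) / len(x)
    return sum((v / mu) * math.log(v / mu) for v in x) / n

values = [2.1, 3.4, 0.8, 5.6, 1.2]  # hypothetical indicator values
print(f"Gini:  {gini(values):.3f}")
print(f"Theil: {theil(values):.3f}")
```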
Abstract:
The ID-Chagas test is a particle gel immunoassay (PaGIA). Red-coloured particles are sensitised with three different synthetic peptides representing antigen sequences of Trypanosoma cruzi: Ag2, TcD and TcE. When these particles are mixed with serum containing specific antibodies, they agglutinate. The reaction mixture is centrifuged through a gel filtration matrix, which allows agglutinated particles to remain trapped on top of the gel or distributed within it, while free particles pass through. The result can be read visually. To investigate the ability of the ID-PaGIA to discriminate between negative and positive sera, 111 negative and 119 positive sera, collected in four different Brazilian institutions, were tested by each of the participants. All sera had previously been classified as positive or negative according to the results of three conventional tests (indirect immunofluorescence, indirect hemagglutination, and enzyme-linked immunosorbent assay). Sensitivity rates of the ID-PaGIA varied from 95.7% to 97.4%, with a mean sensitivity of 96.8%, and specificity rates varied from 93.8% to 98.8%, with a mean specificity of 94.6%. The overall kappa value was 0.94. The advantages of the assay are its simplicity of operation and a reaction time of 20 min. In this study, the ID-PaGIA proved to be highly sensitive and specific.
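The reported figures follow from a 2x2 table comparing ID-PaGIA results against the conventional-test classification. The sketch below shows how sensitivity, specificity and Cohen's kappa are derived; the counts are hypothetical, since the abstract gives only the summary statistics.

```python
# Deriving diagnostic statistics from a 2x2 confusion table (new test vs.
# reference classification). The counts below are hypothetical.

def diagnostic_stats(tp, fn, fp, tn):
    n = tp + fn + fp + tn
    sensitivity = tp / (tp + fn)          # positives correctly detected
    specificity = tn / (tn + fp)          # negatives correctly rejected
    po = (tp + tn) / n                    # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
    kappa = (po - pe) / (1 - pe)          # Cohen's kappa
    return sensitivity, specificity, kappa

se, sp, k = diagnostic_stats(tp=115, fn=4, fp=6, tn=105)  # hypothetical counts
print(f"sensitivity={se:.3f} specificity={sp:.3f} kappa={k:.3f}")
```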
Abstract:
Purpose: Morphometric measurements of the ascending aorta have recently been performed with ECG-gated MDCT to support the development of future endovascular therapies (TCT) [1]. However, the variability of these measurements remains unknown. It would be useful to know the impact of computer-aided diagnosis (CAD), with automated segmentation of the vessel and automatic diameter measurements, on the management of ascending aorta aneurysms. Methods and Materials: Thirty patients referred for ECG-gated CT thoracic angiography (64-row CT scanner) were evaluated. Measurements of the maximum and minimum ascending aorta diameters were obtained automatically with a commercially available CAD system and semi-manually by two observers separately. The CAD algorithms segment the iv-enhanced lumen of the ascending aorta into planes perpendicular to the centreline; the CAD then determines the largest and smallest diameters. Both observers repeated the automatic and semi-manual measurements in a separate session at least one month after the first measurements. The Bland-Altman method was used to study inter- and intraobserver variability, and a Wilcoxon signed-rank test was used to analyse differences between observers. Results: Interobserver variability for semi-manual measurements between the first and second observers was 1.2 and 1.0 mm for the maximal and minimal diameters, respectively. Intraobserver variability for each observer ranged from 0.8 to 1.2 mm, the lowest variability being produced by the more experienced observer. CAD variability could be as low as 0.3 mm, showing that it can perform better than human observers. However, when used in non-optimal conditions (streak artefacts from contrast in the superior vena cava or weak lumen enhancement), CAD variability can be as high as 0.9 mm, reaching that of semi-manual measurements. Furthermore, there were significant differences between the two observers for maximal and minimal diameter measurements (p<0.001). There was also a significant difference between the first observer and CAD for maximal diameter measurements, with the former underestimating the diameter compared to the latter (p<0.001). Minimal diameters were higher when measured by the second observer than when measured by CAD (p<0.001). Neither the difference in mean minimal diameter between the first observer and CAD nor the difference in mean maximal diameter between the second observer and CAD was significant (p=0.20 and p=0.06, respectively). Conclusion: CAD algorithms can reduce the variability of diameter measurements in the follow-up of ascending aorta aneurysms. Nevertheless, in non-optimal conditions it may be necessary to correct the measurements manually; improvements to the algorithms should help avoid such situations.
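For reference, the Bland-Altman analysis used here reduces two series of paired measurements to a bias and 95% limits of agreement. A minimal Python sketch, with hypothetical diameter values:

```python
# Bland-Altman agreement analysis: for each pair of measurements, take the
# difference; report the mean difference (bias) and bias +/- 1.96*SD as the
# 95% limits of agreement. Diameter values (mm) are hypothetical.
import math

def bland_altman(a, b):
    diffs = [x - y for x, y in zip(a, b)]
    bias = sum(diffs) / len(diffs)
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (len(diffs) - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

observer1 = [34.1, 36.5, 31.2, 40.3, 33.8]  # hypothetical measurements (mm)
observer2 = [33.6, 37.0, 30.9, 41.1, 33.2]
bias, (lo, hi) = bland_altman(observer1, observer2)
print(f"bias={bias:.2f} mm, limits of agreement=({lo:.2f}, {hi:.2f}) mm")
```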
Abstract:
We investigated the use of in situ implant formation incorporating superparamagnetic iron oxide nanoparticles (SPIONs) as a minimally invasive treatment of cancer lesions by magnetically induced local hyperthermia. We developed injectable formulations that form gels entrapping magnetic particles within a tumor. We used SPIONs embedded in silica microparticles to favor syringeability, and incorporated the highest proportion possible to allow large heating capacities. Hydrogel, single-solvent organogel and cosolvent (low-toxicity hydrophilic solvent) organogel formulations were injected into human cancer tumors xenografted in mice. The thermoreversible hydrogels (poloxamer, chitosan), which accommodated 20% w/v of the magnetic microparticles, proved inadequate. Alginate hydrogels incorporated 10% w/v of the magnetic microparticles; external gelation led to strong implants localized at the tumor periphery, whereas internal gelation failed in situ. The organogel formulations, which consisted of precipitating polymers dissolved in single organic solvents, displayed various microstructures. An 8% poly(ethylene-vinyl alcohol) solution in DMSO containing 40% w/v of magnetic microparticles formed the most suitable implants in terms of tumor casting and heat delivery. Importantly, developing cosolvent formulations with up to 20% w/v of magnetic microparticles, which show reduced toxicity and centered tumor implantation, is of great clinical interest.
Abstract:
BACKGROUND: A software-based tool (Optem) has been developed to automate the application of the Canadian Multiple Sclerosis Working Group recommendations for optimizing MS treatment, thereby avoiding subjective interpretation. METHODS: Treatment Optimization Recommendations (TORs) were applied to our database of patients treated with IFN beta-1a IM. Patient data were assessed during year 1 for disease activity, and patients were assigned to two groups according to TOR: "change treatment" (CH) and "no change treatment" (NCH). These assessments were then compared with the observed clinical outcomes for disease activity over the following years. RESULTS: Data were available for 55 patients. "Change treatment" status was assigned to 22 patients and "no change treatment" to 33. The estimated sensitivity and specificity according to last-visit status were 73.9% and 84.4%, respectively. During the following years, the relapse rate was always higher in the "change treatment" group than in the "no change treatment" group (year 5: CH 0.7 vs. NCH 0.07, p < 0.001; 12 months to last visit: CH 0.536 vs. NCH 0.34). The same pattern was observed with the EDSS (year 4: CH 3.53 vs. NCH 2.55; annual progression rate from 12 months to last visit: CH 0.29 vs. NCH 0.13). CONCLUSION: Applying TOR at the first year of therapy allowed accurate prediction of continued disease activity in terms of relapses and disability progression.
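Purely as an illustration of how such recommendations can be encoded in software, the sketch below implements a toy rule for the year-1 assessment; the thresholds are hypothetical and do not reproduce the actual Canadian MS Working Group criteria.

```python
# Toy rule-based treatment-optimisation recommendation. The thresholds are
# hypothetical and NOT the real Canadian MS Working Group criteria, which
# also weigh MRI activity and the severity of each dimension.

def recommend(relapses_year1: int, edss_increase: float) -> str:
    """Return 'change treatment' or 'no change treatment' for year-1 activity."""
    if relapses_year1 >= 2 or edss_increase >= 1.0:  # hypothetical thresholds
        return "change treatment"
    return "no change treatment"

print(recommend(relapses_year1=2, edss_increase=0.5))  # -> change treatment
print(recommend(relapses_year1=0, edss_increase=0.0))  # -> no change treatment
```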
Abstract:
Named entity recognizers are unable to distinguish whether a term is a general concept, such as "scientist", or an individual, such as "Einstein". In this paper we explore the possibility of reaching this goal by combining two basic approaches: (i) Super Sense Tagging (SST) and (ii) YAGO. Thanks to these two powerful tools, we were able to automatically create a corpus with which to train the SuperSense Tagger. The overall F1 is over 76%, and the model is publicly available.
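As a toy illustration of the concept-versus-individual distinction the paper targets, the sketch below stubs both resources with tiny hypothetical lookup tables; a real system would query YAGO and run the SuperSense Tagger over context.

```python
# Toy concept-vs-individual classifier. YAGO distinguishes entities from
# classes, and supersense tags cover common-noun categories; both resources
# are stubbed here with tiny hypothetical tables.

YAGO_INDIVIDUALS = {"Einstein", "Rome", "Google"}        # stub for YAGO entities
SUPERSENSES = {"scientist": "noun.person", "city": "noun.location"}  # stub tags

def classify(term: str) -> str:
    if term in YAGO_INDIVIDUALS:
        return "INDIVIDUAL"
    if term.lower() in SUPERSENSES:
        return "CONCEPT"
    return "UNKNOWN"

for t in ("Einstein", "scientist", "Rome"):
    print(t, "->", classify(t))
```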
Abstract:
We present a system for dynamic network resource configuration in environments with bandwidth reservation. The proposed system is completely distributed and automates the mechanisms for adapting the logical network to the offered load. The system is able to dynamically manage a logical network, such as a virtual path network in ATM or a label switched path network in MPLS or GMPLS. The system design and implementation are based on a multi-agent system (MAS) that makes the decisions about when and how to change a logical path. Despite the lack of a centralised global network view, results show that the MAS manages the network resources effectively, reducing the connection blocking probability and therefore achieving better utilisation of network resources. We also include details of the system's architecture and implementation.
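As a rough sketch of the kind of local decision each agent could make (the abstract does not detail the agents' actual logic), the following Python function resizes a logical path's reservation when utilisation drifts outside a hypothetical target band.

```python
# Hypothetical per-agent adaptation rule for one logical path: grow the
# reservation when utilisation is high enough to risk blocking connections,
# shrink it when capacity sits idle. Thresholds and scaling factors are
# illustrative assumptions, not the paper's algorithm.

def adapt_bandwidth(reserved_mbps: float, offered_mbps: float,
                    high: float = 0.9, low: float = 0.5) -> float:
    """Return the new bandwidth reservation for one logical path."""
    utilisation = offered_mbps / reserved_mbps
    if utilisation > high:         # risk of blocking: request more capacity
        return reserved_mbps * 1.5
    if utilisation < low:          # over-provisioned: release resources
        return reserved_mbps * 0.75
    return reserved_mbps           # within the target band: no change

print(adapt_bandwidth(reserved_mbps=100.0, offered_mbps=95.0))  # -> 150.0
```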
Abstract:
A recent trend in digital mammography is computer-aided diagnosis systems, computerised tools designed to assist radiologists. Most of these systems are used for the automatic detection of abnormalities. However, recent studies have shown that their sensitivity decreases significantly as the density of the breast increases, and that this dependence is method specific. In this paper we propose a new approach to classifying mammographic images according to their breast parenchymal density. Our classification uses information extracted from segmentation results and is based on the underlying breast tissue texture. Classification performance was evaluated on a large set of digitised mammograms using different classifiers and a leave-one-out methodology. The results demonstrate the feasibility of estimating breast density using image processing and analysis techniques.
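The leave-one-out protocol classifies each mammogram with a model trained on all the others. A minimal sketch using scikit-learn, where the 1-NN classifier and the two texture features are placeholders for the classifiers and features the paper actually evaluates:

```python
# Leave-one-out evaluation: each sample is held out in turn and classified by
# a model fitted on the remaining samples. Features and labels below are
# hypothetical stand-ins for texture descriptors and density classes.
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

X = [[0.12, 3.4], [0.35, 2.1], [0.80, 1.2],
     [0.15, 3.1], [0.42, 2.0], [0.77, 1.4]]   # hypothetical texture features
y = [0, 1, 2, 0, 1, 2]                         # hypothetical density classes

score = cross_val_score(KNeighborsClassifier(n_neighbors=1), X, y,
                        cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {score:.2f}")
```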
Abstract:
Automatically obtaining the 3D profile of objects is one of the most important problems in computer vision. With this information, a large number of applications become feasible, from the visual inspection of industrial parts to 3D reconstruction of the environment for mobile robots. Range finders can be used to acquire such 3D data, and the coded structured light approach is one of the most widely used techniques for retrieving the 3D information of an unknown surface. We present an overview of the existing techniques, as well as a new classification of patterns for structured light sensors. These systems belong to the group of active triangulation methods, which are based on projecting a light pattern and imaging the illuminated scene from one or more points of view. Since the patterns are coded, correspondences between points of the image(s) and points of the projected pattern can easily be found. Once correspondences are found, a classical triangulation strategy between the camera(s) and the projector leads to the reconstruction of the surface. Advantages and constraints of the different patterns are discussed.
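Once a correspondence is decoded, the triangulation step is classical. The sketch below assumes a rectified camera-projector pair and a pinhole camera model; all numbers are hypothetical.

```python
# Classical triangulation for a rectified camera-projector pair: depth
# follows from the baseline b, the focal length f and the disparity between
# the observed and projected column of the coded pattern. All values are
# hypothetical.

def depth_from_disparity(b_mm: float, f_px: float, disparity_px: float) -> float:
    """Classic triangulation: Z = b * f / d."""
    return b_mm * f_px / disparity_px

def backproject(u_px: float, v_px: float, z_mm: float,
                f_px: float, cu: float, cv: float) -> tuple:
    """Recover the 3D point from pixel (u, v) and depth Z (pinhole model)."""
    return ((u_px - cu) * z_mm / f_px, (v_px - cv) * z_mm / f_px, z_mm)

z = depth_from_disparity(b_mm=120.0, f_px=800.0, disparity_px=40.0)
print(backproject(400.0, 300.0, z, f_px=800.0, cu=320.0, cv=240.0))
```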