883 results for Image interpretation, Computer-assisted
Abstract:
Steatosis, also known as fatty liver, corresponds to an abnormal retention of lipids within the hepatic cells and reflects an impairment of the normal processes of synthesis and elimination of fat. Several causes may lead to this condition, namely obesity, diabetes, or alcoholism. In this paper an automatic classification algorithm is proposed for the diagnosis of liver steatosis from ultrasound images. The features are selected to capture the same characteristics used by physicians when diagnosing the disease by visual inspection of the ultrasound images. The algorithm, designed in a Bayesian framework, computes two images: i) a despeckled one, containing the anatomic and echogenic information of the liver, and ii) an image containing only the speckle, used to compute the textural features. These images are computed from the estimated RF signal generated by the ultrasound probe, where the dynamic range compression performed by the equipment is taken into account. A Bayes classifier, trained with data manually classified by expert clinicians and used as ground truth, reaches an overall accuracy of 95% and a sensitivity of 100%. The main novelty of the method is the estimation of the RF and speckle images, which makes it possible to accurately compute textural features of the liver parenchyma relevant for the diagnosis.
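A minimal sketch of the classification stage described above, assuming a Gaussian naive Bayes model over a handful of texture features (first-order statistics plus grey-level co-occurrence measures) computed on synthetic speckle patches; the paper's actual pipeline, which estimates the RF signal and separates speckle from anatomy, is not reproduced here.

```python
# Illustrative sketch only: Gaussian Bayes classification of liver texture
# features. The features and data are synthetic placeholders; the paper's
# method additionally estimates the RF signal and separates speckle from anatomy.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.naive_bayes import GaussianNB

def texture_features(patch):
    """First-order statistics plus GLCM measures of an 8-bit speckle patch."""
    glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return np.array([
        patch.mean(),
        patch.std(),
        graycoprops(glcm, "contrast")[0, 0],
        graycoprops(glcm, "homogeneity")[0, 0],
    ])

rng = np.random.default_rng(0)
# Synthetic "normal" vs "steatotic" speckle patches (brighter texture stands in
# for the increased echogenicity of a fatty liver).
normal = [rng.integers(0, 128, (64, 64), dtype=np.uint8) for _ in range(40)]
steato = [rng.integers(64, 192, (64, 64), dtype=np.uint8) for _ in range(40)]

X = np.array([texture_features(p) for p in normal + steato])
y = np.array([0] * len(normal) + [1] * len(steato))

clf = GaussianNB().fit(X, y)           # Bayes classifier with Gaussian likelihoods
print("training accuracy:", clf.score(X, y))
```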
Abstract:
Over the years it has become clear that a large share of teachers' time is spent on assessment. For that reason, the automatic correction of free-text answers has been a research topic for several decades. When exercises are corrected by the computer, the teacher can devote his or her time to tasks that improve students' learning. Moreover, new technologies increasingly provide tools that are very useful in teaching, since besides making it easier to present knowledge they also allow greater retention of information. Combining classroom management tools with the automatic correction of free-text answers is therefore a very interesting challenge. The goal of this dissertation was to carry out a study of the area of computer-assisted assessment in which this work is set. Initially, several spell checkers were analysed in order to select the one to be integrated into the proposed module. Next, the most relevant techniques and the tools closest to the scope of this work were studied. In this context, the idea was to start from an existing classroom management tool and develop a module for the correction of exercises. The UNI_NET-Classroom application, for which the module was developed, already contained an exercise management component that only corrected multiple-choice answers. This work aimed to add one more feature to that component, intended to support the teacher by correcting exercises and suggesting the mark to be awarded. Finally, several experiments were carried out on the developed module in order to draw conclusions for the present work. The most important conclusion was that automatic correction tools are an asset for teachers and schools.
Abstract:
Electrocardiographic (ECG) signals are emerging as a recent trend in the field of biometrics. In this paper, we propose a novel ECG biometric system that combines clustering and classification methodologies. Our approach is based on dominant-set clustering and provides a framework for outlier removal and template selection. It enhances typical workflows by making them better suited to new ECG acquisition paradigms that use the fingers or hand palms, which yield signals with a lower signal-to-noise ratio that are more prone to noise artifacts. Preliminary results show the potential of the approach, helping to further validate these highly usable setups and ECG signals as a complementary biometric modality.
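A minimal sketch of dominant-set clustering used for template selection and outlier removal, assuming the replicator-dynamics formulation of dominant sets and synthetic heartbeat segments; the affinity measure and data below are illustrative placeholders, not the paper's acquisition setup.

```python
# Illustrative sketch: dominant-set extraction on an affinity matrix of ECG
# heartbeats, used here for outlier removal / template selection. The
# replicator-dynamics formulation and the synthetic beats are assumptions.
import numpy as np

def dominant_set(A, n_iter=2000, tol=1e-8):
    """Replicator dynamics on affinity matrix A; returns the support weights."""
    n = A.shape[0]
    x = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        x_new = x * (A @ x)
        s = x_new.sum()                # equals x^T A x
        if s == 0:
            break
        x_new /= s
        if np.linalg.norm(x_new - x, 1) < tol:
            return x_new
        x = x_new
    return x

rng = np.random.default_rng(1)
template = np.sin(np.linspace(0, 2 * np.pi, 100))       # stand-in heartbeat shape
beats = [template + 0.05 * rng.normal(size=100) for _ in range(18)]
beats += [rng.normal(size=100) for _ in range(2)]        # two noisy outliers
B = np.array(beats)

# Affinity: correlation mapped to [0, 1], zero diagonal (required by the method).
A = (np.corrcoef(B) + 1.0) / 2.0
np.fill_diagonal(A, 0.0)

w = dominant_set(A)
selected = np.where(w > 1e-4)[0]        # beats in the dominant set -> templates
print("selected beats:", selected)       # outliers receive (near) zero weight
```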
Abstract:
INTED2010, the 4th International Technology, Education and Development Conference, was held in Valencia (Spain) on March 8, 9 and 10, 2010.
Abstract:
The aim of this research was to evaluate the degree of protein polymorphism among seventy-five C. albicans strains from the oral cavities of healthy children of five socioeconomic categories attending eight schools (private and public) in Piracicaba city, São Paulo State, in order to identify C. albicans subspecies and their similarities in infantile population groups and to establish their possible dissemination route. Cell cultures were grown in YEPD medium, collected by centrifugation, and washed with cold saline solution. The whole-cell proteins were extracted by cell disruption using glass beads and submitted to the SDS-PAGE technique. After electrophoresis, the protein bands were stained with Coomassie blue and analyzed with the NTSYS-pc version 1.70 statistical package. The similarity matrix and dendrogram were generated using the Dice similarity coefficient and the UPGMA algorithm, respectively, which made it possible to evaluate the similarity or intra-specific polymorphism degrees based on whole-cell protein fingerprinting of the C. albicans oral isolates. A total of 13 major phenons (clusters) were analyzed according to their homogeneous (socioeconomic category and/or same school) and heterogeneous (distinct socioeconomic categories and/or schools) characteristics. Regarding the social epidemiological aspect, the cluster composition showed higher similarities (0.788 < S_D < 1.0) among C. albicans strains isolated from healthy children independent of their socioeconomic bases (high, medium, or low). Isolates of high similarity were not found in oral cavities of healthy children of social strata A and D, B and D, or C and E. This may be explained by the absence of a dissemination route among these children. Geographically, some healthy children attending the same or different schools (private and public) also carry similar strains, but such similarity was not found among other isolates from children of certain schools. These data may reflect a restricted dissemination route of these microorganisms in some groups of healthy schoolchildren, which may depend on either the socioeconomic category or the geographic site of each child. In contrast to the higher similarity, a lower similarity, or higher polymorphism degree (0.499 < S_D < 0.788), of the protein profiles was found in 23 (30.6%) C. albicans oral isolates. Considering the social epidemiological aspect, 42.1%, 41.7%, 26.6%, 23.5%, and 16.7% were isolates from children belonging to socioeconomic categories A, D, C, B, and E, respectively; geographically, 63.6%, 50%, 33.3%, 33.3%, 30%, 25%, and 14.3% were isolates from children from the schools LAE (Liceu Colégio Albert Einstein), MA (E.E.P.S.G. "Prof. Elias de Melo Ayres"), CS (E.E.P.G. "Prof. Carlos Sodero"), AV (Alphaville), HF (E.E.P.S.G. "Honorato Faustino"), FMC (E.E.P.G. "Prof. Francisco Mariano da Costa"), and MEP (E.E.P.S.G. "Prof. Manasses Ephraim Pereira"), respectively. Such results suggest a higher degree of protein polymorphism among some strains isolated from healthy children independent of their socioeconomic strata or geographic sites. Complementary studies, involving healthy students and their families, teachers and school staff, as well as hygiene and nutritional habits, must be done in order to establish the sources of such colonization patterns in population groups of healthy children. The whole-cell protein profile obtained by SDS-PAGE, combined with computer-assisted numerical analysis, may provide additional criteria for taxonomic and epidemiological studies of C. albicans.
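A minimal sketch of the numerical analysis step, assuming binary presence/absence band patterns: the Dice similarity coefficient and UPGMA (average-linkage) clustering are computed with SciPy on a made-up band matrix standing in for the SDS-PAGE profiles.

```python
# Illustrative sketch: Dice similarity between binary protein band patterns and
# a UPGMA (average-linkage) dendrogram, mirroring the kind of analysis done with
# NTSYS-pc. The band matrix is a synthetic placeholder, not the study's data.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

rng = np.random.default_rng(2)
# Rows = isolates, columns = presence/absence of SDS-PAGE protein bands.
bands = rng.integers(0, 2, size=(10, 30))

# SciPy's "dice" metric is a dissimilarity; Dice similarity S_D = 1 - dissimilarity.
d = pdist(bands, metric="dice")
print("pairwise Dice similarities:", np.round(1.0 - d, 3))

# UPGMA corresponds to average linkage on the dissimilarities.
Z = linkage(d, method="average")
dendrogram(Z, no_plot=True)   # set no_plot=False with matplotlib to draw the tree
```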
Abstract:
Software tools in education have become popular since personal computers became widespread. Engineering courses led the way in this development, and these tools have become almost a standard. Engineering graduates are familiar with numerical analysis tools but also with simulators (e.g. of electronic circuits), computer-assisted design tools and others, depending on the degree. One of the main problems with these tools is when and how to start using them so that they are beneficial to students and not mere substitutes for potentially difficult calculations or design work. In this paper a software tool to be used by first-year students in electronics/electricity courses is presented. The growing acknowledgement and acceptance of open-source software led to the choice of an open-source numerical analysis tool, Scilab, as the basis for a toolbox. The toolbox was developed to be used standalone or integrated into an e-learning platform; the e-learning platform used was Moodle. The first step was to assess the mathematical skills necessary to solve the problems arising in electronics and electricity courses. Analysing existing circuit simulation tools, it is clear that even though they are very helpful in showing the end result, they are not so effective in supporting students' study and self-learning, since they show results but not the intermediate steps, which are crucial in problems that involve derivatives or integrals. They are also not very effective at producing graphical results that could be used to write reports and to better understand the results. The developed tool is a toolbox based on the numerical analysis software Scilab that gives its users not only the end results of a circuit analysis but also the expressions obtained in derivative and integral calculations, as well as the ability to plot signals, obtain vector diagrams, etc. The toolbox runs entirely in the Moodle web platform and provides the same results as the standalone application. Students can use the toolbox through the web platform (on computers where they do not have installation privileges) or on their personal computers by installing both Scilab and the toolbox. This approach was designed for first-year students from all engineering degrees that have electronics/electricity courses in their curricula.
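As a rough illustration of the kind of intermediate symbolic steps the toolbox exposes, the sketch below uses Python/SymPy in place of the Scilab toolbox (whose internals are not described here) to show the expression produced when differentiating a capacitor voltage, rather than only a numerical end result.

```python
# Illustrative sketch (SymPy stands in for the Scilab toolbox described above):
# show the symbolic intermediate step i(t) = C * dv/dt for an RC discharge,
# instead of only a numerical end result.
import sympy as sp

t, R, C, V0 = sp.symbols("t R C V0", positive=True)

v = V0 * sp.exp(-t / (R * C))        # capacitor voltage during discharge
i = C * sp.diff(v, t)                # intermediate step: i(t) = C dv/dt

print("v(t) =", v)
print("i(t) =", sp.simplify(i))      # -> -V0*exp(-t/(C*R))/R
```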
Abstract:
The process of visually exploring underwater environments is still a complex problem. Underwater vision systems require complementary sources of sensor information to help overcome water disturbances. This work proposes the development of calibration methods for a structured-light system consisting of a camera and a laser with a line beam. Two different calibration procedures that require only two images from different viewpoints were developed and tested in dry and underwater environments. The results obtained show an accurate calibration of the camera/projector pair, with errors close to 1 mm even in the presence of a small stereo baseline.
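One plausible internal step in this kind of camera/laser-line calibration is fitting the laser plane to 3D points triangulated from the two viewpoints; the sketch below shows only that least-squares plane fit on synthetic points and is not the paper's full procedure.

```python
# Illustrative sketch: least-squares fit of a laser plane to 3D points that
# would be triangulated from two camera viewpoints. Synthetic data only; this
# is one plausible internal step, not the paper's complete calibration method.
import numpy as np

rng = np.random.default_rng(3)

# Synthetic points lying near the plane z = 0.2*x - 0.1*y + 0.5 (metres).
x = rng.uniform(-0.5, 0.5, 200)
y = rng.uniform(-0.5, 0.5, 200)
z = 0.2 * x - 0.1 * y + 0.5 + rng.normal(0, 0.001, 200)   # ~1 mm noise
pts = np.column_stack([x, y, z])

# Plane through the centroid; normal = right singular vector of the smallest
# singular value of the centred point cloud.
centroid = pts.mean(axis=0)
_, _, vt = np.linalg.svd(pts - centroid)
normal = vt[-1]

residuals = (pts - centroid) @ normal
print("plane normal:", np.round(normal, 4))
print("RMS distance to plane [mm]:", 1e3 * np.sqrt((residuals ** 2).mean()))
```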
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
BACKGROUND: Wireless capsule endoscopy (CE) has been introduced as an innovative, non-invasive diagnostic technique for evaluation of the gastrointestinal tract, reaching places where conventional endoscopy is unable to. However, the output of this technique is an 8-hour video, whose analysis by the expert physician is very time consuming. Thus, a computer-assisted diagnosis tool to help physicians evaluate CE exams faster and more accurately is an important technical challenge and an excellent economic opportunity. METHOD: The set of features proposed in this paper to code textural information is based on statistical modeling of second-order textural measures extracted from co-occurrence matrices. To cope with both the joint and the marginal non-Gaussianity of the second-order textural measures, higher-order moments are used. These statistical moments are taken from the two-dimensional color-scale feature space, where two different scales are considered. Second- and higher-order moments of the textural measures are computed from co-occurrence matrices of images synthesized by the inverse wavelet transform of the wavelet transform containing only the selected scales for the three color channels. The dimensionality of the data is reduced using Principal Component Analysis. RESULTS: The proposed textural features are then used as the input of a classifier based on artificial neural networks. Classification performances of 93.1% specificity and 93.9% sensitivity are achieved on real data. These promising results open the path towards a deeper study of the applicability of this algorithm in computer-aided diagnosis systems to assist physicians in their clinical practice.
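A minimal sketch of the texture-classification chain, assuming grey-level co-occurrence features, PCA and a small multilayer perceptron on synthetic patches; the wavelet scale selection and the higher-order moments of the paper's method are omitted.

```python
# Illustrative sketch: second-order (co-occurrence) texture measures, PCA
# dimensionality reduction and a neural-network classifier. Synthetic patches
# replace real capsule-endoscopy frames.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def glcm_features(patch):
    """Contrast, correlation, energy and homogeneity over two distances/angles."""
    glcm = graycomatrix(patch, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=64, symmetric=True, normed=True)
    props = ["contrast", "correlation", "energy", "homogeneity"]
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(4)
normal = [rng.integers(0, 32, (48, 48), dtype=np.uint8) for _ in range(50)]
lesion = [rng.integers(16, 64, (48, 48), dtype=np.uint8) for _ in range(50)]

X = np.array([glcm_features(p) for p in normal + lesion])
y = np.array([0] * 50 + [1] * 50)

model = make_pipeline(StandardScaler(), PCA(n_components=5),
                      MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                    random_state=0))
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```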
Abstract:
Simulated moving bed (SMB) chromatography is attracting more and more attention, since it is a powerful technique for complex separation tasks. Nowadays, more than 60% of preparative SMB units are installed in the pharmaceutical and food industries [SDI, Preparative and Process Liquid Chromatography: The Future of Process Separations, International Strategic Directions, Los Angeles, USA, 2002. http://www.strategicdirections.com]. Chromatography is the method of choice in these fields, because pharmaceuticals and fine chemicals often have physico-chemical properties that differ little from those of the by-products, and they may be thermally unstable. In these cases, standard separation techniques such as distillation and extraction are not applicable. The importance of preparative chromatography, particularly the SMB process, as a separation and purification process in the above-mentioned industries has been increasing due to its flexibility, energy efficiency and higher product purity. Consequently, a new SMB paradigm is called for by the large number of potential small-scale applications of SMB technology, one which exploits the flexibility and versatility of the technology. In this new paradigm, the possibilities for improving SMB performance through the variation of parameters during a switching interval are pushing the trend toward units with a smaller number of columns, because less stationary phase is used and the setup is more economical. This is especially important for the pharmaceutical industry, where SMBs are seen as multipurpose units that can be applied to different separations at all stages of the drug-development cycle. In order to reduce the experimental effort, and accordingly the cost associated with the development of separation processes, simulation models are used intensively. One important aspect in this context is the determination of the adsorption isotherms in SMB chromatography, where separations are usually carried out under strongly nonlinear conditions in order to achieve higher productivities. The accurate determination of the competitive adsorption equilibrium of the enantiomeric species is thus of fundamental importance to allow computer-assisted optimization or process scale-up. Two major SMB operating problems are apparent at production scale: the assessment of product quality and the maintenance of long-term stable and controlled operation. Constraints regarding product purity, dictated by pharmaceutical and food regulatory organizations, have drastically increased the demand for product quality control. The strict regulations imposed are increasing the need to develop optically pure drugs. (...)
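The competitive adsorption equilibrium mentioned above is often described, as a first approximation, by a competitive Langmuir isotherm; the sketch below simply evaluates that model for a binary (enantiomeric) mixture with made-up parameter values.

```python
# Illustrative sketch: competitive Langmuir isotherm for a binary enantiomer
# mixture, q_i = Q_i * b_i * c_i / (1 + b_A*c_A + b_B*c_B). A common first
# approximation in SMB modelling; the parameter values here are made up.
import numpy as np

def competitive_langmuir(c, Q, b):
    """c, Q, b: concentrations, saturation capacities and equilibrium constants."""
    c, Q, b = map(np.asarray, (c, Q, b))
    return Q * b * c / (1.0 + np.sum(b * c))

# Hypothetical parameters for enantiomers A and B (illustrative units).
Q = [50.0, 50.0]        # saturation capacities
b = [0.10, 0.15]        # equilibrium constants
c = [2.0, 2.0]          # liquid-phase concentrations

print("adsorbed-phase concentrations q_A, q_B:", competitive_langmuir(c, Q, b))
```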
Abstract:
Driven by concerns about rising energy costs, security of supply and climate change, a new wave of Sustainable Energy Technologies (SETs) has been embraced by the Irish consumer. Systems such as solar collectors, heat pumps and biomass boilers have become common due to government-backed financial incentives and revisions of the building regulations. However, there is a deficit of knowledge and understanding of how these technologies operate and perform under Ireland's maritime climate. This AQ-WBL project was designed to address both of these needs by developing a Data Acquisition (DAQ) system to monitor the performance of such technologies and a web-based learning environment to disseminate performance characteristics and supplementary information about these systems. A DAQ system consisting of 108 sensors was developed as part of Galway-Mayo Institute of Technology's (GMIT's) Centre for the Integration of Sustainable Energy Technologies (CiSET) in an effort to benchmark the performance of solar thermal collectors and Ground Source Heat Pumps (GSHPs) under the Irish maritime climate, research new methods of integrating these systems within the built environment and raise awareness of SETs. It has operated reliably for over 2 years and has acquired over 25 million data points. Raising awareness of these SETs is carried out through the dissemination of the performance data via an online learning environment. A learning environment was created to provide different user groups with a basic understanding of SETs with the support of performance data, through a novel five-step learning process, and two examples were developed for the solar thermal collectors and the weather station, which can be viewed at http://www.kdp1.aquaculture.ie/index.aspx. This online learning environment has been demonstrated to, and well received by, different groups of GMIT's undergraduate students, and plans have been made to develop it further to support education, awareness, research and regional development.
Abstract:
We present a computer-assisted analysis of combinatorial properties of the Cayley graphs of certain finitely generated groups: given a group with a finite set of generators, we study the density of the corresponding Cayley graph, that is, the least upper bound for the average vertex degree (= number of adjacent edges) over all finite subgraphs. It is known that an m-generated group is amenable if and only if the density of the corresponding Cayley graph equals 2m. We test amenable and non-amenable groups, and also groups for which amenability is unknown. In the latter class we focus on Richard Thompson's group F.
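A minimal sketch of the density computation on finite balls of a Cayley graph, using Z^2 with its standard generators as an amenable example (so the average degrees should approach 2m = 4); this only probes the definition and is not the computation carried out for Thompson's group F.

```python
# Illustrative sketch: average vertex degree of finite balls in the Cayley graph
# of Z^2 with the standard two generators (an amenable 2-generated group, so the
# density, i.e. the sup of average degrees of finite subgraphs, equals 2m = 4).
from itertools import product

def ball_average_degree(radius):
    # Vertices of the ball of the given radius (word metric = L1 metric on Z^2).
    vertices = {(x, y) for x, y in product(range(-radius, radius + 1), repeat=2)
                if abs(x) + abs(y) <= radius}
    edges = 0
    for (x, y) in vertices:
        for dx, dy in ((1, 0), (0, 1)):          # one direction per generator
            if (x + dx, y + dy) in vertices:
                edges += 1
    return 2.0 * edges / len(vertices)           # average degree = 2E / V

for r in (2, 5, 10, 40):
    print(f"radius {r:3d}: average degree = {ball_average_degree(r):.3f}")
# The averages approach 4 = 2m as the radius grows, as expected for Z^2.
```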
Abstract:
PURPOSE: To evaluate the feasibility of visualizing the stent lumen using coronary magnetic resonance angiography in vitro. MATERIAL AND METHODS: Nineteen different coronary stents were implanted in plastic tubes with an inner diameter of 3 mm. The tubes were positioned in a plastic container filled with gel and included in a closed flow circuit (constant flow 18 cm/sec). The magnetic resonance images were obtained with a dual inversion fast spin-echo sequence. For intraluminal stent imaging, subtraction images were calculated from scans with and without flow. Subsequently, intraluminal signal properties were objectively assessed and compared. RESULTS: As a function of the stent type, various degrees of in-stent signal attenuation were observed. Tantalum stents demonstrated minimal intraluminal signal attenuation. For nitinol stents, the stent lumen could be identified, but the intraluminal signal was markedly reduced. Steel stents resulted in the most pronounced intraluminal signal voids. CONCLUSIONS: With the present technique, radiofrequency penetration into the stents is strongly influenced by the stent material. These findings may have important implications for future stent design and stent imaging strategies.
Abstract:
Functional connectivity in the human brain can be represented as a network using electroencephalography (EEG) signals. These networks, whose node counts can vary from tens to hundreds, are characterized by neurobiologically meaningful graph theory metrics. This study investigates the degree to which various graph metrics depend upon network size. To this end, EEGs from 32 normal subjects were recorded and functional networks of three different sizes were extracted. A state-space based method was used to calculate cross-correlation matrices between different brain regions. These correlation matrices were used to construct binary adjacency connectomes, which were assessed with regard to a number of graph metrics such as clustering coefficient, modularity, efficiency, economic efficiency, and assortativity. We showed that the estimates of these metrics differ significantly depending on the network size. Larger networks had higher efficiency, higher assortativity and lower modularity compared to smaller networks of the same density. These findings indicate that network size should be considered in any comparison of networks across studies.
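A minimal sketch of the size dependence being investigated, assuming random binary graphs at a fixed density as stand-ins for the thresholded EEG correlation networks, with the graph metrics computed via NetworkX.

```python
# Illustrative sketch: how common graph metrics vary with network size at a
# fixed edge density, computed on random binary graphs (not real EEG networks).
import networkx as nx

density = 0.2
for n in (32, 64, 128):                      # stand-ins for different node counts
    G = nx.gnp_random_graph(n, density, seed=0)
    communities = nx.algorithms.community.greedy_modularity_communities(G)
    print(f"n = {n:3d}:",
          f"clustering = {nx.average_clustering(G):.3f},",
          f"efficiency = {nx.global_efficiency(G):.3f},",
          f"assortativity = {nx.degree_assortativity_coefficient(G):.3f},",
          f"modularity = {nx.algorithms.community.modularity(G, communities):.3f}")
```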
Abstract:
Red blood cell (RBC) parameters such as morphology, volume, refractive index, and hemoglobin content are of great importance for diagnostic purposes. Existing approaches require complicated calibration procedures and robust cell perturbation. As a result, reference values for normal RBCs differ depending on the method used. We present a method for measuring parameters of intact individual RBCs using digital holographic microscopy (DHM), a new interferometric and label-free technique with nanometric axial sensitivity. The results are compared with values obtained by conventional techniques for RBCs of the same donor and with previously published figures. A DHM equipped with a laser diode (lambda = 663 nm) was used to record holograms in an off-axis geometry. Measurements of both RBC refractive indices and volumes were achieved by monitoring the quantitative phase map of the RBCs during the sequential perfusion of two isotonic solutions with different refractive indices, obtained by the use of Nycodenz (decoupling procedure). The volume of RBCs labeled with the membrane dye DiI was analyzed by confocal microscopy. The mean cell volume (MCV), red blood cell distribution width (RDW), and mean cell hemoglobin concentration (MCHC) were also measured with an impedance volume analyzer. DHM yielded an RBC refractive index n = 1.418 +/- 0.012, volume 83 +/- 14 fl, MCH = 29.9 pg, and MCHC 362 +/- 40 g/l. Erythrocyte MCV, MCH, and MCHC obtained with the impedance volume analyzer were 82 fl, 28.6 pg, and 349 g/l, respectively. Confocal microscopy yielded 91 +/- 17 fl for RBC volume. In conclusion, DHM in combination with the decoupling procedure allows noninvasive measurement of the volume, refractive index, and hemoglobin content of single living RBCs with high accuracy.
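The decoupling procedure reduces, per pixel, to solving two phase equations (one per perfusion medium) for the two unknowns, cell thickness and intracellular refractive index; the sketch below shows that algebra with made-up phase values, not the processing of full DHM phase maps.

```python
# Illustrative sketch of the decoupling procedure: two quantitative phase values
# measured in two isotonic media of different refractive index are solved for
# the cell thickness h and the intracellular refractive index n_c, using
#   phi_k = (2*pi/lambda) * (n_c - n_medium_k) * h.
# The phase and medium-index values below are made up for illustration.
import numpy as np

lam = 663e-9                 # laser wavelength [m]
n_m1, n_m2 = 1.3345, 1.3500  # assumed refractive indices of the two media

def decouple(phi1, phi2):
    """Return (thickness h [m], intracellular refractive index n_c)."""
    h = lam * (phi1 - phi2) / (2 * np.pi * (n_m2 - n_m1))
    n_c = n_m1 + lam * phi1 / (2 * np.pi * h)
    return h, n_c

phi1, phi2 = 1.60, 1.30      # hypothetical phase shifts [rad] in media 1 and 2
h, n_c = decouple(phi1, phi2)
print(f"thickness = {h * 1e6:.2f} um, intracellular index = {n_c:.4f}")
```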