982 results for Elasticity Imaging Techniques


Relevance:

20.00%

Publisher:

Abstract:

Recent studies of mobile Web trends show the continued explosion of mobile-friendly content. However, the large number and heterogeneity of mobile devices pose several challenges for Web programmers, who want automatic, context-aware delivery and adaptation of content to mobile devices. Hence, the device detection phase plays an important role in this process. In this chapter, the authors compare the most widely used approaches to mobile device detection. Based on this study, they present an architecture for detecting and delivering uniform m-Learning content to students in a higher education school. The authors focus mainly on the XML device capabilities repository and on the REST API Web Service for dealing with device data. For the former, the authors detail the respective capabilities schema and present a new caching approach. For the latter, they present an extension of the current API. Finally, the authors validate their approach by presenting the overall data and statistics collected through the Google Analytics service, in order to better understand the adherence to the mobile Web interface, its evolution over time, and its main weaknesses.
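
The chapter's own repository schema and API are not given in the abstract, so the following is only a minimal sketch of the idea it describes: a device-capability lookup backed by an XML repository with a cache in front of it. The file name "devices.xml" and its element structure are assumptions made purely for illustration.

```python
# Minimal sketch (not the authors' implementation): a device-capability lookup
# that caches entries parsed from an XML repository, keyed by User-Agent string.
# The file "devices.xml" and its <device .../> structure are assumed for illustration.
import xml.etree.ElementTree as ET
from functools import lru_cache

CAPABILITIES_FILE = "devices.xml"  # hypothetical repository path

def _load_repository(path: str) -> dict:
    """Parse <device user_agent="..." screen_width="..."/> entries into a dict."""
    try:
        tree = ET.parse(path)
    except (FileNotFoundError, ET.ParseError):
        return {}  # no repository available: fall back to defaults only
    return {
        dev.get("user_agent"): dict(dev.attrib)
        for dev in tree.getroot().iter("device")
    }

_REPOSITORY = _load_repository(CAPABILITIES_FILE)

@lru_cache(maxsize=1024)
def detect_device(user_agent: str) -> dict:
    """Return the capability record for a User-Agent, or a desktop fallback."""
    return dict(_REPOSITORY.get(user_agent, {"class": "desktop"}))
```

In practice such a cache only matters because the same User-Agent strings repeat across requests, which is what motivates the caching approach the chapter mentions.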

Relevance:

20.00%

Publisher:

Abstract:

Doctorate in Conservation and Restoration, speciality in Theory, History and Techniques

Relevance:

20.00%

Publisher:

Abstract:

Purpose: This study aims to investigate the influence of tube potential (kVp) variation on perceptual image quality and effective dose for pelvis examinations using automatic exposure control (AEC) and non-AEC in a computed radiography (CR) system. Methods and Materials: The effects of using AEC and non-AEC were determined by applying the 10 kVp rule in two experiments with an anthropomorphic pelvis phantom. Images were acquired in 10 kVp increments (60-120 kVp) for both experiments. The first experiment, based on seven AEC combinations, produced 49 images. The mean mAs values from each kVp increment were used as the baseline for the second experiment, which produced 35 images. A total of 84 images were produced, and a panel of five experienced observers scored the images using two-alternative forced choice (2-AFC) visual grading software. PCXMC software was used to estimate the effective dose. Results: A decrease in perceptual image quality as the kVp increases was observed in both the non-AEC and AEC experiments; however, no statistically significant differences (p > 0.05) were found. Image quality scores from all observers at 10 kVp increments for all mAs values in non-AEC mode demonstrate better scores up to 90 kVp. Effective dose results show a statistically significant decrease (p = 0.000) in the 75th percentile from 0.3 mSv at 60 kVp to 0.1 mSv at 120 kVp when applying the 10 kVp rule in non-AEC mode. Conclusion: No significant reduction in perceptual image quality is observed when increasing kVp, whilst a marked and significant effective dose reduction is observed.
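
The 10 kVp rule referred to above is the common radiographic rule of thumb that raising the tube potential by 10 kVp allows roughly half the mAs for a similar receptor exposure. The sketch below only illustrates that rule; the starting exposure values are invented and are not the study's actual settings.

```python
# Minimal sketch of the 10 kVp rule used to build a non-AEC exposure series:
# for every +10 kVp step the mAs is halved to keep receptor exposure roughly
# constant. Starting values are illustrative, not taken from the paper.
def ten_kvp_series(start_kvp: float, start_mas: float, stop_kvp: float = 120):
    """Yield (kVp, mAs) pairs following the 10 kVp / halve-the-mAs rule."""
    kvp, mas = start_kvp, start_mas
    while kvp <= stop_kvp:
        yield kvp, round(mas, 2)
        kvp += 10
        mas /= 2  # halve the mAs for every +10 kVp increase

for kvp, mas in ten_kvp_series(60, 40):
    print(f"{kvp:.0f} kVp  ->  {mas} mAs")
```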

Relevance:

20.00%

Publisher:

Abstract:

Mathematical models and statistical analysis are key instruments in soil science research, as they can describe and/or predict the current state of a soil system. These tools allow us to explore the behaviour of soil-related processes and properties and to generate new hypotheses for future experimentation. A good model and analysis of soil property variation, allowing us to draw sound conclusions and to estimate spatially correlated variables at unsampled locations, clearly depends on the amount and quality of the data and on the robustness of the techniques and estimators. The quality of the data, in turn, depends on a competent data collection procedure and on capable laboratory analytical work. Following the standard soil sampling protocols available, soil samples should be collected according to key factors such as a convenient spatial scale, landscape homogeneity (or non-homogeneity), land colour, soil texture, land slope and solar exposure. Obtaining good quality data from forest soils is predictably expensive, as it is labour intensive and demands considerable manpower and equipment both in field work and in laboratory analysis. Moreover, the sampling scheme to be used in a forest data collection campaign is not simple to design, as the chosen sampling strategies depend strongly on soil taxonomy. In fact, a sampling grid cannot be followed if rocks are found at the intended collection depth, if no soil is found at all, or if large trees prevent collection. Consequently, a proficient design of a soil sampling campaign in a forest field is not always a simple process and sometimes represents a real challenge. In this work, we present some of the difficulties that occurred during two experiments on forest soil conducted to study the spatial variation of some soil physical-chemical properties. Two different sampling protocols were considered for monitoring two types of forest soil located in NW Portugal: umbric regosol and lithosol. Two different pieces of equipment were also used for sample collection: a manual auger and a shovel. Both scenarios were analysed, and the results allow us to conclude that monitoring forest soil for mathematical and statistical investigation requires a data collection procedure compatible with established protocols, but a pre-defined grid often fails when the variability of the soil property is not uniform in space. In that case, the sampling grid should be conveniently adapted from one part of the landscape to another, and this fact should be taken into account in the mathematical procedure.
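
The abstract mentions estimating spatially correlated soil properties at unsampled locations but does not name the estimator used, so the following is only an illustrative stand-in: a simple inverse-distance-weighting (IDW) interpolation with made-up coordinates and pH values. In a geostatistical workflow an estimator such as ordinary kriging would typically be preferred.

```python
# Illustrative sketch only: IDW estimation of a soil property at an unsampled
# location from nearby samples. Coordinates and pH values below are invented.
import numpy as np

def idw_estimate(xy_samples, values, xy_target, power=2.0):
    """Estimate a soil property at xy_target from sampled locations (IDW)."""
    d = np.linalg.norm(np.asarray(xy_samples, dtype=float) - np.asarray(xy_target, dtype=float), axis=1)
    if np.any(d == 0):                      # target coincides with a sample point
        return float(values[np.argmin(d)])
    w = 1.0 / d**power                      # closer samples get larger weights
    return float(np.dot(w, values) / w.sum())

samples = [(0, 0), (10, 0), (0, 10), (10, 10)]   # sample coordinates (m)
ph = np.array([4.8, 5.1, 4.9, 5.3])              # measured pH at those points
print(idw_estimate(samples, ph, (4, 6)))
```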

Relevance:

20.00%

Publisher:

Abstract:

In the present study, three techniques for obtaining outer membrane enriched fractions from Yersinia pestis were evaluated. The techniques analysed were differential solubilization of the cytoplasmic membrane with Sarkosyl or Triton X-100, and centrifugation in sucrose density gradients. Sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE) of the outer membranes isolated by the different methods resulted in similar protein patterns. The measurement of NADH dehydrogenase and succinate dehydrogenase (inner membrane enzymes) indicated that the outer membrane preparations obtained by the three methods were pure enough for analytical studies. In addition, preliminary evidence on the potential use of outer membrane proteins for the identification of geographic variants of wild Y. pestis isolates is presented.

Relevance:

20.00%

Publisher:

Abstract:

Reliable flow simulation software is indispensable for determining an optimal injection strategy in Liquid Composite Molding processes. Several methodologies can be implemented in standard software in order to reduce CPU time, and post-processing techniques are one of them. Post-processing a finite element solution is a well-known procedure, which consists of recalculating the originally obtained quantities so that the rate of convergence increases without the need for expensive remeshing techniques. Post-processing is especially effective in problems where better accuracy is required for derivatives of nodal variables in regions where the essential (Dirichlet) boundary condition is imposed strongly. In previous works, the influence of the smoothness of a non-homogeneous Dirichlet condition imposed on a smooth front was examined. However, due to discretization, a rather non-smooth boundary is usually obtained at each time step of the infiltration process, and the direct application of post-processing techniques then does not improve the final results as expected. The new contribution of this paper lies in an improvement of the standard methodology. The improved results clearly show that the recalculated flow front is closer to the "exact" one, is smoother than the previous one, and reduces local disturbances relative to the "exact" solution.
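
The paper's actual post-processing scheme is not spelled out in the abstract, so the sketch below only conveys the general idea of derivative post-processing on a simple case: on a 1D linear finite element mesh the raw derivative is piecewise constant per element, and a smoother, more accurate nodal derivative can be recovered by length-weighted averaging of the derivatives of the adjacent elements. This is a generic gradient-recovery illustration, not the authors' method.

```python
# Generic sketch of derivative post-processing (not the paper's exact scheme):
# recover smoothed nodal derivatives from piecewise-constant element derivatives
# by length-weighted averaging over the adjacent elements.
import numpy as np

def recover_nodal_gradient(x, u):
    """x: sorted node coordinates, u: nodal solution values."""
    h = np.diff(x)                  # element lengths
    g_elem = np.diff(u) / h         # piecewise-constant element derivatives
    g_node = np.empty_like(np.asarray(u, dtype=float))
    g_node[0], g_node[-1] = g_elem[0], g_elem[-1]            # boundary nodes
    w_left, w_right = h[:-1], h[1:]                          # adjacent element lengths
    g_node[1:-1] = (w_left * g_elem[:-1] + w_right * g_elem[1:]) / (w_left + w_right)
    return g_node

x = np.linspace(0.0, 1.0, 11)
u = x**2                            # exact derivative is 2x
print(recover_nodal_gradient(x, u)) # interior nodes reproduce 2x exactly
```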

Relevance:

20.00%

Publisher:

Abstract:

Post-processing a finite element solution is a well-known technique, which consists of recalculating the originally obtained quantities so that the rate of convergence increases without the need for expensive remeshing techniques. Post-processing is especially effective in problems where better accuracy is required for derivatives of nodal variables in regions where the essential (Dirichlet) boundary condition is imposed strongly. Consequently, such an approach can be exceptionally effective in modelling resin infiltration under the quasi-steady-state assumption with remeshing techniques and explicit time integration, because only the free-front normal velocities are needed to advance the resin front to its next position. The new contribution is the post-processing analysis and implementation of the free-boundary velocities in mesolevel infiltration analysis. This implementation ensures better accuracy even on coarser meshes, which in consequence also reduces the computational time through the possibility of employing larger time steps.
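
As a small illustration of the explicit front advance described above (not the authors' implementation), each front node can be moved along its outward unit normal by the recovered normal velocity over one time step; the node positions, normals and velocities below are made up.

```python
# Minimal sketch of an explicit front advance: x_new = x + dt * v_n * n_hat.
import numpy as np

def advance_front(points, normals, v_normal, dt):
    """points: (n, 2) front nodes, normals: (n, 2) unit outward normals,
    v_normal: (n,) recovered normal velocities, dt: time step."""
    return points + dt * v_normal[:, None] * normals

front = np.array([[0.0, 0.0], [0.0, 1.0]])
n_hat = np.array([[1.0, 0.0], [1.0, 0.0]])   # front moving in +x
v_n = np.array([0.05, 0.04])                 # illustrative normal velocities
print(advance_front(front, n_hat, v_n, dt=0.1))
```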

Relevance:

20.00%

Publisher:

Abstract:

Introduction – The choice of treatment depends on several factors, including each patient's clinical status and prognosis. These factors play an important role in choosing the therapeutic intervention for bone metastases. Early detection and appropriate treatment can improve patients' quality of life and functional independence. Methodology – This article presents a systematic review of the literature of the last 15 years, identifying the different fractionation schemes (single fraction versus multiple fractions) and techniques used in radiotherapy for the treatment of bone metastases. Results – Recent advances in radiotherapy technology and treatment techniques help deliver highly conformal, image-guided doses for more precise treatment delivery. Stereotactic body radiotherapy (SBRT) makes it possible to delineate and escalate the dose in the tumours to be irradiated. In the case of bone metastases, the results for local tumour control and pain control have proven promising. Conventional radiotherapy of 8 Gy x 1, however, remains the most appropriate treatment for palliative patients. Conclusion – The treatment of bone metastases is complex, and a multidisciplinary approach is always necessary. Treatment should be individualized to suit each patient's symptoms and clinical status.

Relevance:

20.00%

Publisher:

Abstract:

Dissertation submitted to the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for the degree of Master in Biomedical Engineering

Relevance:

20.00%

Publisher:

Abstract:

Background: Mammography is considered the best imaging technique for breast cancer screening, and the radiographer plays an important role in its performance. Therefore, continuing education is critical to improving the performance of these professionals and thus providing better health care services. Objective: Our goal was to develop an e-learning course on breast imaging for radiographers, assessing its efficacy, effectiveness, and user satisfaction. Methods: A stratified randomized controlled trial was performed with radiographers and radiology students who already had mammography training, using pre- and post-knowledge tests and satisfaction questionnaires. The primary outcome was the improvement in test results (percentage of correct answers), using intention-to-treat and per-protocol analyses. Results: A total of 54 participants were assigned to the intervention (20 students plus 34 radiographers), with 53 controls (19+34). The intervention was completed by 40 participants (11+29), with 4 (2+2) discontinued interventions and 10 (7+3) lost to follow-up. Differences in the primary outcome were found between intervention and control: 21 versus 4 percentage points (pp), P<.001. Stratified analysis showed an effect in radiographers (23 pp vs 4 pp; P=.004) but was unclear in students (18 pp vs 5 pp; P=.098). Nonetheless, differences in students' posttest results were found (88% vs 63%; P=.003), which were absent in the pretest (63% vs 63%; P=.106). The per-protocol analysis showed a higher effect (26 pp vs 2 pp; P<.001), both in students (25 pp vs 3 pp; P=.004) and radiographers (27 pp vs 2 pp; P<.001). Overall, 85% were satisfied with the course, and 88% considered it successful. Conclusions: This e-learning course is effective, especially for radiographers, which highlights the need for continuing education.
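
To make the primary outcome concrete, the sketch below computes per-participant improvement in percentage points (post minus pre) and compares intervention with control. The abstract does not state which statistical test was used, so a Mann-Whitney U test and the score arrays are shown purely for illustration.

```python
# Illustrative sketch of the primary-outcome computation; data and test choice
# are assumptions, not taken from the study.
import numpy as np
from scipy.stats import mannwhitneyu

def improvement_pp(pre, post):
    """Per-participant improvement in percentage points (post minus pre)."""
    return np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)

interv = improvement_pp(pre=[60, 55, 70, 65], post=[85, 80, 88, 84])
control = improvement_pp(pre=[62, 58, 66, 64], post=[66, 60, 70, 68])

stat, p = mannwhitneyu(interv, control, alternative="two-sided")
print(f"median improvement: intervention {np.median(interv)} pp, "
      f"control {np.median(control)} pp, P={p:.3f}")
```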

Relevance:

20.00%

Publisher:

Abstract:

Hydatid disease in tropical areas poses a serious diagnostic problem due to the high frequency of cross-reactivity with other endemic helminthic infections. The enzyme-linked immunosorbent assay (ELISA) and the double diffusion arc 5 test showed sensitivities of 73% and 57% and specificities of 84-95% and 100%, respectively. However, the specificity of ELISA was greatly increased by using ovine serum and phosphorylcholine in the diluent buffer. The hydatid antigen obtained from ovine cyst fluid showed three main protein bands of 64, 58 and 30 kDa on SDS-PAGE and immunoblotting. Sera from patients with onchocerciasis, cysticercosis, toxocariasis and Strongyloides infection cross-reacted with the 64 and 58 kDa bands by immunoblotting. However, none of the analyzed sera recognized the 30 kDa band, which seems to be specific in this assay. The immunoblotting showed a sensitivity of 80% and a specificity of 100% when used to recognize the 30 kDa band.
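
For readers less familiar with the reported figures, sensitivity and specificity are simple ratios over the confusion counts; the counts in the example below are invented for illustration and only the percentage style mirrors the abstract.

```python
# Definition sketch of the reported diagnostic metrics (invented counts).
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# e.g. 30 confirmed hydatidosis sera with 22 detected; 40 negative sera with 34 non-reactive
print(f"sensitivity = {sensitivity(22, 8):.0%}, specificity = {specificity(34, 6):.0%}")
```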

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented to obtain the degree of Doctor of Philosophy in Electrical Engineering, speciality in Perceptional Systems, by the Universidade Nova de Lisboa, Faculty of Sciences and Technology

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces a new method to blindly unmix hyperspectral data, termed dependent component analysis (DECA). This method decomposes a hyperspectral image into a collection of reflectance (or radiance) spectra of the materials present in the scene (endmember signatures) and the corresponding abundance fractions at each pixel. DECA assumes that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. These abundances are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. This method overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometry-based approaches. The effectiveness of the proposed method is illustrated using simulated data based on U.S.G.S. laboratory spectra and real hyperspectral data collected by the AVIRIS sensor over Cuprite, Nevada.
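
The linear mixing model that DECA assumes can be written as Y = A M^T + noise, with non-negative abundances that sum to one per pixel. The sketch below only generates synthetic data under that model (with Dirichlet-distributed abundances, as in the paper's assumption); it does not implement the GEM inference itself, and all sizes and parameters are illustrative.

```python
# Sketch of the linear mixing model assumed by DECA (data generation only).
import numpy as np

rng = np.random.default_rng(0)
n_bands, n_endmembers, n_pixels = 50, 3, 1000

M = rng.random((n_bands, n_endmembers))                    # endmember signatures (columns)
A = rng.dirichlet(alpha=[2.0, 1.0, 0.5], size=n_pixels)    # abundances: non-negative, rows sum to 1
noise = 0.01 * rng.standard_normal((n_pixels, n_bands))

Y = A @ M.T + noise                                        # observed pixels (n_pixels x n_bands)
print(Y.shape, A.sum(axis=1)[:5])                          # abundance constraint check
```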

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented to obtain the degree of Doctor in Informatics Engineering, by the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia

Relevance:

20.00%

Publisher:

Abstract:

More than ever, the number of decision support methods and computer-aided diagnosis systems applied to various areas of medicine is increasing. In breast cancer research, much work has been done to reduce false positives when such systems are used as a double-reading method. In this study, we present a set of data mining techniques applied to build a decision support system for breast cancer diagnosis. This method is geared to assist clinical practice in identifying mammographic findings such as microcalcifications, masses and even normal tissue, in order to avoid misdiagnosis. A reliable database was used, with 410 images from about 115 patients, containing previous reviews performed by radiologists and labelled as microcalcifications, masses and normal tissue findings. Two feature extraction techniques were used: the gray level co-occurrence matrix and the gray level run length matrix. For classification purposes, we considered various scenarios according to distinct patterns of lesions, and several classifiers, in order to determine the best performance in each case described. The classifiers used were Naïve Bayes, Support Vector Machines, k-Nearest Neighbors and decision trees (J48 and Random Forests). The results in distinguishing mammographic findings revealed high positive predictive values (PPV) and very good accuracy. Furthermore, related results are also presented for the classification of breast density and the BI-RADS® scale. The best predictive method found for all tested groups was the Random Forest classifier, and the best performance was achieved in distinguishing microcalcifications. The conclusions drawn from the several tested scenarios represent a new perspective on breast cancer diagnosis using data mining techniques.
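
A minimal sketch of the kind of pipeline described, using gray level co-occurrence matrix texture features fed to a Random Forest. The patch size, GLCM parameters, synthetic images and labels below are assumptions made only to show the mechanics; they do not reproduce the study's database, features or settings.

```python
# Illustrative GLCM-features + Random Forest pipeline (not the study's setup).
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier

def glcm_features(patch: np.ndarray) -> np.ndarray:
    """Gray level co-occurrence matrix features for one 8-bit image patch."""
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.array([graycoprops(glcm, p).mean() for p in props])

# Synthetic stand-ins for mammographic patches: 0 = normal tissue, 1 = finding
rng = np.random.default_rng(0)
patches = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)
labels = rng.integers(0, 2, size=40)

X = np.array([glcm_features(p) for p in patches])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(clf.predict(X[:5]))
```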