897 results for Medical image analysis


Relevance: 100.00%

Abstract:

Objective parameters that could provide a basis for food texture selection for elderly or dysphagic patients have not been established. We therefore aimed to develop a precise method of measuring large particles (>2 mm in diameter) in a bolus and an analytical method to provide a scientific rationale for food selection under conditions of masticatory dysfunction. We developed a new illumination system to evaluate the ability of twenty female participants (mean age, 23.4 +/- 4.3 years) to masticate carrots, peanuts and beef with the full number of masticatory strokes, and with half and one quarter of that number. We also evaluated mastication under suppressed force, regulated to 20% of the electromyographic activity of the masseter muscle. The intercept and inclination of the regression line for the distribution of large particles were adopted as coefficients for the discrimination of masticatory efficiency. A single set of coefficient thresholds, 0.10 for the intercept and 1.62 for the inclination, showed excellent discrimination of masticatory conditions for all three test foods, with high specificity and sensitivity. These results suggest that our method of analysing the distribution of particles >2 mm in diameter might provide a basis for the appropriate selection of food texture for patients with masticatory dysfunction from the standpoint of comminution.
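
A rough sketch of the discrimination step described above is given below, assuming particle diameters have already been measured from the bolus images; the size classes, the use of a simple cumulative fraction as the "distribution", and the direction of the threshold comparison are assumptions, since the abstract does not specify them.

```python
# Minimal sketch, not the authors' exact analysis: fit a regression line to the
# distribution of large particles and compare its intercept and inclination
# against the reported single thresholds (0.10 and 1.62).
import numpy as np

def discrimination_coefficients(diameters_mm, size_classes=(2.0, 3.0, 4.0, 5.0, 6.0)):
    """Return (intercept, inclination) of the regression line fitted to the
    fraction of particles larger than each size class (assumed transform)."""
    d = np.asarray(diameters_mm, dtype=float)
    fractions = [np.mean(d > c) for c in size_classes]
    inclination, intercept = np.polyfit(size_classes, fractions, 1)
    return intercept, inclination

def masticatory_efficiency_preserved(intercept, inclination,
                                     intercept_threshold=0.10,
                                     inclination_threshold=1.62):
    # Hypothetical decision rule: the abstract gives the threshold values but not
    # the direction of the comparison, so this is an assumption.
    return intercept <= intercept_threshold and abs(inclination) <= inclination_threshold
```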

Relevance: 100.00%

Abstract:

Assuming that textbooks give literary expression to the cultural and ideological values of a nation or group, we propose an analysis of the chemistry textbooks used in Brazilian universities throughout the twentieth century. We analyzed iconographic and textual aspects of 31 textbooks that had significant diffusion in Brazilian universities during that period. As a result of the iconographic analysis, nine categories of images were proposed: (1) laboratory and experimentation, (2) industry and production, (3) graphs and diagrams, (4) illustrations related to daily life, (5) models, (6) illustrations related to the history of science, (7) pictures or diagrams of animal, vegetable or mineral samples, (8) analogies and (9) concepts of physics. The distribution of images among the categories showed differing emphases in the presentation of chemical content, reflecting a commitment to different conceptions of chemistry over the period: chemistry was presented as an experimental science in the early twentieth century, the emphasis shifted to the principles of chemistry from the 1950s onward, and the period culminated in a chemistry of undeniable technological influence. The results showed that reflection not only on the history of science but also on the history of science education may be useful for the improvement of science education.

Relevance: 100.00%

Abstract:

The development of new procedures for quickly obtaining accurate information on the physiological potential of seed lots is essential for developing quality control programs for the seed industry. In this study, the effectiveness of an automated system of seedling image analysis (Seed Vigor Imaging System - SVIS) in determining the physiological potential of sun hemp seeds, and its relationship with electrical conductivity tests, was evaluated. SVIS evaluations were performed three and four days after sowing, and data on the vigor index and on the length and uniformity of seedling growth were collected. The electrical conductivity test was performed on replicates of 50 seeds placed in containers with 75 mL of deionised water at 25 ºC, with readings taken after 1, 2, 4, 8 and 16 hours of imbibition. Electrical conductivity measurements at 4 or 8 hours and the use of the SVIS on 3-day-old seedlings can effectively detect differences in vigor between sun hemp seed lots.
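
As a purely illustrative sketch (lot names and readings are hypothetical, and this is not the SVIS software itself), seed lots can be ranked by their mean leachate conductivity at the 4-hour reading, lower conductivity being conventionally read as higher vigor:

```python
# Hypothetical example: ranking seed lots by mean electrical conductivity of the
# soak water after 4 h of imbibition (lower conductivity ~ higher vigor).
import numpy as np

readings_4h = {           # replicate readings per lot, hypothetical values
    "lot_A": [12.1, 11.8, 13.0, 12.5],
    "lot_B": [18.4, 19.2, 17.9, 18.8],
    "lot_C": [15.0, 14.6, 15.8, 15.2],
}

for lot in sorted(readings_4h, key=lambda k: np.mean(readings_4h[k])):
    print(f"{lot}: mean conductivity {np.mean(readings_4h[lot]):.1f}")
```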

Relevance: 100.00%

Abstract:

Work carried out by: Garijo, J. C., Hernández León, S.

Relevance: 100.00%

Abstract:

The subject of this doctoral dissertation is the definition of a new methodology for the morphological and morphometric study of fossilized human teeth, and it therefore seeks to contribute to the reconstruction of human evolutionary history, a contribution intended to extend to the different species of fossil hominids. Standardized investigative methodologies are lacking, both for the orientation of the teeth under study and for the analyses that can be carried out once they are oriented. The opportunity to standardize a primary analysis methodology is furnished by the study of certain early Neanderthal and pre-Neanderthal molars recovered in two caves in southern Italy [Grotta Taddeo (Taddeo Cave) and Grotta del Poggio (Poggio Cave), near Marina di Camerata, Campania]. To these we can add other molars of Neanderthals and of Upper Paleolithic modern humans, scanned specifically in the paleoanthropology laboratory of the University of Arkansas (Fayetteville, Arkansas, USA), in order to enlarge the paleoanthropological sample and thereby make the final results of the analyses more significant. The new analysis methodology proceeds as follows: 1. Standardization of an orientation system for first molars (upper and lower), starting from a scan of a sample of 30 molars belonging to modern humans (15 lower M1 and 15 upper M1), the definition of landmarks, the comparison of various systems, and the choice of an orientation system for each of the two dental types. 2. Definition of an analysis procedure that considers only the first 4 millimeters of the dental crown starting from the collar: five sections parallel to the plane of orientation are taken, spaced 1 millimeter apart, the intention being to obtain a method that allows fossil species to be differentiated even when the teeth are worn. 3. Results and conclusions. The new approach to the study of teeth provides a considerable quantity of information that could be evaluated better by enlarging the fossil sample. It has proven to be a valid tool for evolutionary classification, allowing the Neanderthal sample to be differentiated from that of modern humans. In particular, the molars of Grotta Taddeo, whose species of origin it had not previously been possible to determine with certainty, are classified through the present research as Neanderthal.
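
The sectioning step described above (point 2) can be sketched as follows, assuming the oriented tooth is available as a binary voxel volume with the orientation plane normal to the z axis and the collar at z = 0; the voxel layout and isotropic voxel size are assumptions, not details taken from the dissertation.

```python
# Sketch of sampling five crown sections, 1 mm apart, within the first 4 mm
# above the collar of an oriented tooth volume, and measuring their areas.
import numpy as np

def crown_section_areas(volume, voxel_size_mm, n_sections=5, spacing_mm=1.0):
    """volume: binary 3D array (x, y, z) with the collar at z = 0.
    Returns the cross-sectional area (mm^2) at each sectioning plane."""
    areas = []
    for i in range(n_sections):
        z_index = int(round(i * spacing_mm / voxel_size_mm))
        section = volume[:, :, z_index]            # slice parallel to the orientation plane
        areas.append(section.sum() * voxel_size_mm ** 2)
    return areas
```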

Relevance: 100.00%

Abstract:

The study of artificial intelligence aims at solving a class of problems whose solution requires cognitive processes that are difficult to encode in an algorithm. Visual recognition of shapes and figures, the interpretation of sounds, and games of incomplete knowledge all rely on the human ability to interpret partial inputs as if they were complete, and to act accordingly. In the first chapter of this thesis, a simple mathematical formalism is constructed to describe the act of making choices. The "learning" process is described in terms of the maximization of a performance function over a parameter space for an ansatz of a function from a vector space to a finite, discrete set of choices, using a training set that provides examples of the correct choices to be reproduced. In light of this formalism, some of the most widespread artificial intelligence techniques are analysed, and some of the problems arising from the use of these techniques are highlighted. In the second chapter, the same formalism is applied to a less intuitive but more functional redefinition of the performance function which, for a linear ansatz, allows the explicit formulation of a set of equations in the components of the parameter-space vector that locates the absolute maximum of the performance function. The solution of this set of equations is treated by means of the contraction mapping theorem. A natural polynomial generalization is also presented. In the third chapter, some examples to which the results of the second chapter can be applied are studied in more detail. The concept of the intrinsic degree of a problem is introduced. Several performance optimizations are also discussed, such as the elimination of zeros, analytical precomputation, fingerprinting, and the reordering of components for the partial expansion of high-dimensional scalar products. Finally, single-choice problems are introduced, i.e. the class of problems for which a training set is available for only one choice. The fourth chapter discusses in more detail an example of application in the field of medical imaging diagnostics, in particular the problem of computer-aided detection of microcalcifications in mammograms.
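
As a minimal illustration of the linear-ansatz setting described above (a parameterized map from a feature vector to a finite set of choices, fitted on a training set of correct choices), the sketch below uses a simple perceptron-style update; the thesis's redefined performance function and its contraction-mapping solution are not reproduced here.

```python
# Illustrative linear ansatz over a finite choice set: one weight vector per
# choice, decision by argmax of the scores, trained by a perceptron-style rule.
import numpy as np

def train_linear_ansatz(X, y, n_choices, epochs=50, lr=0.1):
    """X: (n_samples, n_features) inputs; y: integer choice labels in [0, n_choices)."""
    W = np.zeros((n_choices, X.shape[1]))
    for _ in range(epochs):
        for x, target in zip(X, y):
            predicted = int(np.argmax(W @ x))
            if predicted != target:          # reward the correct choice, penalise the wrong one
                W[target] += lr * x
                W[predicted] -= lr * x
    return W

def choose(W, x):
    return int(np.argmax(W @ x))
```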

Relevance: 100.00%

Abstract:

In a world that increasingly demands the automation of activities along the industrial production chain, computer vision is a fundamental tool for what is already internationally recognized as the Fourth Industrial Revolution, or Industry 4.0. Using this tool, I undertook, at the company Syngenta, the study of the problem of automatically counting the number of leaves of a plant. The problem was tackled using two different approaches inspired by the literature. The thesis also contains the design description of a further method, not yet present in the literature. The methodologies are explained in detail, and the results obtained with the first two approaches are compared. In the final chapter, conclusions are drawn on the basis of the results obtained and their analysis.
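
For illustration only, a common baseline for this kind of task is color-based segmentation followed by connected-component counting; the sketch below is such a baseline under assumed image conditions (top view, green foliage on a non-green background) and is not necessarily either of the approaches used in the thesis.

```python
# Baseline leaf-counting sketch: segment "greener than red/blue" pixels,
# clean the mask morphologically, and count connected components.
import numpy as np
from skimage import io, morphology, measure

def count_leaves(image_path, min_leaf_area=200):
    rgb = io.imread(image_path)[..., :3].astype(float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mask = (g > r) & (g > b)                              # crude foliage mask
    mask = morphology.binary_opening(mask, morphology.disk(3))
    mask = morphology.remove_small_objects(mask, min_size=min_leaf_area)
    labels = measure.label(mask)
    return int(labels.max())                              # number of leaf-like blobs
```

Counting connected components undercounts overlapping leaves, which is precisely why more elaborate approaches are needed in practice.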

Relevance: 100.00%

Abstract:

Statistical models have recently been introduced in computational orthopaedics to investigate bone mechanical properties across several populations. A fundamental aspect of the construction of statistical models is the establishment of accurate anatomical correspondences among the objects of the training dataset. Various methods have been proposed to solve this problem, such as mesh morphing or image registration algorithms. The objective of this study was to compare a mesh-based and an image-based statistical appearance model approach for the creation of finite element (FE) meshes. A computed tomography (CT) dataset of 157 human left femurs was used for the comparison. For each approach, 30 finite element meshes were generated with the models. The quality of the obtained FE meshes was evaluated in terms of the volume, size and shape of the elements. Results showed that the quality of the meshes obtained with the image-based approach was higher than that of the meshes obtained with the mesh-based approach. Future studies are required to evaluate the impact of this finding on the final mechanical simulations.
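
Element quality can be scored in many ways; the sketch below, which computes the volume and the longest-to-shortest edge ratio of each tetrahedral element, is one simple possibility and not necessarily the metrics used in the study.

```python
# Sketch of simple quality metrics for a tetrahedral FE mesh given as node
# coordinates and 4-node connectivity.
import numpy as np
from itertools import combinations

def tet_quality(nodes, elements):
    """nodes: (n_nodes, 3) coordinates; elements: (n_elems, 4) node indices."""
    volumes, edge_ratios = [], []
    for elem in elements:
        p = nodes[elem]
        vol = abs(np.linalg.det(p[1:] - p[0])) / 6.0      # tetrahedron volume
        edges = [np.linalg.norm(p[i] - p[j]) for i, j in combinations(range(4), 2)]
        volumes.append(vol)
        edge_ratios.append(max(edges) / min(edges))       # 1.0 is ideal, larger is worse
    return np.array(volumes), np.array(edge_ratios)
```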

Relevance: 100.00%

Abstract:

Morphometric investigations of the lung using a point and intersection counting strategy are often unable to reveal the full set of morphological changes. This happens particularly when structural modifications are not expressed as changes in volume density and when coarse and fine surface density alterations cancel each other out at different magnifications. Making use of digital image processing, we present a methodological approach that makes it possible to quantify, easily and quickly, changes in the geometrical properties of the parenchymal lung structure and that closely reflects the visual appreciation of those changes. Randomly sampled digital images from light microscopic sections of lung parenchyma are filtered, binarized, and skeletonized. The lung septa are thus represented as a single-pixel-wide line network with nodal points and end points and the corresponding internodal and end segments. By automatically counting the number of points and measuring the lengths of the skeletal segments, the lung architecture can be characterized and very subtle structural changes can be detected. This new methodological approach to lung structure analysis is highly sensitive to morphological changes in the parenchyma: it detected highly significant quantitative alterations in the structure of the lungs of rats treated with a glucocorticoid hormone, where classical morphometry had partly failed.
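
A minimal sketch of this kind of skeleton-based measurement is given below, using common scikit-image and SciPy routines; the filtering and thresholding choices are placeholders rather than the exact pipeline of the study.

```python
# Sketch: binarize a parenchyma image, skeletonize the septal network, and count
# end points and nodal points from the number of skeleton neighbours per pixel.
import numpy as np
from scipy import ndimage
from skimage import filters, morphology

def skeleton_parameters(gray_image):
    smoothed = filters.gaussian(gray_image, sigma=1)
    binary = smoothed < filters.threshold_otsu(smoothed)   # assumes septa darker than airspace
    skeleton = morphology.skeletonize(binary)               # single-pixel-wide septal network
    neighbours = ndimage.convolve(skeleton.astype(int), np.ones((3, 3), int),
                                  mode="constant") - skeleton
    end_points = int(np.sum(skeleton & (neighbours == 1)))
    nodal_points = int(np.sum(skeleton & (neighbours >= 3)))
    return {"end_points": end_points,
            "nodal_points": nodal_points,
            "skeleton_length_px": int(skeleton.sum())}      # crude total length in pixels
```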

Relevance: 100.00%

Abstract:

Glucocorticoids (GC) are successfully applied in neonatology to improve lung maturation in preterm babies. Animal studies show that GC can also impair lung development. In this investigation, we used a new approach based on digital image analysis. Microscopic images of lung parenchyma were skeletonised and the geometrical properties of the septal network were characterised by analysing the 'skeletal' parameters. Inhibition of the process of alveolarisation after extensive administration of small doses of GC in newborn rats was confirmed by significant changes in the 'skeletal' parameters. The induced structural changes in the lung parenchyma were still present after 60 days in adult rats, clearly indicating a long-lasting or even permanent impairment of lung development and maturation caused by GC. Conclusion: digital image analysis and skeletonisation proved to be a highly suitable approach for assessing structural changes in lung parenchyma.
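
Once the 'skeletal' parameters have been extracted per animal, the group comparison can be as simple as the following sketch; the values are hypothetical and the choice of Welch's t-test is an assumption, not necessarily the statistics used in the study.

```python
# Hypothetical group comparison of one 'skeletal' parameter (e.g. mean
# internodal segment length per animal, in pixels) with Welch's t-test.
import numpy as np
from scipy import stats

gc_treated = np.array([31.2, 29.8, 33.5, 30.1, 32.4])
controls   = np.array([24.6, 25.9, 23.8, 26.3, 25.1])

t_stat, p_value = stats.ttest_ind(gc_treated, controls, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.4f}")
```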

Relevance: 100.00%

Abstract:

Water flow and solute transport through soils are strongly influenced by the spatial arrangement of soil materials with different hydraulic and chemical properties. Knowing the specific or statistical arrangement of these materials is considered a key toward improved predictions of solute transport. Our aim was to obtain two-dimensional material maps from photographs of exposed profiles. We developed a segmentation and classification procedure and applied it to images of a very heterogeneous sand tank, which was used for a series of flow and transport experiments. The segmentation was based on thresholds of soil color, estimated from local median gray values, and of soil texture, estimated from local coefficients of variation of gray values. Important steps were the correction of inhomogeneous illumination and reflection and the incorporation of prior knowledge into the filters used to extract the image features and to smooth the results morphologically. We could check and confirm the success of our mapping by comparing the estimated sand distribution with the designed sand distribution in the tank. The resulting material map was later used as input for modelling flow and transport through the sand tank. Similar segmentation procedures may be applied to any high-density raster data, including photographs or spectral scans of field profiles.
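
A compact sketch of this two-feature segmentation is shown below; the window size, the two thresholds, and the closing-based morphological smoothing are placeholders for the values and filters calibrated in the study.

```python
# Sketch: classify a flat-field-corrected grayscale profile photograph into four
# material classes from local color (median gray value) and local texture
# (coefficient of variation of gray values), then smooth each class map.
import numpy as np
from scipy import ndimage
from skimage import morphology

def classify_materials(gray, window=15, color_threshold=0.5, texture_threshold=0.2):
    local_median = ndimage.median_filter(gray, size=window)
    local_mean = ndimage.uniform_filter(gray, size=window)
    local_sq_mean = ndimage.uniform_filter(gray ** 2, size=window)
    local_std = np.sqrt(np.maximum(local_sq_mean - local_mean ** 2, 0))
    local_cv = local_std / np.maximum(local_mean, 1e-6)

    dark = local_median < color_threshold
    coarse = local_cv > texture_threshold
    classes = dark.astype(int) * 2 + coarse.astype(int)    # four color/texture combinations

    smoothed = np.zeros_like(classes)                       # simple per-class closing
    for c in range(4):
        mask = morphology.binary_closing(classes == c, morphology.disk(3))
        smoothed[mask] = c
    return smoothed
```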

Relevance: 100.00%

Abstract:

Firn microstructure is accurately characterized using images obtained from scanning electron microscopy (SEM). Visibly etched grain boundaries within the images are used to create a skeleton outline of the microstructure. A pixel-counting utility is applied to the outline to determine grain area. Firn grain sizes calculated using the technique described here are compared to those calculated using the techniques of Gow (1969) and Gay and Weiss (1999) on samples of the same material, and are found to be substantially smaller. The differences in grain size between the techniques are attributed to sampling deficiencies (e.g. the inclusion of pore filler in the grain area) in the earlier methods. The new technique offers the advantages of greater accuracy and the ability to resolve individual components of the microstructure (grain and pore), which have important applications in ice-core analyses. The new method is validated by calculating activation energies of grain-boundary diffusion, using values predicted from the ratio of grain-size measurements between the new and existing techniques. The resulting activation energy falls within the range of values previously reported for firn and ice.
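
The pixel-counting step can be sketched as follows, assuming the etched grain boundaries have already been traced into a one-pixel-wide boundary outline and that the pixel size is known from the SEM calibration.

```python
# Sketch: label the regions enclosed by the grain-boundary outline and convert
# their pixel counts into areas.
import numpy as np
from skimage import measure

def grain_areas_mm2(boundary_skeleton, pixel_size_um):
    """boundary_skeleton: boolean image, True on grain-boundary pixels.
    Returns the area of each enclosed region in mm^2."""
    grains = measure.label(~boundary_skeleton, connectivity=1)   # regions between boundaries
    areas_px = np.bincount(grains.ravel())[1:]                   # drop the boundary label
    # Caveat: the largest region is usually the exterior/pore space, not a grain.
    return areas_px * (pixel_size_um * 1e-3) ** 2
```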

Relevance: 100.00%

Abstract:

Quantification of protein expression based on immunohistochemistry (IHC) is an important step in clinical diagnosis and in translational tissue-based research. Manual scoring systems are used to evaluate protein expression based on staining intensities and distribution patterns. However, visual scoring remains an inherently subjective approach. The aim of our study was to explore whether digital image analysis is an alternative or even superior tool for quantifying the expression of membrane-bound proteins. We analyzed five membrane-binding biomarkers (HER2, EGFR, pEGFR, β-catenin, and E-cadherin) and performed IHC on tumor tissue microarrays from 153 esophageal adenocarcinoma patients from a single-center study. The tissue cores were scored visually, applying an established routine scoring system, as well as by digital image analysis, which yielded a continuous spectrum of average staining intensity. We then compared both assessments using survival analysis as an end point. There were no significant correlations with patient survival for visual scoring of β-catenin, E-cadherin, pEGFR, or HER2. In contrast, the digital image analysis approach showed significant associations with disease-free survival for β-catenin, E-cadherin, pEGFR, and HER2 (P = 0.0125, P = 0.0014, P = 0.0299, and P = 0.0096, respectively). For EGFR, the association with patient survival was stronger with digital image analysis than with visual scoring (visual: P = 0.0045; image analysis: P < 0.0001). The results of this study indicate that digital image analysis is superior to visual scoring: it is more sensitive and therefore better able to detect biological differences within the tissues, and this increased sensitivity improves the quality of quantification.
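
As an illustration of the survival end point used here, a continuous image-analysis read-out can be related to disease-free survival with a Cox proportional hazards model; the file and column names below are hypothetical, and this is not the study's own statistical pipeline.

```python
# Sketch: relate the continuous average staining intensity from digital image
# analysis to disease-free survival with a Cox proportional hazards model.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical table with one row per patient:
#   dfs_months     follow-up time (months)
#   event          1 = recurrence/death, 0 = censored
#   mean_intensity continuous image-analysis read-out per tissue core
df = pd.read_csv("ihc_image_analysis_scores.csv")

cph = CoxPHFitter()
cph.fit(df[["dfs_months", "event", "mean_intensity"]],
        duration_col="dfs_months", event_col="event")
cph.print_summary()        # hazard ratio and p-value for mean_intensity
```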