850 results for Computer aided design


Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND The use of reduced-size adult lung transplants could help solve the profound pediatric donor lung shortage. However, adequate long-term function of the mature grafts requires growth in proportion to the recipient's development. METHODS Mature left lower lobes from adult mini-pigs (age: 7 months; mean body weight: 30 kg) were transplanted into 14-week-old piglets (mean body weight: 15 kg). At the end of the 14-week holding period, the lungs of the recipients (n = 4) were harvested. After volumetric measurements, the lung morphology was studied using light microscopy, scanning electron microscopy, and transmission electron microscopy. Changes in alveolar airspace volume were determined using a computer-aided image analysis system. Comparisons were made to age- and weight-matched controls. RESULTS Volumetric studies showed no significant differences (p = 0.49) between the specific volume (mL/kg body weight) of lobar grafts and left lower lobes of adult controls. Morphologic studies showed marked structural differences between the grafts and the right native lungs of the recipients, with increased average alveolar diameter in the grafts. On light microscopy and scanning electron microscopy, alveoli appeared dilated and rounded compared to the normal polygonal shape in the controls. The computer-generated semi-quantitative data of relative alveolar airspace volume tended to be higher in transplanted lobes. CONCLUSIONS The mature pulmonary lobar grafts had filled the growing left hemithorax of the developing recipients. Emphysema-like alterations of the grafts were observed without evidence of alveolar growth in the mature lobar transplants. Thus, it is questionable whether mature pulmonary grafts can guarantee sufficient long-term gas exchange in growing recipients.
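
The relative alveolar airspace volume reported above is, in essence, an area fraction measured from micrographs. A minimal sketch of that kind of computer-aided measurement follows, assuming a grayscale micrograph in which airspace appears as the bright phase; the file name and thresholding choice are illustrative, since the abstract does not name the actual analysis system.

    # Sketch of a semi-quantitative airspace-fraction measurement from a
    # lung micrograph. Otsu thresholding stands in for whatever
    # segmentation the original system used (hypothetical input file).
    from skimage import io, filters

    image = io.imread("micrograph.png", as_gray=True)   # hypothetical path
    threshold = filters.threshold_otsu(image)           # separate airspace from tissue
    airspace = image > threshold                        # assume airspace is the bright phase
    airspace_fraction = airspace.mean()                 # relative airspace (area) fraction
    print(f"Relative alveolar airspace fraction: {airspace_fraction:.3f}")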

Relevance:

80.00%

Publisher:

Abstract:

This study assessed the impact of cigarette advertising on adolescent susceptibility to smoking in the Hempstead and Hitchcock Independent School Districts. A convenience sample of 217 youths aged 10-19 years was recruited into the study. Students completed both a paper-and-pencil and a computer-aided questionnaire in April 1996. Adolescents were defined as susceptible to smoking if they could not definitely rule out the possibility of future smoking. For the analysis, a 5-point index of an individual's receptivity to cigarette advertising was devised. The index is determined by the number of positive responses to five survey items (recognizing cigarette brand logos, recognizing pictures from cigarette advertisements, recognizing cigarette brand slogans, attitudes toward cigarette advertising, and the degree to which adolescents were exposed to cigarette advertisements). Using logistic regression, we assessed the independent importance of the index in predicting susceptibility to smoking and ever smoking after adjusting for sociodemographic variables, perceived school performance, and family composition. Of the students surveyed, 54.4% appeared to have started the smoking uptake process as measured by susceptibility to smoking. Camel was recognized by the majority of students (88%), followed by Marlboro (41.5%) and Newport (40.1%). The pattern of recognition of the cigarette advertisements mirrored the brands' market pattern. The advertisement featuring the cartoon character Joe Camel was significantly more appealing to adolescents than were advertisements with human models, with animal models, or with text only (p < 0.001). The text-only advertisement was significantly less appealing than the other types of advertisements. The cigarette advertisement with White models (Marlboro) had significantly higher appeal to White students than to African-American students (p < 0.001). The cigarette advertisement featuring African-American models (Virginia Slims) was significantly more appealing to African-American students than to students of other ethnic groups (p < 0.001). Receptivity to cigarette advertising was found to be an important concurrent predictor of past smoking experience and intention to smoke in the future. Adolescents who scored in the fourth quartile of the Index of Receptivity to Cigarette Advertising were 7.54 (95% confidence interval (CI) = 1.92-29.56) times as likely to be susceptible to smoking, and 4.56 (95% CI = 1.55-13.38) times as likely to have tried smoking, as those who scored in the first quartile of the Index. The findings support the hypothesis that cigarette advertising may be a stronger current influence in encouraging adolescents to initiate the smoking uptake process than sociodemographic variables, perceived school performance, or family composition.
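
A hedged sketch of the kind of logistic-regression analysis reported above, using statsmodels: susceptibility is regressed on an indicator for scoring in the fourth versus first quartile of the receptivity index, adjusting for covariates, and coefficients are exponentiated into odds ratios with 95% CIs. All file and column names are illustrative; the study's actual variables are not reproduced here.

    # Odds of susceptibility for fourth- vs first-quartile receptivity,
    # adjusted for covariates (illustrative variable names throughout).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("survey.csv")                      # hypothetical data file
    df = df[df["receptivity_quartile"].isin([1, 4])]    # compare Q4 against Q1
    df["q4"] = (df["receptivity_quartile"] == 4).astype(int)
    X = sm.add_constant(df[["q4", "age", "school_performance", "two_parent_home"]])
    fit = sm.Logit(df["susceptible"], X).fit()
    print(np.exp(fit.params))      # odds ratios; the study reports 7.54 for Q4 vs Q1
    print(np.exp(fit.conf_int()))  # 95% CIs on the odds-ratio scale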

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVES Recently, an MRI quantification sequence was developed that can be used to acquire T1 and T2 relaxation times as well as proton density (PD) values. These three quantitative values can be used to describe soft tissue in an objective manner. The purpose of this study was to investigate the applicability of quantitative cardiac MRI for the characterization and differentiation of ischaemic myocardial lesions of different ages. MATERIALS AND METHODS Fifty post-mortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Myocardial lesions were identified according to histology and their appearance in MR images. Ischaemic lesions were assessed for mean T1, T2, and proton density values. The quantitative values were plotted in a 3D coordinate system to investigate the clustering of ischaemic myocardial lesions. RESULTS A total of 16 myocardial lesions detected in MR images were histologically characterized as acute lesions (n = 8) with perifocal oedema (n = 8), subacute lesions (n = 6), and chronic lesions (n = 2). In a 3D plot comprising the combined quantitative values of T1, T2, and PD, the clusters of all investigated lesions could be well differentiated from each other. CONCLUSION Post-mortem quantitative cardiac MRI is feasible for the characterization and discrimination of different age stages of myocardial infarction. KEY POINTS • MR quantification is feasible for characterization of different stages of myocardial infarction. • The results provide the basis for computer-aided MRI diagnosis of cardiac infarction. • The diagnostic criteria may also be applied to living patients.
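
An illustrative sketch of the 3D clustering plot described above, assuming a table of per-lesion mean values with a histological age label; the file and column names are placeholders, not study data.

    # Plot lesion clusters in (T1, T2, PD) space, one colour per age stage.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("lesions.csv")   # hypothetical columns: T1, T2, PD, stage
    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    for stage, group in df.groupby("stage"):
        ax.scatter(group["T1"], group["T2"], group["PD"], label=stage)
    ax.set_xlabel("T1 (ms)")
    ax.set_ylabel("T2 (ms)")
    ax.set_zlabel("PD (%)")
    ax.legend()
    plt.show()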

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVES To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. MATERIAL AND METHODS Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. The liver, spleen, left ventricular myocardium, pectoralis muscle, and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2, and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature, and equations to correct for temperature differences were generated. RESULTS In a 3D plot comprising the combined data of T1, T2, and PD, the different organs/tissues could be well differentiated from each other. The quantitative values were influenced by temperature; T1 in particular exhibited a strong temperature dependence. Correcting the quantitative values to a temperature of 37 °C resulted in better tissue discrimination. CONCLUSION Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a basis for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification; equations to correct for this dependence are provided. KEY POINTS • Postmortem MR quantification is feasible for soft tissue discrimination and characterization • The temperature dependence of the T1 values challenges the MR quantification approach • The results provide the basis for computer-aided postmortem MRI diagnosis • The diagnostic criteria may also be applied to living patients.
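
The abstract does not give the form of the correction equations; the sketch below assumes, for illustration, an approximately linear relation between core temperature and a quantitative value such as T1, fits it from hypothetical paired measurements, and shifts measured values to their expected value at 37 °C.

    # Derive a linear temperature-correction equation for T1 (assumed form).
    import numpy as np

    temps, t1 = np.loadtxt("t1_vs_temperature.csv", delimiter=",", unpack=True)
    slope, intercept = np.polyfit(temps, t1, deg=1)   # T1 ~ slope * temp + intercept

    def correct_t1_to_37(t1_measured, temp_measured):
        """Shift a measured T1 to its expected value at 37 degrees Celsius."""
        return t1_measured + slope * (37.0 - temp_measured)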

Relevance:

80.00%

Publisher:

Abstract:

The reciprocal interaction between cancer cells and the tissue-specific stroma is critical for primary and metastatic tumor growth progression. Prostate cancer cells preferentially colonize bone (osteotropism), where they alter the physiological balance between osteoblast-mediated bone formation and osteoclast-mediated bone resorption, and predominantly elicit an osteoblastic response (osteoinduction). The molecular cues provided by osteoblasts for the survival and growth of bone metastatic prostate cancer cells are largely unknown. We exploited the sufficient divergence between human and mouse RNA sequences together with a redefinition of highly species-specific gene arrays by computer-aided and experimental exclusion of cross-hybridizing oligonucleotide probes. This strategy allowed the dissection of the stroma (mouse) from the cancer cell (human) transcriptome in bone metastasis xenograft models of human osteoinductive prostate cancer cells (VCaP and C4-2B). As a result, we generated the osteoblastic bone metastasis-associated stroma transcriptome (OB-BMST). Subtraction of the genes shared by inflammation, wound healing, and desmoplastic responses, and by the tissue type-independent stroma responses to a variety of non-osteotropic and osteotropic primary cancers, generated a curated gene signature ("Core" OB-BMST) putatively representing the bone marrow/bone-specific stroma response to prostate cancer-induced, osteoblastic bone metastasis. The expression pattern of three representative Core OB-BMST genes (PTN, EPHA3 and FSCN1) seems to confirm the bone specificity of this response. A robust induction of genes involved in osteogenesis and angiogenesis dominates both the OB-BMST and the Core OB-BMST. This translates into an amplification of hematopoietic and, remarkably, prostate epithelial stem cell niche components that may function as a self-reinforcing bone metastatic niche providing growth support specific to osteoinductive prostate cancer cells. The induction of this combinatorial stem cell niche is a novel mechanism that may also explain cancer cell osteotropism and local interference with hematopoiesis (myelophthisis). Accordingly, these stem cell niche components may represent innovative therapeutic targets and/or serum biomarkers in osteoblastic bone metastasis.
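
The curation step above is, at heart, set subtraction over gene lists. A schematic sketch under that reading follows; the file names and list contents are placeholders, not the published signatures.

    # "Core" OB-BMST = stroma response to osteoblastic bone metastasis minus
    # genes shared with generic stromal programmes (hypothetical input files).
    def read_genes(path):
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}

    ob_bmst = read_genes("OB_BMST_genes.txt")
    shared = read_genes("inflammation_wound_desmoplasia_genes.txt")
    generic = read_genes("tissue_independent_stroma_genes.txt")

    core_ob_bmst = ob_bmst - shared - generic   # curated signature
    print(len(core_ob_bmst), sorted(core_ob_bmst)[:10])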

Relevance:

80.00%

Publisher:

Abstract:

Over the last decade, a plethora of computer-aided diagnosis (CAD) systems have been proposed with the aim of improving the accuracy of physicians in the diagnosis of interstitial lung diseases (ILD). In this study, we propose a scheme for the classification of HRCT image patches with ILD abnormalities as a basic component towards the quantification of the various ILD patterns in the lung. The feature extraction method relies on local spectral analysis using a DCT-based filter bank. After convolving the image with the filter bank, q-quantiles are computed to describe the distribution of local frequencies that characterizes image texture. The gray-level histogram values of the original image are then appended, forming the final feature vector. The described patches are classified by a random forest (RF) classifier. The experimental results demonstrate the superior performance and efficiency of the proposed approach compared with the state of the art.
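
A hedged sketch of the described pipeline: a 2D DCT filter bank, quantiles of the response magnitudes, an appended gray-level histogram, and a random forest. The filter size, quantile set, and bin count are assumptions, not the paper's settings.

    # DCT filter bank -> q-quantiles of responses -> + gray-level histogram -> RF.
    import numpy as np
    from scipy.fft import dct
    from scipy.signal import convolve2d
    from sklearn.ensemble import RandomForestClassifier

    def dct_filter_bank(size=4):
        basis = dct(np.eye(size), norm="ortho", axis=0)   # orthogonal 1D DCT basis
        return [np.outer(basis[:, i], basis[:, j])        # separable 2D filters
                for i in range(size) for j in range(size)]

    def patch_features(patch, quantiles=(0.25, 0.5, 0.75, 0.9)):
        feats = []
        for filt in dct_filter_bank():
            response = convolve2d(patch, filt, mode="valid")
            feats.extend(np.quantile(np.abs(response), quantiles))
        hist, _ = np.histogram(patch, bins=16, range=(0.0, 1.0))
        return np.concatenate([feats, hist])

    # patches: list of 2D arrays in [0, 1]; labels: ILD pattern per patch
    # X = np.array([patch_features(p) for p in patches])
    # clf = RandomForestClassifier(n_estimators=100).fit(X, labels)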

Relevance:

80.00%

Publisher:

Abstract:

In clinical practice, traditional X-ray radiography is widely used, and knowledge of landmarks and contours in anteroposterior (AP) pelvis X-rays is invaluable for computer-aided diagnosis, hip surgery planning, and image-guided interventions. This paper presents a fully automatic approach for landmark detection and shape segmentation of both the pelvis and the femur in conventional AP X-ray images. Our approach is based on the framework of landmark detection via Random Forest (RF) regression and shape regularization via hierarchical sparse shape composition. We propose a visual feature, FL-HoG (Flexible-Level Histogram of Oriented Gradients), and a feature selection algorithm based on trace ratio optimization to improve the robustness and efficacy of RF-based landmark detection. The landmark detection result is then used in a hierarchical sparse shape composition framework for shape regularization. Finally, the extracted shape contour is fine-tuned by a post-processing step based on low-level image features. The experimental results demonstrate that our feature selection algorithm reduces the feature dimension by a factor of 40 and improves both training and test efficiency. Further experiments conducted on 436 clinical AP pelvis X-rays show that our approach achieves an average point-to-curve error of around 1.2 mm for the femur and 1.9 mm for the pelvis.
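
A minimal sketch of RF-regression landmark detection in the spirit of the framework above. The paper's FL-HoG descriptor and trace-ratio feature selection are not reproduced here; standard HoG from scikit-image stands in for them, and the training data are hypothetical patch/offset pairs.

    # Each training patch is described by HoG and paired with its (dx, dy)
    # offset to the landmark; at test time patches vote via predicted offsets.
    import numpy as np
    from skimage.feature import hog
    from sklearn.ensemble import RandomForestRegressor

    def describe(patch):
        return hog(patch, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

    # patches: list of 2D arrays; offsets: array of shape (n, 2);
    # centres: test-patch centre coordinates (all hypothetical inputs).
    # X = np.array([describe(p) for p in patches])
    # rf = RandomForestRegressor(n_estimators=100).fit(X, offsets)
    # votes = centres + rf.predict(np.array([describe(p) for p in test_patches]))
    # landmark = votes.mean(axis=0)   # aggregate the votes, e.g. by their mean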

Relevance:

80.00%

Publisher:

Abstract:

The purpose of the present study was to investigate whether serous fluids, blood, cerebrospinal fluid (CSF), and putrefied CSF can be characterized and differentiated in synthetically calculated magnetic resonance (MR) images based on their quantitative T1, T2, and proton density (PD) values. Images from 55 postmortem short axis cardiac and 31 axial brain 1.5-T MR examinations were quantified using a quantification sequence. Serous fluids, fluid blood, sedimented blood, blood clots, CSF, and putrefied CSF were analyzed for their mean T1, T2, and PD values. Body core temperature was measured during the MRI scans. The fluid-specific quantitative values were related to the body core temperature, and equations to correct for temperature differences were generated. In a 3D plot as well as in statistical analysis, the quantitative T1, T2, and PD values of serous fluids, fluid blood, sedimented blood, blood clots, CSF, and putrefied CSF could be well differentiated from each other. The quantitative T1 and T2 values were temperature-dependent. Correction of the quantitative values to a temperature of 37 °C resulted in significantly better discrimination between all investigated fluid media. We conclude that postmortem 1.5-T MR quantification is feasible for discriminating between blood, serous fluids, CSF, and putrefied CSF. This finding provides a basis for the computer-aided diagnosis and detection of fluids and hemorrhages.
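
Complementing the temperature-correction sketch given earlier, below is an illustrative sketch of the discrimination step: classifying a fluid from its temperature-corrected (T1, T2, PD) triplet with a simple linear discriminant. The data, column names, and example measurement are placeholders, not study values.

    # Discriminate fluid types in (T1, T2, PD) space (illustrative data).
    import pandas as pd
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    df = pd.read_csv("fluids.csv")   # hypothetical columns: T1, T2, PD, fluid_type
    X = df[["T1", "T2", "PD"]].to_numpy()
    y = df["fluid_type"]             # e.g. serous, fluid blood, clot, CSF, putrefied CSF

    clf = LinearDiscriminantAnalysis().fit(X, y)
    print(clf.predict([[3500.0, 1800.0, 97.0]]))   # classify one made-up measurement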

Relevance:

80.00%

Publisher:

Abstract:

Automated tissue characterization is one of the most crucial components of a computer-aided diagnosis (CAD) system for interstitial lung diseases (ILDs). Although much research has been conducted in this field, the problem remains challenging. Deep learning techniques have recently achieved impressive results in a variety of computer vision problems, raising expectations that they might be applied in other domains, such as medical image analysis. In this paper, we propose and evaluate a convolutional neural network (CNN) designed for the classification of ILD patterns. The proposed network consists of 5 convolutional layers with 2×2 kernels and LeakyReLU activations, followed by average pooling with size equal to the size of the final feature maps, and three dense layers. The last dense layer has 7 outputs, equivalent to the classes considered: healthy, ground glass opacity (GGO), micronodules, consolidation, reticulation, honeycombing, and a combination of GGO/reticulation. To train and evaluate the CNN, we used a dataset of 14696 image patches derived from 120 CT scans from different scanners and hospitals. To the best of our knowledge, this is the first deep CNN designed for this specific problem. A comparative analysis proved the effectiveness of the proposed CNN against previous methods on a challenging dataset. The classification performance (~85.5%) demonstrated the potential of CNNs in analyzing lung patterns. Future work includes extending the CNN to three-dimensional data provided by CT volume scans and integrating the proposed method into a CAD system that aims to provide differential diagnosis for ILDs as a supportive tool for radiologists.
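
A PyTorch sketch of the architecture as described: five 2×2 convolutions with LeakyReLU, average pooling over the final feature maps, and three dense layers ending in 7 outputs. The channel widths, dense-layer sizes, and input patch size are assumptions, as the abstract does not specify them.

    # Five 2x2 conv + LeakyReLU blocks, global average pooling, 3 dense layers.
    import torch
    import torch.nn as nn

    class ILDNet(nn.Module):
        def __init__(self, n_classes=7):
            super().__init__()
            chans = [1, 16, 32, 64, 96, 128]           # assumed channel widths
            layers = []
            for c_in, c_out in zip(chans[:-1], chans[1:]):
                layers += [nn.Conv2d(c_in, c_out, kernel_size=2), nn.LeakyReLU()]
            self.features = nn.Sequential(*layers)
            self.pool = nn.AdaptiveAvgPool2d(1)        # average over final feature maps
            self.classifier = nn.Sequential(
                nn.Linear(chans[-1], 64), nn.LeakyReLU(),
                nn.Linear(64, 32), nn.LeakyReLU(),
                nn.Linear(32, n_classes),              # 7 ILD pattern classes
            )

        def forward(self, x):
            x = self.pool(self.features(x)).flatten(1)
            return self.classifier(x)

    # logits = ILDNet()(torch.randn(4, 1, 32, 32))     # e.g. 32x32 HRCT patches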

Relevance:

80.00%

Publisher:

Abstract:

The aim is to obtain computationally more powerful, neurophysiologically founded artificial neurons and neural nets. Artificial Neural Nets (ANN) of the Perceptron type evolved from the original proposal in McCulloch and Pitts' classical paper [1]. Essentially, they keep the computing structure of a linear machine followed by a nonlinear operation. The McCulloch-Pitts formal neuron (which the authors never considered to be a model of real neurons) is the simplest case: a linear computation on the inputs followed by a threshold. One-layer networks cannot compute every logical function of the inputs, but only those that are linearly separable. Thus, even the simple exclusive OR (contrast detector) function of two inputs requires two layers of formal neurons.
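
A toy illustration of both points: a McCulloch-Pitts unit as a weighted sum followed by a threshold, and XOR realized with two layers of such units (the particular weights and thresholds below are one of many valid choices).

    # McCulloch-Pitts unit: weighted sum of inputs compared to a threshold.
    def mp_neuron(inputs, weights, threshold):
        return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

    def xor(x1, x2):
        # Layer 1: two linearly separable subfunctions.
        or_out = mp_neuron([x1, x2], [1, 1], 1)        # x1 OR x2
        nand_out = mp_neuron([x1, x2], [-1, -1], -1)   # NOT (x1 AND x2)
        # Layer 2: AND of the two gives the exclusive OR.
        return mp_neuron([or_out, nand_out], [1, 1], 2)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, xor(a, b))   # prints 0, 1, 1, 0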

Relevance:

80.00%

Publisher:

Abstract:

Dendritic computation is a term that has been present in neurophysiological research for a long time [1]. It is still controversial and far from being clarified within the concepts of both computation and neurophysiology [2], [3]. In any case, it has not been integrated into a formal computational scheme or structure, nor into formulations of artificial neural nets. Our objective here is to formulate a type of distributed computation that resembles dendritic trees, in such a way that it shows the advantages of neural network distributed computation, chiefly the reliability it retains when there are holes (scotomas) in the computing net, without "blind spots".
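
The abstract does not give the formulation itself; the toy sketch below merely illustrates the property claimed, namely that when every input is covered by several overlapping subunits (as in an idealized dendritic tree), knocking out units degrades the output gracefully rather than creating blind spots.

    # Toy distributed computation (not the authors' formulation): overlapping
    # subunits each see a random receptive field, so removing units ("holes")
    # changes the pooled estimate only slightly.
    import random

    def distributed_sum(inputs, n_units=30, fan_in=4, holes=0):
        n = len(inputs)
        units = [random.sample(range(n), fan_in) for _ in range(n_units)]
        units = units[holes:]   # knock out some subunits entirely
        responses = [sum(inputs[i] for i in u) / fan_in for u in units]
        return sum(responses) / len(responses)

    x = [1.0] * 8 + [0.0] * 8
    print(distributed_sum(x, holes=0), distributed_sum(x, holes=10))  # similar estimates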

Relevance:

80.00%

Publisher:

Abstract:

A good and early fault detection and isolation system, together with efficient alarm management and fine-grained sensor validation, is very important in today's complex process plants, especially in terms of safety enhancement and cost reduction. This paper presents a methodology for fault characterization. It is a self-learning approach developed in two phases. In an initial learning phase, simulation of the process units, both without and with different faults, lets the system automatically detect the key variables that characterize the faults. These key variables are then used in a second, online phase, in which they are monitored in order to diagnose possible faults. With this scheme, faults are diagnosed and isolated at an early stage, before they turn into failures.
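
A hedged sketch of the two phases under one simple reading: the learning phase ranks variables by how strongly simulated faulty runs deviate from normal runs and keeps the top few as key variables; the online phase flags key variables that leave their normal-operation band. The array shapes, scoring rule, and 3-sigma band are assumptions.

    # Phase 1: pick key variables from simulated runs. Phase 2: monitor them.
    import numpy as np

    def learn_key_variables(normal_runs, fault_runs, n_keys=3):
        # runs: arrays of shape (n_runs, n_variables) of simulated steady states
        deviation = np.abs(fault_runs.mean(axis=0) - normal_runs.mean(axis=0))
        deviation /= normal_runs.std(axis=0) + 1e-9    # normalized per-variable deviation
        return np.argsort(deviation)[-n_keys:]         # indices of the key variables

    def monitor(sample, normal_runs, key_vars, n_sigma=3.0):
        # sample: 1D array of current plant measurements
        mu = normal_runs.mean(axis=0)[key_vars]
        sigma = normal_runs.std(axis=0)[key_vars]
        return np.abs(sample[key_vars] - mu) > n_sigma * sigma   # per-variable alarms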

Relevance:

80.00%

Publisher:

Abstract:

Pragmatism is the leading motivation of regularization. We can understand regularization as a modification of the maximum-likelihood estimator so that a reasonable answer can be given in an unstable or ill-posed situation. To mention some typical examples, this happens when fitting parametric or non-parametric models with more parameters than data, or when estimating large covariance matrices. Regularization is also commonly used to improve the bias-variance tradeoff of an estimation. The definition of regularization is therefore quite general, and, although the introduction of a penalty is probably the most popular type, it is just one of multiple forms of regularization. In this dissertation, we focus on the applications of regularization for obtaining sparse or parsimonious representations, where only a subset of the inputs is used. A particular form of regularization, L1-regularization, plays a key role in reaching sparsity. Most of the contributions presented here revolve around L1-regularization, although other forms of regularization are explored (also pursuing sparsity in some sense). In addition to presenting a compact review of L1-regularization and its applications in statistics and machine learning, we devise methodology for regression, supervised classification, and structure induction of graphical models. Within the regression paradigm, we focus on kernel smoothing learning, proposing techniques for kernel design that are suitable for high-dimensional settings and sparse regression functions. We also present an application of regularized regression techniques to modeling the response of biological neurons. The supervised classification advances deal, on the one hand, with the application of regularization for obtaining a naïve Bayes classifier and, on the other hand, with a novel algorithm for brain-computer interface design that uses group regularization in an efficient manner. Finally, we present a heuristic for inducing the structure of Gaussian Bayesian networks using L1-regularization as a filter.
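
A minimal sketch of the dissertation's central tool, L1-regularized (lasso) regression, showing the sparsity it induces: with synthetic data in which only two inputs matter, the penalty drives the remaining coefficients exactly to zero. The data and penalty strength are illustrative.

    # Lasso on synthetic data: the L1 penalty zeroes out irrelevant inputs.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 20))
    y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)  # 2 relevant inputs

    model = Lasso(alpha=0.1).fit(X, y)
    print(np.flatnonzero(model.coef_))   # typically recovers just features 0 and 1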