956 results for Human Machine Interface
Abstract:
Weakening of cardiac function in patients with heart failure results from a loss of cardiomyocytes in the damaged heart. Cell replacement therapies as a way to induce myocardial regeneration in humans could represent attractive alternatives to classical drug-based approaches. However, a suitable source of precursor cells, which could produce a functional myocardium after transplantation, remains to be identified. In the present study, we isolated cardiovascular precursor cells from ventricles of human fetal hearts at 12 weeks of gestation. These cells expressed Nkx2.5 but not late cardiac markers such as α-actinin and troponin I. In addition, proliferating cells expressed the mesenchymal stem cell markers CD73, CD90, and CD105. Evidence for functional cardiogenic differentiation in vitro was demonstrated by the upregulation of cardiac gene expression as well as the appearance of cells with organized sarcomeric structures. Importantly, differentiated cells presented spontaneous and triggered calcium signals. Differentiation into smooth muscle cells was also detected. In contrast, precursor cells did not produce endothelial cells. The engraftment and differentiation capacity of green fluorescent protein (GFP)-labeled cardiac precursor cells were then tested in vivo after transfer into the heart of severe combined immunodeficient (SCID) mice. Engrafted human cells were readily detected in the mouse myocardium. These cells retained their cardiac commitment and differentiated into α-actinin-positive cardiomyocytes. Expression of connexin-43 at the interface between GFP-labeled and endogenous cardiomyocytes indicated that precursor-derived cells connected to the mouse myocardium. Together, these results suggest that human ventricular nonmyocyte cells isolated from fetal hearts represent a suitable source of precursors for cell replacement therapies.
Abstract:
OBJECTIVE: The purpose of this study was to adapt and improve a minimally invasive two-step postmortem angiographic technique for use on human cadavers. Detailed mapping of the entire vascular system is almost impossible with conventional autopsy tools. The technique described should be valuable in the diagnosis of vascular abnormalities. MATERIALS AND METHODS: Postmortem perfusion with an oily liquid is established with a circulation machine. An oily contrast agent is introduced as a bolus injection, and radiographic imaging is performed. In this pilot study, the upper or lower extremities of four human cadavers were perfused. In two cases, the vascular system of a lower extremity was visualized with anterograde perfusion of the arteries. In the other two cases, in which the suspected cause of death was drug intoxication, the veins of an upper extremity were visualized with retrograde perfusion of the venous system. RESULTS: In each case, the vascular system was visualized up to the level of the small supplying and draining vessels. In three of the four cases, vascular abnormalities were found. In one instance, a venous injection mark engendered by the self-administration of drugs was rendered visible by exudation of the contrast agent. In the other two cases, occlusion of the arteries and veins was apparent. CONCLUSION: The method described is readily applicable to human cadavers. After establishment of postmortem perfusion with paraffin oil and injection of the oily contrast agent, the vascular system can be investigated in detail and vascular abnormalities rendered visible.
Abstract:
Dendritic cells (DCs) are the most potent antigen-presenting cells in the human lung and are now recognized as crucial initiators of immune responses in general. They are arranged as sentinels in a dense surveillance network inside and below the epithelium of the airways and alveoli, where they are ideally situated to sample inhaled antigen. DCs are known to play a pivotal role in maintaining the balance between tolerance and active immune response in the respiratory system. It is no surprise that the lungs have become a main focus of DC-related investigations, as this organ provides a large interface for interactions of inhaled antigens with the human body. During recent years there has been a constantly growing body of lung DC-related publications that draw their data from in vitro models, animal models and human studies. This review focuses on the biology and functions of different DC populations in the lung and highlights the advantages and drawbacks of different models with which to study the role of lung DCs. Furthermore, we present a number of up-to-date visualization techniques to characterize DC-related cell interactions in vitro and/or in vivo.
Abstract:
This thesis is devoted to the analysis, modeling and visualization of spatially referenced environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence that is particularly concerned with the development of techniques and algorithms allowing a machine to learn from data. In this thesis, machine learning algorithms are adapted to environmental data and to spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can solve classification, regression and probability density modeling problems in high-dimensional spaces composed of spatially referenced informative variables ("geo-features") in addition to the geographical coordinates. Moreover, they are well suited to implementation as decision-support tools for environmental questions ranging from pattern recognition to modeling and prediction, including automatic mapping. Their efficiency is comparable to that of geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data including geo-features. The most important and popular machine learning algorithms are presented theoretically and implemented as software tools for the environmental sciences. The main algorithms described are the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence, general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organizing maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This range of algorithms covers varied tasks such as classification, regression and probability density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are treated both with the traditional geostatistical approach of experimental variography and according to machine learning principles. Experimental variography, which studies the relationships between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and allows the detection of spatial patterns describable by two-point statistics. The machine learning approach to ESDA is presented through the application of the k-nearest neighbors method, which is very simple and has excellent interpretation and visualization properties. An important part of the thesis deals with topical subjects such as the automatic mapping of spatial data. The general regression neural network is proposed to solve this task efficiently.
The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, for which the GRNN significantly outperforms all other methods, particularly in emergency situations. The thesis consists of four chapters: theory, applications, software tools and guided examples. An important part of the work is a collection of software tools: Machine Learning Office. This software collection has been developed over the last 15 years and has been used for teaching numerous courses, including international workshops in China, France, Italy, Ireland and Switzerland, as well as in fundamental and applied research projects. The case studies considered cover a wide spectrum of real low- and high-dimensional geo-environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, the classification of soil types and hydrogeological units, uncertainty mapping for decision support, and the assessment of natural hazards (landslides, avalanches). Complementary tools for exploratory data analysis and visualization have also been developed, with care taken to provide a user-friendly and easy-to-use interface. Machine Learning for geospatial data: algorithms, software tools and case studies. Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence. It is mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions for classification, regression, and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for the purposes of environmental data mining including pattern recognition, modeling and prediction as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to the software implementation. The main algorithms and models considered are the following: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression, and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis.
In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, such as experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps to reveal the presence of spatial patterns, at least those described by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a currently hot topic, namely the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where the GRNN significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters and has the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydro-geological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools useful for exploratory data analysis and visualisation were developed as well. The software is user-friendly and easy to use.
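At its core, the GRNN used here for automatic mapping is Nadaraya-Watson kernel regression with a single smoothing parameter. The following minimal sketch (hypothetical coordinates, values and bandwidth; plain numpy, not the Machine Learning Office implementation) illustrates how such a spatial prediction step can be expressed:

```python
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma):
    """General regression neural network (Nadaraya-Watson kernel regression).

    train_xy : (n, d) array of training coordinates (and optional geo-features)
    train_z  : (n,) array of observed values
    query_xy : (m, d) array of prediction locations
    sigma    : kernel bandwidth, the single GRNN smoothing parameter
    """
    # Squared Euclidean distances between every query point and every training point
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)
    # Gaussian kernel weights
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # Kernel-weighted average of the observations at each query point
    return (w @ train_z) / w.sum(axis=1)

# Hypothetical example: interpolate a noisy field sampled at 200 random locations
rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(200, 2))                  # monitoring-site coordinates
z = np.sin(xy[:, 0] / 15) + 0.1 * rng.normal(size=200)   # measured values
grid = np.stack(np.meshgrid(np.linspace(0, 100, 50),
                            np.linspace(0, 100, 50)), axis=-1).reshape(-1, 2)
z_hat = grnn_predict(xy, z, grid, sigma=5.0)              # mapped surface (2500 values)
```

In practice the bandwidth sigma would be tuned, for example by cross-validation, which is what makes this kind of model suitable for automatic mapping.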
Abstract:
TWEAK (TNF homologue with weak apoptosis-inducing activity) and Fn14 (fibroblast growth factor-inducible protein 14) are members of the tumor necrosis factor (TNF) ligand and receptor super-families. Having observed that Xenopus Fn14 cross-reacts with human TWEAK, despite its relatively low sequence homology to human Fn14, we examined the conservation in tertiary fold and binding interfaces between the two species. Our results, combining NMR solution structure determination, binding assays, extensive site-directed mutagenesis and molecular modeling, reveal that, in addition to the known and previously characterized β-hairpin motif, the helix-loop-helix motif makes an essential contribution to the receptor/ligand binding interface. We further discuss the insight provided by the structural analyses regarding how the cysteine-rich domains of the TNF receptor super-family may have evolved over time. DATABASE: Structural data are available in the Protein Data Bank/BioMagResBank databases under the accession codes 2KMZ, 2KN0 and 2KN1 and 17237, 17247 and 17252. STRUCTURED DIGITAL ABSTRACT: TWEAK binds to hFn14 by surface plasmon resonance; xeFn14 binds to TWEAK by enzyme-linked immunosorbent assay; TWEAK binds to xeFn14 by surface plasmon resonance; hFn14 binds to TWEAK by enzyme-linked immunosorbent assay.
Abstract:
This document sums up a project aimed at building a new web interface to the Apertium machine translation platform, including pre-editing and post-editing environments. It contains a description of the accomplished work on this project, as well as an overview of possible evolutions.
Abstract:
Image filtering is a highly demanded approach to image enhancement in digital imaging system design. It is widely used in television and camera design technologies to improve the quality of the output image and to avoid problems such as image blurring, which gains importance in the design of large displays and of digital cameras. This thesis proposes a new image filtering method based on visual characteristics of the human eye, such as its modulation transfer function (MTF). In contrast to traditional filtering methods based on human visual characteristics, this thesis takes into account the anisotropy of human vision. The proposed method is based on laboratory measurements of the human eye MTF and takes into account the degradation of the image by the latter. The method enhances an image so as to compensate for the degradation it will undergo through the human eye MTF, giving the perception of the original image quality. The thesis gives a basic understanding of the image filtering approach and of the concept of the MTF, and describes an algorithm to perform image enhancement based on the MTF of the human eye. The experiments performed have shown good results according to human evaluation. Suggestions for future improvements of the algorithm are also given.
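A common way to realize this kind of MTF-based enhancement is regularized inverse filtering in the frequency domain. The sketch below only illustrates that general idea: it uses a hypothetical isotropic Gaussian eye-MTF model, whereas the thesis relies on measured, anisotropic MTF data.

```python
import numpy as np

def mtf_precompensate(image, mtf, eps=1e-3):
    """Pre-compensate an image for a known degradation MTF by regularized
    inverse filtering in the frequency domain."""
    spectrum = np.fft.fft2(image)
    boosted = spectrum / np.maximum(mtf, eps)   # avoid division by ~0 at high frequencies
    return np.real(np.fft.ifft2(boosted))

def gaussian_eye_mtf(shape, cutoff=0.25):
    """Hypothetical isotropic eye-MTF: Gaussian fall-off with normalized spatial
    frequency (a stand-in for the measured, anisotropic MTF used in the thesis)."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    f = np.sqrt(fx ** 2 + fy ** 2)
    return np.exp(-(f / cutoff) ** 2)

# Usage on a synthetic image
img = np.random.rand(256, 256)
mtf = gaussian_eye_mtf(img.shape)
enhanced = mtf_precompensate(img, mtf)
```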
Abstract:
Erythroid burst forming units (BFU-E) are proliferative cells present in peripheral blood and bone marrow which may be precursors of the erythroid colony forming cell found in the bone marrow. To examine the possible role of monocyte-macrophages in the modulation of erythropoiesis, the effect of monocytes on peripheral blood BFU-E proliferation in response to erythropoietin was investigated in the plasma clot culture system. Peripheral blood mononuclear cells from normal human donors were separated into four fractions. Fraction-I cells were obtained from the interface of Ficoll-Hypaque gradients (20-30% monocytes; 60-80% lymphocytes); fraction-II cells were fraction-I cells that were nonadherent to plastic (2-10% monocytes; 90-98% lymphocytes); fraction-III cells were obtained by incubation of fraction-II cells with carbonyl iron followed by Ficoll-Hypaque centrifugation (>99% lymphocytes); and fraction-IV cells represented the adherent population of fraction-II cells released from the plastic by lidocaine (>95% monocytes). When cells from these fractions were cultured in the presence of erythropoietin, the number of BFU-E-derived colonies was inversely proportional to the number of monocytes present (r = −0.96, P < 0.001). The suppressive effect of monocytes on BFU-E proliferation was confirmed by admixing autologous purified monocytes (fraction-IV cells) with fraction-III cells. Monocyte concentrations of ≥20% completely suppressed BFU-E activity. Reduction in the number of plated BFU-E by monocyte dilution could not account for these findings: a 15% reduction in the number of fraction-III cells plated resulted in only a 15% reduction in colony formation. These results indicate that monocyte-macrophages may play a significant role in the regulation of erythropoiesis and be involved in the pathogenesis of the hypoproliferative anemias associated with infection and certain neoplasia in which increased monocyte activity and monopoiesis also occur.
Abstract:
Mottling is one of the key defects in offset printing. Mottling can be defined as unwanted unevenness of print. In this work, the diameter of a mottle spot is defined as 0.5-10.0 mm. There are several types of mottling, but the reason behind the problem is still not fully understood. Several commercial machine vision products for the evaluation of print unevenness have been presented. Two of the methods used in these products have been implemented in this thesis: the cluster method and the band-pass method. The properties of the human visual system have been taken into account in the implementation of these two methods. The index produced by the cluster method is a weighted sum of the number of found spots, and the index produced by the band-pass method is a weighted sum of the coefficients of variation of gray levels for each spatial band. Both methods produce larger indices for visually poor samples, so they can discern good samples from poor ones. The difference between the indices for good and poor samples is slightly larger with the cluster method. However, without samples evaluated by human experts, the validity of these results is still questionable. This comparison will be left to the next phase of the project.
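As a rough illustration of the band-pass index described above (a sketch of the general idea only, with hypothetical band definitions and weights, not the exact implementation of the thesis), the print image is split into spatial-frequency bands and the per-band coefficients of variation of gray levels are combined as a weighted sum:

```python
import numpy as np
from scipy import ndimage

def bandpass_mottling_index(gray, sigmas=(1, 2, 4, 8, 16), weights=None):
    """Band-pass mottling index: weighted sum of per-band coefficients of
    variation of gray levels. Bands are built as differences of Gaussian-blurred
    images, a simple stand-in for the band decomposition of commercial tools."""
    gray = gray.astype(float)
    weights = np.ones(len(sigmas)) if weights is None else np.asarray(weights)
    mean_level = gray.mean()
    index = 0.0
    previous = gray
    for w, sigma in zip(weights, sigmas):
        blurred = ndimage.gaussian_filter(gray, sigma)
        band = previous - blurred            # detail in this spatial band
        cv = band.std() / mean_level         # coefficient of variation for the band
        index += w * cv
        previous = blurred
    return index

# A larger index is expected for visually poorer (more mottled) prints.
```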
Abstract:
Problems related to fire hazard and fire management have in recent decades become among the most relevant issues in the Wildland-Urban Interface (WUI), that is, the area where human infrastructures meet or intermingle with natural vegetation. In this paper we develop a robust geospatial method for defining and mapping the WUI in the Alpine environment, where most interactions between infrastructures and wildland vegetation concern fire ignition through human activities, whereas no significant threats exist for infrastructures due to contact with burning vegetation. We used the three Alpine Swiss cantons of Ticino, Valais and Grisons as the study area. The features representing anthropogenic infrastructures (urban or infrastructural components of the WUI) as well as forest cover related features (wildland component of the WUI) were selected from the Swiss Topographic Landscape Model (TLM3D). Georeferenced forest fire occurrences derived from the WSL Swissfire database were used to define suitable WUI interface distances. The Random Forest algorithm was applied to estimate the importance of predictor variables for fire ignition occurrence. This revealed that buildings and drivable roads are the most relevant anthropogenic components with respect to fire ignition. We consequently defined the combination of drivable roads and easily accessible buildings (i.e. within 100 m of the nearest drivable road) as the WUI-relevant infrastructural component. For the definition of the interface (buffer) distance between the WUI infrastructural and wildland components, we computed the empirical cumulative distribution functions (ECDF) of the percentage of ignition points (observed and simulated) arising at increasing distances from the selected infrastructures. The ECDF facilitates the calculation of both the distance at which a given percentage of ignition points occurred and, in turn, the amount of forest area covered at a given distance. Finally, we developed a GIS ModelBuilder routine to map the WUI for the selected buffer distance. The approach was found to be reproducible, robust (based on statistical analyses for evaluating parameters) and flexible (buffer distances depending on the targeted final area covered), so that fire managers may use it to detect the WUI according to their specific priorities.
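The buffer-distance step described above amounts to reading a chosen percentage off the empirical cumulative distribution of ignition-to-infrastructure distances. A minimal sketch of that calculation (hypothetical distances and target percentage; not the GIS ModelBuilder routine itself):

```python
import numpy as np

def wui_buffer_distance(ignition_distances, target_fraction=0.9):
    """Return the buffer distance within which a given fraction of observed fire
    ignition points falls, read from the empirical CDF of the distances between
    ignition points and the nearest WUI-relevant infrastructure."""
    d = np.sort(np.asarray(ignition_distances, dtype=float))
    ecdf = np.arange(1, d.size + 1) / d.size            # empirical cumulative distribution
    return d[np.searchsorted(ecdf, target_fraction)]     # smallest distance reaching the target

# Hypothetical example: distances (in metres) from ignitions to the nearest building/road
distances = np.random.default_rng(1).exponential(scale=60.0, size=500)
print(wui_buffer_distance(distances, target_fraction=0.9))
```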
Abstract:
Acute lung injury (ALI) is a clinical manifestation of respiratory failure, caused by lung inflammation and the disruption of the alveolar-capillary barrier. Preservation of the physical integrity of the alveolar epithelial monolayer is of critical importance to prevent alveolar edema. Barrier integrity depends largely on the balance between physical forces on cell-cell and cell-matrix contacts, and this balance might be affected by alterations in the coagulation cascade in patients with ALI. We aimed to study the effects of activated protein C (APC) on mechanical tension and barrier integrity in human alveolar epithelial cells (A549) exposed to thrombin. Cells were pretreated for 3 h with APC (50 mg/ml) or vehicle (control). Subsequently, thrombin (50 nM) or medium was added to the cell culture. APC significantly reduced thrombin-induced cell monolayer permeability, cell stiffening, and cell contraction, measured by electrical impedance, optical magnetic twisting cytometry, and traction microscopy, respectively, suggesting a barrier-protective response. The dynamics of the barrier integrity was also assessed by western blotting and immunofluorescence analysis of the tight junction ZO-1. Thrombin resulted in more elongated ZO-1 aggregates at cell-cell interface areas and induced an increase in ZO-1 membrane protein content. APC attenuated the length of these ZO-1 aggregates and reduced the ZO-1 membrane protein levels induced by thrombin. In conclusion, pretreatment with APC reduced the disruption of barrier integrity induced by thrombin, thus contributing to alveolar epithelial barrier protection.
Abstract:
The nucleus accumbens (Nacc) has been proposed to act as a limbic-motor interface. Here, using invasive intraoperative recordings in an awake patient suffering from obsessive-compulsive disorder (OCD), we demonstrate that its activity is modulated by the quality of performance of the subject in a choice reaction time task designed to tap action monitoring processes. Action monitoring, that is, error detection and correction, is thought to be supported by a system involving the dopaminergic midbrain, the basal ganglia, and the medial prefrontal cortex. In surface electrophysiological recordings, action monitoring is indexed by an error-related negativity (ERN) appearing time-locked to the erroneous responses and emanating from the medial frontal cortex. In preoperative scalp recordings the patient's ERN was found to be significantly increased compared to a large (n = 83) normal sample, suggesting enhanced action monitoring processes. Intraoperatively, error-related modulations were obtained from the Nacc but not from a site 5 mm above. Importantly, cross-correlation analysis showed that error-related activity in the Nacc preceded surface activity by 40 ms. We propose that the Nacc is involved in action monitoring, possibly by using error signals from the dopaminergic midbrain to adjust the relative impact of limbic and prefrontal inputs on frontal control systems in order to optimize goal-directed behavior.
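The 40 ms lead reported above comes from a cross-correlation between the intracranial and the surface signals. A minimal sketch of how such a lag is typically estimated (hypothetical signals and sampling rate, not the authors' analysis pipeline):

```python
import numpy as np

def lag_ms(x, y, fs):
    """Estimate the lag of y relative to x (in ms) from the peak of their
    cross-correlation; a positive value means x leads y."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    xcorr = np.correlate(y, x, mode="full")       # correlation for all relative shifts
    lags = np.arange(-len(x) + 1, len(y))         # shift (in samples) of y relative to x
    return lags[np.argmax(xcorr)] * 1000.0 / fs

# Hypothetical example: a surface component trails the Nacc component by 40 ms at 1000 Hz
fs = 1000
t = np.arange(0, 1.0, 1 / fs)
nacc = np.exp(-((t - 0.30) ** 2) / (2 * 0.02 ** 2))    # error-related deflection in the Nacc
scalp = np.exp(-((t - 0.34) ** 2) / (2 * 0.02 ** 2))   # same deflection, 40 ms later at the scalp
print(lag_ms(nacc, scalp, fs))                         # ~40
```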
Abstract:
A brain-computer interface (BCI) is a new communication channel between the human brain and a computer. Applications of BCI systems include the restoration of movement, communication and environmental control. In this study, experiments were conducted in which a BCI system was used to control, or to navigate in, virtual environments (VE) by thought alone. So far, BCI experiments for navigation in VR have been conducted with synchronous and asynchronous BCI systems. The synchronous BCI analyzes the EEG patterns in a predefined time window and has 2 to 3 degrees of freedom.
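In a synchronous (cue-paced) BCI of the kind mentioned above, the analysis window is fixed relative to each cue. The sketch below illustrates that idea only, with a hypothetical channel count, cue timing and band-power feature; it is not the BCI system used in the study.

```python
import numpy as np

def extract_epochs(eeg, cue_samples, fs, window=(0.5, 3.5)):
    """Cut a fixed analysis window after each cue (synchronous, cue-paced BCI).
    eeg is (channels, samples); cue_samples are cue onsets in samples."""
    start, stop = int(window[0] * fs), int(window[1] * fs)
    return np.stack([eeg[:, c + start:c + stop] for c in cue_samples])

def band_power(epochs):
    """Log-variance of each channel as a crude band-power feature
    (assumes the signal was already band-pass filtered, e.g. 8-30 Hz)."""
    return np.log(epochs.var(axis=2))

# Hypothetical usage: cue-aligned trials turned into per-channel features
fs = 250
eeg = np.random.randn(8, 60 * fs)                      # 8 channels, 60 s of (fake) EEG
cues = np.arange(5, 55, 8) * fs                        # cue onsets every 8 s
features = band_power(extract_epochs(eeg, cues, fs))   # shape (n_trials, n_channels)
```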
Abstract:
Print quality and the printability of paper are very important attributes when modern printing applications are considered. In prints containing images, high print quality is a basic requirement. Tone unevenness and non-uniform glossiness of printed products are the most disturbing factors influencing overall print quality. These defects are caused by non-ideal interactions of paper, ink and printing devices in high-speed printing processes. Since print quality is a perceptual characteristic, the measurement of unevenness according to human vision is a significant problem. In this thesis, the mottling phenomenon is studied. Mottling is a printing defect characterized by a spotty, non-uniform appearance in solid printed areas. Print mottle is usually the result of uneven ink lay-down or non-uniform ink absorption across the paper surface, and it is especially visible in mid-tone imagery or areas of uniform color, such as solids and continuous-tone screen builds. By using existing knowledge of visual perception and known methods to quantify print tone variation, a new method for print unevenness evaluation is introduced. The method is compared to previous results in the field and is supported by psychometric experiments. Pilot studies are made to estimate the effect of the optical paper characteristics prior to printing on the unevenness of the printed area after printing. Instrumental methods for print unevenness evaluation have been compared, and the results of the comparison indicate that the proposed method produces better results in terms of correspondence with visual evaluation. The method has been successfully implemented as an industrial application and has proved to be a reliable substitute for visual expertise.
Abstract:
Neutral alpha-mannosidase and lysosomal MAN2B1 alpha-mannosidase belong to glycoside hydrolase family 38, which contains essential enzymes required for the modification and catabolism of asparagine-linked glycans on proteins. MAN2B1 catalyses lysosomal glycan degradation, while neutral α-mannosidase is most likely involved in the catabolism of cytosolic free oligosaccharides. These mannose-containing saccharides are generated during glycosylation or released from misfolded glycoproteins, which are detected by quality control in the endoplasmic reticulum. To characterise the biological function of human neutral α-mannosidase, I cloned the alpha-mannosidase cDNA and recombinantly expressed the enzyme. The purified enzyme trimmed the putative natural substrate Man9GlcNAc to Man5GlcNAc, whereas the reducing-end GlcNAc2 limited trimming to Man8GlcNAc2. Neutral α-mannosidase showed its highest enzyme activity at neutral pH and was activated by the cations Fe2+, Co2+ and Mn2+; Cu2+, in turn, had a strong inhibitory effect on alpha-mannosidase activity. Analysis of its intracellular localisation revealed that neutral alpha-mannosidase is cytosolic and colocalises with proteasomes. Further work showed that the overexpression of neutral alpha-mannosidase affected the cytosolic free oligosaccharide content and led to enhanced endoplasmic reticulum-associated degradation and underglycosylation of secreted proteins. The second part of the study focused on MAN2B1 and the inherited lysosomal storage disorder α-mannosidosis. In this disorder, deficient MAN2B1 activity is associated with mutations in the MAN2B1 gene. The thesis reports the molecular consequences of 35 alpha-mannosidosis-associated mutations, including 29 novel missense mutations. According to the experimental analyses, the mutations fall into four groups: mutations that prevent transport to lysosomes are accompanied by a lack of proteolytic processing of the enzyme (groups 1 and 3). Although the rest of the mutations (groups 2 and 4) allow transport to lysosomes, the mutated proteins are less efficiently processed to their mature form than is wild-type MAN2B1. Analysis of the effect of the mutations on the model structure of human lysosomal alpha-mannosidase provides insights into their structural consequences. Mutations that affect amino acids important for folding (prolines, glycines, cysteines) or domain interface interactions (arginines) arrest the enzyme in the endoplasmic reticulum. Surface mutations and changes that do not drastically alter residue volume are tolerated better. Descriptions of the mutations and clinical data are compiled in an α-mannosidosis database, which will be made available to the scientific community. This thesis provides a detailed insight into two ubiquitous human alpha-mannosidases. It demonstrates that neutral alpha-mannosidase is involved in the degradation of cytosolic oligosaccharides and suggests that the regulation of this α-mannosidase is important for maintaining the cellular homeostasis of N-glycosylation and glycan degradation. The study on alpha-mannosidosis-associated mutations identifies multiple mechanisms by which these mutations are detrimental to MAN2B1 activity. The α-mannosidosis database will benefit both clinicians and scientific research on lysosomal alpha-mannosidosis.