Abstract:
PURPOSE: The objective of this experiment is to establish a continuous postmortem circulation in the vascular system of porcine lungs and to evaluate the pulmonary distribution of the perfusate. This research is part of a broader project on the revascularization of Thiel-embalmed specimens, a technique that enables anatomy teaching, surgical training and research under lifelike conditions. METHODS: After cannulation of the pulmonary trunk and the left atrium, the vascular system was flushed with paraffinum perliquidum (PP) through a heart-lung machine. A continuous circulation was then established using red PP, during which perfusion parameters were measured. The distribution of contrast-containing PP in the pulmonary circulation was visualized on computed tomography. Finally, the amount of leakage from the vascular system was calculated. RESULTS: Reperfusion of the vascular system was sustained for 37 min. The flow rate ranged between 80 and 130 ml/min throughout the experiment, with acceptable perfusion pressures (range: 37-78 mm Hg). Computed tomography imaging and 3D reconstruction revealed a diffuse vascular distribution of PP and a vascularization ratio that decreased in the cranial direction. A self-limiting leak (i.e., 66.8% of the circulating volume) into the tracheobronchial tree due to vessel rupture was also measured. CONCLUSIONS: PP enables circulation in an isolated porcine lung model with an acceptable pressure-flow relationship, resulting in excellent recruitment of the vascular system. Despite these promising results, rupture of vessel walls may cause leaks. Further exploration of the perfusion capacities of PP in other organs is necessary. Eventually, this could lead to the development of reperfused Thiel-embalmed human bodies, which would have several applications.
Abstract:
Accurate modeling of flow instabilities requires computational tools able to deal with several interacting scales, from the scale at which fingers are triggered up to the scale at which their effects need to be described. The Multiscale Finite Volume (MsFV) method offers a framework to couple fine- and coarse-scale features by solving a set of localized problems which are used both to define a coarse-scale problem and to reconstruct the fine-scale details of the flow. The MsFV method can be seen as an upscaling-downscaling technique which is computationally more efficient than standard discretization schemes and more accurate than traditional upscaling techniques. We show that, although the method has proven accurate in modeling density-driven flow under stable conditions, its accuracy deteriorates for unstable flow, and an iterative scheme is required to control the localization error. To avoid the large computational overhead of the iterative scheme, we suggest several adaptive strategies for both flow and transport. In particular, the concentration gradient is used to identify a front region where instabilities are triggered and an accurate (iteratively improved) solution is required. Outside the front region the problem is upscaled, and both flow and transport are solved only at the coarse scale. This adaptive strategy leads to very accurate solutions at roughly the same computational cost as the non-iterative MsFV method. In many circumstances, however, an accurate description of flow instabilities requires a refinement of the computational grid rather than a coarsening. For these problems, we propose a modified iterative MsFV which can be used as a downscaling method (DMsFV). Compared to other grid-refinement techniques, the DMsFV clearly separates the computational domain into refined and non-refined regions, which can be treated separately and matched later. This gives great flexibility to employ different physical descriptions in different regions, where different equations could be solved, offering an excellent framework for constructing hybrid methods.
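The gradient-based adaptivity criterion is simple enough to illustrate. The following is a minimal Python sketch of the front-detection step, assuming a 2D concentration field on a uniform grid; the synthetic field, the threshold grad_tol, and all names are illustrative assumptions, not code or values from the study. Cells inside the mask would receive the iteratively improved fine-scale solution, while the rest of the domain is solved only at the coarse scale.

```python
import numpy as np

def front_mask(conc, dx, dy, grad_tol):
    """Flag cells whose concentration gradient magnitude exceeds grad_tol.

    Masked cells form the 'front region' where instabilities are triggered
    and the iterative fine-scale solve is applied; the remaining cells are
    handled by the coarse-scale problem only.
    """
    gy, gx = np.gradient(conc, dy, dx)   # gradients along y (rows) and x (cols)
    return np.hypot(gx, gy) > grad_tol

# Illustrative use on a synthetic fingering-like concentration field
nx, ny = 128, 128
x = np.linspace(0.0, 1.0, nx)
y = np.linspace(0.0, 1.0, ny)
X, Y = np.meshgrid(x, y)
conc = 0.5 * (1.0 + np.tanh((0.5 - Y + 0.05 * np.sin(8 * np.pi * X)) / 0.02))

mask = front_mask(conc, dx=x[1] - x[0], dy=y[1] - y[0], grad_tol=5.0)
print(f"front region: {mask.mean():.1%} of cells solved at the fine scale")
```

Because the front typically occupies a small fraction of the domain, restricting the iterative correction to the masked cells is what keeps the cost close to that of the non-iterative MsFV method.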
Abstract:
This work is concerned with the development and application of novel unsupervised learning methods, with two target applications in mind: the analysis of forensic case data and the classification of remote sensing images. First, a method based on a symbolic optimization of the inter-sample distance measure is proposed to improve the flexibility of spectral clustering algorithms, and applied to the problem of forensic case data. This distance is optimized using a loss function related to the preservation of neighborhood structure between the input space and the space of principal components, and solutions are found using genetic programming. Results are compared to a variety of state-of-the-art clustering algorithms. Subsequently, a new large-scale clustering method based on a joint optimization of feature extraction and classification is proposed and applied to various databases, including two hyperspectral remote sensing images. The algorithm makes use of a functional model (e.g., a neural network) for clustering, which is trained by stochastic gradient descent. Results indicate that such a technique can easily scale to huge databases, can avoid the so-called out-of-sample problem, and can compete with or even outperform existing clustering algorithms on both artificial data and real remote sensing images. This is verified on small databases as well as very large problems.
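The second method lends itself to a compact illustration. Below is a minimal numpy sketch of the general idea, not the code from this work: a linear map stands in for the neural network, and the feature extractor and cluster centroids are optimized jointly by stochastic gradient descent; all names and hyperparameters are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit(X, n_clusters, n_features, lr=0.01, epochs=30):
    """Jointly learn a linear feature map W and centroids C by SGD."""
    d = X.shape[1]
    W = rng.normal(scale=1.0 / np.sqrt(d), size=(n_features, d))
    # initialize centroids from mapped samples to avoid collapse
    C = X[rng.choice(len(X), n_clusters, replace=False)] @ W.T
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            z = W @ X[i]                                    # feature extraction
            k = int(np.argmin(((C - z) ** 2).sum(axis=1)))  # cluster assignment
            err = z - C[k]
            C[k] += lr * err                  # move centroid towards the sample
            W -= lr * np.outer(err, X[i])     # move mapped sample towards centroid
    return W, C

def predict(W, C, X):
    """Assign unseen samples: no out-of-sample problem, unlike spectral methods."""
    Z = X @ W.T
    return np.argmin(((Z[:, None, :] - C[None, :, :]) ** 2).sum(-1), axis=1)

# Toy data: two well-separated Gaussian blobs in five dimensions
X = np.vstack([rng.normal(0.0, 0.3, (200, 5)), rng.normal(2.0, 0.3, (200, 5))])
W, C = fit(X, n_clusters=2, n_features=2)
print(np.bincount(predict(W, C, X)))
```

Because training is a stream of single-sample updates and prediction is a plain function of the input, a model of this kind scales to very large databases and assigns new samples directly, which is what sidesteps the out-of-sample problem.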
Abstract:
The world of humanitarian action has undergone major developments in recent decades. In the field, crises and conflicts have become far more complex, requiring the involvement of specialists from many domains. Moreover, the financial volumes generated by donation campaigns and made available by donors have increased considerably. As a corollary of this financial growth, requirements for the control and traceability of funds have been tightened. In connection with these factors, the number of salaried staff in large non-governmental organizations has grown exponentially. A specific literature on performance evaluation, management and "leadership" in so-called "third sector" organizations has also emerged, as illustrated by the launch, in 1990, of the journal "Nonprofit Management and Leadership". Aid-recipient countries have likewise developed specific requirements regarding the projects implemented by NGOs. Through processes of "socialization of Western standards", they expect a certain level of programme quality from international actors. To adapt to these developments and meet the efficiency requirements imposed on them, aid organizations have had to transform themselves. Over recent decades, large organizations have thus undergone a professionalization of their structures, bringing them closer to an operating model we will call here "institutional", that is, formalized and organized. We use the term professionalization in the sense given to it by actors in the humanitarian field, namely to designate "the internal restructuring their organizations have been facing since the end of the 1980s". Several indicators of this professionalization within NGOs can be identified, notably a stronger division of labour, the development of specific statuses, the growing shift of humanitarian work to salaried positions, and the recourse to public funds. Another consequence of this evolution is the arrival of new professions on the humanitarian scene. Alongside the professions that traditionally founded NGOs (physicians, engineers, lawyers, etc.), the growing complexity and diversification of tasks has made it necessary to call on specific professional skills in fields such as communication, information technology and finance, to cite only a few examples. Specific knowledge and practices in NGO management have developed since the end of the 1990s. The profession of logistician has appeared, and is taught in specialized structures (for example by the Bioforce association in France). Specialized academic programmes in humanitarian action and cooperation have also been created, with the stated aim of training humanitarian professionals. Examples include the PIAH in Switzerland (Programme interdisciplinaire en action humanitaire, 2011) and the courses offered by the CIHC in the United States. [author]
Abstract:
We present a programmable microcontroller-driven injection system for exchanging the imaging medium during atomic force microscopy. With this low-noise system, high-resolution imaging can continue undisturbed while the medium is injected, as exemplified by online imaging of conformational changes in DNA molecules during the injection of an anticancer drug into the fluid chamber.
Abstract:
Environmental shifts and lifestyle changes may result in formerly adaptive traits becoming non-functional or maladaptive. The subsequent decay of such traits highlights the importance of natural selection for adaptations, yet its causes have rarely been investigated. To study the fate of formerly adaptive traits after lifestyle changes, we evaluated sexual traits in five independently derived asexual lineages, including traits that are specific to males and therefore not exposed to selection. At least four of the asexual lineages retained the capacity to produce males that display normal courtship behaviours and are able to fertilize eggs of females from related sexual species. The maintenance of male traits may stem from pleiotropy, or from these traits only regressing via drift, which may require millions of years to generate phenotypic effects. By contrast, we found parallel decay of sexual traits in females. Asexual females produced altered airborne and contact signals, had modified sperm storage organs, and lost the ability to fertilize their eggs, impeding reversals to sexual reproduction. Female sexual traits were decayed even in recently derived asexuals, suggesting that trait changes following the evolution of asexuality, when they occur, proceed rapidly and are driven by selective processes rather than drift.
Abstract:
Automatic environmental monitoring networks, supported by wireless communication technologies, now provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Spatial maps of hazard-related parameters, produced from point observations and available auxiliary information, are particularly useful for risk assessment and decision making. The purpose of this article is to present and explore appropriate tools for processing large amounts of available data and producing predictions at fine spatial scales. These are the algorithms of machine learning, which are aimed at non-parametric, robust modelling of non-linear dependencies from empirical data. The computational efficiency of data-driven methods allows prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation is encountered, in particular, in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topographies of mountainous regions, meteorological processes are strongly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The particular illustration of the developed methodology concerns the mapping of temperatures (including situations of Föhn and temperature inversion), given measurements taken from the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
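As a hedged illustration of the mapping workflow (not the study's actual pipeline, which also includes data-driven feature selection and handles the Föhn and inversion situations), the Python sketch below fits a support vector regression to synthetic station data whose inputs are coordinates and DEM elevation, then predicts temperature on a grid; all data and parameter values are invented for the example.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(42)

# Hypothetical stations: coordinates [m], DEM elevation [m], temperature [deg C]
n_stations = 200
xy = rng.uniform(0.0, 100e3, size=(n_stations, 2))
elev = rng.uniform(300.0, 3000.0, size=n_stations)
# Synthetic truth: -6.5 K/km lapse rate, a weak regional trend, measurement noise
temp = 15.0 - 6.5e-3 * elev + 2e-5 * xy[:, 0] + rng.normal(0.0, 0.5, n_stations)

X = np.column_stack([xy, elev])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X, temp)

# Predict on a grid of DEM cells to produce the temperature map
gx, gy = np.meshgrid(np.linspace(0.0, 100e3, 50), np.linspace(0.0, 100e3, 50))
grid_elev = rng.uniform(300.0, 3000.0, size=gx.size)  # stand-in for real DEM values
temp_map = model.predict(np.column_stack([gx.ravel(), gy.ravel(), grid_elev]))
print(temp_map.reshape(50, 50).round(1)[0, :5])
```

In the real setting the grid inputs come from the digital elevation model itself (and from features derived from it, such as slope or curvature), so the learned relation between relief and temperature is what produces the fine-scale map.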
Abstract:
The velocity of a liquid slug falling in a capillary tube is lower than predicted for Poiseuille flow due to the presence of menisci, whose shapes are determined by the complex interplay of capillary, viscous, and gravitational forces. Because of the menisci, a capillary pressure proportional to the surface curvature acts on the slug, and streamlines are bent close to the interface, resulting in enhanced viscous dissipation at the wedges. To determine the origin of the drag-force increase relative to Poiseuille flow, we compute the force resultant acting on the slug by integrating the Navier-Stokes equations over the liquid volume. Invoking relationships from differential geometry, we demonstrate that the additional drag is due to viscous forces only and that no capillary drag of hydrodynamic origin exists (i.e., due to hydrodynamic deformation of the interface). Requiring that the force resultant be zero, we derive scaling laws for the steady velocity in the limit of small capillary numbers by estimating the leading-order viscous dissipation in the different regions of the slug (i.e., the unperturbed Poiseuille-like bulk, the static menisci close to the tube axis, and the dynamic regions close to the contact lines). Considering both partial and complete wetting, we find that the relationship between dimensionless velocity and weight is, in general, nonlinear. Whereas the relationship obtained for complete-wetting conditions agrees with the experimental data of Bico and Quere [J. Bico and D. Quere, J. Colloid Interface Sci. 243, 262 (2001)], the scaling law under partial-wetting conditions is validated by numerical simulations performed with the Volume of Fluid method. The simulated steady velocities agree with the behavior predicted by the theoretical scaling laws in the presence and in the absence of static contact-angle hysteresis. The numerical simulations suggest that wedge-flow dissipation alone cannot account for the entire additional drag and that the non-Poiseuille dissipation in the static menisci (not considered in previous studies) has to be taken into account for large contact angles.
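The zero-resultant condition underlying the scaling laws can be written compactly. The following is a hedged reconstruction in our own notation, consistent with the abstract but not necessarily with the paper's: the weight of the slug of volume V is balanced by the pressure and viscous stresses integrated over its boundary S.

```latex
% Steady motion: the net force on the slug vanishes.
% V = liquid volume, S = its boundary, \mathbf{n} = outward normal,
% p = pressure, \mathbf{u} = velocity, \mu = dynamic viscosity.
\mathbf{0} \;=\; \rho \mathbf{g}\, V \;+\; \oint_{S}
  \left[\, -p\,\mathbf{I} \;+\; \mu \left( \nabla \mathbf{u}
  + \nabla \mathbf{u}^{\mathsf{T}} \right) \right] \cdot \mathbf{n} \,\mathrm{d}S
```

The abstract's statement that no capillary drag of hydrodynamic origin exists means the interface-deformation contribution to the pressure term cancels, so the weight must be balanced by viscous stresses alone; estimating those stresses region by region (Poiseuille-like bulk, static menisci, contact-line zones) is what yields the velocity-weight scaling laws at small capillary numbers.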
Abstract:
For patients with chronic lung diseases such as chronic obstructive pulmonary disease (COPD), exacerbations are life-threatening events causing acute respiratory distress that can lead to hospitalization and death. Although a great deal of effort has been put into research on exacerbations and potential treatment options, the exact underlying mechanisms have yet to be deciphered, and no therapy that effectively targets the excessive inflammation is available. In this study, we report that interleukin-1β (IL-1β) and interleukin-17A (IL-17A) are key mediators of neutrophilic inflammation in influenza-induced exacerbations of chronic lung inflammation. Using a mouse model of disease, our data show a role for IL-1β in mediating lung dysfunction and in driving neutrophilic inflammation throughout the course of viral infection. We further report a role for IL-17A as a mediator of IL-1β-induced neutrophilia at early time points during influenza-induced exacerbations. Blocking IL-17A or IL-1 resulted in a significant abrogation of neutrophil recruitment to the airways in the initial phase of infection or at the peak of viral replication, respectively. IL-17A and IL-1β are therefore potential targets for the therapeutic treatment of viral exacerbations of chronic lung inflammation.
Abstract:
Connectivity among populations plays a crucial role in maintaining genetic variation at a local scale, especially in small populations strongly affected by genetic drift. The negative consequences of population disconnection on allelic richness and gene diversity (heterozygosity) are well recognized and empirically established. It is not well recognized, however, that a sudden drop in local effective population size induced by such disconnection produces a temporary disequilibrium in allelic frequency distributions that is akin to the genetic signature of a demographic bottleneck. To document this effect, we used individual-based simulations and empirical data on allelic richness and gene diversity in six pairs of isolated versus well-connected (core) populations of European tree frogs. In our simulations, population disconnection depressed allelic richness more than heterozygosity and thus resulted in a temporary excess of gene diversity relative to mutation-drift equilibrium (i.e., the signature of a genetic bottleneck). We observed a similar excess of gene diversity in isolated populations of tree frogs. Our results show that population disconnection can create a genetic bottleneck in the absence of demographic collapse.
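The simulated effect is easy to reproduce in miniature. The following Python sketch, which is not the authors' simulation code and uses invented parameter values, tracks allelic richness and gene diversity through a sudden drop in effective population size; richness collapses faster than heterozygosity, which is the transient bottleneck signature described above.

```python
import numpy as np

rng = np.random.default_rng(1)

def wright_fisher(n_alleles=20, n_before=500, n_after=25, gens=30):
    """Track allelic richness and gene diversity through a sudden
    drop in effective population size (the 'disconnection' event)."""
    pop = rng.integers(0, n_alleles, size=2 * n_before)  # 2N gene copies
    rich, div = [], []
    for g in range(gens):
        n = n_before if g == 0 else n_after  # disconnection after generation 0
        pop = rng.choice(pop, size=2 * n, replace=True)  # random sampling = drift
        freqs = np.bincount(pop, minlength=n_alleles) / pop.size
        rich.append(int((freqs > 0).sum()))       # allelic richness
        div.append(1.0 - (freqs ** 2).sum())      # expected heterozygosity
    return rich, div

rich, div = wright_fisher()
print(f"richness  : {rich[0]} -> {rich[-1]} alleles")
print(f"diversity : {div[0]:.3f} -> {div[-1]:.3f}")
```

Rare alleles are lost almost immediately after the drop, while heterozygosity decays only at a rate of about 1/(2N) per generation; the resulting excess of gene diversity relative to the remaining number of alleles is the disequilibrium that bottleneck tests detect.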