36 results for fundamental principles and applications

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

The development of new drug delivery systems targeting the anterior segment of the eye may offer many advantages: increased bioavailability of the drug, penetration of drugs that cannot be formulated as solutions, constant and sustained drug release, higher local concentrations without systemic effects, more specific targeting of one tissue or cell type, and a reduced frequency of instillation, which improves patient compliance and comfort while limiting the side effects of frequent instillation. Several approaches have been developed, aiming to increase corneal contact time through modified formulations or reservoir systems, or to increase tissue permeability using iontophoresis. To date, no ocular drug delivery system is ideal for all purposes. To maximize treatment efficacy, the specific pathological condition, the targeted intraocular tissue, and the location of the most severe pathology must be carefully evaluated before selecting the delivery method most suitable for each individual patient.

Relevance:

100.00%

Publisher:

Abstract:

In recent years, technological advances have allowed manufacturers to implement dual-energy computed tomography (DECT) on clinical scanners. With its unique ability to differentiate basis materials by their atomic number, DECT has opened new perspectives in imaging. DECT has been used successfully in musculoskeletal imaging, with applications ranging from the detection, characterization, and quantification of crystal and iron deposits to the simulation of noncalcium images (improving the visualization of bone marrow lesions) or noniodine images. Furthermore, the data acquired with DECT can be postprocessed to generate monoenergetic images at varying kiloelectron-volt levels, providing new methods for image contrast optimization as well as metal artifact reduction. The first part of this article reviews the basic principles and technical aspects of DECT, including radiation dose considerations. The second part focuses on applications of DECT in musculoskeletal imaging, including gout and other crystal-induced arthropathies, virtual noncalcium images for the study of bone marrow lesions, the study of collagenous structures, applications in computed tomography arthrography, and the detection of hemosiderin and metal particles.
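
To make the monoenergetic postprocessing mentioned above concrete, here is a minimal sketch in Python of a virtual monoenergetic image as a linear blend of two registered volumes acquired at low and high tube voltages. The array names, blending weights, and toy Hounsfield values are illustrative assumptions, not a vendor API.

    import numpy as np

    def virtual_monoenergetic(img_low_kv, img_high_kv, w):
        """Blend two registered DECT volumes (in HU) into a virtual
        monoenergetic image; w is the weight of the low-kV volume,
        chosen per target keV (higher keV -> smaller w, which also
        reduces artifacts from high-atomic-number materials)."""
        return w * img_low_kv + (1.0 - w) * img_high_kv

    # Toy 2x2 "volumes" in Hounsfield units (invented values).
    low_kv  = np.array([[60.0, 400.0], [30.0, 1200.0]])
    high_kv = np.array([[45.0, 300.0], [28.0,  700.0]])

    # A low-keV-like image emphasizes iodine/crystal contrast (large w);
    # a high-keV-like image reduces beam hardening and metal artifacts.
    vmi_low_kev  = virtual_monoenergetic(low_kv, high_kv, w=0.8)
    vmi_high_kev = virtual_monoenergetic(low_kv, high_kv, w=0.1)
    print(vmi_low_kev, vmi_high_kev, sep="\n")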

Relevance:

100.00%

Publisher:

Abstract:

The concept of early detection followed by intervention to improve prognosis seems straightforward. Applied to asymptomatic subjects, this concept, screening, is rather complex. This review presents the rationale and fundamental principles of screening. It underscores the principles relating to the disease and to the screening test under consideration, the importance of viewing screening as a program rather than merely a test, and the validity of the measures used to evaluate the efficacy of screening. Lastly, it reviews the biases most frequently encountered in screening studies and their interpretation.
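
One such bias, lead-time bias, can be made concrete with a little arithmetic: a minimal sketch with invented ages, assuming a disease whose date of death is unchanged by earlier detection.

    # Lead-time bias: screening moves the date of diagnosis earlier
    # without moving the date of death, so survival *measured from
    # diagnosis* looks longer even though nothing changed.
    age_at_clinical_diagnosis = 67   # symptoms appear (invented)
    age_at_screen_detection   = 63   # same disease found earlier (invented)
    age_at_death              = 70   # unchanged by detection (invented)

    survival_without_screening = age_at_death - age_at_clinical_diagnosis  # 3 years
    survival_with_screening    = age_at_death - age_at_screen_detection    # 7 years
    lead_time = age_at_clinical_diagnosis - age_at_screen_detection        # 4 years

    # The apparent 4-year gain is entirely lead time, not benefit.
    assert survival_with_screening - survival_without_screening == lead_time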

Relevance:

100.00%

Publisher:

Abstract:

Protein oxidation mechanisms result in a wide array of modifications, from backbone cleavage or protein crosslinking to more subtle modifications such as side-chain oxidations. Protein oxidation occurs as part of normal regulatory processes, as a defence mechanism against oxidative stress, or as a deleterious process when antioxidant defences are overcome. Because blood is continually exposed to reactive oxygen and nitrogen species, blood proteomics should inherently adopt redox proteomic strategies. In this review, we recall the biochemical basis of protein oxidation, review the proteomic methodologies applied to analyse redox modifications, and highlight some physiological and in vitro responses to oxidative stress of various blood components.

Relevance:

100.00%

Publisher:

Abstract:

The recent advances in sequencing technologies have given all microbiology laboratories access to whole-genome sequencing. Provided that tools for the automated analysis of sequence data and databases for the associated metadata are developed, whole-genome sequencing will become a routine tool for large clinical microbiology laboratories. Indeed, the continuing reduction in sequencing costs and the shortening of the 'time to result' make it an attractive strategy in both research and diagnostics. Here, we review how high-throughput sequencing is revolutionizing clinical microbiology and the promise that it still holds. We discuss major applications, which include: (i) identification of target DNA sequences and antigens to rapidly develop diagnostic tools; (ii) precise strain identification for epidemiological typing and pathogen monitoring during outbreaks; and (iii) investigation of strain properties, such as the presence of antibiotic resistance or virulence factors. In addition, recent developments in comparative metagenomics and single-cell sequencing offer the prospect of a better understanding of complex microbial communities at the global and individual levels, providing a new perspective for understanding host-pathogen interactions. As a high-resolution tool, high-throughput sequencing will increasingly influence diagnostics, epidemiology, risk management, and patient care.

Relevance:

100.00%

Publisher:

Abstract:

Occupational exposure assessment is an important stage in the management of chemical exposures. Few direct measurements are carried out in workplaces, and exposures are often estimated from expert judgement. There is therefore a strong need for simple, transparent tools to help occupational health specialists define exposure levels. The aim of the present research was to develop and improve modelling tools for predicting exposure levels. In a first step, a survey was conducted among occupational hygienists in Switzerland to define their expectations of modelling tools (types of results, models, and potential observable parameters). It was found that exposure models are rarely used in practice in Switzerland and that exposures are mainly estimated from the expert's past experience. Pollutant emission and dispersion near the source were also identified as key parameters. Experimental and modelling studies were then performed in specific cases to test the flexibility and limitations of existing tools. In particular, predictive models were applied to assess occupational exposure to carbon monoxide in different situations, and the results were compared with the exposure levels reported in the literature for similar situations. Exposure to waterproofing sprays was also studied as part of an epidemiological study of a Swiss cohort; here, laboratory investigations were undertaken to characterize the emission rate of waterproofing overspray, and a classical two-zone model was used to assess aerosol dispersion in the near and far field during spraying. Further experiments were carried out to better understand the processes of emission and dispersion for tracer compounds, focusing on the characterization of near-field exposure. An experimental set-up was developed to perform simultaneous measurements at several points in an exposure chamber using direct-reading instruments. It was found that, from a statistical point of view, the compartmental theory makes sense, although attribution to a given compartment could not be made on the basis of simple geometric considerations. In a further step, the experimental data were complemented by observations made in about 100 different workplaces, combining exposure measurements with observations of predefined determinants. The resulting data were used to improve an existing two-compartment exposure model: a tool was developed to include specific determinants in the choice of compartment, substantially improving the reliability of the predictions. All these investigations helped improve our understanding of modelling tools and identify their limitations. Integrating determinants that are more accessible and better matched to experts' needs should encourage the use of such tools in field practice. Moreover, by raising the quality of modelling tools, this research will not only encourage their systematic use but may also improve the conditions under which expert judgements are made, and therefore the protection of workers' health.
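
For readers unfamiliar with the two-zone model named above, here is a minimal sketch of the generic textbook mass balance it rests on, integrated with a simple Euler scheme; all parameter values are invented for illustration and are not the calibrated model from this thesis.

    import numpy as np

    # Classical two-zone (near-field/far-field) dispersion model.
    G    = 50.0    # emission rate into the near field, mg/min (invented)
    beta = 5.0     # interzonal airflow, m^3/min (invented)
    Q    = 20.0    # room ventilation (exhaust) rate, m^3/min (invented)
    V_N  = 1.0     # near-field volume, m^3 (invented)
    V_F  = 99.0    # far-field volume, m^3 (invented)

    dt, T = 0.01, 60.0                    # time step and duration, min
    n = int(T / dt)
    C_N = np.zeros(n); C_F = np.zeros(n)  # concentrations, mg/m^3

    for i in range(1, n):
        # Mass balance on each compartment (explicit Euler step):
        dCN = (G + beta * (C_F[i-1] - C_N[i-1])) / V_N
        dCF = (beta * (C_N[i-1] - C_F[i-1]) - Q * C_F[i-1]) / V_F
        C_N[i] = C_N[i-1] + dCN * dt
        C_F[i] = C_F[i-1] + dCF * dt

    # Analytical steady state for comparison: C_F = G/Q, C_N = G/Q + G/beta.
    print(C_N[-1], G / Q + G / beta)   # near field (worker's breathing zone)
    print(C_F[-1], G / Q)              # far field (rest of the room)

The steady-state gap between the two zones, G/beta, is exactly why near-field exposure of the worker exceeds what a single well-mixed-room model would predict.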

Relevance:

100.00%

Publisher:

Abstract:

Bacteria have long been the targets for genetic manipulation, but more recently they have been synthetically designed to carry out specific tasks. Among the simplest of these tasks is chemical compound and toxicity detection coupled to the production of a quantifiable reporter signal. In this Review, we describe the current design of bacterial bioreporters and their use in a range of assays to measure the presence of harmful chemicals in water, air, soil, food or biological specimens. New trends for integrating synthetic biology and microengineering into the design of bacterial bioreporter platforms are also highlighted.

Relevance:

100.00%

Publisher:

Abstract:

Despite the increasing popularity of enterprise architecture management (EAM) in practice, many EAM initiatives either fail or do not fully meet their targets. Several frameworks have been suggested as guidelines for EA implementation, but companies seldom follow prescriptive frameworks. Instead, they follow very diverse implementation approaches that depend on their organizational contingencies and on the way they adopt and evolve EAM over time. This research strives for a broader understanding of EAM by exploring context-dependent EAM adoption approaches and by identifying the main EA principles that affect EA effectiveness. Based on two studies, this dissertation addresses two main questions: (1) EAM design: which approaches do companies follow when adopting EAM? (2) EA principles and their impact: what impact do EA principles have on EA effectiveness and quality? By utilizing both qualitative and quantitative research methods, this research contributes to exploring different EAM designs under different organizational contingencies, as well as to using EA principles as an effective means of achieving principle-based EAM design. My research can help companies identify a suitable EAM design that fits their organizational settings and shape their EA through a set of principles.

Relevance:

100.00%

Publisher:

Abstract:

MR connectomics is an emerging framework in neuroscience that combines diffusion MRI and whole-brain tractography methodologies with the analytical tools of network science. In the present work we review the current methods enabling structural connectivity mapping with MRI and show how such data can be used to infer new information about both brain structure and function. We also list the technical challenges that should be addressed in the future to achieve high-resolution maps of structural connectivity. Given the tremendous amount of data that will soon accumulate, we discuss the new challenges that must be tackled in terms of methods for advanced network analysis and visualization, as well as data organization and distribution. This new framework is well suited to investigating key questions on brain complexity, and we try to foresee which fields will benefit most from these approaches.
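
As a hedged illustration of the network-science side of this framework: a minimal sketch assuming a structural connectivity matrix already produced by tractography; the random matrix here merely stands in for real data, and the metrics shown are typical rather than the specific analyses of this work.

    import numpy as np
    import networkx as nx

    # Stand-in for a structural connectivity matrix from whole-brain
    # tractography: symmetric, weighted, one node per brain region.
    rng = np.random.default_rng(0)
    n_regions = 8
    W = rng.random((n_regions, n_regions))
    W = np.triu(W, 1); W = W + W.T       # symmetrize, zero diagonal
    W[W < 0.5] = 0.0                     # keep only stronger connections

    G = nx.from_numpy_array(W)

    # Typical network-science summaries used in MR connectomics:
    strength = dict(G.degree(weight="weight"))        # node strength
    clustering = nx.average_clustering(G, weight="weight")
    if nx.is_connected(G):
        print("characteristic path length:",
              nx.average_shortest_path_length(G))
    print("node strengths:", strength)
    print("weighted clustering coefficient:", clustering)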

Relevance:

100.00%

Publisher:

Abstract:

Enterprise-wide architecture has become a necessity for organizations to (re)align information technology (IT) with changing business requirements. Since a city-planning metaphor inspired enterprise-wide architecture, this dissertation's research axes can be outlined through similarities between cities and enterprises. Both are dynamic super-systems that need to address the evolving interests of various architecture stakeholders, and both should simultaneously adhere to a set of principles to guide the evolution of the architecture towards the expected benefits. The extant literature on enterprise-wide architecture not only disregards the complexities of architecture adoption but also remains vague about how principles guide architecture evolution. To bridge this gap, this dissertation contains three interrelated research streams examining the principles and adoption of enterprise-wide architecture. The first research stream investigates the organizational intricacies inherent in architecture adoption. It characterizes architecture adoption as an ongoing organizational adaptation process and, by analyzing organizational response behaviors in this adaptation process, identifies four archetypes that represent very diverse architecture approaches. The second research stream ontologically clarifies the nature of architecture principles and outlines new avenues for theoretical contributions. It also provides an empirically validated set of principles and proposes a research model illustrating how principles can be applied to generate the expected architecture benefits. The third research stream examines architecture adoption in multinational corporations (MNCs). MNCs have unique organizational characteristics and must constantly balance global integration with local responsiveness. This research stream characterizes MNCs' architecture adoption as a continuous endeavor to synchronize the architecture with stakeholders' beliefs about how that balance should be struck. To conclude, this dissertation provides a thorough explanation of a long-term journey in which organizations learn over time to adopt an effective architecture approach, and it clarifies the role of principles in purposefully guiding that learning process.

Relevance:

100.00%

Publisher:

Abstract:

Retinoblastoma is the most common intraocular tumor in children. The diagnosis is usually established by the ophthalmologist on the basis of fundoscopy and US. Together with US, high-resolution MRI has emerged as an important imaging modality for pretreatment assessment, i.e. for diagnostic confirmation, detection of local tumor extent, detection of associated developmental malformation of the brain and detection of associated intracranial primitive neuroectodermal tumor (trilateral retinoblastoma). Minimum requirements for pretreatment diagnostic evaluation of retinoblastoma or mimicking lesions are presented, based on consensus among members of the European Retinoblastoma Imaging Collaboration (ERIC). The most appropriate techniques for imaging in a child with leukocoria are reviewed. CT is no longer recommended. Implementation of a standardized MRI protocol for retinoblastoma in clinical practice may benefit children worldwide, especially those with hereditary retinoblastoma, since a decreased use of CT reduces the exposure to ionizing radiation.

Relevance:

100.00%

Publisher:

Abstract:

This work is concerned with the development and application of novel unsupervised learning methods, with two target applications in mind: the analysis of forensic case data and the classification of remote sensing images. First, a method based on a symbolic optimization of the inter-sample distance measure is proposed to improve the flexibility of spectral clustering algorithms, and applied to forensic case data. This distance is optimized using a loss function related to the preservation of neighborhood structure between the input space and the space of principal components, and solutions are found using genetic programming. Results are compared with a variety of state-of-the-art clustering algorithms. Subsequently, a new large-scale clustering method based on the joint optimization of feature extraction and classification is proposed and applied to various databases, including two hyperspectral remote sensing images. The algorithm makes use of a functional model (e.g., a neural network) for clustering, which is trained by stochastic gradient descent. Results indicate that such a technique scales easily to huge databases, avoids the so-called out-of-sample problem, and can compete with or even outperform existing clustering algorithms on both artificial data and real remote sensing images. This is verified on small databases as well as on very large problems.
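
As a hedged sketch of the second idea, clustering by stochastic gradient descent on a functional model: here online k-means (SGD on the quantization objective) stands in for the neural network of the thesis, with invented data and hyperparameters. Because the learned model maps any point to a cluster, new samples are assigned directly, which is the sense in which the out-of-sample problem is avoided.

    import numpy as np

    # Online k-means: stochastic gradient descent on the quantization
    # objective. The centroids are the parameters of a (very simple)
    # functional clustering model; a neural network could replace them.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(m, 0.3, size=(500, 2)) for m in (-2.0, 0.0, 2.0)])
    rng.shuffle(X)

    k, lr, epochs = 3, 0.05, 5
    centroids = X[rng.choice(len(X), size=k, replace=False)].copy()

    for _ in range(epochs):
        for x in X:                                   # one sample at a time
            j = np.argmin(((centroids - x) ** 2).sum(axis=1))
            centroids[j] += lr * (x - centroids[j])   # SGD step on ||x - c_j||^2

    def assign(x_new):
        """Out-of-sample assignment: the learned model clusters unseen
        points directly, with no need to re-run the algorithm."""
        return int(np.argmin(((centroids - x_new) ** 2).sum(axis=1)))

    print(np.round(centroids, 2), assign(np.array([1.9, 2.1])))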

Relevance:

100.00%

Publisher:

Abstract:

Structural T1-weighted MR imaging on high-field systems (>3 T) is severely hampered by large transmit-field inhomogeneities. New sequences have been developed to better cope with such nuisances. In this work we show the potential of a recently proposed sequence, the MP2RAGE, to obtain improved grey/white matter contrast with respect to conventional T1-weighted protocols, allowing better visualization of thalamic nuclei and of different white matter bundles in the brain stem. Furthermore, we demonstrate the possibility of obtaining high-spatial-resolution (0.65 mm isotropic) R1 maps, fully independent of transmit-field inhomogeneities, in a clinically acceptable time. In these high-resolution R1 maps it was possible to clearly observe varying properties of grey matter throughout the cortex and to distinguish different hippocampal fields, with intensity variations that correlate with known variations in myelin concentration.
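
The bias-field insensitivity of MP2RAGE comes from how its two inversion images are combined; a minimal sketch of the published combination, assuming two registered complex gradient-echo volumes (the arrays below are synthetic stand-ins for real data).

    import numpy as np

    def mp2rage_uniform(s1, s2):
        """Combine the two complex MP2RAGE inversion images into the
        'uniform' T1-weighted image. Receive-field and T2* terms are
        common to s1 and s2, so they cancel in the ratio; the result
        is bounded in [-0.5, 0.5]."""
        return np.real(s1 * np.conj(s2)) / (np.abs(s1) ** 2 + np.abs(s2) ** 2)

    # Synthetic stand-ins for two registered complex GRE volumes.
    rng = np.random.default_rng(0)
    shape = (4, 4)
    s1 = rng.normal(size=shape) + 1j * rng.normal(size=shape)
    s2 = rng.normal(size=shape) + 1j * rng.normal(size=shape)

    uni = mp2rage_uniform(s1, s2)
    assert uni.min() >= -0.5 and uni.max() <= 0.5   # bounded by construction
    print(np.round(uni, 3))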