912 results for fundamental principles and applications


Relevance:

100.00%

Publisher:

Abstract:

We propose to synthesize new materials as building blocks for nanometre- and micrometre-scale structures: functionalized carbon nanotubes, metallic nanoparticles, smart hydrogels, and mesoporous carbons. With these, structured solid/liquid interfaces will be built: self-assembled multilayers, micrometric patterns with three-dimensional heterogeneity, and hierarchical structures. The exchange of mobile species at these interfaces will be studied using electrochemical, spectroelectrochemical, optical, and microscopy techniques, making it possible to control exchange at the interface. Building on this knowledge, technological applications will be developed, such as oligonucleotide sensors, micro fuel cells, microelectrode arrays, and supercapacitors.

Relevance:

100.00%

Publisher:

Abstract:

Electrokinetic transport, electrochromatography, electroosmotic flow, electrophoresis, concentration polarization, fixed beds, monoliths, dynamic NMR microscopy, quantitative confocal laser scanning microscopy, mathematical modelling, numerical analysis

Relevance:

100.00%

Publisher:

Abstract:

Simulation, modelling, proxels, PDEs, Markov chains, Petri nets, stochastic, performability, transient analysis

Relevance:

100.00%

Publisher:

Abstract:

The classical Łojasiewicz inequality and its extensions to partial differential equation problems (Simon) and to o-minimal structures (Kurdyka) have had a considerable impact on the analysis of gradient-like methods and related problems: minimization methods, complexity theory, asymptotic analysis of dissipative partial differential equations, and tame geometry. This paper provides alternative characterizations of this type of inequality for nonsmooth lower semicontinuous functions defined on a metric or a real Hilbert space. In the metric context, we show that a generalized form of the Łojasiewicz inequality (here called the Kurdyka-Łojasiewicz inequality) relates to metric regularity and to the Lipschitz continuity of the sublevel mapping, yielding applications to discrete methods (strong convergence of the proximal algorithm). In the Hilbert setting we further establish that asymptotic properties of the semiflow generated by -∂f are strongly linked to this inequality. This is done by introducing the notion of a piecewise subgradient curve: such curves have uniformly bounded lengths if and only if the Kurdyka-Łojasiewicz inequality is satisfied. Further characterizations are given in terms of talweg lines (a concept linked to the location of the least steep points on the level sets of f) and in terms of integrability conditions. In the convex case these results are significantly reinforced, allowing us in particular to establish the asymptotic equivalence of discrete gradient methods and continuous gradient curves. On the other hand, a counterexample of a convex C2 function in R2 is constructed to illustrate the fact that, contrary to intuition, and unless a specific growth condition is satisfied, convex functions may fail to satisfy the Kurdyka-Łojasiewicz inequality.
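For reference, the Kurdyka-Łojasiewicz inequality discussed in this abstract is usually stated as follows; this is the standard formulation from the literature (the desingularizing function φ and the constants are the conventional notation, not taken from the abstract itself):

```latex
% Kurdyka-Lojasiewicz inequality (standard statement).
% f is proper lower semicontinuous, \bar{x} a critical point of f.
Let $\eta > 0$, let $U$ be a neighbourhood of $\bar{x}$, and let
$\varphi \in C^0[0,\eta) \cap C^1(0,\eta)$ be concave with
$\varphi(0) = 0$ and $\varphi' > 0$ on $(0,\eta)$. Then $f$ satisfies
the Kurdyka-\L{}ojasiewicz inequality at $\bar{x}$ if, for every
$x \in U$ with $f(\bar{x}) < f(x) < f(\bar{x}) + \eta$,
\[
  \varphi'\bigl(f(x) - f(\bar{x})\bigr)\,
  \operatorname{dist}\bigl(0, \partial f(x)\bigr) \;\ge\; 1 .
\]
The classical \L{}ojasiewicz case corresponds to the choice
$\varphi(s) = c\, s^{1-\theta}$ with $\theta \in [0,1)$, which for a
smooth $f$ reduces to the familiar gradient inequality
$|f(x) - f(\bar{x})|^{\theta} \le C\,\|\nabla f(x)\|$ near $\bar{x}$.
\]
```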

Relevance:

100.00%

Publisher:

Abstract:

NORTH SEA STUDY OCCASIONAL PAPER No. 116

Relevance:

100.00%

Publisher:

Abstract:

The recent advances in sequencing technologies have given all microbiology laboratories access to whole genome sequencing. Provided that tools for the automated analysis of sequence data and databases for the associated meta-data are developed, whole genome sequencing will become a routine tool for large clinical microbiology laboratories. Indeed, the continuing reduction in sequencing costs and the shortening of the 'time to result' make it an attractive strategy in both research and diagnostics. Here, we review how high-throughput sequencing is revolutionizing clinical microbiology and the promise that it still holds. We discuss major applications, which include: (i) identification of target DNA sequences and antigens to rapidly develop diagnostic tools; (ii) precise strain identification for epidemiological typing and pathogen monitoring during outbreaks; and (iii) investigation of strain properties, such as the presence of antibiotic resistance or virulence factors. In addition, recent developments in comparative metagenomics and single-cell sequencing offer the prospect of a better understanding of complex microbial communities at the global and individual levels, providing a new perspective for understanding host-pathogen interactions. Being a high-resolution tool, high-throughput sequencing will increasingly influence diagnostics, epidemiology, risk management, and patient care.

Relevance:

100.00%

Publisher:

Abstract:

Occupational exposure assessment is an important stage in the management of chemical exposures. Few direct measurements are carried out in workplaces, and exposures are often estimated on the basis of expert judgement. There is therefore a major need for simple, transparent tools that help occupational health specialists define exposure levels. The aim of the present research is to develop and improve modelling tools for predicting exposure levels. In a first step, a survey was conducted among occupational hygienists in Switzerland to define their expectations of modelling tools (types of results, models, and potential observable parameters). It was found that exposure models are rarely used in practice in Switzerland and that exposures are mainly estimated from the expert's past experience. Moreover, chemical emissions and their dispersion near the source were considered key parameters. Experimental and modelling studies were then performed in specific cases to test the flexibility and drawbacks of existing tools. In particular, predictive models were applied to assess occupational exposure to carbon monoxide in different situations, and the results were compared with exposure levels reported in the literature for similar situations. Further, exposure to waterproofing sprays was studied as part of an epidemiological study on a Swiss cohort. In this case, laboratory investigations were undertaken to characterize the emission rate of waterproofing overspray, and a classical two-zone model was then used to assess aerosol dispersion in the near and far field during spraying. Additional experiments were carried out to better understand the processes of emission and dispersion of a tracer compound, focusing on the characterization of near-field exposure. An experimental set-up was developed to perform simultaneous measurements at several points in an exposure chamber using direct-reading instruments. It was found that, from a statistical point of view, the compartmental theory makes sense, although the attribution to a given compartment could not be made on the basis of simple geometric considerations. In a further step, the experimental data were complemented by observations made in about 100 different workplaces, including exposure measurements and observation of predefined determinants. The data obtained were used to improve an existing two-zone exposure model: a tool was developed to include specific determinants in the choice of compartment, thus largely improving the reliability of the predictions. All these investigations helped improve our understanding of modelling tools and identify their limitations. The integration of more accessible determinants that match experts' needs should encourage model application in field practice. Moreover, by increasing the quality of modelling tools, this research will not only encourage their systematic use but may also improve the conditions in which expert judgements are made, and therefore the protection of workers' health.
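The classical two-zone (near-field/far-field) model mentioned in this abstract can be sketched as a pair of mass-balance ODEs integrated over time. The parameter values below are illustrative assumptions, not data from the thesis:

```python
# Two-zone (near-field / far-field) exposure model, integrated
# with forward Euler. All parameter values are made-up examples.

def two_zone(G=10.0, beta=5.0, Q=20.0, V_N=1.0, V_F=99.0,
             t_end=60.0, dt=0.01):
    """Return (C_N, C_F) in mg/m^3 after t_end minutes.

    G        : emission rate at the source (mg/min)
    beta     : interzonal airflow between the two zones (m^3/min)
    Q        : room supply/exhaust airflow (m^3/min)
    V_N, V_F : near- and far-field zone volumes (m^3)
    """
    C_N = C_F = 0.0
    for _ in range(int(t_end / dt)):
        # Near field: receives the emission plus exchange with far field.
        dC_N = (G + beta * C_F - beta * C_N) / V_N
        # Far field: exchange with near field, diluted by ventilation.
        dC_F = (beta * C_N - beta * C_F - Q * C_F) / V_F
        C_N += dC_N * dt
        C_F += dC_F * dt
    return C_N, C_F

C_N, C_F = two_zone()
```

At steady state the model gives C_F = G/Q for the far field and C_N = G/Q + G/beta for the near field, which is why the near-field worker sees a higher concentration than room-average models predict.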

Relevance:

100.00%

Publisher:

Abstract:

In this paper we examine the problem of compositional data from a different starting point. Chemical compositional data, as used in provenance studies on archaeological materials, will be approached from measurement theory. The results will show, in a very intuitive way, that chemical data can only be treated using the approach developed for compositional data. It will be shown that compositional data analysis is a particular case in projective geometry, when the projective coordinates are in the positive orthant and have the properties of logarithmic interval metrics. Moreover, it will be shown that this approach can be extended to a very large number of applications, including shape analysis. This will be exemplified with a case study in the architecture of Early Christian churches dated back to the 5th-7th centuries AD.
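The log-ratio treatment of compositional data that this abstract refers to can be sketched with the centred log-ratio (clr) transform from Aitchison's framework; the oxide percentages below are an illustrative made-up sample, not data from the paper:

```python
# Centred log-ratio (clr) transform used in compositional data
# analysis. The input composition is an illustrative assumption.
import math

def closure(x):
    """Rescale positive parts so they sum to 1 (project onto the simplex)."""
    s = sum(x)
    return [xi / s for xi in x]

def clr(x):
    """Centred log-ratio: log of each part over the geometric mean."""
    x = closure(x)
    g = math.exp(sum(math.log(xi) for xi in x) / len(x))
    return [math.log(xi / g) for xi in x]

coords = clr([48.2, 30.1, 21.7])   # e.g. three oxide percentages
```

The clr coordinates always sum to zero, so only relative information (the ratios between parts) survives the transform, which is exactly the point of the compositional approach.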

Relevance:

100.00%

Publisher:

Abstract:

Bacteria have long been the targets for genetic manipulation, but more recently they have been synthetically designed to carry out specific tasks. Among the simplest of these tasks is chemical compound and toxicity detection coupled to the production of a quantifiable reporter signal. In this Review, we describe the current design of bacterial bioreporters and their use in a range of assays to measure the presence of harmful chemicals in water, air, soil, food or biological specimens. New trends for integrating synthetic biology and microengineering into the design of bacterial bioreporter platforms are also highlighted.

Relevance:

100.00%

Publisher:

Abstract:

Despite the increasing popularity of enterprise architecture management (EAM) in practice, many EAM initiatives either do not fully meet the expected targets or fail. Several frameworks have been suggested as guidelines for EA implementation, but companies seldom follow prescriptive frameworks. Instead, they follow very diverse implementation approaches that depend on their organizational contingencies and on how they adopt and evolve EAM over time. This research strives for a broader understanding of EAM by exploring context-dependent EAM adoption approaches and by identifying the main EA principles that affect EA effectiveness. Based on two studies, this dissertation addresses two main questions: (1) EAM design: Which approaches do companies follow when adopting EAM? (2) EA principles and their impact: What impact do EA principles have on EA effectiveness/quality? By utilizing both qualitative and quantitative research methods, this research contributes to exploring different EAM designs under different organizational contingencies, as well as to using EA principles as an effective means to achieve principle-based EAM design. This research can help companies identify a suitable EAM design that fits their organizational settings and shape their EA through a set of principles.

Relevance:

100.00%

Publisher:

Abstract:

MR connectomics is an emerging framework in neuroscience that combines diffusion MRI and whole-brain tractography methodologies with the analytical tools of network science. In the present work we review the current methods enabling structural connectivity mapping with MRI and show how such data can be used to infer new information about both brain structure and function. We also list the technical challenges that should be addressed in the future to achieve high-resolution maps of structural connectivity. Given the tremendous amount of data that will soon be accumulated, we discuss the new challenges that must be tackled in terms of methods for advanced network analysis and visualization, as well as data organization and distribution. This new framework is well suited to investigating key questions on brain complexity, and we try to foresee which fields will most benefit from these approaches.
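The "analytical tools of network science" step can be illustrated in miniature: once tractography yields a structural connectivity matrix between brain regions, simple graph metrics are computed from it. The 4-node binary matrix below is a toy assumption, not connectome data:

```python
# Toy network-science step for a structural connectome: given a
# symmetric binary connectivity matrix (1 = tract found between two
# regions), compute node degrees and connection density.
# The matrix itself is a made-up 4-region example.

A = [
    [0, 1, 1, 0],
    [1, 0, 1, 1],
    [1, 1, 0, 0],
    [0, 1, 0, 0],
]

degrees = [sum(row) for row in A]      # connections per region
n = len(A)
edges = sum(degrees) // 2              # each undirected edge counted twice
density = edges / (n * (n - 1) / 2)    # fraction of possible edges present
```

In real connectome studies the matrix is weighted (e.g. by fiber count or density) and has tens to hundreds of regions, but the analysis pipeline starts from exactly this kind of adjacency structure.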

Relevance:

100.00%

Publisher:

Abstract:

The development of the field-scale Erosion Productivity Impact Calculator (EPIC) model was initiated in 1981 to support assessments of soil erosion impacts on soil productivity for soil, climate, and cropping conditions representative of a broad spectrum of U.S. agricultural production regions. The first major application of EPIC was a national analysis performed in support of the 1985 Resources Conservation Act (RCA) assessment. The model has continuously evolved since that time and has been applied in a wide range of field, regional, and national studies both in the U.S. and in other countries. The range of EPIC applications has also expanded greatly over that time, including studies of (1) surface runoff and leaching estimates of nitrogen and phosphorus losses from fertilizer and manure applications, (2) leaching and runoff from simulated pesticide applications, (3) soil losses from wind erosion, (4) climate change impacts on crop yield and erosion, and (5) soil carbon sequestration assessments. The EPIC acronym now stands for Environmental Policy Integrated Climate, to reflect the greater diversity of problems to which the model is currently applied. The Agricultural Policy EXtender (APEX) model is essentially a multi-field version of EPIC that was developed in the late 1990s to address environmental problems associated with livestock and other agricultural production systems on a whole-farm or small-watershed basis. The APEX model also continues to evolve and to be utilized for a wide variety of environmental assessments. The historical development of both models will be presented, as well as example applications at several different scales.

Relevance:

100.00%

Publisher:

Abstract:

Task Force members formulated these principles and practices as a way to promote good management practices, ethical conduct, and public accountability. By compiling the information in this guide, we hope to provide a valuable tool for organizations and individuals as they go about the work of building better Iowa communities.

Relevance:

100.00%

Publisher:

Abstract:

Enterprise-wide architecture has become a necessity for organizations to (re)align information technology (IT) with changing business requirements. Since a city-planning metaphor inspired enterprise-wide architecture, this dissertation's research axes can be outlined through similarities between cities and enterprises. Both are dynamic super-systems that need to address the evolving interests of various architecture stakeholders, and both should simultaneously adhere to a set of principles that guide the evolution of architecture towards the expected benefits. The extant literature on enterprise-wide architecture not only disregards the complexities of architecture adoption but also remains vague about how principles guide architecture evolution. To bridge this gap, this dissertation contains three interrelated research streams examining the principles and adoption of enterprise-wide architecture. The first research stream investigates the organizational intricacies inherent in architecture adoption. It characterizes architecture adoption as an ongoing organizational adaptation process and, by analyzing organizational response behaviors in this adaptation process, identifies four archetypes representing very diverse architecture approaches. The second research stream ontologically clarifies the nature of architecture principles and outlines new avenues for theoretical contributions. It also provides an empirically validated set of principles and proposes a research model illustrating how principles can be applied to generate the expected architecture benefits. The third research stream examines architecture adoption in multinational corporations (MNCs). MNCs have unique organizational characteristics and constantly strive to balance global integration and local responsiveness. This research stream characterizes MNCs' architecture adoption as a continuous endeavor to synchronize architecture with stakeholders' beliefs about how to balance global integration and local responsiveness. To conclude, this dissertation provides a thorough explanation of a long-term journey in which organizations learn over time to adopt an effective architecture approach, and it clarifies the role of principles in purposefully guiding that learning process.