984 results for coherent magnify


Relevance: 10.00%

Abstract:

Matrix effects, an important issue in liquid chromatography coupled to mass spectrometry or tandem mass spectrometry detection, should be closely assessed during method development. For quantitative analysis, the use of a stable isotope-labelled internal standard with physico-chemical properties and ionization behaviour similar to those of the analyte is recommended. In this paper, an example of the choice of a co-eluting deuterated internal standard to compensate for short-term and long-term matrix effects in chiral (R,S)-methadone plasma quantification is reported. The method was fully validated over a concentration range of 5-800 ng/mL for each methadone enantiomer, with satisfactory relative bias (-1.0 to 1.0%), repeatability (0.9-4.9%) and intermediate precision (1.4-12.0%). From the results obtained during validation, a control chart process was established over 52 series of routine analyses, using both the intermediate precision standard deviation and the FDA acceptance criteria. The results of routine quality control samples generally fell within the ±15% variability around the target value, and mainly within the two-standard-deviation interval, illustrating the long-term stability of the method. The intermediate precision variability estimated during method validation was found to be coherent with the routine use of the method. During this period, 257 trough-concentration and 54 peak-concentration plasma samples from patients undergoing (R,S)-methadone treatment were successfully analysed for routine therapeutic drug monitoring.
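As an illustration of the dual acceptance criteria described above, here is a minimal sketch (Python, with hypothetical concentration values; not from the paper) that checks a quality-control result against both the FDA ±15% window around the nominal value and the ±2 standard deviation control-chart interval derived from the intermediate precision:

```python
# Minimal sketch of a dual QC acceptance check (hypothetical values):
# the FDA window is +/-15% around the nominal concentration, and the
# control-chart interval is +/-2 standard deviations of intermediate precision.

def qc_flags(measured, nominal, ip_sd):
    """Return acceptance flags for one quality-control result."""
    within_fda = abs(measured - nominal) <= 0.15 * nominal
    within_2sd = abs(measured - nominal) <= 2.0 * ip_sd
    return {"fda_15pct": within_fda, "two_sd": within_2sd}

# Example: a 400 ng/mL QC with a hypothetical intermediate-precision SD of 20 ng/mL.
print(qc_flags(measured=445.0, nominal=400.0, ip_sd=20.0))
# -> {'fda_15pct': True, 'two_sd': False}
```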

Relevance: 10.00%

Abstract:

We show how ultrafast pump-pump excitation induces strong fluorescence depletion in biological samples, such as bacteria-containing droplets, in contrast with fluorescent interferents such as polycyclic aromatic compounds, despite their similar spectroscopic properties. An application to the optical remote discrimination of biotic versus non-biotic particles is proposed. Further improvement is required to allow the discrimination of one pathogenic micro-organism among other, non-pathogenic ones. This improved selectivity may be reached with optimal coherent control experiments, as discussed in the paper.

Relevance: 10.00%

Abstract:

Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful tools to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book:

- Includes self-contained introductions to probability and decision theory.
- Develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models.
- Features implementation of the methodology with reference to commercial and academically available software.
- Presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases.
- Provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning.
- Contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them.
- Is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background.
- Includes a foreword by Ian Evett.

The clear and accessible style of this second edition makes the book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision-making based on scientific information.
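As a minimal illustration of the probabilistic reasoning the book formalizes (my own sketch, not drawn from the book, with entirely hypothetical numbers), the value of an item of evidence can be expressed as a likelihood ratio that updates the prior odds of a hypothesis:

```python
# Illustrative sketch (not from the book): Bayesian updating of the odds of a
# prosecution hypothesis Hp against a defence hypothesis Hd, given evidence E.

def posterior_odds(prior_odds, likelihood_ratio):
    """Posterior odds = prior odds * LR, where LR = P(E|Hp) / P(E|Hd)."""
    return prior_odds * likelihood_ratio

# Hypothetical numbers: prior odds of 1:1000 and a likelihood ratio of 10,000
# for a reported match.
odds = posterior_odds(prior_odds=1 / 1000, likelihood_ratio=10_000)
print(odds, "->", odds / (1 + odds), "posterior probability")  # 10.0 -> ~0.909
```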

Relevance: 10.00%

Abstract:

The globalization of markets, changes in the economic context and the impact of new information technologies have forced companies to rethink the way they manage their intellectual capital (knowledge management) and human capital (competence management). It is now commonly accepted that these assets play a particularly strategic role in the organization. A company wishing to embark on a policy for managing these assets faces several problems. In order to manage this knowledge and these competences, a long capitalization process must be carried out, passing through several stages such as the identification, extraction and representation of knowledge and competences. Various knowledge and competence management methods exist for this purpose, such as MASK, CommonKADS and KOD. Unfortunately, these methods are very cumbersome to implement, are restricted to certain types of knowledge and are consequently limited in the functionality they can offer. Finally, competence management and knowledge management are treated as two separate domains, whereas it would be worthwhile to unify the two approaches, since competences are very close to knowledge, as the following definition of competence underlines: "a set of knowledge in action in a given context". We therefore chose to base our proposal on the concept of competence. Competence is among the most crucial of a company's knowledge assets, in particular for avoiding the loss of know-how and for anticipating the company's future needs, because behind the competences of its collaborators lies the efficiency of the organization. Moreover, many other organizational concepts, such as jobs, missions, projects and trainings, can be described in terms of competences. Unfortunately, there is no real consensus on the definition of competence, and the existing definitions, even when fully satisfactory to experts, do not make it possible to build an operational system. In our approach, we address competence management by means of a knowledge management method: by their very nature, knowledge and competence are intimately linked, so such a method is well suited to managing competences. To exploit this knowledge and these competences, we first had to define the organizational concepts in a clear and computational way. On this basis, we propose a methodology for building the various company repositories (competences, missions, jobs...). To model these repositories we chose ontologies, because they yield coherent and consensual definitions of the concepts while supporting linguistic diversity. We then map the company's knowledge (trainings, missions, jobs...) onto these ontologies so that it can be exploited and disseminated. This approach to knowledge and competence management led to a tool offering numerous functionalities, such as the management of mobility areas, strategic analysis, directories and CV management.
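A minimal sketch (Python, with all names and values hypothetical) of the competence-centred data model the abstract describes, where a competence ties a set of knowledge items to a context and other organizational concepts, such as jobs, are described through competences:

```python
# Hypothetical sketch of a competence-centred organizational model:
# a competence is "a set of knowledge in action in a given context",
# and jobs (or missions, trainings) are described through competences.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Competence:
    name: str
    knowledge: frozenset[str]   # the knowledge items "in action"
    context: str                # the context that activates them

@dataclass
class Job:
    title: str
    required: set[Competence] = field(default_factory=set)

def covers(person_competences: set[Competence], job: Job) -> bool:
    """Check whether a person's competences cover a job's requirements."""
    return job.required <= person_competences

sql = Competence("SQL tuning", frozenset({"SQL", "query planner"}), "production DB")
dba = Job("Database administrator", {sql})
print(covers({sql}, dba))  # True
```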

Relevance: 10.00%

Abstract:

The availability of rich firm-level data sets has recently led researchers to uncover new evidence on the effects of trade liberalization. First, trade openness forces the least productive firms to exit the market. Secondly, it induces surviving firms to increase their innovation efforts, and thirdly, it increases the degree of product market competition. In this paper we propose a model aimed at providing a coherent interpretation of these findings. We introduce firm heterogeneity into an innovation-driven growth model in which incumbent firms operating in oligopolistic industries perform cost-reducing innovations. In this framework, trade liberalization leads to higher product market competition, lower markups and higher quantities produced. These changes in markups and quantities, in turn, promote innovation and productivity growth through a direct competition effect, based on the increase in the size of the market, and a selection effect, produced by the reallocation of resources towards more productive firms. Calibrated to match US aggregate and firm-level statistics, the model predicts that a 10 percent reduction in variable trade costs reduces markups by 1.15 percent and firm survival probabilities by 1 percent, and induces an increase in productivity growth of about 13 percent. More than 90 percent of the trade-induced growth increase can be attributed to the selection effect.

Relevance: 10.00%

Abstract:

In less than half a century, allergy, originally perceived as a rare disease, has become a major public health threat, today affecting the lives of more than 60 million people in Europe, and probably close to one billion worldwide, thereby heavily impacting the budgets of public health systems. More disturbingly, its prevalence and impact are on the rise, a development that has been associated with environmental and lifestyle changes accompanying the continuous process of urbanization and globalization. Therefore, there is an urgent need to prioritize and concert research efforts in the field of allergy, in order to achieve sustainable results on prevention, diagnosis and treatment of this most prevalent chronic disease of the 21st century.

The European Academy of Allergy and Clinical Immunology (EAACI) is the leading professional organization in the field of allergy, promoting excellence in clinical care, education, training and basic and translational research, all with the ultimate goal of improving the health of allergic patients. The European Federation of Allergy and Airways Diseases Patients' Associations (EFA) is a non-profit network of allergy, asthma and Chronic Obstructive Pulmonary Disease (COPD) patients' organizations. In support of their missions, the present EAACI Position Paper, in collaboration with EFA, highlights the most important research needs in the field of allergy to serve as key recommendations for future research funding at the national and European levels.

Although allergies may involve almost every organ of the body and an array of diverse external factors act as triggers, there are several common themes that need to be prioritized in research efforts. As in many other chronic diseases, effective prevention, curative treatment and accurate, rapid diagnosis represent major unmet needs. Detailed phenotyping/endotyping is widely required in order to arrange or re-categorize clinical syndromes into more coherent, uniform and treatment-responsive groups. Research efforts to unveil the basic pathophysiologic pathways and mechanisms, thus leading to the comprehension and resolution of the pathophysiologic complexity of allergies, will allow for the design of novel patient-oriented diagnostic and treatment protocols. Several allergic diseases require well-controlled epidemiological description and surveillance, using disease registries, pharmacoeconomic evaluation, as well as large biobanks. Additionally, there is a need for extensive studies to bring promising new biotechnological innovations, such as biological agents, vaccines based on modified allergen molecules and engineered components for allergy diagnosis, closer to clinical practice. Finally, particular attention should be paid to the difficult-to-manage, precarious and costly severe disease forms and/or exacerbations. Nonetheless, currently arising treatments, mainly in the fields of immunotherapy and biologicals, hold great promise for targeted and causal management of allergic conditions. The active involvement of all stakeholders, including Patient Organizations and policy makers, is necessary to achieve the aims emphasized herein.

Relevance: 10.00%

Abstract:

In this paper I review a series of theoretical concepts that are relevant for the integrated assessment of agricultural sustainability but that are not generally included in the curricula of the various scientific disciplines dealing with quantitative analysis of agriculture. I first illustrate, with plain narratives and concrete examples, that sustainability is an extremely complex issue requiring the simultaneous consideration of several aspects, which cannot be reduced to a single indicator of performance. I then justify this need for multi-criteria analysis with theoretical concepts dealing with the epistemological predicament of complexity, moving from classic philosophical lessons to recent developments in complex systems theory, in particular Rosen's theory of the modelling relation, which is essential for analyzing the quality of any quantitative representation. The implications of these theoretical concepts are then illustrated with applications of multi-criteria analysis to the sustainability of agriculture. I wrap up by pointing out the crucial difference between "integrated assessment" and "integrated analysis". An integrated analysis is a set of indicators and analytical models generating an analytical output. An integrated assessment is much more than that: it is about finding an effective way to deal with three key issues: (i) legitimacy – how to handle the unavoidable existence of legitimate but contrasting points of view about the different meanings given by social actors to the word "development"; (ii) pertinence – how to handle in a coherent way scientific analyses referring to different scales and dimensions; and (iii) credibility – how to handle the unavoidable existence of uncertainty and genuine ignorance when dealing with the analysis of future scenarios.

Relevance: 10.00%

Abstract:

The objective of this work is to present a multitechnique approach to define the geometry, the kinematics and the failure mechanism of a retrogressive large landslide (upper part of the La Valette landslide, southern French Alps) by combining airborne and terrestrial laser scanning data with ground-based seismic tomography data. The advantage of combining different methods is to constrain the geometrical and failure mechanism models by integrating different sources of information. Owing to the high point density at the ground surface (4.1 points m⁻²), the small laser footprint (0.09 m) and the accurate three-dimensional positioning (0.07 m), airborne laser scanning data are well suited to analyzing morphological structures at the surface. Seismic tomography surveys (P-wave and S-wave velocities) may highlight low-seismic-velocity zones that characterize the presence of dense fracture networks at the subsurface. The surface displacements measured from the terrestrial laser scanning data over a period of 2 years (May 2008 to May 2010) allow one to quantify the landslide activity in the direct vicinity of the identified discontinuities. A substantial subsidence of the crown area, with an average subsidence rate of 3.07 m year⁻¹, is determined. The displacement directions indicate that the retrogression is controlled structurally by the preexisting discontinuities. A conceptual structural model is proposed to explain the failure mechanism and the retrogressive evolution of the main scarp. Uphill, the crown area is affected by planar sliding included in a deeper wedge failure system constrained by two preexisting fractures. Downhill, the landslide body acts as a buttress for the upper part. Consequently, the progression of the landslide body downhill allows the development of dip-slope failures, and coherent blocks start sliding along planar discontinuities. The volume of the failed mass in the crown area is estimated at 500,000 m³ with the sloping local base level method.
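A minimal sketch (Python with NumPy, entirely synthetic grids; not the authors' workflow) of the surface-change computation underlying such subsidence rates, differencing two elevation models of the crown area and dividing by the elapsed time:

```python
# Illustrative sketch (synthetic data): average subsidence rate from two
# digital elevation models (DEMs) acquired two years apart.
import numpy as np

rng = np.random.default_rng(0)
dem_2008 = 1900.0 + rng.normal(0.0, 0.5, size=(100, 100))   # elevations (m)
dem_2010 = dem_2008 - 6.1 + rng.normal(0.0, 0.5, size=(100, 100))

dh = dem_2010 - dem_2008          # elevation change (m); negative = subsidence
years = 2.0                       # May 2008 to May 2010
rate = dh.mean() / years          # mean rate (m/year)
print(f"mean subsidence rate: {-rate:.2f} m/year")  # ~3.05
```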

Relevance: 10.00%

Abstract:

Introduced into the governance system by the recent reform of the telecommunications regulatory framework, BEREC presents itself as a supranational advisory body entrusted with a mission and a fundamental role in the coherent application of the rules in the single telecommunications market. The success of this mission will depend both on the tools at its disposal and on the coordination mechanisms foreseen and its capacity to engage with the other actors, mainly the national regulatory authorities and the European Commission.

Relevance: 10.00%

Abstract:

High-throughput technologies are now used to generate more than one type of data from the same biological samples. To properly integrate such data, we propose using co-modules, which describe coherent patterns across paired data sets, and we conceive several modular methods for their identification. We first test these methods using in silico data, demonstrating that the integrative scheme of our Ping-Pong Algorithm uncovers drug-gene associations more accurately when considering noisy or complex data. Second, we provide an extensive comparative study using the gene-expression and drug-response data from the NCI-60 cell lines. Using information from the DrugBank and Connectivity Map databases, we show that the Ping-Pong Algorithm predicts drug-gene associations significantly better than other methods. Co-modules provide insights into possible mechanisms of action for a wide range of drugs and suggest new targets for therapy.
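A simplified sketch (Python with NumPy, synthetic data; the normalization and threshold details of the published Ping-Pong Algorithm are not reproduced) of the alternating projection between two data sets that share a sample dimension, here a gene-expression matrix and a drug-response matrix over the same cell lines:

```python
# Simplified sketch of a ping-pong-style alternation between two data sets
# sharing a sample dimension: expression E (genes x samples) and drug
# response D (drugs x samples).
import numpy as np

rng = np.random.default_rng(1)
E = rng.normal(size=(500, 60))
D = rng.normal(size=(100, 60))
E[:50, :15] += 2.0               # plant a co-module: 50 genes ...
D[:10, :15] += 2.0               # ... and 10 drugs, over 15 cell lines

def keep_high(v, t=1.0):
    """Indicator of components more than t standard deviations above mean."""
    return (v > v.mean() + t * v.std()).astype(float)

g = np.zeros(500); g[:25] = 1.0  # seed gene set (half of the planted genes)
for _ in range(10):              # alternate: genes -> samples -> drugs -> ...
    s = keep_high(E.T @ g)       # score cell lines by the gene set
    d = keep_high(D @ s)         # score drugs by those cell lines
    s = keep_high(D.T @ d)       # re-score cell lines from the drug side
    g = keep_high(E @ s)         # update the gene set

# Typically recovers the planted block (~50 genes, ~10 drugs).
print(int(g.sum()), "genes,", int(d.sum()), "drugs in the co-module")
```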

Relevance: 10.00%

Abstract:

Research project carried out during a stay at the London School of Economics and Political Science, United Kingdom, between 2007 and 2009. The main purpose of the project was to analyse the legal-political and institutional implications of a theory of liberal justice and equality applied to multicultural societies marked by a strong predominance of cultural diversity. The analysis develops an interdisciplinary line of research (between law and political theory) begun in a doctoral thesis on multiculturalism and the rights of cultural minorities (UPF, 2000) that culminated in the publication of Group Rights as Human Rights (Springer, 2006). The research takes as its starting point the conclusions of that work, in particular the relevance of recognizing collective rights; however, the type of questions raised, and the focus and methodology employed, are substantially different. Specifically, it addresses questions about the model and aspirations of democratic constitutionalism and the role of law in multicultural contexts. Central weight is also given to the institutional dimension of the diversity-management models analysed, prioritizing a comparative approach based on the study of concrete controversies. The aim is to overcome some important limitations of the current literature, such as the tendency to examine in the abstract the compatibility of certain demands with democratic constitutionalism, without addressing how strategies for managing cultural diversity operate in concrete contexts. The works produced by this project articulate the basic lines of a pluralist model, based on principles rather than rules, that challenges the currently dominant approaches. This model is characterized by a commitment to comparative legitimacy and equality, rejecting paternalism and typical liberal views on the role of regulation. The presumption of the moral standing of identity groups is fundamental in order to consider them valid interlocutors with genuine interests. It is also argued that social integration in multicultural contexts depends not so much on the elimination of conflict as, above all, on efficient management that avoids systematic abuses of power. The model defends the role of law in the institutionalization of intercultural dialogue, but admits that dialogue does not necessarily lead to agreement or to a coherent, uniform regulatory structure. The aspirations of a pluralist legal order are more modest: to favour negotiation and resolution in each conflict, despite the persistence of fragmentation and the provisional character of agreements. The lack of a common regulatory framework becomes a virtue insofar as it allows the interaction of different sub-orders, an interaction governed by a multiplicity of rules that are not necessarily harmonious. The advantages and problems of this model are examined through an analysis of the fragmentary structure of the international legal order and of the European human rights regime.

Relevance: 10.00%

Abstract:

The article is composed of two sections. The first is a critical review of the three main alternative indices to GDP proposed in recent decades – the Human Development Index (HDI), the Genuine Progress Indicator (GPI) and the Happy Planet Index (HPI) – conducted on the basis of their conceptual foundations, rather than looking at issues of statistical consistency or mathematical refinement as most of the literature does. The pars construens aims to propose an alternative measure, the composite wealth index, consistent with an approach to development based on the notion of composite wealth, which is in turn derived from an empirical common-sense criterion. Arguably, this approach lends itself to being conveyed in an easily understandable and coherent indicator, and is thus appropriate for tracking development in its various dimensions: simple in its formulation, the wealth approach can incorporate social and ecological goals without significant alterations to its conceptual foundations, while reducing arbitrary weighting to a minimum.

Relevance: 10.00%

Abstract:

Phenoxyalkanoic acid degradation is well studied in Beta- and Gammaproteobacteria, but its genetic background had not so far been elucidated in Alphaproteobacteria. We report the isolation of several genes involved in dichlorprop and mecoprop degradation from the alphaproteobacterium Sphingomonas herbicidovorans MH and propose that degradation proceeds analogously to that previously reported for 2,4-dichlorophenoxyacetic acid (2,4-D). Two genes for alpha-ketoglutarate-dependent dioxygenases, sdpA(MH) and rdpA(MH), were found, both adjacent to sequences with potential insertion elements. Furthermore, a gene for a dichlorophenol hydroxylase (tfdB), a putative regulatory gene (cadR), two genes for dichlorocatechol 1,2-dioxygenases (dccA(I/II)), two for dienelactone hydrolases (dccD(I/II)), part of a gene for maleylacetate reductase (dccE) and one gene for a potential phenoxyalkanoic acid permease were isolated. In contrast to other 2,4-D degraders, the sdp, rdp and dcc genes were scattered over the genome and their expression was not tightly regulated. No coherent pattern emerged regarding the possible origin of the sdp, rdp and dcc pathway genes. rdpA(MH) was 99% identical to rdpA(MC1), an (R)-dichlorprop/alpha-ketoglutarate dioxygenase from Delftia acidovorans MC1, which is evidence for a recent gene exchange between Alpha- and Betaproteobacteria. Conversely, DccA(I) and DccA(II) did not group within the known chlorocatechol 1,2-dioxygenases but formed a separate branch in clustering analysis. This suggests a different reservoir and reduced transfer for the genes of the modified ortho-cleavage pathway in Alphaproteobacteria compared with those in Beta- and Gammaproteobacteria.

Relevance: 10.00%

Abstract:

The application of two approaches for high-throughput, high-resolution X-ray phase contrast tomographic imaging in use at the TOMCAT (tomographic microscopy and coherent radiology experiments) beamline of the SLS is discussed and illustrated. Differential phase contrast (DPC) imaging, using a grating interferometer and a phase-stepping technique, is integrated into the beamline environment at TOMCAT in terms of fast acquisition and reconstruction of data and the ability to scan samples within an aqueous environment. The second phase contrast method is a modified transfer-of-intensity approach that can yield the 3D distribution of the decrement of the refractive index of a weakly absorbing object from a single tomographic dataset. The two methods are complementary: the DPC method is characterised by higher sensitivity and moderate resolution with larger samples; the modified transfer-of-intensity approach is particularly suited to small specimens when high resolution (around 1 μm) is required. Both are being applied to investigations in the biological and materials science fields.
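A minimal sketch (Python with NumPy, synthetic data; a simplified version of standard phase-stepping analysis, not the TOMCAT pipeline) of how a per-pixel DPC signal can be retrieved: the intensity recorded over one period of grating steps is a sinusoid, and the phase of its first Fourier harmonic, sample minus reference, gives the differential phase signal:

```python
# Illustrative sketch (synthetic data): per-pixel phase retrieval from a
# grating-interferometer phase-stepping scan. The DPC signal is the phase
# of the first Fourier harmonic, sample minus reference.
import numpy as np

steps = 8                                    # phase steps over one period
k = np.arange(steps)
true_phase = 0.6                             # rad, phase shift from the sample

def stepping_curve(phi, a0=1.0, a1=0.3):
    """Sinusoidal intensity curve recorded while stepping the grating."""
    return a0 + a1 * np.cos(2 * np.pi * k / steps + phi)

def first_harmonic_phase(intensities):
    """Phase of the first Fourier component of the stepping curve."""
    return np.angle(np.fft.fft(intensities)[1])

ref = stepping_curve(0.0)                    # reference scan (no sample)
sam = stepping_curve(true_phase)             # scan with the sample in place
dpc = first_harmonic_phase(sam) - first_harmonic_phase(ref)
print(f"retrieved DPC signal: {dpc:.3f} rad")  # ~0.600
```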

Relevance: 10.00%

Abstract:

SUMMARY: Large sets of data, such as expression profiles from many samples, require analytic tools to reduce their complexity. The Iterative Signature Algorithm (ISA) is a biclustering algorithm designed to decompose a large set of data into so-called 'modules'. In the context of gene expression data, these modules consist of subsets of genes that exhibit a coherent expression profile only over a subset of microarray experiments. Genes and arrays may be attributed to multiple modules, and the level of required coherence can be varied, resulting in different 'resolutions' of the modular mapping. In this short note, we introduce two Bioconductor software packages written in GNU R: the isa2 package includes an optimized implementation of the ISA, and the eisa package provides a convenient interface to run the ISA, visualize its output and put the biclusters into biological context. Potential users of these packages are all R and Bioconductor users dealing with tabular (e.g. gene expression) data.
AVAILABILITY: http://www.unil.ch/cbg/ISA
CONTACT: sven.bergmann@unil.ch
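A simplified sketch of the iterative signature idea (Python with NumPy on synthetic data; the real isa2 package adds proper normalization, signed scores and multiple random seeds), alternating between scoring arrays over the current gene set and re-selecting genes over the selected arrays until an approximate fixed point:

```python
# Simplified sketch of the Iterative Signature Algorithm idea on synthetic
# data. A bicluster of 40 genes x 10 arrays is planted and then recovered.
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(300, 50))          # genes x arrays
X[:40, :10] += 2.0                      # planted module

def keep_high(v, t=1.5):
    """Indicator of components more than t standard deviations above mean."""
    return (v > v.mean() + t * v.std()).astype(float)

g = np.zeros(300); g[:20] = 1.0         # seed: part of the planted genes
for _ in range(15):                     # iterate to an (approximate) fixed point
    c = keep_high(X.T @ g)              # array scores over the gene set
    g = keep_high(X @ c)                # gene scores over the selected arrays

# Typically converges onto the planted module (~40 genes x ~10 arrays).
print(int(g.sum()), "genes x", int(c.sum()), "arrays in the module")
```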