977 results for EXTENDED UNCERTAINTY RELATIONS


Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

We derive a new non-singular tree-level KLT relation for the n = 5-point amplitudes, with manifest 2(n-2)! symmetry, using information from one-loop amplitudes and IR divergences, and speculate how one might extend it to higher n-point functions. We show that the subleading-color N = 4 SYM 5-point amplitude has leading IR divergence of 1/epsilon, which is essential for the applications of this paper. We also propose a relation between the subleading-color N = 4 SYM and N = 8 supergravity 1-loop 5-point amplitudes, valid for the IR divergences and possibly for the whole amplitudes, using techniques similar to those used in our derivation of the new KLT relation.
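For orientation, the classic tree-level KLT relations that such results reorganise have the following well-known form at four and five points. Overall signs and factors of i vary between conventions, so this is a schematic reminder rather than the non-singular relation derived in the paper:

```latex
% Tree-level KLT relations at 4 and 5 points (schematic; conventions vary).
% M_n: gravity amplitudes; A_n, \tilde A_n: color-ordered gauge-theory
% amplitudes; s_{ij} = (k_i + k_j)^2.
\begin{align}
  M_4(1,2,3,4)   &= -i\, s_{12}\, A_4(1,2,3,4)\, \tilde A_4(1,2,4,3), \\
  M_5(1,2,3,4,5) &= i\, s_{12} s_{34}\, A_5(1,2,3,4,5)\, \tilde A_5(2,1,4,3,5) \nonumber \\
                 &\quad + i\, s_{13} s_{24}\, A_5(1,3,2,4,5)\, \tilde A_5(3,1,4,2,5).
\end{align}
```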

Relevance:

30.00%

Publisher:

Abstract:

We extend and provide a vector-valued version of some results of C. Samuel about the geometric relations between the spaces of nuclear operators N(E, F) and spaces of compact operators K(E, F), where E and F are Banach spaces C(K) of all continuous functions defined on countable compact metric spaces K, equipped with the supremum norm. First we continue Samuel's work by proving that N(C(K_1), C(K_2)) contains no subspace isomorphic to K(C(K_3), C(K_4)) whenever K_1, K_2, K_3 and K_4 are arbitrary infinite countable compact metric spaces. Then we show that it is relatively consistent with ZFC that the above result and the main results of Samuel can be extended to C(K_1, X), C(K_2, Y), C(K_3, X) and C(K_4, Y) spaces, where K_1, K_2, K_3 and K_4 are arbitrary infinite totally ordered compact spaces; X comprises certain Banach spaces such that X* are isomorphic to subspaces of l_1; and Y comprises arbitrary subspaces of l_p, with 1 < p < infinity. Our results cover the cases of some non-classical Banach spaces X constructed by Alspach, by Alspach and Benyamini, by Benyamini and Lindenstrauss, by Bourgain and Delbaen, and also by Argyros and Haydon.

Relevance:

30.00%

Publisher:

Abstract:

To check the effectiveness of campaigns preventing drug abuse, or to detect local effects of efforts against drug trafficking, it is beneficial to know the amounts of substances consumed at high spatial and temporal resolution. The analysis of drugs of abuse in wastewater (WW) has the potential to provide this information. In this study, the reliability of WW drug consumption estimates is assessed and a novel method is presented to calculate the total uncertainty in observed WW cocaine (COC) and benzoylecgonine (BE) loads. Specifically, uncertainties resulting from discharge measurements, chemical analysis and the applied sampling scheme were addressed, and three approaches are presented. These consist of (i) a generic model-based procedure to investigate the influence of the sampling scheme on the uncertainty of observed or expected drug loads, (ii) a comparative analysis of two analytical methods (high-performance liquid chromatography-tandem mass spectrometry and gas chromatography-mass spectrometry), including an extended cross-validation by influent profiling over several days, and (iii) monitoring of COC and BE concentrations in the WW of the largest Swiss sewage treatment plants. In addition, the COC and BE loads observed in the sewage treatment plant of the city of Berne were used to back-calculate the COC consumption. The estimated mean daily consumed amount was 107 ± 21 g of pure COC, corresponding to 321 g of street-grade COC.
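As a rough illustration of the back-calculation and uncertainty propagation described above, the following minimal Monte Carlo sketch combines flow, analytical and sampling uncertainties into a consumption estimate. It is not the study's model: the concentration, flow, uncertainty magnitudes and excretion fraction are assumptions chosen only to land in a plausible range.

```python
# Minimal sketch (not the study's actual model): Monte Carlo propagation of
# flow, analytical and sampling uncertainty into a back-calculated COC load.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Illustrative inputs with assumed relative standard uncertainties
conc_be_ng_l = rng.normal(200.0, 200.0 * 0.10, N)          # BE concentration, ng/L (10% analytical RSD, assumed)
flow_m3_d    = rng.normal(150_000.0, 150_000.0 * 0.05, N)  # daily discharge, m3/d (5% uncertainty, assumed)
sampling_err = rng.normal(1.0, 0.08, N)                     # multiplicative sampling-scheme error (8%, assumed)

# Observed BE load: ng/L * m3/d = ug/d, divided by 1e6 to get g/d
be_load_g_d = conc_be_ng_l * flow_m3_d * sampling_err / 1e6

# Back-calculation to pure cocaine consumed (both factors are assumptions here)
excreted_as_be = 0.29          # assumed fraction of a COC dose excreted as BE
mw_ratio = 303.4 / 289.3       # molar mass ratio COC/BE
coc_consumed_g_d = be_load_g_d / excreted_as_be * mw_ratio

print(f"pure COC consumed: mean = {coc_consumed_g_d.mean():.0f} g/day, "
      f"std = {coc_consumed_g_d.std():.0f} g/day")
```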

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES Resistance to extended-spectrum cephalosporins (ESCs) in Escherichia coli can be due to the production of ESBLs, plasmid-mediated AmpCs (pAmpCs) or chromosomal AmpCs (cAmpCs). Information regarding the type and prevalence of β-lactamases, clonal relations and plasmids associated with the bla genes of ESC-resistant E. coli (ESC-R-Ec) detected in Switzerland is lacking. Moreover, data focusing on patients referred to the specialized outpatient clinics (SOCs) are needed. METHODS We analysed 611 unique E. coli isolated during September-December 2011. ESC-R-Ec were studied with microarrays and PCR/DNA sequencing for blaESBLs, blapAmpCs, the promoter region of blacAmpC, IS elements and plasmid incompatibility groups, and also by transformation, aIEF, rep-PCR and MLST. RESULTS The highest resistance rates were observed in the SOCs, whereas those in the hospital and the community were lower (e.g. quinolone resistance of 22.6%, 17.2% and 9.0%, respectively; P = 0.003 for SOCs versus community). The prevalence of ESC-R-Ec in the three settings was 5.3% (n = 11), 7.8% (n = 22) and 5.7% (n = 7), respectively. Thirty isolates produced CTX-M ESBLs (14 were CTX-M-15), 5 produced CMY-2 pAmpC and 5 hyper-expressed cAmpCs due to promoter mutations. Fourteen isolates were of sequence type 131 (ST131; 10 with CTX-M-15). blaCTX-M and blaCMY-2 were associated with an intact or truncated ISEcp1 and were mainly carried by IncF, IncFII and IncI1 plasmids. CONCLUSIONS ST131 producing CTX-M-15 is the predominant clone. The prevalence of ESC-R-Ec (overall 6.5%) is low, but an unusually high frequency of AmpC producers (25%) was noted. The presence of ESC-R-Ec in the SOCs and their potential to be exchanged between hospital and community should be taken into serious consideration.
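A quick consistency check of the reported prevalences can be done as follows. The per-setting denominators are not given in the abstract, so they are inferred from the counts and percentages and should be read as approximations:

```python
# Back-of-the-envelope check of the reported ESC-R E. coli prevalences.
# Denominators per setting are inferred (not stated in the abstract).
cases = {"SOCs": (11, 0.053), "hospital": (22, 0.078), "community": (7, 0.057)}

total_cases = 0
for setting, (n, prevalence) in cases.items():
    denominator = n / prevalence                 # inferred number of isolates in that setting
    total_cases += n
    print(f"{setting}: {n} cases / ~{denominator:.0f} isolates = {prevalence:.1%}")

# Overall prevalence over the 611 unique isolates analysed
print(f"overall: {total_cases} / 611 = {total_cases / 611:.1%}")   # ~6.5%, matching the abstract
```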

Relevance:

30.00%

Publisher:

Abstract:

The Qing emperors, who ruled over China from 1644 to 1911, managed to bring large parts of Inner Asia under their control and extended the territory of China to an unprecedented degree. This paper maintains that the political technique of patronage, with its formalized language, its emphasis on gift exchange and its expressions of courtesy, is a useful concept for explaining the integration of Inner Asian confederations into the empire. By re-interpreting the obligations of gift exchange, the Qing transformed the network of personal relationships, which had to be reinforced and consolidated permanently, into a system with clearly defined rules. In this process of formalization, the Lifanyuan, the Court for the Administration of the Outer Regions, played a key role. While in the early years of the dynasty it was responsible for collecting and disseminating information concerning the various patronage relationships with Inner Asian leaders, over the course of the 17th and 18th centuries its efforts were directed at standardizing and streamlining the contacts between ethnic minorities and the state. Through the Lifanyuan, the rules and principles of patronage were maintained in a modified form even in the later part of the dynasty, when the Qing exercised control over the outer regions more directly. The paper thus provides an explanation for the longevity and cohesiveness of the multi-ethnic Qing empire. Based on recently published Manchu- and Mongolian-language archival material and the Maussian concept of gift exchange, the study sheds new light on the changing self-conception of the Qing emperors.

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION Extended-spectrum beta-lactamases (ESBL) and AmpC beta-lactamases (AmpC) are of concern for veterinary and public health because of their ability to cause treatment failure due to antimicrobial resistance in Enterobacteriaceae. The main objective was to assess the relative contribution (RC) of different types of meat to the exposure of consumers to ESBL/AmpC and their potential importance for human infections in Denmark. MATERIAL AND METHODS The prevalence of each genotype of ESBL/AmpC-producing E. coli in imported and nationally produced broiler meat, pork and beef was weighted by the meat consumption patterns. Data originated from the Danish surveillance program for antibiotic use and antibiotic resistance (DANMAP) from 2009 to 2011. DANMAP also provided data about human ESBL/AmpC cases in 2011, which were used to assess a possible genotype overlap. Uncertainty about the occurrence of ESBL/AmpC-producing E. coli in meat was assessed by inspecting beta distributions given the available data of the genotypes in each type of meat. RESULTS AND DISCUSSION Broiler meat represented the largest part (83.8%) of the estimated ESBL/AmpC-contaminated pool of meat compared to pork (12.5%) and beef (3.7%). CMY-2 was the genotype with the highest RC to human exposure (58.3%). However, this genotype is rarely found in human infections in Denmark. CONCLUSION The overlap between ESBL/AmpC genotypes in meat and human E. coli infections was limited. This suggests that meat might constitute a less important source of ESBL/AmpC exposure to humans in Denmark than previously thought - maybe because the use of cephalosporins is restricted in cattle and banned in poultry and pigs. Nonetheless, more detailed surveillance data are required to determine the contribution of meat compared to other sources, such as travelling, pets, water resources, community and hospitals in the pursuit of a full source attribution model.
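A minimal sketch of the weighting scheme described above, using invented sample counts and consumption shares rather than the DANMAP data, might look like this: Beta distributions capture the uncertainty in the prevalence of ESBL/AmpC-producing E. coli per meat type, and the prevalences are weighted by consumption to obtain relative contributions with credible intervals.

```python
# Illustrative sketch only: all counts and consumption shares below are
# invented placeholders, not the DANMAP surveillance figures.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
N = 50_000

counts = {"broiler": (36, 100), "pork": (8, 150), "beef": (3, 200)}  # (positives, samples tested)
consumption = {"broiler": 0.25, "pork": 0.45, "beef": 0.30}          # share of meat consumed

weighted = {}
for meat, (k, n) in counts.items():
    # Beta(k+1, n-k+1): prevalence uncertainty with a uniform prior
    prevalence = stats.beta.rvs(k + 1, n - k + 1, size=N, random_state=rng)
    weighted[meat] = prevalence * consumption[meat]

total = sum(weighted.values())
for meat, contrib in weighted.items():
    rc = contrib / total                                             # relative contribution samples
    print(f"{meat}: RC = {rc.mean():.1%} "
          f"(95% interval {np.percentile(rc, 2.5):.1%}-{np.percentile(rc, 97.5):.1%})")
```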

Relevance:

30.00%

Publisher:

Abstract:

For an adequate assessment of the safety margins of nuclear facilities, e.g. nuclear power plants, it is necessary to consider all sources of uncertainty that affect their design, performance and accident response. Nuclear data are one such source of uncertainty, entering neutronics, fuel depletion and activation calculations. These calculations predict response functions that are critical during operation and in the event of an accident, such as the decay heat and the neutron multiplication factor, so the impact of nuclear data uncertainties on these response functions must be evaluated for a proper assessment of the safety margins. Methodologies for uncertainty propagation need to be implemented in order to analyse that impact, and the current status of nuclear data and their uncertainties must be understood in order to handle this type of data. Great efforts are underway to enhance the European capability to analyse, process and produce covariance data, especially for isotopes that are important for advanced reactors, and new methodologies and codes are being developed and implemented to use these data and evaluate their impact. These were the objectives of the European ANDES (Accurate Nuclear Data for nuclear Energy Sustainability) project, which provided the framework for this PhD thesis. Accordingly, a review of the state of the art of nuclear data and their uncertainties is first conducted, focusing on three kinds of data: decay data, fission yields and cross sections. A review of the current methodologies for propagating nuclear data uncertainties is also performed.
The Nuclear Engineering Department of UPM has proposed a methodology for propagating uncertainties in depletion calculations, the Hybrid Method, which has been taken as the starting point of this thesis. The methodology has been implemented, developed and extended, and its advantages, drawbacks and limitations have been analysed. It is used in conjunction with the ACAB depletion code and is based on Monte Carlo sampling of the nuclear data with uncertainties. Different approaches are presented depending on the cross-section energy-group structure, together with their applicability criteria: one group, one group with correlated sampling, and multi-group. Sequences have been developed for using nuclear data libraries in different storage formats: ENDF-6 (for evaluated libraries), COVERX (for the multi-group libraries of SCALE) and EAF (for activation libraries). The review of the state of the art of fission yield data reveals a lack of complete covariance matrices for their uncertainties. Given the renewed interest of the international community, expressed through Subgroup 37 (SG37) of the Working Party on International Nuclear Data Evaluation Co-operation (WPEC), which assesses needs for improved nuclear data, methodologies for generating covariance data for fission yields are reviewed, and a Bayesian/generalised least squares (GLS) updating procedure is selected and implemented to answer this need.
Once the Hybrid Method has been implemented, developed and extended, along with the capability to generate complete fission yield covariance matrices, several applications are studied. The fission pulse decay heat problem is tackled first, because of its importance for any event after reactor shutdown and because it is a clean exercise for showing the impact of decay data and fission yield uncertainties together with the new covariance matrices. Two advanced-reactor fuel cycles are then studied, those of the European Facility for Industrial Transmutation (EFIT) and the European Sodium Fast Reactor (ESFR), for which the impact of nuclear data uncertainties on isotopic composition, decay heat and radiotoxicity is assessed. Different nuclear data libraries are used and their uncertainties compared. These applications also serve as a framework for comparing the different approaches of the Hybrid Method with each other and with other methodologies: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer. These comparisons reveal the advantages, limitations and range of application of the Hybrid Method.
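As a rough sketch of the Monte Carlo idea at the heart of such a propagation scheme (ACAB, the real group structures and the actual covariance libraries are not reproduced here; all numbers below are illustrative assumptions), one can sample one-group nuclear data from an assumed covariance matrix, run a toy depletion problem per sample and collect statistics on a response function:

```python
# Toy Monte Carlo propagation of nuclear data uncertainty through a
# two-nuclide depletion problem (illustrative only; not the Hybrid Method code).
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(42)
N_SAMPLES = 1_000

# Nominal one-group reaction rate and decay constant (1/s) -- toy values
sigma_phi_capture = 1.0e-9   # capture rate turning nuclide A into nuclide B
lambda_decay      = 2.0e-7   # decay constant of nuclide B

# Assumed relative covariance: 10% on each parameter, 30% correlation
rel_cov = np.array([[0.10**2,            0.3 * 0.10 * 0.10],
                    [0.3 * 0.10 * 0.10,  0.10**2]])
mean = np.array([sigma_phi_capture, lambda_decay])
cov = rel_cov * np.outer(mean, mean)

t = 3.0e7                    # irradiation time (s)
n0 = np.array([1.0, 0.0])    # initial inventory of [A, B]

responses = []
for cap, lam in rng.multivariate_normal(mean, cov, size=N_SAMPLES):
    burnup = np.array([[-cap, 0.0],
                       [cap, -lam]])          # A -> B -> out
    nt = expm(burnup * t) @ n0                # deplete for time t
    responses.append(lam * nt[1])             # toy response: activity of B

responses = np.array(responses)
print(f"response: mean = {responses.mean():.3e}, "
      f"relative std = {responses.std() / responses.mean():.1%}")
```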

Relevance:

30.00%

Publisher:

Abstract:

At head of title: 93d Congress, 1st session, Senate, Executive report ; no. 93-16.

Relevance:

30.00%

Publisher:

Abstract:

Two experiments tested the prediction that uncertainty reduction and self-enhancement motivations have an interactive effect on ingroup identification. In Experiment 1 (N = 64), uncertainty and group status were manipulated, and the effect on ingroup identification was measured. As predicted, low-uncertainty participants identified more strongly with a high- than low-status group, whereas high-uncertainty participants showed no preference; and low-status group members identified more strongly under high than low uncertainty, whereas high-status group members showed no preference. Experiment 2 (N = 210) replicated Experiment 1, but with a third independent variable that manipulated how prototypical participants were of their group. As predicted, the effects obtained in Experiment 1 only emerged where participants were highly prototypical. Low prototypicality depressed identification with a low-status group under high uncertainty. The implications of these results for intergroup relations and the role of prototypicality in social identity processes are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Pattern discovery in temporal event sequences is of great importance in many application domains, such as telecommunication network fault analysis. In reality, not every type of event has an accurate timestamp. Some of them, defined as inaccurate events, may only have an interval as their possible time of occurrence. The existence of inaccurate events causes uncertainty in event ordering. The traditional support model cannot deal with this uncertainty, which would cause some interesting patterns to be missed. A new concept, precise support, is introduced to evaluate the probability that a pattern is contained in a sequence. Based on this new metric, we define an uncertainty model and present an algorithm to discover interesting patterns in a sequence database that has one type of inaccurate event. In our model, the number of types of inaccurate events can readily be extended to k, at the cost of increased computational complexity.
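A minimal sketch of the precise-support idea, assuming a uniform distribution over the inaccurate event's interval and a simple two-event pattern (the paper's actual model and algorithm are more general), is shown below: instead of counting a sequence as supporting the pattern or not, each sequence contributes the probability that the ordering holds.

```python
# Toy illustration of "precise support" for the pattern <A before B>, where
# A is an inaccurate event known only to lie inside an interval.
def prob_a_before_b(a_interval, b_time):
    """P(A occurs before b_time), with A uniform on a_interval (assumed)."""
    lo, hi = a_interval
    if b_time <= lo:
        return 0.0
    if b_time >= hi:
        return 1.0
    return (b_time - lo) / (hi - lo)

# Each sequence: interval of the inaccurate event A, exact timestamp of B
sequences = [
    {"A": (0.0, 10.0), "B": 4.0},   # P = 0.4
    {"A": (2.0, 3.0),  "B": 8.0},   # P = 1.0
    {"A": (5.0, 9.0),  "B": 5.0},   # P = 0.0
]

precise_support = sum(prob_a_before_b(s["A"], s["B"]) for s in sequences)
print(f"precise support of <A, B> = {precise_support:.2f} over {len(sequences)} sequences")
# A classical 0/1 support count would only credit sequences where the ordering is certain.
```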

Relevance:

30.00%

Publisher:

Abstract:

This thesis provides an interoperable language for quantifying uncertainty using probability theory. A general introduction to interoperability and uncertainty is given, with particular emphasis on the geospatial domain. Existing interoperable standards used within the geospatial sciences are reviewed, including Geography Markup Language (GML), Observations and Measurements (O&M) and the Web Processing Service (WPS) specifications. The importance of uncertainty in geospatial data is identified and probability theory is examined as a mechanism for quantifying these uncertainties. The Uncertainty Markup Language (UncertML) is presented as a solution to the lack of an interoperable standard for quantifying uncertainty. UncertML is capable of describing uncertainty using statistics, probability distributions or a series of realisations. The capabilities of UncertML are demonstrated through a series of XML examples. This thesis then provides a series of example use cases where UncertML is integrated with existing standards in a variety of applications. The Sensor Observation Service - a service for querying and retrieving sensor-observed data - is extended to provide a standardised method for quantifying the inherent uncertainties in sensor observations. The INTAMAP project demonstrates how UncertML can be used to aid uncertainty propagation using a WPS by allowing UncertML as input and output data. The flexibility of UncertML is demonstrated with an extension to the GML geometry schemas to allow positional uncertainty to be quantified. Further applications and developments of UncertML are discussed.
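As a sketch of what marking up a quantified uncertainty can look like, the snippet below builds an UncertML-style description of a normally distributed quantity. The namespace and element names used here are assumptions for illustration and should be checked against the actual UncertML schema:

```python
# Illustrative only: an UncertML-style encoding of a normal distribution.
# Namespace and element names are assumed for the sketch, not verified schema.
import xml.etree.ElementTree as ET

NS = "http://www.uncertml.org/2.0"   # assumed namespace
ET.register_namespace("un", NS)

dist = ET.Element(f"{{{NS}}}NormalDistribution")
ET.SubElement(dist, f"{{{NS}}}mean").text = "12.4"
ET.SubElement(dist, f"{{{NS}}}variance").text = "0.36"

# Prints a single-line <un:NormalDistribution> element with mean and variance children
print(ET.tostring(dist, encoding="unicode"))
```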

Relevance:

30.00%

Publisher:

Abstract:

This work introduces a novel inversion-based neurocontroller for solving control problems involving uncertain nonlinear systems, including plants whose inverse mapping is multi-valued. The approach uses recent developments in neural networks, especially in the context of modelling statistical distributions, which are applied to forward and inverse plant models. Provided that certain conditions are met, an estimate of the intrinsic uncertainty in the outputs of a neural network can be obtained from the statistical properties of the network; more generally, multi-component distributions can be modelled by a mixture density network. A novel robust inverse control approach is obtained by importance sampling from these distributions, which provides a structured and principled way to constrain the complexity of the search space for the ideal control law. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the control solutions worth considering. Convergence of the output error for the proposed control method is verified using a Lyapunov function. Several simulation examples demonstrate the efficiency of the developed control method, and the extension of the method to nonlinear multi-variable systems with different delays between the input-output pairs is also considered and demonstrated through simulation examples.
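A minimal sketch of the control-selection step described above, with a hand-written stand-in for the trained mixture density network and a toy plant whose inverse is multi-valued (none of this is the paper's actual implementation), might look like this:

```python
# Toy illustration: sample candidate controls from a mixture-density inverse
# model and pick the one whose forward-model prediction best matches the target.
import numpy as np

rng = np.random.default_rng(7)

def forward_model(u):
    """Assumed plant model: f(u) = u^2 + noise, so the inverse is multi-valued."""
    return u ** 2 + 0.05 * rng.standard_normal(np.shape(u))

def inverse_mdn(y_target):
    """Stand-in for a trained mixture density network: returns mixture weights,
    means and standard deviations of candidate controls for y_target."""
    root = np.sqrt(max(y_target, 0.0))
    return np.array([0.5, 0.5]), np.array([root, -root]), np.array([0.1, 0.1])

def select_control(y_target, n_samples=200):
    w, mu, sigma = inverse_mdn(y_target)
    comp = rng.choice(len(w), size=n_samples, p=w)        # pick mixture components
    candidates = rng.normal(mu[comp], sigma[comp])        # sample candidate controls
    errors = (forward_model(candidates) - y_target) ** 2  # score via the forward model
    return candidates[np.argmin(errors)]

u = select_control(y_target=4.0)
print(f"selected control u = {u:+.3f}, noiseless plant output = {u**2:.3f}")
```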