918 results for CPT - framework
Abstract:
This paper presents possible selective current compensation strategies based on the Conservative Power Theory (CPT). This recently proposed theory introduces the concept of complex power conservation under non-sinusoidal conditions. Moreover, the related current decomposition results in several current terms, each associated with a specific physical phenomenon (power absorption P, energy storage Q, voltage and current distortion D). These current components are used in this work to define different current compensators, which can be selective in minimizing particular disturbing effects. The choice of one or another current component for compensation directly affects the sizing and cost of active and/or passive devices, and it will be demonstrated that this choice can be made to meet predefined limits for harmonic distortion, unbalance and/or power factor. Single- and three-phase compensation strategies will be discussed by means of the CPT framework, and simulation and experimental results will be presented in order to validate their performance. © 2009 IEEE.
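As a rough illustration of the kind of period-based decomposition the CPT performs, the sketch below splits a sampled current into an active term collinear with the voltage (carrying P), a reactive term collinear with the unbiased time integral of the voltage (carrying the reactive energy), and a residual "void" term collecting the distortion. The function and signal names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def cpt_decompose(v, i, dt):
    """Minimal sketch of a CPT-style current decomposition (illustrative only).

    v, i : sampled voltage and current over an integer number of line cycles
    dt   : sampling period [s]
    Returns the active, reactive and residual ("void") current components.
    """
    # average power P = <v, i> over the observation window
    P = np.mean(v * i)
    # unbiased time integral of the voltage, used by the CPT to define reactivity
    v_int = np.cumsum(v) * dt
    v_hat = v_int - np.mean(v_int)
    # reactive energy W = <v_hat, i>
    W = np.mean(v_hat * i)
    # active current: collinear with v, carries all of P
    i_a = P / np.mean(v * v) * v
    # reactive current: collinear with v_hat, carries all of W
    i_r = W / np.mean(v_hat * v_hat) * v_hat
    # void current: whatever is left (distortion / residual terms)
    i_v = i - i_a - i_r
    return i_a, i_r, i_v

# Example: a distorted, lagging load current against a sinusoidal voltage
fs, f = 12_000, 60.0
t = np.arange(0, 10 / f, 1 / fs)                      # ten line cycles
v = 311 * np.sin(2 * np.pi * f * t)
i = 10 * np.sin(2 * np.pi * f * t - 0.5) + 2 * np.sin(3 * 2 * np.pi * f * t)
i_a, i_r, i_v = cpt_decompose(v, i, 1 / fs)
```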
Abstract:
We searched for a sidereal modulation in the MINOS far detector neutrino rate. Such a signal would be a consequence of Lorentz and CPT violation as described by the standard-model extension framework. It would also be the first detection of a perturbative effect on conventional neutrino mass oscillations. We found no evidence for this sidereal signature, and the upper limits placed on the magnitudes of the Lorentz- and CPT-violating coefficients describing the theory are an improvement by factors of 20-510 over the current best limits found by using the MINOS near detector.
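A minimal sketch of how such a sidereal-modulation search can be set up in practice: fold the event times onto the sidereal day and fit the binned rate with first and second sidereal harmonics. The binning, the least-squares fit, and all names below are simplifying assumptions for illustration, not the experiment's actual analysis chain.

```python
import numpy as np

# Length of the sidereal day in seconds (well-known constant)
SIDEREAL_DAY = 86164.0905

def sidereal_phase(t_seconds):
    """Map event times (seconds) onto a phase in [0, 1) of the sidereal day.
    The absolute phase offset is irrelevant for detecting a modulation."""
    return (np.asarray(t_seconds) % SIDEREAL_DAY) / SIDEREAL_DAY

def harmonic_fit(phases, n_bins=24):
    """Bin the event phases and least-squares fit a constant plus first and
    second sidereal harmonics; the harmonic amplitudes are the signal proxies."""
    counts, edges = np.histogram(phases, bins=n_bins, range=(0.0, 1.0))
    centers = 0.5 * (edges[:-1] + edges[1:])
    w = 2 * np.pi * centers
    design = np.column_stack([np.ones_like(w), np.sin(w), np.cos(w),
                              np.sin(2 * w), np.cos(2 * w)])
    coeffs, *_ = np.linalg.lstsq(design, counts, rcond=None)
    return coeffs  # [mean counts per bin, A_s, A_c, B_s, B_c]

# Toy usage with uniformly distributed (i.e. signal-free) event times
rng = np.random.default_rng(0)
fake_times = rng.uniform(0, 30 * 86400, size=5000)
print(harmonic_fit(sidereal_phase(fake_times)))
```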
Abstract:
Resource specialisation, although a fundamental component of ecological theory, is employed in disparate ways. Most definitions derive from simple counts of resource species. We build on recent advances in ecophylogenetics and null model analysis to propose a concept of specialisation that comprises affinities among resources as well as their co-occurrence with consumers. In the distance-based specialisation index (DSI), specialisation is measured as relatedness (phylogenetic or otherwise) of resources, scaled by the null expectation of random use of locally available resources. Thus, specialists use significantly clustered sets of resources, whereas generalists use over-dispersed resources. Intermediate species are classed as indiscriminate consumers. The effectiveness of this approach was assessed with differentially restricted null models, applied to a data set of 168 herbivorous insect species and their hosts. Incorporation of plant relatedness and relative abundance greatly improved specialisation measures compared to taxon counts or simpler null models, which overestimate the fraction of specialists, a problem compounded by insufficient sampling effort. This framework disambiguates the concept of specialisation with an explicit measure applicable to any mode of affinity among resource classes, and is also linked to ecological and evolutionary processes. This will enable a more rigorous deployment of ecological specialisation in empirical and theoretical studies.
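A hedged sketch of the general recipe described above: measure the mean pairwise (phylogenetic) distance among the resources a consumer uses and standardise it against a null distribution built from random draws weighted by local resource abundance. The function below is illustrative only; the published DSI may differ in detail.

```python
import numpy as np

def dsi_like_score(used, abundance, dist, n_null=999, rng=None):
    """Illustrative sketch of a distance-based specialisation score.

    used      : indices of the resource species a consumer was recorded on
    abundance : local relative abundance of every resource species
    dist      : pairwise phylogenetic distance matrix among resource species
    Returns a standardised effect size: negative = clustered resources
    (specialist), positive = over-dispersed resources (generalist).
    """
    rng = rng or np.random.default_rng()

    def mean_pairwise(idx):
        sub = dist[np.ix_(idx, idx)]
        iu = np.triu_indices(len(idx), k=1)
        return sub[iu].mean()

    observed = mean_pairwise(used)
    p = abundance / abundance.sum()
    # Null expectation: random use of locally available resources,
    # weighted by their relative abundance.
    null = np.array([
        mean_pairwise(rng.choice(len(p), size=len(used), replace=False, p=p))
        for _ in range(n_null)
    ])
    return (observed - null.mean()) / null.std()
```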
Abstract:
A search for a sidereal modulation in the MINOS near detector neutrino data was performed. If present, this signature could be a consequence of Lorentz and CPT violation as predicted by the effective field theory called the standard-model extension. No evidence for a sidereal signal in the data set was found, implying that there is no significant change in neutrino propagation that depends on the direction of the neutrino beam in a sun-centered inertial frame. Upper limits on the magnitudes of the Lorentz- and CPT-violating terms in the standard-model extension lie between $10^{-4}$ and $10^{-2}$ of the maximum expected, assuming a suppression of these signatures by a factor of $10^{-17}$.
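In standard-model extension searches of this kind, the sidereal dependence of the event rate is commonly fit with a five-parameter harmonic template of the form below; it is quoted here as the generic expansion used in such analyses, and the coefficients in this particular paper may be defined differently.

$$ R(T_\oplus) \;\propto\; \mathcal{C} \;+\; \mathcal{A}_s \sin(\omega_\oplus T_\oplus) \;+\; \mathcal{A}_c \cos(\omega_\oplus T_\oplus) \;+\; \mathcal{B}_s \sin(2\omega_\oplus T_\oplus) \;+\; \mathcal{B}_c \cos(2\omega_\oplus T_\oplus), $$

where $T_\oplus$ is the local sidereal time, $\omega_\oplus = 2\pi/(23\,\mathrm{h}\,56\,\mathrm{min})$ is the sidereal angular frequency, and the amplitudes are linear combinations of the SME Lorentz- and CPT-violating coefficients.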
Abstract:
We present precise tests of CP and CPT symmetry based on the full data set of $K\to\pi\pi$ decays collected by the KTeV experiment at Fermi National Accelerator Laboratory during 1996, 1997, and 1999. This data set contains $16\times10^{6}$ $K\to\pi^{0}\pi^{0}$ and $69\times10^{6}$ $K\to\pi^{+}\pi^{-}$ decays. We measure the direct CP violation parameter $\mathrm{Re}(\varepsilon'/\varepsilon) = (19.2 \pm 2.1)\times10^{-4}$. We find the $K_L$-$K_S$ mass difference $\Delta m = (5270 \pm 12)\times10^{6}\ \hbar\,\mathrm{s}^{-1}$ and the $K_S$ lifetime $\tau_S = (89.62 \pm 0.05)\times10^{-12}\ \mathrm{s}$. We also measure several parameters that test CPT invariance. We find the difference between the phase of the indirect CP violation parameter $\varepsilon$ and the superweak phase: $\phi_{\varepsilon} - \phi_{SW} = (0.40 \pm 0.56)^{\circ}$. We measure the difference of the relative phases between the CP violating and CP conserving decay amplitudes for $K\to\pi^{+}\pi^{-}$ ($\phi_{+-}$) and for $K\to\pi^{0}\pi^{0}$ ($\phi_{00}$): $\Delta\phi = (0.30 \pm 0.35)^{\circ}$. From these phase measurements, we place a limit on the mass difference between $K^{0}$ and $\bar{K}^{0}$: $\Delta M < 4.8\times10^{-19}\ \mathrm{GeV}/c^{2}$ at 95% C.L. These results are consistent with those of other experiments, our own earlier measurements, and CPT symmetry.
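For context, the superweak phase quoted above is fixed by the measured mass difference and the $K_S$ lifetime; a back-of-the-envelope check with the values in this abstract (neglecting $\Gamma_L$ against $\Gamma_S$, an approximation made here only for illustration) reproduces the familiar value near 43°:

$$ \phi_{SW} \;=\; \arctan\!\left(\frac{2\,\Delta m}{\Gamma_S - \Gamma_L}\right) \;\approx\; \arctan\!\left(2\,\Delta m\,\tau_S\right) \;=\; \arctan\!\left(2 \times 5.270\times10^{9}\,\mathrm{s}^{-1} \times 89.62\times10^{-12}\,\mathrm{s}\right) \;\approx\; \arctan(0.945) \;\approx\; 43.4^{\circ}. $$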
Abstract:
We use the boundary effective theory approach to thermal field theory in order to calculate the pressure of a system of massless scalar fields with quartic interaction. The method naturally separates the infrared physics, and is essentially nonperturbative. To lowest order, the main ingredient is the solution of the free Euler-Lagrange equation with nontrivial (time) boundary conditions. We derive a resummed pressure, which is in good agreement with recent calculations found in the literature, following a very direct and compact procedure.
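As a rough illustration of the setup described above (written here for a standard $\lambda\phi^{4}/4!$ interaction, an assumption of this sketch rather than a statement of the paper's conventions), the lowest-order ingredient is the classical field obeying the free equation of motion, held fixed at the ends of the Euclidean time interval:

$$ S_E[\phi] = \int_0^{\beta}\! d\tau \int d^3x \left[ \tfrac{1}{2}\,(\partial\phi)^2 + \tfrac{\lambda}{4!}\,\phi^4 \right], \qquad \Box\,\phi_c = 0, \qquad \phi_c(0,\mathbf{x}) = \phi_c(\beta,\mathbf{x}) = \phi_0(\mathbf{x}), $$

with $\beta = 1/T$; schematically, the pressure then follows from evaluating the action on the classical solution and integrating over the boundary configurations $\phi_0$.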
Abstract:
In a 4D chiral Thirring model we analyze the possibility that radiative corrections may produce spontaneous breaking of Lorentz and CPT symmetry. By studying the effective potential, we verify that the chiral current $\bar\psi\gamma^{\mu}\gamma_{5}\psi$ may acquire a nonzero vacuum expectation value, which triggers Lorentz and CPT violation. Furthermore, by considering fluctuations around the minimum of the potential, we dynamically induce a bumblebee-like model containing a Chern-Simons term.
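For orientation, the objects mentioned above have the following schematic forms, quoted as generic expressions commonly used in this literature rather than the exact conventions of the paper: a Thirring-type chiral current-current self-interaction, a nonzero vacuum expectation value of the chiral current, and a CPT-violating Chern-Simons (Carroll-Field-Jackiw) term,

$$ \mathcal{L}_{\mathrm{int}} \sim -\frac{G}{2}\,\big(\bar\psi\gamma^{\mu}\gamma_{5}\psi\big)\big(\bar\psi\gamma_{\mu}\gamma_{5}\psi\big), \qquad \langle \bar\psi\gamma^{\mu}\gamma_{5}\psi \rangle = b^{\mu} \neq 0, \qquad \mathcal{L}_{\mathrm{CS}} = \tfrac{1}{2}\,k_{\mu}\,\epsilon^{\mu\nu\alpha\beta} A_{\nu}\,\partial_{\alpha} A_{\beta}, $$

where a nonzero $b^{\mu}$ singles out a preferred spacetime direction and thereby breaks Lorentz and CPT symmetry.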
Abstract:
This paper presents a framework for building medical training applications with virtual reality, together with a tool that supports the instantiation of the framework's classes. The main purpose is to make it easier to build virtual reality applications for medical training, targeting systems that simulate biopsy exams and providing deformation, collision detection, and stereoscopy functionalities. Instantiating the classes allows the tools to be implemented quickly, reducing errors and keeping costs low thanks to the use of open source tools. Using the instantiation tool, the process of building applications is fast and easy, so computer programmers can obtain an initial application and adapt it to their needs. The tool allows the user to include, delete, and edit parameters in the chosen functionalities, as well as to store these parameters for future use. In order to verify the efficiency of the framework, some case studies are presented.
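Purely as an illustration of what such an instantiation tool might generate, the sketch below wires a configuration of the functionalities mentioned above (deformation, collision detection, stereoscopy) into a skeleton application; every class, field, and parameter name is hypothetical and not taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class FunctionalityConfig:
    deformation: bool = True          # soft-tissue deformation during needle insertion
    collision_detection: bool = True
    stereoscopy: bool = False
    parameters: dict = field(default_factory=dict)

class BiopsySimulationApp:
    """Skeleton application produced by instantiating the framework classes."""
    def __init__(self, config: FunctionalityConfig):
        self.config = config

    def run(self) -> None:
        enabled = [name for name in ("deformation", "collision_detection", "stereoscopy")
                   if getattr(self.config, name)]
        print(f"Starting simulation with: {', '.join(enabled)}")

# A programmer would obtain something like this from the instantiation tool,
# then adapt it to the specific training scenario.
app = BiopsySimulationApp(FunctionalityConfig(stereoscopy=True,
                                              parameters={"needle_stiffness": 0.8}))
app.run()
```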
Abstract:
Product lifecycle management (PLM) innovates by defining the product as a central element for aggregating enterprise information and the lifecycle as a new time dimension for information integration and analysis. Because of its potential to shorten innovation lead times and to reduce costs, PLM has attracted considerable attention in both industry and research. However, the current stage of PLM implementation at most organisations still does not apply lifecycle management concepts thoroughly. In order to close this realisation gap, this article presents a process-oriented framework to support effective PLM implementation. The framework's central element is a set of lifecycle-oriented business process reference models that link the necessary fundamental concepts, enterprise knowledge and software solutions to deploy PLM effectively. (c) 2007 Elsevier B.V. All rights reserved.
Abstract:
High urban transport energy consumption is directly influenced by transport energy dependence. Dramatic reductions in urban transport energy dependence or consumption are not yet being widely observed, despite the variety of urban planning tools currently available. To tackle this issue, a new urban development framework is presented that makes use of a recently developed and successfully trialed GIS-based tool, the Transport Energy Specification (TES). The TES was simulated on a neighborhood in São Carlos, Brazil; in the simulation, energy dependence was reduced by a factor of 8 through activity location or infrastructure modifications to the built environment.
Abstract:
This paper provides a computational framework, based on Defeasible Logic, to capture some aspects of institutional agency. Our background is the Kanger-Lindahl-Pörn account of organised interaction, which describes this interaction within a multi-modal logical setting. This work focuses in particular on the notion of the counts-as link and on those of attempt and of personal and direct action to realise states of affairs. We show how standard Defeasible Logic can be extended to represent these concepts: the resulting system preserves some basic properties commonly attributed to them. In addition, the framework enjoys nice computational properties, as it turns out that the extension of any theory can be computed in time linear in the size of the theory itself.
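To make the ingredients concrete, the sketch below implements a deliberately tiny fragment of defeasible reasoning (facts, defeasible rules, and a superiority relation resolving conflicts), applied to an invented "counts-as" example; it is not the extended logic defined in the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    name: str
    body: tuple          # literals that must already be provable
    head: str            # literal concluded defeasibly

def defeasibly_provable(goal, facts, rules, superior):
    """True if `goal` is defeasibly provable: some applicable rule concludes it
    and every applicable rule for the complement is beaten by a superior rule."""
    def applicable(r):
        return all(b in facts or defeasibly_provable(b, facts, rules, superior)
                   for b in r.body)
    negation = goal[1:] if goal.startswith("~") else "~" + goal
    supporters = [r for r in rules if r.head == goal and applicable(r)]
    attackers = [r for r in rules if r.head == negation and applicable(r)]
    return bool(supporters) and all(
        any((s.name, a.name) in superior for s in supporters) for a in attackers)

# Toy example: raising a hand counts as making a bid, unless the agent is the auctioneer.
facts = {"raises_hand", "is_auctioneer"}
rules = [
    Rule("r1", ("raises_hand",), "counts_as_bid"),
    Rule("r2", ("is_auctioneer",), "~counts_as_bid"),
]
superior = {("r2", "r1")}   # the exception beats the general rule
print(defeasibly_provable("counts_as_bid", facts, rules, superior))   # False
```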
Abstract:
We explore the feasibility of the computationally oriented institutional agency framework proposed by Governatori and Rotolo by testing it against an industrial-strength scenario. In particular, we show how to encode in defeasible logic the dispute resolution policy described in Article 67 of FIDIC.
Abstract:
There are many techniques for electricity market price forecasting. However, most of them are designed for expected price analysis rather than price spike forecasting, and an effective method for predicting the occurrence of spikes has not yet appeared in the literature. In this paper, a data mining based approach is presented to give a reliable forecast of the occurrence of price spikes. Combined with the spike value prediction techniques developed by the same authors, the proposed approach aims at providing a comprehensive tool for price spike forecasting. Feature selection techniques are first described to identify the attributes relevant to the occurrence of spikes, and a brief introduction to the classification techniques is given for completeness. Two algorithms, a support vector machine and a probability classifier, are chosen as the spike occurrence predictors and are discussed in detail. Realistic market data are used to test the proposed model, with promising results.
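A minimal sketch of the overall pipeline described above, using generic scikit-learn components (feature selection followed by an SVM classifier for spike occurrence); the synthetic features, thresholds, and parameter choices are illustrative assumptions, not the authors' model.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.normal(7000, 800, n),      # system demand [MW]
    rng.normal(0.15, 0.05, n),     # reserve margin
    rng.normal(40, 10, n),         # lagged price [$/MWh]
    rng.normal(0, 1, n),           # an irrelevant feature the selector should drop
])
# Synthetic label: spikes are more likely with high demand and low reserve margin
y = ((X[:, 0] > 7800) & (X[:, 1] < 0.12)).astype(int)

# Feature selection followed by an SVM spike-occurrence classifier
model = make_pipeline(
    SelectKBest(f_classif, k=3),
    StandardScaler(),
    SVC(kernel="rbf", class_weight="balanced"),   # spikes are rare, so rebalance
)
print(cross_val_score(model, X, y, cv=5, scoring="f1").mean())
```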