933 results for THIRD GENERATION SYSTEMS
Abstract:
The third law of thermodynamics is formulated precisely: all points of the zero-temperature state space are physically adiabatically inaccessible from the state space of a simple system. In addition to implying the unattainability of absolute zero in finite time (or "by a finite number of operations"), it admits as a corollary, under a continuity assumption, that all points of the zero-temperature state space are adiabatically equivalent. We argue that the third law is universally valid for all macroscopic systems which obey the laws of quantum mechanics and/or quantum field theory. We also briefly discuss why a precise formulation of the third law for black holes remains an open problem.
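A minimal restatement in Lieb-Yngvason-style notation may help; the symbols Γ, Γ(0) and the accessibility relation used below are assumptions for illustration, not necessarily the paper's own notation.
% Γ: state space of a simple system; Γ(0): its zero-temperature subset;
% X \prec Y reads "Y is adiabatically accessible from X" (amssymb needed for \nprec).
\text{Third law:}\qquad Y \nprec X \quad \text{for all } X \in \Gamma(0),\; Y \in \Gamma \setminus \Gamma(0).
\text{Corollary (under continuity):}\qquad X_1 \sim X_2 \quad \text{for all } X_1, X_2 \in \Gamma(0).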
Abstract:
The problem of resonant generation of nonground-state condensates is addressed with the aim of resolving the seeming paradox that arises when one resorts to the adiabatic representation. In this picture, the eigenvalues and eigenfunctions of a time-dependent Gross-Pitaevskii Hamiltonian are also functions of time. Since the level energies vary in time, no definite transition frequency can be introduced. Hence no external modulation with a fixed frequency can be made resonant. Thus, the resonant generation of adiabatic coherent modes is impossible. However, this paradox occurs only in the frame of the adiabatic picture. It is shown that no paradox exists in the properly formulated diabatic representation. The resonant generation of diabatic coherent modes is a well-defined phenomenon. As an example, equations are derived that describe the generation of diabatic coherent modes by combined resonant modulation of the trapping potential and the atomic scattering length.
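For context, the setting described here can be written as a driven Gross-Pitaevskii equation in which both the trap and the interaction strength carry a harmonic modulation at a fixed frequency ω; the notation below is an assumed illustration, not the paper's own equations.
% ψ normalized to unity, N atoms, a_s(t) the modulated s-wave scattering length
i\hbar\,\partial_t \psi(\mathbf{r},t)
  = \Big[-\frac{\hbar^{2}}{2m}\nabla^{2}
         + V_{0}(\mathbf{r}) + V_{1}(\mathbf{r})\cos(\omega t)
         + g(t)\,N\,|\psi(\mathbf{r},t)|^{2}\Big]\psi(\mathbf{r},t),
\qquad g(t) = \frac{4\pi\hbar^{2} a_{s}(t)}{m}.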
Abstract:
The third-order optical susceptibility and dispersion of the linear refractive index of Er³⁺-doped lead phosphate glass were measured in the wavelength range between 400 and 1940 nm by using the spectrally resolved femtosecond Maker fringes technique. The nonlinear refractive index obtained from the third-order susceptibility was found to be five times higher than that of silica, indicating that Er³⁺-doped lead phosphate glass is a potential candidate to be used as the base component for the fabrication of photonic devices. For comparison purposes, the Z-scan technique was also employed to obtain the values of the nonlinear refractive index of Er³⁺-doped lead phosphate glass at several wavelengths, and the values obtained using the two techniques agree to within 15%.
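For reference, the standard SI-unit relation commonly used to convert a measured third-order susceptibility into an intensity-dependent nonlinear refractive index is quoted below; it is provided for context and is not taken from the abstract itself.
% n_0: linear refractive index, I: optical intensity
n = n_{0} + n_{2} I, \qquad
n_{2}\,[\mathrm{m^{2}/W}] = \frac{3\,\operatorname{Re}\chi^{(3)}}{4\,\varepsilon_{0}\, c\, n_{0}^{2}}.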
Abstract:
In this work an efficient third-order non-linear finite difference scheme for adaptively solving hyperbolic systems of one-dimensional conservation laws is developed. The method is based on applying an interpolating wavelet transform to the solution of the differential equation at each time step, generating a multilevel representation of the solution, which is thresholded to produce a sparse point representation. The numerical fluxes obtained by a Lax-Friedrichs flux splitting are evaluated on the sparse grid by an essentially non-oscillatory (ENO) approximation, which chooses the locally smoothest stencil among all the possibilities for each point of the sparse grid. The time evolution of the differential operator is done on this sparse representation by a total variation diminishing (TVD) Runge-Kutta method. Four classical examples of initial value problems for the Euler equations of gas dynamics are accurately solved and their sparse solutions are analyzed with respect to the threshold parameters, confirming the efficiency of the wavelet transform as an adaptive grid generation technique. (C) 2008 IMACS. Published by Elsevier B.V. All rights reserved.
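Two of the building blocks named above, global Lax-Friedrichs flux splitting and a third-order TVD Runge-Kutta time step, are sketched below for the 1D inviscid Burgers equation on a uniform periodic grid. This is only an illustrative sketch under simplifying assumptions (piecewise-constant reconstruction, no ENO stencil selection, no wavelet-based sparse grid), not the paper's scheme.
import numpy as np

def flux(u):
    return 0.5 * u**2                      # Burgers flux f(u) = u^2 / 2

def rhs(u, dx):
    alpha = np.max(np.abs(u))              # global Lax-Friedrichs splitting: f = f+ + f-
    fp = 0.5 * (flux(u) + alpha * u)       # f+ carries non-negative characteristic speeds
    fm = 0.5 * (flux(u) - alpha * u)       # f- carries non-positive characteristic speeds
    f_half = fp + np.roll(fm, -1)          # interface flux F_{i+1/2} = f+_i + f-_{i+1}
    return -(f_half - np.roll(f_half, 1)) / dx

def tvd_rk3_step(u, dt, dx):
    # Shu-Osher third-order TVD Runge-Kutta
    u1 = u + dt * rhs(u, dx)
    u2 = 0.75 * u + 0.25 * (u1 + dt * rhs(u1, dx))
    return u / 3.0 + 2.0 / 3.0 * (u2 + dt * rhs(u2, dx))

# usage: advance a smooth initial profile on the periodic domain [0, 2*pi)
N = 400
x = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
dx = x[1] - x[0]
u = np.sin(x) + 0.5
t, T = 0.0, 1.0
while t < T:
    dt = min(0.4 * dx / max(np.max(np.abs(u)), 1e-12), T - t)  # CFL-limited step
    u = tvd_rk3_step(u, dt, dx)
    t += dt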
Abstract:
A few years ago, it was reported that ozone is produced in human atherosclerotic arteries, on the basis of the identification of 3β-hydroxy-5-oxo-5,6-secocholestan-6-al and 3β-hydroxy-5β-hydroxy-B-norcholestane-6β-carboxaldehyde (ChAld) as their 2,4-dinitrophenylhydrazones. The formation of endogenous ozone was attributed to water oxidation catalyzed by antibodies, with the formation of dihydrogen trioxide as a key intermediate. We now report that ChAld is also generated by the reaction of cholesterol with singlet molecular oxygen [O₂(¹Δg)] that is produced by photodynamic action or by the thermodecomposition of 1,4-dimethylnaphthalene endoperoxide, a defined pure chemical source of O₂(¹Δg). On the basis of ¹⁸O-labeled ChAld mass spectrometry, NMR, light emission measurements, and derivatization studies, we propose that the mechanism of ChAld generation involves the formation of the well-known cholesterol 5α-hydroperoxide (5α-OOH) (the major product of O₂(¹Δg)-oxidation of cholesterol) and/or a 1,2-dioxetane intermediate formed by O₂(¹Δg) attack at the Δ⁵ position. The Hock cleavage of 5α-OOH (the major pathway) or unstable cholesterol dioxetane decomposition (a minor pathway, traces) gives a 5,6-secosterol intermediate, which undergoes intramolecular aldolization to yield ChAld. These results show clearly and unequivocally that ChAld is generated upon the reaction of cholesterol with O₂(¹Δg) and raise questions about the role of ozone in biological processes.
Abstract:
The diffusion of Concentrating Solar Power (CSP) systems is currently taking place at a much slower pace than that of photovoltaic (PV) power systems. This is mainly because of the higher present cost of solar thermal power plants, but also because of the time needed to build them. Though the economic attractiveness of the different concentrating technologies varies, PV power still dominates the market. The price of CSP is expected to drop significantly in the near future, and widespread installation will follow. The main aim of this project is the creation of different relevant case studies on solar thermal power generation and a comparison between them. The purpose of this detailed comparison is the techno-economic appraisal of a number of CSP systems and the understanding of their behaviour under various boundary conditions. The CSP technologies which will be examined are the Parabolic Trough, the Molten Salt Power Tower, the Linear Fresnel Mirrors and the Dish Stirling. These systems will be appropriately sized and simulated. All of the simulations aim at the optimization of the particular system, which involves two main goals: achieving the lowest possible levelized cost of electricity and maximizing the annual energy output (kWh). The project also aims to identify the factors which affect the results most and, more specifically, how they contribute to cost reduction or power generation. In addition, photovoltaic systems will be simulated under the same boundary conditions to facilitate a comparison between the PV and CSP systems. Last but not least, the system which performs best in each case study will be determined.
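As background, a minimal sketch of the levelized cost of electricity metric used to compare such systems is shown below, in its simple discounted form; the parameter names and example figures are illustrative assumptions only and are not results of the project.
def lcoe(capex, annual_opex, annual_energy_kwh, discount_rate, lifetime_years):
    """LCOE = discounted lifetime costs / discounted lifetime energy (currency per kWh)."""
    costs = capex + sum(
        annual_opex / (1.0 + discount_rate) ** year
        for year in range(1, lifetime_years + 1)
    )
    energy = sum(
        annual_energy_kwh / (1.0 + discount_rate) ** year
        for year in range(1, lifetime_years + 1)
    )
    return costs / energy

# illustrative usage with made-up numbers
print(lcoe(capex=5_000_000, annual_opex=80_000,
           annual_energy_kwh=3_500_000, discount_rate=0.07, lifetime_years=25))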
Abstract:
Agent-oriented software engineering (AOSE) is a promising approach to developing applications for dynamic open systems. If well developed, these applications can be opportunistic, taking advantage of services implemented by other developers at appropriate times. However, methodologies are needed to aid the development of systems that are both flexible enough to be opportunistic and tightly defined by the application requirements. In this paper, we investigate how developers can choose the coordination mechanisms of agents so that the agents will best fulfil application requirements in an open system.
Abstract:
Semantic Analysis is a business analysis method designed to capture system requirements. While these requirements may be represented as text, the method also advocates the use of Ontology Charts to formally denote the system's required roles, relationships and forms of communication. Following model-driven engineering techniques, Ontology Charts can be transformed into temporal database schemas, class diagrams and component diagrams, which can then be used to produce software systems. A nice property of these transformations is that the resulting system design models accommodate substantial extensions without requiring changes to the design models themselves. For example, resulting databases can be extended with new types of data without the need to modify the database schema of the legacy system. Semantic Analysis is not widely used in software engineering, so there is a lack of experts in the field and no design patterns are available. This makes it difficult for analysts to pass organizational knowledge to the engineers. This study describes an implementation that is readily usable by engineers, which includes an automated technique that can produce a prototype from an Ontology Chart. The use of such tools should enable developers to make use of Semantic Analysis with minimal expertise in ontologies and MDA.
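A hypothetical, heavily simplified sketch of the kind of Ontology Chart-to-prototype transformation mentioned above is given below; the chart representation, the names, and the mapping to SQL tables are illustrative assumptions, not the study's actual tool.
from dataclasses import dataclass, field

@dataclass
class Affordance:
    name: str                                          # e.g. "employs"
    antecedents: list = field(default_factory=list)    # affordances/agents it ontologically depends on

def to_sql(chart):
    """Map each affordance to a table whose foreign keys follow its ontological dependencies,
    with start/finish columns to capture the temporal existence typical of Semantic Analysis."""
    statements = []
    for a in chart:
        cols = ["id INTEGER PRIMARY KEY", "start_time TEXT", "finish_time TEXT"]
        cols += [f"{dep}_id INTEGER REFERENCES {dep}(id)" for dep in a.antecedents]
        statements.append(f"CREATE TABLE {a.name} ({', '.join(cols)});")
    return "\n".join(statements)

# illustrative chart: two agents and one relationship between them
chart = [Affordance("person"), Affordance("society"),
         Affordance("employs", antecedents=["society", "person"])]
print(to_sql(chart))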
Abstract:
Agent-oriented cooperation techniques and standardized electronic healthcare record exchange protocols can be used to combine information regarding different facets of a therapy received by a patient from different healthcare providers at different locations. Provenance is an innovative approach to trace events in complex distributed processes, dependencies between such events, and associated decisions by human actors. We focus on three aspects of provenance in agent-mediated healthcare systems: first, we define the provenance concept and show how it can be applied to agent-mediated healthcare applications; second, we investigate and provide a method for independent and autonomous healthcare agents to document the processes they are involved in without directly interacting with each other; and third, we show that this method solves the privacy issues of provenance in agent-mediated healthcare systems.
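As an illustration of the documentation method described here, in which independent agents record what they did without interacting with each other directly, a minimal sketch of an append-only provenance store is given below; all class and field names are assumptions for illustration, not the paper's model.
from dataclasses import dataclass, field
from datetime import datetime, timezone
import uuid

@dataclass
class ProvenanceAssertion:
    actor: str                                       # agent identifier, e.g. "hospital.lab.agent"
    activity: str                                    # what happened, e.g. "blood-test-performed"
    caused_by: list = field(default_factory=list)    # ids of earlier assertions this one depends on
    assertion_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

class ProvenanceStore:
    """Append-only store; agents submit assertions here and never query each other."""
    def __init__(self):
        self._log = {}
    def record(self, assertion):
        self._log[assertion.assertion_id] = assertion
        return assertion.assertion_id
    def trace(self, assertion_id):
        """Walk causal links backwards to reconstruct the distributed process."""
        a = self._log[assertion_id]
        return [a] + [x for dep in a.caused_by for x in self.trace(dep)]

store = ProvenanceStore()
referral = store.record(ProvenanceAssertion("gp.agent", "referral-issued"))
test = store.record(ProvenanceAssertion("hospital.lab.agent", "blood-test-performed",
                                        caused_by=[referral]))
print([a.activity for a in store.trace(test)])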
Abstract:
I begin by citing at length a definition of "third wave" from the glossary in Turbo Chicks: Talking Young Feminisms because it communicates several key issues that I develop in this project. The definition introduces a tension within "third wave" feminism of building on and differentiating itself from second wave feminism, the newness of the term "third wave," its association with "young" women, the complexity of contemporary feminisms, and attention to multiple identities and oppressions. Uncovering explanations of "third wave" feminism that, like this one, go beyond generational associations is not an easy task. Authors consistently group new feminist voices together by age under the label "third wave" feminists without questioning the accuracy of the designation. Most explorations of "third wave" feminism overlook the complexities and distinctions that abound among "young" feminists; not all young feminists espouse similar ideas, tactics, and actions; and for various reasons, not all young feminists identify with a "third wave" of feminism. Less than a year after I began to learn about feminism I discovered Barbara Findlen's Listen Up: Voices From the Next Feminist Generation. Although neither the collection nor its contributors declare an association with "third wave" feminism, subsequent reviews and citations in articles identify it, along with Rebecca Walker's To Be Real: Telling the Truth and Changing the Voice of Feminism, as a major text of "third wave" feminism. Re-reading Listen Up since beginning to research "third wave" feminism, I now understand its fundamental influence on my research questions as a starting point for assessing persistent exclusion in contemporary feminism, rather than as a revolutionary text (as it is claimed to be in many reviews). Findlen begins the introduction with the bold claim, "My feminism wasn't shaped by antiwar or civil rights activism ..." (xi). Framing the collection with a disavowal of the influence of women of color's organizational efforts negates, for me, the project's proclaimed commitment to multivocality. Though several contributions examine persistent exclusion within contemporary feminist movement, the larger project seems to rely on these essays to reflect this commitment, suggesting that Listen Up does not go beyond the "add and stir" approach to "diversity." Interestingly, this statement does not appear in the new edition of Listen Up published in 2001. The content has also changed with this new edition, including several more Latina contributors and other "corrective" additions.
Abstract:
The work described in this thesis aims to support the distributed design of integrated systems and specifically considers the need for collaborative interaction among designers. Particular emphasis was given to issues which were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprises a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we proposed to create an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework - named Cave2 - followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the use of the object-oriented framework foundations allowed a series of improvements which were not available in previous approaches:
- Object-oriented frameworks are extensible by design, so this is also true of the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations will still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows for different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings.
- The control of the consistency between semantics and visualization - a particularly important issue in a design environment with multiple views of a single design - is also included in the foundations of the object-oriented framework. This mechanism is generic enough to also be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics. The view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible. If so, it triggers the change of state of both semantics and view (a minimal sketch of this mechanism follows this abstract). Our approach took advantage of this inversion of control and included a layer between semantics and view to take into account the possibility of multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics - and thus to other possible views - according to the consistency policy which is being used. Furthermore, the use of event pools allows for a late synchronization between view and semantics in case of unavailability of a network connection between them.
- The use of proxy objects significantly raised the abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracted the network location of such resources, allowing for resource addition and removal at runtime.
- The implemented CAD Framework is completely based on Java technology, so it relies on the Java Virtual Machine as the layer which guarantees the independence of the CAD Framework from the operating system.
All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for the remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experiences among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three different case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first one uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second one extends the foundations of the implemented object-oriented framework to support interface-based design. Such extensions - design representation primitives and tool blocks - are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study regards the possibility of integrating multimedia metadata into the design data model. This possibility is explored in the context of an online educational and training platform.
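The inversion of control and the event-pool idea described in the list above can be illustrated with a minimal sketch; it is written in Python for brevity (the actual framework is Java-based) and uses invented class names, so it stands in for the general pattern rather than for Cave2 itself.
class DesignEvent:
    """Encapsulates one discrete designer interaction."""
    def __init__(self, source, attribute, value):
        self.source, self.attribute, self.value = source, attribute, value

class SemanticModel:
    """Owns the design state; views never change it directly."""
    def __init__(self):
        self._state, self._views = {}, []
    def attach(self, view):
        self._views.append(view)
    def propose(self, event):
        """Inversion of control: a view proposes a change, the model decides."""
        if event.value is None:            # stand-in for real design-rule validation
            return False
        self._state[event.attribute] = event.value
        for view in self._views:           # keep every registered view consistent
            view.refresh(event)
        return True

class View:
    def __init__(self, name, model):
        self.name, self.model, self.pending = name, model, []
        model.attach(self)
    def user_input(self, attribute, value, online=True):
        event = DesignEvent(self.name, attribute, value)
        if online:
            return self.model.propose(event)
        self.pending.append(event)         # event pool: buffer events while disconnected
        return None
    def synchronize(self):
        while self.pending:                # late synchronization once reconnected
            self.model.propose(self.pending.pop(0))
    def refresh(self, event):
        print(f"{self.name}: {event.attribute} -> {event.value} (from {event.source})")

model = SemanticModel()
schematic, hdl = View("schematic", model), View("hdl-editor", model)
schematic.user_input("clk_period_ns", 10)          # propagated to both views at once
hdl.user_input("clk_period_ns", 8, online=False)   # held in the local event pool
hdl.synchronize()                                  # propagated once the connection returns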
Abstract:
In order to adapt to new markets, the coffee supply chain has gone through numerous changes during the last years, which led to the creation of voluntary standard systems. Adopting a Voluntary Standard System (VSS) consists of becoming a member of a certifier or verifier, in which an independent third party sets specific criteria to ensure a product complies with standards. Yet the segment is still relatively new and raises some doubts about the economic and financial advantages of investing in sustainability-related certification. This study analyzes the perception of coffee producers about VSS – whether it brings economic benefits. The literature review covers various VSS in the coffee sector, the brief history of the commodity in Brazil, as well as a description of the supply chain. Certified and non-certified producers in the states of Sao Paulo and Minas Gerais answered questionnaires to indicate the perceived advantages of certification. The results show that, despite some added value that certification can bestow, quality is what really matters, since it allows producers to sell the product at higher prices and to gain an advantage over competitors.
Abstract:
Nowadays, more than half of computer development projects fail to meet the final users' expectations. One of the main causes is insufficient knowledge about the organization of the enterprise to be supported by the respective information system. The DEMO methodology (Design and Engineering Methodology for Organizations) has proved to be a well-defined method to specify, through models and diagrams, the essence of any organization at a high level of abstraction. However, this methodology is platform-implementation independent and lacks the ability to save and propagate changes from the organization models to the implemented software in a runtime environment. The Universal Enterprise Adaptive Object Model (UEAOM) is a conceptual schema used as the basis for a wiki system that allows the modeling of any organization, independently of its implementation, as well as the previously mentioned change propagation in a runtime environment. Based on DEMO and UEAOM, this project aims to develop efficient and standardized methods to enable the automatic conversion of DEMO ontological models, based on the UEAOM specification, into BPMN (Business Process Model and Notation) process models with clear, unambiguous semantics, in order to facilitate the creation of processes that are almost ready to be executed on workflow systems that support BPMN.
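To make the target of such a conversion concrete, the sketch below emits a minimal BPMN 2.0 process for a DEMO-style transaction step sequence; the mapping rules, element choices and names are simplified illustrative assumptions, not the project's actual DEMO/UEAOM-to-BPMN transformation.
from xml.sax.saxutils import quoteattr

BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

def transaction_to_bpmn(transaction_id, steps):
    """Emit one sequential BPMN process: startEvent -> one task per step -> endEvent."""
    nodes, flows, prev = ['<startEvent id="start"/>'], [], "start"
    for i, step in enumerate(steps):
        task_id = "task_%d" % i
        nodes.append('<task id="%s" name=%s/>' % (task_id, quoteattr(step)))
        flows.append('<sequenceFlow id="flow_%d" sourceRef="%s" targetRef="%s"/>'
                     % (i, prev, task_id))
        prev = task_id
    nodes.append('<endEvent id="end"/>')
    flows.append('<sequenceFlow id="flow_end" sourceRef="%s" targetRef="end"/>' % prev)
    body = "\n    ".join(nodes + flows)
    return ('<definitions xmlns="%s" targetNamespace="http://example.org/bpmn">\n'
            '  <process id="%s" isExecutable="true">\n    %s\n  </process>\n'
            '</definitions>' % (BPMN_NS, transaction_id, body))

# DEMO's standard transaction pattern used as the illustrative step sequence
print(transaction_to_bpmn("T01_membership_start",
                          ["request", "promise", "execute", "state", "accept"]))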