973 results for Galilean covariant formalism


Relevance:

10.00%

Publisher:

Abstract:

Social, technological, and economic time series are punctuated by events that are usually assumed to be random, albeit with some hierarchical structure. It is well known that the interevent statistics observed in these contexts differ from the Poissonian profile: they are long-tailed, with resting and active periods interwoven. Understanding the mechanisms that generate such statistics has therefore become a central issue. The approach we present is taken from the continuous-time random-walk formalism and represents an analytical alternative to the models of nontrivial priority that have recently been proposed. Our analysis also goes one step further by looking at the multifractal structure of the interevent times of human decisions. We here analyze the intertransaction time intervals of several financial markets. We observe that the empirical data exhibit a subtle multifractal behavior. Our model explains this structure by taking the pausing-time density in the form of a superstatistics, where the integral kernel quantifies the heterogeneous nature of the executed tasks. A stretched exponential kernel provides a multifractal profile valid over a certain limited range. A suggested heuristic analytical profile is capable of covering a broader region.
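As a rough illustration of the superstatistics idea described in the abstract (not the authors' code), a pausing-time density of the form ψ(t) = ∫ f(λ) λ e^(−λt) dλ can be sampled by first drawing a rate λ from the kernel and then an exponential waiting time conditional on that rate. The stretched-exponential kernel f(λ) ∝ exp(−(λ/λ0)^β) and all parameter values below are illustrative assumptions:

```python
import numpy as np

def sample_waiting_times(n, beta=0.5, lam0=1.0, seed=0):
    """Draw n interevent times from a superstatistics pausing-time density:
    rate lambda ~ stretched-exponential kernel (illustrative choice),
    then waiting time t ~ Exp(lambda)."""
    rng = np.random.default_rng(seed)
    rates = np.empty(n)
    filled = 0
    # Rejection sampling for f(lam) proportional to exp(-(lam/lam0)**beta)
    # on a truncated support [0, L], with a uniform envelope.
    L = 50.0 * lam0
    while filled < n:
        cand = rng.uniform(0.0, L, size=2 * (n - filled))
        accept = rng.uniform(size=cand.size) < np.exp(-(cand / lam0) ** beta)
        kept = cand[accept][: n - filled]
        rates[filled : filled + kept.size] = kept
        filled += kept.size
    # Conditional on the rate, waiting times are exponential; mixing over
    # rates produces the heavier-than-exponential interevent statistics.
    return rng.exponential(1.0 / rates)

times = sample_waiting_times(10_000)
```

Mixing exponentials over a broad rate distribution is what generates the long-tailed, non-Poissonian profile discussed above.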

The design methods and languages targeted at modern System-on-Chip designs face tremendous pressure from ever-increasing complexity, power, and speed requirements. In estimating any of these three metrics, there is a trade-off between accuracy and the level of detail at which the system under design is analyzed. The more detailed the description, the more accurate the simulation will be, but also the more time-consuming. Moreover, a designer wants to make decisions as early as possible in the design flow to avoid costly design backtracking. To answer the challenges posed by System-on-Chip designs, this thesis introduces a formal, power-aware framework, its development methods, and methods to constrain and analyze the power consumption of the system under design. The thesis discusses the power analysis of synchronous and asynchronous systems, not forgetting the communication aspects of these systems. The presented framework is built upon the Timed Action System formalism, which offers an environment in which to analyze and constrain the functional and temporal behavior of a system at a high abstraction level. Furthermore, owing to the complexity of System-on-Chip designs, the ability to abstract away unnecessary implementation details at higher abstraction levels is an essential part of the introduced design framework. The encapsulation and abstraction techniques, combined with procedure-based communication, allow a designer to use the presented power-aware framework to model these large-scale systems. The introduced techniques also make it possible to subdivide the development of communication and computation into separate tasks, a property that is taken into account in the power analysis as well. Finally, the framework is developed so that it can be used throughout a design project: a designer is able to model and analyze systems from an abstract specification down to an implementable specification.
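The general idea of annotating abstract actions with timing and power, and checking a power constraint before implementation details are fixed, can be sketched roughly as follows. This is an illustrative Python sketch, not the Timed Action System formalism itself; the `Action` class and the budget check are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Action:
    """An abstract system action annotated with timing and power."""
    name: str
    duration: float  # execution time, in seconds
    power: float     # average power drawn while active, in watts

    @property
    def energy(self) -> float:
        return self.duration * self.power

def total_energy(trace):
    """Energy consumed by a sequential trace of actions."""
    return sum(a.energy for a in trace)

def meets_budget(trace, max_energy):
    """A simple energy constraint on the system under design."""
    return total_energy(trace) <= max_energy

trace = [Action("fetch", 1e-6, 0.2), Action("compute", 5e-6, 0.8),
         Action("send", 2e-6, 0.5)]
ok = meets_budget(trace, max_energy=6e-6)
```

Because the annotations live at the action level, communication and computation actions can be developed and analyzed separately and then composed, mirroring the subdivision described above.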

This article intends to answer the question: "What is the best way to evaluate the strength of acids and bases?" The meaning of the word strength, the main acid-base theories (ionotropic and electron-pair), neutralization reactions, and the thermodynamic formalism are considered. Some representative cases are presented and discussed. In conclusion, the evaluation of acid-base strength depends on the theory (formalism) adopted as well as on the system and the measuring techniques.
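Within the thermodynamic formalism, one common quantitative measure of strength is the ionization constant obtained from the standard Gibbs energy via ΔG° = −RT ln Ka. A minimal sketch (the example ΔG° values are illustrative, not taken from the article):

```python
import math

R = 8.314    # gas constant, J mol^-1 K^-1
T = 298.15   # temperature, K

def Ka_from_dG(dG_joules_per_mol):
    """Acidity constant from the standard Gibbs energy of ionization:
    dG = -RT ln Ka."""
    return math.exp(-dG_joules_per_mol / (R * T))

def pKa(dG):
    return -math.log10(Ka_from_dG(dG))

# Hypothetical comparison: a more negative Gibbs energy of ionization
# corresponds to a larger Ka, i.e. a stronger acid on this scale.
assert Ka_from_dG(-10_000) > Ka_from_dG(+10_000)
```

Note that this scale presupposes a particular theory and a particular reaction (ionization in a given solvent), which is exactly the dependence on formalism and system the article emphasizes.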

A thermodynamic formalism based on the Gibbs Dividing Surface (GDS) for the description of a solid-fluid interface is presented, in which the adsorption layer is understood as a phase and the adsorption process as the transfer of components between a three-dimensional phase and a two-dimensional one. Using an equation of state derived from Henry's law, we show how the Langmuir isotherm is deduced from the Gibbs isotherm. The GDS is also useful for understanding the heat released by a system as adsorption occurs.
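One standard route from the Gibbs isotherm to the Langmuir isotherm (a sketch consistent with, though not necessarily identical to, the article's derivation) combines the Gibbs adsorption equation with a two-dimensional equation of state for the adsorbed phase:

```latex
% Gibbs adsorption isotherm for an ideal bulk gas:
\Gamma = \frac{1}{RT}\,\frac{d\pi}{d\ln p}
% Localized-monolayer equation of state, with \theta = \Gamma/\Gamma_{\max}:
\pi = -\Gamma_{\max} RT \,\ln(1-\theta)
% Differentiating the equation of state and substituting:
\frac{d\pi}{d\theta} = \frac{\Gamma_{\max} RT}{1-\theta}
\quad\Longrightarrow\quad
\frac{d\theta}{\theta(1-\theta)} = d\ln p
% Integrating gives the Langmuir isotherm:
\ln\frac{\theta}{1-\theta} = \ln(Kp)
\quad\Longrightarrow\quad
\theta = \frac{Kp}{1+Kp}.
```

In the dilute limit θ ≪ 1 this reduces to θ ≈ Kp, i.e. Henry's law, consistent with the Henry-type equation of state the abstract starts from.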

Potential energy and dipole moment curves for the HCl molecule were computed. Calculations were performed at different levels of theory (DFT, MRCI). Spectroscopic properties are reported and compared with experimental data to validate the theoretical approaches. The interaction of infrared radiation with HCl is simulated using the wave-packet formalism, and a quantum control model for the population dynamics of the vibrational levels, based on π-pulse theory, is applied. The results demonstrate that wave packets with a specific composition can be built with short infrared laser pulses, providing the basis for studies of H + HCl collision dynamics with infrared laser excitation.
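The π-pulse idea can be illustrated on a two-level system in the rotating-wave approximation (a textbook sketch, not the authors' wave-packet code): a resonant pulse whose area ∫Ω dt equals π transfers the population completely between the two levels.

```python
import numpy as np

def propagate_pi_pulse(n_steps=2000, pulse_area=np.pi):
    """Propagate a resonant two-level system (RWA) through a constant
    pulse whose total area is `pulse_area`."""
    T = 1.0                      # pulse duration (arbitrary units)
    omega = pulse_area / T       # Rabi frequency
    dt = T / n_steps
    # On-resonance RWA Hamiltonian: H = (omega/2) * sigma_x  (hbar = 1)
    H = 0.5 * omega * np.array([[0.0, 1.0], [1.0, 0.0]])
    c = np.array([1.0 + 0j, 0.0 + 0j])   # start in the lower level
    # Short-step propagator built from the eigendecomposition of H
    w, V = np.linalg.eigh(H)
    U = V @ np.diag(np.exp(-1j * w * dt)) @ V.conj().T
    for _ in range(n_steps):
        c = U @ c
    return np.abs(c) ** 2        # populations of the two levels

pops = propagate_pi_pulse()
```

A pulse area of π inverts the populations; halving the area (a π/2 pulse) instead creates an equal superposition, which is the building block for the wave-packet compositions mentioned above.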

The aim of this paper was to present a simple and fast way of simulating Nuclear Magnetic Resonance signals using the Bloch equations. These phenomenological equations describe the classical behavior of macroscopic magnetization and are easily simulated using rotation matrices. Many NMR pulse sequences can be simulated with this formalism, allowing a quantitative description of the influence of many experimental parameters. Finally, the paper presents simulations of conventional sequences such as Single Pulse, Inversion Recovery, Spin Echo and CPMG.
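A minimal example of the rotation-matrix approach described above (a sketch under simple on-resonance assumptions, not the paper's code): in an inversion-recovery sequence, the 180° pulse is a rotation of the magnetization about x, and relaxation during the delay acts separately on the transverse and longitudinal components.

```python
import numpy as np

def rot_x(angle):
    """Rotation of the magnetization vector about the x axis (RF pulse)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[1, 0, 0],
                     [0, c, s],
                     [0, -s, c]])

def relax(M, t, T1, T2, M0=1.0):
    """Free relaxation for a time t (on resonance, no precession term)."""
    E1, E2 = np.exp(-t / T1), np.exp(-t / T2)
    return np.array([M[0] * E2, M[1] * E2, M0 + (M[2] - M0) * E1])

def inversion_recovery(ti, T1=1.0, T2=0.1):
    """180x pulse, delay ti, then read out Mz (read pulse omitted)."""
    M = np.array([0.0, 0.0, 1.0])   # equilibrium magnetization
    M = rot_x(np.pi) @ M            # inversion pulse
    M = relax(M, ti, T1, T2)
    return M[2]

mz = inversion_recovery(0.5, T1=1.0)
# Mz(ti) follows the textbook recovery curve M0 * (1 - 2 exp(-ti/T1))
```

Spin Echo and CPMG are simulated the same way, by interleaving further rotation and relaxation steps.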

The formalism of supersymmetric quantum mechanics can be extended to arbitrary dimensions. We introduce this formalism and explore its utility in solving the Schrödinger equation for a two-dimensional potential. This potential applies to several systems in physics and chemistry; for instance, it can be used to study the benzene molecule.
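The factorization at the heart of the formalism (the standard one-dimensional textbook form, with units ħ = 2m = 1, not the article's specific two-dimensional potential) writes the Hamiltonian in terms of a superpotential W(x):

```latex
A = \frac{d}{dx} + W(x), \qquad A^{\dagger} = -\frac{d}{dx} + W(x),
\\[4pt]
H_{-} = A^{\dagger}A = -\frac{d^{2}}{dx^{2}} + W^{2}(x) - W'(x),
\qquad
H_{+} = A A^{\dagger} = -\frac{d^{2}}{dx^{2}} + W^{2}(x) + W'(x).
```

The partner Hamiltonians H± share their spectra except possibly for the ground state of H−, which satisfies Aψ₀ = 0 and hence ψ₀(x) ∝ exp(−∫ W dx); the higher-dimensional extension discussed in the article generalizes this factorization.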

An essential question in both linguistic and cognitive theories is how language describes causal relations. Finnish has a special type of causative verb, derived with the suffix (U)ttA, used to express that the action in question is carried out by someone other than the subject referent, e.g. Maija haetuttaa Matilla kirjastosta kirjan 'Maija has Matti fetch the book from the library' and Matti juoksuttaa Maijan kaupunkiin 'Matti makes Maija run to town'. The aim of this dissertation was to investigate, using the causatives of social dominance as examples, the nature of word formation and the concept of 'social causation'. To describe the regular argument structure of the derivations, as a coupling between syntax and semantics, their prototypical structures were established. These verbs, however, also have specific domains of use that highlight variations in social relations. Distinctive properties of the causatives of social dominance were included in the study and defined as constructions. The constructions comprise special syntactic and/or semantic elements and, beyond that, pragmatic evaluative implications. The social dimension of the verbs investigated is built up from properties connected with the type of causation, the agentive properties of the arguments (activity or passivity, dominance, control, volitionality and responsibility), and conventionalized attitudes and interpretations. An example of a so-called 'interpretation construction' is the expression of negative dominance, called the Abuse-of-Power construction in the dissertation. This construction includes the speaker's strongly critical stance toward the situation expressed, e.g. Asiakas juoksuttaa lentoemäntää 'The customer keeps the flight attendant running'. These constructions fill an important function in linguistic communication: describing deviations from social norms and adding expressivity to the message.
Methodologically, this dissertation combines theories based on actual language use with theoretical linguistic analysis. The conceptual lexical structure of the verbs and constructions, together with the prototype structures, was analysed using the tools of conceptual semantics, as developed by Jackendoff, Nikanne and Pörn.

ABSTRACT This study presents the design process of agricultural machinery and implements by means of a reference model, formulated to explain the development activities for new products, serve as a guideline for training human resources, and assist in formalizing the process in small and medium-sized businesses (SMB), i.e., those with up to 500 employees. The methodology included process modeling, carried out from case studies in SMBs, and the study of reference models in the literature. The modeling formalism used was based on the IDEF0 standard, which identifies the dimensions required for detailing the model: input information, activities, tasks, knowledge domains, mechanisms, controls, and the information produced. These dimensions were organized in spreadsheets and graphs. As a result, a reference model with 27 activities and 71 tasks was obtained, distributed over the four phases of the design process. The model was evaluated by the companies participating in the case studies and by experts, who concluded that it explains the actions needed to develop new products in SMBs.
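The dimensions listed above map closely onto IDEF0's ICOM convention (inputs, controls, outputs, mechanisms), extended here with tasks. They can be represented roughly like this; this is an illustrative data-structure sketch, and the example field values are invented, not taken from the paper's 27 activities:

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    """One activity of a reference model, described in IDEF0 (ICOM) terms."""
    name: str
    inputs: list = field(default_factory=list)     # information consumed
    controls: list = field(default_factory=list)   # constraints and guidelines
    mechanisms: list = field(default_factory=list) # people/tools that perform it
    outputs: list = field(default_factory=list)    # information produced
    tasks: list = field(default_factory=list)      # finer-grained steps

a = Activity(
    name="Define design specifications",
    inputs=["customer requirements"],
    controls=["company norms"],
    mechanisms=["design team"],
    outputs=["target specifications"],
    tasks=["rank requirements", "set target values"],
)
```

Chaining such activities, with the outputs of one feeding the inputs of the next, gives the phase-by-phase structure of a reference model of this kind.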

This thesis addresses the use of covariant phase space observables in quantum tomography. Necessary and sufficient conditions for the informational completeness of covariant phase space observables are proved, and some state reconstruction formulae are derived. Different schemes for measuring phase space observables are considered. Special emphasis is given to the quantum optical eight-port homodyne detection scheme and, in particular, to the effect of non-unit detector efficiencies on the measured observable. It is shown that the informational completeness of the observable does not depend on the efficiencies. As a related problem, the possibility of reconstructing the position and momentum distributions from the marginal statistics of a phase space observable is considered. It is shown that informational completeness of the phase space observable is neither necessary nor sufficient for this procedure. Two methods for determining the distributions from the marginal statistics are presented. Finally, two alternative methods for determining the state are considered, and some of their shortcomings compared to the phase space method are discussed.

In the field of molecular biology, scientists for decades adopted a reductionist perspective in their inquiries, being predominantly concerned with the intricate mechanistic details of subcellular regulatory systems. Integrative thinking had nonetheless been applied at a smaller scale in molecular biology, to understand the underlying processes of cellular behaviour, for at least half a century. It was not until the genomic revolution at the end of the previous century that model building was required to account for the systemic properties of cellular activity. Our system-level understanding of cellular function is to this day hindered by drastic limitations in our capability to predict cellular behaviour in a way that reflects system dynamics and system structures. To this end, systems biology aims for a system-level understanding of functional intra- and inter-cellular activity. Modern biology produces a high volume of data, whose comprehension we cannot even aim for in the absence of computational support. Computational modelling hence bridges modern biology to computer science, enabling a number of assets that prove invaluable in the analysis of complex biological systems: a rigorous characterization of the system structure, simulation techniques, perturbation analysis, etc. Computational biomodels have grown considerably in size in the past years, with major contributions made towards the simulation and analysis of large-scale models, starting with signalling pathways and culminating with whole-cell models, tissue-level models, organ models, and full-scale patient models. The simulation and analysis of models of such complexity very often requires the integration of various sub-models, entwined at different levels of resolution and whose organization spans several levels of hierarchy.
This thesis revolves around the concept of quantitative model refinement in relation to the process of model building in computational systems biology. It proposes a sound computational framework for the stepwise augmentation of a biomodel. One starts with an abstract, high-level representation of a biological phenomenon, which is materialised into an initial model that is validated against a set of existing data. Subsequently, the model is refined to include more details regarding its species and/or reactions. The framework is employed in the development of two models, one for the heat shock response in eukaryotes and the other for the ErbB signalling pathway. The thesis spans several formalisms used in computational systems biology that are inherently quantitative (reaction-network models, rule-based models, and Petri net models), as well as a recent, intrinsically qualitative formalism: reaction systems. The choice of modelling formalism is, however, determined by the nature of the question the modeller aims to answer. Quantitative model refinement turns out to be not only essential in the model development cycle, but also beneficial for the compilation of large-scale models, whose development requires the integration of several sub-models across various levels of resolution and underlying formal representations.
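The core idea of quantitative (data) refinement, replacing one species by several subspecies while preserving the original model's numerical behaviour, can be sketched with a toy reaction network. This is an illustration of the general concept, not one of the thesis's models: in the decay reaction A → ∅ with rate constant k, refine A into subspecies A1 and A2 carrying the same rate constant; the observable A1 + A2 then reproduces A exactly.

```python
import numpy as np

def simulate_decay(x0, k, dt=1e-3, steps=1000):
    """Euler integration of dx/dt = -k x, applied to each species
    independently (the reactions here do not couple the species)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x - dt * k * x
    return x

k = 2.0
abstract_model = simulate_decay([1.0], k)        # one species A
refined_model = simulate_decay([0.3, 0.7], k)    # A refined into A1, A2
# The refinement preserves the observable: A(t) == A1(t) + A2(t)
```

Choosing the refined rate constants so that the summed dynamics match the abstract model is exactly the consistency condition that makes a refinement "quantitative"; for nonlinear kinetics the condition is correspondingly more elaborate.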

Book review

Human beings have always strived to preserve their memories and spread their ideas. In the beginning this was done through human interpretation, such as telling stories and creating sculptures. Later, technological progress made it possible to create a recording of a phenomenon: first as an analogue recording onto a physical object, and later digitally, as a sequence of bits to be interpreted by a computer. By the end of the 20th century, technological advances had made it feasible to distribute media content over a computer network instead of on physical objects, thus enabling the concept of digital media distribution. Many digital media distribution systems already exist, and their continued, and in many cases increasing, usage indicates high interest in their future enhancement and enrichment. By looking at these digital media distribution systems, we have identified three main areas of possible improvement: network structure and coordination, transport of content over the network, and the encoding used for the content. In this thesis, our aim is to show that improvements in performance, efficiency, and availability can be made in conjunction with improvements in software quality and reliability through the use of formal methods: mathematical approaches to reasoning about software that allow us to prove its correctness, together with its desirable properties. We envision a complete media distribution system based on a distributed architecture, such as peer-to-peer networking, in which the different parts of the system have been formally modelled and verified. Starting with the network itself, we show how it can be formally constructed and modularised in the Event-B formalism, such that the modelling of one node is separated from the modelling of the network itself. We also show how the piece selection algorithm in the BitTorrent peer-to-peer transfer protocol can be adapted for on-demand media streaming, and how this can be modelled in Event-B.
Furthermore, we show how modelling one peer in Event-B can give results similar to simulating an entire network of peers. Going further, we introduce a formal specification language for content transfer algorithms and show that having such a language makes these algorithms easier to understand. We also show how generating Event-B code from this language results in less complexity compared to creating the models from written specifications. We also consider the decoding part of a media distribution system, showing how video decoding can be done in parallel. This is based on formally defined dependencies between frames and blocks in a video sequence; we have shown that this step, too, can be performed in a way that is mathematically proven correct. Our modelling and proving in this thesis is, for the most part, tool-based. This demonstrates the maturity and reliability of formal methods, and thus advocates their more widespread use in the future.
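The adaptation of BitTorrent piece selection to on-demand streaming mentioned above is commonly realized by restricting rarest-first choice to a sliding playback window. The sketch below illustrates that general strategy in Python; it is not the thesis's Event-B model, and the window policy and tie-breaking rule are assumptions:

```python
def next_piece(have, rarity, playback_pos, window=8):
    """Pick the next piece to request: among missing pieces inside the
    playback window, choose the rarest (ties -> earliest index); if the
    window is complete, fall back to plain rarest-first over the rest."""
    n = len(have)
    in_window = [i for i in range(playback_pos, min(playback_pos + window, n))
                 if not have[i]]
    candidates = in_window or [i for i in range(n) if not have[i]]
    if not candidates:
        return None          # nothing left to download
    return min(candidates, key=lambda i: (rarity[i], i))

have = [True, True, False, True, False, False, True, False]
rarity = [3, 1, 4, 2, 1, 5, 2, 1]   # copies seen in the swarm (lower = rarer)
piece = next_piece(have, rarity, playback_pos=2, window=4)
```

The window keeps imminent playback deadlines satisfied (sequential availability), while rarest-first within and beyond the window preserves the swarm-health property that makes BitTorrent efficient.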

Optimization of quantum measurement processes plays a pivotal role in carrying out better, i.e., more accurate or less disruptive, measurements and experiments on a quantum system. In particular, convex optimization, i.e., identifying the extreme points of the convex sets and subsets of quantum measuring devices, plays an important part in quantum optimization, since the typical figures of merit for measuring processes are affine functionals. In this thesis, we discuss results determining the extreme quantum devices and their relevance, e.g., in questions related to quantum compatibility. In particular, we see that a compatible device pair in which one device is extreme can be joined into a single apparatus in an essentially unique way. Moreover, we show that the question of whether a pair of quantum observables can be measured jointly can often be formulated in a weaker form when some of the observables involved are extreme. Another major line of research treated in this thesis deals with the convex analysis of special restricted sets of quantum devices: covariance structures or, in particular, generalized imprimitivity systems. Some results on the structure of covariant observables and instruments are listed, as well as results identifying the extreme points of covariance structures in quantum theory. As a special case study, not published anywhere before, we examine the structure of Euclidean-covariant localization observables for spin-0 particles. We also discuss the general form of Weyl-covariant phase-space instruments. Finally, certain optimality measures originating from convex geometry are introduced for quantum devices: boundariness, which measures how 'close' a quantum apparatus is to the algebraic boundary of the device set, and the robustness of incompatibility, which quantifies the level of incompatibility of a quantum device pair by measuring the highest amount of noise the pair tolerates without becoming compatible.
Boundariness is further associated with minimum-error discrimination of quantum devices, and the robustness of incompatibility is shown to behave monotonically under certain compatibility-non-decreasing operations. Moreover, the value of the robustness of incompatibility is given for a few special device pairs.
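For a concrete feel for a noise-robustness quantity of this kind (a toy computation, not taken from the thesis): for unbiased noisy qubit observables along orthogonal axes with sharpnesses λx and λz, joint measurability holds exactly when λx² + λz² ≤ 1, a classical result due to Busch. The critical uniform-noise admixture t for an initially sharp X, Z pair therefore solves 2(1 − t)² = 1, i.e. t* = 1 − 1/√2.

```python
import math

def compatible(lx, lz):
    """Busch criterion for joint measurability of unbiased noisy qubit
    observables along orthogonal axes with sharpnesses lx, lz."""
    return lx ** 2 + lz ** 2 <= 1.0

def critical_noise(tol=1e-12):
    """Bisect for the smallest noise t making (1-t)-sharp X and Z
    observables jointly measurable."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if compatible(1 - mid, 1 - mid):
            hi = mid
        else:
            lo = mid
    return hi

t_star = critical_noise()   # analytically, 1 - 1/sqrt(2)
```

The noise model assumed here mixes each sharp observable with a trivial coin-toss observable; other noise models give different thresholds, which is why robustness measures must specify the admissible noise.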

This paper examines the relation between intuition and concept in Kant in light of John McDowell's neo-Kantian position that intuitions are concept-laden. The focus is on Kant's twofold pronouncement that thoughts without content are empty and that intuitions without concepts are blind. I show that intuitions as singular representations are not instances of passive data intake but the result of the synthetic unification of the given manifold of the senses by the power of the imagination under the guidance of the understanding. Against McDowell I argue that the amenability of intuitions to conceptual determination is due not to some pre-existing, absolute conceptuality of the real but to the "work of the subject." On a more programmatic level, this paper seeks to demonstrate the limitations of a selective appropriation of Kant and the philosophical potential of a more comprehensive and thorough consideration of his work. Section 1 addresses the unique balance in Kant's philosophy between work on particular problems and orientation toward a systematic whole. Section 2 outlines McDowell's take on the Kantian distinction between intuition and concept in the context of the Kant readings by Sellars and Strawson. Section 3 exposes McDowell's relapse into the Myth of the Given. Section 4 proposes a reading of Kant's theoretical philosophy as an epistemology of metaphysical cognition. Section 5 details Kant's original account of sensible intuition in the Inaugural Dissertation of 1770. Section 6 presents the transition from the manifold of the senses to the synthesis in the imagination and the unification through the categories in the Critique of Pure Reason (1781 and 1787). Section 7 addresses Kant's formalism in epistemology and metaphysics.