965 results for Gelfand-Dickey formalism
Abstract:
The 4πβ-γ coincidence counting method and its close relatives are widely used for the primary standardization of radioactivity. Both the general formalism and the specific implementation of these methods have been well documented. In particular, previous papers contain the extrapolation equations used for various decay schemes, methods for determining model parameters and, in some cases, tabulated uncertainty budgets. Two things often lacking from experimental reports are the rationale for estimating uncertainties in a specific way and the details of exactly how a specific component of uncertainty was estimated. Furthermore, correlations among the components of uncertainty are rarely mentioned. To fill these gaps, the present article shares best practices from a few practitioners of this craft. We explain and demonstrate with examples how these approaches can be used to estimate the uncertainty of the reported massic activity. We describe uncertainties due to measurement variability, extrapolation functions, dead-time and resolving-time effects, gravimetric links, and nuclear and atomic data. Most importantly, a thorough understanding of the measurement system and its response to the decay under study can be used to derive a robust estimate of the measurement uncertainty.
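For context, the non-extending dead-time correction underlying the dead-time component mentioned above is commonly written as follows (a textbook form, not a formula taken from this paper):

\[ n = \frac{m}{1 - m\tau} \]

where m is the observed count rate, \tau the non-extending dead time, and n the corrected (true) rate; the uncertainty in \tau propagates into the massic activity through this relation.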
Abstract:
This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of spatial, temporal and spatio-temporal clusters of environmental point data. The developed clustering methods were applied to both simulated datasets and real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document. Environmental phenomena can typically be modelled as stochastic point processes where each event, e.g. the forest fire ignition point, is characterised by its spatial location and occurrence in time. Additionally, information such as burned area, ignition causes, land use, topographic, climatic and meteorological features, etc., can also be used to characterise the studied phenomenon. Thereby, the space-time pattern characterisation represents a powerful tool for understanding the distribution and behaviour of the events and their correlation with underlying processes, for instance socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point process measures for both global (e.g. the Morisita index, the box-counting fractal method, the multifractal formalism and Ripley's K-function) and local (e.g. scan statistics) analysis. Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most of these measures are global in character and do not consider the complex spatial constraints, high variability and multivariate nature of the events. Therefore, we propose a statistical framework that takes into account the complexities of the geographical space where phenomena take place, by introducing the Validity Domain concept and carrying out clustering analyses on data under different geographical space constraints, hence assessing the relative degree of clustering of the real distribution. Moreover, exclusively for the forest fire case, this research proposes two new methodologies: one for defining and mapping the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructures, and one for predicting fire ignition susceptibility. In this regard, the main objective of this thesis was to carry out basic statistical/geospatial research with a strong applied component, to analyse and describe complex phenomena, and to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular of forest fire occurrences. Thus, this thesis provides a response to the increasing demand for both environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, retrospective success analysis, etc. The major contributions of this work were presented at national and international conferences and published in 5 scientific journals. National and international collaborations were also established and successfully accomplished.
Abstract:
This review has tried to collect and correlate all the various equations for the g matrix of strong-field d5 systems obtained from different basis sets using full electron and hole formalism calculations. It has corrected mistakes found in the literature and shown how the failure to properly take into account symmetry boundary conditions has produced a variety of apparently inconsistent equations in the literature. The review has reexamined the problem of spin-orbit interaction with excited t4e states and finds that earlier reports that it is zero in octahedral symmetry are not correct. It has shown how redefining the x, y, and z axes of the principal coordinate system simplifies the analysis of experimental g values with these equations, compared to previous methods.
Abstract:
This paper deals with Carathéodory's formulation of the second law of thermodynamics. The material is presented in a didactic way, which allows a second-year undergraduate student to follow the formalism. An application is made to an ideal gas with two independent variables. A criticism of Carnot's formulation of the second law and an investigation of the historical origins of the Carathéodory formalism are also presented.
Abstract:
Statistical-mechanics Monte Carlo simulation is reviewed as a formalism to study the thermodynamic properties of liquids. Considering the importance of free energy changes in chemical processes, the thermodynamic perturbation theory implemented in the Monte Carlo method is discussed. The representation of molecular interactions by the Lennard-Jones and Coulomb potential functions is also discussed, and charges derived from the quantum molecular electrostatic potential are presented as a useful methodology to generate an adequate set of partial charges for use in liquid simulations.
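As a concrete illustration of the interaction model named above, a minimal Python sketch of the pairwise Lennard-Jones plus Coulomb energy follows; the parameter values and unit conventions are placeholders chosen for the example, not values taken from the review.

    # Pairwise Lennard-Jones + Coulomb energy, as used in liquid-phase
    # Monte Carlo simulations; parameter values below are illustrative only.
    COULOMB_K = 332.06371  # kcal*A/(mol*e^2), a common force-field constant

    def pair_energy(r, epsilon, sigma, q_i, q_j):
        """Interaction energy (kcal/mol) of two sites a distance r (A) apart."""
        sr6 = (sigma / r) ** 6
        lennard_jones = 4.0 * epsilon * (sr6 ** 2 - sr6)  # repulsion - dispersion
        coulomb = COULOMB_K * q_i * q_j / r               # point-charge term
        return lennard_jones + coulomb

    # Example: two partially charged sites 3.5 A apart
    print(pair_energy(r=3.5, epsilon=0.155, sigma=3.17, q_i=-0.42, q_j=0.21))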
Abstract:
Consensus is growing that antimicrobial peptides that exert their antibacterial action at the membrane level must reach a local concentration threshold to become active. Studies of peptide interaction with model membranes do identify such disruptive thresholds, but demonstrations of the possible correlation of these with the in vivo onset of activity have only recently been proposed. In addition, such thresholds observed in model membranes occur at local peptide concentrations close to full membrane coverage. In this work we fully develop an interaction model of antimicrobial peptides with biological membranes; by exploring the consequences of the underlying partition formalism we arrive at a relationship that predicts antibacterial activity from two biophysical parameters: the affinity of the peptide for the membrane and the critical bound peptide-to-lipid ratio. A straightforward and robust method to implement this relationship, with potential application to high-throughput screening approaches, is presented and tested. In addition, disruptive thresholds in model membranes and the onset of antibacterial peptide activity are shown to occur over the same range of locally bound peptide concentrations (10 to 100 mM), which reconciles the two types of observations.
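For context, the partition formalism referred to above is commonly written as follows (a standard form; the paper's own notation and derived relationship may differ):

\[ X_b = \frac{K_p \, \gamma_L \, [L]}{1 + K_p \, \gamma_L \, [L]} \]

where X_b is the fraction of membrane-bound peptide, K_p the partition constant, \gamma_L the lipid molar volume, and [L] the lipid concentration; the predicted onset of activity then corresponds to the bound peptide-to-lipid ratio reaching its critical value.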
Abstract:
The foundations of modern Density Functional Theory (DFT), its basic theorems, principles and methodology are presented. This review also discusses concepts that are important and widely used in chemistry but had not been precisely defined until the development of DFT. These concepts were proposed and used on an empirical basis, but their precise definitions are now well established within the DFT formalism. Concepts such as chemical potential (electronegativity), hardness, softness and the Fukui function are presented, and their consequences for the understanding of chemical reactivity are discussed.
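For reference, the standard conceptual-DFT definitions of these descriptors are (in the usual notation, at fixed external potential v(r)):

\[ \mu = \left( \frac{\partial E}{\partial N} \right)_{v(\mathbf{r})} = -\chi, \qquad \eta = \left( \frac{\partial^2 E}{\partial N^2} \right)_{v(\mathbf{r})}, \qquad f(\mathbf{r}) = \left( \frac{\partial \rho(\mathbf{r})}{\partial N} \right)_{v(\mathbf{r})} \]

where \mu is the chemical potential (the negative of the electronegativity \chi), \eta the hardness (with the softness given by S = 1/(2\eta) in one common convention), and f(\mathbf{r}) the Fukui function.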
Abstract:
In this paper we model the multicointegration relation, allowing for one structural break. Since multicointegration is a particular case of polynomial or I(2) cointegration, our proposal can also be applied in these cases. The paper proposes the use of a residual-based Dickey-Fuller class of statistic that accounts for one known or unknown structural break. The finite-sample performance of the proposed statistic is investigated using Monte Carlo simulations, which reveal that the statistic shows good properties in terms of empirical size and power. We complete the study with an empirical application to the sustainability of the US external deficit. Contrary to existing evidence, the consideration of one structural break leads to a conclusion in favour of the sustainability of the US external deficit.
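For context, residual-based Dickey-Fuller statistics of this class are generically built from a regression of the following form (a generic sketch; the paper's exact specification, including how the break dummies enter the first stage, is given there):

\[ \Delta \hat{u}_t = \rho \, \hat{u}_{t-1} + \sum_{i=1}^{k} \varphi_i \, \Delta \hat{u}_{t-i} + \varepsilon_t \]

where \hat{u}_t are the residuals from the first-stage (multi)cointegrating regression; the null of no multicointegration is rejected when the t-statistic on \rho is sufficiently negative.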
Abstract:
The internal energy dependence of the competitive unimolecular dissociation channels of dimethyl ether was studied with the statistical RRKM formalism. The C-O and C-H fission reactions, the 1,2-H and 1,3-H shifts, and the 1,1-H2 and 1,3-H2 molecular eliminations are discussed as a function of the energy dependence of k_a(E*), the microcanonical rate constant for the production of transition states. C-O fission is the dominant process, while the reaction channels involving C-H fission, 1,1-H2 and 1,3-H2 elimination and the production of MeOH should be competitive at energies around 400 kJ mol⁻¹. The least favorable process is the CH4 formation channel.
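For reference, the microcanonical rate constant of RRKM theory takes the standard form:

\[ k_a(E^*) = \frac{\sigma \, W^{\ddagger}(E^* - E_0)}{h \, \rho(E^*)} \]

where W^{\ddagger} is the sum of states of the transition state above the critical energy E_0, \rho(E^*) the density of states of the reactant, h Planck's constant, and \sigma the reaction-path degeneracy.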
Abstract:
Social, technological, and economic time series are divided by events which are usually assumed to be random, albeit with some hierarchical structure. It is well known that the interevent statistics observed in these contexts differ from the Poissonian profile in being long-tailed, with resting and active periods interwoven. Understanding the mechanisms generating such consistent statistics has therefore become a central issue. The approach we present is taken from the continuous-time random-walk formalism and represents an analytical alternative to the models of nontrivial priority that have recently been proposed. Our analysis also goes one step further by looking at the multifractal structure of the interevent times of human decisions. We here analyze the intertransaction time intervals of several financial markets. We observe that the empirical data exhibit a subtle multifractal behavior. Our model explains this structure by taking the pausing-time density in the form of a superstatistics where the integral kernel quantifies the heterogeneous nature of the executed tasks. A stretched-exponential kernel provides a multifractal profile valid over a certain limited range, and a suggested heuristic analytical profile is capable of covering a broader region.
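For context, a superstatistics form of the pausing-time density of the kind described above can be written as (a common representation; the paper's exact kernel may differ):

\[ \psi(t) = \int_0^{\infty} f(\lambda) \, \lambda \, e^{-\lambda t} \, \mathrm{d}\lambda \]

where the kernel f(\lambda) quantifies the heterogeneity of the executed tasks; a stretched-exponential choice for f yields the multifractal profile over a limited range.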
Abstract:
The design methods and languages targeted to modern System-on-Chip designs face tremendous pressure from ever-increasing complexity, power, and speed requirements. To estimate any of these three metrics, there is a trade-off between accuracy and the level of detail at which the system under design is analyzed. The more detailed the description, the more accurate the simulation will be, but also the more time-consuming. Moreover, a designer wants to make decisions as early as possible in the design flow to avoid costly design backtracking. To answer the challenges posed by System-on-Chip designs, this thesis introduces a formal, power-aware framework, its development methods, and methods to constrain and analyze the power consumption of the system under design. This thesis discusses the power analysis of synchronous and asynchronous systems, including the communication aspects of these systems. The presented framework is built upon the Timed Action System formalism, which offers an environment to analyze and constrain the functional and temporal behavior of the system at a high abstraction level. Furthermore, due to the complexity of System-on-Chip designs, the possibility of abstracting away unnecessary implementation details at higher abstraction levels is an essential part of the introduced design framework. The encapsulation and abstraction techniques, incorporated with procedure-based communication, allow a designer to use the presented power-aware framework in modeling these large-scale systems. The introduced techniques also enable one to subdivide the development of communication and computation into separate tasks, a property that is taken into account in the power analysis as well. Furthermore, the presented framework is developed so that it can be used throughout the design project; in other words, a designer is able to model and analyze systems from an abstract specification down to an implementable specification.
Abstract:
The purpose of the thesis is to analyze whether the returns of the general stock market indices of Estonia, Latvia and Lithuania follow the random walk hypothesis (RWH) and, in addition, whether they are consistent with the weak-form efficiency criterion. The existence of the day-of-the-week anomaly is also examined in the same regional markets. The data consist of daily closing quotes of the OMX Tallinn, Riga and Vilnius total return indices for the sample period from January 3, 2000 to August 28, 2009; the full sample period is also divided into two sub-periods. The RWH is tested by applying three quantitative methods (the Augmented Dickey-Fuller unit root test, a serial correlation test and the non-parametric runs test). Ordinary Least Squares (OLS) regression with dummy variables is employed to detect day-of-the-week anomalies. The RWH is rejected in the Estonian and Lithuanian stock markets. The Latvian stock market exhibits more efficient behaviour, although some evidence of inefficiency is also found, mostly during the first sub-period from 2000 to 2004. Day-of-the-week anomalies are detected in every stock market examined, though no longer during the later sub-period.
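As an illustration of two of the tests named above, a minimal Python sketch using statsmodels follows; the file name and column label are hypothetical placeholders for one of the OMX return series.

    # ADF unit-root test and day-of-the-week dummy regression (sketch).
    # "omx_tallinn.csv" and the "ret" column are hypothetical placeholders.
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.tsa.stattools import adfuller

    returns = pd.read_csv("omx_tallinn.csv", index_col=0,
                          parse_dates=True)["ret"].dropna()

    # Augmented Dickey-Fuller test: H0 = unit root (random walk not rejected)
    adf_stat, p_value, *rest = adfuller(returns, autolag="AIC")
    print(f"ADF statistic = {adf_stat:.3f}, p-value = {p_value:.3f}")

    # OLS with weekday dummies; Monday is the omitted base category
    dummies = pd.get_dummies(returns.index.dayofweek, drop_first=True).astype(float)
    dummies.index = returns.index
    ols = sm.OLS(returns, sm.add_constant(dummies)).fit()
    print(ols.summary())  # significant dummy coefficients suggest an anomaly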
Abstract:
This article intends to answer the question: "what is the best way to evaluate the strength of acids and bases?" The meaning of the word strength, the main acid-base theories (ionotropic and electron-pair), the neutralization reactions and the thermodynamic formalism are considered. Some cases are presented and discussed. In conclusion, evaluating acid-base strength depends on the theory (formalism) as well as on the system and the measurement techniques.
Abstract:
A thermodynamic formalism based on the Gibbs Dividing Surface (GDS) for the description of a solid-fluid interface is presented, so that the adsorption layer is understood as a phase and the adsorption process as the transfer of components between a three-dimensional phase and a two-dimensional one. Using a state equation derived from Henry's law, we show how the Langmuir isotherm is deduced from the Gibbs isotherm. The GDS is also useful for understanding the release of heat by the system as adsorption occurs.
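One standard route for such a deduction (a sketch, assuming a localized-adsorption state equation that reduces to Henry's law, \pi = \Gamma RT, at low coverage) is:

\[ \mathrm{d}\pi = RT \, \Gamma \, \mathrm{d}\ln p, \qquad \pi = -\frac{RT}{\sigma} \ln(1 - \theta), \qquad \theta = \sigma \Gamma \]

Combining the Gibbs isotherm with the state equation gives \mathrm{d}\theta / [\theta(1-\theta)] = \mathrm{d}\ln p, which integrates to the Langmuir isotherm \theta = K p / (1 + K p).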
Abstract:
Potential energy and dipole moment curves for the HCl molecule were computed. Calculations were performed at different levels of theory (DFT, MRCI). Spectroscopic properties are reported and compared with experimental data to validate the theoretical approaches. The interaction of infrared radiation with HCl is simulated using the wave packet formalism. The quantum control model for the population dynamics of the vibrational levels, based on pi-pulse theory, is applied. The results demonstrate that wave packets with a specific composition can be built with short infrared laser pulses, providing the basis for studies of H + HCl collision dynamics with infrared laser excitation.
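For reference, the pi-pulse condition of two-level theory invoked here is (a standard form, not necessarily the paper's notation):

\[ \int_{-\infty}^{\infty} \Omega(t) \, \mathrm{d}t = \pi, \qquad \Omega(t) = \frac{\mu_{if} \, E_0(t)}{\hbar} \]

where \Omega(t) is the Rabi frequency, \mu_{if} the transition dipole moment between the two vibrational levels, and E_0(t) the pulse envelope; a resonant pulse of area \pi transfers the population completely between the two levels.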