996 results for Summed probability functions
Abstract:
There is an ongoing debate on the environmental effects of genetically modified crops, to which this paper aims to contribute. First, data on the environmental impacts of genetically modified (GM) and conventional crops are collected from peer-reviewed journals; second, an analysis is conducted to examine which crop type is less harmful to the environment. Published data on environmental impacts are measured using an array of indicators, and their analysis requires normalisation and aggregation. Drawing on the composite indicators literature, this paper builds composite indicators to measure the impact of GM and conventional crops in three dimensions: (1) non-target key species richness, (2) pesticide use, and (3) aggregated environmental impact. The comparison between the three composite indicators for both crop types allows us to establish not only a ranking of which crop type is preferable for the environment, but also the probability that one crop type outperforms the other from an environmental perspective. Results show that GM crops tend to cause lower environmental impacts than conventional crops for the analysed indicators.
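As a rough illustration of the normalisation-and-aggregation step and of estimating the probability that one crop type outperforms the other, the following Python sketch builds equally weighted composite indicators from synthetic indicator values; the data, weights, and the three-indicator layout are illustrative assumptions, not the values compiled in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative per-study impact values (rows: studies, columns: indicators);
# the real values would come from the peer-reviewed data the paper compiles.
gm   = rng.normal(loc=[0.8, 0.6, 0.7], scale=0.2, size=(30, 3))
conv = rng.normal(loc=[1.0, 1.0, 1.0], scale=0.2, size=(30, 3))

# Min-max normalise on the pooled data so both crop types share a common scale.
pooled = np.vstack([gm, conv])
lo, hi = pooled.min(axis=0), pooled.max(axis=0)
norm = lambda x: (x - lo) / (hi - lo)

weights = np.array([1 / 3, 1 / 3, 1 / 3])            # equal weights, one common choice
gm_ci, conv_ci = norm(gm) @ weights, norm(conv) @ weights

# Bootstrap estimate of the probability that GM shows the lower composite impact.
p = (rng.choice(gm_ci, 5000) < rng.choice(conv_ci, 5000)).mean()
print(f"P(GM composite impact < conventional) ≈ {p:.2f}")
```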
Abstract:
Five-minute-averaged values of the sky clearness, direct, and diffuse indices were used to model the frequency distributions of these variables in terms of optical air mass. From more than four years of solar radiation observations it was found that the variations of the frequency distributions of the three indices with optical air mass for Botucatu, Brazil, are similar to those published for other locations. The proposed models were obtained by linear combination of normalized Beta probability functions, using the observed distributions derived from three years of data. The versatility of these functions allows all three irradiance indices to be modelled to similar levels of accuracy. A comparison with the observed distributions obtained from one year of observations indicates that the models are able to reproduce the observed frequency distributions of all three indices at the 95% confidence level.
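A minimal sketch of the fitting idea, assuming a fixed set of illustrative Beta shape parameters rather than the air-mass-dependent coefficients of the paper: an observed clearness-index histogram is approximated by a non-negative linear combination of normalized Beta probability density functions.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)

# Synthetic stand-in for five-minute clearness-index observations at one air mass.
kt = np.clip(rng.beta(5, 2, 20_000) * 0.9, 0, 1)

# Fixed Beta basis (shape parameters chosen for illustration only).
shapes = [(2, 8), (5, 5), (8, 2), (12, 3)]
edges = np.linspace(0, 1, 51)
centres = 0.5 * (edges[:-1] + edges[1:])
observed, _ = np.histogram(kt, bins=edges, density=True)

basis = np.column_stack([stats.beta.pdf(centres, a, b) for a, b in shapes])

# Non-negative least squares keeps the combination a valid mixture-like density.
weights, _ = optimize.nnls(basis, observed)
modelled = basis @ weights

print("mixture weights (sum ≈ 1):", np.round(weights, 3))
print("RMS fit error:", np.sqrt(np.mean((modelled - observed) ** 2)))
```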
Abstract:
Climate change, whether gradual or sudden, has frequently been invoked as a causal factor to explain many aspects of cultural change during the prehistoric and early historic periods. Critiquing such theories has often proven difficult, not least because of the imprecise dating of many aspects of the palaeoclimate or archaeological records and the difficulties of merging the two strands of research. Here we consider one example from the archaeological record, peatland site construction in Ireland, which has previously been interpreted in terms of social response to climate change, and examine whether close scrutiny of the archaeological and palaeoenvironmental records upholds the climatically deterministic hypotheses. We evaluate evidence for phasing in the temporal distribution of trackways and related sites in Irish peatlands, of which more than 3,500 examples have been recorded, through the examination of ~350 dendrochronological and 14C dates from these structures. The role of climate change in influencing when such sites were constructed is assessed by comparing, visually and statistically, the frequency of sites over the last 4,500 years with well-dated, multi-proxy climate reconstructions from Irish peatlands. We demonstrate that national patterns of “peatland activity” exist, indicating that the construction of sites in bogs was neither a constant nor a random phenomenon. Phases of activity (i.e. periods in which the number of structures increased), as well as the ‘lulls’ that separate them, show no consistent correlation with periods of wetter or drier conditions on the bogs, suggesting that the impetus for the start or cessation of such activity was not climatically determined. We propose that the trigger(s) for peatland site construction in Ireland must instead be sought within the wider, contemporary social background. Perhaps not surprisingly, a comparison with archaeological and palynological evidence shows that peatland activity tends to occur at times of more expansive settlement and land-use, suggesting that the bogs were used when the landscape was being more widely occupied. Interestingly, the lulls in peatland site construction coincide with transitional points between nominal archaeological phases, typically defined on the basis of their material culture, implying that there may indeed have been a cultural discontinuity at these times. © 2012 Elsevier Ltd. All rights reserved.
Abstract:
Electron energy probability functions measured with a passively compensated Langmuir probe in asymmetric capacitively coupled hydrogen and deuterium plasmas exhibit structure. The otherwise relatively continuous distribution appears to have an abrupt peak in electron density near 5 eV. This structure occurs at a higher energy in deuterium than hydrogen and there is a correlation between floating potential and the voltage at which the structure is observed in the second derivative of the I(V) characteristic. While the cause of the structure has yet to be clarified, spectroscopic observations and computer-based hydrogen models indicate that the high energy tail of the distribution is strongly modulated during the radio frequency cycle. The effect of this modulation on plasma properties and probe measurements has yet to be explored. (C) 1999 American Institute of Physics. [S0003-6951(99)00819-0].
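For readers unfamiliar with how such structure is extracted, the sketch below illustrates the Druyvesteyn-style relation between the probe characteristic and the electron energy probability function, with all physical constants dropped and a synthetic I(V) curve standing in for measured data.

```python
import numpy as np

# Minimal sketch of Druyvesteyn-style EEPF extraction (all physical constants dropped):
# the EEPF is proportional to d^2 I / dV^2 of the electron-retardation branch of the
# probe characteristic at energy eps = V_plasma - V.  A synthetic EEPF with a bump near
# 5 eV is integrated twice to mimic a measured I(V), then recovered by differentiation.
eps = np.linspace(0.0, 30.0, 601)                  # electron energy [eV]
Te = 3.0                                           # assumed electron temperature [eV]
eepf_true = np.exp(-eps / Te) + 0.3 * np.exp(-0.5 * (eps - 5.0) ** 2)

# "Measured" current: integrate the EEPF twice over the retarding voltage (arb. units).
d_eps = eps[1] - eps[0]
dI = np.cumsum(eepf_true[::-1])[::-1] * d_eps      # first integral
I = np.cumsum(dI[::-1])[::-1] * d_eps              # second integral -> I(V), with V = -eps

# Recover the EEPF by double numerical differentiation, as done with probe data.
eepf_rec = np.gradient(np.gradient(I, eps), eps)

mask = eps > 2.0                                   # ignore the low-energy thermal peak
residual = eepf_rec[mask] - np.exp(-eps[mask] / Te)   # deviation from the Maxwellian background
print("recovered bump near %.1f eV" % eps[mask][np.argmax(residual)])
```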
Abstract:
The results of comprehensive experimental studies of the operation, stability, and plasma parameters of low-frequency (0.46 MHz) inductively coupled plasmas sustained by an internal oscillating rf current are reported. The rf plasma is generated using a custom-designed configuration of the internal rf coil that comprises two perpendicular sets of eight currents in each direction. Various diagnostic tools, such as magnetic probes, optical emission spectroscopy, and an rf-compensated Langmuir probe, were used to investigate the electromagnetic, optical, and global properties of the argon plasma over wide ranges of applied rf power and gas feedstock pressure. It is found that the uniformity of the electromagnetic field inside the plasma reactor is improved compared to conventional inductively coupled plasma sources with an external flat-coil configuration. A reasonable agreement between the experimental data and the computed electromagnetic field topography inside the chamber is reported. The Langmuir probe measurements reveal that the spatial profiles of the electron density, effective electron temperature, plasma potential, and electron energy distribution/probability functions feature a high degree of radial and axial uniformity and a weak azimuthal dependence, consistent with earlier theoretical predictions. As the input rf power increases, the azimuthal dependence of the global plasma parameters vanishes. The obtained results demonstrate that by introducing internal oscillating rf currents one can noticeably improve the uniformity of the electromagnetic field topography, the rf power deposition, and the plasma density in the reactor.
Abstract:
Control and diagnostics of low-frequency (∼ 500 kHz) inductively coupled plasmas for chemical vapor deposition (CVD) of nano-composite carbon nitride-based films are reported. The relation between the discharge control parameters, the plasma electron energy distribution/probability functions (EEDF/EEPF), and the elemental composition of the deposited C-N based thin films is investigated. The Langmuir probe technique is employed to monitor the plasma density and potential, the effective electron temperature, and the EEDFs/EEPFs in Ar + N2 + CH4 discharges. It is revealed that, by varying the RF power and gas composition/pressure, one can engineer the EEDFs/EEPFs to enhance the desired plasma-chemical gas-phase reactions and thus control the chemical structure of the films. Auxiliary diagnostic tools for studying the RF power deposition, plasma composition, stability, and optical emission are discussed as well.
Abstract:
Motivated by the observation of the rate effect on material failure, a model of nonlinear and nonlocal evolution is developed that includes both stochastic and dynamic effects. In phase space a transitional region prevails that separates globally stable failure behavior from catastrophic failure. Several probability functions are found to characterize the distinctive features of the evolution arising from different nucleation, growth, and coalescence rates. The results may provide a better understanding of material failure.
Abstract:
We report an observation of femtosecond optical fluctuations of transmitted light when a coherent femtosecond pulse propagates through a random medium. They result from random interference among scattered waves arriving from different trajectories in the time domain. Temporal fluctuations are measured using cross-correlation frequency-resolved optical gating. It is shown that a femtosecond pulse is broadened and its shape distorted while it propagates in a random medium. The real and imaginary components of the transmitted electric field are also severely distorted. The average of the fluctuating transmitted pulses yields a smooth profile, and the probability functions show good agreement with a Gaussian distribution. (c) 2007 Elsevier B.V. All rights reserved.
Abstract:
Two electrical techniques that are frequently used to characterize radio frequency plasmas are described: current-voltage probes for plasma power input and compensated Langmuir probes for electron energy probability functions and other parameters. The following examples of the use of these techniques, sometimes in conjunction with other diagnostic methods, are presented: plasma source standardization, plasma system comparison, power efficiency, plasma modelling and complex processing plasma mechanisms.
Abstract:
Peatlands are a key component of the global carbon cycle. Chronologies of peatland initiation are typically based on compiled basal peat radiocarbon (14C) dates and frequency histograms of binned calibrated age ranges. However, such compilations are problematic because poor-quality 14C dates are commonly included and because frequency histograms of binned age ranges introduce chronological artefacts that bias the record of peatland initiation. Using a published compilation of 274 basal 14C dates from Alaska as a case study, we show that nearly half the 14C dates are inappropriate for reconstructing peatland initiation, and that the temporal structure of peatland initiation is sensitive to sampling biases and treatment of calibrated 14C dates. We present revised chronologies of peatland initiation for Alaska and the circumpolar Arctic based on summed probability distributions of calibrated 14C dates. These revised chronologies reveal that northern peatland initiation lagged abrupt increases in atmospheric CH4 concentration at the start of the Bølling–Allerød interstadial (Termination 1A) and the end of the Younger Dryas chronozone (Termination 1B), suggesting that northern peatlands were not the primary drivers of the rapid increases in atmospheric CH4. Our results demonstrate that subtle methodological changes in the synthesis of basal 14C ages lead to substantially different interpretations of temporal trends in peatland initiation, with direct implications for the role of peatlands in the global carbon cycle.
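The summing step behind such chronologies can be sketched as follows; in practice each 14C date would first be calibrated against a calibration curve such as IntCal (e.g. with OxCal or the R package rcarbon), whereas here calibrated dates are crudely approximated by normal densities so that only the construction of the summed probability distribution (SPD) is shown, on made-up dates.

```python
import numpy as np

# Minimal sketch of a summed probability distribution (SPD) of 14C dates.  The dates
# below are invented and each "calibrated" date is approximated by a normal density on
# the calendar scale; only the summing step is illustrated.
dates = [(11_200, 80), (11_150, 60), (10_300, 90), (10_250, 70), (9_800, 100)]  # (cal BP, 1 sigma)

grid = np.arange(9_000, 12_001)                   # calendar years BP
spd = np.zeros_like(grid, dtype=float)
for mu, sigma in dates:
    pdf = np.exp(-0.5 * ((grid - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    spd += pdf / pdf.sum()                        # each date contributes unit probability mass

spd /= len(dates)                                 # normalise the sum to unit area
print("SPD peaks at", grid[np.argmax(spd)], "cal BP")
```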
Abstract:
This doctoral thesis, entitled Computational Development of Quantum Molecular Similarity, deals fundamentally with the computational aspects of similarity measures based on the comparison of electron density functions.

The first chapter, Quantum Similarity, is introductory. It describes electron probability density functions and their significance within the framework of quantum mechanics. Their essential aspects and the mathematical conditions they must satisfy are made explicit, with a view to a better understanding of the electron density models that are proposed. Electron densities are presented, mentioning the Hohenberg-Kohn theorems and outlining Bader's theory, as fundamental quantities in the description of molecules and in the understanding of their properties.

The chapter Models of Molecular Electron Densities presents original computational procedures for fitting density functions to models expanded in terms of 1s Gaussians centred on the nuclei. The physical and mathematical constraints associated with probability distributions are introduced rigorously in the procedure named the Atomic Shell Approximation (ASA). This procedure, implemented in the ASAC program, starts from a nearly complete functional space from which the functions, or shells, of the expansion are selected variationally, in accordance with the non-negativity requirements. The quality of these densities and of the derived similarity measures is verified extensively. The ASA model is extended to dynamic representations, physically more accurate in that they are affected by nuclear vibrations, in order to explore the effect of the damping of the nuclear peaks on molecular similarity measures. Comparison of the dynamic densities with the static ones reveals a rearrangement in the dynamic densities, in line with what would constitute a manifestation of the quantum Le Chatelier principle. The ASA procedure, explicitly consistent with the N-representability conditions, is also applied to the direct determination of hydrogen-like electron densities in a density functional theory context.

The chapter Global Maximization of the Similarity Function presents original algorithms for determining the maximum overlap of molecular electron densities. Quantum molecular similarity measures are identified with the maximum overlap, so that the distance between molecules is measured independently of the reference frames in which the electron densities are defined. Starting from the global solution in the limit of densities infinitely compacted onto the nuclei, three levels of approximation are proposed for a systematic, non-stochastic exploration of the similarity function, enabling efficient identification of the global maximum as well as of the various local maxima. An original parameterisation of the overlap integrals through fits to Lorentzian functions is also proposed as a computational acceleration technique. In structure-activity relationship practice, these advances make possible the efficient implementation of quantitative similarity measures and, in parallel, provide a fully automatic molecular alignment methodology.

The chapter Similarities of Atoms in Molecules describes an algorithm for comparing Bader atoms, i.e. three-dimensional regions bounded by zero-flux surfaces of the electron density function. The quantitative character of these similarities makes possible a rigorous measurement of the chemical notion of transferability of atoms and functional groups. The zero-flux surfaces and the integration algorithms used have been published recently and constitute the most accurate approach for the calculation of atomic properties.

Finally, in the chapter Similarities in Crystal Structures, an original definition of similarity is proposed, specific to the comparison of the notions of smoothness, or softness, in the distribution of phonons associated with the crystal structure. These concepts arise in superconductivity studies owing to the influence of electron-phonon interactions on the transition temperatures to the superconducting state. Applying this methodology to the analysis of BEDT-TTF salts reveals structural correlations between superconducting and non-superconducting salts, in agreement with hypotheses put forward in the literature on the relevance of certain interactions.

The thesis concludes with an appendix containing the ASAC program, an implementation of the ASA algorithm, and a final chapter with bibliographic references.
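As a rough illustration of overlap-based similarity measures for densities expanded in nucleus-centred 1s Gaussians (the ASA form described above), the following sketch evaluates the density-overlap integral in closed form and normalises it with the Carbó index; the shell coefficients, exponents, and geometries are invented for the example and are not fitted ASA parameters.

```python
import numpy as np

def gaussian_overlap(a, Ra, b, Rb):
    """Closed form of the integral of exp(-a|r-Ra|^2) * exp(-b|r-Rb|^2) over all space."""
    p = a + b
    r2 = np.sum((np.asarray(Ra) - np.asarray(Rb)) ** 2)
    return (np.pi / p) ** 1.5 * np.exp(-a * b / p * r2)

def density_overlap(mol_a, mol_b):
    """Z_AB for densities rho = sum_i c_i exp(-a_i |r - R_i|^2)."""
    return sum(ca * cb * gaussian_overlap(aa, Ra, ab, Rb)
               for ca, aa, Ra in mol_a for cb, ab, Rb in mol_b)

# Each shell: (coefficient, exponent, centre).  Two toy "molecules", one slightly shifted;
# real ASA expansions would use fitted, non-negative shell parameters.
mol_A = [(1.0, 2.0, (0.0, 0.0, 0.0)), (0.5, 1.0, (1.4, 0.0, 0.0))]
mol_B = [(1.0, 2.0, (0.2, 0.0, 0.0)), (0.5, 1.0, (1.6, 0.0, 0.0))]

z_ab = density_overlap(mol_A, mol_B)
carbo = z_ab / np.sqrt(density_overlap(mol_A, mol_A) * density_overlap(mol_B, mol_B))
print(f"overlap similarity Z_AB = {z_ab:.4f}, Carbó index = {carbo:.4f}")
```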
Abstract:
The estimation of the long-term wind resource at a prospective site based on a relatively short on-site measurement campaign is an indispensable task in the development of a commercial wind farm. The typical industry approach is based on the measure-correlate-predict (MCP) method, where a relational model between the site wind velocity data and the data obtained from a suitable reference site is built from concurrent records. In a subsequent step, a long-term prediction for the prospective site is obtained from a combination of the relational model and the historic reference data. In the present paper, a systematic study is presented where three new MCP models, together with two published reference models (a simple linear regression and the variance ratio method), have been evaluated based on concurrent synthetic wind speed time series for two sites, simulating the prospective and the reference site. The synthetic method has the advantage of generating time series with the desired statistical properties, including Weibull scale and shape factors, required to evaluate the five methods under all plausible conditions. In this work, first a systematic discussion of the statistical fundamentals behind MCP methods is provided and three new models, one based on a nonlinear regression and two (termed kernel methods) derived from the use of conditional probability density functions, are proposed. All models are evaluated by using five metrics under a wide range of values of the correlation coefficient, the Weibull scale, and the Weibull shape factor. Only one of the models, a kernel method based on bivariate Weibull probability functions, is capable of accurately predicting all performance metrics studied.
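A minimal sketch of the two published reference models named above, simple linear regression and the variance ratio method, applied to synthetic Weibull wind-speed records; the site parameters and correlation level are assumptions chosen only to show the mechanics of the measure-correlate-predict step.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic Weibull draws standing in for the concurrent reference/target records
# and the long-term reference record (scale and shape chosen for illustration).
ref_concurrent = 8.0 * rng.weibull(2.0, 4_000)                       # reference site, concurrent year
tgt_concurrent = 0.9 * ref_concurrent + rng.normal(0, 1.2, 4_000)    # correlated target site
ref_longterm   = 8.0 * rng.weibull(2.0, 40_000)                      # long-term reference record

# 1) Ordinary least-squares regression: target ~ reference.
slope_lr, intercept_lr = np.polyfit(ref_concurrent, tgt_concurrent, 1)

# 2) Variance-ratio method: preserve the target site's mean and variance.
slope_vr = tgt_concurrent.std() / ref_concurrent.std()
intercept_vr = tgt_concurrent.mean() - slope_vr * ref_concurrent.mean()

for name, m, c in [("linear regression", slope_lr, intercept_lr),
                   ("variance ratio   ", slope_vr, intercept_vr)]:
    pred = m * ref_longterm + c
    print(f"{name}: predicted long-term mean = {pred.mean():.2f} m/s, std = {pred.std():.2f} m/s")
```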
Abstract:
Comparing the different bids submitted in the tender for a project, under the traditional contracting system of open measurement and closed unit prices, requires analysis tools that are able to discriminate between proposals that, while having a similar overall amount, may have a very different economic impact during the execution of the works. One situation not easily detected by traditional methods is the behaviour of the actual cost in response to variations between the quantities actually executed on site and those estimated in the project. This paper proposes to address this situation through a quantitative risk analysis system such as the Monte Carlo method. This procedure, as is well known, consists of allowing the input data that define the problem to vary within defined probability functions, generating a large number of test cases, and treating the results statistically to obtain the most probable final values, together with the parameters needed to measure the reliability of the estimate. A model for the comparison of bids is presented, designed so that it can be applied to real cases by imposing on the known data variation conditions that are easy to establish by the professionals who perform these tasks.
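The Monte Carlo comparison described above can be sketched as follows; the work items, unit rates, and the assumed ±20% uniform variation of executed quantities are purely illustrative and not taken from any real tender.

```python
import numpy as np

rng = np.random.default_rng(3)

# Project-estimated quantities per work item and each bid's unit prices (illustrative).
estimated_qty = np.array([1_000.0, 250.0, 4_000.0])
unit_rates = {
    "bid A": np.array([12.0, 85.0, 3.2]),
    "bid B": np.array([10.5, 95.0, 3.4]),
}

n_trials = 20_000
# Assumed variation: executed quantities deviate uniformly +/- 20% from the estimate.
qty = estimated_qty * rng.uniform(0.8, 1.2, size=(n_trials, estimated_qty.size))

totals = {name: qty @ rates for name, rates in unit_rates.items()}
for name, t in totals.items():
    print(f"{name}: mean cost = {t.mean():,.0f}, 5-95% range = {np.percentile(t, 5):,.0f}-{np.percentile(t, 95):,.0f}")
print("P(bid A cheaper than bid B) =", (totals["bid A"] < totals["bid B"]).mean())
```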
Abstract:
This paper presents a new method for producing a functional-structural plant model that simulates response to different growth conditions, yet does not require detailed knowledge of underlying physiology. The example used to present this method is the modelling of the mountain birch tree. This new functional-structural modelling approach is based on linking an L-system representation of the dynamic structure of the plant with a canonical mathematical model of plant function. Growth indicated by the canonical model is allocated to the structural model according to probabilistic growth rules, such as rules for the placement and length of new shoots, which were derived from an analysis of architectural data. The main advantage of the approach is that it is relatively simple compared to the prevalent process-based functional-structural plant models and does not require a detailed understanding of underlying physiological processes, yet it is able to capture important aspects of plant function and adaptability, unlike simple empirical models. This approach, combining canonical modelling, architectural analysis and L-systems, thus fills the important role of providing an intermediate level of abstraction between the two extremes of deeply mechanistic process-based modelling and purely empirical modelling. We also investigated the relative importance of various aspects of this integrated modelling approach by analysing the sensitivity of the standard birch model to a number of variations in its parameters, functions and algorithms. The analysis shows that using light as the sole factor determining the structural location of new growth gives satisfactory results. Including the influence of additional regulating factors made little difference to global characteristics of the emergent architecture. Changing the form of the probability functions and using alternative methods for choosing the sites of new growth also had little effect. (c) 2004 Elsevier B.V. All rights reserved.
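A minimal sketch of a stochastic L-system with probabilistic production rules of the kind used to place new shoots; the symbols and rule probabilities are invented for the example, and the coupling to the canonical growth model is omitted.

```python
import random

random.seed(4)

# Assumed rules: each apex "A" branches, extends, or stays dormant with made-up
# probabilities; "I" is an internode, brackets delimit branches (terminal symbols).
rules = {"A": [(0.5, "I[A]A"), (0.3, "IA"), (0.2, "A")]}   # (probability, successor)

def rewrite(axiom: str, steps: int) -> str:
    s = axiom
    for _ in range(steps):
        out = []
        for symbol in s:
            options = rules.get(symbol)
            if options is None:
                out.append(symbol)                  # terminal symbols copied unchanged
                continue
            r, acc = random.random(), 0.0
            for p, successor in options:
                acc += p
                if r < acc:
                    out.append(successor)
                    break
            else:
                out.append(options[-1][1])          # guard against floating-point round-off
        s = "".join(out)
    return s

print(rewrite("A", 4))   # a bracketed string encoding trunk segments and branches
```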
Abstract:
This paper synthesizes and discusses the spatial and temporal patterns of archaeological sites in Ireland, spanning the Neolithic period and the Bronze Age transition (4300–1900 cal BC), in order to explore the timing and implications of the main changes that occurred in the archaeological record of that period. Large amounts of new data are sourced from unpublished developer-led excavations and combined with national archives, published excavations and online databases. Bayesian radiocarbon models and context- and sample-sensitive summed radiocarbon probabilities are used to examine the dataset. The study captures the scale and timing of the initial expansion of Early Neolithic settlement and the ensuing attenuation of all such activity—an apparent boom-and-bust cycle. The Late Neolithic and Chalcolithic periods are characterised by a resurgence and diversification of activity. Contextualisation and spatial analysis of radiocarbon data reveals finer-scale patterning than is usually possible with summed-probability approaches: the boom-and-bust models of prehistoric populations may, in fact, be a misinterpretation of more subtle demographic changes occurring at the same time as cultural change and attendant differences in the archaeological record.
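A sketch of the sample-sensitive summing idea, in which dates from the same site or context are first averaged into a single bin so that intensively dated sites do not dominate the summed probability; calibration is again crudely approximated by normal densities and all dates are invented.

```python
import numpy as np

grid = np.arange(3_000, 6_501)                     # calendar years (illustrative scale)

def density(mu, sigma):
    """Crude stand-in for a calibrated-date density on the calendar grid."""
    p = np.exp(-0.5 * ((grid - mu) / sigma) ** 2)
    return p / p.sum()

dates_by_site = {                                   # site -> list of (age, 1 sigma), all invented
    "site 1": [(5_700, 40), (5_690, 50), (5_710, 60), (5_705, 45)],   # intensively dated site
    "site 2": [(5_400, 60)],
    "site 3": [(4_500, 80), (4_450, 70)],
}

# Average the densities within each site (one "bin" per site), then sum across sites.
spd = sum(np.mean([density(mu, s) for mu, s in ds], axis=0) for ds in dates_by_site.values())
spd /= len(dates_by_site)
print("bin-sensitive summed probability peaks at", grid[np.argmax(spd)])
```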