973 results for Gaussian Probability Distribution
Abstract:
We present a new record of eolian dust flux to the western Subarctic North Pacific (SNP) covering the past 27,000 years, based on a core from the Detroit Seamount. Comparing the SNP dust record to the NGRIP ice core record shows significant differences in the amplitude of dust changes between the two regions during the last deglaciation, while the timing of abrupt changes is synchronous. If dust deposition in the SNP faithfully records its mobilization in East Asian source regions, then the difference in relative amplitude must reflect climate-related changes in atmospheric dust transport to Greenland. Based on the synchronous timing of dust changes in the SNP and Greenland, we tie abrupt deglacial transitions in the 230Th-normalized 4He flux record to corresponding transitions in the well-dated NGRIP dust flux record, providing a new chronostratigraphic technique for marine sediments from the SNP. Results from this technique are complemented by radiocarbon dating, which allows us to independently constrain radiocarbon paleoreservoir ages. We find paleoreservoir ages of 745 ± 140 yr at 11,653 yr BP, 680 ± 228 yr at 14,630 yr BP and 790 ± 498 yr at 23,290 yr BP. Our reconstructed paleoreservoir ages are consistent with modern surface water reservoir ages in the western SNP. Good temporal synchrony between eolian dust records from the Subantarctic Atlantic and equatorial Pacific and the ice core record from Antarctica supports the wider application of the proposed dust-tuning method in other ocean regions.
Abstract:
The modern subarctic Pacific is characterized by steep, salinity-driven surface water stratification, which hampers the supply of saline, nutrient-rich deeper waters into the euphotic zone, limiting productivity. However, the strength of the halocline might have varied in the past. Here, we present diatom oxygen (δ18Odiat) and silicon (δ30Sidiat) stable isotope data from the open subarctic North-East (NE) Pacific (SO202-27-6; Gulf of Alaska), in combination with other proxy data (Neogloboquadrina pachyderma (sin.) δ18O, biogenic opal, Ca and Fe intensities, IRD), to evaluate changes in surface water hydrography and productivity during Marine Isotope Stage (MIS) 3, which is characterized by millennial-scale temperature changes (Dansgaard-Oeschger (D-O) cycles) documented in Greenland ice cores.
Abstract:
The glacial-to-Holocene evolution of subarctic Pacific surface water stratification and silicic acid (Si) dynamics is investigated based on new combined diatom oxygen (δ18Odiat) and silicon (δ30Sidiat) isotope records, along with new biogenic opal, subsurface foraminiferal δ18O, alkenone-based sea surface temperature, sea ice, diatom, and core logging data from the NE Pacific. Our results suggest that δ18Odiat values are primarily influenced by changes in freshwater discharge from the Cordilleran Ice Sheet (CIS), while the corresponding δ30Sidiat values are primarily influenced by changes in Si supply to surface waters. Our data indicate enhanced NE Pacific surface water stratification from the glacial into mid-Heinrich Stadial 1 (HS1), generally limiting the Si supply to surface waters. However, we suggest that an increase in Si supply during early HS1, when surface waters were still stratified, is linked to increased North Pacific Intermediate Water formation. The coincidence between fresh surface waters during HS1 and enhanced ice-rafted debris sedimentation in the North Atlantic indicates a close link between CIS and Laurentide Ice Sheet dynamics and a dominant atmospheric control on CIS deglaciation. The Bølling/Allerød (B/A) is characterized by destratification in the subarctic Pacific and an increased supply of saline, Si-rich waters to the surface. This change toward increased convection occurred prior to the Bølling warming and was likely triggered by a switch to sea ice-free conditions during late HS1. Our results furthermore indicate a decreased efficiency of the biological pump during late HS1 and the B/A (possibly also the Younger Dryas), suggesting that the subarctic Pacific was then a source region of atmospheric CO2.
Abstract:
A 6200-year-old peat sequence, cored in a volcanic crater on the sub-Antarctic Ile de la Possession (Iles Crozet), has been investigated using a multi-proxy approach. The methods applied are macrobotanical (mosses, seeds and fruits) and diatom analyses, complemented by geochemical (Rock-Eval6) and rock magnetic measurements. The chronology of the core is based on five radiocarbon dates. Combining all the proxy data, the following changes could be inferred. From the onset of peat formation (6200 cal yr BP) until ca. 5550 cal yr BP, biological production was high and climatic conditions must have been relatively warm. At ca. 5550 cal yr BP a shift to low biological production occurred, lasting until ca. 4600 cal yr BP. During this period the organic matter is well preserved, pointing to a cold and/or wet environment. At ca. 4600 cal yr BP, biological production increased again. From ca. 4600 cal yr BP until ca. 4100 cal yr BP a 'hollow and hummock' microtopography developed at the peat surface, resulting in a mixture of wetter and drier species in the macrobotanical record. After ca. 4100 cal yr BP, the wet species disappear and a generally drier, acidic bog came into existence. A major shift in all the proxy data is observed at ca. 2800 cal yr BP, pointing to wetter and especially windier climatic conditions on the island, probably caused by an intensification and/or latitudinal shift of the southern westerly belt. Under this stronger wind regime, erosion of the peat surface occurred and a lake formed in the peat deposits of the crater, which is still present today.
Abstract:
Three ice type regimes at Ice Station Belgica (ISB), during the 2007 International Polar Year SIMBA (Sea Ice Mass Balance in Antarctica) expedition, were characterized and assessed for elevation, snow depth, ice freeboard and thickness. Analyses of the probability distribution functions showed great potential for satellite-based altimetry for estimating ice thickness. In question is the altimeter sampling density required for reasonably accurate estimation of snow surface elevation, given inherent spatial averaging. This study assesses an effort to determine the number of laser altimeter 'hits' of the ISB floe, as a representative Antarctic floe of mixed first- and multi-year ice types, needed to statistically recreate the in situ-determined ice-thickness and snow depth distribution based on the fractional coverage of each ice type. Estimates of the fractional coverage and spatial distribution of the ice types, referred to as ice 'towns', for the 5 km² floe were assessed by in situ mapping and photo-visual documentation. Simulated ICESat altimeter tracks, with spot size ~70 m and spacing ~170 m, sampled the floe's towns, generating a buoyancy-derived ice thickness distribution. 115 altimeter hits were required to statistically recreate the regional thickness mean and distribution for a three-town assemblage of mixed first- and multi-year ice, and 85 hits for a two-town assemblage of first-year ice only: equivalent to 19.5 and 14.5 km, respectively, of continuous altimeter track over a floe region of similar structure. The results have significant implications for modelling the sea-ice sampling performance of the ICESat laser altimeter record, as well as for maximizing the sampling characteristics of satellite/airborne laser and radar altimetry missions for sea-ice thickness.
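The statistical exercise at the heart of this abstract, recreating a floe-wide thickness distribution from a limited number of altimeter hits over ice 'towns', can be sketched as a small Monte Carlo experiment. The town fractions and thickness statistics below are invented placeholders, not values from the study:

```python
import random

random.seed(42)

# Hypothetical ice 'towns': (fractional coverage, mean thickness m, std m).
# Numbers are illustrative only, not the SIMBA/ISB measurements.
towns = [(0.5, 0.6, 0.2), (0.3, 1.2, 0.3), (0.2, 2.5, 0.5)]

def sample_hits(n):
    """Simulate n altimeter hits: pick a town by coverage, draw a thickness."""
    fracs = [t[0] for t in towns]
    thick = []
    for _ in range(n):
        _, mu, sd = random.choices(towns, weights=fracs)[0]
        thick.append(max(0.0, random.gauss(mu, sd)))  # thickness cannot be negative
    return thick

# Coverage-weighted 'true' regional mean vs. a 115-hit estimate
true_mean = sum(f * mu for f, mu, _ in towns)
est = sum(sample_hits(115)) / 115
print(round(true_mean, 2), round(est, 2))
```

With enough hits, the sample mean converges to the coverage-weighted mean of the towns, which is the property the 115- and 85-hit thresholds in the study quantify.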
Abstract:
This paper presents a theoretical analysis and an optimization method for envelope amplifiers. Highly efficient envelope amplifiers based on a switching converter in parallel or in series with a linear regulator have been analyzed and optimized. The results of the optimization process are shown, and the two architectures are compared regarding their complexity and efficiency. The proposed optimization method is based on prior knowledge of the transmitted signal type (e.g., OFDM, WCDMA), and it can be applied to any signal type as long as the envelope probability distribution is known. Finally, it is shown that the analyzed architectures have an inherent efficiency limit.
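The key ingredient of such an optimization, computing average efficiency from a known envelope probability distribution, can be illustrated for the simple case of an ideal series linear regulator driving a resistive load, with a Rayleigh-distributed envelope (a common model for OFDM-like signals). The scale and supply-rail values are assumed for illustration; the paper's architectures are more elaborate:

```python
import math

sigma = 1.0          # Rayleigh scale of the envelope (assumed)
vs = 4.0 * sigma     # supply rail chosen above the envelope peaks (assumed)

def rayleigh_pdf(v, s=sigma):
    """Rayleigh probability density for envelope amplitude v >= 0."""
    return (v / s**2) * math.exp(-v**2 / (2 * s**2))

def moment(k, hi=10.0, n=100_000):
    """Numerical k-th moment of the envelope (simple Riemann sum)."""
    dv = hi / n
    return sum(rayleigh_pdf(i * dv) * (i * dv)**k for i in range(n + 1)) * dv

# Ideal series regulator into a resistor: P_out = E[v^2]/R, P_in = Vs*E[v]/R,
# so average efficiency = E[v^2] / (Vs * E[v]).
eta = moment(2) / (vs * moment(1))

# Closed form for comparison: E[v] = sigma*sqrt(pi/2), E[v^2] = 2*sigma^2
eta_exact = (2 * sigma**2) / (vs * sigma * math.sqrt(math.pi / 2))
print(round(eta, 4), round(eta_exact, 4))
```

The same integral-over-pdf structure applies to any envelope distribution, which is why the method generalizes to any signal whose envelope pdf is known.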
Abstract:
Opportunities offered by high performance computing provide a significant degree of promise in enhancing the performance of real-time flood forecasting systems. In this paper, a real-time framework for probabilistic flood forecasting through data assimilation is presented. The distributed rainfall-runoff real-time interactive basin simulator (RIBS) model is selected to simulate the hydrological process in the basin. Although the RIBS model is deterministic, it is run in a probabilistic way using the calibration results of previous work by the authors, which identified the probability distribution functions that best characterise the most relevant model parameters. Adaptive techniques improve flood forecasts because the model can be adapted to observations in real time as new information becomes available. The new adaptive forecast model, based on genetic programming as a data assimilation technique, is compared with the previously developed flood forecast model based on the calibration results. Both models are probabilistic, as they generate an ensemble of hydrographs that takes into account the different uncertainties inherent in any forecast process. The Manzanares River basin was selected as a case study; the process is computationally intensive, as it requires simulating many replicas of the ensemble in real time.
Abstract:
This paper describes a two-part methodology for managing the risk posed by water supply variability to irrigated agriculture. First, an econometric model is used to explain the variation in the production value of irrigated agriculture. The explanatory variables include an index of irrigation water availability (surface storage levels), a price index representative of the crops grown in each geographical unit, and a time variable. The model corrects for autocorrelation and is applied to 16 Spanish provinces that are representative in terms of irrigated agriculture. In the second part, the fitted models are used for the economic evaluation of drought risk. Inflow variability in the hydrological system servicing each province is used to perform ex-ante evaluations of economic output for the upcoming irrigation season. The model's error and the probability distribution functions (PDFs) of the reservoirs' storage variations are used to generate Monte Carlo (Latin Hypercube) simulations of agricultural output 7 and 3 months prior to the irrigation season. The results of these simulations illustrate the different risk profiles of each management unit, which depend on farm productivity and on the probability distribution function of water inflow to reservoirs. The potential for ex-ante drought impact assessments is demonstrated. By complementing hydrological models, this method can assist water managers and decision-makers in managing reservoirs.
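A minimal sketch of the Latin Hypercube step described above, assuming a toy linear output model driven by a normally distributed storage index and a normally distributed model error. All parameters are invented placeholders, not the fitted Spanish-province values:

```python
import random
from statistics import NormalDist, mean

random.seed(1)

def lhs_normal(n, mu, sd):
    """n Latin Hypercube draws from N(mu, sd): one uniform per stratum, shuffled."""
    u = [(i + random.random()) / n for i in range(n)]  # stratified uniforms
    random.shuffle(u)
    nd = NormalDist(mu, sd)
    return [nd.inv_cdf(ui) for ui in u]

n = 1000
storage = lhs_normal(n, 60.0, 15.0)   # % reservoir storage index (assumed)
error = lhs_normal(n, 0.0, 5.0)       # econometric model error (assumed)

# Toy ex-ante output model (monetary units are arbitrary)
output = [100.0 + 0.8 * s + e for s, e in zip(storage, error)]

print(round(mean(output), 1))
risk = sum(o < 130 for o in output) / n   # probability of a poor season (assumed threshold)
print(risk)
```

Stratifying the uniforms before inverting the CDF is what distinguishes Latin Hypercube sampling from plain Monte Carlo: the sample covers the whole distribution evenly, so tail risks stabilize with fewer draws.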
Abstract:
The purpose of this paper is to present a program written in Matlab-Octave for the simulation of the time evolution of student curricula, i.e., how students pass their subjects over time until graduation. The program computes, from the simulations, the academic performance rates for the subjects of the study plan for each semester as well as the overall rates, which are (a) the efficiency rate, defined as the ratio of the number of students passing the exam to the number of students who registered for it, and (b) the success rate, defined as the ratio of the number of students passing the exam to the number of students who not only registered for it but also actually took it. Additionally, we compute the rates for the bachelor academic degree which are established for Spain by the National Quality Evaluation and Accreditation Agency (ANECA): the graduation rate (measured as the percentage of students who finish as scheduled in the plan or taking one extra year) and the efficiency rate (measured as the percentage of credits which a student who graduated has actually taken). The simulation is done in terms of the probabilities of passing all the subjects in the study plan. The application of the simulator to Polytech students in Madrid, where requirements for passing are especially stiff in first- and second-year subjects, is particularly relevant for analyzing student cohorts and the probabilities of students finishing in the minimum of four years, taking one extra year or two extra years, and so forth. It is a very useful tool when designing new study plans. The probability distribution of the random variable "number of semesters a student takes to complete the curriculum and graduate" is difficult or even unfeasible to obtain analytically, and this is even truer when we incorporate uncertainty in parameter estimation.
This is why we apply Monte Carlo simulation, which not only illustrates the stochastic process but also provides a method for computation. The stochastic simulator is proving to be a useful tool for identifying the subjects that are most critical in the distribution of the number of semesters to curriculum completion, and subsequently for decision making in terms of curriculum planning and passing standards in the University. Simulations are performed through a graphical interface, where the results are also presented in appropriate figures. The project has been funded by the Call for Innovation in Education Projects of Universidad Politécnica de Madrid (UPM) through a project of its school Escuela Técnica Superior de Ingenieros Industriales (ETSII) during the period September 2010-September 2011.
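The simulator's core loop can be sketched as follows, under the simplifying assumption that a student retries every pending subject each semester and passes each independently with a fixed probability (the real program also tracks the study-plan structure and the ANECA rates). The per-subject probabilities below are invented:

```python
import random
from statistics import mean

random.seed(7)

def semesters_to_graduate(pass_prob):
    """Semesters until all subjects are passed; one attempt per subject per semester."""
    pending = list(range(len(pass_prob)))
    semesters = 0
    while pending:
        semesters += 1
        # keep only the subjects that were failed this semester
        pending = [s for s in pending if random.random() > pass_prob[s]]
    return semesters

# Assumed per-subject pass probabilities (illustrative only)
pass_prob = [0.9, 0.8, 0.7, 0.6, 0.5, 0.9, 0.85, 0.75]

runs = [semesters_to_graduate(pass_prob) for _ in range(10_000)]
on_time = sum(r <= 8 for r in runs) / len(runs)  # fraction finishing within 8 semesters
print(round(mean(runs), 2), on_time)
```

Repeating the run gives the empirical distribution of "semesters to graduation" that is hard to derive analytically, and the subject whose pass probability most shifts that distribution is the critical one.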
Abstract:
Lately, several researchers have pointed out that climate change is expected to increase temperatures and lower rainfall in Mediterranean regions, while simultaneously increasing the intensity of extreme rainfall events. These changes could have consequences for the rainfall regime, erosion, sediment transport and water quality, soil management, and new designs of diversion ditches. Climate change is expected to result in increasingly unpredictable and variable rainfall, in amount and timing, changing seasonal patterns and increasing the frequency of extreme weather events. Consequently, the evolution of the frequency and intensity of drought periods is of utmost importance, as many processes in agro-ecosystems will be affected by them. Realising the complex and important consequences of an increasing frequency of extreme droughts in the Ebro River basin, our aim is to study the evolution of drought events at this site statistically, with emphasis on their occurrence and intensity. For this purpose, fourteen meteorological stations were selected based on the length of their rainfall series and the climatic classification, to obtain a representative untreated dataset for the river basin. Daily rainfall series from 1957 to 2002 were obtained from each meteorological station, and the frequency of no-rain periods, measured as the number of consecutive dry days, was extracted. Based on these data, we study changes in the probability distribution across several sub-periods. Moreover, we use the Standardized Precipitation Index (SPI) to identify drought events on an annual scale, and then fit log-linear models to the contingency tables between the SPI index and the sub-periods; this fitting is carried out with the help of ANOVA inference.
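The basic statistic this analysis builds on, the lengths of consecutive no-rain periods extracted from a daily series, can be computed in a few lines. The series and the wet-day threshold below are illustrative, not Ebro basin data:

```python
def dry_spells(daily_mm, wet_threshold=0.1):
    """Lengths of maximal runs of days with rainfall below wet_threshold (mm)."""
    spells, run = [], 0
    for mm in daily_mm:
        if mm < wet_threshold:
            run += 1          # extend the current dry run
        elif run:
            spells.append(run)  # a wet day closes the run
            run = 0
    if run:
        spells.append(run)      # series may end mid-spell
    return spells

# Toy daily rainfall series (mm)
series = [0, 0, 5.2, 0, 0, 0, 1.1, 0, 12.0, 0, 0, 0, 0]
print(dry_spells(series))  # [2, 3, 1, 4]
```

Collecting these spell lengths per sub-period yields the empirical distributions whose changes over time the study tests.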
Abstract:
We propose distributed algorithms for sampling networks based on a new class of random walks that we call Centrifugal Random Walks (CRW). A CRW is a random walk that starts at a source and always moves away from it. We propose CRW algorithms for connected networks with arbitrary probability distributions, and for grids and networks with regular concentric connectivity with distance-based distributions. All CRW sampling algorithms select a node with the exact probability distribution, do not need warm-up, and end in a number of hops bounded by the network diameter.
Abstract:
Sampling a network with a given probability distribution has been identified as a useful operation. In this paper we propose distributed algorithms for sampling networks, so that nodes are selected by a special node, called the source, with a given probability distribution. All these algorithms are based on a new class of random walks that we call Random Centrifugal Walks (RCW). An RCW is a random walk that starts at the source and always moves away from it. Firstly, an algorithm to sample any connected network using RCWs is proposed. The algorithm assumes that each node has a weight, so that the sampling process must select a node with probability proportional to its weight. This algorithm requires a preprocessing phase before the sampling of nodes. In particular, a minimum diameter spanning tree (MDST) is created in the network, and then node weights are efficiently aggregated using the tree. The good news is that the preprocessing is done only once, regardless of the number of sources and the number of samples taken from the network. After that, every sample is obtained with an RCW whose length is bounded by the network diameter. Secondly, RCW algorithms that do not require preprocessing are proposed for grids and networks with regular concentric connectivity, for the case when the probability of selecting a node is a function of its distance to the source. The key features of the RCW algorithms (unlike previous Markovian approaches) are that (1) they do not need to warm up (stabilize), (2) the sampling always finishes in a number of hops bounded by the network diameter, and (3) they select nodes with the exact probability distribution.
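The weight-aggregation idea behind the first algorithm can be sketched on a toy tree: each step of the walk moves away from the source, stopping at the current node or descending into a child with probability proportional to the total weight beneath it, which yields exact weight-proportional sampling. The tree shape and weights are invented; a real implementation would aggregate subtree weights once during preprocessing rather than recomputing them per sample:

```python
import random
from collections import Counter

random.seed(3)

# Toy spanning tree rooted at the source (node 0) and per-node weights
tree = {0: [1, 2], 1: [3, 4], 2: [], 3: [], 4: []}
weight = {0: 1, 1: 2, 2: 3, 3: 2, 4: 2}

def subtree_weight(node):
    """Total weight of the subtree rooted at node (recomputed for simplicity)."""
    return weight[node] + sum(subtree_weight(c) for c in tree[node])

def rcw_sample(node=0):
    """Walk away from the source; stop here or descend, proportional to weights."""
    options = [node] + tree[node]
    w = [weight[node]] + [subtree_weight(c) for c in tree[node]]
    pick = random.choices(options, weights=w)[0]
    return node if pick == node else rcw_sample(pick)

counts = Counter(rcw_sample() for _ in range(20_000))
total = sum(weight.values())
# Empirical frequencies approach weight[n]/total: {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.2, 4: 0.2}
print({n: round(counts[n] / 20_000, 3) for n in sorted(weight)})
```

Because the walk never revisits a node, each sample costs at most one hop per tree level, matching the diameter-bounded hop count claimed for RCWs.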
Abstract:
We propose a new Bayesian framework for automatically determining the position (location and orientation) of an uncalibrated camera using the observations of moving objects and a schematic map of the passable areas of the environment. Our approach takes advantage of static and dynamic information on the scene structures through prior probability distributions for object dynamics. The proposed approach restricts plausible positions where the sensor can be located while taking into account the inherent ambiguity of the given setting. The proposed framework samples from the posterior probability distribution for the camera position via data-driven MCMC, guided by an initial geometric analysis that restricts the search space. A Kullback-Leibler divergence analysis then yields the final camera position estimate, while explicitly isolating ambiguous settings. The proposed approach is evaluated in synthetic and real environments, showing satisfactory performance in both ambiguous and unambiguous settings.
Abstract:
Introducing cover crops (CC) interspersed with intensively fertilized crops in rotation has the potential to reduce nitrate leaching. This paper evaluates various strategies involving CC between maize crops and compares the economic and environmental results with respect to a typical maize-fallow rotation. The comparison is performed through stochastic (Monte Carlo) simulation models of farms' profits, using probability distribution functions (pdfs) of yield and N fertilizer saving fitted with data collected from various field trials, and pdfs of crop prices and the cost of fertilizer fitted from statistical sources. Stochastic dominance relationships are obtained to rank the most profitable strategies from a farm financial perspective. A two-criterion comparison scheme is proposed to rank alternative strategies based on farm profit and nitrate leaching levels, taking the maize-fallow rotation as the baseline scenario. The results show that when CC biomass is sold as forage instead of being kept in the soil, greater profit and less nitrate leaching are achieved than in the baseline scenario. While the fertilizer saving will be lower if the CC is sold than if it is kept in the soil, the revenue obtained from the sale of the CC compensates for the reduced fertilizer savings. The results show that CC would perhaps provide a double dividend of greater profit and reduced nitrate leaching in intensive irrigated cropping systems in Mediterranean regions.
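The ranking step, comparing simulated profit distributions by stochastic dominance, can be sketched with first-order dominance on empirical CDFs. The two toy profit models below are placeholders for the paper's fitted pdfs:

```python
import random

random.seed(11)

def empirical_cdf(samples, grid):
    """Empirical CDF of samples evaluated at each grid point."""
    s = sorted(samples)
    return [sum(x <= g for x in s) / len(s) for g in grid]

# Toy Monte Carlo profit draws for two strategies (EUR/ha, invented)
n = 5000
sell_cc = [random.gauss(1200, 150) for _ in range(n)]  # sell CC biomass as forage
keep_cc = [random.gauss(1100, 150) for _ in range(n)]  # keep CC in the soil

grid = [800 + 20 * i for i in range(50)]
cdf_sell = empirical_cdf(sell_cc, grid)
cdf_keep = empirical_cdf(keep_cc, grid)

# First-order stochastic dominance: 'sell' dominates 'keep' if its CDF
# never lies above the other's on the grid
dominates = all(a <= b for a, b in zip(cdf_sell, cdf_keep))
print(dominates)
```

A strategy whose CDF sits everywhere to the right is preferred by every profit-maximizing farmer regardless of risk attitude, which is what makes stochastic dominance a robust ranking criterion here.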
Abstract:
Prediction at ungauged sites is essential for water resources planning and management. Ungauged sites have no observations of flood magnitude, but some site and basin characteristics are known. Regression models relate physiographic and climatic basin characteristics to flood quantiles, which can be estimated from observed data at gauged sites. However, these models assume linear relationships between variables. Prediction intervals are estimated from the variance of the residuals of the fitted model, and the effect of uncertainties in the explanatory variables on the dependent variable cannot be assessed. This paper presents a methodology to propagate the uncertainties that arise in the process of predicting flood quantiles at ungauged basins by a regression model. In addition, Bayesian networks were explored as a feasible tool for predicting flood quantiles at ungauged sites. Bayesian networks take uncertainties into account thanks to their probabilistic nature; they are able to capture non-linear relationships between variables and they give a probability distribution of discharges as a result. The methodology was applied to a case study in the Tagus basin in Spain.
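The uncertainty-propagation idea can be sketched with a Monte Carlo pass: perturb the explanatory variables and the regression residual, and read off a predictive distribution of the flood quantile rather than a point estimate. Coefficients and uncertainty magnitudes below are invented, not the Tagus-basin fit:

```python
import random
from statistics import median, quantiles

random.seed(5)

# Assumed log-linear regression: log10(Q) = b0 + b_area*log10(A) + b_rain*log10(P)
b0, b_area, b_rain = 0.5, 0.7, 1.2
sigma_res = 0.15  # residual std in log space (assumed)

def predict_log_q(log_area, log_rain):
    return b0 + b_area * log_area + b_rain * log_rain

n = 20_000
sims = []
for _ in range(n):
    la = random.gauss(2.0, 0.05)   # uncertain log10 basin area (assumed)
    lr = random.gauss(0.8, 0.10)   # uncertain log10 design rainfall (assumed)
    # add the residual error, then transform back to discharge units
    sims.append(10 ** (predict_log_q(la, lr) + random.gauss(0, sigma_res)))

qs = quantiles(sims, n=20)
q05, q50, q95 = qs[0], median(sims), qs[-1]
print(round(q50, 1), round(q05, 1), round(q95, 1))
```

The spread between the 5th and 95th percentiles reflects both the residual variance and the explanatory-variable uncertainty, which is exactly the component a residual-only prediction interval misses.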