920 results for probability distribution
Abstract:
Many datasets used by economists and other social scientists are collected by stratified sampling. The sampling scheme used to collect the data induces a probability distribution on the observed sample that differs from the target or underlying distribution for which inference is to be made. If this effect is not taken into account, subsequent statistical inference can be seriously biased. This paper shows how to do efficient semiparametric inference in moment restriction models when data from the target population is collected by three widely used sampling schemes: variable probability sampling, multinomial sampling, and standard stratified sampling.
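The bias the abstract describes, and the weighting idea behind correcting it, can be illustrated with a minimal inverse-probability (Horvitz-Thompson-style) sketch. All population values and sampling probabilities below are hypothetical, and this is a textbook correction, not the paper's efficient semiparametric estimator:

```python
import random

random.seed(0)

# Hypothetical target population: two strata with very different means.
stratum_a = [random.gauss(10, 2) for _ in range(9000)]   # 90% of population
stratum_b = [random.gauss(50, 5) for _ in range(1000)]   # 10% of population
true_mean = (sum(stratum_a) + sum(stratum_b)) / 10000

# Stratified sampling heavily oversamples stratum B (prob. 0.5 vs 0.05).
p = {"a": 0.05, "b": 0.5}
sample = [(x, "a") for x in stratum_a if random.random() < p["a"]]
sample += [(x, "b") for x in stratum_b if random.random() < p["b"]]

# A naive mean ignores the sampling design and is biased toward stratum B.
naive = sum(x for x, _ in sample) / len(sample)

# Weighting each observation by 1 / (its sampling probability) removes
# the bias induced by the sampling scheme.
weighted = (sum(x / p[s] for x, s in sample)
            / sum(1 / p[s] for _, s in sample))

print(f"true {true_mean:.2f}  naive {naive:.2f}  weighted {weighted:.2f}")
```

The naive estimate lands far from the target-population mean, while the reweighted one recovers it; the paper's contribution is doing this kind of correction efficiently in general moment restriction models.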
Abstract:
This paper explores the dynamic linkages that portray different facets of the joint probability distribution of stock market returns in NAFTA (i.e., Canada, Mexico, and the US). Our examination of the interactions of the NAFTA stock markets considers three issues. First, we examine the long-run relationship between the three markets, using cointegration techniques. Second, we evaluate the dynamic relationships between the three markets, using impulse-response analysis. Finally, we explore the volatility transmission process between the three markets, using a variety of multivariate GARCH models. Our results exhibit significant, albeit not homogeneous, volatility transmission between the second moments of the NAFTA stock markets. The magnitude and trend of the conditional correlations indicate that in the last few years, the Mexican stock market exhibited a tendency toward increased integration with the US market. Finally, we note evidence that the Peso and Asian financial crises, as well as the stock-market crash in the US, affected the return and volatility time-series relationships.
Abstract:
In Part One, the foundations of Bayesian inference are reviewed, and the technicalities of the Bayesian method are illustrated. Part Two applies the Bayesian meta-analysis program, the Confidence Profile Method (CPM), to clinical trial data and evaluates the merits of using Bayesian meta-analysis for overviews of clinical trials. The Bayesian method of meta-analysis produced results similar to the classical ones because of the large sample size, along with the input of a non-preferential prior probability distribution. These results were anticipated through the explanations in Part One of the mechanics of the Bayesian approach.
Abstract:
We present a new record of eolian dust flux to the western Subarctic North Pacific (SNP) covering the past 27,000 years based on a core from the Detroit Seamount. Comparing the SNP dust record to the NGRIP ice core record shows significant differences in the amplitude of dust changes to the two regions during the last deglaciation, while the timing of abrupt changes is synchronous. If dust deposition in the SNP faithfully records its mobilization in East Asian source regions, then the difference in relative amplitude must reflect climate-related changes in atmospheric dust transport to Greenland. Based on the synchronicity in the timing of dust changes in the SNP and Greenland, we tie abrupt deglacial transitions in the 230Th-normalized 4He flux record to corresponding transitions in the well-dated NGRIP dust flux record to provide a new chronostratigraphic technique for marine sediments from the SNP. Results from this technique are complemented by radiocarbon dating, which allows us to independently constrain radiocarbon paleoreservoir ages. We find paleoreservoir ages of 745 ± 140 yr at 11653 yr BP, 680 ± 228 yr at 14630 yr BP and 790 ± 498 yr at 23290 yr BP. Our reconstructed paleoreservoir ages are consistent with modern surface water reservoir ages in the western SNP. Good temporal synchronicity between eolian dust records from the Subantarctic Atlantic and equatorial Pacific and the ice core record from Antarctica supports the reliability of the proposed dust-tuning method for wider use in other global ocean regions.
Abstract:
The modern subarctic Pacific is characterized by a steep salinity-driven surface water stratification, which hampers the supply of saline and nutrient-rich deeper waters into the euphotic zone, limiting productivity. However, the strength of the halocline might have varied in the past. Here, we present diatom oxygen (d18Odiat) and silicon (d30Sidiat) stable isotope data from the open subarctic North-East (NE) Pacific (SO202-27-6; Gulf of Alaska), in combination with other proxy data (Neogloboquadrina pachyderma (sin.) d18O, biogenic opal, Ca and Fe intensities, IRD), to evaluate changes in surface water hydrography and productivity during Marine Isotope Stage (MIS) 3, characterized by millennial-scale temperature changes (Dansgaard-Oeschger (D-O) cycles) documented in Greenland ice cores.
Abstract:
The glacial-to-Holocene evolution of subarctic Pacific surface water stratification and silicic acid (Si) dynamics is investigated based on new combined diatom oxygen (d18Odiat) and silicon (d30Sidiat) isotope records, along with new biogenic opal, subsurface foraminiferal d18O, alkenone-based sea surface temperature, sea ice, diatom, and core logging data from the NE Pacific. Our results suggest that d18Odiat values are primarily influenced by changes in freshwater discharge from the Cordilleran Ice Sheet (CIS), while corresponding d30Sidiat values are primarily influenced by changes in Si supply to surface waters. Our data indicate enhanced glacial to mid-Heinrich Stadial 1 (HS1) NE Pacific surface water stratification, generally limiting the Si supply to surface waters. However, we suggest that an increase in Si supply during early HS1, when surface waters were still stratified, is linked to increased North Pacific Intermediate Water formation. The coincidence between fresh surface waters during HS1 and enhanced ice-rafted debris sedimentation in the North Atlantic indicates a close link between CIS and Laurentide Ice Sheet dynamics and a dominant atmospheric control on CIS deglaciation. The Bølling/Allerød (B/A) is characterized by destratification in the subarctic Pacific and an increased supply of saline, Si-rich waters to surface waters. This change toward increased convection occurred prior to the Bølling warming and was likely triggered by a switch to sea ice-free conditions during late HS1. Our results furthermore indicate a decreased efficiency of the biological pump during late HS1 and the B/A (possibly also the Younger Dryas), suggesting that the subarctic Pacific was then a source region of atmospheric CO2.
Abstract:
A 6200 year old peat sequence, cored in a volcanic crater on the sub-Antarctic Ile de la Possession (Iles Crozet), has been investigated based on a multi-proxy approach. The methods applied are macrobotanical (mosses, seeds and fruits) and diatom analyses, complemented by geochemical (Rock-Eval6) and rock magnetic measurements. The chronology of the core is based on 5 radiocarbon dates. Combining all the proxy data, the following changes could be inferred. From the onset of the peat formation (6200 cal yr BP) until ca. 5550 cal yr BP, biological production was high and climatic conditions must have been relatively warm. At ca. 5550 cal yr BP a shift to low biological production occurred, lasting until ca. 4600 cal yr BP. During this period the organic matter is well preserved, pointing to a cold and/or wet environment. At ca. 4600 cal yr BP, biological production increased again. From ca. 4600 cal yr BP until ca. 4100 cal yr BP a 'hollow and hummock' micro-topography developed at the peat surface, resulting in the presence of a mixture of wetter and drier species in the macrobotanical record. After ca. 4100 cal yr BP, the wet species disappear and a generally drier, acidic bog came into existence. A major shift in all the proxy data is observed at ca. 2800 cal yr BP, pointing to wetter and especially windier climatic conditions on the island, probably caused by an intensification and/or latitudinal shift of the southern westerly belt. This stronger wind regime caused erosion of the peat surface at that time, and a lake, still present today, formed in the peat deposits of the crater.
Abstract:
Three ice type regimes at Ice Station Belgica (ISB), during the 2007 International Polar Year SIMBA (Sea Ice Mass Balance in Antarctica) expedition, were characterized and assessed for elevation, snow depth, ice freeboard and thickness. Analyses of the probability distribution functions showed great potential for satellite-based altimetry for estimating ice thickness. In question is the required altimeter sampling density for reasonably accurate estimation of snow surface elevation given inherent spatial averaging. This study assesses an effort to determine the number of laser altimeter 'hits' of the ISB floe, as a representative Antarctic floe of mixed first- and multi-year ice types, for the purpose of statistically recreating the in situ-determined ice-thickness and snow depth distribution based on the fractional coverage of each ice type. Estimates of the fractional coverage and spatial distribution of the ice types, referred to as ice 'towns', for the 5 km² floe were assessed by in situ mapping and photo-visual documentation. Simulated ICESat altimeter tracks, with spot size ~70 m and spacing ~170 m, sampled the floe's towns, generating a buoyancy-derived ice thickness distribution. 115 altimeter hits were required to statistically recreate the regional thickness mean and distribution for a three-town assemblage of mixed first- and multi-year ice, and 85 hits for a two-town assemblage of first-year ice only: equivalent to 19.5 and 14.5 km respectively of continuous altimeter track over a floe region of similar structure. Results have significant implications for modelling the sea-ice sampling performance of the ICESat laser altimeter record, as well as for maximizing the sampling characteristics of satellite/airborne laser and radar altimetry missions for sea-ice thickness.
Abstract:
We have developed a new projector model specifically tailored for fast list-mode tomographic reconstructions in positron emission tomography (PET) scanners with parallel planar detectors. The model provides an accurate estimation of the probability distribution of coincidence events defined by pairs of scintillating crystals. This distribution is parameterized with 2D elliptical Gaussian functions defined in planes perpendicular to the main axis of the tube of response (TOR). The parameters of these Gaussian functions have been obtained by fitting Monte Carlo simulations that include positron range, acolinearity of gamma rays, as well as detector attenuation and scatter effects. The proposed model has been applied efficiently to list-mode reconstruction algorithms. Evaluation with Monte Carlo simulations of a rotating high-resolution PET scanner indicates that this model yields a better recovery-to-noise ratio in OSEM (ordered-subsets expectation-maximization) reconstruction than list-mode reconstruction with a symmetric circular Gaussian TOR model, or histogram-based OSEM with a precalculated system matrix using Monte Carlo simulated models and symmetries.
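The shape of such a TOR model can be sketched as a 2D elliptical Gaussian evaluated in the plane perpendicular to the tube axis. The sigma values below are made up for illustration; in the paper they come from Monte Carlo fits:

```python
import math

def tor_weight(du, dv, sigma_u, sigma_v):
    """Probability density of a coincidence event at transverse offset
    (du, dv) from the TOR axis, modeled as a normalized 2D elliptical
    Gaussian in the plane perpendicular to the axis."""
    norm = 1.0 / (2.0 * math.pi * sigma_u * sigma_v)
    return norm * math.exp(-0.5 * ((du / sigma_u) ** 2 + (dv / sigma_v) ** 2))

# A voxel on the axis gets the peak weight; off-axis weights fall off
# anisotropically along the two transverse directions.
peak = tor_weight(0.0, 0.0, sigma_u=1.2, sigma_v=0.8)
off = tor_weight(1.2, 0.0, sigma_u=1.2, sigma_v=0.8)  # one sigma along u
print(peak, off, off / peak)  # ratio is exp(-0.5) ≈ 0.607
```

In a list-mode projector, a weight like this multiplies each voxel's contribution along the TOR, replacing the single-width circular Gaussian the paper compares against.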
Abstract:
This paper presents a theoretical analysis and an optimization method for envelope amplifiers. Highly efficient envelope amplifiers based on a switching converter in parallel or in series with a linear regulator have been analyzed and optimized. The results of the optimization process are shown, and the two architectures are compared regarding their complexity and efficiency. The proposed optimization method is based on prior knowledge of the transmitted signal type (OFDM, WCDMA...), and it can be applied to any signal type as long as the envelope probability distribution is known. Finally, it is shown that the analyzed architectures have an inherent efficiency limit.
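The core of such an optimization, averaging an efficiency curve over the envelope's probability distribution, can be sketched numerically. The idealized linear-stage efficiency model and the Rayleigh envelope below are textbook simplifications chosen for illustration, not the paper's architecture models:

```python
import math

# Idealized linear-stage efficiency at envelope voltage v supplied from
# a fixed rail V_RAIL (a textbook class-B-style simplification).
V_RAIL = 1.0
def eta(v):
    return v / V_RAIL

# OFDM envelopes are often modeled as Rayleigh distributed; SIGMA is an
# assumed scale chosen so the envelope rarely exceeds V_RAIL.
SIGMA = 0.25
def rayleigh_pdf(v):
    return (v / SIGMA**2) * math.exp(-v**2 / (2 * SIGMA**2))

# Average efficiency = integral of eta(v) * pdf(v) dv over [0, V_RAIL],
# here computed by midpoint-rule numerical integration.
N = 10000
dv = V_RAIL / N
avg_eff = sum(eta((i + 0.5) * dv) * rayleigh_pdf((i + 0.5) * dv) * dv
              for i in range(N))
print(f"average efficiency ~ {avg_eff:.3f}")
```

Swapping in a different envelope PDF (for WCDMA, say) or a different per-voltage efficiency curve changes only the two functions, which is what makes this style of analysis signal-agnostic.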
Abstract:
Opportunities offered by high performance computing provide a significant degree of promise in the enhancement of the performance of real-time flood forecasting systems. In this paper, a real-time framework for probabilistic flood forecasting through data assimilation is presented. The distributed rainfall-runoff real-time interactive basin simulator (RIBS) model is selected to simulate the hydrological process in the basin. Although the RIBS model is deterministic, it is run probabilistically using the results of a previous calibration by the authors, which identified the probability distribution functions that best characterise the most relevant model parameters. Adaptive techniques improve flood forecasts because the model can be adapted to observations in real time as new information becomes available. The new adaptive forecast model, based on genetic programming as a data assimilation technique, is compared with the previously developed flood forecast model based on the calibration results. Both models are probabilistic, as they generate an ensemble of hydrographs that takes into account the different uncertainties inherent in any forecast process. The Manzanares River basin was selected as a case study, with the process being computationally intensive as it requires simulation of many replicas of the ensemble in real time.
Abstract:
This paper describes a two-part methodology for managing the risk posed by water supply variability to irrigated agriculture. First, an econometric model is used to explain the variation in the production value of irrigated agriculture. The explanatory variables include an index of irrigation water availability (surface storage levels), a price index representative of the crops grown in each geographical unit, and a time variable. The model corrects for autocorrelation and is applied to 16 Spanish provinces representative in terms of irrigated agriculture. In the second part, the fitted models are used for the economic evaluation of drought risk. Inflow variability in the hydrological system servicing each province is used to perform ex-ante evaluations of economic output for the upcoming irrigation season. The model's error and the probability distribution functions (PDFs) of the reservoirs' storage variations are used to generate Monte Carlo (Latin Hypercube) simulations of agricultural output 7 and 3 months prior to the irrigation season. The results of these simulations illustrate the different risk profiles of each management unit, which depend on farm productivity and on the probability distribution function of water inflow to reservoirs. The potential for ex-ante drought impact assessments is demonstrated. By complementing hydrological models, this method can assist water managers and decision-makers in managing reservoirs.
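The second-part procedure, simulating output from a storage-variation PDF plus the model's error, can be sketched with a small Latin Hypercube Monte Carlo run. The linear model, coefficients, and distributions below are illustrative placeholders, not the fitted Spanish-province models:

```python
import random
from statistics import NormalDist, mean

random.seed(1)

# Hypothetical fitted relationship: production value as a linear function
# of the reservoir storage variation plus the econometric model's error.
def production_value(storage_change, model_error):
    return 100.0 + 0.8 * storage_change + model_error

# Latin Hypercube sampling of the storage-variation PDF (assumed normal):
# one draw from each of N equal-probability strata, then shuffled.
N = 1000
storage_dist = NormalDist(mu=20.0, sigma=15.0)
u = [(i + random.random()) / N for i in range(N)]
random.shuffle(u)
storage_draws = [storage_dist.inv_cdf(ui) for ui in u]

# Add the model error term and simulate output for the coming season.
errors = NormalDist(mu=0.0, sigma=5.0).samples(N, seed=2)
sims = sorted(production_value(s, e) for s, e in zip(storage_draws, errors))

# Risk profile: expected output and a downside (5th percentile) value.
print(f"mean {mean(sims):.1f}  5th percentile {sims[int(0.05 * N)]:.1f}")
```

The gap between the mean and the 5th percentile is the kind of risk-profile information the paper extracts per province; stratifying the draws (Latin Hypercube) makes the tails converge with far fewer simulations than plain Monte Carlo.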
Abstract:
The purpose of this paper is to present a program written in Matlab-Octave for the simulation of the time evolution of student curricula, i.e., how students pass their subjects over time until graduation. The program computes, from the simulations, the academic performance rates for the subjects of the study plan for each semester as well as the overall rates, which are a) the efficiency rate, defined as the ratio of the number of students passing the exam to the number of students who registered for it, and b) the success rate, defined as the ratio of the number of students passing the exam to the number of students who not only registered for it but also actually took it. Additionally, we compute the rates for the bachelor academic degree established for Spain by the National Quality Evaluation and Accreditation Agency (ANECA), which are the graduation rate (the percentage of students who finish as scheduled in the plan or taking an extra year) and the efficiency rate (the percentage of credits that a student who graduated has actually taken). The simulation is done in terms of the probabilities of passing all the subjects in the study plan. The application of the simulator to Polytech students in Madrid, where requirements for passing are especially stiff in first- and second-year subjects, is particularly relevant for analyzing student cohorts and the probabilities of students finishing in the minimum of four years, or taking an extra year or two extra years, and so forth. It is a very useful tool when designing new study plans. The probability distribution of the random variable "number of semesters a student has taken to complete the curriculum and graduate" is difficult or even unfeasible to obtain analytically, and this is even truer when we incorporate uncertainty in parameter estimation. This is why we apply Monte Carlo simulation, which not only illustrates the stochastic process but also provides a method for computation. The stochastic simulator is proving to be a useful tool for identifying the subjects most critical in the distribution of the number of semesters to curriculum completion, and subsequently for decision making in terms of curriculum planning and passing standards in the University. Simulations are performed through a graphical interface, where the results are also presented in appropriate figures. The project has been funded by the Call for Innovation in Education Projects of Universidad Politécnica de Madrid (UPM) through a project of its school Escuela Técnica Superior de Ingenieros Industriales (ETSII) during the period September 2010-September 2011.
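The kind of stochastic simulator described can be sketched as follows. The study plan, pass probabilities, and the no-prerequisite retake rule below are hypothetical simplifications for illustration, not the UPM simulator's actual rules:

```python
import random
from collections import Counter

random.seed(42)

# Hypothetical 4-semester plan: each inner list holds the per-subject
# probability of passing in one attempt (illustrative values only).
PLAN = [
    [0.6, 0.7, 0.7],   # stiff first-year subjects
    [0.7, 0.8],
    [0.85, 0.9],
    [0.9, 0.95],
]

def semesters_to_graduate(plan, max_semesters=30):
    """Simulate one student: each semester, attempt every released,
    not-yet-passed subject; graduate when all subjects are passed.
    (Assumes failed subjects can be retaken alongside new ones.)"""
    pending, semester, released = [], 0, 0
    while semester < max_semesters:
        semester += 1
        if released < len(plan):
            pending.extend(plan[released])   # release this semester's subjects
            released += 1
        pending = [p for p in pending if random.random() >= p]  # failures stay
        if released == len(plan) and not pending:
            return semester
    return semester

# Monte Carlo estimate of the distribution of semesters to graduation.
dist = Counter(semesters_to_graduate(PLAN) for _ in range(20000))
on_time = dist[len(PLAN)] / 20000
print(f"P(graduate in {len(PLAN)} semesters) ~ {on_time:.3f}")
```

Lowering a single first-semester pass probability and rerunning immediately shows how that subject shifts the whole graduation-time distribution, which is the "critical subject" diagnosis the abstract describes.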
Abstract:
Lately, several researchers have pointed out that climate change is expected to increase temperatures and lower rainfall in Mediterranean regions, while simultaneously increasing the intensity of extreme rainfall events. These changes could have consequences regarding rainfall regime, erosion, sediment transport and water quality, soil management, and new designs in diversion ditches. Climate change is expected to result in increasingly unpredictable and variable rainfall, in amount and timing, changing seasonal patterns and increasing the frequency of extreme weather events. Consequently, the evolution of the frequency and intensity of drought periods is of utmost importance, as many processes in agro-ecosystems will be affected by them. Realising the complex and important consequences of an increasing frequency of extreme droughts in the Ebro River basin, our aim is to study the evolution of drought events at this site statistically, with emphasis on their occurrence and intensity. For this purpose, fourteen meteorological stations were selected based on the length of the rainfall series and the climatic classification, to obtain a representative untreated dataset from the river basin. Daily rainfall series from 1957 to 2002 were obtained from each meteorological station, and the frequency of no-rain periods, measured as numbers of consecutive dry days, was extracted. Based on these data, we study changes in the probability distribution over several sub-periods. Moreover, we use the Standardized Precipitation Index (SPI) to identify drought events on a yearly scale, and then fit log-linear models to the contingency tables between the SPI and the sub-periods; this fit is carried out with the help of ANOVA-type inference.
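The SPI step can be illustrated with a simplified z-score variant. The precipitation totals below are hypothetical, and the full SPI fits a gamma distribution to the totals and maps its CDF to standard-normal quantiles rather than standardizing directly:

```python
from statistics import mean, stdev

# Hypothetical annual precipitation totals (mm) for one station.
annual = [412, 388, 455, 301, 520, 365, 298, 440, 350, 275,
          490, 330, 405, 260, 380, 445, 310, 295, 425, 360]

# Simplified SPI: standardize each year's total against the series mean
# and standard deviation. (The full SPI's gamma-based transform yields
# the same interpretation scale in standard-normal units.)
mu, sigma = mean(annual), stdev(annual)
spi = [(x - mu) / sigma for x in annual]

# McKee-style classification: SPI <= -1.0 marks a drought year.
drought_years = [i for i, z in enumerate(spi) if z <= -1.0]
print(f"mean={mu:.1f} mm, drought years (indices): {drought_years}")
```

Cross-tabulating such drought indicators against sub-periods gives the contingency tables to which the log-linear models in the abstract are fitted.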
Abstract:
We propose distributed algorithms for sampling networks based on a new class of random walks that we call Centrifugal Random Walks (CRW). A CRW is a random walk that starts at a source and always moves away from it. We propose CRW algorithms for connected networks with arbitrary probability distributions, and for grids and networks with regular concentric connectivity with distance-based distributions. All CRW sampling algorithms select a node with the exact probability distribution, do not need warm-up, and end in a number of hops bounded by the network diameter.
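The defining "always moves away" property of a CRW can be sketched on a grid. This is a toy illustration of the walk itself; the paper's algorithms additionally choose the transition probabilities so that the terminal node follows an exact target distribution:

```python
import random

random.seed(7)

def centrifugal_walk(n, hops):
    """One Centrifugal Random Walk on a (2n+1) x (2n+1) grid centered at
    the source (0, 0): every hop moves to a uniformly chosen neighbor
    that is strictly farther (in Manhattan distance) from the source."""
    x, y = 0, 0
    for _ in range(hops):
        d = abs(x) + abs(y)
        candidates = [(nx, ny)
                      for nx, ny in ((x + 1, y), (x - 1, y),
                                     (x, y + 1), (x, y - 1))
                      if abs(nx) + abs(ny) > d
                      and max(abs(nx), abs(ny)) <= n]
        if not candidates:          # reached the border: the walk ends
            break
        x, y = random.choice(candidates)
    return x, y

# Distance from the source grows by exactly 1 per hop, so the walk
# terminates within the grid's Manhattan radius, never revisits a node,
# and needs no warm-up, mirroring the properties claimed in the abstract.
node = centrifugal_walk(n=10, hops=5)
print(node, abs(node[0]) + abs(node[1]))
```

Because every hop strictly increases the distance from the source, the hop count is bounded by the network diameter, unlike classical random-walk sampling, which needs a mixing (warm-up) period.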