945 results for Probability Distribution Function
Abstract:
Three ice type regimes at Ice Station Belgica (ISB), during the 2007 International Polar Year SIMBA (Sea Ice Mass Balance in Antarctica) expedition, were characterized and assessed for elevation, snow depth, ice freeboard and thickness. Analyses of the probability distribution functions showed the great potential of satellite-based altimetry for estimating ice thickness. In question is the altimeter sampling density required for reasonably accurate estimation of snow surface elevation, given the inherent spatial averaging. This study determines the number of laser altimeter 'hits' of the ISB floe, as a representative Antarctic floe of mixed first- and multi-year ice types, needed to statistically recreate the in situ-determined ice-thickness and snow depth distributions based on the fractional coverage of each ice type. Estimates of the fractional coverage and spatial distribution of the ice types, referred to as ice 'towns', for the 5 km**2 floe were obtained by in situ mapping and photo-visual documentation. Simulated ICESat altimeter tracks, with spot size ~70 m and spacing ~170 m, sampled the floe's towns, generating a buoyancy-derived ice thickness distribution. 115 altimeter hits were required to statistically recreate the regional thickness mean and distribution for a three-town assemblage of mixed first- and multi-year ice, and 85 hits for a two-town assemblage of first-year ice only: equivalent to 19.5 and 14.5 km, respectively, of continuous altimeter track over a floe region of similar structure. The results have significant implications for modelling the sea-ice sampling performance of the ICESat laser altimeter record, as well as for maximizing the sampling characteristics of satellite and airborne laser and radar altimetry missions for sea-ice thickness.
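As a hedged illustration of the buoyancy-derived thickness conversion mentioned in this abstract, the sketch below converts ice freeboard and snow depth to ice thickness via hydrostatic balance. The density values are typical literature figures for sea water, sea ice and snow, not values taken from this study.

```python
# Hypothetical illustration of the hydrostatic (buoyancy) conversion:
#   rho_i*T + rho_s*h_s = rho_w*(T - F)
# => T = (rho_w*F + rho_s*h_s) / (rho_w - rho_i)
RHO_WATER = 1024.0  # kg/m^3, sea water (assumed typical value)
RHO_ICE = 915.0     # kg/m^3, sea ice (assumed typical value)
RHO_SNOW = 320.0    # kg/m^3, snow (assumed typical value)

def ice_thickness(freeboard_m, snow_depth_m):
    """Ice thickness T (m) from ice freeboard F and snow depth h_s,
    assuming hydrostatic equilibrium of the snow-loaded floe."""
    return (RHO_WATER * freeboard_m + RHO_SNOW * snow_depth_m) / (RHO_WATER - RHO_ICE)
```

With these densities, 10 cm of freeboard under 20 cm of snow corresponds to roughly 1.5 m of ice, which is why small freeboard errors propagate strongly into thickness estimates.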
Abstract:
Context. The ESA Rosetta spacecraft, currently orbiting comet 67P/Churyumov-Gerasimenko, has already provided in situ measurements of dust grain properties from several instruments, particularly OSIRIS and GIADA. We propose adding value to those measurements by combining them with ground-based observations of the dust tail to monitor the overall, time-dependent dust-production rate and size distribution. Aims. To constrain the dust grain properties, we take Rosetta OSIRIS and GIADA results into account, and combine OSIRIS data during the approach phase (from late April to early June 2014) with a large data set of ground-based images acquired with the ESO Very Large Telescope (VLT) from February to November 2014. Methods. A Monte Carlo dust tail code, which has already been used to characterise the dust environments of several comets and active asteroids, has been applied to retrieve the dust parameters. Key properties of the grains (density, velocity, and size distribution) were obtained from Rosetta observations: these parameters were used as inputs to the code to considerably reduce the number of free parameters. In this way, the overall dust mass-loss rate and its dependence on the heliocentric distance could be obtained accurately. Results. The dust parameters derived from the inner coma measurements by OSIRIS and GIADA and from distant imaging using VLT data are consistent, except for the power index of the size-distribution function, which is alpha = -3, instead of alpha = -2, for grains smaller than 1 mm. This is possibly linked to the presence of fluffy aggregates in the coma. The onset of cometary activity occurs at approximately 4.3 AU, with a dust production rate of 0.5 kg/s, increasing up to 15 kg/s at 2.9 AU. This implies a dust-to-gas mass ratio varying between 3.8 and 6.5 for the best-fit model when combined with water-production rates from the MIRO experiment.
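A power-law size distribution like the alpha = -3 law reported above can be sampled directly by inversion of its cumulative distribution. The sketch below is purely illustrative (the function name and size bounds are hypothetical), not part of the Monte Carlo dust tail code itself.

```python
import random

def sample_grain_radius(a_min, a_max, alpha, u=None):
    """Draw a grain radius from dN/da proportional to a**alpha on
    [a_min, a_max] by inverse-CDF sampling (valid for alpha != -1).
    `u` is a uniform(0,1) variate; drawn internally if not given."""
    if u is None:
        u = random.random()
    k = alpha + 1.0
    # CDF(a) = (a**k - a_min**k) / (a_max**k - a_min**k); invert for a
    return (a_min**k + u * (a_max**k - a_min**k)) ** (1.0 / k)
```

For alpha = -3 the sampler concentrates almost all draws near a_min, which is why steepening the index from -2 to -3 shifts the coma brightness budget toward the smallest grains.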
Abstract:
Introduction: The source and deployment of finance are central issues in economic development. Since 1966, when the Soeharto Administration was inaugurated, Indonesian economic development has relied on funds in the form of aid from international organizations and foreign countries. After the 1990s, a further abundant inflow of capital sustained rapid economic development. Foreign funding was the basis of Indonesian economic growth. This paper will describe the mechanism for allocating funds in the Indonesian economy. It will identify the problems this mechanism generated in the Indonesian experience, and it will attempt to explain why there was a collapse of the financial system in the wake of the Asian Currency Crisis of 1997.
History of the Indonesian financial system: The year 1966 saw the emergence of commercial banks in Indonesia. It can be said that before 1966 a financial system hardly existed, a fact commonly attributed to economic disruptions such as the consecutive runs of fiscal deficit and hyperinflation under the Soekarno Administration. After 1966, with the inauguration of Soeharto, a regulatory system of financial legislation, e.g. central banking law and banking regulation, was introduced and implemented, and the banking sector that is the basis of the current financial system in Indonesia was built up. The Indonesian financial structure was significantly altered by the first financial reform of 1983. Between 1966 and 1982, the banking sector consisted of Bank Indonesia (the Central Bank) and the state-owned banks. There was also a system for distributing the abundant public revenue derived from the soaring oil price of the 1970s.
The public finance distribution function incorporated in the Indonesian financial system changed after the successive financial reforms of 1983 and 1988, when there was a move away from the monopoly-market style dominated by state-owned banks (a system of public finance distribution that operated at the discretion of the government) towards a modern market mechanism.
The five phases of development: The Indonesian financial system developed in five phases between 1966 and the present. The first period (1966-72) was its formative period, the second (1973-82) its policy-based finance period under soaring oil prices, the third (1983-91) its financial-reform period, the fourth (1992-97) its period of expansion, and the fifth (1998-) its period of financial restructuring. The first section of this paper summarizes the financial policies operative during each of the periods identified above. In the second section, changes in the financial sector in response to these policies are examined, and an analysis of these changes shows that an important development of the financial sector occurred during the financial-reform period. In the third section, the focus of analysis shifts from the general financial sector to the performance of particular commercial banks: changes in commercial banks' lending and fund-raising behaviour after the 1990s are analysed by comparing several banking groups in terms of their ownership and foundation time. The last section summarizes the foregoing analyses and examines the problems that remain in the Indonesian financial sector, which is still undergoing restructuring.
Abstract:
The objective of this thesis is the development of cooperative localization and tracking algorithms using nonparametric message passing techniques. In contrast to the best-known techniques, the goal is to estimate the posterior probability density function (PDF) of the position of each sensor. This problem can be solved using a Bayesian approach, but it is intractable in the general case. Nevertheless, a particle-based approximation (via a nonparametric representation) and an appropriate factorization of the joint PDFs (using message passing methods) make the Bayesian approach tractable for inference in sensor networks. The well-known method for this problem, nonparametric belief propagation (NBP), can lead to inaccurate beliefs and possible non-convergence in loopy networks. Therefore, we propose four novel algorithms which alleviate these problems: nonparametric generalized belief propagation (NGBP) based on a junction tree (NGBP-JT), NGBP based on a pseudo-junction tree (NGBP-PJT), NBP based on spanning trees (NBP-ST), and uniformly-reweighted NBP (URW-NBP). We also extend NBP for cooperative localization in mobile networks. In contrast to previous methods, we use optional smoothing, provide a novel communication protocol, and increase the efficiency of the sampling techniques. Moreover, we propose novel algorithms for distributed tracking, in which the goal is to track a passive object which cannot locate itself. In particular, we develop distributed particle filtering (DPF) based on three asynchronous belief consensus (BC) algorithms: standard belief consensus (SBC), broadcast gossip (BG), and belief propagation (BP). Finally, the last part of this thesis includes an experimental analysis of some of the proposed algorithms, in which we found that results based on real measurements are very similar to results based on theoretical models.
Abstract:
We have developed a new projector model specifically tailored for fast list-mode tomographic reconstructions in positron emission tomography (PET) scanners with parallel planar detectors. The model provides an accurate estimation of the probability distribution of coincidence events defined by pairs of scintillating crystals. This distribution is parameterized with 2D elliptical Gaussian functions defined in planes perpendicular to the main axis of the tube of response (TOR). The parameters of these Gaussian functions have been obtained by fitting Monte Carlo simulations that include positron range, acolinearity of gamma rays, as well as detector attenuation and scatter effects. The proposed model has been applied efficiently in list-mode reconstruction algorithms. Evaluation with Monte Carlo simulations of a rotating high-resolution PET scanner indicates that this model yields a better recovery-to-noise ratio in OSEM (ordered-subsets expectation-maximization) reconstruction than list-mode reconstruction with a symmetric circular Gaussian TOR model, or histogram-based OSEM with a precalculated system matrix using Monte Carlo simulated models and symmetries.
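A minimal sketch of the kind of 2D elliptical Gaussian used to parameterize the TOR cross-section is shown below. The parameter names are illustrative; the actual fitted values come from the Monte Carlo simulations described in the abstract.

```python
import math

def elliptical_gaussian_2d(x, y, sigma_x, sigma_y, rho=0.0):
    """Normalized 2D (possibly correlated) Gaussian density evaluated at
    offsets (x, y) from the TOR centre, in a plane perpendicular to the
    TOR axis. sigma_x/sigma_y set the ellipse axes; rho its tilt."""
    norm = 1.0 / (2.0 * math.pi * sigma_x * sigma_y * math.sqrt(1.0 - rho**2))
    z = ((x / sigma_x)**2
         - 2.0 * rho * (x / sigma_x) * (y / sigma_y)
         + (y / sigma_y)**2)
    return norm * math.exp(-z / (2.0 * (1.0 - rho**2)))
```

Allowing sigma_x != sigma_y is what distinguishes this elliptical model from the symmetric circular Gaussian TOR model it is compared against.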
Abstract:
This paper presents some of the results of a method to determine the main reliability functions of concentrator solar cells. High-concentration GaAs single-junction solar cells have been tested in an accelerated life test. The method can be directly applied to multi-junction solar cells. The main conclusion of the test is that these solar cells are robust devices with a very low probability of failure caused by degradation during their operational life (more than 30 years). The probability of continued operation (i.e. the reliability function R(t)) is evaluated for two nominal operating conditions of these cells, namely simulated concentration ratios of 700 and 1050 suns. A preliminary determination of the mean time to failure indicates a value much higher than the intended operational lifetime of the concentrator cells.
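As a hedged sketch of what evaluating a reliability function R(t) and a mean time to failure can look like, the snippet below assumes a Weibull failure model; this is a common choice in accelerated-life-test analysis, and is an assumption here, not the distribution the paper reports.

```python
import math

def weibull_reliability(t, eta, beta):
    """R(t) = exp(-(t/eta)**beta): probability that a device survives
    beyond time t, for scale eta and shape beta (hypothetical model)."""
    return math.exp(-((t / eta) ** beta))

def weibull_mttf(eta, beta):
    """Mean time to failure: MTTF = eta * Gamma(1 + 1/beta)."""
    return eta * math.gamma(1.0 + 1.0 / beta)
```

With beta = 1 this reduces to the exponential (constant failure rate) case, where MTTF equals the scale parameter eta.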
Abstract:
This paper presents a theoretical analysis and an optimization method for envelope amplifiers. Highly efficient envelope amplifiers based on a switching converter in parallel or in series with a linear regulator have been analyzed and optimized. The results of the optimization process are shown, and the two architectures are compared regarding their complexity and efficiency. The proposed optimization method is based on prior knowledge of the transmitted signal type (OFDM, WCDMA, ...) and can be applied to any signal type as long as the envelope probability distribution is known. Finally, it is shown that the analyzed architectures have an inherent efficiency limit.
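To illustrate how a known envelope probability distribution feeds an efficiency calculation, the sketch below numerically averages the efficiency of a simple series linear stage over a Rayleigh-distributed envelope (a common model for OFDM envelopes). The efficiency expression used, eta_avg = E[v^2] / (V_s * E[v]) for a resistive load, is a textbook simplification assumed for illustration, not the paper's architecture model.

```python
import math

def average_efficiency_linear_stage(v_supply, sigma, n=20000):
    """Average efficiency E[P_out]/E[P_in] of a series linear stage fed
    from v_supply, for a Rayleigh(sigma) envelope v (assumed model).
    Integrates numerically with the midpoint rule; the envelope is
    assumed to stay well below v_supply, so the clipped tail is ignored."""
    dv = v_supply / n
    e_v = e_v2 = 0.0
    for i in range(n):
        v = (i + 0.5) * dv
        pdf = (v / sigma**2) * math.exp(-v**2 / (2.0 * sigma**2))
        e_v += v * pdf * dv       # E[v]   -> proportional to input power
        e_v2 += v * v * pdf * dv  # E[v^2] -> proportional to output power
    return e_v2 / (v_supply * e_v)
```

The result falls quickly as the supply rail grows relative to the typical envelope amplitude, which is the intuition behind combining the linear stage with a switching converter.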
Abstract:
This paper deals with the detection and tracking of an unknown number of targets using a Bayesian hierarchical model with target labels. To approximate the posterior probability density function, we develop a two-layer particle filter: one layer deals with track initiation, the other with track maintenance. In addition, a parallel partition method is proposed to sample the states of the surviving targets.
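The paper's two-layer filter is not reproduced here, but the underlying idea of approximating a posterior PDF with particles can be sketched with a minimal bootstrap particle filter for a single 1-D target. All names, noise levels, and the random-walk motion model are illustrative assumptions.

```python
import math
import random

def particle_filter_1d(measurements, n=2000, meas_std=1.0, proc_std=0.5):
    """Bootstrap particle filter approximating the posterior PDF of a 1-D
    target position from noisy position measurements; returns the
    posterior-mean estimate after the last measurement."""
    # initialize the particle cloud loosely around the first measurement
    particles = [random.gauss(measurements[0], 5.0) for _ in range(n)]
    for z in measurements:
        # predict: propagate particles through the random-walk motion model
        particles = [p + random.gauss(0.0, proc_std) for p in particles]
        # update: weight particles by the Gaussian measurement likelihood
        weights = [math.exp(-0.5 * ((z - p) / meas_std) ** 2) for p in particles]
        # resample: multinomial resampling proportional to the weights
        particles = random.choices(particles, weights=weights, k=n)
    return sum(particles) / n
```

A multi-target, labelled version must additionally handle track birth and death, which is exactly what the two-layer structure above is for.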
Abstract:
Opportunities offered by high-performance computing hold significant promise for enhancing the performance of real-time flood forecasting systems. In this paper, a real-time framework for probabilistic flood forecasting through data assimilation is presented. The distributed rainfall-runoff real-time interactive basin simulator (RIBS) model is selected to simulate the hydrological processes in the basin. Although the RIBS model is deterministic, it is run in a probabilistic way using the results of a calibration developed in previous work by the authors, which identifies the probability distribution functions that best characterise the most relevant model parameters. Adaptive techniques improve flood forecasts because the model can be adapted to observations in real time as new information becomes available. The new adaptive forecast model, based on genetic programming as a data assimilation technique, is compared with the previously developed flood forecast model based on the calibration results. Both models are probabilistic in that they generate an ensemble of hydrographs, taking into account the different uncertainties inherent in any forecast process. The Manzanares River basin was selected as a case study; the process is computationally intensive, as it requires simulating many replicas of the ensemble in real time.
Abstract:
Many existing engineering works model the statistical characteristics of the entities under study as normal distributions. These models are eventually used for decision making, which in practice requires defining the classification region corresponding to the desired confidence level. Surprisingly, however, a great number of computer vision works using multidimensional normal models leave confidence regions unspecified, or fail to establish them correctly, due to misconceptions about the features of Gaussian functions or wrong analogies with the unidimensional case. The resulting regions incur deviations that can be unacceptable in high-dimensional models. Here we provide a comprehensive derivation of the optimal confidence regions for multivariate normal distributions of arbitrary dimensionality. To this end, we first derive the condition for region optimality of general continuous multidimensional distributions, and then apply it to the widespread case of the normal probability density function. The obtained results are used to analyze the confidence error incurred by previous works related to vision research, showing that deviations caused by wrong regions may become unacceptable as dimensionality increases. To support the theoretical analysis, a quantitative example in the context of moving object detection by means of background modeling is given.
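For the two-dimensional normal case, the correct confidence region has a simple closed form: the set of points whose squared Mahalanobis distance falls below the chi-square (d = 2) quantile. The sketch below (function names are illustrative; it is not the paper's general derivation) shows that test; applying the 1-D rule per axis (e.g. |x - mu| < 1.96 sigma on each coordinate) does not yield a p-confidence region in d > 1.

```python
import math

def chi2_quantile_2d(p):
    """For d = 2 the chi-square CDF is 1 - exp(-x/2), so the p-quantile
    has the closed form -2*ln(1 - p)."""
    return -2.0 * math.log(1.0 - p)

def in_confidence_region(x, mu, cov_inv, p):
    """True iff point x lies inside the p-confidence region of a 2-D
    normal with mean mu and inverse covariance cov_inv: i.e. iff the
    squared Mahalanobis distance is at most the chi2(2) quantile."""
    dx = (x[0] - mu[0], x[1] - mu[1])
    m2 = (dx[0] * (cov_inv[0][0] * dx[0] + cov_inv[0][1] * dx[1])
          + dx[1] * (cov_inv[1][0] * dx[0] + cov_inv[1][1] * dx[1]))
    return m2 <= chi2_quantile_2d(p)
```

Note that the 95% threshold on squared Mahalanobis distance in 2-D is about 5.99, well above the 1.96**2 = 3.84 suggested by a naive 1-D analogy, which is exactly the kind of deviation the paper quantifies.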
Abstract:
The purpose of this paper is to present a program written in Matlab-Octave for the simulation of the time evolution of student curricula, i.e., how students pass their subjects over time until graduation. From the simulations, the program computes the academic performance rates for the subjects of the study plan for each semester, as well as the overall rates, which are a) the efficiency rate, defined as the ratio of the number of students passing the exam to the number of students who registered for it, and b) the success rate, defined as the ratio of the number of students passing the exam to the number of students who not only registered for it but actually took it. Additionally, we compute the rates for the bachelor academic degree established for Spain by the National Quality Evaluation and Accreditation Agency (ANECA): the graduation rate (measured as the percentage of students who finish as scheduled in the plan or taking one extra year) and the efficiency rate (measured as the percentage of credits which a student who graduated has actually taken). The simulation is done in terms of the probabilities of passing all the subjects in the study plan. The application of the simulator to Polytech students in Madrid, where requirements for passing are especially stiff in first- and second-year subjects, is particularly relevant for analyzing student cohorts and the probabilities of students finishing in the minimum of four years, taking an extra year, two extra years, and so forth. It is a very useful tool when designing new study plans. The calculation of the probability distribution of the random variable "number of semesters a student has taken to complete the curriculum and graduate" is difficult or even unfeasible to obtain analytically, and this is even truer when we incorporate uncertainty in parameter estimation.
This is why we apply Monte Carlo simulation, which provides not only an illustration of the stochastic process but also a method of computation. The stochastic simulator is proving to be a useful tool for identifying the subjects most critical to the distribution of the number of semesters for curriculum completion, and subsequently for decision making in terms of curriculum planning and passing standards in the University. Simulations are performed through a graphical interface, where the results are also presented in appropriate figures. The project was funded by the Call for Innovation in Education Projects of Universidad Politécnica de Madrid (UPM) through a project of its school Escuela Técnica Superior de Ingenieros Industriales (ETSII) during the period September 2010-September 2011.
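A stripped-down, hypothetical sketch of such a Monte Carlo simulator is shown below. The real program is written in Matlab-Octave and models registration and the ANECA rates; here each subject is simply attempted every semester with a fixed pass probability, with no prerequisites, so all names and rules are illustrative assumptions.

```python
import random

def semesters_to_graduate(pass_prob, max_semesters=30):
    """Simulate one student: each semester the student attempts every
    subject not yet passed and passes each one independently with its
    probability; returns the number of semesters needed to pass all."""
    pending = dict(pass_prob)
    for semester in range(1, max_semesters + 1):
        for subject in list(pending):
            if random.random() < pending[subject]:
                del pending[subject]
        if not pending:
            return semester
    return max_semesters

def simulate_cohort(pass_prob, n_students=10000):
    """Monte Carlo estimate of the distribution of the random variable
    'number of semesters to complete the curriculum and graduate'."""
    counts = {}
    for _ in range(n_students):
        s = semesters_to_graduate(pass_prob)
        counts[s] = counts.get(s, 0) + 1
    return counts
```

Even this toy version reproduces the qualitative effect the paper exploits: a single subject with a low pass probability stretches the right tail of the graduation-time distribution far more than several moderately hard ones.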
Abstract:
Lately, several researchers have pointed out that climate change is expected to increase temperatures and lower rainfall in Mediterranean regions, while simultaneously increasing the intensity of extreme rainfall events. These changes could have consequences for the rainfall regime, erosion, sediment transport and water quality, soil management, and new designs of diversion ditches. Climate change is expected to result in increasingly unpredictable and variable rainfall, in amount and timing, changing seasonal patterns and increasing the frequency of extreme weather events. Consequently, the evolution of the frequency and intensity of drought periods is of great importance, as many processes in agro-ecosystems will be affected by them. Recognising the complex and important consequences of an increasing frequency of extreme droughts in the Ebro River basin, our aim is to study the evolution of drought events at this site statistically, with emphasis on their occurrence and intensity. For this purpose, fourteen meteorological stations were selected, based on the length of the rainfall series and the climatic classification, to obtain a representative untreated dataset for the river basin. Daily rainfall series from 1957 to 2002 were obtained from each meteorological station, and the frequency of no-rain periods, measured as the number of consecutive dry days, was extracted. Based on these data, we study changes in the probability distribution over several sub-periods. Moreover, we used the Standardized Precipitation Index (SPI) to identify drought events on a yearly scale, and then used this index to fit log-linear models to the contingency tables between the SPI index and the sub-periods; this fit is carried out with the help of ANOVA inference.
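The extraction of no-rain period lengths from a daily series can be sketched as follows; the 0.1 mm dry-day threshold is an illustrative assumption, not necessarily the paper's choice.

```python
def dry_spell_lengths(daily_rainfall_mm, threshold=0.1):
    """Return the lengths (in days) of consecutive no-rain runs in a
    daily rainfall series; days with rainfall below `threshold` mm
    count as dry (threshold is an assumed, illustrative value)."""
    runs, current = [], 0
    for r in daily_rainfall_mm:
        if r < threshold:
            current += 1
        elif current > 0:
            runs.append(current)
            current = 0
    if current > 0:  # close a run that reaches the end of the series
        runs.append(current)
    return runs
```

The resulting run-length samples, computed per sub-period, are exactly the kind of data whose probability distribution the study compares across sub-periods.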
Abstract:
We propose distributed algorithms for sampling networks based on a new class of random walks that we call Centrifugal Random Walks (CRW). A CRW is a random walk that starts at a source and always moves away from it. We propose CRW algorithms for connected networks with arbitrary probability distributions, and for grids and networks with regular concentric connectivity with distance-based distributions. All CRW sampling algorithms select a node with the exact probability distribution, do not need warm-up, and end in a number of hops bounded by the network diameter.
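The "exact probability distribution" property can be illustrated with a toy version of the idea: if an outward-moving walk stops at ring d (distance d from the source) with conditional probability p_d divided by the remaining probability mass, the unconditional stopping distribution is exactly (p_0, p_1, ...). This is a hypothetical simplification written for illustration, not the paper's CRW algorithms.

```python
import random

def crw_stop_ring(ring_probs, rng=random):
    """Move outward ring by ring; stop at ring d with conditional
    probability p_d / (1 - p_0 - ... - p_{d-1}), so that ring d is
    selected with unconditional probability exactly ring_probs[d]."""
    remaining = 1.0
    for d, p in enumerate(ring_probs):
        if remaining <= 0.0:
            break
        if rng.random() < p / remaining:
            return d
        remaining -= p
    return len(ring_probs) - 1  # numerical safety net for float round-off
```

Because the walk only ever moves outward, the number of hops is bounded by the number of rings, mirroring the diameter bound stated in the abstract.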
Abstract:
Several authors have analysed the changes in the probability density function of solar radiation at different time resolutions. Others have studied the significance of these changes for produced-energy calculations. We have applied different transformations to four Spanish databases in order to clarify the interrelationship between radiation models and produced-energy estimations. Our contribution is straightforward: the complexity of a solar radiation model needed for yearly energy calculations is very low. Twelve monthly mean values of solar radiation are enough to estimate energy with errors below 3%. Time resolutions finer than hourly samples do not significantly improve the results of energy estimations.
Abstract:
The paper discusses the dispersion relation for longitudinal electron waves propagating in a collisionless, homogeneous, isotropic plasma which contains both Maxwellian and suprathermal electrons. It is found that the dispersion curve, known to have two separate branches for zero suprathermal energy spread, depends sensitively on this quantity. As the energy half-width of the suprathermal population increases, the branches approach each other until they touch at a connexion point, for a small critical value of that half-width. The topology of the dispersion curves is different for half-widths above and below critical, and this can affect the use of wave-propagation measurements as a diagnostic technique for determining the electron distribution function. Both the distance between the branches and the spatial damping near the connexion frequency depend on the half-width, if below critical, and can be used to determine it. The theory is applied to experimental data.