892 results for estimating conditional probabilities
Abstract:
This paper presents a new approach to the LU decomposition method for the simulation of stationary and ergodic random fields. The approach overcomes the size limitations of LU and is suitable for simulations of any size. The proposed approach can facilitate fast updating of generated realizations with new data, when appropriate, without repeating the full simulation process. Based on a novel column partitioning of the L matrix, expressed in terms of successive conditional covariance matrices, the approach presented here demonstrates that LU simulation is equivalent to the successive solution of kriging residual estimates plus random terms. Consequently, it can be used for the LU decomposition of matrices of any size. The simulation approach is termed conditional simulation by successive residuals because, at each step, a small set (group) of random variables is simulated with an LU decomposition of an updated conditional covariance matrix of residuals. The simulated group is then used to estimate residuals without the need to solve large systems of equations.
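To make the successive-residuals idea concrete, here is a minimal numpy sketch of group-wise sequential Gaussian conditioning, which the abstract shows to be equivalent to LU simulation; the 1-D grid, the exponential covariance model, and the group size are illustrative assumptions rather than details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D grid and exponential covariance model (assumed, not from the paper).
n, group_size = 200, 20
x = np.linspace(0.0, 100.0, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 15.0)   # full covariance matrix

z = np.zeros(n)             # simulated field
done = np.zeros(n, bool)    # indices already simulated / conditioned on

for start in range(0, n, group_size):
    grp = np.arange(start, min(start + group_size, n))
    if done.any():
        known = np.where(done)[0]
        # Kriging of residuals: conditional mean and covariance of the group
        # given everything simulated so far.
        W = np.linalg.solve(C[np.ix_(known, known)], C[np.ix_(known, grp)])
        mean = W.T @ z[known]
        cov = C[np.ix_(grp, grp)] - C[np.ix_(grp, known)] @ W
    else:
        mean, cov = np.zeros(grp.size), C[np.ix_(grp, grp)]
    # Only a small factorization is needed per group.
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(grp.size))
    z[grp] = mean + L @ rng.standard_normal(grp.size)
    done[grp] = True
```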
Abstract:
Distance sampling using line transects has not previously been used or tested for estimating koala abundance. In July 2001, a pilot survey was conducted to compare the use of line transects with strip transects for estimating koala abundance. Both methods provided a similar estimate of density. On the basis of the results of the pilot survey, the distribution and abundance of koalas in the Pine Rivers Shire, south-east Queensland, were determined using line-transect sampling. In total, 134 lines (length 64 km) were used to sample bushland areas. Eighty-two independent koalas were sighted. Analysis of the frequency distribution of sighting distances using the software program DISTANCE enabled a global detection function to be estimated for survey sites in bushland areas across the Shire. Abundance in urban parts of the Shire was estimated from densities obtained from total counts at eight urban sites that ranged from 26 to 51 ha in size. Koala abundance in the Pine Rivers Shire was estimated at 4584 (95% confidence interval, 4040-5247). Line-transect sampling is a useful method for estimating koala abundance provided experienced koala observers are used when conducting surveys.
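As a rough illustration of the density calculation behind line-transect analysis, the sketch below fits a half-normal detection function to perpendicular sighting distances by maximum likelihood and converts it into a density estimate; the distances and transect length are invented, and this is a simplification of what the DISTANCE software does.

```python
import numpy as np

# Hypothetical perpendicular sighting distances (metres) and total line length (km).
distances_m = np.array([5, 12, 3, 25, 8, 18, 2, 30, 10, 15], dtype=float)
total_line_length_km = 64.0

# Half-normal detection function g(x) = exp(-x^2 / (2 sigma^2)).
# The MLE of sigma^2 for perpendicular distances is the mean squared distance.
sigma = np.sqrt(np.mean(distances_m ** 2))

# Effective strip half-width (metres): integral of g(x) from 0 to infinity.
esw_m = sigma * np.sqrt(np.pi / 2.0)

# Density = n / (2 * L * ESW), converted to animals per square kilometre.
n = distances_m.size
L_m = total_line_length_km * 1000.0
density_per_km2 = n / (2.0 * L_m * esw_m) * 1e6
print(f"sigma = {sigma:.1f} m, ESW = {esw_m:.1f} m, D = {density_per_km2:.2f} / km^2")
```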
Abstract:
This study aimed to develop a practical method of estimating energy expenditure (EE) during tennis. Twenty-four elite female tennis players first completed a tennis-specific graded test in which five different intensity levels were applied randomly. Each intensity level was intended to simulate a game of singles tennis and comprised six 14 s periods of activity alternated with 20 s of active rest. Oxygen consumption (VO2) and heart rate (HR) were measured continuously and each player's rate of perceived exertion (RPE) was recorded at the end of each intensity level. Rate of energy expenditure (EEVO2) during the test was calculated using the sum of VO2 during play and the 'O2 debt' during recovery, divided by the duration of the activity. There were significant individual linear relationships between EEVO2 and RPE and between EEVO2 and HR (r ≥ 0.89 and r ≥ 0.93, respectively; p
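The EE calculation described above reduces to a simple ratio; the sketch below illustrates it with invented numbers and an assumed energetic equivalent of roughly 20.9 kJ per litre of O2.

```python
# Rough sketch of the energy-expenditure calculation described above:
# rate of EE = (VO2 during play + O2 debt during recovery) / activity duration,
# converted to kilojoules with an assumed equivalent of ~20.9 kJ per litre of O2.
# All input numbers below are invented for illustration.

vo2_play_litres = 12.4        # total O2 consumed during the activity periods
o2_debt_litres = 3.1          # excess O2 consumed during recovery
duration_min = 7.0            # duration of the activity being assessed

KJ_PER_LITRE_O2 = 20.9        # assumed energetic equivalent of oxygen

ee_rate_kj_per_min = (vo2_play_litres + o2_debt_litres) * KJ_PER_LITRE_O2 / duration_min
print(f"Estimated EE rate: {ee_rate_kj_per_min:.1f} kJ/min")
```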
Abstract:
Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low speed damageability), is one of the most important attributes. In order to be able to fulfill the increased requirements in the framework of shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made is the actual part thickness which, in reality, can vary locally. However, almost always a constant thickness value is defined throughout the entire part due to complexity reasons. On the other hand, for precise fracture analysis within FEM, the correct thickness consideration is one key enabler. Thus, availability of per element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to an improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms based on ray tracing and nearest neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique’s weaknesses and hint towards a new, integrated, approach to the problem that linearly combines the estimates produced by each algorithm.
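As a toy illustration of the nearest-neighbour flavour of thickness estimation, the sketch below approximates local thickness as twice the distance from a midplane point to the closest outer-surface sample; the synthetic flat plate stands in for a real CAD part, and a ray-tracing variant would instead intersect surface normals with the CAD geometry.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

# Synthetic stand-in for a CAD part: a flat plate of varying thickness t(x).
# Surface samples lie at z = +t/2 and z = -t/2; the midplane mesh sits at z = 0.
xy = rng.uniform(0.0, 100.0, size=(20000, 2))
t_true = 2.0 + 0.01 * xy[:, 0]                      # thickness varies from 2.0 to 3.0 mm
surface = np.vstack([
    np.column_stack([xy, +t_true / 2.0]),
    np.column_stack([xy, -t_true / 2.0]),
])

midplane = np.column_stack([rng.uniform(0, 100, size=(200, 2)), np.zeros(200)])

# Nearest-neighbour estimate: thickness ~= 2 * distance to the closest surface sample.
tree = cKDTree(surface)
dist, _ = tree.query(midplane, k=1)
thickness_est = 2.0 * dist

print("mean estimated thickness (mm):", thickness_est.mean())
```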
Abstract:
The financial literature and the financial industry often use zero-coupon yield curves as input for testing hypotheses, pricing assets, or managing risk, and they assume the provided data are accurate. We analyse the implications of the methodology and of the sample selection criteria used to estimate the zero-coupon bond yield term structure on the resulting volatility of spot rates with different maturities. We obtain the volatility term structure using historical volatilities and EGARCH volatilities. As input for these volatilities we consider our own spot rate estimates from GovPX bond data and three popular interest rate data sets: from the Federal Reserve Board, from the US Department of the Treasury (H15), and from Bloomberg. We find strong evidence that the resulting zero-coupon bond yield volatility estimates, as well as the correlation coefficients among spot and forward rates, depend significantly on the data set. We observe relevant differences in economic terms when the volatilities are used to price derivatives.
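A small sketch of how a volatility term structure can be computed from a panel of zero-coupon spot rates (annualized historical volatility of daily rate changes per maturity, plus the correlation matrix); the simulated rates below stand in for the GovPX, Federal Reserve, H15, or Bloomberg data, and an EGARCH fit would replace the plain standard deviation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Placeholder panel of daily zero-coupon spot rates (in %), columns = maturities.
maturities = [0.25, 1, 2, 5, 10]                  # years
rates = 3.0 + np.cumsum(rng.normal(0, 0.03, size=(750, len(maturities))), axis=0)

# Historical volatility term structure: stdev of daily rate changes, annualized.
changes = np.diff(rates, axis=0)
vol_term_structure = changes.std(axis=0, ddof=1) * np.sqrt(252)

# Correlations among spot-rate changes of different maturities.
corr = np.corrcoef(changes, rowvar=False)

for m, v in zip(maturities, vol_term_structure):
    print(f"{m:>5} y: annualized vol of changes = {v:.3f} %")
print("corr(3m changes, 10y changes) =", round(corr[0, -1], 2))
```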
Abstract:
27th Annual Conference of the European Cetacean Society. Setúbal, Portugal, 8-10 April 2013.
Abstract:
Modern real-time systems, with a more flexible and adaptive nature, demand approaches for timeliness evaluation based on probabilistic measures of meeting deadlines. In this context, simulation can emerge as an adequate solution for understanding and analysing the timing behaviour of actual systems. However, care must be taken with the obtained outputs; otherwise the results may lack credibility. It is particularly important to consider that we are more interested in values from the tail of a probability distribution (near worst-case probabilities) than in deriving confidence on mean values. We approach this subject by considering the random nature of simulation output data. We start by discussing well-known approaches for estimating distributions from simulation output, and the confidence which can be attributed to their mean values. This is the basis for a discussion on the applicability of such approaches to derive confidence on the tail of distributions, where the worst case is expected to be.
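One standard way to attach confidence to a tail value of simulated output is a distribution-free confidence interval for a high quantile, built from order statistics and the binomial distribution; the sketch below uses invented response times and is not necessarily the construction discussed in the paper.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(3)

# Invented stand-in for simulation output: observed response times (ms).
samples = np.sort(rng.lognormal(mean=1.0, sigma=0.4, size=2000))
n = samples.size

p = 0.99          # we care about the tail (near worst case), not the mean
conf = 0.95

# Distribution-free CI for the p-quantile: choose order-statistic ranks (lo, hi)
# such that P(lo <= Binomial(n, p) < hi) is roughly the desired confidence.
lo = int(binom.ppf((1 - conf) / 2, n, p))
hi = min(int(binom.ppf(1 - (1 - conf) / 2, n, p)) + 1, n)

point = samples[int(np.ceil(p * n)) - 1]
print(f"99th percentile ~ {point:.2f} ms, {conf:.0%} CI = "
      f"[{samples[max(lo - 1, 0)]:.2f}, {samples[hi - 1]:.2f}] ms")
```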
Abstract:
We propose an efficient algorithm to estimate the number of live computer nodes in a network. This algorithm is fully distributed, and has a time-complexity which is independent of the number of computer nodes. The algorithm is designed to take advantage of a medium access control (MAC) protocol which is prioritized; that is, if two or more messages on different nodes contend for the medium, then the node contending with the highest priority will win, and all nodes will know the priority of the winner.
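One classic way to exploit a prioritized MAC for counting, not necessarily the algorithm proposed here, is to let every node draw a random priority, observe the winning (maximum) priority, and invert the order statistic of the maximum of n uniforms; the simulation below sketches that idea.

```python
import random

random.seed(4)

def estimate_node_count(n_true, rounds=200):
    """Estimate the number of live nodes from the winning priorities of a
    prioritized MAC. Each round every node draws a random priority in (0, 1);
    only the maximum (the contention winner) is observed by everyone.
    For n i.i.d. uniforms, E[max] = n / (n + 1), so n is roughly m / (1 - m)."""
    winners = [max(random.random() for _ in range(n_true)) for _ in range(rounds)]
    m = sum(winners) / rounds
    return m / (1.0 - m)

for n in (10, 50, 200):
    print(n, round(estimate_node_count(n)))
```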
Abstract:
In this thesis we implement estimating procedures in order to estimate threshold parameters for continuous time threshold models driven by stochastic differential equations. The first procedure is based on the EM (expectation-maximization) algorithm applied to the threshold model built from the Brownian motion with drift process. The second procedure mimics one of the fundamental ideas in the estimation of thresholds in the time series context, that is, conditional least squares estimation. We implement this procedure not only for the threshold model built from the Brownian motion with drift process but also for more generic models such as the ones built from the geometric Brownian motion or the Ornstein-Uhlenbeck process. Both procedures are implemented for simulated data and the least squares estimation procedure is also implemented for real data of daily prices from a set of international funds. The first fund is the PF-European Sustainable Equities-R fund from the Pictet Funds company and the second is the Parvest Europe Dynamic Growth fund from the BNP Paribas company. The data for both funds are daily prices from the year 2004. The last fund to be considered is the Converging Europe Bond fund from the Schroder company and the data are daily prices from the year 2005.
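A minimal sketch of the conditional least squares idea for a threshold Brownian motion with drift observed on a discrete grid: for each candidate threshold the drift is estimated per regime and the residual sum of squares recorded; the simulated path and parameter values are illustrative, not those used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a discretized threshold Brownian motion with drift:
# dX = mu1*dt + sigma*dW below the threshold r, and mu2*dt + sigma*dW above it.
dt, n, r_true = 0.01, 5000, 0.5
mu1, mu2, sigma = 2.0, -2.0, 0.3
x = np.zeros(n)
for i in range(1, n):
    mu = mu1 if x[i - 1] <= r_true else mu2
    x[i] = x[i - 1] + mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Conditional least squares: grid search over candidate thresholds.
dx = np.diff(x)
state = x[:-1]

def css(r):
    below = state <= r
    rss = 0.0
    for mask in (below, ~below):
        if mask.sum() < 10:
            return np.inf
        mu_hat = dx[mask].mean() / dt        # per-regime drift estimate
        rss += np.sum((dx[mask] - mu_hat * dt) ** 2)
    return rss

grid = np.quantile(state, np.linspace(0.05, 0.95, 181))
r_hat = grid[np.argmin([css(r) for r in grid])]
print(f"true threshold = {r_true}, CLS estimate = {r_hat:.3f}")
```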
Abstract:
OBJECTIVE To investigate differences in HIV infection-related risk practices among female sex workers according to workplace, and the effects of homophily on estimating HIV prevalence. METHODS Data from 2,523 women, recruited using Respondent-Driven Sampling, were used for the study carried out in 10 Brazilian cities in 2008-2009. The study included female sex workers aged 18 and over. The questionnaire was completed by the subjects and included questions on characteristics of professional activity, sexual practices, use of drugs, HIV testing, and access to health services. HIV rapid tests were conducted. The participants were classified in two groups according to place of work: on the street or in indoor venues, such as nightclubs and saunas. To compare variable distributions by place of work, we used chi-square homogeneity tests, taking into consideration unequal selection probabilities as well as the structure of dependence between observations. We tested the effect of homophily by workplace on estimated HIV prevalence. RESULTS The highest HIV risk practices were associated with working on the streets, lower socioeconomic status, low regular smear test coverage, higher levels of crack use, and higher levels of syphilis serological scars, as well as higher prevalence of HIV infection. The effect of homophily was higher among sex workers in indoor venues. However, it did not affect the estimated prevalence of HIV, even after using a post-stratification by workplace procedure. CONCLUSIONS The findings suggest that strategies should focus on extending access to, and utilization of, health services. Prevention policies should be specifically aimed at street workers. Regarding the application of Respondent-Driven Sampling, the sample should be sufficient to estimate transition probabilities, as the network develops more quickly among sex workers in indoor venues.
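As an illustration of how unequal selection probabilities are handled in RDS data, the sketch below applies inverse-degree weighting in the style of the RDS-II (Volz-Heckathorn) estimator to an invented sample; it is not the exact estimation procedure used in the study.

```python
import numpy as np

rng = np.random.default_rng(6)

# Invented RDS-style sample: each respondent reports her network degree,
# her workplace, and her HIV test result (1 = positive).
n = 500
degree = rng.integers(1, 30, size=n)
street = rng.random(n) < 0.45
hiv = rng.random(n) < np.where(street, 0.08, 0.04)

# RDS-II-style estimator: weight each respondent by the inverse of her degree.
w = 1.0 / degree
prev_naive = hiv.mean()
prev_weighted = np.sum(w * hiv) / np.sum(w)

print(f"naive prevalence                 : {prev_naive:.3f}")
print(f"degree-weighted (RDS-II-style)   : {prev_weighted:.3f}")
```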
Abstract:
The use of fiber reinforced plastics has increased in the last decades due to their unique properties. Advantages of their use are related to low weight, high strength, and stiffness. Drilling of composite plates can be carried out on conventional machinery with some adaptations. However, the presence of typical defects like delamination can affect the mechanical properties of the produced parts. In this paper, the influence of delamination on the bearing stress of drilled hybrid carbon+glass/epoxy quasi-isotropic plates is studied by using image processing and analysis techniques. Results from the bearing test show that damage minimization is an important means of improving the mechanical properties of the joint area of the plate. The appropriateness of the image processing and analysis techniques used in the measurement of the damaged area is demonstrated.
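A toy sketch of the kind of image analysis used to quantify a delaminated region: threshold a grayscale inspection image and report the damaged area; the synthetic image, threshold, and pixel size are assumptions, not the study's actual data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic grayscale "inspection image" (0-255): darker pixels = damaged zone.
img = rng.normal(200, 10, size=(256, 256))
yy, xx = np.mgrid[0:256, 0:256]
damage = (xx - 128) ** 2 + (yy - 128) ** 2 < 40 ** 2     # circular delamination
img[damage] -= 120
img = np.clip(img, 0, 255)

# Simple global threshold to segment the damaged region, then measure its area.
threshold = 140.0
damaged_mask = img < threshold
pixel_area_mm2 = 0.05 ** 2                               # assumed pixel size: 0.05 mm
damaged_area_mm2 = damaged_mask.sum() * pixel_area_mm2
damage_fraction = damaged_mask.mean()

print(f"damaged area ~ {damaged_area_mm2:.1f} mm^2 ({damage_fraction:.1%} of the image)")
```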
Abstract:
Consolidation consists in scheduling multiple virtual machines onto fewer servers in order to improve resource utilization and to reduce operational costs due to power consumption. However, virtualization technologies do not offer performance isolation, causing application slowdown. In this work, we propose a performance enforcing mechanism composed of a slowdown estimator and an interference- and power-aware scheduling algorithm. The slowdown estimator determines, based on noisy slowdown data samples obtained from state-of-the-art slowdown meters, whether tasks will complete within their deadlines, invoking the scheduling algorithm if needed. When invoked, the scheduling algorithm builds performance- and power-aware virtual clusters to successfully execute the tasks. We conduct simulations injecting synthetic jobs whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our strategy can be efficiently integrated with state-of-the-art slowdown meters to fulfil contracted SLAs in real-world environments, while reducing operational costs by about 12%.
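A minimal sketch of a slowdown estimator built on noisy samples: take a one-sided confidence bound on the mean slowdown and flag the task if its predicted completion time exceeds the deadline; the function name, the normal-approximation bound, and the numbers are illustrative, not the paper's mechanism.

```python
import math
import statistics

def deadline_at_risk(slowdown_samples, remaining_isolated_time, time_to_deadline,
                     confidence=0.95):
    """Decide whether a task risks missing its deadline, given noisy slowdown
    samples (execution time under interference / execution time in isolation).
    Uses a one-sided normal-approximation upper bound on the mean slowdown."""
    n = len(slowdown_samples)
    mean = statistics.fmean(slowdown_samples)
    std = statistics.stdev(slowdown_samples) if n > 1 else 0.0
    z = 1.645 if confidence == 0.95 else 2.326         # rough one-sided z-values
    upper_slowdown = mean + z * std / math.sqrt(n)
    predicted_remaining = remaining_isolated_time * upper_slowdown
    return predicted_remaining > time_to_deadline

# Example: noisy slowdown readings from a slowdown meter for one task.
samples = [1.18, 1.25, 1.31, 1.22, 1.40, 1.27]
if deadline_at_risk(samples, remaining_isolated_time=120.0, time_to_deadline=150.0):
    print("invoke the interference- and power-aware scheduler")
else:
    print("task expected to meet its deadline")
```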
Abstract:
A new method, based on linear correlation and phase diagrams, was successfully developed for processes like the sedimentary process, where the deposition phase can have different time durations - represented by repeated values in a series - and where erosion can play an important role by deleting values of a series. The sampling process itself can be the cause of repeated values - a large stratum sampled twice - or of deleted values: a tiny stratum fitted between two consecutive samples. We developed a mathematical procedure which, based upon the evolution of chemical composition with depth, allows the establishment of frontiers as well as the periodicity of different sedimentary environments. The basic tool is simply a linear correlation analysis, which allows us to detect the existence of eventual evolution rules connected with cyclical phenomena within time series (treating depth as a proxy for time), with the final objective of prediction. A very interesting discovery was the phenomenon of repeated sliding windows that represent quasi-cycles of a series of quasi-periods. An accurate forecast can be obtained if we are inside a quasi-cycle (it is possible to predict the other elements of the cycle with a probability related to the number of repeated and deleted points). This is an innovative methodology, which is why its efficiency is being tested in several case studies, with remarkable results that show its efficacy. Keywords: sedimentary environments, sequence stratigraphy, data analysis, time-series, conditional probability.
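A bare-bones sketch of the lagged linear-correlation idea described above: slide the depth series against itself and flag lags whose Pearson correlation exceeds a threshold as candidate quasi-periods; the synthetic series and the 0.5 threshold are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic depth series of a chemical concentration with a quasi-period of ~25
# samples, plus a few repeated values (deposition pauses) and noise.
n, period = 600, 25
depth_series = np.sin(2 * np.pi * np.arange(n) / period) + 0.3 * rng.standard_normal(n)
depth_series[100:105] = depth_series[100]        # repeated values in the series

# Lagged Pearson correlation between the series and a shifted copy of itself.
def lag_correlation(x, lag):
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

lags = np.arange(5, 100)
corr = np.array([lag_correlation(depth_series, k) for k in lags])

candidate_periods = lags[corr > 0.5]
print("lags with correlation > 0.5 (candidate quasi-periods):", candidate_periods)
```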