939 results for robust estimation statistics


Relevance: 20.00%

Abstract:

Conviction statistics were the first criminal statistics available in Europe during the nineteenth century. Their main weaknesses as crime measures and for comparative purposes were identified by Alphonse de Candolle in the 1830s. Currently, they are seldom used by comparative criminologists, although they provide a less valid but more reliable measure of crime and formal social control than police statistics. This article uses conviction statistics, compiled from the four editions of the European Sourcebook of Crime and Criminal Justice Statistics, to study the evolution of persons convicted in European countries from 1990 to 2006. Trends in persons convicted for six offences (intentional homicide, assault, rape, robbery, theft, and drug offences) are analysed for up to 26 European countries. These trends are established for the whole of Europe as well as for a cluster of Western European countries and a cluster of Central and Eastern European countries. The analyses show similarities between both regions of Europe at the beginning and at the end of the period under study. After a general increase in the rate of persons convicted in the early 1990s across the whole of Europe, trends followed different directions in Western and in Central and Eastern Europe. During the 2000s, however, the rates of persons convicted for intentional homicide were fairly stable throughout Europe, accompanied by a general decrease in the rate of persons convicted for property offences and an increase in the rate of those convicted for drug offences. The latter was accompanied by an increase in the rate of persons convicted for non-lethal violent offences, which only reached some stability at the end of the time series. These trends show that there is no general crime drop in Europe. After a discussion of possible theoretical explanations, a multifactor model, inspired by opportunity-based theories, is proposed to explain the observed trends.

Relevance: 20.00%

Abstract:

Analyses of coffee fertilization systems should draw on information about both the soil and the nutritional status of the plants. This study investigated the spatial relationship between phosphorus (P) levels in coffee plant tissues and soil chemical and physical properties. The study was performed on two arabica coffee varieties and one canephora variety. Sampling grids were established in the areas and the sampling points were georeferenced. The assessed soil properties were available phosphorus (P-Mehlich), remaining phosphorus (P-rem), and particle size; the assessed plant-tissue property was phosphorus content (foliar P). The data were subjected to descriptive statistical analysis, correlation analysis, cluster analysis, and probability tests. Geostatistical and trend analyses were performed only for pairs of variables with a significant linear correlation. The spatial variability of foliar P content was high for the variety Catuai and medium for the other evaluated plants. Unlike P-Mehlich, the variability in soil P-rem reflected the nutritional status of this nutrient in the plant.
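The screening step described above, running geostatistical and trend analyses only for variable pairs with a significant linear correlation, might look like the following minimal sketch; the file name and column names are hypothetical, not from the study.

```python
# Minimal sketch (hypothetical file and column names, not the study's code):
# keep only soil variables with a significant linear correlation to foliar P
# before any geostatistical or trend analysis.
import pandas as pd
from scipy.stats import pearsonr

grid = pd.read_csv("sampling_grid.csv")          # georeferenced sampling points

soil_vars = ["P_mehlich", "P_rem", "clay", "silt", "sand"]   # assumed columns
selected = []
for var in soil_vars:
    r, p = pearsonr(grid[var], grid["foliar_P"])
    if p < 0.05:                                  # significant linear correlation
        selected.append((var, round(r, 2)))

print("pairs kept for geostatistical/trend analysis:", selected)
```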

Relevance: 20.00%

Abstract:

Taking into account the nature of the hydrological processes involved in the in situ measurement of field capacity (FC), this study proposes a variation of the definition of FC that aims not only to minimize the inadequacies of its determination but also to maintain its original, practical meaning. Analysis of FC data for 22 Brazilian soils, together with additional FC data from the literature, all measured according to the proposed definition, which is based on a 48-h drainage time after infiltration by shallow ponding, indicates a weak dependency on the amount of infiltrated water, antecedent moisture level, soil morphology, and the level of the groundwater table, but a strong dependency on basic soil properties. This dependence on basic soil properties allowed the FC of the 22 soil profiles to be determined by pedotransfer functions (PTFs) using the input variables usually adopted in the prediction of soil water retention. Among the input variables, the soil moisture content at 6 kPa suction, θ(6 kPa), had the greatest impact. Indeed, a linear PTF based only on this variable resulted in an FC with a root mean squared residue below 0.04 m³ m⁻³ for most soils individually. Such a PTF proved to be a better FC predictor than the traditional method of using the moisture content at an arbitrary suction. Our FC data were compatible with an equivalent and broader USA database found in the literature, mainly for medium-textured soil samples. One reason for the differences between the FCs of the two data sets for fine-textured soils is their different drainage times. A standardized procedure for the in situ determination of FC is therefore recommended.
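As an illustration of the kind of linear PTF the abstract refers to, the sketch below fits FC = a + b·θ(6 kPa) by least squares and reports the root mean squared residue; the numbers are made up for illustration and are not the study's data.

```python
# Minimal sketch with made-up numbers (not the study's data): fit a linear
# pedotransfer function FC = a + b * theta(6 kPa) and report the root mean
# squared residue, the error metric quoted in the abstract.
import numpy as np

theta_6 = np.array([0.18, 0.24, 0.31, 0.35, 0.29, 0.22])  # water content at 6 kPa suction, m³ m⁻³
fc_obs  = np.array([0.17, 0.22, 0.30, 0.33, 0.28, 0.21])  # in situ FC after 48-h drainage, m³ m⁻³

b, a = np.polyfit(theta_6, fc_obs, 1)                      # least-squares line
residues = (a + b * theta_6) - fc_obs
rmse = np.sqrt(np.mean(residues ** 2))
print(f"FC ≈ {a:.3f} + {b:.3f}·θ(6 kPa), RMS residue = {rmse:.3f} m³ m⁻³")
```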

Relevance: 20.00%

Abstract:

Field capacity (FC) is a parameter widely used in applied soil science. However, its in situ determination can be difficult to apply, largely because of the need for large supplies of water at the test sites. Ottoni Filho et al. (2014) proposed a standardized procedure for the field determination of FC and showed that such in situ FC can be estimated by a linear pedotransfer function (PTF) based on the volumetric soil water content at a matric potential of -6 kPa [θ(6)] for the same soils used in the present study. The objective of this study was to use soil moisture data measured below a double-ring infiltrometer 48 h after the end of the infiltration test to develop PTFs for the standard in situ FC. We found that such ring FC data were on average 0.03 m³ m⁻³ greater than the standard FC values. The linear PTF developed for the ring FC data based only on θ(6) was nearly as accurate as the equivalent PTF reported by Ottoni Filho et al. (2014), which was developed for the standard FC data. The root mean squared residues of FC determined from both PTFs were about 0.02 m³ m⁻³. The proposed method has the advantage of estimating the in situ FC of the soil using the water applied in the infiltration test.
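The comparison described above could be sketched as follows, with illustrative numbers rather than the study's data: quantify the offset between ring FC and standard FC, then fit a θ(6)-based linear PTF to the ring data and report its residue.

```python
# Minimal sketch with illustrative numbers (not the study's data): offset
# between ring-infiltrometer FC and standard in situ FC, and the RMS residue
# of a theta(6)-based linear PTF fitted to the ring data.
import numpy as np

theta_6     = np.array([0.20, 0.26, 0.33, 0.37, 0.30])   # m³ m⁻³ at -6 kPa
fc_standard = np.array([0.19, 0.24, 0.31, 0.35, 0.28])   # standard in situ FC
fc_ring     = np.array([0.22, 0.27, 0.34, 0.38, 0.31])   # 48 h after the ring infiltration test

print("mean offset (ring - standard):", round(float(np.mean(fc_ring - fc_standard)), 3))

b, a = np.polyfit(theta_6, fc_ring, 1)                    # linear PTF for ring FC
residues = (a + b * theta_6) - fc_ring
print("RMS residue of the ring-FC PTF:", round(float(np.sqrt(np.mean(residues ** 2))), 3))
```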

Relevance: 20.00%

Abstract:

This publication is a historical record of the most requested statistics on vital events and a source of information that can be used in further analysis.


Relevance: 20.00%

Abstract:

We present exact equations and expressions for the first-passage-time statistics of dynamical systems that are a combination of a diffusion process and a random external force modeled as dichotomous Markov noise. We prove that the mean first passage time for this system does not show any resonantlike behavior.
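The following Monte Carlo sketch illustrates the kind of system described above: an overdamped particle subject to diffusion plus a dichotomous Markov force, with the mean first-passage time estimated by simulation rather than by the paper's exact equations. The dynamics, boundaries, and parameter values are assumptions chosen for illustration.

```python
# Minimal Monte Carlo sketch (not the paper's exact equations): estimate the
# mean first-passage time of a particle driven by Gaussian white noise
# (diffusion constant D) plus a dichotomous Markov force switching between
# +A and -A at rate lam. All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def first_passage_time(x0=0.0, L=1.0, D=0.5, A=1.0, lam=2.0, dt=1e-3):
    """Time for x(t), started at x0, to first reach |x| >= L."""
    x, t = x0, 0.0
    xi = A if rng.random() < 0.5 else -A          # current state of the dichotomous force
    while abs(x) < L:
        if rng.random() < lam * dt:               # Markov switching of the force
            xi = -xi
        x += xi * dt + np.sqrt(2 * D * dt) * rng.normal()
        t += dt
    return t

times = [first_passage_time() for _ in range(2000)]
print("estimated mean first-passage time:", np.mean(times))
```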

Relevance: 20.00%

Abstract:

The 2010-2011 (FY11) edition of Iowa Public Library Statistics includes information on income, expenditures, collections, circulation, and other measures, including staff. Each section is arranged by size code, then alphabetically by city. The totals and percentiles for each size code grouping are given immediately following the alphabetical listings. Totals and medians for all reporting libraries are given at the end of each section. There are 543 libraries included in this publication; 525 submitted a report. The table of size codes (page 5) lists the libraries alphabetically. The following table lists the size code designations, the population range in each size code, the number of libraries reporting in each size code, and the total population of the reporting libraries in each size code. The total population served by the 543 libraries is 2,339,070. Population data is used to determine per capita figures throughout the publication.
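As a rough illustration of how the per capita figures and the per-size-code totals and medians described above can be derived, here is a minimal sketch; the file name and column names are hypothetical, not the publication's actual layout.

```python
# Minimal sketch (hypothetical file and column names, not the report's layout):
# per capita figures and per-size-code totals/medians as described above.
import pandas as pd

libs = pd.read_csv("iowa_fy11_libraries.csv")
libs["income_per_capita"] = libs["income"] / libs["population"]

# totals and medians by size code, mirroring the publication's groupings
summary = libs.groupby("size_code").agg(
    total_income=("income", "sum"),
    total_population=("population", "sum"),
    median_income_per_capita=("income_per_capita", "median"),
)
print(summary)
```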

Relevance: 20.00%

Abstract:

Photon migration in a turbid medium has been modeled in many different ways. The motivation for such modeling is based on technology that can be used to probe potentially diagnostic optical properties of biological tissue. Surprisingly, one of the more effective models is also one of the simplest. It is based on statistical properties of a nearest-neighbor lattice random walk. Here we develop a theory allowing one to calculate the number of visits by a photon to a given depth, if it is eventually detected at an absorbing surface. This mimics cw measurements made on biological tissue and is directed towards characterizing the depth reached by photons injected at the surface. Our development of the theory uses formalism based on the theory of a continuous-time random walk (CTRW). Formally exact results are given in the Fourier-Laplace domain, which, in turn, are used to generate approximations for parameters of physical interest.
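As a rough companion to the lattice picture described above, the following Monte Carlo sketch counts visits to a chosen depth by a nearest-neighbor random walker that is eventually absorbed at the surface. It is an illustrative simulation, not the paper's CTRW formalism, and the step rules and parameters are assumptions.

```python
# Monte Carlo sketch of the lattice picture described above (not the CTRW
# formalism): count visits to a given depth by a nearest-neighbor walk on a
# half-space lattice, conditioned on eventual absorption at the surface.
import numpy as np

rng = np.random.default_rng(1)

def visits_to_depth(target_depth=5, max_steps=5000):
    """Visits to z == target_depth before absorption at z == 0 (detection).
    Returns None if the walker is not absorbed within max_steps."""
    z, visits = 1, 0                      # photon injected just below the surface
    for _ in range(max_steps):
        r = rng.random()                  # +/-1 in depth with prob 1/6 each,
        if r < 1/6:                       # lateral moves leave z unchanged
            z += 1
        elif r < 2/6:
            z -= 1
        if z == 0:                        # absorbing surface: photon detected
            return visits
        if z == target_depth:
            visits += 1
    return None

samples = [v for v in (visits_to_depth() for _ in range(2000)) if v is not None]
print("mean visits to depth 5, given detection at the surface:", np.mean(samples))
```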

Relevance: 20.00%

Abstract:

Preface. The starting point for this work, and eventually the subject of the whole thesis, was the question of how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for their characteristic functions, has made them the models of choice for many theoretical constructions and practical applications. At the same time, estimating the parameters of stochastic volatility jump-diffusion models is not a straightforward task: the difficulty comes from the variance process, which is not observable. Several estimation methodologies deal with latent variables, and one appeared particularly interesting. It proposes an estimator that, in contrast to the other methods, requires neither discretization nor simulation of the process: the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function. However, the procedure was derived only for stochastic volatility models without jumps, and so it became the subject of my research.

This thesis consists of three parts, each written as an independent and self-contained article. At the same time, the questions answered by the second and third parts arise naturally from the issues investigated and the results obtained in the first.

The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps in both the asset price and the variance processes. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, and of the whole thesis, is a closed-form expression for the joint unconditional characteristic function of the stochastic volatility jump-diffusion models. The empirical part of the chapter suggests that, besides stochastic volatility, jumps in both the mean and the volatility equations are relevant for modelling the returns of the S&P500 index, which was chosen as a general representative of the stock asset class.

Hence, the next question is which jump process to use to model the returns of the S&P500. Within the framework of affine jump-diffusion models, the decision boils down to defining the intensity of the compound Poisson process (a constant or some function of the state variables) and choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size currently used for the asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if the S&P500 index is to be modelled by a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either the exponential or the double exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data.

The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained. In the absence of a benchmark or any ground for comparison, it is unreasonable to be sure that our parameter estimates and the true parameters of the models coincide. The conclusion of the second chapter provides one more reason to carry out that kind of test. Thus, the third part of this thesis concentrates on estimating the parameters of stochastic volatility jump-diffusion models from asset price time series simulated from various "true" parameter sets. The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of recovering the true parameters, and the third chapter proves that the estimator indeed has this ability.

Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function works, the next question follows naturally: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used in its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure. In practice, however, this relationship is not so straightforward because of the increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal (one-dimensional) unconditional characteristic function in the estimation instead of the joint (bi-dimensional) unconditional characteristic function. As a result, the preference for one or the other depends on the model to be estimated, so the computational effort can be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter, in addition to what was discussed above, compares the performance of the estimators with bi- and three-dimensional unconditional characteristic functions on the simulated data. It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, because of the limitations of the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for the estimation of the parameters of stochastic volatility jump-diffusion models.
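For readers unfamiliar with characteristic-function estimation, the sketch below shows the generic idea behind an ECF-type estimator: match the empirical characteristic function of the data to a parametric model characteristic function over a grid of arguments, weighted to damp large arguments. The model CF used here is that of a plain normal distribution, a deliberately simple stand-in; it is not the joint unconditional characteristic function of the affine stochastic volatility jump-diffusion models developed in the thesis.

```python
# Generic ECF-style estimation on a toy model (a normal distribution), shown
# only to illustrate the matching principle; the thesis's estimator targets
# the joint unconditional CF of affine SV jump-diffusion models instead.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
returns = rng.normal(0.05, 0.2, size=2000)        # simulated "asset returns"

u_grid = np.linspace(-10, 10, 201)                 # CF evaluation points
weights = np.exp(-u_grid**2)                       # weight damping large |u|

# empirical characteristic function of the data on the grid
ecf = np.mean(np.exp(1j * np.outer(u_grid, returns)), axis=1)

def model_cf(u, mu, sigma):
    # CF of a normal distribution: exp(i*u*mu - 0.5*sigma^2*u^2)
    return np.exp(1j * u * mu - 0.5 * (sigma * u) ** 2)

def objective(theta):
    mu, log_sigma = theta
    diff = ecf - model_cf(u_grid, mu, np.exp(log_sigma))
    return np.sum(weights * np.abs(diff) ** 2)     # weighted CF distance

res = minimize(objective, x0=[0.0, np.log(0.1)], method="Nelder-Mead")
print("estimated mu, sigma:", res.x[0], np.exp(res.x[1]))
```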