243 results for Library statistics.
Abstract:
China's state sector reform process is examined through the key sector of agriculture. A review of aggregate statistics and broader reform measures indicates the declining role of the state. However, a systematic analysis of administrative, service and enterprise structures reveals the nuances of how the state has retained strong capacity to guide development of the agricultural sector. State and Party policy makers aim not only to support the livelihoods of hundreds of millions of farmers, but also to pursue agricultural modernization in the context of rapid industrialization. These goals are unlikely to be achieved through a wholesale transfer of functions to the private sector, so the state has maintained or developed new mechanisms of influence, particularly in the areas of service provision and enterprise development.
Abstract:
It is shown that quasigroups constructed using the standard construction from 2-perfect directed m-cycle systems are precisely the finite members of a variety if and only if m=3, 4 or 5.
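As a concrete illustration, the sketch below builds the operation table from a small directed 3-cycle system and checks the quasigroup (Latin square) property. It assumes the standard construction is the usual one (a*a = a, and a*b is the vertex following b in the directed cycle containing the arc (a, b)); the particular cycle system is a hand-picked example, not taken from the paper.

    # Sketch: quasigroup from a directed m-cycle system via the (assumed)
    # standard construction a*a = a, a*b = successor of b in the cycle
    # containing the arc (a, b); then verify the Latin square property.

    # A directed 3-cycle system of order 4 (decomposition of the complete
    # directed graph on {0,1,2,3} into directed triangles).
    cycles = [(0, 1, 2), (1, 0, 3), (2, 3, 0), (3, 2, 1)]
    n = 4

    # Map each arc (a, b) to the vertex that follows b in its cycle.
    succ = {}
    for cyc in cycles:
        m = len(cyc)
        for i in range(m):
            a, b, c = cyc[i], cyc[(i + 1) % m], cyc[(i + 2) % m]
            succ[(a, b)] = c

    table = [[a if a == b else succ[(a, b)] for b in range(n)]
             for a in range(n)]

    # Quasigroup check: every row and column is a permutation of 0..n-1.
    ok = all(sorted(row) == list(range(n)) for row in table) and \
         all(sorted(col) == list(range(n)) for col in zip(*table))
    print("operation table:", table)
    print("is a quasigroup (Latin square):", ok)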
Abstract:
To date, very few families of critical sets for Latin squares are known. The only previously known method for constructing critical sets involves taking a critical set which is known to satisfy certain strong initial conditions and using a doubling construction. This construction can be applied to the known critical sets in back circulant Latin squares of even order. However, the doubling construction cannot be applied to critical sets in back circulant Latin squares of odd order. In this paper a family of critical sets is identified for Latin squares which are the product of the Latin square of order 2 with a back circulant Latin square of odd order. The proof that each element of the critical set is an essential part of the reconstruction process relies on the proof of the existence of a large number of Latin interchanges.
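The squares in question are easy to write down: the back circulant square of order n has (i, j) entry i + j (mod n). The sketch below constructs it and its product with the square of order 2, assuming the product referred to is the usual direct product of Latin squares; the critical set itself is not reproduced here.

    # Sketch: back circulant Latin square of odd order n and its direct
    # product with the Latin square of order 2, giving a square of order 2n.
    # (Only the squares are constructed, not the critical sets.)

    def back_circulant(n):
        # entry in row i, column j is i + j (mod n)
        return [[(i + j) % n for j in range(n)] for i in range(n)]

    def direct_product(A, B):
        # direct product of two Latin squares: the product cell holds the
        # pair (A-cell, B-cell), relabelled as a single symbol
        m, n = len(A), len(B)
        return [[A[i // n][j // n] * n + B[i % n][j % n]
                 for j in range(m * n)] for i in range(m * n)]

    Z2 = [[0, 1], [1, 0]]                         # the Latin square of order 2
    L = direct_product(Z2, back_circulant(5))     # order 10 example

    # sanity check: rows and columns are permutations
    N = len(L)
    assert all(sorted(r) == list(range(N)) for r in L)
    assert all(sorted(c) == list(range(N)) for c in zip(*L))
    print(N, "x", N, "Latin square constructed")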
Abstract:
We calculate the stationary state of the system of two non-identical two-level atoms driven by a finite-bandwidth two-mode squeezed vacuum. It is well known that two identical two-level atoms driven by a broadband squeezed vacuum may decay to a pure state, called the pure two-atom squeezed state, and that the presence of the antisymmetric state can change its purity. Here, we show that for small interatomic separations the stationary state of two non-identical atoms is not sensitive to the presence of the antisymmetric state and is the pure two-atom squeezed state. This effect is a consequence of the fact that in the system of two non-identical atoms the antisymmetric state is no longer the trapping state. We also calculate the squeezing properties of the emitted field and find that the squeezing spectrum of the output field may exhibit larger squeezing than that in the input squeezed vacuum. Moreover, we show that squeezing in the total field attains the optimum value which can ever be achieved in the field emitted by two atoms.
Abstract:
The squeezing properties of the fluorescence field emitted by a two-level atom driven by a coherent laser field in a squeezed vacuum are calculated. We show that in the region of the anomalous resonance fluorescence the emitted field exhibits squeezing that is much larger than that in the input squeezed vacuum. The squeezing spectrum attains a minimum value that corresponds to 75% squeezing. We also find that, in the total fluorescence field, squeezing attains an optimum achievable value in the fluorescence field emitted by a two-level atom. The optimum squeezing is associated with the collapse of the system into a pure state. (C) 1997 Optical Society of America.
Abstract:
A $G$-design of order $n$ is a pair $(P, B)$ where $P$ is the vertex set of the complete graph $K_n$ and $B$ is an edge-disjoint decomposition of $K_n$ into copies of the simple graph $G$. Following design terminology, we call these copies "blocks". Here $K_4 - e$ denotes the complete graph $K_4$ with one edge removed. It is well known that a $K_4 - e$ design of order $n$ exists if and only if $n \equiv 0$ or $1 \pmod{5}$, $n \ge 6$. The intersection problem here asks for which $k$ it is possible to find two $K_4 - e$ designs $(P, B_1)$ and $(P, B_2)$ of order $n$, with $|B_1 \cap B_2| = k$, that is, with precisely $k$ common blocks. Here we completely solve this intersection problem for $K_4 - e$ designs.
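For orientation, the block counts are easy to work out: $K_4 - e$ has 5 edges, so any $K_4 - e$ design of order $n$ contains
\[ b = \frac{1}{5}\binom{n}{2} = \frac{n(n-1)}{10} \]
blocks (for example $b = 3$ when $n = 6$ and $b = 9$ when $n = 10$). The intersection sizes $k$ therefore necessarily lie in $\{0, 1, \dots, n(n-1)/10\}$; determining which of these values are actually achievable is the content of the result.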
Abstract:
The classification rules of linear discriminant analysis are defined by the true mean vectors and the common covariance matrix of the populations from which the data come. Because these true parameters are generally unknown, they are commonly estimated by the sample mean vector and covariance matrix of the data in a training sample randomly drawn from each population. However, these sample statistics are notoriously susceptible to contamination by outliers, a problem compounded by the fact that the outliers may be invisible to conventional diagnostics. High-breakdown estimation is a procedure designed to remove this cause for concern by producing estimates that are immune to serious distortion by a minority of outliers, regardless of their severity. In this article we motivate and develop a high-breakdown criterion for linear discriminant analysis and give an algorithm for its implementation. The procedure is intended to supplement rather than replace the usual sample-moment methodology of discriminant analysis either by providing indications that the dataset is not seriously affected by outliers (supporting the usual analysis) or by identifying apparently aberrant points and giving resistant estimators that are not affected by them.
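The paper's specific high-breakdown criterion is not given in the abstract; the sketch below only illustrates the general idea of replacing the sample moments in the linear discriminant rule with high-breakdown estimates, using scikit-learn's minimum covariance determinant (MCD) estimator as a stand-in.

    # Sketch: linear discriminant rule with high-breakdown (MCD) estimates
    # in place of the usual sample mean vectors and pooled covariance.
    # Illustrative only; this is not the authors' criterion or algorithm.
    import numpy as np
    from sklearn.covariance import MinCovDet

    def robust_lda_fit(X, y):
        classes = np.unique(y)
        p, n_total = X.shape[1], len(y)
        means, priors, cov = {}, {}, np.zeros((p, p))
        for g in classes:
            Xg = X[y == g]
            mcd = MinCovDet(random_state=0).fit(Xg)      # robust location/scatter
            means[g] = mcd.location_
            cov += (len(Xg) / n_total) * mcd.covariance_  # pooled robust scatter
            priors[g] = len(Xg) / n_total
        return classes, means, np.linalg.inv(cov), priors

    def robust_lda_predict(X, classes, means, cov_inv, priors):
        # linear score: x' S^-1 m_g - 0.5 m_g' S^-1 m_g + log p_g
        scores = np.column_stack([
            X @ cov_inv @ means[g]
            - 0.5 * means[g] @ cov_inv @ means[g]
            + np.log(priors[g])
            for g in classes])
        return classes[np.argmax(scores, axis=1)]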
Abstract:
In this paper we consider the problem of providing standard errors of the component means in normal mixture models fitted to univariate or multivariate data by maximum likelihood via the EM algorithm. Two methods of estimation of the standard errors are considered: the standard information-based method and the computationally intensive bootstrap method. They are compared empirically by their application to three real data sets and by a small-scale Monte Carlo experiment.
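A minimal sketch of the bootstrap approach (parametric version) is given below, using scikit-learn's GaussianMixture as a stand-in for the EM fit; the information-based standard errors are not implemented, and label switching is handled crudely by sorting the fitted means. The data are simulated placeholders, not the paper's data sets.

    # Sketch: parametric-bootstrap standard errors for the component means
    # of a univariate normal mixture fitted by EM.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def bootstrap_se_of_means(x, n_components=2, n_boot=200, seed=0):
        rng = np.random.default_rng(seed)
        x = np.asarray(x).reshape(-1, 1)
        gm = GaussianMixture(n_components=n_components, random_state=seed).fit(x)
        boot_means = []
        for _ in range(n_boot):
            xb, _ = gm.sample(len(x))                # draw from the fitted mixture
            gb = GaussianMixture(n_components=n_components,
                                 random_state=int(rng.integers(1e6))).fit(xb)
            boot_means.append(np.sort(gb.means_.ravel()))  # sort to align components
        boot_means = np.array(boot_means)
        return np.sort(gm.means_.ravel()), boot_means.std(axis=0, ddof=1)

    # Example on simulated two-component data:
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(0, 1, 150), rng.normal(4, 1, 150)])
    means, se = bootstrap_se_of_means(x)
    print("fitted means:", means, "bootstrap SEs:", se)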
Abstract:
In a recent paper [16], one of us identified all of the quasi-stationary distributions for a non-explosive, evanescent birth-death process for which absorption is certain, and established conditions for the existence of the corresponding limiting conditional distributions. Our purpose is to extend these results in a number of directions. We shall consider separately two cases depending on whether or not the process is evanescent. In the former case we shall relax the condition that absorption is certain. Furthermore, we shall allow for the possibility that the minimal process might be explosive, so that the transition rates alone will not necessarily determine the birth-death process uniquely. Although we shall be concerned mainly with the minimal process, our most general results hold for any birth-death process whose transition probabilities satisfy both the backward and the forward Kolmogorov differential equations.
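For a finite truncation of a birth-death process with absorption at 0, a quasi-stationary distribution can be approximated numerically as the normalised left eigenvector of the sub-generator (restricted to the transient states) associated with its eigenvalue of maximal real part. The sketch below does only that; the paper's questions of existence, uniqueness and explosivity concern the infinite process and are not addressed by a truncation.

    # Sketch: numerical quasi-stationary distribution for a truncated
    # birth-death process with absorption at state 0.
    import numpy as np

    def qsd_truncated(birth, death):
        # birth[i], death[i] are the rates in transient state i+1, i = 0..N-1;
        # state 0 is absorbing; the truncation boundary is handled crudely
        # (probability leaving the top state simply leaks out).
        N = len(birth)
        Q = np.zeros((N, N))
        for i in range(N):
            Q[i, i] = -(birth[i] + death[i])
            if i + 1 < N:
                Q[i, i + 1] = birth[i]       # birth: state i+1 -> i+2
            if i - 1 >= 0:
                Q[i, i - 1] = death[i]       # death: state i+1 -> i
        # Left eigenvector for the eigenvalue of maximal real part: m Q = -theta m.
        w, v = np.linalg.eig(Q.T)
        k = np.argmax(w.real)
        m = np.abs(v[:, k].real)
        return m / m.sum(), -w[k].real       # QSD and decay rate theta

    # Example: linear birth-death rates, truncated at N = 200 states.
    lam, mu, N = 0.8, 1.0, 200
    birth = [lam * n for n in range(1, N + 1)]
    death = [mu * n for n in range(1, N + 1)]
    qsd, theta = qsd_truncated(birth, death)
    print("decay rate:", theta, "mean of QSD:", (np.arange(1, N + 1) * qsd).sum())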
Abstract:
We prove two asymptotic estimates for minimizers of a Ginzburg-Landau functional of the form
\[ \int_{\Omega} \left[ \tfrac{1}{2} |\nabla u|^2 + \tfrac{1}{4\epsilon^2} (1 - |u|^2)^2 W(x) \right] dx. \]
Abstract:
Izenman and Sommer (1988) used a non-parametric kernel density estimation technique to fit a seven-component model to the paper thickness of the 1872 Hidalgo stamp issue of Mexico. They observed an apparent conflict when fitting a normal mixture model with three components with unequal variances. This conflict is examined further by investigating the most appropriate number of components when fitting a normal mixture of components with equal variances.
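The stamp-thickness data are not reproduced here; on any univariate sample the equal-variance fits can be compared across numbers of components with a criterion such as BIC. A minimal sketch on placeholder data (scikit-learn's "tied" covariance type gives a common variance in the univariate case):

    # Sketch: choosing the number of components for a univariate normal
    # mixture with equal component variances, compared by BIC.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def bic_by_components(x, max_components=7):
        x = np.asarray(x).reshape(-1, 1)
        out = {}
        for k in range(1, max_components + 1):
            gm = GaussianMixture(n_components=k, covariance_type="tied",
                                 random_state=0).fit(x)
            out[k] = gm.bic(x)            # smaller BIC = preferred model
        return out

    # Placeholder data standing in for the stamp-thickness measurements:
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0.07, 0.004, 200),
                        rng.normal(0.08, 0.004, 200),
                        rng.normal(0.10, 0.004, 100)])
    for k, b in bic_by_components(x).items():
        print(k, round(b, 1))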
Abstract:
The small-sample performance of Granger causality tests under different model dimensions, degrees of cointegration, directions of causality, and system stability is presented. Two tests based on maximum likelihood estimation of error-correction models (LR and WALD) are compared to a Wald test based on multivariate least squares estimation of a modified VAR (MWALD). In large samples all test statistics perform well in terms of size and power. For smaller samples, the LR and WALD tests perform better than the MWALD test. Overall, the LR test outperforms the other two in terms of size and power in small samples.
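The LR, WALD and MWALD statistics compared in the paper are not reproduced here; as a point of reference, the sketch below runs the standard regression-based Granger causality test from statsmodels on simulated data.

    # Sketch: a standard Granger causality test (F/Wald on lagged terms),
    # not the LR / WALD / MWALD comparison carried out in the paper.
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(0)
    T = 300
    x = np.zeros(T)
    y = np.zeros(T)
    for t in range(1, T):
        x[t] = 0.5 * x[t - 1] + rng.normal()
        y[t] = 0.3 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()  # x Granger-causes y

    # Column order: the test asks whether the 2nd column helps predict the 1st.
    data = np.column_stack([y, x])
    results = grangercausalitytests(data, maxlag=2)   # statistics printed per lag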
Abstract:
Victoria Police statistics show that, since the late 1980s, there has been a significant increase in reported rapes in that State. One interpretation of this trend is that there has been an increase in the underlying incidence of sexual violence in the community. An alternative explanation is that rape victims have become more willing to report to the police, in response to factors such as improved provision of support services to sexual assault victims, reforms to substantive and procedural law, and changes in police attitudes and procedures. In order to test these competing interpretations, data were collected and analysed on the characteristics of rapes reported to the Victoria Police in the late 1980s/early 1990s. This analysis showed that: (1) most of the additional offences reported in the early 1990s were allegations of rapes committed by family members, spouses and other intimates; and (2) an increasing number of reports related to offences which had been committed at least one year prior to a report being made to the police. It is argued that these changing patterns are consistent with a significant increase in the reporting rate for rape. More generally, the research reported in this paper highlights the limitations of reported crime statistics as measures of the level of social violence, and points to the need for crime researchers to develop alternative methodologies for measuring and interpreting trends.
Abstract:
Analysis of a major multi-site epidemiologic study of heart disease has required estimation of the pairwise correlation of several measurements across sub-populations. Because the measurements from each sub-population were subject to sampling variability, the Pearson product moment estimator of these correlations produces biased estimates. This paper proposes a model that takes into account within and between sub-population variation, provides algorithms for obtaining maximum likelihood estimates of these correlations and discusses several approaches for obtaining interval estimates. (C) 1997 by John Wiley & Sons, Ltd.
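The mechanism behind the bias is the classical attenuation effect: if the sub-population summaries are observed with independent sampling error, the naive Pearson correlation of the observed values estimates a shrunken version of the true between-sub-population correlation. In a simplified two-level model (an illustration only, not the paper's full model),
\[ \operatorname{corr}(\hat{\mu}_x, \hat{\mu}_y) \approx \rho \sqrt{\frac{\sigma_x^2}{\sigma_x^2 + \tau_x^2}\,\frac{\sigma_y^2}{\sigma_y^2 + \tau_y^2}}, \]
where $\rho$ is the true correlation of the sub-population means, $\sigma_x^2, \sigma_y^2$ are the between-sub-population variances, and $\tau_x^2, \tau_y^2$ are the sampling (within-sub-population) variances of the estimated means.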
Abstract:
Using the method of quantum trajectories we show that a known pure state can be optimally monitored through time when subject to a sequence of discrete measurements. By modifying the way that we extract information from the measurement apparatus we can minimize the average algorithmic information of the measurement record, without changing the unconditional evolution of the measured system. We define an optimal measurement scheme as one which has the lowest average algorithmic information allowed. We also show how it is possible to extract information about system operator averages from the measurement records and their probabilities. The optimal measurement scheme, in the limit of weak coupling, determines the statistics of the variance of the measured variable directly. We discuss the relevance of such measurements for recent experiments in quantum optics.
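The optimal measurement scheme constructed in the paper is not given in the abstract; as a point of reference, the sketch below simulates a generic quantum-jump (direct photodetection) trajectory for a resonantly driven, damped two-level atom, which is the kind of conditioned evolution and measurement record the analysis is concerned with.

    # Sketch: a generic quantum-jump (photodetection) trajectory for a
    # resonantly driven, damped two-level atom.  This illustrates conditioned
    # evolution and a measurement record only; it is not the optimal
    # measurement scheme constructed in the paper.
    import numpy as np

    rng = np.random.default_rng(0)
    Omega, Gamma, dt, steps = 2.0, 1.0, 1e-3, 20000

    sm = np.array([[0, 1], [0, 0]], dtype=complex)   # sigma_minus = |g><e|
    sp = sm.conj().T
    H = 0.5 * Omega * (sp + sm)                      # resonant Rabi driving
    Heff = H - 0.5j * Gamma * (sp @ sm)              # non-Hermitian effective H

    psi = np.array([1.0, 0.0], dtype=complex)        # start in |g> = (1, 0)
    record = []                                      # photodetection times
    for n in range(steps):
        p_jump = Gamma * dt * np.vdot(psi, (sp @ sm) @ psi).real
        if rng.random() < p_jump:
            psi = sm @ psi                           # photon detected: collapse
            record.append(n * dt)
        else:
            psi = psi - 1j * dt * (Heff @ psi)       # no-jump (Euler) evolution
        psi = psi / np.linalg.norm(psi)

    print("detected photons:", len(record),
          "mean waiting time:",
          np.diff(record).mean() if len(record) > 1 else None)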