942 results for Refined Perfect Diagonalization Procedure
Abstract:
Estimation of a population size by means of capture-recapture techniques is an important problem arising in many areas of the life and social sciences. We consider the frequencies-of-frequencies situation, where a count variable summarizes how often a unit has been identified in the target population of interest. The distribution of this count variable is zero-truncated, since zero identifications do not occur in the sample. As an application we consider the surveillance of scrapie in Great Britain. In this case study, holdings with scrapie that are not identified (zero counts) do not enter the surveillance database. The count variable of interest is the number of scrapie cases per holding. For count distributions a common model is the Poisson distribution and, to adjust for potential heterogeneity, a discrete mixture of Poisson distributions is used. Mixtures of Poissons usually provide an excellent fit, as will be demonstrated in the application of interest. However, as has recently been demonstrated, mixtures also suffer from the so-called boundary problem, resulting in overestimation of the population size. We suggest selecting the mixture model on the basis of the Bayesian Information Criterion. This strategy is further refined by employing a bagging procedure that yields a series of population size estimates; taking the median of this series avoids highly influential size estimates. In limited simulation studies, the procedure is shown to lead to estimates with remarkably small bias.
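A minimal sketch of the bagging-median idea described above, assuming for brevity a single zero-truncated Poisson in place of the BIC-selected Poisson mixture; the function names and bootstrap settings are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

def fit_ztp_lambda(counts):
    """MLE of lambda for a zero-truncated Poisson from observed counts >= 1."""
    counts = np.asarray(counts, dtype=float)
    def nll(lam):
        # ZTP log-pmf: k*log(lam) - lam - log(k!) - log(1 - exp(-lam))
        return -np.sum(counts * np.log(lam) - lam
                       - gammaln(counts + 1.0)
                       - np.log1p(-np.exp(-lam)))
    res = minimize_scalar(nll, bounds=(1e-6, counts.max() + 10.0),
                          method="bounded")
    return res.x

def population_size(counts):
    """N-hat = n / (1 - P(zero count)) under the fitted Poisson."""
    lam = fit_ztp_lambda(counts)
    return len(counts) / (1.0 - np.exp(-lam))

def bagged_median_size(counts, n_boot=500, seed=0):
    """Bagging: resample units, re-estimate N each time, return the median."""
    rng = np.random.default_rng(seed)
    counts = np.asarray(counts)
    estimates = [population_size(rng.choice(counts, len(counts), replace=True))
                 for _ in range(n_boot)]
    return float(np.median(estimates))
```

Taking the median rather than the mean of the bootstrap series is what protects the final estimate against the occasional boundary-problem run that returns an extreme population size.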
A refined LEED analysis of water on Ru{0001}: an experimental test of the partial dissociation model
Abstract:
Despite a number of earlier studies which seemed to confirm molecular adsorption of water on close-packed surfaces of late transition metals, new controversy has arisen over a recent theoretical work by Feibelman, according to which partial dissociation occurs on the Ru{0001} surface, leading to a mixed (H2O + OH + H) superstructure. Here, we present a refined LEED-IV analysis of the (√3 × √3)R30°-D2O-Ru{0001} structure, explicitly testing this new model by Feibelman. Our results favour the model proposed earlier by Held and Menzel, which assumes intact water molecules with almost coplanar oxygen atoms and out-of-plane hydrogen atoms atop the slightly higher oxygen atoms. The partially dissociated model, with an almost identical arrangement of oxygen atoms, cannot, however, be unambiguously excluded, especially when the single hydrogen atoms are not present in the surface unit cell. In contrast to the earlier LEED-IV analysis, however, we can clearly exclude a buckled geometry of the oxygen atoms.
Abstract:
We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (lagged correlations, linear inverse modelling and constructed analogues) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but the most successful method depends on the region considered, the GCM data used and the prediction lead time. However, the constructed-analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different to regions identified as potentially predictable from variance-explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skilful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far North Atlantic, suggesting that the more northern latitudes are optimal for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, it depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
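As a concrete illustration of one of these methods, here is a minimal sketch of a constructed-analogue forecast; `library` (past SST anomaly fields, time × space) and `state` (the current field) are illustrative names, and the set-up is generic rather than taken from the paper.

```python
import numpy as np

def constructed_analogue_forecast(library, state, lead):
    """Regress the current state onto past states, then carry the same
    weights forward by `lead` steps to form the prediction."""
    X = library[:-lead].T                          # space x time: candidate analogues
    w, *_ = np.linalg.lstsq(X, state, rcond=None)  # least-squares weights
    return library[lead:].T @ w                    # weighted combination of successors
```

Because the weights are fitted over the whole field, the method naturally draws on non-local information, consistent with the finding above that methods using non-local information tend to perform best.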
Abstract:
In the absence of a suitable method for routine analysis of large numbers of natural river water samples for organic nitrogen and phosphorus fractions, a new simultaneous digestion technique was developed, based on a standard persulphate digestion procedure. This allows rapid analysis of river, lake and groundwater samples from a range of environments for total nitrogen and phosphorus. The method was evaluated using a range of organic nitrogen and phosphorus structures tested at low, mid and high range concentrations from 2 to 50 mg l-1 nitrogen and 0.2 to 10 mg l-1 phosphorus. Mean recoveries for nitrogen ranged from 94.5% (2 mg l-1) to 92.7% (50 mg l-1) and for phosphorus were 98.2% (0.2 mg l-1) to 100.2% (10 mg l-1). The method is precise in its ability to reproduce results from replicate digestions, and robust in its ability to handle a variety of natural water samples in the pH range 5-8.
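The recovery figures quoted above are simply the measured concentration expressed as a percentage of the nominal spike; a minimal worked example follows, where the measured values are back-calculated from the abstract's percentages for illustration only.

```python
def recovery_percent(measured_mg_per_l, nominal_mg_per_l):
    # Recovery = measured concentration / nominal (spiked) concentration * 100
    return 100.0 * measured_mg_per_l / nominal_mg_per_l

print(recovery_percent(1.89, 2.0))    # nitrogen, 2 mg l-1 spike    -> 94.5
print(recovery_percent(10.02, 10.0))  # phosphorus, 10 mg l-1 spike -> 100.2
```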
Abstract:
We extend the a priori error analysis of Trefftz-discontinuous Galerkin methods for time-harmonic wave propagation problems developed in previous papers to acoustic scattering problems and locally refined meshes. To this end, we prove refined regularity and stability results with explicit dependence of the stability constant on the wave number for non-convex domains with non-connected boundaries. Moreover, we devise a new choice of numerical flux parameters for which we can prove L2-error estimates in the case of locally refined meshes near the scatterer. This is the setting needed to develop a complete hp-convergence analysis.
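For orientation, the time-harmonic acoustic setting referred to above is the Helmholtz problem; a schematic statement of the standard model problem with wave number k and impedance boundary data (not the paper's exact formulation) is:

```latex
% Schematic Helmholtz model problem; Omega is the (possibly non-convex)
% domain, k the wave number, g impedance boundary data.
\[
  -\Delta u - k^{2} u = 0 \quad \text{in } \Omega, \qquad
  \nabla u \cdot \mathbf{n} + \mathrm{i}\,k\,u = g \quad \text{on } \partial\Omega .
\]
% Trefftz spaces consist of element-wise solutions of this equation, e.g.
% plane waves $e^{\mathrm{i} k\,\mathbf{d}\cdot\mathbf{x}}$ with unit
% directions $\mathbf{d}$; the dependence of the stability constant on $k$
% is what the refined regularity results make explicit.
```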
Abstract:
We exploit a theory of price linkages that lends itself readily to empirical examination using Markov chain Monte Carlo (MCMC) methods. The methodology facilitates classification and discrimination among alternative regimes in economic time series. The theory and procedures are applied to annual series (1955-1992) on the U.S. beef sector.
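A minimal sketch of the generic MCMC regime-classification idea: a Gibbs sampler assigning each observation of a series to one of two Gaussian regimes. The model here (two regimes, fixed shared variance, flat priors) is an illustrative stand-in, not the paper's price-linkage specification.

```python
import numpy as np

def gibbs_two_regimes(y, n_iter=2000, seed=0):
    """Return the posterior probability that each observation is in regime 1."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    n = len(y)
    mu = np.array([y.min(), y.max()])   # regime means (crudely initialised)
    sigma2 = np.var(y)                  # shared variance, held fixed for brevity
    p = 0.5                             # probability of regime 1
    draws = np.zeros((n_iter, n), dtype=int)
    for it in range(n_iter):
        # 1. Sample regime labels given the current parameters.
        like1 = p * np.exp(-(y - mu[1]) ** 2 / (2 * sigma2))
        like0 = (1 - p) * np.exp(-(y - mu[0]) ** 2 / (2 * sigma2))
        z = (rng.random(n) < like1 / (like0 + like1)).astype(int)
        # 2. Sample regime means given the labels (flat prior).
        for k in (0, 1):
            yk = y[z == k]
            if len(yk):
                mu[k] = rng.normal(yk.mean(), np.sqrt(sigma2 / len(yk)))
        # 3. Sample the mixing probability (uniform Beta(1, 1) prior).
        p = rng.beta(1 + z.sum(), 1 + n - z.sum())
        draws[it] = z
    return draws[n_iter // 2:].mean(axis=0)  # discard burn-in, average labels
```

A fuller treatment would also sample the variances and, for time series, let the labels follow a Markov chain (a Markov-switching model); the labelling step above is the discrimination-among-regimes idea in its simplest form.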
Abstract:
The rules and the principles of the common law are formed from the cases decided in courts of common law. The unique nature of the evolution of the common law has long been the subject of study. Less frequently studied has been the impact of procedure upon the development of substantive law. This paper examines how the procedures applicable to the trial of a case can affect the substance of the resulting decision. The focus of the examination is the decision in Bell v Lever Bros [1932] AC 161. While the case has long been regarded as a leading, albeit confusing, contract law case, it is also greatly concerned with the conduct of litigation. This paper argues that the substantive decision was largely determined by the civil procedure available. Different rules of civil procedure, it is suggested, would have resulted in a better decision in the English law of contract.