29 results for "Estimator standard error and efficiency"


Relevance: 100.00%

Publisher:

Abstract:

One of the enablers for new consumer-electronics products to be accepted into the market is the availability of inexpensive, flexible and multi-standard chipsets and services. DVB-T, the principal standard for terrestrial broadcast of digital video in Europe, has been so successful that governments are reconsidering their targets for analogue television broadcast switch-off. As one further small step towards increasingly cost-effective chipsets, the OFDM deterministic equalizer has previously been presented with its application to DVB-T. This paper discusses the test set-up of a DVB-T-compliant baseband simulation that includes the deterministic equalizer and DVB-T standard propagation channels. This is followed by a presentation of the inner and outer Bit Error Rate (BER) results obtained with various modulation levels, coding rates and propagation channels, in order to ascertain the actual performance of the deterministic equalizer(1).
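BER results such as those reported here are obtained by counting bit errors over long simulation runs. As a minimal, hypothetical sketch (a toy binary symmetric channel, not the paper's DVB-T baseband chain or its actual channel models), the BER and the standard error of that estimate can be computed from a Monte Carlo run:

```python
import random

def estimate_ber(n_bits, flip_prob, seed=0):
    """Monte Carlo BER estimate over a toy binary symmetric channel
    that flips each transmitted bit with probability flip_prob."""
    rng = random.Random(seed)
    errors = sum(1 for _ in range(n_bits) if rng.random() < flip_prob)
    ber = errors / n_bits
    # Standard error of a binomial proportion: sqrt(p * (1 - p) / n).
    std_err = (ber * (1 - ber) / n_bits) ** 0.5
    return ber, std_err

ber, std_err = estimate_ber(100_000, 0.01)
```

The standard-error term indicates how many bits must be simulated before a measured BER curve can be trusted at a given operating point.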

Abstract:

This paper reports the findings of a small-scale research project which investigated the levels of awareness and knowledge of written standard English among 10- and 11-year-old children in two English primary schools. The project involved repeating, in 2010, a written questionnaire previously used with children in the same schools in three separate surveys in 1999, 2002 and 2005. Data from the latest survey are compared with those from the previous three. The analysis seeks to identify any changes over time in children's ability to recognise non-standard forms and supply standard English alternatives, as well as in their ability to use technical terms related to language variation. Differences between the performance of boys and girls, and between the two schools, are also analysed. The paper concludes that the socio-economic context of the schools may be a more important factor than gender in the variations over time identified in the data.

Abstract:

The relationship between valuations and subsequent sale prices continues to be a matter of both theoretical and practical interest. This paper reports the analysis of over 700 property sales made during the 1974/90 period. Initial results imply an average under-valuation of 7% and a standard error of 18% across the sample. A number of techniques are applied to the data set, using other variables such as the region, the type of property and the return from the market, to explain the difference between the valuation and the subsequent sale price. The analysis reduces the unexplained error; the bias is fully accounted for, and the standard error is reduced to 15.3%. This model finds that about 6% of valuations over-estimated the sale price by more than 20%, and about 9% under-estimated it by more than 20%. The results suggest that valuations are marginally more accurate than might be expected, both from theoretical considerations and from comparison with equivalent valuations in equity markets.
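The bias and "standard error" quoted in such studies are cross-sectional statistics of per-sale valuation errors. A minimal sketch of how figures of this kind are computed (the helper name and toy data are hypothetical, not the paper's methodology):

```python
import math

def valuation_accuracy(valuations, prices):
    """Return (bias, dispersion) of proportional valuation errors.
    Negative bias means average under-valuation; the dispersion is the
    "standard error" in the sense used in valuation-accuracy studies."""
    errors = [(v - p) / p for v, p in zip(valuations, prices)]
    n = len(errors)
    bias = sum(errors) / n
    dispersion = math.sqrt(sum((e - bias) ** 2 for e in errors) / (n - 1))
    return bias, dispersion

# Toy data: one 10% under-valuation and one 10% over-valuation.
bias, disp = valuation_accuracy([90.0, 110.0], [100.0, 100.0])
```

Regressing these per-sale errors on covariates (region, property type, market return) is what reduces the unexplained dispersion, as the abstract describes.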

Abstract:

The problem of spurious excitation of gravity waves in the context of four-dimensional data assimilation is investigated using a simple model of balanced dynamics. The model admits a chaotic vortical mode coupled to a comparatively fast gravity wave mode, and can be initialized such that the model evolves on a so-called slow manifold, where the fast motion is suppressed. Identical twin assimilation experiments are performed, comparing the extended and ensemble Kalman filters (EKF and EnKF, respectively). The EKF uses a tangent linear model (TLM) to estimate the evolution of forecast error statistics in time, whereas the EnKF uses the statistics of an ensemble of nonlinear model integrations. Specifically, the case is examined where the true state is balanced, but observation errors project onto all degrees of freedom, including the fast modes. It is shown that the EKF and EnKF will assimilate observations in a balanced way only if certain assumptions hold, and that, outside of ideal cases (i.e., with very frequent observations), dynamical balance can easily be lost in the assimilation. For the EKF, the repeated adjustment of the covariances by the assimilation of observations can easily unbalance the TLM, and destroy the assumptions on which balanced assimilation rests. It is shown that an important factor is the choice of initial forecast error covariance matrix. A balance-constrained EKF is described and compared to the standard EKF, and shown to offer significant improvement for observation frequencies where balance in the standard EKF is lost. The EnKF is advantageous in that balance in the error covariances relies only on a balanced forecast ensemble, and that the analysis step is an ensemble-mean operation. Numerical experiments show that the EnKF may be preferable to the EKF in terms of balance, though its validity is limited by ensemble size. 
It is also found that overobserving can lead to a more unbalanced forecast ensemble and thus to an unbalanced analysis.
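The analysis step described above, an ensemble-mean operation with covariances drawn from the forecast ensemble itself, can be sketched as a stochastic (perturbed-observation) EnKF update. This is a generic illustration under assumed dimensions and variable names, not the paper's balanced model:

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, R, rng):
    """Stochastic EnKF analysis for an (n_state, n_ens) forecast ensemble.
    The forecast covariance is the ensemble sample covariance, so a
    balanced forecast ensemble implies balanced covariances."""
    n_state, n_ens = ensemble.shape
    mean = ensemble.mean(axis=1, keepdims=True)
    anomalies = (ensemble - mean) / np.sqrt(n_ens - 1)
    Pf = anomalies @ anomalies.T                      # sample covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)    # Kalman gain
    # Perturbed observations keep the analysis-ensemble spread consistent.
    perturbed = obs[:, None] + rng.multivariate_normal(
        np.zeros(len(obs)), R, size=n_ens).T
    return ensemble + K @ (perturbed - H @ ensemble)

rng = np.random.default_rng(0)
prior = rng.normal(loc=5.0, scale=1.0, size=(2, 100))
H = np.array([[1.0, 0.0]])          # observe the first state component
R = np.array([[0.01]])
analysis = enkf_analysis(prior, np.array([0.0]), H, R, rng)
```

Because the gain is built from the ensemble anomalies, no tangent linear model is needed, which is the source of the balance advantage over the EKF noted in the abstract.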

Abstract:

Background: The analysis of the Auditory Brainstem Response (ABR) is of fundamental importance to the investigation of auditory system behaviour, though its interpretation has a subjective nature because of the manual process employed in its study and the clinical experience required for its analysis. When analysing the ABR, clinicians are often interested in identifying the ABR signal components referred to as Jewett waves. In particular, the detection and study of the time at which these waves occur (i.e., the wave latency) is a practical tool for the diagnosis of disorders affecting the auditory system. Significant differences in inter-examiner results may lead to completely distinct clinical interpretations of the state of the auditory system. In this context, the aim of this research was to evaluate the inter-examiner agreement and variability in the manual classification of ABR. Methods: A total of 160 ABR data samples were collected, at four different stimulus intensities (80 dBHL, 60 dBHL, 40 dBHL and 20 dBHL), from 10 normal-hearing subjects (5 men and 5 women, aged 20 to 52 years). Four examiners with expertise in the manual classification of ABR components participated in the study. The Bland-Altman statistical method was employed to assess inter-examiner agreement and variability. The mean, standard deviation and error of the bias, which is the difference between examiners' annotations, were estimated for each pair of examiners. Scatter plots and histograms were employed for data visualization and analysis. Results: In most comparisons the differences between examiners' annotations were below 0.1 ms, which is clinically acceptable. In four cases, a large error and standard deviation (>0.1 ms) were found, indicating the presence of outliers and thus discrepancies between examiners.
Conclusions: Our results quantify the inter-examiner agreement and variability of the manual analysis of ABR data, and they also allow different patterns of manual ABR analysis to be identified.
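The Bland-Altman quantities reported above (the bias, its standard deviation, and agreement limits) reduce to simple statistics of paired latency differences. A minimal sketch with made-up latencies in ms (hypothetical values, not the study's data):

```python
import math

def bland_altman(latencies_a, latencies_b):
    """Bland-Altman agreement between two examiners' wave-latency
    annotations (ms): returns the bias (mean difference), its standard
    deviation, and the 95% limits of agreement."""
    diffs = [a - b for a, b in zip(latencies_a, latencies_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    limits = (bias - 1.96 * sd, bias + 1.96 * sd)
    return bias, sd, limits

# Hypothetical wave latencies from two examiners.
bias, sd, limits = bland_altman([5.6, 5.7, 5.8], [5.5, 5.7, 5.9])
```

A pair of examiners whose differences fall mostly inside limits narrower than ±0.1 ms would be judged clinically interchangeable by the criterion used in the study.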

Abstract:

The Code for Sustainable Homes (the Code) will require new homes in the United Kingdom to be 'zero carbon' from 2016. Drawing upon an evolutionary innovation perspective, this paper addresses a gap in the literature by investigating which low and zero carbon technologies are actually being used by house builders, rather than following the prevailing emphasis on the potentiality of these technologies. Using the results from a questionnaire, three empirical contributions are made. First, house builders are selecting a narrow range of technologies. Second, these choices are made to minimise disruption to their standard design and production templates (SDPTs). Finally, the coalescence around a small group of technologies is expected to intensify, with solar-based technologies predicted to become more important. This paper challenges the dominant technical rationality in the literature, which holds that technical efficiency and cost benefits are the primary drivers for technology selection. These drivers play an important role, but one that is mediated by the logic of maintaining the house builders' SDPTs. This emphasises the need for construction diffusion-of-innovation theory to be problematized and developed within the context of business and market regimes constrained and reproduced by resilient technological trajectories.

Abstract:

Aerosols affect the Earth's energy budget directly, by scattering and absorbing radiation, and indirectly, by acting as cloud condensation nuclei and thereby affecting cloud properties. However, large uncertainties exist in current estimates of aerosol forcing because of incomplete knowledge concerning the distribution and the physical and chemical properties of aerosols, as well as aerosol-cloud interactions. In recent years, a great deal of effort has gone into improving measurements and datasets. It is thus feasible to shift the estimates of aerosol forcing from largely model-based to increasingly measurement-based. Our goal is to assess current observational capabilities and identify uncertainties in the aerosol direct forcing through comparisons of different methods with independent sources of uncertainties. Here we assess the aerosol optical depth (τ), the direct radiative effect (DRE) by natural and anthropogenic aerosols, and the direct climate forcing (DCF) by anthropogenic aerosols, focusing on satellite and ground-based measurements supplemented by global chemical transport model (CTM) simulations. The multi-spectral MODIS measures global distributions of τ on a daily scale, with a high accuracy of ±0.03±0.05τ over ocean. The annual average τ is about 0.14 over the global ocean, of which about 21%±7% is contributed by human activities, as estimated from the MODIS fine-mode fraction. The multi-angle MISR derives an annual average τ of 0.23 over global land with an uncertainty of ~20% or ±0.05. These high-accuracy aerosol products and broadband flux measurements from CERES make it feasible to obtain observational constraints for the aerosol direct effect, especially over the global ocean. A number of measurement-based approaches estimate the clear-sky DRE (on solar radiation) at the top-of-atmosphere (TOA) to be about -5.5±0.2 Wm-2 (median ± standard error from various methods) over the global ocean.
Accounting for thin cirrus contamination of the satellite derived aerosol field will reduce the TOA DRE to -5.0 Wm-2. Because of a lack of measurements of aerosol absorption and difficulty in characterizing land surface reflection, estimates of DRE over land and at the ocean surface are currently realized through a combination of satellite retrievals, surface measurements, and model simulations, and are less constrained. Over the oceans the surface DRE is estimated to be -8.8±0.7 Wm-2. Over land, an integration of satellite retrievals and model simulations derives a DRE of -4.9±0.7 Wm-2 and -11.8±1.9 Wm-2 at the TOA and surface, respectively. CTM simulations derive a wide range of DRE estimates that on average are smaller than the measurement-based DRE by about 30-40%, even after accounting for thin cirrus and cloud contamination. A number of issues remain. Current estimates of the aerosol direct effect over land are poorly constrained. Uncertainties of DRE estimates are also larger on regional scales than on a global scale and large discrepancies exist between different approaches. The characterization of aerosol absorption and vertical distribution remains challenging. The aerosol direct effect in the thermal infrared range and in cloudy conditions remains relatively unexplored and quite uncertain, because of a lack of global systematic aerosol vertical profile measurements. A coordinated research strategy needs to be developed for integration and assimilation of satellite measurements into models to constrain model simulations. Enhanced measurement capabilities in the next few years and high-level scientific cooperation will further advance our knowledge.

Abstract:

Two different TAMSAT (Tropical Applications of Meteorological Satellites) methods of rainfall estimation were developed for northern and southern Africa, based on Meteosat images. These two methods were used to make rainfall estimates for the southern rainy season from October 1995 to April 1996. Estimates produced by both TAMSAT methods and estimates produced by the CPC (Climate Prediction Center) method were then compared with kriged data from over 800 raingauges in southern Africa. This shows that operational TAMSAT estimates are better over plateau regions, with 59% of estimates within one standard error (s.e.) of the kriged rainfall. Over mountainous regions the CPC approach is generally better, although all methods underestimate and give only 40% of estimates within 1 s.e. The two TAMSAT methods show little difference across a whole season, but examined in detail the northern method gives unsatisfactory calibrations. The CPC method does achieve significant overall improvement by incorporating real-time raingauge data, but only where sufficient raingauges are available.

Abstract:

The Lincoln–Petersen estimator is one of the most popular estimators used in capture–recapture studies. It was developed for a sampling situation in which two sources independently identify members of a target population. For each of the two sources, it is determined whether a unit of the target population is identified or not. This leads to a 2 × 2 table with frequencies f11, f10, f01, f00 indicating the number of units identified by both sources, by the first but not the second source, by the second but not the first source, and by neither source, respectively. However, f00 is unobserved, so the 2 × 2 table is incomplete and the Lincoln–Petersen estimator provides an estimate for f00. In this paper, we consider a generalization of this situation in which one source provides not only a binary identification outcome but also a count of how many times a unit has been identified. Using a truncated Poisson count model, truncating identification counts larger than two, we propose a maximum likelihood estimator of the Poisson parameter and, ultimately, of the population size. This estimator shows benefits, in comparison with Lincoln–Petersen's, in terms of bias and efficiency. It is also possible to test the homogeneity assumption that is not testable in the Lincoln–Petersen framework. The approach is applied to surveillance data on syphilis from Izmir, Turkey.
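For reference, the classical Lincoln–Petersen estimate that the proposed truncated-Poisson estimator is compared against follows directly from the incomplete 2 × 2 table, under independence of the two sources (the frequencies below are illustrative):

```python
def lincoln_petersen(f11, f10, f01):
    """Estimate population size from an incomplete 2x2 identification
    table. With n1 = f11 + f10 units seen by source 1 and n2 = f11 + f01
    seen by source 2, independence gives N_hat = n1 * n2 / f11; the
    unobserved cell f00 is then recovered by subtraction."""
    n1 = f11 + f10
    n2 = f11 + f01
    N_hat = n1 * n2 / f11
    f00_hat = N_hat - (f11 + f10 + f01)
    return N_hat, f00_hat

N_hat, f00_hat = lincoln_petersen(f11=20, f10=30, f01=20)
```

The paper's generalization replaces this closed form with a maximum likelihood fit to the truncated count distribution, which is what permits the homogeneity test mentioned above.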

Abstract:

Systematic review (SR) is a rigorous, protocol-driven approach designed to minimise error and bias when summarising the body of research evidence relevant to a specific scientific question. Taking as a comparator the use of SR in synthesising research in healthcare, we argue that SR methods could also pave the way for a “step change” in the transparency, objectivity and communication of chemical risk assessments (CRA) in Europe and elsewhere. We suggest that current controversies around the safety of certain chemicals are partly due to limitations in current CRA procedures which have contributed to ambiguity about the health risks posed by these substances. We present an overview of how SR methods can be applied to the assessment of risks from chemicals, and indicate how challenges in adapting SR methods from healthcare research to the CRA context might be overcome. Regarding the latter, we report the outcomes from a workshop exploring how to increase uptake of SR methods, attended by experts representing a wide range of fields related to chemical toxicology, risk analysis and SR. Priorities which were identified include: the conduct of CRA-focused prototype SRs; the development of a recognised standard of reporting and conduct for SRs in toxicology and CRA; and establishing a network to facilitate research, communication and training in SR methods. We see this paper as a milestone in the creation of a research climate that fosters communication between experts in CRA and SR and facilitates wider uptake of SR methods into CRA.

Abstract:

Background and Aims: Phosphate (Pi) is one of the most limiting nutrients for agricultural production in Brazilian soils due to low soil Pi concentrations and rapid fixation of fertilizer Pi by adsorption to oxidic minerals and/or precipitation by iron and aluminum ions. The objectives of this study were to quantify phosphorus (P) uptake and use efficiency in cultivars of the species Coffea arabica L. and Coffea canephora L., and group them in terms of efficiency and response to Pi availability. Methods: Plants of 21 cultivars of C. arabica and four cultivars of C. canephora were grown under contrasting soil Pi availabilities. Biomass accumulation, tissue P concentration and accumulation and efficiency indices for P use were measured. Key Results: Coffee plant growth was significantly reduced under low Pi availability, and P concentration was higher in cultivars of C. canephora. The young leaves accumulated more P than any other tissue. The cultivars of C. canephora had a higher root/shoot ratio and were significantly more efficient in P uptake, while the cultivars of C. arabica were more efficient in P utilization. Agronomic P use efficiency varied among coffee cultivars and E16 Shoa, E22 Sidamo, Iêmen and Acaiá cultivars were classified as the most efficient and responsive to Pi supply. A positive correlation between P uptake efficiency and root to shoot ratio was observed across all cultivars at low Pi supply. These data identify Coffea genotypes better adapted to low soil Pi availabilities, and the traits that contribute to improved P uptake and use efficiency. These data could be used to select current genotypes with improved P uptake or utilization efficiencies for use on soils with low Pi availability and also provide potential breeding material and targets for breeding new cultivars better adapted to the low Pi status of Brazilian soils. 
This could ultimately reduce the use of Pi fertilizers in tropical soils, and contribute to more sustainable coffee production.

Abstract:

The motion of a car is described using a stochastic model in which the driving processes are the steering angle and the tangential acceleration. The model incorporates exactly the kinematic constraint that the wheels do not slip sideways. Two filters based on this model have been implemented, namely the standard EKF, and a new filter (the CUF) in which the expectation and the covariance of the system state are propagated accurately. Experiments show that i) the CUF is better than the EKF at predicting future positions of the car; and ii) the filter outputs can be used to control the measurement process, leading to improved ability to recover from errors in predictive tracking.
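The kinematic constraint mentioned above (wheels do not slip sideways) ties the heading rate of a bicycle-model car to (v / L) tan(phi), with v the speed, L the wheelbase and phi the steering angle. A deterministic Euler-step sketch of that model follows; the wheelbase value, the discretisation and the variable names are assumptions, and the paper's stochastic model additionally drives phi and the tangential acceleration with noise:

```python
import math

def step(x, y, theta, v, phi, accel, wheelbase, dt):
    """One Euler step of the kinematic bicycle model. The no-side-slip
    constraint fixes the heading rate to (v / wheelbase) * tan(phi)."""
    x_new = x + v * math.cos(theta) * dt
    y_new = y + v * math.sin(theta) * dt
    theta_new = theta + (v / wheelbase) * math.tan(phi) * dt
    v_new = v + accel * dt
    return x_new, y_new, theta_new, v_new

# Straight-line check: zero steering and zero acceleration.
state = step(0.0, 0.0, 0.0, 1.0, 0.0, 0.0, wheelbase=2.5, dt=0.1)
```

A filter such as the EKF or the CUF propagates the mean and covariance of (x, y, theta, v) through this nonlinear map between measurement updates.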

Abstract:

Although climate models have been improving in accuracy and efficiency over the past few decades, it now seems that these incremental improvements may be slowing. As tera/petascale computing becomes massively parallel, our legacy codes are less suitable, and even with the increased resolution that we are now beginning to use, these models cannot represent the multiscale nature of the climate system. This paper argues that it may be time to reconsider the use of adaptive mesh refinement for weather and climate forecasting in order to achieve good scaling and representation of the wide range of spatial scales in the atmosphere and ocean. Furthermore, the challenge of introducing living organisms and human responses into climate system models is only just beginning to be tackled. We do not yet have a clear framework in which to approach the problem, but it is likely to cover such a huge number of different scales and processes that radically different methods may have to be considered. The challenges of multiscale modelling and petascale computing provide an opportunity to consider a fresh approach to numerical modelling of the climate (or Earth) system, which takes advantage of the computational fluid dynamics developments in other fields and brings new perspectives on how to incorporate Earth system processes. This paper reviews some of the current issues in climate (and, by implication, Earth) system modelling, and asks the question whether a new generation of models is needed to tackle these problems.

Abstract:

Models developed to identify the rates and origins of nutrient export from land to stream require an accurate assessment of the nutrient load present in the water body in order to calibrate model parameters and structure. These data are rarely available at a representative scale and in an appropriate chemical form except in research catchments. Observational errors associated with nutrient load estimates based on these data lead to a high degree of uncertainty in modelling and nutrient budgeting studies. Here, daily paired instantaneous P and flow data for 17 UK research catchments covering a total of 39 water years (WY) have been used to explore the nature and extent of the observational error associated with nutrient flux estimates based on partial fractions and infrequent sampling. The daily records were artificially decimated to create 7 stratified sampling records, 7 weekly records, and 30 monthly records from each WY and catchment. These were used to evaluate the impact of sampling frequency on load estimate uncertainty. The analysis underlines the high uncertainty of load estimates based on monthly data and individual P fractions rather than total P. Catchments with a high baseflow index and/or low population density were found to return a lower RMSE on load estimates when sampled infrequently than those with a low baseflow index and high population density. Catchment size was not shown to be important, though a limitation of this study is that daily records may fail to capture the full range of P export behaviour in smaller catchments with flashy hydrographs, leading to an underestimate of uncertainty in load estimates for such catchments. Further analysis of sub-daily records is needed to investigate this fully.
Here, recommendations are given on load estimation methodologies for different catchment types sampled at different frequencies, and the ways in which this analysis can be used to identify observational error and uncertainty for model calibration and nutrient budgeting studies. (c) 2006 Elsevier B.V. All rights reserved.
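To make the sampling-frequency experiment concrete, here is a hypothetical sketch: a "true" load from a daily record, and a flow-weighted estimate from a decimated record. The flow-weighted form is one common infrequent-sampling estimator, not necessarily the methodology recommended by the paper:

```python
def daily_load(concs, flows):
    """'True' load: the sum of daily instantaneous flux (conc * flow)."""
    return sum(c * q for c, q in zip(concs, flows))

def flow_weighted_load(concs, flows, every):
    """Load estimate from a record sampled every `every` days: the
    flow-weighted mean concentration of the samples is scaled by the
    complete flow record."""
    sampled = list(zip(concs, flows))[::every]
    mean_flux = sum(c * q for c, q in sampled) / len(sampled)
    mean_flow = sum(q for _, q in sampled) / len(sampled)
    return (mean_flux / mean_flow) * sum(flows)

# Constant toy record: both estimators should agree exactly.
concs = [1.0] * 28
flows = [2.0] * 28
true_load = daily_load(concs, flows)
est_load = flow_weighted_load(concs, flows, every=7)
```

Repeating this comparison over decimated real records, and summarising the discrepancies as an RMSE per catchment and sampling frequency, is the kind of uncertainty analysis the abstract describes.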