178 results for Error Vector Magnitude (EVM)


Relevance: 20.00%

Abstract:

The ECMWF full-physics and dry singular vector (SV) packages, using a dry energy norm and a 1-day optimization time, are applied to four high-impact European cyclones of recent years that were almost universally badly forecast in the short range. It is shown that these full-physics SVs are much more relevant to severe cyclonic development than those based on dry dynamics plus boundary layer alone. The crucial extra ingredient is the representation of large-scale latent heat release. The severe winter storms all have a long, nearly straight region of high baroclinicity stretching across the Atlantic towards Europe, with a tongue of very high moisture content on its equatorward flank. In each case some of the final-time top SV structures pick out the region of the actual storm. The initial structures were generally located in the mid- to low troposphere. Forecasts based on initial conditions perturbed by moist SVs with opposite signs and various amplitudes show the range of possible 1-day outcomes for reasonable magnitudes of forecast error. In each case one of the perturbation structures gave a forecast very much closer to the actual storm than the control forecast. Deductions are made about the predictability of high-impact extratropical cyclone events. Implications are drawn for the short-range forecast problem and suggestions made for one practicable way to approach short-range ensemble forecasting. Copyright © 2005 Royal Meteorological Society.
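
The linear-algebra core of such SV computations can be sketched compactly. Below is a minimal illustration, not the ECMWF implementation: given a tangent-linear propagator M over the optimization time and an energy-norm matrix E (both random stand-ins here), the singular vectors that maximize energy growth follow from an ordinary SVD of E^(1/2) M E^(-1/2).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n)) / np.sqrt(n)   # stand-in tangent-linear propagator
E = np.diag(rng.uniform(0.5, 2.0, n))          # stand-in (diagonal) energy-norm weights

# SVs maximize ||M x||_E / ||x||_E; with A = E^(1/2) M E^(-1/2) this is an
# ordinary SVD problem, and the initial-time structures are x_i = E^(-1/2) v_i.
E_half = np.sqrt(E)
E_half_inv = np.diag(1.0 / np.diag(E_half))
U, s, Vt = np.linalg.svd(E_half @ M @ E_half_inv)

x1 = E_half_inv @ Vt[0]                        # leading initial-time SV
evolved = M @ x1                               # corresponding final-time structure
growth = np.sqrt(evolved @ E @ evolved) / np.sqrt(x1 @ E @ x1)
print(f"leading energy growth factor: {growth:.3f} (= sigma_1 = {s[0]:.3f})")
```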

Relevance: 20.00%

Abstract:

Flow in the world's oceans occurs at a wide range of spatial scales, from a fraction of a metre up to many thousands of kilometres. In particular, regions of intense flow are often highly localised, for example, western boundary currents, equatorial jets, overflows and convective plumes. Conventional numerical ocean models generally use static meshes. The use of dynamically-adaptive meshes has many potential advantages but needs to be guided by an error measure reflecting the underlying physics. A method of defining an error measure to guide an adaptive meshing algorithm for unstructured tetrahedral finite elements, utilizing an adjoint or goal-based method, is described here. This method is based upon a functional, encompassing important features of the flow structure. The sensitivity of this functional, with respect to the solution variables, is used as the basis from which an error measure is derived. This error measure acts to predict those areas of the domain where resolution should be changed. A barotropic wind-driven gyre problem is used to demonstrate the capabilities of the method. The overall objective of this work is to develop robust error measures for use in an oceanographic context which will ensure areas of fine mesh resolution are used only where and when they are required. (c) 2006 Elsevier Ltd. All rights reserved.
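
A minimal sketch of the goal-based (adjoint) idea, on a 1-D finite-difference Poisson problem standing in for the paper's unstructured tetrahedral finite elements: the sensitivity of a functional J(u) = g·u comes from an adjoint solve, and its product with a local residual flags where extra resolution would most improve J. All fields and parameters below are illustrative.

```python
import numpy as np

# 1-D Poisson problem -u'' = f on (0, 1), second-order finite differences
n = 100
h = 1.0 / (n + 1)
off = -np.ones(n - 1) / h**2
A = np.diag(2.0 * np.ones(n) / h**2) + np.diag(off, 1) + np.diag(off, -1)

x = np.linspace(h, 1.0 - h, n)
f = np.sin(np.pi * x)                              # source term
g = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)      # functional J(u) = g . u over a subregion

u = np.linalg.solve(A, f)                          # forward solve
lam = np.linalg.solve(A.T, g)                      # adjoint solve: sensitivity of J to residuals

# goal-based indicator: |adjoint| x |residual of an approximate solution|
rng = np.random.default_rng(1)
u_approx = u + 1e-3 * rng.standard_normal(n)       # stand-in for a coarse-mesh solution
indicator = np.abs(lam * (f - A @ u_approx))
print("refine near x =", float(x[np.argmax(indicator)]))
```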

Relevance: 20.00%

Abstract:

One of the primary goals of the Center for Integrated Space Weather Modeling (CISM) effort is to assess and improve prediction of the solar wind conditions in near-Earth space, arising from both quasi-steady and transient structures. We compare 8 years of L1 in situ observations to predictions of the solar wind speed made by the Wang-Sheeley-Arge (WSA) empirical model. The mean-square error (MSE) between the observed and model predictions is used to reach a number of useful conclusions: there is no systematic lag in the WSA predictions, the MSE is found to be highest at solar minimum and lowest during the rise to solar maximum, and the optimal lead time for 1 AU solar wind speed predictions is found to be 3 days. However, MSE is shown to frequently be an inadequate "figure of merit" for assessing solar wind speed predictions. A complementary, event-based analysis technique is developed in which high-speed enhancements (HSEs) are systematically selected and associated from observed and model time series. The WSA model is validated using comparisons of the number of hit, missed, and false HSEs, along with the timing and speed magnitude errors between the forecasted and observed events. Morphological differences between the different HSE populations are investigated to aid interpretation of the results and improvements to the model. Finally, by defining discrete events in the time series, model predictions from above and below the ecliptic plane can be used to estimate an uncertainty in the predicted HSE arrival times.
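
Both verification ideas can be illustrated with synthetic series standing in for the L1 observations and WSA predictions; the threshold and matching window below are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(2)
base = np.convolve(rng.standard_normal(1100), np.ones(30) / 30, mode="same")
obs = 450.0 + 300.0 * base[:1000]                      # synthetic observed speed (km/s)
model = 450.0 + 300.0 * base[3:1003] + 20.0 * rng.standard_normal(1000)

# (1) MSE as a function of imposed lag; the minimizing lag estimates any
# systematic timing offset between model and observations.
lags = range(-10, 11)
mse = [np.mean((obs[10 + k:990 + k] - model[10:990]) ** 2) for k in lags]
print("best lag (days):", list(lags)[int(np.argmin(mse))])

# (2) Event-based analysis: an HSE here is an upward threshold crossing; a
# model event within +/-2 days of an observed one counts as a hit (simple
# counting, without enforcing a one-to-one association).
def events(v, thresh=500.0):
    return np.flatnonzero((v[1:] >= thresh) & (v[:-1] < thresh)) + 1

obs_ev, mod_ev = events(obs), events(model)
hits = sum(any(abs(o - m) <= 2 for m in mod_ev) for o in obs_ev)
print(f"hits {hits}, misses {len(obs_ev) - hits}, false alarms {len(mod_ev) - hits}")
```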

Relevance: 20.00%

Abstract:

The one-dimensional variational assimilation of vertical temperature information in the presence of a boundary-layer capping inversion is studied. For an optimal analysis of the vertical temperature profile, an accurate representation of the background error covariances is essential. The background error covariances are highly flow-dependent due to the variability in the presence, structure and height of the boundary-layer capping inversion. Flow-dependent estimates of the background error covariances are obtained by studying the spread in an ensemble of forecasts. A forecast of the temperature profile (used as a background state) may have a significant error in the position of the capping inversion with respect to observations. It is shown that the assimilation of observations may weaken the inversion structure in the analysis if only magnitude errors are accounted for, as is the case for traditional data assimilation methods used for operational weather prediction. The positional error is treated explicitly here in a new data assimilation scheme to reduce positional error, in addition to the traditional framework to reduce magnitude error. The distribution of the positional error of the background inversion is estimated for use with the new scheme.
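
The magnitude-only framework that the abstract contrasts with its positional-error scheme is ordinary 1D-Var. A minimal sketch with idealized profiles and covariances (not the paper's setup) shows how such an analysis smears a displaced inversion rather than moving it:

```python
import numpy as np

def profile(z, z_inv):
    # mixed-layer lapse rate plus a 2 K jump at the capping inversion
    return 285.0 - 0.002 * z + 2.0 * (z > z_inv)

n = 60
z = np.linspace(0.0, 2000.0, n)                        # height (m)
xb = profile(z, 800.0)                                 # background: inversion too low
rng = np.random.default_rng(3)
y = profile(z, 1000.0) + 0.5 * rng.standard_normal(n)  # obs of truth (inversion at 1000 m)

B = np.exp(-((z[:, None] - z[None, :]) / 200.0) ** 2)  # background errors: 1 K, 200 m scale
R = 0.5**2 * np.eye(n)                                 # observation-error covariance
H = np.eye(n)                                          # direct observation of every level

# closed-form minimizer of J(x) = (x-xb)' B^-1 (x-xb) + (y-Hx)' R^-1 (y-Hx)
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
xa = xb + K @ (y - H @ xb)
# The analysis blends two displaced jumps, weakening the inversion: the
# positional error has been treated as if it were magnitude error.
print("max vertical gradient: background %.4f, analysis %.4f"
      % (np.max(np.gradient(xb, z)), np.max(np.gradient(xa, z))))
```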

Relevance: 20.00%

Abstract:

Estimating the magnitude of Agulhas leakage, the volume flux of water from the Indian to the Atlantic Ocean, is difficult because of the presence of other circulation systems in the Agulhas region. Indian Ocean water in the Atlantic Ocean is vigorously mixed and diluted in the Cape Basin. Eulerian integration methods, where the velocity field perpendicular to a section is integrated to yield a flux, have to be calibrated so that only the flux by Agulhas leakage is sampled. Two Eulerian methods for estimating the magnitude of Agulhas leakage are tested within a high-resolution two-way nested model with the goal of devising a mooring-based measurement strategy. At the GoodHope line, a section halfway through the Cape Basin, the integrated velocity perpendicular to that line is compared to the magnitude of Agulhas leakage as determined from the transport carried by numerical Lagrangian floats. In the first method, integration is limited to the flux of water warmer and more saline than specific threshold values. These threshold values are determined by maximizing the correlation with the float-determined time series. By using the threshold values, approximately half of the leakage can directly be measured. The total amount of Agulhas leakage can be estimated using a linear regression, within a 90% confidence band of 12 Sv. In the second method, a subregion of the GoodHope line is sought so that integration over that subregion yields an Eulerian flux as close to the float-determined leakage as possible. It appears that when integration is limited within the model to the upper 300 m of the water column within 900 km of the African coast, the time series have the smallest root-mean-square difference. This method yields a root-mean-square error of only 5.2 Sv but the 90% confidence band of the estimate is 20 Sv. It is concluded that the optimum thermohaline threshold method leads to more accurate estimates even though the directly measured transport is a factor of two lower than the actual magnitude of Agulhas leakage in this model.
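
A toy version of the thermohaline-threshold method: integrate transport through a section only where water exceeds temperature and salinity thresholds, then calibrate against a float-derived leakage series by linear regression. All fields, thresholds and units below are invented stand-ins, not model output.

```python
import numpy as np

rng = np.random.default_rng(4)
nt, nc = 500, 80                         # time steps, cells along the section
area = 1.0e6                             # cross-sectional area per cell (m^2)

leak = 15.0 + 3.0 * rng.standard_normal(nt)        # "float-derived" leakage (Sv)
T = 12.0 + 4.0 * rng.standard_normal((nt, nc))     # temperature (deg C)
S = 34.8 + 0.3 * rng.standard_normal((nt, nc))     # salinity
v = 0.1 * rng.standard_normal((nt, nc))            # background velocity noise (m/s)

warm = (T > 14.0) & (S > 35.0)                     # thermohaline threshold mask
# plant the leakage signal in the warm, salty cells (1 Sv = 1e6 m^3/s)
v += warm * leak[:, None] * 1.0e6 / (warm.sum(1, keepdims=True) + 1) / area

flux = (v * area * warm).sum(axis=1) / 1.0e6       # thresholded Eulerian flux (Sv)
slope, intercept = np.polyfit(flux, leak, 1)       # calibration regression
print(f"leakage ~ {slope:.2f} * flux + {intercept:.1f} Sv")
```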

Relevance: 20.00%

Abstract:

Targeted observations are generally taken in regions of high baroclinicity, but often show little impact. One plausible explanation is that important dynamical information, such as upshear tilt, is not extracted from the targeted observations by the data assimilation scheme and used to correct initial condition error. This is investigated by generating pseudo targeted observations which contain a singular vector (SV) structure that is not present in the background field or routine observations, i.e. assuming that the background has an initial condition error with tilted growing structure. Experiments were performed for a single case-study with varying numbers of pseudo targeted observations. These were assimilated by the Met Office four-dimensional variational (4D-Var) data assimilation scheme, which uses a 6 h window for observations and background-error covariances calculated using the National Meteorological Centre (NMC) method. The forecasts were run using the operational Met Office Unified Model on a 24 km grid. The results presented clearly demonstrate that a 6 h window 4D-Var system is capable of extracting baroclinic information from a limited set of observations and using it to correct initial condition error. To capture the SV structure well (projection of 0.72 in total energy), 50 sondes over an area of 1×10⁶ km² were required. When the SV was represented by only eight sondes along an example targeting flight track covering a smaller area, the projection onto the SV structure was lower; the resulting forecast perturbations showed an SV structure with increased tilt and reduced initial energy. The total energy contained in the perturbations decreased as the SV structure was less well described by the set of observations (i.e. as fewer pseudo observations were assimilated). The assimilated perturbation had lower energy than the SV unless the pseudo observations were assimilated with the dropsonde observation errors halved from operational values. Copyright © 2010 Royal Meteorological Society
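
The projection diagnostic quoted above (0.72 in total energy) can be written down directly. A minimal sketch, with random vectors and weights standing in for model fields:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
E = np.diag(rng.uniform(0.5, 2.0, n))        # total-energy weights (stand-in)
s = rng.standard_normal(n)                   # target SV structure
x = 0.8 * s + 0.5 * rng.standard_normal(n)   # perturbation recovered by assimilation

def e_dot(a, b):
    # total-energy inner product <a, b>_E = a' E b
    return a @ E @ b

proj = e_dot(x, s) / np.sqrt(e_dot(x, x) * e_dot(s, s))
print(f"projection onto the SV in total energy: {proj:.2f}")
```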

Relevance: 20.00%

Abstract:

A powerful way to test the realism of ocean general circulation models is to systematically compare observations of passive tracer concentration with model predictions. The general circulation models used in this way cannot resolve a full range of vigorous mesoscale activity (on length scales between 10 and 100 km). In the real ocean, however, this activity causes important variability in tracer fields. Thus, in order to rationally compare tracer observations with model predictions these unresolved fluctuations (the model variability error) must be estimated. We have analyzed this variability using an eddy-resolving reduced-gravity model in a simple midlatitude double-gyre configuration. We find that the wave number spectrum of tracer variance is only weakly sensitive to the distribution of (large scale, slowly varying) tracer sources and sinks. This suggests that a universal passive tracer spectrum may exist in the ocean. We estimate the spectral shape using high-resolution measurements of potential temperature on an isopycnal in the upper northeast Atlantic Ocean, finding a slope near k^−1.7 between 10 and 500 km. The typical magnitude of the variance is estimated by comparing tracer simulations using different resolutions. For CFC- and tritium-type transient tracers the peak magnitude of the model variability saturation error may reach 0.20 for scales shorter than 100 km. This is of the same order as the time mean saturation itself and well over an order of magnitude greater than the instrumental uncertainty.
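
A sketch of the slope-fitting step: generate a synthetic transect with a known spectral slope and recover it by a log-log fit over the 10-500 km band. Grid spacing and amplitudes are arbitrary stand-ins for the observed transect.

```python
import numpy as np

rng = np.random.default_rng(6)
n, dx = 4096, 1.0                          # samples, spacing (km)
k = np.fft.rfftfreq(n, d=dx)               # wavenumber (cycles per km)

# build a transect whose tracer-variance spectrum goes as k^-1.7
amp = np.zeros_like(k)
amp[1:] = k[1:] ** (-1.7 / 2)              # power ~ k^-1.7 => amplitude ~ k^-0.85
phase = np.exp(2j * np.pi * rng.random(k.size))
tracer = np.fft.irfft(amp * phase, n=n)

power = np.abs(np.fft.rfft(tracer)) ** 2
band = (k > 1 / 500) & (k < 1 / 10)        # wavelengths between 10 and 500 km
slope = np.polyfit(np.log(k[band]), np.log(power[band]), 1)[0]
print(f"fitted spectral slope: {slope:.2f}")   # expect about -1.7
```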

Relevance: 20.00%

Abstract:

The paper considers meta-analysis of diagnostic studies that use a continuous score for classification of study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analysis of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon it is suggested to use, instead, an overall estimate of the misclassification error, previously suggested and used as Youden's index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel–Haenszel estimator as a summary measure of the overall misclassification error is suggested, which adjusts for a potential study effect. The measure of the misclassification error based on Youden's index is advantageous in that it easily allows an extension to a likelihood approach, which is then able to cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example of a diagnostic meta-analysis on duplex Doppler ultrasound, with angiography as the standard, for stroke prevention.
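
The basic quantities are easy to state, though the paper's exact Mantel–Haenszel estimator may differ from the simple weighting sketched below; the 2×2 counts are invented for illustration.

```python
import numpy as np

# per-study 2x2 tables; columns: TP, FN, FP, TN (invented counts)
tables = np.array([
    [45,  5, 10, 40],
    [30, 10,  8, 52],
    [60, 15, 20, 55],
])
tp, fn, fp, tn = tables.T
sens = tp / (tp + fn)
spec = tn / (tn + fp)
youden = sens + spec - 1            # Youden's index J = sensitivity + specificity - 1

# Mantel-Haenszel-style weights: product of group sizes over study size,
# giving a summary that adjusts for study effect (one simple variant).
n1, n0 = tp + fn, fp + tn
w = n1 * n0 / (n1 + n0)
print("per-study J:", np.round(youden, 2))
print("weighted summary J:", round(float(np.sum(w * youden) / np.sum(w)), 3))
```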

Relevance: 20.00%

Abstract:

The ECMWF ensemble weather forecasts are generated by perturbing the initial conditions of the forecast using a subset of the singular vectors of the linearised propagator. Previous results show that better probabilistic forecasts are obtained from this ensemble if the mean of the spread and the variability of the spread are calibrated separately. We show results from a simple linear model which suggest that this may be a generic property of all singular-vector-based ensemble forecasting systems that use only a subset of the full set of singular vectors.
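
A minimal sketch of the two-part calibration, with synthetic spread and error data: the mean level of the spread and the scaling of its variability are fitted as separate parameters by maximizing a Gaussian likelihood of the forecast errors.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000
raw_spread = 1.0 + 0.3 * rng.random(n)            # raw ensemble spread (muted variability)
true_sd = 0.5 + 2.0 * (raw_spread - 1.0)          # actual error standard deviation
errors = true_sd * rng.standard_normal(n)         # verifying forecast errors

def neg_log_lik(params):
    a, b = params                                  # a: mean level, b: variability scaling
    sd = a + b * (raw_spread - raw_spread.mean())  # calibrated spread
    if np.any(sd <= 0):
        return np.inf
    return np.sum(np.log(sd) + errors**2 / (2 * sd**2))

# crude grid search keeps the sketch dependency-free
grid = [(a, b) for a in np.linspace(0.1, 2.0, 40) for b in np.linspace(0.1, 4.0, 40)]
a, b = min(grid, key=neg_log_lik)
print(f"mean level a = {a:.2f}, variability scaling b = {b:.2f}")
```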

Relevance: 20.00%

Abstract:

This paper exploits a structural time series approach to model the time pattern of multiple and resurgent food scares and their direct and cross-product impacts on consumer response. A structural time series Almost Ideal Demand System (STS-AIDS) is embedded in a vector error correction framework to allow for dynamic effects (VEC-STS-AIDS). Italian aggregate household data on meat demand are used to assess the time-varying impact of a resurgent BSE crisis (1996 and 2000) and the 1999 dioxin crisis. The VEC-STS-AIDS model monitors the short-run impacts and performs satisfactorily in terms of residual diagnostics, overcoming the major problems encountered by the customary vector error correction approach.
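
The error-correction ingredient that the VEC-STS-AIDS model builds on can be shown in its simplest (Engle-Granger two-step) form; the full structural time series demand system is far richer, and the series below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 400
x = np.cumsum(rng.standard_normal(n))            # integrated driver (e.g. log price)
y = 0.5 * x + 0.3 * rng.standard_normal(n)       # cointegrated response (e.g. budget share)

# step 1: long-run relation by OLS
beta = np.polyfit(x, y, 1)[0]
ect = y - beta * x                               # error-correction term (disequilibrium)

# step 2: short-run regression of dy on dx and the lagged ECT
dy, dx_, ect_l = np.diff(y), np.diff(x), ect[:-1]
X = np.column_stack([np.ones(n - 1), dx_, ect_l])
coef, *_ = np.linalg.lstsq(X, dy, rcond=None)
print(f"adjustment speed (ECT coefficient): {coef[2]:.2f}")  # expect negative
```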

Relevance: 20.00%

Abstract:

In Uganda, control of vector-borne diseases is mainly in the form of vector control and chemotherapy. There have been reports that acaricides are being misused in the pastoralist systems in Uganda. This is because of the belief by scientists that intensive application of acaricide is uneconomical and unsustainable, particularly in the indigenous cattle. The objective of this study was to investigate the strategies, rationale and effectiveness of vector-borne disease control by pastoralists. To carry out these investigations systematically, a combination of qualitative and quantitative research methods was used, in both the collection and the analysis of data. Cattle keepers were found to control tick-borne diseases (TBDs) mainly through spraying, in contrast with the control of trypanosomosis, for which the main method of control was chemotherapy. The majority of herders applied acaricides weekly and used an acaricide of lower strength than recommended by the manufacturers. They used very little acaricide wash, and spraying was preferred to dipping. Furthermore, pastoralists either treated sick animals themselves or did nothing at all, rather than using veterinary personnel. Oxytetracycline (OTC) was the drug commonly used in the treatment of TBDs. Nevertheless, although pastoralists may not have been following recommended practices in their control of ticks and tick-borne diseases, they were neither wasteful nor uneconomical and their methods appeared to be effective. Trypanosomosis was not a problem in either Sembabule or Mbarara district. Those who used trypanocides were found to use more drugs than were necessary.

Relevance: 20.00%

Abstract:

A number of authors have proposed clinical trial designs involving the comparison of several experimental treatments with a control treatment in two or more stages. At the end of the first stage, the most promising experimental treatment is selected, and all other experimental treatments are dropped from the trial. Provided it is good enough, the selected experimental treatment is then compared with the control treatment in one or more subsequent stages. The analysis of data from such a trial is problematic because of the treatment selection and the possibility of stopping at interim analyses. These aspects lead to bias in the maximum-likelihood estimate of the advantage of the selected experimental treatment over the control and to inaccurate coverage for the associated confidence interval. In this paper, we evaluate the bias of the maximum-likelihood estimate and propose a bias-adjusted estimate. We also propose an approach to the construction of a confidence region for the vector of advantages of the experimental treatments over the control based on an ordering of the sample space. These regions are shown to have accurate coverage, although they are also shown to be necessarily unbounded. Confidence intervals for the advantage of the selected treatment are obtained from the confidence regions and are shown to have more accurate coverage than the standard confidence interval based upon the maximum-likelihood estimate and its asymptotic standard error.
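
The selection bias itself is easy to reproduce by simulation. A sketch with invented design parameters, in which all experimental arms are truly equal to control, so any positive mean estimate is pure selection bias:

```python
import numpy as np

rng = np.random.default_rng(9)
K, n1, n2, reps = 4, 50, 100, 20000     # arms, stage sizes, simulation replicates
true_adv = np.zeros(K)                  # all experimental arms equal to control

est = np.empty(reps)
for rep in range(reps):
    # stage 1: arm-minus-control mean estimates; select the best arm
    stage1 = true_adv + rng.standard_normal(K) / np.sqrt(n1)
    sel = int(np.argmax(stage1))
    # stage 2: fresh data on the selected arm only
    stage2 = true_adv[sel] + rng.standard_normal() / np.sqrt(n2)
    # pooled maximum-likelihood estimate of the selected arm's advantage
    est[rep] = (n1 * stage1[sel] + n2 * stage2) / (n1 + n2)

print(f"true advantage 0, mean MLE {est.mean():.3f}")  # positive => selection bias
```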

Relevance: 20.00%

Abstract:

The theta-logistic is a widely used generalisation of the logistic model of regulated biological processes, used in particular to model population regulation. The parameter theta gives the shape of the relationship between per-capita population growth rate and population size. Estimation of theta from population counts is, however, subject to bias, particularly when there are measurement errors. Here we identify factors disposing towards accurate estimation of theta by simulation of populations regulated according to the theta-logistic model. Factors investigated were measurement error, environmental perturbation and length of time series. Large measurement errors bias estimates of theta towards zero. Where estimated theta is close to zero, the estimated annual return rate may help resolve whether this is due to bias. Environmental perturbations help yield unbiased estimates of theta. Where environmental perturbations are large, estimates of theta are likely to be reliable even when measurement errors are also large. By contrast, where the environment is relatively constant, unbiased estimates of theta can only be obtained if populations are counted precisely. Our results have practical conclusions for the design of long-term population surveys. Estimation of the precision of population counts would be valuable, and could be achieved in practice by repeating counts in at least some years. Increasing the length of time series beyond 10 or 20 years yields only small benefits. If populations are measured with appropriate accuracy, given the level of environmental perturbation, unbiased estimates can be obtained from relatively short censuses. These conclusions are optimistic for estimation of theta. (C) 2008 Elsevier B.V. All rights reserved.
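
A simulation sketch of the estimation problem: simulate theta-logistic counts with environmental and measurement noise, then profile over theta with a linear fit of observed growth rate on (scaled) population size raised to theta. Parameters are illustrative; raising sigma_obs biases the theta estimate toward zero, as the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(10)
r, K, theta, T = 0.3, 1000.0, 1.0, 30
sigma_env, sigma_obs = 0.1, 0.2             # environmental vs measurement noise

N = np.empty(T)
N[0] = 200.0
for t in range(T - 1):                      # theta-logistic with environmental noise
    N[t + 1] = N[t] * np.exp(r * (1.0 - (N[t] / K) ** theta)
                             + sigma_env * rng.standard_normal())
obs = N * np.exp(sigma_obs * rng.standard_normal(T))   # counts with measurement error

g = np.log(obs[1:] / obs[:-1])              # observed per-capita growth rates
scaled = obs[:-1] / obs.mean()              # scaling keeps the regression well conditioned

def sse(th):
    # for fixed theta the model g = a + b * N^theta is linear in (a, b)
    X = np.column_stack([np.ones(T - 1), scaled ** th])
    resid = g - X @ np.linalg.lstsq(X, g, rcond=None)[0]
    return resid @ resid

thetas = np.linspace(0.1, 3.0, 60)
print("estimated theta:", round(float(thetas[np.argmin([sse(th) for th in thetas])]), 2))
```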
