911 results for Verification



A new objective climatology of polar lows in the Nordic (Norwegian and Barents) seas has been derived from a database of diagnostics of objectively identified cyclones spanning the period January 2000 to April 2004. There are two distinct parts to this study: the development of the objective climatology and a characterization of the dynamical forcing of the polar lows identified. Polar lows are an intense subset of polar mesocyclones. Polar mesocyclones are distinguished from other cyclones in the database as those that occur in cold air outbreaks over the open ocean. The difference between the wet-bulb potential temperature at 700 hPa and the sea surface temperature (SST) is found to be an effective discriminator between the atmospheric conditions associated with polar lows and other cyclones in the Nordic seas. A verification study shows that the objective identification method is reliable in the Nordic seas region. After demonstrating success at identifying polar lows using the above method, the dynamical forcing of the polar lows in the Nordic seas is characterized. Diagnostics of the ratio of mid-level vertical motion attributable to quasi-geostrophic forcing from upper and lower levels (U/L ratio) are used to determine the prevalence of a recently proposed category of extratropical cyclogenesis, type C, for which latent heat release is crucial to development. Thirty-one percent of the objectively identified polar low events (36 of 115) exceeded the U/L ratio of 4.0, previously identified as a threshold for type C cyclones. There is a contrast between polar lows to the north and south of the Nordic seas. In the southern Norwegian Sea, the population of polar low events is dominated by type C cyclones. These possess strong convection and weak low-level baroclinicity. Over the Barents and northern Norwegian seas, the well-known cyclogenesis types A and B dominate. These possess stronger low-level baroclinicity and weaker convection.
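
As a rough illustration of the two diagnostics described above: the U/L threshold of 4.0 is taken from the abstract, while the theta_w - SST cutoff below is a placeholder, since the discriminating value is not quoted here.

```python
# Minimal sketch of the two polar-low diagnostics described above.
# The U/L threshold of 4.0 is from the text; THETA_W_SST_THRESHOLD is
# a hypothetical value, as the paper's cutoff is not quoted here.

THETA_W_SST_THRESHOLD = -5.0  # K, placeholder cold-air-outbreak criterion
UL_TYPE_C_THRESHOLD = 4.0     # ratio of upper- to lower-level QG forcing

def is_polar_low_candidate(theta_w_700: float, sst: float) -> bool:
    """Flag cyclones occurring in sufficiently cold air over the open ocean."""
    return (theta_w_700 - sst) < THETA_W_SST_THRESHOLD

def cyclogenesis_category(ul_ratio: float) -> str:
    """Label an event as type C when upper-level QG forcing dominates."""
    return "type C" if ul_ratio > UL_TYPE_C_THRESHOLD else "type A/B"

# Example: a cyclone with theta_w(700 hPa) = 265 K over a 278 K sea surface
if is_polar_low_candidate(265.0, 278.0):
    print(cyclogenesis_category(5.2))  # -> "type C"
```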


Advances made over the past decade in structure determination from powder diffraction data are reviewed with particular emphasis on algorithmic developments and the successes and limitations of the technique. While global optimization methods have been successful in the solution of molecular crystal structures, new methods are required to make the solution of inorganic crystal structures more routine. The use of complementary techniques such as NMR to assist structure solution is discussed and the potential for the combined use of X-ray and neutron diffraction data for structure verification is explored. Structures that have proved difficult to solve from powder diffraction data are reviewed and the limitations of structure determination from powder diffraction data are discussed. Furthermore, the prospects of solving small protein crystal structures over the next decade are assessed.


The impact of targeted sonde observations on the 1-3 day forecasts for northern Europe is evaluated using the Met Office four-dimensional variational data assimilation scheme and a 24 km gridlength limited-area version of the Unified Model (MetUM). The targeted observations were carried out during February and March 2007 as part of the Greenland Flow Distortion Experiment, using a research aircraft based in Iceland. Sensitive-area predictions using either total energy singular vectors or an ensemble transform Kalman filter were used to predict where additional observations should be made to reduce errors in the initial conditions of forecasts for northern Europe. Targeted sonde data were assimilated operationally into the MetUM. Hindcasts show that the impact of the sondes was mixed. Only two out of the five cases showed clear forecast improvement; the maximum forecast improvement seen over the verifying region was approximately 5% of the forecast error 24 hours into the forecast. These two cases are presented in more detail: in the first the improvement propagates into the verification region with a developing polar low; and in the second the improvement is associated with an upper-level trough. The impact of cycling targeted data in the background of the forecast (including the memory of previous targeted observations) is investigated. This is shown to cause a greater forecast impact, but does not necessarily lead to a greater forecast improvement. Finally, the robustness of the results is assessed using a small ensemble of forecasts.
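
A minimal sketch of the kind of impact measure summarised above, assuming an RMS error norm over the verification region (the paper's exact norm is not specified here): improvement is the error reduction of the targeted hindcast relative to the control.

```python
import numpy as np

def forecast_improvement(err_control: np.ndarray, err_targeted: np.ndarray) -> float:
    """Percentage reduction in RMS forecast error over a verification region
    when targeted sondes are assimilated (positive = sondes helped)."""
    rms_control = np.sqrt(np.mean(err_control ** 2))
    rms_targeted = np.sqrt(np.mean(err_targeted ** 2))
    return 100.0 * (rms_control - rms_targeted) / rms_control

# Hypothetical gridded errors (forecast minus verifying analysis):
rng = np.random.default_rng(0)
control = rng.normal(0.0, 1.0, size=(50, 50))
targeted = control * 0.95  # a ~5% error reduction, as in the best cases above
print(f"{forecast_improvement(control, targeted):.1f}% improvement")
```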


The chess endgame is increasingly being seen through the lens of, and therefore effectively defined by, a data ‘model’ of itself. It is vital that such models are clearly faithful to the reality they purport to represent. This paper examines that issue and systems-engineering responses to it, using the chess endgame as the exemplar scenario. A structured survey has been carried out of the intrinsic challenges and complexity of creating endgame data, by reviewing the past pattern of errors during work in progress, surfacing in publications, and occurring after the data was generated. Specific measures are proposed to counter the observed classes of error-risk, including a preliminary survey of techniques for using state-of-the-art verification tools to generate endgame tables (EGTs) that are correct by construction. The approach may be applied generically beyond the game domain.
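
One concrete example of the error-checking concern above is a local consistency test on a depth-to-mate table: every entry must be exactly one ply deeper than the best reply available to the side to move. The sketch below is an illustration of such a check, with a hypothetical successors() move generator and a toy three-position table; it is not the verification tooling the paper surveys.

```python
def check_dtm_entry(pos, dtm, successors, winning_side_to_move):
    """Local consistency of one depth-to-mate (DTM) entry, in plies.
    The side to move picks the best successor: the winner minimises the
    remaining depth, the loser maximises it. Either way the chosen child
    must sit exactly one ply closer to mate than the parent claims."""
    children = [dtm[s] for s in successors(pos)]
    if not children:
        return dtm[pos] == 0          # terminal: mate (or a table error)
    best = min(children) if winning_side_to_move else max(children)
    return dtm[pos] == best + 1

# Toy three-position line: "a" mates in 2 plies via "b", which mates in 1.
toy_dtm = {"a": 2, "b": 1, "c": 0}
toy_moves = {"a": ["b"], "b": ["c"], "c": []}
succ = lambda p: toy_moves[p]
print(check_dtm_entry("a", toy_dtm, succ, winning_side_to_move=True))  # True
```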


Forecasting atmospheric blocking is one of the main problems facing medium-range weather forecasters in the extratropics. The European Centre for Medium-Range Weather Forecasts (ECMWF) Ensemble Prediction System (EPS) provides an excellent basis for medium-range forecasting as it provides a number of different possible realizations of the meteorological future. This ensemble of forecasts attempts to account for uncertainties in both the initial conditions and the model formulation. Since 18 July 2000, routine output from the EPS has included the field of potential temperature on the potential vorticity (PV) = 2 PV units (PVU) surface, the dynamical tropopause. This has enabled the objective identification of blocking using an index based on the reversal of the meridional potential-temperature gradient. A year of EPS probability forecasts of Euro-Atlantic and Pacific blocking have been produced and are assessed in this paper, concentrating on the Euro-Atlantic sector. Standard verification techniques such as Brier scores, Relative Operating Characteristic (ROC) curves and reliability diagrams are used. It is shown that Euro-Atlantic sector-blocking forecasts are skilful relative to climatology out to 10 days, and are more skilful than the deterministic control forecast at all lead times. The EPS is also more skilful than a probabilistic version of this deterministic forecast, though the difference is smaller. In addition, it is shown that the onset of a sector-blocking episode is less well predicted than its decay. As the lead time increases, the probability forecasts tend towards a model climatology with slightly less blocking than is seen in the real atmosphere. This small under-forecasting bias in the blocking forecasts is possibly related to a westerly bias in the ECMWF model.
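
A minimal sketch of both pieces of machinery named above, the gradient-reversal blocking index and the Brier score, under illustrative settings (the central latitude, band width and idealised field are placeholders, not the paper's exact construction):

```python
import numpy as np

def blocking_index(theta, lats, central_lat=50.0, half_width=15.0):
    """Gradient-reversal index: mean theta on the dynamical tropopause over a
    band poleward of the central latitude minus the mean over an equatorward
    band. A positive value (reversed meridional gradient) marks blocking."""
    north = (lats >= central_lat) & (lats <= central_lat + half_width)
    south = (lats >= central_lat - half_width) & (lats < central_lat)
    return theta[north].mean() - theta[south].mean()

def brier_score(forecast_probs, outcomes):
    """Mean squared difference between forecast probabilities of blocking
    and the binary observed outcomes (0 = no block, 1 = block)."""
    return np.mean((np.asarray(forecast_probs) - np.asarray(outcomes)) ** 2)

# Idealised profile with warm air poleward of 50N: the gradient is reversed.
lats = np.linspace(30.0, 70.0, 41)
theta = np.where(lats >= 50.0, 330.0, 315.0)  # K
print(blocking_index(theta, lats) > 0)        # True -> blocked
print(brier_score([0.8, 0.2, 0.6], [1, 0, 1]))
```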


The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) is a World Weather Research Programme project. One of its main objectives is to enhance collaboration on the development of ensemble prediction between operational centers and universities by increasing the availability of ensemble prediction system (EPS) data for research. This study analyzes the prediction of Northern Hemisphere extratropical cyclones by nine different EPSs archived as part of the TIGGE project for the 6-month time period of 1 February 2008–31 July 2008, which included a sample of 774 cyclones. An objective feature tracking method has been used to identify and track the cyclones along the forecast trajectories. Forecast verification statistics have then been produced [using the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis as the truth] for cyclone position, intensity, and propagation speed, showing large differences between the different EPSs. The results show that the ECMWF ensemble mean and control have the highest level of skill for all cyclone properties. The Japan Meteorological Agency (JMA), the National Centers for Environmental Prediction (NCEP), the Met Office (UKMO), and the Canadian Meteorological Centre (CMC) have 1 day less skill for the position of cyclones throughout the forecast range. The relative performance of the different EPSs remains the same for cyclone intensity except for NCEP, which has larger errors than for position. NCEP, the Centro de Previsão de Tempo e Estudos Climáticos (CPTEC), and the Australian Bureau of Meteorology (BoM) all have faster intensity error growth in the earlier part of the forecast. They are also very underdispersive and significantly underpredict intensities, perhaps because the comparatively low spatial resolutions of these EPSs cannot accurately model the tilted structure essential to cyclone growth and decay. There is very little difference between the levels of skill of the ensemble mean and control for cyclone position, but the ensemble mean provides an advantage over the control for all EPSs except CPTEC in cyclone intensity and there is an advantage for propagation speed for all EPSs. ECMWF and JMA have an excellent spread–skill relationship for cyclone position. The EPSs are all much more underdispersive for cyclone intensity and propagation speed than for position, with ECMWF and CMC performing best for intensity and CMC performing best for propagation speed. ECMWF is the only EPS to consistently overpredict cyclone intensity, although the bias is small. BoM, NCEP, UKMO, and CPTEC significantly underpredict intensity and, interestingly, all the EPSs underpredict the propagation speed, that is, the cyclones move too slowly on average in all EPSs.
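
A minimal sketch of the position-error and spread diagnostics behind the spread-skill comparison above, treating cyclone centres as points on a locally Cartesian plane (a real verification would use great-circle distances along the tracked trajectories):

```python
import numpy as np

def position_error_and_spread(member_positions, analysis_position):
    """Ensemble-mean cyclone position error and ensemble spread.
    member_positions: (n_members, 2) array of (x, y) cyclone centres in km,
    analysis_position: the verifying analysis centre."""
    mean_pos = member_positions.mean(axis=0)
    error = np.linalg.norm(mean_pos - analysis_position)
    spread = np.mean(np.linalg.norm(member_positions - mean_pos, axis=1))
    return error, spread

# Hypothetical 10-member forecast of one cyclone centre at one lead time:
rng = np.random.default_rng(1)
members = rng.normal(loc=[500.0, 300.0], scale=80.0, size=(10, 2))
print(position_error_and_spread(members, np.array([520.0, 310.0])))
```

In an underdispersive ensemble, such as several of those noted above, the spread is systematically smaller than the error at matched lead times.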


The creation of OFDM-based Wireless Personal Area Networks (WPANs) has allowed the development of high bit-rate wireless communication devices suitable for streaming High Definition video between consumer products, as demonstrated in Wireless-USB and Wireless-HDMI. However, these devices need high-frequency clock rates, particularly in the OFDM, FFT and symbol-processing sections, resulting in high silicon cost and high electrical power consumption. The high clock rates make hardware prototyping difficult, so verification is very important but costly. Acknowledging that electrical power in wireless consumer devices is more critical than the number of implemented logic gates, this paper presents a Double Data Rate (DDR) architecture for implementation inside an OFDM baseband codec in order to halve the high-frequency clock rates. The presented architecture has been implemented and tested for ECMA-368 (Wireless-USB context), resulting in a maximum clock rate of 264 MHz, instead of the expected 528 MHz, anywhere on the baseband codec die.
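
The arithmetic behind the result is simple: consuming two samples per clock cycle (one per clock edge) halves the clock needed for a given sample rate. A toy sketch, using the 528 MHz figure quoted above as the single-data-rate requirement:

```python
# Two data items are consumed per clock cycle (one per edge), so the clock
# needed for a given sample rate is halved. Figures follow the ECMA-368
# numbers quoted above; the per-cycle model is an illustrative simplification.

SAMPLE_RATE_MHZ = 528.0   # ECMA-368 baseband sample rate

def required_clock_mhz(sample_rate_mhz: float, samples_per_cycle: int) -> float:
    return sample_rate_mhz / samples_per_cycle

print(required_clock_mhz(SAMPLE_RATE_MHZ, 1))  # 528.0 -> single data rate
print(required_clock_mhz(SAMPLE_RATE_MHZ, 2))  # 264.0 -> DDR, as reported
```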


There is a concerted global effort to digitize biodiversity occurrence data from herbarium and museum collections that together offer an unparalleled archive of life on Earth over the past few centuries. The Global Biodiversity Information Facility provides the largest single gateway to these data. Since 2004 it has provided a single point of access to specimen data from databases of biological surveys and collections. Biologists now have rapid access to more than 120 million observations, for use in many biological analyses. We investigate the quality and coverage of data digitally available, from the perspective of a biologist seeking distribution data for spatial analysis on a global scale. We present an example of automatic verification of geographic data, using distributions from the International Legume Database and Information Service to test empirically issues of geographic coverage and accuracy. There are over half a million records, covering 31% of all legume species, and 84% of these records pass geographic validation. These data are not yet a global biodiversity resource for all species, or all countries. A user will encounter many biases and gaps in these data, which should be understood before they are used or analyzed. The data are notably deficient in many of the world's biodiversity hotspots. The deficiencies in data coverage can be resolved by an increased application of resources to digitize and publish data throughout these most diverse regions. But in the push to provide ever more data online, we should not forget that consistent data quality is of paramount importance if the data are to be useful in capturing a meaningful picture of life on Earth.
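
A toy version of the automatic geographic verification described above: a record passes if its coordinates fall inside a bounding box of a country from which the species is recorded. The boxes and species data below are hypothetical stand-ins for the International Legume Database and Information Service distributions used in the study.

```python
# Coarse bounding boxes: (min_lat, max_lat, min_lon, max_lon). Illustrative
# values only; a real check would use proper country polygons.
COUNTRY_BBOXES = {
    "Brazil": (-34.0, 5.3, -74.0, -34.8),
    "Kenya": (-4.7, 5.5, 33.9, 41.9),
}

def record_passes(lat, lon, recorded_countries):
    """Pass a record whose coordinates fall in any country the species is
    recorded from; otherwise flag it as failing geographic validation."""
    for country in recorded_countries:
        lo_lat, hi_lat, lo_lon, hi_lon = COUNTRY_BBOXES[country]
        if lo_lat <= lat <= hi_lat and lo_lon <= lon <= hi_lon:
            return True
    return False

# A record at (-10.0, -50.0) for a species known from Brazil: passes.
print(record_passes(-10.0, -50.0, ["Brazil"]))   # True
# The same species reported at (10.0, 10.0): fails geographic validation.
print(record_passes(10.0, 10.0, ["Brazil"]))     # False
```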


Hunting foxes with hounds has been a countryside pursuit in Britain since the 17th century, but its effect nationally on habitat management is little understood by the general public. A survey questionnaire was distributed to 163 mounted fox hunts of England and Wales to quantify their management practices in woodland and other habitat. Ninety-two hunts (56%), covering 75,514 km², returned details on woodland management motivated by the improvement of their sport. The management details were verified via on-site visits for a sample of 200 woodlands. Following verification, the area of woodlands containing the management was conservatively estimated at 24,053 (±2,241) ha, comprising 5.9% of woodland area within the whole of the area hunted by the 92 hunts. Management techniques included: tree planting, coppicing, felling, ride and perimeter management. A case study in five hunt countries in southern England examined, through the use of botanical survey and butterfly counts, the consequences of the hunt management on woodland ground flora and butterflies. Managed areas had, within the last 5 years, been coppiced and rides had been cleared. Vegetation cover in managed and unmanaged sites averaged 86% and 64%, respectively, and managed areas held on average 4 more plant species and a higher plant diversity than unmanaged areas (Shannon index of diversity: 2.25 vs. 1.95). Both the average number of butterfly species (2.2 vs. 0.3) and individuals counted (4.6 vs. 0.3) were higher in the managed than unmanaged sites.
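
For reference, the Shannon diversity index quoted above is H' = -sum(p_i ln p_i) over the proportional abundances p_i of each species; a minimal sketch with invented quadrat counts:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical counts: a managed plot with more, more evenly spread species
managed = [12, 10, 9, 8, 7, 6, 5, 4, 3, 3]
unmanaged = [30, 10, 5, 3, 2]
print(f"managed: {shannon_index(managed):.2f}")     # higher diversity
print(f"unmanaged: {shannon_index(unmanaged):.2f}")
```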


The management of information in engineering organisations faces a particular challenge in the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information in order to avoid information overload and to retain the right information for reuse. Using as a starting point a number of the current tools and techniques that attempt to obtain ‘the value’ of information, it is proposed that an assessment or filter mechanism for information needs to be developed. This paper addresses this issue firstly by briefly reviewing the information-overload problem, the definition of value, and related research on the value of information in various areas. A “characteristic”-based framework of information evaluation is then introduced, using the key characteristics identified in related work as an example. A Bayesian network method is introduced to the framework to build the linkage between the characteristics and information value, in order to calculate the quality and value of information quantitatively. The training and verification process for the model is then described, using 60 real engineering documents as a sample. The model gives reasonably accurate results; the differences between the model calculations and the training judgements are summarised and their potential causes discussed. Finally, several further issues are raised, including challenges to the framework and the implementation of this evaluation method.
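
The paper's Bayesian network is not reproduced here, but the idea of probabilistically linking characteristics to information value can be sketched with a single two-state "valuable" node and naive-Bayes evidence; every probability and characteristic name below is invented for illustration.

```python
# Toy stand-in for the paper's Bayesian network: a binary "valuable" node
# with independent binary characteristics as evidence (naive Bayes).
# All probabilities and characteristic names are invented for illustration.

PRIOR_VALUABLE = 0.3
# P(characteristic present | valuable), P(characteristic present | not valuable)
LIKELIHOODS = {
    "up_to_date": (0.9, 0.4),
    "authoritative_source": (0.8, 0.3),
    "frequently_reused": (0.7, 0.2),
}

def p_valuable(evidence: dict) -> float:
    """Posterior probability a document is valuable given its characteristics."""
    p_v, p_nv = PRIOR_VALUABLE, 1.0 - PRIOR_VALUABLE
    for name, present in evidence.items():
        l_v, l_nv = LIKELIHOODS[name]
        p_v *= l_v if present else (1.0 - l_v)
        p_nv *= l_nv if present else (1.0 - l_nv)
    return p_v / (p_v + p_nv)

doc = {"up_to_date": True, "authoritative_source": True, "frequently_reused": False}
print(f"P(valuable) = {p_valuable(doc):.2f}")
```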


Probiotics—live microorganisms that when administered in adequate amounts confer a health benefit on the host—have been studied for both human and animal applications, and worldwide research on this topic has accelerated in recent years. This paper reviews the literature on probiotics, describes how probiotics work in human ecosystems, and outlines the impact of probiotics on human health and disease. The paper also addresses safety issues of probiotic use, suggests future developments in the field of probiotics, and provides research and policy recommendations. Product considerations and potential future developments regarding probiotics also are discussed. The authors conclude that controlled human studies have revealed a diverse range of health benefits from consumption of probiotics, due largely to their impact on immune function or on microbes colonizing the body. Additional, well-designed and properly controlled human and mechanistic studies with probiotics will advance the essential understanding of active principles, mechanisms of action, and degree of effects that can be realized by specific consumer groups. Recommendations include establishment of a standard of identity for the term “probiotic,” adoption of third-party verification of label claims, use of probiotics selectively in clinical conditions, and use of science-based assessment of the benefits and risks of genetically engineered probiotic microbes.


The SystemVerilog implementation of the Open Verification Methodology (OVM) is exercised on an 8b/10b RTL open-core design, as a simple yet complete exercise intended to expose the key features of OVM. Emphasis is put on the actual usage of the verification components rather than on a complete verification flow, with the aim of helping readers unfamiliar with OVM apply the methodology to their own designs. A link to the complete code is given to reinforce this aim. We found the methodology easy to use, though intimidating at first glance, especially for someone with little experience in object-oriented programming. However, the flexibility, portability and reusability of the verification code become clear once the first steps have been taken.
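
OVM testbenches themselves are SystemVerilog, but the kind of reference check an OVM scoreboard performs on an 8b/10b stream can be illustrated in Python. The sketch below tests only the running-disparity property of 8b/10b (the symbols are hand-picked examples; this is a property check, not a full encode/decode model):

```python
def check_running_disparity(symbols, rd=-1):
    """Check the 8b/10b running-disparity rule over a stream of 10-bit
    symbols: each symbol's disparity (ones minus zeros) must be 0 or +/-2,
    and a nonzero disparity must drive the running disparity toward the
    opposite sign (RD -1 accepts disparity 0 or +2; RD +1 accepts 0 or -2)."""
    for sym in symbols:
        ones = bin(sym & 0x3FF).count("1")
        disparity = ones - (10 - ones)
        if disparity not in (-2, 0, 2):
            return False
        if disparity == 2 and rd == 1:
            return False
        if disparity == -2 and rd == -1:
            return False
        if disparity:
            rd = -rd
    return True

# 0b1010101010 has disparity 0; 0b1110101010 has disparity +2.
print(check_running_disparity([0b1010101010, 0b1110101010, 0b0001010101]))
```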


BCI systems require correct classification of signals interpreted from the brain for useful operation. To this end, this paper investigates a method proposed in [1] to correctly classify a series of images presented to a group of subjects in [2]. We show that it is possible to use the proposed methods to correctly recognise the original stimuli presented to a subject from analysis of their EEG. Additionally, we use a verification set to show that the trained classification method can be applied to a different set of data. We go on to investigate the issue of invariance in EEG signals: that is, whether the brain's representation of similar stimuli is recognisable across different subjects. Finally, we consider the usefulness of the methods investigated for an improved BCI system and discuss how they could potentially lead to great improvements in ease of use for the end user by offering an alternative, more intuitive, control-based mode of operation.
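
As a toy illustration of the train-then-verify protocol described above, the sketch below fits a nearest-centroid classifier (a stand-in; the paper's actual method is the one proposed in [1]) on surrogate feature vectors and scores it on a held-out verification set:

```python
import numpy as np

def fit_centroids(features, labels):
    """One centroid per stimulus class from labelled training features."""
    return {lab: features[labels == lab].mean(axis=0) for lab in np.unique(labels)}

def predict(centroids, x):
    """Assign x to the class with the nearest centroid."""
    return min(centroids, key=lambda lab: np.linalg.norm(x - centroids[lab]))

rng = np.random.default_rng(42)
n_per_class, dim = 40, 16
# Two stimulus classes with different mean "EEG feature" patterns (surrogate
# data, not real EEG):
train_x = np.vstack([rng.normal(0.0, 1.0, (n_per_class, dim)),
                     rng.normal(1.0, 1.0, (n_per_class, dim))])
train_y = np.repeat([0, 1], n_per_class)
verify_x = np.vstack([rng.normal(0.0, 1.0, (10, dim)),
                      rng.normal(1.0, 1.0, (10, dim))])
verify_y = np.repeat([0, 1], 10)

centroids = fit_centroids(train_x, train_y)
accuracy = np.mean([predict(centroids, x) == y for x, y in zip(verify_x, verify_y)])
print(f"verification accuracy: {accuracy:.2f}")
```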


In this paper, we present a feature-selection approach based on Gabor wavelet features and boosting for face verification. By convolution with a group of Gabor wavelets, the original images are transformed into vectors of Gabor wavelet features. Then, for each individual, a small set of significant features is selected by the boosting algorithm from the large set of Gabor wavelet features. The experimental results show that the approach successfully selects meaningful and explainable features for face verification. The experiments also suggest that common characteristics such as the eyes, nose and mouth may not be as important as unique characteristics when the training set is small; when the training set is large, both the unique and the common characteristics are important.
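
A minimal sketch of the first stage described above: constructing a small Gabor wavelet bank and convolving it with an image to produce feature values. All parameter values, and the single sampling position, are illustrative simplifications.

```python
import numpy as np

def gabor_kernel(size=15, sigma=3.0, theta=0.0, lam=6.0, psi=0.0):
    """Real part of a Gabor wavelet: a Gaussian-windowed sinusoid at
    orientation theta and wavelength lam (parameter values are illustrative)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam + psi)

def gabor_features(image, kernels):
    """Magnitude responses at the image centre, one feature per wavelet
    (a full implementation would sample many positions per kernel)."""
    c = np.array(image.shape) // 2
    h = kernels[0].shape[0] // 2
    patch = image[c[0]-h:c[0]+h+1, c[1]-h:c[1]+h+1]
    return np.array([abs(np.sum(patch * k)) for k in kernels])

# A small bank of orientations, applied to a random stand-in "face" image
kernels = [gabor_kernel(theta=t) for t in np.linspace(0, np.pi, 4, endpoint=False)]
image = np.random.default_rng(7).normal(size=(64, 64))
print(gabor_features(image, kernels))
```

Boosting would then iterate over a large pool of such features, keeping at each round the single feature whose simple threshold classifier achieves the lowest weighted error.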