933 results for correctness verification


Relevance: 10.00%

Abstract:

This paper presents ongoing research into integrating process automation and process management support in the context of media production. The work takes a holistic software-engineering approach to media production modelling to ensure design correctness, completeness and effectiveness. The focus of the research and development has been to enhance metadata management throughout the process, in a similar fashion to Decision Support Systems (DSS), so as to facilitate well-grounded business decisions. The paper sets out the aims, objectives and methodology deployed, describes the solution in some detail, and presents preliminary conclusions and planned future work.

Relevance: 10.00%

Abstract:

BCI systems require correct classification of signals interpreted from the brain for useful operation. To this end, this paper investigates a method proposed in [1] to correctly classify a series of images presented to a group of subjects in [2]. We show that it is possible to use the proposed methods to correctly recognise the original stimuli presented to a subject from analysis of their EEG. Additionally, we use a verification set to show that the trained classification method can be applied to a different set of data. We go on to investigate the issue of invariance in EEG signals, that is, whether the brain's representation of similar stimuli is recognisable across different subjects. Finally, we consider the usefulness of the methods investigated for an improved BCI system and discuss how they could lead to substantial improvements in ease of use for the end user by offering an alternative, more intuitive, control-based mode of operation.

Relevance: 10.00%

Abstract:

In this paper, we present a feature selection approach based on Gabor wavelet features and boosting for face verification. By convolution with a group of Gabor wavelets, the original images are transformed into vectors of Gabor wavelet features. Then, for each individual, a small set of significant features is selected by the boosting algorithm from the large set of Gabor wavelet features. The experimental results show that the approach successfully selects meaningful and explainable features for face verification. The experiments also suggest that common characteristics such as eyes, noses and mouths may not be as important as unique characteristics when the training set is small; when the training set is large, the unique and common characteristics are both important.
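The per-round selection mechanism described above can be illustrated with a minimal, self-contained sketch: an AdaBoost-style loop that, at each round, picks the single feature index whose best decision stump minimises the weighted classification error. The toy feature vectors and the stump weak learner are illustrative assumptions of this sketch; the paper's actual Gabor features and weak learner are not reproduced here.

```python
import math

def boost_select(features, labels, n_rounds=3):
    """AdaBoost-style feature selection with decision stumps.

    features: list of feature vectors; labels: +1/-1 per sample.
    Each round picks the index of the single feature whose best
    threshold stump minimises the weighted classification error,
    then reweights the samples to emphasise the stump's mistakes.
    """
    n, d = len(features), len(features[0])
    ws = [1.0 / n] * n  # uniform initial sample weights (sum to 1)
    selected = []
    for _ in range(n_rounds):
        best = None  # (weighted error, feature index, threshold, polarity)
        for j in range(d):
            xs = [f[j] for f in features]
            for t in sorted(set(xs)):
                # weighted error of the stump "predict +1 if x > t"
                err = sum(w for x, y, w in zip(xs, labels, ws)
                          if (1 if x > t else -1) != y)
                for e, p in ((err, 1), (1.0 - err, -1)):  # both polarities
                    if best is None or e < best[0]:
                        best = (e, j, t, p)
        err, j, t, p = best
        selected.append(j)
        alpha = 0.5 * math.log((1.0 - err) / max(err, 1e-10))
        preds = [p * (1 if f[j] > t else -1) for f in features]
        # up-weight misclassified samples, down-weight correct ones
        ws = [w * math.exp(-alpha * y * pr)
              for w, y, pr in zip(ws, labels, preds)]
        z = sum(ws)
        ws = [w / z for w in ws]
    return selected
```

On toy data where only the first feature separates the two classes, the first round selects feature index 0, mirroring how boosting concentrates on the few most discriminative Gabor responses.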

Relevance: 10.00%

Abstract:

In the forecasting of binary events, verification measures that are “equitable” were defined by Gandin and Murphy to satisfy two requirements: 1) they award all random forecasting systems, including those that always issue the same forecast, the same expected score (typically zero), and 2) they are expressible as the linear weighted sum of the elements of the contingency table, where the weights are independent of the entries in the table, apart from the base rate. The authors demonstrate that the widely used “equitable threat score” (ETS), as well as numerous others, satisfies neither of these requirements and only satisfies the first requirement in the limit of an infinite sample size. Such measures are referred to as “asymptotically equitable.” In the case of ETS, the expected score of a random forecasting system is always positive and only falls below 0.01 when the number of samples is greater than around 30. Two other asymptotically equitable measures are the odds ratio skill score and the symmetric extreme dependency score, which are more strongly inequitable than ETS, particularly for rare events; for example, when the base rate is 2% and the sample size is 1000, random but unbiased forecasting systems yield an expected score of around −0.5, reducing in magnitude to −0.01 or smaller only for sample sizes exceeding 25 000. This presents a problem since these nonlinear measures have other desirable properties, in particular being reliable indicators of skill for rare events (provided that the sample size is large enough). A potential way to reconcile these properties with equitability is to recognize that Gandin and Murphy’s two requirements are independent, and the second can be safely discarded without losing the key advantages of equitability that are embodied in the first. 
This enables inequitable and asymptotically equitable measures to be scaled to make them equitable, while retaining their nonlinearity and other properties such as being reliable indicators of skill for rare events. It also opens up the possibility of designing new equitable verification measures.
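The asymptotic (rather than exact) equitability of ETS is easy to reproduce numerically. The sketch below, with illustrative parameters of my choosing, computes ETS from a 2×2 contingency table and estimates the expected score of an unbiased random forecasting system by Monte Carlo; for small sample sizes the mean is visibly positive, while constant forecasts still score exactly zero.

```python
import random

def ets(hits, misses, false_alarms, correct_negatives):
    """Equitable threat score (Gilbert skill score) from a 2x2 contingency table."""
    n = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / n  # chance hits
    denom = hits + misses + false_alarms - hits_random
    return (hits - hits_random) / denom if denom else 0.0

def expected_random_ets(base_rate, n_samples, n_trials=8000, seed=0):
    """Monte Carlo estimate of E[ETS] for unbiased random forecasts."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        a = b = c = d = 0  # hits, false alarms, misses, correct negatives
        for _ in range(n_samples):
            obs = rng.random() < base_rate
            fcst = rng.random() < base_rate  # random, unbiased forecast
            if fcst and obs:
                a += 1
            elif fcst:
                b += 1
            elif obs:
                c += 1
            else:
                d += 1
        total += ets(a, c, b, d)
    return total / n_trials
```

A perfect forecast scores 1 and a constant "never" forecast scores 0, yet the random-forecast mean at small sample sizes is positive, consistent with the sample-size behaviour described above.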

Relevance: 10.00%

Abstract:

Increasing importance is being assigned to the estimation and verification of carbon stocks in forests. Forestry practice has several long-established and reliable methods for the assessment of aboveground biomass; however, we still lack accurate predictors of belowground biomass. A major windthrow event exposing the coarse root systems of Norway spruce trees allowed us to assess the effects of contrasting soil stone and water content on belowground allocation. Increasing stone content decreases the root/shoot ratio, while soil waterlogging increases it. We constructed allometric relationships for belowground biomass prediction and showed that only soil waterlogging significantly affects the model parameters. We showed that diameter at breast height is a reliable predictor of belowground biomass and that, once site-specific parameters have been developed, it is possible to accurately estimate belowground biomass in Norway spruce.
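Allometric relationships of the kind used here are conventionally fitted as a power law, B = a · DBH^b, by ordinary least squares in log-log space. The sketch below is a generic illustration with made-up coefficients, not the paper's fitted site-specific parameters.

```python
import math

def fit_allometric(dbh_cm, biomass_kg):
    """Fit ln(B) = ln(a) + b*ln(DBH) by ordinary least squares.

    Returns (a, b) for the power law B = a * DBH**b.
    """
    xs = [math.log(d) for d in dbh_cm]
    ys = [math.log(m) for m in biomass_kg]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope and intercept of the log-log regression line
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

def predict_biomass(a, b, dbh_cm):
    """Predict belowground biomass (kg) from diameter at breast height (cm)."""
    return a * dbh_cm ** b
```

Given trees of known DBH and excavated root biomass, the fitted (a, b) pair then predicts belowground biomass for unexcavated trees at the same site.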

Relevance: 10.00%

Abstract:

The past decade has witnessed explosive growth in mobile subscribers and services. With the purpose of providing better, swifter and cheaper services, radio network optimisation plays a crucial role but faces enormous challenges. The concept of Dynamic Network Optimisation (DNO) has therefore been introduced to optimally and continuously adjust network configurations in response to changes in network conditions and traffic. However, the realization of DNO has been seriously hindered by the bottleneck of optimisation speed. This paper presents an advanced distributed parallel solution that bridges the gap by accelerating a sophisticated proprietary network optimisation algorithm, while maintaining optimisation quality and numerical consistency. The ariesoACP product from Arieso Ltd serves as the main platform for acceleration. The solution has been prototyped, implemented and tested. Results from real projects exhibit high scalability and substantial acceleration, with average speed-ups of 2.5, 4.9 and 6.1 on distributed 5-core, 9-core and 16-core systems, respectively. This significantly outperforms other parallel solutions such as multi-threading. Furthermore, an improved optimisation outcome, with high correctness and self-consistency, has also been achieved. Overall, this is a breakthrough towards the realization of DNO.

Relevance: 10.00%

Abstract:

Modelling the interaction of terahertz (THz) radiation with biological tissue poses many interesting problems. THz radiation is not obviously described by either an electric field distribution or an ensemble of photons, and biological tissue is an inhomogeneous medium with an electronic permittivity that is both spatially and frequency dependent, making it a complex system to model. A three-layer system of parallel-sided slabs has been used as the system through which the passage of THz radiation has been simulated. Two modelling approaches have been developed: a thin film matrix model and a Monte Carlo model. The source data for each of these methods, taken at the same time as the data recorded to experimentally verify them, was a THz spectrum that had passed through air only. Experimental verification of these two models was carried out using a three-layered in vitro phantom. Simulated transmission spectrum data were compared to experimental transmission spectrum data, first to determine and then to compare the accuracy of the two methods. Good agreement was found, with typical results having a correlation coefficient of 0.90 for the thin film matrix model and 0.78 for the Monte Carlo model over the full THz spectrum. Further work is underway to improve the models above 1 THz.
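The thin film matrix model mentioned above is, in essence, the standard characteristic-matrix method for a layered stack. A minimal normal-incidence sketch follows; the indices and thicknesses are illustrative assumptions, whereas the paper's tissue parameters are dispersive and complex-valued, which this toy version supports only in principle.

```python
import cmath
import math

def layer_matrix(n, d, wavelength):
    """Characteristic 2x2 matrix of one slab (index n, thickness d), normal incidence."""
    delta = 2 * math.pi * n * d / wavelength  # phase thickness of the slab
    return [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
            [1j * n * cmath.sin(delta), cmath.cos(delta)]]

def mat_mul(a, b):
    """Multiply two 2x2 matrices."""
    return [[a[0][0] * b[0][0] + a[0][1] * b[1][0],
             a[0][0] * b[0][1] + a[0][1] * b[1][1]],
            [a[1][0] * b[0][0] + a[1][1] * b[1][0],
             a[1][0] * b[0][1] + a[1][1] * b[1][1]]]

def transmittance(layers, wavelength, n_in=1.0, n_out=1.0):
    """Power transmittance through a stack of (index, thickness) layers
    between two semi-infinite media of index n_in and n_out."""
    m = [[1.0, 0.0], [0.0, 1.0]]
    for n, d in layers:
        m = mat_mul(m, layer_matrix(n, d, wavelength))
    t = 2 * n_in / (n_in * m[0][0] + n_in * n_out * m[0][1]
                    + m[1][0] + n_out * m[1][1])
    return (n_out / n_in) * abs(t) ** 2
```

A symmetric three-layer phantom at 1 THz (free-space wavelength 300 µm) then yields a physically sensible transmittance between 0 and 1, and an index-matched slab transmits fully, which is a useful sanity check on the matrix algebra.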

Relevance: 10.00%

Abstract:

A vision system for recognizing rigid and articulated three-dimensional objects in two-dimensional images is described. Geometrical models are extracted from a commercial computer aided design package. The models are then augmented with appearance and functional information which improves the system's hypothesis generation, hypothesis verification, and pose refinement. Significant advantages over existing CAD-based vision systems, which utilize only information available in the CAD system, are realized. Examples show the system recognizing, locating, and tracking a variety of objects in a robot work-cell and in natural scenes.

Relevance: 10.00%

Abstract:

We report on a distributed moisture detection scheme which uses a cable design based on water-swellable hydrogel polymers. The cable modulates the loss characteristic of light guided within a multi-mode optical fibre in response to relative water potentials in the surrounding environment. Interrogation of the cable using conventional optical time-domain reflectometry (OTDR) instruments allows water ingress points to be identified and located with a spatial resolution of 50 cm. The system has been tested in a simulated tendon duct grouting experiment as a means of mapping the extent of fill along the duct during the grouting process. Voided regions were detected and identified to within 50 cm. A series of salt solutions has been used to determine the sensor behaviour over a range of water potentials. These experiments predict that measurements of soil moisture content can be made over the range 0 to −1500 kPa. Preliminary data on soil measurements have shown that the sensor can detect water pressure changes with a resolution of 45 kPa. Applications for the sensor include quality assurance of grouting procedures, verification of waterproofing barriers and soil moisture content determination (for load-bearing calculations).
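The OTDR localisation described above rests on a simple time-of-flight relation: an event at distance z along the fibre returns its backscatter after a round-trip time t = 2·n_g·z/c, and the two-point spatial resolution is set by the probe pulse width. The group index used below is a typical silica-fibre value assumed for illustration, not the paper's instrument specification.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def event_distance(round_trip_time_s, group_index=1.468):
    """Distance along the fibre to a loss event, from the OTDR round-trip time."""
    return C * round_trip_time_s / (2.0 * group_index)

def spatial_resolution(pulse_width_s, group_index=1.468):
    """Two-point spatial resolution set by the probe pulse width."""
    return C * pulse_width_s / (2.0 * group_index)
```

Under this assumed group index, a pulse of roughly 5 ns corresponds to the 50 cm resolution quoted above.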

Relevance: 10.00%

Abstract:

This paper presents an enhanced hypothesis verification strategy for 3D object recognition. A new learning methodology is presented which integrates the traditional dichotomic object-centred and appearance-based representations in computer vision giving improved hypothesis verification under iconic matching. The "appearance" of a 3D object is learnt using an eigenspace representation obtained as it is tracked through a scene. The feature representation implicitly models the background and the objects observed enabling the segmentation of the objects from the background. The method is shown to enhance model-based tracking, particularly in the presence of clutter and occlusion, and to provide a basis for identification. The unified approach is discussed in the context of the traffic surveillance domain. The approach is demonstrated on real-world image sequences and compared to previous (edge-based) iconic evaluation techniques.

Relevance: 10.00%

Abstract:

It is increasingly accepted that any possible climate change will not only influence mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The sub-continent is considered especially vulnerable to, and ill-equipped (in terms of adaptation) for, extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes are a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the ability of a state-of-the-art climate model to simulate climate at daily timescales is assessed using satellite-derived rainfall data from the Microwave Infrared Rainfall Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1° longitude/latitude. This paper concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of present-day rainfall variability over southern Africa; possible future changes in climate have been documented elsewhere. Simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are first compared to the MIRA dataset at daily timescales. 
Secondly, the ability of the model to reproduce daily rainfall extremes is assessed, again by comparison with extremes from the MIRA dataset. The results suggest that the model reproduces the number and spatial distribution of rainfall extremes with some accuracy, but that mean rainfall and rainfall variability are underestimated (overestimated) over wet (dry) regions of southern Africa.

Relevance: 10.00%

Abstract:

During April–May 2010, volcanic ash clouds from the Icelandic Eyjafjallajökull volcano reached Europe, causing an unprecedented disruption of the EUR/NAT region airspace. Civil aviation authorities banned all flight operations because of the threat posed by volcanic ash to modern turbine aircraft. New quantitative airborne ash mass concentration thresholds, still under discussion, were adopted for discerning regions contaminated by ash. This has implications for ash dispersal models routinely used to forecast the evolution of ash clouds. In this new context, quantitative model validation and assessment of the accuracy of current state-of-the-art models are of paramount importance. The passage of volcanic ash clouds over central Europe, a territory hosting a dense network of meteorological and air quality observatories, generated a quantity of observations unusual for volcanic clouds. From the ground, the cloud was observed by aerosol lidars, lidar ceilometers, sun photometers and other remote-sensing instruments, as well as in-situ collectors. From the air, sondes and multiple aircraft missions also took extremely valuable in-situ and remote-sensing measurements. These measurements constitute an excellent database for model validation. Here we validate the FALL3D ash dispersal model by comparing model results with ground- and airplane-based measurements obtained during the initial 14–23 April 2010 Eyjafjallajökull explosive phase. We run the model at high spatial resolution using as input hourly-averaged observed heights of the eruption column and the total grain size distribution reconstructed from field observations. Model results are then compared against remote ground-based and in-situ aircraft-based measurements, including lidar ceilometers from the German Meteorological Service, aerosol lidars and sun photometers from the EARLINET and AERONET networks, and flight missions of the German DLR Falcon aircraft. 
We find good quantitative agreement, with an error similar to the spread in the observations (depending, however, on the method used to estimate the mass eruption rate) for both airborne and ground mass concentrations. Such verification results help us understand and constrain the accuracy and reliability of ash transport models, and are of enormous relevance for designing future operational mitigation strategies at Volcanic Ash Advisory Centers.

Relevance: 10.00%

Abstract:

The calibration of the CloudSat spaceborne cloud radar has been thoroughly assessed using very accurate internal link budgets before launch, comparisons with predicted ocean surface backscatter at 94 GHz, direct comparisons with airborne cloud radars, and statistical comparisons with ground-based cloud radars at different locations of the world. It is believed that the calibration of CloudSat is accurate to within 0.5–1 dB. In the present paper it is shown that an approach similar to that used for the statistical comparisons with ground-based radars can now be adopted the other way around to calibrate other ground-based or airborne radars against CloudSat and/or to detect anomalies in long time series of ground-based radar measurements, provided that the calibration of CloudSat is followed up closely (which is the case). The power of using CloudSat as a global radar calibrator is demonstrated using the Atmospheric Radiation Measurement cloud radar data taken at Barrow, Alaska, the cloud radar data from the Cabauw site, Netherlands, and airborne Doppler cloud radar measurements taken along the CloudSat track in the Arctic by the Radar System Airborne (RASTA) cloud radar installed in the French ATR-42 aircraft for the first time. It is found that the Barrow radar data in 2008 are calibrated too high by 9.8 dB, while the Cabauw radar data in 2008 are calibrated too low by 8.0 dB. The calibration of the RASTA airborne cloud radar using direct comparisons with CloudSat agrees well with the expected gains and losses resulting from the change in configuration that required verification of the RASTA calibration.

Relevance: 10.00%

Abstract:

This paper describes a method that employs Earth Observation (EO) data to calculate spatiotemporal estimates of soil heat flux, G, using a physically-based method (the Analytical Method). The method involves a harmonic analysis of land surface temperature (LST) data. It also requires an estimate of near-surface soil thermal inertia; this property depends on soil textural composition and varies as a function of soil moisture content. The EO data needed to drive the model equations, and the ground-based data required to verify the method, were obtained over the Fakara domain within the African Monsoon Multidisciplinary Analysis (AMMA) program. LST estimates (3 km × 3 km, one image every 15 min) were derived from MSG-SEVIRI data. Soil moisture estimates were obtained from ENVISAT-ASAR data, while estimates of leaf area index (LAI), used to calculate the effect of the canopy on G (largely due to radiation extinction), were obtained from SPOT-HRV images. The variation of these variables over the Fakara domain, and the implications for values of G derived from them, are discussed. Results showed that this method provides reliable large-scale spatiotemporal estimates of G, and that variations in G could largely be explained by the variability in the model input variables. Furthermore, the method is relatively insensitive to model parameters related to vegetation or soil texture. However, the strong sensitivity of thermal inertia to soil moisture content at low relative saturation (<0.2) means that in arid or semi-arid climates accurate estimates of surface soil moisture content are of the utmost importance if reliable estimates of G are to be obtained. This method has the potential to improve large-scale evaporation estimates, to aid land surface model prediction, and to advance research that aims to explain the failure of energy balance closure in meteorological field studies.
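The harmonic-analysis step of such an Analytical Method can be sketched as follows: decompose the surface-temperature series into Fourier harmonics, then advance each harmonic of amplitude A_n and phase φ_n by π/4 and scale it by Γ·√(nω), where Γ is the thermal inertia and ω the diurnal angular frequency. This is a minimal illustration assuming a uniformly sampled series over one full cycle; the paper's EO-driven implementation and its thermal-inertia parameterisation are more involved.

```python
import math

def harmonic_soil_heat_flux(lst, dt, thermal_inertia, n_harmonics=3):
    """Soil heat flux series G(t) from a surface-temperature series via the
    analytical (harmonic) method: each temperature harmonic of amplitude A_n
    and phase phi_n contributes Gamma * A_n * sqrt(n*omega) *
    sin(n*omega*t + phi_n + pi/4)."""
    n = len(lst)
    period = n * dt
    omega = 2 * math.pi / period  # fundamental angular frequency
    # Fourier coefficients of the temperature series, expressed so that
    # harmonic k contributes amp * sin(k*omega*t + phase)
    coeffs = []
    for k in range(1, n_harmonics + 1):
        a = 2.0 / n * sum(lst[i] * math.cos(k * omega * i * dt) for i in range(n))
        b = 2.0 / n * sum(lst[i] * math.sin(k * omega * i * dt) for i in range(n))
        coeffs.append((math.hypot(a, b), math.atan2(a, b)))
    g = []
    for i in range(n):
        t = i * dt
        g.append(thermal_inertia * sum(
            amp * math.sqrt(k * omega) * math.sin(k * omega * t + phase + math.pi / 4)
            for k, (amp, phase) in enumerate(coeffs, start=1)))
    return g
```

For a purely sinusoidal diurnal temperature wave of amplitude A, the resulting flux has amplitude Γ·A·√ω and leads the temperature wave by π/4, the classic behaviour of heat conduction into a semi-infinite homogeneous soil.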

Relevance: 10.00%

Abstract:

There is currently increased interest from Government and industry in the UK, as well as at the European Community level and among international agencies (e.g. the American Department of Energy, the International Energy Agency), in improving the performance and uptake of Ground Coupled Heat Pumps (GCHP) in order to meet the 2020 renewable energy target. A sound knowledge base is required to help inform government agencies and advisory bodies; detailed site studies providing reliable data for model verification have an important role to play in this. In this study we summarise the effect of heat extraction by a horizontal ground heat exchanger (installed at 1 m depth) on the soil physical environment (between 0 and 1 m depth) for a site in the south of the UK. Our results show that the slinky heat exchanger significantly decreases the temperature of the surrounding soil. Furthermore, soil moisture contents were lower in the GCHP soil profile, most likely due to temperature-gradient-related soil moisture migration and a decreased hydraulic conductivity, the latter a result of increased viscosity (caused by the lower temperatures in the GCHP soil profile). These effects also caused considerable differences in soil thermal properties. This is the first detailed mechanistic study conducted in the UK with the aim of understanding the interactions between the soil, horizontal heat exchangers and the aboveground environment. An increased understanding of these interactions will help to achieve an optimal and sustainable use of soil heat resources in the future. The results of this study will help to calibrate and verify a simulation model that will provide UK-wide recommendations to improve future GCHP uptake and performance, while safeguarding soil physical resources.