77 results for Homomorphic e-Auction, Bid Validity Check, Batch Verification, Oblivious Transfer
Abstract:
It is well established that functional verification (FV) is paramount within the hardware design cycle. With so many new techniques available today to help with FV, which techniques should we really use? The answer is not straightforward, and the choice is often confusing and costly. The tools and techniques to be used in a project have to be decided upon early in the design cycle to get the best value from these new verification methods. This paper gives a brief overview of FV, establishes the difference between verification and validation, describes the bottlenecks that appear in the verification process, examines the challenges in FV, and surveys current FV technologies and trends.
Abstract:
Cloud radar and lidar can be used to evaluate the skill of numerical weather prediction models in forecasting the timing and placement of clouds, but care must be taken in choosing the appropriate metric of skill to use due to the non-Gaussian nature of cloud-fraction distributions. We compare the properties of a number of different verification measures and conclude that of existing measures the Log of Odds Ratio is the most suitable for cloud fraction. We also propose a new measure, the Symmetric Extreme Dependency Score, which has very attractive properties, being equitable (for large samples), difficult to hedge and independent of the frequency of occurrence of the quantity being verified. We then use data from five European ground-based sites and seven forecast models, processed using the ‘Cloudnet’ analysis system, to investigate the dependence of forecast skill on cloud fraction threshold (for binary skill scores), height, horizontal scale and (for the Met Office and German Weather Service models) forecast lead time. The models are found to be least skilful at predicting the timing and placement of boundary-layer clouds and most skilful at predicting mid-level clouds, although in the latter case they tend to underestimate mean cloud fraction when cloud is present. It is found that skill decreases approximately inverse-exponentially with forecast lead time, enabling a forecast ‘half-life’ to be estimated. When considering the skill of instantaneous model snapshots, we find typical values ranging between 2.5 and 4.5 days. Copyright © 2009 Royal Meteorological Society
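The Log of Odds Ratio favoured above can be computed directly from the standard 2×2 contingency table of a binary forecast. The sketch below uses the textbook definition, LOR = ln(ad/bc); the counts are hypothetical and are not Cloudnet data.

```python
import math

def log_odds_ratio(hits, false_alarms, misses, correct_negatives):
    # Standard 2x2 contingency-table definition: LOR = ln((a*d) / (b*c)),
    # with a = hits, b = false alarms, c = misses, d = correct negatives.
    return math.log((hits * correct_negatives) / (false_alarms * misses))

# Illustrative (hypothetical) counts: forecast vs. observed cloud occurrence
# above some cloud-fraction threshold at one site.
print(log_odds_ratio(hits=320, false_alarms=80, misses=60, correct_negatives=540))  # ln(36) ≈ 3.58
```

Positive values indicate skill; a random forecast gives LOR = 0, and the score grows without bound as the table concentrates on the diagonal.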
Abstract:
It is sometimes argued that experimental economists do not have to worry about external validity so long as the design sticks closely to a theoretical model. This position mistakes the model for the theory. As a result, applied economics designs often study phenomena distinct from their stated objects of inquiry. Because the implemented models are abstract, they may provide improbable analogues to their stated subject matter. This problem is exacerbated by the relational character of the social world, which also sets epistemic limits for the social science laboratory more generally.
Abstract:
The HIRDLS instrument contains 21 spectral channels spanning a wavelength range from 6 to 18 μm. For each of these channels the spectral bandwidth and position are isolated by an interference bandpass filter at 301 K placed at an intermediate focal plane of the instrument. A second filter cooled to 65 K, positioned at the same wavelength but designed with a wider bandwidth, is placed directly in front of each cooled detector element to reduce stray radiation from internally reflected in-band signals, and to improve the out-of-band blocking. This paper describes the process of determining the spectral requirements for the two bandpass filters and the antireflection coatings used on the lenses and dewar window of the instrument. This process uses a system throughput performance approach taking the instrument spectral specification as a target. It takes into account the spectral characteristics of the transmissive optical materials, the relative spectral response of the detectors, thermal emission from the instrument, and the predicted atmospheric signal to determine the radiance profile for each channel. Using this design approach an optimal design for the filters can be achieved, minimising the number of layers to improve the in-band transmission and to aid manufacture. The use of this design method also permits the instrument spectral performance to be verified using the measured response from manufactured components. The spectral calculations for an example channel are discussed, together with the spreadsheet calculation method. All the contributions made by the spectrally active components to the resulting instrument channel throughput are identified and presented.
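The system-throughput idea in this abstract amounts to integrating the product of every spectrally active curve in the chain over wavelength. The sketch below shows that calculation in its most generic form; all curves (band centres, widths, the flat detector response and source radiance) are hypothetical placeholders, not HIRDLS data.

```python
import math

def gaussian_band(wl, centre, width):
    # Idealised bandpass shape; real interference filters are closer to square.
    return math.exp(-((wl - centre) / width) ** 2)

# Hypothetical spectral curves for one channel, on a 6-18 micron grid.
wl = [6.0 + 0.01 * i for i in range(1201)]
warm_filter = [gaussian_band(x, 12.0, 0.3) for x in wl]   # 301 K bandpass filter
cold_filter = [gaussian_band(x, 12.0, 0.5) for x in wl]   # wider 65 K blocking filter
detector    = [0.8 for _ in wl]                            # flat placeholder response
radiance    = [1.0 for _ in wl]                            # placeholder source term

# End-to-end spectral product of every component in the optical chain.
product = [a * b * c * d for a, b, c, d
           in zip(warm_filter, cold_filter, detector, radiance)]

# Trapezoidal integral over wavelength gives the in-band channel signal.
signal = sum(0.5 * (product[i] + product[i + 1]) * (wl[i + 1] - wl[i])
             for i in range(len(wl) - 1))
print(signal)
```

Swapping the placeholder curves for measured component responses is what lets the channel throughput be verified from manufactured parts, as the abstract describes.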
Abstract:
In this paper we discuss current work concerning Appearance-based and CAD-based vision, two opposing vision strategies. CAD-based vision is geometry based, reliant on having complete object-centred models. Appearance-based vision builds view-dependent models from training images. Existing CAD-based vision systems that work with intensity images have all used one- and zero-dimensional features, for example lines, arcs, points and corners. We describe a system we have developed for combining these two strategies. Geometric models are extracted from a commercial CAD library of industry standard parts. Surface appearance characteristics are then learnt automatically by observing actual object instances. This information is combined with geometric information and is used in hypothesis evaluation. This augmented description improves the system's robustness to texture, specularities and other artifacts which are hard to model with geometry alone, whilst maintaining the advantages of a geometric description.
Abstract:
A novel algorithm for solving nonlinear discrete time optimal control problems with model-reality differences is presented. The technique uses Dynamic Integrated System Optimisation and Parameter Estimation (DISOPE), which has been designed to achieve the correct optimal solution in spite of deficiencies in the mathematical model employed in the optimisation procedure. A method based on Broyden's ideas is used for approximating some derivative trajectories required. Ways for handling constraints on both manipulated and state variables are described. Further, a method for coping with batch-to-batch dynamic variations in the process, which are common in practice, is introduced. It is shown that the iterative procedure associated with the algorithm naturally suits applications to batch processes. The algorithm is successfully applied to a benchmark problem consisting of the input profile optimisation of a fed-batch fermentation process.
Abstract:
A novel algorithm for solving nonlinear discrete time optimal control problems with model-reality differences is presented. The technique uses dynamic integrated system optimisation and parameter estimation (DISOPE) which achieves the correct optimal solution in spite of deficiencies in the mathematical model employed in the optimisation procedure. A new method for approximating some Jacobian trajectories required by the algorithm is introduced. It is shown that the iterative procedure associated with the algorithm naturally suits applications to batch chemical processes.
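The model-reality correction idea behind DISOPE can be illustrated on a static toy problem (a sketch of the ISOPE-style modifier scheme, not the paper's dynamic algorithm): the optimiser only ever minimises the deficient model cost, plus a linear modifier built from the plant/model gradient mismatch, yet the iteration converges to the true plant optimum. All functions and gains below are invented for illustration.

```python
def plant_cost(u):
    # "Reality": assumed only evaluable, not known analytically; optimum at u = 3.
    return (u - 3.0) ** 2 + 1.0

def model_cost(u):
    # Deficient model used by the optimiser; its own optimum is at u = 1.
    return (u - 1.0) ** 2

def num_grad(f, u, h=1e-6):
    # Central finite difference, standing in for measured plant derivatives.
    return (f(u + h) - f(u - h)) / (2 * h)

u, gamma = 0.0, 0.5  # initial input and relaxation gain
for _ in range(50):
    # Modifier = plant/model gradient mismatch at the current iterate.
    lam = num_grad(plant_cost, u) - num_grad(model_cost, u)
    # Closed-form argmin of the modified model problem model_cost(u) + lam*u.
    u_star = 1.0 - lam / 2.0
    # Relaxed update, as in the iterative procedure the abstract describes.
    u = u + gamma * (u_star - u)

print(u)  # approaches 3.0, the true plant optimum, despite the wrong model
```

At convergence the modifier exactly cancels the model's gradient error, which is why the scheme reaches the correct optimum in spite of model deficiencies.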
Abstract:
Three potential explanations of past reforms of the Common Agricultural Policy (CAP) can be identified in the literature: a budget constraint, pressure from General Agreement on Tariffs and Trade/World Trade Organization (GATT/WTO) negotiations or commitments and a paradigm shift emphasising agriculture’s provision of public goods. This discussion on the driving forces of CAP reform links to broader theoretical questions on the role of budgetary politics, globalisation of public policy and paradigm shift in explaining policy change. In this article, the Health Check reforms of 2007/2008 are assessed. The reforms were probably more ambitious than first supposed, although the package agreed by ministers in November 2008 was watered down. We conclude that the Health Check was not primarily driven by budget concerns or by the supposed switch from the state-assisted to the multifunctional policy paradigm. The European Commission’s wish to adopt an offensive negotiating stance in the closing phases of the Doha Round was a more likely explanatory factor. The shape and purpose of the CAP post-2013 is contested, with divergent views among the Member States.
Abstract:
Research into the topic of liquidity has greatly benefited from the availability of data. Although bid-ask spreads were initially inaccessible to researchers, Roll (1984) provided a conceptual model that estimated the effective bid-ask spread from regular time-series data, recorded at a daily or longer interval. Later, data availability improved and researchers were able to address questions regarding the factors that influenced the spreads and the relationship between spreads and risk, return and liquidity. More recently, transaction data have been used to measure the effective spread, and researchers have been able to refine the concepts of liquidity to include the impact of transactions on price movements (Clayton and McKinnon, 2000) on a trade-by-trade basis. This paper aims to use techniques that combine elements from all three approaches and, by studying US data over a relatively long time period, to throw light on earlier research as well as to reveal the changes in liquidity over the period, controlling for extraneous factors such as market, age and size of REIT. It also reveals some comparable results for the UK market over the same period.
Abstract:
The chapter examines the evidence for budget concerns or external (WTO) pressures being the drivers for the 'Health Check' reform of the European Union's common agricultural policy.