957 results for VERIFICATION


Relevance:

10.00%

Publisher:

Abstract:

Formal tools like finite-state model checkers have proven useful in verifying the correctness of systems of bounded size and for hardening single system components against arbitrary inputs. However, conventional applications of these techniques are not well suited to characterizing emergent behaviors of large compositions of processes. In this paper, we present a methodology by which arbitrarily large compositions of components can, if sufficient conditions are proven concerning properties of small compositions, be modeled and completely verified by performing formal verifications upon only a finite set of compositions. The sufficient conditions take the form of reductions, which are claims that particular sequences of components will be causally indistinguishable from other shorter sequences of components. We show how this methodology can be applied to a variety of network protocol applications, including two features of the HTTP protocol, a simple active networking applet, and a proposed web cache consistency algorithm. We also discuss its applicability to framing protocol design goals and to representing systems which employ non-model-checking verification methodologies. Finally, we briefly discuss how we hope to broaden this methodology to more general topological compositions of network applications.
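
To make the reduction idea concrete, the following minimal Python sketch applies reduction rules until a composition is irreducible; the component names and rules are illustrative placeholders, not the paper's actual rule set.

```python
# A reduction claims that a sequence of components is causally
# indistinguishable from a shorter sequence, e.g. that two adjacent
# caches behave like a single cache. (Hypothetical rules.)
REDUCTIONS = {
    ("cache", "cache"): ("cache",),
    ("proxy", "proxy"): ("proxy",),
}

def reduce_composition(comp):
    """Apply reductions until the composition is irreducible."""
    comp = tuple(comp)
    changed = True
    while changed:
        changed = False
        for pattern, shorter in REDUCTIONS.items():
            n = len(pattern)
            for i in range(len(comp) - n + 1):
                if comp[i:i + n] == pattern:
                    comp = comp[:i] + shorter + comp[i + n:]
                    changed = True
                    break
            if changed:
                break
    return comp

# Arbitrarily long compositions collapse to a finite set of irreducible
# forms, each of which can then be model-checked directly.
assert reduce_composition(["client", "cache", "cache", "cache", "server"]) \
       == ("client", "cache", "server")
```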

Relevance:

10.00%

Publisher:

Abstract:

NetSketch is a tool that enables the specification of network-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system so as to retain sufficient detail to enable future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis approach based on a strongly-typed Domain-Specific Language (DSL) to specify network configurations at various levels of sketchiness along with invariants that need to be enforced thereupon. In this paper, we overview NetSketch, highlight its salient features, and illustrate how it could be used in applications, including the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications). In a companion paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity.
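
The interval-typed, compositional style of analysis described above can be sketched as follows; the `Component` type and its rate intervals are illustrative assumptions, not NetSketch's actual DSL.

```python
from dataclasses import dataclass

@dataclass
class Component:
    """A network element typed by flow-rate intervals (illustrative)."""
    name: str
    accepts: tuple  # (lo, hi): input rates the component tolerates
    emits: tuple    # (lo, hi): output rates the component may produce

def compose(upstream: Component, downstream: Component) -> Component:
    """Type-check a serial composition: everything the upstream may
    emit must lie within what the downstream safely accepts."""
    lo, hi = upstream.emits
    alo, ahi = downstream.accepts
    if lo < alo or hi > ahi:
        raise TypeError(f"{upstream.name} -> {downstream.name}: emits "
                        f"[{lo}, {hi}] outside accepted [{alo}, {ahi}]")
    return Component(f"{upstream.name}>{downstream.name}",
                     upstream.accepts, downstream.emits)

shaper = Component("shaper", accepts=(0, 100), emits=(0, 40))
link   = Component("link",   accepts=(0, 50),  emits=(0, 50))
print(compose(shaper, link))   # safe: [0, 40] fits within [0, 50]
```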

Relevance:

10.00%

Publisher:

Abstract:

NetSketch is a tool for the specification of constrained-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system while retaining sufficient information about it to carry out future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis. The compositional analysis is based on a strongly-typed Domain-Specific Language (DSL) for describing and reasoning about constrained-flow networks at various levels of sketchiness along with invariants that need to be enforced thereupon. In this paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity. In a companion paper [6], we overview NetSketch, highlight its salient features, and illustrate how it could be used in two applications: the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications).
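
As a rough illustration of the kind of judgment such a formal system manipulates, a serial-composition typing rule and soundness statement might read as below; the notation is an illustrative sketch, not the paper's actual DSL or theorem.

```latex
% Illustrative only: a serial-composition rule in the style of a typed
% constrained-flow calculus; not the paper's actual formal system.
\[
\frac{\Gamma \vdash C_1 : \langle I_1, O_1 \rangle
      \qquad
      \Gamma \vdash C_2 : \langle I_2, O_2 \rangle
      \qquad
      O_1 \subseteq I_2}
     {\Gamma \vdash C_1 \,;\, C_2 : \langle I_1, O_2 \rangle}
\]
% Soundness (sketch): if $\Gamma \vdash C : \langle I, O \rangle$ is
% derivable, then every valid flow assignment for the concrete network
% $C$ satisfies the interval constraints $I$ and $O$.
```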

Relevance:

10.00%

Publisher:

Abstract:

In work that involves mathematical rigor, there are numerous benefits to adopting a representation of models and arguments that can be supplied to a formal reasoning or verification system: reusability, automatic evaluation of examples, and verification of consistency and correctness. However, accessibility has not been a priority in the design of formal verification tools that can provide these benefits. In earlier work [Lap09a], we attempted to address this broad problem by proposing several specific design criteria organized around the notion of a natural context: the sphere of awareness a working human user maintains of the relevant constructs, arguments, experiences, and background materials necessary to accomplish the task at hand. This work expands one aspect of the earlier work by considering more extensively an essential capability for any formal reasoning system whose design is oriented around simulating the natural context: native support for a collection of mathematical relations that deal with common constructs in arithmetic and set theory. We provide a formal definition for a context of relations that can be used to both validate and assist formal reasoning activities. We provide a proof that any algorithm that implements this formal structure faithfully will necessarily converge. Finally, we consider the efficiency of an implementation of this formal structure that leverages modular implementations of well-known data structures: balanced search trees and transitive closures of hypergraphs.
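
A minimal sketch of such a context of relations, restricted to a single ordering relation closed under transitivity (the paper's context covers a richer collection of arithmetic and set-theoretic relations):

```python
def close(facts):
    """Close a set of ('<=', a, b) facts under transitivity. This
    converges because the set only grows and is bounded above by the
    finite number of pairs of terms mentioned in the facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for (_, a, b) in list(facts):
            for (_, c, d) in list(facts):
                if b == c and ("<=", a, d) not in facts:
                    facts.add(("<=", a, d))
                    changed = True
    return facts

def entails(facts, query):
    """Validate a queried relation against the closed context."""
    return query in close(facts)

ctx = {("<=", "x", "y"), ("<=", "y", "z")}
assert entails(ctx, ("<=", "x", "z"))   # derived by transitivity
```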

Relevance:

10.00%

Publisher:

Abstract:

A computer model has been developed to optimize the performance of a 50 kWp photovoltaic system which supplies electrical energy to a dairy farm at Fota Island in Cork Harbour. Optimization of the system involves maximising the efficiency and increasing the performance and reliability of each hardware unit. The model accepts horizontal insolation, ambient temperature, wind speed, wind direction and load demand as inputs. An optimization program uses the computer model to simulate the optimum operating conditions. From this analysis, criteria are established which are used to improve the photovoltaic system operation. This thesis describes the model concepts, the model implementation and the model verification procedures used during development. It also describes the techniques which are used during system optimization. The software, which is written in FORTRAN, is structured in modular units to provide logical and efficient programming. These modular units may also be used in the modelling and optimization of other photovoltaic systems.
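
One modular unit of such a model might be outlined as below (the original software is FORTRAN; the NOCT figure and temperature coefficient are generic illustrative values, not the Fota Island system's measured parameters):

```python
def array_power(insolation_w_m2, ambient_c,
                rated_kwp=50.0, noct_c=45.0, temp_coeff=-0.004):
    """Estimate PV array output (kW) from irradiance and temperature."""
    # Cell temperature rises above ambient with irradiance (NOCT model).
    cell_c = ambient_c + (noct_c - 20.0) * insolation_w_m2 / 800.0
    # Output derates linearly with cell temperature above 25 degC.
    derate = 1.0 + temp_coeff * (cell_c - 25.0)
    return rated_kwp * (insolation_w_m2 / 1000.0) * derate

print(array_power(850.0, 12.0))   # bright, mild conditions
```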

Relevance:

10.00%

Publisher:

Abstract:

Spoken language and learned song are complex communication behaviors found in only a few species, including humans and three groups of distantly related birds: songbirds, parrots, and hummingbirds. Despite their large phylogenetic distances, these vocal learners show convergent behaviors and associated brain pathways for vocal communication. However, it is not clear whether this behavioral and anatomical convergence is associated with molecular convergence. Here we used oligo microarrays to screen for genes differentially regulated in brain nuclei necessary for producing learned vocalizations relative to adjacent brain areas that control other behaviors in avian vocal learners versus vocal non-learners. A top candidate gene in our screen was a calcium-binding protein, parvalbumin (PV). In situ hybridization verification revealed that PV was expressed significantly higher throughout the song motor pathway, including brainstem vocal motor neurons, relative to the surrounding brain regions of all distantly related avian vocal learners. This differential expression was specific to PV and vocal learners, as it was found neither in avian vocal non-learners nor for control genes in learners and non-learners. Similar to the vocal learning birds, higher PV up-regulation was found in the brainstem tongue motor neurons used for speech production in humans relative to a non-human primate, the macaque. These results suggest repeated convergent evolution of differential PV up-regulation in the brains of vocal learners separated by more than 65-300 million years from a common ancestor, and that the specialized behaviors of learned song and speech may require extra calcium buffering and signaling.

Relevance:

10.00%

Publisher:

Abstract:

Software-based control of life-critical embedded systems has become increasingly complex, and to a large extent has come to determine the safety of the human being. For example, implantable cardiac pacemakers have over 80,000 lines of code which are responsible for maintaining the heart within safe operating limits. As firmware-related recalls accounted for over 41% of the 600,000 devices recalled in the last decade, there is a need for rigorous model-driven design tools to generate verified code from verified software models. To this effect, we have developed the UPP2SF model-translation tool, which facilitates automatic conversion of verified models (in UPPAAL) to models that may be simulated and tested (in Simulink/Stateflow). We describe the translation rules that ensure correct model conversion, applicable to a large class of models. We demonstrate how UPP2SF is used in the model-driven design of a pacemaker whose model is (a) designed and verified in UPPAAL (using timed automata), (b) automatically translated to Stateflow for simulation-based testing, and then (c) automatically generated into modular code for hardware-level integration testing of timing-related errors. In addition, we show how UPP2SF may be used for worst-case execution time estimation early in the design stage. Using UPP2SF, we demonstrate the value of an integrated end-to-end modeling, verification, code-generation and testing process for complex software-controlled embedded systems. © 2014 ACM.
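
The flavour of such a model translation can be conveyed by the sketch below; the data model and the mapping are illustrative assumptions, not UPP2SF's actual translation rules.

```python
from dataclasses import dataclass, field

@dataclass
class TimedAutomaton:
    locations: list
    initial: str
    edges: list  # (source, clock guard, clocks to reset, target)

@dataclass
class StateflowChart:
    states: list = field(default_factory=list)
    default: str = ""
    transitions: list = field(default_factory=list)  # (src, cond, action, dst)

def translate(ta: TimedAutomaton) -> StateflowChart:
    """Map locations to chart states, and clock-guarded edges to
    conditioned transitions whose actions reset the matching timers."""
    chart = StateflowChart(states=list(ta.locations), default=ta.initial)
    for src, guard, resets, dst in ta.edges:
        action = "; ".join(f"{c} = 0" for c in resets)
        chart.transitions.append((src, guard, action, dst))
    return chart

# A toy two-state pacing loop (illustrative, not the paper's model).
pacer = TimedAutomaton(
    locations=["Sense", "Pace"], initial="Sense",
    edges=[("Sense", "t >= LRI", ["t"], "Pace"),
           ("Pace", "t >= PW", ["t"], "Sense")])
print(translate(pacer))
```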

Relevance:

10.00%

Publisher:

Abstract:

This short position paper considers issues in developing Data Architecture for the Internet of Things (IoT) through the medium of an exemplar project, Domain Expertise Capture in Authoring and Development Environments (DECADE). A brief discussion sets the background for IoT, and the development of the distinction between things and computers. The paper makes a strong argument against reinventing the wheel, and for reusing approaches to distributed heterogeneous data architectures, together with the lessons learned from that work, in this situation. DECADE requires an autonomous recording system, local data storage, a semi-autonomous verification model, a sign-off mechanism, and qualitative and quantitative analysis carried out when and where required through a web-service architecture, based on ontology and analytic agents, with a self-maintaining ontology model. To develop this, we describe a web-service architecture combining a distributed data warehouse, web services for analysis agents, ontology agents and a verification engine, with a centrally verified outcome database maintained by a certifying body for qualification/professional status.
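
The service decomposition described above might be outlined as below; the interface names are illustrative assumptions, not DECADE's actual API.

```python
from abc import ABC, abstractmethod

class AnalysisAgent(ABC):
    """Web service performing qualitative/quantitative analysis."""
    @abstractmethod
    def analyse(self, records: list) -> dict: ...

class OntologyAgent(ABC):
    """Classifies records against the self-maintaining ontology."""
    @abstractmethod
    def classify(self, record: dict) -> str: ...

class VerificationEngine(ABC):
    """Semi-autonomous verification preceding sign-off."""
    @abstractmethod
    def verify(self, record: dict) -> bool: ...

class OutcomeDatabase:
    """Centrally verified outcomes held by the certifying body."""
    def __init__(self, verifier: VerificationEngine):
        self.verifier = verifier
        self.outcomes = []

    def submit(self, record: dict) -> bool:
        if self.verifier.verify(record):   # verification before sign-off
            self.outcomes.append(record)
            return True
        return False
```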

Relevance:

10.00%

Publisher:

Abstract:

The contract work has demonstrated that older data can be assessed and entered into the MR format. Older data has associated problems but is retrievable. The contract successfully imported all datasets as required. MNCR survey sheets fit well into the MR format. The data validation and verification process can be improved. A number of computerised short cuts can be suggested and the process made more intuitive. Such a move is vital if MR is to be adopted as a standard by the recording community both on a voluntary level and potentially by consultancies.
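
One of the suggested computerised short cuts might resemble the sketch below; the field names and checks are illustrative, not the actual MNCR or MR schemas.

```python
REQUIRED = ["site", "date", "recorder", "species", "abundance"]

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record passes
    automatic validation and can proceed to manual verification."""
    problems = [f"missing {f}" for f in REQUIRED if not record.get(f)]
    if record.get("abundance") is not None:
        try:
            float(record["abundance"])
        except (TypeError, ValueError):
            problems.append("abundance is not numeric")
    return problems

rec = {"site": "A1", "date": "1992-06-14", "recorder": "JB",
       "species": "Zostera marina", "abundance": "3"}
print(validate(rec))   # [] -> passes the automatic checks
```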

Relevance:

10.00%

Publisher:

Abstract:

Coccolithophores are the largest source of calcium carbonate in the oceans and are considered to play an important role in oceanic carbon cycles. Current methods to detect the presence of coccolithophore blooms from Earth observation data often produce high numbers of false positives in shelf seas and coastal zones due to the spectral similarity between coccolithophores and other suspended particulates. Current methods are therefore unable to characterise the bloom events in shelf seas and coastal zones, despite the importance of these phytoplankton in the global carbon cycle. A novel approach to detect the presence of coccolithophore blooms from Earth observation data is presented. The method builds upon previous optical work and uses a statistical framework to combine spectral, spatial and temporal information to produce maps of coccolithophore bloom extent. Validation and verification results for an area of the north east Atlantic are presented using an in situ database (N = 432) and all available SeaWiFS data for 2003 and 2004. Verification results show that the approach produces a temporal seasonal signal consistent with biological studies of these phytoplankton. Validation using the in situ coccolithophore cell count database shows a high correct recognition rate of 80% and a low false-positive rate of 0.14 (in comparison to 63% and 0.34 respectively for the established, purely spectral approach). To guide its broader use, a full sensitivity analysis for the algorithm parameters is presented.
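
The combination of spectral, spatial and temporal information can be sketched, under a naive independence assumption, as below; the likelihood values are placeholders, not the paper's fitted statistical model.

```python
def bloom_posterior(spectral, spatial, temporal, prior=0.05):
    """Fuse three per-pixel evidence terms, each given as a pair
    (P(obs | bloom), P(obs | no bloom)), into a posterior probability."""
    odds = prior / (1.0 - prior)
    for p_bloom, p_clear in (spectral, spatial, temporal):
        odds *= p_bloom / p_clear
    return odds / (1.0 + odds)

# A pixel that looks bloom-like spectrally, neighbours other flagged
# pixels, and falls inside the seasonal window:
print(bloom_posterior((0.8, 0.2), (0.7, 0.3), (0.6, 0.4)))
```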

Relevance:

10.00%

Publisher:

Abstract:

Satellite altimetry has revolutionized our understanding of ocean dynamics thanks to frequent sampling and global coverage. Nevertheless, coastal data have been flagged as unreliable due to land and calm water interference in the altimeter and radiometer footprint and uncertainty in the modelling of high-frequency tidal and atmospheric forcing. Our study addresses the first issue, i.e. altimeter footprint contamination, via retracking, presenting ALES, the Adaptive Leading Edge Subwaveform retracker. ALES is potentially applicable to all the pulse-limited altimetry missions and its aim is to retrack both open ocean and coastal data with the same accuracy using just one algorithm. ALES selects part of each returned echo and models it with a classic "open ocean" Brown functional form, by means of least square estimation whose convergence is found through the Nelder-Mead nonlinear optimization technique. By avoiding echoes from bright targets along the trailing edge, it is capable of retrieving more coastal waveforms than the standard processing. By adapting the width of the estimation window according to the significant wave height, it aims at maintaining the accuracy of the standard processing in both the open ocean and the coastal strip. This innovative retracker is validated against tide gauges in the Adriatic Sea and in the Greater Agulhas System for three different missions: Envisat, Jason-1 and Jason-2. Considerations of noise and biases provide a further verification of the strategy. The results show that ALES is able to provide more reliable 20-Hz data for all three missions in areas where even 1-Hz averages are flagged as unreliable in standard products. Application of the ALES retracker led to roughly half of the analysed tracks showing a marked improvement in correlation with the tide gauge records, with the rms difference being reduced by a factor of 1.5 for Jason-1 and Jason-2 and over 4 for Envisat in the Adriatic Sea (at the closest point to the tide gauge).
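
The core of such a subwaveform retracker can be sketched as below; the simplified Brown form, window choice and synthetic waveform are illustrative, not the ALES implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import erf

def brown(t, epoch, sigma_c, amplitude):
    """Simplified Brown ocean return: an error-function leading edge."""
    return 0.5 * amplitude * (1.0 + erf((t - epoch) / (np.sqrt(2) * sigma_c)))

def retrack(gates, waveform, window):
    """Least-squares fit over a subwaveform window around the leading
    edge, avoiding trailing-edge gates that bright targets contaminate;
    convergence via the Nelder-Mead simplex, as in the text."""
    g, w = gates[window], waveform[window]
    cost = lambda p: np.sum((brown(g, *p) - w) ** 2)
    res = minimize(cost, x0=[g.mean(), 1.0, w.max()], method="Nelder-Mead")
    epoch, sigma_c, amplitude = res.x
    return epoch   # the retracked epoch yields the range correction

gates = np.arange(128, dtype=float)
wf = brown(gates, 40.0, 2.0, 1.0) + 0.01 * np.random.randn(128)
print(retrack(gates, wf, slice(20, 70)))   # close to the true epoch, 40.0
```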

Relevance:

10.00%

Publisher:

Abstract:

The process of invasion and the desire to predict the invasiveness (and associated impacts) of new arrivals have been a focus of attention for ecologists for centuries. The volunteer recording community has made unique and inspiring contributions to our understanding of invasion biology within Britain. Indeed information on non-native species (NNS) compiled within the GB Non-Native Species Information Portal (GB-NNSIP) would not have been possible without the involvement of volunteer experts from across Britain. Here we review examples of ways in which biological records have informed invasion biology. We specifically examine NNS information available within the GB-NNSIP to describe patterns in the arrival and establishment of NNS providing an overview of habitat associations of NNS in terrestrial, marine and freshwater environments. Monitoring and surveillance of the subset of NNS that are considered to be adversely affecting biodiversity, society or the economy, termed invasive non-native species (INNS), is critical for early warning and rapid response. Volunteers are major contributors to monitoring and surveillance of INNS and not only provide records from across Britain but also underpin the system of verification necessary to confirm the identification of sightings. Here we describe the so-called 'alert system' which links volunteer experts with the wider recording community to provide early warning of INNS occurrence. We highlight the need to increase understanding of community and ecosystem-level effects of invasions and particularly understanding of ecological resilience. Detailed field observations, through biological recording, will provide the spatial, temporal and taxonomic breadth required for such research. The role of the volunteer recording community in contributing to the understanding of invasion biology has been invaluable and it is clear that their expertise and commitment will continue to be so. © 2015 The Linnean Society of London, Biological Journal of the Linnean Society, 2015.

Relevance:

10.00%

Publisher:

Abstract:

Companies today require innovative strategies that allow integrated management, optimizing resources and maximizing results. The objective of this research is to design an integrated management instrument (quality, environmental, and occupational health and safety management) for the construction sector in Cusco. An analysis of current construction activity was carried out; unlike other proposals, the design begins with diagnosis, then planning, organization, execution, supervision and optimization of the integrated system. The results are expressed in the diagnosis of the three systems, key information for the formulation and proposal of the later stages; the subprogrammes are structured on the basis of the integrated diagnosis, finally determining the strategic guidelines for implementation, evaluation and verification of the system, taking into account ISO 9001:2008, ISO 14001:2004 and OHSAS 18001:2007, as well as the legislation in force in Peru.

Relevance:

10.00%

Publisher:

Abstract:

New R-matrix calculations of electron impact excitation rates in Ca XV are used to derive theoretical electron density diagnostic emission line intensity ratios involving 2s²2p²–2s2p³ transitions, specifically R₁ = I(208.70 Å)/I(200.98 Å), R₂ = I(181.91 Å)/I(200.98 Å), and R₃ = I(215.38 Å)/I(200.98 Å), for a range of electron temperatures (T_e = 10^6.4–10^6.8 K) and densities (N_e = 10^9–10^13 cm⁻³) appropriate to solar coronal plasmas. Electron densities deduced from the observed values of R₁, R₂, and R₃ for several solar flares, measured from spectra obtained with the Naval Research Laboratory's S082A spectrograph on board Skylab, are found to be consistent. In addition, the derived electron densities are in excellent agreement with those determined from line ratios in Ca XVI, which is formed at a similar electron temperature to Ca XV. These results provide some experimental verification of the accuracy of the line ratio calculations, and hence of the atomic data on which they are based. A set of eight theoretical Ca XV line ratios involving 2s²2p²–2s2p³ transitions in the wavelength range ~140–216 Å are also found to be in good agreement with those measured from spectra of the TEXT tokamak plasma, for which the electron temperature and density have been independently determined. This provides additional support for the accuracy of the theoretical line ratios and atomic data.
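
Diagnosing an electron density from an observed ratio amounts to inverting the theoretical ratio-versus-density curve, e.g. by interpolation as sketched below; the curve values are placeholders, not the Ca XV R-matrix results.

```python
import numpy as np

log_ne = np.linspace(9.0, 13.0, 9)       # log10 Ne (cm^-3) grid
r1_theory = np.linspace(0.05, 0.60, 9)   # placeholder monotone R1 curve

def density_from_ratio(r_obs, ratios=r1_theory, grid=log_ne):
    """Interpolate log10(Ne) from an observed line ratio, assuming the
    theoretical ratio increases monotonically with density."""
    return float(np.interp(r_obs, ratios, grid))

print(density_from_ratio(0.30))   # log10 Ne implied by an observed R1
```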

Relevance:

10.00%

Publisher:

Abstract:

This work deals with modelling and experimental verification of desalination theory (surface force pore flow). The work has direct application in the desalination of sea water.