982 results for "safe system"
Abstract:
The Science of Network Service Composition has clearly emerged as one of the grand themes driving many of our research questions in the networking field today [NeXtworking 2003]. This driving force stems from the rise of sophisticated applications and new networking paradigms. By "service composition" we mean that the performance and correctness properties local to the various constituent components of a service can be readily composed into global (end-to-end) properties without re-analyzing any of the constituent components in isolation, or as part of the whole composite service. The set of laws that would govern such composition is what will constitute that new science of composition. The combined heterogeneity and dynamic open nature of network systems makes composition quite challenging, and thus programming network services has been largely inaccessible to the average user. We identify (and outline) a research agenda in which we aim to develop a specification language that is expressive enough to describe different components of a network service, and that will include type hierarchies inspired by type systems in general programming languages that enable the safe composition of software components. We envision this new science of composition to be built upon several theories (e.g., control theory, game theory, network calculus, percolation theory, economics, queuing theory). In essence, different theories may provide different languages by which certain properties of system components can be expressed and composed into larger systems. We then seek to lift these lower-level specifications to a higher level by abstracting away details that are irrelevant for safe composition at the higher level, thus making theories scalable and useful to the average user. In this paper we focus on services built upon an overlay management architecture, and we use control theory and QoS theory as example theories from which we lift up compositional specifications.
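The compositional principle described above (local, per-component properties combining into end-to-end properties without re-analyzing the components) can be illustrated with a minimal sketch. The components, bounds, and composition laws below are hypothetical, additive worst-case delay bounds in the style of network calculus, not the paper's formalism:

```python
# Minimal sketch: composing locally certified per-component delay bounds
# into end-to-end bounds. Values and laws are illustrative assumptions.

def compose_series(delays):
    """End-to-end delay bound of components in series: local bounds add."""
    return sum(delays)

def compose_parallel(delays):
    """Delay bound when a flow may traverse the slowest of parallel paths."""
    return max(delays)

# A hypothetical service chain: access link -> overlay node -> server,
# each with a locally certified worst-case delay bound in milliseconds.
chain = [5.0, 12.0, 3.0]
print(compose_series(chain))           # 20.0
print(compose_parallel([20.0, 8.0]))   # 20.0
```

The point of the envisioned science is that laws like these let an integrator derive the global bound from the local ones alone.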
Abstract:
NetSketch is a tool that enables the specification of network-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system so as to retain sufficient detail to enable future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis approach based on a strongly-typed, Domain-Specific Language (DSL) to specify network configurations at various levels of sketchiness along with invariants that need to be enforced thereupon. In this paper, we overview NetSketch, highlight its salient features, and illustrate how it could be used in applications, including the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications). In a companion paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity.
Abstract:
NetSketch is a tool for the specification of constrained-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system while retaining sufficient information about it to carry out future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis. The compositional analysis is based on a strongly-typed Domain-Specific Language (DSL) for describing and reasoning about constrained-flow networks at various levels of sketchiness along with invariants that need to be enforced thereupon. In this paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity. In a companion paper [6], we overview NetSketch, highlight its salient features, and illustrate how it could be used in two applications: the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications).
Abstract:
In research areas involving mathematical rigor, there are numerous benefits to adopting a formal representation of models and arguments: reusability, automatic evaluation of examples, and verification of consistency and correctness. However, accessibility has not been a priority in the design of formal verification tools that can provide these benefits. In earlier work [30] we attempted to address this broad problem by proposing several specific design criteria organized around the notion of a natural context: the sphere of awareness a working human user maintains of the relevant constructs, arguments, experiences, and background materials necessary to accomplish the task at hand. In this report we evaluate our proposed design criteria by employing, in the context of novel research, a formal reasoning system designed according to these criteria. In particular, we consider how the design and capabilities of the formal reasoning system that we employ influence, aid, or hinder our ability to accomplish a formal reasoning task – the assembly of a machine-verifiable proof pertaining to the NetSketch formalism. NetSketch is a tool for the specification of constrained-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. It provides capabilities for compositional analysis based on a strongly-typed domain-specific language (DSL) for describing and reasoning about constrained-flow networks and invariants that need to be enforced thereupon. In a companion paper [13] we overview NetSketch, highlight its salient features, and illustrate how it could be used in actual applications. In this paper, we define using a machine-readable syntax major parts of the formal system underlying the operation of NetSketch, along with its semantics and a corresponding notion of validity.
We then provide a proof of soundness for the formalism that can be partially verified using a lightweight formal reasoning system that simulates natural contexts. A traditional presentation of these definitions and arguments can be found in the full report on the NetSketch formalism [12].
Abstract:
This paper formally defines the operational semantics of TRAFFIC, a specification language for flow composition applications proposed in BUCS-TR-2005-014, and presents a type system based on desired safety assurances. We provide proofs of properties of the reduction relation (weak confluence, strong normalization, and uniqueness of normal forms), of the soundness and completeness of the type system with respect to reduction, and of equivalence classes of flow specifications. Finally, we provide a pseudo-code listing of a syntax-directed type-checking algorithm that implements the rules of the type system and is capable of inferring the type of a closed flow specification.
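As a rough illustration of what a syntax-directed type-checking algorithm for flow specifications looks like, the toy checker below infers a type for a closed flow expression by dispatching on its top-level constructor. The interval types and the composition rules (sequential composition requires the upstream type to fit within the downstream capacity; parallel rates add) are invented for illustration only; TRAFFIC's actual syntax, reduction relation, and typing rules are those defined in the paper:

```python
# Toy syntax-directed checker for a made-up flow language. Flow types are
# rate intervals (min_rate, max_rate); these rules are illustrative
# assumptions, not TRAFFIC's.

def check(expr):
    """Return the inferred type of a closed flow expression, or raise TypeError."""
    op = expr[0]
    if op == "flow":                 # ("flow", lo, hi): a base flow
        _, lo, hi = expr
        if lo > hi:
            raise TypeError("empty rate interval")
        return (lo, hi)
    if op == "seq":                  # ("seq", e1, e2): e2 must accept e1's output
        t1, t2 = check(expr[1]), check(expr[2])
        if not (t2[0] <= t1[0] and t1[1] <= t2[1]):   # t1 within t2's capacity
            raise TypeError("incompatible sequential composition")
        return t1
    if op == "par":                  # ("par", e1, e2): parallel rates add
        t1, t2 = check(expr[1]), check(expr[2])
        return (t1[0] + t2[0], t1[1] + t2[1])
    raise TypeError(f"unknown operator {op!r}")

spec = ("seq", ("par", ("flow", 1, 3), ("flow", 2, 4)), ("flow", 0, 10))
print(check(spec))   # (3, 7)
```

Because each syntactic form has exactly one applicable rule, the algorithm needs no search, which is what makes a syntax-directed system directly implementable as a checker.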
Abstract:
In the framework of the iBench research project, our previous work created a domain-specific language, TRAFFIC [6], that facilitates the specification, programming, and maintenance of distributed applications over a network. It allows safety properties to be formalized in terms of types and subtyping relations. Extending our previous work, we add Hindley-Milner style polymorphism [8] with constraints [9] to the type system of TRAFFIC. This allows a programmer to use a for-all quantifier to describe the types of network components, escalating the power and expressiveness of types to a level that was not possible before with propositional subtyping relations. Furthermore, we design our type system around a pluggable constraint system, so it can adapt to different application needs while maintaining soundness. In this paper, we show the soundness of the type system, which is not syntax-directed but makes typing derivations easier to carry out. We show that there is an equivalent syntax-directed type system, which is what a type-checker program would implement to verify the safety of a network flow. This is followed by a discussion of several constraint systems: polymorphism with subtyping constraints, Linear Programming, and Constraint Handling Rules (CHR) [3]. Finally, we provide some examples to illustrate the workings of these constraint systems.
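The idea of a pluggable constraint system can be sketched as follows: type checking emits a set of constraints over quantified type variables, and a solver chosen per application decides whether an instantiation satisfies them. The lattice, constraint form, and solver below are invented for illustration; the constraint systems actually discussed in the paper are polymorphism with subtyping constraints, Linear Programming, and CHR:

```python
# Sketch of a pluggable subtyping-constraint check over an invented
# three-point lattice of flow types: Secure <: Reliable <: AnyFlow.
# LATTICE maps each type to the set of its supertypes (itself included).
LATTICE = {"Secure":   {"Secure", "Reliable", "AnyFlow"},
           "Reliable": {"Reliable", "AnyFlow"},
           "AnyFlow":  {"AnyFlow"}}

def subtype_solver(constraints, assignment):
    """Check constraints of the form (s, '<:', t), where type variables
    (names beginning with an apostrophe) are resolved via the assignment."""
    def resolve(t):
        return assignment.get(t, t) if t.startswith("'") else t
    return all(resolve(s) in LATTICE and resolve(t) in LATTICE[resolve(s)]
               for s, _, t in constraints)

# forall 'a such that 'a <: Reliable: Secure is a valid instantiation,
# AnyFlow is not.
constraints = [("'a", "<:", "Reliable")]
print(subtype_solver(constraints, {"'a": "Secure"}))   # True
print(subtype_solver(constraints, {"'a": "AnyFlow"}))  # False
```

Swapping `subtype_solver` for, say, a linear-programming solver over rate constraints would leave the typing rules untouched, which is the sense in which the constraint system is pluggable.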
Abstract:
Open environments involve distributed entities interacting with each other in an open manner. Many distributed entities are unknown to each other but need to collaborate and share resources in a secure fashion. Usually resource owners alone decide who is trusted to access their resources. Since resource owners in open environments do not have a complete picture of all trusted entities, trust management frameworks are used to ensure that only authorized entities will access requested resources. Every trust management system has limitations, and the limitations can be exploited by malicious entities. One vulnerability is due to the lack of globally unique interpretation for permission specifications. This limitation means that a malicious entity which receives a permission in one domain may misuse the permission in another domain via some deceptive but apparently authorized route; this malicious behaviour is called subterfuge. This thesis develops a secure approach, Subterfuge Safe Trust Management (SSTM), that prevents subterfuge by malicious entities. SSTM employs the Subterfuge Safe Authorization Language (SSAL) which uses the idea of a local permission with a globally unique interpretation (localPermission) to resolve the misinterpretation of permissions. We model and implement SSAL with an ontology-based approach, SSALO, which provides a generic representation for knowledge related to the SSAL-based security policy. SSALO enables integration of heterogeneous security policies which is useful for secure cooperation among principals in open environments where each principal may have a different security policy with different implementation. The other advantage of an ontology-based approach is the Open World Assumption, whereby reasoning over an existing security policy is easily extended to include further security policies that might be discovered in an open distributed environment. 
We add two extra SSAL rules to support dynamic coalition formation and secure cooperation among coalitions. Secure federation of cloud computing platforms and secure federation of XMPP servers are presented as case studies of SSTM. The results show that SSTM provides robust accountability for the use of permissions in federation. It is also shown that SSAL is a suitable policy language to express the subterfuge-safe policy statements due to its well-defined semantics, ease of use, and integrability.
Abstract:
The environmental attractions of air-cycle refrigeration are considerable. Following a thermodynamic design analysis, an air-cycle demonstrator plant was constructed within the restricted physical envelope of an existing Thermo King SL200 trailer refrigeration unit. This unique plant operated satisfactorily, delivering sustainable cooling for refrigerated trailers using a completely natural and safe working fluid. The full-load capacity of the air-cycle unit at -20 °C was 7.8 kW, 8% greater than the equivalent vapour-cycle unit, but the fuel consumption of the air-cycle plant was excessively high. However, at part-load operation the disparity in fuel consumption dropped from approximately 200% to around 80%. The components used in the air-cycle demonstrator were not optimised and considerable potential exists for efficiency improvements, possibly to the point where the air-cycle system could rival the efficiency of the standard vapour-cycle system at part-load operation, which represents the biggest proportion of operating time for most units.
Abstract:
Burkholderia species are extremely multidrug resistant, environmental bacteria with extraordinary bioremediation and biocontrol properties. At the same time, these bacteria cause serious opportunistic infections in vulnerable patient populations while some species can potentially be used as bioweapons. The complete DNA sequence of more than 10 Burkholderia genomes provides an opportunity to apply functional genomics to a collection of widely adaptable environmental bacteria thriving in diverse niches and establishing both symbiotic and pathogenic associations with many different organisms. However, extreme multidrug resistance hampers genetic manipulations in Burkholderia. We have developed and evaluated a mutagenesis system based on the homing endonuclease I-SceI to construct targeted, non-polar unmarked gene deletions in Burkholderia. Using the cystic fibrosis pathogen Burkholderia cenocepacia K56-2 as a model strain, we demonstrate this system allows for clean deletions of one or more genes within an operon and also the introduction of multiple deletions in the same strain. We anticipate this tool will have widespread environmental and biomedical applications, facilitating functional genomic studies and construction of safe strains for bioremediation and biocontrol, as well as clinical applications such as live vaccines for Burkholderia and other Gram-negative bacterial species.
Abstract:
Lung disease in cystic fibrosis (CF) is typified by the development of chronic airways infection culminating in bronchiectasis and progression to end-stage respiratory disease. Pseudomonas aeruginosa, a ubiquitous gram-negative bacterium, is the archetypical CF pathogen and is associated with an accelerated clinical decline. The development and widespread use of chronic suppressive aerosolized antibacterial therapies, in particular Tobramycin Inhalation Solution (TIS), in CF has contributed to reduced lung function decline and improved survival. However, the requirement for the aerosolization of these agents through nebulizers has been associated with increased treatment burden and reduced quality of life, and remains a barrier to broader uptake. Tobramycin Inhalation Powder (TIP™) has been developed by Novartis with the express purpose of delivering the same benefits as TIS in a time-effective manner. Administered via the T-326™ (Novartis) Inhaler in four individual 28-mg capsules, TIP can be administered in a quarter of the time of traditional nebulizers and is inherently portable. In clinical studies, TIP has been shown to be safe, result in equivalent or superior reductions in P. aeruginosa sputum density and produce similar improvements in pulmonary function. TIP offers significant advantages in time saving, portability and convenience over traditional nebulized TIS with comparable clinical outcomes for individuals with CF.
Abstract:
Background: Angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin II receptor blockers (ARBs) are commonly prescribed to the growing number of cancer patients (more than two million in the UK alone) often to treat hypertension. However, increased fatal cancer in ARB users in a randomized trial and increased breast cancer recurrence rates in ACEI users in a recent observational study have raised concerns about their safety in cancer patients. We investigated whether ACEI or ARB use after breast, colorectal or prostate cancer diagnosis was associated with increased risk of cancer-specific mortality.
Methods: Population-based cohorts of 9,814 breast, 4,762 colorectal and 6,339 prostate cancer patients newly diagnosed from 1998 to 2006 were identified in the UK Clinical Practice Research Datalink and confirmed by cancer registry linkage. Cancer-specific and all-cause mortality were identified from Office of National Statistics mortality data in 2011 (allowing up to 13 years of follow-up). A nested case–control analysis was conducted to compare ACEI/ARB use (from general practitioner prescription records) in cancer patients dying from cancer with up to five controls (not dying from cancer). Conditional logistic regression estimated the risk of cancer-specific, and all-cause, death in ACEI/ARB users compared with non-users.
Results: The main analysis included 1,435 breast, 1,511 colorectal and 1,184 prostate cancer-specific deaths (and 7,106 breast, 7,291 colorectal and 5,849 prostate cancer controls). There was no increase in cancer-specific mortality in patients using ARBs after diagnosis of breast (adjusted odds ratio (OR) = 1.06 95% confidence interval (CI) 0.84, 1.35), colorectal (adjusted OR = 0.82 95% CI 0.64, 1.07) or prostate cancer (adjusted OR = 0.79 95% CI 0.61, 1.03). There was also no evidence of increases in cancer-specific mortality with ACEI use for breast (adjusted OR = 1.06 95% CI 0.89, 1.27), colorectal (adjusted OR = 0.78 95% CI 0.66, 0.92) or prostate cancer (adjusted OR = 0.78 95% CI 0.66, 0.92).
Conclusions: Overall, we found no evidence of increased risks of cancer-specific mortality in breast, colorectal or prostate cancer patients who used ACEI or ARBs after diagnosis. These results provide some reassurance that these medications are safe in patients diagnosed with these cancers.
Keywords: Colorectal cancer; Breast cancer; Prostate cancer; Mortality; Angiotensin-converting enzyme inhibitors and angiotensin II receptor blockers
Abstract:
Several agricultural fields show high contents of arsenic because of irrigation with arsenic-contaminated groundwater. Vegetables accumulate arsenic in their edible parts when grown in contaminated soils. Polluted vegetables are one of the main sources of arsenic in the food chain, especially for people living in rural arsenic-endemic villages of India and Bangladesh. The aim of this study was to assess the feasibility of floriculture in the crop rotation system of arsenic-endemic areas of the Bengal Delta. The effects of different arsenic concentrations (0, 0.5, 1.0, and 2.0 mg As L−1) and types of flowering plant (Gomphrena globosa and Zinnia elegans) on plant growth and arsenic accumulation were studied under hydroponic conditions. Total arsenic was quantified using an atomic absorption spectrometer with hydride generation (HG-AAS). Arsenic was mainly accumulated in the roots (72 %), followed by leaves (12 %), stems (10 %), and flowers (<1 %). The flowering plants studied did not show phytoremediation capacities as high as those of other wild species, such as ferns. However, they behaved as arsenic-tolerant plants and grew and bloomed well, without showing any phytotoxic signs. This study proves that floriculture could be included within the crop rotation system in arsenic-contaminated agricultural soils, in order to improve food safety and also food security by increasing farmers' revenue.
Abstract:
The central hypothesis to be tested is the relevance of gold in the determination of the value of the US dollar as an international reserve currency after 1971. In the first section the market value of the US dollar is analysed by looking at new forms of value (financial derivative products), the dollar as a safe haven, the choice of a standard of value and the role of SDRs in reforming the international monetary system. Based on dimensional analysis, the second section analyses the definition and meaning of a numéraire for international currency and the justification for a variable standard of value based on a commodity (gold). The second section is the theoretical foundation for the empirical and econometric analysis in the third and fourth sections. The third section is devoted to the specification of an econometric model and a graphical analysis of the data. It is clear that an inverse relation exists between the value of the US dollar and the price of gold. The fourth section shows the estimations of the different specifications of the model including linear regression and cointegration analysis. The most important econometric result is that the null hypothesis is rejected in favour of a significant link between the price of gold and the value of the US dollar. There is also a positive relationship between gold price and inflation. An inverse statistically significant relation between gold price and monetary policy is shown by applying a dynamic model of cointegration with lags.
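Purely as an illustration of the kind of specification estimated in the third and fourth sections, the sketch below fits an ordinary-least-squares slope on synthetic data constructed with an inverse gold-price/dollar-value relation. The numbers are made up; the paper itself works with real series and, beyond OLS, cointegration analysis with lags:

```python
# Illustrative OLS on synthetic data mimicking the inverse relation the
# paper estimates between gold price and dollar value (made-up numbers).
import random

random.seed(0)
gold = [300 + 15 * i for i in range(100)]                  # hypothetical gold price
dollar = [120 - 0.04 * g + random.gauss(0, 2) for g in gold]  # dollar index

mx = sum(gold) / len(gold)
my = sum(dollar) / len(dollar)
slope = (sum((x - mx) * (y - my) for x, y in zip(gold, dollar))
         / sum((x - mx) ** 2 for x in gold))
print(slope < 0)   # True: the fit recovers the inverse relation
```

A negative, statistically significant slope is the analogue of the paper's main econometric result; establishing that the relation is not spurious is what the cointegration analysis adds.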
Abstract:
Objective: To determine scoliosis curve types using non-invasive surface acquisition, without prior knowledge from X-ray data. Methods: Classification of scoliosis deformities according to curve type is used in the clinical management of scoliotic patients. In this work, we propose a robust system that can determine the scoliosis curve type from non-invasive acquisition of the 3D back surface of the patients. The 3D image of the surface of the trunk is divided into patches, and local geometric descriptors characterizing the back surface are computed from each patch and constitute the features. We reduce the dimensionality by using principal component analysis and retain 53 components using an overlap criterion combined with the total variance in the observed variables. In this work, a multi-class classifier is built with least-squares support vector machines (LS-SVM). The original LS-SVM formulation was modified by weighting the positive and negative samples differently, and a new kernel was designed in order to achieve a robust classifier. The proposed system is validated using data from 165 patients with different scoliosis curve types. The results of our non-invasive classification were compared with those obtained by an expert using X-ray images. Results: The average rate of successful classification was computed using a leave-one-out cross-validation procedure. The overall accuracy of the system was 95%. As for the correct classification rates per class, we obtained 96%, 84% and 97% for the thoracic, double major and lumbar/thoracolumbar curve types, respectively. Conclusion: This study shows that it is possible to find a relationship between the internal deformity and the back-surface deformity in scoliosis with machine learning methods. The proposed system uses non-invasive surface acquisition, which is safe for the patient as it involves no radiation. Also, the design of a specific kernel improved classification performance.
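A minimal sketch of the weighted LS-SVM idea mentioned above, in pure Python on toy 2-D data: the classifier comes from solving the standard LS-SVM linear system, except that the identity regularizer is replaced by per-sample weights so positive and negative samples can be penalized differently. The linear kernel and toy points are stand-ins; the paper's classifier uses a purpose-designed kernel on 3-D back-surface descriptors:

```python
# Weighted LS-SVM sketch (binary, linear kernel, toy data). The dual
# system [[0, y^T], [y, Omega + diag(1/(gamma*w))]] [b; alpha] = [0; 1]
# is solved by Gaussian elimination.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def lssvm_train(X, y, weights, gamma=10.0):
    k = lambda a, b: sum(p * q for p, q in zip(a, b))   # linear kernel
    n = len(X)
    A = [[0.0] + [float(y[j]) for j in range(n)]]
    for i in range(n):
        row = [float(y[i])] + [y[i] * y[j] * k(X[i], X[j]) for j in range(n)]
        row[i + 1] += 1.0 / (gamma * weights[i])        # per-sample weighting
        A.append(row)
    sol = solve(A, [0.0] + [1.0] * n)
    b, alpha = sol[0], sol[1:]
    return lambda x: 1 if sum(a * yi * k(xi, x)
                              for a, yi, xi in zip(alpha, y, X)) + b > 0 else -1

X = [(0.0, 0.0), (0.0, 1.0), (3.0, 3.0), (3.0, 4.0)]
y = [-1, -1, 1, 1]
clf = lssvm_train(X, y, weights=[1.0, 1.0, 2.0, 2.0])   # up-weight positives
print([clf(x) for x in X])   # [-1, -1, 1, 1]
```

Unlike the standard SVM quadratic program, the LS-SVM training step is a single linear solve, which is what makes the per-sample reweighting a one-line change to the system's diagonal.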
Abstract:
Embedded systems, especially wireless sensor nodes, are highly prone to type-safety and memory-safety issues. Contiki, a prominent operating system in this domain, is even more affected by the problem since it makes extensive use of type casts and pointers. This work is an attempt to nullify the possibility of safety violations in Contiki. We use a powerful yet efficient tool called Deputy to achieve this. We also try to automate the process.