81 results for Compositional Rule of Inference


Relevance: 100.00%

Publisher:

Abstract:

Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well-covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code.
The resulting metrics make it easy to compare the relative security of functionally-equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
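As a loose illustration of the kind of design-level measure this abstract describes, the sketch below computes one hypothetical encapsulation metric: the fraction of high-security ("classified") attributes in a design that are publicly accessible, where lower is better. The data model, names, and the metric itself are assumptions for illustration, not the paper's actual metric suite.

```python
# Hypothetical design-level security metric (illustrative only): the ratio of
# classified (high-security) attributes that are publicly accessible.
# A lower value indicates better encapsulation of security-critical data.

def classified_attribute_accessibility(classes):
    """classes: list of dicts, each with 'attrs' as a list of
    (name, is_public, is_classified) tuples."""
    classified = [(name, public)
                  for cls in classes
                  for (name, public, secret) in cls["attrs"]
                  if secret]
    if not classified:
        return 0.0  # no classified data, nothing can leak through attributes
    exposed = sum(1 for _, public in classified if public)
    return exposed / len(classified)

# Toy design: 'pin' is classified and public (bad), 'balance' is classified
# but private (good), 'owner' and 'level' carry no security classification.
design = [
    {"name": "Account",
     "attrs": [("pin", True, True), ("balance", False, True), ("owner", True, False)]},
    {"name": "Logger",
     "attrs": [("level", True, False)]},
]
print(classified_attribute_accessibility(design))  # 1 of 2 classified attrs exposed -> 0.5
```

A real tool, as the abstract notes, would combine several such properties (cohesion, coupling, inheritance, design size) and trace information flow between them rather than scoring attributes in isolation.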

Abstract:

Many commentators have treated the internet as a site of democratic freedom and as a new kind of public sphere. While there are good reasons for optimism, like any social space digital space also has its dark side. Citizens and governments alike have expressed anxiety about cybercrime and cyber-security. In August 2011, the Australian government introduced legislation to give effect to Australia becoming a signatory to the European Convention on Cybercrime (2001). At the time of writing, that legislation is still before the Parliament. In this article, attention is given to how the legal and policy-making process enabling Australia to be compliant with the European Convention on Cybercrime came about. Among the motivations that informed both the development of the Convention in Europe and then the Australian exercise of legislating for compliance with it was a range of legitimate concerns about the impact that cybercrime can have on individuals and communities. This article makes the case that equal attention also needs to be given to ensuring that legislators and policy makers differentiate between legitimate security imperatives and any over-reach evident in the implementation of this legislation that affects rule of law principles, our capacity to engage in democratic practices, and our civic and human rights.

Abstract:

In this paper we propose a framework for both gradient descent image and object alignment in the Fourier domain. Our method centers upon the classical Lucas & Kanade (LK) algorithm where we represent the source and template/model in the complex 2D Fourier domain rather than in the spatial 2D domain. We refer to our approach as the Fourier LK (FLK) algorithm. The FLK formulation is advantageous when one pre-processes the source image and template/model with a bank of filters (e.g. oriented edges, Gabor, etc.) as: (i) it can handle substantial illumination variations, (ii) the inefficient pre-processing filter bank step can be subsumed within the FLK algorithm as a sparse diagonal weighting matrix, (iii) unlike traditional LK the computational cost is invariant to the number of filters and as a result far more efficient, and (iv) this approach can be extended to the inverse compositional form of the LK algorithm where nearly all steps (including Fourier transform and filter bank pre-processing) can be pre-computed leading to an extremely efficient and robust approach to gradient descent image matching. Further, these computational savings translate to non-rigid object alignment tasks that are considered extensions of the LK algorithm such as those found in Active Appearance Models (AAMs).
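The computational saving claimed in point (iii) rests on Parseval's theorem: filtering with a bank of filters and summing SSD costs in the spatial domain equals a single diagonally weighted SSD in the Fourier domain, so the cost no longer grows with the number of filters. The sketch below checks that identity numerically with random stand-in filters (the filter bank and sizes are illustrative, not taken from the paper):

```python
import numpy as np

# Numerical check of the identity behind folding a filter bank into a diagonal
# Fourier-domain weighting: sum_i ||g_i (*) (a - b)||^2  ==
# (1/N^2) * sum_k S(k) |F{a-b}(k)|^2, with S(k) = sum_i |F{g_i}(k)|^2.
# (*) denotes circular convolution; filters here are random stand-ins.

rng = np.random.default_rng(0)
N = 16
a = rng.standard_normal((N, N))          # "source" image
b = rng.standard_normal((N, N))          # "template" image
filters = [rng.standard_normal((N, N)) for _ in range(3)]

def conv2_circ(x, g):
    """2D circular convolution via the FFT convolution theorem."""
    return np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(g)))

# Spatial-domain cost: one convolution and one SSD per filter.
spatial = sum(np.sum(conv2_circ(a - b, g) ** 2) for g in filters)

# Fourier-domain cost: the whole bank collapses into one weighting S(k),
# so the per-iteration cost is invariant to the number of filters.
S = sum(np.abs(np.fft.fft2(g)) ** 2 for g in filters)
fourier = np.sum(S * np.abs(np.fft.fft2(a - b)) ** 2) / (N * N)

print(np.isclose(spatial, fourier))
```

Since S depends only on the filters, it can be precomputed once, which is what makes the inverse compositional form described in point (iv) so efficient.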

Abstract:

The practical number of charge carriers loaded is crucial to the evaluation of the capacity performance of carbon-based electrodes in service, and cannot be easily addressed experimentally. In this paper, we report a density functional theory study of charge carrier adsorption onto zigzag edge-shaped graphene nanoribbons (ZGNRs), both pristine and incorporating edge substitution with boron, nitrogen or oxygen atoms. All edge substitutions are found to be energetically favorable, especially in oxidized environments. The maximal loading of protons onto the substituted ZGNR edges obeys a rule of [8-n-1], where n is the number of valence electrons of the edge-site atom constituting the adsorption site. Hence, a maximum charge loading is achieved with boron substitution. This result correlates in a transparent manner with the electronic structure characteristics of the edge atom. The boron edge atom, characterized by the most empty p band, facilitates more than the other substitutional cases the accommodation of valence electrons transferred from the ribbon, induced by adsorption of protons. This result not only further confirms the possibility of enhancing charge storage performance of carbon-based electrochemical devices through chemical functionalization but also, more importantly, provides the physical rationale for further design strategies.
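The [8-n-1] rule quoted above can be checked with standard valence-electron counts for the substituent atoms (the counts are textbook values; the rule itself is from the abstract):

```python
# Maximal proton loading onto a substituted ZGNR edge follows [8 - n - 1],
# where n is the number of valence electrons of the edge-site atom.
# Valence-electron counts below are standard periodic-table values.

valence = {"B": 3, "C": 4, "N": 5, "O": 6}

def max_proton_loading(atom):
    n = valence[atom]
    return 8 - n - 1

for atom in ("B", "C", "N", "O"):
    print(atom, max_proton_loading(atom))
# Boron (n = 3) permits the highest loading, 4 protons, consistent with the
# reported maximum charge loading for boron substitution.
```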

Abstract:

The concept of environmental justice is well developed in North America, but is still at the evolutionary stage in most other jurisdictions around the globe. This paper seeks to explore two jurisdictions where incidents of environmental justice are likely to be seen in the future as a result of manufacturing and mining practices. The discussion will centre upon avenues to environmental justice for both private citizens and the public at large. The first jurisdiction considered is China, where environmental liability claims brought by Chinese citizens have increased at an annual average of 25% (Yang 2011). Manufacturing is at the core of the Chinese economy and is responsible for some of the unprecedented economic growth in the region. Less discussed are the industry impacts on water and air pollution levels and the associated implications of these pollutants on local communities. China introduced the Tort Liability Law (TLL) in 2010, which may provide avenues to justice for private citizens. The other jurisdiction considered by the paper is Australia, where the mining boom has buffered the Australian economy from the global financial crisis. There is some limited case law in Australia where private citizens have made a claim in toxic torts; however, the framework is underdeveloped in terms of the significant risks facing indigenous and local communities in mining areas, and also by comparison to the developments of the TLL framework in China. This paper traces the regulatory responses to the effects of major industries on communities in China and Australia. From this it examines the need for environmental justice avenues that align with rule of law principles.

Abstract:

Purpose To investigate hyperopic shifts and the oblique (or 45-degree/135-degree) component of astigmatism at large angles in the horizontal visual field using the Hartmann-Shack technique. Methods The adult participants consisted of 6 hypermetropes, 13 emmetropes and 11 myopes. Measurements were made with a modified COAS-HD Hartmann-Shack aberrometer across ±60 degrees along the horizontal visual field in 5-degree steps. Eyes were dilated with 1% cyclopentolate. Peripheral refraction was estimated as mean spherical (or spherical equivalent) refraction, with/against-the-rule astigmatism and oblique astigmatism components, and as horizontal and vertical refraction components based on 3-mm major diameter elliptical pupils. Results Thirty percent of eyes showed a pattern that was a combination of type IV and type I patterns of Rempt et al. (Rempt F, Hoogerheide J, Hoogenboom WP. Peripheral retinoscopy and the skiagram. Ophthalmologica 1971;162:1–10), which shows the characteristics of type IV (relative hypermetropia along the vertical meridian and relative myopia along the horizontal meridian) out to an angle of between 40 and 50 degrees before behaving like type I (both meridians show relative hypermetropia). We classified this pattern as type IV/I. Seven of 13 emmetropes had this pattern. As a group, there was no significant variation of the oblique component of astigmatism with angle, but about one-half of the eyes showed significant positive slopes (more positive or less negative values in the nasal field than in the temporal field) and one-fourth showed significant negative slopes. Conclusions It is often considered that a pattern of relative peripheral hypermetropia predisposes to the development of myopia. In this context, the finding of a considerable portion of emmetropes with the IV/I pattern suggests that it is unlikely that refraction at visual field angles beyond 40 degrees from fixation contributes to myopia development.
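The refraction components named in the Methods are conventionally obtained with Thibos-style power vectors from a sphere/cylinder/axis prescription: the spherical equivalent M, the with/against-the-rule component J0, and the oblique component J45. The sketch below uses these standard formulas for illustration; the paper's exact processing pipeline may differ.

```python
import math

# Standard power-vector decomposition of a sphere/cylinder/axis prescription:
#   M   = S + C/2                  (spherical equivalent)
#   J0  = -(C/2) * cos(2*axis)    (with/against-the-rule astigmatism)
#   J45 = -(C/2) * sin(2*axis)    (oblique, 45/135-degree astigmatism)

def power_vector(sphere, cyl, axis_deg):
    """Return (M, J0, J45) in dioptres for a sphere/cyl/axis prescription."""
    a = math.radians(axis_deg)
    M = sphere + cyl / 2.0
    J0 = -(cyl / 2.0) * math.cos(2 * a)
    J45 = -(cyl / 2.0) * math.sin(2 * a)
    return M, J0, J45

# Example: -1.00 DS / -0.50 DC x 180 (axis at 180 degrees)
M, J0, J45 = power_vector(-1.00, -0.50, 180)
print(round(M, 2), round(J0, 2), round(J45, 2))
```

An axis of 180 degrees puts all of the cylinder into J0 (with-the-rule), which is why abstracts like this one can report the "oblique component" J45 separately from with/against-the-rule astigmatism.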

Abstract:

Scores of well-researched individual papers and posters specifically or indirectly addressing the occurrence, measurement or exposure impacts of chemicals in buildings were presented at the 2012 Healthy Buildings Conference. Many of these presentations offered advances in sampling and characterisation of chemical pollutants, while others extended the frontiers of knowledge on the emission, adsorption, risk, fate and compositional levels of chemicals in indoor and outdoor microenvironments. Several presentations modelled or monitored indoor chemistry, including processes that generated secondary pollutants. This article provides an overview of the state of knowledge on healthy buildings based on papers presented in the chemistry sessions at the Healthy Buildings 2012 (HB2012) Conference. It also suggests future directions in healthy buildings research.

Abstract:

Articular cartilage is the load-bearing tissue that consists of proteoglycan macromolecules entrapped between collagen fibrils in a three-dimensional architecture. To date, the search for mathematical models to represent the biomechanics of such a system continues without providing a fitting description of its functional response to load at the micro-scale level. We believe that the major complication arose when cartilage was first envisaged as a multiphasic model with distinguishable components, and that quantifying these and searching for the laws that govern their interaction is inadequate. Central to the thesis of this paper is that cartilage as a bulk is as much a continuum as is the response of its components to external stimuli. For this reason, we framed the fundamental question as to what would be the mechano-structural functionality of such a system in the total absence of one of its key constituents: proteoglycans. To answer this, hydrated normal and proteoglycan-depleted samples were tested under confined compression while finite element models were reproduced, for the first time, based on the structural microarchitecture of the cross-sectional profile of the matrices. These micro-porous in silico models served as virtual transducers, providing an internal noninvasive probing mechanism beyond experimental capabilities to render the matrices' micromechanics and several other properties, such as permeability and orientation. The results demonstrated that load transfer was closely related to the microarchitecture of the hyperelastic models that represent solid skeleton stress and fluid response based on the state of the collagen network with and without the swollen proteoglycans. In other words, the stress gradient during deformation was a function of the structural pattern of the network and acted in concert with the position-dependent compositional state of the matrix.
This reveals that the interaction between indistinguishable components in real cartilage is superimposed by its microarchitectural state which directly influences macromechanical behavior.

Abstract:

In this thesis we investigate the use of quantum probability theory for ranking documents. Quantum probability theory is used to estimate the probability of relevance of a document given a user's query. We posit that quantum probability theory can lead to a better estimation of the probability of a document being relevant to a user's query than the common approach, i.e. the Probability Ranking Principle (PRP), which is based upon Kolmogorovian probability theory. Following our hypothesis, we formulate an analogy between the document retrieval scenario and a physical scenario, that of the double slit experiment. Through the analogy, we propose a novel ranking approach, the quantum probability ranking principle (qPRP). Key to our proposal is the presence of quantum interference. Mathematically, this is the statistical deviation between empirical observations and expected values predicted by the Kolmogorovian rule of additivity of probabilities of disjoint events in configurations such as that of the double slit experiment. We propose an interpretation of quantum interference in the document ranking scenario, and examine how quantum interference can be effectively estimated for document retrieval. To validate our proposal and to gain more insights about approaches for document ranking, we (1) analyse PRP, qPRP and other ranking approaches, exposing the assumptions underlying their ranking criteria and formulating the conditions for the optimality of the two ranking principles, (2) empirically compare three ranking principles (i.e. PRP, interactive PRP, and qPRP) and two state-of-the-art ranking strategies in two retrieval scenarios, those of ad-hoc retrieval and diversity retrieval, (3) analytically contrast the ranking criteria of the examined approaches, exposing similarities and differences, (4) study the ranking behaviours of approaches alternative to PRP in terms of the kinematics they impose on relevant documents, i.e. by considering the extent and direction of the movements of relevant documents across the ranking recorded when comparing PRP against its alternatives. Our findings show that the effectiveness of the examined ranking approaches strongly depends upon the evaluation context. In the traditional evaluation context of ad-hoc retrieval, PRP is empirically shown to be better than or comparable to alternative ranking approaches. However, when we turn to examine evaluation contexts that account for interdependent document relevance (i.e. when the relevance of a document is assessed also with respect to other retrieved documents, as is the case in the diversity retrieval scenario), then the use of quantum probability theory, and thus of qPRP, is shown to improve retrieval and ranking effectiveness over the traditional PRP and alternative ranking strategies, such as Maximal Marginal Relevance, Portfolio theory, and Interactive PRP. This work represents a significant step forward regarding the use of quantum theory in information retrieval. It demonstrates that the application of quantum theory to problems within information retrieval can lead to improvements both in modelling power and retrieval effectiveness, allowing the construction of models that capture the complexity of information retrieval situations. Furthermore, the thesis opens up a number of lines for future research. These include: (1) investigating estimations and approximations of quantum interference in qPRP; (2) exploiting complex numbers for the representation of documents and queries; and (3) applying the concepts underlying qPRP to tasks other than document ranking.
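The interference term at the heart of qPRP can be made concrete with complex amplitudes: when probabilities are derived as squared magnitudes of amplitudes, the probability of the combined event gains a cross term that the Kolmogorovian rule of additivity for disjoint events lacks. The amplitudes below are illustrative values, not estimates from retrieval data.

```python
import cmath

# Double-slit-style deviation from Kolmogorovian additivity: with complex
# amplitudes psi_a, psi_b, the combined probability |psi_a + psi_b|^2 equals
# |psi_a|^2 + |psi_b|^2 plus an interference cross term 2*Re(conj(psi_a)*psi_b).

def combined_probability(psi_a, psi_b):
    return abs(psi_a + psi_b) ** 2

def interference(psi_a, psi_b):
    return 2 * (psi_a.conjugate() * psi_b).real

psi_a = cmath.rect(0.6, 0.0)            # amplitude associated with document A
psi_b = cmath.rect(0.5, cmath.pi / 3)   # amplitude for document B, phase-shifted

additive = abs(psi_a) ** 2 + abs(psi_b) ** 2   # Kolmogorovian prediction
deviation = combined_probability(psi_a, psi_b) - additive
print(deviation)                                # equals interference(psi_a, psi_b)
```

In qPRP this cross term is what couples a candidate document's score to the documents already ranked, which is why the principle behaves differently from PRP precisely in interdependent-relevance settings such as diversity retrieval.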

Abstract:

Lawyering and Positive Professional Identities aims to help law students successfully navigate the demands of law studies and legal practice through the development of positive professional legal identities. It does this by focusing on the knowledge, skills and attitudes necessary for law students to be motivated and engaged learners, and psychologically healthy individuals. The text will fill an important gap for many law schools seeking to enact the threshold learning outcomes for law by addressing these important topics in their curricula. It is a valuable guide for all law students who wish to maximise their success and chances of thriving at law school and beyond. Positive lawyering knowledge and practice are central themes of this book, with a particular emphasis on lawyers’ roles as upholders of the rule of law, as dispute resolvers and as ethical professionals. Throughout, the authors provide practical, experience-based advice on the development of core skills for legal education and practice.