196 results for proof


Relevance:

10.00%

Publisher:

Abstract:

With the emergence of multi-core processors into the mainstream, parallel programming is no longer the specialized domain it once was. There is a growing need for systems to allow programmers to more easily reason about data dependencies and inherent parallelism in general purpose programs. Many of these programs are written in popular imperative programming languages like Java and C#. In this thesis I present a system for reasoning about side-effects of evaluation in an abstract and composable manner that is suitable for use by both programmers and automated tools such as compilers. The goal of developing such a system is both to facilitate the automatic exploitation of the inherent parallelism present in imperative programs and to allow programmers to reason about dependencies which may be limiting the parallelism available for exploitation in their applications. Previous work on languages and type systems for parallel computing has tended to focus on providing the programmer with tools to facilitate the manual parallelization of programs; programmers must decide when and where it is safe to employ parallelism without the assistance of the compiler or other automated tools. None of the existing systems combine abstraction and composition with parallelization and correctness checking to produce a framework which helps both programmers and automated tools to reason about inherent parallelism. In this work I present a system for abstractly reasoning about side-effects and data dependencies in modern, imperative, object-oriented languages using a type and effect system based on ideas from Ownership Types. I have developed sufficient conditions for the safe, automated detection and exploitation of a number of task, data and loop parallelism patterns in terms of ownership relationships. To validate my work, I have applied my ideas to the C# version 3.0 language to produce a language extension called Zal. I have implemented a compiler for the Zal language as an extension of the GPC# research compiler as a proof of concept of my system. I have used it to parallelize a number of real-world applications to demonstrate the feasibility of my proposed approach. In addition to this empirical validation, I present an argument for the correctness of the type system and language semantics I have proposed, as well as sketches of proofs for the correctness of the proposed sufficient conditions for parallelization.
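
The abstract does not give Zal's syntax, so here is a minimal sketch of the underlying idea in Python: tasks declare their read and write effects, and two tasks whose declared effects are disjoint may safely run in parallel. The `effects` decorator and task names below are hypothetical illustrations, not the thesis's notation.

```python
from concurrent.futures import ThreadPoolExecutor

def effects(reads=(), writes=()):
    """Hypothetical effect annotation: record what a task reads and writes."""
    def wrap(fn):
        fn.reads, fn.writes = set(reads), set(writes)
        return fn
    return wrap

def disjoint(f, g):
    # Parallelization is safe when neither task writes what the other touches.
    return not (f.writes & (g.reads | g.writes) or g.writes & (f.reads | f.writes))

@effects(writes={"left"})
def task_a(data):
    data["left"] = sum(range(1000))

@effects(writes={"right"})
def task_b(data):
    data["right"] = sum(range(2000))

data = {}
if disjoint(task_a, task_b):
    # Declared write sets are disjoint, so the tasks may run concurrently.
    with ThreadPoolExecutor() as pool:
        pool.submit(task_a, data)
        pool.submit(task_b, data)
print(data)
```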

Relevance:

10.00%

Publisher:

Abstract:

The contributions of this thesis fall into three areas of certificateless cryptography. The first area is encryption, where we propose new constructions for both identity-based and certificateless cryptography. We construct an n-out-of-n group encryption scheme for identity-based cryptography that does not require any special means to generate the keys of the participating trusted authorities. We also introduce a new security definition for chosen-ciphertext-secure multi-key encryption. We prove that our construction is secure as long as at least one authority is uncompromised, and show that the existing constructions for chosen ciphertext security from identity-based encryption also hold in the group encryption case. We then consider certificateless encryption as the special case of 2-out-of-2 group encryption and give constructions for highly efficient certificateless schemes in the standard model. Among these is the first construction of a lattice-based certificateless encryption scheme. Our next contribution is a highly efficient certificateless key encapsulation mechanism (KEM), which we prove secure in the standard model. We introduce a new way of proving the security of certificateless schemes that are based on identity-based schemes: we leave the identity-based part of the proof intact and extend it only to cover the part introduced by the certificateless scheme. We show that our construction is more efficient than any instantiation of generic constructions for certificateless key encapsulation in the standard model. The third area in which the thesis advances certificateless cryptography is key agreement. Swanson showed that many certificateless key agreement schemes are insecure if considered in a reasonable security model. We propose the first provably secure certificateless key agreement schemes in the strongest model for certificateless key agreement. We extend Swanson's definition for certificateless key agreement and give more power to the adversary. Our new schemes are secure as long as each party has at least one uncompromised secret. Our first construction is in the random oracle model and gives the adversary slightly more capabilities than our second construction in the standard model. Interestingly, our standard model construction is as efficient as the random oracle model construction.
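
As a toy illustration of the n-out-of-n trust model only (not the actual pairing- or lattice-based constructions), the following Python sketch combines partial keys from n authorities so that all n are needed to reconstruct the user's key; any n−1 of them reveal nothing about it:

```python
import secrets
from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

n = 3
# Each authority independently issues a partial key for the same identity.
partials = [secrets.token_bytes(32) for _ in range(n)]

# The full decryption key combines all n partials.
full_key = reduce(xor, partials)

# Any n-1 partials XOR to a value that is uniformly random relative to
# full_key while the last authority's partial stays secret: one honest
# (uncompromised) authority suffices for secrecy.
leaked = reduce(xor, partials[:-1])
assert xor(leaked, partials[-1]) == full_key
```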

Relevance:

10.00%

Publisher:

Abstract:

Real-world business processes rely on the availability of scarce, shared resources, both human and non-human. Current workflow management systems support allocation of individual human resources to tasks but lack support for the full range of resource types used in practice, and the inevitable constraints on their availability and applicability. Based on past experience with resource-intensive workflow applications, we derive generic requirements for a workflow system which can use its knowledge of resource capabilities and availability to help create feasible task schedules. We then define the necessary architecture for implementing such a system and demonstrate its practicality through a proof-of-concept implementation. This work is presented in the context of a real-life surgical care process observed in a number of German hospitals.
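
A minimal sketch of the kind of capability-aware allocation such a system enables, in Python. The task/resource model and the greedy strategy are my assumptions for illustration; the paper does not specify its architecture at this level.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    capabilities: set
    free_at: int = 0          # time at which the resource is next available

@dataclass
class Task:
    name: str
    duration: int
    needs: set                # one resource required per listed capability

def schedule(tasks, resources):
    """Greedy feasibility check: give each task, in order, the earliest slot
    at which one resource per required capability is free."""
    plan = []
    for task in tasks:
        chosen = {}
        for cap in task.needs:
            free = [r for r in resources
                    if cap in r.capabilities and r not in chosen.values()]
            if not free:
                raise ValueError(f"no available resource offers {cap!r}")
            chosen[cap] = min(free, key=lambda r: r.free_at)
        start = max(r.free_at for r in chosen.values())
        for r in chosen.values():
            r.free_at = start + task.duration
        plan.append((task.name, start, {c: r.name for c, r in chosen.items()}))
    return plan

resources = [Resource("Dr. A", {"surgeon"}), Resource("OR 1", {"theatre"})]
tasks = [Task("op-1", 2, {"surgeon", "theatre"}),
         Task("op-2", 1, {"surgeon", "theatre"})]
print(schedule(tasks, resources))  # op-2 starts once op-1 releases both resources
```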

Relevance:

10.00%

Publisher:

Abstract:

The requirement to prove a society united by a body of law and customs in order to establish native title rights has been identified as a major hurdle to achieving native title recognition. The recent appeal decision of the Federal Court in Sampi on behalf of the Bardi and Jawi People v Western Australia [2010] opens the potential for a new judicial interpretation of society based on the internal view of native title claimants. The decision draws on defining features of legal positivism to inform the court’s findings as to the existence of a single Bardi Jawi society of ‘one people’ living under ‘one law’. The case of Bodney v Bennell [2008] is analysed through a comparative study of how the application of the received positivist framework may limit native title recognition. This paper argues that the framing of Indigenous law by reference to Western legal norms is problematic due to the assumptions of legal positivism, and that an internal view based on Indigenous worldviews, which see law as intrinsically linked to the spiritual and ancestral connection to country, is more appropriate for determining proof in native title claims.

Relevance:

10.00%

Publisher:

Abstract:

We present an automated verification method for security of Diffie–Hellman–based key exchange protocols. The method includes a Hoare-style logic and syntactic checking. The method is applied to protocols in a simplified version of the Bellare–Rogaway–Pointcheval model (2000). The security of the protocol in the complete model can be established automatically by a modular proof technique of Kudla and Paterson (2005).
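
For context, the protocols under verification rest on the Diffie–Hellman exchange. A toy Python version (a small Mersenne prime for illustration only; real protocols use large standardized groups or elliptic curves) shows the shared-secret property that the Hoare-style logic reasons about:

```python
import secrets

p = 2**127 - 1     # a Mersenne prime; toy-sized, illustration only
g = 3

a = secrets.randbelow(p - 2) + 1   # Alice's ephemeral exponent
b = secrets.randbelow(p - 2) + 1   # Bob's ephemeral exponent

A = pow(g, a, p)   # Alice sends g^a mod p
B = pow(g, b, p)   # Bob sends g^b mod p

# Both parties derive the same secret g^(ab) mod p; establishing secrecy of
# this value against an adversary is what the verification method automates.
assert pow(B, a, p) == pow(A, b, p)
```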

Relevance:

10.00%

Publisher:

Abstract:

Sample complexity results from computational learning theory, when applied to neural network learning for pattern classification problems, suggest that for good generalization performance the number of training examples should grow at least linearly with the number of adjustable parameters in the network. Results in this paper show that if a large neural network is used for a pattern classification problem and the learning algorithm finds a network with small weights that has small squared error on the training patterns, then the generalization performance depends on the size of the weights rather than the number of weights. For example, consider a two-layer feedforward network of sigmoid units, in which the sum of the magnitudes of the weights associated with each unit is bounded by A and the input dimension is n. We show that the misclassification probability is no more than a certain error estimate (that is related to squared error on the training set) plus A^3 √((log n)/m) (ignoring log A and log m factors), where m is the number of training patterns. This may explain the generalization performance of neural networks, particularly when the number of training examples is considerably smaller than the number of weights. It also supports heuristics (such as weight decay and early stopping) that attempt to keep the weights small during training. The proof techniques appear to be useful for the analysis of other pattern classifiers: when the input domain is a totally bounded metric space, we use the same approach to give upper bounds on misclassification probability for classifiers with decision boundaries that are far from the training examples.
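
In display form, the bound described in the abstract reads as follows (a sketch; constants and the log A, log m factors are suppressed, as in the text, and the error-estimate notation is mine):

```latex
\Pr\bigl[\text{misclassification}\bigr]
  \;\le\; \widehat{\operatorname{err}}_{m}
  \;+\; O\!\left( A^{3} \sqrt{\frac{\log n}{m}} \right)
```

where \widehat{\operatorname{err}}_{m} is the squared-error-related estimate on the m training patterns, A bounds the per-unit weight magnitudes, and n is the input dimension.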

Relevance:

10.00%

Publisher:

Abstract:

We investigate the behavior of the empirical minimization algorithm using various methods. We first analyze it by comparing the empirical (random) structure and the original one on the class, either in an additive sense, via the uniform law of large numbers, or in a multiplicative sense, using isomorphic coordinate projections. We then show that a direct analysis of the empirical minimization algorithm yields a significantly better bound, and that the estimates we obtain are essentially sharp. The method of proof we use is based on Talagrand’s concentration inequality for empirical processes.
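
A sketch of the two modes of comparison mentioned, writing P_n for the empirical mean and P for the true mean over the loss class F (the notation, and the constant 2 in the multiplicative form, are illustrative choices, not the paper's):

```latex
\text{additive:}\quad
\sup_{f \in F} \left| P_n f - P f \right| \le \varepsilon,
\qquad
\text{multiplicative (isomorphic):}\quad
\tfrac{1}{2}\, P f \;\le\; P_n f \;\le\; 2\, P f
\quad \text{for all } f \in F .
```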

Relevance:

10.00%

Publisher:

Abstract:

We present new expected risk bounds for binary and multiclass prediction, and resolve several recent conjectures on sample compressibility due to Kuzmin and Warmuth. By exploiting the combinatorial structure of concept class F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy. The key step in their proof is a d = VC(F) bound on the graph density of a subgraph of the hypercube—the one-inclusion graph. The first main result of this paper is a density bound of n∙choose(n−1, ≤d−1)/choose(n, ≤d) < d, which positively resolves a conjecture of Kuzmin and Warmuth relating to their unlabeled Peeling compression scheme and also leads to an improved one-inclusion mistake bound. The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is an algebraic topological property of maximum classes of VC-dimension d as being d-contractible simplicial complexes, extending the well-known characterization that d = 1 maximum classes are trees. We negatively resolve a minimum degree conjecture of Kuzmin and Warmuth—the second part to a conjectured proof of correctness for Peeling—that every class has one-inclusion minimum degree at most its VC-dimension. Our final main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This result improves on known PAC-based expected risk bounds by a factor of O(log n) and is shown to be optimal up to an O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout.
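
In standard notation, the density bound stated in the abstract is:

```latex
\frac{n \binom{n-1}{\le d-1}}{\binom{n}{\le d}} \;<\; d,
\qquad\text{where}\qquad
\binom{n}{\le d} \;=\; \sum_{i=0}^{d} \binom{n}{i},
```

so for d = VC(F) it recovers, and improves on, the d/n expected mistake rate of the one-inclusion strategy.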

Relevance:

10.00%

Publisher:

Abstract:

H. Simon and B. Szörényi have found an error in the proof of Theorem 52 of “Shifting: One-inclusion mistake bounds and sample compression”, Rubinstein et al. (2009). In this note we provide a corrected proof of a slightly weakened version of this theorem. Our new bound on the density of one-inclusion hypergraphs is again in terms of the capacity of the multilabel concept class. Simon and Szörényi have recently proved an alternate result in Simon and Szörényi (2009).

Relevance:

10.00%

Publisher:

Abstract:

The paper "The importance of convexity in learning with squared loss" gave a lower bound on the sample complexity of learning with quadratic loss using a nonconvex function class. The proof contains an error. We show that the lower bound is true under a stronger condition that holds for many cases of interest.

Relevance:

10.00%

Publisher:

Abstract:

We present new expected risk bounds for binary and multiclass prediction, and resolve several recent conjectures on sample compressibility due to Kuzmin and Warmuth. By exploiting the combinatorial structure of concept class F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy. The key step in their proof is a d=VC(F) bound on the graph density of a subgraph of the hypercube—the one-inclusion graph. The first main result of this report is a density bound of n∙choose(n-1,≤d-1)/choose(n,≤d) < d, which positively resolves a conjecture of Kuzmin and Warmuth relating to their unlabeled Peeling compression scheme and also leads to an improved one-inclusion mistake bound. The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is an algebraic topological property of maximum classes of VC-dimension d as being d-contractible simplicial complexes, extending the well-known characterization that d=1 maximum classes are trees. We negatively resolve a minimum degree conjecture of Kuzmin and Warmuth—the second part to a conjectured proof of correctness for Peeling—that every class has one-inclusion minimum degree at most its VC-dimension. Our final main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This result improves on known PAC-based expected risk bounds by a factor of O(log n) and is shown to be optimal up to an O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout.

Relevance:

10.00%

Publisher:

Abstract:

We present an algorithm called Optimistic Linear Programming (OLP) for learning to optimize average reward in an irreducible but otherwise unknown Markov decision process (MDP). OLP uses its experience so far to estimate the MDP. It chooses actions by optimistically maximizing estimated future rewards over a set of next-state transition probabilities that are close to the estimates, a computation that corresponds to solving linear programs. We show that the total expected reward obtained by OLP up to time T is within C(P) log T of the reward obtained by the optimal policy, where C(P) is an explicit, MDP-dependent constant. OLP is closely related to an algorithm proposed by Burnetas and Katehakis with four key differences: OLP is simpler, it does not require knowledge of the supports of transition probabilities, the proof of the regret bound is simpler, but our regret bound is a constant factor larger than the regret of their algorithm. OLP is also similar in flavor to an algorithm recently proposed by Auer and Ortner. But OLP is simpler and its regret bound has a better dependence on the size of the MDP.
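
The optimistic step can be sketched compactly. OLP itself solves linear programs; the snippet below implements the standard greedy solution to the same inner problem — maximize expected next-state value over transition vectors within an L1 ball around the empirical estimate — as a simplification of, not a reproduction of, the paper's exact computation:

```python
import numpy as np

def optimistic_value(p_hat: np.ndarray, v: np.ndarray, radius: float) -> float:
    """Maximize p.v over probability vectors p with ||p - p_hat||_1 <= radius.
    Greedy: shift mass onto the best next state, removing it from the worst."""
    p = p_hat.astype(float).copy()
    order = np.argsort(v)                    # next states by value, ascending
    best = order[-1]
    boost = min(radius / 2.0, 1.0 - p[best])
    p[best] += boost
    for s in order:                          # pay for the boost from low-value states
        if boost <= 0:
            break
        if s == best:
            continue
        take = min(p[s], boost)
        p[s] -= take
        boost -= take
    return float(p @ v)

# Example: with a generous radius the optimistic value moves toward max(v).
print(optimistic_value(np.array([0.5, 0.3, 0.2]),
                       np.array([0.0, 1.0, 5.0]), 0.4))   # 2.3 vs. 1.3 empirical
```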

Relevance:

10.00%

Publisher:

Abstract:

"Bouncing Back: Resilient Design for Brisbane" was an opportunity for QUT students to communicate their inspiring design responses to adversity, to the larger Brisbane community. The exhibition demonstrates new and innovative ways of thinking about our cities, and how they are built to be resilient and to suit extreme environmental conditions. The challenge for architecture students is to address the state of architecture as a reflection of today's world and to consider how design fits into the 21st century. Students have explored notions of 'Urban Resilience' from multiple perspectives, including emergency design while facing flooding, flood proof housing and urban designs.

Relevance:

10.00%

Publisher:

Abstract:

It was reported that the manuscript of Crash was returned to the publisher with a note reading ‘The author is beyond psychiatric help’. Ballard took the lay diagnosis as proof of complete artistic success. Crash conflates the Freudian tropes of libido and thanatos, overlaying these onto the twentieth-century erotic icon, the car. Beyond mere incompetent adolescent copulatory fumblings in the back seat of the parental sedan or the clichéd phallic locomotor of the mid-life Ferrari, Ballard engages the full potentialities of the automobile as the locus and sine qua non of a perverse, though functional, erotic. ‘Autoeroticism’ is transformed into automotive, traumatic or surgical paraphilia, driving Helmut Newton’s insipid photo-essays of BDSM and orthopædics into an entirely new dimension, dancing precisely where (but more crucially, because) the ‘body is bruised to pleasure soul’. The serendipity of quotidian accidental collisions is supplanted, in pursuit of the fetishised object, by contrived (though not simulated) recreations of iconographic celebrity deaths. Penetration remains as a guiding trope of sexuality, but it is confounded by a perversity of focus. Such an obsessive pursuit of this autoerotic-as-reality necessitates the rejection of the law of human sexual regulation, requiring the re-interpretation of what constitutes sex itself by looking beyond or through conventional sexuality into Ballard’s paraphiliac and nightmarish consensual Other. This Other allows for (if not demands) the tangled wreckage of a sportscar to function as a transformative sexual agent, creating, of woman, a being of ‘free and perverse sexuality, releasing within its dying chromium and leaking engine-parts, all the deviant possibilities of her sex’.

Relevance:

10.00%

Publisher:

Abstract:

Tort law reform has resulted in legislation being passed by all Australian jurisdictions in the past decade implementing the recommendations contained in the Ipp Report. The report was a response to a perceived crisis in medical indemnity insurance. The objective was to restrict and limit liability in negligence actions. This paper considers to what extent the reforms have impacted on the liability of health professionals in medical negligence actions. The reversal of the onus of proof through the obvious risk sections has attempted to extend the scope of the defence of voluntary assumption of risk. There is no liability for the materialisation of an inherent risk. Presumptions and mandatory reductions for contributory negligence have attempted to reduce the liability of defendants; reductions of 100% for contributory negligence are now possible. Apologies can be made with no admission of legal liability, to encourage their making and thereby reduce the number of actions commenced. The peer acceptance defence has been enacted by legislation. There is protection for Good Samaritans even though the Ipp Report recommended against such protection. Limitation periods have been amended. Provisions relating to mental harm have been introduced, reinstating the requirements of normal fortitude and direct perception. After an analysis of the legislation, this paper argues that while there has been some limitation and restriction, courts have generally interpreted the civil liability reforms in compliance with the common law. It has been the impact of statutory limits on the assessment of damages which has limited the liability of health professionals in medical negligence actions.