941 results for Cantor Manifold


Relevance:

10.00%

Publisher:

Abstract:

Denial-of-service (DoS) attacks are a growing concern for networked services like the Internet. In recent years, major Internet e-commerce and government sites have been disabled by various DoS attacks. A common form of DoS attack is the resource depletion attack, in which an attacker tries to overload the server's resources, such as memory or computational power, rendering the server unable to service honest clients. A promising way to deal with this problem is for a defending server to identify and segregate malicious traffic as early as possible. Client puzzles, also known as proofs of work, have been shown to be a promising tool for thwarting DoS attacks in network protocols, particularly in authentication protocols. In this thesis, we design efficient client puzzles and propose a stronger security model for analysing them. We revisit a few key establishment protocols to analyse their DoS-resilience properties and strengthen them using existing and novel techniques. Our contributions in the thesis are manifold. We propose an efficient client puzzle whose security holds in the standard model under new computational assumptions. Assuming the presence of powerful DoS attackers, we identify a weakness in the most recent security model proposed for analysing client puzzles, and this finding leads us to introduce a better security model. We demonstrate the utility of our new security definitions by constructing two stronger hash-based client puzzles. We also show that, using stronger client puzzles, any protocol can be converted into a provably secure DoS-resilient key exchange protocol. In other contributions, we analyse the DoS-resilience properties of network protocols such as Just Fast Keying (JFK) and Transport Layer Security (TLS). In the JFK protocol, we identify a new DoS attack by applying Meadows' cost-based framework, and we prove that the original security claim of JFK does not hold. We then incorporate an existing technique to reduce the server's cost and prove that the new variant of JFK achieves perfect forward secrecy (a property not achieved by the original JFK protocol) and remains secure under JFK's original security assumptions. Finally, we introduce a novel cost-shifting technique that significantly reduces the computational cost of the server, and we apply it to one of the most important network protocols, TLS, to analyse the security of the resulting protocol. We also observe that the cost-shifting technique can be incorporated into any Diffie-Hellman based key exchange protocol to reduce a party's Diffie-Hellman exponentiation cost to one multiplication and one addition.
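
To make the mechanism concrete, below is a minimal sketch of a generic hash-based client puzzle (proof of work); the difficulty parameter, nonce format and function names are illustrative assumptions, not the constructions proposed in the thesis.

```python
import hashlib
import itertools
import os

def new_puzzle() -> bytes:
    """Server side: issue a fresh random nonce as the puzzle instance."""
    return os.urandom(16)

def solve_puzzle(nonce: bytes, difficulty_bits: int) -> int:
    """Client side: brute-force a counter until the hash has the required
    number of leading zero bits (roughly 2**difficulty_bits attempts)."""
    for counter in itertools.count():
        digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0:
            return counter

def verify_solution(nonce: bytes, counter: int, difficulty_bits: int) -> bool:
    """Server side: verification costs a single hash, so the workload is
    asymmetric -- cheap to check, expensive to produce."""
    digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0

nonce = new_puzzle()
solution = solve_puzzle(nonce, difficulty_bits=16)
assert verify_solution(nonce, solution, difficulty_bits=16)
```

A defending server can raise the difficulty under load, forcing each client to pay computation before the server commits its own resources.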

Relevance:

10.00%

Publisher:

Abstract:

Whether to keep products segregated (e.g., unbundled) or to integrate some or all of them (e.g., bundle) has been a problem of profound interest in areas such as portfolio theory in finance, risk capital allocation in insurance and the marketing of consumer products. Such decisions are inherently complex and depend on factors such as the underlying product values and consumer preferences, the latter frequently described using value functions, also known as utility functions in economics. In this paper, we develop decision rules for multiple products, which we generally call ‘exposure units’ to naturally cover manifold scenarios spanning well beyond ‘products’. Our findings show, for example, that the celebrated Thaler's principles of mental accounting hold as originally postulated when the values of all exposure units are positive (i.e. all are gains) or all negative (i.e. all are losses). In the case of exposure units with mixed-sign values, the decision rules are much more complex and rely on cataloging a Bell number of cases, a count that grows very quickly with the number of exposure units. Consequently, in the present paper, we provide detailed rules for the integration and segregation decisions in the case of up to three exposure units, and partial rules for an arbitrary number of units.
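
As a worked illustration of the all-gains and all-losses cases, here is a small sketch of the segregate-versus-integrate comparison for two outcomes under a prospect-theory-style value function; the functional form and parameter values are assumptions for demonstration, not the paper's decision rules.

```python
def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Prospect-theory-style value function: concave for gains,
    convex and steeper for losses (loss aversion factor lam)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def prefer_segregation(x: float, y: float) -> bool:
    """Thaler's comparison for two outcomes: segregate when the summed
    values of the separate outcomes exceed the value of the combined one."""
    return value(x) + value(y) > value(x + y)

# Two gains: concavity favours segregation.
print(prefer_segregation(50.0, 50.0))    # True
# Two losses: convexity in the loss domain favours integration.
print(prefer_segregation(-50.0, -50.0))  # False
```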

Relevance:

10.00%

Publisher:

Abstract:

The development of breast cancer is a complex process that involves multiple genes at many stages, from initial cell cycle dysregulation to disease progression. To identify genetic variations that influence this process, we conducted a large-scale association study using a collection of German cases and controls and >25,000 SNPs located within 16,000 genes. One of the loci identified was located on chromosome 11q13 [odds ratio (OR)=1.85, P=0.017]. The initial association was subsequently tested in two independent breast cancer collections. In both sample sets, the frequency of the susceptibility allele was increased in the cases (OR=1.6, P=0.01). The susceptibility allele was also associated with an increased family history of cancer (P=0.1). Fine mapping showed that the region of association extends approximately 300 kb and spans several genes, including the gene encoding the nuclear mitotic apparatus protein (NuMA). A nonsynonymous SNP (A794G) in NuMA was identified that showed a stronger association with breast cancer risk than the initial marker SNP (OR=2.8, P=0.005 initial sample; OR=2.1, P=0.002 combined). NuMA is a cell cycle-related protein essential for normal mitosis that is degraded in early apoptosis. NuMA-retinoic acid receptor alpha fusion proteins have been described in acute promyelocytic leukemia. Although the potential functional relevance of the A794G variation requires further biological validation, we conclude that variations in the NuMA gene are likely responsible for the observed increased breast cancer risk.

Relevance:

10.00%

Publisher:

Abstract:

We conducted a large-scale association study to identify genes that influence nonfamilial breast cancer risk using a collection of German cases and matched controls and >25,000 single nucleotide polymorphisms located within 16,000 genes. One of the candidate loci identified was located on chromosome 19p13.2 [odds ratio (OR) = 1.5, P = 0.001]. The effect was substantially stronger in the subset of cases with reported family history of breast cancer (OR = 3.4, P = 0.001). The finding was subsequently replicated in two independent collections (combined OR = 1.4, P < 0.001) and was also associated with predisposition to prostate cancer in an independent sample set of prostate cancer cases and matched controls (OR = 1.4, P = 0.002). High-density single nucleotide polymorphism mapping showed that the extent of association spans 20 kb and includes the intercellular adhesion molecule genes ICAM1, ICAM4, and ICAM5. Although genetic variants in ICAM5 showed the strongest association with disease status, ICAM1 is expressed at highest levels in normal and tumor breast tissue. A variant in ICAM5 was also associated with disease progression and prognosis. Because ICAMs are suitable targets for antibodies and small molecules, these findings may not only provide diagnostic and prognostic markers but also new therapeutic opportunities in breast and prostate cancer.

Relevance:

10.00%

Publisher:

Abstract:

A major challenge for robot localization and mapping systems is maintaining reliable operation in a changing environment. Vision-based systems in particular are susceptible to changes in illumination and weather, and the same location at another time of day may appear radically different to a feature-based visual localization system. One approach for mapping changing environments is to create and maintain maps that contain multiple representations of each physical location in a topological framework or manifold. However, this requires the system to correctly link two or more appearance representations to the same spatial location, even though the representations may appear quite dissimilar. This paper proposes a method of linking visual representations from the same location without requiring a visual match, thereby allowing vision-based localization systems to create multiple appearance representations of physical locations. The most likely position on the robot path is determined using particle filter methods based on dead reckoning data and recent visual loop closures. To avoid erroneous loop closures, the odometry-based inferences are only accepted when the inferred path's end point is confirmed as correct by the visual matching system. Algorithm performance is demonstrated using an indoor robot dataset and a large outdoor camera dataset.
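
A minimal sketch of the dead-reckoning particle filter idea described above: particles are propagated with noisy odometry and reweighted when a visual loop closure anchors the path. The one-dimensional path model and all parameters are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_particles = 500
positions = np.zeros(n_particles)           # particle positions along the path
weights = np.full(n_particles, 1.0 / n_particles)

def predict(odometry_step: float, noise_std: float = 0.1) -> None:
    """Propagate particles with noisy dead reckoning."""
    global positions
    positions = positions + odometry_step + rng.normal(0.0, noise_std, n_particles)

def loop_closure_update(observed_position: float, obs_std: float = 0.5) -> None:
    """Reweight particles against a visual loop closure, then resample."""
    global positions, weights
    weights = np.exp(-0.5 * ((positions - observed_position) / obs_std) ** 2)
    weights /= weights.sum()
    idx = rng.choice(n_particles, size=n_particles, p=weights)
    positions = positions[idx]
    weights = np.full(n_particles, 1.0 / n_particles)

for _ in range(20):                  # robot drives 20 steps of ~1 unit
    predict(odometry_step=1.0)
loop_closure_update(observed_position=20.0)
print(positions.mean())              # most likely position on the path
```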

Relevance:

10.00%

Publisher:

Abstract:

In this article, we analyse bifurcations from stationary stable spots to travelling spots in a planar three-component FitzHugh-Nagumo system that was proposed previously as a phenomenological model of gas-discharge systems. By combining formal analyses, center-manifold reductions, and detailed numerical continuation studies, we show that, in the parameter regime under consideration, the stationary spot destabilizes either through its zeroth Fourier mode in a Hopf bifurcation or through its first Fourier mode in a pitchfork or drift bifurcation, whilst the remaining Fourier modes appear to create only secondary bifurcations. Pitchfork bifurcations result in travelling spots, and we derive criteria for the criticality of these bifurcations. Our main finding is that supercritical drift bifurcations, leading to stable travelling spots, arise in this model, which does not seem possible for its two-component version.
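
For reference, a commonly studied form of a planar three-component FitzHugh-Nagumo system of this type (the precise scaling and parameter names here are assumptions drawn from related literature, not necessarily those used in the article):

```latex
\begin{aligned}
u_t &= \varepsilon^{2}\,\Delta u + u - u^{3} - \varepsilon\,(\alpha v + \beta w + \gamma),\\
\tau\, v_t &= \Delta v + u - v,\\
\theta\, w_t &= D^{2}\,\Delta w + u - w,
\end{aligned}
```

where $u$ is the activator, $v$ and $w$ are inhibitors with time constants $\tau$ and $\theta$, and $D$ sets the diffusion length of the second inhibitor; it is the drift (pitchfork) bifurcation of the first Fourier mode that turns a stationary spot into a travelling one.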

Relevance:

10.00%

Publisher:

Abstract:

Research on journalists’ characteristics, values, attitudes and role perceptions has expanded manifold since the first large-scale survey in the United States in the 1970s. Scholars around the world have investigated the work practices of a large variety of journalists, to the extent that we now have a sizeable body of evidence in this regard. Comparative research across cultures, however, has only recently begun to gain ground, with scholars interested in concepts of journalism culture in an age of globalisation. As part of a wider, cross-cultural effort, this study reports the results of a survey of 100 Australian journalists in order to paint a picture of the way journalists see their role in society. Such a study is important due to the relative absence of large-scale surveys of Australian journalists since Henningham’s (1993) seminal work. This paper reports some important trends in the Australian news media since the early 1990s, with improvements in gender balance and journalists now being older, better educated, and holding more leftist political views. In locating Australian journalism culture within the study’s framework, some long-held assumptions are reinforced, with journalists following traditional values of objectivity, passive reporting and the ideal of the fourth estate.

Relevance:

10.00%

Publisher:

Abstract:

Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as a system specification and a corresponding workflow model used as an implementation have to be consistent. Another example is analysing to what degree a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behaviour equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. However, these notions require exponential computation and yield only a Boolean result. In many cases, a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for computing causal behavioural profiles using structural decomposition, applicable to sound free-choice workflow systems whose unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
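
To make the profile idea concrete, here is a toy sketch that derives profile-style order relations for activity pairs from a set of traces; the relation symbols follow the informal description above, and the log and computation are illustrative, not the paper's structural-decomposition technique.

```python
from itertools import product

def behavioural_profile(traces: list) -> dict:
    """Classify each activity pair as strict order (->), reverse order (<-),
    interleaving (||), or exclusive (+), from observed weak order."""
    activities = {a for t in traces for a in t}
    weak = set()                      # a ~> b: a occurs before b in some trace
    for trace in traces:
        for i, a in enumerate(trace):
            for b in trace[i + 1:]:
                weak.add((a, b))
    profile = {}
    for a, b in product(activities, repeat=2):
        if a == b:
            continue
        ab, ba = (a, b) in weak, (b, a) in weak
        if ab and not ba:
            profile[(a, b)] = "->"
        elif ba and not ab:
            profile[(a, b)] = "<-"
        elif ab and ba:
            profile[(a, b)] = "||"
        else:
            profile[(a, b)] = "+"
    return profile

log = [["register", "check", "approve"], ["register", "approve", "check"]]
print(behavioural_profile(log)[("register", "approve")])  # '->'
print(behavioural_profile(log)[("check", "approve")])     # '||'
```

Comparing two such profiles pairwise gives a quantified degree of consistency, rather than the Boolean answer of trace equivalence.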

Relevance:

10.00%

Publisher:

Abstract:

This paper reviews some recent results in motion control of marine vehicles using a technique called Interconnection and Damping Assignment Passivity-based Control (IDA-PBC). This approach to motion control exploits the fact that vehicle dynamics can be described in terms of energy storage, distribution, and dissipation, and that the stable equilibrium points of mechanical systems are those at which the potential energy attains a minimum. The control forces are used to transform the closed-loop dynamics into a port-controlled Hamiltonian system with dissipation. This is achieved by shaping the energy-storing characteristics of the system, modifying its interconnection structure (how the energy is distributed), and injecting damping. The end result is that the closed-loop system presents a stable equilibrium (hopefully global) at the desired operating point. By forcing the closed-loop dynamics into Hamiltonian form, the resulting total energy function of the system serves as a Lyapunov function that can be used to demonstrate stability. We consider the tracking and regulation of fully actuated unmanned underwater vehicles, the extension to under-actuated slender vehicles, and manifold regulation of under-actuated surface vessels. The paper concludes with an outlook on future research.
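
For reference, the closed-loop target form used in IDA-PBC, in the standard port-Hamiltonian notation (assumed here from the general literature rather than quoted from the paper):

```latex
\dot{x} = \bigl(J_d(x) - R_d(x)\bigr)\,\frac{\partial H_d}{\partial x}(x),
\qquad
\dot{H}_d = -\left(\frac{\partial H_d}{\partial x}\right)^{\!\top}
R_d\,\frac{\partial H_d}{\partial x} \le 0,
```

where $J_d = -J_d^\top$ is the desired interconnection matrix, $R_d = R_d^\top \ge 0$ the injected damping, and $H_d$ the shaped energy, whose minimum at the desired operating point makes it the Lyapunov function mentioned above.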

Relevance:

10.00%

Publisher:

Abstract:

Recent advances suggest that encoding images through Symmetric Positive Definite (SPD) matrices and then interpreting such matrices as points on Riemannian manifolds can lead to increased classification performance. Taking into account manifold geometry is typically done via (1) embedding the manifolds in tangent spaces, or (2) embedding into Reproducing Kernel Hilbert Spaces (RKHS). While embedding into tangent spaces allows the use of existing Euclidean-based learning algorithms, manifold shape is only approximated which can cause loss of discriminatory information. The RKHS approach retains more of the manifold structure, but may require non-trivial effort to kernelise Euclidean-based learning algorithms. In contrast to the above approaches, in this paper we offer a novel solution that allows SPD matrices to be used with unmodified Euclidean-based learning algorithms, with the true manifold shape well-preserved. Specifically, we propose to project SPD matrices using a set of random projection hyperplanes over RKHS into a random projection space, which leads to representing each matrix as a vector of projection coefficients. Experiments on face recognition, person re-identification and texture classification show that the proposed approach outperforms several recent methods, such as Tensor Sparse Coding, Histogram Plus Epitome, Riemannian Locality Preserving Projection and Relational Divergence Classification.
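
For contrast with the proposed method, below is a small sketch of approach (1) above: the tangent-space (log-Euclidean) embedding that turns SPD matrices into ordinary vectors for Euclidean learners. The data and dimensions are made up for illustration.

```python
import numpy as np
from scipy.linalg import logm

def spd_to_vector(spd: np.ndarray) -> np.ndarray:
    """Map an SPD matrix to the tangent space at the identity via the
    matrix logarithm, then vectorise the symmetric result."""
    log_spd = logm(spd).real
    rows, cols = np.triu_indices(spd.shape[0])
    vec = log_spd[rows, cols]
    # scale off-diagonal entries so the Euclidean norm of the vector
    # matches the Frobenius norm of the full symmetric matrix
    vec[rows != cols] *= np.sqrt(2)
    return vec

# toy covariance descriptors of two images
rng = np.random.default_rng(0)
cov_a = np.cov(rng.normal(size=(100, 5)).T)
cov_b = np.cov(rng.normal(size=(100, 5)).T)
# Euclidean distance between embeddings approximates the log-Euclidean
# Riemannian distance between the two SPD matrices
print(np.linalg.norm(spd_to_vector(cov_a) - spd_to_vector(cov_b)))
```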

Relevance:

10.00%

Publisher:

Abstract:

Traditional nearest points methods use all the samples in an image set to construct a single convex or affine hull model for classification. However, strong artificial features and noisy data may be generated from combinations of training samples when significant intra-class variations and/or noise occur in the image set. Existing multi-model approaches extract local models by clustering each image set individually only once, with fixed clusters used for matching with various image sets. This may not be optimal for discrimination, as undesirable environmental conditions (e.g., illumination and pose variations) may result in the two closest clusters representing different characteristics of an object (e.g., a frontal face being compared to a non-frontal face). To address this problem, we propose a novel approach that enhances nearest points based methods by integrating affine/convex hull classification with an adapted multi-model approach. We first extract multiple local convex hulls from a query image set via maximum margin clustering to diminish the artificial variations and constrain the noise in local convex hulls. We then propose adaptive reference clustering (ARC) to constrain the clustering of each gallery image set by forcing the clusters to have resemblance to the clusters in the query image set. By applying ARC, noisy clusters in the query set can be discarded. Experiments on the Honda, MoBo and ETH-80 datasets show that the proposed method outperforms single model approaches and other recent techniques, such as Sparse Approximated Nearest Points, Mutual Subspace Method and Manifold Discriminant Analysis.
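
As background, a minimal sketch of the affine-hull variant of the nearest-points distance that these methods build on (the convex-hull variant additionally constrains the combination coefficients); the data shapes are arbitrary assumptions.

```python
import numpy as np

def affine_hull_distance(X: np.ndarray, Y: np.ndarray) -> float:
    """Distance between the affine hulls of two sample sets
    (columns = samples), as used by nearest points methods."""
    mu_x, mu_y = X.mean(axis=1), Y.mean(axis=1)
    Ux = X - mu_x[:, None]           # directions spanning each hull
    Uy = Y - mu_y[:, None]
    # nearest points: min ||(mu_x + Ux a) - (mu_y + Uy b)|| over a, b
    A = np.hstack([Ux, -Uy])
    coeffs, *_ = np.linalg.lstsq(A, mu_y - mu_x, rcond=None)
    residual = (mu_x - mu_y) + A @ coeffs
    return float(np.linalg.norm(residual))

rng = np.random.default_rng(0)
gallery = rng.normal(size=(20, 10))          # 20-dim features, 10 samples
query = gallery[:, :5] + 0.1 * rng.normal(size=(20, 5))
print(affine_hull_distance(gallery, query))  # small: hulls nearly intersect
```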

Relevance:

10.00%

Publisher:

Abstract:

Person re-identification is particularly challenging due to significant appearance changes across separate camera views. In order to re-identify people, a representative human signature should effectively handle differences in illumination, pose and camera parameters. While general appearance-based methods are modelled in Euclidean spaces, it has been argued that some applications in image and video analysis are better modelled via non-Euclidean manifold geometry. To this end, recent approaches represent images as covariance matrices, and interpret such matrices as points on Riemannian manifolds. As direct classification on such manifolds can be difficult, in this paper we propose to represent each manifold point as a vector of similarities to class representers, via a recently introduced form of Bregman matrix divergence known as the Stein divergence. This is followed by using a discriminative mapping of similarity vectors for final classification. The use of similarity vectors is in contrast to the traditional approach of embedding manifolds into tangent spaces, which can suffer from representing the manifold structure inaccurately. Comparative evaluations on benchmark ETHZ and iLIDS datasets for the person re-identification task show that the proposed approach obtains better performance than recent techniques such as Histogram Plus Epitome, Partial Least Squares, and Symmetry-Driven Accumulation of Local Features.
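
To make the classification step concrete, here is a small sketch of the Stein (Jensen-Bregman LogDet) divergence and of building a similarity vector to class representers; the data are made up, and the final discriminative mapping is omitted.

```python
import numpy as np

def stein_divergence(X: np.ndarray, Y: np.ndarray) -> float:
    """Stein (Jensen-Bregman LogDet) divergence between SPD matrices:
    S(X, Y) = log det((X + Y) / 2) - 0.5 * log det(X Y)."""
    _, logdet_mid = np.linalg.slogdet((X + Y) / 2.0)
    _, logdet_x = np.linalg.slogdet(X)
    _, logdet_y = np.linalg.slogdet(Y)
    return logdet_mid - 0.5 * (logdet_x + logdet_y)

def similarity_vector(query: np.ndarray, representers: list) -> np.ndarray:
    """Represent a manifold point as its divergences to class representers,
    ready for a discriminative (Euclidean) classifier."""
    return np.array([stein_divergence(query, R) for R in representers])

rng = np.random.default_rng(0)
covs = [np.cov(rng.normal(size=(50, 8)).T) for _ in range(3)]
print(similarity_vector(covs[0], covs))  # first entry is 0 (self-divergence)
```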

Relevance:

10.00%

Publisher:

Abstract:

Existing multi-model approaches for image set classification extract local models by clustering each image set individually only once, with fixed clusters used for matching with other image sets. However, this may result in the two closest clusters representing different characteristics of an object, due to undesirable environmental conditions (such as variations in illumination and pose). To address this problem, we propose to constrain the clustering of each query image set by forcing the clusters to have resemblance to the clusters in the gallery image sets. We first define a Frobenius norm distance between subspaces over Grassmann manifolds based on reconstruction error. We then extract local linear subspaces from a gallery image set via sparse representation. For each local linear subspace, we adaptively construct the corresponding closest subspace from the samples of a probe image set by joint sparse representation. We show that by minimising the sparse representation reconstruction error, we approach the nearest point on a Grassmann manifold. Experiments on the Honda, ETH-80 and Cambridge-Gesture datasets show that the proposed method consistently outperforms several other recent techniques, such as Affine Hull based Image Set Distance (AHISD), Sparse Approximated Nearest Points (SANP) and Manifold Discriminant Analysis (MDA).
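
As background, a minimal sketch of a projection-based Frobenius-norm distance between linear subspaces viewed as points on a Grassmann manifold, in the spirit of the distance defined above; the SVD-based basis construction and the toy data are illustrative assumptions.

```python
import numpy as np

def subspace_basis(samples: np.ndarray, dim: int) -> np.ndarray:
    """Orthonormal basis of the local linear subspace spanned by the
    samples (columns), via thin SVD."""
    U, _, _ = np.linalg.svd(samples, full_matrices=False)
    return U[:, :dim]

def grassmann_distance(U1: np.ndarray, U2: np.ndarray) -> float:
    """Frobenius-norm distance between the projection matrices of two
    subspaces -- a standard metric on the Grassmann manifold."""
    P1, P2 = U1 @ U1.T, U2 @ U2.T
    return float(np.linalg.norm(P1 - P2, "fro"))

rng = np.random.default_rng(0)
gallery_set = rng.normal(size=(30, 12))      # image set: 12 samples
probe_set = gallery_set[:, :6] + 0.05 * rng.normal(size=(30, 6))
d = grassmann_distance(subspace_basis(gallery_set, 4),
                       subspace_basis(probe_set, 4))
print(d)  # distance between the two 4-dimensional subspaces
```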

Relevance:

10.00%

Publisher:

Abstract:

FROM THE KCWS 2010 CHAIRS AND SUMMIT PROCEEDINGS EDITORS. ‘Knowledge’ is a resource which relies on the past for a better future. In the 21st century, more than ever before, cities around the world depend on the knowledge of their citizens, their institutions and their firms and enterprises. The knowledge image, the human competence and the reputation of its public and private institutions and corporations profile a city. This profile attracts investment, qualified labour and professionals, as well as students and researchers. And it creates local life spaces and professional milieus, which offer the quality of life to citizens seeking to cope with the challenges of modern life in a competitive world. Integrating knowledge-based development in urban strategies and policies, beyond the provision of schools and locations for higher education, has become a new and ambitious arena of city politics. Moving from theory to practice, bringing together the manifold knowledge stakeholders in a city and preparing joint visions for the knowledge city is a new challenge for city managers, urban planners and leaders of civil society. It requires visionary power, creativity, holistic thinking, the willingness to cooperate with all groups of the local civil society, and the capability to moderate communication processes in order to overcome conflicts and develop joint action for a sustainable future. This timely event, Melbourne 2010 – The Third Knowledge City World Summit, serves as an important reminder that ‘knowledge’ is the key notion in 21st-century development. With this notion in mind, the summit aims to shed light on the multi-faceted dimensions and various scales of building the ‘knowledge city’ and on ‘knowledge-based development’ paradigms. At this summit, the theoretical and practical maturing of knowledge-based development paradigms will be advanced through the interplay between the theories of the world’s leading academics and the practical models and strategies of practitioners and policy makers drawn from around the world. As chairs of the Melbourne 2010 Summit, we have compiled these proceedings in order to disseminate the knowledge generated and shared in Melbourne with the wider research, governance and practice communities. The papers in the proceedings reflect the broad range of contributions to the summit. They report on recent developments in planning and managing knowledge cities and ICT infrastructure; they assess the role of knowledge institutions in regional innovation systems and of the intellectual capital of cities and regions; they describe the evolution of knowledge-based approaches to urban development in differing cultural environments; and, finally, they bridge the discourse on the knowledge city to other urban development paradigms such as the creative city, the ubiquitous city and the compact city. The diversity of papers presented shows how scholars from different planning cultures around the world interpret the knowledge dimension in urban and regional development. All papers in these proceedings have gone through a double-blind peer review process and have been reviewed by our summit editorial review and advisory board members. We cordially thank the members of the Summit Proceedings Editorial Review and Advisory Board for their diligent work in reviewing the papers. We hope the papers in these proceedings will inspire and make a significant contribution to research, governance and practice circles.

Relevance:

10.00%

Publisher:

Abstract:

This thesis presents an empirical study of the effects of topology on cellular automata rule spaces. The classical definition of a cellular automaton is restricted to that of a regular lattice, often with periodic boundary conditions. This definition is extended to allow for arbitrary topologies. The dynamics of cellular automata within the triangular tessellation were analysed when transformed to 2-manifolds of topological genus 0, genus 1 and genus 2. Cellular automata dynamics were analysed from a statistical mechanics perspective. The sample sizes required to obtain accurate entropy calculations were determined by an entropy error analysis, which tracked the error in the computed entropy as the sample size increased. Each cellular automata rule space was sampled repeatedly, and the selected cellular automata were simulated over many thousands of trials for each topology. This resulted in an entropy distribution for each rule space. The computed entropy distributions are indicative of the distribution of cellular automata dynamical classes. By comparing these dynamical class distributions using the E-statistic, it was found that such topological changes alter the distributions. This is a significant result, which implies that both global structure and local dynamics play an important role in defining the long-term behaviour of cellular automata.
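
To illustrate the entropy measurement in a far simpler setting, here is a toy sketch using an elementary cellular automaton on a periodic one-dimensional lattice rather than the thesis's triangular tessellations of 2-manifolds; the rule number and sizes are arbitrary choices.

```python
import numpy as np

def step(state: np.ndarray, rule: int) -> np.ndarray:
    """One update of an elementary CA with periodic boundaries."""
    left, right = np.roll(state, 1), np.roll(state, -1)
    neighbourhood = 4 * left + 2 * state + right   # index into rule table
    table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    return table[neighbourhood]

def site_entropy(history: np.ndarray) -> float:
    """Mean Shannon entropy (bits) of the per-cell state distribution,
    a simple proxy for the CA's dynamical class."""
    p1 = history.mean(axis=0)
    p = np.clip(np.stack([1 - p1, p1]), 1e-12, 1.0)
    return float(np.mean(-(p * np.log2(p)).sum(axis=0)))

rng = np.random.default_rng(0)
state = rng.integers(0, 2, size=256).astype(np.uint8)
history = []
for _ in range(500):
    state = step(state, rule=110)   # rule 110: complex dynamics
    history.append(state)
print(site_entropy(np.array(history)))
```

Repeating such a measurement over many sampled rules yields the entropy distribution for a rule space; on a surface of different genus, the neighbourhood structure changes and, as the thesis reports, so does the distribution.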