61 results for Information Model
Abstract:
We address the problem of designing distributed algorithms for large scale networks that are robust to Byzantine faults. We consider a message passing, full information model: the adversary is malicious, controls a constant fraction of processors, and can view all messages in a round before sending out its own messages for that round. Furthermore, each bad processor may send an unlimited number of messages. The only constraint on the adversary is that it must choose its corrupt processors at the start, without knowledge of the processors’ private random bits.
A good quorum is a set of O(log n) processors, which contains a majority of good processors. In this paper, we give a synchronous algorithm which uses polylogarithmic time and Õ(√n) bits of communication per processor to bring all processors to agreement on a collection of n good quorums, solving Byzantine agreement as well. The collection is balanced in that no processor is in more than O(log n) quorums. This yields the first solution to Byzantine agreement which is both scalable and load-balanced in the full information model.
The technique, which involves going from a situation where slightly more than a 1/2 fraction of processors are good and agree on a short string with a constant fraction of random bits to a situation where all good processors agree on n good quorums, can be carried out in a fully asynchronous model as well, providing an approach for extending the Byzantine agreement result to that model.
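As an illustration of the guarantees claimed for the quorum collection above, the sketch below checks a proposed collection for the stated properties. It is not the paper's algorithm: the function name, the constant-factor parameters c_size and c_load, and the toy network at the end are our own illustrative assumptions.

```python
import math
from collections import Counter

def check_quorum_collection(quorums, good_processors, n, c_size=1.0, c_load=1.0):
    """Verify the two properties claimed for the output collection:
    each quorum has O(log n) members with a good majority, and no
    processor appears in more than O(log n) quorums (load balance)."""
    log_n = math.log2(n)
    load = Counter(p for q in quorums for p in q)
    for q in quorums:
        if len(q) > c_size * log_n:                       # size bound
            return False
        good = sum(1 for p in q if p in good_processors)  # good majority
        if good * 2 <= len(q):
            return False
    return all(count <= c_load * log_n for count in load.values())

# Toy usage: 16 processors, the 4 bad ones spread out, quorums of size log2(16) = 4.
good = set(range(16)) - {3, 7, 11, 15}
quorums = [[(i + j) % 16 for j in range(4)] for i in range(16)]
print(check_quorum_collection(quorums, good, n=16, c_size=2, c_load=2))  # True
```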
Abstract:
We study the fundamental Byzantine leader election problem in dynamic networks where the topology can change from round to round and nodes can also experience heavy {\em churn} (i.e., nodes can join and leave the network continuously over time). We assume the full information model where the Byzantine nodes have complete knowledge about the entire state of the network at every round (including random choices made by all the nodes), have unbounded computational power and can deviate arbitrarily from the protocol. The churn is controlled by an adversary that has complete knowledge and control over which nodes join and leave and at what times and also may rewire the topology in every round and has unlimited computational power, but is oblivious to the random choices made by the algorithm. Our main contribution is an $O(\log^3 n)$ round algorithm that achieves Byzantine leader election under the presence of up to $O({n}^{1/2 - \epsilon})$ Byzantine nodes (for a small constant $\epsilon > 0$) and a churn of up to $O(\sqrt{n}/\mathrm{poly}\log(n))$ nodes per round (where $n$ is the stable network size). The algorithm elects a leader with probability at least $1-n^{-\Omega(1)}$ and guarantees that it is an honest node with probability at least $1-n^{-\Omega(1)}$; assuming the algorithm succeeds, the leader's identity will be known to a $1-o(1)$ fraction of the honest nodes. Our algorithm is fully distributed, lightweight, and simple to implement. It is also scalable, as it runs in polylogarithmic (in $n$) time and requires nodes to send and receive messages of only polylogarithmic size per round. To the best of our knowledge, our algorithm is the first scalable solution for Byzantine leader election in a dynamic network with a high rate of churn; our protocol can also be used to solve Byzantine agreement in a straightforward way. We also show how to implement an (almost-everywhere) public coin with constant bias in a dynamic network with Byzantine nodes and provide a mechanism for enabling honest nodes to store information reliably in the network, which might be of independent interest.
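As a rough illustration of why a shared public coin helps here, the toy simulation below shows that once honest nodes hold common randomness, picking a node id uniformly at random yields an honest leader with probability about $1-n^{-1/2-\epsilon}$ when at most $n^{1/2-\epsilon}$ nodes are Byzantine. This is not the paper's protocol, which must also construct the coin and cope with churn and topology changes; all names and parameters below are hypothetical.

```python
import random

def elect_with_public_coin(n, byzantine, rng):
    """Toy model: given an (almost) unbiased public coin, honest nodes
    can pick a common node id uniformly at random as leader."""
    leader = rng.randrange(n)            # public randomness -> common leader
    return leader not in byzantine       # True if the elected leader is honest

def estimate_honest_leader_rate(n=10_000, eps=0.1, trials=2_000, seed=1):
    rng = random.Random(seed)
    num_byz = int(n ** (0.5 - eps))      # up to n^(1/2 - eps) Byzantine nodes
    byzantine = set(rng.sample(range(n), num_byz))
    honest = sum(elect_with_public_coin(n, byzantine, rng) for _ in range(trials))
    return honest / trials

print(estimate_honest_leader_rate())     # close to 1, roughly 1 - n^(-1/2 - eps)
```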
Abstract:
We propose a recursive method of pricing an information good in a network of holders and demanders of this good. The prices are determined via a unique equilibrium outcome in a sequence of bilateral bargaining games played by connected agents. If the information is a homogeneous, non-depreciating good without network effects, we derive explicit formulae which elucidate the role of the link pattern among the players. In particular, we find that the equilibrium price is intimately related to the existence of cycles in the network: it is zero if a cycle covers the trading pair, and proportional to the direct and indirect utility that the good generates otherwise.
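A hedged sketch of the stated pricing rule follows; the adjacency-list graph representation, the utility value, and the proportionality constant k are illustrative assumptions rather than the paper's notation. The price for a trade across a link is zero when that link lies on a cycle, and proportional to the buyer's direct and indirect utility otherwise.

```python
def on_cycle(adj, u, v):
    """An edge (u, v) of an undirected graph lies on a cycle iff v is still
    reachable from u once the edge itself is removed."""
    stack, seen = [u], {u}
    while stack:
        x = stack.pop()
        for y in adj[x]:
            if (x, y) in {(u, v), (v, u)}:   # skip the edge being traded over
                continue
            if y == v:
                return True
            if y not in seen:
                seen.add(y)
                stack.append(y)
    return False

def equilibrium_price(adj, seller, buyer, utility, k=0.5):
    """Zero if a cycle covers the trading pair, otherwise proportional
    (assumed factor k) to the direct and indirect utility of the good."""
    return 0.0 if on_cycle(adj, seller, buyer) else k * utility

# Toy network: 0-1-2-0 forms a cycle, node 3 hangs off node 2 over a bridge.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(equilibrium_price(adj, 0, 1, utility=10.0))  # 0.0: pair covered by a cycle
print(equilibrium_price(adj, 2, 3, utility=10.0))  # 5.0: bridge link, positive price
```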
Abstract:
The cerebral cortex contains circuitry for continuously computing properties of the environment and one's body, as well as relations among those properties. The success of complex perceptuomotor performances requires integrated, simultaneous use of such relational information. Ball catching is a good example as it involves reaching and grasping of visually pursued objects that move relative to the catcher. Although integrated neural control of catching has received sparse attention in the neuroscience literature, behavioral observations have led to the identification of control principles that may be embodied in the involved neural circuits. Here, we report a catching experiment that refines those principles via a novel manipulation. Visual field motion was used to perturb velocity information about balls traveling on various trajectories relative to a seated catcher, with various initial hand positions. The experiment produced evidence for a continuous, prospective catching strategy, in which hand movements are planned based on gaze-centered ball velocity and ball position information. Such a strategy was implemented in a new neural model, which suggests how position, velocity, and temporal information streams combine to shape catching movements. The model accurately reproduces the main and interaction effects found in the behavioral experiment and provides an interpretation of recently observed target motion-related activity in the motor cortex during interceptive reaching by monkeys. It functionally interprets a broad range of neurobiological and behavioral data, and thus contributes to a unified theory of the neural control of reaching to stationary and moving targets.
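A minimal sketch of a continuous, prospective control law of the kind described can be given as follows; the linear form, the gains, and the time step are our illustrative assumptions, not the paper's neural model. The commanded hand acceleration combines a gaze-centred ball position term with a ball-velocity term, so the hand is continuously steered toward where the ball is heading.

```python
import numpy as np

def prospective_hand_update(hand_pos, hand_vel, ball_pos, ball_vel,
                            k_p=8.0, k_v=2.0, dt=0.01):
    """One integration step of a toy continuous control law: the commanded
    hand acceleration combines a gaze-centred position-error term with a
    ball-velocity term (gains k_p, k_v are assumed, not fitted)."""
    accel = k_p * (ball_pos - hand_pos) + k_v * ball_vel
    hand_vel = hand_vel + accel * dt
    hand_pos = hand_pos + hand_vel * dt
    return hand_pos, hand_vel

# Toy run: ball drifting across the gaze-centred frame for one second.
hand_pos, hand_vel = np.zeros(2), np.zeros(2)
ball_pos, ball_vel = np.array([0.3, 0.5]), np.array([0.4, -0.2])
for _ in range(100):
    ball_pos = ball_pos + ball_vel * 0.01
    hand_pos, hand_vel = prospective_hand_update(hand_pos, hand_vel,
                                                 ball_pos, ball_vel)
print(np.round(hand_pos, 3), np.round(ball_pos, 3))  # hand converges on the ball
```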
Abstract:
Estimating a time interval and temporally coordinating movements in space are fundamental skills, but the relationships between these different forms of timing, and the neural processes that they incur, are not well understood. While different theories have been proposed to account for time perception, time estimation, and the temporal patterns of coordination, there are no general mechanisms which unify these various timing skills. This study considers whether a model of perceptuo-motor timing, the tau-GUIDE, can also describe how certain judgements of elapsed time are made. To evaluate this, an equation for determining interval estimates was derived from the tau-GUIDE model and tested in a task where participants had to throw a ball and estimate when it would hit the floor. The results showed that in accordance with the model, very accurate judgements could be made without vision (mean timing error -19.24 msec), and the model was a good predictor of skilled participants' estimate timing. It was concluded that since the tau-GUIDE principle provides temporal information in a generic form, it could be a unitary process that links different forms of timing.
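The general tau idea behind the model can be illustrated with the commonly cited intrinsic tau-guide form; this form is used here as an assumption, and the paper's specific interval-estimation equation is not reproduced.

```python
def tau_guide(t, T):
    """Intrinsic tau-guide (commonly cited form): the tau of an internally
    generated gap that closes with constant acceleration from rest over a
    duration T,
        tau_g(t) = (t**2 - T**2) / (2 * t),   0 < t <= T.
    It reaches zero exactly at t = T, so running the guide to closure marks
    the end of an interval of length T; the estimator actually derived in
    the paper is not reproduced here."""
    return (t * t - T * T) / (2.0 * t)

# The guide's tau shrinks to zero at the end of a 0.6 s interval:
for t in (0.15, 0.3, 0.45, 0.6):
    print(t, round(tau_guide(t, T=0.6), 3))
```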
Abstract:
Fifty-two CFLP mice had an open femoral diaphyseal osteotomy held in compression by a four-pin external fixator. The movement of 34 of the mice in their cages was quantified before and after operation, until sacrifice at 4, 8, 16 or 24 days. Thirty-three specimens underwent histomorphometric analysis and 19 specimens underwent torsional stiffness measurement. The expected combination of intramembranous and endochondral bone formation was observed, and the model was shown to be reliable in that variation in the histological parameters of healing was small between animals at the same time point, compared to the variation between time-points. There was surprisingly large individual variation in the amount of animal movement about the cage, which correlated with both histomorphometric and mechanical measures of healing. Animals that moved more had larger external calluses containing more cartilage and demonstrated lower torsional stiffness at the same time point. Assuming that movement of the whole animal predicts, at least to some extent, movement at the fracture site, this correlation is what would be expected in a model that involves similar processes to those in human fracture healing. Models such as this, employed to determine the effect of experimental interventions, will yield more information if the natural variation in animal motion is measured and included in the analysis.
Abstract:
An analogy is established between the syntagm and paradigm from Saussurean linguistics and the message and messages for selection from the information theory initiated by Claude Shannon. The analogy is pursued both as an end in itself and for its analytic value in understanding patterns of retrieval from full text systems. The multivalency of individual words when isolated from their syntagm is contrasted with the relative stability of meaning of multi-word sequences, when searching ordinary written discourse. The syntagm is understood as the linear sequence of oral and written language. Saussure's understanding of the word, as a unit which compels recognition by the mind, is endorsed, although not regarded as final. The lesser multivalency of multi-word sequences is understood as the greater determination of signification by the extended syntagm. The paradigm is primarily understood as the network of associations a word acquires when considered apart from the syntagm. The restriction of information theory to expression or signals, and its focus on the combinatorial aspects of the message, is sustained. The message in the model of communication in information theory can include sequences of written language. Shannon's understanding of the written word, as a cohesive group of letters, with strong internal statistical influences, is added to the Saussurean conception. Sequences of more than one word are regarded as weakly correlated concatenations of cohesive units.
Abstract:
This paper provides a summary of our studies on robust speech recognition based on a new statistical approach – the probabilistic union model. We consider speech recognition given that part of the acoustic features may be corrupted by noise. The union model is a method for basing the recognition on the clean part of the features, thereby reducing the effect of the noise on recognition. In this respect, the union model is similar to the missing feature method. However, the two methods achieve this end through different routes. The missing feature method usually requires the identity of the noisy data for noise removal, while the union model combines the local features based on the union of random events, to reduce the dependence of the model on information about the noise. We previously investigated the applications of the union model to speech recognition involving unknown partial corruption in frequency band, in time duration, and in feature streams. Additionally, a combination of the union model with conventional noise-reduction techniques was studied, as a means of dealing with a mixture of known or trainable noise and unknown unexpected noise. In this paper, a unified review of each of these applications is provided, in the context of dealing with unknown partial feature corruption, giving the appropriate theory and implementation algorithms, along with an experimental evaluation.
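The core combination step can be sketched as follows; this is an illustration of the union-of-random-events idea with assumed per-stream log-likelihoods and without the normalisation details, not the authors' implementation. With N feature streams of which up to M may be corrupted, an order-M union sums, over every subset of N − M streams, the joint likelihood of that subset, so no single corrupted stream can veto the score.

```python
from itertools import combinations
from math import exp, log

def union_model_score(stream_loglikes, order):
    """Union-style combination (sketch): with N streams and union order
    M = `order`, sum the joint likelihoods of all subsets of N - M streams,
    so recognition can rest on the clean streams alone."""
    n = len(stream_loglikes)
    subsets = combinations(range(n), n - order)
    total = sum(exp(sum(stream_loglikes[i] for i in s)) for s in subsets)
    return log(total)

# Four sub-band log-likelihoods for one model; band 3 is badly corrupted.
loglikes = [-2.1, -1.8, -2.4, -25.0]
print(round(union_model_score(loglikes, order=0), 2))  # plain product: dominated by the bad band
print(round(union_model_score(loglikes, order=1), 2))  # order-1 union: corrupted band tolerated
```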
Managing expectations and benefits: a model for electronic trading and EDI in the insurance industry
Abstract:
Model Driven Architecture supports the transformation from reusable models to executable software. Business representations, however, cannot be fully and explicitly represented in such models for direct transformation into running systems. Thus, once business needs change, the language abstractions used by MDA (e.g. Object Constraint Language / Action Semantics), being low level, have to be edited directly. We therefore describe an Agent-oriented Model Driven Architecture (AMDA) that uses a set of business models under continuous maintenance by business people, reflecting the current business needs and being associated with adaptive agents that interpret the captured knowledge to behave dynamically. Three contributions of the AMDA approach are identified: 1) to Agent-oriented Software Engineering, a method of building adaptive Multi-Agent Systems; 2) to MDA, a means of abstracting high level business-oriented models to align executable systems with their requirements at runtime; 3) to distributed systems, the interoperability of disparate components and services via the agent abstraction.
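To make the runtime-interpretation idea concrete, here is a minimal sketch; the rule format, field names, and PricingAgent class are hypothetical, not AMDA's actual metamodel or agent platform. A business rule is kept as data that business people can edit, and an agent interprets the current version on every request instead of relying on regenerated code.

```python
# Business knowledge kept as an editable model rather than compiled code.
business_model = {
    "discount_rules": [
        {"if_total_at_least": 1000.0, "discount": 0.10},
        {"if_total_at_least": 500.0,  "discount": 0.05},
    ]
}

class PricingAgent:
    """Adaptive agent: interprets whatever the business model currently
    says, so editing the model changes behaviour without redeployment."""
    def __init__(self, model):
        self.model = model

    def quote(self, order_total):
        for rule in sorted(self.model["discount_rules"],
                           key=lambda r: -r["if_total_at_least"]):
            if order_total >= rule["if_total_at_least"]:
                return order_total * (1.0 - rule["discount"])
        return order_total

agent = PricingAgent(business_model)
print(agent.quote(1200.0))                               # 1080.0 under the current model
business_model["discount_rules"][0]["discount"] = 0.20   # business people edit the model
print(agent.quote(1200.0))                               # 960.0: behaviour adapts at runtime
```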
Abstract:
This paper discusses the approaches and techniques used to build a realistic numerical model to analyse the cooling phase of the injection moulding process. The procedures employed to select an appropriate mesh and the boundary and initial conditions for the problem are discussed and justified. The final model is validated using direct comparisons with experimental results generated in an earlier study. The model is shown to be a useful tool for further studies aimed at optimising the cooling phase of the injection moulding process. Using the numerical model provides additional information relating to changes in conditions throughout the process, which otherwise could not be deduced or assessed experimentally. These results, and other benefits related to the use of the model, are also discussed in the paper.
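As an illustration of the kind of numerical cooling model discussed, here is a generic 1D explicit finite-difference conduction sketch; the slab geometry, material diffusivity, mesh, and boundary values are assumed for illustration and are not the paper's validated mould model.

```python
import numpy as np

def cool_slab(thickness=2e-3, t_melt=230.0, t_mould=50.0,
              alpha=9e-8, dt=1e-4, duration=5.0, nodes=21):
    """Explicit 1D finite-difference cooling of a polymer slab between mould
    walls held at t_mould. alpha is thermal diffusivity (m^2/s); all values
    are illustrative, not the paper's material data or mesh."""
    dx = thickness / (nodes - 1)
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme stability limit"
    T = np.full(nodes, t_melt)
    T[0] = T[-1] = t_mould                        # mould-wall boundary condition
    for _ in range(int(duration / dt)):
        T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])
    return T

T = cool_slab()
print(round(T[len(T) // 2], 1))   # centreline temperature after 5 s of cooling
```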