Abstract:
Large scale distributed data stores rely on optimistic replication to scale and remain highly available in the face of network partitions. Managing data without coordination results in eventually consistent data stores that allow for concurrent data updates. These systems often use anti-entropy mechanisms (like Merkle Trees) to detect and repair divergent data versions across nodes. However, in practice hash-based data structures are too expensive for large amounts of data and create too many false conflicts. Another aspect of eventual consistency is detecting write conflicts. Logical clocks are often used to track data causality, necessary to detect causally concurrent writes on the same key. However, there is a non-negligible metadata overhead per key, which also keeps growing with time, proportionally to the node churn rate. Another challenge is deleting keys while respecting causality: while the values can be deleted, per-key metadata cannot be permanently removed without coordination. We introduce a new causality management framework for eventually consistent data stores that leverages node logical clocks (Bitmapped Version Vectors) and a new key logical clock (Dotted Causal Container) to provide advantages on multiple fronts: 1) a new efficient and lightweight anti-entropy mechanism; 2) greatly reduced per-key causality metadata size; 3) accurate key deletes without permanent metadata.
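As a rough illustration of the causality check such systems rely on, the sketch below uses plain per-key version vectors (an illustrative assumption, not the Bitmapped Version Vectors or Dotted Causal Containers introduced here) to show how comparing logical clocks distinguishes causally ordered writes from concurrent ones:

    # Minimal sketch: plain per-key version vectors, illustrating how
    # logical clocks detect causally concurrent writes on the same key.

    def compare(vv_a, vv_b):
        """Return 'a<=b', 'b<=a', 'equal' or 'concurrent' for two version
        vectors given as dicts mapping node id -> counter."""
        nodes = set(vv_a) | set(vv_b)
        a_le_b = all(vv_a.get(n, 0) <= vv_b.get(n, 0) for n in nodes)
        b_le_a = all(vv_b.get(n, 0) <= vv_a.get(n, 0) for n in nodes)
        if a_le_b and b_le_a:
            return "equal"
        if a_le_b:
            return "a<=b"
        if b_le_a:
            return "b<=a"
        return "concurrent"  # neither dominates: a write conflict

    # Two replicas updated the same key without seeing each other's write:
    print(compare({"n1": 2, "n2": 0}, {"n1": 1, "n2": 1}))  # -> concurrent

Note that this plain per-key representation grows with the number of nodes that ever updated the key, which is precisely the per-key metadata overhead the node logical clocks proposed here aim to reduce.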
Abstract:
The development of ubiquitous computing (ubicomp) environments raises several challenges in terms of their evaluation. Ubicomp virtual reality prototyping tools enable users to experience the system to be developed and are of great help in facing those challenges, as they support developers in assessing the consequences of a design decision in the early phases of development. Given the situated nature of ubicomp environments, a particular issue to consider is the level of realism provided by the prototypes. This work presents a case study where two ubicomp prototypes, featuring different levels of immersion (desktop-based versus CAVE-based), were developed and compared. The goal was to determine the cost/benefit relation of both solutions, which solution provided better user experience results, and whether or not simpler solutions provide the same user experience results as more elaborate ones.
Abstract:
Model finders are very popular for exploring scenarios, helping users validate specifications by navigating through conforming model instances. To be practical, the semantics of such scenario exploration operations should be formally defined and, ideally, controlled by the users, so that they are able to quickly reach interesting scenarios. This paper explores the landscape of scenario exploration operations, by formalizing them with a relational model finder. Several scenario exploration operations provided by existing tools are formalized, and new ones are proposed, namely to allow the user to easily explore very similar (or different) scenarios, by attaching preferences to model elements. As a proof-of-concept, such operations were implemented in the popular Alloy Analyzer, further increasing its usefulness for (user-guided) scenario exploration.
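A hypothetical sketch of the idea behind preference-guided exploration (not the operations as implemented in the Alloy Analyzer; the scenario encoding and names below are assumptions) ranks candidate scenarios by a weighted distance to the current one, so that the next scenario shown is as similar, or as different, as possible:

    # Hypothetical sketch: scenarios as sets of atoms/tuples, with user
    # preferences expressed as per-element weights (higher = matters more).

    def distance(current, candidate, weights=None):
        weights = weights or {}
        return sum(weights.get(e, 1) for e in current ^ candidate)

    def next_scenario(current, candidates, similar=True, weights=None):
        key = lambda c: distance(current, c, weights)
        return min(candidates, key=key) if similar else max(candidates, key=key)

    current = {("File", "f1"), ("Dir", "d1"), ("parent", "f1", "d1")}
    candidates = [
        {("File", "f1"), ("Dir", "d1")},
        {("File", "f2"), ("Dir", "d2"), ("parent", "f2", "d2")},
    ]
    print(next_scenario(current, candidates, similar=True))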
Abstract:
Temporal logics targeting real-time systems are traditionally undecidable. Based on a restricted fragment of MTL-R, we propose a new approach for the runtime verification of hard real-time systems. The novelty of our technique is that it is based on incremental evaluation, allowing us to effectively treat duration properties (which play a crucial role in real-time systems). We describe the two levels of operation of our approach: offline simplification by quantifier removal techniques; and online evaluation of a three-valued interpretation for formulas of our fragment. Our experiments show the applicability of this mechanism as well as the validity of the provided complexity results.
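As a minimal sketch of incremental, three-valued online evaluation (the monitored duration property and the names below are illustrative assumptions, not the paper's exact fragment):

    # Three-valued incremental monitor for the duration property
    # "the accumulated time in which p holds during [0, horizon] is <= bound".

    from enum import Enum

    class Verdict(Enum):
        TRUE = 1      # property definitely satisfied
        FALSE = 0     # property definitely violated
        UNKNOWN = -1  # verdict still open, more of the trace is needed

    class DurationMonitor:
        def __init__(self, horizon, bound):
            self.horizon, self.bound = horizon, bound
            self.elapsed = 0.0   # time observed so far
            self.duration = 0.0  # accumulated time in which p held

        def step(self, dt, p_holds):
            """Consume one sample: p held (or not) during the last dt time units."""
            self.elapsed += dt
            if p_holds:
                self.duration += dt
            if self.duration > self.bound:
                return Verdict.FALSE  # a violation can never be undone
            if self.elapsed >= self.horizon:
                return Verdict.TRUE   # window closed without violation
            return Verdict.UNKNOWN

    m = DurationMonitor(horizon=10.0, bound=3.0)
    for dt, p in [(2.0, False), (2.0, True), (2.0, True)]:
        print(m.step(dt, p))  # UNKNOWN, UNKNOWN, FALSE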
Abstract:
This paper introduces the metaphorism pattern of relational specification and addresses how specifications following this pattern can be refined into recursive programs. Metaphorisms express input-output relationships which preserve relevant information while at the same time some intended optimization takes place. Text processing, sorting, representation changers, etc., are examples of metaphorisms. The kind of metaphorism refinement proposed in this paper is a strategy known as change of virtual data structure. The paper gives sufficient conditions for such implementations to be calculated using relation algebra and illustrates the strategy with the derivation of quicksort as an example.
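A small functional rendering of the general idea (an informal sketch, not the paper's relational calculation) is quicksort obtained from "build a binary search tree, then flatten it", where the tree is the virtual data structure that the fused program never materializes:

    # Quicksort via a virtual binary search tree: the composition
    # flatten(build_bst(xs)) is fused so the tree is never built.

    def build_bst(xs):
        if not xs:
            return None
        pivot, rest = xs[0], xs[1:]
        return (build_bst([x for x in rest if x < pivot]),
                pivot,
                build_bst([x for x in rest if x >= pivot]))

    def flatten(t):
        if t is None:
            return []
        left, x, right = t
        return flatten(left) + [x] + flatten(right)

    def quicksort(xs):
        # Fusing flatten . build_bst removes (deforests) the intermediate tree:
        if not xs:
            return []
        pivot, rest = xs[0], xs[1:]
        return (quicksort([x for x in rest if x < pivot]) + [pivot] +
                quicksort([x for x in rest if x >= pivot]))

    assert quicksort([3, 1, 2]) == flatten(build_bst([3, 1, 2])) == [1, 2, 3]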
Abstract:
Doctoral thesis in public legal sciences
Abstract:
Body and brain undergo several changes with aging. One of the domains in which these changes are most remarkable is cognitive performance. In the present work, electroencephalogram (EEG) markers (power spectral density and spectral coherence) of age-related cognitive decline were sought while the subjects performed the Wisconsin Card Sorting Test (WCST). Considering the expected age-related cognitive deficits, the WCST was applied to young, mid-age and elderly participants, and the theta and alpha frequency bands were analyzed. From the results herein presented, higher theta and alpha power were found to be associated with good performance in the WCST by younger subjects. Higher theta and alpha coherence were also associated with good performance and were shown to decline with age; moreover, a decrease in alpha peak frequency appears to be associated with aging. Additionally, inter-hemispheric long-range coherences and parietal theta power were identified as age-independent EEG correlates of cognitive performance. In summary, these data reveal age-dependent as well as age-independent EEG correlates of cognitive performance that contribute to the understanding of brain aging and related cognitive deficits.
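For illustration, the two EEG markers mentioned can be computed along the following lines (a minimal sketch using SciPy on synthetic signals, assuming common band limits of roughly 4-8 Hz for theta and 8-13 Hz for alpha; it does not reproduce the study's preprocessing or exact band definitions):

    # Welch power spectral density and magnitude-squared coherence for two
    # synthetic "channels", averaged over the theta and alpha bands.

    import numpy as np
    from scipy.signal import welch, coherence

    fs = 256                                  # sampling rate in Hz (assumed)
    t = np.arange(0, 10, 1 / fs)
    ch1 = np.sin(2 * np.pi * 6 * t) + 0.5 * np.random.randn(t.size)  # theta-range tone
    ch2 = np.sin(2 * np.pi * 6 * t) + 0.5 * np.random.randn(t.size)

    def band_mean(freqs, values, lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return values[mask].mean()

    f, pxx = welch(ch1, fs=fs, nperseg=2 * fs)            # power spectral density
    f_c, cxy = coherence(ch1, ch2, fs=fs, nperseg=2 * fs)

    print("theta power:", band_mean(f, pxx, 4, 8))
    print("alpha power:", band_mean(f, pxx, 8, 13))
    print("theta coherence:", band_mean(f_c, cxy, 4, 8))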