996 results for Conflict detection


Relevance:

60.00%

Abstract:

This work presents exact, hybrid algorithms for mixed resource Allocation and Scheduling problems; in general terms, these consist in assigning finite-capacity resources over time to a set of precedence-connected activities. The proposed methods have broad applicability, but are mainly motivated by applications in the field of Embedded System Design. In particular, high-performance embedded computing has recently witnessed the shift from single-CPU platforms with application-specific accelerators to programmable Multi-Processor Systems-on-Chip (MPSoCs). These allow higher flexibility, real-time performance and low energy consumption, but the programmer must be able to effectively exploit the platform parallelism. This raises interest in the development of algorithmic techniques to be embedded in CAD tools; in particular, given a specific application and platform, the objective is to perform optimal allocation of hardware resources and to compute an execution schedule. In this regard, since embedded systems tend to run the same set of applications for their entire lifetime, off-line, exact optimization approaches are particularly appealing. Quite surprisingly, the use of exact algorithms has not been well investigated so far; this is in part explained by the complexity of integrated allocation and scheduling, which sets tough challenges for "pure" combinatorial methods. The use of hybrid CP/OR approaches presents the opportunity to exploit the mutual advantages of different methods, while compensating for their weaknesses. In this work, we first consider an Allocation and Scheduling problem over the Cell BE processor by Sony, IBM and Toshiba; we propose three different solution methods, leveraging decomposition, cut generation and heuristic-guided search. Next, we face Allocation and Scheduling of so-called Conditional Task Graphs, explicitly accounting for branches whose outcome is not known at design time; we extend the CP scheduling framework to effectively deal with the introduced stochastic elements. Finally, we address Allocation and Scheduling with uncertain, bounded execution times via conflict-based tree search; we introduce a simple and flexible time model to take duration variability into account and provide an efficient conflict detection method. The proposed approaches achieve good results on practical-size problems, thus demonstrating that the use of exact approaches for system design is feasible. Furthermore, the developed techniques bring significant contributions to combinatorial optimization methods.
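To make the flavour of such a conflict test concrete, here is a minimal Python sketch (an illustration only, not the thesis's actual method): two activities competing for the same unary resource are flagged as conflicting when neither can safely precede the other even under worst-case durations. The field names est (earliest start), lst (latest start) and dmax (maximum duration) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    est: int   # earliest start time
    lst: int   # latest start time
    dmax: int  # maximum (worst-case) duration

def can_precede(a: Activity, b: Activity) -> bool:
    # a can safely run before b if, even with a's worst-case duration,
    # b can still start no later than its latest start time
    return a.est + a.dmax <= b.lst

def in_conflict(a: Activity, b: Activity) -> bool:
    # two activities competing for the same unary resource are in
    # unavoidable conflict when neither ordering is feasible
    return not can_precede(a, b) and not can_precede(b, a)

# example: overlapping time windows with long worst-case durations
print(in_conflict(Activity(0, 5, 10), Activity(2, 6, 10)))  # True
```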

Relevance:

60.00%

Abstract:

The Operator Choice Model (OCM) was developed to model the behaviour of operators attending to complex tasks involving interdependent concurrent activities, such as in Air Traffic Control (ATC). The purpose of the OCM is to provide a flexible framework for modelling and simulation that can be used for quantitative analyses in human reliability assessment, comparison between human-computer interaction (HCI) designs, and analysis of operator workload. The OCM virtual operator is essentially a cycle of four processes: Scan, Classify, Decide Action, and Perform Action. Once a cycle is complete, the operator returns to the Scan process. It is also possible to truncate a cycle and return to Scan after each of the processes. These processes are described using Continuous Time Probabilistic Automata (CTPA). The details of the probability and timing models are specific to the domain of application, and need to be specified with the help of domain experts. We are building an application of the OCM for use in ATC. In order to develop a realistic model, we are calibrating the probability and timing models that comprise each process using experimental data from a series of experiments conducted with student subjects. These experiments have identified the factors that influence perception and decision making in simplified conflict detection and resolution tasks. This paper presents an application of the OCM approach to a simple ATC conflict detection experiment. The aim is to calibrate the OCM so that its behaviour resembles that of the experimental subjects when it is challenged with the same task. Its behaviour should also interpolate when challenged with scenarios similar to those used to calibrate it. The approach illustrated here uses logistic regression to model the classifications made by the subjects. This model is fitted to the calibration data, and provides an extrapolation to classifications in scenarios outside of the calibration data. A simple strategy is used to calibrate the timing component of the model, and the results for reaction times are compared between the OCM and the student subjects. While this approach to timing does not capture the full complexity of the reaction time distribution seen in the data from the student subjects, the mean and the tail of the distributions are similar.
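As a rough illustration of the cycle described above (not the calibrated OCM itself; the transition probabilities and mean dwell times below are placeholder values), here is a Python sketch of the Scan / Classify / Decide Action / Perform Action loop with probabilistic truncation back to Scan:

```python
import random

# Minimal sketch (illustrative only) of the OCM operator cycle:
# Scan -> Classify -> Decide Action -> Perform Action, with a chance of
# truncating back to Scan after each process. The probabilities and
# mean durations below are hypothetical placeholders, not values
# calibrated from the experiments.
PROCESSES = ["Scan", "Classify", "Decide Action", "Perform Action"]
P_CONTINUE = {"Scan": 0.9, "Classify": 0.8, "Decide Action": 0.85, "Perform Action": 1.0}
MEAN_DURATION = {"Scan": 1.0, "Classify": 0.5, "Decide Action": 0.7, "Perform Action": 0.4}

def run_cycle() -> float:
    """Simulate one pass through the cycle; return elapsed (simulated) time."""
    t = 0.0
    for process in PROCESSES:
        t += random.expovariate(1.0 / MEAN_DURATION[process])  # continuous-time dwell
        if random.random() > P_CONTINUE[process]:
            break  # truncate the cycle and return to Scan
    return t

print(sum(run_cycle() for _ in range(1000)) / 1000)  # mean cycle time
```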

Relevance:

60.00%

Abstract:

The aim of this study was to examine the way Australian air traffic controllers manage their airspace. Fourteen controllers with between 7 and 30 years of experience were sampled from the Brisbane air traffic control centre. All had previously been endorsed for en route radar sectors. Five static pictures varying in workload level (low, medium and high) were presented to participants. Controllers were asked to work through the scenarios and describe aloud how they would resolve any potential conflicts between the aircraft. Following this, controllers were asked a set of probe questions based on the critical decision method, to extract further information about the way they manage their airspace. A content analysis was used to assess patterns in the way controllers scan, the strategies used in conflict detection and conflict resolution, and the effect of workload on strategy choice. Findings revealed that controllers use specific strategies (such as working in a left-to-right scan or prioritising levels) when managing their airspace. Further analyses are still planned; however, a model based on the processes controllers used to resolve conflicts has been developed and will be presented as a summary of the results.

Relevance:

60.00%

Abstract:

This chapter explores ways in which rigorous mathematical techniques, termed formal methods, can be employed to improve the predictability and dependability of autonomic computing. Model checking, formal specification, and quantitative verification are presented in the contexts of conflict detection in autonomic computing policies, and of the implementation of goal and utility-function policies in autonomic IT systems, respectively. Each of these techniques is illustrated using a detailed case study, and analysed to establish its merits and limitations. The analysis is then used as a basis for discussing the challenges and opportunities of this endeavour to transition the development of autonomic IT systems from the current practice of using ad hoc methods and heuristics towards a more principled approach. © 2012, IGI Global.
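For intuition only, here is a tiny Python sketch of the kind of policy-conflict check a model checker automates: two condition-action policies conflict if some reachable state satisfies both conditions while the prescribed actions are incompatible. The state space, policies and action names are hypothetical and unrelated to the chapter's case studies.

```python
from itertools import product

# Illustrative sketch only: brute-force conflict detection between two
# condition-action policies over a small finite state space. A conflict
# is flagged when some state triggers both policies but they prescribe
# incompatible actions. All policy contents here are hypothetical.
STATES = [{"load": load, "power": power}
          for load, power in product(("low", "high"), ("ok", "over_budget"))]

policy_performance = {"condition": lambda s: s["load"] == "high",
                      "action": "add_server"}
policy_power = {"condition": lambda s: s["power"] == "over_budget",
                "action": "remove_server"}
INCOMPATIBLE = {("add_server", "remove_server"), ("remove_server", "add_server")}

def find_conflicts(p1, p2):
    return [s for s in STATES
            if p1["condition"](s) and p2["condition"](s)
            and (p1["action"], p2["action"]) in INCOMPATIBLE]

print(find_conflicts(policy_performance, policy_power))
# -> [{'load': 'high', 'power': 'over_budget'}]
```

A model checker performs essentially this exhaustive exploration, but over far larger, symbolically represented state spaces and richer temporal properties.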

Relevance:

60.00%

Abstract:

Large read-only or read-write transactions with a large read set and a small write set constitute an important class of transactions used in such applications as data mining, data warehousing, statistical applications, and report generators. Such transactions are best supported with optimistic concurrency, because locking large amounts of data for extended periods of time is not an acceptable solution. The abort rate in regular optimistic concurrency algorithms increases exponentially with the size of the transaction. The algorithm proposed in this dissertation solves this problem by using a new transaction scheduling technique that allows a large transaction to commit safely with a probability that can exceed that of regular optimistic concurrency algorithms by several orders of magnitude. A performance simulation study and a formal proof of serializability and external consistency of the proposed algorithm are also presented. This dissertation also proposes a new query optimization technique (lazy queries). Lazy Queries is an adaptive query execution scheme which optimizes itself as the query runs. Lazy queries can be used to find an intersection of sub-queries in a very efficient way that requires neither full execution of large sub-queries nor any statistical knowledge about the data. An efficient optimistic concurrency control algorithm used in a massively parallel B-tree with variable-length keys is introduced. B-trees with variable-length keys can be used effectively in a variety of database types. In particular, we show how such a B-tree was used in our implementation of a semantic object-oriented DBMS. The concurrency control algorithm uses semantically safe optimistic virtual "locks" that achieve very fine granularity in conflict detection. This algorithm ensures serializability and external consistency by using logical clocks and backward validation of transactional queries. A formal proof of correctness of the proposed algorithm is also presented.
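As background, here is a minimal Python sketch of textbook backward validation for optimistic concurrency (the generic scheme such algorithms build on, not the dissertation's new scheduling technique): at commit time, a transaction is aborted if its read set overlaps the write set of any transaction that committed after it started.

```python
# Illustrative sketch of classic backward validation for optimistic
# concurrency control (a textbook scheme, not the dissertation's
# specific algorithm).
committed = []  # list of (commit_timestamp, write_set)
clock = 0

def try_commit(start_ts: int, read_set: set, write_set: set) -> bool:
    global clock
    for commit_ts, other_writes in committed:
        if commit_ts > start_ts and read_set & other_writes:
            return False  # conflict detected: validation fails, abort
    clock += 1
    committed.append((clock, frozenset(write_set)))
    return True

# usage: both transactions start at timestamp 0; T2 commits a write to "x"
# first, so T1, which read "x", fails validation and must abort
print(try_commit(0, set(), {"x"}))   # True  (T2 commits)
print(try_commit(0, {"x"}, {"y"}))   # False (T1 aborts)
```

The sketch also shows why large read sets hurt plain optimistic schemes: the larger the read set, the more likely it intersects some concurrent writer, which is exactly the abort-rate problem the abstract describes.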

Relevance:

60.00%

Abstract:

In the multi-core CPU world, transactional memory (TM) has emerged as an alternative to lock-based programming for thread synchronization. Recent research proposes the use of TM in GPU architectures, where a high number of computing threads, organized in SIMT fashion, requires an effective synchronization method. In contrast to CPUs, GPUs offer two memory spaces: global memory and local memory. The local memory space serves as a shared scratch-pad for a subset of the computing threads, and it is used by programmers to speed up their applications thanks to its low latency. Prior work from the authors proposed a lightweight hardware TM (HTM) support based on the local memory, modifying the SIMT execution model and adding a conflict detection mechanism. An efficient implementation of these features is key to providing an effective synchronization mechanism at the local memory level. After a brief description of the main features of our HTM design for GPU local memory, in this work we gather a number of proposals aimed at improving those mechanisms with a high impact on performance. Firstly, the SIMT execution model is modified to increase the parallelism of the application when transactions must be serialized in order to make forward progress. Secondly, the conflict detection mechanism is optimized depending on application characteristics, such as the read/write sets, the probability of conflict between transactions and the existence of read-only transactions. As these features can be present in hardware simultaneously, it is the task of the compiler and runtime to determine which ones are more important for a given application. This work includes a discussion of the analysis to be done in order to choose the best configuration.
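For reference, the generic read/write-set criterion that underlies most TM conflict detection is sketched below in Python for clarity (the mechanism discussed above is implemented in hardware; this is not that design): two transactions conflict when one writes a location the other reads or writes.

```python
# Illustrative read/write-set conflict check between two transactions
# (the generic criterion behind most TM conflict detection schemes, not
# the specific hardware mechanism proposed by the authors).
def conflicts(read1: set, write1: set, read2: set, write2: set) -> bool:
    return bool(write1 & (read2 | write2)) or bool(write2 & read1)

# read-only transactions can never conflict with each other
print(conflicts({"a"}, set(), {"a"}, set()))   # False
# a write/read overlap on "a" is a conflict and forces serialization
print(conflicts({"a"}, {"a"}, {"a"}, set()))   # True
```

This is also why the read/write sets, the conflict probability and the presence of read-only transactions, mentioned above, are the natural knobs for tuning the detection mechanism.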

Relevance:

40.00%

Abstract:

Previous research with the ratio-bias task found larger response latencies for conflict trials, where the heuristic- and analytic-based responses are assumed to be in opposition (e.g., choosing between 1/10 and 9/100 ratios of success), than for no-conflict trials, where both processes converge on the same response (e.g., choosing between 1/10 and 11/100). This pattern is consistent with parallel dual-process models, which assume that there is effective, rather than lax, monitoring of the output of heuristic processing. It is, however, unclear why conflict resolution sometimes fails. Ratio-biased choices may increase because of a decline in analytical reasoning (leaving heuristic-based responses unopposed) or because of a rise in heuristic processing (making it more difficult for analytic processes to override the heuristic preferences). Using the process-dissociation procedure, we found that instructions to respond logically and response speed affected analytic (controlled) processing (C), leaving heuristic processing (H) unchanged, whereas the intuitive preference for large numerators (as assessed by responses to equal-ratio trials) affected H but not C. These findings raise new challenges in the debate between dual-process and single-process accounts, which are discussed.
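For readers unfamiliar with the process-dissociation procedure, the standard estimation equations are sketched below, on the assumption (not spelled out in the abstract) that no-conflict trials play the role of inclusion trials and conflict trials the role of exclusion trials:

\[
P(\text{correct} \mid \text{no-conflict}) = C + (1 - C)H,
\qquad
P(\text{heuristic-based choice} \mid \text{conflict}) = (1 - C)H,
\]
\[
\text{so that}\quad
C = P(\text{correct} \mid \text{no-conflict}) - P(\text{heuristic-based choice} \mid \text{conflict}),
\qquad
H = \frac{P(\text{heuristic-based choice} \mid \text{conflict})}{1 - C}.
\]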

Relevance:

30.00%

Abstract:

Fallibility is inherent in human cognition, and so a system that monitors performance is indispensable. While behavioral evidence for such a system derives from the finding that subjects slow down after trials that are likely to produce errors, the neural and behavioral characterization that enables such control is incomplete. Here, we report a specific role for dopamine/basal ganglia in response conflict by assessing deficits in performance monitoring in patients with Parkinson's disease. To characterize such a deficit, we used a modification of the oculomotor countermanding task to show that slowing down of responses that generate robust response conflict, and not post-error slowing per se, is deficient in Parkinson's disease patients. Poor performance adjustment could be due either to an impaired ability to slow RT after conflicts or to impaired recognition of response conflict. If the latter hypothesis were true, then PD subjects should show evidence of impaired error detection/correction, which was found to be the case. These results make a strong case for impaired performance monitoring in Parkinson's patients.

Relevance:

30.00%

Abstract:

Evolutionary conflicts among social hymenopteran nestmates are theoretically likely to arise over the production of males and the sex ratio. Analysis of these conflicts has become an important focus of research into the role of kin selection in shaping social traits of hymenopteran colonies. We employ microsatellite analysis of nestmates of one social hymenopteran, the primitively eusocial and monogynous bumblebee Bombus hypnorum, to evaluate these conflicts. In our 14 study colonies, B. hypnorum queens mated between one and six times (arithmetic mean 2.5). One male generally predominated, fathering most of the offspring; thus the effective number of matings was substantially lower (1–3.13; harmonic mean 1.26). In addition, microsatellite analysis allowed the detection of alien workers, i.e. workers that could not have been the offspring of the queen, in approximately half the colonies. Alien workers within the same colony were probably sisters. Polyandry and alien workers resulted in high variation among colonies in their sociogenetic organization. Genetic data were consistent with the view that all males (n = 233 examined) were produced by a colony's queen. Male parentage was therefore independent of the sociogenetic organization of the colony, suggesting that the queen, and not the workers, was in control of the laying of male-destined eggs. The population-wide sex ratio (fresh-weight investment ratio) was weakly female-biased. No evidence for colony-level adaptive sex ratio biasing could be detected.
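For context, a commonly used estimator of the effective number of matings in the social-insect literature (the abstract does not state which correction was applied in this study) is based on the paternity shares \(p_i\) of the \(k\) detected males:

\[
m_e = \frac{1}{\sum_{i=1}^{k} p_i^2},
\]

so a queen that mates several times but whose offspring are mostly sired by a single male has an effective mating frequency close to 1, consistent with the pattern reported above.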

Relevance:

30.00%

Abstract:

The study of alternative combination rules in Dempster–Shafer (DS) theory when evidence is in conflict has re-emerged recently as an interesting topic, especially in data/information fusion applications. These studies have mainly focused on investigating which alternative would be appropriate for which conflicting situation, under the assumption that a conflict has already been identified. The issue of detecting (or identifying) conflict among evidence has been ignored. In this paper, we formally define when two basic belief assignments are in conflict. This definition deploys quantitative measures of both the mass of the combined belief assigned to the empty set before normalization and the distance between betting commitments of beliefs. We argue that only when both measures are high is it safe to say the evidence is in conflict. This definition can serve as a prerequisite for selecting appropriate combination rules.
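In standard notation, the two quantities referred to above can be sketched as follows (for two basic belief assignments \(m_1\) and \(m_2\) on a frame \(\Omega\), using Smets' pignistic transformation; the thresholds for calling both measures "high" are left open here):

\[
m_\cap(\emptyset) = \sum_{A \cap B = \emptyset} m_1(A)\, m_2(B),
\qquad
\mathrm{BetP}_m(\omega) = \sum_{\substack{A \subseteq \Omega \\ \omega \in A}} \frac{m(A)}{|A|},
\]
\[
\mathrm{difBetP}(m_1, m_2) = \max_{A \subseteq \Omega} \bigl|\mathrm{BetP}_{m_1}(A) - \mathrm{BetP}_{m_2}(A)\bigr|,
\quad\text{where } \mathrm{BetP}_m(A) = \sum_{\omega \in A} \mathrm{BetP}_m(\omega).
\]

Only when both \(m_\cap(\emptyset)\) and \(\mathrm{difBetP}(m_1, m_2)\) are high is the evidence regarded as being in conflict.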

Relevance:

30.00%

Abstract:

When people evaluate syllogisms, their judgments of validity are often biased by the believability of the conclusions of the problems. Thus, it has been suggested that syllogistic reasoning performance is based on an interplay between a conscious and effortful evaluation of logicality and an intuitive appreciation of the believability of the conclusions (e.g., Evans, Newstead, Allen, & Pollard, 1994). However, logic effects in syllogistic reasoning emerge even when participants are unlikely to carry out a full logical analysis of the problems (e.g., Shynkaruk & Thompson, 2006). There is also evidence that people can implicitly detect the conflict between their beliefs and the validity of the problems, even if they are unable to consciously produce a logical response (e.g., De Neys, Moyens, & Vansteenwegen, 2010). In 4 experiments we demonstrate that people intuitively detect the logicality of syllogisms, and this effect emerges independently of participants' conscious mindset and their cognitive capacity. This logic effect is also unrelated to the superficial structure of the problems. Additionally, we provide evidence that the logicality of the syllogisms is detected through slight changes in participants' affective states. In fact, subliminal affective priming had an effect on participants' subjective evaluations of the problems. Finally, when participants misattributed their emotional reactions to background music, this significantly reduced the logic effect.

Relevance:

30.00%

Abstract:

When implementing autonomic management of multiple non-functional concerns, a trade-off must be found between the ability to develop management of the individual concerns independently (following the separation-of-concerns principle) and the detection and resolution of conflicts that may arise when combining the independently developed management code. Here we discuss strategies to establish this trade-off and introduce a model-checking-based methodology aimed at simplifying the discovery and handling of conflicts arising from the deployment, within the same parallel application, of independently developed management policies. Preliminary results are shown demonstrating the feasibility of the approach.

Relevance:

30.00%

Abstract:

Research on Blindsight, Neglect/Extinction and Phantom Limb syndromes, as well as electrical measurements of mammalian brain activity, has suggested that vivid perception depends on both incoming sensory information at the primary sensory cortex and reentrant information from the associative cortex. Coherence between incoming and reentrant signals seems to be a necessary condition for (conscious) perception. The general reticular activating system and local electrical synchronization are some of the tools used by the brain to establish coarse coherence at the sensory cortex, upon which biochemical processes are coordinated. Besides electrical synchrony and chemical modulation at the synapse, a central mechanism supporting such coherence is the N-methyl-D-aspartate channel, working as a 'coincidence detector': the incoming signal causes the depolarization necessary to remove Mg²⁺, and reentrant information releases the glutamate that finally prompts Ca²⁺ entry. We propose that a signal transduction pathway activated by Ca²⁺ entry into cortical neurons is in charge of triggering a quantum computational process that accelerates inter-neuronal communication, thus solving systemic conflict and supporting the unity of consciousness. © 2001 Elsevier Science Ltd.

Relevance:

30.00%

Abstract:

In this thesis, the main Executive Control theories are presented. Methods typical of Cognitive and Computational Neuroscience are introduced, and the role of behavioural tasks involving conflict resolution in the elaboration of a response after the presentation of a stimulus to the subject is highlighted. In particular, the Eriksen flanker task and its variants are discussed. Behavioural data from the scientific literature are illustrated in terms of response times and error rates. During these experimental behavioural tasks, EEG is recorded simultaneously, so that event-related potentials associated with the current task can be studied. Different theories regarding the relevant event-related potentials in this field, such as the N2, the fERN (feedback Error-Related Negativity) and the ERN (Error-Related Negativity), are introduced. The aim of this thesis is to understand and simulate processes regarding Executive Control, including performance improvement, error detection mechanisms, post-error adjustments and the role of selective attention, with the help of an original neural network model. The network described here has been built to simulate the behavioural results of a four-choice Eriksen flanker task. Model results show that the neural network can simulate response times, error rates and event-related potentials quite well. Finally, results are compared with behavioural data and discussed in light of the aforementioned Executive Control theories. Future perspectives for this new model are outlined.
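One widely used quantitative definition of response conflict in this modelling tradition (Botvinick et al.'s conflict-monitoring account; the thesis may well use a different measure) is the Hopfield energy of the response layer:

\[
E_{\text{conflict}} = -\sum_{i \neq j} w_{ij}\, a_i\, a_j ,
\]

where \(a_i\) are the activations of the response units and \(w_{ij}\) their (inhibitory) connection weights; with two mutually inhibitory units and \(w_{12} = w_{21} = -1\) this reduces to \(2\,a_1 a_2\), so conflict is high only when both responses are simultaneously active.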

Relevance:

30.00%

Abstract:

The ATLS program of the American College of Surgeons is probably the most important globally active training program dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

|                             | Class I          | Class II       | Class III             | Class IV              |
| --------------------------- | ---------------- | -------------- | --------------------- | --------------------- |
| Blood loss (ml)             | Up to 750        | 750–1500       | 1500–2000             | >2000                 |
| Blood loss (% blood volume) | Up to 15%        | 15–30%         | 30–40%                | >40%                  |
| Pulse rate (BPM)            | <100             | 100–120        | 120–140               | >140                  |
| Systolic blood pressure     | Normal           | Normal         | Decreased             | Decreased             |
| Pulse pressure              | Normal or ↑      | Decreased      | Decreased             | Decreased             |
| Respiratory rate            | 14–20            | 20–30          | 30–40                 | >35                   |
| Urine output (ml/h)         | >30              | 20–30          | 5–15                  | Negligible            |
| CNS/mental status           | Slightly anxious | Mildly anxious | Anxious, confused     | Confused, lethargic   |
| Initial fluid replacement   | Crystalloid      | Crystalloid    | Crystalloid and blood | Crystalloid and blood |

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily proceed in parallel, as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient.

A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II and only 1% as Class III and Class IV, respectively. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.

As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement of fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

|                                    | Normal (no haemorrhage) | Class I (mild)                 | Class II (moderate)            | Class III (severe)             | Class IV (moribund)                |
| ---------------------------------- | ----------------------- | ------------------------------ | ------------------------------ | ------------------------------ | ---------------------------------- |
| Vitals                             | Normal                  | Normal                         | HR > 100 with SBP > 90 mmHg    | SBP < 90 mmHg                  | SBP < 90 mmHg or imminent arrest   |
| Response to fluid bolus (1000 ml)  | NA                      | Yes, no further fluid required | Yes, no further fluid required | Requires repeated fluid boluses | Declining SBP despite fluid boluses |
| Estimated blood loss (ml)          | None                    | Up to 750                      | 750–1500                       | 1500–2000                      | >2000                              |

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first, while hypotension and bradycardia occur only later. An increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anesthetized patient in the OR. We will therefore continue to teach this typical pattern, but will also continue to mention the exceptions and pitfalls in a second step. The ATLS shock classification is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS refers to the pitfalls in the signs of acute haemorrhage as well: advanced age, athletes, pregnancy, medications and pacemakers; it explicitly states that individual subjects may not follow the general pattern.

Obviously the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching. The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also takes into account the pattern of injuries, the requirement for fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context.

Conflict of interest: None declared. Member of the Swiss national ATLS core faculty.