92 results for Logical necessity
Abstract:
We analyze the effect of a quantum error correcting code on the entanglement of encoded logical qubits in the presence of a dephasing interaction with a correlated environment. Such a correlated reservoir introduces entanglement between the physical qubits. We show that at short times the quantum error correction interprets this entanglement as errors and suppresses it. However, at longer times, although quantum error correction is no longer able to correct errors, it enhances the rate of entanglement production due to the interaction with the environment.
Abstract:
In this preliminary case study, we investigate how inconsistency in a network intrusion detection rule set can be measured. To achieve this, we first examine the structure of these rules, which incorporate regular expression (Regex) pattern matching. We then identify primitive elements in these rules in order to translate the rules into their (equivalent) logical forms and to establish connections between them. Additional rules from background knowledge are also introduced to make the correlations among rules more explicit. Finally, we measure the degree of inconsistency in formulae of such a rule set (using the Scoring function, Shapley inconsistency values and the Blame measure for prioritized knowledge) and compare the informativeness of these measures. We conclude that such measures are useful for the network intrusion domain, provided that incorporating domain knowledge to correlate rules is feasible.
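As a rough illustration of the translation step, here is a minimal sketch; the rule fragments, predicate names and the alert/pass conflict below are invented for illustration and are not taken from the studied rule set:

```python
import re

# Hypothetical Snort-like rules: (action, protocol, destination port, payload regex).
rules = [
    ("alert", "tcp", 80, re.compile(r"cmd\.exe")),
    ("pass",  "tcp", 80, re.compile(r"cmd\.exe")),   # deliberately conflicting with the first
]

def to_logical_form(rule):
    """Translate a rule into a crude (premises, conclusion) pair of atoms."""
    action, proto, dport, regex = rule
    premises = frozenset({f"proto_{proto}", f"dport_{dport}", f"payload_matches_{regex.pattern}"})
    return premises, action

def conflicting(r1, r2):
    """Two rules conflict if they fire on the same traffic but prescribe different actions."""
    p1, a1 = to_logical_form(r1)
    p2, a2 = to_logical_form(r2)
    return p1 == p2 and a1 != a2

print(conflicting(rules[0], rules[1]))   # True: identical premises, contradictory actions
```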
Abstract:
In this preliminary study, we investigate how inconsistency in a network intrusion detection rule set can be measured. To achieve this, we first examine the structure of these rules, which are based on Snort and incorporate regular expression (Regex) pattern matching. We then identify primitive elements in these rules in order to translate the rules into their (equivalent) logical forms and to establish connections between them. Additional rules from background knowledge are also introduced to make the correlations among rules more explicit. We measure the degree of inconsistency in formulae of such a rule set (using the Scoring function, Shapley inconsistency values and the Blame measure for prioritized knowledge) and compare the informativeness of these measures. Finally, we propose a new measure of inconsistency for prioritized knowledge which incorporates the normalized number of atoms in a language involved in inconsistency, to provide a deeper inspection of inconsistent formulae. We conclude that such measures are useful for the network intrusion domain, provided that introducing expert knowledge to correlate rules is feasible.
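The measures named above are all driven by which subsets of the rule base are jointly unsatisfiable. As a minimal sketch of that underlying step only (a brute-force enumeration over an invented three-formula knowledge base, not the paper's Scoring, Shapley or Blame computations):

```python
from itertools import combinations, product

# Tiny propositional knowledge base over atoms a, b (illustrative only).
# Each formula is encoded as a function from a truth assignment to a bool.
K = {
    "r1: a -> b": lambda v: (not v["a"]) or v["b"],
    "r2: a":      lambda v: v["a"],
    "r3: not b":  lambda v: not v["b"],
}
ATOMS = ["a", "b"]

def satisfiable(formulas):
    """Brute-force SAT check over all truth assignments of ATOMS."""
    for bits in product([False, True], repeat=len(ATOMS)):
        v = dict(zip(ATOMS, bits))
        if all(f(v) for f in formulas):
            return True
    return False

def minimal_inconsistent_subsets(kb):
    """All inconsistent subsets none of whose proper subsets are inconsistent."""
    names, mis = list(kb), []
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            if satisfiable([kb[n] for n in subset]):
                continue
            if any(set(m) <= set(subset) for m in mis):
                continue  # a smaller inconsistent subset already makes this one non-minimal
            mis.append(subset)
    return mis

mis = minimal_inconsistent_subsets(K)
print(mis)        # [('r1: a -> b', 'r2: a', 'r3: not b')]
print(len(mis))   # a crude MI-count style inconsistency degree
```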
Abstract:
Belief revision characterizes the process of revising an agent's beliefs when new evidence is received. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics; so far, however, there is not much literature on this topic in evidence theory. In contrast, the combination rules proposed so far in the theory of evidence, especially Dempster's rule, are symmetric. They rely on the basic assumption that the pieces of evidence being combined are on a par, i.e. play the same role. When one source of evidence is less reliable than another, it is possible to discount it and then still use a symmetric combination operation. In the case of revision, the idea is to let the prior knowledge of an agent be altered by some input information. The change problem is thus intrinsically asymmetric: assuming the input information is reliable, it should be retained, whilst the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in such a way as to bring together the probabilistic and logical views. Several previously proposed revision rules are reviewed, and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between prior and input information. It reduces to Dempster's rule of combination, just as revision in the sense of Alchourrón, Gärdenfors, and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated, and it is shown to generalize Jeffrey's rule of updating, Dempster's rule of conditioning and a form of AGM revision.
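For reference, a minimal numerical sketch of Dempster's rule of combination, to which the advocated revision rule reduces when the input is strongly consistent with the prior belief function; the frame of discernment and mass values below are invented:

```python
def dempster_combine(m1, m2):
    """Dempster's rule: m(A) is proportional to the sum of m1(B)*m2(C) over B & C = A,
    renormalised by 1 - K, where K is the mass falling on the empty intersection."""
    combined, conflict = {}, 0.0
    for B, w1 in m1.items():
        for C, w2 in m2.items():
            A = B & C
            if A:
                combined[A] = combined.get(A, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    if conflict >= 1.0:
        raise ValueError("Total conflict: Dempster's rule is undefined")
    return {A: w / (1.0 - conflict) for A, w in combined.items()}

# Illustrative mass functions on the frame {x, y, z} (values are made up).
m_prior = {frozenset("xy"): 0.6, frozenset("xyz"): 0.4}
m_input = {frozenset("x"): 0.7, frozenset("yz"): 0.3}

print(dempster_combine(m_prior, m_input))   # {x}: 0.70, {y}: 0.18, {y, z}: 0.12
```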
Abstract:
High-performance liquid chromatography (HPLC) methodologies were evaluated for the detection and quantification of thyreostatic drug residues in cattle serum and thyroid tissue. The paper details a protocol, using a simple ethyl acetate extraction, for the determination of thiouracil, tapazole, methyl thiouracil, propyl thiouracil and phenyl thiouracil in thyroid tissue. Using two sequential HPLC injections and quantitative analysis in two steps, all five thyreostats were detectable at concentrations greater than 2.45-4.52 ng/g. Modifications to a published method for the detection of thyreostatic residues in serum, involving the addition of mercaptoethanol and a freezing step, are described. The modifications improved sensitivity and allowed detection of the five thyreostats at levels greater than 16.98-35.25 ng/ml. Young bulls were treated with thyreostats to demonstrate the validity of the methodologies described. The administered thyreostats were not absorbed equally by the test animals, and not all of the compounds were detected in the serum samples taken 7 days after drug withdrawal. These experiments indicate the necessity of being able to detect thyreostat residues in a variety of matrices. (C) 1998 Elsevier Science B.V. All rights reserved.
Abstract:
19-Nortestosterone (beta-NT) is banned for use as a growth promoter in food animals within the European Union. For regulatory control purposes, urine and bile samples are routinely screened by immunoassay. The aim of the present study was to compare the ability of two immunoassays, using two rabbit polyclonal antibodies raised against two different NT derivatives, to detect NT residues in bovine bile. One antiserum cross-reacted with both alpha-NT and beta-NT (alpha/beta-NT), whereas the other was specific for alpha-NT. Bile samples from 266 slaughtered cattle were deconjugated and analyzed using both antibodies, with all screening positives (>2 ng/ml) confirmed by high-resolution gas chromatography-mass spectrometry. The alpha/beta-NT and alpha-NT antibody-based ELISAs screened 39 and 44 samples positive, respectively, with NT confirmed in 22 and 39 of these, respectively. The alpha/beta-NT antibody-based ELISA produced a false-negative rate of 44%, compared to 0% for the alpha-NT antibody-based ELISA. Supplementary investigations concluded that a matrix effect was a major cause of the marked difference in false-negative rates. This result underlines the necessity of validating immunoassays in the sample matrix.
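One plausible reading of the 44% figure, assuming the 39 samples confirmed via the alpha-NT route represent the true positives among the 266 bile samples:

\[
\text{false-negative rate} \approx \frac{39 - 22}{39} \approx 0.44 = 44\%.
\]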
Abstract:
A post-Markovian master equation has been recently proposed as a tool to describe the evolution of a system coupled to a memory-keeping environment [A. Shabani and D. A. Lidar, Phys. Rev. A 71, 020101(R) (2005)]. For a single qubit affected by appropriately chosen environmental conditions, the corresponding dynamics is always legitimate and physical. Here we extend such a situation to the case of two qubits, only one of which experiences the environmental effects. We show how, despite the apparent innocence of such an extension, the introduction of the second qubit should be done cum grano salis to avoid consequences such as the breaking of the positivity of the associated dynamical map. This hints at the necessity of using care when adopting phenomenologically derived models for evolutions occurring outside the Markovian framework.
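Positivity questions of this kind can be probed numerically via the Choi matrix of the single-qubit map. The sketch below uses the transpose map as a textbook stand-in (it is not the post-Markovian map studied here): the map is positive on a single qubit, yet its trivial extension to a second qubit is not.

```python
import numpy as np

def choi_matrix(channel, dim=2):
    """Choi matrix J = sum_{ij} |i><j| (x) channel(|i><j|). The channel is completely
    positive iff J >= 0; a negative eigenvalue means channel (x) id maps some
    two-qubit state onto a non-positive operator."""
    J = np.zeros((dim * dim, dim * dim), dtype=complex)
    for i in range(dim):
        for j in range(dim):
            E = np.zeros((dim, dim), dtype=complex)
            E[i, j] = 1.0
            J += np.kron(E, channel(E))
    return J

transpose_map = lambda rho: rho.T                      # positive on a single qubit ...
eigvals = np.linalg.eigvalsh(choi_matrix(transpose_map))
print(np.round(eigvals.real, 6))                       # ... but one eigenvalue is -1,
                                                       # so transpose (x) id breaks positivity
```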
Abstract:
Despite major improvements in diagnostics and interventional therapies, cardiovascular diseases remain a major health-care and socio-economic burden in both Western and developing countries, where this burden is increasing in close correlation with economic growth. Health authorities and the general population have started to recognize that the fight against these diseases can only be won if their burden is faced by increasing our investment in lifestyle-change interventions and prevention. There is overwhelming evidence of the efficacy of secondary prevention initiatives, including cardiac rehabilitation, in terms of reduction in morbidity and mortality. However, secondary prevention is still too poorly implemented in clinical practice, often only in selected populations and over a limited period of time. The development of systematic and fully comprehensive preventive programmes, integrated into the organization of national health systems, is warranted. Furthermore, systematic monitoring of the process of delivery and of outcomes is a necessity. Cardiology and secondary prevention, including cardiac rehabilitation, have evolved almost independently of each other, and although each makes a unique contribution, it is now time to join forces under the banner of preventive cardiology and create a comprehensive model that optimizes long-term outcomes for patients and reduces the future burden on health-care services. These are the aims that the Cardiac Rehabilitation Section of the European Association for Cardiovascular Prevention & Rehabilitation has set out to pursue in order to promote secondary preventive cardiology in clinical practice.
Abstract:
The biennial meeting on 'Exploiting Bacteriophages for Bioscience, Biotechnology and Medicine', held in London, UK, on 20 January 2012 and chaired by George Salmond (University of Cambridge, UK), hosted over 50 participants representing 13 countries. The highly multidisciplinary meeting covered a diverse range of topics, reflecting the current expansion of interest in this field, including the use of bacteriophages as a source of biochemical reagents for molecular biology, bacteriophages for the treatment of human and animal diseases, bacteriophage-based diagnostics and therapeutic delivery technologies, and the necessity for, and regulatory challenges associated with, robust clinical trials of phage-based therapeutics. This report focuses on a number of presentations from the meeting relating to cutting-edge research on bacteriophages as anti-infective agents.
Abstract:
THE MACHINIST LANDSCAPE: AN ENTROPIC GRID OF VARIANCE
‘By drawing a diagram, a ground plan of a house, a street plan to the location of a site, or a topographic map, one draws a “logical two dimensional picture”. A “logical picture” differs from a natural or realistic picture in that it rarely looks like the thing it stands for.’
A Provisional Theory of Non-Sites, Robert Smithson (1968)
Between design and ground there are variances, deviations and gaps. These exist as physical interstices between what is conceptualised and what is realised, and they reveal moments in the design process that resist the reconciliation of people and their environment (McHarg 1963). The Machinist Landscape interrogates the significance of these variances through the contrasting processes of coppice and photovoltaic energy. It builds on the potential of these gaps and, in doing so, proposes that these spaces of variance can reveal the complexity of the relationships between consumption and remediation, design and nature.
Fresh Kills Park, and in particular the draft master plan (2006), offers a framework to explore this artificial construct. Central to the Machinist Landscape is the analysis of the landfill gas collection system, planned on a notional 200ft grid. Variations are revealed between this diagrammatic grid measure and that which has been constructed on the site. These variances between the abstract and the real offer the Machinist Landscape a powerful space of enquiry. Are these gaps a result of unexpected conditions below ground, topographic nuances or natural phenomena? Does this space of difference, between what is planned and what is constructed, have the potential to redefine the dynamic processes and relations with the land?
The Machinist Landscape is structured through this space of variance with an 'entropic grid', the under-storey of which hosts a carefully managed system of short-rotation coppice (SRC). The coppice, a medieval practice bound up with energy, product and space, operates on both theoretical and programmatic levels. It is planted along a structure of linear bunds, stabilised through coppice-pole retaining structures and enriched with nutrients from coppice-produced biochar. Above the coppice is built an upper storey of photovoltaics (PV), its structures fabricated from the coppiced timber and the PV produced with graphene from coppice charcoal processes.
Abstract:
In this research note, we introduce a graded BDI agent development framework, g-BDI for short, that allows agents to be built as multi-context systems that reason about three fundamental, graded mental attitudes (i.e. beliefs, desires and intentions). We propose a sound and complete logical framework for them and some logical extensions to accommodate slightly different views on desires. © 2011 Elsevier B.V. All rights reserved.
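A toy procedural illustration of graded attitudes (not the multi-context logical framework of g-BDI itself; the attitude names, degrees and the min-based grading are invented for illustration):

```python
# Toy graded-BDI loop: beliefs, desires and intentions carry degrees in [0, 1].
beliefs = {"battery_low": 0.8, "path_clear": 0.6}   # degrees of belief (invented)
desires = {"recharge": 0.9, "explore": 0.4}         # degrees of desire (invented)

def intention_degree(desire, enabling_belief):
    """A simple, invented grading: an intention is no stronger than either the
    desire behind it or the belief that it can be achieved."""
    return min(desires[desire], beliefs[enabling_belief])

intentions = {
    "recharge": intention_degree("recharge", "battery_low"),  # 0.8
    "explore":  intention_degree("explore", "path_clear"),    # 0.4
}
print(max(intentions, key=intentions.get))   # the agent commits to 'recharge'
```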
Abstract:
Utilising cameras as a means to survey the surrounding environment is becoming increasingly popular in a number of different research areas and applications. Central to using camera sensors as input to a vision system is the need to be able to manipulate and process the information captured in these images. One such application is the use of cameras to monitor the quality of airport landing lighting at aerodromes, where a camera is placed inside an aircraft and used to record images of the lighting pattern during the landing phase of a flight. The images are processed to determine a performance metric. This requires the development of custom software for the localisation and identification of luminaires within the image data. However, because of the necessity to keep airport operations functioning as efficiently as possible, it is difficult to collect enough image data to develop, test and validate any such software. In this paper, we present a technique to model a virtual landing lighting pattern. A mathematical model is postulated which represents the glide path of the aircraft, including random deviations from the expected path. A morphological method has been developed to localise and track the luminaires under different operating conditions. © 2011 IEEE.
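A minimal sketch of the kind of morphological localisation step described, applied to a synthetic frame (the luminaire positions, threshold and structuring element are invented; this is not the authors' software):

```python
import numpy as np
from scipy import ndimage

# Synthetic "landing lighting" frame: dark background with a few bright luminaires.
rng = np.random.default_rng(0)
frame = rng.normal(10, 2, size=(120, 160))
for (r, c) in [(30, 40), (60, 80), (90, 120)]:   # hypothetical luminaire positions
    frame[r - 1:r + 2, c - 1:c + 2] += 200

# Morphological localisation: threshold, clean up with an opening, label the blobs.
binary = frame > 100
binary = ndimage.binary_opening(binary, structure=np.ones((2, 2)))
labels, n = ndimage.label(binary)
centroids = ndimage.center_of_mass(binary, labels, list(range(1, n + 1)))
print(n, [tuple(round(x, 1) for x in c) for c in centroids])   # 3 localised luminaires
```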
Abstract:
The conjunction fallacy has been cited as a classic example of the automatic contextualisation of problems. In two experiments we compared the performance of autistic and typically developing adolescents on a set of conjunction fallacy tasks. Participants with autism were less susceptible to the conjunction fallacy. Experiment 2 also demonstrated that the difference between the groups did not result from increased sensitivity to the conjunction rule, or from impaired processing of social materials amongst the autistic participants. Although adolescents with autism showed less bias in their reasoning, they were not more logical than the control group in a normative sense. The findings are discussed in the light of accounts which emphasise differences in contextual processing between typical and autistic populations.
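For reference, the conjunction rule at stake is simply

\[
P(A \wedge B) \le \min\{P(A), P(B)\},
\]

so judging a conjunction (e.g. the classic "feminist bank teller") more probable than one of its conjuncts violates it.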
Abstract:
When people evaluate syllogisms, their judgments of validity are often biased by the believability of the conclusions of the problems. Thus, it has been suggested that syllogistic reasoning performance is based on an interplay between a conscious and effortful evaluation of logicality and an intuitive appreciation of the believability of the conclusions (e.g., Evans, Newstead, Allen, & Pollard, 1994). However, logic effects in syllogistic reasoning emerge even when participants are unlikely to carry out a full logical analysis of the problems (e.g., Shynkaruk & Thompson, 2006). There is also evidence that people can implicitly detect the conflict between their beliefs and the validity of the problems, even if they are unable to consciously produce a logical response (e.g., De Neys, Moyens, & Vansteenwegen, 2010). In 4 experiments we demonstrate that people intuitively detect the logicality of syllogisms, and this effect emerges independently of participants' conscious mindset and their cognitive capacity. This logic effect is also unrelated to the superficial structure of the problems. Additionally, we provide evidence that the logicality of the syllogisms is detected through slight changes in participants' affective states. In fact, subliminal affective priming had an effect on participants' subjective evaluations of the problems. Finally, when participants misattributed their emotional reactions to background music, this significantly reduced the logic effect.
Abstract:
This study examined performance on transitive inference problems in children with developmental dyscalculia (DD), in typically developing controls matched on IQ, working memory and reading skills, and in children with outstanding mathematical abilities. Whereas mainstream approaches currently consider DD a domain-specific deficit, we hypothesized that the development of mathematical skills is closely related to the development of logical abilities, a domain-general skill. In particular, we expected a close link between mathematical skills and the ability to reason independently of one's beliefs. Our results showed that this was indeed the case, with children with DD performing more poorly than controls, and high-maths-ability children showing outstanding skills in logical reasoning about belief-laden problems. Nevertheless, all groups performed poorly on structurally equivalent problems with belief-neutral content. This is in line with suggestions that abstract reasoning skills (i.e. the ability to reason about content without real-life referents) develop later than the ability to reason about belief-inconsistent fantasy content. A video abstract of this article can be viewed at http://www.youtube.com/watch?v=90DWY3O4xx8.