65 results for ANSWER


Relevance: 20.00%

Abstract:

Answer set programming is a form of declarative programming that has proven very successful in succinctly formulating and solving complex problems. Although mechanisms for representing and reasoning with the combined answer set programs of multiple agents have already been proposed, the actual gain in expressivity when adding communication has not been thoroughly studied. We show that allowing simple programs to talk to each other results in the same expressivity as adding negation-as-failure. Furthermore, we show that the ability to focus on one program in a network of simple programs results in the same expressivity as adding disjunction in the head of the rules.
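
To make the notions concrete: a "simple" program has no negation-as-failure and its unique answer set is the least fixpoint of the immediate consequence operator, while a normal program may have several answer sets, characterised via the Gelfond-Lifschitz reduct. The sketch below is our own illustration (not the paper's communication construction): a brute-force answer set checker for a tiny hypothetical normal program.

```python
# Illustrative sketch, not the paper's construction: brute-force answer sets
# of a ground normal program via the Gelfond-Lifschitz reduct. A "simple"
# program is the special case with empty negative bodies.
from itertools import chain, combinations

# A rule is (head, positive_body, negative_body); atoms are strings.
rules = [
    ("p", (), ("q",)),  # p :- not q.
    ("q", (), ("p",)),  # q :- not p.
]
atoms = sorted({a for h, pos, neg in rules for a in (h, *pos, *neg)})

def least_model(positive_rules):
    """Least model of a negation-free program (iterated consequence operator)."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos, _ in positive_rules:
            if set(pos) <= model and head not in model:
                model.add(head)
                changed = True
    return model

def is_answer_set(candidate):
    # Reduct: drop rules whose negative body meets the candidate, then
    # delete the remaining negative literals.
    reduct = [(h, pos, ()) for h, pos, neg in rules if not set(neg) & candidate]
    return least_model(reduct) == candidate

candidates = chain.from_iterable(combinations(atoms, r) for r in range(len(atoms) + 1))
print([set(c) for c in candidates if is_answer_set(set(c))])  # [{'p'}, {'q'}]
```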

Relevance: 20.00%

Abstract:

Fuzzy answer set programming (FASP) is a generalization of answer set programming to continuous domains. However, since it cannot readily take uncertainty into account, FASP is not suitable as a basis for approximate reasoning and cannot easily be used to derive conclusions from imprecise information. To cope with this, we propose an extension of FASP based on possibility theory. The resulting framework allows us to reason about uncertain information in continuous domains, and thus also about information that is imprecise or vague. We propose a syntactic procedure, based on an immediate consequence operator, and provide a characterization in terms of minimal models, which allows us to straightforwardly implement our framework using existing FASP solvers.
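
As a rough illustration of the kind of syntactic procedure the abstract mentions, the sketch below iterates a fuzzy immediate consequence operator on a tiny negation-free program. The Łukasiewicz connectives, the rule weights, and the example atoms are our assumptions; the paper's possibilistic layer is not modelled here.

```python
# Hedged sketch of a fuzzy immediate consequence operator. Assumptions (ours,
# not the paper's): ground negation-free rules, Lukasiewicz semantics, rule
# weights in [0, 1].
rules = [
    ("wet", ["rain"], 1.0),      # wet <- rain, fully certain
    ("slippery", ["wet"], 0.8),  # slippery <- wet, to degree 0.8
]
facts = {"rain": 0.9}

def body_degree(interp, body):
    # Lukasiewicz conjunction: max(0, sum - (n - 1)).
    return max(0.0, sum(interp.get(b, 0.0) for b in body) - (len(body) - 1))

def immediate_consequence(interp):
    out = dict(facts)
    for head, body, weight in rules:
        # Lukasiewicz implication: head supported to degree body + weight - 1.
        degree = max(0.0, body_degree(interp, body) + weight - 1.0)
        out[head] = max(out.get(head, 0.0), degree)
    return out

interp = {}
while (nxt := immediate_consequence(interp)) != interp:  # least fixpoint
    interp = nxt
print(interp)  # wet ~ 0.9, slippery ~ max(0, 0.9 + 0.8 - 1) = 0.7 (float rounding aside)
```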

Relevance: 20.00%

Abstract:

Many problems in artificial intelligence can be encoded as answer set programs (ASP) in which some rules are uncertain. ASP programs with incorrect rules may have erroneous conclusions, but due to the non-monotonic nature of ASP, omitting a correct rule may also lead to errors. To derive the most certain conclusions from an uncertain ASP program, we thus need to consider all situations in which some, none, or all of the least certain rules are omitted. This corresponds to treating some rules as optional and reasoning about which conclusions remain valid regardless of the inclusion of these optional rules. While a version of possibilistic ASP (PASP) based on this view has recently been introduced, no implementation is currently available. In this paper we propose a simulation of the main reasoning tasks in PASP using (disjunctive) ASP programs, allowing us to take advantage of state-of-the-art ASP solvers. Furthermore, we identify how several interesting AI problems can be naturally seen as special cases of the considered reasoning tasks, including cautious abductive reasoning and conformant planning. As such, the proposed simulation enables us to solve instances of the latter problem types that are more general than what current solvers can handle.
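
Read operationally, the "optional rules" view quantifies over subsets of the least certain rules and keeps only the conclusions that survive every choice. The sketch below is our own reading on a hypothetical three-rule example, using a brute-force answer set generator; the paper's actual simulation compiles such reasoning into a single disjunctive ASP program.

```python
# Our own illustration of cautious reasoning over optional rules, not the
# paper's disjunctive-ASP encoding.
from itertools import chain, combinations

certain = [("a", (), ())]                       # a.
optional = [("b", (), ("a",)), ("c", (), ())]   # b :- not a.   c.

def subsets(xs):
    return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

def answer_sets(rules):
    # Brute-force generate-and-test via the reduct (as in the sketch above).
    atoms = sorted({a for h, pos, neg in rules for a in (h, *pos, *neg)})
    def least(prules):
        m, changed = set(), True
        while changed:
            changed = False
            for h, pos, _ in prules:
                if set(pos) <= m and h not in m:
                    m.add(h)
                    changed = True
        return m
    for cand in map(set, subsets(atoms)):
        reduct = [(h, pos, ()) for h, pos, neg in rules if not set(neg) & cand]
        if least(reduct) == cand:
            yield cand

safe = None
for extra in subsets(optional):
    for model in answer_sets(certain + list(extra)):
        safe = set(model) if safe is None else safe & model
print(safe)  # {'a'}: the only conclusion valid however the optional rules fall
```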

Relevance: 20.00%

Abstract:

Community-driven Question Answering (CQA) systems crowdsource experiential information in the form of questions and answers, and have accumulated valuable reusable knowledge. Clustering of QA datasets from CQA systems provides a means of organizing the content to ease tasks such as manual curation and tagging. In this paper, we present a clustering method that exploits the two-part question-answer structure in QA datasets to improve clustering quality. Our method, MixKMeans, combines question-space and answer-space similarities so that the space in which the match is stronger is allowed to dominate. This construction is motivated by our observation that semantic similarity between question-answer pairs (QAs) can be localized in either space. We empirically evaluate our method on a variety of real-world labeled datasets. Our results indicate that our method significantly outperforms state-of-the-art clustering methods for the task of clustering question-answer archives.
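
A minimal sketch of the composition idea, under our own assumptions (cosine similarity on pre-computed embedding vectors; MixKMeans' precise weighting scheme is defined in the paper): the combined similarity lets whichever space matches more strongly dominate.

```python
# Illustration only: QA-pair similarity where the stronger space dominates.
# The vectors and the max() composition are our assumptions.
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def qa_similarity(qa1, qa2):
    """Each qa is a (question_vector, answer_vector) pair."""
    sim_q = cosine(qa1[0], qa2[0])  # similarity in question space
    sim_a = cosine(qa1[1], qa2[1])  # similarity in answer space
    return max(sim_q, sim_a)        # the dominant space decides

qa1 = (np.array([1.0, 0.0]), np.array([0.2, 0.9]))
qa2 = (np.array([0.9, 0.1]), np.array([0.8, 0.1]))
print(qa_similarity(qa1, qa2))  # ~0.99: the question-space match dominates
```

Plugging such a similarity into a k-means-style assignment step captures the observation that semantic similarity between two QA pairs can be localized in either the question or the answer space.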

Relevance: 10.00%

Abstract:

The purpose of the paper is to demonstrate how a research diary methodology, designed to analyse A-level and GNVQ classrooms, can be a powerful tool for examining pedagogy and quality of learning at the level of case study. Two subject areas, science and business studies, are presented as cases. Twelve teachers and thirty-four students were studied over a four-week period in May 1997, and contrasts were drawn between lessons from three A-level physics teachers/three Advanced GNVQ science teachers and two A-level business/economics teachers/four Advanced GNVQ business teachers. Lessons were analysed within a cognitive framework which distinguishes between conceptual and procedural learning and emphasizes the importance of metacognition and epistemological beliefs. Two dimensions of lessons were identified: pedagogical activities (e.g. teacher-led explanation, teacher-led guidance on a task, question/answer sessions, group discussions, working with IT) and cognitive outcomes (e.g. structuring and memorizing facts, understanding concepts and arguments, critical thinking, problem-solving, learning core skills, identifying values). Immediately after each lesson, teachers and students (three per class) completed structured research diaries with respect to the above dimensions. Data from the diaries reveal general and unique features of the lessons. Time-of-year effects were evident (examinations pending in May), particularly in A-level classrooms. Students in business studies classes reported a wider range of learning activities and greater variety in cognitive outcomes than did students in science classes. Science students' self-ratings of their ability to manage and direct their own learning were generally low. The phenomenological aspects of the classrooms were consistently linked to teachers' lesson plans and their teaching objectives for those particular students at that particular time of year.

Relevance: 10.00%

Abstract:

This article examines the role of contemporary art in a post-9/11 context through The American Effect exhibition at the Whitney Museum of American Art in New York in 2003. This exhibition displayed a range of artworks from around the world that specifically engaged with, commented upon and interrogated the USA's pre-eminent position as a global superpower. In the politically charged climate after 9/11, the exhibition offered itself as a critical voice amid the more obvious patriotic clamour: it was one of the places where Americans could ask (and answer) the question, 'Why do they hate us so much?' Although The American Effect claimed to be a space of dissent, it ultimately failed to question, let alone challenge, US global hegemony. Instead, the exhibition articulated a benevolent patriotism that forced artwork from other nations into supplicating and abject positions, and it obscured the complex discursive networks that connect artists, curators, critics, audiences and art museums.

Relevance: 10.00%

Abstract:

There is a perception that teaching space in universities is a rather scarce resource. However, some studies have revealed that in many institutions it is actually chronically under-used. Often, rooms are occupied only half the time, and even when in use they are often only half full. This is usually measured by the 'utilization', which is defined as the percentage of available 'seat-hours' that are employed. Within real institutions, studies have shown that this utilization can often take values as low as 20–40%. One consequence of such a low level of utilization is that space managers are under pressure to make more efficient use of the available teaching space. However, better management is hampered because there does not appear to be a good understanding within space management (near-term planning) of why this happens. This is accompanied, within space planning (long-term planning), by a lack of expertise in how best to accommodate the expected low utilizations. This motivates our two main goals: (i) to understand the factors that drive down utilizations, and (ii) to set up methods to provide better space planning. Here, we provide quantitative evidence that constraints arising from timetabling and location requirements easily have the potential to explain the low utilizations seen in reality. Furthermore, on considering the decision question 'Can this given set of courses all be allocated in the available teaching space?' we find that the answer depends on the associated utilization in a way that exhibits threshold behaviour: there is a sharp division between regions in which the answer is 'almost always yes' and those of 'almost always no'. Through analysis and understanding of the space of potential solutions, our work suggests that better use of space within universities will come about through an understanding of the effects of timetabling constraints and of when it is statistically likely that a set of courses can be allocated to a particular space. The results presented here provide a firm foundation for university managers to take decisions on how space should be managed and planned for more effectively. Our multi-criteria approach and new methodology together provide new insight into the interaction between the course timetabling problem and the crucial issue of space planning.
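
To make the numbers concrete, the toy simulation below (our illustration with invented room capacities and course sizes, not the paper's model) places courses first-fit into room-slots and estimates how often "can all courses be allocated?" is answerable with yes as the implied utilization rises.

```python
# Toy illustration (our own, with made-up numbers): seat-hour utilization
# versus the probability that every course can be assigned a large-enough
# room-slot. Timetable conflicts are ignored for simplicity.
import random

def feasible(course_sizes, room_caps, n_slots):
    free = [cap for cap in room_caps for _ in range(n_slots)]  # room-slots
    for size in sorted(course_sizes, reverse=True):            # biggest first
        for i, cap in enumerate(free):
            if cap >= size:
                free.pop(i)  # each course consumes one room-slot
                break
        else:
            return False
    return True

random.seed(0)
rooms, slots, trials = [30, 60, 100] * 4, 20, 200
for n_courses in (80, 150, 220):
    ok = sum(
        feasible([random.randint(10, 90) for _ in range(n_courses)], rooms, slots)
        for _ in range(trials)
    )
    util = n_courses * 50 / (sum(rooms) * slots)  # mean seat-hours used / available
    print(f"utilization ~{util:.0%}: allocatable in {ok / trials:.0%} of trials")
```

Even this crude model exhibits the sharp division the abstract describes: below the threshold, allocation almost always succeeds; above it, failures quickly come to dominate.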

Relevance: 10.00%

Abstract:

Osteoporosis (OP) is one of the most prevalent bone diseases worldwide, with bone fracture the major clinical consequence. The effect of OP on fracture repair is disputed and, although fracture repair might be expected to be delayed in osteoporotic individuals, a definitive answer to this question still eludes us. The aim of this study was to clarify the effect of osteoporosis in a rodent fracture model. OP was induced in 3-month-old rats (n = 53) by ovariectomy (OVX) followed by an externally fixated, mid-diaphyseal femoral osteotomy at 6 months (OVX group). A further 40 animals underwent a fracture at 6 months (control group). Animals were sacrificed at 1, 2, 4, 6, and 8 weeks post-fracture with outcome measures of histology, biomechanical strength testing, pQCT, relative BMD, and motion detection. OVX animals had significantly lower BMD, slower fracture repair (histologically), reduced stiffness in the fractured femora (8 weeks) and strength in the contralateral femora (6 and 8 weeks), increased body weight, and decreased motion. This study has demonstrated that OVX is associated with a decrease in BMD (particularly in trabecular bone) and a reduction in the mechanical properties of intact bone and healing fractures. The histological, biomechanical, and radiological measures of union suggest that OVX delayed fracture healing.

Relevance: 10.00%

Abstract:

The Nemertodermatida are a small group of microscopic marine worms. Recent molecular studies have demonstrated that they are likely to be the earliest extant bilaterian animals. What was the nervous system (NS) of a bilaterian ancestor like? In order to answer that question, the NS of Nemertoderma westbladi was investigated by means of indirect immunofluorescence technique and confocal scanning laser microscopy. The antibodies to a flatworm neuropeptide GYIRFamide were used in combination with anti-serotonin antibodies and phalloidin-TRITC staining. The immunostaining revealed an entirely basiepidermal NS. A ring lying outside the body wall musculature at the level of the statocyst forms the only centralisation, the

Relevance: 10.00%

Abstract:

Aims/hypothesis Glomerular hyperfiltration is a well-established phenomenon occurring early in some patients with type 1 diabetes. However, there is no consistent answer regarding whether hyperfiltration predicts later development of nephropathy. We performed a systematic review and meta-analysis of observational studies that compared the risk of developing diabetic nephropathy in patients with and without glomerular hyperfiltration, and also explored the impact of baseline GFR.

Methods A systematic review and meta-analysis was carried out. Cohort studies in type 1 diabetic participants were included if they contained data on the development of incipient or overt nephropathy with baseline measurement of GFR and presence or absence of hyperfiltration.

Results We included ten cohort studies following 780 patients. After a study median follow-up of 11.2 years, 130 patients had developed nephropathy. Using a random effects model, the pooled odds of progression to a minimum of microalbuminuria in patients with hyperfiltration were 2.71 (95% CI 1.20–6.11) times those of patients with normofiltration. There was moderate heterogeneity (heterogeneity test p = 0.05; degree of inconsistency I² = 48%) and some evidence of funnel plot asymmetry, possibly due to publication bias. The pooled weighted mean difference in baseline GFR was 13.8 ml min⁻¹ 1.73 m⁻² (95% CI 5.0–22.7) greater in the group progressing to nephropathy than in those not progressing (heterogeneity test p < 0.01).

Conclusions/interpretation Based on study-level data from published studies, individuals with glomerular hyperfiltration were at increased risk of progression to diabetic nephropathy. Further larger studies are required to explore this relationship and the role of potential confounding variables.
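
For readers unfamiliar with the machinery behind the pooled estimate, the sketch below reproduces the standard DerSimonian-Laird random-effects calculation on hypothetical study data (our numbers, not the review's ten cohorts); I², the inconsistency measure quoted above, falls out of the same Q statistic.

```python
# Standard DerSimonian-Laird random-effects pooling on made-up studies
# (illustration only; not the review's data). Works on log odds ratios.
import math

log_or = [math.log(x) for x in (2.0, 3.5, 1.4, 4.2)]  # hypothetical study ORs
var = [0.30, 0.50, 0.25, 0.60]                        # variances of the log ORs

w = [1 / v for v in var]                              # fixed-effect weights
fixed = sum(wi * yi for wi, yi in zip(w, log_or)) / sum(w)

# Cochran's Q, between-study variance tau^2, and the I^2 inconsistency measure.
q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_or))
df = len(var) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)
i2 = max(0.0, (q - df) / q)

w_re = [1 / (v + tau2) for v in var]                  # random-effects weights
pooled = sum(wi * yi for wi, yi in zip(w_re, log_or)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
print(f"pooled OR {math.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f}), I^2 {i2:.0%}")
```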

Relevance: 10.00%

Abstract:

There is a perception that teaching space in universities is a rather scarce resource. However, some studies have revealed that in many institutions it is actually chronically under-used. Often, rooms are occupied only half the time, and even when in use they are often only half full. This is usually measured by the 'utilisation', which is basically the percentage of available 'seat-hours' that are employed. In real institutions, this utilisation can often take values as low as 20–40%. One consequence of such low utilisation is that space managers are under pressure to make a more efficient use of the available teaching space. However, better management is hampered because there does not appear to be a good understanding within space management (near-term planning) of why this happens, nor a good basis within space planning (long-term planning) for how best to accommodate the expected low utilisations. This motivates our two main goals: (i) to understand the factors that drive down utilisations, and (ii) to set up methods to provide better space planning. Here, we provide quantitative evidence that constraints arising from timetabling and location requirements easily have the potential to explain the low utilisations seen in reality. Furthermore, on considering the decision question 'Can this given set of courses all be allocated in the available teaching space?' we find that the answer depends on the associated utilisation in a way that exhibits threshold behaviour: there is a sharp division between regions in which the answer is 'almost always yes' and those of 'almost always no'. Our work suggests that progress in space management and planning will arise from an integrated approach; combining purely space issues with restrictions representing an aggregated or abstracted version of key constraints such as timetabling or location, and

Relevance: 10.00%

Abstract:

We present comprehensive photometric and spectroscopic observations of the faint transient SN 2008S, discovered in the nearby galaxy NGC 6946. SN 2008S exhibited slow photometric evolution and almost no spectral variability during the first nine months, implying a long photon diffusion time and a high-density circumstellar medium. Its bolometric luminosity (≃ 10^41 erg s^-1 at peak) is low with respect to most core-collapse supernovae but is comparable to the faintest Type II-P events. Our quasi-bolometric light curve extends to 300 d and shows a tail-phase decay rate consistent with that of ⁵⁶Co. We propose that this is evidence for an explosion and formation of ⁵⁶Ni (0.0014 ± 0.0003 M⊙). Spectra of SN 2008S show intense emission lines of Hα, the [Ca II] doublet and the Ca II near-infrared (NIR) triplet, all without obvious P-Cygni absorption troughs. The large mid-infrared (MIR) flux detected shortly after explosion can be explained by a light echo from pre-existing dust. The late NIR flux excess is plausibly due to a combination of warm newly formed ejecta dust together with shock-heated dust in the circumstellar environment. We reassess the progenitor object detected previously in Spitzer archive images, supplementing this discussion with a model of the MIR spectral energy distribution. This supports the idea of a dusty, optically thick shell around SN 2008S with an inner radius of nearly 90 AU, an outer radius of 450 AU, and an inferred heating source of 3000 K. The luminosity of the central star is L ≃ 10^4.6 L⊙. All the nearby progenitor dust was likely evaporated in the explosion, leaving only the much older dust lying further out in the circumstellar environment. The combination of our long-term multiwavelength monitoring data and the evidence from the progenitor analysis leads us to support the scenario of a weak electron-capture supernova explosion in a super-asymptotic giant branch progenitor star (of initial mass 6–8 M⊙) embedded within a thick circumstellar gaseous envelope. We suggest that all of the main properties of the electron-capture SN phenomenon are observed in SN 2008S and that future observations may allow a definitive answer.
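
The tail-phase argument rests on simple decay arithmetic: a light curve fully powered by ⁵⁶Co decay (half-life 77.27 d) fades at a fixed, characteristic rate, and a matching slope is the fingerprint referred to above. A quick check of that canonical number (textbook arithmetic, not taken from the paper):

```python
# Standard radioactive-decay arithmetic: a tail fully powered by 56Co decay
# fades at 2.5 * log10(e) * ln(2) / t_half magnitudes per day.
import math

t_half_co56 = 77.27  # days, the 56Co half-life
slope = 2.5 * math.log10(math.e) * math.log(2) / t_half_co56
print(f"{slope * 100:.2f} mag per 100 days")  # ~0.98, the classic 56Co tail rate
```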