17 results for Computer Generated Proofs

in CentAUR: Central Archive University of Reading - UK


Relevance:

90.00%

Publisher:

Abstract:

Human-like computer interaction systems require far more than just simple speech input/output. Such a system should communicate with the user verbally, using conversational language. It should be aware of its surroundings and use this context in any decisions it makes. As a synthetic character, it should have a computer-generated, human-like appearance, which in turn should be used to convey emotions, expressions and gestures. Finally, and perhaps most important of all, the system should interact with the user in real time, in a fluent and believable manner.

Relevance:

80.00%

Publisher:

Abstract:

Explanations are an important by-product of medical decision-support activities, as they have proved to favour compliance and correct treatment performance. To achieve this purpose, these texts should have a strong argumentation content and should adapt to the emotional as well as the rational attitudes of the Addressee. This paper describes how Rhetorical Sentence Planning can contribute to this aim: a rule-based discourse plan revision step is introduced between Text Planning and Linguistic Realization, and it exploits knowledge about the user's personality and emotions and about the potential impact of domain items on user compliance and memory recall. The proposed approach originates from analytical and empirical evaluation studies of computer-generated explanation texts in the domain of drug prescription. This work was partially supported by a British-Italian Collaboration in Research and Higher Education Project, which involved the Universities of Reading and Bari, in 1996.
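
To make the pipeline described above concrete, the sketch below illustrates the general idea of a rule-based revision step sitting between text planning and linguistic realization, adjusting message order and emphasis according to a simple user model. The Message and UserModel structures and the two rules are illustrative assumptions, not the system described in the abstract.

```python
# A minimal sketch, assuming hypothetical Message/UserModel structures, of a
# rule-based revision pass between text planning and linguistic realization.
from dataclasses import dataclass

@dataclass
class Message:
    topic: str          # e.g. "dosage", "side-effects", "benefits"
    content: str
    emphasis: bool = False

@dataclass
class UserModel:
    anxious: bool = False      # emotional attitude of the Addressee
    low_recall: bool = False   # expected difficulty remembering instructions

def revise_plan(plan: list[Message], user: UserModel) -> list[Message]:
    """Apply simple revision rules to a text plan before surface realization."""
    revised = list(plan)
    if user.anxious:
        # Soften the plan: move side-effect messages after the other messages.
        revised.sort(key=lambda m: 1 if m.topic == "side-effects" else 0)
    if user.low_recall:
        # Mark dosage instructions for emphasis during realization.
        for m in revised:
            if m.topic == "dosage":
                m.emphasis = True
    return revised

def realize(plan: list[Message]) -> str:
    """Placeholder linguistic realization: join messages into a paragraph."""
    return " ".join(f"IMPORTANT: {m.content}" if m.emphasis else m.content
                    for m in plan)

plan = [
    Message("side-effects", "Drowsiness may occur."),
    Message("benefits", "This drug controls your blood pressure."),
    Message("dosage", "Take one tablet every morning."),
]
print(realize(revise_plan(plan, UserModel(anxious=True, low_recall=True))))
```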

Relevance:

80.00%

Publisher:

Abstract:

This study evaluates computer-generated written explanations about drug prescriptions that are based on an analysis of both patient and doctor informational needs. Three experiments examine the effects of varying the type of information given about the possible side effects of the medication, and the order of information within the explanation. Experiment 1 investigated the effects of these two factors on people's ratings of how good they consider the explanations to be and of their perceived likelihood of taking the medication, as well as on their memory for the information in the explanation. Experiment 2 further examined the effects of varying information about side effects by separating out the contribution of number and severity of side effects. It was found that participants in this study did not “like” explanations that described severe side effects, and also judged that they would be less likely to take the medication if given such explanations. Experiment 3 therefore investigated whether information about severe side effects could be presented in such a way as to increase judgements of how good explanations are thought to be, as well as the perceived likelihood of adherence. The results showed some benefits of providing additional explanatory information.

Relevance:

80.00%

Publisher:

Abstract:

In this paper we describe how we generated written explanations for ‘indirect users’ of a knowledge-based system in the domain of drug prescription. We call ‘indirect users’ the intended recipients of explanations, to distinguish them from the prescriber (the ‘direct’ user) who interacts with the system. The Explanation Generator was designed after several studies of indirect users' information needs and physicians' explanatory attitudes in this domain. It integrates text planning techniques with ATN-based surface generation. A double modeling component enables the information content, order and style to be adapted to the indirect user to whom the explanation is addressed. Several examples of computer-generated texts are provided and contrasted with the physicians' explanations, to discuss the advantages and limits of the approach adopted.
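
As a rough illustration of the kind of ATN-based surface generation mentioned above, the sketch below walks a toy transition network whose arcs either emit fixed words, fill words from registers, or push into sub-networks. The network, register names, and example sentence are hypothetical; the actual grammar of the Explanation Generator is not given in the abstract.

```python
# A minimal sketch of ATN-style surface generation: states connected by arcs
# that emit a fixed word, fill a word from a register, or push to a sub-network.
# The network, registers and example sentence are hypothetical.

# Each arc is (kind, value, next_state); kind is "word", "register" or "push".
NETWORKS = {
    "S": {"s0": [("push", "NP_subj", "s1")],
          "s1": [("register", "verb", "s2")],
          "s2": [("push", "NP_obj", "end")]},
    "NP_subj": {"s0": [("register", "subject", "end")]},
    "NP_obj": {"s0": [("word", "the", "s1")],
               "s1": [("register", "object", "end")]},
}

def generate(net_name: str, registers: dict) -> list[str]:
    """Traverse one network from state s0 to end, collecting emitted words."""
    net, state, words = NETWORKS[net_name], "s0", []
    while state != "end":
        kind, value, next_state = net[state][0]   # take the first (only) arc
        if kind == "word":
            words.append(value)
        elif kind == "register":
            words.append(registers[value])
        else:                                      # "push" into a sub-network
            words.extend(generate(value, registers))
        state = next_state
    return words

registers = {"subject": "This drug", "verb": "reduces", "object": "swelling"}
print(" ".join(generate("S", registers)) + ".")
```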

Relevance:

80.00%

Publisher:

Abstract:

Three experiments examine the effect of different forms of computer-generated advice on concurrent and subsequent performance of individuals controlling a simulated intensive-care task. Experiment 1 investigates the effect of optional and compulsory advice and shows that both result in an improvement in subjects' performance while receiving the advice, and also in an improvement in subsequent unaided performance. However, although the advice compliance displayed by the optional advice group shows a strong correlation with subsequent unaided performance, compulsory advice has no extra benefit over the optional use of advice. Experiment 2 examines the effect of providing users with on-line explanations of the advice, as well as providing less specific advice. The results show that both groups perform at the same level on the task as the advice groups from Experiment 1, although subjects receiving explanations scored significantly higher on a written post-task questionnaire. Experiment 3 investigates in more detail the relationship between advice compliance and performance. The results reveal a complex relationship between natural ability on the task and the following of advice, in that people who use the advice more tend to perform either better or worse than the more moderate users. The theoretical and practical implications of these experiments are discussed.

Relevance:

80.00%

Publisher:

Abstract:

Background: Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost effectiveness and acceptability of a pharmacist-led, information-technology-based complex intervention compared with simple feedback in reducing the proportion of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. Methods: Research subject group: "At-risk" patients registered with computerised general practices in two geographical regions in England. Design: Parallel group pragmatic cluster randomised trial. Interventions: Practices will be randomised to either (i) computer-generated feedback, or (ii) a pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. Primary outcome measures: The proportion of patients in each practice at six and 12 months post-intervention (a) with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs, (b) with a computer-recorded diagnosis of asthma being prescribed beta-blockers, or (c) aged 75 years and older receiving long-term prescriptions for angiotensin-converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months. Secondary outcome measures: These relate to a number of other examples of potentially hazardous prescribing and medicines management. Economic analysis: An economic evaluation of the cost per error avoided will be conducted from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. Qualitative analysis: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and to investigate possible reasons why the interventions prove effective or ineffective. Sample size: 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm, compared with an 11% reduction in the simple feedback arm. Discussion: At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken.
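
For illustration only, the sketch below shows the kind of rule-based query over computerised patient records that could flag the three primary outcome measures listed above. The record fields, toy drug lists, and the 15-month window (approximated as 450 days) are hypothetical assumptions; this is not the trial's actual search software.

```python
# A minimal sketch, with hypothetical record fields and toy drug lists, of the
# kind of rule-based query that could flag the three primary outcome measures.
# It is not the trial's actual search software.
from datetime import date, timedelta

NSAIDS = {"ibuprofen", "naproxen"}
BETA_BLOCKERS = {"atenolol", "propranolol"}
ACE_INHIBITORS = {"ramipril", "lisinopril"}
LOOP_DIURETICS = {"furosemide"}

def flag_patient(p: dict, today: date = date(2007, 1, 1)) -> list[str]:
    """Return the primary-outcome criteria that a patient record triggers."""
    flags = []
    if "peptic ulcer" in p["history"] and p["current_drugs"] & NSAIDS:
        flags.append("NSAID prescribed with history of peptic ulcer")
    if "asthma" in p["history"] and p["current_drugs"] & BETA_BLOCKERS:
        flags.append("beta-blocker prescribed with asthma")
    monitored = (p["last_renal_check"] is not None and
                 today - p["last_renal_check"] <= timedelta(days=450))  # ~15 months
    if (p["age"] >= 75 and p["current_drugs"] & (ACE_INHIBITORS | LOOP_DIURETICS)
            and not monitored):
        flags.append("ACE inhibitor/loop diuretic without recent renal monitoring")
    return flags

patient = {"age": 78, "history": {"peptic ulcer"},
           "current_drugs": {"naproxen", "ramipril"}, "last_renal_check": None}
print(flag_patient(patient))
```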

Relevance:

80.00%

Publisher:

Abstract:

This paper describes a new method for reconstructing 3D surface points and a wireframe on the surface of a freeform object using a small number (e.g. 10) of 2D photographic images. The images are taken from different viewing directions by a perspective camera with full prior knowledge of the camera configurations. The reconstructed surface points are frontier points and the wireframe is a network of contour generators; both are reconstructed by pairing apparent contours in the 2D images. Unlike previous works, we empirically demonstrate that if the viewing directions are uniformly distributed around the object's viewing sphere, the reconstructed 3D points automatically cluster closely on highly curved parts of the surface and are widely spread on smooth or flat parts. The advantage of this property is that the reconstructed points along a surface or a contour generator are not under-sampled or under-represented, because surfaces and contours should be sampled or represented more densely where their curvature is high. The more complex the contour's shape, the greater the number of points required, and the proposed method automatically generates correspondingly more points. Given that the viewing directions are uniformly distributed, the number and distribution of the reconstructed points depend on the shape or curvature of the surface, regardless of the size of the surface or of the object. The unique pattern of the reconstructed points and contours may be used in 3D object recognition and measurement without computationally intensive full surface reconstruction. The results are obtained from both computer-generated and real objects.
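
The abstract's key condition is that viewing directions be uniformly distributed over the object's viewing sphere. The sketch below shows one common way to generate such directions, golden-spiral (Fibonacci) sampling; the paper does not state how its viewing directions were actually chosen, so this is only an assumed illustration.

```python
# A minimal sketch of one common way (golden-spiral / Fibonacci sampling) to
# spread viewing directions approximately uniformly over the viewing sphere.
# The abstract does not state how the authors chose their directions.
import math

def viewing_directions(n: int = 10) -> list[tuple[float, float, float]]:
    """Return n unit vectors spread roughly uniformly over the unit sphere."""
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))
    dirs = []
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n           # heights uniform in (-1, 1)
        r = math.sqrt(max(0.0, 1.0 - z * z))    # circle radius at that height
        theta = golden_angle * i
        dirs.append((r * math.cos(theta), r * math.sin(theta), z))
    return dirs

for d in viewing_directions(10):
    print("camera direction: (%.3f, %.3f, %.3f)" % d)
```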

Relevance:

80.00%

Publisher:

Abstract:

Background: Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the number of patients at risk of measures related to hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. Methods: In this pragmatic, cluster randomised trial general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitor or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. Findings: 72 general practices with a combined list size of 480 942 patients were randomised. At 6 months’ follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0.58, 95% CI 0.38–0.89); a β blocker if they had asthma (0.73, 0.58–0.91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0.51, 0.34–0.78). PINCER has a 95% probability of being cost effective if the decision-maker’s ceiling willingness to pay reaches £75 per error avoided at 6 months. Interpretation: The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records. Funding: Patient Safety Research Portfolio, Department of Health, England.

Relevance:

80.00%

Publisher:

Abstract:

Augmented Reality systems overlay computer-generated information onto a user's natural senses. Where this additional information is visual, it is overlaid on the user's natural visual field of view through a head-mounted (or “head-up”) display device. Integrated Home Systems provide a network that links every electrical device in the home, giving the user both control and data transparency across the network.

Relevance:

80.00%

Publisher:

Abstract:

Three topics are discussed. First, an issue in the epistemology of computer simulation: that of the chess endgame 'becoming' what computer-generated data says it is. Secondly, the endgames of the longest known games are discussed, and the concept of a Bionic Game is defined. Lastly, the set of record-depth positions published by Bourzutschky and Konoval is evaluated by the new MVL tables in Moscow, alongside the deepest known mate of 549 moves.

Relevance:

80.00%

Publisher:

Abstract:

The emergence and development of digital imaging technologies and their impact on mainstream filmmaking is perhaps the most familiar special effects narrative associated with the years 1981-1999. This is in part because some of the questions raised by the rise of the digital still concern us now, but also because key milestone films showcasing advancements in digital imaging technologies appear in this period, including Tron (1982) and its computer-generated image elements, the digital morphing in The Abyss (1989) and Terminator 2: Judgment Day (1991), computer animation in Jurassic Park (1993) and Toy Story (1995), digital extras in Titanic (1997), and ‘bullet time’ in The Matrix (1999). As a result it is tempting to characterize 1981-1999 as a ‘transitional period’ in which digital imaging processes grow in prominence and technical sophistication, and what we might call ‘analogue’ special effects processes correspondingly become less common. But such a narrative risks eliding the other practices that also shape effects sequences in this period. Indeed, the 1980s and 1990s are striking for the diverse range of effects practices in evidence in both big-budget films and lower-budget productions, and for the extent to which analogue practices persist independently of or alongside digital effects work in a range of production and genre contexts. The chapter seeks to document and celebrate this diversity and plurality, this sustaining of earlier traditions of effects practice alongside newer processes, this experimentation with materials and technologies old and new in the service of aesthetic aspirations alongside budgetary and technical constraints. The common characterization of the period as a series of rapid transformations in production workflows, practices and technologies will be interrogated in relation to the persistence of certain key figures such as Douglas Trumbull, John Dykstra, and James Cameron, but also through a consideration of the contexts for and influences on creative decision-making. Comparative analyses of the processes used to articulate bodies, space and scale in effects sequences drawn from different generic sites of special effects work, including science fiction, fantasy, and horror, will provide a further frame for the chapter’s mapping of the commonalities and specificities, continuities and variations in effects practices across the period. In the process, the chapter seeks to reclaim analogue processes’ contribution both to moments of explicit spectacle and to diegetic verisimilitude, in the decades most often associated with the digital’s ‘arrival’.

Relevance:

30.00%

Publisher:

Abstract:

In order to gain a better understanding of online conceptual collaborative design processes, this paper investigates how student designers make use of a shared virtual synchronous environment when engaged in conceptual design. The software enables users to talk to each other and share sketches when they are remotely located. The paper describes a novel methodology for observing and analysing collaborative design processes by adapting the concepts of grounded theory. Rather than concentrating on narrow aspects of the final artefacts, emerging “themes” are generated that provide a broader picture of the collaborative design process and context descriptions. Findings on the themes of “grounding – mutual understanding” and “support creativity” complement findings from other research, while important themes associated with “near-synchrony” have not been emphasised in other research. From the study, a series of design recommendations are made for the development of tools to support online computer-supported collaborative work in design using a shared virtual environment.

Relevance:

30.00%

Publisher:

Abstract:

Long distance dispersal (LDD) plays an important role in many population processes like colonization, range expansion, and epidemics. LDD of small particles like fungal spores is often a result of turbulent wind dispersal and is best described by functions with power-law behavior in the tails ("fat tailed"). The influence of fat-tailed LDD on population genetic structure is reported in this article. In computer simulations, the population structure generated by power-law dispersal with exponents in the range of -2 to -1, in distinct contrast to that generated by exponential dispersal, has a fractal structure. As the power-law exponent becomes smaller, the distribution of individual genotypes becomes more self-similar at different scales. Common statistics like G_ST are not well suited to summarizing differences between the population genetic structures. Instead, fractal and self-similarity statistics demonstrated differences in structure arising from fat-tailed and exponential dispersal. When dispersal is fat tailed, a log-log plot of the Simpson index against distance between subpopulations has an approximately constant gradient over a large range of spatial scales. The fractal dimension D_2 is linearly inversely related to the power-law exponent, with a slope of approximately -2. In a large simulation arena, fat-tailed LDD allows colonization of the entire space by all genotypes, whereas exponentially bounded dispersal eventually confines all descendants of a single clonal lineage to a relatively small area.
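
As a small illustration of the contrast between fat-tailed and exponentially bounded dispersal, the sketch below samples dispersal distances from a truncated power-law kernel (density proportional to d^b with b = -1.5) and from an exponential kernel with the same mean, using inverse-transform sampling. The parameter values and truncation range are illustrative assumptions, not the simulation settings used in the article.

```python
# A minimal sketch contrasting a fat-tailed (truncated power-law) dispersal
# kernel with an exponentially bounded one, using inverse-transform sampling.
# Parameter values and the truncation range are illustrative assumptions.
import random

def powerlaw_distance(b: float, d_min: float, d_max: float) -> float:
    """Sample d with density proportional to d**b on [d_min, d_max] (b != -1)."""
    u = random.random()
    lo, hi = d_min ** (b + 1.0), d_max ** (b + 1.0)
    return (lo + u * (hi - lo)) ** (1.0 / (b + 1.0))

def exponential_distance(mean: float) -> float:
    """Sample d from a thin-tailed exponential kernel with the given mean."""
    return random.expovariate(1.0 / mean)

random.seed(1)
fat = [powerlaw_distance(b=-1.5, d_min=1.0, d_max=1e4) for _ in range(100_000)]
thin = [exponential_distance(mean=sum(fat) / len(fat)) for _ in range(100_000)]
for name, sample in (("power-law b=-1.5", fat), ("exponential", thin)):
    s = sorted(sample)
    print(f"{name}: median={s[50_000]:.1f}, 99th percentile={s[99_000]:.1f}")
```

With these illustrative settings the power-law sample has a smaller median but a far heavier tail than the exponential sample with the same mean, which is the kind of contrast underlying the fractal structure discussed above.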