Abstract:
At the previous conference in this series, Corney, Lister and Teague presented research results showing relationships between code writing, code tracing and code explaining, from as early as week 3 of semester. From those results, we concluded that the problems some students face in learning to program start very early in the semester. In this paper we report on our replication of that experiment at two institutions, one of which was the institution of the original study. In some cases, we did not find the same relationship between explaining code and writing code, but we believe this was because our teachers discussed the code in lectures between the two tests. Apart from that exception, our replication results at both institutions are consistent with our original study.
Abstract:
Recent research on novice programmers has suggested that they pass through neo-Piagetian stages: the sensorimotor, preoperational, and concrete operational stages, before eventually reaching programming competence at the formal operational stage. This paper presents empirical results in support of this neo-Piagetian perspective. The major novel contributions are empirical results for exam questions aimed at testing novices for the concrete operational abilities to reason about quantities that are conserved, processes that are reversible, and properties that hold under transitive inference. While these questions had been proposed earlier by Lister, he did not present any data on how students performed on them. Our empirical results demonstrate that many students struggle to answer these problems, despite their apparent simplicity. We then compare student performance on these questions with their performance on six "explain in plain English" questions.
Abstract:
Information mismatch and overload are two fundamental issues influencing the effectiveness of information filtering systems. Even though both term-based and pattern-based approaches have been proposed to address these issues, neither approach alone provides a satisfactory basis for determining which information is relevant. This paper presents a novel two-stage decision model for solving both issues. The first stage is a novel rough analysis model that addresses the overload problem. The second stage is a pattern taxonomy mining model that addresses the mismatch problem. Experimental results on RCV1 and TREC filtering topics show that the proposed model significantly outperforms state-of-the-art filtering systems.
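As a rough illustration of the two-stage idea described above (not the rough analysis or pattern taxonomy mining models themselves), the following Python sketch prunes an incoming document stream with a cheap term-overlap screen and then applies a stricter pattern-style check; the topic terms and patterns are invented placeholders.

```python
# Minimal two-stage filtering sketch (illustrative only; the paper's rough
# analysis and pattern taxonomy mining models are not reproduced here).
from typing import Iterable, List, Set

def stage_one_prune(docs: Iterable[str], topic_terms: Set[str],
                    min_overlap: int = 2) -> List[str]:
    """Cheap term-overlap screen: discard documents that share too few
    terms with the topic, reducing the volume passed to stage two."""
    kept = []
    for doc in docs:
        tokens = set(doc.lower().split())
        if len(tokens & topic_terms) >= min_overlap:
            kept.append(doc)
    return kept

def stage_two_match(docs: Iterable[str], patterns: List[Set[str]]) -> List[str]:
    """Stricter pattern check: keep a document only if all terms of at
    least one topic pattern (an order-free term set here) co-occur in it."""
    relevant = []
    for doc in docs:
        tokens = set(doc.lower().split())
        if any(pattern <= tokens for pattern in patterns):
            relevant.append(doc)
    return relevant

if __name__ == "__main__":
    stream = ["reuters stock market report for energy sector",
              "recipe for tomato soup",
              "energy stock prices fall as market reacts"]
    topic = {"energy", "stock", "market", "prices"}
    candidates = stage_one_prune(stream, topic)
    print(stage_two_match(candidates, [{"stock", "market"}, {"energy", "prices"}]))
```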
Abstract:
We describe the population pharmacokinetics of an acepromazine (ACP) metabolite (2-(1-hydroxyethyl)promazine) (HEPS) in horses for the estimation of likely detection times in plasma and urine. Acepromazine (30 mg) was administered to 12 horses, and blood and urine samples were taken at frequent intervals for chemical analysis. A Bayesian hierarchical model was fitted to describe concentration-time data and cumulative urine amounts for HEPS. The metabolite HEPS was modelled separately from the parent ACP as the half-life of the parent was considerably less than that of the metabolite. The clearance ($Cl/F_{PM}$) and volume of distribution ($V/F_{PM}$), scaled by the fraction of parent converted to metabolite, were estimated as 769 L/h and 6874 L, respectively. For a typical horse in the study, after receiving 30 mg of ACP, the upper limit of the detection time was 35 hours in plasma and 100 hours in urine, assuming an arbitrary limit of detection of 1 $\mu$g/L, and a small ($\approx 0.01$) probability of detection. The model derived allowed the probability of detection to be estimated at the population level. This analysis was conducted on data collected from only 12 horses, but we assume that this is representative of the wider population.
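A back-of-the-envelope sketch of the detection-time arithmetic is shown below, assuming a one-compartment bolus model with the reported apparent parameters and ignoring the absorption phase, the conversion fraction and between-horse variability; it therefore illustrates only the simplest calculation and does not reproduce the Bayesian population model or its 35-hour upper limit.

```python
# Back-of-the-envelope plasma detection-time sketch using the reported
# apparent parameters (Cl/F_PM = 769 L/h, V/F_PM = 6874 L). Assumes a
# one-compartment intravenous-bolus model and ignores the absorption
# phase and between-horse variability, so it illustrates only the
# arithmetic, not the Bayesian population model.
import math

dose_ug = 30_000.0      # 30 mg of ACP expressed in micrograms
cl_l_per_h = 769.0      # apparent clearance of the metabolite, Cl/F_PM
v_l = 6874.0            # apparent volume of distribution, V/F_PM
lod_ug_per_l = 1.0      # assumed limit of detection

k = cl_l_per_h / v_l                           # elimination rate constant (1/h)
half_life_h = math.log(2) / k                  # about 6.2 h
c0 = dose_ug / v_l                             # initial concentration, about 4.4 ug/L
t_detect_h = math.log(c0 / lod_ug_per_l) / k   # time to fall below the LOD

print(f"half-life ~ {half_life_h:.1f} h, naive detection time ~ {t_detect_h:.1f} h")
# Prints roughly 6.2 h and 13 h; the paper's 35 h upper limit additionally
# accounts for absorption kinetics and population-level variability.
```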
Abstract:
Contamination of packaged foods by micro-organisms entering through air leaks can cause serious public health issues and cost companies large amounts of money through product recalls, compensation claims and loss of market share. The main source of contamination is leaks in packaging, which allow air, moisture and micro-organisms to enter the package. In the food processing and packaging industry worldwide, there is an increasing demand for cost-effective, state-of-the-art inspection technologies capable of reliably detecting leaky seals and delivering products at six-sigma quality levels; the cost of leaky packages to Australian food industries is estimated at close to AUD $35 million per year. This project develops non-destructive testing technology that combines digital imaging and sensing with a differential vacuum technique to assess the seal integrity of food packages on a high-speed production line. Flexible plastic packages are widely used and are the least expensive means of retaining product quality; they can be sealed to maximise the shelf life of both dry and moist products. The seals of food packages need to be airtight so that the contents are not contaminated by micro-organisms entering through air leaks. Airtight seals also extend the shelf life of packaged foods, and manufacturers attempt to prevent food products with leaky seals from reaching consumers. Many current NDT (non-destructive testing) methods for assessing the seals of flexible packages are best suited to random sampling and laboratory use. The three most commonly used methods are vacuum/pressure decay, the bubble test, and helium leak detection. Although these methods can detect very fine leaks, their long processing times make them unviable on a production line. Two non-destructive in-line packaging inspection machines are currently available and are discussed in the literature review. The detailed design and development of the High-Speed Sensing and Detection System (HSDS) is the fundamental requirement of this project and of the future prototype and production unit. Laboratory testing was completed successfully, and a methodical design procedure was needed to arrive at a successful concept. The mechanical tests confirmed the vacuum hypothesis and seal integrity with consistent results, and the electrical testing likewise provided solid results, allowing the researcher to move the project forward with confidence. The laboratory design testing allowed the researcher to confirm theoretical assumptions before moving into the detailed design phase. Discussion of the development of alternative concepts in both the mechanical and electrical disciplines enables the researcher to make informed decisions. Each major mechanical and electrical component is detailed through the research and design process, and the design procedure works methodically through the major functions from both mechanical and electrical perspectives.
The thesis also canvasses alternative ideas for the major components which, although not always practical in this application, show that the engineering and functionality options have been explored exhaustively. Further concepts were then designed and developed for the entire HSDS unit based on previous practice and theory. It is envisaged that both the prototype and production versions of the HSDS would use standard industry-available components, manufactured and distributed locally. Future research and testing of the prototype unit could result in a successful trial unit being incorporated into a working food processing production environment. Recommendations and future work are discussed, along with options in other food processing and packaging disciplines and in non-food processing industries.
Abstract:
In information retrieval (IR) research, increasing focus has been placed on optimizing a query language model by detecting and estimating the dependencies between the query and the observed terms occurring in the selected relevance feedback documents. In this paper, we propose a novel Aspect Language Modeling framework featuring term association acquisition, document segmentation, query decomposition, and an Aspect Model (AM) for parameter optimization. Through the proposed framework, we advance the theory and practice of applying high-order and context-sensitive term relationships to IR. We first decompose a query into subsets of query terms. Then we segment the relevance feedback documents into chunks using multiple sliding windows. Finally, we discover the higher-order term associations, that is, the terms in these chunks with a high degree of association to the subsets of the query. In this process, we adopt an approach that combines the AM with Association Rule (AR) mining. In our approach, the AM not only considers the subsets of a query as “hidden” states and estimates their prior distributions, but also evaluates the dependencies between the subsets of a query and the observed terms extracted from the chunks of feedback documents. The AR mining provides a reasonable initial estimate of the high-order term associations by discovering association rules from the document chunks. Experimental results on various TREC collections verify the effectiveness of our approach, which significantly outperforms a baseline language model and two state-of-the-art query language models, namely the Relevance Model and the Information Flow model.
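The sketch below illustrates two of the ingredients named in this abstract, sliding-window segmentation of feedback documents and a simple association score between query-term subsets and chunk terms; the Aspect Model estimation and full Association Rule mining are not reproduced, and the example document and query are invented.

```python
# Illustrative sketch: sliding-window chunking plus a simple association
# score between query-term subsets and chunk terms. The confidence-style
# score below is a stand-in, not the paper's AM/AR estimation.
from collections import Counter
from itertools import combinations
from typing import Dict, List, Tuple

def sliding_chunks(tokens: List[str], size: int = 8, step: int = 4) -> List[List[str]]:
    """Segment a token sequence into overlapping fixed-size windows."""
    return [tokens[i:i + size] for i in range(0, max(len(tokens) - size + 1, 1), step)]

def query_subsets(query_terms: List[str], max_size: int = 2) -> List[Tuple[str, ...]]:
    """Decompose a query into its small term subsets."""
    subs: List[Tuple[str, ...]] = []
    for r in range(1, max_size + 1):
        subs.extend(combinations(query_terms, r))
    return subs

def association_scores(chunks: List[List[str]],
                       subset: Tuple[str, ...]) -> Dict[str, float]:
    """For chunks containing every term of the query subset, count which
    other terms co-occur; normalise by the number of matching chunks."""
    matching = [c for c in chunks if set(subset) <= set(c)]
    if not matching:
        return {}
    counts = Counter(t for c in matching for t in set(c) if t not in subset)
    return {t: n / len(matching) for t, n in counts.items()}

if __name__ == "__main__":
    doc = ("solar power generation depends on panel efficiency and sunlight "
           "hours while wind power generation depends on turbine placement").split()
    chunks = sliding_chunks(doc)
    for subset in query_subsets(["power", "generation"]):
        print(subset, association_scores(chunks, subset))
```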
Abstract:
In the era of Web 2.0, huge volumes of consumer reviews are posted to the Internet every day. Manual approaches to detecting and analyzing fake reviews (i.e., spam) are not practical due to the problem of information overload. However, the design and development of automated methods of detecting fake reviews is a challenging research problem. The main reason is that fake reviews are specifically composed to mislead readers, so they may appear the same as legitimate reviews (i.e., ham). As a result, discriminatory features that would enable individual reviews to be classified as spam or ham may not be available. Guided by the design science research methodology, the main contribution of this study is the design and instantiation of novel computational models for detecting fake reviews. In particular, a novel text mining model is developed and integrated into a semantic language model for the detection of untruthful reviews. The models are then evaluated based on a real-world dataset collected from amazon.com. The results of our experiments confirm that the proposed models outperform other well-known baseline models in detecting fake reviews. To the best of our knowledge, the work discussed in this article represents the first successful attempt to apply text mining methods and semantic language models to the detection of fake consumer reviews. A managerial implication of our research is that firms can apply our design artifacts to monitor online consumer reviews to develop effective marketing or product design strategies based on genuine consumer feedback posted to the Internet.
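As a generic illustration of the spam/ham review classification task (not the text mining or semantic language models proposed in the paper), the following scikit-learn baseline fits a TF-IDF plus logistic regression classifier on a few invented labelled reviews.

```python
# A generic spam/ham review classification baseline for illustration only:
# TF-IDF features with logistic regression via scikit-learn. The paper's
# text mining model and semantic language model are not reproduced, and
# the tiny labelled examples below are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = [
    "Best product ever buy now amazing deal five stars",        # spam-like
    "Unbelievable!!! Changed my life, everyone must purchase",  # spam-like
    "Battery lasts about two days with moderate use",           # ham-like
    "The strap broke after a month but support replaced it",    # ham-like
]
labels = ["spam", "spam", "ham", "ham"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(reviews, labels)

print(model.predict(["Incredible must buy best deal ever"]))
```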
Abstract:
Objectives To assess the effects of information interventions which orient patients and their carers/family to a cancer care facility and the services available within the facility. Design Systematic review of randomised controlled trials (RCTs), cluster RCTs and quasi-RCTs. Data sources MEDLINE, CINAHL, PsycINFO, EMBASE and the Cochrane Central Register of Controlled Trials. Methods We included studies evaluating the effect of an orientation intervention compared with a control group receiving usual care, or trials comparing one orientation intervention with another. Results Four RCTs with a total of 610 participants met the criteria for inclusion. Findings from two RCTs demonstrated significant benefits of the orientation intervention in relation to reduced levels of distress (mean difference (MD): −8.96, 95% confidence interval (95% CI): −11.79 to −6.13), but non-significant benefits in relation to state anxiety levels (MD: −9.77, 95% CI: −24.96 to 5.41). There are insufficient data on the other outcomes of interest. Conclusions This review has demonstrated the feasibility and some potential benefits of orientation interventions. There was a low level of evidence to suggest that orientation interventions can reduce distress in patients. However, other outcomes, including patient knowledge recall/satisfaction, remain inconclusive. The majority of trials were subject to a high risk of bias and were likely to be insufficiently powered. Further well-conducted and adequately powered RCTs are required to provide evidence for determining the most appropriate intensity, nature, mode and resources for such interventions. Patient- and carer-focused outcomes should be included.
Abstract:
The development of text classification techniques has been largely promoted in the past decade due to the increasing availability and widespread use of digital documents. Usually, the performance of text classification relies on the quality of the categories and the accuracy of the classifiers learned from samples. When training samples are unavailable or the categories are of poor quality, text classification performance is degraded. In this paper, we propose an unsupervised multi-label text classification method that classifies documents using a large set of categories stored in a world ontology. The approach has been evaluated promisingly by comparison with typical text classification methods, using a real-world document collection and ground truth encoded by human experts.
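A minimal sketch of the unsupervised multi-label idea is given below: each document is scored against the textual labels of a small set of categories (standing in for the world ontology) and every category above a similarity threshold is assigned. The categories, documents and threshold are invented, and this is not the ontology-based method of the paper.

```python
# Minimal sketch of unsupervised multi-label assignment: score each
# document against the textual labels of a set of categories (a stand-in
# for an ontology) and keep every category above a similarity threshold.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

categories = {
    "agriculture": "crops farming soil harvest irrigation",
    "finance": "markets stocks banking investment interest rates",
    "health": "disease treatment patients hospital clinical care",
}
documents = [
    "interest rates rise as banking sector reports strong investment",
    "clinical trial shows new treatment improves patient care in hospital",
]

vectorizer = TfidfVectorizer()
cat_names = list(categories)
matrix = vectorizer.fit_transform(list(categories.values()) + documents)
cat_vecs, doc_vecs = matrix[:len(cat_names)], matrix[len(cat_names):]

similarity = cosine_similarity(doc_vecs, cat_vecs)
for doc, scores in zip(documents, similarity):
    labels = [cat_names[i] for i, s in enumerate(scores) if s > 0.1]
    print(labels, "<-", doc)
```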
Abstract:
This Guide is designed to assist workers to better understand and negotiate the complex interplay of ethical, legal and organisational considerations in their practice. The goal is to provide frontline workers and managers with information, questions and principles which promote good youth AOD practice. The legal information provided relates to Queensland, Australia.
Abstract:
Background: The importance of quality-of-life (QoL) research in patients with head and neck (H&N) cancer has been recognised over the past two decades. The aims of this systematic review are to evaluate the QoL status of H&N cancer survivors one year after treatment and to identify the determinants affecting their QoL. Methods: Pubmed, Medline, Scopus, Sciencedirect and CINAHL (2000–2011) were searched for relevant studies, and two of the present authors assessed their methodological quality. The characteristics and main findings of the studies were extracted and reported. Results: Thirty-seven studies met the inclusion criteria, and the methodological quality of the majority was moderate to high. While patients in this group recover their global QoL by 12 months after treatment, a number of outstanding issues persist: deterioration in physical functioning, fatigue, xerostomia and sticky saliva. Age, cancer site, stage of disease, social support, smoking, feeding tube placement and alcohol consumption are significant determinants of QoL at 12 months, while gender has little or no influence. Conclusions: Regular assessments should be carried out to monitor physical functioning, degree of fatigue, xerostomia and sticky saliva. Further research is required to develop appropriate and effective interventions to deal with these issues, and thus to promote the patients’ QoL.
Abstract:
Purpose: To provide an overview and a critical appraisal of systematic reviews (SRs) of published interventions for the prevention/management of radiation dermatitis. Methods and Materials: We searched Medline, CINAHL, Embase, and the Cochrane Library. We also manually searched the individual reference lists of potentially eligible articles and a number of key journals in the topic area. Two authors screened all potential articles and included eligible SRs. Two authors critically appraised and extracted key findings from the included reviews using AMSTAR (the measurement tool for “assessment of multiple systematic reviews”). Results: Of 1837 potential titles, 6 SRs were included. A number of interventions have been reported to be potentially beneficial for managing radiation dermatitis. Interventions evaluated in these reviews included skin care advice, steroidal/nonsteroidal topical agents, systemic therapies, modes of radiation delivery, and dressings. However, all the included SRs reported that there is insufficient evidence to support any single effective intervention. The methodological quality of the included studies varied, and methodological shortfalls in these reviews might bias the overall results or the recommendations for clinical practice. Conclusions: An up-to-date, high-quality SR on the prevention/management of radiation dermatitis is needed to guide practice and direct future research. We recommend that clinicians and guideline developers critically evaluate the information in SRs when making decisions.
Abstract:
Systematic reviews (SRs) are increasingly recognised as the standard approach to summarising health research and influence clinical nursing practice and health care decisions (Coster and Norman, 2009; Grimshaw and Russell, 1993; Griffiths and Norman, 2005). High-quality SRs should have a clearly stated set of objectives with pre-defined eligibility criteria for studies; an explicit, reproducible methodology; a systematic search that attempts to identify all studies that would meet the eligibility criteria; an assessment of the validity of the findings of the included studies, including the assessment of risk of bias; and a systematic presentation and synthesis of the characteristics and findings of the included studies (Higgins and Green, 2011). Although SRs are highly regarded and are expected to be rigorous, just as with other research, their quality may vary (Choi et al., 2001; Hoving et al., 2001)...
Abstract:
Building Web 2.0 sites does not necessarily ensure the success of the site. We aim to better understand what makes a site successful by drawing insight from biologically inspired design patterns. Web 2.0 sites provide a mechanism for human interaction, enabling powerful intercommunication between massive volumes of users. Early Web 2.0 site providers that were previously dominant are being succeeded by newer sites providing innovative social interaction mechanisms. Understanding which site traits contribute to this success drives research into Web site mechanics, using models to describe the associated social networking behaviour. Some of these models attempt to show how the volume of users provides self-organisation and self-contextualisation of content. One such model of coordinated environments is stigmergy, a term originally describing coordinated insect behaviour. This paper explores how exploiting stigmergy can provide a valuable mechanism for identifying and analysing online user behaviour, specifically when user freedom of choice is restricted by the provided Web site functionality. This will aid the building of better collaborative Web sites, improving the collaborative processes.
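To illustrate the stigmergy concept referred to above (not the paper's analysis of real Web 2.0 sites), the toy simulation below lets users leave traces on content items, decays those traces over time, and biases later choices toward heavily marked items.

```python
# Toy stigmergy simulation for illustration: each interaction leaves a
# "trace" on a content item, traces decay over time, and later users are
# more likely to pick heavily marked items. The item names and parameters
# are invented placeholders.
import random

items = ["post_a", "post_b", "post_c"]
traces = {item: 1.0 for item in items}   # initial, equal trace levels
decay = 0.95                             # per-step evaporation of traces

random.seed(42)
for step in range(200):
    total = sum(traces.values())
    weights = [traces[i] / total for i in items]
    choice = random.choices(items, weights=weights)[0]  # trace-biased choice
    traces[choice] += 1.0                               # reinforce the chosen item
    for item in items:
        traces[item] *= decay                           # traces evaporate

print({item: round(level, 1) for item, level in traces.items()})
# A few items accumulate most of the trace mass, mimicking the
# self-organising concentration of attention on popular content.
```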