44 results for Errors and blunders, Literary
Abstract:
In three studies we examined two typical misconceptions of probability: the representativeness heuristic and the equiprobability bias. The statistics education literature predicts that some typical errors and biases (e.g., the equiprobability bias) increase with education, whereas others decrease. This contrasts with the prediction of reasoning theorists, who propose that education reduces misconceptions in general. They also predict that students with higher cognitive ability and higher need for cognition are less susceptible to biases. In Experiments 1 and 2 we found that the equiprobability bias increased with statistics education and was negatively correlated with students’ cognitive abilities. The representativeness heuristic was mostly unaffected by education and was also unrelated to cognitive abilities. In Experiment 3 we demonstrated through an instruction manipulation (asking participants to think logically vs. rely on their intuitions) that these differences arose because the two biases originate in different cognitive processes.
Abstract:
Romanticism and Blackwood's Magazine is inspired by the ongoing critical fascination with Blackwood's Edinburgh Magazine, and the burgeoning recognition of its centrality to the Romantic age. Though the magazine itself was published continuously for well over a century and a half, this volume concentrates specifically on those years when William Blackwood was at the helm, beginning with his founding of the magazine in 1817 and closing with his death in 1834. These were the years when, as Samuel Taylor Coleridge put it in 1832, Blackwood's reigned as 'an unprecedented Phenomenon in the world of letters.' The magazine placed itself at the centre of the emerging mass media, commented decisively on all the major political and cultural issues that shaped the Romantic movement, and published some of the leading writers of the day, including Coleridge, Thomas De Quincey, John Galt, Felicia Hemans, James Hogg, Walter Scott, and Mary Shelley.
'This much-needed volume reminds us not only why Blackwood's was the most influential periodical publication of the time, but also how its writers, writings, and critical agendas continue to shape so many of the scholarly concerns of Romantic studies in the twenty-first century.' - Charles Mahoney, Associate Professor, University of Connecticut, USA
List of Illustrations
Acknowledgements
Abbreviations
Notes on Contributors
'A character so various, and yet so indisputably its own': A Passage to Blackwood's Edinburgh Magazine; R.Morrison & D.S.Roberts
PART I: BLACKWOOD'S AND THE PERIODICAL PRESS
Beginning Blackwood's: The Right Mix of Dulce and Utile; P.Flynn
John Gibson Lockhart and Blackwood's: Shaping the Romantic Periodical Press; T.Richardson
From Gluttony to Justified Sinning: Confessional Writing in Blackwood's and the London Magazine; D.Higgins
Camaraderie and Conflict: De Quincey and Wilson on Enemy Lines; R.Morrison
Selling Blackwood's Magazine, 1817-1834; D.Finkelstein
PART II: BLACKWOOD'S CULTURE AND CRITICISM
Blackwood's 'Personalities'; T.Mole
Communal Reception, Mary Shelley, and the 'Blackwood's School' of Criticism; N.Mason
Blackwoodian Allusion and the Culture of Miscellaneity; D.Stewart
Blackwood's Edinburgh Magazine in the Scientific Culture of Early Nineteenth-Century Edinburgh; W.Christie
The Art and Science of Politics in Blackwood's Edinburgh Magazine, c. 1817-1841; D.Kelly
Prosing Poetry: Blackwood's and Generic Transposition, 1820-1840; J.Camlot
PART III: BLACKWOOD'S FICTIONS
Blackwood's and the Boundaries of the Short Story; T.Killick
The Edinburgh of Blackwood's Edinburgh Magazine and James Hogg's Fiction; G.Hughes
'The Taste for Violence in Blackwood's Magazine'; M.Schoenfield
PART IV: BLACKWOOD'S AT HOME
John Wilson and Regency Authorship; R.Cronin
John Wilson and Sport; J.Strachan
William Maginn and the Blackwood's 'Preface' of 1826; D.E.Latané, Jr.
All Work and All Play: Felicia Hemans's Edinburgh Noctes; N.Sweet
PART V: BLACKWOOD'S ABROAD
Imagining India in Early Blackwood's; D.S.Roberts
Tales of the Colonies: Blackwood's, Provincialism, and British Interests Abroad; A.Jarrells
Selected Bibliography
Index
ROBERT MORRISON is Queen's National Scholar at Queen's University, Kingston, Ontario, Canada. His book, The English Opium-Eater: A Biography of Thomas De Quincey, was a finalist for the James Tait Black Prize. He has edited writings by Jane Austen, Leigh Hunt, Thomas De Quincey, and John Polidori.
DANIEL SANJIV ROBERTS is Reader in English at Queen's University Belfast, UK. His publications include a monograph, Revisionary Gleam: De Quincey, Coleridge, and the High Romantic Argument (2000), and major critical editions of Thomas De Quincey's Autobiographic Sketches and Robert Southey's The Curse of Kehama; the latter was cited as a Distinguished Scholarly Edition by the MLA. He is currently working on an edition of Charles Johnstone's novel The History of Arsaces, Prince of Betlis for the Early Irish Fiction series.
Abstract:
Objectives: Study objectives were to investigate the prevalence and causes of prescribing errors amongst foundation doctors (i.e. junior doctors in their first (F1) or second (F2) year of post-graduate training), describe their knowledge and experience of prescribing errors, and explore their self-efficacy (i.e. confidence) in prescribing.
Method: A three-part mixed-methods design was used, comprising a prospective observational study, semi-structured interviews, and a cross-sectional survey. All doctors prescribing in eight purposively selected hospitals in Scotland participated in the observational study. Foundation doctors throughout Scotland participated in the survey. The number of prescribing errors per patient, doctor, ward and hospital, the perceived causes of errors, and a measure of doctors’ self-efficacy were established.
Results: 4710 patient charts and 44,726 prescribed medicines were reviewed. There were 3364 errors, affecting 1700 (36.1%) charts (overall error rate: 7.5%; F1: 7.4%; F2: 8.6%; consultants: 6.3%). Higher error rates were associated with: teaching hospitals (p<0.001), surgical (p<0.001) or mixed wards (p = 0.008) rather than medical wards, higher patient turnover wards (p<0.001), a greater number of prescribed medicines (p<0.001) and the months December and June (p<0.001). One hundred errors were discussed in 40 interviews. Error causation was multi-factorial; work environment and team factors were particularly noted. Of 548 completed questionnaires (national response rate of 35.4%), 508 (92.7% of respondents) reported errors, most of which (328, 64.6%) did not reach the patient. Pressure from other staff, workload and interruptions were cited as the main causes of errors. Foundation year 2 doctors reported greater confidence than year 1 doctors in deciding the most appropriate medication regimen.
Conclusions: Prescribing errors are frequent and of complex causation. Foundation doctors made more errors than other doctors, but undertook the majority of prescribing, making them a key target for intervention. Contributing causes included work environment, team, task, individual and patient factors. Further work is needed to develop and assess interventions that address these.
Abstract:
This study aims to evaluate the use of Varian radiotherapy dynamic treatment log (DynaLog) files to verify IMRT plan delivery as part of a routine quality assurance procedure. Delivery accuracy in terms of machine performance was quantified by multileaf collimator (MLC) position errors and fluence delivery accuracy for patients receiving intensity modulated radiation therapy (IMRT) treatment. The relationship between machine performance and plan complexity, quantified by the modulation complexity score (MCS), was also investigated. Actual MLC positions and delivered fraction of monitor units (MU), recorded every 50 ms during IMRT delivery, were extracted from the DynaLog files. The planned MLC positions and fractional MU were taken from the record-and-verify system MLC control file. Planned and delivered beam data were compared to determine leaf position errors with and without the overshoot effect. Analysis was also performed on planned and actual fluence maps reconstructed from the MLC control file and delivered treatment log files, respectively. This analysis was performed for all treatment fractions for 5 prostate, 5 prostate and pelvic node (PPN) and 5 head and neck (H&N) IMRT plans, totalling 82 IMRT fields in ∼5500 DynaLog files. The root mean square (RMS) leaf position errors without the overshoot effect were 0.09, 0.26, and 0.19 mm for the prostate, PPN and H&N plans respectively, which increased to 0.30, 0.39 and 0.30 mm when the overshoot effect was considered. Average errors were not affected by the overshoot effect and were 0.05, 0.13 and 0.17 mm for prostate, PPN and H&N plans respectively. The percentage of pixels passing fluence map gamma analysis at 3%/3 mm was 99.94 ± 0.25%, which reduced to 91.62 ± 11.39% at the 1%/1 mm criterion. Leaf position errors, but not gamma passing rate, were directly related to plan complexity as determined by the MCS.
Site-specific confidence intervals for average leaf position errors were set at -0.03 to 0.12 mm for prostate and -0.02 to 0.28 mm for the more complex PPN and H&N plans. For all treatment sites, the confidence interval for RMS errors with the overshoot effect was set at 0 to 0.50 mm, and a confidence interval of 68.83% was set for the percentage of pixels passing gamma analysis at 1%/1 mm. This work demonstrates the successful implementation of treatment log files to validate IMRT deliveries and shows how dynamic log files can diagnose delivery errors that phantom-based QC cannot. Machine performance was found to be directly related to plan complexity, but this is not the dominant determinant of delivery accuracy.
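The average and RMS leaf position error statistics reported in this abstract can be sketched in a few lines of Python; the array shapes and sample values below are illustrative assumptions, not data from the study:

```python
import numpy as np

def leaf_position_errors(planned, actual):
    """Average and RMS MLC leaf position error (mm).

    planned, actual: arrays of shape (n_samples, n_leaves), e.g. one row
    per 50 ms DynaLog record. Errors are signed (actual minus planned).
    """
    errors = np.asarray(actual, float) - np.asarray(planned, float)
    avg = errors.mean()                  # average signed error
    rms = np.sqrt((errors ** 2).mean())  # root mean square error
    return avg, rms

# Hypothetical example: 3 samples of 4 leaves with small delivery offsets
planned = np.zeros((3, 4))
actual = planned + np.array([[0.1, -0.1, 0.2, 0.0],
                             [0.0, 0.1, -0.2, 0.1],
                             [0.1, 0.0, 0.0, -0.1]])
avg, rms = leaf_position_errors(planned, actual)
print(round(avg, 3), round(rms, 3))  # 0.017 0.108
```

As in the study, the RMS is more sensitive than the average to leaves that overshoot in both directions, since signed errors cancel in the mean but not in the mean square.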
Abstract:
The motivation for this study was to reduce physics workload relating to patient-specific quality assurance (QA). VMAT plan delivery accuracy was determined from analysis of pre- and on-treatment trajectory log files and phantom-based ionization chamber array measurements. The correlation in this combination of measurements for patient-specific QA was investigated. The relationship between delivery errors and plan complexity was investigated as a potential method to further reduce patient-specific QA workload. Thirty VMAT plans from three treatment sites - prostate only, prostate and pelvic node (PPN), and head and neck (H&N) - were retrospectively analyzed in this work. The 2D fluence delivery reconstructed from pretreatment and on-treatment trajectory log files was compared with the planned fluence using gamma analysis. Pretreatment dose delivery verification was also carried out using gamma analysis of ionization chamber array measurements compared with calculated doses. Pearson correlations were used to explore any relationship between trajectory log file (pretreatment and on-treatment) and ionization chamber array gamma results (pretreatment). Plan complexity was assessed using the MU/arc and the modulation complexity score (MCS), with Pearson correlations used to examine any relationships between complexity metrics and plan delivery accuracy. Trajectory log files were also used to further explore the accuracy of MLC and gantry positions. Pretreatment 1%/1 mm gamma passing rates for trajectory log file analysis were 99.1% (98.7%-99.2%), 99.3% (99.1%-99.5%), and 98.4% (97.3%-98.8%) (median (IQR)) for prostate, PPN, and H&N, respectively, and were significantly correlated to on-treatment trajectory log file gamma results (R = 0.989, p < 0.001). Pretreatment ionization chamber array (2%/2 mm) gamma results were also significantly correlated with on-treatment trajectory log file gamma results (R = 0.623, p < 0.001).
Furthermore, all gamma results displayed a significant correlation with MCS (R > 0.57, p < 0.001), but not with MU/arc. Average MLC position and gantry angle errors were 0.001 ± 0.002 mm and 0.025° ± 0.008° over all treatment sites and were not found to affect delivery accuracy. However, variability in MLC speed was found to be directly related to MLC position accuracy. The accuracy of VMAT plan delivery assessed using pretreatment trajectory log file fluence delivery and ionization chamber array measurements was strongly correlated with on-treatment trajectory log file fluence delivery. The strong correlation between trajectory log file and phantom-based gamma results demonstrates potential to reduce our current patient-specific QA. Additionally, insight into MLC and gantry position accuracy through trajectory log file analysis and the strong correlation between gamma analysis results and the MCS could also provide further methodologies to optimize both the VMAT planning and QA processes.
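The Pearson correlation analysis used throughout this abstract can be sketched with a plain coefficient computation; the per-plan MCS and gamma passing-rate values below are invented for illustration, not results from the study:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two 1-D sequences."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()   # center both series
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

# Hypothetical per-plan data: modulation complexity score (MCS) versus
# 1%/1 mm trajectory-log gamma passing rate (%)
mcs   = [0.35, 0.42, 0.50, 0.61, 0.70]
gamma = [97.2, 97.9, 98.5, 99.0, 99.3]
r = pearson_r(mcs, gamma)
print(round(r, 3))  # strong positive correlation on this toy data
```

In practice one would also report a p-value (e.g. via `scipy.stats.pearsonr`) as the study does, rather than the coefficient alone.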
Abstract:
Purpose of this paper:
Recent literature indicates that around one third of perishable products finish as waste (Mena et al., 2014): 60% of this waste can be classified as avoidable (EC, 2010), suggesting logistics and operational inefficiencies along the supply chain. In developed countries perishable products are predominantly wasted in wholesale and retail (Gustavsson et al., 2011) due to customer demand uncertainty and to errors and delays in the supply chain (Fernie and Sparks, 2014). While research on the logistics of large retail supply chains is well documented, research on retail small and medium enterprises’ (SMEs) capabilities to prevent and manage waste of perishable products is in its infancy (c.f. Ellegaard, 2008) and needs further exploration. In our study, we investigate the retail logistics practice of small food retailers, the factors that contribute to perishable product waste, and the barriers and opportunities for SMEs in retail logistics to preserve product quality and participate in reverse logistics flows.
Design/methodology/approach:
As research on waste of perishable products for SMEs is scattered, we first focus on identifying key variables that contribute to the creation of avoidable waste. Second, we identify patterns of waste creation at the retail level and the possibilities for value added recovery. We use explorative case studies (Eisenhardt, 1989) and compare four SMEs and one large retailer that operate in a developed market. To gain insight into the specificities of SMEs that affect retail logistics practice, we select two types of food retailers: specialised (e.g. greengrocers and bakers) and general (e.g. convenience stores that sell perishable products as part of the assortment).
Findings:
Our preliminary findings indicate that there is a difference between large retailers and SME retailers in the factors that contribute to waste creation, as well as in the opportunities for value added recovery of products. While more factors appear to affect waste creation and management at large retailers, a small number of specific factors appear to affect SMEs. Similarly, large retailers utilise a range of practices to reduce the risks of product perishability and short shelf life, manage demand, and manage reverse logistics practices. Retail SMEs, on the other hand, have limited options to address waste creation and value added recovery. However, our findings show that specialist SMEs can successfully minimize waste and even create possibilities for value added recovery of perishable products. The data indicate that the business orientation of the SME, the buyer-supplier relationship, and the extent of adoption of lean principles in retail, coupled with SME resources, product-specific regulations and support from local authorities for waste management or partnerships with other organizations, determine the extent of successful preservation of product quality and value added recovery.
Value:
Our contribution to the SCM academic literature is threefold: first, we identify the major factors that contribute to the generation of perishable product waste in the retail environment; second, we identify possibilities for value added recovery of perishable products; and third, we present opportunities and challenges for SME retailers to manage or participate in activities of value added recovery. Our findings contribute to theory by filling a gap in the literature concerning product quality preservation and value added recovery in the context of retail logistics and SMEs.
Research limitations/implications:
Our findings are limited to insights from five case studies of retail companies that operate within a developed market. To improve on generalisability, we intend to increase the number of cases and include data obtained from the suppliers and organizations involved in reverse logistics flows (e.g. local authorities, charities, etc.).
Practical implications:
With this paper, we contribute to the improvement of retail logistics and operations in SMEs, which constitute over 99% of business activities in the UK (Rhodes, 2015). Our findings will help retail managers and owners to better understand the possibilities for value added recovery, investigate a range of logistics and retail strategies suitable for the specificities of the SME environment and, ultimately, improve their profitability and sustainability.
Abstract:
The comments of Charles Kegan Paul, the Victorian publisher who was involved in publishing the novels of the nineteenth-century British-Indian author Philip Meadows Taylor as single-volume reprints in the 1880s, are illuminating. They are indicative of the publisher's position with regard to publishing - that there was often no correlation between commercial success and the artistic merit of a work. According to Kegan Paul, a substandard or mediocre text would be commercially successful as long as it met a perceived want on the part of the public. In effect, the ruminations of the publisher suggest that a firm desirous of acquiring commercial success for a work should be an astute judge of the pre-existing wants of consumers within the market. Yet Theodor Adorno, writing in the mid-twentieth century, offers a perspective entirely distinct from Kegan Paul's, arguing that there is nothing foreordained about consumer demand for certain cultural tropes or productions. Such demands are in fact driven by an industry that preempts and conditions the possible reactions of the consumer. Both Kegan Paul's and Adorno's insights are instructive when it comes to addressing the key issues explored in this essay. Kegan Paul's comments allude to the ways in which the publisher's promotion of Philip Meadows Taylor's fictional depictions of India and its peoples was to a large extent driven in the mid- to late-nineteenth century by expectations of what metropolitan readers desired at any given time, whereas Adorno's insights reveal the ways in which British-Indian narratives and the public identity of their authors were not assured in advance but were, to a large extent, engineered by the publishing industry and the literary marketplace.
Abstract:
Closing feedback loops over an IEEE 802.11b ad hoc wireless communication network incurs many challenges: sensitivity to varying channel conditions and lower physical transmission rates tend to limit the bandwidth of the communication channel. Given that bandwidth usage and control performance are linked, a method of adapting the sampling interval based on an 'a priori', static sampling policy has been proposed and, more significantly, shown to assure stability in the mean square sense using discrete-time Markov jump linear system theory. Practical issues, including current limitations of the 802.11b protocol, the sampling policy and stability, are highlighted. Simulation results on a cart-mounted inverted pendulum show that closed-loop stability can be improved using sample rate adaptation and that the control design criteria can be met in the presence of channel errors and severe channel contention.
Abstract:
This is the first paper that shows and theoretically analyses that the presence of auto-correlation can produce considerable alterations in the Type I and Type II errors of univariate and multivariate statistical control charts. To remove this undesired effect, linear inverse ARMA filters are employed, and the application studies in this paper show that false alarms (increased Type I errors) and insensitive monitoring statistics (increased Type II errors) were eliminated.
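The pre-whitening idea can be illustrated with the simplest ARMA case, an AR(1) model; the coefficient, series length, and seed below are illustrative assumptions. Applying the inverse filter to an autocorrelated series yields approximately uncorrelated residuals, to which standard control-chart limits can then be applied without inflating Type I or Type II errors:

```python
import numpy as np

def inverse_ar1_filter(x, phi):
    """Pre-whiten an AR(1) series: e[t] = x[t] - phi * x[t-1]."""
    x = np.asarray(x, float)
    return x[1:] - phi * x[:-1]

def lag1_autocorr(s):
    """Lag-1 sample autocorrelation."""
    s = np.asarray(s, float)
    s = s - s.mean()
    return (s[1:] @ s[:-1]) / (s @ s)

# Simulate an in-control AR(1) process x[t] = phi*x[t-1] + e[t]
rng = np.random.default_rng(0)
phi, n = 0.8, 2000
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

resid = inverse_ar1_filter(x, phi)
print(round(lag1_autocorr(x), 2))      # near phi: charting x inflates alarms
print(round(lag1_autocorr(resid), 2))  # near 0: residuals suit a control chart
```

In practice the ARMA orders and coefficients would be estimated from in-control data before constructing the inverse filter, rather than assumed known as here.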
Abstract:
Many genetic studies have demonstrated an association between the 7-repeat (7r) allele of a 48-base pair variable number of tandem repeats (VNTR) in exon 3 of the DRD4 gene and the phenotype of attention deficit hyperactivity disorder (ADHD). Previous studies have shown inconsistent associations between the 7r allele and neurocognitive performance in children with ADHD. We investigated the performance of 128 children with and without ADHD on the Fixed and Random versions of the Sustained Attention to Response Task (SART). We employed time-series analyses of reaction-time data to allow a fine-grained analysis of reaction time variability, a candidate endophenotype for ADHD. Children were grouped into either the 7r-present group (possessing at least one copy of the 7r allele) or the 7r-absent group. The ADHD group made significantly more commission errors and was significantly more variable in RT in terms of fast moment-to-moment variability than the control group, but no effect of genotype was found on these measures. Children with ADHD without the 7r allele made significantly more omission errors, were significantly more variable in the slow frequency domain and showed less sensitivity to the signal (d') than children with ADHD who possessed the 7r allele and control children with or without the 7r allele. These results highlight the utility of time-series analyses of reaction time data for delineating the neuropsychological deficits associated with ADHD and the DRD4 VNTR. Absence of the 7-repeat allele in children with ADHD is associated with a neurocognitive profile of drifting sustained attention that gives rise to variable and inconsistent performance. (c) 2008 Wiley-Liss, Inc.
Abstract:
We analyze the effect of a quantum error correcting code on the entanglement of encoded logical qubits in the presence of a dephasing interaction with a correlated environment. Such a correlated reservoir introduces entanglement between physical qubits. We show that for short times the quantum error correction interprets such entanglement as errors and suppresses it. However, at longer times, although quantum error correction is no longer able to correct errors, it enhances the rate of entanglement production due to the interaction with the environment.
Abstract:
Behavioral effects of a novel anti-inflammatory, SEN1176, were investigated. This pyrrolo[3,2-e][1,2,4]triazolo[1,5-a]pyrimidine suppresses amyloid-β (Aβ)1-42-induced macrophage production of nitric oxide, TNF-α, IL-1β, and IL-6 in a dose-dependent fashion, an activity profile consistent with SEN1176 being a neuroinflammation inhibitor. Using male Sprague-Dawley rats, SEN1176 was examined relative to detrimental behavioral effects induced following bilateral intrahippocampal (IH) injections of aggregated Aβ1-42. The rats were trained to respond under an alternating-lever cyclic-ratio (ALCR) schedule of food reinforcement, enabling measurement of parameters of operant performance that reflect aspects of learning and memory. Under the ALCR schedule, orally administered SEN1176 at 5, 20, or 30 mg/kg was effective in reducing the behavioral deficit caused by bilateral IH aggregated Aβ1-42 injections in a dose-related manner over a 90-day treatment period. SEN1176 at 20 and 30 mg/kg significantly reduced lever-switching errors and, at doses of 5, 10, and 30 mg/kg, significantly reduced incorrect lever perseverations, indicating a reduction of the behavioral deficit induced as a result of inflammation following IH Aβ1-42 injections. When treatment with SEN1176 was instigated 30 days after IH Aβ1-42 injections, it resulted in progressive protection, and withdrawal of SEN1176 treatment 60 days after IH Aβ1-42 injections revealed partial retention of the protective effect. SEN1176 also significantly reduced the number of activated astrocytes adjacent to the aggregated Aβ1-42 injection sites. These results indicate the potential of SEN1176 for alleviating chronic neuroinflammatory processes related to brain Aβ deposition that affect learning and memory in Alzheimer's disease.
Abstract:
The finite element method plays an extremely important role in forging process design as it provides a valid means to quantify forging errors and thereby govern die shape modification to improve the dimensional accuracy of the component. However, this dependency on process simulation could raise significant problems and present a major drawback if the finite element simulation results were inaccurate. This paper presents a novel approach to assess the dimensional accuracy and shape quality of aeroengine blades formed from finite element hot-forging simulation. The proposed virtual inspection system uses conventional algorithms adopted by modern coordinate measurement processes as well as the latest free-form surface evaluation techniques to provide a robust framework for virtual forging error assessment. Established techniques for the physical registration of real components have been adapted to localise virtual models in relation to a nominal Design Coordinate System. Blades are then automatically analysed using a series of intelligent routines to generate measurement data and compute dimensional errors. The results of a comparison study indicate that the virtual inspection results and actual coordinate measurement data are highly comparable, validating the approach as an effective and accurate means to quantify forging error in a virtual environment. Consequently, this provides adequate justification for the implementation of the virtual inspection system in the virtual process design, modelling and validation of forged aeroengine blades in industry.
Abstract:
Melt viscosity is a key indicator of product quality in polymer extrusion processes. However, real-time monitoring and control of viscosity are difficult to achieve. In this article, a novel “soft sensor” approach based on dynamic gray-box modeling is proposed. The soft sensor involves a nonlinear finite impulse response model with adaptable linear parameters for real-time prediction of the melt viscosity based on the process inputs; the model output is then used as an input of a model with a simple fixed structure to predict the barrel pressure, which can be measured online. Finally, the predicted pressure is compared to the measured value and the corresponding error is used as a feedback signal to correct the viscosity estimate. This novel feedback structure enables the online adaptability of the viscosity model in response to modeling errors and disturbances, hence producing a reliable viscosity estimate. The experimental results on different material/die/extruder combinations confirm the effectiveness of the proposed “soft sensor” method based on dynamic gray-box modeling for real-time monitoring and control of polymer extrusion processes. POLYM. ENG. SCI., 2012. © 2012 Society of Plastics Engineers
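The feedback structure described in this abstract (a viscosity model whose output feeds a pressure model, with the pressure prediction error fed back to correct the viscosity estimate) can be sketched as below. The models, signals, gain, and integral-style update are illustrative assumptions, not the authors' implementation:

```python
def soft_sensor(eta_model_seq, p_meas_seq, predict_pressure, gain=0.2):
    """Feedback soft-sensor sketch: correct a viscosity model's output
    using the barrel-pressure prediction error (integral-style update).

    eta_model_seq: outputs of the (possibly biased) viscosity model
    p_meas_seq:    measured barrel pressures at the same time steps
    """
    corr, estimates = 0.0, []
    for eta_model, p_meas in zip(eta_model_seq, p_meas_seq):
        eta_hat = eta_model + corr           # corrected viscosity estimate
        p_pred = predict_pressure(eta_hat)   # fixed-structure pressure model
        corr += gain * (p_meas - p_pred)     # feedback correction term
        estimates.append(eta_hat)
    return estimates

# Toy demo: assume pressure p = 2*eta; the uncorrected viscosity model is
# biased low (90) while the true viscosity is 100, so p_meas = 200.
est = soft_sensor([90.0] * 50, [200.0] * 50, lambda eta: 2.0 * eta)
print(round(est[0], 2), round(est[-1], 2))  # 90.0 100.0
```

The demo shows the key property of the feedback structure: the online pressure measurement drives the corrected estimate toward the true viscosity even though the viscosity model itself is biased.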