44 results for Errors and blunders, Literary
Abstract:
The efficient development of multi-threaded software has, for many years, been an unsolved problem in computer science. Finding a solution to this problem has become urgent with the advent of multi-core processors. Furthermore, the problem has become more complicated because multi-cores are everywhere (desktop, laptop, embedded system). As such, they execute generic programs which exhibit very different characteristics than the scientific applications that have been the focus of parallel computing in the past.
Implicitly parallel programming is an approach to parallel programming that promises high productivity and efficiency and rules out synchronization errors and race conditions by design. There are two main ingredients to implicitly parallel programming: (i) a conventional sequential programming language that is extended with annotations that describe the semantics of the program and (ii) an automatic parallelizing compiler that uses the annotations to increase the degree of parallelization.
It is extremely important that the annotations and the automatic parallelizing compiler are designed with the target application domain in mind. In this paper, we discuss the Paralax approach to implicitly parallel programming and we review how the annotations and the compiler design help to successfully parallelize generic programs. We evaluate Paralax on SPECint benchmarks, which are a model for such programs, and demonstrate scalable speedups, up to a factor of 6 on 8 cores.
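As a rough illustration of the annotation-plus-compiler idea (not the Paralax annotation language itself), the Python sketch below marks a function whose calls on distinct inputs are independent; a parallelizing tool that trusts such an annotation could distribute the work across cores, while the unannotated path stays sequential. The decorator name, `process_item` and the runtime check are all hypothetical.

```python
from concurrent.futures import ProcessPoolExecutor

def independent_iterations(func):
    """Hypothetical annotation: promises that calls to `func` on distinct
    inputs share no mutable state, so they may safely run in parallel."""
    func._parallel_safe = True          # metadata a parallelizing tool could read
    return func

@independent_iterations
def process_item(x):
    # Purely local computation: no shared state, no side effects.
    return x * x

def run(items):
    # A parallelizing compiler/runtime that trusts the annotation could replace
    # the sequential loop with the pool-based version below.
    if getattr(process_item, "_parallel_safe", False):
        with ProcessPoolExecutor() as pool:
            return list(pool.map(process_item, items))
    return [process_item(x) for x in items]   # sequential fallback

if __name__ == "__main__":
    print(run(range(8)))
```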
Abstract:
Aims: To characterize the population pharmacokinetics of ranitidine in critically ill children and to determine the influence of various clinical and demographic factors on its disposition. Methods: Data were collected prospectively from 78 paediatric patients (n = 248 plasma samples) who received oral or intravenous ranitidine for prophylaxis against stress ulcers, gastrointestinal bleeding or the treatment of gastro-oesophageal reflux. Plasma samples were analysed using high-performance liquid chromatography, and the data were subjected to population pharmacokinetic analysis using nonlinear mixed-effects modelling. Results: A one-compartment model best described the plasma concentration profile, with an exponential structure for interindividual errors and a proportional structure for intra-individual error. After backward stepwise elimination, the final model showed a significant decrease in objective function value (−12.618; P < 0.001) compared with the weight-corrected base model. Final population parameter estimates were 32.1 L h⁻¹ for total clearance and 285 L for volume of distribution, both allometrically scaled to a 70 kg adult. Final estimates for the absorption rate constant and bioavailability were 1.31 h⁻¹ and 27.5%, respectively. No significant relationship was found between age and weight-corrected ranitidine pharmacokinetic parameters in the final model, while the covariate for cardiac failure or surgery was shown to reduce clearance significantly, by a factor of 0.46. Conclusions: Currently, ranitidine dose recommendations are based on children's weights. However, our findings suggest that a dosing scheme that takes into consideration both weight and cardiac failure/surgery would be more appropriate in order to avoid administration of higher or more frequent doses than necessary.
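For orientation, a parameterisation consistent with the reported estimates is sketched below. The allometric exponents 0.75 (clearance) and 1 (volume) are the conventional values and are an assumption here, as the abstract does not state them; CS_i is an indicator that is 1 for children with cardiac failure or surgery and 0 otherwise, reflecting the reported 0.46 multiplier on clearance.

```latex
% Sketch only: allometric exponents assumed; CS_i = 1 for cardiac failure/surgery, else 0
CL_i = 32.1 \,\mathrm{L\,h^{-1}} \times \left(\frac{WT_i}{70\,\mathrm{kg}}\right)^{0.75} \times 0.46^{\,CS_i},
\qquad
V_i  = 285 \,\mathrm{L} \times \left(\frac{WT_i}{70\,\mathrm{kg}}\right)^{1.0}
```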
Abstract:
The article opens with an introduction to Joel Chandler Harris and his literary output. As one of the “local colourists,” Harris depicted American plantation life in 19th-century Georgia and included many cultural as well as folk elements in his works. The following analysis of his stories about Uncle Remus focuses on (1) the levels of narration; (2) the linguistic complexity of the text (the stories abound in slang and dialectal expressions); (3) the form; and (4) the folklore value. These four aspects guide the discussion of the only Polish translation of the Uncle Remus stories. Prepared by Wladyslawa Wielinska in 1929, it was addressed to children. Therefore, the article aims to determine the profile of the translation as a children’s book, to consider it in relation to the skopos of the source text and to establish the extent to which it preserved the peculiar character of the Uncle Remus stories.
Abstract:
Background: Ineffective risk stratification can delay diagnosis of serious disease in patients with hematuria. We applied a systems biology approach to analyze clinical, demographic and biomarker measurements (n = 29) collected from 157 hematuric patients: 80 urothelial cancer (UC) and 77 controls with confounding pathologies.
Methods: On the basis of biomarkers, we conducted agglomerative hierarchical clustering to identify patient and biomarker clusters. We then explored the relationship between the patient clusters and clinical characteristics using Chi-square analyses. We determined classification errors and areas under the receiver operating characteristic curve of Random Forest Classifiers (RFC) for patient subpopulations, using the biomarker clusters to reduce the dimensionality of the data.
Results: Agglomerative clustering identified five patient clusters and seven biomarker clusters. Final diagnostic categories were non-randomly distributed across the five patient clusters. In addition, two of the patient clusters were enriched with patients with ‘low cancer-risk’ characteristics. The biomarkers which contributed to the diagnostic classifiers for these two patient clusters were similar. In contrast, three of the patient clusters were significantly enriched with patients harboring ‘high cancer-risk’ characteristics, including proteinuria, aggressive pathological stage and grade, and malignant cytology. Patients in these three clusters included controls, that is, patients with other serious disease and patients with cancers other than UC. Biomarkers which contributed to the diagnostic classifiers for the largest ‘high cancer-risk’ cluster were different from those contributing to the classifiers for the ‘low cancer-risk’ clusters. Biomarkers which contributed to subpopulations split according to smoking status, gender and medication were also different.
Conclusions: The systems biology approach applied in this study allowed the hematuric patients to cluster naturally, on the basis of the heterogeneity within their biomarker data, into five distinct risk subpopulations. Our findings highlight an approach with the promise to unlock the potential of biomarkers. This will be especially valuable in bladder cancer diagnostics, where biomarkers are urgently required. Clinicians could interpret risk classification scores in the context of clinical parameters at the time of triage. This could reduce cystoscopies and enable priority diagnosis of aggressive diseases, leading to improved patient outcomes at reduced costs.
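A minimal sketch of the analysis pattern described above (agglomerative clustering of patients and biomarkers, then a Random Forest classifier per patient cluster using the biomarker clusters for dimensionality reduction), written with scikit-learn. The data are synthetic and the cluster counts, feature choices and scoring are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(157, 29))      # 157 patients x 29 biomarker/clinical measurements (synthetic)
y = rng.integers(0, 2, size=157)    # 1 = urothelial cancer, 0 = control (synthetic labels)

# Cluster patients on their measurement profiles (five clusters, as reported).
patient_clusters = AgglomerativeClustering(n_clusters=5).fit_predict(X)
# Cluster the biomarkers themselves by clustering the transposed matrix (seven clusters).
biomarker_clusters = AgglomerativeClustering(n_clusters=7).fit_predict(X.T)

# One representative feature per biomarker cluster, to reduce dimensionality.
representative = [int(np.where(biomarker_clusters == c)[0][0]) for c in range(7)]

# A Random Forest classifier per patient cluster, scored by cross-validated AUC.
for c in range(5):
    mask = patient_clusters == c
    if mask.sum() < 15 or np.bincount(y[mask], minlength=2).min() < 3:
        continue                    # skip clusters too small or too unbalanced to cross-validate
    rfc = RandomForestClassifier(n_estimators=200, random_state=0)
    auc = cross_val_score(rfc, X[mask][:, representative], y[mask],
                          cv=3, scoring="roc_auc").mean()
    print(f"patient cluster {c}: n={mask.sum()}, cross-validated AUC ~= {auc:.2f}")
```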
Abstract:
Purpose: To determine the efficacy of a custom-made wheelchair simulation in training children to use a powered wheelchair (PWC). Design: Randomised controlled trial employing the 4C/ID-model of learning. Twenty-eight typically developing children (13M, 15F; mean age 6 years, SD 6 months) were assessed on their operation of a PWC using a functional evaluation rating scale. Participants were randomly assigned to the intervention condition (eight 30-minute training sessions using a joystick-operated wheelchair simulation) or the control condition (no task), and were re-assessed on their PWC use following the intervention phase. Additional data from the simulation on completion times, errors and total scores were recorded for the intervention group. Results: Analysis of variance showed a main effect of time, with planned comparisons revealing a statistically significant change in PWC use for the intervention (p = 0.022) but not the control condition. Whilst the intervention group showed greater improvement than the controls, this did not reach statistical significance. Multiple regression analyses showed that gender was predictive of pre-test (p = 0.005) functional ability. Implications: A simulated wheelchair task appears to be effective in helping children learn to operate a PWC. Greater attention should be given to female learners, who underperformed when compared to their male counterparts. This low-cost intervention could easily be employed at home to reduce PWC training times in children with motor disorders.
Abstract:
The range of potential applications for indoor and campus-based personnel localisation has led researchers to create a wide spectrum of algorithmic approaches and systems. However, the majority of the proposed systems overlook the unique radio environment presented by the human body, leading to systematic errors and inaccuracies when deployed in this context. In this paper, RSSI-based Monte Carlo Localisation was implemented using commercial off-the-shelf 868 MHz hardware, and empirical data were gathered across a relatively large number of scenarios within a single indoor office environment. These data showed that the body shadowing effect introduced path skew into location estimates. It was also shown that, by using two body-worn nodes in concert, the effect of body shadowing can be mitigated by averaging the estimated positions of the two nodes worn on either side of the body.
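A toy illustration of the mitigation step described above: averaging the position estimates from two nodes worn on opposite sides of the body, so that the roughly opposing skews introduced by body shadowing partly cancel. The positions and skew magnitudes are made-up numbers, not data from the paper.

```python
import numpy as np

def fuse_dual_node_estimates(est_left, est_right):
    """Average the (x, y) position estimates from two body-worn nodes.
    If body shadowing skews each node's estimate in roughly opposite
    directions, the midpoint reduces the systematic error."""
    return (np.asarray(est_left) + np.asarray(est_right)) / 2.0

# Illustrative numbers only: true position and two skewed per-node estimates.
true_pos = np.array([5.0, 3.0])
est_left = np.array([4.1, 3.2])    # skewed one way by shadowing
est_right = np.array([5.8, 2.9])   # skewed the other way

fused = fuse_dual_node_estimates(est_left, est_right)
print("per-node errors:", np.linalg.norm(est_left - true_pos),
      np.linalg.norm(est_right - true_pos))
print("fused error:", np.linalg.norm(fused - true_pos))
```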
Abstract:
In recent years, there has been a significant increase in the number of bridges which are being instrumented and monitored on an ongoing basis. This is in part due to the introduction of bridge management systems designed to provide a high level of protection to the public and early warning if the bridge becomes unsafe. This paper investigates a novel alternative: a low-cost method in which a vehicle fitted with accelerometers on its axles is used to monitor the dynamic behaviour of bridges. A simplified half-car vehicle-bridge interaction model is used in theoretical simulations to test the effectiveness of the approach in identifying the damping ratio of the bridge. The method is tested in theoretical simulations for a range of bridge spans and vehicle velocities, and the influences of road roughness, the initial vibratory condition of the vehicle, signal noise, modelling errors and frequency matching on the accuracy of the results are investigated.
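The paper's identification works through a half-car vehicle-bridge interaction model driven by axle accelerations; as a much simpler point of reference only, the sketch below estimates a damping ratio from a synthetic free-decay record via the logarithmic decrement. This is a standard textbook technique, not the paper's method, and all signal parameters are assumptions.

```python
import numpy as np

# Synthetic free-decay response of a single bridge mode (illustrative only).
zeta_true, f_n = 0.02, 4.0                 # damping ratio, natural frequency [Hz]
t = np.linspace(0.0, 5.0, 2000)
f_d = f_n * np.sqrt(1.0 - zeta_true**2)    # damped natural frequency
signal = np.exp(-zeta_true * 2 * np.pi * f_n * t) * np.cos(2 * np.pi * f_d * t)

# Logarithmic decrement from successive positive peaks.
peaks = [i for i in range(1, len(signal) - 1)
         if signal[i] > signal[i - 1] and signal[i] > signal[i + 1] and signal[i] > 0]
delta = np.mean(np.log(signal[peaks[:-1]] / signal[peaks[1:]]))
zeta_est = delta / np.sqrt(4 * np.pi**2 + delta**2)
print(f"estimated damping ratio: {zeta_est:.4f} (true {zeta_true})")
```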
Abstract:
This paper presents a statistical model for the thermal behaviour of the line, based on lab tests and field measurements. The model is based on Partial Least Squares (PLS) multiple regression and is used for Dynamic Line Rating (DLR) in a wind-intensive area. DLR provides extra capacity to the line, over the traditional seasonal static rating, which makes it possible to defer the need for reinforcing the existing network or building new lines. The proposed PLS model has a number of appealing features: the model is linear, so it is straightforward to use for predicting the line rating for future periods from the available weather forecast. Unlike the available physical models, the proposed model does not require any physical parameters of the line, which avoids the inaccuracies resulting from errors and/or variations in these parameters. The developed model is compared with a physical model, the Cigre model, and shows very good accuracy in predicting the conductor temperature as well as in determining the line rating for future time periods.
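A minimal sketch of the modelling idea (PLS regression from weather variables to conductor temperature) using scikit-learn's PLSRegression. The feature list, the synthetic data-generating relationship and the number of components are assumptions for illustration, not the paper's fitted model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n = 500
# Illustrative inputs: ambient temperature [C], wind speed [m/s], wind direction [deg],
# solar radiation [W/m^2], line current [A].
X = np.column_stack([
    rng.uniform(-5, 30, n), rng.uniform(0, 15, n),
    rng.uniform(0, 360, n), rng.uniform(0, 1000, n), rng.uniform(100, 600, n),
])
# Synthetic conductor temperature: warms with ambient, sun and current, cools with wind.
y = 0.9 * X[:, 0] - 1.2 * X[:, 1] + 0.01 * X[:, 3] + 0.03 * X[:, 4] + rng.normal(0, 1.5, n)

# Fit a linear PLS model on the first 400 samples, evaluate on the remaining 100.
pls = PLSRegression(n_components=3).fit(X[:400], y[:400])
pred = pls.predict(X[400:]).ravel()
rmse = np.sqrt(np.mean((pred - y[400:]) ** 2))
print(f"hold-out RMSE on conductor temperature: {rmse:.2f} C")
```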
Abstract:
Previous studies on work instruction delivery for complex assembly tasks have shown that the mode and delivery method of the instructions in an engineering context can influence both build time and product quality. The benefits of digital, animated instructional formats compared to static pictures and text-only formats have already been demonstrated. Although pictograms have found applications for relatively straightforward operations and activities, their applicability to relatively complex assembly tasks has yet to be demonstrated. This study compares animated instructions and pictograms for the assembly of an aircraft panel. Based around a series of build experiments, the work records build time as well as the number of media references in order to measure and compare build efficiency. The number of build errors and the time required to correct them is also recorded. The experiments involved five participants completing five builds over five consecutive days for each media type. Results showed that, on average, the total build time was 13.1% lower for the group using animated instructions. The benefit of animated instructions on build time was most prominent in the first three builds; by build four this benefit had disappeared. There was a similar number of instructional references for the two groups over the five builds, but the pictogram users required considerably more references during build 1. There were more errors among the group using pictograms, requiring more time for corrections during the build.
Abstract:
The increased access to books afforded to blind people via e-publishing has given them long-sought independence for both recreational and educational reading. In most cases, blind readers access materials using speech output. For some content, such as highly technical texts, music, and graphics, speech is not an appropriate access modality as it does not promote deep understanding. Therefore, blind braille readers often prefer electronic braille displays, but these are prohibitively expensive. The search is on, therefore, for a low-cost refreshable display that would go beyond current technologies and deliver graphical content as well as text. Many solutions have been proposed, some of which reduce costs by restricting the number of characters that can be displayed, even down to a single braille cell. In this paper, we demonstrate that restricting tactile cues during braille reading leads to poorer performance in a letter recognition task. In particular, we show that a lack of sliding contact between the fingertip and the braille reading surface results in more errors and that the number of errors increases as a function of presentation speed. These findings suggest that single-cell displays which do not incorporate sliding contact are likely to be less effective for braille reading.
Abstract:
We investigate a collision-sensitive secondary network that intends to opportunistically aggregate and utilize spectrum of a primary network to achieve higher data rates. In opportunistic spectrum access with imperfect sensing of idle primary spectrum, secondary transmissions can collide with primary transmissions. When the secondary network aggregates more channels in the presence of imperfect sensing, collisions may occur more often, limiting the performance gained by spectrum aggregation. In this context, we aim to address a fundamental question: how much spectrum aggregation is worthwhile under imperfect sensing? For collision occurrence, we focus on two different types of collision: one imposed by asynchronous transmission and the other by imperfect spectrum sensing. The collision probability is derived in closed form in terms of various network parameters: the primary traffic load, secondary user transmission parameters, spectrum sensing errors, and the number of aggregated sub-channels. In addition, the impact of spectrum aggregation on data rate is analysed under a collision probability constraint. We then solve an optimal spectrum aggregation problem and propose a dynamic spectrum aggregation approach to increase the data rate subject to practical collision constraints. Our simulation results show clearly that the proposed approach outperforms a benchmark that passively aggregates sub-channels without collision awareness.
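The closed-form expression itself is not given in the abstract. As a purely illustrative stand-in, the Monte Carlo sketch below estimates how the probability of at least one collision grows with the number of aggregated sub-channels when each sensed-idle channel may actually be busy (missed detection). The independence assumptions and all parameter values are assumptions, not the paper's model.

```python
import numpy as np

def collision_probability(n_agg, p_busy=0.3, p_miss=0.1, trials=200_000, seed=0):
    """Monte Carlo estimate of P(at least one aggregated sub-channel collides).

    Illustrative model only: each sub-channel is independently busy with the
    primary with probability p_busy, the sensor misses that activity with
    probability p_miss, and the secondary transmits on every channel it
    believes is idle, colliding wherever a busy channel was missed."""
    rng = np.random.default_rng(seed)
    busy = rng.random((trials, n_agg)) < p_busy      # true primary activity
    missed = rng.random((trials, n_agg)) < p_miss    # sensing fails to detect it
    return np.mean(np.any(busy & missed, axis=1))

for k in (1, 2, 4, 8):
    # Under these assumptions the analytical value is 1 - (1 - p_busy * p_miss)**k.
    print(f"{k} aggregated sub-channels: collision probability ~ {collision_probability(k):.3f}")
```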
Abstract:
In discrete choice experiments, respondents are generally assumed to consider all of the attributes across each of the alternatives and to choose their most preferred. However, results in this paper indicate that many respondents employ simplified lexicographic decision-making rules, whereby they have a ranking of the attributes, but their choice of an alternative is based solely on the level of their most important attribute(s). Not accounting for these simple decision-making heuristics introduces systematic errors and leads to biased point estimates, as they are a violation of the continuity axiom and a departure from compensatory decision-making. In this paper, the implications of lexicographic preferences are examined. In particular, using a mixed logit specification, the paper investigates the sensitivity of individual-specific willingness-to-pay (WTP) estimates conditional on whether lexicographic decision-making rules are accounted for in the modelling of discrete choice responses. Empirical results are obtained from a discrete choice experiment that was carried out to assess the value of a number of rural landscape attributes in Ireland.
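As a rough illustration of how a lexicographic responder might be flagged before modelling, the function below checks whether every choice a respondent made can be explained by a single attribute alone (always picking the alternative with the strictly best level of, say, a cost attribute). The data layout, attribute names and tie-handling rule are hypothetical, not the paper's procedure.

```python
def is_lexicographic(choice_sets, chosen, attribute, best=max):
    """Return True if, in every choice set, the chosen alternative has the
    strictly best level of `attribute`, consistent with a simple
    lexicographic rule on that attribute alone.

    choice_sets: list of choice sets, each a list of dicts of attribute levels.
    chosen:      index of the chosen alternative in each choice set.
    attribute:   attribute name to test (for a cost attribute, pass best=min).
    """
    for alternatives, pick in zip(choice_sets, chosen):
        levels = [alt[attribute] for alt in alternatives]
        if levels[pick] != best(levels) or levels.count(best(levels)) > 1:
            return False   # chosen alternative is not the unique best on this attribute
    return True

# Hypothetical respondent who always picks the cheapest alternative.
sets = [
    [{"cost": 10, "landscape": 2}, {"cost": 25, "landscape": 5}],
    [{"cost": 40, "landscape": 1}, {"cost": 15, "landscape": 4}],
]
print(is_lexicographic(sets, chosen=[0, 1], attribute="cost", best=min))   # True
```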
Abstract:
This paper focuses on quantifying the benefits of pictogram-based instructions relative to static images for work instruction delivery. The assembly of a stiffened aircraft panel has been used as an exemplar for the work, which seeks to address the challenge of identifying an instructional mode that can be location- or language-neutral while at the same time optimising assembly build times and maintaining build quality. Key performance parameters, measured using a series of panel build experiments conducted by two separate groups, were: overall build time, the number of subject references to instructional media, the number of build errors and the time taken to correct any mistakes. Overall build time over five builds for the group using pictogram instructions was about 20% lower than for the group using image-based instructions, and the pictogram group made fewer errors. Although previous work identified that animated instructions result in optimal build times, the language neutrality of pictograms, as well as the fact that they can be used without visualisation hardware, means that, on balance, they have broader applicability in terms of transferring assembly knowledge to the manufacturing environment.
Abstract:
Royal Charter providing the Company of Stationers with corporate legal status within the City of London, and conferring on them exclusive control over printing within England. The grant of the Charter ensured that the Company's licensing procedures became the standard by which members of the book trade secured the right to print and publish literary works, giving rise to what is generally referred to as ‘stationers' copyright'.
The grant of the Charter by Mary is often understood as the point at which the monarchy established an effective regulatory institution to control and censure the press, in the guise of the Stationers' Company, in exchange for an absolute monopoly over the production of printed works. In fact, the commentary suggests that censorship of the press throughout the Tudor period remained an essentially ad hoc and reactive phenomenon, and that both Mary and Elizabeth relied, not primarily upon the Company of Stationers, but on the use of statutory instruments and royal proclamations to censure heretical and treasonous texts.