949 results for Error-Free Transformations
Abstract:
Software forms an important part of the interface between citizens and their government. An increasing number of government functions are being performed, controlled, or delivered electronically. This software, like all language, is never value-neutral, but must, to some extent, reflect the values of the coder and proprietor. The move that many governments are making towards e-governance, and the increasing reliance that is being placed upon software in government, necessitate a rethinking of the relationships of power and control that are embodied in software.
Abstract:
The problem of steady subcritical free surface flow past a submerged inclined step is considered. The asymptotic limit of small Froude number is treated, with particular emphasis on the effect that changing the angle of the step face has on the surface waves. As demonstrated by Chapman & Vanden-Broeck (2006), the divergence of a power series expansion in powers of the square of the Froude number is caused by singularities in the analytic continuation of the free surface; for an inclined step, these singularities may correspond to either the corners or stagnation points of the step, or both, depending on the angle of incline. Stokes lines emanate from these singularities, and exponentially small waves are switched on at the point the Stokes lines intersect with the free surface. Our results suggest that for a certain range of step angles, two wavetrains are switched on, but the exponentially subdominant one is switched on first, leading to an intermediate wavetrain not previously noted. We extend these ideas to the problem of flow over a submerged bump or trench, again with inclined sides. This time there may be two, three or four active Stokes lines, depending on the inclination angles. We demonstrate how to construct a base topography such that wave contributions from separate Stokes lines are of equal magnitude but opposite phase, thus cancelling out. Our asymptotic results are complemented by numerical solutions to the fully nonlinear equations.
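The mechanism described above can be summarised schematically. In the notation commonly used for this class of exponential-asymptotics problems (the symbols ε, χ and the prefactor here are illustrative assumptions, not taken from this paper), the free-surface elevation has a divergent algebraic series plus an exponentially small wave term:

```latex
% Schematic only: eta_n are the algebraic-series coefficients, chi is the
% singulant generated at a corner or stagnation-point singularity, and the
% wave term is switched on where the Stokes line (Im chi = 0, Re chi > 0)
% intersects the free surface.
\eta \sim \sum_{n=0}^{N-1} \epsilon^{\,n}\,\eta_n
      \;+\; \mathcal{A}\, e^{-\chi/\epsilon},
\qquad \epsilon = F^{2} \ll 1 .
```

When two singularities are active, each contributes its own singulant, and the corresponding wave amplitudes scale as exp(-Re χ/ε); the contribution with larger Re χ is exponentially subdominant even if its Stokes line is crossed first, which is the origin of the intermediate wavetrain described in the abstract.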
Abstract:
The quality of conceptual business process models is highly relevant for the design of corresponding information systems. In particular, a precise measurement of model characteristics can be beneficial from a business perspective, helping to save costs through early error detection. This is just as true from a software engineering point of view, where models facilitate stakeholder communication and software system design. Research has investigated several proposals regarding measures for business process models, from a rather correlational perspective. This is helpful for understanding, for example, size and complexity as general driving forces of error probability. Yet design decisions usually have to build on thresholds, which can reliably indicate that a certain counter-action has to be taken. This cannot be achieved by providing measures alone; it requires a systematic identification of effective and meaningful thresholds. In this paper, we derive thresholds for a set of structural measures for predicting errors in conceptual process models. To this end, we use a collection of 2,000 business process models from practice as a means of determining thresholds, applying an adaptation of the ROC curves method. Furthermore, an extensive validation of the derived thresholds was conducted using 429 EPC models from an Australian financial institution. Finally, significant thresholds were adapted to refine existing modeling guidelines in a quantitative way.
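One common adaptation of the ROC-curve method for deriving a threshold is to pick the cut-off that maximises Youden's J statistic (TPR − FPR). The sketch below illustrates that idea on toy data; the measure ("number of nodes") and the data are illustrative assumptions, not the paper's measures or model collection.

```python
# Hedged sketch: derive an error-prediction threshold for a process-model
# measure by maximising Youden's J = TPR - FPR over candidate cut-offs.

def best_threshold(samples):
    """samples: list of (measure_value, has_error) pairs.
    Returns the cut-off t (predict 'error' when value >= t) with maximal J."""
    candidates = sorted({v for v, _ in samples})
    pos = sum(1 for _, e in samples if e)       # models with errors
    neg = len(samples) - pos                    # error-free models
    best_t, best_j = None, float("-inf")
    for t in candidates:
        tp = sum(1 for v, e in samples if v >= t and e)
        fp = sum(1 for v, e in samples if v >= t and not e)
        j = tp / pos - fp / neg                 # Youden's J statistic
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Toy data: (number of nodes in the model, model contains an error?)
data = [(10, False), (14, False), (18, False), (25, True),
        (31, True), (12, False), (40, True), (22, True), (16, False)]
print(best_threshold(data))  # the size cut-off separating error-prone models
```

On this toy set the method returns the smallest measure value above which errors dominate; on real model collections the maximum J is of course well below 1, and the significance of each threshold must be validated separately, as the paper does.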
Abstract:
A rule-based approach for classifying previously identified medical concepts in clinical free text into an assertion category is presented. There are six assertion categories for the task: Present, Absent, Possible, Conditional, Hypothetical and Not associated with the patient. The assertion classification algorithms were largely based on extending the popular NegEx and ConText algorithms. In addition, the clinical terminology SNOMED CT and other publicly available dictionaries were used to classify assertions that did not fit the NegEx/ConText model. The data for this task include discharge summaries from Partners HealthCare and from Beth Israel Deaconess Medical Center, as well as discharge summaries and progress notes from the University of Pittsburgh Medical Center. The set consists of 349 discharge reports, each with pairs of ground-truth concept and assertion files for system development, and 477 reports for evaluation. The system's performance on the evaluation data set was 0.83 for each of recall, precision and F1-measure. Although the rule-based system shows promise, further improvements can be made by incorporating machine learning approaches.
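The core NegEx/ConText idea is to look for trigger phrases in a window of context around the concept and map them to an assertion category. The following is a heavily simplified sketch of that mechanism; the trigger lists, window size and category names are illustrative assumptions, not the rules used by the system described above.

```python
# Minimal rule-based sketch in the spirit of NegEx/ConText: classify a
# concept's assertion from trigger phrases preceding it in the sentence.
# Trigger lists here are tiny illustrative samples, not the real lexicons.

TRIGGERS = {
    "absent":       ["no ", "denies ", "without ", "negative for "],
    "possible":     ["possible ", "probable ", "suspected "],
    "hypothetical": ["if ", "should ", "return for "],
    "conditional":  ["when ", "on exertion"],
    "not_patient":  ["family history of ", "mother had "],
}

def classify_assertion(sentence, concept, window=6):
    """Return the assertion category for `concept` within `sentence`."""
    sentence = sentence.lower()
    idx = sentence.find(concept.lower())
    if idx == -1:
        raise ValueError("concept not found in sentence")
    # Inspect only a few words of context preceding the concept.
    pre = " ".join(sentence[:idx].split()[-window:]) + " "
    for category, phrases in TRIGGERS.items():
        if any(p in pre for p in phrases):
            return category
    return "present"   # default when no trigger fires

print(classify_assertion("The patient denies chest pain.", "chest pain"))  # absent
print(classify_assertion("Possible pneumonia on x-ray.", "pneumonia"))     # possible
print(classify_assertion("Patient reports chest pain.", "chest pain"))     # present
```

Real implementations also handle scope-terminating words, post-concept triggers, and fall back to terminology lookups (e.g. SNOMED CT) for cases the trigger model does not cover, as the abstract notes.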
Abstract:
Error correction is perhaps the most widely used method for responding to student writing. While various studies have investigated the effectiveness of providing error correction, there has been relatively little research incorporating teachers' beliefs, practices, and students' preferences in written error correction. The current study adopted features of an ethnographic research design in order to explore the beliefs and practices of ESL teachers, and investigate the preferences of L2 students regarding written error correction in the context of a language institute situated in the Brisbane metropolitan district. In this study, two ESL teachers and two groups of adult intermediate L2 students were interviewed and observed. The beliefs and practices of the teachers were elicited through interviews and classroom observations. The preferences of L2 students were elicited through focus group interviews. Responses of the participants were coded and analysed. Results of the teacher interviews showed that teachers believe that providing written error correction has both advantages and disadvantages. Teachers believe that providing written error correction helps students improve their proof-reading skills in order to revise their writing more efficiently. However, results also indicate that providing written error correction is very time-consuming. Furthermore, teachers prefer to provide explicit written feedback strategies during the early stages of the language course, and to move to a more implicit strategy of providing written error correction in order to facilitate language learning. On the other hand, results of the focus group interviews suggest that students regard their teachers' practice of written error correction as important in helping them locate their errors and revise their writing. However, students also feel that the process of providing written error correction is time-consuming.
Nevertheless, students want and expect their teachers to provide written feedback because they believe that the benefits they gain from receiving feedback on their writing outweigh the apparent disadvantages of their teachers' written error correction strategies.
Abstract:
Unsteady natural convection inside a triangular cavity is studied. The cavity is filled with a saturated porous medium; the left inclined wall is non-isothermal, while the bottom surface is isothermally heated and the right inclined surface is isothermally cooled. An internal heat generation that depends on the fluid temperature is also considered. The governing equations are solved numerically by the finite volume method. The Prandtl number of the fluid is taken as Pr = 0.7 (air), while the aspect ratio and the Rayleigh number are taken as 0.5 and Ra = 10⁵, respectively. The effect of heat generation on the fluid flow and heat transfer is presented in the form of streamlines and isotherms. The rate of heat transfer through the three surfaces of the enclosure is also presented.
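As a rough illustration of how a temperature-dependent source enters the energy balance, one common dimensionless form is shown below; the symbols θ, τ, λ and the scaling are illustrative assumptions, not the paper's exact formulation:

```latex
% Dimensionless energy equation with an internal heat source proportional
% to the local temperature excess; U, V are the velocity components,
% theta the dimensionless temperature and lambda a heat-generation
% parameter (all notation illustrative).
\frac{\partial \theta}{\partial \tau}
  + U \frac{\partial \theta}{\partial X}
  + V \frac{\partial \theta}{\partial Y}
  = \nabla^{2} \theta + \lambda\,\theta
```

The λθ term is what couples the source strength to the fluid temperature, so increasing λ strengthens the internal heating where the fluid is already hot, reshaping the streamlines and isotherms reported in the study.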
Abstract:
Australians are the creators and custodians of a broad range of cultural materials. This material includes literary, photographic, video and audio archives. These archives should be made available to all Australians for access and reuse, as part of a pre-competitive platform which promotes the interests of the Australian public in gaining access to a diverse range of content that contributes to the development of national and cultural identity. This does not mean that all material must be made available for access and reuse for free and in an unrestricted fashion. But for publicly funded content, free and unrestricted access should be the default. The Venturous Australia report on the National Innovation System recommended that “[t]o the maximum extent possible, information, research and content funded by Australian governments – including national collections – should be made freely available over the internet as part of the global public commons.”1 The report further stated that “both for its direct and indirect benefits to Australia and for the greater global good, Australia should energetically and proudly maximise the extent to which it makes government funded content available as part of the global digital commons...
Abstract:
Individual-based models describing the migration and proliferation of a population of cells frequently restrict the cells to a predefined lattice. An implicit assumption of this type of lattice based model is that a proliferative population will always eventually fill the lattice. Here we develop a new lattice-free individual-based model that incorporates cell-to-cell crowding effects. We also derive approximate mean-field descriptions for the lattice-free model in two special cases motivated by commonly used experimental setups. Lattice-free simulation results are compared to these mean-field descriptions and to a corresponding lattice-based model. Data from a proliferation experiment is used to estimate the parameters for the new model, including the cell proliferation rate, showing that the model fits the data well. An important aspect of the lattice-free model is that the confluent cell density is not predefined, as with lattice-based models, but an emergent model property. As a consequence of the more realistic, irregular configuration of cells in the lattice-free model, the population growth rate is much slower at high cell densities and the population cannot reach the same confluent density as an equivalent lattice-based model.
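The key mechanism above is that a division is aborted when the daughter cell would overlap an existing cell, so the confluent density emerges from the geometry rather than being fixed by a lattice. The sketch below is a minimal lattice-free simulation under that rule; the parameter values, square domain and division probability are illustrative assumptions, not the paper's calibrated model.

```python
# Hedged sketch of a lattice-free proliferation model with crowding:
# cells are points in a square domain, and a division is aborted when the
# daughter would land within one cell diameter of any existing cell, so
# growth slows and stalls as the population becomes crowded.
import math
import random

def step(cells, p_div=0.2, diameter=1.0, domain=20.0, rng=random):
    """One time step: each current cell attempts division with probability p_div."""
    for x, y in list(cells):                 # snapshot: daughters wait a step
        if rng.random() >= p_div:
            continue
        angle = rng.uniform(0.0, 2.0 * math.pi)
        nx = x + diameter * math.cos(angle)  # daughter placed one diameter away
        ny = y + diameter * math.sin(angle)
        if not (0.0 <= nx <= domain and 0.0 <= ny <= domain):
            continue                         # daughter would leave the domain
        # Crowding check: abort if the daughter would overlap any existing
        # cell (small tolerance so the mother, at exactly one diameter,
        # passes the check despite floating-point rounding).
        if all((nx - cx) ** 2 + (ny - cy) ** 2 > (0.999 * diameter) ** 2
               for cx, cy in cells):
            cells.append((nx, ny))
    return cells

random.seed(0)
cells = [(random.uniform(0, 20), random.uniform(0, 20)) for _ in range(10)]
counts = [len(cells)]
for _ in range(60):
    counts.append(len(step(cells)))
print(counts[0], counts[-1])
```

Because failed placements become more frequent as the domain fills, the per-capita growth rate falls at high density and the population jams below the geometric packing limit, mirroring the emergent confluent density the abstract describes.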
Abstract:
Nutrition interventions in the form of both self-management education and individualised diet therapy are considered essential for the long-term management of type 2 diabetes mellitus (T2DM). The measurement of diet is essential to inform, support and evaluate nutrition interventions in the management of T2DM. Barriers inherent within health care settings and systems limit ongoing access to personnel and resources, while traditional prospective methods of assessing diet are burdensome for the individual and often result in changes in typical intake to facilitate recording. This thesis investigated the inclusion of information and communication technologies (ICT) to overcome limitations of current approaches in the nutritional management of T2DM, in particular the development, trial and evaluation of the Nutricam dietary assessment method (NuDAM), consisting of a mobile phone photo/voice application to assess nutrient intake in a free-living environment with older adults with T2DM.
Study 1: Effectiveness of an automated telephone system in promoting change in dietary intake among adults with T2DM
The effectiveness of an automated telephone system, Telephone-Linked Care (TLC) Diabetes, designed to deliver self-management education was evaluated in terms of promoting dietary change in adults with T2DM and sub-optimal glycaemic control. In this secondary data analysis, independent of the larger randomised controlled trial, complete data were available for 95 adults (59 male; mean age (±SD) = 56.8±8.1 years; mean BMI (±SD) = 34.2±7.0 kg/m²). The treatment effect showed a reduction in total fat of 1.4% and saturated fat of 0.9% of energy intake, body weight of 0.7 kg and waist circumference of 2.0 cm. In addition, a significant increase in the nutrition self-efficacy score of 1.3 (p<0.05) was observed in the TLC group compared to the control group.
The modest trends observed in this study indicate that the TLC Diabetes system does support the adoption of positive nutrition behaviours as a result of diabetes self-management education; however, caution must be applied in interpreting the results due to the inherent limitations of the dietary assessment method used. The decision to use a closed-list FFQ with known bias may have influenced the accuracy of reported dietary intake in this instance. This study provided an example of the methodological challenges experienced in measuring changes in absolute diet using a FFQ, and reaffirmed the need for novel prospective assessment methods capable of capturing natural variance in usual intakes.
Study 2: The development and trial of the NuDAM recording protocol
The feasibility of the Nutricam mobile phone photo/voice dietary record was evaluated in 10 adults with T2DM (6 male; age = 64.7±3.8 years; BMI = 33.9±7.0 kg/m²). Intake was recorded over a 3-day period using both Nutricam and a written estimated food record (EFR). Compared to the EFR, the Nutricam device was found to be acceptable among subjects; however, energy intake was under-recorded using Nutricam (-0.6±0.8 MJ/day; p<0.05). Beverages and snacks were the items most frequently not recorded using Nutricam; however, forgotten meals contributed the greatest difference in energy intake between records. In addition, the quality of dietary data recorded using Nutricam was unacceptable for just under one-third of entries. It was concluded that an additional mechanism was necessary to complement dietary information collected via Nutricam. Modifications to the method were made to allow for clarification of Nutricam entries and probing for forgotten foods during a brief phone call to the subject the following morning. The revised recording protocol was evaluated in Study 4.
Study 3: The development and trial of the NuDAM analysis protocol
Part A explored the effect of the type of portion size estimation aid (PSEA) on the error associated with quantifying four portions of 15 single food items contained in photographs. Seventeen dietetic students (1 male; age = 24.7±9.1 years; BMI = 21.1±1.9 kg/m²) estimated all food portions on two occasions: without aids and with aids (food models or reference food photographs). Overall, the use of a PSEA significantly reduced mean (±SD) group error between estimates compared to no aid (-2.5±11.5% vs. 19.0±28.8%; p<0.05). The type of PSEA (i.e. food models vs. reference food photographs) did not have a notable effect on the group estimation error (-6.7±14.9% vs. 1.4±5.9%, respectively; p=0.321). This exploratory study provided evidence that the use of aids in general, rather than their type, was more effective in reducing estimation error. Findings guided the development of the Dietary Estimation and Assessment Tool (DEAT) for use in the analysis of the Nutricam dietary record.
Part B evaluated the effect of the DEAT on the error associated with the quantification of two 3-day Nutricam dietary records in a sample of 29 dietetic students (2 males; age = 23.3±5.1 years; BMI = 20.6±1.9 kg/m²). Subjects were randomised into two groups: Group A and Group B. For Record 1, the use of the DEAT (Group A) resulted in a smaller error compared to estimations made without the tool (Group B) (17.7±15.8%/day vs. 34.0±22.6%/day, respectively; p=0.331). In comparison, all subjects used the DEAT to estimate Record 2, with the resultant error similar between Groups A and B (21.2±19.2%/day vs. 25.8±13.6%/day, respectively; p=0.377). In general, the moderate estimation error associated with quantifying food items did not translate into clinically significant differences in the nutrient profile of the Nutricam dietary records; only amorphous foods were notably over-estimated in energy content without the use of the DEAT (57 kJ/day vs.
274 kJ/day; p<0.001). A large proportion (89.6%) of the group found the DEAT helpful when quantifying food items contained in the Nutricam dietary records. The use of the DEAT reduced quantification error, minimising any potential effect on the estimation of energy and macronutrient intake.
Study 4: Evaluation of the NuDAM
The accuracy and inter-rater reliability of the NuDAM in assessing energy and macronutrient intake was evaluated in a sample of 10 adults (6 males; age = 61.2±6.9 years; BMI = 31.0±4.5 kg/m²). Intake recorded using both the NuDAM and a weighed food record (WFR) was coded by three dietitians and compared with an objective measure of total energy expenditure (TEE) obtained using the doubly labelled water technique. At the group level, energy intake (EI) was under-reported to a similar extent using both methods, with a ratio of EI:TEE of 0.76±0.20 for the NuDAM and 0.76±0.17 for the WFR. At the individual level, four subjects reported implausible levels of energy intake using the WFR, compared to three using the NuDAM. Overall, moderate to high correlation coefficients (r=0.57-0.85) were found between the two dietary measures for energy and all macronutrients except fat (r=0.24). High agreement was observed between dietitians for estimates of energy and macronutrient intake derived from both the NuDAM (ICC=0.77-0.99; p<0.001) and the WFR (ICC=0.82-0.99; p<0.001). All subjects preferred using the NuDAM over the WFR to record intake and were willing to use the novel method again over longer recording periods. This research program explored two novel approaches that utilised distinct technologies to aid in the nutritional management of adults with T2DM. In particular, this thesis makes a significant contribution to the evidence base surrounding the use of PhRs through the development, trial and evaluation of a novel mobile phone photo/voice dietary record.
The NuDAM is an extremely promising advancement in the nutritional management of individuals with diabetes and other chronic conditions. Future applications lie in integrating the NuDAM with other technologies to facilitate practice across the remaining stages of the nutrition care process.
Abstract:
Tailor-made water-soluble macromolecules, including a glycopolymer, obtained by living/controlled RAFT-mediated polymerization are demonstrated to react in water with diene-functionalized poly(ethylene glycol)s without pre- or post-functionalization steps or the need for a catalyst at ambient temperature. As previously observed in organic solvents, hetero-Diels-Alder (HDA) conjugations reached quantitative conversion within minutes when cyclopentadienyl moieties were involved. However, while catalysts and elevated temperatures were previously necessary for open-chain diene conjugation, additive-free HDA cycloadditions occur in water within a few hours at ambient temperature. Experimental evidence for efficient conjugations is provided via unambiguous ESI-MS, UV/vis, NMR, and SEC data.