36 results for Yallop, Collin: An introduction to phonetics and phonology
Abstract:
For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the output of two independent implementations, in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
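To make the basic idea concrete, here is a minimal sketch of back-to-back residual generation. It shows only the plain output-comparison step, not the paper's modified-observer construction; the names model_a, model_b, impl_ok and impl_buggy are invented for illustration.

```python
import numpy as np

def residuals(model_a, model_b, inputs):
    """Back-to-back test: run two independent implementations of the
    same model on identical inputs and return the output differences.
    A nonzero residual indicates a coding error in at least one
    implementation; its structure can point to the faulty equation."""
    ya = np.array([model_a(u) for u in inputs])
    yb = np.array([model_b(u) for u in inputs])
    return ya - yb  # zero (up to numerical tolerance) if both agree

# Two deliberately different implementations of x -> 2x + 1:
impl_ok    = lambda u: 2.0 * u + 1.0
impl_buggy = lambda u: 2.0 * u - 1.0   # sign error, as if miscoded
r = residuals(impl_ok, impl_buggy, np.linspace(0.0, 1.0, 5))
print(r)  # constant residual of 2.0 reveals the additive-term error
```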
Abstract:
In Part 1 of this paper a methodology for back-to-back testing of simulation software was described. Residuals with error-dependent geometric properties were generated. A set of potential coding errors was enumerated, along with a corresponding set of feature matrices, which describe the geometric properties imposed on the residuals by each of the errors. In this part of the paper, an algorithm is developed to isolate the coding errors present by analysing the residuals. A set of errors is isolated when the subspace spanned by their combined feature matrices corresponds to that of the residuals. Individual feature matrices are compared to the residuals and classified as 'definite', 'possible' or 'impossible'. The status of 'possible' errors is resolved using a dynamic subset testing algorithm. To demonstrate and validate the testing methodology presented in Part 1 and the isolation algorithm presented in Part 2, a case study is presented using a model for biological wastewater treatment. Both single and simultaneous errors that are deliberately introduced into the simulation code are correctly detected and isolated. Copyright (C) 2003 John Wiley & Sons, Ltd.
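One plausible reading of the classification step is a subspace-containment test, sketched below. The function and its arguments (feature, residual_basis) are hypothetical, and the rank-based criteria here are an illustrative simplification, not the paper's exact definitions or its dynamic subset testing algorithm.

```python
import numpy as np

def classify(feature, residual_basis, tol=1e-8):
    """Compare the subspace spanned by one error's feature matrix with
    the subspace spanned by the observed residuals (columns of
    residual_basis), and label the candidate error accordingly."""
    r_rank = np.linalg.matrix_rank(residual_basis, tol)
    joint  = np.linalg.matrix_rank(np.hstack([residual_basis, feature]), tol)
    if joint > r_rank:
        # Feature columns point outside the residual subspace: this
        # error alone cannot have produced the observed residuals.
        return 'impossible'
    # Feature lies inside the residual subspace; if it spans all of it,
    # the error must be present, otherwise it is merely a candidate.
    return 'definite' if np.linalg.matrix_rank(feature, tol) == r_rank else 'possible'
```

In this reading, 'possible' errors are those whose feature matrices fit inside the residual subspace without explaining all of it, which is why a further subset-testing pass is needed to resolve them.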
Abstract:
Of all human cancers, head and neck squamous cell carcinoma (HNSCC) is the most distressing, affecting pain, disfigurement, speech and the basic survival functions of breathing and swallowing. Mortality rates have not significantly changed in the last 40 years despite advances in radiotherapy and surgical treatment. Molecular markers are currently being identified that can determine prognosis preoperatively by routine tumour biopsy, leading to improved management of HNSCC patients. The approach could help decide which early-stage patients should have adjuvant neck dissection and radiotherapy, whether later-stage patients with operable lesions would benefit from resection and reconstructive surgery, or whether a conservative approach should be adopted for patients with a poor prognosis regardless of treatment. In the future, an understanding of these basic genetic changes will be important for the management of HNSCC. (C) 2004 The British Association of Plastic Surgeons. Published by Elsevier Ltd. All rights reserved.
Abstract:
Bourdieu … makes it possible to explain how the actions of principals are always contextual, since their interests vary with issue, location, time, school mix, composition of staff and so on. This 'identity' perspective points at a different kind of research about principal practice: to understand the game and its logic requires an analysis of the situated everyday rather than abstractions that claim truth in all instances and places. (Thomson 2001a: 14)
Abstract:
Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of league tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data. This includes systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved, with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions on 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is in the translation of the effect size from trials into a change in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
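A minimal sketch of how a Monte Carlo uncertainty interval around a cost-per-DALY ratio can be produced. All distributions and numbers below are invented for illustration; in the ACE-MH study these inputs come from trials, surveys and longitudinal data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo draws

# Illustrative (invented) input distributions: programme cost in A$
# per person and DALYs averted per person.
cost  = rng.normal(loc=500.0, scale=50.0, size=n)   # A$ per person
dalys = rng.normal(loc=0.05, scale=0.01, size=n)    # DALYs averted
dalys = np.clip(dalys, 1e-4, None)                  # guard against division by ~0

icer = cost / dalys                                 # A$ per DALY saved
lo, hi = np.percentile(icer, [2.5, 97.5])           # 95% uncertainty interval
print(f"A${np.median(icer):,.0f} per DALY (95% UI A${lo:,.0f}-A${hi:,.0f})")
```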
Abstract:
This special issue revisits the relationship between women, work and technology, focusing specifically on gender equity in information technology (IT) employment. Along with theoretical contestation over the broader relationship between gender and technology, arguments about the prospects for women in IT employment have ranged from optimistic to pessimistic extremes. On the one hand, optimists have envisaged a more gender-egalitarian workforce based on new occupations lacking traditional gender markers, and 'young' firms offering positive flexibilities and equal employment opportunity protections. On the other hand, pessimists anticipate that ongoing male dominance over technology and competitive pressures in the IT sector will ensure that the most prestigious and highly rewarded jobs remain concentrated in male hands, even as technologies and jobs are themselves transformed.
Abstract:
Classical mechanics is formulated in complex Hilbert space with the introduction of a commutative product of operators, an antisymmetric bracket and a quasidensity operator that is not positive definite. These are analogues of the star product, the Moyal bracket, and the Wigner function in the phase space formulation of quantum mechanics. Quantum mechanics is then viewed as a limiting form of classical mechanics, as Planck's constant approaches zero, rather than the other way around. The forms of semiquantum approximations to classical mechanics, analogous to semiclassical approximations to quantum mechanics, are indicated.
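For reference, the standard phase-space objects the abstract invokes as analogues are the Moyal star product and bracket, which recover the pointwise product and the Poisson bracket as Planck's constant goes to zero. These are the well-known quantum-side formulas, not the paper's new classical-mechanics-in-Hilbert-space constructions.

```latex
% Moyal star product and bracket of phase-space functions f(x,p), g(x,p);
% the bracket reduces to the Poisson bracket in the \hbar -> 0 limit.
\[
  f \star g = f\,
    \exp\!\Big(\tfrac{i\hbar}{2}\big(
      \overleftarrow{\partial_x}\overrightarrow{\partial_p}
      - \overleftarrow{\partial_p}\overrightarrow{\partial_x}\big)\Big)\, g ,
  \qquad
  \{f,g\}_M = \frac{f \star g - g \star f}{i\hbar}
  \;\xrightarrow[\hbar \to 0]{}\; \{f,g\}_{\mathrm{Poisson}} .
\]
```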