Abstract:
"Wills' Mineral Processing Technology" provides practising engineers and students of mineral processing, metallurgy and mining with a review of all of the common ore-processing techniques utilized in modern processing installations. Now in its Seventh Edition, this renowned book is a standard reference for the mineral processing industry. Chapters deal with each of the major processing techniques, and coverage includes the latest technical developments in the processing of increasingly complex refractory ores, new equipment and process routes. This new edition has been prepared by the prestigious J K Minerals Research Centre of Australia, which contributes its world-class expertise and ensures that this will continue to be the book of choice for professionals and students in this field. This latest edition highlights the developments and the challenges facing the mineral processor, particularly with regard to the environmental problems posed in improving the efficiency of the existing processes and also in dealing with the waste created. The work is fully indexed and referenced. -The classic mineral processing text, revised and updated by a prestigious new team -Provides a clear exposition of the principles and practice of mineral processing, with examples taken from practice -Covers the latest technological developments and highlights the challenges facing the mineral processor -New sections on environmental problems, improving the efficiency of existing processes and dealing with waste.
Abstract:
A program can be decomposed into a set of possible execution paths. These can be described in terms of primitives such as assignments, assumptions and coercions, and of composition operators such as sequential composition, nondeterministic choice, and finitely or infinitely iterated sequential composition. Some of these paths cannot possibly be followed (they are dead or infeasible), and those that can may or may not terminate. Decomposing programs into paths provides a foundation for analyzing properties of programs. Our motivation is timing constraint analysis of real-time programs, but the same techniques can be applied in other areas such as program testing. In general, the set of execution paths for a program is infinite. For timing analysis we would like to decompose a program into a finite set of subpaths that covers all possible execution paths, in the sense that analyzing only the subpaths suffices to determine timing constraints that cover all execution paths.
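As a minimal sketch of the idea (not the paper's formalism), the primitives and composition operators listed above can be modelled directly; the Python below enumerates the finite execution paths of a small statement built from assignments, assumptions, sequential composition and nondeterministic choice. All names are illustrative, and the unfolding of iterated composition is elided.

from dataclasses import dataclass
from typing import Iterator

@dataclass(frozen=True)
class Assign:            # x := e
    var: str
    expr: str

@dataclass(frozen=True)
class Assume:            # the path continues only if the condition holds
    cond: str

Step = Assign | Assume   # a primitive step on a path
Path = tuple[Step, ...]  # a path is a sequence of primitive steps

def paths(stmt, bound: int = 3) -> Iterator[Path]:
    """Enumerate the execution paths of a statement.

    A statement is a primitive step, a list (sequential composition),
    or a tuple ('choice', a, b) for nondeterministic choice. Iterated
    composition would be unfolded up to `bound` times; omitted here.
    """
    if isinstance(stmt, (Assign, Assume)):
        yield (stmt,)
    elif isinstance(stmt, list):                 # sequential composition
        if not stmt:
            yield ()
        else:
            for head in paths(stmt[0], bound):
                for tail in paths(stmt[1:], bound):
                    yield head + tail
    elif isinstance(stmt, tuple) and stmt[0] == 'choice':
        yield from paths(stmt[1], bound)         # take one branch...
        yield from paths(stmt[2], bound)         # ...or the other

# An if-statement becomes a choice between two assumption-guarded branches;
# the branch guarded by Assume('x <= 0') is dead on runs where x > 0.
prog = [Assign('x', 'read()'),
        ('choice', [Assume('x > 0'), Assign('y', 'x')],
                   [Assume('x <= 0'), Assign('y', '-x')])]
for p in paths(prog):
    print(p)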
Abstract:
Bourdieu … makes it possible to explain how the actions of principals are always contextual, since their interests vary with issue, location, time, school mix, composition of staff and so on. This 'identity' perspective points at a different kind of research about principal practice: to understand the game and its logic requires an analysis of the situated everyday rather than abstractions that claim truth in all instances and places. (Thomson 2001a: 14)
Abstract:
Classical mechanics is formulated in complex Hilbert space with the introduction of a commutative product of operators, an antisymmetric bracket and a quasidensity operator that is not positive definite. These are analogues of the star product, the Moyal bracket, and the Wigner function in the phase space formulation of quantum mechanics. Quantum mechanics is then viewed as a limiting form of classical mechanics, as Planck's constant approaches zero, rather than the other way around. The forms of semiquantum approximations to classical mechanics, analogous to semiclassical approximations to quantum mechanics, are indicated.
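For reference, the phase-space objects named here as the quantum-side analogues are standard. In LaTeX, with the usual Groenewold–Moyal definitions (the paper's own notation may differ):

\[
(f \star g)(q,p) \;=\; f(q,p)\,
\exp\!\left(\frac{i\hbar}{2}\bigl(\overleftarrow{\partial}_q \overrightarrow{\partial}_p
- \overleftarrow{\partial}_p \overrightarrow{\partial}_q\bigr)\right) g(q,p),
\qquad
\{f,g\}_M \;=\; \frac{f \star g - g \star f}{i\hbar}
\;=\; \{f,g\}_{\mathrm{PB}} + O(\hbar^2),
\]

so the Moyal bracket reduces to the Poisson bracket as \(\hbar \to 0\). The paper's construction supplies the mirror-image structures on the Hilbert-space side: a commutative operator product, an antisymmetric bracket and a quasidensity operator playing the roles of the star product, the Moyal bracket and the Wigner function respectively.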
Abstract:
Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of League Tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data. This includes systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice, and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved, with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions against 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is the translation of effect sizes from trials into changes in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
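A minimal sketch of the Monte Carlo step described above: propagating uncertainty in costs and health gains into a 95% uncertainty interval around a cost (A$) per DALY ratio. The distributions and numbers below are invented placeholders, not ACE-MH inputs.

import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical inputs: net cost in A$ and DALYs averted per person treated.
cost = rng.normal(loc=1_200.0, scale=250.0, size=n)          # A$ per person
dalys_averted = rng.lognormal(mean=np.log(0.05), sigma=0.4, size=n)

ratio = cost / dalys_averted                                 # A$ per DALY saved

lo, mid, hi = np.percentile(ratio, [2.5, 50.0, 97.5])
print(f"median A${mid:,.0f}/DALY (95% UI A${lo:,.0f} to A${hi:,.0f})")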
The structure of middle management remuneration packages: An application to Australian mine managers
Abstract:
This paper investigates the composition of remuneration packages for middle managers and relates the structure of remuneration contracts to firm-specific attributes. A statutorily defined position in a single industry is studied as an example of middle management. This allows us to control for differences in task complexity across managers and for industry-induced factors that could determine differences in remuneration contracts. Higher-risk firms are expected to pay their mine managers a greater proportion of variable salary, and more market- and/or accounting-based compensation, than lower-risk firms. Results indicate that high-risk firms do pay a higher proportion of variable salaries and more compensation based on market and/or accounting performance.
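A minimal sketch (not the paper's model) of the kind of cross-sectional test this implies: regress the variable share of pay on a firm-risk measure plus a control, and read the sign of the risk coefficient. Data and variable names are hypothetical.

import numpy as np

rng = np.random.default_rng(1)
n = 120
firm_risk = rng.normal(size=n)              # e.g. equity return volatility
firm_size = rng.normal(size=n)              # control, e.g. log market value
noise = rng.normal(scale=0.05, size=n)

# Simulated outcome: proportion of the package that is variable/at-risk.
variable_share = 0.20 + 0.06 * firm_risk + 0.01 * firm_size + noise

X = np.column_stack([np.ones(n), firm_risk, firm_size])
beta, *_ = np.linalg.lstsq(X, variable_share, rcond=None)
print(f"risk coefficient: {beta[1]:.3f}")   # positive => higher-risk firms
                                            # pay a larger variable share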
Abstract:
As the use of technological devices in everyday environments becomes more prevalent, it is clear that access to these devices has become an important aspect of occupational performance. Children are increasingly required to competently manipulate technology such as the computer to fulfil the occupational roles of student and player. Occupational therapists are in a position to facilitate a successful interface between children and standard computer technologies. The literature has supported the use of direct-manipulation interfaces in computing, which require mastery of devices such as the mouse. Identifying children likely to experience difficulty with mouse use will inform the development of appropriate intervention methods that promote mouse skill and further enhance participation in occupational tasks. The aim of this paper is to discuss the development of an assessment of mouse proficiency for children. It describes the construction of the assessment, the content of the test, and its content validity.
Abstract:
In this paper we apply a new method for determining the surface area of carbonaceous materials, using local surface excess isotherms obtained from Grand Canonical Monte Carlo (GCMC) simulation and a concept of area distribution in terms of the energy well-depth of the solid–fluid interaction. The range of well-depths considered in our GCMC simulations is 10 to 100 K, which is wide enough to cover all the carbon surfaces we dealt with (for comparison, the well-depth for a perfect graphite surface is about 58 K). Given the set of local surface excess isotherms and the differential area distribution, the overall adsorption isotherm can be written in an integral form. Thus, given experimental data for nitrogen or argon adsorption on a carbon material, the differential area distribution can be obtained by inversion, using the regularization method. The total surface area is then obtained as the area under this distribution. We test this approach with a number of data sets from the literature and compare our GCMC surface areas with those obtained from the classical BET method. In general, we find that the two surface areas differ by about 10%, indicating the need for a consistent and reliable method of determining surface area. We therefore suggest the approach of this paper as an alternative to the BET method, given the long-recognized unrealistic assumptions used in the BET theory. Besides the surface area, the method also provides the differential area distribution versus well-depth. This information can be used as a microscopic fingerprint of the carbon surface: samples prepared from different precursors and under different activation conditions are expected to have distinct fingerprints. We illustrate this with Cabot BP120, 280 and 460 samples; for each sample, the differential area distributions obtained from argon adsorption at 77 K and from nitrogen adsorption at 77 K have exactly the same pattern, suggesting that the distribution is characteristic of the carbon itself.
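A minimal sketch of the inversion step described above, assuming the integral equation is discretized as Gamma(P) = sum_j f(eps_j) * gamma_loc(P, eps_j), where gamma_loc are the local excess isotherms and f is the differential area distribution. In the paper the kernel comes from GCMC simulation; here it is a synthetic Langmuir-like placeholder, and the regularization is plain Tikhonov (ridge).

import numpy as np

pressures = np.logspace(-4, 0, 40)          # reduced pressures (hypothetical)
eps = np.linspace(10.0, 100.0, 30)          # well depths in K, as in the paper

# Placeholder local isotherms: deeper wells fill at lower pressure.
K = np.array([[p / (p + np.exp(-e / 25.0)) for e in eps] for p in pressures])

# Synthetic "experimental" isotherm from a known single-peak distribution
# (a graphite-like patch centred near 58 K), plus a little noise.
f_true = np.exp(-0.5 * ((eps - 58.0) / 5.0) ** 2)
y = K @ f_true + np.random.default_rng(2).normal(scale=1e-3, size=len(pressures))

# Tikhonov solution: argmin ||K f - y||^2 + lam * ||f||^2.
lam = 1e-3
f_est = np.linalg.solve(K.T @ K + lam * np.eye(len(eps)), K.T @ y)
f_est = np.clip(f_est, 0.0, None)           # area density cannot be negative

total_area = np.trapz(f_est, eps)           # total surface area, up to units
print(f"recovered total area (arbitrary units): {total_area:.3f}")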