108 results for Use-wear analysis
Abstract:
The use of gate-to-drain capacitance (C-gd) measurement as a tool to characterize hot-carrier-induced charge centers in submicron n- and p-MOSFETs has been reviewed and demonstrated. By analyzing the change in C-gd measured at room and cryogenic temperature before and after high gate-to-drain transverse field (high-field) and maximum substrate current (I-bmax) stress, it is concluded that the degradation is mostly due to trapping of majority carriers and generation of interface states. These interface states were found to be acceptor states in the top half of the band gap for n-MOSFETs and donor states in the bottom half of the band gap for p-MOSFETs. In general, hot electrons are more likely to be trapped in the gate oxide than hot holes, whereas the presence of hot holes generates more interface states. Also, we have demonstrated a new method for extracting the spatial distribution of oxide trapped charge, Q(ot), through gate-to-substrate capacitance (C-gb) measurement. This method is simple to implement and does not require additional information from simulation or detailed knowledge of the device's structure. (C) 2001 Elsevier Science Ltd. All rights reserved.
Abstract:
There is increasing awareness of the importance of disruptive behaviour in people with dementia and the need for rating scales to accurately and reliably measure this behaviour. When rating scales are to be administered by nurses, scale characteristics must take into account the limitations of the nursing role and the nature of the environment in which nurses work. This paper reviews thirty-one rating scales that have been used to measure behaviour in dementia. From this analysis, five scales were identified as suitable for use by nurses when measuring disruptive behaviour in older people with dementia.
Abstract:
It has been recognised that in order to study the displacement, timing, and co-ordination of articulatory components (i.e., tongue, lips, jaw) in speech production it is desirable to obtain high-resolution movement data on multiple structures inside and outside the vocal tract. Until recently, with the exception of X-ray techniques such as cineradiography, the study of speech movements has been hindered by the inaccessibility of the oral cavity during speech. X-ray techniques are generally not used because of unacceptable radiation exposure. The aim of the present study was to demonstrate the use of a new physiological device, the electromagnetic articulograph, for assessing articulatory dysfunction subsequent to traumatic brain injury. The components of the device together with the measuring principle are described, and data collected from a single case are presented. A 19-year-old male who exhibited dysarthria subsequent to a traumatic brain injury was fitted with the electromagnetic articulograph (Carstens AG-100), and a kinematic analysis of his tongue movements during production of the lingual consonants /t, s, k/ within single-syllable words was performed. Examination of kinematic parameters including movement trajectories, velocity, and acceleration revealed differences in the speed and accuracy of his tongue movements compared to those produced by a non-neurologically impaired adult male. It was concluded that the articulograph is a useful device for diagnosing speed and accuracy disorders in tongue movements during speech and that the device has potential for incorporation into physiologically based rehabilitation programs as a real-time biofeedback instrument.
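To illustrate the kind of kinematic analysis described above, the sketch below derives speed and acceleration from a sampled movement trajectory by numerical differentiation. It is not the authors' processing pipeline: the sampling rate, the trajectory, and the summary measures are assumed purely for illustration.

```python
import numpy as np

# Assumed values: the articulograph samples receiver-coil positions at a few
# hundred Hz; 400 Hz and a synthetic trajectory are used here for illustration only.
fs = 400.0                           # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)        # 1 s of data

# Placeholder tongue-tip trajectory (mm) standing in for real articulograph data.
x = 10 * np.sin(2 * np.pi * 3 * t)
y = 4 * np.cos(2 * np.pi * 3 * t)

# Tangential velocity and acceleration by numerical differentiation.
vx, vy = np.gradient(x, 1 / fs), np.gradient(y, 1 / fs)
speed = np.hypot(vx, vy)             # mm/s
accel = np.gradient(speed, 1 / fs)   # mm/s^2

# Simple summary measures of the kind compared between speakers.
print(f"peak speed: {speed.max():.1f} mm/s")
print(f"peak |acceleration|: {np.abs(accel).max():.1f} mm/s^2")
```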
Abstract:
A course with a large student enrolment puts a heavy load on instructors in both the presentation and assessment areas. In the School of Economics at the University of Queensland, this is the case for the quantitative analysis subjects. Assessment for many years has been through mid-semester and end-of-semester exams, as well as Computer Managed Learning (CML) assignments. In 2000 it was decided to incorporate a system of flexible assessment in which neither the CML assignments nor the mid-semester exam was compulsory. The outcomes are assessed and the advantages and disadvantages discussed.
Abstract:
Performance indicators in the public sector have often been criticised for being inadequate and not conducive to analysing efficiency. The main objective of this study is to use data envelopment analysis (DEA) to examine the relative efficiency of Australian universities. Three performance models are developed, namely, overall performance, performance on delivery of educational services, and performance on fee-paying enrolments. The findings based on 1995 data show that the university sector was performing well on technical and scale efficiency but there was room for improving performance on fee-paying enrolments. There were also small slacks in input utilisation. More universities were operating at decreasing returns to scale, indicating a potential to downsize. DEA helps in identifying the reference sets for inefficient institutions and objectively determines productivity improvements. As such, it can be a valuable benchmarking tool for educational administrators and assist in more efficient allocation of scarce resources. In the absence of market mechanisms to price educational outputs, which renders traditional production or cost functions inappropriate, universities are particularly obliged to seek alternative efficiency analysis methods such as DEA.
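For readers unfamiliar with DEA, the sketch below solves the standard input-oriented CCR envelopment model with a linear-programming solver. It is a generic illustration rather than the three models estimated in the study; the input/output data are invented placeholders and `dea_ccr_input` is a hypothetical helper name.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency scores.

    X: (n_dmu, n_inputs) array, Y: (n_dmu, n_outputs) array.
    Returns an array of efficiency scores in (0, 1].
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(n + 1)
        c[0] = 1.0                      # minimise theta
        A_ub = np.zeros((m + s, n + 1))
        b_ub = np.zeros(m + s)
        # sum_j lambda_j * x_ij <= theta * x_io   (each input i)
        A_ub[:m, 0] = -X[o]
        A_ub[:m, 1:] = X.T
        # sum_j lambda_j * y_rj >= y_ro           (each output r)
        A_ub[m:, 1:] = -Y.T
        b_ub[m:] = -Y[o]
        bounds = [(None, None)] + [(0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        scores[o] = res.x[0]
    return scores

# Illustrative data only (not the 1995 university data used in the study):
# inputs = staff numbers, expenditure; outputs = completions, fee-paying enrolments.
X = np.array([[120, 30.0], [200, 55.0], [90, 20.0], [150, 40.0]])
Y = np.array([[800, 150], [1200, 300], [700, 90], [950, 260]])
print(np.round(dea_ccr_input(X, Y), 3))
```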
Abstract:
Understanding the genetic architecture of quantitative traits can greatly assist the design of strategies for their manipulation in plant-breeding programs. For a number of traits, genetic variation can be the result of segregation of a few major genes and many polygenes (minor genes). The joint segregation analysis (JSA) is a maximum-likelihood approach for fitting segregation models through the simultaneous use of phenotypic information from multiple generations. Our objective in this paper was to use computer simulation to quantify the power of the JSA method for testing the mixed-inheritance model for quantitative traits when it was applied to the six basic generations: both parents (P-1 and P-2), F-1, F-2, and both backcross generations (B-1 and B-2) derived from crossing the F-1 to each parent. A total of 1968 genetic model-experiment scenarios were considered in the simulation study to quantify the power of the method. Factors that interacted to influence the power of the JSA method to correctly detect genetic models were: (1) whether there were one or two major genes in combination with polygenes, (2) the heritability of the major genes and polygenes, (3) the level of dispersion of the major genes and polygenes between the two parents, and (4) the number of individuals examined in each generation (population size). The greatest levels of power were observed for the genetic models defined with simple inheritance; e.g., the power was greater than 90% for the one-major-gene model, regardless of the population size and major-gene heritability. Lower levels of power were observed for the genetic models with complex inheritance (major genes and polygenes), low heritability, small population sizes, and a large dispersion of favourable genes between the two parents; e.g., the power was less than 5% for the two-major-gene model with a heritability value of 0.3 and population sizes of 100 individuals. The JSA methodology was then applied to a previously studied sorghum data-set to investigate the genetic control of the putative drought-resistance trait osmotic adjustment in three crosses. The previous study concluded that there were two major genes segregating for osmotic adjustment in the three crosses. Application of the JSA method resulted in a change in the proposed genetic model. The presence of the two major genes was confirmed with the addition of an unspecified number of polygenes.
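A heavily simplified illustration of the power-simulation idea follows: F2 phenotypes are simulated under one additive major gene plus a normal residual, and a crude likelihood-ratio test of a 1:2:1 normal mixture against a single normal stands in for the full JSA fit. None of this reproduces the JSA likelihood over all six generations or the 1968 scenarios of the study; parameter values and the chi-square reference distribution are assumptions.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)

def simulate_f2(n, a=1.0, h2_major=0.5, reps=1):
    """Simulate F2 phenotypes with one additive major gene plus a normal
    residual (polygenes + environment) scaled so the major gene explains
    h2_major of the F2 variance."""
    geno = rng.binomial(2, 0.5, size=(reps, n))   # 0/1/2 copies of the + allele
    g = (geno - 1) * a                            # genotypic values -a, 0, +a
    var_g = a**2 / 2                              # major-gene variance in an F2
    sd_e = np.sqrt(var_g * (1 - h2_major) / h2_major)
    return g + rng.normal(0.0, sd_e, size=(reps, n))

def lrt_major_gene(y):
    """-2 log-likelihood ratio of a 1:2:1 three-component normal mixture
    (common sd) against a single normal -- a crude proxy for a major-gene test."""
    def negll(par):
        mu, a, sd = par[0], par[1], abs(par[2]) + 1e-9
        mix = (0.25 * stats.norm.pdf(y, mu - a, sd)
               + 0.50 * stats.norm.pdf(y, mu, sd)
               + 0.25 * stats.norm.pdf(y, mu + a, sd))
        return -np.sum(np.log(mix + 1e-300))
    fit = optimize.minimize(negll, x0=[y.mean(), y.std(), y.std()],
                            method="Nelder-Mead")
    ll1 = -fit.fun
    ll0 = np.sum(stats.norm.logpdf(y, y.mean(), y.std()))
    return 2.0 * (ll1 - ll0)

# Power = proportion of simulated data sets in which the major gene is "detected";
# the chi-square(1) cut-off is itself only an approximation for mixture LRTs.
reps, n = 200, 100
lrs = [lrt_major_gene(y) for y in simulate_f2(n, h2_major=0.3, reps=reps)]
power = np.mean(np.array(lrs) > stats.chi2.ppf(0.95, df=1))
print(f"estimated power with n = {n} F2 plants: {power:.2f}")
```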
Abstract:
For the improvement of genetic material suitable for on-farm use under low-input conditions, participatory and formal plant breeding strategies are frequently presented as competing options. A common frame of reference to phrase mechanisms and purposes related to breeding strategies will facilitate clearer descriptions of similarities and differences between participatory plant breeding and formal plant breeding. In this paper an attempt is made to develop such a common framework by means of a statistically inspired language that acknowledges the importance of both on-farm trials and research centre trials as sources of information for on-farm genetic improvement. Key concepts are the genetic correlation between environments, and the heterogeneity of phenotypic and genetic variance over environments. Classic selection response theory is taken as the starting point for the comparison of selection trials (on farm and research centre) with respect to the expected genetic improvement in a target environment (low-input farms). The variance-covariance parameters that form the input for selection response comparisons traditionally come from a mixed model fit to multi-environment trial data. In this paper we propose a recently developed class of mixed models, namely multiplicative mixed models, also called factor-analytic models, for modelling genetic variances and covariances (correlations). Multiplicative mixed models allow genetic variances and covariances to be dependent on quantitative descriptors of the environment, and confer a high flexibility in the choice of variance-covariance structure, without requiring the estimation of a prohibitively high number of parameters. As a result, detailed considerations regarding selection response comparisons are facilitated. The statistical machinery involved is illustrated on an example data set consisting of barley trials from the International Center for Agricultural Research in the Dry Areas (ICARDA). Analysis of the example data showed that participatory plant breeding and formal plant breeding are better interpreted as providing complementary rather than competing information.
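The selection-response comparison referred to above rests on the classic direct- and correlated-response formulas. The notation below follows standard quantitative-genetics texts rather than the paper itself, with S the selection environment (e.g., a research centre) and T the target environment (low-input farms).

```latex
% Direct response to selection carried out in the target environment T,
% versus the correlated response in T from selection carried out in S:
\begin{align}
  R_T  &= i_T \, h_T^{2} \, \sigma_{P_T}, \\
  CR_T &= i_S \, h_S \, r_g \, h_T \, \sigma_{P_T},
\end{align}
% so the relative efficiency of indirect (research-centre) selection is
\begin{equation}
  \frac{CR_T}{R_T} \;=\; \frac{i_S \, h_S \, r_g}{i_T \, h_T},
\end{equation}
% where $i$ is the selection intensity, $h$ the square root of the heritability
% in each environment, $r_g$ the genetic correlation between the two
% environments, and $\sigma_{P_T}$ the phenotypic standard deviation in T.
```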
Abstract:
Previous studies have shown that a negative relationship exists between transpiration efficiency (TE) and carbon isotope discrimination (Delta) and between TE and specific leaf area (SLA) in Stylosanthes scabra. A glasshouse experiment was conducted to confirm these relationships in an F-2 population and to study the causal nature of these relationships through quantitative trait loci (QTL) analysis. One hundred and twenty F-2 genotypes from a cross between two genotypes within S. scabra were used. Three replications for each genotype were maintained through vegetative propagation. Water stress was imposed by maintaining plants at 40% of field capacity for about 45 d. To facilitate QTL analysis, a genetic linkage map consisting of 151 RAPD markers was developed. Results from this study show that Delta was significantly and negatively correlated with TE and biomass production. Similarly, SLA showed a significant negative correlation with TE and biomass production. Most of the QTL for TE and Delta were present on linkage groups 5 and 11. Similarly, QTL for SLA, transpiration, and biomass productivity traits were clustered on linkage groups 13 and 24. One unlinked marker was also associated with these traits. There were several markers coincident between different traits. At all the coincident QTL, the direction of QTL effects was consistent with phenotypic data. At the coincident markers between TE and Delta, high alleles of TE were associated with low alleles of Delta. Similarly, low alleles of SLA were associated with high alleles of biomass productivity traits and transpiration. At the coincident markers between trans-4-hydroxy-N-methyl proline (MHP) and relative water content (RWC), low alleles of MHP were associated with high alleles of RWC. This study suggests a causal nature of the relationship between TE and Delta. Phenotypic data and QTL data show that SLA was more closely associated with biomass production than with TE. This study also shows that a cause-effect relationship may exist between SLA and biomass production.
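As a pointer to how marker-trait associations of this kind are tested, the sketch below runs a crude single-marker analysis on simulated placeholder data (a dominant RAPD-style marker scored as band present/absent). It is not the linkage-map construction or QTL-mapping procedure used in the study; the phenotype values and effect sizes are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder data standing in for the 120 F2 genotypes: a dominant RAPD marker
# is scored as band present (1) / absent (0); Delta and TE are the phenotypes.
n = 120
marker = rng.binomial(1, 0.75, n)            # 3:1 presence ratio expected in an F2
delta = 20 - 1.5 * marker + rng.normal(0, 1.0, n)
te = 3.0 + 0.8 * marker + rng.normal(0, 0.5, n)

# Phenotypic correlation between carbon isotope discrimination and TE
# (expected to be negative, as reported in the abstract).
r, p = stats.pearsonr(delta, te)
print(f"corr(Delta, TE) = {r:.2f} (p = {p:.3g})")

# Crude single-marker QTL test: does the marker class explain phenotypic variation?
for name, y in [("Delta", delta), ("TE", te)]:
    t, p = stats.ttest_ind(y[marker == 1], y[marker == 0])
    print(f"{name} vs marker: t = {t:.2f}, p = {p:.3g}")
```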
Abstract:
Qualitative data analysis (QDA) is often a time-consuming and laborious process usually involving the management of large quantities of textual data. Recently developed computer programs offer great advances in the efficiency of the processes of QDA. In this paper we report on an innovative use of a combination of extant computer software technologies to further enhance and simplify QDA. Used in appropriate circumstances, we believe that this innovation greatly enhances the speed with which theoretical and descriptive ideas can be abstracted from rich, complex, and chaotic qualitative data. © 2001 Human Sciences Press, Inc.
Abstract:
Neurological disease or dysfunction in newborn infants is often first manifested by seizures. Prolonged seizures can result in impaired neurodevelopment or even death. In adults, the clinical signs of seizures are well defined and easily recognized. In newborns, however, the clinical signs are subtle and may be absent or easily missed without constant close observation. This article describes the use of adaptive signal processing techniques for removing artifacts from newborn electroencephalogram (EEG) signals. Three adaptive algorithms have been designed in the context of EEG signals. This preprocessing is necessary before attempting a fine time-frequency analysis of EEG rhythmical activities, such as electrical seizures, corrupted by high amplitude signals. After an overview of newborn EEG signals, the authors describe the data acquisition set-up. They then introduce the basic physiological concepts related to normal and abnormal newborn EEGs and discuss the three adaptive algorithms for artifact removal. They also present time-frequency representations (TFRs) of seizure signals and discuss the estimation and modeling of the instantaneous frequency related to the main ridge of the TFR.
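The paper's three adaptive algorithms are not reproduced here, but the sketch below shows a generic LMS adaptive noise canceller of the kind commonly used for this sort of artifact removal, applied to a toy signal with an assumed reference channel.

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=16, mu=0.01):
    """Generic LMS adaptive noise canceller: estimates the artifact in
    `primary` from `reference` and subtracts it, returning the cleaned signal."""
    w = np.zeros(n_taps)
    out = np.zeros_like(primary)
    for i in range(n_taps, len(primary)):
        x = reference[i - n_taps:i][::-1]      # most recent reference samples
        y_hat = w @ x                          # estimated artifact component
        e = primary[i] - y_hat                 # error = cleaned EEG sample
        w += 2 * mu * e * x                    # LMS weight update
        out[i] = e
    return out

# Toy demonstration: a slow "EEG" rhythm corrupted by a strong 50 Hz-like
# interference for which a correlated reference channel is available.
fs = 256
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 2 * t)                        # stand-in for EEG activity
artifact = 3 * np.sin(2 * np.pi * 50 * t + 0.3)
primary = eeg + artifact
reference = np.sin(2 * np.pi * 50 * t)                 # assumed reference signal
cleaned = lms_cancel(primary, reference, n_taps=32, mu=0.005)
print("residual artifact power:", np.round(np.mean((cleaned[fs:] - eeg[fs:])**2), 4))
```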
Abstract:
The authors discuss the regulation of rural land use and compensation for property-rights restrictions, both of which appear to have become more commonplace in recent years but also more contested. The implications of contemporary theories in relation to this matter are examined, including: the applicability of new welfare economics; the relevance of the neoclassical theory of politics; and the implications of contemporary theories of social conflict resolution and communication. Examination of examples of Swiss and Australian regulation of the use of rural properties, and the ensuing conflicts, reveals that many decisions reflect a mixture of these elements. Rarely, if ever, are social decisions in this area made solely on the basis of welfare economics, for instance social cost-benefit analysis. Only some aspects of such decisions can be explained by the neoclassical theory of politics. Theories of social conflict resolution suggest why, and in what way, approaches of discourse and participation may resolve conflicts regarding regulation and compensation. These theories and their practical application seem to gain in importance as opposition to government decisions increases. The high degree of complexity of most conflicts concerning regulation and compensation cannot be tackled with narrow economic theories. Moreover, the Swiss and Australian examples show that approaches involving conflict resolution may favour environmental standards.
Abstract:
The personal computer revolution has resulted in the widespread availability of low-cost image analysis hardware. At the same time, new graphic file formats have made it possible to handle and display images at resolutions beyond the capability of the human eye. Consequently, there has been a significant research effort in recent years aimed at making use of these hardware and software technologies for flotation plant monitoring. Computer-based vision technology is now moving out of the research laboratory and into the plant to become a useful means of monitoring and controlling flotation performance at the cell level. This paper discusses the metallurgical parameters that influence surface froth appearance and examines the progress that has been made in image analysis of flotation froths. The texture spectrum and pixel tracing techniques developed at the Julius Kruttschnitt Mineral Research Centre are described in detail. The commercial implementation, JKFrothCam, is one of a number of froth image analysis systems now reaching maturity. In plants where it is installed, JKFrothCam has shown a number of performance benefits. Flotation runs more consistently, meeting product specifications while maintaining high recoveries. The system has also shown secondary benefits in that reagent costs have been significantly reduced as a result of improved flotation control. (C) 2002 Elsevier Science B.V. All rights reserved.
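As background to the texture spectrum technique mentioned above, the sketch below implements the generic texture-unit/texture-spectrum computation (each pixel compared with its eight neighbours, then a histogram of the resulting 3^8 texture-unit codes). It is a textbook version run on random placeholder data, not the JKFrothCam implementation.

```python
import numpy as np

def texture_spectrum(img):
    """Texture spectrum of a greyscale image: each interior pixel is mapped to a
    texture unit number in [0, 3^8) by comparing it with its 8 neighbours
    (0 = neighbour smaller, 1 = equal, 2 = larger); the spectrum is the
    histogram of these numbers over the image."""
    img = np.asarray(img, dtype=float)
    centre = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    units = np.zeros_like(centre, dtype=np.int64)
    for k, (dr, dc) in enumerate(offsets):
        neigh = img[1 + dr:img.shape[0] - 1 + dr, 1 + dc:img.shape[1] - 1 + dc]
        code = np.where(neigh > centre, 2, np.where(neigh == centre, 1, 0))
        units += code * 3**k
    return np.bincount(units.ravel(), minlength=3**8)

# Random toy image standing in for a froth frame (placeholder data only).
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64))
spectrum = texture_spectrum(img)
print("number of distinct texture units:", np.count_nonzero(spectrum))
```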
Abstract:
The development of the new TOGA (titration and off-gas analysis) sensor for the detailed study of biological processes in wastewater treatment systems is outlined. The main innovation of the sensor is the amalgamation of titrimetric and off-gas measurement techniques. The resulting measured signals are: hydrogen ion production rate (HPR), oxygen transfer rate (OTR), nitrogen transfer rate (NTR), and carbon dioxide transfer rate (CTR). While OTR and NTR are applicable to aerobic and anoxic conditions, respectively, HPR and CTR are useful signals under all of the conditions found in biological wastewater treatment systems, namely, aerobic, anoxic and anaerobic. The sensor is therefore a powerful tool for studying the key biological processes under all these conditions. A major benefit from the integration of the titrimetric and off-gas analysis methods is that the acid/base buffering systems, in particular the bicarbonate system, are properly accounted for. Experimental data resulting from the TOGA sensor in aerobic, anoxic, and anaerobic conditions demonstrate the strength of the new sensor. In the aerobic environment, carbon oxidation (using acetate as an example carbon source) and nitrification are studied. Both the carbon and ammonia removal rates measured by the sensor compare very well with those obtained from off-line chemical analysis. Further, the aerobic acetate removal process is examined at a fundamental level using the metabolic pathway and stoichiometry established in the literature, whereby the rate of formation of storage products is identified. Under anoxic conditions, the denitrification process is monitored and, again, the measured rate of nitrogen gas transfer (NTR) matches well with the removal of the oxidised nitrogen compounds (measured chemically). In the anaerobic environment, the enhanced biological phosphorus removal process was investigated. In this case, the measured sensor signals (HPR and CTR) resulting from acetate uptake were used to determine the ratio of the rates of carbon dioxide production by competing groups of microorganisms, which consequently is a measure of the activity of these organisms. The sensor involves the use of expensive equipment such as a mass spectrometer and requires special gases to operate, thus incurring significant capital and operational costs. This makes the sensor more an advanced laboratory tool than an on-line sensor. (C) 2003 Wiley Periodicals, Inc.
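To make the transfer-rate signals concrete, the sketch below shows the kind of simplified gas-phase mole balance that underlies off-gas analysis. The flows, mole fractions, and sign conventions are assumed for illustration and do not reproduce the TOGA sensor's actual calibration or signal processing.

```python
# Simplified gas-phase mole balances of the kind underlying off-gas analysis.
# All numbers are assumed values, not measurements from the TOGA sensor.

def transfer_rate(F_in, y_in, F_out, y_out, V_liquid):
    """Net transfer rate of one gas species into the liquid (mmol / L / h),
    from molar gas flows (mmol/h) and mole fractions at the inlet and outlet."""
    return (F_in * y_in - F_out * y_out) / V_liquid

F_in, F_out = 1000.0, 995.0        # mmol/h of gas entering / leaving the headspace
V_liquid = 2.0                     # reactor liquid volume, L

OTR = transfer_rate(F_in, 0.209, F_out, 0.185, V_liquid)     # oxygen taken up
CTR = -transfer_rate(F_in, 0.0004, F_out, 0.012, V_liquid)   # CO2 stripped to gas
print(f"OTR = {OTR:.1f} mmol O2 / L / h, CTR = {CTR:.1f} mmol CO2 / L / h")
```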
Abstract:
Observations of an insect's movement lead to theory on the insect's flight behaviour and the role of movement in the species' population dynamics. This theory leads to predictions of the way the population changes in time under different conditions. If a hypothesis on movement predicts a specific change in the population, then the hypothesis can be tested against observations of population change. Routine pest monitoring of agricultural crops provides a convenient source of data for studying movement into a region and among fields within a region. Examples of the use of statistical and computational methods for testing hypotheses with such data are presented. The types of questions that can be addressed with these methods and the limitations of pest monitoring data when used for this purpose are discussed. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
This article examines child welfare workers' understanding of physical child abuse and the implications for those supervising these workers. The article is based on the results of a study that involved in-depth interviews and focus groups with statutory child welfare workers. Analysis revealed that workers' understanding of physical child abuse embodied a wide range of ideas that were generally consistent with existing literature. The study highlights the value and utility of a reflective approach in stimulating and making explicit the theoretical underpinnings of child welfare workers' practice. Specific implications for professional supervision are addressed.