996 results for Spectral measure
Abstract:
Economics education research studies conducted in the UK, USA and Australia to investigate the effects of learning inputs on academic performance have been dominated by the input-output model (Shanahan and Meyer, 2001). In the Student Experience of Learning framework, however, the link between learning inputs and outputs is mediated by students' learning approaches, which in turn are influenced by their perceptions of the learning contexts (Evans, Kirby, & Fabrigar, 2003). Many learning inventories, such as Biggs' Study Process Questionnaire and Entwistle and Ramsden's Approaches to Studying Inventory, have been designed to measure approaches to academic learning. However, generalised learning inventories have the limitation of aggregating the different learning approaches utilised in different assessments. As a result, important relationships between learning approaches and learning outcomes that exist in specific assessment contexts will be missed (Lizzio, Wilson, & Simons, 2002). This paper documents the construction of an assessment-specific instrument to measure learning approaches in economics. The post-dictive validity of the instrument was evaluated by examining the association of learning approaches with students' perceived assessment demand in different assessment contexts.
Abstract:
Hot and cold temperatures significantly increase mortality rates around the world, but which measure of temperature is the best predictor of mortality is not known. We used mortality data from 107 US cities for the years 1987–2000 and examined the association between temperature and mortality using Poisson regression, modelling a non-linear temperature effect and a non-linear lag structure. We examined mean, minimum and maximum temperature, with and without humidity, as well as apparent temperature and the Humidex. The best measure was defined as the one with the minimum cross-validated residual. We found large differences in the best temperature measure between age groups, seasons and cities, and no single temperature measure was superior to the others. The strong correlation between different measures of temperature means that, on average, they have the same predictive ability. The best temperature measure for new studies can therefore be chosen on practical grounds, such as selecting the measure with the least amount of missing data.
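The model-selection idea in this abstract, pick the temperature measure whose cross-validated residual is smallest, can be sketched with a toy example. This is not the authors' Poisson model with non-linear terms; it uses a plain linear fit on synthetic data, and all variable names and numbers below are illustrative assumptions:

```python
import random

def fit_linear(xs, ys):
    # Ordinary least squares for y = a + b*x (closed form).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def cv_residual(xs, ys, k=5):
    # Mean squared held-out residual over k folds: the selection
    # criterion described in the abstract (lower is better).
    n = len(xs)
    total = 0.0
    for fold in range(k):
        train = [i for i in range(n) if i % k != fold]
        test = [i for i in range(n) if i % k == fold]
        a, b = fit_linear([xs[i] for i in train], [ys[i] for i in train])
        total += sum((ys[i] - (a + b * xs[i])) ** 2 for i in test)
    return total / n

random.seed(0)
mean_t = [random.uniform(-5, 30) for _ in range(200)]
max_t = [t + random.uniform(3, 8) for t in mean_t]             # correlated proxy
deaths = [50 + 0.8 * t + random.gauss(0, 2) for t in mean_t]   # driven by mean temp

scores = {"mean": cv_residual(mean_t, deaths),
          "max": cv_residual(max_t, deaths)}
best = min(scores, key=scores.get)   # measure with minimum CV residual
```

Because the synthetic mortality series is generated from mean temperature, the cross-validated residual correctly selects "mean" as the best predictor, mirroring the selection rule the abstract describes.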
Abstract:
A voglite mineral sample from the Volrite Canyon #1 mine, Frey Point, White Canyon Mine District, San Juan County, Utah, USA was used in the present study. An EPR study on a powdered sample confirms the presence of Mn(II) and Cu(II). The optical absorption spectral results are attributed to Cu(II) in a distorted octahedral environment. NIR results indicate the presence of water fundamentals.
Abstract:
Objective: The Brief Michigan Alcoholism Screening Test (bMAST) is a 10-item test derived from the 25-item Michigan Alcoholism Screening Test (MAST). It is widely used in the assessment of alcohol dependence. In the absence of previous validation studies, the principal aim of this study was to assess the validity and reliability of the bMAST as a measure of the severity of problem drinking. Method: Participants were 6,594 patients (4,854 men, 1,740 women) who had been referred to a hospital alcohol and drug service for alcohol-use disorders and who voluntarily took part in this study. Results: An exploratory factor analysis defined a two-factor solution, consisting of Perception of Current Drinking and Drinking Consequences factors. Structural equation modeling confirmed that the fit of a nine-item, two-factor model was superior to the original one-factor model. Concurrent validity was assessed through simultaneous administration of the Alcohol Use Disorders Identification Test (AUDIT) and associations with alcohol consumption and clinically assessed features of alcohol dependence. The two-factor bMAST model showed moderate correlations with the AUDIT. The two-factor bMAST and AUDIT were similarly associated with quantity of alcohol consumption and clinically assessed dependence severity features. No differences were observed between the existing weighted scoring system and the proposed simple scoring system. Conclusions: In this study, both the existing bMAST total score and the two-factor model identified were as effective as the AUDIT in assessing problem drinking severity. There are additional advantages to employing the two-factor bMAST in the assessment and treatment planning of patients seeking treatment for alcohol-use disorders. (J. Stud. Alcohol Drugs 68: 771-779, 2007)
Abstract:
Robust image hashing seeks to transform a given input image into a shorter hashed version using a key-dependent non-invertible transform. These image hashes can be used for watermarking, image integrity authentication or image indexing for fast retrieval. This paper introduces a new method of generating image hashes based on extracting Higher Order Spectral features from the Radon projection of an input image. The feature extraction process is non-invertible and non-linear, and different hashes can be produced from the same image through the use of random permutations of the input. We show that the transform is robust to typical image transformations such as JPEG compression, noise, scaling, rotation, smoothing and cropping. We evaluate our system using a verification-style framework based on calculating false match and false non-match likelihoods, using the publicly available Uncompressed Colour Image Database (UCID) of 1320 images. We also compare our results to Swaminathan's Fourier-Mellin based hashing method, showing at least a 1% EER improvement under noise, scaling and sharpening.
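As a rough illustration of the kind of pipeline described, project the image, extract features, and derive a key-dependent hash, here is a deliberately simplified pure-Python sketch. It uses only 0° and 90° projections instead of a full Radon transform with higher-order spectral features, and every function name is an assumption of this sketch, not the paper's implementation:

```python
import random

def projection_hash(image, key, n_bits=16):
    # Toy projection-based hash: sum along rows and columns
    # (0 deg and 90 deg "Radon" projections), apply a key-dependent
    # permutation, then emit one bit per adjacent-feature comparison.
    rows = [sum(r) for r in image]
    cols = [sum(c) for c in zip(*image)]
    feats = rows + cols
    rng = random.Random(key)      # the key controls the permutation
    rng.shuffle(feats)
    return [1 if feats[i] > feats[i + 1] else 0 for i in range(n_bits)]

def hamming(a, b):
    # Number of differing bits between two hashes.
    return sum(x != y for x, y in zip(a, b))

img = [[(i * 7 + j * 3) % 17 for j in range(16)] for i in range(16)]
noisy = [[v + 1 for v in row] for row in img]   # mild global distortion

h1 = projection_hash(img, key=42)
h2 = projection_hash(noisy, key=42)   # same key, distorted image
h3 = projection_hash(img, key=99)     # different key, typically a different hash
```

The uniform brightness shift leaves all projection comparisons unchanged, so `h1` and `h2` match exactly, a crude stand-in for the robustness-to-distortion property the abstract claims for its far richer feature set.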
Abstract:
Product placement is a fast-growing multi-billion dollar industry, yet measures of its effectiveness, which influence the critical area of pricing, have been problematic. Past attempts to measure the effect of a placement, and therefore provide a basis for pricing of placements, have been confounded by the effect on consumers of multiple prior exposures to a brand name across all marketing communications. Virtual product placement offers several advantages: it provides a tool to measure the effectiveness of product placements, helps address the lack of audience selectivity in traditional product placement, allows different audiences to be tested for brands, and addresses a gap in the existing academic literature by focusing on the impact of product placement on recall and recognition of new brands.
Abstract:
This thesis investigates aspects of encoding the speech spectrum at low bit rates, with extensions to the effect of such coding on automatic speaker identification. Vector quantization (VQ) is a technique for jointly quantizing a block of samples at once, in order to reduce the bit rate of a coding system. The major drawback in using VQ is the complexity of the encoder. Recent research has indicated the potential applicability of the VQ method to speech when product code vector quantization (PCVQ) techniques are utilized. The focus of this research is the efficient representation, calculation and utilization of the speech model as stored in the PCVQ codebook. In this thesis, several VQ approaches are evaluated, and the efficacy of two training algorithms is compared experimentally. It is then shown that these product-code vector quantization algorithms may be augmented with lossless compression algorithms, thus yielding an improved overall compression rate. An approach using a statistical model for the vector codebook indices for subsequent lossless compression is introduced. This coupling of lossy compression and lossless compression enables further compression gain. It is demonstrated that this approach is able to reduce the bit rate requirement from the current 24 bits per 20 millisecond frame to below 20, using a standard spectral distortion metric for comparison. Several fast-search VQ methods for use in speech spectrum coding have been evaluated. The usefulness of fast-search algorithms is highly dependent upon the source characteristics and, although previous research has been undertaken for coding of images using VQ codebooks trained with the source samples directly, the product-code structured codebooks for speech spectrum quantization place new constraints on the search methodology. The second major focus of the research is an investigation of the effect of low-rate spectral compression methods on the task of automatic speaker identification.
The motivation for this aspect of the research arose from a need to simultaneously preserve the speech quality and intelligibility and to provide for machine-based automatic speaker recognition using the compressed speech. This is important because there are several emerging applications of speaker identification where compressed speech is involved. Examples include mobile communications where the speech has been highly compressed, or where a database of speech material has been assembled and stored in compressed form. Although these two application areas have the same objective - that of maximizing the identification rate - the starting points are quite different. On the one hand, the speech material used for training the identification algorithm may or may not be available in compressed form. On the other hand, the new test material on which identification is to be based may only be available in compressed form. Using the spectral parameters which have been stored in compressed form, two main classes of speaker identification algorithm are examined. Some studies have been conducted in the past on bandwidth-limited speaker identification, but the use of short-term spectral compression deserves separate investigation. Combining the major aspects of the research, some important design guidelines for the construction of an identification model when based on the use of compressed speech are put forward.
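The basic VQ encoding step, and the lossless-compression gain available when the codebook-index statistics are skewed, can be sketched as follows. This is a toy example with an assumed codebook and data, not the thesis's product-code quantizer:

```python
import math

def nearest(codebook, vec):
    # Index of the closest codeword (squared Euclidean distance).
    return min(range(len(codebook)),
               key=lambda i: sum((a - b) ** 2
                                 for a, b in zip(codebook[i], vec)))

def entropy_bits(indices):
    # Zeroth-order entropy of the index stream: the per-index rate a
    # lossless coder could approach, versus the fixed log2(K) bits.
    n = len(indices)
    counts = {}
    for i in indices:
        counts[i] = counts.get(i, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

codebook = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]    # K = 4
frames = [(0.1, 0.1), (0.9, 0.1), (0.1, 0.2),
          (0.0, 0.1), (0.2, 0.0), (4.8, 5.1)]
idx = [nearest(codebook, f) for f in frames]

fixed_rate = math.log2(len(codebook))   # 2.0 bits per index
adaptive_rate = entropy_bits(idx)       # lower when indices are skewed
```

Because most toy frames fall near codeword 0, the index entropy is well under the fixed 2 bits per index, which is the effect the thesis exploits when it couples the quantizer with a statistical model of the codebook indices.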
Abstract:
The Lane Change Test (LCT) is one of a growing number of methods developed to quantify driving performance degradation brought about by the use of in-vehicle devices. Beyond its validity and reliability, for such a test to be of practical use, it must also be sensitive to the varied demands of individual tasks. The current study evaluated the ability of several recent LCT lateral control and event detection parameters to discriminate between visual-manual and cognitive surrogate In-Vehicle Information System tasks with different levels of demand. Twenty-seven participants (mean age 24.4 years) completed a PC version of the LCT while performing visual search and math problem solving tasks. A number of the lateral control metrics were found to be sensitive to task differences, but the event detection metrics were less able to discriminate between tasks. The mean deviation and lane excursion measures were able to distinguish between the visual and cognitive tasks, but were less sensitive to the different levels of task demand. The other LCT metrics examined were less sensitive to task differences. A major factor influencing the sensitivity of at least some of the LCT metrics may be the type of lane change instructions given to participants. The provision of clear and explicit lane change instructions, together with further refinement of its metrics, will be essential for increasing the utility of the LCT as an evaluation tool.
Abstract:
The literature was reviewed and analyzed to determine the feasibility of using a combination of acid hydrolysis and CO2-C release during long-term incubation to determine soil organic carbon (SOC) pool sizes and mean residence times (MRTs). Analysis of 1100 data points showed the SOC remaining after hydrolysis with 6 M HCl ranged from 30 to 80% of the total SOC depending on soil type, depth, texture, and management. Nonhydrolyzable carbon (NHC) in conventional-till soils represented 48% of SOC; no-till averaged 56%, forest 55%, and grassland 56%. Carbon dating showed an average of 1200 yr greater MRT for the NHC fraction than total SOC. Long-term incubation, involving measurement of CO2 evolution and curve fitting, measured active and slow pools. Active-pool C comprised 2 to 8% of the SOC with MRTs of days to months; the slow pool comprised 45 to 65% of the SOC and had MRTs of 10 to 80 yr. Comparison of field C-14 and C-13 data with hydrolysis-incubation data showed a high correlation between these independent techniques across soil types and experiments. There were large differences in MRTs depending on the length of the experiment. Insertion of hydrolysis-incubation derived estimates of active (C-a), slow (C-s), and resistant (C-r) pools into the DAYCENT model provided estimates of daily field CO2 evolution rates. These were well correlated with field CO2 measurements. Although not without some interpretation problems, acid hydrolysis plus laboratory incubation is useful for determining SOC pools and fluxes, especially when used in combination with associated measurements.
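The active/slow pool decomposition behind the incubation curve fitting can be illustrated with a minimal two-pool first-order decay model. The pool sizes and rate constants below are hypothetical values chosen to fall inside the ranges quoted in the abstract, not fitted results:

```python
import math

def co2_evolved(t_days, c_active, k_active, c_slow, k_slow):
    # Cumulative CO2-C released by day t from a two-pool first-order
    # decay model: each pool releases C * (1 - exp(-k * t)).
    return (c_active * (1 - math.exp(-k_active * t_days)) +
            c_slow * (1 - math.exp(-k_slow * t_days)))

# Hypothetical pools consistent with the abstract's ranges:
# active ~5% of SOC, MRT ~20 days  -> k = 1/20 per day
# slow   ~55% of SOC, MRT ~25 yr   -> k = 1/9125 per day
soc = 1000.0                            # g C per unit of soil (assumed)
active, slow = 0.05 * soc, 0.55 * soc
released_1yr = co2_evolved(365, active, 1 / 20, slow, 1 / 9125)
released_2yr = co2_evolved(730, active, 1 / 20, slow, 1 / 9125)
```

After a year the active pool is essentially exhausted while the slow pool has barely begun to turn over, which is why, as the abstract notes, the apparent MRT depends so strongly on how long the incubation experiment runs.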
Abstract:
This project discusses a component of a research study conducted to provide construction organizations with a generic benchmarking framework for assessing their extent of information communication technology (ICT) adoption in building project management processes. It defines benchmarking and discusses the objectives of the required benchmarking framework and its development. The study focuses on ICT adoption by small and medium enterprises (SMEs) in the construction industry; for SMEs, it is important to understand processes, their indicators, and measures in the local context. The structure of the suggested benchmarking framework was derived from an extensive literature survey and a questionnaire survey conducted in the Indian construction industry. The suggested benchmarking process is iterative and divided into four stages. It can be implemented at the organization and industry levels for rating construction organizations on ICT adoption and performance measurement. The framework has a generic structure and can be generalized and applied to other countries with due consideration.
Abstract:
The Mobile Emissions Assessment System for Urban and Regional Evaluation (MEASURE) model provides an external validation capability for the hot stabilized option; the model is one of several new modal emissions models designed to predict hot stabilized emission rates for various motor vehicle groups as a function of the conditions under which the vehicles are operating. The validation of aggregate measurements, such as speed and acceleration profiles, is performed on an independent data set using three statistical criteria. The MEASURE algorithms have been shown to provide significant improvements in both average emission estimates and explanatory power over some earlier models for pollutants across almost every operating cycle tested.
Abstract:
Purpose: To undertake rigorous psychometric testing of the newly developed contemporary work environment measure (the Brisbane Practice Environment Measure [B-PEM]) using exploratory factor analysis and confirmatory factor analysis. Methods: Content validity of the 33-item measure was established by a panel of experts. Initial testing involved 195 nursing staff using principal component factor analysis with varimax rotation (orthogonal) and Cronbach's alpha coefficients. Confirmatory factor analysis was conducted using data from a further 983 nursing staff. Results: Principal component factor analysis yielded a four-factor solution with eigenvalues greater than 1 that explained 52.53% of the variance. These factors were then verified using confirmatory factor analysis. Goodness-of-fit indices showed an acceptable fit overall with the full model, explaining 21% to 73% of the variance. Deletion of items took place throughout the evolution of the instrument, resulting in a 26-item, four-factor measure called the Brisbane Practice Environment Measure-Tested. Conclusions: The B-PEM has undergone rigorous psychometric testing, providing evidence of internal consistency and goodness-of-fit indices within acceptable ranges. The measure can be utilised as a subscale or total score reflective of a contemporary nursing work environment. Clinical Relevance: An up-to-date instrument to measure practice environment may be useful for nursing leaders to monitor the workplace and to assist in identifying areas for improvement, facilitating greater job satisfaction and retention.
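One internal-consistency statistic behind this kind of psychometric testing is Cronbach's alpha, which the abstract reports for the B-PEM. A minimal sketch of the computation, using hypothetical item scores rather than B-PEM data:

```python
def cronbach_alpha(items):
    # items: one list of scores per item (same respondents, same order).
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Hypothetical 4-item scale, 6 respondents, broadly consistent answers:
scores = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 5],
    [5, 5, 2, 4, 3, 4],
    [4, 5, 3, 4, 2, 5],
]
alpha = cronbach_alpha(scores)   # high alpha -> internally consistent scale
```

Because the four hypothetical items rise and fall together across respondents, alpha comes out above the conventional 0.7 acceptability threshold; uncorrelated items would drive it toward zero.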