811 results for Mahler Measure
Abstract:
Computer simulated trajectories of bulk water molecules form complex spatiotemporal structures at the picosecond time scale. This intrinsic complexity, which underlies the formation of molecular structures at longer time scales, has been quantified using a measure of statistical complexity. The method estimates the information contained in the molecular trajectory by detecting and quantifying temporal patterns present in the simulated data (velocity time series). Two types of temporal patterns are found. The first, defined by the short-time correlations corresponding to the velocity autocorrelation decay times (≈0.1 ps), remains asymptotically stable for time intervals longer than several tens of nanoseconds. The second is caused by previously unknown longer-time correlations (found at time scales longer than nanoseconds) that lead to a value of statistical complexity that slowly increases with time. A direct measure, based on the notion of statistical complexity, is introduced that describes how the trajectory explores the phase space and is independent of the particular molecular signal used as the observed time series. © 2008 The American Physical Society.
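The abstract does not spell out the estimator, but a widely used formulation of statistical complexity for a time series is the permutation-based product C = H·D, where H is the normalized Shannon entropy of ordinal patterns and D is the Jensen-Shannon disequilibrium from the uniform distribution. The sketch below is a minimal illustration under that assumption; the embedding dimension d and the white-noise test signal are illustrative choices, not the paper's.

```python
# A minimal sketch (not the authors' exact estimator): permutation-based
# statistical complexity C = H * D, where H is the normalized Shannon entropy
# of ordinal patterns and D is the Jensen-Shannon disequilibrium to the
# uniform distribution.
import numpy as np
from itertools import permutations
from math import log, factorial

def ordinal_distribution(x, d=4):
    """Frequencies of ordinal (permutation) patterns of embedding dimension d."""
    patterns = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        patterns[tuple(np.argsort(x[i:i + d]))] += 1
    p = np.array(list(patterns.values()), dtype=float)
    return p / p.sum()

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def statistical_complexity(x, d=4):
    p = ordinal_distribution(x, d)
    n = factorial(d)
    u = np.full(n, 1.0 / n)                 # uniform reference distribution
    h = shannon(p) / log(n)                 # normalized entropy, in [0, 1]
    # Jensen-Shannon divergence between p and u, normalized by its maximum
    js = shannon((p + u) / 2) - shannon(p) / 2 - shannon(u) / 2
    js_max = -0.5 * ((n + 1) / n * log(n + 1) - 2 * log(2 * n) + log(n))
    return h * js / js_max

# Example: white noise has high entropy but near-zero complexity
rng = np.random.default_rng(0)
print(statistical_complexity(rng.standard_normal(10_000)))
```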
Abstract:
PURPOSE: To investigate the MacDQoL's test-retest reliability and sensitivity to change in vision over a period of one year in a sample of patients with age-related macular degeneration (AMD). DESIGN: A prospective, observational study. METHOD: Patients with AMD from an ophthalmologist's list (n = 135) completed the MacDQoL questionnaire by telephone interview and underwent a vision assessment on two occasions, one year apart. RESULTS: Among participants whose vision was stable over one year (n = 87), MacDQoL scores at baseline and follow-up were highly correlated (r = 0.95; P < .0001). Twelve of the 22 scale items had intraclass correlations of >.80; only two were correlated <.7. There was no difference between baseline and follow-up scores (P = .85), indicating excellent test-retest reliability. Poorer quality of life (QoL) at follow-up, measured by the MacDQoL present QoL overview item, was associated with deterioration in both better-eye and binocular distance visual acuity (VA) (r = 0.29; P = .001 and r = 0.21; P = .016, respectively; n = 135). There was a positive correlation between deterioration in the MacDQoL average weighted impact score and deterioration in both binocular near VA and reading speed (r = 0.20; P = .019 and r = 0.18; P = .041, respectively; n = 135). CONCLUSION: The MacDQoL has excellent test-retest reliability. Its sensitivity to change in vision status was demonstrated in correlational analyses. The measure indicates that the negative impact of AMD on QoL increases with increasing severity of visual impairment.
Abstract:
Data Envelopment Analysis (DEA) is a nonparametric method for measuring the efficiency of a set of decision making units such as firms or public sector agencies, first introduced into the operational research and management science literature by Charnes, Cooper, and Rhodes (CCR) [Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429–444]. The original DEA models were applicable only to technologies characterized by positive inputs/outputs. In subsequent literature there have been various approaches to enable DEA to deal with negative data. In this paper, we propose a semi-oriented radial measure, which permits the presence of variables which can take both negative and positive values. The model is applied to data on a notional effluent processing system to compare the results with those yielded by two alternative methods for dealing with negative data in DEA: the modified slacks-based model suggested by Sharp et al. [Sharp, J.A., Liu, W.B., Meng, W., 2006. A modified slacks-based measure model for data envelopment analysis with 'natural' negative outputs and inputs. Journal of the Operational Research Society 57 (11) 1–6] and the range directional model developed by Portela et al. [Portela, M.C.A.S., Thanassoulis, E., Simpson, G., 2004. A directional distance approach to deal with negative data in DEA: An application to bank branches. Journal of the Operational Research Society 55 (10) 1111–1121]. A further example explores the advantages of using the new model.
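As background for the negative-data variants being compared, here is a minimal sketch of the classical input-oriented CCR envelopment model the abstract refers to, which presumes non-negative data, solved as a linear program with SciPy. The four-DMU dataset is invented for illustration and is not the paper's effluent-processing example.

```python
# Input-oriented CCR envelopment model, one LP per DMU:
#   min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 6.0, 4.0],      # inputs: rows = inputs, cols = DMUs
              [5.0, 4.0, 8.0, 3.0]])
Y = np.array([[10.0, 12.0, 11.0, 9.0]])  # outputs: rows = outputs, cols = DMUs

m, n = X.shape
s = Y.shape[0]

def ccr_efficiency(o):
    """Efficiency score of DMU o; decision variables are [theta, lam_1..lam_n]."""
    c = np.r_[1.0, np.zeros(n)]                  # minimize theta
    A_in = np.hstack([-X[:, [o]], X])            # X @ lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y @ lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```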
Abstract:
Objectives: To develop an objective measure to enable hospital Trusts to compare their use of antibiotics. Design: Self-completion, postal questionnaire with telephone follow-up. Sample: 4 hospital Trusts in the English Midlands. Results: The survey showed that it was possible to collect data concerning the number of Defined Daily Doses (DDDs) of quinolone antibiotic dispensed per Finished Consultant Episode (FCE) in each Trust. In the 4 Trusts studied, the mean DDD/FCE was 0.197 (range 0.117 to 0.258). This indicates that, based on a typical course length of 5 days, 3.9% of patient episodes resulted in the prescription of a quinolone antibiotic. Antibiotic prescribing control measures in each Trust were found to be comparable. Conclusion: The measure will enable Trusts to objectively compare their usage of quinolone antibiotics and use this information to carry out clinical audit should differences be recorded. This is likely to be applicable to other groups of antibiotics.
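The headline percentage follows from a one-line calculation, reproduced below under the abstract's own assumption of one DDD per treatment day.

```python
# Reproducing the abstract's figure: 0.197 DDDs dispensed per finished
# consultant episode, divided by a typical 5-day course (1 DDD/day assumed),
# gives the fraction of episodes with a quinolone prescription.
ddd_per_fce = 0.197        # mean across the 4 Trusts (range 0.117-0.258)
ddds_per_course = 5        # typical course length in DDDs (stated assumption)
print(f"{ddd_per_fce / ddds_per_course:.1%}")   # -> 3.9%
```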
Abstract:
There has been much recent research into extracting useful diagnostic features from the electrocardiogram, with numerous studies claiming impressive results. However, the robustness and consistency of the methods employed in these studies is rarely, if ever, mentioned. Hence, we propose two new methods: a biologically motivated time series derived from consecutive P-wave durations, and a mathematically motivated regularity measure. We investigate the robustness of these two methods when compared with current corresponding methods. We find that the new time series performs admirably as a complement to the current method, and the new regularity measure consistently outperforms the current measure in numerous tests on real and synthetic data.
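The abstract does not identify the regularity measure, so the sketch below uses sample entropy (SampEn), a standard regularity statistic for physiological time series, purely as a stand-in illustration; the parameters m and r are conventional defaults, not the paper's choices.

```python
# Sample entropy: -ln(A/B), where B counts pairs of length-m templates within
# tolerance r (Chebyshev distance) and A counts the same for length m+1.
# Lower values indicate a more regular (self-similar) signal.
import numpy as np

def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                       # conventional default tolerance
    def count_matches(length):
        # N - m templates for both lengths, per the standard SampEn definition
        templates = np.array([x[i:i + length] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count
    return -np.log(count_matches(m + 1) / count_matches(m))

rng = np.random.default_rng(1)
regular = np.sin(np.linspace(0, 40 * np.pi, 2000))
noisy = rng.standard_normal(2000)
print(sample_entropy(regular), sample_entropy(noisy))  # regular << noisy
```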
Abstract:
Personal selling and sales management play a critical role in the short- and long-term success of the firm, and have thus received substantial academic interest since the 1970s. Sales research has examined the role of the sales manager in some depth, defining a number of key technical and interpersonal roles which sales managers have in influencing sales force effectiveness. However, one aspect of sales management which appears to remain unexplored is that of their resolution of salesperson-related problems. This study represents the first attempt to address this gap by reporting on the conceptual and empirical development of an instrument designed to measure sales managers' problem resolution styles. A comprehensive literature review and qualitative research study identified three key constructs relating to sales managers' problem resolution styles. The three constructs identified were termed: sales manager willingness to respond, sales manager caring, and sales manager aggressiveness. Building on this, existing literature was used to develop a conceptual model of salesperson-specific consequences of the three problem resolution style constructs. The quantitative phase of the study consisted of a mail survey of UK salespeople, achieving a total sample of 140 fully usable responses. Rigorous statistical assessment of the sales manager problem resolution style measures was undertaken, and construct validity examined. Following this, the conceptual model was tested using latent variable path analysis. The results for the model were encouraging overall, and also with regard to the individual hypotheses. Sales manager problem resolution styles were found individually to have significant impacts on the salesperson-specific variables of role ambiguity, emotional exhaustion, job satisfaction, organisational commitment and organisational citizenship behaviours. The findings, theoretical and managerial implications, limitations and directions for future research are discussed.
Abstract:
This paper develops a theory of tourist satisfaction which is tested using a consumerist gap scale derived from the Ragheb and Beard Leisure Motivation Scale. The sample consists of 1127 holidaymakers from the East Midlands, UK. The results confirm the four dimensions of the original scale, and are used to develop clusters of holidaymakers. These clusters are found to be determinants of attitudes towards holiday destination attributes, and are independent of socio-demographic variables. Other determinants of holidaymaker satisfaction are also examined. Among the conclusions drawn are the continuing importance of life cycle stages and previous holidaymaker satisfaction. Little evidence is found for the travel career hypothesis developed by Professor Philip Pearce.
Abstract:
We developed an alternative approach for measuring information and communication technology (ICT), applying Data Envelopment Analysis (DEA) to data from the International Telecommunication Union covering a sample of 183 economies. We compared the ICT-Opportunity Index (ICT-OI) with our DEA-Opportunity Index (DEA-OI) and found a high correlation between the two. Our findings suggest that both indices are consistent in their measurement of digital opportunity, though differences still exist across regions. Our new DEA-OI offers more than the ICT-OI: using our model, the target and peer groups for each country can be identified.
Abstract:
This paper examines problems in the definition of the General Non-Parametric Corporate Performance (GNCP) measure and introduces a multiplicative linear programming model as an alternative for assessing corporate performance. We verified and tested a statistically significant difference between the two models in an application to 27 UK industries using six performance ratios. Our new model is found to be more robust than the previous standard Data Envelopment Analysis (DEA) model.
Abstract:
In this paper, we propose a new similarity measure to compute the pair-wise similarity of text-based documents based on patterns of the words in the documents. First, we develop a kappa measure for pair-wise comparison of documents; then we use an ordered weighted averaging (OWA) operator to define a document similarity measure for a set of documents.
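A minimal sketch of one plausible reading of this construction: Cohen's kappa on word presence/absence vectors as the pair-wise measure, followed by an OWA operator that aggregates each document's similarities to the rest of the set. The toy corpus, vocabulary, and OWA weights are assumptions, not the authors' choices.

```python
import numpy as np

def kappa_similarity(doc_a, doc_b, vocab):
    """Cohen's kappa between the word-presence patterns of two documents."""
    a = np.array([w in doc_a for w in vocab])
    b = np.array([w in doc_b for w in vocab])
    po = np.mean(a == b)                                        # observed agreement
    pe = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())  # chance agreement
    return (po - pe) / (1 - pe) if pe < 1 else 1.0

def owa(values, weights):
    """OWA operator: weights applied to the values sorted in descending order."""
    return np.sort(values)[::-1] @ np.asarray(weights)

docs = [{"dea", "efficiency", "negative", "data"},
        {"dea", "efficiency", "bank", "branches"},
        {"poverty", "income", "distribution", "data"}]
vocab = sorted(set().union(*docs))
weights = [0.5, 0.5]          # illustrative 2-weight OWA (mean of the top two)

for i, d in enumerate(docs):
    sims = [kappa_similarity(d, e, vocab) for j, e in enumerate(docs) if j != i]
    print(f"doc {i}: OWA similarity to the set = {owa(sims, weights):.3f}")
```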
Abstract:
Aims - To develop a method that prospectively assesses adherence rates in paediatric patients with acute lymphoblastic leukaemia (ALL) who are receiving the oral thiopurine treatment 6-mercaptopurine (6-MP). Methods - A total of 19 paediatric patients with ALL who were receiving 6-MP therapy were enrolled in this study. A new objective tool (hierarchical cluster analysis of drug metabolite concentrations) was explored as a novel approach to assess non-adherence to oral thiopurines, in combination with other objective measures (the pattern of variability in 6-thioguanine nucleotide erythrocyte concentrations and 6-thiouric acid plasma levels) and the subjective measure of self-reported adherence questionnaire. Results - Parents of five ALL patients (26.3%) reported at least one aspect of non-adherence, with the majority (80%) citing “carelessness at times about taking medication” as the primary reason for non-adherence followed by “forgetting to take the medication” (60%). Of these patients, three (15.8%) were considered non-adherent to medication according to the self-reported adherence questionnaire (scored ≥ 2). Four ALL patients (21.1%) had metabolite profiles indicative of non-adherence (persistently low levels of metabolites and/or metabolite levels clustered variably with time). Out of these four patients, two (50%) admitted non-adherence to therapy. Overall, when both methods were combined, five patients (26.3%) were considered non-adherent to medication, with higher age representing a risk factor for non-adherence (P < 0.05). Conclusions - The present study explored various ways to assess adherence rates to thiopurine medication in ALL patients and highlighted the importance of combining both objective and subjective measures as a better way to assess adherence to oral thiopurines.
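To make the clustering idea concrete, the sketch below applies hierarchical cluster analysis to synthetic serial metabolite concentrations and flags the series whose measurements split into two well-separated clusters over time, one suggested signature of erratic adherence. The data, gap statistic, and units are illustrative assumptions, not the study's values.

```python
# Hierarchical clustering of serial metabolite levels (synthetic 6-TGN-like
# values): an erratic on/off adherence pattern produces two widely separated
# clusters, while steady adherence does not.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
steady = rng.normal(300, 20, size=10)                            # stable levels
erratic = np.r_[rng.normal(300, 20, 5), rng.normal(80, 15, 5)]   # on/off pattern

for label, series in [("steady", steady), ("erratic", erratic)]:
    Z = linkage(series.reshape(-1, 1), method="ward")
    clusters = fcluster(Z, t=2, criterion="maxclust")
    gap = abs(series[clusters == 1].mean() - series[clusters == 2].mean())
    print(f"{label}: between-cluster gap = {gap:.0f} (assumed concentration units)")
```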
Abstract:
Using a modified deprivation (or poverty) function, in this paper, we theoretically study the changes in poverty with respect to the 'global' mean and variance of the income distribution using Indian survey data. We show that when the income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty, while an increase in the variance of the income distribution increases poverty. This altruistic view for a developing economy, however, is no longer tenable once the poverty index is found to follow a Pareto distribution. Here, although a rising mean income indicates a reduction in poverty, due to the presence of an inflexion point in the poverty function there is a critical value of the variance below which poverty decreases with increasing variance, while beyond this value poverty undergoes a steep increase followed by a decrease with respect to higher variance. Identifying this inflexion point as the poverty line, we show that the Pareto poverty function satisfies all three standard axioms of a poverty index [N.C. Kakwani, Econometrica 43 (1980) 437; A.K. Sen, Econometrica 44 (1976) 219], whereas the log-normal distribution falls short of this requisite. Following these results, we make quantitative predictions to correlate a developing with a developed economy. © 2006 Elsevier B.V. All rights reserved.
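The direction-of-change claims for the log-normal case can be checked with the simplest poverty index, the head count P = F(z); the paper's modified deprivation function is richer, so the sketch below, with an assumed poverty line and parameters, is only illustrative.

```python
# Head-count poverty under log-normal income: P = F(z). Raising the mean
# (mu up, sigma fixed) lowers P; raising the variance at fixed mean raises P.
import numpy as np
from scipy.stats import lognorm

z = 1.0                                    # poverty line (illustrative units)

def head_count(mu, sigma):
    """P = F(z) for income ~ LogNormal(mu, sigma)."""
    return lognorm(s=sigma, scale=np.exp(mu)).cdf(z)

# Rising mean income (mu up, sigma fixed): poverty falls
for mu in [0.2, 0.5, 1.0]:
    print(f"mu={mu:.1f}: P={head_count(mu, 0.6):.3f}")

# Rising variance (sigma up, mu adjusted to hold the mean fixed): poverty rises
mean_income = 2.0
for sigma in [0.4, 0.8, 1.2]:
    mu = np.log(mean_income) - sigma**2 / 2    # keeps E[income] = mean_income
    print(f"sigma={sigma:.1f}: P={head_count(mu, sigma):.3f}")
```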
Abstract:
Over the last few years, Data Envelopment Analysis (DEA) has been gaining increasing popularity as a tool for measuring the efficiency and productivity of Decision Making Units (DMUs). Conventional DEA models assume non-negative inputs and outputs. However, in many real applications, some inputs and/or outputs can take negative values. Recently, Emrouznejad et al. [6] introduced a Semi-Oriented Radial Measure (SORM) for modelling DEA with negative data. This paper points out some issues in target setting with SORM models and introduces a modified SORM approach. An empirical study in the banking sector demonstrates the applicability of the proposed model. © 2014 Elsevier Ltd. All rights reserved.