758 results for Determinant-based sparseness measure
Abstract:
Background: To derive preference-based measures from various condition-specific descriptive health-related quality of life (HRQOL) measures. A general two-stage method has been developed: 1) an item from each domain of the HRQOL measure is selected to form a health state classification system (HSCS); 2) a sample of health states is valued and an algorithm derived for estimating the utility of all possible health states. The aim of this analysis was to determine whether confirmatory or exploratory factor analysis (CFA, EFA) should be used to derive a cancer-specific utility measure from the EORTC QLQ-C30. Methods: Data were collected with the QLQ-C30v3 from 356 patients receiving palliative radiotherapy for recurrent or metastatic cancer (various primary sites). The dimensional structure of the QLQ-C30 was tested with EFA and CFA, the latter based on a conceptual model (the established domain structure of the QLQ-C30: physical, role, emotional, social and cognitive functioning, plus several symptoms) and clinical considerations (the views of both patients and clinicians about issues relevant to HRQOL in cancer). The dimensions determined by each method were then subjected to item response theory analysis, including Rasch analysis. Results: CFA results generally supported the proposed conceptual model, with residual correlations requiring only minor adjustments (namely, the introduction of two cross-loadings) to improve model fit (increment χ2(2) = 77.78, p < .001). Although EFA revealed a structure similar to the CFA, some items had loadings that were difficult to interpret. Further assessment of dimensionality with Rasch analysis aligned the EFA dimensions more closely with the CFA dimensions. Three items exhibited floor effects (>75% of observations at the lowest score), six exhibited misfit to the Rasch model (fit residual > 2.5), none exhibited disordered item response thresholds, and four exhibited differential item functioning (DIF) by gender or cancer site.
Upon inspection of the remaining items, three were considered less clinically important than the other nine. Conclusions: CFA appears more appropriate than EFA, given the well-established structure of the QLQ-C30 and its clinical relevance. Further, the confirmatory approach produced more interpretable results than the exploratory approach. Other aspects of the general method remain largely the same. The revised method will be applied to a large number of data sets as part of the international and interdisciplinary project to develop a multi-attribute utility instrument for cancer (MAUCa).
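The floor-effect screen reported in the Results (items with >75% of observations at the lowest score) is simple to reproduce. A minimal sketch, assuming hypothetical item-response data rather than the study's actual data set:

```python
import numpy as np

def floor_effect_items(responses, threshold=0.75):
    """Flag items where the proportion of responses at the lowest
    observed score exceeds `threshold` (75% in the abstract).

    responses: 2D array, rows = patients, columns = items.
    """
    responses = np.asarray(responses)
    flagged = []
    for j in range(responses.shape[1]):
        col = responses[:, j]
        prop_at_floor = np.mean(col == col.min())
        if prop_at_floor > threshold:
            flagged.append(j)
    return flagged

# Hypothetical data: 3 items scored 1-4; item 0 sits at the floor.
data = np.array([
    [1, 2, 1],
    [1, 3, 2],
    [1, 1, 4],
    [1, 4, 3],
    [2, 2, 1],
])
print(floor_effect_items(data))  # [0]: 4/5 = 80% at the lowest score
```

The same per-item loop extends naturally to ceiling effects by comparing against `col.max()` instead.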
Abstract:
Background and Aims Research into craving is hampered by a lack of theoretical specification and a plethora of substance-specific measures. This study aimed to develop a generic measure of craving based on elaborated intrusion (EI) theory. Confirmatory factor analysis (CFA) examined whether a generic measure replicated the three-factor structure of the Alcohol Craving Experience (ACE) scale over different consummatory targets and time-frames. Design Twelve studies were pooled for CFA. Targets included alcohol, cigarettes, chocolate and food. Focal periods varied from the present moment to the previous week. Separate analyses were conducted for strength and frequency forms. Setting Nine studies included university students, with single studies drawn from an internet survey, a community sample of smokers and alcohol-dependent out-patients. Participants A heterogeneous sample of 1230 participants. Measurements Adaptations of the ACE questionnaire. Findings Both craving strength [comparative fit index (CFI) = 0.974, root mean square error of approximation (RMSEA) = 0.039, 95% confidence interval (CI) = 0.035–0.044] and frequency (CFI = 0.971, RMSEA = 0.049, 95% CI = 0.044–0.055) gave an acceptable three-factor solution across desired targets that mapped onto the structure of the original ACE (intensity, imagery, intrusiveness), after removing one item, re-allocating another and taking intercorrelated error terms into account. Similar structures were obtained across time-frames and targets. Preliminary validity data on the resulting 10-item Craving Experience Questionnaire (CEQ) for cigarettes and alcohol were strong. Conclusions The Craving Experience Questionnaire (CEQ) is a brief, conceptually grounded and psychometrically sound measure of desires. It demonstrates a consistent factor structure across a range of consummatory targets in both laboratory and clinical contexts.
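The CFI and RMSEA values quoted above have standard closed-form point estimates computed from the model chi-square, its degrees of freedom and the sample size. A minimal sketch with illustrative numbers (not the study's actual statistics):

```python
import math

def rmsea(chi2, df, n):
    """RMSEA point estimate from the model chi-square, degrees of
    freedom and sample size (standard formula)."""
    return math.sqrt(max(chi2 / df - 1.0, 0.0) / (n - 1))

def cfi(chi2_m, df_m, chi2_b, df_b):
    """Comparative fit index from model and baseline (independence)
    chi-squares and their degrees of freedom."""
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, d_m)
    return 1.0 - d_m / d_b if d_b > 0 else 1.0

# Illustrative (made-up) values for a well-fitting model on N = 1230:
print(round(rmsea(chi2=180.0, df=120, n=1230), 3))                   # 0.02
print(round(cfi(chi2_m=180.0, df_m=120, chi2_b=2500.0, df_b=153), 3))  # 0.974
```

Conventional cut-offs (CFI ≥ 0.95, RMSEA ≤ 0.06) are what make the reported values "acceptable".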
Abstract:
Fractional anisotropy (FA), a widely used measure of fiber integrity based on diffusion tensor imaging (DTI), is a problematic concept because it is influenced by several quantities, including the number of dominant fiber directions within each voxel, each fiber's anisotropy, and partial volume effects from neighboring gray matter. With high-angular resolution diffusion imaging (HARDI) and the tensor distribution function (TDF), one can reconstruct multiple underlying fibers per voxel, along with their individual anisotropy measures, by representing the diffusion profile as a probabilistic mixture of tensors. We found that FA, when compared with TDF-derived anisotropy measures, correlates poorly with individual fiber anisotropy and may sub-optimally detect disease processes that affect myelination. By contrast, mean diffusivity (MD) as defined in standard DTI appears to be more accurate. Overall, we argue that novel measures derived from the TDF approach may yield more sensitive and accurate information than DTI-derived measures.
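For context, the standard single-tensor FA and MD that the abstract critiques are direct functions of the three tensor eigenvalues. A minimal sketch with illustrative eigenvalues (in mm²/s):

```python
import numpy as np

def fa_md(eigenvalues):
    """Fractional anisotropy and mean diffusivity from the three
    eigenvalues of one diffusion tensor (standard DTI formulas).
    This single-tensor model is exactly what conflates multiple
    fiber populations within a voxel."""
    lam = np.asarray(eigenvalues, dtype=float)
    md = lam.mean()
    fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
    return fa, md

# A strongly anisotropic ("fiber-like") tensor vs. an isotropic one:
fa1, md1 = fa_md([1.7e-3, 0.3e-3, 0.3e-3])
fa2, md2 = fa_md([0.8e-3, 0.8e-3, 0.8e-3])
print(round(fa1, 2), round(fa2, 2))  # ~0.8 vs. 0.0
```

Two crossing fibers in one voxel drag FA toward the isotropic case even when each fiber is individually intact, which is the failure mode the TDF approach addresses.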
Abstract:
Purpose Health service quality is an important determinant for health service satisfaction and behavioral intentions. The purpose of this paper is to investigate requirements of e‐health services and to develop a measurement model to analyze the construct of “perceived e‐health service quality.” Design/methodology/approach The paper adapts the C‐OAR‐SE procedure for scale development by Rossiter. The focal aspect is the “physician‐patient relationship” which forms the core dyad in the healthcare service provision. Several in‐depth interviews were conducted in Switzerland; first with six patients (as raters), followed by two experts of the healthcare system (as judges). Based on the results and an extensive literature research, the classification of object and attributes is developed for this model. Findings The construct e‐health service quality can be described as an abstract formative object and is operationalized with 13 items: accessibility, competence, information, usability/user friendliness, security, system integration, trust, individualization, empathy, ethical conduct, degree of performance, reliability, and ability to respond. Research limitations/implications Limitations include the number of interviews with patients and experts as well as critical issues associated with C‐OAR‐SE. More empirical research is needed to confirm the quality indicators of e‐health services. Practical implications Health care providers can utilize the results for the evaluation of their service quality. Practitioners can use the hierarchical structure to measure service quality at different levels. The model provides a diagnostic tool to identify poor and/or excellent performance with regard to the e‐service delivery. Originality/value The paper contributes to knowledge with regard to the measurement of e‐health quality and improves the understanding of how customers evaluate the quality of e‐health services.
Abstract:
Following the spirit of the enhanced Russell graph measure, this paper proposes an enhanced Russell-based directional distance measure (ERBDDM) model for dealing with desirable and undesirable outputs in data envelopment analysis (DEA) while allowing some inputs and outputs to be zero. The proposed method is analogous to the output-oriented slacks-based measure (OSBM) and the directional output distance function approach because it allows the expansion of desirable outputs and the contraction of undesirable outputs. The ERBDDM is superior to the OSBM model and the traditional approach: it identifies all the inefficiency slacks, which the traditional approach cannot, while avoiding the misperception and misspecification of the OSBM model, which fails to reflect the null-jointness of the production of goods and bads. The paper also imposes a strong complementary slackness condition on the ERBDDM model to deal with the occurrence of multiple projections. Furthermore, we use the Penn World Table data to explore the new approach in the context of environmental policy evaluation and guidance for performance improvement in 111 countries.
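The core mechanic that the ERBDDM shares with the directional output distance function, expanding the good output while contracting the bad one along a direction vector, can be sketched as a small linear program. This is only a stripped-down cousin of the ERBDDM (one good and one bad output, no Russell-type slacks, no strong complementary slackness condition), on toy data:

```python
import numpy as np
from scipy.optimize import linprog

def directional_distance(X, Y, B, o):
    """Simplified directional output distance function for DMU `o`:
    maximize beta such that the good output expands to y_o*(1+beta)
    and the bad output contracts to b_o*(1-beta), with inputs fixed.
    The bad-output constraint is an equality (weak disposability)."""
    n = len(X)
    # Variables: lambda_1..lambda_n, beta; linprog minimizes, so use -beta.
    c = np.zeros(n + 1)
    c[-1] = -1.0
    # Good output: sum(lam*Y) >= y_o + beta*y_o  ->  -sum(lam*Y) + beta*y_o <= -y_o
    A_ub = [np.append(-np.asarray(Y, float), Y[o]),
            np.append(np.asarray(X, float), 0.0)]   # inputs: sum(lam*X) <= x_o
    b_ub = [-Y[o], X[o]]
    # Bad output: sum(lam*B) = b_o - beta*b_o
    A_eq = [np.append(np.asarray(B, float), B[o])]
    b_eq = [B[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + 1))
    return res.x[-1]

# Toy data: 3 DMUs, one input X, one good output Y, one bad output B.
X, Y, B = [1, 1, 1], [2, 3, 1], [1, 1, 2]
beta = directional_distance(X, Y, B, o=2)  # DMU 2 is dominated
print(round(beta, 4))
```

A value of `beta = 0` would indicate efficiency in this direction; positive values measure how far the unit can simultaneously raise its good output and cut its bad output.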
Abstract:
A new method based on the analysis of a single diffraction pattern is proposed to measure deflections in micro-cantilever (MC) based sensor probes, achieving typical deflection resolutions of 1 nm and surface stress changes of 50 µN/m. The proposed method employs a double-MC structure in which the deflection of one micro-cantilever relative to the other, caused by surface stress changes, results in a linear shift of the intensity maxima of the Fraunhofer diffraction pattern of the transilluminated MC. Such shifts in the intensity maxima of a particular order along the length of the structure can be measured to an accuracy of 0.01 mm, leading to the proposed sensitivity of deflection measurement in a typical micro-cantilever. This method can overcome the fundamental measurement sensitivity limit set by diffraction and the pointing stability of the laser beam in the widely used optical beam deflection method (OBDM).
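The shift-of-maxima readout can be illustrated with the textbook single-slit Fraunhofer pattern; the paper's transilluminated double-cantilever geometry is more involved, and the slit width, wavelength and screen distance below are purely illustrative assumptions:

```python
import numpy as np

def fraunhofer_intensity(x, slit_width, wavelength, screen_dist):
    """Normalized single-slit Fraunhofer intensity at screen position x
    (small-angle sinc^2 profile). Generic textbook model, not the
    paper's double micro-cantilever structure."""
    beta = np.pi * slit_width * x / (wavelength * screen_dist)
    return np.sinc(beta / np.pi) ** 2    # np.sinc(t) = sin(pi*t)/(pi*t)

# Locate the first-order secondary maximum for a 20 um aperture,
# 633 nm laser and a 0.5 m screen distance (illustrative values).
x = np.linspace(0.016, 0.03, 100000)     # window past the first minimum
intensity = fraunhofer_intensity(x, 20e-6, 633e-9, 0.5)
x_first_max = x[np.argmax(intensity)]    # tracking shifts of this position
                                         # is the measurement principle
```

Because the maxima positions scale with the aperture geometry, a deflection-induced change in that geometry translates into a lateral shift of `x_first_max` that can be read off the pattern.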
Abstract:
A new approach based on occupation measures is introduced for studying stochastic differential games. For two-person zero-sum games, the existence of values and optimal strategies for both players is established for various payoff criteria. For N-person games, the existence of equilibria in Markov strategies is established for various cases.
Abstract:
We introduce a novel temporal feature of a signal, the extrema-based signal track length (ESTL), for the problem of speech segmentation. We show that the ESTL measure is sensitive to both the amplitude and the frequency of the signal. The short-time ESTL (ST_ESTL) offers a promising way to capture the significant segments of a speech signal, where the segments correspond to acoustic units of speech having distinct temporal waveforms. We compare ESTL-based segmentation with the ML and STM methods and find that it is as good as spectral-feature-based segmentation, but with lower computational complexity.
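The abstract does not spell out the exact ESTL definition, but its key property, growing with both amplitude and frequency, is easy to demonstrate with a simplified track-length variant: the summed absolute sample-to-sample change per analysis frame. A sketch on synthetic tones (this is an assumption-laden stand-in, not the paper's exact feature):

```python
import numpy as np

def short_time_track_length(x, frame_len, hop):
    """Simplified short-time signal track length: the summed absolute
    sample-to-sample amplitude change inside each frame. The paper's
    ESTL is defined via signal extrema; this variant keeps the key
    property of rising with both amplitude and frequency."""
    return np.array([np.abs(np.diff(x[i:i + frame_len])).sum()
                     for i in range(0, len(x) - frame_len + 1, hop)])

fs = 8000
t = np.arange(fs) / fs
low = np.sin(2 * np.pi * 100 * t)     # 100 Hz tone
high = np.sin(2 * np.pi * 400 * t)    # 400 Hz tone, same amplitude
tl_low = short_time_track_length(low, 400, 400).mean()
tl_high = short_time_track_length(high, 400, 400).mean()
print(tl_high > tl_low)  # True: track length rises with frequency
```

Segment boundaries would then be hypothesized where this short-time profile changes sharply, mirroring how distinct acoustic units have distinct temporal waveforms.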
Abstract:
Although many optical fibre applications are based on their capacity to transmit optical signals with low losses, it can also be desirable for an optical fibre to be strongly affected by a certain physical parameter in the environment. In this way, it can be used as a sensor for that parameter. There are many strong arguments for the use of plastic optical fibres (POFs) as sensors. In addition to being easy to handle and low cost, they share the advantages common to all multimode optical fibres: flexibility, small size, good electromagnetic compatibility and, in general, the possibility of measuring a phenomenon without physically interacting with it. In this paper, a POF-based sensor is designed and analysed with the aim of measuring the volume and turbidity of a low-viscosity fluid, in this case water, as it passes through a pipe. A comparative study with a commercial sensor is provided to validate the proposed flow measurement method. Likewise, turbidity is measured using different colour dyes. Finally, the paper presents the most significant results and conclusions from all the tests carried out.
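A turbidity reading of this kind is commonly reduced to an attenuation coefficient via the Beer-Lambert law. A minimal sketch under that assumption (single optical path, no scattering geometry; the numbers are illustrative, and a real POF sensor would still need calibration against a reference instrument, as in the paper):

```python
import math

def turbidity_from_transmission(i_received, i_reference, path_length):
    """Estimate an attenuation coefficient (a turbidity proxy, in 1/m)
    from the ratio of received to reference optical power using the
    Beer-Lambert law: I = I0 * exp(-tau * L)."""
    return -math.log(i_received / i_reference) / path_length

# Clear vs. dyed water over a 5 cm optical path (illustrative numbers):
print(round(turbidity_from_transmission(0.95, 1.00, 0.05), 2))  # ~1.03
print(round(turbidity_from_transmission(0.60, 1.00, 0.05), 2))  # ~10.22
```

Using the ratio of received to reference power cancels source-intensity drift, which is one reason ratiometric optical readouts are preferred in fibre sensing.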
Abstract:
Current measures used in global gene expression analysis, such as correlation- and mutual information-based approaches, depend largely on the degree of association between mRNA levels and, to a lesser extent, on variability. I develop and implement a new approach, called the Ratiometric method, which is based on the coefficient of variation of the expression ratio of two genes and relies more on variation than previous methods. The advantage of this approach is the ability to detect possible gene-pair interactions regardless of the degree of expression dispersion across the sample group. Gene pairs with low expression dispersion, i.e., whose absolute expression remains constant across the sample group, are systematically missed by correlation and mutual information analyses. The superiority of the Ratiometric method in finding these gene-pair interactions is demonstrated on a data set of RNA-seq B-cell samples from the 1000 Genomes Project Consortium. The Ratiometric method yields a more comprehensive recovery of KEGG pathways and GO terms.
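The statistic at the heart of the method, the coefficient of variation (CV) of the per-sample expression ratio of two genes, can be sketched in a few lines. The data below are synthetic and only meant to show the statistic's behavior: a coupled pair keeps a near-constant ratio and so gets a low CV, while an unrelated pair with the same overall dispersion does not:

```python
import numpy as np

def ratiometric_cv(a, b):
    """Coefficient of variation of the per-sample expression ratio of
    two genes; a low value suggests the pair maintains a stable ratio
    across samples (the core statistic of the Ratiometric method)."""
    r = np.asarray(a, dtype=float) / np.asarray(b, dtype=float)
    return r.std() / r.mean()

rng = np.random.default_rng(0)
base = rng.normal(100.0, 2.0, 500)                  # low absolute dispersion
coupled = 0.5 * base + rng.normal(0.0, 0.5, 500)    # tracks `base` at ratio ~2
unrelated = rng.normal(50.0, 2.0, 500)              # same mean, no coupling
print(ratiometric_cv(base, coupled) < ratiometric_cv(base, unrelated))  # True
```

Ranking all gene pairs by this CV and thresholding the lowest values would recover candidate interactions even when each gene's absolute expression barely varies, which is the regime the abstract says correlation and mutual information miss.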