803 results for preference-based measures
Abstract:
An assumption of theory-based physical activity interventions is that active participation positively affects the theoretical constructs upon which the intervention is based. This assumption is rarely tested. This study assessed whether participation, defined as completion of homework, in a lifestyle physical activity intervention was associated with changes over 6 months in the constructs the homework addressed: the behavioral and cognitive processes of change, self-efficacy, and decisional balance (the pros and cons). Participants were 244 sedentary adults aged 25 to 75 years. They completed an average of 12 of 20 homework assignments. Those completing at least two-thirds of the homework (n = 113) had greater changes in the theoretical constructs from pretest to posttest than those completing less (n = 90). Post-hoc analyses suggest that completing theory-based homework may affect the processes of change and self-efficacy in lifestyle physical activity interventions; such homework assignments are therefore warranted in future interventions.
Abstract:
Performance prediction models for partial-face mechanical excavators, when developed under laboratory conditions, depend on relating the results of a set of rock property tests and indices to the specific cutting energy (SE) for various rock types. Some studies in the literature aim to correlate the geotechnical properties of intact rocks with the SE, especially for massive and widely jointed rock environments. However, studies including direct and/or indirect measures of rock fracture parameters such as rock brittleness and fracture toughness, along with other rock parameters expressing different aspects of rock behavior under drag tools (picks), are rather limited. This study aimed to investigate the relationships between indirect measures of rock brittleness and fracture toughness and the SE, based on the results of one new and two previous linear rock cutting programmes. Relationships between the SE, rock strength parameters, and rock index tests were also investigated. Sandstone samples taken from different fields around Ankara, Turkey were used in the new testing programme. Detailed mineralogical analyses, petrographic studies, and rock mechanics and rock cutting tests were performed on the selected sandstone specimens. The assessment of rock cuttability was based on the SE. Three different brittleness indices (B1, B2, and B4) were calculated for the sandstone samples, whereas a toughness index (T_i) developed by Atkinson et al. (1) was employed to represent indirect rock fracture toughness. The relationships between the SE and the large amount of new data obtained from the mineralogical analyses, petrographic studies, rock mechanics, and linear rock cutting tests were evaluated using bivariate correlation and curve-fitting techniques, variance analysis, and Student's t-test. Rock cutting and rock property testing data from the well-known studies of McFeat-Smith and Fowell (2) and Roxborough and Philips (3) were also employed in the statistical analyses together with the new data. Laboratory tests and subsequent analyses revealed close correlations between the SE and B4, whereas no statistically significant correlation was found between the SE and T_i. Uniaxial compressive strength, Brazilian tensile strength, and Shore scleroscope hardness of the sandstones also exhibited strong relationships with the SE. The NCB cone indenter test had the greatest influence on the SE among the engineering properties of rocks examined, confirming previous studies in rock cutting and mechanical excavation. It was therefore recommended that, in the absence of a rock cutting rig, the easy-to-use NCB cone indenter and Shore scleroscope index tests be employed to estimate the laboratory SE of sandstones ranging from very low to high strength, until easy-to-use universal measures of rock brittleness and especially rock fracture toughness, an intrinsic rock property, are developed.
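As a rough, hedged illustration of the bivariate correlation and curve-fitting step described in this abstract, the Python sketch below correlates specific energy with a brittleness index; the numerical values and variable names are hypothetical placeholders, not data or code from the study.

```python
# Sketch of a bivariate correlation / curve-fitting step of the kind described above.
# The arrays below are hypothetical placeholders, not data from the study.
import numpy as np
from scipy import stats

specific_energy = np.array([8.2, 11.5, 14.1, 19.8, 25.3, 31.0])   # SE (illustrative units)
brittleness_b4 = np.array([10.4, 14.2, 17.9, 24.5, 30.1, 36.8])   # B4 index (illustrative)

# Bivariate (Pearson) correlation between the brittleness index and SE
r, p_value = stats.pearsonr(brittleness_b4, specific_energy)

# Simple linear curve fit; the p-value comes from a t-test on the slope
slope, intercept, r_lin, p_lin, stderr = stats.linregress(brittleness_b4, specific_energy)

print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")
print(f"SE estimate = {slope:.2f} * B4 + {intercept:.2f}")
```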
Abstract:
This trial of a cognitive-behavioural therapy (CBT) based amphetamine abstinence program (n = 507) focused on refusal self-efficacy, improved coping, improved problem solving, and planning for relapse prevention. Measures included the Severity of Dependence Scale (SDS), the General Health Questionnaire-28 (GHQ-28), and Amphetamine Refusal Self-Efficacy. Psychiatric case identification (caseness) across the four GHQ-28 sub-scales was compared with Australian normative data. Almost 90% of participants were amphetamine-dependent (SDS 8.15 ± 3.17). Pretreatment, all GHQ-28 sub-scale measures were below reported Australian population values. Caseness was substantially higher than Australian normative values: Somatic Symptoms (52.3%), Anxiety (68%), Social Dysfunction (46.5%), and Depression (33.7%). One hundred and sixty-eight subjects (33%) completed the program and reported abstinence. Program completers reported improvement across all GHQ-28 sub-scales: Somatic Symptoms (p < 0.001), Anxiety (p < 0.001), Social Dysfunction (p < 0.001), and Depression (p < 0.001). They also reported improvement in amphetamine refusal self-efficacy (p < 0.001). Improvement remained significant following intention-to-treat analyses, imputing baseline data for subjects who withdrew from the program. The GHQ-28 sub-scales, the Amphetamine Refusal Self-Efficacy Questionnaire, and the SDS successfully predicted treatment compliance through a discriminant analysis function (p
Abstract:
Domain-specific information retrieval is increasingly in demand. Not only domain experts but also average, non-expert users are interested in searching for domain-specific (e.g., medical and health) information in online resources. However, a typical problem for average users is that the search results are always a mixture of documents with different levels of readability. Non-expert users may want to see documents with higher readability at the top of the list; consequently, the search results need to be re-ranked in descending order of readability. It is often impractical for domain experts to manually label the readability of documents in large databases, so computational models of readability need to be investigated. However, traditional readability formulas are designed for general-purpose text and are insufficient for the technical material involved in domain-specific information retrieval, while more advanced approaches such as textual coherence models are computationally expensive for re-ranking a large number of retrieved documents. In this paper, we propose an effective and computationally tractable concept-based model of text readability. In addition to the textual genres of a document, our model takes into account domain-specific knowledge, i.e., how the domain-specific concepts contained in the document affect its readability. Three major readability formulas are proposed and applied to health and medical information retrieval. Experimental results show that our proposed readability formulas lead to remarkable improvements, in terms of correlation with users' readability ratings, over four traditional readability measures.
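To make the re-ranking idea concrete, here is a minimal Python sketch that sorts retrieved documents by a crude, hypothetical concept-density readability score; the scoring heuristic, concept lexicon and Document structure are illustrative assumptions, not the formulas proposed in the paper.

```python
# Hypothetical concept-based readability scoring and re-ranking sketch.
# The scoring heuristic below is illustrative; it is not the paper's formula.
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    text: str

# Toy domain-concept lexicon; in practice this would come from a
# domain-specific knowledge source such as a medical thesaurus.
DOMAIN_CONCEPTS = {"myocardial", "infarction", "hypertension", "stenosis"}

def readability_score(doc: Document) -> float:
    """Higher score = easier to read. Penalise the density of domain concepts
    and long sentences (a crude stand-in for traditional formula features)."""
    words = doc.text.lower().split()
    if not words:
        return 0.0
    concept_density = sum(w.strip(".,") in DOMAIN_CONCEPTS for w in words) / len(words)
    avg_sentence_len = len(words) / max(doc.text.count("."), 1)
    return 1.0 / (1.0 + concept_density * 10 + avg_sentence_len / 20)

def rerank_by_readability(retrieved: list[Document]) -> list[Document]:
    # Descending readability: easiest documents first for non-expert users.
    return sorted(retrieved, key=readability_score, reverse=True)
```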
Abstract:
This paper derives the performance union bound of space-time trellis codes in orthogonal frequency division multiplexing system (STTC-OFDM) over quasi-static frequency selective fading channels based on the distance spectrum technique. The distance spectrum is the enumeration of the codeword difference measures and their multiplicities by exhausted searching through all the possible error event paths. Exhaustive search approach can be used for low memory order STTC with small frame size. However with moderate memory order STTC and moderate frame size the computational cost of exhaustive search increases exponentially, and may become impractical for high memory order STTCs. This requires advanced computational techniques such as Genetic Algorithms (GAS). In this paper, a GA with sharing function method is used to locate the multiple solutions of the distance spectrum for high memory order STTCs. Simulation evaluates the performance union bound and the complexity comparison of non-GA aided and GA aided distance spectrum techniques. It shows that the union bound give a close performance measure at high signal-to-noise ratio (SNR). It also shows that GA sharing function method based distance spectrum technique requires much less computational time as compared with exhaustive search approach but with satisfactory accuracy.
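The sharing-function idea referred to above can be sketched generically in Python as follows; this is a standard fitness-sharing (niching) step under assumed real-valued genomes, not the authors' STTC-specific error-event search, and the distance measure and raw fitness are placeholders.

```python
# Generic fitness-sharing sketch; the genome encoding, distance measure and
# raw fitness are placeholders rather than the STTC error-event search used
# in the paper.
import numpy as np

def shared_fitness(population: np.ndarray,
                   raw_fitness: np.ndarray,
                   sigma_share: float = 1.0,
                   alpha: float = 1.0) -> np.ndarray:
    """Divide each individual's raw fitness by its niche count so that crowded
    regions of the search space are penalised, letting the GA retain multiple
    distinct solutions (e.g. several distance-spectrum terms)."""
    n = len(population)
    shared = np.empty(n)
    for i in range(n):
        # Niche count: sum of sharing-function values against all individuals
        d = np.linalg.norm(population - population[i], axis=1)
        sh = np.where(d < sigma_share, 1.0 - (d / sigma_share) ** alpha, 0.0)
        shared[i] = raw_fitness[i] / sh.sum()
    return shared
```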
Abstract:
Neural networks are statistical models and learning rules are estimators. In this paper a theory for measuring generalisation is developed by combining Bayesian decision theory with information geometry. The performance of an estimator is measured by the information divergence between the true distribution and the estimate, averaged over the Bayesian posterior. This unifies the majority of error measures currently in use. The optimal estimators also reveal some intricate interrelationships among information geometry, Banach spaces and sufficient statistics.
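In symbols, the performance measure described here can be written as a posterior-averaged information divergence; the notation below is ours and only a sketch of the idea, not necessarily the paper's exact formulation.

```latex
% Posterior-averaged information divergence (illustrative notation):
% p_\theta is the true distribution, \hat{p} the estimate built from data x.
\operatorname{Perf}(\hat{p} \mid x)
  = \int D\bigl(p_{\theta} \,\big\|\, \hat{p}(\cdot \mid x)\bigr)\, \pi(\theta \mid x)\, d\theta ,
\qquad
D(p \,\|\, q) = \int p(y)\, \log \frac{p(y)}{q(y)}\, dy .
```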
Abstract:
The concept of entropy rate is well defined in dynamical systems theory, but it is impossible to apply it directly to finite real-world data sets. With this in mind, Pincus developed Approximate Entropy (ApEn), which uses ideas from Eckmann and Ruelle to create a regularity measure based on entropy rate that can be used to determine the influence of chaotic behaviour in a real-world signal. However, this measure was found not to be robust, and so an improved formulation known as Sample Entropy (SampEn) was created by Richman and Moorman to address these issues. We have developed a new, related regularity measure which is not based on the theory provided by Eckmann and Ruelle and which proves to be a better-behaved measure of complexity than the previous measures, whilst still retaining a low computational cost.
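For reference, a minimal Python sketch of the Sample Entropy (SampEn) measure mentioned above (after Richman and Moorman) is given below; the parameter defaults follow common convention and are illustrative, and this is not the new measure developed in the work itself.

```python
# Minimal Sample Entropy (SampEn) sketch; m=2 and r=0.2*std are common
# conventions, used here for illustration only.
import numpy as np

def sample_entropy(signal: np.ndarray, m: int = 2, r_factor: float = 0.2) -> float:
    """SampEn(m, r, N): negative log of the conditional probability that
    sequences similar for m points remain similar at m+1 points
    (self-matches excluded)."""
    x = np.asarray(signal, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)

    def count_matches(template_len: int) -> int:
        # Build all N-m overlapping templates of the given length
        templates = np.array([x[i:i + template_len] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to every later template (no self-matches)
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)      # matches of length m
    a = count_matches(m + 1)  # matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```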
Abstract:
We propose a new mathematical model for efficiency analysis, which combines DEA methodology with an old idea: ratio analysis. Our model, called DEA-R, treats all possible "output/input" ratios as outputs within the standard DEA model. Although DEA and DEA-R generate different summary measures of efficiency, the two measures are comparable. Our mathematical and empirical comparisons establish the validity of the DEA-R model in its own right. The key advantage of DEA-R over DEA is that it allows effective integration of the model with experts' opinions via flexible restrictive conditions on individual "output/input" pairs.
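A minimal sketch of how a DEA-R score might be computed, assuming the unit-input interpretation in which every "output/input" ratio is treated as an output of a standard DEA multiplier model; the data, function name and solver choice are illustrative assumptions, not the authors' implementation.

```python
# DEA-R sketch under an assumed unit-input CCR-style multiplier model.
# DMU data below are hypothetical placeholders.
import numpy as np
from scipy.optimize import linprog

def dea_r_efficiency(outputs: np.ndarray, inputs: np.ndarray) -> np.ndarray:
    """outputs: (n_dmu, n_out), inputs: (n_dmu, n_in).
    Builds the matrix of all output/input ratios and scores each DMU by
    maximising the weighted sum of its ratios, subject to every DMU's
    weighted ratio sum being at most 1 (weights non-negative)."""
    n_dmu = outputs.shape[0]
    # All pairwise ratios output_r / input_i, flattened per DMU
    ratios = (outputs[:, :, None] / inputs[:, None, :]).reshape(n_dmu, -1)
    scores = np.empty(n_dmu)
    for o in range(n_dmu):
        # linprog minimises, so negate the objective to maximise
        res = linprog(c=-ratios[o],
                      A_ub=ratios, b_ub=np.ones(n_dmu),
                      bounds=[(0, None)] * ratios.shape[1],
                      method="highs")
        scores[o] = -res.fun
    return scores

# Illustrative data: 4 DMUs, 2 outputs, 2 inputs (values are made up)
outputs = np.array([[20, 15], [30, 10], [25, 25], [18, 12]], dtype=float)
inputs = np.array([[10, 5], [12, 6], [15, 9], [8, 4]], dtype=float)
print(dea_r_efficiency(outputs, inputs))
```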
Abstract:
People and their performance are key to an organization's effectiveness. This review describes an evidence-based framework of the links between some key organizational influences and staff performance, health and well-being. This preliminary framework integrates management and psychological approaches, with the aim of assisting future explanation, prediction and organizational change. Health care is taken as the focus of this review, as there are concerns internationally about health care effectiveness. The framework considers empirical evidence for links between the following organizational levels: (1) context (organizational culture and inter-group relations; resources, including staffing; physical environment); (2) people management (HRM practices and strategies; job design, workload and teamwork; employee involvement and control over work; leadership and support); (3) psychological consequences for employees (health and stress; satisfaction and commitment; knowledge, skills and motivation); (4) employee behaviour (absenteeism and turnover; task and contextual performance; errors and near misses); (5) organizational performance and patient care. This review contributes to an evidence base for policies and practices of people management and performance management. Its usefulness will depend on future empirical research, using appropriate research designs, sufficient study power, and measures that are reliable and valid.
Abstract:
In industrialised countries, age-related macular disease (ARMD) is the leading cause of visual loss in older people. Because oxidative stress is purported to be associated with an increased risk of disease development, the role of antioxidant supplementation is of interest. Lutein is a carotenoid antioxidant that accumulates within the retina and is thought to filter blue light. Increased levels of lutein have been associated with a reduced risk of developing ARMD and with improvements in visual and retinal function in eyes with ARMD. The aim of this randomised controlled trial (RCT) was to investigate the effect of a lutein-based nutritional supplement on subjective and objective measures of visual function in healthy eyes and in eyes with age-related maculopathy (ARM), an early form of ARMD. Supplement withdrawal effects were also investigated. A sample of 66 healthy older (HO), healthy younger (HY), and ARM eyes was randomly allocated to receive a lutein-based supplement or no treatment for 40 weeks. The supplemented group then stopped supplementation so that the effects of withdrawal could be examined over a further 20 weeks. The primary outcome measure was multifocal electroretinogram (mfERG) N1P1 amplitude. Secondary outcome measures were mfERG N1, P1 and N2 latency, contrast sensitivity (CS), visual acuity (VA) and macular pigment optical density (MPOD). Sample sizes were sufficient for the RCT to have 80% power to detect a clinically significant effect at the 5% significance level for all outcome measures when the healthy eye groups were combined, and for CS, VA and mfERG in the ARM group. This RCT demonstrates significant improvements in MPOD in HY and HO supplemented eyes. When the HY and HO supplemented groups were combined, the MPOD improvements were maintained, and mfERG ring 2 P1 latency became shorter. On withdrawal of the supplement, mfERG ring 1 N1P1 amplitude was reduced in HO eyes; when the HO and HY groups were combined, mfERG ring 1 and ring 2 N1P1 amplitudes were reduced. In ARM eyes, ring 3 N2 latency and ring 4 P1 latency became longer. These statistically significant changes may not be clinically significant. The finding that a lutein-based supplement increases MPOD in healthy eyes but does not increase mfERG amplitudes contrasts with the CARMIS study and contributes to the debate on the use of nutritional supplementation in ARM.
Abstract:
Creative activities, including the arts, are characteristic of humankind. Our understanding of creativity is limited, yet there is substantial research trying to mimic human creativity in artificial systems and, in particular, to produce systems that automatically evolve art appreciated by humans. We propose here to model human visual preference by a set of aesthetic measures identified through observation of human selection of images, and then to use these measures for the automatic evolution of aesthetic images.
Abstract:
The topic of this thesis is the development of knowledge-based statistical software. The shortcomings of conventional statistical packages are discussed to illustrate the need to develop software that can exhibit a greater degree of statistical expertise, thereby reducing the misuse of statistical methods by those not well versed in the art of statistical analysis. Some of the issues involved in the development of knowledge-based software are presented, and a review is given of some of the systems that have been developed so far. The majority of these have moved away from conventional architectures by adopting what can be termed an expert-systems approach. The thesis then proposes an approach based upon the concept of semantic modelling. By representing some of the semantic meaning of data, it is conceived that a system could examine a request to apply a statistical technique and check whether the use of the chosen technique is semantically sound, i.e., whether the results obtained will be meaningful. Current systems, in contrast, can only perform what can be considered syntactic checks. The prototype system implemented to explore the feasibility of such an approach is presented; it has been designed as an enhanced variant of a conventional-style statistical package. This involved developing a semantic data model to represent some of the statistically relevant knowledge about data and identifying sets of requirements that should be met for the application of the statistical techniques to be valid. The areas of statistics covered in the prototype are measures of association and tests of location.
Abstract:
The field of free radical biology and medicine continues to move at a tremendous pace, with a constant flow of ground-breaking discoveries. The following collection of papers in this issue of Biochemical Society Transactions highlights several key areas of topical interest, including the crucial role of validated measurements of radicals and reactive oxygen species in underpinning nearly all research in the field, the important advances being made as a result of the overlap of free radical research with the reinvigorated field of lipidomics (driven in part by innovations in MS-based analysis), the acceleration of new insights into the role of oxidative protein modifications (particularly to cysteine residues) in modulating cell signalling, and the effects of free radicals on the functions of mitochondria, extracellular matrix and the immune system. In the present article, we provide a brief overview of these research areas, but, throughout this discussion, it must be remembered that it is the availability of reliable analytical methodologies that will be a key factor in facilitating continuing developments in this exciting research area.
Abstract:
Our understanding of creativity is limited, yet there is substantial research trying to mimic human creativity in artificial systems and in particular to produce systems that automatically evolve art appreciated by humans. We propose here to study human visual preference through observation of nearly 500 user sessions with a simple evolutionary art system. The progress of a set of aesthetic measures throughout each interactive user session is monitored and subsequently mimicked by automatic evolution in an attempt to produce an image to the liking of the human user.
Abstract:
Although there is a large body of research on brand equity, little in the way of a literature review has been published on the topic since Feldwick's (1996) paper. To address this gap, this paper brings together the scattered literature on the conceptualisation and measurement of consumer-based brand equity. Measures of consumer-based brand equity are classified as either direct or indirect. Indirect measures assess consumer-based brand equity through its demonstrable dimensions and are superior from a diagnostic perspective. The paper concludes with directions for future research and managerial pointers for setting up a brand equity measurement system.