820 results for Reliability and safeties
Abstract:
Purpose: Health service quality is an important determinant for health service satisfaction and behavioral intentions. The purpose of this paper is to investigate requirements of e‐health services and to develop a measurement model to analyze the construct of “perceived e‐health service quality.” Design/methodology/approach: The paper adapts the C‐OAR‐SE procedure for scale development by Rossiter. The focal aspect is the “physician‐patient relationship” which forms the core dyad in the healthcare service provision. Several in‐depth interviews were conducted in Switzerland; first with six patients (as raters), followed by two experts of the healthcare system (as judges). Based on the results and an extensive literature research, the classification of object and attributes is developed for this model. Findings: The construct e‐health service quality can be described as an abstract formative object and is operationalized with 13 items: accessibility, competence, information, usability/user friendliness, security, system integration, trust, individualization, empathy, ethical conduct, degree of performance, reliability, and ability to respond. Research limitations/implications: Limitations include the number of interviews with patients and experts as well as critical issues associated with C‐OAR‐SE. More empirical research is needed to confirm the quality indicators of e‐health services. Practical implications: Health care providers can utilize the results for the evaluation of their service quality. Practitioners can use the hierarchical structure to measure service quality at different levels. The model provides a diagnostic tool to identify poor and/or excellent performance with regard to the e‐service delivery. Originality/value: The paper contributes to knowledge with regard to the measurement of e‐health quality and improves the understanding of how customers evaluate the quality of e‐health services.
Abstract:
Background: Adherence to hypertension management in patients with hypertension is known to influence their blood pressure control. It is important to measure patients’ adherence behaviours to assist with designing appropriate interventions to improve blood pressure control. Aims: The purposes of this study were to use confirmatory factor analysis to revalidate the Therapeutic Adherence Scale for Hypertensive Patients (TASHP), and to calculate the cut-off score for classifying adherence behaviours into two groups: satisfactory and low adherence behaviours. Methods: Systematic random sampling was used to recruit patients with hypertension in China. Demographic characteristics, the TASHP and blood pressure were collected. The psychometric tests of the TASHP included: construct validity, criterion-related validity, internal reliability, and split-half reliability. The area under the receiver operating characteristics curve and Youden index were used to identify the cut-off score of the TASHP for blood pressure control. Results: This study involved 366 patients. Confirmatory factor analysis supported the four-component structure of the TASHP proposed in the original scale development study. The TASHP has satisfactory internal reliability (Cronbach’s α > 0.7) and satisfactory split-half reliability (Spearman–Brown coefficients > 0.7). Patients with overall TASHP scores ⩾ 109 points were considered to have satisfactory adherence behaviours. Conclusion: The TASHP is a valid and reliable instrument to measure adherence to hypertension management in Chinese patients with hypertension. The cut-off score of 109 points can be considered an effective measure to classify the level of adherence into satisfactory and low adherence behaviours.
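The cut-off derivation described in the abstract above (area under the ROC curve plus the Youden index) can be sketched as follows. This is a minimal illustration of the general technique; the scores, group means, and group sizes below are hypothetical, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
# Hypothetical adherence-scale totals: patients with controlled blood
# pressure are assumed to score higher on average (illustrative numbers).
controlled = rng.normal(115, 10, 200)    # blood pressure controlled
uncontrolled = rng.normal(100, 10, 166)  # blood pressure not controlled
scores = np.concatenate([controlled, uncontrolled])
labels = np.concatenate([np.ones(200), np.zeros(166)])

fpr, tpr, thresholds = roc_curve(labels, scores)
youden = tpr - fpr                      # Youden index J = sensitivity + specificity - 1
cutoff = thresholds[np.argmax(youden)]  # score that maximizes J
```

The score at which J peaks is taken as the cut-off separating satisfactory from low adherence, which is the same logic that yields the 109-point threshold in the study.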
Abstract:
This study of 1,777 male and female adolescent students aged 11-19 years in the Colombian Caribbean had two objectives: to develop and validate two reproductive health intention scales and to analyze gender differences. The pilot scale consisted of 8 items and was reduced to 6. To check reliability and validity, factor analysis by principal components with VARIMAX rotation yielded two factors, Protection Intention and Risk Intention, which explained 69.4% and 70% of the variance respectively. Among males, Protection Intention (M = 3.87, SD = 1.29) and Risk Intention (M = 2.56, SD = 1.18) obtained alphas between 0.74 and 0.86; among females, Protection Intention (M = 3.49, SD = 1.35) and Risk Intention (M = 1.50, SD = 0.89) ranged between 0.78 and 0.86. In conclusion, the reliability and structural stability are adequate and there are gender differences in the scales.
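The factor-extraction-with-varimax step described above can be sketched with scikit-learn's FactorAnalysis (an analogous latent-factor method with varimax rotation; the two latent dimensions, item count, and loadings below are synthetic illustrations, not the study's data).

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n = 300
# Hypothetical 6-item scale driven by two latent factors
# (a "protection intention" and a "risk intention" dimension).
protection = rng.normal(size=n)
risk = rng.normal(size=n)
noise = lambda: 0.3 * rng.normal(size=n)
items = np.column_stack([
    protection + noise(), protection + noise(), protection + noise(),
    risk + noise(), risk + noise(), risk + noise(),
])

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
loadings = fa.components_.T  # 6 items x 2 rotated factors
```

With clean two-factor data like this, varimax rotation produces the "simple structure" the abstract reports: each item loads strongly on exactly one of the two factors.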
Abstract:
Distraction in the workplace is increasingly common in the information age. Several tasks and sources of information compete for a worker's limited cognitive capacities in human-computer interaction (HCI). In some situations, even very brief interruptions can have detrimental effects on memory. Nevertheless, in other situations where persons are continuously interrupted, virtually no interruption costs emerge. This dissertation attempts to reveal the mental conditions and causalities differentiating the two outcomes. The explanation, building on the theory of long-term working memory (LTWM; Ericsson and Kintsch, 1995), focuses on the active, skillful aspects of human cognition that enable the storage of task information beyond the temporary and unstable storage provided by short-term working memory (STWM). Its key postulate is called a retrieval structure: an abstract, hierarchical knowledge representation built into long-term memory that can be utilized to encode, update, and retrieve products of cognitive processes carried out during skilled task performance. If certain criteria of practice and task processing are met, LTWM allows for the storage of large representations for long time periods, yet these representations can be accessed with the accuracy, reliability, and speed typical of STWM. The main thesis of the dissertation is that the ability to endure interruptions depends on the efficiency with which LTWM can be recruited for maintaining information. An observational study and a field experiment provide ecological evidence for this thesis. Mobile users were found to be able to carry out heavy interleaving and sequencing of tasks while interacting, and they exhibited several intricate time-sharing strategies to orchestrate interruptions in a way sensitive to both external and internal demands. Interruptions are inevitable, because they arise as natural consequences of the top-down and bottom-up control of multitasking.
In this process the function of LTWM is to keep some representations ready for reactivation and others in a more passive state to prevent interference. The psychological reality of the main thesis received confirmatory evidence in a series of laboratory experiments. They indicate that after encoding into LTWM, task representations are safeguarded from interruptions, regardless of their intensity, complexity, or pacing. However, when LTWM cannot be deployed, the problems posed by interference in long-term memory and the limited capacity of the STWM surface. A major contribution of the dissertation is the analysis of when users must resort to poorer maintenance strategies, like temporal cues and STWM-based rehearsal. First, one experiment showed that task orientations can be associated with radically different patterns of retrieval cue encodings. Thus the nature of the processing of the interface determines which features will be available as retrieval cues and which must be maintained by other means. In another study it was demonstrated that if the speed of encoding into LTWM, a skill-dependent parameter, is slower than the processing speed allowed for by the task, interruption costs emerge. Contrary to the predictions of competing theories, these costs turned out to involve intrusions in addition to omissions. Finally, it was learned that in rapid visually oriented interaction, perceptual-procedural expectations guide task resumption, and neither STWM nor LTWM is utilized because access is too slow. These findings imply a change in thinking about the design of interfaces. Several novel principles of design are presented, based on the idea of supporting the deployment of LTWM in the main task.
Abstract:
Mastery motivation is an important developmental construct that has implications for development across the lifespan. Research to date has focused predominantly on infants and children, with the Dimensions of Mastery Questionnaire (DMQ) being the most widely used measure of mastery motivation. This paper reports on the development and initial validation of an adult measure: the Dimensions of Adult Mastery Motivation Questionnaire (DAMMQ). Six hundred and twenty-eight adults (68 % female) aged 18 to 90 years completed the questionnaire. Factor analysis produced 24 items that represented five factors: task persistence, preference for challenge, task related pleasure, task absorption and competence/self-efficacy. The DAMMQ was found to have good internal consistency, test-retest reliability and concurrent validity. Within group differences for age, gender and education are reported. The development of the DAMMQ paves the way for future research about mastery motivation in adult populations.
Abstract:
The main goal of this study was to explore experiences induced by playing digital games (i.e. meaning of playing). In addition, the study aimed at structuring the larger entities of gaming experience. This was done by using theory-driven and data grounded approaches. Previously gaming experiences have not been explored as a whole. The consideration of gaming experiences on the basis of psychological theories and studies has also been rare. The secondary goal of this study was to clarify, whether the individual meanings of playing are connected with flow experience in an occasional gaming situation. Flow is an enjoyable experience and usually activities that induce flow are gladly repeated. Previously, flow has been proved to be an essential concept in the context of playing, but the relations between meanings of playing and flow have not been studied. The relations between gender and gaming experiences were examined throughout the study, as well as the relationship between gaming frequency and experiences. The study was divided into two sections, of which the first was composed according to the main goals. Its data was gathered by using an Internet questionnaire. The other section covered the themes that were formulated on the basis of the secondary aims. In that section, the participants played a driving game for 40 minutes and then filled in a questionnaire, which measured flow related experiences. In both sections, the participants were mainly young Finnish adults. All the participants in the second section (n = 60) had already participated in the first section (n = 267). Both qualitative and quantitative research techniques were used in the study. In the first section, freely described gaming experiences were classified according to the grounded theory. After that, the most common categories were further classified into the basic structures of gaming experience, some according to the existing theories of experience structure and some according to the data (i.e. 
grounded theory). In the second section, flow constructs were measured and used as grouping variables in a cluster analysis. Three meaningful groups were compared regarding the meanings of gaming that were explored in the first section. The descriptions of gaming experiences were classified into four main categories, which were conceptions of the gaming process, emotions, motivations and focused attention. All the theory-driven categories were found in the data. This frame of reference can be utilized in future when reliability and validity of already existing methods for measuring gaming experiences are considered or new methods are developed. The connection between the individual relevance of gaming and flow was minor. However, as the scope was narrowed to relations between primary meanings of playing and flow, it was noticed that attributing enjoyment to gaming did not lead to the strongest flow-experiences. This implies that the issue should be studied further in future. As a whole, this study shows that gamer-related research from numerous vantage points can benefit from concentrating on gaming experiences.
Abstract:
DNA evidence has made a significant contribution to criminal investigations in Australia and around the world since it was widely adopted in the 1990s (Gans & Urbas 2002). The direct matching of DNA profiles, such as comparing one obtained from a crime scene with one obtained from a suspect or database, remains a widely used technique in criminal investigations. A range of new DNA profiling techniques continues to be developed and applied in criminal investigations around the world (Smith & Urbas 2012). This paper is the third in a series by the Australian Institute of Criminology (AIC) on DNA evidence. The first, published in 1990 when the technology was in its relative infancy, outlined the scientific background for DNA evidence, considered early issues such as scientific reliability and privacy and described its application in early criminal cases (Easteal & Easteal 1990). The second, published in 2002, expanded on the scientific background and discussed a significant number of Australian cases in a 12-year period, illustrating issues that had arisen in investigations, at trial and in the use of DNA in the review of convictions and acquittals (Gans & Urbas 2002). There have been some significant developments in the science and technology behind DNA evidence in the 13 years since 2002 that have important implications for law enforcement and the legal system. These are discussed through a review of relevant legal cases and the latest empirical evidence. This paper is structured in three sections. The first examines the scientific techniques and how they have been applied in police investigations, drawing on a number of recent cases to illustrate them. The second considers empirical research evaluating DNA evidence and databases and the impact DNA has on investigative and court outcomes. 
The final section discusses significant cases that establish legal precedent relating to DNA evidence in criminal trials where significant issues have arisen or new techniques have been applied that have not yet been widely discussed in the literature. The paper concludes by reflecting on implications for policy and practice.
Abstract:
A recent editorial in International Research in Geographical and Environmental Education (IRGEE) (Stoltman, Lidstone & Kidman, 2014) highlighted an opportunity for the inclusion of geography as a subject in the Trends in International Mathematics and Science Study (TIMSS) tests. At present TIMSS tests only encompass mathematics and physical sciences. The IRGEE editors encouraged geography educators to take the initiative and be proactive for a TIMSS international assessment in geography to become a reality. This paper reports on a research project to identify the perceptions of the global geography education community on the advantages and challenges of initiating and implementing such tests. The authors highlight a number of consistencies and tensions revealed by the respondents as well as potential issues of validity, reliability and fairness of a geography assessment instrument. The implications of these findings for ongoing research are discussed.
Abstract:
Develop new, high yielding, disease resistant mungbean varieties that will increase crop reliability and grower confidence in the northern region.
Abstract:
The availability of a small fleet of aircraft in a flying-base, repair-depot combination is modeled and studied. First, a deterministic flow model relates parameters of interest and represents the state of the art in the planning of such systems. Second, a cyclic queue model shows the effect of the principal uncertainties in operation and repair and shows the consequent decrease in the availability of aircraft at the flying-base. Several options such as increasing fleet size, investments in additional repair facilities, or building reliability and maintainability into the individual aircraft during its life-cycle are open for increasing the availability. A life-cycle cost criterion brings out some of these features. Numerical results confirm Rose's prediction that there exists a minimal cost combination of end products and repair-depot capability to achieve a prescribed operational availability.
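The fleet-size versus repair-capacity trade-off that the cyclic queue model captures can be illustrated with the classic finite-source (machine-repair) queue. This is a hedged sketch of the general technique, not the paper's exact model; all parameter values are assumptions for illustration.

```python
from math import prod

def availability(N, c, lam, mu):
    """Steady-state fraction of a fleet of N aircraft that is operational,
    given c repair channels, per-aircraft failure rate lam, and per-channel
    repair rate mu (birth-death machine-repair model)."""
    # Unnormalized steady-state probability that n aircraft sit in repair:
    # p(n+1)/p(n) = (N - n) * lam / (min(n + 1, c) * mu)
    def weight(n):
        return prod((N - k) * lam / (min(k + 1, c) * mu) for k in range(n))
    weights = [weight(n) for n in range(N + 1)]
    total = sum(weights)
    expected_in_repair = sum(n * w for n, w in enumerate(weights)) / total
    return 1 - expected_in_repair / N
```

Evaluating the function shows the effects the abstract names: adding a repair channel raises availability, and very fast repair drives it toward one, so a cost criterion can weigh extra depot capacity against a larger fleet.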
Abstract:
Digital fluidic and pneumatic systems incorporate displays for the presentation of information to the operator. Displays reported so far for such systems use moving pistons, tapes, and other mechanisms, leading to lower reliability. This paper describes a nonmoving-part fluidic display employing the photoelastic effect. The display is pressure actuated and has a long life. When fabricated from compatible materials, this device can withstand hostile environments such as nuclear radiation and vibrations. The display is compact, economical, and virtually maintenance-free. The display unit has been tested in the laboratory for reliability and speed of response.
Abstract:
During the past ten years, large-scale transcript analysis using microarrays has become a powerful tool to identify and predict functions for new genes. It allows simultaneous monitoring of the expression of thousands of genes and has become a routinely used tool in laboratories worldwide. Microarray analysis will, together with other functional genomics tools, take us closer to understanding the functions of all genes in genomes of living organisms. Flower development is a genetically regulated process which has mostly been studied in the traditional model species Arabidopsis thaliana, Antirrhinum majus and Petunia hybrida. The molecular mechanisms behind flower development in them are partly applicable in other plant systems. However, not all biological phenomena can be approached with just a few model systems. In order to understand and apply the knowledge to ecologically and economically important plants, other species also need to be studied. Sequencing of 17 000 ESTs from nine different cDNA libraries of the ornamental plant Gerbera hybrida made it possible to construct a cDNA microarray with 9000 probes. The probes of the microarray represent all different ESTs in the database. From the gerbera ESTs 20% were unique to gerbera while 373 were specific to the Asteraceae family of flowering plants. Gerbera has composite inflorescences with three different types of flowers that vary from each other morphologically. The marginal ray flowers are large, often pigmented and female, while the central disc flowers are smaller and more radially symmetrical perfect flowers. Intermediate trans flowers are similar to ray flowers but smaller in size. This feature together with the molecular tools applied to gerbera, make gerbera a unique system in comparison to the common model plants with only a single kind of flowers in their inflorescence. 
In the first part of this thesis, conditions for gerbera microarray analysis were optimised, including experimental design, sample preparation and hybridization, as well as data analysis and verification. Moreover, in the first study, the flower and flower organ-specific genes were identified. After the reliability and reproducibility of the method were confirmed, the microarrays were utilized to investigate transcriptional differences between ray and disc flowers. This study revealed novel information about the morphological development as well as the transcriptional regulation of early stages of development in various flower types of gerbera. The most interesting finding was differential expression of MADS-box genes, suggesting the existence of flower type-specific regulatory complexes in the specification of different types of flowers. The gerbera microarray was further used to profile changes in expression during petal development. Gerbera ray flower petals are large, which makes them an ideal model to study organogenesis. Six different stages were compared and specifically analysed. Expression profiles of genes related to cell structure and growth implied that during stage 2, cells divide, a process which is marked by expression of histones, cyclins and tubulins. Stage 4 was found to be a transition stage between cell division and expansion, and by stage 6 cells had stopped division and instead underwent expansion. Interestingly, at the last analysed stage, stage 9, when cells did not grow any more, the highest number of upregulated genes was detected. The gerbera microarray is a fully functioning tool for large-scale studies of flower development, and correlation with real-time RT-PCR results shows that it is also highly sensitive and reliable. Gene expression data presented here will be a source for gene expression mining or marker gene discovery in future studies performed in the Gerbera Laboratory.
The publicly available data will also serve the plant research community world-wide.
Abstract:
The prevalence and assessment of neuroleptic-induced movement disorders (NIMDs) in a naturalistic schizophrenia population that uses conventional neuroleptics were studied. We recruited 99 chronic schizophrenic institutionalized adult patients from a state nursing home in central Estonia. The total prevalence of NIMDs according to the diagnostic criteria of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) was 61.6%, and 22.2% had more than one NIMD. We explored the reliability and validity of different instruments for measuring these disorders. First, we compared DSM-IV with the established observer rating scales of Barnes Akathisia Rating Scale (BARS), Simpson-Angus Scale (SAS) (for neuroleptic-induced parkinsonism, NIP) and Abnormal Involuntary Movement Scale (AIMS) (for tardive dyskinesia), all three of which have been used for diagnosing NIMD. We found a good overlap of cases for neuroleptic-induced akathisia (NIA) and tardive dyskinesia (TD) but somewhat poorer overlap for NIP, for which we suggest raising the commonly used threshold value of 0.3 to 0.65. Second, we compared the established observer rating scales with an objective motor measurement, namely controlled rest lower limb activity measured by actometry. Actometry supported the validity of BARS and SAS, but it could not be used alone in this naturalistic population with several co-existing NIMDs. It could not differentiate the disorders from each other. Quantitative actometry may be useful in measuring changes in NIA and NIP severity, in situations where the diagnosis has been made using another method. Third, after the relative failure of quantitative actometry to show diagnostic power in a naturalistic population, we explored descriptive ways of analysing actometric data, and demonstrated diagnostic power for pooled NIA and pseudoakathisia (PsA) in our population.
A subjective question concerning movement problems was able to discriminate NIA patients from all other subjects. Answers to this question were not selective for other NIMDs. Chronic schizophrenia populations are common worldwide, and NIMDs affected two-thirds of our study population. Prevention, diagnosis and treatment of NIMDs warrant more attention, especially in countries where typical antipsychotics are frequently used. Our study supported the validity and reliability of DSM-IV diagnostic criteria for NIMD in comparison with established rating scales and actometry. SAS can be used with minor modifications for screening purposes. Controlled rest lower limb actometry was not diagnostically specific in our naturalistic population with several co-morbid NIMDs, but it may be sensitive in measuring changes in NIMDs.
Abstract:
Biological systems present remarkable adaptation, reliability, and robustness in various environments, even in hostile conditions. Most of them are controlled by the individuals in a distributed and self-organized way. These biological mechanisms provide useful resources for designing the dynamical and adaptive routing schemes of wireless mobile sensor networks, in which the individual nodes should ideally operate without central control. This paper investigates crucial biologically inspired mechanisms and the associated techniques for resolving routing in wireless sensor networks, including ant-based and genetic approaches. Furthermore, the principal contributions of this paper are as follows. We present a mathematical theory of the biological computations in the context of sensor networks; we further present a generalized routing framework in sensor networks by diffusing different modes of biological computations using ant-based and genetic approaches; finally, an overview of several emerging research directions is presented within the new biological computation framework.
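Ant-based routing of the kind surveyed above rests on two core operations: probabilistic next-hop selection proportional to pheromone levels, and evaporation plus reinforcement of links on successful paths. A minimal sketch of those two operations follows; the function names, parameters, and update rule are illustrative assumptions, not the paper's specific scheme.

```python
import random

def choose_next_hop(pheromone, neighbors, rng=random):
    """Pick a neighbor with probability proportional to its pheromone level
    (the probabilistic forwarding rule at the heart of ant-based routing)."""
    total = sum(pheromone[n] for n in neighbors)
    r = rng.uniform(0, total)
    acc = 0.0
    for n in neighbors:
        acc += pheromone[n]
        if acc >= r:
            return n
    return neighbors[-1]

def update_pheromone(pheromone, path, rho=0.1, deposit=1.0):
    """Evaporate all trails by factor (1 - rho), then reinforce links on a
    successful path; shorter paths receive a larger per-hop deposit."""
    for n in pheromone:
        pheromone[n] *= (1 - rho)
    for n in path:
        pheromone[n] += deposit / len(path)
```

Evaporation lets the network forget stale routes (adaptivity), while reinforcement concentrates traffic on links that repeatedly deliver packets, which is the self-organized behavior the abstract attributes to biological control.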
Abstract:
Objective: To evaluate the feasibility, reliability and acceptability of the mini clinical evaluation exercise (mini-CEX) for performance assessment among international medical graduates (IMGs). Design, setting and participants: Observational study of 209 patient encounters involving 28 IMGs and 35 examiners at three metropolitan teaching hospitals in New South Wales, Victoria and Queensland, September-December 2006. Main outcome measures: The reliability of the mini-CEX was estimated using generalisability (G) analysis, and its acceptability was evaluated by a written survey of the examiners and IMGs. Results: The G coefficient for eight encounters was 0.88, suggesting that the reliability of the mini-CEX would be 0.90 for 10 encounters. Almost half of the IMGs (7/16) and most examiners (14/18) were satisfied with the mini-CEX as a learning tool. Most of the IMGs and examiners enjoyed the immediate feedback, which is a strong component of the tool. Conclusion: The mini-CEX is a reliable tool for performance assessment of IMGs, and is acceptable to and well received by both learners and supervisors.
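The step from a G coefficient of 0.88 over eight encounters to a projected 0.90 over ten follows the Spearman-Brown prophecy formula. A minimal sketch (the function name is an illustrative choice):

```python
def spearman_brown(reliability, k_old, k_new):
    """Project a reliability coefficient from k_old to k_new measurement
    occasions using the Spearman-Brown prophecy formula."""
    # Back out the implied single-encounter reliability...
    r1 = reliability / (k_old - (k_old - 1) * reliability)
    # ...then step it up to the new number of encounters.
    return k_new * r1 / (1 + (k_new - 1) * r1)

projected = spearman_brown(0.88, 8, 10)  # approximately 0.90
```

This reproduces the abstract's figure: a per-encounter reliability of about 0.48, compounded over ten encounters, gives roughly 0.90.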