75 results for job task analysis


Relevance:

40.00%

Publisher:

Abstract:

BACKGROUND: People with disabilities have difficulty obtaining work. However, evidence suggests that those with disabilities derive substantial mental health benefits from employment. This paper assesses how the relationship between work and mental health is influenced by psychosocial job quality for people working with a disability. METHODS: The study design was a longitudinal cohort with 13 annual waves of data collection, yielding a sample of 122,883 observations from 21,848 people. Fixed-effects within-person regression was used to control for time-invariant confounding. The Mental Component Summary (MCS) of the Short Form 36 (SF-36) measure was used as the primary outcome measure. The main exposure was a six-category measure of psychosocial job quality and employment status (including 'not in the labour force' [NILF] and unemployment). Disability status ('no waves of disability reported' and 'all contributed waves with reported disability') was assessed as an effect modifier. We also conducted a secondary analysis of respondents contributing both disability and non-disability waves. RESULTS: For those with no disability, the greatest difference in mental health (compared to optimal employment) occurs when people have the poorest quality jobs (-2.12, 95% CI -2.48, -1.75, p < 0.001). The relative difference in mental health was smaller in relation to NILF and unemployment (-0.39 and -0.66 respectively). For those with consistent disability, the difference in mental health compared to optimal employment was similar for the poorest quality jobs (-2.25, 95% CI -3.84, -0.65, p = 0.006), NILF (-2.84, 95% CI -4.49, -1.20, p = 0.001) and unemployment (-2.56, 95% CI -4.32, -0.80, p = 0.004). These results were confirmed by the secondary analysis. CONCLUSIONS: Efforts to improve psychosocial job quality may have significant mental health benefits for people with disabilities. This would also contribute to the economic viability of disability employment insurance schemes in Australia and other high-income countries.
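The abstract's key design choice is the fixed-effects within-person regression, which removes time-invariant confounding by comparing each person only with themselves. A minimal sketch of that estimator, using invented panel data rather than the study's: person-specific intercepts are eliminated by demeaning outcome and exposure within each person before fitting the slope.

```python
# Within-person ("fixed effects") slope estimator: demean y and x inside
# each person, then pool the demeaned observations. Any time-invariant
# person effect cancels out of the demeaned cross-products.

def within_person_slope(panel):
    """panel: dict person_id -> list of (x, y) observations over waves."""
    num = den = 0.0
    for obs in panel.values():
        xs = [x for x, _ in obs]
        ys = [y for _, y in obs]
        mx = sum(xs) / len(xs)
        my = sum(ys) / len(ys)
        for x, y in obs:
            num += (x - mx) * (y - my)   # demeaned cross-product
            den += (x - mx) ** 2         # demeaned variance
    return num / den

# Illustrative data: both people follow y = 2x + c with different
# intercepts (10 and -5); demeaning wipes the intercepts out, so the
# pooled slope recovers the within-person effect of 2.0.
panel = {
    "a": [(1, 12), (2, 14), (3, 16)],   # y = 2x + 10
    "b": [(1, -3), (2, -1), (4, 3)],    # y = 2x - 5
}
print(round(within_person_slope(panel), 6))
```

The same cancellation is why the study's design controls for stable traits (personality, long-standing health) without measuring them.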

Relevance:

30.00%

Publisher:

Abstract:

This paper reports on the results of a study aimed at identifying the relative influence of generic and job-specific stressors experienced by a cohort of Australian managers. A regression analysis revealed that both the generic components of the job strain model (JSM) and job-specific stressors were predictive of the strain experienced by participants. However, when the total variance explained by the predictor variables is considered, the combined influence of job demand, job control and social support contributed 98 per cent of the explained variance in job satisfaction and 90 per cent of the explained variance in psychological health. The large amount of variance explained by the JSM suggests that this model provides an accurate account of the work characteristics that contribute to the strain experienced by managers, and that no augmentation is needed.

Relevance:

30.00%

Publisher:

Abstract:

This paper focuses on the issue of comparing social groups or collectivities using measures derived from individual-level multivariate data. In this case, groups need to be differentiated such that: (a) between-group differences are maximised; (b) within-group differences are minimised; and (c) 'differences' are calibrated to a scale that reflects a set of indicators or observed variables. This paper demonstrates empirically how correspondence analysis can achieve this. It presents a scale of 'workplace morale' derived from the responses of employees in a large sample of workplaces to questions concerning satisfaction with various facets of their job and their workplace. The scale derived through correspondence analysis is shown to meet the three criteria described above.
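The scaling step the abstract describes can be sketched via reciprocal averaging, an equivalent way of obtaining the first correspondence-analysis dimension: group (row) scores are iteratively set to weighted averages of response-category (column) scores and vice versa. The 3x3 contingency table below is invented for illustration, not the paper's workplace data.

```python
# Reciprocal averaging for the first CA dimension. Centering removes the
# trivial constant solution; max-abs rescaling prevents collapse to zero.

def first_ca_dimension(table, iters=200):
    n_rows, n_cols = len(table), len(table[0])
    row_tot = [sum(r) for r in table]
    col_tot = [sum(table[i][j] for i in range(n_rows)) for j in range(n_cols)]
    grand = sum(col_tot)
    col_score = [float(j) for j in range(n_cols)]  # arbitrary start
    for _ in range(iters):
        # row scores: weighted average of current column scores
        row_score = [sum(table[i][j] * col_score[j] for j in range(n_cols))
                     / row_tot[i] for i in range(n_rows)]
        # column scores: weighted average of the new row scores
        col_score = [sum(table[i][j] * row_score[i] for i in range(n_rows))
                     / col_tot[j] for j in range(n_cols)]
        # centre (mass-weighted) and rescale
        mean = sum(col_tot[j] * col_score[j] for j in range(n_cols)) / grand
        col_score = [s - mean for s in col_score]
        norm = max(abs(s) for s in col_score) or 1.0
        col_score = [s / norm for s in col_score]
    return row_score, col_score

# Workplaces (rows) by satisfaction category low/mid/high (columns).
table = [[30, 10, 5],    # mostly dissatisfied workplace
         [10, 20, 10],   # mixed
         [5, 10, 30]]    # mostly satisfied workplace
rows, cols = first_ca_dimension(table)
print(rows)  # ordered "morale" scores separating the three workplaces
```

The resulting row scores order the workplaces along a single morale scale, which is the property (a)-(c) the paper exploits.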

Relevance:

30.00%

Publisher:

Abstract:

Predicting which consumers will be amongst the first to adopt an innovative product is a difficult task, but one that is valuable in allowing effective and efficient use of marketing resources. This paper examines the accuracy of predictions about likely first adopters based on the most widely accepted theory and compares them to predictions made by examining the relevant past behavior of consumers. A survey of over 1000 consumers examined adoption of an innovative technology: compact fluorescent light globes. The results show that variables derived from a utility and awareness perspective were more accurate and managerially useful predictors than the demographic variables derived from the widely accepted theory based on the work of Rogers. It is suggested that these alternative variables could be utilized more readily by marketing managers in many circumstances.

Relevance:

30.00%

Publisher:

Abstract:

Cluster computing has come to prominence as a cost-effective parallel processing tool for solving many complex computational problems. In this paper, we propose a new timesharing opportunistic scheduling policy to support remote batch job executions over networked clusters, to be used in conjunction with the Condor Up-Down scheduling algorithm. We show that timesharing approaches can be used in an opportunistic setting to improve both mean job slowdown and mean response time with little or no throughput reduction. We also show that the proposed algorithm achieves significant improvements in job response time and slowdown compared to existing approaches and some recently proposed new approaches.
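The metric the abstract optimises, job slowdown (response time divided by service demand), is what timesharing improves: short jobs stop waiting behind long ones. A toy illustration, not the paper's Condor-based policy: two jobs arriving together, comparing run-to-completion FCFS (long job first) against ideal processor sharing.

```python
# slowdown = response time / service demand; mean slowdown penalises
# making short jobs wait behind long ones.

def fcfs_slowdowns(sizes):
    t, out = 0.0, []
    for s in sizes:            # jobs run to completion in arrival order
        t += s
        out.append(t / s)
    return out

def ps_slowdowns(sizes):
    """All jobs arrive at time 0 and share the CPU equally (ideal PS)."""
    t, prev, out = 0.0, 0.0, {}
    remaining = len(sizes)
    for s in sorted(sizes):
        t += remaining * (s - prev)   # time until the next job drains
        out.setdefault(s, []).append(t / s)
        prev = s
        remaining -= 1
    return [out[s].pop(0) for s in sizes]

sizes = [10.0, 1.0]            # a long job ahead of a short one
f = fcfs_slowdowns(sizes)      # [1.0, 11.0] -> mean 6.0
p = ps_slowdowns(sizes)        # [1.1, 2.0]  -> mean 1.55
print(sum(f) / len(f), sum(p) / len(p))
```

Note the trade-off the abstract highlights: total work (throughput) is unchanged, but the mean slowdown drops sharply under timesharing.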

Relevance:

30.00%

Publisher:

Abstract:

Despite the continuing need for an empirically validated classification of marketing positions, for theoretical and practical purposes, it appears that no such schema exists. This study contributes to the development of such a taxonomy through an empirical examination of marketing positions. Specifically, the research extends an existing taxonomy by empirically investigating personal selling marketing activities. Based on the taxonomy developed by Darmon (1998), data were collected about the information load, information complexity, and time and relationship management activities of marketers. Various analytical techniques were used to investigate specific features of the instrument and the taxonomy, as well as to provide convergence for the conclusions drawn by the researchers. It was established that sales positions can be more meaningfully identified, and, therefore, better categorised, by six dimensions than by job title or job role. Further, it appears that marketers in the same selling position vary significantly on these dimensions. These findings have important implications for marketing theory, applied research and management. However, future research should refine the instrument used in this study, since some anomalies emerged in the findings, and extend the study by investigating a wider range of marketing positions. Such research may also explore whether the dimensions identified in this study influence the performance and job satisfaction of marketers, and the extent to which marketing managers account for these variations.

Relevance:

30.00%

Publisher:

Abstract:

Determining the causal relations among attributes in a domain is a key task in data mining and knowledge discovery. In this paper, we applied a causal discovery algorithm to business traveler expenditure survey data [1]. A general class of causal models is adopted to discover the causal relationships among continuous and discrete variables. All factors that have a direct effect on the expenditure pattern of travelers could be detected. Our discovery results reinforced some conclusions of the rough set analysis and yielded new conclusions that may significantly improve the understanding of the expenditure behavior of business travelers.
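Constraint-based causal discovery of the kind the abstract applies rests on testing (conditional) independence between variables. A minimal sketch of a first-order partial-correlation test for continuous variables; the synthetic variables below are illustrative, not from the survey data.

```python
import math
import random

def corr(xs, ys):
    """Pearson correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def partial_corr(xs, ys, zs):
    """Correlation of x and y after controlling for z (first order)."""
    rxy, rxz, ryz = corr(xs, ys), corr(xs, zs), corr(ys, zs)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz**2) * (1 - ryz**2))

random.seed(0)
# Common-cause structure z -> x and z -> y: x and y are correlated only
# through z, so the partial correlation given z should be near zero, and
# a constraint-based algorithm would not draw a direct x-y edge.
z = [random.gauss(0, 1) for _ in range(5000)]
x = [v + random.gauss(0, 0.5) for v in z]
y = [v + random.gauss(0, 0.5) for v in z]
print(round(corr(x, y), 2), round(abs(partial_corr(x, y, z)), 2))
```

Distinguishing "correlated" from "directly caused" in exactly this way is what lets the analysis single out factors with a direct effect on expenditure.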

Relevance:

30.00%

Publisher:

Abstract:

Protocol analysis is an empirical method applied by researchers in cognitive psychology and behavioural analysis. Protocol analysis can be used to collect, document and analyse the thought processes of an individual problem solver. In general, research subjects are asked to think aloud while performing a given task. Their verbal reports are transcribed and represent a sequence of their thoughts and cognitive activities. These verbal reports are analysed to identify relevant segments of cognitive behaviour by the research subjects. The analysis results may be cross-examined or validated through retrospective interviews with the research subjects. This paper offers a critical analysis of this research method, its approaches to data collection and analysis, and its strengths and limitations, and discusses its use in information systems research. The aim is to explore the use of protocol analysis in studying the creative requirements engineering process.

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the problem of performance modeling for large-scale heterogeneous distributed systems, with emphasis on multi-cluster computing systems. Since the overall performance of a distributed system often depends on the effectiveness of its communication network, the study of the interconnection networks for these systems is very important. Performance modeling is required to avoid poorly chosen components and architectures, and to avoid discovering serious shortfalls during system testing just prior to deployment. However, the multiplicity of components and the associated complexity make performance analysis of distributed computing systems a challenging task. To this end, we present an analytical performance model for the interconnection networks of heterogeneous multi-cluster systems. The analysis is based on a parametric family of fat-trees, the m-port n-tree, and a deterministic routing algorithm, which is proposed in this paper. The model is validated through comprehensive simulation, which demonstrates that the proposed model exhibits a good degree of accuracy for various system organizations and under different working conditions.
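The m-port n-tree family the model is built on has a simple closed-form size. Under the parameterization standard in the fat-tree literature (m-port switches, tree of height n) the node and switch counts are as below; treat these formulas as the commonly published definition rather than necessarily this paper's exact one.

```python
# m-port n-tree fat-tree sizes under the standard parameterization:
#   processing nodes: 2 * (m/2)^n
#   switches:         (2n - 1) * (m/2)^(n - 1)
# For n = 3 these reduce to the familiar k-port fat-tree figures
# (k^3/4 hosts, 5k^2/4 switches).

def mport_ntree(m, n):
    half = m // 2
    nodes = 2 * half ** n                  # processing nodes
    switches = (2 * n - 1) * half ** (n - 1)
    return nodes, switches

print(mport_ntree(4, 2))   # small example: 8 hosts, 6 switches
print(mport_ntree(8, 3))   # 128 hosts, 80 switches
```

Counts like these are the inputs an analytical model needs before any queueing or routing analysis starts.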

Relevance:

30.00%

Publisher:

Abstract:

When building a cost-effective high-performance parallel processing system, a performance model is a useful tool for exploring the design space and examining various parameters. However, performance analysis in such systems has proven to be a challenging task that requires innovative performance analysis tools and methods to keep up with the rapid evolution and ever-increasing complexity of such systems. To this end, we propose an analytical model for heterogeneous multi-cluster systems. The model takes into account stochastic quantities as well as network heterogeneity in bandwidth and latency in each cluster. Blocking and non-blocking network architecture models are also proposed and used in the performance analysis of the system. Message latency is used as the primary performance metric. The model is validated by constructing a set of simulators to simulate different types of clusters, and by comparing the modeled results with the simulated ones.
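The model's primary metric, message latency under heterogeneous bandwidth and latency, can be sketched with a common simplification: treat each cluster's channel as an M/M/1 queue. This is a generic illustration of the metric, not this paper's exact derivation, and all parameter values below are invented.

```python
# Mean message latency = propagation latency + M/M/1 sojourn time,
# where the service rate is the link bandwidth divided by message size.

def mean_latency(msg_bits, bandwidth, base_latency, arrival_rate):
    service_rate = bandwidth / msg_bits              # messages / second
    assert arrival_rate < service_rate, "unstable queue"
    queueing = 1.0 / (service_rate - arrival_rate)   # M/M/1 sojourn time
    return base_latency + queueing

# Two heterogeneous clusters carrying 8 kb messages: a 1 Gb/s cluster
# under heavier load vs a 100 Mb/s cluster under lighter load.
fast = mean_latency(8e3, bandwidth=1e9, base_latency=5e-6, arrival_rate=5e4)
slow = mean_latency(8e3, bandwidth=1e8, base_latency=5e-6, arrival_rate=5e3)
print(fast < slow)  # the slower link dominates end-to-end latency
```

Even this toy version shows why heterogeneity matters: end-to-end latency is governed by the slowest cluster on the path, which is what a multi-cluster model must capture.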

Relevance:

30.00%

Publisher:

Abstract:

Aims. To explore and explain nurses' use of readily available clinical information when deciding whether a patient is at risk of a critical event.

Background. Half of inpatients who suffer a cardiac arrest have documented but unacted upon clinical signs of deterioration in the 24 hours prior to the event. Nurses appear to be both misinterpreting and mismanaging the nursing-knowledge 'basics' such as heart rate, respiratory rate and oxygenation. Whilst many medical interventions originate from nurses, up to 26% of nurses' responses to abnormal signs result in delays of between one and three hours.

Methods. A double system judgement analysis using Brunswik's lens model of cognition was undertaken with 245 Dutch, UK, Canadian and Australian acute care nurses. Nurses were asked to judge the likelihood of a critical event, 'at-risk' status, and whether they would intervene in response to 50 computer-presented clinical scenarios in which data on heart rate, systolic blood pressure, urine output, oxygen saturation, conscious level and oxygenation support were varied. Nurses were also presented with a protocol recommendation and placed under time pressure for some of the scenarios. The ecological criterion was the predicted level of risk from the Modified Early Warning Score assessments of 232 UK acute care inpatients.

Results. Despite receiving identical information, nurses varied considerably in their risk assessments. The differences can be partly explained by variability in the weightings given to information. Time and protocol recommendations were given more weighting than clinical information for key dichotomous choices such as classifying a patient as 'at risk' and deciding to intervene. Nurses' weighting of cues did not mirror the same information's contribution to risk in real patients. Nurses synthesized information in non-linear ways that contributed little to decisional accuracy. The low-to-moderate achievement (Ra) statistics suggest that nurses' assessments of risk were largely inaccurate; these assessments were, however, applied consistently across 'patients' (scenarios). Critical care experience was statistically associated with estimates of risk, but not with the decision to intervene.

Conclusion. Nurses overestimated the risk and the need to intervene in simulated paper patients at risk of a critical event. This average response masked considerable variation in risk predictions, in the perceived need for action and in the weighting afforded to the information available. Nurses did not make use of the linear reasoning required for accurate risk predictions in this task. They also failed to employ any unique knowledge that could be shown to make them more accurate. The influence of time pressure and protocol recommendations depended on the kind of judgement faced, suggesting that knowing more about the types of decisions nurses face may influence information use.

Relevance to clinical practice. Practice developers and educators need to pay attention to the quality, as well as the quantity, of nurses' clinical experience when developing judgement expertise in nurses. Intuitive unaided decision making in the assessment of risk may not be as accurate as supported decision making. Practice developers and educators should consider teaching nurses normative rules for revising probabilities (even subjective ones), such as Bayes' rule, for diagnostic or assessment judgements, and that linear ways of thinking, in which decision support may help, may be useful for many of the choices nurses face. Nursing needs to separate the rhetoric of 'holism' and 'expertise' from the science of predictive validity, accuracy and competence in judgement and decision making.
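In Brunswik's lens model, the achievement statistic (Ra) the Results section reports is conventionally the correlation between a judge's assessments and the ecological criterion across scenarios. A minimal sketch with invented numbers, not the study's data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation; in the lens model, judgment vs criterion = Ra."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs)
                    * sum((y - my) ** 2 for y in ys))
    return num / den

# Hypothetical example: "true" scenario risk (the ecological criterion,
# e.g. a Modified Early Warning Score prediction) against one nurse's
# ratings of the same scenarios.
criterion = [1, 2, 2, 3, 5, 6, 7, 9]
judgments = [2, 1, 3, 3, 4, 7, 6, 8]
print(round(pearson(judgments, criterion), 2))  # achievement Ra = 0.94
```

A low-to-moderate Ra, as the study found, means the nurse's ordering of scenarios only partly tracks the criterion's ordering, regardless of any over- or under-estimation of the absolute level of risk.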

Relevance:

30.00%

Publisher:

Abstract:

This study examines gender differences in the experience of work–family interference and perceived job–life satisfaction in a group of Iranian employees. The participants in the study consist of 387 Iranian male and female employees from a variety of organizations. The results of t tests and multiple regression analysis using EQS 6.1 support the hypothesis that Iranian male and female employees experience similar interference in their work–family domains, although they spend different numbers of hours in the workplace. The findings also show that whereas work-to-family interference has significant and negative effects on job–life satisfaction among male employees, for female employees, working hours and family-to-work interference had even more significant and negative effects on their job–life satisfaction. Implications are discussed and recommendations made regarding future research and interventions in this area.

Relevance:

30.00%

Publisher:

Abstract:

This thesis reviews progress toward an understanding of the processes involved in the solution of spatial problems. Previous work employing factor analysis and information processing analysis is reviewed, and the emphasis on variations in speed and accuracy as the major contributors to individual differences is noted. It is argued that the strategy used by individuals is a preferable explanatory concept for identifying the cognitive substratum necessary for problem solving. Using the protocols obtained from subjects solving the Minnesota Paper Form Board (Revised), a test commonly regarded as measuring skill in spatial visualization, a number of different strategies are isolated. Assumptions as to the task variants which undergird these strategies are made and tested experimentally. The results suggest that task variants such as the size of the stimulus and the shape of the pieces interact with subject variables to produce the operating strategy. Skill in problem solving is revealed in the ability to structure the array, to hold a structured image and to reduce the number of answers requiring intensive processing. The interaction between task and subject variables results in appropriate or inappropriate strategies which in turn affect speed and accuracy. Results suggest that strategy formation and usage are the keys to explaining individual differences, and a heuristic model is presented to explain the performance of individual subjects on the problems involved in the Minnesota Paper Form Board. The model can be used to predict performance on other tests, and as an aid to teaching subjects experiencing difficulties. The model presented incorporates strategy variation and is consequently more complex than previously suggested models. It is argued that such complexity is necessary to explain the nature of a subject's performance and to perform diagnostic evaluation. Certain structural features of the Minnesota Paper Form Board are questioned and suggestions for improvement are included. The essential explanatory function of the strategy in use makes the prevalent group administration approach suspect in the prediction of future performance in spatial or vocational activity.

Relevance:

30.00%

Publisher:

Abstract:

This thesis reports on a quantitative exposure assessment and on an analysis of the attributes of the data used in the estimations, in particular distinguishing between uncertainty and variability. A retrospective assessment of exposure to benzene was carried out for a case-control study of leukaemia in the Australian petroleum industry. The study used the mean of personal task-based measurements (Base Estimates) in a deterministic algorithm and applied factors to model back to places, times etc. for which no exposure measurements were available. Mean daily exposures were estimated, on an individual subject basis, by summing the task-based exposures. These mean exposures were multiplied by the years spent in each job to provide exposure estimates in ppm-years. These were summed to provide a Cumulative Estimate for each subject. Validation was completed for the model and key inputs. Exposures were low, with most jobs below a TWA of 5 ppm benzene. Exposures in terminals were generally higher than at refineries. Cumulative Estimates ranged from 0.005 to 50.9 ppm-years, with 84 per cent less than 10 ppm-years. Exposure probability distributions were developed for tanker drivers using Monte Carlo simulation of the exposure estimation algorithm. The outcome was a lognormal distribution of exposure for each driver. These provide the basis for alternative risk assessment metrics, e.g. the frequency of short but intense exposures, which provide only a minimal contribution to the long-term average exposure but may increase the risk of leukaemia. The effects of different inputs to the model were examined and their significance assessed using Monte Carlo simulation. The Base Estimates were the most important determinant of exposure in the model. The sources of variability in the measured data were examined, including the effect of having censored data and the between- and within-worker variability. The sources of uncertainty in the exposure estimates were analysed and consequential improvements in exposure assessment identified. Monte Carlo sampling was also used to examine the uncertainties and variability associated with the tanker drivers' exposure assessment, to derive an estimate of the range, and to put confidence intervals on the daily mean exposures. The identified uncertainty was less than the variability associated with the estimates. The traditional approach to exposure estimation typically derives only point estimates of mean exposure. The approach developed here allows a range of exposure estimates to be made and provides a more flexible and improved basis for risk assessment.
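The Monte Carlo step the thesis describes replaces each point input of the exposure algorithm with a draw from a distribution, so the cumulative ppm-years estimate itself becomes a distribution from which percentiles and confidence intervals can be read. A minimal sketch; the distributions and parameter values below are invented for illustration and are not the study's inputs.

```python
import random
import statistics

random.seed(1)

def simulate_cumulative_exposure(n_draws=10000):
    """Monte Carlo draws of cumulative benzene exposure in ppm-years."""
    draws = []
    for _ in range(n_draws):
        # task-based mean daily exposure, lognormal as in the thesis
        mean_daily_ppm = random.lognormvariate(mu=-1.0, sigma=0.8)
        # uncertain duration of the job history, in years
        years_on_job = random.uniform(5, 15)
        draws.append(mean_daily_ppm * years_on_job)   # ppm-years
    return draws

draws = simulate_cumulative_exposure()
median = statistics.median(draws)
p95 = statistics.quantiles(draws, n=20)[18]   # 95th percentile
print(round(median, 1), round(p95, 1))
```

The gap between the median and the upper percentile is exactly the information a point estimate discards, and it is what supports the thesis's alternative risk metrics such as the frequency of short, intense exposures.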

Relevance:

30.00%

Publisher:

Abstract:

This research explores the transition from student to registered nurse from the perspective of the new graduate. This interpretive study uses narrative analysis as its methodology. Individual stories were collected and processed using the method of core story creation and emplotment (Emden 1998). Four newly registered nurses were invited to share stories related to how they were experiencing their role. Participants were encouraged to tell their stories in response to the open question 'what is it like to be a registered nurse?' In the final step of the analysis, one honest and critical story was crafted (Barone 1992) using a process termed emplotment, thus disclosing the themes that allow the stories to be grasped together as a single story (Polkinghorne 1988, Emden 1998). The final story of 'Fable' gives insight into the ways in which newly registered nurses experience their role. Becoming a registered nurse is not easy; however, Fable finds that nursing is more than just a job and describes many rewarding experiences. It is hoped that the outcomes of this research will be valuable to students, graduates, nurse academics and the profession of nursing generally by enhancing understandings of the relationship between the graduate and the actual employment experience.