988 results for Career changes
Abstract:
Using 20 years of employment and job mobility data from a representative German sample (N = 1259), we employ optimal matching analysis (OMA) to identify six career patterns which deviate from the traditional career path of long-term, full-time employment in one organization. Then, in further analyses, we examine which socio-demographic predictors affect whether or not individuals follow that traditional career path. Results indicate that age, gender, marital status, number of children, education, and career starts in the public sector significantly predicted whether or not individuals followed the traditional career path. The article concludes with directions for future theoretical and methodological research on career patterns.
Abstract:
Demographic changes necessitate that companies retain younger workers and motivate older workers through work design. Age-related differences in occupational goals should be taken into account when addressing these challenges. In this study, we investigated goal contents and goal characteristics of employees from different age groups. We surveyed 150 employees working in the service sector (average age = 44 years, age range 19 to 60 years) on their most important occupational goals. Employees who stated goals from the area of organizational citizenship were significantly older than employees with other goals. Employees who stated goals from the areas of training and pay/career were significantly younger than employees with other goals. After controlling for gender, education, and work characteristics, no age-related differences were found in the goal areas teamwork, job security, working time, well-being, and new challenges. In addition, no relationships were found between age and the goal characteristics specificity, planning intensity, and positive and negative goal emotions. We recommend that companies provide older workers with more opportunities for organizational citizenship and retain younger workers by providing development opportunities and adequate pay.
Abstract:
Previous research showed that daily manifestations of career adaptability fluctuate within individuals over short periods of time, and predict important daily job and career outcomes. Using a quantitative daily diary study design (N = 156 employees; 591 daily entries), the author investigated daily job characteristics (i.e., daily job demands, daily job autonomy, and daily supervisory career mentoring) and daily individual characteristics (i.e., daily Big Five personality characteristics, daily core self-evaluations, and daily temporal focus) as within-person predictors of daily career adaptability and its four dimensions (concern, control, curiosity, and confidence). Results showed that daily job demands, daily job autonomy, daily conscientiousness, daily openness to experience, as well as daily past and future temporal focus positively predicted daily career adaptability. Differential results emerged for the four career adaptability dimensions. Implications for future research on within-person variability in career adaptability are discussed.
Abstract:
Most research on career adaptability has examined the construct as an individual differences variable and neglected that it may vary within an individual over a short period of time. In two daily diary studies, the author investigated the relationships of career adaptability and its four dimensions (concern, control, curiosity, and confidence) to their daily manifestations as well as daily job and career outcomes. Both Study 1 (N = 53) and Study 2 (N = 234) demonstrated substantial within-person variability in employees' behavioral expressions of career adaptability across five work days. Results further showed that daily career adaptability and daily confidence positively predicted daily task and career performance, as well as daily job and career satisfaction. Daily control positively predicted daily task performance, as well as daily job and career satisfaction. Daily concern positively predicted daily career performance and satisfaction, and daily curiosity positively predicted daily career satisfaction.
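Within-person designs like these typically split each daily score into a stable between-person component (the person's mean across days) and a fluctuating within-person component (that day's deviation from the mean). A minimal sketch of that person-mean centering step, using invented diary data rather than the study's:

```python
# Person-mean centering for daily diary data: split each daily score into a
# between-person component (the person's mean across days) and a
# within-person component (that day's deviation from the person's mean).
# The scores below are invented for illustration, not the study's data.

def person_mean_center(diary):
    """diary maps person id -> list of daily scores (e.g., daily career adaptability)."""
    centered = {}
    for person, scores in diary.items():
        mean = sum(scores) / len(scores)
        centered[person] = {
            "between": mean,                       # stable component
            "within": [s - mean for s in scores],  # daily deviations
        }
    return centered

diary = {"p1": [3.0, 3.5, 4.0, 3.5, 3.0], "p2": [2.0, 2.5, 2.0, 3.5, 2.0]}
components = person_mean_center(diary)
# Each person's within-person deviations sum to zero by construction;
# the spread of those deviations is the within-person variability reported above.
```

The within-person deviations are what daily-level predictors are related to in analyses of this kind.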
Abstract:
Research on career adaptability and its relationships with work outcomes has so far primarily focused on the cohort of younger workers and largely neglected older workers. We investigated the relationship between career adaptability and job satisfaction in a sample of 577 older workers from Australia (M age = 59.6 years, SD = 2.4, range 54–66 years), who participated in a 4-wave substudy of the 45 and Up Study. Based on socioemotional selectivity theory, we examined older workers’ chronological age (as a proxy for retirement proximity) and motivation to continue working after traditional retirement age as moderators of the relationship between career adaptability and job satisfaction. We hypothesized that the positive relationship between career adaptability and job satisfaction is stronger among relatively younger workers and workers with a high motivation to continue working compared to relatively older workers and workers with a low motivation to continue working. Results showed that older workers’ age, but not their motivation to continue working, moderated the relationship between career adaptability and job satisfaction consistent with the expected pattern. Implications for future research on age and career adaptability as well as ideas on how to maintain and improve older workers’ career adaptability and job satisfaction are discussed.
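A moderation result of this kind is usually probed with simple slopes: in a model of the form satisfaction = b0 + b1·adaptability + b2·age + b3·(adaptability·age), the slope of adaptability at a given (centered) age is b1 + b3·age. A minimal sketch with made-up coefficients (only the age SD of 2.4 comes from the abstract):

```python
# Simple-slopes probe of a moderated relationship.
# The regression coefficients are invented for illustration.

def simple_slope(b1, b3, moderator_value):
    """Slope of the focal predictor at a given moderator value
    in y = b0 + b1*x + b2*m + b3*(x*m)."""
    return b1 + b3 * moderator_value

b1, b3 = 0.40, -0.05   # hypothetical: a negative interaction weakens the slope with age
sd_age = 2.4           # SD of age reported in the abstract

low = simple_slope(b1, b3, -sd_age)   # 1 SD below mean age
high = simple_slope(b1, b3, +sd_age)  # 1 SD above mean age
# With these made-up coefficients, the adaptability-satisfaction slope is
# stronger for relatively younger workers than for relatively older ones,
# mirroring the pattern hypothesized in the study.
```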
Abstract:
According to career construction theory, continuous adaptation to the work environment is crucial to achieve work and career success. In this study, we examined the relative importance of career adaptability for job performance ratings using an experimental policy-capturing design. Employees (N = 135) from different vocational backgrounds rated the overall job performance of fictitious employees in 40 scenarios based on information about their career adaptability, mental ability, conscientiousness, and job complexity. We used multilevel modeling to investigate the relative importance of each factor. Consistent with expectations, career adaptability positively predicted job performance ratings, and this effect was relatively smaller than the effects of conscientiousness and mental ability. Job complexity did not moderate the effect of career adaptability on job performance ratings, suggesting that career adaptability predicts job performance ratings in high-, medium-, and low-complexity jobs. Consistent with previous research, the effect of mental ability on job performance ratings was stronger in high- compared to low-complexity jobs. Overall, our findings provide initial evidence for the predictive validity of employees' career adaptability with regard to other people's ratings of job performance.
Abstract:
- Objective To examine changes in sitting time (ST) in women over nine years and to identify associations between life events and these changes. - Methods Young (born 1973–78, n = 5215) and mid-aged (born 1946–51, n = 6973) women reported life events and ST in four surveys of the Australian Longitudinal Study on Women's Health between 2000 and 2010. Associations between life events and changes in ST between surveys (decreasers ≥ 2 h/day less, increasers ≥ 2 h/day more) were estimated using generalized estimating equations. - Results Against a background of complex changes there was an overall decrease in ST in young women (median change −0.48 h/day, interquartile range [IQR] = −2.54, 1.50) and an increase in ST in mid-aged women (median change 0.43 h/day; IQR = −1.29, 2.0) over nine years. In young women, returning to study and job loss were associated with increased ST, while having a baby, beginning work and decreased income were associated with decreased ST. In mid-aged women, changes at work were associated with increased ST, while retiring and decreased income were associated with decreased ST. - Conclusions ST changed over nine years in young and mid-aged Australian women. The life events they experienced, particularly events related to work and family, were associated with these changes.
Abstract:
Weed management practices in cotton systems that were based on frequent cultivation, residual herbicides, and some post-emergent herbicides have changed. The ability to use glyphosate as a knockdown before planting, in shielded sprayers, and now over-the-top in glyphosate-tolerant cotton has seen a significant reduction in the use of residual herbicides and cultivation. Glyphosate is now the dominant herbicide in both crop and fallow. This reliance increases the risk of shifts to glyphosate-tolerant species and the evolution of glyphosate-resistant weeds. Four surveys were undertaken in the 2008-09 and 2010-11 seasons. Surveys were conducted at the start of the summer cropping season (November-December) and at the end of the same season (March-April). Fifty fields previously surveyed in irrigated and non-irrigated cotton systems were re-surveyed. A major species shift towards Conyza bonariensis was observed. There was also a minor increase in the prevalence of Sonchus oleraceus. Several species were still present at the end of the season, indicating either poor control and/or late-season germinations. These included C. bonariensis, S. oleraceus, Hibiscus verdcourtii and Hibiscus tridactylites, Echinochloa colona, Convolvulus sp., Ipomoea lonchophylla, Chamaesyce drummondii, Cullen sp., Amaranthus macrocarpus, and Chloris virgata. These species, with the exception of E. colona, H. verdcourtii, and H. tridactylites, have tolerance to glyphosate and therefore are likely candidates to either remain or increase in dominance in a glyphosate-based system.
Abstract:
In this study, we used Parthenium hysterophorus and one of its biological control agents, the winter rust (Puccinia abrupta var. partheniicola), as a model system to investigate how the weed may respond to infection under a climate change scenario involving an elevated atmospheric CO2 concentration (550 μmol mol−1). Under such a scenario, P. hysterophorus plants grew significantly taller (52%) and produced more biomass (55%) than under the ambient atmospheric CO2 concentration (380 μmol mol−1). Following winter rust infection, biomass production was reduced by 17% under the ambient and by 30% under the elevated atmospheric CO2 concentration. The production of branches and leaf area was significantly increased by 62% and 120%, respectively, under the elevated as compared with the ambient CO2 concentration, but was unaffected by rust infection under either condition. The photosynthesis and water use efficiency (WUE) of P. hysterophorus plants were increased by 94% and 400%, respectively, under the elevated as compared with the ambient atmospheric CO2 concentration. However, in the rust-infected plants, photosynthesis and WUE decreased by 18% and 28%, respectively, under the elevated CO2 concentration and were unaffected under the ambient atmospheric CO2 concentration. The results suggest that although P. hysterophorus will benefit from a future climate involving an elevation of the atmospheric CO2 concentration, it is also likely that the winter rust will perform more effectively as a biological control agent under these same conditions.
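If the reported percentage effects compose multiplicatively (an assumption for illustration; the abstract reports each effect separately against its own baseline), the net biomass of rust-infected plants under elevated CO2, relative to uninfected plants at ambient CO2, can be sketched as simple arithmetic:

```python
# Illustrative arithmetic: composing the reported percentage effects.
# Assumes the effects combine multiplicatively, which the abstract does
# not state explicitly.

co2_gain = 1.55             # +55% biomass under elevated CO2 (vs. ambient, uninfected)
rust_factor_elevated = 0.70 # -30% biomass after rust infection under elevated CO2
rust_factor_ambient = 0.83  # -17% biomass after rust infection under ambient CO2

net_elevated_infected = co2_gain * rust_factor_elevated  # vs. ambient uninfected baseline
net_ambient_infected = 1.0 * rust_factor_ambient
# Under these assumptions, infected plants at elevated CO2 still end up above
# the ambient uninfected baseline (1.55 * 0.70 = 1.085), even though the rust
# takes a proportionally larger bite under elevated CO2.
```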
Abstract:
PURPOSE To quantify the influence of short-term wear of miniscleral contact lenses on the morphology of the corneo-scleral limbus, the conjunctiva, episclera and sclera. METHODS OCT images of the anterior eye were captured before, immediately following 3h of wear, and 3h after removal of a miniscleral contact lens for 10 young (27±5 years) healthy participants (neophyte rigid lens wearers). The region of analysis extended from 1mm anterior to 3.5mm posterior to the scleral spur. Natural diurnal variations in thickness were measured on a separate day and compensated for in subsequent analyses. RESULTS Following 3h of lens wear, statistically significant tissue thinning was observed across all quadrants, with a mean decrease in thickness of −24.1±3.6μm (p<0.001), which diminished but did not return to baseline 3h after lens removal (−16.9±1.9μm, p<0.001). The largest tissue compression was observed in the superior quadrant (−49.9±8.5μm, p<0.01) and in the annular zone 1.5mm from the scleral spur (−48.2±5.7μm), corresponding to the approximate edge of the lens landing zone. Compression of the conjunctiva/episclera accounted for about 70% of the changes. CONCLUSIONS Optimally fitting miniscleral contact lenses worn for three hours resulted in significant tissue compression in young healthy eyes, with the greatest thinning observed superiorly, potentially due to the additional force of the eyelid, and a partial recovery of compression 3h after lens removal. Most of the morphological changes occurred in the conjunctiva/episclera layers.
Abstract:
We investigate the extent to which individuals’ global motivation (self-determined and non-self-determined types) influences adjustment (anxiety, positive reappraisal) and engagement (intrinsic motivation, task performance) in reaction to changes to the level of work control available during a work simulation. Participants (N = 156) completed 2 trials of an inbox activity under conditions of low or high work control—with the ordering of these levels varied to create an increase, decrease, or no change in work control. In support of the hypotheses, results revealed that for more self-determined individuals, high work control led to the increased use of positive reappraisal. Follow-up moderated mediation analyses revealed that the increases in positive reappraisal observed for self-determined individuals in the conditions in which work control was high by Trial 2 consequently increased their intrinsic motivation toward the task. For more non-self-determined individuals, high work control (as well as changes in work control) led to elevated anxiety. Follow-up moderated mediation analyses revealed that the increases in anxiety observed for non-self-determined individuals in the high-to-high work control condition consequently reduced their task performance. It is concluded that adjustment to a demanding work task depends on a fit between individuals’ global motivation and the work control available, which has consequences for engagement with demanding work.
Abstract:
Background People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can cause damage to the skin. The skin is an important barrier that protects the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections. Objectives To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes including pain and skin damage. Search methods In June 2015 we searched: The Cochrane Wounds Specialised Register; The Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting. Selection criteria All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections on all patients in any healthcare setting. Data collection and analysis We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion, performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate or otherwise synthesised data descriptively when heterogeneous. 
Main results We included five RCTs (2277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval between changes (2 to 5 days; control). In each study, participants were followed up until the CVAD was removed or until discharge from ICU or hospital. - Confirmed catheter-related bloodstream infection (CRBSI) One trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low quality evidence). - Suspected catheter-related bloodstream infection Two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between people having long or short intervals between dressing changes (RR 0.70, 95% CI 0.23 to 2.10) (low quality evidence). - All cause mortality Three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all cause mortality. It is unclear whether there is a difference in the risk of death from any cause between people having long or short intervals between dressing changes (RR 1.06, 95% CI 0.90 to 1.25) (low quality evidence). - Catheter-site infection Two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection.
It is unclear whether there is a difference in risk of catheter-site infection between people having long or short intervals between dressing changes (RR 1.07, 95% CI 0.71 to 1.63) (low quality evidence). - Skin damage One small trial (112 children) and three trials (1475 adults) measured skin damage. There was very low quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR of scoring ≥ 2 on the skin damage scale 0.33, 95% CI 0.16 to 0.68; data for adults not pooled). - Pain Two studies involving 193 participants measured pain. It is unclear if there is a difference between long and short interval dressing changes on pain during dressing removal (RR 0.80, 95% CI 0.46 to 1.38) (low quality evidence). Authors' conclusions The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.
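The risk ratios and confidence intervals reported above follow the standard 2×2-table calculation: RR = (a/n1)/(c/n2), with a 95% CI of exp(ln RR ± 1.96·SE), where SE = sqrt(1/a − 1/n1 + 1/c − 1/n2). A minimal sketch (the event counts are invented for illustration, not taken from the included trials):

```python
import math

# Risk ratio with a 95% CI from a 2x2 table, using the standard log-normal
# approximation common in meta-analyses. The counts below are invented.

def risk_ratio_ci(a, n1, c, n2, z=1.96):
    """a/n1 = events/total in intervention arm; c/n2 = events/total in control arm."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)  # SE of ln(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

rr, lo, hi = risk_ratio_ci(10, 100, 5, 100)  # hypothetical counts
# With these counts the CI spans 1.0, so the comparison would be inconclusive,
# much like the wide intervals reported in the review above.
```

Small trials yield wide intervals because the SE term is dominated by 1/a and 1/c, which is why even a pooled RR near 1 can remain inconclusive.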
Abstract:
Climate projections over the next two to four decades indicate that most of Australia’s wheat-belt is likely to become warmer and drier. Here we used a shire-scale, dynamic stress-index model that accounts for the impacts of rainfall and temperature on wheat yield, and a range of climate change projections from global circulation models, to spatially estimate yield changes assuming no adaptation and no CO2 fertilisation effects. We modelled five scenarios: a baseline climate (climatology, 1901–2007), and two emission scenarios (“low” and “high” CO2) for two time horizons, namely 2020 and 2050. The potential benefits from CO2 fertilisation were analysed separately using a point-level functional simulation model. Irrespective of the emissions scenario, the 2020 projection showed negligible changes in the modelled yield relative to the baseline climate, using both the shire-scale and functional point-scale models. For the 2050 high-emissions scenario, changes in modelled yield relative to the baseline ranged from −5% to +6% across most of Western Australia, parts of Victoria and southern New South Wales, and from −5% to −30% in northern NSW, Queensland and the drier environments of Victoria, South Australia and inland Western Australia. Taking into account CO2 fertilisation effects across a north–south transect through eastern Australia cancelled most of the yield reductions associated with increased temperatures and reduced rainfall by 2020, and attenuated the expected yield reductions by 2050.
Abstract:
A zonally averaged version of the Goddard Laboratory for Atmospheric Sciences (GLAS) climate model is used to study the sensitivity of the northern hemisphere (NH) summer mean meridional circulation to changes in the large-scale eddy forcing. A standard solution is obtained by prescribing the latent heating field and climatological horizontal transports of heat and momentum by the eddies. The radiative heating and surface fluxes are calculated by model parameterizations. This standard solution is compared with the results of several sensitivity studies. When the eddy forcing is reduced to 0.5 times or increased to 1.5 times the climatological values, the strength of the Ferrel cells decreases or increases proportionally. Such changes in the eddy forcing can also significantly influence the strength of the NH Hadley cell. The possible impact of such changes in the large-scale eddy forcing on the monsoon circulation, via changes in the Hadley circulation, is discussed. Sensitivity experiments including only one component of eddy forcing at a time show that the eddy momentum fluxes are more important in maintaining the Ferrel cells than the eddy heat fluxes. In the absence of the eddy heat fluxes, the observed eddy momentum fluxes alone produce subtropical westerly jets that are weaker than those in the standard solution. On the other hand, the observed eddy heat fluxes alone produce subtropical westerly jets that are stronger than those in the standard solution.