992 results for Pubertal Changes
Abstract:
BACKGROUND: The lesser grain borer, Rhyzopertha dominica (F.), is a highly destructive pest of stored grain that is strongly resistant to the fumigant phosphine (PH3). Phosphine resistance is due to genetic variants at the rph2 locus that alter the function of the dihydrolipoamide dehydrogenase (DLD) gene. This discovery now enables direct detection of resistance variants at the rph2 locus in field populations. RESULTS: A genotype assay was developed for direct detection of changes in distribution and frequency of a phosphine resistance allele in field populations of R. dominica. Beetles were collected from ten farms in south-east Queensland in 2006 and resampled in 2011. Resistance allele frequency increased in the period from 2006 to 2011 on organic farms with no history of phosphine use, implying that migration of phosphine-resistant R. dominica had occurred from nearby storages. CONCLUSION: Increasing resistance allele frequencies on organic farms suggest local movement of beetles and dispersal of insects from areas where phosphine has been used. This research also highlighted for the first time the utility of a genetic DNA marker in accurate and rapid determination of the distribution of phosphine-resistant insects in the grain value chain. Extending this research over larger landscapes would help in identifying resistance problems and enable timely pest management decisions. © 2013 Society of Chemical Industry. 69(6), June 2013. DOI: 10.1002/ps.3514. Rapid Report.
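For illustration, an allele frequency of the kind tracked in this study can be computed from genotype counts at a diploid locus such as rph2. This is a minimal sketch; the function name and beetle counts are hypothetical and do not reproduce the study's assay or samples:

```python
def resistance_allele_frequency(n_rr, n_rs, n_ss):
    """Frequency of the resistance allele from genotype counts.

    n_rr: homozygous resistant, n_rs: heterozygous, n_ss: homozygous
    susceptible. Each diploid beetle carries two alleles at the locus.
    """
    n_beetles = n_rr + n_rs + n_ss
    # Resistance alleles: two per RR individual, one per RS individual,
    # out of two alleles per beetle sampled.
    return (2 * n_rr + n_rs) / (2 * n_beetles)

# Hypothetical sample: 12 RR, 30 RS, 58 SS beetles
freq = resistance_allele_frequency(12, 30, 58)
print(f"resistance allele frequency: {freq:.2f}")
```

Comparing such frequencies between sampling years (e.g. 2006 vs 2011) is what reveals the increase reported above.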
Abstract:
TRFLP (terminal restriction fragment length polymorphism) was used to assess whether management practices that improved disease suppression and/or yield in a 4-year ginger field trial were related to changes in soil microbial community structure. Bacterial and fungal community profiles were defined by presence and abundance of terminal restriction fragments (TRFs), where each TRF represents one or more species. Results indicated that inclusion of an organic amendment and minimum tillage increased the relative diversity of dominant fungal populations in a system-dependent way. Inclusion of an organic amendment increased bacterial species richness in the pasture treatment. Redundancy analysis showed shifts in microbial community structure associated with different management practices, and treatments grouped according to TRF abundance in relation to yield and disease incidence. ANOVA also indicated that the abundance of certain TRFs was significantly affected by farming system management practices, and a number of these TRFs were also correlated with yield or disease suppression. Further analyses are required to determine whether identified TRFs can be used as general or soil-type-specific bio-indicators of productivity (increased and decreased) and Pythium myriotylum suppressiveness.
Abstract:
Visual problems may be the first symptoms of diabetes. There have been several reports of transient refractive changes in people newly diagnosed with diabetes. Visual acuity and refraction may be affected when ocular biometric changes occur. Small but significant biometric changes have been found by some authors during hyperglycaemia and during its reduction.[4] Here, we describe a case in which type 2 diabetes was detected from ocular straylight and intraocular thickness measurements...
Abstract:
Vitamin D maintains normal bone growth and remodelling throughout life. In Finland, as in many other Western countries, the population's vitamin D status is inadequate, and in winter even deficient in some individuals. This thesis investigated whether vitamin D increases bone mass accrual during growth, and whether it maintains balanced bone metabolism in adulthood. These effects may prevent the development of osteoporosis at different stages of life. The thesis examined the effects of vitamin D supplements of varying doses in three age groups: 11-12-year-old girls (N=228), 21-49-year-old men (N=54) and 65-85-year-old women (N=52). Participants were randomised to groups receiving either a placebo or 5-20 µg of vitamin D3 as a supplement. The studies were double-blinded. During the studies, fasting blood and urine samples were collected from the participants. In addition, they completed a background questionnaire and a food frequency questionnaire to assess calcium and vitamin D intake. The girls' bone mineral density (BMD) was measured by DXA and the men's volumetric bone density by pQCT. From the samples, concentrations of serum 25-hydroxyvitamin D (S-25-OHD), parathyroid hormone (S-PTH) and bone turnover markers, among others, were determined. In the cross-sectional study of adolescent girls, S-25-OHD and bone formation marker concentrations varied between months; the highest concentrations were measured in September and the lowest in March, reflecting seasonal variation. Corresponding variation was observed in lumbar spine and femoral BMD. Vitamin D supplementation had a positive effect on bone mass accrual in the girls. The highest vitamin D dose (10 µg/day) increased bone mass by 17.2% more in the femur and 12.5% more in the lumbar spine compared with the girls receiving placebo, but the result depended on compliance. 
The effect of vitamin D on bone was mediated through decreased bone resorption. Based on these results, an adequate vitamin D intake for adolescent girls is 15 µg/day. In the 65-85-year-old women, the effect of vitamin D supplementation on S-25-OHD concentration stabilised within six weeks at doses of 5-20 µg/day. These doses did not achieve the target S-25-OHD concentration of 80 nmol/l. We estimated that women in this age group would reach a concentration of 60 nmol/l, which occurs in summer among Finns of this age, with a daily vitamin D intake of 24 µg. In the healthy men, seasonal variation was observed in S-25-OHD and S-PTH concentrations and in a bone resorption marker. In contrast, no variation was observed in the volumetric bone density of the radius or in a bone formation marker. The seasonal variation was prevented by a daily vitamin D intake of 17 µg, but this was not observed to affect radial bone density during the six-month study. In conclusion, vitamin D intake remains inadequate in the study populations. This is reflected in the seasonal variation of S-25-OHD, PTH and bone turnover marker concentrations, which is detrimental to bone health. Vitamin D intake should be increased so that at least a sufficient vitamin D status (S-25-OHD>50 nmol/l), or possibly even the target status (S-25-OHD≥80 nmol/l), is achieved. To make an increased vitamin D intake possible in all age groups, more vitamin D-fortified foods should be developed.
Abstract:
In industrial and organizational psychology, there is a long tradition of studying personality as an antecedent of work outcomes. Recently, however, scholars have suggested that personality characteristics may not only predict, but also change due to certain work experiences, a notion that is depicted in the dynamic developmental model (DDM) of personality and work. Upward job changes are an important part of employees’ careers and career success in particular, and we argue that these career transitions can shape personality over time. In this study, we investigate the Big Five personality characteristics as both predictors and outcomes of upward job changes into managerial and professional positions. We tested our hypotheses by applying event history analyses and propensity score matching to a longitudinal dataset collected over five years from employees in Australia. Results indicated that participants’ openness to experience not only predicted upward job changes into managerial and professional positions, but also changed following such transitions. Our findings thus provide support for a dynamic perspective on personality characteristics in the context of work and careers.
Abstract:
- Objective To examine changes in sitting time (ST) in women over nine years and to identify associations between life events and these changes. - Methods Young (born 1973–78, n = 5215) and mid-aged (born 1946–51, n = 6973) women reported life events and ST in four surveys of the Australian Longitudinal Study on Women's Health between 2000 and 2010. Associations between life events and changes in ST between surveys (decreasers ≥ 2 h/day less, increasers ≥ 2 h/day more) were estimated using generalized estimating equations. - Results Against a background of complex changes there was an overall decrease in ST in young women (median change − 0.48 h/day, interquartile range [IQR] = − 2.54, 1.50) and an increase in ST in mid-aged women (median change 0.43 h/day; IQR = − 1.29, 2.0) over nine years. In young women, returning to study and job loss were associated with increased ST, while having a baby, beginning work and decreased income were associated with decreased ST. In mid-aged women, changes at work were associated with increased ST, while retiring and decreased income were associated with decreased ST. - Conclusions ST changed over nine years in young and mid-aged Australian women. The life events they experienced, particularly events related to work and family, were associated with these changes.
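The decreaser/increaser categories used above can be illustrated with a minimal sketch; the function name, threshold default, and example values are hypothetical, and the study's actual analysis used generalized estimating equations on the survey data:

```python
def classify_st_change(st_before, st_after, threshold=2.0):
    """Classify the change in daily sitting time (hours/day)
    between two surveys, using a +/- 2 h/day cut-off as in the
    decreaser/increaser definition above."""
    change = st_after - st_before
    if change <= -threshold:
        return "decreaser"
    if change >= threshold:
        return "increaser"
    return "no meaningful change"

# Hypothetical participant: 8.0 h/day at one survey, 5.5 h/day at the next
print(classify_st_change(8.0, 5.5))  # → decreaser
```

Each participant's classification between consecutive surveys would then be modelled against the life events reported in the same interval.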
Abstract:
Weed management practices in cotton systems that were based on frequent cultivation, residual herbicides, and some post-emergent herbicides have changed. The ability to use glyphosate as a knockdown before planting, in shielded sprayers, and now over-the-top in glyphosate-tolerant cotton has seen a significant reduction in the use of residual herbicides and cultivation. Glyphosate is now the dominant herbicide in both crop and fallow. This reliance increases the risk of shifts to glyphosate-tolerant species and the evolution of glyphosate-resistant weeds. Four surveys were undertaken in the 2008-09 and 2010-11 seasons. Surveys were conducted at the start of the summer cropping season (November-December) and at the end of the same season (March-April). Fifty fields previously surveyed in irrigated and non-irrigated cotton systems were re-surveyed. A major species shift towards Conyza bonariensis was observed. There was also a minor increase in the prevalence of Sonchus oleraceus. Several species were still present at the end of the season, indicating poor control and/or late-season germination. These included C. bonariensis, S. oleraceus, Hibiscus verdcourtii and Hibiscus tridactylites, Echinochloa colona, Convolvulus sp., Ipomoea lonchophylla, Chamaesyce drummondii, Cullen sp., Amaranthus macrocarpus, and Chloris virgata. These species, with the exception of E. colona, H. verdcourtii, and H. tridactylites, have tolerance to glyphosate and are therefore likely candidates to either remain or increase in dominance in a glyphosate-based system.
Abstract:
In this study, we used Parthenium hysterophorus and one of its biological control agents, the winter rust (Puccinia abrupta var. partheniicola), as a model system to investigate how the weed may respond to infection under a climate change scenario involving an elevated atmospheric CO2 concentration (550 μmol mol−1). Under such a scenario, P. hysterophorus plants grew significantly taller (52%) and produced more biomass (55%) than under the ambient atmospheric CO2 concentration (380 μmol mol−1). Following winter rust infection, biomass production was reduced by 17% under the ambient and by 30% under the elevated atmospheric CO2 concentration. Branch production and leaf area were significantly increased, by 62% and 120% respectively, under the elevated compared with the ambient CO2 concentration, but were unaffected by rust infection under either condition. Photosynthesis and water use efficiency (WUE) of P. hysterophorus plants increased by 94% and 400%, respectively, under the elevated compared with the ambient atmospheric CO2 concentration. In rust-infected plants, however, photosynthesis and WUE decreased by 18% and 28%, respectively, under the elevated CO2 concentration but were unaffected under the ambient concentration. The results suggest that although P. hysterophorus will benefit from a future climate involving an elevated atmospheric CO2 concentration, it is also likely that the winter rust will perform more effectively as a biological control agent under these same conditions.
Abstract:
PURPOSE To quantify the influence of short-term wear of miniscleral contact lenses on the morphology of the corneo-scleral limbus, the conjunctiva, episclera and sclera. METHODS OCT images of the anterior eye were captured before, immediately following 3 h of wear, and 3 h after removal of a miniscleral contact lens for 10 young (27±5 years) healthy participants (neophyte rigid lens wearers). The region of analysis extended from 1 mm anterior to 3.5 mm posterior to the scleral spur. Natural diurnal variations in thickness were measured on a separate day and compensated for in subsequent analyses. RESULTS Following 3 h of lens wear, statistically significant tissue thinning was observed across all quadrants, with a mean decrease in thickness of -24.1±3.6 μm (p<0.001), which diminished, but did not return to baseline, 3 h after lens removal (-16.9±1.9 μm, p<0.001). The largest tissue compression was observed in the superior quadrant (-49.9±8.5 μm, p<0.01) and in the annular zone 1.5 mm from the scleral spur (-48.2±5.7 μm), corresponding to the approximate edge of the lens landing zone. Compression of the conjunctiva/episclera accounted for about 70% of the changes. CONCLUSIONS Optimally fitting miniscleral contact lenses worn for three hours resulted in significant tissue compression in young healthy eyes, with the greatest thinning observed superiorly, potentially due to the additional force of the eyelid, and with a partial recovery of compression 3 h after lens removal. Most of the morphological changes occur in the conjunctiva/episclera layers.
Abstract:
We investigate the extent to which individuals’ global motivation (self-determined and non-self-determined types) influences adjustment (anxiety, positive reappraisal) and engagement (intrinsic motivation, task performance) in reaction to changes to the level of work control available during a work simulation. Participants (N = 156) completed 2 trials of an inbox activity under conditions of low or high work control—with the ordering of these levels varied to create an increase, decrease, or no change in work control. In support of the hypotheses, results revealed that for more self-determined individuals, high work control led to the increased use of positive reappraisal. Follow-up moderated mediation analyses revealed that the increases in positive reappraisal observed for self-determined individuals in the conditions in which work control was high by Trial 2 consequently increased their intrinsic motivation toward the task. For more non-self-determined individuals, high work control (as well as changes in work control) led to elevated anxiety. Follow-up moderated mediation analyses revealed that the increases in anxiety observed for non-self-determined individuals in the high-to-high work control condition consequently reduced their task performance. It is concluded that adjustment to a demanding work task depends on a fit between individuals’ global motivation and the work control available, which has consequences for engagement with demanding work.
Abstract:
Background People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can cause damage to the skin. The skin is an important barrier that protects the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections. Objectives To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes including pain and skin damage. Search methods In June 2015 we searched: The Cochrane Wounds Specialised Register; The Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting. Selection criteria All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections on all patients in any healthcare setting. Data collection and analysis We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion, performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate or otherwise synthesised data descriptively when heterogeneous. 
Main results We included five RCTs (2277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval between changes (2 to 5 days; control). In each study, participants were followed up until the CVAD was removed or until discharge from ICU or hospital. - Confirmed catheter-related bloodstream infection (CRBSI) One trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low quality evidence). - Suspected catheter-related bloodstream infection Two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between people having long or short intervals between dressing changes (RR 0.70, 95% CI 0.23 to 2.10) (low quality evidence). - All cause mortality Three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all cause mortality. It is unclear whether there is a difference in the risk of death from any cause between people having long or short intervals between dressing changes (RR 1.06, 95% CI 0.90 to 1.25) (low quality evidence). - Catheter-site infection Two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection.
It is unclear whether there is a difference in risk of catheter-site infection between people having long or short intervals between dressing changes (RR 1.07, 95% CI 0.71 to 1.63) (low quality evidence). - Skin damage One small trial (112 children) and three trials (1475 adults) measured skin damage. There was very low quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR of scoring ≥ 2 on the skin damage scale 0.33, 95% CI 0.16 to 0.68; data for adults not pooled). - Pain Two studies involving 193 participants measured pain. It is unclear if there is a difference between long and short interval dressing changes on pain during dressing removal (RR 0.80, 95% CI 0.46 to 1.38) (low quality evidence). Authors' conclusions The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.
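The risk ratios and confidence intervals quoted in this review can be illustrated with a minimal sketch of the standard calculation from a two-group comparison, using the usual log-normal approximation for the CI. The event counts below are hypothetical and are not the review's data:

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs group B with an approximate 95% CI.

    Uses the standard log-normal approximation:
    SE(log RR) = sqrt(1/a - 1/n_a + 1/b - 1/n_b).
    """
    risk_a = events_a / n_a
    risk_b = events_b / n_b
    rr = risk_a / risk_b
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical counts: 5/500 infections with long intervals
# vs 4/495 with short intervals
rr, lower, upper = risk_ratio(5, 500, 4, 495)
print(f"RR {rr:.2f}, 95% CI {lower:.2f} to {upper:.2f}")
```

A wide interval spanning 1 (as in RR 1.42, 95% CI 0.40 to 4.98 above) is what makes the evidence inconclusive: the data are compatible with both benefit and harm.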