103 results for Projected length
Abstract:
The time course of elongation and recovery of axial length associated with a 30 minute accommodative task was studied using optical low coherence reflectometry in a population of young adult myopic (n = 37) and emmetropic (n = 22) subjects. Ten of the 59 subjects were excluded from analysis due either to inconsistent accommodative response or to incomplete anterior biometry data. Those subjects with valid data (n = 49) were found to exhibit a significant axial elongation immediately following the commencement of a 30 minute, 4 D accommodation task, which was sustained for the duration of the task, and was evident to a lesser extent immediately following task cessation. During the accommodation task, on average, the myopic subjects exhibited 22 ± 34 µm, and the emmetropic subjects 6 ± 22 µm, of axial elongation; however, the differences in axial elongation between the myopic and emmetropic subjects were not statistically significant (p = 0.136). Immediately following the completion of the task, the myopic subjects still exhibited an axial elongation (mean magnitude 12 ± 28 µm) that was significantly greater (p < 0.05) than the change in axial length observed in the emmetropic subjects (mean change -3 ± 16 µm). Axial length had returned to baseline levels 10 minutes after completion of the accommodation task. The time for recovery from accommodation-induced axial elongation was greater in myopes, which may reflect differences in the biomechanical properties of the globe associated with refractive error. Changes in subfoveal choroidal thickness could be measured in 37 of the 59 subjects, and a small amount of choroidal thinning was observed during the accommodation task that was statistically significant in the myopic subjects (p < 0.05). These subfoveal choroidal changes could account for some, but not all, of the increased axial length during accommodation.
Abstract:
Purpose. The purpose of the study was to investigate the changes in axial length occurring with shifts in gaze direction. Methods. Axial length measurements were obtained from the left eye of 30 young adults (10 emmetropes, 10 low myopes, and 10 moderate myopes) through a rotating prism with 15° deviation, along the foveal axis, using a noncontact optical biometer in each of the nine different cardinal directions of gaze over 5 minutes. The subject's fellow eye fixated on an external distance (6 m) target to control accommodation, also with 15° deviation. Axial length measurements were also performed in 15° and 25° downward gaze with the biometer inclined on a tilting table, allowing gaze shifts to be achieved with either full head turn but no eye turn, or full eye turn with no head turn. Results. There was a significant influence of gaze angle and time on axial length (both P < 0.001), with the greatest axial elongation (+18 ± 8 μm) occurring with inferonasal gaze (P < 0.001) and a slight decrease in axial length in superior gaze (−12 ± 17 μm) compared with primary gaze (P < 0.001). In downward gaze, a significant axial elongation occurred when eye turn was used (P < 0.001), but not when head turn was used to shift gaze (P > 0.05). Conclusions. The angle of gaze has a small but significant short-term effect on axial length, with greatest elongation occurring in inferonasal gaze. The elongation of the eye appears to be due to the influence of the extraocular muscles, in particular the oblique muscles.
Abstract:
Recent research indicates that brief periods (60 minutes) of monocular defocus lead to small but significant changes in human axial length. However, the effects of longer periods of defocus on the axial length of human eyes are unknown. We examined the influence of a 12 hour period of monocular myopic defocus on the natural daily variations occurring in axial length and choroidal thickness of young adult emmetropes. A series of axial length and choroidal thickness measurements (collected at ~3 hourly intervals, with the first measurement at ~9 am and the final measurement at ~9 pm) were obtained for 13 emmetropic young adults over three consecutive days. The natural daily rhythms (Day 1, baseline day, no defocus), the daily rhythms with monocular myopic defocus (Day 2, defocus day, +1.50 DS spectacle lens over the right eye), and the recovery from any defocus-induced changes (Day 3, recovery day, no defocus) were all examined. Significant variations over the course of the day were observed in both axial length and choroidal thickness on each of the three measurement days (p<0.0001). The magnitude and timing of the daily variations in axial length and choroidal thickness were significantly altered with the monocular myopic defocus on day 2 (p<0.0001). Following the introduction of monocular myopic defocus, the daily peak in axial length occurred approximately 6 hours later, and the peak in choroidal thickness approximately 8.5 hours earlier in the day, compared to days 1 and 3 (with no defocus). The mean amplitudes (peak to trough) of change in axial length (0.030 ± 0.012 mm on day 1, 0.020 ± 0.010 mm on day 2, and 0.033 ± 0.012 mm on day 3) and choroidal thickness (0.030 ± 0.007 mm on day 1, 0.022 ± 0.006 mm on day 2, and 0.027 ± 0.009 mm on day 3) were also significantly different between the three days (both p<0.05).
The introduction of monocular myopic defocus disrupts the daily variations in axial length and choroidal thickness of human eyes, in terms of both amplitude and timing; these variations return to normal on the following day, after removal of the defocus.
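The peak-to-trough amplitudes reported above are simple descriptive statistics. As a minimal sketch only (the readings and helper function below are hypothetical, not the study's data or analysis code), the daily amplitude of an axial-length series could be computed as:

```python
def diurnal_amplitude(measurements):
    """Peak-to-trough amplitude of one day's axial-length series.

    measurements: list of (hour_of_day, axial_length_mm) tuples.
    Returns (amplitude_mm, peak_hour, trough_hour).
    """
    peak_hour, peak = max(measurements, key=lambda m: m[1])
    trough_hour, trough = min(measurements, key=lambda m: m[1])
    return peak - trough, peak_hour, trough_hour

# Hypothetical ~3-hourly readings from 9 am to 9 pm (mm):
day1 = [(9, 24.105), (12, 24.131), (15, 24.120), (18, 24.112), (21, 24.101)]
amp, peak_h, trough_h = diurnal_amplitude(day1)
print(f"{amp:.3f} mm, peak {peak_h} h, trough {trough_h} h")
# 0.030 mm, peak 12 h, trough 21 h
```

With readings at ~3-hour intervals, the true peak and trough can fall between samples, so amplitudes derived this way are lower bounds on the underlying rhythm.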
Abstract:
Projected increases in atmospheric carbon dioxide concentration ([CO2]) and air temperature associated with future climate change are expected to affect crop development, crop yield, and, consequently, global food supplies. They are also likely to change agricultural production practices, especially those related to agricultural water management and sowing date. The magnitude of these changes and their implications for local production systems are mostly unknown. The objectives of this study were to: (i) simulate the effect of projected climate change on spring wheat (Triticum aestivum L. cv. Lang) yield and water use for the subtropical environment of the Darling Downs, Queensland, Australia; and (ii) investigate the impact of changing sowing date, as an adaptation strategy to future climate change scenarios, on wheat yield and water use. The multimodel climate projections from the IPCC Coupled Model Intercomparison Project (CMIP3) for the period 2030–2070 were used in this study. Climate scenarios included combinations of four changes in air temperature (0°C, 1°C, 2°C, and 3°C), three [CO2] levels (380 ppm, 500 ppm, and 600 ppm), and three changes in rainfall (–30%, 0%, and +20%), which were superimposed on observed station data. Crop management scenarios included a combination of six sowing dates (1 May, 10 May, 20 May, 1 June, 10 June, and 20 June) and three irrigation regimes (no irrigation (NI), deficit irrigation (DI), and full irrigation (FI)).
Simulations were performed with the model DSSAT4.5, using 50 years of daily weather data. We found that: (1) grain yield and water-use efficiency (yield/evapotranspiration) increased linearly with [CO2]; (2) increases in [CO2] had minimal impact on evapotranspiration; (3) yield increased with increasing temperature for the irrigated scenarios (DI and FI), but decreased for the NI scenario; (4) yield increased with earlier sowing dates; and (5) changes in rainfall had a small impact on yield for DI and FI, but a high impact for the NI scenario.
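The factorial scenario design described above lends itself to straightforward enumeration. As a sketch (the variable names are illustrative and unrelated to DSSAT itself), the full scenario grid can be generated with:

```python
from itertools import product

# Climate factors as described: temperature increments (°C),
# atmospheric [CO2] (ppm), and rainfall change (%).
temps = [0, 1, 2, 3]
co2_levels = [380, 500, 600]
rainfall_changes = [-30, 0, 20]

# Management factors: sowing dates and irrigation regimes.
sowing_dates = ["1 May", "10 May", "20 May", "1 Jun", "10 Jun", "20 Jun"]
irrigation = ["NI", "DI", "FI"]

climate_scenarios = list(product(temps, co2_levels, rainfall_changes))
management_scenarios = list(product(sowing_dates, irrigation))

# Every climate scenario is crossed with every management scenario.
all_runs = list(product(climate_scenarios, management_scenarios))

print(len(climate_scenarios))     # 36 climate scenarios
print(len(management_scenarios))  # 18 management scenarios
print(len(all_runs))              # 648 scenario combinations
```

Each of the 648 combinations would then be simulated against the 50 years of daily weather data mentioned above.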
Abstract:
Rationale: The Australasian Nutrition Care Day Survey (ANCDS) evaluated whether malnutrition and decreased food intake are independent risk factors for negative outcomes in hospitalised patients. Methods: A multicentre (56 hospitals) cross-sectional survey was conducted in two phases. Phase 1 evaluated nutritional status (defined by Subjective Global Assessment) and 24-hour food intake recorded as 0, 25, 50, 75, and 100% intake. Phase 2 data, which included length of stay (LOS), readmissions and mortality, were collected 90 days post-Phase 1. Logistic regression was used to control for confounders: age, gender, disease type and severity (using Patient Clinical Complexity Level scores). Results: Of 3122 participants (53% males, mean age: 65±18 years), 32% were malnourished and 23% consumed ≤25% of the offered food. Median LOS for malnourished (MN) patients was higher than for well-nourished (WN) patients (15 vs. 10 days, p<0.0001). Median LOS for patients consuming ≤25% of the food was higher than for those consuming ≥50% (13 vs. 11 days, p<0.0001). MN patients had higher readmission rates (36% vs. 30%, p = 0.001). The odds of 90-day in-hospital mortality were 1.8 times greater for MN patients (95% CI: 1.03–3.22, p = 0.04) and 2.7 times greater for those consuming ≤25% of the offered food (95% CI: 1.54–4.68, p = 0.001). Conclusion: The ANCDS demonstrates that malnutrition and/or decreased food intake are associated with longer LOS and readmissions. The survey also establishes that malnutrition and decreased food intake are independent risk factors for in-hospital mortality in acute care patients, and highlights the need for appropriate nutritional screening and support during hospitalisation. Disclosure of Interest: None Declared.
Abstract:
Wing length is a key character for essential behaviours related to bird flight such as migration and foraging. In the present study, we initiate the search for the genes underlying wing length in birds by studying a long-distance migrant, the great reed warbler (Acrocephalus arundinaceus). In this species wing length is an evolutionarily interesting trait with a pronounced latitudinal gradient and sex-specific selection regimes in local populations. We performed a quantitative trait locus (QTL) scan for wing length in great reed warblers using phenotypic, genotypic, pedigree and linkage map data from our long-term study population in Sweden. We applied the linkage analysis mapping method implemented in GRIDQTL (a new web-based software package) and detected a genome-wide significant QTL for wing length on chromosome 2, to our knowledge the first QTL detected in wild birds. The QTL extended over 25 cM and accounted for a substantial part (37%) of the phenotypic variance of the trait. A genome scan for tarsus length (a body size-related trait) did not show any signal, implying that the wing-length QTL on chromosome 2 was not associated with body size. Our results provide a first important step towards understanding the genetic architecture of avian wing length, and give opportunities to study the evolutionary dynamics of wing length at the locus level.
Abstract:
BACKGROUND: A long length of stay (LOS) in the emergency department (ED) associated with overcrowding has been found to adversely affect the quality of ED care. The objective of this study is to determine whether patients who speak a language other than English at home have a longer LOS in EDs compared to those who speak only English at home. METHODS: A secondary data analysis of a Queensland state-wide hospital EDs dataset (Emergency Department Information System) was conducted for the period 1 January 2008 to 31 December 2010. RESULTS: The interpreter requirement was highest among Vietnamese speakers (23.1%), followed by Chinese (19.8%) and Arabic speakers (18.7%). There were significant differences in the distributions of departure statuses among the language groups (Chi-squared=3236.88, P<0.001). Compared with English speakers, the beta coefficients for LOS in the EDs, measured in minutes, were: Vietnamese, 26.3 (95%CI: 22.1–30.5); Arabic, 10.3 (95%CI: 7.3–13.2); Spanish, 9.4 (95%CI: 7.1–11.7); Chinese, 8.6 (95%CI: 2.6–14.6); Hindi, 4.0 (95%CI: 2.2–5.7); Italian, 3.5 (95%CI: 1.6–5.4); and German, 2.7 (95%CI: 1.0–4.4). The final regression model explained 17% of the variability in LOS. CONCLUSION: There is a close relationship between the language spoken at home and LOS in EDs, indicating that language could be an important predictor of prolonged LOS in EDs, and that improving language services might reduce LOS and ease overcrowding in EDs in Queensland's public hospitals.
Abstract:
Dear Editor We thank Dr Klek for his interest in our article and for giving us the opportunity to clarify our study and share our thoughts. Our study examined the prevalence of malnutrition in an acute tertiary hospital and tracked the outcomes prospectively.1 There are a number of reasons why we chose Subjective Global Assessment (SGA) to determine the nutritional status of patients. Firstly, we took the view that nutrition assessment tools should be used to determine nutrition status and diagnose the presence and severity of malnutrition, whereas the purpose of nutrition screening tools is to identify individuals who are at risk of malnutrition. Nutritional assessment rather than screening should be used as the basis for planning and evaluating nutrition interventions for those diagnosed with malnutrition. Secondly, SGA has been well accepted and validated as an assessment tool to diagnose the presence and severity of malnutrition in clinical practice.2, 3 It has been used in many studies as a valid prognostic indicator of a range of nutritional and clinical outcomes.4, 5, 6 On the other hand, the Malnutrition Universal Screening Tool (MUST)7 and Nutrition Risk Screening 2002 (NRS 2002)8 have been established as screening rather than assessment tools.
Abstract:
We predict here from first-principles calculations that finite-length (n,0) single-walled carbon nanotubes (SWCNTs) with H-termination at the open ends display antiferromagnetic coupling when n is greater than 6. An opposite local gating effect of the spin states, i.e., half-metallicity, is found under the influence of an external electric field along the direction of the tube axis. Remarkably, boron doping of unpassivated SWCNTs at both zigzag edges is found to favor a ferromagnetic ground state, with the B-doped tubes displaying half-metallic behavior even in the absence of an electric field. Aside from the intrinsic interest of these results, an important avenue for the development of CNT-based spintronics is suggested.
Abstract:
Objective To evaluate relative telomere length of female migraine patients. Background Migraine is a debilitating disorder affecting 6-28% of the population. Studies on the mechanisms of migraine have demonstrated genetic causes but the pathophysiology and subcellular effects of the disease remain poorly understood. Shortened telomere length is associated with age-related or chronic diseases, and induced stresses. Migraine attacks may impart significant stress on cellular function; thus this study investigated the correlation between telomere shortening and migraine. Methods Relative telomere length was measured using a previously described quantitative polymerase chain reaction method. A regression analysis was performed to assess differences in mean relative telomere length between migraine patients and healthy controls. Results The leukocyte telomeres of a cohort of 142 Caucasian female migraine subjects aged 18-77 years and 143 matched 17-77-year-old healthy control Caucasian women were examined. A significantly shorter relative telomere length was observed in the migraine group compared with the control group after adjusting for age and body mass index (P = .001). In addition, age of onset was observed to be associated with the loss of relative telomere length, especially at early age of onset (<17 years old). No association was observed between relative telomere length and the severity and frequency of migraine attacks or the duration of migraine. Conclusion Telomeres are shorter in migraine patients and there is more variation in telomere length in migraine patients.
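The abstract cites a previously described quantitative PCR method without detailing the calculation. Relative telomere length from qPCR is conventionally expressed as a T/S ratio (telomere repeat signal over a single-copy gene signal); the sketch below shows the generic 2^−ΔΔCt form of that calculation as an illustration under that assumption, not the authors' actual pipeline:

```python
def relative_telomere_length(ct_telo, ct_scg, ref_ct_telo, ref_ct_scg):
    """T/S ratio relative to a calibrator sample via the 2^-ddCt method.

    ct_telo / ct_scg: qPCR cycle thresholds for the telomere and
    single-copy gene assays of the sample.
    ref_ct_telo / ref_ct_scg: the same quantities for the calibrator.
    """
    d_ct_sample = ct_telo - ct_scg       # sample delta-Ct
    d_ct_ref = ref_ct_telo - ref_ct_scg  # calibrator delta-Ct
    return 2.0 ** -(d_ct_sample - d_ct_ref)

# A sample whose telomere product amplifies one cycle earlier than the
# calibrator's (relative to the single-copy gene) has twice the ratio.
print(relative_telomere_length(18.0, 24.0, 19.0, 24.0))  # 2.0
```

Group differences in mean T/S ratio would then be tested with a regression adjusting for covariates such as age and body mass index, as the abstract describes.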
Abstract:
FOXP1 is a transcriptional repressor that has been proposed to repress the expression of some NFκB-responsive genes. Furthermore, truncated forms of FOXP1 have been associated with a subtype of Diffuse Large B-cell Lymphoma characterised by constitutive NFκB activity, indicating that they may inhibit this repression. We have shown that FL tumors have increased relative abundance of truncated FOXP1 isoforms and this is associated with increased expression of NFκB-associated genes. Our results provide strong evidence that relative FOXP1 isoform abundance is associated with NFκB activity in FL, and could potentially be used as a marker for this gene signature.
Abstract:
Essential hypertension is a highly heritable disorder in which genetic influences predominate over environmental factors. The molecular genetic profiles which predispose to essential hypertension are not known. In rats with genetic hypertension, there is some recent evidence pointing to linkage of renin gene alleles with blood pressure. The genes for renin and antithrombin III belong to a conserved synteny group which, in humans, spans the q21.3-32.3 region of chromosome 1 and, in rats, is linkage group X on chromosome 13. The present study examined the association of particular human renin gene (REN) and antithrombin III gene (AT3) polymorphisms with essential hypertension by comparing the frequency of specific alleles for each of these genes in 50 hypertensive offspring of hypertensive parents and 91 normotensive offspring of normotensive parents. In addition, linkage relationships were examined in hypertensive pedigrees with multiple affected individuals. Alleles of a REN HindIII restriction fragment length polymorphism (RFLP) were detected using a genomic clone, λHR5, to probe Southern blots of HindIII-cut leucocyte DNA, and those for an AT3 PstI RFLP were detected using the phATIII 113 complementary DNA probe. The frequencies of each REN allele in the hypertensive group were 0.76 and 0.24, compared with 0.74 and 0.26 in the normotensive group. For AT3, hypertensive allele frequencies were 0.49 and 0.51, compared with normotensive values of 0.54 and 0.46. These differences were not significant by χ2 analysis (P > 0.2).
Linkage analysis of a family (data from 16 family members, 10 of whom were hypertensive), informative for both markers, without an age-of-onset correction, and assuming dominant inheritance of hypertension, complete penetrance and a disease frequency of 20%, did not indicate linkage of REN with hypertension, but gave a positive, although not significant, logarithm of the odds for linkage score of 0.784 at a recombination fraction of 0 for AT3 linkage to hypertension. In conclusion, the present study could find no evidence for an association of a REN HindIII RFLP with essential hypertension or for a linkage of the locus defined by this RFLP in a family segregating for hypertension. In the case of an AT3 PstI RFLP, although association analysis was negative, linkage analysis suggested possible involvement (odds of 6:1 in favour) of a gene located near the 1q23 locus with hypertension in one informative family.
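The χ2 comparison of allele frequencies reported above can be illustrated with a minimal Pearson chi-square on a 2×2 table of allele counts; the counts below are rounded reconstructions from the reported REN frequencies (two alleles per subject), not the authors' raw data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table of counts:
        group 1: a, b   (allele 1, allele 2)
        group 2: c, d
    """
    n = a + b + c + d
    # Expected count for each cell under independence:
    # (row total * column total) / grand total.
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Illustrative REN allele counts: hypertensives 76/24 of 100 alleles
# (50 subjects), normotensives ~135/47 of 182 alleles (91 subjects).
stat = chi_square_2x2(76, 24, 135, 47)
print(round(stat, 3))  # 0.114, far below the 3.84 critical value (df = 1)
```

A statistic this small is consistent with the non-significant result (P > 0.2) reported in the abstract.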
Abstract:
Purpose To investigate the influence of monocular hyperopic defocus on the normal diurnal rhythms in axial length and choroidal thickness of young adults. Methods A series of axial length and choroidal thickness measurements (collected at ~3 hourly intervals, with the first measurement at ~9 am and the final measurement at ~9 pm) were obtained for 15 emmetropic young adults over three consecutive days. The natural diurnal rhythms (Day 1, no defocus), diurnal rhythms with monocular hyperopic defocus (Day 2, –2.00 DS spectacle lens over the right eye), and the recovery from any defocus-induced changes (Day 3, no defocus) in diurnal rhythms were examined. Results Both axial length and choroidal thickness underwent significant diurnal changes on each of the three measurement days (p<0.0001). The introduction of monocular hyperopic defocus resulted in significant changes in the diurnal variations observed in both parameters (p<0.05). A significant (p<0.001) increase in the mean amplitude (peak to trough) of change in axial length (mean increase, 0.016 ± 0.005 mm) and choroidal thickness (mean increase, 0.011 ± 0.003 mm) was observed on day 2 with hyperopic defocus compared to the two ‘no defocus’ days (days 1 and 3). At the second measurement (mean time 12:10 pm) on the day with hyperopic defocus, the eye was significantly longer, by 0.012 ± 0.002 mm, compared to the other two days (p<0.05). No significant difference was observed in the average timing of the daily peaks in axial length (mean peak time 12:12 pm) and choroidal thickness (mean peak time 9:02 pm) over the three days. Conclusions The introduction of monocular hyperopic defocus resulted in a significant increase in the amplitude of the diurnal change in axial length and choroidal thickness that returned to normal the following day, after removal of the blur stimulus.