101 results for Reproductive outcomes
Abstract:
Forty-five Large White gilts were used to study the effect of energy intake from 28 to 176 d of age on body composition and reproductive development. From 28 to 60 d, the gilts were fed ad libitum a 16.6 MJ DE/kg, 24% crude protein and 1.3% total lysine diet. From 61 d of age, three dietary treatments were used: 1) ad libitum access to feed (15.6 MJ DE/kg, 21% crude protein and 1.07% total lysine) (H), 2) feed offered at 75% (M) of the previous day's intake of H, and 3) feed offered at 60% (L) of the previous day's intake of H. ADG from 61 to 176 d of age was affected (p <0.05) by treatment. Although live weight at 176 d of age did not differ (p >0.1), the H gilts had higher (p <0.08) carcass weights than the M or L gilts. Back fat depths were similar (p >0.1) for all treatments at 115 d of age; however, by 176 d of age, M and H gilts were fatter (p <0.1) than L gilts. The mean lipid deposition (LD) from 115 to 176 d of age for L gilts (78.9 g/d) was less (p <0.05) than for M gilts (143.6 g/d) and H gilts (135.6 g/d). There were no differences between treatments for protein deposition (PD) over the same period. More (p <0.05) H gilts (n=8) attained puberty (first observed estrus) than either M gilts or L gilts (n=4 for both). Follicle numbers were similar (p >0.1) across treatments. For gilts that attained puberty, H gilts had fewer (p <0.05) follicles (13.5) than M gilts (19.7) and L gilts (21.3). For gilts with follicular development, H gilts had the heaviest (458.7 g) reproductive tract weight (RTW). However, for those that attained puberty, L gilts had the heaviest RTW. RTW was lowest for those with no follicular development. Energy restriction had a negative impact on puberty attainment, i.e., it took longer to reach puberty. However, for gilts that attained puberty, the number of follicles was greater for those on lower feed intakes. It would appear that rate of fat deposition, but not necessarily the total amount of fat, plays an important role in puberty attainment.
Abstract:
Genetic research on risk of alcohol, tobacco or drug dependence must make allowance for the partial overlap of risk factors for initiation of use and risk factors for dependence or other outcomes in users. Except in the extreme cases where genetic and environmental risk factors for initiation and dependence overlap completely or are uncorrelated, there is no consensus about how best to estimate the magnitude of genetic or environmental correlations between Initiation and Dependence in twin and family data. We explore by computer simulation the biases to estimates of genetic and environmental parameters caused by model misspecification when Initiation can only be defined as a binary variable. For plausible simulated parameter values, the two-stage genetic models that we consider yield estimates of genetic and environmental variances for Dependence that, although biased, are not very discrepant from the true values. However, estimates of genetic (or environmental) correlations between Initiation and Dependence may be seriously biased, and may differ markedly under different two-stage models. Such estimates may have little credibility unless external data favor selection of one particular model. These problems can be avoided if Initiation can be assessed as a multiple-category variable (e.g., never versus early-onset versus later-onset user), with at least two categories measurable in users at risk for dependence. Under these conditions, and under certain distributional assumptions, recovery of simulated genetic and environmental correlations becomes possible. Illustrative application of the model to Australian twin data on smoking confirmed substantial heritability of smoking persistence (42%) with minimal overlap with genetic influences on initiation.
Abstract:
This paper presents evidence from two surveys to help explain the poor ratings consistently given to the teaching of economics at Australian universities. The evidence suggests that the poor ratings of economics teaching can be attributed to two related factors: inappropriate pedagogical practices and a lack of rewards for allocating additional time to teaching. The survey data on pedagogy in economics consist of 205 responses from graduates of two Queensland universities. The time elapsed since graduation ranges from 1 to 10 years. The survey data on academics' time allocation consist of 290 responses from academic economists across a wide range of Australian universities.
Abstract:
Objective: This study aimed to describe discharge outcomes and explore their correlates for patients rehabilitated after stroke at an Australian hospital from 1993 to 1998. Design: Data on length of stay, discharge functional status, and discharge destination were retrospectively obtained from medical records. Patients' actual rehabilitation length of stay was compared with the Australian National Sub-Acute and Non-Acute Patient predicted length of stay. The change in length of stay over the 5-yr period from 1993 to 1998 was documented. Results: Patients' mean converted motor FIM scores improved from 53.1 at admission to 74.1 at discharge. Lower converted motor FIM scores at admission were related to longer length of stay, lower converted motor FIM scores at discharge, and the need for a change in living situation on discharge. Conclusion: The results of this study provide Australian data on discharge outcomes after stroke to assist in the planning and delivery of appropriate interventions to individual patients during rehabilitation.
Abstract:
This paper examines occupational performance across three racial groups in Australia: Indigenous Australians; Asian people, defined as all those whose language spoken at home was Chinese, Vietnamese or another South-east or East Asian language; and white people, defined as the residual category. The paper takes as its starting point observed differences in occupational attainment among the three groups in Australia and sets out to account for these observed differences on the basis of both race and non-racial attributes such as age, education and area of residence.
Abstract:
Objectives: To determine the incidence of dysphagia (defined as the inability to manage a diet of normal consistencies) at hospital discharge and beyond 1 year postsurgery, and to examine the impact of persistent dysphagia on levels of disability, handicap, and well-being in patients. Design: Retrospective review and patient contact. Setting: Adult acute care tertiary hospital. Patients: The study group, consecutively sampled from January 1993 to December 1997, comprised 55 patients who underwent total laryngectomy and 37 patients who underwent pharyngolaryngectomy with free jejunal reconstruction. Follow-up with 36 of 55 laryngectomy and 14 of 37 pharyngolaryngectomy patients was conducted 1 to 6 years postsurgery. Main Outcome Measures: Number of days until the resumption of oral intake; swallowing complications prior to and following discharge; types of diets managed at discharge and follow-up; and ratings of disability, handicap, and distress levels related to swallowing. Results: Fifty-four (98%) of the laryngectomy and 37 (100%) of the pharyngolaryngectomy patients experienced dysphagia at discharge. By approximately 3 years postsurgery, 21 (58%) of the laryngectomy and 7 (50%) of the pharyngolaryngectomy patients managed a normal diet. Pharyngolaryngectomy patients experienced increased duration of nasogastric feeding, time to resume oral intake, and incidence of early complications affecting swallowing. Patients experiencing long-term dysphagia identified significantly increased levels of disability, handicap, and distress. Patients without dysphagia also experienced slight levels of handicap and distress resulting from taste changes and increased durations required to complete meals of normal consistency. Conclusions: The true incidence of patients experiencing a compromise in swallowing following surgery has been underestimated. The significant impact of impaired swallowing on a patient's level of perceived disability, handicap, and distress highlights the importance of providing optimal management of this negative consequence of surgery to maximize the patient's quality of life.
Abstract:
The purpose of this study was to compare transient evoked otoacoustic emission (TEOAE) screening outcomes (pass/fail) across the seasons (spring, autumn, and winter) between infants and schoolchildren. A total of 526 infants (275 boys, 251 girls) with a mean age of 2.0 months (SD = 0.38 months) and 975 schoolchildren (513 boys, 462 girls) with a mean age of 6.2 years (SD = 0.36 years) were screened using the ILO Otodynamics Quickscreen program. The same TEOAE pass/fail criterion was applied to the two groups. The results indicated a significant difference in pass rates between infants (91.2% of 1052 ears) and schoolchildren (86.0% of 1950 ears). A seasonal effect was found only for schoolchildren, with a significantly lower pass rate in winter than in spring or autumn. There was no difference in pass rates between spring and autumn. Implications for the seasonal effect on TEOAE screening outcomes for infants and schoolchildren are discussed.
Abstract:
Objectives: To determine (i) factors which predict whether patients hospitalised with acute myocardial infarction (AMI) receive care discordant with recommendations of clinical practice guidelines; and (ii) whether such discordant care results in worse outcomes compared with receiving guideline-concordant care. Design: Retrospective cohort study. Setting: Two community general hospitals. Participants: 607 consecutive patients admitted with AMI between July 1997 and December 2000. Main outcome measures: Clinical predictors of discordant care; crude and risk-adjusted rates of in-hospital mortality and reinfarction, and mean length of hospital stay. Results: At least one treatment recommendation for AMI was applicable for 602 of the 607 patients. Of these patients, 411 (68%) received concordant care and 191 (32%) discordant care. Positive predictors at presentation of discordant care were age > 65 years (odds ratio [OR], 2.5; 95% CI, 1.7-3.6), silent infarction (OR, 2.7; 95% CI, 1.6-4.6), anterior infarction (OR, 2.5; 95% CI, 1.7-3.8), a history of heart failure (OR, 6.3; 95% CI, 3.7-10.7), chronic atrial fibrillation (OR, 3.2; 95% CI, 1.5-6.4), and heart rate ≥ 100 beats/min (OR, 2.1; 95% CI, 1.4-3.1). Death occurred in 12.0% (23/191) of discordant-care patients versus 4.6% (19/411) of concordant-care patients (adjusted OR, 2.42; 95% CI, 1.22-4.82). Mortality was inversely related to the level of guideline concordance (P = 0.03). Reinfarction rates also tended to be higher in the discordant-care group (4.2% v 1.7%; adjusted OR, 2.5; 95% CI, 0.90-7.1). Conclusions: Certain clinical features at presentation predict a higher likelihood of guideline-discordant care in patients presenting with AMI. Such care appears to increase the risk of in-hospital death.
Abstract:
Stress echocardiography has been shown to improve the diagnosis of coronary artery disease in the presence of hypertension, but its value in prognostic evaluation is unclear. We sought to determine whether stress echocardiography could be used to predict mortality in 2363 patients with hypertension, who were followed for up to 10 years (mean 4.0 ± 1.8) for death and revascularization. Stress echocardiograms were normal in 1483 patients (63%), 16% had resting left ventricular (LV) dysfunction alone, and 21% had ischemia. Abnormalities were confined to one territory in 489 patients (21%) and to multiple territories in 365 patients (15%). Cardiac death was less frequent among the patients able to exercise than among those undergoing dobutamine echocardiography (4% versus 7%, P<0.001). The risk of death in patients with a negative stress echocardiogram was <1% per year. Ischemia identified by stress echocardiography was an independent predictor of mortality in those able to exercise (hazard ratio 2.21, 95% confidence interval 1.10 to 4.43, P=0.0001) as well as those undergoing dobutamine echocardiography (hazard ratio 2.39, 95% confidence interval 1.53 to 3.75, P=0.0001); other predictors were age, heart failure, resting LV dysfunction, and the Duke treadmill score. In stepwise models replicating the sequence of clinical evaluation, the results of stress echocardiography added prognostic power to models based on clinical and stress-testing variables. Thus, the results of stress echocardiography are an independent predictor of cardiac death in hypertensive patients with known or suspected coronary artery disease, incremental to clinical risks and exercise results.
Abstract:
Objective: First, to assess the clinical effectiveness of hylan G-F 20 in an appropriate care treatment regimen (as defined by the American College of Rheumatology (ACR) 1995 guidelines) as measured by validated disease-specific outcomes and health-related quality of life endpoints for patients with osteoarthritis (OA) of the knee. Second, to utilize the measures of effectiveness and costs in an economic evaluation (see accompanying manuscript). Design: A total of 255 patients with OA of the knee were enrolled by rheumatologists or orthopedic surgeons into a prospective, randomized, open-label, 1-year, multi-centred trial, conducted in Canada. Patients were randomized to 'Appropriate care with hylan G-F 20' (AC+H) or 'Appropriate care without hylan G-F 20' (AC). Data were collected at clinic visits (baseline, 12 months) and by telephone (1, 2, 4, 6, 8, 10, and 12 months). Results: The AC+H group was superior to the AC group for all primary (% reduction in mean Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain scale: 38% vs 13%, P=0.0001) and secondary effectiveness outcome measures. These differences were all statistically significant and exceeded the 20% difference between groups set a priori by the investigators as the minimum clinically important difference. Health-related quality of life improvements in the AC+H group were statistically superior for the WOMAC pain, stiffness and physical function (all P
Abstract:
Canola (Brassica napus L.) and sunflower (Helianthus annuus L.), two important oilseed crops, are sensitive to low boron (B) supply. Symptoms of B deficiency are often more severe during the reproductive stage, but it is not known whether this is due to a decreased external B supply with time or an increased sensitivity to low B during this stage. Canola and sunflower were grown for 75 days after transplanting (DAT) in two solution culture experiments using Amberlite (IRA-743) B-specific resin to maintain constant B concentration in solution over the range 0.6–53 µM. Initially, the vegetative growth of both crops was good in all treatments. With the onset of the reproductive stage, however, severe B deficiency symptoms developed and growth of canola and sunflower was reduced at ≤0.9 and ≤0.7 µM B, respectively. At these concentrations, reproductive parts failed to develop. The critical B concentration (i.e. 90% of maximum shoot dry matter yield) in the youngest opened leaf was 18 mg kg⁻¹ in canola and 25 mg kg⁻¹ in sunflower at 75 DAT. The results of this study indicate that the reproductive stage of these two oilseed crops is more sensitive than the vegetative stage to low B supply.