997 results for Farm life.


Relevance:

30.00%

Publisher:

Abstract:

Despite research showing the benefits of glycemic control, it remains suboptimal among adults with diabetes in the United States. Possible reasons include unaddressed risk factors as well as a lack of awareness of its immediate and long-term consequences. The objectives of this study were, using cross-sectional data, to (1) ascertain the association between suboptimal (Hemoglobin A1c (HbA1c) ≥7%), borderline (HbA1c 7-8.9%), and poor (HbA1c ≥9%) glycemic control and potentially new risk factors (e.g. work characteristics), and (2) assess whether aspects of poor health and well-being such as poor health-related quality of life (HRQOL), unemployment, and missed work are associated with glycemic control; and (3) using prospective data, to assess the relationship between mortality risk and glycemic control in US adults with type 2 diabetes. Data from the 1988-1994 and 1999-2004 National Health and Nutrition Examination Surveys were used. HbA1c values were used to create dichotomous glycemic control indicators. Binary logistic regression models were used to assess relationships between risk factors, employment status, and glycemic control. Multinomial logistic regression analyses were conducted to assess relationships between glycemic control and HRQOL variables. Zero-inflated Poisson regression models were used to assess relationships between missed work days and glycemic control. Cox proportional hazards models were used to assess effects of glycemic control on mortality risk. Using STATA software, analyses were weighted to account for the complex survey design and non-response. Multivariable models adjusted for socio-demographics and body mass index, among other variables. Results revealed that being a farm worker and working over 40 hours/week were risk factors for suboptimal glycemic control. Having more days of poor mental health was associated with suboptimal, borderline, and poor glycemic control. Having more days of inactivity was associated with poor glycemic control, while having more days of poor physical health was associated with borderline glycemic control. There were no statistically significant relationships between glycemic control and self-reported general health, employment, or missed work. Finally, having an HbA1c value less than 6.5% was protective against mortality. The findings suggest that work-related factors are important in a person's ability to reach optimal diabetes management levels. Poor glycemic control appears to have significant detrimental effects on HRQOL.
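As a minimal sketch, the dichotomous glycemic control indicators described in the abstract could be derived from an HbA1c value as below. The function name and dictionary keys are illustrative, not taken from the study's code; only the thresholds come from the abstract:

```python
def glycemic_indicators(hba1c):
    """Map an HbA1c value (%) to the three dichotomous control
    indicators: suboptimal (>=7%), borderline (7-8.9%), poor (>=9%)."""
    return {
        "suboptimal": hba1c >= 7.0,
        "borderline": 7.0 <= hba1c <= 8.9,
        "poor": hba1c >= 9.0,
    }
```

Note that borderline and poor control are subsets of suboptimal control, which is why the study models each indicator separately rather than as one categorical variable.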

Relevance:

30.00%

Publisher:

Abstract:

[Excerpt] In this chapter, we draw on both popular media and research evidence, along with anecdotal examples from conversations accumulated during our own prior studies. Our goal is to present reminders that working hours are a personal life choice, even in the face of external demands, but a choice that is influenced by elements of the individual's working situation. The implications of choosing long working hours are shown through two past "hard working" icons from popular media, one from the 1940s and one from the 1980s. The discussion then turns to the present day, with an overview of advances in technology that provide expanded work opportunities but also exacerbate tendencies toward work addiction.

Relevance:

30.00%

Publisher:

Abstract:

Although diarrhoea caused by Cryptosporidium is prevalent in livestock species throughout the world, relatively little is known about the species and subtypes of Cryptosporidium found in cattle on Scottish farms. In particular, little is known about the shedding profiles (the age at which calves become infected and the duration of shedding) of the different species found in cattle, and how calves become infected. There are several theories about how neonatal calves first become infected with the parasite, but the role that adult cattle play in its transmission has not been fully addressed. It was previously thought that adult cattle did not become infected with the same species of Cryptosporidium that causes disease in young calves. Some studies have shown that this may not be true, and with the advance of new techniques to discriminate species, this is an area that should be revisited. In addition, it is known that humans can become infected with Cryptosporidium and show clinical disease early in life and then again later in adulthood. In livestock, however, diarrhoea caused by the parasite is generally only seen in neonates, while older animals tend to be asymptomatic. It is not known whether this resistance to clinical disease at an older age is due to changes in the host with increasing age, or whether prior infection "immunises" the animal and provides protection against re-infection. It is also not known whether infection with one isolate of C. parvum will provide protection against infection with another, or whether any protection formed is species/isolate specific. 
The main aims of this thesis were to: determine the species and subtypes of Cryptosporidium found in calves on a study farm over a one-year period from birth; assess the role that adult cattle play in the transmission of the parasite to newborn calves; develop new typing tools to enable the rapid and easy differentiation of Cryptosporidium species found in cattle; and examine the host-pathogen interactions in animals given serial experimental challenges with distinct Cryptosporidium parvum isolates, to determine whether the resistance seen in older animals on farms is due to an increase in age or is a result of prior infection. A variety of approaches were taken to achieve these aims. Longitudinal experiments carried out on a study farm revealed that in calves <9 weeks of age the most common species of Cryptosporidium is C. parvum, and that all calves in the group became infected with Cryptosporidium within the first two weeks of life. Sample collection from the same animals later in life (at 6 months of age) showed that, contrary to most previous studies, the most common species detected in this age group was also C. parvum, although, interestingly, the subtype the calves were shedding was not the same subtype they had shed previously. The longitudinal study investigating the role of adult cattle in the transmission of Cryptosporidium also yielded some interesting results. Most of the adult cattle on this farm were found to be shedding Cryptosporidium, albeit intermittently. Speciation of the positive samples revealed that, on this farm, the predominant species of Cryptosporidium in adult cattle was also C. parvum. This is very unusual, as most previous studies have not found this level of infection in older cattle, and C. parvum is not usually found in this age group. A number of different subtypes were found in adult cattle, and some animals shed more than one subtype over the course of the study. This contradicts prior findings which demonstrated that only one subtype is found on a single farm. 
The experimental infection trial, involving infection of young (<1 week old) and older (6 week old) lambs with distinct C. parvum isolates, demonstrated that an increase in age at primary infection reduces the severity of clinical disease. Animals infected at <1 week of age were re-challenged at 6 weeks of age with either a homologous or a heterologous isolate. Results revealed that previous exposure does not protect against re-infection with the same or a different isolate of C. parvum. This study also demonstrated that an increase in infective dose leads to a shorter pre-patent period, and that there are variations in the clinical manifestations of different isolates of the same Cryptosporidium species.

Relevance:

30.00%

Publisher:

Abstract:

Grain finishing of cattle has become increasingly common in Australia over the past 30 years. However, interest in the associated environmental impacts and resource use is increasing and requires detailed analysis. In this study we conducted a life cycle assessment (LCA) to investigate impacts of the grain-finishing stage for cattle in seven feedlots in eastern Australia, with a particular focus on the feedlot stage, including the impacts from producing the ration, feedlot operations, transport, and livestock emissions while cattle are in the feedlot (gate-to-gate). The functional unit was 1 kg of liveweight gain (LWG) for the feedlot stage, and results are also included for the full supply chain (cradle-to-gate), reported per kilogram of liveweight (LW) at the point of slaughter. Three classes of cattle produced for different markets were studied: short-fed domestic market (55–80 days on feed), mid-fed export (108–164 days on feed) and long-fed export (>300 days on feed). In the feedlot stage, mean fresh water consumption was found to vary from 171.9 to 672.6 L/kg LWG, and mean stress-weighted water use ranged from 100.9 to 193.2 water stress index eq. L/kg LWG. Irrigation contributed 57–91% of total fresh water consumption, with differences mainly related to the availability of irrigation water near the feedlot and the use of irrigated feed inputs in rations. Mean fossil energy demand ranged from 16.5 to 34.2 MJ (lower heating value)/kg LWG, and arable land occupation from 18.7 to 40.5 m2/kg LWG in the feedlot stage. Mean greenhouse gas (GHG) emissions in the feedlot stage ranged from 4.6 to 9.5 kg CO2-e/kg LWG (excluding land use and direct land-use change emissions). Emissions were dominated by enteric methane and by contributions from the production, transport and milling of feed inputs. Linear regression analysis showed that the feed conversion ratio explained >86% of the variation in GHG intensity and energy demand. 
The feedlot stage contributed between 26% and 44% of total slaughter weight for the classes of cattle fed, whereas its contribution to resource use varied from 4% to 96%, showing that impacts from the finishing phase varied considerably compared with breeding and backgrounding. GHG emissions and total land occupation per kilogram of LWG during the grain-finishing phase were lower than emissions from breeding and backgrounding, resulting in lower lifetime emissions for grain-finished cattle compared with grass finishing.
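The reported relationship between feed conversion ratio (FCR) and GHG intensity can be sketched as an ordinary least-squares fit across feedlots. The numbers below are purely illustrative placeholders spanning the reported GHG range, not the study's data, and the variable names are our own:

```python
import numpy as np

# Illustrative (fabricated) per-feedlot values: FCR in kg feed dry matter
# per kg LWG, and GHG intensity in kg CO2-e per kg LWG.
fcr = np.array([5.5, 6.0, 6.8, 7.5, 8.2, 9.0, 10.5])
ghg = np.array([4.6, 5.2, 6.0, 6.7, 7.4, 8.1, 9.5])

slope, intercept = np.polyfit(fcr, ghg, 1)    # linear fit: ghg ~ fcr
r_squared = np.corrcoef(fcr, ghg)[0, 1] ** 2  # share of variation explained
```

With real per-feedlot data, an r_squared above 0.86 would correspond to the study's finding that FCR explains most of the between-feedlot variation in GHG intensity.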

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To evaluate the scored Patient-Generated Subjective Global Assessment (PG-SGA) tool as an outcome measure in clinical nutrition practice and determine its association with quality of life (QoL). DESIGN: A prospective 4-week study assessing the nutritional status and QoL of ambulatory patients receiving radiation therapy to the head, neck, rectal or abdominal area. SETTING: Australian radiation oncology facilities. SUBJECTS: Sixty cancer patients aged 24-85 y. INTERVENTION: Scored PG-SGA questionnaire, subjective global assessment (SGA), QoL (EORTC QLQ-C30 version 3). RESULTS: According to SGA, 65.0% (39) of subjects were well nourished, 28.3% (17) were moderately malnourished or suspected of being malnourished, and 6.7% (4) were severely malnourished. PG-SGA score and global QoL were correlated (r=-0.66, P<0.001) at baseline. There was a decrease in nutritional status according to PG-SGA score (P<0.001) and SGA (P<0.001), and a decrease in global QoL (P<0.001), after 4 weeks of radiotherapy. There was a linear trend for change in PG-SGA score (P<0.001) and change in global QoL (P=0.003) between patients who improved (5%), maintained (56.7%) or deteriorated (33.3%) in nutritional status according to SGA. There was a correlation between change in PG-SGA score and change in QoL after 4 weeks of radiotherapy (r=-0.55, P<0.001). Regression analysis determined that 26% of the variation in change in QoL was explained by change in PG-SGA score (P=0.001). CONCLUSION: The scored PG-SGA is a nutrition assessment tool that identifies malnutrition in ambulatory oncology patients receiving radiotherapy and can be used to predict the magnitude of change in QoL.
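The inverse correlation reported between change in PG-SGA score (higher = worse nutrition) and change in global QoL can be sketched with a Pearson correlation. The paired values below are illustrative only, not the study's data:

```python
import numpy as np

# Illustrative (fabricated) paired changes over 4 weeks of radiotherapy:
# a positive delta_pgsga means worsening nutritional status,
# a positive delta_qol means improving global quality of life.
delta_pgsga = np.array([2.0,  5.0, 1.0,   8.0, -1.0,  4.0,  6.0])
delta_qol   = np.array([-5.0, -20.0, 0.0, -33.0, 8.0, -12.0, -25.0])

r = np.corrcoef(delta_pgsga, delta_qol)[0, 1]  # Pearson correlation
variance_explained = r ** 2                    # cf. the reported 26%
```

A negative r here mirrors the study's r=-0.55: as PG-SGA scores rise (nutrition worsens), QoL tends to fall.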
