993 results for feeding management
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The objective of this study was to investigate whether increased dietary water content and feeding frequency increased voluntary physical activity of young, lean adult female cats. A replicated 4 x 4 Latin square design with a 2 x 2 factorial treatment arrangement (feeding frequency and water content) was used. The 4 treatments consisted of 1 meal daily of dry pet food without added water (1D; 12% moisture as is), 1 meal daily of dry pet food with added water (1W; 70% total water content), 4 meals daily of dry pet food without added water (4D; 12% moisture as is), and 4 meals daily of dry pet food with added water (4W; 70% total water content). Eight healthy, lean, intact, young adult female domestic shorthair cats were used in this experiment. Voluntary physical activity was evaluated using Actical activity monitors placed on collars and worn around the cats' necks for the last 7 d of each 14-d experimental period. Food anticipatory activity (FAA) was calculated from the 2 h prior to feeding periods and expressed as a percentage of total daily voluntary physical activity. Increased feeding frequency (4 vs. 1 meal daily) resulted in greater average daily activity (P = 0.0147), activity during the light period (P = 0.0023), and light:dark activity ratio (P = 0.0002). In contrast, physical activity during the dark period was not altered by feeding frequency (P > 0.05). Cats fed 4 meals daily had increased afternoon FAA (P = 0.0029) compared with cats fed once daily. Dietary water content did not affect any measure of voluntary physical activity. Increased feeding frequency is an effective strategy to increase the voluntary physical activity of cats and thus may assist in the prevention and management of obesity.
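The FAA measure described above reduces to simple window arithmetic: sum the activity counts recorded during the 2 h before each meal and express that sum as a percentage of the day's total counts. A minimal Python sketch of that calculation follows; the per-minute counts, meal times, and function name are hypothetical illustrations, not the study's actual code.

```python
# Hypothetical sketch of the FAA calculation described in the abstract:
# activity in the 2 h windows before each meal, as a percentage of total
# daily activity. Epoch length and data are assumptions, not study code.

def faa_percent(activity_counts, feeding_minutes, window_min=120):
    """activity_counts: per-minute activity counts for one day (1440 values).
    feeding_minutes: meal times as minutes from midnight."""
    total = sum(activity_counts)
    if total == 0:
        return 0.0
    pre_meal = 0.0
    for meal in feeding_minutes:
        start = max(0, meal - window_min)
        pre_meal += sum(activity_counts[start:meal])
    return 100.0 * pre_meal / total

# With uniform activity and 4 non-overlapping pre-meal windows,
# FAA = 4 * 120 / 1440 = 33.3% of daily activity.
uniform_day = [1.0] * 1440
print(faa_percent(uniform_day, [480, 720, 960, 1200]))
```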
Abstract:
This NebGuide describes the life cycle of the army cutworm and pale western cutworm, and provides recommendations for management. The army cutworm, Euxoa auxiliaris, and the pale western cutworm, Agrotis orthogonia, are sporadic pests that are distributed throughout the Great Plains. The army cutworm can be found throughout Nebraska, but is more common in the western half of the state. Because of the drier environment, the pale western cutworm is found only in the western third of Nebraska. Both cutworms can feed on a vast array of crops and weeds. Their major economic impact is limited to winter wheat and alfalfa, because these are the vulnerable crops growing in the early spring when larval feeding activity occurs. However, they can also cause substantial damage to early spring row crops (sugarbeets and corn), especially in areas where winter cereal cover crops are used.
Abstract:
In developing countries, the prolonged intensive medical and nursing care required by many patients places extra demands on an already stretched healthcare budget. The purpose of this study was to verify the effectiveness of a systematic rehabilitative program for swallowing and oral-motor movements in intensive care unit (ICU) patients with a diagnosis of tetanus. Forty-five patients who were clinically diagnosed with tetanus were included in the study. Participants were divided into two groups: GI consisted of 18 tetanus patients who were consecutively admitted to the infectious disease ICU from January 2002 to December 2005, prior to the existence of a systematic swallowing and oral-motor intervention; GII consisted of 27 tetanus patients who were consecutively admitted to the infectious disease ICU from January 2006 to December 2009 and underwent specific rehabilitative management of swallowing and oral-motor movements. Results indicate that the proposed rehabilitative program reduced by approximately 50% the time patients remained in the ICU. The significant improvement observed in patients with tetanus who underwent the rehabilitative program for swallowing and oral-motor movements occurred in conjunction with a reduction in the time necessary to reintroduce oral feeding, to decannulate, and to remove the feeding tubes. In conclusion, swallowing/muscle exercise in patients with severe/very severe tetanus seems to promote the remission of muscle tension and to maximize functional swallowing.
Abstract:
Colostrum feeding in small ruminants is crucial during the first hours after birth due to the lack of Ig transfer via the placenta during pregnancy. In addition, the immature immune system of the neonate is slow to produce its own Ig during the first weeks of life. Colostrogenesis, i.e. the transfer of Ig from blood into mammary secretions, starts several weeks prepartum. In goat plasma, immunoglobulin G (IgG) concentration decreases by around 38% from the third month of gestation until parturition, which coincides with the dry period. Thus, management during the dry period is crucial for the course of colostrogenesis. Colostrum synthesis is determined by nutrition during the prepartum period, but the transfer of Ig is apparently independent of nutritional influences. The administration of conjugated linoleic acid to dairy goats during the dry period caused a less pronounced decrease of blood plasma IgG concentration (6%) but did not change colostral IgG levels. In cattle, IgG1 is transported from blood into colostrum during colostrogenesis by an IgG1-specific receptor located on the surface of alveolar epithelial cells, and this is most likely similar in small ruminants. Via inactivation of this receptor, Ig transfer is downregulated by increasing prolactin (PRL) during lactogenesis. It was recently observed that in goats treated with PGF2 alpha to induce parturition, lower colostrum IgG concentrations occurred concomitantly with an earlier increase of plasma PRL compared with untreated animals. The effects of litter size and number of lactations on colostral IgG concentration in small ruminants have not yet been fully clarified, most likely because of the different breeds used in the published studies.
Abstract:
Arterio-venous malformations (AVMs) are congenital vascular malformations (CVMs) that result from birth defects involving vessels of both arterial and venous origins, resulting in direct communications between vessels of different sizes or a meshwork of primitive reticular networks of dysplastic minute vessels that have failed to mature into 'capillary' vessels, termed the "nidus". These lesions are defined by shunting of high-velocity, low-resistance flow from the arterial vasculature into the venous system in a variety of fistulous conditions. Systematic classification systems developed by various groups of experts (the Hamburg, ISSVA, Schobinger, and angiographic classifications of AVMs) have resulted in a better understanding of the biology and natural history of these lesions and improved management of CVMs and AVMs. The Hamburg classification, based on the embryological differentiation between extratruncular and truncular types of lesions, allows the determination of the potential for progression and recurrence of these lesions. The majority of all AVMs are extratruncular lesions with persistent proliferative potential, whereas truncular AVM lesions are exceedingly rare. Regardless of the type, AV shunting may ultimately result in significant anatomical, pathophysiological and hemodynamic consequences. Therefore, despite their relative rarity (10-20% of all CVMs), AVMs remain the most challenging and potentially limb- or life-threatening form of vascular anomalies. The initial diagnosis and assessment may be facilitated by non- to minimally invasive investigations such as duplex ultrasound, magnetic resonance imaging (MRI), MR angiography (MRA), computerized tomography (CT) and CT angiography (CTA). Arteriography remains the diagnostic gold standard, and is required for planning subsequent treatment. A multidisciplinary team approach should be utilized to integrate surgical and non-surgical interventions for optimum care. Currently available treatments are associated with significant risk of complications and morbidity. However, an early aggressive approach to eliminate the nidus (if present) may be undertaken if the benefits exceed the risks. Trans-arterial coil embolization or ligation of feeding arteries where the nidus is left intact are incorrect approaches and may result in proliferation of the lesion. Furthermore, such procedures would prevent future endovascular access to the lesions via the arterial route. Surgically inaccessible, infiltrating, extratruncular AVMs can be treated with endovascular therapy as an independent modality. Among the various embolo-sclerotherapy agents, ethanol sclerotherapy produces the best long-term outcomes with minimal recurrence. However, this procedure requires extensive training and sufficient experience to minimize complications and associated morbidity. For surgically accessible lesions, surgical resection may be the treatment of choice, with a chance of optimal control. Preoperative sclerotherapy or embolization may supplement the subsequent surgical excision by reducing the morbidity (e.g. operative bleeding) and defining the lesion borders. Such a combined approach may provide an excellent potential for a curative result. Conclusion: AVMs are high-flow congenital vascular malformations that may occur in any part of the body. The clinical presentation depends on the extent and size of the lesion and can range from an asymptomatic birthmark to congestive heart failure.
Detailed investigations including duplex ultrasound, MRI/MRA and CT/CTA are required to develop an appropriate treatment plan. Appropriate management is best achieved via a multidisciplinary approach, and interventions should be undertaken by appropriately trained physicians.
Abstract:
OBJECTIVE: The purpose of this study was to evaluate, in relation to intraoperative estimated blood loss (EBL), the effectiveness of preoperative transcatheter arterial embolization of hypervascular osseous metastatic lesions before orthopedic resection and stabilization. MATERIALS AND METHODS: Between June 1987 and November 2007, 22 patients underwent transcatheter arterial embolization of tumors of the long bone, hip, or vertebrae before resection and stabilization. Osseous metastatic lesions from renal cell carcinoma, malignant melanoma, leiomyosarcoma, and prostate cancer were embolized. All patients were treated with a coaxial catheter technique with polyvinyl alcohol (PVA) particles alone or a combination of PVA particles and coils. After embolization, each tumor was angiographically graded according to devascularization (grades 1-3) based on tumor blush after contrast injection into the main tumor-feeding arteries. RESULTS: In patients with complete devascularization (grade 1), mean EBL was 1,119 mL, whereas in patients with partial devascularization (grades 2 and 3) it was 1,788 mL and 2,500 mL, respectively. With respect to intraoperative EBL, no significant difference between devascularization grades was found (p > 0.05). A moderate correlation (r = 0.51, p = 0.019) was observed between intraoperative EBL and tumor size before embolization. Only a low correlation (r = 0.44, p = 0.046) was found between intraoperative EBL and operating time. Major complications included transient palsy of the sciatic nerve and a gluteal abscess in one patient. CONCLUSION: The results of this study support the concept that there is no statistically significant difference among amounts of intraoperative EBL with varying degrees of embolization.
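The two correlations reported above (EBL vs. tumor size, EBL vs. operating time) are ordinary Pearson product-moment correlations on paired per-patient measurements. A minimal sketch of how such a value could be computed, using hypothetical data rather than the study's actual measurements:

```python
# Hypothetical sketch: Pearson correlation between intraoperative EBL and
# preoperative tumor size, the statistic reported in the abstract
# (r = 0.51, p = 0.019 there). These numbers are made up for illustration.
from scipy.stats import pearsonr

ebl_ml   = [800, 1200, 950, 2500, 1400, 600, 1800, 1100]  # estimated blood loss (mL)
tumor_cm = [3.1, 4.5, 3.8, 7.2, 5.0, 2.6, 6.1, 4.0]       # tumor size (cm)

r, p = pearsonr(tumor_cm, ebl_ml)
print(f"r = {r:.2f}, p = {p:.3f}")
```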
Abstract:
Chronic pancreatitis (CP) is an inflammatory disorder that results in permanent impairment of the glandular anatomy of the pancreas, with or without functional abnormalities. The pathogenesis of CP is usually unclear, except in the case of alcohol-induced disease. The most common symptoms of CP are abdominal pain, diarrhea, and weight loss, often requiring recurring hospitalization. Over time, pancreatic endocrine and exocrine dysfunction may develop as the disease progresses, and a variety of complications can occur, among them nutrient malabsorption and diabetes mellitus. The treatment of CP is difficult and challenging for every physician. Relieving pain is the first step in treating CP; this symptom needs to be controlled, often with narcotics, which can cause dependence. Diarrhea usually indicates the presence of steatorrhea, which is often treated with a high-calorie, high-protein, low-fat diet to minimize symptoms of the underlying disease and to promote weight retention or gain. Pancreatic enzyme replacement therapy is used to combat maldigestion and malabsorption. Patients with diabetes may need insulin therapy for glycemic control. The use of parenteral nutrition for bowel rest is a standard approach in patients with symptomatic CP. The use of jejunal enteral feeding has recently been evaluated for efficacy in CP patients. The role of pancreatic endotherapy in the management of CP is evolving. Several reports have suggested that endoscopic therapy aimed at decompressing the obstructed pancreatic duct can be associated with pain relief in some patients. Surgery should be considered in patients who fail medical therapy.
Abstract:
Animal production, hay production and feeding, winter forage composition changes, and summer pasture yields and nutrient composition of a year-round grazing system for spring-calving and fall-calving cows were compared to those of a conventional, minimal land system. Cows in the year-round and minimal land systems grazed forage from smooth bromegrass-orchardgrass-birdsfoot trefoil (SB-O-T) pastures at 1.67 and 3.33 acres, respectively, per cow in the summer. During the summer, SB-O-T pastures in the year-round grazing system also were grazed by stockers at 1.67 stockers per acre, and spring-calving and fall-calving cows grazed smooth bromegrass-red clover (SB-RC) and endophyte-free tall fescue-red clover (TF-RC) at 2.5 acres per cow for approximately 45 days in midsummer. In the year-round grazing system, spring-calving cows grazed corn crop residues at 2.5 acres per cow and stockpiled SB-RC pastures at 2.5 acres per cow; fall-calving cows grazed stockpiled TF-RC pastures at 2.5 acres per cow during winter. In the minimal land system, in winter, cows were maintained in a drylot on first-cutting hay harvested from 62.5-75% of the pasture acres during summer. Hay was fed to maintain a body condition score of 5 on a 9-point scale for spring-calving cows in both systems and a body condition score of 3 for fall-calving cows in the year-round system. Over 3 years, mean body weights of fall-calving cows in the year-round system did not differ from the body weights of spring-calving cows in either system, but fall-calving cows had higher (P < .05) body condition scores compared to spring-calving cows in either system. There were no differences among all groups of cows in body condition score changes over the winter grazing season (P > .05). During the summer grazing season, fall-calving cows in the year-round system and spring-calving cows in the minimal land system gained more body condition and more weight (P < .05) than spring-calving cows in the year-round grazing system. Fall calves in the year-round system had higher birth weights, lower weaning weights, and lower average preweaning daily gains compared to either group of spring calves (P < .05). However, there were no significant differences in birth weights, weaning weights, or average preweaning daily gains between spring calves in either system over the 3-year experiment (P > .05). The amount of total growing animal production (calves and stockers) per acre for each system did not differ in any year (P > .05). Over the 3-year experiment, 1.9 tons more hay was fed per cow and 1 ton more hay was fed per cow-calf pair in the minimal land system compared to the year-round grazing system (P < .05).
Abstract:
A year-round grazing system for spring- and fall-calving cows was developed to compare animal production and performance, hay production and feeding, winter forage composition changes, and summer pasture yield and nutrient composition with those of a conventional, minimal land system. The systems compared forage from smooth bromegrass-orchardgrass-birdsfoot trefoil pastures for both systems in the summer, and corn crop residues and stockpiled grass-legume pastures for the year-round system versus drylot hay feeding during winter for the minimal land system. The year-round grazing system utilized 1.67 acres of smooth bromegrass-orchardgrass-birdsfoot trefoil (SB-O-T) pasture per cow in the summer, compared with 3.33 acres of SB-O-T pasture per cow in the control (minimal land) system. In addition to SB-O-T pastures, the year-round grazing system utilized 2.5 acres of tall fescue-red clover (TF-RC) and 2.5 acres of smooth bromegrass-red clover (SB-RC) per cow for grazing in both mid-summer and winter for fall- and spring-calving cows, respectively. First-cutting hay was harvested from the TF-RC and SB-RC pastures, and regrowth was grazed for approximately 45 days in the summer. These pastures were then fertilized with 40 lbs N/acre and stockpiled for winter grazing. Also utilized during the winter for spring-calving cows in the year-round grazing system were corn crop residue (CCR) pastures at an allowance of 2.5 acres per cow. In the minimal land system, hay was harvested from three-fourths of the area in SB-O-T pastures and stored for feeding in a drylot through the winter. Summer grazing was managed with rotational stocking for both systems, and winter grazing of stockpiled forages and corn crop residues by year-round system cows was managed by strip-stocking. Hay was fed to maintain a body condition score of 5 on a 9-point scale for spring-calving cows in both systems. Hay was supplemented as needed to maintain a body condition score of 3 for fall-calving cows nursing calves through the winter. Although initial condition scores for cows in both systems differed at the initiation of grazing for both winter and summer, there were no significant differences (P > .05) in overall condition score changes throughout either grazing season. In year 1, fall-calving cows in the year-round grazing system lost more (P < .05) body weight during winter than spring-calving cows in either system. In year 2, no differences were seen in weight changes over winter for any group of cows. Average daily gains of fall calves in the year-round system were 1.9 lbs/day, compared with 2.5 lbs/day for spring calves from both systems. Yearly growing animal production from pastures for both years did not differ between systems when weight gains of stockers that grazed summer pastures in the year-round grazing system were added to weight gains of suckling calves. Carcass characteristics were similar for all calves finished in the feedlot from both systems. There were no significant differences in hay production between systems in year 1; however, the amounts of hay needed to maintain cows were 923, 1,373, and 4,732 lbs dry matter/cow for year-round fall-calving, year-round spring-calving, and minimal land spring-calving cows, respectively.
In year 2, hay production per acre in the minimal land system was greater (P < .05) than for the year-round system, but the amounts of hay required per cow were 0, 0, and 4,720 lbs dry matter/cow for year-round fall-calving, year-round spring-calving, and minimal land spring-calving cows, respectively.
Abstract:
The aim of this study was to document experience gained with herd health management in veal calf production and to describe the calves' most frequent health problems. Fifteen farms with an 'all-in-all-out' animal flow system and 20 farms with a continuous animal flow system were investigated, and data on animal movements, housing, feeding, medical treatments, and management were collected. Cadavers underwent pathological examination, and data were recorded from the carcasses of slaughtered calves. On the 15 'all-in-all-out' farms, 2,747 calves were clinically examined by the contract veterinarian upon arrival at the farm, and 71.1% of the calves showed at least one sign of illness. The main causes of death were digestive disorders (54.9%; a perforating abomasal ulcer being the most frequent diagnosis), followed by respiratory diseases (29.6%, mainly pneumonia). The meat color of 25% of the carcasses was red. Calves from farms with the continuous animal flow system, which recruit mainly animals originating from the same farm, showed significantly better results regarding antibiotic use, performance, and carcass quality than calves from farms with the 'all-in-all-out' system.
Abstract:
Background. The increasing emphasis on medical outcomes and cost containment has made it imperative to identify patient populations in which aggressive nutritional care can improve quality of care. The aim of this prospective study was to implement a standardized early jejunal feeding protocol for patients undergoing small and large bowel resection, and to evaluate its effect on patient outcome and cost. Methods. Treatment patients (n = 81) who met protocol inclusion criteria had a jejunal feeding tube inserted at the time of surgery. Feeding was initiated at 10 cc/hour within 12 hours after bowel resection and progressed if the patient was hemodynamically stable. The control group (n = 159) received usual care. Outcome measures included postoperative length of stay, total direct cost, nosocomial infection rate, and health status (SF-36) scores. Results. By postoperative day 4, the use of total parenteral nutrition (TPN) was significantly greater in the control group compared to the treatment group; however, total nutritional intake was significantly less. Multiple regression analysis indicated an increased likelihood of infection with the use of TPN. A reduction of 3.5 postoperative days (p = .013), 4.3 fewer TPN days per patient (p = .001), and a 9.6% reduction in infection rate (p = .042) were demonstrated in the treatment group. There was no difference in health status scores between groups at discharge and 3 months post-discharge. Conclusion. These positive outcomes and an average total cost savings of $4,145 per treatment patient indicate that the treatment protocol was effective.
Abstract:
It is estimated that N losses from fertilized crops range between 50% and 70%, depending on management practices, climate, and soil conditions. Ammonia (NH3) emissions following land application of animal manures account for a significant proportion of total NH3 emissions from agricultural sources.
Abstract:
Six Bos taurus (Hereford) steers (body weight 324 ± 22 kg) were used in a 45-day study with a replicated 3 x 3 Latin-square design. Three treatments [ad libitum feeding (ADLIB); limit feeding, 85% of ad libitum (LIMIT); bunk management feeding, where steers were only given access to feed from 1600 to 0800 hours the following day (BUNK)] were imposed over 3 periods, with 2 steers assigned to each treatment in each period. Cattle were managed in a temperature-controlled metabolism unit and were exposed to both thermoneutral (17.7°C-26.1°C) and hot (16.7°C-32.9°C) environmental conditions. By design, during the thermoneutral period, the ADLIB cattle displayed greater intake (P < 0.05) than the LIMIT group, with the BUNK group being intermediate. However, during the hot period, both the LIMIT and BUNK treatment groups increased feed intake 4-5%, whereas feed intake of the ADLIB treatment group declined nearly 2%. During both periods, respiration rate (RR, breaths/min) followed the same pattern that was observed for feed intake, with the greatest (P < 0.05) RR found in the ADLIB treatment group (81.09 and 109.55, thermoneutral and hot, respectively) and the lowest (P < 0.05) RR in the LIMIT treatment group (74.47 and 102.76, thermoneutral and hot, respectively). Rectal temperature (RT) did not differ among treatments during the thermoneutral period or the first hot day, although during the thermoneutral period the ADLIB treatment group did tend to display a lower RT, possibly as a result of other physiological processes (pulse rate and RR) helping to keep RT lower. During the hot period, differences in RT were found on Day 5, with the LIMIT cattle having lower (P < 0.10) RT (38.92°C) than the ADLIB cattle (39.18°C), with the BUNK cattle RT (39.14°C) being intermediate. However, when hourly data were examined, the ADLIB cattle had greater (P < 0.05) RT than the BUNK and LIMIT groups at 1800 hours and greater (P < 0.05) RT than the LIMIT group at 1400, 1500, and 1600 hours. Clearly, a change in diurnal RT pattern was obtained by using the LIMIT and BUNK feeding regimens. Both of these groups displayed a peak RT during the hot conditions between 2100 and 2200 hours, whereas the ADLIB group displayed a peak RT between 1400 and 1500 hours, a time very close to when peak climatic stress occurs. Based on these results, it is apparent that feedlot managers could alleviate the effects of adverse hot weather on cattle by utilising either a limit-feeding regimen or altering bunk management practices to prevent feed from being consumed several hours prior to the hottest portion of the day.
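The diurnal comparison above comes down to locating the hour at which mean RT peaks for each treatment. A small sketch of that peak-finding step, assuming hypothetical hourly RT means (the values below are illustrative placeholders, not the study's data):

```python
# Hypothetical sketch: find the hour of peak mean rectal temperature (RT)
# per feeding treatment, mirroring the diurnal-pattern comparison above.
# All temperatures below are illustrative placeholders.

hourly_rt = {
    "ADLIB": {14: 39.30, 15: 39.28, 18: 39.20, 21: 39.02, 22: 39.00},
    "LIMIT": {14: 38.90, 15: 38.92, 18: 38.95, 21: 39.10, 22: 39.12},
    "BUNK":  {14: 39.00, 15: 39.02, 18: 39.05, 21: 39.12, 22: 39.13},
}

for treatment, series in hourly_rt.items():
    peak_hour = max(series, key=series.get)  # hour with the highest mean RT
    print(f"{treatment}: peak RT {series[peak_hour]:.2f} C at {peak_hour:02d}00 hours")
```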
Abstract:
Wildlife feeding is a widespread and controversial practice that can pose serious threats to the safety of both wildlife and visitors. The design and effectiveness of warning signs in recreational areas vary considerably and are rarely the product of theoretical models or scientific research. This study uses front-end and formative evaluation to design and test the perceived effectiveness of warning signs relating to bird feeding. Stage One examined visitors' beliefs, attitudes, and bird feeding behaviour and found significant differences between feeders and non-feeders. Stage Two involved designing and evaluating three signs that built on the beliefs, knowledge, and (mis)conceptions identified in Stage One. Respondents thought the sign that focused on the birds' health and safety would be the most persuasive; however, elements of the other two signs were also positively evaluated. The article concludes with recommendations for the wording of future bird feeding warning signs.