995 results for PERCENTAGE


Relevance: 10.00%

Abstract:

Background, aim, and scope: The retention of potentially toxic metals in highly weathered soils can follow different pathways that variably affect their mobility and availability in the soil-water-plant system. This study aimed to evaluate the effects of pH, nature of the electrolyte, and ionic strength of the solution on nickel (Ni) adsorption by two acric Oxisols and a less weathered Alfisol. Materials and methods: The effect of pH on Ni adsorption was evaluated in surface and subsurface samples from a clayey textured Anionic Rhodic Acrudox (RA), a sandy-clayey textured Anionic Xantic Acrudox (XA), and a heavy clayey textured Rhodic Kandiudalf (RK). All soil samples were equilibrated with the same concentration of Ni solution (5.0 mg L(-1)) and two electrolyte solutions (CaCl(2) or NaCl) with different ionic strengths (IS) (1.0, 0.1 and 0.01 mol L(-1)). The pH of each sample set varied from 3 to 10 in order to obtain sorption envelopes. Results and discussion: Ni adsorption increased as the pH increased, reaching its maximum near pH 6. Adsorption was highest in the Alfisol, followed by RA and XA. Competition between Ni(2+) and Ca(2+) was higher than that between Ni(2+) and Na(+) in all soil samples, as shown by the higher percentage of Ni adsorption at pH 5. At pH values below the intersection point of the three ionic strength curves (zero point of salt effect), Ni adsorption was generally higher in the more concentrated solution (highest IS), probably due to the neutralization of positive charges of soil colloids by Cl(-) ions and consequent adsorption of Ni(2+). Above this point, Ni adsorption was higher in the more diluted solution (lowest ionic strength), due to the higher negative potential at the colloid surfaces and the lower ionic competition for exchange sites in soil colloids. Conclusions: The effect of ionic strength was lower in the Oxisols than in the Alfisol. The main mechanism controlling Ni adsorption in these soils was ion exchange, since the adsorption of ionic species varied with pH. Ionic competition revealed the importance of electrolyte composition and ionic strength for Ni adsorption in soils from the humid tropics. Recommendations and perspectives: The presence of NaCl or CaCl(2) at different ionic strengths affects the availability of heavy metals in contaminated soils. Therefore, the study of heavy metal dynamics in highly weathered soils must consider this behavior, especially in soils with large amounts of acric components.
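
The abstract reports Ni adsorption as a percentage at each pH but does not spell out the calculation; in batch sorption experiments the adsorbed fraction is conventionally taken as the difference between the added and the equilibrium solution concentrations. A minimal sketch under that assumption (the function name and example values are illustrative, not from the study):

```python
def percent_adsorbed(c_added_mg_L: float, c_equilibrium_mg_L: float) -> float:
    """Percentage of Ni removed from solution by the soil sample (assumed difference method)."""
    return 100.0 * (c_added_mg_L - c_equilibrium_mg_L) / c_added_mg_L

# Example with the 5.0 mg/L Ni addition used in the study and a
# hypothetical equilibrium concentration of 1.2 mg/L:
print(f"{percent_adsorbed(5.0, 1.2):.1f}% adsorbed")  # 76.0% adsorbed
```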

Relevance: 10.00%

Abstract:

Florida Spodosols are sandy, inherently low in Fe- and Al-based minerals, and sorb phosphorus (P) poorly. We evaluated runoff and leachate P losses from a typical Florida Spodosol amended with biosolids and triple superphosphate (TSP). Phosphorus losses were evaluated with traditional indoor rainfall simulations but used a double-deck box arrangement that allowed leaching and runoff to be determined simultaneously. Biosolids (Lakeland, OCUD, Milorganite, and Disney) represented contrasting values of total P, percent water-extractable P (PWEP), and percentage of solids. All P sources were surface applied at 224 kg P ha(-1), representing a soil P rate typical of N-based biosolids application. All biosolids-P sources lost less P than TSP, and leachate-P losses generally dominated. For the Lakeland-amended soil, bioavailable P (BAP) was mainly lost by runoff (81% of total BAP losses). This behavior was due to surface sealing and drying after application of the slurry (31 g kg(-1) solids) material. For all other P sources, BAP losses in leachate were much greater than in runoff, representing 94% of total BAP losses for TSP, 80% for Milorganite, 72% for Disney, and 69% for OCUD treatments. Phosphorus leaching can be extreme and represents a great concern in many coarse-textured Florida Spodosols and other coastal plain soils with low P-sorption capacities. The PWEP values of P sources were significantly correlated with total P and BAP losses in runoff and leachate. The PWEP of a source can serve as a good indicator of potential P loss when it is applied to sandy soils with low P-retention capacities.

Relevance: 10.00%

Abstract:

The objective of this study was to investigate immunoglobulin G (IgG) and total serum protein (TP) acquisition in newborn Santa Ines lambs fed Holstein bovine or Santa Ines ovine colostrum, as well as the cell proliferation rate in the animals' intestinal epithelium. At 0 h and 6 h of life, 12 newborn lambs received 250 mL of bovine 1st-milking colostrum (BC) and another 12 animals received 250 mL of ovine 1st-milking colostrum (OC). Blood samples were collected at 0, 6, 24, and 72 h of life. Six animals were randomly slaughtered just after birth, without colostrum intake. The other animals were randomly slaughtered at 24 and 72 h. The IgG serum concentrations at 6, 24, and 72 h were significantly higher for BC (16.32 +/- 6.19, 33.80 +/- 5.68, and 27.95 +/- 5.46 mg/mL, respectively) than for OC (11.31 +/- 6.08, 21.02 +/- 6.53, and 19.88 +/- 7.31 mg/mL). BC showed higher (P < 0.05) TP values (7.29 +/- 0.87 and 6.89 +/- 0.30 g/100 mL) at 24 and 72 h in relation to OC (5.73 +/- 1.35 and 5.69 +/- 0.57 g/100 mL). At birth, the animals showed 32.52%, 45.47%, and 30.60% of cells in division in the duodenum, jejunum, and ileum, respectively. At 24 h, the OC animals showed a lower (P < 0.0001) mitotic cell percentage in the duodenum (42.12%) and ileum (35.66%) than the BC animals (46.44% and 39.74%, respectively). At 72 h, a lower (P < 0.0001) rate of proliferation was observed in the duodenum crypts of the OC animals (36.28%) compared with BC (43.18%). The results indicate that this lacteal secretion can accelerate the epithelium renewal process and can be used as an alternative source of IgG for newborn lambs. (C) 2009 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

Data on fertilisation and embryo quality in dairy cattle are presented, and the main factors responsible for the low fertility of single-ovulating lactating cows and for embryo yield in superovulated dairy cattle are highlighted. During the past 50 years, fertility in high-producing lactating dairy cattle has decreased as milk production has increased. Recent data show conception rates to first service to be approximately 32% in lactating cows, whereas in heifers they have remained above 50%. Fertilisation does not seem to be the principal factor responsible for the low fertility of single-ovulating cows, because fertilisation rates have remained above 80%. Conversely, early embryonic development is impaired in high-producing dairy cows, as most embryonic losses occur during the first week after fertilisation. However, in superovulated dairy cattle, although fertilisation failure is more pronounced, averaging approximately 45%, the percentage of fertilised embryos viable at 1 week is quite high (>70%). Among the multifactorial causes of low fertility in lactating dairy cows, high feed intake associated with low concentrations of circulating steroids may contribute substantially to reduced embryo quality. Fertilisation failure in superovulated cattle may be a consequence of inappropriate gamete transport due to hormonal imbalances.

Relevance: 10.00%

Abstract:

Relationships between the chemical composition of the 9th- to 11th-rib section and the chemical composition of the carcass and empty body were evaluated for Bos indicus (108 Nellore and 36 Guzerah; GuS) and tropically adapted Bos taurus (56 Caracu; CaS) bulls, averaging 20 to 24 mo of age at slaughter. Nellore cattle were represented by 56 animals from the selected herd (NeS) and 52 animals from the control herd (NeC). The CaS and GuS bulls were from selected herds. Selected herds were based on 20 yr of selection for postweaning BW. Carcass composition was obtained after grinding, homogenizing, sampling, and analyzing soft tissue and bones. Similarly, empty body composition was obtained after grinding, homogenizing, sampling, analyzing, and combining blood, hide, head + feet, viscera, and carcass. Bulls were separated into 2 groups. Group 1 was composed of 36 NeS, 36 NeC, 36 CaS, and 36 GuS bulls and had water, ether extract (EE), protein, and ash chemically determined in the 9th- to 11th-rib section and in the carcass. Group 2 was composed of 20 NeS, 16 NeC, and 20 CaS bulls, and water, EE, protein, and ash were determined in the 9th- to 11th-rib section, carcass, and empty body. Linear regressions were developed between the carcass and the 9th- to 11th-rib section compositions for group 1 and between carcass and empty body compositions for group 2. The 9th- to 11th-rib section percentages of water (RWt) and EE (RF) predicted the percentages of carcass water (CWt) and carcass fat (CF) with high precision: CWt, % = 29.0806 + 0.4873 x RWt, % (r(2) = 0.813, SE = 1.06) and CF, % = 10.4037 + 0.5179 x RF, % (r(2) = 0.863, SE = 1.26), respectively. The percentages of empty body water (EBWt) and empty body fat (EBF) were likewise predicted from CWt and CF with high precision: EBWt, % = -9.6821 + 1.1626 x CWt, % (r(2) = 0.878, SE = 1.43) and EBF, % = 0.3739 + 1.0386 x CF, % (r(2) = 0.982, SE = 0.65), respectively. Chemical composition of the 9th- to 11th-rib section precisely estimated carcass percentages of water and EE. These regressions can accurately predict carcass and empty body compositions for Nellore, Guzerah, and Caracu breeds.
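
The four regression equations above can be chained to go from rib-section chemistry to empty body composition. The sketch below simply codes the published coefficients; the input rib-section values are hypothetical, for illustration only.

```python
# Published prediction equations (all values in percent).
def carcass_water(rib_water_pct):
    return 29.0806 + 0.4873 * rib_water_pct      # r2 = 0.813, SE = 1.06

def carcass_fat(rib_ee_pct):
    return 10.4037 + 0.5179 * rib_ee_pct         # r2 = 0.863, SE = 1.26

def empty_body_water(carcass_water_pct):
    return -9.6821 + 1.1626 * carcass_water_pct  # r2 = 0.878, SE = 1.43

def empty_body_fat(carcass_fat_pct):
    return 0.3739 + 1.0386 * carcass_fat_pct     # r2 = 0.982, SE = 0.65

# Hypothetical 9th- to 11th-rib section values (% water, % ether extract):
rib_water, rib_ee = 60.0, 20.0
cw, cf = carcass_water(rib_water), carcass_fat(rib_ee)
print(f"Carcass: {cw:.1f}% water, {cf:.1f}% fat")
print(f"Empty body: {empty_body_water(cw):.1f}% water, {empty_body_fat(cf):.1f}% fat")
```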

Relevance: 10.00%

Abstract:

The objectives of this study were to determine if percentage Bos taurus (0 or 50%) of the cow had an effect on ME requirements and milk production, and to compare cow/calf efficiency among 3 mating systems. Metabolizable energy requirements were estimated during a feeding trial that encompassed gestation and lactation for each of 2 groups of cows. Cows were 0 or 50% Bos taurus (100 or 50% Nellore) breed type: Nellore cows (NL; n = 10) mated to Nellore bulls, NL cows (n = 9) mated to Angus bulls, and Angus x Nellore (ANL; n = 10) and Simmental x Nellore (SNL; n = 10) cows mated to Canchim (5/8 Charolais, 3/8 Zebu) bulls. Cows were individually fed a total mixed diet that contained 11.3% CP and 2.23 Mcal of ME/kg of DM. At 14-d intervals, cows and calves were weighed and the amount of DM was adjusted to keep shrunk BW and BCS of cows constant. Beginning at 38 d of age, corn silage was available to calves ad libitum. Milk production at 42, 98, 126, and 180 d postpartum was measured using the weigh-suckle-weigh technique. At 190 d of age, calves were slaughtered and body composition was estimated using the 9-10-11th-rib section to obtain energy deposition. Regression of BW change on daily ME intake (MEI) was used to estimate MEI at zero BW change. Increase in percentage Bos taurus had a significant effect on daily ME requirements (Mcal/d) during pregnancy (P < 0.01) and lactation (P < 0.01). Percentage Bos taurus had a positive linear effect on maintenance requirements of pregnant (P = 0.07) and lactating (P < 0.01) cows; during pregnancy, the ME requirements were 91 and 86% of those in lactation (131 +/- 3.5 vs. 145 +/- 3.4 kcal.kg(-0.75).d(-1)) for the 0 and 50% B. taurus groups, respectively. The 50% B. taurus cows, ANL and SNL, suckling crossbred calves had greater total MEI (4,319 +/- 61 Mcal; P < 0.01) than 0% B. taurus cows suckling NL (3,484 +/- 86 Mcal) or ANL calves (3,600 +/- 91 Mcal). The 0% B. taurus cows suckling ANL calves were more efficient (45.3 +/- 1.6 g/Mcal; P = 0.03) than straightbred NL pairs (35.1 +/- 1.5 g/Mcal) and ANL or SNL pairs (41.0 +/- 1.0 g/Mcal). Under the conditions of this study, crossbreeding improved cow/calf efficiency and showed an advantage for cows that have lower energy requirements.
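
Maintenance intake here comes from regressing BW change on daily ME intake and solving for the intake at which predicted BW change is zero. A minimal sketch of that calculation with hypothetical data points (the study's own observations are not reproduced in the abstract):

```python
import numpy as np

# Hypothetical (intake, BW change) observations for one cow group.
mei = np.array([10.0, 12.0, 14.0, 16.0, 18.0])           # ME intake, Mcal/d
bw_change = np.array([-0.40, -0.20, 0.05, 0.25, 0.50])   # BW change, kg/d

# Fit bw_change = intercept + slope * mei, then solve for zero BW change.
slope, intercept = np.polyfit(mei, bw_change, 1)
mei_at_zero_gain = -intercept / slope
print(f"Estimated ME intake at zero BW change: {mei_at_zero_gain:.1f} Mcal/d")
```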

Relevance: 10.00%

Abstract:

Data from 9 studies were compiled to evaluate the effects of 20 yr of selection for postweaning weight (PWW) on carcass characteristics and meat quality in experimental herds of control Nellore (NeC) and selected Nellore (NeS), Caracu (CaS), Guzerah (GuS), and Gir (GiS) breeds. These studies were conducted with animals from a genetic selection program at the Experimental Station of Sertaozinho, Sao Paulo State, Brazil. After the performance test (168 d postweaning), bulls (n = 490) from the calf crops born between 1992 and 2000 were finished and slaughtered to evaluate carcass traits and meat quality. Treatments were different across studies. A meta-analysis was conducted with a random coefficients model in which herd was considered a fixed effect and treatments within year and year were considered as random effects. Either calculated maturity degree or initial BW was used interchangeably as the covariate, and least squares means were used in the multiple-comparison analysis. The CaS and NeS had heavier (P = 0.002) carcasses than the NeC and GiS; GuS were intermediate. The CaS had the longest carcass (P < 0.001) and heaviest spare ribs (P < 0.001), striploin (P < 0.001), and beef plate (P = 0.013). Although the body, carcass, and quarter weights of NeS were similar to those of CaS, NeS had more edible meat in the leg region than did CaS bulls. Selection for PWW increased rib-eye area in Nellore bulls. Selected Caracu had the lowest (most favorable) shear force values compared with the NeS (P = 0.003), NeC (P = 0.005), GuS (P = 0.003), and GiS (P = 0.008). Selection for PWW increased body, carcass, and meat retail weights in the Nellore without altering dressing percentage and body fat percentage.
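
The abstract describes the meta-analysis model only in words (herd as a fixed effect; year, and treatment within year, as random effects). A rough sketch of how such a model could be specified with statsmodels, using hypothetical column names, since the authors' software and exact specification are not given:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data file: one row per bull, with a carcass trait, herd, year, and treatment.
df = pd.read_csv("carcass_records.csv")

# Herd as a fixed effect; year as the grouping factor (random intercept),
# with an additional variance component for treatment nested within year.
model = smf.mixedlm(
    "shear_force ~ C(herd)",
    data=df,
    groups=df["year"],
    vc_formula={"treatment": "0 + C(treatment)"},
)
print(model.fit().summary())
```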

Relevance: 10.00%

Abstract:

Mature pregnant crossbred ewes (n = 90) were used in a randomized complete block design experiment and were assigned to 1 of 3 winter-feeding systems differing in primary feed source: haylage (HL), limit-fed corn (CN), or limit-fed dried distillers grains (DDGS). Effects of these winter-feeding strategies on postweaning progeny performance were determined. Lamb progeny (n = 96) were weaned at 61 +/- 4 d of age and fed a common high-concentrate diet. Lambs were assigned to feedlot pen (n = 18) based on dam mid-gestation pen. Growth rate, DMI, and ADG were determined for the first 40 d of the finishing period. At 96 +/- 4 d of age, 1 wether lamb was randomly selected from each pen (n = 18) for a glucose tolerance test. The experiment was terminated, and lambs were slaughtered individually when they were determined to have achieved 0.6-cm 12th-rib fat thickness. After a 24-h chill, carcass data were collected and a 2.54-cm chop was removed from each lamb from the LM posterior to the 12th rib for ether extract analysis. Additional carcass measurements of bone, muscle, and fat from the shoulder, rack, loin, and leg were collected on 35 carcasses. At weaning, lamb BW was not different among treatments, whereas final BW tended to be greater (P = 0.09) for lambs from ewes fed DDGS and CN during gestation than from those fed HL. Overall lamb growth rate from birth to slaughter was not different among treatments. Lambs from ewes fed DDGS vs. CN or HL tended to have a greater initial insulin response (P = 0.09). Dressing percent was less (P = 0.04) in lambs from ewes fed DDGS, but no difference (P = 0.16) was detected in HCW among treatments. As expected, 12th rib fat thickness was similar among treatments, whereas LM area was largest to smallest (P = 0.05) in lambs from ewes fed CN, HL, and DDGS, respectively. Proportion of internal fat tended to be greatest to smallest (P = 0.06) in lambs from ewes fed DDGS, CN, and HL, respectively. Calculated boneless trimmed retail cuts percentage was less (P = 0.04) in lambs from ewes fed DDGS than CN or HL. Loin muscle weight as a percentage of wholesale cut tended (P = 0.10) to be greater in lambs from ewes fed CN and HL than DDGS, whereas other muscle, bone, and fat weights and proportions were similar (P > 0.24) among treatments. Prepartum diet during mid to late gestation of ewes altered postnatal fat and muscle deposition and may be associated with alterations in insulin sensitivity of progeny.

Relevance: 10.00%

Abstract:

This study aimed to achieve a better understanding of the foraging behavior of leaf-cutter ant (Atta sexdens rubropilosa Forel) workers with respect to defoliation sites on plants. To accomplish that, artificial plants 70 cm in height were prepared and divided into four levels (heights), with natural plant leaves attached to them. Evaluations during the bioassays included the number of leaves dropped by the ants, as well as the percentage of plant mass removed. In all replicates, it became evident that the most exploited plant site was the apical region, which differed significantly from the other plant levels.

Relevance: 10.00%

Abstract:

The objective was to adjust a protocol for in vitro germination of peach pollen grains. To that end, five experiments were carried out to establish the ideal concentrations of sucrose, agar, calcium nitrate, and boric acid, the best pH value, the germination temperature, and the pollen tube emission time. The Aurora 1 and Douradao cultivars were used as plant material. For the Aurora 1 cultivar, higher germination of pollen grains was obtained with 48.29 g L(-1) of sucrose, 10 g L(-1) of agar, 400 mg L(-1) of boric acid, and pH 5.5. For the Douradao cultivar, higher germination was obtained on a medium containing 90 g L(-1) of sucrose, 10 g L(-1) of agar, 400 mg L(-1) of boric acid, 369 mg L(-1) of calcium nitrate, and pH 6.5. The best temperature for germination of the pollen grains of both cultivars was 25 degrees C, with the pollen grain germination percentage increasing in direct proportion to the evaluation time.

Relevance: 10.00%

Abstract:

Previously known only from the southern United States, hosta petiole rot recently appeared in the northern United States. Sclerotium rolfsii var. delphinii is believed to be the predominant petiole rot pathogen in the northern United States, whereas S. rolfsii is most prevalent in the southern United States. To test the hypothesis that differing tolerance to climate extremes affects the geographic distribution of these fungi, the survival of S. rolfsii and S. rolfsii var. delphinii in the northern and southeastern United States was investigated. At each of four locations, nylon screen bags containing sclerotia were placed on the surface of bare soil and at 20-cm depth. Sclerotia were recovered six times from November 2005 to July 2006 in North Dakota and Iowa, and from December 2005 to August 2006 in North Carolina and Georgia. Survival was estimated by quantifying the percentage of sclerotium survival on carrot agar. Sclerotia of S. rolfsii var. delphinii survived until at least late July in all four states. In contrast, no S. rolfsii sclerotia survived until June in North Dakota or Iowa, whereas 18.5% survived until August in North Carolina and 10.3% survived in Georgia. The results suggest that inability to tolerate low temperature extremes limits the northern range of S. rolfsii.

Relevance: 10.00%

Abstract:

An analytical procedure for the determination of Hg in otter (Lontra longicaudis) feces was developed, with fish scales separated from the samples for identification of the animals' diet. Samples were washed with ultra-pure water and the suspension was sampled and transferred for digestion. Solubilization was performed with a nitric-perchloric acid mixture, and detection was carried out by atomic fluorescence spectrometry (AFS). The quality of the analytical procedure was assessed by analyzing in-house standard solutions and certified reference materials. Total Hg concentrations were in the range of 7.6-156 ng g(-1) (July 2004), 25.6-277 ng g(-1) (January 2005), and 14.6-744 ng g(-1) (May 2005), which is approximately the same order of magnitude for all samples collected in two reservoirs on the Tiete River, Brazil. Although Hg concentrations varied with sampling period and diet, high levels were correlated with the percentage of carnivorous fish scales present in the otter feces. (c) 2007 Elsevier Ltd. All rights reserved.

Relevance: 10.00%

Abstract:

Thirty-five lymph node samples were taken from animals with macroscopic lesions consistent with Mycobacterium bovis infection. The animals were identified by postmortem examination in an abattoir in the northwestern region of the state of Parana, Brazil. Twenty-two of the animals had previously been found to be tuberculin skin test positive. Tissue samples were decontaminated by Petroff's method and processed for acid-fast bacilli staining, culture on Stonebrink and Lowenstein-Jensen media, and DNA extraction. Lymph node DNA samples were amplified by PCR in the absence and presence (inhibitor controls) of DNA extracted from an M. bovis culture. Mycobacterium bovis was identified in 14 (42.4%) lymph node samples by both PCR and culture. The frequency of PCR-positive results (54.5%) was similar to that of culture-positive results (51.5%, P > 0.05). The percentage of PCR-positive lymph nodes increased from 39.4% (13/33) to 54.5% (18/33) when samples that were initially PCR-negative were reanalysed using 2.5 mu l of DNA (two samples) or 1:2 diluted DNA (three samples). PCR sensitivity was affected by inhibitors and by the amount of DNA in the clinical samples. Our results indicate that direct detection of M. bovis in lymph nodes by PCR may be a fast and useful tool for bovine tuberculosis epidemic management in the region.

Relevance: 10.00%

Abstract:

Objective: Looking for possible neuroimmune relationships, we analyzed the effects of methylenedioxymethamphetamine (MDMA) administration on neuroendocrine function, neutrophil activity and leukocyte distribution in mice. Methods: Five experiments were performed. In the first, mice were treated with MDMA (10 mg/kg) 30, 60 min and 24 h prior to blood sample collection for neutrophil activity analysis. In the second experiment, the blood of naive mice was collected and incubated with MDMA for in vitro analysis of neutrophil activity. In the third and fourth experiments, mice were injected with MDMA (10 mg/kg) and, 60 min later, blood and brain were collected to analyze corticosterone serum levels and hypothalamic noradrenaline (NA) levels and turnover. In the last experiment, mice were injected with MDMA (10 mg/kg) and, 60 min later, blood, bone marrow and spleen were collected for leukocyte distribution analysis. Results: Results showed an increase in hypothalamic NA turnover and corticosterone serum levels 60 min after MDMA (10 mg/kg) administration, a decrease in peripheral blood neutrophil oxidative burst, and a decrease in the percentage and intensity of neutrophil phagocytosis. It was further found that MDMA (10 mg/kg) treatment also altered leukocyte distribution in blood, bone marrow and spleen. In addition, no effects of MDMA were observed after in vitro exposure on either neutrophil oxidative burst or phagocytosis. Conclusion: The effects of MDMA administration (10 mg/kg) on neutrophil activity and leukocyte distribution might have been induced indirectly through noradrenergic neurons and/or hypothalamic-pituitary-adrenal axis activation. Copyright (C) 2009 S. Karger AG, Basel

Relevance: 10.00%

Abstract:

The lack of a clear correlation between the levels of antibody to pertussis antigens and protection against disease lends credence to the possibility that cell-mediated immunity provides primary protection against disease. This phase I comparative trial aimed to compare the in vitro cellular immune response and anti-pertussis toxin (anti-PT) immunoglobulin G (IgG) titers induced by a cellular pertussis vaccine with low lipopolysaccharide (LPS) content (wP(low) vaccine) with those induced by the conventional whole-cell pertussis (wP) vaccine. A total of 234 infants were vaccinated at 2, 4, and 6 months with the conventional wP vaccine or the wP(low) vaccine. Proliferation of CD3(+) T cells was evaluated by flow cytometry after 6 days of peripheral blood mononuclear cell culture with stimulation with heat-killed Bordetella pertussis or phytohemagglutinin (PHA). CD3(+), CD4(+), CD8(+), and T-cell receptor gamma delta-positive (gamma delta(+)) cells were identified in the gate of blast lymphocytes. Gamma interferon, tumor necrosis factor alpha, interleukin-4 (IL-4), and IL-10 levels in supernatants and serum anti-PT IgG levels were determined using enzyme-linked immunosorbent assay (ELISA). The net percentage of CD3(+) blasts in cultures with B. pertussis in the group vaccinated with wP was higher than that in the group vaccinated with the wP(low) vaccine (medians of 6.2% for the wP vaccine and 3.9% for the wP(low) vaccine; P = 0.029). The frequencies of proliferating CD4(+), CD8(+), and gamma delta(+) cells, cytokine concentrations in supernatants, and the geometric mean titers of anti-PT IgG were similar for the two vaccination groups. There was a significant difference between the T-cell subpopulations in the B. pertussis and PHA cultures, with a higher percentage of gamma delta(+) cells in the B. pertussis cultures (P < 0.001). Overall, the data suggest that wP vaccination resulted in modestly better specific CD3(+) cell proliferation, while gamma delta(+) cell expansions were similar with the two vaccines.
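
The "net percentage" of proliferating CD3+ blasts is commonly computed as the antigen-stimulated value minus the unstimulated (medium-only) control; the abstract does not define it explicitly, so treat that definition as an assumption. A one-line illustration with made-up values:

```python
def net_blast_percentage(stimulated_pct: float, medium_only_pct: float) -> float:
    """Antigen-specific proliferation net of background (assumed definition)."""
    return stimulated_pct - medium_only_pct

# Illustrative values only (not taken from the trial):
print(net_blast_percentage(8.1, 1.9))  # 6.2
```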