Abstract:
Citrus post-bloom fruit drop (caused by Colletotrichum acutatum) frequently occurs in the southwestern region of São Paulo State, Brazil. A survey of Colletotrichum isolates associated with symptoms of post-bloom fruit drop in São Paulo State showed C. gloeosporioides in addition to C. acutatum. The objectives of this study were to confirm the identification of C. gloeosporioides isolated from symptomatic citrus flowers, to test the pathogenicity of C. gloeosporioides isolates, to compare the development of disease caused by C. gloeosporioides and C. acutatum, and to determine the frequency of C. gloeosporioides in a sample of isolates obtained from symptomatic flowers in different regions of São Paulo State. Using species-specific primers in PCR assays, 17.3% of 139 isolates were identified as C. gloeosporioides and the remaining 82.7% as C. acutatum. The pathogenicity tests, carried out on 3-year-old potted sweet orange plants, indicated that both species caused typical symptoms of the disease, including blossom blight and persistent calyces. Incubation periods (3.5 and 3.9 days for C. acutatum and C. gloeosporioides, respectively) and fruit set (6.7 and 8.5% for C. acutatum and C. gloeosporioides, respectively) were similar for both species. The incidences of blossom blight and persistent calyces were higher on plants inoculated with C. acutatum than on those inoculated with C. gloeosporioides. Conidial germination was similar for both species under different temperatures and wetness periods. Under optimal conditions, appressorium formation and melanisation were higher for C. gloeosporioides than for C. acutatum. These results indicate that Colletotrichum gloeosporioides is a new causal agent of post-bloom fruit drop.
Abstract:
The production and commercialization of citrus seedlings inspected and produced in protected screen houses have been mandatory in São Paulo State, Brazil, since January 2003. This law was intended to prevent the spread of Citrus Variegated Chlorosis (CVC), a disease caused by Xylella fastidiosa. Our objective was to compare the yield over 8 years of 'Natal' sweet orange trees grafted onto Rangpur lime obtained from healthy nursery plants and from plants artificially inoculated with X. fastidiosa. Yield was evaluated in an orchard planted in February 1999 with two treatments: (i) trees from healthy nursery plants, and (ii) trees from nursery plants artificially inoculated with X. fastidiosa. The mean yield was 21% higher in trees from healthy nursery plants than in trees from inoculated nursery plants. This difference represents a gain of approximately 203 boxes of 40.8 kg each, considering a planting density of 550 plants per hectare. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
Background: Dentists at Lar São Francisco observed during dental treatment that children with cerebral palsy (CP) had increased heart rate (HR) and lower saliva production. Despite the high prevalence of CP reported in the literature (2.08-3.6/1000 individuals), little is known about the electrocardiographic (ECG) characteristics, especially HR, of individuals with CP. Objective: This study aimed to investigate the hypothesis that individuals with CP have a higher HR and to define other ECG characteristics of this population. Methods: Ninety children with CP underwent clinical examination and 12-lead resting ECG. Electrocardiographic data on rhythm, HR, PR interval, QRS duration, P/QRS/T axis, and QT, QTc and T(peak-end) intervals (minimum, mean, maximum, and dispersion) were measured, analyzed, and compared with data from a control group of 35 normal children. Fisher and Mann-Whitney U tests were used to compare categorical and continuous data, respectively. Results: The cerebral palsy and control groups did not differ significantly in age (9 ± 3 vs. 9 ± 4 years) or proportion of males (65% vs. 49%). Children with CP had a higher HR (104.0 ± 20.6 vs. 84.2 ± 13.3 beats per minute; P < .0001), shorter PR interval (128.8 ± 15.0 vs. 138.1 ± 15.1 milliseconds; P = .0018), shorter QRS duration (77.4 ± 8.6 vs. 82.0 ± 8.7 milliseconds; P = .0180), more horizontally positioned QRS axis (46.0° ± 26.3° vs. 59.7° ± 24.8°; P = .0024) and T-wave axis (34.3° ± 28.9° vs. 42.9° ± 17.1°; P = .034), and greater mean QTc (418.1 ± 18.4 vs. 408.5 ± 19.4 milliseconds; P = .0110). All ECG variables, including those with significant differences, were within the reference range for the age group. Conclusion: Children with CP showed increased HR and other ECG findings that differed from controls in the setting of this investigation. Further studies are needed to explain our findings and to correlate the increased HR with factors such as dehydration, stress, and autonomic nervous disorders. (C) 2011 Elsevier Inc. All rights reserved.
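A minimal sketch of the two group comparisons described above, using SciPy's Fisher exact and Mann-Whitney U tests; the counts and values below are illustrative placeholders, not the study's data.

# Illustrative sketch of the statistical comparisons described above (SciPy).
# All numbers below are hypothetical placeholders, not the study's data.
from scipy.stats import fisher_exact, mannwhitneyu

# Categorical variable (e.g., male gender): 2x2 table of [males, females] per group
table = [[59, 31],   # cerebral palsy group (hypothetical counts)
         [17, 18]]   # control group (hypothetical counts)
_, p_categorical = fisher_exact(table)

# Continuous variable (e.g., heart rate in beats per minute)
hr_cp = [104, 98, 120, 95, 110]      # hypothetical CP values
hr_control = [84, 80, 90, 78, 86]    # hypothetical control values
_, p_continuous = mannwhitneyu(hr_cp, hr_control, alternative="two-sided")

print(f"Fisher exact p = {p_categorical:.4f}, Mann-Whitney U p = {p_continuous:.4f}")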
Abstract:
Landscape unit discrimination for pedological surveys by orbital spectral response. The objective of this study was to compare two soil survey methods. The first used methods traditionally employed to distinguish landscape units and discriminate soil classes. The second was based on distinguishing soil classes through orbital spectral response. In order to establish soil characteristics and their classification, soil samples were collected at two depths in a grid system, with a distance of 500 meters between points, and physical and chemical analyses were carried out on these samples. At the sampling points, the apparent reflectance of the soil was determined from the orbital image and, through cluster analysis, landscape units were established. In order to evaluate the agreement between the landscape units established by each method, the Kappa index was used; the value obtained from the confusion matrix was 0.43, indicating high quality in the comparison and showing that the non-conventional method produced results comparable to those obtained by photointerpretation.
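The agreement measure used above is Cohen's kappa computed from the confusion matrix of the two landscape-unit classifications. A minimal sketch of that calculation follows; the matrix is a hypothetical example, not the survey data.

# Cohen's kappa from a confusion matrix: agreement between two landscape-unit
# classifications beyond what would be expected by chance.
import numpy as np

def cohen_kappa(confusion):
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    observed = np.trace(confusion) / total                              # observed agreement
    expected = (confusion.sum(0) * confusion.sum(1)).sum() / total**2   # chance agreement
    return (observed - expected) / (1.0 - expected)

# Hypothetical confusion matrix for three landscape units (rows: conventional
# survey, columns: spectral-response survey); not the study's data.
matrix = [[20, 5, 3],
          [4, 15, 6],
          [2, 7, 18]]
print(f"Kappa = {cohen_kappa(matrix):.2f}")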
Abstract:
Agricultural reuse of treated sewage effluent (TSE) is an environmentally and economically attractive practice; however, little is known about its effects on the characteristics and microbial function of tropical soils. The effect of surplus irrigation of a pasture with TSE over a period of 18 months was investigated, with 0% surplus irrigation with TSE as a control. In addition to the control, the experiment comprised three surplus treatments (25%, 50%, and 100% excess) and a nonirrigated pasture area (SE), used to compare the community-level physiological profiles of the soil microbial community with the Biolog method. TSE application increased the average substrate consumption of the soil microbial community, based on the kinetic parameters obtained by fitting the average well color development curve. There were no significant differences among the surplus irrigation levels. Surplus irrigation of pasture with TSE caused minor increases in the physiological status of the soil microbial community but no detectable damage to the pasture or soil.
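The kinetic parameters mentioned above come from fitting the average well color development (AWCD) readings over incubation time. A minimal sketch of one common approach, fitting a logistic model with SciPy, is given below; the logistic form and the readings are assumptions for illustration, not the study's data or necessarily its exact model.

# Fit a logistic model to average well color development (AWCD) readings over
# incubation time to obtain kinetic parameters (asymptote K, rate r, midpoint t0).
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    return K / (1.0 + np.exp(-r * (t - t0)))

hours = np.array([0, 24, 48, 72, 96, 120, 144])               # incubation time (h)
awcd  = np.array([0.02, 0.10, 0.35, 0.80, 1.10, 1.25, 1.30])  # hypothetical AWCD readings

(K, r, t0), _ = curve_fit(logistic, hours, awcd, p0=[1.3, 0.05, 60])
print(f"K = {K:.2f}, r = {r:.3f} 1/h, t0 = {t0:.1f} h")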
Abstract:
Fungal ribosomal intergenic spacer analysis (F-RISA) was used to characterize soil fungal communities from three Araucaria angustifolia ecosystems in Brazil: a native forest and two replanted forest ecosystems, one of them with a past history of wildfire. Arbuscular mycorrhizal fungi (AMF) infection was evaluated in the roots of 18-month-old axenic Araucaria plants previously inoculated with soils collected from those areas in a greenhouse experiment. Principal component analysis of the F-RISA profiles showed different soil fungal communities among the three studied areas. Sixty-three percent of the F-RISA fragments amplified from the soil and substrate samples had lengths between 500 and 700 bp. The number of operational taxonomic units (OTUs) was 34 for soil and 38 for substrate; however, more fragments were detected in soil (214) than in substrate (163). An in silico F-RISA analysis comparing our data with ITS1-5.8S-ITS2 sequences from the NCBI database showed the presence of Ascomycota, Basidiomycota and Glomeromycota in the soil and substrate fungal communities. AMF infection was higher in plants inoculated with soil from the native forest and from the replanted forest with wildfire, both of which presented similar chemical characteristics but different disturbance levels. These results indicate that soil chemical composition, rather than anthropogenic or fire disturbance, may influence soil fungal community structure.
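A minimal sketch of how fragment profiles such as F-RISA fingerprints can be ordinated by principal component analysis (scikit-learn); the presence/absence matrix below is a made-up illustration, not the study's profiles.

# Ordination of fragment presence/absence profiles by PCA (scikit-learn).
import numpy as np
from sklearn.decomposition import PCA

# Rows: soil samples (e.g., native forest, replanted, replanted with wildfire);
# columns: fragment classes (1 = detected, 0 = not detected). Hypothetical data.
profiles = np.array([
    [1, 1, 0, 1, 0, 1],
    [1, 0, 0, 1, 1, 1],
    [0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 1, 0],
])

pca = PCA(n_components=2)
scores = pca.fit_transform(profiles)
print("explained variance ratio:", pca.explained_variance_ratio_)
print(scores)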
Abstract:
The rhizosphere constitutes a complex niche that may be exploited by a wide variety of bacteria. Bacterium-plant interactions in this niche can be influenced by factors such as the expression of heterologous genes in the plant. The objective of this work was to describe the bacterial communities associated with the rhizosphere and rhizoplane regions of tobacco plants, and to compare communities from transgenic tobacco lines (CAB1, CAB2 and TRP) with those found in wild-type (WT) plants. Samples were collected at two stages of plant development, the vegetative and flowering stages (1 and 3 months after germination). The diversity of the culturable microbial community was assessed by isolation and further characterization of isolates by amplified ribosomal RNA gene restriction analysis (ARDRA) and 16S rRNA sequencing. These analyses revealed the presence of fairly common rhizosphere organisms, the main groups being Alphaproteobacteria, Betaproteobacteria, Actinobacteria and Bacilli. Analysis of the total bacterial communities using PCR-DGGE (denaturing gradient gel electrophoresis) revealed that shifts in bacterial communities occurred during early plant development, but the original community structure was re-established over time. The effects were smaller in rhizosphere than in rhizoplane samples, where selection of specific bacterial groups by the different plant lines was demonstrated. Clustering patterns and principal component analysis (PCA) were used to distinguish the plant lines according to the fingerprints of their associated bacterial communities. Bands differentially detected in the plant lines were found to be affiliated with the genera Pantoea, Bacillus and Burkholderia in WT, CAB and TRP plants, respectively. The data revealed that, although rhizosphere/rhizoplane microbial communities can be affected by the cultivation of transgenic plants, soil resilience may be able to restore the original bacterial diversity after one cycle of plant cultivation.
Abstract:
Soil CO2 emissions are highly variable, both spatially and over time, with significant changes even within a single day. The objective of this study was to compare predictions of diurnal soil CO2 emissions in an agricultural field estimated by ordinary kriging and by sequential Gaussian simulation. The dataset consisted of 64 measurements taken in the morning and in the afternoon on bare soil in southern Brazil. Mean soil CO2 emissions differed significantly between the morning (4.54 µmol m^-2 s^-1) and afternoon (6.24 µmol m^-2 s^-1) measurements. However, the spatial variability structures were similar: both semivariogram models were spherical, with close range values of 40.1 m for the morning and 40.0 m for the afternoon. In both periods, the sequential Gaussian simulation maps were more efficient than ordinary kriging for estimating emissions. We believe that sequential Gaussian simulation can improve estimates of soil CO2 emissions in the field, as this property usually has a strongly non-Gaussian distribution.
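Both interpolation approaches above rest on the fitted spherical semivariogram (range of about 40 m). A minimal sketch of the spherical model follows; the nugget and sill values are chosen only for illustration, not taken from the study.

# Spherical semivariogram model used in kriging and sequential Gaussian simulation.
import numpy as np

def spherical(h, nugget, sill, a):
    """Semivariance at lag distance h (same units as the range a)."""
    h = np.asarray(h, dtype=float)
    gamma = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h >= a, sill, gamma)

# Illustrative parameters; only the ~40 m range echoes the study.
lags = np.array([5, 10, 20, 40, 60])
print(spherical(lags, nugget=0.2, sill=1.0, a=40.0))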
Abstract:
The objective of this study was to compare the results of an on-farm test, named Somaticell, with the results of electronic somatic cell counting, and to compare milk somatic cell count (SCC) readings among readers. The Somaticell test correctly determined the SCC in fresh quarter milk samples. The correlation between Somaticell and electronic enumeration of somatic cells was 0.92, with a coefficient of 0.82. Using a threshold of 205,000 cells/mL, the sensitivity and specificity for determination of intramammary infections were 91.3 and 96.0%, respectively. SCC was greater for milk samples from which major mastitis pathogens were recovered. Minor variation among readers was observed and was most likely associated with the mixing procedure. However, the final analysis indicated that this variation was not significant and did not affect the number of samples classified as having subclinical mastitis. The on-farm test evaluated in this study showed adequate capacity to determine SCC in quarter milk samples and may be considered an alternative for on-farm detection of subclinical mastitis.
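A minimal sketch of how sensitivity and specificity at the 205,000 cells/mL threshold are computed from test outcomes; the SCC readings and infection statuses below are hypothetical, not the study's samples.

# Sensitivity and specificity of the on-farm test at a fixed SCC threshold.
THRESHOLD = 205_000  # cells/mL

def classify(scc_values, threshold=THRESHOLD):
    return [scc >= threshold for scc in scc_values]

def sensitivity_specificity(test_positive, truly_infected):
    tp = sum(t and d for t, d in zip(test_positive, truly_infected))
    tn = sum(not t and not d for t, d in zip(test_positive, truly_infected))
    fp = sum(t and not d for t, d in zip(test_positive, truly_infected))
    fn = sum(not t and d for t, d in zip(test_positive, truly_infected))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical example: SCC readings and bacteriological status (True = infected)
scc = [450_000, 180_000, 90_000, 600_000, 250_000, 160_000]
infected = [True, False, False, True, False, True]
sens, spec = sensitivity_specificity(classify(scc), infected)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")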
Abstract:
The objectives of this study were to determine whether the percentage of Bos taurus (0 or 50%) in the cow had an effect on ME requirements and milk production, and to compare cow/calf efficiency among 3 mating systems. Metabolizable energy requirements were estimated during a feeding trial that encompassed a gestation and a lactation feeding trial for each of 2 groups of cows. Cows were of 0 or 50% Bos taurus (100 or 50% Nellore) breed type: Nellore cows (NL; n = 10) mated to Nellore bulls, NL cows (n = 9) mated to Angus bulls, and Angus x Nellore (ANL; n = 10) and Simmental x Nellore (SNL; n = 10) cows mated to Canchim (5/8 Charolais, 3/8 Zebu) bulls. Cows were individually fed a total mixed diet that contained 11.3% CP and 2.23 Mcal of ME/kg of DM. At 14-d intervals, cows and calves were weighed and the amount of DM was adjusted to keep shrunk BW and BCS of cows constant. Beginning at 38 d of age, corn silage was available to calves ad libitum. Milk production at 42, 98, 126, and 180 d postpartum was measured using the weigh-suckle-weigh technique. At 190 d of age, calves were slaughtered and body composition was estimated using the 9-10-11th-rib section to obtain energy deposition. Regression of BW change on daily ME intake (MEI) was used to estimate MEI at zero BW change. An increase in percentage of Bos taurus had a significant effect on daily ME requirements (Mcal/d) during pregnancy (P < 0.01) and lactation (P < 0.01). Percentage of Bos taurus had a positive linear effect on maintenance requirements of pregnant (P = 0.07) and lactating (P < 0.01) cows; during pregnancy, ME requirements were 91 and 86% of those in lactation (131 ± 3.5 vs. 145 ± 3.4 Mcal·kg^-0.75·d^-1) for the 0 and 50% B. taurus groups, respectively. The 50% B. taurus cows, ANL and SNL, suckling crossbred calves had greater total MEI (4,319 ± 61 Mcal; P < 0.01) than 0% B. taurus cows suckling NL (3,484 ± 86 Mcal) or ANL calves (3,600 ± 91 Mcal). The 0% B. taurus cows suckling ANL calves were more efficient (45.3 ± 1.6 g/Mcal; P = 0.03) than straightbred NL pairs (35.1 ± 1.5 g/Mcal) and ANL or SNL pairs (41.0 ± 1.0 g/Mcal). Under the conditions of this study, crossbreeding improved cow/calf efficiency and showed an advantage for cows with lower energy requirements.
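A minimal sketch of the regression step described above: body-weight change is regressed on daily metabolizable-energy intake, and the maintenance requirement is read off as the intake at which predicted weight change is zero. The data points below are invented for illustration.

# Estimate maintenance ME as the intake at which predicted BW change is zero,
# from a linear regression of daily BW change on daily ME intake (MEI).
import numpy as np

mei = np.array([18.0, 20.5, 23.0, 25.5, 28.0])           # Mcal/d, hypothetical
bw_change = np.array([-0.35, -0.15, 0.05, 0.30, 0.50])    # kg/d, hypothetical

slope, intercept = np.polyfit(mei, bw_change, 1)           # bw_change = slope*mei + intercept
mei_at_zero = -intercept / slope                           # MEI where BW change = 0
print(f"estimated maintenance MEI = {mei_at_zero:.1f} Mcal/d")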
Abstract:
Evaluation of two semi-selective media to detect Curtobacterium flaccumfaciens pv. flaccumfaciens in bean seeds. This study aimed to compare the effectiveness of the semi-selective MSCFF and modified CNS culture media in detecting Curtobacterium flaccumfaciens pv. flaccumfaciens (Cff) in bean seeds, using the streak and spread plate techniques. Four 500 g subsamples, obtained from two samples of bean seeds, were immersed in 600 mL of sterile distilled water for 18 h at 5 °C. Aliquots of the suspensions were transferred to plates containing both culture media. Plates were then incubated at 28 °C, and bacterial growth on both media was evaluated 72 and 144 hours later, compared with the growth of a Cff reference strain. Both media revealed the presence of Cff colonies. Typical colonies were isolated for PCR analyses and pathogenicity tests on tobacco leaves. Characteristic Cff growth on the MSCFF medium was observed for the seed samples, with both plate techniques, in both evaluations. On the modified CNS culture medium, bacterial growth was only detected in seed samples after 144 hours of incubation, regardless of the plate technique used. The results showed that Cff grew faster on the MSCFF semi-selective culture medium. The bacterial isolates tested were identified as Cff by both PCR analyses and a positive tobacco hypersensitivity reaction.
Abstract:
A warning system for sooty blotch and flyspeck (SBFS) of apple, developed in the southeastern United States, uses cumulative hours of leaf wetness duration (LWD) to predict the timing of the first appearance of signs. In the Upper Midwest United States, however, this warning system has resulted in sporadic disease control failures. The purpose of the present study was to determine whether the warning system's algorithm could be modified to provide a more reliable assessment of SBFS risk. Hourly LWD, rainfall, relative humidity (RH), and temperature data were collected from orchards in Iowa, North Carolina, and Wisconsin in 2005 and 2006. The timing of the first appearance of SBFS signs was determined by weekly scouting. Preliminary analysis using scatterplots and boxplots suggested that cumulative hours of RH >= 97% could be a useful predictor of SBFS appearance. Receiver operating characteristic curve analysis was used to compare the predictive performance of cumulative LWD and cumulative hours of RH >= 97%. Cumulative hours of RH >= 97% was a more conservative and accurate predictor than cumulative LWD for 15 site-years in the Upper Midwest, but not for four site-years in North Carolina. Performance of the SBFS warning system in the Upper Midwest and climatically similar regions may be improved if cumulative hours of RH >= 97% were substituted for cumulative LWD to predict the first appearance of SBFS.
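A minimal sketch of the receiver operating characteristic comparison described above, using scikit-learn to compare the two candidate predictors by area under the ROC curve; the predictor values and outcomes are placeholders, not the orchard data.

# Compare two candidate predictors of SBFS appearance by ROC area under the curve.
import numpy as np
from sklearn.metrics import roc_auc_score

# 1 = SBFS signs appeared in that assessment window, 0 = not yet (hypothetical)
signs_present = np.array([0, 0, 0, 1, 1, 0, 1, 1])
cumulative_lwd = np.array([120, 150, 180, 260, 300, 200, 320, 280])    # hours, hypothetical
cumulative_rh97 = np.array([80, 100, 130, 240, 270, 150, 300, 260])    # hours, hypothetical

print("AUC, cumulative LWD:      ", roc_auc_score(signs_present, cumulative_lwd))
print("AUC, cumulative RH >= 97%:", roc_auc_score(signs_present, cumulative_rh97))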
Abstract:
Cultivar, growing conditions and geographical origin are factors that influence the carotenoid composition of fruits. Because the loquat cultivars evaluated in this study, Centenária, Mizauto, Mizuho, Mizumo and Néctar de Cristal, had not previously been investigated, the present work was carried out to determine and compare the carotenoid composition of these five loquat cultivars by high-performance liquid chromatography coupled to photodiode array and mass spectrometry detectors (HPLC-PDA-MS/MS). Twenty-five carotenoids were separated on a C30 column, and 23 of them were identified. All-trans-beta-carotene (19-55%), all-trans-beta-cryptoxanthin (18-28%), 5,6:5',6'-diepoxy-beta-cryptoxanthin (9-18%) and 5,6-epoxy-beta-cryptoxanthin (7-10%) were the main carotenoids. The total carotenoid content ranged from 196 µg/100 g (cv. Néctar de Cristal) to 3020 µg/100 g (cv. Mizumo). The carotenoid profile of cv. Néctar de Cristal differed from those of the other cultivars, in agreement with its cream pulp colour, in contrast to the orange pulp colour of the other four cultivars. Cultivars Mizauto, Mizuho, Mizumo and Centenária showed provitamin A values between 89 and 162 µg RAE/100 g and can be considered good sources of this provitamin. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Listeriosis is a serious foodborne disease caused by Listeria monocytogenes, a pathogen often found in food processing plants. Poultry meat and its derivatives may harbor L. monocytogenes even when good manufacturing practices are implemented in abattoirs. Little information exists in Brazil on the frequency of L. monocytogenes contamination, even though the country is considered the top poultry meat exporter in the world. This study compared 2 exporting poultry facilities following the same standards but differing only in manual (plant M) or automatic (plant A) evisceration. Eight hundred fifty-one samples from food, food contact and non-food contact surfaces, water, and workers' hands were collected, from cage to finished products, over a 1-yr period. In plant A, 20.1% of the samples were positive for L. monocytogenes, whereas in plant M, 16.4% were positive. The greatest incidence of contamination with the pathogen in plant A was found on non-food contact surfaces (27.3%), while in plant M it was found in products (19.4%). The most prevalent serovars were 1/2a or 3a (plant M) and 4b, 4d, or 4e (plant A). Despite proper hygiene and good manufacturing practices, controlling the entry and persistence of L. monocytogenes in processing facilities remains a formidable task.