25 results for BaTiO(3) and titanates


Relevance:

100.00%

Publisher:

Abstract:

Castration of cattle using rubber rings is becoming increasingly popular due to the perceived ease of the procedure and greater operator safety when compared with surgical castration. Few comparative studies have investigated the effects of different castration methods and calf age on welfare outcomes, particularly in a tropical environment. Thirty 3-month-old (liveweight 71–119 kg) and thirty 6-month-old (liveweight 141–189 kg) Belmont Red calves (a tropically adapted breed) were assigned to a two (age) × three (castration method: surgical, ring and sham) factorial study (Surg3, Surg6, Ring3, Ring6, Sham3 and Sham6, n = 10 for each treatment group). Welfare outcomes were assessed post-castration using: behaviour for 2 weeks; blood parameters (cortisol and haptoglobin concentrations) to 4 weeks; wound healing to 5 weeks; and liveweights to 6 weeks. More Surg calves struggled during castration compared with Sham and Ring (P < 0.05; 90 ± 7% vs. 20 ± 9% and 24 ± 10%) and performed more struggles (1.9 ± 0.2, 1.1 ± 0.3 and 1.1 ± 0.3 for Surg, Sham and Ring, respectively), suggesting that surgical castration caused the most pain during performance of the procedure. A significant (P < 0.05) time × castration method × age interaction for plasma cortisol revealed that concentrations decreased most rapidly in Sham; the Ring6 calves failed to show reduced cortisol concentrations at 2 h post-castration, unlike the other treatment groups. By 7 h post-castration, all treatment groups had similar concentrations. A significant (P < 0.01) interaction between time and castration method showed that haptoglobin concentrations increased slightly to 0.89 and 0.84 mg/mL for Surg and Ring, respectively, over the first 3 days post-castration.
Concentrations for Surg then decreased to levels similar to Sham by day 21 and, although concentrations for Ring decreased on day 7 to 0.76 mg/mL, they increased significantly on day 14 to 0.97 mg/mL before reducing to concentrations similar to the other groups (0.66 mg/mL) by day 21. Significantly (P < 0.05) more of the wounds of the 3-month compared with the 6-month calves scored as ‘healed’ at day 7 (74% vs. 39%), while more (P = 0.062) of the Surg than Ring scored as ‘healed’ at day 21 (60% vs. 29%). At day 14 there were significantly (P < 0.05) fewer healed wounds in Ring6 compared with other treatment groups (13% vs. 40–60%). Liveweight gain was significantly (P < 0.05) greater in 3-month (0.53 kg/day) than in 6-month calves (0.44 kg/day) and in Sham calves (P < 0.001, 0.54 kg/day), than in Ring (0.44 kg/day) and Surg (0.48 kg/day) calves. Overall, welfare outcomes were slightly better for Surg than Ring calves due to reduced inflammation and faster wound healing, with little difference between age groups.


We have characterised six Australian Cucumber mosaic virus (CMV) strains belonging to different subgroups, determined by the sequence of their complete RNA 3 and by their host range and the symptoms they cause on species in the Solanaceae and Cucurbitaceae and on sweet corn. These data allowed classification of strains into the three known CMV subgroups and identification of plant species able to differentiate the Australian strains by symptoms and host range. Western Australian strains 237 and Twa and Queensland strains 207 and 242 are closely related members of CMV subgroup IA, which cause similarly severe symptoms on Nicotiana species. Strains 207 and 237 (subgroup IA) were the only strains tested which systemically infected sweet corn. Strain 243 caused the most severe symptoms of all strains on Nicotiana species, tomato and capsicum, and appears to be the first confirmed subgroup IB strain reported in Australia. Based on pair-wise distance analysis and phylogeny of RNA 3, as well as mild disease symptoms on Nicotiana species, CMV 241 was assigned to subgroup II, as were the previously described Q-CMV and LY-CMV.


The diet selected in autumn by steers fistulated at the oesophagus was studied in a subset of treatments in an extensive grazing study conducted in a Heteropogon contortus pasture in central Queensland between 1988 and 2001. These treatments were a factorial array of three stocking rates (4, 3 and 2 ha/steer) and three pasture types (native pasture, legume-oversown native pasture and animal diet supplement/spring-burning native pasture). Seasonal rainfall throughout this study was below the long-term mean and mean annual pasture utilisation ranged from 30 to 61%. Steers consistently selected H. contortus, with levels decreasing from 47 to 18% of the diet as stocking rate increased from 4 ha/steer to 2 ha/steer. Stylosanthes scabra cv. Seca was always selected in legume-oversown pastures, with diet composition varying from 35 to 66% despite its plant density increasing from 7 to 65 plants/m2 and pasture composition from 20 to 50%. Steers also selected a diet containing Chrysopogon fallax, forbs and sedges in higher proportions than they were present in the pasture. Greater availability of the intermediate grasses Chloris divaricata and Eragrostis spp. was associated with increased stocking rates. Bothriochloa bladhii was seldom selected in the diet, especially when other palatable species were present in the pasture, despite B. bladhii often being the major contributor to total pasture yield. It was concluded that a stocking rate of 4 ha/steer will maintain the availability of H. contortus in the pasture.


Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3 or 4 years' duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne, but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model.
Although significant N accretion occurred in the soil following lucerne leys, recharge of the dried soil profile following long-duration lucerne took 3 years in the drier seasons experienced. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits against plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.


Pineapple mealybug wilt-associated virus 1 (PMWaV-1), -2 (PMWaV-2) and -3 (PMWaV-3) have been detected in Australian commercial pineapple crops, along with a previously undescribed ampelovirus, for which the name Pineapple mealybug wilt-associated virus 5 (PMWaV-5) is proposed. Partial sequences extending from open reading frame 1b through to the heat shock protein homologue were obtained for PMWaV-1, -3 and -5. Phylogenetic analyses of selected regions of these sequences indicated that PMWaV-5 is a distinct species and most closely related to PMWaV-1. The amino acid sequence identity observed in the RNA-dependent RNA polymerase region was 95.8–98.4% among PMWaV-1 isolates and 92.2–99.5% among PMWaV-3 isolates. In surveys of mealybug wilt disease (MWD) affected crops, none of the four viruses was clearly associated with the disease at all survey sites. A statistically significant association (P < 0.001) between the presence of PMWaV-2 and symptoms was observed at one survey site (site 3), but the virus was at a low incidence at the remaining three survey sites. By contrast, although PMWaV-1 and -3 were equally distributed between symptomless and MWD-affected plants at site 3, there was a statistically significant (P < 0.001) association between each of these two viruses and MWD at sites 1 and 4. At site 2, there was a statistically significant (P < 0.001) association only between PMWaV-3 and MWD. PMWaV-1 was the most commonly found of the four viruses; conversely, PMWaV-5 was only occasionally found. Australian isolates of PMWaV-1, -2 and -3 were transmitted by the mealybug species Dysmicoccus brevipes.


A multiplex real-time PCR was designed to detect and differentiate equid herpesvirus 1 (EHV-1) and equid herpesvirus 4 (EHV-4). The PCR targets the glycoprotein B gene of EHV-1 and EHV-4. Primers and probes were specific to each equine herpesvirus type and can be used in monoplex or multiplex PCRs, allowing the differentiation of these two closely related members of the Alphaherpesvirinae. The two probes were minor-groove binding probes (MGB™) labelled with 6-carboxyfluorescein (FAM™) and VIC® for detection of EHV-1 and EHV-4, respectively. Ten EHV-1 isolates, six EHV-1 positive clinical samples, one EHV-1 reference strain (EHV-1 438/77), three EHV-4 positive clinical samples, two EHV-4 isolates and one EHV-4 reference strain (EHV-4 405/76) were included in this study. EHV-1 isolates, clinical samples and the reference strain reacted in the EHV-1 real-time PCR but not in the EHV-4 real-time PCR; similarly, EHV-4 clinical samples, isolates and the reference strain were positive in the EHV-4 real-time PCR but not in the EHV-1 real-time PCR. Other herpesviruses, such as EHV-2, EHV-3 and EHV-5, were all negative when tested using the multiplex real-time PCR. When bacterial pathogens and opportunistic pathogens were tested in the multiplex real-time PCR they did not react with either system. The multiplex PCR was shown to be sensitive and specific and is a useful tool for detection and differentiation of EHV-1 and EHV-4 in a single reaction. A comprehensive equine herpesvirus disease investigation procedure used in our laboratory is also outlined. This procedure combines the alphaherpesvirus multiplex real-time PCR with existing gel-based PCRs described by other authors.


Space allowance is a major factor influencing animal welfare. For livestock, at least, it plays a critical role in profitability, yet there is little information on the amount of space that animals require. The amount of space an animal occupies as a consequence of its shape and size can be estimated using allometry; linear dimensions (L) can be expressed as L = kW^(1/3) and surface area (S) as S = kW^(2/3), where k = a constant and W = the weight of the animal. Such equations have been used to determine the amount of space needed by standing (area [m^2] = 0.019W^0.66) and lying (area [m^2] = 0.027W^0.67) animals. Limited studies on the lying down and standing up behaviors of pigs and cattle suggest that the amount of space required can be estimated by area (m^2) = 0.047W^0.66. Linear space required per animal for behaviors such as feeding or drinking from a trough can be estimated from 0.064W^0.33, but in groups this requirement will be affected by social interactions among group members and the amount of competition for the resource. Determining the amount of space for groups of animals is complex, as the amount of useable space can vary with group size and by how group members share space in time. Some studies have been conducted on the way in which groups of domestic fowl use space, but overall, we know very little about the ways in which livestock time-share space, synchronicity in the performance of behaviors, and the effects of spatial restrictions on behavior and welfare.
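The allometric equations above translate directly into arithmetic. The following sketch uses the coefficients quoted in the abstract; the 600 kg liveweight is a hypothetical example, not a value from the study:

```python
# Allometric estimates of space requirements from the equations quoted above.
# Each quantity scales with liveweight W (kg) raised to a fractional power.

def standing_area(w_kg):
    """Static space occupied by a standing animal, m^2 (0.019 * W^0.66)."""
    return 0.019 * w_kg ** 0.66

def lying_area(w_kg):
    """Static space occupied by a lying animal, m^2 (0.027 * W^0.67)."""
    return 0.027 * w_kg ** 0.67

def lie_stand_area(w_kg):
    """Space needed to lie down and stand up, m^2 (0.047 * W^0.66)."""
    return 0.047 * w_kg ** 0.66

def trough_length(w_kg):
    """Linear feeding/drinking space per animal, m (0.064 * W^0.33)."""
    return 0.064 * w_kg ** 0.33

if __name__ == "__main__":
    w = 600.0  # hypothetical liveweight, kg
    print(f"standing:       {standing_area(w):.2f} m^2")
    print(f"lying:          {lying_area(w):.2f} m^2")
    print(f"lying/rising:   {lie_stand_area(w):.2f} m^2")
    print(f"trough length:  {trough_length(w):.2f} m")
```

Note that, as the abstract stresses, a group allowance is not simply the per-animal value multiplied by group size: time-sharing of space and competition at resources alter the effective requirement.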


The fate of nitrogen (N) applied in biosolids was investigated in a forage production system on an alluvial clay loam soil in south-eastern Queensland, Australia. Biosolids were applied in October 2002 at rates of 6, 12, 36, and 54 dry t/ha for aerobically digested biosolids (AE) and 8, 16, 48, and 72 dry t/ha for anaerobically digested biosolids (AN). Rates were based on multiples of the Nitrogen Limited Biosolids Application rate (0.5, 1, 3, and 4.5 NLBAR) for each type of biosolid. The experiment included an unfertilised control and a fertilised control that received multiple applications of synthetic fertiliser. Forage sorghum was planted 1 week after biosolids application and harvested 4 times between December 2002 and May 2003. Dry matter production was significantly greater from the biosolids-treated plots (21-27 t/ha) than from the unfertilised (16 t/ha) and fertilised (18 t/ha) controls. The harvested plant material removed an extra 148-488 kg N from the biosolids-treated plots. Partial N budgets were calculated for the 1 NLBAR and 4.5 NLBAR treatments for each biosolids type at the end of the crop season. Crop removal only accounted for 25-33% of the applied N in the 1 NLBAR treatments and as low as 8-15% with 4.5 NLBAR. Residual biosolids N was predominantly in the form of organic N (38-51% of applied biosolids N), although there was also a significant proportion (10-23%) as NO3-N, predominantly in the top 0.90 m of the soil profile. From 12 to 29% of applied N was unaccounted for, and presumed to be lost as gaseous nitrogen and/or ammonia as a consequence of denitrification or volatilisation, respectively. In-season mineralisation of organic N in biosolids was 43-59% of the applied organic N, which was much greater than the 15% (AN)-25% (AE) expected based on current NLBAR calculation methods. Excessive biosolids application produced little additional biomass but led to high soil mineral N concentrations that were vulnerable to multiple loss pathways.
Queensland Guidelines need to account for higher rates of mineralisation and losses via denitrification and volatilisation and should therefore encourage lower application rates to achieve optimal plant growth and minimise the potential for detrimental impacts on the environment.
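The partial N budget described above is a mass balance: whatever applied N is not recovered in the crop or the soil is presumed lost as gas. A back-of-the-envelope sketch, using hypothetical fractions chosen within the ranges reported (not the measured data):

```python
# Illustrative partial nitrogen budget for a biosolids-treated plot.
# All fractions below are hypothetical, picked within the ranges quoted
# in the abstract; the applied rate is likewise an assumed round number.

applied_n = 1000.0  # kg N/ha applied in biosolids (hypothetical)

crop_removal     = 0.28 * applied_n  # harvested forage, ~25-33% at 1 NLBAR
residual_organic = 0.42 * applied_n  # residual organic N, 38-51% of applied
residual_nitrate = 0.12 * applied_n  # NO3-N in the profile, 10-23% of applied

# The remainder is presumed lost as gaseous N and/or ammonia
# (denitrification and/or volatilisation).
unaccounted = applied_n - crop_removal - residual_organic - residual_nitrate

print(f"crop removal:   {crop_removal:6.1f} kg N/ha")
print(f"residual org-N: {residual_organic:6.1f} kg N/ha")
print(f"residual NO3-N: {residual_nitrate:6.1f} kg N/ha")
print(f"unaccounted:    {unaccounted:6.1f} kg N/ha "
      f"({unaccounted / applied_n:.0%} of applied)")
```

With these assumed fractions the unaccounted share comes to 18% of applied N, which falls inside the 12-29% range reported.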


Most plant disease resistance (R) genes encode proteins with a nucleotide-binding site and leucine-rich repeat structure (NBS-LRR). In this study, degenerate primers were used to amplify genomic NBS-type sequences from wild banana (Musa acuminata ssp. malaccensis) plants resistant to the fungal pathogen Fusarium oxysporum forma specialis (f. sp.) cubense (FOC) race 4. Five different classes of NBS-type sequences were identified and designated as resistance gene candidates (RGCs). The deduced amino acid sequences of the RGCs revealed the presence of motifs characteristic of the majority of known plant NBS-LRR resistance genes. Structural and phylogenetic analyses grouped the banana RGCs within the non-TIR subclass (TIR: homology to Toll/interleukin-1 receptors) of NBS sequences. Southern hybridization showed that each banana RGC is present in low copy number. The expression of the RGCs was assessed by RT-PCR in leaf and root tissues of plants resistant or susceptible to FOC race 4. RGC1, 3 and 5 showed a constitutive expression profile in both resistant and susceptible plants, whereas no expression was detected for RGC4. Interestingly, RGC2 expression was found to be associated only with FOC race 4-resistant lines. This finding could assist in the identification of a FOC race 4 resistance gene.


The partial gene sequencing of the matrix (M) protein from seven clinical isolates of bovine parainfluenza virus type 3 (BPIV-3), and the complete sequencing of a representative isolate (Q5592) was completed in this study. Nucleotide sequence analysis was initiated because of the failure of in-house BPIV-3 RT-PCR methods to yield expected products for four of the isolates. Phylogenetic reconstructions based on the nucleotide sequences for the M-protein and the entire genome, using all of the available BPIV-3 nucleotide sequences, demonstrated that there were two distinct BPIV-3 genotypes (BPIV-3a and BPIV-3b). These newly identified genotypes have implications for the development of BPIV-3 molecular detection methods and may also impact on BPIV-3 vaccine formulations.


The effect of defoliation on Amarillo (Arachis pintoi cv. Amarillo) was studied in a glasshouse and in mixed swards with 2 tropical grasses. In the glasshouse, Amarillo plants grown in pots were subjected to a 30/20°C or 25/15°C temperature regime and to defoliation at 10-, 20- or 30-day intervals for 60 days. Two field plot studies were conducted on Amarillo with either irrigated kikuyu (Pennisetum clandestinum) in autumn and spring or dryland Pioneer rhodes grass (Chloris gayana) over summer and autumn. Treatments imposed were 3 defoliation intervals (7, 14 and 28 days) and 2 residual heights (5 and 10 cm for kikuyu; 3 and 10 cm for rhodes grass) with extra treatments (56 days to 3 cm for both grasses and 21 days to 5 cm for kikuyu). Defoliation interval had no significant effect on accumulated Amarillo leaf dry matter (DM) at either temperature regime. At the higher temperature, frequent defoliation reduced root dry weight (DW) and increased crude protein (CP) but had no effect on stolon DW or in vitro organic matter digestibility (OMD). On the other hand, at the lower temperature, frequent defoliation reduced stolon DW and increased OMD but had no effect on root DW or CP. Irrespective of temperature and defoliation, water-soluble carbohydrate levels were higher in stolons than in roots (4.70 vs 3.65%), whereas for starch the reverse occurred (5.37 vs 9.44%). Defoliating the Amarillo-kikuyu sward once at 56 days to 3 cm produced the highest DM yield in autumn and spring (582 and 7121 kg/ha DM, respectively), although the Amarillo component and OMD were substantially reduced. Highest DM yields (1726 kg/ha) were also achieved in the Amarillo-rhodes grass sward when defoliated every 56 days to 3 cm, although the Amarillo component was unaffected. In a mixed sward with either kikuyu or rhodes grass, the Amarillo component in the sward was maintained up to a 28-day defoliation interval and was higher when more severely defoliated.
The results show that Amarillo can tolerate frequent defoliation and that it can co-exist with tropical grasses of differing growth habits, provided the Amarillo-tropical grass sward is subject to frequent and severe defoliation.


An experiment using herds of ~20 cows (farmlets) assessed the effects of high stocking rates on production and profitability of feeding systems based on dryland and irrigated perennial ryegrass-based pastures in a Mediterranean environment in South Australia over 4 years. A target level of milk production of 7000 L/cow.year was set, based on predicted intakes of 2.7 t DM/cow.year as concentrates, pasture intakes from 1.5 to 2.7 t/cow.year and purchased fodder. In years 1 and 2, up to 1.5 t DM/cow.year of purchased fodder was used and in years 3 and 4 the amounts were increased if necessary to enable levels of milk production per cow to be maintained at target levels. Cows in dryland farmlets calved in March to May inclusive and were stocked at 2.5, 2.9, 3.3, 3.6 and 4.1 cows/ha, while those in irrigated farmlets calved in August to October inclusive and were stocked at 4.1, 5.2, 6.3 and 7.4 cows/ha. In the first 2 years, when inputs of purchased fodder were limited, milk production per cow was reduced with higher stocking rates (P < 0.01), but in years 3 and 4 there were no differences. Mean production was 7149 kg/cow.year in years 1 and 2, and 8162 kg/cow.year in years 3 and 4. Production per hectare was very closely related to stocking rate in all years (P < 0.01), increasing from 18 to 34 t milk/ha.year for dryland farmlets (1300 to 2200 kg milk solids/ha) and from 30 to 60 t milk/ha.year for irrigated farmlets (2200 to 4100 kg milk solids/ha). Almost all of these increases were attributed to the increases in grain and purchased fodder inputs associated with the increases in stocking rate. Net pasture accumulation rates and pasture harvest were generally not altered with stocking rate, though as stocking rate increased there was a change to more of the pasture being grazed and less conserved in both dryland and irrigated farmlets. Total pasture harvest averaged ~8 and 14 t DM/ha.year for dryland and irrigated pastures, respectively.
An exception was at the highest stocking rate under irrigation, where pugging during winter was associated with a 14% reduction in annual pasture growth. There were several indications that these high stocking rates may not be sustainable without substantial changes in management practice. There were large and positive nutrient balances and associated increases in soil mineral content (P < 0.01), especially for phosphorus and nitrate nitrogen, with both stocking rate and succeeding years. Levels under irrigation were considerably higher (up to 90 and 240 mg/kg of soil for nitrate nitrogen and phosphorus, respectively) than under dryland pastures (60 and 140 mg/kg, respectively). Soil organic carbon levels did not change with stocking rate, indicating a high level of utilisation of forage grown. Weed ingress was also high (up to 22% of DM) in all treatments and especially in heavily stocked irrigated pastures during winter. It was concluded that the higher stocking rates used exceeded those feasible for Mediterranean pastures in this environment, and upper levels of stocking are suggested to be 2.5 cows/ha for dryland pastures and 5.2 cows/ha for irrigated pastures. To sustain these suggested stocking rates will require further development of management practices to avoid large increases in soil minerals and weed invasion of pastures.


Listeria and Salmonella are important foodborne pathogens normally associated with the shrimp production chain. This study investigated the potential of Salmonella Typhimurium, Salmonella Senftenberg, and Listeria monocytogenes (Scott A and V7) to attach to and colonize shrimp carapace. Attachment and colonization of Listeria and Salmonella were demonstrated. Shrimp abdominal carapaces showed higher levels of bacterial attachment (P < 0.05) than did head carapaces. Listeria consistently exhibited greater attachment (P < 0.05) than did Salmonella on all surfaces. Chitinase activity of all strains was tested but was not detected at any of the three temperatures (10, 25 and 37 degrees C) tested. The surface physicochemical properties of bacterial cells and shrimp carapace were studied to determine their role in attachment and colonization. Salmonella had a significantly (P < 0.05) more positive cell surface charge (-3.9 and -6.0 mV) than Listeria (-18.0 and -22.8 mV). Both bacterial species were found to be hydrophilic (<35%) when measured by the bacterial adherence to hydrocarbon method and by contact angle (theta) measurements (Listeria, 21.3 and 24.8 degrees, and Salmonella, 14.5 and 18.9 degrees). The percentage of cells retained by Phenyl-Sepharose was lower for Salmonella (12.8 to 14.8%) than it was for Listeria (26.5 to 31.4%). The shrimp carapace was found to be hydrophobic (theta = 74.5 degrees), and a significant (P < 0.05) difference in surface roughness between carapace types was noted. There was a linear correlation between bacterial cell surface charge (r^2 = 0.95) and hydrophobicity (r^2 = 0.85) and initial attachment (P < 0.05) of Listeria and Salmonella to carapaces. However, the same properties could not be related to subsequent colonization.


The present study set out to test, through field and simulation studies, the hypothesis that the incorporation of short-term summer legumes, particularly the annual legume lablab (Lablab purpureus cv. Highworth), in a fallow-wheat cropping system will improve the overall economic and environmental benefits in south-west Queensland. Replicated, large plot experiments were established at five commercial properties using their machinery, and two smaller plot experiments were established at two intensively researched sites (Roma and St George). A detailed study on various other biennial and perennial summer forage legumes in rotation with wheat, as influenced by phosphorus (P) supply (10 and 40 kg P/ha), was also carried out at the two research sites. The other legumes were lucerne (Medicago sativa), butterfly pea (Clitoria ternatea) and burgundy bean (Macroptilium bracteatum). After legumes, spring wheat (Triticum aestivum) was sown into the legume stubble. The annual lablab produced the highest forage yield, whereas germination, establishment and production of the other biennial and perennial legumes were poor, particularly in the red soil at St George. At the commercial sites, only lablab-wheat rotations were tested, with an increased supply of P to the subsurface soil (20 kg P/ha). The lablab grown at the commercial sites yielded between 3 and 6 t/ha of forage over 2-3 month periods, whereas the following wheat crop, with no applied fertiliser, yielded between 0.5 and 2.5 t/ha. The wheat following lablab yielded 30% less, on average, than the wheat in a fallow plot, yet the profitability of wheat following lablab was slightly higher than that of wheat following fallow because of the greater costs associated with fallow management. The profitability of the lablab-wheat phase was determined after accounting for the input costs and additional costs associated with the management of fallow and in-crop herbicide applications for a fallow-wheat system.
The economic and environmental benefits of forage lablab and wheat cropping were also assessed through simulations over a long-term climatic pattern, using economic (PreCAPS) and biophysical (Agricultural Production Systems Simulator, APSIM) decision support models. Analysis of the long-term rainfall pattern (70% in summer and 30% in winter) and simulation studies indicated that ~50% of the time a wheat crop would not be planted or would fail to produce a profitable crop (grain yield less than 1 t/ha) because of low and unreliable rainfall in winter, whereas forage lablab in summer would produce a profitable crop (forage yield of more than 3 t/ha) ~90% of the time. Only 14 wheat crops (of 26 growing seasons, i.e. 54%) were profitable, compared with 22 forage lablab crops (of 25 seasons, i.e. 90%). An opportunistic double-cropping of lablab in summer and wheat in winter is also viable and profitable in 50% of years. Simulation studies also indicated that opportunistic lablab-wheat cropping can reduce the potential runoff + drainage by more than 40% in the Roma region, leading to improved economic and environmental benefits.