18 results for Type 6 Secretion systems
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Anaplasma marginale is the most prevalent tick-borne livestock pathogen and poses a significant threat to the cattle industry. In contrast to the currently available live blood-derived vaccines against A. marginale, safer and better-defined subunit vaccines would be of great significance. Two proteins (VirB9-1 and VirB9-2) from the Type IV secretion system of A. marginale have been shown to induce humoral and cellular immunity. In this study, Escherichia coli was used to express the VirB9-1 and VirB9-2 proteins. Silica vesicles with a thin wall of 6 nm and a pore size of 5.8 nm were used as the carrier and adjuvant to deliver these two antigens, both as individual and as mixed nano-formulations. High loading capacity was achieved for both proteins, and a mouse immunisation trial with the individual as well as the mixed nano-formulations showed high antibody titres (over 10^7) and strong T-cell responses. The mixed nano-formulation also stimulated high-level recall responses in bovine T-cell proliferation assays. These results open a promising path towards the development of efficient A. marginale vaccines and provide a better understanding of the role of silica vesicles in delivering multivalent vaccines as mixed nano-formulations able to activate both B-cell and T-cell immunity, for improved animal health.
Abstract:
Dairy farms located in the subtropical cereal belt of Australia rely on winter and summer cereal crops, rather than pastures, for their forage base. Crops are mostly established in tilled seedbeds, and the system is vulnerable to fertility decline and water erosion, particularly over summer fallows. Field studies were conducted over 5 years on contrasting soil types, a Vertosol and a Sodosol, in the 650-mm annual-rainfall zone to evaluate the benefits of a modified cropping program on forage productivity and the soil-resource base. Growing forage sorghum as a double-crop with oats increased total mean annual production over that of winter sole-crop systems by 40% and 100% on the Vertosol and Sodosol sites, respectively. However, mean annual winter crop yield was halved and overall forage quality was lower. Ninety per cent of the variation in winter crop yield was attributable to fallow and in-crop rainfall. Replacing forage sorghum with the annual legume lablab reduced fertiliser nitrogen (N) requirements and increased forage N concentration, but reduced overall annual yield. Compared with sole-cropped oats, double-cropping reduced the risk of erosion by extending the duration of soil water deficits and increasing the time the ground was under plant cover. When grown as a sole crop, well fertilised forage sorghum achieved a mean annual cumulative yield of 9.64 and 6.05 t DM/ha on the Vertosol and Sodosol, respectively, about twice that of sole-cropped oats. Forage sorghum established using zero-tillage practices and fertilised at 175 kg N/ha per crop achieved a significantly higher yield and forage N concentration than did the industry-standard forage sorghum (conventional tillage and 55 kg N/ha per crop) on the Vertosol but not on the Sodosol. On the Vertosol, mean annual yield increased from 5.65 to 9.64 t DM/ha (33 kg DM/kg N fertiliser applied above the base rate); the difference in the response between the two sites was attributed to soil type and fertiliser history.
Changing both tillage practices and N-fertiliser rate had no effect on fallow water-storage efficiency but did improve fallow ground cover. When forage sorghum, grown as a sole crop, was replaced with lablab in 3 of the 5 years, overall forage N concentration increased significantly, and on the Vertosol, yield and soil nitrate-N reserves also increased significantly relative to industry-standard sorghum. All forage systems maintained or increased the concentration of soil nitrate-N (0-1.2-m soil layer) over the course of the study. Relative to sole-crop oats, alternative forage systems were generally beneficial to the concentration of surface-soil (0-0.1 m) organic carbon, and systems that included sorghum showed most promise for increasing soil organic carbon concentration. We conclude that an emphasis on double- or summer sole-cropping rather than winter sole-cropping will benefit both farm productivity and the soil-resource base.
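The reported Vertosol response can be sanity-checked with simple arithmetic; the sketch below (illustrative only, not the authors' analysis) recomputes the marginal forage response from the yields and N rates quoted in the abstract.

```python
# Back-of-envelope check of the reported N-fertiliser response on the Vertosol.
# All figures come from the abstract; the helper function itself is illustrative.

def n_response(yield_low_t, yield_high_t, n_low_kg, n_high_kg):
    """Marginal forage response in kg DM per kg of extra fertiliser N."""
    extra_dm_kg = (yield_high_t - yield_low_t) * 1000  # t DM/ha -> kg DM/ha
    extra_n_kg = n_high_kg - n_low_kg
    return extra_dm_kg / extra_n_kg

# Industry-standard rate (55 kg N/ha per crop) vs modified rate (175 kg N/ha per crop)
response = n_response(5.65, 9.64, 55, 175)
print(round(response))  # 33, matching the reported 33 kg DM/kg N
```

(9.64 - 5.65) t DM/ha over the extra 120 kg N/ha gives 3990/120 ≈ 33 kg DM per kg of additional N, consistent with the figure in parentheses above.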
Abstract:
The Mycoplasma hyopneumoniae ribonucleotide reductase R2 subunit (NrdF) gene fragment was cloned into eukaryotic and prokaryotic expression vectors and its immunogenicity evaluated in mice immunized orally with attenuated Salmonella typhimurium aroA CS332 harboring either of the recombinant expression plasmids. We found that NrdF is highly conserved among M. hyopneumoniae strains. The immunogenicity of NrdF was examined by analyzing antibody responses in sera and lung washes, and the cell-mediated immune (CMI) response was assessed by determining the IFN-γ level produced by splenocytes upon in vitro stimulation with NrdF antigen. S. typhimurium expressing NrdF encoded by the prokaryotic expression plasmid (pTrcNrdF) failed to elicit an NrdF-specific serum or secretory antibody response, and IFN-γ was not produced. Similarly, S. typhimurium carrying the eukaryotic recombinant plasmid encoding NrdF (pcNrdF) did not induce a serum or secretory antibody response, but did elicit significant NrdF-specific IFN-γ production, indicating induction of a CMI response. However, analysis of immune responses against the live vector S. typhimurium aroA CS332 showed a serum IgG response but no mucosal IgA response, in spite of its efficient invasiveness in vitro. In the present study we show that a DNA vaccine encoding an M. hyopneumoniae antigen delivered orally via live attenuated S. typhimurium aroA can induce a cell-mediated immune response. We also show that different live bacterial vaccine carriers may influence the type of immune response induced.
Abstract:
An urgent need exists for indicators of soil health and patch functionality in extensive rangelands that can be measured efficiently and at low cost. Soil mites are candidate indicators, but their identification and handling are so specialised and time-consuming that their inclusion in routine monitoring is unlikely. The aim of this study was to measure the relationship between patch type and mite assemblages using a conventional approach. An additional aim was to determine whether a molecular approach traditionally used for soil microbes could be adapted for soil mites to overcome some of the bottlenecks associated with soil fauna diversity assessment. Soil mite species abundance and diversity were measured using conventional ecological methods in soil from patches with perennial grass and litter cover (PGL), and compared to soil from bare patches with annual grasses and/or litter cover (BAL). Soil mite assemblages were also assessed using a molecular method called terminal-restriction fragment length polymorphism (T-RFLP) analysis. The conventional data showed a relationship between patch type and mite assemblage. The Prostigmata and Oribatida were well represented in the PGL sites, particularly the Aphelacaridae (Oribatida). For T-RFLP analysis, the mite community was represented by a series of DNA fragment lengths that reflected mite sequence diversity. The T-RFLP data showed a distinct difference in the mite assemblage between the patch types. Where possible, T-RFLP peaks were matched to mite families using a reference 18S rDNA database, and the Aphelacaridae prevalent in the conventional samples at PGL sites were identified, as were prostigmatids and oribatids. We identified limits to the T-RFLP approach; these included an inability to distinguish some species whose DNA sequences were similar.
Despite these limitations, the data still showed a clear difference between sites, and the molecular taxonomic inferences also compared well with the conventional ecological data. The results from this study indicated that the T-RFLP approach was effective in measuring mite assemblages in this system. The power of this technique lies in the fact that species diversity and abundance data can be obtained quickly: the time taken to process hundreds of samples, from soil DNA extraction to data output on the gene analyser, can be as little as 4 days.
Abstract:
Remote drafting technology now available for sheep allows targeted supplementation of individuals within a grazing flock. This paper reports results of three experiments. Experiment 1 examined the weight change of Merino wethers allowed access to either lupin grain or whole cottonseed 0, 1, 2 or 7 days/week for 6 weeks. Experiment 2 examined the weight change of Merino wethers allowed access to either lupins or a sorghum + cottonseed meal (CSM) supplement 0, 2, 4 or 7 days/week for 8 weeks. Experiment 3 investigated the relationship between five allocations of trough space at the supplement self-feeders (5–50 cm/sheep) and the weight change of Merino wethers allowed access to lupins 1 day/week for 8 weeks. In all experiments, the Merino wethers had free access as a single group to drinking water and low quality hay in a large group pen and were allowed access to supplement once per day on their scheduled days of access. No water was available in the areas containing supplement, but one-way flow gates allowed animals to return to the group pen in their own time. There was a linear response in growth rate to increased frequency of access to lupins in Experiments 1 and 2, with each additional day of access increasing liveweight gain by 26 and 21 g/day, respectively. Similarly, the response to the sorghum + CSM supplement was linear, although significantly lower (P < 0.05), at 12 g/day. Providing access to whole cottonseed resulted in no significant change in growth rate compared with the control animals. In Experiment 3, decreasing trough space from 50 to 5 cm/sheep had no effect on sheep liveweight change. It was concluded that the relationships developed here, for growth response to increased frequency of access to lupins or a sorghum + CSM supplement, could be used to indicate the most appropriate frequency of access to supplement, through a remote drafting unit, to achieve sheep weight change targets. 
It was also concluded that a trough space of 5 cm/sheep appears adequate in this supplementation system.
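The linear responses reported above lend themselves to a simple predictive calculation; the sketch below is a hypothetical helper built only from the slopes quoted in the abstract (26, 21 and 12 g/day per extra day of weekly access), not the authors' statistical model.

```python
# Illustrative sketch: predicted extra liveweight gain over zero-access controls,
# using the linear slopes reported in the abstract (g/day per extra day of
# weekly access to supplement). The dictionary keys are hypothetical labels.

SLOPES_G_PER_DAY = {
    "lupins_exp1": 26,   # lupins, Experiment 1
    "lupins_exp2": 21,   # lupins, Experiment 2
    "sorghum_csm": 12,   # sorghum + cottonseed meal, Experiment 2
}

def extra_gain_g_per_day(supplement, days_access_per_week):
    """Extra liveweight gain (g/day) relative to controls, assuming linearity."""
    return SLOPES_G_PER_DAY[supplement] * days_access_per_week

# e.g. lupins offered 2 days/week in Experiment 1:
print(extra_gain_g_per_day("lupins_exp1", 2))  # 52 g/day above controls
```

A manager could invert the same relationship to choose the fewest days of drafting-gate access that meets a target weight-gain schedule.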
Abstract:
In dryland cotton cropping systems, the main weeds and the effectiveness of management practices were identified, and the economic impact of weeds was estimated, using information collected in a postal survey and a field survey of southern Queensland and northern New South Wales. Forty-eight completed questionnaires were returned, and 32 paddocks were monitored in early and late summer for weed species and density. The main problem weeds were bladder ketmia (Hibiscus trionum), common sowthistle (Sonchus oleraceus), barnyard grasses (Echinochloa spp.), liverseed grass (Urochloa panicoides) and black bindweed (Fallopia convolvulus), but the relative importance of these differed with crops, fallows and crop rotations. The weed flora was diverse, with 54 genera identified in the field survey. Control of weed growth in rotational crops and fallows depended largely on herbicides, particularly glyphosate in fallow and atrazine in sorghum, although effective control was not consistently achieved. Weed control in dryland cotton involved numerous combinations of selective herbicides, several non-selective herbicides, inter-row cultivation and some manual chipping. Despite this, residual weeds were found at 38-59% of initial densities in about three-quarters of the survey paddocks. The on-farm financial costs of weeds ranged from $148 to $224/ha per year depending on the rotation, resulting in an estimated annual economic cost of $19.6 million. The approach of managing weed populations across the whole cropping system needs wider adoption to reduce the weed pressure in dryland cotton and the economic impact of weeds in the long term. Strategies that optimise herbicide performance and minimise the return of weed seed to the soil are needed. Data from the surveys provide direction for research to improve weed management in this cropping system. The economic framework provides a valuable means of evaluating likely future returns from new technologies or weed management improvements.
Abstract:
The present study set out to test the hypothesis, through field and simulation studies, that incorporating short-term summer legumes, particularly the annual legume lablab (Lablab purpureus cv. Highworth), in a fallow-wheat cropping system will improve the overall economic and environmental benefits in south-west Queensland. Replicated, large-plot experiments were established at five commercial properties using the properties' own machinery, and two smaller plot experiments were established at two intensively researched sites (Roma and St George). A detailed study of various other biennial and perennial summer forage legumes in rotation with wheat, as influenced by phosphorus (P) supply (10 and 40 kg P/ha), was also carried out at the two research sites. The other legumes were lucerne (Medicago sativa), butterfly pea (Clitoria ternatea) and burgundy bean (Macroptilium bracteatum). After the legumes, spring wheat (Triticum aestivum) was sown into the legume stubble. The annual lablab produced the highest forage yield, whereas germination, establishment and production of the other biennial and perennial legumes were poor, particularly in the red soil at St George. At the commercial sites, only lablab-wheat rotations were tested, with an increased supply of P in the subsurface soil (20 kg P/ha). The lablab grown at the commercial sites yielded between 3 and 6 t/ha of forage over 2-3-month periods, whereas the following wheat crop, with no applied fertiliser, yielded between 0.5 and 2.5 t/ha. The wheat following lablab yielded 30% less, on average, than the wheat in a fallow plot, yet the profitability of wheat following lablab was slightly higher than that of wheat following fallow because of the greater costs associated with fallow management. The profitability of the lablab-wheat phase was determined after accounting for the input costs and the additional costs associated with fallow management and in-crop herbicide applications in a fallow-wheat system.
The economic and environmental benefits of forage lablab and wheat cropping were also assessed through simulations over a long-term climatic pattern using economic (PreCAPS) and biophysical (Agricultural Production Systems Simulator, APSIM) decision-support models. Analysis of the long-term rainfall pattern (70% in summer and 30% in winter) and the simulation studies indicated that ~50% of the time a wheat crop would not be planted, or would fail to produce a profitable crop (grain yield less than 1 t/ha), because of low and unreliable winter rainfall. In contrast, forage lablab in summer would produce a profitable crop, with a forage yield of more than 3 t/ha, ~90% of the time. Only 14 wheat crops (of 26 growing seasons, i.e. 54%) were profitable, compared with 22 forage lablab crops (of 25 seasons, i.e. ~90%). An opportunistic double-cropping of lablab in summer and wheat in winter is also viable and profitable in 50% of years. The simulation studies also indicated that opportunistic lablab-wheat cropping can reduce potential runoff + drainage by more than 40% in the Roma region, leading to improved economic and environmental benefits.
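The quoted success rates follow directly from the simulated crop counts; the short sketch below (illustrative only) recomputes them as simple proportions. Note that 22 of 25 is 88%, which the abstract rounds to ~90%.

```python
# Recompute the profitable-season frequencies from the counts in the abstract.
# This is a simple proportion check, not the PreCAPS/APSIM analysis itself.

def pct(successes, seasons):
    """Percentage of seasons in which the crop was profitable, rounded."""
    return round(100 * successes / seasons)

wheat_pct = pct(14, 26)    # profitable wheat crops
lablab_pct = pct(22, 25)   # profitable forage lablab crops
print(wheat_pct, lablab_pct)  # 54 88
```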
Abstract:
Because of the variable and changing environment, advisors and farmers are seeking systems that provide risk management support at a number of time scales. The Agricultural Production Systems Research Unit, Toowoomba, Australia, has developed a suite of tools to assist advisors and farmers to better manage risk in cropping. These tools range from simple rainfall analysis tools (Rainman, HowWet, HowOften) through crop simulation tools (WhopperCropper and YieldProphet) to the most complex, APSFarm, a whole-farm analysis tool. Most are derivatives of the APSIM crop model. These tools encompass a range of complexity and potential benefit both to the farming community and to government policy. This paper describes the development and usage of two specific products: WhopperCropper and APSFarm. WhopperCropper facilitates simulation-aided discussion of growers' exposure to risk when comparing alternative crop input options. The user can readily generate 'what-if' scenarios that separate the major influences while holding other factors constant. Interactions of the major inputs can also be tested. A manager can examine the effects of input levels (and Southern Oscillation Index phase) to broadly determine input levels that match their attitude to risk. APSFarm has been used to demonstrate that management changes can have different effects over short and long time periods. It can be used to test local advisors' and farmers' knowledge and experience of their desired rotation system. This study has shown that crop type has a larger influence than more conservative minimum soil water triggers in the long term. However, in short-term dry periods, minimum soil water triggers and maximum areas of the various crops can give significant financial gains.
Abstract:
Investigation of Pimelea elongata ("Lakebed Pimelea") afforded 18 tigliane- and daphnane-type diterpenes (1-18). Eight of these were new compounds: four (1-3, 5) tigliane esters and four (7, 8, 10, 11) daphnane orthoesters. The 10 known compounds were 12-O-decanoylphorbol-13-acetate (4), P. simplex subtoxin B (6), wikstroelide E (9), pimelotides A and B (12, 13), gnidiglaucin (14), simplexin (15), huratoxin (16), kirkinine D (17), and 12-beta-acetoxyhuratoxin (18). The structures and relative configurations of the new compounds were determined by 1D and 2D NMR spectroscopic studies in combination with MS analyses.
Abstract:
Runoff, soil loss, and nutrient loss were assessed on a Red Ferrosol in tropical Australia over 3 years. The experiment was conducted using bounded, 100-m^2 field plots cropped to peanuts, maize, or grass. A bare plot, without cover or crop, was also included as an extreme treatment. Results showed the importance of cover in reducing runoff, soil loss, and nutrient loss from these soils. Runoff ranged from 13% of incident rainfall under conventional cultivation to 29% under bare conditions during the highest-rainfall year, and was well correlated with event rainfall and rainfall energy. Soil loss ranged from 30 t/ha per year under bare conditions to <6 t/ha per year under cropping. Nutrient losses of 35 kg N and 35 kg P/ha per year under bare conditions and 17 kg N and 11 kg P/ha per year under cropping were measured. Soil carbon analyses showed a relationship with treatment runoff, suggesting that soil properties influenced the rainfall-runoff response. The cropping systems model PERFECT was calibrated using runoff, soil loss, and soil water data. Runoff and soil loss showed good agreement with observed data in the calibration, and soil water and yield showed reasonable agreement. Long-term runs using historical weather data showed the episodic nature of runoff and soil loss events in this region and emphasise the need to manage land using protective measures such as conservation cropping practices. Farmers involved in related action-learning activities wished to incorporate conservation cropping findings into their systems but also needed clear production benefits to hasten practice change.
Abstract:
In 2001 a scoping study (Phase I) was commissioned to determine and prioritise the weed issues of cropping systems with dryland cotton. The main findings were that the weed flora was diverse, the cropping systems complex, and weeds had a major financial and economic impact. Phase II, 'Best weed management strategies for dryland cropping systems with cotton', focused on improved management of the key weeds: bladder ketmia, sowthistle, fleabane, barnyard grass and liverseed grass. In Phase III, 'Improving management of summer weeds in dryland cropping systems with cotton', more information on the seed-bank dynamics of key weeds was gained in six pot and field studies. The studies found that these characteristics differed between species and, in the case of bladder ketmia, with climate. Species such as sowthistle, fleabane and barnyard grass emerged predominantly from the surface soil. Sweet summer grass was also in this category, but a significant proportion also emerged from 5 cm depth. Bladder ketmia in central Queensland emerged mainly from the top 2 cm, whereas in southern Queensland it emerged mainly from 5 cm. Liverseed grass had its highest emergence from 5 cm below the surface. In all cases the persistence of seed increased with increasing soil depth. Fleabane was also found to be sensitive to soil type, with no seedlings emerging in the self-mulching black vertisol soil. A strategic tillage trial showed that burial of fleabane seed, using a disc or chisel plough, to a depth of greater than 2 cm can significantly reduce subsequent fleabane emergence. In contrast, tillage increased barnyard grass emergence and tended to decrease persistence.
This research showed that weed management plans cannot be applied as a blanket across all weed species; rather, they need to be targeted at each main weed species. This project has also resulted in an increased knowledge of how to manage fleabane, from eight experiments: one in wheat, two in sorghum, one in cotton and three in fallow on double knock. For summer crops, the best option is to apply a highly effective fallow treatment prior to sowing the crops. For winter crops, the strategy is the integration of competitive crops and a residual herbicide, followed by a knockdown to control survivors. This project further explored the usefulness of the double knock tactic for weed control and preventing seed set. Two field experiments and one pot experiment showed that this tactic was highly effective for fleabane control. Paraquat products provided good control when they followed glyphosate. When 2,4-D was added in a tank mix with glyphosate and followed by paraquat products, 99-100% control was achieved in all cases. The ideal follow-up times for paraquat products after glyphosate were 5-7 days. The preferred follow-up times for 2,4-D after glyphosate were the same day and one day later. The pot trial, which compared a population from a cropping field with previous glyphosate exposure and a population from a non-cropping area with no previous glyphosate exposure, showed that previous herbicide exposure affected the response of fleabane to herbicidal control measures. The web-based brochure on managing fleabane has been updated. Knowledge on the management of summer grasses and the safe use of residual herbicides was derived from eight field and pot experiments. Residual grass and broadleaf weed control was excellent with atrazine pre-plant and at-planting treatments, provided rain was received within a short interval after application.
Highly effective fallow treatments (cultivation and double knock) not only gave excellent grass control in the fallow, but also gave very good control in the following cotton. In the five re-cropping experiments, there were no adverse impacts on cotton from atrazine, metolachlor, metsulfuron and chlorsulfuron residues following use in previous sorghum, wheat and fallows. However, imazapic residues did reduce cotton growth. The development of strategies to reduce the heavy reliance on glyphosate in our cropping systems, and therefore minimise the risk of glyphosate resistance developing, was a key factor in the research undertaken. This work included identifying suitable tactics for summer grass control, such as double knock with glyphosate followed by paraquat, and tillage. Research on fleabane also concentrated on minimising emergence through tillage and applying the double knock tactic. Our studies have shown that these strategies can be used to prevent seed set, with the goal of driving down the seed bank. Utilisation of these strategies will also reduce the reliance on glyphosate, and therefore reduce the risk of glyphosate resistance developing in our cropping systems. Information from this research, including ecological and management data collected from an additional eight paddock-monitoring sites, was also incorporated into the Weeds CRC seed-bank model "Weed Seed Wizard", which will be able to predict the impact of different management options on weed populations in cotton and grain farming systems. Extensive communication activities were undertaken throughout this project to ensure adoption of the new strategies for improved weed management and reduced risk of glyphosate resistance.
Abstract:
The distribution and density of the ampullary electroreceptors in the skin of elasmobranchs are influenced by the phylogeny and ecology of a species. Sensory maps were created for 4 species of pristid sawfish. Their ampullary pores were separated into pore fields based on their innervation and cluster formation. Ventrally, ampullary pores are located in 6 areas (5 in Pristis microdon), covering the rostrum and head to the gills. Dorsally, pores are located in 4 areas (3 in P. microdon), which cover the rostrum and head and may extend slightly onto the pectoral fins. In all species, the highest number of pores is found on the dorsal and ventral sides of the rostrum. The high density of pores along the rostrum, combined with the low density around the mouth, could indicate that sawfish use their rostrum to stun prey before ingesting it, but this hypothesis remains to be tested. The directions of the ampullary canals on the ventral side of the rostrum are species specific. P. microdon possesses the highest number of ampullary pores, indicating that, amongst the study species, it is an electroreception specialist; consistent with this, juvenile P. microdon inhabit low-visibility freshwater habitats.
Abstract:
Fruit size and quality are major problems in early-season stonefruit cultivars grown in Australia and South-East Asia. In Australia, Thailand and Vietnam, new training and trellising systems are being developed to improve yield and fruit quality. Australian trials found that new training systems, such as the Open Tatura system, are more productive than standard vase-trained trees. We established new crop-loading indices for low-chill stonefruit to provide a guide for optimum fruit thinning based on fruit number per canopy surface and butt cross-sectional area. Best management practices were developed for low-chill stonefruit cultivation using growth retardants, optimizing leaf nitrogen concentrations and controlling the rates and timing of irrigation. Regulated deficit irrigation (RDI) improved fruit sugar concentrations by restricting water application during stage II of fruit growth. New pest and disease control measures are also being developed using a new generation of fruit fly baits. Soft insecticides such as spinosad are used at significantly lower concentrations and have lower mammalian toxicity than the organophosphates currently registered in Australia. In addition, fruit fly exclusion netting effectively eliminated fruit fly and many other insect pests from the orchard with no increase in diseases. This netting system increased sugar concentrations of peach and nectarine by as much as 30%. Economic analyses showed that the break-even point can be reduced from 12 to 6 years with Open Tatura trellising and exclusion netting.
Abstract:
The aim of this study was to investigate the effects of treating Bos indicus heifers with different combinations of intra-vaginal progesterone-releasing devices (IPRD), oestradiol benzoate (ODB), PGF2α and eCG on follicle-stimulating hormone (FSH) secretion and dominant follicle (DF) growth. Two-year-old Brahman (BN; n = 30) and Brahman-cross (BNX; n = 34) heifers were randomly allocated to four treatments: (i) standard-dose IPRD [CM 1.56 g; 1.56 g progesterone (P4); n = 17]; (ii) half-dose IPRD (CM 0.78 g; 0.78 g P4; n = 15); (iii) half-dose IPRD + 300 IU eCG at IPRD removal (CM 0.78 g + G; n = 14); and (iv) a non-IPRD control (2 × PGF2α; n = 18), given 500 μg cloprostenol on Days -16 and -2. IPRD-treated heifers received 250 μg PGF2α at IPRD insertion (Day -10) and IPRD removal (Day -2), and 1 mg ODB on Day -10 and Day -1. Follicular dynamics were monitored daily by trans-rectal ultrasonography from Day -10 to Day 1. Blood samples for determination of P4 were collected daily, and samples for FSH determination were collected at 12-h intervals from Day -9 to Day -2. A significant surge in FSH concentration was observed in the 2 × PGF2α treatment 12 h prior to and 48 h after follicular wave emergence, but not in the IPRD-treated heifers. Estimated mean total plasma P4 concentration during the 8 days of IPRD insertion was greater (P < 0.001) in the CM 1.56 g treated heifers than in the CM 0.78 g treated heifers (18.38 ng/ml compared with 11.09 ng/ml, respectively). A treatment-by-genotype interaction (P = 0.036) was observed in mean plasma P4 concentration in heifers with no CL during IPRD insertion, whereby BN heifers in the CM 1.56 g treatment had greater plasma P4 than BNX heifers on Days -9, -7, -6, -5 and -4. However, there was no genotype effect in the CM 0.78 g ± G or the 2 × PGF2α treatments.
Treatment had no effect on DF growth from either the day of wave emergence (P = 0.378) or the day of IPRD removal (P = 0.780) to ovulation. This study demonstrates that FSH secretion in B. indicus heifers treated with a combination of IPRDs and ODB to synchronise ovulation was suppressed during the period of IPRD insertion, but no significant effect on growth of the DF was observed. (C) 2013 Elsevier B.V. All rights reserved.
Abstract:
Scabies is an ectoparasitic infestation by the mite Sarcoptes scabiei. Although commonly self-limiting, a fraction of patients develop severely debilitating crusted scabies. The immune mechanisms underlying the development of crusted scabies are unclear, and undertaking longitudinal infection studies in humans is difficult. We utilized a porcine model to compare cellular immune responses in peripheral blood and skin of pigs with different clinical manifestations of scabies (n = 12), and in uninfected controls (n = 6). Although clinical symptoms were not evident until at least 4 weeks post-infestation, the numbers of peripheral IFNγ-secreting CD4+ T cells and γδ T cells increased in infected pigs from week 1 post-infestation. γδ T cells remained increased in the blood at week 15 post-infestation. At week 15, skin cell infiltrates from pigs with crusted scabies had significantly higher CD8+ T cell, γδ T cell and IL-17+ cell numbers than those with ordinary scabies. Peripheral IL-17 levels were not increased, suggesting that localized skin IL-17-secreting T cells may play a critical role in the pathogenesis of crusted scabies development. Given the potential of anti-IL-17 immunotherapy demonstrated for other inflammatory skin diseases, this study may provide a novel therapeutic avenue for patients with recurrent crusted scabies.