16 results for "Scale not given"

in eResearch Archive - Queensland Department of Agriculture


Relevance: 100.00%

Abstract:

In this study, nasal swabs taken from multiparous sows at weaning time or from sick pigs displaying symptoms of Glasser's disease on farms in Australia [date not given] were cultured and analysed by polymerase chain reaction (PCR). Within each genotype detected on a farm, representative isolates were serotyped by gel diffusion (GD) testing or the indirect haemagglutination (IHA) test. Isolates which did not react in any of the tests were regarded as non-typable and were termed serovar NT. Serovars 1, 5, 12, 13 and 14 were classified as highly pathogenic; serovars 2, 4 and 15 as moderately pathogenic; serovar 8 as slightly pathogenic; and serovars 3, 6, 7, 9 and 11 as non-pathogenic. Sows were inoculated 3 and 5 weeks before farrowing with the strain of Haemophilus parasuis used for controlled challenge (serovars 4, 6 and 9 from Farms 1, 2 and 4, respectively). Before farrowing, the sows were divided into control and treatment groups. Five to seven days after birth, the piglets of the treatment group were challenged with the strain from the farm which had been used to vaccinate the sows. The effectiveness of the controlled exposure was evaluated by the number of piglets displaying clinical signs possibly related to infection, the number of antibiotic treatments and pig mortality. Nasal swabs of sick pigs were taken twice a week to look for a correlation with infection. A subsample of pigs was weighed after leaving the weaning sheds. The specificity of a real-time PCR amplifying the infB gene was evaluated with 68 H. parasuis isolates and 36 strains of closely related species. A total of 239 DNA samples from tissues and fluids of 16 experimentally challenged animals were also tested with the real-time PCR, and the results compared with culture and a conventional PCR. In the farm experiments, none of the controlled-challenge pigs showed any signs of illness due to Glasser's disease, although the treatment groups required more antibiotic treatments than the controls. A total of 556 H. parasuis isolates were genotyped, while 150 isolates were serotyped. H. parasuis was detected on 19 of 20 farms, including 2 farms with an extensive history of freedom from Glasser's disease. Isolates belonging to serovars regarded as potentially pathogenic were obtained from healthy pigs at weaning on 8 of the 10 farms with a history of Glasser's disease outbreaks. Sampling 213 sick pigs yielded 115 isolates, 99 of which belonged to serovars that were either potentially pathogenic or of unknown pathogenicity. Only 16 isolates from these sick pigs were of a serovar known to be non-pathogenic. Healthy pigs also carried H. parasuis, even on farms free of Glasser's disease. The real-time PCR gave positive results for all 68 H. parasuis isolates and negative results for all 36 non-target bacteria. When used on the clinical material from experimental infections, the real-time PCR produced significantly more positive results than the conventional PCR (165 compared to 86).
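The headline comparison above (165 of 239 clinical samples positive by real-time PCR versus 86 by conventional PCR) can be sanity-checked with a two-proportion z-test using only the Python standard library. This is an illustrative sketch, not the analysis used in the study; the samples are in fact paired (each was tested by both methods), so a McNemar test on the discordant pairs would be more appropriate, but those counts are not reported in the abstract.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, p1, p2)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se, p1, p2

# Figures reported in the abstract: 165/239 real-time PCR positives
# versus 86/239 conventional PCR positives.
z, p_rt, p_conv = two_proportion_z(165, 239, 86, 239)
```

A z value well above 1.96 is consistent with the abstract's statement that the difference is significant, under the (simplifying) unpaired assumption.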

Relevance: 40.00%

Abstract:

Buffel grass [Pennisetum ciliare (L.) Link] has been widely introduced in the Australian rangelands as a consequence of its value for productive grazing, but it tends to establish competitively in non-target areas such as remnant vegetation. In this study, we examined the influence of landscape-scale and local-scale variables on the distribution of buffel grass in remnant poplar box (Eucalyptus populnea F. Muell.) dominant woodland fragments in the Brigalow Bioregion, Queensland. Buffel grass and variables thought to influence its distribution in the region were measured at 60 sites, which were selected based on the amount of native woodland retained in the landscape and on patch size. An information-theoretic modelling approach and hierarchical partitioning revealed that the most influential variable was the percentage of retained vegetation within a 1-km spatial extent. From this, we identified a critical threshold of approximately 30% retained vegetation in the landscape, above which the model predicted buffel grass was not likely to occur in a woodland fragment. Other explanatory variables in the model were site based, and included litter cover and long-term rainfall. Given the paucity of information on the effect of buffel grass upon biodiversity values, we undertook exploratory analyses to determine whether buffel grass cover influenced the distribution of grass, forb and reptile species. We detected some trends; hierarchical partitioning revealed that buffel grass cover was the most important explanatory variable describing habitat preferences of four reptile species. However, establishing causal links, particularly between native grass and forb species and buffel grass, was problematic owing to possible confounding with grazing pressure. We conclude with a set of management recommendations aimed at reducing the spread of buffel grass into remnant woodlands.
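The critical-threshold finding can be expressed as a simple decision rule. The sketch below is a hypothetical illustration of that rule only (the published model was an information-theoretic multivariable model, and the threshold is approximate):

```python
# Threshold rule distilled from the abstract: buffel grass was predicted
# unlikely to occur in a woodland fragment when retained vegetation within
# a 1-km extent exceeded ~30%. Expressing the cover as a fraction is an
# assumption made for this illustration.
RETAINED_VEG_THRESHOLD = 0.30

def buffel_likely(retained_veg_fraction):
    """True if buffel grass occurrence is predicted likely under the rule."""
    return retained_veg_fraction < RETAINED_VEG_THRESHOLD
```

For example, a fragment in a landscape with 10% retained vegetation would be flagged as at risk, while one with 45% would not.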

Relevance: 40.00%

Abstract:

A rare opportunity to test hypotheses about potential fishery benefits of large-scale closures was initiated in July 2004 when an additional 28.4% of the 348 000 km2 Great Barrier Reef (GBR) region of Queensland, Australia was closed to all fishing. Advice to the Australian and Queensland governments that supported this initiative predicted these additional closures would generate minimal (10%) initial reductions in both catch and landed value within the GBR area, with recovery of catches becoming apparent after three years. To test these predictions, commercial fisheries data from the GBR area and from the two adjacent (non-GBR) areas of Queensland were compared for the periods immediately before and after the closures were implemented. The observed means for total annual catch and value within the GBR declined from pre-closure (2000–2003) levels of 12 780 Mg and Australian $160 million, to initial post-closure (2005–2008) levels of 8143 Mg and $102 million; decreases of 35% and 36% respectively. Because the reference areas in the non-GBR had minimal changes in catch and value, the beyond-BACI (before, after, control, impact) analyses estimated initial net reductions within the GBR of 35% for both total catch and value. There was no evidence of recovery in total catch levels or any comparative improvement in catch rates within the GBR nine years after implementation. These results are not consistent with the advice to governments that the closures would have minimal initial impacts and rapidly generate benefits to fisheries in the GBR through increased juvenile recruitment and adult spillovers. Instead, the absence of evidence of recovery in catches to date currently supports an alternative hypothesis that where there is already effective fisheries management, the closing of areas to all fishing will generate reductions in overall catches similar to the percentage of the fished area that is closed.
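The net-reduction logic of the beyond-BACI comparison can be illustrated with the reported means. This toy calculation assumes a no-change control for simplicity; the published 35-36% figures come from the formal beyond-BACI analysis, so small differences from this back-of-envelope number are expected.

```python
def baci_net_change(impact_before, impact_after, control_before, control_after):
    """Net proportional change in the impact area relative to the control:
    (after/before ratio in impact) minus (after/before ratio in control)."""
    return impact_after / impact_before - control_after / control_before

# Reported GBR means: 12,780 Mg before (2000-2003) vs 8,143 Mg after
# (2005-2008). The control values here are placeholders representing the
# "minimal change" reported for the non-GBR reference areas.
net = baci_net_change(12780, 8143, 100, 100)
```

The result is a net change of roughly -0.36, i.e. about a 36% net reduction, in line with the reported estimates.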

Relevance: 30.00%

Abstract:

Nutrient mass balances have been used to assess a variety of land resource scenarios at various scales. They are widely used as a simple basis for policy, planning, and regulatory decisions, but it is not clear how accurately they reflect reality. This study provides a critique of broad-scale nutrient mass balances, with particular application to the use of beef lot-feeding manure as fertiliser in Queensland. Mass balances completed at the district and farm scale were found to misrepresent actual manure management behaviour and, potentially, the risk of nutrient contamination of water resources. The difficulties of handling stockpiled manure and concerns about soil compaction mean that manure is spread thickly over a few paddocks at a time, not evenly across a whole farm. Consequently, higher nutrient loads were applied to a single paddock less frequently than annually. This resulted in years with excess nitrogen, phosphorus, and potassium remaining in the soil profile, a conclusion supported by evidence of significant nutrient movement in several of the soil profiles studied. Spreading manure is profitable, but maximum returns can be associated with increased risk of nutrient leaching relative to conventional inorganic fertiliser practices. Bio-economic simulations found this increased risk where manure was applied to supply crop nitrogen requirements (the practice of the case study farms, 200-5000 head lot-feeders). Thus, the use of broad-scale mass balances can be misleading because paddock management is spatially heterogeneous, and this leads to increased local potential for nutrient loss. Given this effect of spatial heterogeneity, policy-makers who intend to use mass balance techniques to estimate the potential for nutrient contamination should apply them conservatively.
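The core mass-balance idea is a simple input-minus-output calculation at whatever spatial unit is chosen. The sketch below uses entirely hypothetical figures to illustrate the paper's point that a single heavily manured paddock can carry a large surplus even when a whole-farm average looks balanced.

```python
# Minimal paddock-scale nutrient balance (all figures hypothetical):
# surplus = nutrient applied in manure minus nutrient removed by the crop.
def nutrient_surplus(manure_t_per_ha, nutrient_frac, crop_removal_kg_per_ha):
    applied = manure_t_per_ha * 1000 * nutrient_frac   # kg nutrient per ha
    return applied - crop_removal_kg_per_ha

# e.g. 40 t/ha of manure at 1.5% N against a crop removing 200 kg N/ha
# leaves a 400 kg N/ha surplus in that single paddock, even though a
# whole-farm average over many unmanured paddocks might look balanced.
surplus_n = nutrient_surplus(40, 0.015, 200)
```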

Relevance: 30.00%

Abstract:

A variety of materials were trialled as supported permeable covers using a series of laboratory-scale anaerobic digesters. Efficacy of cover performance was assessed in terms of impact on odour and greenhouse gas emission rates and on the characteristics of the anaerobic liquor. Data were collected over a 12-month period. Initially the covers reduced the rate of odour emission 40-100-fold relative to uncovered digesters. After about three months, this decreased to about a threefold reduction in odour emission rate, which was maintained over the remainder of the trial. The covers did not alter methane emission rates. Carbon dioxide emission rates varied according to cover type. Performance of the covers was attributed to the physical characteristics of the cover materials and to changes in liquor composition. The reductions in odour emission indicate that these covers offer a cost-effective method for odour control.

Relevance: 30.00%

Abstract:

Controlled traffic (matching wheel and row spacing) is being promoted as a means to manage soil compaction in the Australian sugar industry. However, machinery limitations dictate that wider row spacings than the standard 1.5-m single row will need to be adopted to incorporate controlled traffic, and many growers are reluctant to widen row spacing for fear of yield penalties. To address these concerns, contrasting row configuration and planting density combinations were investigated for their effect on cane and sugar yield in large-scale experiments in the Gordonvale, Tully, Ingham, Mackay, and Bingera (near Bundaberg) sugarcane-growing regions of Queensland, Australia. The results showed that sugarcane possesses a capacity to compensate for different row configurations and planting densities through variation in stalk number and individual stalk weight. Row configurations ranging from 1.5-m single rows (the current industry standard) to 1.8-m dual rows (50 cm between duals), 2.1-m dual (80 cm between duals) and triple (65 cm between triples) rows, and 2.3-m triple rows (65 cm between triples) produced similar yields. Four rows (50 cm apart) on a 2.1-m configuration (quad rows) produced lower yields, largely due to crop lodging, while a 1.8-m single row configuration produced lower yields in the plant crop, probably due to inadequate resource availability (water stress/limited radiation interception). The results suggest that controlled traffic can be adopted in the Australian sugar industry by changing from a 1.5-m single row to a 1.8-m dual row configuration without yield penalty. Further, the similar yields obtained with wider row configurations (2 m or greater with multiple rows) in these experiments emphasise the physiological and environmental plasticity that exists in sugarcane.
Controlled traffic can be implemented with these wider row configurations (>2 m), although it will be necessary to carry out expensive modifications to the current harvester and haul-out equipment. There were indications from this research that not all cultivars were suited to configurations involving multiple rows. The results suggest that consideration be given to assessing clones with different growth habits under a range of row configurations to find the most suitable plant types for controlled traffic cropping systems.

Relevance: 30.00%

Abstract:

This joint DPI/Burdekin Shire Council project assessed the efficacy of a pilot-scale biological remediation system to recover nitrogen (N) and phosphorus (P) nutrients from secondary-treated municipal wastewater at the Ayr Sewage Treatment Plant. Additionally, this study considered potential commercial uses for by-products from the treatment system. Knowledge gained from this study can provide directions for implementing a larger-scale final effluent treatment protocol on site at the Ayr plant. Trials were conducted over 10 months and assessed nutrient removal from duckweed-based treatments and an algae/fish treatment, both as sequential and as stand-alone treatment systems. A 42.3% reduction in Total N was found through the sequential treatment system (duckweed followed by algae/fish treatment) after 6.6 days Effluent Retention Time (E.R.T.). However, duckweed treatment was responsible for the majority of this nutrient recovery (7.8 times more effective than the algae/fish treatment). Likewise, Total P reduction (15.75% after 6.6 days E.R.T.) was twice as great in the duckweed treatment. A phytoplankton bloom, which developed in the algae/fish tanks, reduced nutrient recovery in this treatment. A second trial tested whether the addition of fish enhanced duckweed treatment by evaluating systems with and without fish. After four weeks' operation, low dissolved oxygen (DO) under the duckweed blanket caused fish mortalities. Decomposition of these fish added an organic load, and this was reflected in a breakdown of nitrogen species that showed an increase in organic nitrogen. However, Dissolved Inorganic Nitrogen (DIN: ammonia, nitrite and nitrate) removal was similar between treatments with and without fish (57% and 59% DIN removal from incoming, respectively). Overall, three effluent residence times were evaluated using duckweed-based treatments: 3.5 days, 5.5 days and 10.4 days. Total N removal was 37.5%, 55.7% and 70.3%, respectively. The 10.4-day E.R.T. trial, however, was evaluated by sequential nutrient removal through the duckweed-minus-fish treatment followed by the duckweed-plus-fish treatment; therefore, the 70.3% Total N removal was lower than could have been achieved at this retention time, owing to the abovementioned fish mortalities. Phosphorus removal from duckweed treatments was greatest after the 10.4-day E.R.T. (13.6%). Plant uptake was considered the most important mechanism for this P removal, since there was no clay substrate in the plastic tanks that could have contributed to P absorption as part of the natural phosphorus cycle. Duckweed inhibited phytoplankton production (thereby reducing total suspended solids) and maintained pH close to neutral. DO beneath the duckweed blanket fell below 1 ppm; however, this did not limit plant production. If fish are to be used as part of the duckweed treatment, air-uplifts can be installed to maintain DO levels without disturbing surface waters. Duckweed grown in the treatments doubled its biomass on average every 5.7 days. On a per-surface-area basis, 1.23 kg/m2 was harvested weekly. Moisture content of duckweed was 92%, equating to a total dry weight harvest of 0.098 kg/m2/week. Nutrient analysis of dried duckweed gave an N content of 6.67% and a P content of 1.27%. According to semi-quantitative analyses, harvested duckweed contained no residual elements from the effluent stream at levels greater than the ANZECC toxicant guidelines proposed for aquaculture. In addition, jade perch, a local aquaculture species, actively consumed and gained weight on harvested duckweed, suggesting potential for large-scale fish production using by-products from the effluent treatment process. This suggests that a duckweed-based system may be one viable option for tertiary treatment of Ayr municipal wastewater.

The tertiary detention lagoon proposed by the Burdekin Shire Council, consisting of six bays of approximately 290 × 35 metres (1.5 metres deep), would be suitable for duckweed culture with minor modification to facilitate the efficient distribution of duckweed plants across the entire available growing surface (such as floating containment grids). The effluent residence time resulting from this proposed configuration (~30 days) should be adequate to recover most effluent nutrients (certainly N) based on the current trial. Duckweed harvest techniques at this scale, however, need further investigation. Based on duckweed production in the current trial (1.23 kg/m2/week), a weekly harvest of approximately 75 000 kg (wet weight) could be expected from the proposed lagoon configuration under full duckweed production. A benefit of the proposed multi-bay lagoon is that full lagoon production of duckweed may not be needed to restore effluent to a desirable standard under the present nutrient load, and duckweed treatment may be restricted to certain bays. Restored effluent could be released without risk of contaminating the receiving waterway with duckweed by evacuating water through an internal standpipe located mid-way in the water column.
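The reported harvest figures are internally consistent and can be checked directly. The numbers below are taken from the trial as reported; the lagoon area is approximated from the stated bay dimensions.

```python
# Harvest arithmetic from the trial figures.
wet_harvest = 1.23            # kg/m2/week, wet weight
moisture = 0.92               # duckweed moisture content
dry_harvest = wet_harvest * (1 - moisture)       # ~0.098 kg dry matter/m2/week

# Proposed lagoon: six bays of approximately 290 m x 35 m.
lagoon_area = 6 * 290 * 35                       # m2
weekly_wet_harvest = lagoon_area * wet_harvest   # ~75,000 kg/week at full production

# Nitrogen captured per m2 per week, using the reported 6.67% N dry-matter content.
n_capture = dry_harvest * 0.0667                 # kg N/m2/week
```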

Relevance: 30.00%

Abstract:

Two prerequisites for realistically embarking upon an eradication programme are that cost-benefit analysis favours this strategy over other management options and that sufficient resources are available to carry the programme through to completion. These are not independent criteria, but it is our view that too little attention has been paid to estimating the investment required to complete weed eradication programmes. We deal with this problem by using a two-pronged approach: 1) developing a stochastic dynamic model that provides an estimation of programme duration; and 2) estimating the inputs required to delimit a weed incursion and to prevent weed reproduction over a sufficiently long period to allow extirpation of all infestations. The model is built upon relationships that capture the time-related detection of new infested areas, rates of progression of infestations from the active to the monitoring stage, rates of reversion of infestations from the monitoring to active stage, and the frequency distribution of time since last detection for all infestations. This approach is applied to the branched broomrape (Orobanche ramosa) eradication programme currently underway in South Australia. This programme commenced in 1999 and currently 7450 ha are known to be infested with the weed. To date none of the infestations have been eradicated. Given recent (2008) levels of investment and current eradication methods, model predictions are that it would take, on average, an additional 73 years to eradicate this weed at an average additional cost (NPV) of $AU67.9m. When the model was run for circumstances in 2003 and 2006, the average programme duration and total cost (NPV) were predicted to be 159 and 94 years, and $AU91.3m and $AU72.3m, respectively. The reduction in estimated programme length and cost may represent progress towards the eradication objective, although eradication of this species still remains a long term prospect.
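The duration-estimation idea can be sketched as a small stochastic state-transition simulation. All parameters below are hypothetical placeholders, not the published model's fitted relationships; the sketch only illustrates the active/monitoring/extirpated structure described above.

```python
import random

def years_to_eradication(n_active, p_control=0.3, p_revert=0.05,
                         clear_after=5, seed=1):
    """Toy stochastic eradication model (hypothetical parameters).
    Each year: an active infestation moves to the monitoring stage with
    probability p_control; a monitored infestation reverts to active with
    probability p_revert, otherwise it accrues a clean year; after
    clear_after consecutive clean years it is declared extirpated."""
    rng = random.Random(seed)
    states = ["active"] * n_active
    clean = [0] * n_active
    year = 0
    while any(s != "gone" for s in states):
        year += 1
        for i, s in enumerate(states):
            if s == "active" and rng.random() < p_control:
                states[i], clean[i] = "monitoring", 0
            elif s == "monitoring":
                if rng.random() < p_revert:
                    states[i] = "active"
                else:
                    clean[i] += 1
                    if clean[i] >= clear_after:
                        states[i] = "gone"
    return year

# Replicate runs for 20 known infestations; the spread of durations is the
# kind of output used to estimate expected programme length.
durations = [years_to_eradication(20, seed=s) for s in range(5)]
```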

Relevance: 30.00%

Abstract:

Development of new agricultural industries in northern Australia is seen as a way to provide food security in the face of reduced water availability in existing regions in the south. This report aims to identify some of the possible economic consequences of developing a rice industry in the Burdekin region while output is reduced in the Riverina. Annual rice production in the Riverina peaked at 1.7 M tonnes, but the long-term outlook, given climate change impacts on that region and government water buy-backs, is more likely to be less than 800,000 tonnes. Growers are highly efficient water users by international standards, but the ability to offset an anticipated reduction in water availability through further efficiency gains is limited. In recent years growers in the Riverina have diversified their farms to a greater extent, and secondary production systems include beef, sheep and wheat. Production in north Queensland is in its infancy, but a potentially suitable farming system has been developed by including rice within the sugarcane system, where it does not compete with sugarcane and in fact contributes to sugar production by increasing yields and controlling weeds. The economic outcomes are estimated using a large-scale, dynamic, computable general equilibrium (CGE) model of the world economy (Tasman Global), scaled down to the regional level. CGE models mimic the workings of the economy through a system of interdependent behavioural and accounting equations which are linked to an input-output database. When an economic shock or change is applied to a model, each of the markets adjusts according to a set of behavioural parameters which are underpinned by economic theory. In this study the model is driven by reducing production in the Riverina, in accordance with relationships found between water availability and the production of rice and its replacement by other crops, and by increasing rice production in the Burdekin.
Three scenarios were considered:

• Scenario 1: Rice is grown in the fallow period between the last ratoon crop of sugarcane and the new planting. In this scenario there is no competition between rice and sugarcane.
• Scenario 2: Rice displaces sugarcane production.
• Scenario 3: Rice is grown on additional land and does not compete with sugarcane.

Two time periods were used, 2030 and 2070, which are the conventional time points at which to consider climate change impacts. Under scenario 1, real economic output declines in the Riverina by $45 million in 2030 and by $139 million in 2070. This is only partially offset by increased real economic output in the Burdekin of $35 million and $131 million respectively.
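The scenario 1 results imply a small net national loss, since the Burdekin gain only partly offsets the Riverina decline. A quick check using the reported figures:

```python
# Reported scenario 1 changes in real economic output ($ million).
riverina = {"2030": -45, "2070": -139}
burdekin = {"2030": 35, "2070": 131}

# Net effect across the two regions: a loss of $10m in 2030 and $8m in 2070.
net_change = {yr: riverina[yr] + burdekin[yr] for yr in riverina}
```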

Relevance: 30.00%

Abstract:

Multi-species fisheries are complex to manage, and the ability to develop an appropriate governance structure is often seriously impeded because trade-offs between sustainability objectives at the species level, economic objectives at the fleet level, and social objectives at the community scale are complex. Many of these fisheries also tend to have a mix of information, with stock assessments available for some species and almost no information on others. The fleets themselves comprise fishers ranging from small family enterprises to large vertically integrated businesses. The Queensland trawl fishery in Australia is used as a case study for this kind of fishery. It has the added complexity that a large part of the fishery is within a World Heritage Area, the Great Barrier Reef Marine Park, which is managed by an agency of the Australian Commonwealth Government, whereas the fishery itself is managed by the Queensland State Government. A stakeholder elicitation process was used to develop social, governance, economic and ecological objectives, and then to weight their relative importance. An expert group developed different governance strawmen (management strategies), which were assessed against the objectives by a group of industry stakeholders and experts using multi-criteria decision analysis techniques. One strawman clearly provided the best overall set of outcomes given the multiple objectives, but was not optimal in terms of every objective, demonstrating that even the "best" strawman may be less than perfect. © 2012.
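Multi-criteria decision analysis in this setting often reduces to a weighted aggregation of expert scores against stakeholder-derived weights. The sketch below uses a simple weighted sum with entirely hypothetical weights and scores; the study's actual objectives, weights and scoring method are not reproduced here.

```python
# Hypothetical stakeholder weights over four objective categories (sum to 1).
weights = {"sustainability": 0.4, "economic": 0.3, "social": 0.2, "governance": 0.1}

# Hypothetical expert scores per governance strawman, on a common 0-10 scale.
scores = {
    "strawman_A": {"sustainability": 8, "economic": 5, "social": 6, "governance": 7},
    "strawman_B": {"sustainability": 6, "economic": 8, "social": 5, "governance": 6},
}

def weighted_score(option):
    """Weighted-sum aggregate score for one strawman."""
    return sum(weights[k] * scores[option][k] for k in weights)

best = max(scores, key=weighted_score)
```

Note that the "best" option here wins overall while scoring worse than its rival on the economic objective, mirroring the abstract's point that the best strawman need not be optimal on every objective.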

Relevance: 30.00%

Abstract:

Remote detection of management-related trend in the presence of inter-annual climatic variability in the rangelands is difficult. Minimally disturbed reference areas provide a useful guide, but suitable benchmarks are usually difficult to identify. We describe a method that uses a unique conceptual framework to identify reference areas from multitemporal sequences of ground cover derived from Landsat TM and ETM+ imagery. The method requires neither ground-based reference sites nor GIS layers describing management. We calculate a minimum ground cover image across all years to identify locations of most persistent ground cover in years of lowest rainfall. We then use a moving window approach to calculate the difference between the window's central pixel and its surrounding reference pixels. This difference estimates ground-cover change between successive below-average rainfall years, which provides a seasonally interpreted measure of management effects. We examine the approach's sensitivity to window size and to the cover-index percentiles used to define persistence. The method successfully detected management-related change in ground cover in Queensland tropical savanna woodlands in two case studies: (1) a grazing trial where heavy stocking resulted in substantial decline in ground cover in small paddocks, and (2) commercial paddocks where wet-season spelling (destocking) resulted in increased ground cover. At a larger scale, there was broad agreement between our analysis of ground-cover change and ground-based land condition change for commercial beef properties with different a priori ratings of initial condition, but there was also some disagreement where changing condition reflected pasture composition rather than ground cover. We conclude that the method is suitably robust to analyse grazing effects on ground cover across the 1.3 × 10^6 km^2 of Queensland's rangelands. Crown Copyright © 2012 Published by Elsevier Inc. All rights reserved.
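The moving-window comparison can be sketched in a few lines: for each pixel, the surrounding pixels in the window act as the local reference, summarised here by a high percentile of their cover values. This is a simplified pure-Python illustration on a toy grid, not the published implementation.

```python
def window_difference(cover, radius=1, pct=90):
    """For each interior pixel, difference between the pixel's cover and a
    high percentile of its surrounding window (the local reference).
    Border pixels, lacking a full window, are left as None."""
    rows, cols = len(cover), len(cover[0])
    out = [[None] * cols for _ in range(rows)]
    for r in range(radius, rows - radius):
        for c in range(radius, cols - radius):
            ref = [cover[i][j]
                   for i in range(r - radius, r + radius + 1)
                   for j in range(c - radius, c + radius + 1)
                   if (i, j) != (r, c)]
            ref.sort()
            k = min(len(ref) - 1, int(len(ref) * pct / 100))
            out[r][c] = cover[r][c] - ref[k]   # negative: below local reference
    return out

# Toy minimum-cover image: the centre pixel sits well below its neighbours,
# which the method would flag as a management-related decline.
cover = [[60, 62, 58],
         [61, 20, 59],
         [63, 60, 62]]
diff = window_difference(cover)
```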

Relevance: 30.00%

Abstract:

Introduction: Many prey species around the world are suffering declines due to a variety of interacting causes such as land use change, climate change, invasive species and novel disease. Recent studies on the ecological roles of top-predators have suggested that lethal top-predator control by humans (typically undertaken to protect livestock or managed game from predation) is an indirect additional cause of prey declines through trophic cascade effects. Such studies have prompted calls to prohibit lethal top-predator control, with the expectation that doing so will produce widespread benefits for biodiversity at all trophic levels. However, applied experiments investigating in situ responses of prey populations to contemporary top-predator management practices are few, and none have previously been conducted on the eclectic suite of native and exotic mammalian, reptilian, avian and amphibian predator and prey taxa we simultaneously assess. We conducted a series of landscape-scale, multi-year, manipulative experiments at nine sites spanning five ecosystem types across the Australian continental rangelands to investigate the responses of sympatric prey populations to contemporary poison-baiting programs intended to control top-predators (dingoes) for livestock protection.

Results: Prey populations were almost always in similar or greater abundances in baited areas. Short-term prey responses to baiting were seldom apparent. Longer-term prey population trends fluctuated independently of baiting for every prey species at all sites, and divergence or convergence of prey population trends occurred rarely. Top-predator population trends fluctuated independently of baiting in all cases and never diverged or converged. Mesopredator population trends likewise fluctuated independently of baiting in almost all cases, but did diverge or converge in a few instances.

Conclusions: These results demonstrate that Australian populations of prey fauna at lower trophic levels are typically unaffected by top-predator control, because top-predator populations are not substantially affected by contemporary control practices, thus averting a trophic cascade. We conclude that alteration of current top-predator management practices is probably unnecessary for enhancing fauna recovery in the Australian rangelands. More generally, our results suggest that the idea advanced by theoretical and observational studies, that lethal control of top-predators induces trophic cascades, may not be as universally applicable as previously supposed.

Relevance: 30.00%

Abstract:

Extensive resources are allocated to managing vertebrate pests, yet spatial understanding of pest threats, and of how they respond to management, is limited at the regional scale where much decision-making is undertaken. We provide regional-scale spatial models and management guidance for European rabbits (Oryctolagus cuniculus) in a 260,791 km^2 region in Australia by determining habitat suitability, habitat susceptibility and the effects of the primary rabbit management options (barrier fencing, shooting and baiting, and warren ripping) or of changing predation or disease control levels. A participatory modelling approach was used to develop a Bayesian network which captured the main drivers of suitability and spread, and which in turn was linked spatially to develop high-resolution risk maps. Policy-makers, rabbit managers and technical experts were responsible for defining the questions the model needed to address, and for subsequently developing and parameterising the model. Habitat suitability was determined by the conditions required for warren-building and by above-ground requirements such as food and harbour; habitat susceptibility by the distance from current distributions, habitat suitability, and the costs of traversing habitats of different quality. At least one-third of the region had a high probability of being highly suitable (i.e. able to support high rabbit densities), and the model was supported by validation. Habitat susceptibility was largely restricted by the current known rabbit distribution. Warren ripping was the most effective control option, as warrens were considered essential for rabbit persistence. The anticipated increase in disease resistance was predicted to increase the probability of moderately suitable habitat becoming highly suitable, but not to increase the at-risk area.
We demonstrate that it is possible to build spatial models to guide regional-level management of vertebrate pests which use the best available knowledge and capture fine spatial-scale processes.

Relevance: 30.00%

Abstract:

Rapid screening tests and an appreciation of the simple genetic control of Alternaria brown spot (ABS) susceptibility have existed for many years, and yet the application of this knowledge to commercial-scale breeding programs has been limited. Detached leaf assays were first demonstrated more than 40 years ago, and reliable data suggesting that a single gene determines susceptibility have been emerging for at least 20 years. However, it is only recently that the requirement for genetic resistance in new hybrids has become a priority, following increased disease prevalence in Australian mandarin production areas previously considered too dry for the pathogen. Almost all of the high-fruit-quality parents developed so far by the Queensland-based breeding program are susceptible to ABS, necessitating the screening of their progeny to avoid commercialisation of susceptible hybrids. This is done effectively and efficiently by spraying 3-6-month-old hybrid seedlings with a spore suspension derived from a toxin-producing field isolate of Alternaria alternata, then incubating these seedlings in a cool room at 25°C and high humidity for 5 days. Susceptible seedlings show clear disease symptoms and are discarded. Analysis of observed and expected segregation ratios loosely supports the hypothesis of a single dominant gene for susceptibility, but does not rule out alternative genetic models. After implementing routine screening for ABS resistance for three seasons, we now have more than 20,000 hybrids growing in field progeny blocks that have been screened for resistance to the ABS disease.
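The segregation-ratio analysis can be illustrated with a chi-square goodness-of-fit test. The seedling counts below are hypothetical; the expected 1:1 ratio assumes a cross between a parent heterozygous for a single dominant susceptibility gene and a resistant parent, one of the genetic models consistent with the abstract.

```python
def chi_square(observed, expected_ratio):
    """Chi-square goodness-of-fit statistic for observed counts against
    an expected ratio (e.g. [1, 1] for 1:1, [3, 1] for 3:1)."""
    total = sum(observed)
    ratio_sum = sum(expected_ratio)
    expected = [total * r / ratio_sum for r in expected_ratio]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts: susceptible vs resistant seedlings from one cross.
chi2 = chi_square([112, 98], [1, 1])
consistent = chi2 < 3.84    # 5% critical value for 1 degree of freedom
```

A statistic below the critical value means the counts do not reject the single-gene 1:1 model, mirroring the "loosely supports" conclusion in the abstract.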

Relevance: 30.00%

Abstract:

The tomato I-3 and I-7 genes confer resistance to Fusarium oxysporum f. sp. lycopersici (Fol) race 3 and were introgressed into the cultivated tomato, Solanum lycopersicum, from the wild relative Solanum pennellii. I-3 has been identified previously on chromosome 7 and encodes an S-receptor-like kinase, but little is known about I-7. Molecular markers have been developed for the marker-assisted breeding of I-3, but none are available for I-7. We used an RNA-seq and single nucleotide polymorphism (SNP) analysis approach to map I-7 to a small introgression of S. pennellii DNA (c. 210 kb) on chromosome 8, and identified I-7 as a gene encoding a leucine-rich repeat receptor-like protein (LRR-RLP), thereby expanding the repertoire of resistance protein classes conferring resistance to Fol. Using an eds1 mutant of tomato, we showed that I-7, like many other LRR-RLPs conferring pathogen resistance in tomato, is EDS1 (Enhanced Disease Susceptibility 1) dependent. Using transgenic tomato plants carrying only the I-7 gene for Fol resistance, we found that I-7 also confers resistance to Fol races 1 and 2. Given that Fol race 1 carries Avr1, resistance to Fol race 1 indicates that I-7-mediated resistance, unlike I-2- or I-3-mediated resistance, is not suppressed by Avr1. This suggests that Avr1 is not a general suppressor of Fol resistance in tomato, leading us to hypothesize that Avr1 may be acting against an EDS1-independent pathway for resistance activation. The identification of I-7 has allowed us to develop molecular markers for marker-assisted breeding of both genes currently known to confer Fol race 3 resistance (I-3 and I-7). Given that I-7-mediated resistance is not suppressed by Avr1, I-7 may be a useful addition to I-3 in the tomato breeder's toolbox.