977 results for Resistance management


Relevance: 30.00%

Abstract:

Bacteria with antimicrobial resistance can be transferred from animals to humans and may compromise antimicrobial treatment in case of infection. To determine the antimicrobial resistance situation in bacteria from Swiss veal calves, faecal samples from 500 randomly selected calves originating from 129 farms were collected at four large slaughterhouses. Samples were cultured for Escherichia coli, Enterococcus sp. and Campylobacter sp., and isolated strains were tested for antimicrobial susceptibility to selected antimicrobial agents by determination of minimal inhibitory concentrations using the broth microdilution method. From 100 farms, data on farm management, animal husbandry and antimicrobial treatments of the calves were collected by questionnaire. Risk factors associated with antimicrobial resistance were identified by logistic regression. In total, 467 E. coli, 413 Enterococcus sp. and 202 Campylobacter sp. were isolated. Of those, 68.7%, 98.7% and 67.8%, respectively, were resistant to at least one of the tested antimicrobial agents. Resistance was mainly observed to antimicrobials frequently used in farm animals. Prevalence of resistance to antimicrobials important for human treatment was generally low. However, relatively high numbers of quinupristin/dalfopristin-resistant Enterococcus faecium and ciprofloxacin-resistant Campylobacter sp. were detected. External calf purchase, large finishing groups, feeding of milk by-products and administration of antimicrobials through feed upon arrival of the animals on the farm significantly increased the risk of antimicrobial resistance at the farm level. Participation in a quality assurance programme and injection of a macrolide upon arrival of the animals on the farm had a protective effect. The present study showed that veal calves may serve as a reservoir for resistant bacteria. To ensure food safety, veal calves should be included in the national monitoring programme for antimicrobial resistance in farm animals. By improving farm management and calf husbandry, the prevalence of resistance may be reduced.
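
The risk-factor analysis described above relies on logistic regression of farm-level data. As a rough illustration only, the sketch below fits such a model on simulated data, with hypothetical predictors named after the factors mentioned in the abstract; it is not the study's dataset or code.

```python
# Sketch of a farm-level logistic regression for antimicrobial resistance.
# Predictor names mirror factors mentioned in the abstract; the data are
# simulated and purely illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_farms = 100

# Hypothetical binary farm-level exposures.
X = np.column_stack([
    rng.integers(0, 2, n_farms),  # external calf purchase
    rng.integers(0, 2, n_farms),  # large finishing group
    rng.integers(0, 2, n_farms),  # milk by-products fed
    rng.integers(0, 2, n_farms),  # in-feed antimicrobials on arrival
])
# Simulated outcome: resistant isolate detected on the farm (1) or not (0).
logit = -0.5 + X @ np.array([1.0, 0.8, 0.6, 0.9])
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

result = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
odds_ratios = np.exp(result.params)   # odds ratio per factor
ci = np.exp(result.conf_int())        # 95% CI on the odds-ratio scale
print(odds_ratios)
print(ci)
```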

Relevance: 30.00%

Abstract:

BACKGROUND: Renovascular vasoconstriction in patients with hepatorenal syndrome can be quantified by the renal arterial resistance index (RI). We investigated the value of RI measurement in the detection of renal function impairment in patients with different stages of chronic liver disease. METHODS: Subjects were divided into four groups: 21 patients with liver cirrhosis and ascites, 25 patients with liver cirrhosis without ascites, 35 patients with fatty liver disease and 78 control subjects. All patients underwent abdominal ultrasound examination with renal RI measurement, and the results were correlated with laboratory markers of renal function. RESULTS: RI was significantly higher in ascitic than in non-ascitic cirrhotic patients (0.74 vs. 0.67, p<0.01) and in non-ascitic patients with liver cirrhosis than in control subjects (0.67 vs. 0.62, p<0.01). Elevated RI levels were found in 48% (19/40) of patients with liver cirrhosis and normal serum creatinine concentration. There were no significant differences in RI levels between patients with fatty liver disease and controls (0.63 vs. 0.62). CONCLUSIONS: Intrarenal RI measurement is a predictor of renal vasoconstriction and serves to detect early renal function impairment in cirrhotic patients. An elevated RI may be taken into account in the clinical management of these patients.
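
For context, the renal arterial resistance index used in this study is conventionally derived from Doppler velocities as (peak systolic velocity - end-diastolic velocity) / peak systolic velocity. A minimal sketch of that calculation, with illustrative velocities that are not values from the study:

```python
def resistance_index(peak_systolic_velocity, end_diastolic_velocity):
    """Renal arterial resistance index: (PSV - EDV) / PSV."""
    return (peak_systolic_velocity - end_diastolic_velocity) / peak_systolic_velocity

# Illustrative Doppler velocities in cm/s (hypothetical, not study data).
print(resistance_index(40.0, 10.0))   # 0.75, above the ~0.70 cut-off often used
print(resistance_index(40.0, 15.0))   # 0.625, within the usual normal range
```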

Relevance: 30.00%

Abstract:

Background Infections with vancomycin-resistant enterococci (VRE) are a growing concern in hospitals. The impact of vancomycin resistance in enterococcal urinary tract infection is not well-defined. Aim To describe the epidemiology of enterococcal bacteriuria in a hospital and compare the clinical picture and patient outcomes depending on vancomycin resistance. Methods This was a 6-month prospective cohort study of hospital patients who were admitted with or who developed enterococcal bacteriuria in a 1250-bed tertiary care hospital. We examined clinical presentation, diagnostic work-up, management, and outcomes. Findings We included 254 patients with enterococcal bacteriuria; 160 (63%) were female and median age was 65 years (range: 17–96). A total of 116 (46%) bacteriurias were hospital-acquired and 145 (57%) catheter-associated. Most patients presented with asymptomatic bacteriuria (ASB) (119; 47%) or pyelonephritis (64; 25%); 51 (20%) had unclassifiable bacteriuria and 20 (8%) had cystitis. Secondary bloodstream infection was detected in 8 (3%) patients. Seventy of 119 (59%) with ASB received antibiotics (mostly vancomycin). There were 74 (29%) VRE bacteriurias. VRE and vancomycin-susceptible enterococci (VSE) produced similar rates of pyelonephritis [19 (25%) vs 45 (25%); P = 0.2], cystitis, and ASB. Outcomes such as ICU transfer [10 (14%) VRE vs 17 (9%) VSE; P = 0.3], hospital length of stay (6.8 vs 5.0 days; P = 0.08), and mortality [10 (14%) vs 13 (7%); P = 0.1] did not vary with vancomycin susceptibility. Conclusions Vancomycin resistance did not affect the clinical presentation nor did it impact patient outcomes in this cohort of inpatients with enterococcal bacteriuria. Almost half of our cohort had enterococcal ASB; more than 50% of these asymptomatic patients received unnecessary antibiotics. Antimicrobial stewardship efforts should address overtreatment of enterococcal bacteriurias.
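
The outcome comparisons above (e.g. ICU transfer in 10 of 74 VRE versus 17 of 180 VSE patients) are two-by-two contingency comparisons. The sketch below shows one way such a comparison could be tested with Fisher's exact test, using counts taken from the abstract; it only illustrates the type of analysis and is not meant to reproduce the study's reported p-values.

```python
from scipy.stats import fisher_exact

# Counts from the abstract: 74 VRE and 180 VSE bacteriurias,
# with 10 and 17 ICU transfers respectively.
table = [[10, 74 - 10],    # VRE: ICU transfer / no transfer
         [17, 180 - 17]]   # VSE: ICU transfer / no transfer
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.2f}")
```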

Relevance: 30.00%

Abstract:

In order to identify optimal therapy for children with bacterial pneumonia, Pakistan's ARI Program, in collaboration with the National Institute of Health (NIH), Islamabad, undertook a national surveillance of antimicrobial resistance in S. pneumoniae and H. influenzae. The project was carried out at selected urban and peripheral sites in 6 different regions of Pakistan in 1991–92. Nasopharyngeal (NP) specimens and blood cultures were obtained from children with pneumonia diagnosed in the outpatient clinic of participating facilities. Organisms were isolated by local hospital laboratories and sent to NIH for confirmation, serotyping and antimicrobial susceptibility testing. The aims of the study were: (i) to determine the antimicrobial resistance patterns of S. pneumoniae and H. influenzae in children aged 2–59 months; (ii) to determine the ability of selected laboratories to identify and effectively transport isolates of S. pneumoniae and H. influenzae cultured from nasopharyngeal and blood specimens; (iii) to validate the comparability of resistance patterns for nasopharyngeal and blood isolates of S. pneumoniae and H. influenzae from children with pneumonia; and (iv) to examine the effect of drug resistance and laboratory error on the cost of effectively treating children with ARI. A total of 1293 children with ARI were included in the study: 969 (75%) from urban areas and 324 (25%) from rural parts of the country. Of these, 786 (61%) were male and 507 (39%) female. The resistance rates of S. pneumoniae to various antibiotics among the urban children with ARI were: TMP/SMX (62%); chloramphenicol (23%); penicillin (5%); tetracycline (16%); and ampicillin/amoxicillin (0%). The rates of resistance of H. influenzae were higher than those of S. pneumoniae: TMP/SMX (85%); chloramphenicol (62%); penicillin (59%); ampicillin/amoxicillin (46%); and tetracycline (100%). Rates of resistance to each antimicrobial agent were similar among isolates from the rural children. Of a total of 614 specimens tested for antimicrobial susceptibility, 432 (70.4%) were resistant to TMP/SMX and 93 (15.2%) were resistant to antimicrobial agents other than TMP/SMX, namely ampicillin/amoxicillin, chloramphenicol, penicillin, and tetracycline. The sensitivity and positive predictive value of the peripheral laboratories for H. influenzae were 99% and 65%, respectively. Similarly, the sensitivity and positive predictive value of peripheral laboratory tests for S. pneumoniae, compared to the gold standard (the NIH laboratory), were 99% and 54%, respectively. The sensitivity and positive predictive value of nasopharyngeal specimens isolated by the peripheral laboratories, compared to blood cultures (gold standard), were 88% and 11% for H. influenzae and 92% and 39% for S. pneumoniae, respectively. (Abstract shortened by UMI.)
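
The sensitivity and positive predictive value figures quoted above follow directly from a two-by-two comparison against the reference laboratory. A minimal sketch of the arithmetic, using hypothetical counts chosen only to roughly reproduce the reported H. influenzae percentages (the abstract does not give the raw counts):

```python
def sensitivity_ppv(true_pos, false_pos, false_neg):
    """Sensitivity = TP / (TP + FN); positive predictive value = TP / (TP + FP)."""
    sensitivity = true_pos / (true_pos + false_neg)
    ppv = true_pos / (true_pos + false_pos)
    return sensitivity, ppv

# Hypothetical counts for peripheral-laboratory identifications checked
# against the NIH reference laboratory (not the study's actual data).
sens, ppv = sensitivity_ppv(true_pos=99, false_pos=53, false_neg=1)
print(f"sensitivity = {sens:.0%}, PPV = {ppv:.0%}")  # ~99% and ~65%
```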

Relevance: 30.00%

Abstract:

SETTING Drug resistance threatens tuberculosis (TB) control, particularly among human immunodeficiency virus (HIV) infected persons. OBJECTIVE To describe practices in the prevention and management of drug-resistant TB under antiretroviral therapy (ART) programs in lower-income countries. DESIGN We used online questionnaires to collect program-level data on 47 ART programs in Southern Africa (n = 14), East Africa (n = 8), West Africa (n = 7), Central Africa (n = 5), Latin America (n = 7) and the Asia-Pacific (n = 6 programs) in 2012. Patient-level data were collected on 1002 adult TB patients seen at 40 of the participating ART programs. RESULTS Phenotypic drug susceptibility testing (DST) was available in 36 (77%) ART programs, but was only used for 22% of all TB patients. Molecular DST was available in 33 (70%) programs and was used in 23% of all TB patients. Twenty ART programs (43%) provided directly observed therapy (DOT) during the entire course of treatment, 16 (34%) during the intensive phase only, and 11 (23%) did not follow DOT. Fourteen (30%) ART programs reported no access to second-line anti-tuberculosis regimens; 18 (38%) reported TB drug shortages. CONCLUSIONS Capacity to diagnose and treat drug-resistant TB was limited across ART programs in lower-income countries. DOT was not always implemented and drug supplies were regularly interrupted, which may contribute to the global emergence of drug resistance.

Relevance: 30.00%

Abstract:

INTRODUCTION Rates of both TB/HIV co-infection and multi-drug-resistant (MDR) TB are increasing in Eastern Europe (EE). Data on the clinical management of TB/HIV co-infected patients are scarce. Our aim was to study the clinical characteristics of TB/HIV patients in Europe and Latin America (LA) at TB diagnosis, identify factors associated with MDR-TB and assess the activity of initial TB treatment regimens given the results of drug-susceptibility tests (DST). MATERIAL AND METHODS We enrolled 1413 TB/HIV patients from 62 clinics in 19 countries in EE, Western Europe (WE), Southern Europe (SE) and LA from January 2011 to December 2013. Among patients who completed DST within the first month of TB therapy, we linked initial TB treatment regimens to the DST results and calculated the distribution of patients receiving 0, 1, 2, 3 and ≥4 active drugs in each region. Risk factors for MDR-TB were identified in logistic regression models. RESULTS Significant differences were observed between EE (n=844), WE (n=152), SE (n=164) and LA (n=253) for use of combination antiretroviral therapy (cART) at TB diagnosis (17%, 40%, 44% and 35%, p < 0.0001), a definite TB diagnosis (culture and/or PCR positive for Mycobacterium tuberculosis; 47%, 71%, 72% and 40%, p < 0.0001) and MDR-TB prevalence (34%, 3%, 3% and 11%, p < 0.0001 among those with DST results). A history of injecting drug use [adjusted OR (aOR) = 2.03, 95% CI 1.00-4.09], prior TB treatment (aOR = 3.42, 95% CI 1.88-6.22) and living in EE (aOR = 7.19, 95% CI 3.28-15.78) were associated with MDR-TB. For 569 patients with available DST, the initial TB treatment contained ≥3 active drugs in 64% of patients in EE compared with 90-94% of patients in other regions (Figure 1a). Had the patients received the standard initial regimen [rifampicin, isoniazid, pyrazinamide, ethambutol (RHZE)], the corresponding proportions would have been 64% vs. 86-97%, respectively (Figure 1b). CONCLUSIONS In EE, TB/HIV patients were less often on cART, less often had a definite TB diagnosis and more often had MDR-TB compared with other parts of Europe and LA. Initial TB therapy in EE was sub-optimal, with less than two-thirds of patients receiving at least three active drugs, and improved compliance with standard RHZE treatment does not seem to be the solution. Improved management of TB/HIV patients requires routine use of DST, initial TB therapy according to prevailing resistance patterns and more widespread use of cART.
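
The "active drugs" metric above comes from checking each drug in a patient's initial regimen against that patient's DST result. The sketch below shows one simple way to do that counting, with an invented DST profile and the convention (an assumption, not necessarily the study's rule) that drugs without a DST result are not counted as active.

```python
# Count how many drugs in an initial TB regimen are active according to DST.
# Drug names and the example DST profile are illustrative only.
def count_active_drugs(regimen, dst_results):
    """A drug counts as active if DST reports the isolate susceptible ("S").
    Drugs without a DST result are conservatively not counted as active."""
    return sum(1 for drug in regimen if dst_results.get(drug) == "S")

regimen = ["rifampicin", "isoniazid", "pyrazinamide", "ethambutol"]  # standard RHZE
dst_results = {"rifampicin": "R", "isoniazid": "R",
               "pyrazinamide": "S", "ethambutol": "S"}               # an MDR-TB-like profile
print(count_active_drugs(regimen, dst_results))  # 2 active drugs
```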

Relevance: 30.00%

Abstract:

BACKGROUND Anthelmintic drugs have been widely used in sheep as a cost-effective means for gastro-intestinal nematode (GIN) control. However, growing anthelmintic resistance (AHR) has created a compelling need to identify evidence-based management recommendations that reduce the risk of further development and impact of AHR. OBJECTIVE To identify, critically assess, and synthesize available data from primary research on factors associated with AHR in sheep. METHODS Publications reporting original observational or experimental research on selected factors associated with AHR in sheep GINs and published after 1974, were identified through two processes. Three electronic databases (PubMed, Agricola, CAB) and Web of Science (a collection of databases) were searched for potentially relevant publications. Additional publications were identified through consultation with experts, manual search of references of included publications and conference proceedings, and information solicited from small ruminant practitioner list-serves. Two independent investigators screened abstracts for relevance. Relevant publications were assessed for risk of systematic bias. Where sufficient data were available, random-effects Meta-Analyses (MAs) were performed to estimate the pooled Odds Ratio (OR) and 95% Confidence Intervals (CIs) of AHR for factors reported in ≥2 publications. RESULTS Of the 1712 abstracts screened for eligibility, 131 were deemed relevant for full publication review. Thirty publications describing 25 individual studies (15 observational studies, 7 challenge trials, and 3 controlled trials) were included in the qualitative synthesis and assessed for systematic bias. Unclear (i.e. not reported, or unable to assess) or high risk of selection bias and confounding bias was found in 93% (14/15) and 60% (9/15) of the observational studies, respectively, while unclear risk of selection bias was identified in all of the trials. Ten independent studies were included in the quantitative synthesis, and MAs were performed for five factors. Only high frequency of treatment was a significant risk factor (OR=4.39; 95% CI=1.59, 12.14), while the remaining 4 variables were marginally significant: mixed-species grazing (OR=1.63; 95% CI=0.66, 4.07); flock size (OR=1.02; 95% CI=0.97, 1.07); use of long-acting drug formulations (OR=2.85; 95% CI=0.79, 10.24); and drench-and-shift pasture management (OR=4.08; 95% CI=0.75, 22.16). CONCLUSIONS While there is abundant literature on the topic of AHR in sheep GINs, few studies have explicitly investigated the association between putative risk or protective factors and AHR. Consequently, several of the current recommendations on parasite management are not evidence-based. Moreover, many of the studies included in this review had a high or unclear risk of systematic bias, highlighting the need to improve study design and/or reporting of future research carried out in this field.
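
The pooled odds ratios reported above come from random-effects meta-analysis. The sketch below implements one common estimator (DerSimonian-Laird, on the log-OR scale) with invented study-level counts; the review may well have used different software or estimators, so this is only an illustration of the technique.

```python
import math

# Hypothetical study-level 2x2 counts: (exposed_cases, exposed_controls,
# unexposed_cases, unexposed_controls). Not data from the review.
studies = [(12, 30, 8, 50), (20, 45, 15, 60), (9, 25, 10, 40)]

# Log odds ratios and their variances (no zero cells, so no continuity correction).
y = [math.log((a * d) / (b * c)) for a, b, c, d in studies]
v = [1 / a + 1 / b + 1 / c + 1 / d for a, b, c, d in studies]

# DerSimonian-Laird estimate of between-study variance tau^2.
w = [1 / vi for vi in v]
y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
k = len(studies)
tau2 = max(0.0, (q - (k - 1)) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))

# Random-effects pooled log-OR and 95% CI, reported on the OR scale.
w_star = [1 / (vi + tau2) for vi in v]
pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
se = math.sqrt(1 / sum(w_star))
print(math.exp(pooled), math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
```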

Relevance: 30.00%

Abstract:

Antimicrobial drugs may be used to treat diarrheal illness in companion animals. It is important to monitor antimicrobial use to better understand trends and patterns in antimicrobial resistance. There is no monitoring of antimicrobial use in companion animals in Canada. To explore how the use of electronic medical records could contribute to the ongoing, systematic collection of antimicrobial use data in companion animals, anonymized electronic medical records were extracted from 12 participating companion animal practices and warehoused at the University of Calgary. We used the pre-diagnostic clinical features of diarrhea as the case definition in this study. Using text-mining technologies, cases of diarrhea were described by each of the following variables: diagnostic laboratory tests performed, the etiological diagnosis and antimicrobial therapies. The ability of the text miner to accurately describe the cases for each of the variables was evaluated. It could not reliably classify cases in terms of diagnostic tests or etiological diagnosis; a manual review of a random sample of 500 diarrhea cases determined that 88/500 (17.6%) of the target cases underwent diagnostic testing, of which 36/88 (40.9%) had an etiological diagnosis. Text mining, compared to a human reviewer, could accurately identify cases that had been treated with antimicrobials, with high sensitivity (92%, 95% confidence interval 88.1%-95.4%) and specificity (85%, 95% confidence interval 80.2%-89.1%). Overall, 7400/15,928 (46.5%) of pets presenting with diarrhea were treated with antimicrobials. Some temporal trends and patterns of antimicrobial use are described. The results from this study suggest that informatics and electronic medical records could be useful for monitoring trends in antimicrobial use.
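
The evaluation above compares text-mining output against manual review using sensitivity and specificity. The toy sketch below illustrates the same idea with a naive keyword matcher standing in for the study's text-mining system; the drug list, record texts and labels are invented.

```python
# Toy stand-in for the study's text miner: flag a visit record as
# "treated with antimicrobials" if it mentions a known antimicrobial.
ANTIMICROBIALS = {"metronidazole", "amoxicillin", "enrofloxacin", "tylosin"}

def flagged_as_treated(record_text):
    words = record_text.lower().split()
    return any(drug in words for drug in ANTIMICROBIALS)

# (record text, label from manual review: True = antimicrobial given)
records = [
    ("diarrhea 3 days, started metronidazole 10 mg/kg", True),
    ("soft stool, dietary change recommended", False),
    ("vomiting and diarrhea, amoxicillin dispensed", True),
    ("recheck, resolving, no medication", False),
]

tp = sum(flagged_as_treated(t) and lab for t, lab in records)
tn = sum(not flagged_as_treated(t) and not lab for t, lab in records)
fp = sum(flagged_as_treated(t) and not lab for t, lab in records)
fn = sum(not flagged_as_treated(t) and lab for t, lab in records)
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```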

Relevance: 30.00%

Abstract:

PURPOSE OF REVIEW Fever and neutropenia is the most common complication in the treatment of childhood cancer. This review will summarize recent publications that focus on improving the management of this condition as well as those that seek to optimize translational research efforts. RECENT FINDINGS A number of clinical decision rules are available to assist in the identification of low-risk fever and neutropenia; however, few have undergone external validation and formal impact analysis. Emerging evidence suggests acute fever and neutropenia management strategies should include time-to-antibiotic recommendations, and quality improvement initiatives have focused on eliminating barriers to early antibiotic administration. Despite reported increases in antimicrobial resistance, few studies have focused on the prediction, prevention, and optimal treatment of these infections, and the effect on risk stratification remains unknown. A consensus guideline for paediatric fever and neutropenia research is now available and may help reduce some of the heterogeneity between studies that has previously limited the translation of evidence into clinical practice. SUMMARY Risk stratification is recommended for children with cancer and fever and neutropenia. Further research is required to quantify the overall impact of this approach and to refine exactly which children will benefit from early antibiotic administration as well as modifications to empiric regimens to cover antibiotic-resistant organisms.

Relevance: 30.00%

Abstract:

BACKGROUND Guidelines on the clinical management of non-metastatic castrate-resistant prostate cancer (nmCRPC) generally focus on the need to continue androgen deprivation therapy and enrol patients into clinical trials of investigational agents. This guidance reflects the lack of clinical trial data with established agents in the nmCRPC patient population and the need for trials of new agents. AIM To review the evidence base and consider ways of improving the management of nmCRPC. CONCLUSION Upon the development of castrate resistance, it is essential to rule out the presence of metastases or micrometastases by optimising the use of bone scans and possibly newer procedures and techniques. When nmCRPC is established, management decisions should be individualised according to risk, but risk stratification in this diverse population is poorly defined. Currently, prostate-specific antigen (PSA) levels and PSA doubling time remain the best method of assessing the risk of progression and response to treatment in nmCRPC. However, optimising imaging protocols can also help assess the changing metastatic burden in patients with CRPC. Clinical trials of novel agents in nmCRPC are limited and have problems with enrolment, and therefore, improved risk stratification and imaging may be crucial to the improved management. The statements presented in this paper, reflecting the views of the authors, provide a discussion of the most recent evidence in nmCRPC and provide some advice on how to ensure these patients receive the best management available. However, there is an urgent need for more data on the management of nmCRPC.
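
PSA doubling time, cited above as the main marker of progression risk, is commonly estimated from the slope of a linear fit of ln(PSA) against time, with doubling time = ln 2 / slope. A minimal sketch with invented PSA measurements:

```python
import math
import numpy as np

# Hypothetical serial PSA values (ng/mL) and measurement times in months.
months = np.array([0.0, 3.0, 6.0, 9.0])
psa = np.array([2.0, 2.6, 3.5, 4.6])

# Fit ln(PSA) against time; doubling time = ln(2) / slope (valid when slope > 0).
slope, _intercept = np.polyfit(months, np.log(psa), 1)
psa_doubling_time = math.log(2) / slope
print(f"PSA doubling time = {psa_doubling_time:.1f} months")
```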

Relevance: 30.00%

Abstract:

We compared the responses of native and non-native populations of the seaweed Gracilaria vermiculophylla to heat shock in common garden-type experiments. Specimens from six native populations in East Asia and from eight non-native populations in Europe and on the Mexican Pacific coast were acclimated to two sets of identical conditions before their resistance to heat shock was examined. The experiments were carried out twice, once in the native range in Qingdao, China and once in the invaded range in Kiel, Germany, to rule out effects of specific local conditions. At both testing sites the non-native populations survived heat shock significantly better than the native populations; the data underlying this statement are presented in https://doi.pangaea.de/10.1594/PANGAEA.859335. After three hours of heat shock G. vermiculophylla exhibited increased levels of heat shock protein 70 (HSP70) and of a specific isoform of haloperoxidase, suggesting that both proteins could be required for heat shock stress management. However, the elevated resistance toward heat shock of non-native populations only correlated with an increased constitutive expression of HSP70. The haloperoxidase isoform was more prominent in native populations, suggesting that not only increased HSP70 expression, but also reduced allocation into haloperoxidase expression after heat shock, was selected during the invasion history. The data describing expression of HSP70 and three different isoforms of haloperoxidase are presented in https://doi.pangaea.de/10.1594/PANGAEA.859358.

Relevance: 30.00%

Abstract:

The European chestnut (Castanea sativa Mill.) is a multipurpose species that has been widely cultivated around the Mediterranean basin since ancient times. New varieties were brought to the Iberian Peninsula during the Roman Empire and have coexisted since then with native populations that survived the last glaciation. The relevance of chestnut cultivation had been growing steadily since the Middle Ages, until the rural decline of the past century put a stop to this trend. Forest fires and diseases were also major factors. Chestnut cultivation is gaining momentum again due to its economic (wood, fruits) and ecological relevance, and currently represents an important asset in many rural areas of Europe. In this thesis we apply different molecular tools to help improve current management strategies. For this study we have chosen El Bierzo (Castile and Leon, NW Spain), which has a centuries-old tradition of chestnut cultivation and management and also presents several unique features from a genetic perspective (see below). Moreover, its nuts are widely appreciated in Spain and abroad for their organoleptic properties. We have focused our experimental work on two major problems faced by breeders and the industry: the lack of a fine-grained genetic characterization and the need for new strategies to control blight disease. To characterize in sufficient detail the genetic diversity and structure of El Bierzo orchards, we analyzed DNA from 169 trees grafted for nut production covering the entire region. We also analyzed 62 nuts from all traditional varieties. El Bierzo constitutes an outstanding scenario to study chestnut genetics and the influence of human management because: (i) it is located at one extreme of the distribution area; (ii) it is a major glacial refuge for the native species; (iii) it has a long tradition of human management (since Roman times, at least); and (iv) its geographical setting ensures an unusual degree of genetic isolation. Thirteen microsatellite markers provided enough informativeness and discrimination power to genotype at the individual level. Together with an unexpected level of genetic variability, we found evidence of genetic structure, with three major gene pools giving rise to the current population. High levels of genetic differentiation between groups supported this organization. Interestingly, the genetic structure does not match spatial boundaries, suggesting that the exchange of material and cultivation practices have strongly influenced natural gene flow. The microsatellite markers selected for this study were also used to classify a set of 62 samples belonging to all traditional varieties. We identified several cases of synonymies and homonymies, evidencing the need to replace traditional classification systems with new tools for genetic profiling. Management and conservation strategies should also benefit from these tools. The advent of high-throughput sequencing technologies, combined with the development of bioinformatics tools, has paved the way to study transcriptomes without the need for a reference genome. We took advantage of RNA sequencing and de novo assembly tools to determine the transcriptional landscape of chestnut in response to blight disease. In addition, we have selected a set of candidate genes with high potential for developing resistant varieties via genetic engineering. Our results revealed deep transcriptional reprogramming upon fungal infection. The plant hormones ethylene (ET) and jasmonic acid (JA) appear to orchestrate the defensive response. Interestingly, our results also suggest a role for auxins in modulating this response. Many transcription factors were identified in this work that interact with promoters of genes involved in disease resistance. Among these genes, we have conducted a functional characterization of two major thaumatin-like proteins (TLPs) belonging to the PR5 family. Two genes encoding chestnut cotyledon TLPs, CsTL1 and CsTL2, have been characterized previously. We substantiate here their protective role against blight disease for the first time, including in silico, in vitro and in vivo evidence. The synergy between TLPs and other antifungal proteins, particularly endo-β-1,3-glucanases, bolsters their interest for future control strategies based on biotechnological approaches.
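
The synonymy and homonymy checks mentioned above amount to grouping accessions by their multilocus microsatellite genotype. The sketch below shows that bookkeeping on invented sample names and allele calls; it is not the thesis data or its analysis pipeline.

```python
from collections import defaultdict

# Invented accessions: sample label -> multilocus genotype, written as a tuple
# of (allele1, allele2) pairs, one pair per microsatellite locus.
accessions = {
    "VarietyA":          ((120, 124), (201, 203), (98, 102)),
    "VarietyB":          ((120, 124), (201, 203), (98, 102)),  # same genotype, different name
    "VarietyC_orchard1": ((118, 124), (201, 205), (98, 100)),
    "VarietyC_orchard2": ((122, 126), (203, 203), (96, 102)),  # same variety name, different genotype
}

# Synonymy: one genotype registered under several variety names.
by_genotype = defaultdict(list)
for name, genotype in accessions.items():
    by_genotype[genotype].append(name)
for genotype, names in by_genotype.items():
    if len(names) > 1:
        print("possible synonymy:", names)

# Homonymy: one variety name (here, the part before '_') covering several genotypes.
by_label = defaultdict(set)
for name, genotype in accessions.items():
    by_label[name.split("_")[0]].add(genotype)
for label, genotypes in by_label.items():
    if len(genotypes) > 1:
        print("possible homonymy:", label)
```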

Relevance: 30.00%

Abstract:

Cover crops in Mediterranean vineyards are scarcely used because of water competition between the cover crop and the grapevine; however, bare soil management through tillage or herbicides tends to have negative effects on the soil over time (decrease in organic matter, degradation of soil structure and fertility, compaction, etc.). The objective of this study was to understand how soil management affects soil fertility, compaction and infiltration over time. To this end, two bare soil techniques, tillage (TT) and total herbicide (HT), were compared with two cover crops, annual cereal (CT) and annual grass (AGT), established for 8 years. The CT treatment showed the highest organic matter content, having had the largest amount of biomass incorporated into the soil. The annual adventitious vegetation in the TT treatment (568 kg dry matter ha-1) that was incorporated into the soil kept the organic matter content higher than HT levels and close to the AGT level, in spite of the greater aboveground annual biomass production of the latter treatment (3632 kg dry matter ha-1), of which only the roots were incorporated into the soil. TT presented the highest bulk density under the tractor track lines and the greatest resistance to penetration (at 0.2 m depth). AGT presented lower bulk density values (upper 0.4 m) than TT, and penetration resistance in CT was also lower (at 0.2 m depth) than in TT.

Relevance: 30.00%

Abstract:

Insecticidal proteins from the soil bacterium Bacillus thuringiensis (Bt) are becoming a cornerstone of ecologically sound pest management. However, if pests quickly adapt, the benefits of environmentally benign Bt toxins in sprays and genetically engineered crops will be short-lived. The diamondback moth (Plutella xylostella) is the first insect to evolve resistance to Bt in open-field populations. Here we report that populations from Hawaii and Pennsylvania share a genetic locus at which a recessive mutation associated with reduced toxin binding confers extremely high resistance to four Bt toxins. In contrast, resistance in a population from the Philippines shows multilocus control, a narrower spectrum, and for some Bt toxins, inheritance that is not recessive and not associated with reduced binding. The observed variation in the genetic and biochemical basis of resistance to Bt, which is unlike patterns documented for some synthetic insecticides, profoundly affects the choice of strategies for combating resistance.

Relevance: 30.00%

Abstract:

Recent predictions of growth in human populations and food supply suggest that there will be a need to substantially increase food production in the near future. One possible approach to meeting this demand, at least in part, is the control of pests and diseases, which currently cause a 30–40% loss in available crop production. In recent years, strategies for controlling pests and diseases have tended to focus on short-term, single-technology interventions, particularly chemical pesticides. This model frequently applies even where so-called integrated pest management strategies are used because in reality, these often are dominated by single technologies (e.g., biocontrol, host plant resistance, or biopesticides) that are used as replacements for chemicals. Very little attention is given to the interaction or compatibility of the different technologies used. Unfortunately, evidence suggests that such approaches rarely yield satisfactory results and are unlikely to provide sustainable pest control solutions for the future. Drawing on two case histories, this paper demonstrates that by increasing our basic understanding of how individual pest control technologies act and interact, new opportunities for improving pest control can be revealed. This approach stresses the need to break away from the existing single-technology, pesticide-dominated paradigm and to adopt a more ecological approach built around a fundamental understanding of population biology at the local farm level and the true integration of renewable technologies such as host plant resistance and natural biological control, which are available to even the most resource-poor farmers.