Abstract:
BACKGROUND A recombinant, replication-competent vesicular stomatitis virus-based vaccine expressing a surface glycoprotein of Zaire Ebolavirus (rVSV-ZEBOV) is a promising Ebola vaccine candidate. We report the results of an interim analysis of a trial of rVSV-ZEBOV in Guinea, west Africa. METHODS For this open-label, cluster-randomised ring vaccination trial, suspected cases of Ebola virus disease in Basse-Guinée (Guinea, west Africa) were independently ascertained by Ebola response teams as part of a national surveillance system. After laboratory confirmation of a new case, clusters of all contacts and contacts of contacts were defined and randomly allocated 1:1 to immediate vaccination or delayed (21 days later) vaccination with rVSV-ZEBOV (one dose of 2 × 10⁷ plaque-forming units, administered intramuscularly in the deltoid muscle). Adults (age ≥18 years) who were not pregnant or breastfeeding were eligible for vaccination. Block randomisation was used, with randomly varying blocks, stratified by location (urban vs rural) and size of rings (≤20 vs >20 individuals). The study is open label and masking of participants and field teams to the time of vaccination is not possible, but Ebola response teams and laboratory workers were unaware of allocation to immediate or delayed vaccination. Taking into account the incubation period of the virus of about 10 days, the prespecified primary outcome was laboratory-confirmed Ebola virus disease with onset of symptoms at least 10 days after randomisation. The primary analysis was per protocol and compared the incidence of Ebola virus disease in eligible and vaccinated individuals in immediate vaccination clusters with the incidence in eligible individuals in delayed vaccination clusters. This trial is registered with the Pan African Clinical Trials Registry, number PACTR201503001057193. FINDINGS Between April 1, 2015, and July 20, 2015, 90 clusters, with a total population of 7651 people, were included in the planned interim analysis. 48 of these clusters (4123 people) were randomly assigned to immediate vaccination with rVSV-ZEBOV, and 42 clusters (3528 people) were randomly assigned to delayed vaccination with rVSV-ZEBOV. In the immediate vaccination group, there were no cases of Ebola virus disease with symptom onset at least 10 days after randomisation, whereas in the delayed vaccination group there were 16 cases of Ebola virus disease from seven clusters, showing a vaccine efficacy of 100% (95% CI 74·7-100·0; p=0·0036). No new cases of Ebola virus disease were diagnosed in vaccinees from the immediate or delayed groups from 6 days post-vaccination. At the cluster level, with the inclusion of all eligible adults, vaccine effectiveness was 75·1% (95% CI -7·1 to 94·2; p=0·1791), and 76·3% (95% CI -15·5 to 95·1; p=0·3351) with the inclusion of everyone (eligible or not eligible for vaccination). 43 serious adverse events were reported; one serious adverse event was judged to be causally related to vaccination (a febrile episode in a vaccinated participant, which resolved without sequelae). Assessment of serious adverse events is ongoing. INTERPRETATION The results of this interim analysis indicate that rVSV-ZEBOV might be highly efficacious and safe in preventing Ebola virus disease, and is most likely effective at the population level when delivered during an Ebola virus disease outbreak via a ring vaccination strategy.
FUNDING WHO, with support from the Wellcome Trust (UK); Médecins Sans Frontières; the Norwegian Ministry of Foreign Affairs through the Research Council of Norway; and the Canadian Government through the Public Health Agency of Canada, Canadian Institutes of Health Research, International Development Research Centre, and Department of Foreign Affairs, Trade and Development.
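As a point of reference for how the headline efficacy figure arises, the sketch below computes the usual relative-risk-based estimate, VE = 1 − (attack rate in immediate clusters / attack rate in delayed clusters), in Python. It is only an illustration: the denominators used here are the reported cluster populations, whereas the trial's per-protocol analysis was restricted to eligible, vaccinated individuals and used cluster-adjusted methods for the confidence interval.

    # Hedged illustration: ring-trial vaccine efficacy as 1 - relative risk.
    # Denominators are the reported cluster populations, used only for illustration;
    # the trial's per-protocol analysis used eligible/vaccinated individuals.
    cases_immediate, n_immediate = 0, 4123   # cases with onset >= 10 days after randomisation
    cases_delayed, n_delayed = 16, 3528

    attack_immediate = cases_immediate / n_immediate
    attack_delayed = cases_delayed / n_delayed

    ve = 1 - attack_immediate / attack_delayed   # relative risk = AR_immediate / AR_delayed
    print(f"Vaccine efficacy (point estimate): {ve:.1%}")   # 100.0% when the immediate arm has no cases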
Abstract:
Background: The Swiss pig population enjoys a favourable health situation. To further promote this, the Pig Health Service (PHS) conducts a surveillance program in affiliated herds: closed multiplier herds with the highest PHS health and hygiene status have to be free from swine dysentery and progressive atrophic rhinitis and are clinically examined four times a year, including laboratory testing. In addition, four batches of pigs per year are fattened together with pigs from other herds and checked for typical symptoms (monitored fattening groups, MF). While costly and laborious, little was known about the effectiveness of the surveillance to detect an infection in a herd. Therefore, the sensitivity of the surveillance for progressive atrophic rhinitis and swine dysentery at herd level was assessed using scenario tree modelling, a method well established at the national level. Furthermore, its costs and the time until an infection would be detected were estimated, with the final aim of yielding suggestions on how to optimize surveillance. Results: For swine dysentery, the median annual surveillance sensitivity was 96.7%, mean time to detection 4.4 months, and total annual costs 1022.20 Euro/herd. The median component sensitivity of active sampling was between 62.5 and 77.0%, that of an MF between 7.2 and 12.7%. For progressive atrophic rhinitis, the median surveillance sensitivity was 99.4%, mean time to detection 3.1 months and total annual costs 842.20 Euro. The median component sensitivity of active sampling was 81.7%, that of an MF between 19.4 and 38.6%. Conclusions: Results indicate that total sensitivity for both diseases is high, while time to detection could be a risk in herds with frequent pig trade. Of all components, active sampling had the highest contribution to the surveillance sensitivity, whereas that of MF was very low. To increase efficiency, active sampling should be intensified (more animals sampled) and MF abandoned. This would significantly improve sensitivity and time to detection at comparable or lower costs. The method of scenario tree modelling proved useful to assess the efficiency of surveillance at herd level. Its versatility allows adjustment to all kinds of surveillance scenarios to optimize sensitivity, time to detection and/or costs.
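For readers unfamiliar with scenario tree modelling, the sketch below shows the standard way component sensitivities (CSe) are combined into an overall surveillance sensitivity under the assumption that components act independently within the surveillance period: SSe = 1 − Π(1 − CSe). The component values are illustrative numbers chosen within the ranges reported above, not the study's actual inputs.

    # Combine surveillance component sensitivities (CSe) into an overall surveillance
    # sensitivity (SSe), assuming independent components: SSe = 1 - prod(1 - CSe_i).
    # Values are illustrative, not the study's inputs.
    from math import prod

    def overall_sensitivity(component_sensitivities):
        return 1 - prod(1 - cse for cse in component_sensitivities)

    # e.g. swine dysentery: four active sampling rounds and four monitored fattening groups per year
    cse_active = [0.70] * 4   # illustrative per-round sensitivity of active sampling
    cse_mf = [0.10] * 4       # illustrative per-batch sensitivity of a monitored fattening group
    print(f"Annual surveillance sensitivity (SSe) = {overall_sensitivity(cse_active + cse_mf):.3f}")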
Abstract:
BACKGROUND In 2012, the levels of chlamydia control activities, including primary prevention, effective case management with partner management, and surveillance, were assessed across countries in the European Union and European Economic Area (EU/EEA) in a survey initiated by the European Centre for Disease Prevention and Control (ECDC), and the findings were compared with those from a similar survey in 2007. METHODS Experts in the 30 EU/EEA countries were invited to respond to an online questionnaire; 28 countries responded, of which 25 participated in both the 2007 and 2012 surveys. Analyses focused on 13 indicators of chlamydia prevention and control activities; countries were assigned to one of five categories of chlamydia control. RESULTS In 2012, more countries than in 2007 reported availability of national chlamydia case management guidelines (80% vs. 68%), opportunistic chlamydia testing (68% vs. 44%) and consistent use of nucleic acid amplification tests (64% vs. 36%). The number of countries reporting having a national sexually transmitted infection control strategy or a surveillance system for chlamydia did not change notably. In 2012, most countries (18/25, 72%) had implemented primary prevention activities and case management guidelines addressing partner management, compared with 44% (11/25) of countries in 2007. CONCLUSION Overall, chlamydia control activities in EU/EEA countries strengthened between 2007 and 2012. Several countries still need to develop essential chlamydia control activities, whereas others may strengthen implementation and monitoring of existing activities.
Abstract:
BACKGROUND Ductal carcinoma in situ (DCIS) is a noninvasive breast lesion with uncertain risk for invasive progression. Usual care (UC) for DCIS consists of treatment upon diagnosis, thus potentially overtreating patients with low propensity for progression. One strategy to reduce overtreatment is active surveillance (AS), whereby DCIS is treated only upon detection of invasive disease. Our goal was to perform a quantitative evaluation of outcomes following an AS strategy for DCIS. METHODS Age-stratified, 10-year disease-specific cumulative mortality (DSCM) for AS was calculated using a computational risk projection model based upon published estimates for natural history parameters, and Surveillance, Epidemiology, and End Results data for outcomes. AS projections were compared with the DSCM for patients who received UC. To quantify the propagation of parameter uncertainty, a 95% projection range (PR) was computed, and sensitivity analyses were performed. RESULTS Under the assumption that AS cannot outperform UC, the projected median differences in 10-year DSCM between AS and UC when diagnosed at ages 40, 55, and 70 years were 2.6% (PR = 1.4%-5.1%), 1.5% (PR = 0.5%-3.5%), and 0.6% (PR = 0.0%-2.4%), respectively. Corresponding median numbers of patients needed to treat to avert one breast cancer death were 38.3 (PR = 19.7-69.9), 67.3 (PR = 28.7-211.4), and 157.2 (PR = 41.1-3872.8), respectively. Sensitivity analyses showed that the parameter with greatest impact on DSCM was the probability of understaging invasive cancer at diagnosis. CONCLUSION AS could be a viable management strategy for carefully selected DCIS patients, particularly among older age groups and those with substantial competing mortality risks. The effectiveness of AS could be markedly improved by reducing the rate of understaging.
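The numbers needed to treat quoted above follow from the projected absolute differences in 10-year disease-specific cumulative mortality (NNT = 1 / absolute risk difference). A minimal check in Python using the rounded point estimates from the abstract (small discrepancies against the reported medians of 38.3, 67.3 and 157.2 arise from rounding and from the model's full projection distributions):

    # NNT = 1 / absolute difference in 10-year disease-specific cumulative mortality (AS vs UC).
    # Differences are the rounded medians quoted in the abstract, so these NNTs are approximate.
    dscm_difference = {40: 0.026, 55: 0.015, 70: 0.006}
    for age, diff in dscm_difference.items():
        print(f"Diagnosis at age {age}: NNT = {1 / diff:.1f}")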
Abstract:
The need for wildlife health surveillance has become increasingly recognized. However, comprehensive programs which cover a wide spectrum of species, pathogens and geographic areas are still lacking in most European countries, and practical examples of systems in place remain scarce. This article provides an overview of the organization of wildlife health surveillance in Switzerland, with a focus on the development, current strategies and the activities of the national program carried out by the Centre for Fish and Wildlife Health (FIWI), University of Bern. This documentation may stimulate ongoing discussions on the design and development of national wildlife health surveillance programs in other countries. Investigations into wildlife health in Switzerland date back to the 1950s. The FIWI acts as a national competence center for wildlife diseases on mandate of the Swiss federal authorities. The mandate includes four main activities: disease diagnostics, research, consulting and teaching. In line with this, the FIWI has made continuous efforts to strengthen a national network of field partners and has implemented strategies to facilitate long-term studies and meta-studies.
Abstract:
Systems for the identification and registration of cattle have gradually been receiving attention for use in syndromic surveillance, a relatively recent approach for the early detection of infectious disease outbreaks. Real or near real-time monitoring of deaths or stillbirths reported to these systems offers an opportunity to detect temporal or spatial clusters of increased mortality that could be caused by an infectious disease epidemic. In Switzerland, such data are recorded in the "Tierverkehrsdatenbank" (TVD). To investigate the potential of the Swiss TVD for syndromic surveillance, 3 years of data (2009-2011) were assessed in terms of data quality, including timeliness of reporting and completeness of geographic data. Two time-series consisting of reported on-farm deaths and stillbirths were retrospectively analysed to define and quantify the temporal patterns that result from non-health related factors. Geographic data were almost always present in the TVD data, often at different spatial scales. On-farm deaths were reported to the database by farmers in a timely fashion; reporting of stillbirths was less timely. Timeliness and geographic coverage are two important features of disease surveillance systems, highlighting the suitability of the TVD for use in a syndromic surveillance system. Both time series exhibited different temporal patterns that were associated with non-health related factors. To avoid false positive signals, these patterns need to be removed from the data or accounted for in some way before applying aberration detection algorithms in real-time. Evaluating mortality data reported to systems for the identification and registration of cattle is of value for comparing national data systems and as a first step towards a European-wide early detection system for emerging and re-emerging cattle diseases.
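One generic way to handle such non-health-related temporal patterns, assumed here purely for illustration (it is not the method applied to the TVD data), is to build a weekday-by-month baseline for the daily counts and flag days that exceed the expected count by a Poisson-style margin before any formal aberration detection:

    import numpy as np
    import pandas as pd

    def flag_aberrations(counts: pd.Series, z: float = 3.0) -> pd.Series:
        # Expected count for each day from a weekday-by-month baseline; flag days
        # exceeding expected + z * sqrt(expected) (a simple Poisson-style threshold).
        df = pd.DataFrame({"count": counts})
        df["weekday"] = df.index.weekday
        df["month"] = df.index.month
        expected = df.groupby(["weekday", "month"])["count"].transform("mean")
        threshold = expected + z * np.sqrt(expected.clip(lower=1.0))
        return df["count"] > threshold

    # Illustrative use with synthetic daily on-farm death counts (2009-2011).
    idx = pd.date_range("2009-01-01", "2011-12-31", freq="D")
    rng = np.random.default_rng(0)
    counts = pd.Series(rng.poisson(20, len(idx)), index=idx)
    print(flag_aberrations(counts).sum(), "days flagged")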
Abstract:
Current toxic tort cases have increased national awareness of health concerns and present an important avenue in which public health scientists can perform a vital function: in litigation, and in public health initiatives and promotions which may result. This review presents a systematic approach, using the paradigm of interactive public health disciplines, for the design of a matrix framework for medical surveillance of workers exposed to toxic substances. The matrix framework design addresses the required scientific bases to support the legal remedy of medical monitoring for workers injured as a result of their exposure to toxic agents. A background of recent legal developments which have a direct impact on the use of scientific expertise in litigation is examined in the context of toxic exposure litigation and the attainment of public health goals. The matrix model is applied to five different workplace exposures: dental mercury, firefighting, vinyl chloride manufacture, radon in mining and silica. An exposure matrix designed by the Department of Energy for government nuclear workers is included as a reference comparison to the design matrix.
Abstract:
Under the Clean Air Act, Congress granted discretionary decision making authority to the Administrator of the Environmental Protection Agency (EPA). This discretionary authority involves setting standards to protect the public's health with an "adequate margin of safety" based on current scientific knowledge. The Administrator of the EPA is usually not a scientist, and for the National Ambient Air Quality Standard (NAAQS) for particulate matter (PM), the Administrator faced the task of revising a standard when several scientific factors were ambiguous. These factors included: (1) no identifiable threshold below which health effects are not manifested, (2) no biological basis to explain the reported associations between particulate matter and adverse health effects, and (3) no consensus among the members of the Clean Air Scientific Advisory Committee (CASAC) as to what an appropriate PM indicator, averaging period, or value would be for the revised standard. This project recommends and demonstrates a tool, integrated assessment (IA), to aid the Administrator in making a public health policy decision in the face of ambiguous scientific factors. IA is an interdisciplinary approach to decision making that has been used to deal with complex issues involving many uncertainties, particularly climate change analyses. Two IA approaches are presented: a rough set analysis by which the expertise of CASAC members can be better utilized, and a flag model for incorporating the views of stakeholders into the standard setting process. The rough set analysis can describe minimal and maximal conditions about the current science pertaining to PM and health effects. Similarly, a flag model can evaluate agreement or lack of agreement by various stakeholder groups to the proposed standard in the PM review process. The use of these IA tools will enable the Administrator to (1) complete the NAAQS review in a manner that is in closer compliance with the Clean Air Act, (2) expand the input from CASAC, (3) take into consideration the views of the stakeholders, and (4) retain discretionary decision making authority.
Abstract:
Background. Pulsed-field gel electrophoresis (PFGE) is a laboratory technique in which Salmonella DNA banding patterns are used as molecular fingerprints for the epidemiologic study of "PFGE clusters". State and national health departments (CDC) use PFGE to detect clusters of related cases and to discover common sources of bacteria in outbreaks. Objectives. Using Houston Department of Health and Human Services (HDHHS) data, the study sought: (1) to describe the epidemiology of Salmonella in Houston, with PFGE subtype as a variable; and (2) to determine whether PFGE patterns and clusters detected in Houston were local appearances of PFGE patterns or clusters that occurred statewide. Methods. During the years 2002 to 2005, the HDHHS collected and analyzed data from routine surveillance of Salmonella. We implemented a protocol, between May 1, 2007 and December 31, 2007, in which PFGE patterns from local cases were sent via e-mail to the Texas Department of State Health Services, to verify whether the local PFGE patterns were also part of statewide clusters. PFGE was performed on Salmonella isolates from 106 patients who provided a sample during that time period. Local PFGE clusters were investigated, with the enhanced picture obtained by linking local PFGE patterns to PFGE patterns at the state and national level. Results. We found that, during the years 2002 to 2005, there were 66 PFGE clusters, ranging in size from 2 to 22 patients within each cluster. Between different serotypes, there were marked differences in the sizes of PFGE clusters. A common source or risk factor was found in fewer than 5 of the 66 PFGE clusters. With the revised protocol, we found that 19 of 66 local PFGE patterns were indistinguishable from PFGE patterns at Texas DSHS. During the eight months, we identified ten local PFGE clusters with a total of 42 patients. The PFGE pattern for eight of the ten clusters matched the PFGE patterns for cases reported to Texas DSHS from other geographic areas. Five of the ten PFGE patterns matched PFGE patterns for clusters under investigation at PulseNet at the national level. HDHHS epidemiologists identified a mode of transmission in two of the ten local clusters and a common risk factor in a third local cluster. Conclusion. In the extended-study protocol, Houston PFGE patterns were linked to patterns seen at the state and national level. The investigation of PFGE clusters was more effective in detecting a common mode of transmission when local data were linked to state and national data.
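At its simplest, the cluster-detection step described above amounts to grouping isolates by serotype and indistinguishable PFGE pattern and flagging any pattern shared by two or more patients; a hedged sketch with hypothetical data is shown below (the actual HDHHS/DSHS workflow also uses onset dates, epidemiologic interviews and manual review):

    import pandas as pd

    # Hypothetical line list: one row per Salmonella isolate with its PFGE pattern designation.
    isolates = pd.DataFrame({
        "patient_id": [101, 102, 103, 104, 105, 106],
        "serotype": ["Typhimurium", "Typhimurium", "Enteritidis", "Typhimurium", "Enteritidis", "Newport"],
        "pfge_pattern": ["JPXX01.0003", "JPXX01.0003", "JEGX01.0004", "JPXX01.0003", "JEGX01.0021", "JJPX01.0010"],
    })

    # Treat any serotype/pattern combination seen in two or more patients as a local PFGE cluster
    # that would be compared against state (DSHS) and national (PulseNet) patterns.
    cluster_sizes = isolates.groupby(["serotype", "pfge_pattern"])["patient_id"].nunique()
    print(cluster_sizes[cluster_sizes >= 2])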
Abstract:
Introduction: The Texas Occupational Safety & Health Surveillance System (TOSHSS) was created to collect, analyze and interpret occupational injury and illness data in order to decrease the impact of occupational injuries within the state of Texas. This process evaluation was performed midway through the 4-year grant to assess the efficiency and effectiveness of the surveillance system’s planning and implementation activities1. Methods: Two evaluation guidelines published by the Centers for Disease Control and Prevention (CDC) were used as the theoretical models for this process evaluation. The Framework for Program Evaluation in Public Health was used to examine the planning and design of TOSHSS using logic models. The Framework for Evaluating Public Health Surveillance Systems was used to examine the implementation of approximately 60 surveillance activities, including uses of the data obtained from the surveillance system. Results/Discussion: TOSHSS planning activities omitted the creation of a scientific advisory committee and specific activities designed to maintain contacts with stakeholders; proposed activities should be reassessed and aligned with ongoing performance measurement criteria, including the role of collaborators in helping the surveillance system achieve each proposed activity. TOSHSS implementation activities are substantially meeting expectations and received an overall score of 61% for all activities being performed. TOSHSS is considered a surveillance system that is simple, flexible, acceptable, fairly stable, timely, moderately useful, with good data quality and a PVP of 86%. Conclusions: Through the third year of TOSHSS implementation, the surveillance system has made a considerable contribution to the collection of occupational injury and illness information within the state of Texas. Implementation of the nine recommendations provided under this process evaluation is expected to increase the overall usefulness of the surveillance system and assist TDSHS in reducing occupational fatalities, injuries, and diseases within the state of Texas. 1 Disclaimer: The Texas Occupational Safety and Health Surveillance System is supported by Grant/Cooperative Agreement Number (U60 OH008473-01A1). The content of the current evaluation is solely the responsibility of the authors and does not necessarily represent the official views of the Centers for Disease Control and Prevention, National Institute for Occupational Safety and Health.
Abstract:
The Federal Coal Mine Health and Safety Act of 1969 required that periodic chest radiographs be offered to underground coal miners to protect the miners from the development of Coal Workers' Pneumoconiosis (CWP) and progression of the disease to progressive massive fibrosis (PMF). These examinations are administered by the National Institute for Occupational Safety and Health (NIOSH) through the Coal Workers' Health Surveillance Program (CWHSP). The mine operator is required to provide each miner with the opportunity to have the chest radiograph at no cost to the miner. Three rounds of examinations have been conducted since 1969 and the fourth is underway. The decrease in participation over rounds is of great concern if the incidence and progression of CWP are to be understood and controlled. This study developed rates of participation for each of 558 West Virginia underground coal mines that submitted or had NIOSH-assigned plans for making chest radiographs available during the third round, July 1978 through December 1980. These rates were analyzed in relation to desired levels of participation and to reinforcing, predisposing and enabling factors presumed to affect rates of participation in disease prevention and surveillance programs. Two reinforcing factors, size of mine and inclusion of the mine in the National Coal Study (NCS) epidemiology research program, and the enabling factor, use of an on-site radiograph facility, demonstrated highly significant relationships to participation rates. The major findings of the study were: (1) Participation in the CWHSP is even lower than previously estimated; (2) CWHSP program evaluation is not systematic and the program database is not complete and comprehensive; and (3) NIOSH program policy is not clear and administration of the CWHSP is fragmented and lacks adequate fiscal and personnel resources.
Abstract:
This dissertation focuses on Project HOPE, an American medical aid agency, and its work in Tunisia. More specifically, this is a study of the implementation strategies of those HOPE-sponsored projects and programs designed to solve the problems of high morbidity and infant mortality rates due to environmentally related diarrheal and enteric diseases. Several environmental health programs and projects developed in cooperation with Tunisian counterparts are described and analyzed. These include (1) a paramedical manpower training program; (2) a national hospital sanitation and infection control program; (3) a community sewage disposal project; (4) a well reconstruction project; and (5) a solid-waste disposal project for a hospital. After independence, Tunisia, like many developing countries, encountered several difficulties which hindered progress toward solving basic environmental health problems and prompted a request for aid. This study discusses the need for all who work in development programs to recognize and assess those difficulties or constraints which affect the program planning process, including those latent cultural and political constraints which exist not only within the host country but within the aid agency as well. For example, failure to recognize cultural differences may adversely affect the attitudes of the host staff towards their work and towards the aid agency and its task. These factors, therefore, play a significant role in influencing program development decisions and must be taken into account in order to maximize the probability of successful outcomes. In 1969 Project HOPE was asked by the Tunisian government to assist the Ministry of Health in solving its health manpower problems. HOPE responded with several programs, one of which concerned the training of public health nurses, sanitary technicians, and aides at Tunisia's school of public health in Nabeul. The outcome of that program, as well as the strategies used in its development, is analyzed. Also, certain questions are addressed, such as: what should the indicators of success be, and when is the time right to phase out? Another HOPE program analyzed involved hospital sanitation and infection control. Certain generic aspects of basic hospital sanitation procedures were documented and presented in the form of a process model which was later used as a "microplan" in setting up similar programs in other Tunisian hospitals. In this study the details of the "microplan" are discussed. The development of a nation-wide program without any further need of external assistance illustrated the success of HOPE's implementation strategies. Finally, although it is known that the high incidence of enteric disease in developing countries is due to poor environmental sanitation and poor hygiene practices, efforts by aid agencies to correct these conditions have often resulted in failure. Project HOPE's strategy was to maximize limited resources by using a systems approach to program development and by becoming actively involved in the design and implementation of environmental health projects utilizing "appropriate" technology.
Three innovative projects and their implementation strategies (including technical specifications) are described. It is advocated that if aid agencies are to make any progress in helping developing countries solve basic sanitation problems, they must take an interdisciplinary approach to program development and play an active role in helping counterparts seek and identify appropriate technologies which are socially and economically acceptable.
Abstract:
The National Health Planning and Resources Development Act of 1974 (Public Law 93-641) requires that health systems agencies (HSAs) plan for their health service areas by the use of existing data to the maximum extent practicable. Health planning is based on the identification of health needs; however, HSAs are, at present, identifying health needs in their service areas in some approximate terms. This lack of specificity has greatly reduced the effectiveness of health planning. The intent of this study is, therefore, to explore the feasibility of predicting community levels of hospitalized morbidity by diagnosis by the use of existing data so as to allow health planners to plan for the services associated with specific diagnoses. The specific objectives of this study are (a) to obtain by means of multiple regression analysis a prediction equation for hospital admission by diagnosis, i.e., select the variables that are related to demand for hospital admissions; (b) to examine how pertinent the variables selected are; and (c) to see if each equation obtained predicts well for health service areas. The existing data on hospital admissions by diagnosis are those collected from the National Hospital Discharge Surveys, and are available in a form aggregated to the nine census divisions. When the equations established with such data are applied to local health service areas for prediction, the application is subject to the criticism of the theory of ecological fallacy. Since HSAs have to rely on the availability of existing data, it is imperative to examine whether or not the theory of ecological fallacy holds true in this case. The results of the study show that the equations established are highly significant and the independent variables in the equations explain the variation in the demand for hospital admission well. The predictability of these equations is good when they are applied to areas at the same ecological level but becomes poor, predominantly due to ecological fallacy, when they are applied to health service areas. It is concluded that HSAs cannot predict hospital admissions by diagnosis without primary data collection, which is discouraged by Public Law 93-641.
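For readers unfamiliar with the mechanics, the prediction step described above is ordinary multiple linear regression: fit admission rates by diagnosis against area-level covariates, then apply the fitted equation to a health service area. The covariates and values below are hypothetical placeholders, not the variables selected in the study.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical training data: one row per area (e.g. census division); columns are
    # illustrative area-level covariates; y is admissions per 1,000 population for one diagnosis group.
    X_train = np.array([
        [12.4, 11.3, 34.1],   # [% aged 65+, % below poverty line, physicians per 10,000] - hypothetical
        [10.9, 13.8, 29.7],
        [14.1, 9.9, 38.2],
        [9.6, 15.2, 27.5],
        [13.0, 12.1, 33.0],
    ])
    y_train = np.array([48.2, 51.7, 44.9, 55.3, 49.8])

    model = LinearRegression().fit(X_train, y_train)
    hsa = np.array([[11.8, 12.5, 31.0]])   # covariates for one health service area
    print(f"Predicted admissions per 1,000 population: {model.predict(hsa)[0]:.1f}")

As the abstract notes, an equation fitted at the census-division level can predict poorly when applied at the health-service-area level; that transfer problem is the ecological fallacy the study documents.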
Abstract:
Invasive pneumococcal disease (IPD) causes significant health burden in the US, is responsible for the majority of bacterial meningitis, and causes more deaths than any other vaccine-preventable bacterial disease in the US. The estimated national IPD rate is 14.3 cases per 100,000 population with a case-fatality rate of 1.5 cases per 100,000 population. Although cases of IPD are routinely reported to the local health department in Harris County, Texas, the incidence (IR) and case-fatality (CFR) rates have not been reported. Additionally, it is important to know which serotypes of S. pneumoniae are circulating in Harris County, Texas and to determine if ‘replacement disease’ is occurring. This study reported incidence and case-fatality rates from 2003 to 2009, and described the trends in IPD, including the IPD serotypes circulating in Harris County, Texas during the study period, particularly in 2008 and 2010. Annual incidence rates were calculated and reported for 2003 to 2009, using complete surveillance-year data. Geographic information system (GIS) software was used to create a series of maps of the data reported during the study period. Cluster and outlier analysis and hot spot analysis were conducted using both case counts by census tract and disease rate by census tract. IPD age- and race-adjusted IR for Harris County, Texas and their 95% confidence intervals (CIs) were 1.40 (95% CI 1.0, 1.8), 1.71 (95% CI 1.24, 2.17), 3.13 (95% CI 2.48, 3.78), 3.08 (95% CI 2.43, 3.74), 5.61 (95% CI 4.79, 6.43), 8.11 (95% CI 7.11, 9.1), and 7.65 (95% CI 6.69, 8.61) for the years 2003 to 2009, respectively (rates were age- and race-adjusted to each year's midyear US population estimates). A Poisson regression model demonstrated a statistically significant increasing trend of about 32 percent per year in the IPD rates over the course of the study period. IPD age- and race-adjusted case-fatality rates (CFR) for Harris County, Texas were also calculated and reported. A Poisson regression model demonstrated a statistically significant increasing trend of about 26 percent per year in the IPD case-fatality rates from 2003 through 2009. A logistic regression model associated the risk of dying from IPD with alcohol abuse (OR 4.69, 95% CI 2.57, 8.56) and with meningitis (OR 2.42, 95% CI 1.46, 4.03). The prevalence of non-vaccine serotypes (NVT) among IPD cases with serotyped isolates was 98.2 percent. In 2008, the year with the sample most geographically representative of all areas of Harris County, Texas, the prevalence was 96 percent. Given these findings, it is reasonable to conclude that ‘replacement disease’ is occurring in Harris County, Texas, meaning that the majority of IPD is caused by serotypes not included in the PCV7 vaccine. In addition, IPD rates increased during the study period in Harris County, Texas.
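The reported trend of about 32 percent per year corresponds to exponentiating the year coefficient of a Poisson regression of case counts with the log population as an offset (exp(beta) ≈ 1.32). A minimal sketch with the statsmodels GLM interface, using hypothetical counts and populations rather than the Harris County data:

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical annual IPD case counts and mid-year populations, 2003-2009 (not the Harris County data).
    years = np.arange(2003, 2010)
    cases = np.array([52, 63, 118, 117, 215, 312, 298])
    population = np.array([3.60e6, 3.65e6, 3.70e6, 3.75e6, 3.85e6, 3.95e6, 4.00e6])

    X = sm.add_constant(years - years.min())   # linear year term (0..6) plus intercept
    fit = sm.GLM(cases, X, family=sm.families.Poisson(), offset=np.log(population)).fit()
    print(f"Estimated annual change in the rate: {np.exp(fit.params[1]) - 1:.1%}")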
Abstract:
The reuse of treated effluent has always been an option in places where a situational or structural water deficit exists, whether or not regulatory and planning efforts are completed. The need arises from the demand of a sector, commonly agricultural irrigation, which benefits from this new resource. Within the EU, Spain is ahead in the annual volume of reclaimed water, and is among the top ten countries at a global scale. The regulation of this practice through the Royal Decree 1620/2007 has helped to incorporate water reuse into the hydrological plans as a part of the programme of measures to mitigate pressures such as surface or ground water extraction, or as environmental improvements preventing discharges. The object of this study is to gain an overview of the state of water reuse in Spain, the different scenarios and approaches to this activity, the development of the legal framework and its enforceability, together with the treatments that achieve the quality levels required by the current law, broken down by application. Additionally, a cost analysis of technologies and regeneration treatment lines for water reclamation is performed, whether the regeneration treatment is located after a conventional wastewater treatment or uses other options such as membrane bioreactors (MBR). To develop the abovementioned objectives, the state of water reuse in Spain is studied by means of a database designed to encompass all aspects of the activity: data from the wastewater treatment plants (WWTP) and the water reclamation plants (WRP), the uses of reclaimed water, annual volumes and qualities of treated and reclaimed water, facilities and applications, geographic references, technologies, regeneration treatment lines, etc.
The main data providers are the River Basin authorities (through the concessions or authorizations for water reuse), the sanitation and wastewater treatment entities of the regional governments, local governments, the Hydrological Plans of the River Basins, and field visits to the main water reuse systems. Additionally, a review of different plans and programmes on wastewater treatment or water reuse is done, aiming to put the development and consolidation process of this activity in the different regions of Spain in perspective. An inventory of 322 reuse systems and 216 regeneration treatments has been gathered in the database. The most widespread regeneration treatment line was sand filtration followed by hypochlorite disinfection, although recently it is being replaced by a physical-chemical treatment with a lamella settling system, depth sand filtration, and disinfection with ultraviolet radiation and hypochlorite as residual disinfectant, named conventional regeneration treatment (CRT), or by other treatments that may include a membrane process, named advanced regeneration treatment (ART), to adapt to legal requirements. Agricultural use is the most widespread, accounting for 70% of the reclaimed demand, estimated at 408 hm3, even though the expected total capacity of WRPs for 2015, after the implementation of the National Water Reuse Plan (NWRP), is three times higher. Regarding the development of the water reuse legal framework, there were pioneer areas where competent authorities developed different quality and use recommendations for this new resource. Agricultural use and golf course irrigation in touristic areas were the first two uses with recommendations and even legislation. The initial lack of common legislation for water reuse at a national or European level created some doubts which affected the implementation of water reuse, both from a planning and a licensing point of view. Currently there is still a lack of common international legislation regarding water reuse, technologies and applications. Regarding agricultural use, the model recommendations at a global scale are those set by the World Health Organization, published in 1989, and subsequent reviews and extensions about risk prevention (WHO, 2006). These documents combine wastewater treatments with basic regeneration treatments reinforced by good practices based on different levels of protection to avoid deleterious health effects. Other relevant legal references for this practice have been the recommendations of the US Environmental Protection Agency (USEPA, 2012) and those published by the State of California (Title 22, 2001). These establish indicator targets and maximum thresholds, where regeneration treatment lines are responsible for the final quality according to the different uses. During 2015, the ISO worked on a document aimed at urban use, where the possible parameters to be monitored together with risk prevention have been studied. On the other hand, the European Commission has been promoting the reuse of treated effluents within the Common Implementation Strategy of the Water Framework Directive, mainly through the work of the Programme of Measures Working Group. Within this context, the publication of a recommendation guide during 2016 is intended, as a useful tool to fill in the legal gaps of different Member States on the matter.
The Royal Decree 1620/2007, where the water reuse regulation is set, resembles the principles of the USEPA more closely, even though the EU shows a tendency to prioritize risk assessment by establishing tolerance levels and control points according to the socioeconomic conditions of the different countries, without going into detail on indicators, maximum thresholds or treatments. In contrast, the US regulations indicate a series of regeneration treatments, while in the Spanish legislation the only recommendations in this respect are compiled in a non-compulsory guide. Therefore, the processes used to achieve the required quality standards remain unregulated, leaving room for inappropriate practices. This is the case of disinfection, where the use of hypochlorite may produce harmful byproducts. In the recommendation Guide for the application of the Royal Decree (RD), published by the Ministry of Agriculture and Environment (MAGRAMA) in 2010, clarifications of typical issues that may arise from the application of the RD are given, as well as basic technical requirements for reuse systems and good practices according to the final use. Even so, the RD still presents difficulties in its application and requires a review on issues such as the sampling frequency of current quality parameters or the omission of the nematode egg indicator, which has been shown to be absent after a CRT. In this regard, there is a global tendency to reuse water for drinking water supply, to include indicators for the presence of viruses and protozoa, and to include certain technologies such as membranes or advanced oxidation processes to tackle problems like emerging pollutants. Another objective of this study is to propose regeneration treatment lines that meet the quality requirements established in the RD 1620/2007, broken down by application, and to estimate their capital and operational costs; this proposal is based on what is established in the above-mentioned Guide and NWRP. The proposed treatment typologies are divided into trains with brackish-water desalination (reverse osmosis or reversible electrodialysis) and trains without it. This separation is made because of coastal facilities, where sea water may enter the sewer network, raising the salt content of the wastewater and hence limiting certain uses. To develop this objective, the most common treatment units set up in Spanish WRPs were studied in terms of reliability in achieving the required quality, and of capital and operational costs. The CRT has a capital cost of 28 to 48 €.m-3.d and an operation cost of 0.06 to 0.09 €.m-3, while, if desalination were required, these costs would increase tenfold for implementation and fivefold for operation. For uses that require an ART, such as residential or certain industrial uses, the costs would be 185 to 398 €.m-3.d for implementation and 0.14 to 0.20 €.m-3 for operation. When selecting regeneration treatment lines, the relation between treatment capacity and cost is a paramount indicator. This project provides cost-capacity curves for regeneration treatment trains, which may serve as a selection tool within a water planning framework when comparing options such as MBR facilities, seawater desalination plants or inter-basin water transfers.
In Spain, the requirement for high-quality water in areas with scarce resources, the increasing number of sensitive zones (such as drinking water abstraction points, recreational bathing areas and fish production areas), and the lack of available land to set up new WWTPs have turned MBRs into a suitable option for water reuse. In this work, this technology is analyzed in contrast to CRT and ART, providing cost-capacity curves and identifying when and where this treatment option may outcompete other regeneration treatments. An MBR is an activated sludge treatment in which the secondary settling tank is replaced by a UF or MF membrane system. The quality of the effluent is therefore comparable to that of a WWTP followed by an ART. MBRs ensure a sufficient quality level for all the uses established in the RD and even produce an effluent that can be fed directly to RO or EDR desalination units. The implementation of this technology in Spain has grown exponentially, from 13 facilities of less than 5000 m3.d-1 in 2006 to more than 55 facilities in operation or under construction by the end of 2014, six of them with capacities above 15000 m3.d-1. The membrane filtration system largely determines the operation and design of this type of facility. The most widespread configuration in Spain is the hollow-fibre membrane, especially for high capacities, with Zenon as the main supplier, accounting for 57% of the total installed capacity. The second supplier by number of plants is Kubota, with a flat-sheet membrane configuration, accounting for 30% of the total installed capacity. Other commercial technologies present in Spanish MBRs include Toray, Huber, Koch and Microdym. This document describes the filtration systems of all of these suppliers, providing information about their characteristics and their most relevant design and operation parameters. The study of 14 full-scale operating MBRs has made it possible to pursue another objective of this work: estimating the implementation and operation costs of this type of system in contrast to other regeneration alternatives. The active participation of ACA and ESAMUR, the public wastewater treatment and reuse entities of Cataluña and Murcia respectively, has helped in attaining this objective. A number of typical operating problems and their possible solutions are discussed, both for operation and for the future design of this type of plant. The conclusion of this study is that MBRs are another option to consider for water reuse, being advantageous in terms of both implementation and operational costs compared with WWTPs followed by an ART for capacities above 10000 m3.d-1.
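To make the cost comparison concrete, the sketch below evaluates an illustrative cost-capacity relationship of the power-law form typically used for such curves (unit capital cost per m3/d of capacity falling as plant size grows). The coefficients are placeholders chosen only so the outputs fall within the ranges quoted above for CRT (28-48 €.m-3.d) and ART (185-398 €.m-3.d); they are not the fitted curves from this thesis.

    # Illustrative power-law cost-capacity curves: unit_cost(Q) = a * Q**b with b < 0
    # (economies of scale). The (a, b) pairs are placeholders, not the thesis' fitted curves.
    def unit_capital_cost(q_m3_per_day, a, b):
        # euro per (m3/day) of installed capacity
        return a * q_m3_per_day ** b

    TRAINS = {
        "CRT (conventional regeneration)": (200.0, -0.18),
        "ART (advanced regeneration)": (1500.0, -0.18),
    }

    for capacity in (5_000, 10_000, 20_000):   # m3/day
        costs = ", ".join(f"{name}: {unit_capital_cost(capacity, a, b):.0f} EUR per m3/d"
                          for name, (a, b) in TRAINS.items())
        print(f"{capacity:>6} m3/d -> {costs}")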