980 results for Volcanic hazard analysis
Abstract:
Lately, the acceptability of fermented dairy beverages has increased due to their nutritional benefits, convenient consumption and low manufacturing cost, which translates into a lower final market price for consumers. During manufacturing, these products are susceptible to microbiological contamination. The present study investigated the contaminant microbiota of fermented dairy beverages produced by small- and medium-sized companies by counting moulds and yeasts, determining the Most Probable Number (MPN) of total and thermotolerant coliforms, testing for Escherichia coli and Salmonella spp., and measuring pH. Despite the absence of Salmonella spp., high counts of yeasts and moulds were found, and E. coli was detected in five samples (16.67%). The samples were classified as "products in poor sanitary conditions" because their thermotolerant coliform counts exceeded the standard established by the legislation in force. Therefore, quality programs such as Good Manufacturing Practices (GMP) and Hazard Analysis and Critical Control Points (HACCP) should be employed to prevent contamination risks and provide safe products to consumers.
Application of biofungicides for the control of the fungus Aspergillus flavus L. in peanut (Arachis hypogaea)
Abstract:
Graduate Program in Food Engineering and Science - IBILCE
Abstract:
Hazard Analysis and Critical Control Points (HACCP) is one of the main tools currently used to ensure the safety, quality and integrity of foods. The aim of this study was therefore to develop and implement an HACCP program in the processing of pasteurized grade A milk. Checklists were used to assess the level of the prerequisite programs and the sanitary classification of the dairy plant, and the results served as references for the development of the HACCP system. A "decision tree" protocol was used to identify the critical control points (CCPs). No physical or chemical CCPs were identified, whereas pasteurization and packaging were considered biological CCPs. For these CCPs, preventive measures, critical limits, monitoring needs, corrective actions and verification procedures were established. The prerequisite programs were essential for the establishment of the system. The implementation of HACCP in the processing of grade A pasteurized milk was effective in controlling the biological hazards and enabled the product to comply with the legal specifications and achieve safety.
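For readers unfamiliar with the protocol mentioned above, the sketch below encodes the four-question Codex Alimentarius decision tree commonly used to identify CCPs; the question wording is paraphrased and the answers for each step are hypothetical readings of the study's outcome, not data from it.

```python
def is_ccp(q1_control_exists, q2_step_eliminates, q3_contamination_possible,
           q4_later_step_controls):
    """Apply the four Codex decision-tree questions to one step/hazard pair."""
    if not q1_control_exists:
        return False              # no control measure here: redesign the step
    if q2_step_eliminates:
        return True               # step specifically reduces the hazard: CCP
    if not q3_contamination_possible:
        return False              # hazard cannot reach unacceptable levels
    return not q4_later_step_controls  # CCP only if no later step controls it

# Hypothetical answers for the two biological CCPs named in the abstract:
steps = {
    "pasteurization": dict(q1_control_exists=True, q2_step_eliminates=True,
                           q3_contamination_possible=True, q4_later_step_controls=False),
    "packaging":      dict(q1_control_exists=True, q2_step_eliminates=False,
                           q3_contamination_possible=True, q4_later_step_controls=False),
}
for step, answers in steps.items():
    print(f"{step}: {'CCP' if is_ccp(**answers) else 'not a CCP'}")
```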
Abstract:
Eustatic, tide-gauge, climatic, archaeological and geochronological research, developed mainly over the last twenty years, has shown that a large part of the Italian coastal plains is exposed to the risk of flooding by marine ingression caused by the relative rise of mean sea level. This risk results from the interaction between man-made elements and phenomena of different natures, often difficult to discriminate and quantify, characterized by very different magnitudes and rates. Among the main causes of marine ingression are natural, climatic and geological phenomena that have been strongly influenced by human activity, especially since the twentieth century:
- sea-level rise, mainly as a consequence of the end of the last glacial maximum and the melting of the great continental ice sheets;
- subsidence: large portions of the Italian coastal plains are subject to subsidence, which in some areas reaches considerable proportions; along the Emilia-Romagna coast, rates between 1 and 3 cm/year are recorded. This subsidence is often the combined result of natural phenomena (neotectonics, sediment compaction, etc.) and human-induced ones (groundwater pumping, exploitation of gas fields, excavation of building materials, etc.);
- soils with a high organic content: the presence of highly compressible deposits can cause the ground surface to sink as a consequence of the lowering of the shallow water table (through drainage, land reclamation or pumping), of oxidation and decomposition processes within the soils, of their compaction under their own weight, and of the lack of new solid inputs due to the reduced frequency of river flooding;
- morphology: risk factors include the morphological setting of the plain and, in particular, the type of coast (lidos, beaches, eroding dune ridges, etc.), the presence of depressed areas or areas close to sea level (up to 1-2 m a.s.l.), and the characteristics of the nearshore seabed (bathymetry, cross-shore profile, sediment grain size, submerged bars, absence of biological barriers, etc.);
- the state of the shoreline in terms of erosion caused by human activities (coastal urbanization, aggregate extraction, construction of barriers, etc.) or by the natural hydro-sedimentary dynamics to which it is subject (longshore currents, sediment supply, etc.).
The aim of this study is to evaluate the probability of marine ingression along the Emilia-Romagna coastal stretch of Lido delle Nazioni and the propagation velocity of the wave front, with reference to the hydraulic scheme of a dam break over a dry bed (Riemann problem) based on the method of characteristics, and to model the propagation of the resulting inland flooding following the rise of mean sea level. The two-dimensional numerical code Mike 21 was used to simulate this process. The initial phase of the work involved collecting LIDAR and multibeam hydrographic data and processing them in ArcGIS, from which a detailed topo-bathymetry of the study area was reconstructed.
The first chapter addresses ongoing global climate change and the resulting sea-level variation which, according to the 2007 IPCC report, is expected to rise on average by 28 to 43 cm by 2100. The second and third chapters present a literature review of methodologies for modelling the propagation of steep-fronted waves, with particular attention to breaching of rigid and environmental defences. The mechanisms that can undermine the stability of embankments, built both along watercourses and along the sea, to the detriment of the hydraulic protection of the territory and of the safety of the people living and working in it, were studied. In an embankment, whatever the cause triggering breach formation, the flood wave generated by the failure is always produced by erosion (seepage or overtopping) exerted by the water on the loose materials making up the embankment body; hence most studies on embankment breaches focus on the reconstruction of such failure events. The fourth chapter computes the probability of flooding in the area of interest over a five-year period and the propagation velocity of the wave front. It also analyses the current offshore marine weather conditions (wave climate, sea levels and currents) along the Emilia-Romagna coast, whose problems and lines of intervention for coastal defence are described in the fifth chapter, with particular reference to the Ferrara coastline, which in recent years has undergone continuous human intervention. After introducing the GIS system and its characteristics, the various steps are described that yielded, as output, the file of x, y, z coordinates of the significant points of the coast, essential for the Mike 21 simulation, whose properties are developed in the sixth chapter.
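As a worked illustration of the dry-bed dam-break (Riemann) solution referred to above, the sketch below evaluates the classical Ritter solution obtained by the method of characteristics. The depth h0 and time t are hypothetical, not values from the Lido delle Nazioni study, which uses Mike 21 for the full two-dimensional propagation.

```python
import numpy as np

g = 9.81     # gravity [m/s^2]
h0 = 2.0     # hypothetical still-water depth behind the defence [m]
t = 60.0     # time after the instantaneous failure [s]

c0 = np.sqrt(g * h0)                       # initial wave celerity
x = np.linspace(-1.5 * c0 * t, 2.5 * c0 * t, 9)

# Ritter (1892) solution on a dry, frictionless bed: rarefaction fan
# between x = -c0*t (undisturbed reservoir) and x = 2*c0*t (wave front).
in_fan = (x >= -c0 * t) & (x <= 2 * c0 * t)
h = np.where(in_fan, (2 * c0 - x / t) ** 2 / (9 * g),
             np.where(x < -c0 * t, h0, 0.0))
u = np.where(in_fan, (2 / 3) * (x / t + c0), 0.0)

print(f"front celerity = {2 * c0:.2f} m/s")
for xi, hi, ui in zip(x, h, u):
    print(f"x = {xi:8.1f} m   h = {hi:5.2f} m   u = {ui:5.2f} m/s")
```

The front advances at 2*sqrt(g*h0), about 8.9 m/s for the 2 m depth assumed here; real propagation over inland topography is slower, which is why the thesis resorts to the 2D Mike 21 model.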
Abstract:
The present work concerns the study of debris flows and, in particular, the related hazard in the Alpine environment. In recent years several methodologies have been developed to evaluate the hazard associated with such a complex phenomenon, whose velocity, impact force and poor temporal predictability are responsible for its high hazard level. This research focuses on the depositional phase of debris flows through the application of a numerical model (DFlowz), and on hazard evaluation based on the morphometric, morphological and geological characterization of watersheds. The main aims are to test the validity of DFlowz simulations and assess sources of error in order to understand how the empirical uncertainties influence the predictions, and to explore the possibility of performing hazard analysis starting from the identification of debris-flow-susceptible catchments and the definition of their activity level. Twenty-five well-documented debris flow events were back-analyzed with the model DFlowz (Berti and Simoni, 2007): derived from the implementation of empirical relations between event volume and the planimetric and cross-sectional inundated areas, the code delineates the areas affected by an event using information on volume, preferential flow path and the digital elevation model (DEM) of the fan area. The analysis uses an objective methodology for evaluating the accuracy of the prediction and involves calibrating the model with factors describing the uncertainty associated with the semi-empirical relationships. The general assumptions on which the model is based were verified, although the predictive capabilities are influenced by the uncertainties of the empirical scaling relationships, which must necessarily be taken into account and depend mostly on errors in the estimation of the deposited volume. In addition, to test the predictive capabilities of physically based models, some events were simulated with RAMMS (RApid Mass MovementS). The model, developed by the Swiss Federal Institute for Forest, Snow and Landscape Research (WSL) in Birmensdorf and the Swiss Federal Institute for Snow and Avalanche Research (SLF), adopts a one-phase approach based on the Voellmy rheology (Voellmy, 1955; Salm et al., 1990). The input file combines the total volume of the debris flow located in a release area with a mean depth; the model predicts the affected area, the maximum depth and the flow velocity in each cell of the input DTM. As regards hazard analysis based on watershed characterization, the database collected by the Alto Adige Province offers an opportunity to examine debris-flow sediment dynamics at the regional scale and to analyze lithologic controls. With the aim of advancing the current understanding of debris flows, this study focuses on 82 events in order to characterize the topographic conditions associated with their initiation, transport and deposition and their seasonal patterns of occurrence, and to examine the role played by bedrock geology in sediment transfer.
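To make the empirical basis of DFlowz concrete, the sketch below implements power-law mobility relations of the type introduced by Iverson et al. (1998), in which the inundated cross-sectional area A and planimetric area B scale with event volume V to the 2/3 power. The coefficients k_a and k_b are illustrative placeholders, not the values calibrated by Berti and Simoni (2007).

```python
# Semi-empirical mobility relations: A = k_a * V^(2/3), B = k_b * V^(2/3).
# k_a and k_b below are illustrative, not calibrated values.
def mobility_relations(volume_m3, k_a=0.1, k_b=20.0):
    """Return (A, B) in m^2 for a debris-flow volume V in m^3."""
    a_cross = k_a * volume_m3 ** (2 / 3)  # cross-section filled by the flow
    b_plan = k_b * volume_m3 ** (2 / 3)   # planimetric area inundated on the fan
    return a_cross, b_plan

for v in (1e3, 1e4, 1e5):  # hypothetical event volumes [m^3]
    a, b = mobility_relations(v)
    print(f"V = {v:8.0f} m^3 -> A = {a:7.0f} m^2, B = {b:9.0f} m^2")
```

The strong sensitivity of B to V visible here is precisely why the abstract stresses that errors in the estimation of the deposited volume dominate the predictive uncertainty.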
Abstract:
OBJECTIVE To expand the limited information on the prognostic impact of quantitatively obtained collateral function in patients with coronary artery disease (CAD) and to estimate the causality of such a relation. DESIGN Prospective cohort study with long-term observation of clinical outcome. SETTING University hospital. PATIENTS One thousand one hundred and eighty-one patients with chronic stable CAD undergoing 1771 quantitative, coronary pressure-derived collateral flow index (CFI) measurements, obtained during a 1-min coronary balloon occlusion (CFI is the ratio of mean distal coronary occlusive pressure to mean aortic pressure, with central venous pressure subtracted from both). A subgroup of 152 patients was included in randomised trials on the longitudinal effect of different arteriogenic protocols on CFI. INTERVENTIONS Collection of long-term follow-up information on clinical outcome. MAIN OUTCOME MEASURES All-cause mortality and major adverse cardiac events. RESULTS The cumulative 15-year survival rate was 48% in patients with CFI<0.25 and 65% in the group with CFI≥0.25 (p=0.0057). The cumulative 10-year survival rate was 75% in patients without arteriogenic therapy and 88% (p=0.0482) in the group with arteriogenic therapy and a significant increase in CFI at follow-up. By proportional hazard analysis, the following variables predicted increased all-cause mortality: age, low CFI, left ventricular end-diastolic pressure and number of vessels with CAD. CONCLUSIONS A well-functioning coronary collateral circulation independently predicts lowered mortality in patients with chronic CAD. This relation appears to be causal, because augmented collateral function by arteriogenic therapy is associated with prolonged survival.
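Written out as an equation, the collateral flow index defined in parentheses above is

$$\mathrm{CFI} = \frac{P_{\text{occl}} - P_{\text{CV}}}{P_{\text{ao}} - P_{\text{CV}}}$$

where \(P_{\text{occl}}\) is the mean distal coronary occlusive pressure, \(P_{\text{ao}}\) the mean aortic pressure and \(P_{\text{CV}}\) the central venous pressure, so that CFI expresses collateral-driven perfusion during occlusion as a fraction of normal perfusion pressure.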
Abstract:
Prevalence and genetic relatedness were determined for third-generation cephalosporin-resistant Escherichia coli (3GC-R-Ec) detected in Swiss beef, veal, pork, and poultry retail meat. Samples from meat-packing plants (MPPs) processing 70% of the animals slaughtered in Switzerland were purchased at different intervals between April and June 2013 and analyzed. Sixty-nine 3GC-R-Ec isolates were obtained and characterized by microarray, PCR/DNA sequencing, multilocus sequence typing (MLST), and plasmid replicon typing. Plasmids of selected strains were transformed by electroporation into E. coli TOP10 cells and analyzed by plasmid MLST. The prevalence of 3GC-R-Ec was 73.3% in chicken meat and 2% in beef; no 3GC-R-Ec were found in pork or veal. Overall, the blaCTX-M-1 (79.4%), blaCMY-2 (17.6%), blaCMY-4 (1.5%), and blaSHV-12 (1.5%) β-lactamase genes were detected, as well as other genes conferring resistance to chloramphenicol (cmlA1-like), sulfonamides (sul), tetracycline (tet), and trimethoprim (dfrA). The 3GC-R-Ec from chicken meat often harbored virulence genes associated with avian pathogens. The plasmid incompatibility (Inc) groups IncI1, IncFIB, IncFII, and IncB/O were the most frequent. A high rate of clonality (e.g., ST1304, ST38, and ST93) among isolates from the same MPPs suggests that strains persist at the plant and spread to meat at the carcass-processing stage. Additionally, the presence of the blaCTX-M-1 gene on an IncI1 plasmid of sequence type 3 (IncI1/pST3) in genetically diverse strains indicates interstrain spread of an epidemic plasmid. The blaCMY-2 and blaCMY-4 genes were located on IncB/O plasmids. This study represents the first comprehensive assessment of 3GC-R-Ec in meat in Switzerland. It demonstrates the need to monitor contaminants and to adapt the Hazard Analysis and Critical Control Point concept to avoid the spread of multidrug-resistant bacteria through the food chain.
Abstract:
Thirty-six US states have already enacted some form of seller's property condition disclosure law. At a time when there is a national movement in this direction, this paper attempts to ascertain the factors that lead states to adopt such a law. Motivation for the study stems from the fact that not all states have yet adopted the law, and that states that have enacted it did so in different years. The analytical structure employs hazard models, using a unique set of economic and institutional attributes for a panel of 50 US states spanning 21 years, from 1984 to 2004. The proportional hazard analysis of law adoption reveals that a greater number of disciplinary actions tends to favor passage of the law. Greater broker supervision, implying generally higher awareness among real estate agents, appears to reduce the likelihood of a state adopting a property condition disclosure law.
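As a hedged sketch of the proportional-hazard setup described above — not the authors' actual specification — the snippet below fits a Cox model to a toy state panel with the lifelines library; the column names and figures are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy panel: time (years since 1984) until a state adopts the disclosure law;
# states that never adopt within the 21-year window are censored at 21.
df = pd.DataFrame({
    "years_to_adoption":    [3, 8, 21, 12, 21, 6, 15, 21],
    "adopted":              [1, 1, 0, 1, 0, 1, 1, 0],   # 0 = censored
    "disciplinary_actions": [40, 25, 5, 30, 8, 35, 18, 10],
    "broker_supervision":   [0, 1, 1, 0, 1, 0, 1, 1],
})

cph = CoxPHFitter(penalizer=0.1)   # small penalty stabilises the tiny toy sample
cph.fit(df, duration_col="years_to_adoption", event_col="adopted")
cph.print_summary()  # positive coefficient -> covariate accelerates adoption
```

In the paper's terms, a positive coefficient on disciplinary actions and a negative one on broker supervision would reproduce the reported findings.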
Abstract:
The overall purpose of this study was to assess the relationship between the promoter-region polymorphism (-2607 1G/2G) of matrix metalloproteinase-1 (MMP-1) and outcome in patients diagnosed with a primary brain tumor between 1994 and 2000 at The University of Texas M. D. Anderson Cancer Center. The MMP-1 polymorphism was genotyped for all brain tumor patients who participated in the Family Brain Tumor Study and for whom blood samples were available. Relevant covariates, including demographics, tumor histology, therapy and outcome, were abstracted from the medical records of all cases in the original protocol. The hypothesis was that brain tumor patients with the 2G allele have a poorer prognosis and shorter survival than those with the 1G allele. Experimental Design: Genetic variants of the MMP-1 enzyme were determined by a polymerase chain reaction-restriction fragment length polymorphism assay. Overall survival of cases with the 2G polymorphism was compared with that of cases with the 1G polymorphism using multivariable Cox proportional-hazard analysis, controlling for age, sex, Karnofsky Performance Scale (KPS), extent of surgery, tumor histology and treatment received. Kaplan-Meier and Cox proportional-hazard analyses were used to assess whether the MMP-1 polymorphisms were related to overall survival. Results: Overall survival did not differ significantly between the 2G-allele and 1G-allele brain tumor patients, and there was no statistically significant difference between tumor types. Conclusions: No association was found between MMP-1 polymorphisms and survival in patients with malignant gliomas.
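The following is a minimal sketch of the survival comparison described above (Kaplan-Meier estimation plus a log-rank test between genotype groups); the survival times are simulated, not the M. D. Anderson data.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
t_1g = rng.exponential(30, 40)   # hypothetical survival times [months], 1G group
t_2g = rng.exponential(24, 40)   # hypothetical survival times [months], 2G group
e_1g = rng.random(40) < 0.8      # ~80% observed deaths, the rest censored
e_2g = rng.random(40) < 0.8

km = KaplanMeierFitter()
km.fit(t_1g, event_observed=e_1g, label="1G allele")
print(f"median survival (1G): {km.median_survival_time_:.1f} months")

res = logrank_test(t_1g, t_2g, event_observed_A=e_1g, event_observed_B=e_2g)
print(f"log-rank p = {res.p_value:.3f}")  # a large p mirrors the null finding
```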
Abstract:
A seismic evaluation methodology is applied to an existing viaduct near Granada, in the south of Spain, a region of medium seismicity. The influence of both geology and topography on the spatial variability of ground motion is studied, along with seismic hazard analysis and ground motion characterization. Artificial hazard-consistent ground motion records are synthesised by applying seismic hazard analysis, and site effects are estimated through a diffraction study. The direct BEM is used to calculate the valley displacement response to vertically propagating SV waves, and transfer functions are generated that allow the free-field motion to be transformed into the motion at each support. A closed formula is used to estimate these transfer functions. Finally, the results obtained are compared.
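A minimal sketch of the last step described above — transforming free-field motion into the motion at a support through a transfer function — is given below, working in the frequency domain with a purely illustrative H(f); the study's actual transfer functions come from the BEM diffraction analysis.

```python
import numpy as np

dt = 0.01                                  # sampling interval [s]
t = np.arange(0.0, 20.0, dt)
free_field = np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.1 * t)  # toy accelerogram

spec = np.fft.rfft(free_field)
freqs = np.fft.rfftfreq(len(free_field), dt)

# Hypothetical valley transfer function: mild amplification around 2 Hz.
h = 1.0 + 0.8 * np.exp(-((freqs - 2.0) ** 2) / (2 * 0.5 ** 2))

support_motion = np.fft.irfft(spec * h, n=len(free_field))
print(f"peak free-field = {np.abs(free_field).max():.3f}, "
      f"peak at support = {np.abs(support_motion).max():.3f}")
```

Repeating this with a different H(f) per pier is what produces the spatially variable support motions used in the viaduct analysis.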
Abstract:
This work studies the contribution that expert-judgment aggregation methods can make to the assessment of seismic hazard at a site. Calculations were carried out for two sites in the Iberian Peninsula, Mugardos (La Coruña) and Cofrentes (Valencia), which are subject to different tectonic regimes and both host high-responsibility industrial facilities. The study areas, each of 320 km radius, do not overlap. A probabilistic approach was applied to estimate the annual rate of exceedance of peak horizontal acceleration, and the Monte Carlo method was used to propagate into the results the uncertainty present in the data defining each seismogenic source and its seismicity. The calculations were performed with a computer program developed for this work, which follows the methodology proposed by the Senior Seismic Hazard Analysis Committee (1997) for the NRC. The first conclusion from the results is that attenuation is the main source of uncertainty in the hazard estimates at both sites. Given the difficulty of completing the available historical data for this variable, the behaviour of four mathematical methods of expert-judgment aggregation was studied for estimating an attenuation law at a site. The input data were obtained from the isoseismal catalogue of the Spanish National Geographic Institute (IGN). The earthquakes used as seed variables were chosen so as to cover uniformly the available historical record and the observed magnitude values. A separate panel of experts was assigned to each of the two sites, and the methods of Cooke, equal weights, Apostolakis-Mosleh and Morris were applied to their judgments. Their proposals were compared with the actual data to judge their performance and ease of application. The results show that Cooke's method exhibited the most efficient and robust behaviour for both sites; this method also allowed the reasoned identification of those experts who should not have been included in a panel.
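A minimal sketch of the Monte Carlo step described above is shown below: uncertainty in an entirely illustrative attenuation law is propagated into the annual exceedance rate of peak horizontal acceleration. None of the numbers correspond to the Mugardos or Cofrentes models.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sims, n_years = 2000, 10_000
event_rate = 0.2        # hypothetical annual rate of M >= 4 events in the zone
pga_target = 0.10       # threshold peak ground acceleration [g]

exceed_rates = []
for _ in range(n_sims):
    # Sample attenuation coefficients to represent epistemic uncertainty.
    c1 = rng.normal(-1.8, 0.2)
    c2 = rng.normal(0.45, 0.05)
    n_events = rng.poisson(event_rate * n_years)
    mags = 4.0 + rng.exponential(1 / np.log(10), n_events)  # Gutenberg-Richter, b = 1
    dists = rng.uniform(10, 320, n_events)                  # km, 320 km study radius
    log_pga = c1 + c2 * mags - np.log10(dists) + rng.normal(0, 0.25, n_events)
    exceed_rates.append(np.sum(10.0 ** log_pga > pga_target) / n_years)

lo, med, hi = np.percentile(exceed_rates, [16, 50, 84])
print(f"annual P(PGA > {pga_target} g): {med:.2e} (16-84% band: {lo:.2e} - {hi:.2e})")
```

The spread between the percentiles is the quantity the study traces back to its sources, finding attenuation to be the dominant contributor.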
Abstract:
After the severe flooding of 9-10 June 2010 in San Miguel de Reinante (Lugo), a concrete channel was built on the Forcón-Barranca river. This project assesses the drainage capacity of that channel by performing a flood hazard analysis. Because no flow gauges, rain gauges or other data were available, a lumped hydrometeorological method was followed to evaluate the flood hazard, using digital elevation models as a starting point. The analysis leads to the conclusion that the channel cannot effectively convey the flow produced by a rainfall event with a return period of 500 years.
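As an illustration of a lumped hydrometeorological estimate of the kind used here (the project's exact procedure is not reproduced), the sketch below applies the rational method with hypothetical catchment values; the result would then be compared with the channel's conveyance capacity.

```python
# Rational method: Q = C * i * A / 3.6, with i in mm/h, A in km^2, Q in m^3/s.
def rational_peak_flow(runoff_coeff, intensity_mm_h, area_km2):
    """Peak discharge [m^3/s] of a small catchment for a design storm."""
    return runoff_coeff * intensity_mm_h * area_km2 / 3.6

q500 = rational_peak_flow(runoff_coeff=0.55,    # hypothetical
                          intensity_mm_h=95.0,  # hypothetical T = 500-year storm
                          area_km2=12.0)        # hypothetical catchment area
print(f"Q(T=500) ~ {q500:.0f} m^3/s")
```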
Abstract:
Overtopping is defined as the transport of a significant volume of water over the crest of a structure. It is therefore the phenomenon that, in general, determines the crest level of a breakwater, depending on the amount of overtopping deemed acceptable in view of the breakwater's functional and structural constraints. In general, the amount of overtopping a breakwater can tolerate from the point of view of its structural integrity is much greater than the amount permissible from the point of view of its functionality; on the other hand, designing a breakwater with an excessively low or zero overtopping probability would lead to designs incompatible with other considerations, such as aesthetics or cost. Wave overtopping of breakwater crown walls can be studied in several ways. The most common are physical model tests and empirical or semi-empirical formulations; less common are prototype instrumentation, artificial neural networks and numerical models. Physical model tests are the most accurate and reliable tool for the specific study of each case, given the complexity of the overtopping process and the multitude of physical phenomena and parameters involved. They reveal the hydraulic and structural behaviour of the breakwater, identify possible design flaws before construction and allow alternatives to be evaluated, with the consequent savings in construction costs through improvements to the initial design; however, they carry margins of error associated with scale and model effects. Empirical or semi-empirical formulations have the drawback that their use is limited by the applicability of the formulas, which are valid only for the range of environmental conditions and structural typologies reproduced in the underlying tests. The objective of this doctoral thesis is to test the overtopping formulations developed by different authors against different types of breakwaters. To this end, the existing formulations for estimating the overtopping rate on sloping and vertical breakwaters were first compiled and analysed. These formulations were then compared with the results of a series of tests performed at the Centre for Port and Coastal Studies of CEDEX. The neural network tool NN-OVERTOPPING2, developed in the European overtopping project CLASH ("Crest Level Assessment of Coastal Structures by Full Scale Monitoring, Neural Network Prediction and Hazard Analysis on Permissible Wave Overtopping"), was then applied to the selected sloping-breakwater tests, thereby also comparing the overtopping rates obtained in the tests with this method based on neural network theory. The influence of wind on overtopping was subsequently analysed through a series of reduced-scale physical model tests, generating waves with and without wind, on the vertical section of the Levante Breakwater in Málaga. Finally, a critical analysis of the comparison of each formulation with the selected tests is presented, leading to the conclusions of this doctoral thesis.
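For context, the sketch below shows the exponential, EurOtop-type semi-empirical form that many of the compared formulations share: the mean overtopping discharge q decays exponentially with the relative freeboard Rc/Hm0. The coefficients a and b are illustrative, not those of any specific author tested in the thesis.

```python
import math

def mean_overtopping(hm0, rc, a=0.04, b=2.6, g=9.81):
    """Mean overtopping rate q [m^3/s per m of crest] for significant wave
    height hm0 [m] and crest freeboard rc [m]; a and b are illustrative."""
    return a * math.sqrt(g * hm0 ** 3) * math.exp(-b * rc / hm0)

for rc in (1.0, 2.0, 3.0):                 # hypothetical crest freeboards [m]
    q = mean_overtopping(hm0=2.5, rc=rc)
    print(f"Rc = {rc:.1f} m -> q = {q * 1000:6.1f} l/s per m")
```

The steep decay with Rc/Hm0 is what makes the tolerated overtopping rate, rather than structural integrity, govern the crest level in most designs.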
Abstract:
The existing legislation in Spain, Royal Decree 1620/2007 of 7 December, which establishes the legal regime for the reuse of treated wastewater, has not been amended in recent years despite the view of experts and operators that it deserves review and improvement. A detailed analysis was therefore carried out of all existing legislation, both in Spain and in other countries with extensive experience in water reuse, in order to propose improvements or changes based on the information gathered, both from specific research studies and from data obtained from the operators of wastewater reclamation plants in Spain. From this study some clear proposals emerge, which are submitted for the consideration of the authorities. It was found that there are not enough studies on the types of controls applied at each step of the treatment line in tertiary wastewater treatment plants that would make it possible to establish the performance guarantees of each stage and of the process as a whole. Such data would allow each stage to be analysed separately, to check whether operation under the design conditions is adequate or whether efficiency could be improved on the basis of these intermediate data, instead of relying on the routine checks at inlet and outlet carried out by the operators of the existing plants in the country. There are, however, studies and data on almost all the reclamation technologies on the market, which has made it possible to identify their advantages and disadvantages. The application of an HACCP (Hazard Analysis and Critical Control Points) process to an existing plant showed that it is a practical tool for managing the safety of an industrial production process such as a water reuse system, so it is considered appropriate to recommend its application wherever possible.
Abstract:
Before the 1980s, the specialized literature on risk analysis and management was dominated by the so-called technocratic or dominant view. This view held that natural disasters were extreme physical events, produced by a capricious nature, external to society, and requiring technological and management solutions devised by experts. This article develops a new explanation for the hegemonic persistence of the technocratic view based on the concept of the unquestionability of risk. This conceptual proposal refers to the inability and unwillingness of experts, scientists and decision-makers in general (claim-makers) to identify and act upon the root causes of risk production, since doing so would entail questioning the normative imperatives, the needs of the elites and the lifestyles of the current globalized socioeconomic system.