919 results for Containment of rainwater
Abstract:
Iodine-129 concentrations have been determined by accelerator mass spectrometry in rainwater samples taken at Seville (southwestern Spain) in 1996 and 1997. This technique allows a reduction in the detection limits for this radionuclide in comparison to radiometric counting and other mass spectrometric methods such as ICP-MS. Typical 129I concentrations range from 4.7×10⁷ 129I atoms/l (19.2%) to 4.97×10⁹ 129I atoms/l (5.9%), while 129I depositions are normally on the order of 10⁸–10¹⁰ atoms/m²·d. These values agree well with other results obtained for recent rainwater samples collected in Europe. In addition, the relationship between 129I deposition and some atmospheric factors has been analyzed, showing the importance of the precipitation rate and the concentration of suspended matter in it.
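The deposition figures above follow from a simple unit identity: 1 mm of rain falling on 1 m² delivers 1 litre, so concentration times rainfall depth gives the daily flux. A minimal sketch (the rainfall value below is hypothetical, chosen only for illustration):

```python
# Illustrative conversion from a measured 129I concentration in rainwater
# to a daily deposition flux. Since 1 mm of rain over 1 m^2 equals 1 litre,
# deposition (atoms/m^2 per day) = concentration (atoms/l) * rainfall (mm/d).

def deposition_flux(conc_atoms_per_l: float, rainfall_mm_per_day: float) -> float:
    """Deposition in atoms per m^2 per day."""
    return conc_atoms_per_l * rainfall_mm_per_day

# A concentration of 4.7e7 atoms/l with a hypothetical 5 mm/d of rain:
flux = deposition_flux(4.7e7, 5.0)
print(f"{flux:.2e} atoms/m^2 per day")  # 2.35e+08, within the reported 1e8-1e10 range
```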
Abstract:
In spite of the environmental relevance of 129I, there is still a scarcity of data about its presence in the different natural compartments. In this work, results are presented on the concentration of 129I in rainwater samples taken in Sevilla (southwestern Spain) and in a sediment core taken near the Ringhals coast (Sweden). Typical concentrations of 10⁸ and 10⁹ 129I at/l are found in rainwater samples, similar to other values in the literature. In the case of the sediment core, our results clearly show the impact of anthropogenic sources, with concentrations on the order of 10¹³ 129I at./kg and isotopic ratios 129I/127I on the order of 10⁻⁸ in the upper layers.
Abstract:
Soil salinity management can be complex, expensive, and time-demanding, especially in arid and semi-arid regions. Besides taking no action, possible management strategies include amelioration and adaptation measures. Here we apply the World Overview of Conservation Approaches and Technologies (WOCAT) framework for the systematic analysis, evaluation, and selection of soil salinisation amelioration technologies in close collaboration with stakeholders. The participatory approach is applied in the RECARE (Preventing and Remediating degradation of soils in Europe through Land Care) project case study of Timpaki, a semi-arid region in south-central Crete (Greece) where the main land use is horticulture in greenhouses irrigated by groundwater. Excessive groundwater abstraction has resulted in a drop of the groundwater level in the coastal part of the aquifer, leading to seawater intrusion and in turn to soil salinisation. The documented technologies are evaluated for their impacts on ecosystem services, cost, and input requirements using a participatory approach and field evaluations. Results show that technologies which maintain existing crop types while enhancing productivity and decreasing soil salinity are preferred by the stakeholders. The evaluation concludes that rainwater harvesting is the optimal solution for direct soil salinity mitigation, as it addresses a wider range of ecosystem and human well-being benefits. Nevertheless, this merit is offset by poor financial motivation, making agronomic measures more attractive to users.
Abstract:
Questions Do extreme dry spells in late summer or in spring affect abundance and species composition of the reproductive shoots and the seed rain in the next annual crop? Are drought effects on reproductive shoots related to the rooting depths of species? Location Species-rich semi-natural grassland at Negrentino, Switzerland. Methods In plots under automated rain-out shelters, rainwater was added to simulate normal conditions and compare them with two experimentally effected long dry spells, in late summer (2004) and in the following spring (2005). For 28 plots, numbers of reproductive shoots per species were counted in 1-m² areas and seed rain was estimated using nine sticky traps of 102 cm² after dry spells. Results The two extreme dry spells in late summer and spring were similar in length and their probability of recurrence. They independently reduced the subsequent reproductive output of the community, while their seasonal timing modified its species composition. Compared to drought in spring, drought in late summer reduced soil moisture more and reduced the number of reproductive shoots of more species. The negative effects of summer drought decreased with species’ rooting depth. The shallow-rooted graminoids showed a consistent susceptibility to summer drought, while legumes and other forbs showed more varied responses to both droughts. Spring drought strongly reduced density (–53%) and species richness (–43%) of the community seed rain, while summer drought had only a marginally significant impact on seed density of graminoids (–44%). Reductions in seed number per shoot vs reproductive shoot density distinguished the impacts of drought with respect to its seasonal timing. Conclusion The essentially negative impact of drought in different seasons on reproductive output suggests that more frequent dry spells could contribute to local plant diversity loss by aggravating seed deficiency in species-rich grassland.
Abstract:
The biological safety profession has historically functioned within an environment based on recommended practices rather than regulations, so summary data on compliance or noncompliance with recommended practices are largely absent from the professional literature. The absence of safety performance outcome data is unfortunate, since the concept of biosafety containment is based on a combination of facility-based controls and workplace practices, and persistent failures in either type of control could ultimately result in injury or death. In addition, the number of laboratories requiring biosafety containment is likely to grow significantly in the coming years in the wake of the terrorist events of 2001. In this study, the outcomes of 768 biosafety level 2 (BSL-2) safety surveys were analyzed for commonalities and trends. Items of non-compliance noted were classified as facility related or practice related. The most frequent item of noncompliance encountered was the failure to re-certify biosafety cabinetry. Not surprisingly, the preponderance of the other frequent items of non-compliance encountered were practice related, such as general housekeeping. The sequential survey periods also allowed changes in compliance levels to be documented and trends in items of non-compliance to be established. The findings described in this study are significant because, for the first time, the outcomes of compliance with recommended biosafety practices can be characterized and thus used as the basis for focused interventions. Since biosafety is heavily reliant on adherence to specific safety practices, the ability to focus interventions on objectively identified practice-related items of non-compliance can assist in the reduction of worker risk in this area experiencing tremendous growth. The information described is also of heightened importance given the number of workplaces expected to involve potentially infectious agents in the coming years.
Abstract:
This study demonstrated that accurate, short-term forecasts of Veterans Affairs (VA) hospital utilization can be made using the Patient Treatment File (PTF), the inpatient discharge database of the VA. Accurate, short-term forecasts of two years or less can reduce required inventory levels, improve allocation of resources, and are essential for better financial management. These are all necessary achievements in an era of cost-containment.

Six years of non-psychiatric discharge records were extracted from the PTF and used to calculate four indicators of VA hospital utilization: average length of stay, discharge rate, multi-stay rate (a measure of readmissions) and days of care provided. National and regional levels of these indicators were described and compared for fiscal year 1984 (FY84) to FY89 inclusive.

Using the observed levels of utilization for the 48 months between FY84 and FY87, five techniques were used to forecast monthly levels of utilization for FY88 and FY89. Forecasts were compared to the observed levels of utilization for these years. Monthly forecasts were also produced for FY90 and FY91.

Forecasts for days of care provided were not produced. Current inpatients with very long lengths of stay contribute a substantial amount of this indicator and it cannot be accurately calculated.

During the six-year period between FY84 and FY89, average length of stay declined substantially, nationally and regionally. The discharge rate was relatively stable, while the multi-stay rate increased slightly during this period. FY90 and FY91 forecasts show a continued decline in the average length of stay, while the discharge rate is forecast to decline slightly and the multi-stay rate is forecast to increase very slightly.

Over a 24-month-ahead period, all three indicators were forecast within a 10 percent average monthly error. The 12-month-ahead forecast errors were slightly lower. Average length of stay was less easily forecast, while the multi-stay rate was the easiest indicator to forecast.

No single technique performed significantly better as determined by the Mean Absolute Percent Error, a standard measure of error. However, Autoregressive Integrated Moving Average (ARIMA) models performed well overall and are recommended for short-term forecasting of VA hospital utilization.
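The Mean Absolute Percent Error used above to compare the five techniques reduces to a short computation. The sketch below pairs it with a deliberately simple seasonal-naive baseline rather than the ARIMA models the study recommends; all series values are hypothetical:

```python
# Mean Absolute Percent Error (MAPE), the error measure named in the study,
# paired with a seasonal-naive forecast as a simple baseline. The monthly
# values below are hypothetical, not the study's data.

def mape(observed, forecast):
    """Mean Absolute Percent Error, in percent."""
    return 100.0 * sum(abs(o - f) / abs(o) for o, f in zip(observed, forecast)) / len(observed)

def seasonal_naive(history, horizon, period=12):
    """Forecast each future month with the value from one period earlier."""
    return [history[-period + (h % period)] for h in range(horizon)]

# Twelve months of a hypothetical "average length of stay" series:
history = [10.2, 10.0, 9.8, 9.9, 9.7, 9.6, 9.5, 9.6, 9.4, 9.3, 9.2, 9.1]
observed = [9.0, 8.9, 8.8]                   # next three observed months
forecast = seasonal_naive(history, horizon=3)
print(round(mape(observed, forecast), 1))    # percent error of the baseline
```

A real replication would swap the baseline for a fitted ARIMA model and compare MAPE values across candidate techniques, as the study describes.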
Abstract:
The present study analyzed some of the effects of imposing a cost-sharing requirement on users of a state's health service program. The study population consisted of people who were in diagnosed medical need and included, but was not limited to, people in financial need.

The purpose of the study was to determine if the cost-sharing requirement had any detrimental effects on the service population. Changes in the characteristics of service consumers and in utilization patterns were analyzed using time-series techniques and pre-post policy comparisons.

The study hypotheses stated that the distribution of services provided, diagnoses serviced, and consumer income levels would change following the cost-sharing policy.

Analysis of data revealed that neither the characteristics of service users (income, race, sex, etc.) nor services provided by the program changed significantly following the policy. The results were explainable in part by the fact that all of the program participants were in diagnosed medical need. Therefore, their use of "discretionary" or "less necessary" services was limited.

The study's findings supported the work of Joseph Newhouse, Charles Phelps, and others who have contended that necessary service use would not be detrimentally affected by reasonable cost-sharing provisions. These contentions raise the prospect of incorporating cost-sharing into programs such as Medicaid, which, at this writing, do not demand any consumer payment for services.

The study concluded with a discussion of the cost-containment problem in health services. The efficacy of cost-sharing was considered relative to other financing and reimbursement strategies such as HMOs, self-funding, and reimbursement for less costly services and places of service.
Abstract:
There is currently much interest in the appropriate use of obstetrical technology, cost containment, and meeting consumers' needs for safe and satisfying maternity care. At the same time, there has been an increase in professionally unattended home births. In response, a new type of service, the out-of-hospital childbearing center (CBC), has been developed which is administratively and structurally separate from the hospital. In the CBC, maternity care is provided by certified nurse-midwives to carefully screened low-risk childbearing families in conjunction with physician and hospital back-up.

It was the purpose of this study to accomplish the following objectives: (1) To describe in a historical prospective study the demographic and medical-obstetric characteristics of patients laboring in eleven selected out-of-hospital childbearing centers in the United States from May 1, 1972, to December 15, 1979. Labor is defined as the onset of regular contractions as determined by the patient. (2) To describe any differences between those patients who require transfer to a back-up hospital and those who do not. (3) To describe administrative and service characteristics of eleven selected out-of-hospital childbearing centers in the United States. (4) To compare the demographic and medical-obstetric characteristics of women laboring in eleven selected out-of-hospital childbearing centers with a national sample of women of similar obstetric risk who, according to birth certificates, delivered legitimate infants in a hospital setting in the United States in 1972.

Research concerning CBCs and supportive to the development of CBCs was reviewed, including studies which identified factors associated with fetal and perinatal morbidity and mortality, obstetrical risk screening, and the progress of technological development in obstetrics. Information concerning the organization and delivery of care at each selected CBC was also collected and analyzed.

A stratified, systematic sample of 1938 low-risk women who began labor in a selected CBC was included in the study. These women were not unlike those described previously in small single-center studies reported in the literature. The mean age was 25 years. Sixty-three per cent were white, 34 per cent Hispanic, 88 per cent married, 45 per cent had completed at least two years of college, nearly one-third were professionals, and over a third were housewives. . . . (Author's abstract exceeds stipulated maximum length. Discontinued here with permission of school.)
Abstract:
High tunnels are simple, plastic-covered, passive solar-heated structures in which crops are grown in the ground. They are used by fruit and vegetable growers to extend the growing season and intensify production in cold climates. The covered growing area creates a desert-like environment requiring carefully monitored irrigation practices. In contrast, the exterior expanse of a high tunnel generates a large volume of water with every measurable rainfall. Each 1,000 ft² of high tunnel roof will generate approximately 300 gallons of runoff from a half inch of rain. Unless the high tunnel site is elevated from the surrounding area, drainage tiles are installed, or other drainage accommodations are made around the perimeter, the soil along the inside edge of the high tunnel is nearly continuously saturated. High volumes of water can also create an erosion problem. The objective of this project was to design and construct a system that enables growers using high tunnels in their production operation to reduce drainage problems, erosion, and crop loss due to excess moisture in and around their high tunnel(s) without permanent environmental and soil modifications.
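The roof-runoff figure quoted above can be checked with back-of-envelope arithmetic, assuming the 1,000 figure refers to square feet of roof area and using the standard conversion of about 7.48 US gallons per cubic foot:

```python
# Back-of-envelope check of the runoff figure in the abstract:
# volume (ft^3) = roof area (ft^2) * rainfall (inches) / 12,
# then converted to US gallons (1 ft^3 is approximately 7.48 gal).

GALLONS_PER_CUBIC_FOOT = 7.48

def roof_runoff_gallons(roof_area_sqft: float, rain_inches: float) -> float:
    cubic_feet = roof_area_sqft * rain_inches / 12.0
    return cubic_feet * GALLONS_PER_CUBIC_FOOT

print(round(roof_runoff_gallons(1000, 0.5)))  # 312, close to the ~300 gallons quoted
```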
Abstract:
Stalagmites are important palaeo-climatic archives since their chemical and isotopic signatures have the potential to record high-resolution changes in temperature and precipitation over thousands of years. We present three U/Th-dated records of stalagmites (MA1-MA3) in the superhumid southern Andes, Chile (53°S). They grew simultaneously during the last five thousand years (ka BP) in a cave that developed in schist and granodiorite. Major and trace elements as well as the C and O isotope compositions of the stalagmites were analysed at high spatial and temporal resolution as proxies for palaeo-temperature and palaeo-precipitation. Calibrations are based on data from five years of monitoring the climate and hydrology inside and outside the cave and on data from 100 years of regional weather station records. Water-insoluble elements such as Y and HREE in the stalagmites indicate the amount of incorporated siliciclastic detritus. Monitoring shows that the quantity of detritus is controlled by the drip water rate once a threshold level has been exceeded. In general, drip rate variations of the stalagmites depend on the amount of rainfall. However, different drip-water pathways above each drip location gave rise to individual drip rate levels. Only one of the three stalagmites (MA1) had sufficiently high drip rates to record detrital proxies over its complete length. Carbonate-compatible element contents (e.g. U, Sr, Mg), which were measured up to sub-annual resolution, document changes in meteoric precipitation and related drip-water dilution. In addition, these soluble elements are controlled by leaching during weathering of the host rock and soils depending on the pH of acidic pore waters in the peaty soils of the cave's catchment area. In general, higher rainfall resulted in a lower concentration of these elements and vice versa. 
The Mg/Ca record of stalagmite MA1 was calibrated against meteoric precipitation records for the last 100 years from two regional weather stations. Carbonate-compatible soluble elements show similar patterns in the three stalagmites with generally high values when drip rates and detrital tracers were low and vice versa. δ13C and δ18O values are highly correlated in each stalagmite suggesting a predominantly drip rate dependent kinetic control by evaporation and/or outgassing. Only C and O isotopes from stalagmite MA1 that received the highest drip rates show a good correlation between detrital proxy elements and carbonate-compatible elements. A temperature-related change in rainwater isotope values modified the MA1 record during the Little Ice Age (~0.7-0.1 ka BP) that was ~1.5 °C colder than today. The isotopic composition of the stalagmites MA2 and MA3 that formed at lower drip rates shows a poor correlation with stalagmite MA1 and all other chemical proxies of MA1. 'Hendy tests' indicate that the degassing-controlled isotope fractionation of MA2 and MA3 had already started at the cave roof, especially when drip rates were low. Changing pathways and residence times of the seepage water caused a non-climatically controlled isotope fractionation, which may be generally important in ventilated caves during phases of low drip rates. Our proxies indicate that the Neoglacial cold phases from ~3.5 to 2.5 and from ~0.7 to 0.1 ka BP were characterised by 30% lower precipitation compared with the Medieval Warm Period from 1.2 to 0.8 ka BP, which was extremely humid in this region.
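Calibrating a proxy like Mg/Ca against station precipitation is, at its simplest, a linear regression. A minimal ordinary-least-squares sketch with hypothetical data pairs (the abstract reports an inverse relation: more rainfall, lower soluble-element concentration):

```python
# Minimal least-squares sketch of the kind of proxy calibration described:
# regressing a carbonate Mg/Ca record against meteoric precipitation.
# The data pairs below are hypothetical, for illustration only.

def ols(x, y):
    """Slope and intercept of y = a*x + b by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

precip_mm = [400, 500, 600, 700, 800]  # hypothetical annual precipitation
mg_ca = [2.0, 1.8, 1.6, 1.4, 1.2]      # hypothetical Mg/Ca ratios (inverse relation)
slope, intercept = ols(precip_mm, mg_ca)
print(round(slope, 6), round(intercept, 6))  # slope is about -0.002, intercept about 2.8
```

With the fitted coefficients, a measured Mg/Ca value can be inverted to a precipitation estimate, which is the essence of the 100-year calibration the abstract describes.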
Abstract:
A methodology is developed for experimental simulation of the state of spent nuclear fuel that occurs on the sea floor as a result of catastrophes or dumping. Data from long-term (more than 2000 days) experiments estimating the 85Kr and 137Cs release rate from spent nuclear fuel (fragments of irradiated UO2 pellets) were obtained for the first time; these estimates confirm a hypothesis we proposed in the early 1990s that 85Kr is released earlier (at a rate one order of magnitude higher than that of 137Cs) than other fission products when fuel containment loses integrity as a result of corrosion on the sea floor. A method and technique are developed for onboard sampling and extraction of 85Kr and 137Cs (as well as sampling of tritium, a product of ternary 235U fission) and their radiometric analysis at coastal laboratories. The first data on the 85Kr background in bottom layers of the Barents and Kara Seas, and on 137Cs and 3H in these seas (as of 2003), are presented. Models are developed for estimating the dilution of fission products of spent nuclear fuel and their transport along the floor in accident and dumping regions. An experimental method is proposed for examining the state of spent nuclear fuel on the sea floor (one expedition every 2–3 years) via 85Kr release into the environment (a leak tracer); this release is an indicator of destruction of fuel containment and release of products of spent nuclear fuel during 235UO2 corrosion in sea water.
Abstract:
In order to determine the presence of Fusarium spp. in atmospheric dust and rainfall dust, samples were collected during September 2007, and July, August, and October 2008. The results reveal the prevalence of airborne Fusarium species coming from the atmosphere of the southeast coast of Spain. Five different Fusarium species were isolated from the settling dust: Fusarium oxysporum, F. solani, F. equiseti, F. dimerum, and F. proliferatum. Moreover, rainwater samples were obtained during significant rainfall events in January and February 2009. Using the dilution-plate method, 12 fungal genera were identified from these rainwater samples. Specific analyses of the rainwater revealed the presence of three species of Fusarium: F. oxysporum, F. proliferatum and F. equiseti. A total of 57 isolates of Fusarium spp. obtained from both rainwater and atmospheric rainfall dust sampling were inoculated onto melon (Cucumis melo L.) cv. Piñonet and tomato (Lycopersicon esculentum Mill.) cv. San Pedro. These species were chosen because they are the main herbaceous crops in Almeria province. The results presented in this work strongly indicate that spores or propagules of Fusarium are able to cross the continental barrier, carried by winds from the Sahara (Africa) to crop or coastal lands in Europe. Results show differences in the pathogenicity of the isolates tested. Both hosts showed root rot when inoculated with different species of Fusarium, although fresh weight measurements did not provide any information about pathogenicity. The findings presented above are strong indications that long-distance transmission of Fusarium propagules may occur. Diseases caused by species of Fusarium are common in these areas. They were in the past, and are still today, a problem for greenhouse crops in Almería, and many species have been listed as pathogens on agricultural crops in this region. Saharan air masses dominate the Mediterranean regions. The evidence of long-distance dispersal of Fusarium spp. by atmospheric dust and rainwater, together with their proven pathogenicity, must be taken into account in epidemiological studies.
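The dilution-plate counts behind genus identifications like those above reduce to simple arithmetic: colony-forming units per ml of the original sample equal the colony count scaled back up by the dilution factor and the plated volume. A small sketch with hypothetical numbers:

```python
# Standard dilution-plate arithmetic: scale a plate's colony count back to
# the undiluted sample. The counts and dilution below are hypothetical.

def cfu_per_ml(colonies: int, dilution_factor: float, volume_ml: float) -> float:
    """Colony-forming units per ml of the original (undiluted) sample."""
    return colonies * dilution_factor / volume_ml

# 38 colonies on a plate spread with 0.1 ml of a 1:100 dilution:
print(round(cfu_per_ml(38, dilution_factor=100, volume_ml=0.1)))  # 38000 CFU/ml
```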
Abstract:
This paper evaluates the water footprint of Spanish olives and olive oil over the period 1997-2008. In particular, it analyses the three colour components of the water footprint: green (rainwater stored in the soil), blue (surface and groundwater) and grey (freshwater required to assimilate the load of pollutants). Apparent water productivity and the virtual water embedded in olive oil exports have also been studied. Results show that more than 99.5% of the water footprint of one liter of bottled olive oil is related to olive production, whereas less than 0.5% is due to other components such as the bottle, cap and label. Over the studied period, the green water footprint in absolute terms of Spanish olive oil production represents about 72% in rainfed systems and just 12% in irrigated olive orchards. Blue and grey water footprints represent 6% and 10% of the national water footprint, respectively. It is shown that olive production is concentrated in regions with the smallest water footprint per unit of product. However, the increase in groundwater consumption in the main olive producing region (Andalusia), from 98 to 378 Mm³ between 1997 and 2008, has added significant pressure on the upstream Guadalquivir basin. This raises questions about the sustainability of irrigated olive orchards for export from the region. Finally, the virtual water related to olive oil exports illustrates the importance of the green water footprint of rainfed olives, amounting to about 77% of the total virtual water exports.
Abstract:
The wetting front is the zone where water invades and advances into an initially dry porous material, and it plays a crucial role in solute transport through the unsaturated zone. Water is an essential part of the physiological process of all plants. Through water, necessary minerals are moved from the roots to the parts of the plants that require them. Water moves chemicals from one part of the plant to another. It is also required for photosynthesis, for metabolism and for transpiration. The leaching of chemicals by wetting fronts is influenced by two major factors, namely the irregularity of the fronts and heterogeneity in the distribution of chemicals, both of which have been described using fractal techniques. Soil structure can significantly modify infiltration rates and flow pathways in soils. Relations between features of soil structure and features of infiltration could be elucidated from the velocities and the structure of wetting fronts. When rainwater falls onto soil, it doesn't just pool on surfaces. Water, or another fluid, acts differently on porous surfaces. If the surface is permeable (porous), it seeps down through layers of soil, filling that layer to capacity. Once that layer is filled, it moves down into the next layer. In sandy soil, water moves quickly, while it moves much more slowly through clay soil. The moving boundary of water advancing through soil layers is called the wetting front. Our research concerns the motion of a liquid into an initially dry porous medium. Our work presents a theoretical framework for studying the physical interplay of a stationary wetting front of fractal dimension D with different porous materials. The aim was to model the mass-geometry interplay by using the fractal dimension D of a stationary wetting front. The plane corresponding to the image is divided into squares (the minimum size corresponds to the pixel size) of side length ε.

We acknowledge the help of Prof. M. García Velarde and the facilities offered by the Pluri-Disciplinary Institute of the Complutense University of Madrid. We also acknowledge the help of the European Community under the project Multi-scale complex fluid flows and interfacial phenomena (PITN-GA-2008-214919). Thanks are also due to ERCOFTAC (PELNoT, SIG 14).
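The grid-of-squares procedure described in the abstract is the standard box-counting estimate of a fractal dimension D: cover the front's pixels with squares of side ε, count the occupied squares N(ε), and take the slope of log N(ε) against log(1/ε). A minimal sketch, using a straight front as a sanity check (it should give D close to 1):

```python
# Box-counting estimate of the fractal dimension D of a point set, as in the
# grid-of-squares procedure described above. The "front" here is a synthetic
# straight line, used only to check that the estimator returns D near 1.
import math

def box_count(points, eps):
    """Number of eps-sized grid boxes containing at least one point."""
    return len({(int(x // eps), int(y // eps)) for x, y in points})

def fractal_dimension(points, sizes):
    """Slope of log N(eps) versus log(1/eps) over the given box sizes."""
    xs = [math.log(1.0 / e) for e in sizes]
    ys = [math.log(box_count(points, e)) for e in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

line = [(i * 0.001, 0.5) for i in range(1000)]  # a straight "wetting front"
print(round(fractal_dimension(line, [0.5, 0.25, 0.125, 0.0625]), 2))  # prints 1.0
```

A rough, irregular front occupies more boxes at small ε than a smooth one, which is what pushes D above 1 and quantifies the front irregularity the abstract links to leaching.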
Abstract:
Steam Generator Tube Rupture (SGTR) sequences in Pressurized Water Reactors are known to be among the most demanding transients for the operating crew. SGTRs are a special kind of transient, as they can lead to radiological releases without core damage or containment failure, since they can constitute a direct path from the reactor coolant system to the environment. The first methodology used to perform the Deterministic Safety Analysis (DSA) of an SGTR did not credit operator action for the first 30 min of the transient, assuming that the operating crew was able to stop the primary-to-secondary leakage within that period of time. However, the various real SGTR accident cases that occurred in the USA and elsewhere demonstrated that operators usually take more than 30 min to stop the leakage in actual sequences. Some methodologies were proposed to overcome that fact, considering operator actions from the beginning of the transient, as is done in Probabilistic Safety Analysis. This paper presents the results of comparing different assumptions regarding the single failure criterion and the operator actions taken from the most common methodologies included in the different Deterministic Safety Analyses. One single failure criterion that has not been analysed previously in the literature is also proposed and analysed in this paper. The comparison is done with a PWR Westinghouse three-loop model in the TRACE code (Almaraz NPP) with best-estimate assumptions but including deterministic hypotheses such as the single failure criterion or loss of offsite power. The behaviour of the reactor is quite diverse depending on the different assumptions made regarding the operator actions. On the other hand, although there are strong conservatisms included in the hypotheses, such as the single failure criterion, all the results are quite far from the regulatory limits. In addition, some improvements to the Emergency Operating Procedures to minimize the offsite release from the damaged SG in case of an SGTR are outlined, taking into account the offsite dose sensitivity results.