924 results for STORM
Abstract:
A three-dimensional, regional coupled atmosphere-ocean model with full physics is developed to study air-sea interactions during winter storms off the U.S. East Coast. Because of the scarcity of open ocean observations, models such as this offer valuable opportunities to investigate how oceanic forcing drives atmospheric circulation and vice versa. The study presented here considers conditions of strong atmospheric forcing (high wind speeds) and strong oceanic forcing (significant sea surface temperature (SST) gradients). A simulated atmospheric cyclone evolves in a manner consistent with Eta reanalysis, and the simulated air-sea heat and momentum exchanges strongly affect the circulations in both the atmosphere and the ocean. For the simulated cyclone of 19-20 January 1998, maximum ocean-to-atmosphere heat fluxes first appear over the Gulf Stream in the South Atlantic Bight, and this results in rapid deepening of the cyclone off the Carolina coast. As the cyclone moves eastward, the heat flux maximum shifts into the region near Cape Hatteras and later northeast of Hatteras, where it enhances the wind locally. The oceanic response to the atmospheric forcing is closely related to the wind direction. Southerly and southwesterly winds tend to strengthen surface currents in the Gulf Stream, whereas northeasterly winds weaken the surface currents in the Gulf Stream and generate southwestward flows on the shelf. The oceanic feedback to the atmosphere moderates the cyclone strength. Compared with a simulation in which the oceanic model always passes the initial SST to the atmospheric model, the coupled simulation in which the oceanic model passes the evolving SST to the atmospheric model produces higher ocean-to-atmosphere heat flux near Gulf Stream meander troughs. This is due to wind-driven lateral shifts of the stream, which in turn enhance the local northeasterly winds.
Away from the Gulf Stream, the coupled simulation produces surface winds that are roughly 5-10% weaker. Differences in the surface ocean currents between these two experiments are significant on the shelf and in the open ocean.
Abstract:
Dissolved organic matter (DOM) dynamics during storm events have received considerable attention in forested watersheds, but the extent to which storms impart rapid changes in DOM concentration and composition in highly disturbed agricultural watersheds remains poorly understood. In this study, we used identical in situ optical sensors for DOM fluorescence (FDOM) with and without filtration to continuously evaluate surface water DOM dynamics in a 415 km² agricultural watershed over a 4 week period containing a short-duration rainfall event. Peak turbidity preceded peak discharge by 4 h and increased by over 2 orders of magnitude, while the peak filtered FDOM lagged behind peak turbidity by 15 h. FDOM values reported using the filtered in situ fluorometer increased nearly fourfold and were highly correlated with dissolved organic carbon (DOC) concentrations (r² = 0.97), providing a highly resolved proxy for DOC throughout the study period. Discrete optical properties including specific UV absorbance (SUVA254), spectral slope (S290-350), and fluorescence index (FI) were also strongly correlated with in situ FDOM and indicate a shift toward aromatic, high molecular weight DOM from terrestrially derived sources during the storm. The lag of the peak in FDOM behind peak discharge presumably reflects the draining of watershed soils from natural and agricultural landscapes. Field and experimental evidence showed that unfiltered FDOM measurements underestimated filtered FDOM concentrations by up to ~60% at particle concentrations typical of many riverine systems during hydrologic events. Together, laboratory and in situ data provide insights into the timing and magnitude of changes in DOM quantity and quality during storm events in an agricultural watershed, and indicate the need for sample filtration in systems with moderate to high suspended sediment loads.
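The FDOM-DOC proxy described above amounts to a simple linear calibration. The sketch below, with entirely illustrative paired values (not data from this study), shows how such a calibration and its r² could be computed:

```python
import numpy as np

# Hypothetical paired calibration samples: in situ filtered FDOM and
# lab-measured DOC (mg/L). Values are illustrative only.
fdom = np.array([12.0, 18.5, 25.1, 33.4, 41.0, 47.8])
doc = np.array([1.9, 2.6, 3.4, 4.3, 5.1, 5.8])

# Least-squares linear calibration: DOC = a * FDOM + b.
a, b = np.polyfit(fdom, doc, 1)

# Coefficient of determination (r^2) of the calibration.
pred = a * fdom + b
r2 = 1.0 - np.sum((doc - pred) ** 2) / np.sum((doc - doc.mean()) ** 2)

# The fitted line then converts a continuous FDOM record into a
# high-frequency DOC estimate.
doc_estimate = a * 30.0 + b
```

A fit of this kind only holds while DOM composition is stable, which is why the study tracks SUVA254 and other optical properties alongside FDOM.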
Abstract:
Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial-resolution DEMs for mapping coastal inundation. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation in elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. 1,000 error simulations were added to the original DEM and reclassified using a hydrologically correct bathtub method. The probability of inundation under a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated by counting the proportion of times out of the 1,000 simulations that a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, results showed that when considering a 1% exceedance probability, the inundated area was approximately 11% larger than that mapped using the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey the uncertainties inherent in spatial data and analysis.
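The Monte Carlo procedure can be sketched as follows. This is a minimal illustration with a toy DEM and a simplified, spatially uncorrelated Gaussian error model; the study itself uses regression-kriging with sequential Gaussian simulation to preserve spatial error correlation, and a connectivity-checked ("hydrologically correct") bathtub step, both omitted here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy DEM (elevations in metres). A real application would use a
# high-resolution DEM; values and grid size here are illustrative.
dem = np.array([[0.2, 0.6, 1.1, 1.9],
                [0.4, 0.9, 1.4, 2.3],
                [0.7, 1.2, 1.8, 2.6]])

water_level = 1.5   # assumed: 1 m SLR plus a 1-in-100-yr storm surge
n_sim = 1000        # number of error realisations, as in the study
error_sd = 0.15     # assumed DEM error standard deviation (m)

# Add one simulated error field per realisation, reclassify against the
# water level, and count how often each cell floods.
inundated_count = np.zeros(dem.shape)
for _ in range(n_sim):
    perturbed = dem + rng.normal(0.0, error_sd, size=dem.shape)
    inundated_count += perturbed <= water_level

# Per-cell probability of inundation = proportion of realisations flooded.
p_inundated = inundated_count / n_sim

# Deterministic bathtub map, for comparison with the probabilistic result.
bathtub = dem <= water_level
```

Thresholding `p_inundated` at, say, 0.01 reproduces the "1% exceedance probability" map discussed above, which in the study was roughly 11% larger than the deterministic bathtub extent.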
Abstract:
A seawall was constructed in 1897 along the steep coast of Streckelsberg, Usedom Island, to stop the cliff retreat. It was destroyed several times by storm-induced sea floods, reconstructed, and gradually extended to a length of 450 m. After the severe storm event of 1-2 March 1949, no more repair work was implemented. The ruins were no longer capable of preventing further erosion of the Streckelsberg cliff. A new protective structure became a necessity to counter the ongoing erosion and to check the lowering of the abrasion platform. The construction of three breakwaters began in 1995. A severe storm occurred on 3-4 November 1995, before their completion. Coastal bottom sediment mapping using sidescan sonar carried out two days later showed that a channel system down to a depth of 1.5 m had been cut into the sand layer covering the sea floor on both sides of the Koserow Bank. The bottom of these channels was paved with gravel and boulders. This layer was encountered throughout the surveyed area below a mobile sand layer. Deposited bodies of fine sand half a meter high and erosional cavities several m² in diameter around boulders led to the conclusion that intensive sediment movement down to a depth of 11 m had taken place during the storm. A storm-related direction of sediment transport could not be identified. The existing section of the breakwaters withstood the severe storm.
Abstract:
This study investigates the rate of erosion during the 1951-2006 period on the Bykovsky Peninsula, located north-east of the harbour town of Tiksi, northern Siberia. Its coastline, which is characterized by the presence of ice-rich sediment (Ice Complex) and the vicinity of the Lena River Delta, retreated at a mean rate of 0.59 m/yr between 1951 and 2006. Total coastline change ranged from 434 m of erosion to 92 m of accretion during these 56 years and exhibited large variability (σ = 45.4). Ninety-seven percent of the rates observed were less than 2 m/yr and 81.6% were less than 1 m/yr. No significant trend in erosion could be detected despite the study of five temporal subperiods within 1951-2006. Erosion modes and rates actually appear to be strongly dependent on the nature of the backshore material, erosion being stronger along low-lying coastal stretches affected by past or current thermokarst activity. The juxtaposition of wind records monitored at the town of Tiksi and erosion records yielded no significant relationship despite the strong amplitude of both data sets. We attribute this poor relationship to the only rough incorporation of sea-ice cover in our storm extraction algorithm, the use of land-based wind records rather than offshore winds, the proximity of the peninsula to the Lena River Delta freshwater and sediment plume, and the local topographical constraints on wave development.
Abstract:
Very recently (Banerjee et al., Astrophys. Space Sci., doi:10.1007/s10509-011-0836-1, 2011) the statistics of the geomagnetic disturbance storm time (Dst) index were addressed, and the conclusion from that analysis suggests that the underlying dynamical process can be modeled as a fractional Brownian motion with persistent long-range correlations. In this comment we expose several misconceptions and flaws in the statistical analysis of that work. On the basis of these arguments, the former conclusion should be revisited.
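Claims of persistent fractional Brownian motion are typically assessed through a scaling exponent such as the Hurst exponent H. A minimal sketch of one standard estimator (the aggregated-variance method), applied here to white noise rather than to actual Dst data, is:

```python
import numpy as np

rng = np.random.default_rng(1)

def hurst_aggvar(x, block_sizes):
    """Aggregated-variance estimate of the Hurst exponent of a stationary
    increment series: Var(block means) scales as m**(2H - 2) for block
    size m, so H is recovered from the slope of a log-log fit."""
    log_m, log_v = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(means.var()))
    slope, _ = np.polyfit(log_m, log_v, 1)
    return 1.0 + slope / 2.0

# White noise (increments of ordinary Brownian motion) should yield
# H close to 0.5; persistent long-range correlations, as claimed for
# Dst, would show up as H > 0.5.
x = rng.normal(size=100_000)
h = hurst_aggvar(x, [10, 20, 50, 100, 200, 500])
```

Estimators of this kind are sensitive to nonstationarity and finite-sample bias, which is exactly the class of pitfalls a comment like this one would scrutinise.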
Abstract:
Over the last decade, Grid computing paved the way for a new level of large-scale distributed systems. This infrastructure made it possible to securely and reliably take advantage of widely separated computational resources belonging to several different organizations. Resources can be incorporated into the Grid, building a theoretical virtual supercomputer. In time, cloud computing emerged as a new type of large-scale distributed system, inheriting and expanding the expertise and knowledge obtained so far. Some of the main characteristics of Grids naturally evolved into clouds, others were modified and adapted, and others were simply discarded or postponed. Regardless of these technical specifics, Grids and clouds together can be considered one of the most important advances in large-scale distributed computing of the past ten years; however, this step in distributed computing has come along with a completely new level of complexity. Grid and cloud management mechanisms play a key role, and a correct analysis and understanding of system behavior are needed. Large-scale distributed systems must be able to self-manage, incorporating autonomic features capable of controlling and optimizing all resources and services. Traditional distributed computing management mechanisms analyze each resource separately and adjust specific parameters of each one of them. When trying to adapt the same procedures to Grid and cloud computing, the vast complexity of these systems can make this task extremely complicated. But the complexity of large-scale distributed systems may be only a matter of perspective. It could be possible to understand the Grid or cloud behavior as a single entity, instead of as a set of resources. This abstraction could provide a different understanding of the system, describing large-scale behavior and global events that probably would not be detected by analyzing each resource separately.
In this work we define a theoretical framework that combines both ideas, multiple resources and single entity, to develop large-scale distributed systems management techniques aimed at system performance optimization, increased dependability and Quality of Service (QoS). The resulting synergy could be the key to addressing the most important difficulties of Grid and cloud management.
Abstract:
Storm evolution is fundamental for analysing the damage progression of the different failure modes and for establishing suitable protocols for maintaining and optimally sizing structures. However, this aspect has hardly been studied, and practically all of the studies dealing with the subject adopt the Equivalent Triangle Storm. In contrast with this approach, two new ones are proposed. The first is the Equivalent Triangle Magnitude Storm (ETMS) model, whose base, the triangular storm duration D, is established such that its magnitude H_T (the area describing the storm history above the reference threshold level which sets the storm condition) equals the real storm magnitude. The other is the Equivalent Triangle Number of Waves Storm (ETNWS), where the base is defined in terms of the real storm's number of waves, N_z. Three approaches are used for estimating the mean period, T_m, associated with each of the sea states defining the storm evolution, which is necessary to determine the full energy flux withstood by the structure in the course of the extreme event. Two are based on the representativity of the JONSWAP spectrum and the other uses the bivariate Gumbel copula (H_s, T_m) resulting from fitting the storm peaks. The representativity of the approaches proposed and of those defined in the specialised literature is analysed by comparing the main armour layer's progressive loss of hydraulic stability caused by real storms with that relating to theoretical ones. An empirical maximum energy flux model is used for this purpose. The agreement between the empirical and theoretical results demonstrates that the representativity of the different approaches depends on the storm characteristics and points towards a need to investigate other geometrical shapes to characterise the storm evolution associated with sea states heavily influenced by swell wave components.
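The ETMS construction reduces to an area-matching calculation: a triangle of height (Hs_peak - threshold) and base D has area 0.5 * D * (Hs_peak - threshold), so equating that area to the storm magnitude H_T fixes D. A minimal sketch with illustrative values (not data from the paper):

```python
# Illustrative storm record: significant wave height Hs (m) at hourly
# sea states; the threshold and Hs values are assumptions.
dt = 1.0          # hours between sea states
threshold = 2.0   # Hs level defining the storm condition (m)
hs = [2.1, 2.8, 3.6, 4.5, 4.1, 3.2, 2.4]

# Storm magnitude H_T: area of the Hs history above the threshold
# (simple rectangle rule for clarity).
h_t = sum(max(h - threshold, 0.0) for h in hs) * dt

# ETMS: solve 0.5 * D * (Hs_peak - threshold) = H_T for the base D.
hs_peak = max(hs)
d_etms = 2.0 * h_t / (hs_peak - threshold)
```

The ETNWS variant would instead fix the triangle base from the real storm's number of waves N_z, which requires the mean period T_m of each sea state.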
Abstract:
The different theoretical models related to storm wave characterization focus on determining the significant wave height at the storm peak, the mean period and, usually assuming a triangular storm shape, the storm duration. In some cases, the main direction is also considered. Nevertheless, definition of the whole storm history, including the variation of the main random variables during the storm cycle, is not taken into consideration. The representativeness of the proposed storm models, analysed in a recent study using an empirical maximum energy flux time-dependent function, shows that the behaviour of the different storm models is extremely dependent on the climatic characteristics of the project area. Moreover, there are no theoretical models able to adequately reproduce the storm history evolution of sea states characterized by important swell components. To overcome this shortcoming, several theoretical storm shapes are investigated taking into consideration the bases of the three best theoretical storm models: the Equivalent Magnitude Storm (EMS), the Equivalent Number of Waves Storm (ENWS) and the Equivalent Duration Storm (EDS) models. To analyse the representativeness of the new storm shapes, the aforementioned maximum energy flux formulation and a wave overtopping discharge structure function are used. With the empirical energy flux formulation, the assessment of the different approaches focuses on the progressive hydraulic stability loss of the main armour layer caused by real and theoretical storms. For the overtopping structure equation, the total volume of discharge is considered. In all cases, the results obtained highlight the greater representativeness of the triangular EMS model for sea waves and of the trapezoidal (non-parallel sides) EMS model for waves with a higher degree of wave development.
Taking into account the increase in offshore and shallow water wind turbines, maritime transport and deep vertical breakwaters, the maximum wave height of the whole storm history, and that corresponding to each sea state belonging to the storm cycle's evolution, is also considered. The procedure considers the information usually available for the characterization of extreme waves. Extrapolations of the maximum wave height of the selected storms have also been considered. The fourth-order statistics of the sea states belonging to the real and theoretical storms have been estimated to complete the statistical analysis of individual wave heights.
Abstract:
A chronic debilitating parasitic infection, viscerotropic leishmaniasis (VTL), has been described in Operation Desert Storm veterans. Diagnosis of this disease, caused by Leishmania tropica, has been difficult due to low or absent specific immune responses in traditional assays. We report the cloning and characterization of two genomic fragments encoding portions of a single 210-kDa L. tropica protein useful for the diagnosis of VTL in U.S. military personnel. The recombinant proteins encoded by these fragments, recombinant (r) Lt-1 and rLt-2, contain a 33-amino acid repeat that reacts with sera from Desert Storm VTL patients and with sera from L. tropica-infected patients with cutaneous leishmaniasis. Antibody reactivities to rLt-1 indicated a bias toward IgG2 in VTL patient sera. Peripheral blood mononuclear cells from VTL patients produced interferon gamma, but not interleukin 4 or 10, in response to rLt-1. No cytokine production was observed in response to parasite lysate. The results indicate that specific leishmanial antigens may be used to detect immune responses in VTL patients with chronic infections.
Abstract:
Stormwater runoff is a major cause of surface water pollution in western Washington, and cost-effective control measures are needed to reduce contamination of receiving waters. This project evaluated the performance efficiency and costs of installing a residential rain garden by inspecting two gardens and testing water quality samples from one. Infiltration through the garden was effective in reducing metals and nutrients, as indicated by reductions in copper and nitrate concentrations of 75% and 97%, respectively. The costs were affordable for most homeowners and cost-effective compared with other options. After the on-site investigations, the stormwater manual for the State of Washington was reviewed and eight recommendations were made for improving it.