981 results for cloud environment


Relevance: 20.00%

Abstract:

Whereas some species may rely on periodic drought conditions for part of their life histories, or have life strategies suited to exploiting the habitat or changed environmental conditions that are created by drought, for other organisms it is a time of stress. Periodic drought conditions therefore generate a series of waves of colonization and extinctions. Studies on lowland wet grassland, in winterbournes and in the toiche zone of both ponds and rivers, also demonstrate that different organisms are competitively favoured with changing hydrological conditions, and that this process prevents any one species from overwhelming its competitors. Competitive impacts may be inter- and intraspecific. It is therefore apparent that the death of organisms such as adult fish during severe drought conditions, though traumatic for human onlookers and commercial interests, may be merely a regular occurrence to which the ecosystem is adapted. The variability of climatic conditions thereby provides a direct influence on the maintenance of biological diversity, and it is this very biodiversity that provides the ecosystem with the resilience to respond to environmental changes in both the short and the longer term.

Relevance: 20.00%

Abstract:

The bulk of the European Community's water policy legislation was developed between the mid-1970s and the early 1990s. These directives addressed specific substances, sources, uses or processes, but their differing methods, definitions and aims caused problems. The Water Framework Directive (WFD) aims to resolve this piecemeal approach. The Environment Agency (EA) welcomed and supported the overall objective of establishing a coherent legislative framework. The EA has been discussing the implications of the WFD with European partners, has developed a timetable for implementation, and a special team will commission the necessary research.

Relevance: 20.00%

Abstract:

Complexity in the earthquake rupture process can result from many factors. This study investigates the origin of such complexity by examining several recent, large earthquakes in detail. In each case the local tectonic environment plays an important role in understanding the source of the complexity.

Several large shallow earthquakes (Ms > 7.0) along the Middle American Trench show similarities and differences that may lead to a better understanding of fracture and subduction processes. They are predominantly thrust events, consistent with the known subduction of the Cocos plate beneath N. America. Two events occurring along this subduction zone close to triple junctions show considerable complexity. This may be attributable to a more heterogeneous stress environment in these regions and, as such, has implications for other subduction zone boundaries.

An event which looks complex but is actually rather simple is the 1978 Bermuda earthquake (Ms ~ 6). It is located predominantly in the mantle. Its mechanism is one of pure thrust faulting with a strike N 20°W and dip 42°NE. Its apparent complexity is caused by local crustal structure. This is an important event in terms of understanding and estimating seismic hazard on the eastern seaboard of N. America.

A study of several large strike-slip continental earthquakes identifies characteristics which are common to them and may be useful in determining what to expect from the next great earthquake on the San Andreas fault. The events are the 1976 Guatemala earthquake on the Motagua fault and two events on the Anatolian fault in Turkey (the 1967 Mudurnu Valley and 1976 E. Turkey events). An attempt to model the complex P-waveforms of these events results in good synthetic fits for the Guatemala and Mudurnu Valley events. However, the E. Turkey event proves to be too complex, as it may have associated thrust or normal faulting. Several individual sources occurring at intervals of between 5 and 20 seconds characterize the Guatemala and Mudurnu Valley events. The maximum size of an individual source appears to be bounded at about 5 x 10^26 dyne-cm. A detailed source study including directivity is performed on the Guatemala event. The source time history of the Mudurnu Valley event illustrates its significance in modeling strong ground motion in the near field. The complex source time series of the 1967 event produces amplitudes a factor of 2.5 greater than those of a uniform model scaled to the same size, for a station 20 km from the fault.
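For scale, the bound quoted above can be converted to a moment magnitude with the standard Hanks-Kanamori relation, Mw = (2/3) log10(M0) - 10.7 with M0 in dyne-cm; the short Python check below is an illustration added here, not a calculation from the original study.

```python
import math

def moment_magnitude(m0_dyne_cm: float) -> float:
    """Hanks-Kanamori moment magnitude from seismic moment M0 in dyne-cm."""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

# Upper bound on an individual sub-event quoted above: 5 x 10^26 dyne-cm
print(round(moment_magnitude(5e26), 2))  # ~7.1
```

In other words, each sub-event in these sequences is itself roughly a magnitude-7 rupture.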

Three large and important earthquakes demonstrate an important type of complexity: multiple-fault complexity. The first, the 1976 Philippine earthquake, an oblique thrust event, represents the first seismological evidence for a northeast-dipping subduction zone beneath the island of Mindanao. A large event, following the mainshock by 12 hours, occurred outside the aftershock area and apparently resulted from motion on a subsidiary fault, since it had a strike-slip mechanism.

An aftershock of the great 1960 Chilean earthquake on June 6, 1960, proved to be an interesting discovery. It appears to be a large strike-slip event at the main rupture's southern boundary. It most likely occurred on the landward extension of the Chile Rise transform fault, in the subducting plate. The results for this event suggest that a small event triggered a series of slow events, with the whole sequence lasting longer than 1 hour. This is indeed a "slow earthquake".

Perhaps one of the most complex of these events is the recent Tangshan, China, earthquake. It began as a large strike-slip event, and within several seconds of the mainshock it may have triggered thrust faulting to the south of the epicenter. There is no doubt, however, that it triggered a large oblique normal event to the northeast, 15 hours after the mainshock. This event certainly contributed to the great loss of life sustained as a result of the Tangshan earthquake sequence.

What has been learned from these studies has been applied to predict what one might expect from the next great earthquake on the San Andreas. The expectation from this study is that such an event would be a large complex event, not unlike, but perhaps larger than, the Guatemala or Mudurnu Valley events. That is to say, it will most likely consist of a series of individual events in sequence. It is also quite possible that the event could trigger associated faulting on neighboring fault systems such as those occurring in the Transverse Ranges. This has important bearing on the earthquake hazard estimation for the region.

Relevance: 20.00%

Abstract:

This thesis focuses on improving the simulation skills and the theoretical understanding of the subtropical low cloud response to climate change.

First, an energetically consistent forcing framework is designed and implemented for large-eddy simulation (LES) of the low-cloud response to climate change. The three representative present-day subtropical low-cloud regimes of cumulus (Cu), cumulus-over-stratocumulus, and stratocumulus (Sc) are all well simulated with this framework, and the results are comparable to the conventional fixed-SST approach. However, the cumulus response to climate warming subject to energetic constraints differs significantly from the conventional approach with fixed SST. Under the energetic constraint, the subtropics warm less than the tropics, since longwave (LW) cooling is more efficient through the drier subtropical free troposphere. The surface latent heat flux (LHF) also increases only weakly subject to the surface energetic constraint. Both factors contribute to an increased estimated inversion strength (EIS) and a decreased inversion height. The decreased Cu depth contributes to a decrease in liquid water path (LWP) and a weak positive cloud feedback. The conventional fixed-SST approach instead simulates a strong increase in LHF and a deepening of the Cu layer, leading to a weakly negative cloud feedback. This illustrates the importance of energetic constraints to the simulation and understanding of the sign and magnitude of the low-cloud feedback.
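Since the estimated inversion strength (EIS) is central to the argument above, here is a minimal Python sketch of the Wood and Bretherton (2006) diagnostic, EIS = LTS - Gamma_m(850 hPa) * (z700 - LCL), using a standard approximation for the moist-adiabatic lapse rate; the constant values, the hypsometric estimate of z700, the 125 m/K rule of thumb for the LCL, and the illustrative inputs are assumptions added here, not quantities from the thesis.

```python
import math

# Physical constants (SI)
G, CP, RA, RV, LV = 9.81, 1004.0, 287.0, 461.5, 2.5e6
EPS = RA / RV  # ~0.622

def theta(T, p, p0=1000e2):
    """Dry potential temperature [K] for temperature T [K] at pressure p [Pa]."""
    return T * (p0 / p) ** (RA / CP)

def e_sat(T):
    """Saturation vapor pressure [Pa], Bolton (1980) approximation."""
    return 611.2 * math.exp(17.67 * (T - 273.15) / (T - 29.65))

def q_sat(T, p):
    """Saturation specific humidity [kg/kg]."""
    es = e_sat(T)
    return EPS * es / (p - (1.0 - EPS) * es)

def gamma_moist(T, p):
    """Moist-adiabatic lapse rate [K/m] (standard approximate expression)."""
    qs = q_sat(T, p)
    num = 1.0 + LV * qs / (RA * T)
    den = 1.0 + LV ** 2 * qs / (CP * RV * T ** 2)
    return (G / CP) * (1.0 - num / den)

def eis(T_surf, Td_surf, T700, p_surf=1000e2, p700=700e2):
    """Estimated inversion strength [K], after Wood & Bretherton (2006)."""
    lts = theta(T700, p700) - theta(T_surf, p_surf)          # lower-tropospheric stability
    T_mean = 0.5 * (T_surf + T700)
    z700 = (RA * T_mean / G) * math.log(p_surf / p700)       # hypsometric estimate of z700 [m]
    lcl = 125.0 * (T_surf - Td_surf)                         # crude LCL height [m]
    return lts - gamma_moist(T_mean, 850e2) * (z700 - lcl)

# Illustrative inputs only (not thesis data): surface T and dewpoint, T at 700 hPa
print(round(eis(290.0, 285.0, 277.0), 1))  # ~5 K for these inputs
```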

Second, an extended eddy-diffusivity mass-flux (EDMF) closure for the unified representation of sub-grid-scale (SGS) turbulence and convection processes in general circulation models (GCMs) is presented. The inclusion of prognostic terms and the elimination of the infinitesimal-updraft-fraction assumption make it more flexible for implementation in models across different scales. This framework can be consistently extended to formulate multiple updrafts and downdrafts, as well as variances and covariances. It has been verified against LES in different boundary layer regimes in the current climate, and further development and implementation of this closure may help to improve the simulation and understanding of low-cloud feedback in GCMs.
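As background for the closure described above, the sketch below shows the generic EDMF decomposition of a sub-grid vertical flux into a down-gradient eddy-diffusivity part and a non-local mass-flux part; the single-updraft form, variable names, and sample profiles are illustrative assumptions and not the thesis's specific formulation.

```python
import numpy as np

def edmf_flux(phi_mean, phi_up, a_up, w_up, w_mean, K, dz):
    """
    Generic EDMF decomposition of the sub-grid vertical flux of a scalar phi
    on a uniform vertical grid with spacing dz:

        <w'phi'> ~= -K d(phi_mean)/dz + a_up * (w_up - w_mean) * (phi_up - phi_mean)

    All arguments except dz are 1-D vertical profiles of equal length.
    """
    ed_part = -K * np.gradient(phi_mean, dz)                 # local, down-gradient mixing
    mf_part = a_up * (w_up - w_mean) * (phi_up - phi_mean)   # non-local updraft transport
    return ed_part + mf_part

# Illustrative profiles (not thesis data): a 1 km boundary layer on 20 levels
dz = 50.0
z = np.arange(20) * dz
phi_mean = 300.0 + 0.003 * z            # e.g. potential temperature [K]
phi_up = phi_mean + 0.5                 # updrafts slightly warmer than the mean
a_up = np.full_like(z, 0.1)             # 10% updraft area fraction
w_up, w_mean = np.full_like(z, 1.0), np.zeros_like(z)
K = np.full_like(z, 10.0)               # eddy diffusivity [m^2/s]
print(edmf_flux(phi_mean, phi_up, a_up, w_up, w_mean, K, dz)[:3])
```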

Relevance: 20.00%

Abstract:

STEEL, the Caltech-developed nonlinear large-displacement analysis software, is currently used by a large number of researchers at Caltech. However, due to its complexity and its lack of visualization tools (such as pre- and post-processing capabilities), rapid creation and analysis of models with this software was difficult. SteelConverter was created to facilitate model creation through the use of the industry-standard finite element solver ETABS. This software allows users to create models in ETABS and intelligently convert model information such as geometry, loading, releases, and fixity into a format that STEEL understands. Models that would take several days to create and verify now take several hours or less. The productivity of the researcher, as well as the level of confidence in the model being analyzed, is greatly increased.
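Purely as an illustration of what such a conversion layer can look like, the Python sketch below maps a hypothetical member record onto a hypothetical text record; the Member fields, the to_steel_record helper, and the output layout are invented for illustration and do not reflect the actual ETABS or STEEL file formats.

```python
from dataclasses import dataclass

@dataclass
class Member:
    """Hypothetical frame-member record extracted from an ETABS model export."""
    name: str
    node_i: tuple[float, float, float]   # start coordinates [m]
    node_j: tuple[float, float, float]   # end coordinates [m]
    section: str
    release_i: bool                      # moment release at end i
    release_j: bool                      # moment release at end j

def to_steel_record(m: Member) -> str:
    """Serialize a member into a hypothetical fixed-order text record."""
    xi, yi, zi = m.node_i
    xj, yj, zj = m.node_j
    rel = f"{int(m.release_i)}{int(m.release_j)}"
    return (f"MEMBER {m.name} {xi:.3f} {yi:.3f} {zi:.3f} "
            f"{xj:.3f} {yj:.3f} {zj:.3f} {m.section} REL={rel}")

col = Member("C1", (0.0, 0.0, 0.0), (0.0, 0.0, 3.5), "W14X90", False, False)
print(to_steel_record(col))
```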

It has always been a major goal of Caltech to spread the knowledge created here to other universities. However, due to the complexity of STEEL, it was difficult for researchers or engineers from other universities to conduct analyses. While SteelConverter did help researchers at Caltech improve their research, sending SteelConverter and its documentation to other universities was less than ideal. Issues of version control, individual computer requirements, and the difficulty of releasing updates made a more centralized solution preferable. This is where the idea for Caltech VirtualShaker was born. Through the creation of a centralized website where users could log in, submit, analyze, and process models in the cloud, all of the major concerns associated with the use of SteelConverter were eliminated. Caltech VirtualShaker allows users to create profiles in which defaults associated with their most commonly run models are saved, and to submit multiple jobs to an online virtual server to be analyzed and post-processed. The creation of this website not only allowed for more rapid distribution of this tool, but also gave engineers and researchers with no access to powerful computer clusters a means to run computationally intensive analyses without the excessive cost of building and maintaining a computer cluster.

In order to increase confidence in the use of STEEL as an analysis system, as well as to verify the conversion tools, a series of comparisons was made between STEEL and ETABS. Six models of increasing complexity, ranging from a cantilever column to a twenty-story moment frame, were analyzed to determine the ability of STEEL to accurately calculate basic model properties, such as elastic stiffness and damping through a free vibration analysis, as well as more complex structural properties, such as overall structural capacity through a pushover analysis. These analyses showed very strong agreement between the two programs on every aspect of each analysis. However, they also showed the ability of the STEEL analysis algorithm to converge at significantly larger drifts than ETABS when using the more computationally expensive and structurally realistic fiber hinges. Following the ETABS analysis, the comparisons were repeated in a program better suited to highly nonlinear analysis, Perform. These analyses again showed very strong agreement between the two programs in every aspect of each analysis through instability. However, due to some limitations in Perform, free vibration analyses could not be conducted for the three-story one-bay chevron-braced frame, the two-bay chevron-braced frame, or the twenty-story moment frame. With the current trend towards ultimate-capacity analysis, the ability to use fiber-based models allows engineers to gain a better understanding of a building's behavior under these extreme load scenarios.
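As a small illustration of the kind of basic property extracted from a free-vibration comparison, the sketch below estimates a damping ratio from two displacement peaks using the logarithmic decrement; the peak values are illustrative and not taken from the STEEL or ETABS models.

```python
import math

def damping_ratio_from_peaks(x_n: float, x_nm: float, m: int) -> float:
    """
    Damping ratio from the logarithmic decrement of a free-vibration record,
    where x_n and x_nm are displacement peaks m cycles apart:
        delta = ln(x_n / x_nm) / m,  zeta = delta / sqrt((2*pi)^2 + delta^2)
    """
    delta = math.log(x_n / x_nm) / m
    return delta / math.sqrt((2.0 * math.pi) ** 2 + delta ** 2)

# Illustrative peaks: 1.00 -> 0.73 over one cycle
print(round(damping_ratio_from_peaks(1.00, 0.73, 1), 3))  # ~0.05, i.e. 5% of critical
```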

Following this, a final study was done on Hall's U20 structure [1], in which the structure was analyzed in all three programs and the results compared. The pushover curves from each program were compared, and the differences caused by variations in software implementation were explained. From this, conclusions can be drawn on the effectiveness of each analysis tool when attempting to analyze structures through the point of geometric instability. The analyses show that while ETABS was capable of accurately determining the elastic stiffness of the model, the analysis tool failed to converge following the onset of inelastic behavior. However, for the small number of time steps over which the ETABS analysis did converge, its results exactly matched those of STEEL, leading to the conclusion that ETABS is not an appropriate analysis package for analyzing a structure through the point of collapse when using fiber elements throughout the model. The analyses also showed that while Perform was capable of calculating the response of the structure accurately, restrictions in its material model resulted in a pushover curve that did not match that of STEEL exactly, particularly post-collapse. Such problems could, however, be alleviated by choosing a simpler material model.
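As a small illustration of how two pushover curves can be compared quantitatively, the sketch below interpolates them onto a common drift grid and reports the largest relative difference in base shear; the curves, units, and helper function are illustrative and are not the actual STEEL, ETABS, or Perform results.

```python
import numpy as np

def max_relative_difference(drift_a, force_a, drift_b, force_b):
    """
    Interpolate two pushover curves onto a shared drift grid and return the
    largest relative difference in base shear over the overlapping range.
    """
    lo = max(drift_a.min(), drift_b.min())
    hi = min(drift_a.max(), drift_b.max())
    grid = np.linspace(lo, hi, 200)
    fa = np.interp(grid, drift_a, force_a)
    fb = np.interp(grid, drift_b, force_b)
    return float(np.max(np.abs(fa - fb) / np.maximum(np.abs(fa), 1e-9)))

# Illustrative curves only: base shear [kN] vs roof drift ratio
drift = np.linspace(0.0, 0.04, 50)
curve_1 = 5e4 * np.tanh(drift / 0.010)
curve_2 = 5e4 * np.tanh(drift / 0.011)
print(f"max relative difference: {max_relative_difference(drift, curve_1, drift, curve_2):.1%}")
```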

Relevance: 20.00%

Abstract:

Genetic engineering now makes possible the insertion of DNA from many organisms into other prokaryotic, eukaryotic and viral hosts. This technology has been used to construct a variety of such genetically engineered microorganisms (GEMs). The possibility of accidental or deliberate release of GEMs into the natural environment has recently raised much public concern. The prospect of deliberate release of these microorganisms has prompted an increased need to understand the processes of survival, expression, transfer and rearrangement of recombinant DNA molecules in microbial communities. The methodology which is being developed to investigate these processes will greatly enhance our ability to study microbial population ecology.

Relevance: 20.00%

Abstract:

The biomass and composition of the phytoplankton are among the most important factors in water quality control. Determination of the phytoplankton assemblage is usually done by microscopic analysis (Utermöhl's method). Quantitative estimations of the biovolume, by cell counting and cell size measurements, are time-consuming and normally are not done in routine water quality control. Several alternatives have been tried: computer-based image analysis, spectral fluorescence signatures, flow cytometry and pigment fingerprinting aided by high performance liquid chromatography (HPLC). The latter method is based on the fact that each major algal group contains a specific marker carotenoid, which can be used for identification and relative quantification of that group within the total assemblage. This article gives a brief comparative introduction to the different techniques available and presents some recent results obtained by HPLC-based pigment fingerprinting, applied to three lakes of different trophic status. The results show that this technique yields reliable results from different lake types and is a powerful tool for studying the distribution pattern of the phytoplankton community in relation to water depth. However, some restrictions should be taken into account when interpreting routine data.
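To illustrate the idea behind pigment fingerprinting, the sketch below solves a small non-negative least-squares problem that distributes measured marker-pigment concentrations among algal groups; the pigment-to-group ratio matrix and the measured values are invented for illustration (real applications use calibrated ratios, e.g. CHEMTAX-style analysis), not data from the article.

```python
import numpy as np
from scipy.optimize import nnls

# Rows: marker pigments; columns: algal groups. Entries are illustrative
# pigment-to-chlorophyll-a ratios, NOT calibrated values from the article.
pigments = ["fucoxanthin", "peridinin", "zeaxanthin", "chlorophyll b"]
groups = ["diatoms", "dinoflagellates", "cyanobacteria", "green algae"]
R = np.array([
    [0.75, 0.00, 0.00, 0.00],   # fucoxanthin: mainly diatoms
    [0.00, 0.65, 0.00, 0.00],   # peridinin: dinoflagellates
    [0.00, 0.00, 0.35, 0.06],   # zeaxanthin: mainly cyanobacteria
    [0.00, 0.00, 0.00, 0.30],   # chlorophyll b: green algae
])

# Hypothetical HPLC measurements of the marker pigments [ug/L]
measured = np.array([1.5, 0.3, 0.7, 0.6])

# Non-negative least squares: chlorophyll-a attributable to each group
chla_per_group, _ = nnls(R, measured)
for g, c in zip(groups, chla_per_group):
    print(f"{g:15s} {c:5.2f} ug chl-a / L")
```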