991 results for stored


Relevance: 10.00%

Publisher:

Abstract:

The needs for various forms of information system relating to the European environment and ecosystem are reviewed, and their limitations indicated. Existing information systems are reviewed and compared in terms of aims and functionalities. We consider two technical challenges involved in attempting to develop an IEEICS. First, there is the challenge of developing an Internet-based communication system that allows fluent access to information stored in a range of distributed databases. Some of the currently available solutions, such as Web service federations, are considered. The second main challenge arises from the general heterogeneity in the definitions adopted and the measurement systems used across the nations of Europe. Integrated strategies are needed.
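The first challenge, fluent access to information held in many distributed databases, is essentially a federation problem: one query is fanned out to member services and the answers are merged. A minimal sketch of that idea, with hypothetical in-memory services standing in for remote national databases (the paper does not specify an API at this level of detail):

```python
# Minimal federation sketch; service names, record fields and the query
# interface are illustrative assumptions, not the paper's actual design.

def make_service(records):
    """Wrap an in-memory record store as a query function,
    standing in for a remote national database endpoint."""
    def query(keyword):
        return [r for r in records if keyword in r["topic"]]
    return query

def federated_query(services, keyword):
    """Send the same query to every member service and merge the
    results, tagging each record with its source database."""
    merged = []
    for name, service in services.items():
        for record in service(keyword):
            merged.append({**record, "source": name})
    return merged

services = {
    "db_uk": make_service([{"topic": "forest cover", "year": 2001}]),
    "db_de": make_service([{"topic": "forest soils", "year": 2000},
                           {"topic": "river quality", "year": 2002}]),
}
results = federated_query(services, "forest")
```

A real Web service federation adds schema mediation and transport (e.g. SOAP or REST), but the fan-out-and-merge structure is the same.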

Relevance: 10.00%

Publisher:

Abstract:

An explosion occurred in a busy university laboratory during the few minutes when it happened to be unoccupied. The explosion was puzzling, since the laboratory was dedicated to geochemical work, such as digesting rock samples with stable, inorganic reagents. The only unstable substance knowingly stored or handled for this purpose, perchloric acid, was not in use on the day of the incident. The investigation was unable to reach an exact conclusion, but did prove that a substantial organic contaminant, not on the laboratory inventory, must have been present.

Relevance: 10.00%

Publisher:

Abstract:

A two-dimensional staggered unstructured discretisation scheme for the solution of fluid flow problems has been developed. This scheme stores and solves the velocity vector resolutes normal and parallel to each cell face, while other scalar variables (pressure, temperature) are stored at cell centres. The coupled momentum, continuity and energy equations are solved using the well-known pressure-correction algorithm SIMPLE. The method is tested for accuracy and convergence behaviour against standard cell-centred solutions in a number of benchmark problems: the lid-driven cavity, natural convection in a cavity, and the melting of gallium in a rectangular domain.
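The staggered storage arrangement can be illustrated on a structured Cartesian grid (a simplification; the paper's scheme is unstructured and stores face-normal and face-parallel velocity resolutes, but the staggering idea is the same): scalars live at cell centres, velocity components on cell faces, and a SIMPLE-type step corrects each face velocity from the pressure-correction difference across that face. All array sizes and coefficients below are illustrative.

```python
import numpy as np

# Staggered storage for an nx-by-ny cell grid (structured illustration):
nx, ny = 4, 3
p = np.zeros((nx, ny))        # pressure at cell centres
T = np.zeros((nx, ny))        # temperature at cell centres
u = np.zeros((nx + 1, ny))    # x-velocity on vertical (x-normal) faces
v = np.zeros((nx, ny + 1))    # y-velocity on horizontal (y-normal) faces

# SIMPLE-style face-velocity correction: each interior x-face velocity is
# adjusted by the pressure-correction difference between its two
# neighbouring cells (d_face would come from the momentum coefficients).
dx = 1.0
p_corr = np.random.default_rng(0).random((nx, ny))  # dummy p' field
d_face = 0.1
u[1:nx, :] += d_face * (p_corr[:-1, :] - p_corr[1:, :]) / dx
```

Storing u and v on faces makes the mass flux through each face directly available to the continuity equation, which is what suppresses the chequerboard pressure modes of naive cell-centred schemes.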

Relevance: 10.00%

Publisher:

Abstract:

The WTC evacuation of 11 September 2001 provides an unrepeatable opportunity to probe into and understand the very nature of evacuation dynamics and, with this improved understanding, to contribute to the design of safer, more efficiently evacuated, yet highly functional high-rise buildings. Following 9/11, the Fire Safety Engineering Group (FSEG) of the University of Greenwich embarked on a study of survivor experiences from the WTC Twin Towers evacuation. The experiences were collected from published accounts appearing in the print and electronic mass media and are stored in a relational database specifically developed for this purpose. Using these accounts and other available sources of information, FSEG also undertook a series of numerical simulations of the WTC North Tower. This paper presents an overview of the results from both studies.

Relevance: 10.00%

Publisher:

Abstract:

This paper describes the methodologies employed in the collection and storage of first-hand accounts of evacuation experiences derived from face-to-face interviews with evacuees from the World Trade Center (WTC) Twin Towers complex on 11 September 2001. In particular, the paper describes the development of the High-rise Evacuation Evaluation Database (HEED). This is a flexible qualitative research tool which contains the full transcribed interview accounts and the coded evacuee experiences extracted from those transcripts. The data and information captured and stored in the HEED database are not only unique, but they provide a means to address current and emerging issues relating to human factors associated with the evacuation of high-rise buildings.
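The pairing of full transcripts with coded experiences extracted from them is a classic relational pattern. A minimal sketch of the kind of structure this implies, using SQLite; the table and column names, and the code labels, are hypothetical illustrations, not the actual HEED schema:

```python
import sqlite3

# Hypothetical two-table layout: one row per evacuee (with transcript),
# many coded-experience rows per evacuee linking back to it.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE evacuee (
    id INTEGER PRIMARY KEY,
    tower TEXT,            -- 'WTC1' or 'WTC2'
    start_floor INTEGER,
    transcript TEXT        -- full transcribed interview account
);
CREATE TABLE coded_experience (
    id INTEGER PRIMARY KEY,
    evacuee_id INTEGER REFERENCES evacuee(id),
    code TEXT,             -- e.g. 'delay:gather_items' (illustrative)
    excerpt TEXT           -- transcript passage supporting the code
);
""")
con.execute("INSERT INTO evacuee VALUES (1, 'WTC1', 91, '...')")
con.execute(
    "INSERT INTO coded_experience VALUES (1, 1, 'delay:gather_items', '...')"
)
# A join recovers coded experiences in the context of evacuee attributes:
rows = con.execute("""
    SELECT e.tower, c.code
    FROM evacuee e JOIN coded_experience c ON c.evacuee_id = e.id
""").fetchall()
```

Keeping codes in their own table lets one transcript carry any number of coded experiences while queries can still filter on evacuee attributes such as tower or starting floor.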

Relevance: 10.00%

Publisher:

Abstract:

This paper describes work towards the deployment of flexible self-management into real-time embedded systems. A challenging project, which focuses specifically on the development of a dynamic, adaptive automotive middleware, is described, and the specific self-management requirements of this project are discussed. These requirements have been identified through the refinement of a wide-ranging set of use cases requiring context-sensitive behaviours. A sample of these use cases is presented to illustrate the extent of the demands for self-management. The strategy that has been adopted to achieve self-management, based on the use of policies, is presented. The embedded and real-time nature of the target system brings the constraints that dynamic adaptation capabilities must not require changes to the run-time code (except during hot update of complete binary modules), that adaptation decisions must have low latency, and, because the target platforms are resource-constrained, that the self-management mechanism must have low resource requirements (especially in terms of processing and memory). Policy-based computing is thus an ideal candidate for achieving self-management, because the policy itself is loaded at run-time and can be replaced or changed in the future in the same way that a data file is loaded. Policies represent a relatively low-complexity and low-risk means of achieving self-management, with low run-time costs. Policies can be stored internally in ROM (such as default policies) as well as externally to the system. The architecture of a designed-for-purpose, powerful yet lightweight policy library is described. A suitable evaluation platform, supporting the whole life-cycle of feasibility analysis, concept evaluation, development, rigorous testing and behavioural validation, has been devised and is described.
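The core idea, behaviour selected by a policy that is loaded at run time and replaceable like a data file, can be sketched in a few lines. The rule format, mode names and load metric below are hypothetical illustrations, not the project's actual policy language:

```python
import json

# Built-in fallback policy, analogous to a ROM-resident default.
DEFAULT_POLICY = {
    "rules": [{"if_load_above": 0.0, "then_mode": "normal"}]
}

def load_policy(text=None):
    """Parse an externally supplied policy; fall back to the default."""
    return json.loads(text) if text else DEFAULT_POLICY

def decide(policy, load):
    """Low-latency decision: scan the rules in order; last match wins."""
    mode = "normal"
    for rule in policy["rules"]:
        if load >= rule["if_load_above"]:
            mode = rule["then_mode"]
    return mode

# Hot-swapping the policy changes behaviour without any code change:
updated = load_policy(
    '{"rules": [{"if_load_above": 0.0, "then_mode": "normal"},'
    ' {"if_load_above": 0.8, "then_mode": "shed_tasks"}]}'
)
mode_default = decide(load_policy(), 0.9)   # default policy applies
mode_updated = decide(updated, 0.9)         # replacement policy applies
```

The decision logic itself never changes; only the policy data does, which is what keeps adaptation cheap and avoids touching run-time code.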

Relevance: 10.00%

Publisher:

Abstract:

This article provides a broad overview of project HEED (High-rise Evacuation Evaluation Database) and the methodologies employed in the collection and storage of first-hand accounts of evacuation experiences derived from face-to-face interviews with evacuees from the World Trade Center (WTC) Twin Towers complex on September 11, 2001. In particular, the article describes the development of the HEED database. This is a flexible research tool which contains qualitative data in the form of coded evacuee experiences along with the full interview transcripts. The data and information captured and stored in the HEED database are not only unique, but provide a means to address current and emerging issues relating to human factors associated with the evacuation of high-rise buildings.

Relevance: 10.00%

Publisher:

Abstract:

OBJECTIVE: The aim of this study was to investigate how the release of fluoride from two compomers and a fluoridated composite resin was affected by exposure to KF solution. MATERIAL AND METHODS: Two compomers (Dyract AP and Compoglass F) and one fluoridated composite (Wave) were prepared as discs (6 mm diameter and 2 mm thick), cured with a standard dental lamp. They were then stored in either water or 0.5% KF for 1 week, followed by placement in water for periods of 1 week up to 5 weeks in total. Fluoride was determined with and without TISAB (to allow complexed and decomplexed fluoride to be determined), and the release of other ions (Na, Ca, Al, Si, P) was determined by ICP-OES. RESULTS: Specimens were found not to take up fluoride from 100 ppm KF solution in 24 h, but to release additional fluoride when stored for up to 5 weeks. Compomers released more fluoride cumulatively following exposure to KF solution (p<0.001), all of which was decomplexed, though initial (1 week) values were not statistically significant for Dyract AP. Other ions showed no variation in release over 1 week, regardless of whether the specimens were exposed to KF. Unlike the compomers, Wave showed no change in fluoride release as a result of exposure to KF. CONCLUSIONS: Compomers are affected by KF solution, and release more fluoride (but not other ions) after exposure than if stored in water.

Relevance: 10.00%

Publisher:

Abstract:

An aqueous solution of sucrose was lyophilised, producing amorphous sucrose. This was then stored under different humidities at 25 °C for 1 week, allowing some samples to crystallise. FT-Raman spectroscopy and PXRD have been successfully shown to distinguish qualitatively between amorphous and crystalline samples of sucrose. The data from the two techniques are complementary.

Relevance: 10.00%

Publisher:

Abstract:

The water uptake and water loss behaviour of three different formulations of zinc oxy-chloride cement have been studied in detail. Specimens of each material were subjected to a high-humidity atmosphere (93% RH) over saturated aqueous sodium sulfate, and to a low-humidity desiccating atmosphere over concentrated sulfuric acid. In high humidity, the cement formulated from the nominal 75% ZnCl2 solution gained mass, eventually becoming too sticky to weigh further. The specimens at 25% and 50% ZnCl2, by contrast, lost mass by a diffusion process, though by 1 week the 50% cement had started to gain mass and was also too sticky to weigh. In low humidity, all three cements lost mass, again by a diffusion process. Both water gain and water loss followed Fick's law for a considerable time. In the case of water loss under desiccating conditions, this corresponded to values of Mt/M∞ well above 0.5. However, plots did not pass through the origin, showing that there was an induction period before true diffusion began. Diffusion coefficients varied from 1.56 × 10-5 cm2/s (75% ZnCl2) to 2.75 × 10-5 cm2/s (50% ZnCl2), and appeared to be influenced by factors other than composition alone. The drying of the 25% and 50% ZnCl2 cements in high-humidity conditions occurred at a much lower rate, with a value of D of 2.5 × 10-8 cm2/s for the 25% ZnCl2 cement. This cement was found to equilibrate slowly, but its total water loss did not differ significantly from that of the cements stored under desiccating conditions. Equilibration times for water loss in desiccating conditions were of the order of 2-4 hours, depending on ZnCl2 content; equilibrium water losses were respectively 28.8 [25% ZnCl2], 16.2 [50%] and 12.4 [75%], which followed the order of ZnCl2 content. It is concluded that the water transport processes are strongly influenced by the ZnCl2 content of the cement.
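Diffusion coefficients of this kind are commonly extracted from the early-time Fickian relation for a plane sheet, Mt/M∞ = (4/l)·sqrt(D·t/π), so that the slope of Mt/M∞ against sqrt(t) gives D = π·(slope·l/4)². A minimal sketch of that inversion; the thickness, time and D values below are illustrative (D is merely of the order reported above), not the paper's data:

```python
from math import pi, sqrt

def fick_ratio(D, t, l):
    """Early-time fractional mass change, Mt/Minf, for a plane sheet
    of thickness l (cm), diffusion coefficient D (cm^2/s), time t (s)."""
    return (4.0 / l) * sqrt(D * t / pi)

def estimate_D(slope, l):
    """Invert the early-time relation: slope is d(Mt/Minf)/d(sqrt(t))."""
    return pi * (slope * l / 4.0) ** 2

l = 0.2          # specimen thickness in cm (illustrative)
D_true = 2.75e-5 # cm^2/s, illustrative value of the reported order
t = 50.0         # s, inside the early-time regime

# Synthetic "measurement": slope of Mt/Minf vs sqrt(t), then recover D.
slope = fick_ratio(D_true, t, l) / sqrt(t)
D_est = estimate_D(slope, l)
```

In practice the observed induction period means the sqrt(t) plot is offset from the origin, so the slope is fitted to the linear portion rather than forced through zero.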

Relevance: 10.00%

Publisher:

Abstract:

The Continuous Plankton Recorder (CPR) survey has collected plankton samples from regular tracks across the world's oceans for almost 70 years. Over 299,000 spatially extensive CPR samples are archived and stored in buffered formalin. This CPR archive offers huge potential for studying changes in marine communities using molecular data from a period when marine pollution, exploitation and global anthropogenic impact were much less pronounced. However, to harness fully the amount of data available within the CPR archive, it is necessary to improve techniques of larval identification, preferably to genus and species level, and to obtain genetic information for historical studies of population ecology. To increase the potential of the CPR database, this paper describes the first extraction, amplification by the polymerase chain reaction and utilisation of a DNA sequence (mitochondrial 16S rDNA) from a CPR sample, a formalin-fixed larval sandeel.

Relevance: 10.00%

Publisher:

Abstract:

The Continuous Plankton Recorder (CPR) survey provides a unique multi-decadal dataset on the abundance of plankton in the North Sea and North Atlantic and is one of only a few monitoring programmes operating at a large spatio-temporal scale. The results of all samples analysed from the survey since 1946 are stored in an Access database at the Sir Alister Hardy Foundation for Ocean Science (SAHFOS) in Plymouth. The database is large, containing more than two million records (~80 million data points, if zero results are added) for more than 450 taxonomic entities. An open data policy is operated by SAHFOS. However, the data are not online, so access by scientists and others wishing to use the results is not interactive; requests for data are dealt with by the Database Manager. To facilitate access to the data from the North Sea, an area of high research interest, a selected set of data for key phytoplankton and zooplankton species has been processed into a form that makes it readily available on CD for research and other applications. A set of MATLAB tools has been developed to provide an interpolated spatio-temporal description of plankton sampled by the CPR in the North Sea, as well as easy and fast access to users in the form of a browser. Using geostatistical techniques, plankton abundance values have been interpolated on a regular grid covering the North Sea. The grid is established on centres of 1 degree longitude x 0.5 degree latitude (~32 x 30 nautical miles). Based on a monthly temporal resolution over a fifty-year period (1948-1997), 600 distribution maps have been produced for 54 zooplankton species, and 480 distribution maps for 57 phytoplankton species over the shorter period 1958-1997.
The gridded database has been developed in a user-friendly form and incorporates, as a package on a CD, a set of options for visualisation and interpretation, including the facility to plot maps for selected species by month, year, groups of months or years, or long-term means, or as time series and contour plots. This study constitutes the first application of an easily accessed and interactive gridded database of plankton abundance in the North Sea. As a further development, the MATLAB browser is being converted to a user-friendly Windows-compatible format (WinCPR) for release on CD and via the Web in 2003.
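The gridding step, interpolating irregularly located CPR samples onto regular cell centres at 1 degree longitude by 0.5 degree latitude, can be sketched as follows. SAHFOS used geostatistical techniques (e.g. kriging); plain inverse-distance weighting is used here only as a simplified stand-in, and all sample positions and abundance values are illustrative:

```python
import numpy as np

def grid_centres(lon_min, lon_max, lat_min, lat_max):
    """Cell centres for a 1 deg longitude x 0.5 deg latitude grid."""
    lons = np.arange(lon_min + 0.5, lon_max, 1.0)
    lats = np.arange(lat_min + 0.25, lat_max, 0.5)
    return lons, lats

def idw(sample_lon, sample_lat, values, lon, lat, power=2.0):
    """Inverse-distance-weighted estimate at one grid centre
    (a simplified stand-in for geostatistical interpolation)."""
    d2 = (sample_lon - lon) ** 2 + (sample_lat - lat) ** 2
    w = 1.0 / np.maximum(d2, 1e-12) ** (power / 2.0)
    return float(np.sum(w * values) / np.sum(w))

# Illustrative samples: (lon, lat, abundance)
s_lon = np.array([0.2, 2.8, 4.1])
s_lat = np.array([55.1, 56.9, 58.2])
s_val = np.array([10.0, 40.0, 70.0])

lons, lats = grid_centres(-2.0, 8.0, 51.0, 60.0)
field = np.array([[idw(s_lon, s_lat, s_val, lo, la) for lo in lons]
                  for la in lats])
```

One such field per species per month over the study period yields the stacks of distribution maps described above; IDW, unlike kriging, provides no error estimate, which is one reason geostatistics was preferred for the real product.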

Relevance: 10.00%

Publisher:

Abstract:

During the 1970s and 1980s, the late Dr Norman Holme undertook extensive towed sledge surveys in the English Channel and some in the Irish Sea. Only a minority of the resulting images were analysed and reported before his death in 1989, but logbooks, video and film material have been archived in the National Marine Biological Library (NMBL) in Plymouth. A scoping study was therefore commissioned by the Joint Nature Conservation Committee, as part of the Mapping European Seabed Habitats (MESH) project, to identify the value of the archived material and the procedure and cost of undertaking further work. The results of the scoping study are:
1. NMBL archives hold 106 videotapes (reel-to-reel Sony HD format) and 59 video cassettes (including 15 from the Irish Sea) in VHS format, together with 90 rolls of 35 mm colour transparency film (various lengths, up to about 240 frames per film). These are stored in the Archive Room, either in a storage cabinet or in the original film canisters.
2. Reel-to-reel material is extensive and had already been selectively copied to VHS cassettes. The cost of transferring it to an accepted 'long-life' medium (Betamax) would be approximately £15,000. It was not possible to view the tapes as a suitable machine was not located. The value of the tapes is uncertain, but they are likely to become beyond salvation within one to two years.
3. Video cassette material is in good condition and is expected to remain so for at least several more years. Images viewed were generally of poor quality, and the speed of tow often makes pictures blurred. No immediate action is required.
4. Colour transparency films are in good condition and the images are very clear. They provide the best source of information for mapping seabed biotopes. They should be scanned to digital format, but inexpensive fast copying is problematic because there are no between-frame breaks between images and scanning machines centre each image based on between-frame breaks. The minimum cost to scan all of the images commercially is approximately £6,000 and could be as much as £40,000 on some quotations. There is a further cost in coding and databasing each image and, all in all, it would seem most economic to purchase a 'continuous film' scanner and undertake the work in-house.
5. Positional information in ships' logs has been matched to films and to videotapes. Decca Chain co-ordinates recorded in the logbooks have been converted to latitude and longitude (degrees, minutes and seconds), and a further routine was developed to convert these to degrees and decimal degrees as required for GIS mapping. However, it is unclear whether corrections to Decca positions were applied at the time the position was noted. Tow tracks have been mapped onto an electronic copy of a Hydrographic Office chart.
6. The positions of the start and end of each tow were entered into a spreadsheet so that they can be displayed on GIS or against a Hydrographic Office chart backdrop. The cost of the Hydrographic Office chart backdrop at a scale of 1:75,000 for the whole area was £458 incl. VAT.
7. Viewing all of the video cassettes to note habitats and biological communities, even by an experienced marine biologist, would take at least of the order of 200 hours and is not recommended.
8. Once colour transparencies are scanned and indexed, viewing them to identify seabed habitats and biological communities would probably take about 100 hours for an experienced marine biologist and is recommended.
9. It is expected that identifying biotopes along approximately 1 km lengths of each tow would be feasible, although uncertainties about Decca co-ordinate corrections and the exact positions of images most likely give a ±250 m position error. More work to locate each image accurately and to resolve the Decca correction question would improve the accuracy of image location.
10. Using codings (produced by Holme to identify different seabed types), and some viewing of video and transparency material, 10 biotopes have been identified, although more would be added as a result of full analysis.
11. Using the data available from the Holme archive, it is possible to populate various fields within the Marine Recorder database. The overall 'survey' will be 'English Channel towed video sled survey'. The 'events' become the 104 tows. Each tow could be described as four samples, i.e. the start and end of the tow and two areas in the middle, to give examples along the length of the tow. These samples would have their own latitude/longitude co-ordinates. The four samples would link to a GIS map.
12. Stills and video clips, together with text information, could be incorporated into a multimedia presentation to demonstrate the range of level seabed types found along part of the northern English Channel. More recent images, taken during SCUBA diving of reef habitats in the same area as the towed sledge surveys, could be added to the Holme images.
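The logbook positions were converted from degrees, minutes and seconds to the decimal degrees that GIS mapping requires. A minimal sketch of such a routine, with the usual sign convention (south and west negative) assumed, and an illustrative English Channel position:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert a degrees/minutes/seconds co-ordinate to decimal degrees.
    hemisphere is one of 'N', 'S', 'E', 'W'; S and W give negative values."""
    value = abs(degrees) + minutes / 60.0 + seconds / 3600.0
    return -value if hemisphere in ("S", "W") else value

# 50 deg 15' 30" N, 1 deg 30' 0" W (illustrative, not a logbook position)
lat = dms_to_decimal(50, 15, 30, "N")
lon = dms_to_decimal(1, 30, 0, "W")
```

This conversion does not address the separate question of whether Decca corrections were applied when the positions were first recorded; that uncertainty remains part of the ±250 m position error discussed above.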

Relevance: 10.00%

Publisher:

Abstract:

Preserved and archived organic material offers huge potential for retrospective and long-term historical ecosystem reconstructions using stable isotope analyses, but because of isotopic exchange with preservatives the obtained values require validation. The Continuous Plankton Recorder (CPR) Survey is the most extensive long-term monitoring programme for plankton communities worldwide and has utilised ships of opportunity to collect samples since 1931. To keep the samples intact for subsequent analysis, they are collected and preserved in formalin; however, previous studies have found that this may alter stable carbon and nitrogen isotope ratios in zooplankton. A maximum ~0.9‰ increase of δ15N and a time-dependent maximum ~1.0‰ decrease of δ13C were observed when the copepod Calanus helgolandicus was experimentally exposed to two formalin preservatives for 12 months. Applying specific correction factors to δ15N and δ13C values for similarly preserved Calanoid species collected by the CPR Survey within 12 months of analysis may be appropriate to enable their use in stable isotope studies. The isotope values of samples stored frozen did not differ significantly from those of controls. Although the impact of formalin preservation was relatively small in this and other studies of marine zooplankton, changes in isotope signatures are not consistent across taxa, especially for δ15N, indicating that species-specific studies may be required. Copyright © 2011 John Wiley & Sons, Ltd.
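A correction of the kind proposed simply subtracts the estimated preservation offset from each measured value. A minimal sketch, using the maximum offsets reported above (~+0.9‰ for δ15N, ~-1.0‰ for δ13C) as illustrative defaults; real corrections would be taken from the study's time-resolved data for the species and storage duration in question:

```python
def correct_for_formalin(d15n_measured, d13c_measured,
                         d15n_offset=0.9, d13c_offset=-1.0):
    """Remove an estimated formalin-preservation offset from measured
    isotope ratios (all values in permil). Offsets are illustrative
    defaults, not prescribed correction factors."""
    return d15n_measured - d15n_offset, d13c_measured - d13c_offset

# Illustrative measured values for a formalin-preserved copepod sample:
d15n, d13c = correct_for_formalin(9.4, -20.5)
```

Because the δ13C change was time-dependent, a practical implementation would make the offsets functions of storage time rather than constants.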

Relevance: 10.00%

Publisher:

Abstract:

The impact of a sub-seabed CO2 leak from geological sequestration on the microbial process of ammonia oxidation was investigated in the field. Sediment samples were taken before, during and after a controlled sub-seabed CO2 leak at four zones differing in proximity to the CO2 source (the epicentre, and 25 m, 75 m and 450 m distant). The impact of CO2 release on benthic microbial ATP levels was compared to ammonia oxidation rates and the abundance of bacterial and archaeal ammonia monooxygenase (amoA) genes and transcripts, and also to the abundance of nitrite reductase (nirS) and anammox hydrazine oxidoreductase (hzo) genes and transcripts. The major factor influencing measurements was seasonal: only minor differences were detected at the zones impacted by CO2 (the epicentre and 25 m distant). These included a small increase in ammonia oxidation after 37 days of CO2 release, which was linked to an increase in ammonia availability as a result of mineral dissolution. A CO2 leak on the scale used within this study (<1 tonne day−1) would have very little impact on ammonia oxidation within coastal sediments. However, seawater containing 5% CO2 did reduce rates of ammonia oxidation. This was linked to the buffering capacity of the sediment, suggesting that the impact of a sub-seabed leak of stored CO2 on ammonia oxidation would depend on both the scale of the CO2 release and the sediment type.