888 results for Project 2004-028-C : Wayfinding in the Built Environment


Relevance:

100.00%

Publisher:

Abstract:

Geological carbon dioxide capture and storage (CCS) has the potential to make a significant contribution to the decarbonisation of the UK. Amid concerns over maintaining security, and hence diversity, of supply, CCS could allow the continued use of coal, oil and gas whilst avoiding the CO2 emissions currently associated with fossil fuel use. This project has explored some of the geological, environmental, technical, economic and social implications of this technology. The UK is well placed to exploit CCS, with a large offshore storage capacity both in disused oil and gas fields and in saline aquifers. This capacity should be sufficient to store CO2 from the power sector (at current levels) for at least one century, using well-understood, and therefore likely lower-risk, depleted hydrocarbon fields and contained parts of aquifers. It is very difficult to produce reliable estimates of the (potentially much larger) storage capacity of less well understood geological reservoirs such as non-confined parts of aquifers. With the majority of its large coal-fired power stations due to be retired during the next 15 to 20 years, the UK is at a natural decision point with respect to the future of power generation from coal; the existence of both national reserves and the infrastructure for receiving imported coal makes clean coal technology a realistic option. The notion of CCS as a ‘bridging’ or ‘stop-gap’ technology (i.e. whilst we develop ‘genuinely’ sustainable renewable energy technologies) needs to be examined somewhat critically, especially given the scale of global coal reserves. If CCS plant is built, then it is likely that technological innovation will bring down the costs of CO2 capture, such that it could become increasingly attractive. As with any capital-intensive option, there is a danger of becoming ‘locked in’ to a CCS system.
The costs of CCS in our model for UK power stations in the East Midlands and Yorkshire to reservoirs in the North Sea are between £25 and £60 per tonne of CO2 captured, transported and stored. This is between about 2 and 4 times the current traded price of a tonne of CO2 in the EU Emissions Trading Scheme. In addition to meeting the technical and economic requirements, CCS should also be socially and environmentally acceptable. Our research has shown that, given an acceptance of the severity and urgency of addressing climate change, CCS is viewed favourably by members of the public, provided it is adopted within a portfolio of other measures. The most commonly voiced concern from the public is that of leakage, and this remains perhaps the greatest uncertainty with CCS. It is not possible to make general statements concerning storage security; assessments must be site specific. The impacts of any potential leakage are also somewhat uncertain, but should be balanced against the deleterious effects of increased acidification in the oceans, due to uptake of elevated atmospheric CO2, that have already been observed. Provided adequate long-term monitoring can be ensured, any leakage of CO2 from a storage site is likely to have minimal localised impacts as long as leaks are rapidly repaired. A regulatory framework for CCS will need to include risk assessment of potential environmental and health and safety impacts, accounting and monitoring, and long-term liability. In summary, although there remain uncertainties to be resolved through research and demonstration projects, our assessment demonstrates that CCS holds great potential for significant cuts in CO2 emissions as we develop long-term alternatives to fossil fuel use. CCS can contribute to reducing emissions of CO2 into the atmosphere in the near term (i.e. peak-shaving the future atmospheric concentration of CO2), with the potential to continue to deliver significant CO2 reductions over the long term.
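The stated cost multiple lets one back out the traded carbon price the comparison implies; a quick sketch of that arithmetic (the input figures are from the abstract, the derived price range is an inference, not a number the report gives):

```python
# CCS cost of GBP 25-60 per tonne CO2 is stated to be about 2-4x the EU ETS
# traded price; the implied price range follows by dividing the bounds.
ccs_cost_low, ccs_cost_high = 25.0, 60.0  # GBP per tonne CO2 (from abstract)
multiple_low, multiple_high = 2.0, 4.0    # CCS cost / traded price (from abstract)

implied_price_low = ccs_cost_low / multiple_high    # lowest cost at highest multiple
implied_price_high = ccs_cost_high / multiple_low   # highest cost at lowest multiple
print(f"Implied ETS price: ~GBP {implied_price_low:.2f}-{implied_price_high:.2f}/tCO2")
```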

Relevance:

100.00%

Publisher:

Abstract:

This study assesses the current state of adult skeletal age-at-death estimation in biological anthropology through analysis of data published in recent research articles from three major anthropological and archaeological journals (2004–2009). The most commonly used adult ageing methods, the age of ‘adulthood’, age ranges and the maximum age reported for ‘mature’ adults were compared. The results showed a wide range of variability in the age at which individuals were determined to be adult (from 14 to 25 years), uneven age ranges, a lack of standardisation in the use of descriptive age categories and the inappropriate application of some ageing methods for the sample being examined. Such discrepancies make comparisons between skeletal samples difficult, while the inappropriate use of some techniques makes the resultant age estimations unreliable. At a time when national and even global comparisons of past health are becoming prominent, standardisation in the terminology and age categories used to define adults within each sample is fundamental. It is hoped that this research will prompt discussions in the osteological community (both nationally and internationally) about what defines an ‘adult’, how to standardise the age ranges that we use and how individuals should be assigned to each age category. Skeletal markers have been proposed to help physically identify ‘adult’ individuals.

Relevance:

100.00%

Publisher:

Abstract:

The latest Hadley Centre climate model, HadGEM2-ES, includes Earth system components such as interactive chemistry and eight species of tropospheric aerosols. It has been run for the period 1860–2100 in support of the fifth phase of the Coupled Model Intercomparison Project (CMIP5). Anthropogenic aerosol emissions peak between 1980 and 2020, resulting in a present-day all-sky top-of-the-atmosphere aerosol forcing of −1.6 and −1.4 W m−2 with and without ammonium nitrate aerosols, respectively, for the sum of direct and first indirect aerosol forcings. Aerosol forcing becomes significantly weaker in the 21st century, being weaker than −0.5 W m−2 in 2100 without nitrate. However, nitrate aerosols become the dominant species in Europe and Asia and decelerate the decrease in global mean aerosol forcing. Considering nitrate aerosols makes aerosol radiative forcing 2–4 times stronger by 2100 depending on the representative concentration pathway, although this impact is lessened when changes in the oxidation properties of the atmosphere are accounted for. Anthropogenic aerosol residence times increase in the future in spite of increased precipitation, as cloud cover and aerosol-cloud interactions decrease in tropical and midlatitude regions. Deposition of fossil fuel black carbon onto snow and ice surfaces peaks during the 20th century in the Arctic and Europe but keeps increasing in the Himalayas until the middle of the 21st century. Results presented here confirm the importance of aerosols in influencing the Earth's climate, albeit with a reduced impact in the future, and suggest that nitrate aerosols will partially replace sulphate aerosols to become an important anthropogenic species in the remainder of the 21st century.

Relevance:

100.00%

Publisher:

Abstract:

During SPURT (Spurenstofftransport in der Tropopausenregion, trace gas transport in the tropopause region) we performed measurements of a wide range of trace gases with different lifetimes and sink/source characteristics in the northern hemispheric upper troposphere (UT) and lowermost stratosphere (LMS). A large number of in situ instruments were deployed on board a Learjet 35A, flying at altitudes up to 13.7 km, at times reaching nearly 380 K potential temperature. Eight measurement campaigns (consisting of a total of 36 flights), distributed over all seasons and typically covering latitudes between 35° N and 75° N in the European longitude sector (10° W–20° E), were performed. Here we present an overview of the project, describing the instrumentation, the meteorological situations encountered during the campaigns and the data set available from SPURT. Measurements were obtained for N2O, CH4, CO, CO2, CFC12, H2, SF6, NO, NOy, O3 and H2O. We illustrate the strength of this new data set by showing mean distributions of the mixing ratios of selected trace gases, using a potential temperature-equivalent latitude coordinate system. The observations reveal that the LMS is most stratospheric in character during spring, with the highest mixing ratios of O3 and NOy and the lowest mixing ratios of N2O and SF6. The lowest mixing ratios of NOy and O3 are observed during autumn, together with the highest mixing ratios of N2O and SF6, indicating a strong tropospheric influence. For H2O, however, the maximum concentrations in the LMS are found during summer, suggesting unique (temperature- and convection-controlled) conditions for this molecule during transport across the tropopause. The SPURT data set is presently the most accurate and complete data set for many trace species in the LMS, and its main value is the simultaneous measurement of a suite of trace gases having different lifetimes and physical-chemical histories.
It is thus very well suited for studies of atmospheric transport, for model validation, and for investigations of seasonal changes in the UT/LMS, as demonstrated in the accompanying studies and in work published elsewhere.
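The potential temperature-equivalent latitude coordinate system mentioned above amounts to binning each observation by its (equivalent latitude, theta) pair and averaging per bin; a toy sketch with synthetic values (the bin widths and all sample numbers are assumptions, not SPURT data):

```python
from collections import defaultdict

# (equivalent latitude in deg N, potential temperature in K, N2O in ppb)
# Synthetic samples for illustration only.
obs = [(52.0, 335.0, 318.0), (55.0, 338.0, 317.0), (68.0, 362.0, 305.0),
       (70.0, 365.0, 304.0), (45.0, 331.0, 319.5)]

EQLAT_BIN, THETA_BIN = 10.0, 10.0  # assumed bin widths

# Group observations into 2-D bins keyed by the lower bin edge
bins = defaultdict(list)
for eqlat, theta, n2o in obs:
    key = (int(eqlat // EQLAT_BIN) * EQLAT_BIN, int(theta // THETA_BIN) * THETA_BIN)
    bins[key].append(n2o)

# Mean mixing ratio per (equivalent latitude, theta) bin
means = {k: sum(v) / len(v) for k, v in bins.items()}
for (eqlat0, theta0), mean in sorted(means.items()):
    print(f"eqlat {eqlat0:.0f}-{eqlat0 + EQLAT_BIN:.0f} N, "
          f"theta {theta0:.0f}-{theta0 + THETA_BIN:.0f} K: mean N2O {mean:.1f} ppb")
```

Using equivalent latitude rather than geographic latitude groups air masses by their dynamical position relative to the jet, which is what makes the seasonal composites in the abstract comparable across flights.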

Relevance:

100.00%

Publisher:

Abstract:

Project management (PM) is a globally recognized discipline and has been widely adopted within the construction industry. Despite advancements in the PM discipline, the ineffective traditional management system, typical of the non-executive PM structure, is still widely used in the Nigerian construction industry. The aim of this paper is thus to explore the challenges facing the adoption of the executive PM structure in Nigeria. The paper first assesses the level of growth of PM in Nigeria using UK best practices as a benchmark and identifies the key PM characteristics in the two countries. Focus group interviews were used to collect the primary data for the study and content analysis was used to present the results in a thematic format. The study revealed the key barriers to the adoption of an executive PM structure in Nigeria as a lack of proper awareness, unfavorable policies, skill shortages, the traditional culture of stakeholders and the absence of a regulatory body. It is recommended that the government, as a major player/client in the Nigerian construction industry, should lead the campaign to change the traditional industry approach to project management. This is necessary if construction stakeholders in Nigeria are to be educated and encouraged towards adopting and putting into practice effective PM.

Relevance:

100.00%

Publisher:

Abstract:

The research which underpins this paper began as a doctoral project exploring archaic beliefs concerning Otherworlds and Thin Places in two particular landscapes: the West Coast of Wales and the West Coast of Ireland. ‘Thin Place’ is an ancient Celtic Christian term used to describe a marginal, liminal realm, beyond everyday human experience and perception, where mortals could pass into the Otherworld more readily, or make contact with those in the Otherworld more willingly. To encounter a Thin Place in ancient folklore was significant because it engendered a state of alertness, an awakening to what the theologian John O’Donohue (2004: 49) called “the primal affection.” These complex notions and terms will be further explored in this paper in relation to education. Thin Teaching is a pedagogical approach which offers students the space to ruminate on the possibility that their existence can be more, and can mean more, than the categories they believed they belonged to or felt they should inhabit. Central to the argument, then, is that certain places and their inhabitants can become revitalised by sensitively considered teaching methodologies. This raises interesting questions about the role spirituality plays in teaching practice as a tool for healing in the twenty-first century.

Relevance:

100.00%

Publisher:

Abstract:

Land cover maps at different resolutions and mapping extents contribute to modeling and support decision making processes. Because land cover affects and is affected by climate change, it is listed among the 13 terrestrial essential climate variables. This paper describes the generation of a land cover map for Latin America and the Caribbean (LAC) for the year 2008. It was developed in the framework of the Latin American Network for Monitoring and Studying of Natural Resources (SERENA) project, carried out within RedLaTIF, the GOFC-GOLD Latin American network for remote sensing and forest fires. The SERENA land cover map for LAC integrates: 1) the local expertise of SERENA network members to generate the training and validation data, 2) a methodology for land cover mapping based on decision trees using MODIS time series, and 3) class membership estimates to account for pixel heterogeneity issues. The discrete SERENA land cover product, derived from class memberships, yields an overall accuracy of 84% and includes an additional layer representing the estimated per-pixel confidence. The study demonstrates in detail the use of class memberships to better estimate the area of scarce classes with a scattered spatial distribution. The land cover map is already available as a printed wall map and will be released in digital format in the near future. The SERENA land cover map was produced with a legend and classification strategy similar to that used by the North American Land Change Monitoring System (NALCMS) to generate a land cover map of the North American continent; this will allow the two maps to be combined into consistent data across the Americas, facilitating continental monitoring and modeling.
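The scarce-class area argument can be made concrete with a toy sketch (all values below are hypothetical; the pixel area and class fractions are illustrative, not SERENA's): a discrete, majority-vote map drops sub-pixel cover of a scattered class, while summing per-pixel memberships retains it.

```python
PIXEL_AREA_KM2 = 0.25  # assumed pixel area for the sketch

# memberships[pixel] = {class_name: cover fraction}; fractions sum to 1 per pixel
memberships = [
    {"forest": 0.7, "wetland": 0.3},
    {"forest": 0.9, "wetland": 0.1},
    {"cropland": 0.6, "wetland": 0.4},
    {"wetland": 0.8, "forest": 0.2},
]

def hard_area(cls):
    # Discrete map: each pixel is counted fully for its majority class only
    return sum(PIXEL_AREA_KM2 for m in memberships if max(m, key=m.get) == cls)

def membership_area(cls):
    # Soft estimate: sum the fractional cover of `cls` over all pixels
    return sum(m.get(cls, 0.0) * PIXEL_AREA_KM2 for m in memberships)

print("hard:", round(hard_area("wetland"), 3), "km2")
print("soft:", round(membership_area("wetland"), 3), "km2")
```

Here wetland is the majority class in only one of the four pixels, so the hard map credits it 0.25 km2, while the membership sum (1.6 pixels of fractional cover) yields 0.4 km2, illustrating why memberships better estimate scattered, scarce classes.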

Relevance:

100.00%

Publisher:

Abstract:

Within the ESA Climate Change Initiative (CCI) project Aerosol_cci (2010–2013), algorithms for the production of long-term total column aerosol optical depth (AOD) datasets from European Earth Observation sensors are developed. Starting with eight existing precursor algorithms, three analysis steps are conducted to improve and qualify the algorithms: (1) a series of experiments applied to one month of global data to understand several major sensitivities to assumptions needed due to the ill-posed nature of the underlying inversion problem, (2) a round robin exercise of "best" versions of each of these algorithms (defined using the step 1 outcome) applied to four months of global data to identify mature algorithms, and (3) a comprehensive validation exercise applied to one complete year of global data produced by the algorithms selected as mature based on the round robin exercise. The algorithms tested included four using AATSR, three using MERIS and one using PARASOL. This paper summarizes the first step. Three experiments were conducted to assess the potential impact of major assumptions in the various aerosol retrieval algorithms. In the first experiment a common set of four aerosol components was used to provide all algorithms with the same assumptions. The second experiment introduced an aerosol property climatology, derived from a combination of model and sun photometer observations, as a priori information in the retrievals on the occurrence of the common aerosol components. The third experiment assessed the impact of using a common nadir cloud mask for the AATSR and MERIS algorithms, in order to characterize the sensitivity to remaining cloud contamination in the retrievals against the baseline dataset versions.
The impact of the algorithm changes was assessed for one month (September 2008) of data: qualitatively by inspection of monthly mean AOD maps, and quantitatively by comparing daily gridded satellite data against daily averaged AERONET sun photometer observations for the different versions of each algorithm, globally (land and coastal) and for three regions with different aerosol regimes. The analysis allowed for an assessment of the sensitivities of all algorithms, which helped define the best algorithm versions for the subsequent round robin exercise; all algorithms (except MERIS) showed some, in parts significant, improvement. In particular, using common aerosol components, and partly also the a priori aerosol-type climatology, is beneficial. On the other hand, the use of an AATSR-based common cloud mask brought a clear improvement (though with a significant reduction of coverage) for the MERIS standard product, but not for the algorithms using AATSR. All these observations are largely consistent across the five analyses (global land, global coastal, three regional), which is readily understood, since the set of aerosol components defined in Sect. 3.1 was explicitly designed to cover different global aerosol regimes (with low- and high-absorption fine mode, sea salt and dust).
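The quantitative comparison described above reduces to collocating daily satellite grid cells with daily-averaged sun photometer values and summarising the differences; a minimal sketch (station names are real AERONET sites, but every AOD value and the bias/RMSE metrics here are invented for illustration):

```python
import math

# (day, station) -> AOD at 550 nm; hypothetical values
satellite = {("2008-09-01", "Ispra"): 0.32, ("2008-09-02", "Ispra"): 0.18,
             ("2008-09-01", "Banizoumbou"): 0.55}
aeronet   = {("2008-09-01", "Ispra"): 0.28, ("2008-09-02", "Ispra"): 0.21,
             ("2008-09-03", "Ispra"): 0.30, ("2008-09-01", "Banizoumbou"): 0.60}

# Keep only the days/stations where both satellite and AERONET report
pairs = [(satellite[k], aeronet[k]) for k in satellite if k in aeronet]

bias = sum(s - a for s, a in pairs) / len(pairs)                    # mean difference
rmse = math.sqrt(sum((s - a) ** 2 for s, a in pairs) / len(pairs))  # spread of errors
print(f"n={len(pairs)} bias={bias:+.3f} rmse={rmse:.3f}")
```

Stratifying the same pairs by region (as the paper does for three aerosol regimes) is just a matter of filtering the keys before computing the statistics.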

Relevance:

100.00%

Publisher:

Abstract:

Objectives: To estimate mortality rates and mortality trends from SLE in the state of Sao Paulo, Brazil. Material and methods: The official data bank was used to study all deaths that occurred from 1985 to 2004 in which SLE was mentioned as the underlying cause of death. Besides the overall mortality rate, annual gender- and age-specific mortality rates were estimated for each calendar year by age bracket (0-19 years, 20-39 years, 40-59 years and over 60 years) and for the sub-periods 1985-1995 (first) and 1996-2004 (second). The chi-square test was used to compare the mortality rates between the two periods, as well as the mortality rates according to educational level (years of study). The Pearson correlation coefficient was used to analyse mortality trends. The crude rates were adjusted for age by the direct method, using the standard Brazilian population in 2000. Results: A total of 2,601 deaths (90% female) attributed to SLE were analysed. The mean age at death was significantly higher in the second than in the first sub-period (36.6 ± 15.6 years vs. 33.9 ± 14.0 years; p<0.001). The overall adjusted mortality rate was 3.8 deaths/million inhabitants/year for the entire period, with 3.4 deaths/million inhabitants/year for the first and 4.0 deaths/million inhabitants/year for the second sub-period (p<0.001). In each calendar year, the mortality rate was significantly lower for the better-educated group. Throughout the period, there was a significant increase in mortality rates only among women over 40. Conclusion: SLE patients living in the state of Sao Paulo still die at younger ages than those living in developed countries. Our data do not support the theory that there was an improvement in the SLE mortality rate over the last 20 years in the state of Sao Paulo. Socio-economic factors, such as difficulty in obtaining medical care and adequate treatment, may be the main factors explaining the worse prognosis for our patients.
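The direct method of age adjustment used in the methods weights each age bracket's specific rate by that bracket's share of the standard population; a minimal sketch (all death counts, person-years and population shares below are hypothetical, not the study's data):

```python
# age bracket -> (deaths, person-years observed); hypothetical counts
observed = {
    "0-19":  (40,  30_000_000),
    "20-39": (900, 25_000_000),
    "40-59": (700, 15_000_000),
    "60+":   (200, 6_000_000),
}
# standard population age structure (hypothetical shares; must sum to 1)
standard_share = {"0-19": 0.40, "20-39": 0.33, "40-59": 0.19, "60+": 0.08}

def direct_adjusted_rate(observed, standard_share, per=1_000_000):
    # Sum of age-specific rates weighted by the standard population structure
    return sum((deaths / py) * per * standard_share[age]
               for age, (deaths, py) in observed.items())

print(f"{direct_adjusted_rate(observed, standard_share):.1f} deaths/million/year")
```

Because the weights come from a fixed standard population (here playing the role of Brazil 2000), adjusted rates from different calendar years or sub-periods are directly comparable even when the age structure of the underlying population shifts.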

Relevance:

100.00%

Publisher:

Abstract:

The Castanhao reservoir was built in the state of Ceara, a dry region in Northeastern Brazil, to regulate the flow of the Jaguaribe River, for irrigation, and for power generation. It is an earth-filled dam, 60 m high, with a water capacity of 4.5 × 10⁹ m³. The seismicity in the area has been monitored since 1998, with a few interruptions, using one analog or one digital station and, during a few periods, a three-station network. The first earthquakes likely to be induced events were detected in 2003, when the water level was about 20 m high. In early 2004 a very heavy rainfall season quickly filled the reservoir. Shortly after, an increase in the seismic activity occurred and many micro-earthquakes were recorded. We suggest that this activity resulted from an increase in pore pressure due to undrained response, and we may therefore classify this cluster of micro-earthquakes as “initial seismicity.” Following this activity, we deployed a network of four analog stations in the area to determine the epicentral zone. At least three epicentral areas under the reservoir were detected. The spatio-temporal analysis of the available data revealed that the seismicity occurs in clusters and that these were activated at different periods. We identified four sets of faults (N-S-, E-W-, NW-SE-, and NE-SW-oriented), some of which moved at shallow crustal levels as recently as the Quaternary (1.8 Ma). Under the present-day stress regime, the last two sets moved as strike-slip structures. We suggest a possible correlation between dormant faults and the observed induced seismicity.

Relevance:

100.00%

Publisher:

Abstract:

Competition in the global marketplace is rapidly intensifying. Longer product sales lives, greater profit margins, or simply survival depend on management's ability to create and lead change. Project management has become an important competency, combined with other business practices, for adapting to changing conditions. Critical Chain is a relatively new project methodology, developed by Eliyahu Goldratt to complete projects faster, make more efficient use of resources and secure the project deliverables. The methodology is based on the assumption that traditional project techniques such as CPM and PERT do not recognize critical human behavior. It claims that many project failures are a direct result of how safety is built into task delivery times and then wasted by human behavior such as the Student Syndrome, Parkinson's Law and multitasking. However, there has been little or no previous research on this topic in the Argentine marketplace. This study set out to investigate to what extent the human behavior concepts of critical chain project management are present, by performing in-depth interviews with Argentine project stakeholders. It appears that the four human behavior concepts are present in Argentina and that the majority of Argentine companies have yet to apply project management techniques.
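The safety-wasting argument is usually illustrated by comparing per-task padding with a pooled project buffer; a toy sketch with hypothetical task estimates (the root-sum-square buffer is one common aggregation rule and assumes independent task variability, it is not prescribed by the study):

```python
import math

# (median duration estimate, safety margin) in days; hypothetical tasks on one chain
tasks = [(10.0, 4.0), (8.0, 3.0), (12.0, 5.0)]

# Traditional plan: every task is individually padded, so margins add linearly
padded_plan = sum(est + safety for est, safety in tasks)

# Critical chain plan: schedule the median estimates and pool the safety into
# one project buffer; for independent tasks the margins add in quadrature
chain = sum(est for est, _ in tasks)
buffer = math.sqrt(sum(safety ** 2 for _, safety in tasks))
critical_chain_plan = chain + buffer

print(f"padded plan: {padded_plan:.1f} days")
print(f"critical chain plan: {critical_chain_plan:.1f} days")
```

The pooled buffer protects the whole chain rather than each task, which is why the Student Syndrome and Parkinson's Law (consuming per-task padding regardless of need) have less schedule to waste.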

Relevance:

100.00%

Publisher:

Abstract:

We report the results of a transcript finishing initiative, undertaken for the purpose of identifying and characterizing novel human transcripts, in which RT-PCR was used to bridge gaps between paired EST clusters mapped against the genomic sequence. Each pair of EST clusters selected for experimental validation was designated a transcript finishing unit (TFU). A total of 489 TFUs were selected for validation, and an overall efficiency of 43.1% was achieved. We generated a total of 59,975 bp of transcribed sequences organized into 432 exons, contributing to the definition of the structure of 211 human transcripts. The structure of several transcripts reported here was confirmed during the course of this project through the generation of their corresponding full-length cDNA sequences. Nevertheless, for 21% of the validated TFUs a full-length cDNA sequence is not yet available in public databases, and the structure of 69.2% of these TFUs was not correctly predicted by computer programs. The transcript finishing strategy provides a significant contribution to the definition of the complete catalog of human genes and transcripts, because it appears to be particularly useful for the identification of low-abundance transcripts expressed in a restricted set of tissues, as well as for the delineation of gene boundaries and alternatively spliced isoforms.
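The reported overall efficiency is consistent with the abstract's own counts, assuming each successfully validated TFU corresponds to one of the 211 defined transcripts (an assumption on our part; the abstract does not state the validated count explicitly):

```python
selected_tfus = 489    # TFUs selected for experimental validation (from abstract)
validated_tfus = 211   # transcripts whose structure was defined (from abstract)

efficiency = validated_tfus / selected_tfus
print(f"validation efficiency: {efficiency:.1%}")  # matches the reported 43.1%
```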