48 results for Multi-Level Datasets


Relevance:

30.00%

Publisher:

Abstract:

The technique of linear responsibility analysis is used for a retrospective case study of a private industrial development consisting of an engineering factory and offices. A multi-disciplinary professional practice was used to manage and design the project. The organizational structure adopted on the project is analysed using concepts from systems theory which are included in Walker's theoretical model of the structure of building project organizations (Walker, 1981). This model proposes that the process of building provision can be viewed as systems and sub-systems which are differentiated from each other at decision points. Further to this, the sub-system-level analysis of the relationships between the contributors gives a quantitative assessment of the efficiency of the organizational structure used. There was a high level of satisfaction with the completed project, and this is reflected in the way in which the organizational structure corresponded to the model's propositions. However, the project was subject to strong environmental forces which the project organization was not capable of entirely overcoming.

Relevance:

30.00%

Publisher:

Abstract:

Northern hemisphere snow water equivalent (SWE) distributions from remote sensing (SSM/I), the ERA40 reanalysis product and the HadCM3 general circulation model are compared. Large differences are seen in the February climatologies, particularly over Siberia. The SSM/I retrieval algorithm may be overestimating SWE in this region, while comparison with independent runoff estimates suggests that HadCM3 is underestimating SWE. The treatment of snow grain size and the vegetation parameterization are concerns with the remotely sensed data. For this reason, ERA40 is used as 'truth' for the following experiments. Despite the climatology differences, HadCM3 is able to reproduce the distribution of ERA40 SWE anomalies when assimilating ERA40 anomaly fields of temperature, sea level pressure, atmospheric winds, and ocean temperature and salinity. However, when forecasts are released from these assimilated initial states, the SWE anomaly distribution diverges rapidly from that of ERA40. No predictability is seen from one season to another. Strong links between European SWE distribution and the North Atlantic Oscillation (NAO) are seen, but forecasts of this index by the assimilation scheme are poor. Longer-term relationships between SWE and the NAO, and SWE and the El Niño-Southern Oscillation (ENSO), are also investigated in a multi-century run of HadCM3. SWE is impacted by ENSO in the Himalayas and North America, while the NAO affects SWE in North America and Europe. While significant connections with the NAO index were only present in DJF (and to an extent SON), the link between ENSO and February SWE distribution was seen to exist from the previous JJA ENSO index onwards. This represents a long lead time for SWE prediction for hydrological applications such as flood and wildfire forecasting. Further work is required to develop reliable large-scale observation-based SWE datasets with which to test these model-derived connections.

Relevance:

30.00%

Publisher:

Abstract:

Projections of stratospheric ozone from a suite of chemistry-climate models (CCMs) have been analyzed. In addition to a reference simulation, where anthropogenic halogenated ozone-depleting substances (ODSs) and greenhouse gases (GHGs) vary with time, sensitivity simulations with either ODS or GHG concentrations fixed at 1960 levels were performed to disaggregate the drivers of projected ozone changes. These simulations were also used to assess two distinct milestones: ozone returning to historical values (ozone return dates) and ozone no longer being influenced by ODSs (full ozone recovery). The date of ozone returning to historical values does not indicate complete recovery from ODSs in most cases, because GHG-induced changes accelerate or decelerate ozone changes in many regions. In the upper stratosphere, where CO2-induced stratospheric cooling increases ozone, it is not likely that full ozone recovery will have occurred by 2100, even though ozone returns to its 1980 or even 1960 levels well before (~2025 and 2040, respectively). In contrast, in the tropical lower stratosphere ozone decreases continuously from 1960 to 2100 due to projected increases in tropical upwelling, while by around 2040 it is already very likely that full recovery from the effects of ODSs has occurred, although ODS concentrations are still elevated by this date. In the midlatitude lower stratosphere the evolution differs from that in the tropics: rather than a steady decrease, a decrease in ozone is simulated from 1960 to 2000, followed by a steady increase through the 21st century. Ozone in the midlatitude lower stratosphere returns to 1980 levels by ~2045 in the Northern Hemisphere (NH) and by ~2055 in the Southern Hemisphere (SH), and full ozone recovery is likely reached by 2100 in both hemispheres.
Overall, in all regions except the tropical lower stratosphere, full ozone recovery from ODSs occurs significantly later than the return of total column ozone to its 1980 level. The latest return of total column ozone is projected to occur over Antarctica (~2045–2060) whereas it is not likely that full ozone recovery is reached by the end of the 21st century in this region. Arctic total column ozone is projected to return to 1980 levels well before polar stratospheric halogen loading does so (~2025–2030 for total column ozone, cf. 2050–2070 for Cly+60×Bry) and it is likely that full recovery of total column ozone from the effects of ODSs has occurred by ~2035. In contrast to the Antarctic, by 2100 Arctic total column ozone is projected to be above 1960 levels, but not in the fixed GHG simulation, indicating that climate change plays a significant role.

Relevance:

30.00%

Publisher:

Abstract:

Farming systems research is a multi-disciplinary, holistic approach to solving the problems of small farms. Small and marginal farmers are the core of the Indian rural economy, constituting 0.80 of the total farming community but possessing only 0.36 of the total operational land. The declining trend of per capita land availability poses a serious challenge to the sustainability and profitability of farming. Under such conditions, it is appropriate to integrate land-based enterprises such as dairy, fishery, poultry, duckery, apiary, and field and horticultural cropping within the farm, with the objective of generating adequate income and employment for these small and marginal farmers under a set of farm constraints and varying levels of resource availability and opportunity. The integration of different farm enterprises can be achieved with the help of a linear programming model. For the current review, integrated farming systems models were developed, by way of illustration, for the marginal, small, medium and large farms of eastern India using linear programming. Risk analyses were carried out for different levels of income and enterprise combinations. The fishery enterprise was shown to be less risk-prone, whereas the crop enterprise involved greater risk. In general, the degree of risk increased with increasing level of income. With increases in farm income and risk level, resource use efficiency increased. Medium and large farms proved to be more profitable than small and marginal farms, with higher levels of resource use efficiency and return per Indian rupee (Rs) invested. Among the different enterprises of integrated farming systems, a chain of interactions and resource flows was observed. In order to make farming profitable and improve resource use efficiency at the farm level, the synergy among interacting components of farming systems should be exploited.
In the process of technology generation, transfer and other developmental efforts at the farm level (contrary to the discipline- and commodity-based approaches, which have a tendency to be piecemeal and in isolation), it is desirable to place a whole-farm scenario before the farmers to enhance their farm income, thereby motivating them towards more efficient and sustainable farming.
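The enterprise-mix optimization described above can be sketched as a small linear program. The coefficients below (net returns and land/labour constraints for a hypothetical one-hectare marginal farm) are invented for illustration and are not taken from the paper, which derived its models from farm-survey data:

```python
# Hypothetical enterprise-mix LP: choose areas under crop, dairy fodder and
# fishery pond to maximize net return, subject to land and labour limits.
# All coefficients are illustrative, not the paper's survey-based values.
from scipy.optimize import linprog

returns = [20000, 35000, 50000]   # net return (Rs per ha): crop, dairy, fishery
c = [-r for r in returns]         # linprog minimizes, so negate to maximize

A_ub = [
    [1, 1, 1],        # total land (ha)
    [120, 300, 200],  # labour requirement (person-days per ha)
]
b_ub = [1.0, 220]     # 1 ha marginal farm, 220 person-days available

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print("area under each enterprise (ha):", res.x)
print("maximized net return (Rs):", -res.fun)
```

In the paper's actual models, further rows would encode capital, water and subsistence constraints, and the risk analysis would re-solve the LP across a range of income targets.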

Relevance:

30.00%

Publisher:

Abstract:

The control of fishing mortality via fishing effort remains fundamental to most fisheries management strategies, even at the local community or co-management level. Decisions to support such strategies require knowledge of the underlying response of the catch to changes in effort. Under adaptive management strategies, even imprecise knowledge of the response is likely to help accelerate the adaptive learning process. Data and institutional capacity requirements to employ multi-species biomass dynamics and age-structured models invariably render their use impractical, particularly in less developed regions of the world. Surplus production models fitted to catch and effort data aggregated across all species offer viable alternatives. The current paper seeks models of this type that best describe the multi-species catch-effort responses in floodplain-rivers, lakes and reservoirs, and reef-based fisheries, based upon among-fishery comparisons and building on earlier work. Three alternative surplus production models were fitted to estimates of catch per unit area (CPUA) and fisher density for 258 fisheries in Africa, Asia and South America. In all cases examined, the best or equal best fitting model was the Fox type, explaining up to 90% of the variation in CPUA. For lake and reservoir fisheries in Africa and Asia, the Schaefer and an asymptotic model fitted equally well. The Fox model estimates of fisher density (fishers km^-2) at maximum yield (iMY) for floodplain-rivers, African lakes and reservoirs, and reef-based fisheries are 13.7 (95% CI [11.8, 16.4]), 27.8 (95% CI [17.5, 66.7]) and 643 (95% CI [459, 1075]), respectively, and compare well with earlier estimates. Corresponding estimates of maximum yield are also given. The significantly higher value of iMY for reef-based fisheries compared to estimates for rivers and lakes reflects the use of a different measure of fisher density, based upon human population size estimates.
The models predict that maximum yield is achieved at a higher fishing intensity in Asian lakes compared to those in Africa. This may reflect the common practice in Asia of stocking lakes to augment natural recruitment. Because of the equilibrium assumptions underlying the models, all the estimates of maximum yield and corresponding levels of effort should be treated with caution.
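A Fox-type surplus production fit of the kind used in the paper can be sketched as follows. Under the Fox model, CPUA declines log-linearly with effort, so yield Y = f·CPUA peaks at f = 1/b. The data below are synthetic (not the paper's 258-fishery dataset), generated so that the true fisher density at maximum yield matches the floodplain-river estimate of 13.7 fishers km^-2:

```python
# Fox surplus production model: CPUA = a * exp(-b * f), so yield
# Y = f * CPUA is maximized at f_MY = 1/b with Y_max = a / (b * e).
# Synthetic data for illustration only.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
f = np.linspace(2, 40, 20)          # fisher density (fishers per km^2)
true_a, true_b = 5.0, 1 / 13.7      # 1/b = 13.7, as in the paper's estimate
cpua = true_a * np.exp(-true_b * f) * rng.lognormal(0, 0.05, f.size)

fox = lambda f, a, b: a * np.exp(-b * f)
(a_hat, b_hat), _ = curve_fit(fox, f, cpua, p0=(1.0, 0.1))

print(f"fisher density at maximum yield: {1 / b_hat:.1f} fishers/km^2")
print(f"maximum yield per km^2: {a_hat / (np.e * b_hat):.2f}")
```

The Schaefer alternative mentioned above would instead use CPUA = a - b·f, giving a parabolic rather than asymmetric yield curve.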

Relevance:

30.00%

Publisher:

Abstract:

Nucleolin is a multi-functional protein that is localized to the nucleolus. In tissue culture cells, the stability of nucleolin is related to the proliferation status of the cell. During development, rat cardiomyocytes proliferate actively, with increases in the mass of the heart being due to both hyperplasia and hypertrophy. The shift in the phenotype of the myocyte, from one capable of undergoing hyperplasia to one that can grow only by hypertrophy, occurs within 4 days of post-natal development. Thus, cardiomyocytes are an ideal model system in which to study the regulation of nucleolin during growth in vivo. Using Western blotting and quantitative RT-PCR (TaqMan), we found that the amount of nucleolin is regulated at the levels of both transcription and translation during the development of the cardiomyocyte. However, in cells which had exited the cell cycle and were subsequently given a hypertrophic stimulus, nucleolin was regulated post-transcriptionally. (c) 2005 Elsevier Inc. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

The main objectives of this paper are: firstly, to identify key issues related to sustainable intelligent buildings (environmental, social, economic and technological factors) and develop a conceptual model for the selection of appropriate KPIs; secondly, to critically test stakeholders' perceptions and values of selected KPIs for intelligent buildings; and thirdly, to develop a new model for measuring the level of sustainability of sustainable intelligent buildings. This paper uses a consensus-based model (Sustainable Built Environment Tool, SuBETool), which is analysed using the analytic hierarchy process (AHP) for multi-criteria decision-making. The use of the multi-attribute model for priority setting in the sustainability assessment of intelligent buildings is introduced. The paper commences by reviewing the literature on sustainable intelligent buildings research and presents a pilot study investigating the problems of complexity and subjectivity. This study is based upon a survey of perceptions held by selected stakeholders and the values they attribute to selected KPIs. It is argued that the benefit of the proposed model (SuBETool) is as a 'tool' for 'comparative' rather than absolute measurement. It has the potential to provide useful lessons from current sustainability assessment methods for the strategic future of sustainable intelligent buildings, in order to improve a building's performance and to deliver objective outcomes. The findings of this survey enrich the field of intelligent buildings in two ways. Firstly, they give a detailed insight into the selection of sustainable building indicators, as well as their degree of importance. Secondly, they critically test stakeholders' perceptions and values of selected KPIs for intelligent buildings. It is concluded that the priority levels for selected criteria are largely dependent on the integrated design team, which includes the client, architects, engineers and facilities managers.
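The AHP step used to derive KPI priority weights can be sketched as follows: priorities are the normalized principal eigenvector of a pairwise comparison matrix, with Saaty's consistency ratio as a sanity check. The 3×3 matrix below (environmental vs social vs economic importance) is invented for illustration and is not taken from the paper's survey:

```python
# Minimal AHP priority-weighting sketch with an illustrative 3x3
# pairwise comparison matrix (Saaty's 1-9 scale, reciprocal entries).
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],   # environmental vs (environmental, social, economic)
    [1/3, 1.0, 2.0],   # social
    [1/5, 1/2, 1.0],   # economic
])

# Priority weights = normalized principal eigenvector
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
lam_max = eigvals[k].real
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Saaty consistency index CI = (lambda_max - n) / (n - 1);
# consistency ratio CR = CI / RI, with random index RI = 0.58 for n = 3.
n = A.shape[0]
CR = ((lam_max - n) / (n - 1)) / 0.58

print("priority weights:", np.round(w, 3))
print("consistency ratio:", round(CR, 3))
```

A CR below 0.1 is conventionally taken to mean the stakeholder judgements are acceptably consistent; in a full SuBETool-style assessment this step would be repeated at each level of the criteria hierarchy.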

Relevance:

30.00%

Publisher:

Abstract:

Building refurbishment is key to reducing the carbon footprint and improving comfort in the built environment. However, quantifying the real benefit of a facade change, which can bring advantages to owners (value), occupants (comfort) and society (sustainability), is not a simple task. At a building physics level, the changes in kWh per m² of heating/cooling load can be readily quantified. However, there are many subtle layers of operation and maintenance below these headline figures which determine how sustainable a building is in reality, such as quality-of-life factors. This paper considers the range of approaches taken by a facade refurbishment consortium to assess refurbishment solutions for multi-storey, multi-occupancy buildings, and how to critically evaluate them. Each of the applied tools spans one or more of the three building parameters of people, product and process. 'Decision-making' analytical network process and parametric building analysis tools are described, and their potential impact on the building refurbishment process is evaluated.

Relevance:

30.00%

Publisher:

Abstract:

A means of assessing, monitoring and controlling aggregate emissions from multi-instrument Emissions Trading Schemes is proposed. The approach allows contributions from different instruments with different forms of emissions targets to be integrated. Where Emissions Trading Schemes are helping meet specific national targets, the approach allows the entry requirements of new participants to be calculated and set at a level that will achieve these targets. The approach is multi-levelled, and may be extended downwards to support pooling of participants within instruments, or upwards to embed Emissions Trading Schemes within a wider suite of policies and measures with hard and soft targets. Aggregate emissions from each instrument are treated stochastically. Emissions from the scheme as a whole are then the joint probability distribution formed by integrating the emissions from its instruments. Because a Bayesian approach is adopted, qualitative and semi-qualitative data from expert opinion can be used where quantitative data is not currently available, or is incomplete. This approach helps government retain sufficient control over emissions trading scheme targets to allow them to meet their emissions reduction obligations, while minimising the need for retrospectively adjusting existing participants’ conditions of entry. This maintains participant confidence, while providing the necessary policy levers for good governance.
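The aggregation idea above can be sketched with a Monte Carlo approximation of the joint distribution: each instrument's aggregate emissions is drawn from its own distribution, and the scheme-level distribution is that of their sum, from which the probability of exceeding a national target follows directly. The instrument distributions and target below are invented for illustration; in the paper's Bayesian setting their parameters would come from data or expert elicitation:

```python
# Stochastic aggregation sketch: scheme emissions = sum of per-instrument
# emissions distributions, approximated by Monte Carlo sampling.
# All distribution parameters and the target are illustrative.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Three instruments with different target forms and uncertainty (MtCO2e)
cap_and_trade   = rng.normal(10.0, 0.5, N)              # hard cap, small spread
baseline_credit = rng.lognormal(np.log(5.0), 0.3, N)    # skewed uncertainty
voluntary       = rng.lognormal(np.log(2.0), 0.5, N)    # soft target, wide spread

scheme = cap_and_trade + baseline_credit + voluntary    # joint (summed) draw

target = 20.0  # hypothetical national target (MtCO2e)
p_exceed = (scheme > target).mean()
print(f"P(scheme emissions exceed {target} MtCO2e) = {p_exceed:.2f}")
```

Setting the entry requirements of a new participant then amounts to choosing its distribution so that the recomputed exceedance probability stays below an acceptable level.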

Relevance:

30.00%

Publisher:

Abstract:

Fingerprinting is a well-known approach for identifying multimedia data without having the original data present, but instead what amounts to its essence or 'DNA'. Current approaches show insufficient deployment of three types of knowledge that could be brought to bear in providing a fingerprinting framework that remains effective and efficient, and can accommodate both whole as well as elemental protection at appropriate levels of abstraction to suit various Foci of Interest (FoI) in an image or cross-media artefact. Thus our proposed framework aims to deliver selective composite fingerprinting that remains responsive to the requirements for protection of the whole or parts of an image which may be of particular interest and especially vulnerable to attempts at rights violation. This is powerfully aided by leveraging both multi-modal information and a rich spectrum of collateral context knowledge, including image-level collaterals as well as the inevitably needed market intelligence knowledge, such as customers' social network interest profiling, which we deploy as a crucial component of our fingerprinting collateral knowledge. This is used in selecting the special FoIs within an image or other media content that have to be selectively and collaterally protected.

Relevance:

30.00%

Publisher:

Abstract:

As terabyte datasets become the norm, the focus has shifted away from our ability to produce and store ever larger amounts of data, onto their utilization. It is becoming increasingly difficult to gain meaningful insights into the data produced. Also, many forms of the data we are currently producing cannot easily fit into traditional visualization methods. This paper presents a novel visualization technique based on the concept of a Data Forest. Our Data Forest has been designed to be used with virtual reality (VR) as its presentation method. VR is a natural medium for investigating large datasets. Our approach can easily be adapted to be used in a variety of different ways, from a stand-alone single-user environment to large multi-user collaborative environments. A test application is presented using multi-dimensional data to demonstrate the concepts involved.

Relevance:

30.00%

Publisher:

Abstract:

Fingerprinting is a well-known approach for identifying multimedia data without having the original data present, but instead what amounts to its essence or 'DNA'. Current approaches show insufficient deployment of various types of knowledge that could be brought to bear in providing a fingerprinting framework that remains effective and efficient, and can accommodate both whole as well as elemental protection at appropriate levels of abstraction to suit various Zones of Interest (ZoI) in an image or cross-media artefact. The proposed framework aims to deliver selective composite fingerprinting that is powerfully aided by leveraging both multi-modal information and a rich spectrum of collateral context knowledge, including image-level collaterals and the inevitably needed market intelligence knowledge, such as customers' social network interest profiling, which we deploy as a crucial component of our fingerprinting collateral knowledge.

Relevance:

30.00%

Publisher:

Abstract:

We describe a high-level design method to synthesize multi-phase regular arrays. The method is based on deriving component designs using classical regular (or systolic) array synthesis techniques and composing these separately evolved component designs into a unified global design. Similarity transformations are applied to component designs in the composition stage in order to align data flow between the phases of the computation. Three transformations are considered: rotation, reflection and translation. The technique is aimed at the design of hardware components for high-throughput embedded systems applications, and we demonstrate this by deriving a multi-phase regular array for the 2-D DCT algorithm, which is widely used in many video communications applications.
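The 2-D DCT case study is naturally multi-phase: by the row-column decomposition, a 1-D DCT pass along the rows (phase one) followed by a 1-D DCT pass along the columns (phase two) equals the full 2-D transform. A software sketch of this decomposition using SciPy's DCT-II, checked against applying the axes in the opposite order:

```python
# Row-column (multi-phase) decomposition of the 2-D DCT:
# phase 1 transforms rows, phase 2 transforms the result's columns.
import numpy as np
from scipy.fft import dct

rng = np.random.default_rng(1)
block = rng.standard_normal((8, 8))   # e.g. an 8x8 video coding block

phase1 = dct(block, axis=1, norm="ortho")    # phase 1: 1-D DCT along rows
phase2 = dct(phase1, axis=0, norm="ortho")   # phase 2: 1-D DCT along columns

# Separability check: transforming the axes in the opposite order
# yields the same 2-D DCT coefficients.
other_order = dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")
print(np.allclose(phase2, other_order))   # True: the transform is separable
```

In the paper's hardware setting, each phase maps to a regular array of multiply-accumulate cells, and the rotation/reflection/translation transformations align the output data flow of phase one with the input data flow of phase two.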

Relevance:

30.00%

Publisher:

Abstract:

The consistency of precipitation variability estimated from multiple satellite-based observing systems is assessed. There is generally good agreement between the TRMM TMI, SSM/I, GPCP and AMSR-E datasets for the inter-annual variability of precipitation since 1997, but the HOAPS dataset appears to overestimate the magnitude of variability. Over the tropical ocean the TRMM 3B42 dataset produces unrealistic variability. Based upon deseasonalized GPCP data for the period 1998-2008, the sensitivity of global mean precipitation (P) to surface temperature (T) changes (dP/dT) is about 6%/K, although a smaller sensitivity of 3.6%/K is found using monthly GPCP data over the longer period 1989-2008. Over the tropical oceans dP/dT ranges from 10-30%/K depending upon time period and dataset, while over tropical land dP/dT is -8 to -11%/K for the 1998-2008 period. Analyzing the response of the tropical ocean precipitation intensity distribution to changes in T, we find that P over the wetter areas shows a strong positive response to T of around 20%/K. The response over the drier tropical regimes is less coherent and varies with dataset, but responses over tropical land show significant negative relationships on an interannual time-scale. The spatial and temporal resolutions of the datasets strongly influence the precipitation responses over the tropical oceans and help explain some of the discrepancies between datasets. Consistency between datasets is found to increase on averaging from daily to 5-day time-scales and considering a 1° (or coarser) spatial resolution. Defining the wet and dry tropical ocean regimes by the 60th percentile of P intensity, the 5-day average, 1° TMI data exhibit a coherent drying of the dry regime at a rate of -20%/K, while the wet regime becomes wetter at a similar rate with warming.
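The dP/dT sensitivities quoted above are essentially regression slopes of fractional precipitation anomalies on temperature anomalies. A sketch with synthetic monthly anomalies constructed to have a 6 %/K response (the GPCP-based estimate) plus noise, purely for illustration:

```python
# Precipitation-temperature sensitivity as a regression slope in %/K.
# Synthetic anomaly series for illustration; real estimates would use
# deseasonalized GPCP precipitation and observed temperature anomalies.
import numpy as np

rng = np.random.default_rng(7)
n = 132                                  # monthly anomalies, 1998-2008
T = rng.normal(0, 0.15, n)               # global-mean T anomaly (K)
P = 6.0 * T + rng.normal(0, 0.3, n)      # P anomaly in % of climatology

# Least-squares slope = dP/dT in %/K
slope = np.polyfit(T, P, 1)[0]
print(f"dP/dT = {slope:.1f} %/K")
```

Repeating the same regression separately for wet-regime and dry-regime averages (e.g. either side of the 60th percentile of P intensity) would give regime-dependent sensitivities of the kind reported above.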

Relevance:

30.00%

Publisher:

Abstract:

Forests are a store of carbon and an ecosystem that continually removes carbon dioxide from the atmosphere. If they are sustainably managed, the carbon store can be maintained at a constant level, while the trees removed and converted to timber products form an additional long-term carbon store. The total carbon store in the forest and the associated 'wood chain' therefore increases over time, given appropriate management. This increasing carbon store can be further enhanced by afforestation. The UK's forest area has increased continually since the early 1900s, although the rate of increase has declined since its peak in the late 1980s, and it is a similar picture in the rest of Europe. The increased sustainable use of timber in construction is a key market incentive for afforestation, which can make a significant contribution to reducing carbon emissions. The case study presented in this paper demonstrates the carbon benefits of a Cross Laminated Timber (CLT) solution for a multi-storey residential building in comparison with a more conventional reinforced concrete solution. The embodied carbon of the building up to completion of construction is considered, together with the stored carbon during the life of the building and the impact of different end-of-life scenarios. The results of the study show that the total stored carbon in the CLT structural frame is 1215 tCO2 (30 tCO2 per housing unit). The choice of treatment at end of life has a significant effect on the whole-life embodied carbon of the CLT frame, which ranges from -1017 tCO2e for re-use to +153 tCO2e for incineration without energy recovery. All end-of-life scenarios considered result in lower total CO2e emissions for the CLT frame building compared with the reinforced concrete frame solution.