80 results for Best Dominant

in CentAUR: Central Archive, University of Reading, UK


Relevance:

20.00%

Publisher:

Abstract:

Planning a project with proper consideration of all necessary factors, and managing it to ensure successful implementation, presents many challenges. The initial planning stage for a project bid is costly and time-consuming, and typically yields poor accuracy in cost and effort predictions. At the same time, detailed information on previous projects may be buried in piles of archived documents, making it increasingly difficult to learn from past experience. Project portfolio management has been brought into this field to improve information sharing and management across projects; however, the information that can be shared is still limited to generic information. In this paper, we report a recently developed software system, COBRA, which automatically generates a project plan with estimates of time and cost based on data collected from previously completed projects. To maximise data sharing and management across projects, we propose a method using product-based planning from the PRINCE2 methodology. (Automated Project Information Sharing and Management System - COBRA) Keywords: project management, product-based planning, best practice, PRINCE2
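The abstract above describes estimating effort for a new project from data on completed ones. As a hedged illustration of that general idea only (the abstract does not detail COBRA's actual algorithm, and the feature names and records below are invented), a minimal analogy-based estimator might look like this:

```python
# Sketch of analogy-based effort estimation: average the recorded effort of
# the k most similar completed projects. Distance metric, features and data
# are illustrative assumptions, not COBRA's design.

def estimate_effort(new_project, history, k=2):
    """Average the effort of the k past projects nearest in feature space."""
    def distance(a, b):
        return sum((a[f] - b[f]) ** 2 for f in a) ** 0.5
    ranked = sorted(history, key=lambda p: distance(new_project, p["features"]))
    return sum(p["effort_days"] for p in ranked[:k]) / k

history = [
    {"features": {"products": 10, "team": 4}, "effort_days": 120},
    {"features": {"products": 20, "team": 6}, "effort_days": 260},
    {"features": {"products": 12, "team": 5}, "effort_days": 150},
]
print(estimate_effort({"products": 11, "team": 4}, history))  # → 135.0
```

The two closest historical projects (120 and 150 effort-days) drive the estimate; richer feature sets and weighting schemes would refine it.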

Relevance:

20.00%

Publisher:

Abstract:

The Integrated Catchment Model of Nitrogen (INCA-N) was applied to the River Lambourn, a Chalk river-system in southern England. The model's abilities to simulate the long-term trend and seasonal patterns in observed stream water nitrate concentrations from 1920 to 2003 were tested. This is the first time a semi-distributed, daily time-step model has been applied to simulate such a long time period and then used to calculate detailed catchment nutrient budgets which span the conversion of pasture to arable during the late 1930s and 1940s. Thus, this work goes beyond source apportionment and looks to demonstrate how such simulations can be used to assess the state of the catchment and develop an understanding of system behaviour. The mass-balance results from 1921, 1922, 1991, 2001 and 2002 are presented and those for 1991 are compared to other modelled and literature values of loads associated with nitrogen soil processes and export. The variations highlighted the problem of comparing modelled fluxes with point measurements but proved useful for identifying the most poorly understood inputs and processes thereby providing an assessment of input data and model structural uncertainty. The modelled terrestrial and instream mass-balances also highlight the importance of the hydrological conditions in pollutant transport. Between 1922 and 2002, increased inputs of nitrogen from fertiliser, livestock and deposition have altered the nitrogen balance with a shift from possible reduction in soil fertility but little environmental impact in 1922, to a situation of nitrogen accumulation in the soil, groundwater and instream biota in 2002. In 1922 and 2002 it was estimated that approximately 2 and 18 kg N ha⁻¹ yr⁻¹ respectively were exported from the land to the stream. The utility of the approach and further considerations for the best use of models are discussed. © 2008 Elsevier B.V. All rights reserved.
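The catchment nutrient budgets described above rest on a simple accounting identity: nitrogen accumulation equals inputs minus outputs. A hedged sketch of that arithmetic, with invented round numbers rather than the paper's values:

```python
# Sketch of a catchment nitrogen mass balance of the kind INCA-N budgets
# express: accumulation = inputs - outputs. All terms below are illustrative
# assumptions, not the River Lambourn data.

def n_balance(inputs_kg_ha_yr, outputs_kg_ha_yr):
    """Net nitrogen accumulation (kg N ha^-1 yr^-1); positive = N is stored."""
    return sum(inputs_kg_ha_yr.values()) - sum(outputs_kg_ha_yr.values())

inputs = {"fertiliser": 100.0, "livestock": 30.0, "deposition": 20.0}
outputs = {"crop_offtake": 90.0, "stream_export": 18.0, "denitrification": 25.0}
print(n_balance(inputs, outputs))  # → 17.0 kg N ha^-1 yr^-1 accumulating
```

A positive residual of this kind corresponds to the 2002 situation the abstract describes, with nitrogen accumulating in soil, groundwater and in-stream biota.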

Relevance:

20.00%

Publisher:

Abstract:

Laboratory-determined mineral weathering rates need to be normalised to allow their extrapolation to natural systems. The principal normalisation terms used in the literature are mass, and geometric and BET specific surface area (SSA). The purpose of this study was to determine how dissolution rates normalised to these terms vary with grain size. Different size fractions of anorthite and biotite, ranging from 180–150 to 20–10 μm, were dissolved in pH 3 HCl at 25 °C in flow-through reactors under far-from-equilibrium conditions. Steady-state dissolution rates after 5376 h (anorthite) and 4992 h (biotite) were calculated from Si concentrations and were normalised to initial and final mass and geometric, geometric edge (biotite), and BET SSA. For anorthite, rates normalised to initial and final BET SSA ranged from 0.33 to 2.77 × 10⁻¹⁰ mol(feldspar) m⁻² s⁻¹, rates normalised to initial and final geometric SSA ranged from 5.74 to 8.88 × 10⁻¹⁰ mol(feldspar) m⁻² s⁻¹, and rates normalised to initial and final mass ranged from 0.11 to 1.65 mol(feldspar) g⁻¹ s⁻¹. For biotite, rates normalised to initial and final BET SSA ranged from 1.02 to 2.03 × 10⁻¹² mol(biotite) m⁻² s⁻¹, rates normalised to initial and final geometric SSA ranged from 3.26 to 16.21 × 10⁻¹² mol(biotite) m⁻² s⁻¹, rates normalised to initial and final geometric edge SSA ranged from 59.46 to 111.32 × 10⁻¹² mol(biotite) m⁻² s⁻¹, and rates normalised to initial and final mass ranged from 0.81 to 6.93 × 10⁻¹² mol(biotite) g⁻¹ s⁻¹. For all normalising terms, rates varied significantly (p ≤ 0.05) with grain size. The normalising terms which gave the least variation in dissolution rate between grain sizes for anorthite were initial BET SSA and initial and final geometric SSA.
This is consistent with: (1) dissolution being dominated by the slower-dissolving but area-dominant non-etched surfaces of the grains and (2) the walls of etch pits and other dissolution features being relatively unreactive. These steady-state normalised dissolution rates are likely to be constant with time. Normalisation to final BET SSA did not give constant rates across grain sizes due to a non-uniform distribution of dissolution features. After dissolution, coarser grains had a greater density of dissolution features, with BET-measurable but unreactive wall surface area, than the finer grains. The normalising term which gave the least variation in dissolution rates between grain sizes for biotite was initial BET SSA. Initial and final geometric edge SSA and final BET SSA gave the next least varied rates. The basal surfaces dissolved sufficiently rapidly to influence bulk dissolution rate and prevent geometric edge SSA-normalised dissolution rates showing the least variation. Simple modelling indicated that biotite grain edges dissolved 71–132 times faster than basal surfaces. In this experiment, initial BET SSA best integrated the different areas and reactivities of the edge and basal surfaces of biotite. Steady-state dissolution rates are likely to vary with time as dissolution alters the ratio of edge to basal surface area. Therefore they would be more properly termed pseudo-steady-state rates, only appearing constant because the time period over which they were measured (1512 h) was less than the time period over which they would change significantly. © 2006 Elsevier Inc. All rights reserved.
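The normalisation arithmetic behind these rates can be sketched generically: for a mixed-flow (flow-through) reactor at steady state, rate = (C_Si × Q) / (ν_Si × normalising term), where ν_Si is moles of Si per mole of mineral (2 for anorthite, CaAl₂Si₂O₈). The numbers below are illustrative assumptions, not the paper's data:

```python
# Hedged sketch of steady-state dissolution rate normalisation for a
# flow-through reactor. Input values are invented for illustration.

def dissolution_rate(c_si_mol_per_m3, q_m3_per_s, nu_si, norm_term):
    """Rate in mol(mineral) per unit of norm_term (mass or SSA) per second."""
    return (c_si_mol_per_m3 * q_m3_per_s) / (nu_si * norm_term)

c_si = 2.0e-3      # outlet Si concentration, mol m^-3 (2 micromolar)
q = 1.0e-9         # flow rate, m^3 s^-1 (~3.6 mL h^-1)
bet_area = 0.05    # total BET surface area of the charge, m^2 (SSA x mass)

rate_bet = dissolution_rate(c_si, q, nu_si=2, norm_term=bet_area)
print(f"{rate_bet:.1e} mol m^-2 s^-1")  # → 2.0e-11 mol m^-2 s^-1
```

Swapping `norm_term` between initial/final mass, geometric SSA, and BET SSA reproduces the different normalisations whose grain-size dependence the study compares.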

Relevance:

20.00%

Publisher:

Abstract:

A systematic modular approach to investigate the respective roles of the ocean and atmosphere in setting El Niño characteristics in coupled general circulation models is presented. Several state-of-the-art coupled models sharing either the same atmosphere or the same ocean are compared. Major results include: (1) the dominant role of the atmosphere model in setting El Niño characteristics (periodicity and base amplitude) and errors (regularity); and (2) the considerable improvement of simulated El Niño power spectra (a shift toward lower frequency) when the atmosphere resolution is significantly increased. Likely reasons for such behavior are briefly discussed. It is argued that this new modular strategy represents a generic approach to identifying the source of both coupled mechanisms and model error and will provide a methodology for guiding model improvement.

Relevance:

20.00%

Publisher:

Abstract:

Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate DTMs of floodplains for use as model bathymetry. Spatial resolutions of 0.5 m or less are possible, with a height accuracy of 0.15 m. LiDAR gives a Digital Surface Model (DSM), so vegetation-removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art is the LiDAR data provided by the Environment Agency (EA), which has been processed by their in-house software to convert the raw data to a ground DTM and a separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc. Most vegetation-removal software ignores short vegetation, less than say 1 m high. We have attempted to extend vegetation height measurement to short vegetation using local height texture. Typically, most of a floodplain may be covered in such vegetation. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying. This obviates the need to calibrate a global floodplain friction coefficient. It is not clear at present if the method is useful, but it is worth testing further. The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low-pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types, such as LiDAR intensity or multispectral CASI data.
We are attempting to use digital map data (MasterMap structured topography data) to help distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed. A related problem, of how best to merge historic river cross-section data with a LiDAR DTM, will also be considered. LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that e.g. hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc., as well as trees and hedges. A dominant-points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes. However, the mesh generated may be useful to allow a high-resolution FE model to act as a benchmark for a more practical lower-resolution model. A further problem discussed will be how best to exploit data redundancy due to the high resolution of the LiDAR compared to that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size: for a 5 m-wide embankment within a raster grid model with a 15 m cell size, the maximum height of the embankment could be assigned locally to each cell covering the embankment. But how could a 5 m-wide ditch be represented? Again, this redundancy has been exploited to improve wetting/drying algorithms using the sub-grid-scale LiDAR heights within finite elements at the waterline.
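The spatially varying friction idea above, assigning a roughness value per cell from local vegetation height rather than calibrating one global floodplain coefficient, can be sketched as a simple lookup. The height thresholds and Manning's n values below are illustrative assumptions, not calibrated coefficients from the work described:

```python
# Hedged sketch: map LiDAR-derived vegetation height (DSM minus DTM) to a
# Manning's n roughness value per cell. Thresholds and n values are invented.

def manning_n(veg_height_m):
    """Return a roughness coefficient for one DTM cell."""
    if veg_height_m < 0.1:
        return 0.03   # bare ground / very short grass
    if veg_height_m < 1.0:
        return 0.06   # short vegetation, resolved via local height texture
    if veg_height_m < 5.0:
        return 0.10   # hedges and shrubs
    return 0.15       # trees

heights = [0.05, 0.4, 2.5, 12.0]           # example cell vegetation heights, m
print([manning_n(h) for h in heights])      # → [0.03, 0.06, 0.1, 0.15]
```

Applied over the whole vegetation height map, this yields a friction field at the DTM's resolution, which the flood model can then aggregate to its own cell size.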

Relevance:

20.00%

Publisher:

Abstract:

At present, there is much anxiety regarding the security of energy supplies; for example, the UK and other European states are set to become increasingly dependent upon imports of natural gas from states with which political relations are often strained. These uncertainties are felt acutely by the electricity generating sector, which is facing major challenges regarding the choice of fuel mix in the years ahead. Nuclear energy may provide an alternative; however, in the UK, progress in replacing the first-generation reactors is exceedingly slow. A number of operators are looking to coal as a means of plugging the energy gap. However, in the light of ever more stringent legal controls on emissions, this step cannot be taken without the adoption of sophisticated pollution abatement technology. This article examines the role which legal concepts such as Best Available Techniques (BAT) must play in bringing about these changes.

Relevance:

20.00%

Publisher:

Abstract:

When assessing hypotheses, the possibility and consequences of false-positive conclusions should be considered along with the avoidance of false-negative ones. A recent assessment of the system of rice intensification (SRI) by McDonald et al. [McDonald, A.J., Hobbs, P.R., Riha, S.J., 2006. Does the system of rice intensification outperform conventional best management? A synopsis of the empirical record. Field Crops Res. 96, 31-36] provides a good example where this was not done as it was preoccupied with avoiding false-positives only. It concluded, based on a desk study using secondary data assembled selectively from diverse sources and with a 95% level of confidence, that 'best management practices' (BMPs) on average produce 11% higher rice yields than SRI methods, and that, therefore, SRI has little to offer beyond what is already known by scientists.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents our experience with combining statistical principles and participatory methods to generate national statistics. The methodology was developed in Malawi during 1999–2002. We demonstrate that if PRA is combined with statistical principles (including probability-based sampling and standardization), it can produce total population statistics and estimates of the proportion of households with certain characteristics (e.g., poverty). It can also provide quantitative data on complex issues of national importance such as poverty targeting. This approach is distinct from previous PRA-based approaches, which generate numbers at community level but only provide qualitative information at national level.
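The statistical principle described above, combining community-level PRA counts with probability-based sampling so they scale to national estimates, amounts to a design-weighted ratio estimate. A hedged sketch with invented communities and weights (not the Malawi data):

```python
# Sketch of a design-weighted proportion estimate: each sampled community's
# PRA-derived counts are weighted by the inverse of its selection
# probability. All records below are illustrative assumptions.

def weighted_proportion(samples):
    """Estimate the national proportion of households with a characteristic
    (e.g. poverty) from weighted community-level counts."""
    num = sum(s["weight"] * s["poor_households"] for s in samples)
    den = sum(s["weight"] * s["total_households"] for s in samples)
    return num / den

samples = [
    {"poor_households": 40, "total_households": 100, "weight": 50.0},
    {"poor_households": 10, "total_households": 80,  "weight": 20.0},
]
print(round(weighted_proportion(samples), 3))  # → 0.333
```

The weights are what distinguish this from earlier PRA approaches: without them, community-level counts cannot be aggregated into unbiased national statistics.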

Relevance:

20.00%

Publisher:

Abstract:

An overtly critical perspective on 're-engineering construction' is presented. It is contended that re-engineering is impossible to define in terms of its substantive content and is best understood as a rhetorical label. In recent years, the language of re-engineering has heavily shaped the construction research agenda. The declared goals are to lower costs and improve value for the customer. The discourse is persuasive because it reflects the ideology of the 'enterprise culture' and the associated rhetoric of customer responsiveness. Re-engineering is especially attractive to the construction industry because it reflects and reinforces the existing dominant way of thinking. The overriding tendency is to reduce organizational complexities to a mechanistic quest for efficiency. Labour is treated as a commodity. Within this context, the objectives of re-engineering become 'common sense'. Knowledge becomes subordinate to the dominant ideology of neo-liberalism. The accepted research agenda for re-engineering construction exacerbates the industry's problems and directly contributes to the casualization of the workforce. The continued adherence to machine metaphors by the construction industry's top management has directly contributed to the 'bad attitudes' and 'adversarial culture' that they repeatedly decry. Supposedly neutral topics such as pre-assembly, partnering, supply chain management and lean thinking serve only to justify the shift towards bogus labour-only subcontracting and the associated reduction of employment rights. The continued casualization of the workforce raises real questions about the industry's future capacity to deliver high-quality construction. In order to appear 'relevant' to the needs of industry, it seems that the research community is doomed to perpetuate this regressive cycle.