976 results for project delay estimation


Relevance:

30.00%

Abstract:

Delays in the justice system have been undermining the functioning and performance of court systems all over the world for decades. Despite widespread concern about delays, the solutions have not kept up with the growth of the problem. The delay problem in court processes is a good example of the growing need and pressure in professional public organizations to improve their business process performance. This study analyses the possibilities and challenges of process improvement in professional public organizations. It is based on experiences gained in two longitudinal action-research improvement projects conducted in two separate Finnish courts: the Helsinki Court of Appeal and the Insurance Court. The thesis has two objectives. The first is to study what kinds of factors in court system operations cause delays and unmanageable backlogs, and how delays can be reduced and prevented. The second, based on the lessons learned from the case projects, is to give new insights into the critical factors of process improvement conducted in professional public organizations. Four main areas and factors behind the delay problem are identified: 1) goal setting and performance measurement practices, 2) the process control system, 3) production and capacity planning procedures, and 4) process roles and responsibilities. The appropriate improvement solutions include tools to enhance project planning and scheduling, and to monitor the agreed time frames for the different phases of the handling process and the pending inventory. The study introduces the identified critical factors in the different phases of process improvement work carried out in professional public organizations, shows how the critical factors can be incorporated into the different stages of such projects, and discusses the role of an external facilitator in assisting process improvement work and in strengthening ownership of the solutions and improvements.
The study highlights the need to concentrate on the critical factors so as to get employees to challenge their existing ways of working, analyse their own processes, and create procedures for diffusing a process improvement culture, instead of merely concentrating on finding tools, techniques, and solutions borrowed from the manufacturing sector.

Relevance:

30.00%

Abstract:

The target company of this study is a large machinery company which is, inter alia, engaged in the energy and pulp engineering, procurement and construction management (EPCM) supply business. The main objective of this study was to develop the target company's cost estimation by providing more accurate, reliable and up-to-date information through the enterprise resource planning (ERP) system. Another objective was to find cost-effective methods of collecting total cost of ownership information to support more informed supplier selection decisions. This study is primarily action-oriented, but also constructive, and it can be divided into two sections: a theoretical literature review and an empirical study of the abovementioned part of the target company's business. The development of information collection is based, in addition to the literature review, on nearly 30 qualitative interviews with employees at various organizational units, functions and levels of the target company. At the core of the development was making the initial data more accurate, reliable and available, a necessary prerequisite for informed use of the information. Development suggestions and paths were presented in order to regain confidence in the ERP system as an information source, by reorganizing the work breakdown structure and by complementing mere cost information with quantitative, technical and scope information. Several methods for using the information more effectively were also discussed. While implementation of the development suggestions extended beyond the scope of this study, it was advanced in a test environment and among interest groups.
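The total cost of ownership idea discussed above can be sketched as a simple comparison in which cost categories beyond the purchase price drive supplier selection. The supplier names, cost categories and figures below are invented for illustration; they are not taken from the study.

```python
# Hypothetical TCO comparison for supplier selection. All names and
# figures are invented; the point is that categories beyond price matter.

suppliers = {
    "A": {"price": 100000, "freight": 8000, "quality_costs": 12000, "admin": 3000},
    "B": {"price": 95000, "freight": 9000, "quality_costs": 25000, "admin": 4000},
}

def tco(costs):
    """Total cost of ownership = sum of all cost categories."""
    return sum(costs.values())

best = min(suppliers, key=lambda s: tco(suppliers[s]))
```

Here supplier B's lower purchase price is outweighed by its higher quality costs, which is precisely the kind of trade-off that collecting TCO information is meant to expose.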

Relevance:

30.00%

Abstract:

This paper studies the effect of time delay on the active non-linear control of dynamically loaded flexible structures. The behavior of non-linear systems under state feedback control, considering a fixed time delay for the control force, is investigated. A control method based on non-linear optimal control, using a tensorial formulation and state feedback, is used. The state equations and the control forces are expressed in polynomial form, and a performance index, quadratic in both the state vector and the control forces, is used. General polynomial representations of the non-linear control law are obtained and implemented for control algorithms up to the fifth order. This methodology is applied to systems with quadratic and cubic non-linearities. Strongly non-linear systems are tested, and the effectiveness of the control system including a delay in the application of the control forces is discussed. Numerical results indicate that the adopted control algorithm can be efficient for non-linear systems, chiefly in the presence of strong non-linearities, but that increasing time delay reduces the efficiency of the control system. The numerical results emphasize the importance of considering time delay in the design of active structural control systems.
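A minimal numerical sketch of the effect studied above, assuming a Duffing-type oscillator under delayed linear state feedback. This does not reproduce the paper's tensorial optimal-control formulation; all parameter values and gains are invented for illustration.

```python
# Sketch (assumed model, not the paper's): linear state feedback applied
# with a fixed time delay tau to a Duffing-type oscillator
#     x'' + 2*zeta*w*x' + w**2*x + alpha*x**3 = u(t - tau)

def simulate(tau, t_end=20.0, dt=0.001):
    w, zeta, alpha = 1.0, 0.02, 0.5     # plant parameters (assumed)
    k1, k2 = 2.0, 2.0                   # feedback gains (assumed)
    x, v = 1.0, 0.0                     # initial displacement and velocity
    delay_steps = int(round(tau / dt))
    buffer = [0.0] * (delay_steps + 1)  # stored past control commands
    for _ in range(int(t_end / dt)):
        buffer.append(-k1 * x - k2 * v)           # command from current state
        u = buffer.pop(0)                         # applied force is tau old
        a = u - 2 * zeta * w * v - w**2 * x - alpha * x**3
        x, v = x + dt * v, v + dt * a             # explicit Euler step
    return abs(x)

residual_no_delay = simulate(tau=0.0)
residual_delayed = simulate(tau=0.3)  # larger delays degrade the control
```

With negligible delay the controlled response decays to essentially zero over the simulated interval; increasing `tau` slows and eventually destroys this convergence, which is the qualitative finding the abstract reports.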

Relevance:

30.00%

Abstract:

Chapter 1 presents a brief note on the state of the construction industry at present, bringing into focus the significance of this critical study. The relevance of the study, the area of investigation and the objectives of the study are outlined in this chapter. The second chapter presents a review of the literature in the relevant areas. The third chapter analyses time and cost overruns in construction, highlighting the major factors responsible for them. A couple of case studies estimating the loss to the nation on account of delay in construction are presented in that chapter. The need for an appropriate estimate and a competent contractor is emphasised for improving effectiveness in project implementation. Certain useful equations and thoughts in this area, which can be followed in the State PWD and other government organisations, are also formulated there. Case studies on the implementation of major projects undertaken by government sponsored or supported organisations in Kerala are dealt with in Chapter 4. A detailed description of the Kerala Legislature Complex project, with a critical analysis, is given in this chapter, together with a detailed account of the investigations carried out on the construction of the International Stadium, a sports project of the Greater Cochin Development Authority. The project details of Cochin International Airport at Nedumbassery, its promoters and its contractors are also discussed in Chapter 4. The various aspects of implementation which made the above projects successful are discussed in Chapter 5. The data collected were analysed through discussion and perception to arrive at certain conclusions. The emergence of the front-loaded contract and its impact on the economics of project execution are dealt with in this chapter, and the delays in the various projects described in Chapter 3 are analysed here.
The root causes of project time and cost overruns and their remedial measures are also listed in this chapter. The study of cost and time overruns of any construction project is a part of construction management. Under the present environment of heavy investment in construction activities in India, the consequences of mismanagement many a time lead to excessive expenditure that could have been avoided. Cost consciousness therefore has to be keener than ever before. Optimisation of investment can be achieved by improved dynamism in construction management. The successful completion of construction projects within the specified programme, optimising the three major attributes of the process (quality, schedule and cost), has become the most valuable and challenging task for engineer-managers to perform. The various aspects of construction management, such as cost control, schedule control, quality assurance and management techniques, are therefore also discussed in this fifth chapter. Chapter 6 summarises the conclusions drawn from the above critical study of major construction projects in Kerala.

Relevance:

30.00%

Abstract:

Planning a project with proper consideration of all necessary factors, and managing it to ensure its successful implementation, faces many challenges. The initial stage of planning a project for bidding is costly and time-consuming, and usually yields poor accuracy in cost and effort predictions. On the other hand, detailed information on previous projects may be buried in piles of archived documents, making it increasingly difficult to learn from previous experience. Project portfolios have been brought into this field with the aim of improving information sharing and management among different projects; however, the amount of information that can be shared is still limited to generic information. In this paper we report a recently developed software system, COBRA (Automated Project Information Sharing and Management System), which automatically generates a project plan with effort estimates of time and cost based on data collected from previously completed projects. To maximise data sharing and management among different projects, we propose a method using product-based planning from the PRINCE2 methodology. Keywords: project management, product based planning, best practice, PRINCE2
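The core idea of generating effort estimates from data on completed projects can be sketched as follows. The product names and figures are invented for illustration; the actual COBRA system and its PRINCE2 product breakdown structures are not reproduced here.

```python
# Hypothetical sketch: estimate (days, cost) for each product in a new
# plan by averaging the records of analogous products in completed
# projects. All project data below are invented.

completed_projects = [
    {"requirements spec": (10, 5000), "design doc": (15, 8000)},
    {"requirements spec": (12, 6000), "design doc": (13, 7000)},
]

def estimate_plan(products):
    """Average effort (days) and cost per product over past projects."""
    plan = {}
    for name in products:
        records = [p[name] for p in completed_projects if name in p]
        days = sum(r[0] for r in records) / len(records)
        cost = sum(r[1] for r in records) / len(records)
        plan[name] = (days, cost)
    return plan

plan = estimate_plan(["requirements spec", "design doc"])
```

A product-based breakdown makes this lookup possible in the first place: because plans from different projects share product names, their historical effort records become directly comparable.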

Relevance:

30.00%

Abstract:

The technique of linear responsibility analysis is used for a retrospective case study of a private development consisting of an extension to an existing building to provide a wholesale butchery facility. The project used a conventionally organized management process. The organization structure adopted on the project is analysed using concepts from systems theory, which are included in Walker's theoretical model of the structure of building project organizations. This model proposes that the process of building provision can be viewed as systems and sub-systems that are differentiated from each other at decision points. Further, the sub-systems can be viewed as the interaction of a managing system and an operating system. Using Walker's model, a systematic analysis of the relationships between the contributors gives a quantitative assessment of the efficiency of the organizational structure used. The project's organization structure diverged from the model's propositions, resulting in delay to the project's completion and cost overrun, but the client was satisfied with the project functionally.

Relevance:

30.00%

Abstract:

The European research project TIDE (Tidal Inlets Dynamics and Environment) is developing and validating coupled models describing the morphological, biological and ecological evolution of tidal environments. The interactions between the physical and biological processes occurring in these regions require that the system be studied as a whole rather than as separate parts. Extensive use of remote sensing, including LiDAR, is being made to provide validation data for the modelling. This paper describes the different uses of LiDAR within the project and their relevance to the TIDE science objectives. LiDAR data have been acquired from three different environments: the Venice Lagoon in Italy, Morecambe Bay in England, and the Eden estuary in Scotland. LiDAR accuracy at each site has been evaluated using ground reference data acquired with differential GPS. A semi-automatic technique has been developed to extract tidal channel networks from LiDAR data, either used alone or fused with aerial photography. While the resulting networks may require some correction, the procedure does allow network extraction over large areas using objective criteria and reduces fieldwork requirements. The networks extracted may subsequently be used in geomorphological analyses, for example to describe the drainage patterns induced by networks and to examine the rate of change of networks. Estimation of the heights of the low and sparse vegetation on marshes is being investigated by analysis of the statistical distribution of the measured LiDAR heights. Species having different mean heights may be separated using the first-order moments of the height distribution.

Relevance:

30.00%

Abstract:

Parametric software effort estimation models consisting of a single mathematical relationship suffer from poor adjustment and predictive characteristics when the historical database considered contains data coming from projects of a heterogeneous nature. The segmentation of the input domain according to clusters obtained from the database of historical projects serves as a tool for more realistic models that use several local estimation relationships. Nonetheless, it may be hypothesized that using clustering algorithms without prior consideration of the influence of well-known project attributes misses the opportunity to obtain more realistic segments. In this paper, we describe the results of an empirical study, using the ISBSG-8 database and the EM clustering algorithm, of the influence of two process-related attributes as drivers of the clustering process: the use of engineering methodologies and the use of CASE tools. The results provide evidence that their consideration significantly conditions the final model obtained, even though the resulting predictive quality is of a similar magnitude.
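The segmentation idea can be sketched with a toy example: synthetic project data drawn from two populations with different productivities, and a trivial two-means pass standing in for the EM algorithm and ISBSG data used in the paper (all values below are assumptions).

```python
# Sketch: a heterogeneous database of (size, effort) projects from two
# assumed populations with productivities 5 and 12 hours per size unit.
# Clustering the effort/size ratios yields one local model per segment
# instead of a single global relationship.

projects = [(s, 5.0 * s) for s in (10, 20, 30)] + \
           [(s, 12.0 * s) for s in (10, 20, 30)]

def two_means(values, iters=20):
    """A minimal two-centroid clustering pass (stand-in for EM)."""
    c0, c1 = min(values), max(values)
    for _ in range(iters):
        g0 = [v for v in values if abs(v - c0) <= abs(v - c1)]
        g1 = [v for v in values if abs(v - c0) > abs(v - c1)]
        c0, c1 = sum(g0) / len(g0), sum(g1) / len(g1)
    return c0, c1

ratios = [e / s for s, e in projects]  # productivity drives the clustering
c0, c1 = two_means(ratios)             # the two local effort-ratio models
```

A single global ratio fitted to all six projects would sit between the two populations and misestimate both; the segment-wise models recover each population's productivity exactly, which is the motivation for segmentation stated in the abstract.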

Relevance:

30.00%

Abstract:

Little has been reported on the performance of near-far resistant CDMA detectors in the presence of system parameter estimation errors (SPEEs). Starting with the general mathematical model of matched filters, the paper examines the effects of three classes of SPEEs, i.e., time-delay, carrier phase, and carrier frequency errors, on the performance (BER) of an emerging type of near-far resistant coherent DS/SSMA detector, i.e., the linear decorrelating detector. For comparison, the corresponding results for the conventional detector are also presented. It is shown that the linear decorrelating detector can still maintain a considerable performance advantage over the conventional detector even when some SPEEs exist.
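A toy two-user synchronous CDMA example contrasts the conventional matched-filter detector with the linear decorrelating detector discussed above. No noise and no parameter errors are included; the spreading codes and amplitudes are assumptions chosen only to show the near-far resistance that the paper's SPEE analysis builds on.

```python
import numpy as np

# Two-user synchronous CDMA, no noise. Codes and amplitudes are assumed.
s1 = np.array([1, 1, 1, 1], dtype=float) / 2.0   # unit-energy spreading codes
s2 = np.array([1, 1, 1, -1], dtype=float) / 2.0
S = np.column_stack([s1, s2])
R = S.T @ S                                      # cross-correlation matrix

b = np.array([1.0, -1.0])                        # transmitted bits
A = np.diag([1.0, 10.0])                         # user 2 is 20 dB stronger
r = S @ A @ b                                    # received chip sequence

y = S.T @ r                                      # matched-filter outputs
conventional = np.sign(y)                        # near-far limited decision
decorrelating = np.sign(np.linalg.solve(R, y))   # cancels multi-user cross-talk
```

The strong interferer pushes the conventional detector into a bit error on user 1, while the decorrelator, by inverting the cross-correlation matrix, recovers both bits regardless of the power imbalance.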

Relevance:

30.00%

Abstract:

We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating carbon (C) model parameters consistent with both measured carbon fluxes and states and a simple C model. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and to generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The results of the analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover, or temperature sensitivity of heterotrophic respiration, were best constrained and characterised. Poorly estimated parameters were those related to the allocation to and turnover of fine root/wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes from synthetic experiments within relatively narrow 90% confidence intervals, achieving a >80% success rate and mean NEE confidence intervals <110 gC m−2 year−1 for the synthetic case. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data. The estimation of ecosystem respiration and GPP through MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared to the previous year, when data were available.
Confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints, using data on C pools (wood, soil and fine roots) would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
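One of the MDF approaches named above, the Metropolis algorithm, can be sketched on a deliberately small problem: estimating a single temperature-sensitivity parameter k of a toy respiration model R(T) = exp(k*T) from synthetic data with added noise. The model form, the true parameter value and the noise level are all assumptions for illustration, not the REFLEX setup.

```python
import math, random

# Metropolis random-walk sketch on a one-parameter toy flux model.
random.seed(1)
temps = list(range(0, 16, 2))
k_true, sigma = 0.08, 0.3
obs = [math.exp(k_true * t) + random.gauss(0, sigma) for t in temps]

def cost(k):
    """Negative log-likelihood (Gaussian errors, constant dropped)."""
    return sum((math.exp(k * t) - o) ** 2
               for t, o in zip(temps, obs)) / (2 * sigma**2)

k = 0.0
c = cost(k)
samples = []
for _ in range(5000):
    k_prop = k + random.gauss(0, 0.01)          # random-walk proposal
    c_prop = cost(k_prop)
    if math.log(random.random()) < c - c_prop:  # Metropolis acceptance rule
        k, c = k_prop, c_prop
    samples.append(k)

k_est = sum(samples[1000:]) / len(samples[1000:])  # posterior mean, post burn-in
```

The spread of the retained samples also yields the kind of confidence interval compared across algorithms in the study, and the same chain run against observed rather than synthetic data would widen those intervals by the model error, as the abstract quantifies.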

Relevance:

30.00%

Abstract:

A global river routing scheme coupled to the ECMWF land surface model is implemented and tested within the framework of the Global Soil Wetness Project II, to evaluate the feasibility of modelling global river runoff at a daily time scale. The exercise is designed to provide benchmark river runoff predictions needed to verify the land surface model. Ten years of daily runoff produced by the HTESSEL land surface scheme are input into the TRIP2 river routing scheme in order to generate daily river runoff. These are then compared to river runoff observations from the Global Runoff Data Centre (GRDC) in order to evaluate the potential and the limitations. A notable source of inaccuracy is the bias between observed and modelled discharges, which is not primarily due to the modelling system but instead to the forcing and the quality of observations, and which seems uncorrelated with river catchment size. A global sensitivity analysis and a Generalised Likelihood Uncertainty Estimation (GLUE) uncertainty analysis are applied to the global routing model. The groundwater delay parameter is identified as the most sensitive calibration parameter. Significant uncertainties are found in the results, and those due to the parameterisation of the routing model are quantified. The difficulty involved in parameterising global river discharge models is discussed. Detailed river runoff simulations are shown for the river Danube, which match observed river runoff well in upstream river transects. Results show that although there are errors in the runoff predictions, the model results are encouraging and certainly indicative of useful runoff predictions, particularly for the purpose of verifying the land surface scheme hydrologically. The potential of this modelling system for future applications, such as river runoff forecasting and climate impact studies, is highlighted. Copyright © 2009 Royal Meteorological Society.
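The GLUE procedure mentioned above can be sketched on a toy single-store routing model, q[t+1] = (1 - 1/d)*q[t] + inflow/d, where d stands in for the groundwater delay parameter. The model, inflow series and behavioural threshold are assumptions for illustration, not the TRIP2 configuration.

```python
import random

# GLUE sketch: Monte Carlo sampling of the delay parameter d, with
# behavioural runs (Nash-Sutcliffe > 0.9) retained to bound uncertainty.
random.seed(0)

def route(delay, inflow=(5, 9, 3, 7, 4, 6), q0=5.0):
    q, out = q0, []
    for f in inflow:
        q = (1 - 1.0 / delay) * q + f / delay
        out.append(q)
    return out

observed = route(3.0)  # synthetic "truth" generated with delay = 3

def nash_sutcliffe(sim, obs):
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

behavioural = []
for _ in range(2000):
    d = random.uniform(1.5, 8.0)
    if nash_sutcliffe(route(d), observed) > 0.9:
        behavioural.append(d)

behavioural.sort()
lo = behavioural[int(0.05 * len(behavioural))]
hi = behavioural[int(0.95 * len(behavioural))]  # 90% uncertainty bounds on d
```

The width of the retained parameter range is the GLUE-style quantification of the uncertainty attributable to the routing parameterisation, paralleling the analysis reported for the global model.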

Relevance:

30.00%

Abstract:

The Bollène-2002 Experiment was aimed at developing the use of a radar volume-scanning strategy for conducting radar rainfall estimations in the mountainous regions of France. A developmental radar processing system, called Traitements Régionalisés et Adaptatifs de Données Radar pour l’Hydrologie (Regionalized and Adaptive Radar Data Processing for Hydrological Applications), has been built and several algorithms were specifically produced as part of this project. These algorithms include 1) a clutter identification technique based on the pulse-to-pulse variability of reflectivity Z for noncoherent radar, 2) a coupled procedure for determining a rain partition between convective and widespread rainfall R and the associated normalized vertical profiles of reflectivity, and 3) a method for calculating reflectivity at ground level from reflectivities measured aloft. Several radar processing strategies, including nonadaptive, time-adaptive, and space–time-adaptive variants, have been implemented to assess the performance of these new algorithms. Reference rainfall data were derived from a careful analysis of rain gauge datasets furnished by the Cévennes–Vivarais Mediterranean Hydrometeorological Observatory. The assessment criteria for five intense and long-lasting Mediterranean rain events have proven that good quantitative precipitation estimates can be obtained from radar data alone within 100-km range by using well-sited, well-maintained radar systems and sophisticated, physically based data-processing systems. The basic requirements entail performing accurate electronic calibration and stability verification, determining the radar detection domain, achieving efficient clutter elimination, and capturing the vertical structure(s) of reflectivity for the target event. 
Radar performance was shown to depend on the type of rainfall, with better results obtained for deep convective rain systems (Nash coefficients of roughly 0.90 for point radar–rain gauge comparisons at the event time step) than for shallow convective and frontal rain systems (Nash coefficients in the 0.6–0.8 range). In comparison with time-adaptive strategies, the space–time-adaptive strategy yields a very significant reduction in the radar–rain gauge bias, while the level of scatter remains basically unchanged. Because the Z–R relationships have not been optimized in this study, these results are attributed to an improved processing of spatial variations in the vertical profile of reflectivity. The two main recommendations for future work consist of adapting the rain separation method for radar network operations and documenting Z–R relationships conditional on rainfall type.
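As background to the Z–R discussion above: a widely used fixed power law (not the one calibrated in this study) is the Marshall–Palmer relation Z = 200·R^1.6, with Z in mm⁶ m⁻³ and R in mm/h. Inverting it converts a measured reflectivity in dBZ to a rain-rate estimate:

```python
# Convert measured reflectivity (dBZ) to rain rate (mm/h) by inverting
# the power law Z = a * R**b. Defaults are the Marshall-Palmer constants;
# the study argues such constants should be conditional on rainfall type.

def rain_rate(dbz, a=200.0, b=1.6):
    z = 10.0 ** (dbz / 10.0)      # dBZ -> linear reflectivity factor
    return (z / a) ** (1.0 / b)   # invert Z = a * R**b

r = rain_rate(40.0)               # 40 dBZ -> about 11.5 mm/h
```

Because the exponent b appears inside the inversion, a fixed (a, b) pair applied to both convective and frontal rain will bias one or the other, which is why the abstract recommends documenting Z–R relationships by rainfall type.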

Relevance:

30.00%

Abstract:

A method has been developed to estimate aerosol optical depth (AOD) over land surfaces using high-spatial-resolution, hyperspectral, multiangle Compact High Resolution Imaging Spectrometer (CHRIS)/Project for On Board Autonomy (PROBA) images. The CHRIS instrument is mounted aboard the PROBA satellite and provides up to 62 bands. The PROBA satellite allows pointing to obtain imagery from five different view angles within a short time interval. The method uses inversion of a coupled surface/atmosphere radiative transfer model and includes a general physical model of angular surface reflectance. An iterative process is used to determine the optimum AOD value providing the best fit of the corrected reflectance values, for a number of view angles and wavelengths, with those provided by the physical model. This method has previously been demonstrated on data from the Advanced Along-Track Scanning Radiometer and is extended here to the spectral and angular sampling of CHRIS/PROBA. The AOD values obtained from these observations are validated using ground-based sun-photometer measurements. Results from 22 image sets show an rms error of 0.11 in AOD at 550 nm, which is reduced to 0.06 after an automatic screening procedure.
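The inversion idea, choosing the AOD whose modelled reflectances best fit the multiangle observations, can be sketched with a deliberately crude forward model. The toy model below is an assumption for illustration only; it is not the coupled radiative transfer model or the angular surface reflectance model used in the paper.

```python
import math

# Invented forward model: transmitted surface signal plus a path term.
def toy_toa(aod, surface, mu):
    t = math.exp(-aod / mu)               # crude transmittance along view path
    return surface * t + 0.05 * aod / mu  # surface term + atmospheric term

angles = [0.6, 0.8, 1.0]                  # cosines of view zenith (assumed)
true_aod, surface = 0.3, 0.1
obs = [toy_toa(true_aod, surface, mu) for mu in angles]

# Iterative search: best-fitting AOD over a discretised candidate range,
# minimising the squared misfit summed over all view angles.
candidates = [i / 100.0 for i in range(0, 101)]
best = min(candidates,
           key=lambda a: sum((toy_toa(a, surface, mu) - o) ** 2
                             for mu, o in zip(angles, obs)))
```

Using several view angles simultaneously is what makes the fit well posed: a single angle cannot separate the surface and aerosol contributions, whereas their different angular signatures pin down a unique best-fitting AOD.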

Relevance:

30.00%

Abstract:

In order to overcome the divergence of estimates produced from the same data, the proposed digital costing process adopts an integrated information system design in which the process knowledge and the costing system are designed together. By employing and extending a widely used international standard, Industry Foundation Classes, the system can provide an integrated process that harvests the information and knowledge of current quantity surveying practice in costing methods and data. Knowledge of quantification is encoded from the literature, a motivating case and standards, which can reduce the time consumed by current manual practice. Further development will represent the pricing process in a Bayesian network based knowledge representation approach. The hybrid forms of knowledge representation can produce reliable estimates for construction projects. In practical terms, the knowledge management of quantity surveying can improve construction estimation systems. The theoretical significance of this study lies in the fact that its content and conclusions make it possible to develop an automatic estimation system based on a hybrid knowledge representation approach.

Relevance:

30.00%

Abstract:

The equations of Milsom are evaluated, giving the ground range and group delay of radio waves propagated via the horizontally stratified model ionosphere proposed by Bradley and Dudeney. Expressions for the ground range which allow for the effects of the underlying E- and F1-regions are used to evaluate the basic maximum usable frequency or M-factors for single F-layer hops. An algorithm for the rapid calculation of the M-factor at a given range is developed, and shown to be accurate to within 5%. The results reveal that the M(3000)F2-factor scaled from vertical-incidence ionograms using the standard URSI procedure can be up to 7.5% in error. A simple addition to the algorithm effects a correction to ionogram values to make these accurate to 0.5%.
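A drastically simplified illustration of the M-factor concept referred to above is the flat-earth secant law: for a reflecting layer at virtual height h′ (km) and ground range D (km), the ray's obliquity raises the usable frequency above the critical frequency by roughly M = sec(φ) = √(1 + (D/2h′)²). This ignores Earth curvature and the underlying E- and F1-regions, which is exactly why it overestimates realistic M(3000)F2 values and why formulations such as Milsom's are needed.

```python
import math

# Flat-earth secant-law sketch (NOT Milsom's formulation): the oblique
# M-factor for a mirror layer at virtual height h' km and range D km.

def m_factor(range_km, virtual_height_km):
    return math.sqrt(1.0 + (range_km / (2.0 * virtual_height_km)) ** 2)

def muf(fo_f2_mhz, range_km, virtual_height_km=300.0):
    """Maximum usable frequency as critical frequency times the M-factor."""
    return fo_f2_mhz * m_factor(range_km, virtual_height_km)

m3000 = m_factor(3000.0, 300.0)  # about 5.1 for a 300 km mirror layer
```

The flat-earth value of about 5.1 at 3000 km is well above typical scaled M(3000)F2 values, illustrating how strongly curvature and the stratified ionosphere reshape the factor and motivating the corrected algorithm the abstract describes.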