884 results for Cost Estimation System
Abstract:
Data assimilation is a sophisticated mathematical technique for combining observational data with model predictions to produce state and parameter estimates that most accurately approximate the current and future states of the true system. The technique is commonly used in atmospheric and oceanic modelling, combining empirical observations with model predictions to produce more accurate and well-calibrated forecasts. Here, we consider a novel application within a coastal environment and describe how the method can also be used to deliver improved estimates of uncertain morphodynamic model parameters. This is achieved using a technique known as state augmentation. Earlier applications of state augmentation have typically employed the 4D-Var, Kalman filter or ensemble Kalman filter assimilation schemes. Our new method is based on a computationally inexpensive 3D-Var scheme, where the specification of the error covariance matrices is crucial for success. A simple 1D model of bed-form propagation is used to demonstrate the method. The scheme is capable of recovering near-perfect parameter values and, therefore, improves the capability of our model to predict future bathymetry. Such positive results suggest the potential for application to more complex morphodynamic models.
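As a hedged sketch of the state-augmentation idea described above (not the paper's bed-form model; every number below is an invented toy value), the following example augments a scalar state with an uncertain parameter and applies a single 3D-Var analysis step. The cross term in the background error covariance `B` is what allows an observation of the state alone to correct the parameter:

```python
import numpy as np

def var3d_update(z_b, B, H, R, y):
    """One 3D-Var analysis step: z_a = z_b + K (y - H z_b),
    with gain K = B H^T (H B H^T + R)^{-1}."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return z_b + K @ (y - H @ z_b)

# Augmented state z = [x, a]: a state value x and an uncertain parameter a.
true_a = 2.0
x_b, a_b = 1.0, 1.0                   # background (prior) guesses
z_b = np.array([x_b, a_b])

# Background error covariance: the off-diagonal term correlates state and
# parameter errors, which is what lets an observation of x correct a.
B = np.array([[1.0, 0.8],
              [0.8, 1.0]])
H = np.array([[1.0, 0.0]])            # we observe x only, never a
R = np.array([[0.01]])                # small observation error variance

# Synthetic observation: the "true" x is shifted by the parameter error.
y = np.array([x_b + (true_a - a_b)])

z_a = var3d_update(z_b, B, H, R, y)
```

Because `B` couples state and parameter errors, the analysed parameter moves from its background value 1.0 towards the true value 2.0 even though the parameter is never observed directly.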
Abstract:
By the turn of the twenty-first century, UNDP had embraced a new form of funding based on ‘cost-sharing’, with this source accounting for 51 per cent of the organisation’s total expenditure worldwide in 2000. Unlike the traditional donor–recipient relationship so common with development projects, the new cost-sharing modality has created a situation whereby UNDP local offices become ‘subcontractors’ and agencies of the recipient countries become ‘clients’. This paper explores this transition in the context of Brazil, focusing on how the new modality may have compromised UNDP’s ability to promote Sustainable Human Development, as established in its mandate. The great enthusiasm for this modality within the UN system and its potential application to other developing countries increase the importance of a systematic assessment of its impact and developmental consequences.
Abstract:
Real-time rainfall monitoring in Africa is of great practical importance for operational applications in hydrology and agriculture. Satellite data have been used in this context for many years because of the lack of surface observations. This paper describes an improved artificial neural network algorithm for operational applications. The algorithm combines numerical weather model information with the satellite data. Using this algorithm, daily rainfall estimates were derived for 4 yr of the Ethiopian and Zambian main rainy seasons and were compared with two other algorithms: a multiple linear regression making use of the same information as the neural network, and a satellite-only method. All algorithms were validated against rain gauge data. Overall, the neural network performs best, but the extent to which it does so depends on the calibration/validation protocol. The advantages of the neural network are most evident when calibration data are numerous and close in space and time to the validation data. This result emphasizes the importance of a real-time calibration system.
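The calibration comparison described above can be illustrated with a toy synthetic experiment (entirely hypothetical data, not the paper's gauge records): a multiple linear regression using both a satellite-derived and an NWP-derived predictor is compared with a satellite-only regression on held-out points:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic "truth": daily rainfall driven by both a satellite signal and a
# numerical-weather-model signal (all quantities hypothetical).
sat = rng.gamma(2.0, 2.0, n)          # satellite-derived predictor
nwp = rng.gamma(2.0, 2.0, n)          # NWP-derived predictor
rain = 0.6 * sat + 0.4 * nwp + rng.normal(0.0, 0.5, n)

train, test = slice(0, 400), slice(400, n)

def fit_lr(X, y):
    """Ordinary least squares with an intercept column."""
    X1 = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return coef

def predict(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

# Satellite-only calibration vs. a regression that also uses NWP information.
c_sat = fit_lr(sat[train, None], rain[train])
c_mlr = fit_lr(np.column_stack([sat, nwp])[train], rain[train])

rmse = lambda p, y: float(np.sqrt(np.mean((p - y) ** 2)))
rmse_sat = rmse(predict(c_sat, sat[test, None]), rain[test])
rmse_mlr = rmse(predict(c_mlr, np.column_stack([sat, nwp])[test]), rain[test])
```

On this synthetic data the regression that includes the NWP predictor has the lower validation RMSE, mirroring the benefit the abstract attributes to combining model and satellite information.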
Abstract:
This paper describes a case study of an electronic data management system developed in-house by the Facilities Management Directorate (FMD) of an educational institution in the UK. The FMD Maintenance and Business Services department is responsible for the maintenance of the built estate owned by the university. The department needs a clear definition of the type of work undertaken and of the administration that enables any maintenance work to be carried out, including the management of resources, budget, cash flow, and the workflow of reactive, preventative and planned maintenance of the campus. In order to support the business process more efficiently, the FMD decided to move from a paper-based information system to an electronic system, WREN. Some of the main advantages of WREN are that it is tailor-made to fit the purpose of the users; it is cost-effective when it comes to modifications on the system; and the database can also be used as a knowledge management tool. There is a trade-off: as WREN is tailored to the specific requirements of the FMD, it may not be easy to implement within a different institution without extensive modifications. However, WREN not only allows the FMD to carry out the tasks of maintaining and looking after the built estate of the university, but has also achieved its aim of minimising costs and maximising efficiency.
Abstract:
Asynchronous Optical Sampling (ASOPS) [1,2] and frequency comb spectrometry [3] based on dual Ti:sapphire resonators operated in a master/slave mode have the potential to improve the signal-to-noise ratio in THz transient and IR spectrometry. The multimode Brownian oscillator time-domain response function described by state-space models is a mathematically robust framework that can be used to describe the dispersive phenomena governed by Lorentzian, Debye and Drude responses. In addition, the optical properties of an arbitrary medium can be expressed as a linear combination of simple multimode Brownian oscillator functions. The suitability of a range of signal processing schemes adopted from the Systems Identification and Control Theory community for further processing the recorded THz transients in the time or frequency domain will be outlined [4,5]. Since a femtosecond-duration pulse is capable of persistent excitation of the medium within which it propagates, such an approach is well justified. Several de-noising routines based on system identification will be shown. Furthermore, specifically developed apodization structures will be discussed; these are necessary because, owing to dispersion, the time-domain background and sample interferograms are non-symmetrical [6-8]. These procedures can lead to a more precise estimation of the complex insertion loss function. The algorithms are applicable to femtosecond spectroscopies across the EM spectrum. Finally, a methodology for femtosecond pulse shaping using genetic algorithms, aiming to map and control molecular relaxation processes, will be mentioned.
Abstract:
The 'direct costs' attributable to 30 different endemic diseases of farm animals in Great Britain are estimated using a standardised method to construct a simple model for each disease that includes consideration of disease prevention and treatment costs. The models so far developed provide a basis for further analyses including cost-benefit analyses for the economic assessment of disease control options. The approach used reflects the inherent livestock disease information constraints, which limit the application of other economic analytical methods. It is a practical and transparent approach that is relatively easily communicated to veterinary scientists and policy makers. The next step is to develop the approach by incorporating wider economic considerations into the analyses in a way that will demonstrate to policy makers and others the importance of an economic perspective to livestock disease issues.
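A minimal sketch of a per-disease "direct cost" model of the kind described above, assuming an illustrative structure (production losses plus prevention and treatment expenditure); the structure and every figure are hypothetical, not taken from the paper's models:

```python
def direct_cost(population, prevalence, loss_per_case,
                prevention_per_head, treatment_per_case):
    """Simple per-disease 'direct cost' model (all inputs hypothetical):
    production losses plus prevention and treatment expenditure."""
    cases = population * prevalence
    output_loss = cases * loss_per_case          # production losses
    prevention = population * prevention_per_head  # spent on every animal
    treatment = cases * treatment_per_case       # spent on affected animals
    return output_loss + prevention + treatment

# Illustrative run: one million head, 2% prevalence.
cost = direct_cost(1_000_000, 0.02, 150.0, 1.2, 40.0)
```

The transparency the abstract emphasises comes from exactly this kind of decomposition: each term can be shown separately to policy makers before they are summed.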
Abstract:
Diabetes incurs heavy personal and health system costs. Self-management is required if complications are to be avoided. Adolescents face particular challenges as they learn to take responsibility for their diabetes. A systematic review of educational and psychosocial programmes for adolescents with diabetes was undertaken. This aimed to: identify and categorise the types of programmes that have been evaluated; assess the cost-effectiveness of interventions; and identify areas where further research is required. Sixty-two papers were identified and subjected to a narrative review. Generic programmes focus on knowledge/skills, psychosocial issues, and behaviour/self-management. They result in modest improvements across a range of outcomes, but improvements are often not sustained, suggesting a need for continuous support, possibly integrated into normal care. In-hospital education at diagnosis confers few advantages over home treatment. The greatest returns may be obtained by targeting poorly controlled individuals. Few studies addressed resourcing issues, and robust cost-effectiveness appraisals are required to identify interventions that generate the greatest returns on expenditure. (C) 2004 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Purpose – To evaluate the control strategy for a hybrid system of natural-ventilation wind catchers and air conditioning, and to assess the contribution of the wind catchers to indoor air environments and to energy savings, if any. Design/methodology/approach – Most of the modelling techniques for assessing wind catcher performance are theoretical. Post-occupancy evaluation (POE) studies of buildings provide an insight into the operation of these building components and help to inform facilities managers. A case study for POE is presented in this paper. Findings – Monitoring of the summer and winter month operations showed that the indoor air quality parameters were kept within the design target range. The design control strategy failed to record data regarding the operation, opening time and position of the wind catcher system. Although the implemented control strategy worked effectively in monitoring the operation of the mechanical ventilation systems, i.e. the AHU, it did not integrate the wind catchers with the mechanical ventilation system. Research limitations/implications – Owing to shortfalls in the control strategy implemented in this project, it was difficult to quantify and verify the contribution of the wind catchers to the internal conditions and, hence, to energy savings. Practical implications – Controlling the operation of the wind catchers via the AHU will lead to isolation of the wind catchers in the event of a malfunction of the AHU. Wind catchers will contribute to the ventilation of the space, particularly in the summer months. Originality/value – This paper demonstrates the value of POE as an indispensable tool for FM professionals. It further provides insight into the application of natural ventilation systems in buildings for healthier indoor environments at lower energy cost. The design of the control strategy for natural ventilation and air conditioning should be considered at the design stage, involving the FM personnel.
Abstract:
Modern buildings are designed to enhance the match between the environment, spaces and the people carrying out work, so that the well-being and the performance of the occupants are in harmony. Building services are systems that facilitate a healthy working environment within which workers' productivity can be optimised. However, the maintenance of these services is fraught with problems and may contribute up to 50% of the total life cycle cost of the building. Maintenance support is one area which is not usually designed into the system, as this is not common practice in the services industry. The other areas of shortfall for future designs are client requirements, commissioning, facilities management data and post-occupancy evaluation feedback, which need to be adequately planned so that this information is captured and documented for use in future designs. At the University of Reading an integrated approach has been developed to assemble the multitude of aspects inherent in this field. It records required and measured achievements for the benefit of both building owners and practitioners. This integrated approach can be represented in a Through Life Business Model (TLBM) format using the concept of Integrated Logistic Support (ILS). The prototype TLBM developed utilises the tailored tools and techniques of ILS for building services. This TLBM approach will facilitate the successful development of a databank that would be invaluable in capturing essential data (e.g. reliability of components) for enhancing future building services designs, life cycle costing and decision making by practitioners, in particular facilities managers.
Abstract:
Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward constrained regression manner. The leave-one-out (LOO) test score is used for kernel selection. The jackknife parameter estimator subject to positivity constraint check is used for the parameter estimation of a single parameter at each forward step. As such the proposed approach is simple to implement and the associated computational cost is very low. An illustrative example is employed to demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with comparable accuracy to that of the classical Parzen window estimate.
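A rough sketch of the construction above, with two simplifications loudly flagged: squared error to the PW target on a grid stands in for the LOO test score, and a clipped least-squares fit stands in for the jackknife single-parameter estimator with its positivity check. Data and kernel width are invented:

```python
import numpy as np

def gauss_kernel(x, c, h):
    """Gaussian kernel of width h centred at c."""
    return np.exp(-0.5 * ((x - c) / h) ** 2) / (h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 0.5, 100), rng.normal(2, 0.5, 100)])
h = 0.4
grid = np.linspace(-5, 5, 400)

# Classical Parzen window estimate: one kernel per data point.
pw = gauss_kernel(grid[:, None], data[None, :], h).mean(axis=1)

# Forward constrained regression: repeatedly add the data-point kernel that
# most reduces squared error to the PW target, refitting weights by least
# squares with a crude clip-to-zero positivity constraint check.
selected, max_kernels = [], 8
for _ in range(max_kernels):
    best_err, best_j = np.inf, None
    for j in range(len(data)):
        if j in selected:
            continue
        cand = selected + [j]
        Phi = gauss_kernel(grid[:, None], data[cand][None, :], h)
        w, *_ = np.linalg.lstsq(Phi, pw, rcond=None)
        w = np.clip(w, 0.0, None)          # positivity constraint check
        err = np.mean((Phi @ w - pw) ** 2)
        if err < best_err:
            best_err, best_j, best_w = err, j, w
    selected.append(best_j)

Phi = gauss_kernel(grid[:, None], data[selected][None, :], h)
sparse = Phi @ best_w
mse = float(np.mean((sparse - pw) ** 2))
```

The sparse estimator uses 8 kernels instead of 200, yet tracks the PW target closely, which is the trade-off the abstract describes.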
Abstract:
The relatively fast processing speed requirements of Wireless Personal Area Network (WPAN) consumer-based products are often in conflict with their low power and cost requirements. In order to resolve this conflict, the efficiency and cost-effectiveness of these products and their underlying functional modules become paramount. This paper presents a low-cost, simple, yet high-performance solution for the receiver channel estimator and equalizer for the Multiband OFDM (MB-OFDM) system, particularly directed at the WiMedia Consortium Physical Layer (ECMA-368) consumer implementation for Wireless-USB and Fast Bluetooth. In this paper, the receiver fixed-point performance is measured, and the results indicate excellent performance compared with the current literature.
Abstract:
Finding an estimate of the channel impulse response (CIR) by correlating a received known (training) sequence with the sent training sequence is commonplace. Where required, it is also common to truncate the longer correlation to a sub-set of correlation coefficients by finding the set of N sequential correlation coefficients with the maximum power. This paper presents a new approach to selecting the optimal set of N CIR coefficients from the correlation rather than relying on power. The algorithm reconstructs a set of predicted symbols using the training sequence and various sub-sets of the correlation, and finds the sub-set that results in the minimum mean squared error between the actual received symbols and the reconstructed symbols. The application of the algorithm is presented in the context of the TDMA-based GSM/GPRS system, and the results presented in the paper demonstrate an improvement in system performance with the new algorithm. The approach, however, lends itself to any training-sequence-based communication system, such as those often found in wireless consumer electronic devices.
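The selection idea can be sketched as follows (a synthetic channel and a random training sequence, not the actual GSM burst structure): estimate candidate CIR taps by correlation, then slide a length-N window over them and keep the sub-set whose reconstruction of the received symbols has minimum mean squared error:

```python
import numpy as np

rng = np.random.default_rng(2)
train = rng.choice([-1.0, 1.0], 64)              # known training sequence
h_true = np.zeros(8)
h_true[3:6] = [1.0, 0.6, 0.3]                    # true CIR, delayed by 3 taps
rx = (np.convolve(train, h_true)[:len(train)]
      + rng.normal(0.0, 0.01, len(train)))       # received symbols + noise

K, N = 8, 3
# Correlation-based estimate of the first K candidate CIR taps:
# g[k] = sum_n rx[n] * train[n-k] / ||train||^2.
corr = np.correlate(rx, train, mode='full')
g = corr[len(train) - 1:len(train) - 1 + K] / np.dot(train, train)

# Slide a length-N window over the K taps; reconstruct the received symbols
# from each candidate sub-set and keep the window with minimum MSE.
mses = {}
for s in range(K - N + 1):
    cir = np.zeros(K)
    cir[s:s + N] = g[s:s + N]
    rhat = np.convolve(train, cir)[:len(train)]
    mses[s] = float(np.mean((rx - rhat) ** 2))
best_start = min(mses, key=mses.get)
```

Windows that miss the strong tap reconstruct the received symbols poorly, so the MMSE criterion concentrates the N coefficients on the true channel energy.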
Apodisation, denoising and system identification techniques for THz transients in the wavelet domain
Abstract:
This work describes the use of a quadratic programming optimization procedure for designing asymmetric apodization windows to de-noise THz transient interferograms and compares these results to those obtained when wavelet signal processing algorithms are adopted. A systems identification technique in the wavelet domain is also proposed for the estimation of the complex insertion loss function. The proposed techniques can enhance the frequency dependent dynamic range of an experiment and should be of particular interest to the THz imaging and tomography community. Future advances in THz sources and detectors are likely to increase the signal-to-noise ratio of the recorded THz transients and high quality apodization techniques will become more important, and may set the limit on the achievable accuracy of the deduced spectrum.
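As a simple illustration of asymmetric apodisation, the sketch below joins a short rising half-Hann segment to a longer falling one; this hypothetical construction merely stands in for the paper's quadratic-programming-designed windows, which it does not reproduce:

```python
import numpy as np

def asymmetric_window(n_left, n_right):
    """Asymmetric apodisation window: a rising half-Hann of length n_left
    joined to a falling half-Hann of length n_right, so the taper can be
    matched to a non-symmetrical interferogram."""
    left = 0.5 * (1 - np.cos(np.pi * np.arange(n_left) / n_left))
    right = 0.5 * (1 + np.cos(np.pi * np.arange(n_right) / n_right))
    return np.concatenate([left, right])

# Short rise, long decay: the peak sits near the interferogram centre burst.
w = asymmetric_window(40, 160)
```

Multiplying a transient by such a window suppresses truncation ripple in the spectrum while keeping the taper aligned with where the signal energy actually lies.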
Abstract:
Hidden Markov Models (HMMs) have been successfully applied to many modelling and classification problems from different areas in recent years. An important step in using HMMs is the initialisation of the parameters of the model, as the subsequent learning of the HMM's parameters depends on these values. This initialisation should take into account knowledge about the addressed problem and also use optimisation techniques to estimate the best initial parameters given a cost function and, consequently, the best log-likelihood. This paper proposes the initialisation of Hidden Markov Model parameters using the optimisation algorithm Differential Evolution, with the aim of obtaining the best log-likelihood.
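A minimal DE/rand/1/bin sketch of the optimiser involved; the cost function here is a toy Gaussian negative log-likelihood standing in for an HMM log-likelihood, since a full Baum-Welch setup is beyond the scope of an abstract. All hyperparameters are illustrative:

```python
import numpy as np

def differential_evolution(cost, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=100, seed=0):
    """Minimal DE/rand/1/bin minimiser (a generic sketch, not the paper's
    implementation). Returns the best member and its cost."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = len(bounds)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    costs = np.array([cost(p) for p in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct members other than i.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover with at least one mutant component.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            c_trial = cost(trial)
            if c_trial <= costs[i]:            # greedy selection
                pop[i], costs[i] = trial, c_trial
    best = int(np.argmin(costs))
    return pop[best], costs[best]

# Toy stand-in for an HMM log-likelihood: fit the mean and std of Gaussian
# data by minimising the negative log-likelihood.
rng = np.random.default_rng(3)
data = rng.normal(1.5, 0.7, 400)

def neg_log_lik(theta):
    mu, sigma = theta
    return float(np.sum(0.5 * ((data - mu) / sigma) ** 2 + np.log(sigma)))

best, best_cost = differential_evolution(neg_log_lik, [(-5, 5), (0.05, 5)])
```

In the initialisation scheme the abstract describes, `neg_log_lik` would be replaced by the (negative) HMM log-likelihood of candidate initial parameters.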
Abstract:
Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward-constrained regression (FCR) manner. The proposed algorithm selects significant kernels one at a time, while the leave-one-out (LOO) test score is minimized subject to a simple positivity constraint at each forward stage. The model parameter estimation in each forward stage is simply the solution of the jackknife parameter estimator for a single parameter, subject to the same positivity constraint check. For each selected kernel, the associated kernel width is updated via the Gauss-Newton method with the model parameter estimate fixed. The proposed approach is simple to implement and the associated computational cost is very low. Numerical examples are employed to demonstrate the efficacy of the proposed approach.