926 results for Engineering Process
Abstract:
In the tender process, contractors often rely on subcontract and supply enquiries to calculate their bid prices. However, this integral part of the bidding process is not empirically articulated in the literature. Over 30 published materials on the tendering process of contractors that discuss enquiries were reviewed and found to be based mainly on experiential knowledge rather than systematic evidence. The empirical research reported here describes the process of enquiries precisely, helps to improve it in practice, and provides some basis to support it in theory. Using a live participant-observation case study approach, the whole tender process was shadowed in the offices of two of the top 20 UK civil engineering construction firms. This helped to investigate 15 research questions on how contractors enquire and obtain prices from subcontractors and suppliers. Forty-three subcontract enquiries and 18 supply enquiries were made across two different projects with an average value of 7m. An average of 15 subcontract packages and seven supply packages was involved. Thus, two or three subcontractors or suppliers were invited to bid in each package. All enquiries were formulated by the estimator, with occasional involvement of three other personnel. Most subcontract prices were received in an average of 14 working days, while supply prices took five days. The findings show 10 main activities involved in processing enquiries and their durations, as well as wasteful practices associated with enquiries. Contractors should limit their enquiry invitations to a maximum of three per package and optimize the waiting time for quotations in order to improve cost efficiency.
Abstract:
Purpose – The purpose of this paper is to show the extent to which clients amend standard form contracts in practice, the locus of the amendments, and how contractors respond to the amendments when putting together a bid. Design/methodology/approach – Four live observational case studies were carried out in two of the top 20 UK construction firms. The whole process used to review the proposed terms and conditions of the contract was shadowed using participant observation, interviews and documentary analysis. Findings – All four cases showed strong evidence of amendments relating mostly to payment and contractual aspects: 83 amendments in Case Study 1 (CS1), 80 in CS2, 15 in CS3 and 29 in CS4. These comprised clauses that were modified (37 per cent), substituted (23 per cent), deleted (7 per cent) and newly added (33 per cent). Risks inherent in the amendments were mostly addressed through contractual rather than price mechanisms, to reflect commercial imperatives. “Qualifications” and “clarifications” were included in the tender submissions for post-tender negotiations. Thus, the amendments did not necessarily influence price. There was no evidence of a “standard-form contract” being used as such, although clients may draw on published “standard-form contracts” to derive the forms of contract actually used in practice. Practical implications – Contractors should pay attention to clauses relating to contractual and financial aspects when reviewing tender documents. Clients should draft equitable payment and contractual terms and conditions to reduce the risk of disputes. Indeed, it is prudent for clients not to pass on inestimable risks. Originality/value – A better understanding of the extent and locus of amendments in standard form contracts, and how contractors respond, is provided.
Abstract:
The management of information in engineering organisations faces a particular challenge in the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information in order to avoid information overload and to retain the right information for reuse. Using as a starting point a number of current tools and techniques which attempt to obtain ‘the value’ of information, it is proposed that an assessment or filter mechanism for information needs to be developed. This paper addresses the issue firstly by briefly reviewing the information overload problem, the definition of value, and related research on the value of information in various areas. A “characteristic”-based framework for information evaluation is then introduced, using the key characteristics identified from related work as an example. A Bayesian network diagram method is introduced into the framework to build the linkage between the characteristics and information value, so that the quality and value of information can be calculated quantitatively. The training and verification process for the model is then described using 60 real engineering documents as a sample. The model gives reasonably accurate results, and the differences between the model calculations and the training judgements are summarised and their potential causes discussed. Finally, several further issues are raised, including the challenges of the framework and the implementation of this evaluation method.
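As a rough illustration of the idea of linking document characteristics to an information-value score through a Bayesian network, the Python sketch below hand-rolls a tiny discrete network. The characteristic names, priors and conditional probabilities are purely hypothetical; the paper's actual characteristics and trained probabilities are not reproduced here.

```python
# Minimal sketch (not the authors' model): a hand-rolled discrete Bayesian
# network linking hypothetical document characteristics to an information
# "value" node. All names and probabilities are illustrative assumptions.
import itertools

# Hypothetical binary characteristics extracted from a document.
characteristics = ["up_to_date", "complete", "reusable"]

# Prior probability that each characteristic is "high" for a random document.
priors = {"up_to_date": 0.6, "complete": 0.5, "reusable": 0.4}

# P(value = high | characteristics): one entry per combination of parent
# states, keyed by a tuple of 0/1 flags in the order of `characteristics`.
cpt_value_high = {
    combo: 0.1 + 0.8 * (sum(combo) / len(characteristics))
    for combo in itertools.product([0, 1], repeat=len(characteristics))
}

def p_value_high(evidence):
    """Posterior P(value = high) given partial evidence, e.g. {"complete": 1}.
    Unobserved characteristics are marginalised over their priors."""
    total = 0.0
    for combo in itertools.product([0, 1], repeat=len(characteristics)):
        weight = 1.0
        consistent = True
        for name, state in zip(characteristics, combo):
            if name in evidence:
                if evidence[name] != state:
                    consistent = False
                    break
            else:
                weight *= priors[name] if state == 1 else 1 - priors[name]
        if consistent:
            total += weight * cpt_value_high[combo]
    return total

print(p_value_high({"complete": 1}))                      # one observed characteristic
print(p_value_high({"complete": 1, "up_to_date": 0}))     # mixed evidence lowers the score
```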
Abstract:
This paper presents ongoing research aimed at integrating process automation and process management support in the context of media production. This has been addressed on the basis of a holistic approach to software engineering, applied to media production modelling to ensure design correctness, completeness and effectiveness. The focus of the research and development has been to enhance metadata management throughout the process, in a similar fashion to that achieved in Decision Support Systems (DSS), to facilitate well-grounded business decisions. The paper sets out the aims, objectives and methodology deployed, describes the solution in some detail, and presents some preliminary conclusions and the planned future work.
An empirical study of process-related attributes in segmented software cost-estimation relationships
Abstract:
Parametric software effort estimation models consisting of a single mathematical relationship suffer from poor fit and poor predictive characteristics when the historical database considered contains data from projects of a heterogeneous nature. Segmenting the input domain according to clusters obtained from the database of historical projects serves as a tool for building more realistic models that use several local estimation relationships. Nonetheless, it may be hypothesized that using clustering algorithms without prior consideration of the influence of well-known project attributes misses the opportunity to obtain more realistic segments. In this paper, we describe the results of an empirical study, using the ISBSG-8 database and the EM clustering algorithm, of the influence of two process-related attributes as drivers of the clustering process: the use of engineering methodologies and the use of CASE tools. The results provide evidence that considering these attributes significantly conditions the final model obtained, even though the resulting predictive quality is of a similar magnitude.
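As a rough sketch of this kind of segmentation, the example below clusters a synthetic effort-estimation dataset with an EM-based Gaussian mixture, once on size and effort alone and once with two binary process-related attributes added as clustering drivers. The data, encodings and number of clusters are assumptions for illustration; the study itself uses the ISBSG-8 database.

```python
# Minimal sketch (synthetic data, not ISBSG-8): EM-based segmentation of an
# effort-estimation dataset with and without two binary process attributes
# (use of an engineering methodology, use of CASE tools) as clustering drivers.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n = 200
size_kloc = rng.lognormal(mean=3.0, sigma=0.8, size=n)      # project size (assumed)
uses_methodology = rng.integers(0, 2, size=n)                # 0/1 process attribute
uses_case_tools = rng.integers(0, 2, size=n)                 # 0/1 process attribute
# Effort loosely driven by size, cheaper when methodology / CASE tools are used.
effort = size_kloc * (8 - 2 * uses_methodology - 1.5 * uses_case_tools)
effort *= rng.lognormal(0, 0.2, size=n)

X_plain = np.column_stack([np.log(size_kloc), np.log(effort)])
X_attrs = np.column_stack([X_plain, uses_methodology, uses_case_tools])

for name, X in [("size/effort only", X_plain), ("with process attributes", X_attrs)]:
    gm = GaussianMixture(n_components=3, random_state=0).fit(X)
    labels = gm.predict(X)
    print(name, np.bincount(labels))   # cluster sizes differ between the two runs
```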
Abstract:
The construction sector is under growing pressure to increase productivity and improve quality, most notably in reports by Latham (1994, Constructing the Team, HMSO, London) and Egan (1998, Rethinking Construction, HMSO, London). A major problem for construction companies is the lack of project predictability. One method of increasing predictability and delivering increased customer value is the systematic management of construction processes. However, the industry has no methodological mechanism to assess process capability and prioritise process improvements. Standardized Process Improvement for Construction Enterprises (SPICE) is a research project that is attempting to develop a stepwise process improvement framework for the construction industry, utilizing experience from the software industry, in particular the Capability Maturity Model (CMM), which has delivered significant productivity improvements in that sector. This paper introduces SPICE concepts and presents the results from two case studies conducted on design and build projects. These studies have provided further insight into the relevance and accuracy of the framework, as well as its value for the construction sector.
Abstract:
The complexity of construction projects and the fragmentation of the construction industry undertaking those projects have effectively resulted in linear, uncoordinated and highly variable project processes in the UK construction sector. Research undertaken at the University of Salford resulted in the development of an improved project process, the Process Protocol, which considers the whole lifecycle of a construction project whilst integrating its participants under a common framework. The Process Protocol identifies the various phases of a construction project, with particular emphasis on what is described in the manufacturing industry as the ‘fuzzy front end’. The participants in the process are described in terms of the activities that need to be undertaken in order to achieve a successful project and process execution. In addition, the decision-making mechanisms are illustrated from a client perspective, and the foundations for a learning organization/industry are laid within a consistent Process Protocol.
Abstract:
The current research agenda for construction process improvement is heavily influenced by the rhetoric of business process re-engineering (BPR). In contrast to the wider literature on BPR, there is little evidence of critical thought within the construction management research community. A postmodernist interpretation is advocated whereby the reality of management practice is defined by the dominant management discourse. The persuasiveness of BPR rhetoric is analysed with particular reference to the way in which it plays on the insecurity of modern managers. Despite the lip service given to ‘empowerment’ and ‘teamwork’, the dominant theme of the re-engineering movement is that of technocratic totalitarianism. From a critical perspective, it is suggested that BPR is imposed on construction organizations to ensure continued control by the industry's dominant power groups. Whilst industry leaders are fond of calling for ‘attitudinal and cultural improvement’, the language of the accepted research agenda continually reinforces the industry's dominant culture of ‘control and command’. Therefore, current research directions in process improvement perpetuate existing attitudes rather than facilitating cultural change. The concept of lean construction is seen to be the latest manifestation of this phenomenon.
Abstract:
Dielectric properties of 16 process cheeses were determined over the frequency range 0.3-3 GHz. The effect of temperature on the dielectric properties of the process cheeses was investigated at intervals of 10 °C between 5 and 85 °C. Results showed that the dielectric constant (ε′) decreased gradually as frequency increased, for all cheeses. The dielectric loss factor (ε″) decreased from above 125 to below 12 as frequency increased. ε′ was highest at 5 °C and generally decreased up to a temperature between 55 and 75 °C. ε″ generally increased with increasing temperature for high and medium moisture/fat ratio cheeses. For the low moisture/fat ratio cheese, ε″ decreased with temperature between 5 and 55 °C and then increased. Partial least squares regression models indicated that ε′ and ε″ could be used in a quality control screening application to measure the moisture content and inorganic salt content of process cheese, respectively. (c) 2005 Elsevier Ltd. All rights reserved.
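To illustrate the type of chemometric model mentioned at the end of the abstract, the sketch below fits a partial least squares regression mapping dielectric spectra to moisture content. The spectra are synthetic and the frequency grid, sample counts and noise levels are assumptions; it is not the paper's calibration.

```python
# Minimal sketch (synthetic spectra, not the paper's data): PLS regression
# relating dielectric spectra to a quality attribute such as moisture content.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_samples, n_freqs = 80, 40
freqs_ghz = np.linspace(0.3, 3.0, n_freqs)

moisture = rng.uniform(40, 60, size=n_samples)             # % moisture (assumed range)
# Synthetic dielectric-constant spectra: decrease with frequency, scale with
# moisture, plus measurement noise.
spectra = (moisture[:, None] * (1.0 / (1.0 + freqs_ghz))[None, :]
           + rng.normal(0, 0.5, size=(n_samples, n_freqs)))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, moisture, test_size=0.25, random_state=0)

pls = PLSRegression(n_components=3).fit(X_train, y_train)
y_pred = pls.predict(X_test).ravel()
ss_res = np.sum((y_test - y_pred) ** 2)
ss_tot = np.sum((y_test - y_test.mean()) ** 2)
print("R^2 on held-out samples:", round(1 - ss_res / ss_tot, 3))
```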
Abstract:
Through multiple case studies of firms, we argue that firms that have developed corporate responsibility strategies, albeit informally at first, do so by making intentional, informed and collective choices about CSR initiatives. More precisely, we point to the importance of considering corporate identity in making these choices, and to the process of adaptive coordination, which includes both responding to and influencing the CSR environment. We conclude that the CSR strategic landscape is determined more and more by the astute and careful management of a network of cooperative and competitive stakeholder interests which possess both tangible and intangible value for a firm.
Abstract:
Whole-life thinking for engineers working on the built environment has become more important in a fast-changing world. Engineers are increasingly concerned with complex systems, in which the parts interact with each other and with the outside world in many ways – the relationships between the parts determine how the system behaves. Systems thinking, the process of understanding how things influence one another within a wider perspective, provides one approach to developing a more robust whole-life view. Complexity, chaos and risk are endemic in all major projects, and new approaches are needed to produce more reliable whole-life predictions. Best value, rather than lowest cost, can be achieved by using whole-life appraisal as part of the design and delivery strategy.
Abstract:
A new class of parameter estimation algorithms is introduced for Gaussian process regression (GPR) models. It is shown that integrating the GPR model with the probability distance measures of (i) the integrated square error and (ii) the Kullback–Leibler (K–L) divergence is analytically tractable. An efficient coordinate descent algorithm is proposed that iteratively estimates the kernel width using a golden section search, with a fast gradient descent algorithm as an inner loop to estimate the noise variance. Numerical examples are included to demonstrate the effectiveness of the new identification approaches.
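A rough Python sketch of this style of coordinate descent is given below: an outer golden-section search over the RBF kernel width alternates with an inner gradient-descent loop over the noise variance. For simplicity the objective is the standard negative log marginal likelihood on synthetic 1-D data, standing in for the probability-distance criteria used in the paper, so it illustrates the optimisation structure rather than the authors' algorithm.

```python
# Sketch of coordinate descent for GPR hyperparameters on synthetic data.
# Objective: negative log marginal likelihood (a stand-in for the paper's
# probability-distance measures). Kernel, bounds and step sizes are assumptions.
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 5, 40))
y = np.sin(2 * x) + rng.normal(0, 0.2, x.size)

def rbf(a, b, width):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / width) ** 2)

def neg_log_ml(width, noise_var):
    K = rbf(x, x, width) + noise_var * np.eye(x.size)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * x.size * np.log(2 * np.pi)

def golden_section(f, lo, hi, tol=1e-3):
    """Golden-section search for the minimiser of a unimodal f on [lo, hi]."""
    invphi = (np.sqrt(5) - 1) / 2
    a, b = lo, hi
    while b - a > tol:
        c, d = b - invphi * (b - a), a + invphi * (b - a)
        if f(c) < f(d):
            b = d
        else:
            a = c
    return (a + b) / 2

width, noise_var = 1.0, 0.1
for _ in range(5):                                       # outer coordinate-descent sweeps
    # (1) golden-section search over the kernel width with noise variance fixed
    width = golden_section(lambda w: neg_log_ml(w, noise_var), 0.05, 5.0)
    # (2) gradient descent on log(noise_var) with the kernel width fixed
    log_nv = np.log(noise_var)
    for _ in range(200):
        nv = np.exp(log_nv)
        K = rbf(x, x, width) + nv * np.eye(x.size)
        K_inv = np.linalg.inv(K)
        alpha = K_inv @ y
        grad_nv = 0.5 * (np.trace(K_inv) - alpha @ alpha)  # d(-log ML)/d(noise_var)
        log_nv -= 0.02 * grad_nv * nv                      # chain rule for log parameterisation
    noise_var = np.exp(log_nv)

print(f"estimated kernel width = {width:.3f}, noise variance = {noise_var:.3f}")
```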
Abstract:
The sustainable delivery of multiple ecosystem services requires the management of functionally diverse biological communities. In an agricultural context, an emphasis on food production has often led to a loss of biodiversity to the detriment of other ecosystem services such as the maintenance of soil health and pest regulation. In scenarios where multiple species can be grown together, it may be possible to better balance environmental and agronomic services through the targeted selection of companion species. We used the case study of legume-based cover crops to engineer a plant community that delivered the optimal balance of six ecosystem services: early productivity, regrowth following mowing, weed suppression, support of invertebrates, soil fertility building (measured as yield of following crop), and conservation of nutrients in the soil. An experimental species pool of 12 cultivated legume species was screened for a range of functional traits and ecosystem services at five sites across a geographical gradient in the United Kingdom. All possible species combinations were then analyzed, using a process-based model of plant competition, to identify the community that delivered the best balance of services at each site. In our system, low to intermediate levels of species richness (one to four species) that exploited functional contrasts in growth habit and phenology were identified as being optimal. The optimal solution was determined largely by the number of species and functional diversity represented by the starting species pool, emphasizing the importance of the initial selection of species for the screening experiments. The approach of using relationships between functional traits and ecosystem services to design multifunctional biological communities has the potential to inform the design of agricultural systems that better balance agronomic and environmental services and meet the current objective of European agricultural policy to maintain viable food production in the context of the sustainable management of natural resources.
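As a schematic illustration of the combinatorial screening step described above, the sketch below enumerates all mixtures from a candidate species pool and selects the one whose six ecosystem-service scores are best balanced. The species names, per-species scores and the balance criterion are all illustrative assumptions; the paper itself scores mixtures with a process-based model of plant competition.

```python
# Minimal sketch (illustrative scores, not the paper's competition model):
# enumerate species combinations and pick the mixture with the best-balanced
# set of ecosystem-service scores.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
services = ["early_productivity", "regrowth", "weed_suppression",
            "invertebrate_support", "fertility_building", "nutrient_conservation"]
species = [f"legume_{i}" for i in range(12)]         # hypothetical 12-species pool
# Hypothetical per-species score for each service (rows: species, cols: services).
scores = rng.uniform(0, 1, size=(len(species), len(services)))

def mixture_score(idx):
    """Balance criterion for a mixture: the worst of its mean service scores,
    standing in for the paper's process-based competition model."""
    return scores[list(idx)].mean(axis=0).min()

best = max(
    (idx for r in range(1, len(species) + 1)
     for idx in combinations(range(len(species)), r)),
    key=mixture_score,
)
print("best mixture:", [species[i] for i in best], round(mixture_score(best), 3))
```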