Abstract:
An overtly critical perspective on 're-engineering construction' is presented. It is contended that re-engineering is impossible to define in terms of its substantive content and is best understood as a rhetorical label. In recent years, the language of re-engineering has heavily shaped the construction research agenda. The declared goals are to lower costs and improve value for the customer. The discourse is persuasive because it reflects the ideology of the 'enterprise culture' and the associated rhetoric of customer responsiveness. Re-engineering is especially attractive to the construction industry because it reflects and reinforces the existing dominant way of thinking. The overriding tendency is to reduce organizational complexities to a mechanistic quest for efficiency. Labour is treated as a commodity. Within this context, the objectives of re-engineering become 'common sense'. Knowledge becomes subordinate to the dominant ideology of neo-liberalism. The accepted research agenda for re-engineering construction exacerbates the industry's problems and directly contributes to the casualization of the workforce. The continued adherence to machine metaphors by the construction industry's top management has directly contributed to the 'bad attitudes' and 'adversarial culture' that they repeatedly decry. Supposedly neutral topics such as pre-assembly, partnering, supply chain management and lean thinking serve only to justify the shift towards bogus labour-only subcontracting and the associated reduction of employment rights. The continued casualization of the workforce raises real questions about the industry's future capacity to deliver high-quality construction. In order to appear 'relevant' to the needs of industry, it seems that the research community is doomed to perpetuate this regressive cycle.
Abstract:
In the tender process, contractors often rely on subcontract and supply enquiries to calculate their bid prices. However, this integral part of the bidding process is not empirically articulated in the literature. More than 30 publications on contractors' tendering processes that discuss enquiries were reviewed and found to be based mainly on experiential knowledge rather than systematic evidence. The empirical research reported here helps to describe the enquiry process precisely, to improve it in practice, and to provide some basis for supporting it in theory. Using a live participant-observation case study approach, the whole tender process was shadowed in the offices of two of the top 20 UK civil engineering construction firms. This made it possible to investigate 15 research questions on how contractors enquire and obtain prices from subcontractors and suppliers. Forty-three subcontract enquiries and 18 supply enquiries were made across two different projects with an average value of 7m. An average of 15 subcontract packages and seven supply packages was involved; thus, two or three subcontractors or suppliers were invited to bid on each package. All enquiries were formulated by the estimator, with occasional involvement of three other personnel. Most subcontract prices were received in an average of 14 working days; supply prices took five days. The findings show 10 main activities involved in processing enquiries and their durations, as well as wasteful practices associated with enquiries. Contractors should limit their enquiry invitations to a maximum of three per package, and optimize the waiting time for quotations in order to improve cost efficiency.
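The "two or three per package" figure follows directly from the counts reported in the abstract. A minimal sketch of that arithmetic (the function name is illustrative, not from the paper):

```python
# Checking the per-package invitation averages implied by the study's figures:
# 43 subcontract enquiries over 15 packages, 18 supply enquiries over 7 packages.

def invitations_per_package(enquiries: int, packages: int) -> float:
    """Average number of bid invitations issued per package."""
    return enquiries / packages

subcontract = invitations_per_package(43, 15)  # ~2.87
supply = invitations_per_package(18, 7)        # ~2.57

# Both averages sit in the "two or three per package" range, consistent
# with the recommendation to cap invitations at three per package.
print(f"subcontract: {subcontract:.2f}, supply: {supply:.2f}")
```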
Abstract:
The management of information in engineering organisations faces a particular challenge in the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information, in order to avoid information overload and to retain the right information for reuse. Using a number of current tools and techniques which attempt to obtain ‘the value’ of information as a starting point, it is proposed that an assessment or filter mechanism for information be developed. This paper addresses the issue, firstly, by briefly reviewing the information overload problem, the definition of value, and related research on the value of information in various areas. A “characteristic”-based framework of information evaluation is then introduced, using the key characteristics identified from related work as an example. A Bayesian network method is introduced into the framework to build the linkage between the characteristics and information value, so that the quality and value of information can be calculated quantitatively. The training and verification process for the model is then described, using 60 real engineering documents as a sample. The model gives reasonably accurate results; the differences between the model's calculations and the training judgements are summarised and their potential causes discussed. Finally, several further issues are raised, including challenges to the framework and the implementation of this evaluation method.
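The idea of linking characteristics to information value through a Bayesian network can be illustrated with a minimal naive-Bayes sketch. The characteristic names, prior, and conditional probabilities below are illustrative assumptions, not values from the paper:

```python
# A minimal sketch: score a document's value from observed characteristics
# using a naive Bayesian model (characteristics conditionally independent
# given the value node). All probabilities here are made-up placeholders.

PRIOR_HIGH = 0.5  # P(value = high) before observing any characteristics

# P(characteristic present | value) for high- and low-value documents.
CPT = {
    "recent":   {"high": 0.8, "low": 0.4},
    "cited":    {"high": 0.7, "low": 0.2},
    "complete": {"high": 0.9, "low": 0.5},
}

def p_high_value(observed: dict) -> float:
    """Posterior P(value = high | observed characteristics)."""
    like_high, like_low = PRIOR_HIGH, 1.0 - PRIOR_HIGH
    for name, present in observed.items():
        p = CPT[name]
        like_high *= p["high"] if present else 1.0 - p["high"]
        like_low  *= p["low"]  if present else 1.0 - p["low"]
    return like_high / (like_high + like_low)

doc = {"recent": True, "cited": True, "complete": False}
print(f"P(high value) = {p_high_value(doc):.2f}")  # ~0.58
```

A real Bayesian network, as in the paper's framework, would also allow dependencies between characteristics; the naive form above is only the simplest special case.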
Abstract:
The development and performance of a three-stage tubular model of the large human intestine are outlined. Each stage comprises a membrane fermenter in which the flow of an aqueous polyethylene glycol solution on the outside of the tubular membrane is used to control the removal of water and metabolites (principally short-chain fatty acids) from, and thus the pH of, the flowing contents on the fermenter side. The three-stage system gave a fair representation of conditions in the human gut. Numbers of the main bacterial groups were consistently higher than in an existing three-chemostat gut model system, suggesting the advantages of the new design in providing an environment for bacterial growth that represents the actual colonic microflora. Concentrations of short-chain fatty acids and pH levels throughout the system were similar to those associated with the corresponding sections of the human colon. The model was able to achieve considerable water transfer across the membrane, although the values were not as high as those in the colon. The model thus goes some way towards a realistic simulation of the colon, although it makes no pretence of simulating the pulsating nature of the real flow. The flow conditions in each section are characterized by low Reynolds numbers: mixing due to Taylor dispersion is significant, and the implications of Taylor mixing and biofilm development for the stability of the system, that is, its ability to operate without washout, are briefly analysed and discussed. It is concluded that both phenomena are important for stabilizing the model and the human colon.
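The low-Reynolds-number, dispersion-dominated regime described above can be sketched with the standard pipe-flow Reynolds number and the Taylor–Aris effective dispersion coefficient. The parameter values below are assumptions chosen to be representative of a slow aqueous flow in a tubular fermenter; they are not taken from the paper:

```python
# Illustrative estimate of the flow regime in a slow tubular fermenter.
# All parameter values are placeholder assumptions, not from the study.

def reynolds(density, velocity, diameter, viscosity):
    """Pipe-flow Reynolds number Re = rho * U * d / mu."""
    return density * velocity * diameter / viscosity

def taylor_dispersion(velocity, radius, d_molecular):
    """Taylor-Aris effective axial dispersion coefficient for laminar
    tube flow: D_eff = D_m + U^2 a^2 / (48 D_m)."""
    return d_molecular + (velocity**2 * radius**2) / (48.0 * d_molecular)

rho, mu = 1000.0, 1e-3   # water-like fluid, SI units
u, d = 1e-4, 0.02        # 0.1 mm/s mean velocity in a 20 mm tube (assumed)
d_m = 1e-9               # small-molecule diffusivity, m^2/s

re = reynolds(rho, u, d, mu)             # Re = 2.0: deep in the laminar regime
d_eff = taylor_dispersion(u, d / 2, d_m)
print(f"Re = {re:.1f}, D_eff/D_m = {d_eff / d_m:.0f}")
```

Even at these very low velocities the effective axial dispersion exceeds molecular diffusion by orders of magnitude, which is why Taylor dispersion matters for washout and stability in such a system.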