974 results for 680400 Construction Processes
Abstract:
A Mass Customisation model is discussed as a competitive positioning strategy in the marketplace that adds value to the customer's end use. It includes the user as part of the construction process, responding to the customer's demands and wishes. To date, almost all proposals for Mass Customisation have focused on the design phase and on single-family houses. In reality, the processes carried out during work execution are so inefficient that the costs of Mass Customisation models are passed on to the customer, and these models offer no solutions to support change management. Furthermore, this inefficiency often makes Mass Customisation unfeasible in terms of deadlines and site management. The present proposal therefore focuses on achieving the paradigm of Mass Customisation in traditional residential construction, complementing the existing proposals for the design phase. It does so through a framework for integral management of work execution that addresses the changes introduced by users and offers an efficient, productive model that reduces process costs. The model focuses on the synergy between strategies, techniques and technologies currently used in construction management (such as Lean Construction and Six Sigma) and other strategies and technologies that have proven to be valid solutions in other fields (such as Business Process Management and Service Oriented Architecture).
Abstract:
The construction industry has long been considered a highly fragmented and non-collaborative industry. This fragmentation stems from the complex and unstructured traditional coordination processes and information exchanges amongst all parties involved in a construction project. This nature, coupled with risk and uncertainty, has pushed clients and their supply chains to search for new ways of improving their business processes to deliver better quality and higher-performing products. This research closely investigates the need to implement a Digital Nervous System (DNS), analogous to a biological nervous system, for the flow and management of digital information across the project lifecycle. It does so through direct examination of the key processes and information produced in a construction project and of how a DNS can provide a well-integrated flow of digital information throughout the project lifecycle. The research also investigates how a DNS can create a tight digital feedback loop that enables the organisation to sense, react and adapt to changing project conditions. A Digital Nervous System is a digital infrastructure that provides a well-integrated flow of digital information to the right part of the organisation at the right time. It provides the organisation with the relevant and up-to-date information it needs on critical project issues, to aid near real-time decision-making. A literature review and survey questionnaires were used in this research to collect and analyse data about the industry's information management problems, for example disruption and discontinuity of digital information flow due to interoperability issues, disintegration and fragmentation of the adopted digital solutions, and paper-based transactions. Analysis of the results revealed that efficient and effective information management requires the creation and implementation of a DNS.
Abstract:
The construction industry is characterised by fragmentation and suffers from a lack of collaboration, often adopting adversarial working practices to achieve deliverables. For the UK Government and the construction industry, BIM is a game changer that aims to rectify this fragmentation and promote collaboration. However, it has become clear that there is an essential need for better controls and definitions of both data deliverables and data classification. Traditional methods and techniques for collating and inputting data have been shown to be time-consuming and to do little to improve or add value to the overall task of improving deliverables. Hence the need arose in the industry to develop a Digital Plan of Work (DPoW) toolkit that would aid the decision-making process, provide the required control over project workflows and data deliverables, and enable better collaboration through transparency of need and delivery. The specification for the existing Digital Plan of Work (DPoW) was for an industry-standard method of describing geometric, requirements and data deliveries at key stages of the project cycle, with the addition of a structured and standardised information classification system. However, surveys and interviews conducted within this research indicate that the current DPoW resembles a digitised version of the pre-existing plans of work and does not push towards the data-enriched decision-making abilities that advancements in technology now offer. A Digital Framework is not simply the digitisation of current or historic standard methods and procedures; it is a new, intelligence-driven digital system that uses new tools, processes, procedures and workflows to eradicate waste and increase efficiency. In addition to reporting on the surveys above, this paper presents a theoretical investigation into the use of Intelligent Decision Support Systems within a digital plan of work framework. Furthermore, it presents findings on the suitability of advancements in intelligent decision-making frameworks and Artificial Intelligence for a UK BIM Framework. This should form the foundation of decision-making for projects implemented at BIM Level 2. The gap identified in this paper is that the current digital toolkit does not incorporate the intelligent characteristics available in other industries through advancements in technology, despite the vast amounts of data that a digital plan of work framework could access and from which it could develop, learn and adapt its decision-making through the live interaction of project stakeholders.
Abstract:
National Highway Traffic Safety Administration, Office of Research and Development, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Office of Research and Development, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Office of Research and Development, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Office of Research and Development, Washington, D.C.
Abstract:
"EPA/600/9-85/024a."
Abstract:
Let $Q$ be a stable and conservative $Q$-matrix over a countable state space $S$ consisting of an irreducible class $C$ and a single absorbing state $0$ that is accessible from $C$. Suppose that $Q$ admits a finite $\mu$-subinvariant measure $m$ on $C$. We derive necessary and sufficient conditions for there to exist a $Q$-process for which $m$ is $\mu$-invariant on $C$, as well as a necessary condition for the uniqueness of such a process.
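For reference, the standard conventions behind these notions (a sketch of the usual definitions, which the abstract itself does not spell out) are:

```latex
% Sketch of the standard definitions (not stated explicitly in the abstract):
% m = (m_j, j \in C) is \mu-subinvariant for Q on C if
\[
  \sum_{i \in C} m_i \, q_{ij} \le -\mu \, m_j , \qquad j \in C ,
\]
% and \mu-invariant for a Q-process with transition function P(t) = (p_{ij}(t)) if
\[
  \sum_{i \in C} m_i \, p_{ij}(t) = e^{-\mu t} m_j , \qquad j \in C ,\ t \ge 0 .
\]
```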
Abstract:
Let $S$ be a countable set and let $Q = (q_{ij},\ i, j \in S)$ be a conservative $q$-matrix over $S$ with a single instantaneous state $b$. Suppose that we are given a real number $\mu \ge 0$ and a strictly positive probability measure $m = (m_j,\ j \in S)$ such that $\sum_{i \in S} m_i q_{ij} = -\mu m_j$, $j \ne b$. We prove that there exists a $Q$-process $P(t) = (p_{ij}(t),\ i, j \in S)$ for which $m$ is a $\mu$-invariant measure, that is, $\sum_{i \in S} m_i p_{ij}(t) = e^{-\mu t} m_j$, $j \in S$. We illustrate our results with reference to the Kolmogorov 'K1' chain and a birth-death process with catastrophes and instantaneous resurrection.
Abstract:
Process optimisation and optimal control of batch and continuous drum granulation processes are studied in this paper. The main focus of the current research has been: (i) construction of optimisation- and control-relevant population balance models through the incorporation of moisture content, drum rotation rate and bed depth into the coalescence kernels; (ii) investigation of optimal operational conditions using constrained optimisation techniques; (iii) development of optimal control algorithms based on discretised population balance equations; and (iv) comprehensive simulation studies on optimal control of both batch and continuous granulation processes. The objective of steady-state optimisation is to minimise the recycle rate with minimum cost for continuous processes. It has been identified that the drum rotation rate, bed depth (material charge) and moisture content of solids are practical decision (design) parameters for system optimisation. The objective for the optimal control of batch granulation processes is to maximise the mass of product-sized particles with minimum time and binder consumption. The objective for the optimal control of the continuous process is to drive the process from one steady state to another in minimum time with minimum binder consumption, which is also known as the state-driving problem. It has been known for some time that the binder spray rate is the most effective control (manipulative) variable. Although other possible manipulative variables, such as feed flow-rate and additional powder flow-rate, have been investigated in the complete research project, only the single-input problem with the binder spray rate as the manipulative variable is addressed in this paper to demonstrate the methodology. Simulation results show that the proposed models are suitable for control and optimisation studies, and that the optimisation algorithms connected with either steady-state or dynamic models successfully determine optimal operational conditions and dynamic trajectories with good convergence properties.
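To make the modelling ingredient concrete, the sketch below shows a discretised population balance for batch coalescence with a constant kernel and explicit Euler time-stepping. It is only illustrative: the kernel value, bin count and step size are assumptions, and the paper's kernels additionally incorporate moisture content, drum rotation rate and bed depth.

```python
import numpy as np

# Illustrative sketch: discretised population balance for batch coalescence
# with a constant kernel beta0 (the paper's kernels also depend on moisture,
# rotation rate and bed depth). n[k] is the number of particles of size k+1.
def batch_coalescence(n0, beta0=1e-3, dt=0.1, steps=100):
    n = np.asarray(n0, dtype=float).copy()
    N = len(n)
    for _ in range(steps):
        dn = np.zeros(N)
        for k in range(N):
            # birth: coalescence of two smaller particles whose sizes sum to k+1
            birth = 0.5 * sum(n[i] * n[k - 1 - i] for i in range(k))
            # death: particles of size k+1 coalescing with any other particle
            death = n[k] * n.sum()
            dn[k] = beta0 * (birth - death)
        n += dt * dn            # explicit Euler step
    return n

# Example: start from a monodisperse charge of unit count in the smallest bin
n0 = np.zeros(20)
n0[0] = 1.0
print(batch_coalescence(n0)[:5])
```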
Abstract:
The enhanced biological phosphorus removal (EBPR) process is regularly used for the treatment of wastewater, but suffers from erratic performance. Successful EBPR relies on the growth of bacteria called polyphosphate-accumulating organisms (PAOs), which store phosphorus intracellularly as polyphosphate, thus removing it from wastewater. Metabolic models have been proposed which describe the measured chemical transformations; however, genetic evidence is lacking to confirm these hypotheses. The aim of this research was to generate a metagenomic library from biomass enriched in PAOs, as determined by phenotypic data and by fluorescence in situ hybridisation (FISH) using probes specific for the only described PAO to date, Candidatus Accumulibacter phosphatis. DNA extraction methods were optimised and two fosmid libraries were constructed, containing 93 million base pairs of metagenomic data. Initial screening of the library for 16S rRNA genes revealed fosmids originating from a range of non-pure-cultured wastewater bacteria. The metagenomic libraries constructed will provide the ability to link phylogenetic and metabolic data for bacteria involved in nutrient removal from wastewater. Keywords: DNA extraction; EBPR; metagenomic library; 16S rRNA gene.
Abstract:
This paper reports preliminary progress on a principled approach to modelling nonstationary phenomena using neural networks. We are concerned with both parameter estimation and model order (complexity) estimation. The basic methodology assumes a Bayesian foundation. However, to allow the construction of pragmatic models, successive approximations have to be made to permit computational tractability. The lowest order corresponds to the (Extended) Kalman filter approach to parameter estimation, which has already been applied to neural networks. We illustrate some of the deficiencies of the existing approaches and discuss our preliminary generalisations by considering the application to nonstationary time series.
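As a concrete illustration of the lowest-order case, a minimal sketch of an Extended Kalman filter used to track the weights of a tiny one-unit network online follows. The model, the random-walk drift term and the noise values are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

# Illustrative sketch: EKF update for the weights w = (w1, w0) of the model
# y = tanh(w1*x + w0) + noise, with a random-walk drift on w to capture
# nonstationarity. All constants are assumed values.
def ekf_step(w, P, x, y, q=1e-4, r=0.1):
    phi = np.array([x, 1.0])
    a = phi @ w
    y_hat = np.tanh(a)
    H = (1.0 - np.tanh(a) ** 2) * phi      # Jacobian of y_hat w.r.t. w
    P = P + q * np.eye(2)                  # covariance inflation (weight drift)
    S = H @ P @ H + r                      # innovation variance (scalar)
    K = P @ H / S                          # Kalman gain
    w = w + K * (y - y_hat)                # weight update
    P = P - np.outer(K, H) @ P             # posterior covariance
    return w, P

# Example: track a slowly drifting target function online
rng = np.random.default_rng(0)
w, P = np.zeros(2), np.eye(2)
for t in range(200):
    x = rng.uniform(-2, 2)
    y = np.tanh((1.5 + 0.002 * t) * x) + 0.05 * rng.standard_normal()
    w, P = ekf_step(w, P, x, y)
print(w)
```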
Abstract:
We develop an approach for sparse representations of Gaussian Process (GP) models (which are Bayesian types of kernel machines) in order to overcome their limitations for large data sets. The method is based on a combination of a Bayesian online algorithm with a sequential construction of a relevant subsample of the data which fully specifies the prediction of the GP model. By using an appealing parametrisation and projection techniques that use the RKHS norm, recursions for the effective parameters and a sparse Gaussian approximation of the posterior process are obtained. This allows for the propagation of both predictions and Bayesian error measures. The significance and robustness of our approach are demonstrated on a variety of experiments.
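A minimal sketch of the underlying idea, a GP whose prediction is fully specified by a small subsample of the data, is shown below using a fixed basis set and a projected-process style mean. The kernel, noise level and subsample choice are illustrative assumptions; the method described in the abstract selects the subsample sequentially online.

```python
import numpy as np

# Illustrative sketch: GP regression whose predictive mean depends only on a
# small "basis" subsample of the training data (projected-process style).
# Kernel length-scale, noise and basis selection are assumed, not the paper's.
def rbf(A, B, ell=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

def sparse_gp_mean(X, y, X_basis, X_query, noise=0.1):
    Ksx = rbf(X_basis, X)                      # basis x training covariances
    Kss = rbf(X_basis, X_basis)                # basis x basis covariances
    A = Ksx @ Ksx.T + noise ** 2 * Kss         # projected-process system matrix
    alpha = np.linalg.solve(A, Ksx @ y)        # effective parameters
    return rbf(X_query, X_basis) @ alpha       # predictive mean at query points

# Example: 200 noisy samples of sin(x); prediction specified by 20 basis points
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
print(sparse_gp_mean(X, y, X_basis=X[:20], X_query=np.array([[0.0], [1.5]])))
```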