974 results for 680400 Construction Processes
Abstract:
Several parties (stakeholders) are involved in a construction project. The conventional Risk Management Process (RMP) manages risks from a single-party perspective, which does not give adequate consideration to the needs of the other parties. The objective of multi-party risk management is to assist decision-makers in managing risk systematically and efficiently in a multi-party environment. The Multi-party Risk Management Process (MRMP) consists of risk identification, structuring, analysis and the development of responses from all parties' perspectives. The MRMP has been applied to a cement plant construction project in Thailand to demonstrate its effectiveness.
Abstract:
Purpose - The purpose of the paper is to identify the risk factors that affect oil and gas construction projects in Vietnam and to derive risk responses. Design/methodology/approach - A questionnaire survey was conducted with project executives of PetroVietnam, and statistical analysis was carried out to identify the major project risks. Subsequently, mitigating measures were derived through informal interviews with various levels of management at PetroVietnam. Findings - A bureaucratic government system and long project approval procedures, poor design, incompetence of the project team, inadequate tendering practices, and late internal approval processes from the owner were identified as major risks. The executives suggested various strategies to mitigate the identified risks. Reforming the government system, effective partnership with foreign collaborators, training project executives, implementing contractor evaluation using a multiple-criteria decision-making technique, and enhancing the authority of project people were suggested as viable approaches. Practical implications - The improvement measures derived in this study would improve the chances of project success in the oil and gas industry in Vietnam. Originality/value - There are several risk management studies on managing projects in developing countries. However, as risk factors vary considerably across industries and countries, this study of risk management for successful projects in the oil and gas industry in Vietnam is unique and has tremendous importance for effective project management.
Abstract:
Conventional project management techniques are not always sufficient for ensuring the time, cost and quality achievement of large-scale construction projects, due to complexity in the planning and implementation processes. The main reasons for project non-achievement are changes in scope and design, changes in government policies and regulations, unforeseen inflation, under-estimation and improper estimation. Projects that are exposed to such an uncertain environment can be effectively managed with the application of risk management throughout the project life cycle. However, the effectiveness of risk management depends on the technique by which the effects of risk factors are analysed and/or quantified. This study proposes the Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, as a tool for risk analysis because it can handle subjective as well as objective factors in a decision model that are conflicting in nature. This provides a decision support system (DSS) to project management for making the right decision at the right time, ensuring project success in line with organisation policy, project objectives and the competitive business environment. The whole methodology is explained through a case study of a cross-country petroleum pipeline project in India, and its effectiveness in project management is demonstrated.
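The AHP step described above reduces to deriving priority weights from a pairwise comparison matrix and checking judgment consistency. A minimal sketch in Python (the three risk factors and the 1-9 scale judgments below are hypothetical illustrations, not values from the pipeline case study):

```python
import math

def ahp_weights(matrix):
    """Approximate the AHP priority vector by the normalised
    geometric mean of each row of the pairwise comparison matrix."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

def consistency_ratio(matrix, weights):
    """Saaty's consistency ratio: CR = CI / RI, with
    CI = (lambda_max - n) / (n - 1). CR < 0.1 is acceptable."""
    n = len(matrix)
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / weights[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random indices
    return ci / ri

# Hypothetical judgments comparing three risks:
# scope/design change vs. policy change vs. estimation error
A = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(A)          # priority weights, sum to 1
cr = consistency_ratio(A, w)  # should be well below 0.1
```

The geometric-mean approximation is used here instead of the exact principal eigenvector purely to keep the sketch dependency-free; for a consistent matrix the two coincide.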
Abstract:
Conventional project management techniques are not always sufficient to ensure that schedule, cost and quality goals are met on large-scale construction projects. These jobs require complex planning, designing and implementation processes. The main reasons for a project's non-achievement in India's hydrocarbon processing industry are changes in scope and design, altered government policies and regulations, unforeseen inflation, and under- and/or improper estimation. Projects that are exposed to such an uncertain environment can be effectively managed by applying risk management throughout the project life cycle.
Abstract:
The Securities and Exchange Commission (SEC) in the United States, and in particular its immediate past chairman, Christopher Cox, has been actively promoting an upgrade of the EDGAR system of disseminating filings. The new generation of information provision has been dubbed by Chairman Cox "Interactive Data" (SEC, 2006). In October this year the Office of Interactive Disclosure was created (http://www.sec.gov/news/press/2007/2007-213.htm). The focus of this paper is to examine the way in which the non-professional investor has been constructed by various actors. We examine the manner in which Interactive Data has been sold as the panacea for financial market 'irregularities' by the SEC and others. The academic literature shows almost no evidence of researching non-professional investors in any real sense (Young, 2006). Both this literature and the behaviour of representatives of institutions such as the SEC and the FSA appear to find it convenient to construct this class of investor in a particular form and to speak for them. We theorise the activities of the SEC, and of its chairman in particular, over a period of about three years, both prior to and following the 'credit crunch'. Our approach is to examine a selection of the policy documents released by the SEC and other interested parties, and the statements made by some of the policy makers and regulators central to the programme, to advance the socio-technical project that is constituted by Interactive Data. We adopt insights from ANT, and more particularly the sociology of translation (Callon, 1986; Latour, 1987, 2005; Law, 1996, 2002; Law & Singleton, 2005), to show how individuals and regulators have acted as spokespersons for this malleable class of investor. We theorise the processes of accountability to investors and others, and in so doing reveal the regulatory bodies taking the regulated for granted.
The possible implications of technological developments in digital reporting have also been identified by the CEOs of the six biggest audit firms in a discussion document on the role of accounting information and audit in the future of global capital markets (DiPiazza et al., 2006). The potential for digital reporting enabled through XBRL to "revolutionize the entire company reporting model" (p. 16) is discussed, and they conclude that the new model "should be driven by the wants of investors and other users of company information,..." (p. 17; emphasis in the original). Here, rather than examine the somewhat elusive and vexing question of whether adding interactive functionality to 'traditional' reports can achieve the benefits claimed for non-professional investors, we wish to consider the rhetorical and discursive moves in which the SEC and others have engaged to present such developments as providing clearer reporting and accountability standards and serving the interests of this constructed and largely unknown group - the non-professional investor.
Abstract:
A system-compositional approach to the construction and study of models of the informational processes that take place in biological hierarchical neural networks is discussed. A computer toolbox has been developed for solving tasks in this scientific field. A series of computational experiments investigating the operation of this toolbox on an olfactory bulb model has been carried out. Well-known psychophysical phenomena were reproduced in the experiments.
Abstract:
Summarizing long-accumulated experience in the polyparametric cognitive modeling of different physiological processes (electrocardiogram, electroencephalogram, electroreovasogram and others), together with the diagnostic methods developed on this basis, provides grounds for formulating a new methodology of system analysis in biology. The gist of the methodology is the parametrization of fractals of electrophysiological processes, the matrix description of the functional state of an object with a unified set of parameters, and the construction of a polyparametric cognitive geometric model using artificial-intelligence algorithms. The geometric model displays the parameter relationships in a way that is adequate to the requirements of the system approach. The objective character of the model elements and the high degree of formalization, which facilitates the use of mathematical methods, are advantages of these models; at the same time, the geometric images are easily interpreted in physiological and clinical terms. Polyparametric modeling is an object-oriented tool with advanced functional facilities and several distinctive features.
Abstract:
2000 Mathematics Subject Classification: 49L20, 60J60, 93E20
Abstract:
Since wind has an intrinsically complex and stochastic nature, accurate wind power forecasts are necessary for the safety and economics of wind energy utilization. In this paper, we investigate a combination of numeric and probabilistic models: one-day-ahead wind power forecasts were made with Gaussian Processes (GPs) applied to the outputs of a Numerical Weather Prediction (NWP) model. First, the wind speed data from the NWP model were corrected by a GP. Then, as there is always a defined limit on the power generated by a wind turbine due to the turbine control strategy, a Censored GP was used to model the relationship between the corrected wind speed and the power output. To validate the proposed approach, two real-world datasets were used for model construction and testing. The simulation results were compared with the persistence method and Artificial Neural Networks (ANNs); the proposed model achieves about 11% improvement in forecasting accuracy (Mean Absolute Error) compared to the ANN model on one dataset, and nearly 5% improvement on another.
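The speed-to-power stage above can be sketched with a bare-bones GP regressor. The kernel settings, toy data and rated-power limit below are illustrative assumptions, and the hard clipping at the end is only a crude stand-in for a true Censored GP (which treats the limit probabilistically rather than by truncation):

```python
import math

def rbf(x1, x2, ls=1.0, var=1.0):
    """Squared-exponential (RBF) kernel."""
    return var * math.exp(-0.5 * ((x1 - x2) / ls) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, xq, noise=1e-4, rated=1.0):
    """GP posterior mean at xq, clipped to [0, rated power]."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, ys)                       # alpha = K^{-1} y
    mean = sum(alpha[i] * rbf(xs[i], xq) for i in range(n))
    return min(max(mean, 0.0), rated)          # crude censoring at rated power

# Toy training set: (corrected wind speed in m/s, normalised power) pairs
speeds = [0.0, 2.0, 4.0, 6.0, 8.0]
powers = [0.0, 0.05, 0.30, 0.70, 0.95]
pred = gp_predict(speeds, powers, 4.0)  # ≈ 0.30 at a training point
```

A production forecaster would learn the kernel hyperparameters from data and propagate the GP's predictive variance through the censoring step; both are omitted here for brevity.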
Abstract:
Identity influences the practice of English language teachers and supervisors, their professional development and their ability to incorporate innovation and change. Talk during post observation feedback meetings provides participants with opportunities to articulate, construct, verify, contest and negotiate identities, processes which often engender issues of face. This study examines the construction and negotiation of identity and face in post observation feedback meetings between in-service English language teachers and supervisors at a tertiary institution in the United Arab Emirates. Within a linguistic ethnography framework, this study combined linguistic microanalysis of audio recorded feedback meetings with ethnographic data gathered from participant researcher knowledge, pre-analysis interviews and post-analysis participant interpretation interviews. Through a detailed, empirical description of situated ‘real life’ institutional talk, this study shows that supervisors construct identities involving authority, power, expertise, knowledge and experience while teachers index identities involving experience, knowledge and reflection. As well as these positive valued identities, other negative, disvalued identities are constructed. Identities are shown to be discursively claimed, verified, contested and negotiated through linguistic actions. This study also shows a link between identity and face. Analysis demonstrates that identity claims verified by an interactional partner can lead to face maintenance or support. However, a contested identity claim can lead to face threat which is usually managed by facework. Face, like identity, is found to be interactionally achieved and endogenous to situated discourse. Teachers and supervisors frequently risk face threat to protect their own identities, to contest their interactional partner’s identities or to achieve the feedback meeting goal i.e. improved teaching. 
Both identity and face are found to be consequential to feedback talk and therefore influence teacher development, teacher/supervisor relationships and the acceptance of feedback. Analysis highlights the evaluative and conforming nature of feedback in this context which may be hindering opportunities for teacher development.
Abstract:
Presently, monoethanolamine (MEA) remains the industrial standard solvent for CO2 capture processes. Operating issues relating to corrosion and degradation of MEA at high temperatures and concentrations, and in the presence of oxygen, in a traditional PCC process have introduced the requirement for higher-quality and costly stainless steels in the construction of capture equipment and for the use of oxygen scavengers and corrosion inhibitors. While capture processes employing MEA have improved significantly in recent times, there is a continued attraction towards alternative solvent systems which offer further improvements. This movement includes aqueous amine blends, which are gaining momentum as new-generation solvents for CO2 capture processes. Given the exhaustive array of amines available to date, endless opportunities exist to tune and tailor a solvent to deliver specific performance and physical properties in line with a desired capture process. The current work is focussed on the rationalisation of CO2 absorption behaviour in a series of aqueous amine blends incorporating monoethanolamine, N,N-dimethylethanolamine (DMEA), N,N-diethylethanolamine (DEEA) and 2-amino-2-methyl-1-propanol (AMP) as solvent components. Mass transfer/kinetic measurements have been performed using a wetted wall column (WWC) contactor at 40°C for a series of blends in which the blend properties, including amine concentration, blend ratio, and CO2 loadings from 0.0-0.4 (moles CO2/total moles amine), were systematically varied and assessed.
Equilibrium CO2 solubility in each of the blends has been estimated using a software tool developed in Matlab for the prediction of vapour-liquid equilibrium, using a combination of the known chemical equilibrium reactions and constants for the individual amine components which have been combined into a blend. From the CO2 mass transfer data, the largest absorption rates were observed in blends containing 3M MEA/3M Am2, while the selection of the Am2 component had only a marginal impact on mass transfer rates. Overall, CO2 mass transfer in the fastest blends containing 3M MEA/3M Am2 was found to be only slightly lower than in a 5M MEA solution at similar temperatures and CO2 loadings. In terms of equilibrium behaviour, a slight decrease in absorption capacity (moles CO2/mole amine) with increasing Am2 concentration in the blends with MEA was observed, while cyclic capacity followed the opposite trend. Significant increases in cyclic capacity (26-111%) were observed in all blends when compared to MEA solutions at similar temperatures and total amine concentrations. In view of the reasonable compromise between CO2 absorption rate and capacity, a blend containing 3M MEA and 3M AMP would represent a reasonable alternative to 5M MEA as a standalone solvent.
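The cyclic-capacity comparison above can be illustrated with a quick calculation: cyclic capacity is the rich/lean loading difference scaled by total amine concentration. All loadings and concentrations below are hypothetical placeholders chosen only to show the arithmetic, not measured values from the study:

```python
def cyclic_capacity(rich, lean, conc):
    """Cyclic capacity in mol CO2 per litre of solvent:
    (rich loading - lean loading) * total amine concentration (mol/L)."""
    return (rich - lean) * conc

# Hypothetical loadings (mol CO2 / mol amine) for illustration only
mea = cyclic_capacity(rich=0.50, lean=0.28, conc=5.0)    # 5M MEA baseline
blend = cyclic_capacity(rich=0.55, lean=0.22, conc=6.0)  # 3M MEA / 3M AMP
gain = 100.0 * (blend - mea) / mea                       # percentage increase
```

With these placeholder numbers the blend shows an 80% larger cyclic capacity than the MEA baseline, which is the kind of comparison behind the 26-111% range reported above.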
Abstract:
The integration of automation (specifically Global Positioning Systems (GPS)) and Information and Communications Technology (ICT) through the creation of a Total Jobsite Management Tool (TJMT) in construction contractor companies can revolutionize the way contractors do business. The key to this integration is the collection and processing of real-time GPS data that is produced on the jobsite for use in project management applications. This research study established the need for an effective planning and implementation framework to assist construction contractor companies in navigating the terrain of GPS and ICT use. An Implementation Framework was developed using the Action Research approach. The framework consists of three components, as follows: (i) ICT Infrastructure Model, (ii) Organizational Restructuring Model, and (iii) Cost/Benefit Analysis. The conceptual ICT infrastructure model was developed for the purpose of showing decision makers within highway construction companies how to collect, process, and use GPS data for project management applications. The organizational restructuring model was developed to assist companies in the analysis and redesign of business processes, data flows, core job responsibilities, and their organizational structure in order to obtain the maximum benefit at the least cost in implementing GPS as a TJMT. A cost-benefit analysis which identifies and quantifies the cost and benefits (both direct and indirect) was performed in the study to clearly demonstrate the advantages of using GPS as a TJMT. Finally, the study revealed that in order to successfully implement a program to utilize GPS data as a TJMT, it is important for construction companies to understand the various implementation and transitioning issues that arise when implementing this new technology and business strategy. 
In the study, Factors for Success were identified and ranked to allow a construction company to understand the factors that may contribute to or detract from the prospect for success during implementation. The Implementation Framework developed as a result of this study will serve to guide highway construction companies in the successful integration of GPS and ICT technologies for use as a TJMT.
Abstract:
Non-Destructive Testing (NDT) of deep foundations has become an integral part of the industry's standard manufacturing processes. It is not unusual for the evaluation of the integrity of the concrete to include the measurement of ultrasonic wave speeds. Numerous methods have been proposed that use the propagation speed of ultrasonic waves to check the integrity of concrete for drilled shaft foundations. All such methods evaluate the integrity of the concrete inside the cage and between the access tubes. The integrity of the concrete outside the cage remains to be considered in order to determine the location of the border between the concrete and the soil and thereby obtain the diameter of the drilled shaft. It is also economical to devise a methodology that obtains the diameter of the drilled shaft using the Cross-Hole Sonic Logging (CSL) system: since CSL tests are already performed to check the integrity of the inside concrete, such a methodology allows the determination of the drilled shaft diameter without having to set up another NDT device. The proposed new method is based on the installation of galvanized tubes outside the shaft across from each inside tube, and on performing the CSL test between the inside and outside tubes. From the experimental work performed, a model is developed to evaluate the relationship between the thickness of the concrete and the ultrasonic wave properties using signal processing. The experimental results show that there is a direct correlation between the concrete thickness outside the cage and the maximum amplitude of the received signal obtained from frequency-domain data. This study demonstrates how this new method of measuring the diameter of drilled shafts during construction using an NDT method overcomes the limitations of currently used methods. In the other part of the study, a new method is proposed to visualize and quantify the extent and location of defects.
It is based on a color change in the frequency amplitude of the signal recorded by the receiver probe at the location of defects, and it is called Frequency Tomography Analysis (FTA). Time-domain data for the signals propagated between tubes are transformed to frequency-domain data using the Fast Fourier Transform (FFT). Then the distribution of the FTA is evaluated. This method is employed after CSL has determined a high probability of an anomaly in a given area, and it is applied to improve location accuracy and to further characterize the feature. The technique has very good resolution and clarifies the exact depth location of any void or defect along the length of the drilled shaft for voids inside the cage. The last part of the study also evaluates the effect of voids inside and outside the reinforcement cage, and of corrosion in the longitudinal bars, on the strength and axial load capacity of drilled shafts. The objective is to quantify the extent of the loss in axial strength and stiffness of drilled shafts due to the presence of different types of symmetric voids and corrosion throughout their lengths.
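The time-to-frequency transformation at the heart of FTA can be sketched as follows. A naive O(n²) DFT stands in for the FFT, and the synthetic 50 kHz tone is an illustrative placeholder for a real CSL receiver signal, which would be broadband and noisy:

```python
import cmath
import math

def amplitude_spectrum(signal, fs):
    """Single-sided amplitude spectrum of a real signal via a naive DFT
    (a real FTA implementation would use an FFT for speed)."""
    n = len(signal)
    freqs, amps = [], []
    for k in range(n // 2):
        s = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        freqs.append(k * fs / n)        # bin centre frequency in Hz
        amps.append(2.0 * abs(s) / n)   # amplitude of that component
    return freqs, amps

# Synthetic received pulse: 50 kHz tone sampled at 1 MHz (illustrative)
fs = 1_000_000.0
sig = [math.sin(2 * math.pi * 50_000.0 * t / fs) for t in range(100)]
freqs, amps = amplitude_spectrum(sig, fs)
peak = freqs[amps.index(max(amps))]  # → 50000.0 Hz
```

In FTA these per-depth amplitude spectra are then mapped to a colour scale between tube pairs; a local drop in amplitude at the dominant frequency flags the probable defect location.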
Abstract:
A heat loop suitable for the study of thermal fouling and its relationship to corrosion processes was designed, constructed and tested. The design adopted was an improvement over those used by such investigators as Hopkins and the Heat Transfer Research Institute in that very low levels of fouling could be detected accurately, the heat transfer surface could be readily removed for examination, and the chemistry of the environment could be carefully monitored and controlled. In addition, an indirect method of electrical heating of the heat transfer surface was employed to eliminate the magnetic and electric effects which result when direct resistance heating is applied to a test section. The testing of the loop was done using a 316 stainless steel test section and a suspension of ferric oxide and water in an attempt to duplicate the results obtained by Hopkins. Two types of thermal fouling resistance versus time curves were obtained. (i) An asymptotic-type fouling curve, similar to the fouling behaviour described by Kern and Seaton and other investigators, was the most frequent type of fouling curve obtained. Thermal fouling occurred at a steadily decreasing rate before reaching a final asymptotic value. (ii) If an asymptotically fouled tube was cooled with rapid circulation for periods of up to eight hours at zero heat flux, and heating was then restarted, fouling recommenced at a high linear rate. The fouling results obtained were observed to be similar to and in agreement with the fouling behaviour reported previously by Hopkins, and it was possible to duplicate the previous results quite closely. This supports the contention of Hopkins that the fouling results obtained were due to a crevice corrosion process and not an artifact of that heat loop which might have caused electrical and magnetic effects influencing the fouling. The effects of Reynolds number and heat flux on the asymptotic fouling resistance have been determined.
A single experiment to study the effect of oxygen concentration has been carried out. The ferric oxide concentration for most of the fouling trials was standardized at 2400 ppm, and the ranges of Reynolds number and heat flux for the study were 11000-29500 and 89-121 kW/m², respectively.
Abstract:
There is a growing body of literature which marks out a feminist ethics of care, and it is within this framework that we understand that transitions from primary to secondary school education can be challenging and care-less, especially for disabled children. By exploring the narratives of parents and professionals, we investigate transitions and self-identity, as a meaningful transition depends on the care-full spaces pupils inhabit. These education narratives are all in the context of the privileging of academic attainment and a culture of testing and examinations. Parents and professionals, as well as children, are also surveyed. Until there are care-full education processes, marginalisation will remain, impacting on disabled children's transition to secondary school and healthy identity construction. Moreover, if educational challenges are not addressed, their life chances are increasingly limited. Interdependent caring work enables engagement in a meaningful education and positive identity formation. In school and at home, care-full spaces are key in this process.