956 results for Large modeling projects


Relevance:

30.00%

Publisher:

Abstract:

Conventional project management techniques are not always sufficient to ensure that large-scale construction projects meet time, cost and quality targets, owing to the complexity of their planning, design and implementation processes. The main reasons for project non-achievement are changes in scope and design, changes in government policies and regulations, unforeseen inflation, and underestimation or improper estimation. Projects exposed to such an uncertain environment can be managed effectively by applying risk management throughout the project's life cycle. However, the effectiveness of risk management depends on the technique used to analyse and quantify the effects of risk factors. This study proposes the Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, as a tool for risk analysis, because it can handle subjective as well as objective factors that are conflicting in nature within a single decision model. This provides project management with a decision support system (DSS) for making the right decision at the right time, ensuring project success in line with organisation policy, project objectives and a competitive business environment. The whole methodology is explained through a case application to a cross-country petroleum pipeline project in India, and its effectiveness in project management is demonstrated.
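To make the AHP step concrete, the following minimal sketch (not the study's actual model) derives priority weights for three hypothetical risk factors from a Saaty-scale pairwise comparison matrix and checks the consistency ratio; every matrix value is an illustrative assumption.

```python
# Minimal AHP sketch: priority weights via the principal eigenvector,
# plus Saaty's consistency ratio. All judgements are invented examples.
import numpy as np

# Pairwise comparisons for three hypothetical risk factors:
# scope change, regulatory change, inflation (Saaty 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized priority weights

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)   # consistency index
RI = 0.58                              # Saaty's random index for n = 3
CR = CI / RI                           # CR < 0.1 means judgements are consistent

print("weights:", np.round(w, 3), "CR:", round(CR, 3))
```

The resulting weights would feed a DSS as the relative importance of each risk factor; a consistency ratio above 0.1 signals that the pairwise judgements should be revisited.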

Relevance:

30.00%

Publisher:

Abstract:

This study highlights the variables associated with the implementation of renewable energy (RE) projects for sustainable development in India, using an interpretive structural modeling (ISM)-based approach to model the interactions among the variables that impact RE adoption. These variables are categorized as enablers that help enhance the implementation of RE projects for sustainable development. A major finding is that public awareness regarding RE for sustainable development is a very significant enabler. For successful implementation of RE projects, it has been observed that top management should focus on improving the high driving-power enablers (leadership, strategic planning, public awareness, management commitment, availability of finance, government support, and support from interest groups).
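As a rough illustration of the ISM mechanics (the enabler set and adjacency matrix below are invented, not the study's data), this sketch computes the final reachability matrix by transitive closure and ranks enablers by driving power:

```python
# ISM-style sketch: transitive closure of a binary influence matrix,
# then driving power (row sums) and dependence (column sums).
import numpy as np

enablers = ["leadership", "public awareness", "finance", "govt support"]
# M[i, j] = True if enabler i influences enabler j (self-influence included).
M = np.array([
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 0],
    [0, 1, 0, 1],
], dtype=bool)

# Warshall's algorithm: final reachability matrix.
R = M.copy()
for k in range(len(enablers)):
    R = R | (R[:, [k]] & R[[k], :])

driving = R.sum(axis=1)      # how many enablers each one can reach
dependence = R.sum(axis=0)   # how many enablers can reach it
for name, d, p in zip(enablers, driving, dependence):
    print(f"{name}: driving power = {d}, dependence = {p}")
```

In an ISM hierarchy, enablers with high driving power and low dependence sit at the base and merit top-management attention, which is the logic behind the study's emphasis on enablers such as public awareness.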

Relevance:

30.00%

Publisher:

Abstract:

The objective of this work was to design, construct and commission a new ablative pyrolysis reactor and a high-efficiency product collection system. The reactor was to have a nominal throughput of 10 kg/hr of dry biomass and be inherently scalable up to an industrial-scale application of 10 tonnes/hr. The whole process consists of a bladed ablative pyrolysis reactor, two high-efficiency cyclones for char removal, and a disk-and-doughnut quench column combined with a wet-walled electrostatic precipitator, mounted directly on top, for liquids collection. To aid design and scale-up calculations, detailed mathematical modelling of the reaction system was undertaken, enabling sizes, efficiencies and operating conditions to be determined. A modular approach was taken because of the iterative nature of some of the design methodologies, with the output from one module being the input to the next. Separate modules were developed for determining the biomass ablation rate and for specifying the reactor capacity, cyclone design, quench column design and electrostatic precipitator design. These models enabled a rigorous design protocol capable of specifying the required reactor and product collection system size for given biomass throughputs, operating conditions and collection efficiencies. The reactor proved capable of generating an ablation rate of 0.63 mm/s for pine wood at a temperature of 525 °C with a relative velocity of 12.1 m/s between the heated surface and the reacting biomass particle. The reactor achieved a maximum throughput of 2.3 kg/hr, which was the maximum the biomass feeder could supply; the reactor itself can be operated at a far higher throughput, but this would require purchasing a new feeder and drive motor. Modelling showed that the reactor is capable of a throughput of approximately 30 kg/hr, an area that should be considered in future work, as the reactor is currently operating well below its theoretical maximum. Calculations show that the current product collection system could operate efficiently up to a maximum feed rate of 10 kg/hr, provided the inert gas supply was adjusted accordingly to keep the vapour residence time in the electrostatic precipitator above one second; operation above 10 kg/hr would require some modifications to the product collection system. Eight experimental runs were documented and considered successful; more were attempted but had to be abandoned owing to equipment failure. This does not detract from the efficiency of the reactor and product collection system design. The maximum total liquid yield was 64.9% on a dry-wood-fed basis. The liquid yield would likely have been higher had there been sufficient development time to overcome certain operational difficulties, and had longer operating runs been attempted to offset the product losses that arise from the difficulty of collecting all available product from a large-scale collection unit. The liquids collection system was highly efficient: modelling determined a liquid collection efficiency above 99% on a mass basis, and this was validated by a dry ice/acetone condenser and a cotton wool filter downstream of the collection unit, which allowed mass measurement of the condensable product exiting the collection unit and confirmed a collection efficiency in excess of 99% on a mass basis.
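As a back-of-envelope illustration of how an ablation rate converts into a throughput figure, the sketch below uses the reported 0.63 mm/s rate; the contact area and dry-wood density are assumed values, not numbers from the work.

```python
# Rough throughput estimate from an ablation rate. Only the ablation
# rate comes from the text; area and density are illustrative guesses.
ablation_rate_mm_s = 0.63      # measured ablation rate for pine wood
contact_area_cm2 = 40.0        # assumed particle/hot-surface contact area
dry_density_kg_m3 = 500.0      # typical dry pine density (assumption)

# Volume of wood ablated per second, converted to a mass flow in kg/hr.
vol_m3_s = (ablation_rate_mm_s * 1e-3) * (contact_area_cm2 * 1e-4)
throughput_kg_hr = vol_m3_s * dry_density_kg_m3 * 3600
print(f"estimated throughput: {throughput_kg_hr:.1f} kg/hr")
```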

Relevance:

30.00%

Publisher:

Abstract:

An increasing number of organisational researchers have turned to social capital theory in an attempt to better understand the impetus for knowledge sharing at the individual and organisational levels. This thesis extends that research by investigating the impact of social capital on knowledge sharing at the group level in the organisational project context. The objective of the thesis is to investigate the importance of social capital in fostering tacit knowledge sharing among the members of a project team. The analytical focus is on the Nahapiet and Ghoshal framework of social capital, but elements of other scholars' work are also included. In brief, social capital is defined as an asset embedded in the network of relationships possessed by an individual or social unit. It is argued that the dimensions of social capital most relevant to knowledge sharing are the structural, cognitive and relational dimensions, because these, among other things, foster the exchange and combination of knowledge and resources among team members. Empirically, the study is based on the grounded theory method. Data were collected from five projects in large, medium and small ICT companies in Malaysia. Underpinned by the constant comparative method, the data were derived from 55 interviews and from observations, and were analysed using open, axial and selective coding. The analysis also involved counting the frequency of occurrence of the codes generated by grounded theory, in order to find the important items and categories under the social capital dimensions and knowledge sharing, and to further explain sub-groups within the data. The analysis shows that the most important dimension for tacit knowledge sharing is structural capital. Most importantly, the findings also suggest that structural capital is a prerequisite of cognitive capital and relational capital at the group level in an organisational project. It was also found that, in a project context, relational capital is hard to realise because it requires time and frequent interactions among team members. The findings from the quantitative analysis show that frequent meetings and interactions, relationships, positions, shared visions, shared objectives, and collaboration are among the factors that foster the sharing of tacit knowledge among team members. In conclusion, the present study adds to the existing literature on social capital in two main ways. First, it distinguishes the dimensions of social capital, identifying structural capital as the most important dimension and as a prerequisite of cognitive and relational capital in a project context. Second, it identifies the causal sequence within the dimensions of social capital, suggesting avenues for further theoretical and empirical work in this emerging area of inquiry.

Relevance:

30.00%

Publisher:

Abstract:

Many planning and control tools, especially network analysis, have been developed in the last four decades. The majority were created in military organizations to solve the problem of planning and controlling research and development projects. The original version of the network model (i.e. CPM/PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting targets and defining objectives, but it failed to satisfy the requirements of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the structure of CPM to eliminate its deficiencies, but none provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures and is suitable for project evaluation at the development stage; CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work presents an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, this research developed a simulation-based network model that combines parts of the logic of both VERT and CYCLONE. Several new nodes were designed to model the availability and flow of resources and the overhead and operating costs, along with special nodes for evaluating time and cost. A large software package was written to handle the input, the simulation process and the output of the model; it is designed to run on any microcomputer using the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique. Finally, a set of conclusions is drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of this thesis.
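For orientation, the deterministic CPM logic that the proposed simulation model builds on can be sketched in a few lines; the four-activity network is purely illustrative.

```python
# Minimal CPM forward/backward pass with total float per activity.
durations = {"A": 3, "B": 2, "C": 4, "D": 2}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

# Forward pass: earliest start/finish (dict order here is topological).
ES, EF = {}, {}
for a in durations:
    ES[a] = max((EF[p] for p in preds[a]), default=0)
    EF[a] = ES[a] + durations[a]

# Backward pass: latest start/finish, then total float.
project_end = max(EF.values())
succs = {a: [b for b in durations if a in preds[b]] for a in durations}
LS, LF = {}, {}
for a in reversed(list(durations)):
    LF[a] = min((LS[s] for s in succs[a]), default=project_end)
    LS[a] = LF[a] - durations[a]

for a in durations:
    print(a, "total float =", LS[a] - ES[a])  # zero float = critical path
```

A simulation-based model of the kind described in the thesis layers stochastic durations, resource flows and cost nodes on top of network logic like this.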

Relevance:

30.00%

Publisher:

Abstract:

WiMAX has been introduced as a competitive alternative among metropolitan broadband wireless access technologies. It is connection-oriented and can provide very high data rates, large service coverage, and flexible quality of service (QoS). Because of the large number of connections and the flexible QoS supported by WiMAX, uplink access in WiMAX networks is very challenging: the medium access control (MAC) protocol must manage the bandwidth and related channel allocations efficiently. In this paper, we propose and investigate a cost-effective WiMAX bandwidth management scheme, named the WiMAX partial sharing scheme (WPSS), in order to provide good QoS while achieving better bandwidth utilization and network throughput. The proposed scheme is compared with a simple but inefficient scheme, named the WiMAX complete sharing scheme (WCPS). A maximum entropy (ME) based analytical model (MEAM) is proposed for the performance evaluation of the two bandwidth management schemes. MEAM is used because it can efficiently model a large-scale system in which the number of stations or connections is generally very high, whereas traditional simulation and analytical approaches (e.g., Markov models) perform poorly owing to their high computational complexity. We model the bandwidth management scheme as a queueing network model (QNM) consisting of interacting multiclass queues for the different service classes, and derive closed-form expressions for the state and blocking probability distributions of both schemes. Simulation results verify the MEAM numerical results and show that WPSS can significantly improve the network's performance compared to WCPS.
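The paper's ME-based model is not reproduced here, but a textbook M/M/1/K queue gives a feel for the closed-form blocking probabilities such models yield for a finite-capacity bandwidth pool; the arrival rate, service rate and capacity below are arbitrary.

```python
# Blocking probability of an M/M/1/K queue (standard textbook result),
# as a stand-in illustration for closed-form QNM blocking expressions.
def mm1k_blocking(arrival_rate: float, service_rate: float, K: int) -> float:
    """Probability that an arriving connection finds all K slots occupied."""
    rho = arrival_rate / service_rate
    if rho == 1.0:
        return 1.0 / (K + 1)
    return (1 - rho) * rho**K / (1 - rho**(K + 1))

print(mm1k_blocking(arrival_rate=8.0, service_rate=10.0, K=20))
```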

Relevance:

30.00%

Publisher:

Abstract:

Practitioners assess the performance of entities in increasingly large and complicated datasets. Even if non-parametric models such as Data Envelopment Analysis were ever considered simple push-button technologies, this is impossible when many variables are available or when data have to be compiled from several sources. This paper introduces the 'COOPER-framework', a comprehensive model for carrying out non-parametric projects. The framework consists of six interrelated phases: Concepts and objectives, On structuring data, Operational models, Performance comparison model, Evaluation, and Result and deployment. Each phase describes the necessary steps a researcher should examine for a well-defined and repeatable analysis. The COOPER-framework gives the novice analyst guidance, structure and advice for a sound non-parametric analysis, while the more experienced analyst benefits from a checklist that ensures important issues are not forgotten. In addition, the use of a standardized framework makes non-parametric assessments more reliable, more repeatable, more manageable, faster and less costly. © 2010 Elsevier B.V. All rights reserved.
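As a minimal example of the non-parametric models the framework organizes, the sketch below solves the input-oriented CCR DEA linear program for each decision-making unit (DMU) with scipy; the two-input, one-output dataset is made up.

```python
# Input-oriented CCR DEA via linear programming: for each DMU o,
# minimize theta s.t. sum_j lam_j x_j <= theta * x_o and sum_j lam_j y_j >= y_o.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0]])  # inputs, one row per DMU
Y = np.array([[1.0], [1.0], [1.0]])                  # outputs, one row per DMU

def ccr_efficiency(o: int) -> float:
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                # variables: theta, lambda_1..n
    A_in = np.c_[-X[o], X.T]                   # input constraints, <= 0
    A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]  # output constraints, <= -y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                  bounds=[(None, None)] + [(0.0, None)] * n)
    return res.fun                             # efficiency score theta*

for o in range(X.shape[0]):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```

A score of 1 marks a DMU on the efficient frontier; lower scores indicate the proportional input reduction an inefficient unit would need to reach it.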

Relevance:

30.00%

Publisher:

Abstract:

Post-disaster housing reconstruction projects face several challenges. Resources and material supplies are often scarce; several different types of organizations are involved; and projects must be completed as quickly as possible to foster recovery. Within this context, the chapter aims to increase understanding of relief supply chain design in reconstruction. In addition, the chapter introduces a community-based, beneficiary perspective to relief supply chains by evaluating the implications of local components for supply chain design in reconstruction. This is achieved through secondary data analysis based on the evaluation reports of two major housing reconstruction projects that took place in Europe in the last decade. A comparative analysis of the organizational designs of these projects highlights the ways in which users can be involved. The performance of reconstruction supply chains appears to depend to a large extent on the way beneficiaries are integrated in supply chain design, which in turn impacts positively on the effectiveness of reconstruction supply chains.

Relevance:

30.00%

Publisher:

Abstract:

Despite considerable and growing interest in the subject of academic researchers and practising managers jointly generating knowledge (which we term ‘co-production’), our searches of the management literature revealed few articles based on primary data or multiple cases. Given the increasing commitment to co-production by academics, managers and those funding research, it seems important to strengthen the evidence base about practice and performance in co-production. The literature on collaborative research was reviewed to develop a framework to structure the analysis of these data and to relate the findings to the limited body of prior research on collaborative research practice and performance. This paper presents empirical data from four completed, large-scale co-production projects. Despite major differences between the cases, we find that the key success factors and the indicators of performance are remarkably similar. We demonstrate many complex influences between factors, between outcomes, and between factors and outcomes, and discuss the features that are distinctive to co-production. Our empirical findings are broadly consonant with the prior literature, but go further in trying to understand the consequences of success factors for performance. A second contribution of this paper is the development of a conceptually and methodologically rigorous process for investigating collaborative research, linking process and performance. The paper closes with a discussion of the study’s limitations and opportunities for further research.

Relevance:

30.00%

Publisher:

Abstract:

Large-scale introduction of Organic Solar Cells (OSCs) onto the market is currently limited by their poor stability in light and air, factors present in the normal working conditions of these devices. Thus, great efforts have to be undertaken to understand the photodegradation mechanisms of their organic materials in order to find solutions that mitigate these effects. This study reports on the elucidation of the photodegradation mechanisms occurring in a low-bandgap polymer, namely Si-PCPDTBT (poly[(4,4′-bis(2-ethylhexyl)dithieno[3,2-b:2′,3′-d]silole)-2,6-diyl-alt-(4,7-bis(2-thienyl)-2,1,3-benzothiadiazole)-5,5′-diyl]). Complementary analytical techniques (AFM, HS-SPME-GC-MS, UV-vis and IR spectroscopy) were employed to monitor the modification of the chemical structure of the polymer upon photooxidative aging and the subsequent consequences for its architecture and nanomechanical properties. Furthermore, these characterization techniques were combined with a theoretical approach based on quantum chemistry to elucidate the evolution of the polymer's alkyl side chains and backbone throughout exposure. Si-PCPDTBT is shown to be more stable against photooxidation than the commonly studied p-type polymers P3HT and PCDTBT, while modeling demonstrated the benefits of using silicon as a bridging atom in terms of photostability.

Relevance:

30.00%

Publisher:

Abstract:

This chapter discusses network protection of high-voltage direct current (HVDC) transmission systems for large-scale offshore wind farms where the HVDC system uses voltage-source converters. The multi-terminal HVDC network topology and the allocation and configuration of protection are discussed, with DC circuit breaker and protection relay configurations studied for different fault conditions. A detailed protection scheme is designed with a solution that does not require relay communication. An advanced understanding of protection system design and operation is necessary for reliable and safe operation of a meshed HVDC system under fault conditions. Meshed HVDC systems are important because they will be used to interconnect large-scale offshore wind generation projects. Offshore wind generation is growing rapidly and offers a means of securing energy supply and addressing emissions targets while minimising community impacts. Ambitious plans for such projects in Europe and the Asia-Pacific region will all require a reliable yet economic system to generate, collect, and transmit electrical power from renewable resources. Collectively, offshore wind farms are efficient and have potential as a significant low-carbon energy source, but this requires a reliable collection and transmission system. Offshore wind power generation is a relatively new area and lacks the systematic fault analysis and associated operational experience needed to guide further development. Appropriate fault protection schemes are required, and this chapter highlights the process of developing and assessing such schemes. The chapter illustrates the basic meshed topology, identifies the need for distance evaluation and appropriate cable models, and then details the design and operation of the protection scheme, with simulation results used to illustrate its operation. © Springer Science+Business Media Singapore 2014.
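As one concrete example of the distance-evaluation step, a single-ended travelling-wave estimate (consistent with the chapter's communication-free design goal) locates a fault from the delay between the first wavefront and its reflection; the propagation speed and timings below are assumptions, not values from the chapter.

```python
# Single-ended travelling-wave fault location sketch; all numbers are
# illustrative assumptions, not data from the chapter.
v_km_s = 1.8e5          # wave propagation speed in the DC cable (assumed)
t_first = 150e-6        # arrival time of the first wavefront at the relay (s)
t_reflect = 450e-6      # arrival time of its reflection from the fault (s)

# Between the two arrivals the wave travels relay -> fault -> relay,
# so the fault lies at half the distance covered in that interval.
d_km = v_km_s * (t_reflect - t_first) / 2
print(f"estimated fault distance: {d_km:.1f} km")
```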

Relevance:

30.00%

Publisher:

Abstract:

The coordination of effort within and among different expert groups is a central feature of contemporary organizations. Within the existing literature, however, a dichotomy has emerged in our understanding of the role played by codification in coordinating expert groups. One strand of literature emphasizes codification as a process that supports coordination by enabling the storage and ready transfer of knowledge. In contrast, another strand highlights the persistent differences between expert groups that create boundaries to the transfer of knowledge, seeing coordination as dependent on the quality of the reciprocal interactions between groups and individuals. Our research helps to resolve such contested understandings of the coordinative role played by codification. By focusing on the offshore-outsourcing of knowledge-intensive services, we examine the role played by codification when expertise was coordinated between client staff and onsite and offshore vendor personnel in a large-scale outsourcing contract between TATA Consultancy Services (TCS) and ABN AMRO bank. A number of theoretical contributions flow from our analysis of the case study, helping to move our understanding beyond the dichotomized views of codification outlined above. First, our study adds to previous work where codification has been seen as a static concept by demonstrating the multiple, coexisting, and complementary roles that codification may play. We examine the dynamic nature of codification and show changes in the relative importance of these different roles in coordinating distributed expertise over time. Second, we reconceptualize the commonly accepted view of codification as focusing on the replication and diffusion of knowledge by developing the notion of the codification of the “knower” as complementary to the codification of knowledge. Unlike previous studies of expertise directories, codification of the knower does not involve representing expertise in terms of occupational skills or competences but enables the reciprocal interrelating of expertise required by more unstructured tasks.

Relevance:

30.00%

Publisher:

Abstract:

The production of recombinant therapeutic proteins is an active area of research in drug development. These bio-therapeutic drugs target nearly 150 disease states and promise to bring better treatments to patients. However, if new bio-therapeutics are to be made more accessible and affordable, improvements in production performance and optimization of processes are necessary. A major challenge lies in controlling the effect of process conditions on the production of intact functional proteins. To achieve this, improved tools are needed for bio-processing; for example, process modeling and high-throughput technologies can be implemented to achieve quality by design, leading to improvements in productivity. Commercially, the most sought-after targets are secreted proteins, owing to the ease of handling in downstream procedures. This chapter outlines different approaches for the production and optimization of secreted proteins in the host Pichia pastoris. © 2012 Springer Science+Business Media, LLC.

Relevance:

30.00%

Publisher:

Abstract:

We develop a theoretical framework for the modeling of continuous-wave Yb-doped fiber lasers with highly nonlinear cavity dynamics. The developed approach shows good agreement between theoretical predictions and experimental results for a particular Yb-doped laser scheme with large spectral broadening during a single round trip. The model is capable of accurately describing the main features of the experimentally measured laser outputs, such as the power efficiency slope, the power leakage through the fiber Bragg gratings, and the spectral broadening and spectral shape of the generated radiation. © 2011 Optical Society of America.
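The authors' full cavity model is not reproduced here, but fiber-laser models of this kind typically build on split-step Fourier integration of the nonlinear Schrödinger equation; the sketch below propagates a Gaussian field envelope through a dispersive, nonlinear fiber span with purely illustrative parameters.

```python
# Symmetric split-step Fourier propagation of the scalar NLSE:
# dispersion applied in the frequency domain, Kerr nonlinearity in time.
import numpy as np

T = np.linspace(-20e-12, 20e-12, 1024)               # time grid (s)
w = 2 * np.pi * np.fft.fftfreq(T.size, T[1] - T[0])  # angular frequencies

beta2 = -20e-27    # group-velocity dispersion, s^2/m (assumed)
gamma = 3e-3       # Kerr nonlinearity, 1/(W*m) (assumed)
dz, steps = 1.0, 100                                 # 100 m of fiber, 1 m steps

A = np.exp(-(T / 2e-12) ** 2)                        # Gaussian envelope, 1 W peak
half_disp = np.exp(0.5j * beta2 * w**2 * (dz / 2))   # half-step dispersion operator
for _ in range(steps):
    A = np.fft.ifft(half_disp * np.fft.fft(A))       # half dispersion step
    A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)   # full nonlinear step
    A = np.fft.ifft(half_disp * np.fft.fft(A))       # half dispersion step

print("output peak power:", round(np.abs(A).max()**2, 3), "W")
```

A full laser model of the kind the paper describes would add gain saturation, loss, spectral filtering and the cavity boundary conditions at the fiber Bragg gratings to this propagation kernel.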

Relevance:

30.00%

Publisher:

Abstract:

The subject of dropout prevention/reduction is deservedly receiving attention as a problem that, if not resolved, could threaten our national future. This study investigates a small segment of the overall dropout problem, one with apparently unique features of program design and population selection; the evidence presented here should add to the knowledge bank on this complicated problem. Project Trio was one of a number of dropout prevention programs and activities conducted in Dade County in the 1984-85 and 1985-86 school years, and it is investigated here longitudinally through the end of the 1987-88 school year. It involved 17 junior and senior high schools and 27 programs (10 in the first year and 17 in the second), with over 1,000 students in total. The students were selected by the schools from a district-provided list of "at risk" students and were divided approximately evenly, following the classical research design, into an experimental group and a control group, the latter taking the regular school curriculum in line with standard procedure. No school had more than 25 students in either group. Each school modified the basic design of the project to accommodate its individual characteristics and the perceived needs of its students; however, all school projects were to include some form of academic enhancement, counseling and career awareness study. The conclusion of this study was that the control group had a significantly lower dropout rate than the experimental group. Though it is impossible to determine with certainty the reasons for this unexpected result, the evidence presented suggests that one cause may have been inadequate administration at the local level. This study was also a longitudinal investigation of the "at risk" population as a whole over the three- and four-year period, to determine whether academic factors present in the records may be used to identify dropout proneness. A significant correlation was found between dropping out and various measures, including scores on the Quality of School Life instrument, attendance, grade point averages, mathematics grades, and being overage for grade, all of which are important identifiers for selection into dropout prevention programs.