984 results for Complex Projects


Relevance:

60.00%

Publisher:

Abstract:

Much research has been devoted over the years to investigating and advancing the techniques and tools that analysts use when they model. In contrast to what academics, software providers and their resellers promote as best practice, the aim of this research was to determine whether practitioners still take conceptual modeling seriously. In addition, which techniques and tools are the most popular for conceptual modeling, and for what major purposes is conceptual modeling used? The study found that the six most frequently used modeling techniques and methods were ER diagramming, data flow diagramming, systems flowcharting, workflow modeling, UML, and structured charts. Modeling technique use was found to decrease significantly from smaller to medium-sized organizations, but then to increase significantly in larger organizations (proxying for large, complex projects); that is, use followed a U-shaped curve across organization size, contrary to some prior explanations. An additional important contribution of this study was the identification of the factors that uniquely influence analysts' decision to continue using modeling, namely: communication (using diagrams) to and from stakeholders, (lack of) internal knowledge of techniques, user expectations management, understanding how models integrate into the business, and tool/software deficiencies. The highest-ranked purposes for which modeling was undertaken were database design and management, business process documentation, business process improvement, and software development. (c) 2005 Elsevier B.V. All rights reserved.

Relevance:

60.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to describe how the application of systems thinking to designing, managing and improving business processes has resulted in a new and unique holonic-based process modeling methodology known as process orientated holonic modeling. Design/methodology/approach: The paper describes the key systems thinking axioms that the methodology builds on and gives an overview of the methodology; the techniques are described using an example taken from a large organization designing and manufacturing capital goods equipment operating within a complex and dynamic environment. These were produced in an 18-month project, using an action research approach, to improve quality and process efficiency. Findings: The findings of this research show that the new methodology can support process depiction and improvement in industrial sectors characterized by environments of high variety and low volume (e.g. projects such as the design and manufacture of a radar system or a hybrid production process) which do not provide repetitive learning opportunities. In such circumstances, the methodology has not only been able to deliver holonic-based process diagrams but has also been able to transfer strategic vision from top management to middle and operational levels without being reductionistic. Originality/value: This paper will be of interest to organizational analysts looking at large complex projects who require a methodology that does not confine them to thinking reductionistically in "task-breakdown" based approaches. The novel ideas in this paper have a great impact on the way analysts should perceive organizational processes. Future research is applying the methodology in similar environments in other industries. © Emerald Group Publishing Limited.

Relevance:

60.00%

Publisher:

Abstract:

This project supported the planning and conduct of a two-day peer exchange, hosted by the Iowa Department of Transportation, for state agencies that have implemented some or all of the strategies outlined in project R10, Project Management Strategies for Complex Projects, sponsored by the second Strategic Highway Research Program. Presentations were made by participating states, and several opportunities were provided for directed discussion. The general themes emerging from the presentations and discussions were as follows: To implement improvements in project management processes, agency leadership needs to decide that a new approach to project management is worth pursuing and then dedicate resources to developing a project management plan. The change to formalized project management and five-dimensional project management (5DPM) requires a culture shift in agencies from segmented “silo” processes to collaborative, cooperative processes that make communication and collaboration high priorities. Agencies need trained project managers who are empowered to execute the project management plan, as well as properly trained functional staff. Project management can be centralized or decentralized with equal effect. After an agency’s project management plan and structure are developed, software tools and other resources should be implemented to support the plan and structure. All projects will benefit from enhanced project management, but the project management plan should specify appropriate approaches for several project levels as defined by factors in addition to dollar value. Project management should be included in an agency’s project development manual.

Relevance:

60.00%

Publisher:

Abstract:

The objective of this thesis is to develop a project management procedure for chilled beam projects. The organization has recognized that project management techniques could help in large and complex projects. Information sharing has been challenging in projects, so improving information sharing is one key topic of the thesis. Academic research and literature are used to find suitable project management theories and methods. The main theories relate to project phases and project management tools. Practical knowledge of project management is collected from two project-business-oriented companies. Project management tools are chosen and modified to fulfill the needs of the beam projects. The result of the thesis is a proposed project management procedure, which includes the phases of chilled beam projects and project milestones. The procedure helps to recognize the most critical phases of a project, and the tools help to manage project information. The procedure increases knowledge of project management techniques and tools, and it forms a coherent project management working method for the chilled beam project group.

Relevance:

30.00%

Publisher:

Abstract:

The subject of management is renowned for its addiction to fads and fashions. Project management is no exception. The issue of interest for this paper is the establishment of the 'College of Complex Project Managers' and their 'competency standard for complex project managers.' Both have generated significant interest in the project management community, and like any other human endeavour they should be subject to critical evaluation. The results of this evaluation show significant flaws in the definition of 'complex' used in this case, in the process by which the College and its standard have emerged, and in the content of the standard. However, there is a significant case for a portfolio of research that extends the existing bodies of knowledge into large-scale complicated (or major) projects, owned by the relevant practitioner communities rather than focused on one organization. Research questions are proposed that would commence this stream of activity towards an intelligent synthesis of what is required to manage in both complicated and truly complex environments.

Relevance:

30.00%

Publisher:

Abstract:

Wind resource evaluation at two sites located in Portugal was performed using the mesoscale modelling system Weather Research and Forecasting (WRF) and a wind resource analysis tool commonly used within the wind power industry, the Wind Atlas Analysis and Application Program (WAsP) microscale model. Wind measurement campaigns were conducted at the selected sites, allowing a comparison between in situ measurements and simulated wind, in terms of flow characteristics and energy yield estimates. Three different methodologies were tested, aiming to provide an overview of their benefits and limitations for wind resource estimation. In the first methodology, the mesoscale model acts as a set of "virtual" wind measuring stations: wind data was computed by WRF for both sites and inserted directly as input into WAsP. In the second approach, the same procedure was followed, but the terrain influences induced by the mesoscale model's low-resolution terrain data were removed from the simulated wind data. In the third methodology, the simulated wind data was extracted at the top of the planetary boundary layer for both sites, to assess whether the use of geostrophic winds (which, by definition, are not influenced by the local terrain) brings any improvement in the models' performance. The results obtained with these methodologies were compared with those resulting from in situ measurements, in terms of mean wind speed, Weibull probability density function parameters and production estimates, considering the installation of one wind turbine at each site. Results showed that the second approach produces values closest to the measured ones, and fairly acceptable deviations were found using this coupling technique in terms of estimated annual production. However, mesoscale output should not be used directly in wind farm siting projects, mainly because of the poor resolution of the mesoscale model's terrain data. Instead, the use of mesoscale output in microscale models should be seen as a valid alternative to in situ data, mainly for preliminary wind resource assessments, although mesoscale-microscale coupling in areas with complex topography should be applied with extreme caution.
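The Weibull parameters mentioned above can be estimated directly from a sample of measured wind speeds. A minimal sketch using the common method-of-moments approximation (the wind speed values below are invented placeholders, not data from the study):

```python
import math

def weibull_from_moments(speeds):
    """Estimate Weibull shape (k) and scale (c) parameters from a sample
    of wind speeds, using the moment-based approximation
    k ~ (sigma/mean)**-1.086 and c = mean / gamma(1 + 1/k)."""
    n = len(speeds)
    mean = sum(speeds) / n
    var = sum((v - mean) ** 2 for v in speeds) / (n - 1)
    sigma = math.sqrt(var)
    k = (sigma / mean) ** -1.086          # shape parameter (dimensionless)
    c = mean / math.gamma(1 + 1 / k)      # scale parameter (m/s)
    return k, c

# Illustrative hourly mean wind speeds in m/s
speeds = [4.2, 6.1, 5.5, 7.8, 3.9, 8.4, 6.7, 5.0, 9.1, 4.8]
k, c = weibull_from_moments(speeds)
```

In practice a maximum-likelihood fit is often preferred, but the moment approximation is a standard first pass for site screening.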

Relevance:

30.00%

Publisher:

Abstract:

In the present work, the benefits of using graphics processing units (GPUs) to aid the design of complex-geometry profile extrusion dies are studied. For that purpose, a 3D finite volume based code that employs unstructured meshes to solve and couple the continuity, momentum and energy conservation equations governing the fluid flow, together with a constitutive equation, was used. To evaluate the possibility of reducing the time spent on the numerical calculations, the code was parallelized on the GPU, using a simple programming approach without complex memory manipulations. For verification purposes, simulations were performed for three benchmark problems: Poiseuille flow, lid-driven cavity flow and flow around a cylinder. Subsequently, the code was used in the design of two real-life extrusion dies, for the production of a medical catheter and a wood-plastic composite decking profile. To evaluate the benefits, the results obtained with the GPU-parallelized code were compared, in terms of speedup, with a serial implementation of the same code that traditionally runs on the central processing unit (CPU). The results obtained show that, even with the simple parallelization approach employed, it was possible to obtain a significant reduction of the computation times.
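The speedup metric used in such comparisons is simply the ratio of serial to parallel wall-clock time. A toy sketch (the benchmark names match the abstract, but all timings are invented placeholders, not results from the paper):

```python
def speedup(t_serial, t_parallel):
    """Speedup of a parallel run relative to a serial baseline."""
    return t_serial / t_parallel

# Hypothetical wall-clock times in seconds for the same mesh,
# (CPU serial, GPU parallel) -- purely illustrative numbers.
benchmarks = {
    "Poiseuille flow": (120.0, 15.0),
    "lid-driven cavity flow": (300.0, 28.0),
    "flow around a cylinder": (450.0, 40.0),
}
for name, (t_cpu, t_gpu) in benchmarks.items():
    print(f"{name}: {speedup(t_cpu, t_gpu):.1f}x")
```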

Relevance:

30.00%

Publisher:

Abstract:

The evaluation of large projects raises well-known difficulties because, by definition, they modify the current price system; their public evaluation presents additional difficulties because they also modify the shadow prices that would exist without the project. This paper first analyzes the basic methodologies applied until the late 1980s, based on integrating projects into optimization models or, alternatively, on iterative procedures with information exchange between two organizational levels. Newer methodologies applied since then are based on variational inequalities, bilevel programming and linear or nonlinear complementarity. Their foundations and different applications related to project evaluation are explored. In fact, these new tools are closely related to one another and can treat more complex cases involving, for example, the reaction of agents to policies or the existence of multiple agents in an environment characterized by common functions representing demands or constraints on polluting emissions.
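The linear complementarity problems mentioned above have a simple canonical form: find z >= 0 such that w = Mz + q >= 0 and z'w = 0. A minimal sketch for the one-dimensional case, purely to illustrate the structure (this is not a method from the paper; real applications use matrix formulations and pivoting or interior-point solvers):

```python
def solve_scalar_lcp(m, q):
    """Solve the 1-D linear complementarity problem:
    find z >= 0 such that w = m*z + q >= 0 and z*w = 0.
    For m > 0 the unique solution is z = max(0, -q/m)."""
    if m <= 0:
        raise ValueError("this sketch assumes m > 0")
    z = max(0.0, -q / m)
    w = m * z + q
    # Complementarity: at least one of z, w must be zero.
    assert z >= 0.0 and w >= -1e-12 and abs(z * w) < 1e-9
    return z, w

# If q >= 0, z = 0 already satisfies complementarity (w = q >= 0);
# if q < 0, z must absorb the slack: z = -q/m, giving w = 0.
z, w = solve_scalar_lcp(2.0, -4.0)   # z = 2.0, w = 0.0
```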

Relevance:

30.00%

Publisher:

Abstract:

The recent advance in high-throughput sequencing and genotyping protocols allows rapid investigation of Mendelian and complex diseases on a scale not previously possible. In my thesis research I took advantage of these modern techniques to study retinitis pigmentosa (RP), a rare inherited disease characterized by progressive loss of photoreceptors and leading to blindness, and hypertension, a common condition affecting 30% of the adult population. First, I compared the performance of different next generation sequencing (NGS) platforms in the sequencing of the RP-linked gene PRPF31. The gene contained a mutation in an intronic repetitive element, which presented difficulties for both classic sequencing methods and NGS. We showed that all NGS platforms are powerful tools to identify rare and common DNA variants, even in the case of more complex sequences. Moreover, we evaluated the features of different NGS platforms that are important in re-sequencing projects. The main focus of my thesis was then to investigate the involvement of pre-mRNA splicing factors in autosomal dominant RP (adRP). I screened 5 candidate genes in a large cohort of patients by using long-range PCR as an enrichment step, followed by NGS. We tested two different approaches: in one, all target PCRs from all patients were pooled and sequenced as a single DNA library; in the other, PCRs from each patient were separated within the pool by DNA barcodes. The first solution was more cost-effective, while the second gave faster and more accurate results; both proved to be effective strategies for gene screening in many samples. We could in fact identify novel missense mutations in the SNRNP200 gene, encoding an RNA helicase essential for splicing catalysis. Interestingly, one of these mutations showed incomplete penetrance in one family with adRP.
Thus, we started to study the possible molecular causes underlying phenotypic differences between asymptomatic and affected members of this family. For the study of hypertension, I joined a European consortium to perform genome-wide association studies (GWAS). Thanks to the use of very informative genotyping arrays and of phenotypically well-characterized cohorts, we could identify a novel susceptibility locus for hypertension in the promoter region of the endothelial nitric oxide synthase gene (NOS3). Moreover, we have proven the direct causality of the associated SNP using three different methods: 1) targeted resequencing, 2) luciferase assay, and 3) population study.
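Computationally, the DNA-barcode separation described above is a demultiplexing step: each read is assigned to a patient by its barcode prefix before analysis. A toy sketch of the idea (barcodes, patient IDs and reads below are all invented; real pipelines also tolerate sequencing errors in the barcode):

```python
def demultiplex(reads, barcodes):
    """Group sequencing reads by patient according to a fixed-length
    barcode prefix. `barcodes` maps barcode sequence -> patient ID;
    reads whose prefix matches no barcode go to 'unassigned'."""
    bc_len = len(next(iter(barcodes)))
    groups = {patient: [] for patient in barcodes.values()}
    groups["unassigned"] = []
    for read in reads:
        patient = barcodes.get(read[:bc_len], "unassigned")
        # Strip the barcode from assigned reads; keep rejects intact.
        groups[patient].append(read[bc_len:] if patient != "unassigned" else read)
    return groups

barcodes = {"ACGT": "patient1", "TGCA": "patient2"}   # hypothetical
reads = ["ACGTGGTTAA", "TGCATTCCGG", "NNNNAAAA"]
groups = demultiplex(reads, barcodes)
```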

Relevance:

30.00%

Publisher:

Abstract:

In response to chronic stress the heart undergoes an adverse remodeling process associated with cardiomyocyte hypertrophy, increased cellular apoptosis and fibrosis, which ultimately causes cardiac dysfunction and heart failure. Increasing evidence suggests a role for scaffolding and anchoring proteins in coordinating the different signaling pathways that mediate the hypertrophic response of the heart. In this context, the family of A-kinase anchoring proteins (AKAPs) has emerged as an important regulator of cardiac function. During my thesis work I conducted two independent projects, both aiming to elucidate the role of AKAPs in the heart. It has been shown that AKAP-Lbc, an anchoring protein that possesses an intrinsic Rho-specific exchange factor activity, organizes a signaling complex that links AKAP-Lbc-dependent activation of RhoA with the mitogen-activated protein kinase (MAPK) p38. The first aim of my thesis was to study the role of this novel transduction pathway in the context of cardiac hypertrophy. Here we show that transgenic mice overexpressing in cardiomyocytes a competitor fragment of AKAP-Lbc, which specifically disrupts endogenous AKAP-Lbc/p38 complexes, developed early dilated cardiomyopathy in response to two weeks of transverse aortic constriction (TAC) as compared to controls. Interestingly, inhibition of the AKAP-Lbc/p38 transduction pathway significantly reduced the hypertrophic growth of single cardiomyocytes induced by pressure overload. Therefore, it appears that the AKAP-Lbc/p38 complex is crucially involved in the regulation of stress-induced cardiomyocyte hypertrophy and that disruption of this signaling pathway is detrimental for the heart under conditions of sustained hemodynamic stress. Secondly, in order to identify new AKAPs involved in the regulation of cardiac function, we followed a proteomic approach which allowed us to characterize AKAP2 as a major AKAP in the heart.
Importantly, here we show that AKAP2 interacts with several proteins known to be involved in the control of gene transcription, such as the nuclear receptor coactivator 3 (NCoA3) or the ATP-dependent SWI/SNF chromatin remodeling complex. Thus, we propose AKAP2 as a novel mediator of cardiac gene expression through its interaction with these transcriptional regulators.

Relevance:

30.00%

Publisher:

Abstract:

This report proposes that, for certain types of highway construction projects undertaken by the Iowa Department of Transportation, a scheduling technique commonly referred to as linear scheduling may be more effective than the Critical Path Method (CPM) scheduling technique that is currently being used. The projects that appear to be good candidates for the technique are those with a strong linear orientation. Like a bar chart, the technique shows when an activity is scheduled to occur, and like a CPM schedule it shows the sequence in which activities are expected to occur. During the 1992 construction season, the authors worked with an inlay project on Interstate 29 to demonstrate the linear scheduling technique to the Construction Office. The as-planned schedule was developed from the CPM schedule that the contractor had developed for the project. This schedule therefore represents what a linear representation of a CPM schedule would look like, not necessarily what a true linear schedule would look like if it had been the only scheduling technique applied to the project. There is a need to expand the current repertoire of scheduling techniques to address projects for which the bar chart and CPM may not be appropriate, either because of a lack of control information or because the process is overly complex for the actual project characteristics. The scheduling approaches used today on transportation projects have many shortcomings in properly modeling the real-world constraints and conditions that are encountered. Linear projects' predilection for activities with variable production rates, a concept very difficult to handle with CPM, is easily handled and visualized with the linear technique. It is recommended that work proceed on refining the linear scheduling method described above and on developing a microcomputer-based system for use by the Iowa Department of Transportation and contractors for its implementation. The system will be designed to provide the information needed to adjust schedules in a rational, understandable manner, to monitor progress on projects, and to alert Iowa Department of Transportation personnel when the contractor is deviating from the plan.
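A linear schedule plots each activity as a line in time-location space, which makes variable production rates and crew spacing easy to reason about. A minimal sketch, assuming constant production rates per activity (all figures below are hypothetical, not from the Interstate 29 project):

```python
def location_at(start_day, start_station, rate, day):
    """Station reached on `day` by an activity that begins at
    `start_station` on `start_day` and advances `rate` stations/day."""
    if day < start_day:
        return start_station
    return start_station + rate * (day - start_day)

def crews_conflict(a, b, horizon, min_buffer=1.0):
    """Check whether two activities, each given as
    (start_day, start_station, rate), ever come closer than
    `min_buffer` stations on any day up to `horizon`."""
    first_active = max(a[0], b[0])      # compare only once both have started
    for day in range(first_active, horizon + 1):
        gap = abs(location_at(*a, day) - location_at(*b, day))
        if gap < min_buffer:
            return True
    return False

paving  = (0, 0.0, 2.0)   # starts day 0 at station 0, 2 stations/day
marking = (3, 0.0, 2.5)   # follows 3 days later, slightly faster crew
conflict = crews_conflict(paving, marking, horizon=10)
```

Because the faster trailing crew closes the gap by 0.5 stations/day, extending the horizon eventually reveals the interference that a bar chart would hide.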

Relevance:

30.00%

Publisher:

Abstract:

This thesis focuses on integration in project business, i.e. how project-based companies organize their product and process structures when they deliver industrial solutions to their customers. The customers that invest in these solutions run their businesses in different geographical, political and economic environments, which the supplier should acknowledge when providing solutions comprising larger and more complex scopes than previously supplied to these customers. This means that suppliers are increasing their supply range by taking over some of the activities in the value chain that have traditionally been handled by the customer. In order to provide functioning solutions, including more engineering hours, technical equipment and a wider project network, a change in mindset is needed in order to carry out, and take on, the responsibility that these new approaches bring. For the supplier it is important to be able to integrate technical products, systems and services, but the supplier also needs the capabilities to integrate the cross-functional organizations and departments in the project network, the knowledge and information between and within these organizations and departments, and inputs from the customer into the product and process structures during the lifecycle of the project under development. Hence, the main objective of this thesis is to explore the challenges of integration that industrial projects meet and, based on that, to suggest a concept for managing integration in project business by making use of integration mechanisms. Integration is considered the essential process for accomplishing an industrial project, whereas the accomplishment of the industrial project is considered to be the result of the integration.
The thesis consists of an extended summary and four papers that are based on three studies in which integration mechanisms for value creation in industrial project networks and the management of integration in project business have been explored. The research is based on an inductive approach in which, in particular, the design, commissioning and operations functions of industrial projects have been studied, addressing entire project life-cycles. The studies have been conducted in the shipbuilding and power generation industries, where the scopes of supply consist of stand-alone equipment, equipment and engineering, and turnkey solutions. These industrial solutions involve demanding efforts in engineering and organization. Addressing the calls for more studies on the evolving value chains of integrated solutions, mechanisms for inter- and intra-organizational integration and subsequent value creation in project networks have been explored. The research results in thirteen integration mechanisms, and a typology for integration is proposed. Managing integration consists of integrating the project network (the supplier and the sub-suppliers) and the customer (the customer’s business purpose, operations environment and the end-user) into the project by making use of integration mechanisms. The findings bring new insight into research on industrial project business by proposing the integration of technology- and engineering-related elements with elements related to customer-oriented business performance in contemporary project environments. The thirteen mechanisms for combining products and the processes needed to deliver projects are described and categorized according to the impact they have on the management of knowledge and information. These mechanisms relate directly to the performance of the supplier, and consequently to the functioning of the solution that the project provides.
This thesis offers ways to promote integration of knowledge and information during the lifecycle of industrial projects, enhancing the development towards innovative solutions in project business.

Relevance:

30.00%

Publisher:

Abstract:

This study combines several projects related to flows in vessels with complex shapes representing different chemical apparatuses. Three major cases were studied. The first is a two-phase plate reactor with a complex structure of intersecting micro channels engraved on one plate, which is covered by another, plain plate. The second case is a tubular microreactor, consisting of two subcases. The first subcase is a multi-channel two-component commercial micromixer (slit interdigital) used to mix two liquid reagents before they enter the reactor. The second subcase is a micro-tube, in which the distribution of the heat generated by the reaction was studied. The third case is a conventionally packed column. However, flow, reactions and mass transfer were not modeled there. Instead, the research focused on how to describe mathematically the realistic geometry of the column packing, which is rather random and cannot be created using conventional computer-aided design or engineering (CAD/CAE) methods. Several modeling approaches were used to describe the performance of the processes in the considered vessels. Computational fluid dynamics (CFD) was used to describe the details of the flow in the plate microreactor and the micromixer. A space-averaged mass transfer model based on Fick’s law was used to describe the exchange of species through the gas-liquid interface in the microreactor. This model utilized data, namely the values of the interfacial area, obtained from the corresponding CFD model. A common heat transfer model was used to find the heat distribution in the micro-tube. To generate the column packing, an additional multibody dynamics model was implemented. An auxiliary simulation was carried out to determine the position and orientation of every packing element in the column. This data was then exported into a CAD system to generate the desired geometry, which could then be used for CFD simulations.
The results demonstrated that the CFD model of the microreactor could predict the flow pattern well and agreed with experiments. The mass transfer model allowed the mass transfer coefficient to be estimated. Modeling for the second case showed that the flow in the micromixer and the heat transfer in the tube could be excluded from the larger model that describes the chemical kinetics in the reactor. Results of the third case demonstrated that the auxiliary simulation could successfully generate complex random packing, not only for the column but also for other similar cases.
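The packing-generation idea can be illustrated with a much simpler stand-in: rejection sampling of non-overlapping spheres in a box. This is a toy sketch only (the study used a multibody dynamics simulation of real packing elements, not rejection sampling; all dimensions below are arbitrary):

```python
import math
import random

def pack_spheres(n, box=20.0, radius=1.0, max_tries=10000, seed=42):
    """Place up to n non-overlapping spheres of equal radius in a cubic
    box by rejection sampling: propose a random centre, accept it only
    if it is at least one diameter from every accepted centre."""
    rng = random.Random(seed)
    centres = []
    tries = 0
    while len(centres) < n and tries < max_tries:
        tries += 1
        c = tuple(rng.uniform(radius, box - radius) for _ in range(3))
        if all(math.dist(c, other) >= 2 * radius for other in centres):
            centres.append(c)
    return centres

centres = pack_spheres(20)
# Every accepted pair of centres is separated by at least one diameter,
# so the spheres never overlap; the list could then be exported to CAD.
```

Rejection sampling stalls at high packing fractions, which is one reason dynamic (settling) simulations are preferred for dense random packings.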

Relevance:

30.00%

Publisher:

Abstract:

The initial timing of face-specific effects in event-related potentials (ERPs) is a point of contention in face processing research. Although effects during the time of the N170 are robust in the literature, inconsistent effects during the time of the P100 challenge the interpretation of the N170 as the initial face-specific ERP effect. The early P100 effects are often attributed to low-level differences between face stimuli and a host of other image categories. Research using sophisticated controls for low-level stimulus characteristics (Rousselet, Husk, Bennett, & Sekuler, 2008) reports robust face effects starting at around 130 ms following stimulus onset. The present study examines the independent components (ICs) of the P100 and N170 complex in the context of a minimally controlled low-level stimulus set and a clear P100 effect for faces versus houses at the scalp. Results indicate that four ICs account for the ERPs to faces and houses in the first 200 ms following stimulus onset. The IC that accounts for the majority of the scalp N170 (icN1a) begins dissociating stimulus conditions at approximately 130 ms, closely replicating the scalp results of Rousselet et al. (2008). The scalp effects at the time of the P100 are accounted for by two constituent ICs (icP1a and icP1b). The IC that projects the greatest voltage at the scalp during the P100 (icP1a) shows a face-minus-house effect over the period of the P100 that is less robust than the N170 effect of icN1a when measured as the average of single-subject differential activation robustness. The second constituent process of the P100 (icP1b), although projecting a smaller voltage to the scalp than icP1a, shows a more robust effect for the face-minus-house contrast starting prior to 100 ms following stimulus onset.
Further, the effect expressed by icP1b takes the form of a larger negative projection to medial occipital sites for houses than for faces, partially canceling the larger projection of icP1a and thereby enhancing the face positivity at this time. These findings have three main implications for ERP research on face processing. First, the ICs that constitute the face-minus-house P100 effect are independent of the ICs that constitute the N170 effect, suggesting that the P100 effect and the N170 effect are anatomically independent. Second, the timing of the N170 effect can be recovered from scalp ERPs that have spatio-temporally overlapping effects possibly associated with low-level stimulus characteristics. This unmixing of the EEG signals may reduce the need for highly constrained stimulus sets, a characteristic that is not always desirable for a topic so closely coupled to ecological validity. Third, by unmixing the constituent processes of the EEG signals, new analysis strategies become available. In particular, exploring the relationship between cortical processes over the period of the P100 and N170 ERP complex (and beyond) may provide previously inaccessible answers to questions such as: Is the face effect a special relationship between low-level and high-level processes along the visual stream?