58 results for Real-life Projects

in the Aston University Research Archive


Relevance:

100.00%

Abstract:

Many planning and control tools, especially network analysis, have been developed in the last four decades. Most of them originated in military organizations, created to plan and control research and development projects. The original network model (CPM/PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting targets and defining objectives, but it failed to satisfy the requirements of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the CPM structure to eliminate its deficiencies, but none of them provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures and is suitable for project evaluation at the development stage; CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work presents an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, the research develops a simulation-based network model which combines parts of the logic of both VERT and CYCLONE. Several new nodes were designed to model the availability and flow of resources, overhead and operating costs, and the evaluation of time and cost. A large software package was written to handle the input, the simulation process and the output of the model; it is designed to run on any microcomputer under the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique. Finally, conclusions are drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of the thesis.
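
As an illustration of the kind of simulation such a model performs, the sketch below runs a Monte Carlo pass over a small stochastic activity network with overhead and operating costs. The activities, triangular duration distributions and cost rates are invented for the example and are not taken from the thesis.

```python
import random

# Illustrative activity network (names, logic and numbers are assumptions):
# activity -> (predecessors, (min, mode, max) duration in days, crew size)
ACTIVITIES = {
    "excavate":   ((),              (3, 5, 9),   2),
    "foundation": (("excavate",),   (4, 6, 10),  3),
    "frame":      (("foundation",), (8, 10, 15), 4),
    "roof":       (("frame",),      (3, 4, 6),   2),
}
OVERHEAD_PER_DAY = 500.0   # assumed site overhead cost
CREW_DAY_RATE = 300.0      # assumed operating cost per crew member per day

def simulate_once():
    """One Monte Carlo pass: sample a duration per activity, then
    propagate earliest finish times through the precedence network."""
    duration, finish = {}, {}
    for name, (preds, (lo, mode, hi), _) in ACTIVITIES.items():
        duration[name] = random.triangular(lo, hi, mode)
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + duration[name]
    makespan = max(finish.values())
    operating = sum(duration[n] * crew * CREW_DAY_RATE
                    for n, (_, _, crew) in ACTIVITIES.items())
    return makespan, operating + makespan * OVERHEAD_PER_DAY

runs = [simulate_once() for _ in range(10_000)]
times, costs = zip(*runs)
print(f"mean duration: {sum(times)/len(times):.1f} days")
print(f"mean total cost: {sum(costs)/len(costs):,.0f}")
```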

Relevance:

100.00%

Abstract:

A Product-Service System (PSS) is an integrated product and service offering that delivers value in use. This paper presents a real-life case study of a large company that has moved towards PSS. A research protocol was created to conduct an extensive series of interviews with key personnel within the case-study company. The results of the study and their implications for research are explored.

Relevance:

90.00%

Abstract:

Exploratory analysis of data seeks to find common patterns to gain insights into the structure and distribution of the data. In geochemistry it is a valuable means of gaining insight into the complicated processes making up a petroleum system. Typically, linear visualisation methods such as principal component analysis, linked plots, or brushing are used. These methods cannot be employed directly when data are missing, and they struggle to capture global non-linear structure in the data, although they can do so locally. This thesis discusses a complementary approach based on a non-linear probabilistic model. The generative topographic mapping (GTM) enables the effects of very many variables to be visualised on a single plot, which can capture more structure than a two-dimensional principal components plot. The model can deal with uncertainty and missing data, and allows the non-linear structure in the data to be explored. In this thesis a novel approach to initialising the GTM with arbitrary projections is developed. This makes it possible to combine GTM with algorithms such as Isomap and to fit complex non-linear structures such as the Swiss roll. Another novel extension is the incorporation of prior knowledge about the structure of the covariance matrix; this greatly enhances the modelling capabilities of the algorithm, resulting in a better fit to the data and better imputation of missing values. Additionally, an extensive benchmark study of the missing-data imputation capabilities of GTM is performed. Further, a novel approach based on missing data is introduced to benchmark the fit of probabilistic visualisation algorithms on unlabelled data. Finally, the work is complemented by evaluating the algorithms on real-life datasets from geochemical projects.
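
A minimal sketch of the initialisation idea, under stated assumptions: an Isomap projection supplies the initial latent positions, and a radial basis function mapping from latent to data space is fitted by least squares, which is the role a PCA-based initialisation normally plays in GTM. GTM training itself is not shown (it is not available in scikit-learn); the grid size, kernel width and dataset are illustrative choices, not the thesis's exact procedure.

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=1000, random_state=0)

# Step 1: a non-linear projection supplies the initial latent positions.
latent = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
latent = (latent - latent.mean(0)) / latent.std(0)       # standardise

# Step 2: initialise an RBF mapping latent -> data space by least squares,
# standing in for the PCA-based initialisation normally used in GTM.
rng = np.random.default_rng(0)
centres = latent[rng.choice(len(latent), 64, replace=False)]
width = np.median(np.linalg.norm(centres[:, None] - centres[None], axis=-1))
Phi = np.exp(-np.square(np.linalg.norm(latent[:, None] - centres[None], axis=-1))
             / (2 * width ** 2))
W, *_ = np.linalg.lstsq(Phi, X, rcond=None)               # initial weight matrix

print("reconstruction RMSE:", np.sqrt(np.mean((Phi @ W - X) ** 2)))
```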

Relevance:

90.00%

Abstract:

The eighth edition is a fundamental update to the seventh edition published in 2000. It examines a comprehensive range of existing and newer topics relevant to project financing in 2012 and explores current trends in the project finance and leasing industries; the contributors are experienced academics and practitioners. Since the first edition was published, the financial markets have undergone tremendous upheaval and many new structures and instruments have been created to meet the financing needs of business. This edition considers the wider world of project finance, applicable to situations as diverse as venture capital and leveraged buyouts, and covers newer approaches such as Islamic finance techniques. This is a comprehensive and practical book, full of advice and tips for successful project financing, including leasing, offering a clear, easy-to-understand guide to a complex area, with examples. The topic coverage is well organized and complete, moving from the fundamentals to the more complex issues, and an extensive glossary supports readers. Using the principles of project financing to explain complex structures, the book includes numerous examples and case studies (including Eurotunnel, Dabhol, the multiple Paiton deals and other recent transactions, along with subsequent developments) to show the concepts in use, examine outcomes, and illuminate issues such as effective project structuring and financing, financial modelling for project valuation, and risk management. The text unites the domain of project financing with a wealth of project management techniques, supported where appropriate by diagrams, charts and other pictorial features that aid understanding; in many chapters diagrams clarify the specific transaction structure discussed in the accompanying text, which is particularly useful where the structure is complex or unusual. A number of checklists assist stakeholders in the project and resource management of complex project financings. The new financial modelling chapters explore some of the pitfalls project models encounter in attempting to replicate project cash flows accurately for stakeholders to evaluate; the new risk management chapters include worked examples to illustrate the techniques in practice; and the new public-private partnership/private finance initiative chapter introduces readers to this approach to public projects. References are made to useful websites throughout the text.
Twelve practitioner case studies at the end of the main text encourage examination of real-life project financing in practice and highlight specific issues of current interest. The book will be helpful to project finance sponsors, lawyers, host governments, bankers and providers of capital.
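
To make the financial-modelling material concrete, here is a toy sketch of two quantities any project finance model reports: the net present value of project cash flows and the annual debt service coverage ratio that loan covenants typically constrain. All figures are invented; this is not an example from the book.

```python
# Toy project cash-flow model (all figures invented for illustration).
revenues     = [0, 0, 120, 140, 150, 150, 150]   # per year; construction in years 0-1
opex         = [0, 0, 35, 38, 40, 40, 40]
debt_service = [0, 0, 55, 55, 55, 55, 55]        # fixed annuity on project debt
discount_rate = 0.10

# Cash flow available for debt service (CFADS), year by year.
cfads = [r - o for r, o in zip(revenues, opex)]

# Net present value of the project's free cash flow.
npv = sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cfads))
print(f"NPV @ {discount_rate:.0%}: {npv:.1f}")

# Debt service coverage ratio per operating year: lenders typically
# require a covenant minimum (e.g. 1.2x-1.4x).
for year, (cf, ds) in enumerate(zip(cfads, debt_service)):
    if ds:
        print(f"year {year}: DSCR = {cf / ds:.2f}")
```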

Relevance:

90.00%

Abstract:

Background: The purpose of this study was to investigate the 12-month outcomes of macular edema secondary to both chronic and new central and branch retinal vein occlusions treated with intravitreal bevacizumab in a real-life clinical setting in the UK. Methods: Retrospective case-note analysis of consecutive patients with retinal vein occlusion treated with bevacizumab between 2010 and 2012. Outcome measures were visual acuity (measured with Snellen charts and converted into logMAR [logarithm of the minimum angle of resolution] for statistical analysis) and central retinal thickness at baseline, 4 weeks post-loading phase, and at 1 year. Results: There were 56 and 100 patients with central and branch retinal vein occlusions, respectively, of whom 62% had chronic edema and had received prior therapies, and another 32% required additional laser treatment after baseline bevacizumab. Baseline median visual acuity was 0.78 (interquartile range [IQR] 0.48-1.22) in the central group and 0.6 (IQR 0.3-0.78) in the branch group. In both groups, visual improvement was statistically significant from baseline to post-loading (P<0.001 and P=0.03, respectively) but not by month 12 (P=0.058 and P=0.166, respectively); 30% improved by at least three lines and 44% by at least one line by month 12. Baseline median central retinal thickness was 449 μm (IQR 388-553) in the central group and 441 μm (IQR 357-501) in the branch group. The mean reduction in thickness was statistically significant both post-loading (P<0.001) and at the 12-month time point (P<0.001) for both groups. The average number of injections in 1 year was 4.2 in the central group and 3.3 in the branch group. Conclusion: Our large real-world cohort indicates that bevacizumab given to patients with either new or chronic edema due to retinal vein occlusion can result in resolution of edema and stabilization of vision in the first year.
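
The Snellen-to-logMAR conversion mentioned in the methods is a one-line formula; a small sketch follows (the chart lines shown are chosen for illustration):

```python
import math

def snellen_to_logmar(numerator: float, denominator: float) -> float:
    """logMAR = -log10(numerator/denominator) = log10(denominator/numerator)."""
    return math.log10(denominator / numerator)

print(snellen_to_logmar(6, 6))              # 6/6  -> 0.0
print(snellen_to_logmar(6, 60))             # 6/60 -> 1.0
print(round(snellen_to_logmar(6, 36), 2))   # 6/36 -> 0.78
```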

Relevance:

80.00%

Abstract:

Mixture Density Networks are a principled method for modelling conditional probability density functions which are non-Gaussian. This is achieved by modelling the conditional distribution of each pattern with a Gaussian mixture model whose parameters are generated by a neural network. This thesis presents a novel method of introducing regularisation in this context for the special case where the means and variances of the spherical Gaussian kernels in the mixtures are fixed to predetermined values. Guidelines for initialising these parameters are given, and it is shown how to apply the evidence framework to mixture density networks to achieve regularisation. This also provides an objective stopping criterion that can replace the 'early stopping' methods used previously. If the neural network used is an RBF network with fixed centres, this opens up new opportunities for improved initialisation of the network weights, which are exploited to start training relatively close to the optimum. The new method is demonstrated on two data sets: a simple synthetic data set, and a real-life data set of satellite scatterometer measurements used to infer wind speed and wind direction near the ocean surface. For both data sets the regularisation method performs well in comparison with earlier published results. Ideas on how the constraint on the kernels may be relaxed to allow fully adaptable kernels are also presented.
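
A minimal numpy sketch of the special case the thesis studies: with kernel means and variances fixed, the network only has to output mixing coefficients, and training would minimise the mixture's negative log-likelihood. The grid placement, kernel width and stand-in linear network below are assumptions for illustration, not the thesis's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed spherical kernels: centres on a grid over the target range, shared width.
kernel_means = np.linspace(-1.0, 1.0, 9)   # M fixed kernel centres (assumed)
sigma = 0.25                               # fixed kernel width (assumed)

def gaussian(t, mu, s):
    return np.exp(-0.5 * ((t - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def mdn_nll(logits, t):
    """Negative log-likelihood when only mixing coefficients are learned.
    logits: (N, M) network outputs; t: (N,) scalar targets."""
    pi = np.exp(logits - logits.max(1, keepdims=True))
    pi /= pi.sum(1, keepdims=True)          # softmax mixing coefficients
    dens = gaussian(t[:, None], kernel_means[None, :], sigma)   # (N, M)
    return -np.mean(np.log((pi * dens).sum(1) + 1e-12))

# With fixed kernels the network's only job is to map inputs x to logits;
# a random linear map stands in for the RBF network of the thesis.
x = rng.uniform(-1, 1, size=(200, 3))
W = rng.normal(size=(3, len(kernel_means)))
t = rng.uniform(-1, 1, size=200)
print("NLL of untrained mixture:", mdn_nll(x @ W, t))
```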

Relevance:

80.00%

Abstract:

Data visualization algorithms and feature selection techniques are both widely used in bioinformatics, but as distinct analytical approaches. Until now there has been no method of measuring feature saliency while training a data visualization model. We derive a generative topographic mapping (GTM)-based data visualization approach which estimates feature saliency simultaneously with the training of the visualization model. The approach not only provides a better projection by modeling irrelevant features with a separate noise model, but also gives feature saliency values which help the user to assess the significance of each feature. We compare the quality of projection obtained using the new approach with the projections from traditional GTM and self-organizing map (SOM) algorithms. The results obtained on a synthetic and a real-life chemoinformatics dataset demonstrate that the proposed approach successfully identifies feature significance and provides coherent (compact) projections. © 2006 IEEE.
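
A sketch of the modelling idea behind feature saliency, under stated assumptions: each feature contributes a mixture of a component-specific density (weighted by its saliency) and a common noise density, so irrelevant features are explained by the noise model. The sketch illustrates this likelihood structure with a plain Gaussian mixture, not the paper's GTM formulation.

```python
import numpy as np

def gauss(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def saliency_likelihood(X, means, var, saliency, noise_mu, noise_var):
    """Per-point likelihood when each feature d is 'relevant' with
    probability saliency[d] (component-specific density) and otherwise
    follows a feature-wide noise density. X: (N, D); means: (K, D)."""
    comp = gauss(X[:, None, :], means[None, :, :], var)       # (N, K, D)
    noise = gauss(X, noise_mu, noise_var)                     # (N, D)
    per_feature = saliency * comp + (1 - saliency) * noise[:, None, :]
    return per_feature.prod(axis=2).mean(axis=1)              # equal mixture weights

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 4))
means = rng.normal(size=(3, 4))
sal = np.array([0.9, 0.8, 0.1, 0.05])   # last two features treated as ~irrelevant
print(saliency_likelihood(X, means, 1.0, sal, X.mean(0), X.var(0)))
```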

Relevance:

80.00%

Abstract:

This thesis developed from a real-life application: performance evaluation of small and medium-sized enterprises (SMEs) in Vietnam. It presents two main methodological developments for evaluating the impact of dichotomous environment variables on technical efficiency. Taking selection bias into account, the thesis proposes a revised frontier separation approach for the seminal Data Envelopment Analysis (DEA) model developed by Charnes, Cooper, and Rhodes (1981); the revised approach pairs treated SMEs with their counterfactuals by nearest-neighbour matching on the propensity score. The thesis also develops an order-m frontier conditioned on the propensity score, building on the conditional order-m approach proposed by Cazals, Florens, and Simar (2002) and advocated by Daraio and Simar (2005). This development allows the conditional order-m approach to be applied with a dichotomous environment variable while accounting for the self-selection problem of impact evaluation. Monte Carlo style simulations were built to examine the effectiveness of these developments. The methodological developments are applied in empirical studies evaluating the impact of training programmes on the performance of food processing SMEs and the impact of exporting on the technical efficiency of textile and garment SMEs in Vietnam. The analysis shows that training programmes have no significant impact on the technical efficiency of food processing SMEs. Moreover, it confirms the conclusion of the export literature that exporters self-select into the sector: the thesis finds no significant impact of exporting activities on the technical efficiency of textile and garment SMEs, although a large bias is eliminated by the proposed approach. The results of the empirical studies contribute to understanding the impact of different environment variables on the performance of SMEs, and help policy makers design appropriate policies to support the development of Vietnamese SMEs.
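
A minimal sketch of the nearest-neighbour propensity score matching step, on synthetic data (the thesis applies it to Vietnamese SME surveys; the variable names and data-generating process here are invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic firm data: covariates X, treatment flag (e.g. received training),
# outcome (e.g. an efficiency score). Selection into treatment depends on X.
n = 400
X = rng.normal(size=(n, 3))
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))
outcome = X @ [0.5, 0.2, -0.1] + 0.3 * treated + rng.normal(0, 0.5, n)

# Step 1: estimate propensity scores P(treated | X).
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: match each treated unit to the nearest control on the score.
t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
matches = c_idx[np.abs(ps[c_idx][None, :] - ps[t_idx][:, None]).argmin(1)]

# Step 3: average treatment effect on the treated from the matched pairs.
att = (outcome[t_idx] - outcome[matches]).mean()
print(f"estimated ATT: {att:.3f} (true effect 0.3)")
```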

Relevance:

80.00%

Abstract:

Offshore oil and gas pipelines pose an environmental hazard: any leak or burst causes an oil or gas spill with severe negative impacts on marine life. Breakdown maintenance of these pipelines is also cost-intensive and time-consuming, resulting in large tangible and intangible losses to pipeline operators. Pipeline health monitoring and integrity analysis have been researched extensively to support successful pipeline operations, and risk-based maintenance models are one outcome of that research. This study develops a risk-based maintenance model using a combined multiple-criteria decision-making and weighting method for offshore oil and gas pipelines in Thailand, with the active participation of experienced executives. The model's effectiveness is demonstrated through real-life application to oil and gas pipelines in the Gulf of Thailand. Practical implications: risk-based inspection and maintenance methodology is particularly important for pipeline systems, as any failure not only affects productivity negatively but also has a tremendous negative environmental impact. The proposed model helps pipeline operators analyse the health of their pipelines dynamically and select specific inspection and maintenance methods for specific sections, in line with their probability and severity of failure.
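
A toy sketch of the weighted multi-criteria risk scoring such a model rests on: criteria weights stand in for the executives' elicited judgments, and each pipeline section is ranked by weighted failure likelihood times severity. The sections, criteria and numbers are invented and do not come from the study.

```python
# Criteria weights, e.g. elicited from experienced executives via a
# multiple-criteria weighting method (values invented for the sketch).
CRITERIA_WEIGHTS = {"corrosion": 0.35, "third_party": 0.25,
                    "design": 0.20, "operations": 0.20}

SECTIONS = {   # 1-5 likelihood score per criterion, plus failure severity 1-5
    "riser A":     ({"corrosion": 4, "third_party": 2, "design": 3, "operations": 2}, 5),
    "subsea km3":  ({"corrosion": 3, "third_party": 4, "design": 2, "operations": 3}, 4),
    "onshore km9": ({"corrosion": 2, "third_party": 3, "design": 2, "operations": 2}, 2),
}

def risk_score(scores, severity):
    likelihood = sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())
    return likelihood * severity            # risk = probability x severity

# Rank sections so inspection effort goes to the riskiest first.
ranked = sorted(SECTIONS.items(), key=lambda kv: -risk_score(*kv[1]))
for name, (scores, sev) in ranked:
    print(f"{name:12s} risk = {risk_score(scores, sev):.2f}")
```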

Relevance:

80.00%

Abstract:

A content analysis examined the way majorities and minorities are represented in the British press. An analysis of the headlines of five British newspapers over a period of five years revealed that the words 'majority' and 'minority' appeared 658 times. Majority headlines were more frequent (66%), more likely to emphasize the numerical size of the majority, to link majority status with political groups, to be described with positive evaluations, and to cover political issues. By contrast, minority headlines were less frequent (34%), more likely to link minority status with ethnic groups and other social issues, and less likely to be described with positive evaluations. The implications of how real-life majorities and minorities are represented for our understanding of experimental research are discussed.

Relevance:

80.00%

Abstract:

Purpose - One of the principal organizational developments of the last decade has been the pervasive influence of computer-mediated communication (CMC) tools. The purpose of this paper is to closely interrogate the day-to-day role of e-mail in explicating, influencing and shaping social and information interactions within an organization. Design/methodology/approach - A series of in-depth interviews (n = 29) was undertaken to elicit employee opinions on their e-mail adaptation, experiences and practices. Findings - The paper provides insights into the polymorphic role of e-mail, particularly the way in which it is adapted by individuals within the organization. Specifically, it shows how this tool interacts with day-to-day work activities and tasks. Research limitations/implications - This paper investigates only one CMC tool, e-mail, although it is envisaged that this initial work will be used to develop a new understanding of the socially skilled adaptation of other CMC tools by employees as well as leaders. Practical implications - Previously unreported insights into employee opinion are delineated in order to provide a focus from which organizations can train and develop their employees and leaders to maximise knowledge creation within the organization. Originality/value - This study assesses CMC from an under-researched "real-life" perspective in which everyday interactions are used to understand employee reactions to e-mail communication and hence foster an atmosphere in which these interactions assist organizational development.

Relevance:

80.00%

Abstract:

Data envelopment analysis defines the relative efficiency of a decision making unit (DMU) as the ratio of the sum of its weighted outputs to the sum of its weighted inputs, allowing the DMUs to allocate weights freely to their inputs/outputs. However, this measure may not reflect a DMU's true efficiency, as some inputs/outputs may not contribute reasonably to the efficiency measure. Traditionally, weight restrictions have been imposed to overcome this problem. This paper offers a new approach for the case where DMUs operate a constant returns to scale technology in a single-input multi-output context. The approach is based on introducing unobserved DMUs, created by adjusting the output levels of certain observed relatively efficient DMUs, reflecting a combination of technical information on feasible production levels and the DM's value judgments. Its main advantage is that the information conveyed by the DM is local, with reference to a specific observed DMU. The approach is illustrated on a real-life application. © 2003 Elsevier B.V. All rights reserved.
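
For readers unfamiliar with the multiplier form of DEA, here is a minimal sketch of the single-input multi-output CRS efficiency computation the paper builds on (toy data; the paper's unobserved-DMU construction itself is not shown):

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: one input and two outputs per DMU (numbers invented).
x = np.array([10.0, 12.0, 8.0, 15.0])                          # single input
Y = np.array([[60, 25], [70, 20], [45, 30], [80, 35]], float)  # outputs

def ccr_efficiency(k):
    """Output-weight CCR efficiency of DMU k (single input, CRS):
    max u.y_k  s.t.  u.y_j <= x_j / x_k for every DMU j,  u >= 0.
    (With one input, normalising v*x_k = 1 fixes the input weight.)"""
    res = linprog(c=-Y[k],                    # linprog minimises, so negate
                  A_ub=Y, b_ub=x / x[k],
                  bounds=[(0, None)] * Y.shape[1])
    return -res.fun

for k in range(len(x)):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```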

Relevance:

80.00%

Abstract:

Aim: To investigate the correlation between tests of visual function and perceived visual ability recorded with a quality of life questionnaire for patients with uveitis. Methods: 132 patients with various types of uveitis were studied. High-contrast (monocular and binocular) and low-contrast (binocular) logMAR letter acuities were recorded using a Bailey-Lovie chart. Contrast sensitivity (binocular) was determined using a Pelli-Robson chart. Vision-related quality of life was assessed using the Vision Specific Quality of Life (VQOL) questionnaire. Results: VQOL declined with reduced performance on the following tests: binocular high contrast visual acuity (p = 0.0011), high contrast visual acuity of the better eye (p = 0.0012), contrast sensitivity (p = 0.005), binocular low contrast visual acuity (p = 0.0065), and high contrast visual acuity of the worse eye (p = 0.015). Stepwise multiple regression analysis revealed binocular high contrast visual acuity (p < 0.01) to be the only visual function measure sufficient to predict VQOL. The age of the patient was also significantly associated with perceived visual ability (p < 0.001). Conclusions: Binocular high contrast visual acuity is a good measure of how patients with uveitis perform in real-life situations. Vision-related quality of life is worst in younger patients with poor binocular visual acuity.

Relevance:

80.00%

Abstract:

Today, the data available to tackle many scientific challenges is vast in quantity and diverse in nature. The exploration of heterogeneous information spaces requires suitable mining algorithms as well as effective visual interfaces. Most existing systems concentrate either on mining algorithms or on visualization techniques. Though the visual methods developed in information visualization have been helpful, improved understanding of a complex, large, high-dimensional dataset requires an effective projection onto a lower-dimensional (2D or 3D) manifold. This paper introduces a flexible visual data mining framework which combines advanced projection algorithms developed in the machine learning domain with visual techniques developed in the information visualization domain. The framework follows Shneiderman's mantra to provide an effective user interface, whose advantage is that the user is directly involved in the data mining process. We integrate principled projection methods, such as Generative Topographic Mapping (GTM) and Hierarchical GTM (HGTM), with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates, billboarding, and user interaction facilities, to provide an integrated visual data mining framework. Results on a real-life high-dimensional dataset from the chemoinformatics domain are reported and discussed. The projection results of GTM are compared analytically with those of other traditional projection methods, and it is shown that the HGTM algorithm provides additional value for large datasets. The computational complexity of these algorithms is discussed to demonstrate their suitability for the visual data mining framework.
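
A toy sketch of the framework's 'overview plus details' pattern: a 2-D projection gives the overview and a parallel-coordinates view shows a brushed subset. PCA and the Iris dataset stand in for GTM/HGTM and the chemoinformatics data, and the 'brush' is a simple coordinate threshold rather than interactive selection.

```python
import matplotlib.pyplot as plt
import pandas as pd
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

# A small labelled dataset as a stand-in for the chemoinformatics data.
df = load_iris(as_frame=True).frame.rename(columns={"target": "class"})

# Overview: a 2-D projection (PCA here as a stand-in for GTM/HGTM).
proj = PCA(n_components=2).fit_transform(df.drop(columns="class"))
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.scatter(proj[:, 0], proj[:, 1], c=df["class"], cmap="viridis", s=15)
ax1.set_title("projection (overview)")

# Details on demand: parallel coordinates for a 'brushed' subset of points.
subset = df[proj[:, 0] > 1.5]                 # a crude brush on the projection
pd.plotting.parallel_coordinates(subset, "class", ax=ax2)
ax2.set_title("parallel coordinates (details)")
plt.tight_layout()
plt.show()
```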