32 results for To-failure Method

in Aston University Research Archive


Relevance: 90.00%

Abstract:

Purpose – The purpose of this paper is to help managers successfully plan, implement, and operate enterprise resource planning (ERP) projects using a risk management framework. Design/methodology/approach – This paper adopted a combined literature review and case study method. Through the literature review, the paper first identified the major issues in managing ERP projects and developed a risk management framework for addressing them. The proposed framework was then applied to an ERP implementation project of a UK-based energy services group, and its effectiveness in managing ERP project implementation was demonstrated. Additionally, the risk factors identified from the case application are compared with those from previous research in order to suggest mitigating measures. Findings – The risk factors are categorized into planning, implementation and operations phases, along with project process, organizational transformation and information technology (IT) perspectives. The project implementation phase is the most vulnerable to failure. The case study results reveal that the effect of other projects on the on-going ERP project, the management of the overall IT architecture, and the non-availability of resources for organizational transformation are the most critical from likelihood and impact perspectives. Managing risk across the various phases of the project, with equal emphasis on effective project management, organizational transformation and IT adoption, is the key to success in ERP implementation. Practical implications – The risk factors identified through the literature review and the case study have great significance, as mitigating those risks may result in the successful implementation of ERP projects in industry. Additionally, the proposed risk management framework could be customized to implement ERP projects elsewhere.
Originality/value – ERP projects are risky as they are capital intensive, technically complex, and call for organizational transformation. There are both success and failure stories. However, both researchers and practitioners agree that if an ERP system can be implemented and operated successfully, benefits should be achievable. Although there are many studies on ERP implementation, little has been written on managing the risks of ERP projects. This paper bridges that gap.

Relevance: 90.00%

Abstract:

BACKGROUND: In the light of sub-optimal uptake of the measles, mumps, and rubella (MMR) vaccination, we investigated the factors that influence the intentions of mothers to vaccinate. METHOD: A cross-sectional survey of 300 mothers in Birmingham with children approaching a routine MMR vaccination was conducted using a postal questionnaire to measure: intention to vaccinate, psychological variables, knowledge of the vaccine, and socioeconomic status. The vaccination status of the children was obtained from South Birmingham Child Health Surveillance Unit. RESULTS: The response rate was 59%. Fewer mothers approaching the second MMR vaccination (Group 2) intended to take their children for this vaccination than Group 1 (mothers approaching the first MMR vaccination) (Mann-Whitney U = 2180, P < 0.0001). Group 2 expressed more negative beliefs about the outcome of having the MMR vaccine ('vaccine outcome beliefs') (Mann-Whitney U = 2155, P < 0.0001), and were more likely to believe it was 'unsafe' (χ2 = 9.114, P = 0.004) and that it rarely protected (χ2 = 6.882, P = 0.014) than Group 1. The commonest side-effect cited was general malaise, but 29.8% cited autism. The most trusted source of information was the general practitioner, but the most common source of information on side-effects was television (34.6%). Multiple linear regression revealed that, in Group 1, only 'vaccine outcome beliefs' significantly predicted intention (77.1% of the variance). In Group 2, 'vaccine outcome beliefs', attitude to the MMR vaccine, and prior MMR status all predicted intention (93% of the variance). CONCLUSION: A major reason for the low uptake of the MMR vaccination is that it is not perceived to be important for children's health, particularly the second dose. Health education from GPs is likely to have a considerable impact.

Relevance: 90.00%

Abstract:

In recent years, much interest has focused on the beneficial effects of administering potentially harmful therapeutic agents in drug carriers so as to reduce their toxic side effects. Rheumatoid arthritis is a chronic systemic disease with progressive destruction of the joints and long-term patient disability. Corticosteroids have been shown to retard the progression of joint destruction but are limited in their use due to adverse side effects. This project, following the line of investigation started by other workers, was designed to study the use of microspheres to deliver corticosteroids to inflamed tissues by both the oral and intravenous routes. Hydrocortisone (HC)-loaded albumin microspheres were prepared by three different methods: by direct incorporation of HC within the particles, by indirect incorporation of HC through the enzymatic conversion of hydrocortisone-21-phosphate (H-21-P) to HC within the particles, and by adsorption of HC onto the surface. HC was also loaded into PLA microspheres. The level of corticosteroid loading and the in vitro release from microspheres were determined by HPLC analysis. A reversed-phase, ion-pairing HPLC method was developed to measure both HC and H-21-P simultaneously. The highest level of corticosteroid loading was achieved by incorporation of H-21-P with enzymatic conversion to HC. However, HPLC analysis showed that only 5% of the incorporated steroid was HC. In vitro release studies showed that >95% of the incorporated steroid was released from albumin microspheres within 2 hours of dissolution. Increasing the protein:steroid ratio, and the temperature and duration of microsphere stabilization, had little effect on prolonging drug release. In vivo studies, using the carrageenan-induced rat hind-paw model of inflammation, indicated that steroid-incorporated microspheres administered both orally and intraperitoneally were not therapeutically advantageous when compared to equivalent free steroid doses.
The ability of orally and intravenously dosed [125I]-albumin microspheres (2.67 μm mean diameter) to accumulate in acutely and chronically inflamed tissues was investigated. The subcutaneous air-pouch was the model of inflammation used, with carrageenan as the inflammatory stimulus. Acute and chronic inflammation was shown to be consistently produced in pouch tissues, in terms of cell infiltration and fluid exudate formation in the pouch cavity. Albumin microspheres were shown to accumulate in the inflamed tissues and pouch fluids after both oral and intravenous administration. Preliminary confirmatory studies using latex microspheres, with quantitation by GPC analysis, also indicated microsphere accumulation in both acutely and chronically inflamed air-pouch tissues. The results indicate the uptake and transfer of microspheres across the gastrointestinal tract into the circulation and their migration through disrupted endothelium and basement membranes at the inflamed sites.

Relevance: 90.00%

Abstract:

The object of this research was to investigate the behaviour of birdcage scaffolding as used in falsework structures, to assess the suitability of existing design methods, and to make recommendations for a set of design rules. Since excessive deflection is as undesirable in a structure as total collapse, the project was divided into two sections: to determine the ultimate vertical and horizontal load-carrying capacity, and to determine the deflection characteristics, of any falsework. Theoretical analyses were therefore developed to ascertain the ability of the individual standards to resist vertical load, and of the bracing to resist horizontal load. Furthermore, a model was evolved which would predict the horizontal deflection of a scaffold under load using strain energy methods. These models were checked by three series of experiments. The first was on individual standards under vertical load only. The second series was carried out on full-scale falsework structures loaded vertically and horizontally to failure. Finally, experiments were conducted on scaffold couplers to provide additional verification of the method of predicting deflections. This thesis gives the history of the project and an introduction to the field of scaffolding. It details the experiments conducted and the theories developed, and the correlation between theory and experiment. Finally, it makes recommendations for a design method to be employed by scaffolding designers.

Relevance: 90.00%

Abstract:

The Roma population has become a highly debated policy issue in the European Union (EU). The EU acknowledges that this ethnic minority faces extreme poverty and complex social and economic problems: 52% of the Roma population live in extreme poverty and 75% in poverty (Soros Foundation, 2007, p. 8), with a life expectancy at birth about ten years less than that of the majority population. As a result, Romania has received a great deal of policy attention and EU funding, being eligible for 19.7 billion Euros from the EU for 2007-2013. Yet progress is slow, and it is debated whether Romania's government and companies were capable of using these funds (EurActiv.ro, 2012). Analysing three case studies, this research looks at policy implementation in relation to the role of Roma networks in different geographical regions of Romania. It gives insights into how to get things done in complex settings and explains responses to the Roma problem as a 'wicked' policy issue. This longitudinal research was conducted between 2008 and 2011, comprising 86 semi-structured interviews, 15 observations, and documentary sources, and using a purposive sample focused on the institutions responsible for implementing social policies for Roma: Public Health Departments, School Inspectorates, City Halls, Prefectures, and NGOs. Respondents included governmental workers, academics, Roma school mediators, Roma health mediators, Roma experts, Roma Councillors, NGO workers, and Roma service users. By triangulating the data collected with various methods and from various categories of respondents, a comprehensive and precise representation of Roma network practices was created. The provisions of the 2001 'Governmental Strategy to Improve the Situation of the Roma Population' facilitated the formation of a Roma network by introducing special jobs in local and central administration. In different counties, resources, people, their skills, and practices varied.
As opposed to the communist period, a new Roma elite emerged: social entrepreneurs set the pace of change by creating either closed cliques or open alliances and by using more or less transparent practices. This research deploys the concept of social/institutional entrepreneurs to analyse how key actors influence clique and alliance formation and functioning. Significantly, by contrasting three case studies, it shows that both closed cliques and open alliances help to achieve public policy network objectives, but that closed cliques can also lead to failure to improve the health and education of Roma people in a certain region.

Relevance: 90.00%

Abstract:

Hydrocarbons are the most common form of energy used to date. Activities involving the exploration and exploitation of large oil and gas fields are constantly in operation and have extended to hostile environments such as the North Sea. This places much greater demands on the materials used, and the need to enhance the endurance of existing materials must continue in parallel with these explorations. Due to their ease of fabrication, relatively high mechanical properties and low cost, steels are the most widely favoured materials for the construction of offshore platforms. The parts of an offshore structure most prone to failure are the welded nodal joints, particularly those within the vicinity of the splash zone. This is an area of highly complex stress concentrations and varying mechanical and metallurgical properties, in addition to severe North Sea environmental conditions. The main area of this work has been the durability of this type of steel, based on the concept of worst-case analysis, considering combinations of welds of varying quality, various degrees of stress concentration, and the environmental conditions of stress corrosion and hydrogen embrittlement. The experiments were designed to reveal the significance of defects as sites of crack initiation in the welded steels and the extent to which stress corrosion and hydrogen embrittlement limit their durability. This was done for various heat treatments, and in some experiments deformation was forced through the welded zone of the specimens to reveal the mechanical properties of the welds themselves and to provide data for finite element simulations.
The results of these simulations have been compared with the actual deformation and fracture behaviour to reveal the extent to which both mechanical and metallurgical factors control the behaviour of the steels in the hostile environment of high stress, corrosion, and hydrogen embrittlement at their surface.

Relevance: 90.00%

Abstract:

This work presents a two-dimensional risk assessment method based on the quantification of the probability of occurrence of contaminant source terms, together with the assessment of the resultant impacts. The risk is calculated using Monte Carlo simulation, whereby synthetic contaminant source terms are generated to the same distribution as historically occurring pollution events or to an a priori probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer are then simulated from repeated realisations using integrated mathematical models. The number of times that user-defined ranges of concentration magnitude are exceeded is quantified as the risk. The utility of the method was demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together was evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as the trends within the study area. This capability to generate synthetic pollution events from numerous potential sources, based on the historical frequency of their occurrence, proved to be a great asset of the method and a large benefit over contemporary methods.
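The exceedance-counting idea described above can be sketched in a few lines. This is a minimal illustration with hypothetical names: the toy `simulated_concentration` function stands in for the integrated transport models of the paper, and the numbers are made up.

```python
import random

def simulated_concentration(source_strength, distance):
    # Hypothetical stand-in for the integrated transport model:
    # concentration decays with distance from the source.
    return source_strength / (1.0 + distance ** 2)

def pollution_risk(n_realisations, p_event, strength_dist, distance, threshold):
    """Fraction of realisations in which the simulated concentration at a
    monitoring point exceeds a user-defined threshold."""
    exceedances = 0
    for _ in range(n_realisations):
        # Generate a synthetic source term with the historical
        # probability of a pollution event occurring.
        if random.random() < p_event:
            strength = strength_dist()
            if simulated_concentration(strength, distance) > threshold:
                exceedances += 1
    return exceedances / n_realisations

random.seed(1)
risk = pollution_risk(
    n_realisations=10_000,
    p_event=0.3,                                   # historical event frequency
    strength_dist=lambda: random.uniform(0, 100),  # a priori source distribution
    distance=3.0,
    threshold=5.0,
)
print(round(risk, 3))
```

Repeating this for every monitoring point over a spatial grid yields the kind of risk map the abstract describes.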

Relevance: 90.00%

Abstract:

This paper advances a philosophically informed rationale for the broader, reflexive and practical application of arts-based methods to benefit research, practice and pedagogy. It addresses the complexity and diversity of learning and knowing, foregrounding a cohabitative position and recognition of a plurality of research approaches, tailored and responsive to context. Appreciation of art and aesthetic experience is situated in the everyday, underpinned by multi-layered exemplars of pragmatic visual-arts narrative inquiry undertaken in the third, creative and communications sectors. The discussion considers semi-guided use of arts-based methods as a conduit for topic engagement, reflection and intersubjective agreement, alongside observation and interpretation of organically employed approaches used by participants within daily norms. Techniques span handcrafted (drawing), digital (photography), hybrid (cartooning), performance (improvised installations) and musical (metaphor and structure) dimensions. The process of creation, the artefact/outcome produced and the experience of consummation are all significant, with specific reflexivity impacts. Exploring methodology and epistemology, both the "doing" and its interpretation are explicated to inform method selection, replication, utility, evaluation and the development of cross-media skills literacy. The approaches are found to be engaging, accessible and empowering, with nuanced capabilities to alter relationships with phenomena, experiences and people. Building a discursive space that reduces barriers supports emancipation, interaction, polyphony, letting-go and the progressive unfolding of thoughts, benefiting ways of knowing, narrative (re)construction, sensory perception and capacities to act. This can also present underexplored researcher risks with respect to emotion work, self-disclosure, identity and agenda.
The paper therefore elucidates complex, intricate relationships between form and content, the represented and the representation or performance, researcher and participant, and the self and other. This benefits understanding of phenomena including personal experience, sensitive issues, empowerment, identity, transition and liminality. Observations are relevant to qualitative and mixed methods researchers and a multidisciplinary audience, with explicit identification of challenges, opportunities and implications.

Relevance: 90.00%

Abstract:

Objectives: To develop a tool for the accurate reporting and aggregation of findings from each of the multiple methods used in a complex evaluation in an unbiased way. Study Design and Setting: We developed a Method for Aggregating The Reporting of Interventions in Complex Studies (MATRICS) within a gastroenterology study [Evaluating New Innovations in (the delivery and organisation of) Gastrointestinal (GI) endoscopy services by the NHS Modernisation Agency (ENIGMA)]. We subsequently tested it on a different gastroenterology trial [Multi-Institutional Nurse Endoscopy Trial (MINuET)]. We created three layers to define the effects, methods, and findings from ENIGMA. We assigned numbers to each effect in layer 1 and letters to each method in layer 2. We then assigned an alphanumeric code, based on layers 1 and 2, to every finding in layer 3 to link the aims, methods, and findings. We illustrated analogous findings by assigning more than one alphanumeric code to a finding. We also showed that more than one effect or method could report the same finding. We presented contradictory findings by listing them in adjacent rows of the MATRICS. Results: MATRICS was useful for the effective synthesis and presentation of findings of the multiple methods from ENIGMA. We subsequently tested it successfully by applying it to the MINuET trial. Conclusion: MATRICS is effective for synthesizing the findings of complex, multiple-method studies.
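The layered alphanumeric coding can be illustrated with a small sketch. The effects, methods, and findings below are hypothetical stand-ins, not entries from ENIGMA; the point is only the linking mechanism.

```python
# Layer 1: numbered effects; Layer 2: lettered methods.
effects = {1: "Reduced waiting times", 2: "Improved patient experience"}
methods = {"A": "Routine-data analysis", "B": "Staff interviews"}

# Layer 3: each finding carries one or more alphanumeric codes linking
# it back to an effect (number) and a method (letter). A finding with
# several codes is an analogous finding reported via more than one
# effect/method combination.
findings = {
    "Waiting lists shortened after redesign": ["1A", "1B"],
    "Nurses reported higher workload": ["2B"],
}

def methods_reporting(effect_number):
    """All methods whose findings report a given effect."""
    letters = {
        code[1:]
        for codes in findings.values()
        for code in codes
        if code.startswith(str(effect_number))
    }
    return sorted(methods[letter] for letter in letters)

print(methods_reporting(1))  # → ['Routine-data analysis', 'Staff interviews']
```

Contradictory findings would simply be listed as adjacent entries sharing the same code, mirroring the adjacent rows of the MATRICS grid.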

Relevance: 90.00%

Abstract:

Oxidative post-translational modifications (oxPTMs) can alter the function of proteins, and are important in the redox regulation of cell behaviour. The most informative technique to detect and locate oxPTMs within proteins is mass spectrometry (MS). However, proteomic MS data are usually searched against theoretical databases using statistical search engines, and the occurrence of unspecified or multiple modifications, or other unexpected features, can lead to failure to detect the modifications and erroneous identifications of oxPTMs. We have developed a new approach for mining data from accurate mass instruments that allows multiple modifications to be examined. Accurate mass extracted ion chromatograms (XIC) for specific reporter ions from peptides containing oxPTMs were generated from standard LC-MSMS data acquired on a rapid-scanning high-resolution mass spectrometer (ABSciex 5600 Triple TOF). The method was tested using proteins from human plasma or isolated LDL. A variety of modifications including chlorotyrosine, nitrotyrosine, kynurenine, oxidation of lysine, and oxidized phospholipid adducts were detected. For example, the use of a reporter ion at 184.074 Da/e, corresponding to phosphocholine, was used to identify for the first time intact oxidized phosphatidylcholine adducts on LDL. In all cases the modifications were confirmed by manual sequencing. ApoB-100 containing oxidized lipid adducts was detected even in healthy human samples, as well as LDL from patients with chronic kidney disease. The accurate mass XIC method gave a lower false positive rate than normal database searching using statistical search engines, and identified more oxidatively modified peptides. A major advantage was that additional modifications could be searched after data collection, and multiple modifications on a single peptide identified. The oxPTMs present on albumin and ApoB-100 have potential as indicators of oxidative damage in ageing or inflammatory diseases.
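The reporter-ion XIC idea can be sketched as follows. The scan data are hypothetical and `extract_xic` is an illustrative helper, not the authors' software; real spectra would come from the instrument's LC-MS/MS files.

```python
def extract_xic(scans, target_mz, ppm_tol=10.0):
    """Extracted ion chromatogram: for each scan, sum the intensity of
    peaks within a ppm mass tolerance of the target m/z (e.g. 184.074
    for the phosphocholine reporter ion)."""
    window = target_mz * ppm_tol / 1e6
    xic = []
    for retention_time, peaks in scans:
        intensity = sum(i for mz, i in peaks if abs(mz - target_mz) <= window)
        xic.append((retention_time, intensity))
    return xic

# Hypothetical scans: (retention time, [(m/z, intensity), ...])
scans = [
    (12.1, [(184.0740, 5000.0), (300.2, 800.0)]),
    (12.2, [(184.0780, 200.0)]),   # ~22 ppm away, outside the window
    (12.3, [(184.0741, 9500.0)]),
]
print(extract_xic(scans, 184.074))  # → [(12.1, 5000.0), (12.2, 0), (12.3, 9500.0)]
```

Because the search is over raw accurate-mass data rather than a theoretical database, new reporter masses can be queried after acquisition, which is the advantage the abstract highlights.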

Relevance: 90.00%

Abstract:

We show that a set of fundamental solutions to the parabolic heat equation, with each element in the set corresponding to a point source located on a given surface with the number of source points being dense on this surface, constitute a linearly independent and dense set with respect to the standard inner product of square integrable functions, both on lateral- and time-boundaries. This result leads naturally to a method of numerically approximating solutions to the parabolic heat equation denoted a method of fundamental solutions (MFS). A discussion around convergence of such an approximation is included.
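In the standard notation, the fundamental solution of the heat equation in $\mathbb{R}^d$ and the resulting MFS approximation take the form (this is a sketch of the general construction, not the paper's specific density result):

```latex
F(x,t;y,\tau) = \frac{H(t-\tau)}{\bigl(4\pi(t-\tau)\bigr)^{d/2}}
\exp\!\left(-\frac{|x-y|^{2}}{4(t-\tau)}\right),
\qquad
u(x,t) \approx \sum_{j=1}^{N} c_{j}\, F(x,t;y_{j},\tau_{j}),
```

where $H$ is the Heaviside function, the source points $y_j$ lie on a surface outside the solution domain, and the coefficients $c_j$ are fitted to the lateral- and time-boundary data. The density result of the paper is what guarantees that such linear combinations can approximate the boundary data arbitrarily well.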

Relevance: 80.00%

Abstract:

We analyse the dynamics of a number of second order on-line learning algorithms training multi-layer neural networks, using the methods of statistical mechanics. We first consider on-line Newton's method, which is known to provide optimal asymptotic performance. We determine the asymptotic generalization error decay for a soft committee machine, which is shown to compare favourably with the result for standard gradient descent. Matrix momentum provides a practical approximation to this method by allowing an efficient inversion of the Hessian. We consider an idealized matrix momentum algorithm which requires access to the Hessian and find close correspondence with the dynamics of on-line Newton's method. In practice, the Hessian will not be known on-line and we therefore consider matrix momentum using a single example approximation to the Hessian. In this case good asymptotic performance may still be achieved, but the algorithm is now sensitive to parameter choice because of noise in the Hessian estimate. On-line Newton's method is not appropriate during the transient learning phase, since a suboptimal unstable fixed point of the gradient descent dynamics becomes stable for this algorithm. A principled alternative is to use Amari's natural gradient learning algorithm and we show how this method provides a significant reduction in learning time when compared to gradient descent, while retaining the asymptotic performance of on-line Newton's method.
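The advantage of preconditioning the gradient with curvature information, which underlies both on-line Newton's method and matrix momentum, can be illustrated on a toy quadratic loss. This is a minimal sketch with made-up numbers, not the statistical-mechanics analysis of the paper.

```python
# Loss L(w) = 0.5 * w^T H w with an ill-conditioned diagonal Hessian H.
H = [[10.0, 0.0], [0.0, 0.1]]

def grad(w):
    return [H[0][0] * w[0], H[1][1] * w[1]]

def gd_step(w, eta=0.1):
    # Plain gradient descent: one learning rate for all directions.
    g = grad(w)
    return [w[0] - eta * g[0], w[1] - eta * g[1]]

def newton_step(w):
    # Newton-style update: precondition the gradient with the inverse
    # Hessian (diagonal here, so the inversion is trivial).
    g = grad(w)
    return [w[0] - g[0] / H[0][0], w[1] - g[1] / H[1][1]]

w_gd = [1.0, 1.0]
w_nt = [1.0, 1.0]
for _ in range(50):
    w_gd = gd_step(w_gd)
    w_nt = newton_step(w_nt)

print(w_nt)  # Newton reaches the minimum [0.0, 0.0] in one step
print(w_gd)  # gradient descent is still far from 0 in the flat direction
```

The flat direction (curvature 0.1) is exactly where plain gradient descent crawls, which is the asymptotic gap the paper quantifies for the soft committee machine.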

Relevance: 80.00%

Abstract:

The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes a greater dependency on the potential pollution sources, rather than the traditional approach in which assessment is based mainly on intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated to the same distribution as historically occurring events or to an a priori probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as the risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and automated using a DOS batch-processing file. GIS software was employed in producing the input files and in presenting the results. The functionality of the method, as well as its sensitivity to model grid size, contaminant loading rates, the length of stress periods, and the historical frequency of occurrence of pollution events, was evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input to the transport model. The results of applying the method are presented in the form of tables, graphs and spatial maps.
Varying the model grid size indicates no significant effect on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with increasing grid size. The migration of the contaminant plume also advances faster with coarse grid sizes than with finer ones. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases in non-linear proportion to the increasing frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a great asset of the method, and a large benefit over contemporary risk and vulnerability methods.
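The cell-by-cell source-term rule (generate a synthetic source term whenever a random draw falls below that cell's probability of pollution occurrence) can be sketched as follows. The cell probabilities and mass distribution are hypothetical; in the thesis the generated terms feed MODFLOW-2000/MT3DMS rather than a simple counter.

```python
import random

def daily_source_terms(cell_probabilities, mass_dist):
    """For each active model cell, generate a synthetic contaminant
    source term when a uniform random number falls below that cell's
    historical probability of a pollution event."""
    terms = {}
    for cell, p in cell_probabilities.items():
        if random.random() < p:
            terms[cell] = mass_dist()  # mass loading for the transport model
    return terms

random.seed(7)
cells = {"(3,4)": 0.02, "(5,1)": 0.25, "(8,8)": 0.60}
# Simulate one model year of daily stress periods and count events per cell.
counts = {cell: 0 for cell in cells}
for _ in range(365):
    for cell in daily_source_terms(cells, lambda: random.uniform(1.0, 5.0)):
        counts[cell] += 1
print(counts)
```

Over many realisations, the per-cell event counts recover the historical frequencies, which is what makes the synthetic record statistically equivalent to the observed one.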

Relevance: 80.00%

Abstract:

Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems and, in particular, investigates the causes of failure in implementing MRP/MRP II systems in industrial environments, arguing that the centralised, top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new, enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity-sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed, and the results of computer simulation experiments comparing the performance of the DMRP system against conventional MRP II systems are presented. The DMRP methodology is shown to offer significant potential advantages, including ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.

Relevance: 80.00%

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. This thesis extends the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement, dealing with alternative indices and their problems, have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised model have been developed to cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution: two deterministic models, called sensitivity analysis and deterministic appraisal, and a third, stochastic, model called risk simulation. The models are designed to be flexible and can be adjusted according to the available computer capacity, the expected accuracy, and the presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required.
The results of applying these measurement and planning models to British motor vehicle manufacturing companies are presented and discussed.
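The stochastic model combines degree-four polynomial forecasts of each component variable with independent normal sampling around them. A minimal sketch with made-up data (the series and sizes are hypothetical, not British Leyland figures) might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical series of one productivity component variable.
years = np.arange(10)
values = np.array([100, 104, 103, 108, 112, 115, 113, 118, 124, 127.0])

# Forecast with a degree-four polynomial, as in the thesis.
coeffs = np.polyfit(years, values, deg=4)
forecast = np.polyval(coeffs, 10)  # point forecast for the next year

# Risk simulation: treat the variable as normally distributed around the
# forecast (the model's independence and normality assumptions), using
# the residual spread of the fit as the standard deviation.
sigma = np.std(values - np.polyval(coeffs, years))
draws = rng.normal(forecast, sigma, size=10_000)

print(round(float(forecast), 1))
print(round(float(np.mean(draws)), 1))
```

Repeating the sampling for every component variable and combining the draws through the added-value formula yields a distribution of productivity outcomes rather than a single planned figure.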