970 results for current solution


Relevance: 20.00%

Publisher:

Abstract:

Controlled drug delivery is a key topic in modern pharmacotherapy, where controlled drug delivery devices are required to prolong the period of release, maintain a constant release rate, or release the drug with a predetermined release profile. In the pharmaceutical industry, the development of a controlled drug delivery device may be facilitated enormously by mathematical modelling of the drug release mechanisms, directly decreasing the number of necessary experiments. Such mathematical modelling is difficult because several mechanisms are involved during the drug release process. The main drug release mechanisms of a controlled release device depend on the device’s physicochemical properties, and include diffusion, swelling and erosion. In this thesis, four controlled drug delivery models are investigated. These four models selectively involve the solvent penetration into the polymeric device, the swelling of the polymer, the erosion of the polymer and the diffusion of the drug out of the device, but all share two common key features. The first is that the solvent penetration into the polymer causes the polymer to undergo a transition from a glassy state to a rubbery state. The interface between the two states of the polymer is modelled as a moving boundary, and the speed of this interface is governed by a kinetic law. The second feature is that drug diffusion only occurs in the rubbery region of the polymer, with a nonlinear diffusion coefficient which depends on the concentration of solvent. These models are analysed using both formal asymptotics and numerical computation; front-fixing methods and the method of lines with finite difference approximations are used to solve the models numerically. This numerical scheme is conservative, accurate and easily implemented for moving boundary problems, and is thoroughly explained in Section 3.2. The small-time asymptotic analysis in Sections 5.3.1, 6.3.1 and 7.2.1 shows that these models exhibit the non-Fickian behaviour referred to as Case II diffusion, with an initial constant rate of drug release, which is appealing to the pharmaceutical industry because it indicates zero-order release. The numerical results of the models qualitatively confirm the experimental behaviour identified in the literature. The knowledge obtained from investigating these models can help in the development of more complex multi-layered drug delivery devices that achieve sophisticated drug release profiles. A multi-layer matrix tablet, which consists of a number of polymer layers designed to provide sustained and constant drug release or bimodal drug release, is also discussed in this research.

The moving boundary problem describing the solvent penetration into the polymer also arises in melting and freezing problems, which have been modelled as the classical one-phase Stefan problem. The classical one-phase Stefan problem exhibits unrealistic singularities at the complete-melting time. Hence we investigate the effect of including kinetic undercooling in the melting problem; the resulting formulation is called the one-phase Stefan problem with kinetic undercooling. Interestingly, we discover that kinetic undercooling regularises the singularities of the classical problem at the complete-melting time, and the small-time asymptotic analysis in Section 3.3 shows that the small-time behaviour of the problem with kinetic undercooling differs from that of the classical problem.
In the case of melting very small particles, it is known that surface tension effects are important. The effect of including surface tension in the melting problem for nanoparticles (without kinetic undercooling) has been investigated in the past; however, the one-phase Stefan problem with surface tension exhibits finite-time blow-up. Therefore we investigate the effect of including both surface tension and kinetic undercooling in the melting problem for nanoparticles, and find that the solution continues to exist until complete melting. Including kinetic undercooling and surface tension in the melting problems reveals further insight into the regularisation of unphysical singularities in the classical one-phase Stefan problem. This investigation gives a better understanding of the melting of a particle, and contributes to the current body of knowledge on melting and freezing due to heat conduction.
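To make the front-fixing idea concrete, below is a minimal sketch (not the thesis code) for the classical one-phase Stefan problem: the Landau transform xi = x/s(t) fixes the moving domain onto [0, 1], and the method of lines with finite differences reduces the PDE to a system of ODEs. The grid size, time span and initial front position are illustrative assumptions.

```python
# One-phase Stefan problem: u_t = u_xx on 0 < x < s(t),
# u(0,t) = 1, u(s(t),t) = 0, Stefan condition s'(t) = -u_x(s(t),t).
# Front-fixing with xi = x/s(t) gives, on the fixed domain [0,1]:
#   u_t = u_xixi / s^2 + xi * (s'/s) * u_xi,   s' = -u_xi(1,t) / s.
import numpy as np
from scipy.integrate import solve_ivp

N = 100                                   # interior grid points in xi
xi = np.linspace(0.0, 1.0, N + 2)
dxi = xi[1] - xi[0]

def rhs(t, y):
    u = np.empty(N + 2)
    u[0], u[1:-1], u[-1] = 1.0, y[:-1], 0.0        # boundary conditions
    s = y[-1]                                      # current front position
    u_xi = (u[2:] - u[:-2]) / (2 * dxi)            # central first derivative
    u_xixi = (u[2:] - 2 * u[1:-1] + u[:-2]) / dxi**2
    sdot = -(u[-1] - u[-2]) / dxi / s              # one-sided flux at xi = 1
    dudt = u_xixi / s**2 + xi[1:-1] * (sdot / s) * u_xi
    return np.append(dudt, sdot)

s0 = 1e-2                                 # small initial rubbery layer
u0 = 1.0 - xi[1:-1]                       # linear initial solvent profile
sol = solve_ivp(rhs, (0.0, 1.0), np.append(u0, s0), method="BDF")
print("front position s(t):", np.round(sol.y[-1, ::20], 3))
```

A stiff integrator (BDF) is used here because the 1/s^2 factor makes the system stiff while the front is small; the conservative scheme of Section 3.2 in the thesis is a different, purpose-built discretisation.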

Relevance: 20.00%

Publisher:

Abstract:

Light gauge steel frame wall systems are commonly used in industrial and commercial buildings, and there is a need for simple fire design rules to predict their load capacities and fire resistance ratings. During fire events, light gauge steel frame wall studs are subjected to non-uniform temperature distributions that cause thermal bowing, neutral-axis shift and magnification effects, resulting in combined axial compression and bending actions on the studs. In this research, a series of full-scale fire tests was first conducted to evaluate the performance of light gauge steel frame wall systems with eight different wall configurations under standard fire conditions. Finite element models of light gauge steel frame walls were then developed, analysed under transient and steady-state conditions, and validated against the full-scale fire tests. Using the results from the fire tests and finite element analyses, a detailed investigation was undertaken into the prediction of the axial compression strength and failure times of light gauge steel frame wall studs in standard fires using the available fire design rules based on Australian, American and European standards. The results from both the fire tests and the finite element analyses were used to investigate the ability of these fire design rules to capture the complex effects of non-uniform temperature distributions, and their accuracy in predicting the axial compression strength of wall studs and the failure times. Suitable modifications to the fire design rules were then proposed. This article presents the details and results of this investigation into the fire design rules for light gauge steel frame walls.

Relevance: 20.00%

Publisher:

Abstract:

A new optimal control model of the interactions between a growing tumour and the host immune system, along with an immunotherapy treatment strategy, is presented. The model is based on an ordinary differential equation model of interactions between the growing tumour and the natural killer, cytotoxic T lymphocyte and dendritic cells of the host immune system, extended through the addition of a control function representing the application of a dendritic cell treatment to the system. The numerical solution of this model, obtained from a multi-species Runge–Kutta forward-backward sweep scheme, is described. We investigate the effects of varying the maximum allowed amount of dendritic cell vaccine administered to the system and find that control of the tumour cell population is best effected via a high initial vaccine level, followed by reduced treatment and finally cessation of treatment. We also find that increasing the strength of the dendritic cell vaccine causes an increase in the number of natural killer cells and lymphocytes, which in turn reduces the growth of the tumour.
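To make the forward-backward sweep concrete, here is a minimal sketch on a toy scalar optimal control problem rather than the paper's multi-species tumour-immune model; the dynamics, cost and relaxation factor are assumptions for illustration.

```python
# Toy problem: minimise J = 0.5 * int_0^T (x^2 + u^2) dt
# subject to x' = -x + u, x(0) = 1. Pontryagin's principle gives the
# adjoint lam' = lam - x with lam(T) = 0 and optimal control u = -lam.
import numpy as np

T, n = 5.0, 1000
h = T / n
u = np.zeros(n + 1)                         # initial control guess
x, lam = np.empty(n + 1), np.empty(n + 1)
f = lambda x_, u_: -x_ + u_                 # state dynamics
g = lambda l_, x_: l_ - x_                  # adjoint dynamics

for sweep in range(200):
    x[0] = 1.0                              # forward sweep: RK4 on the state
    for i in range(n):
        um = 0.5 * (u[i] + u[i + 1])        # control at the half step
        k1 = f(x[i], u[i])
        k2 = f(x[i] + 0.5 * h * k1, um)
        k3 = f(x[i] + 0.5 * h * k2, um)
        k4 = f(x[i] + h * k3, u[i + 1])
        x[i + 1] = x[i] + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    lam[n] = 0.0                            # backward sweep: RK4 on the adjoint
    for i in range(n, 0, -1):
        xm = 0.5 * (x[i] + x[i - 1])
        k1 = g(lam[i], x[i])
        k2 = g(lam[i] - 0.5 * h * k1, xm)
        k3 = g(lam[i] - 0.5 * h * k2, xm)
        k4 = g(lam[i] - h * k3, x[i - 1])
        lam[i - 1] = lam[i] - h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    u_new = -lam                            # optimality condition dH/du = 0
    if np.max(np.abs(u_new - u)) < 1e-8:
        break
    u = 0.5 * (u + u_new)                   # relaxed update aids convergence

print(f"{sweep + 1} sweeps, J ~ {0.5 * h * np.sum(x**2 + u**2):.4f}")
```

In the paper's setting the state would be a vector (tumour, natural killer, T and dendritic cell populations) and the control would be bounded by the maximum allowed vaccine amount, so the control update would typically include a projection onto the admissible range.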

Relevance: 20.00%

Publisher:

Abstract:

Despite extensive research, no cure has yet been found for Alzheimer's disease. A large number of medications have been investigated to determine their potential for altering the natural history of the disease, and this work is ongoing. In an effort to shed light on current and forthcoming (awaiting approval) medications, the following is a summary of the published material and of experience with some of these drugs.

Relevance: 20.00%

Publisher:

Abstract:

A survey of radiographers was undertaken to determine the specific projections currently performed for patients presenting acutely with shoulder trauma. Radiographers were asked to indicate the projections they would perform for specific patient presentations. This poster presents a snapshot of the diversity of projections performed and a review of the current evidence on the most appropriate projections.

Relevance: 20.00%

Publisher:

Abstract:

A considerable amount of research has proposed optimization-based approaches employing various vibration parameters for structural damage diagnosis. Damage detection by these methods is in effect the result of updating the analytical structural model in line with the current physical model. The feasibility of these approaches has been proven, but most of the verification has been done on simple structures, such as beams or plates. When applied to a complex structure, such as a steel truss bridge, a traditional optimization process consumes massive computational resources and converges slowly. This study presents a multi-layer genetic algorithm (ML-GA) to overcome this problem. Unlike the tedious convergence of a conventional damage optimization process, in each layer the proposed algorithm divides the GA's population into groups with a smaller number of damage candidates; the converged population of each group then evolves as an initial population of the next layer, where the groups merge into larger groups. Because the groups can be computed in parallel, a damage detection process featuring ML-GA gains both optimization performance and computational efficiency. To assess the proposed algorithm, the modal strain energy correlation (MSEC) was adopted as the objective function. Several damage scenarios of a complex steel truss bridge's finite element model were employed to evaluate the effectiveness and performance of ML-GA against a conventional GA. In both single- and multiple-damage scenarios, the analytical and experimental study shows that the MSEC index achieves excellent damage indication and efficiency using the proposed ML-GA, whereas the conventional GA converges only to a local solution.
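A loose sketch of the layered group-then-merge idea is given below; the placeholder objective (standing in for the MSEC index), the group sizes and the GA operators are illustrative assumptions rather than the paper's implementation.

```python
# Layer 1 evolves many small groups of damage candidates independently
# (parallelisable); layer 2 merges the converged groups into larger groups
# and continues evolving, narrowing the search without one huge GA run.
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    # placeholder objective; the paper uses modal strain energy correlation
    return -np.sum((x - 0.3) ** 2)

def evolve(pop, gens=50, mut=0.05):
    """Plain GA on one group: elitism, blend crossover, Gaussian mutation."""
    for _ in range(gens):
        fit = np.array([fitness(ind) for ind in pop])
        elite = pop[np.argsort(fit)[::-1][: len(pop) // 2]]   # keep best half
        a = elite[rng.integers(len(elite), size=len(pop))]
        b = elite[rng.integers(len(elite), size=len(pop))]
        w = rng.random(pop.shape)
        pop = w * a + (1 - w) * b                             # blend crossover
        pop += mut * rng.standard_normal(pop.shape)           # mutation
        pop = np.clip(pop, 0.0, 1.0)                          # damage in [0,1]
    return pop

groups = [rng.random((20, 4)) for _ in range(8)]      # layer 1: 8 small groups
groups = [evolve(g) for g in groups]                  # could run in parallel
merged = [np.vstack(groups[i:i + 2]) for i in range(0, 8, 2)]   # layer 2: merge
merged = [evolve(g) for g in merged]
best = max((ind for g in merged for ind in g), key=fitness)
print("best damage vector:", np.round(best, 3))
```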

Relevance: 20.00%

Publisher:

Abstract:

The decentralisation reform in Indonesia has mandated the Central Government to transfer some functions and responsibilities to local governments, including the transfer of human resources, assets and budgets. Local governments became giant asset holders almost overnight, and most were ill prepared to handle these transformations. Assets were transferred without analysing local government need, ability or capability to manage them, and no local government was provided with an asset management framework. The aim of this research, therefore, is to develop a Public Asset Management Framework for provincial governments in Indonesia, especially for infrastructure and real property assets. This framework will enable provincial governments to develop integrated asset management procedures throughout an asset's lifecycle. Achieving the research aim means answering the following three research questions: 1) How do provincial governments in Indonesia currently manage their public assets? 2) What factors influence the provincial governments in managing these public assets? 3) How can a Public Asset Management Framework be developed that is specific to the Indonesian provincial governments' situation?

This research applied a case study approach: following a literature review, documents, interviews and observations were collated. Data were collected in June 2009 (preliminary data collection) and from January to July 2010 in the major eastern Indonesian provinces. Once the public asset management framework had been developed, a focus group was used to verify it. The results are threefold. First, Indonesian provincial governments need to improve the effectiveness and efficiency of current public asset management practice in order to improve public service quality. Second, five major concerns influence local government public asset management processes: asset identification and inventory systems; public asset holding; asset guidance and legal arrangements; asset management efficiency and effectiveness; and human resources and their organisational arrangements. Third, the framework was applied to assets already transferred to local governments and so included a system of asset identification and a needs analysis to classify the importance of these assets to local governments and to their functions and responsibilities in delivering public services. Assets that support local government functions and responsibilities will then be managed using suitable asset lifecycle processes; those categorised as surplus should be disposed of. Additionally, functions and responsibilities that do not need an asset solution should be performed directly by local governments. These processes must be measured using performance indicators, and all stages should be guided and regulated by sufficient laws and regulations. Constant improvement in the quality and quantity of human resources plays an important role in successful public asset management.

This research focuses on developing countries and contributes to knowledge of Public Asset Management Frameworks at the local government level, particularly in Indonesia. The framework provides local governments with a foundation for improving their effectiveness and efficiency in managing public assets, which could lead to improved public service quality. It will ensure that sound decisions are made throughout asset ownership and provide a better asset lifecycle process, leading to the selection of the most appropriate asset, an improved acquisition and delivery process, optimised asset performance, and an appropriate disposal program.

Relevance: 20.00%

Publisher:

Abstract:

Purpose – This paper adds to the growing research on psychiatric intensive care units (PICUs) by recounting descriptions of psychiatric intensive care settings and discussing nurses' perceptions of the organisational interfaces, arrangements and provisions of care in these settings. Design/methodology/approach – Data gathered from focus groups held with nurses from two PICUs were used to establish terminology, defining attributes, related concepts, antecedents, values, processes and concepts related to current practices. A literature search was conducted to permit a review of the conceptual arrangements and contemporary understanding of intensive care for people experiencing acute psychiatric illness, based on the perspectives held by the nurses in the focus groups. Findings – Dissonance between service needs and the needs and management of individual patients overshadows strategies to implement comprehensive recovery-oriented approaches. Three factors that influence standards and procedural practice in PICUs are reported in this paper: organisational structures, physical structures, and subtype nomenclature. Practical implications – Acute inpatient care is an important part of a comprehensive approach to mental health services. Commonly, intensive acute care is delivered in specialised wards or units, mostly known as PICUs, co-located with acute mental health inpatient units. Evidence on the most effective treatments and approaches in intensive care settings that support comprehensive recovery and improved outcomes is nascent. Originality/value – Current descriptions from nurses substantiate wide variations in the provisions, design and classifications of psychiatric intensive care. Idiosyncratic and localised conceptions of psychiatric intensive care do not adequately ensure effective treatment and methods that support recovery principles for improved and comprehensive outcomes. The authors suggest that more concrete descriptions, guidelines, training and policies for the provision of intensive psychiatric health care, encompassing the perspective of nursing professionals, would reinforce conceptual construction and thus optimal treatment within a comprehensive, recovery-oriented approach to mental health services.

Relevance: 20.00%

Publisher:

Abstract:

Recent advances in the area of ‘Transformational Government’ position the citizen at the centre of focus. This paradigm shift from a department-centric to a citizen-centric focus requires governments to re-think their approach to service delivery, thereby decreasing costs and increasing citizen satisfaction. The introduction of franchises as a virtual business layer between the departments and their citizens is intended to provide a solution. Franchises are structured to address the needs of citizens independently of internal departmental structures. For delivering services online, governments pursue the development of a One-Stop Portal, which structures information and services through those franchises. Thus, each franchise can be mapped to a specific service bundle, which groups together services that are deemed to be of relevance to a specific citizen need. This study focuses on the development and evaluation of these service bundles. In particular, two research questions guide the investigation: 1) What methods can be used by governments to identify service bundles as part of governmental One-Stop Portals? 2) How can the quality of service bundles in governmental One-Stop Portals be evaluated?

The first research question concerns the identification of suitable service bundle identification methods. A literature review was conducted to conceptualise the service bundling task in general; as a result, a 4-layer model of service bundling and a morphological box were created, detailing characteristics that are of relevance when identifying service bundles. Furthermore, a literature review of Decision-Support Systems was conducted to identify approaches of relevance in different bundling scenarios. These initial findings were complemented by targeted studies of multiple leading governments in the e-government domain, as well as consultation with a local expert in the field, with the aim of identifying the current status of online service delivery and service bundling in practice. These findings led to the conceptualisation of two service bundle identification methods applicable in the context of the Queensland Government: a provider-driven approach, based on service description languages, attributes, and relationships between services; and a citizen-driven approach, based on analysing the outcomes of content identification and grouping workshops with citizens. Both methods were then applied and evaluated in practice.

The provider-driven method required the initial specification of relevant attributes that could be used to identify similarities between services, called relationships; these relationships then formed the basis for the identification of service bundles. This study conceptualised and defined seven relationships, namely ‘Co-location’, ‘Resource’, ‘Co-occurrence’, ‘Event’, ‘Consumer’, ‘Provider’, and ‘Type’. The relationships, and the bundling method itself, were applied and refined as part of six Action Research cycles in collaboration with the Queensland Government. The findings show that attributes and relationships can be used effectively as a means for bundle identification, provided distinct decision rules are in place to prescribe how services are to be identified.
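A hypothetical sketch of how such attribute-based relationships and decision rules might drive bundle identification follows; the service names, attributes and threshold rule are invented for illustration and are not the thesis's actual relationship definitions.

```python
# Services that share enough attribute values (echoing relationships such as
# 'Event', 'Consumer' and 'Type') are linked, and connected components of the
# resulting relationship graph become candidate bundles for a franchise.
from itertools import combinations

services = {
    "register-birth":  {"event": "having-a-baby",   "consumer": "parent",   "type": "registration"},
    "child-allowance": {"event": "having-a-baby",   "consumer": "parent",   "type": "payment"},
    "enrol-school":    {"event": "starting-school", "consumer": "parent",   "type": "registration"},
    "driver-licence":  {"event": "turning-16",      "consumer": "teenager", "type": "registration"},
}

def related(a, b, min_shared=2):
    """Decision rule: two services relate if they share >= min_shared attributes."""
    return sum(services[a][k] == services[b][k] for k in services[a]) >= min_shared

parent = {s: s for s in services}          # union-find over the relationship graph
def find(s):
    while parent[s] != s:
        parent[s] = parent[parent[s]]
        s = parent[s]
    return s

for a, b in combinations(services, 2):
    if related(a, b):
        parent[find(a)] = find(b)

bundles = {}
for s in services:
    bundles.setdefault(find(s), []).append(s)
print(list(bundles.values()))              # e.g. one 'having a baby' bundle
```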
For the citizen-driven method, insights from the case studies led to the decision to involve citizens through card sorting activities. Starting from an initial list of services relevant to a certain franchise, participating citizens grouped the services as they saw fit. The card sorting activity, as well as the required analysis and aggregation of the individual card sorting results, was analysed in depth as part of this study. A framework was developed that can be used as a decision-support tool to assist with the choice of card sorting analysis method in a given scenario. The characteristic features of card sorting in a government context led to the decision to utilise statistical analysis approaches, such as cluster analysis and factor analysis, to aggregate the card sorting results.

The second research question asks how the quality of service bundles can be assessed. An extensive literature review was conducted focussing on bundle, portal, and e-service quality. It was found that different studies use different constructs, terminology, and units of analysis, which makes comparing these models difficult. As a direct result, a framework was conceptualised that can be used to position past and future studies in this research domain. Complementing the literature review, interviews conducted as part of the case studies with leaders in e-government indicated that satisfaction is typically evaluated for the overall portal once it is online, but that quality tests are not conducted during the development phase. Consequently, a research model appropriately defining perceived service bundle quality needed to be developed from scratch. Based on existing theory, such as the Theory of Reasoned Action, Expectation Confirmation Theory, and the Theory of Affordances, perceived service bundle quality was defined as an inferential belief and positioned within the nomological net of services. Based on the literature analysis on quality, and on the subsequent work of a focus group, the hypothesised antecedents (descriptive beliefs) of the construct and the associated question items were defined and the research model conceptualised. The model was then tested, refined, and finally validated during six Action Research cycles.

Results show no significant difference in perceived quality or satisfaction among users between the provider-driven and the citizen-driven methods. The decision on which method to choose, it was found, should be based on contextual factors, such as objectives, resources, and the need for visibility. The constructs of the bundle quality model were examined: while the quality of bundles identified through the citizen-centric approach could be explained through the constructs ‘Navigation’, ‘Ease of Understanding’, and ‘Organisation’, bundles identified through the provider-driven approach could be explained solely through the constructs ‘Navigation’ and ‘Ease of Understanding’. An active labelling style for bundles, as part of the provider-driven Information Architecture, had a larger impact on ‘Quality’ than the topical labelling style used in the citizen-centric Information Architecture. However, ‘Organisation’, reflecting the internal, logical structure of the Information Architecture, was a significant factor impacting on ‘Quality’ only in the citizen-driven Information Architecture.
Hence, it was concluded that active labelling can compensate for a lack of logical structure. Further studies are needed to test this conjecture, for example by building alternative models and conducting additional empirical research (e.g. using an active labelling style for the citizen-driven Information Architecture). This thesis contributes to the body of knowledge in several ways. Firstly, it presents an empirically validated model of the factors explaining and predicting a citizen’s perception of service bundle quality. Secondly, it provides two alternative methods that governments can use to identify service bundles when structuring the content of a One-Stop Portal. Thirdly, it provides a detailed narrative of how the recent paradigm shift in the public domain towards a citizen-centric focus can be pursued by governments; the research methodology followed by this study can serve as an exemplar for governments seeking to achieve a citizen-centric approach to service delivery.
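Returning to the card-sorting aggregation step described above, the following is a small hypothetical example of clustering a distance matrix derived from individual card sorts; the services, participant groupings and distance cut-off are invented.

```python
# Each row of `sorts` is one participant's card sort, recorded as a group
# label per service; the distance between two services is the fraction of
# participants who placed them in different groups. Hierarchical clustering
# of that matrix yields aggregate bundles.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

services = ["licence", "rego", "fishing-permit", "boat-permit"]
sorts = np.array([          # 3 participants x 4 services
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
])

n = len(services)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        dist[i, j] = dist[j, i] = np.mean(sorts[:, i] != sorts[:, j])

labels = fcluster(linkage(squareform(dist), method="average"),
                  t=0.5, criterion="distance")
for k in sorted(set(labels)):
    print("bundle", k, [s for s, l in zip(services, labels) if l == k])
```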

Relevance: 20.00%

Publisher:

Abstract:

The current discourse surrounding victims of online fraud is heavily premised on an individual notion of greed. The strength of this discourse permeates the thinking of those who have not experienced this type of crime, as well as of victims themselves. The discourse also manifests itself in theories of victim precipitation, which again assign the locus of blame to individuals for their actions in an offence. While such typologies and categorisations of victims have been critiqued as “victim blaming” in other fields, this has not occurred with regard to online fraud victims, where victim-focused ideas of responsibility for the offence continue to dominate. This paper illustrates the nature and extent of the greed discourse and argues that it forms part of a wider construction of online fraud that locates responsibility for victimisation with the victims themselves and their actions. It argues that the current discourse does not take into account the level of deception and the targeting of vulnerability employed by the offender in perpetrating this type of crime. It concludes by advocating the need to further examine and challenge this discourse, especially with regard to its potential impact on victims’ access to support services and the wider criminal justice system.

Relevance: 20.00%

Publisher:

Abstract:

The success or effectiveness of any aircraft design is a function of many trade-offs. Over the last 100 years of aircraft design these trade-offs have been optimized and dominant aircraft design philosophies have emerged. Pilotless aircraft (or uninhabited airborne systems, UAS) present new challenges in the optimization of their configuration. Recent developments in battery and motor technology have seen an upsurge in the utility and performance of electrically powered aircraft. Thus, the opportunity to explore hybrid-electric aircraft powerplant configurations is compelling. This thesis considers the design of such a configuration from an overall propulsive and energy efficiency perspective. A prototype system was constructed using a representative small-UAS internal combustion engine (10 cc methanol two-stroke) and a 600 W brushless direct-current (BLDC) motor, components chosen to be representative of those found on typical small UAS. The system was tested on a dynamometer in a wind tunnel, and the results show an improvement in overall propulsive efficiency of 17% compared with a non-hybrid powerplant. In this case, the improvement results from the larger propeller that the hybrid solution allows, which shows that general efficiency improvements are possible using hybrid configurations for aircraft propulsion. Additionally, this approach provides new improvements in operational and mission flexibility (such as the provision of self-starting), which are outlined in the thesis. Specifically, the opportunity to use the windmilling propeller for energy regeneration was explored. It was found (in the prototype configuration) that significant power (60 W) is recoverable in a steep dive, and although the efficiency of regeneration is low, the capability allows several options for improved mission viability. The thesis concludes with the general statement that a hybrid powerplant improves the overall mission effectiveness and propulsive efficiency of small UAS.
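As a rough illustration of why only modest power can be recovered from a windmilling propeller, the following back-of-envelope sketch applies the actuator-disc (Betz) bound; the propeller diameter, dive airspeed and net conversion efficiency are assumed numbers, not measurements from the thesis.

```python
# Upper bound on windmilling regeneration: the Betz limit allows at most
# 16/27 of the kinetic power flowing through the propeller disc,
# P_betz = (16/27) * 0.5 * rho * A * v^3, further reduced by the combined
# propeller + BLDC + rectifier conversion efficiency.
import math

rho = 1.225          # air density, kg/m^3
d = 0.30             # assumed propeller diameter, m
v = 25.0             # assumed dive airspeed, m/s
eta = 0.15           # assumed net prop + motor + rectifier efficiency

A = math.pi * (d / 2) ** 2
p_wind = 0.5 * rho * A * v ** 3       # kinetic power through the disc
p_betz = (16 / 27) * p_wind           # theoretical extraction limit
print(f"recoverable estimate: {eta * p_betz:.0f} W of {p_betz:.0f} W available")
```

With these assumed figures the estimate lands near the 60 W reported above, which illustrates why the regeneration efficiency is low even when useful power is recoverable in a steep dive.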

Relevance: 20.00%

Publisher:

Abstract:

Background/aims: Remote monitoring for heart failure has been evaluated not only in a large number of randomised controlled trials, but also in many systematic reviews and meta-analyses. The aim of this meta-review was to identify, appraise and synthesise the existing systematic reviews evaluating the effects of remote monitoring in heart failure. Methods: Using a Cochrane methodology, we electronically searched all relevant online databases and search engines, performed a forward citation search, and hand-searched bibliographies. Only fully published systematic reviews of invasive and/or non-invasive remote monitoring interventions were included. Two reviewers independently extracted data. Results: Sixty-five publications from 3333 citations were identified; seventeen fulfilled the inclusion and exclusion criteria. Quality varied, with AMSTAR (A Measurement Tool to Assess Systematic Reviews) scores ranging from 2 to 11 (mean 5.88). Seven reviews (41%) pooled results from individual studies for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies, four (24%) focused specifically on telemonitoring, and four (24%) included studies investigating both non-invasive and invasive technologies. Population characteristics of the included studies were not reported consistently. Mortality and hospitalisations were the most frequently reported outcomes (12 reviews; 70%); only five reviews (29%) reported healthcare costs and compliance. A high degree of heterogeneity was reported in many of the meta-analyses. Conclusions: These results should be considered in the context of two negative RCTs of remote monitoring for heart failure published since the meta-analyses (TIM-HF and Tele-HF). Nevertheless, the high-quality reviews demonstrated reductions in mortality and hospitalisations, improved quality of life, and lower healthcare costs.

Relevance: 20.00%

Publisher:

Abstract:

Sol-gel synthesis in varied gravity is a relatively new topic in the literature, and further investigation is required to explore its full potential as a method for synthesising novel materials. Although trialled for systems such as silica, the specific application of varied-gravity synthesis to other sol-gel systems, such as titanium, has not previously been undertaken. Literature methods for the synthesis of sol-gel materials in reduced gravity could not be applied to titanium sol-gel processing, so a new strategy had to be developed in this study. To conduct experiments in varied gravity successfully, a refined titanium sol-gel precursor had to be developed that allowed the single-solution precursor to remain unreactive at temperatures up to 50 °C and to begin reacting only when exposed to the pressure decrease of an applied vacuum. Owing to the novelty of this precursor, a thorough characterisation of the reaction precursors was undertaken using techniques such as nuclear magnetic resonance, infrared and UV-Vis spectroscopy, to achieve sufficient understanding of the precursor chemistry and kinetic stability. This understanding was then used to propose gelation reaction mechanisms under varied gravity conditions.

Two unique reactor systems were designed and built specifically to allow the effects of varied gravity (high, normal, reduced) during the synthesis of titanium sol-gels to be studied. The first was a centrifuge capable of providing high-gravity environments of up to 70 g for extended periods while applying a 100 mbar vacuum and a temperature of 40–50 °C to the reaction chambers. The second system, used in the QUT Microgravity Drop Tower Facility, was required to provide the same thermal and vacuum conditions as the centrifuge but had to operate autonomously during free fall. Through post-synthesis characterisation techniques such as Raman spectroscopy, X-ray diffraction (XRD) and N2 adsorption, it was found that increased gravity levels during synthesis had the greatest effect on the final products. Samples produced in reduced and normal gravity formed amorphous gels containing very small particles with moderate surface areas, whereas crystalline anatase (TiO2) formed in samples synthesised above 5 g, with significant increases in crystallinity, particle size and surface area observed when samples were produced at gravity levels up to 70 g. It is proposed that, for samples produced in higher gravity, an increased concentration gradient of water forms at the bottom of the reacting film due to forced convection. Particles formed in higher gravity diffuse downward towards this excess of water, which favours the condensation reaction of the remaining sol-gel precursors with the particles, promoting increased particle growth. With downward convection removed in reduced gravity, particle growth by condensation is physically hindered and hydrolysis reactions are favoured instead. Another significant finding of this work was that anatase could be produced at relatively low temperatures of 40–50 °C, instead of by the conventional method of calcination above 450 °C, solely through sol-gel synthesis at higher gravity levels.

It is hoped that the outcomes of this research will lead to an increased understanding of the effects of gravity on the chemical synthesis of titanium sol-gels, potentially leading to improved products suitable for diverse applications, such as semiconductor or catalyst materials, as well as to significantly reduced production and energy costs through manufacturing these materials at much lower temperatures.

Relevance: 20.00%

Publisher:

Abstract:

Voltage unbalancing of the dc capacitors is the main technical drawback of a diode-clamped multilevel inverter (DCMLI) with more than three levels. A voltage-balancing circuit based on a buck–boost chopper connected to the dc link of the DCMLI is a reliable and robust solution to this problem. This study presents four different schemes for controlling the chopper circuit to achieve capacitor voltage equalisation; these can be broadly categorised as single-pulse, multi-pulse and hysteresis band current control schemes. The single-pulse scheme does not involve fast switching actions but requires the chopper devices to be rated for a higher current. The chopper devices' current rating can be limited by using the multi-pulse scheme, but this involves faster switching actions and a slower response. The hysteresis band current control scheme offers faster dynamics and a lower current rating of the chopper devices, and can also nullify an initial voltage imbalance. However, it involves much faster switching actions, which may not be feasible in some applications. Therefore, depending on the system requirements and ratings, one of these schemes may be used. The performance and validity of the proposed schemes are confirmed through both simulation and experimental investigations on a prototype five-level diode-clamped inverter.
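A toy simulation of the hysteresis band principle is sketched below; the RL values, band width and dc-link voltage are illustrative assumptions, and the model ignores the multilevel-converter details, showing only how the band bounds the current ripple.

```python
# The chopper switch toggles whenever the inductor current leaves a band of
# width 2*hband around the reference, so the current tracks the reference
# with bounded ripple at the cost of a variable (and potentially high)
# switching rate.
L, R, Vdc = 5e-3, 0.5, 100.0     # chopper inductance, resistance, dc-link voltage
i_ref, hband = 4.0, 0.2          # reference current and hysteresis half-band
dt, T = 1e-6, 0.02               # simulation step and horizon

i, on, trace = 0.0, False, []
for _ in range(int(T / dt)):
    if i > i_ref + hband:        # above the band -> switch off
        on = False
    elif i < i_ref - hband:      # below the band -> switch on
        on = True
    v = Vdc if on else 0.0
    i += dt * (v - R * i) / L    # simple RL dynamics of the chopper leg
    trace.append(i)

ripple = max(trace[-2000:]) - min(trace[-2000:])
print(f"steady-state current ripple ~ {ripple:.2f} A")
```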

Relevance: 20.00%

Publisher:

Abstract:

Reliability analysis is crucial to reducing unexpected downtime and severe failures under the ever-tightening maintenance budgets of engineering assets. Hazard-based reliability methods are of particular interest because the hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods rest largely on two assumptions: that the assumed baseline failure distribution accurately describes the population concerned, and that the assumed form of the covariate effects on the hazard is correct. These two assumptions may be difficult to satisfy and can therefore compromise the effectiveness of hazard models in application. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs), which avoid the limitations imposed by the two assumptions of statistical models.

With the success of failure prevention efforts, less failure history becomes available for reliability analysis. Involving condition data, or covariates, is a natural solution to this challenge. A critical issue in involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in practice, due to inconsistent measuring frequencies of multiple covariates, sensor failure, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications. This research therefore investigates the incomplete-covariates problem in reliability analysis. Typical approaches to handling incomplete covariates were studied to investigate their performance and their effects on the reliability analysis results. Since these existing approaches can underestimate the variance in regressions and introduce extra uncertainties into the reliability analysis, the developed NNHMs are extended to handle incomplete covariates as an integral part of the model. The extended versions of NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump. The results demonstrate that the new approach outperforms the typical incomplete-covariates handling approaches.

Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation due to the influence of both engineering degradation and changes in environmental settings. Commonly used covariate extrapolation methods are thus unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, this research projects covariate states. The estimated covariate states and the unknown covariate values in future running steps of assets constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill demonstrates that this new multi-step reliability analysis procedure generates more accurate analysis results.
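As a loose sketch of the non-linear hazard idea (hypothetical, not the thesis's NNHM formulation), the following trains a small neural network to learn a hazard surface directly from time and a condition covariate, then scores a record with a missing covariate using simple mean imputation, one of the typical incomplete-covariate strategies the abstract contrasts with the integrated treatment. The data are simulated and the network size is arbitrary.

```python
# A Cox-type model fixes h(t|z) = h0(t) * exp(b.z); here a network learns
# h(t, z) directly, avoiding assumptions about both the baseline failure
# distribution and the form of the covariate effect. Ground truth is an
# invented non-linear hazard.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
t = rng.uniform(0, 10, 2000)            # operating time
z = rng.uniform(0, 1, 2000)             # condition covariate (e.g. vibration)
h_true = 0.05 * t * (1 + 3 * z**2)      # invented non-linear hazard surface
X = np.column_stack([t, z])
y = h_true + 0.01 * rng.standard_normal(2000)   # noisy observations

nn = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
nn.fit(X, y)

z_imputed = z.mean()                    # naive mean imputation for a missing covariate
print("hazard at t = 8 with imputed covariate:",
      float(nn.predict([[8.0, z_imputed]])[0]))
```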