889 results for fit optimisation
Abstract:
Despite recent success, rapidly disintegrating lyophilized tablets still suffer from low mechanical strength and long disintegration times, resulting in poor patient compliance. The aim of the current work was to carry out a systematic study to understand the factors controlling the mechanical properties of these formulations. The work investigated the influence of two bloom strengths of gelatin, low (60 bloom) and high (225 bloom), at different stock solution concentrations (2, 5, 7.5, and 10 %w/w). This was followed by an investigation of the addition of five saccharides (xylitol, glucose, trehalose, maltotriose and mannitol) over a concentration range of 10-80 %w/w to decipher their influence on the disintegration time and the mechanical and thermal properties of the formulation. The results indicated that the disintegration time of the tablets decreased dramatically with decreasing concentration and bloom strength of gelatin in the stock solution. However, the mechanical properties of the tablets were mainly influenced by the concentration of gelatin rather than the bloom strength. The addition of saccharides resulted in a concentration-dependent enhancement of tablet properties. All the saccharides significantly improved the fracturability of the tablets at high concentrations (40 %w/w or above); however, only high concentrations (40 %w/w or above) of trehalose, maltotriose and mannitol significantly enhanced the hardness. Additionally, mannitol crystallised during freeze drying and consequently produced elegant tablets, whilst the other saccharides exhibited lyoprotectant activity as they retained their amorphous state. Based on the above findings, an optimized formulation was also successfully developed and characterized to deliver a 100 μg dose of clonidine HCl.
Abstract:
The primary objective of this work is to relate biomass fuel quality to fast pyrolysis-oil quality in order to identify key biomass traits which affect pyrolysis-oil stability. During storage the pyrolysis-oil becomes more viscous due to chemical and physical changes, as reactions and volatile losses occur during aging. The origin of oil instability lies within the pyrolysis reactor, in which the biomass is rapidly heated in the absence of oxygen, producing free-radical volatiles which are then quickly condensed to form the oil. The products formed do not reach thermodynamic equilibrium and in turn react with each other as they move towards product stability. The first aim of this research was to develop and validate a rapid screening method for determining biomass lignin content, as an alternative to traditional, time-consuming and hence costly wet chemical methods such as Klason analysis. Lolium and Festuca grasses were selected to validate the screening method, as these grass genotypes exhibit a low range of Klason/Acid Digestible Fibre lignin contents. The screening methodology was based on the relationship between the lignin-derived products from pyrolysis and the lignin content as determined by wet chemistry. The second aim of the research was to determine whether metals have an effect on fast pyrolysis products, and whether any clear relationships can be deduced to aid feedstock selection for fast pyrolysis processing. It was found that alkali metals, particularly Na and K, influence the rate and yield of degradation as well as the char content. Pre-washing biomass with water can remove 70% of the total metals and improve the pyrolysis product characteristics by increasing the organic yield, the temperature at which maximum liquid yield occurs, and the proportion of higher molecular weight compounds within the pyrolysis-oil. The third aim identified these feedstock traits and related them to pyrolysis-oil quality and stability. It was found that the mineral matter was a key determinant of pyrolysis-oil yield compared to the proportion of lignin. However, the higher molecular weight compounds present in the pyrolysis-oil derive from the lignin and can cause instability within the pyrolysis-oil. The final aim was to investigate whether energy crops can be enhanced by agronomic practices to produce a biomass quality that is attractive to the biomass conversion community, while still giving a good yield to the farmers. It was found that nitrogen/potassium chloride fertiliser treatments enhance Miscanthus quality, producing low ash and high volatile yields with crop yields acceptable to farmers. The progress of senescence was measured in terms of biomass characteristics and fast pyrolysis product characteristics. The results obtained from this research are in strong agreement with the published literature and provide new information on biomass quality traits which affect pyrolysis and pyrolysis-oils.
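The screening approach described here is essentially a calibration problem. Below is a minimal sketch of how lignin-derived pyrolysis product yields might be regressed against wet-chemistry (Klason) lignin values; the marker yields, lignin contents and variable names are illustrative assumptions, not data or code from the thesis.

```python
# Hedged sketch: calibrating a rapid pyrolysis-based lignin screen against
# Klason lignin by simple linear regression. All numbers are hypothetical.
import numpy as np
from scipy import stats

# Hypothetical paired measurements for a set of grass genotypes:
pyrolysis_marker_yield = np.array([2.1, 2.8, 3.0, 3.6, 4.2, 4.9])  # summed lignin-derived pyrolysis products (wt%)
klason_lignin = np.array([3.5, 4.4, 4.8, 5.6, 6.3, 7.1])           # wet-chemistry lignin content (wt%)

fit = stats.linregress(pyrolysis_marker_yield, klason_lignin)
print(f"lignin ~ {fit.slope:.2f} * marker yield + {fit.intercept:.2f} (r^2 = {fit.rvalue**2:.2f})")

# Once calibrated, the regression predicts lignin content for new samples from
# the rapid pyrolysis screen alone, avoiding the slower Klason determination.
new_marker = 3.3
print(f"predicted lignin: {fit.slope * new_marker + fit.intercept:.2f} wt%")
```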
Abstract:
Analysis of the use of ICT in the aerospace industry has prompted the detailed investigation of an inventory-planning problem. There is a special class of inventory, consisting of expensive repairable spares for use in support of aircraft operations. These items, called rotables, are not well served by conventional theory and systems for inventory management. The context of the problem, the aircraft maintenance industry sector, is described in order to convey some of its special characteristics in the context of operations management. A literature review is carried out to seek existing theory that can be applied to rotable inventory and to identify a potential gap into which newly developed theory could contribute. Current techniques for rotable planning are identified in industry and the literature: these methods are modelled and tested using inventory and operational data obtained in the field. In the expectation that current practice leaves much scope for improvement, several new models are proposed. These are developed and tested on the field data for comparison with current practice. The new models are revised following testing to give improved versions. The best model developed and tested here comprises a linear programming optimisation, which finds an optimal level of inventory for multiple test cases, reflecting changing operating conditions. The new model offers an inventory plan that is up to 40% less expensive than that determined by current practice, while maintaining required performance.
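As a rough illustration of the kind of linear-programming inventory plan described, the sketch below minimises holding cost for rotable spares across a few bases, subject to covering the expected repair pipeline at each base and a fleet-wide minimum. The bases, costs, demand rates and constraint structure are hypothetical and greatly simplified relative to the thesis models and field data.

```python
# Hedged sketch: a tiny linear programme for rotable spares, solved with scipy.
# All inputs are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

holding_cost = np.array([12.0, 9.0, 15.0])   # annual holding cost per spare at each base
demand_rate = np.array([0.8, 1.5, 0.6])      # expected removals per month at each base
turnaround = np.array([2.0, 1.5, 3.0])       # repair turnaround time in months
pipeline = demand_rate * turnaround          # expected units tied up in the repair pipeline
fleet_minimum = 8.0                          # minimum total spares for target availability

# Minimise total holding cost subject to covering each base's pipeline and the
# fleet-wide minimum; a real model would add service-level and integrality constraints.
res = linprog(c=holding_cost,
              A_ub=[-np.ones(3)], b_ub=[-fleet_minimum],
              bounds=list(zip(pipeline, [None] * 3)),
              method="highs")
print("stock per base:", np.round(res.x, 2), "total cost:", round(res.fun, 2))
```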
Abstract:
This thesis presents research within empirical financial economics with focus on liquidity and on portfolio choice through full-scale optimisation (FSO) in the stock market. The discussion on liquidity is focused on measurement issues, including TAQ data processing and the measurement of systematic liquidity factors. Furthermore, a framework for treating the two topics in combination is provided. The liquidity part of the thesis gives a conceptual background to liquidity and discusses several different approaches to liquidity measurement. It contributes to liquidity measurement by providing detailed guidelines on the data processing needed for applying TAQ data to liquidity research. The main focus, however, is the derivation of systematic liquidity factors. The principal component approach to systematic liquidity measurement is refined by the introduction of moving and expanding estimation windows, allowing for time-varying liquidity covariances between stocks. Under several liquidity specifications, this improves the ability to explain stock liquidity and returns, as compared to static-window PCA and market-average approximations of systematic liquidity. The highest ability to explain stock returns is obtained when using inventory cost as the liquidity measure and moving-window PCA as the systematic liquidity derivation technique. The systematic factors from this setting also have a strong ability to explain cross-sectional liquidity variation. Portfolio optimisation in the FSO framework is tested in two empirical studies. These contribute to the assessment of FSO by expanding its applicability to stock indexes and individual stocks, by considering a wide selection of utility function specifications, and by showing explicitly how the full-scale optimum can be identified using either grid search or the heuristic search algorithm of differential evolution. The studies show that, relative to mean-variance portfolios, FSO performs well in these settings and that the computational expense can be mitigated dramatically by application of differential evolution.
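To illustrate how a full-scale optimum might be located with differential evolution, the sketch below maximises average realised power utility over a hypothetical return history. The return data, utility parameters and weight parameterisation are illustrative assumptions, not the specifications tested in the thesis.

```python
# Hedged sketch: full-scale optimisation (FSO) by differential evolution.
# The return matrix and risk-aversion value are synthetic illustrations.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
R = rng.normal(0.0005, 0.01, size=(500, 4))   # hypothetical daily returns, T=500, N=4
gamma = 5.0                                   # illustrative risk-aversion parameter

def negative_expected_utility(z):
    w = np.abs(z) / np.abs(z).sum()           # map raw parameters to long-only weights summing to 1
    port = R @ w                              # realised portfolio returns over the full sample
    u = ((1.0 + port) ** (1.0 - gamma) - 1.0) / (1.0 - gamma)   # power (CRRA) utility
    return -u.mean()                          # FSO maximises average utility over the history

result = differential_evolution(negative_expected_utility,
                                bounds=[(1e-6, 1.0)] * R.shape[1],
                                seed=0, tol=1e-8)
weights = np.abs(result.x) / np.abs(result.x).sum()
print("FSO weights:", np.round(weights, 3))
```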
Abstract:
Sponsorship fit is frequently mentioned and empirically examined as a success factor of sponsorship. While sponsorship fit has been considered a determinant of sponsorship success, little is known about the antecedents of sponsorship fit. In the present paper, individual and firm-level antecedents of sponsorship fit are examined in a single hierarchical linear model. Results show that sponsorship fit is influenced by the perception of benefits, the firm’s regional identification, sincerity, relatedness to the sponsored activity, and its dominance. At the partnership level, results show that contract length contributes to sponsorship fit, while contract value is found to be unrelated.
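A minimal sketch of the kind of two-level hierarchical linear model described, using statsmodels, is shown below; the variable names and data layout are hypothetical illustrations, not the measures or model specification of the paper.

```python
# Hedged sketch: a random-intercept hierarchical (mixed-effects) model of
# sponsorship fit with individual- and partnership-level predictors.
# Column names are assumptions for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

def fit_sponsorship_hlm(df: pd.DataFrame):
    # Individual-level predictors plus partnership-level contract terms,
    # with a random intercept for each firm (the grouping level).
    model = smf.mixedlm(
        "fit ~ perceived_benefits + regional_identification + sincerity "
        "+ relatedness + dominance + contract_length + contract_value",
        data=df,
        groups=df["firm_id"],
    )
    return model.fit()
```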
Abstract:
Inference and optimisation of real-valued edge variables in sparse graphs are studied using tree-based Bethe approximation optimisation algorithms. Equilibrium states of general energy functions involving a large set of real edge variables that interact at the network nodes are obtained for networks in various cases. These include different cost functions, connectivity values, constraints on the edge bandwidth, and the case of multiclass optimisation.
Abstract:
Liposomes, due to their biphasic character and diversity in design, composition and construction, offer a dynamic and adaptable technology for enhancing drug solubility. Starting with equimolar egg-phosphatidylcholine (PC)/cholesterol liposomes, the influence of the liposomal composition and surface charge on the incorporation and retention of a model poorly water-soluble drug, ibuprofen, was investigated. Both the incorporation and the release of ibuprofen were influenced by the lipid composition of the multi-lamellar vesicles (MLV), with inclusion of the long-alkyl-chain lipid dilignoceroyl phosphatidylcholine (C24PC) resulting in enhanced ibuprofen incorporation efficiency and retention. The cholesterol content of the liposome bilayer was also shown to influence ibuprofen incorporation, with maximum incorporation efficiency achieved when 4 μmol of cholesterol was present in the MLV formulation. Addition of the anionic lipid dicetylphosphate (DCP) reduced ibuprofen drug loading, presumably due to electrostatic repulsive forces between the carboxyl group of ibuprofen and the anionic head-group of DCP. In contrast, the addition of 2 μmol of the cationic lipid stearylamine (SA) to the liposome formulation (PC:Chol, 16 μmol:4 μmol) increased ibuprofen incorporation efficiency by approximately 8%. However, further increases of the SA content to 4 μmol and above reduced incorporation by almost 50% compared to liposome formulations excluding the cationic lipid. Environmental scanning electron microscopy (ESEM) was used to dynamically follow the changes in liposome morphology during dehydration, providing an alternative assay of liposome stability. ESEM analysis clearly demonstrated that ibuprofen incorporation improved the stability of PC:Chol liposomes, as evidenced by an increased resistance to coalescence during dehydration. These findings suggest a positive interaction between amphiphilic ibuprofen molecules and the bilayer structure of the liposome. © 2004 Elsevier B.V. All rights reserved.
Abstract:
Premium intraocular lenses (IOLs) aim to surgically correct astigmatism and presbyopia following cataract extraction, optimising vision and eliminating the need for cataract surgery in later years. It is usual to fully correct astigmatism and to provide visual correction for distance and near when prescribing spectacles and contact lenses; however, for correction with the lens implanted during cataract surgery, patients are required to purchase the premium IOLs and pay surgery fees outside the National Health Service in the UK. The benefit of using toric IOLs was demonstrated both in standard visual tests and in real-world situations. Orientation of toric IOLs during implantation is critical, and the benefit of using conjunctival blood vessels for alignment was shown. The issue of centration of IOLs relative to the pupil was also investigated, showing changes with the amount of dilation and on repeat dilation, which must be considered during surgery to optimise the visual performance of premium IOLs. Presbyopia is a global issue of growing importance as life expectancy increases, with no real long-term cure. Despite enhanced lifestyles, changes in diet and improved medical care, presbyopia still presents in modern life as a significant visual impairment. The onset of presbyopia was found to vary with risk factors including alcohol consumption, smoking, UV exposure and even weight, as well as age. A new technique to make measurement of accommodation more objective and robust was explored, although the need for further design modifications was identified. Most multifocal IOL designs suffer from dysphotopsia and a lack of intermediate vision; the development of a trifocal IOL was shown to minimise these drawbacks. The current thesis therefore emphasises the challenges of premium IOL surgery and the need for refinement for optimum visual outcomes, in addition to outlining how premium IOLs may provide long-term and successful correction of astigmatism and presbyopia.
Abstract:
Purpose: To develop a new scheme for efficiently recording the key parameters of gas permeable contact lens (GP) fits based on current consensus. Methods: Over 100 established GP fitters and educators met to discuss the parameters proposed in educational material for evaluating GP fit and agreed on the key parameters that should be recorded. The accuracy and variability of evaluating the fluorescein pattern of GP fit was determined by having 35 experienced contact lens practitioners from across the world grade 5 images of a range of fits, and the topographer simulations of the same fits, in random order using the proposed scheme. The accuracy of the grading was compared to objective image analysis of the fluorescein intensity of the same images. Results: The key information to record to adequately describe the fit of a GP was agreed as: the manufacturer, brand and lens parameters; settling time; comfort on a 5-point scale; centration; movement on blink on a ±2 scale; and the Primary Fluorescein Pattern in the central, mid-peripheral and edge regions of the lens, averaged along the horizontal and vertical lens axes, on a ±2 scale. On average 50-60% of practitioners selected the median grade when subjectively rating fluorescein intensity, and this was correlated with objective quantification (r = 0.602, p < 0.001). Objective grading suggested that the median fluorescein intensity was generally symmetrical along the horizontal meridian, as it was along the vertical meridian, but this was not the case for subjective grading. Simulated fluorescein patterns were subjectively and objectively graded as being less intense than real photographs (p < 0.01). Conclusion: GP fit recording can be standardised and simplified to enhance GP practice. © 2013 British Contact Lens Association.
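As a rough illustration, the agreed recording scheme could be captured in a simple record structure like the sketch below; the field names, types and scale encodings are assumptions for illustration, not a published specification.

```python
# Hedged sketch: one possible data structure for the GP fit recording scheme.
# Field names and encodings are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class GPFitRecord:
    manufacturer: str
    brand: str
    lens_parameters: str          # e.g. back optic zone radius / total diameter / power
    settling_time_min: float
    comfort: int                  # 5-point scale
    centration: str               # e.g. "central", "temporal", "superior"
    movement_on_blink: int        # -2 (insufficient) to +2 (excessive)
    # Primary fluorescein pattern graded on a ±2 scale, averaged along the
    # horizontal and vertical lens axes for each region of the lens.
    fluorescein_central: int
    fluorescein_mid_peripheral: int
    fluorescein_edge: int
```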
Abstract:
With the recent rapid growth of the Semantic Web (SW), the processes of searching and querying content that is both massive in scale and heterogeneous have become increasingly challenging. User-friendly interfaces, which can support end users in querying and exploring this novel and diverse, structured information space, are needed to make the vision of the SW a reality. We present a survey on ontology-based Question Answering (QA), which has emerged in recent years to exploit the opportunities offered by structured semantic information on the Web. First, we provide a comprehensive perspective by analyzing the general background and history of the QA research field, from influential works from the artificial intelligence and database communities developed in the 1970s and later decades, through open-domain QA stimulated by the QA track in TREC since 1999, to the latest commercial semantic QA solutions, before tackling the current state of the art in open user-friendly interfaces for the SW. Second, we examine the potential of this technology to go beyond the current state of the art to support end users in reusing and querying the SW content. We conclude our review with an outlook for this novel research area, focusing in particular on the R&D directions that need to be pursued to realize the goal of efficient and competent retrieval and integration of answers from large-scale, heterogeneous, and continuously evolving semantic sources.
Abstract:
This article is a first step towards addressing a gap in the field of organisational resilience research by examining how small and medium enterprises (SMEs) manage the threat and actuality of extreme events. Pilot research found that the managerial framing of extreme events varied with a range of organisational factors. This finding informed further examination of the contextual nature of the resilience concept. To date, large organisations have been the traditional focus of empirical work and theorising in this area; yet the heterogeneous SME sector makes up approximately 99% of UK industry and routinely operates under conditions of uncertainty. In a comparative study examining UK organisational resilience, it emerged that SME participants had both a distinctive perspective on and approach to resilience when compared to participants from larger organisations. This article presents a subset of data from 11 SME decision-makers. The relationship between resilience capabilities, such as flexibility and adaptation, is interrogated in relation to organisational size. The data suggest the limitations of applying a one-size-fits-all organisational solution (managerial or policy) to creating resilience. This study forms the basis for survey work examining the extent to which resilience is an organisationally contingent concept in practice.
Abstract:
This thesis presents detailed research on diamond materials. Chapter 1 is an overall introduction to the thesis. In Chapter 2, the literature on the physical, chemical, optical, mechanical and other properties of diamond materials is reviewed. The following chapter introduces several advanced diamond growth and characterisation techniques used in the experimental work. Chapter 4 then demonstrates the successful installation and application of the chemical vapour deposition system. Diamond growth was investigated on a variety of substrates, such as silicon, diamond-like carbon and silica fibres. In Chapter 5, single-crystalline diamond was used as the substrate for femtosecond laser inscription. The results demonstrated the potential feasibility of this technique, which could be used to fabricate future biochemical microfluidic channels on diamond substrates. In Chapter 6, hydrogen-terminated nanodiamond powder was studied using impedance spectroscopy; its intrinsic electrical properties and its thermal stability are presented and analysed in detail. As the first PhD student in the Nanoscience Research Group at Aston, my initial research work focused on the installation and testing of the microwave plasma enhanced chemical vapour deposition (MPECVD) system, which will benefit all future researchers in the group. The fundamentals of the MPECVD system are introduced in detail. After optimisation of the growth parameters, uniform diamond deposition was achieved with good surface coverage. Furthermore, one of the most significant contributions of this work is the successful pattern inscription on diamond substrates by a femtosecond laser system. Previous research on femtosecond laser inscription on diamond produced only simple lines or dots, with few characterisation techniques applied. In this work, the femtosecond laser was successfully used to inscribe patterns on diamond substrates, and full characterisation by SEM, Raman spectroscopy, XPS and AFM was carried out. After the femtosecond laser inscription, the depth of the microfluidic channels on the diamond film was found to be 300-400 nm, with a graphitic layer thickness of 165-190 nm. Another important outcome of this work is the first characterisation of the electrical properties of hydrogen-terminated nanodiamond by impedance spectroscopy. Based on experimental evaluation and mathematical fitting, the resistance of hydrogen-terminated nanodiamond was reduced to 0.25 MΩ, four orders of magnitude lower than that of untreated nanodiamond. A theoretical equivalent circuit was also proposed to fit the results. Furthermore, the hydrogen-terminated nanodiamond samples were annealed at different temperatures to study their thermal stability. The XPS and FTIR results indicate that hydrogen-terminated nanodiamond starts to oxidise above 100 °C and that the C-H bonds survive up to 400 °C. This research reports the fundamental electrical properties of hydrogen-terminated nanodiamond, which can inform future physical and chemical applications.
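As an illustration of equivalent-circuit fitting of impedance spectra, the sketch below fits a single parallel R-C element to a synthetic spectrum. The circuit topology, frequency range and parameter values are illustrative assumptions, not the circuit or data reported in the thesis.

```python
# Hedged sketch: least-squares fit of a parallel R-C equivalent circuit to an
# impedance spectrum. The "measured" data are synthesised for illustration.
import numpy as np
from scipy.optimize import curve_fit

def parallel_rc_impedance(freq_hz, r_ohm, c_farad):
    """Return [Re(Z), Im(Z)] of a resistor and capacitor in parallel, stacked for fitting."""
    omega = 2 * np.pi * freq_hz
    z = r_ohm / (1 + 1j * omega * r_ohm * c_farad)
    return np.concatenate([z.real, z.imag])

freq = np.logspace(1, 6, 60)                        # 10 Hz to 1 MHz
true = parallel_rc_impedance(freq, 2.5e5, 1e-10)    # e.g. 0.25 MΩ in parallel with 100 pF
measured = true + np.random.default_rng(1).normal(0, 50, true.shape)  # add noise

popt, _ = curve_fit(parallel_rc_impedance, freq, measured, p0=[1e5, 1e-11])
print(f"fitted R = {popt[0]:.3g} ohm, C = {popt[1]:.3g} F")
```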
Abstract:
In previous Statnotes, many of the statistical tests described rely on the assumption that the data are a random sample from a normal or Gaussian distribution. These include most of the tests in common usage, such as the ‘t’ test, the various types of analysis of variance (ANOVA), and Pearson’s correlation coefficient (‘r’). In microbiology research, however, not all variables can be assumed to follow a normal distribution. Yeast populations, for example, are a notable feature of freshwater habitats, representatives of over 100 genera having been recorded. Most common are the ‘red yeasts’ such as Rhodotorula, Rhodosporidium, and Sporobolomyces and ‘black yeasts’ such as Aureobasidium pullulans, together with species of Candida. Despite the abundance of genera and species, the overall density of an individual species in freshwater is likely to be low and hence samples taken from such a population will contain very low numbers of cells. A rare organism living in an aquatic environment may be distributed more or less at random in a volume of water and therefore samples taken from such an environment may result in counts which are more likely to be distributed according to the Poisson than the normal distribution. The Poisson distribution was named after the French mathematician Siméon Poisson (1781-1840) and has many applications in biology, especially in describing rare or randomly distributed events, e.g., the number of mutations in a given sequence of DNA after exposure to a fixed amount of radiation or the number of cells infected by a virus given a fixed level of exposure. This Statnote describes how to fit the Poisson distribution to counts of yeast cells in samples taken from a freshwater lake.
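For a rare, randomly distributed organism the Poisson probability of observing k cells in a sample is P(X = k) = e^(-λ) λ^k / k!, where λ is estimated by the sample mean. The sketch below illustrates the general fitting procedure with hypothetical counts; it is not the lake data or analysis from the Statnote.

```python
# Hedged sketch: fitting a Poisson distribution to counts per sample and
# checking the fit with a chi-square comparison of observed and expected frequencies.
import numpy as np
from scipy import stats

counts = np.array([0, 1, 0, 2, 1, 0, 0, 3, 1, 0, 2, 0, 1, 0, 0])  # hypothetical yeast cells per sample

lam = counts.mean()                                   # maximum-likelihood estimate of the Poisson mean
k = np.arange(0, counts.max() + 1)
expected = len(counts) * stats.poisson.pmf(k, lam)    # expected frequency of each count class
observed = np.array([(counts == i).sum() for i in k])

# Goodness of fit: in practice, sparse classes would be pooled and the upper
# class would include the tail; one degree of freedom is lost for estimating lam.
chi2 = ((observed - expected) ** 2 / expected).sum()
p_value = stats.chi2.sf(chi2, df=len(k) - 1 - 1)
print(f"lambda = {lam:.2f}, chi-square = {chi2:.2f}, p = {p_value:.3f}")
```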