12 results for model diagnostic and residual analysis in Aston University Research Archive
Abstract:
The topic of my research is consumer brand equity (CBE). My thesis is that the success or otherwise of a brand is better viewed from the consumers' perspective. I specifically focus on consumers as a unique group of stakeholders whose involvement with brands is crucial to the overall success of branding strategy. To this end, this research examines the constellation of ideas on brand equity that have hitherto been offered by various scholars. Through a systematic integration of the concepts and practices identified by these scholars (concepts and practices such as: competitiveness, consumer searching, consumer behaviour, brand image, brand relevance, consumer perceived value, etc.), this research identifies CBE as a construct that is shaped, directed and made valuable by the beliefs, attitudes and the subjective preferences of consumers. This is done by examining the criteria on the basis of which consumers evaluate brands and make brand purchase decisions. Understanding the criteria by which consumers evaluate brands is crucial for several reasons. First, as the basis upon which consumers select brands changes with consumption norms and technology, understanding the consumer choice process will help in formulating branding strategy. Secondly, an understanding of these criteria will help in formulating a creative and innovative agenda for 'new brand' propositions. Thirdly, it will also influence firms' ability to simulate and mould the plasticity of demand for existing brands. In examining these three issues, this thesis presents a comprehensive account of CBE: the first issue deals with the content of CBE; the second addresses the problem of how to develop a reliable and valid measuring instrument for CBE; and the third examines the structural and statistical relationships between the factors of CBE and the consequences of CBE for consumer perceived value (CPV). Using LISREL-SIMPLIS 8.30, the study finds direct and significant influential links between consumer brand equity and consumer value perception.
Abstract:
A nonlinear dynamic model of microbial growth is established based on the theories of the diffusion response of thermodynamics and the chemotactic response of biology. In addition to the two traditional variables, i.e. the density of bacteria and the concentration of attractant, the pH value, a crucial influencing factor in microbial growth, is also considered in this model. The pH effect on microbial growth is taken as a Gaussian function G0 exp(-(f - fc)^2 / G1), where G0, G1 and fc are constants, f represents the pH value and fc represents the critical pH value best suited to microbial growth. To study the effects of the reproduction rate of the bacteria and the pH value on the stability of the system, three parameters a, G0 and G1 are studied in detail, where a denotes the reproduction rate of the bacteria, G0 denotes the intensity of the pH effect on microbial growth and G1 denotes the bacterial adaptability to the pH value. When the effect of the pH value of the solution in which the microorganisms live is ignored in the governing equations of the model, the microbial system is more stable with larger a. When the effect of bacterial chemotaxis is ignored, the microbial system is more stable with larger G1 and more unstable with larger G0 for f0 > fc. However, the stability of the microbial system is almost unaffected by variations in G0 and G1, and it is always stable for f0 < fc under the conditions assumed in this paper. In the full system model, the system is more unstable with larger G1 and more stable with larger G0 for f0 < fc, and more stable with larger G1 and more unstable with larger G0 for f0 > fc. The system is also more unstable with larger a for f0 < fc, while its stability is almost unaffected by a for f0 > fc. The results obtained in this study provide biophysical insight into the growth and stability behaviour of microorganisms.
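A minimal numerical sketch of how the Gaussian pH factor can modulate growth (only the Gaussian form G0 exp(-(f - fc)^2/G1) is taken from the abstract; the logistic growth equation, the forward-Euler integration and all parameter values below are illustrative assumptions, not the paper's reaction-diffusion-chemotaxis model):

```python
import numpy as np

def ph_factor(f, G0=1.0, G1=0.5, fc=7.0):
    """Gaussian pH effect G0 * exp(-(f - fc)**2 / G1), as given in the abstract."""
    return G0 * np.exp(-(f - fc) ** 2 / G1)

def simulate_density(a=0.5, f=6.5, b0=0.01, K=1.0, t_end=50.0, dt=0.01):
    """Integrate an assumed logistic growth ODE, db/dt = a * G(f) * b * (1 - b/K),
    with forward Euler. The logistic form and values are illustrative only."""
    b = b0
    for _ in range(int(t_end / dt)):
        b += dt * a * ph_factor(f) * b * (1.0 - b / K)
    return b

if __name__ == "__main__":
    for f in (6.0, 7.0, 8.0):
        print(f"pH {f}: final density {simulate_density(f=f):.3f}")
```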
Abstract:
Aims - To build a population pharmacokinetic model that describes the apparent clearance of tacrolimus and the potential demographic, clinical and genetically controlled factors that could lead to inter-patient pharmacokinetic variability within children following liver transplantation. Methods - The present study retrospectively examined tacrolimus whole blood pre-dose concentrations (n = 628) of 43 children during their first year post-liver transplantation. Population pharmacokinetic analysis was performed using the non-linear mixed effects modelling program (NONMEM) to determine the population mean parameter estimate of clearance and influential covariates. Results - The final model identified time post-transplantation and the CYP3A5*1 allele as influential covariates on tacrolimus apparent clearance according to the following equation: TVCL = 12.9 x (Weight/13.2)^0.35 x exp(-0.0058 x TPT) x exp(0.428 x CYP3A5), where TVCL is the typical value for apparent clearance, TPT is time post-transplantation in days and CYP3A5 is 1 where the *1 allele is present and 0 otherwise. The population estimate and inter-individual variability (%CV) of tacrolimus apparent clearance were found to be 0.977 l h^-1 kg^-1 (95% CI 0.958, 0.996) and 40.0%, respectively, while the residual variability between the observed and predicted concentrations was 35.4%. Conclusion - Tacrolimus apparent clearance was influenced by time post-transplantation and CYP3A5 genotype. The results of this study, once confirmed by a large-scale prospective study, can be used in conjunction with therapeutic drug monitoring to recommend tacrolimus dose adjustments that take into account not only body weight but also genetic and time-related changes in tacrolimus clearance.
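A small sketch showing how the reported covariate equation can be evaluated for an individual patient (the equation and its coefficients are taken from the abstract; the example patient values are assumptions):

```python
import math

def tacrolimus_tvcl(weight_kg: float, days_post_transplant: float, cyp3a5_star1: bool) -> float:
    """Typical value of tacrolimus apparent clearance from the final model reported above:
    TVCL = 12.9 * (Weight/13.2)**0.35 * exp(-0.0058*TPT) * exp(0.428*CYP3A5)."""
    cyp = 1.0 if cyp3a5_star1 else 0.0
    return (12.9 * (weight_kg / 13.2) ** 0.35
            * math.exp(-0.0058 * days_post_transplant)
            * math.exp(0.428 * cyp))

# Hypothetical patient: 13.2 kg, 30 days post-transplantation, CYP3A5*1 carrier
print(round(tacrolimus_tvcl(13.2, 30, True), 2))
```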
Abstract:
An intelligent agent, operating in an external world which cannot be fully described in its internal world model, must be able to monitor the success of a previously generated plan and to respond to any errors which may have occurred. The process of error analysis requires the ability to reason in an expert fashion about time and about processes occurring in the world. Reasoning about time is needed to deal with causality. Reasoning about processes is needed since the direct effects of a plan action can be completely specified when the plan is generated, but the indirect effects cannot. For example, the action 'open tap' leads with certainty to 'tap open', whereas whether there will be a fluid flow and how long it might last is more difficult to predict. The majority of existing planning systems cannot handle these kinds of reasoning, thus limiting their usefulness. This thesis argues that both kinds of reasoning require a complex internal representation of the world. Qualitative Process Theory and an interval-based representation of time are proposed as the representation scheme for such a world model. The planning system which was constructed has been tested on a set of realistic planning scenarios. It is shown that even simple planning problems, such as making a cup of coffee, require extensive reasoning if they are to be carried out successfully. The final chapter concludes that the planning system described does allow the correct solution of planning problems involving complex side effects, which planners up to now have been unable to solve.
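A toy sketch of the process-based reasoning described above: direct effects are asserted immediately, while indirect effects emerge from processes that become active in the new state (a heavily simplified nod to Qualitative Process Theory, not the thesis's planner; all names and structure are assumptions):

```python
from dataclasses import dataclass

@dataclass
class Process:
    """A qualitative process becomes active when its preconditions hold and then
    contributes indirect effects (reduced here to simply adding facts)."""
    name: str
    preconditions: set
    effects: set

def apply_action(state: set, direct_effects: set, processes: list) -> set:
    """Direct effects are known at planning time; indirect effects arise from
    whichever processes the resulting state activates."""
    state = state | direct_effects
    changed = True
    while changed:                      # propagate until no new process fires
        changed = False
        for p in processes:
            if p.preconditions <= state and not p.effects <= state:
                state |= p.effects
                changed = True
    return state

# 'open tap' has the direct effect 'tap open'; fluid flow is an indirect effect
processes = [Process("liquid-flow", {"tap open", "water supply on"}, {"fluid flow"})]
print(apply_action({"water supply on"}, {"tap open"}, processes))
```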
Abstract:
The aim of this research was to improve the quantitative support to project planning and control, principally through the use of more accurate forecasting, for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network-based risk analysis (PERT). The former, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as the basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate the practical implementation. The outcome of this research project was deemed successful both in theory and in practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
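A minimal sketch of growth-curve forecasting with a cumulative cubic cost model (the generic cubic-through-the-origin form and the data points below are illustrative assumptions standing in for the DHSS-type model mentioned above):

```python
import numpy as np

# Observed cumulative cost fractions at normalised project times (illustrative data).
t = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
cost = np.array([0.0, 0.08, 0.30, 0.62, 0.88, 1.00])

# Fit a cubic through the origin, c(t) = a*t + b*t**2 + d*t**3, as a generic S-curve.
X = np.column_stack([t, t**2, t**3])
coeffs, *_ = np.linalg.lstsq(X, cost, rcond=None)

def forecast(t_frac: float) -> float:
    """Forecast the cumulative cost fraction at a given fraction of project duration."""
    a, b, d = coeffs
    return float(np.polyval([d, b, a, 0.0], t_frac))

print([round(forecast(x), 3) for x in (0.3, 0.5, 0.7)])
```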
Abstract:
Pulsed field gel electrophoresis of 82 intestinal spirochaete isolates showed specific differentiation of Serpulina pilosicoli and Serpulina hyodysenteriae, although considerable heterogeneity was observed, especially amongst S. pilosicoli isolates. In several cases genotypically similar isolates originated from different animals, suggesting that cross-species transmission may have occurred. The Caco-2 and Caco-2/HT29 cell models have been proposed as potentially realistic models of intestinal infection. Quantitation of adhesion to the cells showed isolate 382/91 (from a bacteraemia) to adhere in significantly greater numbers than any other isolate tested. This isolate produced a PFGE profile which differed from other S. pilosicoli isolates and so would be of interest for further study. Comparison of bacteraemic and other S. pilosicoli isolates suggested that bacteraemic isolates were not more specifically adapted for adhesion to, or invasion of, the epithelial cell layer than other S. pilosicoli isolates. Genotypically similar isolates from differing animal origins adhered to the Caco-2 model at similar levels. Generation of a random genomic library of S. pilosicoli and screening with a species-specific monoclonal antibody enabled the identification of a gene sequence encoding a protein which showed significant homology with an ancestral form of the enzyme pyruvate oxidoreductase. Immunoscreening with polyclonal serum identified the sequences of two gene clusters and a probable arylsulphatase. One gene cluster represented a ribosomal gene cluster with a molecular arrangement similar to those of Borrelia burgdorferi, Treponema pallidum and Thermotoga maritima. The other gene cluster contained an ABC transporter protein, sorbitol dehydrogenase and phosphomannose isomerase. An ELISA-type assay was used to demonstrate that isolates of S. pilosicoli could adhere to components of the extracellular matrix such as collagen (type I), fibronectin, laminin and porcine gastric mucin.
Abstract:
Firstly, we numerically model a practical 20 Gb/s undersea configuration employing the Return-to-Zero Differential Phase Shift Keying data format. The modelling is completed using the Split-Step Fourier Method to solve the Generalised Nonlinear Schrödinger Equation. We optimise the dispersion map and per-channel launch power of these channels and investigate how the choice of pre/post compensation can influence the performance. After obtaining these optimal configurations, we investigate the Bit Error Rate estimation of these systems and we see that estimation based on Gaussian statistics of the electrical current is appropriate for systems of this type, indicating quasi-linear behaviour. The introduction of narrower pulses due to the deployment of quasi-linear transmission decreases the tolerance to chromatic dispersion and intra-channel nonlinearity. We used tools from mathematical statistics to study the behaviour of these channels in order to develop new methods to estimate the Bit Error Rate. In the final section, we consider the estimation of Eye Closure Penalty, a popular measure of signal distortion. Using a numerical example and assuming the symmetry of eye closure, we see that we can simply estimate Eye Closure Penalty using Gaussian statistics. We also see that the statistics of the logical ones dominate the statistics of signal distortion in the case of Return-to-Zero On-Off Keying configurations.
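A compact sketch of the symmetric split-step Fourier method applied to a scalar nonlinear Schrödinger equation (a simplified stand-in for the generalised equation used in the thesis; higher-order dispersion, loss and amplification are omitted, and the pulse, fibre parameters and step sizes are illustrative assumptions):

```python
import numpy as np

def split_step_nlse(u0, dt, beta2, gamma, dz, n_steps):
    """Propagate a field envelope u0(t) over n_steps segments of length dz using the
    symmetric split-step Fourier method for i*du/dz = (beta2/2)*d2u/dt2 - gamma*|u|^2*u
    (sign conventions vary; this is one common form)."""
    w = 2 * np.pi * np.fft.fftfreq(u0.size, d=dt)        # angular frequency grid
    half_disp = np.exp(0.25j * beta2 * w**2 * dz)         # half-step dispersion operator
    u = u0.astype(complex)
    for _ in range(n_steps):
        u = np.fft.ifft(half_disp * np.fft.fft(u))        # half dispersion step
        u *= np.exp(1j * gamma * np.abs(u)**2 * dz)       # full nonlinear step
        u = np.fft.ifft(half_disp * np.fft.fft(u))        # half dispersion step
    return u

# Illustrative example: a Gaussian pulse propagated over 100 steps
t = np.linspace(-50, 50, 1024)
u0 = np.exp(-t**2 / 2)
u = split_step_nlse(u0, dt=t[1] - t[0], beta2=-1.0, gamma=1.0, dz=0.01, n_steps=100)
print(np.max(np.abs(u)))
```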
Abstract:
Liposomes, due to their biphasic character and diversity in design, composition and construction, offer a dynamic and adaptable technology for enhancing drug solubility. Starting with equimolar egg-phosphatidylcholine (PC)/cholesterol liposomes, the influence of the liposomal composition and surface charge on the incorporation and retention of a model poorly water-soluble drug, ibuprofen, was investigated. Both the incorporation and the release of ibuprofen were influenced by the lipid composition of the multi-lamellar vesicles (MLV), with inclusion of the long alkyl chain lipid dilignoceroyl phosphatidylcholine (C24PC) resulting in enhanced ibuprofen incorporation efficiency and retention. The cholesterol content of the liposome bilayer was also shown to influence ibuprofen incorporation, with maximum ibuprofen incorporation efficiency achieved when 4 μmol of cholesterol was present in the MLV formulation. Addition of the anionic lipid dicetylphosphate (DCP) reduced ibuprofen drug loading, presumably due to electrostatic repulsive forces between the carboxyl group of ibuprofen and the anionic head-group of DCP. In contrast, the addition of 2 μmol of the cationic lipid stearylamine (SA) to the liposome formulation (PC:Chol - 16 μmol:4 μmol) increased ibuprofen incorporation efficiency by approximately 8%. However, further increases of the SA content to 4 μmol and above reduced incorporation by almost 50% compared to liposome formulations excluding the cationic lipid. Environmental scanning electron microscopy (ESEM) was used to dynamically follow the changes in liposome morphology during dehydration to provide an alternative assay of liposome stability. ESEM analysis clearly demonstrated that ibuprofen incorporation improved the stability of PC:Chol liposomes, as evidenced by an increased resistance to coalescence during dehydration. These findings suggest a positive interaction between amphiphilic ibuprofen molecules and the bilayer structure of the liposome. © 2004 Elsevier B.V. All rights reserved.
Abstract:
Biomass-To-Liquid (BTL) is one of the most promising low carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass, the so-called “second generation biofuels” that, unlike first generation biofuels, can make use of a wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialized. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil, and they were characterised by different fuel synthesis processes including Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies were compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model could affect its output (i.e. production cost). This was the first time that an uncertainty analysis was included in a published techno-economic assessment study of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification due to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels, since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33%, or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.
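A minimal sketch of the kind of Monte Carlo uncertainty analysis described, propagating distributions on cost-model inputs through to a production-cost distribution (the cost structure, distributions and every numeric value below are illustrative assumptions, not the study's cost model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo samples

# Illustrative uncertain inputs (all values hypothetical):
capital_cost = rng.normal(300e6, 30e6, n)        # £, total capital investment
feedstock_price = rng.triangular(40, 55, 80, n)  # £/t dry biomass
plant_efficiency = rng.uniform(0.40, 0.50, n)    # biomass-to-fuel energy efficiency
annual_biomass = 500_000                         # t/yr, fixed assumption
fuel_yield_per_t = 0.2 * plant_efficiency        # t fuel per t biomass (toy relation)

# Simple annualised production cost per tonne of fuel (toy cost model):
annual_fuel = annual_biomass * fuel_yield_per_t
annualised_capital = capital_cost * 0.1          # 10% capital charge
production_cost = (annualised_capital + feedstock_price * annual_biomass) / annual_fuel

print(f"mean £{production_cost.mean():.0f}/t, "
      f"5th-95th percentile £{np.percentile(production_cost, 5):.0f}-"
      f"£{np.percentile(production_cost, 95):.0f}/t")
```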
Abstract:
WHAT IS ALREADY KNOWN ABOUT THIS SUBJECT • The cytotoxic effects of 6-mercaptopurine (6-MP) were found to be due to drug-derived intracellular metabolites (mainly 6-thioguanine nucleotides and to some extent 6-methylmercaptopurine nucleotides) rather than the drug itself. • Current empirical dosing methods for oral 6-MP result in highly variable drug and metabolite concentrations and hence variability in treatment outcome. WHAT THIS STUDY ADDS • The first population pharmacokinetic model has been developed for 6-MP active metabolites in paediatric patients with acute lymphoblastic leukaemia, and the potential demographic and genetically controlled factors that could lead to interpatient pharmacokinetic variability in this population have been assessed. • The model shows a large reduction in interindividual variability of pharmacokinetic parameters when body surface area and thiopurine methyltransferase polymorphism are incorporated into the model as covariates. • The developed model offers a more rational dosing approach for 6-MP than the traditional empirical method (based on body surface area) by combining it with pharmacogenetically guided dosing based on thiopurine methyltransferase genotype. AIMS - To investigate the population pharmacokinetics of 6-mercaptopurine (6-MP) active metabolites in paediatric patients with acute lymphoblastic leukaemia (ALL) and examine the effects of various genetic polymorphisms on the disposition of these metabolites. METHODS - Data were collected prospectively from 19 paediatric patients with ALL (n = 75 samples, 150 concentrations) who received 6-MP maintenance chemotherapy (titrated to a target dose of 75 mg m^-2 day^-1). All patients were genotyped for polymorphisms in three enzymes involved in 6-MP metabolism. Population pharmacokinetic analysis was performed with the non-linear mixed effects modelling program (NONMEM) to determine the population mean parameter estimates of clearance for the active metabolites. RESULTS - The developed model revealed considerable interindividual variability (IIV) in the clearance of the 6-MP active metabolites [6-thioguanine nucleotides (6-TGNs) and 6-methylmercaptopurine nucleotides (6-mMPNs)]. Body surface area explained a significant part of the IIV in 6-TGN clearance when incorporated in the model (IIV reduced from 69.9% to 29.3%). The most influential covariate examined, however, was thiopurine methyltransferase (TPMT) genotype, which resulted in the greatest reduction in the model's objective function (P < 0.005) when incorporated as a covariate affecting the fractional metabolic transformation of 6-MP into 6-TGNs. The other genetic covariates tested were not statistically significant and therefore were not included in the final model. CONCLUSIONS - The developed pharmacokinetic model (if successful at external validation) would offer a more rational dosing approach for 6-MP than the traditional empirical method, since it combines the current practice of using body surface area in 6-MP dosing with pharmacogenetically guided dosing based on TPMT genotype.
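A schematic sketch of the covariate structure such population models commonly use, with body surface area entering clearance through a power model and TPMT genotype scaling the fraction of 6-MP converted to 6-TGNs (the functional forms and all numeric values are illustrative assumptions; the abstract does not report the fitted equations):

```python
import math

def tgn_model(dose_mg: float, bsa_m2: float, tpmt_variant: bool,
              f_tgn: float = 0.2, theta_tpmt: float = 2.0,
              theta_cl: float = 10.0, theta_bsa: float = 0.75, eta: float = 0.0):
    """Illustrative population-PK style structure mirroring the covariates named above:
    - TPMT genotype scales the fraction of 6-MP converted to 6-TGNs;
    - body surface area enters clearance through a power model;
    - eta is the individual random effect on clearance.
    All numeric values are made-up placeholders, not the study's estimates."""
    fraction = f_tgn * (theta_tpmt if tpmt_variant else 1.0)   # variant -> more 6-TGN formed
    clearance = theta_cl * (bsa_m2 / 1.0) ** theta_bsa * math.exp(eta)
    steady_state_conc = dose_mg * fraction / clearance          # toy steady-state relation
    return clearance, steady_state_conc

# Hypothetical child: 50 mg dose, BSA 0.8 m2, TPMT variant carrier, typical individual
print(tgn_model(50, 0.8, tpmt_variant=True))
```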
Abstract:
Design verification in the digital domain, using model-based principles, is a key research objective to address the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of design and the associated production methods is currently fragmented, prolonged and sub-optimal, as it uses digital and physical verification stages that are deployed in a sequential manner using multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data of assemblies, in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
Abstract:
Diabetes patients may face an unhealthy lifestyle, long-term treatment and chronic complications. Reducing the hospitalization rate is a crucial problem for health care centers. This study combines the bagging method with decision trees as base classifiers and cost-sensitive analysis to classify diabetes patients. Real patient data collected from a regional hospital in Thailand were analyzed. The relevant factors were selected and used to construct base classifier decision tree models to classify diabetes and non-diabetes patients. The bagging method was then applied to improve accuracy. Finally, asymmetric classification cost matrices were used to provide alternative models for diabetes data analysis.
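A minimal sketch of the described pipeline, bagged decision trees with asymmetric misclassification costs expressed as class weights (scikit-learn is assumed as the implementation; the synthetic data, feature dimensionality and cost values are illustrative assumptions):

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Illustrative stand-in for the hospital data: 5 clinical features, binary label.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Asymmetric costs: missing a diabetes patient (class 1) is weighted 5x higher.
base_tree = DecisionTreeClassifier(max_depth=5, class_weight={0: 1, 1: 5})
# Note: scikit-learn < 1.2 uses base_estimator= instead of estimator=.
model = BaggingClassifier(estimator=base_tree, n_estimators=50, random_state=0)
model.fit(X_train, y_train)

print(confusion_matrix(y_test, model.predict(X_test)))
```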