19 results for Variable dosage
in Aston University Research Archive
Abstract:
A robust vaginal immune response is considered essential for an effective prophylactic vaccine that prevents transmission of HIV and other sexually acquired diseases. Considerable attention has recently focused on the potential of vaginally administered vaccines as a means to induce such local immunity. However, the potential for vaccination at this site remains in doubt, as the vaginal mucosa is generally considered to have low immune inductive potential. In the current study, we explored for the first time the use of a quick-release, freeze-dried, solid dosage system for practical vaginal administration of a protein antigen. These solid dosage forms overcome the common problems of leakage and poor retention associated with vaginally administered antigen solutions. Mice were immunized vaginally with H4A, an HIV gp41 envelope-based recombinant protein, using quick-release, freeze-dried solid rods, and the immune responses were compared with those of a control group immunized by subcutaneous H4A injection. Vaginally immunized mice failed to elicit robust immune responses. Our detailed investigations, involving cytokine analysis, the stability of H4A in mouse cervicovaginal lavage, and elucidation of the state of the H4A protein in the immediate-release dosage form, revealed that antigen instability in vaginal fluid, the state of the antigen in the dosage form, and the cytokine profile induced are all likely to have contributed to the observed lack of immunogenicity. These are important factors affecting vaginal immunization and provide a rational basis for explaining the typically poor and variable elicitation of immunity at this site, despite the presence of immune-responsive cells within the vaginal mucosa. In future mucosal vaccine studies, a more explicit focus on antigen stability in the dosage form and on the immune potential of available antigen-responsive cells is recommended.
Abstract:
WHAT IS ALREADY KNOWN ABOUT THIS SUBJECT
• Tacrolimus is currently the mainstay of immunosuppression for most children undergoing liver transplantation (LT).
• The clinical use of this agent, however, is complicated by its various adverse effects (mainly nephrotoxicity), its narrow therapeutic index and considerable pharmacokinetic variability.
• The low and variable oral bioavailability of tacrolimus is thought to result from the action of the multidrug efflux pump P-glycoprotein, encoded by the ABCB1 gene.
WHAT THIS STUDY ADDS
• A significant association between ABCB1 genetic polymorphisms and tacrolimus-associated nephrotoxicity in paediatric patients following LT is reported for the first time. Genotyping such polymorphisms may have the potential to better individualize initial tacrolimus therapy and enhance drug safety.
• The long-term effect of ABCB1 polymorphisms on tacrolimus trough concentrations was investigated up to 5 years post-transplantation. A significant effect of intestinal P-glycoprotein genotypes on tacrolimus pharmacokinetics was found at 3 and 4 years post-transplantation, suggesting that the effect is maintained long term.
AIMS - The aim of this study was to investigate the influence of genetic polymorphisms in ABCB1 on the incidence of nephrotoxicity and tacrolimus dosage requirements in paediatric patients following liver transplantation.
METHODS - Fifty-one paediatric liver transplant recipients receiving tacrolimus were genotyped for the ABCB1 C1236T, G2677T and C3435T polymorphisms. Dose-adjusted tacrolimus trough concentrations and estimated glomerular filtration rates (eGFR) indicative of renal toxicity were determined and correlated with the corresponding genotypes.
RESULTS - The present study revealed a higher incidence of the ABCB1 variant alleles examined among patients with renal dysfunction (≥30% reduction in eGFR) at 6 months post-transplantation (1236T allele: 63.3% vs. 37.5% in controls, P = 0.019; 2677T allele: 63.3% vs. 35.9%, P = 0.012; 3435T allele: 60% vs. 39.1%, P = 0.057). Carriers of the G2677T variant allele also had a significant percentage reduction in eGFR at 12 months post-transplant (mean difference = 22.6%; P = 0.031). Haplotype analysis showed a significant association between the T-T-T haplotype and an increased incidence of nephrotoxicity at 6 months post-transplantation (haplotype frequency = 52.9% in nephrotoxic patients vs. 29.4% in controls; P = 0.029). Furthermore, the G2677T and C3435T polymorphisms and the T-T-T haplotype were significantly correlated with higher tacrolimus dose-adjusted pre-dose concentrations at various time points examined long after drug initiation.
CONCLUSIONS - These findings suggest that ABCB1 polymorphisms in the native intestine significantly influence tacrolimus dosage requirements in the stable phase after transplantation. In addition, ABCB1 polymorphisms in paediatric liver transplant recipients may predispose them to nephrotoxicity over the first year post-transplantation. Genotyping future transplant recipients for ABCB1 polymorphisms could therefore have the potential to better individualize tacrolimus immunosuppressive therapy and enhance drug safety.
Abstract:
The performance of seven minimization algorithms is compared on five neural network problems. These include a variable-step-size algorithm, conjugate gradient, and several methods with explicit analytic or numerical approximations to the Hessian.
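For readers who want a feel for this kind of benchmark, the sketch below compares a few generic minimizers from scipy.optimize on a toy single-hidden-layer network. The data, network size and choice of methods are assumptions for illustration only; they do not reproduce the paper's seven algorithms or five problems.

```python
# Illustrative sketch only: comparing generic minimizers on a tiny neural-network
# loss surface. Problem, data and methods are assumptions, not the paper's setup.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(3 * X[:, 0])                       # toy target function

def unpack(w, n_hidden=5):
    # weights: input->hidden, hidden biases, hidden->output, output bias
    w1 = w[:n_hidden]; b1 = w[n_hidden:2 * n_hidden]
    w2 = w[2 * n_hidden:3 * n_hidden]; b2 = w[3 * n_hidden]
    return w1, b1, w2, b2

def loss(w):
    w1, b1, w2, b2 = unpack(w)
    h = np.tanh(X * w1 + b1)                  # hidden activations, shape (50, 5)
    pred = h @ w2 + b2
    return np.mean((pred - y) ** 2)           # mean squared error

w0 = rng.normal(scale=0.5, size=3 * 5 + 1)
for method in ["CG", "BFGS", "Nelder-Mead"]:  # gradient-based and gradient-free examples
    res = minimize(loss, w0, method=method, options={"maxiter": 500})
    print(f"{method:12s} final loss = {res.fun:.4f}  iterations = {res.nit}")
```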
Abstract:
States or state sequences in neural network models are made to represent concepts from applications. This paper motivates, introduces and discusses a formalism for denoting such representations; a representation for representations. The formalism is illustrated by using it to discuss the representation of variable binding and inference abstractly, and then to present four specific representations. One of these is an apparently novel hybrid of phasic and tensor-product representations which retains the desirable properties of each.
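Since the abstract's formalism is not reproduced here, the following is only a minimal sketch of one of the ideas mentioned, tensor-product variable binding: a role (variable) vector is bound to a filler (value) vector by an outer product and recovered by contraction. The dimensions and vectors are invented for illustration and are not the paper's notation.

```python
# Minimal sketch of tensor-product variable binding (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
d = 16
role = rng.normal(size=d); role /= np.linalg.norm(role)     # unit-norm role (variable) vector
filler = rng.normal(size=d)                                  # value bound to the role

binding = np.outer(role, filler)            # rank-1 tensor representing role = filler

# Unbinding: contract the binding tensor with the role vector.
recovered = role @ binding                  # equals filler exactly for a unit-norm role
print(np.allclose(recovered, filler))       # True

# Several bindings can be superposed; unbinding then gives a noisy estimate whose
# crosstalk shrinks as the role vectors approach orthogonality.
role2 = rng.normal(size=d); role2 /= np.linalg.norm(role2)
filler2 = rng.normal(size=d)
memory = binding + np.outer(role2, filler2)
print(np.linalg.norm(role @ memory - filler))   # small residual from crosstalk
```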
Abstract:
There is currently considerable interest in developing general non-linear density models based on latent, or hidden, variables. Such models have the ability to discover the presence of a relatively small number of underlying 'causes' which, acting in combination, give rise to the apparent complexity of the observed data set. Unfortunately, to train such models generally requires large computational effort. In this paper we introduce a novel latent variable algorithm which retains the general non-linear capabilities of previous models but which uses a training procedure based on the EM algorithm. We demonstrate the performance of the model on a toy problem and on data from flow diagnostics for a multi-phase oil pipeline.
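The abstract does not give the model's update equations, so the following generic sketch only conveys the flavour of EM training for a latent-variable density model, using a two-component one-dimensional Gaussian mixture as a deliberately simplified stand-in for the non-linear model described here.

```python
# Generic EM sketch for a simple latent-variable density model (a two-component
# 1-D Gaussian mixture). Illustrative only; the paper's model is more general.
import numpy as np

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2.0, 0.7, 300), rng.normal(1.5, 1.0, 200)])

# Initial guesses for mixing weights, means and variances.
pi = np.array([0.5, 0.5]); mu = np.array([-1.0, 1.0]); var = np.array([1.0, 1.0])

for _ in range(100):
    # E-step: posterior responsibility of each component for each data point.
    dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    resp = pi * dens
    resp /= resp.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from the responsibilities.
    nk = resp.sum(axis=0)
    pi = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print("weights", pi.round(2), "means", mu.round(2), "variances", var.round(2))
```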
Abstract:
Visualization has proven to be a powerful and widely applicable tool for the analysis and interpretation of data. Most visualization algorithms aim to find a projection from the data space down to a two-dimensional visualization space. However, for complex data sets living in a high-dimensional space it is unlikely that a single two-dimensional projection can reveal all of the interesting structure. We therefore introduce a hierarchical visualization algorithm which allows the complete data set to be visualized at the top level, with clusters and sub-clusters of data points visualized at deeper levels. The algorithm is based on a hierarchical mixture of latent variable models, whose parameters are estimated using the expectation-maximization algorithm. We demonstrate the principle of the approach first on a toy data set, and then apply the algorithm to the visualization of a synthetic data set in 12 dimensions obtained from a simulation of multi-phase flows in oil pipelines and to data in 36 dimensions derived from satellite images.
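As a rough, hedged stand-in for the hierarchical scheme described (it is not the authors' hierarchical mixture of latent variable models), the sketch below produces one top-level two-dimensional view of a synthetic 12-dimensional data set and then separate views of each cluster, using plain PCA and a Gaussian mixture purely to convey the idea.

```python
# Crude stand-in for hierarchical visualization (illustrative only): a single
# top-level 2-D projection, followed by separate 2-D projections of each cluster.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Synthetic 12-dimensional data with three loose groups (stand-in for the oil-flow data).
data = np.vstack([rng.normal(loc=c, scale=0.5, size=(100, 12))
                  for c in (0.0, 2.0, 4.0)])

# Top level: one projection of the complete data set.
top_view = PCA(n_components=2).fit_transform(data)

# Deeper level: assign points to clusters, then project each cluster separately.
gmm = GaussianMixture(n_components=3, random_state=0).fit(data)
labels = gmm.predict(data)
sub_views = {k: PCA(n_components=2).fit_transform(data[labels == k])
             for k in range(3)}

print("top-level view:", top_view.shape)
for k, view in sub_views.items():
    print(f"sub-cluster {k}: {view.shape[0]} points projected to 2-D")
```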
Abstract:
This paper introduces a new technique for the investigation of limited-dependent variable models. It illustrates that variable precision rough set theory (VPRS), allied with the use of a modern method of classification, or discretisation of data, can outperform the more standard approaches employed in economics, such as a probit model. These approaches and certain inductive decision tree methods are compared (through a Monte Carlo simulation approach) in the analysis of the decisions reached by the UK Monopolies and Mergers Commission. We show that, particularly in small samples, the VPRS model can improve on more traditional models, both in-sample and, particularly, in out-of-sample prediction. A similar improvement in out-of-sample prediction over the decision tree methods is also shown.
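For readers unfamiliar with the rough-set side of this comparison, here is a minimal sketch of the variable precision rough set (VPRS) idea on an invented, already-discretised decision table: an equivalence class of condition-attribute values is assigned to a decision class only when the proportion of agreeing objects meets a precision threshold β. It is not the authors' implementation.

```python
# Illustrative VPRS sketch (not the authors' implementation): classify equivalence
# classes of a discretised decision table into a decision class when the proportion
# of agreeing objects reaches the precision threshold beta.
from collections import defaultdict

# Invented toy decision table: (condition-attribute tuple, decision).
table = [
    (("high", "yes"), 1), (("high", "yes"), 1), (("high", "yes"), 0),
    (("low",  "yes"), 0), (("low",  "yes"), 0),
    (("low",  "no"),  1), (("low",  "no"),  1), (("low",  "no"),  1),
]
beta = 0.6   # required precision (majority-inclusion threshold)

# Group objects into equivalence classes on the condition attributes.
classes = defaultdict(list)
for conditions, decision in table:
    classes[conditions].append(decision)

# beta-positive regions: classes assigned to a decision with confidence >= beta.
for conditions, decisions in classes.items():
    proportion = sum(decisions) / len(decisions)
    if proportion >= beta:
        assigned = 1
    elif proportion <= 1 - beta:
        assigned = 0
    else:
        assigned = None                     # boundary region: no confident assignment
    print(conditions, "-> decision", assigned, f"(precision {proportion:.2f})")
```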
Abstract:
The ability to distinguish one visual stimulus from another slightly different one depends on the variability of their internal representations. In a recent paper on human visual-contrast discrimination, Kontsevich et al. (2002, Vision Research 42, 1771-1784) reconsidered the long-standing question of whether the internal noise that limits discrimination is fixed (contrast-invariant) or variable (contrast-dependent). They tested discrimination performance for 3 cycles deg⁻¹ gratings over a wide range of incremental contrast levels at three masking contrasts, and showed that a simple model with an expansive response function and response-dependent noise could fit the data very well. Their conclusion - that noise in visual-discrimination tasks increases markedly with contrast - has profound implications for our understanding and modelling of vision. Here, however, we re-analyse their data and report that a standard gain-control model with a compressive response function and fixed additive noise can also fit the data remarkably well. Thus these experimental data do not allow us to decide between the two models; the question remains open. [Supported by EPSRC grant GR/S74515/01]
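To make the contrast between the two accounts concrete, the sketch below computes discriminability d' for a contrast increment under (a) an expansive response with response-dependent noise and (b) a compressive gain-control response with fixed additive noise. The functional forms and parameter values are invented for illustration and are not the fitted models from either paper.

```python
# Hedged sketch of the two competing accounts (invented parameters and forms).
import numpy as np

def expansive_model(c, p=2.4, k=0.01):
    """Expansive response with response-dependent noise."""
    r = c ** p
    sigma = k + 0.2 * r                     # noise grows with response level
    return r, sigma

def gain_control_model(c, p=2.4, q=2.0, z=0.08, sigma=0.01):
    """Compressive gain-control (Naka-Rushton-like) response with fixed noise."""
    r = c ** p / (z ** q + c ** q)
    return r, sigma

def d_prime(model, pedestal, increment):
    r1, s1 = model(pedestal)
    r2, s2 = model(pedestal + increment)
    return (r2 - r1) / np.sqrt(0.5 * (s1 ** 2 + s2 ** 2))

for pedestal in (0.05, 0.1, 0.2, 0.4):
    inc = 0.05 * pedestal                   # 5% contrast increment on each pedestal
    print(f"pedestal {pedestal:.2f}: "
          f"expansive d' = {d_prime(expansive_model, pedestal, inc):.3f}, "
          f"gain-control d' = {d_prime(gain_control_model, pedestal, inc):.3f}")
```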
Abstract:
With the extensive use of pulse modulation methods in telecommunications, much work has been done in the search for a better utilisation of the transmission channel. The present research is an extension of these investigations. A new modulation method, 'Variable Time-Scale Information Processing' (VTSIP), is proposed. The basic principles of this system have been established, and the main advantages and disadvantages investigated. With the proposed system, comparison circuits detect the instants at which the input signal voltage crosses predetermined amplitude levels. The time intervals between these occurrences are measured digitally and the results are temporarily stored, before being transmitted. After reception, an inverse process enables the original signal to be reconstituted. The advantage of this system is that the irregularities in the rate of information contained in the input signal are smoothed out before transmission, allowing the use of a smaller transmission bandwidth. A disadvantage of the system is the time delay necessarily introduced by the storage process. Another disadvantage is a type of distortion caused by the finite store capacity. A simulation of the system has been made using a standard speech signal, to make some assessment of this distortion. It is concluded that the new system should be an improvement on existing pulse transmission systems, allowing the use of a smaller transmission bandwidth, but introducing a time delay.
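A minimal software sketch of the level-crossing principle behind VTSIP (not the original hardware design) is given below: the encoder records only the instants at which the signal crosses preset amplitude levels, and the receiver rebuilds an approximation by interpolating between those stored crossings. The sample rate, test signal and number of levels are assumptions for illustration.

```python
# Hedged sketch of level-crossing encoding/decoding (illustrative only).
import numpy as np

fs = 16000                                   # sample rate (assumed, for illustration)
t = np.arange(0, 0.02, 1 / fs)
signal = 0.8 * np.sin(2 * np.pi * 200 * t) + 0.3 * np.sin(2 * np.pi * 650 * t)

levels = np.linspace(-1.0, 1.0, 9)           # predetermined comparison levels

# "Transmit" only the crossing events: (sample index, level crossed).
events = []
for i in range(1, len(signal)):
    for level in levels:
        if (signal[i - 1] - level) * (signal[i] - level) < 0:   # sign change => crossing
            events.append((i, level))

# Receiver: rebuild the waveform by linear interpolation between crossing events.
idx = np.array([i for i, _ in events])
val = np.array([lv for _, lv in events])
reconstructed = np.interp(np.arange(len(signal)), idx, val)

rms_error = np.sqrt(np.mean((signal - reconstructed) ** 2))
print(f"{len(events)} crossing events stored; RMS reconstruction error = {rms_error:.3f}")
```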
Abstract:
We present a diffractive phase variable attenuator for femtosecond laser radiation control. It allows the control of beam power up to 0.75 × 10¹³ W/cm² without introducing serious distortions in spectra and beam shape while operating in zero-order diffraction. The attenuator can operate with wavelengths from DUV to IR. © 2009 Optical Society of America.
Abstract:
There may be circumstances where it is necessary for microbiologists to compare variances rather than means, e.g., in analysing data from experiments to determine whether a particular treatment alters the degree of variability, or in testing the assumption of homogeneity of variance prior to other statistical tests. All of the tests described in this Statnote have their limitations: Bartlett's test may be too sensitive, but Levene's and the Brown-Forsythe tests also have problems. We would recommend the use of the variance-ratio test to compare two variances and the careful application of Bartlett's test if there are more than two groups. Considering that these tests are not particularly robust, it should be remembered that the homogeneity of variance assumption is usually the least important of those considered when carrying out an ANOVA. If there is concern about this assumption, and especially if the other assumptions of the analysis are also not likely to be met (e.g., lack of normality or non-additivity of treatment effects), then it may be better either to transform the data or to carry out a non-parametric test on the data.
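As a hedged worked example on invented data, the snippet below runs the tests discussed in the note using scipy.stats; the variance-ratio (F) test is computed directly because SciPy does not provide it as a single function, and Levene's test with median centring gives the Brown-Forsythe variant.

```python
# Worked example on invented data: comparing variances with the tests discussed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
group_a = rng.normal(loc=5.0, scale=1.0, size=20)   # e.g. counts under treatment A
group_b = rng.normal(loc=5.0, scale=2.0, size=20)   # treatment B, more variable
group_c = rng.normal(loc=5.0, scale=1.2, size=20)

# Variance-ratio (F) test for two groups: larger variance over smaller variance.
va, vb = np.var(group_a, ddof=1), np.var(group_b, ddof=1)
f = max(va, vb) / min(va, vb)
df = len(group_a) - 1
p_f = 2 * (1 - stats.f.cdf(f, df, df))              # two-sided p-value
print(f"variance-ratio test: F = {f:.2f}, p = {p_f:.4f}")

# Bartlett's test (sensitive to non-normality) for more than two groups.
print("Bartlett:       ", stats.bartlett(group_a, group_b, group_c))

# Levene's test and the Brown-Forsythe variant (median-centred), which are more robust.
print("Levene:         ", stats.levene(group_a, group_b, group_c, center='mean'))
print("Brown-Forsythe: ", stats.levene(group_a, group_b, group_c, center='median'))
```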
Abstract:
Fe{HB(CHN)} is observed by variable-temperature infrared and magnetic studies to have a spin transition between the low-spin S = 0 and high-spin S = 2 states at 331 K (58 °C), with a thermal hysteresis of ~1.5 K. Changes in the triazole ligand IR absorptions demonstrate that distant non-metal-ligand vibrations are altered upon the change in electronic structure associated with the spin crossover and can be used to monitor the spin-crossover transition.