58 results for 1539


Abstract:

A number of recent experiments suggest that, at a given wetting speed, the dynamic contact angle formed by an advancing liquid-gas interface with a solid substrate depends on the flow field and geometry near the moving contact line. In the present work, this effect is investigated in the framework of an earlier developed theory that was based on the fact that dynamic wetting is, by its very name, a process of formation of a new liquid-solid interface (newly “wetted” solid surface) and hence should be considered not as a singular problem but as a particular case from a general class of flows with forming or/and disappearing interfaces. The results demonstrate that, in the flow configuration of curtain coating, where a liquid sheet (“curtain”) impinges onto a moving solid substrate, the actual dynamic contact angle indeed depends not only on the wetting speed and material constants of the contacting media, as in the so-called slip models, but also on the inlet velocity of the curtain, its height, and the angle between the falling curtain and the solid surface. In other words, for the same wetting speed the dynamic contact angle can be varied by manipulating the flow field and geometry near the moving contact line. The obtained results have important experimental implications: given that the dynamic contact angle is determined by the values of the surface tensions at the contact line and hence depends on the distributions of the surface parameters along the interfaces, which can be influenced by the flow field, one can use the overall flow conditions and the contact angle as a macroscopic multiparametric signal-response pair that probes the dynamics of the liquid-solid interface. This approach would allow one to investigate experimentally such properties of the interface as, for example, its equation of state and the rheological properties involved in the interface’s response to an external torque, and would help to measure its parameters, such as the coefficient of sliding friction, the surface-tension relaxation time, and so on.
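
A short way to see why the flow field can shift the angle at a fixed wetting speed: the abstract notes that the dynamic contact angle is set by the surface tensions acting at the contact line, which satisfy a Young-type balance (quoted here in its standard form; the paper's model additionally lets these tensions be influenced by the flow along the forming interfaces):

\[
  \cos\theta_d \;=\; \frac{\sigma_{SG} - \sigma_{SL}}{\sigma_{LG}},
\]

where \sigma_{SG}, \sigma_{SL} and \sigma_{LG} are the solid-gas, solid-liquid and liquid-gas surface tensions evaluated at the contact line. If the flow near the contact line changes the distributions of the surface parameters along the interfaces, the local values of these tensions, and hence \theta_d, change even though the wetting speed is the same.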

Abstract:

There has recently been increasing demand for better designs to conduct first-into-man dose-escalation studies more efficiently, more accurately and more quickly. The authors look into the Bayesian decision-theoretic approach and use simulation as a tool to investigate the impact of compromises with conventional practice that might make the procedures more acceptable for implementation. Copyright © 2005 John Wiley & Sons, Ltd.

Abstract:

The double triangular test was introduced twenty years ago, and the purpose of this paper is to review applications that have been made since then. In fact, take-up of the method was rather slow until the late 1990s, but in recent years several clinical trial reports have been published describing its use in a wide range of therapeutic areas. The core of this paper is a detailed account of five trials that have been published since 2000 in which the method was applied to studies of pancreatic cancer, breast cancer, myocardial infarction, epilepsy and bedsores. Before those accounts are given, the method is described and the history behind its evolution is presented. The future potential of the method for sequential case-control and equivalence trials is also discussed. Copyright © 2004 John Wiley & Sons, Ltd.

Abstract:

The aim of a phase II clinical trial is to decide whether or not to develop an experimental therapy further through phase III clinical evaluation. In this paper, we present a Bayesian approach to the phase II trial, although we assume that subsequent phase III clinical trials will have standard frequentist analyses. The decision whether to conduct the phase III trial is based on the posterior predictive probability of a significant result being obtained. This fusion of Bayesian and frequentist techniques accepts the current paradigm for expressing objective evidence of therapeutic value, while optimizing the form of the phase II investigation that leads to it. By using prior information, we can assess whether a phase II study is needed at all, and how much or what sort of evidence is required. The proposed approach is illustrated by the design of a phase II clinical trial of a multi-drug resistance modulator used in combination with standard chemotherapy in the treatment of metastatic breast cancer. Copyright © 2005 John Wiley & Sons, Ltd.
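
As an illustration of the posterior predictive calculation described above, the sketch below uses a simplified normal-normal model (made-up prior parameters, variances and sample sizes) to compute the predictive probability that a planned frequentist phase III z-test will reach significance; it conveys the general idea rather than the paper's exact design.

```python
# Hedged sketch: predictive probability that a planned frequentist phase III
# trial will be significant, given phase II data.  Simplified normal-normal
# illustration; prior parameters, variances and sample sizes are made up.
import numpy as np
from scipy import stats

def predictive_prob_phase3(delta_hat, se2_phase2, prior_mean, prior_var,
                           n3_per_arm, sigma, alpha=0.05, n_sim=100_000,
                           seed=None):
    """Monte Carlo predictive probability of a significant phase III z-test."""
    rng = np.random.default_rng(seed)
    # Conjugate normal posterior for the treatment effect given the phase II estimate
    post_var = 1.0 / (1.0 / prior_var + 1.0 / se2_phase2)
    post_mean = post_var * (prior_mean / prior_var + delta_hat / se2_phase2)
    # Draw plausible true effects, then simulate the phase III estimate
    delta = rng.normal(post_mean, np.sqrt(post_var), n_sim)
    se3 = sigma * np.sqrt(2.0 / n3_per_arm)
    delta3_hat = rng.normal(delta, se3)
    z = delta3_hat / se3
    z_crit = stats.norm.ppf(1 - alpha / 2)
    return np.mean(z > z_crit)       # significant and in favour of the new therapy

# Illustrative numbers only
print(predictive_prob_phase3(delta_hat=0.4, se2_phase2=0.05,
                             prior_mean=0.0, prior_var=1.0,
                             n3_per_arm=200, sigma=1.0, seed=1))
```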

Abstract:

We consider the comparison of two formulations in terms of average bioequivalence using the 2 × 2 cross-over design. In a bioequivalence study, the primary outcome is a pharmacokinetic measure, such as the area under the plasma concentration by time curve, which is usually assumed to have a lognormal distribution. The criterion typically used for claiming bioequivalence is that the 90% confidence interval for the ratio of the means should lie within the interval (0.80, 1.25), or equivalently the 90% confidence interval for the differences in the means on the natural log scale should be within the interval (-0.2231, 0.2231). We compare the gold standard method for calculation of the sample size based on the non-central t distribution with those based on the central t and normal distributions. In practice, the differences between the various approaches are likely to be small. Further approximations to the power function are sometimes used to simplify the calculations. These approximations should be used with caution, because the sample size required for a desirable level of power might be under- or overestimated compared to the gold standard method. However, in some situations the approximate methods produce very similar sample sizes to the gold standard method. Copyright © 2005 John Wiley & Sons, Ltd.
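
The non-central t calculation can be sketched as follows. The two one-sided tests (TOST) power formula and the 2 × 2 cross-over variance convention used here (standard error \sigma_W \sqrt{2/N} on the log scale with N − 2 degrees of freedom, \sigma_W being the within-subject standard deviation) are the commonly used ones, and the numerical inputs are illustrative; this is a sketch of the approach rather than a reproduction of the paper's calculations.

```python
# Hedged sketch: total sample size for average bioequivalence in a 2x2
# cross-over, using the non-central t distribution for the power of the
# two one-sided tests.  Inputs are illustrative.
import numpy as np
from scipy import stats

def tost_power(n_total, sigma_w, theta=0.0, limit=np.log(1.25), alpha=0.05):
    """Power of TOST for a 2x2 cross-over with n_total subjects in total.

    theta: true difference in means on the natural log scale.
    sigma_w: within-subject standard deviation on the log scale."""
    df = n_total - 2
    se = sigma_w * np.sqrt(2.0 / n_total)
    t_crit = stats.t.ppf(1 - alpha, df)
    ncp_lower = (theta + limit) / se      # non-centrality against the lower limit
    ncp_upper = (theta - limit) / se      # non-centrality against the upper limit
    power = stats.nct.cdf(-t_crit, df, ncp_upper) - stats.nct.cdf(t_crit, df, ncp_lower)
    return max(float(power), 0.0)

def total_sample_size(sigma_w, theta=0.0, target_power=0.9, alpha=0.05):
    """Smallest even total sample size reaching the target power."""
    n = 4
    while tost_power(n, sigma_w, theta, alpha=alpha) < target_power:
        n += 2
    return n

print(total_sample_size(sigma_w=0.25, theta=np.log(1.05)))  # illustrative inputs
```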

Abstract:

The International Citicoline Trial in acUte Stroke is a sequential phase III study of the use of the drug citicoline in the treatment of acute ischaemic stroke, which was initiated in 2006 in 56 treatment centres. The primary objective of the trial is to demonstrate improved recovery of patients randomized to citicoline relative to those randomized to placebo after 12 weeks of follow-up. The primary analysis will take the form of a global test combining the dichotomized results of assessments on three well-established scales: the Barthel Index, the modified Rankin scale and the National Institutes of Health Stroke Scale. This approach was previously used in the analysis of the influential National Institute of Neurological Disorders and Stroke trial of recombinant tissue plasminogen activator in stroke. The purpose of this paper is to describe how this trial was designed, and in particular how the simultaneous objectives of taking into account three assessment scales, performing a series of interim analyses and conducting treatment allocation and adjusting the analyses to account for prognostic factors, including more than 50 treatment centres, were addressed. Copyright (C) 2008 John Wiley & Sons, Ltd.
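
One common way to implement a global test of this kind is a generalized estimating equations (GEE) model with a shared treatment effect across the three dichotomized scales. The sketch below, on simulated data, illustrates that idea; it is not necessarily the exact specification adopted for the trial, and the variable names, cut-offs and effect sizes are invented.

```python
# Hedged sketch: global test of treatment effect across three correlated
# dichotomized stroke scales, via GEE with an exchangeable working
# correlation.  Simulated data; thresholds and effect sizes are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400
treat = rng.integers(0, 2, size=n)                 # 1 = citicoline, 0 = placebo
propensity = 0.5 * treat + rng.normal(size=n)      # shared patient-level recovery
thresholds = np.array([0.3, 0.5, 0.8])             # one dichotomization cut-off per scale
noise = rng.normal(scale=0.7, size=(n, 3))
favourable = (propensity[:, None] + noise > thresholds).astype(int)

# Long format: one row per patient-scale combination
long = pd.DataFrame({
    "patient": np.repeat(np.arange(n), 3),
    "scale": np.tile(["Barthel", "Rankin", "NIHSS"], n),
    "treat": np.repeat(treat, 3),
    "favourable": favourable.ravel(),
})
exog = sm.add_constant(long[["treat"]].astype(float))
model = sm.GEE(long["favourable"], exog, groups=long["patient"],
               family=sm.families.Binomial(),
               cov_struct=sm.cov_struct.Exchangeable())
res = model.fit()
print(res.params["treat"], res.pvalues["treat"])   # global log-odds ratio and p-value
```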

Abstract:

Multiscale modeling is emerging as one of the key challenges in mathematical biology. However, the recent rapid increase in the number of modeling methodologies being used to describe cell populations has raised a number of interesting questions. For example, at the cellular scale, how can the appropriate discrete cell-level model be identified in a given context? Additionally, how can the many phenomenological assumptions used in the derivation of models at the continuum scale be related to individual cell behavior? In order to begin to address such questions, we consider a discrete one-dimensional cell-based model in which cells are assumed to interact via linear springs. From the discrete equations of motion, the continuous Rouse [P. E. Rouse, J. Chem. Phys. 21, 1272 (1953)] model is obtained. This formalism readily allows the definition of a cell number density for which a nonlinear "fast" diffusion equation is derived. Excellent agreement is demonstrated between the continuum and discrete models. Subsequently, via the incorporation of cell division, we demonstrate that the derived nonlinear diffusion model is robust to the inclusion of more realistic biological detail. In the limit of stiff springs, where cells can be considered to be incompressible, we show that cell velocity can be directly related to cell production. This assumption is frequently made in the literature but our derivation places limits on its validity. Finally, the model is compared with a model of a similar form recently derived for a different discrete cell-based model and it is shown how the different diffusion coefficients can be understood in terms of the underlying assumptions about cell behavior in the respective discrete models.
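
A minimal version of the discrete model described above (overdamped cells joined by linear springs, without cell division) can be written down directly; the parameter values, initial condition and free-end boundary treatment below are illustrative rather than those of the paper.

```python
# Hedged sketch: overdamped one-dimensional chain of cells connected by linear
# springs, the type of discrete model from which a continuum (Rouse-type)
# diffusion description is derived.  Parameters are illustrative.
import numpy as np

def simulate_chain(n_cells=50, k=1.0, eta=1.0, a=1.0, dt=1e-3, n_steps=20_000):
    """Forward-Euler integration of eta * dx_i/dt = k (x_{i+1} - 2 x_i + x_{i-1})."""
    x = np.linspace(0.0, 0.5 * a * (n_cells - 1), n_cells)  # initially compressed chain
    for _ in range(n_steps):
        force = np.zeros_like(x)
        force[1:-1] = k * (x[2:] - 2.0 * x[1:-1] + x[:-2])   # rest lengths cancel internally
        force[0] = k * (x[1] - x[0] - a)                     # free ends relax towards spacing a
        force[-1] = -k * (x[-1] - x[-2] - a)
        x += dt * force / eta
    return x

x_final = simulate_chain()
# Cell number density is roughly the inverse spacing; its evolution is what the
# continuum model describes with a nonlinear "fast" diffusion equation.
density = 1.0 / np.diff(x_final)
print(np.round(density[:5], 3))
```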

Abstract:

Observation of adverse drug reactions during drug development can cause closure of the whole programme. However, if an association between the genotype and the risk of an adverse event is discovered, then it might suffice to exclude patients of certain genotypes from future recruitment. Various sequential and non-sequential procedures are available to identify an association between the whole genome, or at least a portion of it, and the incidence of adverse events. In this paper we start with a suspected association between the genotype and the risk of an adverse event and suppose that the genetic subgroups with elevated risk can be identified. Our focus is determination of whether the patients identified as being at risk should be excluded from further studies of the drug. We propose using a utility function to determine the appropriate action, taking into account the relative costs of suffering an adverse reaction and of failing to alleviate the patient's disease. Two illustrative examples are presented, one comparing patients who suffer from an adverse event with contemporary patients who do not, and the other making use of a reference control group. We also illustrate two classification methods, LASSO and CART, for identifying patients at risk, but we stress that any appropriate classification method could be used in conjunction with the proposed utility function. Our emphasis is on determining the action to take rather than on providing definitive evidence of an association. Copyright © 2008 John Wiley & Sons, Ltd.
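
The sketch below illustrates the two ingredients on simulated data: a classification step (LASSO-penalised logistic regression, with CART shown as an alternative) to flag genotypes at elevated risk, followed by a simple expected-cost comparison of excluding or retaining the flagged subgroup. The simulated genotypes, the "causal" SNPs and the utility weights are invented for illustration and are not taken from the paper.

```python
# Hedged sketch: flag an at-risk genetic subgroup with LASSO-penalised logistic
# regression (CART as an alternative), then compare expected costs of excluding
# that subgroup versus recruiting all genotypes.  All inputs are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n, n_snps = 500, 200
genotypes = rng.integers(0, 3, size=(n, n_snps))       # minor-allele counts 0/1/2
risk_score = -2.5 + 1.2 * genotypes[:, 10] + 0.8 * genotypes[:, 42]
adverse = rng.random(n) < 1.0 / (1.0 + np.exp(-risk_score))

lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
at_risk = lasso.fit(genotypes, adverse).predict(genotypes).astype(bool)
cart = DecisionTreeClassifier(max_depth=3, min_samples_leaf=25).fit(genotypes, adverse)
at_risk_cart = cart.predict(genotypes).astype(bool)     # alternative CART-based rule

# Expected cost per future patient under each action (illustrative weights):
# cost_ae        - disutility of an adverse reaction
# cost_untreated - disutility of withholding an effective drug
cost_ae, cost_untreated = 5.0, 1.0
cost_treat_all = cost_ae * adverse.mean()
cost_exclude = cost_untreated * at_risk.mean() + cost_ae * (adverse & ~at_risk).mean()
print("exclude flagged subgroup" if cost_exclude < cost_treat_all
      else "continue recruiting all genotypes")
```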

Abstract:

Two-stage designs offer substantial advantages for early phase II studies. The interim analysis following the first stage allows the study to be stopped for futility, or more positively, it might lead to early progression to the trials needed for late phase II and phase III. If the study is to continue to its second stage, then there is an opportunity for a revision of the total sample size. Two-stage designs have been implemented widely in oncology studies in which there is a single treatment arm and patient responses are binary. In this paper the case of two-arm comparative studies in which responses are quantitative is considered. This setting is common in therapeutic areas other than oncology. It will be assumed that observations are normally distributed, but that there is some doubt concerning their standard deviation, motivating the need for sample size review. The work reported has been motivated by a study in diabetic neuropathic pain, and the development of the design for that trial is described in detail. Copyright © 2008 John Wiley & Sons, Ltd.
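
A bare-bones version of the sample size review is sketched below using the standard normal-theory per-arm formula: the first stage is run at half the originally planned size, the standard deviation is re-estimated from the interim data, and the second-stage size is recomputed. The numbers, and the simplistic unblinded handling of the interim data, are illustrative rather than the procedure developed for the diabetic neuropathic pain trial.

```python
# Hedged sketch: mid-course sample size review for a two-arm study with
# normally distributed responses and an uncertain standard deviation.
# All numerical inputs are illustrative.
import numpy as np
from scipy import stats

def per_arm_n(delta, sigma, alpha=0.05, power=0.9):
    """Per-arm sample size for a two-sided z-test of a mean difference delta."""
    z = stats.norm.ppf(1 - alpha / 2) + stats.norm.ppf(power)
    return int(np.ceil(2.0 * (z * sigma / delta) ** 2))

delta = 1.0                # clinically relevant difference (illustrative)
sigma_planned = 2.0        # standard deviation assumed at the design stage
n_stage1 = per_arm_n(delta, sigma_planned) // 2        # first stage: half the planned size

rng = np.random.default_rng(7)
stage1_control = rng.normal(0.0, 2.6, n_stage1)        # true SD larger than planned
stage1_treated = rng.normal(delta, 2.6, n_stage1)
sigma_interim = np.sqrt(0.5 * (stage1_control.var(ddof=1) + stage1_treated.var(ddof=1)))

n_total = per_arm_n(delta, sigma_interim)              # revised per-arm target
n_stage2 = max(n_total - n_stage1, 0)                  # second-stage per-arm recruitment
print(n_stage1, round(float(sigma_interim), 2), n_stage2)
```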

Abstract:

Combinations of drugs are increasingly being used for a wide variety of diseases and conditions. A pre-clinical study may allow the investigation of the response at a large number of dose combinations. In determining the response to a drug combination, interest may lie in seeking evidence of synergism, in which the joint action is greater than the actions of the individual drugs, or of antagonism, in which it is less. Two well-known response surface models representing no interaction are Loewe additivity and Bliss independence, and Loewe or Bliss synergism or antagonism is defined relative to these. We illustrate an approach to fitting these models for the case in which the marginal single drug dose-response relationships are represented by four-parameter logistic curves with common upper and lower limits, and where the response variable is normally distributed with a common variance about the dose-response curve. When the dose-response curves are not parallel, the relative potency of the two drugs varies according to the magnitude of the desired effect and the models for Loewe additivity and synergism/antagonism cannot be explicitly expressed. We present an iterative approach to fitting these models without the assumption of parallel dose-response curves. A goodness-of-fit test based on residuals is also described. Implementation using the SAS NLIN procedure is illustrated using data from a pre-clinical study. Copyright © 2007 John Wiley & Sons, Ltd.
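
For the Bliss-independence reference surface, a sketch is straightforward once the marginal curves are written as four-parameter logistic functions with common upper and lower limits; the parameter values below are illustrative, and the Loewe models for non-parallel curves require the iterative fitting described above rather than a closed form.

```python
# Hedged sketch: 4PL marginal dose-response curves with common limits and the
# Bliss-independence surface they imply.  Parameter values are illustrative.
import numpy as np

def fourpl_fraction(dose, ec50, hill):
    """Fractional effect (0 to 1) of a single drug under a 4PL curve."""
    return dose ** hill / (ec50 ** hill + dose ** hill)

def bliss_surface(dose_a, dose_b, ec50_a, hill_a, ec50_b, hill_b,
                  lower=0.0, upper=100.0):
    """Predicted response under Bliss independence: f_AB = f_A + f_B - f_A * f_B."""
    fa = fourpl_fraction(dose_a, ec50_a, hill_a)
    fb = fourpl_fraction(dose_b, ec50_b, hill_b)
    f_ab = fa + fb - fa * fb
    return lower + (upper - lower) * f_ab

# Predicted responses over a small dose grid (illustrative parameters)
da, db = np.meshgrid([0.0, 1.0, 3.0, 10.0], [0.0, 2.0, 6.0, 20.0])
pred = bliss_surface(da, db, ec50_a=3.0, hill_a=1.2, ec50_b=6.0, hill_b=0.8)
# Observed responses consistently above `pred` would point to Bliss synergism,
# consistently below it to Bliss antagonism.
print(np.round(pred, 1))
```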

Abstract:

In this paper, Bayesian decision procedures are developed for dose-escalation studies based on binary measures of undesirable events and continuous measures of therapeutic benefit. The methods generalize earlier approaches where undesirable events and therapeutic benefit are both binary. A logistic regression model is used to model the binary responses, while a linear regression model is used to model the continuous responses. Prior distributions for the unknown model parameters are suggested. A gain function is discussed and an optional safety constraint is included. Copyright (C) 2006 John Wiley & Sons, Ltd.
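
The sketch below illustrates the structure of such a procedure on invented data: a logistic model in log dose for the binary undesirable event, a linear model in log dose for the continuous therapeutic benefit, a crude importance-sampling approximation to the posterior, an expected-gain criterion and an overdose safety constraint. The prior, gain function and safety limit are illustrative choices, not those proposed in the paper.

```python
# Hedged sketch: choosing the next dose in an escalation study with a binary
# undesirable event and a continuous benefit.  Crude prior-sample importance
# weighting stands in for a proper posterior computation; all inputs invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
doses = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
x = np.log(doses)

# Prior draws: logit P(event) = a + b*x and E[benefit] = c + d*x
m = 20_000
a = rng.normal(-3.0, 1.0, m); b = rng.normal(1.0, 0.5, m)
c = rng.normal(0.0, 1.0, m);  d = rng.normal(0.5, 0.5, m)
sigma_y = 1.0                                   # residual SD of benefit, assumed known

# Data accumulated so far (illustrative): dose given, event seen, benefit measured
obs_dose = np.array([1.0, 1.0, 2.0]); ox = np.log(obs_dose)
obs_event = np.array([0, 0, 1])
obs_benefit = np.array([0.2, 0.5, 0.9])

# Importance weights proportional to the likelihood of the data under each draw
p_event = 1.0 / (1.0 + np.exp(-(a[:, None] + b[:, None] * ox)))
lik_event = np.prod(np.where(obs_event == 1, p_event, 1.0 - p_event), axis=1)
lik_benefit = np.prod(stats.norm.pdf(obs_benefit, c[:, None] + d[:, None] * ox, sigma_y), axis=1)
w = lik_event * lik_benefit
w /= w.sum()

# Posterior summaries at each candidate dose; the gain is expected benefit and
# the safety constraint keeps P(event risk > 0.30) below 0.25
post_event = 1.0 / (1.0 + np.exp(-(a[:, None] + b[:, None] * x)))
prob_too_risky = (w[:, None] * (post_event > 0.30)).sum(axis=0)
expected_gain = (w[:, None] * (c[:, None] + d[:, None] * x)).sum(axis=0)
admissible = prob_too_risky < 0.25
next_dose = doses[admissible][np.argmax(expected_gain[admissible])]
print(next_dose)
```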

Abstract:

The orientational ordering of the nematic phase of a polyethylene glycol (PEG)-peptide block copolymer in aqueous solution is probed by small-angle neutron scattering (SANS), with the sample subjected to steady shear in a Couette cell. The PEG-peptide conjugate forms fibrils that behave as semiflexible rodlike chains. The orientational order parameters $\bar{P}_2$ and $\bar{P}_4$ are obtained by modeling the data using a series expansion approach to the form factor of uniform cylinders. The method used is independent of assumptions on the form of the singlet orientational distribution function. Good agreement with the anisotropic two-dimensional SANS patterns is obtained. The results show shear alignment starting at very low shear rates, and the orientational order parameters reach a plateau at higher shear rates with a pseudologarithmic dependence on shear rate. The most probable distribution functions correspond to fibrils parallel to the flow direction under shear, but a sample at rest shows a bimodal distribution with some of the rodlike peptide fibrils oriented perpendicular to the flow direction.
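
The order parameters quoted are the usual Legendre averages of the fibril orientation distribution f(\beta) (whose form the analysis does not need to assume), with \beta the angle between the fibril axis and the flow direction and f normalized so that \int_0^\pi f(\beta)\sin\beta\,\mathrm{d}\beta = 1:

\[
  \bar{P}_n \;=\; \int_0^{\pi} P_n(\cos\beta)\, f(\beta)\, \sin\beta \, \mathrm{d}\beta,
  \qquad
  P_2(x) = \tfrac{1}{2}\bigl(3x^2 - 1\bigr),
  \quad
  P_4(x) = \tfrac{1}{8}\bigl(35x^4 - 30x^2 + 3\bigr),
\]

so that \bar{P}_2 = \bar{P}_4 = 0 for an isotropic sample and \bar{P}_2 \to 1 for perfect alignment along the flow direction.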

Abstract:

We agree with Duckrow and Albano [Phys. Rev. E 67, 063901 (2003)] and Quian Quiroga [Phys. Rev. E 67, 063902 (2003)] that mutual information (MI) is a useful measure of dependence for electroencephalogram (EEG) data, but we show that the improvement seen in the performance of MI on extracting dependence trends from EEG depends more on the type of MI estimator than on any embedding technique used. In an independent study conducted in search of an optimal MI estimator, in particular for EEG applications, we examined the performance of a number of MI estimators on the data set used by Quian Quiroga in their original study, where the performance of different dependence measures on real data was investigated [Phys. Rev. E 65, 041903 (2002)]. We show that for EEG applications the best performance among the investigated estimators is achieved by k-nearest neighbors, which supports the conjecture by Quian Quiroga in Phys. Rev. E 67, 063902 (2003) that the nearest neighbor estimator is the most precise method for estimating MI.
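
As a concrete illustration of why the choice of estimator matters, the sketch below compares a simple plug-in histogram estimate of MI with the k-nearest-neighbour estimator provided by scikit-learn (mutual_info_regression, which follows the Kraskov-type approach) on a toy pair of correlated signals; the signals stand in for EEG channels and are unrelated to the data sets analysed in the paper.

```python
# Hedged sketch: two MI estimates for a pair of signals, a plug-in histogram
# estimator written out explicitly and scikit-learn's k-nearest-neighbour
# estimator.  Toy Gaussian signals only.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def binned_mi(x, y, bins=16):
    """Plug-in MI estimate (in nats) from a 2D histogram of the samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(5)
n = 2000
x = rng.normal(size=n)
y = 0.6 * x + 0.8 * rng.normal(size=n)      # partially dependent "channels"
# For this bivariate Gaussian the true MI is -0.5*ln(1 - 0.6**2), about 0.22 nats
print("binned:", round(binned_mi(x, y), 3))
print("kNN   :", round(float(mutual_info_regression(x.reshape(-1, 1), y, n_neighbors=4)[0]), 3))
```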