404 results for RANDOM-PHASE-APPROXIMATION
Abstract:
BACKGROUND AND OBJECTIVE: Idiopathic pulmonary fibrosis (IPF) is a degenerative disease characterized by fibrosis following failed epithelial repair. Mesenchymal stromal cells (MSC), a key component of the stem cell niche in bone marrow and possibly other organs including lung, have been shown to enhance epithelial repair and are effective in preclinical models of inflammation-induced pulmonary fibrosis, but may be profibrotic in some circumstances. METHODS: In this single centre, non-randomized, dose escalation phase 1b trial, patients with moderately severe IPF (diffusing capacity for carbon monoxide (DLCO) ≥ 25% and forced vital capacity (FVC) ≥ 50%) received either 1 × 10^6 (n = 4) or 2 × 10^6 (n = 4) unrelated-donor, placenta-derived MSC/kg via a peripheral vein and were followed for 6 months with lung function (FVC and DLCO), 6-min walk distance (6MWD) and computed tomography (CT) chest. RESULTS: Eight patients (4 female, aged 63.5 (57-75) years) with median (interquartile range) FVC 60 (52.5-74.5)% and DLCO 34.5 (29.5-40)% predicted were treated. Both dose schedules were well tolerated, with only minor and transient acute adverse effects. MSC infusion was associated with a transient 1% (0-2%) fall in SaO2 after 15 min, but no changes in haemodynamics. At 6 months, FVC, DLCO, 6MWD and CT fibrosis score were unchanged compared with baseline. There was no evidence of worsening fibrosis. CONCLUSIONS: Intravenous MSC administration is feasible and has a good short-term safety profile in patients with moderately severe IPF.
Abstract:
Background/Aim. Mesenchymal stromal cells (MSCs) have been utilised in many clinical trials as an experimental treatment in numerous clinical settings. Bone marrow remains the traditional source tissue for MSCs but is relatively hard to access in large volumes. Alternatively, MSCs may be derived from other tissues, including the placenta and adipose tissue. In an initial study, no obvious differences in parameters such as cell surface phenotype, chemokine receptor display, mesodermal differentiation capacity or immunosuppressive ability were detected when we compared human marrow-derived MSCs to human placenta-derived MSCs. The aim of this study was to establish and evaluate a protocol and related processes for the preparation of placenta-derived MSCs for early phase clinical trials. Methods. A full-term placenta was taken after delivery of the baby as a source of MSCs. Isolation, seeding, incubation and cryopreservation of human placenta-derived MSCs, as well as the production release criteria used, were in accordance with the complex regulatory requirements applicable to Code of Good Manufacturing Practice manufacturing of ex vivo expanded cells. Results. We established and evaluated instructions for the MSC preparation protocol and give an overview of its application in three clinical areas. In the first trial, MSCs were co-transplanted intravenously to a patient receiving an allogeneic cord blood transplant as therapy for treatment-refractory acute myeloid leukemia. In the second trial, MSCs were administered intravenously in the treatment of idiopathic pulmonary fibrosis, without serious adverse effects. In the third trial, MSCs were injected directly into the site of tendon damage under ultrasound guidance in the treatment of chronic refractory tendinopathy. Conclusion. Clinical trials using both allogeneic and autologous cells demonstrated MSCs to be safe. The described protocol for human placenta-derived MSCs is appropriate for use in a clinical setting, relatively inexpensive, and can be adjusted relatively easily to different sets of regulatory requirements applicable to early phase clinical trials.
Abstract:
In one-dimensional continuous space, a coupled directed continuous time random walk model is proposed, in which the random walker jumps toward one direction and the waiting time between jumps affects the subsequent jump. In the proposed model, the Laplace-Laplace transform of the probability density function P(x,t) of finding the walker at position x at time t is completely determined by the Laplace transform of the probability density function φ(t) of the waiting time. In terms of the probability density function of the waiting time in the Laplace domain, the limit distribution of the random process and the corresponding evolving equations are derived.
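The coupled, directed jump mechanism can be illustrated with a short simulation. This is a minimal sketch, not the paper's model: it assumes exponential waiting times and a jump length simply proportional to the preceding waiting time, as one concrete choice of coupling.

```python
import random

def simulate_coupled_ctrw(t_max, rate=1.0, coupling=1.0, rng=None):
    """Simulate one directed coupled CTRW path up to time t_max.

    Waiting times are exponential(rate); each jump is toward +x with
    length proportional to the waiting time that preceded it (an
    illustrative coupling, not the general phi(t) of the paper).
    Returns the walker's position x at time t_max.
    """
    rng = rng or random.Random()
    t, x = 0.0, 0.0
    while True:
        w = rng.expovariate(rate)      # waiting time before the next jump
        if t + w > t_max:
            return x                   # walker is still waiting at t_max
        t += w
        x += coupling * w              # directed jump, coupled to w

# Average over many walkers: for this coupling, E[x(t)] grows roughly
# linearly in t.
rng = random.Random(42)
n = 2000
mean_x = sum(simulate_coupled_ctrw(10.0, rng=rng) for _ in range(n)) / n
```

Because each jump equals the waiting time that preceded it, the final position is simply the epoch of the last completed jump, which makes the drift easy to check against the simulation.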
Abstract:
A method for the determination of tricyclazole in water using solid phase extraction (SPE) and high performance liquid chromatography (HPLC) with UV detection at 230 nm and a mobile phase of acetonitrile:water (20:80, v/v) was developed. A performance comparison between two types of solid phase sorbents, the C18 sorbent of the Supelclean ENVI-18 cartridge and the styrene-divinylbenzene copolymer sorbent of the Sep-Pak PS2-Plus cartridge, was conducted. The Sep-Pak PS2-Plus cartridges were found more suitable for extracting tricyclazole from water samples than the Supelclean ENVI-18 cartridges. For this cartridge, both methanol and ethyl acetate produced good results. The method was validated with good linearity and with a limit of detection of 0.008gL-1 for a 500-fold concentration through the SPE procedure. The recoveries of the method were stable at 80%, and the precision ranged from 1.1 to 6.0% within the range of fortified concentrations. The validated method was also applied to measure the concentrations of tricyclazole in real paddy water.
Abstract:
The growing interest in co-created reading experiences in both digital and print formats raises interesting questions for creative writers who work in the space of interactive fiction. This essay argues that writers have not abandoned experiments with co-creation in print narratives in favour of the attractions of the digital environment, as might be assumed by the discourse on digital development. Rather, interactive print narratives, in particular ‘reader-assembled narratives’, demonstrate a rich history of experimentation and continue to engage writers who wish to craft individual reading experiences for readers and to experiment with their own creative process as writers. The reader-assembled narrative has been used for many different reasons: for some writers, such as B.S. Johnson, it is a method of problem solving; for others, like Robert Coover, it is a way to engage the reader in a more playful sense. Authors such as Marc Saporta, B.S. Johnson and Robert Coover have engaged with this type of narrative play. This examination considers the narrative experimentation of these authors as a way of offering insights into creative practice for contemporary creative writers.
Abstract:
Submarine groundwater discharge (SGD) is an integral part of the hydrological cycle and represents an important aspect of land-ocean interactions. We used a numerical model to simulate flow and salt transport in a nearshore groundwater aquifer under varying wave conditions based on yearlong random wave data sets, including storm surge events. The results showed significant flow asymmetry, with rapid response of influxes and retarded response of effluxes across the seabed to the irregular wave conditions. While a storm surge immediately intensified seawater influx to the aquifer, the subsequent return of intruded seawater to the sea, as part of an increased SGD, was gradual. Using functional data analysis, we revealed and quantified retarded, cumulative effects of past wave conditions on SGD, including the fresh groundwater and recirculating seawater discharge components. The retardation was characterized well by a gamma distribution function regardless of wave conditions. The relationships between discharge rates and wave parameters were quantifiable by a regression model in a functional form independent of the actual irregular wave conditions. This statistical model provides a useful method for analyzing and predicting SGD from nearshore unconfined aquifers affected by random waves.
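The gamma-kernel characterization of retarded, cumulative wave effects can be sketched as a discrete convolution of a wave-forcing series with a gamma memory kernel. The shape and scale values below are illustrative assumptions, not values fitted in the study.

```python
import math

def gamma_pdf(t, shape, scale):
    """Gamma probability density for t >= 0."""
    if t < 0:
        return 0.0
    return (t ** (shape - 1) * math.exp(-t / scale)
            / (math.gamma(shape) * scale ** shape))

def lagged_response(forcing, shape=2.0, scale=3.0, dt=1.0):
    """Convolve a forcing series (e.g. wave height) with a gamma
    memory kernel: response[n] = sum_k forcing[n-k] * g(k*dt) * dt.
    Shape and scale are illustrative, not values from the paper."""
    n = len(forcing)
    kernel = [gamma_pdf(k * dt, shape, scale) * dt for k in range(n)]
    return [sum(forcing[i - k] * kernel[k] for k in range(i + 1))
            for i in range(n)]

# A step increase in forcing produces a delayed, smoothed response
resp = lagged_response([0.0] * 5 + [1.0] * 45)
```

The delayed rise of `resp` after the step mirrors the retarded SGD response to a storm surge described in the abstract: the influx is immediate, but the discharge builds gradually as the kernel accumulates past forcing.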
Abstract:
The goal of this article is to provide a new design framework and its corresponding estimation for phase I trials. Existing phase I designs assign each subject to one dose level based on responses from previous subjects. Yet it is possible that subjects with neither toxicity nor efficacy responses can be treated at higher dose levels, and their subsequent responses to higher doses will provide more information. In addition, for some trials, it might be possible to obtain multiple responses (repeated measures) from a subject at different dose levels. In this article, a nonparametric estimation method is developed for such studies. We also explore how the designs of multiple doses per subject can be implemented to improve design efficiency. The gain of efficiency from "single dose per subject" to "multiple doses per subject" is evaluated for several scenarios. Our numerical study shows that using "multiple doses per subject" and the proposed estimation method together increases the efficiency substantially.
Abstract:
The effects of fish density distribution and effort distribution on the overall catchability coefficient are examined. Emphasis is also placed on how aggregation and effort distribution interact to affect the overall catch rate [catch per unit effort (cpue)]. In particular, it is proposed to evaluate three indices, the catchability index, the knowledge parameter and the aggregation index, to describe the effectiveness of targeting and the effects on overall catchability in the stock area. Analytical expressions are provided so that these indices can easily be calculated. The average of the cpue calculated from small units where fishing is random is a better index for measuring stock abundance. The overall cpue, the ratio of lumped catch to lumped effort, together with the average cpue, can be used to assess the effectiveness of targeting. The proposed methods are applied to commercial catch and effort data from the Australian northern prawn fishery. The indices are obtained assuming a power law for the effort distribution as an approximation of targeting during the fishing operation. Targeting increased catchability in some areas by 10%, which may have important implications for management advice.
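The distinction between the overall cpue (lumped catch over lumped effort) and the average of per-unit cpue can be shown in a few lines; the toy catch and effort figures are invented for illustration and do not come from the fishery data.

```python
def overall_cpue(catches, efforts):
    """Ratio of lumped catch to lumped effort across spatial units."""
    return sum(catches) / sum(efforts)

def average_cpue(catches, efforts):
    """Mean of per-unit cpue; a better abundance index when fishing
    within each small unit is effectively random."""
    return sum(c / e for c, e in zip(catches, efforts)) / len(catches)

# Targeted effort concentrates where density (cpue) is high, so the
# overall cpue exceeds the average cpue; their ratio is a simple
# indicator of targeting effectiveness.
catches = [10.0, 80.0, 10.0]   # catch per spatial unit
efforts = [10.0, 40.0, 10.0]   # effort per spatial unit (more where cpue is high)
targeting_index = overall_cpue(catches, efforts) / average_cpue(catches, efforts)
```

Here the middle unit receives disproportionate effort and yields a higher cpue, so the ratio comes out above 1; with effort spread at random, the two statistics would coincide in expectation and the ratio would be near 1.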
Abstract:
A decision-theoretic framework is proposed for designing sequential dose-finding trials with multiple outcomes. The optimal strategy is solvable theoretically via backward induction. However, for dose-finding studies involving k doses, the computational complexity is the same as that of the bandit problem with k dependent arms, which is computationally prohibitive. We therefore provide two computationally compromised strategies, which are of practical interest as the computational complexity is greatly reduced: one is closely related to the continual reassessment method (CRM), and the other improves on CRM and approximates the optimal strategy better. In particular, we present the framework for phase I/II trials with multiple outcomes. Applications to a pediatric HIV trial and a cancer chemotherapy trial are given to illustrate the proposed approach. Simulation results for the two trials show that the computationally compromised strategies can perform well and appear to be ethical for allocating patients. The proposed framework can provide a better approximation to the optimal strategy if more extensive computing is available.
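As a concrete point of reference, the CRM that the compromised strategies relate to can be sketched with a one-parameter power model and a grid posterior; the skeleton, prior and target rate below are assumptions for illustration, not the paper's settings.

```python
import math

def crm_next_dose(skeleton, doses_given, tox_observed, target=0.25):
    """One-parameter power-model CRM sketch: p_i(a) = skeleton[i] ** exp(a).
    The posterior over a is computed on a grid with a N(0,1) prior; the
    next dose is the one whose posterior-mean toxicity is closest to the
    target rate."""
    grid = [-3.0 + 6.0 * k / 200 for k in range(201)]
    prior = [math.exp(-a * a / 2) for a in grid]
    post = []
    for a, pr in zip(grid, prior):
        like = pr
        # Likelihood of each observed (dose index, toxicity 0/1) pair at a
        for d, y in zip(doses_given, tox_observed):
            p = skeleton[d] ** math.exp(a)
            like *= p if y else (1.0 - p)
        post.append(like)
    z = sum(post)
    post = [w / z for w in post]
    # Posterior-mean toxicity at each dose level
    p_hat = [sum(w * (s ** math.exp(a)) for a, w in zip(grid, post))
             for s in skeleton]
    return min(range(len(skeleton)), key=lambda i: abs(p_hat[i] - target))

skeleton = [0.05, 0.10, 0.20, 0.35, 0.50]   # assumed prior toxicity guesses
next_d = crm_next_dose(skeleton, doses_given=[0, 1, 2], tox_observed=[0, 0, 1])
```

After one toxicity at the third dose level, the posterior shifts toward higher toxicity and the recommendation stays at or below that level rather than escalating.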
Abstract:
Stallard (1998, Biometrics 54, 279-294) recently used Bayesian decision theory for sample-size determination in phase II trials. His design maximizes the expected financial gains in the development of a new treatment. However, it results in a very high probability (0.65) of recommending an ineffective treatment for phase III testing. On the other hand, the expected gain using his design is more than 10 times that of a design that tightly controls the false positive error (Thall and Simon, 1994, Biometrics 50, 337-349). Stallard's design maximizes the expected gain per phase II trial, but it does not maximize the rate of gain or total gain for a fixed length of time, because the rate of gain depends on the proportion of treatments forwarded to the phase III study. We suggest maximizing the rate of gain, and the resulting optimal one-stage design is twice as efficient as Stallard's one-stage design. Furthermore, the new design has a probability of only 0.12 of passing an ineffective treatment to the phase III study.
Abstract:
The purpose of a phase I trial in cancer is to determine the level (dose) of the treatment under study that has an acceptable level of adverse effects. Although substantial progress has recently been made in this area using parametric approaches, the method that is widely used is based on treating small cohorts of patients at escalating doses until the frequency of toxicities seen at a dose exceeds a predefined tolerable toxicity rate. This method is popular because of its simplicity and freedom from parametric assumptions. In this paper, we consider cases in which it is undesirable to assume a parametric dose-toxicity relationship. We propose a simple model-free approach by modifying the method that is in common use. The approach assumes that toxicity is nondecreasing with dose and fits an isotonic regression to the accumulated data. At any point in a trial, the dose given is the one whose estimated toxicity is deemed closest to the maximum tolerable toxicity. Simulations indicate that this approach performs substantially better than the commonly used method, and it compares favorably with other phase I designs.
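A minimal sketch of the isotonic approach described above, using the pool-adjacent-violators algorithm (PAVA) for the monotone fit; the observed toxicity counts and the target rate are illustrative, not from the paper.

```python
def pava(rates, weights):
    """Pool-adjacent-violators: weighted isotonic (non-decreasing) fit."""
    # Each block tracks [value, weight, count of original points]
    blocks = [[v, w, 1] for v, w in zip(rates, weights)]
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] > blocks[i + 1][0]:          # violation: pool blocks
            v1, w1, n1 = blocks[i]
            v2, w2, n2 = blocks[i + 1]
            pooled = (v1 * w1 + v2 * w2) / (w1 + w2)
            blocks[i:i + 2] = [[pooled, w1 + w2, n1 + n2]]
            i = max(i - 1, 0)                        # recheck to the left
        else:
            i += 1
    out = []
    for v, w, n in blocks:
        out.extend([v] * n)
    return out

def next_dose(toxicities, patients, target=0.30):
    """Dose whose isotonic toxicity estimate is closest to the target."""
    rates = [t / n if n else 0.0 for t, n in zip(toxicities, patients)]
    iso = pava(rates, patients)
    return min(range(len(iso)), key=lambda i: abs(iso[i] - target))

# Observed: 0/3, 1/3, 1/6, 2/3 toxicities at four escalating doses
d = next_dose([0, 1, 1, 2], [3, 3, 6, 3], target=0.30)
```

PAVA pools the second and third doses (whose raw rates violate monotonicity) into a common estimate, and the recommended dose is the one with pooled estimate nearest the 30% target.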
Abstract:
The paper studies stochastic approximation as a technique for bias reduction. The proposed method does not require approximating the bias explicitly, nor does it rely on having independent identically distributed (i.i.d.) data. The method always removes the leading bias term, under very mild conditions, as long as auxiliary samples from distributions with given parameters are available. Expectation and variance of the bias-corrected estimate are given. Examples in sequential clinical trials (non-i.i.d. case), curved exponential models (i.i.d. case) and length-biased sampling (where the estimates are inconsistent) are used to illustrate the applications of the proposed method and its small sample properties.
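One way to realize bias removal without an explicit bias formula, in the spirit of the abstract, is a Robbins-Monro-type recursion that matches the observed estimate against estimates recomputed on auxiliary samples drawn at the current parameter value. The normal-variance example and all tuning constants below are illustrative assumptions, not the paper's scheme.

```python
import random

def mle_var(xs):
    """Biased MLE of a normal variance (divides by n, not n-1)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def sa_bias_correct(est_obs, n, n_iter=300, reps=100, seed=7):
    """Stochastic-approximation bias correction sketch: find theta with
    E_theta[estimator] = est_obs, using auxiliary samples drawn at the
    current theta (a normal model is assumed for illustration)."""
    rng = random.Random(seed)
    theta = est_obs
    for k in range(1, n_iter + 1):
        sims = [mle_var([rng.gauss(0.0, theta ** 0.5) for _ in range(n)])
                for _ in range(reps)]
        theta += (2.0 / k) * (est_obs - sum(sims) / reps)
        theta = max(theta, 1e-6)        # keep the variance positive
    return theta

# For the normal-variance MLE, E_theta[v] = theta * (n-1) / n, so the
# corrected value should approach est_obs * n / (n-1).
corrected = sa_bias_correct(est_obs=4.0, n=10)
```

The recursion only needs the ability to simulate at a given parameter value, matching the abstract's requirement that auxiliary samples from distributions with given parameters be available.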
Abstract:
This paper addresses the following predictive business process monitoring problem: Given the execution trace of an ongoing case, and given a set of traces of historical (completed) cases, predict the most likely outcome of the ongoing case. In this context, a trace refers to a sequence of events with corresponding payloads, where a payload consists of a set of attribute-value pairs. Meanwhile, an outcome refers to a label associated with completed cases, for example, a label indicating that a given case completed "on time" (with respect to a given desired duration) or "late", or a label indicating that a given case led to a customer complaint or not. The paper tackles this problem via a two-phase approach. In the first phase, prefixes of historical cases are encoded using complex symbolic sequences and clustered. In the second phase, a classifier is built for each of the clusters. To predict the outcome of an ongoing case at runtime given its (uncompleted) trace, we select the closest cluster(s) to the trace in question and apply the respective classifier(s), taking into account the Euclidean distance of the trace from the center of the clusters. We consider two families of clustering algorithms – hierarchical clustering and k-medoids – and use random forests for classification. The approach was evaluated on four real-life datasets.
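The runtime step of the two-phase approach can be sketched briefly. Here the cluster centers are assumed to be given, traces are already encoded as numeric vectors, and a majority-vote stub stands in for the random forest classifiers; all data are invented for illustration.

```python
import math

def euclid(a, b):
    """Euclidean distance between two encoded traces."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def predict_outcome(trace_vec, clusters):
    """Runtime step of the two-phase approach: pick the cluster whose
    center is closest (Euclidean) to the encoded ongoing trace, then
    apply that cluster's classifier. Each classifier here is a
    majority-vote stub standing in for the paper's random forests."""
    center, labels = min(clusters, key=lambda c: euclid(trace_vec, c[0]))
    return max(set(labels), key=labels.count)

# Toy clusters: (cluster center, outcome labels of its member cases)
clusters = [
    ((0.0, 0.0), ["on time", "on time", "late"]),
    ((5.0, 5.0), ["late", "late", "on time"]),
]
outcome = predict_outcome((4.5, 4.8), clusters)
```

An ongoing trace encoded near the second center inherits that cluster's dominant label; with a trained random forest per cluster, the same selection logic would feed the encoded prefix to the chosen forest instead of taking a majority vote.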
Abstract:
Single layered transition metal dichalcogenides have attracted tremendous research interest due to their structural phase diversity. By using a global optimization approach, we have discovered a new phase of transition metal dichalcogenides (labelled as T′′), which is confirmed to be energetically, dynamically and kinetically stable by our first-principles calculations. The new T′′ MoS2 phase exhibits an intrinsic quantum spin Hall (QSH) effect with a nontrivial gap as large as 0.42 eV, suggesting that a two-dimensional (2D) topological insulator can be achieved at room temperature. Most interestingly, there is a topological phase transition simply driven by a small tensile strain of up to 2%. Furthermore, all the known MX2 (M = Mo or W; X = S, Se or Te) monolayers in the new T′′ phase unambiguously display similar band topologies and strain controlled topological phase transitions. Our findings greatly enrich the 2D families of transition metal dichalcogenides and offer a feasible way to control the electronic states of 2D topological insulators for the fabrication of high-speed spintronics devices.
Abstract:
Random walk models are often used to interpret experimental observations of the motion of biological cells and molecules. A key aim in applying a random walk model to mimic an in vitro experiment is to estimate the Fickian diffusivity (or Fickian diffusion coefficient), D. However, many in vivo experiments are complicated by the fact that the motion of cells and molecules is hindered by the presence of obstacles. Crowded transport processes have been modeled using repeated stochastic simulations in which a motile agent undergoes a random walk on a lattice that is populated by immobile obstacles. Early studies considered the most straightforward case in which the motile agent and the obstacles are the same size. More recent studies considered stochastic random walk simulations describing the motion of an agent through an environment populated by obstacles of different shapes and sizes. Here, we build on previous simulation studies by analyzing a general class of lattice-based random walk models with agents and obstacles of various shapes and sizes. Our analysis provides exact calculations of the Fickian diffusivity, allowing us to draw conclusions about the role of the size, shape and density of the obstacles, as well as examining the role of the size and shape of the motile agent. Since our analysis is exact, we calculate D directly without the need for random walk simulations. In summary, we find that the shape, size and density of obstacles have a major influence on the exact Fickian diffusivity. Furthermore, our results indicate that the difference in diffusivity for symmetric and asymmetric obstacles is significant.
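Although the paper computes the Fickian diffusivity exactly, the stochastic simulations it builds on are easy to sketch for the simplest case of point-sized agents and obstacles. The lattice size, obstacle density and walker counts below are arbitrary choices for illustration.

```python
import random

def estimate_diffusivity(density, steps=200, walkers=500, size=40, seed=1):
    """Simulate a random walk on a 2D periodic lattice with immobile
    point obstacles at the given density; estimate the Fickian
    diffusivity D from the mean squared displacement, MSD ~ 4*D*t
    (lattice spacing and time step taken as 1). This stochastic
    estimate illustrates what the paper's exact analysis replaces."""
    rng = random.Random(seed)
    obstacles = {(i, j) for i in range(size) for j in range(size)
                 if rng.random() < density}
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    msd = 0.0
    for _ in range(walkers):
        while True:                      # start each walker on an empty site
            x0 = (rng.randrange(size), rng.randrange(size))
            if x0 not in obstacles:
                break
        x, y = x0
        dx = dy = 0                      # unwrapped displacement
        for _ in range(steps):
            mx, my = rng.choice(moves)
            nx, ny = (x + mx) % size, (y + my) % size
            if (nx, ny) not in obstacles:  # moves into obstacles are aborted
                x, y = nx, ny
                dx, dy = dx + mx, dy + my
        msd += dx * dx + dy * dy
    msd /= walkers
    return msd / (4.0 * steps)

d_free = estimate_diffusivity(0.0)      # no obstacles: D should be near 1/4
d_crowded = estimate_diffusivity(0.3)   # 30% obstacles hinder transport
```

On an empty lattice the unit-step walk gives D = 1/4 exactly in expectation, and adding obstacles pulls the estimate down, which is the crowding effect the paper quantifies exactly for general obstacle shapes and sizes.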