Abstract:
Experimental results of flow around a circular cylinder with moving surface boundary-layer control (MSBC) are presented. Two small rotating cylinders, strategically located, inject momentum into the boundary layer of the main cylinder, delaying boundary-layer separation. As a consequence, the wake becomes narrower and the fluctuating transverse velocity is reduced, resulting in a recirculation-free region that prevents vortex formation. The control parameter is the ratio of the tangential velocity of the moving surface to the flow velocity (Uc/U). The main advantage of the MSBC is the possibility of combining the suppression of vortex-induced vibration (VIV) with drag reduction. The experimental tests are performed in a circulating water channel facility, and the circular cylinders are mounted on a low-damping air bearing base with one degree of freedom in the direction transverse to the channel flow. The mass ratio is 1.8. The Reynolds number ranges from 1600 to 7500, the reduced velocity varies up to 17, and the control parameter interval is Uc/U = 5-10. A significant decrease in the maximum amplitude of oscillation is observed for the cylinder with MSBC. Drag measurements are obtained for statically mounted cylinders with and without MSBC. The flow control yields a mean drag reduction at Uc/U = 5 of almost 60% compared to the plain cylinder. PIV velocity fields of the wake of static cylinders are measured at Re = 3000. The results show that the wake is highly organized and narrower compared to that of cylinders without control. The calculation of the total variance of the fluctuating transverse velocity in the wake region allows the introduction of an active closed-loop control. The experimental results are in good agreement with the numerical simulation studies conducted by other researchers for cylinders with MSBC. (C) 2012 Elsevier Ltd. All rights reserved.
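The two dimensionless quantities the abstract works with, the Reynolds number U*D/nu and the control parameter Uc/U, can be computed directly. The sketch below uses a hypothetical geometry (the flow speed and cylinder diameters are assumed, not taken from the study); only the Uc/U = 5 setting and the tested Re range come from the abstract.

```python
import math

def reynolds(u_flow, d_main, nu=1.0e-6):
    """Reynolds number U*D/nu; nu defaults to water, ~1e-6 m^2/s."""
    return u_flow * d_main / nu

def required_rpm(u_flow, uc_over_u, d_control):
    """Rotation rate (rev/min) a control cylinder of diameter d_control [m]
    needs so that its surface speed equals uc_over_u times the flow speed."""
    u_c = uc_over_u * u_flow              # target tangential surface speed [m/s]
    omega = u_c / (d_control / 2.0)       # angular speed [rad/s]
    return omega * 60.0 / (2.0 * math.pi)

# Hypothetical setup: 0.1 m/s channel flow, 30 mm main cylinder,
# 6 mm control cylinders, Uc/U = 5 as in the abstract.
print(round(reynolds(0.1, 0.03)))        # -> 3000, inside the tested range
print(round(required_rpm(0.1, 5.0, 0.006)))
```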
Abstract:
A low-protein, high-carbohydrate (LPHC) diet for 15 days increased the lipid content in the carcass and adipose tissues of rats. The aim of this work was to investigate the mechanisms of this lipid increase in the retroperitoneal white adipose tissue (RWAT) of these animals. The LPHC diet induced approximately two- and tenfold increases in serum corticosterone and TNF-alpha, respectively. The rate of de novo fatty acid (FA) synthesis in vivo was reduced (50%) in LPHC rats, and lipoprotein lipase activity increased (100%). In addition, glycerokinase activity increased (60%), and the phosphoenolpyruvate carboxykinase content decreased (27%). Basal [U-14C]-glucose incorporation into glycerol-triacylglycerol did not differ between the groups; however, in the presence of insulin, [U-14C]-glucose incorporation increased by 124% only in adipocytes from control rats. The reductions in IRS1 and AKT content, as well as in AKT phosphorylation, in the RWAT of LPHC rats, together with the absence of an insulin response, suggest that these adipocytes have reduced insulin sensitivity. The 45% increase in NE turnover and the lack of a lipolytic response to NE in adipocytes from LPHC rats imply catecholamine resistance. The data reveal that the increased fat storage in the RWAT of LPHC rats results from increased FA uptake from circulating lipoproteins and increased glycerol phosphorylation, accompanied by impaired NE-activated lipolysis.
Abstract:
We report dramatic sensitivity enhancements in multidimensional MAS NMR spectra through the use of nonuniform sampling (NUS) and introduce maximum entropy interpolation (MINT) processing, which assures linearity between the time and frequency domains of the NUS-acquired data sets. A systematic analysis of sensitivity and resolution in 2D and 3D NUS spectra reveals that with NUS, at least 1.5- to 2-fold sensitivity enhancement can be attained in each indirect dimension without compromising the spectral resolution. These enhancements are similar to or higher than those attained by the newest-generation commercial cryogenic probes. We explore the benefits of this NUS/MaxEnt approach in proteins and protein assemblies using 1-73-(U-13C,15N)/74-108-(U-15N) Escherichia coli thioredoxin reassembly. We demonstrate that in thioredoxin reassembly, NUS permits acquisition of high-quality 3D NCACX spectra, which are inaccessible with conventional sampling due to prohibitively long experiment times. Of critical importance, issues that hinder NUS-based SNR enhancement in 3D NMR of liquids are mitigated in the study of solid samples, in which theoretical enhancements on the order of 3- to 4-fold are accessible by compounding the NUS-based SNR enhancement of each indirect dimension. NUS/MINT is anticipated to be widely applicable and advantageous for multidimensional heteronuclear MAS NMR spectroscopy of proteins, protein assemblies, and other biological systems.
Abstract:
Occupational diisocyanate-induced extrinsic allergic alveolitis (EAA) is a rare and probably underestimated diagnosis. Two acute occupational EAA cases have been described in this context, but neither of them concerned hexamethylene diisocyanate (HDI) exposure.
Abstract:
Recently, we demonstrated that considerable inherent sensitivity gains are attained in MAS NMR spectra acquired by nonuniform sampling (NUS) and introduced maximum entropy interpolation (MINT) processing, which assures the linearity of the transformation between the time and frequency domains. In this report, we examine the utility of the NUS/MINT approach in multidimensional datasets possessing high dynamic range, such as homonuclear 13C-13C correlation spectra. We demonstrate on model compounds and on 1-73-(U-13C,15N)/74-108-(U-15N) E. coli thioredoxin reassembly that, with appropriately constructed 50% NUS schedules, inherent sensitivity gains of 1.7- to 2.1-fold are readily reached in such datasets. We show that both linearity and line width are retained under these experimental conditions throughout the entire dynamic range of the signals. Furthermore, we demonstrate that the reproducibility of the peak intensities is excellent in the NUS/MINT approach when experiments are repeated multiple times under identical experimental and processing conditions. Finally, we discuss the principles for the design and implementation of random exponentially biased NUS sampling schedules for homonuclear 13C-13C MAS correlation experiments that yield high-quality artifact-free datasets.
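The closing sentence concerns the construction of exponentially biased NUS schedules. A minimal sketch of one common construction is below: select a fixed fraction of the indirect-dimension increments at random, without replacement, with weights decaying as exp(-t/T2) so that early, high-signal evolution times are favored. The parameter names and the specific weighting scheme are illustrative assumptions, not the authors' actual schedule generator.

```python
import math
import random

def exp_biased_schedule(n_points, fraction=0.5, t2_fraction=0.5, seed=0):
    """Choose ~fraction*n_points indirect-dimension increments at random,
    biased toward early evolution times where the (roughly exp(-t/T2))
    signal is strongest.  Weighted sampling without replacement uses
    Efraimidis-Spirakis keys u**(1/w)."""
    rng = random.Random(seed)
    k = max(1, round(fraction * n_points))
    tau = t2_fraction * n_points              # decay constant, in increments
    keyed = []
    for i in range(n_points):
        w = math.exp(-i / tau)                # exponential bias weight
        keyed.append((rng.random() ** (1.0 / w), i))
    keyed.sort(reverse=True)                  # largest keys are selected
    return sorted(i for _, i in keyed[:k])

schedule = exp_biased_schedule(64)            # 50% NUS, as in the abstract
# more points land in the early half of the evolution period than the late half
early = sum(1 for i in schedule if i < 32)
```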
Abstract:
In natural history studies of chronic disease, it is of interest to understand the evolution of key variables that measure aspects of disease progression. This is particularly true for immunological variables in persons infected with the Human Immunodeficiency Virus (HIV). The natural timescale for such studies is time since infection. However, most data available for analysis arise from prevalent cohorts, where the date of infection is unknown for most or all individuals. As a result, standard curve fitting algorithms are not immediately applicable. Here we propose two methods to circumvent this difficulty. The first uses repeated measurement data to provide information not only on the level of the variable of interest, but also on its rate of change, while the second uses an estimate of the expected time since infection. Both methods are based on the principal curves algorithm of Hastie and Stuetzle, and are applied to data from a prevalent cohort of HIV-infected homosexual men, giving estimates of the average pattern of CD4+ lymphocyte decline. These methods are applicable to natural history studies using data from prevalent cohorts where the time of disease origin is uncertain, provided certain ancillary information is available from external sources.
Abstract:
In biostatistical applications interest often focuses on the estimation of the distribution of a time-until-event variable T. If one observes whether or not T exceeds an observed monitoring time at a random number of monitoring times, then the data structure is called interval censored data. We extend this data structure by allowing the presence of a possibly time-dependent covariate process that is observed until the end of follow-up. If one only assumes that the censoring mechanism satisfies coarsening at random, then, by the curse of dimensionality, typically no regular estimators will exist. To fight the curse of dimensionality we follow the approach of Robins and Rotnitzky (1992) by modeling parameters of the censoring mechanism. We model the right-censoring mechanism by modeling the hazard of the follow-up time, conditional on T and the covariate process. For the monitoring mechanism we avoid modeling the joint distribution of the monitoring times by modeling only a univariate hazard of the pooled monitoring times, conditional on the follow-up time T and the covariate process, which can be estimated by treating the pooled sample of monitoring times as i.i.d. In particular, it is assumed that the monitoring times and the right-censoring times depend on T only through the observed covariate process. We introduce inverse probability of censoring weighted (IPCW) estimators of the distribution of T and of smooth functionals thereof, which are guaranteed to be consistent and asymptotically normal if we have available correctly specified semiparametric models for the two hazards of the censoring process.
Furthermore, given such correctly specified models for these hazards of the censoring process, we propose a one-step estimator that improves on the IPCW estimator if we correctly specify a lower-dimensional working model for the conditional distribution of T given the covariate process, and that remains consistent and asymptotically normal if this latter working model is misspecified. It is shown that the one-step estimator is efficient if each subject is monitored at most once and the working model contains the truth. In general, it is shown that the one-step estimator optimally uses the surrogate information if the working model contains the truth. It is not optimal in using the interval information provided by the current status indicators at the monitoring times, but simulations in Peterson and van der Laan (1997) show that the efficiency loss is small.
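A toy illustration of the IPCW idea in a simpler setting than the one treated here (ordinary right-censored data, with a Kaplan-Meier rather than semiparametric model for the censoring mechanism): each observed failure at T_i <= t is reweighted by 1/G(T_i-), the inverse of the estimated probability of remaining uncensored. Everything below is an illustrative sketch, not the estimator of the paper.

```python
def km(times, events):
    """Kaplan-Meier estimate; returns a step function u -> S(u).
    events[i] is 1 if times[i] is an observed event, 0 if censored."""
    pts = sorted(set(t for t, e in zip(times, events) if e))
    steps, s = [], 1.0
    for t in pts:
        at_risk = sum(1 for x in times if x >= t)
        d = sum(1 for x, e in zip(times, events) if e and x == t)
        s *= 1.0 - d / at_risk
        steps.append((t, s))
    def S(u):
        out = 1.0
        for t, s_t in steps:
            if t <= u:
                out = s_t
        return out
    return S

def ipcw_cdf(times, events, t):
    """IPCW estimate of F(t) = P(T <= t) under right censoring: each
    observed failure T_i <= t gets weight 1/G(T_i-), where G is the
    Kaplan-Meier curve of the censoring distribution."""
    G = km(times, [1 - e for e in events])    # censoring survival curve
    n = len(times)
    return sum(1.0 / G(x - 1e-9) for x, e in zip(times, events)
               if e and x <= t) / n

# With no censoring every weight is 1 and the empirical CDF is recovered.
```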
Abstract:
Common goals in epidemiologic studies of infectious diseases include identification of the infectious agent, description of the modes of transmission, and characterization of factors that influence the probability of transmission from infected to uninfected individuals. In the case of AIDS, the agent has been identified as the Human Immunodeficiency Virus (HIV), and transmission is known to occur through a variety of contact mechanisms, including unprotected sexual intercourse, transfusion of infected blood products and sharing of needles in intravenous drug use. Relatively little is known about the probability of HIV transmission associated with the various modes of contact, or the role that other cofactors play in promoting or suppressing transmission. Here, transmission probability refers to the probability that the virus is transmitted to a susceptible individual following exposure consisting of a series of potentially infectious contacts. The infectivity of HIV for a given route of transmission is defined to be the per-contact probability of infection. Knowledge of infectivity and its relationship to other factors is important in understanding the dynamics of the AIDS epidemic and in suggesting appropriate measures to control its spread. The primary source of empirical data about infectivity comes from sexual partners of infected individuals. Partner studies consist of a series of such partnerships, usually heterosexual and monogamous, each composed of an initially infected "index case" and a partner who may or may not be infected by the time of data collection. However, because the infection times of both partners may be unknown and the history of contacts uncertain, any quantitative characterization of infectivity is extremely difficult. Thus, most statistical analyses of partner study data involve the simplifying assumption that infectivity is a constant common to all partnerships.
The major objectives of this work are to describe and discuss the design and analysis of partner studies, providing a general statistical framework for investigations of infectivity and risk factors for HIV transmission. The development is largely based on three papers: Jewell and Shiboski (1990), Kim and Lagakos (1990), and Shiboski and Jewell (1992).
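Under the constant-infectivity assumption described above, a partner escapes infection in each contact independently, so after k contacts with per-contact infectivity lambda the infection probability is 1 - (1 - lambda)**k. The sketch below fits lambda to partner-study-style data by grid-search maximum likelihood; the data values are invented for illustration only.

```python
import math

def infection_prob(lam, k):
    """P(partner infected after k contacts), constant per-contact
    infectivity lam: 1 - (1 - lam)**k."""
    return 1.0 - (1.0 - lam) ** k

def log_likelihood(lam, partnerships):
    """Log likelihood for partner-study data given as (k, infected 0/1)."""
    ll = 0.0
    for k, infected in partnerships:
        if infected:
            ll += math.log(infection_prob(lam, k))
        else:
            ll += k * math.log(1.0 - lam)     # escaped all k contacts
    return ll

# invented toy data: (number of contacts, partner infected?)
data = [(100, 1), (50, 0), (200, 1), (10, 0), (150, 1)]
grid = [i / 1000.0 for i in range(1, 100)]    # lambda in (0, 0.1)
lam_hat = max(grid, key=lambda l: log_likelihood(l, data))
```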
Abstract:
A large number of proposals for estimating the bivariate survival function under random censoring have been made. In this paper we discuss nonparametric maximum likelihood estimation and the bivariate Kaplan-Meier estimator of Dabrowska. We show how these estimators are computed, present their intuitive background, and compare their practical performance under different levels of dependence and censoring, based on extensive simulation results, leading to practical advice.
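For orientation, the target of all such estimators is the bivariate survival function S(s, t) = P(T1 > s, T2 > t). The sketch below is only the no-censoring empirical baseline to which any censored-data estimator (including Dabrowska's, which corrects the product of the marginal Kaplan-Meier curves with a dependence term) should reduce; it is not Dabrowska's estimator itself.

```python
def biv_survival(pairs):
    """Empirical bivariate survival function S(s, t) = P(T1 > s, T2 > t)
    for fully observed (uncensored) pairs of event times."""
    n = len(pairs)
    return lambda s, t: sum(1 for a, b in pairs if a > s and b > t) / n

# toy fully observed pairs of event times
S = biv_survival([(1, 1), (2, 3), (3, 2), (4, 4)])
print(S(0, 0))      # -> 1.0
print(S(1.5, 1.5))  # -> 0.75
```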
Abstract:
In Malani and Neilsen (1992) we proposed alternative estimates of the survival function (for time to disease) using a simple marker that describes time to some intermediate stage in a disease process. In this paper we derive the asymptotic variance of one such proposed estimator using two different methods and compare terms of order 1/n when there is no censoring. In the absence of censoring, the asymptotic variance obtained using the Greenwood-type approach converges to the exact variance up to terms involving 1/n. But the asymptotic variance obtained using counting process theory and results from Voelkel and Crowley (1984) on semi-Markov processes has a different term of order 1/n. It is not clear to us at this point why the variance formulae using the latter approach give different results.
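The no-censoring behavior of the Greenwood-type approach mentioned above can be checked numerically in the classical survival setting: with fully observed data, Greenwood's formula S(t)^2 * sum d_i / (n_i (n_i - d_i)) collapses to the binomial variance S(t)(1 - S(t))/n. A small self-contained check on toy data (not data from the paper):

```python
def km_greenwood(times, t):
    """Kaplan-Meier S(t) and its Greenwood variance, for a sample of
    fully observed (uncensored) event times."""
    n = len(times)
    s, var_sum = 1.0, 0.0
    for u in sorted(set(x for x in times if x <= t)):
        at_risk = sum(1 for x in times if x >= u)   # n_i
        d = sum(1 for x in times if x == u)         # d_i
        s *= 1.0 - d / at_risk
        var_sum += d / (at_risk * (at_risk - d))
    return s, s * s * var_sum

times = [1, 2, 3, 4, 5, 6, 7, 8]      # toy sample, no censoring
s, v = km_greenwood(times, 4.5)
# with nothing censored, v equals the binomial variance s*(1-s)/n
```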
Abstract:
Jewell and Kalbfleisch (1992) consider the use of marker processes for applications related to estimation of the survival distribution of time to failure. Marker processes were assumed to be stochastic processes that, at a given point in time, provide information about the current hazard and consequently on the remaining time to failure. Particular attention was paid to calculations based on a simple additive model for the relationship between the hazard function at time t and the history of the marker process up until time t. Specific applications to the analysis of AIDS data included the use of markers as surrogate responses for onset of AIDS with censored data and as predictors of the time elapsed since infection in prevalent individuals. Here we review recent work on the use of marker data to tackle these kinds of problems with AIDS data. The Poisson marker process with an additive model, introduced in Jewell and Kalbfleisch (1992), may be a useful "test" example for comparison of various procedures.
Abstract:
In biostatistical applications, interest often focuses on the estimation of the distribution of the time T between two consecutive events. If the initial event time is observed and the subsequent event time is only known to be larger or smaller than an observed monitoring time C, then the data are described by the well-known singly censored current status model, also known as interval censored data, case I. We extend this current status model by allowing the presence of a time-dependent covariate process that is partly observed, and by allowing the monitoring time C to depend on T through the observed part of this process. Because of the high dimension of the covariate process, no globally efficient estimators exist with good practical performance at moderate sample sizes. We follow the approach of Robins and Rotnitzky (1992) by modeling the censoring variable, given the time variable and the covariate process, i.e., the missingness process, under the restriction that it satisfies coarsening at random. We propose a generalization of the simple current status estimator of the distribution of T and of smooth functionals of this distribution, which is based on an estimate of the missingness process. In this estimator the covariates enter only through the estimate of the missingness process. Due to the coarsening at random assumption, the estimator has the interesting property that estimating the missingness process more nonparametrically improves its efficiency. We show that by local estimation of an optimal model or optimal function of the covariates for the missingness process, the generalized current status estimator for smooth functionals becomes locally efficient, meaning that it is efficient if the right model or covariate is consistently estimated, and that it remains consistent and asymptotically normal in general. Estimation of the optimal model requires estimation of the conditional distribution of T given the covariates.
Any (prior) knowledge of this conditional distribution can be used at this stage without any risk of losing root-n consistency. We also propose locally efficient one step estimators. Finally, we show some simulation results.
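For orientation, the "simple current status estimator" being generalized has a closed nonparametric form: the NPMLE of F at the sorted monitoring times is the isotonic regression (pool-adjacent-violators) of the indicators 1{T <= C} on C. Below is a minimal covariate-free sketch of that baseline, not the paper's covariate-adjusted estimator.

```python
def pava(y):
    """Pool-adjacent-violators: least-squares nondecreasing fit to y."""
    blocks = []                       # [mean, count] blocks
    for v in y:
        blocks.append([float(v), 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, c2 = blocks.pop()     # merge adjacent violating blocks
            m1, c1 = blocks.pop()
            blocks.append([(m1 * c1 + m2 * c2) / (c1 + c2), c1 + c2])
    fit = []
    for m, c in blocks:
        fit.extend([m] * c)
    return fit

def current_status_npmle(monitoring_times, indicators):
    """NPMLE of F at the sorted monitoring times: isotonic regression of
    the current status indicators 1{T <= C} on C."""
    order = sorted(range(len(monitoring_times)),
                   key=lambda i: monitoring_times[i])
    c_sorted = [monitoring_times[i] for i in order]
    f_hat = pava([indicators[i] for i in order])
    return list(zip(c_sorted, f_hat))
```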
Abstract:
We investigate the interplay of smoothness and monotonicity assumptions when estimating a density from a sample of observations. The nonparametric maximum likelihood estimator of a decreasing density on the positive half line attains a rate of convergence at a fixed point if the density has a negative derivative. The same rate is obtained by a kernel estimator, but the limit distributions are different. If the density is both differentiable and known to be monotone, then a third estimator is obtained by isotonization of a kernel estimator. We show that this again attains the rate of convergence, and we compare the limit distributions of the three types of estimators. It is shown that both isotonization and smoothing lead to a more concentrated limit distribution, and we study the dependence on the proportionality constant in the bandwidth. We also show that isotonization does not change the limit behavior of a kernel estimator with a larger bandwidth, in the case that the density is known to have more than one derivative.
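The third estimator described above can be sketched directly: form a kernel density estimate, then project it onto the decreasing functions with the pool-adjacent-violators algorithm (here in its antitonic, i.e. nonincreasing, form). The sample, grid, and bandwidth below are arbitrary illustrative choices, not values from the paper.

```python
import math

def kde(data, grid, h):
    """Gaussian kernel density estimate evaluated on a grid."""
    n = len(data)
    c = 1.0 / (n * h * math.sqrt(2.0 * math.pi))
    return [c * sum(math.exp(-0.5 * ((g - x) / h) ** 2) for x in data)
            for g in grid]

def antitonic(y):
    """Least-squares nonincreasing fit (pool-adjacent-violators)."""
    blocks = []                         # [mean, count] blocks
    for v in y:
        blocks.append([float(v), 1])
        while len(blocks) > 1 and blocks[-2][0] < blocks[-1][0]:
            m2, c2 = blocks.pop()       # merge adjacent violating blocks
            m1, c1 = blocks.pop()
            blocks.append([(m1 * c1 + m2 * c2) / (c1 + c2), c1 + c2])
    fit = []
    for m, c in blocks:
        fit.extend([m] * c)
    return fit

sample = [0.1, 0.2, 0.3, 0.5, 0.8, 1.3, 2.1, 3.4]   # toy positive sample
grid = [0.25 * i for i in range(16)]
isotonized = antitonic(kde(sample, grid, h=0.5))    # monotone decreasing fit
```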
Abstract:
Estimation for bivariate right-censored data is a problem that has received much study over the past 15 years. In this paper we propose a new class of estimators for the bivariate survival function based on locally efficient estimation. We introduce the locally efficient estimator for bivariate right-censored data, present an asymptotic theorem, report the results of simulation studies, and perform a brief data analysis illustrating the use of the locally efficient estimator.
Abstract:
Much controversy exists over whether the course of schizophrenia, as defined by the lengths of repeated community tenures, is progressively ameliorating or deteriorating. This article employs a new statistical method proposed by Wang and Chen (2000) to analyze the Denmark registry data in Eaton et al. (1992). The new statistical method correctly handles the bias caused by induced informative censoring, which arises from the interaction of the heterogeneity of schizophrenia patients and long-term follow-up. The analysis shows a progressive deterioration pattern in terms of community tenures for the full registry cohort, rather than the progressive amelioration pattern reported for a selected sub-cohort in Eaton et al. (1992). When adjusted for the long-term chronicity of calendar time, no significant progressive pattern was found for the full cohort.