7 results for CONTINUOUS VARIABLE SYSTEMS

at Duke University


Relevance:

90.00%

Publisher:

Abstract:

Continuous variables are one of the major data types collected by survey organizations. Such data can be incomplete, requiring the data collectors to fill in the missing values, or they can contain sensitive information that needs protection from re-identification. One approach to protecting continuous microdata is to sum the values within cells defined by different features. In this thesis, I present novel multiple imputation (MI) methods that can be applied to impute missing values and to synthesize confidential values for continuous and magnitude data.

The first method limits the disclosure risk of continuous microdata whose marginal sums are fixed. The motivation for developing such a method comes from the magnitude tables of non-negative integer values in economic surveys. I present approaches based on a mixture of Poisson distributions to describe the multivariate distribution, so that the marginals of the synthetic data are guaranteed to sum to the original totals. At the same time, I present methods for assessing the disclosure risks of releasing such synthetic magnitude microdata. An illustration on a survey of manufacturing establishments shows that the disclosure risks are low while the information loss is acceptable.
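One way to see how a Poisson model can preserve fixed marginal sums: independent Poisson counts, conditioned on their total, follow a multinomial distribution. The sketch below illustrates that identity with hypothetical cell rates (the thesis's mixture-of-Poissons model is richer than this single-component example):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical Poisson rates for four establishments within one table cell
# (values are illustrative, not from the thesis).
rates = np.array([12.0, 45.0, 3.0, 20.0])
original_total = 80  # the fixed marginal sum that must be preserved

# Independent Poisson counts, conditioned on their sum, are multinomial,
# so a synthetic draw that respects the fixed total is a multinomial
# sample with probabilities proportional to the Poisson rates.
synthetic = rng.multinomial(original_total, rates / rates.sum())

print(synthetic, synthetic.sum())  # the components always sum to 80
```

Every draw generated this way sums exactly to the original total, which is the guarantee the abstract describes.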

The second method releases synthetic continuous microdata by a nonstandard MI method. Traditionally, MI fits a model on the confidential values and then generates multiple synthetic datasets from this model. The disclosure risk of this approach tends to be high, especially when the original data contain extreme values. I present a nonstandard MI approach conditioned on protective intervals. Its basic idea is to estimate the model parameters from these intervals rather than from the confidential values. The encouraging results of simple simulation studies suggest the potential of this new approach for limiting the posterior disclosure risk.
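The idea of fitting a model to protective intervals instead of the confidential values can be illustrated with an interval-censored likelihood. This is only a sketch under assumptions not stated in the abstract (a Gaussian model and fixed-width intervals centered on each value); each observation contributes the probability of its interval, never the raw value:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical confidential values and their protective intervals
# (the +/- 5 width is illustrative, not from the thesis).
confidential = rng.normal(50.0, 10.0, size=500)
lower, upper = confidential - 5.0, confidential + 5.0

def neg_log_lik(params):
    """Interval-censored Gaussian likelihood: each record contributes
    P(lower < X < upper) under N(mu, sigma), not the confidential value."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    p = norm.cdf(upper, mu, sigma) - norm.cdf(lower, mu, sigma)
    return -np.sum(np.log(np.clip(p, 1e-300, None)))

mid = (lower + upper) / 2.0
res = minimize(neg_log_lik, x0=[mid.mean(), np.log(mid.std())],
               method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)  # near the generating values 50 and 10
```

Synthetic data would then be drawn from the fitted model, so the confidential values themselves never enter the synthesis step.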

The third method imputes missing values in continuous and categorical variables. It extends a hierarchically coupled mixture model with local dependence. The new method separates the variables into non-focused (e.g., almost fully observed) and focused (e.g., largely missing) ones, and the sub-model structure of the focused variables is more complex than that of the non-focused ones. Their cluster indicators are linked together by tensor factorization, and the focused continuous variables depend locally on non-focused values. The model properties suggest that moving strongly associated non-focused variables to the focused side can help to improve estimation accuracy, which is examined in several simulation studies. The method is applied to data from the American Community Survey.

Relevance:

80.00%

Publisher:

Abstract:

We consider a deterministic system with two conserved quantities and infinitely many invariant measures. However, the system possesses a unique invariant measure when enough stochastic forcing and balancing dissipation are added. We then show that, as the forcing and dissipation are removed, a unique limit of the deterministic system is selected. The exact structure of the limiting measure depends on the specifics of the stochastic forcing.

Relevance:

40.00%

Publisher:

Abstract:

This paper analyzes a class of common-component allocation rules, termed no-holdback (NHB) rules, in continuous-review assemble-to-order (ATO) systems with positive lead times. The inventory of each component is replenished following an independent base-stock policy. In contrast to the usually assumed first-come-first-served (FCFS) component allocation rule in the literature, an NHB rule allocates a component to a product demand only if it will yield immediate fulfillment of that demand. We identify metrics as well as cost and product structures under which NHB rules outperform all other component allocation rules. For systems with certain product structures, we obtain key performance expressions and compare them to those under FCFS. For general product structures, we present performance bounds and approximations. Finally, we discuss the applicability of these results to more general ATO systems. © 2010 INFORMS.
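The core of the no-holdback idea can be sketched in a few lines. This toy function is illustrative only (the paper's model is continuous-review with positive lead times and base-stock replenishment, which this ignores): components are committed to a product demand only when the whole demand can ship immediately, so no unit is held back for a demand that cannot be fulfilled.

```python
def nhb_allocate(inventory, demand):
    """Toy no-holdback (NHB) allocation for one product demand.
    inventory: dict component -> on-hand units (mutated on success)
    demand:    dict component -> units required by this demand
    Returns True and deducts stock only if the demand ships in full."""
    if all(inventory.get(c, 0) >= q for c, q in demand.items()):
        for c, q in demand.items():
            inventory[c] -= q
        return True   # immediate fulfillment, as NHB requires
    return False      # NHB: allocate nothing, backlog the demand

inventory = {"a": 1, "b": 0}
print(nhb_allocate(inventory, {"a": 1, "b": 1}))  # False: b is out of stock
print(inventory)  # {'a': 1, 'b': 0} -- the unit of a was not committed
```

Under FCFS, by contrast, the unit of component "a" would already be committed to the waiting demand and unavailable to any later demand that could have shipped at once.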

Relevance:

30.00%

Publisher:

Abstract:

The growth and proliferation of invasive bacteria in engineered systems is an ongoing problem. While there are a variety of physical and chemical processes to remove and inactivate bacterial pathogens, there are many situations in which these tools are no longer effective or appropriate for the treatment of a microbial target. For example, certain strains of bacteria are becoming resistant to commonly used disinfectants, such as chlorine and UV. Additionally, the overuse of antibiotics has contributed to the spread of antibiotic resistance, and there is concern that wastewater treatment processes are contributing to the spread of antibiotic resistant bacteria.

Due to the continually evolving nature of bacteria, it is difficult to develop methods for universal bacterial control in a wide range of engineered systems, as many of our treatment processes are static in nature. Still, invasive bacteria are present in many natural and engineered systems where the application of broad-acting disinfectants is impractical, because their use may inhibit the original desired bioprocesses. Therefore, to better control the growth of treatment-resistant bacteria and to address limitations of the current disinfection processes, novel tools that are both specific and adaptable need to be developed and characterized.

In this dissertation, two possible biological disinfection processes were investigated for use in controlling invasive bacteria in engineered systems. First, antisense gene silencing, which is the specific use of oligonucleotides to silence gene expression, was investigated. This work was followed by the investigation of bacteriophages (phages), which are viruses that are specific to bacteria, in engineered systems.


For the antisense gene silencing work, a computational approach was used to quantify the number of off-targets and to determine their effects in prokaryotic organisms. For Escherichia coli K-12 MG1655 and Mycobacterium tuberculosis H37Rv, the mean number of off-targets was found to be 15.0 ± 13.2 and 38.2 ± 61.4, respectively, which results in a reduction of greater than 90% in the effective oligonucleotide concentration. It was also demonstrated that there was high variability in the number of off-targets over the length of a gene, but that, on average, there was no general gene location that could be targeted to reduce off-targets; therefore, this analysis needs to be performed for each gene in question. It was also demonstrated that the thermodynamic binding energy between the oligonucleotide and the mRNA accounted for 83% of the variation in silencing efficiency, compared to the number of off-targets, which explained 43% of the variance. This suggests that optimizing thermodynamic parameters should be prioritized over minimizing the number of off-targets. In summary, these results suggest that off-target hybrids can account for a greater than 90% reduction in the concentration of silencing oligonucleotides, and that the effective concentration can be increased through the rational design of silencing targets that minimizes off-target hybrids.

Regarding the work with phages, the disinfection rates of bacteria in the presence of phages were determined. The disinfection rates of E. coli K-12 MG1655 in the presence of coliphage Ec2 ranged up to 2 h⁻¹ and were dependent on both the initial phage and bacterial concentrations. Increasing initial phage concentrations resulted in increasing disinfection rates, and generally, increasing initial bacterial concentrations did as well. However, disinfection rates were found to plateau at higher bacterial and phage concentrations. A multiple linear regression model was used to predict the disinfection rates as a function of the initial phage and bacterial concentrations, and this model was able to explain 93% of the variance in the disinfection rates. The disinfection rates were also modeled with a particle aggregation model. The results from these simulations suggested that at lower phage and bacterial concentrations there are not enough collisions to support active disinfection, which therefore limits the conditions and systems where phage-based bacterial disinfection is possible. Additionally, the particle aggregation model overpredicted the disinfection rates at higher phage and bacterial concentrations of 10⁸ PFU/mL and 10⁸ CFU/mL, suggesting that other interactions were occurring at these higher concentrations. Overall, this work highlights the need for alternative models to more accurately describe the dynamics of this system at a variety of phage and bacterial concentrations. Finally, the minimum required hydraulic residence time was calculated for a continuous stirred-tank reactor (CSTR) and a plug flow reactor (PFR) as a function of both the initial phage and bacterial concentrations, which suggested that phage treatment in a PFR is theoretically possible.
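The contrast between the two reactor types can be computed in closed form if one assumes simple first-order disinfection kinetics; this is a minimal sketch of that textbook calculation (using the upper end of the rates reported above and a 3-log target, not the dissertation's exact analysis):

```python
import math

def hrt_pfr(k, log_removal):
    """Minimum hydraulic residence time (h) for a plug flow reactor,
    assuming first-order disinfection at rate k (1/h): t = ln(N0/N) / k."""
    return math.log(10 ** log_removal) / k

def hrt_cstr(k, log_removal):
    """Minimum residence time (h) for a single continuous stirred-tank
    reactor at steady state: N/N0 = 1 / (1 + k*t), so t = (N0/N - 1) / k."""
    return (10 ** log_removal - 1) / k

k = 2.0  # 1/h, the upper end of the observed disinfection rates
print(hrt_pfr(k, 3))   # ~3.45 h for a 3-log reduction
print(hrt_cstr(k, 3))  # ~499.5 h for the same target
```

The orders-of-magnitude gap between the two residence times is one way to see why phage treatment looks theoretically feasible in a PFR but not in a single CSTR.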

In addition to determining disinfection rates, the long-term bacterial growth inhibition potential was determined for a variety of phages with both Gram-negative and Gram-positive bacteria. It was determined that, on average, phages can be used to inhibit bacterial growth for up to 24 h, and that this effect was concentration dependent for various phages at specific time points. Additionally, it was found that a phage cocktail was no more effective at inhibiting bacterial growth over the long term than the best-performing phage in isolation.

Finally, for an industrial application, the use of phages to inhibit invasive Lactobacilli in ethanol fermentations was investigated. It was demonstrated that phage 8014-B2 can achieve a greater than 3-log inactivation of Lactobacillus plantarum during a 48 h fermentation. Additionally, it was shown that phages can be used to protect final product yields and maintain yeast viability. By modeling the fermentation system with differential equations, it was determined that there was a 10 h window at the beginning of the fermentation run during which the addition of phages protects final product yields; after 20 h, no additional benefit of phage addition was observed.

In conclusion, this dissertation improved the current methods for designing antisense gene silencing targets for prokaryotic organisms, and characterized phages from an engineering perspective. First, the current design strategy for antisense targets in prokaryotic organisms was improved through the development of an algorithm that minimized the number of off-targets. For the phage work, a framework was developed to predict the disinfection rates in terms of the initial phage and bacterial concentrations. In addition, the long-term bacterial growth inhibition potential of multiple phages was determined for several bacteria. In regard to the phage application, phages were shown to protect both final product yields and yeast concentrations during fermentation. Taken together, this work suggests that the rational design of phage treatment is possible and further work is needed to expand on this foundation.

Relevance:

30.00%

Publisher:

Abstract:

Antigenically variable RNA viruses are significant contributors to the burden of infectious disease worldwide. One reason for their ubiquity is their ability to escape herd immunity through rapid antigenic evolution and thereby to reinfect previously infected hosts. However, the ways in which these viruses evolve antigenically are highly diverse. Some have only limited diversity in the long-run, with every emergence of a new antigenic variant coupled with a replacement of the older variant. Other viruses rapidly accumulate antigenic diversity over time. Others still exhibit dynamics that can be considered evolutionary intermediates between these two extremes. Here, we present a theoretical framework that aims to understand these differences in evolutionary patterns by considering a virus's epidemiological dynamics in a given host population. Our framework, based on a dimensionless number, probabilistically anticipates patterns of viral antigenic diversification and thereby quantifies a virus's evolutionary potential. It is therefore similar in spirit to the basic reproduction number, the well-known dimensionless number which quantifies a pathogen's reproductive potential. We further outline how our theoretical framework can be applied to empirical viral systems, using influenza A/H3N2 as a case study. We end with predictions of our framework and work that remains to be done to further integrate viral evolutionary dynamics with disease ecology.

Relevance:

30.00%

Publisher:

Abstract:

Dynamics of biomolecules over various spatial and time scales are essential for biological functions such as molecular recognition, catalysis, and signaling. However, reconstruction of biomolecular dynamics from experimental observables requires the determination of a conformational probability distribution. Unfortunately, these distributions cannot be fully constrained by the limited information from experiments, making the problem ill-posed in the terminology of Hadamard: it has no unique solution, and multiple or even infinitely many solutions may exist. To overcome this ill-posedness, the problem needs to be regularized by making assumptions, which inevitably introduce biases into the result.

Here, I present two continuous probability density function approaches to solve an important inverse problem called the RDC trigonometric moment problem. By focusing on interdomain orientations, we reduced the problem to determining a distribution on the 3D rotation space from residual dipolar couplings (RDCs). We derived an analytical equation that relates the alignment tensors of adjacent domains, which serves as the foundation of the two methods. In the first approach, the ill-posed nature of the problem is avoided by introducing a continuous distribution model, which enjoys a smoothness assumption. To find the optimal solution for the distribution, we also designed an efficient branch-and-bound algorithm that exploits the mathematical structure of the analytical solutions and is guaranteed to find the distribution that best satisfies the analytical relationship. We observed good performance of the method when tested under various levels of experimental noise and when applied to two protein systems. The second approach avoids the use of any model by employing maximum entropy principles. This 'model-free' approach delivers the least biased result consistent with our state of knowledge. In this approach, the solution is an exponential function of Lagrange multipliers. To determine the multipliers, a convex objective function is constructed, so the maximum entropy solution can be found by gradient descent methods. Both algorithms can be applied to biomolecular RDC data in general, including data from RNA and DNA molecules.
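The maximum entropy construction can be sketched on a discretized one-dimensional toy problem. Everything below is an illustrative stand-in (the grid, the trigonometric features, and the target moments are invented for the sketch; the real method works on the 3D rotation space with RDC-derived constraints): the solution is an exponential function of Lagrange multipliers, and the multipliers are found by gradient descent on the convex dual objective.

```python
import numpy as np

# Toy maximum entropy problem: find p(x) of maximum entropy whose feature
# expectations match measured "moments" (illustrative values only).
x = np.linspace(-np.pi, np.pi, 200)
features = np.vstack([np.cos(x), np.sin(x)])  # constraint functions f_i(x)
target = np.array([0.3, 0.1])                 # measured moments c_i

lam = np.zeros(2)  # Lagrange multipliers
for _ in range(2000):
    w = np.exp(lam @ features)
    p = w / w.sum()               # maxent form: p(x) ~ exp(sum_i lam_i f_i(x))
    grad = features @ p - target  # gradient of the convex dual: E_p[f] - c
    lam -= 0.5 * grad             # plain gradient descent on the dual

print(features @ p)  # matches the target moments after convergence
```

Because the dual objective is convex in the multipliers, this descent converges to the unique constrained entropy maximum, which is what makes the 'model-free' approach computationally tractable.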

Relevance:

30.00%

Publisher:

Abstract:

Background: Organophosphate (OP) pesticides are well-known developmental neurotoxicants that have been linked to abnormal cognitive and behavioral endpoints through both epidemiological studies and animal models of behavioral teratology, and are implicated in the dysfunction of multiple neurotransmitters, including dopamine. Chemical similarities between OP pesticides and organophosphate flame retardants (OPFRs), a class of compounds growing in use and environmental relevance, have raised concern that developmental exposures to OPFRs and OP pesticides may share behavioral outcomes, impacts on dopaminergic systems, or both. Methods: Using the zebrafish animal model, we exposed developing fish to two OPFRs, TDCIPP and TPHP, as well as the OP pesticide chlorpyrifos, during the first 5 days following fertilization. The exposed fish were then assayed for behavioral abnormalities and effects on monoamine neurochemistry as both larvae and adults. An experiment conducted in parallel examined how antagonism of the dopamine system during an identical window of development could alter later-life behavior in the same assays. Finally, we investigated the interaction between developmental exposure to an OPFR and acute dopamine antagonism in larval behavior. Results: Developmental exposure to all three OP compounds altered zebrafish behavior, with effects persisting into adulthood. Additionally, exposure to an OPFR decreased the behavioral response to acute D2 receptor antagonism in larvae. However, the pattern of behavioral effects diverged substantially from that seen following developmental dopamine antagonism, and the investigations into dopamine neurochemistry were too variable to be conclusive. Thus, although the results support the hypothesis that OPFRs, like OP pesticides such as chlorpyrifos, may present a risk to normal behavioral development, we were unable to directly link these effects to any dopaminergic dysfunction.