7 results for Equilibrium and stability analysis
in CaltechTHESIS
Abstract:
Lipid bilayer membranes are models for cell membranes, the structures that help regulate cell function. Cell membranes are heterogeneous, and the coupling between composition and shape gives rise to complex behaviors that are important to regulation. This thesis seeks to systematically build and analyze complete models to understand the behavior of multi-component membranes.
We propose a model and use it to derive the equilibrium and stability conditions for a general class of closed multi-component biological membranes. Our analysis shows that the critical modes of these membranes have high frequencies, unlike single-component vesicles, and their stability depends on system size, unlike in systems undergoing spinodal decomposition in flat space. An important implication is that small perturbations may nucleate localized but very large deformations. We compare these results with experimental observations.
We also study open membranes to gain insight into long tubular membranes that arise, for example, in nerve cells. We derive a complete system of equations for open membranes by using the principle of virtual work. Our linear stability analysis predicts that tubular membranes tend to have coiling shapes if the tension is small, cylindrical shapes if the tension is moderate, and beading shapes if the tension is large. This is consistent with experimental observations of nerve fibers reported in the literature. Further, we provide numerical solutions to the fully nonlinear equilibrium equations in some problems, and show that the observed mode shapes are consistent with those suggested by linear stability. Our work also proves that beading of nerve fibers can appear purely as a mechanical response of the membrane.
Abstract:
In this study we investigate the existence, uniqueness, and asymptotic stability of solutions of a class of nonlinear integral equations which are representations of some time-dependent nonlinear partial differential equations. Sufficient conditions are established which allow one to infer the stability of the nonlinear equations from the stability of the linearized equations. Improved estimates of the domain of stability are obtained using a Liapunov functional approach. These results are applied to some nonlinear partial differential equations governing the behavior of nonlinear continuous dynamical systems.
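For context, the Liapunov-functional approach rests on the standard stability criterion, stated below in generic schematic form (the thesis's specific functional is not reproduced here):

```latex
V(u) > 0 \ \text{for } u \neq 0, \qquad V(0) = 0, \qquad
\frac{d}{dt}\, V\big(u(t)\big) \le 0 \ \text{along solutions}
\;\Longrightarrow\; \text{the equilibrium } u = 0 \text{ is stable.}
```

Sharper estimates of the domain of stability come from identifying the largest sublevel set of $V$ on which the decay condition holds.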
Abstract:
Pre-mRNA splicing requires interaction of cis-acting intron sequences with trans-acting factors: proteins and small nuclear ribonucleoproteins (snRNPs). The assembly of these factors into a large complex, the spliceosome, is essential for the subsequent two-step splicing reaction. First, the 5' splice site is cleaved, and free exon 1 and a lariat intermediate (intron-exon 2) form. In the second reaction the 3' splice site is cleaved, the exons are ligated, and the lariat intron is released. A combination of genetic and biochemical techniques has been used here to study pre-mRNA splicing in yeast.
Yeast introns have three highly conserved elements. We made point mutations within these elements and found that most of them affect splicing efficiency in vivo and in vitro, usually by inhibiting spliceosome assembly.
To study trans-acting splicing factors we generated and screened a bank of temperature-sensitive (ts) mutants. Eleven new complementation groups (prp17 to prp27) were isolated. The four phenotypic classes obtained affect different steps in splicing and accumulate either: 1) pre-mRNA, 2) lariat intermediate, 3) excised intron, or 4) both pre-mRNA and intron. The latter three classes represent novel phenotypes. The excised intron observed in one mutant, prp26, is stabilized due to protection in a snRNP-containing particle. Extracts from another mutant, prp18, are heat labile and accumulate lariat intermediate and exon 1. This is especially interesting as it allows analysis of the second splicing reaction. In vitro complementation of inactivated prp18 extracts does not require intact snRNPs. These studies have also shown the mutation to be in a previously unknown splicing protein. A specific requirement for ATP is also observed for the second step of splicing. The PRP18 gene has been cloned and its polyadenylated transcript identified.
Abstract:
The epidemic of HIV/AIDS in the United States is constantly changing and evolving, from patient zero to an estimated 650,000 to 900,000 Americans infected today. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). Utilizing statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where the treatment of HIV/AIDS is headed.
Chapter Two describes the datasets that were used for the analyses. The primary database was one I collected from an outpatient HIV clinic; it includes data from 1984 until the present. The second database was the public dataset of the Multicenter AIDS Cohort Study (MACS), covering the period between 1984 and October 1992. Comparisons are made between the two datasets.
Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even patients with no outward symptoms of HIV infection have high levels of immunosuppression.
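One simple way to see why Gaussian assumptions fail for count data of this kind is a moment-based skewness check. The sketch below uses synthetic data (an assumption for illustration, not the thesis's dataset), contrasting a symmetric Gaussian sample with a right-skewed lognormal one:

```python
import random
import statistics

def skewness(xs):
    """Sample skewness: the third standardized moment (zero for symmetric data)."""
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

random.seed(0)
# Synthetic stand-ins: a Gaussian sample vs. a right-skewed lognormal sample,
# the latter mimicking the heavy-tailed shape common in cell-count data.
gaussian = [random.gauss(500, 100) for _ in range(5000)]
skewed = [random.lognormvariate(6, 0.5) for _ in range(5000)]

print(round(skewness(gaussian), 2))  # should be near 0 for the symmetric sample
print(round(skewness(skewed), 2))    # should be clearly positive
```

A markedly nonzero skewness is one signal that nonparametric or transformed-scale methods are safer than raw Gaussian models for such a marker.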
Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono- or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era characterized by a new class of drugs and new technology changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attacked a different region of HIV than reverse transcriptase inhibitors, when used in combination with other antiretroviral agents were found to dramatically and significantly reduce the HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients, there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.
In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.
Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, where the state of HIV is going. This section first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in 90 percent of the population in controlling viral replication. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in the morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.
The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs for HAART are estimated. It is estimated that the direct lifetime cost of treating each HIV-infected patient with HAART is between $353,000 and $598,000, depending on how long HAART prolongs life. If one looks at the incremental cost per year of life saved, it is only $101,000. This is comparable to the incremental cost per year of life saved from coronary artery bypass surgery.
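The arithmetic linking the two cost figures can be sanity-checked directly. In the sketch below, the implied years of life gained are back-calculated assumptions (lifetime cost divided by the quoted incremental cost per life-year), not numbers taken from the thesis:

```python
# Back-of-the-envelope check of the quoted cost figures (illustrative only).
def implied_years_gained(lifetime_cost, cost_per_life_year):
    """Years of life gained implied by a lifetime cost and a per-year cost."""
    return lifetime_cost / cost_per_life_year

low, high = 353_000, 598_000   # quoted lifetime-cost range
per_year = 101_000             # quoted incremental cost per year of life saved

print(round(implied_years_gained(low, per_year), 1))   # roughly 3.5 years
print(round(implied_years_gained(high, per_year), 1))  # roughly 5.9 years
```

So the two endpoints of the range correspond, under this reading, to HAART prolonging life by roughly three and a half to six years.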
Policymakers need to be aware that although HAART can delay disease progression, it is not a cure, and the HIV epidemic is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have come from the dramatic decreases in the incidence of AIDS-defining opportunistic infections. As the patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.
Abstract:
The olefin metathesis reaction has found many applications in polymer synthesis and more recently in organic synthesis. The use of single component late metal olefin metathesis catalysts has expanded the scope of the reaction to many new applications and has allowed for detailed study of the catalytic species.
The metathesis of terminal olefins of different steric bulk and geometry, as well as of electronically different para-substituted styrenes, was studied with the ruthenium-based metathesis initiators trans-(PCy3)2Cl2Ru=CHR bearing different carbene substituents. Increasing olefin bulk was found to slow the rate of reaction, and trans internal olefins were found to be slower to react than cis internal olefins. The kinetic product of all reactions was found to be the alkylidene, rather than the methylidene, suggesting the intermediacy of a 2,4-metallacycle. The observed effects were used to explain the mechanism of ring-opening cross metathesis and acyclic diene metathesis polymerization. No linear electronic effects were observed.
In studying the different carbene ligands, a series of ester-carbene complexes was synthesized. These complexes were found to be highly active for the metathesis of olefinic substrates, including acrylates and trisubstituted olefins. In addition, the ester-carbene moiety is thermodynamically high in energy. As a result, these complexes react to ring-open cyclohexene by metathesis to alleviate the thermodynamic strain of the ester-carbene ligand. However, ester-carbene complexes were found to be thermolytically unstable in solution.
Thermolytic decomposition pathways were studied for several ruthenium-carbene based olefin metathesis catalysts. Substituted carbenes were found to decompose through bimolecular pathways, while the unsubstituted carbene (the methylidene) was found to decompose unimolecularly. The stability of several derivatives of the bis-phosphine ruthenium-based catalysts was studied for its implications for ring-closing metathesis. The reasons for the activity and stability of the different ruthenium-based catalysts are discussed.
The difference in catalyst activity and initiation is discussed for the bis-phosphine based and mixed N-heterocyclic carbene/phosphine based ruthenium olefin metathesis catalysts. The mixed-ligand catalysts initiate far more slowly than the bis-phosphine catalysts but are far more metathesis active. A scheme is proposed to explain the difference in reactivity between the two types of catalysts.
Abstract:
This thesis consists of three essays in the areas of political economy and game theory, unified by their focus on the effects of pre-play communication on equilibrium outcomes.
Communication is fundamental to elections. Chapter 2 extends canonical voter turnout models, where citizens, divided into two competing parties, choose between costly voting and abstaining, to include any form of communication, and characterizes the resulting set of Aumann's correlated equilibria. In contrast to previous research, high-turnout equilibria exist in large electorates and uncertain environments. This difference arises because communication can coordinate behavior in such a way that citizens find it incentive compatible to follow their correlated signals to vote more. The equilibria have expected turnout of at least twice the size of the minority for a wide range of positive voting costs.
In Chapter 3 I introduce a new equilibrium concept, called subcorrelated equilibrium, which fills the gap between Nash and correlated equilibrium, extending the latter to multiple mediators. Subcommunication equilibrium similarly extends communication equilibrium for incomplete information games. I explore the properties of these solutions and establish an equivalence between a subset of subcommunication equilibria and Myerson's quasi-principals' equilibria. I characterize an upper bound on expected turnout supported by subcorrelated equilibrium in the turnout game.
Chapter 4, co-authored with Thomas Palfrey, reports a new study of the effect of communication on voter turnout using a laboratory experiment. Before voting occurs, subjects may engage in various kinds of pre-play communication through computers. We study three communication treatments: No Communication, a control; Public Communication, where voters exchange public messages with all other voters; and Party Communication, where messages are exchanged only within one's own party. Our results point to a strong interaction effect between the form of communication and the voting cost. With a low voting cost, party communication increases turnout, while public communication decreases turnout. The data are consistent with correlated equilibrium play. With a high voting cost, public communication increases turnout. With communication, we find essentially no support for the standard Nash equilibrium turnout predictions.
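The correlated-equilibrium concept underlying these chapters can be made concrete with a small obedience check. The sketch below uses the classic game of Chicken (an assumption for exposition, not the turnout game of the thesis): a mediator draws an action profile and privately recommends each player their component; the distribution is a correlated equilibrium if no player gains by deviating from any recommendation.

```python
# Obedience check for Aumann's correlated equilibrium in the game of Chicken.
actions = ["D", "C"]  # Dare, Chicken
payoff = {  # (row action, col action) -> (row payoff, col payoff)
    ("D", "D"): (0, 0), ("D", "C"): (7, 2),
    ("C", "D"): (2, 7), ("C", "C"): (6, 6),
}
# A mediator's signal distribution that never recommends mutual daring:
dist = {("D", "C"): 1 / 3, ("C", "D"): 1 / 3, ("C", "C"): 1 / 3}

def is_correlated_equilibrium(dist, payoff, actions):
    """Check that no player gains by deviating from any recommended action."""
    for player in (0, 1):
        for rec in actions:        # recommended action
            for dev in actions:    # candidate deviation
                gain = 0.0
                for profile, p in dist.items():
                    if profile[player] != rec:
                        continue
                    other = profile[1 - player]
                    obeyed = (rec, other) if player == 0 else (other, rec)
                    deviated = (dev, other) if player == 0 else (other, dev)
                    gain += p * (payoff[deviated][player] - payoff[obeyed][player])
                if gain > 1e-12:
                    return False
    return True

print(is_correlated_equilibrium(dist, payoff, actions))  # True: obeying is optimal
```

The same obedience inequalities, applied to the turnout game's cost and pivot-probability structure, are what let correlated signals sustain turnout levels that Nash equilibrium cannot.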
Abstract:
The Laser Interferometer Gravitational-Wave Observatory (LIGO) consists of two complex large-scale laser interferometers designed for the direct detection of gravitational waves from distant astrophysical sources in the frequency range 10 Hz to 5 kHz. Direct detection of space-time ripples will support Einstein's general theory of relativity and provide invaluable information and new insight into the physics of the Universe.
The initial phase of LIGO started in 2002, and since then data have been collected during six science runs. Instrument sensitivity improved from run to run thanks to the efforts of the commissioning team. Initial LIGO reached its design sensitivity during the last science run, which ended in October 2010.
In parallel with commissioning and data analysis with the initial detector, the LIGO group worked on research and development of the next generation of detectors. The major instrument upgrade from initial to Advanced LIGO started in 2010 and lasted until 2014.
This thesis describes the results of commissioning work done at the LIGO Livingston site from 2013 until 2015, in parallel with and after the installation of the instrument. It also discusses new techniques and tools developed at the 40m prototype, including adaptive filtering, the estimation of quantization noise in digital filters, and the design of isolation kits for ground seismometers.
The first part of this thesis is devoted to the description of methods for bringing the interferometer into the linear regime, in which data collection becomes possible. The states of the longitudinal and angular controls of the interferometer degrees of freedom during the lock acquisition process and in the low-noise configuration are discussed in detail.
Once the interferometer is locked and transitioned to the low-noise regime, the instrument produces astrophysics data that must be calibrated to units of meters or strain. The second part of this thesis describes the online calibration technique set up at both observatories to monitor the quality of the collected data in real time. A sensitivity analysis was performed to understand and eliminate the instrument's noise sources.
Coupling of noise sources to the gravitational-wave channel can be reduced if robust feedforward and optimal feedback control loops are implemented. The last part of this thesis describes static and adaptive feedforward noise cancellation techniques applied to the Advanced LIGO interferometers and tested at the 40m prototype. Applications of optimal time-domain feedback control techniques and estimators to aLIGO control loops are also discussed.
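The core idea of adaptive feedforward cancellation can be sketched with a single-tap least-mean-squares (LMS) loop. This is a toy illustration in pure Python with an assumed scalar coupling and step size; the real aLIGO/40m pipelines use multi-tap Wiener and adaptive FIR filters on many witness channels:

```python
import math
import random

random.seed(1)
mu = 0.01            # LMS step size (assumed)
w = 0.0              # single adaptive filter tap
true_coupling = 2.5  # assumed static coupling from witness to target channel

errors = []
for n in range(2000):
    witness = math.sin(0.1 * n)                               # witness channel (e.g. a seismometer)
    target = true_coupling * witness + random.gauss(0, 0.01)  # target channel with coupled noise
    residual = target - w * witness                           # signal left after feedforward subtraction
    w += mu * residual * witness                              # LMS weight update toward lower residual
    errors.append(abs(residual))

print(round(w, 2))  # the tap should converge close to the assumed coupling of 2.5
```

The filter learns the coupling from the data alone, which is why adaptive feedforward remains effective when the physical coupling drifts, whereas a static filter must be retuned.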
Commissioning work is still ongoing at the sites. The first science run of Advanced LIGO is planned for September 2015 and will last for 3-4 months. This run will be followed by a set of small instrument upgrades that will be installed on a timescale of a few months. The second science run will start in spring 2016 and last for about 6 months. Since the current sensitivity of Advanced LIGO is already more than a factor of 3 higher than that of the initial detectors, and keeps improving on a monthly basis, the upcoming science runs have a good chance of achieving the first direct detection of gravitational waves.