7 results for structural control.

in CaltechTHESIS


Relevance:

70.00%

Publisher:

Abstract:

In this work, the development of a probabilistic approach to robust control is motivated by structural control applications in civil engineering. Often in civil structural applications, a system's performance is specified in terms of its reliability. In addition, the model and input uncertainty for the system may be described most appropriately using probabilistic or "soft" bounds on the model and input sets. The probabilistic robust control methodology contrasts with existing H∞/μ robust control methodologies that do not use probability information for the model and input uncertainty sets, yielding only the guaranteed (i.e., "worst-case") system performance and no information about the system's probable performance, which would be of interest to civil engineers.

The design objective for the probabilistic robust controller is to maximize the reliability of the uncertain structure/controller system for a probabilistically-described uncertain excitation. The robust performance is computed for a set of possible models by weighting the conditional performance probability for a particular model by the probability of that model, then integrating over the set of possible models. This integration is accomplished efficiently using an asymptotic approximation. The probable performance can be optimized numerically over the class of allowable controllers to find the optimal controller. Also, if structural response data becomes available from a controlled structure, its probable performance can easily be updated using Bayes's Theorem to update the probability distribution over the set of possible models. An updated optimal controller can then be produced, if desired, by following the original procedure. Thus, the probabilistic framework integrates system identification and robust control in a natural manner.
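
The weighting-and-updating scheme described above can be sketched for a discrete set of candidate models (all model probabilities, failure probabilities, and likelihood values below are hypothetical, chosen only to illustrate the mechanics):

```python
import numpy as np

# Robust performance as a probability-weighted average over a discrete
# model set: weight each model's conditional failure probability by the
# probability of that model, then sum (the discrete analogue of the
# integral in the text). All numbers are hypothetical.

model_prior = np.array([0.5, 0.3, 0.2])           # P(model)
p_fail_given_model = np.array([0.01, 0.05, 0.20])  # P(failure | model)

# Robust (probability-weighted) failure probability
p_fail_robust = np.sum(p_fail_given_model * model_prior)

# Bayes's Theorem update once structural response data arrive:
# likelihood of the observed data under each model (hypothetical values)
likelihood = np.array([0.9, 0.4, 0.1])
model_posterior = likelihood * model_prior
model_posterior /= model_posterior.sum()

# Updated probable performance under the posterior model weights
p_fail_updated = np.sum(p_fail_given_model * model_posterior)
```

Here the data favor the low-failure-probability model, so the updated failure probability drops below the prior-weighted one; re-optimizing the controller against the posterior weights would then follow the original design procedure.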

The probabilistic robust control methodology is applied to two systems in this thesis. The first is a high-fidelity computer model of a benchmark structural control laboratory experiment. For this application, uncertainty in the input model only is considered. The probabilistic control design minimizes the failure probability of the benchmark system while remaining robust with respect to the input model uncertainty. The performance of an optimal low-order controller compares favorably with higher-order controllers for the same benchmark system which are based on other approaches. The second application is to the Caltech Flexible Structure, which is a light-weight aluminum truss structure actuated by three voice coil actuators. A controller is designed to minimize the failure probability for a nominal model of this system. Furthermore, the method for updating the model-based performance calculation given new response data from the system is illustrated.

Relevance:

30.00%

Publisher:

Abstract:

Understanding the mechanisms of enzymes is crucial for our understanding of their role in biology and for designing methods to perturb or harness their activities for medical treatments, industrial processes, or biological engineering. One aspect of enzymes that makes them difficult to fully understand is that they are in constant motion, and these motions and the conformations adopted throughout these transitions often play a role in their function.

Traditionally, it has been difficult to isolate a protein in a particular conformation to determine what role each form plays in the reaction or biology of that enzyme. A new technology, computational protein design, makes the isolation of various conformations possible, and therefore is an extremely powerful tool in enabling a fuller understanding of the role a protein conformation plays in various biological processes.

One such protein that undergoes large structural shifts during different activities is human type II transglutaminase (TG2). TG2 is an enzyme that exists in two dramatically different conformational states: (1) an open, extended form, which is adopted upon the binding of calcium, and (2) a closed, compact form, which is adopted upon the binding of GTP or GDP. TG2 possesses two separate active sites, each with a radically different activity. The open, calcium-bound form of TG2 is believed to act as a transglutaminase, catalyzing the formation of an isopeptide bond between the sidechain of a peptide-bound glutamine and a primary amine. The closed, GTP-bound conformation is believed to act as a GTPase. TG2 is also implicated in a variety of biological and pathological processes.

To better understand the effects of TG2’s conformations on its activities and pathological processes, we set out to design variants of TG2 isolated in either the closed or open conformations. We were able to design open-locked and closed-biased TG2 variants, and used these designs to revise the current understanding of TG2’s activities and the conformations associated with them, and to explore each conformation’s role in celiac disease models. This work also helped explain earlier confusing results regarding this enzyme and its activities. The new model for TG2 activity has immense implications for our understanding of its functional capabilities in various environments, and for our ability to understand which conformations need to be inhibited in the design of new drugs for diseases in which TG2’s activities are believed to elicit pathological effects.

Relevance:

30.00%

Publisher:

Abstract:

The Hamilton-Jacobi-Bellman (HJB) equation is central to stochastic optimal control (SOC) theory, yielding the optimal solution to general problems specified by known dynamics and a given cost functional. Given the assumption of quadratic cost on the control input, it is well known that the HJB reduces to a particular partial differential equation (PDE). While powerful, this reduction is not commonly used, as the PDE is of second order, is nonlinear, and examples exist where the problem may not have a solution in a classical sense. Furthermore, each state of the system appears as another dimension of the PDE, giving rise to the curse of dimensionality: since the number of degrees of freedom required to solve the optimal control problem grows exponentially with dimension, the problem becomes intractable for systems of all but modest dimension.
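
For context, the quadratic-cost reduction has a standard textbook form (the notation below is supplied for illustration in the style of the linearly solvable SOC literature, not quoted from the thesis):

```latex
% Stochastic HJB with quadratic control cost (standard form; notation is
% illustrative, not the thesis's).
% Dynamics: dx = (f(x) + G(x)u)\,dt + B(x)\,d\omega
% Running cost: q(x) + \tfrac{1}{2} u^\top R u
\begin{aligned}
0 &= \min_u \Big[\, q + \tfrac{1}{2} u^\top R u + (f + Gu)^\top \nabla V
     + \tfrac{1}{2} \operatorname{tr}\!\big(B B^\top \nabla^2 V\big) \Big] \\
% The minimizer is u^* = -R^{-1} G^\top \nabla V, which leaves the
% second-order nonlinear PDE the abstract refers to:
0 &= q + f^\top \nabla V
     - \tfrac{1}{2} \nabla V^\top G R^{-1} G^\top \nabla V
     + \tfrac{1}{2} \operatorname{tr}\!\big(B B^\top \nabla^2 V\big)
\end{aligned}
```

Under the structural assumption $\lambda\, G R^{-1} G^\top = B B^\top$, the substitution $\Psi = \exp(-V/\lambda)$ turns the second equation into a PDE that is linear in $\Psi$.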

In the last decade researchers have found that under certain, fairly non-restrictive structural assumptions, the HJB may be transformed into a linear PDE, with an interesting analogue in the discretized domain of Markov Decision Processes (MDP). The work presented in this thesis uses the linearity of this particular form of the HJB PDE to push the computational boundaries of stochastic optimal control.
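
The discrete analogue of this linearity can be sketched for a small linearly solvable MDP: with the exponentiated value function z = exp(−V), the Bellman fixed point becomes a linear equation in z, solvable by power iteration. The state costs and passive dynamics below are hypothetical.

```python
import numpy as np

# Linearly solvable MDP sketch: z = exp(-V) satisfies the *linear*
# fixed-point equation z = exp(-q) * (P @ z), where q is the state cost
# and P the passive (uncontrolled) transition matrix. States, costs, and
# transitions are hypothetical.

q = np.array([0.1, 0.5, 0.0])           # state costs; state 2 is free
P = np.array([[0.8, 0.1, 0.1],           # passive transition probabilities
              [0.2, 0.6, 0.2],
              [0.1, 0.1, 0.8]])

z = np.ones(3)
for _ in range(500):                     # power iteration on the linear map
    z = np.exp(-q) * (P @ z)
    z /= z.max()                         # normalize to the dominant eigenvector

V = -np.log(z)                           # recover the value function (up to a constant)
```

As expected, the cheap, self-absorbing state ends up with the lowest value and the most expensive state with the highest; in the continuous limit the same exponentiation yields the linear HJB PDE exploited in this thesis.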

This is done by crafting together previously disjoint lines of research in computation. The first of these is the use of Sum of Squares (SOS) techniques for synthesis of control policies. A candidate polynomial with variable coefficients is proposed as the solution to the stochastic optimal control problem. An SOS relaxation is then taken to the partial differential constraints, leading to a hierarchy of semidefinite relaxations with improving sub-optimality gap. The resulting approximate solutions are shown to be guaranteed over- and under-approximations for the optimal value function. It is shown that these results extend to arbitrary parabolic and elliptic PDEs, yielding a novel method for Uncertainty Quantification (UQ) of systems governed by partial differential constraints. Domain decomposition techniques are also made available, allowing for such problems to be solved via parallelization and low-order polynomials.
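
The certificate underlying the SOS machinery can be illustrated on a toy polynomial (chosen by hand, not taken from the thesis): p is a sum of squares exactly when p(x) = m(x)ᵀ G m(x) for some positive semidefinite Gram matrix G over a monomial vector m. In SOS synthesis, a semidefinite program searches for such a G; here we simply verify a known certificate.

```python
import numpy as np

# SOS certificate check for p(x) = x^4 + 2x^2 + 1 over the monomial
# basis m(x) = [1, x, x^2]. The Gram matrix G is hand-picked for this
# illustration; an SDP solver would find it automatically.

m = lambda x: np.array([1.0, x, x * x])
G = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

# G positive semidefinite => p is a sum of squares: p = (1 + x^2)^2
psd = np.linalg.eigvalsh(G).min() >= -1e-9

p = lambda x: x**4 + 2 * x**2 + 1
match = all(np.isclose(m(x) @ G @ m(x), p(x)) for x in np.linspace(-2.0, 2.0, 9))
```

Relaxing equality constraints on a candidate value function to SOS membership in this way is what produces the hierarchy of semidefinite relaxations, and the certified sign of the residual is what yields guaranteed over- and under-approximations.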

The optimization-based SOS technique is then contrasted with the Separated Representation (SR) approach from the applied mathematics community. The technique allows for systems of equations to be solved through a low-rank decomposition that results in algorithms that scale linearly with dimensionality. Its application in stochastic optimal control allows for previously uncomputable problems to be solved quickly, scaling to such complex systems as the Quadcopter and VTOL aircraft. This technique may be combined with the SOS approach, yielding not only a numerical technique, but also an analytical one that allows for entirely new classes of systems to be studied and for stability properties to be guaranteed.
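
The linear-in-dimension scaling that separated representations exploit can be seen in the simplest (rank-1, separable) case; the grid size and factor values below are hypothetical:

```python
import numpy as np

# Rank-1 separated representation: a separable function on an n**d grid
# is stored as d one-dimensional factors (d*n numbers rather than n**d),
# and sums over the full grid factorize into products of 1-D sums.
# Factors here are hypothetical.

d, n = 10, 50
factors = [np.linspace(0.1, 1.0, n) for _ in range(d)]

# Sum over the full n**d grid of prod_i g_i(x_i), at d*n cost:
#   sum_x prod_i g_i(x_i) = prod_i sum_{x_i} g_i(x_i)
grid_sum = np.prod([f.sum() for f in factors])

storage_separated = d * n    # 500 numbers
storage_full = n ** d        # 50**10 entries: far beyond memory
```

A general SR solution is a short sum of such rank-1 terms, which is what lets the resulting algorithms scale linearly with dimensionality rather than exponentially.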

The analysis of the linear HJB is completed by the study of its implications in application. It is shown that the HJB and a popular technique in robotics, the use of navigation functions, sit on opposite ends of a spectrum of optimization problems, upon which tradeoffs may be made in problem complexity. Analytical solutions to the HJB in these settings are available in simplified domains, yielding guidance towards optimality for approximation schemes. Finally, the use of HJB equations in temporal multi-task planning problems is investigated. It is demonstrated that such problems are reducible to a sequence of SOC problems linked via boundary conditions. The linearity of the PDE allows us to pre-compute control policy primitives and then compose them, at essentially zero cost, to satisfy a complex temporal logic specification.

Relevance:

30.00%

Publisher:

Abstract:

More than thirty years after the discovery that Human Immunodeficiency Virus (HIV) was the causative agent of Acquired Immunodeficiency Syndrome (AIDS), the disease remains pandemic in the absence of an effective universal vaccine. Over 34 million individuals in the world are infected with the virus, and the vast majority of them have no access to the antiretroviral therapies that have largely reduced HIV to a chronic disease in the developed world. The first chapter of this thesis introduces the history of the virus. The key to the infectious mechanism of the virus lies in its envelope glycoprotein (Env), a trimeric spike on the viral surface that utilizes host T cell receptors for entry. Though HIV-1 Env is immunogenic, most infected patients do not mount an effective neutralizing antibody response against it. Broadly neutralizing anti-Env antibodies (bNAbs) present in the serum of a minority of infected individuals are usually sufficient to prevent the progression to full-blown AIDS. Thus, the molecular details of these bNAbs as well as the antibody-antigen interface are of prime interest for structural studies, as insight gained would contribute to the design of a more effective immunogen and potential vaccine candidate. The second chapter of this thesis describes the low-resolution crystal structure of one such antibody, 2G12 dimer, which targets a high mannose epitope on the surface of Env. Patients infected with HIV-2, a related virus with ~35% sequence identity in the Env region, can generally mount a robust antibody response sufficient for viral control, for reasons still unknown. The final two chapters of this thesis focus on the first reported structural studies of HIV-2 Env, the molecular details of which may inform HIV-1 therapy and immunogen design.

Relevance:

30.00%

Publisher:

Abstract:

Current technological advances in fabrication methods have provided pathways to creating architected structural meta-materials similar to those found in structurally robust, lightweight natural organisms such as diatoms. Structural meta-materials are materials with mechanical properties that are determined by material properties at various length scales, which range from the material microstructure (nm) to the macro-scale architecture (μm – mm). It is now possible to exploit material size effects, which emerge at the nanometer length scale, as well as structural effects to tune the material properties and failure mechanisms of small-scale cellular solids, such as nanolattices. This work demonstrates the fabrication and mechanical properties of 3-dimensional hollow nanolattices in both tension and compression. Hollow gold nanolattices loaded in uniaxial compression demonstrate that strength and stiffness vary as a function of geometry and tube wall thickness. Structural effects were explored by increasing the unit cell angle from 30° to 60° while keeping all other parameters constant; material size effects were probed by varying the tube wall thickness, t, from 200 nm to 635 nm, at a constant relative density and grain size. In-situ uniaxial compression experiments reveal an order-of-magnitude increase in yield stress and modulus in nanolattices with greater lattice angles, and a 150% increase in the yield strength without a concomitant change in modulus in thicker-walled nanolattices for fixed lattice angles. These results imply that independent control of structural and material size effects enables tunability of mechanical properties of 3-dimensional architected meta-materials and highlight the importance of material, geometric, and microstructural effects in small-scale mechanics. This work also explores the flaw tolerance of 3D hollow-tube alumina kagome nanolattices with and without pre-fabricated notches, both in experiment and simulation.
Experiments demonstrate that the hollow kagome nanolattices in uniaxial tension always fail at the same load when the ratio of notch length (a) to sample width (w) is no greater than 1/3, with no correlation between failure occurring at or away from the notch. For notches with (a/w) > 1/3, the samples fail at lower peak loads and this is attributed to the increased compliance as fewer unit cells span the un-notched region. Finite element simulations of the kagome tension samples show that the failure is governed by tensile loading for (a/w) < 1/3 but as (a/w) increases, bending begins to play a significant role in the failure. This work explores the flaw sensitivity of hollow alumina kagome nanolattices in tension, using experiments and simulations, and demonstrates that the discrete-continuum duality of architected structural meta-materials gives rise to their flaw insensitivity even when made entirely of intrinsically brittle materials.

Relevance:

30.00%

Publisher:

Abstract:

STEEL, the Caltech-created nonlinear large-displacement analysis software, is currently used by a large number of researchers at Caltech. However, due to its complexity and lack of visualization tools (such as pre- and post-processing capabilities), rapid creation and analysis of models with this software was difficult. SteelConverter was created to facilitate model creation through the use of the industry-standard finite element solver ETABS. This software allows users to create models in ETABS and intelligently convert model information such as geometry, loading, releases, and fixity into a format that STEEL understands. Models that would take several days to create and verify now take several hours or less. The productivity of the researcher as well as the level of confidence in the model being analyzed is greatly increased.

It has always been a major goal of Caltech to spread the knowledge created here to other universities. However, due to the complexity of STEEL it was difficult for researchers or engineers from other universities to conduct analyses. While SteelConverter did help researchers at Caltech improve their research, sending SteelConverter and its documentation to other universities was less than ideal. Issues of version control, individual computer requirements, and the difficulty of releasing updates made a more centralized solution preferable. This is where the idea for Caltech VirtualShaker was born. Through the creation of a centralized website where users could log in, submit, analyze, and process models in the cloud, all of the major concerns associated with the utilization of SteelConverter were eliminated. Caltech VirtualShaker allows users to create profiles where defaults associated with their most commonly run models are saved, and allows them to submit multiple jobs to an online virtual server to be analyzed and post-processed. The creation of this website not only allowed for more rapid distribution of this tool, but also created a means for engineers and researchers with no access to powerful computer clusters to run computationally intensive analyses without the excessive cost of building and maintaining a computer cluster.

In order to increase confidence in the use of STEEL as an analysis system, as well as verify the conversion tools, a series of comparisons was done between STEEL and ETABS. Six models of increasing complexity, ranging from a cantilever column to a twenty-story moment frame, were analyzed to determine the ability of STEEL to accurately calculate basic model properties such as elastic stiffness and damping through a free vibration analysis, as well as more complex structural properties such as overall structural capacity through a pushover analysis. These analyses showed a very strong agreement between the two programs on every aspect of each analysis. However, these analyses also showed the ability of the STEEL analysis algorithm to converge at significantly larger drifts than ETABS when using the more computationally expensive and structurally realistic fiber hinges. Following the ETABS analysis, it was decided to repeat the comparisons in Perform, software more capable of conducting highly nonlinear analysis. These analyses again showed a very strong agreement between the two programs in every aspect of each analysis through instability. However, due to some limitations in Perform, free vibration analyses for the three-story one-bay chevron-braced frame, the two-bay chevron-braced frame, and the twenty-story moment frame could not be conducted. With the current trend towards ultimate capacity analysis, the ability to use fiber-based models allows engineers to gain a better understanding of a building’s behavior under these extreme load scenarios.
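
As an aside on the free-vibration comparisons: a damping ratio can be recovered from a free-vibration record via the logarithmic decrement between successive peaks. A minimal sketch on a synthetic single-degree-of-freedom signal (illustrative only, not thesis data):

```python
import numpy as np

# Synthetic free-vibration record of a 1 Hz oscillator with 2% of
# critical damping; we recover the damping ratio from the logarithmic
# decrement between the first two displacement peaks.

zeta_true = 0.02                         # 2% of critical damping
wn = 2 * np.pi * 1.0                     # natural frequency (rad/s)
t = np.linspace(0.0, 10.0, 10_000)
wd = wn * np.sqrt(1.0 - zeta_true**2)    # damped frequency
x = np.exp(-zeta_true * wn * t) * np.cos(wd * t)

# Interior local maxima of the record (one damped period apart)
idx = np.nonzero((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1
peaks = x[idx]

# Logarithmic decrement between successive peaks -> damping ratio
delta = np.log(peaks[0] / peaks[1])
zeta_est = delta / np.sqrt(4.0 * np.pi**2 + delta**2)
```

The same peak-ratio idea applied to a simulated free-vibration response is one standard way to extract the damping that the STEEL/ETABS comparisons verify.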

Following this, a final study was done on Hall’s U20 structure [1], in which the structure was analyzed in all three programs and the results compared. The pushover curves from each program were compared and the differences caused by variations in software implementation explained. From this, conclusions can be drawn on the effectiveness of each analysis tool when attempting to analyze structures through the point of geometric instability. The analyses show that while ETABS was capable of accurately determining the elastic stiffness of the model, following the onset of inelastic behavior the analysis tool failed to converge. However, for the small number of time steps over which the ETABS analysis did converge, its results exactly matched those of STEEL, leading to the conclusion that ETABS is not an appropriate analysis package for analyzing a structure through the point of collapse when using fiber elements throughout the model. The analyses also showed that while Perform was capable of calculating the response of the structure accurately, restrictions in the material model resulted in a pushover curve that did not match that of STEEL exactly, particularly post-collapse. However, such problems could be alleviated by choosing a simpler material model.

Relevance:

30.00%

Publisher:

Abstract:

Ion channels are a large class of integral membrane proteins that allow for the diffusion of ions across a cellular membrane and are found in all forms of life. Pentameric ligand-gated ion channels (pLGICs) comprise a large family of proteins that include the nicotinic acetylcholine receptor (nAChR) and the γ-aminobutyric acid (GABA) receptor. These ion channels are responsible for the fast synaptic transmission that occurs in humans and as a result are of fundamental biological importance. pLGICs bind ligands (neurotransmitters), and upon ligand-binding undergo activation. The activation event causes an ion channel to enter a new physical state that is able to conduct ions. Ion channels allow for the flux of ions across the membrane through a pore that is formed upon ion channel activation. For pLGICs to function properly, both ligand-binding and ion channel activation must occur. The ligand-binding event has been studied extensively over the past few decades, and a detailed mechanism of binding has emerged. During activation the ion channel must undergo structural rearrangements that allow the protein to enter a conformation through which ions can flow. Despite this broad and ubiquitous importance, a fundamental understanding of the ion channel activation mechanism and kinetics, as well as the concomitant structural arrangements, remains elusive.

This dissertation describes efforts that have been made to temporally control the activation of ligand-gated ion channels. Temporal control of ion channel activation provides a means by which to activate ion channels when desired. The majority of this work examines the use of light to activate ion channels. Several photocages were examined in this thesis; photocages are molecules that release a ligand under irradiation, and, for the work described here, the released ligand then activates the ion channel. First, a new water-soluble photoacid was developed for the activation of proton-sensitive ion channels. Activation of acid-sensing ion channels, ASIC2a and GLIC, was observed only upon irradiation. Next, a variety of Ru2+ photocages were developed for the release of amine ligands. The Ru2+ systems interacted in a deleterious manner with a representative subset of biologically essential ion channels. The rapid mixing of ion channels with agonist was also examined. A detection system was built to monitor ion channel activation in the rapid mixing experiments. I have shown that liposomes, and functionally-reconstituted ELIC, are not destroyed during the mixing process. The work presented here provides the means to deliver agonist to ligand-gated ion channels in a controlled fashion.