926 results for complex polymerization method
Abstract:
A capillary electrophoresis method has been developed to study DNA-protein complexes by mobility-shift assay. This method is at least 100 times more sensitive than conventional gel mobility-shift procedures. Key features of the technique include the use of a neutral coated capillary, a small amount of linear polymer in the separation medium, and the use of covalently dye-labeled DNA probes that can be detected with a commercially available laser-induced fluorescence monitor. The capillary method provides quantitative data in runs requiring < 20 min, from which dissociation constants are readily determined. As a test case we studied interactions of a developmentally important sea urchin embryo transcription factor, SpP3A2. As few as 2–10 × 10⁶ molecules of specific SpP3A2-oligonucleotide complex were reproducibly detected, using recombinant SpP3A2, crude nuclear extract, egg lysates, and even a single sea urchin egg lysed within the capillary column.
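As a hedged illustration of how dissociation constants can be read out of such quantitative binding data, the short Python sketch below fits a simple 1:1 binding isotherm to hypothetical titration values; the data points, function names, and the use of scipy.optimize.curve_fit are assumptions for illustration, not part of the published method.

```python
# Minimal sketch: estimating a dissociation constant from mobility-shift data.
# Assumes the bound fraction follows a simple 1:1 binding isotherm,
# fraction_bound = [P] / (Kd + [P]); names and data values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def binding_isotherm(protein_conc, kd):
    """Fraction of labeled probe bound at a given free-protein concentration."""
    return protein_conc / (kd + protein_conc)

# Hypothetical titration data: protein concentration (nM) vs. bound fraction
# obtained from the integrated peak areas of free and complexed probe.
protein_nM = np.array([0.5, 1, 2, 5, 10, 20, 50])
frac_bound = np.array([0.08, 0.15, 0.27, 0.48, 0.64, 0.79, 0.90])

(kd_fit,), cov = curve_fit(binding_isotherm, protein_nM, frac_bound, p0=[5.0])
print(f"Estimated Kd ~ {kd_fit:.1f} nM (+/- {np.sqrt(cov[0, 0]):.1f})")
```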
Abstract:
The brain amyloid of Alzheimer disease (AD) may potentially be imaged in patients with AD by using neuroimaging technology and a radiolabeled form of the 40-residue β-amyloid peptide Aβ1-40 that is enabled to undergo transport through the brain capillary endothelial wall, which makes up the blood-brain barrier (BBB) in vivo. Transport of ¹²⁵I-labeled Aβ1-40 (¹²⁵I-Aβ1-40) through the BBB was found to be negligible in experiments with both an intravenous injection technique and an internal carotid artery perfusion method in anesthetized rats. In addition, ¹²⁵I-Aβ1-40 was rapidly metabolized after either intravenous injection or internal carotid artery perfusion. BBB transport was increased and peripheral metabolism was decreased by conjugation of monobiotinylated ¹²⁵I-Aβ1-40 to a vector-mediated drug delivery system, which consisted of a conjugate of streptavidin (SA) and the OX26 monoclonal antibody to the rat transferrin receptor, which undergoes receptor-mediated transcytosis through the BBB. The brain uptake of the ¹²⁵I,bio-Aβ1-40/SA-OX26 conjugate, expressed as percent of injected dose delivered per gram of brain, was 0.15 ± 0.01, a level 2-fold greater than the brain uptake of morphine. The binding of the ¹²⁵I,bio-Aβ1-40/SA-OX26 conjugate to the amyloid of AD brain was demonstrated by both film and emulsion autoradiography performed on frozen sections of AD brain. Binding of the ¹²⁵I,bio-Aβ1-40/SA-OX26 conjugate to the amyloid of AD brain was completely inhibited by high concentrations of unlabeled Aβ1-40. In conclusion, these studies show that BBB transport and access to amyloid within brain may be achieved by conjugation of Aβ1-40 to a vector-mediated BBB drug delivery system.
Abstract:
The nature of the alloreactive T-cell response is not yet clearly understood. These strong cellular responses are thought to be the basis of allograft rejection and graft-vs.-host disease. The question of the extent of responding T-cell repertoires has so far been addressed by cellular cloning, often combined with molecular T-cell receptor (TCR) analysis. Here we present a broad repertoire analysis of primed responder cells from mixed lymphocyte cultures in which two different DR1/3 responders were stimulated with DR3/4 cells. Repertoire analysis was performed by TCR spectratyping, a method by which T cells are analyzed on the basis of the complementarity-determining region 3 length of different variable region (V) families. Strikingly, both responders showed very similar repertoires when the TCR Vβ was used as a lineage marker. This was not seen when TCR Vα was analyzed. A different pattern of TCR Vβ was observed if the stimulating alloantigen was changed. This finding indicates that alloreactive T cells form a specific repertoire for each alloantigen. Since conservation appears to be linked to TCR Vβ, the question of different roles of the α and β chains in allorecognition is raised.
Abstract:
Upon photolysis at 355 nm, dioxygen is released from a (μ-peroxo)(μ-hydroxo)bis[bis(bipyridyl)cobalt(III)] complex in aqueous solutions and at physiological pH with a quantum yield of 0.04. The [Co(bpy)₂(H₂O)₂]²⁺ (bpy = bipyridyl) photoproduct was generated on a nanosecond or faster time scale as determined by time-resolved optical absorption spectroscopy. A linear correspondence between the spectral changes and the oxygen production indicates that O₂ is released on the same time scale. Oxyhemoglobin was formed from deoxyhemoglobin upon photodissociation of the (μ-peroxo)(μ-hydroxo)bis[bis(bipyridyl)cobalt(III)] complex, verifying that dioxygen is a primary photoproduct. This complex and other related compounds provide a method to study fast biological reactions involving O₂, such as the reduction of dioxygen to water by cytochrome oxidase.
Abstract:
Replication-incompetent retroviral vectors encoding histochemical reporter genes have been used for studying lineal relationships in a variety of species. A crucial element in the interpretation of data generated by this method is the identification of sibling relationships, or clonal boundaries. The use of a library of viruses in which each member is unique can greatly facilitate this aspect of the analysis. A previously reported murine retroviral library containing about 80 members demonstrated the utility of the library approach. However, the relatively low number of tags in the murine library necessitated using low infection rates in order to give confidence in clonal assignments. To obviate the need for low infection rates, a far more complex library was created and characterized. The CHAPOL library was constructed such that each member encodes a histochemical reporter gene and has a DNA tag derived from a degenerate oligonucleotide pool synthesized to have a complexity of > 1 × 10⁷. The library was tested after infection of cells in vitro or in vivo. The DNA tag from each histochemically labeled cell or clone of cells was recovered by PCR and sequenced for unambiguous identification. Three hundred and twenty tags have been identified after infection, and so far no tag has been seen to result from more than one independent infection. Thus, an equal distribution of inserts is suggested, and Monte Carlo analysis predicts a complexity of > 10⁴ members.
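The collision argument behind such clonal assignments can be illustrated with a small Monte Carlo sketch: given that 320 recovered tags were all distinct, one can ask how probable that outcome is for libraries of different sizes. The Python below is an illustrative birthday-problem simulation; the trial counts and library sizes are assumptions, not the authors' analysis.

```python
# Minimal sketch of the birthday-problem style reasoning behind the clonal-tag
# library: given n recovered tags with no duplicates, how large must the library
# be for that outcome to be likely? Names and parameters are illustrative.
import random

def prob_no_collision(library_size, n_tags, trials=2000):
    """Monte Carlo probability that n_tags draws from library_size are all distinct."""
    hits = 0
    for _ in range(trials):
        draws = [random.randrange(library_size) for _ in range(n_tags)]
        if len(set(draws)) == n_tags:
            hits += 1
    return hits / trials

n_recovered = 320  # tags sequenced, all unique in the reported experiment
for size in (10**3, 10**4, 10**5, 10**6):
    print(size, prob_no_collision(size, n_recovered))
# With ~10^3 members, duplicates among 320 draws are almost certain;
# only for much larger libraries do they become rare, consistent with
# a highly complex tag pool.
```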
Abstract:
Tubular polymerization reactors can exhibit a highly distorted velocity profile. Starting from this observation, a stochastic model based on the axial dispersion model was proposed for the mathematical representation of the fluid dynamics of a tubular reactor for polystyrene production. The differential equation was obtained by introducing randomness into the dispersion parameter, which added a stochastic term to the model capable of simulating the oscillations observed experimentally. The stochastic differential equation was discretized and solved satisfactorily by the Euler-Maruyama method. An estimator function was developed to obtain the parameter of the stochastic term, and the parameter of the deterministic term was calculated by the least-squares method. A convergence analysis was carried out to determine the number of discretization elements, and the model was validated by comparing trajectories and computational confidence intervals with experimental data. The results were satisfactory, which aids in understanding the complex fluid-dynamic behaviour of the reactor studied.
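For illustration, the Python sketch below shows the Euler-Maruyama discretization applied to a generic scalar stochastic differential equation dX = f(X) dt + g(X) dW; the drift and noise terms used here are placeholders and do not reproduce the reactor model described above.

```python
# Minimal sketch of the Euler-Maruyama scheme for dX = f(X) dt + g(X) dW.
# The drift/diffusion functions below are illustrative, not the reactor model.
import numpy as np

def euler_maruyama(f, g, x0, t_end, n_steps, seed=0):
    """Integrate dX = f(X) dt + g(X) dW on [0, t_end] with n_steps steps."""
    rng = np.random.default_rng(seed)
    dt = t_end / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))               # Wiener increment
        x[k + 1] = x[k] + f(x[k]) * dt + g(x[k]) * dW
    return x

# Example: mean-reverting fluctuation around a deterministic value.
traj = euler_maruyama(f=lambda x: 2.0 * (1.0 - x),      # deterministic (drift) term
                      g=lambda x: 0.3,                  # stochastic (noise) term
                      x0=1.2, t_end=5.0, n_steps=500)
print(traj[-1])
```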
Abstract:
Thermal degradation of PLA is a complex process, since it comprises many simultaneous reactions. Analytical techniques such as differential scanning calorimetry (DSC) and thermogravimetry (TGA) yield useful information, but a more sensitive analytical technique is needed to identify and quantify the PLA degradation products. In this work the thermal degradation of PLA at high temperatures was studied using a pyrolyzer coupled to a gas chromatograph with mass spectrometric detection (Py-GC/MS). Pyrolysis conditions (temperature and time) were optimized to obtain an adequate chromatographic separation of the compounds formed during heating. The best resolution of the chromatographic peaks was obtained by pyrolyzing the material from room temperature to 600 °C over 0.5 s. These conditions allowed the major compounds produced during PLA thermal degradation in an inert atmosphere to be identified and quantified. These operating parameters were selected by sequential pyrolysis combined with the fitting of mathematical models. Applying this strategy demonstrated that PLA degrades at high temperatures following non-linear behaviour. Both logistic and Boltzmann models fit the experimental results well, although the Boltzmann model provided the better estimate of the time at which 50% of the PLA was degraded. In conclusion, the Boltzmann model can be applied as a tool for simulating PLA thermal degradation.
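As a hedged illustration of the Boltzmann-model fitting step, the Python sketch below fits a Boltzmann sigmoid to hypothetical degradation data and reads off the time at which 50% of the material has degraded; the data values and parameter names are illustrative only, not taken from the study.

```python
# Minimal sketch, assuming degradation data as (time, fraction degraded) pairs:
# fit a Boltzmann sigmoid and read off t0, the time at which 50% of the PLA
# has degraded. Data values and names are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def boltzmann(t, a1, a2, t0, dt):
    """Boltzmann sigmoid: transitions from a1 to a2 around t0 with width dt."""
    return a2 + (a1 - a2) / (1.0 + np.exp((t - t0) / dt))

time_s = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0])    # pyrolysis time, s
degraded = np.array([0.02, 0.08, 0.22, 0.45, 0.68, 0.84, 0.95, 0.99])

popt, _ = curve_fit(boltzmann, time_s, degraded, p0=[0.0, 1.0, 0.4, 0.1])
print(f"t50 (time at 50% degradation) ~ {popt[2]:.2f} s")
```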
Abstract:
We present a derivative-free optimization algorithm coupled with a chemical process simulator for the optimal design of individual and complex distillation processes using a rigorous tray-by-tray model. The proposed approach serves as an alternative tool to the various models based on nonlinear programming (NLP) or mixed-integer nonlinear programming (MINLP). This is accomplished by combining the advantages of using a commercial process simulator (Aspen Hysys), including especially suited numerical methods developed for the convergence of distillation columns, with the benefits of the particle swarm optimization (PSO) metaheuristic algorithm, which does not require gradient information and has the ability to escape from local optima. Our method inherits the superstructure developed in Yeomans, H.; Grossmann, I. E. Optimal design of complex distillation columns using rigorous tray-by-tray disjunctive programming models. Ind. Eng. Chem. Res. 2000, 39(11), 4326–4335, in which the nonexisting trays are considered as simple bypasses of liquid and vapor flows. The implemented tool provides the optimal configuration of distillation column systems, which includes continuous and discrete variables, through the minimization of the total annual cost (TAC). The robustness and flexibility of the method are proven through the successful design and synthesis of three distillation systems of increasing complexity.
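For illustration, the following Python sketch shows a minimal particle swarm optimization loop of the kind referred to above; in the actual work the objective would be the total annual cost returned by the Aspen Hysys column model, whereas here a simple analytic test function and generic PSO parameters stand in as assumptions.

```python
# Minimal particle swarm optimization (PSO) sketch. A simple analytic function
# stands in for the simulator-based cost; all parameter values are illustrative.
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))          # particle positions
    v = np.zeros_like(x)                                       # particle velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                             # keep within bounds
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Stand-in objective (e.g. cost as a function of two continuous design variables).
best, cost = pso(lambda z: (z[0] - 3) ** 2 + (z[1] + 1) ** 2,
                 bounds=(np.array([-10.0, -10.0]), np.array([10.0, 10.0])))
print(best, cost)
```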
Abstract:
Society as we know it today is completely dependent on computer networks, the Internet, and distributed systems, which place at our disposal the services needed to perform our daily tasks. Moreover, often without our noticing, all of these services and distributed systems rely on network management systems. These systems allow us, in general, to maintain, manage, configure, scale, adapt, modify, edit, protect, or improve the main distributed systems. Their role is secondary, and it remains unknown and transparent to users: they provide the support needed to maintain the distributed systems whose services we use every day. If network management is not considered during the development stage of the main distributed systems, there can be serious consequences or even total failures in their development. It is therefore necessary to consider system management within the design of distributed systems and to systematize its conception so as to minimize the impact of network management on distributed systems projects. In this paper, we present a formalization method for the conceptual modelling of a network management system based on formal modelling tools, which makes it possible to go from the definition of processes to the identification of those responsible for them. Finally, we propose a use case in which a conceptual model of a network intrusion detection system is designed.
Abstract:
We present a targetless motion tracking method for detecting planar movements with subpixel accuracy. The method is based on computing and tracking the intersection of two nonparallel straight-line segments in the image of a moving object in a scene. It is simple and easy to implement because no complex structures have to be detected. It has been tested and validated in a lab experiment consisting of a vibrating object recorded with a high-speed camera working at 1000 fps. We managed to track displacements with an accuracy of hundredths of a pixel, or even thousandths of a pixel in the case of tracking harmonic vibrations. The method is widely applicable because it can be used for remote measurement of the amplitude and frequency of vibrations with a vision system.
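The core geometric step, computing the intersection of two nonparallel straight lines from segment endpoints with subpixel precision, can be sketched as follows in Python; the coordinates and helper name are illustrative, not taken from the paper.

```python
# Minimal sketch: given the endpoints of two nonparallel straight-line segments
# detected in an image, compute their intersection point with subpixel precision.
import numpy as np

def line_intersection(p1, p2, p3, p4):
    """Intersection of the (infinite) lines through p1-p2 and p3-p4."""
    p1, p2, p3, p4 = map(np.asarray, (p1, p2, p3, p4))
    d1, d2 = p2 - p1, p4 - p3
    denom = d1[0] * d2[1] - d1[1] * d2[0]                 # 2D cross product
    if abs(denom) < 1e-12:
        raise ValueError("segments are (nearly) parallel")
    t = ((p3[0] - p1[0]) * d2[1] - (p3[1] - p1[1]) * d2[0]) / denom
    return p1 + t * d1

# Tracking the intersection frame by frame yields the planar displacement.
ref = line_intersection((10.0, 20.0), (100.0, 25.0), (30.0, 5.0), (35.0, 90.0))
cur = line_intersection((10.4, 20.1), (100.4, 25.1), (30.4, 5.1), (35.4, 90.1))
print(cur - ref)   # subpixel displacement between the two frames
```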
Abstract:
Novel polymer/TiC nanocomposites (PPA/TiC, poly(PA-co-ANI)/TiC, and PANI/TiC) were successfully synthesized by chemical oxidation polymerization at room temperature using p-anisidine and/or aniline monomers and titanium carbide (TiC), in the presence of hydrochloric acid as a dopant and ammonium persulfate as oxidant. The nanocomposites obtained were characterized by Fourier transform infrared (FTIR) spectroscopy, X-ray diffraction (XRD), transmission electron microscopy (TEM), energy dispersive spectroscopy (EDS), and thermogravimetric analysis (TGA). XRD indicated the presence of interactions between the polymers and the TiC nanoparticles, and TGA revealed that the TiC nanoparticles improve the thermal stability of the polymers. The electrical conductivity of the nanocomposites is in the range of 0.079–0.91 S cm⁻¹. The electrochemical behavior of the polymers extracted from the nanocomposites was analyzed by cyclic voltammetry. A good electrochemical response was observed for the polymer films; the observed redox processes indicate that polymerization on TiC nanoparticles produces electroactive polymers. These nanocomposite microspheres can potentially be used in commercial applications as fillers for antistatic and anticorrosion coatings.
Abstract:
Numerical modelling methodologies are important for their application to engineering and scientific problems, because there are processes for which analytical mathematical expressions cannot be obtained. When the only available information is a set of experimental values for the variables that determine the state of the system, the modelling problem is equivalent to determining the hyper-surface that best fits the data. This paper presents a methodology based on the Galerkin formulation of the finite element method to obtain representations of relationships, defined a priori, between a set of variables: y = z(x1, x2, ..., xd). These representations are generated from the values of the variables in the experimental data. The piecewise approximation is an element of a Sobolev space and has derivatives defined in a general sense within this space. Using this approach requires inverting a linear system whose structure allows a fast solver algorithm. The algorithm can be used in a variety of fields, making it a multidisciplinary tool. The validity of the methodology is studied on two real applications: a problem in hydrodynamics and an engineering problem related to fluids, heat, and transport in an energy generation plant. The predictive capacity of the methodology is also tested using cross-validation.
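As a hedged, one-dimensional illustration of the idea, the Python sketch below represents y ≈ z(x) as a combination of piecewise-linear finite-element (hat) basis functions and obtains the nodal coefficients from scattered data by solving a small linear system; this is a least-squares analogue for illustration, not the paper's Galerkin formulation, and all names and data are assumptions.

```python
# Minimal 1D sketch: fit scattered data with piecewise-linear finite-element
# (hat) basis functions by solving a linear least-squares system.
import numpy as np

def hat_basis(x, nodes):
    """Evaluate all hat functions at points x; returns a (len(x), len(nodes)) matrix."""
    x = np.atleast_1d(x)
    h = nodes[1] - nodes[0]                       # uniform node spacing assumed
    return np.maximum(0.0, 1.0 - np.abs(x[:, None] - nodes[None, :]) / h)

# Scattered "experimental" data (illustrative).
rng = np.random.default_rng(1)
x_data = rng.uniform(0.0, 1.0, 200)
y_data = np.sin(2 * np.pi * x_data) + 0.1 * rng.normal(size=x_data.size)

nodes = np.linspace(0.0, 1.0, 11)
A = hat_basis(x_data, nodes)                      # design matrix
coeffs, *_ = np.linalg.lstsq(A, y_data, rcond=None)

# The fitted representation can now be evaluated anywhere in the domain.
print(hat_basis(np.array([0.25]), nodes) @ coeffs)   # ~ sin(pi/2) = 1
```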
Abstract:
Many multifactorial biologic effects, particularly in the context of complex human diseases, are still poorly understood. At the same time, the systematic acquisition of multivariate data has become increasingly easy. The use of such data to analyze and model complex phenotypes, however, remains a challenge. Here, a new analytic approach is described, termed coreferentiality, together with an appropriate statistical test. Coreferentiality is the indirect relation of two variables of functional interest with respect to whether they parallel each other in their respective relatedness to multivariate reference data, which can be informative for a complex effect or phenotype. It is shown that the power of coreferentiality testing is comparable to that of multiple regression analysis, is sufficient even when the reference data are informative only to a relatively small extent of 2.5%, and clearly exceeds the power of simple bivariate correlation testing. Coreferentiality testing thus uses the increased power of multivariate analysis while addressing a more straightforwardly interpretable bivariate relatedness. Systematic application of this approach could substantially improve the analysis and modeling of complex phenotypes, particularly in the context of human studies, where addressing functional hypotheses by direct experimentation is often difficult.
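Since the abstract does not give the exact statistic, the Python sketch below is only one possible reading of the idea: it compares the correlation profiles of two variables against a multivariate reference matrix and assesses their agreement with a permutation test. All names, data, and the test itself are illustrative assumptions, not the published method.

```python
# Hedged sketch: relate x and y indirectly via their profiles of correlation with
# the columns of a reference matrix R, then permutation-test the profile agreement.
import numpy as np

def correlation_profile(v, R):
    """Correlation of vector v with each column of reference matrix R."""
    return np.array([np.corrcoef(v, R[:, j])[0, 1] for j in range(R.shape[1])])

def coreferentiality(x, y, R, n_perm=500, seed=0):
    rng = np.random.default_rng(seed)
    prof_x, prof_y = correlation_profile(x, R), correlation_profile(y, R)
    stat = np.corrcoef(prof_x, prof_y)[0, 1]          # agreement of the two profiles
    null = np.array([
        np.corrcoef(correlation_profile(rng.permutation(x), R), prof_y)[0, 1]
        for _ in range(n_perm)
    ])
    p_value = (1 + np.sum(np.abs(null) >= abs(stat))) / (1 + n_perm)
    return stat, p_value

# Illustrative data: 100 samples, 20 reference variables sharing structure with x and y.
rng = np.random.default_rng(2)
R = rng.normal(size=(100, 20))
x = R @ rng.normal(size=20) + rng.normal(size=100)
y = R @ rng.normal(size=20) + rng.normal(size=100)
print(coreferentiality(x, y, R))
```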
Abstract:
In principle, the world and life itself are the contexts of theatrical events. The term context is broad and thus seems hardly usable. It only makes sense to use the term when terminologies and methodologies determine which parts of their contexts are to be incorporated and analysed for a given theatrical event. This presentation exemplifies a method that is particularly suitable for sensibly selecting the most important contexts for research in theatre history. The complexity of the representation increases continuously from The Presentation of Self in Everyday Life to Brecht's "Street Scene" and "Everyday Theatre", from portrayals of rulers in feasts and parades to Hamlet productions by the Royal Shakespeare Company or a Wagner opera in Bayreuth. The different forms of theatre thus constitute a continuum that spans from "everyday theatre" to "art theatre". The representation of the world in this continuum is sometimes questioned by the means of theatre itself, for example when the Commedia dell'arte takes a critical stance towards the representative theatre of the humanists, or when playful devices such as reversal, parody and fragmentation challenge the representative character of productions, as employed by the Vice character, for instance. There is a second component that acts on the continuum without a theatrical device: attitudes, opinions, norms and prohibitions that originate in society. As excerpts of contexts, they refer to single forms of theatre in the continuum. This results in a complex system of four components, which evolves from the panorama between the antipodes "everyday theatre" and "art theatre" as well as the two spheres of influence, of which only one uses theatrical devices. All components interact in a specific time frame, in a specific place, and in a specific way in each case, which can then be described as the theatricality of this time frame. This presentation will deal with what the concept is capable of doing.
Abstract:
Thesis (Master's)--University of Washington, 2016-06