911 results for Chromium reduction distillation, cold single step
Abstract:
BACKGROUND The variant Creutzfeldt-Jakob disease incidence peaked a decade ago and has since declined. Based on epidemiologic evidence, the causative agent, pathogenic prion, has not constituted a tangible contamination threat to large-scale manufacturing of human plasma-derived proteins. Nonetheless, manufacturers have studied the prion removal capabilities of various manufacturing steps to better understand product safety. Collectively analyzing the results could reveal experimental reproducibility and detect trends and mechanisms driving prion removal. STUDY DESIGN AND METHODS Plasma Protein Therapeutics Association member companies collected more than 200 prion removal studies on plasma protein manufacturing steps, including precipitation, adsorption, chromatography, and filtration, as well as combined steps. The studies used a range of model spiking agents and bench-scale process replicas. The results were grouped based on key manufacturing variables to identify factors impacting removal. The log reduction values of a group are presented for comparison. RESULTS Overall prion removal capacities evaluated by independent groups were in good agreement. The removal capacity evaluated using biochemical assays was consistent with prion infectivity removal measured by animal bioassays. Similar reduction values were observed for a given step using various spiking agents, except for highly purified prion protein in some circumstances. Comparison between combined and single-step studies revealed complementary or overlapping removal mechanisms. Steps with high removal capacities represent the conditions where the physicochemical differences between prions and therapeutic proteins are most significant. CONCLUSION The results support the intrinsic ability of certain plasma protein manufacturing steps to remove prions in the event of an unlikely contamination, providing a safeguard to products.
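A note on the comparison metric: the log reduction value (LRV) used throughout such clearance studies is the base-10 logarithm of the ratio of spiked prion load entering a step to the load recovered after it, and per-step LRVs are only summed across combined steps when the removal mechanisms are complementary rather than overlapping. The minimal Python sketch below illustrates the arithmetic with hypothetical loads; it is not taken from the studies summarised above.

```python
import math

def log_reduction_value(spike_load: float, recovered_load: float) -> float:
    """LRV of a single manufacturing step: log10 of input load over output load.

    Loads can be infectious units (animal bioassay) or prion-protein signal
    (biochemical assay); both are assumed to be on the same scale.
    """
    return math.log10(spike_load / recovered_load)

# Hypothetical two-step example: precipitation followed by depth filtration.
step1 = log_reduction_value(spike_load=1e7, recovered_load=1e4)   # 3.0 log10
step2 = log_reduction_value(spike_load=1e4, recovered_load=1e1)   # 3.0 log10

# Summing per-step LRVs (here 6.0 log10) is only justified when the steps
# remove prions by complementary, non-overlapping mechanisms.
print(step1, step2, step1 + step2)
```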
Abstract:
Olivier Danvy and others have shown the syntactic correspondence between reduction semantics (a small-step semantics) and abstract machines, as well as the functional correspondence between reduction-free normalisers (a big-step semantics) and abstract machines. The correspondences are established by program transformation (so-called interderivation) techniques. A reduction semantics and a reduction-free normaliser are interderivable when the abstract machine obtained from them is the same. However, the correspondences fail when the underlying reduction strategy is hybrid, i.e., when it relies on another sub-strategy. Hybridisation is an essential structural property of full-reducing and complete strategies. Hybridisation is unproblematic in the functional correspondence, but in the syntactic correspondence the refocusing and inlining-of-iterate-function steps become context sensitive, preventing the refunctionalisation of the abstract machine. We show how to solve the problem and showcase the interderivation of normalisers for normal order, the standard, full-reducing and complete strategy of the pure lambda calculus. Our solution makes it possible to interderive, rather than contrive, full-reducing abstract machines. As expected, the machine we obtain is a variant of Pierre Crégut's full Krivine machine KN.
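For readers unfamiliar with the machines being interderived, the sketch below gives a minimal Python rendering of the plain call-by-name Krivine machine over de Bruijn-indexed terms, the weak-head-reducing ancestor of Crégut's full-reducing KN machine. It is offered only as a reference point and is not the full-reducing machine derived in the paper.

```python
# Minimal call-by-name Krivine machine for de Bruijn-indexed lambda terms.
# Terms: ("var", n) | ("lam", body) | ("app", fun, arg)
# A closure pairs a term with the environment it was built in.

def krivine(term):
    env, stack = [], []            # environment: list of closures; stack: pending argument closures
    while True:
        tag = term[0]
        if tag == "app":           # push the argument as a closure, descend into the function
            stack.append((term[2], env))
            term = term[1]
        elif tag == "lam":
            if not stack:          # weak head normal form reached
                return term, env
            arg = stack.pop()      # bind the top of the stack and enter the body
            env = [arg] + env
            term = term[1]
        elif tag == "var":         # jump to the closure bound to the de Bruijn index
            term, env = env[term[1]]
        else:
            raise ValueError(f"unknown term tag: {tag}")

# (\x. x) (\y. y)  reduces to  \y. y
identity = ("lam", ("var", 0))
whnf, _ = krivine(("app", identity, identity))
print(whnf)                        # ('lam', ('var', 0))
```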
Abstract:
We have succeeded in constructing a stable full-length cDNA clone of strain H77 (genotype 1a) of hepatitis C virus (HCV). We devised a cassette vector with fixed 5′ and 3′ termini and constructed multiple full-length cDNA clones of H77 in a single step by cloning of the entire ORF, which was amplified by long reverse transcriptase–PCR, directly into this vector. The infectivity of two complete full-length cDNA clones was tested by the direct intrahepatic injection of a chimpanzee with RNA transcripts. However, we found no evidence for HCV replication. Sequence analysis of these and 16 additional full-length clones revealed that seven clones were defective for polyprotein synthesis, and the remaining nine clones had 6–28 amino acid mutations in the predicted polyprotein compared with the consensus sequence of H77. Next, we constructed a consensus chimera from four of the full-length cDNA clones with just two ligation steps. Injection of RNA transcripts from this consensus clone into the liver of a chimpanzee resulted in viral replication. The sequence of the virus recovered from the chimpanzee was identical to that of the injected RNA transcripts. This stable infectious molecular clone should be an important tool for developing a better understanding of the molecular biology and pathogenesis of HCV.
Abstract:
Lignocellulosic biomass is composed of cellulose, hemicellulose and lignin. Among these, lignin has attracted interest as a sustainable precursor source of aromatic fragments previously obtained only from fossil fuels. Its structure is composed of the phenylpropanoid residues p-hydroxyphenyl (H), guaiacyl (G) and syringyl (S) joined by C–C and C–O–C bonds, of which the β–O–4 linkage is predominant (more than 50%). Owing to its structural and conformational complexity, the cleavage of its bonds is poorly selective and the characterisation of the resulting fragments is complex. A strategy commonly employed to avoid these challenges is the use of simpler model compounds. However, few methodologies for their synthesis are reported in the literature, and most of them involve the use of haloketones. The present work developed two promising new methodologies for the synthesis of these oligomers containing the β–O–4 linkage by means of diazo chemistry: (a) an O–H insertion reaction between phenol and α-aryl diazoketones, and (b) α-diazo β-ketoester compounds. Furthermore, the use of monomers bearing both the phenol and diazoketone functions on the same ring would allow the synthesis of chains of various lengths in a single step. As a starting point for the study, the work was limited to the synthesis of dimers, aiming to understand the O–H insertion reaction. The desired products were obtained in 27–51% yields after catalysis with Cu(hfac)2. Finally, the lignin models themselves were synthesised after a simple aldol addition and reduction, in 51–78% overall yields. The studies involving phenol insertion into α-diazo β-ketoesters showed promising results, supporting a new synthetic strategy for obtaining lignin models. Further studies are under way in our laboratory to obtain more conclusive results.
Abstract:
The study evaluated sources of within- and between-subject variability in standard white-on-white (W-W) perimetry and short-wavelength automated perimetry (SWAP). The influence of staircase strategy on the fatigue effect in W-W perimetry was investigated for a 4 dB single step, single reversal strategy; a variable step size, single reversal dynamic strategy; and the standard 4-2 dB double reversal strategy. The fatigue effect increased as the duration of the examination increased and was greatest in the second eye for all strategies. The fatigue effect was lowest for the 4 dB strategy, which exhibited the shortest examination time, and was greatest for the 4-2 dB strategy, which exhibited the longest examination time. Staircase efficiency was lowest for the 4 dB strategy and highest for the dynamic strategy, which thus offers a reduced examination time and low inter-subject variability. The normal between-subject variability of SWAP was determined for the standard 4-2 dB double reversal strategy and the 3 dB single reversal FASTPAC strategy and compared to that of W-W perimetry. The decrease in sensitivity with increase in age was greatest for SWAP. The between-subject variability of SWAP was greater than that of W-W perimetry. Correction for the influence of ocular media absorption reduced the between-subject variability of SWAP. The FASTPAC strategy yielded the lowest between-subject variability in SWAP, but the greatest between-subject variability in W-W perimetry. The greater between-subject variability of SWAP has profound implications for the delineation of visual field abnormality. The fatigue effect for the Full Threshold strategy in SWAP was evaluated with conventional opaque, and with translucent, occlusion of the fellow eye. SWAP exhibited a greater fatigue effect than W-W perimetry. Translucent occlusion reduced the between-subject variability of W-W perimetry but increased the between-subject variability of SWAP. The elevation of sensitivity was greater with translucent occlusion, which has implications for the statistical analysis of W-W perimetry and SWAP. The influence of age-related cataract extraction and IOL implantation upon the visual field derived by W-W perimetry and SWAP was determined. Cataract yielded a general reduction in sensitivity which was preferentially greater in SWAP, even after correction of SWAP for the attenuation of the stimulus by the ocular media. There was no correlation between either backward or forward light scatter and the magnitude of the attenuation of W-W or SWAP sensitivity. The post-operative mean deviation in SWAP was positive, which has ramifications for the statistical interpretation of SWAP. Short-wavelength-sensitive (SWS) pathway isolation was assessed as a function of stimulus eccentricity using the two-colour increment threshold method. At least 15 dB of SWS pathway isolation was achieved for 440 nm, 450 nm and 460 nm stimuli at a background luminance of 100 cd m-2. There was a slight decrease in SWS pathway isolation for all stimulus wavelengths with increasing eccentricity, which was not of clinical significance. Adopting a 450 nm stimulus may reduce between-subject variability in SWAP due to a reduction in ocular media absorption and macular pigment absorption.
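To make the staircase strategies concrete, the sketch below simulates a 4 dB single-reversal staircase and the standard 4-2 dB double-reversal staircase against a hypothetical frequency-of-seeing observer. The observer model, starting level and slope are illustrative assumptions, not the perimeter's actual algorithm, but the presentation counts show why the 4-2 dB strategy tends to take longer.

```python
import math
import random

def sees(level, threshold, slope=1.0):
    """Hypothetical frequency-of-seeing curve: probability of seeing a stimulus
    presented at `level` dB (higher dB = dimmer) given the true threshold in dB."""
    return random.random() < 1.0 / (1.0 + math.exp((level - threshold) / slope))

def staircase(threshold, steps, reversals_needed, start=25):
    """Generic up/down staircase: seen -> dimmer (level up), not seen -> brighter (level down).
    `steps[k]` is the step size in dB used after the k-th reversal.
    Returns (crude threshold estimate, number of presentations)."""
    level, presentations, reversals = start, 0, 0
    last_seen = None
    while reversals < reversals_needed:
        seen = sees(level, threshold)
        presentations += 1
        if last_seen is not None and seen != last_seen:
            reversals += 1
        last_seen = seen
        step = steps[min(reversals, len(steps) - 1)]
        level += step if seen else -step
    return level, presentations

random.seed(1)
true_threshold = 30.0
# 4 dB single-reversal strategy (shortest) versus the standard 4-2 dB double-reversal strategy.
print("4 dB, 1 reversal  :", staircase(true_threshold, steps=[4], reversals_needed=1))
print("4-2 dB, 2 reversals:", staircase(true_threshold, steps=[4, 2], reversals_needed=2))
```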
Abstract:
The direct CO2 electrochemical reduction on model platinum single crystal electrodes Pt(hkl) is studied in [C2mim+][NTf2−], a suitable room temperature ionic liquid (RTIL) medium due to its moderate viscosity, high CO2 solubility and conductivity. Single crystal electrodes represent the most convenient type of surface-structured electrodes for studying the impact of RTIL ion adsorption on relevant electrocatalytic reactions, such as surface-sensitive electrochemical CO2 reduction. Based on cyclic voltammetry and in situ electrolysis measurements, we propose here, for the first time, the formation of a stable adduct [C2mimH–CO2−] by radical–radical coupling after the simultaneous reduction of CO2 and [C2mim+], that is, coupling between the CO2 radical anion and the radical formed from the reduction of the [C2mim+] cation before it forms the corresponding electrogenerated carbene. This is confirmed by the voltammetric study of a model imidazolium-2-carboxylate compound formed following the carbene pathway. The formation of this stable adduct [C2mimH–CO2−] blocks CO2 reduction after a single electron transfer and inhibits CO2 and imidazolium dimerization reactions. However, the electrochemical reduction of CO2 under these conditions provokes the electrochemical cathodic degradation of the imidazolium-based RTIL. This important limitation in CO2 recycling by direct electrochemical reduction is overcome by adding a strong acid, [H+][NTf2−], to the solution. Protons then become preferentially adsorbed on the electrode surface, displacing the imidazolium cations and inhibiting their electrochemical reduction. This allows the surface-sensitive electro-synthesis of HCOOH from CO2 reduction in [C2mim+][NTf2−], with Pt(110) being the most active electrode studied.
Abstract:
In this review, we detail the efforts made to couple the purification and the immobilization of industrial enzymes in a single step. The use of antibodies and the development of specific domains with affinity for particular supports will be reviewed. Moreover, we will discuss the use of domains that increase the affinity for standard matrices (ionic exchangers, silicates). We will show how control of the immobilization conditions may convert some unspecific supports into largely specific ones. The development of tailor-made heterofunctional supports as a tool to immobilize–stabilize–purify some proteins will be discussed in depth, using a low concentration of adsorbent groups and a dense layer of groups able to give an intense multipoint covalent attachment. The final coupling of mutagenesis and tailor-made supports will be the last part of the review.
Abstract:
Pears have been grown in the southern region of Brazil, where the climatic conditions are favourable. The aim of this work was to determine the harvest maturity index as well as the maximum storage period of 'Packham's Triumph' and 'Rocha' pears to maintain quality attributes. The 'Packham's Triumph' fruit were harvested from a commercial orchard at 7-day intervals and flesh firmness was used as a maturity index (MI1=76, MI2=67 and MI3=58 N). 'Rocha' pears were harvested twice and were considered as MI1 and MI3 on the basis of the firmness values. The fruit were stored at 1±1°C and 90-95% RH for 15, 30, 45 and 60 days and evaluated at the end of each storage period and after five days at room temperature (24±1°C), simulating a shelf-life period. Flesh firmness, water loss, peduncle dehydration, epidermis colour, soluble solids and titratable acidity were measured. 'Packham's' pears harvested at MI1 and MI2 showed firmness loss after 30 days of cold storage, whereas fruit harvested at MI3 retained the initial values, resulting in firmer fruit after 60 days (P<0.001). Fruit harvested at MI3 had less firmness loss after 5 days at room temperature following 45 and 60 days of cold storage. 'Rocha' pears harvested at MI1 and MI3 showed firmness reduction during cold storage, which was intensified at room temperature. Maximum values of water loss approached 6%. Fruit peduncles of both cultivars dehydrated after 60 days of cold storage, but their colour remained green, independent of harvest maturity index. 'Packham's Triumph' and 'Rocha' pears harvested at MI3 showed better quality attributes after 60 days of cold storage plus 5 days of shelf-life than fruit harvested at other maturity stages.
Abstract:
The population Monte Carlo algorithm is an iterative importance sampling scheme for solving static problems. We examine the population Monte Carlo algorithm in a simplified setting, a single step of the general algorithm, and study a fundamental problem that occurs in applying importance sampling to high-dimensional problems. The precision of the computed estimate from the simplified setting is measured by the asymptotic variance of the estimate under conditions on the importance function. We demonstrate the exponential growth of the asymptotic variance with the dimension and show that the optimal covariance matrix for the importance function can be estimated in special cases.
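The dimensional effect referred to here can be reproduced numerically. The sketch below, an illustration under simple assumptions rather than the paper's exact setting, uses a standard normal target and an over-dispersed normal proposal; the relative variance of the importance weights (and the corresponding collapse of the effective sample size) grows roughly exponentially with the dimension.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000

for d in (1, 2, 5, 10, 20, 40):
    # Target: N(0, I_d).  Proposal (importance function): N(0, 2^2 I_d).
    x = rng.normal(scale=2.0, size=(n, d))
    log_w = stats.norm.logpdf(x).sum(axis=1) - stats.norm.logpdf(x, scale=2.0).sum(axis=1)
    w = np.exp(log_w - log_w.max())                    # rescaled for numerical stability
    ess = w.sum() ** 2 / (w ** 2).sum()                # effective sample size
    rel_var = n * (w ** 2).sum() / w.sum() ** 2 - 1    # relative variance of the weights
    print(f"d={d:3d}  ESS={ess:10.1f}  relative weight variance={rel_var:12.2f}")
```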
Abstract:
This thesis addresses computational challenges arising from Bayesian analysis of complex real-world problems. Many of the models and algorithms designed for such analysis are ‘hybrid’ in nature, in that they are a composition of components whose individual properties may be easily described but whose performance as a whole is less well understood. The aim of this research project is to offer a better understanding of the performance of hybrid models and algorithms. The goal of this thesis is to analyse the computational aspects of hybrid models and hybrid algorithms in the Bayesian context. The first objective of the research focuses on computational aspects of hybrid models, notably a continuous finite mixture of t-distributions. In the mixture model, an inference of interest is the number of components, as this may relate to both the quality of model fit to data and the computational workload. The analysis of t-mixtures using Markov chain Monte Carlo (MCMC) is described and the model is compared to the Normal case based on the goodness of fit. Through simulation studies, it is demonstrated that the t-mixture model can be more flexible and more parsimonious in terms of number of components, particularly for skewed and heavy-tailed data. The study also reveals important computational issues associated with the use of t-mixtures, which have not been adequately considered in the literature. The second objective of the research focuses on computational aspects of hybrid algorithms for Bayesian analysis. Two approaches will be considered: a formal comparison of the performance of a range of hybrid algorithms and a theoretical investigation of the performance of one of these algorithms in high dimensions. For the first approach, the delayed rejection algorithm, the pinball sampler, the Metropolis adjusted Langevin algorithm, and the hybrid version of the population Monte Carlo (PMC) algorithm are selected as a set of examples of hybrid algorithms. The statistical literature often treats statistical efficiency as the only criterion for an efficient algorithm. In this thesis the algorithms are also considered and compared from a more practical perspective. This extends to the study of how individual algorithms contribute to the overall efficiency of hybrid algorithms, and highlights weaknesses that may be introduced by the combination process of these components in a single algorithm. The second approach to considering computational aspects of hybrid algorithms involves an investigation of the performance of the PMC in high dimensions. It is well known that as a model becomes more complex, computation may become increasingly difficult in real time. In particular, the importance sampling based algorithms, including the PMC, are known to be unstable in high dimensions. This thesis examines the PMC algorithm in a simplified setting, a single step of the general sampling scheme, and explores a fundamental problem that occurs in applying importance sampling to a high-dimensional problem. The precision of the computed estimate from the simplified setting is measured by the asymptotic variance of the estimate under conditions on the importance function. Additionally, the exponential growth of the asymptotic variance with the dimension is demonstrated, and we illustrate that the optimal covariance matrix for the importance function can be estimated in a special case.
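As background to the PMC algorithm discussed in the last two abstracts, a single iteration can be summarised as: sample a population from the current proposal, weight each draw by target over proposal, resample according to the weights, and adapt the proposal from the resampled population. The sketch below is a generic, minimal illustration of that one step with a moment-matched Gaussian proposal; it is not the specific hybrid PMC variant studied in the thesis.

```python
import numpy as np

def pmc_step(log_target, mean, cov, n, rng):
    """One population Monte Carlo iteration with a multivariate normal proposal:
    sample, importance-weight, resample, then adapt the proposal by moment matching."""
    d = len(mean)
    samples = rng.multivariate_normal(mean, cov, size=n)

    # Log importance weights: log target density minus log proposal density.
    diff = samples - mean
    prec = np.linalg.inv(cov)
    log_q = -0.5 * np.einsum("ij,jk,ik->i", diff, prec, diff)
    log_q -= 0.5 * (d * np.log(2 * np.pi) + np.linalg.slogdet(cov)[1])
    log_w = log_target(samples) - log_q
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    # Multinomial resampling followed by moment-matched proposal adaptation.
    resampled = samples[rng.choice(n, size=n, p=w)]
    return resampled.mean(axis=0), np.cov(resampled, rowvar=False)

# Toy target: standard normal in 5 dimensions, started from a poor proposal.
rng = np.random.default_rng(0)
log_target = lambda x: -0.5 * np.sum(x ** 2, axis=1)   # unnormalised N(0, I_5)
mean, cov = np.full(5, 3.0), 4.0 * np.eye(5)
for _ in range(5):
    mean, cov = pmc_step(log_target, mean, cov, 2000, rng)
print(mean.round(2))   # drifts towards the target mean of zero
```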
Abstract:
Campylobacter jejuni followed by Campylobacter coli contribute substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to large scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high resolution C. jejuni and C. coli genotyping schemes that are convenient for high throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost effective, interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat is frequently associated with human campylobacteriosis in Australia. Therefore, chicken derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi Locus Sequence Typing Single Nucleotide Polymorphisms) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. This present study showed that SNP + binary typing alone or in combination are effective at detecting epidemiological linkage between chicken derived Campylobacter isolates and enable data comparisons with other MLST based investigations. SNP + binary types obtained from chicken isolates in this study were compared with a previously SNP + binary and MLST typed set of human isolates. Common genotypes between the two collections of isolates were identified and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant in the human isolates. This genotype was, however, absent in the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long term epidemiological linkage of the isolates, the development of a High Resolution Melt (HRM) based curve analysis method to interrogate the hypervariable Campylobacter flagellin encoding gene (flaA) is described in Aim 2 of this study. The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short term epidemiological studies such as outbreak investigations. HRM curve analysis based flaA interrogation is a single-step closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA specific primers that did not amplify the flaB gene. HRM curve analysis flaA interrogation was successful at discriminating the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates and correlated to the epidemiological background of the isolates. 
In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM and fits the PHRANA (Progressive hierarchical resolving assays using nucleic acids) approach for genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method. Therefore, flaA HRM is a rapid and cost-effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics-driven method to interrogate Campylobacter MLST gene fragments using HRM, called 'SNP Nucleated Minim MLST' or 'Minim typing'. The method involves HRM interrogation of MLST fragments that encompass highly informative "Nucleating SNPs" to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using i) "Minimum SNPs" and ii) the new 'HRMtype' software packages. Species-specific sets of six "Nucleating SNPs" and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. 'Minim typing' was tested empirically by typing 15 C. jejuni and five C. coli isolates. The association of clonal complexes (CC) to each isolate by 'Minim typing' and SNP + binary typing was used to compare the two MLST interrogation schemes. The CCs linked with each C. jejuni isolate were consistent for both methods. Thus, 'Minim typing' is an efficient and cost-effective method to interrogate MLST genes. However, it is not expected to be independent of, or meet the resolution of, sequence-based MLST gene interrogation. 'Minim typing' in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform and is amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on the unified real-time PCR and HRM platform. They provide high resolution and are simple, cost-effective and ideally suited to rapid and high-throughput genotyping of these common food-borne pathogens.
Abstract:
Book summary: In a constantly evolving context of performance management, accountability and risk assessment, police organisations and frontline police officers are required to pay careful attention to what has come to be known as ‘at risk people’, ‘vulnerable populations’ or ‘vulnerable people’. Vulnerable people have become a key focus of policy. Concurrently, there have been stronger demands on police, and a steep increase in police powers in relation to their interaction with vulnerable people. The premise of this protectionist and interventionist agenda is threefold: to protect the rights of vulnerable individuals; to proactively cater for their vulnerability within the justice system; and to secure police operations and protocols within strict guidelines. This collection unpacks ‘vulnerable people policing’ in theory and practice and guides the reader through the policing process as it is experienced by police officers, victims, offenders, witnesses and justice stakeholders. Each chapter features a single step of the policing process: from police recruit education through to custody, and the final transfer of vulnerable people to courts and sentencing. This edited collection provides analytical, theoretical and empirical insights on vulnerable people policing, and reflects on critical issues in a domain that is increasingly subject to speedy conversion from policy to practice, and heightened media and political scrutiny. It breaks down policing practices, operations and procedures that have vulnerable populations as a focus, bringing together original and innovative academic research and literature, practitioner experience and discussion of policy implications (from local and international perspectives). The particular nature of this collection highlights the multi-disciplinary nature of police work, sheds light on how specific, mandatory policies guide police officers' steps in their interaction with vulnerable populations, and discusses the practicalities of police decision making at key points in this process.
Abstract:
We directly constructed reduced graphene oxide–titanium oxide nanotube (RGO–TNT) film using a single-step, combined electrophoretic deposition–anodization (CEPDA) method. This method, based on the simultaneous anodic growth of tubular TiO2 and the electrophoretic-driven motion of RGO, allowed the formation of an effective interface between the two components, thus improving the electron transfer kinetics. Composites of these graphitic carbons with different levels of oxygen-containing groups, electron conductivity and interface reaction time were investigated; a fine balance of these parameters was achieved.
Abstract:
Migraine is a painful and debilitating neurovascular disease. Current migraine head pain treatments work with differing efficacies in migraineurs. The opioid system plays an important role in diverse biological functions including analgesia, drug response and pain reduction. The A118G single nucleotide polymorphism (SNP) in exon 1 of the μ-opioid receptor gene (OPRM1) has been associated with elevated pain responses and decreased pain threshold in a variety of populations. The aim of the current preliminary study was to test whether genotypes of the OPRM1 A118G SNP are associated with head pain severity in a clinical cohort of female migraineurs. A total of 153 chronic migraine with aura sufferers were assessed for migraine head pain using the Migraine Disability Assessment Score instrument and classified into high and low pain severity groups. DNA was extracted and genotypes obtained for the A118G SNP. Logistic regression analysis adjusting for age effects showed the A118G SNP of the OPRM1 gene to be significantly associated with migraine pain severity in the test population (P = 0.0037). In particular, G118 allele carriers were more likely to be high pain sufferers compared to homozygous carriers of the A118 allele (OR = 3.125, 95% CI = 1.41, 6.93, P = 0.0037). These findings suggest that A118G genotypes of the OPRM1 gene may influence migraine-associated head pain in females. Further investigations are required to fully understand the effect of this gene variant on migraine head pain, including studies in males and in different migraine subtypes, as well as in response to head pain medication.
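The reported analysis, an age-adjusted logistic regression of pain-severity group on A118G genotype with the odds ratio taken from the genotype coefficient, can be sketched as below. The column names, the dominant coding of G118-allele carriage and the simulated data are illustrative assumptions only; this is not the study's code or data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data frame: one row per migraineur.
# high_pain: 1 = high pain severity group, 0 = low (from the MIDAS classification)
# g_carrier: 1 = carries at least one G118 allele, 0 = A118 homozygote
# age:       age in years, included as the adjustment covariate
df = pd.DataFrame({
    "high_pain": np.random.default_rng(0).integers(0, 2, 153),
    "g_carrier": np.random.default_rng(1).integers(0, 2, 153),
    "age": np.random.default_rng(2).uniform(18, 65, 153),
})

# Age-adjusted logistic regression of pain-severity group on G-allele carriage.
X = sm.add_constant(df[["g_carrier", "age"]])
fit = sm.Logit(df["high_pain"], X).fit(disp=False)

# Odds ratio and 95% confidence interval for the genotype effect.
odds_ratio = np.exp(fit.params["g_carrier"])
ci_low, ci_high = np.exp(fit.conf_int().loc["g_carrier"])
print(f"OR = {odds_ratio:.3f}, 95% CI = ({ci_low:.2f}, {ci_high:.2f}), "
      f"p = {fit.pvalues['g_carrier']:.4f}")
```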