965 results for Traditional enrichment method
Abstract:
In this paper we investigate the mixture adsorption of ethylene, ethane, nitrogen and argon on graphitized thermal carbon black and in slit pores by means of Grand Canonical Monte Carlo (GCMC) simulation. Pure-component adsorption isotherms on graphitized thermal carbon black are first characterized with the GCMC method, and then mixture simulations are carried out over a wide range of pore width, temperature, pressure and composition to investigate the cooperative and competitive adsorption of all species in the mixture. Results of the mixture simulations are compared with the experimental data of ethylene and ethane (Friederich and Mullins, 1972) on Sterling FTG-D5 (a homogeneous carbon black with a BET surface area of 13 m²/g) at 298 K over a pressure range of 1.3-93 kPa. Because of the cooperative effect, the Henry constant determined by the traditional chromatography method is always greater than that obtained from the volumetric method.
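For readers unfamiliar with the method, the core of a GCMC adsorption simulation is the pair of particle insertion/deletion moves whose acceptance rules sample the grand canonical ensemble. The following is a minimal, illustrative single-component Lennard-Jones sketch in reduced units (box size, temperature and chemical potential are assumed placeholder values; the paper itself treats multicomponent mixtures in slit pores):

```python
import math
import random

# Minimal GCMC sketch: one Lennard-Jones component in a periodic cubic
# box, reduced units (epsilon = sigma = 1, de Broglie wavelength = 1).
L = 8.0                    # box length (assumed)
V = L ** 3
beta = 1.0 / 1.2           # inverse temperature (assumed T* = 1.2)
z = math.exp(beta * -3.0)  # activity exp(beta*mu), assumed mu* = -3.0

def pair_energy(r2):
    """Lennard-Jones 12-6 potential, truncated at 3 sigma."""
    if r2 > 9.0:
        return 0.0
    inv6 = 1.0 / r2 ** 3
    return 4.0 * (inv6 * inv6 - inv6)

def interaction_energy(i, coords):
    """Energy of particle i with all others (minimum image convention)."""
    u = 0.0
    xi, yi, zi = coords[i]
    for j, (xj, yj, zj) in enumerate(coords):
        if j != i:
            dx = (xi - xj) - L * round((xi - xj) / L)
            dy = (yi - yj) - L * round((yi - yj) / L)
            dz = (zi - zj) - L * round((zi - zj) / L)
            u += pair_energy(dx * dx + dy * dy + dz * dz)
    return u

coords = []
for step in range(20000):
    if random.random() < 0.5:          # trial insertion
        coords.append((random.uniform(0, L), random.uniform(0, L),
                       random.uniform(0, L)))
        dU = interaction_energy(len(coords) - 1, coords)
        # acc = min(1, z*V/(N+1) * exp(-beta*dU)), N = count before insertion
        if random.random() >= min(1.0, z * V / len(coords) * math.exp(-beta * dU)):
            coords.pop()               # reject
    elif coords:                       # trial deletion
        i = random.randrange(len(coords))
        dU = -interaction_energy(i, coords)
        # acc = min(1, N/(z*V) * exp(-beta*dU))
        if random.random() < min(1.0, len(coords) / (z * V) * math.exp(-beta * dU)):
            coords.pop(i)              # accept removal

print("particles at end of run:", len(coords))
```

Averaging the particle number over many such steps at each chemical potential (mapped to bulk pressure through an equation of state) yields the adsorption isotherm; mixture simulations add one insertion/deletion pair per species.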
Abstract:
The worldwide trend towards deregulation of the electricity generation and transmission industries has led to dramatic changes in system operation and planning procedures. The optimum approach to transmission-expansion planning in a deregulated environment is an open problem, especially when the responsibilities of the organisations carrying out the planning work need to be addressed. To date, there is a consensus that the system operator and network manager perform the expansion planning work in a centralised way. However, with increasing input from the electricity market, the objectives, constraints and approaches toward transmission planning should be carefully designed to ensure system reliability while also meeting market requirements. A market-oriented approach for transmission planning in a deregulated environment is proposed. Case studies using the IEEE 14-bus system and the Australian national electricity market grid are performed. In addition, the proposed method is compared with a traditional planning method to further verify its effectiveness.
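As background, a common static transmission-expansion-planning formulation over the DC network model — the baseline that market-oriented variants such as the one proposed here extend with market-driven objectives and constraints — can be written as:

\[
\min_{n,\,g}\;\sum_{(i,j)\in\Omega} c_{ij}\,n_{ij}
\quad\text{s.t.}\quad
\begin{aligned}
&S f + g = d,\\
&f_{ij}-\gamma_{ij}\bigl(n^{0}_{ij}+n_{ij}\bigr)(\theta_i-\theta_j)=0,\\
&\lvert f_{ij}\rvert \le \bigl(n^{0}_{ij}+n_{ij}\bigr)\bar f_{ij},\qquad
0 \le g \le \bar g,\qquad n_{ij}\in\mathbb{Z}_{\ge 0},
\end{aligned}
\]

where \(n_{ij}\) is the number of circuits added in corridor \((i,j)\), \(c_{ij}\) their cost, \(S\) the node-branch incidence matrix, \(f\) the branch flows, \(g\) the generation dispatch and \(d\) the demand.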
Abstract:
Carbons with slit-like pores can serve as effective host materials for the storage of hythane fuel, a bridge between petrol combustion and hydrogen fuel cells. We have used grand canonical Monte Carlo simulation to model the storage of hydrogen-methane mixtures at 293 K and mixture pressures up to 2 MPa. We have found that these pores serve as efficient vessels for the storage of hythane fuel near ambient temperature and low pressure. For carbons having optimized slit-like pores of width H ≈ 7 Å (a pore width that can accommodate one adsorbed methane layer) and a bulk hydrogen mole fraction ≥ 0.9, the volumetric stored energy exceeds the 2010 target of 5.4 MJ dm⁻³ established by the U.S. FreedomCAR Partnership. At the same conditions, the hydrogen content in the slit-like carbon pores is ≈ 7% by energy. Thus, we have obtained the composition corresponding to hythane fuel in carbon nanospaces with greatly enhanced volumetric energy in comparison with the traditional compression method. We propose a simple system with an added extra container filled with pure free/adsorbed methane for adjusting the composition of the desorbed mixture as needed during delivery. Our simulation results indicate that light slit-pore carbon nanomaterials with optimized parameters are suitable filling vessels for the storage of hythane fuel. The proposed system, consisting of a main vessel with physisorbed hythane fuel and an extra container filled with pure free/adsorbed methane, will be particularly suitable for the combustion of hythane fuel in buses and passenger cars near ambient temperature and low pressure.
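As a rough plausibility check on the 5.4 MJ dm⁻³ figure, volumetric stored energy can be estimated from the stored molar densities and lower heating values (≈50 MJ/kg for CH₄, ≈120 MJ/kg for H₂); the stored amounts below are illustrative assumptions, not the paper's simulation output:

```python
# Back-of-envelope volumetric energy of a stored CH4/H2 mixture.
LHV = {"CH4": 50.0, "H2": 120.0}                # lower heating value, MJ/kg
MOLAR_MASS = {"CH4": 16.04e-3, "H2": 2.016e-3}  # kg/mol

def volumetric_energy(stored_mol_per_litre):
    """MJ per dm^3 given the stored density (mol/L) of each component."""
    return sum(n * MOLAR_MASS[gas] * LHV[gas]
               for gas, n in stored_mol_per_litre.items())

# Illustrative loading: 6.3 mol/L adsorbed CH4 plus 1.6 mol/L H2
print(volumetric_energy({"CH4": 6.3, "H2": 1.6}))
# ~5.4 MJ/dm^3, with H2 contributing ~7% of the energy
```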
Abstract:
By focusing on developments between 1996 and 2006, this paper explains the reasons for one of Australia’s public health inconsistencies: the comparatively low adoption of adjusted water fluoridation in Queensland. The work involved a literature review and the traditional historical method. In Queensland, parliamentary support for water fluoridation is conditional on community approval. Political ambivalence and the constraints of the “Fluoridation of Public Water Supplies Act (1963)” Qld have hindered the advocacy of water fluoridation. The political circumstances surrounding the “Lord Mayor’s Taskforce on Fluoridation Report” (1997) influenced its findings and confirm that Australia’s biggest local authority, the Brisbane City Council, failed to authoritatively analyse water fluoridation. In 2004, a private member’s bill to mandate fluoridation failed in spectacular fashion. In 2005, an official systems review of Queensland Health recommended public debate about water fluoridation. Our principal conclusion is that, without mandatory legislation, widespread implementation of water fluoridation in Queensland is most unlikely.
Abstract:
Today, speciality organoclays are being developed for an increasingly large number of specific applications. Many of these, including use in cosmetics, polishes, greases and paints, require that the material be free from abrasive impurities so that the product retains a smooth `feel'. The traditional `wet' method of preparing organoclays inherently removes abrasives naturally present in the parent mineral clay, but it is time-consuming and expensive. The primary objective of this thesis was to explore the alternative `dry' method (which is both quicker and cheaper, but which provides no refining of the parent clay) as a process, and to examine the nature of the organoclays produced, with a view to producing a wide range of commercially usable organophilic clays in a facile way. Natural Wyoming bentonite contains two quite different types of silicate surface (that of the clay mineral montmorillonite and that of a quartz impurity) that may interact with the cationic surfactant added during `dry' process production of organoclays. However, it is oil shale, and not the quartz, that is chiefly responsible for the abrasive nature of the material, although air refinement combined with controlled milling of the bentonite as a pretreatment may offer a route to its removal. Ion exchange of Wyoming bentonite with a long-chain quaternary ammonium salt using the `dry' process affords a partially exchanged (69-78%) organoclay, with a monolayer of ammonium ions in the interlayer. Excess ion pairs are sorbed on the silicate surfaces of both the clay mineral and the quartz impurity phases. Such surface sorption is enhanced by the presence of very finely divided, superparamagnetic Fe2O3 or Fe(O)(OH) contaminating the surfaces of the major mineral components. The sorbed material is labile to washing, and induces a measurable shielding of the 29Si nuclei in both clay and quartz phases in the MAS NMR experiment, owing to an anisotropic magnetic susceptibility effect. XRD data for humidified samples reveal the interlamellar regions to be strongly hydrophobic, with the by-product sodium chloride being expelled to the external surfaces. Many organic cations will exchange onto a clay. The tetracationic cyclophane and multipurpose receptor cyclobis(paraquat-p-phenylene) undergoes ion exchange onto Wyoming bentonite to form a pillared clay with a very regular gallery height. The major plane of the cyclophane is normal to the silicate surfaces, allowing the cavity to remain available for complexation. A series of group VI substituted o-dimethoxybenzenes was introduced and shown to participate in host/guest interactions with the cyclophane. Evidence is given which suggests that binding the host structure to a clay substrate offers advantages, not only of transportability and usability but also of stability, to the charge-transfer complex, which may prove useful in a variety of commercial applications. The fundamental relationship between particle size, cation exchange capacity and chemical composition of clays was also examined. For Wyoming bentonite the extent of isomorphous substitution increases with decreasing particle size, causing the CEC to increase similarly, although the isomorphous substitution site:edge site ratio remains invariant throughout the particle size range studied.
Abstract:
Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques in use were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). As current cost estimating methods do not take account of these developments, their lack of universality means they cannot provide adequate estimates of effort and hence cost. In order to address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing MkII function point method. The other method, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from the process structure diagrams and thus provides an input to the traditional COCOMO method. The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. It uses counts of various attributes of a JSD specification to develop a metric which provides an indication of the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and utilising this figure to predict the effort and hence cost of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects. The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
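For reference, the back end that JSD-COCOMO feeds into is Boehm's COCOMO model; in its basic form, effort and schedule follow directly from size in thousands of delivered source instructions (KDSI). A small sketch using Boehm's published basic-mode coefficients (the JSD-specific sizing step from process structure diagrams is not reproduced here):

```python
# Basic COCOMO: effort (person-months) and schedule (months) from size
# in KDSI. Coefficients are Boehm's basic-COCOMO values per project mode.
COEFF = {  # mode: (a, b, c, d)
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kdsi, mode="organic"):
    a, b, c, d = COEFF[mode]
    effort = a * kdsi ** b        # person-months
    schedule = c * effort ** d    # elapsed months
    return effort, schedule

effort, months = basic_cocomo(32.0)  # e.g. a 32-KDSI organic project
print(f"{effort:.1f} person-months over {months:.1f} months")
```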
Abstract:
The first clinically proven nicotine replacement product to obtain regulatory approval was Nicorette® gum. It provides a convenient way of delivering nicotine directly to the buccal cavity, thus circumventing 'first-pass' elimination following gastrointestinal absorption. Since launch, Nicorette® gum has been investigated in numerous clinical studies, which are often difficult to compare owing to large variations in study design and degree of sophistication. In order to standardise testing, in 2000 the European Pharmacopoeia introduced an apparatus to investigate the in vitro release of drug substances from medicated chewing gum. Using this chewing machine, the main aims of this project were to determine factors that could affect release from Nicorette® gum, to develop an in vitro-in vivo correlation (IVIVC), and to investigate the effect of formulation variables on the release of nicotine from gums. A standard in vitro test method was developed: the gum was placed in the chewing chamber with 40 mL of artificial saliva at 37 °C and chewed at 60 chews per minute. The chew rate and the type, pH, volume, temperature and ionic strength of the dissolution medium were then altered to investigate their effects on release in vitro. Increasing the temperature of the dissolution medium and the rate at which the gums were chewed resulted in a greater release of nicotine, whilst increasing the ionic strength of the dissolution medium to 80 mM resulted in a lower release. The addition of 0.1% sodium lauryl sulphate to the artificial saliva was found to double the release of nicotine compared with artificial saliva and water alone, whereas altering the dissolution volume and the starting pH did not affect release. The increase in pH may be insufficient to provide optimal conditions for nicotine absorption, since the rate at which nicotine is transported through the buccal membrane was found to be higher at pH values greater than 8.6, where nicotine is predominantly unionised. Using a time-mapping function, it was also possible to establish a level A in vitro-in vivo correlation: 4 mg Nicorette® gum was chewed at various chew rates in vitro and correlated with an in vivo chew-out study. All chew rates used in vitro could be used successfully for IVIVC purposes; statistically, however, chew rates of 10 and 20 chews per minute performed better than all other chew rates. Finally, a series of nicotine gums was made to investigate the effect of formulation variables on the release of nicotine from the gum. Gums made with a directly compressible gum base crumbled when chewed in vitro, in comparison with Nicorette®, resulting in a faster release of nicotine. To investigate the effects of altering the gum base, the concentration of sodium salts, the sugar syrup, the form of the active drug, the addition sequence and the incorporation of surfactant into the gum, the traditional manufacturing method was used to make a series of gum formulations. Results showed that the time of addition of the active drug, the incorporation of surfactants and the use of a different gum base all increased the release of nicotine from the gum. In contrast, reducing the concentration of sodium carbonate resulted in a lower release. Using a stronger nicotine ion-exchange resin delayed the release of nicotine from the gum, whilst altering the concentration of sugar syrup had little effect on release but altered the texture of the gum.
Abstract:
WHAT IS ALREADY KNOWN ABOUT THIS SUBJECT
• The cytotoxic effects of 6-mercaptopurine (6-MP) were found to be due to drug-derived intracellular metabolites (mainly 6-thioguanine nucleotides and, to some extent, 6-methylmercaptopurine nucleotides) rather than the drug itself.
• Current empirical dosing methods for oral 6-MP result in highly variable drug and metabolite concentrations and hence variability in treatment outcome.
WHAT THIS STUDY ADDS
• The first population pharmacokinetic model has been developed for 6-MP active metabolites in paediatric patients with acute lymphoblastic leukaemia, and the potential demographic and genetically controlled factors that could lead to interpatient pharmacokinetic variability in this population have been assessed.
• The model shows a large reduction in interindividual variability of pharmacokinetic parameters when body surface area and thiopurine methyltransferase polymorphism are incorporated into the model as covariates.
• The developed model offers a more rational dosing approach for 6-MP than the traditional empirical method (based on body surface area) by combining it with pharmacogenetically guided dosing based on thiopurine methyltransferase genotype.
AIMS - To investigate the population pharmacokinetics of 6-mercaptopurine (6-MP) active metabolites in paediatric patients with acute lymphoblastic leukaemia (ALL) and examine the effects of various genetic polymorphisms on the disposition of these metabolites.
METHODS - Data were collected prospectively from 19 paediatric patients with ALL (n = 75 samples, 150 concentrations) who received 6-MP maintenance chemotherapy (titrated to a target dose of 75 mg m⁻² day⁻¹). All patients were genotyped for polymorphisms in three enzymes involved in 6-MP metabolism. Population pharmacokinetic analysis was performed with the nonlinear mixed-effects modelling program NONMEM to determine the population mean parameter estimate of clearance for the active metabolites.
RESULTS - The developed model revealed considerable interindividual variability (IIV) in the clearance of 6-MP active metabolites [6-thioguanine nucleotides (6-TGNs) and 6-methylmercaptopurine nucleotides (6-mMPNs)]. Body surface area explained a significant part of the IIV in 6-TGN clearance when incorporated into the model (IIV reduced from 69.9% to 29.3%). The most influential covariate examined, however, was thiopurine methyltransferase (TPMT) genotype, which resulted in the greatest reduction in the model's objective function (P < 0.005) when incorporated as a covariate affecting the fractional metabolic transformation of 6-MP into 6-TGNs. The other genetic covariates tested were not statistically significant and were therefore not included in the final model.
CONCLUSIONS - The developed pharmacokinetic model (if successful at external validation) would offer a more rational dosing approach for 6-MP than the traditional empirical method, since it combines the current practice of using body surface area in 6-MP dosing with pharmacogenetically guided dosing based on TPMT genotype.
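One common way such covariate effects are parameterised in NONMEM-style population models (a generic sketch; this study applied the TPMT effect to the fractional conversion of 6-MP into 6-TGNs rather than to clearance) is multiplicatively on clearance:

\[
CL_i \;=\; \theta_{CL}\cdot\left(\frac{BSA_i}{1.73}\right)^{\theta_{BSA}}\cdot \theta_{TPMT}^{\,I_i}\cdot e^{\eta_i},
\qquad \eta_i \sim \mathcal{N}(0,\omega^2),
\]

where \(I_i = 1\) for carriers of a variant TPMT allele and 0 otherwise, the \(\theta\)'s are fixed effects and \(\eta_i\) is the interindividual random effect; as covariates explain more of the variability, the estimated \(\omega^2\) shrinks, which is exactly the IIV reduction reported above.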
Abstract:
BACKGROUND: Contrast detection is an important aspect of the assessment of visual function; however, clinical tests evaluate only a limited range of spatial frequencies and contrasts. This study validates the accuracy and inter-test repeatability of a swept-frequency near and distance mobile app Aston contrast sensitivity test, which overcomes this limitation of traditional charts. METHOD: Twenty subjects wearing their full refractive correction underwent contrast sensitivity testing on the new near application (near app), distance app, CSV-1000 and Pelli-Robson charts with full correction and with vision degraded by 0.8 and 0.2 Bangerter degradation foils. In addition, repeated measures using the 0.8 occluding foil were taken. RESULTS: The mobile apps (near more than distance, p = 0.005) recorded a higher contrast sensitivity than the printed tests (p < 0.001); however, all charts showed a reduction in measured contrast sensitivity with degradation (p < 0.001) and a similar decrease with increasing spatial frequency (interaction: p > 0.05). Although the coefficient of repeatability was lowest for the Pelli-Robson charts (0.14 log units), the mobile app charts measured more spatial frequencies, took less time and were more repeatable (near: 0.26 to 0.37 log units; distance: 0.34 to 0.39 log units) than the CSV-1000 (0.30 to 0.93 log units). The duration to complete the CSV-1000 was 124 ± 37 seconds, the Pelli-Robson 78 ± 27 seconds, the near app 53 ± 15 seconds and the distance app 107 ± 36 seconds. CONCLUSIONS: While there were differences between charts in the contrast levels measured, the new Aston near and distance apps are a valid, repeatable and time-efficient method of assessing contrast sensitivity at multiple spatial frequencies.
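The repeatability figures quoted are coefficients of repeatability, conventionally computed Bland-Altman style from the test-retest differences (assuming that is the definition used here):

\[
\mathrm{CoR} \;=\; 1.96 \times s_d,
\]

where \(s_d\) is the standard deviation of the differences between the two administrations, so about 95% of repeat measurements on the same eye are expected to lie within ±CoR of each other.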
Abstract:
Saturation mutagenesis is a powerful tool in modern protein engineering, permitting key residues within a protein to be targeted in order to potentially enhance specific functionalities. However, the creation of large libraries using conventional saturation mutagenesis with degenerate codons (NNN or NNK/S) has inherent redundancy and consequent disparities in codon representation. Both chemical (trinucleotide phosphoramidite) and biological (sequential, enzymatic single-codon addition) methods of non-degenerate saturation mutagenesis have therefore been developed to combat these issues and so improve library quality. Large libraries with multiple saturated positions can be limited by the method used to screen them. Although they are the traditional screening methods of choice, cell-dependent methods, such as phage display, are limited by the need for transformation. A number of cell-free screening methods, such as CIS display, which link the screened phenotype with the encoded genotype, are capable of screening libraries with up to 10¹⁴ members. This thesis describes the further development of ProxiMAX technology to reduce library codon bias, and its integration with CIS display to screen the resulting library. Synthetic MAX oligonucleotides are ligated to an acceptor base sequence, amplified and digested, thereby adding a randomised codon to the acceptor; this forms an iterative cycle in which the digested product of one cycle serves as the base sequence for the next. Initial use of ProxiMAX highlighted areas of the process where changes could be implemented to improve codon representation in the final library. The refined process was used to construct a monomeric anti-NGF peptide library based on two proprietary dimeric peptides (Isogenica) that bind NGF. The resulting library showed greatly improved codon representation, equating to a theoretical diversity of ~69%. The library was subsequently screened using CIS display, and the discovered peptides were assessed for NGF-TrkA inhibition by ELISA. Despite binding to TrkA, these peptides showed lower levels of inhibition of the NGF-TrkA interaction than the parental dimeric peptides, highlighting the importance of dimerization for inhibition of NGF-TrkA binding.
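The redundancy problem with degenerate codons that motivates non-degenerate methods such as ProxiMAX is easy to quantify from the genetic code. A short sketch counting how many NNK codons (K = G/T) encode each amino acid:

```python
from collections import Counter

# Standard genetic code, built from the canonical TCAG ordering.
BASES = "TCAG"
AA = ("FFLLSSSSYY**CC*W"   # first base T
      "LLLLPPPPHHQQRRRR"   # first base C
      "IIIMTTTTNNKKSSRR"   # first base A
      "VVVVAAAADDEEGGGG")  # first base G
CODE = {a + b + c: AA[16 * i + 4 * j + k]
        for i, a in enumerate(BASES)
        for j, b in enumerate(BASES)
        for k, c in enumerate(BASES)}

# NNK: any base at positions 1-2, G or T at position 3 (32 codons).
nnk = Counter(CODE[a + b + k] for a in BASES for b in BASES for k in "GT")
print(sorted(nnk.items(), key=lambda kv: -kv[1]))
# Leu/Arg/Ser get 3 codons each, many residues only 1, plus one stop
# (TAG): this 3:1 skew is the codon bias non-degenerate methods remove.
```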
Development of the base cell of periodic composite microstructures under topology optimization
Abstract:
This thesis develops a new technique for the design of composite microstructures via topology optimization, with the aim of maximizing stiffness, making use of the strain energy method and an h-adaptive refinement scheme to better define the topological contours of the microstructure. This is done by distributing material optimally within a pre-established design region named the base cell. The finite element method is used to describe the field and to solve the governing equation. The mesh is refined iteratively, with refinement applied to all elements representing solid material and to all void elements containing at least one node in a solid-material region. The finite element chosen for the model is the three-node linear triangle. The resulting nonlinear programming problem with constraints is solved by the augmented Lagrangian method, with a minimization algorithm based on quasi-Newton search directions and the Armijo-Wolfe conditions assisting the descent process. The base cell that represents the composite is found from the equivalence between a fictitious material and a prescribed material distributed optimally in the design region. The use of the strain energy method is justified by its lower computational cost, owing to a simpler formulation than the traditional homogenization method. Results are presented for changes in the prescribed displacement, changes in the volume constraint, and various initial values of the relative densities.
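In density-based form, the stiffness-maximization problem solved over the base cell is the usual compliance (strain energy) minimization with a volume constraint — written here in its generic discretized form, as a sketch of the problem class rather than the thesis's exact statement:

\[
\min_{\boldsymbol\rho}\; C(\boldsymbol\rho)=\mathbf{U}^{\mathsf T}\mathbf{K}(\boldsymbol\rho)\,\mathbf{U}
\quad\text{s.t.}\quad
\mathbf{K}(\boldsymbol\rho)\,\mathbf{U}=\mathbf{F},\qquad
\sum_e v_e\,\rho_e \le \bar V,\qquad
0<\rho_{\min}\le\rho_e\le 1,
\]

with \(\mathbf{K}(\boldsymbol\rho)=\sum_e \rho_e^{\,p}\,\mathbf{K}_e\) for a penalization exponent \(p>1\); the constrained problem is then amenable to the augmented Lagrangian treatment described above.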
Abstract:
During the last decades, the growth and development of Information and Communication Technologies (ICT) have led us to a new social paradigm that reflects a deep change not only in individual but also in social behavioural patterns. All these changes define the so-called knowledge and information society. This social evolution has had different textual states and types as its main core and instrument of transformation; therefore, scholars and specialists in the field of the Humanities have not missed the opportunity to study recent phenomena arising from the current setting created by new technologies. Researchers in the Humanities have had to reconsider their traditional working method, based on printed text, in order to analyse how we nowadays search for data, select them, analyse them, and create and spread new information and knowledge. With this scenario in which the humanist works in mind, the concept of Digital Humanities has arisen, conditioned by the existence of cyberspace, digital text, hypertext, on-line text, the implementation of new forms of communication, global access to information, and the various elements which build a common methodology for all humanistic disciplines. Most of these changes have affected the educational system and education as an academic and humanistic discipline.
Abstract:
A molecular profiling system using directed terminal-restriction fragment length polymorphism (dT-RFLP) was previously developed to characterize soil nematode assemblages by the relative abundance of feeding guilds, validated by comparison with the traditional morphological method. The good performance of these molecular tools applied to soil nematode assemblages creates an opportunity to develop a novel approach for rapid assessment of biodiversity changes in benthic nematode assemblages of marine and estuarine sediments. The main aim of this research is to combine morphological and molecular analyses of estuarine nematode assemblages in order to establish a tool for the fast assessment of biodiversity changes during habitat recovery of Zostera noltii seagrass beds, and to validate dT-RFLP as a high-throughput tool for assessing system recovery. It was also proposed to develop a database of sequences linked to individuals identified at species level, to provide a new taxonomic reference system. A molecular phylogenetic analysis of the estuarine nematodes has been performed. After morphological identification, 18S rDNA barcodes are being determined for each nematode species, and the results have shown a good degree of concordance between traditional morphology-based identification and DNA sequences. The digest strategy developed for soil nematodes proved unsuitable for marine nematodes, so five samples were cloned and sequenced, and the sequence data were used to design a new dT-RFLP strategy adapted to marine assemblages. Several candidate digest schemes were proposed by DRAT and tested empirically to select the one that cuts most efficiently, separating the different clusters. Quantitative PCR showed differences in nematode density between two sampling stations consistent with the densities obtained by the traditional methods. These results suggest that qPCR could be a robust tool for the enumeration of nematode abundance, saving time.
Abstract:
This research project aims to shed light on the spice trade in the Middle Ages, starting from the precious data contained in the customs registers of Bologna (1388-1448), which recorded all the products subject to the so-called "dazio della mercanzia" (merchandise duty) that passed through the city before continuing their journey to other destinations. In the Middle Ages, Bologna was an important hub connecting the main emporia of the Adriatic Sea (first among them Venice) with the markets of Tuscany, such as Florence, Pisa and its maritime outlet, Porto Pisano, from which spices sailed towards other European regions, such as France, England, the Iberian peninsula and Flanders. The daily, monthly, annual and total quantities of spices constitute unpublished and unexpected data: a product traditionally described by historiography as rare, precious and difficult to obtain in fact flowed through with surprising regularity and in very high volumes. Considering that Bologna, despite its importance within Italy, was still a "minor" node in the complex network of commercial circuits along which spices usually travelled (such as the great maritime routes, for example), these very high quantities of spices oblige us to reconsider what has been said so far about the trade in these products in the Middle Ages, and to compare the Bologna data with those from other sources. By combining the traditional historiographical method with an "empirical" approach that takes into account the material and organoleptic characteristics of the spices, as well as information drawn from a wide range of sources - not necessarily tied to the period under examination - it is possible to reopen the debate around this topic, which still has much to offer to food history research.
Abstract:
Modern lifestyles have markedly changed eating habits worldwide, with an increasing demand for ready-to-eat foods such as minimally processed fruits and leafy greens. The packaging and storage conditions of these products may favour the growth of psychrotrophic bacteria, including the pathogen Listeria monocytogenes. In this work, minimally processed leafy vegetable samples (n = 162) from the retail market of Ribeirao Preto, Sao Paulo, Brazil, were tested for the presence or absence of Listeria spp. by the immunoassay Listeria Rapid Test (Oxoid). Two L. monocytogenes-positive and six artificially contaminated samples of minimally processed leafy vegetables were evaluated by the Most Probable Number (MPN) technique, with detection by the classical culture method and by the culture method combined with real-time PCR (RTi-PCR) targeting 16S rRNA genes of L. monocytogenes. Positive MPN enrichment tubes were analyzed by RTi-PCR with primers specific for L. monocytogenes, using the commercial preparation ABsolute™ QPCR SYBR® Green Mix (ABgene, UK). The real-time PCR assay presented good exclusivity and inclusivity results, and no statistically significant difference was found in comparison with the conventional culture method at the 5% significance level. Moreover, RTi-PCR was fast and easy to perform, with MPN results obtained in ca. 48 h, compared with 7 days for the conventional method.
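For context, the MPN itself is the maximum-likelihood estimate of organism density given the pattern of positive tubes across dilutions. A minimal sketch (the tube volumes and counts below are illustrative, not the study's data):

```python
import math

# Maximum-likelihood MPN from a dilution series: find the density
# lambda that zeroes the derivative of the log-likelihood
#   L = prod_i (1 - exp(-lam*v_i))^p_i * exp(-lam*v_i)^(n_i - p_i).

def mpn(tubes):
    """tubes: list of (volume_mL, n_tubes, n_positive)."""
    def score(lam):
        s = 0.0
        for v, n, p in tubes:
            e = math.exp(-lam * v)
            s += p * v * e / (1.0 - e) - (n - p) * v
        return s
    lo, hi = 1e-9, 1e4            # bracket; score decreases with lam
    for _ in range(200):
        mid = math.sqrt(lo * hi)  # geometric bisection (lam > 0)
        if score(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return mid                     # organisms per mL

# Three 10-fold dilution levels, 3 tubes each, with 3/1/0 positives:
print(mpn([(10.0, 3, 3), (1.0, 3, 1), (0.1, 3, 0)]))  # ~0.4 per mL
```

The 3-1-0 pattern recovers roughly 0.43 organisms per mL, matching the value given in standard MPN tables for this dilution scheme.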