957 results for Second generation bioethanol
Abstract:
The hydrothermal liquefaction (HTL) of algal biomass is a promising route to viable second generation biofuels. In this investigation HTL was assessed for the valorisation of algae used in the remediation of acid mine drainage (AMD). Initially the HTL process was evaluated using Arthrospira platensis (Spirulina) with added metal sulphates to simulate metal remediation. Optimised conditions were then used to process a natural algal community (predominantly Chlamydomonas sp.) cultivated under two scenarios: high uptake and low uptake of metals from AMD. High metal concentrations appear to catalyse the conversion to bio-oil, and do not significantly affect the heteroatom content or higher heating value of the bio-oil produced. The associated metals were found to partition almost exclusively into the solid residue, which is favourable for potential metal recovery. High metal loadings also caused partitioning of phosphates from the aqueous phase to the solid phase, potentially compromising attempts to recycle process water as a growth supplement. HTL was therefore found to be a suitable method for processing algae used in AMD remediation, producing a crude oil suitable for upgrading into hydrocarbon fuels, aqueous and gas streams suitable for supplementing algal growth, and a solid residue containing most of the contaminant metals, where they would be readily amenable to recovery and/or disposal.
Abstract:
The semiconductor industry's drive towards faster, smaller and cheaper integrated circuits has led it to smaller node devices. The integrated circuits now in volume production belong to the 22 nm and 14 nm technology nodes. In 2007 the 45 nm node introduced the revolutionary high-k/metal gate structure. The 22 nm technology utilizes a fully depleted tri-gate transistor structure, and the 14 nm technology is a continuation of it; Intel uses second generation tri-gate technology in its 14 nm devices. After 14 nm, the semiconductor industry is expected to continue scaling with 10 nm devices followed by 7 nm, and IBM has recently announced the successful production of 7 nm node test chips. This is how the nanoelectronics industry is proceeding with its scaling trend. The present technology nodes require selective deposition and selective removal of materials; atomic layer deposition (ALD) and atomic layer etching are the techniques used for selective deposition and selective removal, respectively. Atomic layer deposition remains a futuristic manufacturing approach that deposits materials and films in exact places. Beyond the nano/microelectronics industry, ALD is also widening its application areas and acceptance, and the use of ALD equipment in industry shows a diversification trend: large-area, batch-processing, particle and plasma-enhanced ALD equipment is becoming prominent in industrial applications. In this work, the development of an atomic layer deposition tool with microwave plasma capability is described, which is affordable even for lightly funded research labs.
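Since the abstract above centres on building an ALD tool, a minimal sketch of the canonical ALD cycle (precursor dose, purge, plasma/co-reactant step, purge, repeated for a set number of cycles) may help orient readers unfamiliar with the technique. The step names, timings and the Recipe structure below are hypothetical illustrations in Python, not parameters of the instrument described above.

```python
# A minimal, hypothetical ALD cycle sequencer: one cycle = precursor dose,
# purge, plasma (co-reactant) step, purge.  Names, powers and timings are
# placeholders; a real controller would actuate valves and wait in real time.
from dataclasses import dataclass

@dataclass
class Recipe:
    precursor_pulse_s: float = 0.1   # self-limiting chemisorption of the precursor
    purge_s: float = 5.0             # remove excess precursor and by-products
    plasma_power_w: float = 300.0    # plasma power for the co-reactant step
    plasma_s: float = 2.0

def run_ald(recipe: Recipe, cycles: int) -> None:
    """Print the step sequence for `cycles` ALD cycles (stand-in for hardware calls)."""
    for n in range(1, cycles + 1):
        print(f"cycle {n}: dose precursor_A for {recipe.precursor_pulse_s} s")
        print(f"cycle {n}: purge N2 for {recipe.purge_s} s")
        print(f"cycle {n}: plasma at {recipe.plasma_power_w} W for {recipe.plasma_s} s")
        print(f"cycle {n}: purge N2 for {recipe.purge_s} s")

if __name__ == "__main__":
    run_ald(Recipe(), cycles=3)   # film thickness scales with the number of cycles
```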
Abstract:
This chapter examines four papers that have been influential in the use of virtual worlds for learning, but also draws on a range of other research and literature in order to locate virtual world learning across the landscape of higher education. Whilst there is sometimes a misconception that research into learning in virtual worlds is very new, the field began to develop in the late 1990s and has continued to grow since then. Typical examples of the first iteration of virtual worlds include Second Life, Active Worlds, and Kaneva, which have been available for up to 20 years; the second generation, with examples such as High Fidelity and Project Sansar, is currently being developed. The chapter reviews the literature in this field and suggests that the central themes to emerge are: Socialisation; Presence and immersion in virtual world learning; Learning collaboratively; and Trajectories of participation.
Abstract:
Structural Health Monitoring (SHM) is an emerging area of research associated with improving the maintainability and safety of aerospace, civil and mechanical infrastructures by means of monitoring and damage detection. Guided wave structural testing is an approach for health monitoring of plate-like structures using smart-material piezoelectric transducers. Among the many kinds of transducers, those with a beam-steering feature can perform more accurate surface interrogation. A frequency-steerable acoustic transducer (FSAT) is capable of beam steering by varying the input frequency, and can consequently detect and localize damage in structures. Guided wave inspection is typically performed with phased arrays, which require a large number of piezoelectric transducers and entail considerable complexity and limitations. To overcome the weight penalty, complex circuitry and maintenance concerns associated with wiring a large number of transducers, new FSATs are proposed that present inherent directional capabilities when generating and sensing elastic waves. The first generation of Spiral FSAT has two main limitations: first, waves are excited or sensed both in the intended direction and in the opposite one (180° ambiguity), and second, only a relatively crude approximation of the desired directivity has been attained. A second generation of Spiral FSAT is proposed to overcome these limitations. Simulation tools become all the more important when a new idea is proposed and begins to be developed. The shaped-transducer concept, and especially the second generation of Spiral FSAT, is a novel idea in guided-wave-based Structural Health Monitoring systems, so a simulation tool is needed to develop the various design aspects of this innovative transducer. In this work, numerical simulation of the first and second generations of the Spiral FSAT has been conducted to prove the directional capability of the excited guided waves through a plate-like structure.
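For context on the beam steering that the abstract above contrasts with FSATs, here is a minimal delay-and-sum sketch for a conventional linear phased array; the element pitch, wave speed and sampling rate are arbitrary illustrative values, and this is not a model of the Spiral FSAT itself.

```python
# Toy delay-and-sum beam steering for a linear phased array of N elements.
# Illustrates the wiring-heavy approach that FSATs aim to replace with
# frequency-dependent directivity; all parameters are arbitrary examples.
import numpy as np

def steering_delays(n_elements: int, pitch_m: float, wave_speed_mps: float,
                    steer_angle_deg: float) -> np.ndarray:
    """Per-element firing delays that tilt the wavefront by steer_angle_deg."""
    angle = np.deg2rad(steer_angle_deg)
    positions = np.arange(n_elements) * pitch_m          # element positions along the array
    return positions * np.sin(angle) / wave_speed_mps    # delay (s) per element

def delay_and_sum(signals: np.ndarray, delays_s: np.ndarray, fs_hz: float) -> np.ndarray:
    """Align each element's received trace by its delay and sum them coherently."""
    shifts = np.round(delays_s * fs_hz).astype(int)
    out = np.zeros(signals.shape[1])
    for trace, shift in zip(signals, shifts):
        out += np.roll(trace, -shift)                     # crude integer-sample alignment
    return out / len(signals)

if __name__ == "__main__":
    fs = 1e6                                              # 1 MHz sampling, arbitrary
    delays = steering_delays(n_elements=8, pitch_m=5e-3,
                             wave_speed_mps=3000.0, steer_angle_deg=30.0)
    traces = np.random.randn(8, 1024)                     # stand-in for received A-scans
    beam = delay_and_sum(traces, delays, fs)
    print(delays, beam.shape)
```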
Abstract:
Nitrate leaching, contamination of groundwater and eutrophication of waterways are among the global issues affecting the sustainability of agriculture and natural resources. This thesis presents the development of a first generation of a new electrochemical sensor for the precise quantification of nitrates. It is based on electrochemical impedance spectroscopy of an ion-selective polymer membrane. With this approach, a compact and affordable sensor was produced. Through its use in aqueous solutions and in saturated growth substrates, the sensor was shown to quantify controlled nitrate additions ranging from 0.6 ppm to 60 ppm. Its application in growth substrates was studied in comparison with ISO 17025-certified methods for the analysis of these substrates. The sensor also proved highly versatile, having been used with various impedance measurement instruments. In addition, it demonstrated possible stability after a one-month deployment directly in a growth substrate under the environmental conditions of a forest nursery. By studying the impedance spectrum of the sensor in pure solutions of different concentrations, it was also possible to propose an equivalent electrical circuit for the system, which highlights two competing current paths: one through the bulk of the membrane and a second through the solution. The results of this work are at the core of two scientific publications whose manuscripts are included in this thesis. To conclude the study, suggestions are made to guide improvement of the sensor through the development of a second generation.
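The two competing current paths identified in the equivalent circuit can be pictured as parallel branches. The sketch below computes the impedance spectrum of a generic two-branch model (a membrane branch modelled as a resistor in parallel with a capacitor, alongside a solution resistance); the topology and component values are illustrative assumptions, not the circuit proposed in the thesis.

```python
# Illustrative impedance spectrum of a generic two-path equivalent circuit:
# branch 1: membrane as R_m parallel with C_m; branch 2: solution R_s.
# Component values are placeholders chosen only to show the spectrum's shape.
import numpy as np

def branch_membrane(omega: np.ndarray, r_m: float, c_m: float) -> np.ndarray:
    """Impedance of a resistor in parallel with a capacitor."""
    return 1.0 / (1.0 / r_m + 1j * omega * c_m)

def two_path_impedance(omega: np.ndarray, r_m: float, c_m: float, r_s: float) -> np.ndarray:
    """Two competing current paths combine like parallel impedances."""
    z_mem = branch_membrane(omega, r_m, c_m)
    return 1.0 / (1.0 / z_mem + 1.0 / r_s)

if __name__ == "__main__":
    freqs = np.logspace(0, 5, 200)            # 1 Hz to 100 kHz
    omega = 2 * np.pi * freqs
    z = two_path_impedance(omega, r_m=1e6, c_m=1e-8, r_s=5e6)
    # Nyquist-style output: real part vs. negative imaginary part
    for f, zi in zip(freqs[::50], z[::50]):
        print(f"{f:10.1f} Hz  Z' = {zi.real:10.1f}  -Z'' = {-zi.imag:10.1f}")
```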
Abstract:
The second generation of large-scale interferometric gravitational wave (GW) detectors will be limited by quantum noise over a wide frequency range in their detection band. Further sensitivity improvements for future upgrades or new detectors beyond the second generation motivate the development of measurement schemes to mitigate the impact of quantum noise in these instruments. Two strands of development are being pursued to reach this goal, focusing both on modifications of the well-established Michelson detector configuration and on the development of different detector topologies. In this paper, we present the design of the world's first Sagnac speed meter (SSM) interferometer, which is currently being constructed at the University of Glasgow. With this proof-of-principle experiment we aim to demonstrate the theoretically predicted lower quantum noise of a Sagnac interferometer compared with an equivalent Michelson interferometer, in order to qualify the SSM for further research towards implementation in a future-generation large-scale GW detector, such as the planned Einstein Telescope observatory.
Abstract:
The immune system provides a rich metaphor for computer security: anomaly detection that works in nature should work for machines. However, early artificial immune system approaches to computer security had only limited success. Arguably, this was because these artificial systems were based on too simplistic a view of the immune system. We present here a second generation artificial immune system for process anomaly detection, which improves on earlier systems by having different artificial cell types that process information. After presenting detailed information about how to build such second generation systems, we find that communication between cell types is key to performance. Through realistic testing and validation we show that second generation artificial immune systems are capable of anomaly detection beyond generic system policies. The paper concludes with a discussion and an outline of the next steps in this exciting area of computer security.
Abstract:
This article explores web sites developed to express the interests and experiences of young Chinese people in Britain. Drawing on content analysis of site discussions and dialogues with site users, we argue that these new communicative practices are best understood through a reworking of the social capital problematic. Firstly, this means recognising the irreducibility of Internet-mediated connections to the calculative instrumentalism underlying many applications of social capital theory. Secondly, it means providing a more differentiated account of social capital: the interactions we explore comprise a specifically "second generation" form of social capital, cutting across the binary of bonding and bridging social capital. Thirdly, judgement on the social capital consequences of Internet interactions must await a longer-term assessment of whether British Chinese institutions emerge to engage with the wider polity.
Abstract:
Porous polymer particles are used in an extraordinarily wide range of advanced and everyday applications, from combinatorial chemistry, solid-phase organic synthesis and polymer-supported reagents, to environmental analyses and the purification of drinking water. The installation and exploitation of functional chemical handles on the particles is often a prerequisite for their successful use, irrespective of the application and the porous nature of the particles. New methodology for the chemical modification of macroreticular polymers is the primary focus of the work presented in this thesis. Porous polymer microspheres decorated with a diverse range of functional groups were synthesised by the post-polymerisation chemical modification of beaded polymers via olefin cross metathesis. The polymer microspheres were prepared by the precipitation polymerisation of divinylbenzene in porogenic (pore-forming) solvents; the olefin cross-metathesis (CM) functionalisation reactions exploited the pendent (polymer-bound) vinyl groups that were not consumed by polymerisation. Olefin CM reactions involving the pendent vinyl groups were performed in dichloromethane using the second-generation Grubbs catalyst (Grubbs II), and a wide range of coupling partners was used. The results obtained indicate that high-quality porous polymer microspheres synthesised by precipitation polymerisation in near-θ solvents can be functionalised by olefin CM under very mild conditions to install a diverse range of chemical functionalities into a common polydivinylbenzene precursor. Gel-type polymer microspheres were prepared by the precipitation copolymerisation of divinylbenzene and allyl methacrylate in neat acetonitrile. The unreacted pendent vinyl groups that were not consumed by polymerisation were subjected to internal and external olefin metathesis-based hypercrosslinking reactions. Internal hypercrosslinking was carried out using ring-closing metathesis (RCM) reactions in toluene with the Grubbs II catalyst; under these conditions, hypercrosslinked (HXL) polymers with specific surface areas of around 500 m² g⁻¹ were synthesised. External hypercrosslinking was attempted using CM/RCM in the presence of a multivinyl coupling partner in toluene with the second-generation Hoveyda-Grubbs catalyst. The results obtained indicate that no HXL polymers were obtained; however, during the development of this methodology, a new type of polymerisation was discovered with tetraallyl orthosilicate as the monomer.
Abstract:
The quality and speed of genome sequencing have advanced as technology boundaries are stretched. This advancement has so far been divided into three generations: the first-generation methods enabled sequencing of clonal DNA populations, the second generation massively increased throughput by parallelizing many reactions, and the third-generation methods allow direct sequencing of single DNA molecules. The first techniques to sequence DNA were not developed until the mid-1970s, when two distinct sequencing methods appeared almost simultaneously, one by Allan Maxam and Walter Gilbert and the other by Frederick Sanger. The first is a chemical method that cleaves DNA at specific points, while the second uses ddNTPs to synthesize a copy from the DNA template chain. Both methods generate fragments of varying lengths that are then electrophoresed. Until the 1990s, DNA sequencing remained relatively expensive and slow, and the use of radiolabeled nucleotides compounded the problem through safety concerns and prevented automation. Advances within the first generation include the replacement of radioactive labels by fluorescently labeled ddNTPs and cycle sequencing with thermostable DNA polymerase, which allow automation and signal amplification, making the process cheaper, safer and faster. Another method is pyrosequencing, which is based on the "sequencing by synthesis" principle; it differs from Sanger sequencing in that it relies on the detection of pyrophosphate release upon nucleotide incorporation. By the end of the last millennium, parallelization of this method started Next Generation Sequencing (NGS), with 454 as the first of many platforms able to process multiple samples, giving rise to so-called second-generation sequencing; here electrophoresis was completely eliminated. Another method sometimes used is SOLiD, based on sequencing by ligation of fluorescently dye-labeled di-base probes that compete to ligate to the sequencing primer; specificity of the di-base probe is achieved by interrogating every 1st and 2nd base in each ligation reaction. The widely used Solexa/Illumina method uses modified dNTPs containing so-called "reversible terminators", which block further polymerization; the terminator also carries a fluorescent label that can be detected by a camera. The step preceding the third generation was taken by Ion Torrent, which developed a sequencing-by-synthesis technique whose main feature is the detection of the hydrogen ions released during base incorporation. The third generation, in turn, exploits nanotechnology advances for processing single DNA molecules, from real-time synthesis sequencing systems such as PacBio to nanopore sequencing, projected since 1995, which uses nanosensors in channels derived from bacteria to conduct the sample past a sensor that detects each nucleotide residue in the DNA strand. Technological progress has been so rapid that one wonders: how do we imagine the next generation?
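As an illustration of the di-base (color-space) principle behind SOLiD described above, the following sketch encodes and decodes a short read with the standard four-color scheme, in which each color identifies a pair of adjacent bases and decoding requires a known first base. This is a generic textbook-style illustration, not code from any sequencing pipeline.

```python
# Toy encoder/decoder for di-base ("color-space") reads, illustrating the
# SOLiD idea that each color encodes a transition between two adjacent bases,
# so decoding needs a known first base (the primer base); a single wrong
# color propagates into all downstream bases.
CODE = {"A": 0, "C": 1, "G": 2, "T": 3}
BASE = {v: k for k, v in CODE.items()}

def encode(seq: str) -> list[int]:
    """Base sequence -> one color per adjacent base pair."""
    return [CODE[a] ^ CODE[b] for a, b in zip(seq, seq[1:])]

def decode(first_base: str, colors: list[int]) -> str:
    """Known first base + colors -> base sequence."""
    seq = [first_base]
    for color in colors:
        seq.append(BASE[CODE[seq[-1]] ^ color])
    return "".join(seq)

if __name__ == "__main__":
    read = "ATGGCA"
    colors = encode(read)               # [3, 1, 0, 3, 1] for this read
    assert decode("A", colors) == read  # round-trip check
    print(read, colors)
```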
Abstract:
This paper discusses some economic integrations in Latin America, which have become an expression of governance in the neoliberal context. These integrations are also the result of second-generation adjustments in terms of trade openness, sale of state assets, free short-term capital mobility, and the Asian and European integrations that preceded the regional ones. In addition, the paper addresses the following questions: Are the integrations aiming to achieve development? Would integrations among Northern countries take the same endangering course as those in South America? Who should benefit from the integrations? Is there a link between development and demographics?
Abstract:
Part 15: Performance Management Frameworks
Abstract:
The human immune system has numerous properties that make it ripe for exploitation in the computational domain, such as robustness and fault tolerance, and many different algorithms, collectively termed Artificial Immune Systems (AIS), have been inspired by it. Two generations of AIS are currently in use, with the first generation relying on simplified immune models and the second generation utilising interdisciplinary collaboration to develop a deeper understanding of the immune system and hence produce more complex models. Both generations of algorithms have been successfully applied to a variety of problems, including anomaly detection, pattern recognition, optimisation and robotics. In this chapter an overview of AIS is presented, its evolution is discussed, and it is shown that the diversification of the field is linked to the diversity of the immune system itself, leading to a number of algorithms as opposed to one archetypal system. Two case studies are also presented to help provide insight into the mechanisms of AIS; these are the idiotypic network approach and the Dendritic Cell Algorithm.
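Because the Dendritic Cell Algorithm is named above as a case study, a heavily simplified sketch of its core signal-fusion idea (combining PAMP-like, danger and safe signals into per-antigen context scores and flagging antigens whose "mature" context dominates) is given below; the weights, signal names and threshold are illustrative assumptions rather than the published DCA parameters.

```python
# Toy illustration of the signal-fusion idea behind the Dendritic Cell
# Algorithm: each observation of an antigen carries three input signals,
# which are fused into a mature (anomalous) vs semi-mature (normal) score.
# Weights and threshold are arbitrary placeholders, not published DCA values.
from dataclasses import dataclass

@dataclass
class Observation:
    antigen: str        # identifier of the process/event being judged
    pamp: float         # signature-like evidence of known badness
    danger: float       # signs of damage (e.g. error rates)
    safe: float         # signs of normal behaviour

def context_score(obs: Observation) -> float:
    """Positive -> mature (anomalous) context, negative -> semi-mature (normal)."""
    return 2.0 * obs.pamp + 1.0 * obs.danger - 1.5 * obs.safe

def classify(observations: list[Observation], threshold: float = 0.0) -> dict[str, bool]:
    """Aggregate scores per antigen and flag those whose total crosses the threshold."""
    totals: dict[str, float] = {}
    for obs in observations:
        totals[obs.antigen] = totals.get(obs.antigen, 0.0) + context_score(obs)
    return {antigen: total > threshold for antigen, total in totals.items()}

if __name__ == "__main__":
    stream = [
        Observation("proc_42", pamp=0.1, danger=0.2, safe=0.9),
        Observation("proc_42", pamp=0.0, danger=0.1, safe=0.8),
        Observation("proc_66", pamp=0.7, danger=0.9, safe=0.1),
    ]
    print(classify(stream))   # {'proc_42': False, 'proc_66': True}
```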
Abstract:
The migratory flows of recent decades have led Portuguese-speaking communities to settle all over the world, including in the Scandinavian countries. One of these host countries, Finland, has made efforts to integrate the immigrants who settle there, notably by promoting language policies aimed at teaching the heritage language to school-age pupils. It is in this context that the small Portuguese-speaking communities living in the country have been able to provide their children, first- and second-generation immigrants, with access to formal instruction in their heritage language, Portuguese. The subjects in our sample belong to one of these small communities and live in Tampere, Finland, where they attend schools in which the language of instruction is Finnish. This study aims, on the one hand, to describe the sociocultural reality of these learners of Portuguese as a Non-Native Language and, on the other, to reflect on their acquisition/learning of pragmatic competence as heritage speakers, through their realisation of directive illocutionary acts (requests and commands) and the degree of formality of the politeness expressions they use. Accordingly, we prepared a sociolinguistic profile form to collect data on the pupils' family, sociocultural and linguistic background. We then designed and applied a linguistic test consisting of discourse elicitation tasks, in order to gather data for a linguistic corpus on which this study could be based. The application of the linguistic test and the subsequent treatment of the data collected revealed that the subjects' pragmatic choices are conditioned by the sociocultural and linguistic context in which they are immersed. We also found that most of the pupils resort to two discourse-mitigation strategies, using two politeness expressions: the formula «se faz favor»/«por favor» ("please") and the modal verb «poder» ("can/may").