911 results for Lubrication and cooling techniques
Abstract:
Position-sensitive particle detectors are needed in high-energy physics research. This thesis describes the development of fabrication processes and characterization techniques for silicon microstrip detectors used in the search for elementary particles at the European Organization for Nuclear Research, CERN. The detectors give an electrical signal along a particle's trajectory after a collision in the particle accelerator. The trajectories carry information about the nature of the particles in the effort to reveal the structure of matter and the universe. Detectors made of semiconductors have a better position resolution than conventional wire chamber detectors. Silicon is overwhelmingly used as the detector material because of its low cost and its standard use in the integrated-circuit industry. After a short spreadsheet analysis of the pn junction, the basic building block of radiation detectors, the operation of a silicon radiation detector is discussed in general. The microstrip detector is then introduced and the detailed structure of a double-sided ac-coupled strip detector is presented. The fabrication aspects of strip detectors are discussed, starting from process development and general principles and ending with a description of the double-sided ac-coupled strip detector process. Recombination and generation lifetime measurements in radiation detectors are discussed briefly. The results of electrical tests, i.e. measurements of the leakage currents and bias resistors, are presented. The beam test setups and their results, the signal-to-noise ratio and the position accuracy, are then described. Earlier research had shown that heavy irradiation changes the properties of radiation detectors dramatically. A scanning electron microscope method was developed to measure the electric potential and field inside irradiated detectors, in order to see how a high radiation fluence changes them. The method and the most important results are discussed briefly.
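As a side note, a minimal Python sketch (not from the thesis; the doping concentration and sensor thickness are assumed values typical of high-resistivity detector silicon) of the kind of spreadsheet-style pn-junction estimate mentioned above: the depletion width at a given reverse bias and the full-depletion voltage of a sensor.

import math

Q = 1.602e-19                 # elementary charge [C]
EPS_SI = 11.7 * 8.854e-12     # permittivity of silicon [F/m]

def depletion_width(v_bias, n_eff):
    """Depletion width [m] of a one-sided abrupt pn junction under reverse
    bias v_bias [V] with effective doping n_eff [1/m^3]; the built-in
    voltage is neglected, as it is small at detector operating biases."""
    return math.sqrt(2.0 * EPS_SI * v_bias / (Q * n_eff))

def full_depletion_voltage(thickness, n_eff):
    """Reverse bias [V] needed to deplete a sensor of the given thickness [m]."""
    return Q * n_eff * thickness**2 / (2.0 * EPS_SI)

n_eff = 1e18    # assumed 1e12 cm^-3 effective doping
d = 300e-6      # assumed 300 um sensor thickness
print(f"W(40 V) = {depletion_width(40.0, n_eff) * 1e6:.0f} um")
print(f"V_fd    = {full_depletion_voltage(d, n_eff):.0f} V")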
Abstract:
BACKGROUND: Therapeutic hypothermia following hypoxic ischaemic encephalopathy (HIE) in term infants was introduced into Switzerland in 2005. Initial documentation of perinatal and resuscitation details was poor and neuromonitoring insufficient. In 2011, a National Asphyxia and Cooling Register was introduced. AIMS: To compare the management of cooled infants before and after introduction of the register with respect to documentation, neuromonitoring and cooling methods, and to evaluate temperature variability between cooling methods. STUDY DESIGN: Data on infants cooled before the register was in place (first time period: 2005-2010) and afterwards (second time period: 2011-2012) were collected with a case report form. RESULTS: 150 infants were cooled during the first time period and 97 during the second. Most infants were cooled passively or passively with gel packs during both time periods (82% in 2005-2010 vs 70% in 2011-2012); however, more infants were cooled actively during the second time period (18% versus 30%). Overall, there was a significant reduction in temperature variability (p < 0.001) between the two time periods. A significantly higher proportion of temperature measurements within the target temperature range (72% versus 77%, p < 0.001), fewer measurements above the target range (24% versus 7%, p < 0.001) and more below it (4% versus 16%, p < 0.001) were recorded during the second time period. Neuromonitoring improved after introduction of the cooling register. CONCLUSION: The management of infants with HIE has improved since the introduction of the register. Temperature variability was reduced, and more temperature measurements within and fewer above the target range were observed. Neuromonitoring has improved; however, imaging should be performed more often.
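For illustration, a minimal Python sketch (assumed, not taken from the register's software) of how the temperature-control metrics reported above can be computed from a series of core-temperature measurements; the 33.0-34.0 degC target range is the usual whole-body cooling target and is an assumption here.

def temperature_metrics(temps, low=33.0, high=34.0):
    """Fractions of measurements below, within and above the target range."""
    n = len(temps)
    below = sum(t < low for t in temps) / n
    within = sum(low <= t <= high for t in temps) / n
    above = sum(t > high for t in temps) / n
    return below, within, above

# Made-up example series of core temperatures [degC]
below, within, above = temperature_metrics([33.4, 33.8, 34.2, 33.1, 32.9, 33.6])
print(f"below: {below:.0%}, within: {within:.0%}, above: {above:.0%}")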
Abstract:
Mauritia vinifera (buriti) is a palm tree that grows wild in different areas of Brazil, particularly in the Amazonian region. Buriti oil is rich in carotenoids, especially β-carotene. The growing interest in natural sources of β-carotene has stimulated the industrial use of buriti as a raw material for pulp oil extraction. Most processes are based on conventional technologies, involving drying and pressing the pulp for oil recovery and further separation of the carotenoids in a liquid phase using organic solvents. In the present work, an ethanol-based process was evaluated for the simultaneous recovery and fractionation of carotenoids from buriti pulp. The raw material and ethanol, at a 1:4 ratio, were placed in an Erlenmeyer flask and agitated at 30 rpm for 1 hour in a temperature-controlled bath at 65 °C. The mixture was filtered under vacuum and cooled at 10 °C to allow the solvent to separate into two phases. The carotenoid composition, determined by HPLC, indicated a β-carotene concentration about 12 times greater in the lower phase than in the upper phase. The profile of the carotenoids in the denser phase is quite similar to that of raw buriti oil, and the concentration of total carotenoids is 40% higher than in the original raw oil, making the ethanol-based process particularly attractive for industrial applications.
Abstract:
The purpose of this study was to determine the options for reducing nitrogen oxide (NOx) emissions at the pulp mill and power plant of Stora Enso's Varkaus mills. The study covered the largest NOx emission sources on the mill site: the recovery boiler, the lime kiln, the bark boiler, the oil-fired boiler and the gasification plant for the plastic-aluminium fraction. The formation mechanisms of NOx emissions and the NOx reduction techniques suitable for different combustion technologies were reviewed. The annual NOx emissions of the Varkaus mills in 2001 were 836 tonnes. Taking into account national legislation, international agreements and best available techniques (BAT), the most suitable solutions for each source were identified. Based on the study, an action plan was drawn up that defines the recommended order of implementation for the NOx reduction measures. The main criteria for the action plan were the anticipated limits of the new environmental permit expected in 2004 and the cost-effectiveness of the measures. The first measure in the implementation order was a trial-run period to optimize the operation of the circulating fluidized bed boiler, and the second was the optimization of lime kiln operation with the aid of a continuous NOx analyser. The next proposed measures were the introduction of a vertical air system in the recovery boiler and the installation of an SNCR system on the bark boiler. The achievable NOx reduction would be 10-45% at a cost of 30-3573 EUR per tonne of NOx removed. According to a NOx dispersion study commissioned from the Finnish Meteorological Institute as part of this work, the effect of the NOx emissions of the Stora Enso mills on air quality in Varkaus is very small; the majority of NOx emissions come from traffic.
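For illustration, a minimal Python sketch (all figures assumed, not from the study) of the cost-effectiveness metric used to rank such measures, in euros per tonne of NOx removed:

def eur_per_tonne_nox(annual_cost_eur, baseline_nox_t, reduction_fraction):
    """Cost-effectiveness [EUR/t] of a measure that cuts annual NOx
    emissions of baseline_nox_t tonnes by reduction_fraction."""
    removed_t = baseline_nox_t * reduction_fraction
    return annual_cost_eur / removed_t

# Hypothetical SNCR retrofit: 100 000 EUR/year annualized cost on a boiler
# emitting 200 t NOx/year, with an assumed 40 % reduction
print(f"{eur_per_tonne_nox(100_000, 200, 0.40):.0f} EUR/t NOx removed")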
Abstract:
This thesis examines two development options for the gas turbine intake air system: intake air cooling and electrostatic filtration. Two calculation tools for gas turbine power augmentation techniques were used to assess the intake air cooling methods. The assessment was carried out for the gas turbine type of the Glanford Brigg Generating Station and the local English climate conditions. The methods considered were evaporative cooling and overspray. The results were compared with each other, and on this basis the effects of the methods on power output, efficiency and water consumption were evaluated. A prototype of the electrostatic filter had been built at the Brigg power plant. The system is being developed into a commercial product, and for this purpose the technical documentation was compiled into a package that can be used in the productization process. Intake air cooling can provide a significant power increase, depending on climate conditions. The method can also even out the power variations caused by the diurnal temperature cycle. The electrostatic filter prototype met the targets set for its development phase. Electrostatic filtration offers several advantages over conventional mechanical filtration.
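For context, a minimal Python sketch (assumptions only; this is not one of the calculation tools used in the thesis) of the basic evaporative-cooling estimate behind intake air cooling: the achievable inlet temperature for a given cooler effectiveness, and a rough rule-of-thumb power gain of about 0.6% of output per degree Celsius of inlet cooling.

def cooled_inlet_temp(t_dry, t_wet, effectiveness=0.9):
    """Inlet air temperature [degC] after an evaporative cooler; the outlet
    approaches the wet-bulb temperature as effectiveness approaches 1."""
    return t_dry - effectiveness * (t_dry - t_wet)

def power_gain_percent(delta_t, gain_per_degc=0.6):
    """Rule-of-thumb gas turbine power gain [%] per degC of inlet cooling (assumed)."""
    return delta_t * gain_per_degc

t_in = cooled_inlet_temp(t_dry=30.0, t_wet=21.0)  # assumed warm-day design point
print(f"inlet: {t_in:.1f} degC, power gain: ~{power_gain_percent(30.0 - t_in):.1f} %")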
Abstract:
We show that the coercive field in ferritin and ferrihydrite depends on the maximum magnetic field in a hysteresis loop, and that coercivity and loop shifts depend on both the maximum and cooling fields. In the case of ferritin, we show that the time dependence of the magnetization also depends on the maximum and previous cooling fields. This behavior is associated with changes in the intraparticle energy barriers imprinted by these fields. Accordingly, the dependence of the coercive and loop-shift fields on the maximum field in ferritin and ferrihydrite can be described within the framework of a uniform-rotation model by considering a dependence of the energy barrier on the maximum and cooling fields.
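For context, the standard uniform-rotation relations the abstract invokes, written out in LaTeX (Néel-Arrhenius relaxation and the Sharrock form of the coercive field for an aligned single-domain particle); the explicit dependence of the barrier E_B on the maximum field H_max and the cooling field H_cool is the paper's proposal, and its functional form is not reproduced here:

\tau = \tau_0 \exp\!\left(\frac{E_B}{k_B T}\right),
\qquad
H_c(T, t_m) = H_K \left[ 1 - \left( \frac{k_B T \,\ln(t_m/\tau_0)}{E_B(H_{\mathrm{max}}, H_{\mathrm{cool}})} \right)^{1/2} \right],

where \tau_0 is the attempt time, t_m the measurement time and H_K the anisotropy field.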
Abstract:
The possible coexistence of ferromagnetism and charge/orbital order in Bi3/4Sr1/4MnO3 has been investigated. The manganite Bi0.75Sr0.25MnO3, with commensurate charge balance, undergoes an electronic transition at TCO ~ 600 K that produces a long-range modulation with double periodicity along the a and c axes, and an unusual anisotropic evolution of the lattice parameters. The previously proposed ferromagnetic properties of this new ordered phase were studied by magnetometry and diffraction techniques. In zero field the magnetic structure is globally antiferromagnetic, ruling out the appearance of spontaneous ferromagnetism. However, the application of magnetic fields produces a continuous, progressive canting of the moments, inducing a ferromagnetic phase even for relatively small fields (H << 1 T). Application of pulsed high fields produces a remarkable and reversible spin polarization (under 30 T, the ferromagnetic moment is ~3 μB/Mn, without any sign of charge-order melting). The coexistence of ferromagnetism and charge order at low and very high fields is a remarkable property of this system.
Abstract:
Glucose homeostasis, as well as homeostatic and hedonic control of feeding, is regulated by hormonal, neuronal, and nutrient-related cues. Glucose, besides its role as a source of metabolic energy, is an important signal controlling hormone secretion and neuronal activity, hence contributing to whole-body metabolic integration in coordination with feeding control. Brain glucose sensing plays a key, but insufficiently explored, role in these metabolic and behavioral controls, which, when deregulated, may contribute to the development of obesity and diabetes. The recent introduction of innovative transgenic, pharmacogenetic, and optogenetic techniques allows unprecedented analysis of the complexity of central glucose sensing at the molecular, cellular, and neuronal-circuit levels, which will lead to a new understanding of the pathogenesis of metabolic diseases.
Abstract:
The application of contrast media in post-mortem radiology differs from clinical approaches in living patients. Post-mortem changes in the vascular system and the absence of blood flow lead to specific problems that have to be considered in the performance of post-mortem angiography. In addition, interpreting the images is challenging due to technique-related and post-mortem artefacts that must be recognized and that are specific to each applied technique. Although the idea of injecting contrast media is old, classic methods are not simply transferable to modern radiological techniques in forensic medicine, as they are mostly dedicated to single-organ studies or applicable only shortly after death. With the introduction of modern imaging techniques, such as post-mortem computed tomography (PMCT) and post-mortem magnetic resonance (PMMR), to forensic death investigations, intensive research started to explore their advantages and limitations compared to conventional autopsy. PMCT has already become a routine investigation in several centres, and different techniques have been developed to better visualise the vascular system and organ parenchyma in PMCT. In contrast, the use of PMMR is still limited due to practical issues, and research is now starting in the field of PMMR angiography. This article gives an overview of the problems in post-mortem contrast media application, the various classic and modern techniques, and the issues to consider when using different contrast media.
Abstract:
This paper analyzes the impact of infrastructure investment on Spanish economic growth between 1850 and 1935. Using new infrastructure data and VAR techniques, this paper shows that the growth impact of local-scope infrastructure investment was positive, but returns to investment in large nation-wide networks were not significantly different from zero. Two complementary explanations are suggested for the latter result. On the one hand, public intervention and the application of non-efficiency investment criteria were very intense in large network construction. On the other hand, returns to new investment in large networks might have decreased dramatically once the basic links were constructed.
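For illustration, a minimal Python sketch (synthetic placeholder data; the paper's actual series and specification are not reproduced) of the VAR exercise described above, using statsmodels to fit a vector autoregression and inspect the impulse responses of GDP growth to investment shocks.

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Placeholder annual series for 1850-1935; in the paper these would be GDP
# growth and investment in local-scope vs large nation-wide infrastructure.
rng = np.random.default_rng(0)
index = pd.date_range("1850", periods=86, freq="YS")
data = pd.DataFrame(
    rng.normal(size=(86, 3)),
    index=index,
    columns=["gdp_growth", "local_infra_inv", "network_infra_inv"],
)

results = VAR(data).fit(maxlags=4, ic="aic")  # lag order chosen by AIC
irf = results.irf(10)                          # impulse responses over a 10-year horizon

# Cumulative 10-year responses of gdp_growth (variable 0) to a shock in each series
print(irf.cum_effects[-1][0])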
Abstract:
Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in search results. Such search interfaces provide web users with online access to myriads of databases on the Web. In order to obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results, a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases. Characterizing the deep Web: Though the term deep Web was coined in 2000, which is long ago for any web-related concept or technology, we still do not know many of its important characteristics. Another matter of concern is that existing surveys of the deep Web are predominantly based on the study of deep web sites in English. One can then expect that findings from these surveys may be biased, especially owing to the steady increase in non-English web content. In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from the national segment of the Web. Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions do not hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources.
Unlike almost all other approaches to the deep Web so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms. Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user, all the more so as the interfaces of conventional search engines are themselves web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and infeasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. Thus, the automation of querying and retrieving data behind search interfaces is desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
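To make the form-extraction step concrete, a minimal Python sketch (an assumption-level illustration using BeautifulSoup; the I-Crawler's actual analysis is richer and also covers JavaScript-rich and non-HTML forms) that pulls the fields of a search interface and their human-readable labels out of an HTML page.

from bs4 import BeautifulSoup

def extract_search_interface(html):
    """Return (action, method, fields) for the first <form> on a page, where
    fields maps each visible input/select/textarea name to its label text."""
    soup = BeautifulSoup(html, "html.parser")
    form = soup.find("form")
    if form is None:
        return None
    # Map label targets (the 'for' attribute) to their visible text
    labels = {lab.get("for"): lab.get_text(strip=True) for lab in form.find_all("label")}
    fields = {}
    for field in form.find_all(["input", "select", "textarea"]):
        name = field.get("name")
        if not name or field.get("type") in ("submit", "hidden"):
            continue  # skip buttons and hidden fields
        fields[name] = labels.get(field.get("id"), name)  # fall back to the field name
    return form.get("action"), form.get("method", "get"), fields

html = """<form action="/search" method="get">
  <label for="q">Title keywords</label><input id="q" name="q" type="text">
  <select name="lang"><option>en</option><option>fi</option></select>
  <input type="submit" value="Search">
</form>"""
print(extract_search_interface(html))
# -> ('/search', 'get', {'q': 'Title keywords', 'lang': 'lang'})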
Abstract:
Through a combined approach integrating RNA-Seq, SNP-array, FISH and PCR techniques, we identified two novel t(15;21) translocations leading to the inactivation of RUNX1 and its partners SIN3A and TCF12. One is a complex t(15;21)(q24;q22), with both breakpoints mapped at the nucleotide level, joining RUNX1 to SIN3A and UBL7-AS1 in a patient with myelodysplasia. The other is a recurrent t(15;21)(q21;q22), juxtaposing RUNX1 and TCF12 in opposite transcriptional orientations, in three myeloid leukemia cases. Since our transcriptome analysis indicated a significant number of differentially expressed genes associated with both translocations, we speculate that these alterations involving RUNX1 play an important pathogenetic role.
Abstract:
This book is one of eight IAEG XII Congress volumes and deals with landslide processes, including: field data and monitoring techniques; prediction and forecasting of landslide occurrence; regional landslide inventories and dating studies; modeling of slope instabilities and secondary hazards (e.g. impulse waves and landslide-induced tsunamis, landslide dam failures and breaching); hazard and risk assessment; earthquake- and rainfall-induced landslides; instabilities of volcanic edifices; remedial works and mitigation measures; and the development of innovative stabilization techniques, their applicability to specific engineering geological conditions, and the use of geophysical techniques for landslide characterization and investigation of triggering mechanisms. Focus is given to innovative techniques; well-documented case studies in different environments; critical components of engineering geological and geotechnical investigations; hydrological and hydrogeological investigations; remote sensing and geophysical techniques; modeling of triggering, collapse, runout and landslide reactivation; geotechnical design and construction procedures in landslide zones; the interaction of landslides with structures and infrastructures; and the possibility of domino effects. The Engineering Geology for Society and Territory volumes of the IAEG XII Congress, held in Torino from September 15 to 19, 2014, analyze the dynamic role of engineering geology in our changing world and build on the four main themes of the congress: environment, processes, issues, and approaches.
Abstract:
This article presents the results of a study of the efficiency of the silanation process of calcium phosphate glass particles and its effect on the bioactivity behavior of glass-poly(methyl methacrylate) (PMMA) composites. Two different calcium phosphate glasses, 44.5CaO-44.5P2O5-11Na2O (BV11) and 44.5CaO-44.5P2O5-6Na2O-5TiO2 (G5), were synthesized and treated with a silane coupling agent. The glasses obtained were characterized by microprobe and BET analysis, while the efficiency of the silanation process was determined using Fourier Transform Infrared Spectroscopy (FTIR), X-ray Photoelectron Spectroscopy (XPS) and Thermal Analysis (DTA and TG) techniques. The content of coupling agent chemically bonded to the silanated glasses amounted to 1.69 ± 0.02 wt% for the BV11sil glass and 0.93 ± 0.01 wt% for the G5sil glass. The in vitro bioactivity test carried out in Simulated Body Fluid (SBF) revealed a certain bioactive performance when both silanated glasses were used at 30% (by weight) as filler in the PMMA composites, owing to the surface deposition of an apatite-like layer with a low content of CO3^2- and HPO4^2- in its structure after soaking for 30 days.
Abstract:
Pantoea agglomerans strains are among the most promising biocontrol agents for a variety of bacterial and fungal plant diseases, particularly fire blight of apple and pear. However, commercial registration of P. agglomerans biocontrol products is hampered because this species is currently listed as a biosafety level 2 (BL2) organism due to clinical reports as an opportunistic human pathogen. This study compares plant-origin and clinical strains in a search for discriminating genotypic/phenotypic markers using multi-locus phylogenetic analysis and fluorescent amplified fragment length polymorphism (fAFLP) fingerprinting. Results: The majority of the clinical isolates from culture collections were found to be improperly designated as P. agglomerans after sequence analysis. The frequent taxonomic rearrangements undergone by the Enterobacter agglomerans/Erwinia herbicola complex may be a major problem in assessing clinical associations within P. agglomerans. In the P. agglomerans sensu stricto (in the stricter sense) group, there was no discrete clustering of clinical/biocontrol strains and no marker was identified that was uniquely associated with clinical strains. A putative biocontrol-specific fAFLP marker was identified only in biocontrol strains. The partial ORF located in this band corresponded to an ABC transporter that was found in all P. agglomerans strains. Conclusion: Taxonomic mischaracterization was identified as a major problem with P. agglomerans, and current techniques removed a majority of clinical strains from this species. Although clear discrimination between P. agglomerans plant and clinical strains was not obtained with phylogenetic analysis, a single marker characteristic of biocontrol strains was identified which may be of use in strain biosafety determinations. In addition, the lack of fulfilment of Koch's postulates, the rare retention of clinical strains for subsequent confirmation, and the polymicrobial nature of P. agglomerans clinical reports should be considered in the biosafety assessment of beneficial strains of this species.