24 results for Compact


Relevance: 10.00%

Publisher:

Abstract:

A novel method for functional lung imaging was introduced by adapting the K-edge subtraction method (KES) to in vivo studies of small animals. In this method two synchrotron radiation energies, which bracket the K-edge of the contrast agent, are used for simultaneous recording of absorption-contrast images. Stable xenon gas is used as the contrast agent, and imaging is performed in projection or computed tomography (CT) mode. Subtraction of the two images yields the distribution of xenon, while removing practically all features due to other structures, and the xenon density can be calculated quantitatively. Because the images are recorded simultaneously, there are no movement artifacts in the subtraction image. The time resolution for a series of CT images is one image per second, which allows functional studies. The voxel size is 0.1 mm³, an order of magnitude better than in traditional lung imaging methods. The KES imaging technique was used in studies of ventilation distribution and the effects of histamine-induced airway narrowing in healthy, mechanically ventilated, and anaesthetized rabbits. First, the effect of tidal volume on ventilation was studied, and the results show that an increase in tidal volume without an increase in minute ventilation results in a proportional increase in regional ventilation. Second, spiral CT was used to quantify the airspace volumes in the lungs under normal conditions and after histamine aerosol inhalation, and the results showed large patchy filling defects in the peripheral lungs following histamine provocation. Third, the kinetics of the proximal and distal airway response to histamine aerosol were examined, and the findings show that the distal airways react immediately to histamine and start to recover, while the reaction and the recovery in the proximal airways are slower. Fourth, the fractal dimension of the lungs was studied, and it was found to be higher in the apical part of the lungs than in the basal part, indicating structural differences between the apical and basal lung levels. These results provide new insights into lung function and into the effects of drug challenges. At present the technique is available only at synchrotron radiation facilities, but compact synchrotron radiation sources are being developed, and in the relatively near future the method may be used in hospitals.
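The subtraction step described above can be written compactly. The following is a minimal sketch of the dual-energy computation, assuming two simultaneously recorded intensity images that bracket the xenon K-edge; the function name, the flat-field handling and the placeholder attenuation coefficients are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def kes_xenon_density(I_above, I_below, I0_above, I0_below,
                      mu_xe_above=25.0, mu_xe_below=7.0):
    """Xenon areal density (g/cm^2) per pixel from K-edge subtraction.

    The log-attenuation of tissue and bone varies smoothly across the edge and
    largely cancels in the subtraction, whereas the attenuation of xenon jumps
    at its K-edge, so the difference isolates the contrast agent. The mass
    attenuation coefficients used here are placeholder values.
    """
    att_above = -np.log(I_above / I0_above)   # total attenuation above the K-edge
    att_below = -np.log(I_below / I0_below)   # total attenuation below the K-edge
    return (att_above - att_below) / (mu_xe_above - mu_xe_below)
```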

Relevance: 10.00%

Publisher:

Abstract:

Black hole X-ray binaries, binary systems where matter from a companion star is accreted by a stellar-mass black hole, thereby releasing enormous amounts of gravitational energy converted into radiation, are seen as strong X-ray sources in the sky. As a black hole can only be detected via its interaction with its surroundings, these binary systems provide important evidence for the existence of black holes. There are now at least twenty cases where the measured mass of the X-ray emitting compact object in a binary exceeds the upper limit for a neutron star, implying the presence of a black hole. These binary systems serve as excellent laboratories not only to study the physics of accretion but also to test predictions of general relativity in strongly curved space-time. An understanding of the accretion flow onto these, the most compact objects in our Universe, is therefore of great importance to physics. We are only now slowly beginning to understand the spectra and variability observed in these X-ray sources. During the last decade, a framework has developed that interprets the spectral evolution as a function of changes in the physics and geometry of the accretion flow driven by a variable accretion rate. This doctoral thesis presents studies of two black hole binary systems, Cygnus X-1 and GRS 1915+105, plus the possible black hole candidate Cygnus X-3, and the results of an attempt to interpret their observed properties within this emerging framework. The main result presented in this thesis is an interpretation of the spectral variability in the enigmatic source Cygnus X-3, including the nature and accretion geometry of its so-called hard spectral state. The results suggest that the compact object in this source, which has not been uniquely identified as a black hole on the basis of standard mass measurements, is most probably a massive, ~30 Msun, black hole, and thus the most massive black hole observed in a binary in our Galaxy so far. In addition, results concerning a possible observation of limit-cycle variability in the microquasar GRS 1915+105 are presented, as well as evidence of 'mini-hysteresis' in the extreme hard state of Cygnus X-1.
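Standard mass measurements of this kind rest on the binary mass function derived from the companion star's radial-velocity curve. The textbook relation is reproduced below for context only; it is not quoted from the thesis:

\[
f(M_X) \equiv \frac{(M_X \sin i)^3}{(M_X + M_c)^2} = \frac{P_{\mathrm{orb}} K_c^3}{2\pi G},
\]

where $P_{\mathrm{orb}}$ is the orbital period, $K_c$ the semi-amplitude of the companion's radial velocity, $i$ the orbital inclination, and $M_X$ and $M_c$ the masses of the compact object and the companion. Because $\sin i \le 1$ and $M_c \ge 0$, $f(M_X)$ is a firm lower limit on $M_X$; a value well above the maximum neutron-star mass of about $3\,M_\odot$ implies a black hole.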

Relevance: 10.00%

Publisher:

Abstract:

Hantaviruses, members of the genus Hantavirus in the family Bunyaviridae, are enveloped single-stranded RNA viruses with a tri-segmented genome of negative polarity. In humans, hantaviruses cause two diseases, hemorrhagic fever with renal syndrome (HFRS) and hantavirus pulmonary syndrome (HPS), which vary in severity depending on the causative agent. Each hantavirus is carried by a specific rodent host and is transmitted to humans through the excreta of infected rodents. The genome of hantaviruses encodes four structural proteins, the nucleocapsid protein (N), the glycoproteins (Gn and Gc) and the polymerase (L), as well as the nonstructural protein (NSs). This thesis deals with the functional characterization of the hantavirus N protein with regard to its structure. Structural studies of the N protein have progressed slowly and the crystal structure of the whole protein is still not available; therefore, biochemical assays coupled with bioinformatic modeling proved essential for studying the structure and functions of the N protein. Presumably, during RNA encapsidation, the N protein first forms intermediate trimers and then oligomers. First, we investigated the role of the N-terminal domain in N protein oligomerization. The results suggested that the N-terminal region of the N protein forms a coiled-coil, in which two antiparallel alpha helices interact via their hydrophobic seams. Hydrophobic residues L4, I11, L18, L25 and V32 in the first helix and L44, V51, L58 and L65 in the second helix were crucial for stabilizing the structure. The results were consistent with the head-to-head, tail-to-tail model for hantavirus N protein trimerization. We demonstrated that an intact coiled-coil structure of the N terminus is crucial for the oligomerization capacity of the N protein. We also added new details to the head-to-head, tail-to-tail model of trimerization by suggesting that the initial step is based on interaction(s) between intact intra-molecular coiled-coils of the monomers. We further analyzed the importance of charged amino acid residues located within the coiled-coil for N protein oligomerization. To predict the interacting surfaces of the monomers we used an upgraded in silico model of the coiled-coil domain that was docked into a trimer, and the predicted target residues were then mutated. The results obtained using the mammalian two-hybrid assay suggested that conserved charged amino acid residues within the coiled-coil make a substantial contribution to N protein oligomerization. This contribution probably involves the formation of interacting surfaces of the N monomers as well as stabilization of the coiled-coil via intramolecular ionic bridging. We proposed that the tips of the coiled-coils are the first to come into direct contact and thus initiate the tight packing of the three monomers into a compact structure. This was in agreement with previous results showing that an increase in ionic strength abolishes the interaction between N protein molecules. We also showed that the residues having the strongest effect on N protein oligomerization are not scattered randomly throughout the coiled-coil 3D model structure, but form clusters. Next, we found evidence for an interaction of the hantaviral N protein with the cytoplasmic tail of the glycoprotein Gn. To study this interaction we used the GST pull-down assay in combination with mutagenesis. The results demonstrated that intact, properly folded zinc fingers of the Gn protein cytoplasmic tail, as well as the middle domain of the N protein (which includes amino acid residues 80–248 and supposedly carries the RNA-binding domain), are essential for the interaction. Since hantaviruses do not have a matrix protein, which mediates the packaging of the viral RNA in other negative-stranded RNA viruses (NSRV), hantaviral RNPs should be involved in a direct interaction with the intraviral domains of the envelope-embedded glycoproteins. By showing the N-Gn interaction we provided evidence for one of the crucial steps in virus replication, at which RNPs are directed to the site of virus assembly. Finally, we began an analysis of the RNA-binding region of the N protein, which is supposedly located in the middle domain of the molecule. We developed a model for the initial step of RNA binding by the hantaviral N protein, hypothesizing that the protein possesses two secondary structure elements that initiate RNA encapsidation. The results suggest that amino acid residues 172–176 presumably act as a hook to catch the vRNA and that the positively charged interaction surface (amino acid residues 144–160) enhances the initial N-RNA interaction. In conclusion, we elucidated new functions of the hantavirus N protein. Using in silico modeling we predicted the domain structure of the protein and, using experimental techniques, showed that each domain is responsible for executing certain function(s). We showed that an intact N-terminal coiled-coil domain is crucial for oligomerization and that charged residues located on its surface form an interaction surface for the N monomers, while the middle domain is essential for the interaction with the cytoplasmic tail of the Gn protein and for RNA binding.

Relevance: 10.00%

Publisher:

Abstract:

This thesis addresses two topics: biofouling and antifouling in the paper industry. Biofouling means unwanted microbial accumulation on surfaces, causing e.g. disturbances in industrial processes and contamination of medical devices or water distribution networks. Antifouling focuses on preventing the accumulation of biofilms in undesired places. Deinococcus geothermalis is a pink-pigmented, thermophilic bacterium that is extremely resistant to radiation, UV light and desiccation, and is known as a biofouler of paper machines, forming firm and biocide-resistant biofilms on stainless steel surfaces. The compact structure of the biofilm microcolonies of D. geothermalis E50051 and their adhesion to abiotic surfaces were investigated by confocal laser scanning microscopy combined with carbohydrate-specific, fluorescently labelled lectins. The extracellular polymeric substance in D. geothermalis microcolonies was found to be a composite of at least five different glycoconjugates contributing to adhesion, functioning as structural elements and putative water stores, enabling gliding motility and likely also providing protection. The adhesion threads that D. geothermalis seems to use to adhere to an abiotic surface and to anchor itself to neighbouring cells were shown to be proteinaceous, and four protein components of type IV pilin were identified. In addition, lectin staining showed that the adhesion threads were covered with galactose-containing glycoconjugates. The threads were not exposed on planktonic cells, indicating their primary role in adhesion and biofilm formation. Using quantitative real-time PCR, I investigated the presence of D. geothermalis in biofilms, deposits, process waters and paper end products from 24 paper and board mills; the primers designed for this purpose targeted the 16S rRNA gene of D. geothermalis. We found D. geothermalis DNA in 9 machines, in 16 of the 120 mill samples examined in total. The total bacterial content in those samples varied between 10⁷ and 3 × 10¹⁰ 16S rRNA gene copies g⁻¹, and the proportion of D. geothermalis in the same samples was minor, 0.03–1.3% of the total bacterial content. Nevertheless, D. geothermalis may endanger paper quality, as its DNA was detected in an end product. As an antifouling method against biofilms we studied electrochemical polarization, and two novel instruments were designed for this work. The double biofilm analyzer was designed to search for a polarization program that would eradicate D. geothermalis biofilm or detach it from stainless steel under conditions simulating the paper mill environment. The Radbox instrument was designed to study the generation of reactive oxygen species (ROS) during the polarization that was effective in the antifouling of D. geothermalis. We found that a cathodic character and a pulsed mode of polarization were required to detach D. geothermalis biofilm from stainless steel. We also found that the efficiency of polarization was good on submerged biofilms but poor on splash-area biofilms. Adding the oxidative biocides bromochloro-5,5-dimethylhydantoin, 2,2-dibromo-2-cyanoacetamide or peracetic acid gave additive value to the polarization, being active also on splash-area biofilms. We showed that the cathodically weighted pulsed polarization that was active in removing D. geothermalis was also effective in generating reactive oxygen species, so it is possible that the antifouling effect relied on the generation of ROS on the polarized steel surfaces. An antifouling method that is successful against D. geothermalis, a tenacious biofouler with a high tolerance to oxidative stressors, could also be effective against other biofoulers and applicable in other wet industrial processes.

Relevance: 10.00%

Publisher:

Abstract:

A compact selection of statistics on the social security programmes administered by Kela. Comprising both tables and charts, the Pocket Statistics presents key data on the benefits provided by Kela, supplemented by selected data on programmes administered by other organizations. Most of the data are updated to the end of 2010, with some of the presentations extending into 2011.

Relevance: 10.00%

Publisher:

Abstract:

The purpose of this Master's thesis is to determine the social costs of waste collection and transport in a selected study area in Punavuori, Helsinki. The costs of waste collection and transport account for a significant share of the total costs of waste management, which is why there is demand for studying and examining these costs. In addition, collection and transport costs may vary greatly due to differences in urban structure, collection methods and technologies, so a case study makes it possible to determine the collection and transport costs of an area in detail. The study area in Punavuori, Helsinki, is one of the most densely populated areas in Finland, where waste collection is complicated by narrow streets, numerous waste rooms located in inner courtyards and heavy traffic. Because of these special characteristics, waste collection and transport in the study area give rise not only to private costs but also to several externalities, for example in the form of air pollution and amenity losses. In this work, the social costs of waste collection and transport are calculated for four different waste fractions, taking into account both the private cost factors and the costs of the emissions that arise as external costs. The material used in the work consists of various literature sources on cost calculations, expert assessments and timing measurements carried out in the study area. Using a time-based calculation method based on these timing measurements, the average cost of waste collection and transport was found to be 73 €/t. However, large differences between waste fractions were observed, with collection and transport costs ranging from 49 to 125 €/t. The large cost differences between waste fractions are largely explained by the composition of the waste, as the per-tonne costs of light and bulky waste fractions were the highest. The results obtained on the basis of the theoretical background and source material also show that the share of the external costs considered in waste collection and transport is vanishingly small compared with the level of the private costs.
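As an illustration of the time-based costing described above, the sketch below combines private cost factors with external emission costs into a per-tonne social cost. All names and numbers in it are made-up placeholders, not figures or methods taken from the thesis.

```python
def social_cost_per_tonne(collection_hours, hourly_private_cost,
                          emissions_kg, cost_per_kg_emission,
                          tonnes_collected):
    """Social cost (EUR/t) = (private costs + external emission costs) / tonnes."""
    private_cost = collection_hours * hourly_private_cost          # crew, vehicle, fuel
    external_cost = sum(kg * cost for kg, cost
                        in zip(emissions_kg, cost_per_kg_emission))  # damage costs of emissions
    return (private_cost + external_cost) / tonnes_collected

# Example with placeholder values for one waste fraction:
print(social_cost_per_tonne(
    collection_hours=100.0,            # from timing measurements on the routes
    hourly_private_cost=90.0,          # EUR/h for vehicle and crew
    emissions_kg=[400.0, 1.5],         # e.g. CO2 and NOx emitted during collection
    cost_per_kg_emission=[0.05, 10.0], # assumed unit damage costs, EUR/kg
    tonnes_collected=150.0))
```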

Relevance: 10.00%

Publisher:

Abstract:

In meteorology, observations and forecasts of a wide range of phenomena (for example, snow, clouds, hail, fog, and tornadoes) can be categorical, that is, they can take only discrete values (e.g., "snow" and "no snow"). Concentrating on satellite-based snow and cloud analyses, this thesis explores methods that have been developed for the evaluation of categorical products and analyses. Different algorithms for satellite products generate different results; sometimes the differences are subtle, sometimes all too visible. In addition to differences between algorithms, the satellite products are influenced by physical processes and conditions, such as diurnal and seasonal variation in solar radiation, topography, and land use. The analysis of satellite-based snow cover analyses from NOAA, NASA, and EUMETSAT, together with snow analyses for numerical weather prediction models from FMI and ECMWF, was complicated by the fact that the true snow extent was not known, and we were forced simply to measure the agreement between the different products. Sammon mapping, a multidimensional scaling method, was then used to visualize the differences between the products. The trustworthiness of the results for cloud analyses [the EUMETSAT Meteorological Products Extraction Facility cloud mask (MPEF), together with the Nowcasting Satellite Application Facility (SAFNWC) cloud masks provided by Météo-France (SAFNWC/MSG) and the Swedish Meteorological and Hydrological Institute (SAFNWC/PPS)], compared with ceilometer observations from the Helsinki Testbed, was estimated by constructing confidence intervals (CIs). Bootstrapping, a statistical resampling method, was used to construct the CIs, especially in the presence of spatial and temporal correlation. Reference data for validation are constantly in short supply. In general, the needs of a particular project drive the requirements for evaluation, for example the required accuracy and timeliness of the data and methods. In this vein, we tentatively discuss how data provided by the general public, e.g., photos shared on the Internet photo-sharing service Flickr, can be used as a new source for validation. Results show that they are of reasonable quality, and their use for case studies can be warmly recommended. Last, the use of cluster analysis on meteorological in situ measurements was explored; the Autoclass algorithm was used to construct compact representations of the synoptic conditions of fog at Finnish airports.
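To make the bootstrap step concrete, the sketch below estimates a confidence interval for the agreement between two categorical (cloud / no-cloud) time series using a simple moving-block bootstrap, which respects temporal correlation. The block length, sample sizes and synthetic data are illustrative assumptions and not the procedure or data used in the thesis.

```python
import numpy as np

def block_bootstrap_ci(a, b, block_len=12, n_boot=2000, alpha=0.05, rng=None):
    """Bootstrap CI for the agreement P(a == b) of two categorical series.

    Resamples contiguous blocks of length block_len so that short-range
    temporal correlation within the series is preserved in each replicate.
    """
    rng = np.random.default_rng(rng)
    n = len(a)
    n_blocks = int(np.ceil(n / block_len))
    stats = np.empty(n_boot)
    for k in range(n_boot):
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        idx = np.concatenate([np.arange(s, s + block_len) for s in starts])[:n]
        stats[k] = np.mean(a[idx] == b[idx])
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

# Example with synthetic binary series (1 = cloud, 0 = clear), ~90% agreement:
rng = np.random.default_rng(0)
truth = (rng.random(500) < 0.4).astype(int)
product = np.where(rng.random(500) < 0.9, truth, 1 - truth)
print(block_bootstrap_ci(truth, product))
```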

Relevance: 10.00%

Publisher:

Abstract:

A better understanding of vacuum arcs is desirable in many of today's 'big science' projects, including linear colliders, fusion devices, and satellite systems. For the Compact Linear Collider (CLIC) design, radio-frequency (RF) breakdowns occurring in accelerating cavities influence efficiency optimisation and cost reduction issues. Studying vacuum arcs both theoretically and experimentally under well-defined and reproducible direct-current (DC) conditions is the first step towards exploring RF breakdowns. In this thesis, we have studied Cu DC vacuum arcs with a combination of experiments, a particle-in-cell (PIC) model of the arc plasma, and molecular dynamics (MD) simulations of the subsequent surface damaging mechanism. We have also developed the 2D Arc-PIC code and the physics model incorporated in it, especially for the purpose of modelling plasma initiation in vacuum arcs. Assuming the presence of a field emitter at the cathode initially, we have identified the conditions for plasma formation and have studied the transition from the field emission stage to a fully developed arc. The 'footing' of the plasma is the cathode spot, which continuously supplies the arc with particles; the high-density core of the plasma is located above this cathode spot. Our results have shown that once an arc plasma is initiated, and as long as energy is available, the arc is self-maintaining due to the plasma sheath, which ensures enhanced field emission and sputtering. The plasma model can already give an estimate of how the time-to-breakdown changes with the neutral evaporation rate, which is yet to be determined by atomistic simulations. Due to the non-linearity of the problem, we have also performed a code-to-code comparison; the reproducibility of the plasma behaviour and of the time-to-breakdown with independent codes increased confidence in the results presented here. Our MD simulations identified high-flux, high-energy ion bombardment as a possible mechanism forming the early-stage surface damage in vacuum arcs. In this mechanism, sputtering occurs mostly in clusters, as a consequence of overlapping heat spikes. Different-sized experimental and simulated craters were found to be self-similar, with a crater depth-to-width ratio of about 0.23 (simulation) to 0.26 (experiment). Experiments carried out to investigate the energy dependence of DC breakdown properties point to an intrinsic connection between DC and RF scaling laws and suggest that accumulative effects may influence the field enhancement factor.
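The field emission stage mentioned above is commonly described by the Fowler-Nordheim relation. The sketch below uses the elementary textbook form (without image-charge corrections); the field enhancement factor, applied field and copper work function are assumed example values, and the function is not taken from the 2D Arc-PIC code.

```python
import numpy as np

# Elementary Fowler-Nordheim field emission current density.
A_FN = 1.54e-6   # A eV V^-2
B_FN = 6.83e9    # eV^-3/2 V m^-1

def fowler_nordheim_j(E_applied, beta=50.0, phi=4.5):
    """Current density (A/m^2) for local field beta*E_applied and work function phi (eV)."""
    F = beta * E_applied                               # local field at the emitter tip, V/m
    return (A_FN * F**2 / phi) * np.exp(-B_FN * phi**1.5 / F)

# Example: a DC gap field of 100 MV/m with an assumed beta of 50 gives a 5 GV/m local field.
print(fowler_nordheim_j(100e6))
```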

Relevance: 10.00%

Publisher:

Abstract:

Bayesian networks are compact, flexible, and interpretable representations of a joint distribution. When the network structure is unknown but observational data are at hand, one can try to learn the network structure; this is called structure discovery. This thesis contributes to two areas of structure discovery in Bayesian networks: space-time tradeoffs and learning ancestor relations. The fastest exact algorithms for structure discovery in Bayesian networks are based on dynamic programming and use excessive amounts of space. Motivated by this space usage, several schemes for trading space against time are presented. These schemes are presented in a general setting for a class of computational problems called permutation problems; structure discovery in Bayesian networks is seen as a challenging variant of the permutation problems. The main contribution in the area of space-time tradeoffs is the partial order approach, in which the standard dynamic programming algorithm is extended to run over partial orders. In particular, a certain family of partial orders called parallel bucket orders is considered, and a partial order scheme that provably yields an optimal space-time tradeoff within parallel bucket orders is presented. Practical issues concerning parallel bucket orders are also discussed. Learning ancestor relations, that is, directed paths between nodes, is motivated by the need for robust summaries of network structures when unobserved nodes are at work. Ancestor relations are nonmodular features, and hence learning them is more difficult than learning modular features. A dynamic programming algorithm is presented for computing the posterior probabilities of ancestor relations exactly. Empirical tests suggest that ancestor relations can be learned from observational data almost as accurately as arcs, even in the presence of unobserved nodes.
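For context, the sketch below shows the standard dynamic programming over variable subsets for exact score-based structure discovery, the kind of algorithm that the partial order approach generalizes. It assumes a decomposable scoring function; `local_score` and the toy example at the end are made-up placeholders, and the brute-force enumeration of parent sets is exponential, so this is only illustrative for a handful of variables.

```python
from itertools import combinations

def best_network(variables, local_score):
    """Exact structure discovery by dynamic programming over subsets (sketch)."""
    n = len(variables)
    # Best parent set for each variable v within each candidate set S of other variables.
    best_parents = {}
    for v in variables:
        others = [u for u in variables if u != v]
        for r in range(n):
            for S in combinations(others, r):
                best_parents[(v, S)] = max(
                    ((local_score(v, P), P)
                     for k in range(len(S) + 1)
                     for P in combinations(S, k)),
                    key=lambda t: t[0])
    # DP over subsets: best subnetwork on S, built by choosing the last
    # variable ("sink") of an optimal ordering of S.
    best_net = {(): (0.0, {})}
    for r in range(1, n + 1):
        for S in combinations(variables, r):
            candidates = []
            for sink in S:
                rest = tuple(u for u in S if u != sink)
                p_score, parents = best_parents[(sink, rest)]
                prev_score, prev_net = best_net[rest]
                candidates.append((prev_score + p_score, {**prev_net, sink: parents}))
            best_net[S] = max(candidates, key=lambda t: t[0])
    return best_net[tuple(variables)]

# Toy usage with three variables and an arbitrary made-up score:
vars3 = ('A', 'B', 'C')
toy_score = lambda v, parents: -len(parents) + (1.5 if v == 'C' and 'A' in parents else 0.0)
print(best_network(vars3, toy_score))
```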