898 results for SOFT-GAMMA REPEATERS
Abstract:
Massive protostars drive bipolar outflows with velocities of hundreds of km s-1. Such outflows can produce strong shocks when they interact with the ambient medium, leading to regions of nonthermal radio emission. Aims. We aim to explore the conditions under which relativistic particles are accelerated at the terminal shocks of protostellar jets and whether they can produce significant gamma-ray emission. Methods. We estimate the conditions necessary for particle acceleration up to very high energies and for gamma-ray production in the nonthermal hot spots of jets associated with massive protostars embedded in dense molecular clouds. Results. We show that relativistic bremsstrahlung and proton-proton collisions can make molecular clouds hosting massive young stellar objects detectable by the Fermi satellite at MeV-GeV energies and by Cherenkov telescope arrays in the GeV-TeV range. Conclusions. Gamma-ray astronomy can be used to probe the physical conditions in star-forming regions and the particle acceleration processes at work in the complex environment of massive molecular clouds.
Abstract:
Objective To evaluate the utility of a new multimodal image-guided intervention technique for detecting epileptogenic areas with a gamma probe, as compared with intraoperative electrocorticography. Materials and Methods Two symptomatic patients with refractory epilepsy underwent magnetic resonance imaging, video-electroencephalography, brain SPECT scanning and neuropsychological evaluation, and were submitted to gamma probe-assisted surgery. Results In patient 1, the maximum radioactive count was initially observed on the temporal gyrus, approximately 3.5 cm posterior to the tip of the left temporal lobe. After corticotomy, the gamma probe indicated the maximum count at the head of the hippocampus, in agreement with the findings of intraoperative electrocorticography. In patient 2, the maximum count was observed in the occipital region at the transition between the temporal and parietal lobes (right hemisphere). During surgery, the area of epileptogenic activity mapped by electrocorticography was also delimited, demarcated, and compared with the gamma probe findings. After lesionectomy, new radioactive counts were obtained both in the patients and on the surgical specimens (ex vivo). Conclusion Intraoperative electrocorticography and gamma probe-assisted surgery yielded similar findings. The advantages of the gamma probe include noninvasiveness, low cost, and the capacity to demonstrate a decrease in radioactivity at the excision site after lesionectomy.
Abstract:
The efficacy of Gamma Knife surgery (GKS) in local tumor control of non-secreting paragangliomas (PGLs) has been fully described in previous studies. With regard to secreting PGLs, however, only one previous case report advocates its efficacy at a biological level. The aims of this study were: 1) to evaluate the safety and efficacy of GKS in a dopamine-secreting PGL; and 2) to investigate whether the biological concentration of free methoxytyramine can be used as a marker of treatment efficacy during follow-up. We describe the case of a 62-year-old man diagnosed with a left PGL. He initially underwent complete surgical excision. Thirty months later, he developed recurrent biological and neuroradiological disease; the most sensitive biomarker for monitoring the disease, the concentration of plasma free methoxytyramine, started to increase. GKS was performed at a maximal marginal dose of 16 Gy. Over the following 30 months, the concentration of free methoxytyramine gradually decreased from 0.14 nmol/l (2*URL) before GKS to 0.09 nmol/l at 6 months after GKS and 0.07 nmol/l (1.1*URL) at the last follow-up, confirming the efficacy of the treatment. Additionally, at 30 months there was approximately 36.6% shrinkage of the initial target volume. The GKS treatment was safe and effective, as confirmed clinically, neuroradiologically and biologically. The case illustrates the importance of including methoxytyramine in laboratory analyses of biological samples when assessing the biochemical activity of a PGL. In addition, the identification of methoxytyramine as a unique positive biomarker could designate it for the monitoring of tumor relapse after treatments, including Gamma Knife surgery.
Abstract:
Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes, and it holds huge potential for astrophysics, particle physics and cosmology. CTA is an international initiative to build the next-generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV-10 TeV range and an extension to energies well below 100 GeV and above 100 TeV. CTA will consist of two arrays (one in the north, one in the south) for full-sky coverage and will be operated as an open observatory. The design of CTA is based on currently available technology. This document reports on the status of CTA and presents its major design concepts.
Abstract:
The purpose of gamma spectrometry and of gamma and X-ray tomography of nuclear fuel is to determine both the radionuclide concentrations and the integrity and deformation of the fuel. The aims of this thesis were to establish the basics of gamma spectrometry and tomography of nuclear fuel, to describe the operational mechanisms of the corresponding measurement equipment, and to identify problems related to these measurement techniques. In gamma spectrometry of nuclear fuel, the gamma-ray flux emitted by unstable isotopes is measured using high-resolution gamma-ray spectroscopy; the production of unstable isotopes correlates with various physical fuel parameters. In gamma emission tomography, the gamma-ray spectrum of irradiated nuclear fuel is recorded for several projections. In X-ray transmission tomography of nuclear fuel, a radiation source emits a beam whose intensity, attenuated by the fuel, is registered by detectors placed opposite the source. When gamma emission or X-ray transmission measurements are combined with tomographic image reconstruction methods, sectional images of the interior of the fuel can be created. MODHERATO is a computer code that simulates the operation of radioscopic or tomographic devices and is used to predict and optimise the performance of imaging systems; the author performed MODHERATO simulations related to X-ray tomography. Gamma spectrometry and gamma and X-ray tomography are promising non-destructive examination methods for understanding fuel behaviour under normal, transient and accident conditions.
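The transmission measurement described above is governed by the Beer-Lambert law, I = I0 exp(-∫ mu dl): each detector records the incident intensity attenuated by the line integral of the attenuation coefficient along its ray, and tomographic reconstruction inverts the log-transformed data. A minimal sketch in a simple parallel-beam geometry (the attenuation map, pixel size and beam intensity below are illustrative assumptions, not MODHERATO parameters):

```python
import numpy as np

# Illustrative 2D attenuation map (cm^-1): a "fuel rod" disc in air
n = 64
y, x = np.mgrid[0:n, 0:n]
mu = np.where((x - n / 2) ** 2 + (y - n / 2) ** 2 < (n / 4) ** 2, 0.5, 0.0)

pixel_size = 0.1  # cm, assumed pixel width
I0 = 1e6          # incident beam intensity (counts), assumed

# One parallel-beam projection: each detector sees the line integral of mu
# along a horizontal ray, attenuated per the Beer-Lambert law
line_integrals = mu.sum(axis=1) * pixel_size
I = I0 * np.exp(-line_integrals)

# Reconstruction methods start from the log-transformed data, which
# recovers the line integrals of mu that are then back-projected
recovered = -np.log(I / I0)
assert np.allclose(recovered, line_integrals)
```

Rays that miss the disc arrive unattenuated (I = I0); repeating this projection at many rotation angles yields the sinogram that reconstruction algorithms such as filtered back-projection invert.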
Abstract:
This paper analyses how fiscal adjustment comes about when both central and sub-national governments are involved in consolidation. We test the sustainability of public debt with a fiscal rule for both the federal and the regional level of government. Results for the German Länder show that lower-tier governments bear a relatively smaller part of the burden of debt consolidation, if they consolidate at all; most of the fiscal adjustment occurs via central government debt. In contrast, both the federal and state levels in the US contribute to the consolidation of public finances.
Abstract:
Capillary electrophoresis (CE) with capacitively coupled contactless conductivity detection (C4D) was used to determine sodium and potassium concentrations in diet and non-diet soft drinks. Higher sodium concentrations were found in the diet samples due to the use of the sodium salts of cyclamate and saccharin as sweeteners. The CE-C4D method can be used by food industries and health regulatory agencies to monitor sodium and potassium content, not only in soft drinks but also in many other food products.
Abstract:
Connectivity depends on rates of dispersal between communities. For marine soft-sediment communities, continued small-scale dispersal of post-larvae and adults can be as important in maintaining community composition as the initial colonization of the substrate by pelagic larvae. This thesis investigated the post-larval dispersal strategies of benthic invertebrates and the mechanisms by which communities are connected. Such knowledge is scarce, because dispersal is difficult to measure directly in nature, and it had not previously been quantified in the Baltic Sea. Different trap types were deployed underwater to capture dispersing invertebrates at different sites while waves and currents were measured in parallel. Local community composition was found to change predictably with varying rates of dispersal and physical connectivity (waves and currents). This response, however, depended on the dispersal-related traits of taxa. Actively dispersing taxa are relatively better at maintaining their position, as they do not depend on hydrodynamic conditions for dispersal and are less prone to passive transport by currents. Taxa also dispersed in relative proportions distinctly different from the resident community composition, and a significant proportion (40%) of taxa lacked a planktonic larval life-stage. Community assembly was re-started in a large-scale manipulative field experiment over one year across several sites, which revealed how patterns of community composition (α-, β- and γ-diversity) change depending on rates of dispersal. The results also demonstrated that, in response to small-scale disturbance, initial recruitment was by nearby dominant species, after which other species arrived from successively further away.
At later stages of assembly, the number of coexisting species increased beyond what was expected purely from local niche requirements (species sorting), transferring regional differences in community composition (β-diversity) to the local scale (α-diversity, mass effect). The findings of this thesis complement more theoretical studies in metacommunity ecology by demonstrating that understanding how and when individuals disperse relative to the underlying environmental heterogeneity is key to interpreting how patterns of diversity change across spatial scales. Such information from nature is critical when predicting responses to, for example, different types of disturbance or management actions in conservation.
Abstract:
This thesis analyses news articles from three websites from a linguistic perspective. The aim is to determine whether the reporting of BBC, CNN and Fox News displays political bias or partisanship, and how these manifest in the language of the news articles. Drawing on critical discourse analysis, the thesis presents background on each news site (for example its structure and funding) as well as background on media discourse and politics, so that Norman Fairclough's three-stage method can be applied as thoroughly as possible. The news sites are analysed using functional grammar and other linguistic tools suited to critical discourse analysis. The headlines of the entire corpus (404 articles) are analysed first, after which nine complete articles on three different topics are analysed, one article per topic from each website. The primary analytical tools are those of the textual metafunction of systemic functional grammar (thematic structure); tools of the ideational metafunction (transitivity), referential identity chains and lexical analysis are also employed. The analysis is comparative throughout, which makes its results easier to observe and justify. Based on previous research and general perception, the hypothesis is that CNN reports in a tone favourable to the Democratic Party and Fox News in a tone favourable to the Republican Party. The results ranged from supporting the hypothesis to contradicting it, with some findings insufficiently supported in either direction. The strongest results, however, support the hypothesis, and this thesis therefore concludes that the reporting of at least these three websites is not impartial.
Moreover, for a few topics the reporting is framed so consistently from a particular perspective that naturalization of ideologies, in the sense of naturalization theory, may be taking place. Given the success of the methods used in this thesis, it is recommended that the analytical tools of the textual metafunction be used more widely. A meta-analysis is also recommended to determine which analytical methods are best suited to which kinds of material.
Abstract:
This study aimed to evaluate the interference of the tuberculin test with the gamma-interferon (IFNg) assay, to estimate the sensitivity and specificity of the IFNg assay under Brazilian conditions, and to simulate multiple testing using the comparative tuberculin test and the IFNg assay. Three hundred and fifty cattle from two TB-free and two TB-infected herds were submitted to the comparative tuberculin test and the IFNg assay. The comparative tuberculin test was performed using avian and bovine PPD. The IFNg assay was performed with the BovigamTM kit (CSL Veterinary, Australia), according to the manufacturer's specifications. The sensitivity and specificity of the IFNg assay were assessed with a Bayesian latent class model; these diagnostic parameters were also estimated for multiple testing. The results of the IFNg assay on D0 and D3 after the comparative tuberculin test were compared by McNemar's test and kappa statistics. Mean optical densities from the IFNg assay were similar on both days. The sensitivity and specificity of the IFNg assay ranged (95% confidence intervals) from 72 to 100% and from 74 to 100%, respectively. The sensitivity of parallel testing was over 97.5%, while the specificity of serial testing was over 99.7%. The IFNg assay proved to be a very useful diagnostic method.
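The gains quoted for multiple testing follow from the standard formulas for combining two tests under an assumption of conditional independence: parallel interpretation (positive if either test is positive) raises sensitivity, while serial interpretation (positive only if both are positive) raises specificity. A minimal sketch with illustrative single-test values chosen within the reported intervals (not the study's actual estimates):

```python
def parallel(se1, sp1, se2, sp2):
    # Parallel: animal is positive if either test is positive
    # -> combined sensitivity rises, specificity falls
    return 1 - (1 - se1) * (1 - se2), sp1 * sp2

def serial(se1, sp1, se2, sp2):
    # Serial: animal is positive only if both tests are positive
    # -> combined specificity rises, sensitivity falls
    return se1 * se2, 1 - (1 - sp1) * (1 - sp2)

# Hypothetical values (assumed for illustration)
se_tub, sp_tub = 0.85, 0.95    # comparative tuberculin test
se_ifng, sp_ifng = 0.85, 0.94  # IFNg assay

se_par, sp_par = parallel(se_tub, sp_tub, se_ifng, sp_ifng)
se_ser, sp_ser = serial(se_tub, sp_tub, se_ifng, sp_ifng)
```

With these inputs, parallel sensitivity is 1 - 0.15 x 0.15 = 0.9775 and serial specificity is 1 - 0.05 x 0.06 = 0.997, matching the scale of the gains reported above; the formulas assume the two tests err independently given the animal's true status.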
Abstract:
When machines are modeled in their natural working environment, collisions become a very important factor in simulation accuracy. As simulations expand to include the operating environment, the need for a general collision model able to handle a wide variety of cases has become central in the development of simulation environments. The operating environment also changes the challenges for the collision modeling method: more simultaneous contacts with more objects occur in more complicated situations, making the real-time requirement harder to meet. Common problems in current collision modeling methods include dependency on geometry shape or mesh density, computational cost that grows exponentially with the number of contacts, the lack of a proper friction model, and failures in certain configurations such as closed kinematic loops. All of these problems mean that current modeling methods fail in certain situations. A method that never fails in any situation is not realistic, but improvements can be made over the current methods.