980 results for C source
Abstract:
OBJECTIVES: There is concern regarding the possible health effects of cellular telephone use. We examined whether the source of funding of studies of the effects of low-level radiofrequency radiation is associated with the results of studies. We conducted a systematic review of studies of controlled exposure to radiofrequency radiation with health-related outcomes (electroencephalogram, cognitive or cardiovascular function, hormone levels, symptoms, and subjective well-being). DATA SOURCES: We searched EMBASE, Medline, and a specialist database in February 2005 and scrutinized reference lists from relevant publications. DATA EXTRACTION: Data on the source of funding, study design, methodologic quality, and other study characteristics were extracted. The primary outcome was the reporting of at least one statistically significant association between the exposure and a health-related outcome. Data were analyzed using logistic regression models. DATA SYNTHESIS: Of 59 studies, 12 (20%) were funded exclusively by the telecommunications industry, 11 (19%) were funded by public agencies or charities, 14 (24%) had mixed funding (including industry), and in 22 (37%) the source of funding was not reported. Studies funded exclusively by industry reported the largest number of outcomes, but were least likely to report a statistically significant result: The odds ratio was 0.11 (95% confidence interval, 0.02-0.78), compared with studies funded by public agencies or charities. This finding was not materially altered in analyses adjusted for the number of outcomes reported, study quality, and other factors. CONCLUSIONS: The interpretation of results from studies of health effects of radiofrequency radiation should take sponsorship into account.
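The funding-source comparison described above comes down to a logistic regression of a binary outcome (whether a study reported at least one statistically significant association) on funding category, with publicly funded studies as the reference. A minimal Python sketch of that kind of analysis is shown below; the per-category counts and variable names are invented for illustration and are not the study's dataset.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Illustrative study-level data (NOT the real dataset): one row per study,
    # its funding category and whether >= 1 significant result was reported.
    df = pd.DataFrame({
        "funding": ["industry"] * 12 + ["public"] * 11
                 + ["mixed"] * 14 + ["unreported"] * 22,
        "significant": [1] * 2 + [0] * 10      # industry (made-up split)
                     + [1] * 8 + [0] * 3       # public
                     + [1] * 9 + [0] * 5       # mixed
                     + [1] * 14 + [0] * 8,     # unreported
    })

    # Logistic regression with publicly funded studies as the reference category.
    model = smf.logit(
        "significant ~ C(funding, Treatment(reference='public'))", data=df
    ).fit()

    # Exponentiated coefficients give odds ratios; exponentiated confidence
    # limits give the corresponding 95% CIs.
    print(np.exp(model.params))
    print(np.exp(model.conf_int()))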
Abstract:
The novel tabletop miniaturized radiocarbon dating system (MICADAS) at ETH Zurich features a hybrid Cs sputter negative ion source for the measurement of solid graphite and gaseous CO₂ samples. The source produces stable currents of up to 6 μA of C⁻ from gaseous samples with an efficiency of 3-6%. A gas feeding system has been set up that enables constant dosing of CO₂ into the Cs sputter ion source and ensures stable measuring conditions. The system is based on a syringe in which CO₂ gas is mixed with He and then pressed continuously into the ion source at a constant flow rate. Minimized volumes allow samples of 3-30 μg carbon to be fed quantitatively into the ion source. To test the performance of the system, several standards and blanks have been measured successfully. The ¹⁴C/¹²C ratios were reproducible within statistical errors to better than 1.0% and the ¹³C/¹²C ratios to better than 0.2%. The blank was < 1 pMC.
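The figures quoted above (up to 6 μA of C⁻, 3-6% efficiency, 3-30 μg carbon per sample) allow a rough back-of-envelope estimate of how long a gas sample lasts in the source. The sketch below assumes that the quoted efficiency means C⁻ ions extracted per carbon atom fed, which is a simplification; it is an order-of-magnitude check, not a result from the paper.

    # Rough estimate of carbon consumption in the gas ion source (illustrative only).
    E_CHARGE = 1.602e-19     # C per elementary charge
    AVOGADRO = 6.022e23      # atoms per mol
    M_CARBON = 12.0          # g per mol

    current_A = 6e-6         # 6 uA of C- (from the abstract)
    efficiency = 0.05        # assume 5%, mid-range of the quoted 3-6%

    ions_per_s = current_A / E_CHARGE             # extracted C- ions per second
    atoms_fed_per_s = ions_per_s / efficiency     # carbon atoms fed per second
    carbon_ug_per_s = atoms_fed_per_s / AVOGADRO * M_CARBON * 1e6

    sample_ug = 30.0                              # largest quoted sample size
    print(f"carbon consumption ~ {carbon_ug_per_s * 1e3:.1f} ng/s")
    print(f"a {sample_ug:.0f} ug sample lasts ~ {sample_ug / carbon_ug_per_s / 60:.0f} min")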
Abstract:
A multiple source model (MSM) for the 6 MV beam of a Varian Clinac 2300 C/D was developed by simulating radiation transport through the accelerator head for a set of square fields using the GEANT Monte Carlo (MC) code. The corresponding phase space (PS) data enabled the characterization of 12 sources representing the main components of the beam defining system. By parametrizing the source characteristics and by evaluating the dependence of the parameters on field size, it was possible to extend the validity of the model to arbitrary rectangular fields, including the central 3 × 3 cm² field, without additional precalculated PS data. Finally, a sampling procedure was developed to reproduce the PS data. To validate the MSM, the fluence, energy fluence and mean energy distributions determined from the original and the reproduced PS data were compared and showed very good agreement. In addition, the MC calculated primary energy spectrum was verified against an energy spectrum derived from transmission measurements. Comparisons of MC calculated depth dose curves and profiles, using the original PS data and the PS data reproduced by the MSM, agree within 1% and 1 mm. Deviations from measured dose distributions are within 1.5% and 1 mm, although the real beam leads to somewhat larger deviations outside the geometrical beam area for large fields. Calculated output factors at 10 cm water depth agree within 1.5% with experimentally determined data. In conclusion, the MSM produces accurate PS data for MC photon dose calculations for the rectangular fields specified.
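The key step in such a multiple source model is the sampling procedure that regenerates phase-space particles from the parametrized sub-sources instead of storing full PS files. The sketch below illustrates that general idea (Gaussian spatial sampling plus sampling from a tabulated energy spectrum); the distributions, bin edges and weights are assumptions for illustration, not the paper's actual parametrization.

    import numpy as np

    rng = np.random.default_rng(42)

    def sample_sub_source(n, sigma_xy_cm, energy_bins_mev, spectrum_weights):
        """Draw n particles from one parametrized sub-source (illustrative).

        Position: 2D Gaussian of width sigma_xy_cm at the source plane.
        Energy:   sampling from a tabulated (binned) spectrum.
        """
        # Spatial sampling
        x = rng.normal(0.0, sigma_xy_cm, n)
        y = rng.normal(0.0, sigma_xy_cm, n)

        # Energy sampling: pick a bin according to the spectrum weights,
        # then draw uniformly within that bin.
        p = np.array(spectrum_weights, dtype=float)
        p /= p.sum()
        idx = rng.choice(len(p), size=n, p=p)
        lo, hi = energy_bins_mev[idx], energy_bins_mev[idx + 1]
        energy = rng.uniform(lo, hi)

        return x, y, energy

    # Hypothetical 6 MV-like spectrum, weighted toward low energies (made up).
    bins = np.linspace(0.25, 6.0, 24)
    weights = np.exp(-np.linspace(0.25, 6.0, 23))
    x, y, e = sample_sub_source(100000, sigma_xy_cm=0.1,
                                energy_bins_mev=bins, spectrum_weights=weights)
    print(e.mean(), x.std())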
Abstract:
Dental identification is the most valuable method to identify human remains in single cases with major postmortem alterations as well as in mass casualties, because of its practicability and high reliability. Computed tomography (CT) has been investigated as a supportive tool for forensic identification and has proven valuable; it can also scan the dentition of a deceased person within minutes. In the present study, we investigated currently used restorative materials using ultra-high-resolution dual-source CT and the extended CT scale, with the aim of a color-encoded, in-scale, and artifact-free visualization in 3D volume rendering. In 122 human molars, 220 cavities with 2-, 3-, 4- and 5-mm diameter were prepared. These cavities were restored with presently used filling materials (different composites, temporary filling materials, ceramic, and liner) in six teeth for each material and cavity size (exception: amalgam, n = 1). The teeth were CT scanned and images were reconstructed using an extended CT scale. Filling materials were analyzed in terms of the resulting Hounsfield units (HU) and the representation of filling size within the images. The various restorative materials showed distinctly differing radiopacities, allowing for CT-data-based discrimination; in particular, ceramic and composite fillings could be differentiated. The HU values were used to generate an updated volume-rendering preset for postmortem extended-CT-scale data of the dentition that easily visualizes the position of restorations, their shape (in scale), and the material used, color-encoded in 3D. The results provide the scientific background for the application of 3D volume rendering to visualize the human dentition for forensic identification purposes.
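For the color-encoded volume rendering described above, each restorative material is ultimately mapped from its characteristic Hounsfield range on the extended CT scale to a display color. The HU ranges and colors below are placeholders chosen to illustrate the lookup logic only; they are not the values measured in the study.

    # Illustrative HU-range-to-material lookup for extended-CT-scale data.
    # The ranges and colors are placeholders, NOT the study's measured values.
    MATERIAL_PRESET = [
        # (hu_min, hu_max, material,         rgb color)
        (3000,   8000,  "composite",        (0.2, 0.6, 1.0)),
        (8000,  15000,  "ceramic",          (1.0, 0.8, 0.2)),
        (15000, 30500,  "amalgam / metal",  (0.8, 0.2, 0.2)),
    ]

    def classify_voxel(hu_value):
        """Return (material, color) for a voxel, or None if no range matches."""
        for hu_min, hu_max, material, color in MATERIAL_PRESET:
            if hu_min <= hu_value < hu_max:
                return material, color
        return None

    print(classify_voxel(9500))   # -> ('ceramic', ...) with these placeholder ranges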
Abstract:
The many different proxy records from the European Project for Ice Coring in Antarctica (EPICA) Dome C ice core allow for the first time a comparison of nine glacial terminations in great detail. Despite the fact that all terminations cover the transition from a glacial maximum into an interglacial, there are large differences between single terminations. For some terminations, Antarctic temperature increased only moderately, while for others, the amplitude of change at the termination was much larger. For the different terminations, the rate of change in temperature is more similar than the magnitude or duration of change. These temperature changes were accompanied by vast changes in dust and sea salt deposition all over Antarctica. Here we investigate the phasing between a South American dust proxy (non-sea-salt calcium flux, nssCa²⁺), a sea ice proxy (sea salt sodium flux, ssNa⁺) and a proxy for Antarctic temperature (deuterium, δD). In particular, we look into whether a similar sequence of events applies to all terminations, despite their different characteristics. All proxies are derived from the EPICA Dome C ice core, resulting in a relative dating uncertainty between the proxies of less than 20 years. At the start of the terminations, the temperature (δD) increase and the dust (nssCa²⁺ flux) decrease start synchronously. The sea ice proxy (ssNa⁺ flux), however, only changes once the temperature has reached a particular threshold, approximately 5°C below present-day temperatures (corresponding to a δD value of −420‰). This reflects to a large extent the limited sensitivity of the sea ice proxy during very cold periods with large sea ice extent. At terminations where this threshold is not reached (TVI, TVIII), the ssNa⁺ flux shows no changes. Above this threshold, the sea ice proxy is closely coupled to the Antarctic temperature, and interglacial levels are reached at the same time for both ssNa⁺ and δD. On the other hand, once another threshold at approximately 2°C below present-day temperature is passed (corresponding to a δD value of −402‰), the nssCa²⁺ flux has reached interglacial levels and does not change any more, despite further warming. This threshold behaviour most likely results from a combination of changes to the threshold friction velocity for dust entrainment and to the distribution of surface wind speeds in the dust source region.
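The two thresholds quoted above (about 5°C below present at δD ≈ −420‰ and about 2°C below present at δD ≈ −402‰) are mutually consistent with an isotope-temperature sensitivity of roughly 6‰ per °C, as the short check below shows; only the numbers from the abstract are used.

    # Consistency check of the two delta-D thresholds quoted in the abstract.
    dD_cold, dT_cold = -420.0, -5.0   # per mil, degC relative to present
    dD_warm, dT_warm = -402.0, -2.0

    slope = (dD_warm - dD_cold) / (dT_warm - dT_cold)   # per mil per degC
    print(f"implied isotope-temperature slope: {slope:.1f} per mil / degC")  # -> 6.0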
Abstract:
"Open source and European antitrust laws: An analysis of copyleft and the prohibition of software license fees on the basis of art. 101 TFEU and the block exemptions." Open source software and open source licenses (like the GNU GPL) are not only relevant for computer nerds or activists – they are already business. They are, for example, the foundation of LINUX, the only real rival of MICROSOFT's WINDOWS line in the field of operating systems for IBM PC compatibles. Art. 101 TFEU (like its identical predecessor, art. 81 TEC), as part of EU antitrust law, prohibits contract terms such as price fixing and certain forms of technology control. Are copyleft – the "viral effect", the "cancer" – and the prohibition of software license fees in the crosshairs of this legal rule? On the other side, the European Union has had since 2004 a new Technology Transfer Block Exemption that includes software license agreements in its scope for the first time: a safe harbour and a dry place under an umbrella for open source software? After the introduction (A), with its description of open source software, the text analyses the system of the European Union's competition or antitrust law and the requirements of the block exemptions (B). The starting point of antitrust analysis is the undertaking – but who are the undertakings (C) in a field of widespread, independent developers working in a "bazaar" organization? To see how much open source has to fear from the law of the European Union, the anti-competitive and pro-competitive effects of open source are finally weighed against each other within the legal framework (D). The conclusion (E) shows: not nothing, but not much.
Abstract:
The aim of this study was to evaluate state-dependent effects of diazepam on the frequency characteristics of 47-channel spontaneous EEG maps. A novel method, the FFT-Dipole-Approximation (Lehmann and Michel, 1990), was used to study effects on the strength and the topography of the maps in the different frequency bands. Map topography was characterized by the 3-dimensional location of the equivalent dipole source, and map strength was defined as the spatial standard deviation (the Global Field Power) of the map at each frequency point. The Global Field Power can be considered a measure of the amount of energy produced by the system, while the source location gives an estimate of the center of gravity of all sources in the brain that were active at a certain frequency. State dependency was studied by evaluating the drug effects before and after a continuous performance task of 25 min duration. Clear interactions between drug (diazepam vs. placebo) and time after drug intake (before and after the task) were found, especially in the inferior-superior location of the dipole sources. This supports the hypothesis that diazepam, like other drugs, has different effects on brain functions depending on the momentary functional state of the brain. In addition to the drug effects, clearly different source locations and Global Field Power values were found for the different frequency bands, replicating earlier reports (Michel et al., 1992).
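Global Field Power, as defined above, is simply the spatial standard deviation of the multichannel map at each frequency (or time) point. A minimal sketch for a channels × frequency-bins array is shown below; the array shape and the random data standing in for 47-channel spectra are assumptions for illustration.

    import numpy as np

    def global_field_power(maps):
        """Spatial standard deviation of each map.

        maps: array of shape (n_channels, n_points), e.g. 47 channels x frequency bins.
        Returns an array of length n_points with one GFP value per map.
        """
        # Reference each map to its spatial mean, then take the RMS across channels.
        centered = maps - maps.mean(axis=0, keepdims=True)
        return np.sqrt((centered ** 2).mean(axis=0))

    # Illustrative use with random data standing in for 47-channel spectra.
    spectra = np.random.default_rng(1).normal(size=(47, 128))
    gfp = global_field_power(spectra)
    print(gfp.shape)   # (128,)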
Abstract:
Campylobacteriosis is the most frequent zoonosis in developed countries, and various domestic animals can function as reservoirs for the main pathogens Campylobacter jejuni and Campylobacter coli. In the present study we compared the population structures of C. jejuni and C. coli isolates collected between 2001 and 2012 in Switzerland: 730 from human cases, 610 from chickens, 159 from dogs, 360 from pigs and 23 from cattle. All isolates had been typed with multilocus sequence typing (MLST) and flaB-typing, and their genotypic resistance to quinolones was determined. We used complementary approaches, testing for differences between isolates from different hosts with the proportional similarity index as well as the fixation index, and attributing the source of the human isolates by Bayesian assignment using the software STRUCTURE. Analyses were done with MLST and flaB data in parallel, and both typing methods were tested for associations of genotypes with quinolone resistance. Results obtained with MLST and flaB data corresponded remarkably well, both indicating chickens as the main source of human infection for both Campylobacter species. Based on MLST, 70.9% of the human cases were attributed to chickens, 19.3% to cattle, 8.6% to dogs and 1.2% to pigs. Furthermore, we found a host-independent association between sequence type (ST) and quinolone resistance, most notably for ST-45, all isolates of which were susceptible, and ST-464, all isolates of which were resistant.
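One of the host-comparison measures mentioned above, the proportional similarity index, compares the relative frequencies of sequence types between two host populations. A minimal sketch is given below; the ST lists are hypothetical stand-ins for real isolate data, and the study's full analysis additionally used the fixation index and STRUCTURE-based Bayesian assignment.

    from collections import Counter

    def proportional_similarity(sts_host_a, sts_host_b):
        """Proportional similarity between two ST frequency distributions.

        PS = sum_i min(p_i, q_i), where p and q are the relative frequencies of
        each sequence type in the two host populations; 1 = identical, 0 = disjoint.
        """
        pa, pb = Counter(sts_host_a), Counter(sts_host_b)
        na, nb = len(sts_host_a), len(sts_host_b)
        sts = set(pa) | set(pb)
        return sum(min(pa[st] / na, pb[st] / nb) for st in sts)

    # Hypothetical ST lists standing in for human and chicken isolates.
    human   = ["ST-45", "ST-45", "ST-21", "ST-464", "ST-48"]
    chicken = ["ST-45", "ST-45", "ST-45", "ST-21", "ST-257"]
    print(proportional_similarity(human, chicken))   # 0.6 for these made-up lists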