964 results for electron capture detection
Abstract:
High magnification and large depth of field with a temporal resolution of less than 100 microseconds are possible using the present invention, which combines a linear electron beam produced by the tungsten filament of an SX-40A Scanning Electron Microscope (SEM); a magnetic deflection coil with lower inductance, obtained by reducing the number of turns of the saddle-coil wires while increasing the wire diameter; a fast scintillator, photomultiplier tube, photomultiplier tube base, and signal amplifiers; and a high-speed data acquisition system that allows a scan rate of 381 frames per second and a 256×128 pixel density in the SEM image at a data acquisition rate of 25 MHz. The data acquisition and scan position are fully coordinated: a digitizer and a digital waveform generator, which generates the sweep signals to the scan coils, run off the same clock to acquire the signal in real time.
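As a quick consistency check on the quoted figures (a back-of-the-envelope sketch, not part of the abstract above), the 25 MHz acquisition rate, 381 frames per second and 256×128 pixel image imply roughly two digitizer samples per pixel and a pixel dwell time of about 80 ns:

```python
# Back-of-the-envelope timing check; all numbers are taken from the abstract above.
FRAME_RATE_HZ = 381            # frames per second
PIXELS_X, PIXELS_Y = 256, 128  # image size
ADC_RATE_HZ = 25e6             # data acquisition rate

pixels_per_frame = PIXELS_X * PIXELS_Y             # 32,768 pixels per frame
pixel_rate_hz = pixels_per_frame * FRAME_RATE_HZ   # ~12.5 Mpixel/s
samples_per_pixel = ADC_RATE_HZ / pixel_rate_hz    # ~2 ADC samples per pixel
pixel_dwell_ns = 1e9 / pixel_rate_hz               # ~80 ns per pixel

print(f"pixel rate: {pixel_rate_hz / 1e6:.1f} Mpixel/s")
print(f"samples per pixel: {samples_per_pixel:.2f}")
print(f"pixel dwell time: {pixel_dwell_ns:.1f} ns")
```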
Abstract:
News blog hot topics are important for information recommendation services and marketing. However, information overload and personalized management make information arrangement more difficult. Moreover, little attention has been paid to what influences the formation and development of blog hot topics. In order to correctly detect news blog hot topics, this paper first analyzes the development of topics from a new perspective based on the W2T (Wisdom Web of Things) methodology; namely, the characteristics of blog users, the context of topic propagation and information granularity are unified to analyze the related problems. Factors such as user behavior patterns, network opinion and opinion leaders are subsequently identified as important for the development of topics. A topic model based on the view of event reports is then constructed. Finally, hot topics are identified by their duration, topic novelty, degree of topic growth and degree of user attention. The experimental results show that the proposed method is feasible and effective.
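A purely hypothetical scoring sketch (not the paper's model) of how the four indicators named above could be combined into a single hot-topic score; the weights and the [0, 1] normalisation are illustrative assumptions:

```python
# Hypothetical hot-topic ranking: weighted combination of the four indicators
# named in the abstract (duration, novelty, growth, user attention).
def hot_topic_score(duration, novelty, growth, attention,
                    weights=(0.2, 0.2, 0.3, 0.3)):
    """All indicators are assumed to be pre-normalised to [0, 1]."""
    return sum(w * v for w, v in zip(weights, (duration, novelty, growth, attention)))

topics = {
    "topic_a": hot_topic_score(0.8, 0.6, 0.9, 0.7),
    "topic_b": hot_topic_score(0.3, 0.2, 0.1, 0.4),
}
print(max(topics, key=topics.get))  # the "hotter" of the two example topics
```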
Abstract:
Static analysis is an approach for checking the source code or compiled code of applications before it is executed. Chess and McGraw state that static analysis promises to identify common coding problems automatically. While manual code review is also a form of static analysis, software tools are used in most cases to perform the checks. Chess and McGraw additionally claim that good static checkers can help to spot and eradicate common security bugs.
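As a small, hypothetical illustration of the kind of "common security bug" such checkers typically flag (this example is not from the cited work), consider SQL built from untrusted input versus the parameterised form that tools usually recommend:

```python
# Illustrative example of a bug class static checkers commonly report.
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, name: str):
    # String-built SQL: typically flagged because it enables SQL injection.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(conn: sqlite3.Connection, name: str):
    # Parameterised query: the usual fix suggested by such tools.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()
```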
Abstract:
We propose CIMD (Collaborative Intrusion and Malware Detection), a scheme for the realization of collaborative intrusion detection approaches. We argue that teams, i.e. detection groups with a common purpose for intrusion detection and response, improve the measures against malware. CIMD provides a collaboration model, decentralized group formation and an anonymous communication scheme. Participating agents can convey intrusion detection related objectives and associated interests for collaboration partners. These interests are based on an intrusion detection related ontology, incorporating network and hardware configurations and detection capabilities. The anonymous communication provided by CIMD allows communication beyond suspicion, i.e. the adversary cannot perform better than guessing an IDS to be the source of a message at random. The evaluation takes place with the help of NeSSi² (www.nessi2.de), the Network Security Simulator, a dedicated environment for the analysis of attacks and countermeasures in mid-scale and large-scale networks. A CIMD prototype is being built based on the JIAC agent framework (www.jiac.de).
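A minimal formal restatement of the "beyond suspicion" property described above, under the assumption of a detection group of N participating agents (the notation is illustrative, not taken from the paper):

```latex
% Beyond suspicion for a group of N participating IDS agents: the true sender
% appears no more likely than any other member, so the adversary's chance of
% correctly attributing a message to its source is no better than random guessing,
\Pr[\text{adversary names the true source}] \le \frac{1}{N}.
```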
Abstract:
Our daily lives become more and more dependent on smartphones due to their increased capabilities. Smartphones are used in various ways, e.g. for payment systems or for assisting the lives of elderly or disabled people. Security threats to these devices become more and more dangerous since there is still a lack of proper security tools for protection. Android emerges as an open smartphone platform which allows modification even at the operating system level and where, for the first time, third-party developers have the opportunity to develop kernel-based low-level security tools. Android quickly gained popularity among smartphone developers and beyond, since it is based on Java on top of an "open" Linux, in contrast to former proprietary platforms with very restrictive SDKs and corresponding APIs. Symbian OS, holding the greatest market share among all smartphone operating systems, even closed critical APIs to ordinary developers and introduced application certification, because this OS was the main target of smartphone malware in the past. In fact, more than 290 malware samples designed for Symbian OS appeared between July 2004 and July 2008. Android, in turn, promises to be completely open source. Together with the Linux-based smartphone OS OpenMoko, open smartphone platforms may attract malware writers to create malicious applications endangering critical smartphone applications and owners' privacy. Since signature-based approaches mainly detect known malware, anomaly-based approaches can be a valuable addition to these systems. They are based on mathematical algorithms processing data that describe the state of a certain device. To gain this data, a monitoring client is needed that extracts usable information (features) from the monitored system. Our approach follows a dual system for analyzing these features. On the one hand, functionality for on-device light-weight detection is provided. On the other hand, since most algorithms are resource-exhaustive, remote feature analysis is provided as well. This dual system enables event-based detection that can react to the current detection need. In our ongoing research we aim to investigate the feasibility of light-weight on-device detection for certain occasions. On other occasions, whenever significant changes are detected on the device, the system can trigger remote detection with heavy-weight algorithms for better detection results. In the absence of the server, or as a supplementary approach, we also consider a collaborative scenario. Here, mobile devices sharing a common objective are enabled by a collaboration module to share information, such as intrusion detection data and results. This is based on an ad-hoc network mode that can be provided by the WiFi or Bluetooth adapter nearly every smartphone possesses.
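A minimal sketch of the dual-detection idea described above, using entirely hypothetical features and thresholds (the authors' monitoring client and detection algorithms are not reproduced here): cheap on-device scoring, with heavy-weight remote analysis triggered only when the local score changes significantly.

```python
# Illustrative sketch of event-based dual detection: lightweight on-device
# anomaly scoring that triggers remote, heavy-weight analysis on demand.
from dataclasses import dataclass

@dataclass
class FeatureVector:
    cpu_load: float           # hypothetical monitored features
    sent_sms_per_hour: float
    net_bytes_out: float

def lightweight_score(fv: FeatureVector, baseline: FeatureVector) -> float:
    """Cheap on-device anomaly score: mean relative deviation from a baseline."""
    def rel(a, b):
        return abs(a - b) / (abs(b) + 1e-9)
    return (rel(fv.cpu_load, baseline.cpu_load)
            + rel(fv.sent_sms_per_hour, baseline.sent_sms_per_hour)
            + rel(fv.net_bytes_out, baseline.net_bytes_out)) / 3

def needs_remote_analysis(score: float, threshold: float = 0.5) -> bool:
    """Trigger resource-exhaustive remote detection only on significant change."""
    return score > threshold

baseline = FeatureVector(cpu_load=0.2, sent_sms_per_hour=1, net_bytes_out=5e5)
current = FeatureVector(cpu_load=0.9, sent_sms_per_hour=40, net_bytes_out=2e7)
score = lightweight_score(current, baseline)
print(score, needs_remote_analysis(score))
```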
Abstract:
Anomaly detection compensates for shortcomings of signature-based detection, such as protecting against zero-day exploits. However, anomaly detection can be resource-intensive and is plagued by a high false-positive rate. In this work, we address these problems by presenting a cooperative intrusion detection approach for the Artificial Immune System (AIS) as an example of an anomaly detection approach. In particular, we show how the cooperative approach reduces the false-positive rate of the detection and how the overall detection process can be organized to account for the resource constraints of the participating devices. Evaluations are carried out with the novel network simulation environment NeSSi as well as formally with an extension to the epidemic spread model SIR.
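For reference, the baseline SIR compartment model that the paper extends is conventionally written as the following system of ordinary differential equations, with susceptible S, infected I, recovered R, infection rate β and recovery rate γ (the paper's extension is not reproduced here):

```latex
\begin{aligned}
\frac{dS}{dt} &= -\beta S I, &
\frac{dI}{dt} &= \beta S I - \gamma I, &
\frac{dR}{dt} &= \gamma I .
\end{aligned}
```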
Abstract:
Characterization of the combustion products released during the burning of commonly used engineering metallic materials may aid in material selection and risk assessment for the design of oxygen systems. Characterizing the combustion products with regard to size distribution and morphology gives useful information for systems addressing fire detection. Aluminum rods (3.2-mm diameter cylinders) were vertically mounted inside a combustion chamber and ignited in pressurized oxygen by resistively heating an aluminum/palladium igniter wire attached to the bottom of the test sample. This paper describes the experimental work conducted to establish the particle size distribution and morphology of the resultant combustion products, which were collected after the burning was completed and subsequently analyzed. In general, the combustion products consisted of a re-solidified oxidized slag and many small hollow spheres ranging from about 500 nm to 1000 µm in diameter, surfaced with quenched dendritic and grain-like structures. The combustion products were characterized using optical and scanning electron microscopy.
Abstract:
The power of testing for a population-wide association between a biallelic quantitative trait locus and a linked biallelic marker locus is predicted both empirically and deterministically for several tests. The tests were based on the analysis of variance (ANOVA) and on a number of transmission disequilibrium tests (TDT). Deterministic power predictions made use of family information, and were functions of population parameters including linkage disequilibrium, allele frequencies, and recombination rate. Deterministic power predictions were very close to the empirical power from simulations in all scenarios considered in this study. The different TDTs had very similar power, intermediate between one-way and nested ANOVAs. One-way ANOVA was the only test that was not robust against spurious disequilibrium. Our general framework for predicting power deterministically can be used to predict power in other association tests. Deterministic power calculations are a powerful tool for researchers to plan and evaluate experiments and obviate the need for elaborate simulation studies.
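As a generic illustration of deterministic power prediction for an ANOVA-type association test (not the paper's specific derivation, which folds linkage disequilibrium, allele frequencies and recombination rate into the noncentrality parameter), power can be computed from the noncentral F distribution:

```python
# Deterministic power of an F-test given a noncentrality parameter (ncp).
from scipy.stats import f, ncf

def anova_power(ncp: float, df_num: int, df_den: int, alpha: float = 0.05) -> float:
    """Power = P(F > F_crit) under the noncentral F with noncentrality ncp."""
    f_crit = f.ppf(1 - alpha, df_num, df_den)        # critical value under H0
    return 1 - ncf.cdf(f_crit, df_num, df_den, ncp)  # tail probability under H1

# Hypothetical example: one-way ANOVA on marker genotype with 2 numerator df,
# 297 denominator df, and an assumed noncentrality of 10.
print(anova_power(ncp=10.0, df_num=2, df_den=297))
```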
Abstract:
This paper presents a method for investigating ship emissions, the plume capture and analysis system (PCAS), and its application in measuring airborne pollutant emission factors (EFs) and particle size distributions. The current investigation was conducted in situ, aboard two dredgers (Amity, a cutter suction dredger, and Brisbane, a hopper suction dredger), but the PCAS is also capable of performing such measurements remotely at a distant point within the plume. EFs were measured relative to the fuel consumption using the fuel-combustion-derived plume CO2. All plume measurements were corrected by subtracting background concentrations sampled regularly from upwind of the stacks. Each measurement typically took 6 minutes to complete, and 40 to 50 measurements were possible in one day. The relationship between the EFs and plume sample dilution was examined to determine the plume dilution range over which the technique could deliver consistent results when measuring EFs for particle number (PN), NOx, SO2 and PM2.5 within a targeted dilution factor range of 50–1000 suitable for remote sampling. The EFs for NOx, SO2 and PM2.5 were found to be independent of dilution for dilution factors within that range. The EF measurement for PN was corrected for coagulation losses by applying a time-dependent particle loss correction to the particle number concentration data. For the Amity, the EF ranges were PN: 2.2–9.6 × 10^15 (kg-fuel)^-1; NOx: 35–72 g(NO2)·(kg-fuel)^-1; SO2: 0.6–1.1 g(SO2)·(kg-fuel)^-1; and PM2.5: 0.7–6.1 g(PM2.5)·(kg-fuel)^-1. For the Brisbane they were PN: 1.0–1.5 × 10^16 (kg-fuel)^-1; NOx: 3.4–8.0 g(NO2)·(kg-fuel)^-1; SO2: 1.3–1.7 g(SO2)·(kg-fuel)^-1; and PM2.5: 1.2–5.6 g(PM2.5)·(kg-fuel)^-1. The results are discussed in terms of the operating conditions of the vessels' engines. Particle number emission factors as a function of size, as well as the count median diameter (CMD) and geometric standard deviation of the size distributions, are provided. The size distributions were found to be consistently unimodal in the range below 500 nm, and this mode was within the accumulation mode range for both vessels. The representative CMDs for the various activities performed by the dredgers ranged from 94 to 131 nm in the case of the Amity, and from 58 to 80 nm for the Brisbane. A strong inverse relationship between CMD and EF(PN) was observed.
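A minimal sketch of the fuel-based (carbon-balance) emission-factor calculation described above, in which the background-subtracted pollutant concentration is ratioed to the background-subtracted plume CO2 and scaled by the CO2 emitted per kilogram of fuel; the assumed diesel-like CO2 yield and the example numbers are illustrative, not from the paper:

```python
# Fuel-based emission factor via the carbon-balance approach (sketch).
CO2_PER_KG_FUEL_G = 3200.0  # assumed g CO2 per kg fuel for a diesel-like fuel

def emission_factor(pollutant_plume, pollutant_bg, co2_plume, co2_bg):
    """EF = (delta pollutant / delta CO2) * g CO2 per kg fuel.

    Concentrations must share a consistent basis, e.g. g/m3 for mass
    pollutants or particles/m3 for particle number.
    """
    d_pollutant = pollutant_plume - pollutant_bg   # background subtraction
    d_co2 = co2_plume - co2_bg
    return (d_pollutant / d_co2) * CO2_PER_KG_FUEL_G

# Example: NOx (as NO2) mass concentration in g/m3 against CO2 in g/m3.
print(emission_factor(0.004, 0.0005, 0.9, 0.7), "g(NO2) per kg fuel")
```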
Abstract:
Fine-grained matrices in carbonaceous chondrites and small, micron-sized inclusions in achondrites can be characterized effectively using high resolution transmission electron microscopy (HRTEM).
Abstract:
Vibration-based damage identification techniques, which use modal data or functions of modal data, have received significant research interest in recent years due to their ability to detect damage in structures and hence contribute towards the safety of those structures. In this context, Strain Energy Based Damage Indices (SEDIs), based on modal strain energy, have been successful in localising damage in structures made of homogeneous materials such as steel. However, their application to reinforced concrete (RC) structures needs further investigation due to the significant difference in the prominent damage type, the flexural crack. The work reported in this paper is an integral part of a comprehensive research program to develop and apply effective strain energy based damage indices to assess damage in reinforced concrete flexural members. This research program established (i) a suitable flexural crack simulation technique, (ii) four improved SEDIs and (iii) programmable sequential steps to minimise the effects of noise. This paper evaluates and ranks the four newly developed SEDIs and seven existing SEDIs for their ability to detect and localise flexural cracks in RC beams. Based on the results of the evaluations, it recommends the SEDIs for use with single and multiple vibration modes.
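For context, one classical modal strain energy damage index (a Stubbs-type formulation) is sketched below for a beam discretised into elements, using mode-shape curvatures; the four improved SEDIs developed in the research program differ in detail and are not reproduced here.

```python
# Stubbs-type modal strain energy damage index for a beam (illustrative sketch).
import numpy as np

def modal_curvature(phi, x):
    """Second spatial derivative (curvature) of a sampled mode shape."""
    return np.gradient(np.gradient(phi, x), x)

def strain_energy_damage_index(phis_healthy, phis_damaged, x, n_elements):
    """One index per element; values noticeably above 1 flag likely damage."""
    dx = x[1] - x[0]                      # assumes uniform sampling
    edges = np.linspace(x[0], x[-1], n_elements + 1)
    num = np.zeros(n_elements)
    den = np.zeros(n_elements)
    for ph, pd in zip(phis_healthy, phis_damaged):
        kh = modal_curvature(ph, x) ** 2  # curvature^2 ~ flexural strain energy
        kd = modal_curvature(pd, x) ** 2
        tot_h, tot_d = kh.sum() * dx, kd.sum() * dx
        for j in range(n_elements):
            m = (x >= edges[j]) & (x <= edges[j + 1])
            num[j] += (kd[m].sum() * dx + tot_d) / tot_d   # damaged fraction
            den[j] += (kh[m].sum() * dx + tot_h) / tot_h   # healthy fraction
    return num / den

# Crude illustration on a simply supported beam, with damage modelled as
# locally amplified curvature near x = 0.3.
x = np.linspace(0.0, 1.0, 201)
phis_h = [np.sin(np.pi * x), np.sin(2 * np.pi * x)]
bump = 1.0 + 0.2 * np.exp(-((x - 0.3) / 0.02) ** 2)
phis_d = [p * bump for p in phis_h]
print(strain_energy_damage_index(phis_h, phis_d, x, n_elements=10))
```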
Abstract:
The "standard" procedure for calibrating the Vesuvio eV neutron spectrometer at the ISIS neutron source, forming the basis for data analysis over at least the last decade, was recently documented in considerable detail by the instrument’s scientists. Additionally, we recently derived analytic expressions of the sensitivity of recoil peak positions with respect to fight-path parameters and presented neutron–proton scattering results that together called in to question the validity of the "standard" calibration. These investigations should contribute significantly to the assessment of the experimental results obtained with Vesuvio. Here we present new results of neutron–deuteron scattering from D2 in the backscattering angular range (theata > 90 degrees) which are accompanied by a striking energy increase that violates the Impulse Approximation, thus leading unequivocally the following dilemma: (A) either the "standard" calibration is correct and then the experimental results represent a novel quantum dynamical effect of D which stands in blatant contradiction of conventional theoretical expectations; (B) or the present "standard" calibration procedure is seriously deficient and leads to artificial outcomes. For Case(A), we allude to the topic of attosecond quantumdynamical phenomena and our recent neutron scattering experiments from H2 molecules. For Case(B),some suggestions as to how the "standard" calibration could be considerably improved are made.
Abstract:
Filamentary single crystals, blades, sheets, euhedral crystals and powders may form by vapor phase condensation, depending on the supersaturation conditions in the vapor with respect to the condensing species [1]. Filamentary crystal growth requires the operation of an axial screw dislocation [2]. A Vapor-Liquid-Solid (VLS) mechanism may also produce filamentary single crystals, ribbons and blades; the latter two morphologies are typically twinned. Crystals grown by this mechanism do not require the presence of an axial screw dislocation. Impurities may either promote or inhibit crystal growth [3]. The VLS mechanism allows crystals to grow at small supersaturations of the vapor. Thin enstatite blades, ribbons and sheets have been observed in chondritic porous Interplanetary Dust Particles (IDPs) [4, 5]. The screw dislocation required for vapor phase condensation [1] has been observed in these enstatite blades [4]. Bradley et al. [4] suggest that these crystals are primary vapor phase condensates which could have formed either in the solar nebula or in presolar environments. These observations [4, 5] are significant in that they may provide a demonstrable link to theoretical predictions, viz. that in the primordial solar nebula filamentary condensates could cluster into 'lint balls' and form the predecessors of comets [6].
Abstract:
A recent NASA program to collect stratospheric dust particles using high-flying WB57 aircraft has made available many more potential candidates for the study of extraterrestrial materials. This preliminary report provides an interpretation of the types of particles returned from one flag (W7017), collected in August 1981, using a subset of 81 allocated particles. This particular collection period is after the Mt. St. Helens eruptions; therefore, the flag may contain significant quantities of volcanic debris in addition to the expected terrestrial contaminants [1]. All particles were mounted on Nuclepore filters and have been examined using a modified JEOL 100CX analytical electron microscope. For most of the particles, X-ray energy dispersive spectra and images were obtained at 40 kV on samples which had not received any conductive coating. However, in order to improve resolution (to ~30 Å), some images were recorded at 100 kV. In addition, 16 samples were coated with a thin layer (<50 Å) of Au/Pd.