954 results for scintillation detectors


Relevance:

10.00%

Publisher:

Abstract:

Automated remote ultrasound detectors allow large amounts of data on bat presence and activity to be collected. Processing such data involves identifying bat species from their echolocation calls. Automated species identification has the potential to provide more consistent, predictable, and higher levels of accuracy than identification by humans. In contrast, identification by humans permits flexibility and intelligence in identification, as well as the incorporation of features and patterns that may be difficult to quantify. We compared humans with artificial neural networks (ANNs) in their ability to classify short recordings of bat echolocation calls of variable signal-to-noise ratios; such sequences are typical of those obtained from the remote automated recording systems often used in large-scale ecological studies. We presented 45 recordings (1–4 calls each) produced by known species of bats to ANNs and to 26 human participants with 1 month to 23 years of experience in acoustic identification of bats. Humans correctly classified 86% of recordings to genus and 56% to species; ANNs correctly identified 92% and 62%, respectively. There was no significant difference between the performance of ANNs and that of humans, but ANNs performed better than about 75% of humans. There was little relationship between the experience of the human participants and their classification rate, although humans with <1 year of experience performed worse than others. Currently, identification of bat echolocation calls by humans is suitable for ecological research, after careful consideration of biases. However, improvements to ANNs and the data on which they are trained may in future raise their performance beyond that demonstrated by humans.
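
The classification task above can be sketched in miniature. The following is not the paper's ANN but a nearest-centroid toy classifier over two hypothetical call features (peak frequency in kHz, call duration in ms); the species names and feature values are illustrative assumptions, not data from the study.

```python
import math

# Illustrative training data: (peak frequency kHz, call duration ms) per species.
# Values are invented for the sketch, not taken from the paper.
TRAINING = {
    "Pipistrellus pipistrellus": [(46.0, 5.5), (45.0, 6.0), (47.0, 5.0)],
    "Myotis daubentonii":        [(55.0, 3.0), (57.0, 2.5), (53.0, 3.5)],
}

def centroid(points):
    """Mean feature vector of a species' training calls."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

CENTROIDS = {sp: centroid(pts) for sp, pts in TRAINING.items()}

def classify(call):
    """Assign a recording's feature vector to the nearest species centroid."""
    return min(CENTROIDS, key=lambda sp: math.dist(call, CENTROIDS[sp]))

def accuracy(labelled_calls):
    """Fraction of recordings classified correctly, as reported in the study."""
    correct = sum(classify(call) == species for call, species in labelled_calls)
    return correct / len(labelled_calls)
```

The reported 86%/56% (humans) and 92%/62% (ANNs) figures are exactly this `accuracy` statistic, computed at genus and species level respectively.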

Corner detection is important in many computer vision tasks. However, in real-world applications, image noise strongly affects the performance of corner detectors. Few corner detectors have so far been designed to be robust to heavy noise, partly because noise can be reduced by a denoising procedure. In this paper, we present a corner detector that can find discriminative corners in images contaminated by noise of different levels, without any denoising procedure. Candidate corners (i.e., features) are first detected by a modified SUSAN approach, and false corners in noise are then rejected based on their local characteristics. Features in flat regions are removed based on their intensity centroid, and features on edge structures are removed using the Harris response. The detector is self-adaptive to noise: the image signal-to-noise ratio (SNR) is estimated automatically to choose an appropriate threshold for refining features. Experimental results show that our detector locates discriminative corners in images with strong noise better than other widely used corner or keypoint detectors.
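
The edge-rejection step can be illustrated with the standard Harris response. This is a minimal sketch, not the paper's implementation: the window size and the Harris constant k are assumed values, and gradients are plain central differences on a grayscale image stored as a list of rows.

```python
def harris_response(img, x, y, win=1, k=0.04):
    """Harris corner response at pixel (x, y): build the local structure
    tensor from central-difference gradients, then det(M) - k * trace(M)^2.
    Edge-like points give a negative response; corners give a positive one."""
    sxx = sxy = syy = 0.0
    for j in range(y - win, y + win + 1):
        for i in range(x - win, x + win + 1):
            ix = (img[j][i + 1] - img[j][i - 1]) / 2.0
            iy = (img[j + 1][i] - img[j - 1][i]) / 2.0
            sxx += ix * ix
            sxy += ix * iy
            syy += iy * iy
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace * trace

def reject_edges(img, candidates, threshold=0.0):
    """Keep only candidate corners whose Harris response exceeds the threshold
    (the paper adapts such a threshold to the estimated SNR)."""
    return [(x, y) for x, y in candidates if harris_response(img, x, y) > threshold]
```

On a synthetic bright-quadrant image, the quadrant's corner survives this filter while a point on the straight edge is rejected.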

Affect is an important feature of multimedia content and conveys valuable information for multimedia indexing and retrieval. Most existing studies of affective content analysis are limited to low-level features or mid-level representations, and are generally criticized for their inability to bridge the gap between low-level features and high-level human affective perception. The facial expressions of subjects in images carry important semantic information that can substantially influence human affective perception, but they have seldom been investigated for affective classification of facial images in practical applications. This paper presents an automatic image emotion detector (IED) for affective classification of practical (non-laboratory) data using facial expressions, where many real-world challenges are present, including pose, illumination, and size variations. The proposed method is novel, with a framework designed specifically to overcome these challenges using multi-view versions of face and fiducial point detectors and a combination of point-based texture and geometry features. Performance comparisons across several key parameters of the relevant algorithms are conducted to find the optimum parameters for high accuracy and fast computation. A comprehensive set of experiments with existing and new datasets shows that the method is robust to pose variations, fast, appropriate for large-scale data, and as accurate as the method with state-of-the-art performance on laboratory-based data. The proposed method was also applied to affective classification of images from the British Broadcasting Corporation (BBC) in a task typical of a practical application, providing some valuable insights.
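
One way to read the "multi-view detectors" idea is as late fusion: run one detector per head pose and keep the most confident hit. The sketch below assumes that framing; the view names, the (confidence, bbox) detector output shape, and the 0.5 acceptance threshold are all illustrative assumptions, not the paper's design.

```python
def fuse_multiview(detections):
    """Late fusion over per-pose face detectors.
    detections: {view_name: (confidence, bbox)} for the views that fired.
    Returns the best view's detection, or None if nothing clears the
    (assumed) acceptance threshold."""
    best = max(detections.items(), key=lambda kv: kv[1][0], default=None)
    if best is None or best[1][0] < 0.5:   # assumed acceptance threshold
        return None
    view, (confidence, bbox) = best
    return {"view": view, "confidence": confidence, "bbox": bbox}
```

The fused detection would then feed the fiducial-point stage, from which the texture and geometry features are extracted.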

In recent years, rapid advances in information technology have led to various data collection systems that are enriching the sources of empirical data for use in transport systems. Currently, traffic data are collected through various sensors, including loop detectors, probe vehicles, cell phones, Bluetooth, video cameras, remote sensing and public transport smart cards. It has been argued that combining the complementary information from multiple sources will generally result in better accuracy, increased robustness and reduced ambiguity. Although there have been substantial advances in data assimilation techniques to reconstruct and predict the traffic state from multiple data sources, such methods are generally data-driven and do not fully exploit the power of traffic models. Furthermore, the existing methods are still limited to freeway networks and are not yet applicable in the urban context because of the greater complexity of the flow behavior. The main traffic phenomena on urban links are generally caused by the boundary conditions at intersections, un-signalized or signalized, where the switching of the traffic lights and the turning maneuvers of road users lead to shock waves that propagate upstream of the intersections. This paper develops a new model-based methodology to build a real-time traffic prediction model for arterial corridors using data from multiple sources, particularly loop detectors and partial observations from Bluetooth and GPS devices.
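
The claimed benefit of combining complementary sources can be made concrete with the simplest possible fusion rule. This is not the paper's model-based method, just an inverse-variance weighted average, a common baseline for merging sensor estimates (e.g. a loop-detector speed and a Bluetooth travel-time-derived speed) with different noise levels.

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion.
    estimates: list of (value, variance) pairs, one per sensor.
    Returns (fused value, fused variance); the fused variance is always
    smaller than any input variance, which is the 'better accuracy' claim."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total
```

Fusing a 40 km/h loop estimate with a 50 km/h Bluetooth estimate of equal variance gives 45 km/h with half the variance of either source alone.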

Origin-Destination matrix (ODM) estimation can benefit from the availability of sample trajectories, which recent technologies make it possible to measure. This paper focuses on transport networks where traffic counts are measured by magnetic loops and sample trajectories are available. An example of such a network is the city of Brisbane, where Bluetooth detectors are now operating. This additional data source is used to extend classical ODM estimation to a link-specific ODM (LODM), using a convex optimisation formulation that also incorporates network constraints. The proposed algorithm is assessed on a simulated network.
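
The core fitting problem can be sketched on a toy network. This is not the paper's formulation: the two-link, two-OD-pair incidence matrix and the counts are made up, and a full convex solver is replaced by projected gradient descent on the least-squares residual with a non-negativity constraint, the simplest member of that family.

```python
# Toy network: A[i][j] = 1 if OD pair j traverses loop-detector link i.
A = [[1.0, 0.0],   # link 1 carries OD pair 1 only
     [1.0, 1.0]]   # link 2 carries both OD pairs
c = [30.0, 50.0]   # observed loop counts per link

def estimate_od(A, c, steps=2000, lr=0.1):
    """Recover non-negative OD flows x with A x ~= c by projected gradient
    descent on the squared residual; projection enforces x >= 0."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(steps):
        resid = [sum(A[i][j] * x[j] for j in range(n)) - c[i] for i in range(m)]
        grad = [sum(A[i][j] * resid[i] for i in range(m)) for j in range(n)]
        x = [max(0.0, x[j] - lr * grad[j]) for j in range(n)]  # project onto x >= 0
    return x
```

On this toy network the counts are explained exactly by flows of 30 and 20; the trajectory data in the paper adds further (link-specific) constraints of the same linear form.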

Even though crashes between trains and road users are rare events at railway level crossings, they are one of the major safety concerns for the Australian railway industry. Near-miss events at level crossings occur more frequently and can provide more information about the factors leading to level crossing incidents. In this paper we introduce a video analytic approach for automatically detecting and localizing vehicles from cameras mounted on trains, with the aim of detecting near-miss events. To detect and localize vehicles at level crossings, we extract patches from an image and classify each patch. We developed a region proposal algorithm for generating patches, and we use a Convolutional Neural Network (CNN) for classifying each patch. To localize vehicles in images, we combine the patches classified as vehicles according to their CNN scores and positions. We compared our system with the Deformable Part Models (DPM) and Regions with CNN features (R-CNN) object detectors. Experimental results on a railway dataset show that the recall rate of our proposed system is 29% higher than what can be achieved with the DPM or R-CNN detectors.
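
The "combine patches by score and position" step is, in essence, what greedy non-maximum suppression does. The sketch below assumes that reading; the IoU threshold of 0.5 is a conventional value, not one taken from the paper.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def merge_patches(scored_boxes, iou_thresh=0.5):
    """Greedy non-maximum suppression: walk patches in descending CNN score
    and keep a patch only if it does not overlap an already-kept patch.
    scored_boxes: list of (score, box); returns kept (score, box), best first."""
    kept = []
    for score, box in sorted(scored_boxes, reverse=True):
        if all(iou(box, k) < iou_thresh for _, k in kept):
            kept.append((score, box))
    return kept
```

Two heavily overlapping vehicle patches collapse to the higher-scoring one, while a patch elsewhere in the frame survives as a second detection.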

Recent decreases in costs, and improvements in performance, of silicon array detectors open a range of potential applications of relevance to plant physiologists, associated with spectral analysis in the visible and short-wave near infra-red (far-red) spectrum. The performance characteristics of three commercially available ‘miniature’ spectrometers based on silicon array detectors operating in the 650–1050-nm spectral region (MMS1 from Zeiss, S2000 from Ocean Optics, and FICS from Oriel, operated with a Larry detector) were compared with respect to the application of non-invasive prediction of the sugar content of fruit using near infra-red spectroscopy (NIRS). The FICS–Larry gave the best wavelength resolution; however, the narrow slit and small pixel size of the charge-coupled device detector resulted in very low sensitivity, and this instrumentation was not considered further. Wavelength resolution was poor with the MMS1 relative to the S2000 (e.g. full width at half maximum of the 912 nm Hg peak: 13 and 2 nm for the MMS1 and S2000, respectively), but the large pixel height of the array used in the MMS1 gave it sensitivity comparable to the S2000. The ratio of signal to signal standard error of the spectra was greater by an order of magnitude with the MMS1, relative to the S2000, at both near-saturation and low light levels. Calibrations were developed using reflectance spectra of filter paper soaked in a range of concentrations (0–20% w/v) of sucrose, using a modified partial least squares procedure. Calibrations developed with the MMS1 were superior to those developed using the S2000 (e.g. coefficient of correlation of 0.90 and 0.62, and standard error of cross-validation of 1.9 and 5.4%, respectively), indicating the importance of a high signal-to-noise ratio over wavelength resolution for calibration accuracy.
The design of a bench top assembly using the MMS1 for the non-invasive assessment of mesocarp sugar content of (intact) melon fruit is reported in terms of light source and angle between detector and light source, and optimisation of math treatment (derivative condition and smoothing function).
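
The wavelength-resolution figures quoted above are full widths at half maximum (FWHM). As a minimal illustration (not the instrument software), the following estimates FWHM from a sampled emission peak by linearly interpolating the two half-maximum crossings; the Gaussian test peak used below is synthetic, not real Hg-lamp data.

```python
import math

def fwhm(wavelengths, counts):
    """Full width at half maximum of a single sampled peak, with linear
    interpolation at the rising and falling half-maximum crossings."""
    half = max(counts) / 2.0
    left = right = None
    for i in range(1, len(counts)):
        if counts[i - 1] < half <= counts[i] and left is None:
            # rising crossing: interpolate between samples i-1 and i
            t = (half - counts[i - 1]) / (counts[i] - counts[i - 1])
            left = wavelengths[i - 1] + t * (wavelengths[i] - wavelengths[i - 1])
        if counts[i - 1] >= half > counts[i]:
            # falling crossing (the last one found is kept)
            t = (counts[i - 1] - half) / (counts[i - 1] - counts[i])
            right = wavelengths[i - 1] + t * (wavelengths[i] - wavelengths[i - 1])
    return right - left
```

Applied to a sampled Gaussian line of known width, the estimator recovers the nominal FWHM, which is how the 13 nm (MMS1) versus 2 nm (S2000) comparison would be read off a recorded Hg spectrum.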

Spectral data were collected of intact and ground kernels using 3 instruments (using Si-PbS, Si, and InGaAs detectors), operating over different areas of the spectrum (between 400 and 2500 nm) and employing transmittance, interactance, and reflectance sample presentation strategies. Kernels were assessed on the basis of oil and water content, and with respect to the defect categories of insect damage, rancidity, discoloration, mould growth, germination, and decomposition. Predictive model performance statistics for oil content models were acceptable on all instruments (R2 > 0.98; RMSECV < 2.5%, which is similar to reference analysis error), although that for the instrument employing reflectance optics was inferior to models developed for the instruments employing transmission optics. The spectral positions for calibration coefficients were consistent with absorbance due to the third overtones of CH2 stretching. Calibration models for moisture content in ground samples were acceptable on all instruments (R2 > 0.97; RMSECV < 0.2%), whereas calibration models for intact kernels were relatively poor. Calibration coefficients were more highly weighted around 1360, 740 and 840 nm, consistent with absorbance due to overtones of O-H stretching and combination. Intact kernels with brown centres or rancidity could be discriminated from each other and from sound kernels using principal component analysis. Part kernels affected by insect damage, discoloration, mould growth, germination, and decomposition could be discriminated from sound kernels. However, discrimination among these defect categories was not distinct and could not be validated on an independent set. It is concluded that there is good potential for a low cost Si photodiode array instrument to be employed to identify some quality defects of intact macadamia kernels and to quantify oil and moisture content of kernels in the process laboratory and for oil content in-line. 
Further work is required to examine the robustness of predictive models across different populations, including growing districts, cultivars and times of harvest.
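
The discrimination of defect categories above rests on principal component analysis of the spectra. As a self-contained sketch (not the study's chemometrics), the first principal component can be found by power iteration on the covariance matrix; the two-band "spectra" below are synthetic stand-ins for sound and defective kernels.

```python
def first_pc(data, iters=200):
    """First principal component of row-vector data via power iteration.
    Returns (unit loading vector, per-sample scores on that component)."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centred = [[row[j] - means[j] for j in range(d)] for row in data]
    cov = [[sum(r[a] * r[b] for r in centred) / (n - 1) for b in range(d)]
           for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]                 # renormalise each iteration
    scores = [sum(r[j] * v[j] for j in range(d)) for r in centred]
    return v, scores
```

If sound and defective kernels differ systematically in absorbance, their scores separate along this first component, which is the kind of clustering the thesis reports for brown-centre and rancid kernels.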

The availability and quality of irrigation water has become an issue limiting productivity in many Australian vegetable regions. Production is also under competitive pressure from supply chain forces. Producers look to new technologies, including changing irrigation infrastructure, exploring new water sources, and more complex irrigation management, to survive these stresses. Often there is little objective information on which improvements could deliver better outcomes for vegetable producers and external communities (e.g. meeting NRM targets). This has led to investment in inappropriate technologies, and costly repetition of errors, as businesses independently discover the worth of technologies through personal experience. In our project, we investigated technology improvements for vegetable irrigation. Through engagement with industry and other researchers, we identified the technologies most applicable to growers, particularly those that addressed priority issues. We developed analytical tools for ‘what if’ scenario testing of technologies. We conducted nine detailed experiments in the Lockyer Valley and Riverina vegetable growing districts, as well as case studies on grower properties in southern Queensland. We investigated root zone monitoring tools (FullStop™ wetting front detectors and Soil Solution Extraction Tubes - SSET), drip system layout, fertigation equipment, and altered planting arrangements. Our project team developed and validated models for broccoli, sweet corn, green beans and lettuce, and spreadsheets for evaluating the economic risks associated with new technologies. We presented project outcomes at over 100 extension events, including irrigation showcases, conferences, field days, farm walks and workshops. The FullStops™ were excellent for monitoring root zone conditions (EC, nitrate levels) and managing irrigation with poor-quality water. They were easier to interpret than the SSET.
The SSET were simpler to install, but required wet soil to be reliable. SSET were an option for monitoring deeper soil zones that are unsuitable for FullStop™ installations. Because these root zone tools require expertise and are labour intensive, we recommend they be used to address specific problems, or as a periodic auditing strategy, not for routine monitoring. In our research, we routinely found high residual N in horticultural soils, with consequently little crop yield response to additional nitrogen fertiliser. With improved irrigation efficiency (and less leaching), it may be timely to re-examine nitrogen budgets and recommendations for vegetable crops. Where the drip irrigation tube was located close to the crop row (i.e. within 5-8 cm), management of irrigation was easier. It improved nitrogen uptake and water use efficiency, and reduced the risk of poor crop performance through moisture stress, particularly in the early crop establishment phases. Close proximity of the drip tube to the crop row gives the producer more options for managing salty water, and more flexibility in taking risks with forecast rain. In many vegetable crops, proximate drip systems may not be cost-effective. The next best alternative is to move crop rows closer to the drip tube (leading to an asymmetric row structure). The vegetable crop models are good at predicting crop phenology (development stages, time to harvest), input use (water, fertiliser), environmental impacts (nutrient, salt movement) and total yields. The two immediate applications for the models are understanding, predicting and manipulating harvest dates and nitrogen movements in vegetable cropping systems. From the economic tools, the major influences on accumulated profit are price and yield. In doing ‘what if’ analyses, it is very important to be as accurate as possible in ascertaining the assumed yield and price ranges. In most vegetable production systems, lowering the required inputs (e.g.
irrigation requirement, fertiliser requirement) is unlikely to have a major influence on accumulated profit. However, if a resource is constraining (e.g. available irrigation water), it is usually most profitable to maximise return per unit of that resource.
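
The closing point, maximising return per unit of the constraining resource, can be shown with a two-crop toy comparison. The crop names and figures below are invented for illustration, not project data.

```python
# Illustrative gross margins: when water is the constraint, rank crops by
# return per megalitre of water, not return per hectare.
crops = {
    # crop: (gross margin $/ha, water use ML/ha)
    "broccoli":   (8000.0, 4.0),   # $2000 per ML
    "sweet corn": (5000.0, 2.0),   # $2500 per ML
}

def best_per_hectare(crops):
    """Best crop when land is the binding constraint."""
    return max(crops, key=lambda c: crops[c][0])

def best_per_megalitre(crops):
    """Best crop when irrigation water is the binding constraint."""
    return max(crops, key=lambda c: crops[c][0] / crops[c][1])
```

Here broccoli wins per hectare but sweet corn wins per megalitre, which is exactly why the two rankings must not be conflated when water is scarce.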

ALICE (A Large Ion Collider Experiment) is an experiment at CERN (European Organization for Nuclear Research) in which a heavy-ion detector is dedicated to exploiting the unique physics potential of nucleus-nucleus interactions at LHC (Large Hadron Collider) energies. As part of that project, 716 so-called type V4 modules were assembled in the Detector Laboratory of the Helsinki Institute of Physics during the years 2004–2006. Altogether, over a million detector strips have made this the most massive particle detector project in the science history of Finland. One ALICE SSD module consists of a double-sided silicon sensor, two hybrids containing 12 HAL25 front-end readout chips, and some passive components, such as resistors and capacitors. The components are connected together by TAB (Tape Automated Bonding) microcables. The components of the modules were tested in every assembly phase with comparable electrical tests to ensure the reliable functioning of the detectors and to pinpoint possible problems. Components were accepted or rejected according to limits confirmed by the ALICE collaboration. This study concentrates on the test results of framed chips, hybrids and modules. The total yield of the framed chips is 90.8%, of hybrids 96.1% and of modules 86.2%. The individual test results have been investigated in the light of the known error sources that appeared during the project. After the problems arising during the learning curve of the project were solved, material problems, such as defective chip cables and sensors, caused most of the assembly rejections. These problems were typically seen in tests as too many individual channel failures. In contrast, bonding failures rarely caused the rejection of any component. One sensor type among the three sensor manufacturers proved to have lower quality than the others: the sensors of this manufacturer are very noisy, and their depletion voltages are usually outside the specification given to the manufacturers. Reaching a 95% assembly yield during module production demonstrates that the assembly process has been highly successful.

VHF nighttime scintillations, recorded during a high solar activity period at a meridian chain of stations covering a magnetic latitude belt of 3°–21°N (420 km subionospheric points), are analyzed to investigate the influence of equatorial spread F irregularities on the occurrence of scintillation at latitudes away from the equator. Observations show that saturated amplitude scintillations start abruptly about one and a half hours after ground sunset and their onset is almost simultaneous at stations whose subionospheric points are within 12°N latitude of the magnetic equator, but is delayed by 15 min to 4 hours at a station whose subionospheric point is at 21°N magnetic latitude. In addition, the occurrence of postsunset scintillations at all the stations is found to be conditional on their prior occurrence at the equatorial station: if no postsunset scintillation activity is seen at the equatorial station, no scintillations are seen at the other stations either. The occurrence of scintillations is explained as caused by rising plasma bubbles and associated irregularities over the magnetic equator and the subsequent mapping of these irregularities down the magnetic field lines to the F region of higher latitudes through some instantaneous mechanism; hence, an equatorial control is established on the generation of postsunset scintillation-producing irregularities in the entire low-latitude belt.

The research reported in this thesis dealt with single crystals of thallium bromide grown for gamma-ray detector applications. The crystals were used to fabricate room-temperature gamma-ray detectors. Routinely produced TlBr detectors are often of poor quality. Therefore, this study concentrated on developing the manufacturing processes for TlBr detectors and methods of characterisation that can be used to optimise TlBr purity and crystal quality. The processes concerned were TlBr raw material purification, crystal growth, annealing and detector fabrication. The study focused on single crystals of TlBr grown from material purified by a hydrothermal recrystallisation method. In addition, hydrothermal conditions for synthesis, recrystallisation, crystal growth and annealing of TlBr crystals were examined. The final manufacturing process presented in this thesis starts from TlBr material purified by the Bridgman method. The material is then hydrothermally recrystallised in pure water. A travelling molten zone (TMZ) method is used for additional purification of the recrystallised product and then for the final crystal growth. Subsequent processing is similar to that described in the literature. In this thesis, the literature on improving the quality of TlBr material and crystals and on detector performance is reviewed. Aging aspects, as well as the influence of different factors (temperature, time, electrode material and so on) on detector stability, are considered and examined. The results of the process development are summarised and discussed. This thesis shows a considerable improvement in the charge carrier properties of a detector due to additional purification by hydrothermal recrystallisation. As an example, a thick (4 mm) TlBr detector produced by the process was fabricated and found to operate successfully in gamma-ray detection, confirming the validity of the proposed purification and technological steps. However, for complete improvement of detector performance, further developments in crystal growth are required. The detector manufacturing process was optimised through characterisation of material and crystals using methods such as X-ray diffraction (XRD), polarisation microscopy, high-resolution inductively coupled plasma mass spectrometry (HR-ICPMS), Fourier transform infrared (FTIR) and ultraviolet-visible (UV-Vis) spectroscopy, field emission scanning electron microscopy (FESEM) with energy-dispersive X-ray spectroscopy (EDS), current-voltage (I-V) and capacitance-voltage (C-V) characterisation, and photoconductivity measurements, as well as direct detector examination.

The dissertation deals with remote narrowband measurements of the electromagnetic radiation emitted by lightning flashes. A lightning flash consists of a number of sub-processes. The return stroke, which transfers electrical charge from the thundercloud to the ground, is electromagnetically an impulsive wideband process; that is, it emits radiation at most frequencies in the electromagnetic spectrum, but its duration is only some tens of microseconds. Before and after the return stroke, multiple sub-processes redistribute electrical charges within the thundercloud. These sub-processes can last for tens to hundreds of milliseconds, many orders of magnitude longer than the return stroke. Each sub-process causes radiation with specific time-domain characteristics, having maxima at different frequencies. Thus, if the radiation is measured at a single narrow frequency band, it is difficult to identify the sub-processes, and some sub-processes can be missed altogether. However, narrowband detectors are simple to design and miniaturize. In particular, near the High Frequency (HF) band (3 MHz to 30 MHz), ordinary shortwave radios can, in principle, be used as detectors. This dissertation utilizes a prototype detector which is essentially a handheld AM radio receiver. Measurements were made in Scandinavia, and several independent data sources were used to identify lightning sub-processes, as well as the distance to each individual flash. It is shown that multiple sub-processes radiate strongly near the HF band. The return stroke usually radiates intensely, but it cannot be reliably identified from the time-domain signal alone. This means that a narrowband measurement is best used to characterize the energy of the radiation integrated over the whole flash, without attempting to identify individual processes. The dissertation analyzes the conditions under which this integrated energy can be used to estimate the distance to the flash.
It is shown that flash-by-flash variations are large, but the integrated energy is very sensitive to changes in the distance, dropping approximately as the inverse cube of the distance. Flashes can, in principle, be detected at distances of more than 100 km, but since the ground conductivity can vary, ranging accuracy drops dramatically at distances larger than 20 km. These limitations mean that individual flashes cannot be ranged accurately using a single narrowband detector, and the useful range is limited to 30 kilometers at most. Nevertheless, simple statistical corrections are developed, which enable an accurate estimate of the distance to the closest edge of an active storm cell, as well as its approach speed. The results of the dissertation could therefore have practical applications in real-time short-range lightning detection and warning systems.
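
The ranging idea can be written down as a generic power-law inversion. This is only a sketch of the principle, not the dissertation's calibrated method: the reference energy e0, reference distance r0 and exponent n are assumed calibration constants, and, as the abstract stresses, flash-to-flash scatter means single-flash estimates must be aggregated statistically.

```python
def range_from_energy(energy, e0=1.0, r0=1.0, n=3.0):
    """Invert an assumed power-law fall-off E = e0 * (r0 / r)**n
    to get a distance estimate r from a measured integrated energy."""
    return r0 * (e0 / energy) ** (1.0 / n)

def cell_edge_estimate(energies, **cal):
    """Crude stand-in for the statistical correction: the strongest flashes
    of a storm cell are taken as coming from its nearest edge."""
    return min(range_from_energy(e, **cal) for e in energies)
```

With the default constants, an energy reading 8 times weaker than the reference corresponds to a doubled distance, illustrating how steeply the signal falls off with range.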

Boron neutron capture therapy (BNCT) is a form of chemically targeted radiotherapy that utilises the high neutron capture cross-section of the boron-10 isotope to achieve a preferential dose increase in the tumour. BNCT dosimetry poses a special challenge, as the radiation dose absorbed by the irradiated tissues consists of several different dose components. Dosimetry is important because the effect of the radiation on the tissue correlates with the radiation dose. Consistent and reliable radiation dose delivery and dosimetry are thus basic requirements for radiotherapy. The international recommendations are not directly applicable to BNCT dosimetry. The existing dosimetry guidance for BNCT provides recommendations but also calls for the investigation of complementary methods for comparison and improved accuracy. In this thesis, quality assurance and stability measurements of the neutron beam monitors used in dose delivery are presented. The beam monitors were found not to be affected by the presence of a phantom in the beam, and the effect of the reactor core power distribution was less than 1%. The weekly stability test with activation detectors has generally been reproducible within the recommended tolerance of 2%. An established toolkit for determining the dose components in epithermal neutron beams is presented and applied in an international dosimetric intercomparison. The quantities measured by the participating groups (neutron flux, fast neutron dose and photon dose) were generally in agreement within the stated uncertainties. However, the uncertainties were large, ranging from 3% to 30% (1 standard deviation), emphasising the importance of dosimetric intercomparisons if clinical data are to be compared between different centers. Measurements with the Exradin type 2M ionisation chamber have been repeated in the epithermal neutron beam in the same measurement configuration over the course of 10 years.
The presented results exclude severe sensitivity changes to thermal neutrons that have been reported for this type of chamber. Microdosimetry and polymer gel dosimetry as complementary methods for epithermal neutron beam dosimetry are studied. For microdosimetry the comparison of results with ionisation chambers and computer simulation showed that the photon dose measured with microdosimetry was lower than with the two other methods. The disagreement was within the uncertainties. For neutron dose the simulation and microdosimetry results agreed within 10% while the ionisation chamber technique gave 10-30% lower neutron dose rates than the two other methods. The response of the BANG-3 gel was found to be linear for both photon and epithermal neutron beam irradiation. The dose distribution normalised to dose maximum measured by MAGIC polymer gel was found to agree well with the simulated result near the dose maximum while the spatial difference between measured and simulated 30% isodose line was more than 1 cm. In both the BANG-3 and MAGIC gel studies, the interpretation of the results was complicated by the presence of high-LET radiation.
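
The weekly stability test described above reduces to a simple tolerance comparison. The function below is a minimal sketch of that logic, using the 2% tolerance quoted in the abstract; the reading and reference units are whatever the activation detectors report.

```python
def stability_check(reading, reference, tolerance=0.02):
    """Return True if the activation-detector reading is within the
    recommended fractional tolerance (2% by default) of its reference value."""
    deviation = abs(reading - reference) / reference
    return deviation <= tolerance
```

A reading 1% above reference passes; one 5% above reference would trigger investigation before clinical dose delivery.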

Solar flares were first observed, by eye and in white light, by Richard Carrington in England in 1859. Since then these eruptions in the solar corona have intrigued scientists. It is known that flares influence the space weather experienced by the planets in a multitude of ways, for example by causing aurora borealis. Understanding flares is central to human survival in space, as astronauts cannot survive high doses of the energetic particles associated with large flares without contracting serious radiation sickness, unless they shield themselves effectively during space missions. Flares may have been central to survival in the past as well: it has been suggested that giant flares might have played a role in exterminating many of the large species on Earth, including the dinosaurs. On the other hand, prebiotic synthesis studies have shown lightning to be a decisive requirement for amino acid synthesis on the primordial Earth, and increased lightning activity could be attributed to space weather and flares. This thesis studies flares in two domains: the spectral and the spatial. We extracted solar spectra for the same flares using three different instruments, namely GOES (Geostationary Operational Environmental Satellite), RHESSI (Reuven Ramaty High Energy Solar Spectroscopic Imager) and XSM (X-ray Solar Monitor). The GOES spectra are low resolution, obtained with a gas proportional counter; the RHESSI spectra are higher resolution, obtained with germanium detectors; and the XSM spectra are very high resolution, observed with a silicon detector. It turns out that the detector technology and response substantially influence the spectra we see, and are important for understanding what conclusions to draw from the data. With imaging data there was no such luxury of choice available; we used RHESSI imaging data to observe the spatial size of solar flares. In the present work the focus was primarily on current solar flares. However, we did make use of our improved understanding of solar flares to observe young suns in NGC 2547. The same techniques used with solar monitors were applied with XMM-Newton, a stellar X-ray monitor, and coupled with ground-based Hα observations these techniques yielded estimates of flare parameters in young suns. The material in this thesis is therefore structured from technology to application, covering the full processing path from raw data and detector responses to concrete physical parameter results, such as the first measurement of the length of plasma flare loops in young suns.