535 results for Dem gross error detection
Abstract:
Despite a central role in angiosperm reproduction, few gametophyte-specific genes and promoters have been isolated, particularly for the inaccessible female gametophyte (embryo sac). Using the Ds-based enhancer-detector line ET253, we have cloned an egg apparatus-specific enhancer (EASE) from Arabidopsis (Arabidopsis thaliana). The genomic region flanking the Ds insertion site was further analyzed by examining its capability to control gusA and GFP reporter gene expression in the embryo sac in a transgenic context. Through analysis of a 5' and 3' deletion series in transgenic Arabidopsis, the sequence responsible for egg apparatus-specific expression was delineated to 77 bp. Our data showed that this enhancer is unique in the Arabidopsis genome, is conserved among different accessions, and shows an unusual pattern of sequence variation. This EASE works independently of position and orientation in Arabidopsis but is probably not associated with any nearby gene, suggesting either that it acts over a large distance or that a cryptic element was detected. Embryo-specific ablation in Arabidopsis was achieved by transactivation of a diphtheria toxin gene under the control of the EASE. The potential application of the EASE element and similar control elements as part of an open-source biotechnology toolkit for apomixis is discussed.
Abstract:
The task addressed in this thesis is the automatic alignment of an ensemble of misaligned images in an unsupervised manner. This application is especially useful in computer vision applications where annotations of the shape of an object of interest present in a collection of images are required. Performing this task manually is a slow, tedious, expensive and error-prone process which hinders the progress of research laboratories and businesses. Most recently, the unsupervised removal of geometric variation present in a collection of images has been referred to as congealing, based on the seminal work of Learned-Miller [21]. The only assumption made in congealing is that the parametric nature of the misalignment is known a priori (e.g. translation, similarity, affine, etc.) and that the object of interest is guaranteed to be present in each image. The capability to congeal an ensemble of misaligned images stemming from the same object class has numerous applications in object recognition, detection and tracking. This thesis concerns itself with the construction of a congealing algorithm, titled least-squares congealing, which is inspired by the well-known image-to-image alignment algorithm developed by Lucas and Kanade [24]. The algorithm is shown to have superior performance characteristics when compared to previously established methods: canonical congealing by Learned-Miller [21] and stochastic congealing by Zöllei [39].
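As a hedged illustration of the congealing idea described above (not the thesis' least-squares congealing implementation), the sketch below assumes translation-only misalignment and 1-D signals: each signal is iteratively shifted to minimise its squared difference from the current ensemble mean. All names are hypothetical.

```python
import numpy as np

def congeal_translations(signals, max_shift=10, n_iters=20):
    """Translation-only congealing of 1-D signals (illustrative sketch).

    Each iteration re-estimates an integer shift per signal that minimises
    the sum of squared differences to the current ensemble mean.
    """
    signals = [np.asarray(s, dtype=float) for s in signals]
    shifts = np.zeros(len(signals), dtype=int)
    for _ in range(n_iters):
        aligned = [np.roll(s, -d) for s, d in zip(signals, shifts)]
        mean = np.mean(aligned, axis=0)
        for i, s in enumerate(signals):
            costs = [np.sum((np.roll(s, -d) - mean) ** 2)
                     for d in range(-max_shift, max_shift + 1)]
            shifts[i] = np.argmin(costs) - max_shift
    return shifts

# Toy usage: three shifted copies of the same bump
t = np.linspace(-3, 3, 200)
base = np.exp(-t ** 2)
ensemble = [np.roll(base, d) for d in (-5, 0, 7)]
print(congeal_translations(ensemble))  # relative shifts recovered up to a common offset
```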
Abstract:
This thesis addresses the problem of detecting and describing the same scene points in different wide-angle images taken by the same camera at different viewpoints. This is a core competency of many vision-based localisation tasks including visual odometry and visual place recognition. Wide-angle cameras have a large field of view that can exceed a full hemisphere, and the images they produce contain severe radial distortion. When compared to traditional narrow-field-of-view perspective cameras, more accurate estimates of camera egomotion can be found using the images obtained with wide-angle cameras. The ability to accurately estimate camera egomotion is a fundamental primitive of visual odometry, and this is one of the reasons for the increased popularity of wide-angle cameras for this task. Their large field of view also enables them to capture images of the same regions in a scene taken at very different viewpoints, and this makes them well suited to visual place recognition. However, the ability to estimate the camera egomotion and recognise the same scene in two different images depends on the ability to reliably detect and describe the same scene points, or ‘keypoints’, in the images. Most algorithms used for this purpose are designed almost exclusively for perspective images. Applying algorithms designed for perspective images directly to wide-angle images is problematic, as no account is made for the image distortion. The primary contribution of this thesis is the development of two novel keypoint detectors, and a method of keypoint description, designed for wide-angle images. Both reformulate the Scale-Invariant Feature Transform (SIFT) as an image processing operation on the sphere. As the image captured by any central-projection wide-angle camera can be mapped to the sphere, applying these variants to an image on the sphere enables keypoints to be detected in a manner that is invariant to image distortion. Each of the variants is required to find the scale-space representation of an image on the sphere, and they differ in the approaches they use to do this. Extensive experiments using real and synthetically generated wide-angle images are used to validate the two new keypoint detectors and the method of keypoint description. The better of the two new keypoint detectors is applied to vision-based localisation tasks including visual odometry and visual place recognition using outdoor wide-angle image sequences. As part of this work, the effect of keypoint coordinate selection on the accuracy of egomotion estimates using the Direct Linear Transform (DLT) is investigated, and a simple weighting scheme is proposed which attempts to account for the uncertainty of keypoint positions during detection. A word reliability metric is also developed for use within a visual ‘bag of words’ approach to place recognition.
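A minimal sketch of the sphere-mapping step that such an approach relies on, assuming an equiangular (equidistant) fisheye model with a hypothetical focal scale `f` and principal point; this is only one of the possible central-projection models and not necessarily the one used in the thesis:

```python
import numpy as np

def fisheye_pixels_to_sphere(uv, cx, cy, f):
    """Map pixel coordinates of an equiangular fisheye image to unit-sphere points.

    uv : (N, 2) array of pixel coordinates.
    cx, cy : principal point; f : pixels-per-radian scale (assumed known from
    a prior calibration). Returns an (N, 3) array of unit vectors on the sphere.
    """
    du = uv[:, 0] - cx
    dv = uv[:, 1] - cy
    r = np.hypot(du, dv)         # radial distance from the principal point
    theta = r / f                # equiangular model: angle proportional to radius
    phi = np.arctan2(dv, du)     # azimuth around the optical axis
    return np.column_stack((np.sin(theta) * np.cos(phi),
                            np.sin(theta) * np.sin(phi),
                            np.cos(theta)))

# Example: two pixels from a hypothetical 1024x1024 fisheye image
pts = fisheye_pixels_to_sphere(np.array([[512.0, 512.0], [700.0, 512.0]]),
                               cx=512.0, cy=512.0, f=326.0)
print(pts)  # the first point maps to (0, 0, 1), i.e. the optical axis
```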
Abstract:
In this study, the host specificity and sensitivity of human- and bovine-specific adenoviruses (HS-AVs and BS-AVs) were evaluated by testing wastewater/fecal samples from various animal species in Southeast Queensland, Australia. The overall specificity and sensitivity of the HS-AVs marker were 1.0 and 0.78, respectively. These figures for the BS-AVs marker were 1.0 and 0.73, respectively. Twenty environmental water samples were collected during wet conditions and 20 samples were collected during dry conditions from the Maroochy Coastal River and tested for the presence of fecal indicator bacteria (FIB), host-specific viral markers, and zoonotic bacterial and protozoan pathogens using PCR/qPCR. The concentrations of FIB in water samples collected after wet conditions were generally higher than during dry conditions. HS-AVs were detected in 20% of water samples collected during wet conditions, whereas BS-AVs were detected in both wet (i.e., 10%) and dry (i.e., 10%) conditions. The C. jejuni mapA and Salmonella invA genes were each detected in 10% of samples collected during dry conditions. The concentrations of Salmonella invA ranged from 3.5 × 10² to 4.3 × 10² genomic copies per 500 mL of water. The G. lamblia β-giardin gene was detected in only one sample (5%), collected during dry conditions. Weak or significant correlations were observed between FIB and the viral markers and zoonotic pathogens. However, during dry conditions, no significant correlations were observed between FIB concentrations and the viral markers and zoonotic pathogens. The prevalence of HS-AVs in samples collected from the study river suggests that the quality of the water is affected by human as well as bovine fecal pollution. The results suggest that HS-AVs and BS-AVs detection using PCR could be a useful tool for the identification of human-sourced fecal pollution in coastal waters.
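For readers unfamiliar with how marker specificity and sensitivity figures such as 1.0 and 0.78 are derived, a minimal sketch is shown below. The counts are purely hypothetical, chosen only to illustrate the arithmetic; the actual sample numbers are not given in the abstract.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for a host-specific marker: detected in 14 of 18
# target-host samples and in none of 30 non-target samples.
sens, spec = sensitivity_specificity(tp=14, fn=4, tn=30, fp=0)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # ~0.78 and 1.00
```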
Abstract:
BACKGROUND: The presence of insects in stored grains is a significant problem for grain farmers, bulk grain handlers and distributors worldwide. Inspections of bulk grain commodities are essential to detect pests and therefore to reduce the risk of their presence in exported goods. It has been well documented that insect pests cluster in response to factors such as microclimatic conditions within bulk grain. Statistical sampling methodologies for grains, however, have typically considered pests and pathogens to be homogeneously distributed throughout grain commodities. In this paper we demonstrate a sampling methodology that accounts for the heterogeneous distribution of insects in bulk grains. RESULTS: We show that failure to account for the heterogeneous distribution of pests may lead to overestimates of the capacity for a sampling program to detect insects in bulk grains. Our results indicate the importance of the proportion of grain that is infested in addition to the density of pests within the infested grain. We also demonstrate that the probability of detecting pests in bulk grains increases as the number of sub-samples increases, even when the total volume or mass of grain sampled remains constant. CONCLUSION: This study demonstrates the importance of considering an appropriate biological model when developing sampling methodologies for insect pests. Accounting for a heterogeneous distribution of pests leads to a considerable improvement in the detection of pests over traditional sampling models.
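The two quantitative claims above can be illustrated with a simple probability sketch that contrasts a clustered (heterogeneous) infestation model against the traditional homogeneous assumption. The parameter values are hypothetical and the Poisson-within-patches model is only a stand-in for the paper's biological model.

```python
import numpy as np

def detection_prob_clustered(p_infested, density, subsample_vol, n_subsamples):
    """P(detect) when only a proportion of the grain is infested
    (Poisson counts within infested grain, zero elsewhere)."""
    p_one = p_infested * (1.0 - np.exp(-density * subsample_vol))
    return 1.0 - (1.0 - p_one) ** n_subsamples

def detection_prob_homogeneous(p_infested, density, subsample_vol, n_subsamples):
    """P(detect) if the same total number of insects were spread evenly."""
    mean_density = p_infested * density
    return 1.0 - np.exp(-mean_density * subsample_vol * n_subsamples)

# Hypothetical figures: 5% of the grain infested at 2 insects per litre,
# a fixed 10 L total sample split into n sub-samples.
for n in (1, 5, 20):
    vol = 10.0 / n
    print(n,
          round(detection_prob_clustered(0.05, 2.0, vol, n), 3),
          round(detection_prob_homogeneous(0.05, 2.0, vol, n), 3))
# The homogeneous model overstates detection, and splitting the same total
# volume into more sub-samples raises the detection probability under clustering.
```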
Abstract:
The potential to sequester atmospheric carbon in agricultural and forest soils to offset greenhouse gas emissions has generated interest in measuring changes in soil carbon resulting from changes in land management. However, inherent spatial variability of soil carbon limits the precision of measurement of changes in soil carbon and hence the ability to detect changes. We analyzed variability of soil carbon by intensively sampling sites under different land management as a step toward developing efficient soil sampling designs. Sites were tilled cropland and a mixed deciduous forest in Tennessee, and old-growth and second-growth coniferous forest in western Washington, USA. Six soil cores within each of three microplots were taken as an initial sample and an additional six cores were taken to simulate resampling. Soil C variability was greater in Washington than in Tennessee, and greater in less disturbed than in more disturbed sites. Using this protocol, our data suggest that differences on the order of 2.0 Mg C ha⁻¹ could be detected by collection and analysis of cores from at least five (tilled) or two (forest) microplots in Tennessee. Greater spatial variability at the forested sites in Washington increased the minimum detectable difference, but these systems, consisting of low C content sandy soil with irregularly distributed pockets of organic C in buried logs, are likely to rank among the most spatially heterogeneous of systems. Our results clearly indicate that consistent intramicroplot differences at all sites will enable detection of much more modest changes if the same microplots are resampled.
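A rough sketch of how a minimum detectable difference relates to sampling variability and the number of microplots, using a standard paired-design approximation (not necessarily the statistical procedure used in the study); the standard deviation value is hypothetical and chosen only for illustration.

```python
import numpy as np
from scipy.stats import t

def minimum_detectable_difference(sd_diff, n_microplots, alpha=0.05, power=0.80):
    """Approximate MDD for a paired design in which the same microplots are
    resampled: MDD = (t_{1-alpha/2, df} + t_{power, df}) * sd_diff / sqrt(n)."""
    df = n_microplots - 1
    t_alpha = t.ppf(1 - alpha / 2, df)
    t_beta = t.ppf(power, df)
    return (t_alpha + t_beta) * sd_diff / np.sqrt(n_microplots)

# Hypothetical standard deviation of paired microplot C differences (Mg C per ha)
print(minimum_detectable_difference(sd_diff=1.2, n_microplots=5))  # roughly 2 Mg C per ha
```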
Abstract:
Field studies show that the internal screens in a gross pollutant trap (GPT) are often clogged with organic matter, due to infrequent cleaning. The hydrodynamic performance of a GPT with fully blocked screens was comprehensively investigated under a typical range of onsite operating conditions. Using an acoustic Doppler velocimeter (ADV), velocity profiles across three critical sections of the GPT were measured and integrated to examine the net fluid flow at each section. The data revealed that when the screens are fully blocked, the flow structure within the GPT radically changes. Consequently, the capture/retention performance of the device rapidly deteriorates. Good agreement was achieved between the experimental and the previous 2D computational fluid dynamics (CFD) velocity profiles for the lower GPT inlet flow conditions.
Abstract:
An algorithm based on the concept of combining Kalman filter and Least Error Square (LES) techniques is proposed in this paper. The algorithm is intended to estimate signal attributes such as amplitude, frequency and phase angle in online mode. This technique can be used in protection relays, digital AVRs, DGs, DSTATCOMs, FACTS devices and other power electronics applications. The Kalman filter is modified to operate on a fictitious input signal and provides precise estimation results that are insensitive to noise and other disturbances. At the same time, the LES system is arranged to operate in critical transient cases to compensate for the delay and inaccuracy arising from the response of the standard Kalman filter. Practical considerations such as the effect of noise, higher-order harmonics, and computational issues of the algorithm are considered and tested in the paper. Several computer simulations and a laboratory test are presented to highlight the usefulness of the proposed method. Simulation results show that the proposed technique can simultaneously estimate the signal attributes, even when the signal is highly distorted due to the presence of non-linear loads and noise.
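To make the estimation idea concrete, the sketch below tracks the amplitude and phase of a noisy sinusoid with a plain linear Kalman filter whose state is the in-phase/quadrature pair. This is a generic textbook formulation with the frequency assumed known, not the paper's combined Kalman/LES algorithm, and all parameter values are illustrative.

```python
import numpy as np

def kalman_sinusoid(y, omega, dt, q=1e-5, r=0.1):
    """Track [a, b] in y_t ~ a*cos(omega*t) + b*sin(omega*t) + noise.
    Amplitude = sqrt(a^2 + b^2), phase = atan2(-b, a)."""
    x = np.zeros(2)                 # state [a, b]
    P = np.eye(2)
    Q = q * np.eye(2)               # small process noise lets the estimate adapt
    amps, phases = [], []
    for k, yk in enumerate(y):
        tk = k * dt
        H = np.array([[np.cos(omega * tk), np.sin(omega * tk)]])
        P = P + Q                                   # predict (static state, F = I)
        S = H @ P @ H.T + r                         # innovation covariance
        K = (P @ H.T) / S                           # Kalman gain
        x = x + (K * (yk - H @ x)).ravel()          # update state
        P = (np.eye(2) - K @ H) @ P                 # update covariance
        amps.append(np.hypot(x[0], x[1]))
        phases.append(np.arctan2(-x[1], x[0]))
    return np.array(amps), np.array(phases)

# Noisy 50 Hz test signal with amplitude 1.5 and phase 0.3 rad
fs, f0 = 1000.0, 50.0
tt = np.arange(0, 0.2, 1 / fs)
y = 1.5 * np.cos(2 * np.pi * f0 * tt + 0.3) + 0.05 * np.random.randn(tt.size)
amps, phases = kalman_sinusoid(y, 2 * np.pi * f0, 1 / fs)
print(amps[-1], phases[-1])   # approaches 1.5 and 0.3
```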
Abstract:
A technique was developed to investigate the capture/retention characteristics of a gross pollutant trap (GPT) with fully and partially blocked internal screens. Custom-modified spheres of variable density filled with liquid were released into the GPT inlet and monitored at the outlet. The outlet data show that the capture/retention performance of a GPT with fully blocked screens deteriorates rapidly. During higher flow rates, screen blockages below 68% approach maximum efficiency. At lower flow rates, the high-performance trend is reversed and the variation in behaviour of pollutants with different densities becomes more noticeable. Additional experiments with a second GPT, configured with an upstream inlet, showed improved capture/retention performance. It was also noted that the bypass allows incoming pollutants to escape when the GPT is blocked. This useful feature prevents upstream blockages between cleaning intervals.
Abstract:
The QUT-NOISE-TIMIT corpus consists of 600 hours of noisy speech sequences designed to enable a thorough evaluation of voice activity detection (VAD) algorithms across a wide variety of common background noise scenarios. In order to construct the final mixed-speech database, over 10 hours of background noise was collected across 10 unique locations covering 5 common noise scenarios, to create the QUT-NOISE corpus. This background noise corpus was then mixed with speech events chosen from the TIMIT clean speech corpus over a wide variety of noise lengths, signal-to-noise ratios (SNRs) and active speech proportions to form the mixed-speech QUT-NOISE-TIMIT corpus. An evaluation of five baseline VAD systems on the QUT-NOISE-TIMIT corpus is conducted to validate the data and show that the variety of noise available allows for a better evaluation of VAD systems than existing approaches in the literature.
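The core mixing step, scaling noise so the speech-to-noise power ratio hits a target SNR, can be sketched as below. This is a generic illustration with synthetic signals standing in for TIMIT speech and QUT-NOISE noise, not the corpus's actual mixing code, and it omits details such as active-speech-proportion handling.

```python
import numpy as np

def mix_at_snr(speech, noise, snr_db):
    """Scale `noise` so the speech-to-noise power ratio equals `snr_db`,
    then add it to `speech`. Assumes 1-D arrays at the same sample rate
    and that `noise` is at least as long as `speech`."""
    noise = noise[:len(speech)]
    p_speech = np.mean(speech ** 2)
    p_noise = np.mean(noise ** 2)
    scale = np.sqrt(p_speech / (p_noise * 10.0 ** (snr_db / 10.0)))
    return speech + scale * noise

# Toy usage with synthetic stand-ins for clean speech and background noise
rng = np.random.default_rng(0)
speech = np.sin(2 * np.pi * 220 * np.arange(16000) / 16000)
noise = rng.standard_normal(16000)
mixed = mix_at_snr(speech, noise, snr_db=5)
print(10 * np.log10(np.mean(speech**2) / np.mean((mixed - speech)**2)))  # ~5 dB
```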
Abstract:
Robustness of the track allocation problem is rarely addressed in the literature, and the obtained track allocation schemes (TAS) contain bottlenecks. An approach to detect these bottlenecks is therefore needed to support local optimization. First, a TAS is transformed into an executable Petri net model. Then, disturbance analysis is performed using the model, and indicators of the total train departure delays are collected to detect bottlenecks as each train in turn suffers a disturbance. Finally, tests based on a rail hub linking six lines and a TAS spanning about thirty minutes show that the minimum buffer time is 21 seconds and that there are two bottlenecks where the buffer times are 57 and 44 seconds, respectively, indicating that bottlenecks are not necessarily located where the buffer time is minimal. The proposed approach can further support the selection among multiple schemes and robustness optimization.
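The bottleneck-detection idea can be illustrated with a much-simplified chain model in place of the Petri net: a disturbance to one train propagates to followers only to the extent that it exceeds the intervening buffer times, and total departure delay is summed. The buffer values below mix figures mentioned in the abstract with hypothetical ones and the model is only a sketch of the concept.

```python
def total_departure_delay(buffers, disturbed_idx, disturbance):
    """Propagate a delay through a chain of trains separated by buffer times.

    buffers[i] is the buffer (seconds) between train i-1 and train i.
    A delayed train passes on only the part of its delay that exceeds the
    next buffer. Returns the sum of departure delays over all trains.
    """
    delays = [0.0] * len(buffers)
    delays[disturbed_idx] = disturbance
    for i in range(disturbed_idx + 1, len(buffers)):
        delays[i] = max(0.0, delays[i - 1] - buffers[i])
    return sum(delays)

# Hypothetical buffer times (seconds) between successive trains through a hub
buffers = [0, 120, 57, 44, 21, 180, 90]
for i in range(len(buffers)):
    print(i, total_departure_delay(buffers, i, disturbance=60.0))
# The position whose disturbance yields the largest total delay marks a bottleneck;
# note it need not coincide with the minimum buffer time (21 s here).
```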
Abstract:
Fibre Bragg Grating (FBG) sensors have been installed along an existing line for the purposes of train detection and weight measurement. The results show fair accuracy and high resolution in measuring the vertical force acting on the track as the train wheels roll over the sensors. Since the sensors are already in place and data are available, further applications beyond train detection are explored. This study presents an analysis of the unique signatures in the collected data to characterise wheel-rail interaction for rail defect detection. The focus of this first stage of work is on the repeatability of signals from the same wheel-rail interactions while the rail is in a healthy state. The preliminary results, the resulting feasibility of this condition monitoring application, and the technical issues to be addressed in practice are discussed.
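One simple way to quantify the repeatability of such signatures, shown here only as a hedged sketch and not as the study's actual analysis, is the peak normalised cross-correlation between two sensor responses, searched over a small range of lags to allow for timing offsets. The synthetic signals below stand in for real FBG wheel-pass data.

```python
import numpy as np

def repeatability_score(sig_a, sig_b, max_lag=50):
    """Peak normalised cross-correlation between two sensor responses.
    Values near 1 indicate highly repeatable signatures."""
    a = (sig_a - sig_a.mean()) / sig_a.std()
    b = (sig_b - sig_b.mean()) / sig_b.std()
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = a[lag:], b[:len(b) - lag]
        else:
            x, y = a[:len(a) + lag], b[-lag:]
        n = min(len(x), len(y))
        best = max(best, float(np.mean(x[:n] * y[:n])))
    return best

# Two synthetic 'wheel passes' over the same healthy rail section, slightly offset in time
tt = np.linspace(0, 1, 500)
pass1 = np.exp(-((tt - 0.5) / 0.05) ** 2) + 0.02 * np.random.randn(tt.size)
pass2 = np.roll(np.exp(-((tt - 0.5) / 0.05) ** 2), 10) + 0.02 * np.random.randn(tt.size)
print(repeatability_score(pass1, pass2))  # close to 1 for repeatable signatures
```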
Abstract:
Purpose. To investigate the effect of various presbyopic vision corrections on nighttime driving performance on a closed-road driving circuit. Methods. Participants were 11 presbyopes (mean age, 57.3 ± 5.8 years), with a mean best sphere distance refractive error of R+0.23±1.53 DS and L+0.20±1.50 DS, whose only experience of wearing presbyopic vision correction was reading spectacles. The study involved a repeated-measures design by which a participant's nighttime driving performance was assessed on a closed-road circuit while wearing each of four power-matched vision corrections. These included single-vision distance lenses (SV), progressive-addition spectacle lenses (PAL), monovision contact lenses (MV), and multifocal contact lenses (MTF CL) worn in a randomized order. Measures included low-contrast road hazard detection and avoidance, road sign and near target recognition, lane-keeping, driving time, and legibility distance for street signs. Eye movement data (fixation duration and number of fixations) were also recorded. Results. Street sign legibility distances were shorter when wearing MV and MTF CL than SV and PAL (P < 0.001), and participants drove more slowly with MTF CL than with PALs (P = 0.048). Wearing SV resulted in more errors (P < 0.001) and in more (P = 0.002) and longer (P < 0.001) fixations when responding to near targets. Fixation duration was also longer when viewing distant signs with MTF CL than with PAL (P = 0.031). Conclusions. Presbyopic vision corrections worn by naive, unadapted wearers affected nighttime driving. Overall, spectacle corrections (PAL and SV) performed well for distance driving tasks, but SV negatively affected viewing near dashboard targets. MTF CL resulted in the shortest legibility distance for street signs and longer fixation times.
Abstract:
Secret-sharing schemes describe methods to securely share a secret among a group of participants. A properly constructed secret-sharing scheme guarantees that the share belonging to one participant does not reveal anything about the shares of others or even the secret itself. Besides being used to distribute a secret, secret-sharing schemes have also been used in secure multi-party computation and in redundant residue number systems for error correction codes. In this paper, we propose that the secret-sharing scheme be used as a primitive in a Network-based Intrusion Detection System (NIDS) to detect attacks in encrypted networks. Encrypted networks such as Virtual Private Networks (VPNs) fully encrypt network traffic, which can include both malicious and non-malicious traffic. Traditional NIDS cannot monitor such encrypted traffic. We therefore describe how our work uses a combination of Shamir's secret-sharing scheme and randomised network proxies to enable a traditional NIDS to function normally in a VPN environment.
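A minimal textbook sketch of Shamir's (k, n) secret-sharing primitive referenced above, shown only to illustrate the split/reconstruct mechanics rather than the paper's NIDS integration; a real deployment would use a cryptographically secure random source and appropriate field sizes.

```python
import random

PRIME = 2 ** 127 - 1  # a Mersenne prime large enough for illustrative secrets

def make_shares(secret, threshold, n_shares, prime=PRIME):
    """Split `secret` into n_shares points on a random degree-(threshold-1)
    polynomial over GF(prime); any `threshold` shares reconstruct the secret."""
    coeffs = [secret] + [random.randrange(prime) for _ in range(threshold - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):      # Horner evaluation of the polynomial
            acc = (acc * x + c) % prime
        return acc
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares, prime=PRIME):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret

shares = make_shares(secret=123456789, threshold=3, n_shares=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover 123456789
```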