857 results for Detection and segmentation


Relevance: 90.00%

Abstract:

Progress in crop improvement is limited by the ability to identify favourable combinations of genotypes (G) and management practices (M) in relevant target environments (E), given the resources available to search among the myriad of possible combinations. To underpin yield advance we require prediction of phenotype based on genotype. In plant breeding, traditional phenotypic selection methods have involved measuring the phenotypic performance of large segregating populations in multi-environment trials and applying rigorous statistical procedures based on quantitative genetic theory to identify superior individuals. Recent developments in the ability to inexpensively and densely map/sequence genomes have facilitated a shift from the level of the individual (genotype) to the level of the genomic region. Molecular breeding strategies using genome-wide prediction and genomic selection approaches have developed rapidly. However, their applicability to complex traits remains constrained by gene-gene and gene-environment interactions, which restrict the predictive power of associations of genomic regions with phenotypic responses. Here it is argued that crop ecophysiology and functional whole-plant modelling can provide an effective link between the molecular and organism scales and enhance molecular breeding by adding value to genetic prediction approaches. A physiological framework that facilitates dissection and modelling of complex traits can inform phenotyping methods for marker/gene detection and underpin prediction of the likely phenotypic consequences of trait and genetic variation in target environments. This approach holds considerable promise for more effectively linking genotype to phenotype for complex adaptive traits. Specific examples focused on drought adaptation are presented to highlight the concepts.
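As a point of reference for the genome-wide prediction idea mentioned above, the sketch below fits ridge-regression marker effects (an RR-BLUP-like model) to simulated data and predicts breeding values for new genotypes. The marker coding, shrinkage value and population sizes are illustrative assumptions, not the authors' framework.

```python
# Minimal sketch of genome-wide prediction via ridge regression (RR-BLUP-like).
# All sizes, codings and the shrinkage parameter are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_lines, n_markers = 200, 1000                                   # training lines x SNP markers (assumed)
X = rng.choice([-1.0, 0.0, 1.0], size=(n_lines, n_markers))      # coded genotypes
true_effects = rng.normal(0, 0.05, n_markers)                    # simulated marker effects
y = X @ true_effects + rng.normal(0, 1.0, n_lines)               # phenotype = genetic signal + noise

lam = 10.0                                                       # ridge shrinkage (assumed)
beta = np.linalg.solve(X.T @ X + lam * np.eye(n_markers), X.T @ y)

X_new = rng.choice([-1.0, 0.0, 1.0], size=(5, n_markers))        # candidate genotypes
print("predicted breeding values (toy):", X_new @ beta)
```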

Relevance: 90.00%

Abstract:

Recent investigations into plant tissues have indicated that the free form of the natural polyphenolic antioxidant ellagic acid (EA) is much more plentiful than first envisaged; consequently, a re-assessment of solvent systems for the extraction of this water-insoluble form is needed. As EA solubility and its UV-Vis spectrum, commonly used for detection and quantification, are both governed by pH, an understanding of this dependence is vital if accurate EA measurements are to be achieved. After evaluating the pH effects on the solubility and UV-Vis spectra of commercial EA, an extraction protocol was devised that promoted similar pH conditions for both standard solutions and plant tissue extracts. The extraction so devised, followed by HPLC with photodiode-array detection (DAD), provided a simple, sensitive and validated methodology for determining free EA in a variety of plant extracts. 100% methanol and a triethanolamine-based mixture were the best choices of standard dissolving solvent, and these higher pH-generating solvents were also more efficient in extracting EA from the plants tested, with the final choice allied to the plants' natural acidity. Two of the native Australian plants, anise myrtle (Syzygium anisatum) and Kakadu plum (Terminalia ferdinandiana), exhibited high concentrations of free EA. Furthermore, the dual approach to measuring EA UV-Vis spectra made it possible to assess the effect of acidified eluent on EA spectra when the DAD was employed.

Relevance: 90.00%

Abstract:

This paper presents 'vSpeak', the first initiative taken in Pakistan for ICT-enabled conversion of dynamic Sign Urdu gestures into natural language sentences. To realize this, vSpeak has adopted a novel approach to feature extraction using edge detection and image compression, which provides the input to the Artificial Neural Network that recognizes the gesture. This technique also caters for blurred images. Training and testing are currently being performed on a dataset of 200 patterns of 20 words from Sign Urdu, with a target accuracy of 90% and above.
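The sketch below illustrates the general shape of such a pipeline: edge detection, then downsampling as a simple form of image compression, feeding a small neural-network classifier. It assumes generic Sobel edges, 8x8 block averaging, scikit-learn's MLPClassifier and random toy data; it is not the vSpeak implementation.

```python
# Illustrative feature pipeline: edge detection -> block averaging ("compression") -> ANN.
# Frame size, block size and classifier settings are assumptions, not vSpeak's.
import numpy as np
from scipy.ndimage import sobel
from sklearn.neural_network import MLPClassifier

def gesture_features(frame, block=8):
    """Edge magnitude via Sobel, then block-averaged to compress the image to a fixed-length vector."""
    edges = np.hypot(sobel(frame, axis=0), sobel(frame, axis=1))
    h, w = edges.shape
    pooled = edges[:h - h % block, :w - w % block]
    pooled = pooled.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    return pooled.ravel()

# Toy training data: 200 random 64x64 frames labelled with 20 gesture classes.
rng = np.random.default_rng(1)
X = np.array([gesture_features(rng.random((64, 64))) for _ in range(200)])
y = rng.integers(0, 20, size=200)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300).fit(X, y)
print("training accuracy on toy data:", clf.score(X, y))
```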

Relevance: 90.00%

Abstract:

Bioremediation, which is the exploitation of the intrinsic ability of environmental microbes to degrade and remove harmful compounds from nature, is considered to be an environmentally sustainable and cost-effective means for environmental clean-up. However, a comprehensive understanding of the biodegradation potential of microbial communities and their response to decontamination measures is required for the effective management of bioremediation processes. In this thesis, the potential to use hydrocarbon-degradative genes as indicators of aerobic hydrocarbon biodegradation was investigated. Small-scale functional gene macro- and microarrays targeting aliphatic, monoaromatic and low molecular weight polyaromatic hydrocarbon biodegradation were developed in order to simultaneously monitor the biodegradation of mixtures of hydrocarbons. The validity of the array analysis in monitoring hydrocarbon biodegradation was evaluated in microcosm studies and field-scale bioremediation processes by comparing the hybridization signal intensities to hydrocarbon mineralization, real-time polymerase chain reaction (PCR), dot blot hybridization and both chemical and microbiological monitoring data. The results obtained by real-time PCR, dot blot hybridization and gene array analysis were in good agreement with hydrocarbon biodegradation in laboratory-scale microcosms. Mineralization of several hydrocarbons could be monitored simultaneously using gene array analysis. In the field-scale bioremediation processes, the detection and enumeration of hydrocarbon-degradative genes provided important additional information for process optimization and design. In creosote-contaminated groundwater, gene array analysis demonstrated that the aerobic biodegradation potential that was present at the site, but restrained under the oxygen-limited conditions, could be successfully stimulated with aeration and nutrient infiltration. During ex situ bioremediation of diesel oil- and lubrication oil-contaminated soil, the functional gene array analysis revealed inefficient hydrocarbon biodegradation, caused by poor aeration during composting. The functional gene array specifically detected upper and lower biodegradation pathways required for complete mineralization of hydrocarbons. Bacteria representing 1 % of the microbial community could be detected without prior PCR amplification. Molecular biological monitoring methods based on functional genes provide powerful tools for the development of more efficient remediation processes. The parallel detection of several functional genes using functional gene array analysis is an especially promising tool for monitoring the biodegradation of mixtures of hydrocarbons.

Relevance: 90.00%

Abstract:

Detection and prevention of global navigation satellite system (GNSS) "spoofing" attacks, or the broadcast of false GNSS services, has recently attracted much research interest. This survey aims to fill three gaps in the literature: first, to assess in detail the exact nature of the threat scenarios posed by spoofing against the most commonly cited targets; second, to investigate the many practical impediments, often underplayed, to carrying out GNSS spoofing attacks in the field; and third, to survey and assess the effectiveness of a wide range of proposed defences against GNSS spoofing. Our conclusion lists promising areas of future research.
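One broad class of defences covered in such surveys is consistency checking of receiver observables. The sketch below, under an assumed speed threshold and a hypothetical local-coordinate track, flags physically implausible jumps in the position solution; it is illustrative only and is not one of the specific defences assessed in the survey.

```python
# Toy consistency check: flag epochs whose implied speed exceeds a plausibility bound.
# Threshold, sampling interval and the example track are illustrative assumptions.
import numpy as np

def implausible_jump(positions, dt=1.0, max_speed=60.0):
    """Return a boolean array marking steps whose implied speed (m/s) is implausible."""
    positions = np.asarray(positions, dtype=float)      # N x 2 local ENU coordinates (metres)
    speeds = np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt
    return speeds > max_speed

track = [(0, 0), (10, 0), (21, 1), (33, 2), (800, 400), (812, 405)]   # metres
print(implausible_jump(track))   # the jump to (800, 400) is flagged as a possible spoof
```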

Relevance: 90.00%

Abstract:

Along with useful microorganisms, there are some that cause potential damage to animals and plants. Detecting and identifying these harmful organisms in a cost- and time-effective way is a challenge for researchers. The future of detection methods for microorganisms will be guided by biosensors, which have already contributed enormously to sensing and detection technology. Here, we aim to review the use of various biosensors, developed by integrating biological components with the physicochemical/mechanical properties of transducers, which can have enormous implications in healthcare, food, agriculture and biodefence. We also highlight ways to improve the functioning of biosensors.

Relevance: 90.00%

Abstract:

Background: Expenditure on dental and oral health services in Australia is $3.4 billion (AUD) annually. This is the sixth highest health cost and accounts for 7% of total national health expenditure. Approximately 49% of Australian children aged 6 years have caries experience in their deciduous teeth, and this is rising. The aetiology of dental caries involves a complex interplay of individual, behavioural, social, economic, political and environmental conditions, and there is increasing interest in genetic predisposition and epigenetic modification. Methods: The Oral Health Sub-study, a cross-sectional study of a birth cohort, began in November 2012 by examining mothers and their children, who were six years old at the time of initiation of this ongoing study. Data from detailed questionnaires completed by families from birth onwards, and data on mothers' knowledge, attitudes and practices towards oral health collected at the time of clinical examination, are used. Subjects' height, weight and mid-waist circumference are measured and Body Mass Index (BMI) is computed, using an electronic bio-impedance balance. Dental caries experience is scored using the International Caries Detection and Assessment System (ICDAS). Saliva is collected for physiological measures. Salivary deoxyribonucleic acid (DNA) is extracted for genetic studies, including epigenetics using the SeqCap Epi Enrichment Kit. Targets of interest are being confirmed by pyrosequencing to identify potential epigenetic markers of caries risk. Discussion: This study will examine a wide range of potential determinants of childhood dental caries and evaluate the inter-relationships amongst them. The findings will provide an evidence base to plan and implement improved preventive strategies.
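For reference, the BMI mentioned above is weight divided by height squared (kg/m²); a tiny worked example with purely illustrative values follows.

```python
# BMI = weight (kg) / height (m) squared; the example values are illustrative only.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

print(round(bmi(22.0, 1.15), 1))   # e.g. a 22 kg child of 115 cm -> 16.6
```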

Relevance: 90.00%

Abstract:

In document images, we often find printed lines overlapping with handwritten elements, especially in the case of signatures. Typical examples of such images are bank cheques and payment slips. Although the detection and removal of horizontal lines has been addressed, the restoration of the handwritten area after line removal remains a problem of interest. In this paper, we propose a method for line removal and restoration of the erased areas of the handwritten elements. Subjective evaluation of the results has been conducted to analyze the effectiveness of the proposed method. The results are promising, with an accuracy of 86.33%. The entire process takes less than half a second per document image on a 2.4 GHz, 512 MB RAM Pentium IV PC.
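A minimal sketch of the general two-stage idea described above follows, using a generic OpenCV approach: detect long horizontal strokes with a morphological opening, erase them, and restore the overlapped handwriting by inpainting. The kernel width, inpainting radius and the placeholder file name cheque.png are assumptions; this is not the authors' method, whose restoration step is evaluated subjectively in the paper.

```python
# Generic line-removal-and-restoration sketch with OpenCV; parameters are assumed.
import cv2
import numpy as np

def remove_lines_and_restore(gray):
    # Binarise (ink = white) so morphology operates on strokes.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

    # A wide, 1-pixel-high kernel keeps only long horizontal runs (the printed lines).
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (40, 1))
    line_mask = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

    # Inpaint the erased pixels so handwriting crossing the line is reconstructed.
    return cv2.inpaint(gray, line_mask, 3, cv2.INPAINT_TELEA)

# "cheque.png" is a placeholder input; any grayscale document image would do.
restored = remove_lines_and_restore(cv2.imread("cheque.png", cv2.IMREAD_GRAYSCALE))
```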

Relevance: 90.00%

Abstract:

A Delay Tolerant Network (DTN) is a dynamic, fragmented, and ephemeral network formed by a large number of highly mobile, autonomous nodes. This requires distributed and self-organised approaches to trust management. Revocation and replacement of security credentials under adversarial influence, while preserving trust in the entity, is still an open problem. Existing methods are mostly limited to the detection and removal of malicious nodes. This paper makes use of the mobility property to provide a distributed, self-organising, and scalable revocation and replacement scheme. The proposed scheme utilises the Leverage of Common Friends (LCF) trust system concepts to revoke compromised security credentials and replace them with new ones, whilst preserving the level of trust placed in the entity. Security and performance of the proposed scheme are evaluated using an experimental data set, in comparison with other schemes based around the LCF concept. Our extensive experimental results show that the proposed scheme distributes replacement credentials up to 35% faster and spreads spoofed credentials of strong collaborating adversaries up to 50% slower, without causing any significant increase in communication and storage overheads, when compared to other LCF-based schemes.
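The following is a heavily simplified sketch of the "common friends" intuition behind LCF-style trust systems: a node accepts a replacement credential only if enough of its own trusted contacts also vouch for it. The threshold, the data structures and the function name accept_replacement are assumptions for illustration, not the protocol evaluated in the paper.

```python
# Toy LCF-style acceptance rule: enough common friends must vouch for the new credential.
def accept_replacement(my_friends: set, vouchers: set, min_common: int = 3) -> bool:
    """Accept the replacement credential if enough of my trusted contacts vouch for it."""
    return len(my_friends & vouchers) >= min_common

my_friends = {"n2", "n5", "n7", "n9"}
vouchers = {"n2", "n5", "n9", "n11"}              # nodes endorsing the new credential
print(accept_replacement(my_friends, vouchers))   # True: three common friends vouch
```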

Relevance: 90.00%

Abstract:

With technology scaling, vulnerability to soft errors in random logic is increasing. There is a need for on-line error detection and protection for logic gates, even at sea level. The error checker is the key element of an on-line detection mechanism. We compare three different checkers for error detection from the point of view of area, power and false error detection rates. We find that the double sampling checker (used in Razor) is the simplest and the most area- and power-efficient, but suffers from a very high false detection rate of 1.15 times the actual error rate. We also find that the alternate approaches of triple sampling and the integrate-and-sample (I&S) method can be designed to have zero false detection rates, but at increased area, power and implementation complexity. The triple sampling method has about 1.74 times the area and twice the power of the double sampling method, and also needs a complex clock generation scheme. The I&S method needs about 16% more power, with 0.58 times the area of double sampling, but comes with more stringent implementation constraints as it requires detection of small voltage swings.
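The toy, purely software illustration below shows why double sampling can raise false flags: the same node is captured at the clock edge and again by a delayed shadow element, any mismatch is flagged, so a transient seen by only one of the two samples triggers a flag without a real timing error. The error and glitch probabilities are arbitrary assumptions, not the figures reported in the paper.

```python
# Toy model of a double-sampling checker: main vs delayed shadow sample, mismatch = flag.
import numpy as np

rng = np.random.default_rng(2)
n_cycles = 100_000
correct = rng.integers(0, 2, n_cycles)             # value the logic should latch each cycle

late = rng.random(n_cycles) < 0.01                 # genuine timing errors: data not ready at the edge
glitch = rng.random(n_cycles) < 0.005              # transient seen only by the delayed shadow sample

main_ff = np.where(late, 1 - correct, correct)     # main flip-flop captures the wrong value when late
shadow_ff = np.where(glitch, 1 - correct, correct) # shadow flip-flop captures the settled value, plus glitches

flags = main_ff != shadow_ff                       # mismatch raises the error flag
false_flags = flags & ~late                        # flags not backed by a real timing error
print(f"flagged {flags.sum()} cycles, {false_flags.sum()} of them false detections")
```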

Relevance: 90.00%

Abstract:

The matched filter method for detecting a periodic structure on a surface hidden behind randomness is known to detect up to (r0/Λ) ≥ 0.11, where r0 is the coherence length of light on scattering from the rough part and Λ is the wavelength of the periodic part of the surface; this limit is much lower than what is allowed by conventional detection methods. The primary goal of this technique is the detection and characterization of the periodic structure hidden behind randomness without the use of any complicated experimental or computational procedures. This paper examines this detection procedure for various values of the amplitude a of the periodic part, beginning from a = 0 to small finite values of a. We thus address the importance of the following quantities: a/λ, which scales the amplitude of the periodic part with the wavelength of light, and r0/Λ, in determining the detectability of the intensity peaks.
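A minimal numerical sketch of the matched-filter idea is given below: project a measured profile onto quadrature templates at the expected spatial frequency and estimate the amplitude of the periodic component buried in the random background. The period, amplitude and noise level are illustrative and do not reproduce the paper's optical configuration.

```python
# Matched (quadrature) projection onto the expected spatial frequency of the hidden structure.
# All parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
N = 8192
x = np.arange(N)
period = 64.0                                    # assumed period of the hidden periodic part
a = 0.05                                         # small amplitude, buried in unit-variance roughness

profile = a * np.sin(2 * np.pi * x / period) + rng.normal(0.0, 1.0, N)

c = 2.0 / N * (profile @ np.cos(2 * np.pi * x / period))
s = 2.0 / N * (profile @ np.sin(2 * np.pi * x / period))
print("estimated periodic amplitude:", np.hypot(c, s))   # of the order of a despite the roughness
```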

Relevance: 90.00%

Abstract:

The literature review elucidates the mechanism of oxidation in proteins and amino acids and gives an overview of the detection and analysis of protein oxidation products, as well as information about β-lactoglobulin and studies carried out on modifications of this protein under certain conditions. The experimental research included the fractionation of the tryptic peptides of β-lactoglobulin using preparative HPLC-MS and monitoring of the oxidation process of these peptides via reverse-phase HPLC-UV. The peptides to be oxidized were selected with respect to their content of oxidation-susceptible amino acids and fractionated according to their m/z values. These peptides were: IPAVFK (m/z 674), ALPMHIR (m/z 838), LIVTQTMK (m/z 934) and VLVLDTDYK (m/z 1066). Even though it was not possible to isolate the target peptides completely, owing to co-elution of various fractions, the percentages of target peptides in the samples were sufficient to carry out the oxidation procedure. The IPAVFK and VLVLDTDYK fractions were found to yield the oxidation products reviewed in the literature; however, unoxidized peptides were still present in high amounts after 21 days of oxidation. The UV data at 260 and 280 nm made it possible to monitor both the main peptides and the oxidation products, owing to the absorbance of the aromatic side-chains these peptides possess. The ALPMHIR and LIVTQTMK fractions were oxidatively consumed rapidly, and oxidation products of these peptides were observed even on day 0. The high rates of depletion of these peptides were attributed to the presence of His (H) and the sulfur-containing side-chain of Met (M). In conclusion, the selected peptides hold the potential to be utilized as marker peptides in β-lactoglobulin oxidation.

Relevance: 90.00%

Abstract:

This paper discusses a successful application of the Acoustic Emission Technique (AET) for the detection and location of leak paths present on an inaccessible side of an end shield of a Pressurised Heavy Water Reactor (PHWR). The methodology was based on the fact that air- and water-leak AE signals have different characteristic features. Baseline data was generated from a sound end shield of a PHWR for characterising the background noise. A mock-up end shield system with saw-cut leak paths was used to verify the validity of the methodology. It was found that air-leak signals under pressurisation (as low as 3 psi) could be detected by frequency domain analysis. Signals due to air leaks from various locations of defective end shield were acquired and analysed. It was possible to detect and locate leak paths. The presence of detected leak paths was further confirmed by an alternative test.
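A minimal sketch of the kind of frequency-domain comparison described above follows: compare the band energy of a suspected leak signal against baseline data and flag bands where the energy rises well above baseline. The sampling rate, band, tone frequency and threshold are assumed values for illustration, not the plant's instrumentation settings.

```python
# Toy frequency-domain leak check: compare band energy of a suspected leak signal with baseline.
import numpy as np

def band_energy(signal, fs, f_lo, f_hi):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    return spectrum[(freqs >= f_lo) & (freqs <= f_hi)].sum()

fs = 1_000_000                                  # 1 MHz sampling (assumed)
t = np.arange(0, 0.01, 1 / fs)
rng = np.random.default_rng(4)
baseline = rng.normal(0, 1.0, t.size)                               # background AE noise
leak = baseline + 1.5 * np.sin(2 * np.pi * 150_000 * t)             # synthetic 150 kHz leak tone

ratio = band_energy(leak, fs, 100e3, 200e3) / band_energy(baseline, fs, 100e3, 200e3)
print(f"leak-band energy ratio: {ratio:.1f}", "-> leak suspected" if ratio > 3 else "")
```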

Relevance: 90.00%

Abstract:

The problem of denoising damage indicator signals for improved operational health monitoring of systems is addressed by applying soft computing methods to design filters. Since measured data in operational settings are contaminated with noise and outliers, pattern recognition algorithms for fault detection and isolation can give false alarms. A direct approach to improving fault detection and isolation is to remove noise and outliers from time series of measured data or damage indicators before performing fault detection and isolation. Many popular signal-processing approaches do not work well with damage indicator signals, which can contain sudden changes due to abrupt faults and non-Gaussian outliers. Signal-processing algorithms based on a radial basis function (RBF) neural network and weighted recursive median (WRM) filters are explored for denoising simulated time series. The RBF neural network filter is developed using a K-means clustering algorithm and is much less computationally expensive to develop than feedforward neural networks trained using backpropagation. The nonlinear multimodal integer-programming problem of selecting optimal integer weights of the WRM filter is solved using a genetic algorithm. Numerical results are obtained for helicopter rotor structural damage indicators based on simulated frequencies. Test signals consider low-order polynomial growth of damage indicators with time, to simulate gradual or incipient faults, and step changes in the signal, to simulate abrupt faults. Noise and outliers are added to the test signals. The WRM and RBF filters result in noise reductions of 54-71% and 59-73%, respectively, for the test signals considered in this study. Their performance is much better than that of the moving-average FIR filter, which causes significant feature distortion and has poor outlier-removal capabilities; this shows the potential of soft computing methods for specific signal-processing applications.
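The sketch below shows a weighted recursive median filter of the general kind described above: within a sliding window, already-filtered past outputs and the raw current and future samples are repeated according to integer weights, and the median of the expanded list is taken. The weights here are arbitrary; in the study they are selected by a genetic algorithm.

```python
# Weighted recursive median (WRM) filter sketch; the integer weights are arbitrary examples.
import numpy as np

def wrm_filter(x, weights):
    """weights: odd-length list of integer weights; the centre weight applies to the current sample."""
    half = len(weights) // 2
    y = np.array(x, dtype=float)                     # filtered output, initialised to the input
    for n in range(half, len(x) - half):
        # Past samples come from the already-filtered output (recursive), the rest from the raw input.
        window = np.concatenate([y[n - half:n], [x[n]], x[n + 1:n + 1 + half]])
        expanded = np.repeat(window, weights)        # integer weights act as repetition counts
        y[n] = np.median(expanded)
    return y

signal = np.linspace(0, 1, 50) + np.where(np.arange(50) == 25, 5.0, 0.0)   # ramp with an outlier at n=25
print(wrm_filter(signal, weights=[1, 2, 3, 2, 1])[23:28])                  # the outlier is suppressed
```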

Relevance: 90.00%

Abstract:

A damage detection and imaging methodology has been developed based on the symmetry of neighborhood sensor paths and the similarity of signal patterns with respect to radial paths in a circular array of sensors. It uses information regarding Lamb wave propagation, along with a triangulation scheme, to rapidly locate and quantify the severity of damage without using all of the sensor data. In a plate-like structure, such a scheme can be effectively employed besides full-field imaging of the wave scattering pattern from the damage, if present in the plate. This new scheme is validated experimentally. Hole and corrosion type damages have been detected and quantified successfully using the proposed scheme. A wavelet-based cumulative damage index has been studied, which shows monotonic sensitivity to the severity of the damage, as is most desired in a Structural Health Monitoring system.
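A minimal sketch of the time-of-flight triangulation such schemes rely on is given below: given arrival times of the scattered wave at a few sensors and an assumed group velocity, search for the location whose predicted times best match the measurements. The geometry, velocity and times are synthetic and do not reproduce the experimental configuration.

```python
# Toy time-of-flight triangulation over a plate; sensor layout, velocity and times are synthetic.
import numpy as np

sensors = np.array([[0.0, 0.1], [0.087, -0.05], [-0.087, -0.05]])   # metres, circular array (assumed)
v_group = 5000.0                                                     # Lamb wave group velocity, m/s (assumed)
damage_true = np.array([0.03, 0.02])
tof_measured = np.linalg.norm(sensors - damage_true, axis=1) / v_group

# Grid search over the plate for the point whose predicted times best match the measurements.
xs = ys = np.linspace(-0.1, 0.1, 201)
best, best_err = None, np.inf
for x in xs:
    for y in ys:
        tof = np.linalg.norm(sensors - np.array([x, y]), axis=1) / v_group
        err = np.sum((tof - tof_measured) ** 2)
        if err < best_err:
            best, best_err = (x, y), err
print("estimated damage location (m):", best)
```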