245 results for Ananas sativus extract


Relevance:

10.00%

Publisher:

Abstract:

Affine covariant local image features are a powerful tool for many applications, including matching and calibrating wide-baseline images. Local feature extractors that use a saliency map to locate features require adaptation processes in order to extract affine covariant features. The most effective extractors use the second moment matrix (SMM) to iteratively estimate the affine shape of local image regions. This paper shows that the Hessian matrix can be used to estimate local affine shape in a similar fashion to the SMM. The Hessian matrix requires significantly less computational effort than the SMM, allowing more efficient affine adaptation. Experimental results indicate that using the Hessian matrix in conjunction with a feature extractor that selects features in regions with high second-order gradients delivers correspondences of equivalent quality in less than 17% of the processing time required by the same extractor using the SMM.
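The Hessian-based adaptation can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the function names and the finite-difference scheme are my assumptions. It shows how a 2×2 Hessian accumulated over a patch yields a shape-normalising transform, analogous to how the SMM is used in iterative affine adaptation.

```python
import numpy as np

def hessian_matrix(patch):
    """2x2 Hessian of image intensities, summed over a patch
    (second-order finite differences via np.gradient)."""
    dy, dx = np.gradient(patch.astype(float))
    dxy, dxx = np.gradient(dx)
    dyy, _ = np.gradient(dy)
    return np.array([[dxx.sum(), dxy.sum()],
                     [dxy.sum(), dyy.sum()]])

def shape_normalizing_transform(H):
    """Transform mapping the anisotropic local shape described by a
    symmetric 2x2 matrix to an isotropic one (inverse matrix square root)."""
    vals, vecs = np.linalg.eigh(H)
    vals = np.abs(vals) + 1e-12          # guard against zero curvature
    return vecs @ np.diag(vals ** -0.5) @ vecs.T
```

Iterating — warp the patch by the transform, recompute the Hessian, repeat until it is approximately isotropic — mirrors the SMM-based adaptation loop, but each iteration needs only second derivatives rather than smoothed products of first derivatives.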


Most approaches to business process compliance are restricted to analysing the structure of processes. It has been argued that full regulatory compliance requires information not only on the structure of processes but also on what the tasks in a process do. To this end, Governatori and Sadiq [2007] proposed extending business processes with semantic annotations. We propose a methodology to automatically extract one kind of such annotation, in particular annotations related to the data schema and templates linked to the various tasks in a business process.


The quality of the features discovered in relevance feedback (RF) is the key issue for effective search queries. Most existing feedback methods do not carefully address the selection of features for noise reduction; as a result, extracted noisy features can easily degrade retrieval effectiveness. In this paper, we propose a novel feature extraction method for query formulation. The method first extracts term association patterns in RF as knowledge for feature extraction. Negative RF is then used to improve the quality of the discovered knowledge. A novel information filtering (IF) model is developed to evaluate the proposed method. Experimental results on Reuters Corpus Volume 1 and TREC topics confirm that the proposed model achieves encouraging performance compared to state-of-the-art IF models.
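The two-step idea — mine term association patterns from positive feedback, then use negative feedback to prune noise — can be roughly illustrated by mining frequent term pairs. This is a simple stand-in, not the authors' algorithm; the function name, support threshold, and pruning rule are assumptions.

```python
from collections import Counter
from itertools import combinations

def term_association_patterns(pos_docs, neg_docs, min_support=2):
    """Mine term pairs frequent in positive-feedback documents, then
    drop pairs that also occur in negative feedback (noise pruning)."""
    pos = Counter(frozenset(p)
                  for d in pos_docs
                  for p in combinations(sorted(set(d)), 2))
    neg = {frozenset(p)
           for d in neg_docs
           for p in combinations(sorted(set(d)), 2)}
    return {pair for pair, count in pos.items()
            if count >= min_support and pair not in neg}
```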


A multiple reaction monitoring (MRM) mass spectrometric assay for the quantification of PYY in human plasma has been developed. A two-stage sample preparation protocol was employed in which plasma containing the full-length neuropeptide was first digested using trypsin, followed by solid-phase extraction of the digested peptide from the complex plasma matrix. The peptide extracts were analysed by LC-MS using multiple reaction monitoring to detect and quantify PYY. The method has been validated for plasma samples, yielding linear responses over the range 5–1,000 ng/mL. The method is rapid, robust and specific for plasma PYY detection.
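Quantification with such an assay rests on a linear calibration over the validated range (5–1,000 ng/mL here). A minimal sketch of that step — the standard fit-and-invert arithmetic, not the authors' validation procedure:

```python
import numpy as np

def fit_calibration(conc, response):
    """Least-squares calibration line response = m*conc + b,
    fitted over standards spanning the validated linear range."""
    m, b = np.polyfit(conc, response, 1)
    return m, b

def quantify(response, m, b):
    """Invert the calibration line to read a concentration back."""
    return (response - b) / m
```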


Enterprise Systems (ES) have emerged as possibly the most important and challenging development in the corporate use of information technology in the last decade. Organizations have invested heavily in these large, integrated application software suites expecting improvements in business processes, management of expenditure, customer service and, more generally, competitiveness, along with improved access to better information and knowledge (i.e., business intelligence and analytics). Forrester survey data consistently show that investment in ES and enterprise applications in general remains the top IT spending priority, with the ES market estimated at $38 billion and predicted to grow at a steady rate of 6.9%, reaching $50 billion by 2012 (Wang & Hamerman, 2008). Yet organizations have failed to realize all the anticipated benefits. One of the key reasons is the inability of employees to properly utilize the capabilities of enterprise systems to complete their work and extract information critical to decision making. In response, universities (tertiary institutes) have developed academic programs aimed at addressing these skill gaps. In parallel with the proliferation of ES, there has been growing recognition of the importance of teaching Enterprise Systems at tertiary education institutes. Many academic papers have discussed the important role of Enterprise Systems curricula at tertiary education institutes (Ask, 2008; Hawking, 2004; Stewart, 2001), covering teaching philosophies, teaching approaches and challenges in Enterprise Systems education. Following global trends, tertiary institutes in the Asia-Pacific region began introducing Enterprise Systems curricula in the late 1990s with a range of subjects (a subject represents a single unit rather than a collection of units, which we refer to as a course) in faculties, schools and departments of Information Technology, Business and, in some cases, Engineering.
Many tertiary institutions commenced their initial subject offerings around four salient concepts of Enterprise Systems: (1) Enterprise Systems implementation; (2) introductions to the core modules of Enterprise Systems; (3) application customization using a programming language (e.g. ABAP); and (4) systems administration. While universities have come a long way in developing curricula in the Enterprise Systems area, many obstacles remain: the high cost of technology, a shortage of qualified faculty to teach, a lack of teaching materials, etc.


Data mining techniques extract repeated and useful patterns from large data sets, which in turn are used to predict the outcome of future events. The main purpose of the research presented in this paper is to investigate data mining strategies and develop an efficient framework for multi-attribute project information analysis to predict the performance of construction projects. The research team first reviewed existing data mining algorithms, applied them to systematically analyse a large project data set collected by survey, and finally proposed a data-mining-based decision support framework for project performance prediction. To evaluate the potential of the framework, a case study was conducted using data collected from 139 capital projects to analyse the relationship between the use of information technology and project cost performance. The results showed that the proposed framework has the potential to promote fast, easy-to-use, interpretable and accurate project data analysis.
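The kind of interpretable model such a framework favours can be illustrated with a one-split decision stump — a hypothetical stand-in, since the paper does not specify this learner. Given project attributes X and a binary cost-performance label y, it exhaustively picks the single attribute/threshold split with the fewest misclassifications:

```python
import numpy as np

def fit_stump(X, y):
    """Exhaustively choose (feature, threshold, polarity) with the
    fewest misclassifications - the simplest interpretable tree model."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            pred = (X[:, j] > t).astype(int)
            for flip in (pred, 1 - pred):
                err = np.mean(flip != y)
                if best is None or err < best[0]:
                    best = (err, j, t, flip is not pred)
    return best[1:]  # (feature index, threshold, invert flag)

def predict_stump(model, X):
    j, t, invert = model
    pred = (X[:, j] > t).astype(int)
    return 1 - pred if invert else pred
```

The resulting rule ("performance is poor when attribute j exceeds t") is directly readable, which is one sense in which mined models can be "interpretable".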


Monitoring the natural environment is increasingly important as habitat degradation and climate change reduce the world's biodiversity. We have developed software tools and applications to assist ecologists with the collection and analysis of acoustic data at large spatial and temporal scales. One of our key objectives is automated animal call recognition, and our approach has three novel attributes. First, we work with raw environmental audio, contaminated by noise and artefacts and containing calls that vary greatly in volume depending on the animal's proximity to the microphone. Second, initial experimentation suggested that no single recognizer could deal with the enormous variety of calls. Therefore, we developed a toolbox of generic recognizers to extract invariant features for each call type. Third, many species are cryptic and offer little data with which to train a recognizer. Many popular machine learning methods require large volumes of training and validation data and considerable time and expertise to prepare. Consequently, we adopt bootstrap techniques that can be initiated with little data and refined subsequently. In this paper, we describe our recognition tools and present results for real ecological problems.
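Call recognizers of this kind typically operate on a spectrogram rather than the raw waveform. A minimal magnitude short-time Fourier transform, as a sketch of that common front end (the frame and hop sizes are illustrative assumptions, not the toolbox's actual settings):

```python
import numpy as np

def stft_magnitude(signal, frame=256, hop=128):
    """Magnitude STFT: Hann-windowed, half-overlapping frames,
    one real-FFT spectrum per frame (rows = time, cols = frequency)."""
    win = np.hanning(frame)
    frames = [signal[i:i + frame] * win
              for i in range(0, len(signal) - frame + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))
```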


Generic sentiment lexicons are widely used in sentiment analysis. However, manually constructing sentiment lexicons is very time-consuming and may not be feasible for application domains where annotation expertise is not available. One contribution of this paper is the development of a statistical-learning-based computational method for the automatic construction of domain-specific sentiment lexicons to enhance cross-domain sentiment analysis. Our initial experiments show that the proposed methodology can automatically generate domain-specific sentiment lexicons that improve the effectiveness of opinion retrieval at the document level. Another contribution of our work is to show the feasibility of applying a sentiment metric derived from the automatically constructed lexicons to predict product sales in certain product categories. Our research contributes to the development of more effective sentiment analysis systems for extracting business intelligence from the numerous opinionated expressions posted to the Web.
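One classic statistical route to a domain-specific lexicon — not necessarily the authors' exact method — scores each candidate word by its co-occurrence with a handful of positive and negative seed words, in the spirit of PMI-based semantic orientation:

```python
import math
from collections import Counter
from itertools import combinations

def so_pmi_scores(docs, pos_seeds, neg_seeds):
    """Semantic orientation from PMI: a word leans positive if it
    co-occurs (within a document) with positive seeds more than with
    negative ones. docs is a list of token lists."""
    word_count = Counter()
    pair_count = Counter()
    for doc in docs:
        words = set(doc)
        word_count.update(words)
        pair_count.update(frozenset(p) for p in combinations(sorted(words), 2))
    n = len(docs)

    def pmi(w, s):
        joint = pair_count[frozenset((w, s))]
        if joint == 0:
            return 0.0
        return math.log((joint * n) / (word_count[w] * word_count[s]))

    return {w: sum(pmi(w, s) for s in pos_seeds)
               - sum(pmi(w, s) for s in neg_seeds)
            for w in word_count
            if w not in pos_seeds and w not in neg_seeds}
```

Thresholding the scores yields a domain lexicon; aggregating them over documents gives one plausible form of the sentiment metric discussed above.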


Structural health monitoring (SHM) refers to the procedures used to assess the condition of structures so that their performance can be monitored and any damage detected early. Early detection of damage and appropriate retrofitting will aid in preventing failure of the structure, save money spent on maintenance or replacement, and ensure the structure operates safely and efficiently throughout its intended life. Though visual inspection and other techniques such as vibration-based methods are available for the SHM of structures such as bridges, the acoustic emission (AE) technique is an attractive option whose use is increasing. AE waves are high-frequency stress waves generated by the rapid release of energy from localised sources within a material, such as crack initiation and growth. The AE technique involves recording these waves by means of sensors attached to the surface and then analysing the signals to extract information about the nature of the source. High sensitivity to crack growth, the ability to locate sources, its passive nature (no need to supply energy from outside, as energy from the damage source itself is utilised) and the possibility of real-time monitoring (detecting a crack as it occurs or grows) are some of the attractive features of the AE technique. In spite of these advantages, challenges remain in using the AE technique for monitoring applications, especially in the analysis of recorded AE data, as large volumes of data are usually generated during monitoring. The need for effective data analysis can be linked to three main aims of monitoring: (a) accurately locating the source of damage; (b) identifying and discriminating signals from different sources of acoustic emission; and (c) quantifying the level of damage of an AE source for severity assessment. In the AE technique, the location of the emission source is usually calculated using the arrival times and velocities of the AE signals recorded by a number of sensors.
Complications arise, however, because AE waves can travel through a structure in a number of different modes with different velocities and frequencies. Hence, to accurately locate a source it is necessary to identify the modes recorded by the sensors. This study proposed and tested the use of time-frequency analysis tools, such as the short-time Fourier transform, to identify the modes, and the use of the velocities of these modes to achieve very accurate results. Further, this study explored the possibility of reducing the number of sensors needed for data capture by using the velocities of modes captured by a single sensor for source localization. A major problem in the practical use of the AE technique is the presence of AE sources other than cracks, such as rubbing and impacts between different components of a structure. These spurious AE signals often mask the signals from crack activity; hence discriminating signals to identify their sources is very important. This work developed a model that uses signal processing tools such as cross-correlation, magnitude squared coherence and the energy distribution in different frequency bands, as well as modal analysis (comparing the amplitudes of identified modes), to accurately differentiate signals from different simulated AE sources. Quantification tools to assess the severity of damage sources are highly desirable in practical applications. Though different damage quantification methods have been proposed for the AE technique, none has achieved universal acceptance or been shown suitable for all situations. The b-value analysis, which involves studying the distribution of the amplitudes of AE signals, and its modified form (known as improved b-value analysis) were investigated for their suitability for damage quantification in ductile materials such as steel.
This was found to give encouraging results for the analysis of laboratory data, extending the possibility of its use to real-life structures. By addressing these primary issues, this thesis has helped improve the effectiveness of the AE technique for the structural health monitoring of civil infrastructure such as bridges.
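Two of the quantities discussed above reduce to compact formulas. For a linear (1-D) sensor arrangement with the source between the sensors, the position follows from the arrival-time difference and the velocity of the identified mode; and a b-value can be estimated from event magnitudes (AE amplitudes in dB, conventionally divided by 20) via Aki's maximum-likelihood formula. Both sketches are textbook forms, not the thesis's specific implementation:

```python
import math

def locate_source_1d(x1, t1, x2, t2, v):
    """1-D AE source location between sensors at x1 < x2, from arrival
    times t1, t2 and the velocity v of the identified mode:
    t2 - t1 = (x1 + x2 - 2x) / v  =>  x = (x1 + x2)/2 - v*(t2 - t1)/2."""
    return (x1 + x2) / 2 - v * (t2 - t1) / 2

def b_value(magnitudes):
    """Aki's maximum-likelihood b-value from event magnitudes:
    b = log10(e) / (mean(M) - M_min)."""
    m_min = min(magnitudes)
    mean = sum(magnitudes) / len(magnitudes)
    return math.log10(math.e) / (mean - m_min)
```

Identifying which mode (and hence which velocity v) each sensor recorded is precisely the step the time-frequency analysis above is needed for.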


A significant number of patients diagnosed with primary brain tumours report unmet information needs. Using concept mapping methodology, this study aimed to identify strategies for improving information provision and to describe the factors that health professionals understood to influence their provision of information to patients with brain tumours and their families. Concept mapping is a mixed-methods approach that uses statistical methods to represent participants' perceived relationships between elements as conceptual maps. These maps, together with the results of the associated data collection and analyses, are used to extract the concepts involved in information provision to these patients. Thirty health professionals working across a range of neuro-oncology roles and settings participated in the concept mapping process. Participants rated a care coordinator as the most important strategy for improving brain tumour care, with psychological support as a whole rated as the most important element of care. Five major themes were identified as facilitating information provision: health professionals' communication skills, style and attitudes; patients' needs and preferences; perceptions of patients' need for protection and initiative; rapport and continuity between patients and health professionals; and the nature of the health care system. Overall, health professionals conceptualised information provision as 'individualised', dependent on these interconnected personal and environmental factors.


Inquiries into return predictability have traditionally been limited to the conditional mean, while the literature on portfolio selection is replete with moment-based analyses that consider up to the fourth moment. This paper develops a distribution-based framework for both return prediction and portfolio selection. More specifically, a time-varying return distribution is modelled through quantile regressions and copulas, using quantile regressions to extract the information in the marginal distributions and copulas to capture the dependence structure. A preference function that captures higher moments is proposed for portfolio selection. An empirical application highlights the additional information provided by the distributional approach, which cannot be captured by traditional moment-based methods.
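The quantile-regression side of such a framework rests on the check (pinball) loss, whose minimiser is the requested quantile. A minimal illustration with an intercept-only model — the full framework regresses on covariates and adds copulas, which this sketch omits:

```python
import numpy as np

def pinball_loss(tau, y, q):
    """Check loss: its minimiser over a constant q is the tau-quantile."""
    e = y - q
    return np.mean(np.maximum(tau * e, (tau - 1) * e))

def fit_constant_quantile(tau, y):
    """Brute-force the constant minimising the pinball loss; for an
    intercept-only model the candidates can be the observed values."""
    return min(y, key=lambda q: pinball_loss(tau, y, q))
```

Replacing the constant q with a linear function of predictors, one tau at a time, gives the quantile regressions used to build the marginal return distributions.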


BACKGROUND: Demineralized freeze-dried bone allografts (DFDBAs) have been proposed as a useful adjunct in periodontal therapy to induce periodontal regeneration through the induction of new bone formation. The presence of bone morphogenetic proteins (BMPs) within the demineralized matrix has been proposed as a possible mechanism through which DFDBA may exert its biologic effect. However, in recent years the predictability of results using DFDBA has been variable, and its use has consequently been questioned. One reason for the variability in tissue response may be differences in the processing of DFDBA, which may lead to a loss of activity of any bioactive substances within the DFDBA matrix. Therefore, the purpose of this investigation was to determine whether there are detectable levels of bone morphogenetic proteins in commercial DFDBA preparations. METHODS: A single preparation of DFDBA was obtained from each of three commercial sources. Each preparation was studied in triplicate. Proteins within the DFDBA samples were first extracted with 4 M guanidinium HCl for seven days at 40 degrees Celsius, and the residue was further extracted with 4 M guanidinium HCl/EDTA for seven days at 40 degrees Celsius. Two anti-human BMP-2 and -4 antibodies were used to detect the presence of BMPs in the extracts. RESULTS: Neither BMP-2 nor BMP-4 was detected in any of the extracts. When recombinant human BMP-2 and -4 were added throughout the DFDBA extraction process, not only were intact proteins detected but smaller molecular weight fragments were also noted in the extract. CONCLUSIONS: These results indicate that none of the DFDBA samples tested had detectable amounts of BMP-2 or -4. In addition, an unknown substance present in the DFDBA may be responsible for the degradation of whatever BMPs might be present.


This paper outlines a feasible scheme to extract the deck trend when a rotary-wing unmanned aerial vehicle (RUAV) approaches an oscillating deck. An extended Kalman filter (EKF) is developed to fuse measurements from multiple sensors for effective estimation of the unknown deck heave motion. A recursive Prony analysis (PA) procedure is also proposed to implement online curve-fitting of the estimated heave motion. The proposed PA constructs an appropriate model with parameters identified using the forgetting-factor recursive least squares (FFRLS) method. The deck trend is then extracted by separating the dominant modes. The performance of the proposed procedure is evaluated using real ship motion data, and simulation results justify the suitability of the proposed method for the safe landing of RUAVs operating in a maritime environment.
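The FFRLS step can be sketched as a standard forgetting-factor recursive least-squares update. The class name and default forgetting factor are my assumptions, and the Prony model of the heave motion that supplies the regressor phi is omitted here:

```python
import numpy as np

class FFRLS:
    """Forgetting-factor recursive least squares: theta tracks a
    (possibly drifting) linear model y = phi . theta, with lam < 1
    exponentially discounting old samples."""
    def __init__(self, n, lam=0.98, delta=100.0):
        self.theta = np.zeros(n)
        self.P = delta * np.eye(n)   # large initial covariance
        self.lam = lam

    def update(self, phi, y):
        phi = np.asarray(phi, float)
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)           # gain vector
        self.theta = self.theta + k * (y - phi @ self.theta)
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return self.theta
```

With lam = 1 this reduces to ordinary RLS; lam < 1 lets the parameter estimates follow the slowly varying deck motion.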


Virus-like particle-based vaccines for high-risk human papillomaviruses (HPVs) appear to have great promise; however, cell culture-derived vaccines will probably be very expensive. The optimization of expression of different codon-optimized versions of the HPV-16 L1 capsid protein gene in plants has been explored by means of transient expression from a novel suite of Agrobacterium tumefaciens binary expression vectors, which allow targeting of recombinant protein to the cytoplasm, endoplasmic reticulum (ER) or chloroplasts. A gene resynthesized to reflect human codon usage expresses better than the native gene, which expresses better than a plant-optimized gene. Moreover, chloroplast localization allows significantly higher levels of accumulation of L1 protein than does cytoplasmic localization, whilst ER retention was least successful. High levels of L1 (>17% total soluble protein) could be produced via transient expression: the protein assembled into higher-order structures visible by electron microscopy, and a concentrated extract was highly immunogenic in mice after subcutaneous injection and elicited high-titre neutralizing antibodies. Transgenic tobacco plants expressing a human codon-optimized gene linked to a chloroplast-targeting signal expressed L1 at levels up to 11% of the total soluble protein. These are the highest levels of HPV L1 expression reported for plants: these results, and the excellent immunogenicity of the product, significantly improve the prospects of making a conventional HPV vaccine by this means. © 2007 SGM.


A high-throughput method for isolating and cloning geminivirus genomes from dried plant material, combining an Extract-n-Amp™-based DNA isolation technique with rolling circle amplification (RCA) of viral DNA, is presented. Using this method, an attempt was made to isolate and clone full geminivirus genomes/genome components from 102 plant samples, including dried leaves stored at room temperature for between 6 months and 10 years, with an average hands-on time to RCA-ready DNA of 15 min per 20 samples. While storage of dried leaves for up to 6 months did not appreciably decrease cloning success rates relative to those achieved with fresh samples, the efficiency of the method decreased with increasing storage time. However, it was still possible to clone virus genomes from 47% of 10-year-old samples. To illustrate the utility of this simple method for high-throughput geminivirus diversity studies, six Maize streak virus genomes, an Abutilon mosaic virus DNA-B component and the DNA-A component of a previously unidentified New World begomovirus species were fully sequenced. Genomic clones of the 69 other viruses were verified as such by end sequencing. This method should be extremely useful for the study of any circular DNA plant viruses with genome components smaller than the maximum size amplifiable by RCA. © 2008 Elsevier B.V. All rights reserved.