30 results for experimental knowledge extraction


Relevance: 30.00%

Publisher:

Abstract:

Kinetic studies on the AR (aldose reductase) protein have shown that it does not behave as a classical enzyme towards ring aldose sugars. As with non-enzymatic glycation reactions, there is probably a free-radical element involved, derived from monosaccharide autoxidation. In the case of AR, there is free-radical oxidation of NADPH by autoxidizing monosaccharides, which is enhanced in the presence of the NADPH-binding protein. Thus any assay for AR based on the oxidation of NADPH in the presence of autoxidizing monosaccharides is invalid, and tissue AR measurements based on this method are also invalid and should be reassessed. AR exhibits broad specificity for both hydrophilic and hydrophobic aldehydes, which suggests that the protein may be involved in detoxification. The last thing we would want to do is to inhibit it. ARIs (AR inhibitors) have a number of non-specific actions in the cell that do not involve their binding to AR; these include peroxy-radical scavenging and metal-ion chelation effects. The AR/ARI story emphasizes the importance of correct experimental design in all biocatalytic experiments. Developing the use of Bayesian utility functions, we have used a systematic method to identify the optimum experimental designs for a number of kinetic model data sets. This has led to the identification of trends between kinetic model types, sets of design rules and the key conclusion that such designs should be based on some prior knowledge of Km and/or the kinetic model. We suggest an optimal and iterative method for selecting features of the design such as the substrate range, the number of measurements and the choice of intermediate points. The final design collects data suitable for accurate modelling and analysis, minimizes the error in the estimated parameters, and is suitable for simple or complex steady-state models.
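The design conclusions above rest on the steady-state rate law; as a minimal sketch (not the authors' code, with purely illustrative Vmax and Km values), the canonical Michaelis-Menten model and a Km-bracketing substrate design look like this:

```python
# Hypothetical sketch of the canonical Michaelis-Menten steady-state model;
# parameter values and design points are illustrative, not from the paper.

def mm_rate(s, vmax=10.0, km=2.0):
    """Steady-state rate v = Vmax * S / (Km + S) at substrate concentration s."""
    return vmax * s / (km + s)

# Prior knowledge of Km lets the design bracket it: points well below,
# at, and well above Km constrain both Vmax and Km.
km_prior = 2.0
design = [0.5 * km_prior, km_prior, 10 * km_prior]
rates = [mm_rate(s) for s in design]
```

At S = Km the rate is exactly Vmax/2, which is why bracketing Km is so informative for parameter estimation.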


In areas such as drug development, clinical diagnosis and biotechnology research, acquiring details about the kinetic parameters of enzymes is crucial. The correct design of an experiment is critical to collecting data suitable for analysis, modelling and deriving the correct information. Because classical design methods are not tailored to the more complex kinetic models now frequently studied, care is needed to estimate the parameters of such models with low variance. We demonstrate that a Bayesian approach (the use of prior knowledge) can produce major gains quantifiable in terms of the information, productivity and accuracy of each experiment. Developing the use of Bayesian utility functions, we have used a systematic method to identify the optimum experimental designs for a number of kinetic model data sets. This has enabled the identification of trends between kinetic model types, sets of design rules and the key conclusion that such designs should be based on some prior knowledge of Km and/or the kinetic model. We suggest an optimal and iterative method for selecting features of the design such as the substrate range, the number of measurements and the choice of intermediate points. The final design collects data suitable for accurate modelling and analysis and minimises the error in the estimated parameters. (C) 2003 Elsevier Science B.V. All rights reserved.
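To make the link between design and parameter error concrete, here is a hedged sketch (not the paper's method; all values invented, data noiseless for clarity) of recovering Michaelis-Menten parameters from a design that brackets a prior Km estimate, via a crude grid search:

```python
# Hedged sketch: fitting (Vmax, Km) from a Km-bracketing design by grid
# search. Values are illustrative; noiseless data are for demonstration only.

def mm_rate(s, vmax, km):
    return vmax * s / (km + s)

def fit(substrates, rates):
    """Return the (vmax, km) grid point minimising the sum of squared errors."""
    grid_v = [5.0 + 0.5 * i for i in range(21)]    # 5.0 .. 15.0
    grid_k = [0.5 + 0.25 * i for i in range(19)]   # 0.5 .. 5.0
    def sse(p):
        v, k = p
        return sum((mm_rate(s, v, k) - r) ** 2 for s, r in zip(substrates, rates))
    return min(((v, k) for v in grid_v for k in grid_k), key=sse)

# A design informed by a prior estimate of Km = 2.0:
design = [0.5, 1.0, 2.0, 4.0, 20.0]
observed = [mm_rate(s, 10.0, 2.0) for s in design]
best = fit(design, observed)
```

A real implementation would use gradient-based nonlinear least squares; the grid search merely keeps the sketch self-contained.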


Quantitative structure-activity relationships (QSARs) have been developed to optimise the choice of nitrogen heterocyclic molecules that can be used to separate minor actinides such as americium(III) from europium(III) in the aqueous PUREX raffinate of nuclear waste. Experimental data on distribution coefficients and separation factors (SFs) for 47 such ligands have been obtained and show SF values ranging from 0.61 to 100. The ligands were divided into a training set of 36 molecules to develop the QSAR and a test set of 11 molecules to validate it. Over 1500 molecular descriptors were calculated for each heterocycle, and a genetic algorithm was used to select the most appropriate ones for use in multiple regression equations. Equations fitting the separation factors to 6-8 molecular descriptors were developed, giving r^2 values of >0.8 for the training set and >0.7 for the test set, thus showing good predictive quality. The descriptors used in the equations were primarily electronic and steric. These equations can be used to predict the separation factors of nitrogen heterocycles not yet synthesised and/or tested, and hence to identify the most efficient ligands for lanthanide and actinide separation. (C) 2003 Elsevier B.V. All rights reserved.
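As a toy illustration of the regression step only (the real equations used 6-8 descriptors selected by a genetic algorithm; the single descriptor and separation-factor values below are invented), an ordinary least squares fit of log SF against one hypothetical descriptor can be sketched as:

```python
# Hedged sketch of the QSAR regression core: one invented descriptor per
# ligand and its log separation factor, made perfectly linear for clarity.
descriptor = [0.10, 0.25, 0.40, 0.55, 0.70]
log_sf     = [0.30, 0.60, 0.90, 1.20, 1.50]

def fit_line(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def r_squared(x, y, a, b):
    """Coefficient of determination for the fitted line."""
    my = sum(y) / len(y)
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

a, b = fit_line(descriptor, log_sf)
```

The r^2 thresholds quoted in the abstract (>0.8 training, >0.7 test) are computed exactly as in `r_squared`, just over multi-descriptor fits.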


This paper introduces a knowledge-based management prototype, entitled E+, for environmentally conscious construction, based on an integration of current environmental management tools in the construction area. The overall objective of developing the E+ prototype is to facilitate the selective reuse of retrievable knowledge in construction engineering and management, assembled from previous projects, for best practice in environmentally conscious construction. The methodologies adopted in previous and ongoing research related to the development of E+ belong to the operations research and information technology areas, and include literature review, questionnaire survey and interview, statistical analysis, system analysis and development, and experimental research and simulation. The content presented in this paper includes an advanced E+ prototype, a comprehensive review of the environmental management tools integrated into the E+ prototype, and an experimental case study of its implementation. It is expected that the adoption and implementation of the E+ prototype can help contractors to improve their environmental performance over the lifecycle of project-based construction and to reduce the adverse environmental impacts of the various engineering and management processes deployed at each construction stage.


Purpose - The purpose of this paper is to provide a quantitative multicriteria decision-making approach to knowledge management in construction entrepreneurship education by means of an analytic knowledge network process (KANP).

Design/methodology/approach - The KANP approach in this study integrates a standard industrial classification with the analytic network process (ANP). For construction entrepreneurship education, a decision-making model named KANP.CEEM is built to apply the KANP method to the evaluation of teaching cases, facilitating the case method widely adopted in entrepreneurship education at business schools.

Findings - The study finds that there are eight clusters and 178 nodes in the KANP.CEEM model, and experimental research on the evaluation of teaching cases shows that the KANP method is effective in applying knowledge management to entrepreneurship education.

Research limitations/implications - As experimental research, this paper ignores the concordance between the selected standard classification and others, which perhaps limits the usefulness of the KANP.CEEM model elsewhere.

Practical implications - As the KANP.CEEM model is built on standard classification codes and the embedded ANP, it is expected to have wide potential for evaluating knowledge-based teaching materials for any educational purpose with a construction-industry background, and it can be used by both faculty and students.

Originality/value - This paper fulfils a knowledge management need and offers a practical tool for an academic starting out on the development of knowledge-based teaching cases and other teaching materials, or for a student working through the case studies and other learning materials.
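The numerical core of ANP (and of the KANP model built on it) is deriving priority weights from pairwise comparison judgements. A minimal sketch follows; the 3x3 comparison matrix is invented, and a real KANP supermatrix spans many clusters rather than one:

```python
# Hedged sketch of ANP/AHP-style prioritisation: approximate the principal
# eigenvector of a pairwise comparison matrix by power iteration.
# The judgements below are invented (Saaty 1-9 scale), for illustration only.

def priorities(matrix, iterations=100):
    """Normalised principal eigenvector of a positive square matrix."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

# Pairwise comparisons of three hypothetical teaching cases:
comparisons = [[1.0, 3.0, 5.0],
               [1 / 3, 1.0, 2.0],
               [1 / 5, 1 / 2, 1.0]]
weights = priorities(comparisons)
```

The weights sum to one and rank the alternatives; ANP then assembles such local priorities into a supermatrix across clusters.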


Chain is a commonly used component in offshore moorings, where its ruggedness and corrosion resistance make it an attractive choice. Another attractive property is that a straight chain is inherently torque balanced. However, if a chain is loaded in a twisted condition, or twisted while under load, it exhibits highly non-linear torsional behaviour. The consequences of this behaviour can cause handling difficulties or may compromise the integrity of the mooring system, and care must be taken to avoid problems for both the chain and any components to which it is connected. Even with knowledge of the potential problems, there will always be occasions where, despite the utmost care, twist is unavoidable. It is therefore important for the engineer to be able to determine its effects. A frictionless theory, developed in Part 1 of the paper, may be used to predict the resultant torques and the movement or 'lift' in the links as non-dimensional functions of the angle of twist. The present part of the paper describes a series of experiments undertaken on both studless and stud-link chain to allow comparison of this theoretical model with experimental data. Results are presented for the torsional response and link lift in 'constant twist' and 'constant load' tests on chains of three different link sizes.


A fast Knowledge-based Evolution Strategy, KES, for the multi-objective minimum spanning tree problem is presented. The proposed algorithm is validated, for the bi-objective case, against an exhaustive search for small problems (4-10 nodes), and compared with a deterministic algorithm, EPDA, and with NSGA-II for larger problems (up to 100 nodes) using hard benchmark instances. Experimental results show that KES finds the true Pareto fronts for small instances of the problem and calculates good approximation Pareto sets for the larger instances tested. The fronts calculated by KES are shown to be superior to the NSGA-II fronts and almost as good as those established by EPDA. KES is designed to be scalable to multi-objective problems and fast, owing to its low complexity.
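The Pareto-front bookkeeping that any such multi-objective algorithm relies on can be sketched in a few lines (the KES internals themselves are not reproduced here, and the bi-objective cost vectors below are invented):

```python
# Hedged sketch of Pareto dominance for minimisation problems, the test used
# to maintain an approximation set in multi-objective search.

def dominates(a, b):
    """True if cost vector a Pareto-dominates b: no worse in every
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(costs):
    """Keep only the non-dominated cost vectors."""
    return [p for p in costs
            if not any(dominates(q, p) for q in costs if q != p)]

# Invented bi-objective spanning-tree costs, e.g. (weight, delay):
candidates = [(1, 5), (2, 2), (5, 1), (3, 3)]
front = pareto_front(candidates)
```

Here (3, 3) is dominated by (2, 2) and drops out, while the other three trade off the two objectives and survive.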


In this paper, we introduce a novel high-level visual content descriptor devised for semantic-based image classification and retrieval. The work can be seen as an attempt to bridge the so-called "semantic gap". The proposed image feature vector model is fundamentally underpinned by an automatic image labelling framework, called Collaterally Cued Labelling (CCL), which combines the collateral knowledge extracted from the texts accompanying the images with state-of-the-art low-level visual feature extraction techniques to automatically assign textual keywords to image regions. A subset of the Corel image collection was used to evaluate the proposed method. The experimental results indicate that our semantic-level visual content descriptors outperform both conventional visual and textual image feature models.


Results from both experimental measurements and 3D numerical simulations of Ground Source Heat Pump (GSHP) systems in a UK climate are presented. Experimental measurements of a horizontal-coupled slinky GSHP were undertaken at Talbot Cottage, Drayton St Leonard, Oxfordshire, UK. The measured thermophysical properties of the in situ soil were used in the CFD model. The thermal performance of slinky heat exchangers for the horizontal-coupled GSHP system was investigated for different coil diameters and slinky interval distances using a validated 3D model. Results from a two-month period of monitoring the performance of the GSHP system showed that the COP decreased with running time; the average COP of the horizontal-coupled GSHP was 2.5. The numerical predictions showed no significant difference in the specific heat extraction of the slinky heat exchanger at different coil diameters; however, the larger the coil diameter, the higher the heat extraction per metre length of soil. The specific heat extraction also increased, but the heat extraction per metre length of soil decreased, as the coil centre interval distance increased.
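The reported performance figure follows the standard definition of the coefficient of performance. A trivial sketch, with invented kWh totals chosen only to reproduce the reported average COP of 2.5:

```python
# Hedged sketch: COP is useful heat delivered per unit of electrical input.
# The two-month totals below are invented for illustration.

def cop(heat_delivered_kwh, electricity_input_kwh):
    """Coefficient of performance of a heat pump."""
    return heat_delivered_kwh / electricity_input_kwh

average_cop = cop(heat_delivered_kwh=5000.0, electricity_input_kwh=2000.0)
```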


Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on the words within them. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can be because they represent what the author believes the paper is about rather than what it actually is, or because they include keyphrases that are more classificatory than explanatory, e.g., "University of Poppleton" instead of "Knowledge Discovery in Databases". Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases reflecting the document. This paper proposes a solution that examines the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. The primary method takes n-grams of the source document phrases and examines their synonyms, while the secondary method groups the outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work.
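The primary method can be caricatured in a few lines; the tokenisation and the tiny synonym table below are hypothetical stand-ins for the real components:

```python
# Hedged sketch of n-gram candidate generation with synonym grouping.
# The SYNONYMS table is a toy stand-in for a real lexical resource.

def ngrams(tokens, n):
    """All contiguous n-token phrases from a token list."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

SYNONYMS = {"databases": "data stores", "knowledge": "information"}

def theme(phrase):
    """Map each word to a canonical synonym, so phrases sharing a theme
    collapse onto the same key."""
    return " ".join(SYNONYMS.get(w, w) for w in phrase.split())

tokens = "knowledge discovery in databases".split()
candidates = ngrams(tokens, 2)
```

Grouping candidates by `theme(...)` is the secondary step: phrases with different surface forms but shared synonyms land in one bucket.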


There are many published methods for creating keyphrases for documents. Previous work in the field has shown that in a significant proportion of cases author-selected keyphrases are not appropriate for the document they accompany: often the keyphrases are not updated when the focus of a paper changes, or they are more classificatory than explanatory. This motivates the use of automated methods. The published methods are all evaluated on different corpora, typically one relevant to their field of study. This not only makes it difficult to incorporate the useful elements of the algorithms in future work, but also makes comparing the results of the methods inefficient and ineffective. This paper describes work undertaken to compare five methods across a common baseline of six corpora. The methods chosen were term frequency, inverse document frequency, the C-Value, the NC-Value, and a synonym-based approach. These methods were compared to evaluate the performance and quality of their results, and to provide a future benchmark. It is shown that, with the comparison metric used in this study, term frequency and inverse document frequency were the best algorithms, with the synonym-based approach following them. Further work is required to determine a more appropriate comparison metric.
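Two of the baselines compared, term frequency and inverse document frequency, combine into the familiar TF-IDF score. A minimal sketch over a toy corpus (the tokenised documents are invented):

```python
# Hedged sketch of TF-IDF scoring for a single candidate term.
import math

def tf_idf(term, doc, corpus):
    """Term frequency in doc, weighted by log inverse document frequency."""
    tf = doc.count(term)
    df = sum(1 for d in corpus if term in d)
    return tf * math.log(len(corpus) / df) if df else 0.0

# Toy corpus of tokenised documents (invented):
corpus = [["keyphrase", "extraction", "keyphrase"],
          ["term", "frequency", "baseline"],
          ["synonym", "based", "extraction"],
          ["corpus", "comparison", "study"]]
score = tf_idf("keyphrase", corpus[0], corpus)
```

A term frequent in one document but rare across the corpus scores highest, which is why these simple baselines remain hard to beat.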


A common procedure for studying the effects of repetitive transcranial magnetic stimulation (rTMS) on cognition is to deliver rTMS concurrently with task performance, and to compare task performance on these trials versus trials without rTMS. Recent evidence that TMS can have effects on neural activity that persist longer than the experimental session itself, however, raises questions about the assumption of transience that underlies many concurrent (or "online") rTMS designs. To our knowledge, there have been no studies in the cognitive domain examining whether brief trains of rTMS applied during specific epochs of a complex task may have effects that spill over into subsequent task epochs, and perhaps into subsequent trials. We looked for possible immediate spill-over and longer-term cumulative effects of rTMS in data from two studies of visual short-term delayed recognition. In 54 subjects, 10-Hz rTMS trains were applied to five different brain regions during the 3-s delay period of a spatial task; in a second group of 15 subjects, electroencephalography (EEG) was recorded while 10-Hz rTMS was applied to two brain areas during the 3-s delay period of both spatial and object tasks. No evidence for immediate effects was found when comparing the memory probe-evoked response on trials that were vs. were not preceded by delay-period rTMS. No evidence for cumulative effects was found in analyses of behavioral performance, or of the EEG signal, as a function of task block. The implications of these findings, and their relation to the broader literature on acute vs. long-lasting effects of rTMS, are considered.


Coupling a review of previous studies on the acquisition of grammatical aspect, undertaken from contrasting paradigmatic views of second language acquisition (SLA), with new experimental data from L2 Portuguese, the present study contributes to this specific literature as well as to general debates in L2 epistemology. We tested 31 adult English learners of L2 Portuguese across three experiments, examining the extent to which they had acquired the syntax and (subtle) semantics of grammatical aspect. Many individuals acquired target knowledge of what we contend is a poverty-of-the-stimulus semantic entailment related to the checking of aspectual features encoded in Portuguese preterit and imperfect morphology, namely a [±accidental] distinction that obtains in a restricted subset of contexts. We therefore conclude that UG-based approaches to SLA are better positioned to tap and gauge underlying morphosyntactic competence: because they are based on independent theoretical linguistic descriptions, they make falsifiable predictions amenable to empirical scrutiny, seek to describe and explain beyond performance, and can account for L2 convergence on poverty-of-the-stimulus knowledge as well as for L2 variability/optionality.


In the past decade, the analysis of data has faced the challenge of dealing with very large and complex datasets and with the real-time generation of data. Technologies to store and access these complex and large datasets are in place. However, robust and scalable analysis technologies are needed to extract meaningful information from them. The research field of Information Visualization and Visual Data Analytics addresses this need. Information visualization and data mining are often used to complement each other: their common goal is the extraction of meaningful information from complex and possibly large data. However, whereas data mining relies on silicon hardware, visualization techniques also harness the powerful image-processing capabilities of the human brain. This article highlights research on data visualization and visual analytics techniques, and surveys existing visual analytics techniques, systems, and applications, including a perspective on the field from the chemical process industry.


Experimental philosophy brings empirical methods to philosophy. These methods are used to probe how people think about philosophically interesting things such as knowledge, morality, and freedom. This paper explores the contribution that qualitative methods have to make to this enterprise. I argue that qualitative methods have the potential to make a much greater contribution than they have so far. Along the way, I acknowledge a few types of resistance that proponents of qualitative methods in experimental philosophy might encounter, and provide reasons to think they are ill-founded.