95 results for Issued-based approach
Abstract:
Ancient DNA (aDNA) research has long depended on the power of PCR to amplify trace amounts of surviving genetic material from preserved specimens. While PCR permits specific loci to be targeted and amplified, in many ways it is intrinsically unsuited to damaged and degraded aDNA templates. PCR amplification of aDNA can produce highly skewed distributions with significant contributions from miscoding lesion damage and non-authentic sequence artefacts. As traditional PCR-based approaches have for many years been unable to fully resolve the molecular nature of aDNA damage, we have developed a novel single primer extension (SPEX)-based approach to generate more accurate sequence information. SPEX targets selected template strands at defined loci and can generate a quantifiable redundancy of coverage, providing new insights into the molecular nature of aDNA damage and fragmentation. SPEX sequence data reveal inherent limitations in both traditional and metagenomic PCR-based approaches to aDNA, which can make current damage analyses and correct genotyping of ancient specimens problematic. In contrast to previous aDNA studies, SPEX provides strong quantitative evidence that C→U-type base modifications are the sole cause of authentic endogenous damage-derived miscoding lesions. This new approach could allow ancient specimens to be genotyped with unprecedented accuracy.
Abstract:
In positron emission tomography and single photon emission computed tomography studies using D2 dopamine (DA) receptor radiotracers, a decrease in radiotracer binding potential (BP) is usually interpreted in terms of increased competition with synaptic DA. However, some data suggest that this signal may also reflect agonist (DA)-induced increases in D2 receptor (D2R) internalization, a process which would presumably also decrease the population of receptors available for binding to hydrophilic radioligands. To advance interpretation of alterations in D2 radiotracer BP, direct methods of assessment of D2R internalization are required. Here, we describe a confocal microscopy-based approach for the quantification of agonist-dependent receptor internalization. The method relies upon double-labeling of the receptors with antibodies directed against intracellular as well as extracellular epitopes. Following agonist stimulation, DA D2R internalization was quantified by differentiating, in optical cell sections, the signal due to staining of the extracellular epitopes from that of the intracellular epitopes of D2Rs. Receptor internalization was increased in the presence of the D2 agonists DA and bromocriptine, but not the D1 agonist SKF38393. Pretreatment with either the D2 antagonist sulpiride, or inhibitors of internalization (phenylarsine oxide and high molarity sucrose), blocked D2-agonist-induced receptor internalization, thus validating this method in vitro. This approach therefore provides a direct and streamlined methodology for investigating the pharmacological and mechanistic aspects of D2R internalization, and should inform the interpretation of results from in vivo receptor imaging studies.
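The core quantification step can be illustrated as a simple two-channel ratio. This is a minimal sketch, not the paper's pipeline: the function name, the fluorescence units, and the example intensities are all hypothetical, and the real analysis operates on segmented optical sections rather than two scalar sums.

```python
def internalization_index(extracellular, intracellular):
    """Fraction of total receptor signal that is intracellular.

    `extracellular` and `intracellular` are summed fluorescence
    intensities from the two antibody channels of one optical section
    (hypothetical arbitrary units).
    """
    total = extracellular + intracellular
    if total == 0:
        raise ValueError("no signal in either channel")
    return intracellular / total

# Agonist stimulation should raise the index relative to baseline.
baseline = internalization_index(extracellular=900.0, intracellular=100.0)
stimulated = internalization_index(extracellular=600.0, intracellular=400.0)
```

A rising index under D2 agonists, blocked by sulpiride or internalization inhibitors, is the pattern the abstract reports.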
Abstract:
Motivation: Hydrogen bonds are one of the most important inter-atomic interactions in biology. Previous experimental, theoretical and bioinformatics analyses have shown that the hydrogen bonding potential of amino acids is generally satisfied and that buried unsatisfied hydrogen-bond-capable residues are destabilizing. When studying mutant proteins, or introducing mutations to residues involved in hydrogen bonding, one needs to know whether a hydrogen bond can be maintained. Our aim, therefore, was to develop a rapid method to evaluate whether a sidechain can form a hydrogen bond. Results: A novel knowledge-based approach was developed in which the conformations accessible to the residues involved are taken into account. Residues involved in hydrogen bonds in a set of high-resolution crystal structures were analyzed, and the results of this analysis are then applied to a given protein. The program was applied to assess mutations in the tumour-suppressor protein, p53. This raised the number of distinct mutations identified as disrupting sidechain-sidechain hydrogen bonding from 181 in our previous analysis to 202 in this analysis.
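The paper's method is knowledge-based and conformation-aware; a much cruder geometric feasibility test conveys the flavour of asking "could this pair hydrogen-bond at all?". The distance cutoffs below are typical textbook values, not the paper's criteria, and the coordinates are hypothetical.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3-D points (angstroms)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def can_hydrogen_bond(donor, acceptor, min_d=2.5, max_d=3.5):
    """Crude donor-acceptor feasibility check.

    Returns True when the heavy-atom distance falls inside a typical
    hydrogen-bonding range; a real assessment would also examine
    angles and accessible sidechain rotamers.
    """
    d = distance(donor, acceptor)
    return min_d <= d <= max_d

# A pair 2.9 A apart is plausible; a pair 5 A apart is not.
```

This kind of filter only rules pairs in or out geometrically; the knowledge-based analysis the abstract describes additionally scores which rotamers could satisfy the bond.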
Abstract:
Micromorphological characters of the fruiting bodies, such as ascus-type and hymenial amyloidity, and secondary chemistry have been widely employed as key characters in Ascomycota classification. However, the evolution of these characters has not yet been studied using molecular phylogenies. We have used a combined Bayesian and maximum likelihood based approach to trace character evolution on a tree inferred from a combined analysis of nuclear and mitochondrial ribosomal DNA sequences. The maximum likelihood aspect overcomes simplifications inherent in maximum parsimony methods, whereas the Markov chain Monte Carlo aspect renders results independent of any particular phylogenetic tree. The results indicate that the evolution of the two chemical characters is quite different: the medullary lecanoric acid is stable once developed, whereas the cortical chlorinated xanthones appear to have been lost several times. The current ascus-types and the amyloidity of the hymenial gel in Pertusariaceae appear to have developed within the family. The basal ascus-type of pertusarialean fungi remains unknown. (c) 2006 The Linnean Society of London, Biological Journal of the Linnean Society, 2006, 89, 615-626.
Abstract:
Risk management (RM) comprises risk identification, risk analysis, response planning, monitoring and action planning tasks that are carried out throughout the life cycle of a project in order to ensure that project objectives are met. Although the methodological aspects of RM are well-defined, the philosophical background is rather vague. In this paper, a learning-based approach is proposed. In order to implement this approach in practice, a tool has been developed to facilitate construction of a lessons learned database that contains risk-related information and risk assessments throughout the life cycle of a project. The tool is tested on a real construction project. The case study findings demonstrate that it can be used for storing as well as updating risk-related information and, finally, carrying out a post-project appraisal. The major weaknesses of the tool are the subjectivity of the risk rating process and the unwillingness of people to enter information about the reasons for failure.
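A lessons-learned risk store of the kind described can be sketched in a few lines. The class, its fields, and the probability-times-impact rating scale are illustrative assumptions; the paper's actual tool records richer life-cycle information.

```python
class RiskRegister:
    """Minimal lessons-learned store: risks rated by probability x impact.

    Probability and impact are assumed to be ordinal scores (e.g. 1-5);
    the field names and rating scheme are hypothetical.
    """

    def __init__(self):
        self.records = []

    def add(self, name, probability, impact, lesson=""):
        # A common simple rating: probability times impact.
        self.records.append({"name": name,
                             "rating": probability * impact,
                             "lesson": lesson})

    def top_risks(self, n=3):
        """Return the n highest-rated risks for post-project appraisal."""
        return sorted(self.records, key=lambda r: r["rating"], reverse=True)[:n]

reg = RiskRegister()
reg.add("ground conditions", 4, 5, "extra site surveys needed")
reg.add("design changes", 3, 3)
reg.add("weather delay", 2, 2)
```

The abstract's noted weakness applies directly here: the ratings are only as objective as the people entering them.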
Abstract:
Prebiotics are non-digestible (by the host) food ingredients that have a beneficial effect through their selective metabolism in the intestinal tract. Key to this is the specificity of microbial changes. The present paper reviews the concept in terms of three criteria: (a) resistance to gastric acidity, hydrolysis by mammalian enzymes and gastrointestinal absorption; (b) fermentation by intestinal microflora; (c) selective stimulation of the growth and/or activity of intestinal bacteria associated with health and wellbeing. The conclusion is that prebiotics that currently fulfil these three criteria are fructo-oligosaccharides, galacto-oligosaccharides and lactulose, although promise does exist with several other dietary carbohydrates. Given the range of food vehicles that may be fortified by prebiotics, their ability to confer positive microflora changes and the health aspects that may accrue, it is important that robust technologies to assay functionality are used. This would include a molecular-based approach to determine flora changes. The future use of prebiotics may allow species-level changes in the microbiota, an extrapolation into genera other than the bifidobacteria and lactobacilli, and allow preferential use in disease-prone areas of the body.
Abstract:
This paper describes a framework architecture for the automated re-purposing and efficient delivery of multimedia content stored in CMSs. It deploys specifically designed templates as well as adaptation rules based on a hierarchy of profiles to accommodate user, device and network requirements invoked as constraints in the adaptation process. The user profile provides information in accordance with the opt-in principle, while the device and network profiles provide operational constraints, such as resolution and bandwidth limitations. The profile hierarchy ensures that the adaptation privileges the user's preferences. As part of the adaptation, we took into account support for users' special needs, and therefore adopted a template-based approach that could simplify the adaptation process, integrating accessibility-by-design in the template.
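The profile hierarchy can be pictured as an ordered override: network constraints first, then device, then user, so user preferences win on conflicts. This is a minimal sketch under that assumption; the constraint keys and values are hypothetical, and the real framework resolves constraints with adaptation rules rather than a flat dictionary merge.

```python
def merge_profiles(network, device, user):
    """Resolve adaptation constraints with user preferences taking precedence.

    Each profile is a dict of constraint name -> value; later profiles
    override earlier ones, so the user profile wins on any conflict.
    """
    merged = {}
    for profile in (network, device, user):
        merged.update(profile)
    return merged

constraints = merge_profiles(
    network={"max_bitrate_kbps": 512},
    device={"resolution": "640x480", "max_bitrate_kbps": 2000},
    user={"captions": True, "resolution": "320x240"},
)
```

Note that a plain override is deliberately naive: a production adapter would, for instance, take the minimum of competing bitrate limits instead of letting a later profile raise them.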
Abstract:
Using the classical Parzen window estimate as the target function, the kernel density estimation is formulated as a regression problem and the orthogonal forward regression technique is adopted to construct sparse kernel density estimates. The proposed algorithm incrementally minimises a leave-one-out test error score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights are finally updated using the multiplicative nonnegative quadratic programming algorithm, which has the ability to reduce the model size further. Except for the kernel width, the proposed algorithm has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Two examples are used to demonstrate the ability of this regression-based approach to effectively construct a sparse kernel density estimate with comparable accuracy to that of the full-sample optimised Parzen window density estimate.
Abstract:
This paper addresses the crucial problem of wayfinding assistance in Virtual Environments (VEs). A number of navigation aids such as maps, agents, trails and acoustic landmarks are available to support the user for navigation in VEs; however, most of these aids are visually dominated. This work-in-progress describes a sound-based approach that intends to assist the task of 'route decision' during navigation in a VE using music. Furthermore, with the use of musical sounds it aims to reduce the cognitive load associated with other visually as well as physically dominated tasks. To achieve these goals, the approach exploits the benefits provided by music to ease and enhance the task of wayfinding, whilst making the user experience in the VE smooth and enjoyable.
Abstract:
Using the classical Parzen window (PW) estimate as the desired response, the kernel density estimation is formulated as a regression problem and the orthogonal forward regression technique is adopted to construct sparse kernel density (SKD) estimates. The proposed algorithm incrementally minimises a leave-one-out test score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights of the selected sparse model are finally updated using the multiplicative nonnegative quadratic programming algorithm, which ensures the nonnegative and unity constraints for the kernel weights and has the desired ability to reduce the model size further. Except for the kernel width, the proposed method has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Several examples demonstrate the ability of this simple regression-based approach to effectively construct a SKD estimate with comparable accuracy to that of the full-sample optimised PW density estimate. (c) 2007 Elsevier B.V. All rights reserved.
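The Parzen window estimate that serves as the desired response is straightforward to write down. The sketch below is one-dimensional with Gaussian kernels for clarity; the function names are illustrative, and the sparse selection and reweighting stages of the paper's algorithm are omitted.

```python
import math

def parzen_window(data, width):
    """Return the full-sample Parzen window density estimate over `data`.

    Places one Gaussian kernel of the given width on every data point
    and averages them; this is the desired response that the
    regression-based method approximates with a sparse kernel subset.
    """
    n = len(data)
    norm = 1.0 / (math.sqrt(2.0 * math.pi) * width)

    def density(x):
        return sum(norm * math.exp(-0.5 * ((x - xi) / width) ** 2)
                   for xi in data) / n

    return density

# The estimate is a proper density: it integrates to ~1 over a wide grid.
p = parzen_window([-1.0, 0.0, 1.0], width=0.5)
grid = [i * 0.01 for i in range(-800, 801)]
mass = sum(p(x) for x in grid) * 0.01
```

A sparse estimate replaces the sum over all n points with a sum over a few selected kernels whose nonnegative weights sum to one, which is what the orthogonal forward regression and quadratic programming stages provide.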
Abstract:
Information technologies are used across all stages of the construction process and are crucial in the delivery of large projects. In this paper, drawing on detailed research on a construction megaproject, we take a practice-based approach to examining the practical and theoretical tensions between existing ways of working and the introduction of new coordination tools. We analyze the new hybrid practices that emerge, using insights from actor-network theory to articulate the delegation of actions to material and digital objects within ecologies of practice. The three vignettes that we discuss highlight this delegation of actions, the "plugging" and "patching" of ecologies occurring across media, and the continual iteration of working practices between different types of media. By shifting the focus from tools to these wider ecologies of practice, the approach has important managerial implications for the stabilization of new technologies and practices and for managing technological change on large construction projects. We conclude with a discussion of new directions for research, oriented to further elaborating the importance of the material in understanding change.
Abstract:
Airborne LIght Detection And Ranging (LIDAR) provides accurate height information for objects on the earth, which has made LIDAR increasingly popular in terrain and land surveying. In particular, LIDAR data offer vital and significant features for land-cover classification, which is an important task in many application domains. In this paper, an unsupervised approach based on an improved fuzzy Markov random field (FMRF) model is developed, by which the LIDAR data, its co-registered images acquired by optical sensors, i.e. aerial color image and near infrared image, and other derived features are fused effectively to improve the ability of the LIDAR system for accurate land-cover classification. In the proposed FMRF model-based approach, the spatial contextual information is applied by modeling the image as a Markov random field (MRF), into which fuzzy logic is introduced simultaneously to reduce the errors caused by hard classification. Moreover, a Lagrange multiplier (LM) algorithm is employed to calculate a maximum a posteriori (MAP) estimate for the classification. The experimental results show that fusing the height data and optical images is particularly suited for land-cover classification. The proposed approach works well for classification from airborne LIDAR data fused with its co-registered optical images, and the average accuracy is improved to 88.9%.
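The role of the MRF prior — trading a data term against agreement with neighbours — can be shown with a toy iterated-conditional-modes pass on a 1-D binary labeling. This is only a stand-in for the idea: the paper's fuzzy MRF model, multi-band features, and Lagrange-multiplier MAP estimate are considerably more elaborate, and all parameter values below are illustrative.

```python
def icm_denoise(observations, beta=1.0, iters=5):
    """Smooth a 1-D binary labeling with a Potts-style neighbour prior.

    Each site's label (0 or 1) is chosen to minimise a data term
    (squared error to the observation, with class means 0 and 1) plus
    `beta` times the number of disagreements with its two neighbours.
    """
    labels = [1 if o > 0.5 else 0 for o in observations]
    for _ in range(iters):
        for i, o in enumerate(observations):
            best, best_cost = labels[i], float("inf")
            for lab in (0, 1):
                cost = (o - lab) ** 2  # data term
                if i > 0 and labels[i - 1] != lab:
                    cost += beta       # disagreement with left neighbour
                if i < len(labels) - 1 and labels[i + 1] != lab:
                    cost += beta       # disagreement with right neighbour
                if cost < best_cost:
                    best, best_cost = lab, cost
            labels[i] = best
    return labels

# With the prior active, an isolated noisy spike is smoothed away;
# with beta=0 the result is just the per-site (hard) classification.
clean = icm_denoise([0.1, 0.2, 0.9, 0.1, 0.0], beta=1.0)
```

The fuzzy extension in the paper replaces the hard 0/1 labels with membership degrees, which is what reduces the errors that hard classification introduces at class boundaries.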
Abstract:
Brand competition is modelled using an agent-based approach in order to examine the long-run dynamics of market structure and brand characteristics. A repeated game is designed in which myopic firms choose strategies based on beliefs about their rivals and consumers. Consumers are heterogeneous and can observe neighbour behaviour through social networks. Although firms do not observe them, the social networks have a significant impact on the emerging market structure. The presence of networks tends to polarize market share and leads to higher volatility in brands. Yet convergence in brand characteristics usually occurs whenever the market reaches a steady state. Scale-free networks accentuate the polarization and volatility more than small-world or random networks. Unilateral innovations are less frequent under social networks.
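The consumer-side mechanism — observing neighbours through a network — can be illustrated with a toy majority-imitation rule on a ring. This sketches only the social-network channel; the firms' repeated game, consumer heterogeneity, and the scale-free/small-world topologies of the paper are omitted, and the update rule is an assumption for illustration.

```python
import random

def imitate_step(choices):
    """One round of neighbour observation on a ring network.

    Each consumer adopts the majority brand (0 or 1) among itself and
    its two ring neighbours -- a crude local-imitation dynamic that
    tends to produce polarized blocks of identical choices.
    """
    n = len(choices)
    new = []
    for i in range(n):
        trio = [choices[i - 1], choices[i], choices[(i + 1) % n]]
        new.append(1 if sum(trio) >= 2 else 0)
    return new

# Run the dynamic from a random initial brand assignment.
rng = random.Random(0)
choices = [rng.randint(0, 1) for _ in range(20)]
for _ in range(10):
    choices = imitate_step(choices)
```

Even this crude rule shows why networks polarize market share: local imitation amplifies initial clusters of adopters rather than averaging them away.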
Abstract:
The work reported in this paper is motivated by the need to handle single-node failures for parallel summation algorithms in computer clusters. An agent-based approach is proposed in which a task to be executed is decomposed into sub-tasks and mapped onto agents that traverse computing nodes. The agents intercommunicate across computing nodes to share information during the event of a predicted node failure. Two single-node failure scenarios are considered. The Message Passing Interface is employed for implementing the proposed approach. Quantitative results obtained from experiments reveal that the agent-based approach can handle failures more efficiently than traditional failure-handling approaches.
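The decomposition-and-migration idea can be sketched as a plain-Python simulation. This is not the paper's MPI implementation: the migration rule (move to the next healthy node) and the function name are assumptions made for illustration, and only the single-node failure case the paper targets is meaningful here.

```python
def agent_sum(data, num_agents, failed_nodes=()):
    """Simulate agent-based summation with single-node failure handling.

    The array is decomposed into one sub-task per agent; each agent is
    initially mapped to its own node, and an agent whose node appears
    in `failed_nodes` migrates to the next healthy node and completes
    its partial sum there, so no data is lost.
    """
    chunk = (len(data) + num_agents - 1) // num_agents
    partials = []
    for a in range(num_agents):
        node = a  # initial mapping: agent a -> node a
        while node in failed_nodes:
            node = (node + 1) % num_agents  # migrate to next healthy node
        partials.append(sum(data[a * chunk:(a + 1) * chunk]))
    return sum(partials)

# The result is unaffected by a single predicted node failure.
total = agent_sum(list(range(100)), num_agents=4, failed_nodes={2})
```

In the real system the migration is carried out by agents exchanging state over MPI before the predicted failure occurs; here it is reduced to a remapping so the invariant (the sum is preserved) is easy to see.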