917 results for Approach to CSR development


Relevance: 100.00%

Abstract:

A novel methodology is described in which transcriptomics is combined with the measurement of bread-making quality and other agronomic traits for wheat genotypes grown in different environments (wet and cool or hot and dry conditions) to identify transcripts associated with these traits. Seven doubled haploid lines from the Spark x Rialto mapping population were selected to be matched for development and known alleles affecting quality. These were grown in polytunnels with different environments applied 14 days post-anthesis, and the whole experiment was repeated over 2 years. Transcriptomics using the wheat Affymetrix chip was carried out on whole caryopsis samples at two stages during grain filling. Transcript abundance was correlated with the traits for approximately 400 transcripts. About 30 of these were selected as being of most interest, and markers were derived from them and mapped using the population. Expression was identified as being under cis control for 11 of these and under trans control for 18. These transcripts are candidates for involvement in the biological processes which underlie genotypic variation in these traits.
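
A minimal sketch of the kind of trait-transcript correlation screen the abstract describes, assuming abundances and trait values are held in arrays; all names, sizes and the significance cut-off are illustrative, not taken from the study:

```python
# Minimal sketch: correlate transcript abundance with a quality trait
# across samples (genotype x environment x year combinations).
# All names, sizes and the cut-off are illustrative, not from the paper.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_samples, n_transcripts = 28, 1000        # e.g. 7 lines x 2 envs x 2 years
abundance = rng.normal(size=(n_samples, n_transcripts))
trait = abundance[:, 0] * 0.8 + rng.normal(scale=0.5, size=n_samples)

hits = []
for t in range(n_transcripts):
    r, p = pearsonr(abundance[:, t], trait)
    if p < 0.01:                           # simple significance cut-off
        hits.append((t, r, p))

print(f"{len(hits)} trait-associated transcripts")
```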

Relevance: 100.00%

Abstract:

Details about the parameters of kinetic systems are crucial for progress in both medical and industrial research, including drug development, clinical diagnosis and biotechnology applications. Such details must be collected through a series of kinetic experiments and investigations. Correct experimental design is essential to collecting data suitable for analysis, modelling and deriving the correct information. We have developed a systematic and iterative Bayesian method, and sets of rules, for the design of enzyme kinetic experiments. Our method selects the optimum design for collecting data suitable for accurate modelling and analysis, and minimises the error in the estimated parameters. The rules select features of the design such as the substrate range and the number of measurements. We show here that this method can be directly applied to the study of other important kinetic systems, including drug transport, receptor binding, microbial culture and cell transport kinetics. It is possible to reduce the errors in the estimated parameters and, most importantly, to increase efficiency and cost-effectiveness by reducing the number of experiments and data points measured. (C) 2003 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
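
As one way to picture how a candidate design can be scored before any experiment is run, here is a hedged sketch that approximates expected parameter variances for Michaelis-Menten kinetics from a locally linearised information matrix; this is a generic design-evaluation device, not the paper's Bayesian utility functions, and all values are invented:

```python
# Sketch of design evaluation for Michaelis-Menten kinetics: approximate
# the parameter covariance from the Jacobian of the model at candidate
# substrate concentrations (a local approximation, not fully Bayesian).
import numpy as np

def expected_cov(S, Vmax, Km, sigma=0.05):
    S = np.asarray(S, dtype=float)
    dVmax = S / (Km + S)                     # dv/dVmax
    dKm = -Vmax * S / (Km + S) ** 2          # dv/dKm
    J = np.column_stack([dVmax, dKm])
    return sigma ** 2 * np.linalg.inv(J.T @ J)

even = np.linspace(0.1, 10.0, 10)            # even spread across the range
skewed = np.concatenate([np.linspace(0.1, 1.0, 6),   # most points near/below Km
                         np.linspace(2.0, 10.0, 4)])
for name, design in [("even", even), ("skewed", skewed)]:
    cov = expected_cov(design, Vmax=1.0, Km=1.0)
    print(name, "var(Vmax)=%.4g var(Km)=%.4g" % (cov[0, 0], cov[1, 1]))
```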

Relevance: 100.00%

Abstract:

Purpose: Acquiring details of the kinetic parameters of enzymes is crucial to biochemical understanding, drug development, and clinical diagnosis in ocular diseases. The correct design of an experiment is critical to collecting data suitable for analysis, modelling and deriving the correct information. As classical design methods are not targeted at the more complex kinetics now frequently studied, attention is needed to estimate the parameters of such models with low variance. Methods: We have developed Bayesian utility functions to minimise kinetic parameter variance, involving differentiation of model expressions and matrix inversion. These have been applied to the simple kinetics of the enzymes in the glyoxalase pathway (of importance in the post-translational modification of proteins in cataract) and to the complex kinetics of lens aldehyde dehydrogenase (also of relevance to cataract). Results: Our successful application of Bayesian statistics has allowed us to identify a set of rules for designing optimum kinetic experiments iteratively. Most importantly, the distribution of points across the range is critical; it is not simply a matter of even spacing or spacing by multiples. At least 60% of the measurements must lie below the KM (or below each KM, where there is more than one dissociation constant) and 40% above. This choice halves the variance obtained with a simple even spread across the range. With both the glyoxalase system and lens aldehyde dehydrogenase we have significantly improved the variance of the kinetic parameter estimates while reducing the number and cost of experiments. Conclusions: We have developed an optimal and iterative method for selecting features of the design such as the substrate range, the number of measurements and the choice of intermediate points. Our novel approach minimises parameter error and costs, and maximises experimental efficiency. It is applicable to many areas of ocular drug design, including receptor-ligand binding and immunoglobulin binding, and should be an important tool in ocular drug discovery.
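
A small sketch of how the reported 60/40 rule could translate into a concrete set of substrate concentrations; the function and its arguments are hypothetical, and the paper's iterative Bayesian procedure is not reproduced:

```python
# Illustrative placement of substrate points following the reported rule:
# ~60% of measurements below KM and ~40% above (names hypothetical).
import numpy as np

def rule_based_design(km, n_points, s_min, s_max, frac_below=0.6):
    n_low = int(round(frac_below * n_points))
    low = np.linspace(s_min, km, n_low, endpoint=False)   # strictly below KM
    high = np.linspace(km, s_max, n_points - n_low)       # KM and above
    return np.concatenate([low, high])

print(rule_based_design(km=0.5, n_points=10, s_min=0.05, s_max=5.0))
```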

Relevance: 100.00%

Abstract:

This paper introduces a knowledge-based management prototype entitled E+ for environmental-conscious construction, based on an integration of current environmental management tools in the construction area. The overall objective of developing the E+ prototype is to facilitate the selective reuse of retrievable construction engineering and management knowledge assembled from previous projects, in support of best practice in environmental-conscious construction. The methodologies adopted in previous and ongoing research related to the development of E+ belong to the operations research and information technology areas, and include literature review, questionnaire survey and interview, statistical analysis, system analysis and development, and experimental research and simulation. The content presented in this paper includes an advanced E+ prototype, a comprehensive review of the environmental management tools integrated into the E+ prototype, and an experimental case study of the implementation of the E+ prototype. It is expected that the adoption and implementation of the E+ prototype can effectively help contractors to improve their environmental performance across the lifecycle of project-based construction and to reduce the adverse environmental impacts of the various engineering and management processes deployed at each construction stage.

Relevance: 100.00%

Abstract:

There are three key driving forces behind the development of Internet Content Management Systems (CMS) - a desire to manage the explosion of content, a desire to provide structure and meaning to content in order to make it accessible, and a desire to work collaboratively to manipulate content in some meaningful way. Yet the traditional CMS has been unable to meet the last of these requirements, often failing to provide sufficient tools for collaboration in a distributed context. Peer-to-Peer (P2P) systems are networks in which every node is an equal participant (whether transmitting data, exchanging content, or invoking services) and there is an absence of any centralised administrative or coordinating authorities. P2P systems are inherently more scalable than equivalent client-server implementations as they tend to use resources at the edge of the network much more effectively. This paper details the rationale and design of a P2P middleware for collaborative content management.
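
To make the equal-participant idea concrete, here is a toy in-process simulation of peers that both store content and query their neighbours with no central coordinator; it illustrates the P2P principle only, not the middleware the paper designs:

```python
# Toy illustration of the P2P idea: every peer both stores content and
# queries its neighbours, with no central coordinator. An in-process
# simulation, not the middleware described in the paper.
class Peer:
    def __init__(self, name):
        self.name = name
        self.store = {}              # local content, keyed by id
        self.neighbours = []

    def publish(self, key, value):
        self.store[key] = value

    def fetch(self, key, ttl=3):
        if key in self.store:
            return self.store[key]
        if ttl == 0:                 # hop limit stops runaway queries
            return None
        for n in self.neighbours:    # flood the query to neighbours
            hit = n.fetch(key, ttl - 1)
            if hit is not None:
                self.store[key] = hit    # cache at the network edge
                return hit
        return None

a, b, c = Peer("a"), Peer("b"), Peer("c")
a.neighbours, b.neighbours = [b], [c]
c.publish("doc1", "shared draft")
print(a.fetch("doc1"))   # found via b -> c, then cached locally at a
```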

Relevance: 100.00%

Abstract:

This paper describes a technique that can be used as part of a simple and practical agile method for requirements engineering. It is based on disciplined goal-responsibility modelling but eschews formality in favour of a set of practicality objectives. The technique can be used together with Agile Programming to develop software in internet time. We illustrate the technique and introduce lazy refinement, responsibility composition and context sketching. Goal sketching has been used in a number of real-world developments.
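
As an illustration of the kind of structure a goal sketch captures, here is a minimal goal-responsibility graph in which refinement is applied lazily; the class, helper and example goals are invented, not taken from the paper:

```python
# A minimal sketch of a goal-responsibility graph such as goal sketching
# might produce; node names and the refine() helper are illustrative.
from dataclasses import dataclass, field

@dataclass
class Goal:
    text: str
    children: list = field(default_factory=list)
    responsibility: str | None = None    # agent assigned at a leaf

    def refine(self, *subgoals):         # "lazy refinement": expand on demand
        self.children.extend(subgoals)
        return self

root = Goal("Orders are fulfilled within 24h").refine(
    Goal("Stock levels are visible", responsibility="inventory service"),
    Goal("Courier is booked automatically", responsibility="dispatch team"),
)
print(root.children[0].responsibility)   # -> "inventory service"
```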

Relevance: 100.00%

Abstract:

This paper argues for the relevance of paying attention to how participation processes are structured across scales, as one way in which the participation of multi-organisational partnerships involving conflicting interests might be managed. Substantively, the paper deals with problems of land mobilisation for road widening in complex, concentrated, high-value urban settings. It discusses a case study of plan implementation involving individual landowners, the land development market, the local government, other governmental and non-governmental organisations and the state government, which together achieved objectives that seemed impossible at first sight. In theoretical terms, the paper engages with Jessop's (2001) Strategic-Relational Approach (SRA), arguing for its potential to inform action in a way that is capable of achieving steering outputs. The claim for SRA is demonstrated by re-examining the case study. The factors that emerge as SRA is applied are drawn out, and it is suggested that the theory, though non-deterministic, helps guide action by highlighting certain dynamics of systems that can be used for institutional intervention. These dynamics point to the importance of paying attention to scale, and to the way in which participation and negotiation processes are structured so as to favour certain outcomes over others.

Relevance: 100.00%

Abstract:

1. Species-based indices are frequently employed as surrogates for wider biodiversity health and as measures of environmental condition. Species selection is crucial in determining an indicator's metric value, and hence the validity of the interpretation of ecosystem condition and function it provides, yet an objective process for identifying appropriate indicator species is frequently lacking.
2. An effective indicator needs to (i) be representative, reflecting the status of wider biodiversity; (ii) be reactive, acting as an early-warning system for detrimental changes in environmental conditions; and (iii) respond to change in a predictable way. We present an objective, niche-based approach for species selection, founded on a coarse categorisation of species' niche space and key resource requirements, which ensures the resultant indicator has these key attributes.
3. We use UK farmland birds as a case study to demonstrate this approach, identifying an optimal indicator set containing 12 species. In contrast to the 19 species included in the farmland bird index (FBI), a key UK biodiversity indicator that contributes to one of the UK Government's headline indicators of sustainability, the niche space occupied by these 12 species fully encompasses that occupied by the wider community of 62 species.
4. We demonstrate that the response of these 12 species to land-use change correlates strongly with that of the wider farmland bird community. Furthermore, the temporal dynamics of an index based on their population trends closely match the population dynamics of the wider community. However, in both analyses the magnitude of the change in our indicator was significantly greater, allowing it to act as an early-warning system.
5. Ecological indicators are embedded in environmental management, sustainable development and biodiversity conservation policy and practice, where they act as metrics against which progress towards national, regional and global targets can be measured. Adopting this niche-based approach to the objective selection of indicator species will facilitate the development of sensitive and representative indices for a range of taxonomic groups, habitats and spatial scales.
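
For readers unfamiliar with how such composite indices are computed, here is a hedged sketch of a geometric-mean index of per-species population trends, which is broadly how the UK farmland bird index is constructed; the counts are invented:

```python
# Sketch of a composite species index of the kind described: the
# geometric mean of per-species population indices relative to a base
# year (the UK farmland bird index is built along these lines).
import numpy as np

def composite_index(counts):
    """counts: array of shape (n_species, n_years)."""
    rel = counts / counts[:, [0]]             # index each species to year 0
    return np.exp(np.log(rel).mean(axis=0))   # geometric mean across species

counts = np.array([[100,  90,  70],
                   [ 50,  55,  52],
                   [200, 150,  90]], dtype=float)
print(composite_index(counts))   # -> [1.0, ~0.91, ~0.69], a declining index
```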

Relevance: 100.00%

Abstract:

Housing in the UK accounts for 30.5% of all energy consumed and is responsible for 25% of all carbon emissions. The UK Government's Code for Sustainable Homes requires all new homes to be zero carbon by 2016. The development and widespread diffusion of low and zero carbon (LZC) technologies is recognised as a key means for housing developers to deliver against this zero-carbon agenda. The innovation challenge of designing and incorporating these technologies into housing developers' standard design and production templates will usher in significant technical and commercial risks. In this paper we report early results from an ongoing Engineering and Physical Sciences Research Council project looking at the innovation logic and trajectory of LZC technologies in new housing. The principal theoretical lens for the research is the socio-technical network approach, which considers actors' interests and interpretative flexibilities around technologies, and how actors negotiate and reproduce 'acting spaces' to shape, in this case, the selection and adoption of LZC technologies. The initial findings reveal that the technology networks forming around new housing developments are very complex, involving a range of actors and viewpoints that vary for each housing development.

Relevance: 100.00%

Abstract:

The building sector is one of the highest consumers of energy in the world. This has led to a heavy dependence on fossil fuels for energy supply, without due consideration of the environmental impact. Saudi Arabia has been through rapid development accompanied by population growth, which in turn has increased the demand for construction. However, this fast development has proceeded without consideration of sustainable building design. General design practice relies on international design approaches and features that ignore the local climate and traditional passive design, producing buildings with large areas of glass fully exposed to solar radiation. The aim of this paper is to investigate the development of sustainability in passive design and vernacular architecture, and to compare them with current buildings in Saudi Arabia in terms of making the most of the climate. It also explores the renewable energy sources best suited to reducing the environmental impact of modern buildings in Saudi Arabia. This is carried out using case studies demonstrating the performance of vernacular design in Saudi Arabia, and thus its benefits in terms of environmental, economic and social sustainability. The paper argues that a hybrid approach, combining passive design, lessons from vernacular architecture and innovative sustainable technologies, can improve energy efficiency and reduce the carbon footprint of buildings.

Relevance: 100.00%

Abstract:

Induction of classification rules is one of the most important technologies in data mining. Most of the work in this field has concentrated on the Top Down Induction of Decision Trees (TDIDT) approach. However, alternative approaches have been developed such as the Prism algorithm for inducing modular rules. Prism often produces qualitatively better rules than TDIDT but suffers from higher computational requirements. We investigate approaches that have been developed to minimize the computational requirements of TDIDT, in order to find analogous approaches that could reduce the computational requirements of Prism.
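
As a reference point, here is a compact sketch of Prism's core loop (after Cendrowska's original formulation): rules are grown one attribute-value test at a time by maximising the target-class probability among the instances still covered; the toy dataset and helper names are illustrative:

```python
# Minimal sketch of Prism's core loop: grow one modular rule at a time
# for a target class by greedily adding the attribute-value test with
# the highest target-class probability on the still-covered instances.
def learn_rules_for_class(rows, target, label_key="label"):
    rows = list(rows)
    rules = []
    while any(r[label_key] == target for r in rows):
        covered, rule = rows, []
        while covered and any(r[label_key] != target for r in covered):
            best = max(
                ((k, v) for r in covered for k, v in r.items()
                 if k != label_key and (k, v) not in rule),
                key=lambda kv: sum(1 for r in covered
                                   if r[kv[0]] == kv[1]
                                   and r[label_key] == target)
                             / sum(1 for r in covered if r[kv[0]] == kv[1]),
            )
            rule.append(best)                     # specialise the rule
            covered = [r for r in covered if r[best[0]] == best[1]]
        rules.append(rule)
        # remove instances the finished rule covers, then induce the next rule
        rows = [r for r in rows if not all(r[k] == v for k, v in rule)]
    return rules

data = [{"outlook": "sunny", "windy": "no",  "label": "play"},
        {"outlook": "sunny", "windy": "yes", "label": "stay"},
        {"outlook": "rain",  "windy": "no",  "label": "stay"}]
print(learn_rules_for_class(data, "play"))
# -> [[('outlook', 'sunny'), ('windy', 'no')]]
```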

Relevance: 100.00%

Abstract:

The paper develops a more precise specification and understanding of the process of national-level knowledge accumulation and absorptive capabilities by applying the reasoning and evidence from the firm-level analysis pioneered by Cohen and Levinthal (1989, 1990). In doing so, we acknowledge that significant cross-border effects arise from the role of both inward and outward FDI, and that the assimilation of foreign knowledge is not confined to catching-up economies but is also carried out by countries at the frontier-sharing phase. We postulate a non-linear relationship between national absorptive capacity and the technological gap, due to the cumulative nature of the learning process and the increase in the complexity of external knowledge as the country approaches the technological frontier. We argue that national absorptive capacity and the accumulation of the knowledge stock are simultaneously determined. This implies that different phases of technological development require different strategies. During the catching-up phase, knowledge accumulation occurs predominantly through the absorption of trade- and/or inward-FDI-related R&D spillovers. From the pre-frontier-sharing phase onwards, increases in the knowledge base occur largely through independent knowledge creation and by actively accessing foreign-located technological spillovers, inter alia through outward-FDI-related R&D, joint ventures and strategic alliances.

Relevance: 100.00%

Abstract:

In this paper we propose an alternative model of what is often called land value capture in the planning system. Based on development viability models, negotiations and policy formation regarding the level of planning obligations have taken place at the local level with little clear guidance on technique, approach and method. It is argued that current approaches are regressive and fail to reflect how a site's ability to generate planning gain can vary over time and between sites. The alternative approach suggested here attempts to rationalise, rather than replace, the existing practice of development viability appraisal. It is based on the assumption that schemes with similar development values should produce similar levels of return to the landowner, developer and other stakeholders in the development, as well as similar levels of planning obligations, in all parts of the country. Given the high level of input uncertainty in viability modelling, a simple viability model is 'good enough' to quantify the maximum level of planning obligations for a given level of development value. We argue that such an approach can deliver a more durable, equitable, simpler, more consistent and cheaper method of policy formation regarding planning obligations.
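
To illustrate the underlying arithmetic, here is a hedged sketch of the simple residual calculation on which development viability models rest; the function, its parameters and all figures are invented for illustration:

```python
# Illustrative residual viability calculation: the headroom available for
# planning obligations = development value - build costs - developer
# return - benchmark land value. All figures are invented.
def max_planning_obligations(gdv, build_costs, developer_return_pct,
                             benchmark_land_value):
    developer_return = gdv * developer_return_pct
    residual = gdv - build_costs - developer_return - benchmark_land_value
    return max(residual, 0.0)

# Two schemes with similar development values produce similar headroom:
print(max_planning_obligations(10_000_000, 6_000_000, 0.17, 1_200_000))
print(max_planning_obligations(10_200_000, 6_100_000, 0.17, 1_250_000))
```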

Relevance: 100.00%

Abstract:

Understanding the sources of systematic errors in climate models is challenging because of coupled feedbacks and compensating errors. The developing seamless approach proposes that identifying and correcting short-term climate model errors has the potential to improve the modelled climate on longer time scales. In previous studies, initialised atmospheric simulations of a few days have been used to compare fast physics processes (convection, cloud processes) among models. The present study explores how initialised seasonal-to-decadal hindcasts (re-forecasts) relate transient week-to-month errors of the ocean and atmospheric components to the coupled model's long-term pervasive SST errors. A protocol is designed to attribute the SST biases to their source processes. It includes five steps: (1) identify and describe biases in a coupled stabilised simulation; (2) determine the time scale of the advent of each bias and its propagation; (3) find the geographical origin of the bias; (4) evaluate the degree of coupling in the development of the bias; (5) find the field responsible for the bias. This strategy has been implemented with a set of experiments based on the initial adjustment of initialised simulations, exploring various degrees of coupling. In particular, hindcasts give the time scale over which biases arise, regionally restored experiments show their geographical origin, and ocean-only simulations isolate the field responsible for each bias and evaluate the degree of coupling in its development. The strategy is applied to four prominent SST biases of the IPSL-CM5A-LR coupled model in the tropical Pacific that are largely shared by other coupled models, including the Southeast Pacific warm bias and the equatorial cold tongue bias. Using the proposed protocol, we demonstrate that the East Pacific warm bias appears within a few months and is caused by a lack of upwelling due to meridional coastal winds off Peru that are too weak. The cold equatorial bias, which surprisingly takes 30 years to develop, is the result of equatorward advection of midlatitude cold SST errors. Despite large development efforts, the current generation of coupled models shows only little improvement. The strategy proposed in this study is a further step towards moving from the current ad hoc approach to a bias-targeted, priority-setting, systematic approach to model development.
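
As a toy illustration of step (2) of the protocol, the sketch below dates the advent of a bias by tracking how a hindcast SST error grows with lead time; the arrays are synthetic stand-ins, not model output:

```python
# Sketch of step (2): track how the SST bias grows with hindcast lead
# time to date its advent. Synthetic stand-ins for hindcasts and an
# observed climatology; numbers are invented.
import numpy as np

lead_months = np.arange(1, 61)                  # 5 years of monthly leads
obs_sst = 24.0                                  # regional mean SST, degC
hindcast_sst = 24.0 - 1.5 * (1 - np.exp(-lead_months / 6.0))  # toy drift

bias = hindcast_sst - obs_sst
advent = lead_months[np.argmax(np.abs(bias) > 1.0)]  # first lead > 1 degC
print(f"bias exceeds 1 degC after {advent} months")
```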

Relevance: 100.00%

Abstract:

Advances in hardware technologies allow data to be captured and processed in real time, and the resulting high-throughput data streams require novel data mining approaches. The research area of Data Stream Mining (DSM) is developing data mining algorithms that allow us to analyse these continuous streams of data in real time. The creation and real-time adaptation of classification models from data streams is one of the most challenging DSM tasks. Current classifiers for streaming data address this problem by using incremental learning algorithms. However, even though these algorithms are fast, they are challenged by high-velocity data streams, where data instances arrive at a fast rate. This is problematic if the application requires little or no delay between changes in the patterns of the stream and the absorption of these patterns by the classifier. For static (non-streaming) datasets, the scalability problems of traditional data mining algorithms on Big Data have been addressed through the development of parallel classifiers. However, there is very little work on the parallelisation of data stream classification techniques. In this paper we investigate K-Nearest Neighbours (KNN) as the basis for a real-time adaptive and parallel methodology for scalable data stream classification tasks.
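
A minimal sketch of the starting point such work implies: KNN over a sliding window, which bounds memory and lets the classifier track drift; the class is illustrative, and the paper's parallelisation is not shown:

```python
# Minimal sketch of KNN over a sliding window as a basis for adaptive
# stream classification; old instances fall off the back of the window,
# so the model tracks drift. The paper's parallel design is not shown.
from collections import deque, Counter
import math

class WindowKNN:
    def __init__(self, k=3, window=1000):
        self.k = k
        self.window = deque(maxlen=window)   # bounded instance store

    def learn(self, x, y):
        self.window.append((x, y))           # oldest instances expire

    def predict(self, x):
        nearest = sorted(self.window,
                         key=lambda item: math.dist(item[0], x))[: self.k]
        return Counter(y for _, y in nearest).most_common(1)[0][0]

clf = WindowKNN(k=3, window=500)
for x, y in [((0.1, 0.2), "a"), ((0.9, 0.8), "b"),
             ((0.2, 0.1), "a"), ((0.8, 0.9), "b")]:
    clf.learn(x, y)
print(clf.predict((0.15, 0.15)))   # -> "a"
```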