953 results for "Context-sensitive analysis"
Abstract:
Standard form contracts are typically developed through a negotiated consensus, unless they are proffered by one specific interest group. Previously published plans of work and other descriptions of the processes in construction projects tend to focus on operational issues, or they tend to be prepared from the point of view of one or other of the dominant interest groups. Legal practice in the UK permits those who draft contracts to define their terms as they choose. There are no definitive rulings from the courts that give an indication as to the detailed responsibilities of project participants. The science of terminology offers useful guidance for discovering and describing terms and their meanings in their practical context, but has never been used for defining terms for responsibilities of participants in the construction project management process. Organizational analysis enables the management task to be deconstructed into its elemental parts in order that effective organizational structures can be developed. Organizational mapping offers a useful technique for reducing text-based descriptions of project management roles and responsibilities to a comparable basis. Research was carried out by means of a desk study, detailed analysis of nine plans of work and focus groups representing all aspects of the construction industry. No published plan of work offers definitive guidance. There is an enormous amount of variety in the way that terms are used for identifying responsibilities of project participants. A catalogue of concepts and terms (a “Terminology”) has been compiled and indexed to enable those who draft contracts to choose the most appropriate titles for project participants. The purpose of this terminology is to enable the selection and justification of appropriate terms in order to help define roles. 
The terminology brings an unprecedented clarity to the description of roles and responsibilities in construction projects and, as such, will be helpful for anyone seeking to assemble a team and specify roles for project participants.
Abstract:
Facilitating the visual exploration of scientific data has received increasing attention in the past decade or so. Especially in life-science-related application areas, the amount of available data has grown at a breathtaking pace. In this paper we describe an approach that allows for visual inspection of large collections of molecular compounds. In contrast to classical visualizations of such spaces, we incorporate a specific focus of analysis, for example the outcome of a biological experiment such as high-throughput screening results. The presented method uses this experimental data to select molecular fragments of the underlying molecules that have interesting properties, and uses the resulting space to generate a two-dimensional map based on a singular value decomposition algorithm and a self-organizing map. Experiments on real datasets show that the resulting visual landscape groups molecules of similar chemical properties in densely connected regions.
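The SVD step of such a pipeline can be sketched in a few lines. The fragment-descriptor matrix below is randomly generated and all names are illustrative assumptions, not the paper's actual data or code:

```python
import numpy as np

# Hypothetical fragment-descriptor matrix: one row per molecule,
# one column per selected molecular fragment (1 = fragment present).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(100, 40)).astype(float)

# Centre the columns, then take the top two singular directions
# as 2D map coordinates, one (x, y) pair per molecule.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
coords = U[:, :2] * S[:2]

print(coords.shape)  # (100, 2)
```

In a full pipeline these coordinates would then seed a self-organizing map; the SVD alone already gives a linear 2D embedding of the fragment space.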
Abstract:
Context: Learning can be regarded as knowledge construction, in which prior knowledge and experience serve as the basis for learners to expand their knowledge base. Such a process of knowledge construction has to take place continuously in order to enhance the learners’ competence in a competitive working environment. As information consumers, individual users demand personalised information provision which meets their own specific purposes, goals, and expectations. Objectives: The current methods in requirements engineering are capable of modelling the common user’s behaviour in the domain of knowledge construction. The users’ requirements can be represented as a case in the defined structure, which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled; however, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing an individual user’s requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on a qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulate, map, configure, and learning content. The results were used as feedback for improvement. Result: The research work has produced an ontology model with a set of techniques which support the functions of profiling users’ requirements, reasoning over requirements patterns, generating workflows from norms, and formulating information provision specifications. Conclusion: The current requirements engineering (RE) approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance RE approaches for modelling the individual user’s needs and discovering the user’s requirements.
Abstract:
Global hydrological models (GHMs) simulate the land-surface hydrological dynamics of continental-scale river basins. Here we describe one such GHM, the Macro-scale Probability-Distributed Moisture model (Mac-PDM.09). The model has undergone a number of revisions since it was last applied in the hydrological literature, and this paper provides a detailed description of the latest version. The main revisions are: (1) the model can be run for n repetitions, which provides more robust estimates of extreme hydrological behaviour; (2) the model can use a gridded field of the coefficient of variation (CV) of daily rainfall for the stochastic disaggregation of monthly precipitation to daily precipitation; and (3) the model can now be forced with daily as well as monthly input climate data. We demonstrate the effect that each of these three revisions has on simulated runoff relative to the model before the revisions were applied. Importantly, we show that when Mac-PDM.09 is forced with monthly input data, a negative runoff bias results relative to daily forcing in regions of the globe where day-to-day variability in relative humidity is high. The runoff bias can reach −80% for a small selection of catchments, although the absolute magnitude of the bias may be small. We therefore recommend that future applications of Mac-PDM.09 using monthly climate forcings acknowledge this bias as a limitation of the model. The performance of Mac-PDM.09 is evaluated by validating simulated runoff against observed runoff for 50 catchments. We also present a sensitivity analysis demonstrating that simulated runoff is considerably more sensitive to the method of PE calculation than to perturbations in the soil moisture and field capacity parameters.
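The CV-driven disaggregation idea can be illustrated with a simple gamma-based sketch. This is not Mac-PDM.09's actual scheme, merely an assumption-laden illustration of drawing wet-day amounts whose spread follows a target CV while preserving the monthly total:

```python
import numpy as np

def disaggregate_month(monthly_total, wet_days, cv, rng):
    """Illustrative disaggregation of a monthly precipitation total into
    daily amounts: wet-day totals are drawn from a gamma distribution
    whose coefficient of variation matches `cv` (gamma CV = 1/sqrt(shape)),
    then rescaled so the days sum exactly to the monthly total."""
    shape = 1.0 / cv**2
    draws = rng.gamma(shape, 1.0, size=wet_days)
    daily = np.zeros(30)                       # a 30-day month, for simplicity
    daily[:wet_days] = draws / draws.sum() * monthly_total
    rng.shuffle(daily)                         # scatter wet days through the month
    return daily

rng = np.random.default_rng(42)
days = disaggregate_month(120.0, wet_days=12, cv=1.5, rng=rng)
print(round(days.sum(), 6))  # 120.0 (the monthly total is preserved by construction)
```

A higher CV concentrates the same monthly total into fewer, heavier days, which is exactly the behaviour a gridded CV field lets the model vary regionally.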
Abstract:
Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to the landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index which is correlated to its topological locality. A popular dimensionality-reduction technique is the space-filling Hilbert curve, as it possesses good locality-preserving properties. However, little comparison exists between the Hilbert curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. The Hilbert curve, Sammon’s mapping and Principal Component Analysis have been used to generate a 1D space with locality-preserving properties. This work provides empirical evidence to support the use of the Hilbert curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2D network model and with a realistic network topology model exhibiting the power-law distribution of node connectivity typical of the Internet. Nearest-neighbour analysis confirms the Hilbert curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results in the realistic network model show that there is scope for improvement, and better techniques to preserve locality information are required.
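The Hilbert mapping at the heart of this comparison can be sketched for the two-landmark case. The encoding below is the classic iterative grid-to-index Hilbert algorithm; the latency values and grid size are invented for illustration:

```python
def hilbert_index(n, x, y):
    """Map grid cell (x, y) on an n x n grid (n a power of two) to its
    distance along the Hilbert curve. Nearby cells tend to receive
    nearby indices, which is the locality property exploited when
    deriving a scalar peer identifier from a landmark vector."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate the quadrant as needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        s //= 2
    return d

# Two landmark latency measurements (ms), quantised to a 256-cell axis each:
latencies = (37, 121)
peer_index = hilbert_index(256, *latencies)   # scalar locality-aware index
```

With more than two landmarks the same idea generalises to a d-dimensional Hilbert encoding, which is where the comparison against Sammon's mapping and PCA becomes interesting.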
Abstract:
Discrepancies between recent global Earth albedo anomaly data obtained from climate models, space observations and ground observations call for a new and better Earth reflectance measurement technique. The SALEX (Space Ashen Light Explorer) instrument is a space-based visible and IR instrument for precise estimation of the global Earth albedo, which measures the ashen light reflected off the shadowed side of the Moon from low Earth orbit. The instrument consists of a conventional 2-mirror telescope and a pair of instruments: a 3-mirror visible imager and an IR bolometer. The performance of this unique multi-channel optical system is sensitive to stray light contamination, owing to the complex optical train incorporating several reflecting and refracting elements, their associated mounts and the payload’s mechanical enclosure. This can be further aggravated by the very bright and extended observation target (i.e. the Moon). In this paper, we report the details of an extensive stray light analysis, including ghosts and cross-talk, leading to the optimum set of stray light precautions for the highest attainable signal-to-noise ratio.
Abstract:
Poverty, as defined within development discourse, does not fully capture the reality in which the poor live, which is also formed by values and beliefs specific to a given culture and setting. This article uses a memetic approach to investigate the reality of poverty among pastoralists and urban dwellers in Kenya. By distinguishing between the semantic space and the cultural context in which definitions are framed, this approach enables the researcher to make sufficient generalisations while also recognising the differences between cultures. The results demonstrate how pastoralists and urban dwellers conceptualise poverty differently, particularly with regard to causes. Further, the article suggests that development actors often employ a Western construct which does not entirely reflect the values and beliefs of the poor.
Abstract:
A range of funding schemes and policy instruments exist to enhance the landscapes and habitats of the UK. While a number of assessments of these mechanisms have been conducted, little research has been undertaken to compare their relative effectiveness, both quantitatively and qualitatively, across a range of criteria. It is argued that few tools are available for such a multi-faceted evaluation of effectiveness. A form of Multiple Criteria Decision Analysis (MCDA) is justified and utilized as a framework in which to evaluate the effectiveness of nine mechanisms in relation to the protection of existing areas of chalk grassland and the creation of new areas in the South Downs of England. These include established schemes, such as the Countryside Stewardship and Environmentally Sensitive Area Schemes, along with other less common mechanisms, for example land purchase and tender schemes. The steps involved in applying an MCDA to evaluate such mechanisms are identified and the process is described. Quantitative results from the comparison of the effectiveness of different mechanisms are presented, although the broader aim of the paper is to demonstrate the performance of MCDA as a tool for measuring the effectiveness of mechanisms aimed at landscape and habitat enhancement.
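The scoring step of a simple weighted-sum MCDA can be sketched as follows. The mechanisms, criteria, weights and scores below are invented for illustration and do not reproduce the paper's evaluation:

```python
# Illustrative weighted-sum MCDA: mechanisms scored 0-10 against each
# criterion; criteria weights sum to 1. All numbers are hypothetical.
criteria_weights = {"habitat_gain": 0.40, "cost": 0.35, "uptake": 0.25}

mechanism_scores = {
    "Countryside Stewardship": {"habitat_gain": 7, "cost": 5, "uptake": 8},
    "Land purchase":           {"habitat_gain": 9, "cost": 2, "uptake": 4},
    "Tender scheme":           {"habitat_gain": 6, "cost": 6, "uptake": 6},
}

def weighted_score(scores, weights):
    """Aggregate a mechanism's per-criterion scores into one value."""
    return sum(scores[c] * w for c, w in weights.items())

ranked = sorted(mechanism_scores,
                key=lambda m: weighted_score(mechanism_scores[m],
                                             criteria_weights),
                reverse=True)
print(ranked[0])  # Countryside Stewardship (highest score under these weights)
```

In practice the interesting part of the MCDA is eliciting defensible weights and scores from stakeholders; the aggregation itself, as shown, is trivial.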
Abstract:
Despite the wide use of Landscape Character Assessment (LCA) as a tool for landscape planning in NW Europe, there are few examples of its application in the Mediterranean. This paper reports the results of developing a typology for LCA in a study area of northern Sardinia, Italy, to provide a spatial framework for the analysis of current patterns of cork oak distribution and the future restoration of this habitat. Landscape units were derived from a visual interpretation of map data stored within a GIS describing the physical and cultural characteristics of the study area. The units were subsequently grouped into Landscape Types according to the similarity of shared attributes using Two-Way Indicator Species Analysis (TWINSPAN). The preliminary results showed that the methodology classified distinct Landscape Types but, based on field observations, there is a need for further refinement of the classification. The distribution and properties of the two main cork oak habitat types, namely woodlands and wood pastures, were examined within the identified Landscape Types using Patch Analyst. The results show very clearly a correspondence between the distribution of cork oak pastures and cork oak woodland and the landscape types. This forms the basis for developing strategies for the maintenance, restoration and re-creation of these habitat types within the study area, and ultimately for the whole island of Sardinia. Future work is required to improve the landscape characterisation, particularly with respect to cultural factors, and to determine the validity of the landscape spatial framework for the analysis of cork oak distribution as part of a programme of habitat restoration and re-creation.
Abstract:
Pharmacogenetic trials investigate the effect of genotype on treatment response. When there are two or more treatment groups and two or more genetic groups, investigation of gene-treatment interactions is of key interest. However, calculation of the power to detect such interactions is complicated, because this depends not only on the treatment effect size within each genetic group, but also on the number of genetic groups, the size of each genetic group, and the type of genetic effect that is both present and tested for. The scale chosen to measure the magnitude of an interaction can also be problematic, especially in the binary case. Elston et al. proposed a test for detecting the presence of gene-treatment interactions for binary responses, and gave appropriate power calculations. This paper shows how the same approach can also be used for normally distributed responses. We also propose a method for analysis and sample size calculation based on a generalized linear model (GLM) approach. The power of the Elston et al. and GLM approaches is compared for the binary and normal cases using several illustrative examples. While more sensitive to errors in model specification than the Elston et al. approach, the GLM approach is much more flexible and in many cases more powerful. Copyright © 2005 John Wiley & Sons, Ltd.
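For the normal case, an approximate interaction power calculation of the general kind discussed here can be sketched as follows. This is a generic two-treatment-by-two-genotype contrast formula assuming equal cell sizes and known variance, not the Elston et al. or GLM method itself:

```python
from statistics import NormalDist

def interaction_power(delta, sigma, n_per_cell, alpha=0.05):
    """Approximate power of a two-sided z-test for a gene-by-treatment
    interaction of size `delta` in a 2x2 design with normally distributed
    responses. With equal cell sizes, the interaction contrast (difference
    of treatment differences) has standard error sigma*sqrt(4/n_per_cell)."""
    nd = NormalDist()
    se = sigma * (4.0 / n_per_cell) ** 0.5
    z_crit = nd.inv_cdf(1.0 - alpha / 2.0)
    return nd.cdf(abs(delta) / se - z_crit)

print(round(interaction_power(delta=1.0, sigma=1.0, n_per_cell=100), 3))  # 0.999
```

Note the factor of 4: detecting an interaction requires estimating a difference of differences, so for a given effect size it needs roughly four times the per-group sample size of a simple two-arm comparison.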
Abstract:
Individuals are typically co-infected by a diverse community of microparasites (e.g. viruses or protozoa) and macroparasites (e.g. helminths). Vertebrates respond to these parasites differently, typically mounting T helper type 1 (Th1) responses against microparasites and Th2 responses against macroparasites. These two responses may be antagonistic, such that hosts face a 'decision' of how to allocate potentially limiting resources. Such decisions at the individual host level will influence parasite abundance at the population level which, in turn, will feed back upon the individual level. We take a first step towards a complete theoretical framework by placing an analysis of optimal immune responses under microparasite-macroparasite co-infection within an epidemiological framework. We show that the optimal immune allocation is quantitatively sensitive to the shape of the trade-off curve and qualitatively sensitive to life-history traits of the host, microparasite and macroparasite. Ultimately, however, a more complete framework is needed to bring together the optimal strategy at the individual level and the population-level consequences of those responses before we can truly understand the evolution of host immune responses under parasite co-infection.
Abstract:
In survival analysis, frailty is often used to model heterogeneity between individuals or correlation within clusters. Typically, frailty is taken to be a continuous random effect, yielding a continuous mixture distribution for survival times. A Bayesian analysis of a correlated frailty model is discussed in the context of inverse Gaussian frailty. An MCMC approach is adopted, and the deviance information criterion is used to compare models. As an illustration of the approach, a bivariate dataset of corneal graft survival times is analysed. © 2006 Elsevier B.V. All rights reserved.
Abstract:
Glycogen phosphorylase (GP) catalyzes the phosphorylation of glycogen to Glc-1-P. Because of its fundamental role in the metabolism of glycogen, GP has been the target of systematic structure-assisted design of inhibitory compounds, which could be of value in the therapeutic treatment of type 2 diabetes mellitus. The most potent catalytic-site inhibitor of GP identified to date is the spirohydantoin of glucopyranose (hydan). In this work, we employ molecular dynamics (MD) free energy simulations to calculate the relative binding affinities for GP of hydan and two spirohydantoin analogues, methyl-hydan and n-hydan, in which a hydrogen atom is replaced by a methyl or amino group, respectively. The results are compared with the experimental relative affinities of these ligands, estimated by kinetic measurements of the ligand inhibition constants. The calculated binding affinity for methyl-hydan (relative to hydan) is 3.75 ± 1.4 kcal/mol, in excellent agreement with the experimental value (3.6 ± 0.2 kcal/mol). For n-hydan, the calculated value is 1.0 ± 1.1 kcal/mol, somewhat smaller than the experimental result (2.3 ± 0.1 kcal/mol). A free energy decomposition analysis shows that hydan makes optimal interactions with protein residues and specific water molecules in the catalytic site. In the other two ligands, structural perturbations of the active site by the additional methyl or amino group reduce the corresponding binding affinities. The computed binding free energies are sensitive to the preference of a specific water molecule for two well-defined positions in the catalytic site. The behavior of this water is analyzed in detail, and the free energy profile for the translocation of the water between the two positions is evaluated. The results provide insights into the role of water molecules in modulating ligand binding affinities. A comparison of the interactions between a set of ligands and their surrounding groups in X-ray structures is often used in the interpretation of binding free energy differences and in guiding the design of new ligands. For the systems in this work, such an approach fails to predict the order of relative binding strengths, in contrast to the rigorous free energy treatment.
Abstract:
The UK Construction Industry has been criticized for being slow to change and to adopt innovations. The idiosyncrasies of participants, their roles in a social system and the contextual differences between sections of the UK Construction Industry are viewed as paramount in explaining innovation diffusion within this context. Three innovation diffusion theories from outside the construction management literature are introduced: Cohesion, Structural Equivalence and Thresholds. The relevance of each theory to the UK Construction Industry is critically reviewed using literature and empirical data. Analysis of the data leads to a proposed explanatory framework. The framework introduces a Personal Awareness Threshold concept, and highlights the dominant role of Cohesion through the main stages of diffusion, the use of Structural Equivalence during the later stages of diffusion, and the importance of Adoption Threshold levels.
Abstract:
This paper examines the intellectual and professional contribution of comparative and international studies to the field of education. It explores the nature of the challenges that are currently being faced, and assesses its potential for the advancement of future teaching, research and professional development. Attention is paid to the place of comparative and international education (CIE)-past and present-in teacher education, in postgraduate studies, and in the realms of policy and practice, theory and research. Consideration is first given to the nature and history of CIE, to its initial contributions to the field of education in the UK, and to its chief mechanisms and sites of production. Influential methodological and theoretical developments are examined, followed by an exploration of emergent questions, controversies and dilemmas that could benefit from sustained comparative analysis in the future. Conclusions consider implications for the place of CIE in the future of educational studies as a whole; for relations between and beyond the 'disciplines of education'; and for the development of sustainable research capacity in this field.