79 results for process analysis
Abstract:
Background: Microarray-based comparative genomic hybridisation (CGH) experiments have been used to study numerous biological problems including understanding genome plasticity in pathogenic bacteria. Typically such experiments produce large data sets that are difficult for biologists to handle. Although there are some programmes available for interpretation of bacterial transcriptomics data and CGH microarray data for looking at genetic stability in oncogenes, there are none specifically to understand the mosaic nature of bacterial genomes. Consequently a bottleneck still persists in accurate processing and mathematical analysis of these data. To address this shortfall we have produced a simple and robust CGH microarray data analysis process that may be automated in the future to understand bacterial genomic diversity. Results: The process involves five steps: cleaning, normalisation, estimating gene presence and absence or divergence, validation, and analysis of data from test strains against three reference strains simultaneously. Each stage of the process is described and we have compared a number of methods available for characterising bacterial genomic diversity and for calculating the cut-off between gene presence and absence or divergence, and shown that a simple dynamic approach using a kernel density estimator performed better than both established methods and a more sophisticated mixture modelling technique. We have also shown that current methods commonly used for CGH microarray analysis in tumour and cancer cell lines are not appropriate for analysing our data. Conclusion: After carrying out the analysis and validation for three sequenced Escherichia coli strains, CGH microarray data from 19 E. coli O157 pathogenic test strains were used to demonstrate the benefits of applying this simple and robust process to CGH microarray studies using bacterial genomes.
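The dynamic cut-off idea described in the abstract can be sketched as follows: fit a kernel density estimate to the log-ratio distribution and take the minimum of the density between the "present" and "divergent/absent" modes as the cut-off. The simulated data, mode positions, and search interval below are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical log2-ratio data: a "present" mode near 0 and a
# "divergent/absent" mode at negative values (illustrative only).
rng = np.random.default_rng(0)
log_ratios = np.concatenate([rng.normal(0.0, 0.3, 800),
                             rng.normal(-2.0, 0.4, 200)])

# Fit a kernel density estimate and evaluate it on a grid.
kde = gaussian_kde(log_ratios)
grid = np.linspace(log_ratios.min(), log_ratios.max(), 1000)
density = kde(grid)

# The cut-off is the density minimum between the two modes.
lo, hi = -2.0, 0.0
mask = (grid > lo) & (grid < hi)
cutoff = grid[mask][np.argmin(density[mask])]

present = log_ratios >= cutoff
print(f"cut-off = {cutoff:.2f}, genes called present: {present.sum()}")
```

Because the cut-off is recomputed from each array's own density, the approach adapts to per-experiment shifts in the log-ratio distribution, which is what makes it "dynamic" relative to a fixed threshold.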
Abstract:
The animal gastrointestinal tract houses a large microbial community, the gut microbiota, that confers many benefits to its host, such as protection from pathogens and provision of essential metabolites. Metagenomic approaches have defined the chicken fecal microbiota in other studies, but here, we wished to assess the correlation between the metagenome and the bacterial proteome in order to better understand the healthy chicken gut microbiota. Here, we performed high-throughput sequencing of 16S rRNA gene amplicons and metaproteomics analysis of fecal samples to determine microbial gut composition and protein expression. 16S rRNA gene sequencing analysis identified Clostridiales, Bacteroidaceae, and Lactobacillaceae as the most abundant species in the gut. For metaproteomics analysis, peptides were generated by using the FASP method and subsequently fractionated by strong anion exchange. Metaproteomics analysis identified 3,673 proteins. Among the most frequently identified proteins, 380 proteins belonged to Lactobacillus spp., 155 belonged to Clostridium spp., and 66 belonged to Streptococcus spp. The most frequently identified proteins were heat shock chaperones, including 349 GroEL proteins, from many bacterial species, whereas the most abundant enzymes were pyruvate kinases, as judged by the number of peptides identified per protein (spectral counting). Gene ontology and KEGG pathway analyses revealed the functions and locations of the identified proteins. The findings of both metaproteomics and 16S rRNA sequencing analyses are discussed.
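Spectral counting, the abundance measure mentioned above, simply tallies how many peptide-spectrum matches (PSMs) are assigned to each protein. A minimal sketch, with invented protein and peptide names purely for illustration:

```python
from collections import Counter

# Hypothetical peptide-spectrum matches as (protein, peptide) pairs, as they
# might come out of a database search; all names are illustrative only.
psms = [
    ("GroEL_Lactobacillus", "AVEEGIVPGGG"),
    ("GroEL_Lactobacillus", "KLQERLAK"),
    ("GroEL_Lactobacillus", "AVEEGIVPGGG"),   # same peptide observed twice
    ("PyruvateKinase_Clostridium", "MIVLTK"),
    ("PyruvateKinase_Clostridium", "GDLGVEIP"),
    ("PyruvateKinase_Clostridium", "TNNPETAR"),
    ("PyruvateKinase_Clostridium", "MIVLTK"),
]

# Spectral counting: abundance is approximated by the number of spectra
# (PSMs) assigned to each protein, not the number of distinct peptides.
spectral_counts = Counter(protein for protein, _ in psms)

for protein, count in spectral_counts.most_common():
    print(protein, count)
```

Note the distinction the abstract draws: a protein can be *frequently identified* (matched in many samples or species, like GroEL) yet a different protein can be *most abundant* by spectral counts within a sample (here, pyruvate kinase).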
Abstract:
About 90% of the anthropogenic increase in heat stored in the climate system is found in the oceans. It is therefore relevant to understand the details of ocean heat uptake. Here we present a detailed, process-based analysis of ocean heat uptake (OHU) processes in HiGEM1.2, an atmosphere-ocean general circulation model (AOGCM) with an eddy-permitting ocean component of 1/3 degree resolution. Similarly to various other models, HiGEM1.2 shows that the global heat budget is dominated by a downward advection of heat compensated by upward isopycnal diffusion. Only in the upper tropical ocean do we find the classical balance between downward diapycnal diffusion and upward advection of heat. The upward isopycnal diffusion of heat is located mostly in the Southern Ocean, which thus dominates the global heat budget. We compare the responses to a 4xCO2 forcing and an enhancement of the wind stress forcing in the Southern Ocean. This highlights the importance of regional processes for the global ocean heat uptake. These are mainly surface fluxes and convection in the high latitudes, and advection in the Southern Ocean mid-latitudes. Changes in diffusion are less important. In line with the CMIP5 models, HiGEM1.2 shows a band of strong OHU in the mid-latitude Southern Ocean in the 4xCO2 run, which is mostly advective. By contrast, in the high-latitude Southern Ocean regions it is the suppression of convection that leads to OHU. In the enhanced wind stress run, convection is strengthened at high southern latitudes, leading to heat loss, while the magnitude of the OHU in the southern mid-latitudes is very similar to the 4xCO2 results. Remarkably, there is only a very small global OHU in the enhanced wind stress run; the wind stress forcing merely redistributes heat. We relate the ocean changes at high southern latitudes to the effect of climate change on the Antarctic Circumpolar Current (ACC). It weakens in the 4xCO2 run and strengthens in the wind stress run.
The weakening is due to a narrowing of the ACC, caused by an expansion of the Weddell Gyre, and a flattening of the isopycnals, which are explained by a combination of the wind stress forcing and increased precipitation.
Abstract:
We define and experimentally test a public provision mechanism that meets three basic ethical requirements and allows community members to influence, via monetary bids, which of several projects is implemented. For each project, participants are assigned personal values, which can be positive or negative. We provide either public or private information about personal values. This produces two distinct public provision games, which are experimentally implemented and analyzed for various projects. In spite of the complex experimental task, participants do not rely on bidding their own personal values as an obvious simple heuristic whose general acceptance would result in fair and efficient outcomes. Rather, they rely on strategic underbidding. Although underbidding is affected by projects’ characteristics, the provision mechanism mostly leads to the implementation of the most efficient project.
Abstract:
Despite the many models developed for phosphorus concentration prediction at differing spatial and temporal scales, there has been little effort to quantify uncertainty in their predictions. Model prediction uncertainty quantification is desirable for informed decision-making in river-systems management. An uncertainty analysis of the process-based model, integrated catchment model of phosphorus (INCA-P), within the generalised likelihood uncertainty estimation (GLUE) framework is presented. The framework is applied to the Lugg catchment (1,077 km²), a River Wye tributary, on the England–Wales border. Daily discharge and monthly phosphorus (total reactive and total), for a limited number of reaches, are used to initially assess uncertainty and sensitivity of 44 model parameters, identified as being most important for discharge and phosphorus predictions. This study demonstrates that parameter homogeneity assumptions (spatial heterogeneity is treated as land use type fractional areas) can achieve higher model fits than a previous expertly calibrated parameter set. The model is capable of reproducing the hydrology, but a threshold Nash–Sutcliffe coefficient of determination (E or R²) of 0.3 is not achieved when simulating observed total phosphorus (TP) data in the upland reaches or total reactive phosphorus (TRP) in any reach. Despite this, the model reproduces the general dynamics of TP and TRP in point source dominated lower reaches. This paper discusses why this application of INCA-P fails to find any parameter sets which simultaneously describe all observed data acceptably. The discussion focuses on uncertainty of readily available input data, and whether such process-based models should be used when there are insufficient data to support the many parameters.
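The GLUE procedure behind this abstract can be sketched in a few lines: sample parameter sets from a prior range, score each with a likelihood measure (here the Nash–Sutcliffe efficiency, with the paper's 0.3 threshold), retain the "behavioural" sets, and form likelihood-weighted predictions. The one-parameter toy model below is a stand-in, not INCA-P.

```python
import numpy as np

# Toy GLUE sketch with a hypothetical one-parameter model (exponential
# recession); observations are synthetic. Illustrative only, not INCA-P.
rng = np.random.default_rng(42)

t = np.arange(100)
def model(k):
    return 10.0 * np.exp(-k * t)

observed = model(0.05) + rng.normal(0, 0.3, t.size)

def nash_sutcliffe(sim, obs):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Monte Carlo sampling of the parameter from a uniform prior range.
samples = rng.uniform(0.01, 0.2, 2000)
scores = np.array([nash_sutcliffe(model(k), observed) for k in samples])

# Keep "behavioural" sets above the E > 0.3 threshold; weight by likelihood.
behavioural = scores > 0.3
weights = scores[behavioural] / scores[behavioural].sum()
predictions = np.array([model(k) for k in samples[behavioural]])
glue_mean = weights @ predictions  # likelihood-weighted ensemble prediction

print(f"{behavioural.sum()} behavioural sets of {samples.size}")
```

The paper's central finding maps directly onto this sketch: for some observed series (upland TP, TRP), *no* sampled parameter set crosses the behavioural threshold, so the behavioural ensemble is empty.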
Abstract:
The frequency of persistent atmospheric blocking events in the 40-yr ECMWF Re-Analysis (ERA-40) is compared with the blocking frequency produced by a simple first-order Markov model designed to predict the time evolution of a blocking index [defined by the meridional contrast of potential temperature on the 2-PVU surface (1 PVU ≡ 1 × 10⁻⁶ K m² kg⁻¹ s⁻¹)]. With the observed spatial coherence built into the model, it is able to reproduce the main regions of blocking occurrence and the frequencies of sector blocking very well. This underlines the importance of the climatological background flow in determining the locations of high blocking occurrence as being the regions where the mean midlatitude meridional potential vorticity (PV) gradient is weak. However, when only persistent blocking episodes are considered, the model is unable to simulate the observed frequencies. It is proposed that this persistence beyond that given by a red noise model is due to the self-sustaining nature of the blocking phenomenon.
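A first-order Markov (red noise) null model of the kind described above can be sketched as an AR(1) process on a blocking index, declaring blocking when the index drops below a threshold. The parameter values and the 5-day persistence criterion here are illustrative assumptions, not fitted to ERA-40.

```python
import numpy as np

# Minimal red-noise sketch of a blocking index B:
#   B(t+1) = a * B(t) + noise
# with blocking declared when B falls below a threshold.
rng = np.random.default_rng(1)
a = 0.9                      # illustrative one-step lag autocorrelation
sigma = np.sqrt(1 - a**2)    # noise amplitude giving unit variance
threshold = -1.5             # index value below which flow is "blocked"

n = 50_000
b = np.empty(n)
b[0] = 0.0
for t in range(n - 1):
    b[t + 1] = a * b[t] + sigma * rng.normal()

blocked = b < threshold
print(f"blocking frequency: {blocked.mean():.3f}")

# Persistent episodes: count runs of consecutive blocked steps >= 5.
edges = np.diff(np.r_[0, blocked.astype(int), 0])
run_lengths = np.diff(np.flatnonzero(edges))[::2]
persistent = int((run_lengths >= 5).sum())
print(f"episodes lasting >= 5 steps: {persistent}")
```

Comparing the run-length distribution of such a model against observed episode durations is exactly where the abstract reports the mismatch: the red-noise baseline underpredicts long, persistent episodes.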
Abstract:
The research record on the quantification of sediment transport processes in periglacial mountain environments in Scandinavia dates back to the 1950s. A wide range of measurements is available, especially from the Karkevagge region of northern Sweden. Within this paper satellite image analysis and tools provided by geographic information systems (GIS) are exploited in order to extend and improve this research and to complement geophysical methods. The processes of interest include mass movements such as solifluction, slope wash, dirty avalanches and rock- and boulder falls. Geomorphic process units have been derived in order to allow quantification via GIS techniques at a catchment scale. Mass movement rates based on existing field measurements are employed in the budget calculation. In the Karkevagge catchment, 80% of the area can be identified either as a source area for sediments or as a zone where sediments are deposited. The overall budget for the slopes beneath the rockwalls in the Karkevagge is approximately 680 t a⁻¹, whilst about 150 t a⁻¹ are transported into the fluvial system.
Abstract:
Partnerships are complex, diverse and subtle relationships, the nature of which changes with time, but they are vital for the functioning of the development chain. This paper reviews the meaning of partnership between development institutions as well as some of the main approaches taken to analyse the relationships. The latter typically revolve around analyses based on power, discourse, interdependence and functionality. The paper makes the case for taking a multianalytical approach to understanding partnership but points out three problem areas: identifying acceptable/unacceptable trade-offs between characteristics of partnership, the analysis of multicomponent partnerships (where one partner has a number of other partners) and the analysis of long-term partnership. The latter is especially problematic for long-term partnerships between donors and field agencies that share an underlying commitment based on religious beliefs. These problems with current methods of analysing partnership are highlighted by focusing upon the Catholic Church-based development chain, linking donors in the North (Europe) and their field partners in the South (Abuja Ecclesiastical Province, Nigeria). It explores a narrated history of a relationship with a single donor spanning 35 years from the perspective of one partner (the field agency).
Abstract:
The principles of organization theory are applied to the organization of construction projects. This is done by proposing a framework for modelling the whole process of building procurement, beginning with a framework for describing the environments within which construction projects take place. This is followed by the development of a series of hypotheses about the organizational structure of construction projects. Four case studies are undertaken, and the extent to which their organizational structure matches the model is compared to the level of success achieved by each project. To this end, a systematic method for evaluating the success of building project organizations is required, because any conclusions about the adequacy of a particular organization must be related to the degree of success achieved by that organization. In order to test these hypotheses, a mapping technique is developed. The technique offered is a development of a technique known as Linear Responsibility Analysis, and is called "3R analysis" as it deals with roles, responsibilities and relationships. The analysis of the case studies shows that they tended to suffer due to inappropriate organizational structure. One of the prevailing problems of public sector organization is that organizational structures are inadequately defined, and too cumbersome to respond to environmental demands on the project. The projects tended to be organized as rigid hierarchies, particularly at decision points, when what was required was a more flexible, dynamic and responsive organization. The study concludes with a series of recommendations, including suggestions for increasing the responsiveness of construction project organizations, and reducing the lead-in times for the inception periods.
Abstract:
Standard form contracts are typically developed through a negotiated consensus, unless they are proffered by one specific interest group. Previously published plans of work and other descriptions of the processes in construction projects tend to focus on operational issues, or they tend to be prepared from the point of view of one or other of the dominant interest groups. Legal practice in the UK permits those who draft contracts to define their terms as they choose. There are no definitive rulings from the courts that give an indication as to the detailed responsibilities of project participants. The science of terminology offers useful guidance for discovering and describing terms and their meanings in their practical context, but has never been used for defining terms for responsibilities of participants in the construction project management process. Organizational analysis enables the management task to be deconstructed into its elemental parts in order that effective organizational structures can be developed. Organizational mapping offers a useful technique for reducing text-based descriptions of project management roles and responsibilities to a comparable basis. Research was carried out by means of a desk study, detailed analysis of nine plans of work and focus groups representing all aspects of the construction industry. No published plan of work offers definitive guidance. There is an enormous amount of variety in the way that terms are used for identifying responsibilities of project participants. A catalogue of concepts and terms (a “Terminology”) has been compiled and indexed to enable those who draft contracts to choose the most appropriate titles for project participants. The purpose of this terminology is to enable the selection and justification of appropriate terms in order to help define roles. 
The terminology brings an unprecedented clarity to the description of roles and responsibilities in construction projects and, as such, will be helpful for anyone seeking to assemble a team and specify roles for project participants.
Abstract:
Context: Learning can be regarded as knowledge construction in which prior knowledge and experience serve as basis for the learners to expand their knowledge base. Such a process of knowledge construction has to take place continuously in order to enhance the learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision which meets their own specific purposes, goals, and expectations. Objectives: The current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. The users' requirements can be represented as a case in the defined structure which can be reasoned over to enable the requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled. However, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing individual users' requirements and transforming the requirements into personalised information provision specifications. Hence the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on the qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulates, maps, configures, and learning content. The results were used as feedback for improvement. Result: The research work has produced an ontology model with a set of techniques which support the functions for profiling users' requirements, reasoning over requirements patterns, generating workflow from norms, and formulating information provision specifications. Conclusion: The current requirements engineering approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with the techniques, can further enhance the RE approaches for modelling the individual user's needs and discovering the user's requirements.
Abstract:
A scheme to describe SDS−lysozyme complex formation has been proposed on the basis of isothermal titration calorimetry (ITC) and FTIR spectroscopy data. ITC isotherms are convoluted and reveal a marked effect of both SDS and lysozyme concentration on the stoichiometry of the SDS−lysozyme complex. The binding isotherms have been described with the aid of FTIR spectroscopy in terms of changes in the lysozyme structure and the nature of the SDS binding. At low SDS concentrations, ITC isotherms feature an exothermic region that corresponds to specific electrostatic binding of SDS to positively charged amino acid residues on the lysozyme surface. This leads to charge neutralization of the complex and precipitation. The number of SDS molecules that bind specifically to lysozyme is approximately 8, as determined from our ITC isotherms, and is independent of lysozyme solution concentration. At high SDS concentrations, hydrophobic cooperative association dominates the binding process. Saturated binding stoichiometries as a molar ratio of SDS per molecule of lysozyme range from 220:1 to 80:1, depending on the lysozyme solution concentration. A limiting value of 78:1 has been calculated for lysozyme solution concentrations above 0.25 mM.
Abstract:
This paper examines the dynamics of the ongoing conflict in Prestea, Ghana, where indigenous galamsey mining groups are operating illegally on a concession awarded to Bogoso Gold Limited (BGL), property of the Canadian-listed multinational Gold Star Resources. Despite being issued firm orders by the authorities to abandon their activities, galamsey leaders maintain that they are working areas of the concession that are of little interest to the company; they further counter that there are few alternative sources of local employment, which is why they are mining in the first place. Whilst the Ghanaian Government is in the process of setting aside plots to relocate illegal mining parties and is developing alternative livelihood projects, efforts are far from encouraging: in addition to a series of overlooked logistical problems, the areas earmarked for relocation have not yet been prospected to ascertain gold content, and the alternative income-earning activities identified are inappropriate. As has been the case throughout mineral-rich sub-Saharan Africa, the conflict in Prestea has come about largely because the national mining sector reform program, which prioritizes the expansion of predominantly foreign-controlled large-scale projects, has neglected the concerns of indigenous subsistence groups.
Abstract:
A range of funding schemes and policy instruments exist to effect enhancement of the landscapes and habitats of the UK. While a number of assessments of these mechanisms have been conducted, little research has been undertaken to compare both quantitatively and qualitatively their relative effectiveness across a range of criteria. It is argued that few tools are available for such a multi-faceted evaluation of effectiveness. A form of Multiple Criteria Decision Analysis (MCDA) is justified and utilized as a framework in which to evaluate the effectiveness of nine mechanisms in relation to the protection of existing areas of chalk grassland and the creation of new areas in the South Downs of England. These include established schemes, such as the Countryside Stewardship and Environmentally Sensitive Area Schemes, along with other less common mechanisms, for example, land purchase and tender schemes. The steps involved in applying an MCDA to evaluate such mechanisms are identified and the process is described. Quantitative results from the comparison of the effectiveness of different mechanisms are presented, although the broader aim of the paper is that of demonstrating the performance of MCDA as a tool for measuring the effectiveness of mechanisms aimed at landscape and habitat enhancement.
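The core of a weighted-sum MCDA of the kind applied above can be sketched very compactly: score each mechanism against each criterion on a common scale, multiply by criterion weights, and rank by the total. The mechanisms, criteria, scores, and weights below are invented for illustration and do not reproduce the paper's evaluation.

```python
# Minimal weighted-sum MCDA sketch; all names, scores, and weights are
# illustrative assumptions, not values from the study.
criteria_weights = {"protection": 0.4, "creation": 0.35, "cost": 0.25}

# Scores on a common 0-10 scale (cost already transformed so higher = better).
scores = {
    "Countryside Stewardship": {"protection": 7, "creation": 5, "cost": 6},
    "ESA Scheme":              {"protection": 6, "creation": 4, "cost": 7},
    "Land purchase":           {"protection": 9, "creation": 8, "cost": 2},
}

def weighted_score(mechanism_scores, weights):
    # Weighted sum across criteria for one mechanism.
    return sum(weights[c] * s for c, s in mechanism_scores.items())

ranked = sorted(scores,
                key=lambda m: weighted_score(scores[m], criteria_weights),
                reverse=True)
for m in ranked:
    print(f"{m}: {weighted_score(scores[m], criteria_weights):.2f}")
```

Changing the criterion weights reorders the ranking, which is why sensitivity analysis on the weights is usually part of an MCDA of this kind.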
High throughput, high resolution selection of polymorphic microsatellite loci for multiplex analysis
Abstract:
Background Large-scale genetic profiling, mapping and genetic association studies require access to a series of well-characterised and polymorphic microsatellite markers with distinct and broad allele ranges. Selection of complementary microsatellite markers with non-overlapping allele ranges has historically proved to be a bottleneck in the development of multiplex microsatellite assays. The characterisation process for each microsatellite locus can be laborious and costly given the need for numerous, locus-specific fluorescent primers. Results Here, we describe a simple and inexpensive approach to select useful microsatellite markers. The system is based on the pooling of multiple unlabelled PCR amplicons and their subsequent ligation into a standard cloning vector. A second round of amplification, utilising generic labelled primers targeting the vector and unlabelled locus-specific primers targeting the microsatellite flanking region, yields allelic profiles that are representative of all individuals contained within the pool. Suitability of various DNA pool sizes was then tested for this purpose. DNA template pools containing between 8 and 96 individuals were assessed for the determination of allele ranges of individual microsatellite markers across a broad population. This helped resolve the balance between using pools that are large enough to allow the detection of many alleles and the risk of including too many individuals in a pool, such that rare alleles are over-diluted and so do not appear in the pooled microsatellite profile. Pools of DNA from 12 individuals allowed the reliable detection of all alleles present in the pool. Conclusion The use of generic vector-specific fluorescent primers and unlabelled locus-specific primers provides a high resolution, rapid and inexpensive approach for the selection of highly polymorphic microsatellite loci that possess non-overlapping allele ranges for use in large-scale multiplex assays.
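The pool-size trade-off discussed above has a simple quantitative core: the chance that at least one copy of a rare allele (frequency p) is present in a pool of n diploid individuals is 1 − (1 − p)^(2n), but each copy contributes only 1/(2n) of the pooled template, so large pools dilute rare alleles towards the detection floor. The numbers below are illustrative, not the study's measurements.

```python
# Sketch of the pool-size trade-off for pooled microsatellite profiling.
# p = allele frequency; n = number of diploid individuals in the pool.
def allele_in_pool(p, n):
    # Probability that at least one of the 2n chromosomes carries the allele.
    return 1 - (1 - p) ** (2 * n)

def min_signal_fraction(n):
    # Smallest possible contribution of a single allele copy to the profile.
    return 1 / (2 * n)

# Illustrative rare-allele frequency of 5%; pool sizes as tested in the study.
for n in (8, 12, 24, 48, 96):
    print(f"n={n:3d}  P(rare allele p=0.05 present)={allele_in_pool(0.05, n):.2f}"
          f"  single-copy fraction={min_signal_fraction(n):.3f}")
```

The table this prints makes the tension explicit: moving from 8 to 96 individuals raises the chance of capturing a rare allele, but drops a single copy's share of the template from 1/16 to 1/192, consistent with the study's finding that a moderate pool (12 individuals) worked reliably.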