13 results for Sequence analysis with oligonucleotide series

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

The genome of Salmonella enterica serovar Enteritidis was shown to possess three IS3-like insertion elements, designated IS1230A, B and C, each of which was cloned and its deoxynucleotide sequence determined. Mutations in elements IS1230A and B introduced frameshifts that inactivated the open reading frames encoding a putative transposase. IS1230C was truncated at nucleotide 774 relative to IS1230B and therefore lacked the 3' terminal inverted repeat. The three IS1230 derivatives were closely related to one another on the basis of nucleotide sequence similarity. IS1230A was located adjacent to the sef operon, which encodes SEF14 fimbriae and maps at minute 97 of the S. Enteritidis genome. IS1230B was located adjacent to the umuDC operon at minute 42.5, near one terminus of an 815-kb genome inversion of S. Enteritidis relative to S. Typhimurium. IS1230C was located next to attB, the bacteriophage P22 attachment site, and proB, encoding gamma-glutamyl phosphate reductase. A truncated 3' remnant of IS1230, designated IS1230T, was identified in a clinical isolate of S. Typhimurium DT193 strain 2391; this element was located next to attB, adjacent to which were bacteriophage P22-like sequences. Southern hybridisation of total genomic DNA from eighteen phage types of S. Enteritidis and eighteen definitive types of S. Typhimurium, probed with IS1230A, showed similar, if not identical, restriction fragment profiles within each serovar.

Relevance:

100.00%

Publisher:

Abstract:

The nucleotide sequence of a 3 kb region immediately upstream of the sef operon of Salmonella enteritidis was determined. A 1230 base pair insertion sequence which shared sequence identity (> 75%) with members of the IS3 family was revealed. This element, designated IS1230, had terminal inverted repeats almost identical (90% identity) to those of Escherichia coli IS3, but unlike other IS3-like sequences it lacked the two characteristic open reading frames encoding the putative transposase. S. enteritidis possessed only one copy of this insertion sequence, although Southern hybridisation analysis of restriction digests of genomic DNA revealed a second, weakly hybridising fragment in a region distinct from the sef operon, suggesting the presence of an IS1230 homologue. The distribution of IS1230 and IS1230-like elements was shown to be widespread amongst salmonellas, and the patterns of hybridising restriction fragments differed significantly between Salmonella serotypes; it is therefore suggested that IS1230 has potential for development as a differential diagnostic tool.

Relevance:

100.00%

Publisher:

Abstract:

Exploratory analysis of data seeks to find common patterns to gain insights into the structure and distribution of the data. In geochemistry it is a valuable means of gaining insight into the complicated processes making up a petroleum system. Typically, linear visualisation methods such as principal components analysis, linked plots, or brushing are used. These methods cannot be employed directly when data are missing and, although they can capture non-linear structure locally, they struggle to capture it globally. This thesis discusses a complementary approach based on a non-linear probabilistic model. The generative topographic mapping (GTM) enables the visualisation of the effects of very many variables on a single plot, which can incorporate more structure than a two-dimensional principal components plot. The model can deal with uncertainty and missing data, and allows the non-linear structure in the data to be explored. In this thesis a novel approach to initialising the GTM with arbitrary projections is developed. This makes it possible to combine GTM with algorithms such as Isomap and to fit complex non-linear structures such as the Swiss roll. Another novel extension is the incorporation of prior knowledge about the structure of the covariance matrix. This extension greatly enhances the modelling capabilities of the algorithm, resulting in a better fit to the data and better imputation of missing data. Additionally, an extensive benchmark study of the missing-data imputation capabilities of GTM is performed. Further, a novel approach based on missing data is introduced to benchmark the fit of probabilistic visualisation algorithms on unlabelled data. Finally, the work is complemented by evaluating the algorithms on real-life datasets from geochemical projects.
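
As a complement to the abstract above, the following is a minimal sketch of the kind of missing-data imputation benchmark it describes: artificially mask known entries, impute them, and score the reconstruction. The synthetic data, the 10% masking rate and the column-mean baseline are illustrative assumptions; the thesis itself benchmarks GTM-based imputation, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))            # stand-in for a geochemical dataset
mask = rng.random(X.shape) < 0.10        # hide 10% of the observed entries
X_missing = np.where(mask, np.nan, X)

# Baseline imputation: replace each missing value by its column mean.
col_means = np.nanmean(X_missing, axis=0)
X_imputed = np.where(mask, col_means, X_missing)

# Score the imputation on the held-out (masked) entries only.
rmse = np.sqrt(np.mean((X_imputed[mask] - X[mask]) ** 2))
print(f"column-mean imputation RMSE on held-out entries: {rmse:.3f}")
```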

Relevance:

100.00%

Publisher:

Abstract:

This paper explains some drawbacks of previous approaches to detecting influential observations in deterministic nonparametric data envelopment analysis (DEA) models, as developed by Yang et al. (Annals of Operations Research 173:89-103, 2010). For example, the efficiency scores and relative entropies obtained in that model are uninformative for outlier detection, and the empirical distribution of all estimated relative entropies is not a Monte-Carlo approximation. In this paper we develop a new method to detect whether a specific DMU is truly influential, and a statistical test is applied to determine the significance level. An application to measuring the efficiency of hospitals is used to show the superiority of this method, which leads to significant advancements in outlier detection. © 2014 Springer Science+Business Media New York.
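
For readers unfamiliar with DEA efficiency scores, the sketch below solves the standard input-oriented CCR envelopment model as a linear programme for each decision-making unit (DMU). It is a generic illustration under assumed hospital inputs and outputs, not the influence-detection method of the paper.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, dmu):
    """Solve: min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0."""
    X, Y = np.asarray(inputs, float), np.asarray(outputs, float)  # shapes (m, n), (s, n)
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # decision vector is [theta, lam_1..lam_n]
    A_in = np.hstack([-X[:, [dmu]], X])         # X @ lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y @ lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, dmu]]
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lambdas non-negative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Hypothetical hospitals: inputs = (beds, staff), output = (patients treated,).
inputs = [[100, 150, 120, 200],
          [300, 420, 350, 600]]
outputs = [[2000, 2600, 2500, 3300]]

for j in range(4):
    print(f"hospital {j}: efficiency = {ccr_efficiency(inputs, outputs, j):.3f}")
```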

Relevance:

100.00%

Publisher:

Abstract:

This PhD thesis analyses networks of knowledge flows, focusing on the role of indirect ties in the knowledge transfer, knowledge accumulation and knowledge creation process. It extends and improves existing methods for mapping networks of knowledge flows in two different applications and contributes to two streams of research. To support the underlying idea of this thesis, namely finding an alternative method for ranking indirect network ties that sheds new light on the dynamics of knowledge transfer, we apply Ordered Weighted Averaging (OWA) to two different network contexts. Knowledge flows in patent citation networks and in a company supply chain network are analysed using Social Network Analysis (SNA) and the OWA operator. The OWA operator is used here for the first time (i) to rank indirect citations in patent networks, providing new insight into their role in transferring knowledge among network nodes, and to analyse a long chain of patent generations over 13 years; and (ii) to rank indirect relations in a company supply chain network, to shed light on the role of indirectly connected individuals in the knowledge transfer and creation processes, and to contribute to the literature on knowledge management in a supply chain. In doing so, indirect ties are measured and their role as a means of knowledge transfer is shown. Thus, this thesis represents a first attempt to bridge the OWA and SNA fields and to show that the two methods can be used together to enrich our understanding of the role of indirectly connected nodes in a network. More specifically, the OWA scores enrich our understanding of knowledge evolution over time within complex networks. Future research can show the usefulness of the OWA operator in different complex networks, such as online social networks consisting of thousands of nodes.
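
A minimal sketch of the OWA operator used throughout the thesis is given below. The weight vector and the per-node tie-strength scores are illustrative assumptions, not values from the thesis; they simply show how OWA aggregates direct and indirect tie strengths into a single ranking score.

```python
import numpy as np

def owa(values, weights):
    """Ordered Weighted Averaging: sort values in descending order, then
    combine them with a non-negative weight vector that sums to one."""
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    if values.shape != weights.shape:
        raise ValueError("values and weights must have the same length")
    if not np.isclose(weights.sum(), 1.0):
        raise ValueError("OWA weights must sum to 1")
    return float(np.sort(values)[::-1] @ weights)

# Hypothetical tie strengths for three nodes: [direct, 2nd-order, 3rd-order].
tie_strengths = {
    "node_A": [0.9, 0.4, 0.1],
    "node_B": [0.6, 0.6, 0.5],
    "node_C": [0.8, 0.2, 0.0],
}
weights = [0.5, 0.3, 0.2]  # place most weight on the strongest connections

ranking = sorted(tie_strengths, key=lambda n: owa(tie_strengths[n], weights), reverse=True)
print(ranking)  # nodes ordered by aggregated direct + indirect tie strength
```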

Relevance:

100.00%

Publisher:

Abstract:

This paper seeks to advance the theory and practice of the dynamics of complex networks in relation to direct and indirect citations. It applies social network analysis (SNA) and the ordered weighted averaging operator (OWA) to study a patent citation network. To date, SNA studies investigating long chains of patent citations have rarely been undertaken, and the importance of a node in a network has been associated mostly with its number of direct ties. In this research OWA is used to analyse complex networks, assess the role of indirect ties, and provide guidance to reduce complexity for decision makers and analysts. An empirical example of a set of European patents published in 2000 in the renewable energy industry is provided to show the usefulness of the proposed approach for the preference ranking of patent citations.
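
The sketch below illustrates, under assumed data, how OWA weights might be combined with an SNA toolkit to score patents by their direct and indirect (multi-generation) citations. The toy network, the three-generation cut-off and the weight vector are assumptions for illustration; the paper's exact ranking procedure is not reproduced.

```python
import networkx as nx
import numpy as np

def owa(values, weights):
    """Compact OWA helper: weights act on the values sorted in descending order."""
    return float(np.sort(np.asarray(values, dtype=float))[::-1]
                 @ np.asarray(weights, dtype=float))

# Toy citation network: a directed edge (a, b) means "patent a cites patent b".
citations = nx.DiGraph([
    ("P5", "P3"), ("P5", "P4"), ("P4", "P2"), ("P3", "P1"), ("P2", "P1"),
])

max_generation = 3            # citation generations (path lengths) to consider
weights = [0.6, 0.3, 0.1]     # OWA weights over the ordered generation counts

# Reverse the edges so paths run from a patent to the patents that cite it.
reversed_citations = citations.reverse()

scores = {}
for patent in citations.nodes:
    lengths = nx.shortest_path_length(reversed_citations, source=patent)
    counts = [sum(1 for d in lengths.values() if d == g)
              for g in range(1, max_generation + 1)]
    scores[patent] = owa(counts, weights)

for patent, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(patent, round(score, 2))
```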

Relevance:

100.00%

Publisher:

Abstract:

Tensor analysis plays an important role in modern image and vision computing problems. Most existing tensor analysis approaches are based on the Frobenius norm, which makes them sensitive to outliers. In this paper, we propose L1-norm-based tensor analysis (TPCA-L1), which is robust to outliers. Experimental results on face and other datasets demonstrate the advantages of the proposed approach. © 2006 IEEE.
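
For context, the following is a hedged sketch of the L1-norm PCA idea that TPCA-L1 generalises to tensors: a greedy sign-flipping update on vector data that maximises the L1 dispersion of the projections. The toy data and the outlier are assumptions for illustration; the paper's tensor formulation is not reproduced here.

```python
import numpy as np

def pca_l1_component(X, n_iter=100, seed=0):
    """Return a unit vector w maximising sum_i |w . x_i| over centred rows of X."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        signs = np.sign(X @ w)
        signs[signs == 0] = 1.0          # avoid zero projections stalling the update
        w_new = X.T @ signs              # re-align w with the signed sample mean
        w_new /= np.linalg.norm(w_new)
        if np.allclose(w_new, w):
            break
        w = w_new
    return w

# Toy comparison: a single gross outlier pulls the L2 (Frobenius-norm) direction
# far more than it pulls the L1 direction.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.3]])
X[0] = [0.0, 60.0]                        # the outlier
X -= X.mean(axis=0)
w_l1 = pca_l1_component(X)
w_l2 = np.linalg.svd(X, full_matrices=False)[2][0]
print("L1 direction:", np.round(w_l1, 3))
print("L2 direction:", np.round(w_l2, 3))
```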

Relevance:

100.00%

Publisher:

Abstract:

Aircraft manufacturing industries are looking for solutions to increase their productivity. One such solution is to apply metrology systems during the production and assembly processes. The Metrology Process Model (MPM) (Maropoulos et al., 2007) has been introduced to link metrology applications with assembly planning, manufacturing processes and product design. Measurability analysis is part of the MPM; its aim is to check the feasibility of measuring the designed large-scale components, and it has been integrated here to provide an efficient matching system. The metrology database is structured by developing the Metrology Classification Model, and a feature-based selection model is also described. By combining the two classification models, a novel approach and selection process for an integrated measurability analysis system (MAS) are introduced; such an integrated MAS can provide much more meaningful matching results for operators. © Springer-Verlag Berlin Heidelberg 2010.

Relevance:

100.00%

Publisher:

Abstract:

The major role of information and communication technology (ICT) in the new economy is well documented: countries worldwide are pouring resources into their ICT infrastructure despite the widely acknowledged “productivity paradox”. Evaluating the contribution of ICT investments has become an elusive but important goal for IS researchers and economists. This area of research is fraught with complexity, and we use Solow's Residual together with time-series analysis tools to overcome some methodological inadequacies of previous studies. Using this approach, we conduct a study of 20 countries to determine whether there is empirical evidence to support claims that ICT investments are worthwhile. The results show that ICT contributes to economic growth in many developed countries and newly industrialized economies (NIEs), but not in developing countries. Finally, we suggest ICT-complementary factors in an attempt to rectify possible flaws in ICT policies, as a contribution towards improving global productivity.
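
As background, a worked sketch of Solow's Residual (growth accounting) is shown below; the capital share and the growth rates are hypothetical numbers, not figures from the study.

```python
def solow_residual(output_growth, capital_growth, labour_growth, capital_share=0.3):
    """Total factor productivity growth: g_A = g_Y - a*g_K - (1 - a)*g_L."""
    return output_growth - capital_share * capital_growth - (1 - capital_share) * labour_growth

# Hypothetical annual growth rates (as fractions) for one economy.
g_tfp = solow_residual(output_growth=0.045, capital_growth=0.060, labour_growth=0.015)
print(f"TFP growth attributable to the residual: {g_tfp:.3%}")
```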

Relevance:

100.00%

Publisher:

Abstract:

In this poster we present our preliminary work on spammer detection and analysis using 50 active honeypot profiles implemented on the Weibo.com and QQ.com microblogging networks. We separated spammers from legitimate users by manually checking every captured user's microblog content, and from these accounts built a spammer dataset and a legitimate-user dataset for each social network community. We analysed and compared several features of the two user classes, which proved useful for distinguishing spammers from legitimate users. The following are initial observations from our analysis of the spammers captured on Weibo.com and QQ.com.

- The following/follower ratio of spammers is usually higher than that of legitimate users. Spammers tend to follow a large number of users in order to gain popularity but usually have relatively few followers.

- There is a large gap between the average numbers of microblogs posted per day by the two classes. On Weibo.com, spammers post far more microblogs per day than legitimate users do, while on QQ.com spammers post far fewer. This is mainly due to the different strategies spammers adopt on the two platforms.

- More spammers choose a cautious spam-posting pattern, mixing spam microblogs with ordinary ones so that they can evade the anti-spam mechanisms operated by the service providers.

- Aggressive spammers are more likely to be detected, so they tend to have a shorter lifetime, while cautious spammers can survive much longer and have a deeper influence on the network. The latter kind of spammer may become the dominant trend among social network spammers.

© 2012 IEEE.
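
The sketch below computes the two behavioural features highlighted above, the following/follower ratio and the average posts per day, for hypothetical accounts. The field names, example values and flagging threshold are assumptions for illustration, not the poster's detection rules.

```python
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    following: int
    followers: int
    microblogs: int
    active_days: int

def features(acc: Account) -> dict:
    return {
        "ff_ratio": acc.following / max(acc.followers, 1),      # following/follower ratio
        "posts_per_day": acc.microblogs / max(acc.active_days, 1),
    }

accounts = [
    Account("suspected_spammer", following=2500, followers=40, microblogs=900, active_days=30),
    Account("legitimate_user", following=180, followers=220, microblogs=120, active_days=30),
]

for acc in accounts:
    f = features(acc)
    # Illustrative rule only: a very high ratio is treated as a spam signal.
    flag = "spam-like" if f["ff_ratio"] > 10 else "normal"
    print(acc.name, {k: round(v, 2) for k, v in f.items()}, flag)
```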

Relevance:

100.00%

Publisher:

Abstract:

This research aims to examine the effectiveness of Soft Systems Methodology (SSM) in enabling systemic change within local government and local NHS environments, and to examine the role of the facilitator within this process. Checkland's Mode 2 variant of Soft Systems Methodology was applied on an experimental basis in two environments, Herefordshire Health Authority and Sandwell Health Authority. The Herefordshire application used SSM in the design of an Integrated Care Pathway for stroke patients. In Sandwell, SSM was deployed to assist in the design of an Information Management and Technology (IM&T) Strategy for the boundary-spanning Sandwell Partnership. Both of these environments were experiencing significant organisational change as the experiments unfolded. The explicit objectives of the research were: to examine the evolution and development of SSM and to contribute to its further development; to apply Soft Systems Methodology to change processes within the NHS; to evaluate the potential role of SSM in this wider process of change; to assess the role of the researcher as a facilitator within this process; and to develop a critical framework through which the impact of SSM on change might be understood and assessed. In developing these objectives, it became apparent that there was a gap in knowledge relating to SSM, concerning the evaluation of the role of the approach in the change process. The case studies highlighted issues in stakeholder selection and management, the communicative assumptions in SSM, the ambiguous role of the facilitator, and the impact of highly politicised problem environments on the effectiveness of the methodology in the process of change. An augmented variant of SSM that integrates an appropriate (social constructivist) evaluation method is outlined, together with a series of hypotheses about the operationalisation of this proposed method.

Relevance:

100.00%

Publisher:

Abstract:

Quantitative analysis of solid-state processes from isothermal microcalorimetric data is straightforward if data for the total process have been recorded, and problematic (the more likely case) when they have not. Data are usually plotted as a function of fraction reacted (α); for calorimetric data, this requires knowledge of the total heat change (Q) upon completion of the process. Determination of Q is difficult in cases where the process is fast (initial data missing) or slow (final data missing). Here we introduce several mathematical methods that allow the direct calculation of Q by selection of data points when only partial data are present, based on analysis with the Pérez-Maqueda model. All of the methods additionally allow direct determination of the reaction mechanism descriptors m and n, and from these the rate constant, k. The validity of the methods is tested with the use of simulated calorimetric data, and we introduce a graphical method for generating solid-state power-time data. The methods are then applied to the crystallization of indomethacin from a glass. All methods correctly recovered the total reaction enthalpy (16.6 J) and suggested that the crystallization followed an Avrami model. The rate constants for crystallization were determined to be 3.98 × 10⁻⁶, 4.13 × 10⁻⁶, and 3.98 × 10⁻⁶ s⁻¹ with methods 1, 2, and 3, respectively. © 2010 American Chemical Society.
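
For orientation, the sketch below shows the conventional calorimetric workflow the paper builds on: convert cumulative heat q(t) to fraction reacted α = q/Q and fit the Avrami expression α = 1 − exp(−(kt)ⁿ) to recover the rate constant. The synthetic data and parameter values are illustrative assumptions; the paper's methods for estimating Q from partial data are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def avrami(t, k, n):
    """Avrami expression: fraction reacted as a function of time."""
    return 1.0 - np.exp(-np.power(k * t, n))

# Synthetic "measured" cumulative heat output for a process with Q = 16.6 J.
Q_true, k_true, n_true = 16.6, 4.0e-6, 2.0
t = np.linspace(1e4, 8e5, 200)                            # time / s
q = Q_true * avrami(t, k_true, n_true)
q += np.random.default_rng(0).normal(0.0, 0.05, t.size)   # measurement noise / J

alpha = q / Q_true                                         # fraction reacted
(k_fit, n_fit), _ = curve_fit(avrami, t, alpha, p0=(1e-6, 1.5),
                              bounds=([1e-8, 0.5], [1e-3, 4.0]))
print(f"recovered k = {k_fit:.2e} s^-1, Avrami exponent n = {n_fit:.2f}")
```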