Abstract:
Information overload has become a serious issue for web users. Personalisation can provide effective solutions to overcome this problem, and recommender systems are one popular personalisation tool that helps users deal with it. As the basis of personalisation, the accuracy and efficiency of web user profiling greatly affects the performance of recommender systems and other personalisation systems. In Web 2.0, emerging user information provides new possible ways to profile users. Folksonomy, or tag information, is a typical kind of Web 2.0 information. Folksonomy implies users' topic interests and opinions, and it has become another important source of user information for profiling users and making recommendations. However, since tags are arbitrary words given by users, folksonomy contains a lot of noise, such as tag synonyms, semantic ambiguities and personal tags. Such noise makes it difficult to profile users accurately or to make quality recommendations. This thesis investigates the distinctive features and multiple relationships of folksonomy and explores novel approaches to solve the tag quality problem and profile users accurately. Harvesting the wisdom of crowds and of experts, three new user profiling approaches are proposed: a folksonomy based user profiling approach, a taxonomy based user profiling approach, and a hybrid user profiling approach based on both folksonomy and taxonomy. The proposed user profiling approaches are applied to recommender systems to improve their performance. Based on the generated user profiles, user based and item based collaborative filtering approaches, combined with content filtering methods, are proposed to make recommendations. The proposed user profiling and recommendation approaches have been evaluated through extensive experiments. The effectiveness evaluation experiments were conducted on two real-world datasets collected from the Amazon.com and CiteULike websites. The experimental results demonstrate that the proposed user profiling and recommendation approaches outperform related state-of-the-art approaches. In addition, this thesis proposes a parallel, scalable user profiling implementation based on advanced cloud computing techniques such as Hadoop, MapReduce and Cascading. The scalability evaluation experiments were conducted on a large-scale dataset collected from the Del.icio.us website. This thesis contributes to the effective use of the wisdom of crowds and experts to help users overcome information overload by providing more accurate, effective and efficient user profiling and recommendation approaches. It also contributes to better use of taxonomy information provided by experts and folksonomy information contributed by users in Web 2.0.
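The abstract does not spell out the profile representation. As a rough illustration only, a folksonomy based user profile is often represented as a weighted tag vector, with users compared by cosine similarity as the basis for user based collaborative filtering. The minimal sketch below assumes that representation; all function names, the weighting choice and the sample data are illustrative, not the thesis's actual method.

```python
# Minimal sketch (not the thesis's actual method): a folksonomy based user
# profile represented as a normalised weighted tag vector, and two users
# compared by cosine similarity. All names and sample data are illustrative.
from collections import Counter
import math

def build_profile(tag_assignments):
    """tag_assignments: list of tags one user has applied across all items."""
    counts = Counter(tag_assignments)
    total = sum(counts.values())
    return {tag: n / total for tag, n in counts.items()}

def cosine_similarity(p, q):
    dot = sum(p[t] * q[t] for t in set(p) & set(q))
    norm_p = math.sqrt(sum(w * w for w in p.values()))
    norm_q = math.sqrt(sum(w * w for w in q.values()))
    return dot / (norm_p * norm_q) if norm_p and norm_q else 0.0

alice = build_profile(["python", "recommender", "python", "web2.0"])
bob = build_profile(["python", "folksonomy", "recommender"])
print(cosine_similarity(alice, bob))  # similarity drives user based CF
```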
Abstract:
Item folksonomy, or tag information, is a typical and prevalent kind of Web 2.0 information. Item folksonomy contains rich user opinion information about item classification and description, and it can be used as another important information source for opinion mining. On the other hand, each item is associated with taxonomy information that reflects the viewpoints of experts. In this paper, we propose to mine users' opinions on items based on the item taxonomy developed by experts and the folksonomy contributed by users. In addition, we explore how to make personalized item recommendations based on users' opinions. Experiments conducted on real-world datasets collected from Amazon.com and CiteULike demonstrate the effectiveness of the proposed approaches.
Abstract:
The large volume of emerging user-created information in Web 2.0, such as tags, reviews, comments and blogs, can be used to profile users' interests and preferences and to make personalized recommendations. To solve the scalability problem of current user profiling and recommender systems, this paper proposes a parallel user profiling approach and a scalable recommender system. Advanced cloud computing techniques, including Hadoop, MapReduce and Cascading, are employed to implement the proposed approaches. The experiments were conducted on Amazon EC2 Elastic MapReduce and S3 with a real-world, large-scale dataset from the Del.icio.us website.
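The paper's implementation runs on Hadoop, MapReduce and Cascading; as a conceptual illustration only, the sketch below simulates the map and reduce phases of a tag-counting user profiling job in plain Python, with no Hadoop dependency. The record layout, function names and sample data are assumptions made for illustration, not the paper's code.

```python
# Pure-Python sketch of the map/reduce dataflow for parallel user profiling:
# the mapper emits ((user, tag), 1) pairs from raw bookmark records and the
# reducer sums the counts per key. In a real deployment these functions would
# run under Hadoop/MapReduce; this sketch only illustrates the dataflow.
from collections import defaultdict

def mapper(record):
    """record: (user, url, tags) tuple from a Del.icio.us-style dataset."""
    user, _url, tags = record
    for tag in tags:
        yield (user, tag), 1

def reducer(key, values):
    yield key, sum(values)

def run_mapreduce(records):
    shuffled = defaultdict(list)
    for record in records:                      # map phase
        for key, value in mapper(record):
            shuffled[key].append(value)         # shuffle/sort phase
    return dict(kv for key, values in shuffled.items()
                for kv in reducer(key, values))  # reduce phase

data = [("u1", "http://a", ["python", "web"]),
        ("u1", "http://b", ["python"]),
        ("u2", "http://a", ["web"])]
print(run_mapreduce(data))  # {('u1', 'python'): 2, ('u1', 'web'): 1, ('u2', 'web'): 1}
```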
Abstract:
Social tags in Web 2.0 are becoming another important information source for describing the content of items and for profiling users' topic preferences. However, as arbitrary words given by users, tags contain a lot of noise, such as tag synonyms, semantic ambiguity, and a large number of personal tags used by only one user, which makes it challenging to use tags effectively for item recommendation. To solve these problems, this paper proposes to use a set of related tags, along with their weights, to represent the semantic meaning of each tag for each user individually. Hybrid recommendation generation approaches based on the weighted tags are also proposed. We have conducted experiments using a real-world dataset obtained from Amazon.com. The experimental results show that the proposed approaches outperform other state-of-the-art approaches.
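As a rough sketch of the weighted-tag idea, a user's raw tag can be expanded into the set of tags that co-occur with it in that user's own annotations, weighted by co-occurrence frequency, and candidate items can then be scored against the expanded profile. The expansion and scoring formulas below are illustrative assumptions, not the paper's exact definitions.

```python
# Illustrative sketch: expand one tag, per user, into related tags weighted by
# co-occurrence in that user's own tagging history, then score an item by how
# much of its tag set is covered by the expanded profile.
from collections import Counter

def expand_tag(tag, user_annotations):
    """user_annotations: list of tag sets, one per item the user has tagged."""
    co = Counter()
    for tags in user_annotations:
        if tag in tags:
            co.update(t for t in tags if t != tag)
    total = sum(co.values()) or 1
    expanded = {t: n / total for t, n in co.items()}
    expanded[tag] = 1.0  # the tag itself keeps full weight
    return expanded

def score_item(expanded_profile, item_tags):
    return sum(expanded_profile.get(t, 0.0) for t in item_tags)

history = [{"ml", "python", "tutorial"}, {"ml", "statistics"}, {"web", "css"}]
profile = expand_tag("ml", history)
print(score_item(profile, {"python", "statistics"}))
```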
Abstract:
It is a big challenge to clearly identify the boundary between positive and negative streams for information filtering systems. Several attempts have used negative feedback to address this challenge; however, there are two issues in using negative relevance feedback to improve the effectiveness of information filtering. The first is how to select constructive negative samples in order to reduce the space of negative documents. The second is how to decide which noisy extracted features should be updated based on the selected negative samples. This paper proposes a pattern-mining-based approach to select offenders from the negative documents, where an offender can be used to reduce the side effects of noisy features. It also classifies extracted features (i.e., terms) into three categories: positive specific terms, general terms, and negative specific terms. In this way, multiple revising strategies can be used to update the extracted features. An iterative learning algorithm is also proposed to implement this approach on the RCV1 data collection, and extensive experiments show that the proposed approach achieves encouraging performance, which is also consistent for adaptive filtering.
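To illustrate the three-way split of extracted terms, the sketch below categorises terms simply by whether they occur in the positive documents, in the selected negative "offender" documents, or in both. This is a simplified stand-in for the paper's pattern-mining-based selection and revising strategies; all names and sample data are illustrative.

```python
# Simplified categorisation of extracted terms into the three groups named in
# the paper (positive specific, general, negative specific), based only on
# term presence in positive documents versus selected "offender" documents.
def categorise_terms(terms, positive_docs, offender_docs):
    pos_vocab = {t for doc in positive_docs for t in doc}
    neg_vocab = {t for doc in offender_docs for t in doc}
    categories = {"positive_specific": set(), "general": set(), "negative_specific": set()}
    for term in terms:
        in_pos, in_neg = term in pos_vocab, term in neg_vocab
        if in_pos and not in_neg:
            categories["positive_specific"].add(term)
        elif in_pos and in_neg:
            categories["general"].add(term)
        elif in_neg:
            categories["negative_specific"].add(term)
    return categories

print(categorise_terms({"stock", "market", "football"},
                       positive_docs=[["stock", "market"]],
                       offender_docs=[["football", "market"]]))
```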
Abstract:
1. A diverse array of patterns has been reported regarding the spatial extent of population genetic structure and effective dispersal in freshwater macroinvertebrates. In river systems, the movements of many taxa can be restricted to varying degrees by the natural stream channel hierarchy.
2. In this study, we sampled populations of the non-biting freshwater midge Echinocladius martini in the Paluma bioregion of tropical northeast Queensland to investigate fine-scale patterns of within- and among-stream dispersal and gene flow within a purported historical refuge. We amplified a 639 bp fragment of mitochondrial COI and analysed genetic structure using pairwise ΦST, hierarchical AMOVA, Mantel tests and a parsimony network. Genetic variation was partitioned among stream sections using Streamtree to investigate the effect of potential instream dispersal barriers.
3. The data revealed strong natal site fidelity and significant differentiation among neighbouring, geographically proximate streams. We found evidence for only episodic adult flight among sites on separate stream reaches. Overall, however, our data suggested that both larval and adult dispersal were largely limited to within a stream channel.
4. This may arise from a combination of the high density of riparian vegetation physically restricting dispersal and the joint effects of habitat stability and large population sizes. Together these may mitigate the requirement for movement among streams to avoid inbreeding and local extinction due to habitat change, and may thus enable persistence of upstream populations in the absence of regular compensatory upstream flight. Taken together, these data suggest that dispersal of E. martini is highly restricted, to the scale of only a few kilometres, and hence occurs predominantly within the natal stream.
Abstract:
While the importance of literature studies in the IS discipline is well recognized, little attention has been paid to the underlying structure and method of conducting effective literature reviews. Despite the fact that literature is often used to refine the research context and direct the pathways towards successful research outcomes, there is very little evidence of the use of resource management tools to support the literature review process. In this paper we aim to advance the way in which literature studies in Information Systems are conducted, by proposing a systematic, pre-defined and tool-supported method to extract, analyse and report literature. The paper presents how best to identify relevant IS papers to review within a feasible and justifiable scope, how to extract relevant content from the identified papers, how to synthesise and analyse the findings of a literature review, and ways to effectively write up and present the results. It is specifically targeted at novice IS researchers who seek to conduct a systematic, detailed literature review in a focused domain. Specific contributions of our method are extensive tool support, the identification of appropriate papers (including primary and secondary paper sets) and a pre-codification scheme. We use a literature study on shared services as an illustrative example to present the proposed approach.
Abstract:
The elastic task model, a significant development in the scheduling of real-time control tasks, provides a mechanism for flexible workload management in uncertain environments. It specifies how to adjust control periods to satisfy workload constraints. However, it is not directly linked to quality-of-control (QoC) management, the ultimate goal of a control system, and as a result it does not indicate how to make the best use of system resources to maximize the QoC improvement. To fill this gap, a new feedback scheduling framework, which we refer to as QoC elastic scheduling, is developed in this paper for real-time process control systems. It addresses QoC directly by embedding both QoC management and workload adaptation into a constrained optimization problem. The resulting solution for period adjustment is a closed form expressed in terms of QoC measurements, enabling closed-loop feedback of the QoC to the task scheduler. Whenever the QoC elastic scheduler is activated, it improves the QoC as much as possible while still meeting the system constraints. Examples are given to demonstrate the effectiveness of QoC elastic scheduling.
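For background, the classic elastic task model that QoC elastic scheduling extends compresses each task's utilization in proportion to its elasticity whenever the total workload exceeds the desired bound. The sketch below implements only that classic compression step (ignoring period upper bounds); it is not the paper's QoC-driven closed-form solution, and the task set is illustrative.

```python
# Background sketch of classic elastic task compression (not the paper's
# QoC-driven closed form): when total utilization exceeds the desired bound
# U_d, each task's utilization is reduced in proportion to its elasticity:
#   U_i = U_i_nominal - (sum(U_nominal) - U_d) * E_i / sum(E)
def elastic_compress(tasks, u_desired):
    """tasks: list of dicts with 'wcet', 'period', 'elasticity'."""
    u_nominal = [t["wcet"] / t["period"] for t in tasks]
    overload = sum(u_nominal) - u_desired
    if overload <= 0:
        return [t["period"] for t in tasks]      # already within the bound
    e_sum = sum(t["elasticity"] for t in tasks)
    new_periods = []
    for t, u in zip(tasks, u_nominal):
        u_new = u - overload * t["elasticity"] / e_sum
        new_periods.append(t["wcet"] / u_new)    # longer period = lower load
    return new_periods

tasks = [{"wcet": 2, "period": 10, "elasticity": 1.0},
         {"wcet": 3, "period": 10, "elasticity": 2.0}]
print(elastic_compress(tasks, u_desired=0.4))    # e.g. [12.0, ~12.86]
```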
Abstract:
Tony Fitzgerald’s visionary leap was to see beyond localised, individual wrongdoing. He suggested remedies that were systemic, institutionalised, and directed at the underlying structural problems that led to corruption. His report said ‘the problems with which this Inquiry is concerned are not merely associated with individuals, but are institutionalized and related to attitudes which have become entrenched’ (Fitzgerald Report 1989, 13). His response was to suggest an enmeshed system of measures that would not only respond reactively to future corruption but also prevent its recurrence through improved integrity systems. In the two decades since that report, the primary focus of corruption studies and anti-corruption activism has remained on corruption at the local level or within sovereign states. International activism was largely directed at co-ordinating national campaigns and at using international instruments to make these campaigns more effective domestically. This reflects the broader fact that, since the rise of the nation state, states have comprised the majority of the largest institutional actors and have been the most significant institution in the lives of most individuals. This has made states the ‘main game in town’ for the ‘governance disciplines’ of ethics, law, political science and economics.
Abstract:
In silico experimental modeling of cancer involves combining findings from the biological literature with computer-based models of biological systems in order to investigate hypotheses entirely in the computer laboratory. In this paper, we discuss the use of in silico modeling as a precursor to traditional clinical and laboratory research, allowing researchers to refine their experimental programs with the aim of reducing costs and increasing research efficiency. We explain the methodology of in silico experimental trials before providing an example of in silico modeling from the biomathematical literature, with a view to promoting more widespread use and understanding of this research strategy.
Abstract:
Privacy has become one of the main impediments to e-health in its advancement towards providing better services to its consumers. Even though many security protocols are being developed to protect information from being compromised, privacy remains a major issue in healthcare, where privacy protection is very important. When consumers are confident that their sensitive information is safe from being compromised, their trust in these services will be higher, leading to better adoption of these systems. In this paper we propose a solution to the problem of patient privacy in e-health through an information accountability framework, which could enhance consumer trust in e-health services and thereby lead to their success.
Abstract:
Sandwich components have emerged as lightweight, efficient, economical, recyclable and reusable building systems that provide an alternative to both stiffened steel and reinforced concrete. These components are made of composite materials in which two metal face plates or Glassfibre Reinforced Cement (GRC) layers are bonded to a lightweight compact polyurethane (PU) elastomer core to form a sandwich. Existing product applications include lightweight sandwich panels for walls and roofs, the Sandwich Plate System (SPS) for stadia, arena terraces, naval construction and bridges, and Domeshell structures for dome-type buildings. Limited research has been conducted to investigate the performance characteristics and applicability of sandwich or hybrid materials as structural flooring systems. The performance characteristics of Hybrid Floor Plate Systems comprising GRC, PU and steel have not been adequately investigated and quantified, so there is very little knowledge and design guidance for their application in commercial and residential buildings. This research investigates the performance characteristics of steel, PU and GRC in Hybrid Floor Plate Systems (HFPS) and develops a new floor system with appropriate design guidelines.
Abstract:
This paper presents an approach to predicting the operating conditions of a machine based on classification and regression trees (CART) and an adaptive neuro-fuzzy inference system (ANFIS), in association with the direct prediction strategy for multi-step-ahead time series prediction. In this study, the number of available observations and the number of predicted steps are first determined using the false nearest neighbour method and the auto mutual information technique, respectively. These values are then used as inputs to the prediction models to forecast future values of the machine's operating conditions. The performance of the proposed approach is evaluated using real trending data from a low-methane compressor. A comparative study of the predicted results obtained from the CART and ANFIS models is also carried out to appraise the prediction capability of these models. The results show that the ANFIS prediction model can track changes in machine conditions and has the potential to be used as a tool for machine fault prognosis.
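As an illustration of the direct prediction strategy with CART, the sketch below trains one regression tree per horizon step, each mapping the last d observations directly to the value h steps ahead (scikit-learn's DecisionTreeRegressor is a CART implementation). In the paper, d and the number of steps come from the false nearest neighbour and auto mutual information analyses; here they are fixed illustrative values and the series is synthetic.

```python
# Direct multi-step-ahead prediction sketch: one CART regressor per horizon
# step h, each trained to map the last d observations to the value h steps
# ahead. Embedding size d, horizon and data are illustrative only.
import numpy as np
from sklearn.tree import DecisionTreeRegressor  # CART implementation

def direct_multistep_fit(series, d=5, horizon=3):
    models = []
    for h in range(1, horizon + 1):
        X = [series[i:i + d] for i in range(len(series) - d - h + 1)]
        y = [series[i + d + h - 1] for i in range(len(series) - d - h + 1)]
        models.append(DecisionTreeRegressor(max_depth=4).fit(X, y))
    return models

def direct_multistep_predict(models, last_window):
    return [m.predict([last_window])[0] for m in models]

series = np.sin(np.linspace(0, 20, 200))         # stand-in for trending data
models = direct_multistep_fit(series, d=5, horizon=3)
print(direct_multistep_predict(models, series[-5:]))
```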
Abstract:
Growing evidence suggests that a novel member of the order Chlamydiales, Waddlia chondrophila, is a potential agent of miscarriage in humans and abortion in ruminants. Due to the lack of genetic tools for manipulating chlamydiae, genomic analysis is proving to be the most incisive tool for stimulating investigations into the biology of these obligate intracellular bacteria. 454/Roche and Solexa/Illumina technologies were thus used to sequence and assemble de novo the full genome of the first representative of the Waddliaceae family, W. chondrophila. The bacterium possesses a 2,116,312 bp chromosome and a 15,593 bp low-copy-number plasmid that might integrate into the bacterial chromosome. The Waddlia genome displays numerous repeated sequences, indicating genome dynamics different from those of classical chlamydiae, which almost completely lack repetitive elements. Moreover, W. chondrophila exhibits many virulence factors also present in classical chlamydiae, including a functional type III secretion system, as well as a large complement of specific factors for resistance to host or environmental stresses. Large families of outer membrane proteins were identified, indicating that these highly immunogenic proteins are not Chlamydiaceae-specific and might have been present in their last common ancestor. Enhanced metabolic capability for the synthesis of nucleotides, amino acids, lipids and other cofactors suggests that the common ancestor of the modern Chlamydiales may have been less dependent on its eukaryotic host. The fine-detailed analysis of biosynthetic pathways brings us closer to possibly developing a synthetic medium to grow W. chondrophila, a critical step in the development of genetic tools. As a whole, the availability of the W. chondrophila genome opens new possibilities in Chlamydiales research, providing new insights into the evolution of members of the order Chlamydiales and the biology of the Waddliaceae.