Abstract:
Recommender systems are a recent response to the ever-growing information overload that accompanies the selection of goods and services in a global economy. Collaborative Filtering (CF) is one of the most popular techniques in recommender systems. CF recommends items to a target user based on the preferences of a set of similar users, known as neighbours, generated from a database of past users' preferences. With sufficient background information on item ratings its performance is promising, but research shows that it performs very poorly in a cold-start situation, where there is not enough previous rating data. As an alternative to ratings, trust between users can be used to choose neighbours for making recommendations. Better recommendations can be achieved using an inferred trust network, which mimics real-world "friend of a friend" recommendations. To extend the boundaries of the neighbourhood, an effective trust inference technique is required. This thesis proposes a trust inference technique called Directed Series Parallel Graph (DSPG), which performs better than other popular trust inference algorithms such as TidalTrust and MoleTrust. Another problem is that reliable explicit trust data is not always available. In real life, people trust "word of mouth" recommendations made by people with similar interests, and this is often assumed in recommender systems. Through a survey, we confirm that interest similarity has a positive relationship with trust, and that this relationship can be used to generate a trust network for recommendation. In this research, we also propose a new method, called SimTrust, for developing trust networks based on users' interest similarity in the absence of explicit trust data. To identify interest similarity, we use users' personalised tagging information. However, we are interested in which resources a user chooses to tag, rather than the text of the tags applied.
The commonalities among the resources tagged by users can be used to form the neighbourhoods used in the automated recommender system. Our experimental results show that our proposed tag-similarity-based method outperforms the traditional collaborative filtering approach, which usually uses rating data.
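The "commonalities of the resources being tagged" idea can be illustrated with a small sketch. This is not the thesis's SimTrust algorithm itself, only a plausible reading of it: users are compared by the Jaccard overlap of the sets of resources they chose to tag, and ranked to form a neighbourhood. The user names and tagging profiles are hypothetical.

```python
def tag_similarity(resources_a, resources_b):
    """Jaccard overlap of the sets of resources two users have tagged."""
    a, b = set(resources_a), set(resources_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical tagging profiles: user -> resources they chose to tag.
profiles = {
    "alice": ["r1", "r2", "r3"],
    "bob":   ["r2", "r3", "r4"],
    "carol": ["r9"],
}

# Rank candidate neighbours for "alice" by overlap of tagged resources.
neighbours = sorted(
    (u for u in profiles if u != "alice"),
    key=lambda u: tag_similarity(profiles["alice"], profiles[u]),
    reverse=True,
)
print(neighbours)  # bob shares two resources with alice; carol shares none
```

Note that only resource identity is compared, matching the abstract's point that which resources are tagged matters more than the tag text itself.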
Abstract:
Endocrinopathic laminitis is frequently associated with hyperinsulinaemia, but the role of glucose in the pathogenesis of the disease has not been fully investigated. This study aimed to determine the endogenous insulin response, over 48 h in insulin-sensitive Standardbred racehorses, to a quantity of glucose equivalent to that administered during a laminitis-inducing euglycaemic hyperinsulinaemic clamp. In addition, the study investigated whether glucose infusion, in the absence of exogenous insulin administration, would result in the development of clinical and histopathological evidence of laminitis. Glucose (50% dextrose) was infused intravenously at a rate of 0.68 mL/kg/h for 48 h in treated horses (n = 4), while control horses (n = 3) received a balanced electrolyte solution (0.68 mL/kg/h). Lamellar histology was examined at the conclusion of the experiment. Horses in the treatment group were insulin sensitive (M value 0.039 ± 0.0012 mmol/kg/min and M-to-I ratio (100×) 0.014 ± 0.002), as determined by an approximated hyperglycaemic clamp. Treated horses developed glycosuria, hyperglycaemia (10.7 ± 0.78 mmol/L) and hyperinsulinaemia (208 ± 26.1 μIU/mL), whereas control horses did not. None of the horses became lame as a consequence of the experiment, but all of the treated horses developed histopathological evidence of laminitis in at least one foot. Combined with earlier studies, the results showed that laminitis may be induced by either insulin alone or a combination of insulin and glucose, but that it is unlikely to be due to a glucose overload mechanism. Based on the histopathological data, the potential threshold for insulin toxicity (i.e. laminitis) in horses may be at or below a serum concentration of ∼200 μIU/mL.
Abstract:
With the advent of large-scale wind farms and their integration into electrical grids, more uncertainties, constraints and objectives must be considered in power system development. It is therefore necessary to introduce risk-control strategies into the planning of transmission systems connected with wind power generators. This paper presents a probability-based multi-objective model equipped with three risk-control strategies. The model is developed to evaluate and enhance the ability of the transmission system to protect against overload risks when wind power is integrated into the power system. The model involves: (i) defining the uncertainties associated with wind power generators with probability measures and calculating the probabilistic power flow with the combined use of cumulants and Gram-Charlier series; (ii) developing three risk-control strategies by specifying the smallest acceptable non-overload probability for each branch and the whole system, and specifying the non-overload margin for all branches in the whole system; (iii) formulating an overload risk index based on the non-overload probability and the non-overload margin defined; and (iv) developing a multi-objective transmission system expansion planning (TSEP) model with objective functions composed of transmission investment and the overload risk index. The presented work represents a superior risk-control model for TSEP in terms of security, reliability and economy. The transmission expansion planning model with the three risk-control strategies demonstrates its feasibility in a case study using two typical power systems.
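The abstract names the ingredients of the overload risk index, a non-overload probability and a non-overload margin per branch, but not its functional form. The sketch below is therefore only an illustrative combination of the two quantities; the threshold `p_min` and the shortfall-times-margin weighting are assumptions, not the paper's formulation.

```python
def overload_risk_index(branches, p_min=0.95):
    """Combine per-branch non-overload probability and margin into one index.

    branches: list of (non_overload_probability, non_overload_margin) pairs,
    where margin = (capacity - expected_flow) / capacity.  Both the
    functional form and p_min are illustrative assumptions.
    """
    risk = 0.0
    for p, margin in branches:
        # Penalise branches whose non-overload probability falls below the
        # smallest acceptable level, weighted by how thin their margin is.
        shortfall = max(0.0, p_min - p)
        risk += shortfall * (1.0 - margin)
    return risk

# Two branches: one comfortably safe, one risky with a thin margin.
print(overload_risk_index([(0.99, 0.4), (0.90, 0.1)]))
```

In the paper's multi-objective TSEP model, an index of this kind would be traded off against transmission investment cost.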
Abstract:
The effects of increased training (IT) load on plasma concentrations of lipopolysaccharides (LPS), proinflammatory cytokines, and anti-LPS antibodies during exercise in the heat were investigated in 18 male runners, who performed 14 days of normal training (NT) or 14 days of a 20% increased training load in 2 equal groups. Before (trial 1) and after (trial 2) the training intervention, all subjects ran at 70% of maximum oxygen uptake on a treadmill under hot (35 °C) and humid (∼40%) conditions, until core temperature reached 39.5 °C or volitional exhaustion. Venous blood samples were drawn before, after, and 1.5 h after exercise. Plasma LPS concentration after exercise increased by 71% (trial 1, p < 0.05) and 21% (trial 2) in the NT group, and by 92% (trial 1, p < 0.01) and 199% (trial 2, p < 0.01) in the IT group. Postintervention plasma LPS concentration was 35% lower before exercise (p < 0.05) and 47% lower during recovery (p < 0.01) in the IT than in the NT group. Anti-LPS IgM concentration during recovery was 35% lower in the IT than in the NT group (p < 0.05). Plasma interleukin (IL)-6 and tumor necrosis factor (TNF)-alpha concentrations after exercise (IL-6, 3-7 times, p < 0.01, and TNF-alpha, 33%, p < 0.01) and during recovery (IL-6, 2-4 times, p < 0.05, and TNF-alpha, 30%, p < 0.01) were higher than at rest within each group. These data suggest that a short-term tolerable increase in training load may protect against developing endotoxemia during exercise in the heat.
Abstract:
First year Property Economics students enrolled in the Bachelor of Urban Development at QUT are required to undertake a number of compulsory subjects alongside students from other disciplines. One such common unit is ‘Stewardship of Land’, an interdisciplinary unit that introduces students to the characteristics of land and land tenure, with a focus on land use and property rights. It covers a range of issues including native title, land contamination, heritage values, alternative uses, the property development process, the impact of environmental and social factors, and the management of land, both urban and regional. Teaching such diverse content to a diverse audience has in previous years proved difficult, from the perspectives of relevance, engagement and content overload. In 2011 a project was undertaken to redevelop this unit around ‘threshold concepts’: concepts that are “transformative, probably irreversible, integrative, often troublesome and probably bounded” (Meyer & Land, 2003). This project involved the development of a new set of underlying concepts students should draw from the unit, the application of these to the unit curriculum, and a survey of the student response to these changes. This paper reports on the threshold concepts developed for this unit, the changes they made to the unit curriculum, and preliminary survey responses. Recommendations for other educators seeking to incorporate threshold concepts into their curricula are provided.
Abstract:
Topic recommendation can help users deal with the information overload issue in micro-blogging communities. This paper proposes to use the implicit information network formed by the multiple relationships among users, topics and micro-blogs, together with the temporal information of micro-blogs, to find semantically and temporally relevant topics for each topic, and to profile users' time-drifting topic interests. Content-based, nearest-neighborhood-based and matrix factorization models are used to make personalized recommendations. The effectiveness of the proposed approaches is demonstrated in experiments conducted on a real-world dataset collected from Twitter.com.
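As a rough illustration of the matrix factorization model mentioned above, here is a minimal stochastic-gradient factorization over (user, topic, strength) triples. The data, rank and hyperparameters are all illustrative, not taken from the paper, and the paper's model additionally incorporates temporal information that this sketch omits.

```python
import random

def factorize(ratings, n_users, n_items, k=2, steps=2000, lr=0.01, reg=0.02):
    """Plain SGD matrix factorization on (user, item, rating) triples."""
    random.seed(0)
    P = [[random.random() for _ in range(k)] for _ in range(n_users)]
    Q = [[random.random() for _ in range(k)] for _ in range(n_items)]
    for _ in range(steps):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            err = r - pred
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                # Gradient step with L2 regularisation on both factors.
                P[u][f] += lr * (err * qi - reg * pu)
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q

# Hypothetical user-topic interest strengths: (user, topic, strength).
data = [(0, 0, 5), (0, 1, 1), (1, 0, 4), (1, 1, 1), (2, 0, 1), (2, 1, 5)]
P, Q = factorize(data, n_users=3, n_items=2)
score = sum(P[2][f] * Q[0][f] for f in range(2))
print(round(score, 1))  # user 2's predicted interest in topic 0 (low)
```

Recommendations would then rank unseen topics for a user by the predicted score P[u]·Q[i].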
Abstract:
With the explosion of Web 2.0 applications such as blogs, social and professional networks, and various other types of social media, rich online information and various new sources of knowledge flood users and hence pose a great challenge in terms of information overload. It is critical to use intelligent agent software systems to assist users in finding the right information from an abundance of Web data. Recommender systems can help users deal with the information overload problem efficiently by suggesting items (e.g., information and products) that match users' personal interests. Recommender technology has been successfully employed in many applications such as recommending films, music and books. The purpose of this report is to give an overview of existing technologies for building personalized recommender systems in a social networking environment, and to propose a research direction for addressing the user profiling and cold-start problems by exploiting user-generated content newly available in Web 2.0.
Abstract:
The IEEE Wireless LAN standard has been a true success story by enabling convenient, efficient and low-cost access to broadband networks for both private and professional use. However, the increasing density and uncoordinated operation of wireless access points, combined with constantly growing traffic demands have started hurting the users' quality of experience. On the other hand, the emerging ubiquity of wireless access has placed it at the center of attention for network attacks, which not only raises users' concerns on security but also indirectly affects connection quality due to proactive measures against security attacks. In this work, we introduce an integrated solution to congestion avoidance and attack mitigation problems through cooperation among wireless access points. The proposed solution implements a Partially Observable Markov Decision Process (POMDP) as an intelligent distributed control system. By successfully differentiating resource hampering attacks from overload cases, the control system takes an appropriate action in each detected anomaly case without disturbing the quality of service for end users. The proposed solution is fully implemented on a small-scale testbed, on which we present our observations and demonstrate the effectiveness of the system to detect and alleviate both attack and congestion situations.
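The core of a POMDP-based controller like the one described is a belief update over hidden states. The sketch below tracks a belief over "normal", "overload" and "attack" states with a discrete Bayes filter; the state set, transition model and observation model are invented for illustration and are not the paper's parameters, and a full POMDP controller would additionally choose actions against this belief.

```python
def belief_update(belief, obs, T, O):
    """One step of a discrete POMDP belief update (Bayes filter)."""
    states = list(belief)
    # Predict: propagate the belief through the transition model.
    predicted = {
        s2: sum(belief[s1] * T[s1][s2] for s1 in states) for s2 in states
    }
    # Correct: weight by the observation likelihood, then normalise.
    unnorm = {s: predicted[s] * O[s][obs] for s in states}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

states = ["normal", "overload", "attack"]
# Hypothetical models; the real controller's parameters are not in the paper.
T = {s: {s2: (0.8 if s == s2 else 0.1) for s2 in states} for s in states}
O = {  # P(observation | state) for a coarse "traffic spike" sensor
    "normal":   {"spike": 0.1, "quiet": 0.9},
    "overload": {"spike": 0.7, "quiet": 0.3},
    "attack":   {"spike": 0.9, "quiet": 0.1},
}
belief = {"normal": 0.8, "overload": 0.1, "attack": 0.1}
for _ in range(3):  # three consecutive traffic spikes observed
    belief = belief_update(belief, "spike", T, O)
print(max(belief, key=belief.get))  # prints "attack"
```

Differentiating attacks from overload, as the paper does, amounts to the belief mass separating between those two hidden states as evidence accumulates.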
Abstract:
News blog hot topics are important for information recommendation services and marketing. However, information overload and personalized management make information arrangement more difficult. Moreover, what influences the formation and development of blog hot topics has seldom received attention. In order to correctly detect news blog hot topics, the paper first analyzes the development of topics from a new perspective based on the W2T (Wisdom Web of Things) methodology: the characteristics of blog users, the context of topic propagation and information granularity are unified to analyze the related problems. Factors such as user behavior patterns, network opinion and opinion leaders are subsequently identified as important for the development of topics. A topic model based on the view of event reports is then constructed. Finally, hot topics are identified by their duration, topic novelty, degree of topic growth and degree of user attention. The experimental results show that the proposed method is feasible and effective.
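The final step, identifying hot topics by duration, novelty, growth and attention, could be sketched as a simple weighted combination of those four indicators. The weights, the weighted-sum form and the example values below are assumptions for illustration; the paper's actual identification criteria may differ.

```python
def hot_topic_score(topic, weights=(0.2, 0.3, 0.3, 0.2)):
    """Weighted combination of the four indicators named in the abstract.

    topic: dict with normalised [0, 1] values for duration, novelty,
    growth and attention.  The weights are illustrative assumptions.
    """
    w_d, w_n, w_g, w_a = weights
    return (w_d * topic["duration"] + w_n * topic["novelty"]
            + w_g * topic["growth"] + w_a * topic["attention"])

# Hypothetical candidate topics with normalised indicator values.
topics = {
    "election":  {"duration": 0.9, "novelty": 0.2, "growth": 0.3, "attention": 0.8},
    "data leak": {"duration": 0.3, "novelty": 0.9, "growth": 0.9, "attention": 0.7},
}
hottest = max(topics, key=lambda t: hot_topic_score(topics[t]))
print(hottest)  # the fast-growing, novel topic wins under these weights
```

A threshold on this score would then separate hot topics from the rest.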
Abstract:
Self-reported driving behaviour in the occupational driving context has typically been measured through scales adapted from the general driving population (i.e. the Manchester Driver Behaviour Questionnaire (DBQ)). However, research suggests that occupational driving is influenced by unique factors operating within the workplace environment, and thus a behavioural scale should reflect the behaviours prevalent and unique within that driving context. To overcome this limitation, the Occupational Driver Behaviour Questionnaire (ODBQ) was developed, which utilises a relevant theoretical model to assess the impact of the broader workplace context on driving behaviour. Although the theoretical argument has been established, research is yet to examine whether the ODBQ or the DBQ is the more sensitive measure of the workplace context. As such, this paper identifies selected organisational factors (i.e. safety climate and role overload) as predictors of the DBQ and the ODBQ and compares their relative predictive value in both models. In undertaking this task, 248 occupational drivers were recruited from a community-oriented nursing population. As predicted, hierarchical regression analyses revealed that the organisational factors accounted for a significantly greater proportion of variance in the ODBQ than in the DBQ. These findings offer a number of practical and theoretical applications for occupational driving practice and future research.
Abstract:
With the explosive growth of resources available through the Internet, information mismatching and overload have become a severe concern to users. Web users are commonly overwhelmed by huge volume of information and are faced with the challenge of finding the most relevant and reliable information in a timely manner. Personalised information gathering and recommender systems represent state-of-the-art tools for efficient selection of the most relevant and reliable information resources, and the interest in such systems has increased dramatically over the last few years. However, web personalization has not yet been well-exploited; difficulties arise while selecting resources through recommender systems from a technological and social perspective. Aiming to promote high quality research in order to overcome these challenges, this paper provides a comprehensive survey on the recent work and achievements in the areas of personalised web information gathering and recommender systems. The report covers concept-based techniques exploited in personalised information gathering and recommender systems.
Abstract:
Currently, recommender systems (RS) are widely applied in many commercial e-commerce sites to help users deal with the information overload problem. Recommender systems provide personalized recommendations to users and thus help them make good decisions about which product to buy from the vast number of product choices available to them. Many current recommender systems are developed for simple and frequently purchased products like books and videos, using collaborative-filtering and content-based approaches. These approaches are not suitable for recommending luxurious and infrequently purchased products, as they rely on a large amount of ratings data that is not usually available for such products. This research aims to explore novel approaches for recommending infrequently purchased products by exploiting user-generated content such as user reviews and product click-stream data. From reviews on products given by previous users, association rules between product attributes are extracted using an association rule mining technique. Furthermore, from product click-stream data, user profiles are generated using the proposed user profiling approach. Two recommendation approaches are proposed based on the knowledge extracted from these resources. The first approach formulates a new query from the initial query given by the target user, expanding the query with the suitable association rules. In the second approach, a collaborative-filtering recommender system and search-based approaches are integrated within a hybrid system. In this hybrid system, user profiles are used to find the target user's neighbours, and the products subsequently viewed by them are then used to search for other relevant products. Experiments have been conducted on a real-world dataset collected from one of the online car sales companies in Australia to evaluate the effectiveness of the proposed recommendation approaches.
The experimental results show that user profiles generated from click-stream data and association rules generated from user reviews can improve recommendation accuracy. In addition, the results show that the proposed query expansion and the hybrid collaborative-filtering and search-based approaches perform better than the baseline approaches. Integrating the collaborative-filtering and search-based approaches has been challenging, as this strategy has not been widely explored so far, especially for recommending infrequently purchased products. This research therefore makes a theoretical contribution to the recommender systems field through a new technique for combining collaborative-filtering and search-based approaches, contributes a new query expansion technique for recommending infrequently purchased products, and makes a practical contribution through a prototype system for recommending cars.
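The association-rule step described above can be sketched with a minimal support/confidence miner over attribute co-occurrences. The attribute sets, thresholds and single-antecedent restriction are hypothetical simplifications; the thesis's actual mining technique may differ.

```python
from itertools import combinations

def mine_rules(transactions, min_support=0.4, min_confidence=0.7):
    """Mine single-antecedent rules A -> B by support and confidence."""
    n = len(transactions)
    counts = {}
    for t in transactions:
        items = set(t)
        for item in items:
            counts[frozenset([item])] = counts.get(frozenset([item]), 0) + 1
        for pair in combinations(sorted(items), 2):
            counts[frozenset(pair)] = counts.get(frozenset(pair), 0) + 1
    rules = []
    for pair, c in counts.items():
        if len(pair) != 2 or c / n < min_support:
            continue  # keep only frequent attribute pairs
        for a in pair:
            b = next(iter(pair - {a}))
            confidence = c / counts[frozenset([a])]
            if confidence >= min_confidence:
                rules.append((a, b, round(confidence, 2)))
    return rules

# Hypothetical product attributes mentioned together in user reviews.
reviews = [
    {"sunroof", "leather"},
    {"sunroof", "leather", "turbo"},
    {"leather", "turbo"},
    {"sunroof", "leather"},
]
print(sorted(mine_rules(reviews)))
```

In the query expansion approach, rules like these would extend a target user's initial query with the strongly associated attributes.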
Abstract:
Teachers of construction economics and estimating have long recognised that there is more to construction pricing than the detailed calculation of costs (to the contractor). We always reach the point where we have to say "of course, experience or familiarity with the market is very important and this needs judgement, intuition, etc". Quite how important this is in construction pricing is not known, and we tend to trivialise its effect. If judgement of the market has a minimal effect, little harm is done; but if it is really important, some quite serious consequences arise which go well beyond the teaching environment. Major areas of concern for the quantity surveyor are cost modelling and cost planning, neither of which pays any significant attention to the market effect. There are currently two schools of thought on the market-effect issue. The first is prepared to ignore possible effects until more is known; this may be called the pragmatic school. The second exists solely to criticise the first; we will call this the antagonistic school. Neither the pragmatic nor the antagonistic school seems particularly keen to resolve the issue one way or the other. The founder and leader of the antagonistic school is Brian Fine, whose 1974 paper is still the basic text on the subject, and in which he coined the term 'socially acceptable' price to describe what we now recognise as the market effect. Mr Fine's argument was then, and has been since, that the uncertainty surrounding the contractors' costing and cost-estimating process is such that it logically leads to a market-orientated pricing approach. Very little factual evidence, however, seems to be available to support these arguments in any conclusive manner.
A further, and more important, point for the pragmatic school is that, even if the market effect is as important as Mr Fine believes, there are no indications of how it can be measured, evaluated or predicted. Since 1974, evidence has been accumulating which tends to reinforce the antagonists' view. A review of the literature covering both contractors' and designers' estimates found many references to the use of value judgements in construction pricing (Ashworth & Skitmore, 1985), which supports the antagonistic view in implying the existence of uncertainty overload. The most convincing evidence emerged quite by accident in some research we recently completed with practising quantity surveyors on estimating accuracy (Skitmore, 1985). In addition to demonstrating that individual quantity surveyors and certain types of buildings had a significant effect on estimating accuracy, one surprise result was that only a very small amount of information was used by the most expert surveyors for relatively very accurate estimates. Only the type and size of building, it seemed, were really relevant in determining accuracy. More detailed information about the buildings' specification, and even a sight of the drawings, did not significantly improve their accuracy level. This seemed to offer clear evidence that the constructional aspects of the project were largely irrelevant and that the expert surveyors were somehow tuning in to the market price of the building. The obvious next step is to feed our expert surveyors with more relevant 'market' information in order to assess its effect. The problem with this is that our experts do not seem able to verbalise their requirements in this respect, a common occurrence in research of this nature. The lack of research into the nature of market effects on prices also means the literature provides little of benefit. Hence the need for this study.
It was felt that a clearer picture of the nature of construction markets would be obtained in an environment where free enterprise was a truly ideological force. For this reason, the United States of America was chosen for the next stage of our investigations. Several people were interviewed in an informal and unstructured manner to elicit their views on the action of market forces on construction prices. Although only a small number of people were involved, they were thought to be reasonably representative of knowledge in construction pricing, and they were very well able to articulate their views. Our initial reaction to the interviews was that our USA subjects held views very close to those held in the UK. However, detailed analysis revealed the existence of remarkably clear and consistent insights that would not have been obtained in the UK. Further evidence was also obtained from the literature relating to the subject, and some of the interviewees very kindly expanded on their views in later postal correspondence. We have now analysed all the evidence received and, although a great deal of it is anecdotal, we feel that our findings enable at least the basic nature of the subject to be understood, and that the factors and their interrelationships can now be examined more formally in relation to construction price levels. I must express my gratitude to the Royal Institution of Chartered Surveyors' Educational Trust and the University of Salford's Department of Civil Engineering for collectively funding this study. My sincere thanks also go to our American participants, who freely gave their time and valuable knowledge to us in our enquiries. Finally, I must record my thanks to Tim and Anne for their remarkable ability to produce an intelligible typescript from my unintelligible writing.
Abstract:
Changes in plasma zinc concentration and markers of immune function were examined in a group of male runners (n = 10) following a moderate increase in training over four weeks. Seven sedentary males acted as controls. Fasting blood samples were taken at rest, before (T0) and after (T4) four weeks of increased (+16%) training, and after two weeks of reduced (-31%) training (T6). Blood was analysed for plasma zinc concentration, differential leucocyte counts, lymphocyte subpopulations and lymphocyte proliferation using incorporation of 3H-thymidine. The runners increased their training volume by 16% over the four weeks. When compared with the nonathletes, the runners had lower concentrations of plasma zinc (p = 0.012), CD3+ (p = 0.042) and CD19+ lymphocytes (p = 0.010) over the four weeks. Lymphocyte proliferation in response to Concanavalin A stimulation was greater in the runners (p = 0.009). Plasma zinc concentration and immune markers remained constant during the study. Plasma zinc concentration correlated with total leucocyte counts in the athletes at T6 (r = -0.72, p < 0.05) and with Pokeweed mitogen stimulation in the nonathletes at T6 (r = -0.92, p < 0.05). Therefore, athletes are unlikely to benefit from zinc supplementation during periods of moderately increased training volume.
Abstract:
Conceptual modelling supports developers and users of information systems in areas of documentation, analysis or system redesign. The ongoing interest in the modelling of business processes has led to a variety of different grammars, raising the question of the quality of these grammars for modelling. An established way of evaluating the quality of a modelling grammar is by means of an ontological analysis, which can determine the extent to which grammars contain construct deficit, overload, excess or redundancy. While several studies have shown the relevance of most of these criteria, predictions about construct redundancy have yielded inconsistent results in the past, with some studies suggesting that redundancy may even be beneficial for modelling in practice. In this paper we seek to contribute to clarifying the concept of construct redundancy by introducing a revision to the ontological analysis method. Based on the concept of inheritance we propose an approach that distinguishes between specialized and distinct construct redundancy. We demonstrate the potential explanatory power of the revised method by reviewing and clarifying previous results found in the literature.