168 results for "Ranking de diversificação"


Relevance: 10.00%

Publisher:

Abstract:

Data processing for information extraction is of growing importance for Web databases. Due to the sheer size and volume of these databases, retrieving the information relevant to users’ needs has become a cumbersome process. Information seekers face information overload: too many results are returned for broad queries, while too few or none are returned for highly specific ones. This paper proposes a ranking algorithm that gives higher preference to a user’s current search and also utilizes profile information in order to obtain the relevant results for a user’s query.
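
The blended scoring idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's actual algorithm: the bag-of-words representation, the cosine measure and the `alpha` blending weight are all assumptions made for the example.

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_results(results, query, profile, alpha=0.7):
    """Score each result by a weighted blend of similarity to the
    current query (weight alpha) and to the user's profile terms."""
    q, p = Counter(query.split()), Counter(profile.split())
    scored = []
    for text in results:
        r = Counter(text.split())
        score = alpha * cosine(r, q) + (1 - alpha) * cosine(r, p)
        scored.append((score, text))
    return [t for _, t in sorted(scored, reverse=True)]
```

With `alpha` close to 1 the current search dominates; lowering it lets the stored profile re-order results that match the query equally well.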

Relevance: 10.00%

Publisher:

Abstract:

Handling information overload online is a big challenge from the user's point of view, especially when the number of websites is growing rapidly due to growth in e-commerce and other related activities. Personalization based on user needs is the key to solving the problem of information overload. Personalization methods help in identifying relevant information that a user may like. User profiles and object profiles are the important elements of a personalization system. When creating user and object profiles, most of the existing methods adopt two-dimensional similarity methods based on vector or matrix models in order to find inter-user and inter-object similarity. Moreover, for recommending similar objects to users, personalization systems use users-users, items-items and users-items similarity measures. In most cases, similarity measures such as Euclidean, Manhattan and cosine, and many others based on vector or matrix methods, are used to find the similarities. Web logs are high-dimensional datasets consisting of multiple users and multiple searches, each with many attributes. Two-dimensional data analysis methods may often overlook latent relationships that exist between users and items. In contrast to other studies, this thesis utilises tensors, which are high-dimensional data models, to build user and object profiles and to find the inter-relationships between users-users and users-items. To create an improved personalized Web system, this thesis proposes to build three types of profiles: individual user, group user and object profiles, utilising decomposition factors of tensor data models. A hybrid recommendation approach utilising group profiles (forming the basis of a collaborative filtering method) and object profiles (forming the basis of a content-based method) in conjunction with individual user profiles (forming the basis of a model-based approach) is proposed for making effective recommendations.
A tensor-based clustering method is proposed that utilises the outcomes of popular tensor decomposition techniques such as PARAFAC, Tucker and HOSVD to group similar instances. An individual user profile, showing the user's highest interest, is represented by the top dimension values extracted from the component matrix obtained after tensor decomposition. A group profile, showing similar users and their highest interest, is built by clustering similar users based on tensor-decomposed values. A group profile is represented by the top association rules (containing various unique object combinations) that are derived from the searches made by the users of the cluster. An object profile is created to represent similar objects clustered on the basis of the similarity of their features. Depending on the category of a user (known, anonymous or frequent visitor to the website), any of the profiles or their combinations is used for making personalized recommendations. A ranking algorithm is also proposed that utilizes the personalized information to order and rank the recommendations. The proposed methodology is evaluated on data collected from a real-life car website. Empirical analysis confirms the effectiveness of recommendations made by the proposed approach over other collaborative filtering and content-based recommendation approaches based on two-dimensional data analysis methods.
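
The profile-building step can be illustrated with a toy HOSVD-style computation: unfold a small user-item-session tensor along the user mode, take the left singular vectors as the user component matrix, and group users by their dominant component. This is only a sketch of the general idea, not the thesis's PARAFAC/Tucker/HOSVD pipeline; the tensor values below are invented.

```python
import numpy as np

# Toy (user x item x session) interaction tensor: 4 users, 3 items, 2 sessions.
T = np.zeros((4, 3, 2))
T[0, 0, :] = [3, 2]   # users 0 and 1 mostly interact with item 0
T[1, 0, :] = [2, 3]
T[2, 2, :] = [4, 2]   # users 2 and 3 mostly interact with item 2
T[3, 2, :] = [2, 4]

# Mode-1 (user) unfolding, then SVD: the left singular vectors play the
# role of the user component matrix in a HOSVD-style decomposition.
unfolded = T.reshape(T.shape[0], -1)          # 4 x (3*2)
U, s, Vt = np.linalg.svd(unfolded, full_matrices=False)

# Individual profile: each user's dominant latent component.
dominant = np.argmax(np.abs(U[:, :2]), axis=1)

# Group profile: users sharing a dominant component form one cluster.
groups = {c: np.where(dominant == c)[0].tolist() for c in set(dominant)}
```

Here the two user groups fall out of the component matrix directly; the thesis clusters on the decomposed values and then derives association rules per cluster.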

Relevance: 10.00%

Publisher:

Abstract:

Collaborative question answering (cQA) portals such as Yahoo! Answers allow users, as askers or answer authors, to communicate and exchange information through the asking and answering of questions in the network. In their current set-up, answers to a question are arranged in chronological order. For effective information retrieval, it would be advantageous to have the users’ answers ranked according to their quality. This paper proposes a novel approach for evaluating and ranking the users’ answers and recommending the top-n quality answers to information seekers. The proposed approach is based on a user-reputation method which assigns a score to an answer reflecting its author’s reputation level in the network. The proposed approach is evaluated on a dataset collected from a live cQA portal, namely Yahoo! Answers. To compare the results obtained by the non-content-based user-reputation method, experiments were also conducted with several content-based methods that assign a score to an answer reflecting its content quality. Various combinations of non-content-based and content-based scores were also compared. Empirical analysis shows that the proposed method is able to rank the users’ answers and recommend the top-n answers with good accuracy. The proposed method outperforms the content-based methods, their various combinations, and the popular link analysis method HITS.
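
A toy version of reputation-based answer ranking might look like the following, where reputation is approximated as the share of an author's past answers voted best. This is an illustrative stand-in: the paper's actual reputation computation is not reproduced here, and the data structures are assumptions.

```python
def reputation_scores(history):
    """history: list of (author, was_best_answer) pairs from past activity.
    Reputation = share of an author's past answers voted best (a simple
    stand-in for the paper's user-reputation method)."""
    totals, best = {}, {}
    for author, was_best in history:
        totals[author] = totals.get(author, 0) + 1
        best[author] = best.get(author, 0) + (1 if was_best else 0)
    return {a: best[a] / totals[a] for a in totals}

def top_n_answers(answers, history, n=2):
    """answers: list of (author, answer_text); rank by author reputation
    and return the top n, instead of the default chronological order."""
    rep = reputation_scores(history)
    ranked = sorted(answers, key=lambda a: rep.get(a[0], 0.0), reverse=True)
    return ranked[:n]
```

Content-based scores could be blended into the sort key at the same point, which is how the paper's score combinations would slot in.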

Relevance: 10.00%

Publisher:

Abstract:

The existing Collaborative Filtering (CF) technique that has been widely applied by e-commerce sites requires a large amount of rating data to make meaningful recommendations. It is not directly applicable for recommending products that are not frequently purchased by users, such as cars and houses, as it is difficult to collect rating data for such products from the users. Many of the e-commerce sites for infrequently purchased products are still using basic search-based techniques, whereby the products that match the attributes given in the target user's query are retrieved and recommended to the user. However, search-based recommenders cannot provide personalized recommendations: for different users, the recommendations will be the same if they provide the same query, regardless of any difference in their online navigation behaviour. This paper proposes to integrate collaborative filtering and search-based techniques to provide personalized recommendations for infrequently purchased products. Two different techniques are proposed, namely CFRRobin and CFAg Query. Instead of using the target user's query to search for products as standard search-based systems do, the CFRRobin technique uses the products in which the target user's neighbours have shown interest as queries to retrieve relevant products, and then recommends to the target user a list of products built by merging and ranking the returned products using the Round Robin method. The CFAg Query technique uses the products in which the user's neighbours have shown interest to derive an aggregated query, which is then used to retrieve products to recommend to the target user. Experiments conducted on a real e-commerce dataset show that both the proposed techniques, CFRRobin and CFAg Query, perform better than the standard Collaborative Filtering (CF) and Basic Search (BS) approaches that are widely applied by current e-commerce applications.
The CFRRobin and CFAg Query approaches also outperform the existing query expansion (QE) technique that was proposed for recommending infrequently purchased products.
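
The Round Robin merging step used by CFRRobin can be sketched as follows. The deduplication behaviour and the `limit` cut-off are assumptions for illustration, not details taken from the paper.

```python
def round_robin_merge(result_lists, limit=10):
    """Interleave ranked result lists one item at a time (round robin),
    skipping duplicates, in the spirit of the CFRRobin merging step.
    Each inner list is the ranked retrieval result for one neighbour's
    product used as a query."""
    merged, seen = [], set()
    longest = max((len(l) for l in result_lists), default=0)
    for pos in range(longest):
        for lst in result_lists:
            if pos < len(lst) and lst[pos] not in seen:
                seen.add(lst[pos])
                merged.append(lst[pos])
                if len(merged) == limit:
                    return merged
    return merged
```

Taking one item per list per round preserves each neighbour's top preferences near the head of the merged recommendation list.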

Relevance: 10.00%

Publisher:

Abstract:

The Texas Department of Transportation (TxDOT) is concerned about the widening gap between preservation needs and available funding. Funding levels are not adequate to meet the preservation needs of the roadway network; therefore, projects listed in the 4-Year Pavement Management Plan must be ranked to determine which projects should be funded now and which can be postponed until a later year. Currently, each district uses locally developed methods to prioritize projects. These ranking methods have relied on less formal qualitative assessments based on engineers’ subjective judgment. It is important for TxDOT to have a 4-Year Pavement Management Plan that uses a transparent, rational project ranking process. The objective of this study is to develop a conceptual framework that describes the development of the 4-Year Pavement Management Plan. The framework can be broadly divided into three steps: 1) a network-level project screening process, 2) a project-level project ranking process, and 3) an economic analysis. A rational pavement management procedure and a project ranking method accepted by districts and the TxDOT administration will maximize efficiency in budget allocations and will potentially help improve pavement condition. As part of the implementation of the 4-Year Pavement Management Plan, the Network-Level Project Screening (NLPS) tool, including the candidate project identification algorithm and the preliminary project ranking matrix, was developed. The NLPS tool has been used by the Austin District Pavement Engineer (DPE) to evaluate PMIS (Pavement Management Information System) data and to prepare a preliminary list of candidate projects for further evaluation.

Relevance: 10.00%

Publisher:

Abstract:

Aims: This study determined whether the visibility benefits of positioning retroreflective strips in biological motion configurations were evident at real-world road worker sites. Methods: Twenty visually normal drivers (M = 40.3 years) participated in this study, which was conducted at two road work sites (one suburban and one freeway) on two separate nights. At each site, four road workers walked in place wearing one of four different clothing options: a) standard road worker night vest; b) standard night vest plus retroreflective strips on the thighs; c) standard night vest plus retroreflective strips on the ankles and knees; d) standard night vest plus retroreflective strips on eight moveable joints (full biomotion). Participants seated in stationary vehicles at three different distances (80 m, 160 m, 240 m) rated the relative conspicuity of the four road workers using a series of standardized visibility rating and ranking scales. Results: Adding retroreflective strips in the full biomotion configuration to the standard night vest significantly (p < 0.001) enhanced perceptions of road worker visibility compared to the standard vest alone or in combination with thigh retroreflective markings. These visibility benefits were evident at all distances and at both sites. Retroreflective markings at the ankles and knees also provided visibility benefits compared to the standard vest; however, the full biomotion configuration was significantly better than all of the other configurations. Conclusions: These data provide the first evidence that the benefits of biomotion retroreflective markings previously demonstrated under laboratory and closed- and open-road conditions are also evident at real work sites.

Relevance: 10.00%

Publisher:

Abstract:

Background/aims: Access to appropriate health care following an acute cardiac event is important for positive outcomes. The aim of the Cardiac ARIA index was to derive an objective, comparable, geographic measure reflecting access to cardiac services across Australia. Methods: Geographic Information Systems (GIS) were used to model a numeric-alpha index based on acute management from the onset of symptoms to the return to the community. Acute time frames were calculated to include time for the ambulance to arrive, assess and load the patient, and travel to the facility by road at 40–80 km/h. Results: The acute phase of the index was modelled into five categories: 1 [24/7 percutaneous coronary intervention (PCI) ≤1 h]; 2 [24/7 PCI 1–3 h, and PCI less than an additional hour beyond the nearest accident and emergency department (A&E)]; 3 [nearest A&E ≤3 h (no 24/7 PCI within an extra hour)]; 4 [nearest A&E 3–12 h (no 24/7 PCI within an extra hour)]; 5 [nearest A&E 12–24 h (no 24/7 PCI within an extra hour)]. Discharge care was modelled into three categories based on time to a cardiac rehabilitation program, retail pharmacy, pathology services, hospital, GP or remote clinic: (A) all services ≤30 min; (B) >30 min and ≤60 min; (C) >60 min. Examples of the index indicate that the majority of population locations within capital cities were category 1A; Alice Springs and Byron Bay were 3A; and the Northern Territory town of Maningrida had minimal access to cardiac services, with an index ranking of 5C. Conclusion: The Cardiac ARIA index provides an invaluable tool to inform appropriate strategies for the use of scarce cardiac resources.
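
One possible reading of the numeric-alpha assignment rules above is sketched below. The handling of edge cases (locations beyond 24 h, missing services, exact tie conditions) is an assumption for the sketch, not part of the published index.

```python
def acute_category(pci_hours, ae_hours):
    """Acute phase (1-5) per the index description; pci_hours / ae_hours are
    road travel times to 24/7 PCI and to the nearest A&E (None = unavailable)."""
    if pci_hours is not None and pci_hours <= 1:
        return 1
    pci_within_extra_hour = (pci_hours is not None
                             and ae_hours is not None
                             and pci_hours <= ae_hours + 1)
    if pci_within_extra_hour and pci_hours <= 3:
        return 2
    if ae_hours is not None:
        if ae_hours <= 3:
            return 3
        if ae_hours <= 12:
            return 4
        if ae_hours <= 24:
            return 5
    return 5  # assumed fallback for locations beyond the modelled range

def discharge_category(max_service_minutes):
    """Discharge phase (A-C) from the slowest-to-reach support service."""
    if max_service_minutes <= 30:
        return "A"
    if max_service_minutes <= 60:
        return "B"
    return "C"
```

Concatenating the two parts reproduces index values such as "1A" for capital-city locations.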

Relevance: 10.00%

Publisher:

Abstract:

House dust is a heterogeneous matrix which contains a number of biological materials and particulate matter gathered from several sources. It is the accumulation of a number of semi-volatile and non-volatile contaminants, which are trapped and preserved. House dust can therefore be viewed as an archive of both indoor and outdoor air pollution. There is evidence that, on average, people tend to stay indoors most of the time, and this increases exposure to house dust. The aims of this investigation were to: (i) assess the levels of Polycyclic Aromatic Hydrocarbons (PAHs), elements and pesticides in the indoor environment of the Brisbane area; (ii) identify and characterise the possible sources of elemental constituents (inorganic elements), PAHs and pesticides by means of Positive Matrix Factorisation (PMF); and (iii) establish the correlations between the levels of indoor air pollutants (PAHs, elements and pesticides) and the external and internal characteristics or attributes of the buildings and indoor activities by means of multivariate data analysis techniques. The dust samples were collected during the period 2005-2007 from homes located in different suburbs of Brisbane, Ipswich and Toowoomba, in South East Queensland, Australia. A vacuum cleaner fitted with a paper bag was used as the sampler for collecting the house dust. A survey questionnaire completed by the house residents provided information about the indoor and outdoor characteristics of their residences. House dust samples were analysed for three different classes of pollutants: pesticides, elements and PAHs. The analyses were carried out on samples of particle size less than 250 µm. The chemical analyses for both pesticides and PAHs were performed using Gas Chromatography-Mass Spectrometry (GC-MS), while elemental analysis was carried out using Inductively Coupled Plasma-Mass Spectrometry (ICP-MS).
The data were subjected to multivariate data analysis techniques, namely the multi-criteria decision-making procedure Preference Ranking Organisation METHod for Enrichment Evaluations (PROMETHEE), coupled with Geometrical Analysis for Interactive Aid (GAIA), in order to rank the samples and examine the data displays. This study showed that, compared to the results from previous work carried out in Australia and overseas, the concentrations of pollutants in house dust in Brisbane and the surrounding areas were relatively high. The results of this work also showed significant correlations between some of the physical parameters (type of building material, floor level, distance from industrial areas and major roads, and smoking) and the concentrations of pollutants. Type of building material and the age of the house were found to be two of the primary factors that affect the concentrations of pesticides and elements in house dust. The concentrations of these two types of pollutant appear to be higher in old (timber) houses than in brick ones. In contrast, the concentrations of PAHs were higher in brick houses than in timber ones. Other factors, such as floor level and distance from the main street and industrial areas, also affected the concentrations of pollutants in the house dust samples. To apportion the sources and to understand the mechanisms of the pollutants, the Positive Matrix Factorisation (PMF) receptor model was applied. The results showed significant correlations between the concentrations of contaminants in house dust and the physical characteristics of houses, such as the age and type of the house, the distance from main roads and industrial areas, and smoking. Sources of pollutants were identified.
For PAHs, the sources were cooking activities, vehicle emissions, smoking, oil fumes, natural gas combustion and traces of diesel exhaust emissions; for pesticides the sources were application of pesticides for controlling termites in buildings and fences, treating indoor furniture and in gardens for controlling pests attacking horticultural and ornamental plants; for elements the sources were soil, cooking, smoking, paints, pesticides, combustion of motor fuels, residual fuel oil, motor vehicle emissions, wearing down of brake linings and industrial activities.
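
PMF belongs to the family of nonnegative factorization methods, and the source-apportionment idea can be illustrated with a plain nonnegative matrix factorization (NMF) via multiplicative updates, which shares PMF's nonnegativity constraint but omits its per-measurement uncertainty weighting. The two synthetic source profiles below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy samples x species concentration matrix built from two known sources.
sources = np.array([[5.0, 1.0, 0.0],    # e.g. a traffic-like profile
                    [0.0, 2.0, 4.0]])   # e.g. a smoking-like profile
contrib = rng.uniform(0.5, 2.0, size=(20, 2))
X = contrib @ sources

# Multiplicative-update NMF: factor X ~= W @ H with W, H >= 0, the same
# nonnegativity idea PMF relies on for physically meaningful factors.
k = 2
W = rng.uniform(0.1, 1.0, size=(X.shape[0], k))  # source contributions
H = rng.uniform(0.1, 1.0, size=(k, X.shape[1]))  # source profiles
for _ in range(500):
    H *= (W.T @ X) / (W.T @ W @ H + 1e-12)
    W *= (X @ H.T) / (W @ H @ H.T + 1e-12)

# Reconstruction error should be small if two factors explain the data.
rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

In a real application, the rows of H would be matched against known source fingerprints (traffic, smoking, soil, etc.) to label each factor.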

Relevance: 10.00%

Publisher:

Abstract:

The construction phase of building projects is often a crucial influencing factor in the success or failure of projects. Project managers are believed to play a significant role in firms’ success and competitiveness. Therefore, it is important for firms to better understand the demands of managing projects and the competencies that project managers require for more effective project delivery. In a survey of building project managers in the state of Queensland, Australia, it was found that management and information management systems are the top-ranking competencies required by effective project managers. Furthermore, a significant number of respondents identified the site manager, construction manager and client’s representative as the three individuals whose close and regular contacts with project managers have the greatest influence on the project managers’ performance. Based on these findings, an intra-project workgroups model is proposed to help project managers facilitate more effective management of people and information on building projects.

Relevance: 10.00%

Publisher:

Abstract:

This study proposes a framework of a model-based hot spot identification method by applying full Bayes (FB) technique. In comparison with the state-of-the-art approach [i.e., empirical Bayes method (EB)], the advantage of the FB method is the capability to seamlessly integrate prior information and all available data into posterior distributions on which various ranking criteria could be based. With intersection crash data collected in Singapore, an empirical analysis was conducted to evaluate the following six approaches for hot spot identification: (a) naive ranking using raw crash data, (b) standard EB ranking, (c) FB ranking using a Poisson-gamma model, (d) FB ranking using a Poisson-lognormal model, (e) FB ranking using a hierarchical Poisson model, and (f) FB ranking using a hierarchical Poisson (AR-1) model. The results show that (a) when using the expected crash rate-related decision parameters, all model-based approaches perform significantly better in safety ranking than does the naive ranking method, and (b) the FB approach using hierarchical models significantly outperforms the standard EB approach in correctly identifying hazardous sites.
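
The conjugate core of the Poisson-gamma ranking criterion (approach c above) can be written down directly: with a Gamma(a, b) prior on a site's crash rate and y observed crashes over t years, the posterior is Gamma(a + y, b + t). The full FB approach samples posteriors of hierarchical models rather than using this closed form, and the prior values below are hypothetical.

```python
def posterior_mean_rank(crash_counts, years, a=2.0, b=1.0):
    """Poisson-gamma conjugate update: with a Gamma(a, b) prior on a site's
    crash rate and y crashes in t years, the posterior is Gamma(a + y, b + t),
    so the posterior mean rate is (a + y) / (b + t). Sites are ranked by
    that mean; a and b are hypothetical prior hyperparameters."""
    means = [(a + y) / (b + t) for y, t in zip(crash_counts, years)]
    order = sorted(range(len(means)), key=lambda i: means[i], reverse=True)
    return order, means
```

Ranking by a posterior quantity rather than the raw count is what separates all the model-based approaches (b)-(f) from the naive ranking (a).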

Relevance: 10.00%

Publisher:

Abstract:

This study proposes a full Bayes (FB) hierarchical modeling approach to traffic crash hotspot identification. The FB approach is able to account for all uncertainties associated with crash risk and various risk factors by estimating a posterior distribution of site safety, on which various ranking criteria can be based. Moreover, through hierarchical model specification, the FB approach is able to flexibly take into account various heterogeneities of crash occurrence due to spatiotemporal effects on traffic safety. Using Singapore intersection crash data (1997-2006), an empirical evaluation was conducted to compare the proposed FB approach to the state-of-the-art approaches. Results show that the Bayesian hierarchical models with accommodation for site-specific effects and serial correlation have better goodness-of-fit than non-hierarchical models. Furthermore, all model-based approaches perform significantly better in safety ranking than the naive approach using raw crash counts. The FB hierarchical models were found to significantly outperform the standard empirical Bayes (EB) approach in correctly identifying hotspots.

Relevance: 10.00%

Publisher:

Abstract:

Decellularized tissues can provide a unique biological environment for regenerative medicine applications, but only if minimal disruption of their microarchitecture is achieved during the decellularization process. The goal is to keep the structural integrity of such a construct as functional as the tissues from which it was derived. In this work, cartilage-on-bone laminates were decellularized through enzymatic, non-ionic and ionic protocols. This work investigated the effects of the decellularization process on the microarchitecture of the cartilaginous extracellular matrix (ECM), determining the extent to which each process deteriorated the structural organization of the network. High-resolution microscopy was used to capture cross-sectional images of samples prior to and after treatment. The variation in the microarchitecture was then analysed using a well-defined fast Fourier transform image processing algorithm. Statistical analysis of the results revealed how significant the alterations among the aforementioned protocols were (p < 0.05). Ranking the treatments by their effectiveness in disrupting the ECM integrity, they were ordered as: Trypsin > SDS > Triton X-100.
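
The fast Fourier analysis of fibre organization can be illustrated with a crude orientation measure computed from a 2D FFT magnitude spectrum: aligned structures concentrate spectral energy along one frequency axis, while a disrupted (isotropic) matrix spreads it evenly. This is not the paper's algorithm; the axis-energy ratio and the synthetic test images are assumptions made for the sketch.

```python
import numpy as np

def anisotropy_index(image):
    """Ratio of spectral energy on the vertical vs horizontal frequency
    axes of the 2D FFT magnitude spectrum. An aligned fibre pattern gives
    a ratio far from 1; an isotropic texture gives a ratio near 1."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    cy, cx = spec.shape[0] // 2, spec.shape[1] // 2
    spec[cy, cx] = 0.0                       # drop the DC term
    vertical = spec[:, cx].sum()             # energy on the vertical axis
    horizontal = spec[cy, :].sum()           # energy on the horizontal axis
    return vertical / (horizontal + 1e-12)

# Horizontal stripes (aligned fibres) vs uniform noise (disrupted matrix).
y = np.arange(64)[:, None] * np.ones((1, 64))
stripes = np.sin(2 * np.pi * y / 8)
noise = np.random.default_rng(1).uniform(size=(64, 64))
```

Comparing such an index before and after each treatment would quantify how far a protocol pushed the matrix toward isotropy.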

Relevance: 10.00%

Publisher:

Abstract:

Queensland University of Technology (QUT) was one of the first universities in Australia to establish an institutional repository. Launched in November 2003, the repository (QUT ePrints, http://eprints.qut.edu.au) uses the EPrints open source repository software (from Southampton) and has enjoyed the benefit of an institutional deposit mandate since January 2004. Currently (April 2012), the repository holds over 36,000 records, including 17,909 open access publications, with another 2,434 publications embargoed but with mediated access enabled via the ‘Request a copy’ button, which is a feature of the EPrints software. At QUT, the repository is managed by the Library. The repository is embedded into a number of other systems at QUT, including the staff profile system and the University’s research information system. It has also been integrated into a number of critical processes related to Government reporting and research assessment. Internally, senior research administrators often look to the repository for information to assist with decision-making and planning. While some statistics could be drawn from the advanced search feature and the existing download statistics feature, they were rarely at the level of granularity or aggregation required, and getting the information from the ‘back end’ of the repository was very time-consuming for Library staff. In 2011, the Library funded a project to enhance the range of statistics available from the public interface of QUT ePrints. The repository team conducted a series of focus groups and individual interviews to identify and prioritise functionality requirements for a new statistics ‘dashboard’. The participants included a mix of research administrators, early career researchers and senior researchers. The repository team identified a number of business criteria (e.g. extensibility, support availability, skills required) and then gave each a weighting.
After considering all the known options available, five software packages (IRStats, ePrintsStats, AWStats, BIRT and Google Urchin/Analytics) were thoroughly evaluated against a list of 69 criteria to determine which would be most suitable. The evaluation revealed that IRStats was the best fit for the requirements, as it was deemed capable of meeting 21 of the 31 high-priority criteria. Consequently, IRStats was implemented as the basis for QUT ePrints’ new statistics dashboards, which were launched in Open Access Week, October 2011. Statistics dashboards are now available at four levels: whole-of-repository level, organisational unit level, individual author level and individual item level. The data available include cumulative total deposits, time-series deposits, deposits by item type, % full texts, % open access, cumulative downloads, time-series downloads, downloads by item type, author ranking, paper ranking (by downloads), downloader geographic location, domains, internal vs external downloads, citation data (from Scopus and Web of Science), most popular search terms, and non-search referring websites. The data are displayed in chart, map and table formats. The new statistics dashboards are a great success. Feedback received from staff and students has been very positive. Individual researchers have said that they have found the information to be very useful when compiling a track record. It is now very easy for senior administrators (including the Deputy Vice-Chancellor, Research) to compare the full-text deposit rates (i.e. mandate compliance rates) across organisational units. This has led to increased ‘encouragement’ from Heads of School and Deans in relation to the provision of full-text versions.

Relevance: 10.00%

Publisher:

Abstract:

In the context of increasing demand for potable water and the depletion of water resources, stormwater is a logical alternative. However, stormwater contains pollutants, among which metals are of particular interest due to their toxicity and persistence in the environment. Hence, it is imperative to remove toxic metals from stormwater to the levels prescribed by drinking water guidelines for potable use. Consequently, various techniques have been proposed, among which sorption using low-cost sorbents is economically viable and environmentally benign in comparison to other techniques. However, sorbents show affinity towards certain toxic metals, which results in poor removal of other toxic metals. It was hypothesised in this study that a mixture of sorbents with different metal affinity patterns can be used for the efficient removal of a range of toxic metals commonly found in stormwater. The performance of six sorbents in the sorption of Al, Cr, Cu, Pb, Ni, Zn and Cd, the toxic metals commonly found in urban stormwater, was investigated to select suitable sorbents for creating the mixtures. For this purpose, a multi-criteria analytical protocol was developed using the decision-making methods PROMETHEE (Preference Ranking Organisation METHod for Enrichment Evaluations) and GAIA (Geometrical Analysis for Interactive Aid). Zeolite and seaweed were selected for the creation of trial mixtures based on their metal affinity patterns and their performance on predetermined selection criteria. The metal sorption mechanisms employed by seaweed and zeolite were defined using kinetics, isotherm and thermodynamics parameters, which were determined using batch sorption experiments.
Additionally, the kinetics rate-limiting steps were identified using an innovative approach, developed as part of the study, that combines GAIA and Spearman correlation techniques to overcome the limitation of conventional graphical methods in predicting the degree to which each kinetics step limits the overall metal removal rate. The sorption kinetics of zeolite was found to be primarily limited by intraparticle diffusion, followed by the sorption reaction steps, which were governed mainly by the hydrated ionic diameter of the metals. The isotherm study indicated that the metal sorption mechanism of zeolite was primarily of a physical nature. The thermodynamics study confirmed that the energetically favourable nature of sorption increased in the order Zn < Cu < Cd < Ni < Pb < Cr < Al, which is in agreement with the metal sorption affinity of zeolite. Hence, sorption thermodynamics has an influence on the metal sorption affinity of zeolite. On the other hand, the primary kinetics rate-limiting step of seaweed was the sorption reaction process, followed by intraparticle diffusion. Boundary layer diffusion was also found to limit the metal sorption kinetics at low concentrations. According to the sorption isotherm study, Cd, Pb, Cr and Al were sorbed by seaweed via ion exchange, whilst sorption of Ni occurred via physisorption and ionic bonding was responsible for the sorption of Zn. The thermodynamics study confirmed that sorption by seaweed was energetically favourable in the order Zn < Cu < Cd < Cr < Al < Pb < Ni. However, this did not agree with the affinity series derived for seaweed, suggesting a limited influence of sorption thermodynamics on metal affinity for seaweed. The investigation of zeolite-seaweed mixtures indicated that mixing sorbents has an effect on the kinetics rates and the sorption affinity.
Additionally, theoretical relationships were derived to predict the boundary layer diffusion rate, intraparticle diffusion rate, sorption reaction rate and enthalpy of mixtures based on those of the individual sorbents. In general, the low coefficients of determination (R2) for the relationships between theoretical and experimental data indicated that the relationships were not statistically significant. This was attributed to the heterogeneity of the properties of the sorbents. Nevertheless, in relative terms, the intraparticle diffusion rate, sorption reaction rate and enthalpy of sorption had higher R2 values than the boundary layer diffusion rate, suggesting that there was some relationship between the former set of parameters of the mixtures and those of the sorbents. The mixture containing 80% zeolite and 20% seaweed showed similar affinity for the sorption of Cu, Ni, Cd, Cr and Al, which was attributed to the approximately similar sorption enthalpy of these metal ions. Therefore, it was concluded that the seaweed-zeolite mixture can be used to obtain the same affinity for various metals present in a multi-metal system, provided the metal ions have similar enthalpy during sorption by the mixture.
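
PROMETHEE, used in this study (and in the house dust study above) to rank alternatives against multiple criteria, can be sketched in its simplest form: PROMETHEE II with the "usual" preference function, where net outranking flow orders the alternatives. The sorbent criteria and weights in the usage example are invented.

```python
import numpy as np

def promethee_ii(matrix, weights, maximise):
    """PROMETHEE II with the 'usual' preference function: for each pair of
    alternatives and each criterion, preference is 1 if one strictly beats
    the other, else 0; the net outranking flow is the (weighted) positive
    flow minus the negative flow, averaged over the other alternatives."""
    m = np.asarray(matrix, dtype=float)
    n = m.shape[0]
    # Orient every criterion so that larger is better.
    for j, maxi in enumerate(maximise):
        if not maxi:
            m[:, j] = -m[:, j]
    pref = np.zeros((n, n))
    for a in range(n):
        for b in range(n):
            if a != b:
                pref[a, b] = sum(w for j, w in enumerate(weights)
                                 if m[a, j] > m[b, j])
    phi = (pref.sum(axis=1) - pref.sum(axis=0)) / (n - 1)
    return phi  # higher net flow = better-ranked alternative
```

For example, three hypothetical sorbents scored on removal efficiency (maximise) and cost (minimise) with weights 0.7/0.3 yield one clear winner by net flow.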

Relevance: 10.00%

Publisher:

Abstract:

The Texas Department of Transportation (TxDOT) is concerned about the widening gap between pavement preservation needs and available funding. Thus, the TxDOT Austin District Pavement Engineer (DPE) has investigated methods to strategically allocate available pavement funding to potential projects that improve the overall performance of the District and Texas highway systems. The primary objective of the study presented in this paper is to develop a network-level project screening and ranking method that supports the development of the Austin District 4-year pavement management plan. The study developed candidate project selection and ranking algorithms that evaluate the pavement condition of each candidate project using data contained in the Pavement Management Information System (PMIS) database and incorporate insights from Austin District pavement experts, and then implemented the developed method and its supporting algorithms. This process previously required weeks to complete but now takes about 10 minutes, including data preparation and running the analysis algorithm, which enables the Austin DPE to devote more time and resources to conducting field visits, performing project-level evaluations and testing candidate projects. The case study results showed that the proposed method assisted the DPE in evaluating and prioritizing projects and in allocating funds to the right projects at the right time.
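
The screening-and-ranking idea can be sketched as a threshold filter followed by a weighted priority sum. The field names, weights and threshold below are illustrative placeholders, not TxDOT's actual PMIS measures or the study's algorithm.

```python
def screen_and_rank(sections, weights, condition_threshold=70):
    """Network-level screening sketch: flag sections whose condition score
    falls below a threshold (candidate identification), then rank the
    candidates by a weighted sum of distress indicators, higher total
    meaning higher priority (a stand-in for the ranking matrix)."""
    candidates = [s for s in sections if s["condition"] < condition_threshold]

    def priority(section):
        # Weighted combination of whichever distress fields are configured.
        return sum(weights[k] * section[k] for k in weights)

    return sorted(candidates, key=priority, reverse=True)
```

A district engineer would then take the top of this list into project-level evaluation and field visits, as described above.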