995 results for ranking systems


Relevance:

30.00%

Publisher:

Abstract:

In Central Brazil, the long-term sustainability of beef cattle systems is under threat over vast tracts of farming areas, as more than half of the 50 million hectares of sown pastures are suffering from degradation. Overgrazing practised to maintain high stocking rates is regarded as one of the main causes. High stocking rates are deliberate and crucial decisions taken by the farmers, which appear paradoxical, even irrational, given the state of knowledge regarding the consequences of overgrazing. The phenomenon, however, appears inextricably linked with the objectives that farmers hold. In this research those objectives were elicited first and, from their ranking, two, ‘asset value of cattle’ (representing cattle ownership) and ‘present value of economic returns’, were chosen to develop an original bi-criteria Compromise Programming model to test various hypotheses postulated to explain the overgrazing behaviour. As part of the model a pasture productivity index is derived to estimate the pasture recovery cost. Different scenarios based on farmers' attitudes towards overgrazing, pasture costs and capital availability were analysed. The results of the model runs show that benefits from holding more cattle can outweigh the increased pasture recovery and maintenance costs. This result undermines the hypothesis that farmers practise overgrazing because they are unaware of, or unconcerned about, overgrazing costs. An appropriate approach to the problem of pasture degradation requires information on the economics, and its interplay with farmers' objectives, for a wide range of pasture recovery and maintenance methods. Seen within the context of farmers' objectives, some level of overgrazing appears rational. Advocacy of the simple ‘no overgrazing’ rule is an insufficient strategy to maintain the long-term sustainability of the beef production systems in Central Brazil.
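The abstract does not spell out the model's formulation. In the standard bi-criteria Compromise Programming form, a minimal sketch (the weights $w_j$ and metric parameter $p$ are assumptions, not values from the paper) would be:

```latex
% Sketch: f_1 = asset value of cattle, f_2 = present value of economic
% returns; f_j^* and f_{j*} denote the ideal and anti-ideal values of
% criterion j over the feasible stocking decisions X.
\min_{x \in X} \;
  \left[ \sum_{j=1}^{2} w_j^{\,p}
  \left( \frac{f_j^{*} - f_j(x)}{f_j^{*} - f_{j*}} \right)^{p}
  \right]^{1/p}
```

Solving at $p = 1$ and $p = \infty$ brackets the compromise set within which an observed stocking rate can be tested for rationality.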

Relevance:

30.00%

Publisher:

Abstract:

Very large scale scheduling and planning tasks cannot be effectively addressed by fully automated schedule optimisation systems, since many key factors which govern 'fitness' in such cases are unformalisable. This raises the question of an interactive (or collaborative) approach, where fitness is assigned by the expert user. Though well-researched in the domains of interactively evolved art and music, this method is as yet rarely used in logistics. This paper concerns a difficulty shared by all interactive evolutionary systems (IESs), but especially those used for logistics or design problems: objective evaluation of IESs is severely hampered by the need for expert humans in the loop. This makes it effectively impossible to, for example, determine with statistical confidence any ranking among a decent number of configurations for the parameters and strategy choices. We make headway against this difficulty with an Automated Tester (AT) for such systems. The AT replaces the human in experiments, and has parameters controlling its decision-making accuracy (modelling human error) and a built-in notion of a target solution, which may typically be at odds with the solution that is optimal in terms of formalisable fitness. Using the AT, plausible evaluations of alternative designs for the IES can be done, allowing for (and examining the effects of) different levels of user error. We describe such an AT for evaluating an IES for very large scale planning.
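As a sketch of how such an Automated Tester could stand in for the expert (the pairwise-choice interface, distance measure, and error model below are assumptions for illustration, not the paper's specification):

```python
import random

class AutomatedTester:
    """Replaces the human expert when evaluating an interactive
    evolutionary system (IES); illustrative sketch only."""

    def __init__(self, target, error_rate=0.1):
        self.target = target          # built-in notion of a target solution
        self.error_rate = error_rate  # models human decision-making error

    def distance(self, candidate):
        # Hypothetical closeness measure: fraction of schedule slots that
        # disagree with the target (which may differ from the solution
        # that is optimal under the formalisable fitness).
        return sum(a != b for a, b in zip(candidate, self.target)) / len(self.target)

    def prefer(self, a, b):
        """Return the candidate this 'user' prefers, erring at error_rate."""
        better, worse = (a, b) if self.distance(a) <= self.distance(b) else (b, a)
        return worse if random.random() < self.error_rate else better
```

Sweeping error_rate then lets alternative IES configurations be compared under controlled levels of simulated user error.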

Relevance:

30.00%

Publisher:

Abstract:

Web service is one of the most fundamental technologies in implementing service oriented architecture (SOA) based applications. One essential challenge related to web services is to find suitable candidates with regard to a web service consumer's requests, which is normally called web service discovery. During a web service discovery protocol, it is expected that the consumer will find it hard to distinguish which candidates in the retrieval set are more suitable, thereby making the selection of web services a critical task. In this paper, inspired by the idea that the service composition pattern is a significant hint for service selection, a personal profiling mechanism is proposed to improve ranking and recommendation performance. Since service selection is highly dependent on the composition process, personal knowledge is accumulated from previous service composition processes and shared via collaborative filtering, in which a set of users with similar interests is first identified. Afterwards a web service re-ranking mechanism is employed for personalised recommendation. Experimental studies are conducted and analysed to demonstrate the promising potential of this research.
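A minimal sketch of the collaborative-filtering re-ranking idea (the matrix layout, cosine similarity, and function names are assumptions; the paper's mechanism is richer):

```python
import numpy as np

def rerank_services(usage, user, base_ranking, k=3):
    """Re-rank discovered services for `user`. `usage` is a user-by-service
    matrix of past composition counts (assumed input); `base_ranking` holds
    service indices returned by the initial discovery step."""
    # Cosine similarity between the active user's profile and all others.
    norms = np.linalg.norm(usage, axis=1) * np.linalg.norm(usage[user]) + 1e-12
    sims = usage @ usage[user] / norms
    sims[user] = -1.0                       # exclude the active user
    neighbours = np.argsort(sims)[-k:]      # the k most similar users
    # Score each candidate by how often similar users composed with it,
    # then re-rank the retrieval set by that personalised score.
    scores = usage[neighbours].sum(axis=0)
    return sorted(base_ranking, key=lambda s: -scores[s])
```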

Relevance:

30.00%

Publisher:

Abstract:

The Excellence in Research for Australia (ERA) initiative being conducted by the Australian Research Council (ARC) mandates a single journal and conference ranking scheme over every academic discipline in Australia. A universal publication outlet ranking list mandated by a government agency is unique and has attracted interest and comment both within Australia and overseas. Equally, the interest shown has come from all sectors involved in academic publishing – authors, reviewers, publishers – and from commercial and open access publishers. This paper investigates the distribution of information systems journals over the various ERA parameters and, for information systems and business journals, comments on a claim of bias whereby the ranking of a journal is positively influenced by the number of years it has been in existence. Clear evidence of the diversity of the information systems discipline is observed. The benefits of a multidisciplinary foundation for information systems are also noted. Longer established journals are shown to attract higher rankings, and possible reasons for and implications flowing from this are discussed.

Relevance:

30.00%

Publisher:

Abstract:

The ranking method is a key element of a Content-Based Image Retrieval (CBIR) system and can affect the final retrieval performance. In the literature, previous ranking methods based on either distance or probability do not explicitly relate to precision and recall, which are normally used to evaluate the performance of CBIR systems. In this paper, a novel ranking method based on relative density is proposed to improve the probability-based approach by ranking images within the class. The proposed method can achieve optimal precision and recall. Experiments conducted on a large photographic collection show significant improvements in retrieval performance.
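One plausible reading of the relative-density idea, sketched with a kNN density estimate (the estimator, the relevant/irrelevant class split, and all names are assumptions; the paper's construction may differ):

```python
import numpy as np

def knn_density(x, data, k=5):
    # Simple kNN density estimate: inverse distance to the k-th neighbour.
    d = np.sort(np.linalg.norm(data - x, axis=1))
    return 1.0 / (d[min(k, len(d) - 1)] + 1e-12)

def relative_density_rank(candidates, relevant, irrelevant, k=5):
    """Rank candidate images by the ratio of their estimated density under
    the relevant class to that under the irrelevant class."""
    scores = [knn_density(x, relevant, k) / knn_density(x, irrelevant, k)
              for x in candidates]
    return np.argsort(scores)[::-1]   # highest relative density first
```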

Relevance:

30.00%

Publisher:

Abstract:

Urban sustainability expresses the level of conservation of a city while its inhabitants live in it and consume its urban resources, but the measurement of urban sustainability depends on which indicators of conservation are considered important, besides the permitted levels of consumption according to adopted criteria. These criteria should include common factors shared by all cities to be evaluated, as in this particular case for Abu Dhabi, but also specific factors related to the geographic place, community and culture. Accordingly, measures of urban sustainability specific to a Middle East climate, community and culture, where GIS vector and raster analysis have a role or add value to urban sustainability measurement and grading, are considered herein. Scenarios were tested using various GIS data types to replicate the urban history (a ten-year period), current status and expected future of Abu Dhabi City, with factors set for climate, community needs and culture. The vector and raster GIS data sets related to each scenario were selected and analysed in terms of how, and how much, they can benefit urban sustainability ranking in quantity and quality tests. This included assessing the suitable data nature, type and format; the important topology rules to be considered; the useful attributes to be added; and the relationships that should be maintained between the data types of a geodatabase, specifying their usage in a specific scenario test, and then setting weights for each data type representing elements of a phenomenon related to an urban sustainability factor. The results of assessing the role of GIS analysis provided data collection specifications, such as the measures of accuracy required for a given type of GIS functional analysis used in an urban sustainability ranking scenario test. This paper reflects the preliminary results of research conducted to test the multidisciplinary evaluation of urban sustainability using different indicator metrics, implementing vector and raster GIS analysis as basic tools to assist the evaluation, increase its reliability, and assess and decompose it. A hypothetical implementation of the chosen evaluation model, represented by various scenarios, was then applied to the planned urban sustainability factors for a certain period of time to appraise the expected future grade of urban sustainability and to produce advice, associated with the scenarios, for gap filling and a relatively high level of future urban sustainability. The results reflected in this paper concentrate on the elements of vector and raster GIS analysis that assist proper urban sustainability grading within the chosen model, and on the reliability of the spatial data collected, the analyses selected and the resulting spatial information. The model starts from selected indicators covering regional culture, climate and community needs; one example used is Energy Demand and Consumption (cooling systems).
This factor is related to the climate and is region-specific, as temperatures vary around 30-45 degrees centigrade in city areas. GIS 3D polygons of building data were used to analyse building volumes; the 'building heights' attribute was used to estimate the number of floors, from which energy demand and consumption per unit volume were calculated and compared, in scenarios, with possible sustainable energy supply or with different environmentally friendly cooling systems. This was followed by calculating cooling system effects on an area unit (selected as 1 sq. km), combined with the level of greenery and open space as represented by park polygons, tree polygons, empty areas, pedestrian polygons and road surface polygons. Initial measures showed that cooling system consumption can be reduced by around 15-20% with well-planned building distributions, proper spacing, and environmentally friendly products and building materials. Temperature levels, extracted from the thermal bands of satellite images three times during the assessment period, were also combined into the scenario. Other examples of assessing GIS analysis for urban sustainability included waste productivity and some effects of greenhouse gases, measured by the intensity of road polygons and their closeness to dwelling and industrial areas as defined from land use/land cover thematic maps produced from classified satellite images; vectors were then created to define their role within the scenarios. City noise and light intensity were also investigated, as the region experiences rapid development and noise is magnified by construction activities and the closeness of airports and highways; the assessment examined the measures taken by urban planners to reduce this degradation or properly manage it. Finally, as a conclusion, tables are presented to reflect the scenario results in combination with GIS data types, analysis types, and the level of GIS data reliability for measuring the sustainability level of a city in relation to cultural and regional demands.
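As one concrete reading of the building-volume step described above, a minimal sketch (the floor height, the per-volume demand rate, and the applied saving are illustrative assumptions; only the 15-20% range comes from the text):

```python
FLOOR_HEIGHT_M = 3.0        # assumed average floor height
DEMAND_KWH_PER_M3 = 45.0    # hypothetical annual cooling demand per m3

def cooling_demand(footprint_m2, building_height_m, efficient=False):
    """Estimate floors and annual cooling demand for one building polygon."""
    floors = max(1, round(building_height_m / FLOOR_HEIGHT_M))
    volume = footprint_m2 * building_height_m
    demand = volume * DEMAND_KWH_PER_M3
    # The text reports ~15-20% savings from well-planned layouts and
    # environmentally friendly cooling; the midpoint is applied here.
    return floors, demand * (0.825 if efficient else 1.0)
```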

Relevance:

30.00%

Publisher:

Abstract:

In order to maintain transportation operations, proper monitoring systems should be established on road structures, especially bridges. Since these systems need enormous investments, only a subset of bridges can be equipped; thus, the priorities of the bridges should be ranked. In this paper, a method based on two-level synthetic evaluation is proposed. First, the importance of each bridge is analyzed through economic analysis. Six factors are considered for the bridges in a network: construction cost, service duration, length, location importance coefficient, traffic volume, and reconstruction time. Second, the safety condition of each bridge is evaluated using an improved entropy method (IEM), which combines subjective weights with objective entropy weights. Five indices are incorporated in this step, i.e., design and construction condition, technical condition, level of overloading, hazard of wind and earthquake, and environmental factors. Finally, the priorities of all the bridges in a network can be ranked and classified through a judgment matrix. To demonstrate the effectiveness of the proposed method, a main highway including 16 bridges is taken as an illustrative example. The results show that the bridges can be ranked and classified quickly by using the proposed method.
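A minimal sketch of the entropy-weight step (the convex combination with subjective weights is an assumption; the paper's 'improved' combination rule is not given in the abstract):

```python
import numpy as np

def entropy_weights(X):
    """Objective entropy weights for an m-bridges-by-n-indices score
    matrix X (non-negative entries); standard entropy weight method."""
    P = X / X.sum(axis=0, keepdims=True)        # normalise each index column
    m = X.shape[0]
    logP = np.log(np.where(P > 0, P, 1.0))      # treat 0*log(0) as 0
    E = -(P * logP).sum(axis=0) / np.log(m)     # entropy of each index
    d = 1.0 - E                                 # degree of divergence
    return d / d.sum()

def combined_weights(subjective, X, alpha=0.5):
    # Hypothetical convex mix of subjective and entropy weights.
    w = alpha * np.asarray(subjective) + (1.0 - alpha) * entropy_weights(X)
    return w / w.sum()
```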

Relevance:

30.00%

Publisher:

Abstract:

Learning preference models from human-generated data is an important task in modern information processing systems. Its popular setting consists of simple input ratings, assigned numerical values to indicate their relevancy with respect to a specific query. Since ratings are often specified within a small range, several objects may have the same rating, thus creating ties among objects for a given query. Dealing with this phenomenon presents the general problem of modelling preferences in the presence of ties in a query-specific way. To this end, we present in this paper a novel approach: constructing probabilistic models directly on the collection of objects, exploiting the combinatorial structure induced by the ties among them. The proposed probabilistic setting allows exploration of a super-exponential combinatorial state-space with unknown numbers of partitions and unknown order among them. Learning and inference in such a large state-space are challenging, and yet we present in this paper efficient algorithms to perform these tasks. Our approach exploits discrete choice theory, imposing a generative process such that the finite set of objects is partitioned into subsets in a stagewise procedure, thus significantly reducing the state-space at each stage. Efficient Markov chain Monte Carlo algorithms are then presented for the proposed models. We demonstrate that the model can potentially be trained in a large-scale setting of hundreds of thousands of objects using an ordinary computer. In fact, in some special cases with appropriate model specification, our models can be learned in linear time. We evaluate the models on two application areas: (i) document ranking with data from the Yahoo! challenge and (ii) collaborative filtering with movie data. We demonstrate that the models are competitive with the state of the art.
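The stagewise generative idea can be sketched as follows (the uniform group-size draw and the softmax membership probabilities are simplifying assumptions; the paper's discrete-choice construction is richer):

```python
import numpy as np

def stagewise_partition(worths, seed=0):
    """Draw an ordered partition of objects into tied groups, stage by
    stage: a group size is drawn, then members are chosen from the
    remaining objects with probability proportional to exp(worth)."""
    rng = np.random.default_rng(seed)
    remaining = list(range(len(worths)))
    partition = []
    while remaining:
        size = int(rng.integers(1, len(remaining) + 1))  # unknown number of ties
        w = np.exp([worths[i] for i in remaining])
        group = rng.choice(remaining, size=size, replace=False, p=w / w.sum())
        chosen = set(group.tolist())
        partition.append(sorted(chosen))
        remaining = [i for i in remaining if i not in chosen]  # state-space shrinks
    return partition  # e.g. [[2], [0, 3], [1]]: object 2 first, 0 and 3 tied
```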

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Huge image collections are becoming available lately. In this scenario, the use of Content-Based Image Retrieval (CBIR) systems has emerged as a promising approach to support image searches. The objective of CBIR systems is to retrieve the most similar images in a collection, given a query image, by taking into account image visual properties such as texture, color, and shape. In these systems, the effectiveness of the retrieval process depends heavily on the accuracy of ranking approaches. Recently, re-ranking approaches have been proposed to improve the effectiveness of CBIR systems by taking into account the relationships among images. The re-ranking approaches consider the relationships among all images in a given dataset. These approaches typically demand a huge amount of computational power, which hampers their use in practical situations. On the other hand, these methods can be massively parallelized. In this paper, we propose to speed up the computation of the RL-Sim algorithm, a recently proposed image re-ranking approach, by using the computational power of Graphics Processing Units (GPUs). GPUs are emerging as relatively inexpensive parallel processors that are becoming available on a wide range of computer systems. We address the image re-ranking performance challenges by proposing a parallel solution designed to fit the computational model of GPUs. We conducted an experimental evaluation considering different implementations and devices. Experimental results demonstrate that significant performance gains can be obtained. Our approach achieves speedups of 7x over a serial implementation for the overall algorithm, and up to 36x on its core steps.
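To illustrate why such re-ranking parallelizes well, here is a generic sketch of a neighbourhood-overlap iteration of the kind RL-Sim builds on (this is not the published algorithm; the membership construction and update rule are assumptions):

```python
import numpy as np  # e.g. swap for `import cupy as np` to run on a GPU

def rerank_iteration(sim, top_k=20):
    """One data-parallel re-ranking step: each image's similarities are
    recomputed from the overlap of its current top-k ranked lists."""
    n = sim.shape[0]
    ranks = np.argsort(-sim, axis=1)[:, :top_k]     # current top-k lists
    member = np.zeros((n, n), dtype=np.float32)
    np.put_along_axis(member, ranks, 1.0, axis=1)   # top-k membership matrix
    # Similarity of two images = size of the intersection of their top-k
    # lists; one dense matrix product, embarrassingly parallel on a GPU.
    return member @ member.T / top_k
```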

Relevance:

30.00%

Publisher:

Abstract:

In developing countries many water distribution systems are branched networks with little redundancy. If any component in the distribution system fails, many users are left relying on secondary water sources. These sources oftentimes do not provide potable water, and prolonged use leads to increased cases of waterborne illnesses. Increasing redundancy in branched networks increases the reliability of the networks, but is oftentimes viewed as unaffordable. This paper presents a procedure water system managers can use to determine which loops, when added to a branched network, provide the most benefit for users. Two methods are presented: one ranking the loops by the total number of users benefited, and one ranking the loops by the number of vulnerable users benefited. A case study is presented using the water distribution system of Medina Bank Village, Belize. It was found that forming loops in upstream pipes connected to the main line had the potential to benefit the most users.
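A minimal sketch of the two ranking methods (the input format, a map from each candidate loop to the users it would protect, is an assumption):

```python
def rank_loops(candidate_loops, vulnerable, by_vulnerable=False):
    """Rank candidate loops for a branched network by benefit.
    candidate_loops: {loop_id: set of user ids gaining redundancy};
    vulnerable: set of vulnerable user ids."""
    def benefit(users):
        return len(users & vulnerable) if by_vulnerable else len(users)
    return sorted(candidate_loops, key=lambda lid: -benefit(candidate_loops[lid]))
```

For example, with candidate_loops = {"A": {1, 2, 3}, "B": {2, 4}} and vulnerable = {2, 4}, ranking by total users puts loop A first, while ranking by vulnerable users puts loop B first.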

Relevance:

30.00%

Publisher:

Abstract:

This paper sheds new light on the determination of environmental policies in majoritarian federal electoral systems such as the U.S., and derives implications for the environmental federalism debate on whether the national or local government should have authority over environmental policies. In majoritarian systems, where the legislature consists of geographically distinct electoral districts, the majority party (at either the national or the state level) favors its own home districts; depending on the location of polluting industries and the associated pollution damages, the majority party may therefore impose sub-optimally high or low pollution taxes due to a majority bias. We show that majority bias can influence the social-welfare ranking of alternative government policies and, in some cases, may actually bring distortionary policies closer to the first-best solution.

Relevance:

30.00%

Publisher:

Abstract:

It has been hypothesized that results from short-term bioassays will ultimately provide information that will be useful for human health hazard assessment. Although toxicologic test systems have become increasingly refined, to date no investigator has been able to provide qualitative or quantitative methods which would support the use of short-term tests in this capacity. Historically, the validity of the short-term tests has been assessed using the framework of epidemiologic/medical screens. In this context, the result of the carcinogen (long-term) bioassay is generally used as the standard. However, this approach is widely recognized as being biased and, because it employs qualitative data, cannot be used in the setting of priorities. In contrast, the goal of this research was to address the problem of evaluating the utility of short-term tests for hazard assessment using an alternative method of investigation. Chemical carcinogens were selected from the list of carcinogens published by the International Agency for Research on Cancer (IARC). Tumorigenicity and mutagenicity data on fifty-two chemicals were obtained from the Registry of Toxic Effects of Chemical Substances (RTECS) and were analyzed using a relative potency approach. The relative potency framework allows for the standardization of data "relative" to a reference compound. To avoid any bias associated with the choice of the reference compound, fourteen different compounds were used. The data were evaluated in a format which allowed for a comparison of the ranking of the mutagenic relative potencies of the compounds (as estimated using short-term data) vs. the ranking of the tumorigenic relative potencies (as estimated from the chronic bioassays). The results were statistically significant (p < .05) for data standardized to thirteen of the fourteen reference compounds. Although this was a preliminary investigation, it offers evidence that short-term test systems may be of utility in ranking the hazards represented by chemicals which may be human carcinogens.
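The comparison of rankings can be sketched as a rank correlation between the two relative-potency orderings (the dict-based interface is an assumption; the study's exact statistic is not stated in the abstract):

```python
from scipy.stats import spearmanr

def potency_rank_agreement(mutagenic, tumorigenic, reference):
    """Compare rankings of mutagenic vs tumorigenic relative potencies.
    mutagenic/tumorigenic: {chemical: potency}; `reference` names the
    reference compound used for standardisation."""
    chems = [c for c in mutagenic if c != reference and c in tumorigenic]
    # Relative potency: each chemical's potency standardised to the reference.
    rp_mut = [mutagenic[c] / mutagenic[reference] for c in chems]
    rp_tum = [tumorigenic[c] / tumorigenic[reference] for c in chems]
    rho, p = spearmanr(rp_mut, rp_tum)  # rank correlation of the two orderings
    return rho, p
```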

Relevance:

30.00%

Publisher:

Abstract:

Mechanisms that allow pathogens to colonize the host are not the product of isolated genes, but instead emerge from the concerted operation of regulatory networks. Therefore, identifying the components and systemic behavior of networks is necessary for a better understanding of gene regulation and pathogenesis. To this end, I have developed systems biology approaches to study transcriptional and post-transcriptional gene regulation in bacteria, with an emphasis on the human pathogen Mycobacterium tuberculosis (Mtb). First, I developed a network response method to identify parts of the Mtb global transcriptional regulatory network utilized by the pathogen to counteract phagosomal stresses and survive within resting macrophages. As a result, the method unveiled transcriptional regulators and associated regulons utilized by Mtb to establish a successful infection of macrophages throughout the first 14 days of infection. Additionally, this network-based analysis identified the production of Fe-S proteins coupled to lipid metabolism through the alkane hydroxylase complex as a possible strategy employed by Mtb to survive in the host. Second, I developed a network inference method to infer the small non-coding RNA (sRNA) regulatory network in Mtb. The method identifies sRNA-mRNA interactions by integrating a priori knowledge of possible binding sites with structure-driven identification of binding sites. The reconstructed network was useful for predicting functional roles for the multitude of sRNAs recently discovered in the pathogen, with several sRNAs postulated to be involved in virulence-related processes. Finally, I applied a combined experimental and computational approach to study post-transcriptional repression mediated by small non-coding RNAs in bacteria. Specifically, a probabilistic ranking methodology termed rank-conciliation was developed to infer sRNA-mRNA interactions based on multiple types of data. The method was shown to improve target prediction in Escherichia coli and is therefore useful for prioritizing candidate targets for experimental validation.
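Since the abstract does not define rank-conciliation, here is a generic weighted rank-aggregation sketch of the kind of combination it performs over multiple data types (the interface and the rank-averaging rule are assumptions, not the published method):

```python
def conciliate_ranks(score_tables, weights=None):
    """Combine candidate-target rankings from multiple data types into one
    list. score_tables: list of {target: score}, one per data type."""
    targets = sorted(set().union(*score_tables))
    weights = weights or [1.0] * len(score_tables)
    agg = {t: 0.0 for t in targets}
    for table, w in zip(score_tables, weights):
        # Convert scores to ranks (1 = best); targets missing from a data
        # type are assigned the worst possible rank.
        order = sorted(table, key=table.get, reverse=True)
        rank = {t: i + 1 for i, t in enumerate(order)}
        worst = len(targets) + 1
        for t in targets:
            agg[t] += w * rank.get(t, worst)
    return sorted(targets, key=lambda t: agg[t])  # best conciliated rank first
```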