885 results for fuzzy based evaluation method


Relevance:

100.00%

Publisher:

Abstract:

Personal information is increasingly gathered and used to provide services tailored to user preferences, but the datasets used to provide such functionality can represent serious privacy threats if not appropriately protected. Work in privacy-preserving data publishing has targeted privacy guarantees that protect against record re-identification, by making records indistinguishable, or against sensitive attribute value disclosure, by introducing diversity or noise into the sensitive values. However, most approaches fail in the high-dimensional case, and the ones that do not fail introduce a utility cost that is incompatible with tailored recommendation scenarios. This paper aims at a sensible trade-off between privacy and the benefits of tailored recommendations, in the context of privacy-preserving data publishing. We empirically demonstrate that significant privacy improvements can be achieved at a utility cost compatible with tailored recommendation scenarios, using a simple partition-based sanitization method.
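
For illustration, a minimal sketch of what a partition-based sanitization of this kind can look like (in the spirit of Mondrian-style generalization) is given below; the column names, the parameter k, and the recursive median split are assumptions made for the example, not the paper's actual algorithm.

from statistics import median

def partition(records, qid_cols, k):
    # Recursively split on the quasi-identifier with the widest range;
    # stop when a further split would leave a group smaller than k.
    def split(group):
        spans = {c: max(r[c] for r in group) - min(r[c] for r in group) for c in qid_cols}
        col = max(spans, key=spans.get)
        m = median(r[col] for r in group)
        left = [r for r in group if r[col] <= m]
        right = [r for r in group if r[col] > m]
        if len(left) >= k and len(right) >= k:
            return split(left) + split(right)
        return [group]
    return split(records)

def sanitize(records, qid_cols, k=5):
    # Replace each quasi-identifier value with the min-max range of its partition.
    out = []
    for group in partition(records, qid_cols, k):
        for r in group:
            g = dict(r)
            for c in qid_cols:
                g[c] = (min(x[c] for x in group), max(x[c] for x in group))
            out.append(g)
    return out

# Example with hypothetical numeric quasi-identifiers (age, 3-digit zip prefix).
data = [{"age": a, "zip3": z, "item": i} for a, z, i in
        [(25, 940, "A"), (31, 941, "B"), (47, 945, "A"), (52, 946, "C"),
         (29, 940, "B"), (38, 944, "A"), (61, 947, "B"), (44, 945, "C"),
         (33, 942, "A"), (57, 946, "B")]]
print(sanitize(data, ["age", "zip3"], k=3))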

Relevance:

100.00%

Publisher:

Abstract:

Life Cycle Climate Performance (LCCP) is an evaluation method by which heating, ventilation, air conditioning and refrigeration systems can be assessed for their global warming impact over the course of their complete life cycle. LCCP is more inclusive than previous metrics such as Total Equivalent Warming Impact. It is calculated as the sum of direct and indirect emissions generated over the lifetime of the system "from cradle to grave". Direct emissions include all effects from the release of refrigerants into the atmosphere during the lifetime of the system, including annual leakage and losses during the disposal of the unit. The indirect emissions include emissions from the energy consumption during the manufacturing process, lifetime operation, and disposal of the system. This thesis proposes a standardized approach to the use of LCCP and traceable data sources for all aspects of the calculation. An equation is proposed that unifies the efforts of previous researchers. Data sources are recommended for average values for all LCCP inputs. A residential heat pump sample problem is presented illustrating the methodology. The heat pump is evaluated at five U.S. locations in different climate zones. An Excel tool was developed for residential heat pumps using the proposed method. The primary factor in the LCCP calculation is the energy consumption of the system. The effects of advanced vapor compression cycles are then investigated for heat pump applications. Advanced cycle options attempt to reduce the energy consumption in various ways. There are three categories of advanced cycle options: subcooling cycles, expansion loss recovery cycles and multi-stage cycles. The cycles selected for research are the suction line heat exchanger cycle, the expander cycle, the ejector cycle, and the vapor injection cycle. The cycles are modeled using Engineering Equation Solver and the results are applied to the LCCP methodology. The expander cycle, ejector cycle and vapor injection cycle are effective in reducing the LCCP of a residential heat pump by 5.6%, 8.2% and 10.5%, respectively, in Phoenix, AZ. The advanced cycles are also evaluated with low-GWP refrigerants and are capable of reducing the LCCP of a residential heat pump by 13.7%, 16.3% and 18.6% using a refrigerant with a GWP of 10. To meet the U.S. Department of Energy's goal of reducing residential energy use by 40% by 2025, assuming a proportional reduction in all other categories of residential energy consumption, the energy consumption of a residential heat pump in Phoenix, AZ would need to fall by 34.8% with a refrigerant GWP of 10. A combination of advanced cycles, control options and low-GWP refrigerants is necessary to meet this goal.
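
The "sum of direct and indirect emissions" can be written schematically as follows; this is the common simplified form used in the LCCP literature, with generic symbols that are assumptions here and not necessarily the thesis's unified equation or notation.

\[
\mathrm{LCCP} = \underbrace{C\,\mathrm{GWP}\left(L\,r_{\mathrm{leak}} + r_{\mathrm{EOL}}\right)}_{\text{direct emissions}}
\;+\;
\underbrace{L\,E_{\mathrm{ann}}\,\beta_{\mathrm{el}} + m\,\mathrm{MM} + C\left(1 + L\,r_{\mathrm{leak}}\right)\mathrm{RM}}_{\text{indirect emissions}}
\]

where C is the refrigerant charge (kg), GWP the refrigerant's global warming potential, L the system lifetime (yr), r_leak the annual leakage rate, r_EOL the end-of-life refrigerant loss fraction, E_ann the annual energy consumption (kWh), beta_el the CO2-equivalent emission factor of electricity, m and MM the unit mass and its material-manufacturing emission factor, and RM the emission factor for refrigerant manufacture.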

Relevance:

100.00%

Publisher:

Abstract:

The description of terms in traditional terminological resources is limited to certain pieces of information, such as the term (mostly nominal), its definition, and its equivalent in a foreign language. This description rarely provides other information that can be very useful to users, especially if they consult the resources in order to deepen their knowledge of a specialized domain, master professional writing, or find contexts in which the term they are looking for is used. Information that can be useful in this respect includes the description of the actantial structure of terms, contexts drawn from authentic sources, and the inclusion of other parts of speech such as verbs. Verbs and deverbal nouns, or predicative terminological units (PTUs), often ignored by classical terminology, are of great importance when it comes to expressing an action, a process or an event. However, describing these units requires a terminological description model that accounts for their particularities. A number of terminologists (Condamines 1993, Mathieu-Colas 2002, Gross and Mathieu-Colas 2001, and L'Homme 2012, 2015) have indeed proposed description models based on different theoretical frameworks. Our research consists in proposing a methodology for the terminological description of PTUs of the Arabic language, specifically Modern Standard Arabic (MSA), according to the theory of Frame Semantics of Fillmore (1976, 1977, 1982, 1985) and its application, the FrameNet project (Ruppenhofer et al. 2010). The specialized domain of interest is computing. In our research, we rely on a corpus collected from the web and draw on an existing terminological resource, the DiCoInfo (L'Homme 2008), to compile our own resource. Our objectives can be summarized as follows. First, we wish to lay the foundations of an MSA version of this resource. This version has its own particularities: 1) we target very specific units, namely verbal and deverbal PTUs; 2) the methodology developed for the compilation of the original DiCoInfo has to be adapted to take a Semitic language into account. Next, we wish to create a frame-based version of this resource, in which the PTUs are grouped into semantic frames, following the FrameNet model. To this resource we add the English and French PTUs, since this part of the work has a multilingual scope. The methodology consists in automatically extracting verbal and nominal terminological units (VTUs and NTUs), such as Ham~ala (حمل) (to download) and taHmiyl (تحميل) (downloading). To do so, we adapted an existing automatic term extractor, TermoStat (Drouin 2004). Then, using terminological validation criteria (L'Homme 2004), we validate the terminological status of part of the candidates. After validation, we create terminological records, using an XML editor, for each VTU and NTU retained. These records include elements such as the actantial structure of the PTUs and up to twenty annotated contexts. The last step consists in creating semantic frames from the MSA PTUs. We also associate English and French PTUs with the frames created.
This association led to the creation of a terminological resource called "DiCoInfo: A Framed Version". In this resource, the PTUs that share the same semantic properties and actantial structures are grouped into semantic frames. For example, the semantic frame Product_development groups PTUs such as Taw~ara (طور) (to develop), to develop and développer. Following these steps, we obtained a total of 106 MSA PTUs compiled in the MSA version of the DiCoInfo and 57 semantic frames associated with these units in the framed version of the DiCoInfo. Our research shows that MSA can be described with the methodology we have developed.
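
Purely as an illustration of what a term record with an actantial structure and annotated contexts might look like, the sketch below builds a hypothetical XML entry; the element and attribute names are invented and do not reflect the actual DiCoInfo schema.

# Hypothetical sketch of an XML term record; names are illustrative only.
import xml.etree.ElementTree as ET

record = ET.Element("term-record", lang="ar", domain="computing")
ET.SubElement(record, "lemma").text = "طور"            # Taw~ara 'to develop'
ET.SubElement(record, "pos").text = "verb"
actants = ET.SubElement(record, "actantial-structure")
ET.SubElement(actants, "actant", role="Agent").text = "developer"
ET.SubElement(actants, "actant", role="Patient").text = "software"
ET.SubElement(record, "frame").text = "Product_development"   # shared with 'to develop', 'développer'
contexts = ET.SubElement(record, "contexts")
ET.SubElement(contexts, "context", source="web corpus").text = "..."  # up to twenty annotated contexts per entry

print(ET.tostring(record, encoding="unicode"))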

Relevance:

100.00%

Publisher:

Abstract:

This dissertation verifies whether the following two hypotheses are true: (1) high-occupancy/toll (HOT) lanes, and therefore other dedicated lanes, have capacity that could still be used; (2) such unused capacity (or more precisely, "unused managed capacity") can be sold successfully through a real-time auction. To show that the second statement is true, this dissertation proposes an auction-based metering (ABM) system, that is, a mechanism that regulates the traffic entering the dedicated lanes. Participation in the auction is voluntary and can be skipped by paying the toll or by not registering for the new system. This dissertation comprises the following four components: a measurement of unused managed capacity on an existing HOT facility, a game-theoretic model of an ABM system, an operational description of the ABM system, and a simulation-based evaluation of the system. Other, more specific contributions of this dissertation include the following: (1) it provides a definition and a methodology for measuring unused managed capacity and another important variable referred to as "potential volume increase"; (2) it proves that the game-theoretic model has a unique Bayesian Nash equilibrium; (3) it provides a specific road design that can be applied or extended to other facilities. The results provide evidence that the hypotheses are true and suggest that the ABM system would benefit a public operator interested in reducing traffic congestion significantly, would benefit drivers making low-reliability trips (such as work-to-home trips), and would potentially benefit a private operator interested in raising revenue.
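
As a rough illustration of the "unused managed capacity" idea, the sketch below sums, over short intervals, the shortfall between a target service flow for the managed lane and the flow actually observed; the target value, interval length and counts are invented here and are not the dissertation's definitions or data.

# Illustrative only: unused managed capacity as the positive gap between a
# target service flow and the observed flow, accumulated over the analysis period.
def unused_managed_capacity(observed_veh_per_hr, target_veh_per_hr=1600, dt_hr=0.25):
    """Sum of positive gaps between target and observed flow, in vehicles."""
    return sum(max(target_veh_per_hr - q, 0.0) * dt_hr for q in observed_veh_per_hr)

# Example: 15-minute counts over one peak hour on a hypothetical HOT lane.
counts = [1200, 1350, 1100, 1425]          # veh/h in each 15-min interval
print(unused_managed_capacity(counts))     # -> 331.25 vehicles of unused capacity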

Relevance:

100.00%

Publisher:

Abstract:

Quality is one of the dynamic factors of competitiveness and is unquestionably one of the fundamental pillars on which the success of organizations is built. The concept of Quality is not easy to define, since it is complex and hard to reach a consensus on; many notions of Quality exist in the literature, but they all converge on a single idea: the continuous search for improvement and excellence. On the other hand, the implementation of Management Accounting (namely Activity-Based Costing, ABC) in an organization, especially in an educational institution, provides the means to identify the best cost drivers. It therefore becomes inevitable to observe activities, survey internal and external collaborators, and develop and apply quantitative methods that monitor processes and procedures; it is especially important that top management is fully committed, so that strategy and organizational quality are interrelated. This work presents several aspects of the costs of quality (and of non-quality), the contemporary ABC technique itself, and the main results obtained from checklists and a questionnaire survey of students, teaching staff and non-teaching staff, with the aim of analysing the accounting-based situation of quality costs and assessing the degree of satisfaction/motivation with the quality of the service provided at the head school of the Agrupamento de Escolas of the municipality of Estremoz, Portugal. The results of this study show that there are both benefits and difficulties in applying Quality Management in an educational institution. The organizational culture of this type of institution is one of the aspects to be taken into consideration, so that Quality Management principles can be implemented harmoniously; this may set these organizations on a true path of quality, within a philosophy of continuous improvement towards excellence.
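
As background for the ABC technique mentioned above, the sketch below shows the basic Activity-Based Costing arithmetic: activity costs are divided by cost-driver volumes to obtain driver rates, which are then applied to a cost object's driver consumption. The activities and figures are invented for illustration only, not the study's data.

# Minimal illustration of Activity-Based Costing (ABC) allocation logic.
activity_cost = {"enrolment handling": 12000.0, "exam administration": 8000.0}   # annual cost pools
driver_volume = {"enrolment handling": 600,      # enrolments processed per year
                 "exam administration": 200}     # exams administered per year

rates = {a: activity_cost[a] / driver_volume[a] for a in activity_cost}          # cost per driver unit

# Cost assigned to one cost object (e.g., one course) given its driver consumption.
consumption = {"enrolment handling": 45, "exam administration": 12}
course_cost = sum(rates[a] * consumption[a] for a in consumption)
print(rates, course_cost)   # {'enrolment handling': 20.0, 'exam administration': 40.0} 1380.0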

Relevance:

100.00%

Publisher:

Abstract:

Thanks to advanced technologies and social networks that allow data to be widely shared across the Internet, there is an explosion of pervasive multimedia data, generating high demand for multimedia services and applications that let people easily access and manage multimedia data in various areas. To address such demands, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare event detection in videos. The proposed framework is able to explore hidden semantic feature groups in multimedia data and to incorporate temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data in multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) for producing the initial detection results. Then, the TMCA algorithm is proposed to efficiently incorporate temporal semantics for re-ranking and improving the final performance. Finally, a sampling-based ensemble learning mechanism is applied to further accommodate imbalanced datasets. In addition to the multimedia semantic representation and class imbalance problems, lack of organization is another critical issue for multimedia big data analysis. In this framework, an affinity propagation-based summarization method is also proposed to transform the unorganized data into a better structure with clean and well-organized information. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.
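
A minimal sketch of one ingredient named above, a sampling-based ensemble for imbalanced (rare-event) data, is shown below; it is a generic under-sampling ensemble, not the dissertation's IF-TMCA/HCFG pipeline, and the classifier choice and parameters are assumptions.

# Each base learner sees all rare-event samples plus a random subsample of the
# majority class; scores are averaged across the ensemble.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def undersample_ensemble(X, y, n_models=10, seed=0):
    rng = np.random.default_rng(seed)
    pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
    models = []
    for _ in range(n_models):
        sub = rng.choice(neg, size=len(pos), replace=False)   # balance the classes
        idx = np.concatenate([pos, sub])
        models.append(DecisionTreeClassifier(max_depth=5).fit(X[idx], y[idx]))
    return models

def predict_score(models, X):
    # Mean predicted probability of the rare class across all base learners.
    return np.mean([m.predict_proba(X)[:, 1] for m in models], axis=0)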

Relevance:

100.00%

Publisher:

Abstract:

Maintenance of transport infrastructure assets is widely advocated as the key to minimizing current and future costs of the transportation network. While effective maintenance decisions are often a result of engineering skills and practical knowledge, efficient decisions must also account for the net result over an asset's life cycle. One essential aspect in the long-term perspective of transport infrastructure maintenance is to proactively estimate maintenance needs. In dealing with immediate maintenance actions, support tools that can prioritize potential maintenance candidates are important for obtaining an efficient maintenance strategy. This dissertation consists of five individual research papers presenting a microdata analysis approach to transport infrastructure maintenance. Microdata analysis is a multidisciplinary field in which large quantities of data are collected, analyzed, and interpreted to improve decision-making. Increased access to transport infrastructure data enables a deeper understanding of causal effects and a possibility to make predictions of future outcomes. The microdata analysis approach covers the complete process from data collection to actual decisions and is therefore well suited to the task of improving efficiency in transport infrastructure maintenance. Statistical modeling was the selected analysis method in this dissertation and provided solutions to the different problems presented in each of the five papers. In Paper I, a time-to-event model was used to estimate remaining road pavement lifetimes in Sweden. In Paper II, an extension of the model in Paper I assessed the impact of latent variables on road lifetimes, revealing the sections in a road network that are weaker due to, e.g., subsoil conditions or undetected heavy traffic. The study in Paper III incorporated a probabilistic parametric distribution as a representation of road lifetimes into an equation for the marginal cost of road wear. Differentiated road wear marginal costs for heavy and light vehicles are an important information basis for decisions regarding vehicle miles traveled (VMT) taxation policies. In Paper IV, a distribution-based clustering method was used to distinguish between road segments that are deteriorating and road segments that have a stationary road condition. Within railway networks, temporary speed restrictions are often imposed because of maintenance and must be addressed in order to maintain punctuality. The study in Paper V evaluated the empirical effect of speed restrictions on running time on a Norwegian railway line, using a generalized linear mixed model.
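
As a simplified illustration of the parametric time-to-event modelling mentioned for Papers I and III, the sketch below fits a Weibull distribution to invented, uncensored pavement lifetimes; the actual models also handle censoring and covariates.

# Toy Weibull time-to-event fit; data values are invented for illustration.
import numpy as np
from scipy.stats import weibull_min

lifetimes_yr = np.array([8.5, 12.0, 15.2, 9.8, 20.1, 14.4, 11.3, 17.6])
shape, loc, scale = weibull_min.fit(lifetimes_yr, floc=0)   # fix location at zero

median_life = weibull_min.median(shape, loc=0, scale=scale)
p_survive_15 = weibull_min.sf(15.0, shape, loc=0, scale=scale)  # P(lifetime > 15 yr)
print(shape, scale, median_life, p_survive_15)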

Relevance:

100.00%

Publisher:

Abstract:

The Mediterranean silvo-pastoral system known as Montado, in Portugal, is a complex land use system composed of an open tree stratum in various densities and an herbaceous layer used for livestock grazing. Livestock also profit from the acorns, and the grazing contributes to avoiding shrub encroachment. In the last 20 years, subsidies from the European Union have greatly promoted cattle rearing in this system and the introduction of heavy breeds, at the expense of sheep, goats or the native cattle breeds. The balance of the traditional system is thus threatened, and a precise assessment of the balance between the different components of the system is therefore highly needed. The goal of this study was to gain a better understanding of a Montado farm system with cattle rearing as the major economic activity by applying the emergy evaluation method to calculate indices of yield, investment, environmental loading and sustainability. By integrating different ecosystem components, the emergy evaluation method allows a comprehensive evaluation of this complex and multifunctional system at the scale of an individual farm. This method provides a set of indices that can help us understand the system and design management strategies that maximize emergy flow on the farm. In this paper, we apply the emergy evaluation method to a Montado farm with cattle rearing, as a way to gain a better understanding of this system at the farm scale. The value for the transformity of veal (2.66E+06 sej J-1) is slightly higher when compared to other systems producing protein. This means that the investment of nature and man in this product was higher, and that it requires a premium price on the market. The renewability for Holm Oaks Farm (49%), lower than for other similar systems, supports the assumption that this is a farm in which, compared with others, the share of purchased inputs relative to the renewable inputs provided by nature is higher. The Emergy Investment Ratio is 0.91 for cattle rearing, compared to a value of 0.49 for cork and 0.43 for firewood harvesting, making it clear that cattle rearing is a more labor-demanding activity than extractive activities such as cork and firewood harvesting.
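
For reference, the emergy indices cited here (transformity, renewability and the Emergy Investment Ratio) are usually defined as follows; these are the standard textbook definitions, which the paper may complement with farm-specific adjustments.

\[
\mathrm{Transformity} = \frac{U}{E_{\mathrm{product}}}, \qquad
\%R = \frac{R}{U}, \qquad
\mathrm{EIR} = \frac{F}{R+N}, \qquad
U = R + N + F
\]

where R, N and F are the emergy (in sej) of renewable local inputs, non-renewable local inputs and purchased inputs, U is the total emergy use, and E_product is the energy content of the product (J); the transformity of veal, for instance, is the total emergy supporting veal production divided by the energy of the veal produced.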

Relevance:

100.00%

Publisher:

Abstract:

This paper analyses the evaluation practice of Visual Arts teachers in schools in Macapá, in the State of Amapá, and is a response to questions raised about assessment in this field. It is an evaluative study, designed with the goal of bringing teachers an innovative proposal in terms of evaluation. The study is oriented towards a pedagogy with a sociocultural focus and is in tune with the current moment of Brazilian education, in which educators seek to prepare students for citizenship with a critical view of the world around them; the many discussions around the topic of assessment in Visual Arts gave rise to this significant document within the educational context of Amapá.

Relevance:

60.00%

Publisher:

Abstract:

A conventional method for the seismic strengthening of masonry walls is the external application of a reinforced concrete layer (shotcrete). However, due to the lack of analytical and experimental information on the behavior of strengthened walls, design procedures are usually based on empirical relations. Using these design procedures has resulted in massive strengthening details in retrofitting projects. This paper presents a computational framework for the nonlinear analysis of strengthened masonry walls, and its versatility has been verified by comparing numerical and experimental results. Based on the developed numerical model and the available experimental information, design relations and failure modes are proposed for strengthened walls in accordance with the ASCE 41 standard. Finally, a sample masonry structure has been strengthened using both the proposed and the available conventional methods. It is shown that using the proposed method results in less extensive strengthening details and appropriate (ductile) failure modes.

Relevance:

60.00%

Publisher:

Abstract:

The extension of traditional data mining methods to time series has been effectively applied to a wide range of domains such as finance, econometrics, biology, security, and medicine. Many existing mining methods deal with the task of change-point detection, but very few provide a flexible approach. Querying specific change points with linguistic variables is particularly useful in crime analysis, where intuitive, understandable, and appropriate detection of changes can significantly improve the allocation of resources for timely and concise operations. In this paper, we propose an on-line method for detecting and querying change points in crime-related time series with the use of a meaningful representation and a fuzzy inference system. Change-point detection is based on a shape-space representation, and linguistic terms describing geometric properties of the change points are used to express queries, offering the advantage of intuitiveness and flexibility. An empirical evaluation is first conducted on a crime data set to confirm the validity of the proposed method, and then on a financial data set to test its general applicability. A comparison with a similar change-point detection algorithm and a sensitivity analysis are also conducted. Results show that the method is able to accurately detect change points at very low computational cost. More broadly, the detection of specific change points within time series of virtually any domain is made more intuitive and more understandable, even for experts not related to data mining.
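
A minimal sketch of the idea of querying change points with linguistic terms is given below; the membership functions, normalisation and the single rule are invented for illustration, and the paper's method additionally relies on a shape-space representation of the series.

# Toy fuzzy query over properties of a detected change point.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def query_abrupt_large(slope, magnitude):
    """Degree to which a change point matches the query 'abrupt AND large'."""
    abrupt = tri(slope, 0.5, 1.0, 1.5)        # normalized slope of the change
    large = tri(magnitude, 2.0, 4.0, 6.0)     # size of the level shift, in std devs
    return min(abrupt, large)                 # AND modelled as the minimum

# Example: a change with normalized slope 0.9 and a 3.5-sigma shift.
print(query_abrupt_large(0.9, 3.5))   # -> 0.75 (minimum of 0.8 and 0.75)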

Relevance:

60.00%

Publisher:

Abstract:

Fuzzy logic admits infinitely many intermediate logical values between false and true. Based on this principle, a system of fuzzy rules was developed in this study to indicate the body mass index of ruminant animals in order to determine the best time for slaughter. The controller takes the variables weight and height as inputs and outputs a new body mass index, called the Fuzzy Body Mass Index (Fuzzy BMI), which can serve as a detection system at slaughtering time, comparing animals through the linguistic terms "Very Low", "Low", "Average", "High" and "Very High". To demonstrate the application of this fuzzy system, 147 Nellore beeves were analysed to determine Fuzzy BMI values for each animal and to indicate the distribution of body mass within a herd. System performance was validated through a statistical analysis yielding a Pearson correlation coefficient of 0.923, a high positive correlation indicating that the proposed method is appropriate. Thus, the method allows the herd to be evaluated by comparing each animal within the group, providing a quantitative basis for the farmer's decisions. It is concluded that this study established a computational method, based on fuzzy logic, that mimics part of human reasoning and interprets the body mass index of any bovine species in any region of the country.
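
A hedged sketch of a Mamdani-style controller with weight and height inputs and a fuzzy index output, in the spirit of the Fuzzy BMI described, is shown below; the membership ranges, rule base and 0-10 output scale are invented and are not the study's calibrated system.

# Toy Mamdani controller: fuzzify weight/height, fire rules, defuzzify by centroid.
import numpy as np

def tri(x, a, b, c):
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_bmi(weight_kg, height_m):
    # Input fuzzification (illustrative ranges for beef cattle).
    w_low, w_high = tri(weight_kg, 250, 350, 450), tri(weight_kg, 400, 520, 650)
    h_low, h_high = tri(height_m, 1.20, 1.35, 1.50), tri(height_m, 1.40, 1.55, 1.70)

    # Rule base: heavy and short -> High index; light and tall -> Low index; etc.
    high = min(w_high, h_low)
    avg  = max(min(w_high, h_high), min(w_low, h_low))
    low  = min(w_low, h_high)

    # Output terms on a 0-10 index scale, aggregated and defuzzified by centroid.
    x = np.linspace(0, 10, 101)
    agg = np.maximum.reduce([np.minimum(low,  tri(x, 0, 2, 4)),
                             np.minimum(avg,  tri(x, 3, 5, 7)),
                             np.minimum(high, tri(x, 6, 8, 10))])
    return float((x * agg).sum() / agg.sum()) if agg.sum() > 0 else 5.0

print(fuzzy_bmi(480, 1.38))   # a heavier, shorter animal scores toward "High" (here 8.0)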

Relevance:

60.00%

Publisher:

Abstract:

A fuzzy rule-based system was developed in this study, resulting in an index that indicates the level of uncertainty in commercial transactions between cassava growers and their dealers. The fuzzy system was developed based on the Transaction Cost Economics approach, from input variables regarding information sharing between grower and dealer on "Demand/purchase Forecasting", "Production Forecasting" and "Production Innovation". The output variable is the level of uncertainty in the transaction between the seller and buyer agents, which may serve as a system for detecting inefficiencies. Evidence from 27 cassava growers registered in the Regional Development Offices of Tupa and Assis, São Paulo, Brazil, and from 48 of their dealers supported the development of the system. The mathematical model indicated that 55% of the growers present a Very High level of uncertainty and 33% a Medium or High level; the remainder present Low or Very Low levels of uncertainty. From the model, simulations of external interventions can be run in order to reduce the degree of uncertainty and, thus, lower transaction costs.

Relevance:

60.00%

Publisher:

Abstract:

While channel coding is a standard method of improving a system's energy efficiency in digital communications, its practice does not extend to high-speed links. Increasing demands on network speeds are placing a large burden on the energy efficiency of high-speed links and make the benefit of channel coding for these systems a timely subject. The low error rates of interest and the presence of residual intersymbol interference (ISI) caused by hardware constraints impede the analysis and simulation of coded high-speed links. Focusing on the residual ISI and combined noise as the dominant error mechanisms, this paper analyses error correlation through the concepts of error region, channel signature, and correlation distance. This framework provides a deeper insight into joint error behaviours in high-speed links, extends the range of statistical simulation for coded high-speed links, and provides a case against the use of biased Monte Carlo methods in this setting.
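
The toy simulation below illustrates why residual ISI plus noise produces correlated errors, and how an empirical error correlation versus bit separation (a stand-in for the "correlation distance" notion) can be estimated; the channel taps and noise level are arbitrary illustrative values, not a model of any particular link.

# Toy binary link with residual ISI and additive noise; measure how correlated
# slicer errors are as a function of their separation in bits.
import numpy as np

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 200_000)
sym = 2.0 * bits - 1.0                                    # +/-1 signaling

taps = np.array([1.0, 0.35, 0.15])                        # main cursor + residual ISI
rx = np.convolve(sym, taps)[: len(sym)] + 0.45 * rng.standard_normal(len(sym))
err = (rx > 0).astype(int) != bits                        # slicer decision errors

def err_corr(err, lag):
    """Correlation between error indicators 'lag' bits apart."""
    a, b = err[:-lag].astype(float), err[lag:].astype(float)
    return np.corrcoef(a, b)[0, 1]

for lag in (1, 2, 5, 10):
    print(lag, err_corr(err, lag))   # correlation decays as the separation grows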