466 results for Parameterized polygons


Relevance:

10.00%

Publisher:

Abstract:

Proceedings of the 29th Annual International Conference of the IEEE EMBS, Cité Internationale, Lyon, France, August 23-26, 2007

Relevance:

10.00%

Publisher:

Abstract:

The Evidence Accumulation Clustering (EAC) paradigm is a clustering ensemble method which derives a consensus partition from a collection of base clusterings obtained using different algorithms. It collects from the partitions in the ensemble a set of pairwise observations about the co-occurrence of objects in the same cluster and uses these co-occurrence statistics to derive a similarity matrix, referred to as the co-association matrix. The Probabilistic Evidence Accumulation for Clustering Ensembles (PEACE) algorithm is a principled approach for extracting a consensus clustering from the observations encoded in the co-association matrix, based on a probabilistic model for that matrix parameterized by the unknown assignments of objects to clusters. In this paper we extend the PEACE algorithm by deriving the consensus solution according to a MAP approach with Dirichlet priors defined over the unknown probabilistic cluster assignments. In particular, we study the positive regularization effect of Dirichlet priors on the final consensus solution with both synthetic and real benchmark data.
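
For concreteness, here is a minimal sketch (not the PEACE implementation) of how the co-association matrix described above can be accumulated from an ensemble of base partitions; the function name and toy ensemble are illustrative only.

```python
import numpy as np

def co_association_matrix(partitions):
    """Accumulate pairwise co-occurrence evidence from an ensemble.

    partitions: list of 1-D integer arrays; partitions[m][i] is the
    cluster label of object i in the m-th base clustering.
    Returns an n x n matrix whose (i, j) entry is the fraction of
    base partitions in which objects i and j share a cluster.
    """
    n = len(partitions[0])
    C = np.zeros((n, n))
    for labels in partitions:
        labels = np.asarray(labels)
        # indicator of "same cluster" for every pair (i, j)
        C += (labels[:, None] == labels[None, :]).astype(float)
    return C / len(partitions)

# toy ensemble of three base clusterings over 4 objects
ensemble = [np.array([0, 0, 1, 1]),
            np.array([0, 1, 1, 1]),
            np.array([0, 0, 0, 1])]
print(co_association_matrix(ensemble))
```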

Relevance:

10.00%

Publisher:

Abstract:

Given the national and global energy-conservation panorama, it is very important nowadays to be able to control and estimate energy consumption in buildings. Thus, in view of the current energy problem and the growing energy consumption in buildings, it is important to parameterize, evaluate and compare this consumption. To this end, over recent decades technical developments have been made both in field equipment for monitoring and measurement and in the dynamic simulation of buildings. This master's dissertation set out to perform the dynamic simulation of an existing school building in full operation, and to carry out a sensitivity analysis of the degree of variation between the results obtained with two thermal and energy calculation programs: VE-Pro from IES (Integrated Environmental Solutions) and Trace 700 from TRANE. Both programs were parameterized with the same input data, taking into account the simulation options each one offers. The simulation outputs were then used to compute the building's energy rating under the energy certification system (SCE), using a spreadsheet developed for this purpose. Several energy-efficiency solutions were also considered for the building, aiming at real energy savings while always attending to the occupants' thermal comfort. These include lighting measures, such as replacing the existing lighting with LED (Light Emitting Diode) luminaires; renewable-energy solutions, such as installing solar collectors for domestic hot water heating and photovoltaic panels for energy production; and measures related to the air-conditioning equipment. The energy rating was then recalculated with these improvements applied. The results of the two simulations were analysed in terms of heating, cooling, ventilation, lighting and electrical equipment. For each of these parameters, the two simulations differed by less than 5%, with the largest deviation, approximately 4.9%, found in ventilation. Carried over to the calculation of the IEE (Índice de Eficiência Energética, the energy efficiency index), the deviation was below 2%.

Relevance:

10.00%

Publisher:

Abstract:

Disaster management is one of the most relevant application fields of wireless sensor networks. In this application, the role of the sensor network usually consists of obtaining a representation or a model of a physical phenomenon spreading through the affected area. In this work we focus on forest firefighting operations, proposing three fully distributed ways of approximating the actual shape of the fire. In the simplest approach, a circular burnt area is assumed around each node that has detected the fire, and the union of these circles gives the overall fire shape. However, as this approach makes intensive use of the wireless sensor network's resources, we propose two in-network aggregation techniques that do not require considering the complete set of fire detections. The first technique models the fire by means of a complex shape composed of multiple convex hulls representing different burning areas, while the second uses a set of arbitrary polygons. Performance evaluation with realistic fire models in computer simulations reveals that the method based on arbitrary polygons improves the accuracy of the fire-shape approximation by 20% while reducing the in-network resource overhead to 10% in the best case.
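
The first aggregation technique summarises a set of detections with convex hulls; a standard way to compute a single hull is Andrew's monotone chain, sketched below. This is an illustrative centralized sketch, not the authors' distributed in-network implementation.

```python
def convex_hull(points):
    """Andrew's monotone chain; points is a list of (x, y) tuples.
    Returns the hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        # z-component of (a - o) x (b - o); <= 0 means a clockwise turn
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# fire detections reported by sensor nodes (coordinates in metres)
detections = [(0, 0), (4, 1), (2, 3), (1, 1), (3, 4), (0, 3)]
print(convex_hull(detections))
```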

Relevance:

10.00%

Publisher:

Abstract:

ABSTRACT - It is the purpose of the present thesis to emphasize, through a series of examples, the need for and value of appropriate pre-analysis of the impact of health care regulation. Specifically, the thesis presents three papers on the theme of regulation in different aspects of health care provision and financing. The first two consist of economic analyses of the impact of health care regulation, and the third comprises the creation of an instrument for supporting economic analysis of health care regulation, namely in the field of evaluation of health care programs. The first paper develops a model of health plan competition and pricing in order to understand the dynamics of health plan entry and exit in the presence of switching costs and alternative health premium payment systems. We build an explicit model of death spirals, in which profit-maximizing competing health plans find it optimal to adopt a pattern of increasing relative prices culminating in health plan exit. We find the steady-state numerical solution for the price sequence and the plan's optimal length of life through simulation and perform some comparative statics. This allows us to show that using risk-adjusted premiums and imposing price floors are effective at reducing death spirals and switching costs, while having employees pay a fixed share of the premium enhances death spirals and increases switching costs. Price regulation of pharmaceuticals is one of the cost control measures adopted by the Portuguese government, as in many European countries. When such regulation decreases the products' real price over time, it may create an incentive for product turnover. Using panel data for the period 1997 through 2003 on drug packages sold in Portuguese pharmacies, the second paper addresses the question of whether price control policies create an incentive for product withdrawal. Our work builds on the product survival literature by accounting for unobservable product characteristics and heterogeneity among consumers when constructing quality, price control and competition indexes. These indexes are then used as covariates in a Cox proportional hazard model. We find that price control measures do indeed increase the probability of exit, and that this effect is not observed in the OTC market, where no such price regulation measures exist. We also find quality to have a significant positive impact on product survival. In the third paper, we develop a microsimulation discrete-event model (MSDEM) for cost-effectiveness analysis of Human Immunodeficiency Virus treatment, simulating individual paths from antiretroviral therapy (ART) initiation to death. Four driving forces determine the course of events: CD4+ cell count, viral load, resistance and adherence. A novel feature of the model with respect to previous MSDEMs is that the distributions of time to event depend on individuals' characteristics and past history. Time to event was modeled using parametric survival analysis. Events modeled include viral suppression, regimen switch due to virological failure, regimen switch due to other reasons, resistance development, hospitalization, AIDS events, and death. Disease progression is structured according to therapy lines and the model is parameterized with Portuguese observational cohort data. An application of the model is presented comparing the cost-effectiveness of ART initiation with two nucleoside analogue reverse transcriptase inhibitors (NRTI) plus one non-nucleoside reverse transcriptase inhibitor (NNRTI) against two NRTI plus a boosted protease inhibitor (PI/r) in HIV-1 infected individuals. We find 2NRTI+NNRTI to be a dominant strategy. Results predicted by the model reproduce those of the data used for parameterization and are in line with those published in the literature.
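
As an illustration of the kind of estimation used in the second paper, the sketch below fits a Cox proportional hazard model with the lifelines package; the data frame, column names and values are invented for illustration and are not the thesis's panel data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# toy product-survival data: one row per drug package, with follow-up
# time (months), an exit indicator, and the three kinds of indexes the
# paper uses as covariates (all values here are made up)
df = pd.DataFrame({
    "duration":      [24, 36, 12, 48, 30, 18, 40, 8, 28, 20],
    "exited":        [1, 0, 1, 0, 1, 1, 0, 1, 0, 1],
    "quality":       [0.2, 0.8, 0.1, 0.9, 0.5, 0.3, 0.7, 0.2, 0.6, 0.4],
    "price_control": [0.7, 0.1, 0.9, 0.2, 0.6, 0.8, 0.3, 0.9, 0.4, 0.5],
    "competition":   [0.5, 0.4, 0.6, 0.3, 0.5, 0.7, 0.2, 0.8, 0.3, 0.6],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="exited")
# a positive coefficient on price_control means a higher exit hazard
cph.print_summary()
```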

Relevance:

10.00%

Publisher:

Abstract:

The paper proves the uniqueness of the solution of the inverse problem for circular polygons in two cases: first, for a constant density, and second, for a positive density that does not vary with direction.

Relevance:

10.00%

Publisher:

Abstract:

In [4], Severini and Mansour introduced square polygons as graphical representations of square permutations, that is, permutations in which every entry is a record (left or right, minimum or maximum), and they obtained a nice formula for their number. In this paper we give a recursive construction for this class of permutations, which allows us to simplify the derivation of their formula and to enumerate the subclass of square permutations with a simple record polygon. We also show that the generating function of these permutations with respect to the number of records of each type is algebraic, answering a question of Wilf in a particular case.
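
The defining property, that every entry is a record, is easy to test directly; the sketch below is an illustrative check (note that the first and last entries of any permutation are records trivially).

```python
def is_square(perm):
    """Return True if every entry of perm is a record: a left-to-right
    or right-to-left minimum or maximum (the defining property of
    square permutations). perm is a sequence of distinct numbers."""
    for i, v in enumerate(perm):
        before, after = perm[:i], perm[i+1:]
        ltr_max = all(v > x for x in before)
        ltr_min = all(v < x for x in before)
        rtl_max = all(v > x for x in after)
        rtl_min = all(v < x for x in after)
        if not (ltr_max or ltr_min or rtl_max or rtl_min):
            return False
    return True

print(is_square([3, 1, 2, 4]))      # True: every entry is a record
print(is_square([2, 4, 3, 1, 5]))   # False: 3 is no kind of record
```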

Relevance:

10.00%

Publisher:

Abstract:

Report on the scientific sojourn at the German Aerospace Center (DLR), Germany, during June and July 2006. The main objective of the two-month stay was to apply the techniques of LEO (Low Earth Orbiter) satellite GPS navigation which DLR currently uses in real-time navigation. These techniques comprise the use of a dynamical model which takes into account the precise earth gravity field, together with models accounting for the effects that perturb a LEO's motion: drag forces, due to the earth's atmosphere; solar pressure, due to the solar radiation impinging on the spacecraft; luni-solar gravity, due to the perturbation of the gravity field by the attraction of the sun and moon; and tidal forces, due to ocean and solid tides. Highly parameterized software was produced in the first part of the work and used to assess what accuracy can be reached when exploring different models and complexities. The objective was to study accuracy versus complexity, taking into account that LEOs at different heights behave differently. In this frame, several LEOs were selected over a wide range of altitudes, and several approaches of differing complexity were chosen. Complexity is a very important issue, because processors on board spacecraft have very limited computing and memory resources, so it is mandatory to keep the algorithms simple enough for the satellite to run them by itself.
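
As a flavour of that accuracy-versus-complexity trade-off, the simplest possible dynamical model is pure two-body Keplerian motion; the sketch below propagates it with a fixed-step RK4 integrator. It is illustrative only: the models discussed above add gravity-field harmonics, drag, solar pressure, luni-solar and tidal terms on top of this baseline.

```python
import numpy as np

MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def deriv(state):
    """Two-body (point-mass Earth) dynamics; state = [x, y, z, vx, vy, vz]."""
    r = state[:3]
    a = -MU * r / np.linalg.norm(r) ** 3
    return np.concatenate([state[3:], a])

def rk4_step(state, dt):
    """One fixed-step Runge-Kutta 4 integration step."""
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * dt * k1)
    k3 = deriv(state + 0.5 * dt * k2)
    k4 = deriv(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# circular LEO at roughly 500 km altitude
r0 = 6878e3
state = np.array([r0, 0, 0, 0, np.sqrt(MU / r0), 0])
for _ in range(90):          # propagate 90 minutes in 60 s steps
    state = rk4_step(state, 60.0)
print(state[:3] / 1e3)       # position after ~one orbit, in km
```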

Relevance:

10.00%

Publisher:

Abstract:

Block factor methods offer an attractive approach to forecasting with many predictors. They extract the information in these predictors into factors reflecting different blocks of variables (e.g. a price block, a housing block, a financial block, etc.). However, a forecasting model which simply includes all blocks as predictors risks being over-parameterized. Thus, it is desirable to use a methodology which allows different parsimonious forecasting models to hold at different points in time. In this paper, we use dynamic model averaging and dynamic model selection to achieve this goal. These methods automatically alter the weights attached to different forecasting models as evidence comes in about which has forecast well in the recent past. In an empirical study involving forecasting output growth and inflation using 139 UK monthly time series variables, we find that the set of predictors changes substantially over time. Furthermore, our results show that dynamic model averaging and model selection can greatly improve forecast performance relative to traditional forecasting methods.
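
A minimal sketch of the forgetting-factor weight update that drives dynamic model averaging (in the style of Raftery et al.) is given below; the function and the toy log predictive likelihoods are illustrative, not the authors' code.

```python
import numpy as np

def dma_weights(log_pred_likes, alpha=0.99):
    """Dynamic model averaging weights via a forgetting factor.

    log_pred_likes: T x K array; entry (t, k) is model k's log
    predictive likelihood of the observation at time t.
    alpha: forgetting factor in (0, 1]; alpha = 1 recovers standard
    Bayesian model averaging.
    Returns a T x K array of model weights, one row per period.
    """
    T, K = log_pred_likes.shape
    w = np.full(K, 1.0 / K)                  # flat initial weights
    out = np.empty((T, K))
    for t in range(T):
        w = w ** alpha                       # forgetting: flatten past evidence
        w /= w.sum()
        w = w * np.exp(log_pred_likes[t])    # Bayes update with new evidence
        w /= w.sum()
        out[t] = w
    return out

# two models; model 1 forecasts better in the second half of the sample
ll = np.vstack([np.column_stack([np.full(50, -1.0), np.full(50, -1.5)]),
                np.column_stack([np.full(50, -1.5), np.full(50, -1.0)])])
print(dma_weights(ll)[[0, 49, 99]])          # weights migrate toward model 1
```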

Relevance:

10.00%

Publisher:

Abstract:

There are two ways of creating incentives for interacting agents to behave in a desired way. One is by providing appropriate payoff incentives, which is the subject of mechanism design. The other is by choosing the information that agents observe, which we refer to as information design. We consider a model of symmetric information where a designer chooses and announces the information structure about a payoff-relevant state. The interacting agents observe the signal realizations and take actions which affect the welfare of both the designer and the agents. We characterize the general finite approach to deriving the optimal information structure for the designer, namely the one that maximizes the designer's ex ante expected utility subject to the agents playing a Bayes Nash equilibrium. We then apply the general approach to a symmetric two-state, two-agent, two-action environment with a parameterized underlying game and fully characterize the optimal information structure: it is never strictly optimal for the designer to use conditionally independent private signals; the optimal information structure may be a public signal or may consist of correlated private signals. Finally, we examine how changes in the underlying game affect the designer's maximum payoff. This exercise provides a joint mechanism/information design perspective.
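
In the finite case, the designer's problem can be posed as a linear program over joint distributions of the state and recommended action profiles, subject to obedience constraints. The sketch below illustrates this for a two-state, two-agent, two-action setting; the payoffs are randomly generated for illustration and do not reproduce the paper's parameterized game.

```python
import numpy as np
from scipy.optimize import linprog

# Choose x[theta, a1, a2] = mu(theta) * pi(a1, a2 | theta), maximizing
# the designer's expected payoff subject to obedience constraints
# (agents willingly follow recommended actions).
states, actions = 2, 2
mu = np.array([0.5, 0.5])                      # prior over states
rng = np.random.default_rng(0)
u = rng.random((2, states, actions, actions))  # u[i, theta, a1, a2]: agent i
v = rng.random((states, actions, actions))     # designer's payoff

idx = lambda t, a1, a2: t * 4 + a1 * 2 + a2    # flatten (theta, a1, a2), 2x2x2 case
n = states * actions * actions

# obedience: an agent told to play a must not gain by deviating to b
A_ub, b_ub = [], []
for i in range(2):
    for a in range(actions):
        for b in range(actions):
            if a == b:
                continue
            row = np.zeros(n)
            for t in range(states):
                for o in range(actions):       # other agent's recommendation
                    a1, a2 = (a, o) if i == 0 else (o, a)
                    d1, d2 = (b, o) if i == 0 else (o, b)
                    row[idx(t, a1, a2)] = u[i, t, d1, d2] - u[i, t, a1, a2]
            A_ub.append(row); b_ub.append(0.0)

# consistency with the prior: mass in each state equals mu(theta)
A_eq = np.zeros((states, n)); b_eq = mu.copy()
for t in range(states):
    for a1 in range(actions):
        for a2 in range(actions):
            A_eq[t, idx(t, a1, a2)] = 1.0

c = -np.array([v[t, a1, a2] for t in range(states)
               for a1 in range(actions) for a2 in range(actions)])
res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n)
print("designer's optimal expected payoff:", -res.fun)
```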

Relevance:

10.00%

Publisher:

Abstract:

Time-inconsistency is an essential feature of many policy problems (Kydland and Prescott, 1977). This paper presents and compares three methods for computing Markov-perfect optimal policies in stochastic nonlinear business cycle models: value function iteration, generalized Euler equations, and parameterized shadow prices. In the context of a business cycle model in which a fiscal authority chooses government spending and income taxation optimally while lacking the ability to commit, we show that the solutions obtained using value function iteration and generalized Euler equations are somewhat more accurate than those obtained using parameterized shadow prices. Among these three methods, we show that value function iteration can be applied easily, even to environments that include a risk-sensitive fiscal authority and/or inequality constraints on government spending. We show that a risk-sensitive fiscal authority lowers government spending and income taxation, reducing the disincentive households face to accumulate wealth.
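
A minimal sketch of value function iteration on a toy tax-setting problem is given below; the payoffs, grids and shock process are invented for illustration and omit the paper's Markov-perfect strategic considerations.

```python
import numpy as np

# An authority facing a two-state income shock chooses a tax rate tau
# each period; period utility is log((1 - tau) * y) + theta * log(tau * y),
# i.e. private consumption plus utility from government spending g = tau * y.
beta, theta = 0.95, 0.3
y = np.array([0.9, 1.1])                       # income in each shock state
P = np.array([[0.8, 0.2], [0.2, 0.8]])         # shock transition matrix
tau = np.linspace(0.01, 0.99, 199)             # tax-rate grid

# period utility for every (state, tax) pair
U = (np.log((1 - tau)[None, :] * y[:, None])
     + theta * np.log(tau[None, :] * y[:, None]))

V = np.zeros(2)
for _ in range(1000):                          # iterate V = max_tau [U + beta * P V]
    Q = U + beta * (P @ V)[:, None]
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-10:
        break
    V = V_new

print("optimal tax rate by state:", tau[Q.argmax(axis=1)])
```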

Relevance:

10.00%

Publisher:

Abstract:

Bacteria often possess multiple siderophore-based iron uptake systems for scavenging this vital resource from their environment. However, some siderophores seem redundant, because they have limited iron-binding efficiency and are seldom expressed under iron limitation. Here, we investigate the conundrum of why selection does not eliminate this apparent redundancy. We focus on Pseudomonas aeruginosa, a bacterium that can produce two siderophores: the highly efficient but metabolically expensive pyoverdine, and the inefficient but metabolically cheap pyochelin. We found that the bacteria possess molecular mechanisms to phenotypically switch from mainly producing pyoverdine under severe iron limitation to mainly producing pyochelin when iron is only moderately limited. We further show that strains exclusively producing pyochelin grew significantly better than strains exclusively producing pyoverdine under moderate iron limitation, whereas the inverse was seen under severe iron limitation. This suggests that pyochelin is not redundant, but that switching between siderophore strategies might be beneficial to trade off efficiencies versus costs of siderophores. Indeed, simulations parameterized from our data confirmed that strains retaining the capacity to switch between siderophores significantly outcompeted strains defective for one or the other siderophore under fluctuating iron availabilities. Finally, we discuss how siderophore switching can be viewed as a form of collective decision-making, whereby a coordinated shift in behaviour at the group level emerges as a result of positive and negative feedback loops operating among individuals at the local scale.

Relevance:

10.00%

Publisher:

Abstract:

Vegetation maps are often used as proxies for a habitat stratification in order to generate continuous geographic distributions of organisms from discrete data by means of multivariate models. However, vegetation maps are usually ill-suited for direct application to this end, since their categories were not conceived to correspond to habitat types. In this article we present and apply the Double-Criterion Clustering (Agrupamiento por Doble Criterio) method to generalize an extraordinarily detailed vegetation map (350 classes) of the Parque Natural del Montseny (Catalonia) into categories that remain coherent both structurally (through a spectral dissimilarity matrix computed from a SPOT-5 satellite image) and in terms of vegetation (through a dissimilarity matrix computed from vegetation properties derived from the map's hierarchical legend). The method simplifies 67% of the study area from 114 to 18 classes. Adding other, more trivial aggregations based exclusively on land-cover criteria, 73% of the study area goes from 167 to 25 categories. As an added value, the method flags 10% of the original polygons as anomalous (by comparing each polygon's spectral properties against the rest of its class), indicating either land-cover change between the date of the source material used to produce the original map and that of the satellite image, or errors in the map's production.

Relevance:

10.00%

Publisher:

Abstract:

The goal of this project is to build a system for tracking a fleet of GPS-equipped vehicles in real time. Through a capture module, the server collects and stores the vehicles' geographic information; a processing module then displays and manages the vehicles, the points of interest and the polygons of the geofencing system. I first give an introduction to the state of the art of vehicle tracking systems. I then analyse the requirements, specify the desired behaviour of the system, and explain the design and the implementation. Finally, I run a series of tests and draw conclusions from them.
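
The core primitive of such a geofencing module is a point-in-polygon test; a standard ray-casting version is sketched below (illustrative only, not the project's code).

```python
def inside(point, polygon):
    """Ray-casting point-in-polygon test, the usual primitive behind a
    geofencing check. point is (x, y); polygon is a list of (x, y)
    vertices in order (the polygon is closed implicitly)."""
    x, y = point
    n = len(polygon)
    hit = False
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # does the horizontal ray from the point cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                hit = not hit
    return hit

fence = [(0, 0), (10, 0), (10, 10), (0, 10)]   # a square geofence
print(inside((5, 5), fence))    # True: vehicle inside the zone
print(inside((15, 5), fence))   # False: vehicle has left the zone
```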