964 results for multi-purpose optimisation
Abstract:
In this article, a real-world case study is presented with two general objectives: to give a clear and simple illustrative example of the application of social multi-criteria evaluation (SMCE) in the field of rural renewable energy policies, and to help in understanding to what extent and under which circumstances solar energy is suitable for electrifying isolated farmhouses. In this sense, this study might offer public decision-makers some insight into the conditions that favour the diffusion of renewable energy, to help them design more effective energy policies for rural communities.
Abstract:
The main argument developed here is the proposal of the concept of "Social Multi-Criteria Evaluation" (SMCE) as a possible useful framework for the application of social choice to the difficult policy problems of our Millennium, where, as stated by Funtowicz and Ravetz, "facts are uncertain, values in dispute, stakes high and decisions urgent". This paper starts from the following main questions: 1. Why "Social" Multi-criteria Evaluation? 2. How should such an approach be developed? The foundations of SMCE are set up by referring to concepts coming from complex system theory and philosophy, such as reflexive complexity, post-normal science and incommensurability. To give some operational guidelines for the application of SMCE, the basic questions to be answered are: 1. How is it possible to deal with technical incommensurability? 2. How can we deal with the issue of social incommensurability? Answering these questions, using theoretical considerations and lessons learned from real-world case studies, is the main objective of the present article.
Abstract:
We extend the basic tax evasion model to a multi-period economy exhibiting sustained growth. When individuals conceal part of their true income from the tax authority, they face the risk of being audited and hence of paying the corresponding fine. Both taxes and fines determine individual saving and the rate of capital accumulation. In this context we show that the sign of the relation between the level of the tax rate and the amount of evaded income is the same as that obtained in static setups. Moreover, high tax rates on income are typically associated with low growth rates as occurs in standard growth models that disregard the tax evasion phenomenon.
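The static benchmark the abstract refers to can be made concrete with a stylized one-period calculation; the notation below (audit probability p, tax rate τ, fine surcharge s on the evaded tax, concealed income e) is assumed here for illustration and is not taken from the paper:

```latex
% Stylized one-period evasion trade-off (notation assumed for illustration):
% an individual with true income y reports y - e, is audited with
% probability p, and then pays a fine at surcharge rate s on the evaded tax.
\[
  \mathbb{E}[y_d]
  \;=\; (1-p)\,\bigl[\,y - \tau\,(y-e)\,\bigr]
  \;+\; p\,\bigl[\,y - \tau\,(y-e) - s\,\tau\,e\,\bigr]
  \;=\; y - \tau\,(y-e) - p\,s\,\tau\,e .
\]
```

In the multi-period economy of the abstract, this expected disposable income is then split between consumption and saving, which is the channel through which taxes and fines affect capital accumulation and growth.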
Abstract:
The goal of this paper is to study the role of multi-product firms in the market provision of product variety. The analysis is conducted using the spokes model of non-localized competition proposed by Chen and Riordan (2007). First, we show that multi-product firms are at a competitive disadvantage vis-à-vis single-product firms and can only emerge if economies of scope are sufficiently strong. Second, under duopoly product variety may be higher or lower relative to both the first best and the monopolistically competitive equilibrium. However, within a relevant range of parameter values duopolists drastically restrict their product range in order to relax price competition, and as a result product variety is far below the efficient level.
Abstract:
The objective of this work was to develop an easily applicable technique and a standardized protocol for high-quality post-mortem angiography. This protocol should (1) improve radiological interpretation by decreasing perfusion-related artifacts and by achieving complete filling of the vascular system, and (2) ease and standardize the execution of the examination. To this aim, 45 human corpses were investigated by post-mortem computed tomography (CT) angiography using different perfusion protocols, a modified heart-lung machine and a new contrast agent mixture, specifically developed for post-mortem investigations. The quality of the CT angiographies was evaluated radiologically by observing the filling of the vascular system, assessing the interpretability of the resulting images, and comparing radiological diagnoses to conventional autopsy conclusions. Post-mortem angiography yielded satisfactory results provided that the volumes of the injected contrast agent mixture were high enough to completely fill the vascular system. In order to avoid artifacts due to the post-mortem perfusion, a minimum of three angiographic phases and one native scan had to be performed. These findings were taken into account to develop a protocol for quality post-mortem CT angiography that minimizes the risk of radiological misinterpretation. The proposed protocol is easily applicable in a standardized way and yields high-quality, radiologically interpretable visualization of the vascular system in post-mortem investigations.
Abstract:
Polarization indices presented up to now have focused their attention only on the distribution of income/wealth. However, in many circumstances income is not the only relevant dimension that might be the cause of social conflict, so it is very important to have a social polarization index able to cope with alternative dimensions. In this paper we present an axiomatic characterization of one such index: it has been obtained as an extension of the (income) polarization measure introduced in Duclos, Esteban and Ray (2004) to a wider domain. It turns out that the axiomatic structure introduced in that paper alone is not sufficient to obtain a fully satisfactory characterization of our measure, so additional axioms are proposed. As a byproduct, we present an alternative axiomatization of the aforementioned income polarization measure.
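For the income-only special case, the Duclos–Esteban–Ray measure has a simple discrete form that can be sketched directly. The code below is an illustrative toy implementation of that discrete form; the function name and the default sensitivity α = 0.5 are our choices for illustration, not the paper's:

```python
# Illustrative discrete form of the Duclos-Esteban-Ray (2004) income
# polarization measure: P_alpha = sum_i sum_j f_i^(1+alpha) f_j |y_i - y_j|,
# where y are income levels and f their population shares.
def der_polarization(y, f, alpha=0.5):
    assert abs(sum(f) - 1.0) < 1e-9  # shares must sum to one
    return sum(fi ** (1 + alpha) * fj * abs(yi - yj)
               for yi, fi in zip(y, f)
               for yj, fj in zip(y, f))

# A two-spike (bimodal) distribution is more polarized than the same
# income range spread over three equal groups.
bimodal = der_polarization([0, 100], [0.5, 0.5])
uniform = der_polarization([0, 50, 100], [1 / 3, 1 / 3, 1 / 3])
```

The exponent 1 + α on the own-group share is what separates polarization from inequality: it rewards identification with a large, homogeneous group, so merging mass into two opposed poles raises the index even when a standard inequality measure would not move.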
Abstract:
Thermal systems exchanging heat and mass by conduction, convection and radiation (solar and thermal) occur in many engineering applications, such as energy storage by solar collectors, window glazing in buildings, refrigeration of plastic moulds, air handling units, etc. Often these thermal systems are composed of various elements, for example a building with walls, windows, rooms, etc. It would be of particular interest to have a modular thermal system model, formed by connecting different modules for the elements, with the flexibility to use and change models for individual elements and to add or remove elements without changing the entire code. A numerical approach to handling the heat transfer and fluid flow in such systems helps save the time and cost of full-scale experiments and also aids optimisation of the system's parameters. The subsequent sections present a short summary of the work done so far on the orientation of the thesis in the field of numerical methods for heat transfer and fluid flow applications, the work in progress and the future work.
Abstract:
The main purpose of this work is to give a survey of the main monotonicity properties of queueing processes based on the coupling method. The literature on this topic is quite extensive, and we do not consider all of its aspects. Our more concrete goal is to select the most interesting basic monotonicity results and give simple and elegant proofs. We also give a few new (or revised) proofs of important monotonicity properties for the queue-size and workload processes in both single-server and multi-server systems. The paper is organized as follows. In Section 1, the basic notions and results on the coupling method are given. Section 2 contains known coupling results for renewal processes, with a focus on the construction of synchronized renewal instants for a superposition of independent renewal processes. In Section 3, we present basic monotonicity results for the queue-size and workload processes. We consider both discrete- and continuous-time queueing systems with single and multiple servers. Less known results on the monotonicity of queueing processes with dependent service times and interarrival times are also presented. Section 4 is devoted to the monotonicity of general Jackson-type queueing networks with Markovian routing. This section is based on the notable paper [17]. Finally, Section 5 contains elements of the stability analysis of regenerative queues and networks, where coupling and monotonicity results play a crucial role in establishing minimal sufficient stability conditions. In addition, we present some new monotonicity results for tandem networks.
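The flavor of the single-server monotonicity results can be illustrated with the Lindley recursion W_{n+1} = max(0, W_n + S_n − T_n): coupling two queues on the same arrival stream, one with pathwise larger service times, forces the waiting times of the slower queue to dominate pathwise. A minimal simulation sketch, with the setup assumed here rather than taken from the survey:

```python
import random

# Lindley recursion for the waiting time of the n-th customer in a
# single-server FIFO queue: W_{n+1} = max(0, W_n + S_n - T_n),
# where S_n is the service time and T_n the next interarrival time.
def waiting_times(services, interarrivals):
    w, out = 0.0, []
    for s, t in zip(services, interarrivals):
        out.append(w)
        w = max(0.0, w + s - t)
    return out

rng = random.Random(7)
ta = [rng.expovariate(1.0) for _ in range(1000)]  # shared arrival stream
s1 = [rng.expovariate(2.0) for _ in range(1000)]  # baseline service times
s2 = [1.5 * s for s in s1]                        # coupled, slower server

w1 = waiting_times(s1, ta)
w2 = waiting_times(s2, ta)
assert all(a <= b for a, b in zip(w1, w2))  # pathwise domination
```

The assertion holds by induction on the recursion, not by chance: with the same T_n and S'_n ≥ S_n for every n, each max(0, ·) step preserves the ordering of the coupled workloads, which is exactly the kind of argument the coupling method turns into a stochastic-ordering statement.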
Abstract:
Some retinal degenerations are caused by genetic mutations and lead to the loss of the photosensitive cells, the photoreceptors (cones and/or rods), and thus to blindness (Roy et al., 2010). The prevalence is 1/3000 among Caucasians. Retinitis Pigmentosa (RP) accounts for the majority of cases, followed by Leber congenital amaurosis and Stargardt disease. There is no single mutation associated with a given disease; rather, various mutations can lead to degeneration of the retina. Like the rest of the central nervous system, the injured retina cannot regenerate. One aim of treatment is to slow the degeneration of the retina in order to stabilize it. Gene therapy is currently the only therapeutic approach capable of treating retinal degenerations of genetic origin. It consists of using a modified virus that can no longer replicate, called a vector, to target certain cells in order to add a healthy gene or inhibit a diseased one. Adeno-associated viruses (AAV) and lentiviruses (LV) are the two main types of virus used in gene therapy in ophthalmology. Other vectors exist, such as adenoviruses and the equine infectious anemia virus. Human gene-therapy studies using the AAV vector have demonstrated a measurable improvement in visual function (visual acuity, visual field, pupillometry, and navigation in a dimly lit environment) in patients with Leber congenital amaurosis (Maguire et al., Ali et al., Hauswirth et al., Bennett et al.). The vector used in this work is an LV, which has the advantage of being able to carry large genes.
When this vector is pseudotyped with a VSVG envelope, it efficiently transduces (transfers a gene that will be functional in the target cell) the retinal pigment epithelium (which is necessary for the survival and function of the photoreceptors). To change the tropism of the vector, the one tested in this study carries a Mokola-type envelope, which efficiently targets glial cells of the brain and therefore probably also the Müller cells of the retina. The short-term goal is to genetically modify these cells so that they secrete molecules promoting photoreceptor survival. To reveal which cells the vector targets, the gene expressed in the transduced cells codes for green fluorescent protein 2 (GFPII) and has no therapeutic function. After the virus was produced, two types of mice were injected: mice lacking the rhodopsin gene, called Rho -/-, and wild-type mice, called C57BL6. The Rho -/- mice were chosen as a model of retinal degeneration and the C57BL6 mice as a comparison. The Rho -/- and C57BL6 mice were injected between the 2nd and 3rd month of life and sacrificed 7 days later. Histological sections of the retina were used to measure and compare, for each eye, the transduction distances of the RPE and of the neuroretina (= the whole retina except the RPE). The distance over which the RPE is transduced determines the size of the injection bleb, whereas the distance over which the neuroretina is transduced reflects the vector's ability to diffuse through the retina. The results show higher expression of GFPII in the RPE than in the neuroretina in both Rho -/- and C57BL6 mice. The main transduced cells in the neuroretina are, as expected, the Müller cells.
When comparing the proportions of transduced neuroretina and RPE, transduction was overall better in the Rho -/- mice than in the C57BL6 mice. This means that the vector is more efficient at transducing a degenerated retina than a healthy one. To determine which cell types expressed GFPII, antibodies specific to certain cell types were used. These results are similar to those of previous studies, including that of Calame et al. in 2011, and suggest that the lentiviral vector with the Mokola envelope and the EFs promoter is well suited to delivering a therapeutic gene to Müller cells in degenerating retinas.
Abstract:
Purpose: Emergency room reading performance has been a point of interest in recent studies comparing radiologists to other physician groups. Our objective was to evaluate and compare the reading performances of radiologists and surgeons on non-traumatic abdominal CTs in an emergency room setting. Methods and materials: A total of ten readers from four groups participated in this study: three senior radiologists, three senior visceral surgeons, two junior radiologists and two junior surgeons. Each observer evaluated, in a blinded fashion, a total of 150 multi-slice acute abdominal CTs. The CTs were chosen to represent the established proportions of acute abdomen pathologies in a Level I trauma centre from 2003 to 2005. Each answer was interpreted as right or wrong regarding pathology location, diagnosis and need for operation. The gold standard was the intraoperative result, or the clinical patient follow-up for non-operated patients. Significance was assumed at a p < .05 level. Results: Senior radiologists had a mean score of 2.38 ± 1.14 and junior radiologists a score of 2.34 ± 1.14, whereas senior surgeons scored 2.07 ± 1.30 and junior surgeons 1.62 ± 1.42. No significant difference was found between the two radiologist groups, but results were significantly better for senior surgeons than for junior surgeons, and better for the two radiologist groups than for each of the surgeon groups (all p < .05). Conclusion: Abdominal CT reading in an acute abdomen setting should continue to rely on evaluation by a radiologist, whether senior or junior. Satisfactory reading results can be achieved by senior visceral surgeons, but junior surgeons need more experience for a good reading performance.
Abstract:
We consider a principal who deals with a privately informed agent protected by limited liability in a correlated information setting. The agent's technology is such that the fixed cost declines with the marginal cost (the type), so that countervailing incentives may arise. We show that, with high liability, the first-best outcome can be effected for any type if (1) the fixed cost is non-concave in type, under the contract that yields the smallest feasible loss to the agent; (2) the fixed cost is not very concave in type, under the contract that yields the maximum sustainable loss to the agent. We further show that, with low liability, the first-best outcome is still implemented for a non-degenerate range of types if the fixed cost is less concave in type than some given threshold, which tightens as the liability reduces. The optimal contract entails pooling otherwise.
Abstract:
This work presents a system for detecting and classifying binary objects according to their shape. In the first step of the procedure, a filter is applied to extract the object's contour. From the shape points, a BSM descriptor with highly descriptive, universal and invariant features is obtained. In the second phase of the system, the descriptor information is learned and classified using Adaboost and Error-Correcting Output Codes. Public databases, both grayscale and color, were used to validate the implementation of the designed system. In addition, the system provides an interactive interface in which different image-processing methods can be applied.
Abstract:
In recent years, multi-atlas fusion methods have gained significant attention in medical image segmentation. In this paper, we propose a general Markov Random Field (MRF) based framework that can perform edge-preserving smoothing of the labels at the time of fusing the labels itself. More specifically, we formulate the label fusion problem with MRF-based neighborhood priors, as an energy minimization problem containing a unary data term and a pairwise smoothness term. We present how the existing fusion methods like majority voting, global weighted voting and local weighted voting methods can be reframed to profit from the proposed framework, for generating more accurate segmentations as well as more contiguous segmentations by getting rid of holes and islands. The proposed framework is evaluated for segmenting lymph nodes in 3D head and neck CT images. A comparison of various fusion algorithms is also presented.
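A toy version of the idea can be sketched on a 1D "image": the unary term counts atlas votes against a label, the pairwise Potts term penalizes neighbor disagreement, and a greedy ICM pass minimizes the energy. This is an illustrative sketch under assumed simplifications (1D neighborhood, unweighted votes, ICM instead of an exact solver), not the paper's implementation:

```python
# Toy MRF-regularized label fusion on a 1D image:
#   E(x) = sum_i unary_i(x_i) + lam * sum_{i~j} [x_i != x_j]
# unary_i(l) = number of atlases NOT voting for label l at voxel i;
# minimized greedily by ICM (iterated conditional modes).
def fuse_labels(atlas_labels, lam=1.0, iters=10):
    n = len(atlas_labels[0])
    labels_set = sorted({l for a in atlas_labels for l in a})
    unary = [{l: sum(a[i] != l for a in atlas_labels) for l in labels_set}
             for i in range(n)]
    # initialize with plain majority voting (minimal unary energy)
    x = [min(labels_set, key=lambda l, i=i: unary[i][l]) for i in range(n)]
    for _ in range(iters):
        changed = False
        for i in range(n):
            def energy(l, i=i):
                pair = sum(l != x[j] for j in (i - 1, i + 1) if 0 <= j < n)
                return unary[i][l] + lam * pair
            best = min(labels_set, key=energy)
            if best != x[i]:
                x[i], changed = best, True
        if not changed:
            break
    return x

# Three noisy atlas segmentations of the same 1D scan; two of them
# put a spurious one-voxel island at position 3.
atlases = [[0, 0, 0, 1, 0, 1, 1, 1],
           [0, 0, 0, 1, 0, 1, 1, 1],
           [0, 0, 0, 0, 0, 1, 1, 1]]
fused = fuse_labels(atlases, lam=1.0)
```

With lam = 0 the sketch reduces to plain majority voting and keeps the island at position 3; with lam = 1 the smoothness term flips that voxel, which is the "holes and islands" removal effect the framework is designed for.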
Abstract:
NORTH SEA STUDY OCCASIONAL PAPER No. 112