964 results for Optimal management
Abstract:
Even though patients who develop ischemic stroke despite taking antiplatelet drugs represent a considerable proportion of stroke hospital admissions, there is a paucity of data from investigational studies regarding the most suitable therapeutic intervention. There have been no clinical trials to test whether increasing the dose or switching antiplatelet agents reduces the risk for subsequent events. Certain issues have to be considered in patients managed for a first or recurrent stroke while receiving antiplatelet agents. Therapeutic failure may be due to poor adherence to treatment, associated co-morbid conditions, or diminished antiplatelet effects (resistance to treatment). A diagnostic work-up is warranted to identify the etiology and underlying mechanism of stroke, thereby guiding further management. Risk factors (including hypertension, dyslipidemia and diabetes) should be treated according to current guidelines. Aspirin or aspirin plus clopidogrel may be used in the acute and early phase of ischemic stroke, whereas in the long term antiplatelet treatment should be continued with aspirin, aspirin/extended-release dipyridamole or clopidogrel monotherapy, taking into account tolerance, safety, adherence and cost issues. Secondary measures should also be implemented to educate patients about stroke and the importance of adherence to medication, and to promote behavioral modification relating to tobacco use, physical activity, alcohol consumption and diet to control excess weight.
Abstract:
OBJECTIVES: Resuscitation in severe head injury may be detrimental when given with hypotonic fluids. We evaluated the effects of lactated Ringer's solution (sodium 131 mmol/L, 277 mOsm/L) compared with hypertonic saline (sodium 268 mmol/L, 598 mOsm/L) in severely head-injured children over the first 3 days after injury. DESIGN: An open, randomized, and prospective study. SETTING: A 16-bed pediatric intensive care unit (ICU) (level III) at a university children's hospital. PATIENTS: A total of 35 consecutive children with head injury. INTERVENTIONS: Thirty-two children with Glasgow Coma Scores of <8 were randomly assigned to receive either lactated Ringer's solution (group 1) or hypertonic saline (group 2). Routine care was standardized, and included the following: head positioning at 30 degrees; normothermia (96.8 degrees to 98.6 degrees F [36 degrees to 37 degrees C]); analgesia and sedation with morphine (10 to 30 microg/kg/hr), midazolam (0.2 to 0.3 mg/kg/hr), and phenobarbital; volume-controlled ventilation (PaCO2 of 26.3 to 30 torr [3.5 to 4 kPa]); and optimal oxygenation (PaO2 of 90 to 105 torr [12 to 14 kPa], oxygen saturation of >92%, and hematocrit of >0.30). MEASUREMENTS AND MAIN RESULTS: Mean arterial pressure and intracranial pressure (ICP) were monitored continuously and documented hourly and at every intervention. The means of every 4-hr period were calculated and serum sodium concentrations were measured at the same time. An ICP of >15 mm Hg was treated with a predefined sequence of interventions, and complications were documented. There was no difference with respect to age, male/female ratio, or initial Glasgow Coma Score. In both groups, there was an inverse correlation between serum sodium concentration and ICP (group 1: r = -.13, r2 = .02, p < .03; group 2: r = -.29, r2 = .08, p < .001) that disappeared in group 1 and increased in group 2 (group 1: r = -.08, r2 = .01, NS; group 2: r = -.35, r2 = .12, p < .001). Correlation between serum sodium concentration and cerebral perfusion pressure (CPP) became significant in group 2 after 8 hrs of treatment (r = .2, r2 = .04, p = .002). Over time, ICP and CPP did not significantly differ between the groups. However, to keep ICP at <15 mm Hg, group 2 patients required significantly fewer interventions (p < .02). Group 1 patients received less sodium (8.0 +/- 4.5 vs. 11.5 +/- 5.0 mmol/kg/day, p = .05) and more fluid on day 1 (2850 +/- 1480 vs. 2180 +/- 770 mL/m2, p = .05). They also had a higher frequency of acute respiratory distress syndrome (four vs. 0 patients, p = .1) and more than two complications (six vs. 1 patient, p = .09). Group 2 patients had significantly shorter ICU stay times (11.6 +/- 6.1 vs. 8.0 +/- 2.4 days; p = .04) and shorter mechanical ventilation times (9.5 +/- 6.0 vs. 6.9 +/- 2.2 days; p = .1). The survival rate and duration of hospital stay were similar in both groups. CONCLUSIONS: Treatment of severe head injury with hypertonic saline is superior to treatment with lactated Ringer's solution. An increase in serum sodium concentrations significantly correlates with lower ICP and higher CPP. Children treated with hypertonic saline require fewer interventions, have fewer complications, and stay a shorter time in the ICU.
Abstract:
In the present research we set forth a new, simple Trade-Off model that allows us to calculate how much debt and, consequently, how much equity a company should have, using easily available information and calculating the cost of debt dynamically on the basis of the effect that the company's capital structure has on its risk of bankruptcy. The proposed model is applied to the companies that made up the Dow Jones Industrial Average (DJIA) in 2007, using consolidated financial data from 1996 to 2006 published by Bloomberg. We used the simplex optimization method to find the debt level that maximizes firm value and then compared the estimated debt with the companies' actual debt using the nonparametric Mann-Whitney test. The results indicate that 63% of the companies do not show a statistically significant difference between actual and estimated debt.
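As a rough illustration of the kind of computation involved, the sketch below maximises a stylised trade-off valuation of the firm (tax shield versus expected bankruptcy cost) with a simplex-type scalar optimiser and then compares estimated and observed debt with the Mann-Whitney test. The functional forms, parameters and data are hypothetical and are not taken from the paper.

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import mannwhitneyu

def firm_value(debt, v_unlevered, tax_rate, assets, distress_cost):
    # Levered value = unlevered value + tax shield - expected distress cost.
    # The quadratic default probability is an assumption made for illustration.
    tax_shield = tax_rate * debt
    p_default = min(1.0, (debt / assets) ** 2)
    return v_unlevered + tax_shield - p_default * distress_cost

def optimal_debt(v_unlevered, tax_rate, assets, distress_cost):
    # A bounded scalar search stands in for the paper's simplex optimization step.
    res = minimize_scalar(
        lambda d: -firm_value(d, v_unlevered, tax_rate, assets, distress_cost),
        bounds=(0.0, assets), method="bounded")
    return res.x

rng = np.random.default_rng(0)
firms = zip(rng.normal(150, 20, 10), rng.normal(80, 10, 10))   # (assets, distress cost)
estimated = np.array([optimal_debt(100.0, 0.35, a, c) for a, c in firms])
observed = rng.normal(45, 10, size=10)                          # hypothetical actual debt
stat, p_value = mannwhitneyu(estimated, observed)               # nonparametric comparison
print(f"Mann-Whitney U = {stat:.1f}, p = {p_value:.3f}")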
Abstract:
There are conflicting data on the prevalence of coronary events and the quality of the management of modifiable cardiovascular risk factors (CVRF) in HIV-infected patients. Methods. We performed a retrospective descriptive study to determine the prevalence of coronary events and to evaluate the management of CVRF in a Mediterranean cohort of 3760 HIV-1-infected patients from April 1983 through June 2011. Results. We identified 81 patients with a history of a coronary event (prevalence 2.15%); 83% of them suffered an acute myocardial infarction. At the time of the coronary event, CVRF were highly prevalent (60.5% hypertension, 48% dyslipidemia, and 16% diabetes mellitus). Other CVRF, such as smoking, hypertension, lack of exercise, and body mass index, were not routinely assessed. After the coronary event, a significant decrease in total cholesterol (P = 0.025) and LDL-cholesterol (P = 0.004) was observed. However, the percentage of patients who maintained LDL-cholesterol > 100 mg/dL remained stable (from 46% to 41%, P = 0.103). Patients using protease inhibitors associated with a favorable lipid profile increased over time (P = 0.028). Conclusions. The prevalence of coronary events in our cohort is low. CVRF prevalence is high and their management is far from optimal. More aggressive interventions should be implemented to diminish cardiovascular risk in HIV-infected patients.
Abstract:
BACKGROUND AND AIMS: Data from prospective cohorts describing dyslipidaemia prevalence and treatment trends are lacking. Using data from the prospective CoLaus study, we aimed to examine changes in serum lipid levels, dyslipidaemia prevalence and management in a population-based sample of Swiss adults. METHODS AND RESULTS: Cardiovascular risk was assessed using PROCAM. Dyslipidaemia and low-density lipoprotein cholesterol (LDL-C) target levels were defined according to the Swiss Group for Lipids and Atherosclerosis. Complete baseline and follow-up (FU) data were available for n = 4863 subjects over a mean FU time of 5.6 years. Overall, 32.1% of participants were dyslipidaemic at baseline vs 46.3% at FU (p < 0.001). During this time, lipid lowering medication (LLM) rates among dyslipidaemic subjects increased from 34.0% to 39.2% (p < 0.001). In secondary prevention, LLM rates were 42.7% at baseline and 53.2% at FU (p = 0.004). In multivariate analysis, LLM use among dyslipidaemic subjects between baseline and FU was positively associated with a personal history of CVD, older age, hypertension, higher BMI and diabetes, while negatively associated with a higher educational level. Among treated subjects, LDL-C target achievement was positively associated with diabetes and negatively associated with a personal history of CVD and higher BMI. Among subjects treated at baseline, LLM discontinuation was negatively associated with older age, male sex, smoking, hypertension and parental history of CVD. CONCLUSIONS: In Switzerland, the increase over time in dyslipidaemia prevalence was not paralleled by a similar increase in LLM use. In a real-life setting, dyslipidaemia management remains far from optimal, both in primary and secondary prevention.
Abstract:
BACKGROUND: The burden of asthma on patients and healthcare systems is substantial. Interventions have been developed to overcome difficulties in asthma management. These include chronic disease management programmes, which are more than simple patient education, encompassing a set of coherent interventions that centre on the patients' needs, encouraging the co-ordination and integration of health services provided by a variety of healthcare professionals, and emphasising patient self-management as well as patient education. OBJECTIVES: To evaluate the effectiveness of chronic disease management programmes for adults with asthma. SEARCH METHODS: Cochrane Central Register of Controlled Trials (CENTRAL), Cochrane Effective Practice and Organisation of Care (EPOC) Group Specialised Register, MEDLINE (MEDLINE In-Process and Other Non-Indexed Citations), EMBASE, CINAHL, and PsycINFO were searched up to June 2014. We also handsearched selected journals from 2000 to 2012 and scanned reference lists of relevant reviews. SELECTION CRITERIA: We included individual or cluster-randomised controlled trials, non-randomised controlled trials, and controlled before-after studies comparing chronic disease management programmes with usual care in adults over 16 years of age with a diagnosis of asthma. The chronic disease management programmes had to satisfy at least the following five criteria: an organisational component targeting patients; an organisational component targeting healthcare professionals or the healthcare system, or both; patient education or self-management support, or both; active involvement of two or more healthcare professionals in patient care; a minimum duration of three months. DATA COLLECTION AND ANALYSIS: After an initial screen of the titles, two review authors working independently assessed the studies for eligibility and study quality; they also extracted the data. We contacted authors to obtain missing information and additional data, where necessary. We pooled results using the random-effects model and reported the pooled mean or standardised mean differences (SMDs). MAIN RESULTS: A total of 20 studies including 81,746 patients (median 129.5) were included in this review, with a follow-up ranging from 3 to more than 12 months. Patients' mean age was 42.5 years, 60% were female, and their asthma was mostly rated as moderate to severe. Overall the studies were of moderate to low methodological quality, because of limitations in their design and the wide confidence intervals for certain results. Compared with usual care, chronic disease management programmes resulted in improvements in asthma-specific quality of life (SMD 0.22, 95% confidence interval (CI) 0.08 to 0.37), asthma severity scores (SMD 0.18, 95% CI 0.05 to 0.30), and lung function tests (SMD 0.19, 95% CI 0.09 to 0.30). The data for improvement in self-efficacy scores were inconclusive (SMD 0.51, 95% CI -0.08 to 1.11). Results on hospitalisations and emergency department or unscheduled visits could not be combined in a meta-analysis because the data were too heterogeneous; results from the individual studies were inconclusive overall. Only a few studies reported results on asthma exacerbations, days off work or school, use of an action plan, and patient satisfaction. Meta-analyses could not be performed for these outcomes.
AUTHORS' CONCLUSIONS: There is moderate to low quality evidence that chronic disease management programmes for adults with asthma can improve asthma-specific quality of life, asthma severity, and lung function tests. Overall, these results provide encouraging evidence of the potential effectiveness of these programmes in adults with asthma when compared with usual care. However, the optimal composition of asthma chronic disease management programmes and their added value, compared with education or self-management alone that is usually offered to patients with asthma, need further investigation.
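For readers unfamiliar with the pooling step mentioned above, the sketch below computes a random-effects pooled SMD with the DerSimonian-Laird estimator, a common choice for this type of meta-analysis; the study effect sizes are hypothetical, and the review's own software and settings may differ.

import numpy as np

def pool_random_effects(smd, var):
    # Returns the pooled SMD, its 95% CI and the between-study variance tau^2
    # using the DerSimonian-Laird method.
    w_fixed = 1.0 / var
    smd_fixed = np.sum(w_fixed * smd) / np.sum(w_fixed)
    q = np.sum(w_fixed * (smd - smd_fixed) ** 2)              # Cochran's Q
    c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - (len(smd) - 1)) / c)                 # between-study variance
    w_re = 1.0 / (var + tau2)                                 # random-effects weights
    pooled = np.sum(w_re * smd) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2

smd = np.array([0.10, 0.25, 0.30, 0.15])                      # hypothetical per-study SMDs
var = np.array([0.010, 0.020, 0.015, 0.008])                  # their variances
pooled, ci, tau2 = pool_random_effects(smd, var)
print(f"Pooled SMD = {pooled:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}, tau^2 = {tau2:.3f}")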
Abstract:
Whether from an urbanistic, social, or governance point of view, the evolution of cities is a major challenge for our contemporary societies. By making it possible to analyse existing spatial and social configurations or to simulate future ones, geographic information systems have become indispensable in urban planning and management. In five years the population of the city of Lausanne has grown from 134'700 to 140'570 inhabitants, while public school enrolment has increased from 12'200 to 13'500 students. This demographic growth, combined with a far-reaching harmonisation of compulsory schooling in Switzerland, led the Service des écoles to set up and develop, in collaboration with the University of Lausanne, GIS solutions capable of addressing various spatial issues. Established in 1989, the school district boundaries (catchment areas) had to be redrawn to fit the reality of a rapidly changing urban and political landscape. In a context of mobility and sustainability, a system for allocating public transport subsidies based on the home-to-school distance and on the age of the pupils was designed. Carrying out these projects required building geographic databases and developing the new analysis methods presented in this work. The thesis thus proceeded through a constant dialogue between theoretical research and practical needs.
The first part of this work focuses on the analysis of the city's pedestrian network. The morphology of the network is investigated through multi-scale approaches to the concept of centrality. The first conception, straightness centrality, stipulates that being central means being connected to the others in a straight line. The second, undoubtedly more intuitive, is closeness centrality and expresses the fact that being central means being close to the others. The methods developed aim to evaluate the connectivity and walkability of the network, while suggesting possible improvements (creation of pedestrian shortcuts). The third and final theoretical section presents and develops a regularised optimal transport algorithm. By minimising home-to-school distances while respecting school capacities, the algorithm produces student allocation scenarios. The implementation of Lagrange multipliers offers a visualisation of the spatial cost associated with the school infrastructure and with the students' places of residence. The second part of the thesis recounts the main aspects of three projects carried out in the context of school management, namely the design of the public transport subsidy system, the redefinition of the school districting map, and the simulation of pedestrian flows of students walking to school.
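The allocation step can be illustrated with a minimal entropy-regularised optimal transport (Sinkhorn) sketch that assigns students to schools subject to capacity constraints; the distances and capacities below are hypothetical, and the thesis's regularisation and implementation may differ in detail.

import numpy as np

def sinkhorn(cost, student_mass, school_capacity, eps=0.1, n_iter=500):
    # Returns a soft assignment plan whose marginals match the student masses
    # and the school capacities (assumed balanced).
    K = np.exp(-cost / eps)                      # Gibbs kernel
    u = np.ones_like(student_mass)
    v = np.ones_like(school_capacity)
    for _ in range(n_iter):                      # alternate projections on the two marginals
        u = student_mass / (K @ v)
        v = school_capacity / (K.T @ u)
    # eps*log(u) and eps*log(v) are the Lagrange multipliers (dual potentials) of the
    # marginal constraints, interpretable as the "spatial cost" of homes and schools.
    return np.diag(u) @ K @ np.diag(v)

rng = np.random.default_rng(1)
cost = rng.uniform(0.2, 3.0, size=(6, 2))        # home-to-school distances (km)
students = np.full(6, 1.0)                       # one student per home location
capacities = np.array([4.0, 2.0])                # seats available per school
plan = sinkhorn(cost, students, capacities)
print(plan.round(2))                             # rows: students, columns: schools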
Abstract:
The Kyoto Protocol allows Annex I countries to deduct carbon sequestered by land use, land-use change and forestry from their national carbon emissions. Thornley and Cannell (2000) demonstrated that the objectives of maximizing timber production and carbon sequestration are not complementary. Based on this finding, this paper determines the optimal selective management regime taking into account the underlying biophysical and economic processes. The results show that the net benefits of carbon storage only compensate for the decrease in the net benefits of timber production once the carbon price has exceeded a certain threshold value. The sequestration costs are significantly lower than previous estimates.
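The threshold described above can be stated compactly; the notation is ours, not the paper's. With $NPV_T(m)$ the net present value of timber under management regime $m$, $m_T^{*}$ the timber-optimal regime, $\Delta S(m)$ the additional carbon stored under $m$, and $p_c$ the carbon price, a carbon-oriented regime $m$ pays off only when

$$ p_c\,\Delta S(m) \;\ge\; NPV_T\!\left(m_T^{*}\right) - NPV_T(m), $$

so the threshold carbon price is the smallest $p_c$ for which some regime satisfies this inequality, $p_c^{*} = \min_{m}\bigl[NPV_T(m_T^{*}) - NPV_T(m)\bigr]/\Delta S(m)$.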
Abstract:
This thesis studies the use of heuristic algorithms in a number of combinatorial problems that occur in various resource constrained environments. Such problems occur, for example, in manufacturing, where a restricted number of resources (tools, machines, feeder slots) are needed to perform some operations. Many of these problems turn out to be computationally intractable, and heuristic algorithms are used to provide efficient, yet sub-optimal solutions. The main goal of the present study is to build upon existing methods to create new heuristics that provide improved solutions for some of these problems. All of these problems occur in practice, and one of the motivations of our study was the request for improvements from industrial sources. We approach three different resource constrained problems. The first is the tool switching and loading problem, and occurs especially in the assembly of printed circuit boards. This problem has to be solved when an efficient, yet small primary storage is used to access resources (tools) from a less efficient (but unlimited) secondary storage area. We study various forms of the problem and provide improved heuristics for its solution. Second, the nozzle assignment problem is concerned with selecting a suitable set of vacuum nozzles for the arms of a robotic assembly machine. It turns out that this is a specialized formulation of the MINMAX resource allocation formulation of the apportionment problem and it can be solved efficiently and optimally. We construct an exact algorithm specialized for the nozzle selection and provide a proof of its optimality. Third, the problem of feeder assignment and component tape construction occurs when electronic components are inserted and certain component types cause tape movement delays that can significantly impact the efficiency of printed circuit board assembly. Here, careful selection of component slots in the feeder improves the tape movement speed. We provide a formal proof that this problem is of the same complexity as the turnpike problem (a well studied geometric optimization problem), and provide a heuristic algorithm for this problem.
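For the tool switching and loading problem, a classical baseline is the KTNS ("keep tool needed soonest") eviction rule sketched below, which counts tool loads for a fixed job sequence; it is shown only as a reference point and is not necessarily one of the improved heuristics developed in the thesis.

def ktns_tool_loads(job_tools, capacity):
    # job_tools: list of sets, the tools required by each job in processing order.
    # Assumes every job's tool set fits in the magazine (len(tools) <= capacity).
    # Returns the total number of tool insertions, including the initial loading.
    def next_use(tool, start):
        for j in range(start, len(job_tools)):
            if tool in job_tools[j]:
                return j
        return len(job_tools)                       # tool is never needed again

    magazine, loads = set(), 0
    for i, needed in enumerate(job_tools):
        missing = needed - magazine
        while len(magazine) + len(missing) > capacity:
            # evict the tool whose next use lies furthest in the future
            victim = max(magazine - needed, key=lambda t: next_use(t, i))
            magazine.remove(victim)
        magazine |= missing
        loads += len(missing)
    return loads

jobs = [{1, 2, 3}, {2, 4}, {1, 4, 5}, {3, 5}]       # hypothetical tool requirements
print(ktns_tool_loads(jobs, capacity=3))            # -> 6 insertions (3 initial + 3 switches)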
Abstract:
In this Master’s thesis, agent-based modeling has been used to analyze maintenance strategy related phenomena. The main research question that has been answered was: what does the agent-based model made for this study tell us about how different maintenance strategy decisions affect the profitability of equipment owners and maintenance service providers? Thus, the main outcome of this study is an analysis of how profitability can be increased in an industrial maintenance context. To answer that question, first, a literature review of maintenance strategy, agent-based modeling, and maintenance modeling and optimization was conducted. This review provided the basis for building the agent-based model, which followed a standard simulation modeling procedure. The simulation results from the agent-based model were then used to answer the research question. Specifically, the results of the modeling and this study are: (1) optimizing the point at which a machine is maintained increases profitability for the owner of the machine, and also for the maintainer under certain conditions; (2) time-based pricing of maintenance services leads to a zero-sum game between the parties; (3) value-based pricing of maintenance services leads to a win-win game between the parties, if the owners of the machines share a substantial amount of their value with the maintainers; and (4) error in machine condition measurement is a critical parameter in optimizing the maintenance strategy, and there is real systemic value in having more accurate machine condition measurement systems.
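A much-simplified, single-machine sketch (not the thesis's agent-based model) of how the profit split between owner and maintainer can be computed under time-based versus value-based pricing is given below; all parameters and dynamics are invented purely for illustration.

import random

def simulate(threshold, pricing, value_share=0.3, hours=10_000, seed=42):
    rng = random.Random(seed)
    condition, owner, maintainer = 1.0, 0.0, 0.0
    for _ in range(hours):
        condition -= rng.uniform(0.0005, 0.0015)     # stochastic degradation
        if condition <= 0:                           # unplanned breakdown
            owner -= 500.0                           # lost production
            condition = 1.0
        elif condition <= threshold:                 # planned maintenance visit
            value_preserved = 500.0 * condition      # breakdown cost avoided (crude proxy)
            if pricing == "time":
                fee = 50.0                           # fixed fee per visit
            else:                                    # value-based pricing
                fee = value_share * value_preserved
            owner += value_preserved - fee
            maintainer += fee - 20.0                 # maintainer's own cost per visit
            condition = 1.0
        owner += 0.1 * condition                     # hourly production revenue
    return owner, maintainer

for scheme in ("time", "value"):
    print(scheme, [round(x) for x in simulate(threshold=0.3, pricing=scheme)])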
Abstract:
Recent developments in power electronics technology have made it possible to develop competitive and reliable low-voltage DC (LVDC) distribution networks. Further, islanded microgrids—isolated small-scale localized distribution networks—have been proposed to reliably supply power using distributed generation. However, islanded operations face many issues such as power quality, voltage regulation, network stability, and protection. In this thesis, an energy management system (EMS) that ensures efficient energy and power balancing and voltage regulation has been proposed for an LVDC island network utilizing solar panels for electricity production and lead-acid batteries for energy storage. The EMS uses the master/slave method with a robust communication infrastructure to control the production, storage, and loads. The logical basis for the EMS operations has been established by proposing functionalities of the network components as well as by defining appropriate operation modes that encompass all situations. During loss-of-power-supply periods, load prioritization and disconnection are employed to maintain the power supply to at least some loads. The proposed EMS ensures optimal energy balance in the network. A sizing method based on discrete-event simulations has also been proposed to obtain reliable capacities of the photovoltaic array and battery. In addition, an algorithm to determine the number of hours of electric power supply that can be guaranteed to the customers at any given location has been developed. The successful performance of all the proposed algorithms has been demonstrated by simulations.
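A toy discrete-time energy balance illustrates the flavour of such an EMS: photovoltaic production and the battery's state of charge supply loads in priority order, and loads are shed when energy runs short. The parameters and rules are assumed and far simpler than the thesis's EMS with its master/slave control and operation modes.

def ems_step(soc, pv_kwh, loads_kwh, capacity_kwh=50.0):
    # loads_kwh is ordered from highest to lowest priority; returns the new state of
    # charge and the energy actually delivered to each load (0.0 if shed).
    served = []
    energy = soc + pv_kwh                        # energy available this hour
    for load in loads_kwh:
        if energy >= load:
            energy -= load
            served.append(load)
        else:
            served.append(0.0)                   # not enough energy left: this load is shed
    soc = min(capacity_kwh, energy)              # surplus recharges the battery (losses ignored)
    return soc, served

soc = 20.0
for hour, pv in enumerate([0.0, 4.0, 8.0, 6.0, 0.0]):   # hypothetical PV output (kWh)
    soc, served = ems_step(soc, pv, loads_kwh=[3.0, 2.0, 2.0])
    print(f"hour {hour}: SOC = {soc:.1f} kWh, served = {served}")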
Abstract:
Main objective: It has not been demonstrated that interventions aimed at monitoring, or indeed moderating, the medication of patients with hypertension can improve their management of the disease. This systematic review evaluates monitored medication management programmes for hypertension based on measuring patients' adherence to treatment (CMGM). Design: Systematic review. Data sources: MEDLINE, EMBASE, CENTRAL, abstracts of international conferences on hypertension, and the bibliographies of relevant articles. Methods: Randomised controlled trials (RCTs) and observational studies (OS) were assessed by 2 independent reviewers. Quality assessment was performed using the Cochrane risk-of-bias tool and rated on a four-level quality scale. A narrative synthesis of the data was carried out because of the substantial heterogeneity of the studies. Results: 13 studies (8 RCTs, 5 OS) covering 2150 hypertensive patients were included. Among them, 5 CMGM studies using electronic monitoring devices as the sole intervention reported a decrease in blood pressure (BP), which might, however, be explained by measurement bias. A short-term improvement in BP under CMGM within complex interventions was shown in 4 studies of low or moderate quality. In 4 other, higher-quality studies of integrated care, it was not possible to distinguish the impact of the CMGM component, which may have been confounded by drug treatment. Taken together, the studies also suggest that regular feedback to the treating physician may be an essential element of effective CMGM, and can easily be provided by a nurse or a pharmacist using appropriate communication tools. Conclusions: No convincing evidence of the effectiveness of CMGM as a health technology has been established, owing to the suboptimal designs and unsatisfactory methodological quality of the studies identified. Future research should follow approved quality standards and current clinical recommendations for the treatment of hypertension, include specific groups of patients with adherence problems, and consider clinical and economic outcomes of the organisation of care as well as patient-reported outcomes.
Abstract:
The thesis proposes to introduce into the legal treatment of international intermodal transport a global perspective rooted in the logistics strategy of firms. The legal conception indeed collides with the operational and organisational evolution of transport and results in legal uncertainty. Carriers have had to adapt to the flow-optimisation requirements of shippers whose production and distribution methods rest on supply chain management (SCM). This concept is the product of globalisation and information technology. The competition induced by globalisation and the optimal steering of flows have prompted new strategies on the part of firms seeking a competitive advantage in the market. These strategies rest on cross-functional and inter-organisational integration. Within this global logistics chain (or SCM), intermodal transport is crucial. It links and coordinates firms' spatially disaggregated production and distribution networks and meets the requirements of mastering space and time at the lowest cost. The carrier must therefore, on the one hand, integrate transport operations by optimising movements and, on the other hand, integrate itself into the client's logistics chain by offering value-added services that strengthen the competitiveness of the value chain. The result is a technical and economic unity of the intermodal chain which is nevertheless legally fragmented. The international conventions in force were drawn up for each mode of transport, ignoring the interaction between modes and between operators. Intermodal transport is treated as a juxtaposition of modes and of legal regimes. This legal carving-up contrasts with the management of the intermodal chain, whose individual components fade behind the overall objective to be achieved. We first set out the extent of the legal uncertainty arising from the difficulty of delimiting the scope of operations covered by the conventions in force. Attention is paid to divergent interpretations that lead to the "de-unification" of transport law. We then turn to the interactions between transport and shippers' logistics chains, retracing the evolution of their production and distribution methods. It is indeed from the logistics strategy that the design of the intermodal chain flows. Starting from this system, we identify the fundamental characteristics of intermodal transport. The thesis ultimately dispels the confusion surrounding the legal characterisation of intermodal transport, which underlies the divergent interpretations and the legal uncertainty. It also highlights the economic unity of the intermodal transport contract, which should guide the establishment of a liability regime dedicated to this integrated transport system. Finally, it initiates an approach that has been overlooked in legal debates.
Abstract:
Recent developments in decision theory have greatly enriched our understanding of Knightian uncertainty, usually called ambiguity. These developments, however, have been slow to be integrated into the core of economic theory. We suggest that the analysis of economic phenomena such as innovation and research and development would benefit from incorporating models of decision under ambiguity. We support this claim by analysing the allocation of the property rights over a discovery. The first two parts of the presentation draw on a model by Aghion and Tirole, The Management of Innovation, dealing with the allocation of property rights between a research unit and an investor. It is shown that a disagreement between the agents over the research technology affects their effort levels, the allocation of property rights, and the allocation of the subsequent revenues. Finally, we examine a situation in which several researchers compete, drawing on Savage's treatment of uncertainty. The presence of ambiguity affects the agents' behaviour and the allocation of property rights in a way that is not captured under the assumption of risk.
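As a point of reference for the kind of model invoked here, one standard formalisation of ambiguity (not necessarily the one used in this presentation) is the maxmin expected utility criterion of Gilboa and Schmeidler, under which an act $f$ is evaluated against a set of priors $\mathcal{P}$ rather than a single subjective probability:

$$ V(f) \;=\; \min_{p \in \mathcal{P}} \int_{S} u\bigl(f(s)\bigr)\,\mathrm{d}p(s). $$

When $\mathcal{P}$ is a singleton this reduces to Savage's subjective expected utility, that is, to decision under risk; a non-degenerate $\mathcal{P}$ is one way to capture disagreement about the research technology.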
Abstract:
The base concept from which the entire research problem emerged is as follows: the lack of spatial planning and of an effective development management system leads to urban sprawl, with a population density too low to support urban infrastructure, causing on one side a lower quality of life in urban areas. On the other side, it causes a loss of productivity of natural ecosystems and agricultural areas due to disturbance of those ecosystems. Planned, compact, high-density development with compatible mixed land use can go a long way towards achieving environmental efficiency in the development management system.