38 results for City planning and redevelopment
Abstract:
Background: As part of the second-generation surveillance system for HIV/AIDS in Switzerland, repeated cross-sectional surveys were conducted in 1993, 1994, 1996, 2000, 2006 and 2011 among attenders of all low-threshold facilities (LTFs) with needle exchange programmes and/or supervised drug consumption rooms for injection or inhalation. The number of syringes distributed to injectors has also been measured annually since 2000. Distribution in other settings, such as pharmacies, is also monitored nationally. Methods: Periodic surveys of LTFs have been conducted using an interviewer- or self-administered questionnaire structured along four themes: socio-demographic characteristics, drug consumption, risk/preventive behaviour and health. Analysis is restricted to attenders who had injected drugs during their lifetime (IDUs). Pearson's chi-square test and trend analysis were conducted on annual aggregated data. Trend significance was assessed using Stata's nonparametric trend test nptrend. Results: The median age of IDUs increased from 26 years in 1993 to 40 in 2011; most are men (78%). The total yearly number of syringes distributed by LTFs has decreased by 44% in 10 years. Use of cocaine has increased (Table 1). Injection, regular use of heroin and borrowing of syringes/needles have decreased, while sharing of other material remains stable. There are fewer new injectors; more IDUs report substitution treatment. Most attenders had been tested for HIV at least once (90% in 1993, 94% in 2011). Reported prevalence of HIV remained stable at around 10%; that of HCV decreased from 62% in 2000 to 42% in 2011. Conclusions: Overall, the findings indicate a decrease in injection as a means of drug consumption in this population. This interpretation is supported by data from other sources, such as a national decrease in distribution from other delivery points.
Switzerland's behavioural surveillance system is sustainable and allows the HIV epidemic to be monitored among this hard-to-reach population, providing information for planning and evaluation.
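The trend analysis described in the Methods can be sketched in a few lines. The example below is a hedged stand-in, not the study's code: it uses a Kendall rank correlation between survey year and an indicator as a simple nonparametric trend test in the spirit of Stata's nptrend, and the yearly proportions are invented for illustration.

```python
# Sketch of a nonparametric trend test across survey years, analogous
# in spirit to Stata's nptrend. The values below are hypothetical
# illustrations, not the survey's data.
from scipy.stats import kendalltau

years = [1993, 1994, 1996, 2000, 2006, 2011]
# hypothetical share of attenders reporting current injection
injecting = [0.85, 0.82, 0.74, 0.63, 0.55, 0.48]

tau, p = kendalltau(years, injecting)
print(f"tau={tau:.2f}, p={p:.3f}")  # a strictly decreasing series gives tau = -1
```

A significant negative tau would be read as a downward trend over the survey years, matching the kind of decline the abstract reports for injection.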
Abstract:
Summary: Forests are key ecosystems of the earth and associated with a large range of functions. Many of these functions are beneficial to humans and are referred to as ecosystem services. Sustainable development requires that all relevant ecosystem services are quantified, managed and monitored equally. Natural resource management therefore targets the services associated with ecosystems. The main hypothesis of this thesis is that the spatial and temporal domains of relevant services do not correspond to a discrete forest ecosystem. As a consequence, the services are not quantified, managed and monitored in an equal and sustainable manner. The thesis aims were therefore to test this hypothesis, establish an improved conceptual approach and provide spatial applications for the relevant land cover and structure variables. The study was carried out in western Switzerland and based primarily on data from a countrywide landscape inventory. This inventory is part of the third Swiss national forest inventory and assesses continuous landscape variables based on a regular sampling of true colour aerial imagery. In addition, land cover variables were derived from Landsat 5 TM passive sensor data and land structure variables from active sensor data from a small-footprint laser scanning system. The results confirmed the main hypothesis, as relevant services did not scale well with the forest ecosystem. Instead, a new conceptual approach for sustainable management of natural resources was described. This concept quantifies the services as a continuous function of the landscape, rather than a discrete function of the forest ecosystem. The explanatory landscape variables are therefore called continuous fields and the forest becomes a dependent and function-driven management unit. Continuous field mapping methods were established for land cover and structure variables. In conclusion, the discrete forest ecosystem is an adequate planning and management unit.
However, monitoring the state of and trends in the sustainability of services requires them to be quantified as a continuous function of the landscape. Sustainable natural resource management iteratively combines the ecosystem and gradient approaches.
Abstract:
Palliative care, which is intended to keep patients at home as long as possible, is increasingly proposed for patients who live at home, with their family, or in retirement homes. Although their condition is expected to follow a fatal course, the patients (or, more often, their families or entourages) are sometimes confronted with sudden situations of respiratory distress, convulsions, hemorrhage, coma, anxiety, or pain. Prehospital emergency services are therefore often confronted with palliative care situations, for which medical teams are not specifically trained and in which they frequently feel uncomfortable. We conducted a retrospective study of palliative care situations that were managed by prehospital emergency physicians (EPs) over a period of 8 months in 2012, in the urban region of Lausanne in the State of Vaud, Switzerland. The prehospital EPs managed 1586 prehospital emergencies during the study period. We report 4 situations of respiratory distress or neurological disorders in advanced cancer patients, highlighting end-of-life and palliative care situations that may be encountered by prehospital emergency services. The similarity of the cases, the reasons leading to the involvement of prehospital EPs, and the ethical dilemmas illustrated by these situations are discussed. These situations highlight the need for more formal education in palliative care for EPs and prehospital emergency teams, and the need to fully communicate the planning and implementation of palliative care with patients and their family members.
Abstract:
The relation among education, disease prevalence, and frequency of health service utilization was analyzed using data from the Swiss National Health Survey SOMIPOPS, conducted in 1981-1983 on a randomly selected sample of 4,255 individuals, representative of the entire Swiss population. The prevalence of several important cardiovascular, respiratory, digestive, osteoarticular, and psychiatric disorders was higher among less educated individuals; only allergic conditions were directly associated with indicators of social class. More educated individuals reported lower frequencies of general practitioner visits, but higher frequencies of specialized consultations. These findings confirm that education is an important determinant not only of mortality but also of morbidity and health-care utilization and require careful consideration in terms of the planning and evaluation of health services.
Abstract:
PURPOSE: The Gastro-Intestinal Working Party of the EORTC Radiation Oncology Group (GIWP-ROG) developed guidelines for target volume definition in neoadjuvant radiation of adenocarcinomas of the gastroesophageal junction (GEJ) and the stomach. METHODS AND MATERIALS: Guidelines about the definition of the clinical target volume (CTV) are based on a systematic literature review of the location and frequency of local recurrences and lymph node involvement in adenocarcinomas of the GEJ and the stomach; to this end, MEDLINE was searched up to August 2008. Guidelines concerning prescription, planning and treatment delivery are based on a consensus between the members of the GIWP-ROG. RESULTS: In order to support a curative resection of GEJ and gastric cancer, an individualized preoperative treatment volume based on tumour location has to include the primary tumour and the draining regional lymph node areas. We therefore recommend using the 2nd English Edition of the Japanese Classification of Gastric Carcinoma of the Japanese Gastric Cancer Association, which developed the concept of assigning tumours of the GEJ and the stomach to anatomically defined sub-sites, each corresponding to a distinct lymphatic spread pattern. CONCLUSION: The GIWP-ROG defined guidelines for preoperative irradiation of adenocarcinomas of the GEJ and the stomach to reduce variability in the framework of future clinical trials.
Abstract:
AIM: In a survey conducted in the Lausanne catchment area in 2000, we estimated on the basis of file assessment that first-episode psychosis (FEP) patients had had psychotic symptoms for more than 2 years before treatment and that 50% did not attend any outpatient appointment after discharge from hospital. In this paper, we describe the implementation of a specialized programme aimed at improving engagement and quality of treatment for early psychosis patients in the Lausanne catchment area in Switzerland. METHOD: The Treatment and Early Intervention in Psychosis Program-Lausanne is a comprehensive 3-year programme composed of (i) an outpatient clinic based on assertive case management; (ii) a specialized inpatient unit; and (iii) an intensive mobile team, connected for research purposes to the Center for Psychiatric Neuroscience. RESULTS: Eight years after implementation, the programme has included 350 patients, with a disengagement rate of 9% over 3 years of treatment. All patients have been assessed prospectively and 90 participated in neurobiological research. Based on this experience, the Health Department funded the implementation of similar programmes in other parts of the state, covering a total population of 540 000 people. CONCLUSION: Programmes for early intervention in psychosis have a major impact on patients' engagement in treatment. While the development of mobile teams and assertive case management with specific training is crucial, these components do not require massive financial support to get started. The inclusion of a research component is important as well, in terms of service planning and improvement of both the quality of care and the impact of early intervention strategies.
Abstract:
Introduction: Schizophrenia is associated with multiple neuropsychological dysfunctions, such as disturbances of attention, memory, perceptual functioning, concept formation and executive processes. These cognitive functions are reported to depend on the integrity of the prefrontal and thalamo-prefrontal circuits. Multiple lines of evidence suggest that schizophrenia is related to abnormalities in neural circuitry and impaired structural connectivity. Here, we report a preliminary case-control study that showed a correlation between thalamo-frontal connections and several cognitive functions known to be impaired in schizophrenia. Materials and Methods: We investigated 9 schizophrenic patients (DSM-IV criteria, Diagnostic Interview for Genetic Studies) and 9 age- and sex-matched control subjects. We obtained from each volunteer a DT-MRI dataset (3 T, b = 1,000 s/mm²) and a high-resolution anatomical T1 image. The thalamo-frontal tracts were simulated on these datasets with DTI tractography, a method allowing inference of the main neural fiber tracts from diffusion MRI data. To test for a possible correlation with the thalamo-frontal connections, every subject performed a battery of neuropsychological tests including computerized tests of attention (sustained attention, selective attention and reaction time), working memory tests (Plane test and the working memory sub-tests of the Wechsler Adult Intelligence Scale), an executive functioning task (Tower of Hanoi) and a test of visual binding abilities. Results: In a pilot case-control study (patients: n = 9; controls: n = 9), we showed that this methodology is appropriate and gives results in the expected range. Considering the relation between connectivity density and the neuropsychological data, a correlation between the number of thalamo-frontal fibers and performance in the Tower of Hanoi was observed in the patients (Pearson correlation, r = 0.76, p < 0.05) but not in control subjects.
In the most difficult item of the test, the lowest number of fibers corresponded to the worst performance (fig. 2, number of supplementary movements of the elements necessary to reach the correct configuration). It is interesting to note that, in an independent study, we showed that schizophrenia patients (n = 32) performed significantly worse than control subjects (n = 29) on the most difficult item of the Tower of Hanoi (Mann-Whitney, p < 0.005). This has also been observed in several other neuropsychological studies. Discussion: This pilot study of schizophrenia patients shows a correlation between the number of thalamo-frontal fibers and performance in the Tower of Hanoi, a planning and goal-oriented action task known to be associated with frontal dysfunction. This observation is consistent with the proposed impaired connectivity in schizophrenia. We aim to pursue the study with a larger sample in order to determine whether other neuropsychological tests may be associated with connectivity density.
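The correlation reported in the Results is a plain Pearson product-moment correlation. A minimal sketch follows, with invented numbers in place of the study's fiber counts and Tower of Hanoi scores; only the method, not the data, comes from the abstract.

```python
# Sketch of the reported analysis: Pearson's r between tract fiber
# counts and task performance. All numbers are hypothetical placeholders.
from scipy.stats import pearsonr

fiber_counts = [120, 95, 150, 80, 110, 70, 140, 100, 130]  # hypothetical, n = 9 patients
extra_moves  = [14, 20, 9, 25, 16, 28, 11, 18, 12]         # hypothetical supplementary moves

r, p = pearsonr(fiber_counts, extra_moves)
print(f"r={r:.2f}, p={p:.3f}")  # fewer fibers pairs with more extra moves, so r is negative here
```

Note the sign convention: counting supplementary (error) moves makes the illustrative correlation negative, whereas the abstract's r = 0.76 is stated for performance, where higher is better.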
Abstract:
Astute control of brain activity states is critical for adaptive behaviours and survival. In mammals and birds, electroencephalographic recordings reveal alternating states of wakefulness, slow wave sleep and paradoxical sleep (or rapid eye movement sleep). This control is profoundly impaired in narcolepsy with cataplexy, a disease resulting from the loss of orexin/hypocretin neurotransmitter signalling in the brain. Narcolepsy with cataplexy is characterized by irresistible bouts of sleep during the day, sleep fragmentation during the night and episodes of cataplexy, a sudden loss of muscle tone while awake and experiencing emotions. The neural mechanisms underlying cataplexy are unknown, but commonly thought to involve those of rapid eye movement-sleep atonia, and cataplexy typically is considered as a rapid eye movement sleep disorder. Here we reassess cataplexy in hypocretin (Hcrt, also known as orexin) gene knockout mice. Using a novel video/electroencephalogram double-blind scoring method, we show that cataplexy is not a state per se, as believed previously, but a dynamic, multi-phased process involving a reproducible progression of states. A knockout-specific state and a stereotypical paroxysmal event were introduced to account for signals and electroencephalogram spectral characteristics not seen in wild-type littermates. Cataplexy almost invariably started with a brief phase of wake-like electroencephalogram, followed by a phase featuring high-amplitude irregular theta oscillations, defining an activity profile distinct from paradoxical sleep, referred to as cataplexy-associated state and in the course of which 1.5-2 s high-amplitude, highly regular, hypersynchronous paroxysmal theta bursts (∼7 Hz) occurred. In contrast to cataplexy onset, exit from cataplexy did not show a predictable sequence of activities. 
Altogether, these data contradict the hypothesis that cataplexy is a state similar to paradoxical sleep, even if long cataplexies may evolve into paradoxical sleep. Although not exclusive to overt cataplexy, cataplexy-associated state and hypersynchronous paroxysmal theta activities are highly enriched during cataplexy in hypocretin/orexin knockout mice. Their occurrence was confirmed in an independent narcolepsy mouse model, the orexin/ataxin-3 transgenic mouse, which undergoes loss of orexin neurons. Importantly, we document for the first time similar paroxysmal theta hypersynchronies (∼4 Hz) during cataplexy in narcoleptic children. Lastly, we show by deep recordings in mice that the cataplexy-associated state and hypersynchronous paroxysmal theta activities are independent of hippocampal theta and involve the frontal cortex. Cataplexy hypersynchronous paroxysmal theta bursts may represent medial prefrontal activity, associated in humans and rodents with reward-driven motor impulses, planning and conflict monitoring.
Abstract:
It is essential for organizations to compress detailed sets of information into more comprehensible sets, thereby achieving both sharp data compression and good decision-making. In chapter 1, I review and structure the literature on information aggregation in management accounting research. I outline the cost-benefit trade-off that management accountants need to consider when they decide on the optimal levels of information aggregation. Beyond the fundamental information content perspective, organizations also have to account for cognitive and behavioral perspectives. I elaborate on these aspects, differentiating between research in cost accounting, budgeting and planning, and performance measurement. In chapter 2, I focus on a specific bias that arises when probabilistic information is aggregated. In budgeting and planning, for example, organizations need to estimate mean costs and durations of projects, as the mean is the only measure of central tendency that is linear. Different from the mean, measures such as the mode or median cannot simply be added up. Given the specific shape of cost and duration distributions, estimating mode or median values will result in underestimations of total project costs and durations. In two experiments, I find that participants tend to estimate mode values rather than mean values, resulting in large distortions of estimates for total project costs and durations. I also provide a strategy that partly mitigates this bias. In the third chapter, I conduct an experimental study to compare two approaches to time estimation for cost accounting, i.e., traditional activity-based costing (ABC) and time-driven ABC (TD-ABC). Contrary to claims made by proponents of TD-ABC, I find that TD-ABC is not necessarily suitable for capacity computations.
However, I also provide evidence that TD-ABC seems better suited for cost allocations than traditional ABC.
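The aggregation bias described in chapter 2 is easy to reproduce numerically. The sketch below assumes right-skewed lognormal task-cost distributions (the abstract does not prescribe a distribution, and the parameters are hypothetical): for a lognormal cost, mode < median < mean, so adding up per-task modes or medians systematically underestimates the expected total.

```python
# Numerical illustration of the aggregation bias: only means add up
# linearly, E[sum] = sum of means. Parameters are hypothetical.
import math

mu, sigma, n_tasks = math.log(10.0), 0.8, 20   # hypothetical task-cost parameters

mode   = math.exp(mu - sigma**2)        # most likely single-task cost
median = math.exp(mu)
mean   = math.exp(mu + sigma**2 / 2)    # the only linear summary of the distribution

print(f"sum of modes:   {n_tasks * mode:8.1f}")
print(f"sum of medians: {n_tasks * median:8.1f}")
print(f"expected total: {n_tasks * mean:8.1f}")
```

With these parameters the sum of modes is roughly 105, the sum of medians 200, and the expected total about 275, which is the magnitude of distortion the chapter's experiments concern: estimating modes instead of means badly understates total project cost.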
Abstract:
Based on two case studies, the Frente Ribeirinha project in Porto (Portugal) and Euroméditerranée in Marseille, this research project aims to study the modes of enhancement of the built fabric (patrimonialisation) by institutional actors in urban regeneration projects.
Positing the emergence of a new heritage regime, the analysis attempts to explain the strategies at work and how public authorities influence representations of the built space. Examining the scales of intervention of regeneration projects in the two case studies, as well as in the literature on Anglo-Saxon projects, the thesis investigates, through the lens of heritage-making, how these projects are directly connected to the international economy and the phenomenon of globalization. It also asks whether such policies take into account the scale of everyday life and domesticity. It aims to show the limits of valuing the built fabric for economic returns (the consumable city) at the expense of the use values and meanings of everyday life. Hence, the thesis argues for taking representations and values into account at different scales (the socio-cultural dimensions of the regeneration project) to achieve a qualitative balance in urban planning and in the production of urban projects over the long term. The overall aim is to broaden the understanding of heritage construction, urban design practices and the implementation of urban projects.
Abstract:
1. Identifying the boundary of a species' niche from observational and environmental data is a common problem in ecology and conservation biology and a variety of techniques have been developed or applied to model niches and predict distributions. Here, we examine the performance of some pattern-recognition methods as ecological niche models (ENMs). In particular, one-class pattern recognition is a flexible and seldom-used methodology for modelling ecological niches and distributions from presence-only data. The development of one-class methods that perform comparably to two-class methods (for presence/absence data) would remove modelling decisions about sampling pseudo-absences or background data points when absence points are unavailable. 2. We studied nine methods for one-class classification and seven methods for two-class classification (five common to both), all primarily used in pattern recognition and therefore not common in species distribution and ecological niche modelling, across a set of 106 mountain plant species for which presence-absence data were available. We assessed accuracy using standard metrics and compared trade-offs in omission and commission errors between classification groups as well as effects of prevalence and spatial autocorrelation on accuracy. 3. One-class models fit to presence-only data were comparable to two-class models fit to presence-absence data when performance was evaluated with a measure weighting omission and commission errors equally. One-class models were superior for reducing omission errors (i.e. yielding higher sensitivity), and two-class models were superior for reducing commission errors (i.e. yielding higher specificity). For these methods, spatial autocorrelation was only influential when prevalence was low. 4. 
These results differ from previous efforts to evaluate alternative modelling approaches to building ENMs and are particularly noteworthy because the data come from exhaustively sampled populations, minimizing false absence records. Accurate, transferable models of species' ecological niches and distributions are needed to advance ecological research and are crucial for effective environmental planning and conservation; the pattern-recognition approaches studied here show good potential for future modelling studies. This study also provides an introduction to promising methods for ecological modelling inherited from the pattern-recognition discipline.
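One-class classification on presence-only data can be sketched with a one-class support vector machine, one common family of one-class pattern-recognition methods (whether it was among the nine methods studied is not stated here). All data below are synthetic stand-ins for environmental covariates.

```python
# A minimal sketch of presence-only niche modelling with a one-class
# classifier (scikit-learn's OneClassSVM). The covariates are invented,
# e.g. temperature (°C) and annual precipitation (mm).
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# presence records clustered around a hypothetical climatic optimum
presences = rng.normal(loc=[8.0, 1200.0], scale=[1.0, 100.0], size=(200, 2))

scale = presences.std(axis=0)            # crude per-feature scaling
model = OneClassSVM(nu=0.1, gamma="scale").fit(presences / scale)

inside = model.predict(np.array([[8.0, 1200.0]]) / scale)    # at the optimum
outside = model.predict(np.array([[2.0, 300.0]]) / scale)    # far from it
print(inside[0], outside[0])   # +1 = predicted suitable, -1 = unsuitable
```

The design point matches the abstract: the model is fit to presences alone, so no pseudo-absences or background points ever have to be sampled, at the price of the omission/commission trade-off the study quantifies.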
Abstract:
PURPOSE: We investigated the influence of beam modulation on treatment planning by comparing four available stereotactic radiosurgery (SRS) modalities: Gamma-Knife-Perfexion, Novalis-Tx Dynamic-Conformal-Arc (DCA) and Dynamic-Multileaf-Collimation-Intensity-Modulated-Radiotherapy (DMLC-IMRT), and Cyberknife. MATERIAL AND METHODS: Patients with arteriovenous malformation (n = 10) or acoustic neuromas (n = 5) were planned with the different treatment modalities. Paddick conformity index (CI), dose heterogeneity (DH), gradient index (GI) and beam-on time were used as dosimetric indices. RESULTS: Gamma-Knife-Perfexion can achieve a high degree of conformity (CI = 0.77 ± 0.04) with limited low doses (GI = 2.59 ± 0.10) surrounding the inhomogeneous dose distribution (DH = 0.84 ± 0.05) at the cost of treatment time (68.1 min ± 27.5). Novalis-Tx-DCA improved this inhomogeneity (DH = 0.30 ± 0.03) and treatment time (16.8 min ± 2.2) at the cost of conformity (CI = 0.66 ± 0.04), and Novalis-Tx-DMLC-IMRT improved the DCA CI (CI = 0.68 ± 0.04) and inhomogeneity (DH = 0.18 ± 0.05) at the cost of low doses (GI = 3.94 ± 0.92) and treatment time (21.7 min ± 3.4) (p < 0.01). Cyberknife achieved comparable conformity (CI = 0.77 ± 0.06) at the cost of low doses (GI = 3.48 ± 0.47) surrounding the homogeneous (DH = 0.22 ± 0.02) dose distribution and treatment time (28.4 min ± 8.1) (p < 0.01). CONCLUSIONS: Gamma-Knife-Perfexion will comply with all SRS constraints (high conformity while minimizing low-dose spread). Multiple focal entries (Gamma-Knife-Perfexion and Cyberknife) will achieve better conformity than the High-Definition MLC of Novalis-Tx at the cost of treatment time. Non-isocentric beams (Cyberknife) or IMRT beams (Novalis-Tx-DMLC-IMRT) will spread more low dose than multiple isocenters (Gamma-Knife-Perfexion) or dynamic arcs (Novalis-Tx-DCA). Inverse planning and modulated fluences (Novalis-Tx-DMLC-IMRT and CyberKnife) will deliver the most homogeneous treatment.
Furthermore, Linac-based systems (Novalis and Cyberknife) can perform image verification at the time of treatment delivery.
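Two of the dosimetric indices compared above have simple closed forms: Paddick's conformity index, CI = TV_PIV² / (TV × PIV), and his gradient index, GI = PIV_half / PIV. A minimal sketch with hypothetical plan volumes (not the study's data):

```python
# Paddick's conformity and gradient indices from plan volumes (all in cc).
# The example volumes are hypothetical.
def paddick_ci(tv: float, piv: float, tv_piv: float) -> float:
    """tv = target volume, piv = prescription isodose volume,
    tv_piv = target volume covered by the prescription isodose."""
    return tv_piv**2 / (tv * piv)

def gradient_index(piv: float, piv_half: float) -> float:
    """piv_half = volume enclosed by the half-prescription isodose."""
    return piv_half / piv

ci = paddick_ci(tv=4.0, piv=4.5, tv_piv=3.8)     # hypothetical plan
gi = gradient_index(piv=4.5, piv_half=12.0)
print(f"CI={ci:.2f}, GI={gi:.2f}")  # CI=0.80, GI=2.67
```

A CI near 1 means the prescription isodose tightly matches the target (the abstract's CI ≈ 0.77 plans), while a smaller GI means steeper dose fall-off, i.e. less low-dose spread into surrounding tissue.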
Abstract:
The discipline of Enterprise Architecture Management (EAM) deals with the alignment of business and information systems architectures. While EAM has long been regarded as a discipline for IT managers, this book takes a different stance: it explains how top executives can use EAM to leverage their strategic planning and controlling processes, and how EAM can contribute to sustainable competitive advantage. Based on the analysis of best practices from eight leading European companies in various industries, the book presents crucial elements of successful EAM. It outlines what executives need to do in terms of governance, processes, methodologies and culture in order to bring their management to the next level. Beyond this, the book points out how EAM might develop in the next decade, allowing today's managers to prepare for the future of architecture management.
Abstract:
Inner-city neighborhoods, poor outskirts, and peri-urban spaces with no amenities usually suffer from combined social and environmental inequalities, such as poverty, unemployment, and exposure to noise and industrial hazards. The observed persistence of these inequalities over time points to an underlying trend - namely, that access to proper living conditions is fundamentally unequal - and raises the questions of how such inequalities arise and how this trend can be reversed so as to build a more equitable city.

Providing answers to these questions requires identifying the causal factors at play within the system that (re)produces urban inequalities. Land and real-estate markets, "micromotives and macrobehavior", and spatially relevant public policies are chiefly involved. The latter are central in that they act on all the elements of the system. This thesis therefore focuses on the way public authorities steer the production of contemporary cities, by studying the public project ownership of major urban projects.

The study of justice within the urban fabric also implies questioning the normative frames of reference of public action: what conception of justice should public action follow? This thesis examines four perspectives (radical, substantialist, procedural, and integrative), each of which translates into different principles of action.
This theoretical part concludes with a hybrid methodology that draws on the sociology of organizations and public policy analysis and suggests, by way of metaphor, that the urban project may be understood as a play whose unfolding depends on the actors' performance.

This methodology is applied in the empirical part of the research, an analysis of the public project ownership of an ongoing urban project in the Lyon first-ring suburbs: the Carré de Soie. Three main objectives are pursued: descriptive (reconstructing the scenario), analytical (assessing the nature of the play - fairy tale, tragedy, or improvisation match?), and prescriptive (drawing the moral of the story). The description of the public project ownership shows the successive deployment of four steering strategies, whose implications for schedules, project content (programs, morphologies), and public funding prove decisive. Building on this analysis, several recommendations can be made - most notably the importance of anticipation and of articulating planning with land strategy - to allow the public sphere to control the process and ensure that the urban project produces equity (provision and maintenance of facilities and public spaces, funding of quality housing for a wide range of populations, etc.). More generally, a problematic gap can be highlighted between the territories that are strategic for the development of the agglomeration and the limited resources of the municipalities involved. This deficit calls for strengthening the investment capacity of the intermunicipal structure.

By itself, the logic of the land and real-estate markets leads to social polarization and urban inequalities. Building an equitable city requires a strong will on the part of public authorities, a will that must be reflected both in the stated ambition - a fair prioritization of urban development - and in its implementation - fair public project ownership of urban projects.