931 results for TOP-DOWN
Abstract:
An event-related potential (ERP) component known as the N2pc is associated with the deployment of visuospatial attention. We examined the modulation of the N2pc as a function of the presence or absence of a target, the physical separation between two salient items, and their similarity. The stimuli were lines varying in orientation and colour, with salient items shown in blue and non-salient items in grey. The results show an increase in N2pc amplitude with the distance separating two salient items, as well as an increase in N2pc amplitude when the salient items had more similar orientations. No interaction between these two factors was observed. A significant interaction was, however, observed between the presence/absence of a target and the similarity of the distractor to the searched-for target. These results reveal a dissociation between the activity related to the distance between salient items and the activity related to distractor-target similarity, since they cannot be explained by a single mechanism. The results therefore suggest that a combination of bottom-up and top-down processing modulates the N2pc component.
Abstract:
Objective: This thesis aims to clarify the neuropsychological mechanisms of pain, of endogenous pain regulation, and of psychologically induced hypoalgesia (PIH) through a synthesis of nearly thirty years of functional brain imaging research. Methodology: Given the abundance of studies on the subject and the lack of integration of their results, this thesis relied on quantitative coordinate-based meta-analysis of brain activation, as implemented in the ALE (Activation Likelihood Estimation) algorithm. An additional strength of this thesis lies in the rigour of the article selection process: the studies included in the meta-analyses had to satisfy strict inclusion criteria, in order to promote the precision and validity of the subsequent conclusions. Study 1: The first article aimed to identify the brain areas involved in pain reduction by psychological intervention methods. The articles retained cover a variety of interventions, such as placebo, hypnosis, meditation, perceived control over the painful stimulation, and emotion induction. The results indicate that PIH involves a broad activation network comprising the anterior cingulate cortex, the anterior insula, orbitofrontal and lateral prefrontal areas, as well as parietal, temporal and subcortical regions. These activations are thought to reflect the involvement of the cognitive and emotional neuropsychological mechanisms underlying the psychological interventions targeted by these studies, including self-awareness and motivation. In addition, divergences in activation patterns between approaches were explored, notably for placebo and distraction. Study 2: The second article identified activation patterns preferentially associated with pain perception and with PIH, as well as activations commonly associated with both pain and PIH. The results indicate that (1) pain perception is associated with activation of somatosensory and motor areas, which could reflect the preparation of an adaptive action; (2) PIH is linked to the engagement of anteromedial and orbital prefrontal regions, possibly related to motivational and emotional processes; and (3) pain and PIH are both associated with activation of dorsolateral prefrontal areas, the anterior insula and the midcingulate cortex, which could reflect the spontaneous engagement, during pain, of endogenous top-down regulation mechanisms. Conclusion: Through these studies, this thesis takes stock of the brain mechanisms differentially involved in pain perception, in its endogenous regulation, and in psychologically induced hypoalgesia.
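The methodology above centres on coordinate-based meta-analysis with the ALE (Activation Likelihood Estimation) algorithm. The following minimal sketch illustrates only the core idea: each reported activation focus is modelled as a 3D Gaussian, per-study maps are formed as the probabilistic union over foci, and the ALE map is the union across studies. The toy voxel grid, smoothing width and example foci are assumptions for illustration and do not reproduce the actual implementation used in the thesis.

```python
# Minimal sketch of coordinate-based ALE (Activation Likelihood Estimation).
# The toy voxel grid, smoothing width and example foci are illustrative
# assumptions, not values from the thesis or from any ALE software.
import numpy as np

GRID = (20, 20, 20)   # toy voxel grid (real analyses use standard brain space)
SIGMA = 2.0           # Gaussian smoothing width in voxels (assumed)

def focus_map(focus):
    """Probability map for one reported activation focus (3D Gaussian)."""
    ii, jj, kk = np.indices(GRID)
    d2 = (ii - focus[0])**2 + (jj - focus[1])**2 + (kk - focus[2])**2
    return np.exp(-d2 / (2 * SIGMA**2))

def modeled_activation(study_foci):
    """Per-study map: probability that at least one focus activates a voxel."""
    return 1.0 - np.prod([1.0 - focus_map(f) for f in study_foci], axis=0)

def ale(studies):
    """ALE map: probabilistic union of the per-study modeled activation maps."""
    return 1.0 - np.prod([1.0 - modeled_activation(s) for s in studies], axis=0)

# Two hypothetical studies, each reporting a few activation foci (voxel indices).
studies = [[(5, 5, 5), (6, 5, 4)], [(5, 6, 5), (14, 10, 9)]]
ale_map = ale(studies)
print(ale_map.max(), np.unravel_index(ale_map.argmax(), GRID))
```

In real analyses the resulting ALE values are then tested against a null distribution of randomly relocated foci; that inference step is omitted here.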
Abstract:
Urban ecology is a new field of research that seeks to understand the structure and patterns of communities and ecosystems located in urban landscapes. Small waterbodies are known to be aquatic ecosystems that can harbour considerable biodiversity for several taxonomic groups (birds, amphibians, macroinvertebrates), which makes them interesting ecosystems for conservation studies. However, the biodiversity of zooplankton, a central element of aquatic food webs, is not fully known for urban waterbodies and should be better described and understood. This study assessed the biodiversity patterns of zooplankton communities in urban waterbodies on the Island of Montreal and their sources of variation. Suggestions for biodiversity assessment and conservation are also discussed. The zooplankton biodiversity of urban waterbodies proved to be quite high, with cladocerans and rotifers showing the highest contributions to gamma and beta diversity. Across the set of waterbodies, there was a negative correlation between the beta-diversity contributions of cladocerans and rotifers. Within each waterbody, the littoral zone colonized by macrophytes proved to be an important habitat for zooplankton biodiversity, contributing considerably to taxon richness, often with a different species composition. Zooplankton communities responded to bottom-up and top-down factors, but also to maintenance practices, since draining the waterbodies in winter affects the composition of zooplankton communities. The cladoceran communities in these waterbodies held variable amounts of phylogenetic diversity, which makes it possible to rank sites in order to prioritize those to be preserved with respect to phylogenetic diversity. The choice of sites to preserve in order to maximize phylogenetic diversity should be properly established so as to avoid suboptimal choices. However, for taxa such as cladocerans, for which phylogenetic relationships remain difficult to establish, placing absolute confidence in a single tree is a risky procedure. Incorporating phylogenetic uncertainty showed that, once it is taken into account, several potential differences in phylogenetic diversity are no longer supported. Community composition patterns differed among waterbodies, months and sampling zones; given that the interactions between these factors are significant, all of them should be considered. Urbanization did not appear to select for a single type of feeding-group composition, since communities could shift between assemblages of different feeding types. Environmental variables, especially macrophyte cover of the waterbody, were important factors for zooplankton biodiversity, affecting the species richness of various taxonomic and feeding groups. These variables also affected community composition, but to a lesser extent, being modest explanatory variables, which would indicate the need to consider other processes.
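The abstract above quantifies contributions to beta diversity for different zooplankton groups. One common way to compute such contributions is the total-variance decomposition of a community matrix (Legendre & De Cáceres 2013); the hedged sketch below illustrates that calculation on an invented site-by-taxon table, and the Hellinger transformation is an assumption rather than necessarily the choice made in the thesis.

```python
# Minimal sketch of beta-diversity contributions via total-variance decomposition
# (Legendre & De Caceres 2013). The toy community matrix and the Hellinger
# transformation are illustrative assumptions, not data or choices from the thesis.
import numpy as np

# Rows = waterbodies (sites), columns = zooplankton taxa, values = abundances.
Y = np.array([
    [10, 0, 3, 1],
    [ 2, 5, 0, 8],
    [ 0, 7, 6, 0],
    [ 4, 1, 2, 9],
], dtype=float)

# Hellinger transformation: square root of relative abundances per site.
Yh = np.sqrt(Y / Y.sum(axis=1, keepdims=True))

# Centre by column means and compute squared deviations.
S = (Yh - Yh.mean(axis=0))**2
SS_total = S.sum()                       # total beta diversity (sum of squares)

LCBD = S.sum(axis=1) / SS_total          # local (per-site) contributions
SCBD = S.sum(axis=0) / SS_total          # per-taxon contributions

print("Total SS (beta diversity):", round(SS_total, 4))
print("Site contributions  (LCBD):", np.round(LCBD, 3))
print("Taxon contributions (SCBD):", np.round(SCBD, 3))
```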
Abstract:
Magnetism and magnetic materials have been an ever-attractive subject area for engineers and scientists alike because of their versatility in finding applications in useful devices. They find applications in a host of devices ranging from rudimentary devices like loudspeakers to sophisticated gadgets like waveguides and Magnetic Random Access Memories (MRAM). The one material in the realm of magnetism that has been at the centre stage of applications is the ferrite, and among ferrites the spinel ferrites have received the lion's share as far as practical applications are concerned. It has been the endeavour of scientists and engineers to remove obsolescence and improve upon existing materials so as to save energy and integrate them into various other systems. This has been the hallmark of materials scientists and has led to new materials and new technologies. In the field of ferrites too there has been considerable interest in devising new materials based on iron oxides and other compounds. This means synthesising ultrafine particles and tuning their properties to devise new materials. There are various preparation techniques, ranging from top-down to bottom-up approaches, including synthesis at the molecular level, self-assembly, gas-based condensation, low-temperature co-precipitation, the sol-gel process and high-energy ball milling. Among these methods, the sol-gel process allows good control of the properties of ceramic materials. The advantages of this method include processing at low temperature, mixing at the molecular level and the fabrication of novel materials for various devices. Composites are materials which combine the good qualities of one or more components. They can be prepared in situ or by mechanical means through the incorporation of fine particles in appropriate matrixes. The size of the magnetic powders as well as the nature of the matrix affect the processability and other physical properties of the final product. These plastic/rubber magnets can in turn be useful for various applications in different devices. In applications involving ferrites at high frequencies, it is essential that the material possesses an appropriate dielectric permittivity and suitable magnetic permeability. This can be achieved by synthesizing rubber ferrite composites (RFCs). RFCs are very useful materials for microwave absorption. Hence the synthesis of ferrites in the nanoregime, investigations of size effects on their structural, magnetic and electrical properties, and the incorporation of these ferrites into polymer matrixes assume significance. In the present study, nanoparticles of NiFe2O4, Li0.5Fe2.5O4 and CoFe2O4 are prepared by the sol-gel method. By appropriate heat treatments, particles of different grain sizes are obtained. The structural, magnetic and electrical properties are evaluated as a function of grain size and temperature. NiFe2O4 prepared in the ultrafine regime is then incorporated in a nitrile rubber matrix. The incorporation was carried out according to a specific recipe and for various loadings of magnetic filler. The cure characteristics, magnetic properties, electrical properties and mechanical properties of these elastomer blends are evaluated. Measurements of the electrical permittivity of all the rubber samples in the X-band are also carried out.
Abstract:
Nanoparticles are of immense importance from both the fundamental and application points of view. They exhibit quantum size effects which are manifested in their improved magnetic and electric properties. Mechanical attrition by high energy ball milling (HEBM) is a top-down process for producing fine particles. However, fineness is associated with high surface area and hence a proneness to oxidation, which has a detrimental effect on the useful properties of these materials. Passivation of nanoparticles is known to inhibit surface oxidation. At the same time, coating a polymer film on inorganic materials modifies the surface properties drastically. In this work a modified set-up based on an RF plasma polymerization technique is employed to coat a thin polymer film on Fe nanoparticles produced by HEBM. Ball-milled particles in different particle size ranges are coated with polyaniline. Their electrical properties are investigated by measuring the dc conductivity in the temperature range 10–300 K. The low-temperature current–voltage (I–V) characteristics exhibited nonlinearity, which is explained on the basis of the critical path model. There is clear-cut evidence for the occurrence of intergranular tunnelling. The results are presented in this paper.
Abstract:
In today's semiconductor and MEMS technologies, photolithography is the workhorse for the fabrication of functional devices. The conventional route of microstructuring (the so-called top-down approach) starts with photolithography, followed by patterning of the structures by etching, especially dry etching. The demand for smaller and hence faster devices leads to a decrease of the feature size into the range of a few nanometers. However, the production of devices at this scale requires photolithography equipment that must overcome the diffraction limit. New photolithography techniques have therefore been developed recently, but they are rather expensive and restricted to planar surfaces. Recently a new route has been presented, the so-called bottom-up approach, in which functional devices are built up starting from single atoms or molecules. This creates a new field, nanotechnology, which deals with structures of dimensions of 1–100 nm and which, through its integral part, self-assembly, has the potential to replace conventional photolithography. However, this technique requires additional and special equipment and is therefore not yet widely applicable. This work presents a general scheme for the fabrication of silicon and silicon dioxide structures with lateral dimensions of less than 100 nm that avoids high-resolution photolithography processes. For the self-aligned formation of extremely small openings in silicon dioxide layers at surface structures sharpened in depth, the angle-dependent etching rate distribution of silicon dioxide under plasma etching with a fluorocarbon gas (CHF3) was exploited. Subsequent anisotropic plasma etching of the silicon substrate material through the perforated silicon dioxide masking layer results in high-aspect-ratio trenches of approximately the same lateral dimensions. The latter can be reduced and precisely adjusted between 0 and 200 nm by thermal oxidation of the silicon structures, owing to the volume expansion of silicon during oxidation. On this basis, a technology for the fabrication of SNOM calibration standards is presented. Additionally, the so-formed trenches were used as templates for CVD deposition of diamond, resulting in high-aspect-ratio diamond knives. A lithography-free method for the production of periodic and non-periodic surface structures using the angular dependence of the etching rate is also presented. It combines the self-assembly of masking particles with the conventional plasma etching techniques known from microelectromechanical system technology. The method is generally applicable to bulk as well as layered materials. In this work, layers of glass spheres of different diameters were assembled on the sample surface, forming a mask against plasma etching. Silicon surface structures with a periodicity of 500 nm and feature dimensions of 20 nm were produced in this way. Thermal oxidation of the structured silicon substrate offers the capability to vary the fill factor of the periodic structure, owing to the volume expansion during oxidation, but also to define silicon dioxide surface structures by selective plasma etching. Similar structures can be obtained simply by structuring silicon dioxide layers on silicon. The method offers a simple route for bridging nano- and microtechnology and, moreover, an uncomplicated way to fabricate photonic crystals.
Abstract:
The globally operating social investment funds have their conceptual origin in the Emergency Social Fund first implemented in Bolivia in 1986. The fund's aim was the rapid and focused social cushioning of the neoliberal structural adjustment programmes implemented in the Bolivian context since 1985 under the "aegis" of the international financial organizations. As a result of the largely positive experiences of the first fund generation and the continuous further development of the national fund structure over the last two decades, the Bolivian model remains of particular importance for the future design of social fund structures. Drawing on the various generations of Bolivian social funds, this working paper focuses on the interactions and interdependencies that emerge at the international, national and local levels. In the context of the Bolivian social fund, a considerable transfer of competences in favour of the international level can be observed, which may well be described as a "denationalization" of national social policy. At the same time, the present study shows that the national level was partially able to compensate for this redistribution of traditionally nation-state responsibilities through both gains in legitimacy and political room for manoeuvre. With regard to the local level, however, classic top-down logics dominated. The local level thus appeared less as a shaping actor than as a passive addressee of social policy priority-setting.
Abstract:
Ontologies have been established for knowledge sharing and are widely used as a means for conceptually structuring domains of interest. With the growing usage of ontologies, the problem of overlapping knowledge in a common domain becomes critical. In this short paper, we address two methods for merging ontologies based on Formal Concept Analysis: FCA-Merge and ONTEX. --- FCA-Merge is a method for merging ontologies following a bottom-up approach which offers a structural description of the merging process. The method is guided by application-specific instances of the given source ontologies. We apply techniques from natural language processing and formal concept analysis to derive a lattice of concepts as a structural result of FCA-Merge. The generated result is then explored and transformed into the merged ontology with human interaction. --- ONTEX is a method for systematically structuring the top level of ontologies. It is based on an interactive, top-down knowledge acquisition process, which assures that the knowledge engineer considers all possible cases while avoiding redundant acquisition. The method is especially suited for creating/merging the top part(s) of the ontologies, where high accuracy is required, and for supporting the merging of two (or more) ontologies at that level.
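To make the Formal Concept Analysis step underlying FCA-Merge concrete, the sketch below enumerates the formal concepts of a small binary context by brute force; the lattice order of these concepts is what FCA-Merge explores with the knowledge engineer. The toy context of documents and ontology terms is an invented illustration, not data from the FCA-Merge or ONTEX papers.

```python
# Minimal sketch of Formal Concept Analysis: enumerate the formal concepts of a
# small binary context by brute force. The toy context (documents x terms) is an
# illustrative assumption, not data from the FCA-Merge or ONTEX papers.
from itertools import combinations

objects = ["doc1", "doc2", "doc3", "doc4"]
attributes = ["Vehicle", "Car", "Boat"]
incidence = {                      # which attributes each object has
    "doc1": {"Vehicle", "Car"},
    "doc2": {"Vehicle", "Boat"},
    "doc3": {"Vehicle", "Car"},
    "doc4": {"Vehicle"},
}

def common_attributes(objs):
    """Attributes shared by all objects in objs (the derivation operator A')."""
    sets = [incidence[o] for o in objs] or [set(attributes)]
    return set.intersection(*sets)

def common_objects(attrs):
    """Objects having all attributes in attrs (the derivation operator B')."""
    return {o for o in objects if attrs <= incidence[o]}

# A formal concept is a pair (extent, intent) closed under the two operators.
concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(objects, r):
        intent = frozenset(common_attributes(objs))
        extent = frozenset(common_objects(intent))
        concepts.add((extent, intent))

for extent, intent in sorted(concepts, key=lambda c: (-len(c[0]), sorted(c[1]))):
    print(sorted(extent), "|", sorted(intent))
```

Each printed pair is a formal concept; ordering the concepts by extent inclusion yields the concept lattice that is then transformed into the merged ontology.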
Abstract:
The principal objective of this paper is to develop a methodology for the formulation of a master plan for renewable-energy-based electricity generation in The Gambia, Africa. Such a master plan aims to develop and promote renewable sources of energy as an alternative to conventional forms of energy for generating electricity in the country. A tailor-made methodology for the preparation of a 20-year renewable energy master plan focussed on electricity generation is proposed, and is followed and verified throughout the present dissertation as it is applied to The Gambia. The main inputs for the proposed master plan are (i) an energy demand analysis and forecast over 20 years and (ii) a resource assessment for different renewable energy alternatives, including their related power supply options. The energy demand forecast is based on a mix of Top-Down and Bottom-Up methodologies, and its results provide important data on future requirements for (primary) energy sources. The electricity forecast is separated into projections at the sent-out level and at the end-user level. On the supply side, solar, wind and biomass energy sources are investigated in terms of technical potential and economic benefits for The Gambia; other criteria (i.e. environmental and social) are not considered in the evaluation. Diverse supply options are proposed and technically designed based on the assessed renewable energy potential. This process includes the evaluation of the different available conversion technologies and finishes with the dimensioning of power supply solutions, taking into consideration technologies which are applicable and appropriate under the special conditions of The Gambia. The balance of these two inputs (demand and supply) gives a quantitative indication of the substitution potential of renewable energy generation alternatives in primarily fossil-fuel-based electricity generation systems, as well as of the fuel savings due to the deployment of renewable resources. Afterwards, the identified renewable energy supply options are ranked according to the outcomes of an economic analysis. Based on this ranking, and other considerations, a 20-year investment plan, broken down into five-year investment periods, is prepared; it consists of individual renewable energy projects for electricity generation, essentially on-grid applications. Finally, a priority project from the master plan portfolio is selected for deeper analysis. Since solar PV is the most relevant proposed technology, a PV power plant integrated into the fossil-fuel-powered main electrical system in The Gambia is considered as the priority project. This project is analysed for economic competitiveness under current conditions, complemented by a sensitivity analysis with regard to future oil and new-technology market conditions.
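Since the demand forecast above mixes top-down and bottom-up methodologies, the sketch below shows, in a purely illustrative way, how a top-down projection driven by macro growth assumptions can be blended with a bottom-up sum of sectoral end-use projections over a 20-year horizon. Every figure and the equal-weight blend are assumptions and bear no relation to the actual Gambian data.

```python
# Minimal sketch of blending a top-down and a bottom-up electricity demand
# forecast over a 20-year horizon. All numbers (base demand, growth rates,
# sector data, equal weighting) are illustrative assumptions only.
YEARS = 20

# Top-down: scale total sent-out demand with assumed GDP and population growth.
base_demand_gwh = 300.0
gdp_growth, elasticity, pop_growth = 0.05, 0.8, 0.03
top_down = [base_demand_gwh * ((1 + gdp_growth * elasticity + pop_growth) ** t)
            for t in range(YEARS + 1)]

# Bottom-up: sum end-use consumption per sector, each with its own growth path.
sectors = {                     # (base GWh, annual growth) -- assumed figures
    "residential": (150.0, 0.06),
    "commercial":  (90.0, 0.05),
    "industrial":  (60.0, 0.04),
}
bottom_up = [sum(base * (1 + g) ** t for base, g in sectors.values())
             for t in range(YEARS + 1)]

# Blend the two projections (here: simple average) to get the planning forecast.
forecast = [(td + bu) / 2 for td, bu in zip(top_down, bottom_up)]
for t in (0, 5, 10, 20):
    print(f"year {t:2d}: top-down {top_down[t]:7.1f}  "
          f"bottom-up {bottom_up[t]:7.1f}  blended {forecast[t]:7.1f} GWh")
```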
Abstract:
The ongoing growth of the World Wide Web, catalyzed by the increasing possibility of ubiquitous access via a variety of devices, continues to strengthen its role as our prevalent information and communication medium. However, although tools like search engines facilitate retrieval, the task of finally making sense of Web content is still often left to human interpretation. The vision of supporting both humans and machines in such knowledge-based activities led to the development of different systems which allow Web resources to be structured by metadata annotations. Interestingly, two major approaches which gained a considerable amount of attention address the problem from nearly opposite directions: On the one hand, the idea of the Semantic Web suggests formalizing the knowledge within a particular domain by means of the "top-down" approach of defining ontologies. On the other hand, Social Annotation Systems, as part of the so-called Web 2.0 movement, implement a "bottom-up" style of categorization using arbitrary keywords. Experience as well as research into the characteristics of both systems has shown that their strengths and weaknesses seem to be inverse: while Social Annotation suffers from problems like, e.g., ambiguity or lack of precision, ontologies were especially designed to eliminate those; the latter, on the contrary, suffer from a knowledge acquisition bottleneck, which is successfully overcome by the large user populations of Social Annotation Systems. Instead of being regarded as competing paradigms, the obvious potential synergies from a combination of both motivated approaches to "bridge the gap" between them. These were fostered by the evidence of emergent semantics, i.e., the self-organized evolution of implicit conceptual structures, within Social Annotation data. While several techniques to exploit the emergent patterns have been proposed, a systematic analysis, especially regarding paradigms from the field of ontology learning, is still largely missing. This also includes a deeper understanding of the circumstances which affect the evolution processes. This work aims to address this gap by providing an in-depth study of methods and influencing factors to capture emergent semantics from Social Annotation Systems. We focus hereby on the acquisition of lexical semantics from the underlying networks of keywords, users and resources. Structured along different ontology learning tasks, we use a methodology of semantic grounding to characterize and evaluate the semantic relations captured by different methods. In all cases, our studies are based on datasets from several Social Annotation Systems. Specifically, we first analyze semantic relatedness among keywords and identify measures which detect different notions of relatedness. These constitute the input of concept learning algorithms, which then focus on the discovery of synonymous and ambiguous keywords; here we assess the usefulness of various clustering techniques. As a prerequisite to inducing hierarchical relationships, our next step is to study measures which quantify the level of generality of a particular keyword. We find that comparatively simple measures can approximate the generality information encoded in reference taxonomies. These insights inform the final task, namely the creation of concept hierarchies, for which generality-based algorithms exhibit advantages compared to clustering approaches.
In order to complement the identification of suitable methods to capture semantic structures, we next analyze several factors which influence their emergence. Empirical evidence is provided that the amount of available data plays a crucial role in determining keyword meanings. From a different perspective, we examine pragmatic aspects by considering different annotation patterns among users. Based on a broad distinction between "categorizers" and "describers", we find that the latter produce more accurate results. This suggests a causal link between pragmatic and semantic aspects of keyword annotation. As a special kind of usage pattern, we then look at system abuse and spam. While observing a mixed picture, we suggest that an individual decision should be taken instead of disregarding spammers as a matter of principle. Finally, we discuss a set of applications which operationalize the results of our studies for enhancing both Social Annotation and semantic systems. These comprise on the one hand tools which foster the emergence of semantics, and on the other hand applications which exploit the socially induced relations to improve, e.g., searching, browsing, or user profiling facilities. In summary, the contributions of this work highlight viable methods and crucial aspects for designing enhanced knowledge-based services of a Social Semantic Web.
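To make the notion of keyword relatedness studied above concrete, the following sketch computes one simple relatedness measure, cosine similarity over tag-tag co-occurrence counts derived from (user, resource, tag) triples. This is only one of several possible measures of the kind such a study compares, and the toy triples are invented for illustration.

```python
# Minimal sketch of keyword relatedness in a Social Annotation System:
# cosine similarity of tag-tag co-occurrence vectors built from
# (user, resource, tag) triples. The toy triples are illustrative assumptions.
import math
from collections import defaultdict
from itertools import combinations

triples = [                      # (user, resource, tag) -- invented example data
    ("u1", "r1", "python"), ("u1", "r1", "programming"),
    ("u2", "r1", "python"), ("u2", "r1", "snake"),
    ("u1", "r2", "programming"), ("u1", "r2", "java"),
    ("u3", "r3", "snake"), ("u3", "r3", "reptile"),
]

# Tags assigned together in the same (user, resource) post co-occur.
posts = defaultdict(set)
for user, resource, tag in triples:
    posts[(user, resource)].add(tag)

cooc = defaultdict(lambda: defaultdict(int))
for tags in posts.values():
    for a, b in combinations(sorted(tags), 2):
        cooc[a][b] += 1
        cooc[b][a] += 1

def relatedness(t1, t2):
    """Cosine similarity between the co-occurrence vectors of two tags."""
    v1, v2 = cooc[t1], cooc[t2]
    dot = sum(v1[k] * v2[k] for k in set(v1) & set(v2))
    n1 = math.sqrt(sum(x * x for x in v1.values()))
    n2 = math.sqrt(sum(x * x for x in v2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0

print(relatedness("python", "java"))     # related via the shared tag "programming"
print(relatedness("python", "reptile"))  # related via the shared tag "snake"
```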
Abstract:
This working paper discusses top-down initiated climate adaptation policy in Nicaragua and presents alternative actors for raising awareness of climate change and forms of participation in rural areas. Building on earlier studies, it is assumed that the top-down initiated forms of participation in the Central American country do not create equal opportunities for members of society to take part in political negotiations, and that structural mechanisms of exclusion can only be changed by the residents themselves. The working paper takes up these findings and analyses them with respect to the question of the potential of grassroots organizations to create fairer access for rural residents to political negotiations and/or decision-making processes. Empirically, the study is based on an examination of the contribution of two grassroots organizations to a procedurally just climate policy that reduces the exclusion mechanisms affecting rural individuals and groups and incorporates the local climate change experiences of different social groups.
Abstract:
Using the case of an economically declining neighbourhood in the post-industrial German Ruhr Area (sometimes characterized as Germany's "Rust Belt"), we analyse, describe and conclude how urban agriculture can be used as a catalyst to stimulate and support urban renewal and regeneration, especially from a socio-cultural perspective. Using the methodological framework of participatory action research, and linking bottom-up and top-down planning approaches, a project path was developed to include the affected population and foster individual responsibility for their district, as well as to strengthen inhabitants and stakeholder groups in a permanent collective stewardship for the individual forms of urban agriculture developed and implemented. On a more abstract level, the research carried out can be characterized as a form of action research with an intended transgression of the boundaries between research, planning, design, and implementation. We conclude that by synchronously combining those four domains with intense feedback loops, synergies can be achieved for the academic knowledge on the potential performance of urban agriculture in terms of sustainable development, as well as benefits for the case-study area and the interests of individual urban gardeners.
Abstract:
In a recent experiment, Freedman et al. recorded from inferotemporal (IT) and prefrontal cortices (PFC) of monkeys performing a "cat/dog" categorization task (Freedman 2001 and Freedman, Riesenhuber, Poggio, Miller 2001). In this paper we analyze the tuning properties of view-tuned units in our HMAX model of object recognition in cortex (Riesenhuber 1999) using the same paradigm and stimuli as in the experiment. We then compare the simulation results to the monkey inferotemporal neuron population data. We find that view-tuned model IT units that were trained without any explicit category information can show category-related tuning as observed in the experiment. This suggests that the tuning properties of experimental IT neurons might primarily be shaped by bottom-up stimulus-space statistics, with little influence of top-down task-specific information. The population of experimental PFC neurons, on the other hand, shows tuning properties that cannot be explained just by stimulus tuning. These analyses are compatible with a model of object recognition in cortex (Riesenhuber 2000) in which a population of shape-tuned neurons provides a general basis for neurons tuned to different recognition tasks.
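A view-tuned unit of the kind analysed here can be abstracted as a Gaussian (RBF) tuning function centred on a stored training view, with no category label involved in its training. The hedged sketch below illustrates how category-related tuning can nonetheless emerge from stimulus-space statistics alone; the feature vectors, prototypes and tuning width are invented stand-ins for C2-like features and do not reproduce the HMAX model itself.

```python
# Minimal sketch of a view-tuned unit as used in HMAX-style models: a Gaussian
# (RBF) response centred on a stored training view, trained without any category
# label. Feature vectors and the tuning width SIGMA are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
DIM, SIGMA = 16, 1.0

class ViewTunedUnit:
    def __init__(self, stored_view):
        self.w = np.asarray(stored_view, dtype=float)   # preferred stimulus

    def response(self, features):
        d2 = np.sum((np.asarray(features, dtype=float) - self.w) ** 2)
        return np.exp(-d2 / (2 * SIGMA**2))             # Gaussian tuning

# Hypothetical feature vectors for two stimulus classes ("cats"/"dogs"):
# each class clusters around its own prototype in feature space.
cat_proto, dog_proto = rng.normal(size=DIM), rng.normal(size=DIM)
cats = [cat_proto + 0.3 * rng.normal(size=DIM) for _ in range(20)]
dogs = [dog_proto + 0.3 * rng.normal(size=DIM) for _ in range(20)]

# Unit "trained" on one cat exemplar only -- no category information used.
unit = ViewTunedUnit(cats[0])
mean_cat = np.mean([unit.response(x) for x in cats])
mean_dog = np.mean([unit.response(x) for x in dogs])
print(f"mean response to cats: {mean_cat:.3f}, to dogs: {mean_dog:.3f}")
# Category-related tuning can emerge purely from stimulus-space statistics.
```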
Abstract:
We propose a probabilistic object classifier for outdoor scene analysis as a first step towards solving the problem of scene context generation. The method begins with top-down control, which uses previously learned models (appearance and absolute location) to obtain an initial pixel-level classification. This information provides us with the cores of objects, which are used to acquire more accurate object models. Growing these cores via class-specific active regions then allows us to obtain an accurate recognition of known regions. Next, a general segmentation stage provides the segmentation of unknown regions by a bottom-up strategy. Finally, the last stage performs a region fusion of known and unknown segmented objects. The result is both a segmentation of the image and a recognition of each segment as a given object class or as an unknown segmented object. Furthermore, experimental results are shown and evaluated to prove the validity of our proposal.
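The initial top-down, pixel-level classification described above can be illustrated with a minimal probabilistic sketch that combines a learned appearance model with an absolute-location prior and takes the per-pixel MAP label. The Gaussian colour models, location priors and class names below are invented for illustration and do not reproduce the authors' models.

```python
# Minimal sketch of top-down pixel-level classification combining a learned
# appearance model (colour likelihood) with an absolute-location prior per class.
# The Gaussian colour models, location priors and class names are illustrative
# assumptions, not the authors' exact models.
import numpy as np

H, W = 60, 80
classes = ["sky", "vegetation", "road"]

# Appearance model: one isotropic Gaussian per class over RGB values.
appearance_mean = {
    "sky":        np.array([0.5, 0.7, 1.0]),
    "vegetation": np.array([0.2, 0.6, 0.2]),
    "road":       np.array([0.4, 0.4, 0.4]),
}
VAR = 0.05

# Absolute-location prior: probability of each class as a function of image row.
rows = np.linspace(0.0, 1.0, H)[:, None] * np.ones((H, W))
location_prior = {
    "sky":        np.clip(1.0 - rows, 1e-3, None),   # more likely near the top
    "vegetation": np.clip(1.0 - np.abs(rows - 0.5), 1e-3, None),
    "road":       np.clip(rows, 1e-3, None),         # more likely near the bottom
}

def classify(image):
    """Per-pixel MAP label from appearance likelihood x location prior."""
    scores = []
    for c in classes:
        d2 = np.sum((image - appearance_mean[c]) ** 2, axis=-1)
        log_lik = -d2 / (2 * VAR)                     # Gaussian log-likelihood
        scores.append(log_lik + np.log(location_prior[c]))
    return np.argmax(np.stack(scores), axis=0)        # index into `classes`

# Toy image: blue-ish top half, grey-ish bottom half.
image = np.zeros((H, W, 3))
image[:H // 2] = [0.5, 0.7, 1.0]
image[H // 2:] = [0.4, 0.4, 0.4]
labels = classify(image)
print({c: int((labels == i).sum()) for i, c in enumerate(classes)})
```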
Abstract:
The public policy of Educación Media Articulada con la Educación Superior (upper secondary education articulated with higher education) in Bogotá, implemented during the mayoralties of Luis Eduardo Garzón and Samuel Moreno, aims to reduce the lack of opportunities for Bogotá's secondary-school graduates in accessing higher education, through the transformation of schools at the pedagogical, administrative, physical and organizational levels so that tenth- and eleventh-grade students take higher education credits at their schools while completing their upper secondary education. Notwithstanding the laudable objectives of the policy, its implementation was accompanied by several criticisms and negative effects. This is the topic of this monograph, which aims to analyse the development of the policy, in its formulation, implementation and evaluation, in the light of theoretical approaches to public policy (top-down and bottom-up, among others).