996 results for Relative degree n


Relevance:

20.00%

Publisher:

Abstract:

The length-weight relationship and relative condition factor of the shovel-nose catfish, Arius subrostratus (Valenciennes, 1840), from Champakkara backwater were studied by examining 392 specimens collected from June to September 2008. The fishes ranged from 6 to 29 cm in total length and from 5.6 to 218 g in weight. The relationship between total length and weight of Arius subrostratus is described as log W = -1.530 + 2.6224 log L for males, log W = -2.131 + 3.0914 log L for females, and log W = -1.742 + 2.8067 log L for the sexes combined. The mean relative condition factor (Kn) ranged from 0.75 to 1.07 for males, 0.944 to 1.407 for females and 0.96 to 1.196 for the sexes combined. The length-weight relationship and relative condition factor indicate that A. subrostratus is in good condition. Morphometric measurements of various body parts and meristic counts were recorded. The morphometric relationships were found to be non-linear, and no significant difference was observed between the two sexes. From the present investigation, the fin formula can be written as D: I, 7; P: I, 12; A: 17-20; C: 26-32. Meristic counts did not change with increasing body length.
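The combined-sexes regression above can be used to compute the relative condition factor Kn (observed weight divided by the length-predicted weight). A minimal sketch; the 20 cm, 80 g specimen below is hypothetical, chosen only to illustrate the calculation:

```python
import math

# Length-weight relationship reported for combined sexes of Arius subrostratus:
#   log10(W) = -1.742 + 2.8067 * log10(L)
A, B = -1.742, 2.8067

def predicted_weight(length_cm):
    """Expected weight (g) for a fish of the given total length (cm)."""
    return 10 ** (A + B * math.log10(length_cm))

def relative_condition_factor(observed_weight_g, length_cm):
    """Kn = observed weight / length-predicted weight."""
    return observed_weight_g / predicted_weight(length_cm)

# Hypothetical 20 cm specimen weighing 80 g:
kn = relative_condition_factor(80.0, 20.0)
```

A Kn near 1 indicates that the specimen's weight matches what the population-level relationship predicts for its length, consistent with the "good condition" conclusion above.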

Abstract:

Restarting automata can be seen as analytical variants of classical automata as well as of regulated rewriting systems. We study a measure for the degree of nondeterminism of (context-free) languages in terms of deterministic restarting automata that are (strongly) lexicalized. This measure is based on the number of auxiliary symbols (categories) used for recognizing a language as the projection of its characteristic language onto its input alphabet. This type of recognition is typical for analysis by reduction, a method used in linguistics for the creation and verification of formal descriptions of natural languages. Our main results establish a hierarchy of classes of context-free languages and two hierarchies of classes of non-context-free languages that are based on the expansion factor of a language.

Abstract:

Let G be a finite group and K a number field or a p-adic field with ring of integers O_K. In the first part of the manuscript we present an algorithm that computes the relative algebraic K-group K_0(O_K[G], K) as an abstract abelian group. We also solve the discrete logarithm problem, both in K_0(O_K[G], K) and in the locally free class group cl(O_K[G]). All algorithms have been implemented in MAGMA for the case K = ℚ. In the second part of the manuscript we prove formulae for the torsion subgroup of K_0(ℤ[G], ℚ) for large classes of dihedral and quaternion groups.

Abstract:

This thesis addresses the main aspects of detecting medically relevant trace molecules by intracavity laser absorption spectroscopy (ICLAS), namely the requirements for a highly sensitive, highly selective, low-cost and compact sensor. A novel two-mode semiconductor laser sensor is demonstrated. Its operating principle is based on the competition between the two modes: the sensor sensitivity is improved when the sample is placed inside the two-mode laser cavity, where mode competition takes place. The effects of mode competition in ICLAS are discussed theoretically and experimentally. The sensor selectivity is enhanced using an external cavity diode laser (ECDL) configuration, in which the tuning range depends only on the external cavity configuration. To reduce the sensor cost considerably, relative intensity noise (RIN) is chosen for monitoring the intensity ratio of the two modes. RIN is found to be an excellent indicator of variations in the two-mode intensity ratio, which strongly supports the sensor methodology. Moreover, wavelength tuning is found to have no effect on the RIN spectrum, which is very beneficial for the proposed detection principle. To demonstrate the sensor for medical applications, the absorption line of an anesthetic, propofol, dissolved in various solvents, is measured, with RIN monitoring the sensor response. From the measured spectra, the sensitivity enhancement factor of the sensor is found to be of the order of 10^3 relative to conventional laser spectroscopy.

Abstract:

Regional labor markets differ considerably with respect to key indicators such as the unemployment rate, the wage level, and employment growth. Because of their persistence, these differences are highly relevant for policy. The economics literature already provides theoretical models for analyzing regional labor markets. As a rule, however, these models are not suited to explaining regional labor market differences endogenously. This means that the differences between regional labor markets generally do not emerge from the model relationships themselves but have to be imposed "from outside". The empirical literature suggests that the differences between regional labor markets can be traced back to the level of regional labor demand. Labor demand, in turn, derives from the goods markets: how many workers are needed depends on the development of the regional goods markets. It follows that the causes of regional labor market differences are to be sought in the differences between regional goods markets. The latter are studied in the literature on the New Economic Geography (NEG). The NEG literature explains differences between regional goods markets by contrasting centripetal and centrifugal forces. Centripetal forces are those that push toward the agglomeration of economic activity. At the center of this discussion stands, above all, market potential: firms prefer to locate close to large markets, while workers prefer regions that offer them suitable employment prospects. Together, these form a self-reinforcing process that leads to the agglomeration of economic activity. They are opposed by centrifugal forces, which push toward a more even distribution of economic activity.
These arise, for example, from immobile factors of production or from congestion costs such as pollution, traffic jams, or high rents. If the centripetal forces are sufficiently strong, centers emerge in which economic activity is concentrated while the periphery thins out. The extent to which this happens depends on the balance between the two forces. The NEG literature usually concentrates on differences between regional goods markets and assumes perfect labor markets without unemployment. It therefore typically cannot explain the emergence and persistence of regional labor market differences. This is where the dissertation comes in. It extends the NEG with labor market frictions in order to explain the emergence and persistence of regional labor market differences. To do so, it draws on an empirical regularity: numerous studies document a negative relationship between wages and unemployment. In regions where unemployment is high, the wage level is low, and vice versa. This relationship is known as the wage curve. At the regional level, the wage curve can be explained by efficiency wage theory, which serves as the theoretical foundation of the dissertation. If economic activity concentrates in one region because of the centripetal forces, labor demand in this center is higher. The center thus occupies a favorable position on the wage curve, with low unemployment and a high wage level. Conversely, the periphery finds itself in an unfavorable position, with high unemployment and a low wage level. The wage curve can shift, however, depending on the degree of agglomeration. The complex interplay of endogenous agglomeration with labor market frictions can then produce different patterns of regional labor market disparities.
The dissertation shows how the interplay of the NEG with efficiency wages gives rise to regional labor market disparities. Theoretical models are formulated that explain these interactions and extend the existing literature through specific contributions. In addition, the central arguments of the theory are subjected to an empirical test. It can be shown that the central argument, the positive effect of market potential on labor demand, is relevant. Finally, policy implications are derived and directions for further research are identified.
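The wage curve invoked above is commonly estimated in log-log form; the elasticity of roughly 0.1 is the standard empirical benchmark reported by Blanchflower and Oswald:

```latex
% Wage curve: regional wages fall as regional unemployment rises.
\ln w_r = \alpha - \eta \, \ln u_r, \qquad \eta \approx 0.1,
```

where $w_r$ is the wage level and $u_r$ the unemployment rate of region $r$: a region with twice the unemployment of another has a wage level roughly 10% lower.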

Abstract:

The traditional task of a central bank is to preserve price stability and, in doing so, not to impair the real economy more than necessary. To meet this challenge, it is of great relevance whether inflation is only driven by inflation expectations and the current output gap or whether it is, in addition, influenced by past inflation. In the former case, as described by the New Keynesian Phillips curve, the central bank can immediately and simultaneously achieve price stability and equilibrium output, the so-called ‘divine coincidence’ (Blanchard and Galí 2007). In the latter case, the achievement of price stability is costly in terms of output and will be pursued over several periods. Similarly, it is important to distinguish this latter case, which describes ‘intrinsic’ inflation persistence, from that of ‘extrinsic’ inflation persistence, where the sluggishness of inflation is not a ‘structural’ feature of the economy but merely ‘inherited’ from the sluggishness of the other driving forces, inflation expectations and output. ‘Extrinsic’ inflation persistence is usually considered to be the less challenging case, as policy-makers are supposed to fight against the persistence in the driving forces, especially to reduce the stickiness of inflation expectations by a credible monetary policy, in order to reestablish the ‘divine coincidence’. The scope of this dissertation is to contribute to the vast literature and ongoing discussion on inflation persistence: Chapter 1 describes the policy consequences of inflation persistence and summarizes the empirical and theoretical literature. Chapter 2 compares two models of staggered price setting, one with a fixed two-period duration and the other with a stochastic duration of prices. I show that in an economy with a timeless optimizing central bank the model with the two-period alternating price-setting (for most parameter values) leads to more persistent inflation than the model with stochastic price duration. 
This result amends earlier work by Kiley (2002), who found that the model with stochastic price duration generates more persistent inflation in response to an exogenous monetary shock. Chapter 3 extends the two-period alternating price-setting model to the case of 3- and 4-period price durations. This results in a more complex Phillips curve with a negative impact of past inflation on current inflation. As simulations show, this multi-period Phillips curve generates too low a degree of autocorrelation and too-early turning points of inflation, and is outperformed by a simple Hybrid Phillips curve. Chapter 4 starts from the critique by Driscoll and Holden (2003) of the relative real-wage model of Fuhrer and Moore (1995). Taking seriously the critique that Fuhrer and Moore's model collapses to a much simpler one without intrinsic inflation persistence if one takes their arguments literally, I extend the model with a term for inequality aversion. This model extension is not only in line with experimental evidence but results in a Hybrid Phillips curve with inflation persistence that is observationally equivalent to that presented by Fuhrer and Moore (1995). In chapter 5, I present a model that makes it possible to study the relationship between fairness attitudes and time preference (impatience). In the model, two individuals take decisions in two subsequent periods. In period 1, both individuals are endowed with resources and are able to donate a share of their resources to the other individual. In period 2, the two individuals might join in a common production after having bargained over the split of its output. The size of the production output depends on the relative share of resources at the end of period 1, as the human capital of the individuals, which is built by means of their resources, cannot fully be substituted for one another.
Therefore, it might be rational for a well-endowed individual in period 1 to act in a seemingly ‘fair’ manner and to donate own resources to its poorer counterpart. This decision also depends on the individuals’ impatience which is induced by the small but positive probability that production is not possible in period 2. As a general result, the individuals in the model economy are more likely to behave in a ‘fair’ manner, i.e., to donate resources to the other individual, the lower their own impatience and the higher the productivity of the other individual. As the (seemingly) ‘fair’ behavior is modelled as an endogenous outcome and as it is related to the aspect of time preference, the presented framework might help to further integrate behavioral economics and macroeconomics.
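The distinction drawn above between purely forward-looking and intrinsically persistent inflation dynamics can be summarized by the two standard Phillips-curve forms (with $x_t$ the output gap):

```latex
% New Keynesian Phillips curve: no intrinsic persistence.
\pi_t = \beta \, \mathbb{E}_t[\pi_{t+1}] + \kappa \, x_t

% Hybrid Phillips curve: past inflation enters structurally.
\pi_t = \gamma_f \, \mathbb{E}_t[\pi_{t+1}] + \gamma_b \, \pi_{t-1} + \kappa \, x_t
```

In the first form, stabilizing the output gap stabilizes inflation immediately (the 'divine coincidence'); in the second, the lagged term $\gamma_b \, \pi_{t-1}$ makes disinflation costly in terms of output over several periods.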

Abstract:

The central challenge in face recognition lies in understanding the role different facial features play in our judgments of identity. Notable in this regard are the relative contributions of the internal (eyes, nose and mouth) and external (hair and jaw-line) features. Past studies that have investigated this issue have typically used high-resolution images or good-quality line drawings as facial stimuli. The results obtained are therefore most relevant for understanding the identification of faces at close range. However, given that real-world viewing conditions are rarely optimal, it is also important to know how image degradations, such as loss of resolution caused by large viewing distances, influence our ability to use internal and external features. Here, we report experiments designed to address this issue. Our data characterize how the relative contributions of internal and external features change as a function of image resolution. While we replicated results of previous studies that have shown internal features of familiar faces to be more useful for recognition than external features at high resolution, we found that the two feature sets reverse in importance as resolution decreases. These results suggest that the visual system uses a highly non-linear cue-fusion strategy in combining internal and external features along the dimension of image resolution and that the configural cues that relate the two feature sets play an important role in judgments of facial identity.

Abstract:

This report is a formal documentation of the results of an assessment of the degree to which Lean principles and practices have been implemented in the US aerospace and defense industry. An industry association team prepared it for the DCMA-DCAA-Industry Association "Crosstalk" Coalition in response to a "Crosstalk" meeting action request to the industry associations. The request was motivated by the many potential benefits to system product quality, affordability and industry responsiveness that a high degree of industry-wide Lean implementation can produce.

Abstract:

A targeted, stimuli-responsive, polymeric drug delivery vehicle is being developed in our lab to help alleviate the severe side effects caused by narrow-therapeutic-window drugs. Targeting specific cell types or organs via proteins, in particular lectin-mediated targeting, holds potential owing to the high specificity and affinity of receptor-ligand interactions, rapid internalization, and relative ease of processing. Dextran, a commercially available, biodegradable polymer, has been conjugated to doxorubicin and galactosamine to target hepatocytes in a three-step, one-pot synthesis. The loading of doxorubicin and galactose on the conjugates was determined by absorbance at 485 nm and by elemental analysis, respectively. Conjugation efficiency, based on the amount of each reactant loaded, varies from 20% to 50% for doxorubicin and from 2% to 20% for galactosamine. Doxorubicin has also been attached to dextran through an acid-labile hydrazide bond. Doxorubicin acts by intercalating with DNA in the nuclei of cells, and its fluorescence is quenched when it binds to DNA. This allows a fluorescence-based, cell-free assay of the efficacy of the polymer conjugates, in which we measure the fluorescence of doxorubicin and of the conjugates at increasing concentrations of calf thymus DNA. Fluorescence quenching indicates that our conjugates can bind to DNA, and the degree of binding increases with polymer molecular weight and doxorubicin substitution. In cell culture experiments with hepatocytes, the relative uptake of polymer conjugates was evaluated using flow cytometry, and the killing efficiency was determined using the MTT cell proliferation assay. We have found that conjugate uptake is much lower than that of free doxorubicin; lower uptake may increase the maximum dose of drug tolerated by the body. Also, uptake of the non-galactosylated conjugate is lower than that of the galactosylated conjugate.
Microscopy indicates that doxorubicin localizes almost exclusively at the nucleus, whereas the conjugates are present throughout the cell. Doxorubicin linked to dextran through a hydrazide bond was used to achieve improved killing efficiency. Following uptake, the doxorubicin dissociates from the polymer in an endosomal compartment and diffuses to the nucleus. The LC₅₀ of covalently linked doxorubicin is 7.4 μg/mL, whereas that of hydrazide linked doxorubicin is 4.4 μg/mL.

Abstract:

We present a successful learning experience that began in Cryptogamy (an elective course in the second cycle of the Biology degree) and gave rise to a research project managed by the students themselves. The initiative was consolidated through the creation of a student association focused on research and outreach. In a short time, the participants have presented scientific communications and organized activities aimed at diverse audiences, both inside and outside the university community. A multidisciplinary collaboration with other research institutions and an extension of its field of study are currently under consideration. We address its impact on learning in several respects: scientific (specific techniques, rigor, information retrieval and interpretation of results), communicative (structuring and presenting the information obtained for different audiences), and organizational, including teamwork. Although spontaneous in character, this experience shows features that can be evaluated for their applicability to other courses. We analyze the characteristics and approach of this elective, the profile of its students, and the university context that hosts it. The main factors we identify are the participatory aspects of the course, the cohesion of the group, the voluntary nature of the involvement, the benefits perceived by the students, and the availability of human resources (supervision) and material resources (equipment and funding).

Abstract:

One of the key challenges for the university is to link research experience with teaching, and to promote the internationalization of degree programs, especially at the European level, bearing in mind that both can act as catalysts for improving teaching quality. One formula for internationalization is the creation of courses shared between universities in different countries, which provides an opportunity to implement new teaching methodologies. This communication presents an experience along these lines developed between the Universitat de Girona and the University of Joensuu (Finland) within the Geography degree programs, through the course 'The faces of landscape: Catalonia and North Karelia'. The course runs over two intensive weeks, one at each university. The aim is to present and analyze different meanings of the concept of landscape, together with methodologies for studying the physical, ecological and cultural aspects that can be linked to it, namely those used by the research groups of the lecturers responsible for the course. This theoretical part is complemented by a presentation of the characteristics and dynamics of Finnish and Catalan landscapes and by a field trip. For the practical part, multinational study groups are formed that examine one of these aspects at the local scale in the two countries, compare them, and give a presentation and defense before all the students and teaching staff. The working language of the course is English.

Abstract:

Abstract taken from the publication.

Abstract:

IP-based networks still do not offer the degree of reliability required by new multimedia services, and achieving such reliability will be crucial to the success or failure of the next Internet generation. Most existing schemes for QoS routing do not take into consideration parameters concerning the quality of the protection, such as packet loss or restoration time. In this paper, we define a new paradigm for developing protection strategies for building reliable MPLS networks, based on what we have called the network protection degree (NPD). The NPD consists of an a priori evaluation, the failure sensibility degree (FSD), which provides the failure probability, and an a posteriori evaluation, the failure impact degree (FID), which determines the impact on the network in case of failure. Having formulated these components mathematically, we identify the most relevant ones. Experimental results demonstrate the benefits of the NPD when it is used to enhance current QoS routing algorithms so that they offer a certain degree of protection.
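The FSD/FID decomposition described above can be illustrated with a toy calculation. Everything below is an illustrative assumption, not the paper's actual formulas: the per-link failure probabilities, the 50/50 blend of packet loss and restoration time, and the product combination are all ours.

```python
# Toy sketch of the network protection degree (NPD). The paper defines an
# a priori failure sensibility degree (FSD, failure probability) and an
# a posteriori failure impact degree (FID, impact in case of failure);
# the concrete formulas below are illustrative assumptions only.

def failure_sensibility_degree(link_failure_probs):
    """A priori: probability that at least one link on the path fails,
    assuming independent link failures."""
    p_all_ok = 1.0
    for p in link_failure_probs:
        p_all_ok *= (1.0 - p)
    return 1.0 - p_all_ok

def failure_impact_degree(packets_lost, restoration_time_ms,
                          max_loss=1000, max_restore_ms=100):
    """A posteriori: normalized blend of packet loss and restoration time
    (equal weights are our assumption)."""
    return 0.5 * min(packets_lost / max_loss, 1.0) \
         + 0.5 * min(restoration_time_ms / max_restore_ms, 1.0)

def network_protection_degree(fsd, fid):
    """Combine likelihood and impact; lower means better protected."""
    return fsd * fid

# Hypothetical 3-hop LSP with per-link failure probabilities:
fsd = failure_sensibility_degree([0.01, 0.02, 0.005])
fid = failure_impact_degree(packets_lost=200, restoration_time_ms=50)
npd = network_protection_degree(fsd, fid)
```

A QoS routing algorithm could then prefer, among paths meeting the bandwidth and delay constraints, the one with the lowest NPD.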

Abstract:

Objective: To establish a prediction model of the degree of disability in adults with Spinal Cord Injury (SCI) based on the WHO-DAS II. Methods: The degree of disability was correlated with three groups of variables: clinical, sociodemographic and those related to rehabilitation services. A multiple linear regression model was built to predict disability. Forty-five people with SCI of diverse etiology, neurological level and completeness participated. Patients were older than 18 and more than six months post-injury. The WHO-DAS II and the ASIA Impairment Scale (AIS) were used. Results: Variables showing a significant relationship with disability were occupational situation, type of affiliation to the public health care system, injury evolution time, neurological level, zone of partial preservation, AIS motor and sensory scores, and number of clinical complications during the last year. Complications significantly associated with disability were joint pain, urinary infections, intestinal problems and autonomic dysreflexia. None of the variables related to rehabilitation services showed a significant association with disability. The degree of disability differed significantly in favor of the groups that received the following services: supply of assistive devices and vocational, job or educational counseling. Conclusions: The best disability prediction model for adults with SCI more than six months post-injury was built with injury evolution time, AIS sensory score and injury-related unemployment.
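The multiple linear regression behind the prediction model can be sketched as follows. All data and coefficients below are synthetic and hypothetical; only the three predictors (injury evolution time, AIS sensory score, injury-related unemployment) and the sample size of 45 come from the abstract.

```python
# Hedged sketch of a multiple linear regression like the one described
# above. The generated data and coefficients are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 45  # sample size reported in the abstract

evolution_months = rng.uniform(6, 120, n)        # time since injury
ais_sensory = rng.uniform(0, 224, n)             # AIS sensory score
unemployed = rng.integers(0, 2, n).astype(float) # injury-related unemployment

# Hypothetical WHO-DAS II disability score, generated for illustration:
# higher with unemployment, lower with better sensory preservation.
disability = 40 + 0.05 * evolution_months - 0.08 * ais_sensory \
             + 12 * unemployed + rng.normal(0, 3, n)

# Fit: disability ~ intercept + evolution + sensory + unemployment
X = np.column_stack([np.ones(n), evolution_months, ais_sensory, unemployed])
coefs, *_ = np.linalg.lstsq(X, disability, rcond=None)
predicted = X @ coefs
```

With real data, the fitted `coefs` would give the direction and size of each predictor's contribution to the WHO-DAS II score, which is how the study identifies its best prediction model.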

Abstract:

Since the 1991 Constitution, laws and court rulings have tended to regard local executives (mayors, governors) as ruling agents who practice government of the territory and "controlled decentralization", rather than as governing actors who champion the governance of their territories. This raises the question of why and how they can combine this double role. Most of the answer lies in the simultaneous examination of the regulation logic of public policies (Muller, 1997 and 2002) and of the conditions of their implementation in the territory, or "territorial deployment" (Medellín, 2003). According to Muller, the territoriality of a public policy refers to a situation in which its dominant logic is territorial or horizontal (regulation of a geographical territory in a center-periphery dialectic), whereas sectoriality refers to a situation in which the dominant logic is sectoral or vertical (regulation of the reproduction of a vertically defined sector in a global-sectoral dialectic). To each regulation logic of public policies corresponds a way of governing a territory: government of the territory for sectoriality, and governance of the territories for territoriality. Unlike the French case, in which the construction of the State marks the end of a territorial logic ("traditional territoriality"), the Colombian State is an endemic, fragmented State confronted with the struggle for territorial control and the fear of the fracture of territorial unity (Navas, 2003).
Its partial capacity to cover the whole territory and act upon it (its "partial territoriality", in Medellín's terms) affects its governability and the territorial deployment of public policies, which is subject to uncertain negotiations close to the theme of governance. Governability in Colombia is therefore only possible through hybrid public policies, offspring of the "indissoluble marriage of sectoriality and territoriality", of "secterritoriality", which combines in variable proportions a sectoral logic (sectoriality, implying centralization) and a territorial logic (territoriality, implying decentralization), thanks to a hybrid mode of government, the "gobiernancia" of the territory: a complex dosage of government (gobierno) and governance (gobernancia). It is thus possible to determine the degree of territoriality and sectoriality of a public policy at a given moment and to classify public policies according to these degrees, that is, according to the relative importance of their regulation logics. While the period from the early 1970s to the early 1980s was dominated by sectoriality, with governability secured almost exclusively through government of the territory via the separate territorial deployment of each sector, the period from 1984 to the present, characterized by "controlled decentralization", marks a partial shift in the regulation logic of public policies through a partial recourse to the logic of territoriality in response to a crisis of sectoriality.
As the analysis and evaluation of public policies on social housing, municipal land-use planning and development, education, and water and sewerage services show, governability in this second period is secured only thanks to "gobiernancia" in the territories, the combination of government of the territory and governance of the territories: a hybrid mode of government that allows local executives to reconcile their inclination toward the governance of their territories with their obligations regarding government of the territory, and to be, at the same time, governing actors and ruling agents.