914 results for "Constraints of monotonicity"
Abstract:
Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also require management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large-population dynamics, as well as computer tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded spectacularly since its reintroduction in Switzerland at the beginning of the 20th century, was used as the paradigm species. This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix, extended with density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and the colonisation of new areas. Habitat suitability and dispersal obstacles therefore had to be modelled as well. To this end, a software package named Biomapper was developed. Its central module is based on Ecological Niche Factor Analysis (ENFA), which computes marginality and specialisation factors of the ecological niche from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover data importation, predictor preparation, ENFA and habitat suitability map computation, result validation and further processing; one module also maps dispersal barriers and corridors. The domain of application of ENFA was then explored by means of a simulated species distribution. Compared with a commonly used habitat suitability assessment method, the Generalised Linear Model (GLM), ENFA proved particularly well suited to spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with both landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, the population density. The latter varies according to local reproduction, survival and dispersal dynamics, modified by density dependence and stochasticity. A software tool named HexaSpace was developed to perform two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); and (2) running simulations. It allows the spread of an invading species to be studied across a complex landscape made of areas of varying suitability and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species, both plants and animals, around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these tools were designed to build a complex, realistic model from low-level data, and as they offer an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, these approaches can also address theoretical questions in the fields of population and landscape ecology.
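To make the automaton's update rule concrete, the following is a minimal sketch of one time step on such a hexagonal lattice. This is not HexaSpace's actual code: the logistic growth term, the emigration fraction d, the lognormal noise and all parameter values are illustrative assumptions consistent with the description above (per-cell carrying capacity, six per-edge impermeability rates, density-dependent local growth, stochasticity).

```python
import numpy as np

rng = np.random.default_rng(0)

def step(density, capacity, impermeability, neighbours,
         r=0.3, sigma=0.1, d=0.05):
    """One update of a hexagonal-lattice population automaton (sketch).

    density        : (n,) population density of each cell
    capacity       : (n,) carrying capacity of each cell
    impermeability : (n, 6) fraction of dispersers blocked at each edge
    neighbours     : (n, 6) index of the cell across each edge (assumes
                     every cell has six neighbours, e.g. a periodic lattice)
    r, sigma, d are illustrative growth, noise and emigration parameters.
    """
    # Local logistic (density-dependent) growth with per-cell
    # lognormal environmental stochasticity.
    growth = r * density * (1.0 - density / np.maximum(capacity, 1e-9))
    density = np.maximum(
        density + growth * rng.lognormal(0.0, sigma, density.shape), 0.0)

    # Dispersal: a fraction d of each cell emigrates, split over the six
    # edges and attenuated by each edge's impermeability.
    out = d * density[:, None] / 6.0 * (1.0 - impermeability)
    density = density - out.sum(axis=1)
    np.add.at(density, neighbours.reshape(-1), out.reshape(-1))
    return density
```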
Abstract:
This thesis presents an alternative approach to the analytical design of surface-mounted axial-flux permanent-magnet machines. Emphasis has been placed on the design of axial-flux machines with a one-rotor-two-stators configuration. The design model developed in this study incorporates both the electromagnetic and the thermal design of the machine, and takes into consideration the complexity of the permanent-magnet shapes, a typical requirement in the design of high-performance permanent-magnet motors. A prototype machine with a rated output power of 5 kW at a rotation speed of 300 min⁻¹ was designed and constructed in order to verify the results obtained from the analytical design model. A comparative study of low-speed axial-flux and low-speed radial-flux permanent-magnet machines is presented. The study concentrates on 55 kW machines with rotation speeds of 150, 300 and 600 min⁻¹ and is based on calculated designs. A novel comparison method is introduced. The method takes into account the mechanical constraints of the machine and enables comparison of the designed machines with respect to the volume, efficiency and cost of each machine. It is shown that an axial-flux permanent-magnet machine with a one-rotor-two-stators configuration generally has a lower efficiency than a radial-flux permanent-magnet machine when the same electric loading, air-gap flux density and current density are applied to all designs. On the other hand, axial-flux machines are usually smaller in volume, especially when compared with radial-flux machines whose length ratio (axial length of the stator stack vs. air-gap diameter) is below 0.5. The comparison results also show that radial-flux machines with a low number of pole pairs (p < 4) outperform the corresponding axial-flux machines.
Abstract:
The need for high performance, high precision, and energy savings in rotating machinery demands an alternative to traditional bearings. Owing to their contactless operating principle, rotating machines employing active magnetic bearings (AMBs) provide many advantages over traditional ones. Advantages such as contamination-free operation, low maintenance costs, high rotational speeds, low parasitic losses, programmable stiffness and damping, and vibration isolation come at the expense of high cost and a complex technical solution. All these properties make AMBs appropriate primarily for specific and highly demanding applications. High-performance, high-precision control requires model-based control methods and accurate models of the flexible rotor. In turn, complex models lead to high-order controllers and a considerable computational burden. Fortunately, in recent years, advances in signal-processing devices have opened new perspectives on the real-time control of AMBs. The design and real-time digital implementation of high-order LQ controllers, with a focus on fast execution times, are the subjects of this work. In particular, control design and implementation on field-programmable gate array (FPGA) circuits are investigated. The optimal design is guided by the physical constraints of the system when selecting the weighting matrices. The plant model is complemented by augmenting it with appropriate disturbance models. Compensation of the force-field nonlinearities is proposed to decrease the uncertainty of the actuator. A disturbance-observer-based unbalance compensation for cancelling magnetic force vibrations, or vibrations in the measured positions, is presented. The theoretical studies are verified by practical experiments on a custom-built laboratory test rig, which uses a prototyping control platform developed in the scope of this work. To sum up, this work takes a step towards an embedded, single-chip FPGA-based AMB controller.
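As a concrete illustration of the LQ design step described above, here is a minimal sketch using a toy one-axis AMB model. The plant, its parameters (m, k_x, k_i) and the weighting matrices are hypothetical stand-ins, not those of the thesis; the point is only the mechanics of choosing Q and R and solving the Riccati equation for the state-feedback gain.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy one-axis AMB model (hypothetical parameters): a magnetic bearing
# exhibits negative stiffness k_x, so the open-loop plant is unstable.
m, k_x, k_i = 3.0, 4.0e5, 200.0       # rotor mass [kg], N/m, N/A
A = np.array([[0.0, 1.0],
              [k_x / m, 0.0]])        # state: [position, velocity]
B = np.array([[0.0],
              [k_i / m]])             # input: coil current

# LQ weights: penalise position error heavily, control effort lightly.
Q = np.diag([1e8, 1.0])
R = np.array([[1.0]])

P = solve_continuous_are(A, B, Q, R)  # solve the algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)       # optimal state feedback u = -K x

# Closed-loop eigenvalues should all have negative real parts (stable).
print(np.linalg.eigvals(A - B @ K))
```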
Abstract:
The thesis studies the representations of different elements of contemporary work as present in Knowledge Management (KM). KM is approached as a management discourse that is seen to affect and influence managerial practices in organizations. As representatives of KM discourse, four journal articles are analyzed using the methodology of Critical Discourse Analysis within the framework of Critical Management Studies, with special emphasis on the question of structure and agency. The analysis reveals that structural elements such as information technology and organizational structures are strongly present in the most influential KM representations, which also makes their improvement a desirable course of action for managers. In contrast, agentic properties do not play a central role: they are subjugated to structural constraints of varying kind and degree. The thesis claims that one such constraint is KM discourse itself, which influences managerial and organizational choices and decision making. The thesis concludes that the way human beings are represented, studied and treated in management studies such as KM needs to be re-examined.
Abstract:
Systems made of parts that are totally connected do not work, neither ecosystems nor artifacts. Relative connectance is inversely related to diversity, and both magnitudes can find a common frame of expression, in which some constant expressing the constraints of any organization might be embodied. If S is Simpson's index, the expression (1 - S)/S as a measure of diversity offers some advantages or, at least, helps further reasoning. This expression is the ratio between the total number of possible interspecific interactions and the number of possible intraspecific interactions.
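As a worked example of the measure, here is a minimal sketch assuming the standard definition S = Σ p_i² over the species' relative abundances p_i (the function name is illustrative):

```python
import numpy as np

def diversity_ratio(abundances):
    """(1 - S) / S, with S = Simpson's index sum(p_i**2): the ratio of
    possible interspecific to possible intraspecific interactions."""
    p = np.asarray(abundances, dtype=float)
    p = p / p.sum()          # relative abundances
    S = np.sum(p ** 2)       # Simpson's index
    return (1.0 - S) / S

# Example: an even 4-species community gives S = 0.25 and a ratio of 3.0.
print(diversity_ratio([10, 10, 10, 10]))   # -> 3.0
```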
Abstract:
Given their central role in mercury (Hg) excretion and their suitability as reservoirs, bird feathers are useful Hg biomonitors. Nevertheless, the interpretation of Hg concentrations is still questioned as a result of poor knowledge of feather physiology and of the mechanisms affecting Hg deposition. Given the constraints that feather availability places on ecotoxicological studies, we tested for intra-individual differences in Hg concentrations according to feather type (body vs. flight feathers), position in the wing, and size (mass and length), in order to understand how these factors could affect Hg estimates. We measured the Hg concentration of 154 feathers from 28 un-moulted barn owls (Tyto alba) collected dead on roadsides. Median Hg concentration was 0.45 (0.076-4.5) mg kg⁻¹ in body feathers, 0.44 (0.040-4.9) mg kg⁻¹ in primary and 0.60 (0.042-4.7) mg kg⁻¹ in secondary feathers, and we found only a weak effect of feather type on intra-individual Hg levels. We also found a negative effect of wing-feather mass on Hg concentration, but no effect of feather length or of position in the wing. We hypothesize that differences in feather growth rate may be the main driver of between-feather differences in Hg concentrations, which can have implications for the interpretation of Hg concentrations in feathers. Finally, we recommend that, whenever possible, several feathers from the same individual be analysed. The five innermost primaries have the lowest mean deviations from both the between-feather and the intra-individual mean Hg concentration, and should thus be selected under restrictive sampling scenarios.
Abstract:
It has been shown that mental rotation of objects and of human body parts is processed differently in the human brain. But what about body parts belonging to other primates? Does our brain process this information like any other object, or does it instead maximize the structural similarities with our homologous body parts? We tried to answer this question by measuring the manual reaction time (MRT) of human participants discriminating the handedness of drawings representing the hands of four anthropoid primates (orangutan, chimpanzee, gorilla, and human). Twenty-four right-handed volunteers (13 males and 11 females) were instructed to judge the handedness of a hand drawing in palm view by pressing a left/right key. The orientation of the hand drawings varied from 0° (fingers upwards) to 90° lateral (fingers pointing away from the midline), 180° (fingers downwards), and 90° medial (fingers towards the midline). The results showed an effect of rotation angle (F(3, 69) = 19.57, P < 0.001), but not of hand identity, on MRTs. Moreover, for all hand drawings, a medial rotation elicited shorter MRTs than a lateral rotation (960 and 1169 ms, respectively, P < 0.05). This result had previously been observed for drawings of the human hand and related to biomechanical constraints on movement performance. Our findings indicate that anthropoid hands are essentially equivalent stimuli for handedness recognition. Since the task involves mentally simulating the posture and rotation of one's own hands, we wondered whether "mirror neurons" could be involved in establishing the motor equivalence between the stimuli and the participants' own hands.
Abstract:
In this doctoral thesis, a tomographic STED microscopy technique for 3D super-resolution imaging was developed and used to observe bone remodelling processes. To improve upon existing methods, we used a tomographic approach built on a commercially available stimulated emission depletion (STED) microscope. A region of interest (ROI) was observed at two oblique angles: one in the standard inverted configuration from below (bottom view) and another from the side (side view) via a micro-mirror positioned close to the ROI. The two viewing angles were reconstructed into a final tomogram. The technique, named tomographic STED microscopy, achieved an axial resolution of approximately 70 nm on microtubule structures in a fixed biological specimen. High-resolution imaging of osteoclasts (OCs) actively resorbing bone was achieved by creating an optically transparent coating on a microscope cover glass that imitates a fractured bone surface. 2D super-resolution STED microscopy on the bone layer showed approximately 60 nm lateral resolution on a resorption-associated organelle, allowing these structures to be imaged with super-resolution microscopy for the first time. The tomographic STED technique was then applied to study the resorption mechanisms of OCs cultured on the bone coating. It revealed an actin cytoskeleton with specific comet-tail structures, some facing upwards and others downwards. In our opinion, this indicates that the actin cytoskeleton is involved in vesicular exocytosis and endocytosis during bone resorption. The application of tomographic STED microscopy in bone biology demonstrated that 3D super-resolution techniques can provide new insights into biological 3D nanostructures beyond the diffraction limit, provided the optical constraints of super-resolution imaging are carefully taken into account.
Abstract:
This is a study of one participant's reflective practice as she worked to develop online communities in a face-to-face science course. Her process of reflective practice was examined in order to address the factors that influenced her learning path, and the benefits and challenges of collaborative action research. These research goals were pursued using a collaborative action research methodology, initially chosen for its close match with Schön's (1983) model of reflective practice. The participant's learning fit with Mezirow's (1991) model of transformative learning. She began with beliefs that matched her goals, and she demonstrated significant learning in three areas. First, she demonstrated instrumental learning around the constraints of workload and time, and around achieving online learning community indicators. Second, she demonstrated communicative learning that helped her to see her own needs for feedback and communication more clearly, and how other process partners had supported her. Third, her emancipatory learning saw her revisiting and questioning her goals. It was through the reflective conversation during the planned meetings, and the researcher's reframing and interrogation of that reflection, that the participant was able to clarify and extend her thinking and, in so doing, critically reflect on her practice as she worked to develop online learning communities. In this way, the collaborative action research methodology was an embodiment of co-constructivism through collaborative reflective practice. Schön's (1983) model of reflective practice positions a lone practitioner moving through cycles of plan-act-observe-reflect. The results from this study suggest that collaboration is an important addition to the reflective practice model.
Abstract:
This thesis compares the foreign economic policy dimension of the development strategies adopted by the governments of two Commonwealth Caribbean countries: the Manley government in Jamaica, and the Williams government in Trinidad and Tobago. The foreign economic policies adopted by these governments appeared, on the surface, to be markedly dissimilar. The Jamaican strategy, on the one hand, emphasised self-reliance and national autonomy, and featured the espousal of radical nonalignment together with attempts to redefine the terms of the island's external economic relations. The Trinidadian strategy, on the other hand, featured liberal externally oriented growth policies, and close relations with Western governments and financial institutions. This study attempts to identify the explanatory factors that account for the apparent dissimilarity in the foreign economic policies of these two governments. The study is based on a comparison of how the structural bases of an underdeveloped economy, and the foreign penetration and vulnerability to external pressures associated with dependence, shape and influence foreign economic policy strategy. The framework views foreign economic strategy as an adaptive response on the part of the decision makers of a state to the constraints and opportunities provided by a particular situation, the 'situation' in this case being the events, conditions, structures and processes associated with dependence and underdevelopment. The results indicate that the similarities and dissimilarities in the foreign economic policies of the governments of Jamaica and Trinidad were a reflection of the similarities and dissimilarities in their respective situations. The conclusion derived suggests that if the foreign policy field, as an arena of choice, is indeed one of opportunities and constraints for each and every state, then policy makers of small, weak, highly penetrated and vulnerable states enter this arena with constraints outweighing opportunities. This places effective limits on their decisional latitude and the range of policy options available. Policy makers thus have to decide critical issues with few established precedents, in the face of domestic social and political cleavages, as well as serious foreign pressures. This is a reflection not only of the trappings of dependence, but also of the limited capabilities arising from the small size of the state, and the impact of the resource gap in an underdeveloped economy. The Trinidadian strategy is illustrative of a development strategy made viable through a combination of a fortuitous circumstance, a confluence of the interests of influential groups, and accurate perception on the part of policy makers. These factors enabled policy makers to minimise some of the constraints of dependence. The failure of Manley's strategy, on the other hand, is illustrative of the problems involved in the adoption of policies that work against the interest of internal and external political and economic forces. It is also illustrative of the consequences of the failure on the part of policy makers to clarify goals, and to reconcile the values of rapid economic growth with increased self-reliance and national autonomy. These values tend to be mutually incompatible given the existing patterns of relations in the international economy.
Abstract:
This thesis describes the development of a model-based vision system that exploits hierarchies of both object structure and object scale. The focus of the research is to use these hierarchies to achieve robust recognition based on effective organization and indexing schemes for model libraries. The goal of the system is to recognize parameterized instances of non-rigid model objects contained in a large knowledge base despite the presence of noise and occlusion. Robustness is achieved by developing a system that can recognize viewed objects that are scaled or mirror-image instances of the known models, or that contain component sub-parts with different relative scaling, rotation, or translation than in the models. The approach taken in this thesis is to develop an object shape representation that incorporates a component sub-part hierarchy (to allow for efficient and correct indexing into an automatically generated model library, as well as for relative parameterization among sub-parts) and a scale hierarchy (to allow for a general-to-specific recognition procedure). After analysis of the issues and inherent tradeoffs in the recognition process, a system is implemented using a representation based on significant contour curvature changes and a recognition engine based on geometric constraints of feature properties. Examples of the system's performance are given, followed by an analysis of the results. In conclusion, the system's benefits and limitations are presented.
Abstract:
This paper investigates the linear degeneracies of projective structure estimation from point and line features across three views. We show that the rank of the linear system of equations for recovering the trilinear tensor of three views reduces to 23 (instead of 26) when the scene is a Linear Line Complex (a set of lines in space intersecting at a common line) and to 21 when the scene is planar. The LLC situation is only linearly degenerate, and we show that one can obtain a unique solution when the admissibility constraints of the tensor are accounted for. The line configuration described by an LLC, rather than being some obscure case, is in fact quite typical: it includes, as a particular example, a camera moving down a hallway in an office environment or down an urban street. Furthermore, an LLC situation may occur as an artifact, such as in direct estimation from spatio-temporal derivatives of image brightness. An investigation into degeneracies and their remedy is therefore important in practice as well.
Abstract:
Automatically obtaining the 3D profile of objects is one of the most important issues in computer vision. With this information, a large number of applications become feasible, from visual inspection of industrial parts to 3D reconstruction of the environment for mobile robots. To acquire 3D data, range finders can be used. The coded structured light approach is one of the most widely used techniques for retrieving the 3D shape of an unknown surface. An overview of the existing techniques, as well as a new classification of patterns for structured light sensors, is presented. Such systems belong to the group of active triangulation methods, which are based on projecting a light pattern and imaging the illuminated scene from one or more points of view. Since the patterns are coded, correspondences between points in the image(s) and points of the projected pattern can easily be found. Once correspondences are found, a classical triangulation strategy between the camera(s) and the projector leads to the reconstruction of the surface. The advantages and constraints of the different patterns are discussed.
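To illustrate the final triangulation step once a coded pattern has yielded a correspondence, here is a minimal linear (DLT) sketch. It models the projector as an inverse camera with its own 3x4 projection matrix, a common convention in active triangulation; the matrices and the function name are illustrative assumptions, not a specific system's API.

```python
import numpy as np

def triangulate(P_cam, P_proj, x_cam, x_proj):
    """Linear (DLT) triangulation of one surface point from a camera pixel
    and its decoded correspondence in the projector pattern, both (u, v).
    P_cam and P_proj are calibrated 3x4 projection matrices."""
    # Each view contributes two linear constraints: u*(P3.X) - P1.X = 0
    # and v*(P3.X) - P2.X = 0, where Pi is the i-th row of P.
    A = np.vstack([
        x_cam[0] * P_cam[2] - P_cam[0],
        x_cam[1] * P_cam[2] - P_cam[1],
        x_proj[0] * P_proj[2] - P_proj[0],
        x_proj[1] * P_proj[2] - P_proj[1],
    ])
    # The homogeneous 3D point is the null vector of A (smallest
    # singular vector), then converted to Euclidean coordinates.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```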
Abstract:
Protein tagging with ubiquitin, known as ubiquitination, serves various functions, including the regulation of several cellular processes such as protein degradation by the proteasome, DNA repair, membrane-receptor-mediated signalling, and endocytosis, among others (1). Ubiquitin molecules can be removed from their substrates through the action of a large group of proteases called deubiquitinating enzymes (DUBs) (2). DUBs are essential for maintaining ubiquitin homeostasis and for regulating the ubiquitination state of different substrates. The large number and diversity of described DUBs reflects both their specificity and their use in regulating a wide spectrum of substrates and cellular pathways. Although many DUBs have been studied in depth, the substrates and biological functions of most of them remain unknown. In this work, the functions of the DUBs USP19, USP4 and UCH-L1 were investigated. Using various molecular and cell biology techniques, it was found that: i) USP19 is regulated by the ubiquitin ligases SIAH1 and SIAH2; ii) USP19 is important for regulating HIF-1α, a key transcription factor in the cellular response to hypoxia; iii) USP4 interacts with the proteasome; iv) the mCherry-UCH-L1 chimera partially reproduces the phenotypes that our group has previously described using other constructs of the same enzyme; and v) UCH-L1 promotes the internalization of the bacterium Yersinia pseudotuberculosis.
Abstract:
This paper uses a hybrid human capital / signaling model to study grading standards in schools when tuition fees are allowed. The paper analyzes the grading standard set by a profit-maximizing school and compares it with the efficient one. The paper also studies grading standards when tuition fees are capped. When fees are regulated, a profit-maximizing school will set lower grading standards than when they are not. Credit constraints of families also induce schools to lower their standards. Given that competition is not feasible in the model presented, these results show the importance of regulating grading standards.