282 results for Formalization
Abstract:
Registering originative business contracts allows entrepreneurs and creditors to choose, and courts to enforce, market-friendly contract rules that protect innocent third parties when adjudicating disputes on subsequent contracts. This reduces information asymmetry for third parties, which enhances impersonal trade. It does so without seriously weakening property rights, because it is rightholders who choose or activate the legal rules and can, therefore, minimize the cost of any possible weakening. Registries are essential not only to make the chosen rules public but also to ensure rightholders' commitment and avoid rule-gaming, because independent registries make rightholders' choices verifiable by courts. The theory is supported by comparative and historical analyses.
Abstract:
Simplifying business formalization and eliminating outdated formalities is often a good way of improving the institutional environment for firms. Unfortunately, the World Bank's "Doing Business" project is harming such policies by promoting a reform agenda that gives them priority even in countries lacking functional business registers, so that the reformed registers keep producing valueless information, but faster. Its methodology also promotes biased measurements that impede proper consideration of the essential tradeoffs in the design of formalization institutions. If "Doing Business" is to stop jeopardizing its true objectives and contribute positively to scientific progress, institutional reform and economic development, then its aims, governance and methodology need to change.
Abstract:
The effects of patch size and isolation on metapopulation dynamics have received wide empirical support and theoretical formalization. By contrast, the effects of patch quality remain largely underinvestigated, partly due to technical difficulties in properly assessing quality. Here we combine habitat-quality modeling with four years of demographic monitoring in a metapopulation of greater white-toothed shrews (Crocidura russula) to investigate the role of patch quality in metapopulation processes. Together, local patch quality and connectivity significantly enhanced local population sizes and occupancy rates (R2 = 14% and 19%, respectively). Accounting for the quality of patches connected to the focal one, which act as potential sources, slightly improved the model's explanatory power for local population sizes, pointing to significant source-sink dynamics. Local habitat quality, in interaction with connectivity, also increased the colonization rate (R2 = 28%), suggesting the ability of immigrants to target high-quality patches. Overall, patterns were best explained when assuming a mean dispersal distance of 800 m, a realistic value for the species under study. Our results thus provide evidence that patch quality, in interaction with connectivity, may affect major demographic processes.
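A minimal sketch of a quality-weighted, Hanski-style connectivity index consistent with the setup above, assuming a negative-exponential dispersal kernel with the 800 m mean dispersal distance mentioned in the abstract; the patch names, coordinates and quality values are illustrative assumptions, not the study's data.

import math

# Illustrative patches: coordinates in metres, quality in [0, 1] (assumed values).
patches = {
    "A": {"x": 0.0,    "y": 0.0,   "quality": 0.8},
    "B": {"x": 500.0,  "y": 300.0, "quality": 0.4},
    "C": {"x": 1500.0, "y": 900.0, "quality": 0.9},
}

def connectivity(focal, patches, mean_dispersal_m=800.0):
    """Quality-weighted connectivity S_i = sum_j exp(-d_ij / mean_dispersal) * Q_j."""
    fx, fy = patches[focal]["x"], patches[focal]["y"]
    total = 0.0
    for name, p in patches.items():
        if name == focal:
            continue
        d = math.hypot(p["x"] - fx, p["y"] - fy)  # Euclidean distance in metres
        total += math.exp(-d / mean_dispersal_m) * p["quality"]
    return total

for name in patches:
    print(name, round(connectivity(name, patches), 3))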
Abstract:
Sex determination is often seen as a dichotomous process: individual sex is assumed to be determined either by genetic factors (genotypic sex determination, GSD) or by environmental factors (environmental sex determination, ESD), most often temperature (temperature-dependent sex determination, TSD). We endorse an alternative view, which sees GSD and TSD as the ends of a continuum. Both effects interact a priori, because temperature can affect gene expression at any step along the sex-determination cascade. We propose to define sex-determination systems at the population (rather than individual) level, via the proportion of variance in phenotypic sex stemming from genetic versus environmental factors, and we formalize this concept in a quantitative-genetics framework. Sex is seen as a threshold trait underlain by a liability factor, and reaction norms allow modeling interactions between genotypic and temperature effects (seen as the necessary consequences of thermodynamic constraints on the underlying physiological processes). As this formalization shows, temperature changes (due to, e.g., climate change or range expansions) are expected to provoke turnovers in sex-determination mechanisms, by inducing large-scale sex reversal and thereby sex-ratio selection for alternative sex-determining genes. The frequency of turnovers and the prevalence of homomorphic sex chromosomes in cold-blooded vertebrates might thus relate directly to the temperature dependence of sex-determination mechanisms.
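As a hedged illustration of the liability-threshold formalization sketched above (the notation is assumed here, not necessarily the authors'): phenotypic sex is set by whether a latent liability, combining genotype, a temperature reaction norm and residual environment, crosses a threshold, and the sex-determination system is characterized at the population level by the genetic share of the liability variance.

\[ \ell_i = g_i + b_i\,(T_i - \bar{T}) + e_i, \qquad \text{sex}_i = \begin{cases} \text{one sex}, & \ell_i > \theta \\ \text{the other sex}, & \ell_i \le \theta \end{cases} \]

\[ P_G = \frac{V_G}{V_G + V_{G\times T} + V_T + V_E}, \]

with pure GSD and pure TSD as the limiting cases \(P_G \to 1\) and \(P_G \to 0\).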
Abstract:
Most economic interactions happen in a context of sequential exchange in which innocent third parties suffer information asymmetry with respect to previous "originative" contracts. The law reduces transaction costs by protecting these third parties but preserves some element of consent by property rightholders to avoid damaging property enforcement; e.g., it is they, as principals, who authorize agents in originative contracts. Judicial verifiability of these originative contracts is obtained either as an automatic byproduct of transactions or, when these would have remained private, by requiring them to be made public. Protecting third parties produces a legal commodity which is easy to trade impersonally, improving the allocation and specialization of resources. Historical delay in generalizing this legal commoditization paradigm is attributed to path dependency (the law first developed for personal trade) and an imbalance in vested interests, as luddite legal professionals face weak public bureaucracies.
Abstract:
Academic advising is a key element for learning success in virtual environments that has received little attention from researchers. This paper focuses on the organizational arrangements needed for the delivery of academic advising in online higher education. We present the general dimensions of organizational structures (division of labor, hierarchy of authority and formalization) and their possible forms when applied to academic advising. The specific solution adopted at the Open University of Catalonia is described and assessed in order to draw general conclusions of interest for other institutions.
Abstract:
This paper analyzes the role of formalization of land property rights in the war against illicit crops in Colombia. We argue that, as a consequence of the increase in state presence and visibility during the period 2000-2009, municipalities with a higher level of formalization of their land property rights saw a greater reduction in the area allocated to illicit crops. We hypothesize that this is due to the increased cost of growing illicit crops on formal land compared to informal land, and to the possibility of obtaining more benefits in the newly installed institutional environment when land is formalized. We exploit the variation in the level of formalization of land property rights in a set of municipalities that had their first cadastral census collected in the period 1994-2000; this selection procedure guarantees reliable data and an unbiased source of variation. Using fixed-effects estimators, we found a significant negative relationship between the level of formalization of land property rights and the number of hectares allocated to coca crops per municipality. These results remain robust through a number of sensitivity analyses. Our findings contribute to the growing body of evidence on the positive effects of formal land property rights and on effective policies in the war on drugs in Colombia.
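As a hedged illustration, the fixed-effects design described above could be written along the following lines (variable names and controls here are assumptions, not the authors' exact specification):

\[ \text{CocaHa}_{mt} = \alpha_m + \gamma_t + \beta\,\text{Formal}_{mt} + X_{mt}'\delta + \varepsilon_{mt}, \]

where \(\alpha_m\) are municipality fixed effects, \(\gamma_t\) year effects, \(\text{Formal}_{mt}\) the level of formalization of land property rights, and \(X_{mt}\) time-varying controls; the reported finding corresponds to an estimate \(\hat{\beta} < 0\).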
Abstract:
In recent years there has been an increasing demand for a variety of logical systems, prompted mostly by applications of logic in AI, logic programming and other related areas. Labeled Deductive Systems (LDS) were developed as a flexible methodology for formalizing such complex logical systems. In the last decade, defeasible argumentation has proven to be a confluence point for many approaches to formalizing commonsense reasoning. Different formalisms have been developed, many of them sharing common features. This paper presents a formalization of an LDS for defeasible argumentation, in which the main issues concerning defeasible argumentation are captured within a unified logical framework. The proposed framework is defined in two stages. First, defeasible inference will be formalized by characterizing an argumentative LDS. That system will then be extended in order to capture conflict among arguments using a dialectical approach. We also present some logical properties emerging from the proposed framework and discuss its semantic characterization.
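A toy sketch, using an assumed encoding rather than the paper's actual LDS, of the two-stage idea: arguments carry labels recording their defeasible derivations, and a defeat relation over labels drives a dialectical acceptability test in which an argument stands whenever every defeater is itself defeated by an acceptable argument.

from dataclasses import dataclass

@dataclass(frozen=True)
class Argument:
    label: str       # label naming the (defeasible) derivation
    conclusion: str  # formula the argument supports

def acceptable(arg, args, defeats):
    """An argument stands if every defeater is itself defeated by an acceptable
    argument (naive recursion; assumes the defeat graph is acyclic)."""
    attackers = [a for a in args if (a.label, arg.label) in defeats]
    return all(
        any(acceptable(d, args, defeats)
            for d in args if (d.label, a.label) in defeats)
        for a in attackers
    )

a1 = Argument("A1", "flies(tweety)")
a2 = Argument("A2", "~flies(tweety)")
a3 = Argument("A3", "~penguin(tweety)")
defeats = {("A2", "A1"), ("A3", "A2")}        # A2 defeats A1, A3 defeats A2
print(acceptable(a1, [a1, a2, a3], defeats))  # True: A1 is reinstated by A3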
Abstract:
The main objective of this dissertation is to create new knowledge on an administrative innovation: its adoption, diffusion and, finally, its effectiveness. In this dissertation the administrative innovation is approached through a widely utilized management philosophy, namely the total quality management (TQM) strategy. TQM operationalizes a self-assessment procedure, which is based on continual-improvement principles and on measuring the improvements. This dissertation also captures the theme of change management as it analyzes the adoption and diffusion of the administrative innovation. It identifies innovation characteristics as well as organizational and individual factors explaining the adoption and implementation. As a special feature, this study also explores the effectiveness of the innovation based on objective data. For studying the administrative innovation (the TQM model), a multinational Case Company provides a versatile ground for a deep, longitudinal analysis. The Case Company started the adoption systematically in the mid-1980s in some of its units. Today, as part of its strategic planning, the procedure is in use throughout the entire global company. The empirical story begins with the innovation adoption decision made in the Case Company over 22 years ago. To capture the atmosphere and background leading to the adoption decision, key informants from that time were interviewed, since the main aim was to clarify the dynamics of how an administrative innovation develops. In addition, archival material was collected and studied; the available memos and data relating to the innovation, its adoption and, later, its implementation comprised altogether 20,500 pages of documents. Furthermore, a survey was conducted at the end of 2006, focusing on questions related to the innovation and to organizational and leadership characteristics; the response rate was 54%. To measure the effectiveness of the innovation implementation, the necessary longitudinal objective performance data were collected. These data included the profit-unit-level experience of TQM, the development of the self-assessment scores per profit unit, and performance data per profit unit measured by profitability, productivity and customer satisfaction. The data covered the years 1995-2006. As a result, the prerequisites for the successful adoption of an administrative innovation were defined, such as top management involvement, the support of change agents, and effective tools for implementation and measurement. The factors with the greatest effect on the depth of the implementation were the timing of the adoption and formalization. The results also indicated that the TQM model does have an effect on company performance measured by profitability, productivity and customer satisfaction. Consequently, this thesis contributes to the present literature (i) by taking an administrative innovation into its scope and focusing on the whole innovation implementation process, from adoption through diffusion to its consequences; (ii) by examining a multifaceted set of factors affecting innovation adoption and diffusion, grouped into individual, organizational and environmental factors, with a strong emphasis on the role of individual change agents; and (iii) by measuring the depth and consistency of the administrative innovation. This deep analysis was possible due to the availability of longitudinal data with triangulation possibilities.
Abstract:
Allergy has been on the rise for half a century and concerns nearly 30% of children; it has now become a real public health problem. The guidelines on prevention of allergy set up by the French Society of Paediatrics (SFP) and the European Society of Paediatric Allergology and Clinical Immunology (ESPACI) are based on screening children at risk through a systematic search of the family history and recommend, for children at risk, exclusive breastfeeding whenever possible or otherwise utilization of hypoallergenic infant formula, which has demonstrated efficacy. The AllerNaiss practice survey assessed the modes of screening and prevention of allergy in French maternity units in 2012. The SFP guidelines are known by 82% of the maternity units that took part in the survey, and the ESPACI guidelines by 55% of them. A screening strategy is in place in 59% of the participating maternity wards, based on local consensus for 36% of them, 13% of the units having a written screening procedure. Screening is based on the search for a history of allergy in first-degree relatives (99%) during pregnancy (51%), in the delivery room (50%), and after delivery (89%). A mode of prevention of the risk of allergy exists in 62% of the maternity units, most often in writing (49%). A hypoallergenic infant formula is prescribed for non-breastfed children in 90% of the units. The survey shows that there is a real need for formalization of allergy risk screening and prevention of allergy in newborns in French maternity units.
Abstract:
Capsules were prepared from a chitosan (QTS)-poly(vinyl alcohol) (PVA) blend by saline coacervation followed by formalization. An adsorbent based on chitosan, insoluble in acid solution, was thus obtained. The morphology and average diameters of the QTS/PVA capsules and of their pores were studied using scanning electron microscopy. The entrapment-adsorption of dimethylglyoxime and ethylenediaminetetraacetate by the capsules was studied. The removal of nickel(II) and copper(II) ions was more effective than with unloaded capsules.
Abstract:
As a discipline, logic arguably consists of two main sub-projects: formal theories of argument validity on the basis of a small number of patterns, and theories of how to reduce the multiplicity of arguments in non-logical, informal contexts to the small number of patterns whose validity is systematically studied (i.e., theories of formalization). Regrettably, we now tend to view logic 'proper' exclusively as what falls under the first sub-project, to the neglect of the second, equally important sub-project. In this paper, I discuss two historical theories of argument formalization: Aristotle's syllogistic theory as presented in the "Prior Analytics", and medieval theories of supposition. Both illustrate this two-fold nature of logic, containing in particular illuminating reflections on how to formalize arguments (i.e., the second sub-project). In both cases, the formal methods employed differ from the usual modern technique of translating an argument in ordinary language into a specially designed symbolism, a formal language. The upshot is thus a plea for a broader conceptualization of what it means to formalize.
Abstract:
In this article I intend to show that certain aspects of A.N. Whitehead's philosophy of organism, and especially his epochal theory of time as mainly expounded in his well-known work Process and Reality, can serve to clarify the underlying assumptions that shape nonstandard mathematical theories as such and also as metatheories of quantum mechanics. Concerning the latter issue, I point to an already significant body of research on nonstandard versions of quantum mechanics; two of these approaches are chosen to be critically presented in relation to the scope of this work. The main point of the paper is that, insofar as we can refer a nonstandard mathematical entity to a kind of axiomatic formalization essentially 'codifying' an underlying mental process indescribable as such by analytic means, we can possibly apply certain principles of Whitehead's metaphysical scheme focused on the key notion of process, which is generally conceived as the becoming of actual entities. This is done in the sense of a unifying approach providing an interpretation of nonstandard mathematical theories as such and also, in their metatheoretical status, as a formalization of the empirical-experimental context of quantum mechanics.
Abstract:
In this article we analyze the key concept of Hilbert's axiomatic method, namely that of an axiom. We will find two different concepts: the first one from the period of Hilbert's foundation of geometry and the second one from the time of the development of his proof theory. Both conceptions are linked to two different notions of intuition and show how Hilbert's ideas are far from a purely formalist conception of mathematics. The principal thesis of this article is that one of the main problems Hilbert encountered in his foundational studies consisted in securing a link between formalization and intuition. We will also analyze a related problem, which we will call "Frege's Problem", from the time of the foundation of geometry, and investigate the role of the Axiom of Completeness in its solution.
Abstract:
The main aim of this Master's thesis has been to examine what kinds of challenges organizations face when developing their operations. The context of the study is institutionalization, which refers to the formalization of operations. The study was conducted as qualitative research, and semi-structured interviews were used as the data-collection method. Through the interviews, the study examined the challenges of change processes among Finnish municipal-sector actors. The results showed that the challenges faced during organizational change are highly varied. These challenges are, however, of such a nature that they can be met in ways that help carry the change through more successfully. In particular, taking employees into account as part of the change process was emphasized in this study. The role of sociotechnical systems thinking was also highlighted as an excellent means of meeting the challenges of change.