885 results for Pneuma (The Greek word)
Abstract:
The term res publica (literally “thing of the people”) was coined by the Romans to translate the Greek word politeia, which, as we know, referred to a political community organised in accordance with certain principles, amongst which the notion of the “good life” (as against exclusively private interests) was paramount. This ideal also came to be known as political virtue. To achieve it, it was necessary to combine the best of each “constitutional” type and avoid their worst aspects (tyranny, oligarchy and ochlocracy). Hence, the term acquired from the Greeks a sense of being a “mixed” and “balanced” system. Anyone who was entitled to citizenship could participate in the governance of the “public thing”. This implied the institutionalization of open debate and confrontation between interested parties as a way of achieving the consensus necessary to ensure that man the political animal, who fought with words and reason, prevailed over his “natural” counterpart. These premises lie at the heart of the project which is now being presented under the title of Res Publica: Citizenship and Political Representation in Portugal, 1820-1926. The fact that it is integrated into the centenary commemorations of the establishment of the Republic in Portugal is significant, as it was the idea of revolution, with its promise of rupture and change, that inspired it. However, it has also sought to explore events that could be considered precursors of democratization in the history of Portugal, namely the vintista, setembrista and patuleia revolutions. It is true that the republican regime was opposed to the monarchic one. However, although the thesis that monarchy would inevitably lead to tyranny had held sway for centuries, it had also been long believed that the monarchic system could be as “politically virtuous” as a republic (in the strict sense of the word) provided that power was not concentrated in the hands of a single individual. Moreover, various historical experiments had shown that republics could also degenerate into Caesarism and different kinds of despotism. Thus, when absolutism began to be overturned in continental Europe in the name of the natural rights of man and the new social pact theories, initiating the difficult process of (written) constitutionalization, the monarchic principle began to be qualified as a “monarchy hedged by republican institutions”, a situation in which not even the king was exempt from isonomy. This context justifies the time frame chosen here, as it captures the various changes and continuities that run through it. Once the imperative mandate and the reinstatement of the model of corporative representation had been rejected (which did not mean that, in new contexts, this might not be revived, or that the second chamber established by the Constitutional Charter of 1826 might not be given another lease of life), a new power base was invoked: national sovereignty, a precept that would be shared by the monarchic constitutions of 1822 and 1838, and by the republican one of 1911. This followed the French example (manifested in the monarchic constitution of 1791 and in the Spanish constitution of 1812), as not even republicans entertained a tradition of republicanism based upon popular sovereignty. This enables us to better understand the rejection of direct democracy and universal suffrage, and also the long incapacitation (concerning voting and standing for office) of the vast body of “passive” citizens, justified by “enlightened”, property- and gender-based criteria.
Although the republicans had promised in the propaganda phase to alter this situation, they ultimately failed to do so. Indeed, throughout the whole period under analysis, the realisation of the potential of national sovereignty was mediated above all by the individual citizen through his choice of representatives. However, this representation was indirect and took place at national level, in the hope that action would be motivated not by particular local interests but by the common good, as dictated by reason. This was considered the only way for the law to be virtuous, a requirement that was also manifested in the separation and balance of powers. As sovereignty was postulated as single and indivisible, so would be the nation that gave it soul and the State that embodied it. Although these characteristics were common to foreign paradigms of reference, in Portugal, the constitutionalization process also sought to nationalise the idea of Empire. Indeed, this had been the overriding purpose of the 1822 Constitution, and it persisted, even after the loss of Brazil, until decolonization. Then, the dream of a single nation stretching from the Minho to Timor finally came to an end.
Study of the two- and three-dimensional supramolecular association of trigonal oximes and hydrazones
Abstract:
The concepts of supramolecular chemistry can be exploited to advantage to control the structure and properties of molecular materials. In a productive approach, the molecular components of the material can be chosen so that they engage in strong and predictable interactions with their neighbours. This strategy, called molecular tectonics, is characterised by the preparation of special molecules called tectons (from the Greek tectos, meaning builder) which, by rational design, associate in a predictable manner via multiple non-covalent interactions to generate the desired architecture. This process is reversible and is guided by the presence of complementary chemical functions, called recognition groups, which are oriented so as to give the intermolecular interactions a directional character. This makes it possible to position neighbouring molecules in a predetermined way. The constraints imposed by the interactions often run counter to the natural tendency of molecules to form a compact structure, and therefore allow other guest molecules to occupy an appreciable volume in the material without contributing directly to the principal architecture. Applied to crystallisation, this approach can generate porous crystals analogous to zeolites. Hydrogen bonds are a non-covalent interaction of choice in this strategy because they are strong and directional. The exploration of a multitude of chemical functions known to participate in hydrogen bonding has made it possible to create a great diversity of new materials over the evolution of the field of crystal engineering. A classic molecule that illustrates the tectonic strategy well, and that has had a strong impact on the field of supramolecular chemistry, is 1,3,5-benzenetricarboxylic acid, commonly known as trimesic acid. Trimesic acid gives a trigonal orientation to three carboxyl groups, thereby favouring the formation of a hexagonal network held together by hydrogen bonds. We aimed at a modification in which the -COOH groups of trimesic acid are replaced by two other recognition groups so far little exploited in supramolecular chemistry, the oxime and the hydrazone. We report the synthesis and crystallisation of various trioximes and trihydrazones analogous to trimesic acid. The crystals obtained were analysed by X-ray diffraction and their structures were determined. The 2D self-assembly of various trioximes and trihydrazones by adsorption on graphite was also studied by scanning tunnelling microscopy. Our results allow us to compare the 2D and 3D organisation of different analogues of trimesic acid.
Abstract:
Our study concerns the design, synthesis and structural study of supramolecular architectures obtained by self-assembly, based on the concepts of molecular tectonics. This branch of supramolecular chemistry deals with the design and synthesis of organic molecules called tectons, from the Greek tectos, meaning builder. A tecton typically consists of recognition sites grafted onto a well-chosen skeleton. The recognition sites, oriented by the geometry of the skeleton, can take part in intermolecular interactions that are sufficiently strong and directional to guide the topology of the resulting crystal. The strategy adopted uses self-assembly processes involving reversible interactions between the tectons. Self-assembly directed by strong, directional intermolecular interactions is widely used to build materials whose components must be positioned in three dimensions (3D) in a predictable manner. This strategy can also be used to control molecular association in two dimensions (2D), which allows the construction of organised, predetermined monolayers on different types of surfaces, such as graphite. Our work focused on the behaviour of the amide function as a recognition group; it is an analogue of the carboxyl group already used in several previous studies. We studied the behaviour of a series of compounds containing a flat core designed to facilitate adsorption on graphite and modified by the addition of amide groups to promote the formation of hydrogen bonds between the adsorbed molecules. The ability of these compounds to form monolayers organised at the molecular scale in 2D was examined by scanning tunnelling microscopy, and their 3D organisation was also studied by X-ray crystallography. In our study, we systematically varied the molecular geometry and other parameters in order to examine their effects on molecular organisation. Our results suggest that combined 2D and 3D structural analyses are an important asset in the effort to understand the interactions between adsorbed molecules and the effect of their interaction with the surface of the substrate.
Abstract:
Supramolecular chemistry is a field that has attracted growing interest in recent years. It relies on intermolecular interactions to control molecular organisation and thereby modulate the properties of materials. The selection and appropriate positioning of functional groups, used in combination with a particular molecular skeleton, make it possible to anticipate how a molecule will interact with its neighbours. This construction strategy, called molecular tectonics, relies on the design of molecules called tectons (from the Greek word for builder) that can orient themselves in a predictable way through weak interactions and thus generate novel supramolecular architectures. Tectons use the intermolecular forces at their disposal to orient themselves in a predetermined manner and thereby counteract the tendency to pack in the most compact way possible. To do so, tectons are equipped with various functional groups, also called recognition groups, which act as guides during molecular assembly. The choice of the tecton's molecular skeleton is of capital importance, since it must allow an optimal orientation of the recognition groups. The strategy of molecular tectonics, used in conjunction with crystallisation, opens the door to a branch of supramolecular chemistry called crystal engineering. Crystal engineering makes it possible to obtain porous crystalline networks sustained by weak interactions and capable of accommodating guest molecules. Although all weak interactions can be put to use, the hydrogen bond is the predominant interaction in supramolecular crystalline networks. Its strength, directionality and versatility make the hydrogen bond the interaction that, to date, has had the greatest impact in the field of crystal engineering. One recognition group of particular interest in crystal engineering, relying on hydrogen bonds and offering several interaction motifs, is the 2,4-diamino-1,3,5-triazinyl unit. Using this recognition group together with a molecular core shaped like an Onsager cross, which disfavours compact packing, makes it possible to obtain high porosity values, as is the case for 2,2′,7,7′-tetrakis(2,4-diamino-1,3,5-triazin-6-yl)-9,9′-spirobi[9H-fluorene]. We present here an extension of the work carried out on spirobifluorenyl cores by describing the synthesis and structural analysis of molecules with a dispirofluorene-indenofluorenyl unit as the molecular core. This core exhibits the same structural characteristics as spirobifluorene, namely a rigid Onsager-cross topology that disfavours compact packing. We combined the dispirofluorene-indenofluorenyl cores with different recognition groups in order to study the influence of the elongation of the molecular core on the crystalline network, in particular on the volume accessible to guest molecules.
Abstract:
Postgraduate Programme in Human Development and Technologies - IBRC
Abstract:
The virtual is the place where everything begins, the seed of productive imagination, a realm of inaugural impulses and formless pre-existences in which everything coexists, waiting to be differentiated. The virtual is where the first explorations of any act of conception are born, including artistic creation and the architectural project. However, over the last three decades of digital revolution, the term has been used abusively to refer to all kinds of computer-simulated environments, that is, to closed, programmed fictions, controlled by software and its routines, radically actualised, finished, complete, formalised. Paradoxically, the virtual has served to name profoundly anti-virtual constructions. Telematics is giving human beings access to a new kind of everyday unreality sustained by spatial practices that are less and less bound to physics and biology. This phantasmagorical condition of digital dwelling demands new spaces of dialogue between architecture and technology centred on the imaginary. To that end, this thesis proposes, starting from the recovery of the Greek term architectonics, to extend the scope of the discipline beyond edification to the global fact of inhabiting, and, at the same time, to restore to the adjective virtual its authentic preliminary meaning, understanding that true virtual worlds cannot simulate, represent or formalise anything, because they are the infinite and amorphous origin of every world.
Abstract:
Nanotechnology is often regarded as a technological goal that helps research deal with the precise manipulation and control of matter at dimensions between 1 and 100 nanometres. The prefix nano comes from the Greek word νᾶνος, meaning dwarf, and denotes a factor of 10^-9, which, applied to units of length, corresponds to one billionth of a metre. This science makes it possible to work with molecular structures and their atoms, obtaining materials that exhibit physical, chemical and biological phenomena very different from those shown by the same materials at larger scales. In medicine, for example, nanometric compounds and nanostructured materials often offer greater efficacy than traditional chemical formulations: combining established compounds with these new ones creates new therapies, and in some cases replaces them, revealing new diagnostic and therapeutic properties. At the same time, the complexity of information at the nano level is far greater than at conventional biological levels, so any workflow in nanomedicine inherently requires advanced information management strategies. Many nanotechnology researchers are looking for ways to obtain information about these nanometric materials in order to improve their studies, which often leads them to test such methods or to create new compounds that help modern medicine fight powerful diseases such as cancer. Yet it is currently difficult to find a tool that provides the specific information they seek among the thousands of clinical trial records uploaded to the web every day. Biomedical informatics seeks to provide the framework for dealing with these information challenges at the nano level; in this context, the new area of nanoinformatics aims to detect and establish the links between medicine, nanotechnology and informatics, promoting the application of computational methods to solve the questions and problems that arise with information in the broad intersection between biomedicine and nanotechnology.
A further current need is that many biomedical researchers want to examine and compare the information contained in clinical trials involving nanotechnology across registry websites around the world, distinguishing trials registered in North America from those registered in Europe, in order to know whether this field is really being exploited on both continents. The problem is that no tool exists to estimate the proportion of all clinical trials registered on these websites that involve nanotechnology. In this master's thesis, the author uses improved text pre-processing together with an algorithm identified as the best-performing text-processing approach in a doctoral thesis that compared many such methods, in order to obtain a close estimate that helps to distinguish when a clinical trial record contains information about nanotechnology and when it does not. In other words, the work applies an analysis of the scientific literature and of the clinical trial registries available on the two continents to extract relevant information about experiments and results in nanomedicine (textual patterns, shared vocabulary, experiment descriptors, characterisation parameters, etc.), followed by a processing mechanism to structure and analyse that information automatically. The analysis concludes with the estimate mentioned above, which is needed to compare the amount of nanotechnology research on the two continents. A gold-standard reference data set, that is, a set of manually annotated training data, is used, and the test set is the entire database of clinical trial registrations, making it possible to distinguish automatically the studies centred on nano-drugs, nano-devices and nano-methods from those focused on testing traditional pharmaceutical products.
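To illustrate the classification step described above, a minimal sketch is given below. It is not the thesis's actual pre-processing or algorithm: it assumes a hand-annotated gold standard of trial descriptions and uses an off-the-shelf TF-IDF representation with logistic regression from scikit-learn; all record texts, labels and registry dumps shown are hypothetical placeholders.

```python
# Sketch: classify clinical-trial descriptions as nanotechnology-related or not,
# then estimate the share of such trials per registry. Data are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical gold standard: (trial description, 1 if nanotechnology-related, else 0).
gold_standard = [
    ("Liposomal nanoparticle formulation of doxorubicin in solid tumours", 1),
    ("Gold nanoshell mediated photothermal ablation of head and neck lesions", 1),
    ("Iron oxide nanoparticle contrast agent for lymph node imaging", 1),
    ("Phase III trial of a conventional oral antihypertensive tablet", 0),
    ("Standard chemotherapy regimen compared with surgery alone", 0),
    ("Behavioural intervention for smoking cessation in adults", 0),
    # ... many more manually annotated records ...
]
texts, labels = zip(*gold_standard)

# Bag-of-words representation; a stand-in for the enhanced pre-processing
# described in the thesis.
vectorizer = TfidfVectorizer(lowercase=True, stop_words="english", ngram_range=(1, 2))
X = vectorizer.fit_transform(texts)

# Hold out part of the gold standard to check the classifier before applying it.
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0, stratify=labels
)
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))

def nano_share(trial_texts):
    """Estimate the proportion of records classified as nanotechnology-related."""
    if not trial_texts:
        return 0.0
    predictions = clf.predict(vectorizer.transform(trial_texts))
    return float(sum(predictions)) / len(predictions)

# Hypothetical registry dumps, one per continent, used to compare coverage.
north_american_trials = ["..."]  # descriptions harvested from a North American registry
european_trials = ["..."]        # descriptions harvested from a European registry
print("North America:", nano_share(north_american_trials))
print("Europe:", nano_share(european_trials))
```

The per-registry percentages returned by such a classifier are what allow the cross-continental comparison described in the abstract, under the assumption that the gold standard is representative of both registries.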
Abstract:
Part I: A philological dissertation presented, with the consent and authority of the most distinguished order of philosophers in the academy of letters at Münster, for duly obtaining the highest honours in philosophy -- Part II: extracted from "Jahresbericht über das Gymnasium Dionysianum zu Rheine", 1865-66.
Abstract:
While Project Management (PM) is a well-accepted mode of managing organizations, more and more organizations are adopting PM in order to satisfy the diversified needs of application areas within a variety of industries and organizations. Concurrently, the number of PM practitioners, and of people involved at various levels of qualification, is rising vigorously. It is therefore of paramount importance to characterize, define and understand this field and its underlying strengths, foundations and development. For this purpose we draw on the sociology of actor-networks and on qualitative scientometrics, which lead to the choice of the co-word analysis method as a way of capturing the project management field and its dynamics. Results of a study based on the analysis of the EBSCO Business Source Premier database are presented and some future trends and scenarios proposed. The following main trends are confirmed, in line with previous studies: continuing interest in the “cost engineering” aspects, ongoing interest in economic aspects and contracts, how to deal with various project types (categorizations), and integration with Supply Chain Management and with Learning and Knowledge Management. Beyond these continuing trends, new areas of interest can be noted: the link between strategy and projects, Governance, the importance of maturity (organizational performance, metrics and control) and Change Management. We see the actors (professional bodies, governmental bodies, agencies, universities, industries, researchers and practitioners) reinforcing their competing/cooperative strategies in the development of standards and certifications and moving towards more “business oriented” relationships with their members and main stakeholders (governments, institutions such as the European Community, industries, agencies, NGOs…), at least at the central level.
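As a rough illustration of the co-word analysis step mentioned above, the sketch below builds a keyword co-occurrence count from a handful of invented bibliographic records standing in for EBSCO Business Source Premier entries; keyword pairs that co-occur frequently point to the thematic clusters (for example “cost engineering” or “governance”) whose evolution such a study traces.

```python
# Sketch of co-word analysis: count how often pairs of indexing keywords
# appear together in the same bibliographic record. Records are invented.
from collections import Counter
from itertools import combinations

records = [
    {"year": 2003, "keywords": ["cost engineering", "contracts", "project types"]},
    {"year": 2006, "keywords": ["supply chain management", "knowledge management", "project types"]},
    {"year": 2008, "keywords": ["governance", "strategy", "maturity"]},
    {"year": 2009, "keywords": ["governance", "change management", "strategy"]},
]

# Count each unordered pair of keywords that co-occurs within a record.
cooccurrence = Counter()
for record in records:
    for a, b in combinations(sorted(set(record["keywords"])), 2):
        cooccurrence[(a, b)] += 1

# Strongly co-occurring pairs suggest the thematic clusters of the field;
# repeating the count per time slice shows how themes strengthen or fade.
for (a, b), count in cooccurrence.most_common(5):
    print(f"{a} <-> {b}: {count}")
```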
Abstract:
The thesis consists of five international congress papers and a summary with an introduction. The overarching aim of the studies and the summary is to examine the inner coherency of the theological and anthropological thinking of Gregory of Nyssa (331-395). An "apophatic approach" with a "Christological focus" is applied to the issue. It is suggested that the coherency is to be found in the Christological concept of unity between "true God" and "true man" in the one person of Jesus Christ. Gregory is among the first to give full recognition to the two natures of Christ and to use this recognition systematically in his writings. The aim of the studies is pursued by the method of "identification", a combination of the modern critical "problematic method" and Gregory's own aphairetic method of "following" (akolouthia). The preoccupation with issues relating to the so-called Hellenization of Christianity in the patristic era was strong in twentieth-century Gregory scholarship. The most discussed questions have been the Greek influence on his thought and his philosophical sources. The five articles of the thesis examine how Gregory's thinking stands in its own right. The manifestly apophatic character of his theological thinking is made part of the method of examining his thought according to the principles of his own method of following. The basic issue concerning the relation of theology and anthropology is discussed in the contexts of his central Trinitarian, anthropological, Christological and eschatological sources. In the summary, the Christocentric integration of Gregory's thinking is also discussed in relation to the issue of the alleged Hellenization. The main conclusion of the thesis concerns the concept of theology in Gregory. It is not indebted to the classical concept of theology as metaphysics or human speculation about God. Instead, it is founded on the traditional Judeo-Christian idea of a God who speaks with his people face to face. In Gregory, theologia connotes the oikonomia of God's self-revelation. It may be regarded as the state of constant expression of love between the Creator and his created image. In theology, the human person becomes an image of the Word by which the Father expresses his love to "man", whom he loves as his own Son. Eventually the whole of humankind, as one, gives the divine Word a physical, audible and sensible, Body. Humankind then becomes what theology is. The whole of humanity expresses divine love by manifesting Christ in words and deeds, singing in one voice to the glory of the Father, the Son and the Holy Spirit.
Abstract:
The present paper is an attempt to account for the emergence of the designation “only begotten” in the English Bible, its widespread use in pre-modern versions, and its gradual and almost complete disappearance from most contemporary translations. A close examination of the origins of this designation, traceable to its Latin cognate unigenitus, first introduced into the biblical tradition by St. Jerome to render selected occurrences of the Greek adjective monogenes, reveals a unique theological inspiration behind it. “Only begotten,” recurring in English translation of the Bible for almost six centuries as an important christological title, has recently been replaced by translational solutions reflecting a more accurate understanding of the underlying Greek word.
Abstract:
Carolingian scholars paid considerable attention to the Greek found in Martianus Capella’s De nuptiis Philologiae et Mercurii, a late antique Latin work full of obscurities in language and imagery. This article, focusing on glosses on De nuptiis from the oldest gloss tradition, demonstrates that a range of material was available to ninth-century scholars to elucidate Martianus’s Greek and that Greek seems, at times, to have served as a means to obscure. I argue that their interest in obscurity reflects a widespread epistemology and strategy of concealment, hence their intellectual investment in Martianus. For ninth-century readers, then, the Greek in the glossed Martianus manuscripts, however decorative it may have been, also operated at the core of medieval hermeneutics.
Abstract:
This paper examines the interplay of language-internal continuity and external influence in the cyclical development of the Asia Minor Greek adpositional system. The Modern Greek dialects of Asia Minor inherited an adpositional system of the Late Medieval Greek type whereby secondary adpositions regularly combined with primary adpositions to encode spatial region. Secondary adpositions could originally precede simple adpositions ([PREPOSITION + PREPOSITION + NPACC]) or follow the adpositional complement ([PREPOSITION + NPACC + POSTPOSITION]). Asia Minor Greek replicated the structure of Ottoman Turkish postpositional phrases to resolve this variability, fixing the position of secondary adpositions after the complement and thus developing circumpositions of the type [PREPOSITION + NPACC + POSTPOSITION]. Later, some varieties dropped the primary preposition SE from circumpositional phrases, leaving (secondary) postpositions as the only overt relator ([NPACC + POSTPOSITION]) in some environments. In addition, a number of Turkish postpositions were borrowed wholesale, thus enriching the Greek adpositional inventory.
Abstract:
The paper looks into the dynamics of information society policy and its implementation in the Greek context. It argues that information society development is a contested process, influenced by pre-existing state, economy and society relations. Based on this, it looks into the different aspects of the idiosyncratic path which the evolution of the Greek information society has followed, particularly after 2000. Using Bob Jessop's strategic-relational approach (SRA) to the state as an analytical framework and drawing on a number of in-depth interviews with relevant political actors, it provides insights into policy implementation by examining: the public management of information technology projects, how such projects were received in bureaucratic structures and practices, as well as the relationship between the state and the information and communication technology (ICT) sector in public procurement processes. The emphasis is on the period 2000–2008, during which a major operational programme on the information society in Greece was put into effect. The paper also touches upon the post-2008 experience, suggesting that information society developments might include dynamics operating independently and even in contradiction to the state agenda.