940 results for "Genesis of text"
Abstract:
Mr. Kubon's project was inspired by the growing need for an automatic syntactic analyser (parser) of Czech that could be used in the syntactic processing of large amounts of text. Mr. Kubon notes that such a tool would be very useful, especially in the field of corpus linguistics, where creating a large-scale "tree bank" (a collection of syntactic representations of natural language sentences) is a very important step towards investigating the properties of a given language. The work involved in syntactically parsing a whole corpus in order to get a representative set of syntactic structures would be almost inconceivable without the help of some kind of robust (semi-)automatic parser. The need for the parser to be robust increases with the size of the linguistic data in the corpus or in any other kind of text to be parsed. Practical experience shows that apart from syntactically correct sentences, there are many sentences which contain a "real" grammatical error. These sentences may be corrected in small-scale texts, but not generally in a whole corpus. In order to complete the overall project, it was necessary to address a number of smaller problems. These were: 1. the adaptation of a suitable formalism able to describe the formal grammar of the system; 2. the definition of the structure of the system's dictionary containing all relevant lexico-syntactic information; 3. the development of a formal grammar able to robustly parse Czech sentences from the test suite; 4. filling the syntactic dictionary with sample data allowing the system to be tested and debugged during its development (about 1000 words); 5. the development of a set of sample sentences containing a reasonable number of grammatical and ungrammatical phenomena covering some of the most typical syntactic constructions used in Czech. Number 3, building a formal grammar, was the main task of the project.
The grammar is of course far from complete (Mr. Kubon notes that it is debatable whether any formal grammar describing a natural language can ever be complete), but it covers the most frequent syntactic phenomena, allowing for the representation of the syntactic structure of simple clauses and also of certain types of complex sentences. The stress was not so much on building a wide-coverage grammar as on the description and demonstration of a method. This method uses an approach similar to that of grammar-based grammar checking. The problem of reconstructing the "correct" form of the syntactic representation of a sentence is closely related to the problem of localising and identifying syntactic errors: without precise knowledge of the nature and location of syntactic errors, it is not possible to build a reliable estimate of a "correct" syntactic tree. The incremental way of building the grammar used in this project is also an important methodological point. Experience from previous projects showed that building a grammar as one huge block of metarules is more complicated than the incremental method, which begins with metarules covering the most common syntactic phenomena and adds less important ones later; the difference is especially apparent when testing and debugging the grammar. The sample syntactic dictionary containing lexico-syntactic information (task 4) now has slightly more than 1000 lexical items representing all word classes. During the creation of the dictionary it turned out that assigning complete and correct lexico-syntactic information to verbs is a very complicated and time-consuming process which would itself be worth a separate project. The final task undertaken in this project was the development of a method allowing effective testing and debugging of the grammar during its development.
The consistency of new and modified rules of a formal grammar with the rules already in place is one of the crucial problems of every project aiming at a large-scale formal grammar of a natural language. The method developed here allows for the detection of any discrepancy or inconsistency of the grammar with respect to a test-bed of sentences containing all syntactic phenomena covered by the grammar. This is not only the first robust parser of Czech, but also one of the first robust parsers of a Slavic language. Since Slavic languages share a wide range of common features, it is reasonable to claim that this system may serve as a pattern for similar systems for other languages. To transfer the system to another language, it is only necessary to revise the grammar and to change the data contained in the dictionary (but not necessarily the structure of the primary lexico-syntactic information). The formalism and methods used in this project can thus be applied to other Slavic languages without substantial changes.
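The test-bed consistency check described above can be sketched as a small regression test: after every grammar change, each sentence with a known grammaticality status is reparsed and any disagreement is reported. In the sketch below, `parse` is only a toy stand-in for the real parser, and the lexicon and (pseudo-Czech) sentences are invented for illustration.

```python
def parse(sentence: str, grammar: set[str]) -> bool:
    """Toy stand-in for the parser: 'parses' iff every word is in the lexicon."""
    return all(word in grammar for word in sentence.split())

# Test-bed: sentences with their recorded grammaticality status.
TEST_BED = [
    ("matka vidi dite", True),    # sample grammatical sentence
    ("matka vidi xyzzy", False),  # sentence with a deliberate error
]

def check_consistency(grammar: set[str]) -> list[str]:
    """Report every test-bed sentence whose parse outcome disagrees
    with its recorded status, i.e. a discrepancy in the grammar."""
    report = []
    for sentence, expected in TEST_BED:
        got = parse(sentence, grammar)
        if got != expected:
            report.append(f"DISCREPANCY: {sentence!r} expected {expected}, got {got}")
    return report

grammar_v1 = {"matka", "vidi", "dite"}
print(check_consistency(grammar_v1))  # [] -- grammar consistent with the test-bed
```

Run after each incremental change to the grammar, the check catches a newly introduced rule that breaks coverage of previously handled phenomena, which is the methodological point the project makes.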
Abstract:
Ms. Net wanted to find out if there was what she terms a "collective identity of the intelligentsia" of Romania and France between 1945 and 1989. She conducted her research on a corpus of memoirs from both cultures, and in the process uncovered some fundamental differences, which she presented in the form of a 178-page manuscript in English, and also on disc. One of the most basic appears to be that French memorialists rarely deal with social, historical and political changes and events. Ms. Net regards these writers as shutting their eyes to reality and attempting to preserve the past. They are interested in their personal history and in the genesis of their own works. According to Ms. Net, this tendency is so marked that she doubts whether 20th-century French writers share the dominant mentalities of their times. In her opinion, all this points to the fact that the French intelligentsia are "trying hard to preserve their cultural hegemony", a task which she maintains has always been an essential aspect of the identity of the French intellectual. In Romania, of course, the situation was very different. To take an example: many Romanian memoirs speak about the campaigns to improve the lot of women, while at the same time recognising and analysing the way that this was simply a "cover" for promoting the most incompetent people, men and women alike. They also express frustration at the way access to information was blocked because the media were government-controlled. Ms. Net concludes, eventually, that, in general, intellectuals, more than any other group in society, ensure the continuity of the dominant mentalities in a given cultural space. Consequently, she feels, we must revise the idea - or myth, as she calls it - that intellectuals represent the avant-garde in a given society. Specifically, she concludes that petty-bourgeois, patriarchal and elitist mentalities are still prevalent in France.
The truth is, she reflects, that intellectuals are always true to their nature, no matter when and where they are living.
Abstract:
A black and white German Holstein calf displayed a complex double malformation in the shape of a thoracopagus parasiticus. By means of a molecular genetic investigation, the genesis of the malformation from a single zygote could be demonstrated. Both vertebral columns showed a pronounced lordosis, with the vertebral column of one animal ending in a rudimentary head. Close to this rudiment, two derivatives of branchial arches were found. The two thoracic cavities merged into one "thorax". In the shared thoracic cavity one heart was found. In its right atrium, a cherry-sized structure was found in which cardiac muscle and vascular smooth muscle were demonstrated histologically. The aorta split shortly after its origin to provide each animal with an aorta of its own. The larger pair of lungs was connected with a trachea leading to the head, while the smaller pair of lungs originated from a trachea deriving from the rudimentary head. The diaphragm jejunum and split afterwards. The pedigree of the affected animal showed neither inbreeding nor any other affected animal.
Abstract:
OBJECTIVES: To assess the correlations between the hormone leptin and lipoatrophy in HIV-positive, treatment-naive patients on combination antiretroviral therapy (cART). DESIGN: Case-control study nested in a multicentre cohort of HIV-infected adults. Cases were patients who developed lipoatrophy; controls were those who did not. PATIENTS AND METHODS: Clinical parameters and plasma leptin determinations were studied in 97 HIV-1-infected, treatment-naive Caucasian men (10 cases and 87 controls) on an unchanged and virologically successful drug regimen with a zidovudine/lamivudine backbone at baseline and after 2 years of cART. The association between plasma leptin levels and the development of lipoatrophy was investigated. RESULTS: Two years of cART was not associated with a change in plasma leptin levels. Plasma leptin levels remained sensitive to changes in body mass index. There was no difference in leptin levels between patients who developed lipoatrophy and controls, either before or after cART. The only predictor of the development of lipoatrophy was higher age (P = 0.02). CONCLUSIONS: Leptin as measured in plasma is unlikely to play a major role in the genesis of lipoatrophy.
Abstract:
The hypothalamo-pituitary-adrenal axis shows functional changes in alcoholics, with raised glucocorticoid release during alcohol intake and during the initial phase of alcohol withdrawal. Raised glucocorticoid concentrations are known to cause neuronal damage after withdrawal from chronic alcohol consumption and in other conditions. The hypothesis for these studies was that chronic alcohol treatment would have differential effects on corticosterone concentrations in plasma and in brain regions. Effects of chronic alcohol and withdrawal on regional brain corticosterone concentrations were examined using a range of standard chronic alcohol treatments in two strains of mice and in rats. Corticosterone was measured by radioimmunoassay, and the identity of the corticosterone extracted from brain was verified by high-performance liquid chromatography and mass spectrometry. Withdrawal from long-term (3 weeks to 8 months) alcohol consumption induced prolonged increases in glucocorticoid concentrations in specific regions of rodent brain, while plasma concentrations remained unchanged. This effect was seen after alcohol administration via drinking fluid or by liquid diet, in both mice and rats and in both sexes. Shorter alcohol treatments did not show the selective effect on brain glucocorticoid levels. During alcohol consumption, the regional brain corticosterone concentrations paralleled the plasma concentrations. Type II glucocorticoid receptor availability in prefrontal cortex was decreased after withdrawal from chronic alcohol consumption, and nuclear localization of glucocorticoid receptors was increased, a pattern that would be predicted from enhanced glucocorticoid type II receptor activation. This novel observation of prolonged selective increases in brain glucocorticoid activity could explain important consequences of long-term alcohol consumption, including memory loss, dependence and lack of hypothalamo-pituitary responsiveness.
Local changes in brain glucocorticoid levels may also need to be considered in the genesis of other mental disorders and could form a potential new therapeutic target.
Abstract:
Mediastinal mass syndrome remains an anaesthetic challenge that cannot be underestimated. Depending on the localization and the size of the mediastinal tumour, the clinical presentation is variable, ranging from a complete lack of symptoms to severe cardiorespiratory problems. The administration of general anaesthesia can be associated with acute intraoperative or postoperative cardiorespiratory decompensation that may result in death due to tumour-related compression syndromes. The role of the anaesthesiologist, as part of the interdisciplinary treatment team, is to ensure a safe perioperative period. However, there is still no structured protocol available for perioperative anaesthesiological management. The aim of this article is to summarize the genesis of and the diagnostic options for mediastinal mass syndrome and to provide a solid, detailed methodology for its safe perioperative management based on a review of the latest literature and our own clinical experience. Proper anaesthetic management of patients with mediastinal mass syndrome begins with an assessment of the preoperative status, directed foremost at establishing the localization of the tumour and, on the basis of the clinical and radiological findings, discerning whether any vital mediastinal structures are affected. We have found it helpful to assign a 'severity grade' (using a three-grade clinical classification scale: 'safe', 'uncertain', 'unsafe'), whereby each grade triggers appropriate action in terms of staffing and apparatus, such as the provision of alternatives for airway management, cardiopulmonary bypass and additional specialists. During the preoperative period, we are guided by a 12-point plan that also takes into account the special features of transportation into the operating theatre and patient monitoring.
Tumour compression on the airways or the great vessels may create a critical respiratory and/or haemodynamic situation, and therefore the standard of intraoperative management includes induction of anaesthesia in the operating theatre on an adjustable surgical table, the use of short-acting anaesthetics, avoidance of muscle relaxants and maintenance of spontaneous respiration. In the case of severe clinical symptoms and large mediastinal tumours, we consider it absolutely essential to cannulate the femoral vessels preoperatively under local anaesthesia and to provide for the availability of cardiopulmonary bypass in the operating theatre, should extracorporeal circulation become necessary. The benefits of establishing vascular access under local anaesthesia clearly outweigh any associated degree of patient discomfort. In the case of patients classified as 'safe' or 'uncertain', a preoperative consensus with the surgeons should be reached as to the anaesthetic approach and the management of possible complications.
Abstract:
The article discusses the function of an accompanying discourse in relation to the genesis of human practical action. On the one hand, theory cannot be taken as the ground for practical action; practical action is not a realisation of intentions. On the other hand, human practical action is accompanied by a series of explanations, justifications, declarations of intent, pre- and post-rationalisations, motivations, etc. These accompanying discourses seem in one way or another to be necessary for the actual realisation of human practical action. Following Pierre Bourdieu, it is suggested that an accompanying discourse cannot meaningfully be separated from human practical action, that practical theory should be regarded not as theory but as part of practice, and that practical theory first of all provides a common language for talking about practice and hence for reproducing a fundamentally arbitrary idea of the genesis of human practical action. Parallels are drawn to the education/formal training of semi-professionals.
Abstract:
Research and professional practices share the joint aim of restructuring preconceived notions of reality. Both seek to gain an understanding of social reality. Social workers use their professional competence to grasp the reality of their clients, while researchers seek to unlock the secrets of their research material. Development and research are now so intertwined and inherent in almost all professional practices that making distinctions between practising, developing and researching has become difficult and in many respects irrelevant. Moving towards research-based practices is possible, and it is easily accomplished within the framework of the qualitative research approach (Dominelli 2005, 235; Humphries 2005, 280). Social work can be understood as acts and speech acts crisscrossing between social workers and clients. When trying to catch the verbal and non-verbal hints in each other's behaviour, the actors have to make a lot of interpretations in a more or less uncertain mental landscape. Our point of departure is the idea that the study of social work practices requires tools which effectively reveal the internal complexity of social work (see, for example, Adams & Dominelli & Payne 2005, 294-295). The boom in qualitative research methodologies in recent decades is associated with a profound rupture in the humanities known as the linguistic turn (Rorty 1967). The idea that language does not transparently mediate our perceptions and thoughts about reality but, on the contrary, constitutes it was new and even confusing to many social scientists. Nowadays we have become used to reading research reports which apply different branches of discourse analysis or narratological or semiotic approaches. Although the differences between those orientations are subtle, they share the idea of the predominance of language.
Despite the lively research work in today's social work and the research-minded atmosphere of social work practice, semiotics has rarely been applied in social work research. However, social work as a communicative practice concerns symbols, metaphors and all kinds of representative structures of language. Those items are at the core of semiotics: the science of signs, which examines how people use signs in their mutual interaction and in their endeavours to make sense of the world they live in, their semiosis. When thinking of the practice of social work and of researching it, a number of interpretational levels have to be passed before the research phase is reached. First of all, social workers have to interpret their clients' situations, and these interpretations are recorded in the files. In some very rare cases those past situations will be reflected on in discussions or perhaps interviews, or put under the scrutiny of some researcher in the future. Each and every new observation adds its own flavour to the mixture of meanings. Social workers combine their observations with previous experience and professional knowledge; furthermore, the situation at hand also influences their reactions. In addition, the interpretations made by social workers over the course of their daily working routines are never limited to being part of the personal process of the social worker, but are also always inherently cultural. The work aiming at social change is defined by the presence of an initial situation, a specific goal, and the means and ways of achieving it, which are - or which should be - agreed upon by the social worker and the client in a situation which is unique and at the same time socially driven. Because of this inherent plot-based nature of social work, the practices related to it can be analysed as stories (see Dominelli 2005, 234), given, of course, that they are signifying and told by someone.
Research on these practices concentrates on impressions, perceptions, judgements, accounts, documents, etc. All these multifarious elements can be scrutinized as textual corpora, but not as just any textual material. In semiotic analysis, the material studied is characterised as verbal or textual and loaded with meanings. We present a contribution to research methodology, semiotic analysis, which to our mind has at least implicit relevance to social work practices. Our examples of semiotic interpretation are drawn from our dissertations (Laine 2005; Saurama 2002). The data are official documents from the archives of a child welfare agency and transcriptions of interviews with shelter employees. These data can be defined as stories told by the social workers about what they have seen and felt. The official documents present only fragments, and they are often written in the passive voice (Saurama 2002, 70). The interviews carried out in the shelters can be described as stories whose narrators are more familiar and known. The material is characterised by the interaction between interviewer and interviewee. The levels of the story and of the telling of the story become apparent when interviews or documents are examined with semiotic tools. The roots of semiotic interpretation can be found in three different branches: American pragmatism, Saussurean linguistics in Paris, and the so-called formalism of Moscow and Tartu; in this paper, however, we engage with the so-called Parisian School of semiology, whose prominent figure was A. J. Greimas. The Finnish sociologists Pekka Sulkunen and Jukka Törrönen (1997a; 1997b) have further developed Greimas's ideas in their studies on socio-semiotics, and we lean on their work.
In semiotics, social reality is conceived as a relationship between subjects, observations, and interpretations, and it is seen as mediated by natural language, the most common sign system among human beings (Mounin 1985; de Saussure 2006; Sebeok 1986). Signification is the act of associating an abstract concept (the signified) with some physical instrument (the signifier). These two elements together form the basic unit, the "sign", which never constitutes any kind of meaning alone. Meaning arises in a process of distinction in which signs are related to other signs, and in this chain of signs meaning diverges from reality. (Greimas 1980, 28; Potter 1996, 70; de Saussure 2006, 46-48.) One interpretative tool is to think of speech as a surface beneath which deep structures - i.e. values and norms - exist (Greimas & Courtes 1982; Greimas 1987). To our mind, semiotics is very much about playing with two different levels of text: the syntagmatic surface, which is more or less faithful to the grammar, and the paradigmatic, semantic structure of values and norms hidden in the deeper meanings of interpretations. Semiotic analysis deals precisely with the level of meaning which exists beneath the surface, but the only way to reach those meanings is through the textual level, the written or spoken text; that is why the tools are needed. In our studies, we have used the semiotic square and actant analysis. The former is based on distinctions and the categorisation of meanings, the latter on opening up the plots of narratives in order to reach their value structures.
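The semiotic square mentioned above can be made concrete as a small data structure holding the four terms and their standard Greimasian relations (contrariety, contradiction, implication). The example below uses the classic deontic square (obligatory/forbidden/optional/permitted) purely for illustration; the terms are not taken from the studies cited in this abstract.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SemioticSquare:
    s1: str        # first term
    s2: str        # contrary of s1
    not_s1: str    # contradictory of s1
    not_s2: str    # contradictory of s2

    def relations(self) -> dict:
        """The pairwise relations that make up the square."""
        return {
            "contrariety": (self.s1, self.s2),
            "contradiction_1": (self.s1, self.not_s1),
            "contradiction_2": (self.s2, self.not_s2),
            "implication_1": (self.not_s2, self.s1),
            "implication_2": (self.not_s1, self.s2),
        }

# Classic deontic square, used here only as a worked example.
square = SemioticSquare(s1="obligatory", s2="forbidden",
                        not_s1="optional", not_s2="permitted")
print(square.relations()["contrariety"])  # ('obligatory', 'forbidden')
```

In an analysis, the four slots would be filled with the value oppositions recovered from the interview or document material, and the relations then guide the categorisation of meanings.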
Abstract:
Non-verbal communication (NVC) is considered to represent more than 90 percent of everyday communication. In virtual worlds, this important aspect of interaction between virtual humans (VHs) is strongly neglected. This paper presents a user-test study demonstrating the impact of automatically generated, graphics-based NVC expression on dialog quality: first, we wanted to compare impassive and emotional facial expression simulation for their impact on chatting; second, we wanted to see whether people like chatting within a 3D graphical environment. Our model only proposes facial expressions and head movements induced from spontaneous chatting between VHs. Only subtle facial expressions are used as nonverbal cues, i.e. those related to the emotional model. Motion-capture animations related to hand gestures, such as cleaning glasses, were used at random to make the virtual human lively. After briefly introducing the technical architecture of the 3D-chatting system, we focus on two aspects of chatting through VHs. First, what is the influence of facial expressions that are induced from text dialog? For this purpose, we exploited a previously developed emotion engine that extracts emotional content from a text and depicts it in a virtual character [GAS11]. Second, as our goal was not the automatic generation of text, we compared the impact of nonverbal cues in conversation with a chatbot and with a human operator using a wizard-of-oz approach. Among the main results, the within-group study, involving 40 subjects, suggests that subtle facial expressions have a significant impact not only on the quality of experience but also on dialog understanding.
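The kind of text-to-expression mapping such an emotion engine performs can be sketched very simply: scan the utterance for emotion-bearing words and select a matching subtle facial expression. The lexicon and expression names below are invented for illustration; the actual engine is the one described in [GAS11].

```python
import re

# Illustrative (not the engine's actual) emotion lexicon and expression map.
EMOTION_LEXICON = {"great": "joy", "sorry": "sadness", "wow": "surprise"}
EXPRESSION = {
    "joy": "smile",
    "sadness": "frown",
    "surprise": "raised brows",
    "neutral": "impassive",
}

def facial_expression(utterance: str) -> str:
    """Pick a subtle facial expression from the first emotion word found;
    fall back to the impassive (neutral) face, as in the control condition."""
    for word in re.findall(r"[a-z]+", utterance.lower()):
        if word in EMOTION_LEXICON:
            return EXPRESSION[EMOTION_LEXICON[word]]
    return EXPRESSION["neutral"]

print(facial_expression("Wow, that is great news"))  # raised brows
print(facial_expression("See you tomorrow"))         # impassive
```

The comparison in the study amounts to running the chat once with this mapping active and once with the fallback ("impassive") forced for every utterance.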
Abstract:
Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands particularly to gain from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O'Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008, among many others). Although there is great value in computationally assisted stemmatology, providing as it does a reproducible result and access to the relevant methodological progress in related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state of the art effectively forces scholars to choose between a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach, and the weighted phylogenetic approach) and making no judgment at all (the unweighted phylogenetic approach). Some basis for judging the significance of variation is sorely needed for medieval text criticism in particular. By this, we mean that there is a need for an empirical statistical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have differed from the practices of copying the Lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analysing one or more stemma hypotheses against the variation model. We apply this method to three 'artificial traditions' (i.e. texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced to varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate here some of the wide variety of calculations that can be made using our model. Certain of our results call sharply into question the utility of excluding 'trivial' variation such as orthographic and spelling changes from stemmatic analysis.
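One basic calculation of this kind can be sketched as follows: a variant location agrees with a stemma hypothesis when every reading's set of witnesses forms a connected subgraph of the stemma, i.e. the reading need only have arisen once. The stemma, witnesses, and readings below are invented, and the sketch simplifies by ignoring hypothetical (lost) intermediate nodes.

```python
from collections import deque

# Hypothetical stemma as an undirected adjacency map ("A" is the archetype).
stemma = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "E"],
    "D": ["B"],
    "E": ["C"],
}

def is_connected(nodes, graph):
    """BFS restricted to `nodes`: True iff they induce one connected subgraph."""
    nodes = set(nodes)
    if not nodes:
        return True
    start = next(iter(nodes))
    seen, queue = {start}, deque([start])
    while queue:
        for nb in graph[queue.popleft()]:
            if nb in nodes and nb not in seen:
                seen.add(nb)
                queue.append(nb)
    return seen == nodes

def genealogical(location, graph):
    """A variant location is genealogical on the stemma if each reading's
    witness group is connected (no coincident emergence needed)."""
    groups = {}
    for witness, reading in location.items():
        groups.setdefault(reading, set()).add(witness)
    return all(is_connected(g, graph) for g in groups.values())

# One variant location: the reading each witness carries.
loc1 = {"A": "sun", "B": "sun", "C": "son", "D": "sun", "E": "son"}
loc2 = {"A": "sun", "B": "son", "C": "son", "D": "sun", "E": "sun"}
print(genealogical(loc1, stemma))  # True  -- fits the stemma
print(genealogical(loc2, stemma))  # False -- requires coincident change
```

Tallying this verdict per variant type (substantive vs. orthographic, say) over a whole tradition yields exactly the kind of empirical profile of text-genealogical significance called for above.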
Abstract:
Peripheral neuropathic pain is a disabling condition resulting from nerve injury. It is characterized by the dysregulation of voltage-gated sodium channels (Na(v)s) expressed in dorsal root ganglion (DRG) sensory neurons. The mechanisms underlying the altered expression of Na(v)s remain unknown. This study investigated the role of the E3 ubiquitin ligase NEDD4-2, which is known to ubiquitylate Na(v)s, in the pathogenesis of neuropathic pain in mice. The spared nerve injury (SNI) model of traumatic nerve injury-induced neuropathic pain was used, and an Na(v)1.7-specific inhibitor, ProTxII, allowed the isolation of Na(v)1.7-mediated currents. SNI decreased NEDD4-2 expression in DRG cells and increased the amplitude of Na(v)1.7 and Na(v)1.8 currents. The redistribution of Na(v)1.7 channels toward peripheral axons was also observed. Similar changes were observed in the nociceptive DRG neurons of Nedd4L knockout mice (SNS-Nedd4L(-/-)). SNS-Nedd4L(-/-) mice exhibited thermal hypersensitivity and an enhanced second pain phase after formalin injection. Restoration of NEDD4-2 expression in DRG neurons using recombinant adeno-associated virus (rAAV2/6) not only reduced Na(v)1.7 and Na(v)1.8 current amplitudes, but also alleviated SNI-induced mechanical allodynia. These findings demonstrate that NEDD4-2 is a potent posttranslational regulator of Na(v)s and that downregulation of NEDD4-2 leads to the hyperexcitability of DRG neurons and contributes to the genesis of pathological pain.
Abstract:
Autophagy, a fundamental cellular catabolic process, is involved in the development of numerous diseases including cancer. Autophagy seems to have an ambivalent impact on tumor development. While increasing evidence indicates a cytoprotective role for autophagy that can contribute to resistance against chemotherapy and even against the adverse, hypoxic environment of established tumors, relatively few publications focus on the role of autophagy in early tumorigenesis. However, the consensus is that autophagy is inhibitory for the genesis of tumors. To understand this apparent contradiction, more detailed information about the roles of the individual participants in autophagy is needed. This review will address this topic with respect to autophagy-related protein 5 (ATG5), which in several lines of investigation has been ascribed special significance in the autophagic pathway. Furthermore, it was recently shown that an ATG5 deficiency in melanocytes interferes with oncogene-induced senescence, thus promoting melanoma tumorigenesis. Similarly, an ATG5 deficiency resulted in tumors of the lung and liver in experimental mouse models. Taken together, these findings indicate that ATG5 and the autophagy to which it contributes are essential gatekeepers restricting early tumorigenesis in multiple tissues.
Abstract:
Previous owner: Leonhardstift Frankfurt am Main;
Abstract:
In this paper, we describe NewsCATS (news categorization and trading system), a system implemented to predict stock price trends for the time immediately after the publication of press releases. NewsCATS consists of three main components. The first component retrieves relevant information from press releases through the application of text preprocessing techniques. The second component sorts the press releases into predefined categories. Finally, the third component derives appropriate trading strategies on the basis of the earlier categorization. The findings indicate that a categorization of press releases is able to provide additional information that can be used to forecast stock price trends, but that an adequate trading strategy is essential for the results of the categorization to be fully exploited.
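The three-component pipeline described above can be sketched end-to-end in a few lines. The keyword lists, category names, and signals below are invented placeholders, not the categories or strategies NewsCATS actually uses, and the real system's categorizer is far more sophisticated than this keyword count.

```python
import re

# Illustrative keyword lists standing in for the trained categorizer.
GOOD = {"record", "exceeds", "raises", "profit"}
BAD = {"lawsuit", "recall", "cuts", "loss"}

def preprocess(release: str) -> list:
    """Component 1: light text preprocessing (lowercase, tokenize)."""
    return re.findall(r"[a-z']+", release.lower())

def categorize(tokens: list) -> str:
    """Component 2: sort the release into a predefined category."""
    score = sum(t in GOOD for t in tokens) - sum(t in BAD for t in tokens)
    return "good news" if score > 0 else "bad news" if score < 0 else "neutral"

def trading_signal(category: str) -> str:
    """Component 3: derive a trading strategy from the category."""
    return {"good news": "buy", "bad news": "sell"}.get(category, "hold")

release = "Acme Corp raises guidance after record quarterly profit"
print(trading_signal(categorize(preprocess(release))))  # buy
```

The separation into three stages is the point: the categorizer can be swapped out (or retrained) without touching the preprocessing or the strategy layer, which matches the paper's finding that the trading strategy matters independently of categorization quality.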