Abstract:
This study of the process of language shift and maintenance in the bilingual community of Romanians living in Hungary was based on 40 tape-recorded Romanian sociolinguistic interviews. These were transcribed into computerised form and provide an excellent source of sociolinguistic, contact-linguistic and discourse-analysis data, making it possible to show the effect of internal and external factors on the bilingual speech mode. The main topics considered were the choice of Romanian and Hungarian in community interactions, factors of language choice, intralanguage and interlanguage code-switching, reasons for code-switching, the relationship between age and the frequency of code-switching in the interview situation, and the unequal competition of minority and majority languages at school.
Abstract:
Mr. Kubon's project was inspired by the growing need for an automatic syntactic analyser (parser) of Czech, which could be used in the syntactic processing of large amounts of text. Mr. Kubon notes that such a tool would be very useful, especially in the field of corpus linguistics, where creating a large-scale "tree bank" (a collection of syntactic representations of natural language sentences) is a very important step towards the investigation of the properties of a given language. The work involved in syntactically parsing a whole corpus in order to obtain a representative set of syntactic structures would be almost inconceivable without the help of some kind of robust (semi)automatic parser. The need for the automatic natural language parser to be robust increases with the size of the linguistic data in the corpus or in any other kind of text which is going to be parsed. Practical experience shows that, apart from syntactically correct sentences, there are many sentences which contain a "real" grammatical error. These sentences may be corrected in small-scale texts, but not generally in a whole corpus. In order to complete the overall project, it was necessary to address a number of smaller problems. These were: 1. the adaptation of a suitable formalism able to describe the formal grammar of the system; 2. the definition of the structure of the system's dictionary containing all relevant lexico-syntactic information, and the development of a formal grammar able to robustly parse Czech sentences from the test suite; 3. filling the syntactic dictionary with sample data allowing the system to be tested and debugged during its development (about 1000 words); 4. the development of a set of sample sentences containing a reasonable number of grammatical and ungrammatical phenomena covering some of the most typical syntactic constructions used in Czech. Of these, building the formal grammar (task 2) was the main task of the project.
The grammar is of course far from complete (Mr. Kubon notes that it is debatable whether any formal grammar describing a natural language may ever be complete), but it covers the most frequent syntactic phenomena, allowing for the representation of the syntactic structure of simple clauses and also the structure of certain types of complex sentences. The stress was not so much on building a wide-coverage grammar as on the description and demonstration of a method. This method uses an approach similar to that of grammar-based grammar checking. The problem of reconstructing the "correct" form of the syntactic representation of a sentence is closely related to the problem of localisation and identification of syntactic errors. Without precise knowledge of the nature and location of syntactic errors it is not possible to build a reliable estimation of a "correct" syntactic tree. The incremental way of building the grammar used in this project is also an important methodological issue. Experience from previous projects showed that building a grammar by creating a huge block of metarules is more complicated than the incremental method, which begins with the metarules covering the most common syntactic phenomena and adds less important ones later, especially from the point of view of testing and debugging the grammar. The sample of the syntactic dictionary containing lexico-syntactic information (task 3) now has slightly more than 1000 lexical items representing all classes of words. During the creation of the dictionary it turned out that the task of assigning complete and correct lexico-syntactic information to verbs is a very complicated and time-consuming process which would itself be worth a separate project. The final task undertaken in this project was the development of a method allowing effective testing and debugging of the grammar during the process of its development.
The problem of the consistency of new and modified rules of the formal grammar with the rules already existing is one of the crucial problems of every project aiming at the development of a large-scale formal grammar of a natural language. This method allows for the detection of any discrepancy or inconsistency of the grammar with respect to a test-bed of sentences containing all syntactic phenomena covered by the grammar. This is not only the first robust parser of Czech, but also one of the first robust parsers of a Slavic language. Since Slavic languages display a wide range of common features, it is reasonable to claim that this system may serve as a pattern for similar systems in other languages. To transfer the system into any other language it is only necessary to revise the grammar and to change the data contained in the dictionary (but not necessarily the structure of primary lexico-syntactic information). The formalism and methods used in this project can be used in other Slavic languages without substantial changes.
Abstract:
Slovenia is considered to be one of the most successful Central and Eastern European countries undergoing the process of transition. It has a high GDP per capita (the highest in the Visegrad group) amounting to about 7200 US dollars (at the exchange rates pertaining during Ms. Stropnik's research). In 1994, a low rate of inflation, a low level of public debt and almost balanced public finances were all positive elements. However, there is a darker side, for instance the dramatic increase in unemployment and the (somewhat less dramatic) fall in production during the transition period. This analysis aimed to provide insights into what is actually happening at the household level, since households are the ultimate bearers of macroeconomic and social change. The final output totalled 166 pages in English and Slovenian, available also on disc. The income concept used by Ms. Stropnik is that of the disposable (monetary) household income, i.e. the cash income of all household members - including social security transfers and family benefits, and net of taxes and social security contributions - plus the equivalent of domestic production used in the household. Non-monetary income sources, such as the household's own production, benefits in kind, subsidies for goods and services, and fringe benefits, were not taken into account. The concept of relative and objective poverty was followed. Poverty means having less than others in society; it is a state of relative deprivation. Objective aspects of the situation, e.g. command over resources (i.e. the household income) and the relative position of the household in the income distribution, determine who is poor and who is not. Changes in household composition - an increase in the number of pensioners, unemployed and self-employed, concomitant with a large decrease in the number of employees - obviously played a part in the changing structure of household income sources during this period.
The overall decrease in the share of wages and salaries from primary employment in 1993 is to be observed in all income deciles. On the other hand, the importance of salaries gained from secondary employment has increased in all deciles. The lower seven deciles experienced a sharp rise in the share of social benefits in the period 1988-1993, mostly because of the increase in the number of persons entitled to claim unemployment benefits. In Slovenia, income inequality increased considerably during the 1988-1993 period. To make matters worse, the large increase in income inequality occurred in a period of falling real incomes. In 1983 the bottom decile disposed of 3.8 per cent and the top decile disposed of 23.4 per cent of total monetary income in Slovenia, whereas by 1993 the same statistics revealed 3.1 per cent and 18.9 per cent respectively. Unemployment greatly increases the risk of living in poverty. In 1993, 35 per cent of all unemployed persons in Slovenia were living in the lowest income quintile. Ms. Stropnik found certain features that were specific to Slovenia and not shared by most countries in transition. For example, the relative income position of pensioners improved. Retirement did not increase the risk of poverty in 1993 as much as it did in 1983 and 1988. Also, it appears that children have not been particularly hard-hit by the transition upheavals. The incidence of poverty amongst children did not increase in the period 1983-1993. Children were also fairly evenly distributed across income quintiles. In 1983, 11.8 per cent of households with children aged 18 or less were poor. In 1993, this figure was 8.4 per cent. On the other hand, poor households with children were, in comparison with other households of the same type, poorer in 1993 than in 1983. Ms. Stropnik also analysed the impact of social transfers. Her conclusion was that the level of social transfers was too low for them to be successful in alleviating poverty.
Family policy transfers (child allowances, child tax allowances, subsidised child care) did, however, contribute to the lowering of income inequality between families with and without children, and amongst families with different numbers of children. Ms. Stropnik is determined that the results of her research be used in the creation of social policy aimed at helping the poor. She quotes Piachaud approvingly: "If the term 'poverty' carries with it the implication and moral imperative that something should be done about it, then the study of poverty is only ultimately justifiable if it influences individual and social attitudes and actions."
Abstract:
The clinical manifestations of anti-cancer drug-associated cardiac side effects are diverse and can range from acutely induced cardiac arrhythmias to Q-T interval prolongation, changes in coronary vasomotion with consecutive myocardial ischemia, myocarditis, pericarditis, severe contractile dysfunction, and potentially fatal heart failure. The pathophysiology of these adverse effects is similarly heterogeneous, and the identification of potential mechanisms is frequently difficult since the majority of cancer patients are not only treated with a multitude of cancer drugs but may also be exposed to potentially cardiotoxic radiation therapy. Some of the targets inhibited by new anti-cancer drugs also appear to be important for the maintenance of cellular homeostasis in normal tissue, in particular during exposure to cytotoxic chemotherapy. Even if acute chemotherapy-induced myocardial damage is only moderate, the process of myocardial remodeling can lead to progressive myocardial dysfunction over years and eventually to heart failure. The tools for diagnosing anti-cancer drug-associated cardiotoxicity and monitoring patients during chemotherapy include invasive and noninvasive techniques as well as laboratory investigations, and are mostly validated only for anthracycline-induced cardiotoxicity and, more recently, for trastuzumab-associated cardiac dysfunction.
Abstract:
Although eosinophils are considered useful in defense mechanisms against parasites, their exact function in innate immunity remains unclear. The aim of this study is to better understand the role of eosinophils within the gastrointestinal immune system. We show here that lipopolysaccharide from Gram-negative bacteria activates interleukin-5 (IL-5)- or interferon-gamma-primed eosinophils to release mitochondrial DNA in a reactive oxygen species-dependent manner, but independent of eosinophil death. Notably, the process of DNA release occurs rapidly, in a catapult-like manner, in less than one second. In the extracellular space, the mitochondrial DNA and the granule proteins form extracellular structures able to bind and kill bacteria both in vitro and under inflammatory conditions in vivo. Moreover, after cecal ligation and puncture, Il5-transgenic but not wild-type mice show intestinal eosinophil infiltration and extracellular DNA deposition in association with protection against microbial sepsis. These data suggest a previously undescribed mechanism of eosinophil-mediated innate immune responses that might be crucial for maintaining the intestinal barrier function after inflammation-associated epithelial cell damage, protecting the host from uncontrolled bacterial invasion.
Abstract:
Femoroacetabular impingement (FAI) is due to an anatomical disproportion between the proximal femur and the acetabulum which causes premature wear of the joint surfaces. An operation is often necessary in order to relieve symptoms such as limited movement and pain, as well as to prevent or slow down the degenerative process. The result depends on the preoperative status of the joint, with poor results for advanced arthritis of the hip joint. This explains the necessity for an accurate diagnosis in order to recognize early stages of damage to the joint. The diagnosis of FAI includes clinical examination, X-ray examination and magnetic resonance imaging (MRI). The standard radiographic examination for FAI is carried out using two X-ray images, an anterior-posterior view of the pelvis and a lateral view of the proximal femur, such as the cross-table lateral or Lauenstein projections. Positioning criteria must be adhered to in order to avoid distortion artifacts. MRI permits an examination of the pelvis in three planes and should also include radially planned sequences for improved representation of peripheral structures, such as the labrum and peripheral cartilage. The use of contrast medium for a direct MR arthrogram has proved to be advantageous, particularly for the representation of labrum damage. The data with respect to cartilage imaging are still unclear. Further developments in technology, such as biochemically sensitive MRI applications, will be able to improve the diagnosis of the hip in the near future.
Abstract:
In this article the use of Learning Management Systems (LMS) at the School of Engineering, University of Borås, in the year 2004 and the academic year 2009-2010 is investigated. The tools in the LMS were classified into four groups (tools for distribution, tools for communication, tools for interaction and tools for course administration) and the pattern of use was analyzed. The preliminary interpretation of the results was discussed with a group of teachers from the School of Engineering with long experience of using LMS. High expectations about LMS as a tool to facilitate flexible education, student-centered methods and the creation of an effective learning environment are abundant in the literature. This study, however, shows that in most of the surveyed courses the available LMS is predominantly used to distribute documents to students. The authors argue that a more elaborate use of LMS and a transformation of pedagogical practices towards social-constructivist, learner-centered procedures should be treated as an integrated process of professional development.
Abstract:
1. In their colonized ranges, exotic plants may be released from some of the herbivores or pathogens of their home ranges, but these can be replaced by novel enemies. It is of basic and practical interest to understand which characteristics of invaded communities control the accumulation of new pests. Key questions are whether the enemy load on exotic species is smaller than on native competitors, as suggested by the enemy release hypothesis (ERH), and whether this difference is most pronounced in resource-rich habitats, as predicted by the resource-enemy release hypothesis (R-ERH). 2. In 72 populations of 12 exotic invasive species, we scored all visible above-ground damage morphotypes caused by herbivores and fungal pathogens. In addition, we quantified levels of leaf herbivory and fruit damage. We then assessed whether variation in damage diversity and levels was explained by habitat fertility, by relatedness between exotic species and the native community, or rather by native species diversity. 3. In a second part of the study, we also tested the ERH and the R-ERH by comparing damage of plants in 28 pairs of co-occurring native and exotic populations, representing nine congeneric pairs of native and exotic species. 4. In the first part of the study, diversity of damage morphotypes and damage levels of exotic populations were greater in resource-rich habitats. Co-occurrence of closely related native species in the community significantly increased the probability of fruit damage. Herbivory on exotics was less likely in communities with high phylogenetic diversity. 5. In the second part of the study, exotic and native congeneric populations incurred similar damage diversity and levels, irrespective of whether they co-occurred in nutrient-poor or nutrient-rich habitats. 6. Synthesis. We identified habitat productivity as a major community factor affecting accumulation of enemy damage by exotic populations.
Similar damage levels in exotic and native congeneric populations, even in species pairs from fertile habitats, suggest that the enemy release hypothesis or the R-ERH cannot always explain the invasiveness of introduced species.
Abstract:
Computed tomography (CT)-based finite element (FE) models of vertebral bodies assess fracture load in vitro better than dual-energy X-ray absorptiometry, but boundary conditions affect the stress distribution under the endplates, which may influence ultimate load and damage localisation under post-yield strains. Therefore, HRpQCT-based homogenised FE models of 12 vertebral bodies were subjected to axial compression with two distinct boundary conditions: embedding in polymethylmethacrylate (PMMA) and bonding to a healthy intervertebral disc (IVD) with distinct hyperelastic properties for nucleus and annulus. Bone volume fraction and fabric assessed from HRpQCT data were used to determine the elastic, plastic and damage behaviour of bone. Ultimate forces obtained with PMMA were 22% higher than with IVD but correlated highly (R2 = 0.99). At ultimate force, distinct fractions of damage were computed in the endplates (PMMA: 6%, IVD: 70%), cortex and trabecular sub-regions, which confirms previous observations that, in contrast to PMMA embedding, failure initiated underneath the nuclei in healthy IVDs. In conclusion, axial loading of vertebral bodies via PMMA embedding rather than a healthy IVD overestimates ultimate load and leads to distinct damage localisation and failure patterns.
Abstract:
Because proliferative vitreoretinopathy cannot be effectively treated, its prevention is indispensable for the success of surgery for retinal detachment. The elaboration of preventive and therapeutic strategies depends upon the identification of patients who are genetically predisposed to develop the disease, as well as upon an understanding of the biological process involved and the role of local factors, such as the status of the uveovascular barrier. Detachment of the retina or vitreous activates glia to release cytokines and ATP, which not only protect the neuroretina but also promote inflammation, retinal ischemia, cell proliferation, and tissue remodeling. The vitreal microenvironment favors cellular de-differentiation and proliferation of cells with nonspecific nutritional requirements. This may render a pharmacological inhibition of their growth difficult without causing damage to the pharmacologically vulnerable neuroretina. Moreover, reattachment of the retina relies upon the local induction of a controlled wound-healing response involving macrophages and proliferating glia. Hence, the functional outcome of proliferative vitreoretinopathy will be determined by the equilibrium established between protective and destructive repair mechanisms, which will be influenced by the location and the degree of damage to the photoreceptor cells that is induced by peri-retinal gliosis.
Abstract:
In order to overcome the limitations of the linear-quadratic model and include synergistic effects of heat and radiation, a novel radiobiological model is proposed. The model is based on a chain of cell populations which are characterized by the number of radiation-induced damages (hits). Cells can shift downward along the chain by collecting hits and upward through a repair process. The repair process is governed by a repair probability which depends upon state variables used for a simplistic description of the impact of heat and radiation upon repair proteins. Based on the parameters used, populations with up to 4-5 hits are relevant for the calculation of survival. The model intuitively describes the mathematical behaviour of apoptotic and nonapoptotic cell death. Linear-quadratic-linear behaviour of the logarithmic cell survival, fractionation, and (with one exception) the dose-rate dependencies are described correctly. The model covers the time-gap dependence of the synergistic cell killing due to combined application of heat and radiation, but further validation of the proposed approach against experimental data is needed. However, the model offers a workbench for testing different biological concepts of damage induction, repair, and statistical approaches for calculating the variables of state.
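The chain-of-populations idea can be sketched numerically: mass flows down the chain at a hit rate proportional to the dose rate and back up at a repair rate. All parameter names and values below are illustrative assumptions, not the proposed model's fitted quantities, and the repair probability is held constant rather than driven by the heat/radiation state variables.

```python
def simulate_hit_chain(dose_rate=1.0, duration=2.0, hit_coeff=0.8,
                       repair_rate=0.5, max_hits=5, dt=0.001):
    """Toy hit-chain model: cells shift down the chain by acquiring
    radiation-induced hits and shift up via repair. All parameters
    are illustrative, not values from the study."""
    pop = [0.0] * (max_hits + 1)
    pop[0] = 1.0                        # whole population starts undamaged
    for _ in range(int(duration / dt)):
        new = pop[:]
        for k in range(max_hits):       # damage flux: k -> k + 1 hits
            flux = hit_coeff * dose_rate * pop[k] * dt
            new[k] -= flux
            new[k + 1] += flux
        for k in range(1, max_hits + 1):  # repair flux: k -> k - 1 hits
            flux = repair_rate * pop[k] * dt
            new[k] -= flux
            new[k - 1] += flux
        pop = new
    surviving = sum(pop[:-1])           # cells holding max_hits counted as killed
    return pop, surviving
```

In this sketch the surviving fraction falls as dose rate or duration grows, which is the qualitative behaviour the hit-chain description implies; reproducing the linear-quadratic-linear survival curve would require the full state-variable repair model.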
Abstract:
In natural hazard research, risk is defined as a function of (1) the probability of occurrence of a hazardous process, and (2) the related extent of damage, defined by the value of the exposed elements at risk and their physical vulnerability. To date, various studies have been undertaken to determine vulnerability values for objects exposed to geomorphic hazards such as mountain torrents. Yet many studies provide only rough estimates of vulnerability values, based on proxies for process intensities. Moreover, the vulnerability functions proposed in the literature show a wide range, in particular with respect to medium and high process magnitudes. In our study, we compare vulnerability functions for torrent processes derived from studies in test sites located in the Austrian Alps and in Taiwan. Based on this comparison, we identify needs for future research in order to enhance mountain hazard risk management, with a particular focus on the question of vulnerability at the catchment scale.
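Read as a formula, this definition multiplies the occurrence probability of the hazardous process by its consequence, i.e. the value of the exposed elements scaled by their physical vulnerability (an expected degree of loss between 0 and 1). A minimal sketch, with all numbers invented for illustration:

```python
def risk(p_occurrence, value_exposed, vulnerability):
    """Risk = probability of the hazardous process x consequence,
    where consequence = value of elements at risk x physical
    vulnerability (expected degree of loss in [0, 1])."""
    if not 0.0 <= vulnerability <= 1.0:
        raise ValueError("vulnerability is a degree of loss in [0, 1]")
    return p_occurrence * value_exposed * vulnerability

# A hypothetical torrent scenario: 1% annual occurrence probability,
# a building worth 500,000 units, vulnerability 0.3 -> expected
# annual loss of about 1,500 units.
annual_loss = risk(0.01, 500_000, 0.3)
```

The wide range of published vulnerability functions matters precisely because, in this product, any error in the vulnerability term propagates linearly into the risk estimate.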
Abstract:
The goal of this study was to investigate the cellular and molecular mechanisms by which glutathione (GSH) is involved in the process of apoptosis induced by cisplatin [cis-diamminedichloroplatinum(II), cis-DDP] in the HL60 human promyelocytic leukemia cell line. The data show that during the onset or induction of apoptosis, GSH levels in cisplatin-treated cells increased 50% compared to control cells. The increase in intracellular GSH was associated with enhanced expression of γ-glutamylcysteine synthetase (γ-GCS), the enzyme that catalyzes the rate-limiting step in the biosynthesis of glutathione. After depletion of intracellular GSH with D,L-buthionine-(S,R)-sulfoximine (BSO), an inhibitor of γ-GCS, biochemical and morphological analysis revealed that the mechanism of cell death had switched from apoptosis to necrosis. In contrast, when intracellular GSH was elevated by exposure of cells to a GSH ethyl ester before treatment with cisplatin, no change in the induction and kinetics of apoptosis was observed. However, when cells were exposed to cisplatin before intracellular GSH levels were increased, apoptosis was observed to occur 6 hours earlier compared to cells without GSH elevation. To further examine the molecular aspects of these effects of GSH on the apoptotic process, changes in the expression of bcl-2 and bax were investigated in cells with depleted and elevated GSH. Using reverse transcription polymerase chain reaction, no significant change in the expression of bcl-2 gene transcripts was observed in cells in either the GSH-depleted or -elevated state; however, a 75% reduction in GSH resulted in a 40% decrease in the expression of bax gene transcripts. In contrast, a 6-fold increase in GSH increased the expression of bax by 3-fold relative to controls. Similar results were obtained for bax gene expression and protein synthesis by Northern analysis and immunoprecipitation, respectively.
These results suggest that GSH serves a dual role in the apoptotic process. The first role, which is indirect, involves the protection of the cell from extensive damage following exposure to a specific toxicant so as to prevent death by necrosis, possibly by interacting with the DNA-damaging agent and/or its active metabolites. The second role involves a direct involvement of GSH in the apoptotic process that includes upregulation of bax expression.
Abstract:
The International Surface Temperature Initiative (ISTI) is striving towards substantively improving our ability to robustly understand historical land surface air temperature change at all scales. A key recently completed first step has been collating all available records into a comprehensive open access, traceable and version-controlled databank. The crucial next step is to maximise the value of the collated data through a robust international framework of benchmarking and assessment for product intercomparison and uncertainty estimation. We focus on uncertainties arising from the presence of inhomogeneities in monthly mean land surface temperature data and the varied methodological choices made by various groups in building homogeneous temperature products. The central facet of the benchmarking process is the creation of global-scale synthetic analogues to the real-world database where both the "true" series and inhomogeneities are known (a luxury the real-world data do not afford us). Hence, algorithmic strengths and weaknesses can be meaningfully quantified and conditional inferences made about the real-world climate system. Here we discuss the necessary framework for developing an international homogenisation benchmarking system on the global scale for monthly mean temperatures. The value of this framework is critically dependent upon the number of groups taking part and so we strongly advocate involvement in the benchmarking exercise from as many data analyst groups as possible to make the best use of this substantial effort.
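The synthetic-analogue idea can be illustrated with a toy generator that plants a single step-change inhomogeneity of known size and date into a clean series, so an algorithm's detections can be scored against exact truth. The trend, noise level and break parameters below are invented for illustration and are not ISTI benchmark settings:

```python
import random

def make_synthetic_series(n_months=600, trend_per_month=0.001,
                          noise_sd=0.5, break_month=300,
                          break_size=-0.8, seed=42):
    """Return (truth, observed): a clean synthetic monthly-anomaly
    series and a copy carrying one known step-change inhomogeneity.
    All parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    truth, observed = [], []
    for t in range(n_months):
        clean = trend_per_month * t + rng.gauss(0.0, noise_sd)
        truth.append(clean)
        # apply the known break from break_month onwards
        observed.append(clean + (break_size if t >= break_month else 0.0))
    return truth, observed
```

Because the truth series is known exactly, a homogenisation algorithm run on `observed` can be assessed directly, e.g. by the residual error between its adjusted series and `truth`, which is the luxury real-world data do not afford.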
Explaining Emergence and Consequences of Specific Formal Controls in IS Outsourcing – A Process-View
Abstract:
IS outsourcing projects often fail to achieve project goals. To prevent such failures, managers need to design formal controls that are tailored to the specific contextual demands. However, the dynamic and uncertain nature of IS outsourcing projects makes the design of such specific formal controls at the outset of a project challenging. Hence, the process of translating high-level project goals into specific formal controls becomes crucial for the success or failure of IS outsourcing projects. Based on a comparative case study of four IS outsourcing projects, our study enhances current understanding of such translation processes and their consequences by developing a process model that explains the success or failure to achieve high-level project goals as an outcome of two unique translation patterns. This novel process-based explanation for how and why IS outsourcing projects succeed or fail has important implications for control theory and the IS project escalation literature.