990 results for Analyse textuelle
Abstract:
In a world that has become secular and where the Christian tradition is in retreat, one may rightly question the relevance of biblical texts. In connection with this situation, one may also ask what representation of God emerges from the way these texts are received. This line of questioning underlies the interest of a process-based approach to biblical texts, an approach still little known in the French-speaking world. Is it able to bring some novelty to the vision of God generally proposed in Christian circles? To answer this, it seemed pertinent to attempt the exercise with an encompassing, foundational text. Genesis 2, known for recounting the creation of Adam and Eve, carries images that have become almost stereotyped through repeated reading and retelling. In that regard it seemed particularly appropriate. But even before exploring the text from a process perspective, it proved indispensable to begin with a personal translation of the Hebrew text, based on and assisted by a syntactic and textual analysis; as will be seen, these opened the text to new translation hypotheses and new nuances, so many avenues to be confronted with process theology. Finally, this analysis cannot be carried out without dialogue with various commentaries, exegetical and otherwise, in order to highlight both the convergences and the divergences encountered in the course of the research and reflection.
Abstract:
The work undertaken in this thesis concerns the analysis of terminological equivalence in parallel corpora and comparable corpora. More specifically, we focus on corpora of specialized texts in the domain of climate change. One original aspect of this study lies in the analysis of the equivalents of single-word terms. Our theoretical foundations are textual terminology (Bourigault and Slodzian 1999) and the lexico-semantic approach (L'Homme 2005). The study pursues two objectives. The first is to carry out a comparative analysis of equivalence in the two types of corpora, in order to verify whether the terminological equivalence observable in parallel corpora differs from that found in comparable corpora. The second is to compare in detail the equivalents associated with a given English term, in order to describe them, catalogue them, and derive a typology from them. The detailed analysis of the French equivalents of 343 English terms is carried out with the help of computational tools (a term extractor, a text aligner, etc.) and a rigorous methodology divided into three parts. The first part, common to both objectives of the research, concerns the construction of the corpora, the validation of the English terms, and the identification of the French equivalents in the two corpora. The second part describes the criteria on which we rely to compare the equivalents from the two types of corpora. The third part establishes the typology of the equivalents associated with a given English term.
The results for the first objective show that, of the 343 English terms analysed, the number with questionable equivalents in the two corpora is relatively low (12), whereas the number of terms showing similar equivalence across the corpora is very high (272 identical equivalents and 55 unobjectionable equivalents). The comparative analysis described in this chapter confirms our hypothesis that the terminology used in parallel corpora does not differ from that of comparable corpora. The results for the second objective show that many English terms are rendered by several equivalents (70% of the terms analysed). We also find that it is quasi-synonyms, not synonyms, that form the largest group of equivalents. In addition, equivalents belonging to a different part of speech account for a substantial share of the equivalents. The typology developed in this thesis thus describes mechanisms of terminological equivalence that have rarely been described so systematically in previous work.
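As an illustration of the kind of tally behind figures such as "70% of terms have several equivalents", here is a minimal sketch; the terms, equivalents and category comments are invented for illustration and are not the thesis data.

```python
from collections import Counter

# Hypothetical English terms mapped to French equivalents found in corpus;
# the entries and their categories are illustrative assumptions.
equivalents = {
    "warming": ["réchauffement"],                # single equivalent
    "sink": ["puits", "réservoir"],              # quasi-synonyms
    "forcing": ["forçage", "effet de forçage"],  # synonym + paraphrase
    "melt": ["fonte", "fondre"],                 # part-of-speech shift
}

# Share of terms rendered by more than one equivalent
multi = sum(1 for eqs in equivalents.values() if len(eqs) > 1)
share = multi / len(equivalents)
print(f"{multi}/{len(equivalents)} terms have multiple equivalents ({share:.0%})")
```

A real typology would additionally label each equivalent pair (synonym, quasi-synonym, part-of-speech change, etc.) and count per category.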
Abstract:
"This is an important book that ought to launch a debate about how we research our understanding of the world; it is an innovative intervention in a vital public issue, and it is an elegant and scholarly hard look at what is actually happening." (Jean Seaton, Professor of Media History, University of Westminster, UK, and Official Historian of the BBC) -- Summary: This book investigates how comparative studies of international TV news (here: the presentation of violence) can best be conceptualized in a way that allows cross-national, comparative conclusions on an empirically validated basis. It shows that such a conceptualization is necessary in order to overcome existing restrictions on the comparability of international analyses of violence presentation. The examples investigated are the most-watched news bulletins in Great Britain (the 10 O'Clock News on the BBC), Germany (Tagesschau on ARD) and Russia (Vremja on Channel 1). The book highlights a substantial cross-national violence news flow, as well as a cross-national visual violence flow (key visuals), as distinct transnational components. In addition, event-related textual analysis reveals how the historical rootedness of nations and their symbols of power are still manifested in televisual mediations of violence. In conclusion, the study argues for a conscientious use of comparative data and analysis in both journalism research and practice, in order to understand what such material may convey in the different arenas of today's newsmaking.
Abstract:
This paper deals with the analysis of the parameters that are effective in shaft voltage generation in induction generators. It focuses on the different parasitic capacitive couplings, using mathematical equations, finite-element simulations and experiments. The effects of different design parameters on the proposed capacitances and the resultant shaft voltage have been studied. Several parameters can change the proposed capacitive couplings, such as the stator slot tooth, the gap between the slot tooth and the winding, and the height of the slot tooth, as well as the air gap between the rotor and the stator. This analysis can be used at an early stage of generator design to reduce the shaft voltage and avoid the additional costs of mitigating the resultant bearing currents.
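The paper's specific equations are not reproduced in the abstract; as a hedged illustration of the general mechanism, here is a commonly used lumped-parameter capacitive-divider model of shaft voltage. All capacitance and voltage values below are invented for illustration, not the paper's data.

```python
def bearing_voltage_ratio(c_wr, c_rf, c_b1, c_b2):
    """Shaft-to-common-mode voltage ratio for a simple lumped capacitive
    divider: the winding-to-rotor capacitance C_wr feeds the rotor, which
    is shunted by the rotor-to-frame capacitance C_rf and the two bearing
    capacitances C_b1 and C_b2."""
    return c_wr / (c_wr + c_rf + c_b1 + c_b2)

# Illustrative values in farads (100-1000 pF range), not measured data
v_cm = 300.0  # common-mode voltage (V)
bvr = bearing_voltage_ratio(c_wr=100e-12, c_rf=1000e-12,
                            c_b1=200e-12, c_b2=200e-12)
v_shaft = bvr * v_cm
print(f"BVR = {bvr:.3f}, shaft voltage ~ {v_shaft:.1f} V")
```

In this model, design changes that shrink the winding-to-rotor coupling or enlarge the rotor-to-frame coupling reduce the ratio, which is consistent with the design-parameter study the abstract describes.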
Abstract:
Monitoring Internet traffic is critical for acquiring a good understanding of threats to computer and network security and for designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content, including monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is by definition potentially malicious or suspicious. This unique characteristic reduces the amount of collected traffic and makes the honeypot a more valuable source of information than other existing techniques. Currently, there is insufficient research in the field of honeypot data analysis. To date, most work on honeypots has been devoted to designing new honeypots or optimizing existing ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are still immature: analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early signs of new vulnerabilities or of outbreaks of new automated malicious code, such as worms.
The outcomes of this research include:
• Identification of repeated use of attack tools and attack processes, by grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• Application of principal component analysis to detect the structure of attackers' activities present in low-interaction honeypots and to visualize attackers' behaviors;
• Detection of new attacks in low-interaction honeypot traffic through the use of the principal components' residual space and the squared prediction error statistic;
• Real-time detection of new attacks using recursive principal component analysis;
• A proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
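The residual-space detection idea mentioned above can be sketched compactly. This is a minimal illustration on synthetic data; the feature set, the number of retained components and the 99th-percentile control limit are assumptions for the sketch, not the thesis implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "baseline" honeypot traffic: 200 sessions x 6 features
# (e.g. packet counts, mean inter-arrival time); purely illustrative.
baseline = rng.normal(size=(200, 6)) @ rng.normal(size=(6, 6)) * 0.1
X = baseline - baseline.mean(axis=0)

# Principal subspace capturing the structure of known activity
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                  # retained components (an assumption)
P = Vt[:k].T           # loading matrix (6 x k)

def spe(x):
    """Squared prediction error: squared norm of the residual after
    projecting a session vector onto the retained principal subspace."""
    resid = x - P @ (P.T @ x)
    return float(resid @ resid)

# Control limit estimated from the baseline traffic
limit = np.percentile([spe(row) for row in X], 99)

# A session lying entirely in the residual space has a large SPE
novel = 5.0 * Vt[-1]
print(spe(novel) > limit)  # flagged as a potential new attack -> True
```

Recursive PCA, as in the fourth bullet, would update `P` and `limit` incrementally as new traffic arrives instead of recomputing the SVD from scratch.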
Abstract:
Prostate cancer metastasis relies on reciprocal interactions between cancer cells and the bone niche/micro-environment. The production of suitable matrices to study metastasis, carcinogenesis and, in particular, the prostate cancer/bone micro-environment interaction has been limited to specific protein matrices or to matrix secreted by immortalised cell lines that may have undergone transformation processes altering signaling pathways and modifying gene or receptor expression. We hypothesize that matrices produced by primary human osteoblasts are a suitable means of developing an in vitro model system for bone metastasis research that mimics in vivo conditions. We have used a decellularized matrix secreted by primary human osteoblasts as a model for prostate cancer function in the bone micro-environment. We show that this collagen I-rich matrix is of fibrillar appearance and highly mineralized, and that it contains proteins, such as osteocalcin, osteonectin and osteopontin, and growth factors characteristic of bone extracellular matrix (ECM). LNCaP and PC3 cells grown on this matrix adhere strongly, proliferate, and express markers consistent with a loss of epithelial phenotype. Moreover, growth of these cells on the matrix is accompanied by the induction of genes associated with attachment, migration, increased invasive potential, Ca2+ signaling and osteolysis. In summary, we show that growth of prostate cancer cells on matrices produced by primary human osteoblasts mimics key features of prostate cancer bone metastases and is thus a suitable model system for studying the tumor/bone micro-environment interaction in this disease.
Abstract:
Concerns regarding groundwater contamination with nitrate and the long-term sustainability of groundwater resources have prompted the development of a multi-layered, three-dimensional (3D) geological model to characterise the aquifer geometry of the Wairau Plain, Marlborough District, New Zealand. The 3D geological model, which consists of eight litho-stratigraphic units, has subsequently been used to synthesise hydrogeological and hydrogeochemical data for different aquifers, in an approach that aims to demonstrate how integrating water chemistry data within the physical framework of a 3D geological model can help to better understand and conceptualise groundwater systems in complex geological settings. Multivariate statistical techniques (e.g. Principal Component Analysis and Hierarchical Cluster Analysis) were applied to groundwater chemistry data to identify hydrochemical facies characteristic of distinct evolutionary pathways and of a common hydrologic history of the groundwaters. Principal Component Analysis of the hydrochemical data demonstrated that natural water-rock interactions, redox potential and agricultural impact are the key controls on groundwater quality in the Wairau Plain. Hierarchical Cluster Analysis revealed distinct hydrochemical water quality groups in the Wairau Plain groundwater system. Visualising the results of the multivariate statistical analyses and the distribution of groundwater nitrate concentrations in the context of aquifer lithology highlighted the link between groundwater chemistry and the lithology of the host aquifers. The methodology followed in this study can be applied in a variety of hydrogeological settings to synthesise geological, hydrogeological and hydrochemical data and to present them in a format readily understood by a wide range of stakeholders, enabling more efficient communication of the results of scientific studies to the wider community.
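The PCA step on standardized chemistry data can be sketched as follows. The analyte values below are invented for illustration (the study used real Wairau Plain data and additionally applied hierarchical clustering).

```python
import numpy as np

# Toy hydrochemistry table: samples x analytes (mg/L); values are
# illustrative assumptions, not Wairau Plain data.
analytes = ["NO3", "Cl", "Na", "Ca", "HCO3"]
X = np.array([
    [11.2, 25.0, 18.0, 40.0, 110.0],   # oxic, agriculturally impacted
    [9.8,  22.0, 16.0, 38.0, 105.0],
    [0.2,  8.0,  30.0, 15.0, 220.0],   # reduced, more evolved groundwater
    [0.1,  7.5,  28.0, 14.0, 215.0],
])

# Standardize to z-scores so analytes with large concentrations
# (e.g. HCO3) do not dominate the components
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via SVD; scores on the first component separate the two facies
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
pc1 = Z @ Vt[0]
print(pc1)  # first two samples on one side of PC1, last two on the other
```

In practice the component loadings (`Vt[0]`) are then inspected to interpret each axis, e.g. as a redox or agricultural-impact gradient.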
Abstract:
Pressure feeder chutes are pieces of equipment used in sugar cane crushing to increase the amount of cane that can be put through a mill. The continuous pressure feeder was developed to provide a constant feed of bagasse, under pressure, to the mouth of the crushing mill; the pressure feeder chute is used in a sugarcane milling unit to transfer bagasse from one set of crushing rolls to a second set. There have been many pressure feeder chute failures in the past. The chute is quite vulnerable: if the bagasse throughput is blocked at the mill rollers, the pressure build-up in the chute can be enormous and can ultimately result in failure. The result is substantial damage to the rollers, mill and chute structure, and downtimes of up to 48 hours. Part of the problem is that bagasse behaviour in the pressure feeder chute is not well understood; if it were, the chute geometry could be modified to minimise the risk of failure, and there are possible avenues for changing chute design and operation with a view to producing more reliable pressure feeder chutes in the future. Previous experimental work has attempted to determine the causes of pressure feeder chute failures, and certain guidelines are available, yet failures continue and chute behaviour remains poorly understood. This thesis covers work carried out between 14 April 2009 and 10 October 2012 on the design of an experimental apparatus to measure forces and visually observe bagasse, in an attempt to understand bagasse behaviour in pressure feeder chutes and minimise the risk of failure.
Abstract:
Aim: A new method of penumbral analysis is implemented which allows an unambiguous determination of field size and of penumbra size and quality for small fields and other non-standard fields. Both source occlusion and lateral electronic disequilibrium affect the size and shape of cross-axis profile penumbrae; each is examined in detail. Method: In the new method, the square of the derivative of the cross-axis profile is plotted. The resulting graph displays two peaks in place of the two penumbrae. This gives a strong visualisation of the quality of a field penumbra, as well as a mathematically consistent definition of field size (the distance between the two peaks' maxima) and of penumbra (the full width at tenth maximum of each peak). Cross-axis profiles were simulated with Monte Carlo modelling in a water phantom at a depth of 5 cm, for field sizes between 5 and 30 mm. The field size and penumbra size of each field were calculated using the method above, as well as with the traditional definitions set out in IEC 976. The effects of source occlusion and of lateral electronic disequilibrium on the penumbrae were isolated by repeating the simulations with electron transport removed and with an electron spot size of 0 mm, respectively. Results: All field sizes calculated using the traditional and proposed methods agreed within 0.2 mm. The penumbra size measured using the proposed method was systematically 1.8 mm larger than with the traditional method at all field sizes. The size of the source had a larger effect on the size of the penumbra than did lateral electronic disequilibrium, particularly at very small field sizes. Conclusion: Traditional methods of calculating field size and penumbra prove mathematically adequate for small fields. However, the field size definition proposed in this study would be more robust for other non-standard fields, such as flattening-filter-free beams. Source occlusion plays a bigger role than lateral electronic disequilibrium in determining small-field penumbra size.
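The squared-derivative metric described above can be illustrated on a synthetic profile. The error-function edge model, blur width and nominal 10 mm field below are assumptions for the sketch, not the study's Monte Carlo data.

```python
import numpy as np
from math import erf

# Synthetic cross-axis profile: error-function edges, 10 mm nominal
# field, 0.1 mm grid; blur and field width are illustrative assumptions.
x = np.arange(-15.0, 15.0, 0.1)
sigma = 1.5  # edge blur (mm)
profile = np.array([0.5 * (erf((xi + 5.0) / sigma) - erf((xi - 5.0) / sigma))
                    for xi in x])

# Square of the derivative: one peak per penumbra
d2 = np.gradient(profile, x) ** 2

# Field size = distance between the two peak maxima
mid = len(x) // 2
left_peak = int(np.argmax(d2[:mid]))
right_peak = mid + int(np.argmax(d2[mid:]))
field_size = x[right_peak] - x[left_peak]

# Penumbra = full width at tenth maximum of one squared-derivative peak
tenth = 0.1 * d2[left_peak]
above = np.where(d2[:mid] >= tenth)[0]
penumbra = x[above[-1]] - x[above[0]]

print(f"field size ~ {field_size:.1f} mm, penumbra ~ {penumbra:.1f} mm")
```

Because the peak maxima are well defined even when the profile has no flat central region, the field-size estimate stays unambiguous for small and flattening-filter-free fields.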
Abstract:
Barmah Forest virus (BFV) disease is an emerging mosquito-borne disease in Australia. We aimed to outline some recent methods of using GIS for the analysis of BFV disease in Queensland, Australia. A large database of geocoded BFV cases has been established in conjunction with population data. In recently published studies by the authors, this database was used to determine spatio-temporal BFV disease hotspots and spatial patterns, using spatial autocorrelation and semi-variogram analysis together with interpolated, standardised BFV disease incidence maps. This paper briefly outlines the spatial analysis methodologies and GIS tools used in those studies, summarises their methods and results, and presents a GIS methodology for future spatial analytical studies aimed at enhancing the understanding of BFV disease in Queensland. The methodology developed is useful in improving the analysis of BFV disease data and will enhance the understanding of the distribution of BFV disease in Queensland, Australia.
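Of the methods mentioned, global spatial autocorrelation is the most compact to sketch. Below is a minimal Moran's I computation on a toy four-region chain; the contiguity weights and incidence values are invented for illustration, not BFV case data.

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for regional rates and a spatial weights matrix
    (zero diagonal): positive when neighbouring values are alike
    (clustering), negative for checkerboard-like dispersion."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()
    return (len(x) / w.sum()) * (z @ w @ z) / (z @ z)

# Toy contiguity: four regions in a chain, 0-1-2-3
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
clustered = [10.0, 10.0, 1.0, 1.0]   # like rates next to like rates
dispersed = [10.0, 1.0, 10.0, 1.0]   # alternating high/low
print(morans_i(clustered, w))  # positive: spatial clustering
print(morans_i(dispersed, w))  # negative: spatial dispersion
```

Applied to standardised incidence rates over geocoded case regions, a significantly positive value supports the presence of disease hotspots, which semi-variogram analysis can then characterise by distance.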
Abstract:
Existing planning theories tend to be limited in their analytical scope and often fail to account for the impact of the many interactions between the multitude of stakeholders involved in strategic planning processes. Although many theorists rejected structural–functional approaches from the 1970s, this article argues that many structural–functional concepts remain relevant and useful to planning practitioners. In fact, structural–functional approaches are highly useful and practical when used as a foundation for the systemic analysis of real-world, multi-layered, complex planning systems in support of evidence-based governance reform. Such approaches provide a logical and systematic means of analysing the wider governance of strategic planning systems, one that is grounded in systems theory and complementary to existing theories of complexity and planning. While we do not propose their use as a grand theory of planning, this article discusses how structural–functional concepts and approaches might be applied to underpin a practical analysis of the complex decision-making arrangements that drive planning practice, and to provide the evidence needed to target the reform of poorly performing arrangements.