930 results for Analysis tools


Relevance: 70.00%

Abstract:

Background: Although the molecular pathogenesis of pituitary adenomas has been assessed by several different techniques, it remains partially unclear. Ribosomal proteins (RPs) have recently been related to human tumorigenesis, but they have not yet been evaluated in pituitary tumorigenesis. Objective: The aim of this study was to introduce serial analysis of gene expression (SAGE), a high-throughput method, into pituitary research in order to compare differential gene expression. Methods: Two SAGE cDNA libraries were constructed, one using a pool of mRNA obtained from five GH-secreting pituitary tumors and another from three normal pituitaries. Genes differentially expressed between the libraries were further validated by real-time PCR in 22 GH-secreting pituitary tumors and in 15 normal pituitaries. Results: Computer-generated genomic analysis tools identified 13 722 and 14 993 exclusive genes in the normal and adenoma libraries respectively. The libraries shared 6497 genes, of which 2188 were underexpressed and 4309 overexpressed in the tumoral library. In the adenoma library, 33 genes encoding RPs were underexpressed. Among these, RPSA, RPS3, RPS14, and RPS29 were validated by real-time PCR. Conclusion: We report the first SAGE libraries from normal pituitary tissue and GH-secreting pituitary tumor, which provide a quantitative assessment of the cellular transcriptome. We also validated some downregulated genes encoding RPs. Altogether, the present data suggest that the underexpression of the studied RP genes possibly collaborates, directly or indirectly, with other genes to modify cell cycle arrest, DNA repair, and apoptosis, creating an environment that might play a putative role in tumorigenesis and opening new perspectives for further studies on the molecular genesis of somatotrophinomas.
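
The library comparison described above can be illustrated with a small sketch: given per-gene tag counts from two SAGE libraries, shared genes are classified as over- or underexpressed by their tumor/normal ratio. All counts below are invented for illustration; only the ribosomal-protein names echo genes mentioned in the abstract.

```python
# Invented tag counts for a handful of genes in two SAGE libraries. A
# shared gene is called over/underexpressed when the tumor count differs
# from the normal count by at least the given fold.

def compare_libraries(normal, tumor, fold=2.0):
    """Classify genes present in both libraries by tumor/normal ratio."""
    shared = set(normal) & set(tumor)
    over = {g for g in shared if tumor[g] >= fold * normal[g]}
    under = {g for g in shared if fold * tumor[g] <= normal[g]}
    return shared, over, under

normal_lib = {"RPSA": 40, "RPS3": 36, "RPS14": 28, "GH1": 5, "ACTB": 100}
tumor_lib = {"RPSA": 10, "RPS3": 8, "RPS14": 30, "GH1": 60, "ACTB": 95}

shared, over, under = compare_libraries(normal_lib, tumor_lib)
```

In a real SAGE comparison the counts would first be normalized to library size and tested for statistical significance rather than thresholded by a fixed fold.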

Relevance: 70.00%

Abstract:

The present study analyses rural landscape changes. In particular, it focuses on understanding the driving forces acting on the rural built environment, using a statistical spatial model implemented through GIS techniques. It is well known that the study of landscape changes is essential for informed decision making in land planning. A literature review reveals a general lack of studies modelling the rural built environment, so a theoretical modelling approach for this purpose is needed. Advances in technology and modernity in building construction and agriculture have gradually changed the rural built environment. In addition, urbanization has driven the construction of new volumes beside abandoned or derelict rural buildings. Consequently, two main types of transformation dynamics affecting the rural built environment can be observed: the conversion of existing rural buildings and the increase in building numbers. The specific aim of this study is to propose a methodology for developing a spatial model that identifies the driving forces acting on building allocation; indeed, one of the most concerning dynamics nowadays is the irrational sprawl of buildings across the landscape. The proposed methodology comprises several conceptual steps covering the different aspects of developing a spatial model: selecting a response variable that best describes the phenomenon under study, identifying possible driving forces, defining the sampling methodology for data collection, choosing the most suitable algorithm in relation to the statistical theory and methods used, and calibrating and evaluating the model.
Different combinations of factors in various parts of the territory generated more or less favourable conditions for building allocation, and the existence of buildings represents the evidence of such an optimum. Conversely, the absence of buildings expresses a combination of agents which is not suitable for building allocation. The presence or absence of buildings can therefore be adopted as an indicator of these driving conditions, since it represents the expression of the action of driving forces in the land-suitability sorting process. The existence of a correlation between site selection and hypothetical driving forces, evaluated by means of modeling techniques, provides evidence of which driving forces are involved in the allocation dynamic and insight into their level of influence on the process. Spatial analysis tools in GIS software allow the concepts of presence and absence to be associated with point features, generating a point process. Presence or absence of buildings at given site locations represents the expression of the interaction of these driving factors. Presence points represent the locations of real existing buildings, whereas absence points represent locations where no building exists and are generated by a stochastic mechanism. Possible driving forces are selected, and the existence of a causal relationship with building allocation is assessed through a spatial model. The adoption of empirical statistical models provides a mechanism for analysing the explanatory variables and for identifying the key driving variables behind the site-selection process for new building allocation. The model developed by following the methodology is applied to a case study to test the validity of the methodology. In particular, the study area for testing the methodology is the New District of Imola, characterized by a prevailing agricultural production vocation and where transformation dynamics occurred intensively.
The development of the model involved the identification of predictive variables (related to the geomorphologic, socio-economic, structural, and infrastructural systems of the landscape) capable of representing the driving forces responsible for landscape changes. The model is calibrated on spatial data covering the periurban and rural parts of the study area over the 1975-2005 period by means of a generalised linear model. The output of the model fit is a continuous grid surface whose cells assume probability values, ranging from 0 to 1, of building occurrence across the rural and periurban parts of the study area. The response variable thus assesses the changes in the rural built environment that occurred in this time interval and is correlated with the selected explanatory variables by means of a generalized linear model using logistic regression. By comparing the probability map obtained from the model with the actual rural building distribution in 2005, the interpretive capability of the model can be evaluated. The proposed model can also be applied to interpret trends in other study areas, and over different time intervals, depending on the availability of data. The use of suitable data in terms of time, information, and spatial resolution, and the costs related to data acquisition, pre-processing, and survey, are among the most critical aspects of model implementation. Future in-depth studies can focus on using the proposed model to predict short- to medium-range future scenarios for the distribution of the rural built environment in the study area. In order to predict future scenarios, it is necessary to assume that the driving forces do not change and that their levels of influence within the model remain close to those assessed for the calibration time interval.
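
The calibration step described above can be sketched in miniature: a logistic regression (a generalized linear model with a logit link) fitted to presence/absence points, yielding probabilities between 0 and 1. The two covariates (think: distance to the nearest road, terrain slope) and all data values below are invented; a real calibration would use sampled GIS layers and proper model evaluation.

```python
import math

# Logistic regression fitted by plain stochastic gradient ascent to
# synthetic presence/absence points with two invented covariates.

def sigmoid(z):
    if z > 60:
        return 1.0
    if z < -60:
        return 0.0
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(points, labels, lr=0.5, epochs=2000):
    """Fit p = sigmoid(w0 + w1*x1 + w2*x2)."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            err = y - sigmoid(w[0] + w[1] * x1 + w[2] * x2)
            w[0] += lr * err
            w[1] += lr * err * x1
            w[2] += lr * err * x2
    return w

# presence (1) clusters near roads / gentle slopes (small covariates)
data = [((0.1, 0.2), 1), ((0.2, 0.1), 1), ((0.3, 0.3), 1),
        ((0.8, 0.9), 0), ((0.9, 0.7), 0), ((0.7, 0.8), 0)]
points, labels = zip(*data)
w = fit_logistic(points, labels)

p_favourable = sigmoid(w[0] + w[1] * 0.2 + w[2] * 0.2)
p_unfavourable = sigmoid(w[0] + w[1] * 0.9 + w[2] * 0.9)
```

Evaluated over a grid of cells, the fitted probabilities would form exactly the kind of continuous 0-1 surface the abstract describes.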

Relevance: 70.00%

Abstract:

The present article describes and analyses youth criminality in the city of Rosario, Argentina, between 2003 and 2006. Key actors' understandings of and responses to the conflict were investigated by means of semi-structured interviews, observations, discourse analysis of policy documents, and analysis of secondary data, drawing heavily on the experience of the author, a citizen and youth worker of Rosario. The actors examined were the police, the local government, young delinquents, and youth organisations. Youth criminality is analysed from a conflict transformation approach using conflict analysis tools. Whereas the provincial police understand the issue as a delinquency problem, other actors perceive it as an expression of a wider urban social conflict between those who are "included" and those who are "excluded", and as one of the negative effects of globalisation processes. The results suggest that police responses addressing only direct violence are ineffective, even contributing to increased tension and polarisation, whereas strategies addressing cultural and structural violence are more suitable for this type of urban social conflict. Finally, recommendations for local youth policy are proposed to facilitate the participation and inclusion of youth and to serve as a tool for peaceful conflict transformation.

Relevance: 70.00%

Abstract:

This paper discusses some issues which arise in the dataflow analysis of constraint logic programming (CLP) languages. The basic technique applied is that of abstract interpretation. First, some types of optimizations possible in a number of CLP systems (including efficient parallelization) are presented, and the information that has to be obtained at compile time in order to implement such optimizations is considered. Two approaches are then proposed and discussed for obtaining this information for a CLP program: one based on an analysis of a CLP metainterpreter using standard Prolog analysis tools, and a second based on direct analysis of the CLP program. For the second approach, an abstract domain which approximates groundness (also referred to as "definiteness") information (i.e., variables constrained to a single value) and the related abstraction functions are presented.
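
The flavour of a groundness ("definiteness") domain can be sketched as follows: each variable is abstracted to either ground (known to be constrained to a single value) or unknown, and a constraint grounds its last remaining variable once all the others are ground. The variable names and constraints below are invented for illustration; this is not the paper's actual domain or abstraction functions.

```python
# Toy definiteness propagation: for a linear constraint such as
# X = Y + Z, once all but one of its variables are ground, the
# remaining one becomes ground too.

def propagate(initially_ground, constraints):
    """Least fixpoint of definiteness propagation over the constraints."""
    ground = set(initially_ground)
    changed = True
    while changed:
        changed = False
        for vars_ in constraints:
            unknown = [v for v in vars_ if v not in ground]
            if len(unknown) == 1:   # all other vars ground => so is this one
                ground.add(unknown[0])
                changed = True
    return ground

# X = Y + Z and Z = W, with Y and W ground on entry
result = propagate({"Y", "W"}, [("X", "Y", "Z"), ("Z", "W")])
```
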

Relevance: 70.00%

Abstract:

Compile-time program analysis techniques can be applied to Web service orchestrations to prove or check various properties. In particular, service orchestrations can be subjected to resource analysis, in which safe approximations of upper and lower resource usage bounds are deduced. A uniform analysis can be performed simultaneously for different generalized resources that can be directly correlated with cost- and performance-related quality attributes, such as invocations of partners, network traffic, number of activities, iterations, and data accesses. The resulting safe upper and lower bounds do not depend on probabilistic assumptions and are expressed as functions of the size or length of data components from an initiating message, using a fine-grained structured data model that corresponds to the XML style of information structuring. The analysis is performed by transforming a BPEL-like representation of an orchestration into an equivalent program in another programming language for which the appropriate analysis tools already exist.
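
The kind of result such an analysis delivers can be sketched as follows: safe upper and lower bounds on one generalized resource (partner invocations), expressed as functions of the size n of the initiating message. The mock orchestration and its bounds below are invented; a real analysis would derive the bound functions from the BPEL-like representation itself.

```python
# Invented orchestration: one setup call, one partner call per item,
# plus at most one retry per item. The bound functions over-/under-
# approximate every possible run.

def invocations_ub(n):
    """Safe upper bound: setup + one call and one possible retry per item."""
    return 2 * n + 1

def invocations_lb(n):
    """Safe lower bound: setup + at least one call per item."""
    return n + 1

def run_orchestration(items, fail_every=3):
    """Mock run: setup call, one call per item, a retry every third item."""
    calls = 1                      # setup invocation
    for i, _ in enumerate(items):
        calls += 1                 # partner invocation for this item
        if i % fail_every == 0:
            calls += 1             # retry after a (mock) failure
    return calls

n = 7
actual = run_orchestration(range(n))
```

Whatever the failure pattern, every concrete run falls between the two bound functions, which is exactly the safety property the abstract claims.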

Relevance: 70.00%

Abstract:

Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions of their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects concerning the efficiency, the accuracy, and the reliability of the analysis results still needed further investigation. This thesis tackles this need from the following four perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses which keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first is to consider array accesses (in addition to object fields), while the second focuses on handling cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results, and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to re-analyze fragments of code that are not affected by the changes. During software development, programs are permanently modified, but most analyzers still read and analyze the entire program at once in a non-incremental way.
This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper bounds of all affected methods incrementally. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all the global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on developing a formal framework for verifying the resource guarantees obtained by the analyzers, instead of verifying the tools themselves. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code. COSTA is able to derive upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that the proposed tool cooperation can be used to automatically produce verified resource guarantees. (4) Distribution and concurrency are mainstream today. Concurrent objects form a well-established model for distributed concurrent systems. In this model, objects are the concurrency units, and they communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, can keep the cost of the diverse distributed components separate.
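
The incremental principle in point (2) can be sketched in a few lines: after a change, only the changed method and its transitive callers need their cost summaries recomputed, while everything else keeps its previous results. The call graph below is invented; the thesis's actual algorithm is multi-domain and fixpoint-based, which this sketch does not attempt to show.

```python
# Compute which methods are affected by a change: the changed method
# plus all of its transitive callers.

def affected(callers, changed):
    """Return the changed method and all its transitive callers."""
    seen, work = set(), [changed]
    while work:
        m = work.pop()
        if m in seen:
            continue
        seen.add(m)
        work.extend(callers.get(m, []))
    return seen

# callers maps each method to the methods that call it (invented graph)
callers = {"leaf": ["mid"], "mid": ["main"], "other": ["main"]}
to_recompute = affected(callers, "leaf")
```

Here a change to "leaf" forces recomputation of "leaf", "mid", and "main", while the cost summary of "other" is reused unchanged.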

Relevance: 70.00%

Abstract:

The experimental results obtained in the "STACO" experiment carried out on board Spacelab D-2 are revisited with image-analysis tools not available at the time. The configuration consisted of a liquid bridge between two solid supporting discs. An expected breakage occurred during the experiment. The recorded images are analysed, and the measured behaviour is compared with the results of a three-dimensional model of the liquid dynamics, obtaining a much better fit than with linear models.

Relevance: 70.00%

Abstract:

This paper describes the authors' experience with static analysis of both WCET and stack usage of a satellite on-board software subsystem. The work is a continuation of a previous case study that used a dynamic WCET analysis tool on an earlier version of the same software system. In particular, the AbsInt aiT tool has been evaluated by analysing both C and Ada code generated by Simulink within the UPMSat-2 project. Some aspects of the aiT tool, specifically those dealing with SPARC register windows, are compared to another static analysis tool, Bound-T. The results of the analysis are discussed, and some conclusions on the use of static WCET analysis tools on the SPARC architecture are presented.

Relevance: 70.00%

Abstract:

A case study of vocal fold paralysis treatment is described with the help of the voice quality analysis application BioMet®Phon. The case is that of a 40-year-old female patient who was diagnosed with vocal fold paralysis following a cardio-pulmonary intervention which required intubation for 8 days and posterior tracheotomy for 15 days. The patient presented breathy and asthenic phonation, and dysphagia. Six main examinations were conducted during the full-year period that the treatment lasted, consisting of periodic reviews including video-endostroboscopy, voice analysis, and breathing function monitoring. The phoniatric treatment included 20 sessions of vocal rehabilitation, followed by an intracordal infiltration with Radiesse 8 months after the rehabilitation treatment started, followed in turn by 6 more rehabilitation sessions. The video-endoscopy and the voice quality analysis show a substantial improvement in vocal function, with recovery in all the measures estimated (jitter, shimmer, mucosal wave contents, glottal closure, harmonic contents, and biomechanical function analysis). The paper reports the procedure followed and the results obtained by comparing the longitudinal progression of the treatment, illustrating the utility of voice quality analysis tools in speech therapy.

Relevance: 70.00%

Abstract:

Nonlinear analysis tools for studying and characterizing the dynamics of physiological signals have gained popularity, mainly because tracking sudden alterations in the inherent complexity of biological processes might be an indicator of altered physiological states. Typically, in order to perform an analysis with such tools, the physiological variables that describe the biological process under study are used to reconstruct the underlying dynamics of the process. For that goal, a procedure called time-delay or uniform embedding is usually employed. Nonetheless, there is evidence of its inability to deal with non-stationary signals, such as those recorded from many physiological processes. To handle this drawback, this paper evaluates the utility of non-conventional time series reconstruction procedures based on non-uniform embedding, applying them to automatic pattern recognition tasks. The paper compares a state-of-the-art non-uniform approach with a novel scheme which fuses embedding and feature selection at once, searching for better reconstructions of the dynamics of the system. Results are also compared with two classic uniform embedding techniques. Thus, the goal is to compare uniform and non-uniform reconstruction techniques, including the one proposed in this work, for pattern recognition in biomedical signal processing tasks. Once the state space is reconstructed, the scheme characterizes it with three classic nonlinear dynamic features (Largest Lyapunov Exponent, Correlation Dimension, and Recurrence Period Density Entropy), while classification is carried out by means of a simple k-nn classifier. In order to test its generalization capabilities, the approach was tested on three different physiological databases (Speech Pathologies, Epilepsy, and Heart Murmurs).
In terms of the accuracy obtained in automatically detecting the presence of pathologies, and for the three types of biosignals analyzed, the non-uniform techniques used in this work slightly outperformed the uniform methods, suggesting their usefulness for characterizing non-stationary biomedical signals in pattern recognition applications. Moreover, in view of the results obtained and its low computational load, the proposed technique appears applicable to the applications under study.
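
For reference, the classic uniform (time-delay) embedding that the non-uniform schemes are compared against can be sketched in a few lines: each reconstructed state vector stacks m samples of the signal taken tau steps apart. The signal and parameter values below are illustrative only; in practice tau and m are chosen with criteria such as mutual information and false nearest neighbours.

```python
import math

# Uniform time-delay embedding of a scalar signal.

def delay_embed(x, m, tau):
    """Return state vectors [x[i], x[i+tau], ..., x[i+(m-1)*tau]]."""
    n = len(x) - (m - 1) * tau
    return [[x[i + j * tau] for j in range(m)] for i in range(n)]

signal = [math.sin(0.3 * t) for t in range(50)]
states = delay_embed(signal, m=3, tau=4)
```

A non-uniform scheme would instead pick a different, not necessarily evenly spaced, set of lags for each coordinate, which is the degree of freedom the compared approaches exploit.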

Relevance: 70.00%

Abstract:

Resource analysis aims at inferring the cost of executing programs for any possible input, in terms of a given resource, such as the traditional execution steps, time, or memory, and, more recently, energy consumption or user-defined resources (e.g., number of bits sent over a socket, number of database accesses, number of calls to particular procedures, etc.). This is performed statically, i.e., without actually running the programs. Resource usage information is useful for a variety of optimization and verification applications, as well as for guiding software design. For example, programmers can use such information to choose between different algorithmic solutions to a problem; program transformation systems can use cost information to choose between alternative transformations; parallelizing compilers can use cost estimates for granularity control, which tries to balance the overheads of task creation and manipulation against the benefits of parallelization. In this thesis we have significantly improved an existing prototype implementation for resource usage analysis based on abstract interpretation, addressing a number of relevant challenges and overcoming many of its limitations. The goal of that prototype was to show the viability of casting the resource analysis as an abstract domain, and how it could overcome important limitations of the state-of-the-art resource usage analysis tools. For this purpose, it was implemented as an abstract domain in the abstract interpretation framework of the CiaoPP system, PLAI. We have improved both the design and the implementation of the prototype, eventually allowing the tool to evolve towards an industrial application level. The abstract operations of such a tool depend heavily on setting up, and finding closed-form solutions of, recurrence relations representing the resource usage behavior of program components and of the whole program.
While there exist many tools, such as Computer Algebra Systems (CAS) and libraries, able to find closed-form solutions for some types of recurrences, none of them alone is able to handle all the types of recurrences arising during program analysis. In addition, there are some types of recurrences that cannot be solved by any existing tool. This clearly constitutes a bottleneck for this kind of resource usage analysis. Thus, one of the major challenges we have addressed in this thesis is the design and development of a novel modular framework for solving recurrence relations, able to combine and take advantage of the results of existing solvers. Additionally, we have developed and integrated into our novel solver a technique for finding upper-bound closed-form solutions of a special class of recurrence relations that arise during the analysis of programs with accumulating parameters. Finally, we have integrated the improved resource analysis into the CiaoPP general framework for resource usage verification, and specialized the framework for verifying energy consumption specifications of embedded imperative programs in a real application, showing the usefulness and practicality of the resulting tool.
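
What a recurrence solver contributes can be illustrated with a toy cost recurrence: C(0) = 0, C(n) = C(n-1) + n (one unit of resource per element processed), whose closed-form solution is n(n+1)/2. The recurrence is invented for illustration; the thesis's solver targets far richer classes, including recurrences arising from accumulating parameters.

```python
# Toy cost recurrence: C(0) = 0, C(n) = C(n-1) + n.

def cost_rec(n):
    """Evaluate the recurrence directly (what a run actually costs)."""
    return 0 if n == 0 else cost_rec(n - 1) + n

def closed_form(n):
    """Candidate closed-form solution a solver would produce."""
    return n * (n + 1) // 2

# the closed form matches the recurrence on a range of inputs
assert all(cost_rec(k) == closed_form(k) for k in range(25))
```

The value of the closed form is that it can be evaluated, compared, and verified without unrolling the recurrence, which is exactly what the analysis needs when composing the cost of program components.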

Relevance: 70.00%

Abstract:

This doctoral thesis, the culmination of my doctoral studies in the Department of Linguistics Applied to Science and Technology of the Universidad Politécnica de Madrid, analyses the use of hedging in legal English following the postulates and principles of Critical Genre Analysis (Bhatia, 2004) and employing the WordSmith Tools version 6 corpus analysis tools (Scott, 2014). As the title reflects, the study centres on the description and contrastive analysis of the lexico-syntactic varieties of hedges and of the discourse strategies they realize, as well as the functions these perform, in a corpus of U.S. Supreme Court opinions and American legal research articles, relating these, as far as possible, to the defining features of the two genres from a socio-cognitive perspective. Its innovative contribution is that, despite the numerous studies of hedging in general English (Lakoff, 1973; Hübler, 1983; Clemen, 1997; Markkanen and Schröder, 1997; Mauranen, 1997; Fetzer, 2010; and Finnegan, 2010, among others), academic English (Crompton, 1997; Meyer, 1997; Skelton, 1997; Martín Butragueño, 2003), scientific English (Hyland, 1996a, 1996c, 1998c, 2007; Grabe and Kaplan, 1997; Salager-Meyer, 1997; Varttala, 2001) and medical English (Prince, 1982; Salager-Meyer, 1994; Skelton, 1997), and, to a lesser extent, legal English (Toska, 2012), no research has yet linked the different uses of hedging to the generic characteristics of professional communications.
Within legal language, hedging confirms its dependence both on the macro-level expectations of the discourse community and on the micro-level intentions of the writer of the communication, varying according to the communicative purposes of the genre, whether educational, pedagogical, interpersonal or operative. The study highlights the predominant use of epistemic modal verbs and of lexical verbs as hedges, the latter divided into four types (Hyland, 1998c; Palmer, 1986, 1990, 2001): speculative, quotative, deductive and sensorial. The lexico-syntactic realization of a hedge can signal one of four particular discourse strategies (Namsaraev, 1997; Salager-Meyer, 1994): indetermination, depersonalization, subjectivization, or camouflage hedging, whose incidence and function vary by genre. The identification and quantification of the different hedges and strategies employed in the various genres of legal discourse may have pedagogical implications for non-native law students, who must demonstrate adequate competence in their use and processing.

ABSTRACT

This doctoral thesis, which represents the culmination of my doctoral studies undertaken in the Department of Linguistics Applied to Science and Technology of the Universidad Politécnica de Madrid, focusses on the analysis of hedging in legal English following the principles of Critical Genre Analysis (Bhatia, 2004), and using WordSmith Tools version 6 (Scott, 2014) corpus analysis tools. As the title suggests, this study centers on the description and contrastive analysis of lexico-grammatical realizations of hedges and the discourse strategies which they can indicate, as well as the functions they can carry out, in a corpus of U.S. Supreme Court opinions and American law review articles.
The study relates the realization, incidence and function of hedging to the predominant generic characteristics of the two genres from a socio-cognitive perspective. While there have been numerous studies on hedging in general English (Lakoff, 1973; Hübler, 1983; Clemen, 1997; Markkanen and Schröder, 1997; Mauranen, 1997; Fetzer, 2010; and Finnegan, 2010, among others), academic English (Crompton, 1997; Meyer, 1997; Skelton, 1997; Martín Butragueño, 2003), scientific English (Hyland, 1996a, 1996c, 1998c, 2007; Grabe and Kaplan, 1997; Salager-Meyer, 1997; Varttala, 2001) and medical English (Prince, 1982; Salager-Meyer, 1994; Skelton, 1997), and, to a lesser degree, legal English (Toska, 2012), this study is innovative in that it links the different realizations and functions of hedging to the generic characteristics of a particular professional communication. Within legal English, hedging has been found to depend not only on the macro-level expectations of the discourse community for a specific genre, but also on the micro-level intentions of the author of a communication, varying according to the educational, pedagogical, interpersonal or operative purposes the genre may have. The study highlights the predominance of epistemic modal verbs and lexical verbs as hedges, dividing the latter into four types (Hyland, 1998c; Palmer, 1986, 1990, 2001): speculative, quotative, deductive and sensorial. Lexico-grammatical realizations of hedges can signal one of four discourse strategies (Namsaraev, 1997; Salager-Meyer, 1994): indetermination, depersonalization, subjectivization and camouflage hedging, as well as fulfill a variety of functions. The identification and quantification of the different hedges and hedging strategies and functions in the two genres may have pedagogical implications for non-native law students who must demonstrate adequate competence in the production and interpretation of hedged discourse.
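By way of illustration only (the thesis used WordSmith Tools 6, not custom code), a minimal sketch of how hedge frequencies of the kind described above could be quantified over a text; the word lists below are abbreviated examples, not the study's actual categories:

```python
import re
from collections import Counter

# Toy hedge inventories: epistemic modals and a few speculative
# lexical verbs. Real corpus work would use a fuller, lemmatized list.
EPISTEMIC_MODALS = {"may", "might", "could", "would", "should"}
SPECULATIVE_VERBS = {"suggest", "suggests", "indicate", "indicates",
                     "appear", "appears", "seem", "seems"}

def count_hedges(text: str) -> Counter:
    """Tokenize crudely and tally candidate hedges by category."""
    counts = Counter()
    for tok in re.findall(r"[a-z]+", text.lower()):
        if tok in EPISTEMIC_MODALS:
            counts["epistemic_modal"] += 1
        elif tok in SPECULATIVE_VERBS:
            counts["speculative_verb"] += 1
    return counts

# Invented sentence in the register of a court opinion.
opinion = ("The evidence suggests that the statute may apply, "
           "although it might be argued that the record appears incomplete.")
counts = count_hedges(opinion)
# counts["epistemic_modal"] == 2 ('may', 'might');
# counts["speculative_verb"] == 2 ('suggests', 'appears')
```

Frequencies per thousand words, computed this way over the two corpora, are the kind of raw figure that genre comparison then interprets.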

Relevância:

70.00% 70.00%

Publicador:

Resumo:

The Colorado Alliance of Research Libraries has launched the Alliance Shared Print Trust and is in the process of developing a shared print analysis tool. The system allows libraries that have added their MARC records to compare their collections with those of other libraries, so that they can quickly and easily determine which records are unique to them and which are held in common. The comparison system is built on open-source tools and has been embedded in the Gold Rush framework. The author provides a brief overview of other shared print analysis tools.
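The core of such a comparison can be sketched as set operations over a shared match key, for example the OCLC number. The following is an illustrative sketch, not the Gold Rush implementation; the library names and record keys are invented:

```python
def compare_holdings(holdings: dict) -> dict:
    """For each library, split its match keys (e.g. OCLC numbers)
    into records unique to it vs. records held in common."""
    result = {}
    for lib, keys in holdings.items():
        # Union of every other library's keys.
        others = set().union(*(k for l, k in holdings.items() if l != lib))
        result[lib] = {"unique": keys - others, "shared": keys & others}
    return result

# Invented example: three libraries' deduplicated match keys.
holdings = {
    "Library A": {"ocm100", "ocm101", "ocm102"},
    "Library B": {"ocm101", "ocm103"},
    "Library C": {"ocm102", "ocm103", "ocm104"},
}
report = compare_holdings(holdings)
# Library A uniquely holds ocm100; ocm101 and ocm102 are held in common.
```

In practice the hard part is normalizing MARC records into reliable match keys; once that is done, the unique/common split itself is straightforward set arithmetic.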

Relevância:

70.00% 70.00%

Publicador:

Resumo:

The aim of this research paper is to analyse the key political posters produced for the campaigns of the Irish political party Fianna Fáil during the Celtic Tiger (1997-2008) and post-Celtic Tiger (2009-2012) years. I will focus on the four posters of the candidate for the elections held in 1997, 2002, 2007 and 2011, first observing how the leader is represented and then pinpointing the similarities and possible differences between them. This makes it possible to identify the main linguistic and visual strategies used to persuade the audience to vote for that party and to highlight the power of the politician. Critical discourse analysis tools will help identify the main discursive strategies employed to persuade the Irish population to vote in a certain direction. Van Leeuwen's (2008) social actor theory will facilitate the understanding of how participants are represented in the corpus under analysis. Finally, the main tools of Kress and van Leeuwen's visual grammar (2006) will be applied to the analysis of the images. The study reveals that politicians are represented in a consistently positive way, with status and formal appearance, so that people are persuaded to vote for the party they represent because they trust them as political leaders. The study thus shows that the poster is a powerful tool used in election campaigns to highlight the power of political parties.

Relevância:

70.00% 70.00%

Publicador:

Resumo:

The results of empirical studies are limited to particular contexts, are difficult to generalise, and the studies themselves are expensive to perform. Despite these problems, empirical studies in software engineering can be made effective, and they are important to both researchers and practitioners. The key to their effectiveness lies in maximising the information that can be gained by examining existing studies, conducting power analyses to establish an accurate minimum sample size, and benefiting from previous studies through replication. This approach was applied in a controlled experiment examining the combination of automated static analysis tools and code inspection in the context of verification and validation (V&V) of concurrent Java components. The combination of these V&V technologies was shown to be cost-effective despite the size of the study, which thus contributes to research in V&V technology evaluation.
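The power analysis step mentioned above can be sketched with the standard normal-approximation formula for the per-group sample size of a two-sided, two-sample comparison; the effect size, alpha and power values below are conventional defaults, not figures from the study:

```python
from math import ceil
from statistics import NormalDist

def min_sample_size(effect_size: float, alpha: float = 0.05,
                    power: float = 0.80) -> int:
    """Per-group n for a two-sided two-sample test, using the
    normal approximation n = 2 * ((z_alpha + z_beta) / d) ** 2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for power = 0.80
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A 'large' effect (Cohen's d = 0.8) needs about 25 subjects per group;
# halving the effect size roughly quadruples the requirement.
n_large = min_sample_size(0.8)   # 25
n_medium = min_sample_size(0.5)  # 63
```

Running the calculation before recruiting subjects is exactly the safeguard the abstract advocates: it shows whether a planned study is large enough to detect the effect of interest at all.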