851 results for Correlation based analysis


Relevance: 80.00%

Abstract:

Proof-Carrying Code (PCC) is a general approach to mobile code safety in which the code supplier augments the program with a certificate (or proof). The intended benefit is that the program consumer can locally validate the certificate w.r.t. the "untrusted" program by means of a certificate checker—a process which should be much simpler, more efficient, and more automatic than generating the original proof. Abstraction Carrying Code (ACC) is an enabling technology for PCC in which an abstract model of the program plays the role of certificate. The generation of the certificate, i.e., the abstraction, is automatically carried out by an abstract interpretation-based analysis engine, which is parametric w.r.t. different abstract domains. While the analyzer on the producer side typically has to compute a semantic fixpoint in a complex, iterative process, on the receiver side it is only necessary to check that the certificate is indeed a fixpoint of the abstract semantics equations representing the program. This is done in a single pass in a much more efficient process. ACC addresses the fundamental issues in PCC and opens the door to the applicability of the large body of frameworks and domains based on abstract interpretation as an enabling technology for PCC. We present an overview of ACC and we describe in a tutorial fashion an application to the problem of resource-aware security in mobile code. Essentially, the information computed by a cost analyzer is used to generate cost certificates which attest to a safe and efficient use of the mobile code. The receiving side can then reject code whose cost certificates it cannot validate or whose cost requirements are too large in terms of computing resources (in time and/or space), and accept mobile code which meets the established requirements.
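To make the producer/consumer asymmetry concrete, here is a minimal sketch, assuming the program is abstracted into transfer functions over a finite abstract domain so that iteration terminates (names and structure are our illustration, not the paper's code):

```python
def compute_certificate(transfer, n_points, bottom):
    """Producer side: iterate the abstract semantics equations until a
    fixpoint is reached -- the costly, iterative analysis."""
    state = [bottom] * n_points
    while True:
        new_state = [transfer(i, state) for i in range(n_points)]
        if new_state == state:       # fixpoint reached
            return state             # the abstraction doubles as the certificate
        state = new_state

def check_certificate(transfer, cert):
    """Consumer side: a single pass confirming the certificate is indeed
    a fixpoint of the same equations -- much cheaper than recomputing it."""
    return all(transfer(i, cert) == cert[i] for i in range(len(cert)))
```

The checker never iterates: one application of the equations per program point suffices, which is the efficiency argument made in the abstract.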

Relevance: 80.00%

Abstract:

The relationship between redd superimposition and spawning habitat availability was investigated in the brown trout (Salmo trutta L.) population inhabiting the river Castril (Granada, Spain). Redd surveys were conducted in 24 river sections to estimate the rate of redd superimposition. Used and available microhabitat was evaluated to compute the suitable spawning habitat (SSH) for brown trout. After analysing the microhabitat characteristics positively selected by females, SSH was defined as an area that met all the following five requirements: water depth between 10 and 50 cm, mean water velocity between 30 and 60 cm s⁻¹, bottom water velocity between 15 and 60 cm s⁻¹, substrate size between 4 and 30 mm and no embeddedness. Simple regression analyses showed that redd superimposition was not correlated with redd numbers, SSH or redd density. A simulation-based analysis was performed to estimate the superimposition rate if redds were randomly placed inside the SSH. This analysis revealed that the observed superimposition rate was higher than expected in 23 of 24 instances, this difference being significant (P < 0.05) in eight instances and right at the limit of statistical significance (P = 0.05) in another eight instances. Redd superimposition was high in sections with high redd density. High superimposition, however, was not exclusive to high-redd-density sections and was also found in moderate- and low-redd-density sections. This suggests that factors other than habitat availability are also responsible for redd superimposition. We argue that female preference for spawning over previously excavated redds may be the most likely explanation for high superimposition at lower densities.
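The simulation-based test described can be sketched as a Monte Carlo randomization, under our own simplifying assumption that the SSH is discretised into redd-sized cells (this is an illustration, not the authors' code):

```python
import random

def simulated_superimposition_rates(n_redds, n_cells, n_trials=1000, seed=1):
    """Superimposition rates expected if redds were placed uniformly at
    random within the SSH, here discretised into redd-sized cells."""
    rng = random.Random(seed)
    rates = []
    for _ in range(n_trials):
        occupied, overlaps = set(), 0
        for _ in range(n_redds):
            cell = rng.randrange(n_cells)
            if cell in occupied:
                overlaps += 1        # this redd superimposes an earlier one
            occupied.add(cell)
        rates.append(overlaps / n_redds)
    return rates

def p_value(observed_rate, simulated_rates):
    """One-sided Monte Carlo P-value: how often random placement produces
    a superimposition rate at least as high as the one observed."""
    return sum(r >= observed_rate for r in simulated_rates) / len(simulated_rates)
```

A small P-value then means the observed rate exceeds what random placement inside the SSH would produce, which is the pattern reported in the abstract.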

Relevance: 80.00%

Abstract:

This Thesis presents two related lines of research contributing to the areas of Human-Technology (or Machine) Interaction (HTI, or HMI), computational linguistics, and user experience evaluation. The two lines in question are the design and the user-centred evaluation of advanced Human-Machine Interaction systems. The first part of the Thesis (Chapters 2 to 4) addresses fundamental questions in the design of advanced HMI systems. Chapter 2 presents an overview of the state of the art in research on multimodal conversational systems, which frames the research work presented in the rest of the Thesis. Chapters 3 and 4 focus on two major aspects of HMI design: a generalised dialogue manager for handling context-aware, multimodal Human-Machine Interaction, and the use of embodied conversational agents (ECAs) to improve dialogue robustness, respectively. Chapter 3, on dialogue management, addresses the handling of heterogeneous information coming from the communication modalities and from external sensors. At a high level of abstraction, it proposes an architecture for dialogue management with heterogeneous inflows of information, built on State Chart XML. Chapter 4 presents a contribution to the internal representation of communicative intentions and their translation into sequences of gestures to be executed by an ECA, designed specifically to improve robustness in critical dialogue situations that can arise, for example, when understanding errors occur in the communication between the human user and the machine. An extension of the Functional Mark-up Language defined in the SAIBA conceptual framework is proposed here. This extension makes it possible to represent communicative acts that realise intentions of the sender (the machine) which are not meant to be consciously perceived by the receiver (the human user), but which are intended to influence the receiver and the course of the dialogue. This is achieved by means of an object called the Communication Intention Base (CIB). Representing such "non-declared" intentions in the CIB, in addition to explicit ones, allows the construction of communicative acts that realise several communicative intentions simultaneously. Chapter 4 also describes an experimental system for the (simulated) remote control of a home automation assistant, with speaker authentication to grant access, and with an ECA in the interface of each of these tasks. A description is included of the verbal and non-verbal behaviour sequences of the ECAs, which were designed specifically for particular situations in order to improve dialogue robustness. Chapters 5 to 7 make up the part of the Thesis devoted to evaluation. Chapter 5 reviews relevant precedents in the literature on information technologies in general, and on spoken interaction systems in particular. The main precedents in interaction evaluation on which the work presented in this Thesis builds are the Technology Acceptance Model (TAM), the Subjective Assessment of Speech System Interfaces (SASSI) tool, and ITU-T Recommendation P.851.
Chapter 6 describes an evaluation framework and methodology applied to the user experience with multimodal HMI systems. A novel framework for the subjective evaluation of the quality of the user experience, and of its relation to user acceptance of the HMI technology, was developed for this purpose (the Subjective Quality Evaluation Framework). The framework articulates a structure of classes of subjective factors related to user satisfaction with, and acceptance of, the proposed HMI technology. This structure, as proposed in the present Thesis, has two orthogonal dimensions. First, three broad classes of parameters related to user acceptance are identified: likeability (factors that have to do with the experience of use, without entering into judgements of usefulness), rejection (factors which can only have a negative valence), and perception of usefulness. Second, this set of classes is replicated across different "levels, or foci, of user perception". These include, at a minimum, a level of overall assessment of the system, levels corresponding to the tasks to be performed and the goals to be achieved, and an interface level (in the cases proposed in this Thesis, the interface is a dialogue system with or without an ECA). Chapter 7 presents an empirical evaluation of the system described in Chapter 4. The study builds on the precedents in the literature mentioned above, extended with parameters for the specific study of the animated agents (the ECAs), the users' self-assessment of their emotions, and certain rejection factors (specifically, privacy and security concerns). The subjective quality evaluation framework proposed in the previous chapter is also evaluated. The factor analyses performed reveal a parameter structure conceptually very close to the usefulness-likeability-rejection class division proposed in that framework, a result that lends the framework some empirical validity. Analyses based on linear regressions reveal structures of dependency and interrelation between the subjective and objective parameters considered. The central mediation effect described in the Technology Acceptance Model, that of perceived usefulness on the dependency relation between intention to use and perceived ease of use, is confirmed in the study presented in this Thesis. Moreover, in the particular study presented here, this structure of relations was found to strengthen when the variables considered are generalised to cover more broadly the likeability and usefulness categories contemplated in the subjective quality evaluation framework. It was also observed that the rejection factors emerge as a component of their own in the factor analyses and are further distinguished by their behaviour: they moderate the relation between intention to use (the main indicator of user acceptance) and its strongest predictor, perceived usefulness. Results of lesser importance are also presented concerning the effects of the ECAs on the interfaces of the dialogue systems and on the perception parameters and user judgements that play a role in shaping user acceptance of the technology.
Although slightly better performance of the spoken interaction was observed with the ECAs, subjective opinions were very similar between the two experimental groups (one interacting with a dialogue system with an ECA, the other without). Among the small differences found between the two groups, the following stand out: in the experimental group without an ECA (that is, with a voice-only interface) a more direct effect of dialogue problems (for example, recognition errors) on the perception of robustness was observed, whereas the ECA group showed a more positive emotional response when problems occurred. The ECAs seem to generate higher initial expectations of the system's capabilities, and the users in this group declared themselves more self-confident in their interaction. Finally, some indications of social effects of the ECAs were observed: the perceived "friendliness" of the ECAs was correlated with an increase in security concerns. Likewise, users of the system with ECAs tended more to blame themselves, rather than the system, for any dialogue problems that arose, while a slight opposite tendency was observed among users of the voice-only system. ABSTRACT This Thesis presents two related lines of research work contributing to the general fields of Human-Technology (or Machine) Interaction (HTI, or HMI), computational linguistics, and user experience evaluation. These two lines are the design and user-focused evaluation of advanced Human-Machine (or Technology) Interaction systems. The first part of the Thesis (Chapters 2 to 4) is centred on advanced HMI system design. Chapter 2 provides a background overview of the state of research in multimodal conversational systems. This sets the stage for the research work presented in the rest of the Thesis. Chapters 3 and 4 focus on two major aspects of HMI design in detail: a generalised dialogue manager for context-aware multimodal HMI, and embodied conversational agents (ECAs, or animated agents) to improve dialogue robustness, respectively. Chapter 3, on dialogue management, deals with how to handle information heterogeneity, both from the communication modalities and from external sensors. A highly abstracted architectural contribution based on State Chart XML is proposed. Chapter 4 presents a contribution for the internal representation of communication intentions and their translation into gestural sequences for an ECA, especially designed to improve robustness in critical dialogue situations such as when miscommunication occurs. We propose an extension of the functionality of the Functional Mark-up Language, as envisaged in much of the work in the SAIBA framework. Our extension allows the representation of communication acts that carry intentions that are not for the interlocutor to know of, but which are made to influence him or her as well as the flow of the dialogue itself. This is achieved through a design element we have called the Communication Intention Base. Such representation of "non-declared" intentions allows the construction of communication acts that carry several communication intentions simultaneously. Also in Chapter 4, an experimental system is described which allows (simulated) remote control of a home automation assistant, with biometric (speaker) authentication to grant access, featuring embodied conversational agents for each of the tasks.
The discussion includes a description of the behavioural sequences for the ECAs, which were designed for specific dialogue situations with particular attention given to the objective of improving dialogue robustness. Chapters 5 to 7 form the evaluation part of the Thesis. Chapter 5 reviews evaluation approaches in the literature for information technologies, as well as in particular for speech-based interaction systems, that are useful precedents to the contributions of the present Thesis. The main evaluation precedents on which the work in this Thesis has built are the Technology Acceptance Model (TAM), the Subjective Assessment of Speech System Interfaces (SASSI) tool, and ITU-T Recommendation P.851. Chapter 6 presents the author's work in establishing an evaluation framework and methodology applied to the users' experience with multimodal HMI systems. A novel user-acceptance Subjective Quality Evaluation Framework was developed by the author specifically for this purpose. A class structure arises from two orthogonal sets of dimensions. First we identify three broad classes of parameters related with user acceptance: likeability factors (those that have to do with the experience of using the system), rejection factors (which can only have a negative valence) and perception of usefulness. Secondly, the class structure is further broken down into several "user perception levels"; at the very least: an overall system-assessment level, task and goal-related levels, and an interface level (e.g., a dialogue system with or without an ECA). An empirical evaluation of the system described in Chapter 4 is presented in Chapter 7. The study was based on the abovementioned precedents in the literature, expanded with categories covering the inclusion of an ECA, the users' self-assessed emotions, and particular rejection factors (privacy and security concerns). The Subjective Quality Evaluation Framework proposed in the previous chapter was also scrutinised. Factor analyses revealed an item structure very much related conceptually to the usefulness-likeability-rejection class division introduced above, thus giving it some empirical weight. Regression-based analysis revealed structures of dependencies, paths of interrelations, between the subjective and objective parameters considered. The central mediation effect, in the Technology Acceptance Model, of perceived usefulness on the dependency relationship of intention-to-use with perceived ease of use was confirmed in this study. Furthermore, the pattern of relationships was stronger for variables covering more broadly the likeability and usefulness categories in the Subjective Quality Evaluation Framework. Rejection factors were found to have a distinct presence as components in factor analyses, as well as distinct behaviour: they were found to moderate the relationship between intention-to-use (the main measure of user acceptance) and its strongest predictor, perceived usefulness. Insights of secondary importance are also given regarding the effect of ECAs on the interface of spoken dialogue systems and the dimensions of user perception and judgement that may have a role in determining user acceptance of the technology. Despite observing slightly better performance values in the case of the system with the ECA, subjective opinions regarding both systems were, overall, very similar. Minor differences between the two experimental groups (one interacting with an ECA, the other only through speech) include a more direct effect of dialogue problems (e.g., non-understandings) on perceived dialogue robustness for the voice-only interface test group, and a more positive emotional response for the ECA test group. Our findings further suggest that the ECA generates higher initial expectations, and users seem slightly more confident in their interaction with the ECA than do those without it. Finally, mild evidence of social effects of ECAs was also found: the perceived friendliness of the ECA increased security concerns, and ECA users may tend to blame themselves rather than the system when dialogue problems are encountered, while the opposite may be true for voice-only users.
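The regression-based mediation check invoked here (from the Technology Acceptance Model) can be sketched in a few lines of ordinary least squares, assuming score vectors for perceived ease of use, perceived usefulness, and intention to use; this is a minimal illustration, not the thesis's actual analysis code:

```python
import numpy as np

def ols(y, X):
    """Ordinary least squares with an intercept; returns the coefficients
    (intercept first, then one slope per column of X)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def tam_mediation(peou, pu, itu):
    """Baron-Kenny style check of the TAM mediation effect: perceived
    usefulness (pu) mediating the effect of perceived ease of use (peou)
    on intention to use (itu). Mediation is suggested when the direct
    effect shrinks relative to the total effect."""
    total = ols(itu, peou)[1]                                 # itu ~ peou
    a = ols(pu, peou)[1]                                      # pu  ~ peou
    b, direct = ols(itu, np.column_stack([pu, peou]))[1:3]    # itu ~ pu + peou
    return {"total": total, "indirect": a * b, "direct": direct}
```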

Relevance: 80.00%

Abstract:

The increasing use of video editing software requires faster and more efficient editing tools. As a first step, these tools perform a temporal segmentation into shots that allows the later building of indexes describing the video content. Here, we propose a novel real-time, high-quality shot detection strategy, suitable for the latest generation of video editing software, which requires both low computational cost and high-quality results. While abrupt transitions are detected through a very fast pixel-based analysis, gradual transitions are obtained from an efficient edge-based analysis. Both analyses are reinforced with a motion analysis that helps to detect and discard false detections. This motion analysis is carried out exclusively over a reduced set of candidate transitions, thus maintaining the computational requirements demanded by new applications to fulfil user needs.
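As a rough illustration of the pixel-based first pass for abrupt transitions, here is a minimal OpenCV sketch; the threshold and difference metric are our assumptions, and the paper's edge-based and motion-based stages are omitted:

```python
import cv2
import numpy as np

def detect_abrupt_cuts(video_path, threshold=30.0):
    """Flag frames whose mean absolute pixel difference from the previous
    frame exceeds a threshold -- the cheap first pass for abrupt cuts.
    Gradual transitions and false-positive filtering (the edge and motion
    analyses described in the abstract) are not modelled here."""
    cap = cv2.VideoCapture(video_path)
    cuts, prev, idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            diff = np.mean(cv2.absdiff(gray, prev))
            if diff > threshold:
                cuts.append(idx)     # candidate abrupt transition
        prev, idx = gray, idx + 1
    cap.release()
    return cuts
```

Running only this cheap stage on every frame, and reserving heavier analysis for the candidates it produces, is what keeps the overall computational cost low.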

Relevance: 80.00%

Abstract:

This paper describes a corpus-based analysis of the humanizing metaphor and supports the claim that constitutive metaphor in science and technology may be highly metaphorical and active. The study, grounded in Lakoff's Theory of Metaphor and in Langacker's relational networks, consists of two phases: firstly, Earth Science metaphorical terms were extracted from databases and dictionaries and then contextualized by means of the "Wordsmith" tool in a digitalized corpus created to establish their productivity. Secondly, the terms were classified to disclose the main conceptual metaphors underlying them; then, the mappings and the relational networks of the metaphor were described. The results confirm the systematicity and productivity of the metaphor in this field, show evidence that the metaphoricity of scientific terms is gradable, and support the view that Earth Science metaphors are created not only in terms of concrete salient properties and attributes, but also through abstract anthropocentric projections.
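The contextualisation step, producing key-word-in-context (KWIC) concordance lines of the kind a concordancer such as Wordsmith provides, can be sketched as follows (our illustration, with an invented example term):

```python
import re

def kwic(corpus_text, term, window=40):
    """Return key-word-in-context (KWIC) lines for a term: each match
    flanked by `window` characters of left and right context."""
    lines = []
    for m in re.finditer(r"\b" + re.escape(term) + r"\b", corpus_text, re.IGNORECASE):
        left = corpus_text[max(0, m.start() - window):m.start()]
        right = corpus_text[m.end():m.end() + window]
        lines.append(f"{left:>{window}} [{m.group(0)}] {right}")
    return lines

# The productivity of a candidate metaphorical term can then be estimated
# from its frequency in context, e.g. len(kwic(corpus, "mouth")) for uses
# such as "mouth of a river" (a hypothetical humanizing-metaphor example).
```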

Relevance: 80.00%

Abstract:

We propose a new algorithm for the design of prediction structures with low delay and limited penalty in rate-distortion performance for multiview video coding schemes. This algorithm constitutes one of the elements of a framework, based on graph theory, for the analysis and optimization of delay in multiview coding schemes. The objective of the algorithm is to find the best combination of prediction dependencies to prune from a multiview prediction structure, given a number of cuts. Taking into account the properties of the graph-based analysis of the encoding delay, the algorithm is able to find the best prediction dependencies to eliminate from an original prediction structure while limiting the number of cut combinations to evaluate. We show that this algorithm obtains optimum results in the reduction of the encoding latency with a lower computational complexity than exhaustive-search alternatives.
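The paper's algorithm is not reproduced here, but the underlying idea, cutting dependencies from a prediction DAG to shorten the longest dependency chain, can be illustrated with a greedy stand-in (hypothetical names; an exhaustive search would instead evaluate every cut combination, and unlike the paper's method this greedy sketch does not guarantee optimality):

```python
# The prediction structure is a DAG: an edge (u, v) means frame v is
# predicted from frame u; latency is approximated by the longest chain.

def latency(nodes, edges):
    """Length of the longest dependency chain (unit cost per edge)."""
    preds = {v: [u for (u, w) in edges if w == v] for v in nodes}
    memo = {}
    def depth(v):
        if v not in memo:
            memo[v] = 0 if not preds[v] else 1 + max(depth(u) for u in preds[v])
        return memo[v]
    return max(depth(v) for v in nodes)

def prune_greedy(nodes, edges, n_cuts, rd_penalty):
    """Remove n_cuts edges, at each step cutting the edge with the largest
    latency gain, breaking ties by the smallest rate-distortion penalty."""
    edges = set(edges)
    for _ in range(n_cuts):
        base = latency(nodes, edges)
        best = min(edges, key=lambda e: (latency(nodes, edges - {e}) - base,
                                         rd_penalty[e]))
        edges.remove(best)
    return edges
```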

Relevance: 80.00%

Abstract:

A paradigm shift is taking place in geodesy in the conception of digital terrain models, moving from designing the model with the fewest possible points to doing so with hundreds of thousands or millions of points. This change is a consequence of the introduction of new technologies such as laser scanning, radar interferometry and image processing. The rapid acceptance of these new technologies is mainly due to the high speed of data capture, the accessibility afforded by not requiring a prism, and the high degree of detail of the models. Classic survey methods are based on discrete point measurements which, taken as a whole, form a model; their precision derives from the precision of the individual measurement of these points. Terrestrial laser scanning (TLS) technology is a different approach to generating a model of the observed object. The point clouds produced by TLS scanning are treated as a whole by means of area-based analysis, so that the final model is no longer an aggregation of points but the best surface fitted to the point clouds. When comparing the precision of capturing single points with tacheometric methods and with TLS equipment, the inferiority of the latter is clear; however, in the treatment of point clouds with area-based analysis methods, acceptable precision has been obtained, and the incorporation of this technology into studies of deformations and movements of structures has become fully feasible. Notable TLS applications include heritage recording, the recording of construction stages of industrial plants and structures, accident reporting, and the monitoring of ground movements and structural deformations. In dam monitoring, compared with the monitoring of specific points inside the dam, on its crest, or on its face, having a continuous model of the downstream face of the dam opens the possibility of introducing surface deformation analysis methods and of building behavior models that improve the understanding and prediction of dam movements. However, the application of TLS technology to dam monitoring should be considered a method complementary to the existing ones. While pendulums and the recent technique based on the differential global positioning system (DGPS) give continuous information on the movements of certain points of the dam, TLS allows the seasonal evolution to be followed and possible problem areas to be detected over the whole face. This work analyses the characteristics of TLS technology and the parameters involved in the final precision of the scans. The need to use equipment based on the direct measurement of time of flight, also called pulsed, is established for distances between 100 m and 300 m. The application of TLS to the modeling of structures and vertical walls is studied. The factors that influence the final precision are analyzed, such as point-cloud registration, target type, and the combined effect of scanning angle and distance. Finally, the movements given by the direct pendulums of a dam are compared with those obtained from the analysis of the point clouds from several scanning campaigns of the same dam.
The use of pattern charts relating precision or accuracy to the factors of scanning distance and angle is proposed and validated for the design of fieldwork. Their application in preparing the fieldwork for a scanning campaign aimed at monitoring the movements of a dam is presented, and recommendations are made for applying the TLS technique to large structures. The pattern chart of a specific mid-range TLS instrument was produced. For this purpose, two field tests were carried out under real working conditions, scanning across the instrument's full range of distances and scanning angles. Two methods for obtaining the precision of wall modeling and of detecting wall movements are analyzed: the "plane of best fit" method and the "simulated deformation" method. Finally, the results of comparing the seasonal movements of an arch-gravity dam as recorded by the direct pendulums and as obtained from TLS scans are presented. The results show differences of millimeters, the best being of the order of one millimeter. The methodology used is explained, and considerations are given regarding the density of the point clouds and the size of the triangular meshes. A paradigm shift in the conception of survey digital models is taking place in geodesy, moving from designing a model with the fewest possible number of points to models of hundreds of thousands or millions of points. This change has happened because of the introduction of new technologies such as the laser scanner, radar interferometry and image processing. The fast acceptance of these new technologies has been due mainly to the great speed of data capture, to its accessibility as a reflectorless technique, and to the high degree of detail of the models. Classic survey methods are based on discrete measurements of points that, considered as a whole, form a model; the precision of the model is then derived from the precision of measuring the individual points. Terrestrial laser scanner (TLS) technology represents a different approach to generating a model of the observed object. Point clouds, the result of a TLS scan, are treated as a whole by means of area-based analysis, so the final model is not an aggregation of points but the best surface fitted to the point cloud. When comparing the precision of capturing single points with tacheometric methods against TLS equipment, the inferiority of the latter is clear; but it is in the treatment of point clouds, using area-based analysis methods, that acceptable precision has been obtained, making it possible to consider the incorporation of this technology for monitoring structural deformations. Notable TLS applications include the recording of cultural heritage, the recording of construction stages of industrial plants and structures, accident reporting, and the monitoring of land movements and structural deformations. Compared with classical dam monitoring, an approach based on recording a set of points, having a continuous model of the downstream face opens the possibility of introducing deformation analysis methods and behavior models that improve the understanding and forecasting of dam movements.
However, the application of TLS technology to dam monitoring must be considered a method complementary to the existing ones. Pendulums, and more recently the differential global positioning system (DGPS), give continuous information on the movements of certain points of the dam, whereas TLS allows its seasonal evolution to be followed and damaged zones of the dam to be detected. A review of the characteristics of TLS technology and of the factors affecting the final precision of the scanning data is presented. The need to select TLS equipment based on the direct time-of-flight method, also called pulsed, is established for scanning distances between 100 m and 300 m. The modeling of structures and vertical walls is studied. Factors that influence the final precision, such as point-cloud registration, target type, and the combined effect of scanning distance and angle of incidence, are analyzed. Finally, a comparison between the movements given by the direct pendulums of a dam and those obtained from the analysis of point clouds is presented. A new approach to obtaining a complete map-type plot of the precision of TLS equipment based on the direct measurement of time of flight at mid-range distances is presented. Tests were carried out under field-like conditions, similar to dam monitoring and other civil engineering works. Taking advantage of graphic semiological techniques, a "distance – angle of incidence" map was designed and evaluated for field-like conditions. A map-type plot was designed combining isolines with sized, grey-scale points proportional to the precision values they represent. Precision under different field conditions was compared with specifications. For this purpose, point clouds were evaluated under two approaches: the standard "plane of best fit" and the proposed "simulated deformation", which showed improved performance. These results lead to a discussion and recommendations about optimal TLS operation in civil engineering works. Finally, results are shown comparing the seasonal movements of an arch-gravity dam as registered by the direct pendulums and as obtained from the TLS scans. The results show differences of a few millimeters, the best being around one millimeter. The methodology used is explained, and considerations regarding point-cloud density and the size of the triangular meshes are given.
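Of the two precision methods named, the "plane of best fit" is simple to sketch: fit a plane to the point cloud by total least squares and report the RMS of the orthogonal residuals. A minimal illustration, assuming the scanned wall is an N x 3 NumPy array (the "simulated deformation" method is not modelled here):

```python
import numpy as np

def plane_of_best_fit(points):
    """Fit a plane to an N x 3 point cloud by SVD (total least squares);
    return (centroid, unit normal, RMS of orthogonal residuals). The
    residual RMS is the kind of precision figure the 'plane of best fit'
    method yields for a scanned wall or dam face."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                              # direction of least variance
    residuals = (points - centroid) @ normal     # signed orthogonal distances
    return centroid, normal, np.sqrt(np.mean(residuals ** 2))
```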

Relevance: 80.00%

Abstract:

This doctoral thesis studies two theological methods in dialogue, Karl Rahner's transcendental anthropology and Paul Tillich's correlation, starting from the systematisation of the existential ontology elaborated by the young Heidegger in Sein und Zeit. In both methods, what is at issue is the depth of being, in its possibility of approaching, and going beyond, what cannot be said. In this regard, both the Thomist metaphysics recovered by Rahner and the Protestant secularisation and abyss of being emphasised by Tillich assume the impossibility of stating the content of the sacred, precisely because they are situated either in the non-concept of the ontological root (Rahner) or in the excess of meaning of the Ultimate Concern (Tillich). From these impossibilities, whether as a before and an after (Rahner) or as an after and a before in depth (Tillich), what remains is the ontological question as the possibility of the opening of being, in both cases without content, so that the answer, which likewise does not answer, is given as the (im)possibility of the displacement of being. In Rahner's recovery of Thomist metaphysics, there is a displacement from a meaning, in a certain way linear, hence transcendental anthropology. In Tillich's Protestant reading, the displacement occurs through a dialectic, being threatened by non-being, as a broken unity, hence correlation. The thesis is presented in four chapters. The first offers a reading of Heidegger from the perspective of the hermeneutics of understanding. The second reflects on the possibility of understanding through ontological rationality in correlation, Paul Tillich, through the mediation of the symbolic. The third, following the same concerns, reads the epistemology of ontological rationality in transcendental anthropology, Karl Rahner, through the mediation of pre-apprehension. The fourth chapter presents resulting, and traditional, theological themes in dialogue on the basis of the two methods. (AU)

Relevance: 80.00%

Abstract:

The aim of this study is to analyse film criticism during the period known as the Retomada (revival) of Brazilian national cinema. The largest-circulation outlets in the country's South-East region were chosen, the newspapers O Globo (Rio de Janeiro), Folha de S.Paulo and O Estado de S.Paulo (São Paulo) and the magazine Veja, for a content analysis of the reviews of the six biggest box-office successes of the Retomada, followed by an examination of how those reviews were received by the films' directors, through semi-structured interviews. The intention is to analyse which conflicts permeate the relationship between film critics and filmmakers, in order to contribute to a better understanding of the work of two of the most important actors in the field of cinema. Comparing the analysed material with the interviews confirmed the hypothesis that conflicts of values and opinions exist between the two sides, and made it possible to identify prejudgements, sympathies and antipathies, and emotional, unfounded analyses from some critics and filmmakers, but also well-founded opinions and values from others, revealing a rich diversity that does not fit a single definition. (AU)

Relevance: 80.00%

Abstract:

Members of the bacterial families Haemophilus and Neisseria, important human pathogens that commonly colonize the nasopharynx, are naturally competent for DNA uptake from their environment. In each genus this process discriminates in favor of its own and against foreign DNA through the sequence specificity of DNA receptors. The Haemophilus DNA uptake apparatus binds a 29-bp oligonucleotide domain containing a highly conserved 9-bp core sequence, whereas the neisserial apparatus binds a 10-bp motif. Each motif ("uptake sequence", US) is highly over-represented in the chromosome of the corresponding genus, particularly concentrated with core sequences in inverted pairs forming gene terminators. Two Haemophilus core USs were unexpectedly found forming the terminator of sodC in Neisseria meningitidis (meningococcus), and sequence analysis strongly suggests that this virulence gene, located next to IS1106, arose through horizontal transfer from Haemophilus. By using USs as search strings in a computer-based analysis of genome sequence, it was established that while USs of the "wrong" genus do not occur commonly in Neisseria or Haemophilus, where they do occur they are highly likely to flag domains of chromosomal DNA that have been transferred from Haemophilus. Three independent domains of Haemophilus-like DNA were found in the meningococcal chromosome, associated respectively with the virulence gene sodC, the bio gene cluster, and an unidentified orf. This report identifies intergenerically transferred DNA and its source in bacteria, and further identifies transformation with heterologous chromosomal DNA as a way of establishing potentially important chromosomal mosaicism in these pathogenic bacteria.
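The genome scan described, using USs as search strings, reduces to motif matching on both strands. A minimal sketch follows; the 9-bp core AAGTGCGGT is the commonly cited Haemophilus uptake-sequence core, stated here as an assumption rather than taken from this abstract:

```python
import re

def revcomp(seq):
    """Reverse complement of a DNA string."""
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def find_core_us(genome, core="AAGTGCGGT"):
    """Positions of the core uptake sequence on both strands. A local
    cluster of 'wrong'-genus USs flags a candidate horizontally
    transferred domain, as in the sodC example above."""
    hits = [(m.start(), "+") for m in re.finditer(core, genome)]
    hits += [(m.start(), "-") for m in re.finditer(revcomp(core), genome)]
    return sorted(hits)
```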

Relevance: 80.00%

Abstract:

Gene expression profiling provides powerful analyses of transcriptional responses to cellular perturbation. In contrast to DNA array-based methods, reporter gene technology has been underused for this application. Here we describe a genomewide, genome-registered collection of Escherichia coli bioluminescent reporter gene fusions. DNA sequences from plasmid-borne, random fusions of E. coli chromosomal DNA to a Photorhabdus luminescens luxCDABE reporter allowed precise mapping of each fusion. The utility of this collection covering about 30% of the transcriptional units was tested by analyzing individual fusions representative of heat shock, SOS, OxyR, SoxRS, and cya/crp stress-responsive regulons. Each fusion strain responded as anticipated to environmental conditions known to activate the corresponding regulatory circuit. Thus, the collection mirrors E. coli's transcriptional wiring diagram. This genomewide collection of gene fusions provides an independent test of results from other gene expression analyses. Accordingly, a DNA microarray-based analysis of mitomycin C-treated E. coli indicated elevated expression of expected and unanticipated genes. Selected luxCDABE fusions corresponding to these up-regulated genes were used to confirm or contradict the DNA microarray results. The power of partnering gene fusion and DNA microarray technology to discover promoters and define operons was demonstrated when data from both suggested that a cluster of 20 genes encoding production of type I extracellular polysaccharide in E. coli form a single operon.
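The cross-check described, confirming microarray calls with reporter fusions, amounts to intersecting the up-regulated gene list with the genome-registered fusion collection. A hypothetical sketch (the data structures, thresholds, and strain names are our assumptions, not the paper's):

```python
def select_fusions_to_test(fold_changes, fusion_map, threshold=2.0):
    """Map genes the microarray calls up-regulated (fold change >= threshold)
    to the lux fusion strains, where available, that report on them, so
    bioluminescence can confirm or contradict the array result."""
    up = {gene for gene, fc in fold_changes.items() if fc >= threshold}
    return {gene: fusion_map[gene] for gene in sorted(up) if gene in fusion_map}

# Example with invented values:
# select_fusions_to_test({"recA": 5.1, "lexA": 3.2, "araB": 1.1},
#                        {"recA": "fusion_0417", "lexA": "fusion_1022"})
# -> {"lexA": "fusion_1022", "recA": "fusion_0417"}
```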

Relevance: 80.00%

Abstract:

Although much of the brain’s functional organization is genetically predetermined, it appears that some noninnate functions can come to depend on dedicated and segregated neural tissue. In this paper, we describe a series of experiments that have investigated the neural development and organization of one such noninnate function: letter recognition. Functional neuroimaging demonstrates that letter and digit recognition depend on different neural substrates in some literate adults. How could the processing of two stimulus categories that are distinguished solely by cultural conventions become segregated in the brain? One possibility is that correlation-based learning in the brain leads to a spatial organization in cortex that reflects the temporal and spatial clustering of letters with letters in the environment. Simulations confirm that environmental co-occurrence does indeed lead to spatial localization in a neural network that uses correlation-based learning. Furthermore, behavioral studies confirm one critical prediction of this co-occurrence hypothesis, namely, that subjects exposed to a visual environment in which letters and digits occur together rather than separately (postal workers who process letters and digits together in Canadian postal codes) do indeed show less behavioral evidence for segregated letter and digit processing.
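The co-occurrence simulation described can be illustrated with a minimal competitive, correlation-based learning sketch (our construction, not the authors' model):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_pattern(together):
    """Input units 0-3 stand for 'letters', 4-7 for 'digits'. In the
    'separate' environment two items from the same category co-occur;
    in the 'together' environment (the postal-code case) a letter
    co-occurs with a digit."""
    x = np.zeros(8)
    if together:
        x[rng.integers(0, 4)] = 1.0
        x[rng.integers(4, 8)] = 1.0
    else:
        lo = 0 if rng.random() < 0.5 else 4
        x[rng.integers(lo, lo + 4)] = 1.0
        x[rng.integers(lo, lo + 4)] = 1.0
    return x

def train(together, n_out=4, steps=5000, lr=0.05):
    """Correlation-based learning: the best-matching cortical unit moves
    toward the current pattern, so co-occurring inputs cluster together."""
    W = rng.random((n_out, 8)) * 0.1
    for _ in range(steps):
        x = make_pattern(together)
        winner = np.argmax(W @ x)
        W[winner] += lr * (x - W[winner])
    return W

def letter_share(W):
    """Fraction of each unit's weight on letter inputs: values near 0 or 1
    indicate segregated letter/digit representations (separate environment);
    values near 0.5 indicate mixed ones (letters-with-digits environment)."""
    return W[:, :4].sum(axis=1) / W.sum(axis=1)
```

Comparing letter_share(train(together=False)) with letter_share(train(together=True)) mirrors the prediction tested on the postal workers: co-occurrence in the environment yields mixed rather than segregated representations.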

Relevance: 80.00%

Abstract:

The availability of gene-targeted mice deficient in the urokinase-type plasminogen activator (uPA), urokinase receptor (uPAR), tissue-type plasminogen activator (tPA), and plasminogen permits a critical, genetic-based analysis of the physiological and pathological roles of the two mammalian plasminogen activators. We report a comparative study of animals with individual and combined deficits in uPAR and tPA and show that these proteins are complementary fibrinolytic factors in mice. Sinusoidal fibrin deposits are found within the livers of nearly all adult mice examined with a dual deficiency in uPAR and tPA, whereas fibrin deposits are never found in livers collected from animals lacking uPAR and rarely detected in animals lacking tPA alone. This is the first demonstration that uPAR has a physiological role in fibrinolysis. However, uPAR-/-/tPA-/- mice do not develop the pervasive, multi-organ fibrin deposits, severe tissue damage, reduced fertility, and high morbidity and mortality observed in mice with a combined deficiency in tPA and the uPAR ligand, uPA. Furthermore, uPAR-/-/tPA-/- mice do not exhibit the profound impairment in wound repair seen in uPA-/-/tPA-/- mice when they are challenged with a full-thickness skin incision. These results indicate that plasminogen activation focused at the cell surface by uPAR is important in fibrin surveillance in the liver, but that uPA supplies sufficient fibrinolytic potential to clear fibrin deposits from most tissues and support wound healing without the benefit of either uPAR or tPA.