813 results for Rorty, Richard - Criticism and interpretation
Abstract:
The ecosystem services (ES) concept is becoming a cornerstone of contemporary sustainability thought. Challenges with this concept and its applications are well documented, but have not yet been systematically assessed alongside strengths and the external factors that influence uptake. Such an assessment could form the basis for improving ES thinking and further embedding it into environmental decisions and management. The Young Ecosystem Services Specialists (YESS) completed a Strengths–Weaknesses–Opportunities–Threats (SWOT) analysis of ES through YESS member surveys. Strengths include the interdisciplinary nature of the approach and its usefulness as a communication tool. Weaknesses include an incomplete scientific basis, inconsistently applied frameworks, and difficulty in accounting for nature's intrinsic value. Opportunities include alignment with existing policies and established methodologies, and increasing environmental awareness. Threats include resistance to change and difficulty with interdisciplinary collaboration. Consideration of the SWOT themes suggested five strategic areas for developing and implementing ES. The ES concept could improve decision-making related to natural resource use and the interpretation of the complexities of human-nature interactions. It is contradictory: valued as a simple means of communicating the importance of conservation, whilst also considered an oversimplification characterised by ambiguous language. Nonetheless, given sufficient funding and political will, the ES framework could facilitate interdisciplinary research and ensure decision-making that supports sustainable development.
Abstract:
"Hole in the Head" is a play about a woman who wakes up. Maude wakes up in the first act, and in every subsequent scene she undergoes some form of physical or emotional awakening as characters walk in and out of her front door."Hole in the Head" is accompanied by an introduction that attempts to understand the interplay between creativity and academia through an analysis of theatre, feminist and queer theory, and science.
Abstract:
Alessandro Baricco is an Italian author, pianist, journalist and music critic, among many other talents. His novels have won great critical acclaim in Italy and France and are popular around the world. While he is generally considered one of the postmodern writers, some critics have accused him of being a forerunner of a 1990s movement dubbed letteratura giovanile, that is, juvenile literature that is simplistic, targets a young audience and is created for the sole purpose of making money. This criticism is unwarranted. Baricco is a multitalented author who pays strict attention to the quality of his work and weaves plotlines replete with a diverse set of genres, literary devices and symbolism, often inspired by other great writers and thinkers. However, literary critics have yet to acknowledge one of Baricco's strongest and most important influences: Homer, the ancient Greek bard and author of the epic poems the Iliad and the Odyssey. Reading Baricco's work in a Homeric context can help establish it as valid and important work, worthy of scholarly discussion and interpretation, rather than, as some critics charge, a one-dimensional story meant only for children. This paper argues that Baricco's work is Homeric and that his implementation of many of Homer's devices, such as his understanding of his audience and his use of rhythmic language and stereotyped story patterns, has contributed to his great success and popularity.
Abstract:
Clinical text understanding (CTU) is of interest to health informatics because critical clinical information, frequently represented as unconstrained text in electronic health records, is extensively used by human experts to guide clinical practice and decision making and to document the delivery of care, but is largely unusable by information systems for queries and computations. Recent initiatives advocating for translational research call for the generation of technologies that can integrate structured clinical data with unstructured data, provide a unified interface to all data, and contextualize clinical information for reuse in the multidisciplinary and collaborative environment envisioned by the CTSA program. This implies that technologies for the processing and interpretation of clinical text should be evaluated not only in terms of their validity and reliability in their intended environment, but also in light of their interoperability and their ability to support information integration and contextualization in a distributed and dynamic environment. This vision adds a new layer of information representation requirements that needs to be accounted for when conceptualizing the implementation or acquisition of clinical text processing tools and technologies for multidisciplinary research. On the other hand, electronic health records frequently contain unconstrained clinical text with high variability in the use of terms and documentation practices, and without commitment to the grammatical or syntactic structure of the language (e.g. triage notes, physician and nurse notes, chief complaints, etc.). This hinders the performance of natural language processing technologies, which typically rely heavily on the syntax of the language and the grammatical structure of the text. This document introduces our method to transform unconstrained clinical text found in electronic health information systems into a formal (computationally understandable) representation that is suitable for querying, integration, contextualization and reuse, and is resilient to the grammatical and syntactic irregularities of the clinical text. We present our design rationale, our method, and the results of an evaluation in processing chief complaints and triage notes from 8 different emergency departments in Houston, Texas. At the end, we discuss the significance of our contribution in enabling the use of clinical text in a practical bio-surveillance setting.
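To make the target representation concrete, here is a minimal, illustrative Python sketch of the general idea of turning a noisy chief-complaint string into a structured, queryable record. It is our own toy example, not the method described above; the concept dictionary, surface forms and placeholder codes are invented for illustration (real systems normalize to standard terminologies such as UMLS or SNOMED CT).

import re

# Tiny hand-made concept dictionary; surface forms map to a preferred term and a
# placeholder code (not real terminology codes).
CONCEPTS = {
    "sob": ("shortness of breath", "CONCEPT-001"),
    "shortness of breath": ("shortness of breath", "CONCEPT-001"),
    "cp": ("chest pain", "CONCEPT-002"),
    "chest pain": ("chest pain", "CONCEPT-002"),
    "n/v": ("nausea and vomiting", "CONCEPT-003"),
}

def structure(chief_complaint: str) -> list[dict]:
    """Return the concepts recognized in a noisy, unconstrained chief complaint."""
    text = chief_complaint.lower()
    found = []
    for surface, (preferred, code) in CONCEPTS.items():
        # Match the surface form as a standalone token, ignoring grammar entirely.
        if re.search(r"(?<![a-z])" + re.escape(surface) + r"(?![a-z])", text):
            found.append({"text": surface, "preferred_term": preferred, "code": code})
    return found

print(structure("55 yo F c/o CP and SOB since this AM"))
# -> recognizes "sob" and "cp" despite the telegraphic, ungrammatical input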
Abstract:
Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are thus formed to enable the biological function and are disassembled as the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, or the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the (better) design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structures of the different constituent components within the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques such as cryo-electron microscopy (cryo-EM) are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail. In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography, or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail needed for a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features which in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system; see manuscript III for such an alternative approach. Three manuscripts are presented as part of the PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate the atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly.
Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions. The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two different component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting with the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The resulting atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces evolutionary tabu search strategies to enable multi-body registration. The technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote proper exploration of the high-dimensional search space. As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the different components but not for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling them to be properly docked in the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions of up to 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions.
In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data and are visible as rod-like patterns of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail at which alpha helices are visible. Up to a resolution of 12 Å, the method achieves sensitivities between 70 and 100% as estimated in experimental test cases, i.e. 70-100% of the alpha helices were correctly predicted in an automatic manner in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with the annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
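As a schematic illustration of the hybrid strategy named above (a genetic algorithm combined with a tabu list), the following toy Python sketch optimizes the placement of two point-like "components" in a synthetic 2D density map. The map, the scoring function and the component representation are simplified assumptions of ours and do not reproduce the dissertation's actual algorithm or the Sculptor implementation.

import numpy as np

rng = np.random.default_rng(0)

# Toy 2D "density map": two Gaussian blobs standing in for a low-resolution assembly map.
xx, yy = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
density = np.exp(-((xx - 0.4)**2 + yy**2) / 0.02) + np.exp(-((xx + 0.4)**2 + yy**2) / 0.02)

def score(solution):
    """Summed map density at the proposed component centres; solution = [x1, y1, x2, y2]."""
    total = 0.0
    for cx, cy in solution.reshape(-1, 2):
        i = int(np.clip((cy + 1) / 2 * 63, 0, 63))
        j = int(np.clip((cx + 1) / 2 * 63, 0, 63))
        total += density[i, j]
    return total

def key(solution, resolution=0.1):
    """Discretise a solution so the tabu list can recognise revisited regions."""
    return tuple(np.round(solution / resolution).astype(int))

population = [rng.uniform(-1, 1, 4) for _ in range(20)]   # random initial placements
tabu, tabu_len = [], 50                                   # recently visited solution keys

for generation in range(200):
    population.sort(key=score, reverse=True)
    parents = population[:10]                             # keep the best half
    children = []
    while len(children) < 10:
        a, b = rng.choice(10, 2, replace=False)
        mask = rng.random(4) < 0.5                        # uniform crossover
        child = np.clip(np.where(mask, parents[a], parents[b])
                        + rng.normal(0, 0.05, 4), -1, 1)  # mutation
        if key(child) in tabu:                            # tabu: avoid re-exploring
            continue
        tabu = (tabu + [key(child)])[-tabu_len:]
        children.append(child)
    population = parents + children

best = max(population, key=score)
print("best placement:", np.round(best, 2), "score:", round(score(best), 2))

Run as-is, the two components should converge towards the two density maxima near (0.4, 0) and (-0.4, 0). In the real setting the score would be a cross-correlation between the experimental map and the density simulated from the docked atomic structures, and each solution would encode full rigid-body rotations and translations for every component.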
Abstract:
High-frequency data collected continuously over a multiyear time frame are required for investigating the various agents that drive ecological and hydrodynamic processes in estuaries. Here, we present in-situ water quality and current observations from a fixed monitoring station operating from 2008 to 2014 in the lower Guadiana Estuary, southern Portugal (37°11.30' N, 7°24.67' W). The data were recorded by a multi-parametric probe providing hourly records (temperature, salinity, chlorophyll, dissolved oxygen, turbidity, and pH) at a water depth of ~1 m, and by a bottom-mounted acoustic Doppler current profiler measuring the pressure, near-bottom temperature, and flow velocity through the water column every 15 min. The time series, in particular those from the probe, present substantial gaps arising from equipment failure and maintenance, which are unavoidable with this type of observation in harsh environments. However, prolonged (months-long) periods of multi-parametric observations under contrasting external forcing conditions are available. The raw data are reported together with flags indicating the quality status of each record. River discharge data from two hydrographic stations located near the estuary head are also provided to support data analysis and interpretation.
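As a usage illustration only, a few lines of Python show how records flagged as good quality might be selected and how the 15-min current-profiler series could be aligned with the hourly probe series; the file names, column names and flag convention below are hypothetical and do not describe the published dataset's actual format.

import pandas as pd

# Hypothetical file and column names; adapt to the actual data files and flag scheme.
probe = pd.read_csv("probe_hourly.csv", parse_dates=["time"], index_col="time")
probe_good = probe[probe["flag"] == "good"]      # keep only records flagged as good

adcp = pd.read_csv("adcp_15min.csv", parse_dates=["time"], index_col="time")
adcp_hourly = adcp.resample("1h").mean()         # average 15-min records onto the hourly grid

# Join on the common hourly time base; gaps in either series remain as NaN.
merged = probe_good.join(adcp_hourly, how="outer", rsuffix="_adcp")
print(merged[["salinity", "turbidity", "pressure"]].describe())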
Abstract:
This study presents a systematic analysis and interpretation of autonomous underwater vehicle-based microbathymetry combined with remotely operated vehicle (ROV) video recordings, rock analyses and temperature measurements within the PACManus hydrothermal area located on Pual Ridge in the Bismarck Sea of eastern Manus Basin. The data obtained during research cruises Magellan-06 and So-216 provide a framework for understanding the relationship between volcanism, tectonism and hydrothermal activity. PACManus is a submarine felsic volcanically-hosted hydrothermal area that hosts multiple vent fields located within several hundred meters of one another but with different fluid chemistries, vent temperatures and morphologies. The total area of hydrothermal activity is estimated to be 20,279 m². The microbathymetry maps combined with the ROV video observations allow for precise high-resolution estimates of the areal extent of hydrothermal activity. We find the distribution of hydrothermal fields in the PACManus area is primarily controlled by volcanic features that include lava domes, thick and massive blocky lava flows, breccias and feeder dykes. Spatial variation in the permeability of local volcanic facies appears to control the distribution of venting within a field. We define a three-stage chronological sequence for the volcanic evolution of PACManus based on lava flow morphology, sediment cover and lava SiO2 concentration. In Stage 1, sparsely to moderately porphyritic dacite lavas (68-69.8 wt.% SiO2) erupted to form domes or cryptodomes. In Stage 2, aphyric lava with slightly lower SiO2 concentrations (67.2-67.9 wt.% SiO2) formed jumbled and pillowed lava flows. In the most recent phase, Stage 3, massive blocky lavas with 69 to 72.5 wt.% SiO2 were erupted through multiple vents, constructing a volcanic ridge identified as the PACManus neovolcanic zone. The transition between these stages may be gradual and related to progressive heating of a silicic magma following a recharge event of hot, mantle-derived melts.
Abstract:
A joint research expedition between the French IFREMER and the German MARUM was conducted in 2011 using the R/V 'Pourquoi pas?' to study gas hydrate distributions in a pockmark field (1141-1199 m below sea surface) at the continental margin of Nigeria. The seafloor drill rig MeBo of MARUM was used to recover sediments as deep as 56.74 m below the seafloor. The presence of gas hydrates in specific core sections was deduced from temperature anomalies recorded during continuous infrared thermal scanning and from anomalies in pore water chloride concentrations. In situ sediment temperature measurements showed elevated geothermal gradients of up to 258 °C/km in the center of the so-called pockmark A, up to 4.6 times higher than in the background sediment (72 °C/km). The gas hydrate distribution and thermal regime in the pockmark are largely controlled by the intensity, periodicity and direction of fluid flow. The joint interaction between fluid flow, gas hydrate formation and dissolution, and the thermal regime governs pockmark formation and evolution on the Nigerian continental margin.
Abstract:
Group IV nanostructures have attracted a great deal of attention because of their potential applications in optoelectronics and nanodevices. Raman spectroscopy has been extensively used to characterize nanostructures, since it provides non-destructive information about their size through adequate modeling of the phonon confinement effect. The Raman spectrum is also sensitive to other factors, such as stress and temperature, which can mix with the size effects and complicate the interpretation of the spectrum. We present herein an analysis of the Raman spectra obtained for Si and SiGe nanowires; the influence of the excitation conditions and the heat dissipation media is discussed in order to optimize the experimental conditions for reliable spectra acquisition and interpretation.
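For context, the phonon confinement effect mentioned above is commonly modelled with a Richter/Campbell-Fauchet type lineshape (our assumption for illustration; the abstract does not state which model is used), in which the first-order Raman intensity is a Brillouin-zone integral weighted by a Gaussian confinement function whose width scales inversely with the nanostructure size L:

    I(\omega) \propto \int_{\mathrm{BZ}} \frac{|C(\mathbf{q})|^{2}}{[\omega - \omega(\mathbf{q})]^{2} + (\Gamma_{0}/2)^{2}} \, \mathrm{d}^{3}q , \qquad |C(\mathbf{q})|^{2} \sim \exp\!\left(-\frac{q^{2} L^{2}}{16\pi^{2}}\right),

where \omega(\mathbf{q}) is the optical phonon dispersion, \Gamma_{0} the intrinsic linewidth, and the exact Gaussian prefactor depends on the confinement function chosen. Stress and local heating shift and broaden the same peak, which is why they can be confounded with size effects unless the excitation conditions are controlled.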
Abstract:
An increasing number of neuroimaging studies are concerned with the identification of interactions or statistical dependencies between brain areas. Dependencies between the activities of different brain regions can be quantified with functional connectivity measures such as the cross-correlation coefficient. An important factor limiting the accuracy of such measures is the amount of empirical data available. For event-related protocols, the amount of data also affects the temporal resolution of the analysis. We use analytical expressions to calculate the amount of empirical data needed to establish whether a certain level of dependency is significant when the time series are autocorrelated, as is the case for biological signals. These analytical results are then contrasted with estimates from simulations based on real data recorded with magnetoencephalography during a resting-state paradigm and during the presentation of visual stimuli. Results indicate that, for broadband signals, 50–100 s of data are required to detect a true underlying cross-correlation coefficient of 0.05. This corresponds to a resolution of a few hundred milliseconds for typical event-related recordings. The required time window increases for narrow-band signals as frequency decreases; for instance, approximately 3 times as much data are necessary for signals in the alpha band. Important implications can be derived for the design and interpretation of experiments to characterize weak interactions, which are potentially important for brain processing.
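For context, one standard analytical expression of this kind is Bartlett's approximation for the variance of the sample cross-correlation of two mutually independent but individually autocorrelated series (quoted here for illustration; the paper's exact expressions are not reproduced in the abstract):

    \mathrm{Var}\big[\hat{r}_{xy}(0)\big] \approx \frac{1}{N} \sum_{\tau=-\infty}^{\infty} \rho_{xx}(\tau)\, \rho_{yy}(\tau),

where N is the number of samples and \rho_{xx}, \rho_{yy} are the autocorrelation functions of the two signals. For white signals the sum equals 1, but it grows as the signals become more autocorrelated, so narrow-band, low-frequency signals such as alpha-band activity require proportionally more data to detect the same cross-correlation coefficient.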
Abstract:
Aiming to address requirements concerning integration of services in the context of "big data", this paper presents an innovative approach that (i) ensures a flexible, adaptable and scalable information and computation infrastructure, and (ii) exploits the competences of stakeholders and information workers to meaningfully confront information management issues such as information characterization, classification and interpretation, thus incorporating the underlying collective intelligence. Our approach pays much attention to the issues of usability and ease-of-use, not requiring any particular programming expertise from the end users. We report on a series of technical issues concerning the desired flexibility of the proposed integration framework and we provide related recommendations to developers of such solutions. Evaluation results are also discussed.
Abstract:
If there were a common denominator among all the arts in what has come to be called postmodernity, it would have much to do with the end of the origin of the work. From literature and music to the visual arts and architecture, the overcoming of modernity has been characterised by the replacement of the concept of creation with that of artistic intervention, in other words, the interpretation of what already exists. In the early twentieth century, the modern concepts of creation and origin implied unlearning and forgetting everything that came before in order to start from scratch; even in a material sense, Mies suggested the literal construction of matter and its movement according to laws. From the second half of the century onwards, historicist approaches began to emerge as a reaction against the amnesia and supposed originality of the moderns. In this context appeared Learning from Las Vegas (1972) and Delirious New York (1978), both indebted in many respects to Venturi's earlier Complexity and Contradiction in Architecture (1966). These two books about cities, departing decisively from the historicist tendencies of the time, proposed using critical analysis of the existing reality as a vehicle for theory and design simultaneously, indirectly becoming manifestos. If at first Venturi, Rossi and others set out to do away with the formal limits established by modernity, as well as with any earlier canon, taking the whole of the built work as their reference system, as Eliot had done in literature, the books on Las Vegas and New York proposed to erase the boundaries of the discipline itself, going so far as to ask what may be considered architecture. Yet precisely because of this total absence of limits and the immensity of the frame of reference thus established ("everything could be architecture", as Hans Hollein observed in 1968), the books at the same time propose that each author define an individual field of action. The writings on Las Vegas and New York thus represent, on the one hand, the elimination of disciplinary limits and, on the other, the delimitation of specific fields of work for their authors: those of the cities they interpret.
The first part of the thesis, Lessons, deals with the necessary process of learning and experimentation that precedes critical action itself. Contemporary architects need to accumulate material, knowledge, documentation and experience before venturing to propose through criticism and editing; and, unlike the moderns, the more abundant this prior baggage, the richer the interpretation. The cities of Rome, London and Berlin are therefore understood as experiences capable of providing Venturi, Scott Brown and Koolhaas, respectively, with their "personal dictionaries", the endless imageries with which they would later confront the analyses of Las Vegas and New York.
The second part, Critiques, focuses on the theoretical production itself: the two books on cities, read in close relation to Complexity and Contradiction. The analogical reasoning characteristic of these books has served as a methodological guide for the research, with relationships established not directly between the writings themselves but through works belonging to other disciplines. First, an important parallel is drawn between the methods of analysis developed in these books and those of literary criticism, noting that if the New Criticism and the New Journalism guided the writings of Venturi and Scott Brown, the nouvelle critique and its notion of poetic identification were the clear reference for Koolhaas in addressing New York. Second, the relevance gained by curatorial practice and the emergence of the figure of the curator, an authority able to use the work of art beyond the intentions of its author, serves, like the figure of the editor, as a mirror of the transformative and appropriative action carried out in both Learning from Las Vegas and Delirious New York. Finally, throughout the research the figures of Bergson and Baudelaire have provided theoretical support. Through the use that Venturi and Koolhaas respectively made of their ideas, the thesis seeks to show the proximity of the two approaches from an ideological point of view. The inclusion proposed by Venturi and the irony employed by Koolhaas, contradiction and paradox, are reflections of logics that in both cases react simultaneously against idealism and materialism, against modernity and antimodernity, in a continuous attempt to be both at once.
Abstract:
A central aspect of the problem of evil, or the argument from evil, is the intensity or quantity of suffering. This quantity is conceived of as something objective and fixed. But because our experience is in part constituted and interpreted by our affectual orientation, there is no such objective quantum of suffering; and where there is no objective quantum of suffering, the argument from evil collapses. Here we begin by examining the connection between the philosophical and existential dimensions of the problem of, and argument from, evil as suffering. Next we consider the role of affect in the constitution and interpretation of experience generally, together with implications for the argument from suffering. Third, we look at how a key affectual element of the argument from evil might undercut that argument. And finally, we consider a proposal to categorize suffering as a species of moral or spiritual failure, as affectually wrong.
Abstract:
This dissertation is the result of research on the theology of Dietrich Bonhoeffer, based on his own writings and on the reception and interpretation of his thought by theologians who devoted themselves to the study of his theological reflection after his death, especially with regard to his proposal of a religionless Christianity. Prominent among these theologians are Eberhard Bethge, author of a detailed biography of Bonhoeffer, as well as Robinson and Hamilton, authors associated with the movement that became known as Radical Theology. The aim of the research is precisely to clarify the meaning of Bonhoeffer's proposal of a religionless Christianity, as well as to analyse his criticisms of religion and the possibilities found in his Christological understanding for an authentic Christian life in a world come of age. (AU)