897 results for Focus-of-attention
Abstract:
We attempt to contribute to a better understanding of the cooperative innovation patterns of foreign subsidiaries (FS) in Spain, taken as a representative intermediate country, by examining three main aspects in depth. Firstly, we propose a sectoral taxonomy that combines international technological dynamism and revealed technological advantage as a way to understand such patterns. Secondly, we focus our attention on innovation-intensive subsidiaries, assuming they are the most important ones for host countries. Thirdly, we combine innovation and structural-competitive variables to explain local cooperation. We found more intense cooperation of FS with local agents in dynamic specialization sectors, and that this cooperation is mostly carried out in a mode complementary to the companies' internal knowledge capabilities. Cooperative activities are influenced by economic-structural factors of the Spanish economy, particularly in highly innovative companies. Cooperative strategies of domestic firms might also have an influence on those of foreign subsidiaries.
Abstract:
For over 50 years, the Satisfaction of Search effect, more recently known as the Subsequent Search Miss (SSM) effect, has plagued the field of radiology. Defined as a decrease in additional-target accuracy after detecting a prior target in a visual search, SSM errors are known to underlie both real-world search errors (e.g., a radiologist is more likely to miss a tumor if a different tumor was previously detected) and more simplified, lab-based search errors (e.g., an observer is more likely to miss a target ‘T’ if a different target ‘T’ was previously detected). Unfortunately, little was known about this phenomenon’s cognitive underpinnings, and SSM errors have proven difficult to eliminate. More recently, however, experimental research has provided evidence for three different theories of SSM errors: the Satisfaction account, the Perceptual Set account, and the Resource Depletion account. A series of studies examined performance in a multiple-target visual search and aimed to provide support for the Resource Depletion account—a first target consumes cognitive resources, leaving less available to process additional targets.
To assess a potential mechanism underlying SSM errors, eye movements were recorded in a multiple-target visual search and used to explore whether a first target may cause an immediate decrease in second-target accuracy, which is known as an attentional blink. To determine whether other known attentional distractions amplify the effect that finding a first target has on second-target detection, distractors within the immediate vicinity of the targets (i.e., clutter) were measured and compared to accuracy for a second target. To better understand which characteristics of attention are impacted by detecting a first target, individual differences within four characteristics of attention were compared to second-target misses in a multiple-target visual search.
The results demonstrated that an attentional blink underlies SSM errors, with a decrease in second-target accuracy from 135 ms to 405 ms after detecting or re-fixating a first target. The effects of clutter were exacerbated after finding a first target, causing a greater decrease in second-target accuracy as clutter increased around a second target. The attentional characteristics of modulation and vigilance were correlated with second-target misses, suggesting that worse attentional modulation and vigilance are predictive of more second-target misses. Taken together, these results are used as the foundation for a new theory of SSM errors, the Flux Capacitor theory. The Flux Capacitor theory predicts that once a target is found, it is maintained as an attentional template in working memory, which consumes attentional resources that could otherwise be used to detect additional targets. This theory not only proposes why attentional resources are consumed by a first target, but also encompasses the research in support of all three SSM theories in an effort to establish a grand, unified theory of SSM errors.
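The detection-locked accuracy analysis behind the attentional-blink finding can be illustrated with a short sketch. This is a hypothetical illustration, not the thesis analysis code: the simulated data, bin width and variable names are assumptions. It simply bins second-target accuracy by the time elapsed since the first target was detected, which is how a dip in the 135-405 ms window would show up.

import numpy as np

# Hypothetical per-trial data: lag (ms) from first-target detection to the
# fixation of the second target, and whether the second target was reported.
rng = np.random.default_rng(0)
lag_ms = np.array([90, 150, 210, 270, 330, 390, 450, 510, 570, 630] * 20, dtype=float)
p_hit = np.where((lag_ms > 135) & (lag_ms < 405), 0.55, 0.85)  # simulated "blink" dip
hit = rng.binomial(1, p_hit)

# Bin second-target accuracy by lag since first-target detection (90 ms bins).
bins = np.arange(0, 721, 90)
idx = np.digitize(lag_ms, bins) - 1
for b in range(len(bins) - 1):
    mask = idx == b
    if mask.any():
        print(f"{bins[b]:3.0f}-{bins[b+1]:3.0f} ms: accuracy = {hit[mask].mean():.2f} (n = {mask.sum()})")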
Abstract:
This PhD thesis provides a detailed analysis of the role and significance of Irish drama in the Galician cultural context, from the early twentieth century onwards, through scrutiny of key works translated, adapted and mediated for the Galician stage. Drawing primarily on the theoretical framework of Descriptive Translation Studies, informed by Polysystems theory (Toury), Post-colonial Translation, research on processes of cultural translation (Bassnett, Lefevere, Venuti, Aaltonen), as well as careful comparative attention to the specificities of literary, theatrical and cultural context, I examine the factors governing the incorporation, reshaping and reception of twentieth century Irish plays in Galicia in order to produce a cultural history of the representation of Ireland on the Galician stage. Focusing on the five key periods I have identified in the translation/reception of Irish drama in Galicia, as represented in specific versions of plays by Yeats, Synge, O’Casey and McDonagh, my thesis examines in detail the particular linguistic, sociopolitical, theatrical and cultural dimensions of each rewriting and/or restaging in order to uncover the ways in which Irish identity is perceived, constructed and performed in a Galician context. Moving beyond the literary, historical and philological focus of existing studies of the reception of Irish literature and foreign dramatic texts in the Galician system, my own approach draws on Theatre and Performance Studies to attend also to the performative dimension of these processes of cultural adaptation and reception, giving full account of the different agents involved in theatre translation as a rich and complex process of multivalent cultural mediation.
Abstract:
Polygonal tundra, thermokarst basins and pingos are common and characteristic periglacial features of arctic lowlands underlain by permafrost in Northeast Siberia. Modern polygonal mires are the focus of biogeochemical, biological, pedological, and cryolithological research, with special attention to their carbon stocks and greenhouse-gas fluxes, their biodiversity, and their dynamics and functioning under past, present and future climate scenarios. Within the frame of the joint German-Russian DFG-RFBR project "Polygons in tundra wetlands: state and dynamics under climate variability in Polar Regions" (POLYGON), field studies of recent and of late Quaternary environmental dynamics were carried out in the Indigirka lowland and in the Kolyma River Delta in summer 2012 and summer 2013. Using a multidisciplinary approach, several types of polygons and thermokarst lakes were studied in different landscape units of the Kolyma Delta in 2012, around the small fishing settlement of Pokhodsk. The floral and faunal associations of the polygonal tundra were described during the fieldwork. Ecological, hydrological, meteorological, limnological, pedological and cryological features were studied in order to evaluate modern and past environmental conditions and their essential controlling parameters. The ecological monitoring and collection program for polygonal ponds was undertaken as in 2011 in the Indigirka lowland by a former POLYGON expedition (Schirrmeister et al. [eds.] 2012). Exposures, pits and drill cores in the Kolyma Delta were studied to understand the cryolithological structures of frozen ground and to collect samples for detailed paleoenvironmental research of the late Quaternary past. Dendrochronological and ecological studies were carried out in the tree line zone south of the Kolyma Delta. Building on previous work in the Indigirka lowland in 2011 (Schirrmeister et al. [eds.] 2012), the environmental monitoring around the Kytalyk research station was continued until the end of August 2012. In addition, a classical exposure of late Pleistocene permafrost at the Achchaygy Allaikha River near Chokurdakh was studied. The ecological studies near Pokhodsk were continued in 2013 (chapter 13). Other fieldwork took place at Pokhodsk-Yedoma-Island in the northwestern part of the Kolyma Delta.
Abstract:
One of the binder systems with a low environmental footprint is alkali-activated slag concrete (AASC), made by adding alkalis such as sodium hydroxide and sodium silicate to industrial by-products such as ground granulated blast furnace slag (GGBS). Whilst AASC behaves similarly to traditional cement systems in terms of strength and structural behaviour, it exhibits superior performance in terms of abrasion resistance, acid resistance and fire protection.
In this article, the authors focus their attention on chloride ingress into different grades of AASC. The mix variables in AASC included the water-to-binder ratio, the binder-to-aggregate ratio, the percentage of alkali and the SiO2/Na2O ratio (silica modulus, Ms). The first challenge is to obtain mixes covering a range of workability (with slump values from 40 mm to 240 mm) and reasonable early-age and long-term compressive strength for each. Chloride diffusion and migration in those mixes were then measured and compared, on the basis of chloride penetration depth, with comparable normal concretes from the existing literature. Comparing chloride ingress between traditional concretes and AASCs is worthwhile to establish the possibility of increasing concrete lifetime in proximity to the sea and to decide whether such concretes are practical for use. Findings show that, compared to the PC concretes, the AAS concretes have a lower rate of chloride ingress.
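As context for how a measured penetration depth relates to an apparent diffusion coefficient, the sketch below uses the standard error-function solution of Fick's second law; the surface concentration, threshold concentration and exposure time are illustrative assumptions and are not values taken from the article.

from math import sqrt
from scipy.special import erfinv

def apparent_diffusion_coefficient(x_d_mm, t_days, c_cr=0.05, c_s=0.5):
    """Back-calculate an apparent chloride diffusion coefficient (m^2/s)
    from a penetration depth, using C(x,t) = C_s * (1 - erf(x / (2*sqrt(D*t)))).

    x_d_mm : measured chloride penetration depth in mm
    t_days : exposure time in days
    c_cr   : chloride content assumed at the penetration front (illustrative)
    c_s    : surface chloride concentration (illustrative)
    """
    x = x_d_mm * 1e-3            # mm -> m
    t = t_days * 86400.0         # days -> s
    z = erfinv(1.0 - c_cr / c_s)
    return (x / (2.0 * z)) ** 2 / t

# Example: 12 mm penetration after 90 days of ponding (illustrative numbers).
print(f"D_app = {apparent_diffusion_coefficient(12, 90):.2e} m^2/s")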
Abstract:
If magnetism is universal in nature, magnetic materials are ubiquitous. A life without magnetism is unthinkable and a day without the influence of a magnetic material is unimaginable. They find innumerable applications in the form of many passive and active devices, namely the compass, electric motor, generator, microphone, loudspeaker, maglev train, magnetic resonance imaging, data recording and reading, hadron collider, etc. The list is endless. Such is the influence of magnetism and magnetic materials in one's day-to-day life. With the advent of nanoscience and nanotechnology, along with the emergence of new fields such as spintronics, multiferroics and magnetic refrigeration, the importance of magnetism is ever increasing and is attracting the attention of researchers worldwide. The search for a fluid which exhibits magnetism has been on for quite some time. However, nature has not bestowed us with a magnetic fluid, and hence it has been the dream of many researchers to synthesize one, which is thought to have the potential to revolutionize many applications based on magnetism. The discovery of a magnetic fluid by Jacob Rabinow in the year 1952 paved the way for a new branch of physics/engineering, which later became the field of magnetic fluids. This gave birth to a new class of material called magnetorheological materials. Magnetorheological materials are considered superior to electrorheological materials in that magnetorheology is a contactless operation and often inexpensive. Most of the past studies on magnetorheological materials were based on magnetic fluids. Recently the focus has been on the solid-state analogue of magnetic fluids, called magnetorheological elastomers (MREs). The very term magnetorheological elastomer implies that the rheological properties of these materials can be altered by an externally applied magnetic field, and this process is reversible. If the application of an external magnetic field modifies the viscosity of a magnetic fluid, the effect of an external magnetic stimulus on a magnetorheological elastomer is to modify its stiffness; this too is reversible. Magnetorheological materials exhibit variable stiffness and find applications in adaptive structures in aerospace, automotive, civil and electrical engineering. The major advantage of MREs is that the particles cannot settle with time, and hence there is no need for a vessel to hold the material. The possibility of hazardous waste leakage disappears with a solid MRE. Moreover, the particles in a solid MRE will not affect the performance and durability of the equipment. Usually MR solids work only in the pre-yield region, while MR fluids typically work in the post-yield state. The application of an external magnetic field modifies the stiffness constant, shear modulus and loss modulus, which are complex quantities. In viscoelastic materials a part of the input energy is stored and released during each cycle and a part is dissipated as heat. The storage modulus G′ represents the capacity of the material to store energy of deformation, which contributes to material stiffness. The loss modulus G′′ represents the ability of the material to dissipate the energy of deformation. Such materials can find applications in the form of adaptive tuned vibration absorbers (ATVAs), stiffness-tunable mounts and variable impedance surfaces.
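The storage and loss moduli referred to above are the real and imaginary parts of the complex shear modulus; in standard notation (not specific to this thesis),

\[ G^{*}(\omega) = G'(\omega) + i\,G''(\omega), \qquad \tan\delta = \frac{G''(\omega)}{G'(\omega)}, \]

where the loss factor tan δ compares the energy dissipated per cycle with the energy stored.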
MREs are an important material for automobile giants and became the focus of this research for eventual use in automatic vibration control, sound isolation, brakes, clutches and suspension systems.
Abstract:
Despite narratives of secularization, it appears that the British public persistently pay attention to clerical opinion and continually resort to popular expressions of religious faith, not least in time of war. From the throngs of men who gathered to hear the Bishop of London preach recruiting sermons during the First World War, to the attention paid to Archbishop Williams' words of conscience on Iraq, clerical rhetoric remains resonant. For the countless numbers who attended National Days of Prayer during the Second World War, and for the many who continue to find the Remembrance Day service a meaningful ritual, civil religious events provide a source of meaningful ceremony and a focus of national unity. War and religion have been linked throughout the twentieth century and this book explores these links, taking the perspective of the 'home front' rather than the battlefield. Exploring the views and accounts of Anglican clerics on the issue of warfare and international conflict across the century, the authors examine the church's stance on the causes, morality and conduct of warfare; issues of pacifism, obliteration bombing, nuclear possession and deterrence, retribution, forgiveness and reconciliation, and the spiritual opportunities presented by conflict. This book offers invaluable insights into how far the Church influenced public appraisal of war whilst illuminating the changing role of the Church across the twentieth century.
Abstract:
Phase change problems arise in many practical applications such as air-conditioning and refrigeration, thermal energy storage systems and thermal management of electronic devices. The physical phenomena in such applications are complex and often difficult to study in detail with the help of experimental techniques alone. The efforts to improve computational techniques for analyzing two-phase flow problems with phase change are therefore gaining momentum. The development of numerical methods for multiphase flow has been motivated generally by the need to account more accurately for (a) large topological changes such as phase breakup and merging, (b) sharp representation of the interface and its discontinuous properties and (c) accurate and mass conserving motion of the interface. In addition to these considerations, numerical simulation of multiphase flow with phase change introduces additional challenges related to discontinuities in the velocity and the temperature fields. Moreover, the velocity field is no longer divergence free. For phase change problems, the focus of developmental efforts has thus been on numerically attaining a proper conservation of energy across the interface in addition to the accurate treatment of fluxes of mass and momentum conservation as well as the associated interface advection. Among the initial efforts related to the simulation of bubble growth in film boiling applications, the work in \cite{Welch1995} was based on an interface tracking method using a moving unstructured mesh. That study considered moderate interfacial deformations. A similar problem was subsequently studied using moving, boundary fitted grids \cite{Son1997}, again for regimes of relatively small topological changes. A hybrid interface tracking method with a moving interface grid overlapping a static Eulerian grid was developed \cite{Juric1998} for the computation of a range of phase change problems including three-dimensional film boiling \cite{esmaeeli2004computations}, multimode two-dimensional pool boiling \cite{Esmaeeli2004} and film boiling on horizontal cylinders \cite{Esmaeeli2004a}. The handling of interface merging and pinch-off, however, remains a challenge with methods that explicitly track the interface. As large topological changes are crucial for phase change problems, attention has turned in recent years to front capturing methods utilizing implicit interfaces that are more effective in treating complex interface deformations. The VOF (Volume of Fluid) method was adopted in \cite{Welch2000} to simulate the one-dimensional Stefan problem and the two-dimensional film boiling problem. The approach employed a specific model for mass transfer across the interface involving a mass source term within cells containing the interface. This VOF based approach was further coupled with the level set method in \cite{Son1998}, employing a smeared-out Heaviside function to avoid the numerical instability related to the source term. The coupled level set and volume of fluid method and the diffuse-interface approach were used for film boiling with water and R134a at the near-critical pressure condition \cite{Tomar2005}. The effects of superheat and saturation pressure on the frequency of bubble formation were analyzed with this approach. The work in \cite{Gibou2007} used the ghost fluid and the level set methods for phase change simulations.
A similar approach was adopted in \cite{Son2008} to study various boiling problems including three-dimensional film boiling on a horizontal cylinder, nucleate boiling in a microcavity \cite{lee2010numerical} and flow boiling in a finned microchannel \cite{lee2012direct}. The work in \cite{tanguy2007level} also used the ghost fluid method and proposed an improved algorithm based on enforcing continuity and a divergence-free condition for the extended velocity field. The work in \cite{sato2013sharp} employed a multiphase model based on volume fraction with an interface sharpening scheme and derived a phase change model based on local interface area and mass flux. Among the front capturing methods, sharp interface methods have been found to be particularly effective both for implementing sharp jumps and for resolving the interfacial velocity field. However, sharp velocity jumps render the solution susceptible to erroneous oscillations in pressure and also lead to spurious interface velocities. To implement phase change, the work in \cite{Hardt2008} employed point mass source terms derived from a physical basis for the evaporating mass flux. To avoid numerical instability, the authors smeared the mass source by solving a pseudo time-step diffusion equation. This measure however led to mass conservation issues due to non-symmetric integration over the distributed mass source region. The problem of spurious pressure oscillations related to point mass sources was also investigated by \cite{Schlottke2008}. Although their method is based on the VOF, the large pressure peaks associated with the sharp mass source were observed to be similar to those for the interface tracking method. Such spurious fluctuations in pressure are essentially undesirable because the effect is globally transmitted in incompressible flow. Hence, the pressure field generated by phase change needs to be implemented with greater accuracy than is reported in the current literature. The accuracy of interface advection in the presence of interfacial mass flux (mass flux conservation) has been discussed in \cite{tanguy2007level,tanguy2014benchmarks}. The authors found that the method of extending one phase velocity to the entire domain suggested by Nguyen et al. in \cite{nguyen2001boundary} suffers from a lack of mass flux conservation when the density difference is high. To improve the solution, the authors impose a divergence-free condition on the extended velocity field by solving a constant coefficient Poisson equation. The approach has shown good results with an enclosed bubble or droplet but is not general for more complex flows and requires the additional solution of a linear system of equations. In the current thesis, an improved approach that addresses both the numerical oscillation of pressure and the spurious interface velocity field is presented, featuring (i) continuous velocity and density fields within a thin interfacial region and (ii) temporal velocity correction steps to avoid an unphysical pressure source term. I also propose (iii) a general mass flux projection correction for improved mass flux conservation. The pressure and temperature gradient jump conditions are treated sharply. A series of one-dimensional and two-dimensional problems are solved to verify the performance of the new algorithm. Two-dimensional and cylindrical film boiling problems are also demonstrated and show good qualitative agreement with experimental observations and heat transfer correlations.
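A rough illustration of the constant-coefficient Poisson projection mentioned above (used to enforce a divergence-free extended velocity field) is sketched below for a doubly periodic grid using FFTs. The grid, test field and discretisation are assumptions made for illustration; this is not the solver developed in the thesis.

import numpy as np

def project_divergence_free(u, v, dx, dy):
    """Remove the divergent part of (u, v) on a doubly periodic grid by
    solving a constant-coefficient Poisson equation with FFTs:
        laplacian(phi) = div(u, v),  then  u <- u - dphi/dx,  v <- v - dphi/dy."""
    ny, nx = u.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)

    # Spectral divergence of the extended velocity field.
    u_hat, v_hat = np.fft.fft2(u), np.fft.fft2(v)
    div_hat = 1j * KX * u_hat + 1j * KY * v_hat

    # Solve -k^2 * phi_hat = div_hat (zero-mean potential).
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                 # avoid division by zero; the mean mode stays zero
    phi_hat = -div_hat / k2
    phi_hat[0, 0] = 0.0

    # Subtract the gradient of phi to obtain a divergence-free field.
    u_new = np.real(np.fft.ifft2(u_hat - 1j * KX * phi_hat))
    v_new = np.real(np.fft.ifft2(v_hat - 1j * KY * phi_hat))
    return u_new, v_new

# Quick check on a deliberately divergent field.
n, Lbox = 64, 1.0
x = np.linspace(0.0, Lbox, n, endpoint=False)
X, Y = np.meshgrid(x, x)
u = np.sin(2 * np.pi * X) * np.cos(2 * np.pi * Y)
v = np.sin(2 * np.pi * Y)
u2, v2 = project_divergence_free(u, v, Lbox / n, Lbox / n)
k = 2.0 * np.pi * np.fft.fftfreq(n, d=Lbox / n)
div = np.fft.ifft2(1j * k[None, :] * np.fft.fft2(u2) + 1j * k[:, None] * np.fft.fft2(v2))
print("max |div| after projection:", np.abs(div).max())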
Finally, a study on Taylor bubble flow with heat transfer and phase change in a small vertical tube in axisymmetric coordinates is carried out using the new multiphase phase-change method.
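For reference, the one-dimensional Stefan problem used as a verification case has a classical similarity solution in which the interface position grows as s(t) = 2*lambda*sqrt(alpha*t), with lambda fixed by a transcendental equation. The sketch below solves that equation; the fluid properties and superheat are illustrative assumptions, not the values used in the thesis.

import numpy as np
from scipy.optimize import brentq
from scipy.special import erf

def stefan_lambda(stefan_number):
    """Solve the one-phase Stefan transcendental equation
       lambda * exp(lambda^2) * erf(lambda) = Ste / sqrt(pi)
    for the similarity constant lambda."""
    f = lambda lam: lam * np.exp(lam**2) * erf(lam) - stefan_number / np.sqrt(np.pi)
    return brentq(f, 1e-9, 5.0)

# Illustrative (assumed) vapour properties and wall superheat.
k, rho, cp = 0.025, 0.60, 2030.0     # W/(m K), kg/m^3, J/(kg K)
h_fg, dT = 2.26e6, 10.0              # J/kg, wall superheat in K
alpha = k / (rho * cp)               # thermal diffusivity, m^2/s
Ste = cp * dT / h_fg                 # Stefan number

lam = stefan_lambda(Ste)
for t in (0.1, 1.0, 10.0):           # seconds
    s = 2.0 * lam * np.sqrt(alpha * t)   # interface position s(t)
    print(f"t = {t:5.1f} s  ->  s(t) = {s * 1e3:.3f} mm")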
Abstract:
The scleractinian coral Lophelia pertusa has been the focus of deep-sea research since the recognition of the vast extent of coral reefs in North Atlantic waters two decades ago, long after their existence was first mentioned by fishermen. These reefs were shown to provide habitat, concentrate biomass and act as feeding or nursery grounds for many species, including those targeted by commercial fisheries. Thus, the attention given to this cold-water coral (CWC) species by researchers and the wider public has increased. Consequently, new research programs were triggered to determine the full extent of the corals' geographic distribution and the ecological dynamics of “Lophelia reefs”. The present study is based on a systematic standardised sampling design to analyse the distribution and coverage of CWC reefs along European margins from the Bay of Biscay to Iceland. Based on Remotely Operated Vehicle (ROV) image analysis, we report an almost systematic occurrence of Madrepora oculata in association with L. pertusa, with similar abundances of both species within explored reefs, despite a tendency of increased abundance of L. pertusa compared to M. oculata toward higher latitudes. This systematic association occasionally reached the colony scale, with “twin” colonies of both species often observed growing next to each other where isolated structures occurred off-reef. Finally, several “false chimaeras” were observed within reefs, confirming that colonial structures can be “coral bushes” formed by an accumulation of multiple colonies, even at the inter-specific scale, with no need for self-recognition mechanisms. Thus, we underline the importance of the hitherto underexplored M. oculata in the Eastern Atlantic, re-establishing a more balanced view that both species and their as yet unknown interactions are required to better elucidate the ecology, dynamics and fate of European CWC reefs in a changing environment.
Abstract:
The recent staging of Glasgow 2014 drew universal praise as the ‘Best Games Ever’. Yet the substantial undertaking of hosting the Commonwealth Games (CWG) was sold to the nation as more than just eleven days of sporting spectacle and cultural entertainment. Indeed, the primary strategic justification offered by policymakers and city leaders was the delivery of a bundle of positive and enduring benefits, so-called ‘legacy’. This ubiquitous and amorphous concept has evolved over time to become the central focus of contemporary hosting bids, reflecting a general public policy shift towards using major sporting mega events as a catalyst to generate benefits across economic, environmental and social dimensions, on a scale intended to be truly transformative. At the same time, the academy has drawn attention to the absence of evidence in support of the prevailing legacy rhetoric and raised a number of sociological concerns, not least the socially inequitable distribution of purported benefits. This study investigated how young people living in the core hosting zone related to, and were impacted by, the CWG and its associated developments and activities, with reference to their socio-spatial horizons, the primary outcome of interest. An ‘ideal world’ Logic Model hypothesised that four mechanisms, identified from official legacy documents and social theories, would alter young people’s subjective readings of the world by virtue of broadening their social networks, extending their spatial boundaries and altering their mindsets. A qualitative methodology facilitated the gathering of situated and contextualised accounts of young people’s attitudes, perceptions, beliefs and behaviours relating to Glasgow 2014. In-depth interviews and focus groups were conducted before and after the Games with 26 young people, aged 14-16 years, at two schools in the East End. This approach was instrumental in privileging the interests of people ‘on the ground’ over those of city-wide and national stakeholders. The findings showed that young people perceived the dominant legacy benefit to be an improved reputation and image for Glasgow and the East End. Primary beneficiaries were identified by them as those with vested business interests, e.g. retailers, restaurateurs and hoteliers, as well as national and local government, with low expectations of personal dividends or ‘trickle down’ benefits. Support for Glasgow 2014 did not necessarily translate into individual engagement with the various cultural and sporting activities leading up to the CWG, including the event itself. The study found that the young people who engaged most were those who had the ability to ‘read’ the opportunities available to them and who had the social, cultural and economic capital necessary to grasp them, with the corollary that those who might have gained most were the least likely to have engaged with the CWG. Doubts articulated by research participants about the social sustainability of Glasgow 2014 underscored inherent tensions between the short-lived thrill of the spectacle and the anticipated longevity of its impacts. The headline message is that hosting sporting mega events might not be an effective means of delivering social change. Aspirant host cities should consider more socially equitable alternatives to sporting mega events prior to bidding; and future host cities should endeavour to engage more purposefully with more young people over longer time frames.
Abstract:
A great deal of scholarly research has addressed the issue of dialect mapping in the United States. These studies, usually based on phonetic or lexical items, aim to present an overall picture of the dialect landscape. But what is often missing in these types of projects is an attention to the borders of a dialect region and to what kinds of identity alignments can be found in such areas. This lack of attention to regional and dialect border identities is surprising, given the salience of such borders for many Americans. This salience is also ignored among dialectologists, as nonlinguists’ perceptions and attitudes have been generally assumed to be secondary to the analysis of “real” data, such as the phonetic and lexical variables used in traditional dialectology. Louisville, Kentucky is considered as a case study for examining how dialect and regional borders in the United States impact speakers’ linguistic acts of identity, especially the production and perception of such identities. According to Labov, Ash, and Boberg (2006), Louisville is one of the northernmost cities to be classified as part of the South. Its location on the Ohio River, on the political and geographic border between Kentucky and Indiana, places Louisville on the isogloss between Southern and Midland dialects. Through an examination of language attitude surveys, mental maps, focus group interviews, and production data, I show that identity alignments in borderlands are neither simple nor straightforward. Identity at the border is fluid, complex, and dynamic; speakers constantly negotiate and contest their identities. The analysis shows the ways in which Louisvillians shift between Southern and non-Southern identities, in the active and agentive expression of their amplified awareness of belonging brought about by their position on the border.
Abstract:
Taking into consideration that we live in a multicultural society, it is important to analyse how people view and accept one another. Therefore, one should reflect upon concepts such as representations and stereotypes, because they are interpersonal constructs which are (re)built during the interaction of different sociocultural groups. In this study, we focus our attention on the representations a sociocultural group – Portuguese teachers – has of intercultural education and of the role of teachers and educators in the promotion of an intercultural approach at school. We believe that teachers have the responsibility to: find out the representations students have of the Other; reconfigure stereotyped representations; and create representations which allow dialogue and relationships with the Other to flourish. Following a sociolinguistic approach (Müller, 1998; Vasseur, 2001; Vasseur & Hudelot, 1998), which is concerned with the construction and diffusion of representations in discourse, we analyse the discourse of teachers during a workshop called ‘The Other and Myself’, in which they build and discuss a didactic mask that portrays their own vision of both themselves and their ideas of intercultural education.
Abstract:
The focus of this research is to explore the applications of the finite difference formulation based on the latency insertion method (LIM) to the analysis of circuit interconnects. Special attention is devoted to addressing the issues that arise in very large networks such as on-chip signal and power distribution networks. We demonstrate that the LIM has the power and flexibility to handle various types of analysis required at different stages of circuit design. The LIM is particularly suitable for simulations of very large scale linear networks and can significantly outperform conventional circuit solvers (such as SPICE).
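To indicate the flavour of such a finite-difference, leapfrog-style update, the sketch below applies an explicit LIM-like time-stepping scheme to a uniform RLGC ladder driven by a step source. The per-segment element values, the boundary treatment and the time step are assumptions chosen for illustration; they are not the networks or the exact formulation analysed in this work.

import numpy as np

# Illustrative explicit leapfrog update in the spirit of LIM for a uniform
# RLGC ladder (all element values below are assumed, not taken from the thesis).
nseg = 200                      # number of series L-R branches
L, R = 2.5e-10, 0.05            # branch inductance (H) and resistance (ohm)
C, G = 1.0e-13, 0.0             # node capacitance (F) and conductance (S)
Rs, RL = 50.0, 50.0             # source and load resistances (ohm)
dt = 0.5 * np.sqrt(L * C)       # time step below the leapfrog stability limit

V = np.zeros(nseg + 1)          # node voltages, updated at integer steps
I = np.zeros(nseg)              # branch currents, updated at half steps
Vs = 1.0                        # 1 V step source switched on at t = 0

for n in range(4000):
    # Node update: charge balance at each node (source and load as conductive terms).
    net = np.zeros(nseg + 1)
    net[1:] += I                # current flowing into node k+1 from branch k
    net[:-1] -= I               # current flowing out of node k into branch k
    net[0] += (Vs - V[0]) / Rs  # Thevenin source at the near end
    net[-1] -= V[-1] / RL       # resistive termination at the far end
    V += dt / C * (net - G * V)

    # Branch update: voltage across L and R drives the branch current.
    I += dt / L * (V[:-1] - V[1:] - R * I)

print(f"far-end voltage after {4000 * dt * 1e9:.1f} ns: {V[-1]:.3f} V")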
Abstract:
This study examines the pluralistic hypothesis advanced by the late Professor John Hick viz. that all religious faiths provide equally salvific pathways to God, irrespective of their theological and doctrinal differences. The central focus of the study is a critical examination of (a) the epistemology of religious experience as advanced by Professor Hick, (b) the ontological status of the being he understands to be God, and further asks (c) to what extent can the pluralistic view of religious experience be harmonised with the experience with which the Christian life is understood to begin viz. regeneration. Tracing the theological journey of Professor Hick from fundamentalist Christian to religious pluralist, the study notes the reasons given for Hick’s gradual disengagement from the Christian faith. In addition to his belief that the pre-scientific worldview of the Bible was obsolete and passé, Hick took the view that modern biblical scholarship could not accommodate traditionally held Christian beliefs. He conceded that the Incarnation, if true, would be decisive evidence for the uniqueness of Christianity, but rejected the same on the grounds of logical incoherence. This study affirms the view that the doctrine of the Incarnation occupies a place of crucial importance within world religion, but rejects the claim of incoherence. Professor Hick believed that God’s Spirit was at work in all religions, producing a common religious experience, or spiritual awakening to God. The soteriological dimension of this spiritual awakening, he suggests, finds expression as the worshipper turns away from self-centredness to the giving of themselves to God and others. At the level of epistemology he further argued that religious experience itself provided the rational basis for belief in God. The study supports the assertion by Professor Hick that religious experience itself ought to be trusted as a source of knowledge and this on the principle of credulity, which states that a person’s claim to perceive or experience something is prima facie justified, unless there are compelling reasons to the contrary. Hick’s argument has been extensively developed and defended by philosophers such as Alvin Plantinga and William Alston. This confirms the importance of Hick’s contribution to the philosophy of religion, and further establishes his reputation within the field as an original thinker. It is recognised in this thesis, however, that in affirming only the rationality of belief, but not the obligation to believe, Professor Hick’s epistemology is not fully consistent with a Christian theology of revelation. Christian theology views the created order as pre-interpreted and unambiguous in its testimony to God’s existence. To disbelieve in God’s existence is to violate one’s epistemic duty by suppressing the truth. Professor Hick’s critical realist principle, which he regards as the key to understanding what is happening in the different forms of religious experience, is examined within this thesis. According to the critical realist principle, there are realities external to us, yet we are never aware of them as they are in themselves, but only as they appear to us within our particular cognitive machinery and conceptual resources. All awareness of God is interpreted through the lens of pre-existing, culturally relative religious forms, which in turn explains the differing theologies within the world of religion. 
The critical realist principle views God as unknowable, in the sense that his inner nature is beyond the reach of human conceptual categories and linguistic systems. Professor Hick thus endorses and develops the view of God as ineffable, but employs the term transcategorial when speaking of God’s ineffability. The study takes the view that the notion of transcategoriality as developed by Professor Hick appears to deny any ontological status to God, effectively arguing him out of existence. Furthermore, in attributing the notion of transcategoriality to God, Professor Hick would appear to render incoherent his own fundamental assertion that we can know nothing of God that is either true or false. The claim that the experience of regeneration with which the Christian life begins can be classed as a mere species of the genus common throughout all faiths, is rejected within this thesis. Instead it is argued that Christian regeneration is a distinctive experience that cannot be reduced to a salvific experience, defined merely as an awareness of, or awakening to, God, followed by a turning away from self to others. Professor Hick argued against any notion that the Christian community was the social grouping through which God’s Spirit was working in an exclusively redemptive manner. He supported his view by drawing attention to (a) the presence, at times, of comparable or higher levels of morality in world religion, when contrasted with that evidenced by the followers of Christ, and (b) the presence, at times, of demonstrably lower levels of morality in the followers of Christ, when contrasted with the lives of other religious devotees. These observations are fully supported, but the conclusion reached is rejected, on the grounds that according to Christian theology the saving work of God’s Spirit is evidenced in a life that is changing from what it was before. Christian theology does not suggest or demand that such lives at every stage be demonstrably superior, when contrasted with other virtuous or morally upright members of society. The study concludes by paying tribute to the contribution Professor Hick has made to the field of the epistemology of religious experience.
Abstract:
Part 3: Product-Service Systems