Abstract:
We propose an elastography technique in which an acoustic radiation force is used for remote palpation to generate localized tissue displacements; these displacements are directly correlated to localized variations in tissue stiffness and are measured using a light probe aligned with the direction of ultrasound propagation. The experimental geometry has provision to input the light beam along the ultrasound propagation direction, so it can be prealigned to ensure that the light beam properly intercepts the focal region. Tissue-mimicking phantoms with the homogeneous, isotropic mechanical properties of normal and malignant breast tissue are considered for the study. Each phantom is insonified by a focusing ultrasound transducer (1 MHz). The focal volume of the transducer and the ultrasound radiation force in that region are estimated by solving the acoustic wave propagation through the medium, assuming average acoustic properties. The forward elastography problem is solved for the insonified region, assuming the Lamé parameters and Poisson's ratio, under Dirichlet boundary conditions; this yields a distribution of displacement vectors. The displacement direction, though spatially varying, is predominantly along the ultrasound propagation direction. Using Monte Carlo (MC) simulation we trace photons through the phantom and collect those arriving at a detector on the boundary of the object in the direction of the ultrasound. Intensity correlations are then computed from the detected photons. The intensity correlation function computed through MC simulation shows a modulation whose strength is proportional to the displacement amplitude and inversely related to the storage (elastic) modulus. When the storage modulus in the focal region is increased, the computed displacement magnitude, as indicated by the depth of modulation in the intensity autocorrelation, decreases, and the trend is approximately exponential.
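To make the last observation concrete, here is a minimal sketch (Python; not the paper's code) of how a growing storage modulus G' can shrink the depth of modulation of the intensity autocorrelation. The toy scaling displacement ∝ force/G', the weak-modulation form of g1, and every numerical value are illustrative assumptions.

```python
import numpy as np

# Illustrative toy model only: ultrasound-modulated speckle correlation.
# Assumptions (not from the paper): displacement ~ force / G', a
# weak-modulation field autocorrelation, and all numeric values.

US_FREQ = 1.0e6          # ultrasound frequency, Hz (1 MHz transducer)
WAVELENGTH = 632.8e-9    # probe wavelength, m (assumed He-Ne line)
K0 = 2 * np.pi / WAVELENGTH

def displacement_amplitude(force_density, storage_modulus):
    """Toy scaling: displacement falls as the storage modulus G' rises."""
    return force_density / storage_modulus

def g2(tau, disp_amp, n_scatter=10, beta=1.0):
    """Siegert relation g2 = 1 + beta*|g1|^2, with a sinusoidal phase
    modulation of depth ~ n_scatter * k0 * displacement on each path."""
    phase_depth = n_scatter * K0 * disp_amp
    w = 2 * np.pi * US_FREQ
    g1 = np.exp(-0.5 * phase_depth**2 * (1 - np.cos(w * tau)))
    return 1.0 + beta * g1**2

tau = np.linspace(0, 4 / US_FREQ, 1000)
for G in (5e3, 10e3, 20e3):              # storage moduli, Pa (illustrative)
    d = displacement_amplitude(1e-5, G)  # arbitrary force density
    curve = g2(tau, d)
    print(f"G' = {G:6.0f} Pa -> modulation depth = "
          f"{curve.max() - curve.min():.4f}")
```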
Abstract:
In this paper, we present a wavelet-based approach to solving the non-linear perturbation equation encountered in optical tomography. A particularly suitable data-gathering geometry is used to collect a data set consisting of differential changes in intensity owing to the presence of inhomogeneous regions. With this scheme, the unknown image, the data, and the weight matrix are all represented by wavelet expansions, yielding a representation of the original non-linear perturbation equation in the wavelet domain. The advantage of using the non-linear perturbation equation is that the derivatives need not be recomputed during the reconstruction process: once computed, they are transformed into the wavelet domain. The reason for moving to the wavelet domain is its inherent localization and de-noising properties. Using the approximation coefficients alone, without the detail coefficients, is ideally suited to diffuse optical tomographic reconstruction, since the diffusion equation removes most of the high-frequency information and the reconstruction appears low-pass filtered. We demonstrate through numerical simulations that by solving for the approximation coefficients alone one can reconstruct an image with the same information content as a reconstruction from a non-waveletized procedure. In addition, we demonstrate better noise tolerance and much reduced computation time with this approach.
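As a rough illustration of the approximation-only idea (a sketch of ours, not the paper's code; the Haar wavelet, decomposition level, and random stand-in image are all assumptions), the following uses PyWavelets to keep the approximation coefficients and zero out the details:

```python
import numpy as np
import pywt  # PyWavelets

# Keep only the approximation coefficients of a 2-D wavelet expansion,
# mimicking the low-pass character of diffuse optical reconstructions.
image = np.random.rand(64, 64)     # stand-in for an image/update (assumed)
wavelet, level = "haar", 2         # assumed choices

coeffs = pywt.wavedec2(image, wavelet, level=level)
# coeffs[0] is the approximation; coeffs[1:] are (cH, cV, cD) detail tuples.
approx_only = [coeffs[0]] + [
    tuple(np.zeros_like(d) for d in details) for details in coeffs[1:]
]
low_pass = pywt.waverec2(approx_only, wavelet)

# Solving for coeffs[0] alone shrinks the number of unknowns by ~4**level,
# one source of the reduced computation time reported above.
print(image.shape, coeffs[0].shape, low_pass.shape)
```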
Abstract:
Reconstruction in optical tomography involves obtaining images of the absorption and reduced scattering coefficients. Integrated intensity data are more sensitive to variations in the absorption coefficient than in the scattering coefficient; however, the sensitivity of intensity data to the scattering coefficient is not zero. We consider an object with two inhomogeneities, one in the absorption coefficient and the other in the scattering coefficient. Standard iterative reconstruction techniques produce results plagued by cross-talk: the absorption coefficient reconstruction has a false positive at the location of the scattering inhomogeneity, and vice versa. We present a method to remove this cross-talk by generating a weight matrix and weighting the update vector during the iteration. The weight matrix is created as follows: we first perform a simple backprojection of the difference between the experimental intensity data and the corresponding homogeneous intensity data. The resulting image is weighted more towards the absorption inhomogeneity than the scattering inhomogeneity, and its appropriate inverse is weighted towards the scattering inhomogeneity. These two weight matrices are used as multiplication factors on the update vectors during image reconstruction: the normalized backprojected difference-intensity image for the absorption inhomogeneity, and its inverse for the scattering inhomogeneity. We demonstrate through numerical simulations that cross-talk is fully eliminated by this modified reconstruction procedure.
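A minimal sketch of the weighting step, under assumptions of ours (the array shapes, normalization, and regularized inverse are hypothetical; the paper's exact construction may differ):

```python
import numpy as np

def crosstalk_weights(backprojection, eps=1e-6):
    """Turn a backprojected difference-intensity image into two
    multiplicative weights: one favouring the absorption update and a
    regularized, normalized inverse favouring the scattering update."""
    w_abs = backprojection / backprojection.max()
    w_sct = 1.0 / (backprojection + eps)
    w_sct /= w_sct.max()
    return w_abs, w_sct

# Inside the iterative loop the raw updates would be reweighted, e.g.
#   delta_mu_a *= w_abs    # absorption update
#   delta_mu_s *= w_sct    # scattering update
bp = np.random.rand(32, 32) + 1e-3   # stand-in backprojected image (assumed)
w_abs, w_sct = crosstalk_weights(bp)
```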
Abstract:
In this article I shall argue that understandings of what constitutes narrative, how it functions, and the contexts in which it applies have broadened in line with cultural, social and intellectual trends which have seen a blurring, if not the dissolution, of boundaries between ‘fact’ and ‘fiction’; ‘literary’ and ‘non-literary’ narrative spaces; history and story; concepts of time and space, text and image, teller and tale, representation and reality. To illustrate some of the ways in which the concept of narrative has travelled across disciplinary and generic boundaries, I shall look at The Art of Travel (de Botton 2003), with a view to demonstrating how the blending of genres works to produce a narrative that is at once personal and philosophical; visual and verbal; didactic and poetic. I shall show that such a text constitutes a site of interrogation of concepts of narrative, even as it depends on the reader’s ability to narrativize experience.
Abstract:
In daily life, rich experiences evolve in every environmental and social interaction. Because experience has a strong impact on how people behave, scholars in different fields are interested in understanding what constitutes an experience. Yet even if interest in conscious experience is on the increase, there is no consensus on how such experience should be studied. Whatever approach is taken, the subjective and psychologically multidimensional nature of experience should be respected. This study endeavours to understand and evaluate conscious experiences. First I introduce a theoretical approach to psychologically-based and content-oriented experience. In the experiential cycle presented here, classical psychology and orienting-environmental content are connected. This generic approach is applicable to any human-environment interaction. Here I apply the approach to entertainment virtual environments (VEs) such as digital games and develop a framework with the potential for studying experiences in VEs. The development of the methodological framework included subjective and objective data from experiences in the Cave Automatic Virtual Environment (CAVE) and with numerous digital games (N=2,414). The final framework consisted of fifteen factor-analytically formed subcomponents of the sense of presence, involvement and flow. Together, these show the multidimensional experiential profile of VEs. The results present general experiential laws of VEs and show that the interface of a VE is related to (physical) presence, which psychologically means attention, perception and the cognitively evaluated realness and spatiality of the VE. The narrative of the VE elicits (social) presence and involvement and affects emotional outcomes. Psychologically, these outcomes are related to social cognition, motivation and emotion. The mechanics of a VE affect the cognitive evaluations and emotional outcomes related to flow. In addition, at the very least, user background, prior experience and use context affect the experiential variation. VEs are part of many people's lives, and many different outcomes are related to them, such as enjoyment, learning and addiction, depending on who is making the evaluation. This makes VEs societally important and psychologically fruitful to study. The approach and framework presented here contribute to our understanding of experiences in general and VEs in particular. The research can provide VE developers with a state-of-the-art method (www.eveqgp.fi) that can be utilized whenever new product and service concepts are designed, prototyped and tested.
Abstract:
The majority of Internet traffic uses the Transmission Control Protocol (TCP) as the transport-level protocol. TCP provides a reliable, ordered byte stream for applications. However, applications such as live video streaming place an emphasis on timeliness over reliability, and a smooth sending rate can be preferable to sharp changes in the sending rate. For these applications TCP is not necessarily suitable; rate control attempts to address their demands. An important design feature in all rate control mechanisms is TCP-friendliness: they should not negatively impact TCP performance, since TCP is still the dominant protocol. Rate control mechanisms fall into two classes: window-based and rate-based. Window-based mechanisms increase their sending rate after a successful transfer of a window of packets, similar to TCP, and typically decrease their sending rate sharply after a packet loss. Rate-based solutions control their sending rate in some other way. A large subset of rate-based solutions are equation-based: they use a control equation that yields an allowed sending rate. Typically, rate-based solutions react more slowly both to packet losses and to increases in available bandwidth, making their sending rate smoother than that of window-based solutions. This report surveys rate control mechanisms and discusses their relative strengths and weaknesses. A section is dedicated to enhancements for wireless environments. Another topic of the report is bandwidth estimation, which is divided into capacity estimation and available-bandwidth estimation. We describe techniques that enable the calculation of a fair sending rate and that can be used to create novel rate control mechanisms.
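As a concrete example of a control equation, the sketch below implements the TCP throughput equation used by the equation-based mechanism TFRC (RFC 5348); the equation itself is standard, while the example parameter values are our assumptions:

```python
from math import sqrt

def tfrc_rate(s, rtt, p, t_rto=None, b=1):
    """Allowed sending rate in bytes/second per the TFRC control equation.

    s     -- segment size in bytes
    rtt   -- round-trip time in seconds
    p     -- steady-state loss event rate (0 < p <= 1)
    t_rto -- retransmission timeout; RFC 5348 suggests 4*rtt
    b     -- packets acknowledged per ACK
    """
    if t_rto is None:
        t_rto = 4 * rtt
    denom = (rtt * sqrt(2 * b * p / 3)
             + t_rto * (3 * sqrt(3 * b * p / 8)) * p * (1 + 32 * p ** 2))
    return s / denom

# Example (values assumed): 1460-byte segments, 100 ms RTT, 1% loss rate.
# The result approximates the rate a conformant TCP flow would achieve.
print(f"{tfrc_rate(1460, 0.100, 0.01) / 1000:.0f} kB/s")
```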
Abstract:
At the heart of this study is the dual concern of how the nation is represented as a categorical entity and how this is put to use in everyday social interaction. This can be seen as a reaction to general approaches to categorisation and identity functions, which tend to be reified and essentialized within the social sciences. The empirical focus of this study is the Isle of Man, a crown dependency situated geographically at the centre of the British Isles while remaining politically outside the United Kingdom. This site was chosen explicitly because ‘notions of nation’ expressed on the island can be seen as contested and ephemerally unstable. To get at these ‘notions of nation’ it was necessary to choose theoretical tools able to capture the wider cultural and representational domain while addressing the nuanced and functional aspects of interaction. As such, the main theoretical perspective used in this study is critical discursive psychology, which incorporates the specific theoretical tools of interpretative repertoires, ideological dilemmas and subject positions. To supplement these tools, a discursive approach to place was taken in tandem, to address the form and function of place attached to nationhood. Two methods of data collection were utilized: computer-mediated communication and acquaintance interviews. From the data a number of interpretative repertoires were proposed, namely essential rights, economic worth, heritage claims, conflict orientation, people-as-nation and place-as-nation. Attached to these interpretative repertoires were the ideological dilemmas region vs. country, people vs. place and individualism vs. collectivism. The subject positions found are much more difficult to condense, but the most significant were gender, age and parentage. The final focus of the study, place, was shown to be more than an unreflected-upon ‘container’ of people: the rhetorical construction of places was significant for how people saw themselves and for the discursive function of the particular interaction. Such forms of place construction included size, community, temporal, economic, safety, political and recognition. A number of conclusions were drawn: when looking at nation categories we should take into account the specific meanings that people attach to such concepts and be aware of the particular uses to which they are put in interaction; and, since it is impossible to separate concepts neatly, we must be aware of where concepts cross, and clash, when looking at nationhood.
Abstract:
Maurice Merleau-Ponty (1908-1961) has been known as the philosopher of painting. His interest in the theory of perception intertwined with questions concerning the artist's perception, the experience of an artwork and the possible interpretations of the artwork. For him, aesthetics was not a sub-field of philosophy, and art was not simply a subject matter for aesthetic experience but a form of thinking. This study proposes an opening for a dialogue between Merleau-Pontian phenomenology and contemporary art. The thesis examines his phenomenology through certain works of contemporary art and presents readings of these artworks through his phenomenology. It both shows the potential of a method and engages in the critical task of finding the possible limitations of his approach. The first part lays out the methodological and conceptual points of departure of Merleau-Ponty's phenomenological approach to perception, as well as the features that determined his discussion of encountering art. Merleau-Ponty referred to the experience of perceiving art using the notion of seeing with (voir selon). He stressed a correlative reciprocity, described in Eye and Mind (1961) as the switching of the roles of the visible and the painter. The choice of artworks is motivated by certain restrictions in the phenomenological readings of visual arts. The examined works include paintings by Tiina Mielonen, a photographic work by Christian Mayer, a film by Douglas Gordon and Philippe Parreno, and an installation by Monika Sosnowska. These works resonate with, and challenge, his phenomenological approach. The case-study chapters take up different themes that are central to Merleau-Ponty's phenomenology: space, movement, time, and touch. All of these themes are interlinked with the examined artworks, and certain topics reappear throughout the thesis, such as the notion of écart and the question of encountering the other. As Merleau-Ponty argued, the sphere of art has a particular capability to address our being in the world. The thesis presents an interpretation that emphasises the notion of écart, which refers to an experience of divergence or dispossession: the sudden dissociation, surprise or rupture that is needed for a meeting between the spectator and the artwork, or between two persons, to be possible. Further, the thesis suggests that through artworks it is possible to take into consideration the écart, the divergence, that defines our subjectivity.
Abstract:
A state-of-the-art model of the coupled ocean-atmosphere system, the Climate Forecast System (CFS) from the National Centers for Environmental Prediction (NCEP), USA, has been ported onto the PARAM Padma parallel computing system at the Centre for Development of Advanced Computing (CDAC), Bangalore, and retrospective predictions for the summer monsoon (June-September) season of 2009 have been generated using five initial conditions for the atmosphere and one initial condition for the ocean for May 2009. Whereas a large deficit in the Indian summer monsoon rainfall (ISMR; June-September) was experienced over the Indian region (the all-India rainfall deficit being 22% of the average), the ensemble-average prediction was for above-average rainfall during the summer monsoon. The retrospective predictions of ISMR with the NCEP CFS for 1981-2008 have been analysed. The retrospective predictions from NCEP for the summer monsoon of 1994 and those from CDAC for 2009 have been compared with simulations for each season with the stand-alone atmospheric component of the model, the Global Forecast System (GFS), and with observations. The simulation with GFS for 2009 showed deficit rainfall, as observed. The large error in the prediction for the monsoon of 2009 can be attributed to a positive Indian Ocean Dipole event appearing in the prediction from July onwards, which was not present in the observations. This suggests that the error could be reduced by improving the ocean model over the equatorial Indian Ocean.
Abstract:
The distribution of particle reinforcements in cast composites is determined by the morphology of the solidification front. Interestingly, during solidification the morphology of the interface is itself affected by the presence of the dispersed reinforcements; the dispersoid distribution and the length scale of the matrix microstructure are thus the result of the interplay between the two. A proper combination of material and process parameters can be used to obtain composites with tailored microstructures. This requires the generation of a broad database and optimization of the complete solidification process. The length scale of the solidification microstructure has a large influence on the mechanical properties of the composites. This presentation addresses the concept of a particle distribution map, which can help in predicting particle distribution under different solidification conditions. Future research directions are also indicated.
Abstract:
Music signals comprise atomic notes drawn from a musical scale. The creation of musical sequences often involves splicing notes in a constrained way, resulting in aesthetically appealing patterns. We develop an approach to music signal representation based on symbolic dynamics, translating the lexicographic rules over a musical scale into constraints on a Markov chain. This source representation is useful for machine-based music synthesis, in a way similar to a musician producing original music. To mathematically quantify the user's listening experience, we study the correlation between the max-entropic rate of a musical scale and the subjective aesthetic component. We present our analysis with examples from the South Indian classical music system.
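A minimal sketch of the construction, with a toy transition rule of ours standing in for actual lexicographic constraints: allowed note-to-note moves form a 0/1 adjacency matrix, and the largest entropy rate achievable by any Markov chain respecting those constraints is the logarithm of its Perron (largest) eigenvalue:

```python
import numpy as np

notes = ["S", "R", "G", "M", "P", "D", "N"]   # 7-note scale (labels assumed)
n = len(notes)

# Toy constraint (our assumption): allow moves of at most 2 scale steps,
# wrapping at the octave, and forbid repeating the same note.
A = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j and min(abs(i - j), n - abs(i - j)) <= 2:
            A[i, j] = 1

# Max-entropic rate of the constrained source: log2 of the Perron eigenvalue.
max_rate = np.log2(np.linalg.eigvals(A).real.max())
print(f"max-entropic rate = {max_rate:.3f} bits/note")

# Sample a melody from the uniform chain over allowed transitions.
rng = np.random.default_rng(0)
state, melody = 0, []
for _ in range(16):
    state = rng.choice(np.flatnonzero(A[state]))
    melody.append(notes[state])
print(" ".join(melody))
```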