982 results for Art objects, Classical.
Abstract:
We propose an effective elastography technique in which an acoustic radiation force is used for remote palpation to generate localized tissue displacements, which are directly correlated with localized variations in tissue stiffness and are measured using a light probe along the direction of ultrasound propagation. The experimental geometry has provision to input the light beam along the ultrasound propagation direction, so it can be prealigned to ensure proper interception of the focal region by the light beam. Tissue-mimicking phantoms with the homogeneous and isotropic mechanical properties of normal and malignant breast tissue are considered for the study. Each phantom is insonified by a focusing ultrasound transducer (1 MHz). The focal volume of the transducer and the ultrasound radiation force in that region are estimated by solving the acoustic wave propagation through the medium, assuming average acoustic properties. The forward elastography problem is solved for the region of insonification, assuming the Lamé parameters and Poisson's ratio, under Dirichlet boundary conditions, which gives a distribution of displacement vectors. The direction of displacement, though it shows spatial variation, is predominantly along the ultrasound propagation direction. Using Monte Carlo (MC) simulation, we traced the photons through the phantom and collected those arriving at the detector on the boundary of the object in the direction of ultrasound propagation. The intensity correlations are then computed from the detected photons. The intensity correlation function computed through the MC simulation shows a modulation whose strength is found to be proportional to the amplitude of displacement and inversely related to the storage (elastic) modulus.
It is observed that when the storage modulus in the focal region is increased, the computed displacement magnitude, as indicated by the depth of modulation in the intensity autocorrelation, decreases, and the trend is approximately exponential.
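The reported proportionality between displacement amplitude and modulation depth can be illustrated with a minimal single-path model (a sketch, not the authors' Monte Carlo code): an optical path whose phase is modulated sinusoidally, with depth `delta` taken proportional to the local displacement, yields a field autocorrelation whose modulation deepens with `delta`. All names and values here are illustrative assumptions.

```python
import numpy as np

def g1(tau, delta, omega, n_t=2000):
    """Field autocorrelation |<E(t)E*(t+tau)>| for a single optical path
    whose phase is modulated as delta*sin(omega*t) by the ultrasound;
    delta is taken proportional to the local displacement amplitude."""
    t = np.linspace(0.0, 2 * np.pi / omega, n_t, endpoint=False)
    dphi = delta * (np.sin(omega * (t[:, None] + tau[None, :]))
                    - np.sin(omega * t[:, None]))
    return np.abs(np.exp(1j * dphi).mean(axis=0))

omega = 2 * np.pi * 1e6               # 1 MHz insonification frequency
tau = np.linspace(0.0, 2e-6, 400)     # lags spanning two acoustic periods
weak = g1(tau, delta=0.3, omega=omega)
strong = g1(tau, delta=1.0, omega=omega)

# Modulation depth grows with the displacement-driven phase amplitude.
depth_weak = 1.0 - weak.min()
depth_strong = 1.0 - strong.min()
```

In this toy model a stiffer focal region (smaller displacement, hence smaller `delta`) gives a shallower modulation, consistent with the trend reported above.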
Abstract:
In this paper, we present a wavelet-based approach to solving the non-linear perturbation equation encountered in optical tomography. A particularly suitable data-gathering geometry is used to collect a data set consisting of differential changes in intensity owing to the presence of the inhomogeneous regions. With this scheme, the unknown image, the data, and the weight matrix are all represented by wavelet expansions, thus yielding a representation of the original non-linear perturbation equation in the wavelet domain. The advantage of using the non-linear perturbation equation is that there is no need to recompute the derivatives during the entire reconstruction process: once the derivatives are computed, they are transformed into the wavelet domain. The advantage of working in the wavelet domain is its inherent localization and de-noising properties. The use of the approximation coefficients alone, without the detail coefficients, is ideally suited to diffuse optical tomographic reconstruction, as the diffusion equation removes most of the high-frequency information and the reconstruction appears low-pass filtered. We demonstrate through numerical simulations that by solving for only the approximation coefficients one can reconstruct an image with the same information content as the reconstruction from a non-waveletized procedure. In addition, we demonstrate better noise tolerance and a much reduced computation time for this approach.
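A minimal sketch of the wavelet-domain strategy, with assumed toy dimensions (not the paper's actual operators): a toy weight matrix and image are expressed in a one-level orthonormal Haar basis and the system is solved for the approximation coefficients only. For a smooth (low-pass) image the detail coefficients vanish, so nothing is lost.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# One-level orthonormal Haar analysis matrix H: c = H @ x and x = H.T @ c,
# first n/2 rows give approximation coefficients, the rest detail coefficients.
H = np.zeros((n, n))
for i in range(n // 2):
    H[i, 2 * i:2 * i + 2] = 1 / np.sqrt(2)        # averaging (approximation)
    H[n // 2 + i, 2 * i] = 1 / np.sqrt(2)         # differencing (detail)
    H[n // 2 + i, 2 * i + 1] = -1 / np.sqrt(2)

x_true = np.array([1., 1., 2., 2., 3., 3., 0., 0.])  # smooth image: zero detail coeffs
W = rng.normal(size=(12, n))                          # toy weight (sensitivity) matrix
y = W @ x_true                                        # toy differential-intensity data

Wc = W @ H.T                                          # weight matrix in the wavelet domain
a, *_ = np.linalg.lstsq(Wc[:, :n // 2], y, rcond=None)  # solve approximation part only
x_rec = H.T @ np.concatenate([a, np.zeros(n // 2)])   # synthesize, details set to zero
```

The least-squares problem has half as many unknowns as the full system, which is the source of the reduced computation time claimed above.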
Abstract:
Reconstruction in optical tomography involves obtaining images of the absorption and reduced scattering coefficients. The integrated intensity data have greater sensitivity to variations in the absorption coefficient than in the scattering coefficient; however, the sensitivity of the intensity data to the scattering coefficient is not zero. We considered an object with two inhomogeneities (one in the absorption and the other in the scattering coefficient). The standard iterative reconstruction techniques produced results plagued by cross-talk, i.e., the absorption coefficient reconstruction has a false positive at the location of the scattering inhomogeneity, and vice versa. We present a method to remove this cross-talk by generating a weight matrix and weighting the update vector during each iteration. The weight matrix is created as follows: we first perform a simple backprojection of the difference between the experimental intensity data and the corresponding homogeneous data. The resulting image is weighted more towards the absorption inhomogeneity than the scattering inhomogeneity, and its appropriate inverse is weighted towards the scattering inhomogeneity. These two images are used as multiplicative weights on the update vectors during image reconstruction: the normalized backprojected difference-intensity image for the absorption update, and its inverse for the scattering update. We demonstrate through numerical simulations that cross-talk is fully eliminated by this modified reconstruction procedure.
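The weighting idea can be sketched with a toy one-dimensional update (all values are assumed for illustration; the paper derives its weights from backprojected difference-intensity data, and a simple complement stands in here for the "appropriate inverse"):

```python
import numpy as np

n = 10
# Toy update vectors from one iteration: each carries a cross-talk
# false positive at the other inhomogeneity's location (index 2 holds
# the true absorption bump, index 7 the true scattering bump).
upd_abs = np.zeros(n); upd_abs[2] = 1.0; upd_abs[7] = 0.4
upd_sca = np.zeros(n); upd_sca[7] = 1.0; upd_sca[2] = 0.5

# Normalized backprojection of the difference-intensity data: weighted
# more towards the absorption inhomogeneity, as described above.
bp = np.zeros(n); bp[2] = 0.9; bp[7] = 0.2
w_abs = bp / bp.max()        # weight favouring the absorption update
w_sca = 1.0 - w_abs          # complement stands in for the "appropriate inverse"

upd_abs_w = w_abs * upd_abs  # cross-talk at index 7 is suppressed
upd_sca_w = w_sca * upd_sca  # cross-talk at index 2 is suppressed
```

Elementwise weighting leaves each true peak essentially intact while damping the false positive at the other inhomogeneity's location.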
Abstract:
A microscopic theory of the equilibrium solvation and solvation dynamics of a classical, polar solute molecule in a dipolar solvent is presented. Density functional theory is used to calculate explicitly the polarization structure around a solvated ion. The calculated solvent polarization structure differs from the continuum-model prediction in several respects: the value of the polarization at the surface of the ion is less than the continuum value, and the solvent polarization exhibits small oscillations in space near the ion. We show that, under certain approximations, our linear equilibrium theory reduces to the nonlocal electrostatic theory, with the dielectric function ε(k) of the liquid now wave-vector (k) dependent. It is further shown that the nonlocal electrostatic estimate of the solvation energy, with a microscopic ε(k), is close to the estimates of linearized equilibrium theories of polar liquids. The study of solvation dynamics is based on a generalized Smoluchowski equation with a mean-field force term to take into account the effects of intermolecular interactions. This study incorporates the local distortion of the solvent structure near the ion and also the effects of the translational modes of the solvent molecules. The latter contribution, if significant, can considerably accelerate the relaxation of the solvent polarization and can even give rise to a long-time decay that agrees with the continuum-model prediction. The significance of these results is discussed.
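The nonlocal electrostatic estimate can be illustrated with the standard nonlocal generalization of the Born expression, ΔG = -(q²/π) ∫₀^∞ dk [1 - 1/ε(k)] (sin(ka)/ka)², which reduces to the continuum Born value -(q²/2a)(1 - 1/ε) for constant ε. The numerical check below uses toy units and an assumed Lorentzian ε(k), not the paper's microscopic dielectric function.

```python
import numpy as np

def solvation_energy(eps_of_k, q=1.0, a=2.0, kmax=400.0, nk=200000):
    """dG = -(q^2/pi) * integral_0^inf dk [1 - 1/eps(k)] * (sin(k*a)/(k*a))^2,
    evaluated with a simple rectangle rule (Gaussian-type units, toy values)."""
    k = np.linspace(1e-6, kmax, nk)
    dk = k[1] - k[0]
    integrand = (1.0 - 1.0 / eps_of_k(k)) * (np.sin(k * a) / (k * a)) ** 2
    return -(q ** 2 / np.pi) * integrand.sum() * dk

eps0, a = 78.0, 2.0
born = -(1.0 / (2 * a)) * (1 - 1 / eps0)  # continuum Born value for q = 1

local = solvation_energy(lambda k: np.full_like(k, eps0))  # constant eps -> Born
nonlocal_ = solvation_energy(
    lambda k: 1 + (eps0 - 1) / (1 + (k / 2.0) ** 2))       # assumed toy eps(k)
```

Because 1 - 1/ε(k) ≤ 1 - 1/ε(0) when ε(k) decays toward 1 at large k, the nonlocal estimate gives a weaker (less negative) solvation energy than the continuum limit.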
Abstract:
In this article I shall argue that understandings of what constitutes narrative, how it functions, and the contexts in which it applies have broadened in line with cultural, social and intellectual trends which have seen a blurring, if not the dissolution, of boundaries between ‘fact’ and ‘fiction’; ‘literary’ and ‘non-literary’ narrative spaces; history and story; concepts of time and space, text and image, teller and tale, representation and reality. To illustrate some of the ways in which the concept of narrative has travelled across disciplinary and generic boundaries, I shall look at The Art of Travel (de Botton 2003), with a view to demonstrating how the blending of genres works to produce a narrative that is at once personal and philosophical; visual and verbal; didactic and poetic. I shall show that such a text constitutes a site of interrogation of concepts of narrative, even as it depends on the reader’s ability to narrativize experience.
Abstract:
The triangular space between memory, narrative and pictorial representation is the terrain on which this article is developed. Taking the art of memory developed by Giordano Bruno (1548 – 1600) and the art of painting subtly revolutionised by Adam Elsheimer (1578 – 1610) as test-cases, it is shown how both subvert the norms of mimesis and narration prevalent throughout the Renaissance, how disrupted memory creates “incoherent” narratives, and how perspective and the notion of “place” are questioned in a corollary way. Two paintings by Elsheimer are analysed and shown to include, in spite of their supposed “realism”, numerous incoherencies, aporias and strange elements – often overlooked. Thus, they do not conform to two of the basic rules governing both the classical art of memory and the humanist art of painting: well-defined places and the exhaustive translatability of words into images (and vice-versa). In the work of Bruno, both his philosophical claims and the literary devices he uses are analysed as hints for a similar (and contemporaneous) undermining of conventions about the transparency and immediacy of representation.
Abstract:
As with metal and semiconductor nanoparticles, the melting temperature of free inert-gas nanoparticles decreases with decreasing size. The variation is linear in the inverse of the particle size for large nanoparticles and deviates from linearity for small nanoparticles. The decrease in the melting temperature is slower for free nanoparticles with non-wetting surfaces and faster for those with wetting surfaces. Although depression of the melting temperature has been reported for inert-gas nanoparticles in porous glasses, superheating has also been observed when the nanoparticles are embedded in certain matrices. Using a simple classical approach, the influence of size, geometry and the matrix on the melting temperature of nanoparticles is understood quantitatively and shown to be applicable to other materials. It is also shown that the classical approach can be applied to understand the size-dependent freezing temperature of nanoparticles.
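The linear-in-1/d regime described above can be sketched as follows (T_bulk and β are assumed toy values, not constants fitted in the paper; a sign flip of the surface term stands in for the matrix-induced superheating):

```python
import numpy as np

T_bulk = 83.8   # K, roughly the bulk melting point of argon (illustrative)
beta = 1.0      # nm, assumed surface/material constant (toy value)
sizes = np.array([5.0, 10.0, 20.0, 40.0])   # particle diameters in nm

# Free particle: melting-point depression linear in 1/d for large d.
tm_free = T_bulk * (1 - beta / sizes)

# Embedded particle: a sign flip of the surface term stands in for the
# superheating reported for some matrices.
tm_embedded = T_bulk * (1 + beta / sizes)
```

Plotting `tm_free` against `1/sizes` gives a straight line, reproducing the large-particle behaviour; the deviation for small particles described above is not captured by this leading-order form.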
Abstract:
Self-similarity, a concept taken from mathematics, is gradually becoming a keyword in musicology. Although a polysemic term, self-similarity often refers to multi-scalar feature repetition in a set of relationships, and it is commonly valued as an indication of musical coherence and consistency. This investigation provides a theory of musical meaning formation in the context of intersemiosis, that is, the translation of meaning from one cognitive domain to another (e.g. from mathematics to music, or to speech or graphic forms). From this perspective, the degree of coherence of a musical system relies on a synecdochic intersemiosis: a system of related signs within other comparable and correlated systems. This research analyzes the modalities of such correlations, exploring their general and particular traits and their operational bounds. Looking forward in this direction, the notion of analogy is used as a rich concept through its two definitions quoted in the Classical literature: proportion and paradigm, enormously valuable in establishing criteria of measurement, likeness and affinity. Using quantitative and qualitative methods, evidence is presented to justify a parallel study of different modalities of musical self-similarity. For this purpose, original arguments by Benoît B. Mandelbrot are revised, alongside a systematic critique of the literature on the subject. Furthermore, connecting Charles S. Peirce's synechism with Mandelbrot's fractality is one of the main developments of the present study. This study provides elements for explaining Bolognesi's (1983) conjecture, which states that the most primitive, intuitive and basic musical device is self-reference, extending its functions and operations to self-similar surfaces.
In this sense, this research suggests that, with various modalities of self-similarity, synecdochic intersemiosis acts as a system of systems in coordination with greater or lesser development of structural consistency, and with greater or lesser contextual dependence.
Abstract:
In this master's thesis, I have discussed the question of authenticity in postprocessual archaeology. Modern archaeology is a product of the modern world, and postprocessual archaeology in turn is strongly influenced by postmodernism. The way authenticity has been understood in processual archaeology is largely dictated by the modern condition. The understanding of authenticity in postprocessual archaeology, however, rests on notions of simulation and metaphor. It has been argued by postprocessual archaeologists that the past can be experienced by metaphor, and that the relationship between now and then is of a metaphorical kind. In postprocessual archaeology, authenticity has been said to be contextual. This view has been based on a contextualist understanding of the meanings of language and metaphor. I argue that, besides being based on metaphor, authenticity is a conventional attribute based on habits of acting, which in turn have their basis in the material world and the materiality of objects. Authenticity is material meaning, and that meaning can be found out by studying the objects as signs in a chain of signification called semiosis. Authenticity therefore is semiosis.
Abstract:
In daily life, rich experiences evolve in every environmental and social interaction. Because experience has a strong impact on how people behave, scholars in different fields are interested in understanding what constitutes an experience. Yet even if interest in conscious experience is on the increase, there is no consensus on how such experience should be studied. Whatever approach is taken, the subjective and psychologically multidimensional nature of experience should be respected. This study endeavours to understand and evaluate conscious experiences. First I introduce a theoretical approach to psychologically-based and content-oriented experience. In the experiential cycle presented here, classical psychology and orienting-environmental content are connected. This generic approach is applicable to any human-environment interaction. Here I apply the approach to entertainment virtual environments (VEs) such as digital games and develop a framework with the potential for studying experiences in VEs. The development of the methodological framework included subjective and objective data from experiences in the Cave Automatic Virtual Environment (CAVE) and with numerous digital games (N=2,414). The final framework consisted of fifteen factor-analytically formed subcomponents of the sense of presence, involvement and flow. Together, these show the multidimensional experiential profile of VEs. The results present general experiential laws of VEs and show that the interface of a VE is related to (physical) presence, which psychologically means attention, perception and the cognitively evaluated realness and spatiality of the VE. The narrative of the VE elicits (social) presence and involvement and affects emotional outcomes. Psychologically, these outcomes are related to social cognition, motivation and emotion. The mechanics of a VE affect the cognitive evaluations and emotional outcomes related to flow.
In addition, at the very least, user background, prior experience and use context affect the experiential variation. VEs are part of many people's lives, and many different outcomes are related to them, such as enjoyment, learning and addiction, depending on who is making the evaluation. This makes VEs societally important and psychologically fruitful to study. The approach and framework presented here contribute to our understanding of experiences in general and VEs in particular. The research can provide VE developers with a state-of-the-art method (www.eveqgp.fi) that can be utilized whenever new product and service concepts are designed, prototyped and tested.
Abstract:
The study is the outcome of two research projects on North American Indian traditions: the role of the shields within the Plains Indians' traditional culture and religion, and the bear ceremonialism of Native North America, especially the significance of the bear among the Plains Indians. This article-based dissertation includes seven separately published scholarly papers, forming Chapters 6–12. The introduction formulates the objectives and frame of reference of the study, and the conclusion pulls together its results. The study reconsiders the role of Plains Indian shields with bear motifs. Such shields are found in rock art, in the Plains Indians' paintings and drawings, and in various collections, the main source material being the shields in European and North American museums. The aim is not only to study shields with bear power motifs and the meanings of the bear, but also to discuss appropriate methods for studying these subjects. There are three major aims: to consider methodical questions in studying Plains Indian shields, to examine the complexity of Plains Indian shields with bear power motifs, and to offer new interpretations of the basic meanings of the bear among the Plains Indians and of the interrelationship between individualism and collectivism in the Plains Indians' visionary art that shows bear power motifs on the shields. The study constructs a view of the bear shields that takes account of all available sources of information, analysing the shields both as physical artefacts and as religious objects from different perspectives and studying them as part of the ensemble of Plains culture and religious traditions. The bear motifs represented the superhuman power that medicine men and warriors could exploit through visions. For the Plains Indians, the bear was a wise animal from which medicine men could get power for healing, but also a dangerous animal from which warriors could get power for warfare.
The shields with bear motifs represented the bear powers of the owners of the shields. The bear shield was made to represent the vision, and the principal interpretation of the symbolism was based on the individual experience of the spiritual world and its powers. The study argues that the bear shield as a personal medicine object is based on wider tribal traditions, and its basic meaning is derived from the collective tradition. This means that the bear seen in a vision represented particular affairs and was represented on the shield surface using conventional modes of traditional artistry. In consequence, the bear shields reflect not only individual experiences of bear power but the whole field of tribal traditions that legitimated those experiences and offered acceptable interpretations and conventional modes for the bear symbols.
Abstract:
This paper presents an algorithm for generating the Interior Medial Axis Transform (iMAT) of 3D objects with free-form boundaries. The algorithm proposed uses the exact representation of the part and generates an approximate rational spline description of the iMAT. The algorithm generates the iMAT by a tracing technique that marches along the object's boundary. The level of approximation is controlled by the choice of the step size in the tracing procedure. Criteria based on distance and local curvature of boundary entities are used to identify the junction points and the search for these junction points is done in an efficient way. The algorithm works for multiply-connected objects as well. Results of the implementation are provided. (C) 2010 Elsevier Ltd. All rights reserved.
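The distance-based criterion underlying medial-axis and junction-point detection can be illustrated with a brute-force test on a rectangle (the paper traces the iMAT along free-form spline boundaries; the sketch below only illustrates the equidistance criterion, and all tolerances and dimensions are assumed):

```python
import numpy as np

def boundary_samples(w=4.0, h=2.0, n=400):
    """Dense point samples of the boundary of a w-by-h rectangle."""
    t = np.linspace(0, 1, n, endpoint=False)
    sides = [np.column_stack([w * t, np.zeros(n)]),          # bottom
             np.column_stack([np.full(n, w), h * t]),        # right
             np.column_stack([w * (1 - t), np.full(n, h)]),  # top
             np.column_stack([np.zeros(n), h * (1 - t)])]    # left
    return np.vstack(sides)

def on_medial_axis(p, bnd, tol=1e-2, sep=0.5):
    """A point lies on the medial axis when its nearest boundary point is
    (nearly) non-unique: two well-separated boundary points realize the
    same minimal distance."""
    d = np.linalg.norm(bnd - p, axis=1)
    near = bnd[d < d.min() + tol]       # boundary points at ~minimal distance
    if len(near) < 2:
        return False
    diffs = near[:, None, :] - near[None, :, :]
    return np.linalg.norm(diffs, axis=-1).max() > sep

bnd = boundary_samples()
```

For the 4×2 rectangle, the interior centre line (e.g. the point (2, 1), equidistant from top and bottom) satisfies the criterion, while an off-axis point such as (2, 0.5) does not.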
Abstract:
The majority of Internet traffic uses the Transmission Control Protocol (TCP) as the transport-level protocol. TCP provides a reliable, ordered byte stream for applications. However, applications such as live video streaming place an emphasis on timeliness over reliability, and a smooth sending rate can be preferable to sharp changes in the sending rate. For these applications TCP is not necessarily suitable. Rate control attempts to address the demands of such applications. An important design feature in all rate control mechanisms is TCP friendliness: they should not negatively impact TCP performance, since TCP is still the dominant protocol. Rate control mechanisms fall into two classes: window-based mechanisms and rate-based mechanisms. Window-based mechanisms increase their sending rate after the successful transfer of a window of packets, similarly to TCP, and typically decrease their sending rate sharply after a packet loss. Rate-based solutions control their sending rate in some other way. A large subset of rate-based solutions are called equation-based solutions: they have a control equation which provides an allowed sending rate. Typically, these rate-based solutions react more slowly to both packet losses and increases in available bandwidth, making their sending rate smoother than that of window-based solutions. This report contains a survey of rate control mechanisms and a discussion of their relative strengths and weaknesses. A section is dedicated to enhancements in wireless environments. Another topic of the report is bandwidth estimation, which is divided into capacity estimation and available-bandwidth estimation. We describe techniques that enable the calculation of a fair sending rate and that can be used to create novel rate control mechanisms.
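A minimal sketch of an equation-based control equation, using the TCP throughput equation in the form popularized by TFRC (RFC 5348, with b = 1 and the common t_RTO = 4·RTT simplification; packet size and RTT values below are illustrative):

```python
from math import sqrt

def tfrc_rate(s, rtt, p, t_rto=None):
    """Allowed sending rate (bytes/s) from the TCP throughput equation
    (RFC 5348 form with b = 1):
      X = s / (R*sqrt(2p/3) + t_RTO * 3*sqrt(3p/8) * p * (1 + 32*p^2))"""
    if t_rto is None:
        t_rto = 4 * rtt   # common simplification from the RFC
    denom = (rtt * sqrt(2 * p / 3)
             + t_rto * 3 * sqrt(3 * p / 8) * p * (1 + 32 * p ** 2))
    return s / denom

# 1500-byte packets, 100 ms RTT: the rate falls smoothly as the
# loss event rate p rises, rather than halving abruptly as TCP does.
rates = [tfrc_rate(1500, 0.1, p) for p in (0.001, 0.01, 0.05)]
```

Because the sender tracks a slowly varying loss event rate p rather than reacting to each loss, the resulting rate is smoother than that of a window-based scheme while remaining TCP-friendly on average.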
Abstract:
Head-on infall of two compact objects with arbitrary mass ratio is investigated using the multipolar post-Minkowskian approximation method. At the third post-Newtonian order the energy flux, in addition to the instantaneous contributions, also includes hereditary contributions consisting of the gravitational-wave tails, tails-of-tails, and tail-squared terms. The results are given both for infall from infinity and for infall from a finite distance. These analytical expressions should be useful for comparison with high-accuracy numerical relativity results in the regime where the post-Newtonian approximation is valid.