815 results for Art 54 Código Contencioso Administrativo


Relevance:

20.00%

Publisher:

Abstract:

Reconstructions in optical tomography involve obtaining images of the absorption and reduced scattering coefficients. The integrated intensity data has greater sensitivity to variations in the absorption coefficient than in the scattering coefficient; however, its sensitivity to the scattering coefficient is not zero. We considered an object with two inhomogeneities, one in the absorption and the other in the scattering coefficient. Standard iterative reconstruction techniques produced results plagued by cross-talk: the absorption coefficient reconstruction has a false positive at the location of the scattering inhomogeneity, and vice versa. We present a method to remove this cross-talk by generating a weight matrix and weighting the update vector during each iteration. The weight matrix is created as follows: we first perform a simple backprojection of the difference between the experimental intensity data and the corresponding homogeneous data. The built-up image is weighted more strongly towards the absorption inhomogeneity than towards the scattering inhomogeneity, and its appropriate inverse is weighted towards the scattering inhomogeneity. These two weight matrices are used as multiplication factors on the update vectors during image reconstruction: the normalized backprojected difference-intensity image for the absorption inhomogeneity, and its inverse for the scattering inhomogeneity. We demonstrate through numerical simulations that the cross-talk is fully eliminated by this modified reconstruction procedure.
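
The weighting idea described above can be sketched in a few lines. The following is an illustrative simplification under our own assumptions (the array names, the normalization to [0, 1] and the use of the complement as the "inverse" weight are ours, not the paper's implementation):

    import numpy as np

    def weighted_updates(bp_diff, delta_mu_a, delta_mu_s, eps=1e-12):
        # bp_diff: backprojection of (measured - homogeneous) intensity data
        # delta_mu_a / delta_mu_s: raw update vectors for the absorption and
        # reduced scattering coefficients from the iterative reconstruction
        w_a = (bp_diff - bp_diff.min()) / (np.ptp(bp_diff) + eps)  # assumed to peak near the absorption inhomogeneity
        w_s = 1.0 - w_a                                            # complement emphasises the scattering inhomogeneity
        # element-wise weighting suppresses the cross-talk (false-positive) components
        return w_a * delta_mu_a, w_s * delta_mu_s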

Relevance:

20.00%

Publisher:

Abstract:

In this article I shall argue that understandings of what constitutes narrative, how it functions, and the contexts in which it applies have broadened in line with cultural, social and intellectual trends which have seen a blurring, if not the dissolution, of boundaries between ‘fact’ and ‘fiction’; ‘literary’ and ‘non-literary’ narrative spaces; history and story; concepts of time and space, text and image, teller and tale, representation and reality. To illustrate some of the ways in which the concept of narrative has travelled across disciplinary and generic boundaries, I shall look at The Art of Travel (de Botton 2003), with a view to demonstrating how the blending of genres works to produce a narrative that is at once personal and philosophical; visual and verbal; didactic and poetic. I shall show that such a text constitutes a site of interrogation of concepts of narrative, even as it depends on the reader’s ability to narrativize experience.

Relevance:

20.00%

Publisher:

Abstract:

The subject of the present research is historical lighthouse and maritime pilot stations in Finland. If one thinks of these now-abandoned sites as an empty stage, the dissertation aims to recreate the drama that once played out there. The research comprises three main themes. The first, the family problematic, focuses on the relationship between the family members concerned, the public service positions held, and the islands on which these people were stationed. The role of the male actors becomes apparent through an examination of the job descriptions of pilots and lighthouse keepers, but the role of the wives appears more problematic: running a household and the insularity of the community came with their own challenges, and the husbands were away for much of the time. In this context the children emerge as crucial. What was their role in the family of a public official? What were the effects of having to move to the mainland for school? The second theme is the station community. A socioecological examination is undertaken which treats the islands as plots allowing the researcher to study the social behaviours of the isolated communities in question. The development of this theme is based on interpretations of interviews revealing starkly opposed views of the existing neighbourly relations. The premise is that social friction is inevitable among people living in close proximity to each other, and the study proceeds to an analysis that seeks to uncover the sociocultural strategies designed to control the risks of communal living, either by creating distance between neighbours or by enhancing their mutual ties. In connection with this, the question of why some neighbourhoods were open and cooperative while others were restrained and quarrelsome is addressed. Finally, the third main theme discusses the changes in piloting and lighthouse keeping that became increasingly numerous towards the end of the 20th century. How did individuals react to the central management's technocratic strivings and rationalisations, such as the automation of lighthouses and the intense downsizing of the network of pilot stations? How was piloting, previously very comprehensive work, splintered into specialisations, and how did the entire occupation of lighthouse keeping lose its status before disappearing completely as the new technology took over?

Relevance:

20.00%

Publisher:

Abstract:

The majority of Internet traffic uses the Transmission Control Protocol (TCP) as the transport-level protocol. TCP provides a reliable, ordered byte stream to applications. However, applications such as live video streaming place an emphasis on timeliness over reliability, and a smooth sending rate can be preferable to sharp changes in the sending rate. For these applications TCP is not necessarily suitable. Rate control attempts to address the demands of such applications. An important design feature of all rate control mechanisms is TCP friendliness: a new mechanism should not negatively impact TCP performance, since TCP is still the dominant protocol. Rate control mechanisms fall into two classes: window-based mechanisms and rate-based mechanisms. Window-based mechanisms increase their sending rate after a successful transfer of a window of packets, similarly to TCP, and typically decrease their sending rate sharply after a packet loss. Rate-based solutions control their sending rate in some other way. A large subset of rate-based solutions, called equation-based solutions, use a control equation that provides an allowed sending rate. Typically these rate-based solutions react more slowly to both packet losses and increases in available bandwidth, making their sending rate smoother than that of window-based solutions. This report surveys rate control mechanisms and discusses their relative strengths and weaknesses; a section is dedicated to enhancements for wireless environments. Another topic in the report is bandwidth estimation, which is divided into capacity estimation and available bandwidth estimation. We describe techniques that enable the calculation of a fair sending rate and that can be used to create novel rate control mechanisms.
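
As an illustration of what such a control equation looks like, the sketch below implements the widely used TCP throughput equation, the form employed by equation-based mechanisms such as TFRC; the parameter names follow the usual convention, and the default timeout approximation t_RTO ≈ 4·RTT is an assumption on our part rather than a value from the report:

    import math

    def tcp_friendly_rate(s, rtt, p, t_rto=None, b=1):
        # s: packet size in bytes, rtt: round-trip time in seconds,
        # p: loss event rate, b: packets acknowledged per ACK
        if t_rto is None:
            t_rto = 4 * rtt  # common approximation for the retransmission timeout
        denom = (rtt * math.sqrt(2 * b * p / 3)
                 + t_rto * 3 * math.sqrt(3 * b * p / 8) * p * (1 + 32 * p ** 2))
        return s / denom  # allowed sending rate in bytes per second

    # e.g. 1460-byte packets, 100 ms RTT, 1% loss event rate -> roughly 1.6e5 bytes/s
    print(tcp_friendly_rate(1460, 0.1, 0.01))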

Relevance:

20.00%

Publisher:

Abstract:

Maurice Merleau-Ponty (1908-1961) has been known as the philosopher of painting. His interest in the theory of perception intertwined with questions concerning the artist's perception, the experience of an artwork and the possible interpretations of the artwork. For him, aesthetics was not a sub-field of philosophy, and art was not simply subject matter for aesthetic experience but a form of thinking. This study proposes an opening for a dialogue between Merleau-Pontian phenomenology and contemporary art. The thesis examines his phenomenology through certain works of contemporary art and presents readings of these artworks through his phenomenology. It both shows the potential of a method and engages in the critical task of finding the possible limitations of his approach. The first part lays out the methodological and conceptual points of departure of Merleau-Ponty's phenomenological approach to perception, as well as the features that determined his discussion of encountering art. Merleau-Ponty referred to the experience of perceiving art using the notion of seeing with (voir selon). He stressed a correlative reciprocity, described in Eye and Mind (1961) as the switching of the roles of the visible and the painter. The choice of artworks is motivated by certain restrictions in the phenomenological readings of visual arts. The examined works include paintings by Tiina Mielonen, a photographic work by Christian Mayer, a film by Douglas Gordon and Philippe Parreno, and an installation by Monika Sosnowska. These works resonate with, and challenge, his phenomenological approach. The chapters with case studies take up different themes that are central to Merleau-Ponty's phenomenology: space, movement, time, and touch. All of these themes are interlinked with the examined artworks. Certain topics also reappear throughout the thesis, such as the notion of écart and the question of encountering the other. As Merleau-Ponty argued, the sphere of art has a particular capability to address our being in the world. The thesis presents an interpretation that emphasises the notion of écart, which refers to an experience of divergence or dispossession: the sudden dissociation, surprise or rupture that is needed in order for a meeting between the spectator and the artwork, or between two persons, to be possible. Further, the thesis suggests that through artworks it is possible to take into consideration the écart, the divergence, that defines our subjectivity.

Relevance:

20.00%

Publisher:

Abstract:

A state-of-the-art model of the coupled ocean-atmosphere system, the Climate Forecast System (CFS) from the National Centers for Environmental Prediction (NCEP), USA, has been ported onto the PARAM Padma parallel computing system at the Centre for Development of Advanced Computing (CDAC), Bangalore, and retrospective predictions for the summer monsoon (June-September) season of 2009 have been generated using five initial conditions for the atmosphere and one initial condition for the ocean for May 2009. Whereas a large deficit in the Indian summer monsoon rainfall (ISMR; June-September) was experienced over the Indian region (with all-India rainfall 22% below the average), the ensemble-average prediction was for above-average rainfall during the summer monsoon. The retrospective predictions of ISMR with the CFS from NCEP for 1981-2008 have been analysed. The retrospective predictions from NCEP for the summer monsoon of 1994 and those from CDAC for 2009 have been compared with simulations of each season with the stand-alone atmospheric component of the model, the Global Forecast System (GFS), and with observations. It is shown that the simulation with the GFS for 2009 produced deficit rainfall, as observed. The large error in the prediction for the monsoon of 2009 can be attributed to a positive Indian Ocean Dipole event seen in the prediction from July onwards, which was not present in the observations. This suggests that the error could be reduced by improving the ocean model over the equatorial Indian Ocean.

Relevance:

20.00%

Publisher:

Abstract:

The distribution of particle reinforcements in cast composites is determined by the morphology of the solidification front. Interestingly, during solidification the morphology of the interface is itself intrinsically affected by the presence of the dispersed reinforcements; thus the dispersoid distribution and the length scale of the matrix microstructure result from the interplay between the two. A proper combination of material and process parameters can be used to obtain composites with tailored microstructures. This requires the generation of a broad database and optimization of the complete solidification process. The length scale of the solidification microstructure has a large influence on the mechanical properties of the composites. This presentation addresses the concept of a particle distribution map, which can help in predicting particle distribution under different solidification conditions. Future research directions are also indicated.

Relevance:

20.00%

Publisher:

Abstract:

In recent years, parallel computers have been attracting attention for simulating artificial neural networks (ANNs), owing to the inherent parallelism in ANNs. This work studies ways of parallelizing adaptive resonance theory (ART), a popular neural network algorithm. The core computations of ART are separated out and different strategies for parallelizing ART are discussed. We present mapping strategies for the ART 2-A neural network onto ring and mesh architectures. The required parallel architecture is simulated using a parallel architectural simulator, PROTEUS, and parallel programs for the presented algorithms are written in a superset of C. A simulation-based scalability study of the algorithm-architecture match is carried out, and the various overheads are identified in order to suggest ways of improving performance. Our main objective is to determine the performance of the ART 2-A network on different parallel architectures. (C) 1999 Elsevier Science B.V. All rights reserved.
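
To give a rough idea of the core ART computations referred to above, the sketch below shows a deliberately simplified, sequential ART 2-A step (input normalization, choice by dot product, vigilance test, prototype update). The simplifications and parameter names are ours, not the paper's; the per-category dot products are the part that the ring and mesh mappings would distribute across processors:

    import numpy as np

    def art2a_step(x, weights, rho=0.9, beta=0.1):
        # x: input vector; weights: list of unit-length category prototypes
        # rho: vigilance threshold; beta: learning rate
        x = x / np.linalg.norm(x)                  # normalize the input
        if not weights:                            # no committed categories yet
            return 0, [x.copy()]
        # Choice: dot products with all prototypes; this loop is what a
        # ring or mesh mapping would distribute across processors.
        scores = np.array([w @ x for w in weights])
        j = int(np.argmax(scores))
        if scores[j] >= rho:                       # vigilance passed: resonance
            w_new = beta * x + (1 - beta) * weights[j]
            weights[j] = w_new / np.linalg.norm(w_new)
            return j, weights
        weights.append(x.copy())                   # otherwise commit a new category
        return len(weights) - 1, weights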

Relevance:

20.00%

Publisher:

Abstract:

Music signals comprise atomic notes drawn from a musical scale. The creation of musical sequences often involves splicing the notes in a constrained way, resulting in aesthetically appealing patterns. We develop an approach to music signal representation based on symbolic dynamics, by translating the lexicographic rules over a musical scale into constraints on a Markov chain. This source representation is useful for machine-based music synthesis, in a way similar to a musician producing original music. In order to quantify user listening experience mathematically, we study the correlation between the max-entropic rate of a musical scale and the subjective aesthetic component. We present our analysis with examples from the south Indian classical music system.
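
One standard way to make the max-entropic rate of a constrained note alphabet concrete: for a 0/1 matrix of allowed note transitions, the largest entropy rate achievable by any Markov chain obeying those constraints is log2 of the matrix's largest eigenvalue. The sketch below uses a made-up transition constraint over a five-note scale purely for illustration; it is not the constraint set studied in the paper:

    import numpy as np

    def max_entropy_rate(allowed):
        # Maximum entropy rate (bits/note) over all Markov chains whose
        # transitions are restricted to the 0/1 matrix `allowed`; this is
        # log2 of the Perron (largest) eigenvalue of the matrix.
        lam = max(abs(np.linalg.eigvals(allowed)))
        return float(np.log2(lam))

    # Illustrative 5-note scale where each note may repeat or move to an
    # adjacent scale degree (an invented lexicographic constraint).
    A = np.array([[1, 1, 0, 0, 0],
                  [1, 1, 1, 0, 0],
                  [0, 1, 1, 1, 0],
                  [0, 0, 1, 1, 1],
                  [0, 0, 0, 1, 1]])
    print(max_entropy_rate(A))   # below log2(5), the rate of an unconstrained scale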

Relevance:

20.00%

Publisher:

Abstract:

We study the problem of analyzing the influence of various factors affecting individual messages posted in social media. The problem is challenging because various types of influence propagate through the social media network and act simultaneously on any user. Additionally, the topic composition of the influencing factors and the susceptibility of users to these influences evolve over time. This problem has not been studied before, and off-the-shelf models are unsuitable for the purpose. To capture the complex interplay of these factors, we propose a new non-parametric model called the Dynamic Multi-Relational Chinese Restaurant Process, which accounts for the user network in data generation and allows the parameters to evolve over time. Designing inference algorithms for this model suited to large-scale social-media data is another challenge. To this end, we propose a scalable, multi-threaded inference algorithm based on online Gibbs sampling. Extensive evaluations on large-scale Twitter and Facebook data show that the extracted topics, when applied to authorship and commenting prediction, outperform state-of-the-art baselines. More importantly, our model produces valuable insights on topic trends and user personality trends beyond the capability of existing approaches.
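
For readers unfamiliar with the underlying building block, the sketch below samples cluster assignments from a plain Chinese Restaurant Process. It is only the standard CRP prior, not the dynamic multi-relational variant proposed in the paper, and the parameter names are ours:

    import random
    from collections import Counter

    def crp_assign(n_customers, alpha=1.0, seed=0):
        # Customer i joins an existing table with probability proportional to
        # its occupancy, or opens a new table with probability proportional to alpha.
        rng = random.Random(seed)
        tables = []                     # tables[k] = number of customers at table k
        assignment = []
        for _ in range(n_customers):
            weights = tables + [alpha]  # existing occupancies plus the new-table mass
            k = rng.choices(range(len(weights)), weights=weights)[0]
            if k == len(tables):
                tables.append(1)        # open a new table (a new topic/cluster)
            else:
                tables[k] += 1
            assignment.append(k)
        return assignment, tables

    assignments, sizes = crp_assign(100, alpha=2.0)
    print(Counter(sizes))  # how many tables of each occupancy were formed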

Relevance:

20.00%

Publisher:

Abstract:

The tonic is a fundamental concept in Indian art music. It is the base pitch that an artist chooses in order to construct the melodies during a rāga rendition, and all accompanying instruments are tuned using the tonic pitch. Consequently, tonic identification is a fundamental task for most computational analyses of Indian art music, such as intonation analysis, melodic motif analysis and rāga recognition. In this paper we review existing approaches for tonic identification in Indian art music and evaluate them on six diverse datasets for a thorough comparison and analysis. We study the performance of each method in different contexts, such as the presence or absence of additional metadata, the quality of the audio data, the duration of the audio data, the music tradition (Hindustani/Carnatic) and the gender of the singer (male/female). We show that the approaches combining multi-pitch analysis with machine learning provide the best performance in most cases (90% identification accuracy on average) and are robust across the aforementioned contexts, compared to the approaches based on expert knowledge. In addition, we show that the performance of the latter can be improved when additional metadata is available to further constrain the problem. Finally, we present a detailed error analysis of each method, providing further insights into the advantages and limitations of each.
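
A minimal sketch of the pitch-histogram idea that several of the reviewed multi-pitch approaches build on: accumulate frame-level pitch estimates into a histogram on a cent scale and take the strongest peak within a plausible tonic range as the candidate. The function below is an illustrative simplification with parameters chosen by us, not any specific method from the review:

    import numpy as np

    def tonic_candidate(pitches_hz, fmin=100.0, fmax=260.0, bins_per_semitone=3):
        # pitches_hz: frame-level pitch estimates in Hz (0 for unvoiced frames)
        pitches_hz = np.asarray(pitches_hz, dtype=float)
        pitches_hz = pitches_hz[pitches_hz > 0]          # drop unvoiced frames
        cents = 1200.0 * np.log2(pitches_hz / 55.0)      # cents above A1 (55 Hz)
        bin_width = 100.0 / bins_per_semitone
        hist, edges = np.histogram(cents, bins=np.arange(0, 4800, bin_width))
        centers_hz = 55.0 * 2 ** ((edges[:-1] + bin_width / 2) / 1200.0)
        in_range = (centers_hz >= fmin) & (centers_hz <= fmax)
        best = np.argmax(np.where(in_range, hist, 0))    # strongest peak in the tonic range
        return float(centers_hz[best])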