994 results for one-pass tableau
Abstract:
Early cinema, that is, the production of the first two decades of film, largely characterized by autonomous shots, short stories and a fixed frame, has essentially been studied aesthetically only from a narratological angle, focused in particular on the beginnings of editing. This dissertation shifts the gaze - or, more simply, summons it - by proposing to give the image its due place. For to those who know how to look at them, early films reveal a pictorial kinship that has so far been ignored. The images of early cinema - then significantly called "tableaux" - were in fact defined against the yardstick of painting, and more precisely through a literal imitation of works of art. This study reveals that the tableau vivant, defined strictly as the reconstitution of a pictorial composition by living actors (whether or not they hold the pose), underlies an aesthetics of the early film. The argument is structured by the illustrations that the author unearths (and compares, in the manner of a spectacular, living spot-the-difference game) from a film production that has largely vanished, been burned or erased, and from pictorial references that are now lost, disparaged, forgotten... Yet what comes to light is not a handful of isolated examples but a genuine historical phenomenon, revealed through a corpus of films cutting across all the genres of early cinema and proving that the productions of the Film d'Art and the artistic series, or the film A Corner in Wheat (D.W. Griffith, 1909), often regarded as a beginning, are much rather the culmination of this tradition of creating filmic images in the form of tableaux vivants. First tracing its "context and contours", the text shows that pictorial reconstitution haunts every form of spectacle at the time of cinema's emergence. The stages of the period cultivate a tableau vivant aesthetic internationally. Nor does the stage have a monopoly on the phenomenon: from its very beginnings, the photographic medium appropriates the device, both to document (something previously impossible) the visual effect of these reconstitutions and to reinvent them, in particular to legitimize itself as an artistic medium capable of rivaling painting.
Emerging cinema carries out a similar appropriation of the tableau vivant, which forms the core of this work and is analyzed along four theoretical axes: Reproducing - where we discover the fundamentally indirect character of the pictorial filiation of these tableaux vivants, caught up in a dynamic of intermedial reproduction that turns them into genuine exercises in style, through which producers experiment with and become aware of the artistic means of the filmic image -; Re-embodying - where we study the issues raised by the "bringing to life", and more precisely the "embodiment", of pictorial figures (in particular Jesus and the nude), involving questions of censorship and an interrogation of the gaze on art, on the body, and on the status of these images that seem more original than the original -; Reanimating - where we examine the way cinema sets painting in motion, putting the composition back into action, redeploying its pregnant moment, experimenting with the held pose, the freeze frame and the whole spectrum of cinematic temporality -; and finally Reframing - where we analyze the framing of these tableaux rethought in terms of the camera and the screen, which require the Bazinian theoretical categories to be made more complex, and which bring out the tableau vivant as a site where a tabular filmic image crystallizes, one that resists linear editing. This resistance can be traced all the way to very recent films, which, by reactualizing the motif of the tableau vivant, break the narrative linearity of editing and bring the artistic weight of the image back to the fore - thereby reviving a founding aesthetic of cinema.
Abstract:
We present Very Long Baseline Interferometry (VLBI) observations of the high-mass X-ray binary LS I +61°303, carried out with the European VLBI Network (EVN). Over the 11-hour observing run, performed ~10 days after a radio outburst, the radio source showed a constant flux density, which allowed sensitive imaging of the emission distribution. The structure in the map shows a clear extension to the southeast. Comparing our data with previous VLBI observations, we interpret the extension as a collimated radio jet as found in several other X-ray binaries. Assuming that the structure is the result of an expansion that started at the onset of the outburst, we derive an apparent expansion velocity of 0.003 c, which, in the context of Doppler boosting, corresponds to an intrinsic velocity of at least 0.4 c for an ejection close to the line of sight. From the apparent velocity in all available epochs we are able to establish variations in the ejection angle, which imply a precessing accretion disk. Finally, we point out that LS I +61°303, like SS 433 and Cygnus X-1, shows evidence for an emission region almost orthogonal to the relativistic jet.
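For orientation, the relation between apparent and intrinsic velocities invoked here is the standard one for relativistic ejections (quoted as a textbook formula; the abstract does not give the authors' exact expressions):

\[
\beta_{\mathrm{app}} = \frac{\beta \sin\theta}{1 - \beta\cos\theta},
\qquad
\delta = \frac{1}{\gamma\,(1 - \beta\cos\theta)},
\qquad
\gamma = \frac{1}{\sqrt{1 - \beta^{2}}},
\]

where \(\beta\) is the intrinsic velocity in units of c, \(\theta\) the angle between the ejection and the line of sight, and \(\delta\) the Doppler factor. Because \(\sin\theta\) is small for an ejection close to the line of sight, a tiny apparent expansion velocity such as 0.003 c remains compatible with an intrinsic velocity of several tenths of c.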
Abstract:
In this work we report the preparation of a tetrabutylammonium hydroxide (TBAOH) solution in acetonitrile in a one-pot process, in order to study the iron porphyrin-OH- interaction in non-aqueous systems. All the reactions were carried out under a dry argon atmosphere to prevent contamination of the solution with CO2, which leads to the formation of (TBA)2CO3.
Abstract:
This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. In this thesis, two new distance transforms for gray-level images are presented. As a new application for distance transforms, they are applied to gray-level image compression. The new distance transforms are both extensions of the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer numbers (DTOCS) and a real-value distance transform (EDTOCS) on gray-level images. Both distance transforms, the DTOCS and EDTOCS, require only two passes over the gray-level image and are extremely simple to implement. Only two image buffers are needed: the original gray-level image and the binary image which defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3-10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All the other gray-weighted distance function algorithms (GRAYMAT etc.) find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way. The DTOCS gives a weighted version of the chessboard distance map; the weights are not constant, but are the gray-value differences of the original image. The difference between the DTOCS map and other distance transforms for gray-level images is shown. The difference between the DTOCS and EDTOCS is that the EDTOCS calculates these gray-level differences in a different way: it propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Commonly, distance transforms are used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray-level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group of methods compares the DTOCS distance to the binary-image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally. It is shown that the time complexity of the algorithms is independent of the number of control points, i.e. of the compression ratio. Also a new morphological image decompression scheme is presented, the 8 kernels' method. Several decompressed images are presented. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
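As a rough illustration of the two-pass propagation described above, the following Python sketch implements one plausible reading of a DTOCS-style gray-level chamfer transform. The 8-neighbor kernel, the local cost (gray-level difference plus one chessboard step) and the buffer layout are illustrative assumptions, not the thesis implementation:

```python
import numpy as np

def dtocs_like(gray, mask, iterations=3):
    """Illustrative two-pass, DTOCS-style distance transform.

    gray : 2-D array of gray levels.
    mask : boolean array; True marks the region of calculation,
           False pixels act as sources with distance 0.
    """
    gray = gray.astype(float)
    dist = np.where(mask, np.inf, 0.0)  # sources (mask == False) start at 0
    rows, cols = gray.shape
    # 8-neighborhood split into the half scanned before the current pixel
    # in each pass direction (standard two-pass chamfer scheme).
    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]
    bwd = [(1, 1), (1, 0), (1, -1), (0, 1)]

    def sweep(offsets, row_order, col_order):
        for r in row_order:
            for c in col_order:
                if not mask[r, c]:
                    continue
                for dr, dc in offsets:
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        # local cost: gray-level difference + 1 chessboard step
                        cost = abs(gray[r, c] - gray[rr, cc]) + 1.0
                        if dist[rr, cc] + cost < dist[r, c]:
                            dist[r, c] = dist[rr, cc] + cost

    for _ in range(iterations):  # a few forward/backward rounds may be needed
        sweep(fwd, range(rows), range(cols))
        sweep(bwd, range(rows - 1, -1, -1), range(cols - 1, -1, -1))
    return dist
```

On a constant-gray image the local cost reduces to 1 per step, so the result collapses to the ordinary chessboard distance, which is the sense in which the DTOCS is described above as a weighted chessboard transform.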
Abstract:
This paper presents a discussion of the process that led us to progressively develop a specific methodological approach for research on one-parent families. This process has been systematized and built from the contributions of feminist epistemologies to the methodological design and to participatory forms of work. From it derives a scientific and technical contribution that is internationally unpublished until now: the Single Parenthood and Family Diversity Survey (EMODIF), which we propose as a non-androcentric tool for measuring single parenthood and its profiles, experiences, expectations and realities. With this article we want to offer a systematization of the implications that our implementation of the feminist perspective has had for studies of one-parent families.
Abstract:
The heated debate over whether there is only a single mechanism or two mechanisms for morphology has diverted valuable research energy away from the more critical questions about the neural computations involved in the comprehension and production of morphologically complex forms. Cognitive neuroscience data implicate many brain areas. All extant models, whether they rely on a connectionist network or espouse two mechanisms, are too underspecified to explain why more than a few brain areas differ in their activity during the processing of regular and irregular forms. No one doubts that the brain treats regular and irregular words differently, but brain data indicate that a simplistic account will not do. It is time for us to search for the critical factors free from theoretical blinders.
Abstract:
We show how certain N-dimensional dynamical systems are able to exploit the full instability capabilities of their fixed points to undergo Hopf bifurcations, and how such behavior produces complex time evolutions based on the nonlinear combination of the oscillation modes that emerge from these bifurcations. For widely separated oscillation frequencies, the evolutions describe robust waveform structures, usually periodic, in which self-similarity with respect to both the time scale and the system dimension is clearly apparent. For closer frequencies, the evolution signals usually appear irregular but are still based on the repetition of complex waveform structures. The study is developed by considering vector fields with a scalar-valued nonlinear function of a single variable that is a linear combination of the N dynamical variables. In this case, linear stability analysis can be used to design N-dimensional systems in which the fixed points of a saddle-node pair experience up to N-1 Hopf bifurcations with preselected oscillation frequencies. The secondary processes occurring in the phase region where the variety of limit cycles appears may be rather complex and difficult to characterize, but they produce the nonlinear mixing of oscillation modes with relatively generic features.
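The class of vector fields described in the last sentences admits a compact notation; the following is our shorthand for "a scalar-valued nonlinear function of a single variable that is a linear combination of the N dynamical variables" and is not taken from the paper:

\[
\dot{\mathbf{x}} = A\,\mathbf{x} + \mathbf{b}\, f(w),
\qquad w = \mathbf{c}^{\top}\mathbf{x},
\qquad \mathbf{x} \in \mathbb{R}^{N},
\]

where \(A\), \(\mathbf{b}\) and \(\mathbf{c}\) are constant and \(f\) is the scalar nonlinearity. The Jacobian at a fixed point \(\mathbf{x}^{*}\) is then \(A + f'(\mathbf{c}^{\top}\mathbf{x}^{*})\,\mathbf{b}\,\mathbf{c}^{\top}\), which is what makes the linear stability analysis mentioned above usable as a design tool for placing the frequencies of the up to N-1 Hopf bifurcations.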
Abstract:
A new series of 5-benzylidene-2-[(pyridine-4-ylmethylene)hydrazono]-thiazolidin-4-ones 4a-l has been synthesized. These compounds were designed by a molecular hybridization approach. 2-[(Pyridine-4-ylmethylene)hydrazono]-thiazolidin-4-ones 3a-d were also obtained and used as intermediates to give the target compounds. The in vitro antimicrobial and cytotoxic activities were evaluated for both series. The intermediate 3b showed considerable antimicrobial activity against B. subtilis and C. albicans. In the cytotoxicity assays, compounds 3b (IC50 = 4.25 ± 0.36 µg/mL) and 4l (IC50 = 1.38 ± 0.04 µg/mL) effectively inhibited the human erythromyeloblastoid leukemia (K-562) and human lung carcinoma (NCI-H292) cell lines, respectively.
Abstract:
This study is a preliminary investigation whose aim was to determine how to create an innovation system for the Kouvola region. In addition, the aim was to assess the current state of the region's innovation system, to examine how the actors can be made to commit and work towards a common goal, and to describe a process for the development work. The Kouvola region has recently undergone numerous changes that also affect innovation activity. Progress has been made, but it must be accelerated so that the region remains a viable place to live and do business in the future. On the basis of the eight expert interviews conducted, it can be concluded that the Kouvola region currently has no innovation system. The region has elements of an innovation system, but activity is fragmented and the actors have no clear picture of their own role as part of the whole. The emergence of innovations cannot be left to chance; creating new innovations requires cooperation and interaction between different companies and organizations. There is no single right and easy way to create an innovation system for the Kouvola region. Harmaakorpi's regional development platform method offers a clear process for the development work, but does Kouvola have the resources and expertise needed to carry the process through? Should one rather start from small things: jointly defined concepts, goals and roles? One solution is to build an innovation programme for the Kouvola region in which these are defined. As broad a range of innovation system actors as possible should be involved in developing the programme, but the development process should remain light. Already while the programme is being developed, various pilot projects and forums are needed through which the region's innovation system can be developed and trust and cooperation gradually built. Through doing, awareness of the activity and its benefits spreads, and a functioning regional innovation system can be created for the Kouvola region.
Abstract:
In my bachelor's thesis I examined corporate social responsibility (CSR) communication in the 2004 annual reports of Finnish listed companies. That study showed that the level of communication varies greatly between companies. The most aware CSR communicators used the three-pillar model to categorize their activities. Clear industry-specific differences were also observable. In the present study I continue with the same topic by examining how the CSR communication of companies listed on the Helsinki Stock Exchange has changed when the 2004 and 2008 annual reports are compared. The research method is qualitative; using discourse analysis, I examine how companies communicate their responsibility. The study shows that CSR communication is still not in every company's interest. About two out of three companies communicate something related to the area of social responsibility, and this number has decreased slightly since 2004. Of these companies, in turn, about two out of three have managed, goal-oriented CSR activity, which shows in high-quality CSR communication. In the annual reports, the recession was visible above all as increased reporting on economic responsibility.
Abstract:
Cellular automata are models for massively parallel computation. A cellular automaton consists of cells which are arranged in some kind of regular lattice and a local update rule which updates the state of each cell according to the states of the cell's neighbors on each step of the computation. This work focuses on reversible one-dimensional cellular automata, in which the cells are arranged in a two-way infinite line and the computation is reversible, that is, the previous states of the cells can be derived from the current ones. In this work it is shown that several properties of reversible one-dimensional cellular automata are algorithmically undecidable, that is, there exists no algorithm that would tell whether a given cellular automaton has the property or not. It is shown that the tiling problem of Wang tiles remains undecidable even in some very restricted special cases. It follows that it is undecidable whether some given states will always appear in computations by the given cellular automaton. It also follows that a weaker form of expansivity, which is a concept of dynamical systems, is an undecidable property for reversible one-dimensional cellular automata. It is shown that several properties of dynamical systems are undecidable for reversible one-dimensional cellular automata. It is shown that sensitivity to initial conditions and topological mixing are undecidable properties. Furthermore, non-sensitive and mixing cellular automata are recursively inseparable. It follows that chaotic behavior is also an undecidable property for reversible one-dimensional cellular automata.
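For readers unfamiliar with the model, the following minimal Python sketch shows what a local update rule on a one-dimensional lattice looks like; the cyclic boundary and the example rule are illustrative assumptions (the thesis works with two-way infinite configurations), not constructions from the thesis:

```python
def ca_step(config, rule):
    """Apply a radius-1 local rule f(left, center, right) to every cell."""
    n = len(config)
    return [rule(config[(i - 1) % n], config[i], config[(i + 1) % n])
            for i in range(n)]

# Example: the left shift, a trivially reversible CA -- its inverse is the
# shift in the opposite direction, so previous states are always recoverable.
shift_left = lambda left, center, right: right

print(ca_step([0, 1, 1, 0, 1], shift_left))  # -> [1, 1, 0, 1, 0]
```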
Abstract:
Humans have used arguments to defend or refute statements long before the creation of logic as a specialized discipline. This can be taken to show that an intuitive notion of "logical consequence", or a psychic disposition to articulate reasoning according to this pattern, is present in common sense, and that logic simply aims at describing and codifying the features of this spontaneous capacity of human reason. It is well known, however, that several arguments easily accepted by common sense are actually "logical fallacies", and this indicates that logic is not just a descriptive but also a prescriptive or normative enterprise, in which the notion of logical consequence is defined in a precise way and certain rules are then established in order to keep discourse in line with this notion. Yet in justifying the correctness and adequacy of these rules, commonsense reasoning must necessarily be used, and in this way its foundational role is recognized. Moreover, it remains true that several branches and forms of logic have been elaborated precisely in order to reflect the structural features of correct arguments used in different fields of human reasoning that are insufficiently mirrored by the most familiar logical formalisms.
Abstract:
The paper supports a dialectical interpretation of Wittgenstein's method focusing on the analysis of the conditions of experience presented in his Philosophical Remarks. By means of a close reading of some key passages dealing with solipsism I will try to lay bare their self-subverting character: the fact that they amount to miniature dialectical exercises offering specific directions to pass from particular pieces of disguised nonsense to corresponding pieces of patent nonsense. Yet, in order to follow those directions one needs to allow oneself to become simultaneously tempted by and suspicious of their all-too-evident "metaphysical tone" - a tone which, as we shall see, is particularly manifest in those claims purporting to state what can or cannot be the case, and, still more particularly, those purporting to state what can or cannot be done in language or thought, thus leading to the view that there are some (determinate) things which are ineffable or unthinkable. I conclude by suggesting that in writing those remarks Wittgenstein was still moved by an ethical project, which gets conspicuously displayed in these reiterations of his attempts to cure the readers (and himself) from some of the temptations expressed by solipsism.