986 results for Post-impressionism (Art)


Relevance:

30.00%

Publisher:

Abstract:

Mr. Michl posed the question of how the institutional framework that the former communist regime set up around art production contributed to the success of Czech applied arts. In his theoretical review of the question he discussed the reasons for the lack of success of socialist industrial design as opposed to what he terms pre-industrial arts (such as art glass), and also for the current lack of interest in the art institutions of the past regime. His findings in the second, historical section of his work were based largely on interviews with artists and other insiders, as an initial attempt to use questionnaires was unsuccessful. His original assumption that the institutional framework was imposed on artists against their will proved mistaken: it turned out to have been proposed by the artists themselves. The basic blueprint for communist art institutions was the Memorandum document published on behalf of Czechoslovak visual artists in March 1947, i.e. before the communist coup of February 1948. Thus, while the communist state provided a beneficial institutional framework for artists' work, it was the artists themselves who designed this framework. Mr. Michl concludes that the text of the Memorandum appealed to the general left-wing and anti-market sentiments of the immediate post-war period; through this appeal, and by later working through the administrative channels of the new state, the artists succeeded in gaining all of their demands over the following 15 years. The one exception was artistic freedom, although they did come to enjoy it, if only by default and for a short time, during the ideological thaw of the 1960s. Mr. Michl also examined the art-related legislative framework in detail and looked at the main features of key art institutions in the field, such as the Czech Fund for Visual Arts and the 1960s art export enterprise Art Centrum, which opened the door to foreign markets for artists.

Relevance:

30.00%

Publisher:

Abstract:

We present an algorithm for estimating dense image correspondences. Our versatile approach lends itself to various tasks typical of video post-processing, including image morphing, optical flow estimation, stereo rectification, disparity/depth reconstruction, and baseline adjustment. We incorporate recent advances in feature matching, energy minimization, stereo vision, and data clustering into our approach. At the core of our correspondence estimation we use Efficient Belief Propagation for energy minimization. While state-of-the-art algorithms only work on thumbnail-sized images, our novel feature downsampling scheme, in combination with a simple yet efficient data term compression, can cope with high-resolution data. The incorporation of SIFT (Scale-Invariant Feature Transform) features into the data term computation further resolves matching ambiguities, making long-range correspondence estimation possible. We detect occluded areas by evaluating the correspondence symmetry, and we further apply Geodesic matting to automatically determine plausible values in these regions.
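
The correspondence-symmetry test mentioned in this abstract is commonly implemented as a forward-backward consistency check between the two flow directions. The following is a minimal NumPy sketch of that generic idea, not the authors' implementation; the field layout, the pixel tolerance, and the helper name are illustrative assumptions.

```python
import numpy as np

def occlusion_mask(flow_fw, flow_bw, tol=1.0):
    """Flag pixels whose forward and backward correspondences disagree.

    flow_fw, flow_bw : (H, W, 2) displacement fields, image A -> B and B -> A.
    tol              : allowed round-trip displacement error in pixels.
    Returns a boolean (H, W) mask, True where a pixel is likely occluded.
    """
    h, w, _ = flow_fw.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # Follow the forward flow, then read the backward flow at the target pixel
    # (nearest-neighbour lookup keeps the sketch short).
    xt = np.clip(np.round(xs + flow_fw[..., 0]).astype(int), 0, w - 1)
    yt = np.clip(np.round(ys + flow_fw[..., 1]).astype(int), 0, h - 1)
    back = flow_bw[yt, xt]

    # For mutually visible pixels the round trip should (nearly) cancel out.
    err = np.hypot(flow_fw[..., 0] + back[..., 0],
                   flow_fw[..., 1] + back[..., 1])
    return err > tol
```

In practice the backward field would be sampled with sub-pixel interpolation and the masked regions would then be filled, as the authors do, with a matting or inpainting step.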

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to investigate whether acute myocardial infarction can be detected by post-mortem cardiac magnetic resonance (PMMR) at an earlier stage than by traditional autopsy, i.e., within less than 4 h after onset of ischemia, and, if so, to determine the characteristics of PMMR findings in early acute infarcts. Twenty-one ex vivo porcine hearts with acute myocardial infarction underwent T2-weighted cardiac PMMR imaging within 3 h of onset of iatrogenic ischemia. PMMR imaging findings were compared to macroscopic findings. Myocardial edema induced by ischemia and reperfusion was visible on PMMR in all cases. The typical finding of early acute ischemic injury on PMMR was a central zone of intermediate signal intensity bordered by a rim of increased signal intensity. Myocardial edema can be detected on cardiac PMMR within the first 3 h after the onset of ischemia in porcine hearts. The size of the myocardial edema reflects the area of ischemic injury in early acute (peracute) myocardial infarction. This study provides evidence that cardiac PMMR is able to detect acute myocardial infarcts at an earlier stage than traditional autopsy and routine histology.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Neoadjuvant chemotherapy is an accepted standard of care for locally advanced esophagogastric cancer. As only a subgroup benefits, a response-based tailored treatment would be of interest. The aim of our study was to evaluate the prognostic and predictive value of clinical response in esophagogastric adenocarcinomas. METHODS Clinical response, based on a combination of endoscopy and computed tomography (CT), was evaluated retrospectively within a prospective database in center A and then transferred to center B. A total of 686/740 (A) and 184/210 (B) patients, staged cT3/4 cN0/1, underwent neoadjuvant chemotherapy and were then re-staged by endoscopy and CT before undergoing tumor resection. Of the 184 patients in B, 118 additionally had an interim response assessment 4-6 weeks after the start of chemotherapy. RESULTS In A, 479 patients (70 %) were classified as clinical nonresponders and 207 (30 %) as responders. Median survival was 38 months (nonresponders: 27 months, responders: 108 months; log-rank, p < 0.001). Clinical and histopathological response correlated significantly (p < 0.001). In multivariate analysis, clinical response was an independent prognostic factor (HR for death 1.4, 95 % CI 1.0-1.8, p = 0.032). In B, 140 patients (76 %) were nonresponders and 44 (24 %) responded. Median survival was 33 months (nonresponders: 27 months, responders: not reached; p = 0.003). Interim clinical response evaluation (118 patients) also had prognostic impact (p = 0.008). Interim, preoperative clinical response and histopathological response correlated strongly (p < 0.001). CONCLUSION Preoperative clinical response was an independent prognostic factor in center A, while in center B its prognostic value could only be confirmed in univariate analysis. The concordance with histopathological response was good in both centers, and interim clinical response evaluation showed results comparable to the preoperative evaluation.
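
The survival comparisons reported above (log-rank test between response groups, Cox regression for the multivariate hazard ratio) follow standard survival-analysis methodology. The sketch below shows how such an analysis is typically run with the lifelines package; the data frame, column names, and values are hypothetical and are not taken from the study.

```python
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

# Hypothetical per-patient data: survival time (months), death event flag,
# and clinical response (1 = responder, 0 = nonresponder).
df = pd.DataFrame({
    "months":    [27, 38, 108, 15, 60, 33, 12, 90, 24, 45],
    "death":     [1,  1,  0,   1,  1,  0,  1,  0,  1,  1],
    "responder": [0,  1,  1,   0,  1,  0,  0,  1,  0,  1],
})

# Log-rank test between responders and nonresponders.
resp = df[df.responder == 1]
nonr = df[df.responder == 0]
result = logrank_test(resp.months, nonr.months,
                      event_observed_A=resp.death,
                      event_observed_B=nonr.death)
print(result.p_value)

# Cox proportional-hazards model with response as covariate; the study
# reported an HR for death of 1.4 for nonresponse in center A.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="death")
cph.print_summary()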

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE To investigate frequent findings in cases of fatal opioid intoxication on whole-body post-mortem computed tomography (PMCT). METHODS PMCT scans of 55 cases in which heroin and/or methadone had been found responsible for death were retrospectively evaluated (study group) and compared with PMCT images of an age- and sex-matched control group. Imaging results were compared with conventional autopsy. RESULTS The most common findings in the study group were pulmonary oedema (95 %), aspiration (66 %), distended urinary bladder (42 %), cerebral oedema (49 %), pulmonary emphysema (38 %) and fatty liver disease (36 %). These PMCT findings occurred significantly more often in the study group than in the control group (p < 0.05). The combination of lung oedema, brain oedema and distended urinary bladder was seen in 26 % of the cases in the study group but never in the control group (0 %). This triad, as an indicator of opioid-related death, had a specificity of 100 %, as confirmed by autopsy and toxicological analysis. CONCLUSIONS Frequent findings in cases of fatal opioid intoxication were demonstrated. The triad of brain oedema, lung oedema and a distended urinary bladder on PMCT was highly specific for drug-associated deaths. KEY POINTS Frequent findings in cases of fatal opioid intoxication were investigated. Lung oedema, brain oedema and a full urinary bladder represent a highly specific constellation. This combination of findings on post-mortem CT should raise suspicion of intoxication.
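
The 100 % specificity of the triad follows directly from its absence in the matched control group. A short worked calculation illustrates the arithmetic; the control-group size of 55 is an assumption (the abstract only states that the controls were age- and sex-matched).

```python
# Hypothetical 2x2 summary for the triad (lung oedema + brain oedema +
# distended urinary bladder), assuming a control group of the same size (55).
triad_in_study   = round(0.26 * 55)   # ~14 of 55 opioid deaths showed the triad
triad_in_control = 0                  # the triad was never seen in controls

sensitivity = triad_in_study / 55            # ~0.26 -> 26 %
specificity = (55 - triad_in_control) / 55   # 1.00  -> 100 %
print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
```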

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to test the effects of a sustained nystagmus on the head impulse response of the vestibulo-ocular reflex (VOR) in healthy subjects. VOR gain (slow-phase eye velocity/head velocity) was measured using video head impulse test goggles. Acting as a surrogate for a spontaneous nystagmus (SN), a post-rotatory nystagmus (PRN) was elicited after a sustained, constant-velocity rotation, and then head impulses were applied. 'Raw' VOR gain, uncorrected for PRN, in healthy subjects in response to head impulses with peak velocities in the range of 150°/s-250°/s was significantly increased (as reflected in an increase in the slope of the gain versus head velocity relationship) after inducing PRN with high-intensity slow phases (>30°/s) in the same direction as, but not opposite to, the slow-phase response induced by the head impulses. The values of VOR gain themselves, however, remained in the normal range when the slow-phase velocity of PRN was below 30°/s. Finally, quick phases of PRN were suppressed during the first 20-160 ms of a head impulse; the time frame of suppression depended on the direction of PRN but not on the duration of the head impulse. Our results in normal subjects suggest that VOR gains measured using head impulses may have to be corrected for any superimposed SN when the slow-phase velocity of the nystagmus is relatively high and the peak velocity of the head movements is relatively low. The suppression of quick phases during head impulses may help to improve steady fixation during rapid head movements.
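
VOR gain as defined above is simply slow-phase eye velocity divided by head velocity. The sketch below illustrates that ratio and one simple way a superimposed nystagmus slow phase could be subtracted out before forming the ratio; the subtraction-based correction and the function names are illustrative assumptions, not the paper's method.

```python
def vor_gain(slow_phase_eye_vel, head_vel):
    """Raw VOR gain: slow-phase eye velocity / head velocity (magnitudes, deg/s)."""
    return slow_phase_eye_vel / head_vel

def vor_gain_sn_corrected(slow_phase_eye_vel, head_vel, sn_vel):
    """Illustrative correction: subtract the superimposed nystagmus slow-phase
    velocity (positive if in the same direction as the VOR slow phase)."""
    return (slow_phase_eye_vel - sn_vel) / head_vel

# A 200 deg/s head impulse with a 35 deg/s PRN slow phase in the same direction
# as the VOR slow phase inflates the raw gain from ~1.0 to ~1.18.
print(vor_gain(235.0, 200.0))                     # 1.175
print(vor_gain_sn_corrected(235.0, 200.0, 35.0))  # 1.0
```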

Relevance:

30.00%

Publisher:

Abstract:

Encountering a cognitive conflict not only slows current performance, but it can also affect subsequent performance, in particular when the conflict is induced with bivalent stimuli (i.e., stimuli with relevant features for two different tasks) or with incongruent trials (i.e., stimuli with relevant features for two response alternatives). The post-conflict slowing following bivalent stimuli, called “bivalency effect”, affects all subsequent stimuli, irrespective of whether the subsequent stimuli share relevant features with the conflict stimuli. To date, it is unknown whether the conflict induced by incongruent stimuli results in a similar post-conflict slowing. To investigate this, we performed six experiments in which participants switched between two tasks. In one task, incongruent stimuli appeared occasionally; in the other task, stimuli shared no feature with the incongruent trials. The results showed an initial performance slowing that affected all tasks after incongruent trials. On further trials, however, the slowing only affected the task sharing features with the conflict stimuli. Therefore, the post-conflict slowing following incongruent stimuli is first general and then becomes conflict-specific across trials. These findings are discussed within current task switching and cognitive control accounts.

Relevance:

30.00%

Publisher:

Abstract:

The article analyzes the philosophy of art put forward by Arthur Danto, while also seeking to problematize the principle of aesthetic representation in its dominant form within the tradition of philosophical thinking about art. While acknowledging Danto's valuable inquiries into the "intractable avant-gardes" of the twentieth century and into the metamorphoses in the structure of the art world, as well as into the very philosophy that investigates works of art, the present analysis asks, in turn, about the problematic consequences that Danto's thought opens up, many of which remain under discussion and in theoretical dispute.

Relevance:

30.00%

Publisher:

Abstract:

The stratigraphy and pollen analysis of the deposits show that this is a lake basin which during the Late-glacial period was partially filled by lake clays and muds. One of the main interests of the pollen diagrams lies in the division of zone I into three sub-zones showing a minor climatic oscillation which seems to be comparable with the Bølling oscillation of northern Europe. During Post-glacial time the greater part of the deposits has been mud, but on one side a fen developed which in early zone VI was sufficiently dry to support birch and pine wood. Later in zone VI the water table must have risen slightly, because the fen peats were gradually covered by a rather oxidized mud, suggesting that the fen was replaced by a shallow swamp with a widely fluctuating water table. In the Atlantic period the basin was reflooded and the more central deposits were covered by a layer of mud. Later, in the central region, swamp and eventually Sphagnum bog communities developed. The whole area is now covered by a silty soil and forms a flat meadowland.

Relevance:

30.00%

Publisher:

Abstract:

One important task in the design of an antenna is to carry out an analysis to find the characteristics of the antenna that best fulfil the specifications fixed by the application. After that, a prototype is manufactured, and the next stage in the design process is to check whether the radiation pattern differs from the designed one. Besides the radiation pattern, other radiation parameters such as directivity, gain, impedance, beamwidth, efficiency and polarization must also be evaluated. For this purpose, accurate antenna measurement techniques are needed in order to know exactly the actual electromagnetic behavior of the antenna under test. For this reason, most measurements are performed in anechoic chambers: closed, normally shielded areas covered with electromagnetic absorbing material that simulate free-space propagation conditions. Moreover, these facilities can be used independently of the weather conditions and allow measurements free from interference. Despite all the advantages of anechoic chambers, the results obtained from both far-field and near-field measurements are inevitably affected by errors. Thus, the main objective of this Thesis is to propose algorithms to improve the quality of the results obtained in antenna measurements by using post-processing techniques, without requiring additional measurements. First, a thorough review of the state of the art was carried out in order to give a general view of the possibilities for characterizing or reducing the effects of errors in antenna measurements. Then, new methods to reduce the unwanted effects of four of the most common errors in antenna measurements are described and validated both theoretically and numerically. The basis of all of them is the same: to perform a transformation from the measurement surface to another domain where there is enough information to easily remove the contribution of the errors. The four errors analyzed are noise, reflections, truncation errors and leakage, and the tools used to suppress them are mainly source reconstruction techniques, spatial and modal filtering, and iterative algorithms to extrapolate functions. Therefore, the main idea of all the methods is to modify the classical near-field-to-far-field transformations by including additional steps with which errors can be greatly suppressed. Moreover, the proposed methods are not computationally complex and, because they are applied in post-processing, additional measurements are not required. Noise is the most widely studied error in this Thesis; a total of three alternatives are proposed to filter out an important noise contribution before obtaining the far-field pattern. The first one is based on modal filtering (a minimal code sketch of this idea follows the abstract). The second alternative uses a source reconstruction technique to obtain the extreme near-field, where a spatial filtering can be applied. The last one back-propagates the measured field to a surface with the same geometry as the measurement surface but closer to the AUT and then also applies a spatial filtering. All the alternatives are analyzed in the three most common near-field systems, including comprehensive statistical noise analyses in order to deduce the signal-to-noise ratio improvement achieved in each case.
The method to suppress reflections in antenna measurements is also based on a source reconstruction technique; the main idea is to reconstruct the field over a surface larger than the antenna aperture in order to identify and later suppress the virtual sources related to the reflected waves. The truncation error present in the results obtained from planar, cylindrical and partial spherical near-field measurements is the third error analyzed in this Thesis. The method to reduce this error is based on an iterative algorithm that extrapolates the reliable region of the far-field pattern from the knowledge of the field distribution on the AUT plane. The proper termination point of this iterative algorithm, as well as other critical aspects of the method, are also studied. The last part of this work is dedicated to the detection and suppression of the two most common leakage sources in antenna measurements. A first method estimates the leakage bias constant added by the receiver's quadrature detector to every near-field sample and then suppresses its effect on the far-field pattern. The second method can be divided into two parts: the first finds the position of the faulty component that radiates or receives unwanted radiation, making its identification within the measurement environment and its later substitution easier; the second is able to computationally remove the leakage effect without requiring the substitution of the faulty component.
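
As a concrete illustration of the modal-filtering idea described for noise suppression, the sketch below applies a generic plane-wave-spectrum filter to a planar near-field scan, discarding evanescent (non-propagating) modes via a 2-D FFT. This is a textbook-style simplification under stated assumptions (regular sampling, NumPy, a hypothetical helper name), not the filter or the noise statistics developed in the thesis.

```python
import numpy as np

def modal_noise_filter(e_near, dx, dy, wavelength):
    """Filter a planar near-field scan by zeroing evanescent (invisible) modes.

    e_near : 2D complex array of a tangential near-field component sampled on a
             regular x/y grid with spacings dx, dy (metres).
    Returns the filtered near-field on the same grid.
    """
    k0 = 2 * np.pi / wavelength                   # free-space wavenumber
    ny, nx = e_near.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)     # spectral grid (rad/m)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)

    pws = np.fft.fft2(e_near)                     # plane-wave spectrum
    visible = KX**2 + KY**2 <= k0**2              # propagating modes only
    pws_filtered = np.where(visible, pws, 0.0)    # modal filter: drop the rest

    return np.fft.ifft2(pws_filtered)
```

Because measurement noise spreads over the whole spectrum while the antenna's radiated field is confined to (roughly) the visible region, discarding the invisible modes removes part of the noise power without touching the propagating field, which is the intuition behind the modal-filtering alternative described above.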

Relevance:

30.00%

Publisher:

Abstract:

Correlation between pH and meat color during the post-mortem phase in pigs, differentiating between PSE and DFD meat

Relevance:

30.00%

Publisher:

Abstract:

A rhetorical approach to the fiction of war offers an appropriate vehicle by which one may encounter and interrogate such literature and the cultural metanarratives that exist therein. My project is a critical analysis of three war novels published in 2012 (The Yellow Birds by Kevin Powers, FOBBIT by David Abrams, and Billy Lynn's Long Halftime Walk by Ben Fountain), one that relies heavily upon Kenneth Burke's dramatistic method and his concepts of scapegoating, the comic corrective, and hierarchical psychosis. This analysis assumes a rhetorical screen in order to subvert and redirect the grand narratives the United States perpetuates in art form whenever it goes to war. Kenneth Burke's concept of ad bellum purificandum (the purification of war) sought to bridge the gap between war experience and the discourse that it creates in both art and criticism. My work extends that project. I examine the symbolic incongruity of convenient symbols that migrate from war to war ("Geronimo" was used as code for Osama bin Laden's death during the SEAL team raid; "Indian Country" stands for any dangerous land in Iraq; hajji is this generation's epithet for the enemy other). Such an examination can weaken our cultural "symbol mongering," to borrow a phrase from Walker Percy. These three books, examined according to Burke's methodology, exhibit a wide range of approaches to the soldier's tale. Notably, however, whether they refigure the grand narratives of modern culture or recast the common redemptive war narrative into more complex representations, this examination shows how one can grasp, contend with, and transcend the metanarrative of the typical, redemptive war story.

Relevance:

30.00%

Publisher:

Abstract:

This paper analyzes post-pornographic practices, an activist and theoretical movement that recognizes pornography as valuable in understanding the social, cultural, and political systems that construct and reflect identity, through the work of American artist Marilyn Minter. The analysis contextualizes post-pornography and concludes with an examination of several of Minter's recent paintings and photographs through a post-pornographic lens to assert that these works explore sexuality and gender by incorporating aesthetic and ideological references to porn and by invoking the post-pornographic tenets of collaboration, disruption of public space, and the inversion of heteronormativity. Creating art with Wangechi Mutu, displaying high-definition videos of lips slurping green goo in Times Square, and painting men garbed in lingerie are some of Minter's endeavors, which re-envision pornographic relationships to authorship and agency, public versus private space, and the expression or repression of fantasy.