980 results for Almost Common Value Auctions


Relevance:

30.00%

Abstract:

Alveolar echinococcosis (AE), caused by the larval stage of Echinococcus multilocularis, is one of the most lethal parasitic diseases of humans and a major public health problem in many countries of the northern hemisphere. When living conditions and habits in Turkey are considered in relation to the life cycle of the parasite, AE is likely to be much more common than reported, particularly in the Eastern Anatolia region. Since in vitro serological tests with high specificity for AE have not been used in our country, most cases with liver lesions have been misdiagnosed on radiological investigation as malignancies. The aim of this study was to evaluate the diagnostic value of in-house ELISA methods developed with three different antigens (EgHF, Em2, EmII/3-10) for the serological diagnosis of AE. The study samples comprised a total of 100 sera provided by the Bern University Parasitology Institute, obtained from patients with helminthiasis and confirmed by clinical, parasitological and/or histopathological means. Ten samples from each of the cases infected by E. multilocularis, E. granulosus, Taenia solium, Wuchereria bancrofti, Strongyloides stercoralis, Ascaris lumbricoides, Toxocara canis, Trichinella spiralis, Fasciola hepatica and Schistosoma haematobium were studied. EgHF (E. granulosus hydatid fluid) antigens were prepared in our laboratory from the liver cyst fluid of sheep with cystic echinococcosis, whereas Em2 (E. multilocularis metacestode-purified laminated layer) and EmII/3-10 (E. multilocularis recombinant protoscolex tegument) antigens were provided by the Bern University Parasitology Institute. Flat-bottom ELISA plates were coated with EgHF, Em2 and EmII/3-10 antigens at 2.5 µg, 1 µg and 0.18 µg per well, respectively, and all sera were tested by the EgHF-ELISA, Em2-ELISA and EmII/3-10-ELISA methods. For each test, samples reactive above the cut-off value (mean OD of negative controls + 2 SD) were accepted as positive. The sensitivity of the ELISA tests performed with the EgHF, Em2 and EmII/3-10 antigens was estimated as 100%, 90% and 90%, respectively, whereas the specificities were 63%, 91% and 91%. When the Em2-ELISA and EmII/3-10-ELISA tests were evaluated together, the specificity increased to 96%. Our data indicate that the highest sensitivity (100% with EgHF-ELISA) and specificity (96% with Em2-ELISA + EmII/3-10-ELISA) for the serodiagnosis of AE can be achieved by the combined use of ELISA tests with the three different antigens. We conclude that early and accurate diagnosis of AE in our country, which is endemic for this disease, could be supported by the use of highly specific serological tests such as the Em2-ELISA and EmII/3-10-ELISA, complementing radiological findings.
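The cut-off rule and the accuracy figures quoted above follow a standard calculation. A minimal sketch of that procedure in Python, with hypothetical optical-density (OD) readings standing in for the real plate data; function and variable names are illustrative, not taken from the study:

import numpy as np

def elisa_cutoff(negative_od, k=2.0):
    # Cut-off as described in the abstract: mean OD of negative controls + 2 SD.
    return np.mean(negative_od) + k * np.std(negative_od, ddof=1)

def sensitivity_specificity(od_cases, od_non_cases, cutoff):
    # Sensitivity: fraction of true AE sera above the cut-off.
    # Specificity: fraction of non-AE sera at or below the cut-off.
    return np.mean(od_cases > cutoff), np.mean(od_non_cases <= cutoff)

# Hypothetical OD values for one antigen (e.g. Em2)
neg_controls = np.array([0.08, 0.10, 0.09, 0.11, 0.07])
ae_sera = np.array([0.95, 0.80, 1.10, 0.20, 0.75, 0.88, 0.92, 1.05, 0.70, 0.85])
other_sera = np.array([0.05, 0.12, 0.30, 0.09, 0.11, 0.08, 0.10, 0.13, 0.07, 0.09])

cutoff = elisa_cutoff(neg_controls)
sens, spec = sensitivity_specificity(ae_sera, other_sera, cutoff)
print(f"cut-off={cutoff:.3f}  sensitivity={sens:.0%}  specificity={spec:.0%}")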

Relevance:

30.00%

Abstract:

The sustainable use of common-pool resources depends on users' behaviour with regard to appropriation and provision. Most knowledge about behaviour in such situations comes from experimental research. As experiments take place in confined environments, motivational drivers and actions in the field might differ. This paper analyses farmers' use of common property pastures in Grindelwald, Switzerland. Binary logistic regression is applied to survey data to explore the effect of farmers' attributes on livestock endowment, appropriation and provision behaviour. Furthermore, Q methodology is used to assess the impact of broader contextual variables on the sustainability of common property pastures. It is shown that the strongest associations exist between (a) socio-economic attributes and change in livestock endowment; (b) norms and appropriation behaviour; and (c) area and pay-off and provision behaviour. Relevant contextual variables are the economic value of the resource units, off-farm income opportunities, and the subsidy structure. We conclude that with increasing farm size farmers reduce their use and maintenance of common property. Additionally, we postulate that readiness to maintain a resource increases with appropriation activities and the net returns generated from appropriation.
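The core statistical step named above, binary logistic regression of behaviour on farmer attributes, can be sketched as follows; the variables and the tiny dataset are hypothetical, not the Grindelwald survey data:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey data: one row per farmer, illustrative variables only.
df = pd.DataFrame({
    "reduced_provision": [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1],  # 1 = cut back pasture maintenance
    "farm_size_ha":      [32, 14, 38, 41, 16, 12, 29, 26, 20, 35, 30, 18],
    "off_farm_income":   [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0],  # 1 = substantial off-farm income
    "norm_score":        [2, 5, 4, 1, 4, 5, 3, 4, 3, 2, 5, 3],  # agreement with cooperative norms
})

# Binary logistic regression of provision behaviour on farmer attributes.
fit = smf.logit("reduced_provision ~ farm_size_ha + off_farm_income + norm_score", data=df).fit(disp=0)
print(fit.params)          # log-odds coefficients
print(np.exp(fit.params))  # odds ratios, the form in which such effects are usually reported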

Relevance:

30.00%

Abstract:

The protection and sustainable management of alpine summer pastures has been stated as a goal in Swiss national law since 1996, and direct payments from the state for summer pasturing have been tied to sustainability criteria since 2000. This reflects the increasing value of the alpine cultural landscape as a public good. However, provision of this public good remains in the hands of local farmers and their local common pool resource (CPR) institutions for managing alpine pastures. These institutions are increasingly struggling to maintain their institutional arrangements, particularly regarding the work needed to maintain the pastures. This paper examines two cases of local CPR institutions for managing alpine pastures in the Swiss Canton of Grisons that manifest different institutional developments in light of changing conditions. The differences in how these institutions reacted to change and the impacts this has had on the provision of the CPR are explained by focusing on relative prices, bargaining power, and ideology as drivers of institutional change that are often neglected within common property research.
Keywords: summer pasture management, institutional change, bargaining power, ideology

Relevance:

30.00%

Abstract:

All forms of Kaposi sarcoma (KS) are more common in men than in women. It is unknown whether this is due to a higher prevalence of human herpesvirus 8 (HHV-8), the underlying cause of KS, in men compared with women. We did a systematic review and meta-analysis to examine the association between HHV-8 seropositivity and gender in the general population. Studies in selected populations, for example blood donors, hospital patients, and men who have sex with men, were excluded. We searched Medline and Embase from January 1994 to February 2015. We included observational studies that recruited participants from the general population and reported HHV-8 seroprevalence for men and women or boys and girls. We used random-effects meta-analysis to pool odds ratios (OR) of the association between HHV-8 and gender. We used meta-regression to identify effect modifiers, including age, geographical region and type of HHV-8 antibody test. We included 22 studies, with 36,175 participants. Men from sub-Saharan Africa (SSA) (OR 1.21, 95% confidence interval [CI] 1.09-1.34), but not men from elsewhere (OR 0.94, 95% CI 0.83-1.06), were more likely to be HHV-8 seropositive than women (p value for interaction = 0.010). There was no difference in HHV-8 seroprevalence between boys and girls from SSA (OR 0.90, 95% CI 0.72-1.13). The type of HHV-8 assay did not affect the overall results. A higher HHV-8 seroprevalence in men than women in SSA may partially explain why men have a higher KS risk in this region.
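Random-effects pooling of study-level odds ratios, as used above, is commonly done with the DerSimonian-Laird estimator. A self-contained sketch with hypothetical study ORs rather than the 22 included studies:

import numpy as np

def pool_odds_ratios(or_values, ci_low, ci_high):
    # DerSimonian-Laird random-effects pooling of odds ratios.
    # Study-level standard errors are recovered from the 95% CIs on the log scale.
    y = np.log(or_values)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1 / se**2                                   # fixed-effect (inverse-variance) weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)              # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (se**2 + tau2)                       # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1 / np.sum(w_re))
    return np.exp([y_re, y_re - 1.96 * se_re, y_re + 1.96 * se_re])

# Hypothetical study-level ORs (men vs. women HHV-8 seropositivity) with their 95% CIs
pooled, lo, hi = pool_odds_ratios(
    np.array([1.30, 1.10, 1.40, 0.90, 1.25]),
    np.array([1.00, 0.80, 1.10, 0.60, 1.00]),
    np.array([1.70, 1.50, 1.80, 1.30, 1.60]))
print(f"pooled OR {pooled:.2f} (95% CI {lo:.2f}-{hi:.2f})")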

Relevance:

30.00%

Abstract:

This paper examines the role of uncertainty and imperfect local knowledge in foreign direct investment. The main idea comes from the literature on investment under uncertainty, such as Pindyck (1991) and Dixit and Pindyck (1994). We empirically test "the value of waiting" with a dataset on foreign direct investment (FDI). Many factors (e.g., political and economic regulations), as well as uncertainty and the risks due to imperfect local knowledge, determine the attractiveness of FDI. The uncertainty and irreversibility of FDI link the time interval between permission and actual execution of such FDI with explanatory variables, including information on foreign (home) countries and domestic industries. Common factors, such as regulatory change and external shocks, may affect the uncertainty when foreign investors make irreversible FDI decisions. We derive testable hypotheses from models of investment under uncertainty to determine the factors that induce delays in FDI, using Korean data from 1962 to 2001.
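The abstract relates the permission-to-execution delay to country- and industry-level covariates but does not name an estimator; one standard way to model such delays is a proportional-hazards (duration) regression. The sketch below, built on entirely hypothetical records and variable names, only illustrates that setup and is not the paper's specification:

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical FDI records: delay (months) between permission and execution,
# whether execution has been observed, and candidate covariates.
df = pd.DataFrame({
    "delay_months":      [4, 18, 7, 30, 12, 25, 6, 15],
    "executed":          [1, 1, 1, 0, 1, 0, 1, 1],          # 0 = still pending (censored)
    "uncertainty_index": [0.2, 0.8, 0.3, 0.9, 0.7, 0.4, 0.1, 0.6],
    "prior_local_ties":  [1, 0, 1, 0, 0, 1, 0, 1],          # crude proxy for local knowledge
})

cph = CoxPHFitter()
cph.fit(df, duration_col="delay_months", event_col="executed")
cph.print_summary()  # a hazard ratio below 1 for uncertainty_index would mean longer waiting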

Relevance:

30.00%

Abstract:

In the current climate of escalating health care costs, defining value and accurately measuring it are two critical issues affecting not only the future of cancer care in particular but also the future of health care in general. Specifically, measuring and improving value in cancer-related health care are critical for continued advancements in research, management, and overall delivery of care. However, in oncology, most of this research has focused on value as it relates to the insurance industry and payment reform, with little attention paid to value as the output of clinical interventions that encompass integrated clinical teams focusing on the entire cycle of care and measuring objective outcomes that are most relevant to patients. In this study, patient-centered value was defined as health outcomes achieved per dollar spent, and was calculated using objective functional outcomes and total care costs. The analytic sample comprised patients diagnosed with three common head and neck cancers (cancer of the larynx, oral cavity, and oropharynx) who were treated in an integrated tertiary care center over an approximately 10-year period. The results of this study provide initial empirical data that can be used to assess and ultimately to help improve the quality and value of head and neck cancer care; more importantly, they can be used by patients and clinicians to make better-informed decisions about care, particularly about which therapeutic services and outcomes matter most to patients.
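A minimal illustration of the value definition used above (health outcomes achieved per dollar spent); the outcome scores and costs are hypothetical:

# Patient-centered value as defined in the abstract: outcomes achieved per dollar spent.
# All numbers below are hypothetical.
patients = [
    {"id": "A", "functional_outcome": 0.82, "total_cost_usd": 64_000},
    {"id": "B", "functional_outcome": 0.74, "total_cost_usd": 91_500},
    {"id": "C", "functional_outcome": 0.90, "total_cost_usd": 58_200},
]
for p in patients:
    value = p["functional_outcome"] / p["total_cost_usd"]  # outcome units per dollar
    print(f"{p['id']}: value = {value:.2e}")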

Relevance:

30.00%

Abstract:

Seventeen whole-rock samples, generally taken at 25- to 50-meter intervals from 5 to 560 meters sub-basement in Deep Sea Drilling Project Hole 504B, were analyzed for 87Sr/86Sr ratios and rubidium and strontium concentrations. Ten of these samples were also analyzed for Pb-isotope composition. Strontium-isotope ratios for eight samples from the upper 260 meters of the hole range from 0.70287 to 0.70377, with a mean of 0.70320. In the interval 330 to 560 meters, five samples have a restricted range of 0.70259 to 0.70279, with a mean of 0.70266, almost identical to the average value of fresh mid-ocean-ridge basalts. In the interval 260 to 330 meters, approximately intermediate strontium-isotope ratios are found. The higher 87Sr/86Sr ratios in the upper part of the hole can be interpreted in terms of strontium-isotope alteration during basalt-sea-water interaction. Relative to average fresh mid-ocean-ridge basalts, the upper 260 meters of basalts are enriched by an average of about 9% in sea-water strontium (87Sr/86Sr = 0.7091). This Sr presumably is located in the smectites, which, as the main secondary minerals throughout the hole, replace olivine and matrix glass and locally fill vesicles (analyzed samples contained no veins). The strontium-isotope data strongly suggest that the integrated flux of sea water through the upper part of the Hole 504B crust has been greater than through the lower part. This is also suggested by (1) the common occurrence of Fe-oxide-hydroxide minerals as alteration products above 270 meters, but their near absence below 320 meters, (2) the presence of vein calcite above 320 meters, but its near absence below this level, and (3) the occurrence of vein pyrite only below a depth of 270 meters. Sea-water circulation in the lower basalts may have been partly restricted by the greater number of relatively impermeable massive lava flows below 230 meters sub-basement. Although sufficient sea water was present within the lower part of the hole to produce smectitic alteration products, the overall water/rock ratio was low enough to prevent significant modification of strontium-isotope ratios. Lead-isotope ratios of Hole 504B basalts form approximately linear arrays in plots of 208Pb/204Pb and 207Pb/204Pb versus 206Pb/204Pb. The arrays are similar to those reported for basalts from other mid-ocean ridges. There is no trend in Hole 504B lead-isotope ratios with vertical position in the basement. The arrays indicate that the lead-isotope composition of the upper mantle from which the Hole 504B basaltic melts were derived was inhomogeneous.
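The roughly 9% sea-water-strontium enrichment quoted above follows from simple two-component isotope mixing between a fresh-basalt end-member and sea water. A short sketch using the ratios given in the abstract, under the simplifying assumption of equal Sr concentrations in both end-members:

# Two-component Sr-isotope mixing, using values quoted in the abstract.
# Simplified by assuming equal Sr concentrations in both end-members.
R_fresh = 0.70266     # mean of the lower, least-altered interval (close to fresh MORB)
R_seawater = 0.7091   # sea-water strontium
R_upper = 0.70320     # mean of the upper 260 meters of Hole 504B

f_seawater = (R_upper - R_fresh) / (R_seawater - R_fresh)
print(f"sea-water-derived Sr fraction = {f_seawater:.1%}")  # about 8-9%, as stated in the text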

Relevance:

30.00%

Abstract:

Responses by marine species to ocean acidification (OA) have recently been shown to be modulated by external factors including temperature, food supply and salinity. However, the role of a fundamental biological parameter relevant to all organisms, body size, in governing responses to multiple stressors has been almost entirely overlooked. Recent consensus suggests that the allometric scaling of metabolism with body size differs between species, the commonly cited 'universal' mass-scaling exponent (b) of ¾ representing an average of exponents that naturally vary. One model, the Metabolic-Level Boundaries hypothesis, provides a testable prediction: that b will decrease within species under increasing temperature. However, no previous studies have examined how metabolic scaling may be directly affected by OA. We acclimated a wide body-mass range of three common NE Atlantic echinoderms (the sea star Asterias rubens and the brittlestars Ophiothrix fragilis and Amphiura filiformis) to two levels of pCO2 and three temperatures, and metabolic rates were determined using closed-chamber respirometry. The results show that, contrary to some models, these echinoderm species possess a notable degree of stability in metabolic scaling under different abiotic conditions; the mass-scaling exponent (b) varied in value between species, but not within species under different conditions. Additionally, we found no effect of OA on metabolic rates in any species. These data suggest that responses to abiotic stressors are not modulated by body size in these species, as reflected in the stability of the metabolic scaling relationship. Such equivalence in response across ontogenetic size ranges has important implications for the stability of ecological food webs.
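The mass-scaling exponent b discussed above comes from the allometric model rate = a * mass^b, usually estimated as the slope of a log-log regression. A minimal sketch with simulated respirometry data (the true exponent is set to 0.72 purely for illustration):

import numpy as np

def scaling_exponent(mass_g, metabolic_rate):
    # Fit log(rate) = log(a) + b*log(mass); return the mass-scaling exponent b.
    b, log_a = np.polyfit(np.log(mass_g), np.log(metabolic_rate), 1)
    return b

rng = np.random.default_rng(1)
mass = np.array([0.5, 1.2, 3.0, 7.5, 15.0, 40.0, 90.0])            # body mass, g
rate = 0.8 * mass**0.72 * np.exp(rng.normal(0, 0.05, mass.size))   # simulated oxygen uptake
print(f"estimated b = {scaling_exponent(mass, rate):.2f}")          # close to the simulated 0.72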

Relevance:

30.00%

Abstract:

Pollen productivity estimates (PPE) are used to quantitatively reconstruct variations in vegetation within a specific distance of the sampled pollen archive. Here, for the first time, PPEs from Siberia are presented. The study area (Khatanga region, Krasnoyarsk territory, Russia) is located in the Siberian Sub-arctic, where Larix is the sole forest-line-forming tree taxon. Pollen spectra from two different sedimentary environments, namely terrestrial mosses (n=16) and lakes (n=15, median radius ~100 m), and their surrounding vegetation were investigated to extract PPEs. Our results indicate some differences between moss and lake pollen spectra: Larix and Cyperaceae, for example, were better represented in the lacustrine than in the terrestrial moss samples. This highlights that in calibration studies the modern and fossil datasets should be of similar sedimentary origin. The Extended R-Value model was applied to assess the relevant source area of pollen (RSAP) and to calculate the PPEs for both datasets. As expected, the RSAP of the moss samples was very small (about 10 m) compared with the lacustrine samples (about 25 km). Calculation of PPEs for the six most common taxa yielded generally similar results for both datasets. Relative to Poaceae (reference taxon, PPE=1), Betula nana-type (PPEmoss: 1.8, PPElake: 1.8) and Alnus fruticosa-type (PPEmoss: 6.4, PPElake: 2.9) were over-represented, while Cyperaceae (PPEmoss: 0.5, PPElake: 0.1), Ericaceae (PPEmoss: 0.3, PPElake <0.01), Salix (PPEmoss: 0.03, PPElake <0.01) and Larix (PPEmoss <0.01, PPElake: 0.2) were under-represented in the pollen spectra compared with the vegetation in the RSAP. The estimate for the dominant tree in the region, Larix gmelinii, is the first published result for this species, but it needs to be considered very preliminary. The inferred sequence from over- to under-representation is mostly consistent with results from Europe; however, the absolute values still show some differences. Gathering vegetation data was limited by the flowering season, the low resolution of available satellite imagery, and the accessibility of our remote study area. Therefore, our estimates may serve as a first reference to strengthen future vegetation reconstructions in this climate-sensitive region.
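A crude way to see what over- and under-representation means here is to compare each taxon's share of the pollen spectrum with its share of the surrounding vegetation and normalise to the reference taxon (Poaceae, PPE = 1). The sketch below does only that and is a simplification; the study itself uses the Extended R-Value model, and the numbers here are hypothetical:

import pandas as pd

data = pd.DataFrame({
    "pollen_pct":     {"Poaceae": 10, "Betula nana-type": 25, "Cyperaceae": 8, "Larix": 1},
    "vegetation_pct": {"Poaceae": 12, "Betula nana-type": 17, "Cyperaceae": 20, "Larix": 30},
})
r_value = data["pollen_pct"] / data["vegetation_pct"]   # pollen share relative to plant share
relative_ppe = r_value / r_value["Poaceae"]             # normalised to the reference taxon
print(relative_ppe.round(2))  # >1 = over-represented in pollen, <1 = under-represented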

Relevance:

30.00%

Abstract:

Fifty samples of Roman-time soil preserved under the thick ash layer of the A.D. 79 eruption of Mt Vesuvius were studied by pollen analysis: 33 samples from a former vineyard surrounding a Villa Rustica at Boscoreale (excavation site 40 x 50 m), 13 samples taken along the 60 m long swimming pool in the sculpture garden of the Villa of Poppaea at Oplontis, and four samples from the formal garden (12.4 x 17.5 m) of the House of the Gold Bracelet in Pompeii. To avoid contamination with modern pollen, all samples were taken immediately after uncovering a new portion of the A.D. 79 soil. For comparison, samples of modern Italian soils were also studied. Using standard methods for pollen preparation, the pollen content of 15 of the archaeological samples proved too low to reach a pollen sum of more than 100 grains; the pollen spectra of these samples are not shown in the pollen tables. (Flotation with a sodium tungstate solution, Na2WO4, D = 2.05, following treatment with HCl and NaOH, would probably have given a somewhat better result; this method was, however, not available, being too expensive at that time.) Although the archaeological samples were taken only a few meters apart, their pollen values differ greatly from one sample to the other. For example, at Boscoreale (SW quarter) the pollen values of Pinus range from 1.5 to 54.5%, or from 1 to 244 pine pollen grains per gram of soil, with the extremes even found under pine trees. Vitis pollen was present in only 7 of the 11 vineyard samples from Boscoreale (NE quarter). Although a maximum of 21.7% is reached, the values of Vitis are mostly below 1.5%. Even the values of common weeds differ greatly, not only at Boscoreale but also at the other two sites. The pollen concentration values show similar variations: 3 to 3053 grains and spores were found in 1 g of soil. The mean value (290) is much less than the number of pollen grains that would fall on 1 cm2 of soil surface during one year. In contrast, the pollen and spore concentrations of the recent soil samples, treated in exactly the same manner, range from 9313 to almost 80,000 grains per 1 g of soil. Evidently most of the Roman-time pollen has disappeared since its deposition, for reasons that are not clear. Not even species known to have been cultivated in the garden of Oplontis, like Citrus and Nerium, plant species with easily distinguishable pollen grains, could be traced by pollen analysis. The loss of most of the pollen grains originally contained in the soil prohibits any detailed interpretation of the Pompeian pollen data. The pollen counts merely name plant species which grew in the region, but not necessarily on the excavated plots.

Relevance:

30.00%

Abstract:

This paper examines empirically the impact of sharing rules of origin (RoOs) with other ASEAN+1 free trade agreements (FTAs) on ASEAN-Korea FTA/ASEAN-China FTA utilization in Thai exports in 2011. Our empirical analysis suggests that the harmonization of RoOs across FTAs plays some role in reducing the costs created by the spaghetti bowl phenomenon. In particular, harmonization to "change-in-tariff classification (CTC) or real value-added content (RVC)" plays a relatively positive role in that it does not seriously discourage firms' use of multiple FTA schemes. On the other hand, harmonization to CTC or to CTC&RVC hinders firms from using those schemes.

Relevance:

30.00%

Abstract:

The twentieth century brought a new sensibility characterized by the discrediting of Cartesian rationality and the weakening of universal truths related to aesthetic values such as order, proportion and harmony. In the middle of the century, theorists such as Theodor Adorno, Rudolf Arnheim and Anton Ehrenzweig warned about the transformation taking place in the artistic field. Contemporary aesthetics seemed to have a new goal: to deny the idea of art as an organized, finished and coherent structure. Order had lost its privileged position. Disorder, probability, arbitrariness, accidentality, randomness, chaos, fragmentation, indeterminacy... Gradually new terms were coined by aesthetic criticism to explain what had been happening since the beginning of the century. The first essays on the matter sought to provide new interpretative models based on, among other arguments, the phenomenology of perception, the recent discoveries of quantum mechanics, the deeper layers of the psyche, and information theory. Overall, these were worthy attempts to give theoretical content to a situation as obvious as it was devoid of a founding charter. Finally, in 1962, Umberto Eco brought all these efforts together by proposing a single theoretical frame in his book Opera Aperta. From his point of view, all of the aesthetic production of the twentieth century had one characteristic in common: its capacity to express multiplicity. For this reason, he considered that the nature of contemporary art was, above all, ambiguous. The aim of this research is to clarify the consequences of the incorporation of ambiguity into architectural theoretical discourse. We should start by making an accurate analysis of this concept. However, this task is quite difficult, because ambiguity does not allow itself to be clearly defined. This concept has the disadvantage that its signifier is as imprecise as its signified. In addition, the negative connotations that ambiguity still has outside the aesthetic field stigmatize the term and make its use problematic. Another problem with ambiguity is that the contemporary subject is able to locate it in all situations: in addition to recognizing ambiguity in contemporary productions, the subject also recognizes it in works belonging to remote ages and styles. For that reason, it could be said that everything is ambiguous. And that is correct, because in some way ambiguity is present in any creation of the imperfect human being. However, as Eco, Arnheim and Ehrenzweig pointed out, there are two major differences between the current and past contexts. One affects the subject and the other the object. First, it is the contemporary subject, and no other, who has acquired the ability to value and assimilate ambiguity. Secondly, ambiguity was an unexpected aesthetic result in former periods, while in the contemporary object it has been codified and is deliberately present. In any case, as Eco did, we consider the term ambiguity appropriate to refer to the contemporary aesthetic field. Any other term with a more specific meaning would only show partial and limited aspects of a situation that is complex and difficult to diagnose. Contrary to what might normally be expected, in this case ambiguity is the term that fits best, precisely because of its lack of specificity. In fact, this lack of specificity is what allows a dynamic condition to be assigned to the idea of ambiguity that with other terms would hardly be operative.
Thus, instead of trying to define the idea of ambiguity, we will analyze how it has evolved and what its consequences for the architectural discipline have been. Instead of trying to define what it is, we will examine what its presence has meant at each moment. We will deal with ambiguity as a constant presence that has always been latent in architectural production but whose nature has been modified over time. Eco, in the mid-twentieth century, distinguished between classical ambiguity and contemporary ambiguity. Currently, half a century later, the challenge is to discern whether the idea of ambiguity has remained unchanged or has undergone a new transformation. What this research will demonstrate is that it is possible to detect a new transformation that has much to do with the cultural and aesthetic context of recent decades: the transition from modernism to postmodernism. This assumption leads us to establish two different levels of contemporary ambiguity, each related to one of these periods. The first level of ambiguity has been widely known for many years. Its main characteristics are a codified multiplicity, an interpretative freedom and an active subject who brings to completion an object that is incomplete or indefinite. This level of ambiguity is related to the idea of indeterminacy, a concept successfully introduced into contemporary aesthetic language. The second level of ambiguity has gone almost unnoticed by architectural criticism, although it has been identified and studied in other theoretical disciplines. Much of the work of Fredric Jameson and Jean-François Lyotard offers reasonable evidence that the aesthetic production of postmodernism has transcended modern ambiguity to reach a new level in which, despite the existence of multiplicity, the interpretative freedom and the active subject have been questioned and, at last, denied. In this period ambiguity seems to have reached a level at which it is no longer possible to obtain a conclusive and complete interpretation of the object, because it has become an unreadable device. Postmodern production offers a kind of inaccessible multiplicity, and its nature is deeply contradictory. This hypothetical transformation of the idea of ambiguity has an outstanding analogy with the one shown in the poetic analysis made by William Empson, published in 1936 in his Seven Types of Ambiguity. Empson established different levels of ambiguity and classified them according to their poetic effect, in an arrangement that followed an ascending logic towards incoherence. At the seventh level, where ambiguity is greatest, he located the contradiction between irreconcilable opposites. It could be said that contradiction, once it undermines the coherence of the object, was the best way that contemporary aesthetics found to confirm the Hegelian judgment according to which art would ultimately renounce its capacity to express truth. Much of the transformation of architecture over the last century is related to the active involvement of ambiguity in its theoretical discourse. In modern architecture, ambiguity is present after the fact, in the critical review carried out by theoreticians such as Colin Rowe, Manfredo Tafuri and Bruno Zevi. The publication of several studies on Mannerism in the forties and fifties rescued certain virtues of a historical style that had been undervalued because of its deviation from the Renaissance canon. Rowe, Tafuri and Zevi, among others, pointed out the similarities between Mannerism and certain qualities of modern architecture, both devoted to breaking with previous dogmas.
The recovery of Mannerism made it possible to join ambiguity and modernity in the same sentence for the first time. In postmodernism, on the other hand, ambiguity is present ex professo, playing a prominent role in the theoretical discourse of the period. The distance between its analytical identification and its operational use quickly disappeared because of structuralism, an analytical methodology with the aspiration of becoming a modus operandi. Under its influence, architecture began to be identified and studied as a language. Thus, the postmodern theoretical project distinguished between the components of architectural language and developed them separately. Consequently, there is not one but three projects related to postmodern contradiction: a semantic project, a syntactic project and a pragmatic project. Leading these projects are the prominent architects whose work manifested a special interest in exploring and developing the potential of the use of contradiction in architecture. Thus, it was Robert Venturi, Peter Eisenman and Rem Koolhaas who established the main features through which architecture developed the dialectics of ambiguity, at its last and most extreme level, as a theoretical project within each component of architectural language. Robert Venturi developed a new interpretation of architecture based on its semantic component, Peter Eisenman did the same with its syntactic component, and Rem Koolhaas did likewise with its pragmatic component. With this approach, this research aims to establish a new reflection on the architectural transformation from modernity to postmodernity. It can also serve to shed light on certain still unnoticed aspects that have shaped the architectural heritage of recent decades, the consequence of a fruitful relationship between architecture and ambiguity and its provocative consummation in a contradictio in terminis. This research therefore focuses on the repercussions of the incorporation of ambiguity, in the form of contradiction, into postmodern architectural discourse, through each of its three theoretical projects. It is accordingly structured around a main chapter entitled Dialéctica de la ambigüedad como proyecto teórico postmoderno (Dialectics of ambiguity as a postmodern theoretical project), which is divided into three parts: Proyecto semántico. Robert Venturi; Proyecto sintáctico. Peter Eisenman; and Proyecto pragmático. Rem Koolhaas. The central chapter is complemented by two others placed at the beginning. The first, entitled Dialéctica de la ambigüedad contemporánea. Una aproximación, offers a chronological analysis of the evolution of the idea of ambiguity in twentieth-century aesthetic theory, without yet entering into architectural questions. The second, entitled Dialéctica de la ambigüedad como crítica del proyecto moderno, examines the gradual incorporation of ambiguity into the critical revision of modernity, which would prove vital in enabling its later operative introduction into postmodernity. A final chapter, placed at the end of the text, proposes a series of Projections which, in the light of the preceding chapters, attempt a rereading of the current architectural context and its possible evolution, on the understanding that reflection on ambiguity still allows new discursive horizons to be glimpsed. Each double page of the thesis synthesizes the tripartite structure of the central chapter and, broadly speaking, the main methodological tool used in the research.
In this way, the triple semantic, syntactic and pragmatic dimension with which the postmodern theoretical project has been identified is reproduced here in a specific arrangement of images, footnotes and main text. The images accompanying the main text are placed in the left-hand column. Their distribution follows aesthetic and compositional criteria, qualifying, as far as possible, their semantic condition. Next, to their right, are the footnotes. They are arranged in a column, and each note is placed at the same height as its corresponding call in the main text. Their regulated distribution, their value as notation and their possible equation with a deep structure allude to their syntactic condition. Finally, the main body of the text occupies the entire right half of each double page. Conceived as a continuous narrative, with hardly any interruptions, its role in meeting the discursive demands of a doctoral investigation corresponds to its pragmatic condition.

Relevance:

30.00%

Abstract:

The changes perceived towards the end of the twentieth century and at the beginning of the new millennium have shown us that the cultural crisis in which we participate also reflects a crisis of universal models. The difference between our contemporary situation and the typical situations of modern orthodoxy and postmodern fragmentation seems to indicate that it is no longer possible to formulate a valid aesthetic system and assign it a universal and timeless validity beyond its strictly punctual effectiveness; even that is open to question, given the continuous transformations that take place in time and in the sensibility of the subject itself each time it takes over a place. The organized reference that any location offered, delimited, invariable and specific, as a pre-existence, reflected a hierarchy of the formal system based on the extensive: measure, standards, movement, time, modulation, codes and rules. Authors such as Marshall McLuhan, Paul Virilio and Marc Augé anticipated a reality in which the conventional system no longer seemed to respond to the new architectural demands, in which information, speed, disappearance and the virtual had blurred the traditional limits of place; the pre-existing no longer possesses a specific delimitation and, on the contrary, aspires to reach a global scale. Currently, some aspects that remained latent in the constructed emerge under intensive connotations, transgressing simple visual and expressive manifestation in order to focus on the behaviour of matter and energy as determinants of a process of adaptation to the surroundings. Throughout the twentieth century, the development of the relation between the project and the constructed was addressed almost exclusively through actions of preservation or intervention. Both perspectives reflected efforts to articulate a body of thought that would give theoretical consistency as a support for the production of the additive action. Nevertheless, in the last decades of the century, architectural theory came to include thinking from other fields that seems to contaminate the biased vision through which the constructed had been presented to us. Ecology, planning, philosophy, global economy and other fields suggest new approaches to the construction of the contemporary city, but this time with a determined idea of change and continuous transformation, which enriches the panorama of architectural thought and practice while, according to some, putting disciplinary specificity at risk, given that there is no architecture without destruction: the constructed organism requires mutation in order to adjust to a change of shape. All of this previous conceptual framework gathered valuable attempts to give substance to a theory that could be understood from a single argumentative position. Thus, in 1979 Ignasi Solá-Morales brought together all of the imprecise terms that referred to action on existing architecture under the term "intervention", which he argued in two senses. The first refers to any type of action that can be carried out on a building: protection, conservation, reuse, and so on. It is a field in which the sense of intensity remains latent as a common factor in the understanding of a single action. Secondly, and more restrictively, the idea of intervention stands as the critical act with respect to those previous ideas of restoration, conservation and reuse. Both ultimately represent ways of interpreting a new discourse. "An intervention is, in essence, an attempt to make the building say something again, or to say it in a certain direction." In the mid-1980s, motivated by the current of historiographical revision and by concern over the deterioration of historical centres that was sweeping Europe, Solá-Morales set out to reflect on the relationship between an intervention of new architecture and the previously existing architecture, a relationship conditioned strictly by linguistic considerations, in his understanding, in keeping with the architectural production of the whole twentieth century. From Contrast to Analogy would summarize the transformations in the discursive conception of architectural intervention: a changing phenomenon dependent on cultural values, yet showing a clear dialogical tendency between two formal categories. Contrast emphasizes the possibilities of novelty and difference, while the emerging Analogy represents a new sensibility in the interpretation of the old building, in which similarity and diversity are manifested simultaneously. For Solá-Morales the analogical procedure is not based on the visible simultaneity of formal orders, but on associations that the subject establishes over time. Through analogy, the aim is to overcome the simple visual relationship with the antique and to focus on its spatial, physical and geographical nature. If the analogical attempt opens the way towards a new continuity, it still persists in connecting dimensional, typological and figurative factors that remain subordinate to the formal hierarchy of the pre-existing. The reflexive contribution of Solá-Morales' writings might have been definitive if, in the last decades of the century, certain changes had not been perceived in the continuity of the linguistic expression fostered by architecture, tending towards a kind of figurative hypertrophy. Among many arguments, three moments interest us here: the dissolution of compositional consistency and unitary style, the volumetric incorporation of the project as a reactive device, and the shift of vision from the retrospective towards the prospective suggested by the new conservation. The recourse to the history of architecture and its recognizable forms, as a way of perpetuating memory and establishing a reference, dissolved any instinct of compositional unity and style, so that permanent relationships tended to disappear. Composition and coherence gave way to a kind of discontinuity of isolated objects in which only possible relationships could appear; no longer as an order of fixed formal and compositional rules, but as a particular way of arranging elements in a specific work. The new globalized field required new forms of consistency between the project and the pre-existing, motivated among other things by the faster pace of market evolution, rising consumption and the level of information and competition between different locations; aspects that finally rendered stylistic consistency ineffective. In this context of disintegration the project, whether incorporated into or added onto a constructed building, ceases to be considered as a volumetric appendix subordinate to the compositional and formal rules of the old, and comes to be considered as an organism of a reactive order that produces a change in the structural and systemic configuration of the existing support. The extension, previously spatial, is now considered a sensorial and morphological extension, through the implementation of technology and hyper-information, but at the same time marked by a strong tendency towards energy optimization in its operational role, in the face of the emergence of the ecological factor in contemporary production. The technological world becomes a new nature, a nature that should be analysed in ecological terms, in other words, as an event of transition in the continuous redistribution of energy. In this field, effectiveness is determined not only by the capacity to adapt to changing conditions, but also by the capacity to transform those conditions expressly in order to change an environment. In a society like ours, which is modernizing intensively, it is difficult to remain adequately attuned to the forms of the past. Since 1790, the date of the first French convention for the conservation of monuments, the scale of what is to be preserved has become ever more ambitious, so much so that today the repertoire of what is conserved includes practically all typologies of the built environment. For Koolhaas, the interval between the object and the moment at which its conservation is decided has been reduced from two millennia in 1882 to a few decades nowadays. Soon this lapse will disappear, demonstrating a radical change from the retrospective towards the prospective; that is to say, it will shortly be necessary to decide what to conserve before building. The shapes of cities are the result of the continuous incorporation of architecture, and perhaps only through architecture can the response to the universe, the continuity of what has already been constructed, be understood. Our work is also understood within that system, modifying the field of action and leaving the road ready for the next movement of those who will follow after us. Continuity does not mean conservatism; continuity means being conscious of the transitory value of our answers to specific needs and accepting the change we have received. What has been constructed to remain and last should allow future interventions to be integrated into it. It is necessary to accept continuity as a rule. Solá-Morales, in his time, distinguished the relationship between the new and the old in terms of contrast and analogy. Today, almost three decades later, the objective is to evaluate whether that model of architectural intervention in the constructed has been maintained since then, or whether new ways of positioning the project with respect to the constructed have appeared. Our work aims to show that the design approach to the pre-existing has changed, and that this change is closely related to the incorporation of new concepts, techniques, tools and needs that mark the cultural context produced by the turn of the century. This assumption leads us to establish an architectural parallel between the modes of relation in which the new manifests itself: a commonly assumed (topical), generic and orthodox position, grounded in the visual and expressive concerns of the last decades of the twentieth century, and an emerging (heterotopical), extraordinary and heterodox reality that stimulates the immaterial and seems to emerge with growing intensity in the twenty-first century. If throughout the twentieth century the project of architectural intervention was debated between the continuity and discontinuity of formal categories marked by the expression of the pre-existing building, the new contemporary intervention, as a reactive device in the landscape and the territory, demands an absolute continuity, no longer visual, expressive or functional, but a physiological continuity of adaptation and change with the territory's own dynamics, under new rules of the game and deploying operative (projective) plans and strategies from its own logic and contingency. The aim of this research is to determine the new modes of continuity and the possible logics of production that manifest themselves within architectural intervention, trying to move beyond the merely apparent physical and visual relationship, as a result of the incorporation of the operative factor deployed by the new contemporary device. We believe it is right to maintain the connotative path marked by the term architectural intervention, since it brings together previous concepts and theoretical approaches that have evolved over time. Although the term has lacked a broader operative reach since its formulation, a quality suggested by our contemporary logics could be the reformulation and consolidation of a concept of intervention better suited to our times, giving precedence to a logical procedure arising from its own necessity and contingency. It seems that now time shapes the topics; it is no longer a matter of materializing a certain time but of expressing the changes that its new temporality generates. Finally, our initial approach aspires to constitute a new form of reflection that allows us to understand the complex implications that the new architecture imposes on the pre-existing, motivated by the incorporation of factors external to the simple formal and expressive judgement that prevailed at the end of the twentieth century. In the same way, the path we propose, as an alternative, allows possible lines of prospection to be projected, by considering the pre-existing as a field that encompasses the whole territory, with emerging dynamics of change and, with them, their own logics of intervention.

Relevance:

30.00%

Abstract:

The Nakagami-m distribution is widely used for the simulation of fading channels in wireless communications. A novel, simple and extremely efficient acceptance-rejection algorithm is introduced for the generation of independent Nakagami-m random variables. The proposed method uses another Nakagami density with a half-integer value of the fading parameter, m_p = n/2 ≤ m, as the proposal function, from which samples can be drawn exactly and easily. This novel rejection technique is able to work with arbitrary values of m ≥ 1 and average path energy V, and provides a higher acceptance rate than all currently available methods. SUMMARY: An extremely efficient method for generating Nakagami random variables (used to model fading in mobile communication channels), based on rejection sampling.
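A minimal sketch of the half-integer-proposal rejection idea described above, assuming the standard Nakagami-m density with spread parameter omega; the bound and acceptance test are derived here for illustration, and the published algorithm's details may differ:

import numpy as np

def nakagami_m(m, omega, size, rng=None):
    # Acceptance-rejection sampler for Nakagami-m variates (m >= 1, omega > 0).
    # Proposal: a Nakagami density with half-integer fading parameter m_p = floor(2m)/2 <= m,
    # which can be drawn exactly through a chi-square variable.
    rng = np.random.default_rng() if rng is None else rng
    m_p = np.floor(2 * m) / 2      # half-integer proposal parameter
    d = m - m_p                    # 0 <= d < 0.5
    out = np.empty(size)
    filled = 0
    while filled < size:
        n = size - filled
        # Exact proposal draw: X^2 ~ Gamma(m_p, omega/m_p) = (omega / (2*m_p)) * chi2(2*m_p)
        x = np.sqrt(omega / (2 * m_p) * rng.chisquare(2 * m_p, n))
        # The target/proposal ratio is maximised at x^2 = omega, giving this acceptance probability.
        t = x**2 / omega
        accepted = x[rng.random(n) <= (t * np.exp(1.0 - t)) ** d]
        out[filled:filled + accepted.size] = accepted
        filled += accepted.size
    return out

samples = nakagami_m(m=2.3, omega=1.0, size=100_000)
print(samples.mean(), (samples**2).mean())  # E[X^2] should be close to omega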

Relevance:

30.00%

Abstract:

This paper contributes a unified formulation that merges previous analyses of the prediction of the performance (value function) of a given sequence of actions (policy) when an agent operates a Markov decision process with a large state space. When the states are represented by features and the value function is linearly approximated, our analysis reveals a new relationship between two common cost functions used to obtain the optimal approximation. In addition, this analysis allows us to propose an efficient adaptive algorithm that provides an unbiased linear estimate. The performance of the proposed algorithm is illustrated by simulation, showing competitive results when compared with state-of-the-art solutions.
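As a reference point for the setting described above (linear value-function approximation for a fixed policy), the sketch below runs plain TD(0) with linear features on a small synthetic Markov chain and compares the result with the exact policy values; it is illustrative only and does not reproduce the paper's adaptive algorithm:

import numpy as np

rng = np.random.default_rng(0)
n_states, n_features, gamma, alpha = 6, 3, 0.95, 0.05

phi = rng.normal(size=(n_states, n_features))          # state features
P = rng.dirichlet(np.ones(n_states), size=n_states)    # transition matrix under the fixed policy
r = rng.normal(size=n_states)                          # expected reward per state

w = np.zeros(n_features)                               # linear weights: V(s) ~ phi[s] @ w
s = 0
for _ in range(200_000):
    s_next = rng.choice(n_states, p=P[s])
    td_error = r[s] + gamma * phi[s_next] @ w - phi[s] @ w
    w += alpha * td_error * phi[s]                     # stochastic semi-gradient TD(0) update
    s = s_next

v_exact = np.linalg.solve(np.eye(n_states) - gamma * P, r)  # exact value of the policy
print(np.round(phi @ w, 2))   # linear approximation
print(np.round(v_exact, 2))   # exact values, for comparison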