Abstract:
Ribonucleotides have shown many promising applications in the food and pharmaceutical industries. The aim of the present study was to produce ribonucleotides (RNA) with Kluyveromyces marxianus ATCC 8554, utilizing cheese whey, a dairy industry waste, as the main substrate under batch fermentation conditions. The effects of temperature, pH, aeration rate, agitation and initial cell concentration on RNA production, biomass production and lactose consumption were studied simultaneously through a factorial design. The maximum RNA production (28.66 mg/g of dry biomass) was observed at a temperature of 30 °C, pH 5.0 and an initial cell concentration of 1 g/l after 2 h of fermentation. Agitation and aeration rate did not influence RNA concentration (p > 0.05). Maximum lactose consumption (98.7%) and biomass production (6.0 g/l) were observed after 12 h of incubation. This study proves that cheese whey can be used as an adequate medium for RNA production by K. marxianus at industrial scale under the optimized conditions.
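The factorial-design analysis mentioned above estimates each factor's main effect by contrasting the average response at its high and low coded levels; a minimal sketch with hypothetical coded levels and yields (not the study's data):

```python
# Sketch of main-effect estimation in a 2^2 factorial design.
# The factor levels and RNA yields below are hypothetical, not the study's data.

def main_effect(levels, responses):
    """Average response at the high level minus average at the low level."""
    hi = [r for l, r in zip(levels, responses) if l == +1]
    lo = [r for l, r in zip(levels, responses) if l == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# Coded levels for temperature (-1: low, +1: high) and pH (-1: low, +1: high),
# with hypothetical RNA yields (mg/g of dry biomass) for the four runs.
temp = [-1, +1, -1, +1]
ph   = [-1, -1, +1, +1]
rna  = [20.1, 25.3, 22.4, 28.7]

effect_temp = main_effect(temp, rna)  # ≈ 5.75
effect_ph   = main_effect(ph, rna)    # ≈ 2.85
```

Effects estimated this way can then be screened for significance (the abstract's p > 0.05 criterion for agitation and aeration).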
Abstract:
Context. HD140283 is a nearby (V = 7.7) subgiant metal-poor star, extensively analysed in the literature. Although many spectra have been obtained for this star, none showed a signal-to-noise (S/N) ratio high enough to enable a very accurate derivation of abundances from weak lines. Aims. The detection of europium proves that the neutron-capture elements in this star originate in the r-process, and not in the s-process, as recently claimed in the literature. Methods. Based on an OSMARCS 1D LTE atmospheric model and with a consistent approach based on the spectrum synthesis code Turbospectrum, we measured the europium lines at 4129 Å and 4205 Å, taking into account the hyperfine structure of the transitions. The spectrum, obtained with a long exposure time of seven hours at the Canada-France-Hawaii Telescope (CFHT), has a resolving power of 81 000 and a S/N ratio of 800 at 4100 Å. Results. We were able to determine the abundance A(Eu) =
Abstract:
Temperature-dependent transient curves of excited levels of a model Eu3+ complex have been measured for the first time. A coincidence between the temperature-dependent rise time of the 5D0 emitting level and the decay time of the 5D1 excited level in the [Eu(tta)3(H2O)2] complex has been found, which unambiguously proves the T1→5D1→5D0 sensitization pathway. A theoretical approach to the temperature-dependent energy transfer rates has been successfully applied to the rationalization of the experimental data.
Abstract:
[ES] By adopting a male pseudonym, some women writers of the nineteenth century and the first half of the twentieth were able to enter a field controlled entirely by men: literature. Replacing their legal name with a fictitious one allowed them to win the respect of a large share of readers who were still reluctant to judge favourably works written by a woman, regarded as frivolous, mawkish and inconsequential. The need to wear a mask in order to achieve such goals demonstrates, however, that the women authors of the time remained to some extent subject to the dictates of patriarchal society, whose prejudices they had to accept if they wanted their texts to be published and taken seriously. With the male pseudonym (occasionally combined with physical cross-dressing, as the case of George Sand exemplifies), these writers forged de-centred, ambiguous, androgynous images of themselves. At the same time, the strategy of changing the author's gender presupposed the recognition of a "masculine condition" inherent in writing.
Abstract:
A new methodology is being devised for ensemble ocean forecasting using distributions of the surface wind field derived from a Bayesian Hierarchical Model (BHM). The ocean members are forced with samples from the posterior distribution of the wind during the assimilation of satellite and in-situ ocean data. The initial condition perturbations are then consistent with the best available knowledge of the ocean state at the beginning of the forecast and amplify the ocean response to uncertainty only in the forcing. The ECMWF Ensemble Prediction System (EPS) surface winds are also used to generate a reference ocean ensemble to evaluate the performance of the BHM method, which proves to be effective in concentrating the forecast uncertainty at the ocean meso-scale. An eight-month experiment of weekly BHM ensemble forecasts was performed in the framework of the operational Mediterranean Forecasting System. The statistical properties of the ensemble are compared with model errors throughout the seasonal cycle, proving the existence of a strong relationship between forecast uncertainties due to atmospheric forcing and the seasonal cycle.
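The forcing-ensemble idea can be caricatured in a few lines: wind members are drawn from an assumed Gaussian posterior, and each one drives a toy ocean response, so the ensemble spread reflects uncertainty in the forcing alone. The response model and every number below are illustrative assumptions, not the BHM or the Mediterranean Forecasting System:

```python
# Minimal sketch of forcing-ensemble generation: wind samples drawn from
# an assumed Gaussian posterior perturb a toy linear ocean response.
# The response model and all numbers are illustrative assumptions.
import random

def wind_samples(mean, std, n, seed=0):
    """Draw n wind members from an assumed Gaussian posterior."""
    rng = random.Random(seed)
    return [rng.gauss(mean, std) for _ in range(n)]

def ocean_response(wind, gain=0.1, state0=1.0, steps=24):
    """Toy linear response: the state relaxes toward gain * wind."""
    state = state0
    for _ in range(steps):
        state += 0.2 * (gain * wind - state)
    return state

# Every member starts from the same analysis; only the forcing differs.
members = [ocean_response(w) for w in wind_samples(8.0, 1.5, 20)]
spread = max(members) - min(members)  # ensemble spread from forcing alone
```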
Abstract:
This thesis investigates two aspects of Constraint Handling Rules (CHR): it proposes a compositional semantics and a technique for program transformation. CHR is a concurrent committed-choice constraint logic programming language consisting of guarded rules, which transform multi-sets of atomic formulas (constraints) into simpler ones until exhaustion [Frü06], and it belongs to the family of declarative languages. It was initially designed for writing constraint solvers but has recently also proven to be a general-purpose language, being Turing equivalent [SSD05a]. Compositionality is the first CHR aspect to be considered. A trace-based compositional semantics for CHR was previously defined in [DGM05]. The reference operational semantics for that compositional model was the original operational semantics for CHR, which, due to the propagation rule, admits trivial non-termination. In this thesis we extend the work of [DGM05] by introducing a more refined trace-based compositional semantics which also includes the history. The use of a history is a well-known technique in CHR which permits us to trace the application of propagation rules and consequently to avoid trivial non-termination [Abd97, DSGdlBH04]. Naturally, the reference operational semantics of our new compositional one also uses a history to avoid trivial non-termination. Program transformation is the second CHR aspect to be considered, with particular regard to the unfolding technique. This technique is an appealing approach which allows us to optimize a given program, and in particular to improve its run-time efficiency or space consumption. Essentially it consists of a sequence of syntactic program manipulations which preserve a kind of semantic equivalence, called qualified answer [Frü98], between the original program and the transformed ones. Unfolding is one of the basic operations used by most program transformation systems.
It consists of the replacement of a procedure call by its definition. In CHR every conjunction of constraints can be considered a procedure call, every CHR rule can be considered a procedure, and the body of that rule represents the definition of the call. While there is a large body of literature on the transformation and unfolding of sequential programs, very few papers have addressed this issue for concurrent languages. We define an unfolding rule, show its correctness and discuss some conditions under which it can be used to delete an unfolded rule while preserving the meaning of the original program. Finally, confluence and termination maintenance between the original and transformed programs are shown. This thesis is organized in the following manner. Chapter 1 gives some general notions about CHR. Section 1.1 outlines the history of programming languages with particular attention to CHR and related languages. Then, Section 1.2 introduces CHR using examples. Section 1.3 gives some preliminaries which will be used throughout the thesis. Subsequently, Section 1.4 introduces the syntax and the operational and declarative semantics of the first CHR language proposed. Finally, the methodologies to solve the problem of trivial non-termination related to propagation rules are discussed in Section 1.5. Chapter 2 introduces a compositional semantics for CHR in which the propagation rules are considered. In particular, Section 2.1 contains the definition of the semantics. Then, Section 2.2 presents the compositionality results. Afterwards, Section 2.3 expounds upon the correctness results. Chapter 3 presents a particular program transformation known as unfolding. This transformation needs a particular annotated syntax, which is introduced in Section 3.1, and its related modified operational semantics ω′t is presented in Section 3.2. Subsequently, Section 3.3 defines the unfolding rule and proves its correctness.
Then, in Section 3.4 the problems related to the replacement of a rule by its unfolded version are discussed, and this in turn gives a correctness condition which holds for a specific class of rules. Section 3.5 proves that confluence and termination are preserved by the program modifications introduced. Finally, Chapter 4 concludes by discussing related work and directions for future work.
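The unfolding operation at the heart of the thesis, replacing a body constraint by the body of the rule that defines it, can be illustrated with a deliberately simplified toy: ground, guard-free, single-headed rules over lists of constraints, with none of CHR's matching, guards or propagation machinery:

```python
# Toy illustration of unfolding for ground, guard-free simplification
# rules, written (head, body) over lists of constraints.
# This is a sketch of the idea only, far simpler than CHR proper.

def unfold(rule, defs):
    """Replace the first body constraint that is defined by a
    single-headed rule in `defs` with that rule's body."""
    head, body = rule
    for i, c in enumerate(body):
        for dhead, dbody in defs:
            if dhead == [c]:  # single-headed definition for this call
                return (head, body[:i] + dbody + body[i + 1:])
    return rule  # nothing to unfold

# `pair` acts as a procedure call defined in terms of `step` and `check`.
defs = [(["pair"], ["step", "check"])]
rule = (["solve"], ["init", "pair"])
print(unfold(rule, defs))  # (['solve'], ['init', 'step', 'check'])
```

In full CHR the operation must additionally respect guards, multiple heads and the qualified-answer equivalence discussed in the thesis; the sketch only shows the call-by-definition replacement.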
Abstract:
The sustained demand for faster, more powerful chips has been met by the availability of chip manufacturing processes allowing for the integration of increasing numbers of computation units onto a single die. The resulting outcome, especially in the embedded domain, has often been called a System-on-Chip (SoC) or Multi-Processor System-on-Chip (MPSoC). MPSoC design brings to the foreground a large number of challenges, one of the most prominent of which is the design of the chip interconnection. With the number of on-chip blocks presently ranging in the tens, and quickly approaching the hundreds, the novel issue of how best to provide on-chip communication resources is clearly felt. Networks-on-Chip (NoCs) are the most comprehensive and scalable answer to this design concern. By bringing large-scale networking concepts to the on-chip domain, they guarantee a structured answer to present and future communication requirements. The point-to-point connection and packet switching paradigms they involve are also of great help in minimizing wiring overhead and physical routing issues. However, as with any technology of recent inception, NoC design is still an evolving discipline. Several main areas of interest require deep investigation for NoCs to become viable solutions: • The design of the NoC architecture needs to strike the best tradeoff among performance, features and the tight area and power constraints of the on-chip domain. • Simulation and verification infrastructure must be put in place to explore, validate and optimize NoC performance. • NoCs offer a huge design space, thanks to their extreme customizability in terms of topology and architectural parameters. Design tools are needed to prune this space and pick the best solutions. • Even more so given their global, distributed nature, it is essential to evaluate the physical implementation of NoCs, to assess their suitability for next-generation designs and their area and power costs.
This dissertation focuses on all of the above points, by describing a NoC architectural implementation called ×pipes; a NoC simulation environment within a cycle-accurate MPSoC emulator called MPARM; and a NoC design flow consisting of a front-end tool for optimal NoC instantiation, called SunFloor, and a set of back-end facilities for the study of NoC physical implementations. This dissertation proves the viability of NoCs for current and upcoming designs, by outlining their advantages (along with a few tradeoffs) and by providing a full NoC implementation framework. It also presents some examples of additional extensions of NoCs, allowing e.g. for increased fault tolerance, and outlines where NoCs may find further application scenarios, such as in stacked chips.
Abstract:
[EN] Osseous remains of numerous shearwaters, and eggshells of the single, often undamaged egg laid, appear in the sand dunes of the Jandía Peninsula, and more specifically in "Hueso del Caballo", which was one of their breeding sites. The bones, also visible in the sandy walls of a quarry, were uncovered by present-day aeolian erosion and are found, in a layer with Hymenoptera nests and terrestrial mollusc shells, in a dune dated by radiocarbon as being more than 30,000 years old. This fact proves that there was a halt in the aeolian processes during a humid interval, probably related to the African Aterian pluvial in the Upper Pleistocene, and a fixation of the dunes by vegetation.
Abstract:
The development of the digital electronics market is founded on the continuous reduction of transistor size, to reduce area, power and cost and to increase the computational performance of integrated circuits. This trend, known as technology scaling, is approaching the nanometer scale. The lithographic process in the manufacturing stage is becoming more uncertain with the scaling down of transistor size, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between the leakage current and the threshold voltage is limiting the scaling of the threshold and supply voltages, increasing the power density and creating local thermal issues, such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, is speeding up wear-out processes. These effects are no longer addressable only at the process level. Consequently, deep sub-micron devices will require solutions involving several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and systems able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) the implementation of new analysis algorithms able to predict the system's thermal behavior and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system.
iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of devices by acting on some tunable parameter, such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network-on-Chip (NoC) devices. I developed a thermal analysis library, which has been integrated in a cycle-accurate NoC simulator and in an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates the need to integrate thermal analysis into the first design stages of embedded NoC design. Later on, I focused my research on the development of a statistical process-variation analysis tool able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor. The results confirmed the capability of self-timed logic to increase manufacturability and reliability.
Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we discovered the superior robustness of low-swing links to systematic process variation, together with their good response to compensation techniques such as ASV and ABB. Hence low-swing signalling is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work proves the advantage of integrating a statistical process-variation analysis tool into the first stages of the design flow.
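The kind of statistical process-variation analysis advocated above can be sketched as a Monte Carlo experiment: threshold-voltage samples feed the alpha-power delay model, and timing yield is the fraction of samples meeting a delay budget. All parameters here are illustrative assumptions, not the thesis's tool or data:

```python
# Sketch of Monte Carlo timing-yield analysis under threshold-voltage
# variation, using the alpha-power delay model. Nominal Vth, sigma,
# alpha and the timing budget are all illustrative assumptions.
import random

def gate_delay(vdd, vth, k=1.0, alpha=1.3):
    """Alpha-power-law gate delay (arbitrary units)."""
    return k * vdd / (vdd - vth) ** alpha

def timing_yield(vdd, vth_mean, vth_sigma, budget, n=10000, seed=1):
    """Fraction of sampled dies whose delay meets the budget."""
    rng = random.Random(seed)
    ok = sum(
        gate_delay(vdd, rng.gauss(vth_mean, vth_sigma)) <= budget
        for _ in range(n)
    )
    return ok / n

# Raising the supply voltage (as ASV compensation does) improves yield.
y_low  = timing_yield(1.0, 0.35, 0.03, budget=2.0)
y_high = timing_yield(1.1, 0.35, 0.03, budget=2.0)
```

The same skeleton extends to systematic variation by adding a spatially correlated component to the sampled parameters.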
Abstract:
The aim of this PhD thesis is to study accurately and in depth the figure and the literary production of the intellectual Jacopo Aconcio. This minor author of the 16th century has long been considered a sort of "enigmatic character", a profile which results from the work of those who, for many centuries, have left his writing to its fate: a story of constant re-readings and equally incessant oversights. This is why it is necessary to re-read Aconcio's production in its entirety and to devote a monographic study to it. Previous scholars' interpretations will obviously be considered, but at the same time an effort will be made to go beyond them through the analysis of both published and manuscript sources, in the attempt to attain a deeper understanding of the figure of this man, who was a Christian, a military and hydraulic engineer, and a political philosopher. The title of the thesis was chosen to emphasise how, throughout the three years of the doctorate, my research concentrated in equal measure and with the same degree of importance on all the reflections and activities of Jacopo Aconcio. My object, in fact, was to establish how and to what extent the methodological thinking of the intellectual found application in, and at the same time guided, his theoretical and practical production. I did not mention in the title the author's religious thinking, which has always been considered by everyone the most original and interesting element of his production, because religion, from the Reformation onwards, was primarily a political question and thus it was treated by almost all the authors involved in the Protestant movement, Aconcio in the first place. Even the remarks concerning the private, intimate sphere of faith have therefore been analysed in this light: only by acknowledging the centrality of the "problem of politics" in Aconcio's theories, in fact, is it possible to interpret them correctly.
This approach proves the truth of the theoretical premise to my research, that is to say the unity and orderliness of the author's thought: in every field of knowledge, Aconcio applies the rules of the methodus resolutiva as a means to achieve knowledge and elaborate models of pacific cohabitation in society. Aconcio's continuous references to method can make his writing pedantic and rather complex, but at the same time they allow for a consistent and valid analysis of different disciplines. I have not considered it a limit that most of his reflections appear to our eyes as strongly conditioned by the time in which he lived. To see in him, as some have done, the forerunner of Descartes' methodological discourse or, conversely, to judge his religious theories as not very modern, is to force the thought of an author who was first and foremost a Christian man of his own time. Aconcio repeats this himself several times in his writings: he wants to provide individuals with the necessary tools to reach a full-fledged scientific knowledge in the various fields, and also to enable them to seek truth incessantly in the religious domain, which is the duty of every human being. The will to find rules, instruments and effective solutions characterizes the whole of the author's corpus: Aconcio feels he must look for truth in all the arts, aware as he is that anything can become science as long as it is analysed with method. Nevertheless, he remains a man of his own time, a Christian convinced of the existence of God, creator and governor of the world, to whom people must account for their own actions. To neglect this fact in order to construct a "character", a generic forerunner, but not participant, of whatever philosophical current, is a dangerous and sidetracking operation.
In this study, I have highlighted how Aconcio’s arguments only reveal their full meaning when read in the context in which they were born, without depriving them of their originality but also without charging them with meanings they do not possess. Through a historical-doctrinal approach, I have tried to analyse the complex web of theories and events which constitute the substratum of Aconcio’s reflection, in order to trace the correct relations between texts and contexts. The thesis is therefore organised in six chapters, dedicated respectively to Aconcio’s biography, to the methodological question, to the author’s engineering activity, to his historical knowledge and to his religious thinking, followed by a last section concerning his fortune throughout the centuries. The above-mentioned complexity is determined by the special historical moment in which the author lived. On the one hand, thanks to the new union between science and technique, the 16th century produces discoveries and inventions which make available a previously unthinkable number of notions and lead to a “revolution” in the way of studying and teaching the different subjects, which, by producing a new form of intellectual, involved in politics but also aware of scientific-technological issues, will contribute to the subsequent birth of modern science. On the other, the 16th century is ravaged by religious conflicts, which shatter the unity of the Christian world and generate theological-political disputes which will inform the history of European states for many decades. My aim is to show how Aconcio’s multifarious activity is the conscious fruit of this historical and religious situation, as well as the attempt of an answer to the request of a new kind of engagement on the intellectual’s behalf. 
Plunged into the discussions around methodus, employed in the most important European courts, involved in the abrupt acceleration of technical-scientific activities, and especially concerned by the radical religious reformation brought on by the Protestant movement, Jacopo Aconcio reflects this complex conjunction in his writings, without lacking in order and consistency, contrary to what many scholars assume. The object of this work, therefore, is to highlight the unity of the author's thought, in which science, technique, faith and politics are woven into a combination which, although it may appear illogical and confused, is actually tidy and methodical, and therefore in agreement with Aconcio's own intentions and with the specific characters of European culture in the Renaissance. This theory is confirmed by the reading of the Ars muniendorum oppidorum, Aconcio's only work which until now had been unavailable. I am persuaded that only a methodical reading of Aconcio's works, without forgetting or glorifying any single one, respects the author's will. From De methodo (1558) onwards, all his writings are summae, guides for the reader who wishes to approach the study of the various disciplines. Undoubtedly, Satan's Stratagems (1565) is something more, not only because of its length, but because it deals with the author's main interest: the celebration of doubt and debate as bases on which to build religious tolerance, which is the best method for pacific cohabitation in society. This, however, does not justify the total centrality which the Stratagems have enjoyed for centuries, at the expense of a proper understanding of the author's will to offer examples of methodological rigour in all sciences. Maybe it is precisely because of the reforming power of Aconcio's thought that, albeit often forgotten throughout the centuries, he has never ceased to reappear and continues to draw attention, both as a man and as an author.
His ideas never stop stimulating the reader’s curiosity and this may ultimately be the best demonstration of their worth, independently from the historical moment in which they come back to the surface.
Abstract:
The Székesfehérvár Ruin Garden is a unique assemblage of monuments belonging to the cultural heritage of Hungary due to its important role in the Middle Ages as the coronation and burial church of the Kings of the Hungarian Christian Kingdom. It has been nominated as a "National Monument" and, as a consequence, its protection in the present and future is required. Moreover, it was reconstructed and expanded several times throughout Hungarian history. A quick overview of the current state of the monument reveals the presence of several lithotypes among the remaining building and decorative stones. Therefore, research related to the materials is crucial not only for the conservation of this specific monument but also for other historic structures in Central Europe. The current research is divided into three main parts: i) description of the lithologies and their provenance, ii) testing of the physical properties of the historic material and iii) durability tests of analogous stones obtained from active quarries. The survey of the National Monument of Székesfehérvár focuses on the historical importance and the architecture of the monument, the different construction periods, and the identification of the different building stones and their distribution in the remaining parts of the monument; it also includes provenance analyses. The second part comprises the in situ and laboratory testing of the physical properties of the historic material. In the final phase, samples were taken from local quarries with physical and mineralogical characteristics similar to those of the stones used in the monument. The three studied lithologies are a fine oolitic limestone, a coarse oolitic limestone and a red compact limestone. These stones were used for rock mechanical and durability tests under laboratory conditions.
The following techniques were used: a) in situ: Schmidt hammer values, moisture content measurements, DRMS, and mapping (construction ages, lithotypes, weathering forms); b) laboratory: petrographic analysis, XRD, determination of real density by means of a helium pycnometer and of bulk density by means of a mercury pycnometer, pore size distribution by mercury intrusion porosimetry and by nitrogen adsorption, water absorption, determination of open porosity, DRMS, frost resistance, ultrasonic pulse velocity test, uniaxial compressive strength test and dynamic modulus of elasticity. The results show that initial uniaxial compressive strength is not necessarily a clear indicator of stone durability. Bedding and other lithological heterogeneities can influence the strength and durability of individual specimens. In addition, long-term behaviour is influenced by exposure conditions, fabric and, especially, the pore size distribution of each sample. Therefore, a statistical evaluation of the results is highly recommended, and they should be evaluated in combination with other investigations of the internal structure and micro-scale heterogeneities of the material, such as petrographic observation, ultrasonic pulse velocity and porosimetry. Laboratory tests used to estimate the durability of natural stone may give good guidance as to its short-term performance, but they should not be taken as an ultimate indication of the long-term behaviour of the stone. The interdisciplinary study of the results confirms that the stones in the monument show deterioration in terms of mineralogy, fabric and physical properties in comparison with the quarried stones. Moreover, stone testing proves the compatibility between quarried and historical stones. Good correlation is observed between the non-destructive techniques and the laboratory test results, which allows sampling to be minimized when assessing the condition of the materials.
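The real- and bulk-density determinations listed above yield total porosity through a standard relation; a minimal sketch with illustrative densities for a porous oolitic limestone (not measured data):

```python
# Total porosity from real (skeletal) and bulk density, as obtained with
# helium and mercury pycnometry. The densities below are illustrative
# values for a porous oolitic limestone, not measured data.

def total_porosity(bulk_density, real_density):
    """Total porosity in %, from rho_bulk and rho_real (same units)."""
    return (1.0 - bulk_density / real_density) * 100.0

phi = total_porosity(bulk_density=1.80, real_density=2.70)  # ≈ 33.3 %
```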
Concluding, this research can contribute to the diagnostic knowledge for further studies that are needed in order to evaluate the effect of recent and future protective measures.
Abstract:
The emergence of infection by highly pathogenic avian influenza virus (HPAI) subtype H5N1 has focused the attention of the world scientific community, requiring the prompt provision of effective control systems for the early detection of the circulation of low pathogenic H5 influenza viruses (LPAI) in populations of wild birds, to prevent outbreaks of highly pathogenic virus (HPAI) in populations of domestic birds with possible transmission to humans. The project stems from the aim of providing, through a preliminary analysis of surveillance data from Italy and Europe, a study of virus detection rates and the development of mathematical models, an objective assessment of the effectiveness of avian influenza surveillance systems in wild bird populations, as well as guidelines to support the planning of sampling activities. The results obtained from the statistical processing quantify the sampling effort in terms of the time and sample size required and, by simulating different epidemiological scenarios, identify active surveillance as the most suitable approach for monitoring endemic LPAI infection in wild waterfowl, and passive surveillance as the only really effective tool for the early detection of HPAI H5N1 circulation in wild populations. Given the lack of relevant information on H5N1 epidemiology, and the actual financial and logistic constraints, an approach that makes use of statistical tools to evaluate and predict the effectiveness of monitoring activities proves to be of primary importance to direct decision-making and make the best use of available resources.
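The sample-size side of such planning is commonly based on the standard detection formula, which gives the number of animals to test so that at least one positive is found with the desired confidence at an assumed prevalence (large-population approximation, perfect test sensitivity; the prevalence figures below are illustrative):

```python
# Sketch of the standard surveillance sample-size calculation:
# the number of birds to test so that at least one positive is found
# with the desired confidence, assuming a given prevalence and a large
# population with a perfectly sensitive test. Figures are illustrative.
import math

def detection_sample_size(prevalence, confidence=0.95):
    """n such that 1 - (1 - prevalence)**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

# e.g. detecting a 1 % LPAI prevalence with 95 % confidence
n = detection_sample_size(0.01)  # 299 birds
```

Lower assumed prevalence drives the required sample size up sharply, which is why passive surveillance becomes the only practical option for rare HPAI events.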
Abstract:
The convergence of information technology and consumer electronics towards battery-powered portable devices has increased the interest in high-efficiency, low-dissipation amplifiers. Class D amplifiers are the state of the art in low power consumption and high-performance amplification. In this thesis we explore the possibility of exploiting nonlinearities introduced by PWM modulation, by designing an optimized modulation law which scales its carrier frequency adaptively with the input signal's average power while preserving the SNR, thus reducing power consumption. This is achieved by means of a novel analytical model of the PWM output spectrum, which shows how interfering harmonics and their bandwidth affect the spectrum. This allows for frequency scaling with negligible aliasing between the baseband spectrum and its harmonics. We performed low-noise power spectrum measurements on PWM modulations generated by comparing variable-bandwidth random test signals with a variable-frequency triangular-wave carrier. The experimental results show that power-optimized frequency scaling is both feasible and effective. The new analytical model also suggests a new PWM architecture that can be applied to digitally encoded input signals, which are predistorted and compared with a cosine carrier accurately synthesized by a digital oscillator. This approach has been simulated in a realistic noisy model and tested in our measurement setup. A zero-crossing search on the obtained PWM modulation law proves that this approach yields signal quality equivalent to that of traditional PWM schemes, while entailing the use of signals with remarkably smaller bandwidth, due to the use of a cosine instead of a triangular carrier.
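The adaptive carrier-frequency idea can be caricatured as natural-sampling PWM against a triangular carrier whose frequency tracks the input's short-term average power. The linear scaling law and all constants below are illustrative assumptions, not the thesis's optimized modulation law:

```python
# Sketch of power-adaptive PWM: the triangular-carrier frequency is
# scaled with the input's average power, so quiet signals are modulated
# with a slower (cheaper) carrier. The linear scaling law and the
# frequency range are illustrative assumptions.

def triangle(t, freq):
    """Triangular carrier in [-1, 1], peaking at integer periods."""
    phase = (t * freq) % 1.0
    return 4.0 * abs(phase - 0.5) - 1.0

def carrier_freq(avg_power, f_min=50e3, f_max=400e3):
    """More input power (normalized to [0, 1]) -> faster carrier."""
    return f_min + (f_max - f_min) * min(avg_power, 1.0)

def pwm_sample(x, t, avg_power):
    """Natural-sampling comparator: 1 while input exceeds the carrier."""
    return 1 if x >= triangle(t, carrier_freq(avg_power)) else 0

# A quiet signal gets a slow carrier, a loud one a fast carrier.
assert carrier_freq(0.01) < carrier_freq(0.81)
```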
Resumo:
The relationship between the genetic polymorphism of populations and environmental variability: an application of fitness-set theory. The Quantitative Fitness-Set Model (QFM) is an extension of fitness-set theory. The QFM can represent gradations between coarse-grained and fine-grained regular fluctuations of two environments. Environment- and species-specific parameters, as well as the resulting grain, are quantifiable. Experimental data can be analysed, and the QFM proves to be very accurate in large populations, which is supported by the discrete parameter space. Small populations and/or high genetic diversity lead to estimation inaccuracies, which are also to be expected in natural populations. A population-size-dependent uncertainty value extends the point estimate of a parameter set to an interval estimate. In finite populations these intervals act as fitness bands. This leads to the hypothesis that species living in dense, continuous fitness bands evolve generalists, while species in discrete fitness bands evolve specialists. Asynchronous reproductive strategies lead to the preservation of genetic diversity. A switch from coarse-grained to fine-grained environmental variation favours the specialized genotypes. From this point of attack for disruptive selection, the hypothesis of speciation in transition scenarios from coarse-grained to fine-grained environmental variation can be formulated. In the reverse case, a loss of diversity and stabilizing selection are to be expected. This thus provides a process-oriented explanation for the species richness of the (fine-grained) tropics compared with the species-poorer (coarse-grained) temperate zones, which are subject to seasonal fluctuations.
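The grain distinction at the heart of the model can be illustrated with the textbook contrast between arithmetic-mean fitness (fine-grained variation, averaged within each individual's lifetime) and geometric-mean fitness (coarse-grained variation, compounded across generations); the genotypes and fitness values below are invented for illustration and are not taken from the thesis:

```python
# Hypothetical per-environment fitnesses (illustrative only): a
# specialist does well in "its" environment and poorly in the other,
# while a generalist is intermediate in both.
fitness = {
    "specialist": {"env1": 1.7, "env2": 0.4},
    "generalist": {"env1": 1.0, "env2": 1.0},
}

def fine_grained(w, p=0.5):
    """Fine-grained variation: each individual experiences both
    environments, so fitness is the arithmetic mean."""
    return p * w["env1"] + (1 - p) * w["env2"]

def coarse_grained(w, p=0.5):
    """Coarse-grained variation: whole generations face one environment
    at a time, so long-run growth follows the geometric mean."""
    return w["env1"] ** p * w["env2"] ** (1 - p)
```

With these numbers the specialist wins under fine-grained variation while the generalist wins under coarse-grained variation, matching the hypothesis above that a switch from coarse to fine grain favours specialized genotypes.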
Resumo:
My project explores and compares different forms of gender performance in contemporary art and visual culture from a perspective centered on photography. Thanks to its attesting power, this medium can work as a ready-made: during the 20th century it played a key role in the cultural emancipation of the body, which (to use Michel Foucault's expression) has now become «the zero point of the world». Through performance the body proves to be a living material of expression and communication, while photography ensures the recording of any ephemeral event that happens in time and space. My questioning approach considers the constructed gender imagery from the 1990s to the present in order to investigate how photography's strong aura of realism promotes and allows fantasies of transformation. The contemporary fascination with gender (especially in art and fashion) represents a crucial issue in the global context of postmodernity and is manifested in a variety of visual media, from photography to video and film. Moreover, the internet, along with its digital transmission of images, has deeply affected our world (from culture to everyday life), leading to a postmodern preference for performativity over the more traditional and linear forms of narrativity. As a consequence, individual borders are redefined by the skin itself, which (dissected through instant vision) turns into a ductile material of mutation and hybridization in the service of identity. My critical assumptions are drawn from the most relevant changes that occurred in philosophy during the last two decades as a result of the contributions of Jacques Lacan, Michel Foucault, Jacques Derrida and Gilles Deleuze, who developed a cross-disciplinary and comparative approach to interpret the crisis of modernity. They have profoundly influenced feminist studies, so that the category of gender has been reassessed in contrast with sex (as a biological connotation) and in relation to history, culture and society.
The ideal starting point of my research is the year 1990. I chose it as the approximate historical moment when the intersections of race, class and gender were placed at the forefront of international artistic production concerned with identity, diversity and globalization. Such issues had been explored throughout the 1970s, but it was only from the mid-1980s onward that they began to be articulated more consistently. Published in 1990, Judith Butler's book "Gender Trouble: Feminism and the Subversion of Identity" marked an important breakthrough by linking gender to performance and by investigating the intricate connections between theory and practice, embodiment and representation. It inspired subsequent research in a variety of disciplines, art history included. In the same year Teresa de Lauretis introduced the term queer theory to challenge the academic perspective in gay and lesbian studies. In the meantime, the rise of Third Wave Feminism in the US introduced a racially and sexually inclusive vision of the global situation in order to reflect on subjectivity, new technologies and popular culture in connection with gender representation. These conceptual tools have enabled prolific readings of contemporary cultural production, whether fine arts or mass media. After discussing the appropriate framework of my project and taking into account the postmodern globalization of the visual, I have turned to photography to map gender representation both in art and in fashion, creating an archive of images around specific topics. I decided to include fashion photography because in the 1990s this genre moved away from the paradigm of an idealized, classical beauty toward a new vernacular allied with lifestyles, art practices, pop and youth culture; as one might expect, the dominant narrative modes in fashion photography are now mainly influenced by cinema and the snapshot.
These strategies generate story lines and interrupted narratives that use the models' performance to convey a particular imagery in which identity issues emerge as an essential part of the fashion spectacle. Focusing on the intersections of gender identities with socially and culturally produced identities, my approach intends to underline how the fashion world has turned to current trends in art photography and, in some cases, to the artists themselves. The growing fluidity of the categories that distinguish art from fashion photography represents a particularly fruitful moment of visual exchange. Varying over time, the dialogue between these two fields has always been vital; nowadays it can be studied as a result of the close relationship between the contemporary art world and consumer culture. Owing to the saturation of postmodern imagery, the feedback between art and fashion has become much more immediate and thus increasingly significant for anyone who wants to investigate the construction of gender identity through performance. In addition, many magazines founded in the 1990s bridged the worlds of art and fashion because some of their designers and even editors were art-school graduates who encouraged innovation. The inclusion of art within such magazines aimed at validating them as a form of art in themselves, supporting a dynamic intersection of music, fashion, design and youth culture: an intersection that also contributed to creating and spreading different gender stereotypes. This general interest in fashion produced many exhibitions of and about fashion itself at major international venues such as the Victoria and Albert Museum in London, the Metropolitan Museum of Art and the Solomon R. Guggenheim Museum in New York. Since then, this celebrated success of fashion has been regarded as a typical element of postmodern culture.
Accordingly, I have also based my analysis on some important exhibitions dealing with gender performance, such as "Féminin-Masculin" at the Centre Pompidou in Paris (1995), "Rrose is a Rrose is a Rrose. Gender Performance in Photography" at the Solomon R. Guggenheim Museum in New York (1997), "Global Feminisms" at the Brooklyn Museum (2007) and "Female Trouble" at the Pinakothek der Moderne in München, together with the workshops dedicated to "Performance: gender and identity" held in June 2005 at the Tate Modern in London. Since 2003, Italy has had Gender Bender, an international festival held annually in Bologna, which explores the gender imagery stemming from contemporary culture. In a few days, this festival offers a series of events ranging from visual arts, performance, cinema and literature to conferences and music. Aware that no method of research is either race or gender neutral, I have traced these critical paths to question gender identity from a multicultural perspective, taking the political implications into account as well. In fact, if visibility may be equated with exposure, we can also read these images as points of intersection of visibility with social power. Since gender assignations rely so heavily on the visual, the postmodern dismantling of gender certainty through performance has wide-ranging effects that need to be analyzed. In some sense this practice can even contest the dominance of the visual within postmodernism. My visual map of contemporary art and fashion photography includes artists such as Nan Goldin, Cindy Sherman, Hellen van Meene, Rineke Dijkstra, Ed Templeton, Ryan McGinley, Anne Daems, Miwa Yanagi, Tracey Moffatt, Catherine Opie, Tomoko Sawada, Vanessa Beecroft, Yasumasa Morimura and Collier Schorr, among others.