957 results for time evolution
Abstract:
Kernel methods provide a way to apply a wide range of learning techniques to complex and structured data by shifting the representational problem from one of finding an embedding of the data to that of defining a positive semidefinite kernel. In this paper, we propose a novel kernel on unattributed graphs where the structure is characterized through the evolution of a continuous-time quantum walk. More precisely, given a pair of graphs, we create a derived structure whose degree of symmetry is maximum when the original graphs are isomorphic. With this new graph to hand, we compute the density operators of the quantum systems representing the evolutions of two suitably defined quantum walks. Finally, we define the kernel between the two original graphs as the quantum Jensen-Shannon divergence between these two density operators. The experimental evaluation shows the effectiveness of the proposed approach. © 2013 Springer-Verlag.
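For concreteness, the divergence underlying the kernel can be written as QJSD(ρ, σ) = S((ρ+σ)/2) − (S(ρ) + S(σ))/2, where S is the von Neumann entropy. The following minimal Python/NumPy sketch computes this quantity for two given density matrices; it assumes the density operators of the quantum-walk evolutions have already been obtained and does not reproduce the construction of the merged graph or of the continuous-time quantum walks themselves.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]              # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

def qjsd(rho, sigma):
    """Quantum Jensen-Shannon divergence between two density operators."""
    mix = 0.5 * (rho + sigma)
    return (von_neumann_entropy(mix)
            - 0.5 * (von_neumann_entropy(rho) + von_neumann_entropy(sigma)))

# Toy example: two 2x2 density matrices (unit trace, positive semidefinite).
rho   = np.array([[0.7, 0.0], [0.0, 0.3]])
sigma = np.array([[0.5, 0.2], [0.2, 0.5]])
kernel_value = qjsd(rho, sigma)   # the abstract defines the kernel as this divergence
```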
Abstract:
Jenő Szűcs wrote his essay entitled Sketch on the Three Regions of Europe in the early 1980s in Hungary. During these years, a historically well-argued opinion emphasising a substantial difference between Central European and Eastern European societies was warmly received in various circles of the political opposition. In a wider European perspective, Szűcs used the old “liberty topos”, which claims that the history of Europe is nothing other than the fulfillment of liberty. In his Sketch, Szűcs does not concentrate only on questions concerning the Middle Ages in Western Europe. Yet it is this stream of thought which brought a new perspective to explaining European history. His picture of the Middle Ages shows well that there is a way to integrate all the typical Western motifs of post-war self-definition into a single theory. Mainly, the “liberty motif”, as a sign of “Europeanism” – in the interpretation of Bibó’s concept, of Anglo-Saxon Marxists, and of Weber’s social theory – developed from medieval concepts of state and society and from an analysis of economic and social structures. Szűcs’s historical perspective was a typical intellectual product of the 1980s: this was the time when a few Central European historians started to outline non-Marxist aspects of social theory and categories of modernisation theories, while concealing them behind Marxist terminology.
Abstract:
The major barrier to practical optimization of pavement preservation programming has always been that, for formulations where the identity of individual projects is preserved, the solution space grows exponentially with the problem size, to the point where it becomes unmanageable by traditional analytical optimization techniques within a reasonable time. This has been attributed to the problem of combinatorial explosion, that is, the exponential growth of the number of combinations. The relatively large number of constraints often present in real-life pavement preservation programming problems, and the trade-off considerations required between preventive maintenance, rehabilitation and reconstruction, are further factors that contribute to the solution complexity. In this research study, a new integrated multi-year optimization procedure was developed to solve network-level pavement preservation programming problems through cost-effectiveness-based evolutionary programming analysis, using the Shuffled Complex Evolution (SCE) algorithm. A case study problem was analyzed to illustrate the robustness and consistency of the SCE technique in solving network-level pavement preservation problems. The output from this program is a list of maintenance and rehabilitation (M&R) treatment strategies for each identified segment of the network in each programming year, and the impact on the overall performance of the network, in terms of the performance levels of the recommended optimal M&R strategy. The results show that the SCE is very efficient and consistent in the simultaneous consideration of the trade-off between various pavement preservation strategies, while preserving the identity of the individual network segments. The flexibility of the technique is also demonstrated, in the sense that, by suitably coding the problem parameters, it can be used to solve several forms of pavement management programming problems. It is recommended that for large networks some form of decomposition technique be applied to aggregate sections that exhibit similar performance characteristics into links, such that whatever M&R alternative is recommended for a link can be applied to all the sections connected to it. In this way the problem size, and hence the solution time, can be greatly reduced to a more manageable solution space. The study concludes that the robust search characteristics of SCE are well suited to solving the combinatorial problems in long-term network-level pavement M&R programming and provide a rich area for future research.
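The actual procedure relies on the Shuffled Complex Evolution (SCE) algorithm, which partitions a population of candidate solutions into complexes, evolves each complex, and periodically shuffles them. The sketch below is only a much-simplified generic evolutionary loop over hypothetical cost and effectiveness data, meant to illustrate how a per-segment treatment plan can be encoded and searched under a budget constraint; it is not the SCE algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical problem data: 3 treatments per segment (do-nothing, preventive, rehab).
N_SEGMENTS, N_TREATMENTS = 20, 3
cost = rng.uniform(0, 100, size=(N_SEGMENTS, N_TREATMENTS))
cost[:, 0] = 0.0                                   # doing nothing is free
effectiveness = rng.uniform(0, 10, size=(N_SEGMENTS, N_TREATMENTS))
effectiveness[:, 0] = 0.0
BUDGET = 500.0

def fitness(plan):
    """Total effectiveness of a treatment plan; over-budget plans are penalised."""
    total_cost = cost[np.arange(N_SEGMENTS), plan].sum()
    total_eff = effectiveness[np.arange(N_SEGMENTS), plan].sum()
    return total_eff if total_cost <= BUDGET else total_eff - 10.0 * (total_cost - BUDGET)

def evolve(pop_size=60, generations=200, mutation_rate=0.1):
    """Simple evolutionary search over integer-coded treatment plans."""
    pop = rng.integers(0, N_TREATMENTS, size=(pop_size, N_SEGMENTS))
    for _ in range(generations):
        scores = np.array([fitness(p) for p in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]            # keep the better half
        children = parents[rng.integers(0, len(parents), pop_size - len(parents))].copy()
        mutate = rng.random(children.shape) < mutation_rate           # random treatment swaps
        children[mutate] = rng.integers(0, N_TREATMENTS, mutate.sum())
        pop = np.vstack([parents, children])
    best = pop[np.argmax([fitness(p) for p in pop])]
    return best, fitness(best)

best_plan, best_score = evolve()
```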
Abstract:
Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for time held fixed is called the yield curve; the aggregate of these cross-sections is the evolution of the yield curve. This dissertation studies aspects of this evolution. There are two complementary approaches to the study of yield curve evolution here. The first is principal components analysis; the second is wavelet analysis. In both approaches both the time and maturity variables are discretized. In principal components analysis the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized; the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution. In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market. Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary. Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution analysis is presented, including the dedicated Perl programs and the details of the portfolio metrics and the specially adapted wavelet construction. The result is a more robust set of statistics, which provides balance to the more fragile principal components analysis.
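The principal components step described above amounts to diagonalising the covariance matrix of the observed yield-curve shifts. A minimal sketch is shown below; the shift matrix here is random placeholder data rather than Treasury observations, so it will not reproduce the 99%/95% figures reported in the dissertation.

```python
import numpy as np

# Hypothetical matrix of yield-curve shifts: rows are observation dates, columns are maturities.
rng = np.random.default_rng(1)
shifts = rng.normal(size=(250, 10))          # stand-in for observed yield changes

# Principal components analysis: diagonalise the covariance matrix of the shifts.
cov = np.cov(shifts, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)       # returned in ascending order
order = np.argsort(eigvals)[::-1]            # sort components by explained variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Fraction of total interest-rate variation explained by the leading components.
explained = np.cumsum(eigvals) / eigvals.sum()
print("variance explained by top 6 components:", explained[5])
```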
Abstract:
Since the arrival of the first African slaves to Cuba in 1524, the issue of race has had a long-lived presence in the Cuban national discourse. However, despite Cuba’s colonial history, some historians have often maintained that race relations in Cuba were congenial, with racism and racial discrimination never being as deep or widespread in Cuba as in the United States (Cannon, 1983, p. 113). In fact, it has been argued that institutionalized racism was introduced into Cuban society with the first U.S. occupation, during 1898–1902 (Cannon, 1983, p. 113). This study of Cuba investigates the influence of the United States on the development of race relations and racial perceptions in post-independence Cuba, specifically from 1898 to 1902. These years comprise the time period immediately following the final fight for Cuban independence, culminating with the Cuban-Spanish-American War and the first U.S. occupation of Cuba. By this time, the Cuban population comprised Africans as well as descendants of Africans, White Spanish people, indigenous Cubans, and offspring of the intermixing of these groups. This research studies whether the United States’ own race relations and racial perceptions influenced the initial conflicting race relations and racial perceptions in early and post-U.S.-occupation Cuba. This study uses a collective interpretative framework that incorporates a national level of analysis with a focus on race relations and racial perceptions. This framework reaches beyond the traditionally utilized perspectives in interpreting the impact of the United States during and following its intervention in Cuba. Attention is given to the role of the existing social and political climate within the United States as a driving influence of the United States’ involvement with Cuba. This study reveals that emphasis on the role of the United States as critical to the development of Cuba’s race relations and racial perceptions is credible, given the extensive involvement of the U.S. in the building of the early Cuban Republic and the use of U.S. structures as models for reconstruction. U.S. government formation in Cuba aligned with a governing system reflecting the existing governing codes of the U.S. during that time period.
Abstract:
The study examines the thought of Yanagita Kunio (1875–1962), an influential Japanese nationalist thinker and a founder of an academic discipline named minzokugaku. The purpose of the study is to bring to light an unredeemed potential of his intellectual and political project as a critique of the way in which modern politics and knowledge systematically suppress global diversity. The study reads his texts against the backdrop of the modern understanding of space and time and its political and moral implications, and traces the historical evolution of his thought, which culminates in the establishment of minzokugaku. My reading of Yanagita’s texts draws on three interpretive hypotheses. First, his thought can be interpreted as a critical engagement with John Stuart Mill’s philosophy of history, as he turns Mill’s defense of diversity against Mill’s justification of enlightened despotism in non-Western societies. Second, to counter Mill’s individualistic notion of progressive agency, he turns to a Marxian notion of anthropological space, in which a laboring class makes history by continuously transforming nature, and rehabilitates the common people (jomin) as progressive agents. Third, in addition to the common people, Yanagita integrates wandering people as a countervailing force to the innate parochialism and conservatism of agrarian civilization. To excavate the unrecorded history of ordinary farmers and wandering people and promote the formation of national consciousness, his minzokugaku adopts travel as an alternative method for knowledge production and political education. In light of this interpretation, the aim of Yanagita’s intellectual and political project can be understood as both a defense and a critique of the Enlightenment tradition. Intellectually, he attempts to navigate between spurious universalism and reactionary particularism by revaluing diversity as a necessary condition for universal knowledge and human progress. Politically, his minzokugaku aims at nation-building/globalization from below by tracing back the history of a migratory process cutting across the existing boundaries. His project is opposed to nation-building from above, which aims to integrate the world population into international society at the expense of global diversity.
Abstract:
The Last Interglacial (LIG, 129–116 thousand years before present, ka) represents a test bed for climate model feedbacks in warmer-than-present high-latitude regions. However, mainly because aligning different palaeoclimatic archives from different parts of the world is not trivial, a spatio-temporal picture of LIG temperature changes is difficult to obtain. Here, we have selected 47 polar ice core and sub-polar marine sediment records and developed a strategy to align them onto the recent AICC2012 ice core chronology. We provide the first compilation of high-latitude temperature changes across the LIG associated with a coherent temporal framework built between ice core and marine sediment records. Our new data synthesis highlights non-synchronous maximum temperature changes between the two hemispheres, with the Southern Ocean and Antarctic records showing an early warming compared to North Atlantic records. We also observe that warmer-than-present-day conditions persist for a longer time period in southern high latitudes than in northern high latitudes. Finally, the amplitude of temperature change recorded at the onset and the demise of the LIG is larger at high northern latitudes than at high southern latitudes. We have also compiled four data-based time slices with temperature anomalies (compared to present-day conditions) at 115 ka, 120 ka, 125 ka and 130 ka and quantitatively estimated temperature uncertainties that include relative dating errors. This provides an improved benchmark for performing more robust model-data comparisons. The surface temperature simulated by two General Circulation Models (CCSM3 and HadCM3) for 130 ka and 125 ka is compared to the corresponding time-slice data synthesis. This comparison shows that the models predict warmer-than-present conditions earlier than documented in the North Atlantic, while neither model is able to produce the reconstructed early Southern Ocean and Antarctic warming. Our results highlight the importance of producing a sequence of time slices rather than a single time slice averaging the LIG climate conditions.
Abstract:
All organisms live in complex habitats that shape the course of their evolution by altering the phenotype expressed by a given genotype (a phenomenon known as phenotypic plasticity) and simultaneously by determining the evolutionary fitness of that phenotype. In some cases, phenotypic evolution may alter the environment experienced by future generations. This dissertation describes how genetic and environmental variation act synergistically to affect the evolution of glucosinolate defensive chemistry and flowering time in Boechera stricta, a wild perennial herb. I focus particularly on plant-associated microbes as a part of the plant’s environment that may alter trait evolution and in turn be affected by the evolution of those traits. In the first chapter I measure glucosinolate production and reproductive fitness of over 1,500 plants grown in common gardens in four diverse natural habitats, to describe how patterns of plasticity and natural selection intersect and may influence glucosinolate evolution. I detected extensive genetic variation for glucosinolate plasticity and determined that plasticity may aid colonization of new habitats by moving phenotypes in the same direction as natural selection. In the second chapter I conduct a greenhouse experiment to test whether naturally-occurring soil microbial communities contributed to the differences in phenotype and selection that I observed in the field experiment. I found that soil microbes cause plasticity of flowering time but not glucosinolate production, and that they may contribute to natural selection on both traits; thus, non-pathogenic plant-associated microbes are an environmental feature that could shape plant evolution. In the third chapter, I combine a multi-year, multi-habitat field experiment with high-throughput amplicon sequencing to determine whether B. stricta-associated microbial communities are shaped by plant genetic variation. I found that plant genotype predicts the diversity and composition of leaf-dwelling bacterial communities, but not root-associated bacterial communities. Furthermore, patterns of host genetic control over associated bacteria were largely site-dependent, indicating an important role for genotype-by-environment interactions in microbiome assembly. Together, my results suggest that soil microbes influence the evolution of plant functional traits and, because they are sensitive to plant genetic variation, this trait evolution may alter the microbial neighborhood of future B. stricta generations. Complex patterns of plasticity, selection, and symbiosis in natural habitats may impact the evolution of glucosinolate profiles in Boechera stricta.
Abstract:
Dengue is an important vector-borne virus that infects on the order of 400 million individuals per year. Infection with one of the virus's four serotypes (denoted DENV-1 to 4) may be silent, result in symptomatic dengue 'breakbone' fever, or develop into the more severe dengue hemorrhagic fever/dengue shock syndrome (DHF/DSS). Extensive research has therefore focused on identifying factors that influence dengue infection outcomes. It has been well documented through epidemiological studies that DHF is most likely to result from a secondary heterologous infection, and that individuals experiencing a DENV-2 or DENV-3 infection are typically more likely to present with more severe dengue disease than individuals experiencing a DENV-1 or DENV-4 infection. However, a mechanistic understanding of how these risk factors affect disease outcomes, and further, how the virus's ability to evolve these mechanisms will affect disease severity patterns over time, is lacking. In the second chapter of my dissertation, I formulate mechanistic mathematical models of primary and secondary dengue infections that describe how the dengue virus interacts with the immune response and how this interaction affects the risk of developing severe dengue disease. I show that only the innate immune response is needed to reproduce characteristic features of a primary infection, whereas the adaptive immune response is needed to reproduce characteristic features of a secondary dengue infection. I then add to these models a quantitative measure of disease severity that assumes immunopathology, and analyze the effectiveness of virological indicators of disease severity. In the third chapter of my dissertation, I statistically fit these mathematical models to viral load data of dengue patients to understand the mechanisms that drive variation in viral load. I specifically consider the roles that immune status, clinical disease manifestation, and serotype may play in explaining the viral load variation observed across the patients. With this analysis, I show that there is statistical support for the theory of antibody-dependent enhancement in the development of severe disease in secondary dengue infections and that there is statistical support for serotype-specific differences in viral infectivity rates, with infectivity rates of DENV-2 and DENV-3 exceeding those of DENV-1. In the fourth chapter of my dissertation, I integrate these within-host models with a vector-borne epidemiological model to understand the potential for virulence evolution in dengue. Critically, I show that dengue is expected to evolve towards intermediate virulence, and that the optimal virulence of the virus depends strongly on the number of serotypes that co-circulate. Together, these dissertation chapters show that dengue viral load dynamics provide insight into the within-host mechanisms driving differences in dengue disease patterns and that these mechanisms have important implications for dengue virulence evolution.
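As an illustration of the general modelling approach, the sketch below integrates a minimal target-cell-limited within-host infection model; the dissertation's actual models additionally include innate and adaptive immune responses and a disease-severity measure, and the parameter values used here are purely illustrative rather than fitted to patient data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameter values (per day); not fitted to any patient data.
beta, delta, p, c = 1e-9, 1.0, 1e4, 5.0

def within_host(t, y):
    """Minimal target-cell-limited within-host infection model."""
    T, I, V = y                    # uninfected target cells, infected cells, free virus
    dT = -beta * T * V
    dI = beta * T * V - delta * I
    dV = p * I - c * V
    return [dT, dI, dV]

y0 = [1e7, 0.0, 10.0]              # initial target cells, infected cells, virions
sol = solve_ivp(within_host, (0, 20), y0, dense_output=True, max_step=0.1)

t = np.linspace(0, 20, 400)
viral_load = sol.sol(t)[2]         # trajectory of free virus over 20 days
```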
Abstract:
As we work our way through the latest financial crisis, politicians seem both powerless to act convincingly and unable to craft, from the welter of diverse and antagonistic narratives, a coherent and convincing vision of the future. In this article, we argue that a temporal lens brings clarity to such confusion, and that thinking in terms of time and reflecting on privileged temporal structures helps to highlight underlying assumptions and distinguish different narratives from one another. We begin by articulating our understanding of temporality, and we proceed to apply this to the evolution of financial practice during different historical epochs as recently delineated by Gordon (2012). We argue that the principles of finance were effectively in place by the eighteenth century and that consequent developments are best conceptualized as phases in which one particular aspect is intensified. We find that, across different historical periods, the temporal intensification associated with specific models of finance shifts from the past to the present to the future. We argue that a quite idiosyncratic understanding of the future has been intensified in the present phase, which we refer to as the proximal future, and we explain how this has come to be. We then consider the ethical consequences of privileging an intensification of the proximal future before mapping an alternative model centred on intensifying the distal future, highlighting early signs of its potential emergence in the shadows of our present.
Abstract:
The recently proposed global monsoon hypothesis interprets monsoon systems as part of one global-scale atmospheric overturning circulation, implying a connection between the regional monsoon systems and an in-phase behaviour of all northern hemispheric monsoons on annual timescales (Trenberth et al., 2000). Whether this concept can be applied to past climates and variability on longer timescales is still under debate, because the monsoon systems exhibit different regional characteristics such as different seasonality (i.e. onset, peak, and withdrawal). To investigate the interconnection of different monsoon systems during the pre-industrial Holocene, five transient global climate model simulations have been analysed with respect to the rainfall trend and variability in different sub-domains of the Afro-Asian monsoon region. Our analysis suggests that on millennial timescales with varying orbital forcing, the monsoons do not behave as a tightly connected global system. According to the models, the Indian and North African monsoons are coupled, showing similar rainfall trends and moderate correlation in rainfall variability in all models. The East Asian monsoon changes independently during the Holocene. The dissimilarities in the seasonality of the monsoon sub-systems lead to a stronger response of the North African and Indian monsoon systems to the Holocene insolation forcing than of the East Asian monsoon, and affect the seasonal distribution of Holocene rainfall variations. Within the Indian and North African monsoon domains, precipitation changes solely during the summer months, showing a decreasing Holocene precipitation trend. In the East Asian monsoon region, the precipitation signal is determined by an increasing precipitation trend during spring and a decreasing precipitation change during summer, partly balancing each other. A synthesis of the reconstructions and the model results does not reveal an impact of the different seasonality on the timing of the Holocene rainfall optimum in the different sub-monsoon systems. Rather, they indicate locally inhomogeneous rainfall changes and show that single palaeo-records should not be used to characterise the rainfall change and monsoon evolution for entire monsoon sub-systems.
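The quantities discussed above (rainfall trend and correlation of rainfall variability between sub-domains) can be illustrated with the short sketch below; the two rainfall series are hypothetical placeholders for regional averages of model output, not the simulations analysed in the study.

```python
import numpy as np

# Hypothetical annual rainfall series (mm/yr) for two monsoon sub-domains over the Holocene.
rng = np.random.default_rng(2)
years = np.arange(0, 6000, 10)                     # stand-in for model output times
indian  = 800 + 0.010 * years + rng.normal(0, 30, years.size)
african = 400 + 0.008 * years + rng.normal(0, 25, years.size)

# Millennial-scale trend of each series (least-squares slope) ...
trend_indian  = np.polyfit(years, indian, 1)[0]
trend_african = np.polyfit(years, african, 1)[0]

# ... and correlation of the detrended variability between the two sub-domains.
resid_i = indian  - np.polyval(np.polyfit(years, indian, 1), years)
resid_a = african - np.polyval(np.polyfit(years, african, 1), years)
corr = np.corrcoef(resid_i, resid_a)[0, 1]
```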
Abstract:
This data set comprises time series of aboveground community plant biomass (Sown plant community, Weed plant community, Dead plant material, and Unidentified plant material; all measured as dry weight) and species-specific biomass from the sown species of several experiments at the field site of a large grassland biodiversity experiment (the Jena Experiment; see further details below). Aboveground community biomass was normally harvested just prior to mowing, during peak standing biomass twice a year (generally in May and August; in 2002 only once, in September), on all experimental plots in the Jena Experiment. This was done by clipping the vegetation at 3 cm above ground in up to four rectangles of 0.2 x 0.5 m per large plot. The location of these rectangles was assigned by random selection of new coordinates every year within the core area of the plots. The positions of the rectangles within plots were identical for all plots. The harvested biomass was sorted into categories: individual species for the sown plant species, weed plant species (species not sown at the particular plot), detached dead plant material (i.e., dead plant material in the data file), and remaining plant material that could not be assigned to any category (i.e., unidentified plant material in the data file). All biomass was dried to constant weight (70°C, >= 48 h) and weighed. Sown plant community biomass was calculated as the sum of the biomass of the individual sown species. The data for individual samples and the mean over samples are given for the biomass measures on the community level. Overall, analyses of the community biomass data have identified species richness as well as functional group composition as important drivers of a positive biodiversity-productivity relationship. The following series of datasets are contained in this collection: 1. Plant biomass from the Main Experiment: In the Main Experiment, 82 grassland plots of 20 x 20 m were established from a pool of 60 species belonging to four functional groups (grasses, legumes, tall and small herbs). In May 2002, varying numbers of plant species from this species pool were sown into the plots to create a gradient of plant species richness (1, 2, 4, 8, 16 and 60 species) and functional richness (1, 2, 3, 4 functional groups). 2. Plant biomass from the Dominance Experiment: In the Dominance Experiment, 206 grassland plots of 3.5 x 3.5 m were established from a pool of 9 species that can be dominant in semi-natural grassland communities of the study region. In May 2002, varying numbers of plant species from this species pool were sown into the plots to create a gradient of plant species richness (1, 2, 3, 4, 6, and 9 species). 3. Plant biomass from the monoculture plots: In the monoculture plots the sown plant community contains only a single species per plot, and this species is a different one for each plot. Which species was sown in which plot is stated in the plot information table for monocultures (see further details below). The monoculture plots of 3.5 x 3.5 m were established, like the other experiments, in May 2002 for all 60 plant species of the Jena Experiment species pool, with two replicates per species. All plots were maintained by bi-annual weeding and mowing.
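The aggregation rule stated above (sown community biomass as the sum over the individual sown species, with plot-level values given as the mean over the sampling rectangles) can be illustrated with the following sketch; the column names and plot identifier are hypothetical and do not reflect the actual file layout.

```python
import pandas as pd

# Hypothetical long-format records: one row per plot, sampling rectangle, and sown species.
records = pd.DataFrame({
    "plot":    ["B1A01", "B1A01", "B1A01", "B1A01"],
    "sample":  [1, 1, 2, 2],
    "species": ["Poa pratensis", "Trifolium repens", "Poa pratensis", "Trifolium repens"],
    "dry_weight_g": [12.4, 3.1, 10.9, 4.0],
})

# Sown community biomass per sample = sum over the sown species in that rectangle.
per_sample = (records.groupby(["plot", "sample"])["dry_weight_g"]
                      .sum()
                      .rename("sown_community_biomass_g"))

# Plot-level value = mean over the sampling rectangles of that plot.
per_plot = per_sample.groupby(level="plot").mean()
print(per_plot)
```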