10 results for Real assets and portfolio diversification
at Duke University
Elucidation of hepatitis C virus transmission and early diversification by single genome sequencing.
Abstract:
A precise molecular identification of transmitted hepatitis C virus (HCV) genomes could illuminate key aspects of transmission biology, immunopathogenesis and natural history. We used single genome sequencing of 2,922 half or quarter genomes from plasma viral RNA to identify transmitted/founder (T/F) viruses in 17 subjects with acute community-acquired HCV infection. Sequences from 13 of 17 acute subjects, but none of 14 chronic controls, exhibited one or more discrete low diversity viral lineages. Sequences within each lineage generally revealed a star-like phylogeny of mutations that coalesced to unambiguous T/F viral genomes. Numbers of transmitted viruses leading to productive clinical infection were estimated to range from 1 to 37 or more (median = 4). Four acutely infected subjects showed a distinctly different pattern of virus diversity that deviated from a star-like phylogeny. In these cases, empirical analysis and mathematical modeling suggested high multiplicity virus transmission from individuals who themselves were acutely infected or had experienced a virus population bottleneck due to antiviral drug therapy. These results provide new quantitative and qualitative insights into HCV transmission, revealing for the first time virus-host interactions that successful vaccines or treatment interventions will need to overcome. Our findings further suggest a novel experimental strategy for identifying full-length T/F genomes for proteome-wide analyses of HCV biology and adaptation to antiviral drug or immune pressures.
Abstract:
An enterprise information system (EIS) is an integrated data-applications platform characterized by diverse, heterogeneous, and distributed data sources. For many enterprises, a number of business processes still depend heavily on static rule-based methods and extensive human expertise. Enterprises are faced with the need for optimizing operation scheduling, improving resource utilization, discovering useful knowledge, and making data-driven decisions.
This thesis research focuses on real-time optimization and knowledge discovery, addressing workflow optimization, resource allocation, and data-driven prediction of process-execution times, order fulfillment, and enterprise service-level performance. In contrast to prior work on data analytics techniques for enterprise performance optimization, the emphasis here is on realizing scalable, real-time enterprise intelligence based on a combination of heterogeneous system simulation, combinatorial optimization, machine-learning algorithms, and statistical methods.
On-demand digital-print service is a representative enterprise requiring a powerful EIS. We use real-life data from Reischling Press, Inc. (RPI), a digital print service provider (PSP), to evaluate our optimization algorithms.
In order to handle the increase in volume and diversity of demands, we first present a high-performance, scalable, and real-time production scheduling algorithm for production automation based on an incremental genetic algorithm (IGA). The objective of this algorithm is to optimize the order dispatching sequence and balance resource utilization. Compared to prior work, this solution is scalable for a high volume of orders and it provides fast scheduling solutions for orders that require complex fulfillment procedures. Experimental results highlight its potential benefit in reducing production inefficiencies and enhancing the productivity of an enterprise.
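To make the scheduling step concrete, below is a minimal sketch of an incremental genetic algorithm for order-dispatch sequencing. The fitness function (makespan plus a resource-balance penalty), the genetic operators, and the way newly arrived orders are spliced into surviving chromosomes are illustrative assumptions; the abstract does not specify these details.

import random

def fitness(sequence, proc_times, n_machines):
    """Lower is better: makespan plus a penalty for unbalanced machine loads."""
    loads = [0.0] * n_machines
    for order in sequence:
        m = min(range(n_machines), key=lambda i: loads[i])  # greedy dispatch
        loads[m] += proc_times[order]
    return max(loads) + 0.5 * (max(loads) - min(loads))

def crossover(p1, p2):
    """Order crossover: keep a slice of p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    kept = p1[a:b]
    rest = (g for g in p2 if g not in kept)
    return [p1[i] if a <= i < b else next(rest) for i in range(len(p1))]

def mutate(seq, rate=0.2):
    """Swap two positions with probability `rate`."""
    seq = seq[:]
    if random.random() < rate:
        i, j = random.sample(range(len(seq)), 2)
        seq[i], seq[j] = seq[j], seq[i]
    return seq

def evolve(population, proc_times, n_machines, generations=50):
    """Standard GA loop: keep the better half, refill with mutated offspring."""
    for _ in range(generations):
        population.sort(key=lambda s: fitness(s, proc_times, n_machines))
        parents = population[:len(population) // 2]
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(len(population) - len(parents))]
        population = parents + children
    return population

def insert_new_orders(population, new_orders):
    """Incremental step: splice newly arrived orders into evolved chromosomes
    instead of restarting the search from scratch."""
    updated = []
    for seq in population:
        seq = seq[:]
        for order in new_orders:
            seq.insert(random.randrange(len(seq) + 1), order)
        updated.append(seq)
    return updated

# Usage: 20 initial orders on 3 presses, then 5 late-arriving orders.
proc = {i: random.uniform(1.0, 10.0) for i in range(25)}
pop = evolve([random.sample(range(20), 20) for _ in range(30)], proc, 3)
pop = evolve(insert_new_orders(pop, list(range(20, 25))), proc, 3)
best = min(pop, key=lambda s: fitness(s, proc, 3))

The incremental insertion of new orders into an already-evolved population, rather than restarting the search, is the property the abstract attributes to the IGA; everything else above is a plain permutation GA.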
We next discuss the analysis and prediction of different attributes involved in hierarchical components of an enterprise. We start from a study of the fundamental processes related to real-time prediction. Our process-execution-time and process-status prediction models integrate statistical methods with machine-learning algorithms. In addition to improving prediction accuracy relative to stand-alone machine-learning algorithms, they also provide a probabilistic estimate of the predicted status. An order generally consists of multiple series and parallel processes. We next introduce an order-fulfillment prediction model that combines the advantages of multiple classification models through flexible decision-integration mechanisms. Experimental results show that adopting due dates recommended by the model can significantly reduce an enterprise's late-delivery ratio. Finally, we investigate service-level attributes that reflect the overall performance of an enterprise. We analyze and decompose time-series data into different components according to their hierarchical periodic nature, perform correlation analysis,
and develop univariate prediction models for each component as well as multivariate models for correlated components. Predictions for the original time series are aggregated from the predictions of its components. In addition to a significant increase in mid-term prediction accuracy, this distributed modeling strategy also improves short-term time-series prediction accuracy.
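As a rough illustration of this decompose-predict-aggregate strategy, the sketch below splits an hourly series into weekly and daily seasonal profiles plus a residual trend, forecasts each component separately, and sums the component forecasts. The hourly granularity, the seasonal-mean profiles, and the linear trend model are assumptions made only for this example; the thesis's component models are not specified here.

import numpy as np

def seasonal_profile(x, period):
    """Mean value of x at each phase of the given period."""
    t = np.arange(len(x))
    return np.array([x[t % period == k].mean() for k in range(period)])

def forecast(y, horizon, day=24, week=24 * 7):
    """Decompose into weekly + daily seasonality + trend, forecast each, sum."""
    t = np.arange(len(y))
    wk = seasonal_profile(y, week)              # weekly component
    resid = y - wk[t % week]
    dy = seasonal_profile(resid, day)           # daily component
    trend = resid - dy[t % day]                 # what remains after both seasons
    slope, intercept = np.polyfit(t, trend, 1)  # placeholder trend model
    tf = np.arange(len(y), len(y) + horizon)
    return wk[tf % week] + dy[tf % day] + (slope * tf + intercept)

# Usage on synthetic hourly data with daily and weekly cycles plus drift.
rng = np.random.default_rng(0)
t = np.arange(24 * 7 * 8)
y = (10 + 2 * np.sin(2 * np.pi * t / 24) + np.sin(2 * np.pi * t / (24 * 7))
     + 0.01 * t + rng.normal(0, 0.3, t.size))
y_next_two_days = forecast(y, horizon=48)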
In summary, this thesis research has led to a set of characterization, optimization, and prediction tools that enable an EIS to derive insightful knowledge from data and use it as guidance for production management. These tools are expected to help enterprises increase reconfigurability, automate more procedures, and obtain data-driven recommendations for effective decisions.
Abstract:
This dissertation consists of three separate studies that examine patterns of immigrant incorporation in the United States. The first study tests competing hypotheses derived from conflicting theoretical frameworks (transnational perspective and cross-national framework) to determine whether transnational engagement and incorporation are concurrent processes among Chinese, Indian, and Mexican immigrants. This study measures transnational engagement and incorporation as home and home country asset ownership using multi-panel, nationally representative data from the New Immigrant Survey (NIS) collected in 2003 and 2007. Results support a cross-border framework and indicate that transnational asset ownership decreases among all immigrant groups, while U.S. asset ownership increases. Findings from this study also indicate that due to disadvantaged pre-migration SES and low human capital, Mexican immigrants are less likely than other immigrants to own home country assets during the year after receiving their green card.
The second study examines the doubly disadvantaged position of elderly immigrants in the U.S. wealth distribution by applying the life course perspective to the dominance-differentiation theory of immigrant wealth stratification. I analyze elderly immigrant wealth with respect to U.S.-born seniors and younger immigrant cohorts using two data sets: the Survey of Income and Program Participation (SIPP), a nationally representative survey of U.S. households covering 2001 to 2005, and the New Immigrant Survey (NIS). The first series of analyses reveals a significant wealth gap between U.S.- and foreign-born seniors that is most pronounced among the wealthiest households in my sample; however, U.S. tenure explains much of this difference. The second series of analyses suggests that elderly immigrants experience greater barriers to incorporation compared to their younger counterparts.
In the third study, I apply a transnational lens to the forms-of-capital and opportunity structure models of entrepreneurship in order to analyze the role of foreign resources in immigrant business start-ups. I propose that home country property use represents financial, social, and class resources that facilitate immigrant entrepreneurship. I test my hypotheses using survey data on Latin American immigrants from the Comparative Immigrant Entrepreneurship Project. Findings from these analyses suggest that home country asset ownership provides financial and social capital that is related to an increased likelihood of immigrant entrepreneurship.
Abstract:
BACKGROUND: Vertebrate skin appendages are constructed of keratins produced by multigene families. Alpha (α) keratins are found in all vertebrates, while beta (β) keratins are found exclusively in reptiles and birds. We have studied the molecular evolution of these gene families in the genomes of 48 phylogenetically diverse birds and their expression in the scales and feathers of the chicken. RESULTS: We found that the total number of α-keratins is lower in birds than mammals and non-avian reptiles, yet two α-keratin genes (KRT42 and KRT75) have expanded in birds. The β-keratins, however, demonstrate a dynamic evolution associated with avian lifestyle. The avian specific feather β-keratins comprise a large majority of the total number of β-keratins, but independently derived lineages of aquatic and predatory birds have smaller proportions of feather β-keratin genes and larger proportions of keratinocyte β-keratin genes. Additionally, birds of prey have a larger proportion of claw β-keratins. Analysis of α- and β-keratin expression during development of chicken scales and feathers demonstrates that while α-keratins are expressed in these tissues, the number and magnitude of expressed β-keratin genes far exceeds that of α-keratins. CONCLUSIONS: These results support the view that the number of α- and β-keratin genes expressed, the proportion of the β-keratin subfamily genes expressed and the diversification of the β-keratin genes have been important for the evolution of the feather and the adaptation of birds into multiple ecological niches.
Abstract:
UNLABELLED: The human fungal pathogen Cryptococcus neoformans is capable of infecting a broad range of hosts, from invertebrates like amoebas and nematodes to standard vertebrate models such as mice and rabbits. Here we have taken advantage of a zebrafish model to investigate host-pathogen interactions of Cryptococcus with the zebrafish innate immune system, which shares a highly conserved framework with that of mammals. Through live-imaging observations and genetic knockdown, we establish that macrophages are the primary immune cells responsible for responding to and containing acute cryptococcal infections. By interrogating survival and cryptococcal burden following infection with a panel of Cryptococcus mutants, we find that virulence factors initially identified as important in causing disease in mice are also necessary for pathogenesis in zebrafish larvae. Live imaging of the cranial blood vessels of infected larvae reveals that C. neoformans is able to penetrate the zebrafish brain following intravenous infection. By studying a C. neoformans FNX1 gene mutant, we find that blood-brain barrier invasion is dependent on a known cryptococcal invasion-promoting pathway previously identified in a murine model of central nervous system invasion. The zebrafish-C. neoformans platform provides a visually and genetically accessible vertebrate model system for cryptococcal pathogenesis with many of the advantages of small invertebrates. This model is well suited for higher-throughput screening of mutants, mechanistic dissection of cryptococcal pathogenesis in live animals, and use in the evaluation of therapeutic agents. IMPORTANCE: Cryptococcus neoformans is an important opportunistic pathogen that is estimated to be responsible for more than 600,000 deaths worldwide annually. Existing mammalian models of cryptococcal pathogenesis are costly, and the analysis of important pathogenic processes such as meningitis is laborious and remains a challenge to visualize. Conversely, although invertebrate models of cryptococcal infection allow high-throughput assays, they fail to replicate the anatomical complexity found in vertebrates and, specifically, cryptococcal stages of disease. Here we have utilized larval zebrafish as a platform that overcomes many of these limitations. We demonstrate that the pathogenesis of C. neoformans infection in zebrafish involves factors identical to those in mammalian and invertebrate infections. We then utilize the live-imaging capacity of zebrafish larvae to follow the progression of cryptococcal infection in real time and establish a relevant model of the critical central nervous system infection phase of disease in a nonmammalian model.
Abstract:
Advances in three related areas, state-space modeling, sequential Bayesian learning, and decision analysis, are addressed, with attention to the statistical challenges of scalability and associated dynamic sparsity. The key theme that ties the three areas together is Bayesian model emulation: solving challenging analytical and computational problems using creative model emulators. This idea defines theoretical and applied advances in non-linear, non-Gaussian state-space modeling, dynamic sparsity, decision analysis, and statistical computation, across linked contexts of multivariate time series and dynamic network studies. Examples and applications in financial time series and portfolio analysis, macroeconomics, and internet studies from computational advertising demonstrate the utility of the core methodological innovations.
Chapter 1 summarizes the three areas/problems and the key idea of emulation in those areas. Chapter 2 discusses the sequential analysis of latent threshold models with the use of emulating models that allow for analytical filtering to enhance the efficiency of posterior sampling. Chapter 3 examines the emulator model in decision analysis, or synthetic model, which is equivalent to the loss function in the original minimization problem, and shows its performance in the context of sequential portfolio optimization. Chapter 4 describes a method for modeling streaming count data observed on a large network that relies on emulating the whole, dependent network model by independent, conjugate sub-models customized to each set of flows. Chapter 5 reviews these advances and offers concluding remarks.
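A minimal sketch of the decoupling idea behind Chapter 4: each network flow gets its own conjugate sub-model that is updated independently, emulating the full dependent model at a fraction of the cost. The Poisson likelihood with a discounted Gamma prior used below is assumed purely for illustration; the sub-models developed in the thesis are richer than this.

from dataclasses import dataclass

@dataclass
class FlowModel:
    """Conjugate Poisson-Gamma sub-model for the counts on one network flow."""
    shape: float = 1.0       # Gamma shape
    rate: float = 1.0        # Gamma rate
    discount: float = 0.95   # forgets old data so the flow rate can drift

    def predict_mean(self) -> float:
        """One-step-ahead predictive mean of the next count."""
        return self.shape / self.rate

    def update(self, count: int) -> None:
        """Discount the prior, then apply the conjugate Poisson-Gamma update."""
        self.shape = self.discount * self.shape + count
        self.rate = self.discount * self.rate + 1.0

# Each flow is handled by its own sub-model; the updates involve no cross-flow
# communication, which is what makes the scheme scalable to large networks.
flows = {("nodeA", "nodeB"): FlowModel(), ("nodeA", "nodeC"): FlowModel()}
stream = [{("nodeA", "nodeB"): 4, ("nodeA", "nodeC"): 1},
          {("nodeA", "nodeB"): 6, ("nodeA", "nodeC"): 0}]
for counts in stream:
    for key, model in flows.items():
        ahead = model.predict_mean()   # forecast before the count arrives
        model.update(counts[key])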
Abstract:
I study the link between capital markets and sources of macroeconomic risk. In chapter 1 I show that expected inflation risk is priced in the cross section of stock returns even after controlling for cash flow growth and volatility risks. Motivated by this evidence, I study a long-run risk model with a built-in inflation non-neutrality channel that allows me to decompose the real stochastic discount factor into news about current and expected cash flow growth, news about expected inflation, and news about volatility. The model can successfully price a broad menu of assets and provides a setting for analyzing cross-sectional variation in the expected inflation risk premium. For industries like retail and durable goods, inflation risk can account for nearly a third of the overall risk premium, while the energy industry and a broad commodity index act like inflation hedges. Nominal bonds are exposed to expected inflation risk and have inflation premiums that increase with bond maturity. The price of expected inflation risk was very high during the 1970s and 1980s but has since come down substantially, remaining very close to zero over the past decade. On average, the expected inflation price of risk is negative, consistent with the view that periods of high inflation represent a "bad" state of the world and are associated with low economic growth and poor stock market performance. In chapter 2 I look at the way capital markets react to predetermined macroeconomic announcements. I document significantly higher excess returns on the US stock market on macro release dates compared to days when no macroeconomic news hits the market. Almost the entire equity premium since 1997 has been realized on days when macroeconomic news is released. At high frequency, there is a pattern of returns increasing in the hours prior to the predetermined announcement time, peaking around the time of the announcement, and dropping thereafter.
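A schematic form of the discount-factor decomposition described above, written in notation assumed here rather than taken from the dissertation, is

    m_{t+1} - \mathrm{E}_t[m_{t+1}]
      = -\lambda_{g}\,\eta_{g,t+1} - \lambda_{x}\,\eta_{x,t+1}
        - \lambda_{\pi}\,\eta_{\pi,t+1} - \lambda_{\sigma}\,\eta_{\sigma,t+1},

where m_{t+1} is the log real stochastic discount factor, \eta_{g}, \eta_{x}, \eta_{\pi}, and \eta_{\sigma} denote news about current cash flow growth, expected cash flow growth, expected inflation, and volatility, respectively, and the \lambda's are the corresponding market prices of risk.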
Abstract:
The full-scale base-isolated structure studied in this dissertation is the only base-isolated building in the South Island of New Zealand. It sustained hundreds of earthquake ground motions from September 2010 well into 2012. Several large earthquake responses were recorded in December 2011 by NEES@UCLA and by a GeoNet recording station near Christchurch Women's Hospital. The primary focus of this dissertation is to advance the state of the art of methods to evaluate the performance of seismically isolated structures and the effects of soil-structure interaction, by developing new data-processing methodologies to overcome current limitations and by implementing advanced numerical modeling in OpenSees for direct analysis of soil-structure interaction.
This dissertation presents a novel method for recovering force-displacement relations within the isolators of building structures with unknown nonlinearities from sparse seismic-response measurements of floor accelerations. The method requires only direct matrix calculations (factorizations and multiplications); no iterative trial-and-error methods are required. The method requires a mass matrix, or at least an estimate of the floor masses. A stiffness matrix may be used, but is not necessary. Essentially, the method operates on a matrix of incomplete measurements of floor accelerations. In the special case of complete floor measurements of systems with linear dynamics, real modes, and equal floor masses, the principal components of this matrix are the modal responses. In the more general case of partial measurements and nonlinear dynamics, the method extracts a number of linearly dependent components from Hankel matrices of measured horizontal response accelerations, assembles these components row-wise, and extracts principal components from the singular value decomposition of this large matrix of linearly dependent components. These principal components are then interpolated between floors in a way that minimizes the curvature energy of the interpolation. This interpolation step can make use of a reduced-order stiffness matrix, a backward difference matrix, or a central difference matrix. The measured and interpolated floor acceleration components at all floors are then assembled and multiplied by a mass matrix. The recovered in-service force-displacement relations are then incorporated into the OpenSees soil-structure interaction model.
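A minimal sketch of the component-extraction step just described: build a Hankel matrix from each measured floor-acceleration channel, stack the Hankel matrices row-wise, and take the leading principal components from the singular value decomposition of the stacked matrix. Array shapes, the Hankel depth, and the number of retained components are assumptions for illustration; the interpolation and mass-matrix steps are omitted.

import numpy as np

def hankel_rows(x, n_rows):
    """Hankel matrix whose rows are time-shifted copies of the signal x."""
    n_cols = len(x) - n_rows + 1
    return np.array([x[i:i + n_cols] for i in range(n_rows)])

def extract_components(acc, n_rows=20, n_components=6):
    """acc: (n_measured_floors, n_samples) horizontal floor accelerations."""
    stacked = np.vstack([hankel_rows(channel, n_rows) for channel in acc])
    centered = stacked - stacked.mean(axis=1, keepdims=True)
    # Principal components of the large matrix of linearly dependent rows.
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:n_components] * s[:n_components, None]  # component time histories

# Usage on placeholder data: 3 measured floors, 2000 samples.
rng = np.random.default_rng(1)
acc = rng.normal(size=(3, 2000))
components = extract_components(acc)   # shape (6, 2000 - 20 + 1)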
Numerical simulations of soil-structure interaction involving non-uniform soil behavior are conducted following the development of the complete soil-structure interaction model of Christchurch Women's Hospital in OpenSees. In these 2D OpenSees models, the superstructure is modeled as two-dimensional frames in the short-span and long-span directions, respectively. The lead rubber bearings are modeled as elastomeric bearing (Bouc-Wen) elements. The soil underlying the concrete raft foundation is modeled with linear elastic plane-strain quadrilateral elements. The non-uniformity of the soil profile is incorporated by extracting and interpolating the shear wave velocity profile from the Canterbury Geotechnical Database. The validity of the complete two-dimensional soil-structure interaction OpenSees model for the hospital is checked by comparing peak floor responses and force-displacement relations within the isolation system obtained from the OpenSees simulations to the recorded measurements. General explanations and implications, supported by displacement drifts, floor acceleration and displacement responses, and force-displacement relations, are presented to address the effects of soil-structure interaction.
Abstract:
Monitoring and enforcement are perhaps the biggest challenges in the design and implementation of environmental policies in developing countries, where the actions of many small informal actors cause significant impacts on ecosystem services and where the transaction costs for the state to regulate them could be enormous. This dissertation studies the potential of innovative institutions based on decentralized coordination and enforcement to induce better environmental outcomes. Such policies have in common that the state plays the role of providing the incentives for organization, but the process of compliance happens through decentralized agreements, trust building, signaling, and monitoring. I draw from the literatures on collective action, common-pool resources, game theory, and non-point source pollution to develop the instruments proposed here. To test the different conditions in which such policies could be implemented, I designed two field experiments that I conducted with small-scale gold miners in the Colombian Pacific and with users and providers of ecosystem services in the states of Veracruz, Quintana Roo, and Yucatan in Mexico. This dissertation is organized in three essays.
The first essay, “Collective Incentives for Cleaner Small-Scale Gold Mining on the Frontier: Experimental Tests of Compliance with Group Incentives given Limited State Monitoring”, examines whether collective incentives, i.e., incentives provided to a group conditional on collective compliance, could “outsource” the required local monitoring, i.e., induce group interactions that extend the reach of a state that can observe only aggregate consequences in the context of small-scale gold mining. I employed a framed field-lab experiment in which miners make decisions about mining intensity. The state sets a collective target for an environmental outcome, verifies compliance, and provides a group reward for compliance that is split equally among members. Since the target set by the state transforms the situation into a coordination game, outcomes depend on expectations of what others will do. I conducted this experiment with 640 participants in a mining region of the Colombian Pacific, examining different levels of policy severity and their ordering. The findings of the experiment suggest that such instruments can induce compliance, but this regulation involves tradeoffs: the most severe targets, with rewards just above costs, raise gains if successful but can collapse rapidly and completely. In terms of group interactions, better outcomes are found when severity is initially lower, suggesting learning.
The second essay, “Collective Compliance can be Efficient and Inequitable: Impacts of Leaders among Small-Scale Gold Miners in Colombia”, explores the channels through which communication helps groups coordinate in the presence of collective incentives and whether the solutions reached are equitable. Also in the context of small-scale gold mining in the Colombian Pacific, I test the effect of communication on compliance with a collective environmental target. The results suggest that communication, as expected, helps to solve coordination challenges, but some groups still reach agreements involving unequal outcomes. By examining the agreements that took place in each group, I observe that the main coordination mechanism was the presence of leaders who helped other group members clarify the situation. Interestingly, leaders not only helped groups reach efficiency but also played a key role in equity by defining how the costs of compliance would be distributed among group members.
The third essay, “Creating Local PES Institutions and Increasing Impacts of PES in Mexico: A Real-Time Watershed-Level Framed Field Experiment on Coordination and Conditionality”, considers the creation of a local payments for ecosystem services (PES) mechanism as an assurance game that requires coordination between two groups of participants: upstream and downstream. Based on this assurance interaction, I explore how allowing peer sanctions on upstream behavior affects the functioning of the mechanism. This field-lab experiment was implemented in three real cases of the Mexican Fondos Concurrentes (matching funds) program in the states of Veracruz, Quintana Roo, and Yucatan, where 240 real users and 240 real providers of hydrological services were recruited and interacted with each other in real time. The experimental results suggest that initial trust-game behavior aligns with participants’ perceptions and predicts baseline giving in the assurance game. For upstream providers, i.e., those who get sanctioned, the threat and the use of sanctions increase contributions. Downstream users contribute less when offered the option to sanction, as if that option signaled an uncooperative upstream group; contributions then rise in line with the complementarity in payments of the assurance game.
Abstract:
Uncertainty quantification (UQ) is both an old and a new concept. The current novelty lies in the interactions and synthesis of mathematical models, computer experiments, statistics, field/real experiments, and probability theory, with particular emphasis on large-scale simulations by computer models. The challenges come not only from the complexity of the scientific questions but also from the size of the information. The focus of this thesis is to provide statistical models that are scalable to the massive data produced in computer experiments and real experiments, through fast and robust statistical inference.
Chapter 2 provides a practical approach for simultaneously emulating/approximating a massive number of functions, with an application to hazard quantification for the Soufrière Hills volcano on the island of Montserrat. Chapter 3 discusses another problem with massive data, in which the number of observations of a function is large; an exact algorithm that is linear in time is developed for the problem of interpolating methylation levels. Chapters 4 and 5 both concern robust inference for the models. Chapter 4 proposes a new criterion for robust parameter estimation and shows that several inference approaches satisfy it. Chapter 5 develops a new prior that satisfies additional criteria and is therefore proposed for use in practice.
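As an illustration of the kind of computation involved in emulating a massive number of functions at once (the specific approach of Chapter 2 may differ), the sketch below assumes all functions are observed at a shared design, so a single Cholesky factorization of the shared correlation matrix serves every output. The squared-exponential kernel, the fixed length-scale, and the nugget are assumptions.

import numpy as np

def sq_exp_kernel(x1, x2, length_scale=0.2, nugget=1e-6):
    """Squared-exponential correlation; a small nugget stabilizes the diagonal."""
    k = np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / length_scale) ** 2)
    if x1 is x2:
        k = k + nugget * np.eye(len(x1))
    return k

def fit_many_emulators(x_design, y_outputs, length_scale=0.2):
    """y_outputs: (n_design, n_functions). One factorization covers them all."""
    chol = np.linalg.cholesky(sq_exp_kernel(x_design, x_design, length_scale))
    alpha = np.linalg.solve(chol.T, np.linalg.solve(chol, y_outputs))
    return lambda x_new: sq_exp_kernel(x_new, x_design, length_scale) @ alpha

# Usage: emulate 10,000 functions observed at 50 shared design points.
rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 50)
coeff = rng.normal(size=(3, 10_000))
y = np.sin(2 * np.pi * x)[:, None] * coeff[0] + x[:, None] * coeff[1] + coeff[2]
predict = fit_many_emulators(x, y)
y_hat = predict(np.linspace(0.0, 1.0, 200))   # (200, 10000) predicted values

Here the cubic-cost factorization is paid once and amortized over every function, which is what makes simultaneous emulation of many outputs feasible in this simplified setting.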