Abstract:
Many high-state non-magnetic cataclysmic variables (CVs) exhibit blueshifted absorption or P-Cygni profiles associated with ultraviolet (UV) resonance lines. These features imply the existence of powerful accretion disc winds in CVs. Here, we use our Monte Carlo ionization and radiative transfer code to investigate whether disc wind models that produce realistic UV line profiles are also likely to generate observationally significant recombination line and continuum emission in the optical waveband. We also test whether outflows may be responsible for the single-peaked emission line profiles often seen in high-state CVs and for the weakness of the Balmer absorption edge (relative to simple models of optically thick accretion discs). We find that a standard disc wind model that is successful in reproducing the UV spectra of CVs also leaves a noticeable imprint on the optical spectrum, particularly for systems viewed at high inclination. The strongest optical wind-formed recombination lines are H alpha and He ii lambda 4686. We demonstrate that a higher density outflow model produces all the expected H and He lines and produces a recombination continuum that can fill in the Balmer jump at high inclinations. This model displays reasonable verisimilitude with the optical spectrum of RW Trianguli. No single-peaked emission is seen, although we observe a narrowing of the double-peaked emission lines from the base of the wind. Finally, we show that even denser models can produce a single-peaked H alpha line. On the basis of our results, we suggest that winds can modify, and perhaps even dominate, the line and continuum emission from CVs.
Abstract:
We present a novel method for the light-curve characterization of Pan-STARRS1 Medium Deep Survey (PS1 MDS) extragalactic sources into stochastic variables (SVs) and burst-like (BL) transients, using multi-band image-differencing time-series data. We select detections in difference images associated with galaxy hosts using a star/galaxy catalog extracted from the deep PS1 MDS stacked images, and adopt a maximum a posteriori formulation to model their difference-flux time series in the four Pan-STARRS1 photometric bands gP1, rP1, iP1, and zP1. We use three deterministic light-curve models to fit BL transients: a Gaussian, a Gamma distribution, and an analytic supernova (SN) model; and one stochastic light-curve model, the Ornstein-Uhlenbeck process, to fit variability that is characteristic of active galactic nuclei (AGNs). We assess the quality of fit of the models band-wise and source-wise, using their estimated leave-one-out cross-validation likelihoods and corrected Akaike information criteria. We then apply a K-means clustering algorithm to these statistics to determine the source classification in each band. The final source classification is derived as a combination of the individual filter classifications, resulting in two measures of classification quality, from the averages across the photometric filters of (1) the classifications determined from the closest K-means cluster centers, and (2) the squared distances from the cluster centers in the K-means clustering spaces. For a verification set of AGNs and SNe, we show that SVs and BLs occupy distinct regions in the plane constituted by these measures. We use our clustering method to characterize 4361 extragalactic image-difference-detected sources, in the first 2.5 yr of the PS1 MDS, into 1529 BL and 2262 SV, with a purity of 95.00% for AGNs and 90.97% for SNe, based on our verification sets.
We combine our light-curve classifications with their nuclear or off-nuclear host galaxy offsets, to define a robust photometric sample of 1233 AGNs and 812 SNe. With these two samples, we characterize their variability and host galaxy properties, and identify simple photometric priors that would enable their real-time identification in future wide-field synoptic surveys.
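As an editorial illustration of the band-wise K-means classification step described in this abstract, the following minimal sketch clusters synthetic per-source fit statistics into two groups and computes each source's squared distance to its assigned cluster center. All data, dimensions, and values here are invented, not the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented per-source fit statistics: one row per source. The two columns
# stand in for quantities such as the AICc and cross-validation-likelihood
# differences between the burst-like and stochastic model fits.
burst_like = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))
stochastic = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(50, 2))
X = np.vstack([burst_like, stochastic])

def kmeans(X, init, iters=25):
    """Plain Lloyd's algorithm: assign each point to its nearest centre,
    then recompute each centre as the mean of its assigned points."""
    centres = init.copy()
    for _ in range(iters):
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        centres = np.array([X[labels == j].mean(axis=0)
                            for j in range(len(centres))])
    return labels, centres

# Deterministic initialisation (first and last rows) keeps the sketch
# reproducible; real code would use k-means++ or random restarts.
labels, centres = kmeans(X, init=X[[0, -1]])

# Squared distance of each source to its assigned cluster centre -- the
# second quality measure in the abstract averages such distances over bands.
sq_dist = ((X - centres[labels]) ** 2).sum(axis=1)
```

With well-separated statistics like these, the two synthetic populations land in distinct clusters, mirroring the separation of SV and BL sources the authors report.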
Abstract:
This study examines the firm size distribution of US banks and credit unions. A truncated lognormal distribution describes the size distribution, measured using assets data, of a large population of small, community-based commercial banks. The size distribution of a smaller but increasingly dominant cohort of large banks, which operate a high-volume low-cost retail banking model, exhibits power-law behaviour. There is a progressive increase in skewness over time, and Zipf’s Law is rejected as a descriptor of the size distribution in the upper tail. By contrast, the asset size distribution of the population of credit unions conforms closely to the lognormal distribution.
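The kind of comparison this abstract describes can be sketched as follows: fit a lognormal to simulated "asset" data by maximum likelihood on the logs, and estimate the Zipf (rank-size) slope of the upper tail, where a slope near -1 would correspond to Zipf's Law. All data and parameter values are invented, not the study's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented asset sizes: a lognormal body plus a Pareto upper tail.
body = rng.lognormal(mean=4.0, sigma=1.0, size=5000)
tail = (rng.pareto(a=1.5, size=200) + 1.0) * body.max()
assets = np.concatenate([body, tail])

# Lognormal fit: the MLE is the mean and standard deviation of log-assets.
logs = np.log(assets)
mu_hat, sigma_hat = logs.mean(), logs.std()

# Zipf / rank-size check on the upper tail: regress log(rank) on log(size).
top = np.sort(assets)[-200:][::-1]          # largest 200 firms, descending
ranks = np.arange(1, len(top) + 1)
slope, intercept = np.polyfit(np.log(top), np.log(ranks), 1)
```

Here the tail is drawn with a Pareto exponent of 1.5, so the estimated rank-size slope comes out well below the Zipf value of -1, the sort of evidence on which Zipf's Law would be rejected for the upper tail.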
Abstract:
We investigate the determinants of US credit union capital-to-assets ratios before and after the implementation of the current capital adequacy regulatory framework in 2000. Capitalization varies pro-cyclically, and until the financial crisis credit unions classified as adequately capitalized or below followed a faster adjustment path than well-capitalized credit unions. This pattern was reversed, however, in the aftermath of the crisis. The introduction of the prompt corrective action (PCA) regulatory regime achieved a reduction in the proportion of credit unions classified as adequately capitalized or below that continued until the onset of the crisis. Since the crisis, the speed of recovery of credit unions in this category following an adverse capitalization shock has been sharply reduced.
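The adjustment-speed language in this abstract corresponds to a standard partial-adjustment specification, in which the capital ratio closes a fraction of the gap to its target each period. A hedged sketch on synthetic data (the target, speed, and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Partial adjustment: k_t = k_{t-1} + lam * (target - k_{t-1}) + noise,
# where lam is the speed of adjustment toward the target capital ratio.
lam_true, target, T = 0.3, 0.11, 400
k = np.empty(T)
k[0] = 0.08
for t in range(1, T):
    k[t] = k[t - 1] + lam_true * (target - k[t - 1]) + rng.normal(0, 0.002)

# Estimate lam by OLS (no intercept) of the change (k_t - k_{t-1})
# on the lagged gap (target - k_{t-1}).
dy = k[1:] - k[:-1]
x = target - k[:-1]
lam_hat = (x * dy).sum() / (x * x).sum()
```

Comparing estimated speeds across sub-periods and capitalization classes is how statements like "a faster adjustment path" and "sharply reduced" speed of recovery are made precise.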
Abstract:
A method of manufacturing a composite concrete article comprising forming a textile structure, removing material from regions of the textile structure to create voids in the textile structure and incorporating the textile structure into a body of wet uncured concrete such that the concrete flows into the voids created in the textile structure, embedding the textile structure into the concrete, whereby the textile structure defines at least a portion of a surface of the cured concrete article.
Abstract:
This study provides estimates of the macroeconomic impact of non-communicable diseases (NCDs) in China and India for the period 2012–2030. Our estimates are derived using the World Health Organization's EPIC model of economic growth, which focuses on the negative effects of NCDs on labor supply and capital accumulation. We present results for the five main NCDs (cardiovascular disease, cancer, chronic respiratory disease, diabetes, and mental health). Our undiscounted estimates indicate that the cost of the five main NCDs will total USD 23.03 trillion for China and USD 4.58 trillion for India (in 2010 USD). For both countries, the most costly domain is cardiovascular disease. Our analyses also reveal that the costs are much larger in China than in India, mainly because of China's higher and steeper income trajectory and, to a lesser extent, its older population. Rough calculations also indicate that the WHO's best buys for addressing the challenge of NCDs are highly cost-beneficial.
Abstract:
New, automated forms of data analysis are required to understand the high-dimensional trajectories obtained from molecular dynamics simulations of proteins. Dimensionality reduction algorithms are particularly appealing in this regard, as they allow one to construct unbiased, low-dimensional representations of the trajectory using only the information encoded in the trajectory. The downside of this approach is that a different set of coordinates is required for each chemical system under study, precisely because the coordinates are constructed using information from the trajectory. In this paper we show how one can resolve this problem by using the sketch-map algorithm that we recently proposed to construct a low-dimensional representation of the structures contained in the Protein Data Bank (PDB). We show that the resulting coordinates are as useful for analysing trajectory data as coordinates constructed from landmark configurations taken from the trajectory, and that these coordinates can thus be used for understanding protein folding across a range of systems.
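Sketch-map itself is beyond a short example, but the generic shape of the dimensionality-reduction task described here can be illustrated with plain PCA as a stand-in (not the sketch-map algorithm; the "frames" and dimensions below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented stand-in for simulation frames: 200 "structures" described by
# 10 coordinates that in fact lie near a 2-D subspace, plus small noise.
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 10))
frames = latent @ mixing + 0.05 * rng.normal(size=(200, 10))

# Principal component analysis via SVD of the centred data: project each
# frame onto the two directions of largest variance.
centred = frames - frames.mean(axis=0)
_, s, vt = np.linalg.svd(centred, full_matrices=False)
low_dim = centred @ vt[:2].T

# Fraction of total variance captured by the 2-D representation.
explained = (s[:2] ** 2).sum() / (s ** 2).sum()
```

The point the abstract makes is that such coordinates are normally trajectory-specific; sketch-map coordinates built from the PDB are proposed as a system-independent alternative.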
Abstract:
We present a homological characterisation of those chain complexes of modules over a Laurent polynomial ring in several indeterminates which are finitely dominated over the ground ring (that is, are a retract up to homotopy of a bounded complex of finitely generated free modules). The main tools, which we develop in the paper, are a non-standard totalisation construction for multi-complexes based on truncated products, and a high-dimensional mapping torus construction employing a theory of cubical diagrams that commute up to specified coherent homotopies.
Abstract:
The advent of novel genomic technologies that enable the evaluation of genomic alterations on a genome-wide scale has significantly altered the field of genomic marker research in solid tumors. Researchers have moved away from the traditional model of identifying a particular genomic alteration and evaluating the association between this finding and a clinical outcome measure to a new approach involving the identification and measurement of multiple genomic markers simultaneously within clinical studies. This in turn has presented additional challenges in considering the use of genomic markers in oncology, such as clinical study design, reproducibility, and the interpretation and reporting of results. This Review explores these challenges, focusing on microarray-based gene-expression profiling, and highlights some common failings in study design that have impeded the use of putative genomic markers in the clinic. Despite these rapid technological advances, there is still a paucity of genomic markers in routine clinical use at present. A rational and focused approach to the evaluation and validation of genomic markers is needed, whereby analytically validated markers are investigated in clinical studies that are adequately powered and have pre-defined patient populations and study endpoints. Furthermore, novel adaptive clinical trial designs, incorporating putative genomic markers into prospective clinical trials, will enable the evaluation of these markers in a rigorous and timely fashion. Such approaches have the potential to facilitate the implementation of such markers into routine clinical practice and consequently enable the rational and tailored use of cancer therapies for individual patients.
Abstract:
We present a method for learning Bayesian networks from data sets containing thousands of variables without the need for structure constraints. Our approach consists of two parts. The first is a novel algorithm that effectively explores the space of possible parent sets of a node. It guides the exploration towards the most promising parent sets on the basis of an approximated score function that is computed in constant time. The second is an improvement of an existing ordering-based algorithm for structure optimization. The new algorithm provably achieves a higher score than its original formulation. Our approach consistently outperforms the state of the art on very large data sets.
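To make the "score of a parent set" concrete, here is a toy BIC score for a binary child variable given a candidate parent set, with every parent set of a tiny problem enumerated exhaustively. This is an illustration only, not the authors' approximated score function, and the data are invented:

```python
import itertools
import math
from collections import Counter

# Toy binary data set (invented): columns are variables, rows are samples.
data = [
    (0, 0, 0), (0, 0, 1), (0, 1, 1), (1, 1, 1),
    (1, 1, 0), (1, 0, 1), (0, 1, 1), (1, 1, 1),
]

def bic_score(child, parents, data):
    """BIC of `child` given a candidate parent set, for binary variables:
    maximised log-likelihood minus a complexity penalty."""
    n = len(data)
    groups = Counter()   # samples per parent configuration
    counts = Counter()   # samples per (parent configuration, child value)
    for row in data:
        cfg = tuple(row[p] for p in parents)
        groups[cfg] += 1
        counts[(cfg, row[child])] += 1
    ll = sum(c * math.log(c / groups[cfg]) for (cfg, _), c in counts.items())
    n_params = 2 ** len(parents)  # one Bernoulli parameter per configuration
    return ll - 0.5 * n_params * math.log(n)

# Exhaustively score every parent set of variable 2. This enumeration is
# feasible only on toys; avoiding it on thousands of variables is exactly
# what the approximated score function above is for.
candidates = [ps for r in range(3) for ps in itertools.combinations((0, 1), r)]
best = max(candidates, key=lambda ps: bic_score(2, ps, data))
```

Guiding the search toward high-scoring parent sets without scoring all of them is the role of the constant-time approximation described in the abstract.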
Abstract:
In recent years much attention has been given to systemic risk and maintaining financial stability. Much of the focus, rightly, has been on market failures and the role of regulation in addressing them. This article looks at the role of domestic policies and government actions as sources of global instability. The global financial system is built upon global markets controlled by national financial and macroeconomic policies. In this context, regulatory asymmetries, diverging policy preferences, and government failures add a further dimension to global systemic risk not present at the national level.
Systemic risk is a result of the interplay between two independent variables: an underlying trigger event, in this analysis a domestic policy measure, and a transmission channel. The solution to systemic risk requires tackling one of these variables. In a domestic setting, the centralization of regulatory power into one single authority makes it easier to balance the delicate equilibrium between enhancing efficiency and reducing instability. However, in a global financial system in which national financial policies serve to maximize economic welfare, regulators will be confronted with difficult policy and legal tradeoffs.
We investigate the role that financial regulation plays in addressing domestic policy failures and in controlling the danger of global financial interdependence. To do so we analyse global financial interconnectedness, and explain its role in transmitting instability; we investigate the political economy dynamics at the origin of regulatory asymmetries and government failures; and we discuss the limits of regulation.