962 results for GRAPH CUTS
Abstract:
This paper develops a dynamic general equilibrium model to highlight the role of human capital accumulation of agents differentiated by skill type in the joint determination of social mobility and the skill premium. We first show that our model captures the empirical co-movement of the skill premium, the relative supply of skilled to unskilled workers, and aggregate output in U.S. data from 1970 to 2000. We next show that endogenous social mobility and human capital accumulation are key channels through which the effects of capital tax cuts and increases in public spending on both pre- and post-college education are transmitted. In particular, social mobility creates additional incentives for agents that enhance the beneficial effects of policy reforms. Moreover, the dynamics of human capital accumulation imply that, post reform, the skill premium is higher in the short to medium run than in the long run.
Abstract:
We develop an empirical framework that links micro-liquidity, macro-liquidity, and stock prices. We provide evidence of a strong link between macro-liquidity shocks and the returns of UK stock portfolios constructed on the basis of micro-liquidity measures between 1999 and 2012. Specifically, macro-liquidity shocks, which are extracted on the meeting days of the Bank of England Monetary Policy Committee relative to market expectations embedded in 3-month LIBOR futures prices, are transmitted in a differential manner to the cross-section of liquidity-sorted portfolios, with liquid stocks playing the most active role. We also find a significant increase in shares' trading activity and a rather small increase in their trading cost on MPC meeting days. Finally, our results document that during the recent financial crisis the shock-return relationship reversed its sign: interest rate cuts during the crisis were perceived by market participants as a signal of deteriorating economic prospects and reinforced "flight to safety" trading.
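The shock-extraction step described above follows the standard futures-based approach: the unexpected policy component on an announcement day is the one-day revision in the rate implied by the front 3-month LIBOR futures contract, since the previous day's price already embeds market expectations. A minimal sketch of this idea in pandas; the series and date names are illustrative, not the authors' code:

    import pandas as pd

    def mpc_day_surprises(futures_prices: pd.Series, mpc_dates) -> pd.Series:
        # futures_prices: daily settlement prices of the front 3-month LIBOR
        # futures contract, indexed by date (quoted as 100 minus the rate)
        implied_rate = 100.0 - futures_prices
        revision = implied_rate.diff()   # day-over-day revision of the implied rate
        # the revision on an MPC meeting day proxies the unexpected component
        return revision.loc[revision.index.isin(mpc_dates)]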
Abstract:
Workers in less secure jobs are often paid less than identical-looking workers in more secure jobs. We show that this lack of compensating differentials for unemployment risk can arise in equilibrium when all workers are identical and firms differ only in job security (i.e. the probability that the worker is not sent into unemployment). In a setting where workers search for new positions both on and off the job, the worker's marginal willingness to pay for job security is endogenous: it depends on the behavior of all firms in the labor market and increases with the rent the employing firm leaves to the worker. We solve for the labor market equilibrium, finding that wages increase with job security for at least all firms in the risky tail of the distribution of firm-level unemployment risk. Meanwhile, unemployment becomes persistent for low-wage and unemployed workers, a seeming pattern of 'unemployment scarring' created entirely by firm heterogeneity. Higher in the wage distribution, workers can take wage cuts to move to more stable employment.
Abstract:
This paper analyses optimal income taxes over the business cycle under a balanced-budget restriction, for low-, middle-, and high-income households. A model incorporating capital-skill complementarity in production and differential access to capital and labour markets is developed to capture the cyclical characteristics of the US economy, as well as the empirical observations on wage (skill premium) and wealth inequality. We find that the tax rate for high-income agents is optimally the least volatile and the tax rate for low-income agents the least countercyclical. In contrast, the path of optimal taxes for the middle-income group is found to be very volatile and countercyclical. We further find that the optimal response to output-enhancing capital equipment technology and spending cuts is to increase the progressivity of income taxes. Finally, in response to positive TFP shocks, taxation becomes more progressive after about two years.
Abstract:
What is the role of unilateral measures in global climate change mitigation in a post-Durban, post-2012 global policy regime? We argue that under conditions of preference heterogeneity, unilateral emissions mitigation at a subnational level may exist even when a nation is unwilling to commit to emission cuts. As the fraction of individuals unilaterally cutting emissions in a global strongly connected network of countries evolves over time, learning the costs of cutting emissions can result in the adoption of such activities globally, and we establish that this will indeed happen under certain assumptions. We analyze the features of a policy proposal that could accelerate convergence to a low-carbon world in the presence of global learning.
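The adoption dynamics described above can be illustrated with stylized threshold dynamics on a strongly connected graph: a country adopts mitigation once enough of its neighbours have adopted and thereby revealed their abatement costs. This is only an illustrative caricature of the paper's learning mechanism; the graph, seed set, and threshold are assumptions:

    import networkx as nx

    def spread_of_mitigation(G: nx.Graph, seeds, threshold=0.3, max_steps=100):
        # a country adopts once the adopting fraction of its neighbours
        # (whose costs it can observe and learn from) reaches `threshold`
        adopted = set(seeds)
        for _ in range(max_steps):
            new = {v for v in G if v not in adopted
                   and sum(u in adopted for u in G.neighbors(v))
                       >= threshold * max(G.degree(v), 1)}
            if not new:              # no further learning-driven adoption
                break
            adopted |= new
        return adopted               # global adoption if this equals set(G)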
Abstract:
Functional connectivity in the human brain can be represented as a network using electroencephalography (EEG) signals. These networks, whose node counts can vary from tens to hundreds, are characterized by neurobiologically meaningful graph theory metrics. This study investigates the degree to which various graph metrics depend upon network size. To this end, EEGs from 32 normal subjects were recorded and functional networks of three different sizes were extracted. A state-space based method was used to calculate cross-correlation matrices between different brain regions. These correlation matrices were used to construct binary adjacency connectomes, which were assessed with regard to a number of graph metrics such as clustering coefficient, modularity, efficiency, economic efficiency, and assortativity. We showed that the estimates of these metrics differ significantly depending on network size. Larger networks had higher efficiency, higher assortativity, and lower modularity than smaller networks of the same density. These findings indicate that network size should be considered in any comparison of networks across studies.
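The pipeline from a correlation matrix to graph metrics can be sketched with networkx. The state-space correlation estimation itself is not reproduced here; `corr` is assumed given, and thresholding at a fixed edge density mirrors the abstract's same-density comparison:

    import numpy as np
    import networkx as nx
    from networkx.algorithms import community

    def binary_connectome(corr: np.ndarray, density: float) -> nx.Graph:
        # keep the strongest |correlation| links until the target density
        n = corr.shape[0]
        upper = np.abs(corr[np.triu_indices(n, k=1)])
        n_edges = max(1, int(density * len(upper)))
        cutoff = np.sort(upper)[-n_edges]           # ties may add a few edges
        adj = (np.abs(corr) >= cutoff).astype(int)
        np.fill_diagonal(adj, 0)                    # no self-loops
        return nx.from_numpy_array(adj)

    def graph_metrics(G: nx.Graph) -> dict:
        parts = community.greedy_modularity_communities(G)
        return {
            "clustering":    nx.average_clustering(G),
            "efficiency":    nx.global_efficiency(G),
            "modularity":    community.modularity(G, parts),
            "assortativity": nx.degree_assortativity_coefficient(G),
        }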
Abstract:
Lean meat percentage (LMP) is an important carcass quality parameter. The aim of this work is to obtain a calibration equation for Computed Tomography (CT) scans with the Partial Least Squares (PLS) regression technique in order to predict the LMP of the carcass and of the different cuts, and to study and compare two methodologies for selecting the variables to be included in the prediction equation (Variable Importance for Projection, VIP, and stepwise). The error of prediction with cross-validation (RMSEPCV) of the LMP obtained with PLS and selection based on VIP value was 0.82%, and for stepwise selection it was 0.83%. The prediction of the LMP when scanning only the ham had an RMSEPCV of 0.97%, and if the ham and the loin were scanned the RMSEPCV was 0.90%. Results indicate that for CT data both VIP and stepwise selection are good methods. Moreover, scanning only the ham allowed us to obtain a good prediction of the LMP of the whole carcass.
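VIP-based selection can be sketched with scikit-learn's PLS implementation: a variable's VIP aggregates, over components, its weight share scaled by the variance of y each component explains, and variables with VIP > 1 are conventionally retained. The data below are synthetic stand-ins, not the study's CT variables:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def vip_scores(pls: PLSRegression) -> np.ndarray:
        t = pls.x_scores_       # (n_samples, n_components)
        w = pls.x_weights_      # (n_features, n_components)
        q = pls.y_loadings_     # (n_targets, n_components)
        p = w.shape[0]
        ss = np.sum(q ** 2, axis=0) * np.sum(t ** 2, axis=0)  # y-variance per component
        w_norm = w / np.linalg.norm(w, axis=0)
        return np.sqrt(p * (w_norm ** 2 @ ss) / ss.sum())

    rng = np.random.default_rng(0)
    X_ct = rng.normal(size=(120, 40))   # hypothetical CT density variables
    y_lmp = X_ct[:, :3] @ np.array([0.5, 0.3, 0.2]) + rng.normal(scale=0.1, size=120)
    pls = PLSRegression(n_components=10).fit(X_ct, y_lmp)
    selected = vip_scores(pls) > 1.0    # conventional VIP > 1 retention rule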
Abstract:
The project consists of a graphical environment whose purpose is to visualize, study, and interpret the conservation of genetic code among different genomes. An interface allows loading up to eight genomes to be compared in detail, in pairs or all of them at once. The graph displayed in the interface represents the Maximal Unique Matchings between each pair of genomes, that is, non-repeated matches of the greatest possible length in the DNA sequences of the compared species. The goal is the study of the evolutionary changes that have appeared between different organisms, or of the genes that some species share with others.
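Such maximal unique matches can be found, for illustration, by seeding on k-mers that occur exactly once in each genome and extending them maximally. Production tools such as MUMmer use suffix structures instead, and this naive sketch can miss matches whose every k-length window repeats elsewhere:

    from collections import defaultdict

    def maximal_unique_matches(a: str, b: str, k: int = 20):
        # index all k-mers of each genome by their positions
        pos_a, pos_b = defaultdict(list), defaultdict(list)
        for i in range(len(a) - k + 1):
            pos_a[a[i:i + k]].append(i)
        for j in range(len(b) - k + 1):
            pos_b[b[j:j + k]].append(j)
        mums = set()
        for kmer, pa in pos_a.items():
            pb = pos_b.get(kmer)
            if len(pa) == 1 and pb is not None and len(pb) == 1:
                i, j, ea, eb = pa[0], pb[0], pa[0] + k, pb[0] + k
                while i > 0 and j > 0 and a[i - 1] == b[j - 1]:        # extend left
                    i, j = i - 1, j - 1
                while ea < len(a) and eb < len(b) and a[ea] == b[eb]:  # extend right
                    ea, eb = ea + 1, eb + 1
                mums.add((i, j, ea - i))   # (start in a, start in b, length)
        return sorted(mums)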
Abstract:
The present study was performed to assess the interlaboratory reproducibility of the molecular detection and identification of species of Zygomycetes from formalin-fixed paraffin-embedded kidney and brain tissues obtained from experimentally infected mice. Animals were infected with one of five species (Rhizopus oryzae, Rhizopus microsporus, Lichtheimia corymbifera, Rhizomucor pusillus, and Mucor circinelloides). Samples with 1, 10, or 30 slide cuts of the tissues were prepared from each paraffin block, the sample identities were blinded for analysis, and the samples were mailed to each of seven laboratories for the assessment of sensitivity. A protocol describing the extraction method and the PCR amplification procedure was provided. The internal transcribed spacer 1 (ITS1) region was amplified by PCR with the fungal universal primers ITS1 and ITS2 and sequenced. As negative results were obtained for 93% of the tissue specimens infected by M. circinelloides, the data for this species were excluded from the analysis. Positive PCR results were obtained for 93% (52/56), 89% (50/56), and 27% (15/56) of the samples with 30, 10, and 1 slide cuts, respectively. There were minor differences, depending on the organ tissue, fungal species, and laboratory. Correct species identification was possible for 100% (30 cuts), 98% (10 cuts), and 93% (1 cut) of the cases. With the protocol used in the present study, the interlaboratory reproducibility of ITS sequencing for the identification of major Zygomycetes species from formalin-fixed paraffin-embedded tissues can reach 100%, when enough material is available.
Abstract:
Objective: To assess the reproducibility and feasibility of a musculoskeletal ultrasound (US) score for rheumatoid arthritis among rheumatologists with diverse expertise in US, working in private or hospital practice. Methods: The Swiss Sonography in Arthritis and Rheumatism (SONAR) group has developed a semi-quantitative score for RA using OMERACT criteria for synovitis and erosion. The score was taught to rheumatologists trained in US through two workshops. Subsequently, they were encouraged to practice in their offices. For the study, we used 6 US machines of different quality, each with a different patient. 19 readers, randomly selected among rheumatologists who had attended both workshops, were asked to score anonymously at least one patient. To assess whether some factors influence the score, we asked each reader to answer a questionnaire describing his experience with US. Results: 19 rheumatologists performed 29 scans, each patient having been evaluated by 4 to 6 readers. Median time for exam completion was 20 minutes (range 15 to 60 min). 53% of rheumatologists work in private practice. Graph 1 shows the global grey-scale score for each patient. Weighted kappa was calculated for each pair of readers using Stata 11. Almost all kappas indicating poor agreement were obtained with a low-quality device or by an assessor who had previously performed fewer than 5 scores himself. Conclusions: This is the first study to show a US score for RA feasible by rheumatologists with diverse expertise in US, in both private and hospital practice. Reproducibility seemed to be influenced by the quality of the device and previous experience with the score.
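The pairwise weighted kappa reported above (computed by the authors in Stata 11) can be sketched with scikit-learn; the reader identifiers and score vectors below are hypothetical, each vector holding one reader's semi-quantitative grades over the same items:

    from itertools import combinations
    from sklearn.metrics import cohen_kappa_score

    def pairwise_weighted_kappa(scores_by_reader: dict) -> dict:
        # linear weights penalize disagreements by their distance in grade
        return {
            (r1, r2): cohen_kappa_score(scores_by_reader[r1],
                                        scores_by_reader[r2],
                                        weights="linear")
            for r1, r2 in combinations(scores_by_reader, 2)
        }

    kappas = pairwise_weighted_kappa({
        "reader_1": [0, 1, 2, 1, 3],   # hypothetical grades
        "reader_2": [0, 1, 1, 1, 3],
    })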
Abstract:
The usual way to investigate the statistical properties of finitely generated subgroups of free groups, and of finite presentations of groups, is based on the so-called word-based distribution: subgroups are generated (finite presentations are determined) by randomly chosen k-tuples of reduced words, whose maximal length is allowed to tend to infinity. In this paper we adopt a different, though equally natural point of view: we investigate the statistical properties of the same objects, but with respect to the so-called graph-based distribution, recently introduced by Bassino, Nicaud and Weil. Here, subgroups (and finite presentations) are determined by randomly chosen Stallings graphs whose number of vertices tends to infinity. Our results show that these two distributions behave quite differently from each other, shedding a new light on which properties of finitely generated subgroups can be considered frequent or rare. For example, we show that malnormal subgroups of a free group are negligible in the graph-based distribution, while they are exponentially generic in the word-based distribution. Quite surprisingly, a random finite presentation generically presents the trivial group in this new distribution, while in the classical one it is known to generically present an infinite hyperbolic group.
Abstract:
The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model for the network controlling CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented into the software GINsim, which enables the definition, the analysis, and the simulation of logical regulatory graphs.
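Identifying attractors as terminal strongly connected components is a standard graph computation. A minimal networkx sketch, with the state transition graph assumed already built as a DiGraph over discrete states (GINsim itself is not used here):

    import networkx as nx

    def attractors(stg: nx.DiGraph):
        # condense the state transition graph into its DAG of SCCs;
        # an attractor is an SCC with no outgoing transition
        cond = nx.condensation(stg)
        return [cond.nodes[c]["members"]   # the set of states in the attractor
                for c in cond.nodes
                if cond.out_degree(c) == 0]

    # a two-state cycle plus a transient state: one cyclic attractor
    stg = nx.DiGraph([("000", "001"), ("001", "011"), ("011", "001")])
    print(attractors(stg))                 # [{'001', '011'}]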
Abstract:
Community studies of non-hospitalized children are essential to obtain a more thorough understanding of acute respiratory infections (ARI) and provide important information for public health authorities. This study identified a total ARI incidence rate (IR) of 4.5 per 100 child-weeks at risk and 0.78 for lower respiratory tract infections (LRI). Disease duration averaged less than one week and produced a total time ill with ARI of 5.8% and with LRI of 1.2%. No clear seasonal variation was observed, the sex-specific IR showed a higher proportion of boys becoming ill with ARI and LRI, and the peak age-specific IR occurred in infants of 6-11 months. Correlation with risk factors of the child (breastfeeding, vaccination, diarrheal disease, undernourishment) and the environment (crowding, living conditions, maternal age and education) showed marginal increases in the rate ratios, making it difficult to propose clear-cut targets for action to lower ARI and LRI morbidity. The importance of an integral maternal-child health care program and public education in the early recognition of LRI is discussed.
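The person-time rates quoted above follow the usual definition, IR = episodes / person-time at risk. A tiny worked sketch with hypothetical counts chosen only to reproduce the reported ARI figure:

    episodes = 450                   # hypothetical number of ARI episodes
    child_weeks_at_risk = 10_000     # hypothetical follow-up time
    ir = episodes / child_weeks_at_risk * 100
    print(ir)                        # 4.5 per 100 child-weeks at risk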
Abstract:
The project focused on the design and development of virtual laboratories for teaching energy management devices and methods. This was carried out at two clearly differentiated levels: the first group of laboratories covers power electronic converters, while the second covers a set of specific application cases. The first group describes the detailed operation of the different elements, whereas the second describes the basic ideas and concepts of their operation. The virtual laboratories on power electronic converters include the boost converter, the buck converter, and magnetically coupled converters. They allow the dynamic behavior to be studied from either a switched or an averaged point of view, and the applications also include the possibility of tuning the controllers. These applications were developed as a complement to in-person laboratory sessions. The application-oriented virtual laboratories mainly include the metropolitan transport system, the hybrid vehicle, and systems for managing transient cuts in the energy supply. These laboratories introduce students, in a qualitative way, to the different concepts and techniques used in energy generation, transport, and transformation systems. All the applications were developed using Easy JAVA Simulations, a tool that makes it possible to develop multiplatform laboratories easily distributable over the Internet.
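The switched-versus-averaged distinction the laboratories expose can be illustrated with the averaged (duty-cycle) model of the boost converter; the component values and fixed duty cycle below are illustrative assumptions, not taken from the project:

    from scipy.integrate import solve_ivp

    Vin, L, C, R, d = 12.0, 100e-6, 220e-6, 10.0, 0.5   # assumed component values

    def averaged_boost(t, x):
        i, v = x                              # inductor current, capacitor voltage
        di = (Vin - (1 - d) * v) / L          # averaged inductor equation
        dv = ((1 - d) * i - v / R) / C        # averaged capacitor equation
        return [di, dv]

    sol = solve_ivp(averaged_boost, (0.0, 0.05), [0.0, 0.0], max_step=1e-5)
    # the steady state tends to v = Vin / (1 - d) = 24 V for d = 0.5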