996 results for Structural-Parametrical Optimization
Abstract:
Structural unemployment is due to mismatch between available jobs and workers. We formalize this concept in a simple model of a segmented labor market with search frictions within segments. Worker mobility, job mobility and wage bargaining costs across segments generate structural unemployment. We estimate the contribution of these costs to fluctuations in US unemployment, operationalizing segments as states or industries. Most structural unemployment is due to wage bargaining costs, which are large but nevertheless contribute little to unemployment fluctuations. Structural unemployment is as cyclical as overall unemployment and no more persistent, both in the current and in previous recessions.
Abstract:
An analysis of the performance of GDP, employment and other labor market variables following the troughs in postwar U.S. business cycles points to much slower recoveries in the three most recent episodes, but does not reveal any significant change over time in the relation between GDP and employment. This leads us to characterize the last three episodes as slow recoveries, as opposed to jobless recoveries. We use the estimated New Keynesian model in Galí-Smets-Wouters (2011) to provide a structural interpretation for the slower recoveries since the early nineties.
Abstract:
This paper provides a method to estimate time-varying coefficient structural VARs which are non-recursive and potentially overidentified. The procedure allows for linear and non-linear restrictions on the parameters, maintains the multi-move structure of standard algorithms and can be used to estimate structural models with different identification restrictions. We study the transmission of monetary policy shocks and compare the results with those obtained with traditional methods.
Abstract:
Estimates for the U.S. suggest that, at least in some sectors, productivity-enhancing reallocation is the dominant factor in accounting for productivity growth. An open question, particularly relevant for developing countries, is whether reallocation is always productivity enhancing. It may be that imperfect competition or other barriers to competitive environments imply that the reallocation process is not fully efficient in these countries. Using a unique plant-level longitudinal dataset for Colombia for the period 1982-1998, we explore these issues by examining the interaction between market allocation, and productivity and profitability. Moreover, given the important trade, labor and financial market reforms in Colombia during the early 1990s, we explore whether and how the contribution of reallocation changed over the period of study. Our data permit measurement of plant-level quantities and prices. Taking advantage of the rich structure of our price data, we propose a sequential methodology to estimate productivity and demand shocks at the plant level. First, we estimate total factor productivity (TFP) with plant-level physical output data, where we use downstream demand to instrument inputs. We then turn to estimating demand shocks and mark-ups with plant-level price data, using TFP to instrument for output in the inverse demand equation. We examine the evolution of the distributions of TFP and demand shocks in response to the market reforms in the 1990s. We find that market reforms are associated with rising overall productivity that is largely driven by reallocation away from low- and towards high-productivity businesses. In addition, we find that the allocation of activity across businesses is less driven by demand factors after reforms. We find that the increase in aggregate productivity post-reform is entirely accounted for by the improved allocation of activity.
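The two-step instrumental-variables idea in this abstract can be illustrated with a small synthetic-data sketch: 2SLS for the production function with a downstream-demand instrument, then 2SLS for the inverse demand equation using the estimated TFP residual as the instrument for output. Everything below (the one-input technology, variable names and coefficients) is an illustrative assumption, not the paper's actual specification.

```python
# Sketch of the two-step estimation idea on synthetic data (illustrative
# assumptions only): (1) 2SLS production function with a downstream-demand
# instrument; (2) 2SLS inverse demand using estimated TFP as instrument.
import numpy as np

def two_sls(y, X_exog, X_endog, Z_excl):
    """Textbook 2SLS: project endogenous columns on the full instrument set,
    then use the first-stage fitted values in place of the endogenous data."""
    Z = np.column_stack([X_exog, Z_excl])              # full instrument matrix
    X = np.column_stack([X_exog, X_endog])
    X_hat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)      # first-stage fitted values
    return np.linalg.solve(X_hat.T @ X, X_hat.T @ y)   # (X_hat'X)^{-1} X_hat'y

rng = np.random.default_rng(1)
n = 5000
const = np.ones((n, 1))

tfp = rng.normal(size=n)                       # unobserved (log) plant TFP
demand_shock = rng.normal(size=n)              # unobserved plant demand shock
downstream = rng.normal(size=n)                # observed downstream-demand shifter
labor = 0.5 * downstream + 0.3 * tfp + 0.2 * demand_shock + rng.normal(size=n)
log_q = 1.0 + 0.7 * labor + tfp                # log physical output
log_p = 2.0 - 0.5 * log_q + demand_shock       # inverse demand (log price)

# Step 1: production function, instrumenting labor with downstream demand.
b_prod = two_sls(log_q, const, labor[:, None], downstream[:, None])
tfp_hat = log_q - b_prod[0] - b_prod[1] * labor        # estimated log TFP

# Step 2: inverse demand, instrumenting log output with estimated TFP.
b_dem = two_sls(log_p, const, log_q[:, None], tfp_hat[:, None])
print("output elasticity of labor:", round(float(b_prod[1]), 3),
      "| inverse-demand slope:", round(float(b_dem[1]), 3))
```

With these fabricated data the two estimates recover roughly 0.7 and -0.5, which is the point of the sequencing: the instrument for each step comes from outside the equation being estimated.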
Abstract:
Some past studies analyzed Spanish monetary policy with standard VARs. The problem is that this method obliges researchers to impose a certain extreme form of the short-run policy rule on their models; hence, it also does not allow them to study the possibility of structural changes in this rule. This paper overcomes these problems by using a structural VAR. I find that the rule has always been one of partial accommodation. Prior to 1984 it was quite close to money targeting; after 1984 it became closer to interest rate targeting, with more emphasis on the exchange rate.
Abstract:
We address the problem of scheduling a multiclass $M/M/m$ queue with Bernoulli feedback on $m$ parallel servers to minimize time-average linear holding costs. We analyze the performance of a heuristic priority-index rule, which extends Klimov's optimal solution to the single-server case: servers select preemptively customers with larger Klimov indices. We present closed-form suboptimality bounds (approximate optimality) for Klimov's rule, which imply that its suboptimality gap is uniformly bounded above with respect to (i) external arrival rates, as long as they stay within system capacity; and (ii) the number of servers. It follows that its relative suboptimality gap vanishes in a heavy-traffic limit, as external arrival rates approach system capacity (heavy-traffic optimality). We obtain simpler expressions for the special no-feedback case, where the heuristic reduces to the classical $c \mu$ rule. Our analysis is based on comparing the expected cost of Klimov's rule to the value of a strong linear programming (LP) relaxation of the system's region of achievable performance of mean queue lengths. In order to obtain this relaxation, we derive and exploit a new set of work decomposition laws for the parallel-server system. We further report on the results of a computational study on the quality of the $c \mu$ rule for parallel scheduling.
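For the no-feedback special case mentioned in the abstract, the $c \mu$ rule is easy to state in code: preemptively serve the waiting class with the largest $c_k \mu_k$ index. The sketch below uses illustrative class parameters (not taken from the paper), checks that the offered load stays within the $m$-server capacity, and lists the resulting priority order.

```python
# Minimal sketch of the classical c-mu priority rule (no-feedback case).
# Class parameters are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class JobClass:
    name: str
    arrival_rate: float    # lambda_k: external Poisson arrival rate
    service_rate: float    # mu_k: exponential service rate
    holding_cost: float    # c_k: linear holding cost per customer per unit time

classes = [
    JobClass("A", arrival_rate=0.6, service_rate=2.0, holding_cost=1.0),
    JobClass("B", arrival_rate=0.4, service_rate=1.0, holding_cost=3.0),
    JobClass("C", arrival_rate=0.5, service_rate=1.8, holding_cost=2.0),
]
m_servers = 2

# The bounds in the paper apply while the offered load stays within capacity.
rho = sum(k.arrival_rate / k.service_rate for k in classes) / m_servers
assert rho < 1.0, "external arrival rates exceed system capacity"

# c-mu index per class; servers preemptively serve the non-empty class
# with the largest index.
ranked = sorted(classes, key=lambda k: k.holding_cost * k.service_rate, reverse=True)
for rank, k in enumerate(ranked, start=1):
    print(f"priority {rank}: class {k.name}, index c*mu = {k.holding_cost * k.service_rate:.2f}")
```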
Abstract:
Following the introduction of single-metal deposition (SMD), a simplified fingermark detection technique based on multi-metal deposition (MMD), optimization studies were conducted. The different parameters of the original formula were tested, and the results were evaluated on the basis of the contrast and overall aspect of the enhanced fingermarks. A new SMD formula was established from the best-performing parameters. Interestingly, substantial variations from the base parameters did not significantly affect the outcome of the enhancement, demonstrating that SMD is a very robust technique. Finally, a comparison of the optimized SMD with MMD was carried out on different surfaces. SMD was shown to produce results comparable to MMD, thus validating the technique.
Abstract:
The interpretation of the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV) is based on a 4-factor model, which is only partially compatible with the mainstream Cattell-Horn-Carroll (CHC) model of intelligence measurement. The structure of cognitive batteries is frequently analyzed via exploratory factor analysis and/or confirmatory factor analysis. With classical confirmatory factor analysis, almost all cross-loadings between latent variables and measures are fixed to zero in order to allow the model to be identified. However, inappropriate zero cross-loadings can contribute to poor model fit, distorted factors, and biased factor correlations; most important, they do not necessarily faithfully reflect theory. To deal with these methodological and theoretical limitations, we used a new statistical approach, Bayesian structural equation modeling (BSEM), among a sample of 249 French-speaking Swiss children (8-12 years). With BSEM, zero-fixed cross-loadings between latent variables and measures are replaced by approximate zeros, based on informative, small-variance priors. Results indicated that a direct hierarchical CHC-based model with 5 factors plus a general intelligence factor represented the structure of the WISC-IV better than did the 4-factor structure and the higher-order models. Because the direct hierarchical CHC model was more adequate, it was concluded that the general factor should be considered a breadth factor rather than a superordinate factor. Because it was possible for us to estimate the influence of each latent variable on the 15 subtest scores, BSEM improved both the understanding of the structure of intelligence tests and the clinical interpretation of the subtest scores.
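The core BSEM device, replacing exact-zero cross-loadings with informative small-variance priors, can be sketched on synthetic data. The example below uses PyMC; the 2-factor, 6-indicator layout, the prior standard deviation of 0.05 and all other numbers are illustrative assumptions, not the WISC-IV model estimated in the study (which was fit to 15 subtests with five CHC factors plus a general factor).

```python
# Sketch of BSEM-style "approximate zero" cross-loadings with PyMC on
# synthetic data. Layout and numbers are illustrative assumptions.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n, n_items, n_factors = 150, 6, 2

eta_true = rng.normal(size=(n, n_factors))          # latent factor scores
Lambda_true = np.array([[0.8, 0.0], [0.7, 0.1], [0.6, 0.0],
                        [0.0, 0.9], [0.1, 0.7], [0.0, 0.6]])
y = eta_true @ Lambda_true.T + rng.normal(scale=0.5, size=(n, n_items))

# Theory-driven pattern: True where an item "belongs" to a factor.
target = np.array([[1, 0], [1, 0], [1, 0],
                   [0, 1], [0, 1], [0, 1]], dtype=bool)
prior_sd = np.where(target, 1.0, 0.05)   # diffuse vs. approximate-zero priors

with pm.Model() as bsem:
    eta = pm.Normal("eta", mu=0.0, sigma=1.0, shape=(n, n_factors))
    Lambda = pm.Normal("Lambda", mu=0.0, sigma=prior_sd, shape=(n_items, n_factors))
    resid_sd = pm.HalfNormal("resid_sd", sigma=1.0, shape=n_items)
    pm.Normal("y_obs", mu=pm.math.dot(eta, Lambda.T), sigma=resid_sd, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, target_accept=0.9, random_seed=0)

# Posterior mean loadings: cross-loadings are shrunk towards (but not fixed at) zero.
print(idata.posterior["Lambda"].mean(dim=("chain", "draw")).values.round(2))
```

In a real application one would add identification constraints (e.g., fixing the sign of a marker loading) and compare fit against the conventional zero-cross-loading CFA, as the study does.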
Abstract:
PURPOSE: To suppress the noise, by sacrificing some of the signal homogeneity for numerical stability, in uniform T1-weighted (T1w) images obtained with the magnetization-prepared 2 rapid gradient echoes sequence (MP2RAGE), and to compare the clinical utility of these robust T1w images against the uniform T1w images. MATERIALS AND METHODS: Eight healthy subjects (29.0±4.1 years; 6 male), who provided written consent, underwent two scan sessions within a 24-hour period on a 7T head-only scanner. The uniform and robust T1w image volumes were calculated inline on the scanner. Two experienced radiologists qualitatively rated the images for general image quality, 7T-specific artefacts, and local structure definition. Voxel-based and volume-based morphometry packages were used to compare the segmentation quality between the uniform and robust images. Statistical differences were evaluated using a one-sided Wilcoxon rank test. RESULTS: The robust image suppresses background noise inside and outside the skull. The inhomogeneity introduced was ranked as mild. The robust image was ranked significantly higher than the uniform image by both observers (observer 1/2, p-value = 0.0006/0.0004). In particular, improved delineation of the pituitary gland and cerebellar lobes was observed in the robust versus the uniform T1w image. The reproducibility of the segmentation results between repeat scans improved (p-value = 0.0004), from an average volumetric difference across structures of ≈6.6% for the uniform image to ≈2.4% for the robust T1w image. CONCLUSIONS: The robust T1w image enables MP2RAGE to produce clinically familiar T1w images, in addition to T1 maps, which can be readily used in standard morphometry packages.
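The "robust" image described here is, in essence, a regularized version of the standard MP2RAGE combination of the two gradient-echo readouts. A minimal sketch of such a combination follows; the placement of the regularization constant beta matches the commonly cited robust-MP2RAGE formulation, but treat the exact form and the demo numbers as assumptions and refer to the paper for the implementation actually run on the scanner.

```python
# Sketch of a regularized ("robust") MP2RAGE-style combination of the two
# complex gradient-echo readouts. Demo values are illustrative.
import numpy as np

def mp2rage_combine(gre_ti1, gre_ti2, beta=0.0):
    """beta = 0 gives the standard 'uniform' T1w combination (values in
    roughly [-0.5, 0.5]); beta > 0 trades some signal homogeneity for
    suppression of the salt-and-pepper background noise."""
    num = np.real(gre_ti1 * np.conj(gre_ti2)) - beta
    den = np.abs(gre_ti1) ** 2 + np.abs(gre_ti2) ** 2 + 2.0 * beta
    return num / den

rng = np.random.default_rng(0)

def complex_noise(n):
    return rng.normal(size=n) + 1j * rng.normal(size=n)

background1, background2 = complex_noise(1000), complex_noise(1000)  # signal-free voxels
tissue1 = 40.0 + complex_noise(1000)      # first readout, strong signal
tissue2 = -15.0 + complex_noise(1000)     # second readout

for label, s1, s2 in [("background", background1, background2),
                      ("tissue", tissue1, tissue2)]:
    uni = mp2rage_combine(s1, s2, beta=0.0)
    rob = mp2rage_combine(s1, s2, beta=50.0)
    print(f"{label:10s} uniform std={uni.std():.3f}  robust std={rob.std():.3f}")
```

The toy output shows the intended trade-off: background voxels, which fluctuate over the full range in the uniform image, collapse to a nearly constant value in the robust image, while high-signal tissue voxels are almost unchanged.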
Abstract:
This paper investigates the contribution of monetary policy to the changes in output growth and inflation dynamics in the US. We identify a policy shock and a policy rule in a time-varying coefficients VAR using robust sign restrictions. The transmission of policy shocks has been relatively stable. The variance of the policy shock has decreased over time, but policy shocks account for a small fraction of the level and of the variations in inflation and output growth volatility and persistence. We find little evidence of a significant increase in the long run response of the interest rate to inflation. A more aggressive inflation policy in the 1970s would have produced large output growth costs.
Abstract:
We present a polyhedral framework for establishing general structural properties on optimal solutions of stochastic scheduling problems, where multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's) taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.
Abstract:
Ontic structural realism is the view that structures are what is real in the first place in the domain of fundamental physics. The structures are usually conceived as including a primitive modality. However, it has not been spelled out as yet what exactly that modality amounts to. This paper proposes to fill this lacuna by arguing that the fundamental physical structures possess a causal essence, being powers. Applying the debate about causal vs. categorical properties in analytic metaphysics to ontic structural realism, I show that the standard argument against categorical and for causal properties holds for structures as well. Structural realism, as a position in the metaphysics of science that is a form of scientific realism, is committed to causal structures. The metaphysics of causal structures is supported by physics, and it can provide for a complete and coherent view of the world that includes all domains of empirical science.
Abstract:
Monitoring and management of intracranial pressure (ICP) and cerebral perfusion pressure (CPP) is a standard of care after traumatic brain injury (TBI). However, the pathophysiology of so-called secondary brain injury, i.e., the cascade of potentially deleterious events that occur in the early phase following the initial cerebral insult after TBI, is complex, involving a subtle interplay between cerebral blood flow (CBF), oxygen delivery and utilization, and the supply of the main cerebral energy substrate (glucose) to the injured brain. Regulation of this interplay depends on the type of injury and may vary individually and over time. In this setting, patient management can be a challenging task, where standard ICP/CPP monitoring may become insufficient to prevent secondary brain injury. Growing clinical evidence demonstrates that so-called multimodal brain monitoring, including brain tissue oxygen (PbtO2), cerebral microdialysis and transcranial Doppler among others, might help to optimize CBF and the delivery of oxygen/energy substrate at the bedside, thereby improving the management of secondary brain injury. Looking beyond ICP and CPP, and applying a multimodal therapeutic approach to the optimization of CBF, oxygen delivery, and brain energy supply may eventually improve the overall care of patients with head injury. This review summarizes some of the important pathophysiological determinants of secondary cerebral damage after TBI and discusses novel approaches to optimizing CBF and providing adequate oxygen and energy supply to the injured brain using multimodal brain monitoring.
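The standard ICP/CPP monitoring discussed above rests on the simple relationship CPP = MAP - ICP. A minimal sketch of that arithmetic is shown below; the threshold values are common guideline-style placeholders chosen for illustration only, not recommendations from this review.

```python
# Sketch of the basic ICP/CPP relationship used in standard neuromonitoring:
# CPP = MAP - ICP. Threshold values are illustrative placeholders, not
# clinical recommendations.
def cerebral_perfusion_pressure(map_mmhg: float, icp_mmhg: float) -> float:
    """Cerebral perfusion pressure (mmHg) from mean arterial pressure and ICP."""
    return map_mmhg - icp_mmhg

def flag_readings(map_mmhg: float, icp_mmhg: float,
                  icp_limit: float = 22.0,            # illustrative ICP threshold
                  cpp_band: tuple = (60.0, 70.0)):    # illustrative CPP target band
    cpp = cerebral_perfusion_pressure(map_mmhg, icp_mmhg)
    alerts = []
    if icp_mmhg > icp_limit:
        alerts.append(f"ICP {icp_mmhg:.0f} mmHg above {icp_limit:.0f} mmHg")
    if cpp < cpp_band[0]:
        alerts.append(f"CPP {cpp:.0f} mmHg below target band {cpp_band}")
    return cpp, alerts

print(flag_readings(map_mmhg=85.0, icp_mmhg=25.0))
```

The review's point is precisely that this arithmetic alone can be insufficient, which is why the multimodal measurements (PbtO2, microdialysis, transcranial Doppler) are added alongside it.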