768 results for Contig Creation Algorithm
Abstract:
The standard one-machine scheduling problem consists in scheduling a set of jobs on one machine that can handle only one job at a time, minimizing the maximum lateness. Each job is available for processing at its release date, requires a known processing time and, after processing finishes, is delivered after a certain time. There can also exist precedence constraints between pairs of jobs, requiring that the first job be completed before the second job can start. An extension of this problem consists in assigning a time interval between the processing of the jobs associated with the precedence constraints, known as finish-start time-lags. In the presence of these constraints, the problem is NP-hard even if preemption is allowed. In this work, we consider a special case of the one-machine preemptive scheduling problem with time-lags, where the time-lags have a chain form, and propose a polynomial algorithm to solve it. The algorithm consists of a polynomial number of calls to the preemptive version of the Longest Tail Heuristic. One application of the method is to obtain lower bounds for NP-hard one-machine and job-shop scheduling problems. We present some computational results of this application, followed by some conclusions.
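As a rough illustration of the building block mentioned in the abstract, the sketch below implements a preemptive longest-tail rule for a single machine: among the released, unfinished jobs, always process the one with the largest delivery tail, preempting at release dates. It assumes jobs are given as (release date, processing time, tail) triples; the function name and input format are illustrative and not taken from the paper, and the chain-form time-lags handled by the proposed polynomial algorithm are not modeled here.

    import heapq

    def preemptive_longest_tail(jobs):
        """Preemptive longest-tail schedule for one machine.

        jobs: list of (release_date, processing_time, tail) triples.
        Returns max_j (completion_j + tail_j), i.e. the maximum lateness
        expressed with delivery tails. Illustrative helper only.
        """
        pending = sorted(range(len(jobs)), key=lambda j: jobs[j][0])
        remaining = [p for (_, p, _) in jobs]
        ready = []                      # max-heap on tail q (store -q)
        t, i, n = 0, 0, len(jobs)
        best, done = 0, 0
        while done < n:
            # Release every job whose release date has passed.
            while i < n and jobs[pending[i]][0] <= t:
                j = pending[i]
                heapq.heappush(ready, (-jobs[j][2], j))
                i += 1
            if not ready:
                t = jobs[pending[i]][0]     # idle until the next release
                continue
            negq, j = heapq.heappop(ready)
            # Run job j until it finishes or the next release preempts it.
            next_release = jobs[pending[i]][0] if i < n else float("inf")
            run = min(remaining[j], next_release - t)
            t += run
            remaining[j] -= run
            if remaining[j] == 0:
                best = max(best, t + jobs[j][2])   # completion time plus tail
                done += 1
            else:
                heapq.heappush(ready, (negq, j))   # preempted, back to the queue
        return best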
Abstract:
Recent research in macroeconomics emphasizes the role of wage rigidity in accounting for the volatility of unemployment fluctuations. We use worker-level data from the CPS to measure the sensitivity of wages of newly hired workers to changes in aggregate labor market conditions. The wage of new hires, unlike the aggregate wage, is volatile and responds almost one-to-one to changes in labor productivity. We conclude that there is little evidence for wage stickiness in the data. We also show, however, that a little wage rigidity goes a long way in amplifying the response of job creation to productivity shocks.
Abstract:
In this paper we propose a Pyramidal Classification Algorithm which, together with an appropriate aggregation index, produces an indexed pseudo-hierarchy (in the strict sense) without inversions or crossings. The computer implementation of the algorithm makes it possible to carry out simulation tests by Monte Carlo methods in order to study the efficiency and sensitivity of the pyramidal methods of the Maximum, Minimum and UPGMA. The results shown in this paper may help to choose among the three classification methods proposed, in order to obtain the classification that best fits the original structure of the population, provided we have a priori information concerning this structure.
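For readers who want to experiment with the three aggregation indices compared above, the sketch below runs their classical hierarchical counterparts (Minimum = single linkage, Maximum = complete linkage, UPGMA = average linkage) on toy data with scipy. Note that scipy builds ordinary hierarchies rather than the indexed pseudo-hierarchies (pyramids) of the proposed algorithm, so this only approximates the comparison; the data and cluster count are arbitrary.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    # Toy data: 20 points in two well-separated 2-D clusters.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.3, (10, 2)), rng.normal(2, 0.3, (10, 2))])
    D = pdist(X)  # condensed pairwise distance matrix

    # Minimum -> single linkage, Maximum -> complete linkage, UPGMA -> average linkage.
    for name, method in [("Minimum", "single"), ("Maximum", "complete"), ("UPGMA", "average")]:
        Z = linkage(D, method=method)
        labels = fcluster(Z, t=2, criterion="maxclust")   # cut into two clusters
        print(name, labels)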
Abstract:
This paper shows that liquidity constraints restrict job creation even when labor markets are flexible. In a dynamic model of labor demand, I show that in an environment of imperfect capital and imperfect labor markets, firms use temporary contracts to relax financial constraints. Evidence for the predictions of the model is presented using Spanish data from the CBBE (Central de Balances del Banco de España, the balance sheet data of the Bank of Spain). It is shown that firms substitute temporary labor for permanent labor and use less debt as their financial position improves. In particular, the hypothesis that Spanish firms operate in an environment of free capital markets and no labor adjustment costs is rejected. The labor reform of 1984, which created temporary contracts, implied to some extent a relaxation of liquidity constraints. Accordingly, firms used these contracts more extensively and used less debt; however, as capital markets continue to be imperfect, permanent job creation continues to be slow. Consequently, relaxation of liquidity constraints should also be part of a job creation strategy.
Abstract:
We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments in the theory of prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process, then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
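The sketch below is a toy, fixed-context, frequency-based randomized predictor for a binary sequence. It is not the authors' procedure (which aggregates predictors of growing order), but it illustrates the kind of online mistake counting the abstract refers to; the context length and function name are illustrative.

    from collections import defaultdict
    import random

    def mistake_rate(bits, context_len=3, seed=0):
        """Predict each bit online from the empirical distribution of the bits
        that followed the same length-`context_len` context so far.
        Toy illustration only; `bits` is a list of 0/1 integers."""
        rng = random.Random(seed)
        counts = defaultdict(lambda: [0, 0])   # context -> [#zeros, #ones]
        mistakes = 0
        for t, b in enumerate(bits):
            ctx = tuple(bits[max(0, t - context_len):t])
            c0, c1 = counts[ctx]
            if c0 + c1 == 0:
                guess = rng.randint(0, 1)                  # unseen context: random guess
            else:
                guess = int(rng.random() < c1 / (c0 + c1)) # randomized by frequency
            mistakes += guess != b
            counts[ctx][b] += 1
        return mistakes / len(bits)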
Abstract:
Climate science indicates that climate stabilization requires low GHG emissions. Is this consistent with nondecreasing human welfare? Our welfare or utility index emphasizes education, knowledge, and the environment. We construct and calibrate a multigenerational model with intertemporal links provided by education, physical capital, knowledge and the environment. We reject discounted utilitarianism and adopt, first, the Pure Sustainability Optimization (or Intergenerational Maximin) criterion and, second, the Sustainable Growth Optimization criterion, which maximizes the utility of the first generation subject to a given future rate of growth. We apply these criteria to our calibrated model via a novel algorithm inspired by the turnpike property. The computed paths yield levels of utility higher than the level at reference year 2000 for all generations. They require doubling the fraction of labor resources devoted to the creation of knowledge relative to the reference level, whereas the fractions of labor allocated to consumption and leisure are similar to the reference ones. On the other hand, higher growth rates require substantial increases in the fraction of labor devoted to education, together with moderate increases in the fractions of labor devoted to knowledge and investment in physical capital.
Abstract:
This paper compares two well-known scan matching algorithms: the MbICP and the pIC. As a result of the study, the MSISpIC, a probabilistic scan matching algorithm for the localization of an Autonomous Underwater Vehicle (AUV), is proposed. The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS) and the robot displacement estimated through dead reckoning with the help of a Doppler Velocity Log (DVL) and a Motion Reference Unit (MRU). The proposed method is an extension of the pIC algorithm. Its major contribution consists in: 1) using an EKF to estimate the local path traveled by the robot while grabbing the scan, as well as its uncertainty, and 2) proposing a method to group all the data grabbed along the path described by the robot into a single scan with a convenient uncertainty model. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment with satisfactory results.
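For orientation, the following is a generic point-to-point ICP sketch in 2-D. The pIC and MSISpIC algorithms discussed above additionally propagate point and robot-pose covariances (via the EKF) into both the association and the minimization steps, which this sketch omits; it is not the authors' implementation.

    import numpy as np
    from scipy.spatial import cKDTree

    def icp_2d(ref, cur, iters=30):
        """Align 2-D scan `cur` (M x 2 array) onto `ref` (N x 2 array) with
        basic point-to-point ICP; returns rotation R and translation t."""
        R, t = np.eye(2), np.zeros(2)
        tree = cKDTree(ref)
        for _ in range(iters):
            moved = cur @ R.T + t
            _, idx = tree.query(moved)           # nearest-neighbour association
            matched = ref[idx]
            # Closed-form rigid registration via SVD.
            mu_c, mu_m = moved.mean(0), matched.mean(0)
            H = (moved - mu_c).T @ (matched - mu_m)
            U, _, Vt = np.linalg.svd(H)
            dR = Vt.T @ U.T
            if np.linalg.det(dR) < 0:            # guard against reflections
                Vt[-1] *= -1
                dR = Vt.T @ U.T
            dt = mu_m - dR @ mu_c
            R, t = dR @ R, dR @ t + dt           # compose incremental update
        return R, t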
Abstract:
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo α-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas from Paterson and Wegman's first-order unification algorithm. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, these reduced problems could be solved more efficiently.
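As a baseline for comparison, the sketch below implements plain Robinson-style first-order unification, the setting that nominal unification extends; binders, atom permutations and freshness constraints are absent here, and the paper's contribution is the quadratic-time decision procedure for that richer setting. The term encoding (variables as strings starting with '?', compound terms as tuples) is illustrative.

    def unify(t1, t2, subst=None):
        """First-order unification. Returns a substitution dict or None."""
        if subst is None:
            subst = {}

        def walk(t):
            # Follow variable bindings until an unbound variable or non-variable.
            while isinstance(t, str) and t.startswith('?') and t in subst:
                t = subst[t]
            return t

        def occurs(v, t):
            t = walk(t)
            if t == v:
                return True
            return isinstance(t, tuple) and any(occurs(v, a) for a in t[1:])

        t1, t2 = walk(t1), walk(t2)
        if t1 == t2:
            return subst
        if isinstance(t1, str) and t1.startswith('?'):
            if occurs(t1, t2):
                return None                      # occurs check fails
            subst[t1] = t2
            return subst
        if isinstance(t2, str) and t2.startswith('?'):
            return unify(t2, t1, subst)
        if isinstance(t1, tuple) and isinstance(t2, tuple) and t1[0] == t2[0] and len(t1) == len(t2):
            for a, b in zip(t1[1:], t2[1:]):
                if unify(a, b, subst) is None:
                    return None
            return subst
        return None                              # symbol clash

    # e.g. unify(('f', '?x', 'a'), ('f', 'b', '?y'))  ->  {'?x': 'b', '?y': 'a'}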
Abstract:
Nucleotide composition analyses of bacterial genomes, such as cumulative GC skew, highlight the atypical, strongly asymmetric architecture of the recently published chromosome of Idiomarina loihiensis L2TR, suggesting that an inversion of a 600-kb chromosomal segment occurred. The presence of 3.4-kb inverted repeated sequences at the borders of the putative rearrangement supports this hypothesis. Reverting this segment in silico restores (1) a symmetric chromosome architecture; (2) the co-orientation of transcription of all rRNA operons with DNA replication; and (3) a better conservation of gene order between this chromosome and other gamma-proteobacterial ones. Finally, long-range PCRs encompassing the ends of the 600-kb segment reveal the existence of the reverted configuration but not of the published one. This demonstrates how cumulative nucleotide-skew analyses can validate genome assemblies.
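A cumulative GC skew profile of the kind used above can be computed in a few lines: sum (G - C)/(G + C) over fixed windows along the chromosome. A large inversion such as the putative 600-kb segment shows up as a break in the otherwise V-shaped profile whose extremes mark the replication origin and terminus. The window size and function name below are illustrative.

    def cumulative_gc_skew(sequence, window=10_000):
        """Cumulative GC skew of a DNA sequence, one value per window."""
        skew, total = [], 0.0
        seq = sequence.upper()
        for start in range(0, len(seq), window):
            chunk = seq[start:start + window]
            g, c = chunk.count('G'), chunk.count('C')
            total += (g - c) / (g + c) if (g + c) else 0.0
            skew.append(total)
        return skew

    # e.g. plot cumulative_gc_skew(chromosome_sequence) and look for a break
    # in the profile around the boundaries of the suspected inversion.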
Abstract:
Background: We previously derived a clinical prognostic algorithm to identify patients with pulmonary embolism (PE) who are at low risk of short-term mortality and who could be safely discharged early or treated entirely in an outpatient setting. Objectives: To externally validate the clinical prognostic algorithm in an independent patient sample. Methods: We validated the algorithm in 983 consecutive patients prospectively diagnosed with PE at an emergency department of a university hospital. Patients with none of the algorithm's 10 prognostic variables (age ≥ 70 years, cancer, heart failure, chronic lung disease, chronic renal disease, cerebrovascular disease, pulse ≥ 110/min, systolic blood pressure < 100 mm Hg, oxygen saturation < 90%, and altered mental status) at baseline were defined as low-risk. We compared 30-day overall mortality among low-risk patients based on the algorithm between the validation and the original derivation sample. We also assessed the rate of PE-related and bleeding-related mortality among low-risk patients. Results: Overall, the algorithm classified 16.3% of patients with PE as low-risk. Mortality at 30 days was 1.9% among low-risk patients and did not differ between the validation and the original derivation sample. Among low-risk patients, only 0.6% died from definite or possible PE, and 0% died from bleeding. Conclusions: This study validates an easy-to-use clinical prognostic algorithm for PE that accurately identifies patients with PE who are at low risk of short-term mortality. Low-risk patients based on our algorithm are potential candidates for less costly outpatient treatment.
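The low-risk rule described in the Methods section can be expressed directly in code: a patient is classified as low-risk only if none of the 10 prognostic variables is present. The sketch below does exactly that; the dictionary field names are illustrative, not those of the original dataset.

    def is_low_risk(patient):
        """True if none of the 10 prognostic variables is present."""
        criteria = [
            patient["age"] >= 70,
            patient["cancer"],
            patient["heart_failure"],
            patient["chronic_lung_disease"],
            patient["chronic_renal_disease"],
            patient["cerebrovascular_disease"],
            patient["pulse"] >= 110,                 # beats per minute
            patient["systolic_bp"] < 100,            # mm Hg
            patient["oxygen_saturation"] < 90,       # percent
            patient["altered_mental_status"],
        ]
        return not any(criteria)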
Abstract:
The development and testing of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e., an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
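For reference, the standard MLEM update underlying the MLE algorithm mentioned above can be sketched as follows; the FMAPE algorithm adds the entropy prior, contrast parameter and acceleration described in the abstract, which are omitted here. Note the uniform initial image, in line with the abstract's conclusion; variable names are illustrative.

    import numpy as np

    def mlem(A, y, iters=50):
        """Maximum-likelihood EM reconstruction for emission tomography.
        A: system matrix (detector bins x image pixels), y: measured counts."""
        x = np.ones(A.shape[1])          # uniform initial image
        sens = A.sum(axis=0)             # sensitivity image, sum_i a_ij
        sens[sens == 0] = 1.0            # avoid division by zero
        for _ in range(iters):
            proj = A @ x                 # forward projection
            proj[proj == 0] = 1e-12
            x *= (A.T @ (y / proj)) / sens   # multiplicative MLEM update
        return x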
Abstract:
In the present work, an analysis of the dark and optical capacitance transients obtained from Schottky Au:GaAs barriers implanted with boron has been carried out by means of isothermal transient spectroscopy (ITS) and the differential and optical ITS techniques. Unlike deep level transient spectroscopy, the use of these techniques allows one to easily distinguish contributions to the transients different from those of the usual deep-trap emission kinetics. The results obtained show the artificial creation of the EL2, EL6, and EL5 defects by the boron implantation process. Moreover, the interaction mechanism between EL2 and other defects, which gives rise to the U band, has been analyzed. The existence of a reorganization process of the defects involved has been observed, which prevents the interaction as the temperature increases. The activation energy of this process has been found to depend on the temperature of the annealing treatment after implantation, with values of 0.51 and 0.26 eV for the as-implanted and 400 °C annealed samples, respectively. The analysis of the optical data has corroborated the existence of such interactions involving all the observed defects, which affect their optical parameters.