809 results for blocking algorithm
Abstract:
The standard one-machine scheduling problem consists in scheduling a set of jobs on one machine that can handle only one job at a time, minimizing the maximum lateness. Each job is available for processing at its release date, requires a known processing time and, after its processing finishes, is delivered after a certain time. There can also exist precedence constraints between pairs of jobs, requiring that the first job be completed before the second job can start. An extension of this problem consists in assigning a time interval between the processing of the jobs associated with the precedence constraints, known as finish-start time-lags. In the presence of these constraints, the problem is NP-hard even if preemption is allowed. In this work, we consider a special case of the one-machine preemptive scheduling problem with time-lags, where the time-lags have a chain form, and propose a polynomial algorithm to solve it. The algorithm consists of a polynomial number of calls to the preemptive version of the Longest Tail Heuristic. One application of the method is to obtain lower bounds for NP-hard one-machine and job-shop scheduling problems. We present some computational results of this application, followed by some conclusions.
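The abstract does not spell out the chain time-lag algorithm itself, but its building block, the preemptive version of the Longest Tail Heuristic, is a standard event-driven rule: always run the available job with the longest tail (delivery time), preempting whenever a newly released job has a longer tail. A minimal sketch, assuming an invented (release, processing, tail) encoding of jobs, might look like this:

    import heapq

    def preemptive_longest_tail(jobs):
        # jobs: list of (release, processing, tail) triples (assumed encoding).
        # Run, at every moment, the available job with the longest tail,
        # preempting whenever a newly released job has a longer tail.
        # Returns max over jobs of (completion time + tail).
        order = sorted(range(len(jobs)), key=lambda j: jobs[j][0])
        remaining = [p for (_, p, _) in jobs]
        avail = []                      # heap of (-tail, job index)
        t, i, best = 0, 0, 0
        while i < len(order) or avail:
            while i < len(order) and jobs[order[i]][0] <= t:
                j = order[i]
                heapq.heappush(avail, (-jobs[j][2], j))
                i += 1
            if not avail:               # machine idles until the next release date
                t = jobs[order[i]][0]
                continue
            _, j = heapq.heappop(avail)
            horizon = jobs[order[i]][0] if i < len(order) else float("inf")
            run = min(remaining[j], horizon - t)   # run until done or next release
            t += run
            remaining[j] -= run
            if remaining[j] == 0:
                best = max(best, t + jobs[j][2])   # completion plus delivery time
            else:
                heapq.heappush(avail, (-jobs[j][2], j))   # preempted, back in the pool
        return best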
Abstract:
This study describes a form of partial agonism for a CD8+ CTL clone, S15, in which perforin-dependent killing and IFN-gamma production were lost but Fas (APO1 or CD95)-dependent cytotoxicity was preserved. Cloned S15 CTL are H-2Kd restricted and specific for a photoreactive derivative of the Plasmodium berghei circumsporozoite peptide PbCS 252-260 (SYIPSAEKI). The presence of a photoactivatable group in the epitope permitted assessment of TCR-ligand binding by TCR photoaffinity labeling. Selective activation of Fas-dependent killing was observed for a peptide-derivative variant containing a modified photoreactive group. A similar functional response was obtained after binding of the wild-type peptide derivative upon blocking of CD8 participation in TCR-ligand binding. The epitope modification or blocking of CD8 resulted in a ≥ 8-fold decrease in TCR-ligand binding. In both cases, phosphorylation of zeta-chain and ZAP-70, as well as calcium mobilization, were reduced close to background levels, indicating that activation of Fas-dependent cytotoxicity required weaker TCR signaling than activation of perforin-dependent killing or IFN-gamma production. Consistent with this, we observed that depletion of the protein tyrosine kinase p56(lck) by preincubation of S15 CTL with herbimycin A severely impaired perforin- but not Fas-dependent cytotoxicity. Together with the observation that S15 CTL constitutively express Fas ligand, these results indicate that TCR signaling too weak to elicit perforin-dependent cytotoxicity or cytokine production can induce Fas-dependent cytotoxicity, possibly by translocation of preformed Fas ligand to the cell surface.
Abstract:
In this paper we propose a Pyramidal Classification Algorithm, which together with an appropriate aggregation index produces an indexed pseudo-hierarchy (in the strict sense) without inversions or crossings. The computer implementation of the algorithm makes it possible to carry out simulation tests by Monte Carlo methods in order to study the efficiency and sensitivity of the pyramidal methods of the Maximum, the Minimum and UPGMA. The results shown in this paper may help to choose among the three classification methods proposed, in order to obtain the classification that best fits the original structure of the population, provided we have a priori information concerning this structure.
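For orientation, the three aggregation indices compared (Minimum, Maximum, UPGMA) correspond to single, complete and average linkage. The sketch below uses SciPy's ordinary (non-pyramidal) hierarchical clustering on synthetic data to mirror the spirit of such a Monte Carlo comparison; the random data and the cophenetic-correlation criterion are illustrative assumptions, not the paper's protocol.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, cophenet
    from scipy.spatial.distance import pdist

    # Compare the three aggregation indices (Minimum = single, Maximum = complete,
    # UPGMA = average linkage) on one synthetic data set.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 4))          # synthetic population: 50 items, 4 variables
    D = pdist(X)                          # pairwise distances
    for method in ("single", "complete", "average"):
        Z = linkage(D, method=method)
        c, _ = cophenet(Z, D)             # cophenetic correlation: fit to the original structure
        print(f"{method:>8}: cophenetic correlation = {c:.3f}")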
Abstract:
Intratumoural (i.t.) injection of radio-iododeoxyuridine (IdUrd), a thymidine (dThd) analogue, is envisaged for targeted Auger electron- or beta-radiation therapy of glioblastoma. Here, biodistribution of [(125)I]IdUrd was evaluated 5 hr after i.t. injection in subcutaneous human glioblastoma xenografts LN229 after different intravenous (i.v.) pretreatments with fluorodeoxyuridine (FdUrd). FdUrd is known to block de novo dThd synthesis, thus favouring DNA incorporation of radio-IdUrd. Results showed that pretreatment with 2 mg/kg FdUrd i.v. in 2 fractions 0.5 hr and 1 hr before injection of radio-IdUrd resulted in a mean tumour uptake of 19.8% of injected dose (% ID), representing 65.3% ID/g for tumours of approx. 0.35 g. Tumour uptake of radio-IdUrd in non-pretreated mice was only 4.1% ID. Very low uptake was observed in normal nondividing and dividing tissues, with a maximum concentration of 2.9% ID/g measured in spleen. Pretreatment with a higher FdUrd dose of 10 mg/kg prolonged the increased tumour uptake of radio-IdUrd up to 5 hr. A competition experiment was performed in FdUrd-pretreated mice using i.t. co-injection of excess dThd, which resulted in very low tumour retention of [(125)I]IdUrd. DNA isolation experiments showed that on average >95% of tumour (125)I activity was incorporated into DNA. In conclusion, these results show that close to 20% ID of radio-IdUrd injected i.t. was incorporated in tumour DNA after i.v. pretreatment with clinically relevant doses of FdUrd and that this approach may be further exploited for diffusion and therapy studies with Auger electron- and/or beta-radiation-emitting radio-IdUrd.
Abstract:
We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments of the theory of the prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process, then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
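As a hedged illustration of the individual-sequence machinery the abstract draws on (not the paper's actual procedure, which the abstract does not detail), here is a toy randomized forecaster that mixes two constant experts with exponential weights:

    import numpy as np

    def randomized_weighted_predictor(bits, eta=0.5, seed=0):
        # Two trivial experts (always predict 0 / always predict 1) are combined
        # with exponential weights; at each step the forecaster predicts 1 with
        # probability proportional to the weight of the "predict 1" expert.
        # Returns the empirical mistake rate on the given bit sequence.
        rng = np.random.default_rng(seed)
        w = np.ones(2)                      # weights of the two experts
        mistakes = 0
        for y in bits:
            p1 = w[1] / w.sum()             # probability of predicting 1
            pred = int(rng.random() < p1)   # randomized prediction
            mistakes += int(pred != y)
            losses = np.array([y != 0, y != 1], dtype=float)
            w *= np.exp(-eta * losses)      # exponential weight update
        return mistakes / max(len(bits), 1)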
Abstract:
This paper compares two well-known scan matching algorithms: the MbICP and the pIC. As a result of the study, the MSISpIC, a probabilistic scan matching algorithm for the localization of an Autonomous Underwater Vehicle (AUV), is proposed. The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS), and the robot displacement estimated through dead reckoning with the help of a Doppler Velocity Log (DVL) and a Motion Reference Unit (MRU). The proposed method is an extension of the pIC algorithm. Its major contribution consists in: 1) using an EKF to estimate the local path traveled by the robot while grabbing the scan, as well as its uncertainty, and 2) proposing a method to group into a unique scan, with a convenient uncertainty model, all the data grabbed along the path described by the robot. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment with satisfactory results.
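As an illustration of the first contribution, a single EKF prediction step for DVL/MRU dead reckoning might look as follows; the two-dimensional state, the simple kinematic model and all names are assumptions made for this sketch, not the paper's actual filter:

    import numpy as np

    def dead_reckoning_ekf_predict(x, P, v_body, yaw, dt, Q):
        # x = [north, east] position, P its covariance (assumed state layout);
        # v_body = [surge, sway] velocity from a DVL, yaw from an MRU.
        # One prediction step: integrate the velocity in the world frame and
        # propagate the uncertainty of the locally travelled path.
        c, s = np.cos(yaw), np.sin(yaw)
        R = np.array([[c, -s], [s, c]])      # body-to-world rotation
        x_new = x + dt * R @ v_body          # integrate the velocity
        F = np.eye(2)                        # Jacobian of this motion model w.r.t. x
        P_new = F @ P @ F.T + Q              # propagate uncertainty
        return x_new, P_new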
Abstract:
Nominal Unification is an extension of first-order unification where terms can contain binders and unification is performed modulo α-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas of Paterson and Wegman for first-order unification. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, we could solve these reduced problems more efficiently.
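A toy sketch of the reduced constraints (atom equalities and freshness modulo a permutation given as a list of swappings) is shown below. It is a naive, brute-force check written only to make the shape of the reduced problems concrete; it does not reflect the quadratic-time decision procedure or the unbalanced-merging idea.

    def apply_perm(perm, atom):
        # Apply a permutation, given as a list of swappings (a, b), to an atom;
        # the first swapping in the list is applied last (outermost).
        for a, b in reversed(perm):
            if atom == a:
                atom = b
            elif atom == b:
                atom = a
        return atom

    def check_reduced_constraints(perm, equalities, freshness):
        # equalities: pairs (a, b) requiring perm(a) == b.
        # freshness:  pairs (a, b) requiring atom a to differ from perm(b)
        #             (for atoms, a # b simply means a != b).
        ok_eq = all(apply_perm(perm, a) == b for a, b in equalities)
        ok_fr = all(a != apply_perm(perm, b) for a, b in freshness)
        return ok_eq and ok_fr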
Abstract:
Summary Background: We previously derived a clinical prognostic algorithm to identify patients with pulmonary embolism (PE) who are at low-risk of short-term mortality who could be safely discharged early or treated entirely in an outpatient setting. Objectives: To externally validate the clinical prognostic algorithm in an independent patient sample. Methods: We validated the algorithm in 983 consecutive patients prospectively diagnosed with PE at an emergency department of a university hospital. Patients with none of the algorithm's 10 prognostic variables (age >/= 70 years, cancer, heart failure, chronic lung disease, chronic renal disease, cerebrovascular disease, pulse >/= 110/min., systolic blood pressure < 100 mm Hg, oxygen saturation < 90%, and altered mental status) at baseline were defined as low-risk. We compared 30-day overall mortality among low-risk patients based on the algorithm between the validation and the original derivation sample. We also assessed the rate of PE-related and bleeding-related mortality among low-risk patients. Results: Overall, the algorithm classified 16.3% of patients with PE as low-risk. Mortality at 30 days was 1.9% among low-risk patients and did not differ between the validation and the original derivation sample. Among low-risk patients, only 0.6% died from definite or possible PE, and 0% died from bleeding. Conclusions: This study validates an easy-to-use, clinical prognostic algorithm for PE that accurately identifies patients with PE who are at low-risk of short-term mortality. Low-risk patients based on our algorithm are potential candidates for less costly outpatient treatment.
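Expressed as code, the low-risk rule amounts to checking that none of the ten prognostic variables is present; the field names in this sketch are invented for illustration and are not from the original study.

    def is_low_risk(patient):
        # patient: dict with the (illustrative) keys used below.
        # A patient is low-risk when none of the ten prognostic variables holds.
        criteria = [
            patient["age"] >= 70,
            patient["cancer"],
            patient["heart_failure"],
            patient["chronic_lung_disease"],
            patient["chronic_renal_disease"],
            patient["cerebrovascular_disease"],
            patient["pulse"] >= 110,              # beats per minute
            patient["systolic_bp"] < 100,         # mm Hg
            patient["oxygen_saturation"] < 90,    # percent
            patient["altered_mental_status"],
        ]
        return not any(criteria)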
Abstract:
We present some results obtained with different algorithms for the Fm|block|Cmax problem, using the well-known Taillard instances as experimental data.
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) Algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
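For context, the plain MLE step that FMAPE builds on is the familiar multiplicative MLEM update, started from a uniform image as the abstract prescribes. The sketch below shows only that step and omits the entropy prior, the contrast parameter and the acceleration exponent that distinguish FMAPE; the system matrix and data are whatever the caller supplies.

    import numpy as np

    def mlem(A, y, n_iter=50):
        # A: system matrix (detector bins x image voxels), y: measured counts.
        # Plain maximum-likelihood EM reconstruction for emission tomography,
        # started from a uniform image (the correct choice absent prior knowledge).
        x = np.ones(A.shape[1])                    # uniform initial image
        sens = A.sum(axis=0)                       # sensitivity image, A^T 1
        for _ in range(n_iter):
            proj = A @ x                           # forward projection
            ratio = np.divide(y, proj, out=np.zeros_like(y, dtype=float),
                              where=proj > 0)
            x *= (A.T @ ratio) / np.maximum(sens, 1e-12)   # multiplicative MLEM update
        return x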
Abstract:
TNFalpha blocking agents are effective and essential tools in the management of many inflammatory conditions including rheumatoid arthritis, spondylarthropathies and chronic inflammatory bowel disease. With time, some known side-effects have gained in importance and others have appeared. This article focuses on the potential risks of infection and autoimmunity induced by TNFalpha blocking agents and on the strategy to prevent and treat such adverse events.
Abstract:
We consider stochastic partial differential equations with multiplicative noise. We derive an algorithm for the computer simulation of these equations. The algorithm is applied to study domain growth of a model with a conserved order parameter. The numerical results corroborate previous analytical predictions obtained by linear analysis.
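A generic illustration of simulating such an equation on a grid is sketched below. The specific equation, the naive Euler-Maruyama discretization and the non-conserved dynamics are assumptions made for illustration; they do not reproduce the algorithm derived in the paper, which moreover studies a conserved order parameter.

    import numpy as np

    def simulate_spde(L=64, dx=1.0, dt=0.01, steps=1000, noise_amp=0.1, seed=0):
        # Naive Euler scheme for a 1-D stochastic PDE with multiplicative noise:
        #   d phi = (laplacian(phi) + phi - phi**3) dt + noise_amp * phi dW,
        # with space-time white noise discretized as sqrt(dt/dx) * N(0, 1) per site.
        rng = np.random.default_rng(seed)
        phi = 0.01 * rng.standard_normal(L)        # small random initial condition
        for _ in range(steps):
            lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
            drift = lap + phi - phi**3
            noise = noise_amp * phi * rng.standard_normal(L) * np.sqrt(dt / dx)
            phi = phi + dt * drift + noise
        return phi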
Abstract:
We apply majorization theory to study the quantum algorithms known so far and find that there is a majorization principle underlying the way they operate. Grover's algorithm is a neat instance of this principle, where majorization works step by step until the optimal target state is found. Extensions of this situation are also found in algorithms based on quantum adiabatic evolution and in the family of quantum phase-estimation algorithms, including Shor's algorithm. We state that in quantum algorithms the time arrow is a majorization arrow.
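A toy simulation of Grover's algorithm makes the step-by-step behaviour easy to inspect: each iteration yields the probability distribution over basis states, which, per the abstract's claim, majorizes the previous one as weight accumulates on the marked state. The problem size and marked index below are arbitrary choices for illustration.

    import numpy as np

    def grover_distributions(n_qubits=4, marked=3, steps=4):
        # Start from the uniform superposition and apply Grover iterations
        # (oracle sign flip on the marked state, then inversion about the mean).
        # Return the probability distribution over basis states after each step.
        N = 2 ** n_qubits
        amp = np.full(N, 1.0 / np.sqrt(N))
        dists = [amp ** 2]
        for _ in range(steps):
            amp[marked] *= -1                 # oracle: flip the marked amplitude
            amp = 2 * amp.mean() - amp        # diffusion: inversion about the mean
            dists.append(amp ** 2)
        return dists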