814 results for Load disaggregation algorithm
Abstract:
The standard one-machine scheduling problem consists in scheduling a set of jobs on one machine, which can handle only one job at a time, minimizing the maximum lateness. Each job becomes available for processing at its release date, requires a known processing time, and, after processing finishes, is delivered after a certain time. There can also be precedence constraints between pairs of jobs, requiring that the first job be completed before the second can start. An extension of this problem assigns a time interval between the processing of the jobs associated with the precedence constraints, known as finish-start time-lags. In the presence of these constraints, the problem is NP-hard even if preemption is allowed. In this work, we consider a special case of the one-machine preemptive scheduling problem with time-lags, where the time-lags have a chain form, and propose a polynomial algorithm to solve it. The algorithm consists of a polynomial number of calls to the preemptive version of the Longest Tail Heuristic. One application of the method is to obtain lower bounds for NP-hard one-machine and job-shop scheduling problems. We present some computational results of this application, followed by some conclusions.
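The preemptive Longest Tail Heuristic that the abstract builds on can be sketched as follows: always run the available job with the largest delivery (tail) time, preempting whenever a new release makes a longer-tailed job available. This is a minimal illustration of that rule alone, not of the paper's chained time-lag algorithm; the function name and job encoding are ours.

```python
from heapq import heappush, heappop

def preemptive_longest_tail(jobs):
    """Preemptive Longest Tail rule for one-machine scheduling.

    jobs: list of (release, processing, delivery) triples.
    Returns max over jobs of (completion time + delivery time),
    the objective this rule minimizes when preemption is allowed.
    """
    n = len(jobs)
    order = sorted(range(n), key=lambda j: jobs[j][0])  # by release date
    remaining = [p for _, p, _ in jobs]
    ready = []                 # max-heap on delivery time via negation
    t, i, obj = 0, 0, 0
    while i < n or ready:
        if not ready:          # machine idle: jump to next release
            t = max(t, jobs[order[i]][0])
        while i < n and jobs[order[i]][0] <= t:
            j = order[i]
            heappush(ready, (-jobs[j][2], j))
            i += 1
        negq, j = heappop(ready)
        nxt = jobs[order[i]][0] if i < n else float('inf')
        run = min(remaining[j], nxt - t)   # run until done or next release
        t += run
        remaining[j] -= run
        if remaining[j] == 0:
            obj = max(obj, t - negq)       # completion time + tail q_j
        else:
            heappush(ready, (negq, j))     # preempted; resume later
    return obj
```

Because preemption lets a newly released long-tail job interrupt the current one, the rule is optimal for the unconstrained preemptive problem and hence usable as a lower-bounding subroutine.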
Abstract:
To investigate the effect of age and change in body composition on the increase in energy expenditure consecutive to the ingestion of a 75-g glucose load, respiratory exchange measurements were performed on 24 subjects, 12 elderly (mean ± SEM, 73 ± 1 yr) and 12 young (25 ± 1 yr). The body weight was comparable, 62 ± 2 kg in the elderly group vs 61 ± 3 kg in the young, but the body fat content of the elderly group was significantly greater than that of the young (29 ± 2% vs 19 ± 2%, p < 0.001). The elderly group presented a slight glucose intolerance according to the World Health Organization (WHO) criteria, with a 120-min plasma glucose of 149 ± 9 mg/dl (p < 0.005 vs young). The postabsorptive resting energy expenditure (REE) was 0.83 ± 0.03 kcal/min in the elderly group vs 0.98 ± 0.04 in the young (p < 0.02); this decrease of 15% was mainly related to the decrease in fat-free mass (FFM) in the elderly group, which averaged 14%. The difference was not significant when REE was expressed per kg FFM. The glucose-induced thermogenesis (GIT), expressed as percent of the energy content of the load, was 6.2 ± 0.6% in the elderly group and 8.9 ± 0.9% in the young (p < 0.05). It is concluded that glucose-induced thermogenesis is decreased in elderly subjects. However, when expressed per kg FFM, the increment in energy expenditure (EE) in response to the glucose load is not different in elderly subjects, suggesting that the decrease in thermogenesis may be attributed to the age-related decrease in FFM.
Abstract:
BACKGROUND: To test the inflammatory origin of cardiovascular disease, as opposed to its origin in western lifestyle, we performed a population-based assessment of the prevalences of cardiovascular risk factors and cardiovascular disease in an inflammation-prone African population, including electrocardiography and ankle-arm index measurement, and compared the results with known prevalences in American and European societies. METHODOLOGY/PRINCIPAL FINDINGS: The setting was a traditional population in rural Ghana, characterised by adverse environmental conditions and a high infectious load, with a population-based sample of 924 individuals aged 50 years and older. We determined median values for cardiovascular risk factors, including waist circumference, BMI, blood pressure, and markers of glucose and lipid metabolism and inflammation, as well as the prevalence of myocardial infarction detected by electrocardiography and the prevalence of peripheral arterial disease detected by ankle-arm index. Compared to western societies, the Ghanaians had more proinflammatory profiles and fewer cardiovascular risk factors, including obesity, dysglycaemia, dyslipidaemia, and hypertension. Prevalences of cardiovascular disease were also lower: definite myocardial infarction was present in 1.2% (95% CI: 0.6 to 2.4%), and peripheral arterial disease in 2.8% (95% CI: 1.9 to 4.1%). CONCLUSIONS/SIGNIFICANCE: Taken together, our data indicate that inflammatory processes alone do not suffice for the pathogenesis of cardiovascular disease; additional factors, probably lifestyle-related, are required.
Abstract:
In this paper we propose a Pyramidal Classification Algorithm, which together with an appropriate aggregation index produces an indexed pseudo-hierarchy (in the strict sense) without inversions or crossings. The computer implementation of the algorithm makes it possible to carry out simulation tests by Monte Carlo methods in order to study the efficiency and sensitivity of the pyramidal methods of the Maximum, the Minimum, and UPGMA. The results shown in this paper may help to choose among the three classification methods proposed, in order to obtain the classification that best fits the original structure of the population, provided we have a priori information concerning this structure.
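The three aggregation indexes compared above differ only in how the distance between two clusters is computed from the pairwise point distances: the Maximum (complete linkage), the Minimum (single linkage), and UPGMA (average linkage). A minimal sketch, with an illustrative function name and a plain distance-matrix layout:

```python
def aggregation_index(c1, c2, dist, method):
    """Inter-cluster distance under the three linkage rules.

    c1, c2: iterables of point indices in two clusters.
    dist:   symmetric distance matrix (list of lists).
    method: "maximum" (complete), "minimum" (single), or "upgma" (average).
    """
    pair_dists = [dist[i][j] for i in c1 for j in c2]
    if method == "maximum":
        return max(pair_dists)          # farthest pair of points
    if method == "minimum":
        return min(pair_dists)          # closest pair of points
    if method == "upgma":
        return sum(pair_dists) / len(pair_dists)  # mean over all pairs
    raise ValueError("unknown method: %s" % method)
```

Because the three rules order cluster merges differently, Monte Carlo comparison on populations with known structure, as the abstract describes, is a natural way to assess which best recovers that structure.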
Abstract:
Active protein disaggregation by a chaperone network composed of ClpB and DnaK + DnaJ + GrpE is essential for the recovery of stress-induced protein aggregates in vitro and in Escherichia coli cells. K-glutamate and glycine-betaine (betaine) naturally accumulate in salt-stressed cells. In addition to providing thermo-protection to native proteins, we found that these osmolytes can strongly and specifically activate ClpB, resulting in an increased efficiency of chaperone-mediated protein disaggregation. Moreover, factors that inhibited the chaperone network by impairing the stability of the ClpB oligomer, such as natural polyamines, dilution, or high salt, were efficiently counteracted by K-glutamate or betaine. The combined protective, counteracting, and net activatory effects of K-glutamate and betaine allowed protein disaggregation and refolding at heat-shock temperatures that otherwise cause protein aggregation in vitro and in the cell. Mesophilic organisms may thus benefit from a thermotolerant osmolyte-activated chaperone mechanism that can actively rescue protein aggregates, correctly refold them, and maintain them in a native state under heat-shock conditions.
Abstract:
We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments of the theory of the prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
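The flavour of randomized sequential prediction can be sketched with an exponentially weighted forecaster over two constant experts (always-0 and always-1) that predicts by randomized rounding of the weighted vote. This is a deliberately minimal stand-in, not the paper's construction for stationary ergodic processes; the function name and parameters are illustrative.

```python
import math
import random

def randomized_ewa_predict(bits, eta=0.5, seed=0):
    """Randomized exponentially weighted prediction of a binary sequence.

    Maintains weights for two experts (predict 0, predict 1), predicts 1
    with probability proportional to the weight of the 1-expert, and
    exponentially penalizes whichever expert was wrong. Returns the
    number of prediction mistakes on `bits`.
    """
    rng = random.Random(seed)
    w = [1.0, 1.0]                       # weights of the 0- and 1-experts
    mistakes = 0
    for b in bits:
        p1 = w[1] / (w[0] + w[1])        # probability of predicting 1
        pred = 1 if rng.random() < p1 else 0
        if pred != b:
            mistakes += 1
        # multiplicative update: the wrong expert loses weight
        w[0] *= math.exp(-eta * (b != 0))
        w[1] *= math.exp(-eta * (b != 1))
    return mistakes
```

On a sequence dominated by one symbol the matching expert's weight quickly dominates, so the expected number of mistakes stays bounded; richer expert classes (e.g. Markov experts of growing order) are what allow convergence to the Bayes predictor.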
Abstract:
This paper compares two well-known scan matching algorithms, MbICP and pIC. As a result of the study, we propose MSISpIC, a probabilistic scan matching algorithm for the localization of an Autonomous Underwater Vehicle (AUV). The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS) and the robot displacement estimated through dead-reckoning with the help of a Doppler Velocity Log (DVL) and a Motion Reference Unit (MRU). The proposed method is an extension of the pIC algorithm. Its major contributions are: 1) using an EKF to estimate the local path traveled by the robot while grabbing the scan, as well as its uncertainty, and 2) a method to group all the data grabbed along the robot's path into a single scan with a convenient uncertainty model. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, with satisfactory results.
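For orientation, the common ancestor of MbICP and pIC is the point-to-point ICP loop: alternate nearest-neighbour association with a closed-form rigid alignment. The probabilistic variants discussed above additionally weight correspondences by their uncertainty. A bare 2D sketch with illustrative names, not the MSISpIC implementation:

```python
import math

def icp_2d(ref, scan, iters=10):
    """Minimal point-to-point 2D ICP.

    ref, scan: lists of (x, y) points. Returns (theta, tx, ty), the
    rigid transform that maps `scan` onto `ref`.
    """
    theta, tx, ty = 0.0, 0.0, 0.0
    for _ in range(iters):
        c, s = math.cos(theta), math.sin(theta)
        moved = [(c * x - s * y + tx, s * x + c * y + ty) for x, y in scan]
        # associate each transformed scan point with its nearest reference point
        pairs = [(p, min(ref, key=lambda r: (r[0] - p[0]) ** 2 + (r[1] - p[1]) ** 2))
                 for p in moved]
        mx = sum(p[0] for p, _ in pairs) / len(pairs)
        my = sum(p[1] for p, _ in pairs) / len(pairs)
        nx = sum(q[0] for _, q in pairs) / len(pairs)
        ny = sum(q[1] for _, q in pairs) / len(pairs)
        # optimal incremental rotation from the centred correspondences
        num = sum((p[0] - mx) * (q[1] - ny) - (p[1] - my) * (q[0] - nx)
                  for p, q in pairs)
        den = sum((p[0] - mx) * (q[0] - nx) + (p[1] - my) * (q[1] - ny)
                  for p, q in pairs)
        dth = math.atan2(num, den)
        c2, s2 = math.cos(dth), math.sin(dth)
        dtx, dty = nx - (c2 * mx - s2 * my), ny - (s2 * mx + c2 * my)
        # compose the increment with the running estimate
        theta, tx, ty = theta + dth, c2 * tx - s2 * ty + dtx, s2 * tx + c2 * ty + dty
    return theta, tx, ty
```

A slowly rotating MSIS sonar produces each scan over several seconds of vehicle motion, which is why the paper's EKF-based scan grouping step is needed before any such matching can be applied.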
Abstract:
In a previous paper a novel Generalized Multiobjective Multitree model (GMM-model) was proposed. This model considers, for the first time, multitree-multicast load balancing with splitting in a multiobjective context, whose mathematical solution is a whole Pareto optimal set that can include more solutions than it has been possible to find in the publications surveyed. To solve the GMM-model, this paper proposes a multiobjective evolutionary algorithm (MOEA) inspired by the Strength Pareto Evolutionary Algorithm (SPEA). Experimental results considering up to 11 different objectives are presented for the well-known NSF network, with two simultaneous data flows.
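The core operation of any SPEA-style MOEA is extracting the nondominated (Pareto optimal) members of the population, which SPEA maintains in an external archive. A minimal sketch for minimization objectives, with illustrative function names:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives
    no worse, at least one strictly better; minimization)."""
    return all(x <= y for x, y in zip(a, b)) and \
           any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the nondominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points)]
```

With 11 simultaneous objectives, as in the experiments above, almost every individual tends to be mutually nondominated, which is why SPEA-style strength and density measures are needed on top of plain dominance.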
Abstract:
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo α-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas from Paterson and Wegman's work on first-order unification. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, these reduced problems could be solved more efficiently.
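As background, plain first-order syntactic unification, which nominal unification extends with binders and α-equivalence, can be sketched as below. The term encoding (variables as strings starting with '?', compound terms as functor-headed tuples) is illustrative:

```python
def walk(t, subst):
    """Follow variable bindings in the substitution."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """Occurs check: does variable v appear inside term t?"""
    t = walk(t, subst)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, subst) for a in t[1:])

def unify(t1, t2, subst=None):
    """Return a unifying substitution (dict) or None if none exists."""
    if subst is None:
        subst = {}
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if isinstance(t1, str) and t1.startswith('?'):
        if occurs(t1, t2, subst):
            return None
        subst[t1] = t2
        return subst
    if isinstance(t2, str) and t2.startswith('?'):
        if occurs(t2, t1, subst):
            return None
        subst[t2] = t1
        return subst
    if isinstance(t1, tuple) and isinstance(t2, tuple) and \
            t1[0] == t2[0] and len(t1) == len(t2):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None
```

The nominal setting additionally threads atom permutations and freshness constraints through this recursion, which is where the quadratic-time analysis above comes in.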
Abstract:
Summary Background: We previously derived a clinical prognostic algorithm to identify patients with pulmonary embolism (PE) who are at low-risk of short-term mortality who could be safely discharged early or treated entirely in an outpatient setting. Objectives: To externally validate the clinical prognostic algorithm in an independent patient sample. Methods: We validated the algorithm in 983 consecutive patients prospectively diagnosed with PE at an emergency department of a university hospital. Patients with none of the algorithm's 10 prognostic variables (age ≥ 70 years, cancer, heart failure, chronic lung disease, chronic renal disease, cerebrovascular disease, pulse ≥ 110/min, systolic blood pressure < 100 mm Hg, oxygen saturation < 90%, and altered mental status) at baseline were defined as low-risk. We compared 30-day overall mortality among low-risk patients based on the algorithm between the validation and the original derivation sample. We also assessed the rate of PE-related and bleeding-related mortality among low-risk patients. Results: Overall, the algorithm classified 16.3% of patients with PE as low-risk. Mortality at 30 days was 1.9% among low-risk patients and did not differ between the validation and the original derivation sample. Among low-risk patients, only 0.6% died from definite or possible PE, and 0% died from bleeding. Conclusions: This study validates an easy-to-use, clinical prognostic algorithm for PE that accurately identifies patients with PE who are at low-risk of short-term mortality. Low-risk patients based on our algorithm are potential candidates for less costly outpatient treatment.
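The low-risk rule described is simply the conjunction "none of the 10 variables is present", so it can be sketched directly from the list in the abstract. The field names below are illustrative, not the study's data dictionary:

```python
def is_low_risk(pt):
    """Classify a PE patient as low-risk when none of the 10 prognostic
    variables from the abstract is present. `pt` is a dict of vitals
    and comorbidity flags (illustrative field names)."""
    return not (
        pt["age"] >= 70
        or pt["cancer"]
        or pt["heart_failure"]
        or pt["chronic_lung_disease"]
        or pt["chronic_renal_disease"]
        or pt["cerebrovascular_disease"]
        or pt["pulse"] >= 110                 # beats/min
        or pt["systolic_bp"] < 100            # mm Hg
        or pt["oxygen_saturation"] < 90       # percent
        or pt["altered_mental_status"]
    )
```

A single positive variable (for example age ≥ 70) is enough to exclude a patient from the low-risk group, which is what makes the rule easy to apply at the bedside.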
Abstract:
Development of the mathematical models needed to optimally control the microgrid at the laboratories of the Institut de Recerca en Energia de Catalunya. The algorithms will be implemented to simulate the microgrid's behaviour and will then be programmed directly on the microgrid elements to verify their correct operation.
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
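The MLE iteration referred to above is the multiplicative EM update for Poisson emission data, started from the uniform image the abstract recommends. A pure-Python sketch for a toy system matrix (the FMAPE entropy-prior and acceleration terms are omitted; names are illustrative):

```python
def mlem(system, counts, iters=20):
    """Maximum-likelihood EM reconstruction for emission tomography.

    system: detection-probability matrix A as a list of rows, one per
            detector bin, each of length n_pixels.
    counts: measured counts per detector bin.
    Applies x_j <- x_j / s_j * sum_i A_ij * counts_i / (A x)_i,
    where s_j = sum_i A_ij is the pixel sensitivity.
    """
    npix = len(system[0])
    x = [1.0] * npix                       # uniform initial image
    for _ in range(iters):
        # forward projection: expected counts under the current image
        proj = [sum(row[j] * x[j] for j in range(npix)) for row in system]
        for j in range(npix):
            sens = sum(row[j] for row in system)
            # backproject the measured/expected ratios, normalize by sensitivity
            x[j] *= sum(row[j] * counts[i] / proj[i]
                        for i, row in enumerate(system)) / sens
    return x
```

The update is multiplicative, so a uniform positive start keeps every pixel reachable, whereas any pixel started at zero stays at zero, which is one way to see why the uniform field is the right uninformed initial image.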