814 results for "Load disaggregation algorithm"
Abstract:
Examples from the Murray-Darling basin in Australia are used to illustrate different methods of disaggregation of reconnaissance-scale maps. One approach to disaggregation revolves around the de-convolution of the soil-landscape paradigm elaborated during a soil survey. The descriptions of soil map units and block diagrams in a soil survey report detail soil-landscape relationships or soil toposequences that can be used to disaggregate map units into component landscape elements. Toposequences can be visualised on a computer by combining soil maps with digital elevation data. Expert knowledge or statistics can be used to implement the disaggregation; the use of a restructuring element and of k-means clustering is illustrated. Another approach to disaggregation uses training areas to develop rules for extrapolating detailed mapping into other, larger areas where detailed mapping is unavailable. A two-level decision tree example is presented: at one level, the decision tree method is used to capture mapping rules from the training area; at the other, it is used to define the domain over which those rules can be extrapolated. (C) 2001 Elsevier Science B.V. All rights reserved.
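The k-means route can be sketched in a few lines: a scalar terrain attribute (elevation, say) sampled within one map unit is partitioned into candidate landscape elements. This is only an illustrative sketch under made-up values, not the survey-specific procedure of the paper; the attribute samples and cluster count are assumptions.

```python
import random

def kmeans_1d(values, k, iters=50, seed=0):
    """Plain k-means on a scalar terrain attribute (e.g. elevation)."""
    rng = random.Random(seed)
    centroids = rng.sample(values, k)
    for _ in range(iters):
        # assign each cell to its nearest centroid
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda j: abs(v - centroids[j]))
            clusters[nearest].append(v)
        # recompute centroids; keep the old one if a cluster empties
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# e.g. splitting one map unit into a low plain and a high crest
centroids, groups = kmeans_1d([10, 11, 12, 90, 91, 92], k=2, seed=1)
```

In practice each cell would carry several terrain derivatives (slope, curvature, wetness index) rather than a single scalar, but the clustering step is the same.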
Abstract:
An equivalent algorithm is proposed to simulate the thermal effects of magma intrusion in geological systems composed of porous rocks. Based on physical and mathematical equivalence, the original magma solidification problem, with a moving boundary between the rock and the intruded magma, is transformed into a new problem without the moving boundary but with a physically equivalent heat source. This equivalent heat source is determined here from the analysis of an ideal solidification model. The major advantage of the proposed equivalent algorithm is that a fixed finite element mesh with a variable integration time step can be employed to simulate the thermal effect of intruded magma solidification using the conventional finite element method. The numerical results demonstrate the correctness and usefulness of the proposed equivalent algorithm for simulating this thermal effect in geological systems. (C) 2003 Elsevier B.V. All rights reserved.
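The fixed-mesh idea can be illustrated with the standard apparent-heat-capacity device: latent heat released over the freezing interval appears as extra heat capacity, a physically equivalent source term on a mesh that never tracks the front. This is a generic sketch of that device (here with finite differences, not the paper's finite element formulation), and every material value below is a made-up assumption.

```python
def apparent_capacity(T, c, latent, Ts, Tl):
    """Latent heat released between solidus Ts and liquidus Tl is
    smeared out as extra heat capacity (the equivalent source)."""
    if Ts < T < Tl:
        return c + latent / (Tl - Ts)
    return c

def cool_rod(n=20, dx=1.0, dt=0.1, steps=500,
             k=1.0, rho=1.0, c=1.0, latent=5.0, Ts=0.4, Tl=0.6):
    """Explicit diffusion on a fixed grid: hot 'magma-like' interior
    (T=1) cooling against cold fixed ends (T=0)."""
    T = [1.0] * n
    T[0] = T[-1] = 0.0
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, n - 1):
            ce = apparent_capacity(T[i], c, latent, Ts, Tl)
            Tn[i] = T[i] + dt * k * (T[i+1] - 2.0*T[i] + T[i-1]) / (rho * ce * dx * dx)
        T = Tn
    return T
```

Running the same rod with `latent=0.0` cools it faster, which is the physical signature the equivalent source is meant to reproduce.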
Abstract:
A graph clustering algorithm constructs groups of closely related parts and machines separately. After the groups are matched for the fewest intercell moves, a refining process runs on the initial cell formation to further decrease the number of intercell moves. A simple modification of this main approach can handle practical constraints, such as the common constraint bounding the maximum number of machines in a cell. Our approach substantially improves computational time; more importantly, it improves the number of intercell moves when computational results are compared with the best known solutions from the literature. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Extended gcd computation is interesting in itself, and it also plays a fundamental role in other calculations. We present a new algorithm for solving the extended gcd problem. This algorithm has a particularly simple description and is practical. It also provides refined bounds on the size of the multipliers obtained.
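For reference, the textbook iterative form of the extended Euclidean algorithm is shown below; the paper's contribution concerns refined bounds on the multipliers x and y that such an algorithm returns, not this baseline itself.

```python
def ext_gcd(a, b):
    """Return (g, x, y) with a*x + b*y == g == gcd(a, b)."""
    x0, x1, y0, y1 = 1, 0, 0, 1
    while b:
        # one Euclidean division step, carried through the multipliers
        q, a, b = a // b, b, a % b
        x0, x1 = x1, x0 - q * x1
        y0, y1 = y1, y0 - q * y1
    return a, x0, y0
```

For example, `ext_gcd(240, 46)` yields multipliers satisfying 240x + 46y = 2.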
Abstract:
Qu-Prolog is an extension of Prolog which performs meta-level computations over object languages, such as predicate calculi and lambda-calculi, which have object-level variables, and quantifier or binding symbols creating local scopes for those variables. As in Prolog, the instantiable (meta-level) variables of Qu-Prolog range over object-level terms, and in addition other Qu-Prolog syntax denotes the various components of the object-level syntax, including object-level variables. Further, the meta-level operation of substitution into object-level terms is directly represented by appropriate Qu-Prolog syntax. Again as in Prolog, the driving mechanism in Qu-Prolog computation is a form of unification, but this is substantially more complex than for Prolog because of Qu-Prolog's greater generality, and especially because substitution operations are evaluated during unification. In this paper, the Qu-Prolog unification algorithm is specified, formalised and proved correct. Further, the analysis of the algorithm is carried out in a framework which straightforwardly allows the 'completeness' of the algorithm to be proved: though fully explicit answers to unification problems are not always provided, no information is lost in the unification process.
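The Prolog-style special case that Qu-Prolog generalises is ordinary first-order unification. A minimal sketch follows (variables are capitalised strings, compound terms are tuples; the occurs check is omitted, as in standard Prolog, and Qu-Prolog's quantifier and explicit-substitution handling is deliberately not modelled):

```python
def is_var(t):
    # Prolog convention: capitalised names are variables
    return isinstance(t, str) and t[:1].isupper()

def walk(t, s):
    # chase variable bindings through the substitution s
    while is_var(t) and t in s:
        t = s[t]
    return t

def unify(t1, t2, s=None):
    """First-order unification; returns a substitution dict or None."""
    if s is None:
        s = {}
    t1, t2 = walk(t1, s), walk(t2, s)
    if t1 == t2:
        return s
    if is_var(t1):
        return {**s, t1: t2}
    if is_var(t2):
        return {**s, t2: t1}
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        for a, b in zip(t1, t2):
            s = unify(a, b, s)
            if s is None:
                return None
        return s
    return None
```

Unifying f(X, b) with f(a, Y) binds X to a and Y to b; Qu-Prolog's algorithm must additionally decide when substitutions applied to object-level terms commute with such bindings.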
Abstract:
Background. Many resource-limited countries rely on clinical and immunological monitoring, without routine virological monitoring, for human immunodeficiency virus (HIV)-infected children receiving highly active antiretroviral therapy (HAART). We assessed whether HIV load had independent predictive value, in the presence of immunological and clinical data, for the occurrence of new World Health Organization (WHO) stage 3 or 4 events (hereafter, WHO events) among HIV-infected children receiving HAART in Latin America. Methods. The NISDI (Eunice Kennedy Shriver National Institute of Child Health and Human Development International Site Development Initiative) Pediatric Protocol is an observational cohort study designed to describe HIV-related outcomes among infected children. Eligibility criteria for this analysis included perinatal infection, age <15 years, and continuous HAART for >= 6 months. Cox proportional hazards modeling was used to assess time to new WHO events as a function of immunological status, viral load, hemoglobin level, and potential confounding variables; laboratory tests repeated during the study were treated as time-varying predictors. Results. The mean duration of follow-up was 2.5 years; new WHO events occurred in 92 (15.8%) of 584 children. In proportional hazards modeling, a most recent viral load >= 5000 copies/mL was associated with a nearly doubled risk of developing a WHO event (adjusted hazard ratio, 1.81; 95% confidence interval, 1.05-3.11; P = .033), even after adjustment for immunological status defined on the basis of CD4 T lymphocyte value, hemoglobin level, age, and body mass index. Conclusions. Routine virological monitoring using the WHO virological failure threshold of 5000 copies/mL adds independent predictive value to immunological and clinical assessments for identification of children receiving HAART who are at risk for significant HIV-related illness. To provide optimal care, periodic virological monitoring should be considered for all settings that provide HAART to children.
Abstract:
Serum hepatitis B virus (HBV) DNA level is a predictor of the development of cirrhosis and hepatocellular carcinoma in chronic hepatitis B patients. Nevertheless, the distribution of viral load levels in chronic HBV patients in Brazil has yet to be described. This cross-sectional study included 564 participants selected in nine Brazilian cities located in four of the five regions of the country using the database of a medical diagnostics company. Admission criteria included hepatitis B surface antigen seropositivity, availability of HBV viral load samples and age >= 18 years. Males comprised 64.5% of the study population. Mean age was 43.7 years. Most individuals (62.1%) were seronegative for the hepatitis B e antigen (HBeAg). Median serum ALT level was 34 U/L. In 58.5% of the patients HBV-DNA levels ranged from 300 to 99,999 copies/mL; however, in 21.6% levels were undetectable. Median HBV-DNA level was 2,351 copies/mL. Over 60% of the patients who tested negative for HBeAg and in whom ALT level was less than 1.5 times the upper limit of the normal range had HBV-DNA levels > 2,000 IU/mL, which has been considered a cut-off point for indicating a liver biopsy and/or treatment. In conclusion, HBV-DNA level identified a significant proportion of Brazilian individuals with chronic hepatitis B at risk of disease progression. Furthermore, this tool enables those individuals with high HBV-DNA levels who are susceptible to disease progression to be identified among patients with normal or slightly elevated ALT.
Abstract:
An algorithm for explicit integration of structural dynamics problems with multiple time steps is proposed that averages accelerations to obtain subcycle states at a nodal interface between regions integrated with different time steps. With integer time step ratios, the resulting subcycle updates at the interface sum to give the same effect as a central difference update over a major cycle. The algorithm is shown to have good accuracy, and its stability properties in linear elastic analysis are similar to those of constant-velocity subcycling algorithms. The implementation of a generalised form of the algorithm with non-integer time step ratios is presented. (C) 1997 by John Wiley & Sons, Ltd.
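The major-cycle update the subcycles must sum to is the standard central-difference step. As an illustrative sketch only, here it is for a single undamped degree of freedom m·x'' + k·x = 0 (all parameter values are arbitrary; the interface-averaging scheme itself is not modelled):

```python
import math

def central_difference(m, k, x0, v0, dt, steps):
    """Explicit central-difference integration of m*x'' + k*x = 0."""
    x = x0
    a0 = -k * x0 / m
    # fictitious displacement one step before t = 0 (standard start-up)
    x_prev = x0 - dt * v0 + 0.5 * dt * dt * a0
    for _ in range(steps):
        x_next = 2.0 * x - x_prev + dt * dt * (-k * x / m)
        x_prev, x = x, x_next
    return x

# natural frequency 2 rad/s; the exact solution is cos(2t)
x_end = central_difference(m=1.0, k=4.0, x0=1.0, v0=0.0, dt=0.01, steps=100)
```

In a subcycled scheme the interface nodes would take several of these small steps per major cycle, arranged so that their sum matches one large central-difference update.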
Abstract:
The popular Newmark algorithm, used for implicit direct integration of structural dynamics, is extended by means of a nodal partition to permit use of different timesteps in different regions of a structural model. The algorithm developed has as a special case an explicit-explicit subcycling algorithm previously reported by Belytschko, Yen and Mullen. That algorithm has been shown, in the absence of damping or other energy dissipation, to exhibit instability over narrow timestep ranges that become narrower as the number of degrees of freedom increases, making them unlikely to be encountered in practice. The present algorithm avoids such instabilities in the case of a one-to-two timestep ratio (two subcycles), achieving unconditional stability in an exponential sense for a linear problem. However, with three or more subcycles, the trapezoidal rule exhibits stability that becomes conditional, falling towards that of the central difference method as the number of subcycles increases. Instabilities over narrow timestep ranges, which become narrower as the model size increases, also appear with three or more subcycles. However, by moving the partition between timesteps one row of elements into the region suitable for integration with the larger timestep, the unstable timestep ranges become extremely narrow, even in simple systems with a few degrees of freedom; accuracy is also improved. Use of a version of the Newmark algorithm that dissipates high frequencies minimises or eliminates these narrow bands of instability. Viscous damping is also shown to remove these instabilities, at the expense of having more effect on the low frequency response.
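For a single linear degree of freedom, the Newmark update with beta = 1/4, gamma = 1/2 (the trapezoidal, or average-acceleration, rule discussed above) can be sketched as follows; the multi-timestep partitioning itself is not modelled, and the parameter values are arbitrary:

```python
def newmark(m, k, x0, v0, dt, steps, beta=0.25, gamma=0.5):
    """Newmark integration of m*x'' + k*x = 0.  beta=1/4, gamma=1/2 is
    the average-acceleration (trapezoidal) rule, unconditionally stable
    for linear problems."""
    x, v = x0, v0
    a = -k * x / m
    for _ in range(steps):
        # predictors from the known state
        xp = x + dt * v + dt * dt * (0.5 - beta) * a
        vp = v + dt * (1.0 - gamma) * a
        # solve m*a + k*x = 0 with x = xp + beta*dt^2*a for the new a
        a = -k * xp / (m + k * beta * dt * dt)
        x = xp + beta * dt * dt * a
        v = vp + gamma * dt * a
    return x, v

x, v = newmark(m=1.0, k=1.0, x0=1.0, v0=0.0, dt=0.1, steps=200)
energy = 0.5 * v * v + 0.5 * x * x  # conserved by the trapezoidal rule
```

For an undamped linear oscillator this rule conserves the discrete energy exactly, which is why the single-timestep method is unconditionally stable; the abstract's point is that subcycling can erode this property.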
Abstract:
We propose a simulated-annealing-based genetic algorithm for solving model parameter estimation problems. The algorithm incorporates advantages of both genetic algorithms and simulated annealing. Tests on computer-generated synthetic data that closely resemble the optical constants of a metal were performed to compare the efficiency of plain genetic algorithms against the simulated-annealing-based genetic algorithm. These tests assess the ability of the algorithms to find the global minimum and the accuracy of the values obtained for the model parameters. Finally, the algorithm with the best performance is used to fit the model dielectric function to data for platinum and aluminum. (C) 1997 Optical Society of America.
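One common way to combine the two heuristics — not necessarily the authors' exact formulation — is a genetic algorithm whose offspring replace their parents through a Metropolis acceptance test at a falling temperature. A sketch on a made-up two-parameter objective (all settings below are illustrative assumptions):

```python
import math
import random

def sa_ga(cost, bounds, pop_size=30, gens=300, t0=1.0, cool=0.97, seed=1):
    """Genetic algorithm with simulated-annealing-style acceptance:
    offspring replace parents by the Metropolis rule at temperature temp."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    temp = t0
    for _ in range(gens):
        nxt = []
        for parent in pop:
            mate = rng.choice(pop)
            # uniform crossover plus small Gaussian mutation
            child = [rng.choice((p, q)) + rng.gauss(0.0, 0.1)
                     for p, q in zip(parent, mate)]
            delta = cost(child) - cost(parent)
            # always keep improvements; sometimes keep worse, less so as temp falls
            if delta < 0 or rng.random() < math.exp(-delta / temp):
                nxt.append(child)
            else:
                nxt.append(parent)
        pop = nxt
        temp *= cool
    return min(pop, key=cost)

# made-up least-squares-style objective with its minimum at (2, -1)
best = sa_ga(lambda p: (p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2,
             [(-5.0, 5.0), (-5.0, 5.0)])
```

In a real parameter-estimation setting the objective would be the misfit between the model dielectric function and measured optical constants rather than this toy quadratic.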
Abstract:
Background: Although various techniques have been used for breast conservation surgery reconstruction, there are few studies describing a logical approach to reconstruction of these defects. The objectives of this study were to establish a classification system for partial breast defects and to develop a reconstructive algorithm. Methods: The authors reviewed a 7-year experience with 209 immediate breast conservation surgery reconstructions. Mean follow-up was 31 months. Type I defects include tissue resection in smaller breasts (bra size A/B), including type IA, which involves minimal defects that do not cause distortion; type IB, which involves moderate defects that cause moderate distortion; and type IC, which involves large defects that cause significant deformities. Type II includes tissue resection in medium-sized breasts with or without ptosis (bra size C), and type III includes tissue resection in large breasts with ptosis (bra size D). Results: Eighteen percent of patients presented type I defects, where a lateral thoracodorsal flap and a latissimus dorsi flap were performed in 68 percent. Forty-five percent presented type II defects, where bilateral mastopexy was performed in 52 percent. Thirty-seven percent of patients presented type III defects, where bilateral reduction mammaplasty was performed in 67 percent. Thirty-five percent of patients presented complications, most of them minor. Conclusions: An algorithm based on breast size in relation to tumor location and extension of resection can be followed to determine the best approach to reconstruction. The authors' results have demonstrated that the complications were similar to those in other clinical series. Success depends on patient selection, coordinated planning with the oncologic surgeon, and careful intraoperative management.
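The classification reads naturally as a small decision function. The sketch below is a loose, hypothetical encoding of the defect types (subtypes IA/IB/IC by degree of distortion for small breasts, II for medium, III for large ptotic breasts) — an illustration of the decision structure, not clinical guidance:

```python
def classify_defect(bra_size, distortion=None):
    """Hypothetical encoding of the partial-breast-defect classification:
    type I  - small breasts (A/B), subtyped by degree of distortion;
    type II - medium breasts (C), with or without ptosis;
    type III- large, ptotic breasts (D)."""
    if bra_size in ("A", "B"):
        subtype = {"minimal": "IA", "moderate": "IB", "significant": "IC"}
        return subtype.get(distortion, "I")
    if bra_size == "C":
        return "II"
    return "III"
```

A fuller version would also take tumor location and extent of resection into account, as the abstract's algorithm does, before suggesting a reconstructive option.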
Abstract:
Although human T-cell lymphotropic virus type 2 (HTLV-2) is considered of low pathogenicity, serological diagnosis is important for counseling and monitoring. The confirmatory tests most used are Western blot (WB) and PCR. However, in high-risk populations, about 50% of samples with indeterminate WB results were HTLV-2 positive by PCR. The insensitivity of the WB might be due to the use of recombinant proteins from strains that do not circulate in our country. Another possibility may be a high level of immunosuppression, which could lead to low production of virus, resulting in low stimulation of antibody production. We found one mutation, proline to serine at position 184 of the envelope region, present in at least one third of the samples, independent of the indeterminate WB profile. In conclusion, we found no correlation of immune state, HTLV-2 proviral load, or env diversity in the K55 region with indeterminate WB results. We believe that the only WB kit available on the market is probably more accurate at detecting HTLV-1 antibodies, and some improvement in HTLV-2 detection should be made in the future, especially for high-risk populations. J. Med. Virol. 82:837-842, 2010. (C) 2010 Wiley-Liss, Inc.
Abstract:
Background: Many clinical studies have suggested a beneficial effect of GB virus type C (GBV-C) on the course of HIV-1 infection, but the mechanisms involved in such amelioration are not clear. As recent evidence has implicated cellular activation in HIV-1 pathogenesis, we investigated the effect of GBV-C viremia on T-cell activation in early HIV-1 infection. Methods: Forty-eight recently infected HIV-1 patients (23 GBV-C viremic) were evaluated for T-cell counts, expanded immunophenotyping, GBV-C RNA detection, and HIV-1 viral load. Nonparametric univariate and multivariate analyses were carried out to identify variables associated with cellular activation, including GBV-C status, HIV-1 viral load, T lymphocyte counts, and CD38 and chemokine (C-C motif) receptor 5 (CCR5) surface expression. Findings: We not only confirmed the positive correlation between HIV-1 viral load and the percentage of T cells positive for CD38(+)CD8(+) but also observed that GBV-C viremic patients had a lower percentage of T cells positive for CD38(+)CD4(+), CD38(+)CD8(+), CCR5(+)CD4(+), and CCR5(+)CD8(+) compared with HIV-1-infected patients who were not GBV-C viremic. In regression models, GBV-C RNA(+) status was associated with a reduction in CD38 expression on CD4(+) or CD8(+) T cells and CCR5 expression on CD8(+) T cells, independent of the HIV-1 viral load or CD4(+) and CD8(+) T-cell counts. These results were also supported by the lower expression of CD69 and CD25 in GBV-C viremic patients. Interpretation: The association between GBV-C replication and lower T-cell activation may be a key mechanism involved in the protection conferred by this virus against HIV-1 disease progression to immunodeficiency in HIV-1-infected patients. (C) 2009 Wolters Kluwer Health | Lippincott Williams & Wilkins
Abstract:
To examine abnormal patterns of frontal cortical-subcortical activity in response to emotional stimuli in euthymic individuals with bipolar disorder type I in order to identify trait-like, pathophysiologic mechanisms of the disorder. We examined potential confounding effects of total psychotropic medication load and illness variables upon neural abnormalities. We analyzed neural activity in 19 euthymic bipolar and 24 healthy individuals in response to mild and intense happy, fearful and neutral faces. Relative to healthy individuals, bipolar subjects had significantly increased left striatal activity in response to mild happy faces (p < 0.05, corrected), decreased right dorsolateral prefrontal cortical (DLPFC) activity in response to neutral, mild and intense happy faces, and decreased left DLPFC activity in response to neutral, mild and intense fearful faces (p < 0.05, corrected). Bipolar and healthy individuals did not differ in amygdala activity in response to either emotion. In bipolar individuals, there was no significant association between medication load and abnormal activity in these regions, but there was a negative relationship between age at illness onset and amygdala activity in response to mild fearful faces (p = 0.007). Relative to those without comorbidities, bipolar individuals with comorbidities showed a trend toward increased left striatal activity in response to mild happy faces. Abnormally increased striatal activity in response to potentially rewarding stimuli and decreased DLPFC activity in response to other emotionally salient stimuli may underlie mood instability in euthymic bipolar individuals, and both are more apparent in those with comorbid diagnoses. The absence of a relationship between medication load and abnormal neural activity in bipolar individuals suggests that our findings may reflect pathophysiologic mechanisms of the illness rather than medication confounds. Future studies should examine whether this pattern of abnormal neural activity could distinguish bipolar from unipolar depression.