788 results for deferred-acceptance algorithm
Abstract:
A self-learning simulated annealing algorithm is developed by combining the characteristics of simulated annealing and domain elimination methods. The algorithm is validated using a standard mathematical test function and by optimizing the end region of a practical power transformer. The numerical results show that the CPU time required by the proposed method is about one third of that required by the conventional simulated annealing algorithm.
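For context, a minimal sketch of the conventional simulated annealing loop that serves as the baseline here; the test objective, neighbor move, and geometric cooling schedule are illustrative assumptions, not details taken from the paper:

```python
import math
import random

def simulated_annealing(objective, x0, neighbor, t0=1.0, t_min=1e-4,
                        alpha=0.95, iters_per_temp=100):
    """Conventional simulated annealing with a geometric cooling schedule."""
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(iters_per_temp):
            y = neighbor(x)
            fy = objective(y)
            # Accept downhill moves always; uphill moves with Boltzmann probability.
            if fy < fx or random.random() < math.exp((fx - fy) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= alpha  # geometric cooling
    return best, fbest

# Toy usage: minimize a standard test function (the sphere function) over R^2.
result = simulated_annealing(
    objective=lambda p: p[0] ** 2 + p[1] ** 2,
    x0=(5.0, -3.0),
    neighbor=lambda p: (p[0] + random.uniform(-0.5, 0.5),
                        p[1] + random.uniform(-0.5, 0.5)),
)
print(result)
```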
Abstract:
In most patients, postoperative endoscopic recurrence (PER) occurs within 1 year after abdominal resection for Crohn's disease (CD). Preventing PER is essential for disease control, as most patients go on to develop further clinical and surgical recurrences. Conventional therapy with nitroimidazoles, aminosalicylates, and immunomodulators has limited efficacy for preventing PER. Initial trials with biological therapy (infliximab and adalimumab) showed promising results in preventing PER, and the efficacy of these drugs appears higher than that of conventional therapy. The aim of this review is to outline the results of studies that used infliximab or adalimumab for preventing and treating PER in CD patients. Data on both agents are available, and a few small prospective trials have shown the efficacy of these drugs in patients at high risk for recurrence. We believe that, in 2013, biological agents will become better accepted for the prevention of PER in CD patients, adding to the already existing data. Larger trials are still underway, and their results will certainly determine the role of these agents in PER, which develops after bowel resection for CD.
Abstract:
The transcript of John J. Janovy Jr.'s speech upon acceptance of the American Society of Parasitologists' Clark P. Read Mentor Award, 2003.
Abstract:
Active machine learning algorithms are used when large numbers of unlabeled examples are available and obtaining labels for them is costly (e.g., requiring consultation with a human expert). Many conventional active learning algorithms focus on refining the decision boundary at the expense of exploring new regions that the current hypothesis misclassifies. We propose a new active learning algorithm that balances such exploration with refinement of the decision boundary by dynamically adjusting the probability of exploring at each step. Our experimental results demonstrate improved performance on data sets that require extensive exploration while remaining competitive on data sets that do not. Our algorithm also shows significant tolerance to noise.
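A minimal sketch of the exploration/refinement balance described above; the pool data, the margin-based refinement query, and the multiplicative update rule for the exploration probability are all hypothetical stand-ins for the paper's actual procedure:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic pool: two Gaussian blobs whose labels are hidden until "queried".
X_pool = np.vstack([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y_pool = np.array([0] * 200 + [1] * 200)

# Seed with a few labeled points from each class.
labeled = [int(i) for i in rng.choice(200, 5, replace=False)] \
        + [int(i) + 200 for i in rng.choice(200, 5, replace=False)]
unlabeled = [i for i in range(len(X_pool)) if i not in labeled]

p_explore = 0.5  # probability of exploring, adjusted dynamically
model = LogisticRegression()

for step in range(50):
    model.fit(X_pool[labeled], y_pool[labeled])
    if rng.random() < p_explore:
        # Explore: query a uniformly random unlabeled point.
        idx = int(rng.choice(unlabeled))
        surprised = model.predict(X_pool[[idx]])[0] != y_pool[idx]
        # Hypothetical update rule: exploring paid off -> explore more, else less.
        p_explore = min(0.9, p_explore * 1.1) if surprised else max(0.05, p_explore * 0.9)
    else:
        # Refine: query the unlabeled point closest to the decision boundary.
        margins = np.abs(model.decision_function(X_pool[unlabeled]))
        idx = unlabeled[int(np.argmin(margins))]
    labeled.append(idx)
    unlabeled.remove(idx)

print("queried", len(labeled), "labels; final p_explore =", round(p_explore, 3))
```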
Abstract:
Rural community development is a major issue for developing countries. Much attention has been given to Information and Communication Technology (ICT) projects that connect rural communities with the global network. However, resistance to ICT is a deterring factor in addressing the digital divide in developing countries. It is postulated that this resistance can be reversed through a strategy of "information acceptance": ICT can be accepted by rural communities by creating demand for information. The paper calls for refocusing on the role of information in rural community development, with ICT as a tool for change. Initiatives for rural community development must emphasize the importance of information in rural communities.
Abstract:
Grain producers must make marketing decisions every day. First they must decide whether to price or hold grain. If they decide to price grain, they must then choose the most appropriate method of pricing: cash sale, forward contract, or hedging. If they decide to hold grain (not to price), they must choose the most appropriate method of retaining ownership. This fact sheet presents some guidelines to help producers choose the least costly method of owning grain or speculating on price level changes.
Abstract:
The irregular shape packing problem is addressed. The container has a fixed width and an open dimension to be minimized. The proposed algorithm constructively creates the solution using an ordered list of items and a placement heuristic, with simulated annealing as the metaheuristic adopted to solve the optimization problem. A two-level algorithm is used to minimize the open dimension of the container. To ensure feasible layouts, the concept of the collision-free region is used. A collision-free region represents all possible translations for an item to be placed and may be degenerate. For a moving item, the proposed placement heuristic detects the presence of exact fits (when the item is fully constrained by its surroundings) and exact slides (when the item's position is constrained in all but one direction). The relevance of these positions is analyzed and a new placement heuristic is proposed. Computational comparisons on benchmark problems show that the proposed algorithm generates highly competitive solutions; moreover, it improved some best-known results.
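To make the overall structure concrete, a rough sketch of annealing over item orderings; the naive shelf placement below is only a stand-in for the paper's collision-free-region placement heuristic, and all parameters are made up:

```python
import math
import random

def shelf_place(order, items, width):
    """Naive shelf placement; a stand-in for the paper's collision-free-region
    heuristic. Items are (w, h) rectangles placed left to right on shelves."""
    x = y = shelf_h = 0.0
    for i in order:
        w, h = items[i]
        if x + w > width:            # item doesn't fit: open a new shelf
            y += shelf_h
            x, shelf_h = 0.0, 0.0
        x += w
        shelf_h = max(shelf_h, h)
    return y + shelf_h               # used length of the open dimension

def anneal_order(items, width, t0=10.0, alpha=0.999, steps=5000):
    """Simulated annealing over item orderings to minimize the open dimension."""
    order = list(range(len(items)))
    cur = best = shelf_place(order, items, width)
    best_order, t = order[:], t0
    for _ in range(steps):
        cand = order[:]
        i, j = random.sample(range(len(items)), 2)
        cand[i], cand[j] = cand[j], cand[i]        # swap two items in the order
        length = shelf_place(cand, items, width)
        if length < cur or random.random() < math.exp((cur - length) / t):
            order, cur = cand, length
            if cur < best:
                best, best_order = cur, order[:]
        t *= alpha
    return best, best_order

items = [(random.uniform(1, 4), random.uniform(1, 4)) for _ in range(30)]
print(anneal_order(items, width=10.0)[0])
```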
Abstract:
OBJECTIVE: Differentiation between benign and malignant ovarian neoplasms is essential for creating a system for patient referrals. Therefore, the contributions of the tumor markers CA125 and human epididymis protein 4 (HE4), as well as the risk of ovarian malignancy algorithm (ROMA) and risk of malignancy index (RMI) values, were considered individually and in combination to evaluate their utility for establishing this type of patient referral system. METHODS: Patients who had been diagnosed with ovarian masses through imaging analyses (n = 128) were assessed for the tumor markers CA125 and HE4. The ROMA and RMI values were also determined. The sensitivity and specificity of each parameter were calculated using receiver operating characteristic (ROC) curves and the area under the curve (AUC) for each method. RESULTS: The sensitivities of CA125, HE4, ROMA, and RMI for distinguishing malignant from benign ovarian masses were 70.4%, 79.6%, 74.1%, and 63%, respectively. Among carcinomas, the sensitivities of CA125, HE4, ROMA (pre- and post-menopausal), and RMI were 93.5%, 87.1%, 80%, 95.2%, and 87.1%, respectively. The most accurate numerical values were obtained with RMI, although the four parameters were shown to be statistically equivalent. CONCLUSION: There were no differences in accuracy between CA125, HE4, ROMA, and RMI for differentiating between types of ovarian masses. RMI had the lowest sensitivity but was the most numerically accurate method. HE4 demonstrated the best overall sensitivity for the evaluation of malignant ovarian tumors and the differential diagnosis of endometriosis. All of the parameters demonstrated increased sensitivity when tumors of low malignant potential were considered low risk, which may be an acceptable assessment method for referring patients to reference centers.
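As an aside, sensitivity, specificity, and AUC as used above can be computed directly from marker values; the sketch below uses synthetic scores and an arbitrary cutoff, not the study's data:

```python
import numpy as np

def sens_spec(scores, labels, cutoff):
    """Sensitivity and specificity of a marker at a given cutoff
    (labels: 1 = malignant, 0 = benign; higher score = more suspicious)."""
    pred = scores >= cutoff
    tp = np.sum(pred & (labels == 1))
    tn = np.sum(~pred & (labels == 0))
    return tp / np.sum(labels == 1), tn / np.sum(labels == 0)

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    # Fraction of (malignant, benign) pairs ranked correctly; ties count half.
    wins = np.sum(pos[:, None] > neg[None, :]) \
         + 0.5 * np.sum(pos[:, None] == neg[None, :])
    return wins / (len(pos) * len(neg))

# Synthetic example (not the study's data): 128 patients.
rng = np.random.default_rng(1)
labels = rng.integers(0, 2, 128)
scores = rng.normal(35, 10, 128) + 30 * labels   # hypothetical CA125-like values
print(sens_spec(scores, labels, cutoff=50.0), auc(scores, labels))
```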
Abstract:
A direct reconstruction algorithm for complex conductivities in W^{2,∞}(Ω), where Ω is a bounded, simply connected Lipschitz domain in ℝ², is presented. The framework is based on the uniqueness proof by Francini (2000 Inverse Problems 16 107-19), but the equations relating the Dirichlet-to-Neumann map to the scattering transform and to the exponentially growing solutions are not present in that work and are derived here. The algorithm constitutes the first D-bar method for the reconstruction of conductivities and permittivities in two dimensions. Reconstructions of numerically simulated chest phantoms with discontinuities at the organ boundaries are included.
Abstract:
OBJECTIVE: We aimed to evaluate whether the inclusion of videothoracoscopy in a pleural empyema treatment algorithm would change the clinical outcome of such patients. METHODS: This was a quality-improvement study. We conducted a retrospective review of patients who underwent pleural decortication for pleural empyema at our institution from 2002 to 2008. Under the old algorithm (January 2002 to September 2005), open decortication was the procedure of choice, and videothoracoscopy was only performed in certain sporadic mid-stage cases. Under the new algorithm (October 2005 to December 2008), videothoracoscopy became the first-line treatment option, whereas open decortication was only performed in patients with a thick pleural peel (>2 cm) observed on chest scan. The patients were divided into an old algorithm (n = 93) group and a new algorithm (n = 113) group and compared. The main outcome variables assessed included treatment failure (pleural space reintervention or death up to 60 days after medical discharge) and the occurrence of complications. RESULTS: Videothoracoscopy and open decortication were performed in 13 and 80 patients from the old algorithm group and in 81 and 32 patients from the new algorithm group, respectively (p < 0.01). The patients in the new algorithm group were older (41 ± 1 vs. 46.3 ± 16.7 years, p = 0.014) and had higher Charlson Comorbidity Index scores [0 (0-3) vs. 2 (0-4), p = 0.032]. The occurrence of treatment failure was similar in both groups (19.35% vs. 24.77%, p = 0.35), although the complication rate was lower in the new algorithm group (48.3% vs. 33.6%, p = 0.04). CONCLUSIONS: The wider use of videothoracoscopy in pleural empyema treatment was associated with fewer complications and unaltered rates of mortality and reoperation, even though more severely ill patients underwent videothoracoscopic surgery.
Abstract:
We performed a comparative study and evaluated cellular infiltrates and anti-inflammatory cytokine production at different time points after syngeneic or allogeneic skin transplantation. We observed early IL-10 production in syngeneic grafts compared with allografts. This observation prompted us to investigate the role of IL-10 in isograft acceptance. For this, we used IL-10 KO and WT mice to perform syngeneic transplantation in which IL-10 was absent in the graft or in the recipient. The majority of syngeneic grafts derived from IL-10 KO donors did not engraft or were only partially accepted, whereas IL-10 KO mice transplanted with skin from WT donors accepted the graft. We evaluated IL-10 producers in the transplanted skin and observed that epithelial cells were the major source. Taken together, our data show that production of IL-10 by donor cells, but not by the recipient, is determinant for graft acceptance and strongly suggest that production of this cytokine by keratinocytes immediately upon transplantation is necessary for isograft survival.
Abstract:
In this study, a dynamic programming approach to the unconstrained two-dimensional non-guillotine cutting problem is presented. The method extends the recently introduced recursive partitioning approach for the manufacturer's pallet loading problem. The approach involves two phases and uses bounds based on unconstrained two-staged and non-staged guillotine cutting. The method is able to find the optimal cutting pattern for a large number of moderately sized problem instances known in the literature, and no counterexample was found for which the approach fails to reproduce known optimal solutions. For instances in which the required computer runtime is excessive, the approach is combined with simple heuristics to reduce its running time. Detailed numerical experiments show the reliability of the method.
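As an illustration of the guillotine-cutting bounds mentioned above, a minimal sketch of the classic dynamic program for unconstrained, non-staged guillotine cutting with integer cut positions; the piece set is invented, and the paper's two-phase recursive partitioning method for the non-guillotine problem is more involved:

```python
from functools import lru_cache

# Pieces: (length, width, value); rotations are not considered in this sketch.
PIECES = [(3, 2, 7), (2, 2, 4), (5, 3, 13), (1, 4, 3)]

@lru_cache(maxsize=None)
def best(L, W):
    """Maximum value of an unconstrained, non-staged guillotine cutting of an
    L x W rectangle: either place one piece, or split with a guillotine cut."""
    value = max([v for l, w, v in PIECES if l <= L and w <= W], default=0)
    for x in range(1, L // 2 + 1):          # vertical guillotine cuts
        value = max(value, best(x, W) + best(L - x, W))
    for y in range(1, W // 2 + 1):          # horizontal guillotine cuts
        value = max(value, best(L, y) + best(L, W - y))
    return value

print(best(10, 10))
```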
Abstract:
A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC(max). The output of GC(max) coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC(max) is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst-case scenario, the GC(max) algorithm runs in linear time with respect to the variable M = |C| + |Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M) = O(|C|). In such a situation, GC(max) runs in linear time with respect to the image size |C|. We show that the output of GC(max) constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ‖F_P‖_∞ of the map F_P that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms to the realm of the graph cut energy minimizers, with energy functions ‖F_P‖_q for q ∈ [1, ∞]. Of these, the best known minimization problem is for the energy ‖F_P‖_1, which is solved by the classic min-cut/max-flow algorithm, often referred to as the Graph Cut algorithm. We notice that a minimization problem for ‖F_P‖_q, q ∈ [1, ∞), is identical to that for ‖F_P‖_1 when the original weight function w is replaced by w^q. Thus, any algorithm GC(sum) solving the ‖F_P‖_1 minimization problem also solves the one for ‖F_P‖_q with q ∈ [1, ∞), so just two algorithms, GC(sum) and GC(max), are enough to solve all ‖F_P‖_q-minimization problems. We also show that, for any fixed weight assignment, the solutions of the ‖F_P‖_q-minimization problems converge to a solution of the ‖F_P‖_∞-minimization problem (‖F_P‖_∞ = lim_{q→∞} ‖F_P‖_q is not enough to deduce that). An experimental comparison of the performance of the GC(max) and GC(sum) algorithms is included. This comparison concentrates on the actual (as opposed to provable worst-case) running times, as well as on the influence of the choice of seeds on the output.
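A toy numeric illustration (with invented weights, not the paper's algorithm) of two claims above: minimizing ‖F_P‖_q with weights w is the ℓ1 problem on the powered weights w^q, and the q-norm minimizer approaches the max-norm (GC(max)) minimizer as q grows:

```python
import numpy as np

# Hypothetical boundary weight vectors of three candidate cuts P1, P2, P3.
cuts = {
    "P1": np.array([5.0, 1.0, 1.0]),   # small total weight, large max edge
    "P2": np.array([3.0, 3.0, 2.9]),   # larger total weight, smaller max edge
    "P3": np.array([4.0, 4.0, 4.0]),
}

for q in (1, 2, 4, 16, 64):
    # ||F_P||_q minimization with weights w is the ||.||_1 problem on w^q,
    # which is why one GC(sum)-type solver covers all finite q.
    norms = {name: np.sum(w ** q) ** (1 / q) for name, w in cuts.items()}
    print(q, min(norms, key=norms.get))   # minimizer shifts from P1 to P2

# As q -> infinity, the q-norm minimizer matches the max-norm (GC(max)) one.
print("inf", min(cuts, key=lambda n: cuts[n].max()))
```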
Abstract:
This paper presents a historical study of the acceptance of Newton's corpuscular theory of light in the early eighteenth century. Isaac Newton first published his famous book Opticks in 1704. After its publication, it became quite popular and was an almost mandatory presence in the cultural life of Enlightenment societies. However, Newton's optics did not become popular only via his own words and hands, but also via public lectures and short books with scientific content aimed at the general public (including women) that emerged in the period as a sort of entertainment business. Lecturers and writers stressed the inductivist approach to the study of nature and presented Newton's ideas about optics as if they were consensual among natural philosophers of the period. The historical case study presented in this paper illustrates relevant aspects of the nature of science, which can be explored by physics students at the undergraduate level or in physics teacher training programs.