946 results for Practical algorithm
Abstract:
The incidence of melanoma has increased rapidly over the past 30 years, and the disease is now the sixth most common cancer among men and women in the U.K. Many patients are diagnosed with or develop metastatic disease, and survival is substantially reduced in these patients. Mutations in the BRAF gene have been identified as key drivers in melanoma cells and are found in around 50% of cutaneous melanomas. Vemurafenib (Zelboraf®; Roche Molecular Systems Inc., Pleasanton, CA, U.S.A.) is the first licensed inhibitor of mutated BRAF, and offers a new first-line option for patients with unresectable or metastatic melanoma who harbour BRAF mutations. Vemurafenib was developed in conjunction with a companion diagnostic, the cobas® 4800 BRAF V600 Mutation Test. The purpose of this paper is to make evidence-based recommendations to facilitate the implementation of BRAF mutation testing and targeted therapy in patients with metastatic melanoma in the U.K. The recommendations are the result of a meeting of an expert panel and have been reviewed by melanoma specialists and representatives of the National Cancer Research Network Clinical Study Group on behalf of the wider melanoma community. This article is intended to be a starting point for practical advice and recommendations, which will no doubt be updated as we gain further experience in personalizing therapy for patients with melanoma.
Abstract:
In face recognition, where high-dimensional representation spaces are generally used, it is very important to take advantage of all the available information. In particular, many labelled facial images accumulate while the recognition system is operating, and for practical reasons some of them are often discarded. In this paper, we propose an algorithm for using this information. The algorithm is fundamentally incremental and makes use of a combination of classification results for the images in the input sequence. Experiments with sequences obtained from a real person detection and tracking system allow us to analyze the performance of the algorithm, as well as its potential improvements.
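To make the two ideas above concrete, here is a minimal sketch of an incremental gallery classifier that keeps newly labelled images instead of discarding them and combines per-frame classification results over a tracked sequence. All class and variable names are illustrative assumptions, and a nearest-neighbour rule stands in for whatever classifier the authors actually used.

```python
import numpy as np

class IncrementalFaceClassifier:
    """Nearest-neighbour gallery classifier that grows as new labelled
    images become available during system operation (illustrative only)."""

    def __init__(self):
        self.features = []   # 1-D feature vectors (e.g. PCA projections)
        self.labels = []     # identity label for each stored vector

    def add_labelled_image(self, feature, label):
        # Incremental step: newly confirmed images are kept, not discarded.
        self.features.append(np.asarray(feature, dtype=float))
        self.labels.append(label)

    def classify_frame(self, feature):
        # Per-identity similarity scores for a single frame.
        feature = np.asarray(feature, dtype=float)
        scores = {}
        for f, lab in zip(self.features, self.labels):
            sim = -np.linalg.norm(feature - f)          # higher = more similar
            scores[lab] = max(scores.get(lab, -np.inf), sim)
        return scores

    def classify_sequence(self, sequence):
        # Combine frame-level results over a tracked sequence by summing
        # per-identity scores (soft voting), then pick the best identity.
        total = {}
        for feature in sequence:
            for lab, sim in self.classify_frame(feature).items():
                total[lab] = total.get(lab, 0.0) + sim
        return max(total, key=total.get) if total else None
```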
Abstract:
This paper analyzes the inner relations between classical sub-scheme probability and statistical probability, subjective probability and objective probability, prior probability and posterior probability, and transition probability and the probability of utility. It further analyzes, from the perspective of mathematics, the goal, method, and practical economic purpose represented by these various probabilities, so as to understand their connotations in depth and their relation to economic decision making, thereby paving the way for scientific prediction and decision making.
Abstract:
There is considerable interest in the use of genetic algorithms to solve problems arising in the areas of scheduling and timetabling. However, the classical genetic algorithm paradigm is not well equipped to handle the conflict between objectives and constraints that typically occurs in such problems. To overcome this, successful implementations frequently make use of problem-specific knowledge. This paper is concerned with the development of a GA for a nurse rostering problem at a major UK hospital. The structure of the constraints is used as the basis for a co-evolutionary strategy using co-operating sub-populations. Problem-specific knowledge is also used to define a system of incentives and disincentives, and a complementary mutation operator. Empirical results based on 52 weeks of live data show how these features are able to improve an unsuccessful canonical GA to the point where it is able to provide a practical solution to the problem.
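As an illustration of the ingredients described above, the sketch below shows a toy rostering GA with an incentive/disincentive fitness and a repair-style ("complementary") mutation operator for a single sub-population; the full co-evolutionary scheme with cooperating sub-populations is only indicated in a comment. The encoding, constants, and constraint set are assumptions for demonstration, not the hospital's actual problem.

```python
import random

# Toy encoding: a roster is a 0/1 flag per (nurse, shift) slot indicating
# whether that nurse works that shift.  All numbers are illustrative.
NUM_NURSES, NUM_SHIFTS = 10, 14
REQUIRED_PER_SHIFT = 3

def random_roster():
    return [[random.randint(0, 1) for _ in range(NUM_SHIFTS)]
            for _ in range(NUM_NURSES)]

def fitness(roster):
    """Incentive/disincentive scheme: reward shifts with exactly the required
    cover, penalise under-/over-staffing and overworked nurses."""
    score = 0
    for s in range(NUM_SHIFTS):
        cover = sum(roster[n][s] for n in range(NUM_NURSES))
        score += 10 if cover == REQUIRED_PER_SHIFT else -5 * abs(cover - REQUIRED_PER_SHIFT)
    for n in range(NUM_NURSES):
        worked = sum(roster[n])
        if worked > 10:                      # disincentive for excessive load
            score -= 3 * (worked - 10)
    return score

def complementary_mutation(roster):
    """Repair-style mutation: move a nurse from an over-covered shift to an
    under-covered one instead of flipping bits at random."""
    cover = [sum(roster[n][s] for n in range(NUM_NURSES)) for s in range(NUM_SHIFTS)]
    over = [s for s in range(NUM_SHIFTS) if cover[s] > REQUIRED_PER_SHIFT]
    under = [s for s in range(NUM_SHIFTS) if cover[s] < REQUIRED_PER_SHIFT]
    if over and under:
        s_from, s_to = random.choice(over), random.choice(under)
        candidates = [n for n in range(NUM_NURSES) if roster[n][s_from] and not roster[n][s_to]]
        if candidates:
            n = random.choice(candidates)
            roster[n][s_from], roster[n][s_to] = 0, 1
    return roster

def evolve(population, generations=200):
    # One sub-population only; a co-evolutionary version would run several
    # such populations, each handling one group of constraints, and
    # periodically exchange their best rosters.
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: len(population) // 2]
        children = [complementary_mutation([row[:] for row in random.choice(parents)])
                    for _ in range(len(population) - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = evolve([random_roster() for _ in range(30)])
```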
Abstract:
The K-means algorithm is one of the most popular clustering algorithms in current use, as it is relatively fast yet simple to understand and deploy in practice. Nevertheless, its use entails certain restrictive assumptions about the data, the negative consequences of which are not always immediately apparent, as we demonstrate. While more flexible algorithms have been developed, their widespread use has been hindered by their computational and technical complexity. Motivated by these considerations, we present a flexible alternative to K-means that relaxes most of the assumptions, whilst remaining almost as fast and simple. This novel algorithm, which we call MAP-DP (maximum a posteriori Dirichlet process mixtures), is statistically rigorous as it is based on nonparametric Bayesian Dirichlet process mixture modeling. This approach allows us to overcome most of the limitations imposed by K-means. The number of clusters K is estimated from the data instead of being fixed a priori as in K-means. In addition, while K-means is restricted to continuous data, the MAP-DP framework can be applied to many kinds of data, for example binary, count or ordinal data. Also, it can efficiently separate outliers from the data. This additional flexibility does not incur a significant computational overhead compared to K-means, with MAP-DP convergence typically achieved on the order of seconds for many practical problems. Finally, in contrast to K-means, since the algorithm is based on an underlying statistical model, the MAP-DP framework can deal with missing data and enables model testing such as cross-validation in a principled way. We demonstrate the simplicity and effectiveness of this algorithm on the health informatics problem of clinical sub-typing in a cluster of diseases known as parkinsonism.
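A minimal sketch of the MAP-DP idea is given below for the simplest case of spherical Gaussian clusters with a shared, known variance: each point either joins the existing cluster with the lowest negative log-posterior cost (which includes the "rich get richer" term log N_k) or opens a new cluster at a cost governed by the concentration parameter alpha. The published algorithm uses full conjugate priors and handles other data types; this simplification only shows how K is inferred rather than fixed, and the prior terms used here are crude placeholders.

```python
import numpy as np

def map_dp_spherical(X, alpha=1.0, sigma2=1.0, n_iter=50):
    """MAP-DP-style clustering for spherical Gaussian clusters with known,
    shared variance sigma2.  Unlike K-means, the number of clusters is not
    fixed: a point may open a new cluster when that is cheaper (in negative
    log-posterior terms) than joining an existing one.  Illustrative sketch."""
    X = np.asarray(X, dtype=float)
    n, _ = X.shape
    z = np.zeros(n, dtype=int)                 # start with all points together
    mu0 = X.mean(axis=0)                       # crude empirical prior mean
    s0 = float(X.var(axis=0).mean())           # crude prior variance
    for _ in range(n_iter):
        changed = False
        for i in range(n):
            costs, options = [], []
            for k in np.unique(z):
                members = (z == k)
                members[i] = False
                nk = int(members.sum())
                if nk == 0:
                    continue                   # cluster would be empty without x_i
                mu = X[members].mean(axis=0)
                # negative log-likelihood of x_i under cluster k, minus the
                # Dirichlet-process "rich get richer" term log(nk)
                costs.append(np.sum((X[i] - mu) ** 2) / (2 * sigma2) - np.log(nk))
                options.append(k)
            # cost of opening a new cluster: likelihood under the crude prior
            # predictive, minus log(alpha); normalising constants are dropped.
            costs.append(np.sum((X[i] - mu0) ** 2) / (2 * (sigma2 + s0)) - np.log(alpha))
            options.append(int(z.max()) + 1)
            new_k = options[int(np.argmin(costs))]
            if new_k != z[i]:
                z[i], changed = new_k, True
        if not changed:
            break
    _, z = np.unique(z, return_inverse=True)   # relabel clusters 0..K-1
    return z

# Example: two well-separated blobs; MAP-DP should discover K = 2 on its own.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(8, 1, (50, 2))])
print(np.unique(map_dp_spherical(X)))
```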
Abstract:
Lipidic mixtures present a particular phase-change profile highly affected by their unique crystalline structure. However, classical solid-liquid equilibrium (SLE) thermodynamic modeling approaches, which assume the solid phase to be a pure component, sometimes fail to describe the phase behavior correctly, and their inadequacy increases with the complexity of the system. To overcome some of these problems, this study describes a new procedure to depict the SLE of fatty binary mixtures presenting solid solutions, namely the Crystal-T algorithm. Considering the non-ideality of both the liquid and the solid phases, this algorithm is aimed at determining the temperatures at which the first and the last crystal of the mixture melt. The evaluation is focused on experimental data measured and reported in this work for systems composed of triacylglycerols and fatty alcohols. The liquidus and solidus lines of the SLE phase diagrams were described using excess Gibbs energy based equations, with the group-contribution UNIFAC model used to calculate the activity coefficients of both the liquid and solid phases. Very low deviations between theoretical and experimental data evidenced the strength of the algorithm, broadening the scope of SLE modeling.
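The classical SLE relations that such an algorithm must solve can be written, for each component i of a binary forming a solid solution, as x_i γ_i^L = z_i γ_i^S exp[(ΔH_fus,i/R)(1/T_m,i − 1/T)], neglecting the heat-capacity term. The sketch below solves these two equations for the equilibrium temperature and the solid composition; ideal activity coefficients and the numerical values are placeholders, whereas the paper uses UNIFAC for both phases and its own Crystal-T procedure.

```python
import numpy as np
from scipy.optimize import fsolve

R = 8.314  # J/(mol K)

def sle_binary(x1, Tm, dHfus, gamma_liq=None, gamma_sol=None):
    """Solve the classical SLE relations for a binary forming a solid solution:

        x_i * gamma_i^L = z_i * gamma_i^S * exp[(dHfus_i / R) * (1/Tm_i - 1/T)]

    for i = 1, 2 with z1 + z2 = 1.  Unknowns: temperature T and solid
    composition z1.  Activity-coefficient models default to ideal behaviour
    here; the paper uses UNIFAC for both phases."""
    gamma_liq = gamma_liq or (lambda x, T: (1.0, 1.0))
    gamma_sol = gamma_sol or (lambda z, T: (1.0, 1.0))
    x = (x1, 1.0 - x1)

    def residuals(u):
        T, z1 = u
        z = (z1, 1.0 - z1)
        gL, gS = gamma_liq(x, T), gamma_sol(z, T)
        return [x[i] * gL[i]
                - z[i] * gS[i] * np.exp(dHfus[i] / R * (1.0 / Tm[i] - 1.0 / T))
                for i in range(2)]

    T0 = x[0] * Tm[0] + x[1] * Tm[1]          # mole-fraction-weighted guess
    T, z1 = fsolve(residuals, [T0, 0.5])
    return T, z1

# Illustrative values only, roughly of the order of long-chain lipids.
T, z1 = sle_binary(x1=0.3, Tm=(345.0, 331.0), dHfus=(170e3, 60e3))
print(T, z1)
```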
Abstract:
β-Carotene, zeaxanthin, lutein, β-cryptoxanthin, and lycopene are liposoluble pigments widely distributed in vegetables and fruits and, after ingestion, these compounds are usually detected in human blood plasma. In this study, we evaluated their potential to inhibit hemolysis of human erythrocytes, as mediated by the toxicity of peroxyl radicals (ROO•). Thus, 2,2'-azobis(2-methylpropionamidine) dihydrochloride (AAPH) was used as the ROO• generator, and the hemolysis assay was carried out under experimental conditions optimized by response surface methodology and successfully adapted to a microplate assay. The optimized conditions were verified at 30 × 10⁶ cells/mL and 17 mM of AAPH for 3 h, at which 48 ± 5% hemolysis was achieved in freshly isolated erythrocytes. Among the tested carotenoids, lycopene (IC₅₀ = 0.24 ± 0.05 μM) was the most efficient at preventing hemolysis, followed by β-carotene (0.32 ± 0.02 μM), lutein (0.38 ± 0.02 μM), and zeaxanthin (0.43 ± 0.02 μM). These carotenoids were at least 5 times more effective than quercetin, trolox, and ascorbic acid (positive controls). β-Cryptoxanthin did not present any erythroprotective effect, but rather induced a hemolytic effect at the highest tested concentration (3 μM). These results suggest that selected carotenoids may have the potential to act as important erythroprotective agents by preventing ROO•-induced toxicity in human erythrocytes.
Abstract:
PURPOSE: To compare the Full Threshold (FT) and SITA Standard (SS) strategies in glaucomatous patients undergoing automated perimetry for the first time. METHODS: Thirty-one glaucomatous patients who had never undergone perimetry underwent automated perimetry (Humphrey, program 30-2) with both FT and SS on the same day, with an interval of at least 15 minutes. The order of the examinations was randomized, and only one eye per patient was analyzed. Three analyses were performed: a) all the examinations, regardless of the order of application; b) only the first examinations; c) only the second examinations. In order to calculate the sensitivity of both strategies, the following criteria were used to define abnormality: glaucoma hemifield test (GHT) outside normal limits, pattern standard deviation (PSD) <5%, or a cluster of 3 adjacent points with p<5% on the pattern deviation probability plot. RESULTS: When the results of all examinations were analyzed regardless of the order in which they were performed, the number of depressed points with p<0.5% on the pattern deviation probability map was significantly greater with SS (p=0.037), and the sensitivities were 87.1% for SS and 77.4% for FT (p=0.506). When only the first examinations were compared, there were no statistically significant differences in the number of depressed points, but the sensitivity of SS (100%) was significantly greater than that obtained with FT (70.6%) (p=0.048). When only the second examinations were compared, there were no statistically significant differences in the number of depressed points, and the sensitivities of SS (76.5%) and FT (85.7%) did not differ significantly (p=0.664). CONCLUSION: SS may have a higher sensitivity than FT in glaucomatous patients undergoing automated perimetry for the first time. However, this difference tends to disappear in subsequent examinations.
Abstract:
OBJECTIVE: The objective of this work was to study the practical peak voltage (PPV) quantity, determined from the voltage waveform applied to radiological tubes, and to compare it with some definitions of kVp for different types of generators: single-phase (full-wave, clinical), three-phase (six-pulse, clinical), and constant potential (industrial). MATERIALS AND METHODS: The work involved comparing the PPV measured invasively (using a voltage divider) with the response of two commercial non-invasive meters, as well as with the values of other quantities used to measure the peak voltage applied to the X-ray tube, and analyzing the variation of the PPV with the percentage voltage ripple. RESULTS: The difference between the PPV and the most common definitions of peak voltage was found to increase with ripple. The PPV values varied by up to 3% and 5%, respectively, when comparing invasive and non-invasive measurements made with the three-phase and single-phase equipment. CONCLUSION: The results showed that the main influence quantity affecting the PPV is the voltage ripple. Additionally, PPV values obtained with non-invasive meters should be evaluated bearing in mind that they depend on the acquisition rate and on the waveform acquired by the instrument.
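For reference, the practical peak voltage is a weighted average of the voltage waveform, PPV = Σ w(U_i)U_i / Σ w(U_i), where the weight w(U) reflects the image contrast a constant potential U would produce. The sketch below computes such a weighted average for a single-phase waveform; the weighting function shown is only a placeholder, not the normative one defined in IEC 61676.

```python
import numpy as np

def practical_peak_voltage(voltage_kv, weight=None):
    """Weighted-average definition of the practical peak voltage (PPV): each
    sample of the tube-voltage waveform contributes in proportion to a weight
    w(U) describing how much image contrast a constant potential U would
    produce.  The normative weighting function is given in IEC 61676; the
    placeholder below simply grows with voltage and is for illustration only.

        PPV = sum_i w(U_i) * U_i / sum_i w(U_i)
    """
    voltage_kv = np.asarray(voltage_kv, dtype=float)
    if weight is None:
        weight = lambda u: u ** 4        # placeholder, NOT the IEC weighting
    w = weight(voltage_kv)
    return np.sum(w * voltage_kv) / np.sum(w)

# Example: single-phase full-wave rectified waveform (large ripple).
t = np.linspace(0.0, 0.02, 2000)                  # two 50 Hz half-cycles
u = 80.0 * np.abs(np.sin(2 * np.pi * 50 * t))     # peak value (kVp) = 80
u = u[u > 0.1 * u.max()]                          # ignore the near-zero tail
print(practical_peak_voltage(u))   # noticeably lower than the 80 kV peak
```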
Abstract:
The network of HIV counseling and testing centers in São Paulo, Brazil is a major source of data used to build epidemiological profiles of the client population. We examined HIV-1 incidence from November 2000 to April 2001, comparing epidemiological and socio-behavioral data of recently-infected individuals with those with long-standing infection. A less sensitive ELISA was employed to identify recent infection. The overall incidence of HIV-1 infection was 0.53/100/year (95% CI: 0.31-0.85/100/year): 0.77/100/year for males (95% CI: 0.42-1.27/100/year) and 0.22/100/year (95% CI: 0.05-0.59/100/year) for females. Overall HIV-1 prevalence was 3.2% (95% CI: 2.8-3.7%), being 4.0% among males (95% CI: 3.3-4.7%) and 2.1% among females (95% CI: 1.6-2.8%). Recent infections accounted for 15% of the total (95% CI: 10.2-20.8%). Recent infection correlated with being younger and male (p = 0.019). Therefore, recent infection was more common among younger males and older females.
Abstract:
This work develops a method for solving ordinary differential equations, that is, initial-value problems, with solutions approximated using Legendre polynomials. An iterative procedure for adjusting the polynomial coefficients is developed, based on a genetic algorithm. This procedure is applied to several examples, comparing its results with the best polynomial fit in cases where numerical solutions from the traditional Runge-Kutta or Adams methods are available. The resulting algorithm provides reliable solutions even when such numerical solutions are not available, that is, when the mass matrix is singular or the equation leads to unstable integration.
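A minimal sketch of the approach is shown below, assuming a real-coded, mutation-only GA and a fitness built from the sampled ODE residual plus an initial-condition penalty; the series degree, population settings, and example ODE are illustrative choices, not those of the paper.

```python
import numpy as np
from numpy.polynomial import legendre as L

def solve_ivp_ga(f, y0, t_span, degree=5, pop_size=60, generations=500, seed=0):
    """Approximate the solution of y' = f(t, y), y(a) = y0 on [a, b] by a
    Legendre series whose coefficients are adjusted by a simple genetic
    algorithm.  Fitness = mean squared ODE residual on a grid + penalty on
    the initial condition.  Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    a, b = t_span
    t = np.linspace(a, b, 64)
    s = 2 * (t - a) / (b - a) - 1                 # map [a, b] onto [-1, 1]

    def cost(c):
        y = L.legval(s, c)
        dy = L.legval(s, L.legder(c)) * 2 / (b - a)   # chain rule for the map
        residual = dy - f(t, y)
        ic_error = L.legval(-1.0, c) - y0
        return np.mean(residual ** 2) + 10.0 * ic_error ** 2

    pop = rng.normal(scale=1.0, size=(pop_size, degree + 1))
    for _ in range(generations):
        order = np.argsort([cost(c) for c in pop])
        parents = pop[order[: pop_size // 2]]          # elitist truncation
        children = parents + rng.normal(scale=0.1, size=parents.shape)
        pop = np.vstack([parents, children])
    best = pop[np.argmin([cost(c) for c in pop])]
    return lambda tq: L.legval(2 * (np.asarray(tq) - a) / (b - a) - 1, best)

# Example: y' = -y, y(0) = 1 on [0, 2]; rough agreement with exp(-t) is
# expected -- this is an illustration, not a tuned solver.
y = solve_ivp_ga(lambda t, y: -y, y0=1.0, t_span=(0.0, 2.0))
print(y(1.0), np.exp(-1.0))
```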
Abstract:
There is evidence that a significant number of patients with schizophrenia and other chronic psychotic disorders are prescribed high-dose antipsychotic drugs, despite the fact that clinical guidelines recommend the routine use of a single antipsychotic drug at a standard dose. Prescriptions for high-dose and combined antipsychotic drugs are relatively common in clinical practice. This occurs even though published trials of high-dose antipsychotic treatment for schizophrenia provide little evidence of its effectiveness and, most importantly, such a strategy is not recommended. Moreover, there is mounting evidence of a higher incidence of side effects and mortality associated with high-dose antipsychotic treatment. We therefore present a practical pocket checklist aimed at minimizing predicted and unpredicted side effects during such treatment.