852 results for Initial data problem


Relevance:

30.00%

Publisher:

Abstract:

The attributes describing a data set may often be arranged in meaningful subsets, each of which corresponds to a different aspect of the data. An unsupervised algorithm (SCAD) that simultaneously performs fuzzy clustering and aspect weighting was proposed in the literature. However, SCAD may fail and halt under certain conditions. To fix this problem, its steps are modified and reordered so as to reduce the number of parameters the user must set. In this paper we prove that each step of the resulting algorithm, named ASCAD, globally minimizes its cost function with respect to the argument being optimized. Asymptotic analysis shows that ASCAD has the same time complexity as fuzzy c-means. A hard version of the algorithm and a novel validity criterion that considers aspect weights in order to estimate the number of clusters are also described. The proposed method is assessed over several artificial and real data sets.
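The exact ASCAD update rules are derived in the paper itself; the Python sketch below (hypothetical helper `aspect_weighted_fcm`, with one made-up closed form for the weight update) only illustrates the alternating scheme this family of algorithms shares: memberships, centroids, and per-cluster aspect weights are each optimized in turn while the others are held fixed.

```python
import numpy as np

def aspect_weighted_fcm(X, aspects, c=3, m=2.0, iters=50, seed=0):
    """Toy alternating optimization: memberships U, centroids V, and
    per-cluster aspect weights W are updated in turn (not ASCAD's exact rules).
    `aspects` is a list of column-index arrays, one per aspect of the data."""
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    U = rng.dirichlet(np.ones(c), size=n)                 # rows sum to 1
    for _ in range(iters):
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]          # weighted centroids
        # squared distance to each centroid, split by aspect: (n, c, n_aspects)
        D = np.stack([((X[:, a][:, None, :] - V[:, a]) ** 2).sum(-1)
                      for a in aspects], axis=-1)
        # inverse-cost aspect weights per cluster (one simple closed form)
        A = (Um[:, :, None] * D).sum(axis=0) + 1e-12      # (c, n_aspects)
        W = (1.0 / A) / (1.0 / A).sum(axis=1, keepdims=True)
        # standard FCM membership update on the aspect-weighted distance
        dist = (D * W).sum(-1) + 1e-12
        U = (1.0 / dist) ** (1.0 / (m - 1))
        U /= U.sum(axis=1, keepdims=True)
    return U, V, W
```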

Relevance:

30.00%

Publisher:

Abstract:

Objectives: The Brazilian public health system does not provide electroconvulsive therapy (ECT), which is limited to a few academic services. National mental health policies are against ECT. Our objectives were to analyze critically the public policies toward ECT and present the current situation using statistics from the Institute of Psychiatry of the University of Sao Paulo (IPq-HCFMUSP) and summary data from the other 13 ECT services identified in the country. Methods: Data regarding ECT treatment at the IPq-HCFMUSP were collected from January 2009 to June 2010 (demographic data, number of sessions, and diagnoses). All the data were analyzed using SPSS 19, Epi Info 2000, and Excel. Results: During this period, 331 patients were treated at IPq-HCFMUSP: 221 (67%) were from Sao Paulo city, 50 (15.2%) from Sao Paulo's metropolitan area, 39 (11.8%) from Sao Paulo's countryside, and 20 (6.1%) from other states. A total of 7352 ECT treatments were delivered, 63.0% (4629) of them entirely via the public health system (although not funded by the federal government); the main diagnoses were a mood disorder in 86.4% and schizophrenia in 7.3% of the cases. Conclusions: There is an important lack of public assistance for ECT, affecting mainly the poor and severely ill patients. The university services are overcrowded and cannot handle all the referrals. The authors press for changes in the mental health policies.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: This research project examined the influence of doctors' speciality on primary health care (PHC) problem solving in Belo Horizonte (BH), Brazil, comparing homeopathic doctors with family health (FH) doctors from the management's and the patients' viewpoints. In BH, both FH and homeopathic doctors work in PHC. The index of resolvability (IR) is used to compare problem resolution by doctors. Methods: The present research compared IR using official data from the Secretariat of Health, test requests made by the doctors, and 482 structured interviews with patients. A total of 217,963 consultations by 14 homeopaths and 67 FH doctors between 1 July 2006 and 30 June 2007 were analysed. Results: The results show significantly greater problem resolution by homeopaths than by FH doctors. Conclusion: In BH, the medical speciality, homeopathy or FH, has an impact on problem solving, from both the managers' and the patients' points of view. Homeopaths request fewer tests and have a better IR than FH doctors. Specialisation in homeopathy is an independent positive factor in problem solving at the PHC level in BH, Brazil. Homeopathy (2012) 101, 44-50.

Relevance:

30.00%

Publisher:

Abstract:

We propose simple heuristics for the assembly line worker assignment and balancing problem. This problem typically occurs in assembly lines in sheltered work centers for the disabled. Unlike in the well-known simple assembly line balancing problem, task execution times vary according to the assigned worker. We develop a constructive heuristic framework based on task and worker priority rules that define the order in which tasks and workers should be assigned to the workstations. We present a number of such rules and compare their performance across three possible uses: as a stand-alone method, as an initial solution generator for meta-heuristics, and as a decoder for a hybrid genetic algorithm. Our results show that the heuristics are fast, obtain good results as a stand-alone method, and are efficient when used as an initial solution generator or as a solution decoder within more elaborate approaches.
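The paper's actual priority rules are not reproduced here; the Python sketch below (hypothetical function `constructive_alwabp`, with one arbitrary rule choice: pack precedence-available tasks shortest-time-first, then keep the worker who packs the most tasks) shows the general shape of such a constructive heuristic for a fixed cycle time.

```python
def constructive_alwabp(times, prec, cycle):
    """Greedy sketch: open stations one at a time; for each candidate worker,
    pack precedence-available tasks shortest-time-first within the cycle time,
    then keep the worker who packs the most tasks (an arbitrary rule choice).
    times[w][t]: execution time of task t by worker w (None if incapable);
    prec: dict task -> set of predecessor tasks; cycle: cycle-time limit."""
    n_tasks = len(times[0])
    done, stations = set(), []
    free_workers = set(range(len(times)))
    while len(done) < n_tasks and free_workers:
        best = None
        for w in free_workers:
            load, picked, seen = 0.0, [], set(done)
            while True:
                avail = [t for t in range(n_tasks)
                         if t not in seen and prec.get(t, set()) <= seen
                         and times[w][t] is not None
                         and load + times[w][t] <= cycle]
                if not avail:
                    break
                t = min(avail, key=lambda t: times[w][t])   # task priority rule
                picked.append(t); seen.add(t); load += times[w][t]
            if picked and (best is None or len(picked) > len(best[1])):
                best = (w, picked)
        if best is None:
            break            # no worker can progress: infeasible for this cycle
        w, picked = best
        stations.append((w, picked))
        done.update(picked)
        free_workers.discard(w)
    return stations

# toy instance: 2 workers, 4 tasks, chain precedence 0 -> 1 -> 2 -> 3
times = [[4, 3, None, 5], [2, 6, 3, 4]]
print(constructive_alwabp(times, {1: {0}, 2: {1}, 3: {2}}, cycle=8))
```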

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Video-assisted thoracic sympathectomy provides excellent resolution of palmar and axillary hyperhidrosis but is associated with compensatory hyperhidrosis. Low doses of oxybutynin, an anticholinergic medication that competitively antagonizes the muscarinic acetylcholine receptor, can be used to treat palmar hyperhidrosis with fewer side effects. Objective: This study evaluated the effectiveness of and patient satisfaction with oral oxybutynin at low doses (5 mg twice daily) compared with placebo for treating palmar hyperhidrosis. Methods: This was a prospective, randomized, controlled study. From December 2010 to February 2011, 50 consecutive patients with palmar hyperhidrosis were treated with oxybutynin or placebo. Data were collected from 50 patients, but 5 (10.0%) were lost to follow-up. During the first week, patients received 2.5 mg of oxybutynin once daily in the evening. From days 8 to 21, they received 2.5 mg twice daily, and from day 22 to the end of week 6, they received 5 mg twice daily. All patients underwent two evaluations, before and after (6 weeks) the oxybutynin treatment, using a clinical questionnaire and a clinical protocol for quality of life. Results: Palmar and axillary hyperhidrosis improved in >70% of the patients, and 47.8% of those presented great improvement. Plantar hyperhidrosis improved in >90% of the patients. Most patients (65.2%) showed improvements in their quality of life. The side effects were minor, with dry mouth being the most frequent (47.8%). Conclusions: Treatment of palmar and axillary hyperhidrosis with oxybutynin is a good initial alternative given that it presents good results and improves quality of life. (J Vasc Surg 2012;55:1696-700.)

Relevance:

30.00%

Publisher:

Abstract:

A computational pipeline combining texture analysis and pattern classification algorithms was developed for investigating associations between high-resolution MRI features and histological data. This methodology was tested in the study of dentate gyrus images of sclerotic hippocampi resected from refractory epilepsy patients. Images were acquired using a simple surface coil in a 3.0T MRI scanner. All specimens were subsequently submitted to histological semiquantitative evaluation. The computational pipeline was applied to classify pixels according to: a) dentate gyrus histological parameters and b) patients' febrile or afebrile initial precipitating insult history. The pipeline results for febrile and afebrile patients achieved 70% classification accuracy, with 78% sensitivity and 80% specificity [area under the receiver operating characteristic (ROC) curve: 0.89]. The analysis of the histological data alone did not achieve significant power to separate the febrile and afebrile groups. Interestingly, the results from our approach did not show significant correlation with the histological parameters (which per se were not enough to classify the patient groups). These results show the potential of combining computational texture analysis with classification methods for detecting subtle MRI signal differences, sufficient to provide good clinical classification. A wide range of applications of this pipeline is also possible in other areas of medical imaging. Magn Reson Med, 2012. (c) 2012 Wiley Periodicals, Inc.
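The pipeline's specific texture descriptors and classifier are detailed in the paper; as a hedged stand-in, the Python sketch below computes a few classical patch-level texture features with NumPy and feeds them to a cross-validated scikit-learn classifier scored by ROC AUC, the same figure of merit reported above. The helper name and the synthetic "images" are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def patch_texture_features(img, size=8):
    """Split a 2-D image into non-overlapping patches and compute classical
    texture descriptors per patch (mean, variance, gradient energy,
    intensity entropy): stand-ins for the paper's texture analysis stage."""
    h, w = img.shape
    feats = []
    for i in range(0, h - size + 1, size):
        for j in range(0, w - size + 1, size):
            p = img[i:i + size, j:j + size]
            gy, gx = np.gradient(p.astype(float))
            hist, _ = np.histogram(p, bins=16, density=True)
            hist = hist[hist > 0]
            feats.append([p.mean(), p.var(),
                          (gx ** 2 + gy ** 2).mean(),
                          -(hist * np.log2(hist)).sum()])
    return np.array(feats)

# toy two-class example standing in for the two pixel-class problems above
rng = np.random.default_rng(0)
a = patch_texture_features(rng.normal(100, 10, (64, 64)))
b = patch_texture_features(rng.normal(100, 25, (64, 64)))   # rougher texture
X, y = np.vstack([a, b]), np.r_[np.zeros(len(a)), np.ones(len(b))]
print(cross_val_score(RandomForestClassifier(random_state=0), X, y,
                      cv=5, scoring="roc_auc").mean())
```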

Relevance:

30.00%

Publisher:

Abstract:

Determination of the utility harmonic impedance from measurements is a significant task for utility power-quality improvement and management. Compared with the well-established, accurate invasive methods, noninvasive methods are more desirable since they work with the natural variations of the loads connected to the point of common coupling (PCC), so that no intentional disturbance is needed. However, the accuracy of these methods has to be improved. In this context, this paper first points out that the critical problem for the noninvasive methods is how to select the measurements that can be used with confidence for utility harmonic impedance calculation. The paper then presents a new measurement technique based on complex-data least-squares regression, combined with two techniques of data selection. Simulation and field test results show that the proposed noninvasive method is practical and robust, so that it can be used with confidence to determine utility harmonic impedances.
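The paper's two data-selection techniques are its contribution and are not reproduced here; the sketch below only shows the core complex-domain least-squares step on synthetic data, under an assumed Thevenin sign convention V = E_u - Z_u I at the PCC and a deliberately crude selection proxy (keep above-median |I| snapshots).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical synthetic snapshots at one harmonic order:
# V = E_u - Z_u * I  (Thevenin view from the PCC; sign convention assumed)
Z_true = 0.5 + 2.0j
E_u = 10.0 * np.exp(1j * rng.uniform(0, 2 * np.pi))        # background source
I = rng.normal(20, 5, 200) * np.exp(1j * rng.uniform(0, 2 * np.pi, 200))
noise = rng.normal(0, 0.3, 200) * np.exp(1j * rng.uniform(0, 2 * np.pi, 200))
V = E_u - Z_true * I + noise

# Crude stand-in for data selection: keep load-variation-dominated snapshots
keep = np.abs(I) > np.median(np.abs(I))

# Complex least squares on V = E - Z * I, unknowns x = [E, Z]
A = np.column_stack([np.ones(keep.sum()), -I[keep]])
x, *_ = np.linalg.lstsq(A, V[keep], rcond=None)
print("estimated Z_u:", x[1])
```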

Relevance:

30.00%

Publisher:

Abstract:

Data from relativistic nuclear collisions on two-particle correlations exhibit structures as a function of relative azimuthal angle and rapidity. A unified description of these near-side and away-side structures is proposed for low to moderate transverse momentum. It is based on the combined effect of tubular initial conditions and hydrodynamical expansion. Contrary to expectations, the hydrodynamic solution shows that the high-energy-density tubes (left over from the initial particle interactions) give rise to particle emission in two directions, and this is what leads to the various structures. This description is sensitive to some of the initial tube parameters and may provide a probe of the strong interaction. The explanation is compared with an alternative one in which some triangularity in the initial conditions is assumed. A possible experimental test is suggested. (C) 2012 Elsevier B.V. All rights reserved.
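As a purely illustrative aid (not the hydrodynamic calculation), the sketch below builds the two-particle correlation observable named above: a histogram of pair differences in azimuth and (pseudo)rapidity, for a toy event in which a "tube" emits particles in two azimuthal directions on top of an isotropic background.

```python
import numpy as np

def two_particle_correlation(phis, etas, nbins=(24, 20)):
    """Histogram of pair differences (delta-phi, delta-eta) for one event;
    the measured correlation function is this, averaged over events and
    divided by a mixed-event background (omitted in this toy sketch)."""
    dphi = phis[:, None] - phis[None, :]
    dphi = (dphi + np.pi) % (2 * np.pi) - np.pi        # wrap to [-pi, pi)
    deta = etas[:, None] - etas[None, :]
    mask = ~np.eye(len(phis), dtype=bool)              # drop self-pairs
    H, _, _ = np.histogram2d(dphi[mask], deta[mask], bins=nbins,
                             range=[[-np.pi, np.pi], [-2, 2]])
    return H

# toy event: isotropic background plus two back-to-back "tube" emissions
rng = np.random.default_rng(2)
phi = np.concatenate([rng.uniform(-np.pi, np.pi, 300),
                      rng.normal(0.8, 0.2, 30),
                      rng.normal(0.8 - np.pi, 0.2, 30)])
eta = rng.uniform(-1, 1, phi.size)
H = two_particle_correlation(phi, eta)
```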

Relevance:

30.00%

Publisher:

Abstract:

In this article, we propose a new Bayesian flexible cure rate survival model, which generalises the stochastic model of Klebanov et al. [Klebanov LB, Rachev ST and Yakovlev AY. A stochastic model of radiation carcinogenesis: latent time distributions and their properties. Math Biosci 1993; 113: 51-75], and has much in common with the destructive model formulated by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP, Brazil, 2009 (accepted in Lifetime Data Analysis)]. In our approach, the accumulated number of lesions or altered cells follows a compound weighted Poisson distribution. This model is more flexible than the promotion time cure model in terms of dispersion. Moreover, it possesses an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest, as it includes a destructive process of tumour cells after an initial treatment, or the capacity of an individual exposed to irradiation to repair altered cells that result in cancer induction. In other words, what is recorded is only the damaged portion of the original number of altered cells not eliminated by the treatment or repaired by the repair system of an individual. Markov chain Monte Carlo (MCMC) methods are then used to develop Bayesian inference for the proposed model. We also discuss model selection and present an illustration with the cutaneous melanoma data set analysed by Rodrigues et al. in the aforementioned technical report.
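A minimal generative sketch of the destructive mechanism described above, with loudly simplifying assumptions: a plain Poisson stands in for the compound weighted Poisson, damage is Bernoulli per cell, and latent times are exponential. Subjects with zero surviving cells are cured; with lambda = 2 and per-cell survival p = 0.4 the cure fraction is exp(-lambda * p), roughly 0.45, which the simulation reproduces.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_destructive_cure(n, lam=2.0, p=0.4, scale=1.0):
    """N initial altered cells ~ Poisson(lam) (a compound weighted Poisson in
    the paper); each survives treatment/repair with probability p; the event
    time is the minimum latent time of the D surviving cells; D == 0 => cured."""
    N = rng.poisson(lam, n)          # initial number of altered cells
    D = rng.binomial(N, p)           # cells left after the destructive step
    T = np.full(n, np.inf)           # cured subjects never experience the event
    has = D > 0
    T[has] = [rng.exponential(scale, d).min() for d in D[has]]
    return T

T = simulate_destructive_cure(100_000)
print("cure fraction:", np.isinf(T).mean(), "vs theory", np.exp(-2.0 * 0.4))
```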

Relevance:

30.00%

Publisher:

Abstract:

Abstract Background The search for enriched (also known as over-represented or enhanced) ontology terms in a list of genes obtained from microarray experiments is becoming a standard procedure for a system-level analysis. This procedure tries to summarize the information by focussing on classification schemes such as Gene Ontology and KEGG pathways instead of on individual genes. Although it is well known in statistics that association and significance are distinct concepts, only the latter approach has been used to deal with the ontology term enrichment problem. Results BayGO implements a Bayesian approach to searching for enriched terms in microarray data. The R source code is freely available at http://blasto.iq.usp.br/~tkoide/BayGO in three versions: Linux, which can be easily incorporated into pre-existing pipelines; Windows, to be controlled interactively; and a web tool. The software was validated using a bacterial heat shock response dataset, since this stress triggers known system-level responses. Conclusion The Bayesian model accounts for the fact that not all the genes from a given category may be observable in microarray data, owing to low-intensity signal, quality filters, genes that were not spotted, and so on. Moreover, BayGO allows one to measure the statistical association between generic ontology terms and differential expression, instead of working only with the common significance analysis.
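BayGO's Bayesian machinery is not reproduced here; the Python sketch below merely makes the significance-versus-association distinction concrete on a 2x2 enrichment table: the classical hypergeometric p-value (significance) next to a plain odds ratio (one frequentist association measure; BayGO's own measure is Bayesian). The helper name and the numbers are illustrative.

```python
from scipy.stats import hypergeom

def term_enrichment(k, n, K, N):
    """k of the n selected genes carry the term; K of the N genes overall do.
    Returns (over-representation p-value, odds ratio)."""
    p_value = hypergeom.sf(k - 1, N, K, n)   # P(X >= k) under no enrichment
    odds = (k * (N - K - n + k)) / max((n - k) * (K - k), 1)
    return p_value, odds

# a term can be strongly associated yet barely "significant", and vice versa
print(term_enrichment(k=12, n=100, K=40, N=6000))
```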

Relevance:

30.00%

Publisher:

Abstract:

Abstract Background A popular model for gene regulatory networks is the Boolean network model. In this paper, we propose an algorithm to analyse gene regulatory interactions using the Boolean network model and time-series data. The Boolean networks considered are restricted in the sense that only a subset of all possible Boolean functions is allowed. We explore some mathematical properties of these restricted Boolean networks in order to avoid a full search. The problem is modeled as a Constraint Satisfaction Problem (CSP), and CSP techniques are used to solve it. Results We applied the proposed algorithm to two data sets. First, we used an artificial dataset obtained from a model of the budding yeast cell cycle. The second data set is derived from experiments performed using HeLa cells. The results show that some interactions can be fully or at least partially determined under the Boolean model considered. Conclusions The proposed algorithm can be used as a first step in the detection of gene/protein interactions. It is able to infer gene relationships from time-series gene expression data, and this inference process can be aided by a priori knowledge.
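The paper prunes the search with CSP techniques and a restricted function class; the brute-force Python sketch below shows the underlying consistency constraint it exploits: a candidate (regulator set, truth table) pair is kept only if it reproduces every observed transition of the target gene. Everything here (function name, the toy trajectory) is illustrative.

```python
from itertools import combinations, product

def consistent_rules(series, target, max_inputs=2):
    """Exhaustively enumerate (input-set, truth-table) pairs with at most
    `max_inputs` regulators that reproduce the target gene's transitions in a
    binary time series; a CSP solver prunes this search, as in the paper.
    series: list of tuples of 0/1 gene states at consecutive time steps."""
    n = len(series[0])
    found = []
    for k in range(1, max_inputs + 1):
        for inputs in combinations(range(n), k):
            for table in product((0, 1), repeat=2 ** k):
                ok = all(
                    table[int("".join(str(s[i]) for i in inputs), 2)] == t[target]
                    for s, t in zip(series, series[1:]))
                if ok:
                    found.append((inputs, table))
    return found

# toy 3-gene trajectory: which rules explain gene 2's next state?
traj = [(0, 0, 0), (0, 1, 0), (1, 1, 0), (1, 0, 1), (0, 0, 1)]
for inputs, table in consistent_rules(traj, target=2):
    print(inputs, table)
```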

Relevance:

30.00%

Publisher:

Abstract:

Rare variants are becoming the new candidates in the search for genetic variants that predispose individuals to a phenotype of interest. Their low prevalence in a population requires the development of dedicated detection and analytical methods. A family-based approach could greatly enhance their detection and interpretation because rare variants are nearly family specific. In this report, we test several distinct approaches to analyzing the information provided by rare and common variants and examine how they can be effectively used to pinpoint putative candidate genes for follow-up studies. The analyses were performed on the mini-exome data set provided by Genetic Analysis Workshop 17. Eight approaches were tested, four using the trait's heritability estimates and four using QTDT models. We compared the sensitivity, specificity, and positive and negative predictive values of these methods in light of the simulation parameters. Our results highlight important limitations of current methods for dealing with rare and common variants: all methods presented reduced specificity and were consequently prone to false-positive associations. Methods analyzing common-variant information showed enhanced sensitivity compared with rare-variant methods. Furthermore, our limited knowledge of the use of biological databases for gene annotation, possibly as covariates in regression models, imposes a barrier to further research.
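For reference, the four comparison quantities mentioned above follow directly from a confusion matrix of true and false calls; below is a minimal Python sketch with hypothetical counts, in which a liberal method gains sensitivity but, as reported here, loses specificity and positive predictive value.

```python
def screening_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, NPV: the four quantities on which
    the eight approaches were compared."""
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn)}

# hypothetical counts for a method that flags many candidate genes
print(screening_metrics(tp=8, fp=120, tn=2870, fn=2))
```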

Relevance:

30.00%

Publisher:

Abstract:

CONTEXT AND PURPOSE: Partial nephrectomy has become the standard of care for renal tumors less than 4 cm in diameter. Controversy still exists, however, regarding the best surgical approach, especially when minimally invasive techniques are taken into account. Robotic-assisted laparoscopic partial nephrectomy (RALPN) has emerged as a promising technique that helps surgeons achieve the standards of open partial nephrectomy care while offering a minimally invasive approach. The objective of the present study was to describe our initial experience with robotic-assisted laparoscopic partial nephrectomy and extensively review the pertinent literature. MATERIALS AND METHODS: Between August 2009 and February 2010, eight consecutive selected patients with contrast-enhancing renal masses observed by CT underwent RALPN in a private institution. We collected information on the patients' demographics, preoperative tumor characteristics, and detailed operative, postoperative, and pathological data. In addition, a PubMed search was performed to provide an extensive review of the robotic-assisted laparoscopic partial nephrectomy literature. RESULTS: Seven patients had RALPN on the left or right side with no intraoperative complications. One patient was electively converted to a robotic-assisted radical nephrectomy. The operative time ranged from 120 to 300 min, estimated blood loss (EBL) ranged from 75 to 400 mL, and, in five cases, the warm ischemia time (WIT) ranged from 18 to 32 min. Two patients did not require any clamping. Overall, no transfusions were necessary, and there were no intraoperative complications or adverse postoperative clinical events. All margins were negative, and all patients were disease-free at the 6-month follow-up. CONCLUSIONS: Robotic-assisted laparoscopic partial nephrectomy is a feasible and safe approach to small renal cortical masses. Further prospective studies are needed to compare open partial nephrectomy with its minimally invasive counterparts.

Relevance:

30.00%

Publisher:

Abstract:

The importance of mechanical aspects related to cell activity and its environment is becoming more evident due to their influence on stem cell differentiation and on the development of diseases such as atherosclerosis. Mechanical tension homeostasis is related to normal tissue behavior, and its lack may be related to the formation of cancer, which shows a higher mechanical tension. Due to the complexity of cellular activity, the application of simplified models may elucidate which factors are really essential and which have only a marginal effect. The development of a systematic method to reconstruct the elements involved in the perception of mechanical aspects by the cell may substantially accelerate the validation of these models. This work proposes the development of a routine capable of reconstructing the topology of focal adhesions and of the actomyosin portion of the cytoskeleton from the displacement field generated by the cell on a flexible substrate. Another way to think of this problem is as developing an algorithm to reconstruct the forces applied by the cell from measurements of the substrate displacement, which characterizes it as an inverse problem. For this kind of problem, the Topology Optimization Method (TOM) is suitable for finding a solution. TOM consists of the iterative application of an optimization method and an analysis method to obtain an optimal distribution of material in a fixed domain. One way to obtain the substrate displacement experimentally is through Traction Force Microscopy (TFM), which also provides the forces applied by the cell. Along with systematically generating the distributions of focal adhesions and actomyosin for the validation of simplified models, the algorithm also represents a complementary and more phenomenological approach to TFM. As a first approximation, the actin fibers and the flexible substrate are represented through a two-dimensional linear Finite Element Method. Actin contraction is modeled as an initial stress of the FEM elements. Focal adhesions connecting actin and substrate are represented by springs. The algorithm was applied to data obtained from experiments regarding cytoskeletal prestress and micropatterning, and the numerical results were compared to the experimental ones.
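The TOM-based reconstruction in this work is far richer than can be shown here; as a hedged stand-in, the Python sketch below solves the same kind of linear inverse problem in one dimension: recover point-like "adhesion" forces f from noisy substrate displacements u = G f via Tikhonov-regularized least squares. The kernel G, the noise level, and the regularization weight are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy 1-D stand-in for the inverse problem: displacements u arise from nodal
# forces f through an elastic influence matrix G (u = G f); recover f from u.
n = 60
x = np.linspace(0.0, 1.0, n)
G = 1.0 / (1.0 + 50.0 * (x[:, None] - x[None, :]) ** 2)  # made-up kernel

f_true = np.zeros(n)
f_true[[15, 45]] = [1.0, -1.0]       # two focal-adhesion-like point forces
u = G @ f_true + rng.normal(0.0, 0.01, n)                # noisy "measured" field

lam = 1e-2                           # Tikhonov weight: bias vs. noise blow-up
f_hat = np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ u)
print("largest recovered forces at nodes:", np.argsort(np.abs(f_hat))[-2:])
```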

Relevance:

30.00%

Publisher:

Abstract:

The heating of the solar corona has been investigated for four decades, and several mechanisms able to produce heating have been proposed. Until now it has not been possible to produce quantitative estimates that would establish any one of these heating mechanisms as the most important in the solar corona. In order to investigate which heating mechanism is the most important, a more detailed approach is needed. In this thesis, the heating problem is approached "ab initio", using well-observed facts and including realistic physics in a 3D magneto-hydrodynamic simulation of a small part of the solar atmosphere. The "engine" of the heating mechanism is the solar photospheric velocity field, which braids the magnetic field into a configuration where energy has to be dissipated. The initial magnetic field is taken from an observation of a typical magnetic active region, scaled down to fit inside the computational domain. The driving velocity field is generated by an algorithm that reproduces the statistical and geometrical fingerprints of solar granulation. Using a standard model atmosphere as the thermal initial condition, the simulation goes through a short startup phase, during which the initial thermal stratification is quickly forgotten, after which it stabilizes in statistical equilibrium. In this state, the magnetic field is able to dissipate the same amount of energy as is estimated to be lost through radiation, which is the main energy-loss mechanism in the solar corona. The simulation produces heating that is intermittent on the smallest resolved scales, and hot loops similar to those observed through narrow-band filters in the ultraviolet. Other observed characteristics of the heating are reproduced, as well as a coronal temperature of roughly one million K. Because of the ab initio approach, the amount of heating produced in these simulations represents a lower limit to coronal heating, and the conclusion is that such heating of the corona is unavoidable.