978 results for Prediction algorithms


Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To estimate the likelihood of axillary lymph node involvement in patients with early-stage breast cancer, based on a variety of clinical and pathological factors. METHODS: A retrospective analysis of hospital databases from 1999 to 2007 was performed. Two hundred thirty-nine patients were diagnosed with early-stage breast cancer. Predictive factors such as patient age, tumor size, lymphovascular invasion, histological grade, and immunohistochemical subtype were analyzed to identify variables that may be associated with axillary lymph node metastasis. RESULTS: Patients with tumors negative for estrogen receptor, progesterone receptor, and HER2 had an approximately 90% lower chance of developing lymph node metastasis than those with luminal A tumors (i.e., ER+ and/or PR+, HER2-): odds ratio 0.11; 95% confidence interval 0.01-0.88; p=0.01. Furthermore, the risk of lymph node metastasis for luminal A tumors seemed to decrease as patient age increased, and it was directly correlated with tumor size. CONCLUSION: The molecular classification of early-stage breast cancer using immunohistochemistry may help predict the probability of developing axillary lymph node metastasis. Further studies are needed to optimize predictions of nodal involvement, with the aim of aiding the decision-making process for breast cancer treatment.
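As a hedged illustration of the kind of analysis behind these figures, the sketch below fits a logistic regression and reports odds ratios with 95% confidence intervals; the variable names and toy data are hypothetical placeholders, not the study's records.

```python
# Minimal sketch (not the study's actual analysis): logistic regression
# relating hypothetical predictors to axillary node status, reported as
# odds ratios with 95% confidence intervals, as in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical toy data; the real study used 239 patient records.
rng = np.random.default_rng(0)
n = 239
df = pd.DataFrame({
    "age": rng.integers(30, 80, n),
    "tumor_size_cm": rng.uniform(0.5, 5.0, n),
    "subtype": rng.choice(["luminal_A", "triple_negative", "HER2"], n),
    "node_positive": rng.integers(0, 2, n),
})

fit = smf.logit("node_positive ~ age + tumor_size_cm + C(subtype)", data=df).fit(disp=0)
odds_ratios = np.exp(fit.params)   # OR = exp(coefficient)
ci = np.exp(fit.conf_int())        # 95% CI on the OR scale
print(pd.concat([odds_ratios.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```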

Relevance:

20.00%

Publisher:

Abstract:

Cyanobacteria are unicellular, non-nitrogen-fixing prokaryotes which perform photosynthesis similarly to higher plants. The cyanobacterium Synechocystis sp. strain PCC 6803 is used as a model organism in photosynthesis research. My research described herein aims at understanding the function of the photosynthetic machinery and how it responds to changes in the environment. Detailed knowledge of the regulation of photosynthesis in cyanobacteria can be utilized for biotechnological purposes, for example in harnessing solar energy for biofuel production. In photosynthesis, iron participates in electron transfer. Here, we focused on iron transport in Synechocystis sp. strain PCC 6803 and particularly on the environmental regulation of the genes encoding the FutA2BC ferric iron transporter, which belongs to the ABC transporter family. A homology model built for the ATP-binding subunit FutC indicates that it has a functional ATP-binding site as well as conserved interactions with the channel-forming subunit FutB in the transporter complex. Polyamines are important for cell proliferation, differentiation and apoptosis in prokaryotic and eukaryotic cells. In plants, polyamines have special roles in the stress response and in plant survival. The polyamine metabolism of cyanobacteria under environmental stress is of interest in research on the stress tolerance of higher plants. In this thesis, the potD gene, encoding a polyamine transporter subunit of Synechocystis sp. strain PCC 6803, was characterized for the first time. A homology model built for the PotD protein indicated that it is capable of binding polyamines, with a preference for spermidine. Furthermore, in order to investigate the structural features underlying the substrate specificity, polyamines were docked into the binding site. Spermidine was positioned very similarly in Synechocystis PotD as in the template structure and had the most favorable interactions of the docked polyamines. Based on the homology model, experimental work was conducted, which confirmed the binding preference. Flavodiiron proteins (Flv) are enzymes that protect the cell against the toxicity of oxygen and/or nitric oxide by reducing them. In this thesis, we present a novel type of photoprotection mechanism in cyanobacteria, mediated by the Flv2/Flv4 heterodimer. The constructed homology model of Flv2/Flv4 suggests a functional heterodimer capable of rapid electron transfer. The protein of unknown function sll0218, encoded by the flv2-flv4 operon, is assumed to facilitate the interaction of the Flv2/Flv4 heterodimer and the energy transfer between the phycobilisome and PSII. Flv2/Flv4 provides an alternative electron transfer pathway and functions as an electron sink in PSII electron transfer.

Relevance:

20.00%

Publisher:

Abstract:

This work describes a lumped-parameter mathematical model for the prediction of transients in the aerodynamic circuit of a transonic wind tunnel, and assesses control actions to properly handle those perturbations. The tunnel circuit technology is up to date and incorporates a novel feature: high-enthalpy air injection to extend the tunnel's Reynolds number capability. The model solves the equations of continuity, energy and momentum, taking density, internal energy and mass flow as the basic parameters of the aerodynamic study, and Mach number, stagnation pressure and stagnation temperature, all referred to test section conditions, as the main control variables. The tunnel circuit response to control actions and the stability of the flow are investigated numerically. Initially, for validation purposes, the code was applied to the AWT (the Altitude Wind Tunnel of NASA-Lewis). Subsequently, the Brazilian transonic wind tunnel was investigated, with all the main control systems modeled, including injection.
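A minimal sketch of the lumped-parameter idea, under simplifying assumptions (one control volume, ideal gas, hypothetical injection and vent flows); it is not the tunnel model itself, only the kind of mass/energy balance such a model integrates.

```python
# Minimal lumped-parameter sketch (illustrative only, not the tunnel code):
# one control volume with mass and energy balances for an ideal gas,
# driven by a hypothetical injection schedule and a hypothetical vent flow.
import numpy as np
from scipy.integrate import solve_ivp

V = 50.0    # control-volume size, m^3 (placeholder)
R = 287.0   # gas constant for air, J/(kg K)
cv = 718.0  # specific heat at constant volume, J/(kg K)
cp = cv + R

def mdot_in(t):   # hypothetical injection schedule, kg/s
    return 20.0 if 1.0 < t < 3.0 else 0.0

def rhs(t, y):
    m, U = y                           # mass [kg], internal energy [J]
    T = U / (m * cv)                   # ideal-gas temperature
    p = m * R * T / V                  # ideal-gas pressure
    mdot_out = 1e-4 * p                # hypothetical vent flow, proportional to pressure
    h_in, h_out = cp * 300.0, cp * T   # specific enthalpies of inflow/outflow
    dm = mdot_in(t) - mdot_out
    dU = mdot_in(t) * h_in - mdot_out * h_out   # energy balance (no heat/work terms)
    return [dm, dU]

m0 = 101325.0 * V / (R * 300.0)        # start at 1 atm, 300 K
sol = solve_ivp(rhs, (0.0, 10.0), [m0, m0 * cv * 300.0], max_step=0.01)
T_end = sol.y[1, -1] / (sol.y[0, -1] * cv)
print("final pressure [kPa]:", sol.y[0, -1] * R * T_end / V / 1e3)
```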

Relevance:

20.00%

Publisher:

Abstract:

Global illumination algorithms are at the center of realistic image synthesis and account for non-trivial light transport and occlusion within scenes, such as indirect illumination, ambient occlusion, and environment lighting. Their computationally most difficult part is determining light source visibility at each visible scene point. Height fields, on the other hand, constitute an important special case of geometry and are mainly used to describe certain types of objects, such as terrains, and to map detailed geometry onto object surfaces. The geometry of an entire scene can also be approximated by treating the distance values of its camera projection as a screen-space height field. In order to shadow height fields from environment lights, a horizon map is usually used to occlude incident light. We reduce the per-receiver time complexity of generating the horizon map on N × N height fields from the O(N) of previous work to O(1) by using an algorithm that incrementally traverses the height field and reuses the information already gathered along the path of traversal. We also propose an accurate method to integrate the incident light within the limits given by the horizon map. Indirect illumination in height fields requires information about which other points are visible to each height field point. We present an algorithm to determine this intervisibility in a time complexity that matches the space complexity of the produced visibility information, in contrast to previous methods, which scale with the height field size. As a result, the amount of computation is reduced by two orders of magnitude in common use cases. Screen-space ambient obscurance methods approximate ambient obscurance from the depth buffer geometry and have been widely adopted by contemporary real-time applications. They work by sampling the screen-space geometry around each receiver point, but have previously been limited to near-field effects because sampling a large radius quickly exceeds the render time budget. We present an algorithm that reduces the quadratic per-pixel complexity of previous methods to a linear complexity by line sweeping over the depth buffer and maintaining an internal representation of the processed geometry from which occluders can be efficiently queried. Another algorithm is presented to determine ambient obscurance from the entire depth buffer at each screen pixel. The algorithm scans the depth buffer in a quick pre-pass and locates important features in it, which are then used to evaluate the ambient obscurance integral accurately. We also propose an evaluation of the integral such that results within a few percent of the ray-traced screen-space reference are obtained at real-time render times.
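To make the horizon-map notion concrete, the sketch below computes per-texel horizon angles along one sweep direction in the straightforward O(N)-per-receiver way; the thesis's contribution is the incremental traversal that replaces this inner scan with amortized O(1) work, which is not reproduced here.

```python
# Minimal sketch of a per-direction horizon map on a height field.
# This is the straightforward O(N)-per-receiver baseline; the thesis
# improves it with an incremental traversal that reuses information
# gathered along the sweep path.
import numpy as np

def horizon_angles_along_rows(height, cell_size=1.0):
    """For each texel, the maximum elevation angle of terrain to its left
    (looking along -x within the same row)."""
    n_rows, n_cols = height.shape
    horizon = np.zeros_like(height, dtype=float)
    for r in range(n_rows):
        for c in range(n_cols):
            best = 0.0
            for j in range(c):                      # scan all texels behind the receiver
                dh = height[r, j] - height[r, c]
                dist = (c - j) * cell_size
                best = max(best, np.arctan2(dh, dist))
            horizon[r, c] = best                    # 0: nothing rises above the horizontal
    return horizon

h = np.random.default_rng(1).random((64, 64)) * 10.0
print(horizon_angles_along_rows(h).max())
```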

Relevance:

20.00%

Publisher:

Abstract:

Linear prediction is a well-established numerical method of signal processing. In the field of optical spectroscopy it is used mainly for extrapolating known parts of an optical signal in order to obtain a longer signal or to deduce missing samples. The former is needed particularly when narrowing spectral lines for the purpose of spectral information extraction. In the present paper, coherent anti-Stokes Raman scattering (CARS) spectra were investigated. The spectra were significantly distorted by the presence of a nonlinear nonresonant background, and the line shapes were far from Gaussian/Lorentzian profiles. To overcome these disadvantages, the maximum entropy method (MEM) was used for phase spectrum retrieval. The broad MEM spectra obtained were then subjected to linear prediction analysis in order to narrow them.
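A minimal sketch of the linear-prediction step on a toy signal (not the CARS/MEM pipeline of the paper): prediction coefficients are fitted to the known segment by least squares and then used to extrapolate further samples.

```python
# Illustrative linear prediction: fit forward prediction coefficients to a
# known signal segment and extrapolate additional samples from them.
import numpy as np

def lp_extrapolate(x, order, n_extra):
    # Least-squares system x[n] ~ sum_k a[k] * x[n-k]
    rows = [x[n - order:n][::-1] for n in range(order, len(x))]
    a, *_ = np.linalg.lstsq(np.vstack(rows), x[order:], rcond=None)
    y = list(x)
    for _ in range(n_extra):                        # recursively predict forward
        y.append(float(np.dot(a, y[-1:-order - 1:-1])))
    return np.array(y)

t = np.linspace(0, 1, 200)
signal = np.exp(-3 * t) * np.cos(2 * np.pi * 25 * t)   # toy damped line shape
extended = lp_extrapolate(signal, order=20, n_extra=100)
print(extended.shape)   # (300,): original 200 samples plus 100 predicted ones
```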

Relevance:

20.00%

Publisher:

Abstract:

This thesis studies the predictability of market switching and delisting events on the OMX First North Nordic multilateral stock exchange, using financial statement information and market information from 2007 to 2012. The study was conducted in a three-stage process. In the first stage, the relevant theoretical framework and an initial variable pool were constructed. Then, exploratory analysis of the initial variable pool was carried out in order to narrow it down and identify relevant variables; this analysis was conducted using the self-organizing map methodology. In the third stage, predictive modeling was carried out with random forest and support vector machine methodologies. The exploratory analysis was able to identify relevant variables, and the results indicate that market switching and delisting events can be predicted to some extent. The empirical results also support the usefulness of financial statement and market information in the prediction of market switching and delisting events.
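A hedged sketch of the third, predictive-modeling stage using scikit-learn; the features and labels below are random placeholders, since the thesis's variable pool and data are not reproduced here.

```python
# Hedged sketch of the prediction stage: random forest and SVM classifiers
# on hypothetical financial-statement and market features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 8))        # placeholder features (e.g. ratios, turnover)
y = rng.integers(0, 2, size=300)     # 1 = switched market / delisted, 0 = stayed

rf = RandomForestClassifier(n_estimators=500, random_state=0)
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

for name, model in [("random forest", rf), ("SVM", svm)]:
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean cross-validated AUC = {scores.mean():.2f}")
```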

Relevance:

20.00%

Publisher:

Abstract:

The main objective of this master's thesis is to examine whether Weibull analysis is a suitable method for warranty forecasting in the Case Company. The Case Company has used Reliasoft's Weibull++ software, which is based on the Weibull method, but has noticed that the analysis has not given correct results. The study was conducted by making Weibull simulations in different profit centers of the Case Company and then comparing actual and forecasted costs. Simulations were made using different time frames and two methods for determining future deliveries. The first sub-objective is to examine which simulation parameters give the best result for each profit center. The second sub-objective is to create a simple control model for following forecasted costs and actual realized costs. The third sub-objective is to document all Qlikview parameters of the profit centers. This is a constructive study, and solutions to the company's problems are worked out in this thesis. The theory part introduces quality issues, for example what quality is, quality costing, and the cost of poor quality. Quality is one of the major aspects in the Case Company, so understanding the link between quality and warranty forecasting is important. Warranty management and other tools for warranty forecasting were also introduced, as were the Weibull method, its mathematical properties, and reliability engineering. The main result of this thesis is that the Weibull analysis forecasted costs that were too high when calculating the provision; although the forecasted values of some profit centers were lower than the actual values, the method works better for planning purposes. One reason is that quality improvement, or alternatively quality deterioration, does not show up in the results of the analysis in the short run. The other reason for the excessively high values is that the products of the Case Company are complex and the analyses were made at the profit center level. The Weibull method was developed for standard products, but the products of the Case Company consist of many complex components. According to the theory, the method was developed for homogeneous data, so the most important finding is that the analysis should be made at the product level, not the profit center level, where the data is more homogeneous.
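For orientation, the sketch below shows the basic Weibull calculation such a forecast rests on, with placeholder failure data and hypothetical cost figures rather than Weibull++ output or the Case Company's numbers: fit the shape and scale parameters to failure ages and estimate the expected fraction of units failing within the warranty period.

```python
# Minimal Weibull warranty sketch on placeholder data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
failure_ages_months = rng.weibull(1.5, size=200) * 24.0   # placeholder failure ages

# Fit shape (beta) and scale (eta), with the location fixed at zero.
beta, loc, eta = stats.weibull_min.fit(failure_ages_months, floc=0)

warranty_months = 12.0
p_fail = stats.weibull_min.cdf(warranty_months, beta, loc=0, scale=eta)

units_in_field = 1000      # hypothetical delivered units
cost_per_claim = 500.0     # hypothetical average claim cost
print(f"beta={beta:.2f}, eta={eta:.1f} months")
print(f"expected warranty provision ~ {units_in_field * p_fail * cost_per_claim:,.0f}")
```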

Relevance:

20.00%

Publisher:

Abstract:

Identification of low-dimensional structures and the main sources of variation in multivariate data are fundamental tasks in data analysis. Many methods aimed at these tasks involve the solution of an optimization problem. Thus, the objective of this thesis is to develop computationally efficient and theoretically justified methods for solving such problems. Most of the thesis is based on a statistical model in which ridges of the density estimated from the data are considered the relevant features. Finding ridges, which are generalized maxima, necessitates the development of advanced optimization methods. An efficient and convergent trust region Newton method for projecting a point onto a ridge of the underlying density is developed for this purpose. The method is utilized in a differential equation-based approach for tracing ridges and computing projection coordinates along them. The density estimation is done nonparametrically using Gaussian kernels, which allows the application of ridge-based methods with only mild assumptions on the underlying structure of the data. The statistical model and the ridge finding methods are adapted to two different applications. The first is the extraction of curvilinear structures from noisy data mixed with background clutter. The second is a novel nonlinear generalization of principal component analysis (PCA) and its extension to time series data. The methods have a wide range of potential applications where most of the earlier approaches are inadequate; examples include the identification of faults from seismic data and the identification of filaments from cosmological data. The applicability of the nonlinear PCA to climate analysis and to the reconstruction of periodic patterns from noisy time series data is also demonstrated. Other contributions of the thesis include the development of an efficient semidefinite optimization method for embedding graphs into Euclidean space. The method produces structure-preserving embeddings that maximize interpoint distances. It is primarily developed for dimensionality reduction, but also has potential applications in graph theory and various areas of physics, chemistry and engineering. The asymptotic behaviour of ridges and maxima of Gaussian kernel densities is also investigated as the kernel bandwidth approaches infinity. The results are applied to the nonlinear PCA and to finding significant maxima of such densities, which is a typical problem in visual object tracking.
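As a hedged illustration of the ridge idea, the sketch below projects a point onto a ridge of a Gaussian kernel density; it uses the simpler subspace-constrained mean-shift iteration rather than the thesis's trust-region Newton method, but it targets the same ridge definition (the step is constrained to the span of the Hessian's smallest-eigenvalue eigenvectors).

```python
# Ridge projection for a Gaussian kernel density via subspace-constrained
# mean shift (a simpler stand-in for the thesis's trust-region Newton method).
import numpy as np

def project_to_ridge(x, data, h=0.5, iters=200, tol=1e-6):
    x = np.asarray(x, dtype=float)
    d = len(x)
    for _ in range(iters):
        diff = data - x                                    # (n, d)
        w = np.exp(-0.5 * np.sum(diff**2, axis=1) / h**2)  # Gaussian kernel weights
        hess = (w[:, None, None] * (diff[:, :, None] * diff[:, None, :] / h**2
                - np.eye(d))).sum(0) / h**2                # density Hessian (up to a constant)
        vals, vecs = np.linalg.eigh(hess)
        V = vecs[:, :d - 1]                                # directions orthogonal to a 1-D ridge
        mean_shift = (w[:, None] * data).sum(0) / w.sum() - x
        step = V @ (V.T @ mean_shift)                      # constrain the step to that subspace
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x

# Noisy points around a circle: the density ridge is roughly the circle itself.
rng = np.random.default_rng(3)
theta = rng.uniform(0, 2 * np.pi, 500)
pts = np.c_[np.cos(theta), np.sin(theta)] + 0.1 * rng.normal(size=(500, 2))
print(project_to_ridge([1.3, 0.0], pts))   # should land near radius ~1
```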

Relevance:

20.00%

Publisher:

Abstract:

Early identification of patients who need hospitalization, or of patients who could be discharged, would be helpful for the management of acute asthma in the emergency room. The objective of the present study was to examine the clinical and pulmonary function measures obtained during the first hour of assessment of acute asthma in the emergency room in order to predict the outcome. We evaluated 88 patients. The inclusion criteria were age between 12 and 55 years, forced expiratory volume in the first second below 50% of the predicted value, and no history of chronic disease or pregnancy. After baseline evaluation, all patients were treated with 2.5 mg albuterol delivered by nebulization every 20 min during the first hour and 60 mg of intravenous methylprednisolone. Patients were reevaluated after 60 min of treatment. Sixty-five patients (73.9%) were successfully treated and discharged from the emergency room (good responders), and 23 (26.1%) were hospitalized or were treated and discharged but relapsed within 10 days (poor responders). A predictive index based on two items was developed: peak expiratory flow rate after 1 h <=0% of the predicted value, and accessory muscle use after 1 h. The index ranged from 0 to 2. An index of 1 or higher presented a sensitivity of 74.0%, a specificity of 69.0%, a positive predictive value of 46.0%, and a negative predictive value of 88.0%. It was possible to predict the outcome during the first hour of management of acute asthma in the emergency room when the index score was 0 or 2.
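A small worked check of the reported test characteristics from a 2x2 table; the counts below are hypothetical (the abstract gives only the percentages) but are chosen to be consistent with 23 poor and 65 good responders.

```python
# Sensitivity, specificity and predictive values from a 2x2 table.
def test_characteristics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # of truly poor responders, fraction flagged by the index
    specificity = tn / (tn + fp)   # of good responders, fraction not flagged
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts consistent with 23 poor and 65 good responders;
# they reproduce roughly the reported 74.0 / 69.0 / 46.0 / 88.0 values.
print(test_characteristics(tp=17, fp=20, fn=6, tn=45))
```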

Relevance:

20.00%

Publisher:

Abstract:

We describe the impact of subtype differences on the seroreactivity of linear antigenic epitopes in the envelope glycoprotein of HIV-1 isolates from different geographical locations. By computer analysis, we predicted potential antigenic sites in the envelope glycoproteins (gp120 and gp41) of this virus. For this purpose, after retrieving the sequences of the proteins of interest from data banks, values of hydrophilicity, flexibility, accessibility, inverted hydrophobicity, and secondary structure were considered. We identified several potential antigenic epitopes in the envelope glycoprotein of a subtype B strain of HIV-1 (IIIB). Peptides were synthesized by solid-phase peptide synthesis, using the Merrifield method and Fmoc chemistry. These synthetic peptides corresponded mainly to the C2, V3 and CD4 binding sites of gp120 and to parts of the ectodomain of gp41. The reactivity of these peptides was tested by ELISA against HIV-1-positive sera from different locations in India. For two of the predicted epitopes, the corresponding Indian consensus sequences (LAIERYLKQQLLGWG and DIIGDIRQAHCNISEDKWNET) (subtype C) were also synthesized and their reactivity was tested by ELISA. These peptides also distinguished the HIV-1-positive sera of Indian subjects with subtype C infections from the sera of HIV-negative subjects.
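A hedged sketch of the sliding-window propensity scoring that underlies such epitope predictions; the scale values below are illustrative placeholders, not a published hydrophilicity scale, and a real analysis would use all 20 residue values of, for example, the Hopp-Woods or Parker scale.

```python
# Sliding-window antigenicity sketch: average a per-residue propensity over
# a window and flag high-scoring windows as candidate antigenic sites.
def window_scores(sequence, scale, window=7):
    half = window // 2
    scores = []
    for i in range(half, len(sequence) - half):
        segment = sequence[i - half:i + half + 1]
        scores.append(sum(scale.get(aa, 0.0) for aa in segment) / window)
    return scores

# Placeholder values only: charged residues high, hydrophobic residues low.
toy_scale = {"D": 3.0, "E": 3.0, "K": 3.0, "R": 3.0,
             "S": 0.3, "N": 0.2, "Q": 0.2,
             "L": -1.8, "I": -1.8, "V": -1.5, "F": -2.5, "W": -3.4}

peptide = "DIIGDIRQAHCNISEDKWNET"   # one of the subtype C consensus peptides in the abstract
scores = window_scores(peptide, toy_scale)
peak = max(range(len(scores)), key=scores.__getitem__)
print("highest-scoring window centred at residue", peak + 3 + 1)  # 1-based position
```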

Relevance:

20.00%

Publisher:

Abstract:

The present study compares the performance of stochastic and fuzzy models in the analysis of the relationship between clinical signs and diagnosis. Data obtained for 153 children concerning diagnosis (pneumonia, other non-pneumonia diseases, absence of disease) and seven clinical signs were divided into two samples, one for analysis and the other for validation. The former was used to derive relations by multi-discriminant analysis (MDA) and by fuzzy max-min composition (fuzzy), and the latter was used to assess the predictions drawn from each type of relation. MDA and fuzzy were closely similar in terms of prediction, correctly allocating 75.7 to 78.3% of the patients in the validation sample and displaying only a single instance of disagreement: a patient with a low level of toxemia was mistakenly classified as not diseased by MDA and correctly classified as somehow ill by fuzzy. Concerning the relations, each method provided different information, revealing different aspects of the relationship between clinical signs and diagnoses. Both methods agreed in pointing to X-ray, dyspnea, and auscultation as the signs best related to pneumonia, but only fuzzy was able to detect relations of heart rate, body temperature, toxemia and respiratory rate with pneumonia. Moreover, only fuzzy was able to detect a relationship between heart rate and absence of disease, which allowed the detection of six malnourished children whose diagnoses as healthy are, indeed, disputable. The conclusion is that even though fuzzy set theory might not improve prediction, it certainly does enhance clinical knowledge, since it detects relationships not visible to stochastic models.
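For readers unfamiliar with the fuzzy technique named here, the sketch below shows a max-min composition on made-up membership values (not the study's relational matrix): the composed diagnosis membership is the maximum, over signs, of the minimum of sign membership and relation strength.

```python
# Minimal fuzzy max-min composition sketch with hypothetical values.
import numpy as np

signs = ["X-ray", "dyspnea", "auscultation", "toxemia"]
diagnoses = ["pneumonia", "other disease", "no disease"]

# R[i, j]: strength of the relation between sign i and diagnosis j (hypothetical).
R = np.array([[0.9, 0.3, 0.1],
              [0.7, 0.5, 0.1],
              [0.8, 0.4, 0.1],
              [0.4, 0.6, 0.2]])

patient = np.array([0.8, 0.6, 0.9, 0.2])   # hypothetical sign memberships for one patient

composition = np.max(np.minimum(patient[:, None], R), axis=0)   # max-min composition
for d, mu in zip(diagnoses, composition):
    print(f"{d}: membership {mu:.2f}")
```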

Relevance:

20.00%

Publisher:

Abstract:

In view of the importance of anticipating critical situations in medicine, we propose the use of a fuzzy expert system to predict the need for advanced neonatal resuscitation efforts in the delivery room. The system relates maternal medical, obstetric and neonatal characteristics to the clinical condition of the newborn, providing a risk measure of the need for advanced neonatal resuscitation. It is structured as a fuzzy composition developed on the basis of the subjective perception of danger of nine neonatologists faced with 61 antenatal and intrapartum clinical situations, each assigned a degree of association with the risk of perinatal asphyxia. The resulting relational matrix describes the association between clinical factors and the risk of perinatal asphyxia. Taking the presence or absence of each of the 61 clinical factors as input, the system returns the risk of perinatal asphyxia as output. A prospectively collected series of 304 cases of perinatal care was analyzed to ascertain system performance. The fuzzy expert system presented a sensitivity of 76.5% and a specificity of 94.8% in identifying the need for advanced neonatal resuscitation measures, considering a cut-off value of 5 on a scale ranging from 0 to 10. The area under the receiver operating characteristic curve was 0.93. The identification of risk situations plays an important role in the planning of health care, and these preliminary results encourage us to carry out further studies and to refine this model, which is intended to form the basis of an auxiliary system able to help health care staff make decisions in perinatal care.
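A hedged sketch, on toy data rather than the 304-case series, of how such a 0-10 risk score is evaluated against outcomes: sensitivity and specificity at the cut-off of 5, plus the area under the ROC curve.

```python
# Evaluating a 0-10 risk score against binary outcomes (toy data only).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(11)
needed_resusc = rng.integers(0, 2, size=304)                             # placeholder outcomes
risk_score = np.clip(needed_resusc * 4 + rng.normal(3, 2, 304), 0, 10)   # placeholder 0-10 scores

flagged = risk_score >= 5                       # the cut-off used in the abstract
sens = np.mean(flagged[needed_resusc == 1])
spec = np.mean(~flagged[needed_resusc == 0])
auc = roc_auc_score(needed_resusc, risk_score)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, AUC={auc:.2f}")
```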

Relevance:

20.00%

Publisher:

Abstract:

We compared the cost-benefit of two algorithms recently proposed by the Centers for Disease Control and Prevention, USA, with that of the conventional algorithm, in order to determine the most appropriate one for the diagnosis of hepatitis C virus (HCV) infection in the Brazilian population. Serum samples were obtained from 517 ELISA-positive or -inconclusive blood donors who had returned to Fundação Pró-Sangue/Hemocentro de São Paulo to confirm previous results. Algorithm A was based on the signal-to-cut-off (s/co) ratio of anti-HCV ELISA samples, using an s/co ratio that shows ≥95% concordance with immunoblot (IB) positivity. For algorithm B, reflex nucleic acid amplification testing by PCR was required for ELISA-positive or -inconclusive samples, and IB for PCR-negative samples. For algorithm C, all ELISA-positive or -inconclusive samples were submitted to IB. We observed a similar number of positive results with the three algorithms: 287, 287, and 285 for A, B, and C, respectively, of which 283 were concordant among the three. Indeterminate results from algorithms A and C were elucidated by PCR (expanded algorithm), which detected two more positive samples. The estimated cost of algorithms A and B was US$21,299.39 and US$32,397.40, respectively, which is 43.5% and 14.0% more economical than C (US$37,673.79). The cost can vary according to the technique used. We conclude that both algorithms A and B are suitable for diagnosing HCV infection in the Brazilian population. Furthermore, algorithm A is the more practical and economical one, since it requires supplemental tests for only 54% of the samples. Algorithm B provides early information about the presence of viremia.
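A small worked check of the reported savings, using only the costs quoted in the abstract:

```python
# Relative savings of algorithms A and B versus the conventional algorithm C.
cost_A, cost_B, cost_C = 21_299.39, 32_397.40, 37_673.79   # US$, from the abstract
saving_A = (cost_C - cost_A) / cost_C * 100
saving_B = (cost_C - cost_B) / cost_C * 100
print(f"algorithm A is {saving_A:.1f}% cheaper than C")   # ~43.5%
print(f"algorithm B is {saving_B:.1f}% cheaper than C")   # ~14.0%
```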