950 results for cut vertex false positive


Relevance:

100.00%

Publisher:

Abstract:

Yao, Begg, and Livingston (1996, Biometrics 52, 992-1001) considered the optimal group size for testing a series of potentially therapeutic agents to identify a promising one as soon as possible for given error rates. The number of patients to be tested with each agent was fixed as the group size. We consider a sequential design that allows early acceptance and rejection, and we provide an optimal strategy to minimize the sample sizes (patients) required using Markov decision processes. The minimization is under the constraints of the two types (false positive and false negative) of error probabilities, with the Lagrangian multipliers corresponding to the cost parameters for the two types of errors. Numerical studies indicate that there can be a substantial reduction in the number of patients required.
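The backward-induction idea behind such a Markov decision process can be sketched with a toy example. All response rates, the prior, the Lagrange multipliers and the patient cap below are assumed for illustration and are not taken from the paper; the state is (patients tested, responses observed), and the actions are accept, reject, or test one more patient.

```python
from functools import lru_cache

# Assumed toy parameters (not the paper's): two simple hypotheses about the
# agent's response rate, a flat prior, and Lagrange multipliers that price
# the false positive (accept ineffective) and false negative (reject
# effective) errors in units of patients.
P0, P1 = 0.2, 0.4        # response rate if ineffective / effective
PRIOR_EFF = 0.5          # prior probability that the agent is effective
LAM_FP, LAM_FN = 60, 60  # Lagrange multipliers for the two error types
N_MAX = 25               # hard cap on patients tested per agent

def posterior_eff(n, s):
    """Posterior probability of effectiveness after s responses in n patients."""
    like_eff = PRIOR_EFF * P1**s * (1 - P1) ** (n - s)
    like_ineff = (1 - PRIOR_EFF) * P0**s * (1 - P0) ** (n - s)
    return like_eff / (like_eff + like_ineff)

@lru_cache(maxsize=None)
def value(n, s):
    """Minimal expected cost (patients used + Lagrangian error penalties)."""
    post = posterior_eff(n, s)
    accept = LAM_FP * (1 - post)   # expected penalty for accepting now
    reject = LAM_FN * post         # expected penalty for rejecting now
    if n == N_MAX:
        return min(accept, reject)
    p_resp = post * P1 + (1 - post) * P0   # predictive response probability
    cont = 1 + p_resp * value(n + 1, s + 1) + (1 - p_resp) * value(n + 1, s)
    return min(accept, reject, cont)

print(value(0, 0))  # optimal expected Lagrangian cost before testing starts
```

Comparing `value(0, 0)` against the cost of deciding immediately shows how much the option of early acceptance and rejection is worth under these assumed parameters.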


Stallard (1998, Biometrics 54, 279-294) recently used Bayesian decision theory for sample-size determination in phase II trials. His design maximizes the expected financial gains in the development of a new treatment. However, it results in a very high probability (0.65) of recommending an ineffective treatment for phase III testing. On the other hand, the expected gain using his design is more than 10 times that of a design that tightly controls the false positive error (Thall and Simon, 1994, Biometrics 50, 337-349). Stallard's design maximizes the expected gain per phase II trial, but it does not maximize the rate of gain or total gain for a fixed length of time, because the rate of gain depends on the proportion of treatments forwarded to the phase III study. We suggest maximizing the rate of gain, and the resulting optimal one-stage design is twice as efficient as Stallard's one-stage design. Furthermore, the new design has a probability of only 0.12 of passing an ineffective treatment to the phase III study.
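The distinction between gain per trial and rate of gain can be illustrated with a hypothetical renewal-reward calculation; the numbers below are invented for illustration and are not the paper's figures.

```python
# Hypothetical illustration: a design that maximizes expected gain per trial
# need not maximize the rate of gain, because the rate divides by the
# expected trial duration (here measured in patients enrolled).

def rate_of_gain(expected_gain, expected_duration):
    """Long-run gain per unit time, by the renewal-reward argument."""
    return expected_gain / expected_duration

# Design A: larger gain per trial, but long trials (many patients).
# Design B: smaller gain per trial, but much shorter trials.
gain_a, duration_a = 100.0, 50.0
gain_b, duration_b = 60.0, 15.0

print(rate_of_gain(gain_a, duration_a))  # 2.0 gain per patient enrolled
print(rate_of_gain(gain_b, duration_b))  # 4.0: B wins per unit time
```

Over a fixed horizon, design B completes more trials, so its lower per-trial gain is outweighed by its higher throughput.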


This paper presents a Multi-Hypotheses Tracking (MHT) approach that resolves ambiguities arising in previous methods of associating targets and tracks within a highly volatile vehicular environment. The previous approach based on the Dempster–Shafer Theory assumes that associations between tracks and targets are unique; this was shown to allow the formation of ghost tracks when there was too much ambiguity or conflict for the system to make a meaningful decision. The MHT algorithm described in this paper removes this uniqueness condition, allowing the system to represent ambiguity and even to refrain from making any decision when the available data are poor. We provide a general introduction to the Dempster–Shafer Theory and present the previously used approach. Then, we explain our MHT mechanism and provide evidence of its increased performance in reducing the number of ghost tracks and false positives processed by the tracking system.
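Dempster's rule of combination, the core operation of the Dempster–Shafer Theory mentioned above, can be sketched as follows; the track/target frame and mass assignments are hypothetical, chosen only to show how conflict mass is renormalized.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset focal elements to
    masses) with Dempster's rule: multiply masses, intersect focal elements,
    and renormalize away the conflict mass (empty intersections)."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical association example: two sensors assign mass to candidate
# track-target pairings T1, T2, with the full frame representing ignorance.
frame = frozenset({"T1", "T2"})
m1 = {frozenset({"T1"}): 0.6, frame: 0.4}
m2 = {frozenset({"T1"}): 0.5, frozenset({"T2"}): 0.3, frame: 0.2}
print(dempster_combine(m1, m2))
```

High conflict is exactly the regime the paper targets: the larger the conflict term, the more the renormalization amplifies whatever mass survives, which is one way ghost tracks can emerge under a forced unique decision.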


Species distribution modelling (SDM) typically analyses species’ presence together with some form of absence information. Ideally absences comprise observations or are inferred from comprehensive sampling. When such information is not available, then pseudo-absences are often generated from the background locations within the study region of interest containing the presences, or else absence is implied through the comparison of presences to the whole study region, e.g. as is the case in Maximum Entropy (MaxEnt) or Poisson point process modelling. However, the choice of which absence information to include can be both challenging and highly influential on SDM predictions (e.g. Oksanen and Minchin, 2002). In practice, the use of pseudo- or implied absences often leads to an imbalance where absences far outnumber presences. This leaves analysis highly susceptible to ‘naughty-noughts’: absences that occur beyond the envelope of the species, which can exert strong influence on the model and its predictions (Austin and Meyers, 1996). Also known as ‘excess zeros’, naughty noughts can be estimated via an overall proportion in simple hurdle or mixture models (Martin et al., 2005). However, absences, especially those that occur beyond the species envelope, can often be more diverse than presences. Here we consider an extension to excess zero models. The two-staged approach first exploits the compartmentalisation provided by classification trees (CTs) (as in O’Leary, 2008) to identify multiple sources of naughty noughts and simultaneously delineate several species envelopes. Then SDMs can be fit separately within each envelope, and for this stage, we examine both CTs (as in Falk et al., 2014) and the popular MaxEnt (Elith et al., 2006). We introduce a wider range of model performance measures to improve treatment of naughty noughts in SDM. 
We retain an overall measure of model performance, the area under the curve (AUC) of the Receiver Operating Characteristic (ROC) curve, but focus on its constituent measures of false negative rate (FNR) and false positive rate (FPR), and how these relate to the threshold in the predicted probability of presence that delimits predicted presence from absence. We also propose error rates more relevant to users of predictions: the false omission rate (FOR), the chance that a predicted absence corresponds to (and hence wastes) an observed presence, and the false discovery rate (FDR), reflecting those predicted (or potential) presences that correspond to absence. A high FDR may be desirable since it could help target future search efforts, whereas zero or low FOR is desirable since it indicates none of the (often valuable) presences have been ignored in the SDM. For illustration, we chose Bradypus variegatus, a species that has previously been published as an exemplar species for MaxEnt, proposed by Phillips et al. (2006). We used CTs to increasingly refine the species envelope, starting with the whole study region (E0) and eliminating more and more potential naughty noughts (E1–E3). When combined with an SDM fit within the species envelope, the best CT SDM had similar AUC and FPR to the best MaxEnt SDM, but otherwise performed better. The FNR and FOR were greatly reduced, suggesting that CTs handle absences better. Interestingly, MaxEnt predictions showed low discriminatory performance, with the most common predicted probability of presence being in the same range (0.00-0.20) for both true absences and presences. In summary, this example shows that SDMs can be improved by introducing an initial hurdle to identify naughty noughts and partition the envelope before applying SDMs. This improvement was barely detectable via AUC and FPR yet visible in FOR, FNR, and the comparison of the predicted-probability-of-presence distributions for presence and absence.
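The four error rates discussed above follow directly from a confusion matrix at a fixed presence threshold; a minimal sketch with invented counts that mimic the presence/absence imbalance typical of SDM data:

```python
def sdm_error_rates(tp, fp, fn, tn):
    """Error rates for presence/absence predictions at a fixed threshold.
    FNR and FPR condition on the observed class; FOR and FDR condition on
    the predicted class, which is what a user of the map experiences."""
    return {
        "FNR": fn / (fn + tp),   # observed presences predicted absent
        "FPR": fp / (fp + tn),   # observed absences predicted present
        "FOR": fn / (fn + tn),   # predicted absences that were presences
        "FDR": fp / (fp + tp),   # predicted presences that were absences
    }

# Hypothetical imbalanced data: 80 presences against 920 (pseudo-)absences.
rates = sdm_error_rates(tp=60, fp=150, fn=20, tn=770)
print(rates)
```

Note how the imbalance drives FDR far above FPR here: many predicted presences are absences even though only a modest fraction of absences are misclassified, which is why the paper argues FOR and FDR deserve separate attention.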


Objective: To describe patient participation and clinical performance in a colorectal cancer (CRC) screening program utilising faecal occult blood test (FOBT). Methods: A community-based intervention was conducted in a small, rural community in north Queensland, 2000/01. One of two FOBT kits – guaiac (Hemoccult-II) or immunochemical (Inform) – was assigned by general practice and mailed to participants (3,358 patients aged 50–74 years listed with the local practices). Results: Overall participation in FOBT screening was 36.3%. Participation was higher with the immunochemical kit than the guaiac kit (OR=1.9, 95% CI 1.6-2.2). Women were more likely to comply with testing than men (OR=1.4, 95% CI 1.2-1.7), and people in their 60s were less likely to participate than those 70–74 years (OR=0.8, 95% CI 0.6-0.9). The positivity rate was higher for the immunochemical (9.5%) than the guaiac (3.9%) test (χ2=9.2, p=0.002), with positive predictive values for cancer or adenoma of advanced pathology of 37.8% (95% CI 28.1–48.6) for Inform and 40.0% (95% CI 16.8–68.7) for Hemoccult-II. Colonoscopy follow-up was 94.8% with a medical complication rate of 2–3%. Conclusions: An immunochemical FOBT enhanced participation. Higher positivity rates for this kit did not translate into higher false-positive rates, and both test types resulted in a high yield of neoplasia. Implications: In addition to the type of FOBT, the ultimate success of a population-based screening program for CRC using FOBT will depend on appropriate education of health professionals and the public as well as significant investment in medical infrastructure for colonoscopy follow-up.
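Odds ratios with 95% confidence intervals of the kind reported above can be computed with a standard Wald calculation on a 2x2 table; the counts below are hypothetical, chosen only so the resulting OR is close to the reported 1.9, and are not the study's raw data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed participants,   b = exposed non-participants,
    c = unexposed participants, d = unexposed non-participants."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: participation with immunochemical vs. guaiac kit.
print(odds_ratio_ci(700, 980, 520, 1380))
```

The interval is computed on the log scale because log(OR) is approximately normal, then exponentiated back, which is why OR confidence intervals are asymmetric around the point estimate.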


Web data can often be represented in free tree form; however, free tree mining methods are scarce. In this paper, a computationally fast algorithm, FreeS, is presented to discover all frequently occurring free subtrees in a database of labelled free trees. FreeS is designed using an optimal canonical form, BOCF, that can uniquely represent free trees even in the presence of isomorphism. To avoid enumeration of false positive candidates, it utilises an enumeration approach based on a tree-structure-guided scheme. This paper presents lemmas that introduce conditions governing the generation of free tree candidates during enumeration. An empirical study using both real and synthetic datasets shows that FreeS is scalable and significantly outperforms (i.e. is a few orders of magnitude faster than) the state-of-the-art frequent free tree mining algorithms, HybridTreeMiner and FreeTreeMiner.
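The details of BOCF are given in the paper, but canonical forms for free trees typically begin by rooting the tree at its center (one or two vertices), which removes the freedom of root choice that makes unrooted isomorphism hard. A sketch of that standard first step, by iterative leaf stripping:

```python
from collections import defaultdict

def tree_centers(edges, n):
    """Return the 1 or 2 center vertices of a free tree on vertices 0..n-1,
    found by repeatedly stripping leaves; rooting at the center(s) is the
    usual first step of a free-tree canonical form."""
    if n == 1:
        return [0]
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    remaining = n
    leaves = [v for v in range(n) if len(adj[v]) == 1]
    while remaining > 2:
        remaining -= len(leaves)
        new_leaves = []
        for leaf in leaves:
            nbr = next(iter(adj[leaf]))
            adj[nbr].discard(leaf)
            adj[leaf].clear()
            if len(adj[nbr]) == 1:
                new_leaves.append(nbr)
        leaves = new_leaves
    return sorted(leaves)

# A path 0-1-2-3-4 has the single center 2.
print(tree_centers([(0, 1), (1, 2), (2, 3), (3, 4)], 5))  # [2]
```

With at most two candidate roots, two rooted canonical encodings can be compared and the smaller kept, giving a representative that is invariant under isomorphism.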


During the last 10-15 years, interest in mouse behavioural analysis has evolved considerably. The driving force has been the development of molecular biological techniques that allow manipulation of the mouse genome by changing the expression of genes. Therefore, with some limitations, it is possible to study how genes participate in the regulation of physiological functions and to create models explaining the genetic contribution to various pathological conditions. The first aim of our study was to establish a framework for behavioural phenotyping of genetically modified mice. We established a comprehensive battery of tests for the initial screening of mutant mice. These included tests for exploratory and locomotor activity, emotional behaviour, sensory functions, and cognitive performance. Our interest was in the behavioural patterns of common background strains used for genetic manipulations in mice. Additionally, we studied the behavioural effects of sex differences, test history, and individual housing. Our findings highlight the importance of careful consideration of genetic background in the analysis of mutant mice. It was evident that some backgrounds may mask or modify the behavioural phenotype of mutants and thereby lead to false positive or false negative findings. Moreover, there is no universal strain that is equally suitable for all tests, and using different backgrounds allows one to address possible phenotype-modifying factors. We discovered that previous experience affected performance in several tasks. The most sensitive traits were exploratory and emotional behaviour, as well as motor and nociceptive functions. Therefore, it may be essential to repeat some of the tests in naïve animals to confirm the phenotype. Long-term social isolation had strong effects on exploratory behaviour, but also on learning and memory.
All experiments revealed significant interactions between strain and environmental factors (test history or housing condition), indicating genotype-dependent effects of environmental manipulations. This information was then applied in the analysis of several mutant lines. For example, we studied mice overexpressing as well as those lacking the extracellular matrix protein heparin-binding growth-associated molecule (HB-GAM), and mice lacking N-syndecan (a receptor for HB-GAM). All mutant mice appeared to be fertile and healthy, without any apparent neurological or sensory defects. The lack of HB-GAM and N-syndecan, however, significantly reduced the learning capacity of the mice. On the other hand, overexpression of HB-GAM resulted in facilitated learning. Moreover, HB-GAM knockout mice displayed higher anxiety-like behaviour, whereas anxiety was reduced in HB-GAM-overexpressing mice. Changes in hippocampal plasticity accompanied the behavioural phenotypes. We conclude that HB-GAM and N-syndecan are involved in the modulation of synaptic plasticity in the hippocampus and play a role in the regulation of anxiety- and learning-related behaviour.


Generating discriminative input features is a key requirement for achieving highly accurate classifiers. The process of generating features from raw data is known as feature engineering, and it can take significant manual effort. In this paper we propose automated feature engineering to derive a suite of additional features from a given set of basic features, with the aim of both improving classifier accuracy through discriminative features and assisting data scientists through automation. Our implementation is specific to HTTP computer network traffic. To measure the effectiveness of our proposal, we compare the performance of a supervised machine learning classifier built with automated feature engineering against one using human-guided features. The classifier addresses a problem in computer network security, namely the detection of HTTP tunnels. We use Bro to process network traffic into base features and then apply automated feature engineering to calculate a larger set of derived features. The derived features are calculated without favour to any base feature and include entropy, length and N-grams for all string features, and counts and averages over time for all numeric features. Feature selection is then used to find the most relevant subset of these features. Testing showed that both classifiers achieved a detection rate above 99.93% at a false positive rate below 0.01%. For our datasets, we conclude that automated feature engineering can increase classifier development speed and reduce technical difficulty by removing manual feature engineering, while maintaining classification accuracy.
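The derived string features described (length, entropy, N-grams) might be sketched as follows. The feature-naming scheme and the example field name and value are illustrative assumptions, not Bro's actual output.

```python
import math
from collections import Counter

def shannon_entropy(s):
    """Shannon entropy in bits per character; high values can flag encoded
    or compressed payloads smuggled through HTTP fields."""
    if not s:
        return 0.0
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def char_ngrams(s, n=2):
    """Frequencies of character n-grams, a cheap proxy for string structure."""
    return Counter(s[i:i + n] for i in range(len(s) - n + 1))

def derive_string_features(name, value):
    """Derive features mechanically from one base string feature, without
    favouring any particular field (field names here are illustrative)."""
    feats = {f"{name}_len": len(value),
             f"{name}_entropy": shannon_entropy(value)}
    for gram, count in char_ngrams(value).most_common(3):
        feats[f"{name}_2gram_{gram}"] = count
    return feats

print(derive_string_features("http_uri", "/index.html?id=AAAA"))
```

A feature-selection pass over the resulting (much larger) feature set would then keep only the derived features that actually discriminate, as the paper describes.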


Placental abruption, one of the most significant causes of perinatal mortality and maternal morbidity, occurs in 0.5-1% of pregnancies. Its etiology is unknown, but defective trophoblastic invasion of the spiral arteries and consequent poor vascularization may play a role. The aim of this study was to define the prepregnancy risk factors of placental abruption, to define the risk factors during the index pregnancy, and to describe the clinical presentation of placental abruption. We also wanted to find a biochemical marker for predicting placental abruption early in pregnancy. Among women delivering at the University Hospital of Helsinki in 1997-2001 (n=46,742), 198 women with placental abruption and 396 control women were identified. The overall incidence of placental abruption was 0.42%. The prepregnancy risk factors were smoking (OR 1.7; 95% CI 1.1, 2.7), uterine malformation (OR 8.1; 1.7, 40), previous cesarean section (OR 1.7; 1.1, 2.8), and history of placental abruption (OR 4.5; 1.1, 18). The risk factors during the index pregnancy were maternal (adjusted OR 1.8; 95% CI 1.1, 2.9) and paternal smoking (2.2; 1.3, 3.6), use of alcohol (2.2; 1.1, 4.4), placenta previa (5.7; 1.4, 23.1), preeclampsia (2.7; 1.3, 5.6) and chorioamnionitis (3.3; 1.0, 10.0). Vaginal bleeding (70%), abdominal pain (51%), bloody amniotic fluid (50%) and fetal heart rate abnormalities (69%) were the most common clinical manifestations of placental abruption. Retroplacental blood clot was seen by ultrasound in 15% of the cases. Neither bleeding nor pain was present in 19% of the cases. Overall, 59% went into preterm labor (OR 12.9; 95% CI 8.3, 19.8), and 91% were delivered by cesarean section (34.7; 20.0, 60.1). Of the newborns, 25% were growth restricted. The perinatal mortality rate was 9.2% (OR 10.1; 95% CI 3.4, 30.1). We then tested selected biochemical markers for prediction of placental abruption. 
The median of the maternal serum alpha-fetoprotein (MSAFP) multiples of the median (MoM) was significantly higher in the abruption group (n=57; 1.21) than in the control group (n=108; 1.07) (p=0.004) at 15-16 gestational weeks. In multivariate analysis, elevated MSAFP remained an independent risk factor for placental abruption, adjusting for parity ≥ 3, smoking, previous placental abruption, preeclampsia, bleeding in the second or third trimester, and placenta previa. MSAFP ≥ 1.5 MoM had a sensitivity of 29% and a false positive rate of 10%. The levels of maternal serum free beta human chorionic gonadotrophin MoM did not differ between the cases and the controls. None of the angiogenic factors (soluble endoglin, soluble fms-like tyrosine kinase 1, or placental growth factor) showed any difference between the cases (n=42) and the controls (n=50) in the second trimester. The levels of C-reactive protein (CRP) showed no difference between the cases (n=181) and the controls (n=261) (median 2.35 mg/l [interquartile range {IQR} 1.09-5.93] versus 2.28 mg/l [IQR 0.92-5.01], not significant) when tested in the first trimester (mean 10.4 gestational weeks). Chlamydia pneumoniae-specific immunoglobulin G (IgG) and immunoglobulin A (IgA) as well as C. trachomatis-specific IgG, IgA and chlamydial heat-shock protein 60 antibody rates were similar between the groups. In conclusion, although univariate analysis identified many prepregnancy risk factors for placental abruption, only smoking, uterine malformation, previous cesarean section and history of placental abruption remained significant in multivariate analysis. During the index pregnancy, maternal alcohol consumption, maternal smoking and smoking by the partner turned out to be the major independent risk factors for placental abruption. Smoking by both partners multiplied the risk. The liberal use of ultrasound examination contributed little to the management of women with placental abruption.
Although second-trimester MSAFP levels were higher in women with subsequent placental abruption, clinical usefulness of this test is limited due to low sensitivity and high false positive rate. Similarly, angiogenic factors in early second trimester, or CRP levels, or chlamydial antibodies in the first trimester failed to predict placental abruption.


The greatest effect on reducing mortality in breast cancer comes from the detection and treatment of invasive cancer when it is as small as possible. Although mammography screening is known to be effective, observer errors are frequent, and false-negative cancers can be found in retrospective studies of prior mammograms. In the year 2001, 67 women with 69 surgically proven cancers detected at screening in the Mammography Centre of Helsinki University Hospital also had previous mammograms. These mammograms were analyzed by an experienced screening radiologist, who found that 36 lesions were already visible in previous screening rounds. CAD (Second Look v. 4.01) detected 23 of these missed lesions. Eight readers with different kinds of experience in mammography screening read the films of 200 women with and without CAD. These films included 35 of the missed lesions and 16 screen-detected cancers. CAD sensitivity was 70.6% and specificity 15.8%. Use of CAD lengthened the mean reading time but did not significantly affect readers' sensitivities or specificities. Therefore, the usefulness of this version of CAD (Second Look v. 4.01) is questionable. Because none of the eight readers found exactly the same cancers, two reading methods were compared: summarized independent reading (at least a single cancer-positive opinion within the group considered decisive) and conference consensus reading (the cancer-positive opinion of the reader majority considered decisive). The greatest sensitivity, 74.5%, was achieved when the independent readings of the 4 best-performing readers were summarized. Overall, the summarized independent readings were more sensitive than conference consensus readings (64.7% vs. 43.1%), while there was far less difference in mean specificities (92.4% vs. 97.7%). After detecting a suspicious lesion, the radiologist has to decide on the most accurate, fast, and cost-effective means of further work-up.
The feasibility of FNAC and CNB in the diagnosis of breast lesions was compared in a non-randomised, retrospective study of 580 (503 malignant) breast lesions in 572 patients. The absolute sensitivity of CNB was better than that of FNAC, 96% (206/214) vs. 67% (194/289) (p < 0.0001). An additional needle biopsy or surgical biopsy was performed for 93 and 62 patients with FNAC, respectively, but for only 2 and 33 patients with CNB. The frequent need for supplementary biopsies and unnecessary axillary operations due to false-positive findings made FNAC (294) more expensive than CNB (223), and because the advantage of quick analysis vanishes during the overall diagnostic and referral process, CNB is recommended as the initial biopsy method.


The purpose of this study was to evaluate the use of sentinel node biopsy (SNB) in axillary nodal staging in breast cancer. Special interest was placed on sentinel node (SN) visualization, intraoperative detection of SN metastases, the feasibility of SNB in patients with pure tubular carcinoma (PTC) and in those with ductal carcinoma in situ (DCIS) in core needle biopsy (CNB), and the detection of axillary recurrences after tumour-negative SNB. Patients and methods: 1580 clinically node-negative stage T1-T2 breast cancer patients underwent lymphoscintigraphy (LS), SNB and breast surgery between June 2000 and 2004 at the Breast Surgery Unit. The CNB samples were obtained from women who participated in the biennial, population-based mammography screening at the Mammography Screening Centre of Helsinki in 2001-2004. In the follow-up, a cohort of 205 patients who avoided AC due to negative SNB findings were evaluated using ultrasonography one and three years after breast surgery. Results: The visualization rate of axillary SNs was not enhanced by adjusting radioisotope doses according to BMI. The sensitivity of the intraoperative diagnosis of SN metastases of invasive lobular carcinoma (ILC) was higher with rapid intraoperative immunohistochemistry (IHC), 87%, compared to 66% without it. The prevalence of tumour-positive SN findings was 27% in the 33 patients with breast tumours diagnosed as PTC. The median histological tumour size was similar in patients with or without axillary metastases. After the histopathological review, six of the 27 patients with true PTC had axillary metastases, with no significant change in the risk factors for axillary metastases. Of the 67 patients with DCIS in the preoperative percutaneous biopsy specimen, 30% had invasion in the surgical specimen. The strongest predictive factor for invasion was the visibility of the lesion in ultrasound.
In the three-year follow-up, axillary recurrence was found in only two (0.5%) of the total of 383 ultrasound examinations performed during the study, and only one of the 369 examinations revealed cancer. None of the ultrasound examinations was false positive, and no study participant was subjected to unnecessary surgery due to ultrasound monitoring. Conclusions: Adjusting the dose of the radioactive tracer according to patient BMI does not increase the visualization rate of SNs. The intraoperative diagnosis of SN metastases is enhanced by rapid IHC, particularly in patients with ILC. SNB seems to be a feasible method for axillary staging of pure tubular carcinoma in patients with a low prevalence of axillary metastases. SNB also appears to be a sensible method in patients undergoing mastectomy due to DCIS in CNB. It also seems useful in patients with lesions visible in breast US. During follow-up, routine monitoring of the ipsilateral axilla using US is not worthwhile among breast cancer patients who avoided AC due to negative SN findings.


Acute knee injury is a common event throughout life, and it is usually the result of a traffic accident, simple fall, or twisting injury. Over 90% of patients with acute knee injury undergo radiography. An overlooked fracture or delayed diagnosis can lead to poor patient outcome. The major aim of this thesis was to retrospectively study imaging of knee injury, with a special focus on tibial plateau fractures, in patients referred to a level-one trauma center. Multi-detector computed tomography (MDCT) findings of acute knee trauma were studied and compared to radiography, as well as whether non-contrast MDCT can detect cruciate ligaments with reasonable accuracy. The prevalence, type, and location of meniscal injuries in magnetic resonance imaging (MRI) were evaluated, particularly in order to assess the prevalence of unstable meniscal tears in acute knee trauma with tibial plateau fractures. The possibility of analyzing with conventional MRI the signal appearance of menisci repaired with bioabsorbable arrows was also studied. The postoperative use of MDCT was studied in surgically treated tibial plateau fractures: to establish the frequency and indications of MDCT and to assess the common findings and their clinical impact in a level-one trauma hospital. This thesis focused on MDCT and MRI of knee injuries, and radiographs were analyzed when applicable. Radiography constitutes the basis for imaging acute knee injury, but MDCT can yield information beyond the capabilities of radiography. Especially in severely injured patients, sufficient radiographs are often difficult to obtain, and in those patients radiography is unreliable for ruling out fractures. MDCT detected intact cruciate ligaments with good specificity, accuracy, and negative predictive value, but the assessment of torn ligaments was unreliable. A total of 36% (14/39) of patients with tibial plateau fracture had an unstable meniscal tear in MRI.
When a meniscal tear is properly detected preoperatively, treatment can be combined with primary fracture fixation, thus avoiding another operation. The number of meniscal contusions was high. Awareness of the imaging features of this meniscal abnormality can help radiologists increase specificity by avoiding false-positive diagnoses of meniscal tears. Menisci repaired with bioabsorbable arrows showed no difference in MRI signal intensities between patients with an operated ACL and those with an intact ACL. The highest incidence of menisci with an increased signal intensity extending to the meniscal surface was in patients whose surgery had taken place within the previous 18 months. The results may indicate that a rather long time is necessary for menisci to heal completely after arrow repair. Whether menisci with an increased signal intensity extending to the meniscal surface represent improper healing or re-tear, or whether this is simply an early feature of the natural healing process, remains unclear, and further prospective studies are needed to clarify this. Postoperative use of MDCT in tibial plateau fractures was rather infrequent even in this large trauma center, but when performed, it revealed clinically significant information, thus benefitting patients with regard to treatment.


The motivation behind the fusion of Intrusion Detection Systems was the realization that, with increasing traffic and the increasing complexity of attacks, none of the present-day stand-alone Intrusion Detection Systems can meet the demand for a very high detection rate and an extremely low false positive rate. Multi-sensor fusion can be used to meet these requirements through a refinement of the combined response of different Intrusion Detection Systems. In this paper, we show the design technique of sensor fusion to best utilize the useful responses from multiple sensors by an appropriate adjustment of the fusion threshold. The threshold is generally chosen according to past experience or by an expert system. In this paper, we show that choosing the threshold bounds according to the Chebyshev inequality performs better. This approach also helps to solve the problem of scalability and has the advantage of fail-safe capability. This paper theoretically models the fusion of Intrusion Detection Systems for the purpose of proving the improvement in performance, supplemented with empirical evaluation. The combination of complementary sensors is shown to detect more attacks than the individual components. Since the individual sensors chosen detect sufficiently different attacks, their results can be merged for improved performance. The combination is done in different ways: (i) taking all the alarms from each system and avoiding duplications, (ii) taking alarms from each system by fixing threshold bounds, and (iii) rule-based fusion with a priori knowledge of the individual sensor performance. A number of evaluation metrics are used, and the results indicate that there is an overall enhancement in the performance of the combined detector using sensor fusion incorporating the threshold bounds, and significantly better performance using simple rule-based fusion.
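Choosing a threshold bound via the Chebyshev inequality might look like the sketch below. The scores and the target false positive rate are invented for illustration, and this uses the generic two-sided inequality; the paper's actual bounds may be derived differently.

```python
import math

def chebyshev_threshold(mean, std, max_fp_rate):
    """Upper alarm threshold T = mean + k*std such that, by Chebyshev's
    inequality, P(score >= T) <= 1/k**2 <= max_fp_rate under the benign
    score distribution -- no assumption beyond finite variance."""
    k = math.sqrt(1.0 / max_fp_rate)
    return mean + k * std

# Hypothetical fused anomaly scores observed on benign traffic.
benign_scores = [0.10, 0.12, 0.08, 0.11, 0.09, 0.13, 0.10, 0.12]
n = len(benign_scores)
mean = sum(benign_scores) / n
var = sum((x - mean) ** 2 for x in benign_scores) / n
threshold = chebyshev_threshold(mean, math.sqrt(var), max_fp_rate=0.01)
print(threshold)
```

Because the bound is distribution-free, the resulting threshold is conservative; that conservatism is what gives the fail-safe flavour mentioned above, at the cost of some sensitivity compared with a threshold tuned to a known score distribution.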


Reconstructions in optical tomography involve obtaining images of the absorption and reduced scattering coefficients. The integrated intensity data has greater sensitivity to variations in the absorption coefficient than to the scattering coefficient; however, its sensitivity to the scattering coefficient is not zero. We considered an object with two inhomogeneities (one in the absorption and the other in the scattering coefficient). The standard iterative reconstruction techniques produced results that were plagued by cross talk, i.e., the absorption coefficient reconstruction has a false positive corresponding to the location of the scattering inhomogeneity, and vice versa. We present a method to remove cross talk in the reconstruction by generating a weight matrix and weighting the update vector during the iteration. The weight matrix is created as follows: we first perform a simple backprojection of the difference between the experimental and corresponding homogeneous intensity data. The built-up image is weighted more towards the absorption inhomogeneity than the scattering inhomogeneity, and its appropriate inverse is weighted towards the scattering inhomogeneity. These two weight matrices are used as multiplication factors in the update vectors (the normalized backprojected image of the difference intensity for the absorption inhomogeneity, and its inverse for the scattering inhomogeneity) during the image reconstruction procedure. We demonstrate through numerical simulations that cross talk is fully eliminated through this modified reconstruction procedure.


The cis-regulatory regions on DNA serve as binding sites for proteins such as transcription factors and RNA polymerase. The combinatorial interaction of these proteins plays a crucial role in transcription initiation, which is an important point of control in the regulation of gene expression. We present here an analysis of the performance of an in silico method for predicting cis-regulatory regions in the plant genomes of Arabidopsis (Arabidopsis thaliana) and rice (Oryza sativa) on the basis of free energy of DNA melting. For protein-coding genes, we achieve recall and precision of 96% and 42% for Arabidopsis and 97% and 31% for rice, respectively. For noncoding RNA genes, the program gives recall and precision of 94% and 75% for Arabidopsis and 95% and 90% for rice, respectively. Moreover, 96% of the false-positive predictions were located in noncoding regions of primary transcripts, out of which 20% were found in the first intron alone, indicating possible regulatory roles. The predictions for orthologous genes from the two genomes showed a good correlation with respect to prediction scores and promoter organization. Comparison of our results with an existing program for promoter prediction in plant genomes indicates that our method shows improved prediction capability.