953 results for Probability of detection
Abstract:
The prevalence of psychiatric morbidity among patients with cancer reported in various studies ranges from 5 to 50%, a variation that can be attributed to differences in sample size, the disease itself and treatment factors. The objectives of the present study were to determine the frequency of psychiatric morbidity among recently diagnosed cancer outpatients and to identify which factors might be related to further psychological distress. Two hundred and eleven (70.9%) female patients and 87 (29.1%) male patients from the chemotherapy unit of the Cancer Hospital A.C. Camargo (São Paulo) completed a questionnaire covering demographic, medical and treatment details. The Self-Reporting Questionnaire (SRQ-20) was administered to the patients to determine their personal psychiatric morbidity. Seventy-two patients (25.8%) scored ≥ 8 on the SRQ-20, the cut-off point for a patient to be considered a psychiatric case. When the low- and high-scoring groups were compared, no differences were detected regarding age, marital status, tumor site, sex, or previous treatment. Nonetheless, patients in the lowest social class and those who were bedridden less than 50% of the time had a significantly higher probability of being a psychiatric case. Regarding help-seeking behavior in situations in which they had doubts or were frightened, about 64% of the total sample did not seek any type of support and did not talk to anyone. This frequency of psychiatric morbidity agrees with data from the cancer literature. According to many investigators, the early detection of a comorbid psychiatric disorder is crucial to relieving a patient's suffering.
Abstract:
CDKN2A has been implicated as a melanoma susceptibility gene in some kindreds with a family history of this disease. Mutations in CDKN2A may produce an imbalance between functional p16INK4a and cyclin D, causing abnormal cell growth. We searched for germline mutations in this gene in 22 patients meeting clinical criteria for hereditary cancer (early onset, multiple primary melanomas, or one or more affected first- or second-degree relatives) by secondary structural content prediction, a mutation scanning method that relies on the propensity of single-stranded DNA to take on a three-dimensional structure that is highly sequence dependent, followed by sequencing of the samples with alterations in electrophoretic mobility. The prevalence of CDKN2A mutation in our study was 4.5% (1/22), and there was a correlation between family history and the probability of mutation detection. We found the P48T mutation in one patient with two melanoma-affected relatives. The patient descends from Italian families, and this mutation has previously been reported only in Italian families, in two independent studies. This leads us to suggest the presence of a mutational "hotspot" within this gene or a founder mutation. We also detected a high prevalence (59.1%) of polymorphisms, mainly alleles 500 C/G (7 patients; 31.8%) and 540 C/T (6 patients; 27.3%), in the 3' untranslated region of exon 3. This result reinforces previous reports that these rare polymorphic alleles are significantly associated with the risk of developing melanoma.
Abstract:
Myelodysplastic syndrome (MDS) patients with a normal karyotype constitute a heterogeneous group from a biological standpoint, and their outcome is often unpredictable. Interphase fluorescence in situ hybridization (I-FISH) studies could increase the rate of detection of abnormalities, but previous reports in the literature have been contradictory. We performed I-FISH and conventional karyotyping (G-banding) on 50 MDS patients at diagnosis, after 6 and 12 months, or at any time a transformation to acute myeloid leukemia (AML) was detected. Applying a probe panel targeting the centromeres of chromosomes 7 and 8 and the loci 5q31, 5p15.2 and 7q31, we observed one case with a 5q deletion not identified by G-banding. I-FISH at 6 and 12 months confirmed the karyotype results. Eight cases transformed to AML during follow-up, but no hidden clone was detected by I-FISH in any of them. The inclusion of I-FISH during follow-up of MDS resulted in only a small improvement in abnormality detection compared with conventional G-banding.
Abstract:
The increasing presence of products derived from genetically modified (GM) plants in human and animal diets has led to the development of detection methods to distinguish biotechnology-derived foods from conventional ones. Conventional and real-time PCR have been used, respectively, to detect and quantify GM residues in highly processed foods. DNA extraction is a critical step in the analysis process. Factors such as DNA degradation, matrix effects, and the presence of PCR inhibitors mean that a detection or quantification limit established for a given method is restricted to the matrix used during validation and cannot be extrapolated to any other matrix outside the scope of the method. In Brazil, sausage samples were the main class of processed products in which Roundup Ready® (RR) soybean residues were detected; validation of methodologies for the detection and quantification of those residues is therefore essential. Sausage samples were submitted to two different methods of DNA extraction, a modified Wizard protocol and the CTAB method, and the yield and quality were compared for both. DNA samples were analyzed by conventional and real-time PCR for the detection and quantification of Roundup Ready® soybean. At least 200 ng of total sausage DNA was necessary for reliable quantification; reactions containing less DNA led to large variations in the expected GM percentage value. In conventional PCR, the detection limit varied from 1.0 to 500 ng, depending on the GM soybean content of the sample. The precision, performance, and linearity were relatively high, indicating that the method used for analysis was satisfactory.
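The relative quantification step described above can be sketched with the widely used delta-delta-Ct calculation. This is an illustrative sketch only, not the paper's validated protocol: the Ct values, the 100% RR calibrant, and the assumption of equal amplification efficiency for both assays are all hypothetical.

```python
def gm_percentage(ct_gm_sample, ct_ref_sample, ct_gm_cal, ct_ref_cal,
                  efficiency=2.0):
    """GM content by the delta-delta-Ct method, relative to a calibrant of
    known 100% GM content.  ct_gm_*: event-specific assay (RR soybean);
    ct_ref_*: taxon-specific reference assay (e.g. a soybean lectin gene).
    Assumes both assays amplify with the same efficiency (hypothetical)."""
    d_sample = ct_gm_sample - ct_ref_sample
    d_cal = ct_gm_cal - ct_ref_cal
    return 100.0 * efficiency ** -(d_sample - d_cal)

# A sample whose GM assay lags the reference by ~3.3 cycles more than in the
# 100% calibrant contains about 10% GM material (Ct values are made up):
print(round(gm_percentage(28.3, 22.0, 25.0, 22.02), 1))  # → 10.0
```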
Abstract:
A method using liquid chromatography-tandem mass spectrometry (LC-MS/MS) with a matrix-matched calibration curve was developed and validated for determining ochratoxin A (OTA) in green coffee. Linearity was found between 3.0 and 23.0 ng g⁻¹. Mean recoveries ranged from 90.45% to 108.81%; the relative standard deviation under repeatability and intermediate precision conditions ranged from 5.39% to 9.94% and from 2.20% to 14.34%, respectively. The limits of detection and quantification were 1.2 ng g⁻¹ and 3.0 ng g⁻¹, respectively. The method developed is suitable for and contributes to the field of mycotoxin analysis, and it will be used for future production of a Certified Reference Material (CRM) for OTA in coffee.
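Limits of detection and quantification like those reported above are commonly estimated from the calibration curve itself, via LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope. The sketch below uses made-up calibration data, not the paper's:

```python
def calibration_limits(conc, signal):
    """LOD and LOQ from a linear calibration curve, using the common
    criteria LOD = 3.3*sigma/S and LOQ = 10*sigma/S (sigma = residual
    standard deviation of the fit, S = slope)."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, signal)) / sxx
    intercept = my - slope * mx
    ssr = sum((y - slope * x - intercept) ** 2 for x, y in zip(conc, signal))
    sigma = (ssr / (n - 2)) ** 0.5          # residual standard deviation
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical calibration points over the validated range (ng/g vs peak area):
lod, loq = calibration_limits([3.0, 8.0, 13.0, 18.0, 23.0],
                              [31.0, 79.0, 131.0, 179.0, 231.0])
```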
Abstract:
Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods' applicability is lacking. There are some problems related to the efficacy of lie detection and veracity assessment. According to conventional belief, there exists an almighty lie detection method that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: What is the applicability of veracity assessment methods, which are reliable and based on scientific proof, in terms of the following criteria?
o Accuracy, i.e. probability of detecting deception successfully
o Ease of Use, i.e. ease of applying the method correctly
o Time Required to apply the method reliably
o No Need for Special Equipment
o Unobtrusiveness of the method
To answer the main research question, the following supporting research questions were answered first: What kinds of interviewing and interrogation techniques exist, and how could they be used in the intelligence interview context? What kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof, and what kinds of uncertainty and other limitations do these methods involve? Two major databases, Google Scholar and Science Direct, were used to search and collect existing topic-related studies and other papers. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis.
A Multi-Criteria Analysis utilizing the Analytic Hierarchy Process was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to gain firsthand experience of the applicability of different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that even the most applicable methods are not entirely trouble-free. In addition, this study highlighted that three channels of information (Content, Discourse and Nonverbal Communication) can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired. Since most current lie detection studies concentrate on a scenario where roughly half of the assessed people are totally truthful and the other half are liars presenting a well-prepared cover story, it is proposed that in future studies lie detection and veracity assessment methods be tested against partially truthful human sources.
This kind of test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for the modern ones that are still under development.
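The Analytic Hierarchy Process used in the multi-criteria analysis derives criterion weights as the principal eigenvector of a pairwise-comparison matrix. A minimal sketch follows, with hypothetical Saaty-scale judgments over three of the study's criteria (these judgments are illustrative, not the thesis's actual values):

```python
def ahp_weights(M, iters=200):
    """Priority weights from a pairwise-comparison matrix M: the principal
    eigenvector, approximated by power iteration with normalization."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical judgments: Accuracy vs Ease of Use = 3,
# Accuracy vs Time Required = 5, Ease of Use vs Time Required = 2.
M = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
weights = ahp_weights(M)   # roughly [0.65, 0.23, 0.12]
```

With these judgments, Accuracy dominates the ranking; in a full AHP one would also check the consistency ratio of M before trusting the weights.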
Abstract:
We present a new method to select features for a face detection system using Support Vector Machines (SVMs). In the first step we reduce the dimensionality of the input space by projecting the data into a subset of eigenvectors. The dimension of the subset is determined by a classification criterion based on minimizing a bound on the expected error probability of an SVM. In the second step we select features from the SVM feature space by removing those that have low contributions to the decision function of the SVM.
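The two-step procedure can be sketched as follows with toy data. This is not the authors' implementation: the subspace dimension k is fixed by hand here, whereas the paper selects it by minimizing a bound on the SVM's expected error, and a hand-rolled subgradient linear SVM stands in for a proper solver.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 "images" in 10-D with labels in {-1, +1}.
X = rng.normal(size=(200, 10))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1.0, -1.0)

# Step 1: reduce dimensionality by projecting onto the leading eigenvectors
# of the data covariance (PCA); k is fixed by hand in this sketch.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc.T))
order = np.argsort(eigvals)[::-1]
k = 5
Z = Xc @ eigvecs[:, order[:k]]

# Step 2: train a linear SVM by subgradient descent on the hinge loss, then
# rank features by their contribution |w_i| to the decision function.
w, b = np.zeros(k), 0.0
lam, lr = 1e-3, 1e-2                    # regularization strength, step size
for _ in range(300):
    viol = y * (Z @ w + b) < 1          # margin violators
    if not viol.any():
        break
    w -= lr * (lam * w - (y[viol, None] * Z[viol]).mean(axis=0))
    b += lr * y[viol].mean()

contrib = np.abs(w)
keep = np.argsort(contrib)[::-1][:3]    # drop the lowest-contribution features
print("kept projected features:", sorted(keep.tolist()))
```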
Abstract:
Understanding the effect of habitat fragmentation is a fundamental yet complicated aim of many ecological studies. The Beni savanna is a naturally fragmented forest habitat where forest islands exhibit variation in resources and threats. To understand how the availability of resources and threats affects the use of forest islands by parrots, we applied occupancy modeling to quantify use and detection probabilities for 12 parrot species on 60 forest islands. The presence of urucuri (Attalea phalerata) and macaw (Acrocomia aculeata) palms, the number of tree cavities on the islands, and the presence of selective logging and fire were included as covariates associated with availability of resources and threats. The model-selection analysis indicated that both resource and threat variables explained the use of forest islands by parrots. For most species, the best models confirmed predictions. The number of cavities was positively associated with use of forest islands by 11 species. Island area and the presence of macaw palm showed a positive association with the probability of use by seven and five species, respectively, while selective logging and fire showed a negative association for five and six species, respectively. The Blue-throated Macaw (Ara glaucogularis), a critically endangered parrot species endemic to our study area, was the only species that showed a negative association with both threats. Monitoring continues to be essential to evaluate conservation and management actions for parrot populations. Understanding how species use this naturally fragmented habitat will help determine which fragments should be preserved and which conservation actions are needed.
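The use and detection probabilities above come from a standard single-season occupancy model, whose per-site likelihood can be written directly. A minimal sketch (psi = probability an island is used, p = per-survey detection probability; the values are illustrative, not the study's estimates):

```python
def occupancy_likelihood(history, psi, p):
    """Per-site likelihood of a detection history (list of 0/1 survey
    results) under the single-season occupancy model.
    psi: probability the site (forest island) is used by the species.
    p:   per-survey detection probability, conditional on use."""
    if any(history):                      # detected at least once: site is used
        prod = 1.0
        for h in history:
            prod *= p if h else (1.0 - p)
        return psi * prod
    # never detected: either used but missed every survey, or not used
    return psi * (1.0 - p) ** len(history) + (1.0 - psi)

print(round(occupancy_likelihood([0, 1, 0], psi=0.6, p=0.4), 4))  # → 0.0864
print(round(occupancy_likelihood([0, 0, 0], psi=0.6, p=0.4), 4))  # → 0.5296
```

The second case shows why detection probability matters: an all-zero history does not mean the island is unused, only that use was never detected.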
Abstract:
There has been recent interest in the use of X-chromosomal loci for forensic and relatedness-testing casework, with many authors developing new X-linked short tandem repeat (STR) loci suitable for forensic use. Here we present formulae for two key quantities in paternity testing, the average probability of exclusion and the paternity index, that are suitable for X-chromosomal loci in the presence of population substructure.
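For intuition, the single-locus paternity index for an X-linked marker and a female child reduces, in the simplest case (paternal allele unambiguous, Hardy-Weinberg proportions), to the reciprocal of the paternal allele frequency; the abstract's formulae additionally account for population substructure, which this sketch omits.

```python
def paternity_index_x(paternal_allele_freq):
    """Single-locus paternity index for an X-linked marker and a female
    child, in the simplest case where the alleged father (hemizygous)
    carries the child's unambiguous paternal allele:  PI = 1 / f(a).
    (No population-substructure correction; the cited formulae include one.)"""
    return 1.0 / paternal_allele_freq

# A rare paternal allele gives strong support for paternity:
print(round(paternity_index_x(0.05), 1))  # → 20.0
```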
Abstract:
Recombination is thought to occur only rarely in animal mitochondrial DNA (mtDNA). However, detection of mtDNA recombination requires that cells become heteroplasmic through mutation, intramolecular recombination or 'leakage' of paternal mtDNA. Interspecific hybridization increases the probability of detecting mtDNA recombinants due to higher levels of sequence divergence and potentially higher levels of paternal leakage. During a study of historical variation in Atlantic salmon (Salmo salar) mtDNA, an individual with a recombinant haplotype containing sequence from both Atlantic salmon and brown trout (Salmo trutta) was detected. The individual was not an F1 hybrid, but it did have an unusual nuclear genotype which suggested that it was a later-generation backcross. No other similar recombinant haplotype was found in the same population or in three neighbouring Atlantic salmon populations among 717 individuals collected during 1948-2002. Interspecific recombination may increase mtDNA variability within species and can have implications for phylogenetic studies.
Abstract:
Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, this is done over both land and ocean using night-time (infrared) imagery. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 87% and 48% for ocean and land, respectively, using the Bayesian technique, compared to 74% and 39%, respectively, for the threshold-based techniques associated with the validation dataset.
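The true skill score quoted here is the Hanssen-Kuipers score: the hit rate minus the false-alarm rate from a 2x2 contingency table of cloud-mask outcomes. A minimal sketch with illustrative counts (not the paper's data):

```python
def true_skill_score(hits, misses, false_alarms, correct_negatives):
    """Hanssen-Kuipers true skill score from a 2x2 contingency table:
    hit rate minus false-alarm rate."""
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - false_alarm_rate

# Illustrative counts for 200 validation pixels:
print(round(true_skill_score(90, 10, 3, 97), 2))  # → 0.87
```

Unlike raw accuracy, this score is not inflated by the mostly-clear (or mostly-cloudy) pixels that dominate many scenes, which is why it is the preferred verification metric here.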
Abstract:
Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, the technique is shown to be suitable for daytime applications over land and sea, using visible and near-infrared imagery in addition to thermal infrared. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 89% and 73% for ocean and land, respectively, using the Bayesian technique, compared to 90% and 70%, respectively, for the threshold-based techniques associated with the validation dataset.
Abstract:
We propose and demonstrate a fully probabilistic (Bayesian) approach to the detection of cloudy pixels in thermal infrared (TIR) imagery observed from satellite over oceans. Using this approach, we show how to exploit the prior information and the fast forward-modelling capability that are typically available in the operational context to obtain improved cloud detection. The probability of clear sky for each pixel is estimated by applying Bayes' theorem, and we describe how to apply Bayes' theorem to this problem in general terms. Joint probability density functions (PDFs) of the observations in the TIR channels are needed; the PDFs for clear conditions are calculable from forward modelling, and those for cloudy conditions have been obtained empirically. Using analysis fields from numerical weather prediction as prior information, we apply the approach to imagery representative of imagers on polar-orbiting platforms. In comparison with the established cloud-screening scheme, the new technique decreases both the rate of failure to detect cloud contamination and the false-alarm rate by one quarter. The rate of occurrence of cloud-screening-related errors of >1 K in area-averaged sea surface temperatures (SSTs) is reduced by 83%. Copyright © 2005 Royal Meteorological Society.
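The per-pixel calculation is a direct application of Bayes' theorem. A minimal sketch, where the numeric values are illustrative stand-ins for the forward-modelled clear-sky PDF, the empirical cloudy PDF, and the NWP-derived prior:

```python
def p_clear_given_obs(lik_clear, lik_cloud, prior_clear):
    """Posterior probability that a pixel is clear, from Bayes' theorem.
    lik_clear:   PDF of the TIR observations under clear sky (forward model)
    lik_cloud:   empirical PDF of the observations under cloud
    prior_clear: prior probability of clear sky (e.g. from NWP analysis)"""
    num = lik_clear * prior_clear
    return num / (num + lik_cloud * (1.0 - prior_clear))

# Observations 8x more likely under clear sky than cloud, with a 50% prior:
print(round(p_clear_given_obs(0.8, 0.1, 0.5), 3))  # → 0.889
```

Thresholding this posterior at different levels yields the application-specific cloud masks described in the preceding abstracts.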
Abstract:
We describe how the method of detection of delayed K x-rays produced by the electron-capture decay of the residual nuclei can be a powerful tool in the investigation of the effect of the breakup process on the complete fusion (CF) cross-section of weakly bound nuclei at energies close to the Coulomb barrier. This is presently one of the most interesting subjects under investigation in the field of low-energy nuclear reactions, and the difficult experimental task of separating CF from the incomplete fusion (ICF) of one of the breakup fragments can be achieved by the x-ray spectrometry method. We present results for the fusion of the ⁹Be + ¹⁴⁴Sm system. Copyright © 2008 John Wiley & Sons, Ltd.
Abstract:
This study describes the development of amperometric sensors based on poly(allylamine hydrochloride) (PAH) and lutetium bisphthalocyanine (LuPc₂) films assembled using the Layer-by-Layer (LbL) technique. The films have been used as modified electrodes for catechol quantification. Electrochemical measurements were employed to investigate the catalytic properties of the LuPc₂ immobilized in the LbL films. By chronoamperometry, the sensors present excellent sensitivity (20 nA µM⁻¹) over a wide linear range (R² = 0.994) up to 900 µM, with a limit of detection (s/n = 3) of 37.5 × 10⁻⁸ M for catechol. The sensors have good reproducibility and can be used at least ten times. The working potential is +0.3 V vs. a saturated calomel electrode (SCE). In voltammetry measurements, the calibration curve shows good linearity (R² = 0.992) for catechol up to 500 µM, with a sensitivity of 90 nA µM⁻¹ and an LOD of 8 µM. © 2011 Elsevier B.V. All rights reserved.
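The s/n = 3 detection limit relates baseline noise to sensitivity as LOD = 3·σ/S. As a consistency check, the reported sensitivity and LOD imply a baseline noise of about 2.5 nA; that noise figure is inferred here, not reported in the abstract.

```python
def lod_from_noise(noise_nA, sensitivity_nA_per_uM, k=3.0):
    """Limit of detection (in µM) from the s/n = k criterion: LOD = k*sigma/S,
    with sigma the baseline noise current and S the amperometric sensitivity."""
    return k * noise_nA / sensitivity_nA_per_uM

# Reported sensitivity 20 nA/µM and LOD 37.5e-8 M = 0.375 µM imply a
# baseline noise of about 2.5 nA (inferred, not reported):
print(lod_from_noise(2.5, 20.0))  # → 0.375
```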