990 results for "Testing Procedure"


Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Past research has demonstrated emergent conditional relations using a go/no-go procedure with pairs of figures displayed side by side on a computer screen. The present study sought to extend applications of this procedure. In Experiment 1, we evaluated whether emergent conditional relations could be demonstrated when two-component stimuli were displayed in figure-ground relationships: abstract figures displayed on backgrounds of different colors. Five normally capable adults participated. During training, each two-component stimulus was presented successively. Responses emitted in the presence of some stimulus pairs (A1B1, A2B2, A3B3, B1C1, B2C2 and B3C3) were reinforced, whereas responses emitted in the presence of other pairs (A1B2, A1B3, A2B1, A2B3, A3B1, A3B2, B1C2, B1C3, B2C1, B2C3, B3C1 and B3C2) were not. During tests, new configurations (AC and CA) were presented, thus structurally emulating the matching-to-sample tests employed in typical equivalence studies. All participants showed emergent relations consistent with stimulus equivalence during testing. In Experiment 2, we systematically replicated the procedures with stimulus compounds consisting of four figures (A1, A2, C1 and C2) and two locations (left, B1, and right, B2). All six normally capable adults exhibited emergent stimulus-stimulus relations. Together, these experiments show that the go/no-go procedure is a potentially useful alternative for studying emergent conditional relations when matching-to-sample is procedurally cumbersome or impossible to use.

Relevance:

30.00%

Publisher:

Abstract:

This paper aims to discuss and test the hypothesis raised by Fusar-Poli [Fusar-Poli P. Can neuroimaging prove that schizophrenia is a brain disease? A radical hypothesis. Medical Hypotheses, in press, corrected proof] that "on the basis of the available imaging literature there is no consistent evidence to reject the radical and provocative hypothesis that schizophrenia is not a brain disease". To achieve this goal, all meta-analyses on 'fMRI and schizophrenia' published during the current decade and indexed in PubMed were summarized, along with other useful information, e.g., meta-analyses on genetic risk factors. Our main conclusion is that the literature fully supports the hypothesis that schizophrenia is a syndrome (not a disease) associated with brain abnormalities, despite the fact that there is no singular, reductionist pathway from the nosographic entity (schizophrenia) to its causes. This irreducibility is due to the fact that the syndrome has more than one dimension (e.g., cognitive, psychotic and negative), and each of them is related to abnormalities in specific neuronal networks. A psychiatric diagnosis is a statistical procedure; these dimensions are not identically represented in each diagnosed case, and this explains the existence of more than one pattern of brain abnormalities related to schizophrenia. For example, chronification is associated with negativism while the first psychotic episode is not; in that sense, the same person living with schizophrenia may show different symptoms and fMRI patterns over the course of his life, and this is precisely what has defined schizophrenia since the time when it was called Dementia Praecox (first by Pick, then by Kraepelin). It is notable that 100% of the collected meta-analyses on 'fMRI and schizophrenia' report positive findings.
Moreover, all meta-analyses that found positive associations between schizophrenia and genetic risk factors involve genes (SNPs) especially activated in neuronal tissue of the central nervous system (CNS), suggesting that, to the extent these polymorphisms are related to schizophrenia's etiology, they are also related to abnormal brain activity. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Background: The evaluation of associations between genotypes and diseases in a case-control framework plays an important role in genetic epidemiology. This paper focuses on the evaluation of the homogeneity of both genotypic and allelic frequencies. The traditional test that is used to check allelic homogeneity is known to be valid only under Hardy-Weinberg equilibrium, a property that may not hold in practice. Results: We first describe the flaws of the traditional (chi-squared) tests for both allelic and genotypic homogeneity. Besides the known problem of the allelic procedure, we show that whenever these tests are used, an incoherence may arise: sometimes the genotypic homogeneity hypothesis is not rejected, but the allelic hypothesis is. As we argue, this is logically impossible. Some recently proposed methods implicitly rely on the idea that this does not happen. In an attempt to correct this incoherence, we describe an alternative frequentist approach that is appropriate even when Hardy-Weinberg equilibrium does not hold. It is then shown that the problem remains and is intrinsic to frequentist procedures. Finally, we introduce the Full Bayesian Significance Test to test both hypotheses and prove that the incoherence cannot happen with these new tests. To illustrate this, all five tests are applied to real and simulated datasets. Using power analysis, we show that the Bayesian method is comparable in power to the frequentist one and has the advantage of being coherent. Conclusions: Contrary to more traditional approaches, the Full Bayesian Significance Test for association studies provides a simple, coherent and powerful tool for detecting associations.
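The two traditional chi-squared procedures the abstract criticizes can be sketched in a few lines. This is a hypothetical illustration (not the paper's code): the genotype counts are invented, and the point of interest is that the allelic version counts the 2n alleles as if they were independent observations, which is only justified under Hardy-Weinberg equilibrium.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical genotype counts (AA, Aa, aa) for cases and controls.
genotypes = np.array([
    [30, 50, 20],   # cases
    [25, 50, 25],   # controls
])

# Traditional genotypic homogeneity test: chi-squared on the 2x3 table.
chi2_g, p_g, _, _ = chi2_contingency(genotypes)

# Traditional allelic homogeneity test: collapse each row into allele
# counts (each AA carries two A alleles, each Aa one of each) and run a
# chi-squared test on the resulting 2x2 table of 2n alleles per group.
alleles = np.array([[2 * g[0] + g[1], 2 * g[2] + g[1]] for g in genotypes])
chi2_a, p_a, _, _ = chi2_contingency(alleles)

print(f"genotypic: chi2={chi2_g:.3f}, p={p_g:.3f}")
print(f"allelic:   chi2={chi2_a:.3f}, p={p_a:.3f}")
```

Doubling the sample in the allelic table is exactly the step that fails without Hardy-Weinberg equilibrium, and running both tests side by side is the setting in which the genotypic/allelic incoherence formalized in the paper can surface.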

Relevance:

30.00%

Publisher:

Abstract:

Aims: This report discusses the use of antinuclear antibody (ANA) detection as a screening test for neuropsychiatric systemic lupus erythematosus (NPSLE) in patients presenting with a first-episode psychosis. Methods: We reviewed the medical records of 85 patients admitted to an emergency service due to first-episode psychosis during a 1-year period, for whom ANA detection was performed with an indirect immunofluorescence HEp-2 cell assay. ANA-positive patients were subsequently evaluated with autoantibody testing and neuroimaging. Results: Three patients were ANA positive in the initial screening, and further investigation confirmed NPSLE in two patients. The patients were treated with antipsychotics and cyclophosphamide pulses with satisfactory outcomes. Conclusion: Even though ANA detection is not specific, it is a low-cost procedure and could be an important screening test for NPSLE in early-onset psychosis.

Relevance:

30.00%

Publisher:

Abstract:

Abstract Background Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent, validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we join two important aspects: 1) process scalability, achieved through a relational database implementation, and 2) process correctness, ensured by process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing through easy end-user interfaces.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is divided into three chapters. In the first chapter we analyse the results of the worldwide forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure to evaluate earthquake forecasting models. We first present the models and the target earthquakes to be forecast. Then we explain the consistency and comparison tests that are used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of probabilistic seismic hazard analysis (PSHA): the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods that are commonly used to account for epistemic uncertainty in PSHA. The most widely used method is the logic tree, which underlies the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe epistemic uncertainty. We then propose a new probabilistic framework based on ensemble modelling that properly accounts for epistemic uncertainties in PSHA.
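The Poisson step the second chapter examines — converting an exceedance rate into an exceedance probability — is the elementary relation P = 1 - exp(-λT). A minimal sketch (the rate and exposure time below are hypothetical illustrations, not values from the thesis):

```python
import math

def exceedance_probability(rate_per_year: float, years: float) -> float:
    """Poisson conversion of an exceedance rate into an exceedance
    probability over an exposure time, as used in the Cornell-McGuire
    PSHA framework: P = 1 - exp(-rate * T)."""
    return 1.0 - math.exp(-rate_per_year * years)

# Hypothetical: a ground-motion level exceeded at 0.0021 events/year
# corresponds to roughly 10% probability of exceedance in 50 years,
# the conventional design target.
p = exceedance_probability(0.0021, 50)
print(f"{p:.3f}")  # → 0.100
```

The conversion is valid only if exceedances behave as a Poisson process, which is precisely the assumption the chapter argues (via Le Cam's theorem) does not require declustering the catalog.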

Relevance:

30.00%

Publisher:

Abstract:

In most real-life environments, mechanical or electronic components are subjected to vibrations. Some of these components may have to pass qualification tests to verify that they can withstand the fatigue damage they will encounter during their operational life. In order to conduct a reliable test, the environmental excitations can be taken as a reference to synthesize the test profile: this procedure is referred to as “test tailoring”. For cost and feasibility reasons, accelerated qualification tests are usually performed. In this case, the duration of the original excitation, which acts on the component for its entire life cycle (typically hundreds or thousands of hours), is reduced. In particular, the “Mission Synthesis” procedure makes it possible to quantify the damage induced by the environmental vibration through two functions: the Fatigue Damage Spectrum (FDS) quantifies the fatigue damage, while the Maximum Response Spectrum (MRS) quantifies the maximum stress. A new random Power Spectral Density (PSD) can then be synthesized, inducing the same amount of damage but over a specified, shorter duration, in order to conduct accelerated tests. In this work, the Mission Synthesis procedure is applied to so-called Sine-on-Random vibrations, i.e. excitations composed of random vibrations superimposed on deterministic contributions in the form of sine tones, typically due to rotating parts of the system (e.g. helicopters, engine-mounted components, …). In fact, a proper test tailoring should preserve not only the accumulated fatigue damage but also the “nature” of the excitation (in this case, the sinusoidal components superimposed on the random process) in order to obtain reliable results. The classic time-domain approach is taken as a reference for the comparison of different methods for the FDS calculation in the presence of Sine-on-Random vibrations. Then, a methodology to compute a Sine-on-Random specification based on a mission FDS is presented.
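The time-compression idea behind accelerated testing is commonly written as an inverse power law derived from Basquin's S-N relation: equal fatigue damage is preserved when the test duration shrinks as the excitation level rises. The sketch below is a generic illustration of that rule with hypothetical values and exponent, not the Sine-on-Random methodology developed in the work:

```python
def accelerated_duration(life_hours: float, grms_life: float,
                         grms_test: float, b: float = 8.0) -> float:
    """Inverse-power-law time compression for a random vibration test:
    equal fatigue damage requires T_test = T_life * (rms_life/rms_test)^b.
    The exponent b (related to the Basquin S-N slope) is material-
    dependent; 8 is a value often quoted for random vibration and is a
    hypothetical choice here."""
    return life_hours * (grms_life / grms_test) ** b

# Hypothetical: 2000 h of service at 1.0 g RMS, compressed by testing
# at 1.5 g RMS.
print(f"{accelerated_duration(2000.0, 1.0, 1.5):.1f} h")  # → 78.0 h
```

This scalar shortcut preserves only the total damage at one resonance; the FDS/MRS formulation the abstract describes generalizes the same equivalence across a whole spectrum of natural frequencies, which is why it is used for test tailoring.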

Relevance:

30.00%

Publisher:

Abstract:

Over the past decades, major progress in patient selection, surgical techniques and anaesthetic management has largely contributed to improved outcomes in lung cancer surgery. The purpose of this study was to identify predictors of post-operative cardiopulmonary morbidity in patients with a forced expiratory volume in 1 s <80% predicted who underwent cardiopulmonary exercise testing (CPET). In this observational study, 210 consecutive patients with lung cancer underwent CPET, with complete data over a 9-yr period (2001-2009). Cardiopulmonary complications occurred in 46 (22%) patients, including four (1.9%) deaths. On logistic regression analysis, peak oxygen uptake (peak V'O₂) and anaesthesia duration were independent risk factors for both cardiovascular and pulmonary complications; age and the extent of lung resection were additional predictors of cardiovascular complications, whereas tidal volume during one-lung ventilation was a predictor of pulmonary complications. Compared with patients with a peak V'O₂ >17 mL·kg⁻¹·min⁻¹, those with a peak V'O₂ <10 mL·kg⁻¹·min⁻¹ had a four-fold higher incidence of cardiac and pulmonary morbidity. Our data support the use of pre-operative CPET and the application of an intra-operative protective ventilation strategy. Further studies should evaluate whether pre-operative physical training can improve post-operative outcome.

Relevance:

30.00%

Publisher:

Abstract:

Equivalence testing is growing in use in scientific research outside of its traditional role in the drug approval process. Largely due to its ease of use and its recommendation in United States Food and Drug Administration guidance, the most common statistical method for testing (bio)equivalence is the two one-sided tests procedure (TOST). Like classical point-null hypothesis testing, TOST is subject to multiplicity concerns as more comparisons are made. In this manuscript, a condition that bounds the family-wise error rate (FWER) under TOST is given. This condition then leads to a simple solution for controlling the FWER. Specifically, we demonstrate that if all pairwise comparisons of k independent groups are being evaluated for equivalence, then simply dividing the nominal Type I error rate by (k - 1) is sufficient to maintain the family-wise error rate at the desired value or less. The resulting rule is much less conservative than the equally simple Bonferroni correction. An example of equivalence testing in a non-drug-development setting is given.
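The rule is simple to apply in practice: run each pairwise TOST at the nominal level divided by (k - 1). The sketch below is a hedged illustration, not the manuscript's code; the equivalence margins and samples are hypothetical, and the degrees of freedom use a simple pooled-style choice (a Welch correction is also common):

```python
import numpy as np
from scipy import stats

def tost(x, y, low, high, alpha):
    """Two one-sided tests (TOST): declare equivalence when the mean
    difference is significantly above `low` AND significantly below
    `high`, each one-sided test run at level alpha."""
    n1, n2 = len(x), len(y)
    diff = np.mean(x) - np.mean(y)
    se = np.sqrt(np.var(x, ddof=1) / n1 + np.var(y, ddof=1) / n2)
    df = n1 + n2 - 2  # simple df choice; Welch-Satterthwaite also common
    p_lower = stats.t.sf((diff - low) / se, df)    # H0: diff <= low
    p_upper = stats.t.cdf((diff - high) / se, df)  # H0: diff >= high
    return max(p_lower, p_upper) < alpha

# With k = 4 groups compared pairwise, the manuscript's rule divides
# the nominal level by (k - 1) to control the FWER.
k = 4
alpha = 0.05 / (k - 1)

# Hypothetical, deterministic samples with equal means and small spread.
a = np.array([9.9, 10.1] * 25)
b = np.array([9.95, 10.05] * 25)
print(tost(a, b, low=-0.5, high=0.5, alpha=alpha))  # → True
```

For comparison, a Bonferroni correction over all k(k - 1)/2 pairwise comparisons would divide by 6 here rather than 3, which is the conservatism the manuscript's bound avoids.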

Relevance:

30.00%

Publisher:

Abstract:

The GLAaS algorithm for pretreatment intensity-modulated radiation therapy absolute dose verification, based on the use of amorphous silicon detectors as described in Nicolini et al. [G. Nicolini, A. Fogliata, E. Vanetti, A. Clivio, and L. Cozzi, Med. Phys. 33, 2839-2851 (2006)], was tested under a variety of experimental conditions to investigate its robustness, its performance, and the possibility of using it in different clinics. GLAaS was therefore tested on a low-energy Varian Clinac (6 MV) equipped with an amorphous silicon Portal Vision PV-aS500 with IAS2 electronic readout and on a high-energy Clinac (6 and 15 MV) equipped with a PV-aS1000 and IAS3 electronics. Tests were performed for three calibration conditions: A, adding buildup on top of the cassette such that SDD-SSD = d(max) and comparing measurements with corresponding doses computed at d(max); B, without adding any buildup on top of the cassette and considering only the intrinsic water-equivalent thickness of the electronic portal imaging device (0.8 cm); and C, without adding any buildup on top of the cassette but comparing measurements against doses computed at d(max). The latter procedure is similar to that usually applied when in vivo dosimetry is performed with solid-state diodes without sufficient buildup material. Quantitatively, the gamma index (gamma), as described by Low et al. [D. A. Low, W. B. Harms, S. Mutic, and J. A. Purdy, Med. Phys. 25, 656-660 (1998)], was assessed. The gamma index was computed for a distance to agreement (DTA) of 3 mm; dose difference thresholds (deltaD) of 2%, 3%, and 4% were considered. As a measure of the quality of the results, the fraction of field area with gamma larger than 1 (%FA) was scored. Results over a set of 50 test samples (including fields from head and neck, breast, prostate, anal canal, and brain cases) and from long-term routine usage demonstrated the robustness and stability of GLAaS.
In general, the mean values of %FA remain below 3% for deltaD equal to or larger than 3%, while they are slightly larger for deltaD = 2%, with %FA in the range from 3% to 8%. Since its introduction into routine practice, 1453 fields have been verified with GLAaS at the authors' institute (6 MV beam). Using a DTA of 3 mm and a deltaD of 4%, the authors obtained %FA = 0.9 +/- 1.1 for the entire data set while, stratifying according to the dose calculation algorithm, they observed %FA = 0.7 +/- 0.9 for fields computed with the analytical anisotropic algorithm and %FA = 2.4 +/- 1.3 for pencil-beam-based fields, with a statistically significant difference between the two groups. When data are stratified according to field splitting, %FA = 0.8 +/- 1.0 for split fields and 1.0 +/- 1.2 for non-split fields, with no significant difference.
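For reference, the gamma index of Low et al. combines a dose-difference criterion and a distance-to-agreement criterion into one metric: for each evaluated point it is the minimum, over reference points, of sqrt((distance/DTA)² + (dose difference/ΔD)²). The 1-D NumPy sketch below uses hypothetical profiles (it is not GLAaS itself, and global normalization to the reference maximum is an assumption) and computes %FA as the fraction of points with gamma > 1:

```python
import numpy as np

def gamma_index(ref_dose, eval_dose, positions, dta_mm=3.0, dd_frac=0.03):
    """1-D gamma index (Low et al. 1998): for each evaluated point,
    the minimum over reference points of
    sqrt((dist/DTA)^2 + (dose_diff/(dd_frac * Dmax))^2),
    with the dose tolerance taken globally as a fraction of the
    reference maximum (an assumption of this sketch)."""
    dmax = ref_dose.max()
    gamma = np.empty_like(eval_dose)
    for i, (pe, de) in enumerate(zip(positions, eval_dose)):
        dist2 = ((positions - pe) / dta_mm) ** 2
        dose2 = ((ref_dose - de) / (dd_frac * dmax)) ** 2
        gamma[i] = np.sqrt(np.min(dist2 + dose2))
    return gamma

# Hypothetical dose profiles on a 1 mm grid.
x = np.arange(0.0, 100.0, 1.0)                    # positions in mm
ref = 100.0 * np.exp(-((x - 50.0) / 20.0) ** 2)   # reference profile
ev = ref * 1.02                                   # uniform 2% overdose
g = gamma_index(ref, ev, x)
fa = 100.0 * np.mean(g > 1.0)  # %FA: fraction of points with gamma > 1
print(f"%FA = {fa:.1f}%")  # → %FA = 0.0% (2% error passes a 3%/3mm test)
```

A uniform 2% dose error passes the 3%/3 mm criterion everywhere, which matches the abstract's observation that %FA grows when the dose tolerance is tightened to 2%.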

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: Prolonged sacral neuromodulation (SNM) testing is more reliable for accurate patient selection than the usual test period of 4-7 days. However, prolonged testing was suspected to result in a higher complication rate due to infection via the percutaneous passage of the extension wire. Therefore, we prospectively assessed the complications associated with prolonged tined lead testing. PATIENTS AND METHODS: A consecutive series of 44 patients who underwent prolonged tined lead testing for at least 14 days between May 2002 and April 2007 were evaluated. Complications during prolonged tined lead testing, during and after tined lead explantation and during follow-up after implantation of the implantable pulse generator (IPG) were registered prospectively. RESULTS: Four patients suffered from urgency-frequency syndrome, 13 from urge incontinence, 18 from non-obstructive chronic urinary retention and nine from chronic pelvic pain syndrome. The median test phase was 30 days (interquartile range [IQR] 21-36). Thirty-two of the 44 patients (73%) had successful prolonged tined lead testing and 31 of these (97%) underwent the implantation of the IPG. The median follow-up of the IPG implanted patients was 31 months (IQR 20-41). The complication rate was 5% (2/44) during prolonged tined lead testing and 16% (5/31) during follow-up of the IPG implanted patients, respectively. None of the complications could be attributed to prolonged testing. No infections were observed during the study period. CONCLUSIONS: This prospective, observational non-randomised study suggests prolonged SNM tined lead testing is a safe procedure. Based on the low complication rate and the increased reliability for accurate patient selection, this method is proposed as a possible standard test procedure, subject to confirmation by further randomised, controlled clinical studies.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Phaeochromocytomas and paragangliomas are neuro-endocrine tumours that occur sporadically and in several hereditary tumour syndromes, including the phaeochromocytoma-paraganglioma syndrome. This syndrome is caused by germline mutations in succinate dehydrogenase B (SDHB), C (SDHC), or D (SDHD) genes. Clinically, the phaeochromocytoma-paraganglioma syndrome is often unrecognised, although 10-30% of apparently sporadic phaeochromocytomas and paragangliomas harbour germline SDH-gene mutations. Despite these figures, the screening of phaeochromocytomas and paragangliomas for mutations in the SDH genes to detect phaeochromocytoma-paraganglioma syndrome is rarely done because of time and financial constraints. We investigated whether SDHB immunohistochemistry could effectively discriminate between SDH-related and non-SDH-related phaeochromocytomas and paragangliomas in large retrospective and prospective tumour series. METHODS: Immunohistochemistry for SDHB was done on 220 tumours. Two retrospective series of 175 phaeochromocytomas and paragangliomas with known germline mutation status for phaeochromocytoma-susceptibility or paraganglioma-susceptibility genes were investigated. Additionally, a prospective series of 45 phaeochromocytomas and paragangliomas was investigated for SDHB immunostaining followed by SDHB, SDHC, and SDHD mutation testing. FINDINGS: SDHB protein expression was absent in all 102 phaeochromocytomas and paragangliomas with an SDHB, SDHC, or SDHD mutation, but was present in all 65 paraganglionic tumours related to multiple endocrine neoplasia type 2, von Hippel-Lindau disease, and neurofibromatosis type 1. 47 (89%) of the 53 phaeochromocytomas and paragangliomas with no syndromic germline mutation showed SDHB expression. The sensitivity and specificity of the SDHB immunohistochemistry to detect the presence of an SDH mutation in the prospective series were 100% (95% CI 87-100) and 84% (60-97), respectively. 
INTERPRETATION: Phaeochromocytoma-paraganglioma syndrome can be diagnosed reliably by an immunohistochemical procedure. SDHB, SDHC, and SDHD germline mutation testing is indicated only in patients with SDHB-negative tumours. SDHB immunohistochemistry on phaeochromocytomas and paragangliomas could improve the diagnosis of phaeochromocytoma-paraganglioma syndrome. FUNDING: The Netherlands Organisation for Scientific Research, Dutch Cancer Society, Vanderes Foundation, Association pour la Recherche contre le Cancer, Institut National de la Santé et de la Recherche Médicale, and a PHRC grant COMETE 3 for the COMETE network.
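The reported accuracy figures are consistent with exact (Clopper-Pearson) binomial confidence limits. The sketch below is not from the paper: the split of the 45 prospective tumours into 26 SDH-mutated and 19 non-mutated cases is an inference from the reported percentages and intervals, not a figure stated in the abstract.

```python
from scipy.stats import beta

def clopper_pearson(successes: int, n: int, alpha: float = 0.05):
    """Exact (Clopper-Pearson) two-sided binomial confidence interval."""
    lo = beta.ppf(alpha / 2, successes, n - successes + 1) if successes > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, successes + 1, n - successes) if successes < n else 1.0
    return lo, hi

# Hypothetical split of the 45-tumour prospective series, inferred from
# the reported intervals: 26 SDH-mutated (all SDHB-negative on IHC) and
# 19 non-mutated (16 of them SDHB-positive).
sens_lo, sens_hi = clopper_pearson(26, 26)   # sensitivity 26/26 = 100%
spec_lo, spec_hi = clopper_pearson(16, 19)   # specificity 16/19 ≈ 84%
print(f"sensitivity 95% CI: {100*sens_lo:.0f}-{100*sens_hi:.0f}%")  # → 87-100%
print(f"specificity 95% CI: {100*spec_lo:.0f}-{100*spec_hi:.0f}%")  # → 60-97%
```

The wide specificity interval reflects the small number of non-mutated tumours in the prospective series, which is why the authors frame immunohistochemistry as a triage step before germline mutation testing rather than a replacement for it.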

Relevance:

30.00%

Publisher:

Abstract:

Recently developed technologies allow aortic valve implantation off-pump in a beating heart. In this procedure, the native, stenotic aortic valve is not removed, but simply crushed by a pressure balloon mounted on a percutaneous catheter. Removal of the native aortic cusps before valve replacement may reduce the incidence of annular or cuspal calcium embolization and late perivalvular leaks and increase implantable valve size. However, a temporary valve system in the ascending aorta may be necessary to maintain hemodynamic stability by reducing acute aortic regurgitation and left ventricular volume overload. This study evaluates the hemodynamic effects of a wire-mounted, monoleaflet, temporary valve apparatus in a mechanical cardiovascular simulator. Aortic flow, systemic pressure and left ventricular pressure were continuously monitored. An intraluminal camera obtained real-time proximal and distal images of the valve in operation. Insertion of the parachute valve in the simulator increased diastolic pressure from 7 to 38 mm Hg. Cardiac output increased from 2.08 to 4.66 L/min and regurgitant volume decreased from 65 to 23 mL. In conclusion, placement of a temporary valve in the ascending aorta may help maintain hemodynamic stability and improve off-pump aortic valve replacement.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to assess implant therapy after a staged guided bone regeneration procedure in the anterior maxilla with lateralization of the nasopalatine nerve and vessel bundle. Neurosensory function following the augmentative procedures and implant placement, assessed using a standardized questionnaire and clinical examination, was the primary outcome variable. This retrospective study included patients with a bone defect in the anterior maxilla in need of horizontal and/or vertical ridge augmentation prior to dental implant placement. The surgical sites were allowed to heal for at least 6 months before placement of dental implants. All patients received fixed implant-supported restorations and entered a tightly scheduled maintenance program. In addition to the maintenance program, patients were recalled for a clinical examination and to fill out a questionnaire assessing any changes in the neurosensory function of the nasopalatine nerve after at least 6 months in function. Twenty patients were included in the study from February 2001 to December 2010. They received a total of 51 implants after augmentation of the alveolar crest and lateralization of the nasopalatine nerve. The follow-up examination for the questionnaire and neurosensory assessment took place after a mean period of 4.18 years in function. None of the patients examined reported pain, diminished or altered sensation, or a "foreign body" feeling in the area of surgery. Overall, 6 of 20 patients (30%) showed palatal sensibility alterations of the soft tissues in the region of the maxillary canines and incisors, corresponding to a risk of neurosensory change of 0.45 mucosal tooth regions per patient after ridge augmentation with lateralization of the nasopalatine nerve.
Regeneration of bone defects in the anterior maxilla by horizontal and/or vertical ridge augmentation with lateralization of the nasopalatine nerve prior to dental implant placement is a predictable surgical technique. Even when clinically measurable impairments of neurosensory function were present, patients did not report them or were not bothered by them.