853 results for automated correlation optimized warping
Abstract:
Hyperechoic lesions are an infrequent finding at breast ultrasonography, and most of the time they are associated with benign pathologies that do not require further evaluation. However, some neoplasms, such as invasive breast carcinomas and metastases, may present with hyperechogenicity. Thus, knowledge of the differential diagnoses and identification of signs of lesion aggressiveness are of great relevance to avoid unnecessary procedures or underdiagnosis, and to support the correct clinical/surgical approach. On the basis of such concepts, the present essay describes and illustrates the main features of hyperechoic lesions at breast ultrasonography in different cases, with anatomopathological correlation.
Abstract:
Background: The DNA repair protein O6-methylguanine-DNA methyltransferase (MGMT) confers resistance to alkylating agents. Several methods have been applied to its analysis, with methylation-specific polymerase chain reaction (MSP) being the most commonly used for promoter methylation study, while immunohistochemistry (IHC) has become the most frequently used for the detection of MGMT protein expression. Agreement on the best and most reliable technique for evaluating MGMT status remains unsettled. The aim of this study was to perform a systematic review and meta-analysis of the correlation between IHC and MSP. Methods: A computer-aided search of MEDLINE (1950-October 2009), EBSCO (1966-October 2009) and EMBASE (1974-October 2009) was performed for relevant publications. Studies meeting the inclusion criteria were those comparing MGMT protein expression by IHC with MGMT promoter methylation by MSP in the same cohort of patients. Methodological quality was assessed using the QUADAS and STARD instruments. Previously published guidelines were followed for the meta-analysis. Results: Of 254 studies identified as eligible for full-text review, 52 (20.5%) met the inclusion criteria. The review showed that results of MGMT protein expression by IHC are not in close agreement with those obtained with MSP. Moreover, beyond the cut-off value, tumour type (primary brain tumour vs. others) was an independent covariate of accuracy estimates in the meta-regression analysis. Conclusions: Protein expression assessed by IHC alone fails to reflect the promoter methylation status of MGMT. Thus, in attempts at clinical diagnosis the two methods seem to select different groups of patients and should not be used interchangeably.
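Agreement between two binary classifications like these (IHC-negative vs. MSP-methylated) is conventionally quantified with Cohen's kappa, which corrects raw agreement for chance. A minimal sketch with hypothetical counts (not taken from the study):

```python
# Cohen's kappa for agreement between two binary methods on the same cohort.
# The 2x2 counts below are hypothetical, for illustration only.

def cohens_kappa(a, b, c, d):
    """2x2 table: a = both positive, b = method1+/method2-,
    c = method1-/method2+, d = both negative."""
    n = a + b + c + d
    p_observed = (a + d) / n
    # Expected agreement by chance, from the marginal frequencies
    p_pos = ((a + b) / n) * ((a + c) / n)
    p_neg = ((c + d) / n) * ((b + d) / n)
    p_expected = p_pos + p_neg
    return (p_observed - p_expected) / (1 - p_expected)

kappa = cohens_kappa(30, 20, 25, 25)  # hypothetical counts -> kappa = 0.10
```

A kappa near zero, as in this invented table, is the kind of result that underlies a conclusion that two methods "should not be used interchangeably".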
Abstract:
Abstract Objective: To compare the diagnostic performance of the three-dimensional turbo spin-echo (3D TSE) magnetic resonance imaging (MRI) technique with the performance of the standard two-dimensional turbo spin-echo (2D TSE) protocol at 1.5 T, in the detection of meniscal and ligament tears. Materials and Methods: Thirty-eight patients were imaged twice, first with a standard multiplanar 2D TSE MR technique, and then with a 3D TSE technique, both in the same 1.5 T MRI scanner. The patients underwent knee arthroscopy within the first three days after the MRI. Using arthroscopy as the reference standard, we determined the diagnostic performance and agreement. Results: For detecting anterior cruciate ligament tears, the 3D TSE and routine 2D TSE techniques showed similar values for sensitivity (93% and 93%, respectively) and specificity (80% and 85%, respectively). For detecting medial meniscal tears, the two techniques also had similar sensitivity (85% and 83%, respectively) and specificity (68% and 71%, respectively). In addition, for detecting lateral meniscal tears, the two techniques had similar sensitivity (58% and 54%, respectively) and specificity (82% and 92%, respectively). There was a substantial to almost perfect intraobserver and interobserver agreement when comparing the readings for both techniques. Conclusion: The 3D TSE technique has a diagnostic performance similar to that of the routine 2D TSE protocol for detecting meniscal and anterior cruciate ligament tears at 1.5 T, with the advantage of faster acquisition.
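The sensitivity and specificity figures above follow directly from the 2x2 table of MRI readings against the arthroscopic reference standard. A minimal sketch of that arithmetic, with hypothetical counts (the study's own tables are not reproduced here):

```python
# Diagnostic performance against a reference standard (here, arthroscopy).
# Counts are hypothetical, for illustration only.

def sensitivity_specificity(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)  # proportion of true tears detected
    specificity = tn / (tn + fp)  # proportion of intact structures correctly cleared
    return sensitivity, specificity

# e.g. 14 true positives, 3 false positives, 1 false negative, 20 true negatives
sens, spec = sensitivity_specificity(14, 3, 1, 20)
```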
Abstract:
Living bacteria or yeast cells are frequently used as bioreporters for the detection of specific chemical analytes or conditions of sample toxicity. In particular, bacteria or yeast equipped with synthetic gene circuitry that allows the production of a reliable non-cognate signal (e.g., fluorescent protein or bioluminescence) in response to a defined target make robust and flexible analytical platforms. We report here how bacterial cells expressing a fluorescence reporter ("bactosensors"), which are mostly used for batch sample analysis, can be deployed for automated semi-continuous target analysis in a single concise biochip. Escherichia coli-based bactosensor cells were continuously grown in a 13 or 50 nanoliter-volume reactor on a two-layered polydimethylsiloxane-on-glass microfluidic chip. Physiologically active cells were directed from the nl-reactor to a dedicated sample exposure area, where they were concentrated and reacted within 40 minutes to the target chemical by localized emission of the fluorescent reporter signal. We demonstrate the functioning of the bactosensor chip by the automated detection of 50 μg arsenite-As l(-1) in water on consecutive days and after one week of constant operation. The best induction of the bactosensors, 6-9-fold at 50 μg l(-1), was found at an apparent dilution rate of 0.12 h(-1) in the 50 nl microreactor. The bactosensor-chip principle could be widely applicable for constructing automated monitoring devices for a variety of targets in different environments.
Abstract:
Recent standardization efforts in e-learning technology have resulted in a number of specifications; however, the automation process that is considered essential in a learning management system (LMS) remains less explored. As learning technology becomes more widespread and more heterogeneous, there is a growing need to specify processes that cross the boundaries of a single LMS or learning resource repository. This article proposes a specification oriented toward automation that takes on board the heterogeneity of systems and formats and provides a language for specifying complex and generic interactions. With this goal in mind, a technique based on three steps is suggested: semantic conformance profiles, a business process management (BPM) diagram, and its translation into the business process execution language (BPEL) seem suitable for achieving it.
Abstract:
Automation or semi-automation of learning scenario specifications is one of the least explored subjects in the e-learning research area. There is a need for a catalogue of learning scenarios and a technique to facilitate automated retrieval of stored specifications. This requires constructing an ontology with this goal, as justified in this paper. This ontology must mainly support a specification technique for learning scenarios. It should also be useful in the creation and validation of new scenarios, as well as in the personalization or monitoring of learning scenarios. Thus, after justifying the need for this ontology, a first approach to a possible knowledge domain is presented. An example of a concrete learning scenario illustrates some relevant concepts supported by this ontology, defining the scenario in such a way that it could be easy to automate.
Abstract:
Epsilon toxin (Etx) from Clostridium perfringens is a pore-forming protein with a lethal effect on livestock, producing severe enterotoxemia characterized by general edema and neurological alterations. Site-specific mutations of the toxin are valuable tools to study the cellular and molecular mechanism of the toxin activity. In particular, mutants with paired cysteine substitutions that affect the membrane insertion domain behaved as dominant-negative inhibitors of toxin activity in MDCK cells. We produced similar mutants, together with a well-known non-toxic mutant (Etx-H106P), as green fluorescent protein (GFP) fusion proteins to perform in vivo studies in an acutely intoxicated mouse model. The mutant GFP-Etx-I51C/A114C had a lethal effect with generalized edema, and accumulated in the brain parenchyma due to its ability to cross the blood-brain barrier (BBB). In the renal system, this mutant had a cytotoxic effect on distal tubule epithelial cells. The other mutants studied (GFP-Etx-V56C/F118C and GFP-Etx-H106P) did not have a lethal effect or cross the BBB, and failed to induce a cytotoxic effect on renal epithelial cells. These data suggest a direct correlation between the lethal effect of the toxin, its cytotoxic effect on kidney distal tubule cells, and its ability to cross the BBB.
Abstract:
This paper analyses the impact of using different correlation assumptions between lines of business when estimating the risk-based capital reserve, the Solvency Capital Requirement (SCR), under Solvency II regulations. A case study is presented and the SCR is calculated according to the Standard Model approach. Alternatively, the requirement is then calculated using an Internal Model based on a Monte Carlo simulation of the net underwriting result at a one-year horizon, with copulas being used to model the dependence between lines of business. To address the impact of these model assumptions on the SCR we conduct a sensitivity analysis. We examine changes in the correlation matrix between lines of business and address the choice of copulas. Drawing on aggregate historical data from the Spanish non-life insurance market between 2000 and 2009, we conclude that modifications of the correlation and dependence assumptions have a significant impact on SCR estimation.
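The Internal Model idea described here can be sketched in a few lines: simulate correlated one-year underwriting losses across lines of business and read the capital requirement off the 99.5% quantile of the aggregate loss. The sketch below uses a Gaussian copula with normal marginals (which reduces to a multivariate normal, keeping it NumPy-only); all means, volatilities and the correlation value are hypothetical, not the paper's calibration:

```python
import numpy as np

# Monte Carlo SCR sketch: two lines of business, Gaussian dependence.
# All parameters are hypothetical, for illustration only.
rng = np.random.default_rng(0)
n_sims = 200_000
mu = np.array([10.0, 5.0])        # expected loss per line (arbitrary units)
sigma = np.array([4.0, 2.0])      # loss volatility per line
corr = np.array([[1.0, 0.5],
                 [0.5, 1.0]])     # assumed correlation between lines

cov = np.outer(sigma, sigma) * corr
losses = rng.multivariate_normal(mu, cov, size=n_sims)  # shape (n_sims, 2)
aggregate = losses.sum(axis=1)

var_995 = np.quantile(aggregate, 0.995)   # 99.5% Value-at-Risk, per Solvency II
scr = var_995 - aggregate.mean()          # capital above the expected loss
```

A sensitivity analysis of the kind the paper performs amounts to re-running this with different `corr` entries (or replacing the Gaussian dependence with, e.g., a t or Clayton copula) and comparing the resulting `scr` values.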
Abstract:
The spectrophotometric determination of Cd(II) using a flow injection system provided with a solid-phase reactor for cadmium preconcentration and on-line reagent preparation is described. It is based on the formation of a dithizone-Cd complex in basic medium. The calibration curve is linear between 6 and 300 µg L-1 Cd(II), with a detection limit of 5.4 µg L-1, an RSD of 3.7% (10 replicates in duplicate) and a sampling frequency of 11.4 h-1. The proposed method was satisfactorily applied to the determination of Cd(II) in surface, well and drinking waters.
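Figures of merit like these come from a linear calibration fit, with the detection limit conventionally taken as 3 times the blank standard deviation divided by the slope. A minimal sketch of that arithmetic; the calibration points and blank noise below are invented for illustration, not taken from the paper:

```python
# Linear calibration and detection limit (3 * s_blank / slope).
# All data are hypothetical, for illustration only.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

conc = [6, 50, 100, 200, 300]                     # µg L-1 Cd(II), hypothetical
signal = [0.012, 0.100, 0.200, 0.400, 0.600]      # absorbance, hypothetical
slope, intercept = fit_line(conc, signal)

s_blank = 0.0036                                  # hypothetical blank std. dev.
lod = 3 * s_blank / slope                         # detection limit, µg L-1
```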
Abstract:
Phenolic contents of extracts of Syzygium cumini leaves, collected monthly over a one-year period, were quantitatively determined by the modified Folin-Ciocalteau method. Extracts and tannin-free fractions were assayed for their potential to inhibit mouse paw edema induced by C48/80. HPLC showed high molecular weight phenolic species and flavonoids in the active extracts and fractions. The highest total phenolic content corresponded to the most potent degree of inhibition, and the flavonoids are thought to be the main species responsible for the activity, given that the flavonoid-enriched ethyl acetate fraction maintained its effect down to a dose of 0.01 mg/kg in a dose-response manner.
Abstract:
Evaluation of the pollution by the herbicides alachlor, propanil and atrazine in water samples from four rivers in the cities of Turvo and Meleiro, in the south of Santa Catarina State, was carried out using the SPME-GC-ECD method. The proposed method was optimized and validated. The correlation coefficients were higher than 0.997, and the linear ranges of the analytical curves were 0.1-4, 0.1-2.5 and 0.1-5 µg L-1 for atrazine, alachlor and propanil, respectively. The herbicides were quantified by GC-ECD and identified by GC-MS. The selected rivers presented contamination by at least one of the studied herbicides.
Abstract:
A liquid chromatography-tandem mass spectrometry method with atmospheric pressure chemical ionization (LC-APCI/MS/MS), coupled to on-line solid-phase extraction, was validated for the determination of etoricoxib in human plasma using antipyrine as internal standard. The method was performed on a Luna C18 column, and the mobile phase consisted of acetonitrile:water (95:5, v/v)/ammonium acetate (pH 4.0; 10 mM), run at a flow rate of 0.6 mL/min. The method was linear in the range of 1-5000 ng/mL (r² > 0.99). The lower limit of quantitation was 1 ng/mL. Recoveries ranged from 93.72% to 96.18%. Moreover, method validation demonstrated acceptable results in the precision, accuracy and stability studies.
Abstract:
The productivity, quality and cost efficiency of welding work are critical for the metal industry today. Welding processes must become more effective, which can be achieved by mechanization and automation. Such systems are always expensive and must pay back the investment, so it is important to optimize the required intelligence, and thereby the required automation level, so that a company obtains the best profit. This intelligence and automation level was previously classified in several different ways that are not useful for optimizing the automation or mechanization of welding. In this study the intelligence of a welding system is defined in a new way, enabling the welding system to produce a weld that is good enough. A new approach is developed to classify and select the internal intelligence level a welding system needs to produce the weld efficiently. This classification accounts for the possible need for human work and its effect on the weld and its quality, without excluding any welding processes or methods. A completely new method is also developed to calculate the optimal required intelligence level in welding, targeting the best possible productivity and quality while remaining an economically optimized solution for several different cases. This optimization method is based on product type, economic productivity, product batch size, quality and criteria of usage. Intelligence classification and optimization have not previously been based on the manufactured product; it is now possible to find the type of welding system best suited to welding different types of products. This calculation process is a universal way to optimize the required automation or mechanization level when improving the productivity of welding. This study helps industry improve the productivity, quality and cost efficiency of welding workshops.