23 results for Design methods

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 70.00%

Abstract:

This paper presents a method for designing concrete membrane elements with an orthogonal reinforcement mesh subjected to compressive stress. Design methods generally define how to quantify the reinforcement needed to resist the tensile stresses and verify whether the compression in the concrete is within the strength limit. When the compression in the membrane is excessive, reinforcement subject to compression may be used; however, there is little information in the literature on how to design reinforcement for these cases. This paper therefore presents a procedure based on Baumann's [1] criteria. The strength limits used herein are those recommended by CEB [3]; however, a model is proposed in which this limit varies with the tensile strain occurring perpendicular to the compression. This resistance model is based on concepts proposed by Vecchio and Collins [2].
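For orientation, the equilibrium relations behind this family of membrane design methods can be sketched in a few lines. This is a minimal illustration only: the 45° concrete strut angle and the tension-positive sign convention are simplifying assumptions, not the paper's actual procedure (which, among other things, varies the concrete strength limit with the transverse tensile strain):

```python
def membrane_reinforcement(nx, ny, nxy):
    """Required reinforcement forces per unit width for an orthogonal mesh
    under membrane forces (nx, ny, nxy), assuming a 45-degree concrete
    strut (the simplest Baumann-type equilibrium; tension positive)."""
    nsx = nx + abs(nxy)    # x-direction reinforcement demand
    nsy = ny + abs(nxy)    # y-direction reinforcement demand
    nc = -2.0 * abs(nxy)   # concrete strut force (compression negative)
    return nsx, nsy, nc
```

If either demand comes out negative, that direction needs no tension reinforcement and the strut angle is normally re-derived; the paper's procedure covers the compressed cases this sketch omits.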

Relevance: 60.00%

Abstract:

This research deals with the behaviour of grouted dowels used in beam-to-column connections in precast concrete structures. The research focuses primarily on the theoretical and experimental analysis of the resistance mechanism of the dowels. The experimental programme included 15 models for analysing the following variations in dowel parameters: a) dowel diameters of 16, 20 and 25 mm; b) dowel inclinations of 0 degrees (i.e. perpendicular to the interface), 45 degrees and 60 degrees; c) compressive strength classes C35 and C50 for the concrete adjacent to the dowels; and d) the absence or presence of compressive loads normal to the interface. The experimental results indicate that the ultimate capacity and shear stiffness of the inclined dowels are significantly higher than those of the perpendicular dowels. Based on these results, an analytical model is proposed that considers the influence of the studied parameters on the capacity of the dowel.

Relevance: 60.00%

Abstract:

Aim. The aim of this study was to evaluate the internal reliability and validity of the Brazilian Portuguese version of the Duke Anticoagulation Satisfaction Scale (DASS) among cardiovascular patients. Background. Oral anticoagulation is widely used to prevent and treat thromboembolic events in several conditions, especially cardiovascular diseases; however, this therapy can induce dissatisfaction and reduce quality of life. Design. Methodological and cross-sectional research design. Methods. The cultural adaptation of the DASS included translation and back-translation, discussions with healthcare professionals and patients to ensure conceptual equivalence, semantic evaluation, and instrument pretest. The Brazilian Portuguese version of the DASS was tested among subjects followed in a university hospital anticoagulation outpatient clinic. The psychometric properties were assessed by construct validity (convergent, known groups and dimensionality) and internal consistency/reliability (Cronbach's alpha). Results. A total of 180 subjects under oral anticoagulation formed the baseline validation population. Correlations between the DASS total score and SF-36 domains were moderate for General health (r = -0.47, p < 0.01), Vitality (r = -0.44, p < 0.01) and Mental health (r = -0.42, p < 0.01) (convergent validity). Age and length of oral anticoagulation therapy (in years) were weakly correlated with the total DASS score and most of the subscales, except Limitation (r = -0.375, p < 0.01) (known groups). Cronbach's alpha was 0.79 for the total scale and ranged from 0.76 (hassles and burdens) to 0.46 (psychological impact) among the domains, confirming internal consistency reliability. Conclusions. The Brazilian Portuguese version of the DASS has shown levels of reliability and validity comparable with the original English version. Relevance to clinical practice. Healthcare practitioners and researchers need internationally validated measurement tools to compare outcomes of interventions in clinical management and research in oral anticoagulation therapy.
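The internal-consistency statistic reported above is simple to reproduce. A minimal sketch of Cronbach's alpha (the item matrix below is illustrative toy data, not DASS responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects x n_items) matrix of item scores:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1.0 - item_vars / total_var)
```

Perfectly redundant items yield alpha = 1; weakly related items drive it toward (or below) zero, which is why per-domain values such as 0.46 flag the weaker subscales.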

Relevance: 40.00%

Abstract:

Background: Although the release of cardiac biomarkers after percutaneous (PCI) or surgical (CABG) revascularization is common, its prognostic significance is not known. Questions remain about the mechanisms and the degree of correlation between biomarker release, the volume of myocardial tissue loss, and the long-term significance. Delayed-enhancement cardiac magnetic resonance (CMR) consistently quantifies areas of irreversible myocardial injury. To investigate the quantitative relationship between irreversible injury and cardiac biomarkers, we will evaluate the extent of irreversible injury in patients undergoing PCI and CABG and relate it to postprocedural changes in cardiac biomarkers and long-term prognosis. Methods/Design: The study will include 150 patients with multivessel coronary artery disease (CAD) with left ventricular ejection fraction (LVEF) and a formal indication for CABG; 50 patients will undergo CABG with cardiopulmonary bypass (CPB); 50 patients with the same arterial and ventricular condition indicated for myocardial revascularization will undergo CABG without CPB; and another 50 patients with CAD and preserved ventricular function will undergo PCI using stents. All patients will undergo CMR before and after surgery or PCI. We will also evaluate the release of cardiac markers of necrosis immediately before and after each procedure. The primary outcome is overall death at 5-year follow-up. Secondary outcomes are levels of the CK-MB isoenzyme and troponin I in association with the presence of myocardial fibrosis and left ventricular systolic dysfunction assessed by CMR. Discussion: The MASS-V Trial aims to establish reliable values for enzyme markers of myocardial necrosis in the absence of manifest myocardial infarction after mechanical interventions. The establishment of these indices has diagnostic and prognostic value and may therefore call for distinct therapeutic measures. In daily practice, inappropriate use of these necrosis markers has led to misdiagnosis and, therefore, wrong treatment. A more sensitive tool such as CMR provides unprecedented diagnostic accuracy for myocardial damage when correlated with necrosis enzyme markers. We aim to correlate laboratory data with imaging, thereby establishing more refined data on the presence or absence of irreversible myocardial injury after the procedure, whether percutaneous or surgical, with or without the use of cardiopulmonary bypass.

Relevance: 30.00%

Abstract:

Major depressive disorder (MDD) trials - investigating either non-pharmacological or pharmacological interventions - have shown mixed results. Many reasons explain this heterogeneity, but one that stands out is trial design, owing to specific challenges in the field. We therefore aimed to review the methodology of non-invasive brain stimulation (NIBS) trials and provide a framework for improving clinical trial design. We performed a systematic review of randomized, controlled MDD trials whose intervention was repetitive transcranial magnetic stimulation (rTMS) or transcranial direct current stimulation (tDCS) in MEDLINE and other databases from April 2002 to April 2008. We created an unstructured checklist based on the CONSORT guidelines to extract items such as power analysis, sham method, blinding assessment, allocation concealment, operational criteria used for MDD, definition of refractory depression, and primary study hypotheses. Thirty-one studies were included. We found that the main methodological issues can be divided into three groups: (1) issues related to phase II/small trials, (2) issues related to MDD trials, and (3) issues specific to NIBS studies. Taken together, they can threaten study validity and lead to inconclusive results. Feasible solutions include: estimating the sample size a priori; measuring the degree of refractoriness of the subjects; specifying the primary hypothesis and statistical tests; controlling predictor variables through stratified randomization or strict eligibility criteria; adjusting the study design to the target population; using adaptive designs; and exploring NIBS efficacy with biological markers. In conclusion, our study summarizes the main methodological issues of NIBS trials and proposes a number of alternatives to manage them.
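Of the solutions listed, the a priori sample-size estimate is the most mechanical to illustrate. A normal-approximation sketch for a two-arm comparison of means (the effect size, alpha, and power below are conventional illustrative defaults, not values from the review):

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """A priori sample size per arm for a two-sample comparison of means,
    n = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2 (normal approximation;
    an exact t-based calculation adds a few subjects per arm)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided type-I error
    z_beta = NormalDist().inv_cdf(power)           # corresponds to 1 - beta
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)
```

The point the review makes is that small NIBS trials often skip this step entirely, leaving them underpowered for the medium effect sizes plausible in MDD.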

Relevance: 30.00%

Abstract:

The design of a network is a solution to several engineering and science problems. Several network design problems are known to be NP-hard, and population-based metaheuristics such as evolutionary algorithms (EAs) have been widely investigated for them. Such optimization methods simultaneously generate a large number of potential solutions to explore the search space in breadth and, consequently, to avoid local optima. Obtaining a potential solution usually involves the construction and maintenance of several spanning trees or, more generally, spanning forests. To explore the search space efficiently, special data structures have been developed to provide operations that manipulate a set of spanning trees (a population). For a tree with n nodes, the most efficient data structures available in the literature require O(n) time to generate a new spanning tree that modifies an existing one and to store the new solution. We propose a new data structure, called the node-depth-degree representation (NDDR), and demonstrate that with this encoding, generating a new spanning forest requires O(√n) time on average. Experiments with an EA based on the NDDR applied to large-scale instances of the degree-constrained minimum spanning tree problem show that the implementation adds only small constants and lower-order terms to the theoretical bound.
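The abstract leaves the encoding itself to the paper, but the array-based idea underlying node-depth representations is easy to sketch. This is a simplified preorder (node, depth) encoding only; the actual NDDR additionally stores node degrees and uses further machinery to reach the O(√n) average-time bound:

```python
def preorder_node_depth(adj, root):
    """Encode a tree as a preorder list of (node, depth) pairs -- the
    array form that node-depth encodings build on."""
    out, stack = [], [(root, 0, None)]
    while stack:
        node, depth, parent = stack.pop()
        out.append((node, depth))
        for nb in reversed(adj[node]):   # reversed so children pop in order
            if nb != parent:
                stack.append((nb, depth + 1, node))
    return out

def subtree_slice(nd, i):
    """Nodes of the subtree rooted at position i: the contiguous run of
    entries after i whose depth exceeds nd[i][1]. This contiguity is what
    makes subtree transfers between trees cheap in such encodings."""
    d, j = nd[i][1], i + 1
    while j < len(nd) and nd[j][1] > d:
        j += 1
    return nd[i:j]
```

A mutation that moves a subtree then amounts to cutting one contiguous slice and splicing it, with depths adjusted, under a new parent.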

Relevance: 30.00%

Abstract:

Although low- and middle-income countries still bear the burden of major infectious diseases, chronic noncommunicable diseases are becoming increasingly common due to rapid demographic, epidemiologic, and nutritional transitions. However, information is generally scant in these countries regarding chronic disease incidence, social determinants, and risk factors. The Brazilian Longitudinal Study of Adult Health (ELSA-Brasil) aims to contribute relevant information with respect to the development and progression of clinical and subclinical chronic diseases, particularly cardiovascular diseases and diabetes. In this report, the authors delineate the study's objectives, principal methodological features, and timeline. At baseline, ELSA-Brasil enrolled 15,105 civil servants from 5 universities and 1 research institute. The baseline examination (2008-2010) included detailed interviews, clinical and anthropometric examinations, an oral glucose tolerance test, overnight urine collection, a 12-lead resting electrocardiogram, measurement of carotid intima-media thickness, echocardiography, measurement of pulse wave velocity, hepatic ultrasonography, retinal fundus photography, and an analysis of heart rate variability. Long-term biologic sample storage will allow investigation of biomarkers that may predict cardiovascular diseases and diabetes. Annual telephone surveillance, initiated in 2009, will continue for the duration of the study. A follow-up examination is scheduled for 2012-2013.

Relevance: 30.00%

Abstract:

Combining data from multiple analytical platforms is essential for comprehensive study of the molecular phenotype (metabotype) of a given biological sample. The metabolite profiles generated are intrinsically dependent on the analytical platforms, each requiring optimization of instrumental parameters, separation conditions, and sample extraction to deliver maximal biological information. An in-depth evaluation of extraction protocols for characterizing the metabolome of the hepatobiliary fluke Fasciola hepatica, using ultra-performance liquid chromatography (UPLC) and capillary electrophoresis (CE) coupled with mass spectrometry (MS), is presented. The spectrometric methods were characterized by performance, and metrics of merit were established, including precision, mass accuracy, selectivity, sensitivity, and platform stability. Although a core group of molecules was common to all methods, each platform contributed a unique set, whereby 142 metabolites out of 14,724 features were identified. A mixture design revealed that a chloroform:methanol:water proportion of 15:59:26 was globally the best composition for metabolite extraction across the UPLC-MS and CE-MS platforms, accommodating different columns and ionization modes. Despite the general assumption that platform-adapted protocols are necessary for effective metabotype characterization, we show that an appropriately designed single extraction procedure is able to meet the requirements of all technologies. This may constitute a paradigm shift in developing efficient protocols for high-throughput metabolite profiling with more general analytical applicability.

Relevance: 30.00%

Abstract:

There are some variants of the widely used Fuzzy C-Means (FCM) algorithm that support clustering data distributed across different sites. Those methods have been studied under different names, like collaborative and parallel fuzzy clustering. In this study, we offer some augmentation of the two FCM-based clustering algorithms used to cluster distributed data by arriving at some constructive ways of determining essential parameters of the algorithms (including the number of clusters) and forming a set of systematically structured guidelines such as a selection of the specific algorithm depending on the nature of the data environment and the assumptions being made about the number of clusters. A thorough complexity analysis, including space, time, and communication aspects, is reported. A series of detailed numeric experiments is used to illustrate the main ideas discussed in the study.
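For reference, the single-site algorithm these variants build on is compact. A minimal Fuzzy C-Means sketch (the collaborative/parallel versions discussed in the study additionally exchange prototypes or partition information between sites, which this sketch does not attempt):

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, seed=0):
    """Minimal single-site Fuzzy C-Means: alternate prototype and
    membership updates that minimize sum_ik u_ik^m * ||x_i - v_k||^2."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # rows sum to 1
    for _ in range(iters):
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]  # cluster prototypes
        d = np.linalg.norm(X[:, None, :] - V[None], axis=2) + 1e-12
        U = 1.0 / d ** (2 / (m - 1))              # membership update
        U /= U.sum(axis=1, keepdims=True)
    return U, V
```

Choosing c (the number of clusters) and the fuzzifier m is exactly the kind of parameter-setting question the study turns into constructive guidelines.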

Relevance: 30.00%

Abstract:

This study evaluated color change, stability, and tooth sensitivity in patients submitted to different bleaching techniques. Material and methods: Forty-eight patients were divided into five groups. A half-mouth design was used to compare two in-office bleaching techniques (with and without light activation): G1: 35% hydrogen peroxide (HP) (Lase Peroxide - DMC Equipments, Sao Carlos, SP, Brazil) + hybrid light (HL) (LED/diode laser, Whitening Lase II, DMC Equipments, Sao Carlos, SP, Brazil); G2: 35% HP; G3: 38% HP (X-traBoost - Ultradent, South Jordan, UT, USA) + HL; G4: 38% HP; and G5: 15% carbamide peroxide (CP) (Opalescence PF - Ultradent, South Jordan, UT, USA). For G1 and G3, HP was applied to the enamel surface in 3 consecutive applications activated by HL; each application included 3 x 3' HL activations with a 1' interval between them. For G2 and G4, HP was applied 3 x 15' with 15' between intervals; and for G5, 15% CP was applied for 120'/10 days at home. A spectrophotometer was used to measure color change before treatment and after 24 h, 1 week, and 1, 6, 12, 18 and 24 months. A VAS questionnaire was used to evaluate tooth sensitivity before treatment, immediately following treatment, 24 h after, and finally 1 week after. Results: Statistical analysis did not reveal any significant differences in effectiveness between in-office bleaching with or without HL activation; nevertheless, the time required was shorter with HL. Statistical differences were observed between the results after 24 h, 1 week, and 1, 6, 12, 18 and 24 months (intergroup). Immediately after treatment, in-office bleaching increased tooth sensitivity. The groups activated with HL required less gel application time. Conclusion: All techniques and bleaching agents used were effective and demonstrated similar behavior.
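Spectrophotometric color change in bleaching studies is conventionally expressed as a CIELAB color difference between two measurements. A minimal sketch (the L*a*b* values below are illustrative, not study data; the abstract does not state which ΔE formula was used):

```python
import math

def delta_e_ab(lab1, lab2):
    """Classic CIELAB colour difference Delta E*ab between two
    (L*, a*, b*) measurements: the Euclidean distance in Lab space."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))
```

Comparing ΔE between baseline and each recall (24 h through 24 months) is what lets effectiveness and stability be tested on the same scale across groups.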

Relevance: 30.00%

Abstract:

Background: This paper addresses the prediction of the free energy of binding of a drug candidate to the enzyme InhA, associated with Mycobacterium tuberculosis. This problem arises in rational drug design, where interactions between drug candidates and target proteins are verified through molecular docking simulations. In this application, it is important not only to predict the free energy of binding correctly, but also to provide a comprehensible model that can be validated by a domain specialist. Decision-tree induction algorithms have been used successfully in drug-design-related applications, especially considering that decision trees are simple to understand, interpret, and validate. Several decision-tree induction algorithms are available for general use, but each has a bias that makes it more suitable for a particular data distribution. In this article, we propose and investigate the automatic design of decision-tree induction algorithms tailored to particular drug-enzyme binding data sets. We investigate the performance of our new method for evaluating binding conformations of different drug candidates to InhA, and we analyze our findings with respect to decision-tree accuracy, comprehensibility, and biological relevance. Results: The empirical analysis indicates that our method is capable of automatically generating decision-tree induction algorithms that significantly outperform the traditional C4.5 algorithm with respect to both accuracy and comprehensibility. In addition, we provide the biological interpretation of the rules generated by our approach, reinforcing the importance of comprehensible predictive models in this particular bioinformatics application. Conclusions: We conclude that automatically designing a decision-tree induction algorithm tailored to molecular docking data is a promising alternative for predicting the free energy of binding of a drug candidate to a flexible receptor.
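The building block that such automatic algorithm design recombines is the greedy split search at the heart of C4.5-style induction. A minimal information-gain sketch (toy one-feature data; this is the generic textbook component, not the authors' system):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def best_split(rows, labels):
    """Exhaustive search for the (feature, threshold) pair with maximal
    information gain -- the greedy step a tree inducer applies at each node."""
    base, best = entropy(labels), None
    for f in range(len(rows[0])):
        for t in {r[f] for r in rows}:
            left = [l for r, l in zip(rows, labels) if r[f] <= t]
            right = [l for r, l in zip(rows, labels) if r[f] > t]
            if not left or not right:
                continue
            gain = base - (len(left) * entropy(left)
                           + len(right) * entropy(right)) / len(labels)
            if best is None or gain > best[0]:
                best = (gain, f, t)
    return best
```

The hyper-heuristic approach in the paper varies components around this step (split criteria, stopping rules, pruning) and searches for the combination best suited to the docking data.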

Relevance: 30.00%

Abstract:

Objectives: To evaluate the effect of insertion torque on micromotion under a lateral force in three different implant designs. Material and methods: Thirty-six implants with identical thread design but different cutting-groove designs were divided into three groups: (1) non-fluted (no cutting groove, solid screw form); (2) fluted (90° cut at the apex, tap design); and (3) Blossom™ (patent pending; non-fluted with an engineered trimmed thread design). The implants were screwed into polyurethane foam blocks, and the insertion torque was recorded after each 90° turn by a digital torque gauge. Controlled lateral loads of 10 N, followed by increments of 5 N up to 100 N, were sequentially applied by a digital force gauge on a titanium abutment. Statistical comparison was performed with a two-way mixed-model ANOVA that evaluated implant design group, linear effects of turns and displacement loads, and their interaction. Results: While insertion torque increased as a function of the number of turns for each design, the slope and final values increased (P < 0.001) progressively from the Blossom™ to the fluted to the non-fluted design (M ± standard deviation [SD] = 64.1 ± 26.8, 139.4 ± 17.2, and 205.23 ± 24.3 Ncm, respectively). While a linear relationship between horizontal displacement and lateral force was observed for each design, the slope and maximal displacement increased (P < 0.001) progressively from the Blossom™ to the fluted to the non-fluted design (M ± SD = 530 ± 57.7, 585.9 ± 82.4, and 782.33 ± 269.4 µm, respectively). There were negligible to moderate levels of association between insertion torque and lateral displacement in the Blossom™, fluted, and non-fluted design groups, respectively. Conclusion: Insertion torque was reduced in implant macrodesigns that incorporated cutting edges, and lower insertion torque was generally associated with decreased micromovement. However, insertion torque and micromotion were unrelated within implant designs, particularly for the designs showing the least insertion torque.

Relevance: 30.00%

Abstract:

Background Current recommendations for antithrombotic therapy after drug-eluting stent (DES) implantation include prolonged dual antiplatelet therapy (DAPT) with aspirin and clopidogrel for ≥12 months. However, the impact of such a regimen for all patients receiving any DES system remains unclear based on the scientific evidence available to date. Several other shortcomings have also been identified with prolonged DAPT, including bleeding complications, compliance, and cost. The second-generation Endeavor zotarolimus-eluting stent (E-ZES) has demonstrated efficacy and safety despite short-duration DAPT (3 months) in the majority of studies. Still, the safety and clinical impact of short-term DAPT with the E-ZES in the real world is yet to be determined. Methods The OPTIMIZE trial is a large, prospective, multicenter, randomized (1:1) non-inferiority clinical evaluation of short-term (3 months) vs. long-term (12 months) DAPT in patients undergoing E-ZES implantation in daily clinical practice. Overall, 3,120 patients were enrolled at 33 clinical sites in Brazil. The primary composite endpoint is death (any cause), myocardial infarction, cerebral vascular accident, and major bleeding at 12-month clinical follow-up after the index procedure. Conclusions The OPTIMIZE clinical trial will determine the clinical implications of DAPT duration with the second-generation E-ZES in real-world patients undergoing percutaneous coronary intervention. (Am Heart J 2012;164:810-816.e3.)

Relevance: 30.00%

Abstract:

International Journal of Paediatric Dentistry 2012; 22: 459-466. Aim. This in vitro study aimed to test the performance of fluorescence-based methods in detecting occlusal caries lesions in primary molars compared with conventional methods. Design. Two examiners assessed 113 sites on 77 occlusal surfaces of primary molars using three fluorescence devices: DIAGNOdent (LF), DIAGNOdent pen (LFpen), and a fluorescence camera (VistaProof-FC). Visual inspection (ICDAS) and radiographic methods were also evaluated. One examiner repeated the evaluations after one month. As the reference standard, lesion depth was determined after sectioning and evaluation under a stereomicroscope. The area under the ROC curve (Az), sensitivity, specificity, and accuracy of the methods were calculated at the enamel (D1) and dentine (D3) caries thresholds. Intra- and interexaminer reproducibility were calculated using the intraclass correlation coefficient (ICC) and kappa statistics. Results. At D1, visual inspection presented higher sensitivities (0.97-0.99) but lower specificities (0.18-0.25). At D3, all methods demonstrated similar performance (Az values around 0.90). The visual and radiographic methods showed slightly higher specificity (values above 0.96) than the fluorescence-based ones (values around 0.88). In general, all methods presented high reproducibility (ICC above 0.79). Conclusions. Although fluorescence-based and conventional methods present similar performance in detecting occlusal caries lesions in primary teeth, visual inspection alone seems to be sufficient for use in clinical practice.
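The performance figures quoted (sensitivity, specificity, accuracy) follow directly from a 2x2 comparison of each method's calls against the reference standard. A minimal sketch with illustrative data (not the study's measurements):

```python
def diagnostic_metrics(truth, prediction):
    """Sensitivity, specificity, and accuracy from binary truth/prediction
    sequences (1 = lesion at the chosen threshold, 0 = sound)."""
    pairs = list(zip(truth, prediction))
    tp = sum(1 for t, p in pairs if t and p)          # lesions detected
    tn = sum(1 for t, p in pairs if not t and not p)  # sound, called sound
    fp = sum(1 for t, p in pairs if not t and p)      # false alarms
    fn = sum(1 for t, p in pairs if t and not p)      # missed lesions
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / len(pairs)
```

The D1 pattern reported above (high sensitivity, low specificity for visual inspection) corresponds to many true positives bought at the price of many false alarms on sound enamel.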

Relevance: 30.00%

Abstract:

Drug discovery has moved toward more rational strategies based on our increasing understanding of the fundamental principles of protein-ligand interactions. Structure-based (SBDD) and ligand-based (LBDD) drug design approaches bring together the most powerful concepts in modern chemistry and biology, linking medicinal chemistry with structural biology. The definition and assessment of both chemical and biological space have revitalized the importance of exploring the intrinsically complementary nature of experimental and computational methods in drug design. Major challenges in this field include the identification of promising hits and the development of high-quality leads for further development into clinical candidates. This becomes particularly important in the case of neglected tropical diseases (NTDs), which disproportionately affect poor people living in rural and remote regions worldwide, and for which an insufficient number of new chemical entities is being evaluated owing to the lack of innovation and R&D investment by the pharmaceutical industry. This perspective outlines the utility and applications of SBDD and LBDD approaches for the identification and design of new small-molecule agents for NTDs.