86 results for Metric Linear Combinations


Relevance:

20.00%

Publisher:

Abstract:

Background: This study of a chronic porcine postinfarction model examined whether linear epicardial cryoablation was capable of creating large, homogeneous lesions in regions of the myocardium including scarred ventricle. Endocardial and epicardial focal cryolesions were also compared to determine whether there were significant differences in lesion characteristics. Methods: Eighty focal endocardial and 28 focal epicardial cryoapplications were delivered to the ventricular myocardium of eight normal goats and four normal pigs, and 21 linear cryolesions were applied along the border of infarcted epicardial tissue in a chronic porcine infarct model (six pigs). Results: Focal endocardial cryolesions in normal animals measured 9.7 +/- 0.4 mm (length) by 7.3 +/- 1.4 mm (width) by 4.8 +/- 0.2 mm (depth), while epicardial lesions measured 10.2 +/- 1.4 mm (length) by 7.7 +/- 2 mm (width) by 4.6 +/- 0.9 mm (depth); P > 0.05. Linear epicardial cryolesions in the chronic porcine infarct model measured 36.5 +/- 7.8 mm (length) by 8.2 +/- 1.3 mm (width) by 6.0 +/- 1.2 mm (depth). The mean depth of linear cryolesions applied to the border of the infarct scar was 7 +/- 0.7 mm, as measured by magnetic resonance imaging. Conclusions: Cryoablation can create deep lesions when delivered to the ventricular epicardium. Endocardial and epicardial cryolesions created by a focal cryoablation catheter are similar in size and depth. The ability to rapidly create deep linear cryolesions may prove beneficial in substrate-based catheter ablation of ventricular arrhythmias.


The objective was to evaluate the influence of dental metallic artefacts on implant sites using multislice and cone-beam computed tomography techniques. Ten dried human mandibles were scanned twice by each technique, with and without dental metallic artefacts. Metallic restorations were placed at the top of the alveolar ridge adjacent to the mental foramen region for the second scanning. Linear measurements (thickness and height) for each cross-section were performed by a single examiner using computer software. All mandibles were analysed at both the right and the left mental foramen regions. For the multislice technique, the dental metallic artefacts produced an increase of 5% in bone thickness and a reduction of 6% in bone height; no significant differences (p > 0.05) were detected when comparing measurements performed with and without metallic artefacts. For the cone-beam technique, the dental metallic artefacts produced an increase of 6% in bone thickness and a reduction of 0.68% in bone height; again, no significant differences (p > 0.05) were observed. The presence of dental metallic artefacts did not alter the linear measurements obtained with either technique, although their presence made locating the alveolar bone crest more difficult.


Objective. The purpose of this research was to provide further evidence of the precision and accuracy of maxillofacial linear and angular measurements obtained from cone-beam computed tomography (CBCT) images. Study design. The study population consisted of 15 dry human skulls that were submitted to CBCT, and 3-dimensional (3D) images were generated. Linear and angular measurements were based on conventional craniometric anatomical landmarks and were identified in the 3D-CBCT images twice by each of 2 radiologists, independently. Subsequently, physical measurements were made by a third examiner using a digital caliper and a digital goniometer. Results. The results demonstrated no statistically significant difference in the inter- and intra-examiner analyses. Regarding accuracy, no statistically significant differences were found between the physical and the CBCT-based linear and angular measurements for either examiner (P = .968 and .915, P = .844 and .700, respectively). Conclusions. 3D-CBCT images can be used to obtain dimensionally accurate linear and angular measurements of bony maxillofacial structures and landmarks. (Oral Surg Oral Med Oral Pathol Oral Radiol Endod 2009; 108: 430-436)


The resin phase of dental composites is mainly composed of combinations of dimethacrylate comonomers, with the final polymeric network structure defined by monomer type/reactivity and degree of conversion. This fundamental study evaluates how increasing concentrations of the flexible triethylene glycol dimethacrylate (TEGDMA) influence void formation in bisphenol A diglycidyl dimethacrylate (BisGMA) copolymerizations, and correlates this aspect of network structure with reaction kinetic parameters and macroscopic volumetric shrinkage. Photopolymerization kinetics were followed in real time by near-infrared (NIR) spectroscopy, viscosity was assessed with a viscometer, volumetric shrinkage was followed with a linometer, free-volume formation was determined by positron annihilation lifetime spectroscopy (PALS), and the sol-gel composition was determined by extraction with dichloromethane followed by (1)H NMR analysis. Results show that, as expected, volumetric shrinkage increases with TEGDMA concentration and monomer conversion. Extraction/(1)H NMR studies show increasing participation of the more flexible TEGDMA towards the limiting stages of conversion/crosslinking development. As conversion progresses, whether through longer irradiation times or greater TEGDMA concentrations, the network becomes denser, as evidenced by the decrease in free volume and in weight loss after extraction. For the same composition (BisGMA/TEGDMA 60-40 mol%) light-cured for increasing periods of time (from 10 to 600 s), free volume decreased and volumetric shrinkage increased, in a linear relationship with conversion. However, the correlation between free volume and macroscopic volumetric shrinkage proved rather complex for variable compositions exposed for the same time (600 s). The addition of TEGDMA decreases free volume up to 40 mol% (due to increased conversion), but above that concentration, in spite of the increase in conversion/crosslinking, free-volume pore size increases owing to the high concentration of the more flexible monomer. In those cases, the increase in volumetric shrinkage was due to the higher functional group concentration, in spite of the greater free volume. Therefore, through the application of the PALS model, this study elucidates network formation in dimethacrylates commonly used in dental materials. (C) 2010 Elsevier Ltd. All rights reserved.


Objective. The objective of this study was to evaluate the antibacterial efficacy of irrigating solutions and their combinations against Enterococcus faecalis. Study design. One hundred ten single-rooted human teeth were inoculated with E. faecalis and incubated for 21 days. Teeth were divided according to the irrigant: Group I (GI), 2.5% sodium hypochlorite solution (NaOCl); GII, 2.5% NaOCl + 10% citric acid; GIII, 2.5% NaOCl + apple cider vinegar; GIV, apple cider vinegar; GV, 2% chlorhexidine solution; GVI, 1% peracetic acid; GVII, saline solution. Microbiological samples were taken immediately after root canal preparation and again 7 days later. Data were submitted to ANOVA (5%). Results. All solutions promoted a reduction of E. faecalis after instrumentation, but bacterial counts were higher in the final sample. GI, GV, and GVI had lower bacterial counts than the other groups. Conclusions. The irrigating solutions show antibacterial activity but do not eradicate E. faecalis in the root canal system. (Oral Surg Oral Med Oral Pathol Oral Radiol Endod 2011; 112:396-400)


Purpose: To compare visual inspection (VI), radiographic examination (RX), and the laser fluorescence device DIAGNOdent (L), as well as their combinations, in vitro regarding treatment decisions for occlusal surfaces. Methods: 72 extracted human permanent teeth (molars and premolars) were used. Treatment decisions were recorded by three calibrated examiners, and the options available were fissure sealant and conservative restoration. For validation of the treatment decisions, the teeth were sectioned and examined under a stereomicroscope. Thereafter, the dental slices were scanned and the images were edited to facilitate classification of existing carious lesions. Intra- and inter-examiner reproducibility for the determination of treatment plans was calculated using Cohen's kappa test (95% CI). Sensitivity, specificity, positive and negative predictive values, and the area under the ROC curve were also calculated. Results: VI and L provided, on average, the greatest intra- and inter-examiner reproducibility, respectively. Although combining diagnostic methods may decrease both intra- and inter-examiner reproducibility, the combination of VI, L, and RX resulted in the greatest sensitivity, being statistically superior to RX and L alone. There was more inter-examiner agreement for the option of restorative treatment, while the use of sealants as a treatment option yielded the lowest values. Negative predictive values were numerically inferior to positive predictive values, indicating that the examiners preferred to leave a carious tooth unrestored rather than to operate on an intact tooth. The combination of the three methods showed the best results in determining treatment plans for occlusal surfaces when compared with the other types of examination. On the other hand, radiographic examination and laser fluorescence were less efficient when used alone.
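The agreement indices reported above (sensitivity, specificity, positive and negative predictive values) follow directly from a 2x2 table of treatment decisions versus the gold standard. A minimal sketch, using hypothetical counts rather than the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Agreement indices for a dichotomous treatment decision
    (restore / do not restore) against a gold standard."""
    return {
        "sensitivity": tp / (tp + fn),   # carious teeth correctly flagged
        "specificity": tn / (tn + fp),   # sound teeth correctly spared
        "ppv": tp / (tp + fp),           # flagged teeth that are truly carious
        "npv": tn / (tn + fn),           # spared teeth that are truly sound
    }

# Hypothetical counts, for illustration only (not the study's data).
m = diagnostic_metrics(tp=30, fp=5, fn=10, tn=27)
```

With these counts the NPV (27/37) comes out lower than the PPV (30/35), the same ordering of predictive values the study reports for its examiners.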


Electromagnetic induction (EMI) results obtained with EM38 equipment in the vertical magnetic dipole (VMD) configuration are presented. Performance in locating metallic pipes and electrical cables is compared as a function of instrumental drift correction by linear and quadratic adjustment under controlled conditions. The metallic pipes and electrical cables are buried at the IAG/USP shallow geophysical test site in São Paulo City, Brazil. Results show that the apparent electrical conductivity and magnetic susceptibility data were affected by ambient temperature variation. To obtain better contrast between the background and the metallic targets, it was necessary to correct the drift. This correction was accomplished by fitting linear and quadratic relations between conductivity/susceptibility and temperature, allowing a comparative study. The correction of temperature drift using the quadratic relation was effective: all metallic targets were located, and the response of deeper targets was also improved. (C) 2010 Elsevier B.V. All rights reserved.
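The drift correction described above amounts to a polynomial regression of the readings against temperature, with the fitted trend subtracted out. The sketch below illustrates the idea on synthetic data; the noise levels, the buried-target shape, and the re-centring step are illustrative assumptions, not the paper's processing chain:

```python
import numpy as np

def correct_drift(values, temperature, degree=2):
    """Remove instrumental drift by fitting the readings as a polynomial
    function of temperature (degree=1 linear, degree=2 quadratic) and
    subtracting the fitted trend."""
    coeffs = np.polyfit(temperature, values, degree)
    trend = np.polyval(coeffs, temperature)
    # Re-centre on the mean so the absolute level is preserved.
    return values - trend + values.mean()

# Synthetic example: flat background + quadratic temperature drift + one target.
rng = np.random.default_rng(0)
temp = np.linspace(20.0, 35.0, 200)
drift = 0.05 * (temp - 27.0) ** 2                              # temperature drift
anomaly = np.where(np.abs(np.arange(200) - 100) < 5, 3.0, 0.0) # buried target
measured = 10.0 + drift + anomaly + rng.normal(0.0, 0.05, 200)

corrected = correct_drift(measured, temp, degree=2)
```

After correction the background flattens out while the target anomaly keeps standing well above it, which is the improved background/target contrast the abstract describes.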


Non-linear methods for estimating variability in time series are currently in widespread use. Among such methods are approximate entropy (ApEn) and sample entropy (SampEn). The applicability of ApEn and SampEn in analyzing data is evident, and their use is increasing. However, consistency is a point of concern in these tools; i.e., the classification of the temporal organization of a data set might indicate a relatively less ordered series in relation to another when the opposite is true. As highlighted by their proponents themselves, ApEn and SampEn might present incorrect results due to this lack of consistency. In this study, we present a method that gains consistency by applying ApEn repeatedly over a wide range of combinations of window lengths and matching error tolerances. The tool is called volumetric approximate entropy, vApEn. We analyze nine artificially generated prototypical time series with different degrees of temporal order (combinations of sine waves, logistic maps with different control parameter values, and random noise). While ApEn/SampEn clearly fail to consistently identify the temporal order of the sequences, vApEn does so correctly. To validate the tool we performed shuffled and surrogate data analyses. Statistical analysis confirmed the consistency of the method. (C) 2008 Elsevier Ltd. All rights reserved.
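A minimal sketch of the idea: standard ApEn evaluated over a grid of window lengths m and tolerances r, then aggregated. Summing across the grid is our assumption here; the paper's exact aggregation over the (m, r) volume may differ.

```python
import numpy as np

def apen(x, m, r):
    """Approximate entropy (Pincus): phi(m) - phi(m + 1), with
    self-matches included in the match counts."""
    x = np.asarray(x, dtype=float)
    N = len(x)

    def phi(m):
        # All overlapping windows of length m, compared with the
        # Chebyshev (maximum-coordinate) distance.
        w = np.array([x[i:i + m] for i in range(N - m + 1)])
        d = np.max(np.abs(w[:, None, :] - w[None, :, :]), axis=2)
        return np.mean(np.log(np.mean(d <= r, axis=1)))

    return phi(m) - phi(m + 1)

def vapen(x, m_values, r_values):
    """Volumetric ApEn: accumulate ApEn over a grid of window lengths and
    tolerances instead of committing to a single (m, r) choice."""
    return sum(apen(x, m, r) for m in m_values for r in r_values)

rng = np.random.default_rng(1)
sine = np.sin(np.linspace(0, 8 * np.pi, 300))    # ordered series
noise = rng.standard_normal(300)                 # disordered series
grid_r = (0.1, 0.2, 0.3)                         # fractions of each series' SD
v_sine = vapen(sine, (2, 3), [f * np.std(sine) for f in grid_r])
v_noise = vapen(noise, (2, 3), [f * np.std(noise) for f in grid_r])
```

Across the whole grid the disordered series accumulates a larger entropy volume than the sine wave, which is the consistent ordering that a single (m, r) choice cannot always guarantee.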


São Paulo is the most developed state in Brazil and contains few fragments of native ecosystems, generally surrounded by intensively farmed land. Despite this, some areas still shelter large native animals. We aimed at understanding how medium and large carnivores use a mosaic landscape of forest/savanna and agroecosystems, and how the species respond to different landscape parameters (percentage of land cover and edge density), in a multi-scale perspective. The response variables were: species richness, carnivore frequency, and the frequency of the three most recorded species (Puma concolor, Chrysocyon brachyurus and Leopardus pardalis). We compared 11 competing models using Akaike's information criterion (AIC) and assessed model support using AIC weights. The competing models were combinations of land-cover type (native vegetation, "cerrado" formations, "cerrado" plus eucalypt plantation), landscape feature (percentage of land cover and edge density), and spatial scale. Herein, spatial scale refers to the radius around a sampling point defining a circular landscape. The scales analyzed were 250 (fine), 1,000 (medium), and 2,000 m (coarse). The shape of the curves for the response variables (linear, exponential, and power) was also assessed. Our results indicate that the species with high mobility, P. concolor and C. brachyurus, were best explained by the edge density of native vegetation at the coarse scale (2,000 m). The frequencies of P. concolor and C. brachyurus had a negative power-shaped response to the explanatory variables. This general trend was also observed for species richness and carnivore frequency. Species richness and P. concolor frequency were also well explained by a second concurrent model: edge density of cerrado at the fine (250 m) scale. A different response was recorded for L. pardalis, whose frequency was best explained by the amount of cerrado at the fine (250 m) scale; the response curve was linearly positive. The contrasting results (P. concolor and C. brachyurus vs. L. pardalis) may be due to the much higher mobility of the first two species in comparison with the third; moreover, L. pardalis requires higher-quality habitat. This study highlights the importance of considering multiple spatial scales when evaluating species responses to different habitats. An important new finding was the prevalence of edge density over habitat extent in explaining overall carnivore distribution, key information for the planning and management of protected areas.


Prestes, J, Frollini, AB, De Lima, C, Donatto, FF, Foschini, D, de Marqueti, RC, Figueira Jr, A, and Fleck, SJ. Comparison between linear and daily undulating periodized resistance training to increase strength. J Strength Cond Res 23(9): 2437-2442, 2009. Determining the most effective periodization model for strength and hypertrophy is an important step for strength and conditioning professionals. The aim of this study was to compare the effects of linear (LP) and daily undulating periodized (DUP) resistance training on body composition and maximal strength levels. Forty men aged 21.5 +/- 8.3 years and with a minimum of 1 year of strength training experience were assigned to an LP (n = 20) or a DUP group (n = 20). Subjects were tested for maximal strength in the bench press, 45-degree leg press, and arm curl (1 repetition maximum [RM]) at baseline (T1), after 8 weeks (T2), and after 12 weeks of training (T3). Increases of 18.2% and 25.08% in bench press 1 RM were observed for the LP and DUP groups at T3 compared with T1, respectively (p <= 0.05). In the 45-degree leg press, the LP group exhibited an increase of 24.71% and the DUP group of 40.61% at T3 compared with T1. Additionally, DUP showed an increase of 12.23% at T2 compared with T1 and of 25.48% at T3 compared with T2. For the arm curl exercise, the LP group increased 14.15% and DUP 23.53% at T3 when compared with T1. An increase of 20% was also found at T2 compared with T1 for DUP. Although the DUP group showed the largest strength increases in all exercises, no statistical differences were found between groups. In conclusion, undulating periodized strength training induced higher increases in maximal strength than the linear model in strength-trained men. For maximizing strength increases, daily intensity and volume variations were more effective than weekly variations.


Searching a dataset for elements that are similar to a given query element is a core problem in applications that manage complex data, and it has been aided by metric access methods (MAMs). A growing number of applications require indices that must be built faster and repeatedly, while also providing faster responses to similarity queries. The increase in main memory capacity and its falling cost also motivate the use of memory-based MAMs. In this paper, we propose the Onion-tree, a new and robust dynamic memory-based MAM that slices the metric space into disjoint subspaces to provide quick indexing of complex data. It introduces three major characteristics: (i) a partitioning method that controls the number of disjoint subspaces generated at each node; (ii) a replacement technique that can change the leaf node pivots in insertion operations; and (iii) range and k-NN extended query algorithms to support the new partitioning method, including a new visiting order of the subspaces in k-NN queries. Performance tests with both real-world and synthetic datasets showed that the Onion-tree is very compact. Comparisons of the Onion-tree with the MM-tree and a memory-based version of the Slim-tree showed that the Onion-tree was always faster to build the index. The experiments also showed that the Onion-tree significantly improved range and k-NN query processing performance and was the most efficient MAM, followed by the MM-tree, which in turn outperformed the Slim-tree in almost all the tests. (C) 2010 Elsevier B.V. All rights reserved.
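The Onion-tree's partitioning is beyond an abstract, but the core trick shared by metric access methods (pruning candidates with the triangle inequality against distances precomputed at build time) can be sketched as follows. The Euclidean metric and the single global pivot are simplifying assumptions; real MAMs use many representatives organized hierarchically:

```python
import numpy as np

def euclid(a, b):
    return float(np.linalg.norm(a - b))

def range_query_pivot(data, pivot, pivot_dists, query, radius, dist=euclid):
    """Range query with pivot-based filtering. By the triangle inequality,
    |d(x, pivot) - d(query, pivot)| > radius implies d(x, query) > radius,
    so such elements are discarded without computing d(x, query)."""
    dq = dist(pivot, query)
    result = []
    for i, x in enumerate(data):
        if abs(pivot_dists[i] - dq) > radius:
            continue  # pruned using only the precomputed pivot distance
        if dist(x, query) <= radius:
            result.append(i)
    return result

rng = np.random.default_rng(7)
data = rng.normal(size=(500, 2))
pivot = data[0]
pivot_dists = [euclid(x, pivot) for x in data]   # computed once at build time
query, radius = np.array([0.25, -0.1]), 0.5

fast = range_query_pivot(data, pivot, pivot_dists, query, radius)
brute = [i for i, x in enumerate(data) if euclid(x, query) <= radius]
```

The pruned query returns exactly the brute-force answer while skipping most distance computations, which is why such filtering makes similarity queries cheap when the metric itself is expensive.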


This paper presents a new technique and two algorithms to bulk-load data into multi-way dynamic metric access methods, based on the covering radius of the representative elements employed to organize data in hierarchical data structures. The proposed algorithms are sample-based, and they always build a valid and height-balanced tree. We compare the proposed algorithms with existing ones, illustrating their behavior when bulk-loading data into the Slim-tree metric access method. After having identified the worst case of our first algorithm, we describe adequate counteractions in an elegant way, creating the second algorithm. Experiments performed to evaluate their performance show that our bulk-loading methods build trees faster than the sequential insertion method with regard to construction time, and that they also significantly improve search performance. (C) 2009 Elsevier B.V. All rights reserved.


A novel technique for selecting the poles of orthonormal basis functions (OBF) in Volterra models of any order is presented. It is well known that the usually large number of parameters required to describe the Volterra kernels can be significantly reduced by representing each kernel using an appropriate basis of orthonormal functions. Such a representation results in the so-called OBF Volterra model, which has a Wiener structure consisting of a linear dynamic part generated by the orthonormal basis followed by a nonlinear static mapping given by the Volterra polynomial series. Aiming at optimizing the poles that fully parameterize the orthonormal bases, the exact gradients of the outputs of the orthonormal filters with respect to their poles are computed analytically using a back-propagation-through-time technique. The expressions for the Kautz basis and for generalized orthonormal bases of functions (GOBF) are addressed; those for the Laguerre basis follow straightforwardly as a particular case. The main innovation here is that the dynamic nature of the OBF filters is fully considered in the gradient computations. These gradients provide exact search directions for optimizing the poles of a given orthonormal basis. Such search directions can, in turn, be used as part of an optimization procedure to locate the minimum of a cost function that takes into account the error of estimation of the system output. The Levenberg-Marquardt algorithm is adopted here as the optimization procedure. Unlike previous related work, the proposed approach relies solely on input-output data measured from the system to be modeled, i.e., no information about the Volterra kernels is required. Examples are presented to illustrate the application of this approach to the modeling of dynamic systems, including a real magnetic levitation system with nonlinear oscillatory behavior.
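To make the OBF Volterra (Wiener) structure concrete, the sketch below builds a discrete-time Laguerre filter bank and fits the static polynomial map by least squares. The pole value, basis size, and synthetic target system are illustrative assumptions; the paper's actual contribution (exact pole gradients via back-propagation-through-time and Levenberg-Marquardt pole optimization) is not implemented here:

```python
import numpy as np

def laguerre_bank(u, a, n_filters):
    """Outputs of a discrete-time Laguerre orthonormal filter bank with
    pole a: L0(z) = sqrt(1 - a^2)/(1 - a z^-1) and
    Lk(z) = L_{k-1}(z) * (z^-1 - a)/(1 - a z^-1)."""
    N = len(u)
    L = np.zeros((N, n_filters))
    for n in range(N):
        prev = L[n - 1] if n > 0 else np.zeros(n_filters)
        L[n, 0] = a * prev[0] + np.sqrt(1 - a ** 2) * u[n]
        for k in range(1, n_filters):
            L[n, k] = a * prev[k] + prev[k - 1] - a * L[n, k - 1]
    return L

def volterra_regressors(L):
    """Static polynomial map of the Wiener structure: constant, linear,
    and unique second-order products of the filter-bank outputs."""
    n = L.shape[1]
    quads = [L[:, i] * L[:, j] for i in range(n) for j in range(i, n)]
    return np.column_stack([np.ones(len(L)), L] + quads)

# Identify a synthetic second-order system that lies exactly in the model class.
rng = np.random.default_rng(3)
u = rng.standard_normal(400)
L = laguerre_bank(u, a=0.6, n_filters=3)
y = 1.0 + 0.8 * L[:, 0] - 0.4 * L[:, 1] + 0.5 * L[:, 0] ** 2 \
    + 0.2 * L[:, 0] * L[:, 2]

Phi = volterra_regressors(L)
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ theta
```

With the pole fixed, the model is linear in its remaining parameters, which is why the fit reduces to least squares; the paper's pole gradients address the one part of the problem this sketch leaves out.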


In this article, we evaluate different parameter-estimation strategies for a multiple linear regression model. The model parameters were estimated using data from a clinical trial whose aim was to verify whether the mechanical test of the maximum-force property (EM-FM) is associated with femoral mass, femoral diameter, and the experimental group of ovariectomized rats of the species Rattus norvegicus albinus, Wistar variety. Three methodologies for estimating the model parameters are compared: the classical methodology, based on the least squares method; the Bayesian methodology, based on Bayes' theorem; and the bootstrap method, based on resampling procedures.
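A brief sketch of two of the three strategies (classical least squares and the case-resampling bootstrap) on synthetic data standing in for the femur measurements; the covariate distributions and coefficients are invented for illustration, and the Bayesian route is omitted (under a flat prior its posterior mean would coincide with the least squares estimate):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in: response y (maximum force) regressed on two
# hypothetical covariates (femoral mass, femoral diameter).
n = 60
X = np.column_stack([np.ones(n), rng.normal(10, 1, n), rng.normal(4, 0.3, n)])
beta_true = np.array([5.0, 2.0, -1.5])
y = X @ beta_true + rng.normal(0, 0.5, n)

# Classical estimate: ordinary least squares.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Bootstrap: refit on resampled cases to obtain an empirical
# distribution of the estimates (and hence standard errors).
B = 1000
boot = np.empty((B, 3))
for b in range(B):
    idx = rng.integers(0, n, n)
    boot[b], *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
beta_boot = boot.mean(axis=0)
se_boot = boot.std(axis=0)
```

The bootstrap mean tracks the least squares estimate closely; its added value is the empirical spread (se_boot), obtained without the normality assumptions behind the classical standard errors.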


Linear mixed models were developed to handle clustered data and have been a topic of increasing interest in statistics for the past 50 years. Generally, normality (or symmetry) of the random effects is a common assumption in linear mixed models, but it may sometimes be unrealistic, obscuring important features of among-subject variation. In this article, we utilize skew-normal/independent distributions as a tool for robust modeling of linear mixed models under a Bayesian paradigm. The skew-normal/independent distributions are an attractive class of asymmetric heavy-tailed distributions that includes the skew-normal, skew-t, skew-slash, and skew-contaminated normal distributions as special cases, providing an appealing robust alternative to the routine use of symmetric distributions in this type of model. The methods developed are illustrated using a real data set from the Framingham cholesterol study. (C) 2009 Elsevier B.V. All rights reserved.
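The skew-normal member of this family admits a simple stochastic representation that makes asymmetric random effects easy to simulate; the sketch below uses that representation, with illustrative location, scale, and shape values:

```python
import numpy as np

def rvs_skew_normal(loc, scale, alpha, size, rng):
    """Draw from a skew-normal distribution via the stochastic
    representation Z = delta*|U0| + sqrt(1 - delta^2)*U1, with U0, U1
    iid N(0, 1) and delta = alpha / sqrt(1 + alpha^2); alpha = 0
    recovers the symmetric normal case."""
    delta = alpha / np.sqrt(1 + alpha ** 2)
    u0 = np.abs(rng.standard_normal(size))
    u1 = rng.standard_normal(size)
    return loc + scale * (delta * u0 + np.sqrt(1 - delta ** 2) * u1)

# Simulated asymmetric random effects (shape alpha = 4 gives right skew).
rng = np.random.default_rng(0)
effects = rvs_skew_normal(0.0, 1.0, 4.0, 100_000, rng)
```

Draws like these show what symmetric random-effects assumptions hide: a shifted mean and a pronounced right tail, the "features of among-subject variation" the abstract refers to.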