977 results for Linear multistep methods
Abstract:
OBJECTIVE: To report stabilization of closed, comminuted distal metaphyseal transverse fractures of the left tibia and fibula in a tiger using a hybrid circular-linear external skeletal fixator. STUDY DESIGN: Clinical report. ANIMAL: Juvenile tiger (15 months, 90 kg). METHODS: From imaging studies, the tiger had comminuted distal metaphyseal transverse fractures of the left tibia and fibula, with mild caudolateral displacement and moderate compression. Multiple fissures extended from the fractures through the distal metaphyses, extending toward, but not involving, the distal tibial and fibular physes. A hybrid circular-linear external skeletal fixator was applied by closed reduction to stabilize the fractures. RESULTS: The fractures healed and the fixator was removed 5 weeks after stabilization. Limb length and alignment were similar to those of the normal contralateral limb at hospital discharge, 8 weeks after surgery. Two weeks later, the tiger sustained fractures of the right tibia and fibula and was euthanatized. Necropsy confirmed pathologic fractures ascribed to copper deficiency. CONCLUSION: Closed application of the hybrid construct provided sufficient stability to allow this 90 kg tiger's juxta-articular fractures to heal with minimal complications and without disrupting growth from the adjacent physes.
Abstract:
The aim of this study is to develop a new, simple method for analyzing one-dimensional transcranial magnetic stimulation (TMS) mapping studies in humans. Motor evoked potentials (MEP) were recorded from the abductor pollicis brevis (APB) muscle during stimulation at nine different positions on the scalp along a line passing through the APB hot spot and the vertex. Non-linear curve fitting according to the Levenberg-Marquardt algorithm was performed on the averaged amplitude values obtained at all points to find the best-fitting symmetrical and asymmetrical peak functions. Several peak functions could be fitted to the experimental data. Across all subjects, a symmetric, bell-shaped curve, the complementary error function (erfc), gave the best results. This function is characterized by three parameters giving its amplitude, position, and width. None of the mathematical functions tested with fewer or more than three parameters fitted better. The amplitude and position parameters of the erfc were highly correlated with the amplitude at the hot spot and with the location of the center of gravity of the TMS curve. In conclusion, non-linear curve fitting is an accurate method for the mathematical characterization of one-dimensional TMS curves. This is the first method that provides information on amplitude, position, and width simultaneously.
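A minimal sketch of how such a three-parameter peak fit could be carried out with the Levenberg-Marquardt algorithm in SciPy. The exact functional form used here (an erfc of the absolute distance from the peak position, giving a symmetric bell shape) and the scalp positions and MEP amplitudes are illustrative assumptions, not the study's data or code.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

# Hypothetical symmetric, bell-shaped peak model with three parameters:
# amplitude, position of the peak, and width.
def erfc_peak(x, amplitude, position, width):
    return amplitude * erfc(np.abs(x - position) / width)

# Illustrative scalp positions (cm along the mapping line) and averaged MEP amplitudes (mV).
positions = np.linspace(-4, 4, 9)
mep = np.array([0.05, 0.15, 0.45, 0.90, 1.20, 0.85, 0.40, 0.10, 0.05])

# With no bounds supplied, curve_fit uses the Levenberg-Marquardt algorithm.
params, cov = curve_fit(erfc_peak, positions, mep, p0=[1.0, 0.0, 1.0])
amplitude, position, width = params
print(f"amplitude={amplitude:.2f} mV, position={position:.2f} cm, width={width:.2f} cm")
```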
Abstract:
Despite the widespread popularity of linear models for correlated outcomes (e.g., linear mixed models and time series models), distribution diagnostic methodology remains relatively underdeveloped in this context. In this paper we present an easy-to-implement approach that lends itself to graphical displays of model fit. Our approach involves multiplying the estimated marginal residual vector by the Cholesky decomposition of the inverse of the estimated marginal variance matrix. The resulting "rotated" residuals are used to construct an empirical cumulative distribution function and pointwise standard errors. The theoretical framework, including conditions and asymptotic properties, involves technical details that are motivated by Lange and Ryan (1989), Pierce (1982), and Randles (1982). Our method appears to work well in a variety of circumstances, including models having independent units of sampling (clustered data) and models in which all observations are correlated (e.g., a single time series). Our methods can produce satisfactory results even for models that do not satisfy all of the technical conditions stated in our theory.
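A minimal sketch of the rotation step, assuming a fitted marginal mean and an estimated marginal covariance matrix are already available from some model; the covariance structure and data below are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Placeholder "estimates": an AR-like marginal covariance and a zero mean.
idx = np.arange(n)
V_hat = 0.5 * np.exp(-np.abs(np.subtract.outer(idx, idx)) / 5.0) + 0.5 * np.eye(n)
mu_hat = np.zeros(n)
y = rng.multivariate_normal(mu_hat, V_hat)

resid = y - mu_hat                              # marginal residuals
C = np.linalg.cholesky(np.linalg.inv(V_hat))    # Cholesky factor of V_hat^{-1}
rotated = C.T @ resid                           # "rotated" residuals with (approx.) identity covariance

# Empirical CDF of the rotated residuals, to be plotted against the reference
# distribution together with pointwise standard errors.
x = np.sort(rotated)
ecdf = np.arange(1, n + 1) / n
print(np.round(x[:5], 2), ecdf[:5])
```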
Abstract:
This paper proposes Poisson log-linear multilevel models to investigate population variability in sleep state transition rates. We specifically propose a Bayesian Poisson regression model that is more flexible, more scalable to larger studies, and more easily fit than other approaches in the literature. We further use hierarchical random effects to account for pairings of individuals and repeated measures within those individuals, since comparing diseased with non-diseased subjects while minimizing bias is of epidemiologic importance. We estimate essentially non-parametric piecewise constant hazards and smooth them, and we allow for time-varying covariates and segment-of-the-night comparisons. The Bayesian Poisson regression is justified through a re-derivation of a classical algebraic likelihood equivalence between Poisson regression with a log(time) offset and survival regression assuming piecewise constant hazards. This relationship allows us to synthesize two methods currently used to analyze sleep transition phenomena: stratified multi-state proportional hazards models and log-linear models with GEE for transition counts. An example data set from the Sleep Heart Health Study is analyzed.
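A small illustration of the likelihood equivalence the abstract builds on: Poisson regression for transition counts with a log(time-at-risk) offset yields the same estimates as a survival model with piecewise constant hazards. The sketch uses a plain frequentist GLM from statsmodels rather than the paper's Bayesian multilevel model, and the data and variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# One row per subject x night segment: number of sleep-state transitions (events)
# and time at risk in that segment (hypothetical values).
df = pd.DataFrame({
    "events":   [0, 1, 0, 2, 1, 0],
    "time":     [1.5, 2.0, 0.5, 3.0, 1.0, 2.5],   # hours at risk
    "segment":  [1, 2, 3, 1, 2, 3],               # segment of the night
    "diseased": [0, 0, 0, 1, 1, 1],
})

X = pd.get_dummies(df["segment"], prefix="seg", drop_first=True, dtype=float)
X["diseased"] = df["diseased"].astype(float)
X = sm.add_constant(X)

# Poisson regression with a log(time) offset: the exponentiated coefficients are
# transition-rate ratios, i.e., piecewise constant hazards per segment/covariate.
model = sm.GLM(df["events"], X, family=sm.families.Poisson(), offset=np.log(df["time"]))
print(model.fit().summary())
```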
Abstract:
Microarrays have become instrumental for bacterial detection, identification, and genotyping, as well as for transcriptomic studies. For gene expression analyses using limited numbers of bacteria (of in vivo or ex vivo origin, for example), RNA amplification is often required prior to labeling and hybridization onto microarrays. Evaluating the fidelity of the amplification methods is crucial for the robustness and reproducibility of microarray results. We report here the first use of random primers and the highly processive Phi29 phage polymerase to amplify material for transcription profiling analyses. We compared two commercial amplification methods (GenomiPhi and MessageAmp kits) with direct reverse transcription as the reference method, focusing on the robustness of mRNA quantification using either microarrays or quantitative RT-PCR. Both amplification methods, using either poly-A tailing followed by in vitro transcription or direct strand displacement polymerase, showed appreciable linearity. The strand displacement technique was particularly affordable compared with in vitro transcription-based (IVT) amplification methods and consisted of a single-tube reaction leading to high amplification yields. Real-time measurements using low-, medium-, and highly expressed genes revealed that this simple method provided linear amplification, with results in terms of relative messenger abundance equivalent to those obtained by conventional direct reverse transcription.
Abstract:
OBJECTIVES The aim of this study was to analyze trigger activity in the long-term follow-up after left atrial (LA) linear ablation. BACKGROUND Interventional strategies for curative treatment of atrial fibrillation (AF) are targeted at the triggers and/or the maintaining substrate. After substrate modification using nonisolating linear lesions, the activity of triggers is unknown. METHODS With the LA linear lesion concept, 129 patients were treated using intraoperative ablation with minimally invasive surgical techniques. Contiguous radiofrequency energy-induced lesion lines involving the mitral annulus and the orifices of the pulmonary veins, without isolation, were placed under direct vision. RESULTS After a mean follow-up of 3.6 +/- 0.4 years, atrial ectopy, atrial runs, and recurrence of AF episodes were analyzed by digital 7-day electrocardiograms in 30 patients. Atrial ectopy was present in all patients. Atrial runs were present in 25 of 30 patients (83%), with a median of 9 runs per patient per week (range 1 to 321) and a median duration of 1.2 s/run (range 0.7 to 25), without a significant difference in atrial ectopy and atrial runs between patients with former paroxysmal (n = 17) or persistent AF (n = 13). Overall, 87% of all patients were completely free from AF without antiarrhythmic drugs. CONCLUSIONS A detailed rhythm analysis late after specific LA linear lesion ablation shows that trigger activity remains relatively frequent but short and does not induce AF episodes in most patients. The long-term success rate of this concept is high in patients with paroxysmal or persistent AF.
Abstract:
OBJECTIVES We sought to analyze the time course of atrial fibrillation (AF) episodes before and after circular plus linear left atrial ablation and the percentage of patients with complete freedom from AF after ablation, using serial 7-day electrocardiograms (ECGs). BACKGROUND The curative treatment of AF targets the pathophysiological cornerstones of AF (i.e., the initiating triggers and/or the perpetuation of AF). The pathophysiological complexity of both may not result in an "all-or-nothing" response but may instead modify the number and duration of AF episodes. METHODS In patients with highly symptomatic AF, circular plus linear ablation lesions were placed around the left and right pulmonary veins, between the two circles, and from the left circle to the mitral annulus using an electroanatomic mapping system. Repeated continuous 7-day ECGs recorded before and after catheter ablation were used for rhythm follow-up. RESULTS In 100 patients with paroxysmal (n = 80) and persistent (n = 20) AF, the relative duration of time spent in AF decreased significantly over time (35 +/- 37% before ablation, 26 +/- 41% directly after ablation, and 10 +/- 22% after 12 months). Freedom from AF increased stepwise in patients with paroxysmal AF and, after 12 months, measured 88% or 74%, depending on whether a 24-h ECG or a 7-day ECG was used. Complete pulmonary vein isolation was demonstrated in <20% of the circular lesions. CONCLUSIONS The results obtained in patients with AF treated with circular plus linear left atrial lesions strongly indicate that substrate modification is the main underlying pathophysiologic mechanism and that it results in a delayed rather than an immediate cure.
Abstract:
OBJECTIVES: Premature babies require supplementation with calcium and phosphorus to prevent metabolic bone disease of prematurity. To guide mineral supplementation, two methods of monitoring urinary excretion of calcium and phosphorus are used: urinary calcium or phosphorus concentration, and calcium/creatinine or phosphorus/creatinine ratios. We compared these two methods with regard to their agreement on the need for mineral supplementation. METHODS: Retrospective chart review of 230 premature babies with birthweight <1500 g, undergoing screening of urinary spot samples from day 21 of life and fortnightly thereafter. Hypothetical cut-off values for urine calcium or phosphorus concentration (1 mmol/l) and urine calcium/creatinine ratio (0.5 mol/mol) or phosphorus/creatinine ratio (4 mol/mol) were applied to the sample results. The agreement between the two methods on whether or not to supplement the respective minerals was compared. Multivariate general linear models were used to identify patient characteristics predicting discordant results. RESULTS: The two methods disagreed on the indication for calcium supplementation in 24.8% of cases and on the indication for phosphorus supplementation in 8.8%. Total daily calcium intake was the only patient characteristic associated with discordant results. CONCLUSIONS: With respect to the decision to supplement the respective mineral, agreement between urinary mineral concentration and the mineral/creatinine ratio is moderate for calcium and good for phosphorus. The results do not identify either method as superior for deciding which babies require calcium and/or phosphorus supplements.
Abstract:
Localized short-echo-time (1)H-MR spectra of human brain contain contributions from many low-molecular-weight metabolites and baseline contributions from macromolecules. Two approaches to model such spectra are compared, and the data acquisition sequence, optimized for reproducibility, is presented. Modeling relies on prior-knowledge constraints and linear combination of metabolite spectra. We investigated what can be gained by basis parameterization, i.e., description of basis spectra as sums of parametric lineshapes. The effects of basis composition and of adding experimentally measured macromolecular baselines were also investigated. Both fitting methods yielded quantitatively similar values, model deviations, error estimates, and reproducibility in the evaluation of 64 spectra of human gray and white matter from 40 subjects. Major advantages of parameterized basis functions are the possibilities to evaluate fitting parameters separately, to treat subgroup spectra as independent moieties, and to incorporate deviations from straightforward metabolite models. It was found that most of the 22 basis metabolites used may provide meaningful data when comparing patient cohorts. In individual spectra, sums of closely related metabolites are often more meaningful. Inclusion of a macromolecular basis component leads to relatively small, but significantly different, estimates of tissue content for most metabolites. It provides a means to quantify baseline contributions that may contain crucial clinical information.
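To make the "linear combination of metabolite basis spectra" idea concrete, here is a toy sketch that fits an observed spectrum as a non-negative weighted sum of synthetic basis spectra; real short-echo-time MRS fitting with lineshape, phase, and macromolecular baseline handling is considerably more involved, and all names and numbers below are illustrative.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic basis: four Gaussian "metabolite" spectra on a common frequency axis.
n_points = 512
x = np.linspace(0.0, 1.0, n_points)
centers = [0.2, 0.4, 0.6, 0.8]
basis = np.column_stack([np.exp(-((x - c) / 0.02) ** 2) for c in centers])

# Synthetic observed spectrum: a known mixture plus noise.
true_conc = np.array([1.0, 0.5, 2.0, 0.0])
observed = basis @ true_conc + 0.01 * np.random.default_rng(1).normal(size=n_points)

# Non-negative least squares as a simple stand-in for the constrained linear fit.
conc, residual_norm = nnls(basis, observed)
print(np.round(conc, 2))   # estimated "concentrations" of the basis components
```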
Abstract:
Background and Aims Ongoing global warming has been implicated in shifting phenological patterns, such as the timing and duration of the growing season, across a wide variety of ecosystems. Linear models are routinely used to extrapolate these observed shifts in phenology into the future and to estimate changes in associated ecosystem properties such as net primary productivity. Yet, in nature, linear relationships may be special cases. Biological processes frequently follow more complex, non-linear patterns according to limiting factors that generate shifts and discontinuities, or contain thresholds beyond which responses change abruptly. This study investigates to what extent cambium phenology is associated with xylem growth and differentiation across conifer species of the northern hemisphere. Methods Xylem cell production is compared with the periods of cambial activity and cell differentiation assessed on a weekly time scale on histological sections of cambium and wood tissue collected from the stems of nine species in Canada and Europe over 1–9 years per site from 1998 to 2011. Key Results The dynamics of xylogenesis were surprisingly homogeneous among conifer species, although deviations from the average were observed. Within the range analysed, the relationships between the phenological timings were linear, with several slopes showing values close to, or not statistically different from, 1. The relationships between the phenological timings and cell production were distinctly non-linear and involved an exponential pattern. Conclusions Trees adjust their phenological timings according to linear patterns. Thus, shifts of one phenological phase are associated with synchronous and comparable shifts of the successive phases. However, small increases in the duration of xylogenesis could correspond to a substantial increase in cell production. The findings suggest that the length of the growing season and the resulting amount of growth could respond differently to changes in environmental conditions.
Abstract:
This paper reports a comparison of three modeling strategies for the analysis of hospital mortality in a sample of general medicine inpatients in a Department of Veterans Affairs medical center. Logistic regression, a Markov chain model, and longitudinal logistic regression were evaluated on predictive performance, as measured by the c-index, and on the accuracy of expected numbers of deaths compared with observed. The logistic regression used patient information collected at admission; the Markov model comprised two absorbing states for discharge and death and three transient states reflecting increasing severity of illness as measured by laboratory data collected during the hospital stay; the longitudinal regression employed Generalized Estimating Equations (GEE) to model the covariance structure of the repeated binary outcome. Results showed that the logistic regression predicted hospital mortality as well as the alternative methods but was limited in scope of application. The Markov chain provides insight into how day-to-day changes of illness severity lead to discharge or death. The longitudinal logistic regression showed that an increasing illness trajectory is associated with hospital mortality. The conclusion is reached that for standard applications in modeling hospital mortality, logistic regression is adequate, but for the new challenges facing health services research today, alternative methods are equally predictive and practical, and can provide new insights.
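A minimal sketch of the third strategy, longitudinal logistic regression fitted with GEE in statsmodels, using simulated daily binary outcomes; the variable names and data are illustrative and not taken from the VA study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_patients, n_days = 30, 5

# Simulated repeated measures: one severity score per patient-day.
df = pd.DataFrame({
    "patient":  np.repeat(np.arange(n_patients), n_days),
    "day":      np.tile(np.arange(n_days), n_patients),
    "severity": rng.normal(size=n_patients * n_days),
})
logit = -3.0 + 0.8 * df["severity"]
df["death"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# GEE logistic regression with an exchangeable working covariance structure
# to account for repeated binary outcomes within patients.
X = sm.add_constant(df[["day", "severity"]])
model = sm.GEE(df["death"], X, groups=df["patient"],
               family=sm.families.Binomial(),
               cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```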
Abstract:
INTRODUCTION The aim of this study was to determine the reproducibility and accuracy of linear measurements on 2 types of dental models derived from cone-beam computed tomography (CBCT) scans: CBCT images and Anatomodels (InVivoDental, San Jose, Calif); these were compared with digital models generated from dental impressions (Digimodels; Orthoproof, Nieuwegein, The Netherlands). The Digimodels were used as the reference standard. METHODS The 3 types of digital models were made for 10 subjects. Four examiners repeated 37 linear tooth and arch measurements 10 times. Paired t tests and intraclass correlation coefficients were used to determine the reproducibility and accuracy of the measurements. RESULTS The CBCT images showed significantly smaller intraclass correlation coefficient values and larger duplicate measurement errors compared with the corresponding values for Digimodels and Anatomodels. The average difference between measurements on CBCT images and Digimodels ranged from -0.4 to 1.65 mm, with limits-of-agreement values up to 1.3 mm for crown-width measurements. The average difference between Anatomodels and Digimodels ranged from -0.42 to 0.84 mm, with limits-of-agreement values up to 1.65 mm. CONCLUSIONS Statistically significant differences between measurements on Digimodels and Anatomodels, and between Digimodels and CBCT images, were found. Although the mean differences might be clinically acceptable, the random errors were relatively large compared with corresponding measurements reported in the literature for both Anatomodels and CBCT images, and might be clinically important. Therefore, with the CBCT settings used in this study, measurements made directly on CBCT images and Anatomodels are not as accurate as measurements on Digimodels.
Abstract:
In this paper we develop an adaptive procedure for the numerical solution of general semilinear elliptic problems with possible singular perturbations. Our approach combines prediction-type adaptive Newton methods with a linear adaptive finite element discretization (based on a robust a posteriori error analysis), thereby leading to a fully adaptive Newton–Galerkin scheme. Numerical experiments underline the robustness and reliability of the proposed approach for various examples.
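As a rough illustration of the Newton linearization at the core of such a scheme, here is a minimal, non-adaptive Newton iteration for a 1D semilinear model problem discretized with finite differences; the paper's adaptive finite element discretization and a posteriori error control are not reproduced, and the problem data are chosen arbitrarily.

```python
import numpy as np

# Model problem: -u'' + u^3 = g on (0, 1), u(0) = u(1) = 0,
# discretized with second-order finite differences.
n = 99                        # interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
g = np.sin(np.pi * x)         # arbitrary right-hand side

# Matrix for -u'' with homogeneous Dirichlet conditions.
A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

u = np.zeros(n)
for it in range(20):
    residual = A @ u + u**3 - g
    jacobian = A + np.diag(3.0 * u**2)        # linearization of the semilinear term
    delta = np.linalg.solve(jacobian, -residual)
    u += delta
    if np.linalg.norm(delta) < 1e-10:
        break
print(f"converged in {it + 1} Newton steps, max|u| = {np.abs(u).max():.4f}")
```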
Abstract:
Index tracking has become one of the most common strategies in asset management. The index-tracking problem consists of constructing a portfolio that replicates the future performance of an index by including only a subset of the index constituents in the portfolio. Finding the most representative subset is challenging when the number of stocks in the index is large. We introduce a new three-stage approach that first identifies promising subsets by employing data-mining techniques, then determines the stock weights in the subsets using mixed-binary linear programming, and finally evaluates the subsets based on cross-validation. The best subset is returned as the tracking portfolio. Our approach outperforms state-of-the-art methods in terms of out-of-sample performance and running times.
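A sketch of one plausible mixed-binary linear formulation for the weight-determination stage: minimize the total absolute deviation between portfolio and index returns, with binary selection variables and a cardinality limit, written with PuLP. The data, parameter names, and exact objective are assumptions for illustration, not the paper's model.

```python
import numpy as np
import pulp

rng = np.random.default_rng(0)
n_stocks, n_periods, k = 10, 60, 4
stock_ret = rng.normal(0.0, 0.02, size=(n_periods, n_stocks))            # historical stock returns
index_ret = stock_ret.mean(axis=1) + rng.normal(0.0, 0.002, n_periods)   # index returns

prob = pulp.LpProblem("index_tracking", pulp.LpMinimize)
w = [pulp.LpVariable(f"w_{i}", lowBound=0, upBound=1) for i in range(n_stocks)]   # weights
z = [pulp.LpVariable(f"z_{i}", cat="Binary") for i in range(n_stocks)]            # selection
dev_pos = [pulp.LpVariable(f"dp_{t}", lowBound=0) for t in range(n_periods)]
dev_neg = [pulp.LpVariable(f"dn_{t}", lowBound=0) for t in range(n_periods)]

prob += pulp.lpSum(dev_pos) + pulp.lpSum(dev_neg)        # total absolute tracking deviation
for t in range(n_periods):
    port_ret = pulp.lpSum(float(stock_ret[t, i]) * w[i] for i in range(n_stocks))
    prob += port_ret - float(index_ret[t]) == dev_pos[t] - dev_neg[t]
prob += pulp.lpSum(w) == 1                               # fully invested
for i in range(n_stocks):
    prob += w[i] <= z[i]                                 # positive weight only if selected
prob += pulp.lpSum(z) <= k                               # at most k constituents

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({i: round(w[i].value(), 3) for i in range(n_stocks) if w[i].value() > 1e-6})
```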