573 results for predictability


Relevance: 10.00%

Abstract:

Finite element analysis for prediction of bone strength. Philippe K Zysset, Enrico Dall'Ara, Peter Varga & Dieter H Pahr. BoneKEy Reports (2013) 2, Article number: 386; doi:10.1038/bonekey.2013.120. Received 03 January 2013; accepted 25 June 2013; published online 07 August 2013.

Finite element (FE) analysis has been applied for the past 40 years to simulate the mechanical behavior of bone. Although several validation studies have been performed on specific anatomical sites and load cases, this study reviews the predictability of human bone strength at the three major osteoporotic fracture sites, as quantified in recently completed in vitro studies at our former institute. Specifically, the performance of FE analysis based on quantitative computed tomography (QCT) is compared with that of the current densitometric standards: bone mineral content (BMC), bone mineral density (BMD) and areal BMD (aBMD). Clinical fractures were produced in monotonic axial compression of distal radii and vertebral sections and in side loading of proximal femora. QCT-based FE models of the three bones were developed to simulate as closely as possible the boundary conditions of each experiment. For all sites, the FE methodology exhibited the lowest errors and the highest correlations in predicting experimental bone strength. Likely owing to the improved CT image resolution, the quality of the FE prediction in the peripheral skeleton using high-resolution peripheral CT was superior to that in the axial skeleton with whole-body QCT. Because of its projective and scalar nature, the performance of aBMD in predicting bone strength depended on the loading mode: it was significantly inferior to FE in axial compression of radial or vertebral sections, but not in side loading of the femur. Considering the accumulated evidence from the published validation studies, it is concluded that FE models provide the most reliable surrogates of bone strength at all three fracture sites.
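The comparison logic is straightforward to express in code. Below is a minimal sketch, with synthetic data rather than the study's measurements, of how competing strength surrogates are typically benchmarked: regress each predictor against the experimentally measured failure load and compare correlation and error.

```python
# Hypothetical illustration: comparing surrogate predictors of bone strength.
# The arrays are invented for the sketch; the paper compares FE-predicted
# strength against densitometric measures using errors and correlations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
f_exp = rng.uniform(2.0, 8.0, 30)                     # measured failure load (kN), synthetic
predictors = {
    "FE":   f_exp + rng.normal(0.0, 0.4, 30),         # FE tracks strength closely
    "aBMD": 0.6 * f_exp + rng.normal(0.0, 1.0, 30),   # projective measure, noisier
}

for name, pred in predictors.items():
    slope, intercept, r, _, _ = stats.linregress(pred, f_exp)
    rmse = np.sqrt(np.mean((f_exp - (slope * pred + intercept)) ** 2))
    print(f"{name}: r^2 = {r**2:.2f}, RMSE = {rmse:.2f} kN")
```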

Relevance: 10.00%

Abstract:

PURPOSE The validity of the seventh edition of the American Joint Committee on Cancer/International Union Against Cancer (AJCC/UICC) staging system for gastric cancer has been evaluated in several studies, mostly in Asian patient populations. Few data are available on the prognostic implications of the new classification system in a Western population. We therefore investigated its prognostic ability in a German patient cohort. PATIENTS AND METHODS Data from a single-center cohort of 1,767 consecutive patients surgically treated for gastric cancer were classified according to the seventh edition and compared with the previous TNM/UICC classification. Kaplan-Meier analyses were performed for all TNM and UICC stages in a comparative manner. Additional survival receiver operating characteristic analyses and bootstrap-based goodness-of-fit comparisons via the Bayesian information criterion (BIC) were performed to assess and compare the prognostic performance of the competing classification systems. RESULTS We identified the UICC pT/pN stages according to the seventh edition of the AJCC/UICC guidelines, as well as resection status, age, Lauren histotype, lymph-node ratio, and tumor grade, as independent prognostic factors in gastric cancer, consistent with data from previous Asian studies. Overall survival rates according to the new edition were significantly different for each individual pT, pN, and UICC stage. However, BIC analysis revealed that, owing to its higher complexity, the new staging system may not significantly improve the predictability of overall survival compared with the old system in the analyzed cohort. CONCLUSION The seventh edition of the AJCC/UICC classification was found to be valid, with a distinct prognosis for each stage. However, the classification has become more complex without improving the predictability of overall survival in a Western population. Simplification with better predictability of overall survival should therefore be considered when revising the seventh edition.
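A hedged sketch of the goodness-of-fit comparison described above: fit each staging system as a covariate in a Cox proportional-hazards model and compare BIC, which penalizes the extra parameters of the more complex system. The column names ("time", "event", "stage_v6", "stage_v7") and the use of lifelines are assumptions for illustration, not the study's actual pipeline.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def cox_bic(df: pd.DataFrame, covariate: str) -> float:
    """BIC of a Cox model with one (numerically coded) staging covariate."""
    cph = CoxPHFitter()
    cph.fit(df[[covariate, "time", "event"]], duration_col="time", event_col="event")
    k = len(cph.params_)           # number of fitted coefficients
    n = int(df["event"].sum())     # events: the usual sample size for partial-likelihood BIC
    return k * np.log(n) - 2.0 * cph.log_likelihood_

# Lower BIC wins: a more complex staging system must earn its extra parameters.
# bic_old = cox_bic(cohort, "stage_v6"); bic_new = cox_bic(cohort, "stage_v7")
```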

Relevance: 10.00%

Abstract:

Disorganized behavior is a key symptom of schizophrenia, and its objective assessment is particularly challenging. Actigraphy has enabled the objective assessment of motor behavior in various settings. Reduced motor activity has been associated with negative syndrome scores, but simple motor activity analyses were not informative about other symptom dimensions. The analysis of movement patterns, however, could be more informative for assessing schizophrenia symptom dimensions. Here, we applied time series analyses to actigraphic data from 100 schizophrenia spectrum disorder patients. The actigraphy recording interval was set at 2 s. Data from two defined 60-min periods were analyzed, and partial autocorrelations of the actigraphy time series indicated the predictability of movements in each individual. Increased positive syndrome scores were associated with reduced predictability of movements but not with the overall amount of movement. Negative syndrome scores were associated with low activity levels but were unrelated to the predictability of movement. The factors disorganization and excitement were related to movement predictability, but emotional distress was not. Thus, the predictability of objectively assessed motor behavior may be a marker of positive symptoms and disorganized behavior, and could become relevant for translational research.
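A minimal sketch of the predictability measure named above, using a synthetic series in place of real actigraphy counts: compute partial autocorrelations of a 60-min recording sampled every 2 s and take the low-lag values as the individual's movement-predictability index.

```python
# 60 min at 2 s epochs gives 1800 samples; the series here is synthetic.
import numpy as np
from statsmodels.tsa.stattools import pacf

rng = np.random.default_rng(1)
activity = rng.gamma(shape=2.0, scale=1.0, size=1800)   # stand-in for activity counts

# Partial autocorrelations up to lag 10; higher low-lag values mean the next
# movement epoch is more predictable from the preceding ones.
partial_acf = pacf(activity, nlags=10)
predictability = partial_acf[1]     # lag-1 partial autocorrelation as a simple index
print(f"lag-1 PACF: {predictability:.3f}")
```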

Relevance: 10.00%

Abstract:

Evidence suggests a strong relation between superior motor performance and the duration of the last fixation before movement initiation (called Quiet Eye, QE). Although this phenomenon has proved quite robust, to date little is known about its functional role. Therefore, in two experiments, a novel paradigm is introduced that tests the QE as an independent variable by experimentally manipulating the duration of the last fixation in a throwing task. In addition, this paradigm allowed for the manipulation of the predictability of the target position. Results of the first study revealed an increase in throwing accuracy as a function of experimentally prolonged QE durations. Thus, the assumption that the QE is not a mere by-product of superior performance could be confirmed. In the second study, this dependency was found only under high task-demand conditions in which the target position was not predictable. This finding supports the hypothesis that the optimization of information processing is the crucial mechanism behind QE effects.

Relevance: 10.00%

Abstract:

Serial correlation of extreme midlatitude cyclones observed at the storm track exits is explained by deviations from a Poisson process. To model these deviations, we apply fractional Poisson processes (FPPs) to extreme midlatitude cyclones, which are defined by the 850 hPa relative vorticity of the ERA-Interim reanalysis during boreal winter (DJF) and summer (JJA) seasons. Extremes are defined by a 99% quantile threshold in the grid-point time series. In general, FPPs are based on long-term memory and lead to non-exponential return time distributions. The return times are described by a Weibull distribution to approximate the Mittag-Leffler function in the FPPs. The Weibull shape parameter yields a dispersion parameter that agrees with results found for midlatitude cyclones. The memory of the FPP, which is determined by detrended fluctuation analysis, provides an independent estimate of the shape parameter. Thus, the analysis provides a concise framework linking the deviation from Poisson statistics (via a dispersion parameter), non-exponential return times and memory (correlation) on the basis of a single parameter. The results have potential implications for the predictability of extreme cyclones.
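A hedged sketch of the return-time analysis, on synthetic data: threshold a series at its 99% quantile, collect waiting times between exceedances, and fit a Weibull distribution whose shape parameter measures the deviation from Poisson (exponential) statistics.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
series = rng.standard_normal(50_000)            # stand-in for a vorticity time series
threshold = np.quantile(series, 0.99)           # 99% quantile defines "extreme"
exceed_idx = np.flatnonzero(series > threshold)
return_times = np.diff(exceed_idx)              # waiting times between extremes

# Shape parameter c: c = 1 recovers the exponential (Poisson) case,
# c < 1 indicates clustering (over-dispersion) of extremes.
c, loc, scale = stats.weibull_min.fit(return_times, floc=0)
print(f"Weibull shape c = {c:.2f} (c < 1 implies clustered extremes)")
```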

Relevance: 10.00%

Abstract:

Ongoing changes in disturbance regimes are predicted to cause acute changes in ecosystem structure and function in the coming decades, but many aspects of these predictions are uncertain. A key challenge is to improve the predictability of postdisturbance biogeochemical trajectories at the ecosystem level. Ecosystem ecologists and paleoecologists have generated complementary data sets about disturbance (type, severity, frequency) and ecosystem response (net primary productivity, nutrient cycling) spanning decadal to millennial timescales. Here, we take the first steps toward a full integration of these data sets by reviewing how disturbances are reconstructed using dendrochronological and sedimentary archives and by summarizing the conceptual frameworks for carbon, nitrogen, and hydrologic responses to disturbances. Key research priorities include further development of paleoecological techniques that reconstruct both disturbances and terrestrial ecosystem dynamics. In addition, mechanistic detail from disturbance experiments, long-term observations, and chronosequences can help increase the understanding of ecosystem resilience.

Relevance: 10.00%

Abstract:

The Earth’s climate system is driven by a complex interplay of internal chaotic dynamics and natural and anthropogenic external forcing. Recent instrumental data have shown a remarkable degree of asynchronicity between Northern Hemisphere and Southern Hemisphere temperature fluctuations, thereby questioning the relative importance of internal versus external drivers of past as well as future climate variability. However, large-scale temperature reconstructions for the past millennium have focused on the Northern Hemisphere, limiting empirical assessments of inter-hemispheric variability on multi-decadal to centennial timescales. Here, we introduce a new millennial ensemble reconstruction of annually resolved temperature variations for the Southern Hemisphere based on an unprecedented network of terrestrial and oceanic palaeoclimate proxy records. In conjunction with an independent Northern Hemisphere temperature reconstruction ensemble, this record reveals an extended cold period (1594–1677) in both hemispheres but no globally coherent warm phase during the pre-industrial (1000–1850) era. The current (post-1974) warm phase is the only period of the past millennium where both hemispheres are likely to have experienced contemporaneous warm extremes. Our analysis of inter-hemispheric temperature variability in an ensemble of climate model simulations for the past millennium suggests that models tend to overemphasize Northern Hemisphere–Southern Hemisphere synchronicity by underestimating the role of internal ocean–atmosphere dynamics, particularly in the ocean-dominated Southern Hemisphere. Our results imply that climate system predictability on decadal to century timescales may be lower than expected based on assessments of external climate forcing and Northern Hemisphere temperature variations alone.

Relevance: 10.00%

Abstract:

Finite element analysis is an accepted method to predict vertebral body compressive strength. This study compares measurements obtained from in vitro tests with those from two different simulation models: clinical quantitative computed tomography (QCT)-based homogenized finite element (hFE) models and pre-clinical high-resolution peripheral QCT-based (HR-pQCT) hFE models. Thirty-seven vertebral body sections were prepared by removing end-plates and posterior elements, scanned with QCT (390/450 µm voxel size) as well as HR-pQCT (82 µm voxel size), and tested in compression up to failure. Non-linear viscous damage hFE models were created from the QCT/HR-pQCT images and compared to experimental results based on stiffness and ultimate load. As expected, the predictability of the QCT/HR-pQCT-based hFE models for both apparent stiffness (r² = 0.685/0.801) and strength (r² = 0.774/0.924) increased with better image resolution. An analysis of the damage distribution showed similar damage locations for all cases. In conclusion, HR-pQCT-based hFE models increased the predictability considerably and do not need any tuning of input parameters. In contrast, QCT-based hFE models usually need some tuning but are clinically the only possible choice at the moment.

Relevance: 10.00%

Abstract:

Salt transport in the Irminger Current, and thus the coupling between the eastern and western subpolar North Atlantic, plays an important role for climate variability across a wide range of time scales. High-resolution ocean modeling and observations indicate that salinities in the eastern subpolar North Atlantic decrease with enhanced circulation of the North Atlantic subpolar gyre (SPG). This has led to the perception that a stronger SPG also transports less salt westward. In this study, we analyze a regional ocean model and a comprehensive global coupled climate model, and show that a stronger SPG transports more salt in the Irminger Current irrespective of lower salinities in its source region. The additional salt converges in the Labrador Sea and the Irminger Basin through eddy transports, increases surface salinity in the western SPG, and favors more intense deep convection. This is part of a positive feedback mechanism with potentially large implications for climate variability and predictability.

Relevance: 10.00%

Abstract:

A rain-on-snow flood occurred in the Bernese Alps, Switzerland, on 10 October 2011, and caused significant damage. As the flood peak was not predicted by the flood forecast system, questions were raised concerning the causes and the predictability of the event. Here, we reconstruct the anatomy of this rain-on-snow flood in the Lötschen Valley (160 km²) by analyzing meteorological data from the synoptic to the local scale and by reproducing the flood peak with the hydrological model WaSiM-ETH (Water Flow and Balance Simulation Model), in order to gain process understanding and to evaluate predictability. The atmospheric drivers of this rain-on-snow flood were (i) sustained snowfall followed by (ii) the passage of an atmospheric river bringing warm and moist air towards the Alps. As a result, intensive rainfall (average of 100 mm day⁻¹) was accompanied by a temperature increase that shifted the 0 °C line from 1500 to 3200 m a.s.l. (meters above sea level) in 24 h, with a maximum increase of 9 K in 9 h. The south-facing slope of the valley received significantly more precipitation than the north-facing slope, leading to flooding only in tributaries along the south-facing slope. We hypothesized that the reason for this very local rainfall distribution was a cavity circulation combined with a seeder-feeder cloud system enhancing local rainfall and snowmelt along the south-facing slope. By applying and considerably recalibrating the standard hydrological model setup, we showed that both latent and sensible heat fluxes were needed to reconstruct the snow cover dynamics, and that locally high precipitation sums (160 mm in 12 h) were required to produce the estimated flood peak. However, to reproduce the rapid runoff responses during the event, we had to conceptually represent the likely lateral flow dynamics within the snow cover, causing the model to react "oversensitively" to meltwater. Driving the optimized model with COSMO-2 (Consortium for Small-scale Modeling) forecast data, we still failed to simulate the flood, because the COSMO-2 forecast data underestimated both the local precipitation peak and the temperature increase. We therefore conclude that this rain-on-snow flood was, in general, predictable, but required a special hydrological model setup and extensive, locally precise meteorological input data. Although this data quality may not be achieved with forecast data, an additional model with a specific rain-on-snow configuration can provide useful information when rain-on-snow events are likely to occur.
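To make the role of the turbulent fluxes concrete, here is a minimal sketch of a bulk-aerodynamic melt estimate under warm, windy, humid conditions like those described; the coefficients and inputs are illustrative assumptions, not the WaSiM-ETH formulation used in the study.

```python
RHO_AIR = 1.2        # air density (kg m^-3)
CP_AIR = 1005.0      # specific heat of air (J kg^-1 K^-1)
L_V = 2.5e6          # latent heat of vaporization (J kg^-1)
L_F = 3.34e5         # latent heat of fusion (J kg^-1)
C_E = 1.5e-3         # bulk transfer coefficient (dimensionless, assumed)

def turbulent_melt_rate(t_air_c, wind, q_air, q_surf):
    """Melt rate (mm w.e. per hour) from sensible + latent heat flux over melting snow."""
    h_sens = RHO_AIR * CP_AIR * C_E * wind * (t_air_c - 0.0)  # W m^-2, surface at 0 degC
    h_lat = RHO_AIR * L_V * C_E * wind * (q_air - q_surf)     # W m^-2, condensation if > 0
    melt_flux = max(h_sens + h_lat, 0.0)                      # energy available for melt
    return melt_flux / L_F * 3600.0   # kg m^-2 s^-1 equals mm w.e. s^-1; scaled per hour

# Warm, moist, windy air (the atmospheric-river situation described above):
print(f"{turbulent_melt_rate(9.0, 8.0, 7.0e-3, 3.8e-3):.1f} mm w.e. per hour")
```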

Relevance: 10.00%

Abstract:

Initialising the ocean internal variability for decadal predictability studies is a new area of research, and a variety of ad hoc methods are currently proposed. In this study, we explore how nudging with sea surface temperature (SST) and salinity (SSS) can reconstruct the three-dimensional variability of the ocean in a perfect model framework. This approach builds on the hypothesis that oceanic processes themselves will transport the surface information into the ocean interior, as seen in ocean-only simulations. Five nudged simulations are designed to reconstruct a 150-year "target" simulation, defined as a portion of a long control simulation. The nudged simulations differ in the variables restored to (SST or SST + SSS) and in the area where the nudging is applied. The strength of the heat flux feedback is diagnosed from observations, and the restoring coefficients for SSS use the same time-scale. We observed that this choice prevents spurious convection at high latitudes and near the sea-ice edge when nudging both SST and SSS. In the tropics, nudging the SST is enough to reconstruct the tropical atmosphere circulation and the associated dynamical and thermodynamical impacts on the underlying ocean. In the tropical Pacific Ocean, the temperature profiles show a significant correlation from the surface down to 2,000 m, due to dynamical adjustment of the isopycnals. At mid-to-high latitudes, SSS nudging is required to reconstruct both the temperature and the salinity below the seasonal thermocline. This is particularly true in the North Atlantic, where adding SSS nudging makes it possible to reconstruct the deep convection regions of the target. By initiating a previously documented 20-year cycle of the model, the SST + SSS nudging is also able to reproduce most of the AMOC variations, a key source of decadal predictability. Reconstruction at depth does not significantly improve with the amount of time spent nudging, and the efficiency of the surface nudging depends rather on the period/events considered. The joint SST + SSS nudging applied everywhere is the most efficient approach. It ensures that the right water masses are formed at the right surface density, with the subsequent circulation, subduction and deep convection further transporting them to depth. The results of this study underline the potential key role of SSS for decadal predictability and further make the case for sustained large-scale observations of this field.
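A minimal sketch of the restoring step such nudging implies, assuming (as the abstract describes) a relaxation coefficient derived from the observed heat flux feedback λ; for a mixed layer of depth h, the timescale is τ = ρ c_p h / λ. All values are illustrative.

```python
RHO_SEA = 1025.0      # seawater density (kg m^-3)
CP_SEA = 3990.0       # seawater specific heat (J kg^-1 K^-1)

def nudge(field, target, dt, h, lam):
    """One explicit restoring step: relax `field` toward `target` with timescale tau."""
    tau = RHO_SEA * CP_SEA * h / lam    # tau = rho * c_p * h / lambda, in seconds
    return field + dt * (target - field) / tau

# lambda = 40 W m^-2 K^-1 and a 50 m mixed layer give tau of about 60 days.
sst = 18.0
for _ in range(60):                     # 60 daily steps toward the observed SST
    sst = nudge(sst, target=19.0, dt=86400.0, h=50.0, lam=40.0)
print(f"SST after 60 days: {sst:.2f} degC")
```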

Relevance: 10.00%

Abstract:

Decadal and longer timescale variability in the winter North Atlantic Oscillation (NAO) has considerable impact on regional climate, yet it remains unclear what fraction of this variability is potentially predictable. This study takes a new approach to this question by demonstrating clear physical differences between NAO variability on interannual-decadal (<30 year) and multidecadal (>30 year) timescales. It is shown that on the shorter timescale the NAO is dominated by variations in the latitude of the North Atlantic jet and storm track, whereas on the longer timescale it represents changes in their strength instead. NAO variability on the two timescales is associated with different dynamical behaviour in terms of eddy-mean flow interaction, Rossby wave breaking and blocking. The two timescales also exhibit different regional impacts on temperature and precipitation and different relationships to sea surface temperatures. These results are derived from linear regression analysis of the Twentieth Century and NCEP-NCAR reanalyses and of a high-resolution HiGEM General Circulation Model control simulation, with additional analysis of a long sea level pressure reconstruction. Evidence is presented for an influence of the ocean circulation on the longer timescale variability of the NAO, which is particularly clear in the model data. As well as providing new evidence of potential predictability, these findings are shown to have implications for the reconstruction and interpretation of long climate records.
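A hedged sketch of the timescale separation described above, applied to a synthetic index: a fourth-order Butterworth low-pass filter at a 30-year cutoff splits a winter NAO record into a multidecadal component (jet strength, in the abstract's interpretation) and an interannual-decadal residual (jet latitude).

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
nao = rng.standard_normal(150)                 # one value per winter, 150 synthetic years

cutoff = 1.0 / 30.0                            # cycles per year: the 30-year boundary
b, a = signal.butter(4, cutoff, btype="low", fs=1.0)
nao_multidecadal = signal.filtfilt(b, a, nao)  # >30-year component
nao_interannual = nao - nao_multidecadal       # <30-year component
```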

Relevance: 10.00%

Abstract:

BACKGROUND In experimental animal studies, pulsing the CO2 laser beam has been shown to reduce the thermal damage zone of excised oral mucosal tissue. However, there is still controversy over whether this is borne out under clinical conditions. OBJECTIVE To compare the outcomes following excisional biopsies of fibrous hyperplasias using a pulsed (cf) versus a continuous wave (cw) CO2 laser mode with regard to the thermal damage zone, duration of surgery, intra- and postoperative complications, postoperative pain, and scarring and/or relapse during the initial 6 months. MATERIALS AND METHODS One hundred Swiss-resident patients with a fibrous hyperplasia of the buccal mucosa were randomly assigned to the cw mode (5 W) group or the cf mode (140 Hz, 400 microseconds, 33 mJ, 4.62 W) group. All excisions were performed by a single oral surgeon. Postoperative pain (2 weeks) was recorded on a visual analogue scale (VAS; ranging from 0 to 100). Intake of analgesics and postoperative complications were recorded in a standardized study form. The maximum width of the collateral thermal damage zone was measured (µm) in excision specimens by one pathologist. Intraoral photographs at the 6-month follow-up examinations were evaluated for scarring (yes/no). RESULTS Median duration of the excision was 65 seconds in the cw and 81 seconds in the cf group (P = 0.13). Intraoperative bleeding occurred in 16.3% of the patients in the cw group and 17.7% in the cf group. The median thermal damage zone was 161 (±228) µm in the cw and 152 (±105) µm in the cf group (P = 0.68). Reported postoperative complications included swelling in 19% and minor bleeding in 6%, without significant differences between the two laser modes. When comparing each day separately or the combined mean VAS scores of both groups for Days 1-3, 1-7, and 1-15, there were no significant differences. However, more patients in the cw group (25%) took analgesics than in the cf group (9.8%), a borderline-significant difference (P = 0.04). Scarring at the excision site was found in 50.6% of 77 patients after 6 months, and more scars were identified in cases treated with the cf mode (P = 0.03). CONCLUSIONS Excision of fibrous hyperplasias with a CO2 laser demonstrated a good clinical outcome and long-term predictability, with a low risk of recurrence regardless of the laser mode (cf or cw) used. Scarring after 6 months was seen in only 50.6% of the cases and was slightly more frequent in the cf mode group. Based on the findings of the present study, a safety border of 1 mm appears sufficient for both laser modes, especially when performing a biopsy of a suspicious soft tissue lesion, to ensure a proper histopathological examination.

Relevance: 10.00%

Abstract:

BACKGROUND After cardiac surgery with cardiopulmonary bypass (CPB), acquired coagulopathy often leads to post-CPB bleeding. Though multifactorial in origin, this coagulopathy is often aggravated by deficient fibrinogen levels. OBJECTIVE To assess whether laboratory and thrombelastometric testing on CPB can predict plasma fibrinogen immediately after CPB weaning. PATIENTS/METHODS This prospective study in 110 patients undergoing major cardiovascular surgery at risk of post-CPB bleeding compared fibrinogen level (Clauss method) and function (fibrin-specific thrombelastometry) in order to study the predictability of their course early after termination of CPB. Linear regression analysis and receiver operating characteristics were used to determine correlations and predictive accuracy. RESULTS Quantitative estimation of post-CPB Clauss fibrinogen from on-CPB fibrinogen was feasible with small bias (+0.19 g/l), but with poor precision and a percentage error >30%. A clinically useful alternative approach was developed by using the on-CPB FIBTEM A10 to predict a Clauss fibrinogen range of interest instead of a discrete level. An on-CPB A10 ≤10 mm identified patients with a post-CPB Clauss fibrinogen ≤1.5 g/l with a sensitivity of 0.99 and a positive predictive value of 0.60; it also identified patients without a post-CPB Clauss fibrinogen <2.0 g/l with a specificity of 0.83. CONCLUSIONS When measured on CPB prior to weaning, a FIBTEM A10 ≤10 mm is an early alert for post-CPB fibrinogen levels below or within the substitution range (1.5-2.0 g/l) recommended in case of post-CPB coagulopathic bleeding. This helps to minimize the delay to data-based hemostatic management after weaning from CPB.
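A minimal sketch (with synthetic values, not the study's data) of how the reported threshold performance is computed from paired measurements: classify patients by the on-CPB A10 cut-off and by the post-CPB fibrinogen level, then tabulate sensitivity, specificity, and positive predictive value.

```python
import numpy as np

def threshold_metrics(test_pos, cond_pos):
    """Diagnostic metrics from boolean arrays: test positive vs condition positive."""
    tp = np.sum(test_pos & cond_pos)
    fp = np.sum(test_pos & ~cond_pos)
    fn = np.sum(~test_pos & cond_pos)
    tn = np.sum(~test_pos & ~cond_pos)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
    }

rng = np.random.default_rng(4)
a10 = rng.normal(12.0, 4.0, 110)                       # synthetic on-CPB A10 values (mm)
fibrinogen = 0.18 * a10 + rng.normal(0.0, 0.3, 110)    # synthetic correlated Clauss levels
print(threshold_metrics(a10 <= 10.0, fibrinogen <= 1.5))
```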

Relevance: 10.00%

Abstract:

Initializing the ocean for decadal predictability studies is a challenge, as it requires reconstructing the little-observed subsurface trajectory of ocean variability. In this study we explore to what extent surface nudging using well-observed sea surface temperature (SST) can reconstruct the deeper ocean variations for the 1949–2005 period. An ensemble made with a nudged version of the IPSLCM5A model is compared to ocean reanalyses and reconstructed datasets. The SST is restored to observations using a physically based relaxation coefficient, in contrast to earlier studies, which use a much larger value. The assessment is restricted to the regions where the ocean reanalyses agree, i.e. the upper 500 m of the ocean, although this can be latitude and basin dependent. Significant reconstruction of the subsurface is achieved in specific regions, namely the subduction region of the subtropical Atlantic, below the thermocline in the equatorial Pacific and, in some cases, in the North Atlantic deep convection regions. Beyond the mean correlations, ocean integrals are used to explore the time evolution of the correlation over 20-year windows. Classical fixed-depth heat content diagnostics do not exhibit any significant reconstruction between the different existing observation-based references and can therefore not be used to assess global average time-varying correlations in the nudged simulations. Using the physically based average temperature above an isotherm (14°C) alleviates this issue in the tropics and subtropics and shows significant reconstruction of these quantities in the nudged simulations for several decades. This skill is attributed to the wind stress reconstruction in the tropics, as already demonstrated in a perfect model study using the same model. We thus also show here the robustness of this result in a historical and observational context.
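A minimal sketch of the isotherm-based diagnostic named above, on a single synthetic profile: average the temperature of the water column lying above the 14°C isotherm, weighting by layer thickness.

```python
import numpy as np

depth = np.arange(5.0, 2000.0, 10.0)        # layer mid-depths (m), synthetic grid
temp = 25.0 * np.exp(-depth / 400.0) + 2.0  # synthetic temperature profile (degC)

above = temp > 14.0                          # water warmer than the chosen isotherm
thickness = np.full_like(depth, 10.0)        # uniform 10 m layers in this sketch
t_above_iso = np.sum(temp[above] * thickness[above]) / np.sum(thickness[above])
print(f"mean T above the 14 degC isotherm: {t_above_iso:.2f} degC")
```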