38 results for the SIMPLE algorithm


Relevance:

90.00%

Publisher:

Abstract:

The purpose of this work was to study and quantify the differences in dose distributions computed with some of the newest dose calculation algorithms available in commercial planning systems. The study was done for clinical cases originally calculated with pencil beam convolution (PBC) where large density inhomogeneities were present. Three other dose algorithms were used: a pencil-beam-like algorithm, the anisotropic analytical algorithm (AAA); a convolution/superposition algorithm, collapsed cone convolution (CCC); and a Monte Carlo program, voxel Monte Carlo (VMC++). The dose calculation algorithms were compared under static field irradiations at 6 MV and 15 MV using multileaf collimators and hard wedges where necessary. Five clinical cases were studied: three lung and two breast cases. We found that, in terms of accuracy, the CCC algorithm performed better overall than AAA when compared against VMC++, but AAA remains an attractive option for routine use in the clinic due to its short computation times. Dose differences between the different algorithms and VMC++ for the median dose of the planning target volume (PTV) were typically 0.4% (range: 0.0 to 1.4%) in the lung and -1.3% (range: -2.1 to -0.6%) in the breast for the few cases we analysed. As expected, PTV coverage and dose homogeneity turned out to be more critical in the lung cases than in the breast cases with respect to the accuracy of the dose calculation. This was observed in the dose volume histograms obtained from the Monte Carlo simulations.
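
The headline figures above are relative differences of the median dose to the PTV between each algorithm and the VMC++ reference. A minimal sketch of that bookkeeping is shown below; the dose grids, the PTV mask and the toy 0.4% offset are hypothetical placeholders, not data from the study.

```python
import numpy as np

def median_ptv_dose(dose_grid, ptv_mask):
    """Median dose over the voxels belonging to the PTV."""
    return float(np.median(dose_grid[ptv_mask]))

def relative_difference(dose_algo, dose_reference, ptv_mask):
    """Percentage difference of the median PTV dose versus the reference
    calculation (here standing in for the Monte Carlo result)."""
    d_algo = median_ptv_dose(dose_algo, ptv_mask)
    d_ref = median_ptv_dose(dose_reference, ptv_mask)
    return 100.0 * (d_algo - d_ref) / d_ref

# Hypothetical example: small random dose grids standing in for AAA/VMC++ output.
rng = np.random.default_rng(0)
ptv_mask = np.zeros((20, 20, 20), dtype=bool)
ptv_mask[5:15, 5:15, 5:15] = True
dose_vmc = rng.normal(60.0, 1.0, ptv_mask.shape)   # reference grid (Gy)
dose_aaa = dose_vmc * 1.004                         # toy 0.4% offset
print(f"AAA vs VMC++: {relative_difference(dose_aaa, dose_vmc, ptv_mask):+.2f}%")
```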

Relevance:

90.00%

Publisher:

Abstract:

The modified American College of Cardiology/American Heart Association (ACC/AHA) lesion morphology classification scheme has prognostic impact for early and late outcomes when bare-metal stents are used. Its value after drug-eluting stent placement is unknown. The predictive value of this lesion morphology classification system was prospectively examined in patients treated with sirolimus-eluting stents included in the German Cypher Registry. The study population included 6,755 patients treated for 7,960 lesions using sirolimus-eluting stents. Lesions were classified as type A, B1, B2, or C. Lesion type A or B1 was considered simple (35.1%) and type B2 or C complex (64.9%). At initial hospital discharge, the combined end point of all deaths, myocardial infarction, or target vessel revascularization was seen in 2.6% versus 2.4% of the complex and simple groups, respectively (p = 0.62), with a trend towards higher rates of myocardial infarction in the complex group. At the 6-month clinical follow-up and after adjusting for other independent factors, the composite of cumulative death, myocardial infarction, and target vessel revascularization was not significantly different between groups (11.4% vs 11.2% in the complex and simple groups, respectively; odds ratio 1.08, 95% confidence interval 0.80 to 1.46). This was also true for target vessel revascularization alone (8.3% of the complex group, 9.0% of the simple group; odds ratio 0.87, 95% confidence interval 0.72 to 1.05). In conclusion, the modified ACC/AHA lesion morphology classification system has some value in determining early complications after sirolimus-eluting stent implantation. Clinical follow-up results at 6 months were generally favorable and cannot be adequately differentiated on the basis of this lesion morphology classification scheme.

Relevance:

90.00%

Publisher:

Abstract:

The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represent, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed in order to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve is used to approximate the integrated data set. A single parameter allows the user to control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding algorithms using linear and cubic interpolation functions. It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of x-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
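
The central idea described above, interpolating the integrated (cumulative) data with a shape-controlled Hermite curve and differencing it on the new grid so that the integral is conserved and overshoot is suppressed, can be illustrated in one dimension as follows. This is only a sketch of the principle: it uses SciPy's monotone Hermite interpolator (PCHIP) as the fully overshoot-free limiting case and does not reproduce the paper's single tuning parameter.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def rebin_conservative(edges_old, values_old, edges_new):
    """Re-bin histogrammed data onto a new grid while conserving the integral.

    The bin contents are integrated to a cumulative curve, the curve is
    interpolated with a monotone Hermite spline (no overshoot, no negative
    values for positive-definite data), and the new bin contents are obtained
    by differencing the interpolant at the new bin edges.
    """
    widths_old = np.diff(edges_old)
    cumulative = np.concatenate(([0.0], np.cumsum(values_old * widths_old)))
    spline = PchipInterpolator(edges_old, cumulative)
    integral_new = np.diff(spline(edges_new))
    return integral_new / np.diff(edges_new)

# Hypothetical example: a coarse histogram re-sampled to a four times finer grid.
edges_old = np.linspace(0.0, 10.0, 11)
values_old = np.array([0., 1., 4., 9., 5., 2., 2., 8., 3., 1.])
edges_new = np.linspace(0.0, 10.0, 41)
values_new = rebin_conservative(edges_old, values_old, edges_new)

# The total integral is conserved up to floating-point error.
print(np.sum(values_old * np.diff(edges_old)), np.sum(values_new * np.diff(edges_new)))
```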

Relevance:

90.00%

Publisher:

Abstract:

An important problem in unsupervised data clustering is how to determine the number of clusters. Here we investigate how this can be achieved in an automated way by using interrelation matrices of multivariate time series. Two nonparametric and purely data-driven algorithms are expounded and compared. The first exploits the eigenvalue spectra of surrogate data, while the second employs the eigenvector components of the interrelation matrix. Compared to the first algorithm, the second approach is computationally faster and not limited to linear interrelation measures.
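
A minimal sketch of the first, surrogate-based approach is given below: eigenvalues of the interrelation (here: correlation) matrix are compared against those obtained from surrogate data in which the interrelations have been destroyed, and the number of eigenvalues exceeding the surrogate range is taken as the number of clusters. The shuffle surrogates, the 99% quantile threshold and the synthetic two-cluster data are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def n_clusters_from_surrogates(X, n_surrogates=200, quantile=0.99, seed=0):
    """Estimate the number of clusters in multivariate time series X (channels x samples).

    Eigenvalues of the correlation matrix that exceed the chosen quantile of the
    largest surrogate eigenvalue are counted as genuine cluster modes.
    """
    rng = np.random.default_rng(seed)
    eigvals = np.linalg.eigvalsh(np.corrcoef(X))
    surrogate_max = []
    for _ in range(n_surrogates):
        # Independently shuffling each channel destroys the interrelations
        # while keeping the individual amplitude distributions.
        Xs = np.array([rng.permutation(row) for row in X])
        surrogate_max.append(np.linalg.eigvalsh(np.corrcoef(Xs)).max())
    threshold = np.quantile(surrogate_max, quantile)
    return int(np.sum(eigvals > threshold))

# Hypothetical example: two clusters of correlated channels plus noise.
rng = np.random.default_rng(1)
drivers = rng.normal(size=(2, 2000))                # two hidden cluster drivers
X = np.vstack([drivers[0] + 0.5 * rng.normal(size=(5, 2000)),
               drivers[1] + 0.5 * rng.normal(size=(5, 2000))])
print(n_clusters_from_surrogates(X))                # expected: 2
```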

Relevance:

90.00%

Publisher:

Abstract:

In terms of atmospheric impact, the volcanic eruption of Mt. Pinatubo (1991) is the best characterized large eruption on record. We investigate here the model-derived stratospheric warming following the Pinatubo eruption, using aerosol forcing derived from SAGE II extinction data that include recent improvements in the processing algorithm. This method, termed SAGE_4λ, makes use of the four wavelengths (385, 452, 525 and 1024 nm) of the SAGE II data when available, and uses a data-filling procedure in the opacity-induced "gap" regions. Using SAGE_4λ, we derived aerosol size distributions that properly reproduce extinction coefficients also at much longer wavelengths. This provides a good basis for calculating the absorption of terrestrial infrared radiation and the resulting stratospheric heating. However, we also show that the use of this data set in a global chemistry–climate model (CCM) still leads to stronger aerosol-induced stratospheric heating than observed, with temperatures in places even higher than the already excessive values found by many models in recent general circulation model (GCM) and CCM intercomparisons. This suggests that the overestimation of the stratospheric warming after the Pinatubo eruption cannot be ascribed to an insufficient observational database, but rather to the use of outdated data sets, to deficiencies in the implementation of the forcing data, or to radiative or dynamical model artifacts. Conversely, the SAGE_4λ approach reduces the infrared absorption in the tropical tropopause region, resulting in significantly better agreement with the post-volcanic temperature record at these altitudes.

Relevance:

90.00%

Publisher:

Abstract:

In contrast to the treatment of avulsion lesions of the anterior cruciate ligament (ACL), the management of intrasubstance ACL tears in the skeletally immature patient remains controversial. Prospective studies have shown that conservative treatment results in severe instability with concomitant intraarticular damage and poor function of the knee. Reconstruction of a torn ACL always carries the risk of damaging the open growth plates and thereby affecting the longitudinal or axial growth of the lower extremity on either the femoral or the tibial side. Several surgical procedures are therefore available that aim to prevent the adverse events mentioned above. The purpose of this study is to review the recent literature regarding the treatment algorithm for ACL injuries in skeletally immature patients. This review will (1) investigate the indications for ACL surgery in children; (2) determine whether a surgical procedure is clinically superior in skeletally immature patients; and (3) correlate the adverse events with the surgical technique.

Relevance:

90.00%

Publisher:

Abstract:

The current article presents a novel physiological control algorithm for ventricular assist devices (VADs), which is inspired by the preload recruitable stroke work. This controller adapts the hydraulic power output of the VAD to the end-diastolic volume of the left ventricle. We tested this controller on a hybrid mock circulation where the left ventricular volume (LVV) is known, i.e., the problem of measuring the LVV is not addressed in the current article. Experiments were conducted to compare the response of the controller with the physiological and with the pathological circulation, with and without VAD support. A sensitivity analysis was performed to analyze the influence of the controller parameters and the influence of the quality of the LVV signal on the performance of the control algorithm. The results show that the controller induces a response similar to the physiological circulation and effectively prevents over- and underpumping, i.e., ventricular suction and backflow from the aorta to the left ventricle, respectively. The same results are obtained in the case of a disturbed LVV signal. The results presented in the current article motivate the development of a robust, long-term stable sensor to measure the LVV.
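
The abstract does not spell out the control law, but the stated principle, adapting the pump's hydraulic power output to the end-diastolic volume of the left ventricle, can be caricatured with a simple preload-responsive rule. The sketch below is purely hypothetical: the linear relation, the gains and the saturation limits are made-up values, not the controller of the paper.

```python
from dataclasses import dataclass

@dataclass
class PreloadResponsiveController:
    """Hypothetical VAD controller: pump power follows end-diastolic volume (EDV),
    loosely mimicking the preload recruitable stroke work relation, in which
    stroke work rises roughly linearly with EDV above a volume intercept."""
    gain_w_per_ml: float = 0.02      # made-up gain
    edv_intercept_ml: float = 60.0   # made-up volume intercept
    power_min_w: float = 0.2         # lower bound (underpumping risks aortic backflow)
    power_max_w: float = 3.0         # upper bound (overpumping risks ventricular suction)

    def power_setpoint(self, edv_ml: float) -> float:
        """Hydraulic power setpoint for the coming cardiac cycle."""
        power = self.gain_w_per_ml * (edv_ml - self.edv_intercept_ml)
        return min(max(power, self.power_min_w), self.power_max_w)

controller = PreloadResponsiveController()
for edv in (80.0, 120.0, 180.0):   # low, normal and congested preload (ml)
    print(edv, controller.power_setpoint(edv))
```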

Relevance:

90.00%

Publisher:

Abstract:

PURPOSE To extend the capabilities of the Cone Location and Magnitude Index algorithm to include a combination of topographic information from the anterior and posterior corneal surfaces and corneal thickness measurements, in order to further improve our ability to correctly identify keratoconus using this new index: ConeLocationMagnitudeIndex_X. DESIGN Retrospective case-control study. METHODS Three independent data sets were analyzed: 1 development and 2 validation. The AnteriorCornealPower index was calculated to stratify the keratoconus data from mild to severe. The ConeLocationMagnitudeIndex algorithm was applied to all tomography data collected using a dual Scheimpflug-Placido-based tomographer. The ConeLocationMagnitudeIndex_X formula, resulting from analysis of the Development set, was used to determine the logistic regression model that best separates keratoconus from normal and was applied to all data sets to calculate PercentProbabilityKeratoconus_X. The sensitivity/specificity of PercentProbabilityKeratoconus_X was compared with that of the original PercentProbabilityKeratoconus, which uses only anterior axial data. RESULTS The AnteriorCornealPower severity distribution for the combined data sets is 136 mild, 12 moderate, and 7 severe. The logistic regression model generated for ConeLocationMagnitudeIndex_X produces complete separation for the Development set. Validation Set 1 has 1 false-negative and Validation Set 2 has 1 false-positive. The overall sensitivity/specificity results for the logistic model produced using the ConeLocationMagnitudeIndex_X algorithm are 99.4% and 99.6%, respectively. The overall sensitivity/specificity results for the original ConeLocationMagnitudeIndex algorithm are 89.2% and 98.8%, respectively. CONCLUSIONS ConeLocationMagnitudeIndex_X provides a robust index that can detect the presence or absence of a keratoconic pattern in corneal tomography maps with improved sensitivity/specificity over the original anterior surface-only ConeLocationMagnitudeIndex algorithm.
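
At its core, such an index is a logistic regression that maps tomography-derived features to a probability of keratoconus, which is then thresholded to obtain sensitivity and specificity. A generic sketch of that final step is shown below; the feature names, coefficients, intercept and 50% cut-off are placeholders, not the published ConeLocationMagnitudeIndex_X formula.

```python
import numpy as np

def percent_probability_keratoconus(features, coefficients, intercept):
    """Logistic model: probability (in %) that a map shows a keratoconic pattern.

    features and coefficients are 1-D arrays of equal length; placeholder features
    might be anterior cone magnitude, posterior cone magnitude and thinnest pachymetry.
    """
    z = intercept + np.dot(coefficients, features)
    return 100.0 / (1.0 + np.exp(-z))

def sensitivity_specificity(probabilities, is_keratoconus, cutoff=50.0):
    """Sensitivity and specificity of the thresholded probabilities against labels."""
    probabilities = np.asarray(probabilities)
    is_keratoconus = np.asarray(is_keratoconus, dtype=bool)
    predicted = probabilities >= cutoff
    return np.mean(predicted[is_keratoconus]), np.mean(~predicted[~is_keratoconus])

# Hypothetical example with made-up coefficients and two eyes.
coeffs, intercept = np.array([1.2, 0.9, -0.01]), -3.0
normal_eye = np.array([0.5, 0.4, 540.0])   # low cone magnitudes, normal thickness
kc_eye = np.array([4.0, 3.5, 460.0])       # pronounced cone, thin cornea
print(percent_probability_keratoconus(normal_eye, coeffs, intercept))
print(percent_probability_keratoconus(kc_eye, coeffs, intercept))
print(sensitivity_specificity([2.1, 97.3, 64.0, 0.4], [False, True, True, False]))
```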

Relevance:

90.00%

Publisher:

Abstract:

This paper considers a framework where data from correlated sources are transmitted with the help of network coding in ad hoc network topologies. The correlated data are encoded independently at the sensors, and network coding is employed in the intermediate nodes in order to improve the data delivery performance. In such settings, we focus on the problem of reconstructing the sources at the decoder when perfect decoding is not possible due to losses or bandwidth variations. We show that the source data similarity can be used at the decoder to permit decoding based on a novel and simple approximate decoding scheme. We analyze the influence of the network coding parameters, and in particular the size of the finite coding fields, on the decoding performance. We further determine the optimal field size that maximizes the expected decoding performance as a trade-off between the information loss incurred by limiting the resolution of the source data and the error probability in the reconstructed data. Moreover, we show that the performance of the approximate decoding improves when the accuracy of the source model increases, even with simple approximate decoding techniques. We provide illustrative examples showing how the proposed algorithm can be deployed in sensor networks and distributed imaging applications.
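
The field-size trade-off described above can be illustrated with a toy expected-distortion calculation: a larger field GF(2^m) means finer source resolution (smaller quantization loss) but, under a fixed bit-error model, a higher chance that a coded packet cannot be decoded exactly. The error model, the failure penalty and all numbers below are illustrative assumptions only, not the analysis of the paper.

```python
import numpy as np

def expected_distortion(field_bits, source_range=1.0, bit_error_rate=1e-4,
                        symbols_per_packet=64, failure_penalty=0.25):
    """Toy trade-off: quantization error falls with the field size GF(2^m),
    while the probability that at least one symbol of a packet is corrupted
    (forcing a coarse fallback reconstruction) rises with the symbol length m."""
    q = 2 ** field_bits                                   # field size
    quantization_mse = (source_range / q) ** 2 / 12.0     # uniform quantizer MSE
    p_symbol_error = bit_error_rate * field_bits          # crude per-symbol error model
    p_failure = 1.0 - (1.0 - p_symbol_error) ** symbols_per_packet
    return (1.0 - p_failure) * quantization_mse + p_failure * failure_penalty

bits = np.arange(2, 13)
distortion = [expected_distortion(m) for m in bits]
best = bits[int(np.argmin(distortion))]
print(f"toy-optimal field size: GF(2^{best})")
```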

Relevance:

90.00%

Publisher:

Abstract:

The two major subtypes of diffuse large B-cell lymphoma (DLBCL), germinal centre B-cell-like (GCB-DLBCL) and activated B-cell-like (ABC-DLBCL), are defined by means of gene expression profiling (GEP). Patients with GCB-DLBCL survive longer with the current standard regimen R-CHOP than patients with ABC-DLBCL. As GEP is not part of the current routine diagnostic work-up, efforts have been made to find a substitute that involves immunohistochemistry (IHC). Various algorithms achieved this with 80-90% accuracy. However, conflicting results on the appropriateness of IHC have been reported. Because it is likely that the molecular subtypes will play a role in future clinical practice, we assessed the determination of the molecular DLBCL subtypes by means of IHC at our University Hospital, and some aspects of this determination elsewhere in Switzerland. The most frequently used Hans algorithm includes three antibodies (against CD10, bcl-6 and MUM1). From records of the routine diagnostic work-up, we identified 51 of 172 (29.7%) newly diagnosed and treated DLBCL cases from 2005 until 2010 with an assigned DLBCL subtype. DLBCL subtype information was expanded by means of tissue microarray analysis. The outcome for patients with the GCB subtype was significantly better than for those with the non-GC subtype, independent of the age-adjusted International Prognostic Index. We found a lack of standardisation in the subtype determination by means of IHC in Switzerland and significant problems of reproducibility. We conclude that the Hans algorithm performs well in our hands and that awareness of this important matter is increasing. However, outside clinical trials, vigorous efforts to standardise IHC determination are needed as DLBCL subtype-specific therapies emerge.
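
For reference, the Hans algorithm mentioned here is a short decision tree over the three immunostains, each scored positive at a 30% cut-off: CD10-positive cases are called GCB; CD10-negative, bcl-6-negative cases non-GC; and CD10-negative, bcl-6-positive cases are split by MUM1 (MUM1-negative gives GCB, MUM1-positive gives non-GC). A direct transcription is sketched below; the function and argument names are our own.

```python
def hans_subtype(cd10_percent, bcl6_percent, mum1_percent, cutoff=30.0):
    """Hans algorithm for assigning a DLBCL subtype from immunohistochemistry.

    Each marker is scored positive when at least `cutoff` percent of tumour
    cells stain (30% in the original publication). Returns 'GCB' or 'non-GC'.
    """
    cd10 = cd10_percent >= cutoff
    bcl6 = bcl6_percent >= cutoff
    mum1 = mum1_percent >= cutoff

    if cd10:
        return "GCB"
    if not bcl6:
        return "non-GC"
    return "non-GC" if mum1 else "GCB"

# Examples with hypothetical staining percentages:
print(hans_subtype(80, 10, 5))    # CD10+                 -> GCB
print(hans_subtype(0, 60, 70))    # CD10-, bcl-6+, MUM1+  -> non-GC
print(hans_subtype(0, 60, 10))    # CD10-, bcl-6+, MUM1-  -> GCB
```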

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND Areal bone mineral density is predictive of fracture risk. Microstructural bone parameters evaluated at the appendicular skeleton by high-resolution peripheral quantitative computed tomography (HR-pQCT) display differences between healthy patients and fracture patients. With the simple geometry of the cortex at the distal tibial diaphysis, a cortical index of the tibia combining material and mechanical properties correlated highly with bone strength ex vivo. The trabecular bone score derived from the scan of the lumbar spine by dual-energy X-ray absorptiometry (DXA) correlated ex vivo with the microarchitectural parameters. It is unknown whether these microstructural correlations also hold in healthy premenopausal women. METHODS Randomly selected women between 20 and 40 years of age were examined by DXA and HR-pQCT at the standard regions of interest and at customized subregions to focus on cortical and trabecular parameters of strength separately. For cortical strength, the volumetric cortical index at the distal tibia was calculated directly from HR-pQCT, and the areal cortical index was derived from the DXA scan using a Canny threshold-based tool. For trabecular strength, the trabecular bone score was calculated based on the DXA scan of the lumbar spine and was compared with the corresponding parameters derived from the HR-pQCT measurements at the radius and tibia. RESULTS Seventy-two healthy women were included (average age 33.8 years, average BMI 23.2 kg/m²). The areal cortical index correlated highly with the volumetric cortical index at the distal tibia (R = 0.798). The trabecular bone score correlated moderately with the microstructural parameters of the trabecular bone. CONCLUSION This study in randomly selected premenopausal women demonstrated that microstructural parameters of the bone evaluated by HR-pQCT correlated with DXA-derived parameters of skeletal regions containing predominantly cortical or cancellous bone. Whether these indices are suitable for better prediction of fracture risk deserves further investigation.

Relevance:

90.00%

Publisher:

Abstract:

Is the online trade in second-hand products changing individual consumer behaviour? What is the sustainability potential of this activity? How can daily energy-consuming routines at the workplace be changed? Do major changes in the course of people's lives represent opportunities to modify their consumer behaviour towards greater sustainability? These are only some of the research questions studied in the focal topic "From Knowledge to Action - New Paths towards Sustainable Consumption", which is funded by the German Federal Ministry of Education and Research (BMBF) as part of the "Social-ecological Research Programme" (SÖF). This book gives an insight into the research results of the ten project groups. Their diversity highlights that there is much more to "sustainable consumption" than the simple purchase of organic or fair-trade products. In addition, overarching conceptual and normative issues were treated across the project groups of the focal topic. The results of this synthesis process, developed collaboratively and moderated by the accompanying research project, are also presented here: for example, how the sustainability of individual consumer behaviour can be evaluated, or which theories of action are particularly useful for specific consumer behaviour phenomena.

Relevance:

90.00%

Publisher:

Abstract:

The artificial pancreas is at the forefront of research towards automatic insulin infusion for patients with type 1 diabetes. Due to the high inter- and intra-patient variability of the diabetic population, the need for personalized approaches has been raised. This study presents an adaptive, patient-specific control strategy for glucose regulation based on reinforcement learning and, more specifically, on the Actor-Critic (AC) learning approach. The control algorithm provides daily updates of the basal rate and insulin-to-carbohydrate (IC) ratio in order to optimize glucose regulation. A method for the automatic and personalized initialization of the control algorithm is designed based on the estimation of the transfer entropy (TE) between insulin and glucose signals. The algorithm has been evaluated in silico in adults, adolescents and children for 10 days. Three scenarios of initialization, to i) zero values, ii) random values and iii) TE-based values, have been comparatively assessed. The results have shown that when the TE-based initialization is used, the algorithm achieves faster learning, with 98%, 90% and 73% in the A+B zones of the Control Variability Grid Analysis for adults, adolescents and children, respectively, after five days, compared with 95%, 78% and 41% for random initialization and 93%, 88% and 41% for zero initial values. Furthermore, in the case of children, the daily Low Blood Glucose Index decreases much faster when the TE-based tuning is applied. The results imply that automatic and personalized tuning based on TE reduces the learning period and improves the overall performance of the AC algorithm.
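
Transfer entropy from insulin to glucose, used above to personalise the initial controller parameters, measures how much the insulin history reduces the uncertainty about the next glucose value beyond what the glucose history already explains. A minimal histogram-based estimator with one-sample histories is sketched below; the binning, the single-sample lag and the synthetic signals are illustrative choices, not those of the study.

```python
import numpy as np

def transfer_entropy(source, target, n_bins=8):
    """Histogram estimate of transfer entropy source -> target (in bits), with
    one-sample histories: TE = sum p(t1, t0, s0) * log2[ p(t1 | t0, s0) / p(t1 | t0) ]."""
    def discretise(x):
        # Equally populated bins, labelled 0 .. n_bins-1.
        edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
        return np.digitize(np.asarray(x, dtype=float), edges)

    s, t = discretise(source), discretise(target)
    t_next, t_now, s_now = t[1:], t[:-1], s[:-1]
    bin_edges = [np.arange(n_bins + 1) - 0.5] * 3
    joint, _ = np.histogramdd(np.stack([t_next, t_now, s_now], axis=1), bins=bin_edges)

    p_xyz = joint / joint.sum()
    p_yz = p_xyz.sum(axis=0, keepdims=True)       # p(t0, s0)
    p_xy = p_xyz.sum(axis=2, keepdims=True)       # p(t1, t0)
    p_y = p_xyz.sum(axis=(0, 2), keepdims=True)   # p(t0)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p_xyz > 0, p_xyz * np.log2(p_xyz * p_y / (p_xy * p_yz)), 0.0)
    return float(np.nansum(terms))

# Hypothetical example: "glucose" partly driven by lagged "insulin", but not vice versa.
rng = np.random.default_rng(0)
insulin = rng.normal(size=5000)
glucose = np.zeros(5000)
for k in range(1, 5000):
    glucose[k] = 0.8 * glucose[k - 1] - 0.5 * insulin[k - 1] + 0.2 * rng.normal()
print(transfer_entropy(insulin, glucose))   # clearly positive
print(transfer_entropy(glucose, insulin))   # close to zero
```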

Relevance:

90.00%

Publisher:

Abstract:

Demographic composition and dynamics of animal and human populations are important determinants of the transmission dynamics of infectious disease and of the effect of infectious disease or environmental disasters on productivity. In many circumstances, demographic data are not available or are of poor quality. Since 1999, Switzerland has been recording cattle movements, births, deaths and slaughter in an animal movement database (AMD). The data present in the AMD offer the opportunity to analyse and understand the dynamics of the Swiss cattle population. A dynamic population model can serve as a building block for future disease transmission models and help policy makers in developing strategies regarding animal health, animal welfare, livestock management and productivity. The Swiss cattle population was therefore modelled using a system of ordinary differential equations. The model was stratified by production type (dairy or beef), age and sex (male and female calves: 0-1 year, heifers and young bulls: 1-2 years, cows and bulls: older than 2 years). The simulation of the Swiss cattle population reflects the observed pattern accurately. Parameters were optimized on the basis of the goodness of fit (using the Powell algorithm). The fitted rates were compared with calculated rates from the AMD and differed only marginally. This gives confidence in the fitted rates for parameters that are not directly deducible from the AMD (e.g. the proportion of calves that are moved from the dairy system to fattening plants).
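
The modelling approach, a stage-structured system of ordinary differential equations whose rates are tuned by minimising a goodness-of-fit criterion with Powell's method, can be sketched as follows. The three-compartment structure, the rates and the 'observed' numbers are made up for illustration; the actual Swiss model is additionally stratified by production type and sex.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

def cattle_ode(t, y, birth_rate, aging1, aging2, removal_rate):
    """Toy stage-structured herd model: calves (0-1 y), young stock (1-2 y), adults (>2 y)."""
    calves, young, adults = y
    d_calves = birth_rate * adults - aging1 * calves
    d_young = aging1 * calves - aging2 * young
    d_adults = aging2 * young - removal_rate * adults   # deaths and slaughter combined
    return [d_calves, d_young, d_adults]

def simulate(params, y0, t_eval):
    sol = solve_ivp(cattle_ode, (t_eval[0], t_eval[-1]), y0,
                    t_eval=t_eval, args=tuple(params), rtol=1e-8)
    return sol.y

def goodness_of_fit(params, y0, t_eval, observed):
    """Sum of squared deviations between simulated and 'observed' herd sizes."""
    return float(np.sum((simulate(params, y0, t_eval) - observed) ** 2))

# Hypothetical 'observed' data generated from known rates, then refitted with Powell's method.
true_params = [0.45, 1.0, 1.0, 0.35]
y0 = [250.0, 240.0, 700.0]               # thousands of animals
t_eval = np.linspace(0.0, 10.0, 21)      # years
observed = simulate(true_params, y0, t_eval)

fit = minimize(goodness_of_fit, x0=[0.3, 0.8, 0.8, 0.2],
               args=(y0, t_eval, observed), method="Powell")
print(np.round(fit.x, 3))                # should land close to true_params
```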

Relevance:

90.00%

Publisher:

Abstract:

Background: In an artificial pancreas (AP), meals are either manually announced, or detected and their size estimated from the blood glucose level. Both methods have limitations, which result in suboptimal postprandial glucose control. The GoCARB system is designed to provide the carbohydrate content of meals and is presented within the AP framework. Method: The combined use of GoCARB with a control algorithm is assessed in a series of 12 computer simulations. The simulations are defined according to the type of control (open or closed loop), the use or non-use of GoCARB, and the patients' skill in carbohydrate estimation. Results: For poor estimators without GoCARB, the percentage of time spent in the target range (70-180 mg/dl) during the postprandial period is 22.5% and 66.2% for open and closed loop, respectively. When GoCARB is used, the corresponding percentages are 99.7% and 99.8%. In the case of open loop, the time spent in severe hypoglycemia (<50 mg/dl) is 33.6% without GoCARB and is reduced to 0.0% when GoCARB is used. In the case of closed loop, the corresponding percentage is 1.4% without GoCARB and is reduced to 0.0% with GoCARB. Conclusion: The use of GoCARB improves the control of the postprandial response and the glucose profiles, especially in the case of open loop. However, the most efficient regulation is achieved by the combined use of the control algorithm and GoCARB.
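
The outcome metrics quoted here are fractions of simulated postprandial time spent inside fixed glucose bands. A small helper along those lines is sketched below on a made-up glucose trace; the thresholds follow the ranges named in the abstract, and everything else is illustrative.

```python
import numpy as np

def time_in_ranges(glucose_mg_dl):
    """Percentage of samples in the target range (70-180 mg/dl) and in severe
    hypoglycemia (<50 mg/dl), assuming uniformly spaced samples."""
    g = np.asarray(glucose_mg_dl, dtype=float)
    in_target = np.mean((g >= 70.0) & (g <= 180.0))
    severe_hypo = np.mean(g < 50.0)
    return 100.0 * in_target, 100.0 * severe_hypo

# Hypothetical 5-minute samples over a 4-hour postprandial window.
glucose = np.concatenate([np.linspace(110, 210, 24), np.linspace(210, 95, 24)])
target_pct, hypo_pct = time_in_ranges(glucose)
print(f"time in 70-180 mg/dl: {target_pct:.1f}%, time < 50 mg/dl: {hypo_pct:.1f}%")
```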