995 results for sequential methods


Relevance: 30.00%

Abstract:

BACKGROUND: Cyclophosphamide and high-dose steroids have been used as limited induction therapy in progressive IgA nephropathy (IgAN) to reduce the loss of renal function and proteinuria. We evaluated the effect of cyclophosphamide pulses (CyP) and mycophenolic acid (MPA) as sequential therapy on renal function in patients with progressive IgAN. METHODS: Twenty patients with progressive IgAN and advanced renal failure (median GFR 22 ml/min per 1.73 m2) and further disease activity (ΔGFR −0.8 ml/min per month) after cyclophosphamide (CyP; n = 18) or steroid pulse therapy (n = 2) were treated with mycophenolate mofetil 1 g per day for a median of 27 months. RESULTS: In linear regression analysis, the monthly loss of renal function was significantly reduced from −2.4 ml/min before CyP to −0.12 ml/min with CyP/MPA (p = 0.0009). Estimated renal survival time was significantly prolonged, by a median of 65 months (p = 0.0014). Proteinuria decreased significantly from 1.7 to 0.4 g/l during MPA treatment (p = 0.015). In Cox regression analysis, only proteinuria >1.0 g/l was an independent risk factor for doubling of creatinine during CyP/MPA treatment (p = 0.03). CONCLUSION: Sequential therapy with CyP/MPA may arrest or slow the loss of renal function and reduce proteinuria, even in patients with progressive IgAN who have passed the so-called 'point of no return'.
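The monthly loss of renal function quoted above is the slope of an ordinary least-squares line fitted to serial GFR measurements. A minimal sketch of that calculation; the follow-up values below are invented for illustration, chosen so the slope lands near the reported −0.12 ml/min per month:

```python
import numpy as np

# Hypothetical serial GFR measurements (ml/min per 1.73 m^2), recorded
# monthly; all values invented for illustration.
months = np.array([0, 1, 2, 3, 4, 5], dtype=float)
gfr = np.array([22.0, 21.9, 21.7, 21.6, 21.5, 21.4])

# Ordinary least-squares line: the slope is the monthly change in GFR.
slope, intercept = np.polyfit(months, gfr, 1)
print(f"GFR slope: {slope:.2f} ml/min per month")  # prints: GFR slope: -0.12 ml/min per month
```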

Relevance: 30.00%

Abstract:

Kriging-based optimization relying on noisy evaluations of complex systems has recently motivated contributions from various research communities. Five strategies have been implemented in the DiceOptim package. The corresponding functions constitute a user-friendly tool for solving expensive noisy optimization problems in a sequential framework, while offering some flexibility to advanced users. In addition, the implementation is done in a unified environment, making the package a useful device for studying the relative performance of existing approaches depending on the experimental setup. An overview of the package structure and interface is provided, together with a description of the strategies and some insight into the implementation challenges and the proposed solutions. The strategies are compared with some existing optimization packages on analytical test functions and show promising performance.
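DiceOptim itself is an R package; as a language-neutral illustration of the kind of sequential kriging strategy it implements, here is a toy numpy sketch of expected-improvement optimization under noisy evaluations. The kernel, settings and function names are ours, not the package's:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

def rbf(a, b, ls=0.15):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

def gp_posterior(x_obs, y_obs, x_grid, noise=0.01):
    """Kriging (GP) posterior mean and variance on x_grid from noisy data."""
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf(x_grid, x_obs)
    mean = Ks @ np.linalg.solve(K, y_obs)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, np.maximum(var, 1e-12)

def expected_improvement(mean, var, best):
    """EI acquisition for minimization."""
    sd = np.sqrt(var)
    z = (best - mean) / sd
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))
    pdf = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)
    return (best - mean) * cdf + sd * pdf

def f(x):  # the expensive black box, observed with noise
    return np.sin(3.0 * x) + 0.1 * rng.standard_normal(np.shape(x))

x_grid = np.linspace(0.0, 1.0, 201)
x_obs = np.array([0.1, 0.5, 0.9])
y_obs = f(x_obs)
for _ in range(10):  # sequential design: one new evaluation per iteration
    mean, var = gp_posterior(x_obs, y_obs, x_grid)
    x_next = x_grid[np.argmax(expected_improvement(mean, var, y_obs.min()))]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, f(x_next))
print("best observed value:", y_obs.min())
```

The package's other strategies differ mainly in the acquisition function; the sequential skeleton (fit, maximize acquisition, evaluate, refit) is the same.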

Relevance: 30.00%

Abstract:

Background: Catheter ablation (CA) of ventricular tachycardia (VT) is an important treatment option in patients with structural heart disease (SHD) and an implantable cardioverter defibrillator (ICD). A subset of patients requires epicardial CA for VT. Objective: The purpose of the study was to assess the significance of epicardial CA in these patients after a systematic sequential endocardial approach. Methods: CA procedures for VT performed between January 2009 and October 2012 were analyzed. A sequential CA approach guided by earliest ventricular activation, pace mapping, entrainment and stimulus-to-QRS interval analysis was used. Acute CA success was assessed by programmed ventricular stimulation. ICD interrogation and 24-h Holter ECG were used to evaluate long-term success. Results: One hundred sixty VT ablation procedures were performed in 126 consecutive patients (114 men; age 65 ± 12 years). Endocardial CA succeeded in 250 (94%) of 265 treated VT. For 15 (6%) VT an additional epicardial CA was performed, succeeding in 9 of these 15 VT. Long-term follow-up (25 ± 18.2 months) showed freedom from VT in 104 patients (82%) after 1.2 ± 0.5 procedures; 11 (9%) suffered from repeated ICD shocks and 11 (9%) died of worsening heart failure. Conclusions: Despite a heterogeneous substrate for VT in SHD, endocardial CA alone results in high acute success rates. In this study, additional epicardial CA following a sequential endocardial mapping and CA approach was performed in 6% of VT. Thus, given the possible complications, epicardial CA should only be considered if endocardial CA fails.

Relevance: 30.00%

Abstract:

BACKGROUND: Among patients with steroid-refractory ulcerative colitis (UC) in whom a first rescue therapy has failed, a second-line salvage treatment can be considered to avoid colectomy. AIM: To evaluate the efficacy and safety of second- or third-line rescue therapy over a one-year period. METHODS: Response to single or sequential rescue treatments with infliximab (5 mg/kg intravenously (iv) at weeks 0, 2 and 6, then every 8 weeks), ciclosporin (2 mg/kg/day iv, then 5 mg/kg/day orally) or tacrolimus (0.05 mg/kg divided into 2 doses) in steroid-refractory moderate to severe UC patients from 7 Swiss and 1 Serbian tertiary IBD centers was studied retrospectively. The primary endpoint was the one-year colectomy rate. RESULTS: 60% of patients responded to the first rescue therapy, 10% went to colectomy, and the 30% who did not respond were switched to a 2nd-line rescue treatment. 66% of patients responded to the 2nd-line treatment whereas 34% failed, of whom 15% went to colectomy and 19% received a 3rd-line rescue treatment. Among those, 50% of patients went to colectomy. The overall colectomy rate of the whole cohort was 18%. The steroid-free remission rate was 39%. Adverse event rates were 33%, 37.5% and 30% for the first-, second- and third-line treatments, respectively. CONCLUSION: Our data show that medical intervention, even with 2nd- and 3rd-line rescue treatments, decreased colectomy frequency within one year of follow-up. Longer follow-up will be necessary to investigate whether sequential therapy only postpones colectomy and what percentage of patients remain in long-term remission.
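The response and colectomy percentages above chain together, so multiplying through the cascade recovers (up to rounding of the quoted percentages) the overall colectomy rate. A quick check:

```python
# Cascade arithmetic from the abstract, expressed as fractions of the whole cohort.
first_colectomy  = 0.10             # 10% went to colectomy after first-line failure
to_second        = 0.30             # 30% switched to a 2nd-line rescue treatment
second_colectomy = to_second * 0.15 # 15% of 2nd-line patients went to colectomy
to_third         = to_second * 0.19 # 19% of 2nd-line patients got a 3rd-line therapy
third_colectomy  = to_third * 0.50  # 50% of 3rd-line patients went to colectomy

overall = first_colectomy + second_colectomy + third_colectomy
print(f"overall colectomy rate: {overall:.4f}")  # 0.1735, close to the reported 18%
```

The small gap to the reported 18% reflects rounding in the quoted percentages, which are computed from patient counts.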

Relevance: 30.00%

Abstract:

PURPOSE: Leakage is the most common complication of percutaneous cement augmentation of the spine. The viscosity of the polymethylmethacrylate (PMMA) cement is strongly correlated with the likelihood of cement leakage. We hypothesized that cement leakage can be reduced by sequential cement injection in a vertebroplasty model. METHODS: A standardized vertebral body substitute model, consisting of aluminum oxide foams coated with acrylic cement and containing a preformed leakage path simulating a ventral vein, was developed. Three injection techniques of 6 ml PMMA were assessed: injection in one single step (all-in-one), injection of 1 ml in a first step and 5 ml in a second step with 1-min latency in between (two-step), and sequential injection of 0.5 ml with 1-min latency between the sequences (sequential). Standard PMMA vertebroplasty cement was used; each injection type was tested on ten vertebral body substitute models with two possible leakage paths per model. Leakage was assessed on radiographs using a zonal graduation: intraspongious = no leakage and extracortical = leakage. RESULTS: The leakage rate was significantly lower with the "sequential" technique (2/20 leakages), followed by the "two-step" (15/20) and "all-in-one" (20/20) techniques (p < 0.001). The relative risk (RR) of cement leakage was 10.0 times higher in the "all-in-one" than in the "sequential" group (95% confidence interval 2.7-37.2; p < 0.001). CONCLUSIONS: Sequential cement injection is a simple approach to minimize the risk of leakage. Taking advantage of the temperature gradient between body and room temperature, it is possible to increase the cement viscosity inside the vertebra while keeping it low in the syringe. Using sequential injection of small cement volumes, further leakage paths are blocked before further injection of low-viscosity cement.
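The reported relative risk and its confidence interval follow directly from the leak counts via the standard Wald interval on log(RR); a small sketch (our own helper, not code from the study) reproduces the numbers:

```python
from math import exp, log, sqrt

def relative_risk(a, n1, c, n2, z=1.96):
    """Relative risk of an event (a/n1 vs c/n2) with a 95% Wald CI on log(RR)."""
    rr = (a / n1) / (c / n2)
    se = sqrt(1/a - 1/n1 + 1/c - 1/n2)   # standard error of log(RR)
    lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lo, hi

# 20/20 leaks with "all-in-one" vs 2/20 with "sequential" injection
rr, lo, hi = relative_risk(20, 20, 2, 20)
print(f"RR = {rr:.1f} (95% CI {lo:.1f}-{hi:.1f})")  # RR = 10.0 (95% CI 2.7-37.2)
```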

Relevance: 30.00%

Abstract:

We present a remote sensing observational method for measuring the spatio-temporal dynamics of ocean waves. Variational techniques are used to recover a coherent space-time reconstruction of oceanic sea states from stereo video imagery. The stereoscopic reconstruction problem is expressed in a variational optimization framework: we design an energy functional whose minimizer is the desired temporal sequence of wave heights. The functional combines photometric observations with spatial and temporal regularizers. A nested iterative scheme is devised to numerically solve, via 3-D multigrid methods, the system of partial differential equations resulting from the optimality condition of the energy functional. The output of our method is the coherent, simultaneous estimate of the wave surface height and radiance at multiple snapshots. We demonstrate our algorithm on real data collected off-shore. Statistical and spectral analyses are performed, and a comparison with an existing sequential method is presented.
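As a toy analogue of the variational formulation described above, the sketch below minimizes a 1-D energy of the same shape, a photometric-style data term plus a smoothness regularizer, using plain gradient descent rather than the paper's nested multigrid scheme. All parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 2 * np.pi, 100)
d = np.sin(x) + 0.3 * rng.standard_normal(100)   # noisy "observations"

# Energy E(u) = sum_i (u_i - d_i)^2 + lam * sum_i (u_{i+1} - u_i)^2:
# a data-fidelity term plus a spatial smoothness regularizer.
lam, step = 5.0, 0.04
u = d.copy()
for _ in range(500):                   # plain gradient descent on E
    grad = 2.0 * (u - d)               # gradient of the data term
    lap = np.zeros_like(u)             # discrete Laplacian for the smoothness term
    lap[1:-1] = u[:-2] - 2.0 * u[1:-1] + u[2:]
    grad -= 2.0 * lam * lap
    u -= step * grad

print("roughness before/after:", np.sum(np.diff(d)**2), np.sum(np.diff(u)**2))
```

Setting the gradient to zero gives exactly the kind of discrete PDE (here a screened Poisson equation) that the paper solves with multigrid instead of descent.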

Relevance: 30.00%

Abstract:

This paper outlines the problems found in the parallelization of SPH (Smoothed Particle Hydrodynamics) algorithms using Graphics Processing Units. Results of several parallel GPU implementations are shown, in terms of speed-up and scalability relative to sequential CPU codes. The most problematic stage in GPU-SPH algorithms is the one responsible for locating neighboring particles and building the vectors where this information is stored, since these specific algorithms raise many difficulties for data-level parallelization. Because neighbor location using linked lists does not expose enough data-level parallelism, two new approaches have been proposed to minimize bank conflicts in the writing and subsequent reading of the neighbor lists. The first strategy proposes an efficient CPU-GPU coordination, using GPU algorithms for those stages that allow a straightforward parallelization and sequential CPU algorithms for those instructions that involve some kind of vector reduction. This coordination provides a relatively orderly reading of the neighbor lists in the interactions stage, achieving a speed-up factor of x47 in this stage. However, since the construction of the neighbor lists is quite expensive, the overall speed-up achieved is x41. The second strategy seeks to maximize the use of the GPU in the neighbor-location process by executing a specific vector sorting algorithm that allows some data-level parallelism. Although this strategy improves the speed-up of the neighbor-location stage, the global speed-up of the interactions stage falls, due to inefficient reading of the neighbor vectors. Some changes to these strategies are proposed, aimed at maximizing the computational load of the GPU and using the GPU texture units, in order to reach the maximum speed-up for such codes. Different practical applications have been added to the GPU codes described.
First, the classical dam-break problem is studied. Second, the wave impact of the sloshing fluid contained in LNG vessel tanks is simulated as a practical example of particle methods.
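The neighbor-location stage discussed above is, in serial form, the classic cell (linked-list) search: hash particles into cells of side h, then scan only the 3×3 block of surrounding cells. A plain Python sketch of that serial baseline (the GPU variants in the paper replace these dictionaries with sorted vectors to control bank conflicts):

```python
import numpy as np
from collections import defaultdict

def build_cells(pos, h):
    """Hash every particle index into a square cell of side h (the support radius)."""
    cells = defaultdict(list)
    for i, p in enumerate(pos):
        cells[(int(p[0] // h), int(p[1] // h))].append(i)
    return cells

def neighbor_lists(pos, h):
    """For each particle, collect all particles closer than h by scanning
    only the 3x3 block of cells around the particle's own cell."""
    cells = build_cells(pos, h)
    nbrs = [[] for _ in pos]
    for i, p in enumerate(pos):
        cx, cy = int(p[0] // h), int(p[1] // h)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in cells.get((cx + dx, cy + dy), ()):
                    if j != i and np.sum((pos[i] - pos[j]) ** 2) < h * h:
                        nbrs[i].append(j)
    return nbrs

rng = np.random.default_rng(2)
pos = rng.random((200, 2))          # particle positions in the unit square
nbrs = neighbor_lists(pos, h=0.1)
print("mean neighbour count:", sum(map(len, nbrs)) / len(nbrs))
```

Because the cell side equals the support radius, any neighbour within distance h is guaranteed to sit in one of the nine scanned cells.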

Relevance: 30.00%

Abstract:

It has been shown that it is possible to exploit Independent/Restricted And-parallelism in logic programs while retaining the conventional "don't know" semantics of such programs. In particular, it is possible to parallelize pure Prolog programs while maintaining the semantics of the language. However, when builtin side-effects (such as write or assert) appear in the program, if an identical observable behaviour to that of sequential Prolog implementations is to be preserved, such side-effects have to be properly sequenced. Previously proposed solutions to this problem are either incomplete (lacking, for example, backtracking semantics) or they force sequentialization of significant portions of the execution graph which could otherwise run in parallel. In this paper a series of side-effect synchronization methods are proposed which incur lower overhead and allow more parallelism than those previously proposed. Most importantly, and unlike previous proposals, they have well-defined backward execution behaviour and require only a small modification to a given (And-parallel) Prolog implementation.
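As a language-neutral illustration of the sequencing idea, not the paper's actual Prolog machinery, the sketch below runs three "goals" in parallel Python threads but chains their side-effects through events: the pure work overlaps freely, while the observable effects occur in the same left-to-right order a sequential execution would produce. (The paper's methods also handle backtracking, which this sketch does not.)

```python
import threading

def make_goal(name, results, prev_done, done):
    """A parallel 'goal': the pure computation runs freely, but the observable
    side-effect waits until every goal to its left has performed its own."""
    def goal():
        value = sum(range(10_000))       # pure work: free to run in parallel
        prev_done.wait()                 # sequence only the side-effect
        results.append((name, value))    # the side-effect (like write/1)
        done.set()                       # release the goal to our right
    return goal

results = []
start = threading.Event(); start.set()
events = [start] + [threading.Event() for _ in range(3)]
threads = [threading.Thread(target=make_goal(f"g{k}", results, events[k], events[k + 1]))
           for k in range(3)]
for t in threads: t.start()
for t in threads: t.join()
print([name for name, _ in results])   # prints ['g0', 'g1', 'g2'] every run
```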

Relevance: 30.00%

Abstract:

Quantitative descriptive analysis (QDA) is used to describe the nature and intensity of sensory properties from a single evaluation of a product, whereas temporal dominance of sensations (TDS) is primarily used to identify dominant sensory properties over time. Previous studies with TDS have focused on model systems, but this is the first study to use a sequential approach, i.e. QDA then TDS, in measuring the sensory properties of a commercial product category, using the same set of trained assessors (n = 11). The main objectives of this study were: (1) to investigate the benefits of using a sequential approach of QDA and TDS and (2) to explore the impact of sample composition on taste and flavour perceptions in blackcurrant squashes. The present study proposes an alternative way of determining the choice of attributes for TDS measurement, based on data obtained from previous QDA studies where available. Both methods indicated that the flavour profile was primarily influenced by the level of dilution and the complexity of the sample composition combined with blackcurrant juice content. In addition, artificial sweeteners were found to modify the quality of sweetness and could also contribute to bitter notes. Using QDA and TDS in tandem was shown to be more beneficial than either method on its own, enabling a more complete sensory profile of the products.
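TDS data reduce to counting, at each time point, which attribute each assessor currently marks as dominant; the dominance curve is the proportion citing each attribute over time. A toy sketch with invented panel data (the attribute names are ours, loosely echoing the product category above):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical TDS records: the attribute each of 11 trained assessors marks
# as dominant at each of 10 time points (0 = sweet, 1 = blackcurrant, 2 = bitter).
attributes = ["sweet", "blackcurrant", "bitter"]
panel = rng.integers(0, 3, size=(11, 10))

# Dominance rate: proportion of assessors citing each attribute at each time.
rates = np.stack([(panel == a).mean(axis=0) for a in range(len(attributes))])
for name, rate in zip(attributes, rates):
    print(f"{name:>12}: {np.round(rate, 2)}")
```

In practice these rates are plotted against time and compared with chance and significance levels to decide which attributes are genuinely dominant.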

Relevance: 30.00%

Abstract:

Non-suicidal self-injury (NSSI), such as cutting and burning, is a widespread social problem among lesbian, gay, bisexual, transgender, queer, and questioning (LGBTQ) youth. Extant research indicates that this population is more than twice as likely to engage in NSSI than heterosexual and cisgender (non-transgender) youth. Despite the scope of this social problem, it remains relatively unexamined in the literature. Research on other risk behaviors among LGBTQ youth indicates that experiencing homophobia and transphobia in key social contexts such as families, schools, and peer relationships contributes to health disparities among this group. Consequently, the aims of this study were to examine: (1) the relationship between LGBTQ youth's social environments and their NSSI behavior, and (2) whether/how specific aspects of the social environment contribute to an understanding of NSSI among LGBTQ youth. This study was conducted using an exploratory, sequential mixed methods design with two phases. The first phase of the study involved analysis of transcripts from interviews conducted with 44 LGBTQ youth recruited from a community-based organization. In this phase, five qualitative themes were identified: (1) Violence; (2) Misconceptions, Stigma, and Shame; (3) Negotiating LGBTQ Identity; (4) Invisibility and Isolation; and (5) Peer Relationships. Results from the qualitative phase were used to identify key variables and specify statistical models in the second, quantitative, phase of the study, using secondary data from a survey of 252 LGBTQ youth. The qualitative phase revealed how LGBTQ youth, themselves, described the role of the social environment in their NSSI behavior, while the quantitative phase was used to determine whether the qualitative findings could be used to predict engagement in NSSI among a larger sample of LGBTQ youth. 
The quantitative analyses found that certain social-environmental factors such as experiencing physical abuse at home, feeling unsafe at school, and greater openness about sexual orientation significantly predicted the likelihood of engaging in NSSI among LGBTQ youth. Furthermore, depression partially mediated the relationships between family physical abuse and NSSI and feeling unsafe at school and NSSI. The qualitative and quantitative results were compared in the interpretation phase to explore areas of convergence and incongruence. Overall, this study's findings indicate that social-environmental factors are salient to understanding NSSI among LGBTQ youth. The particular social contexts in which LGBTQ youth live significantly influence their engagement in this risk behavior. These findings can inform the development of culturally relevant NSSI interventions that address the social realities of LGBTQ youth's lives.

Relevance: 30.00%

Abstract:

Recently within the machine learning and spatial statistics communities many papers have explored the potential of reduced rank representations of the covariance matrix, often referred to as projected or fixed rank approaches. In such methods the covariance function of the posterior process is represented by a reduced rank approximation which is chosen such that there is minimal information loss. In this paper a sequential framework for inference in such projected processes is presented, where the observations are considered one at a time. We introduce a C++ library for carrying out such projected, sequential estimation which adds several novel features. In particular we have incorporated the ability to use a generic observation operator, or sensor model, to permit data fusion. We can also cope with a range of observation error characteristics, including non-Gaussian observation errors. Inference for the variogram parameters is based on maximum likelihood estimation. We illustrate the projected sequential method in application to synthetic and real data sets. We discuss the software implementation and suggest possible future extensions.
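This is not the library introduced in the paper, but a minimal numpy sketch of the underlying idea: represent the process with a fixed, reduced-rank set of basis functions and absorb observations one at a time via rank-1 updates of the weight posterior. All settings (basis, length-scale, noise) are invented:

```python
import numpy as np

rng = np.random.default_rng(4)

def basis(x, centres, ls=0.2):
    """Fixed RBF basis functions: the reduced-rank representation."""
    return np.exp(-0.5 * (x[:, None] - centres[None, :]) ** 2 / ls**2)

centres = np.linspace(0.0, 1.0, 10)   # rank m = 10
m = len(centres)
w = np.zeros(m)                       # prior mean of the basis weights
P = np.eye(m)                         # prior covariance of the basis weights
noise = 0.05                          # observation-error variance

x_all = rng.random(50)
y_all = np.sin(2 * np.pi * x_all) + np.sqrt(noise) * rng.standard_normal(50)

for x, y in zip(x_all, y_all):        # observations considered one at a time
    h = basis(np.array([x]), centres)[0]   # projection of this datum onto the basis
    s = h @ P @ h + noise                  # predictive variance of the datum
    g = P @ h / s                          # gain
    w = w + g * (y - h @ w)                # rank-1 mean update
    P = P - np.outer(g, h @ P)             # rank-1 covariance update

x_test = np.linspace(0.0, 1.0, 101)
pred = basis(x_test, centres) @ w          # posterior mean of the process
print("rms error vs sin(2*pi*x):",
      np.sqrt(np.mean((pred - np.sin(2 * np.pi * x_test)) ** 2)))
```

Each update costs O(m²) regardless of how many observations have been seen, which is the point of the projected, sequential formulation.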

Relevance: 30.00%

Abstract:

The trend in modal extraction algorithms is to use all the available frequency response function data to obtain a global estimate of the natural frequencies, damping ratios and mode shapes. Improvements in transducer and signal processing technology allow the simultaneous measurement of many hundreds of channels of response data. The quantity of data available and the complexity of the extraction algorithms make considerable demands on the available computer power and require a powerful computer or dedicated workstation to perform satisfactorily. An alternative to waiting for faster sequential processors is to implement the algorithm in parallel, for example on a network of Transputers. Parallel architectures are a cost-effective means of increasing computational power, and a larger number of response channels would simply require more processors. This thesis considers how two typical modal extraction algorithms, the Rational Fraction Polynomial method and the Ibrahim Time Domain method, may be implemented on a network of Transputers. The Rational Fraction Polynomial method is a well-known and robust frequency-domain curve-fitting algorithm. The Ibrahim Time Domain method is an efficient algorithm that curve-fits in the time domain. This thesis reviews the algorithms, considers the problems involved in a parallel implementation, and shows how they were implemented on a real Transputer network.
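The per-channel fits are largely independent, which is what makes the problem parallelizable: each processor can curve-fit its own subset of channels. A modern sketch of that farm-out pattern using a thread pool in place of a Transputer network; the polynomial fit below is a simple stand-in, not the Rational Fraction Polynomial method itself, and all data are invented:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(6)
t = np.linspace(0.0, 1.0, 200)

# 64 "measurement channels": noisy decaying sinusoids standing in for
# frequency response function data (all values invented).
channels = [np.exp(-0.8 * t) * np.sin(2 * np.pi * t + 0.1 * k)
            + 0.01 * rng.standard_normal(t.size)
            for k in range(64)]

def fit_channel(y):
    """Per-channel 'curve fit' -- here a least-squares polynomial stand-in."""
    return np.polyfit(t, y, 8)

# Independent per-channel fits are farmed out to a pool of workers, just as
# they would be farmed out to Transputer nodes; more channels, more workers.
with ThreadPoolExecutor(max_workers=8) as pool:
    coeffs = list(pool.map(fit_channel, channels))

print(len(coeffs), "channels fitted")
```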

Relevance: 30.00%

Abstract:

Heterogeneous datasets arise naturally in most applications due to the use of a variety of sensors and measuring platforms. Such datasets can be heterogeneous in terms of the error characteristics and sensor models. Treating such data is most naturally accomplished using a Bayesian or model-based geostatistical approach; however, such methods generally scale rather badly with the size of dataset, and require computationally expensive Monte Carlo based inference. Recently within the machine learning and spatial statistics communities many papers have explored the potential of reduced rank representations of the covariance matrix, often referred to as projected or fixed rank approaches. In such methods the covariance function of the posterior process is represented by a reduced rank approximation which is chosen such that there is minimal information loss. In this paper a sequential Bayesian framework for inference in such projected processes is presented. The observations are considered one at a time which avoids the need for high dimensional integrals typically required in a Bayesian approach. A C++ library, gptk, which is part of the INTAMAP web service, is introduced which implements projected, sequential estimation and adds several novel features. In particular the library includes the ability to use a generic observation operator, or sensor model, to permit data fusion. It is also possible to cope with a range of observation error characteristics, including non-Gaussian observation errors. Inference for the covariance parameters is explored, including the impact of the projected process approximation on likelihood profiles. We illustrate the projected sequential method in application to synthetic and real datasets. Limitations and extensions are discussed.
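The "generic observation operator" mentioned above is what lets one update scheme fuse heterogeneous sensors: each datum measures some linear functional h of the field. A toy numpy sketch, not the gptk library, in which point gauges and an averaging instrument share the same sequential update (grid, kernel and noise levels are all invented):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 40
x = np.linspace(0.0, 1.0, n)
truth = np.sin(2 * np.pi * x)

f = np.zeros(n)                                           # prior mean of the field
P = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / 0.1**2)  # smooth prior covariance

def assimilate(f, P, h, y, r):
    """One sequential update: datum y measures h @ field with variance r.
    h is the generic (linear) observation operator, i.e. the sensor model."""
    s = h @ P @ h + r
    g = P @ h / s
    return f + g * (y - h @ f), P - np.outer(g, h @ P)

# Sensor A: point gauges -- h picks out a single grid node.
for i in range(0, n, 5):
    h = np.zeros(n); h[i] = 1.0
    f, P = assimilate(f, P, h, truth[i] + 0.05 * rng.standard_normal(), 0.05**2)

# Sensor B: an averaging instrument -- h is a local mean over 8 nodes.
h = np.zeros(n); h[16:24] = 1.0 / 8.0
f, P = assimilate(f, P, h, truth[16:24].mean(), 0.01**2)

print("max reconstruction error:", np.max(np.abs(f - truth)))
```

Because both sensors pass through the same h-parameterized update, adding a new instrument type only requires writing down its operator and error variance.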

Relevance: 30.00%

Abstract:

Elemental analysis can become an important piece of evidence to assist the solution of a case. The work presented in this dissertation aims to evaluate the evidential value of the elemental composition of three particular matrices: ink, paper and glass. In the first part of this study, the analytical performance of LIBS and LA-ICP-MS methods was evaluated for paper, writing inks and printing inks. A total of 350 ink specimens were examined including black and blue gel inks, ballpoint inks, inkjets and toners originating from several manufacturing sources and/or batches. The paper collection set consisted of over 200 paper specimens originating from 20 different paper sources produced by 10 different plants. Micro-homogeneity studies show smaller variation of elemental compositions within a single source (i.e., sheet, pen or cartridge) than the observed variation between different sources (i.e., brands, types, batches). Significant and detectable differences in the elemental profile of the inks and paper were observed between samples originating from different sources (discrimination of 87–100% of samples, depending on the sample set under investigation and the method applied). These results support the use of elemental analysis, using LA-ICP-MS and LIBS, for the examination of documents and provide additional discrimination to the currently used techniques in document examination. In the second part of this study, a direct comparison between four analytical methods (µ-XRF, solution-ICP-MS, LA-ICP-MS and LIBS) was conducted for glass analyses using interlaboratory studies. 
The data provided by 21 participants were used to assess the performance of the analytical methods in associating glass samples from the same source and differentiating different sources, as well as the use of different match criteria (confidence interval (±6s, ±5s, ±4s, ±3s, ±2s), modified confidence interval, t-test (sequential univariate, p=0.05 and p=0.01), t-test with Bonferroni correction (for multivariate comparisons), range overlap, and Hotelling's T2 test). Error rates (Type 1 and Type 2) are reported for each of these match criteria and depend on the heterogeneity of the glass sources, the repeatability of the analytical measurements, and the number of elements measured. The study provides recommendations for analytical performance-based parameters for µ-XRF and LA-ICP-MS, as well as the best-performing match criteria for both analytical techniques, which can now be applied by forensic glass examiners.
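The simplest of the match criteria listed, the ±k·s confidence interval, can be sketched in a few lines: a recovered fragment "matches" a known source only if every elemental mean falls within k sample standard deviations of the source mean. The numbers below are toy values, not forensic data:

```python
import numpy as np

def interval_match(known, recovered, k=4):
    """±k·s criterion: every element mean of the recovered fragment must lie
    inside mean ± k·s of the known-source replicate measurements."""
    mu = known.mean(axis=0)
    s = known.std(axis=0, ddof=1)
    r = recovered.mean(axis=0)
    return bool(np.all(np.abs(r - mu) <= k * s))

# Replicate measurements of two element ratios (rows = replicates).
source = np.array([[100.1, 50.2], [99.9, 49.8], [100.0, 50.1],
                   [100.2, 49.9], [99.8, 50.0]])
same  = np.array([[100.0, 50.0], [100.1, 50.1]])   # fragment, same source
other = np.array([[103.0, 50.0], [103.1, 50.1]])   # fragment, shifted in one element

print(interval_match(source, same))    # True
print(interval_match(source, other))   # False: the 3-unit shift exceeds 4*s
```

Narrowing k trades error types in the sense the study quantifies: a tighter interval produces more false exclusions of same-source glass and fewer false inclusions of different-source glass.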

Relevance: 30.00%

Abstract:

This sequential explanatory mixed-methods study examines the role teachers should play in the development of the teacher evaluation system in Louisiana. The insights gained are intended to help teachers act as catalysts in the classroom for significantly increased student achievement, and to enable policymakers, practitioners, and instructional leaders to make well-informed decisions.