40 results for Safety Data Analysis


Relevance:

90.00%

Abstract:

Introduction: The thalidomide-dexamethasone (TD) regimen has provided encouraging results in relapsed MM. To improve on these results, bortezomib (Velcade) has been added to the combination in previous phase II studies, the so-called VTD regimen. In January 2006, the European Group for Blood and Marrow Transplantation (EBMT) and the Intergroupe Francophone du Myélome (IFM) initiated a prospective, randomized, parallel-group, open-label, phase III, multicenter study comparing VTD (arm A) with TD (arm B) in MM patients progressing or relapsing after autologous transplantation. Patients and Methods: Inclusion criteria: patients in first progression or relapse after at least one autologous transplantation, including those who had received bortezomib or thalidomide before transplant. Exclusion criteria: subjects with neuropathy above grade 1 or non-secretory MM. The primary study end point was time to progression (TTP). Secondary end points included safety, response rate, progression-free survival (PFS) and overall survival (OS). Treatment was scheduled as follows: bortezomib 1.3 mg/m2 was given as an i.v. bolus on days 1, 4, 8 and 11 followed by a 10-day rest period (days 12 to 21) for 8 cycles (6 months), and then on days 1, 8, 15 and 22 followed by a 20-day rest period (days 23 to 42) for 4 cycles (6 months). In both arms, thalidomide was scheduled at 200 mg/day orally for one year and dexamethasone at 40 mg/day orally for four days every three weeks for one year. Patients reaching remission could proceed to a new stem cell harvest. However, transplantation, either autologous or allogeneic, could only be performed in patients who completed the planned one-year treatment period. Response was assessed by EBMT criteria, with the additional category of near complete remission (nCR). Adverse events were graded by the NCI-CTCAE, Version 3.0. The trial was based on a group sequential design, with 4 planned interim analyses and one final analysis, allowing stopping for efficacy as well as futility. The overall alpha and power were set to 0.025 and 0.90, respectively. The test for decision making was based on the comparison of the ratio of the cause-specific hazards of relapse/progression, estimated in a Cox model stratified on the number of previous autologous transplantations. The cumulative incidence of relapse/progression was estimated with the proper nonparametric estimator, and the comparison was done with the Gray test. PFS and OS probabilities were estimated with Kaplan-Meier curves and compared with the log-rank test. An interim safety analysis was performed when the first hundred patients had been included; the safety committee recommended continuing the trial. Results: As of 1 July 2010, 269 patients had been enrolled in the study: 139 in France (IFM 2005-04 study), 21 in Italy, 38 in Germany, 19 in Switzerland (a SAKK study), 23 in Belgium, 8 in Austria, 8 in the Czech Republic, 11 in Hungary, 1 in the UK and 1 in Israel. One hundred and sixty-nine patients were male and 100 female; the median age was 61 years (range 29-76). One hundred and thirty-six patients were randomized to receive VTD and 133 to receive TD. The current analysis is based on 246 patients (124 in arm A, 122 in arm B) included in the second interim analysis, carried out when 134 events had been observed. Following this analysis, the trial was stopped because of the significant superiority of VTD over TD. Data from the remaining patients were too immature to contribute to the analysis.
The number of previous autologous transplants was one in 63 vs 60 patients and two or more in 61 vs 62 patients in arm A vs arm B, respectively. The median follow-up was 25 months. The median TTP was 20 months vs 15 months in arms A and B respectively, with a cumulative incidence of relapse/progression at 2 years of 52% (95% CI: 42%-64%) vs 70% (95% CI: 61%-81%) (p=0.0004, Gray test). The same superiority of arm A was also observed when stratifying on the number of previous autologous transplantations. At 2 years, PFS was 39% (95% CI: 30%-51%) vs 23% (95% CI: 16%-34%) (A vs B, p=0.0006, log-rank test). OS in the first two years was comparable in the two groups. Conclusion: VTD resulted in significantly longer TTP and PFS in patients relapsing after autologous stem cell transplantation (ASCT). Analyses of response and safety data are ongoing, and results will be presented at the meeting.
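To make the survival-analysis machinery described above concrete, the following sketch reproduces the Kaplan-Meier / log-rank part of such an analysis with the lifelines Python package. It runs on hypothetical placeholder data (arm sizes, time scales and censoring rates are assumptions, not the trial data), and it does not cover the Gray test for the cumulative incidence of relapse/progression, which requires a dedicated competing-risks implementation.

```python
# Illustrative sketch only: Kaplan-Meier estimation and log-rank comparison of
# two arms, in the spirit of the PFS analysis described above. The data below
# are hypothetical placeholders, not the trial data.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
t_a = rng.exponential(scale=24, size=124)            # arm A PFS times, months (hypothetical)
t_b = rng.exponential(scale=15, size=122)            # arm B PFS times, months (hypothetical)
e_a = (rng.random(124) < 0.8).astype(int)            # 1 = progression/death observed, 0 = censored
e_b = (rng.random(122) < 0.8).astype(int)

kmf_a = KaplanMeierFitter().fit(t_a, event_observed=e_a, label="VTD")
kmf_b = KaplanMeierFitter().fit(t_b, event_observed=e_b, label="TD")
print(kmf_a.survival_function_at_times(24))          # estimated PFS at 2 years, arm A
print(kmf_b.survival_function_at_times(24))          # estimated PFS at 2 years, arm B

result = logrank_test(t_a, t_b, event_observed_A=e_a, event_observed_B=e_b)
print(result.p_value)                                # two-sided log-rank p-value
```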

Relevance:

90.00%

Abstract:

Geophysical techniques can help to bridge the inherent gap, with regard to spatial resolution and range of coverage, that plagues classical hydrological methods. This has led to the emergence of the new and rapidly growing field of hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters, and their inherent trade-off between resolution and range, the fundamental usefulness of multi-method hydrogeophysical surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database in order to obtain a unified model of the probed subsurface region that is internally consistent with all available data. To address this problem, we have developed a strategy for hydrogeophysical data integration based on Monte-Carlo-type conditional stochastic simulation that we consider particularly suitable for local-scale studies characterized by high-resolution and high-quality datasets. Monte-Carlo-based optimization techniques are flexible and versatile, can account for a wide variety of data and constraints of differing resolution and hardness, and thus have the potential to provide, in a geostatistical sense, highly detailed and realistic models of the pertinent target parameter distributions. Compared to more conventional approaches of this kind, our approach provides significant advances in the way that the larger-scale deterministic information resolved by the hydrogeophysical data is accounted for, which represents an inherently problematic, and as yet unresolved, aspect of Monte-Carlo-type conditional simulation techniques. We present the results of applying our algorithm to the integration of porosity logs and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on pertinent synthetic data and then applied to corresponding field data collected at the Boise Hydrogeophysical Research Site near Boise, Idaho, USA.
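As a rough, self-contained illustration of what Monte-Carlo-type conditional stochastic simulation involves (this is not the authors' algorithm), the sketch below perturbs a one-dimensional porosity field with a Metropolis/simulated-annealing acceptance rule so that it continues to honour hard "porosity log" values exactly while reducing the misfit against a hypothetical larger-scale trend standing in for tomographically derived information. Grid size, conditioning points, target trend and cooling schedule are illustrative assumptions.

```python
# Minimal sketch of Monte-Carlo-type conditional simulation (illustrative only):
# perturb a 1-D porosity field while honouring hard "log" data exactly and
# driving its smoothed (larger-scale) behaviour towards a hypothetical trend.
import numpy as np

rng = np.random.default_rng(42)
n = 200
field = rng.normal(0.25, 0.05, n)                 # initial porosity realization

log_idx = np.array([20, 80, 150])                 # cells with porosity-log data (hypothetical)
log_val = np.array([0.30, 0.22, 0.27])            # hard data, honoured exactly
field[log_idx] = log_val

trend_target = np.full(n, 0.25)                   # hypothetical larger-scale (e.g. tomographic) trend

def smooth(x, w=15):                              # moving average = "large-scale" component
    return np.convolve(x, np.ones(w) / w, mode="same")

def misfit(x):                                    # mismatch between large-scale component and trend
    return np.sum((smooth(x) - trend_target) ** 2)

temp = 1.0
for _ in range(20000):                            # simulated-annealing-style loop
    i = rng.integers(n)
    if i in log_idx:                              # never perturb conditioning data
        continue
    cand = field.copy()
    cand[i] += rng.normal(0, 0.01)
    d = misfit(cand) - misfit(field)
    if d < 0 or rng.random() < np.exp(-d / temp): # Metropolis acceptance
        field = cand
    temp *= 0.9997                                # slow cooling

print(field[log_idx])                             # conditioning values preserved
print(misfit(field))                              # reduced misfit to the trend
```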

Relevance:

90.00%

Abstract:

A growing body of epidemiologic evidence suggests an association between exposure to cleaning products and respiratory dysfunction. Because of the lack of quantitative assessments of respiratory exposures to airborne irritants and sensitizers among professional cleaners, the culpable substances have yet to be identified. PURPOSE: Focusing on previously identified irritants, our aims were to determine (i) airborne concentrations of monoethanolamine (MEA), glycol ethers, and benzyl alcohol (BA) during different cleaning tasks performed by professional cleaning workers, and to assess their determinants; and (ii) air concentrations of formaldehyde, a known indoor air contaminant. METHODS: Personal air samples were collected in 12 cleaning companies and analyzed by conventional methods. RESULTS: Nearly all air concentrations [MEA (n = 68), glycol ethers (n = 79), BA (n = 15), and formaldehyde (n = 45)] were far below (<1/10 of) the corresponding Swiss occupational exposure limits (OELs), except for ethylene glycol mono-n-butyl ether (EGBE). For butoxypropanol and BA, no OELs exist. Although detected only once, EGBE air concentrations (n = 4) were high (49.48-58.72 mg/m3) and close to the Swiss OEL (49 mg/m3). When substances were not listed in the safety data sheets of the cleaning products used but were nevertheless measured, MEA was never detected, glycol ethers were often present, and formaldehyde was detected universally. Exposure to MEA was affected by the amount used (P = 0.036) and by spraying (P = 0.000), and exposure to butoxypropanol was affected by spraying (P = 0.007) and cross-ventilation (P = 0.000). CONCLUSIONS: Professional cleaners were found to be exposed to multiple airborne irritants at low concentrations; these substances should therefore be considered in investigations of respiratory dysfunction in the cleaning industry, especially in specialized tasks such as intensive floor cleaning.

Relevance:

90.00%

Abstract:

INTRODUCTION: Infants hospitalised in neonatology are inevitably exposed to pain repeatedly. Premature infants are particularly vulnerable because they are hypersensitive to pain and demonstrate diminished behavioural responses to it. They are therefore at risk of developing short- and long-term complications if pain remains untreated. CONTEXT: Compared with acute pain, there is limited evidence in the literature on prolonged pain in infants; however, its prevalence is reported to be between 20 and 40%. OBJECTIVE: This single case study aimed to identify the bio-contextual characteristics of neonates who experienced prolonged pain. METHODS: The study was carried out in the neonatal unit of a tertiary referral centre in Western Switzerland. A retrospective analysis of the profiles of seven infants who experienced prolonged pain was performed using five different data sources. RESULTS: The mean gestational age of the seven infants was 32 weeks. The main diagnoses included prematurity and respiratory distress syndrome. The total observations (N=55) showed that the participants underwent on average 21.8 (SD 6.9) painful procedures per day that were estimated to be of moderate to severe intensity. Of the 164 recorded pain scores (2.9 pain assessments/day/infant), 14.6% confirmed acute pain. Among the infants experiencing acute pain, analgesia was given in 16.6% of cases and 79.1% received no analgesia. CONCLUSION: This study highlights the difficulty of managing pain in neonates who are exposed to numerous painful procedures. Pain in this population remains under-evaluated and, as a result, undertreated. The results also showed that nursing documentation related to pain assessment is not systematic. Regular assessment and documentation of acute and prolonged pain are recommended. This could be achieved with clear guidelines on the Assessment-Intervention-Reassessment (AIR) cycle, with validated measures adapted to neonates. Adequate pain assessment is a prerequisite for appropriate pain relief in neonates.

Relevance:

90.00%

Abstract:

This thesis is devoted to the analysis, modeling and visualisation of spatially referenced environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned with the development of techniques and algorithms that allow a machine to learn from data. In this thesis, machine learning algorithms are adapted to environmental data and to spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, non-linear, robust and efficient modeling tools. They can solve classification, regression and probability density modeling problems in high-dimensional spaces composed of spatially referenced informative variables ("geo-features") in addition to geographical coordinates. Moreover, they are well suited to implementation as decision-support tools for environmental questions ranging from pattern recognition to modeling and prediction, including automatic mapping. Their efficiency is comparable to that of geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data including geo-features. The most important and popular machine learning algorithms are presented theoretically and implemented as software tools for the environmental sciences. The main algorithms described are the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence, general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This range of algorithms covers varied tasks such as classification, regression and probability density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are treated both with the traditional geostatistical approach, through experimental variography, and with machine learning principles. Experimental variography, which studies the relationships between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and allows the detection of spatial patterns describable by such statistics. The machine learning approach to ESDA is presented through the application of the k-nearest neighbours method, which is very simple and has excellent interpretation and visualisation properties. An important part of the thesis deals with topical subjects such as the automatic mapping of spatial data, for which the general regression neural network is proposed as an efficient solution.
The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, on which the GRNN significantly outperformed all other methods, particularly in emergency situations. The thesis consists of four chapters: theory, applications, software tools and guided examples. An important part of the work is a collection of software tools: Machine Learning Office. This software collection has been developed over the last 15 years and has been used for teaching many courses, including international workshops in China, France, Italy, Ireland and Switzerland, as well as in fundamental and applied research projects. The case studies considered cover a broad spectrum of real low- and high-dimensional geo-environmental problems, such as air, soil and water pollution by radioactive products and heavy metals, the classification of soil types and hydrogeological units, uncertainty mapping for decision support, and the assessment of natural hazards (landslides, avalanches). Complementary tools for exploratory data analysis and visualisation were also developed, with care taken to provide a user-friendly and easy-to-use interface.

Machine Learning for geospatial data: algorithms, software tools and case studies

Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it is mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, non-linear, robust and efficient modeling tools. They can find solutions to classification, regression and probability density modeling problems in high-dimensional geo-feature spaces composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for the purposes of environmental data mining including pattern recognition, modeling and prediction as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from theoretical description of the concepts to software implementation. The main algorithms and models considered are the following: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is the initial and a very important part of data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations that helps to detect the presence of spatial patterns, at least those described by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a highly topical problem: the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters, structured as theory, applications, software tools and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. These tools were developed during the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well; the software is user-friendly and easy to use.
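To make the GRNN that the thesis proposes for automatic mapping more concrete, here is a minimal sketch of the model, which amounts to Gaussian-kernel (Nadaraya-Watson) weighted regression with a single smoothing parameter. This is an illustration of the model class, not code from Machine Learning Office, and the monitoring coordinates, values and kernel width are hypothetical.

```python
# Minimal sketch of a general regression neural network (GRNN), i.e. Gaussian-
# kernel weighted regression, applied to automatic mapping of point data.
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma=0.1):
    """Predict y at each query point as a Gaussian-kernel-weighted mean of y_train."""
    # Squared Euclidean distances between every query point and every training point
    d2 = ((x_query[:, None, :] - x_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))          # kernel weights
    return (w @ y_train) / w.sum(axis=1)          # weighted average per query point

# Hypothetical 2-D monitoring points and measured values
rng = np.random.default_rng(1)
xy = rng.random((50, 2))
z = np.sin(4 * xy[:, 0]) + 0.1 * rng.normal(size=50)

# Predict on a regular grid (automatic mapping); sigma, the only free parameter
# of a GRNN, is usually tuned by cross-validation.
gx, gy = np.meshgrid(np.linspace(0, 1, 25), np.linspace(0, 1, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z_map = grnn_predict(xy, z, grid, sigma=0.08)
print(z_map.shape)                                # (625,) gridded estimates
```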

Relevance:

90.00%

Abstract:

We conducted this study to determine the relative influence of various mechanical and patient-related factors on the incidence of dislocation after primary total hip arthroplasty (THA). Of 2,023 THAs, 21 patients who had at least one dislocation were compared with a control group of 21 patients without dislocation, matched for age, gender, pathology, and year of surgery. Implant positioning, seniority of the surgeon, American Society of Anesthesiologists (ASA) score, and diminished motor coordination were recorded. Data analysis included univariate and multivariate methods. The dislocation risk was 6.9 times higher if total anteversion was not between 40 and 60 degrees, and 10 times higher in patients with high ASA scores. Surgeons should pay attention to the total anteversion (cup and stem) of the THA, and the ASA score should be part of the preoperative assessment of dislocation risk.
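For readers unfamiliar with how a univariate risk estimate such as the 6.9-fold figure can be derived from a matched case-control comparison, the sketch below computes an odds ratio from a 2x2 table. The counts are hypothetical, chosen only to yield an odds ratio of roughly that size; they are not the study's data.

```python
# Illustrative sketch of a univariate odds-ratio calculation from a 2x2
# case-control table. The counts below are hypothetical, not the study's data.
import numpy as np
from scipy.stats import fisher_exact

# rows: cases (dislocation) / controls; columns: anteversion outside 40-60 deg / within
table = np.array([[13, 8],
                  [4, 17]])
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
print(round(odds_ratio, 1))        # ~6.9 with these hypothetical counts
print(fisher_exact(table))         # exact test of the association
```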

Relevance:

90.00%

Abstract:

The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concern decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residuals sequential simulations (MLRSS). The model is based on multilayer perceptron and support vector regression ML algorithms used for modeling long-range spatial trends, combined with sequential simulations of the residuals. ML algorithms deliver non-linear solutions to spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of the ML algorithms by analyzing the quality and quantity of the spatially structured information they extract from the data. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study on the Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be used efficiently in the decision-making process. (C) 2003 Elsevier Ltd. All rights reserved.
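A hedged sketch of the hybrid idea described here, not the paper's MLRSS implementation: a support vector regression models the long-range spatial trend, and stochastic realizations of the spatially correlated residuals are drawn from a Gaussian process (standing in for sequential simulation) to produce an ensemble of maps and an exceedance-probability map. Data, kernel and hyperparameters are hypothetical.

```python
# Sketch of a hybrid "ML trend + stochastic residual simulation" workflow
# (illustrative only; not the MLRSS algorithm of the paper).
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
X = rng.random((120, 2)) * 10.0                    # hypothetical monitoring-point coordinates
y = np.sin(X[:, 0]) + 0.3 * X[:, 1] + 0.2 * rng.normal(size=120)

trend = SVR(C=10.0, epsilon=0.05).fit(X, y)        # long-range (non-linear) trend model
resid = y - trend.predict(X)                       # spatially correlated residuals

gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.01), normalize_y=True)
gp.fit(X, resid)                                   # residual model conditioned on the data

grid = np.column_stack([g.ravel() for g in np.meshgrid(np.linspace(0, 10, 30),
                                                       np.linspace(0, 10, 30))])
# Ensemble of maps = deterministic trend + stochastic residual realizations
realizations = trend.predict(grid)[:, None] + gp.sample_y(grid, n_samples=20, random_state=0)
prob_above = (realizations > 1.0).mean(axis=1)     # probability of exceeding a chosen threshold
print(prob_above.shape)                            # (900,) probability map values
```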

Relevance:

90.00%

Abstract:

Sport betting is a lucrative business for bookmakers and for lucky (or wise) punters, but also for governments and for sport. While neither new nor even recent, the deviances linked to sport betting, primarily match-fixing, have gained increased media exposure over the past decade. This exploratory study is a qualitative content analysis of the press coverage of sport betting-related deviances in football in two countries (the UK and France), using in each case two leading national publications over a period of five years. Data analysis indicates mounting coverage of sport betting scandals, with teams, players and criminals increasingly framed as culprits, while authorities and federations primarily assume a positive role. As for the origin of sport betting deviances, French newspapers tend to blame the system (in an abstract way), whereas British newspapers focus more on individual weaknesses, notably greed. This article contributes to the growing body of literature on the importance of these deviances and on the way they are perceived by sport organizations, legislators and the public at large.