945 results for Point density analysis


Relevance:

30.00%

Publisher:

Abstract:

This paper presents the recent history of a large prealpine lake (Lake Bourget) using chironomids, diatoms and organic matter analysis, and examines the ability of a paleolimnological approach to define an ecological reference state for the lake in the sense of the European Water Framework Directive. A low-resolution study of subfossil chironomids in a 4-m-long core shows the remarkable stability over the last 2.5 kyr of the profundal community, dominated by a Micropsectra association until the beginning of the twentieth century, when oxyphilous taxa disappeared. Focusing on this key recent period, a high-resolution, multiproxy study of two short cores reveals a progressive evolution of the lake's ecological state. Until AD 1880, Lake Bourget showed low organic matter content in the deep sediments (TOC less than 1%) and a well-oxygenated hypolimnion that allowed the development of a profundal oxyphilous chironomid fauna (Micropsectra association). Diatom communities were characteristic of oligotrophic conditions. Around AD 1880, a slight increase in TOC was the first sign of change in lake conditions. It was followed by a limited decline in oligotrophic diatom taxa and the disappearance of two oxyphilous chironomid taxa at the beginning of the twentieth century. The 1940s were a major turning point in the lake's recent history. Diatom assemblages and the accumulation of well-preserved planktonic organic matter in the sediment provide evidence of strong eutrophication. The absence of profundal chironomid communities reveals permanent hypolimnetic anoxia. From AD 1995 to 2006, the diatom assemblages suggest a reduction in nutrients and a return to mesotrophic conditions, a result of improved wastewater management. However, no change in hypolimnetic benthic conditions is shown by either the organic matter or the subfossil profundal chironomid community.
Our results emphasize the relevance of the paleolimnological approach for the assessment of reference conditions for modern lakes. Before AD 1900, the profundal Micropsectra association and the Cyclotella-dominated diatom community can be considered the Lake Bourget reference community, reflecting the reference ecological state of the lake.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The risk of osteoporosis and fracture influences the selection of adjuvant endocrine therapy. We analyzed bone mineral density (BMD) in Swiss patients of the Breast International Group (BIG) 1-98 trial [treatment arms: A, tamoxifen (T) for 5 years; B, letrozole (L) for 5 years; C, 2 years of T followed by 3 years of L; D, 2 years of L followed by 3 years of T]. PATIENTS AND METHODS: Dual-energy X-ray absorptiometry (DXA) results were collected retrospectively. Patients without DXA served as the control group. Repeated-measures models using covariance structures allowing for different intervals between DXA assessments were used to estimate changes in BMD. Prospectively defined covariates were considered as fixed effects in the multivariable models. RESULTS: Two hundred sixty-one of 546 patients had one or more DXA assessments, with 577 lumbar and 550 hip measurements. Weight, height, prior hormone replacement therapy, and hysterectomy were positively correlated with BMD; the correlation was negative for the letrozole-containing arms (B/C/D versus A), known osteoporosis, time on trial, age, chemotherapy, and smoking. Treatment did not influence the occurrence of osteoporosis (T-score < -2.5 standard deviations). CONCLUSIONS: All aromatase inhibitor regimens reduced BMD. The sequential schedules were as detrimental to bone density as L monotherapy.
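
The trial's covariance-structure repeated-measures model is beyond a short snippet, but the core idea, estimating the change in BMD over time on trial while adjusting for fixed covariates such as weight, can be sketched as an ordinary least-squares fit. All numbers below are invented for illustration, not trial data:

```python
import numpy as np

# Synthetic illustration only: 600 lumbar-spine DXA measurements.
rng = np.random.default_rng(42)
n = 600
years_on_trial = rng.uniform(0.0, 5.0, n)      # time on trial (years)
weight = rng.normal(70.0, 10.0, n)             # body weight (kg)

# True (invented) effects: BMD falls ~0.010 g/cm^2 per year on trial,
# and heavier patients have slightly higher BMD.
bmd = (1.00
       - 0.010 * years_on_trial
       + 0.002 * (weight - 70.0)
       + rng.normal(0.0, 0.02, n))

# Design matrix: intercept, time on trial, centred weight.
X = np.column_stack([np.ones(n), years_on_trial, weight - 70.0])
coef, *_ = np.linalg.lstsq(X, bmd, rcond=None)

annual_change = coef[1]   # recovered time-on-trial effect (close to -0.010)
```

The actual analysis additionally models the within-patient correlation between successive DXA scans through a covariance structure; the OLS fit above ignores that and is only meant to show the direction of the covariate adjustments.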

Relevance:

30.00%

Publisher:

Abstract:

CONTEXT: In the Health Outcomes and Reduced Incidence with Zoledronic Acid Once Yearly - Pivotal Fracture Trial (HORIZON-PFT), zoledronic acid (ZOL) 5 mg significantly reduced fracture risk. OBJECTIVE: The aim of the study was to identify factors associated with greater efficacy during ZOL 5 mg treatment. DESIGN, SETTING, AND PATIENTS: We conducted a subgroup analysis (preplanned and post hoc) of a multicenter, double-blind, placebo-controlled, 36-month trial in 7765 women with postmenopausal osteoporosis. INTERVENTION: A single infusion of ZOL 5 mg or placebo was administered at baseline, 12, and 24 months. MAIN OUTCOME MEASURES: Primary endpoints were new vertebral fracture and hip fracture. Secondary endpoints were nonvertebral fracture and change in femoral neck bone mineral density (BMD). Baseline risk factor subgroups were age, BMD T-score and vertebral fracture status, total hip BMD, race, weight, geographical region, smoking, height loss, history of falls, physical activity, prior bisphosphonates, creatinine clearance, body mass index, and concomitant osteoporosis medications. RESULTS: Greater ZOL-induced effects on vertebral fracture risk were seen with younger age (treatment-by-subgroup interaction, P = 0.05), normal creatinine clearance (P = 0.04), and body mass index >= 25 kg/m² (P = 0.02). There were no significant treatment-factor interactions for hip or nonvertebral fracture or for change in BMD. CONCLUSIONS: ZOL appeared more effective in preventing vertebral fracture in younger women, overweight/obese women, and women with normal renal function. ZOL had similar effects irrespective of fracture risk factors or femoral neck BMD.
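
The treatment-by-subgroup interaction tests reported above ask whether the treatment effect differs across strata. A minimal sketch of such a test, contrasting the log relative risk of fracture between two subgroups; all counts below are hypothetical, not HORIZON-PFT data:

```python
import math

def log_rr_and_se(events_treat, n_treat, events_ctrl, n_ctrl):
    """Log relative risk and its standard error (delta method)."""
    rr = (events_treat / n_treat) / (events_ctrl / n_ctrl)
    se = math.sqrt(1/events_treat - 1/n_treat + 1/events_ctrl - 1/n_ctrl)
    return math.log(rr), se

# Hypothetical counts: the treatment works better in the younger subgroup.
lrr_young, se_young = log_rr_and_se(20, 1000, 80, 1000)   # RR = 0.25
lrr_old,   se_old   = log_rr_and_se(60, 1000, 80, 1000)   # RR = 0.75

# Two-sample z-test on the difference of log relative risks.
z = (lrr_young - lrr_old) / math.sqrt(se_young**2 + se_old**2)
p_interaction = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
```

A small `p_interaction` says the relative risks genuinely differ between the strata, which is exactly the kind of evidence behind the age and body-mass-index interactions quoted above.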

Relevance:

30.00%

Publisher:

Abstract:

Unlike the evaluation of single items of scientific evidence, the formal study and analysis of the joint evaluation of several distinct items of forensic evidence has to date received some punctual, rather than systematic, attention. Questions about (i) the relationships among a set of (usually unobservable) propositions and a set of (observable) items of scientific evidence, (ii) the joint probative value of a collection of distinct items of evidence, as well as (iii) the contribution of each individual item within a given group of pieces of evidence still represent fundamental areas of research. To some degree, this is remarkable since both forensic science theory and practice, as well as many daily inference tasks, require the consideration of multiple items, if not masses, of evidence. A recurrent and particular complication that arises in such settings is that the application of probability theory, i.e. the reference method for reasoning under uncertainty, becomes increasingly demanding. The present paper takes this as a starting point and discusses graphical probability models, i.e. Bayesian networks, as a framework within which the joint evaluation of scientific evidence can be approached in some viable way. Based on a review of the main existing contributions in this area, the article aims at presenting instances of real case studies from the author's institution in order to point out the usefulness and capacities of Bayesian networks for the probabilistic assessment of the probative value of multiple and interrelated items of evidence. A main emphasis is placed on underlying general patterns of inference, their representation, as well as their graphical probabilistic analysis. Attention is also drawn to inferential interactions, such as redundancy, synergy and directional change. These distinguish the joint evaluation of evidence from assessments of isolated items of evidence. 
Together, these topics present aspects of interest to both domain experts and recipients of expert information, because they have a bearing on how multiple items of evidence are meaningfully and appropriately set into context.
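
The inferential interactions mentioned above can be made concrete with a tiny two-evidence model. The sketch below, with invented conditional probabilities, enumerates a joint distribution over a hypothesis H and two items of evidence, and shows that when the items are dependent given not-H, the joint likelihood ratio falls well short of the product of the individual ones, i.e. the items are partly redundant:

```python
# Toy network: H -> E1, and E2 depends on both H and E1
# (e.g. if the first match was coincidental, a second coincidental
#  match is more likely too). All probabilities are invented.
p_e1 = {True: 0.9, False: 0.1}                       # P(E1 | H)
p_e2 = {(True, True): 0.9,  (True, False): 0.9,      # P(E2 | H, E1)
        (False, True): 0.8, (False, False): 0.05}

def p_e2_marginal(h):
    """P(E2 | H=h), summing out E1."""
    pe1 = p_e1[h]
    return pe1 * p_e2[(h, True)] + (1 - pe1) * p_e2[(h, False)]

lr1 = p_e1[True] / p_e1[False]                       # LR of E1 alone = 9.0
lr2 = p_e2_marginal(True) / p_e2_marginal(False)     # LR of E2 alone = 7.2
lr_joint = (p_e1[True] * p_e2[(True, True)]) / (p_e1[False] * p_e2[(False, True)])

redundant = lr_joint < lr1 * lr2   # True: the two items partly overlap
```

With the dependency removed (P(E2 | not-H, E1) equal to P(E2 | not-H, not-E1)) the joint likelihood ratio would factorize into lr1 * lr2; real casework networks add intermediate propositions and more states, but the bookkeeping is the same.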

Relevance:

30.00%

Publisher:

Abstract:

When decommissioning a nuclear facility, it is important to be able to estimate the activity of potentially radioactive samples and compare it with the clearance values defined by regulatory authorities. This paper presents a method of calibrating a clearance box monitor based on practical experimental measurements and Monte Carlo simulations. Adjusting the simulation to experimental data obtained with a simple point source permits the computation of absolute calibration factors for more complex geometries, with an uncertainty of slightly more than 20%. The uncertainty of the calibration factor can be reduced to about 10% when the simulation is used relatively, in direct comparison with a measurement performed in the same geometry but with another nuclide. The simulation can also be used to validate the experimental calibration procedure when the sample is assumed to be homogeneous but the calibration factor is derived from a plate phantom. For more realistic geometries, such as a small gravel dumpster, Monte Carlo simulation shows that the calibration factor obtained with a larger homogeneous phantom is correct to within about 20%, provided sample density is taken as the influencing parameter. Finally, simulation can be used to estimate the effect of a contamination hotspot. The research supporting this paper shows that, if the sample is assumed to be homogeneously contaminated, activity can be largely underestimated in the event of a centrally located hotspot and overestimated for a peripherally located hotspot. This demonstrates the usefulness of complementing experimental methods with Monte Carlo simulations to estimate calibration factors that cannot be measured directly because of a lack of available material or specific geometries.
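
As a flavour of how Monte Carlo sampling yields geometry-dependent factors, the toy sketch below estimates the geometric detection efficiency of an isotropic point source facing a disk detector and checks it against the closed-form solid-angle result. A real calibration would track photon transport, attenuation and detector response with a dedicated code; this is purely the sampling idea, with invented dimensions:

```python
import math, random

def mc_point_source_efficiency(dist, radius, n=200_000, seed=1):
    """Fraction of isotropic emissions from an on-axis point source
    that hit a disk of the given radius at the given distance."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # Isotropic direction: cos(theta) uniform on [-1, 1].
        cos_t = rng.uniform(-1.0, 1.0)
        if cos_t <= 0.0:
            continue                      # emitted away from the detector
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        # Radial distance from the axis where the ray crosses the disk plane.
        if dist * sin_t / cos_t <= radius:
            hits += 1
    return hits / n

eff_mc = mc_point_source_efficiency(dist=10.0, radius=5.0)
# Analytic solid-angle fraction for an on-axis point source and a disk.
eff_exact = 0.5 * (1.0 - 10.0 / math.sqrt(10.0**2 + 5.0**2))
```

The relative use of simulation described above corresponds to taking ratios of such simulated efficiencies between two geometries (or two nuclides), so that systematic errors common to both largely cancel.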

Relevance:

30.00%

Publisher:

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies

Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression, and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features ("geo-features"). They are well suited to implementation as predictive engines in decision support systems, for purposes of environmental data mining ranging from pattern recognition to modeling and prediction to automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from the theoretical description of the concepts to software implementation. The main algorithms and models considered are the multilayer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF), and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression, and density estimation. Exploratory data analysis (EDA) is an initial and very important part of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, i.e. experimental variography, and machine learning. Experimental variography, which studies the relationships between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations that helps detect the presence of spatial patterns describable by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a current hot topic: the automatic mapping of geospatial data. General regression neural networks are proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters: theory, applications, software tools, and guided examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well, with a user-friendly and easy-to-use interface.
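
At its core, the GRNN used for automatic mapping is Nadaraya-Watson kernel regression: each prediction is a Gaussian-weighted average of the training values, with the kernel width sigma as the single hyperparameter. A minimal sketch on invented 2-D monitoring coordinates:

```python
import math

def grnn_predict(coords, values, query, sigma):
    """GRNN prediction at one query point: a Gaussian-kernel-weighted
    average of the training values (Nadaraya-Watson regression)."""
    num = den = 0.0
    for (x, y), v in zip(coords, values):
        d2 = (x - query[0])**2 + (y - query[1])**2
        w = math.exp(-d2 / (2.0 * sigma**2))
        num += w * v
        den += w
    return num / den

# Invented monitoring points (x, y) with one measured value at each.
coords = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 2)]
values = [1.0, 2.0, 2.0, 3.0, 6.0]

# Small sigma: the prediction at a training point reproduces its value.
pred_local = grnn_predict(coords, values, (2, 2), sigma=0.1)
# Large sigma: the prediction tends to the global mean of the data.
pred_smooth = grnn_predict(coords, values, (2, 2), sigma=100.0)
```

In practice sigma is tuned, e.g. by cross-validation; that single-parameter tuning is what makes the GRNN attractive for automatic mapping under time pressure.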

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to determine the species composition and community structure of Carabidae and Staphylinidae in five areas of forest fragments adjacent to soybean/corn crops or orange orchards, from December 2004 to May 2007. Beetles were captured in pitfall traps distributed along two parallel 200-m transects placed across the crop/forest fragment boundary, with 100 m on each side. The Shannon-Wiener diversity and evenness indices and the Morisita similarity index were calculated. The carabids Abaris basistriatus Chaudoir, Calosoma granulatum Perty, Megacephala brasiliensis Kirby, Odontochila nodicornis (Dejean) and Selenophorus seriatoporus Putzeys are dominant and widely distributed in northeastern São Paulo state, Brazil. Point-scale species diversity was greatest at the transition between the forest fragment and the cultivated area. The carabid and staphylinid communities of the forest fragment were more similar to the community of the orange orchard than to that of the soybean/corn crops.
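
The indices named above are short formulas, sketched here on invented abundance counts. Note the similarity measure below is the Morisita-Horn variant, which equals 1 for communities with identical relative abundances; the study itself used Morisita's original index:

```python
import math

def shannon(counts):
    """Shannon-Wiener diversity H' = -sum(p_i * ln p_i)."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def evenness(counts):
    """Pielou's evenness J' = H' / ln(S), with S the species count."""
    s = sum(1 for c in counts if c > 0)
    return shannon(counts) / math.log(s)

def morisita_horn(x, y):
    """Morisita-Horn similarity between two abundance vectors."""
    tx, ty = sum(x), sum(y)
    dx = sum(v * v for v in x) / tx**2
    dy = sum(v * v for v in y) / ty**2
    return 2 * sum(a * b for a, b in zip(x, y)) / ((dx + dy) * tx * ty)

forest = [10, 10, 10, 10]      # perfectly even: H' = ln 4, J' = 1
orchard = [20, 20, 20, 20]     # same relative abundances as the forest
h = shannon(forest)
j = evenness(forest)
sim = morisita_horn(forest, orchard)   # identical proportions -> 1.0
```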

Relevance:

30.00%

Publisher:

Abstract:

Iowa features an extensive surface transportation system, with more than 110,000 miles of roadway, most of which is under the jurisdiction of local agencies. Given that Iowa is a lower-population state, most of this mileage is located in rural areas that exhibit low traffic volumes of less than 400 vehicles per day. However, these low-volume rural roads also account for about half of all recorded traffic crashes in Iowa, including a high percentage of fatal and major injury crashes. This study was undertaken to examine these crashes, identify major contributing causes, and develop low-cost strategies for reducing their incidence. Iowa's extensive crash and roadway system databases were used to obtain the needed data. Using descriptive statistics, a test of proportions, and crash modeling, various classes of rural secondary roads were compared with similar state of Iowa controlled roads in terms of crash frequency, severity, density, and rate for numerous selected factors that could contribute to crashes. The results supported conclusions about common contributing factors for crashes on low-volume rural roads, both paved and unpaved. Because of their higher crash statistics, particular attention was paid to unpaved rural roads with traffic volumes greater than 100 vehicles per day. Recommendations for addressing these crashes with low-cost mitigation are also included. Because of the isolated nature of traffic crashes on low-volume roads, a systemic (mass action) approach to safety mitigation was recommended for an identified subset of the entire system. In addition, future development of a reliable crash prediction model is described.
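
The test of proportions mentioned above compares, for example, the share of severe crashes on one road class against another. A minimal two-proportion z-test on invented counts (not the study's data):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, math.erfc(abs(z) / math.sqrt(2))      # z and two-sided p

# Hypothetical: 30 fatal/major-injury crashes out of 400 on unpaved roads
# versus 45 out of 1200 on comparable paved roads.
z, p_value = two_proportion_z(30, 400, 45, 1200)
```

A small p-value here would support the kind of conclusion drawn in the study, that severe-crash proportions differ systematically between road classes, and would motivate the systemic (rather than spot-location) mitigation approach recommended above.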

Relevance:

30.00%

Publisher:

Abstract:

The Center for Transportation Research and Education (CTRE) issued a report in July 2003 based on a sample study applying land use change detection from remotely sensed imagery to the traffic monitoring methodology in Black Hawk County, Iowa. In summary, the results indicated a strong correlation and a statistically significant regression coefficient between the identification of built-up land use change areas from remotely sensed data and corresponding changes in traffic patterns, expressed as vehicle miles traveled (VMT). Based on these results, the Iowa Department of Transportation (Iowa DOT) requested that CTRE expand the study area to five counties in the southwest quadrant of the state. These counties are scheduled for traffic counts in 2004, and the Iowa DOT desired the data to 1) evaluate the current methodology used to place the devices; 2) potentially influence the placement of traffic counting devices in areas of high built-up land use change; and 3) determine whether opportunities exist to reduce the frequency and/or density of monitoring activity in less-trafficked rural areas of the state. This project focuses on the practical application of built-up land use change data for the placement of traffic count recording devices in five southwest Iowa counties.

Relevance:

30.00%

Publisher:

Abstract:

The factor structure of a back-translated Spanish version (Lega, Caballo and Ellis, 2002) of the Attitudes and Beliefs Inventory (ABI) (Burgess, 1990) is analyzed in a sample of 250 university students. The Spanish version of the ABI is a 48-item self-report inventory using a 5-point Likert scale that assesses rational and irrational attitudes and beliefs. Twenty-four items cover two dimensions of irrationality: a) areas of content (3 subscales), and b) styles of thinking (4 subscales). An Exploratory Factor Analysis (Parallel Analysis with the Unweighted Least Squares method and Promin rotation) was performed with the FACTOR 9.20 software (Lorenzo-Seva and Ferrando, 2013). The results reproduced the four main styles of irrational thinking in relation to the three specific contents of irrational beliefs. However, two factors showed a complex configuration, with important cross-loadings of different items in content and style. More analyses are needed to review the specific content and style of such items.
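
Parallel analysis, the retention method used above, keeps a factor only if its observed eigenvalue exceeds the corresponding eigenvalue obtained from random data of the same shape. A sketch on synthetic data with two built-in factors; the actual study used FACTOR 9.20 with ULS extraction and Promin rotation, which this does not reproduce:

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_var = 250, 6

# Synthetic responses driven by two latent factors (three items each).
f1, f2 = rng.normal(size=(2, n_obs))
noise = rng.normal(scale=0.3, size=(n_obs, n_var))
data = np.column_stack([f1, f1, f1, f2, f2, f2]) + noise

# Observed eigenvalues of the correlation matrix, largest first.
obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data.T)))[::-1]

# Eigenvalues of pure-noise data of the same shape: 95th percentile
# over 200 random replicates (Horn's parallel analysis threshold).
reps = []
for _ in range(200):
    r = rng.normal(size=(n_obs, n_var))
    reps.append(np.sort(np.linalg.eigvalsh(np.corrcoef(r.T)))[::-1])
threshold = np.percentile(np.array(reps), 95, axis=0)

n_factors = int(np.sum(obs_eig > threshold))   # factors to retain
```

On this synthetic design the two planted factors clearly clear the random-data threshold while the remaining eigenvalues fall below it, which is the decision rule parallel analysis formalizes.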

Relevance:

30.00%

Publisher:

Abstract:

Global positioning systems (GPS) offer a cost-effective and efficient method for entering and updating transportation data. The spatial locations of objects provided by GPS are easily integrated into geographic information systems (GIS). The storage, manipulation, and analysis of spatial data are also relatively simple in a GIS. However, many data storage and reporting methods at transportation agencies rely on linear referencing methods (LRMs); consequently, GPS data must be able to link with linear referencing. Unfortunately, the two systems are fundamentally incompatible in the way data are collected, integrated, and manipulated. In order for spatial data collected using GPS to be integrated into a linear referencing system or shared among LRMs, a number of issues need to be addressed. This report documents and evaluates several of those issues and offers recommendations. To evaluate the issues associated with integrating GPS data with an LRM, a pilot study was created. For the pilot study, point features, a linear datum, and a spatial representation of an LRM were created for six test roadway segments located within the boundaries of the pilot study conducted by the Iowa Department of Transportation linear referencing system project team. Various issues in integrating point features with an LRM, or between LRMs, are discussed and recommendations provided. The accuracy of GPS is discussed, including issues such as point features mapping to the wrong segment. Another topic is the loss of spatial information that occurs when a three-dimensional or two-dimensional spatial point feature is converted to a one-dimensional representation on an LRM. Recommendations such as storing point features as spatial objects where necessary, or preserving information such as coordinates and elevation, are suggested. The lack of spatial accuracy characteristic of much of the cartography on which LRMs are often based is another topic discussed. 
The associated issues include linear and horizontal offset error. The final topic discussed is some of the issues in transferring point feature data between LRMs.
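
Converting a 2-D GPS point to a linear reference, i.e. a measure along the route plus a lateral offset, amounts to projecting the point onto each segment of the route polyline and keeping the nearest. A plain-coordinates sketch (no datum transformation; a production system would first project GPS latitude/longitude into the LRS's planar coordinate system):

```python
import math

def point_to_measure(route, pt):
    """Project pt onto a route polyline.
    Returns (measure along the route, offset distance)."""
    best = (float("inf"), 0.0)        # (offset, measure)
    chainage = 0.0                    # length accumulated so far
    for (x1, y1), (x2, y2) in zip(route, route[1:]):
        dx, dy = x2 - x1, y2 - y1
        seg_len = math.hypot(dx, dy)
        # Parameter of the perpendicular foot, clamped to the segment.
        t = max(0.0, min(1.0, ((pt[0]-x1)*dx + (pt[1]-y1)*dy) / seg_len**2))
        px, py = x1 + t*dx, y1 + t*dy
        offset = math.hypot(pt[0]-px, pt[1]-py)
        if offset < best[0]:
            best = (offset, chainage + t*seg_len)
        chainage += seg_len
    offset, measure = best
    return measure, offset

route = [(0, 0), (10, 0), (10, 10)]          # an L-shaped road
m1, off1 = point_to_measure(route, (4, 1))   # beside the first leg
m2, off2 = point_to_measure(route, (11, 5))  # beside the second leg
```

This is also where the one-dimensionality loss discussed above bites: after conversion, only (measure, offset) survive, so the original coordinates and elevation must be stored alongside if they are to be recovered later.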

Relevance:

30.00%

Publisher:

Abstract:

Elevated serum uric acid levels cause gout and are a risk factor for cardiovascular disease and diabetes. To investigate the polygenetic basis of serum uric acid levels, we conducted a meta-analysis of genome-wide association scans from 14 studies totalling 28,141 participants of European descent, resulting in the identification of 954 SNPs distributed across nine loci that exceeded the threshold of genome-wide significance, five of which are novel. Overall, the common variants associated with serum uric acid levels fall in the following nine regions: SLC2A9 (p = 5.2×10^-201), ABCG2 (p = 3.1×10^-26), SLC17A1 (p = 3.0×10^-14), SLC22A11 (p = 6.7×10^-14), SLC22A12 (p = 2.0×10^-9), SLC16A9 (p = 1.1×10^-8), GCKR (p = 1.4×10^-9), LRRC16A (p = 8.5×10^-9), and near PDZK1 (p = 2.7×10^-9). Identified variants were analyzed for gender differences. We found that the minor allele of rs734553 in SLC2A9 has a greater influence in lowering uric acid levels in women, and the minor allele of rs2231142 in ABCG2 elevates uric acid levels more strongly in men than in women. To further characterize the identified variants, we analyzed their association with a panel of metabolites. rs12356193 within SLC16A9 was associated with DL-carnitine (p = 4.0×10^-26) and propionyl-L-carnitine (p = 5.0×10^-8) concentrations, which in turn were associated with serum UA levels (p = 1.4×10^-57 and p = 8.1×10^-54, respectively), forming a triangle between SNP, metabolites, and UA levels. Taken together, these associations highlight additional pathways that are important in the regulation of serum uric acid levels and point toward novel potential targets for pharmacological intervention to prevent or treat hyperuricemia. In addition, these findings strongly support the hypothesis that transport proteins are key in regulating serum uric acid levels.
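
A genome-wide meta-analysis like the one above typically combines per-study effect estimates by fixed-effect inverse-variance weighting and compares the resulting p-value against the genome-wide threshold of 5×10^-8. A minimal sketch with invented per-study betas and standard errors (not the study's data):

```python
import math

def fixed_effect_meta(betas, ses):
    """Inverse-variance fixed-effect meta-analysis.
    Returns (combined beta, combined SE, two-sided p-value)."""
    weights = [1.0 / se**2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    z = beta / se
    return beta, se, math.erfc(abs(z) / math.sqrt(2))

# Hypothetical per-study estimates for one SNP (effect on serum uric
# acid per copy of the minor allele) -- all numbers invented.
beta, se, p = fixed_effect_meta([0.30, 0.28, 0.33], [0.05, 0.06, 0.05])

genome_wide_significant = p < 5e-8   # the usual GWAS threshold
```

The threshold 5×10^-8 is essentially a Bonferroni correction for roughly one million effectively independent common-variant tests, which is why per-locus p-values as extreme as those quoted above are required.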


Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: The European Organisation for Research and Treatment of Cancer and National Cancer Institute of Canada trial of temozolomide (TMZ) and radiotherapy (RT) in glioblastoma (GBM) demonstrated that the combination of TMZ and RT conferred a significant and meaningful survival advantage compared with RT alone. We evaluated in this trial whether the recursive partitioning analysis (RPA) retains its overall prognostic value, and what the benefit of the combined modality is in each RPA class. PATIENTS AND METHODS: Five hundred seventy-three patients with newly diagnosed GBM were randomly assigned to standard postoperative RT or to the same RT with concomitant TMZ followed by adjuvant TMZ. The primary end point was overall survival. The European Organisation for Research and Treatment of Cancer RPA used here accounts for age, WHO performance status, extent of surgery, and the Mini-Mental Status Examination. RESULTS: Overall survival was statistically different among RPA classes III, IV, and V, with median survival times of 17, 15, and 10 months, respectively, and 2-year survival rates of 32%, 19%, and 11%, respectively (P < .0001). Survival with combined TMZ/RT was higher in RPA class III, with a 21-month median survival time and a 43% 2-year survival rate, versus 15 months and 20% for RT alone (P = .006). In RPA class IV, the survival advantage remained significant, with median survival times of 16 versus 13 months, respectively, and 2-year survival rates of 28% versus 11%, respectively (P = .0001). In RPA class V, however, the survival advantage of RT/TMZ was of borderline significance (P = .054). CONCLUSION: RPA retains its prognostic significance overall, as well as in patients receiving RT with or without TMZ for newly diagnosed GBM, particularly in classes III and IV.

Relevance:

30.00%

Publisher:

Abstract:

The stop-loss reinsurance is one of the most important reinsurance contracts in the insurance market. From the insurer's point of view, it has an interesting property: it is optimal under the criterion of minimizing the variance of the insurer's cost. The aim of this paper is to contribute to the analysis of the one-period stop-loss contract from the points of view of both the insurer and the reinsurer. First, the influence of the parameters of the reinsurance contract on the correlation coefficient between the cost of the insurer and the cost of the reinsurer is studied. Second, the optimal stop-loss contract is obtained when the criterion used is the maximization of the joint survival probability of the insurer and the reinsurer over one period.
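
The quantities studied above can be explored numerically. Under a stop-loss treaty with retention M, the insurer pays min(X, M) of the aggregate loss X and the reinsurer pays the excess max(X - M, 0). The sketch below simulates a compound Poisson aggregate loss and estimates the correlation between the two costs; all distributional assumptions are invented for illustration:

```python
import math, random

def poisson(rng, lam):
    """Knuth's method for a Poisson variate (fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_costs(retention, n_sims=20_000, lam=5.0, mean_sev=10.0, seed=7):
    """Simulate compound Poisson aggregate losses with exponential
    severities, split into insurer cost min(X, M) and excess (X - M)+."""
    rng = random.Random(seed)
    ins, rei = [], []
    for _ in range(n_sims):
        x = sum(rng.expovariate(1.0 / mean_sev)
                for _ in range(poisson(rng, lam)))
        ins.append(min(x, retention))          # insurer keeps losses up to M
        rei.append(max(x - retention, 0.0))    # reinsurer pays the excess
    return ins, rei

def pearson(a, b):
    """Sample Pearson correlation coefficient."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    va = sum((x - ma) ** 2 for x in a) / n
    vb = sum((y - mb) ** 2 for y in b) / n
    return cov / math.sqrt(va * vb)

ins, rei = simulate_costs(retention=80.0)
corr = pearson(ins, rei)   # positive: both costs grow with the total loss
```

Sweeping the retention M in such a simulation exhibits the trade-off the paper analyses: a very high M makes the reinsurer's cost (and the correlation) vanish, a very low M transfers most of the risk, and the optimal M under the joint-survival criterion balances the two.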