71 results for "Mass-consistent model"
Abstract:
Using a large prospective cohort of over 12,000 women, we determined two thresholds (high risk and low risk of hip fracture) for use in a 10-yr hip fracture probability model that we had previously described, a model combining the heel stiffness index measured by quantitative ultrasound (QUS) with a set of easily determined clinical risk factors (CRFs). The model identified a higher percentage of women with fractures as high risk than a previously reported risk score combining QUS and CRFs. In addition, it categorized women in a way that was quite consistent with the categorization obtained using dual X-ray absorptiometry (DXA) and the World Health Organization (WHO) classification system: the two methods identified similar percentages of women with and without fractures in each of their three categories, but only partially identified the same women. Nevertheless, combining our composite probability model with DXA in a case-finding strategy will likely further improve the detection of women at high risk of fragility hip fracture. We conclude that the proposed model may be of some use as an alternative to the WHO classification criteria for osteoporosis, at least when access to DXA is limited.
Abstract:
Water is often considered an ordinary substance since it is transparent, odourless, tasteless and very common in nature. In fact, it can be argued that it is the most remarkable of all substances. Without water, life on Earth would not exist. Water is the major component of cells, typically forming 70 to 95% of cellular mass, and it provides an environment for innumerable organisms, since it covers 75% of the Earth's surface. Water is a simple molecule made of two hydrogen atoms and one oxygen atom, H2O. The small size of the molecule stands in contrast with the subtlety of its physical and chemical properties. Among these, the fact that, at the triple point, liquid water is denser than ice is especially remarkable. Despite its special importance in the life sciences, water is systematically removed from biological specimens investigated by electron microscopy, because the high vacuum of the electron microscope requires that the specimen be observed in dry conditions. For 50 years the science of electron microscopy has addressed this problem, resulting in numerous preparation techniques now in routine use. Typically, these techniques consist of fixing the sample (chemically or by freezing) and replacing its water with a plastic that is transformed into a rigid block by polymerisation. The block is then cut into thin sections (c. 50 nm) with an ultramicrotome at room temperature. These techniques generally introduce several artefacts, most of them due to water removal. To avoid these artefacts, the specimen can instead be frozen, cut and observed at low temperature. However, liquid water crystallizes into ice upon freezing, causing severe damage. Ideally, liquid water is solidified into a vitreous state: vitrification consists of cooling water so rapidly that ice crystals have no time to form. A breakthrough took place when vitrification of pure water was discovered. Since this discovery, the thin-film vitrification method has been used successfully for the observation of biological suspensions of small particles. Our work was to extend the method to bulk biological samples, which have to be vitrified, cryosectioned into vitreous sections and observed in a cryo-electron microscope. This technique, called cryo-electron microscopy of vitreous sections (CEMOVIS), is now considered the best way to preserve the ultrastructure of biological tissues and cells very close to the native state for electron microscopic observation. CEMOVIS has recently become a practical method achieving excellent results. It has, however, some severe limitations, the most important of which is certainly cutting artefacts. These are a consequence of the nature of the vitreous material and of the fact that vitreous sections cannot be floated on a liquid, as plastic sections cut at room temperature are. The aim of the present work has been to improve our understanding of the cutting process and of cutting artefacts, and thus to find optimal conditions to minimise or prevent them. An improved model of the cutting process and redefinitions of cutting artefacts are proposed. Results obtained with CEMOVIS under these conditions are presented and compared with results obtained with conventional methods.
Abstract:
Automated genome sequencing and annotation, as well as large-scale gene expression measurement methods, generate a massive amount of data for model organisms. Searching for gene-specific or organism-specific information throughout all the different databases has become a very difficult task, and often results in fragmented and unrelated answers. A database that federates and integrates genomic and transcriptomic data will greatly improve search speed as well as the quality of the results, by allowing a direct comparison of expression results obtained by different techniques. The main goal of this project, called the CleanEx database, is thus to provide access to public gene expression data via unique gene names, and to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-dataset comparisons. A consistent and up-to-date gene nomenclature is achieved by associating each single gene expression experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used. These targets are then mapped at regular intervals to the growing and evolving catalogues of genes from model organisms, such as human and mouse. The fully automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression target database of CleanEx provides gene mapping and quality-control information for various types of experimental resources, such as cDNA clones or Affymetrix probe sets. The Affymetrix mapping files are accessible as text files, for further use in external applications, and as individual entries via the web-based interfaces. The CleanEx web-based query interfaces offer access to individual entries via text-string searches or quantitative expression criteria, as well as cross-dataset analysis tools and cross-chip gene comparison. These tools have proven to be very efficient in expression data comparison and even, to a certain extent, in the detection of differentially expressed splice variants. The CleanEx flat files and tools are available online at http://www.cleanex.isb-sib.ch/.
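The gene index described above is essentially an inversion of the target-to-gene mapping. A minimal sketch of that idea in Python, with entirely hypothetical identifiers and gene names (not actual CleanEx records or its real schema):

```python
# Toy sketch of a target-to-gene mapping and the derived gene index.
# Identifiers, resource types and gene names are hypothetical examples.
targets = {
    "TARGET:0001": {"type": "cDNA clone", "gene": "MYC"},
    "TARGET:0002": {"type": "Affymetrix probe set", "gene": "MYC"},
    "TARGET:0003": {"type": "cDNA clone", "gene": "TP53"},
}

# Rebuild of the gene index: invert the mapping so each gene name
# cross-references every expression target that measures it.
gene_index = {}
for target_id, info in targets.items():
    gene_index.setdefault(info["gene"], []).append(target_id)

print(sorted(gene_index["MYC"]))   # ['TARGET:0001', 'TARGET:0002']
print(sorted(gene_index["TP53"]))  # ['TARGET:0003']
```

Because the target identifiers are permanent while the gene catalogue evolves, only the inversion step needs to be rerun when gene names change, which is consistent with the weekly rebuild described in the abstract.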
Abstract:
Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher its past. Since learning that continents move, geologists have been trying to retrace their distribution through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but are included in a larger ensemble combining oceanic and continental crust: the tectonic plates. Unfortunately, mainly for historical and technical reasons, this idea has not yet received a sufficient echo within the reconstruction community. However, we are convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" point of view and, at last, reach real plate tectonics. The main aim of this study is to defend this point of view by exposing, in all necessary detail, our methods and tools. Starting from the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a methodology that places the plates and their kinematics at the centre of the problem. Using assemblies of continents (referred to as "key assemblies") as anchors distributed over the whole scope of our study (ranging from the Eocene to the Cambrian), we develop geodynamic scenarios leading from one assembly to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolized by synthetic isochrons) to the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved through time, follow a consistent geodynamic evolution and always form an interconnected network through space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides severe constraints for the model. This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become a barely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are informatics tools specifically devoted to storing, managing and analysing spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we converted the mass of scattered data offered by the geological record into valuable geodynamic information easily accessible for the creation of reconstructions. At the same time, by programming specific tools we both facilitated the reconstruction work (task automation) and enhanced the model, greatly increasing the kinematic control of plate motions through plate velocity models. Based on 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset, we can now tackle major issues of modern geology, such as global sea-level variations and climate change. We started by studying another major, still unsolved issue of modern tectonics: the mechanism driving plate motions. We observed that, throughout the Earth's history, plate rotation poles (describing plate motions across the Earth's surface) tend to be distributed along a band going from the northern Pacific through northern South America, the central Atlantic, northern Africa and central Asia up to Japan. Essentially, this means that plates tend to escape this median plane. In the absence of an unidentified methodological bias, we interpreted this as reflecting a secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model, and we took particular care to reconstruct it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next; the crustal material is symbolized by synthetic isochrons whose ages are known. We also reconstructed the margins (active or passive), mid-ocean ridges and intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering far better precision than previously existing ones.
Abstract:
BACKGROUND: Obesity is strongly associated with major depressive disorder (MDD) and various other diseases. Genome-wide association studies have identified multiple risk loci robustly associated with body mass index (BMI). In this study, we aimed to investigate whether a genetic risk score (GRS) combining multiple BMI risk loci might have utility in the prediction of obesity in patients with MDD. METHODS: Linear and logistic regression models were used to predict BMI and obesity, respectively, in three independent large case-control studies of major depression (Radiant, GSK-Munich, PsyCoLaus). The analyses were first performed in the whole sample and then separately in depressed cases and controls. An unweighted GRS was calculated as the sum of the number of risk alleles; a weighted GRS was calculated as the sum of risk alleles at each locus multiplied by their effect sizes. Receiver operating characteristic (ROC) analysis was used to compare the discriminatory ability of predictors of obesity. RESULTS: In the discovery phase, a total of 2,521 participants (1,895 depressed patients and 626 controls) were included from the Radiant study. Both the unweighted and weighted GRS were highly associated with BMI (P < 0.001) but explained only a modest amount of variance. Adding 'traditional' risk factors to the GRS significantly improved predictive ability, with the area under the curve (AUC) in the ROC analysis increasing from 0.58 to 0.66 (95% CI, 0.62-0.68; χ² = 27.68; P < 0.0001). Although there was no formal evidence of interaction between depression status and GRS, there was further improvement in AUC when depression status was added to the model (AUC = 0.71; 95% CI, 0.68-0.73; χ² = 28.64; P < 0.0001). We further found that the GRS explained more of the variance in BMI in depressed patients than in healthy controls, and likewise discriminated obesity better in depressed patients. We then replicated these analyses in two independent samples (GSK-Munich and PsyCoLaus) and found similar results. CONCLUSIONS: A GRS proved to be a highly significant predictor of obesity in people with MDD but accounted for only a modest amount of variance. Nevertheless, as more risk loci are identified, combining a GRS approach with information on non-genetic risk factors could become a useful strategy for identifying MDD patients at higher risk of developing obesity.
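The two genetic risk scores described in the abstract are simple linear combinations of per-locus risk-allele counts. A minimal sketch, using hypothetical allele counts and effect sizes (not values from the Radiant study):

```python
# Hypothetical per-locus data: 0, 1 or 2 risk alleles per locus,
# and a per-allele effect size for each locus (illustrative only).
risk_alleles = [2, 1, 0, 1, 2]
effect_sizes = [0.30, 0.15, 0.10, 0.25, 0.05]

# Unweighted GRS: simple summation of the number of risk alleles.
unweighted_grs = sum(risk_alleles)

# Weighted GRS: risk alleles at each locus multiplied by their effect sizes.
weighted_grs = sum(a * w for a, w in zip(risk_alleles, effect_sizes))

print(unweighted_grs)          # 6
print(round(weighted_grs, 2))  # 1.1
```

The weighting matters when effect sizes differ strongly between loci: the unweighted score treats a weak and a strong locus identically, while the weighted score does not.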
Abstract:
A variation of task analysis was used to build an empirical model of how therapists may facilitate the client assimilation process, as described by the Assimilation of Problematic Experiences Scale. A rational model was specified and considered in light of an analysis of therapist in-session performances (N = 117) drawn from six inpatient therapies for depression. Therapist interventions were measured with the Comprehensive Psychotherapeutic Interventions Rating Scale. Consistent with the rational model, confronting interventions were particularly useful in helping clients elaborate insight. However, rather than a small number of progress-related interventions appearing at lower levels of assimilation, therapists' use of interventions was broader than hypothesized and drew from a wide range of therapeutic approaches. At the higher levels of assimilation, there were insufficient data to allow an analysis of therapists' progress-related interventions.
Abstract:
PURPOSE: To characterize perifoveal intraretinal cavities observed around full-thickness macular holes (MH) using en face optical coherence tomography, and to establish correlations with the histology of human and primate maculae. DESIGN: Retrospective nonconsecutive observational case series. METHODS: Macular en face scans of 8 patients with MH were analyzed to quantify the areas of hyporeflective spaces, and were compared with macular flat mounts and sections from 1 normal human donor eye and 2 normal primate eyes (Macaca fascicularis). Immunohistochemistry was used to study the distribution of glutamine synthetase, expressed by Müller cells, and zonula occludens-1, a tight-junction protein. RESULTS: The mean area of hyporeflective spaces was lower in the inner nuclear layer (INL) than in the complex formed by the outer plexiform (OPL) and Henle fiber layers (HFL): 5.0 × 10⁻³ mm² vs 15.9 × 10⁻³ mm², respectively (P < .0001, Kruskal-Wallis test). In the OPL and HFL, cavities were elongated with a stellate pattern, whereas in the INL they were rounded and formed vertical cylinders. Immunohistochemistry confirmed that Müller cells followed a radial distribution around the fovea in the frontal plane and a "Z-shaped" course in the axial plane, running obliquely in the OPL and HFL and vertically in the inner layers. In addition, zonula occludens-1 co-localized with Müller cells within the OPL-HFL complex, indicating junctions between Müller cells and cone axons. CONCLUSION: The dual profile of cavities around MHs correlates with Müller cell morphology and is consistent with the hypothesis of intra- or extracellular fluid accumulation along these cells.
Abstract:
BACKGROUND: Recent methodological advances allow better examination of speciation and extinction processes and patterns. A major open question is the origin of the large discrepancies in species number between groups of the same age. Existing frameworks model this diversity either by focusing on changes between lineages, neglecting global effects such as mass extinctions, or by focusing on changes over time that affect all lineages. Yet it seems probable that both lineage differences and mass extinctions affect the same groups. RESULTS: Here we used simulations to test the performance of two widely used methods under complex diversification scenarios. We report good performance, although with a tendency to over-predict events as scenario complexity increases. CONCLUSION: Overall, we find that lineage shifts are better detected than mass extinctions. This work is significant for assessing the methods currently used to estimate changes in diversification from phylogenetic trees. Our results also point toward the need for new models of diversification that expand our capability to analyse realistic and complex evolutionary scenarios.
Abstract:
OBJECTIVE: To quantify the relation between body mass index (BMI) and endometrial cancer risk, and to describe the shape of that relation. DESIGN: Pooled analysis of three hospital-based case-control studies. SETTING: Italy and Switzerland. POPULATION: A total of 1449 women with endometrial cancer and 3811 controls. METHODS: Multivariate odds ratios (OR) and 95% confidence intervals (95% CI) were obtained from logistic regression models. The shape of the relation was determined using a class of flexible regression models. MAIN OUTCOME MEASURE: The relation of BMI with endometrial cancer. RESULTS: Compared with women with BMI 18.5 to <25 kg/m², the odds ratio was 5.73 (95% CI 4.28-7.68) for women with BMI ≥35 kg/m². The odds ratios were 1.10 (95% CI 1.09-1.12) and 1.63 (95% CI 1.52-1.75), respectively, for an increment of BMI of 1 and 5 units. The relation was stronger in never-users of oral contraceptives (OR 3.35, 95% CI 2.78-4.03, for BMI ≥30 versus <25 kg/m²) than in users (OR 1.22, 95% CI 0.56-2.67), and in women with diabetes (OR 8.10, 95% CI 4.10-16.01, for BMI ≥30 versus <25 kg/m²) than in those without diabetes (OR 2.95, 95% CI 2.44-3.56). The relation was best fitted by a cubic model, although after exclusion of the upper and lower 5% tails it was best fitted by a linear model. CONCLUSIONS: The results of this study confirm a role of elevated BMI in the aetiology of endometrial cancer and suggest that the risk in obese women increases in a cubic nonlinear fashion. The relation was stronger in never-users of oral contraceptives and in women with diabetes. TWEETABLE ABSTRACT: Risk of endometrial cancer increases with elevated body weight in a cubic nonlinear fashion.
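The two reported increments are internally consistent: in a logistic model the odds ratio is log-linear in BMI, so a 5-unit increment compounds the per-unit odds ratio multiplicatively. A quick check of the reasoning (using only the odds ratios quoted above):

```python
# Per-unit odds ratio reported in the abstract.
or_per_unit = 1.10

# Under a log-linear (logistic) model, a 5-unit BMI increment
# multiplies the odds by the per-unit odds ratio five times.
or_per_5_units = or_per_unit ** 5

print(round(or_per_5_units, 2))  # 1.61, close to the reported 1.63
```

The small gap between 1.61 and the reported 1.63 is expected, since the published estimates were fitted directly rather than derived by compounding.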
Abstract:
BACKGROUND: Obesity and substance use are major concerns in young people. This study explored the bidirectional longitudinal relationships between the body mass index (BMI) of young men and their use of: 1) four classes of non-medical prescription drugs; 2) alcohol; 3) tobacco; and 4) cannabis. METHODS: Baseline and follow-up data from the Cohort Study on Substance Use Risk Factors were used (n = 5,007). A cross-lagged panel model, complemented by probit models as sensitivity analysis, was run to determine the bidirectional relationships between BMI and substance use. Alcohol use was assessed using risky single-occasion drinking (RSOD); tobacco, using daily smoking; and cannabis, using hazardous cannabis use (defined as cannabis use twice weekly or more). Non-medical prescription drug use (NMPDU) included opioid analgesics, sedatives/sleeping pills, anxiolytics and stimulants. RESULTS: Different associations were found between BMI and substance use. Only RSOD (β = -0.053, p = .005) and NMPDU of anxiolytics (β = 0.040, p = .020) at baseline significantly predicted BMI at follow-up: baseline RSOD predicted a lower BMI at follow-up, whereas baseline NMPDU of anxiolytics predicted a higher BMI. Furthermore, baseline BMI significantly predicted daily smoking (β = 0.050, p = .007) and hazardous cannabis use (β = 0.058, p = .030). CONCLUSIONS: Our results suggest different associations between BMI and the use of various substances by young men. However, only RSOD and NMPDU of anxiolytics predicted BMI, whereas BMI predicted daily smoking and hazardous cannabis use.
Abstract:
While obesity continues to rise globally, the associations between body size, gender, and socioeconomic status (SES) appear to vary across populations, and little is known about the contribution of perceived ideal body size to the social disparity of obesity in African countries. We examined gender and socioeconomic patterns of body mass index (BMI) and perceived ideal body size in the Seychelles, a middle-income small island state in the African region, and assessed the potential role of perceived ideal body size as a mediator of the gender-specific association between SES and BMI. A population-based survey of 1,240 adults aged 25 to 64 years was conducted in December 2013. Participants' BMI was calculated from measured weight and height; ideal body size was assessed using a nine-silhouette instrument. Three SES indicators were considered: income, education, and occupation. BMI and perceived ideal body size were both higher among men of higher versus lower SES (p < .001) but lower among women of higher versus lower SES (p < .001), irrespective of the SES indicator used. Multivariate analysis showed a strong, direct association between perceived ideal body size and BMI in both men and women (p < .001), consistent with a potential mediating role of perceived ideal body size in the gender-specific associations between SES and BMI. Our study emphasizes the importance of gender and socioeconomic differences in BMI and ideal body size, and suggests that public health interventions promoting the perception of a healthy weight could help mitigate SES-related disparities in BMI.