Resumo:
Even though the use of recommender systems is already widespread in several application areas, there is still a lack of studies in the accessibility research field. One attempt to bring the benefits of recommender systems to accessibility needs is Vulcanus. The Vulcanus recommender system uses similarity analysis to compare users' trails; in this way, it can take advantage of a user's past behavior to deliver personalized content and services. Vulcanus combines concepts from ubiquitous computing, such as user profiles, context awareness, trails management, and similarity analysis. It uses two different approaches for trail similarity analysis: resource patterns and category patterns. In this work we performed an asymptotic analysis, identifying the complexity of Vulcanus' algorithm. Furthermore, we propose improvements achieved through the dynamic programming technique: by using a bottom-up approach, many unnecessary comparisons can be skipped, and Vulcanus 2.0 is presented with an improved average-case scenario.
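The bottom-up improvement can be illustrated with a small sketch (hypothetical: the exact trail representation and similarity measure used by Vulcanus are not given here). Treating two trails as sequences of visited resources, a longest-common-subsequence table filled bottom-up reuses every sub-comparison instead of recomputing it, which is the kind of saving dynamic programming brings to the average case:

```python
def trail_similarity(a, b):
    """Longest-common-subsequence length between two trails,
    computed bottom-up: each cell reuses previously computed
    sub-results instead of recursing over them again."""
    m, n = len(a), len(b)
    # dp[i][j] = LCS length of the prefixes a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
```

The table costs O(len(a) * len(b)) time and space, with no repeated work regardless of how similar the two trails are.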
Resumo:
Applications are subject to a continuous evolution process with a profound impact on their underlying data model, hence requiring frequent updates to the applications' class structure and database structure as well. This twofold problem, schema evolution and instance adaptation, usually known as database evolution, is addressed in this thesis. Additionally, we address concurrency and error recovery problems with a novel meta-model and its aspect-oriented implementation. Modern object-oriented databases provide features that help programmers deal with object persistence, as well as related problems such as database evolution, concurrency, and error handling. Most systems offer transparent mechanisms to address these problems; nonetheless, the database evolution problem still requires some human intervention, which consumes much of programmers' and database administrators' work effort. Earlier research has demonstrated that aspect-oriented programming (AOP) techniques enable the development of flexible and pluggable systems. In these earlier works, the schema evolution and instance adaptation problems were addressed as database management concerns. However, none of this research focused on orthogonal persistent systems. We argue that AOP techniques are well suited to address these problems in orthogonal persistent systems. Regarding concurrency and error recovery, earlier research showed that only syntactic obliviousness between the base program and aspects is possible. Our meta-model and framework follow an aspect-oriented approach focused on the object-oriented orthogonal persistent context. The proposed meta-model is characterized by its simplicity, in order to achieve efficient and transparent database evolution mechanisms. It supports multiple versions of a class structure by applying a class versioning strategy, thus enabling bidirectional application compatibility among versions of each class structure.
That is to say, the database structure can be updated while earlier applications continue to work, as do later applications that know only the updated class structure. The specific characteristics of orthogonal persistent systems, as well as a metadata enrichment strategy within the application's source code, complete the inception of the meta-model and have motivated our research work. To test the feasibility of the approach, a prototype was developed. Our prototype is a framework that mediates the interaction between applications and the database, providing them with orthogonal persistence mechanisms. These mechanisms are introduced into applications as an aspect, in the aspect-oriented sense. Objects do not require the extension of any superclass, the implementation of an interface, or a particular annotation. Parametric type classes are also correctly handled by our framework. However, classes that belong to the programming environment must not be handled as versionable, due to restrictions imposed by the Java Virtual Machine. Regarding concurrency support, the framework provides applications with a multithreaded environment that supports database transactions and error recovery. The framework keeps applications oblivious to the database evolution problem, as well as to persistence. Programmers can update an application's class structure, because the framework will produce a new version of it at the database metadata layer. Using our XML-based pointcut/advice constructs, the framework's instance adaptation mechanism can be extended, keeping the framework oblivious to this problem as well. The potential development gains provided by the prototype were benchmarked. In our case study, the results confirm that the mechanisms' transparency has positive repercussions on programmer productivity, simplifying the entire evolution process at the application and database levels.
The meta-model itself was also benchmarked in terms of complexity and agility. Compared with other meta-models, it requires fewer meta-object modifications in each schema evolution step. Other types of tests were carried out to validate the robustness of the prototype and the meta-model; for these we used an OO7 small-size database, chosen for its data model complexity. Since the developed prototype offers features that were not observed in other known systems, performance benchmarks were not possible. However, the developed benchmark is now available for future performance comparisons with equivalent systems. To test our approach in a real-world scenario, we developed a proof-of-concept application. This application was developed without any persistence mechanisms; using our framework and minor changes to the application's source code, we added these mechanisms. Furthermore, we tested the application in a schema evolution scenario. This real-world experience showed that, when using our framework, applications remain oblivious to persistence and database evolution. In this case study, our framework proved to be a useful tool for programmers and database administrators. Performance issues and the single-Java-Virtual-Machine concurrency model are the major limitations found in the framework.
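A minimal sketch of the class-versioning idea described above (hypothetical names and structure; the thesis' meta-model is considerably richer and lives at the database metadata layer): each class keeps an ordered list of schema versions, and instance adaptation maps a stored record onto whichever version the running application expects, which is what makes compatibility bidirectional:

```python
class SchemaRegistry:
    """Toy class-versioning registry: every schema update appends a
    new field-set version instead of replacing the old one."""

    def __init__(self):
        self.versions = {}  # class name -> list of field tuples

    def register(self, cls_name, fields):
        """Record a new version of a class structure; returns its
        version number."""
        self.versions.setdefault(cls_name, []).append(tuple(fields))
        return len(self.versions[cls_name]) - 1

    def adapt(self, cls_name, record, to_version):
        """Instance adaptation: keep the fields shared with the target
        version, default (None) any fields that version adds, and drop
        fields it does not know about."""
        target_fields = self.versions[cls_name][to_version]
        return {f: record.get(f) for f in target_fields}
```

Adapting an old record to a newer version fills the added fields with defaults, and adapting a new record back to an older version simply drops the extra fields, so applications built against either version can read the same store.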
Resumo:
Numerous components of the Arctic freshwater system (atmosphere, ocean, cryosphere, terrestrial hydrology) have experienced large changes over the past few decades, and these changes are projected to amplify further in the future. Observations are particularly sparse, in both time and space, in the Polar Regions. Hence, modeling systems have been widely used and are a powerful tool for understanding the functioning of the Arctic freshwater system and its integration within the global Earth system and climate. Here, we present a review of modeling studies addressing some aspect of the Arctic freshwater system. Through illustrative examples, we point out the value of using a hierarchy of models of increasing complexity and component interactions in order to disentangle the important processes at play in the variability and changes of the different components of the Arctic freshwater system and the interplay between them. We discuss past and projected changes in the Arctic freshwater system and explore the sources of uncertainty associated with these model results. We further elaborate on some missing processes that should be included in future generations of Earth system models, and highlight the importance of better quantification and understanding of natural variability, amongst other factors, for improved predictions of Arctic freshwater system change.
Resumo:
Stochastic methods based on time-series modeling combined with geostatistics can be useful tools to describe the variability of water-table levels in time and space and to account for uncertainty. Water-level monitoring networks can give information about the dynamics of the aquifer domain in both dimensions. Time-series modeling is an elegant way to treat monitoring data without the complexity of physically based mechanistic models. Time-series model predictions can be interpolated spatially, with the spatial differences in water-table dynamics determined by the spatial variation in the system properties, and the temporal variation driven by the dynamics of the inputs into the system. An integration of stochastic methods is presented, based on time-series modeling and geostatistics, as a framework to predict water levels for decision making in groundwater management and land-use planning. The methodology is applied to a case study in an outcrop area of the Guarani Aquifer System (GAS), located in southeastern Brazil. Communication of results in a clear and understandable form, via simulated scenarios, is discussed as an alternative when translating scientific knowledge into applications of stochastic hydrogeology in large aquifers with limited monitoring network coverage, like the GAS.
Resumo:
At the end of the 19th century, Dr. Ramón y Cajal, a scientific pioneer, discovered the individual cellular elements, called neurons, that compose the nervous system. He also noted the complexity of this system and asserted that new neurons could not be integrated into the adult nervous system. One of his well-known statements, "In the adult centers the nerve paths are fixed, ended, immutable; everything may die, nothing may be regenerated," is representative of the dogma of the era (Ramón y Cajal 1928). Important studies carried out in the 1960s and 1970s suggested a different view: it was shown that new neurons can be generated in adulthood, but this discovery was met with pervasive skepticism within the scientific community. It took 30 years for the concept of adult neurogenesis to become widely accepted. This discovery, together with numerous technical advances, opened the door to potential new therapeutic targets for neurodegenerative diseases. Adult neural stem cells (NSCs) reside mainly in two niches of the brain: the subventricular zone of the lateral ventricles and the dentate gyrus of the hippocampus. Under physiological conditions, the level of neurogenesis is relatively high in the subventricular zone, in contrast with the hippocampus, where certain steps are limiting. The spinal cord, by contrast, is generally described as a quiescent environment. One of the main questions raised by these discoveries is: how can adult NSCs be activated in order to increase levels of neurogenesis? In the hippocampus, the capacity of an enriched environment (including cognitive stimulation, exercise, and social interactions) to promote hippocampal neurogenesis has already been demonstrated.
The plasticity of this region is important, because it can play a key role in the recovery of memory and learning deficits. In the spinal cord, in vitro studies have shown that the ependymal cells surrounding the central canal have self-renewal and multipotency capacities (neurons, astrocytes, oligodendrocytes). Interestingly, in vivo, following a spinal cord injury, ependymal cells are activated and can self-renew, but can only give rise to glial cell types (astrocytes and oligodendrocytes). This post-injury function shows that plasticity is still possible in a quiescent environment and can be exploited to develop endogenous repair strategies in the spinal cord. Adult NSCs play an important role in maintaining the physiological functions of the healthy brain and in neuronal repair after injury. However, there are few data on the mechanisms that allow the activation of quiescent NSCs to maintain these functions. The overall objective is to elucidate the mechanisms underlying NSC activation in the adult central nervous system. To address this objective, we set up two complementary approaches in adult mice: 1) activation of hippocampal NSCs by environmental enrichment (EE), and 2) activation of spinal cord NSCs by neuroinflammation following injury. In addition, 3) to obtain more information on the molecular mechanisms at play in these models, we used transcriptomic approaches to open new perspectives. The first project aims to establish new cellular and molecular mechanisms through which the enriched environment modulates adult brain plasticity.
We first evaluated the contribution of each component of the enriched environment to hippocampal neurogenesis (Chapter II). Voluntary exercise promotes neurogenesis, whereas the social context increases neuronal activation. We then determined the effect of these components on behavioral performance and on the transcriptome, using an eight-arm radial maze to assess spatial memory and a novel-object-recognition test, together with RNA-Seq, respectively (Chapter III). Runners showed stronger short-term spatial recall, whereas mice exposed to social interactions showed greater cognitive flexibility in abandoning their old memories. Surprisingly, the RNA-Seq analysis identified clear differences in transcript expression between short- and long-distance runners, as well as in social mice (in the complex environment). The second project aims to discover how ependymal cells acquire NSC properties in vitro, or multipotency following injury in vivo (Chapter IV). An RNA-Seq analysis revealed that transforming growth factor-β1 (TGF-β1) acts as an upstream regulator of the significant changes following spinal cord injury. We then confirmed the presence of this cytokine after injury and characterized its role in the proliferation, differentiation, and survival of neurosphere-initiating cells from the spinal cord. Our results suggest that TGF-β1 regulates the acquisition and expression of stem-cell properties in spinal cord ependymal cells.
Resumo:
Early water resources modeling efforts were aimed mostly at representing hydrologic processes, but the need for interdisciplinary studies has led to increasing complexity and integration of environmental, social, and economic functions. The gradual shift from merely employing engineering-based simulation models to applying more holistic frameworks is an indicator of promising changes in the traditional paradigm for the application of water resources models, supporting more sustainable management decisions. This dissertation contributes to the application of a quantitative-qualitative framework for sustainable water resources management, using system dynamics simulation as well as environmental systems analysis techniques to provide insights for water quality management in the Great Lakes basin. The traditional linear-thinking paradigm lacks the mental and organizational framework for sustainable development trajectories and may lead to quick-fix solutions that fail to address the key drivers of water resources problems. To facilitate holistic analysis of water resources systems, systems thinking seeks to understand interactions among the subsystems. System dynamics provides a suitable framework for operationalizing systems thinking and applying it to water resources problems, offering useful qualitative tools such as causal loop diagrams (CLDs), stock-and-flow diagrams (SFDs), and system archetypes. The approach provides a high-level quantitative-qualitative modeling framework for "big-picture" understanding of water resources systems, stakeholder participation, policy analysis, and strategic decision making. While quantitative modeling using extensive computer simulations and optimization is still very important and needed for policy screening, qualitative system dynamics models can improve understanding of general trends and the root causes of problems, and thus promote sustainable water resources decision making.
Within the system dynamics framework, a growth-and-underinvestment (G&U) system archetype governing Lake Allegan's eutrophication problem was hypothesized to explain the system's problematic behavior and identify policy leverage points for mitigation. A system dynamics simulation model was developed to characterize the lake's recovery from its hypereutrophic state and to assess a number of proposed total maximum daily load (TMDL) reduction policies, including phosphorus load reductions from point sources (PS) and non-point sources (NPS). It was shown that, for a TMDL plan to be effective, it should be considered a component of a continuous sustainability process that accounts for the dynamic feedback relationships between socio-economic growth, land use change, and environmental conditions. Furthermore, a high-level simulation-optimization framework was developed to guide watershed-scale best management practice (BMP) implementation in the Kalamazoo watershed. Agricultural BMPs should be given priority in the watershed in order to facilitate cost-efficient attainment of Lake Allegan's total phosphorus (TP) concentration target. However, without adequate support policies, agricultural BMP implementation may adversely affect agricultural producers. Results from a case study of the Maumee River basin show that coordinated BMP implementation across upstream and downstream watersheds can significantly improve the cost efficiency of TP load abatement.
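The stock-and-flow mechanics can be sketched with a single-stock phosphorus balance (illustrative only, and not the dissertation's model, which couples the lake stock to socio-economic and land-use feedback loops): one stock is filled by an inflow and drained by outflows proportional to the stock, integrated step by step:

```python
def simulate_lake_tp(load, settling_rate, flushing_rate, tp0, years, dt=0.1):
    """One-stock sketch: the total-phosphorus (TP) stock is increased
    by the external load (inflow) and drained by settling and flushing
    (outflows proportional to the stock), integrated with Euler steps."""
    tp = tp0
    for _ in range(int(years / dt)):
        inflow = load
        outflow = (settling_rate + flushing_rate) * tp
        tp += (inflow - outflow) * dt  # stock = integral of (inflows - outflows)
        # units are arbitrary in this sketch
    return tp
```

At equilibrium the stock settles at load / (settling_rate + flushing_rate), which is why load-reduction policies shift the long-run TP concentration in proportion to the reduction.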
Resumo:
Climate change, intensive use, and population growth are threatening the availability of water resources. New sources of water, better knowledge of existing ones, and improved water management strategies are of paramount importance. Groundwater is often considered a primary water source due to its advantages in terms of quantity, spatial distribution, and natural quality. Remote sensing techniques afford scientists a unique opportunity to characterize landscapes in order to assess groundwater resources, particularly in tectonically influenced areas. Aquifers in volcanic basins are considered the most productive aquifers in Latin America. Although topography is considered the primary driving force for groundwater flow in mountainous terrains, tectonic activity increases the complexity of these groundwater systems by altering the integrity of sedimentary rock units and the overlying drainage networks. Structural controls affect the primary hydraulic properties of the rock formations, developing barriers to flow in some cases and zones of preferential infiltration and subterranean flow in others. The study area focuses on the Quito Aquifer System (QAS) in Ecuador. The characterization of the hydrogeology started with a lineament analysis based on a combined remote sensing and digital terrain analysis approach. The application of classical tools for regional hydrogeological evaluation, together with shallow geophysical methods, was useful to evaluate the impact of faulting and fracturing on the aquifer system. Given the spatial extension of the area and the complexity of the system, two levels of analysis were applied in this study. At the regional level, a lineament map was created for the QAS, and relationships between fractures, faults, and lineaments and the configuration of groundwater flow in the QAS were determined.
At the local level, in the Plateaus region of the QAS, a detailed lineament map was obtained by using high-spatial-resolution satellite imagery and an aspect map derived from a digital elevation model (DEM). This map was complemented by the analysis of morphotectonic indicators and shallow geophysics to characterize fracture patterns. The development of the groundwater flow system was studied, drawing upon data on the aquifer system's physical characteristics and topography. Hydrochemistry was used to ascertain the groundwater evolution and to verify the correspondence of the flow patterns proposed in the flow system analysis. Isotopic analysis was employed to verify the origin of the groundwater. The results of this study show that tectonism plays a very important role in the hydrology of the QAS. The results also demonstrate that faults influence a great deal of the topographic characteristics of the QAS and, subsequently, the configuration of groundwater flow. Moreover, for the Plateaus region, the results demonstrate that the aquifer flow systems are affected by secondary porosity. This is a new conceptualization of the functioning of the aquifers of the QAS that will significantly contribute to the development of better strategies for the management of this important water resource.
Resumo:
Matrix factorization (MF) has evolved as one of the best practices for handling sparse data in the field of recommender systems. Funk singular value decomposition (Funk-SVD) is a variant of MF that became a state-of-the-art method through the winning of the Netflix Prize competition, and it is still widely used, with modifications, in present-day recommender systems research. With data points growing at very high velocity, it is prudent to devise newer methods that can handle such data more accurately and efficiently than Funk-SVD in the context of recommender systems. In view of the growing number of data points, I propose a latent factor model that caters to both accuracy and efficiency by reducing the number of latent features of either users or items, making it less complex than Funk-SVD, where the latent features of users and items are equal in number and often larger. A comprehensive empirical evaluation of accuracy on two publicly available datasets, Amazon and MovieLens 100K (ml-100k), reveals the comparable accuracy and lower complexity of the proposed methods relative to Funk-SVD.
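For reference, the Funk-SVD baseline that the proposed model is measured against can be sketched as plain stochastic gradient descent over the observed ratings (a generic textbook formulation; the hyperparameters below are arbitrary, and bias terms are omitted for brevity):

```python
import numpy as np

def funk_svd(ratings, n_users, n_items, k=2, lr=0.05, reg=0.02,
             epochs=300, seed=0):
    """Learn user factors P and item factors Q by SGD over observed
    (user, item, rating) triples so that P[u] . Q[i] approximates
    the known rating; both sides share the same latent dimension k,
    which is the point the proposed asymmetric model relaxes."""
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((n_users, k))
    Q = 0.1 * rng.standard_normal((n_items, k))
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]           # prediction error
            pu = P[u].copy()                # use old value for both updates
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * pu - reg * Q[i])
    return P, Q
```

Each epoch costs O(|ratings| * k), so shrinking the latent dimension on one side, as the proposal suggests, directly reduces both training cost and model size.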
Resumo:
Self-replication and compartmentalization are two central properties thought to be essential for minimal life, and understanding how such processes interact in the emergence of complex reaction networks is crucial to exploring the development of complexity in chemistry and biology. Autocatalysis can emerge from multiple different mechanisms, such as formation of an initiator, template self-replication, and physical autocatalysis (where micelles formed from the reaction product solubilize the reactants, leading to higher local concentrations and therefore higher rates). Amphiphiles are also used in artificial life studies to create protocell models such as micelles, vesicles, and oil-in-water droplets, and can increase reaction rates by encapsulation of reactants. So far, no template self-replicator exists that is capable of compartmentalization, or of transferring this molecular-scale phenomenon to micro- or macro-scale assemblies. Here, a system is demonstrated in which an amphiphilic imine catalyses its own formation by joining a non-polar alkyl tail group with a polar carboxylic acid head group to form a template, which was shown to form reverse micelles by dynamic light scattering (DLS). The kinetics of this system were investigated by 1H NMR spectroscopy, showing clearly that a template self-replication mechanism operates, though there was no evidence that the reverse micelles participated in physical autocatalysis. Active oil droplets, composed of a mixture of insoluble organic compounds in an aqueous sub-phase, can undergo processes such as division, self-propulsion, and chemotaxis, and are studied as models for minimal cells, or protocells. Although in most cases the Marangoni effect is responsible for the forces on the droplet, the behaviour of the droplet depends heavily on its exact composition.
Though theoretical models are able to calculate the forces on a droplet, modelling a mixture of oils on an aqueous surface, where compounds from the oil phase are dissolving and diffusing through the aqueous phase, is beyond current computational capability. The behaviour of a droplet in an aqueous phase is determined by the droplet's composition, but can only be discovered through experiment. By using an evolutionary algorithm and a liquid-handling robot to conduct droplet experiments and decide, entirely autonomously, which compositions to test next, the composition of the droplet becomes a chemical genome capable of evolution. Selection is carried out according to a fitness function, which ranks each formulation based on how well it conforms to the chosen fitness criteria (e.g. movement or division). Over successive generations, significant increases in fitness are achieved, and this increase is higher with more components (i.e. greater complexity). Other chemical processes, such as chemiluminescence and gelation, were investigated in active oil droplets, demonstrating the possibility of controlling chemical reactions by selective droplet fusion. Potential future applications might include combinatorial chemistry, or additional fitness goals for the genetic algorithm. Combining the self-replication and droplet-protocell research, it was demonstrated that the presence of the amphiphilic replicator lowers the interfacial tension between droplets of a reaction mixture in organic solution and the alkaline aqueous phase, causing them to divide. Periodic sampling by a liquid-handling robot revealed that the extent of droplet fission increased as the reaction progressed, producing more individual protocells with increased self-replication. This demonstrates the coupling of the molecular-scale phenomenon of template self-replication to a macroscale physicochemical effect.
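The closed-loop evolution can be sketched in software (a toy stand-in: a programmed fitness function here replaces the robot's camera-based measurement of droplet movement or division, and all genome details are invented for illustration). Each genome is a composition vector of oil fractions summing to 1; the fitter half survives, and offspring are made by blending two parents and mutating:

```python
import random

def evolve(fitness, n_components=4, pop_size=10, generations=20, seed=42):
    """Toy genetic algorithm over droplet formulations."""
    rng = random.Random(seed)

    def normalise(g):
        s = sum(g)
        return [x / s for x in g]  # fractions must sum to 1

    # initial population: random compositions
    pop = [normalise([rng.random() for _ in range(n_components)])
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)   # rank by measured fitness
        parents = pop[: pop_size // 2]        # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)     # crossover: blend two parents
            child = [(x + y) / 2 + rng.gauss(0, 0.05) for x, y in zip(a, b)]
            children.append(normalise([max(x, 1e-6) for x in child]))
        pop = parents + children
    return max(pop, key=fitness)
```

For example, `evolve(lambda g: g[0])` drives the population toward formulations dominated by the first component; in the real system the fitness call is an actual robot experiment on that formulation.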
Resumo:
The enteric nervous system (ENS) modulates a number of digestive functions, including well-known ones, i.e. motility, secretion, absorption, and blood flow, along with other critically relevant processes, i.e. the immune responses of the gastrointestinal (GI) tract, the gut microbiota, and the epithelial barrier. The characterization of the anatomical aspects of the ENS in large mammals, and the identification of differences and similarities existing between species, may represent a fundamental basis to decipher several digestive GI diseases in humans and animals. In this perspective, the aim of the present thesis is to highlight the anatomical basis and pathological aspects of the ENS in different mammalian species, such as horses, dogs, and humans. Firstly, I designed two anatomical studies in horses: "Excitatory and inhibitory enteric innervation of horse lower esophageal sphincter" and "Localization of 5-hydroxytryptamine 4 receptor (5-HT4R) in the equine enteric nervous system". Then I focused on enteric dysfunctions, including a primary enteric aganglionosis in horses ("Extrinsic innervation of the ileum and pelvic flexure of foals with ileocolonic aganglionosis"), a diabetic enteric neuropathy in dogs ("Quantification of nitrergic neurons in the myenteric plexus of gastric antrum and ileum of healthy and diabetic dogs"), and an enteric neuropathy in human neurological patients ("Functional and neurochemical abnormalities in patients with Parkinson's disease and chronic constipation"). The physiology of the GI tract is characterized by high complexity and is mainly dependent on the control of the intrinsic nervous system. The ENS is critical to preserving body homeostasis, as reflected by its derangement in pathological conditions that can be lethal or seriously disabling to humans and animals. Knowledge of the anatomy and pathology of the ENS represents an important and fascinating new topic, which deserves more attention in the veterinary medicine field.
Resumo:
Compared with other materials, plastics have registered a strong acceleration in production and consumption during the last years. Despite the existence of waste management systems, plastic-based materials are still a pervasive presence in the environment, with negative consequences for marine ecosystems and human health. Recycling is still challenging due to the growing complexity of product design, the so-called overpackaging, insufficient and inadequate recycling infrastructure, the weak market for recycled plastics, and the high cost of waste treatment and disposal. The Circular Economy Package, the European Strategy for Plastics in a Circular Economy, and the recent European Green Deal include very ambitious programmes to rethink the entire plastic value chain. As regards packaging, all plastic packaging will have to be 100% recyclable (or reusable) and 55% recycled by 2030. Regions are consequently called upon to set up a robust plan able to meet the European objectives. This takes on greater importance in Emilia-Romagna, where the Packaging Valley is located. This thesis supports the definition of a strategy aimed at establishing an after-use plastics economy in the region. The PhD work has set the basis and the instruments to establish the so-called Circularity Strategy, with the aim of turning about 92,000 t of plastic waste into profitable secondary resources. System innovation, life cycle thinking, and a participative backcasting method have allowed the current system to be deeply analysed, the problem to be framed, and sustainable solutions to be explored through broad stakeholder participation. A material flow analysis, accompanied by a barrier analysis, has supported the identification of the gaps between the present situation and the 2030 scenario. Eco-design for and from recycling, and a mass-based recycling rate (based on the effective amount of plastic waste turned into secondary plastics) valorized by a value-based indicator, are the key points of the action plan.
Resumo:
Introduction: Recently, the American Association of Gynecologic Laparoscopists (AAGL) proposed a new classification and scoring system with the specific aim of assessing surgical complexity. This study sought to assess whether a higher AAGL score correlates with an increased risk of peri-operative complications in women undergoing surgery for endometriosis. Methods: This is a retrospective cohort study conducted in a third-level referral center. We collected data from women with endometriosis who underwent complete surgical removal of endometriosis from January 2019 to December 2021. The ENZIAN and r-ASRM classifications and the AAGL total score were calculated for each patient. The population was divided into two groups according to the occurrence or not of at least one peri-operative complication. Our primary outcome was to evaluate the correlation between the AAGL score and the occurrence of complications. Results: During the study period we analyzed data from 282 eligible patients. Among them, 80 (28.4%) experienced peri-operative complications. No statistically significant difference was found between the two groups in terms of baseline characteristics, except for pre-operative hemoglobin (Hb), which was lower in patients with complications (p=0.001). Surgical variables associated with the occurrence of complications were recto-sigmoid surgery (p=0.003), ileocecal resection (p=0.034), and longer operative time (p=0.007). Furthermore, a higher ENZIAN B score (p=0.006), AAGL score (p=0.045), and AAGL stage (p=0.022) were found in the group of patients with complications. The multivariate analysis only confirmed the significant association between the occurrence of peri-operative complications and lower pre-operative Hb level (OR 0.74; 95% CI, 0.59-0.94; p=0.014), longer operative time (OR 1.00; 95% CI, 1.00-1.01; p=0.013), recto-sigmoid surgery, especially discoid resection (OR 8.73; 95% CI, 2.18-35; p=0.016), and ENZIAN B3 (OR 3.62; 95% CI, 1.46-8.99; p=0.006).
Conclusion: According to our findings, high AAGL scores or stages do not seem to increase the risk of peri-operative complications.
Resumo:
Bone marrow is organized in specialized microenvironments known as 'marrow niches'. These are important for the maintenance of stem cells and their hematopoietic progenitors, whose homeostasis also depends on other cell types present in the tissue. Extrinsic factors, such as infection and inflammatory states, may affect this system by causing cytokine dysregulation (an imbalance in cytokine production) and changes in cell proliferation and self-renewal rates, and may also induce changes in metabolism and the cell cycle. Known to be associated with chronic inflammation, obesity is responsible for systemic changes that are best studied in the cardiovascular system. Little is known regarding the changes in the hematopoietic system induced by the inflammatory state accompanying obesity, or the cellular and molecular mechanisms involved. Understanding the biological behavior of hematopoietic stem cells under obesity-induced chronic inflammation could help elucidate the pathophysiological mechanisms involved in other inflammatory processes, such as neoplastic diseases and bone marrow failure syndromes.
Resumo:
To compare the time and risk to biochemical recurrence (BR) after radical prostatectomy in two chronologically different groups of patients using the standard and the modified Gleason systems. Cohort 1 comprised biopsies of 197 patients graded according to the standard Gleason system (SGS) in the period 1997-2004, and cohort 2 comprised 176 biopsies graded according to the modified Gleason system (MGS) in the period 2005-2011. Time to BR was analyzed with the Kaplan-Meier product-limit method, and prediction of shorter time to recurrence with univariate and multivariate Cox proportional hazards models. Patients in cohort 2 reflected time-related changes: a striking increase in clinical stage T1c, systematic use of extended biopsies, and a lower percentage of total length of cancer (in millimeters) across all cores. The MGS used in cohort 2 yielded fewer biopsies with Gleason score ≤ 6 and more biopsies with the intermediate Gleason score 7. Time to BR in the Kaplan-Meier curves reached statistical significance using the MGS in cohort 2, but not the SGS in cohort 1. Only the MGS predicted shorter time to BR on univariate analysis, and on multivariate analysis it was an independent predictor. The results support the view that the 2005 International Society of Urological Pathology modified system is a refinement of Gleason grading and is valuable for contemporary clinical practice.