857 results for complexity of agents
Abstract:
Information overload and organisational complexity have created a need for knowledge management. The aim of this study is to identify the change needs created by the introduction of a portal as a knowledge management tool. The study also compares the new tools with existing ones and assesses the organisation's ability to transfer knowledge virtually. Literature on comparable projects has not been available, since the technology being introduced is fairly new. The same technology is in use in a somewhat different area than the one targeted in this project. The study is a case study whose main sources are documents produced in various meetings. The researcher has participated actively in the project work, so some of the background information is based on the researcher's observations and on discussions. The theoretical part deals with knowledge sharing from the perspectives of knowledge management and virtuality. Change management is discussed briefly in connection with the introduction of the knowledge management tool. The study is part of Stora Enso Consumer Boards' knowledge management project.
Abstract:
Before 2011, patients with advanced or metastatic melanoma had a particularly poor long-term prognosis. Since traditional treatments failed to confer a survival benefit, patients were preferentially entered into clinical trials of investigational agents. A greater understanding of the epidemiology and biology of disease has underpinned the development of newer therapies, including six agents that have been approved in the EU, US and/or Japan: a cytotoxic T-lymphocyte antigen-4 inhibitor (ipilimumab), two programmed cell death-1 receptor inhibitors (nivolumab and pembrolizumab), two BRAF inhibitors (vemurafenib and dabrafenib) and a MEK inhibitor (trametinib). The availability of these treatments has greatly improved the outlook for patients with advanced melanoma; however, a major consideration for physicians is now to determine how best to integrate these agents into clinical practice. Therapeutic decisions are complicated by the need to consider patient and disease characteristics, and individual treatment goals, alongside the different efficacy and safety profiles of agents with varying mechanisms of action. Long-term survival, an outcome largely out of reach with traditional systemic therapies, is now a realistic goal, creating the additional need to re-establish how clinical benefit is evaluated. In this review we summarise the current treatment landscape in advanced melanoma and discuss the promise of agents still in development. We also speculate on the future of melanoma treatment and discuss how combination and sequencing approaches may be used to optimise patient care in the future.
Abstract:
The aim of this study is to explore the role and importance of different animal species in Turku through an analysis of osteological data and documentary evidence. The osteological material used in this study is derived from two town plots in Turku dating from the 13th century to the 19th century. The osteological material deposited in Turku represents animals bred both in the town and in the surrounding landscape. Animal husbandry in SW Finland can also be examined through a number of historical documents. The importance of animals in Turku and its hinterland are closely connected, and therefore the roles of the animals in both urban and rural settings are examined. The study has revealed the complexity of the depositional patterns in medieval and post-medieval Turku. In the different areas of Turku, characteristic patterns in the osteological material and different deposit types were evident. These patterns are reflections of the activities, and therefore of the lifestyles, practiced in Turku. The results emphasise the importance of context-awareness in the study of material culture from archaeological sites. Both the zooarchaeological and historical sources indicate that cattle were important in animal husbandry in Turku from the Middle Ages up to the 19th century. Sheep were the second most common species. When taking into consideration the larger size of cattle, the dominance of these animals when it comes to meat consumption seems clear even in those phases where sheep bones are more abundant. Pigs are less abundant in the material than either cattle or sheep, and their importance for subsistence was probably fairly modest, albeit constant. Goats were not abundant in the material. Most of the identified goat bones came from low-utility body parts (e.g. skulls and lower extremities), but some goat meat was also consumed. Wild species were of minor importance when it came to consumption practices in Turku.
The changes in Turku's animal husbandry patterns between the medieval and post-medieval periods are reflected in the change in the age of the animals slaughtered, which was part of a wider pattern seen in Northern and Central Europe. More mature animals are also present in the assemblages. This pattern is related to the more pronounced importance of cattle as a manure producer and a draught animal as a result of the intensification of crop cultivation. This change seems to occur later in Finland than in the more southerly regions, and indeed it did not necessarily take hold in all parts of the country.
Abstract:
Post-translational protein modifications are crucial for many fundamental cellular and extracellular processes and greatly contribute to the complexity of organisms. Human HCF-1 is a transcriptional co-regulator that undergoes complex protein maturation involving reversible and irreversible post-translational modifications. Upon synthesis as a large precursor protein, HCF-1 undergoes extensive reversible glycosylation with β-N-acetylglucosamine giving rise to O-linked-β-N-acetylglucosamine (O-GlcNAc) modified serines and threonines. HCF-1 also undergoes irreversible site-specific proteolysis, which is important for one of HCF-1's major functions - the regulation of the cell-division cycle. HCF-1 O-GlcNAcylation and site-specific proteolysis are both catalyzed by a single enzyme with an unusual dual enzymatic activity, the O-GlcNAc transferase (OGT). HCF-1 is cleaved by OGT at any of six highly conserved 26 amino acid repeated sequences (HCF-1PRO repeats), but the mechanisms and the substrate requirements for OGT-mediated cleavage are not understood. In the present work, I characterized substrate requirements for OGT-mediated cleavage and O-GlcNAcylation of HCF-1. I identified key elements within the HCF-1PRO-repeat sequence that are important for proteolysis. Remarkably, an invariant single amino acid side-chain within the HCF-1PRO-repeat sequence displays particular OGT-binding properties and is essential for proteolysis. Additionally, I characterized substrate requirements for proteolysis outside of the HCF-1PRO repeat and identified a novel, highly O-GlcNAcylated OGT-binding sequence that enhances cleavage of the first HCF-1PRO repeat. These results link OGT association and its O-GlcNAcylation activities to HCF-1PRO-repeat proteolysis.
Abstract:
PURPOSE: Statistical shape and appearance models play an important role in reducing the segmentation processing time of a vertebra and in improving results for 3D model development. Here, we describe the different steps in generating a statistical shape model (SSM) of the second cervical vertebra (C2) and provide the shape model for general use by the scientific community. The main difficulties in its construction are the morphological complexity of the C2 and its variability in the population. METHODS: The input dataset is composed of manually segmented anonymized patient computerized tomography (CT) scans. The different datasets are aligned with Procrustes alignment on surface models, and the registration is then cast as a model-fitting problem using a Gaussian process. A principal component analysis (PCA)-based model is generated which captures the variability of the C2. RESULTS: The SSM was generated using 92 CT scans. The resulting SSM was evaluated for specificity, compactness and generalization ability. The SSM of the C2 is freely available to the scientific community in Slicer (an open source software for image analysis and scientific visualization) with a module created to visualize the SSM using Statismo, a framework for statistical shape modeling. CONCLUSION: The SSM of the vertebra allows the shape variability of the C2 to be represented. Moreover, the SSM will enable semi-automatic segmentation and 3D model generation of the vertebra, which would greatly benefit surgery planning.
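The Procrustes-plus-PCA pipeline described in this abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: point correspondences are assumed given, the Gaussian-process registration step is simplified away, and the function names are invented for this sketch.

```python
import numpy as np

def procrustes_align(shapes):
    """Rigidly align each shape (n_points x 3) to the first one:
    translations removed, rotations found via SVD (orthogonal Procrustes)."""
    ref = shapes[0] - shapes[0].mean(axis=0)
    aligned = []
    for s in shapes:
        s = s - s.mean(axis=0)               # remove translation
        u, _, vt = np.linalg.svd(s.T @ ref)  # optimal rotation of s onto ref
        r = u @ vt
        if np.linalg.det(r) < 0:             # avoid reflections
            u[:, -1] *= -1
            r = u @ vt
        aligned.append(s @ r)
    return np.stack(aligned)

def build_ssm(shapes, n_modes):
    """PCA-based statistical shape model: mean shape plus the main
    modes of variation and their variances."""
    flat = procrustes_align(shapes).reshape(len(shapes), -1)
    mean = flat.mean(axis=0)
    _, sing, vt = np.linalg.svd(flat - mean, full_matrices=False)
    variances = sing**2 / (len(shapes) - 1)
    return mean, vt[:n_modes], variances[:n_modes]

def sample_shape(mean, modes, variances, coeffs):
    """Generate a new shape instance from standardized mode coefficients."""
    return mean + (np.asarray(coeffs) * np.sqrt(variances)) @ modes
```

Setting all coefficients to zero reproduces the mean shape; varying one coefficient within a few standard deviations sweeps the corresponding mode of anatomical variability, which is how frameworks such as Statismo expose an SSM.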
Abstract:
Background: Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables "frequency" and "degree of conflict". In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable "exposure to conflict", as well as considering six "types of ethical conflict". An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods: The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validity, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Results: Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions: The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units.
Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.
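As a reminder of how the reliability figure reported above is computed, here is a minimal sketch of Cronbach's alpha (the standard formula, not code tied to the ECNQ-CCV data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the summed score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

For the ECNQ-CCV, the matrix would be 205 respondents by 19 scenario scores; a value such as 0.882 indicates high internal consistency.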
Abstract:
The management of primary CNS lymphoma is one of the most controversial topics in neuro-oncology because of the complexity of the disease and the very few controlled studies available. In 2013, the European Association of Neuro-Oncology created a multidisciplinary task force to establish evidence-based guidelines for immunocompetent adults with primary CNS lymphoma. In this Review, we present these guidelines, which provide consensus considerations and recommendations for diagnosis, assessment, staging, and treatment of primary CNS lymphoma. Specifically, we address aspects of care related to surgery, systemic and intrathecal chemotherapy, intensive chemotherapy with autologous stem-cell transplantation, radiotherapy, intraocular manifestations, and management of elderly patients. The guidelines should aid clinicians in their daily practice and decision making, and serve as a basis for future investigations in neuro-oncology.
Abstract:
BACKGROUND: Recent methodological advances allow better examination of speciation and extinction processes and patterns. A major open question is the origin of the large discrepancies in species number between groups of the same age. Existing frameworks to model this diversity either focus on changes between lineages, neglecting global effects such as mass extinctions, or focus on changes over time, which would affect all lineages. Yet it seems probable that both lineage differences and mass extinctions affect the same groups. RESULTS: Here we used simulations to test the performance of two widely used methods under complex scenarios of diversification. We report good performance, although with a tendency to over-predict events as the complexity of the scenario increases. CONCLUSION: Overall, we find that lineage shifts are better detected than mass extinctions. This work has significance for assessing the methods currently used to estimate changes in diversification using phylogenetic trees. Our results also point toward the need to develop new models of diversification to expand our capabilities to analyse realistic and complex evolutionary scenarios.
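A toy version of the kind of diversification scenario tested here can be simulated directly. The sketch below is illustrative only (the rates, times, and simulation protocol are placeholders, not those of the study): it runs a constant-rate birth-death process with one instantaneous mass extinction applied as an independent survival filter on every lineage.

```python
import random

def simulate_diversification(birth=0.4, death=0.1, t_mass_ext=10.0,
                             survival_prob=0.2, t_end=15.0, seed=1):
    """Lineage count under a constant-rate birth-death process with one
    instantaneous mass extinction at t_mass_ext (Gillespie simulation).
    All rates and times are illustrative placeholders."""
    random.seed(seed)
    t, n = 0.0, 1
    mass_ext_done = False
    while t < t_end and n > 0:
        # waiting time to the next speciation/extinction event
        t_next = t + random.expovariate(n * (birth + death))
        if not mass_ext_done and t_next >= t_mass_ext:
            # each lineage survives the mass extinction independently
            n = sum(random.random() < survival_prob for _ in range(n))
            t, mass_ext_done = t_mass_ext, True
            continue
        t = t_next
        n += 1 if random.random() < birth / (birth + death) else -1
    return n
```

Running many replicates of such scenarios, and then asking whether an inference method recovers the rate shift or the mass extinction, is the general logic of the performance test described above.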
Abstract:
Dopamine release in the prefrontal cortex plays a critical role in cognitive functions such as working memory, attention and planning. Dopamine exerts complex modulation of the excitability of pyramidal neurons and interneurons, and regulates excitatory and inhibitory synaptic transmission. Because of the complexity of this modulation, it is difficult to fully comprehend the effect of dopamine on neuronal network activity. In this study, we investigated the effect of dopamine on local high-frequency oscillatory neuronal activity (in the β band) in slices of the mouse anterior cingulate cortex (ACC). We found that dopamine enhanced the power of these oscillations induced by kainate and carbachol, but did not affect their peak frequency. Activation of D2R, and to a lesser degree D1R, increased the oscillation power, while activation of D4R had no effect. These high-frequency oscillations in the ACC relied on both phasic inhibitory and excitatory transmission and on functional gap junctions. Thus, dopamine released in the ACC promotes high-frequency synchronized local cortical activity, which is known to favor information transfer and the fast selection and binding of distributed neuronal responses. Finally, the power of these oscillations was significantly enhanced after degradation of the perineuronal nets (PNNs) enwrapping most parvalbumin interneurons. This study provides new insights for a better understanding of the abnormal prefrontal gamma activity in schizophrenia (SZ) patients, who display prefrontal anomalies of both the dopaminergic system and the PNNs.
Abstract:
Wastewater-based epidemiology consists of acquiring relevant information about the lifestyle and health status of a population through the analysis of wastewater samples collected at the influent of a wastewater treatment plant. Whilst a very young discipline, it has experienced an astonishing development since its first application in 2005. The possibility of gathering community-wide information about drug use has been among its major fields of application. The wide resonance of the first results sparked the interest of scientists from various disciplines, and research has since broadened in innumerable directions. Although praised as a revolutionary approach, its added value needed to be critically assessed against the existing indicators used to monitor illicit drug use. The main, and explicit, objective of this research was to evaluate the added value of wastewater-based epidemiology with regard to two particular, although interconnected, dimensions of illicit drug use. The first is the epidemiological, or societal, perspective: to evaluate whether and how the discipline complements our current vision of the extent of illicit drug use at the population level, and whether it can guide the planning of future prevention measures and drug policies. The second dimension is the criminal one, with a particular focus on the networks which develop around the large demand for illicit drugs. The goal here was to assess whether wastewater-based epidemiology, combined with indicators stemming from the epidemiological dimension, could provide additional clues about the structure of drug distribution networks and the size of their market. This research also had an implicit objective, which was to initiate the path of wastewater-based epidemiology at the Ecole des Sciences Criminelles of the University of Lausanne.
This consisted of gathering the necessary knowledge about the collection, preparation, and analysis of wastewater samples and, most importantly, of understanding how to interpret the acquired data and produce useful information. In the first phase of this research, it was possible to determine that ammonium loads, measured directly in the wastewater stream, could be used to monitor the dynamics of the population served by the wastewater treatment plant. Furthermore, it was shown that, over the long term, population dynamics did not have a substantial impact on consumption patterns measured through wastewater analysis. Focusing on methadone, for which precise prescription data was available, it was possible to show that reliable consumption estimates could be obtained via wastewater analysis. This made it possible to validate the selected sampling strategy, which was then used to monitor the consumption of heroin through the measurement of morphine. The latter, in combination with prescription and sales data, provided estimates of heroin consumption in line with other indicators. These results, combined with epidemiological data, highlighted the good correspondence between measurements and expectations and, furthermore, suggested that the dark figure of heroin users evading harm-reduction programs, who would thus not be measured by conventional indicators, is likely limited. The third part, a collaborative study extensively investigating geographical differences in drug use, showed wastewater analysis to be a useful complement to existing indicators. In particular for stigmatised drugs, such as cocaine and heroin, it helped decipher the complex picture derived from surveys and crime statistics. Globally, it provided relevant information to better understand the drug market, both from an epidemiological and a repressive perspective.
The fourth part focused on cannabis and on the potential of combining wastewater and survey data to overcome some of their respective limitations. Using a hierarchical inference model, it was possible to refine current estimates of cannabis prevalence in the metropolitan area of Lausanne. Wastewater results suggested that the actual prevalence is substantially higher than existing figures, supporting the common belief that surveys tend to underestimate cannabis use. Whilst affected by several biases, the information collected through surveys made it possible to overcome some of the limitations linked to the analysis of cannabis markers in wastewater (i.e., stability and limited excretion data). These findings highlighted the importance and utility of combining wastewater-based epidemiology with existing indicators of drug use. Similarly, the fifth part of the research was centred on assessing the potential uses of wastewater-based epidemiology from a law enforcement perspective. Through three concrete examples, it was shown that results from wastewater analysis can be used to produce highly relevant intelligence, allowing drug enforcement to assess the structure and operations of drug distribution networks and, ultimately, to guide decisions at the tactical and/or operational level. Finally, the potential of wastewater-based epidemiology to monitor the use of harmful, prohibited and counterfeit pharmaceuticals was illustrated through the analysis of sibutramine, and its urinary metabolite, in wastewater samples. The results of this research have highlighted that wastewater-based epidemiology is a useful and powerful approach with numerous scopes. Faced with the complexity of measuring a hidden phenomenon like illicit drug use, it is a major addition to the panoply of existing indicators.
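The back-calculation at the core of wastewater-based epidemiology follows a standard scheme: scale the measured metabolite concentration to a daily load, then correct for the excreted fraction and the metabolite-to-parent molecular-weight ratio. A minimal sketch (the numbers in the usage example are hypothetical placeholders, not the substance-specific correction factors used in the thesis):

```python
def consumption_estimate(conc_ng_per_l, flow_l_per_day, excretion_fraction,
                         mw_parent, mw_metabolite, population):
    """Back-calculate consumption of a parent drug (mg/day per 1000
    inhabitants) from the measured metabolite concentration in wastewater.
    excretion_fraction and the molecular-weight ratio are substance-specific
    pharmacokinetic correction factors."""
    load_mg_per_day = conc_ng_per_l * flow_l_per_day / 1e6  # ng -> mg
    parent_mg_per_day = (load_mg_per_day * mw_parent / mw_metabolite
                         / excretion_fraction)
    return parent_mg_per_day / population * 1000
```

With hypothetical inputs (1000 ng/L, a 10 ML/day flow, 50% excretion, equal molecular weights, 100,000 inhabitants) this yields 200 mg/day per 1000 inhabitants; real applications must also propagate the uncertainty of each factor, including the population estimate discussed above.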
Abstract:
The tourism image is an element that conditions the competitiveness of tourism destinations by making them stand out in the minds of tourists. In this context, marketers of tourism destinations endeavour to create an induced image based on their identity and distinctive characteristics. A number of authors have also recognized the complexity of tourism destinations and the need for coordination and cooperation among all tourism agents in order to supply a satisfactory tourist product and be competitive in the tourism market. Therefore, tourism agents at the destination need to develop and integrate strategic marketing plans. The aim of this paper is to determine how cities of similar cultures use their resources to develop a distinctive induced tourism image to attract tourists, and the extent of coordination and cooperation among the various tourism agents of a destination in the process of induced image creation. In order to accomplish these aims, a comparative analysis of the induced image of two cultural cities, Girona (Spain) and Perpignan (France), is presented. The induced image is assessed through content analysis of promotional brochures, and the extent of cooperation through in-depth interviews with the main tourism agents of these destinations. Despite the similarities of the two cities in terms of tourism resources, the results show the use of different attributes to configure the induced image of each destination, as well as a different configuration of the network of tourism agents that participate in the process of induced image creation.
Abstract:
Possible new ways in the pharmacological treatment of bipolar disorder and comorbid alcoholism. Azorin JM, Bowden CL, Garay RP, Perugi G, Vieta E, Young AH. Department of Psychiatry, CHU Sainte Marguerite, Marseilles, France. About half of all bipolar patients have an alcohol abuse problem at some point in their lifetime. However, only one randomized, controlled trial of pharmacotherapy (valproate) in this patient population had been published as of 2006. We therefore reviewed clinical trials in this indication from the last four years (covering mood stabilizers, atypical antipsychotics, and other drugs). Priority was given to randomized trials comparing drugs with placebo or an active comparator. Published studies were found through a systematic database search (PubMed, Scirus, EMBASE, Cochrane Library, Science Direct). In these last four years, the only randomized, clinically relevant study in bipolar patients with comorbid alcoholism is that of Brown and colleagues (2008), showing that quetiapine therapy decreased depressive symptoms in the early weeks of use without modifying alcohol use. Several other open-label trials have been generally positive and support the efficacy and tolerability of agents from different classes in this patient population. The efficacy of valproate in reducing excessive alcohol consumption in bipolar patients was confirmed, and new controlled studies revealed its therapeutic benefit in preventing relapse in newly abstinent alcoholics and in improving alcohol hallucinosis. Topiramate deserves to be investigated in bipolar patients with comorbid alcoholism, since this compound effectively improves the physical health and quality of life of alcohol-dependent individuals. In conclusion, randomized, controlled research is still needed to provide guidelines for the possible use of valproate and other agents in patients with a dual diagnosis of bipolar disorder and substance abuse or dependence.
Resumo:
There is an increasing reliance on computers to solve complex engineering problems, because computers, in addition to supporting the development and implementation of adequate and clear models, can substantially reduce the financial resources required. The ability of computers to perform complex calculations at high speed has enabled the creation of highly complex systems to model real-world phenomena. The complexity of fluid dynamics problems makes it difficult or impossible to solve the equations governing an object in a flow exactly. Approximate solutions can be obtained by constructing and measuring prototypes placed in a flow, or by numerical simulation. Since the use of prototypes can be prohibitively time-consuming and expensive, many have turned to simulations to provide insight during the engineering process; the simulation setup and parameters can be altered much more easily than in a real-world experiment. The objective of this research work is to develop numerical models for different suspensions (fiber suspensions, blood flow through microvessels and branching geometries, and magnetic fluids), as well as for fluid flow through porous media. The models have merit as scientific tools and also have practical applications in industry. Most of the numerical simulations were performed with the commercial software Fluent, with user-defined functions added to apply a multiscale method and a magnetic field. The results from the simulation of fiber suspensions elucidate the physics behind the break-up of a fiber floc, opening the possibility of developing a meaningful numerical model of fiber flow. The simulation of blood movement from an arteriole to a venule via a capillary showed that the model based on VOF can successfully predict the deformation and flow of RBCs in an arteriole. Furthermore, the result corresponds to the experimental observation that the RBC deforms during its movement.
The concluding remarks provide a sound methodology and a mathematical and numerical framework for the simulation of blood flow in branching geometries. Analysis of the ferrofluid simulations indicates that the magnetic Soret effect can be even stronger than the conventional one, and that its strength depends on the magnetic field strength, as confirmed experimentally by Völker and Odenbach. It was also shown that when the magnetic field is perpendicular to the temperature gradient, there is an additional increase in heat transfer compared to cases where the magnetic field is parallel to the temperature gradient. In addition, a statistical evaluation (the Taguchi technique) of the magnetic fluids showed that the temperature and the initial concentration of the magnetic phase make the maximum and minimum contributions to thermodiffusion, respectively. In the simulation of flow through porous media, the dimensionless pressure drop was studied at different Reynolds numbers, based on pore permeability and interstitial fluid velocity. The obtained results agree well with the correlation of Macdonald et al. (1979) over the range of actual flow Reynolds numbers studied. Furthermore, the calculated dispersion coefficients in the cylinder geometry were found to be in agreement with those of Seymour and Callaghan.
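The porous-media quantities named above (interstitial velocity, a permeability-based pore Reynolds number, and an Ergun-type dimensionless pressure drop) can be sketched as follows. This is a minimal illustration, not the study's implementation: the function names are ours, and the constants A = 180 and B = 1.8 are those commonly attributed to Macdonald et al. (1979) for smooth particles, used here with whatever modified Reynolds number the chosen formulation requires.

```python
import math

def interstitial_velocity(superficial_u, porosity):
    # Dupuit-Forchheimer relation: the fluid moves faster inside the pores
    # than the superficial (Darcy) velocity suggests.
    return superficial_u / porosity

def pore_reynolds(rho, u_interstitial, permeability, mu):
    # Reynolds number with sqrt(K) as the characteristic pore length scale.
    return rho * u_interstitial * math.sqrt(permeability) / mu

def macdonald_friction(re, A=180.0, B=1.8):
    # Ergun-type dimensionless pressure drop; the constants follow the values
    # commonly attributed to Macdonald et al. (1979) for smooth particles
    # (an assumption here, see the lead-in above).
    return A / re + B
```

For example, water (rho = 1000 kg/m^3, mu = 1e-3 Pa s) moving at an interstitial velocity of 0.25 m/s through a medium of permeability 1e-10 m^2 gives a pore Reynolds number of 2.5, a creeping-to-transitional regime where the A/Re term dominates the pressure drop.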
Resumo:
We adapt the Shout and Act algorithm to Digital Objects Preservation, where agents explore file systems looking for digital objects to be preserved (victims). When they find something they “shout” so that their agent mates can hear it: the louder the shout, the more urgent or important the finding is. Louder shouts can also indicate closeness. We perform several experiments to show that the system scales very well, and that heterogeneous teams of agents outperform homogeneous ones over a wide range of task complexities. The target at-risk documents are MS Office documents (including an RTF file) with Excel content or in Excel format. An interesting conclusion from the experiments is that fewer heterogeneous (varying-skill) agents can equal the performance of many homogeneous (combined super-skilled) agents, implying significant performance increases with lower overall cost growth. Our results impact the design of Digital Objects Preservation teams: a properly designed combination of heterogeneous teams is cheaper and more scalable when confronted with uncertain maps of digital objects that need to be preserved. A cost pyramid is proposed for engineers to use in modeling the most effective agent combinations.
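The shout-and-respond mechanism described above can be sketched roughly as follows. This is an illustrative toy, not the paper's implementation: the flat directory model, the `skill` parameter (fraction of found objects an agent can preserve per visit), and all names are our own assumptions.

```python
import random

class Agent:
    """Toy preservation agent; skill is the fraction of found at-risk
    objects it can preserve in one visit (illustrative assumption)."""

    def __init__(self, skill):
        self.skill = skill

    def visit(self, directory, shouts):
        found = directory["at_risk"]
        if found:
            # "Shout": broadcast the finding; louder = more objects at risk.
            shouts.append((directory["name"], found))
        preserved = int(found * self.skill)
        directory["at_risk"] -= preserved
        return preserved

def run_team(agents, directories):
    """Each agent acts once: answer the loudest shout, else explore at random."""
    shouts, preserved = [], 0
    for agent in agents:
        if shouts:
            shouts.sort(key=lambda s: s[1], reverse=True)
            name, _ = shouts.pop(0)  # respond to the loudest shout
            target = next(d for d in directories if d["name"] == name)
        else:
            target = random.choice(directories)  # unguided exploration
        preserved += agent.visit(target, shouts)
    return preserved
```

In this sketch a heterogeneous team simply means a list of agents with differing `skill` values; the paper's comparison of heterogeneous versus homogeneous teams corresponds to varying that list while keeping total cost fixed.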