Abstract:
Locally advanced prostate cancer (LAPC) is a heterogeneous entity usually embracing T3-4 and/or pelvic lymph-node-positive disease in the absence of established metastases. Outcomes for LAPC with single therapies have traditionally been poor, leading to the investigation of adjuvant therapies. Prostate cancer is a hormonally sensitive tumour, which usually responds to pharmacological manipulation of the androgen receptor or its testosterone-related ligands. As such, androgen deprivation therapy (ADT) has become an important adjuvant strategy for the treatment of LAPC, particularly for patients managed primarily with radiotherapy. Such results have generally not been replicated in surgical patients. With increased use of ADT has come improved awareness of the numerous toxicities associated with long-term use of these agents, as well as the development of strategies for minimizing ADT exposure and actively managing adverse effects. Several trials are exploring agents to enhance radiation cell sensitivity as well as the application of adjuvant docetaxel, an agent with proven efficacy in the metastatic, castrate-resistant setting. The recent work showing activity of cabazitaxel, sipuleucel-T and abiraterone for castrate-resistant disease in the post-docetaxel setting will see these agents investigated in conjunction with definitive surgery and radiotherapy.
Abstract:
Aim Species distribution models (SDMs) based on current species ranges underestimate the potential distribution when projected in time and/or space. A multi-temporal model calibration approach has been suggested as an alternative, and we evaluate this using 13,000 years of data. Location Europe. Methods We used fossil-based records of presence for Picea abies, Abies alba and Fagus sylvatica and six climatic variables for the period 13,000 to 1000 yr BP. To measure the contribution of each 1000-year time step to the total niche of each species (the niche measured by pooling all the data), we employed a principal components analysis (PCA) calibrated with data over the entire range of possible climates. Then we projected both the total niche and the partial niches from single time frames into the PCA space, and tested whether the partial niches were more similar to the total niche than expected by chance. Using an ensemble forecasting approach, we calibrated SDMs for each time frame and for the pooled database. We projected each model to current climate and evaluated the results against current pollen data. We also projected all models into the future. Results Niche similarity between the partial- and the total-SDMs was almost always statistically significant and increased through time. SDMs calibrated from single time frames gave different results when projected to current climate, providing evidence of a change in the species' realized niches through time. Moreover, they predicted limited climate suitability when compared with the total-SDMs. The same results were obtained when projected to future climates. Main conclusions The realized climatic niche of species differed for current and future climates when SDMs were calibrated considering different past climates. Building the niche as an ensemble through time represents a way forward to a better understanding of a species' range and its ecology in a changing climate.
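The niche-comparison step described above can be sketched in code. The following is a minimal sketch under stated assumptions: random stand-ins for the climate background and occurrence records, two PCA axes, and Schoener's D on gridded occupancy as the similarity metric. The study's actual data, six climatic variables and randomisation test are not reproduced here.

```python
# Sketch of a PCA-based niche comparison (hypothetical data, simplified
# metric). Real analyses use fossil occurrence records and observed
# climate grids rather than the random stand-ins below.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Background climate: all climates available over the study period
# (rows = grid cells x time steps, columns = climatic variables).
background = rng.normal(size=(5000, 6))

# Calibrate the PCA over the entire range of possible climates.
pca = PCA(n_components=2).fit(background)

# "Total" niche: climates at all occurrences pooled over 13,000 years;
# "partial" niche: occurrences from a single 1000-year time frame.
total_occ = rng.normal(loc=0.5, size=(800, 6))
partial_occ = rng.normal(loc=0.7, size=(100, 6))

def occupancy(scores, bins, extent):
    """Normalised occupancy of PCA space on a fixed grid."""
    h, _, _ = np.histogram2d(scores[:, 0], scores[:, 1],
                             bins=bins, range=extent)
    return h / h.sum()

extent = [[-5, 5], [-5, 5]]
z_total = occupancy(pca.transform(total_occ), 25, extent)
z_partial = occupancy(pca.transform(partial_occ), 25, extent)

# Schoener's D: 1 minus half the summed absolute occupancy difference
# (1 = identical niches, 0 = no overlap).
D = 1 - 0.5 * np.abs(z_total - z_partial).sum()
print(round(D, 3))
```

In the full analysis, D (or a similar overlap statistic) would be compared against a null distribution obtained by randomly resampling occurrences, which is what "more similar than expected by chance" refers to.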
Abstract:
The motivation for this research originated from the abrupt rise and fall of minicomputers, which were initially used for both industrial automation and business applications owing to their significantly lower cost than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely the industrial automation industry behind PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advance of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, the RISC processor architecture business is a separate industry from the RISC chip manufacturing industry. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact throughout the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability levels on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation technology (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, shaped by competition among the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions.
The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and the related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
During the past decades, anticancer immunotherapy has evolved from a promising therapeutic option to a robust clinical reality. Many immunotherapeutic regimens are now approved by the US Food and Drug Administration and the European Medicines Agency for use in cancer patients, and many others are being investigated as standalone therapeutic interventions or combined with conventional treatments in clinical studies. Immunotherapies may be subdivided into "passive" and "active" based on their ability to engage the host immune system against cancer. Since the anticancer activity of most passive immunotherapeutics (including tumor-targeting monoclonal antibodies) also relies on the host immune system, this classification does not properly reflect the complexity of the drug-host-tumor interaction. Alternatively, anticancer immunotherapeutics can be classified according to their antigen specificity. While some immunotherapies specifically target one (or a few) defined tumor-associated antigen(s), others operate in a relatively non-specific manner and boost natural or therapy-elicited anticancer immune responses of unknown and often broad specificity. Here, we propose a critical, integrated classification of anticancer immunotherapies and discuss the clinical relevance of these approaches.
Abstract:
In this thesis, we study the behavioural aspects of agents interacting in queueing systems, using simulation models and experimental methodologies. Each period, customers must choose a service provider. The objective is to analyse the impact of the customers' and providers' decisions on queue formation. In a first case, we consider customers with a certain degree of risk aversion. Based on their perception of the average waiting time and of its variability, they form an estimate of the upper bound of the waiting time at each provider. Each period, they choose the provider for which this estimate is lowest. Our results indicate that there is no monotonic relationship between the degree of risk aversion and overall performance. Indeed, a population of customers with an intermediate degree of risk aversion generally incurs a higher average waiting time than a population of risk-neutral or highly risk-averse agents. Next, we incorporate the providers' decisions by allowing them to adjust their service capacity based on their perception of the average arrival rate. The results show that customer behaviour and provider decisions exhibit strong path dependence. We also show that the providers' decisions cause the weighted average waiting time to converge to the market's benchmark waiting time. Finally, a laboratory experiment in which subjects played the role of a service provider allowed us to conclude that the lead times for installing and dismantling capacity significantly affect the subjects' performance and decisions. 
In particular, the provider's decisions are influenced by its order backlog, its currently available service capacity, and the capacity adjustment decisions it has taken but not yet implemented. - Queuing is a fact of life that we witness daily. We all have had the experience of waiting in line for some reason and we also know that it is an annoying situation. As the adage says "time is money"; this is perhaps the best way of stating what queuing problems mean for customers. Human beings are not very tolerant, but they are even less so when having to wait in line for service. Banks, roads, post offices and restaurants are just some examples where people must wait for service. Studies of queuing phenomena have typically addressed the optimisation of performance measures (e.g. average waiting time, queue length and server utilisation rates) and the analysis of equilibrium solutions. The individual behaviour of the agents involved in queueing systems and their decision-making process have received little attention. Although this work has been useful to improve the efficiency of many queueing systems, or to design new processes in social and physical systems, it has only provided us with a limited ability to explain the behaviour observed in many real queues. In this dissertation we differ from this traditional research by analysing how the agents involved in the system make decisions instead of focusing on optimising performance measures or analysing an equilibrium solution. This dissertation builds on and extends the framework proposed by van Ackere and Larsen (2004) and van Ackere et al. (2010). We focus on studying behavioural aspects in queueing systems and incorporate this still underdeveloped framework into the operations management field. In the first chapter of this thesis we provide a general introduction to the area, as well as an overview of the results. 
In Chapters 2 and 3, we use Cellular Automata (CA) to model service systems where captive interacting customers must decide each period which facility to join for service. They base this decision on their expectations of sojourn times. Each period, customers use new information (their most recent experience and that of their best performing neighbour) to form expectations of sojourn time at the different facilities. Customers update their expectations using an adaptive expectations process to combine their memory and their new information. We label "conservative" those customers who give more weight to their memory than to the new information. In contrast, when they give more weight to new information, we call them "reactive". In Chapter 2, we consider customers with different degrees of risk-aversion who take into account uncertainty. They choose which facility to join based on an estimated upper bound of the sojourn time, which they compute using their perceptions of the average sojourn time and the level of uncertainty. We assume the same exogenous service capacity for all facilities, which remains constant throughout. We first analyse the collective behaviour generated by the customers' decisions. We show that the system achieves low weighted average sojourn times when the collective behaviour results in neighbourhoods of customers loyal to a facility and the customers are approximately equally split among all facilities. The lowest weighted average sojourn time is achieved when exactly the same number of customers patronises each facility, implying that they do not wish to switch facility. In this case, the system has achieved the Nash equilibrium. We show that there is a non-monotonic relationship between the degree of risk-aversion and system performance. Customers with an intermediate degree of risk-aversion typically incur higher sojourn times; in particular they rarely achieve the Nash equilibrium. 
Risk-neutral customers have the highest probability of achieving the Nash equilibrium. Chapter 3 considers a service system similar to the previous one but with risk-neutral customers, and relaxes the assumption of exogenous service rates. In this sense, we model a queueing system with endogenous service rates by enabling managers to adjust the service capacity of the facilities. We assume that managers do so based on their perceptions of the arrival rates, and use the same principle of adaptive expectations to model these perceptions. We consider service systems in which the managers' decisions take time to be implemented. Managers are characterised by a profile determined by the speed at which they update their perceptions, the speed at which they take decisions, and how coherent they are in accounting for previous decisions still to be implemented when taking their next decision. We find that the managers' decisions exhibit a strong path-dependence: owing to the initial conditions of the model, the facilities of managers with identical profiles can evolve completely differently. In some cases the system becomes "locked-in" into a monopoly or duopoly situation. The competition between managers causes the weighted average sojourn time of the system to converge to the exogenous benchmark value which they use to estimate their desired capacity. Concerning the managers' profile, we found that the more conservative a manager is regarding new information, the larger the market share his facility achieves. Additionally, the faster he takes decisions, the higher the probability that he achieves a monopoly position. In Chapter 4 we consider a one-server queueing system with non-captive customers. We carry out an experiment aimed at analysing the way human subjects, taking on the role of the manager, take capacity decisions for a service facility in a laboratory setting. We adapt the model proposed by van Ackere et al. (2010). 
This model relaxes the assumption of a captive market and allows current customers to decide whether or not to use the facility. Additionally, the facility also has potential customers who currently do not patronise it, but might consider doing so in the future. We identify three groups of subjects whose decisions cause similar behavioural patterns. These groups are labelled: gradual investors, lumpy investors, and random investors. Using an autocorrelation analysis of the subjects' decisions, we illustrate that these decisions are positively correlated with the decisions taken one period earlier. Subsequently we formulate a heuristic to model the decision rule used by the subjects in the laboratory. We found that this decision rule fits very well for those subjects who gradually adjust capacity, but it does not capture the behaviour of the subjects in the other two groups. In Chapter 5 we summarise the results and provide suggestions for further work. Our main contribution is the use of simulation and experimental methodologies to explain the collective behaviour generated by customers' and managers' decisions in queueing systems, as well as the analysis of the individual behaviour of these agents. In this way, we differ from the typical literature on queueing systems, which focuses on optimising performance measures and the analysis of equilibrium solutions. Our work can be seen as a first step towards understanding the interaction between customer behaviour and the capacity adjustment process in queueing systems. This framework is still in its early stages and accordingly there is large potential for further work spanning several research topics. Interesting extensions to this work include incorporating other characteristics of queueing systems which affect the customers' experience (e.g. balking, reneging and jockeying); providing customers and managers with additional information to take their decisions (e.g. 
service price, quality, customers' profile); analysing different decision rules and studying other characteristics which determine the profile of customers and managers.
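The customer decision rule summarised above (adaptive expectations plus a risk-adjusted upper bound on sojourn time) can be sketched as follows. This is a minimal illustration, not the thesis's exact model: the update weight, the mean-plus-k-standard-deviations form of the upper bound and the facility parameters are all assumed for the example.

```python
# Minimal sketch of the adaptive-expectations choice rule: each period a
# customer updates the perceived mean and variability of the sojourn
# time at each facility and joins the one with the lowest estimated
# upper bound. All numbers are illustrative assumptions.
import random

random.seed(1)

N_FACILITIES = 3
LAMBDA = 0.3   # weight on new information ("reactive" if large,
               # "conservative" if small)
K = 1.5        # degree of risk-aversion (K = 0: risk-neutral)

# Perceived mean sojourn time and perceived variance per facility.
mean_est = [5.0] * N_FACILITIES
var_est = [1.0] * N_FACILITIES

def choose():
    """Join the facility with the lowest estimated upper bound."""
    upper = [m + K * v ** 0.5 for m, v in zip(mean_est, var_est)]
    return upper.index(min(upper))

def update(facility, observed):
    """Adaptive expectations: blend memory with new information."""
    err = observed - mean_est[facility]
    mean_est[facility] += LAMBDA * err
    var_est[facility] = (1 - LAMBDA) * var_est[facility] + LAMBDA * err ** 2

for period in range(200):
    f = choose()
    # Observed sojourn time: facility-specific true mean plus noise.
    observed = [4.0, 5.0, 6.0][f] + random.gauss(0, 1)
    update(f, observed)

print(choose())  # facility with the lowest perceived upper bound after learning
```

Setting K to zero recovers the risk-neutral customers of Chapter 3; making LAMBDA large or small distinguishes the "reactive" and "conservative" profiles described above.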
Abstract:
The cytotoxic T-cell and natural killer (NK)-cell lymphomas and related disorders are important but relatively rare lymphoid neoplasms that frequently pose a challenge for practicing pathologists. This selective review, based on a meeting of the International Lymphoma Study Group, briefly reviews T-cell and NK-cell development and addresses questions related to the importance of precise cell lineage (αβ-type T cell, γδ T cell, or NK cell), the implications of Epstein-Barr virus infection, the significance of anatomic location including nodal disease, and the question of further categorization of enteropathy-associated T-cell lymphomas. Finally, developments subsequent to the 2008 World Health Organization Classification, including the recognition of indolent NK-cell and T-cell disorders of the gastrointestinal tract, are presented.
Abstract:
Chronic low back pain attributed to lumbar disc degeneration poses a serious challenge to physicians. Surgery may be indicated in selected cases following failure of appropriate conservative treatment. For decades, the only surgical option has been spinal fusion, but its results have been inconsistent. Some prospective trials show superiority over usual conservative measures while others fail to demonstrate its advantages. In an effort to improve results of fusion and to decrease the incidence of adjacent segment degeneration, total disc replacement techniques have been introduced and studied extensively. Short-term results have shown superiority over some fusion techniques. Mid-term results however tend to show that this approach yields results equivalent to those of spinal fusion. Nucleus replacement has gained some popularity initially, but evidence on its efficacy is scarce. Dynamic stabilisation, a technique involving less rigid implants than in spinal fusion and performed without the need for bone grafting, represents another surgical option. Evidence again is lacking on its superiority over other surgical strategies and conservative measures. Insertion of interspinous devices posteriorly, aiming at redistributing loads and relieving pain, has been used as an adjunct to disc removal surgery for disc herniation. To date however, there is no clear evidence on their efficacy. Minimally invasive intradiscal thermocoagulation techniques have also been tried, but evidence of their effectiveness is questioned. Surgery using novel biological solutions may be the future of discogenic pain treatment. Collaboration between clinicians and basic scientists in this multidisciplinary field will undoubtedly shape the future of treating symptomatic disc degeneration.
Abstract:
PURPOSE OF REVIEW: Oculopalatal tremor (OPT) is an acquired disorder resulting from the interruption of a specific brainstem circuitry, the dentato-rubro-olivary pathway or Guillain-Mollaret triangle. The recent literature on OPT and olivary hypertrophy was reviewed with specific interest regarding causes, diagnostic procedures, physiopathology and therapies. RECENT FINDINGS: OPT is associated with inferior olivary hypertrophy, and recent findings have provided a better understanding of its intimate mechanisms. A dual-mechanism model, combining an oscillator (inferior olive) and a modulator/amplifier (cerebellum), best explains the development of OPT. Electrotonic coupling and specific Ca channels contribute to oscillations of inferior olivary nucleus neurons in OPT. Improvement of visual symptoms can be achieved with oral gabapentin or memantine. SUMMARY: Both the neuronal circuitry and the physiopathology of OPT are now better understood. This opens up an era of specific therapy for this rare cause of disabling oscillopsia.
Abstract:
BACKGROUND: As the long-term survival of pancreatic head malignancies remains dismal, efforts have been made towards better patient selection and tailored treatment. Tumour size could also be used for patient stratification. METHODS: One hundred and fourteen patients underwent a pancreaticoduodenectomy for pancreatic adenocarcinoma, peri-ampullary or biliary cancer, stratified according to tumour size: ≤20 mm, 21-34 mm, 35-45 mm and >45 mm. RESULTS: Patients with tumour sizes of ≤20 mm had an N1 rate of 41% and an R1/2 rate of 7%. The median survival was 3.4 years. N1 and R1/2 rates increased to 84% and 31% for tumour sizes of 21-34 mm (P = 0.0002 for N, P = 0.02 for R). The median survival decreased to 1.6 years (P = 0.0003). A further increase in tumour size to 35-45 mm revealed a further increase of the N1 and R1/2 rates to 93% (P < 0.0001) and 33%, respectively. The median survival was 1.2 years (P = 0.004). Tumour sizes >45 mm were related to a further decreased median survival of 1.1 years (P = 0.2), whereas N1 and R1/2 rates were 87% and 20%, respectively. DISCUSSION: Tumour size is an important feature of pancreatic head malignancies. A tumour diameter of 20 mm seems to be the cut-off above which an increased rate of incomplete resections and metastatic lymph nodes is to be expected and the median survival is reduced.
Abstract:
Summary: Due to their conical shape and the reduction of area with increasing elevation, mountain ecosystems were identified early on as potentially very sensitive to global warming. Moreover, mountain systems may experience unprecedented rates of warming during the next century, two or three times higher than those recorded during the 20th century. In this context, species distribution models (SDMs) have become important tools for rapid assessment of the impact of accelerated land use and climate change on the distribution of plant species. In my study, I developed and tested new predictor variables for SDMs, specific to current and future geographic projections of plant species in a mountain system, using the Western Swiss Alps as the model region. Since meso- and micro-topography are relevant to explaining geographic patterns of plant species in mountain environments, I assessed the effect of scale on predictor variables and on the geographic projections of SDMs. I also developed a methodological framework of space-for-time evaluation to test the robustness of SDMs when projected into a future changing climate. Finally, I used a cellular automaton to run dynamic simulations of plant migration under climate change in a mountain landscape, including realistic seed dispersal distances. Results of future projections for the 21st century were also discussed in the perspective of vegetation changes monitored during the 20th century. Overall, I showed in this study that, based on the most severe A1 climate change scenario and realistic simulations of plant dispersal, species extinctions in the Western Swiss Alps could affect nearly one third (28.5%) of the 284 species modeled by 2100. With the less severe B1 scenario, only 4.6% of species are predicted to become extinct. However, even with B1, 54% (153 species) may still lose more than 80% of their initial surface. 
Results of monitoring of past vegetation changes suggested that plant species can react quickly to warmer conditions as long as competition is low. However, in subalpine grasslands, competition from already-present species is probably important and limits the establishment of newly arrived species. Results from future simulations also showed that heavy extinctions of alpine plants may start as early as 2040, and at the latest by 2080. My study also highlighted the importance of fine-scale, regional assessments of climate change impact on mountain vegetation, using more direct predictor variables. Indeed, predictions at the continental scale may fail to predict local refugia or local extinctions, as well as loss of connectivity between local populations. On the other hand, migrations of low-elevation species to higher altitude may be difficult to predict at the local scale. Résumé: The conical shape of mountains, together with the reduction of surface area at high elevations, is known to make mountain ecosystems particularly exposed to global warming. Moreover, mountain systems will probably be subjected during the 21st century to warming two to three times faster than that measured during the 20th century. In this context, predictive models of the geographic distribution of vegetation have established themselves as powerful tools for rapid assessment of the impact of climate change and of human transformation of the landscape on vegetation. In my study, I developed new predictor variables for distribution models, specific to the present and future geographic projection of plants in a mountain system, using the Western Swiss Alps (Préalpes vaudoises) as the sampling area. 
Since meso- and micro-topography are particularly suited to explaining the geographic distribution patterns of plants in a mountain environment, I tested scale effects on the predictor variables and on the projections of the distribution models. I also developed a methodological framework to test the potential robustness of the models when projected into the future. Finally, I used a cellular automaton to dynamically simulate the future migration of plants across the landscape under four climate change scenarios for the 21st century. I integrated into these simulations more realistic mechanisms and distances of seed dispersal. With the most realistic simulations, I was able to show that nearly one third of the 284 species considered (28.5%) could be threatened with extinction by 2100 under the most severe climate change scenario, A1. Under the least severe scenario, B1, only 4.6% of species are threatened with extinction, but 54% (153 species) risk losing more than 80% of their initial habitat. Monitoring results of past vegetation changes show that plants can react quickly to climate warming if competition is low. In subalpine grasslands, the species already present most likely limit the arrival of new species through competition. Simulation results for the future predict the onset of massive extinctions in the Préalpes from 2040 onwards, and at the latest by 2080. My work also demonstrates the importance of fine-scale regional studies for assessing the impact of climate change on vegetation, using more direct variables. Indeed, continental-scale studies do not account for micro-refugia, local extinctions, or loss of connectivity between local populations. 
Nevertheless, the migration of low-elevation plants remains difficult to predict at the local scale without more global modelling.
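A toy version of the kind of cellular automaton used for the migration simulations can make the mechanism concrete. The grid size, the upward-shifting suitability band and the dispersal kernel below are illustrative assumptions, not the study's calibrated model.

```python
# Toy cellular automaton for upslope plant migration under warming:
# each cell is suitable only within an elevation band that shifts upward
# over time, and occupied cells seed neighbours within a limited
# dispersal distance. All parameters are illustrative assumptions.
import random

random.seed(42)

SIZE = 50             # grid side; elevation increases with row index
DISPERSAL = 2         # maximum dispersal distance in cells per step
BAND = (0, 15)        # initially suitable elevation band (rows)
SHIFT_PER_STEP = 1    # upward shift of the band per time step

occupied = {(r, c) for r in range(BAND[0], BAND[1]) for c in range(SIZE)
            if random.random() < 0.3}

def step(t):
    """Advance one time step: survival in the shifted band, then dispersal."""
    lo = BAND[0] + t * SHIFT_PER_STEP
    hi = BAND[1] + t * SHIFT_PER_STEP
    new = set()
    for (r, c) in occupied:
        # A population survives only while its cell remains suitable.
        if lo <= r < hi:
            new.add((r, c))
        # Seeds reach suitable cells within the dispersal distance.
        for dr in range(-DISPERSAL, DISPERSAL + 1):
            for dc in range(-DISPERSAL, DISPERSAL + 1):
                nr, nc = r + dr, c + dc
                if 0 <= nr < SIZE and 0 <= nc < SIZE and lo <= nr < hi:
                    if random.random() < 0.2:
                        new.add((nr, nc))
    return new

for t in range(1, 25):
    occupied = step(t)

print(len(occupied))  # population size after the suitability band has shifted
```

The key point the toy reproduces is the migration lag: if SHIFT_PER_STEP exceeds what DISPERSAL can track, the population is stranded below the rising band and goes extinct, which is why realistic dispersal distances change extinction estimates relative to unlimited-dispersal projections.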
Abstract:
Anticoagulants are a mainstay of cardiovascular therapy, and parenteral anticoagulants have widespread use in cardiology, especially in acute situations. Parenteral anticoagulants include unfractionated heparin, low-molecular-weight heparins, the synthetic pentasaccharides fondaparinux, idraparinux and idrabiotaparinux, and parenteral direct thrombin inhibitors. The several shortcomings of unfractionated heparin and of low-molecular-weight heparins have prompted the development of the other newer agents. Here we review the mechanisms of action, pharmacological properties and side effects of parenteral anticoagulants used in the management of coronary heart disease treated with or without percutaneous coronary interventions, cardioversion for atrial fibrillation, and prosthetic heart valves and valve repair. Using an evidence-based approach, we describe the results of completed clinical trials, highlight ongoing research with currently available agents, and recommend therapeutic options for specific heart diseases.