161 results for key account manager


Relevance:

20.00%

Publisher:

Abstract:

Aim: Climatic niche modelling of species and community distributions implicitly assumes strong and constant climatic determinism across geographic space. However, this assumption has never been tested. We tested it by assessing how stacked species distribution models (S-SDMs) perform for predicting plant species assemblages along elevation. Location: Western Swiss Alps. Methods: Using robust presence-absence data, we first assessed the ability of topo-climatic S-SDMs to predict plant assemblages in a study area encompassing a 2800 m wide elevation gradient. We then assessed the relationships among several evaluation metrics and trait-based tests of community assembly rules. Results: The standard errors of individual SDMs decreased significantly towards higher elevations. Overall, the S-SDMs overpredicted richness far more than they underpredicted it and could not reproduce the humpback richness curve along elevation. Overprediction was greater at low and mid-range elevations in absolute values, but greater at high elevations when standardised by the actual richness. Looking at species composition, the evaluation metrics accounting for both the presence and absence of species (overall prediction success and kappa) or focusing on correctly predicted absences (specificity) increased with increasing elevation, while the metrics focusing on correctly predicted presences (Jaccard index and sensitivity) decreased. The best overall evaluation - as driven by specificity - occurred at high elevation, where species assemblages were shown to be under significant environmental filtering of small plants. In contrast, the decreased overall accuracy in the lowlands was associated with functional patterns representing any type of assembly rule (environmental filtering, limiting similarity or null assembly). Main Conclusions: Our study reveals interesting patterns of change in S-SDM errors with changes in assembly rules along elevation. Yet significant levels of assemblage prediction error occurred throughout the gradient, calling for further improvement of SDMs, e.g. by adding key environmental filters that act at fine scales and by developing approaches to account for variations in the influence of predictors along environmental gradients.
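As an illustration of the assemblage-level evaluation metrics named above (sensitivity, specificity, overall prediction success, kappa and the Jaccard index), the following Python sketch computes them for a single plot from binary presence-absence vectors. The function name, the per-plot framing and the richness-error term are illustrative assumptions, not code from the study.

```python
import numpy as np

def assemblage_metrics(observed, predicted):
    """Plot-level agreement between an observed species assemblage and an
    S-SDM prediction, both given as binary presence-absence vectors."""
    obs = np.asarray(observed, dtype=bool)
    pred = np.asarray(predicted, dtype=bool)
    a = np.sum(obs & pred)       # correctly predicted presences
    b = np.sum(~obs & pred)      # overpredictions (false presences)
    c = np.sum(obs & ~pred)      # underpredictions (false absences)
    d = np.sum(~obs & ~pred)     # correctly predicted absences
    n = a + b + c + d
    po = (a + d) / n                                      # overall prediction success
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2   # agreement expected by chance
    return {
        "sensitivity": a / (a + c),            # share of presences correctly predicted
        "specificity": d / (b + d),            # share of absences correctly predicted
        "jaccard": a / (a + b + c),            # compositional overlap, ignoring joint absences
        "overall": po,
        "kappa": (po - pe) / (1 - pe),         # chance-corrected agreement
        "richness_error": int((a + b) - (a + c)),  # predicted minus observed richness
    }
```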

Relevance:

20.00%

Publisher:

Abstract:

The goal of this work is to develop a method to objectively compare the performance of a digital and a screen-film mammography system in terms of image quality. The method takes into account the dynamic range of the image detector, the detection of high and low contrast structures, the visualisation of the images and the observer response. A test object, designed to represent a compressed breast, was constructed from various tissue equivalent materials ranging from purely adipose to purely glandular composition. Different areas within the test object permitted the evaluation of low and high contrast detection, spatial resolution and image noise. All the images (digital and conventional) were captured using a CCD camera to include the visualisation process in the image quality assessment. A mathematical model observer (non-prewhitening matched filter), which calculates the detectability of high and low contrast structures from spatial resolution, noise and contrast, was used to compare the two technologies. Our results show that, for a given patient dose, the detection of high and low contrast structures is significantly better for the digital system than for the conventional screen-film system studied. The method of using a test object with a large tissue composition range combined with a camera to compare conventional and digital imaging modalities can be applied to other radiological imaging techniques. In particular, it could be used to optimise the process of radiographic reading of soft-copy images.
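The comparison hinges on the non-prewhitening matched-filter (NPW) model observer. The sketch below shows one common frequency-domain formulation of an NPW detectability index, computed from an expected signal-difference image and background-only noise realisations; the function, its normalisation conventions and the way the noise power spectrum is estimated are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def npw_detectability(delta_signal, noise_realizations):
    """Non-prewhitening matched-filter detectability index d'.

    delta_signal:        2-D array, expected difference image between
                         'structure present' and 'structure absent'.
    noise_realizations:  3-D stack of background-only images used to
                         estimate the noise power spectrum (NPS).
    """
    w = np.fft.fft2(delta_signal)                     # matched-filter template in frequency space
    spectra = np.abs(np.fft.fft2(noise_realizations, axes=(-2, -1))) ** 2
    nps = spectra.mean(axis=0) / delta_signal.size    # schematic NPS normalisation
    signal_energy = np.sum(np.abs(w) ** 2)
    noise_term = np.sum(np.abs(w) ** 2 * nps)
    # d'_NPW = sum|W|^2 / sqrt(sum |W|^2 * NPS)
    return signal_energy / np.sqrt(noise_term)
```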

Relevance:

20.00%

Publisher:

Abstract:

Cancer immunotherapy has great promise, but is limited by diverse mechanisms used by tumors to prevent sustained antitumor immune responses. Tumors disrupt antigen presentation, T/NK-cell activation, and T/NK-cell homing through soluble and cell-surface mediators, the vasculature, and immunosuppressive cells such as myeloid-derived suppressor cells and regulatory T cells. However, many molecular mechanisms preventing the efficacy of antitumor immunity have been identified and can be disrupted by combination immunotherapy. Here, we examine immunosuppressive mechanisms exploited by tumors and provide insights into the therapies under development to overcome them, focusing on lymphocyte traffic.

Relevance:

20.00%

Publisher:

Abstract:

B cells can differentiate either in germinal centers or in extrafollicular compartments of secondary lymphoid organs. Here we show the migration properties of B cells after differentiation in murine peripheral lymph nodes infected with mouse mammary tumor virus. Naive B cells become activated, infected, and carry integrated retroviral DNA sequences. After production of a retroviral superantigen, the infected B cells receive cognate T cell help and differentiate along the two main differentiation pathways, analogous to classical Ag responses. The extrafollicular differentiation peaks on day 6 of mouse mammary tumor virus infection, and the follicular one becomes detectable after day 10. B cells participating in this immune response carry a retroviral DNA marker that can be detected by using semiquantitative PCR. We determined the migration patterns of B cells having taken part in the T cell-B cell interaction from the draining lymph node to different tissues. Waves of immigration and retention of infected cells in secondary lymphoid organs, mammary gland, salivary gland, skin, lung, and liver were observed, correlating with the two peaks of B cell differentiation in the draining lymph node. Other organs revealed immigration of infected cells at later time points. The migration properties were correlated with a strong up-regulation of alpha4beta1 integrin expression. These results show the migration properties of B cells during an immune response and demonstrate that a large proportion of extrafollicularly differentiating plasmablasts can escape local cell death and carry the retroviral infection to peripheral organs.

Relevance:

20.00%

Publisher:

Abstract:

This paper examines key aspects of Allan Gibbard's psychological account of moral activity. Inspired by evolutionary theory, Gibbard paints a naturalistic picture of morality mainly based on two specific types of emotion: guilt and anger. His sentimentalist and expressivist analysis is also based on a particular conception of rationality. I begin by introducing Gibbard's theory before testing some key assumptions underlying his system against recent empirical data and theories. The results cast doubt on some crucial aspects of Gibbard's philosophical theory, namely his reduction of morality to anger and guilt, and his theory of 'normative governance'. Gibbard's particular version of expressivism may be undermined by these doubts.

Relevance:

20.00%

Publisher:

Abstract:

The Ca2+-regulated calcineurin/nuclear factor of activated T cells (NFAT) cascade controls alternative pathways of T-cell activation and peripheral tolerance. Here, we describe reduction of NFATc2 mRNA expression in the lungs of patients with bronchial adenocarcinoma. In a murine model of bronchoalveolar adenocarcinoma, mice lacking NFATc2 developed more and larger solid tumors than wild-type littermates. The extent of central tumor necrosis was decreased in the tumors of NFATc2(-/-) mice, and this finding was associated with reduced tumor necrosis factor-alpha and interleukin-2 (IL-2) production by CD8+ T cells. Adoptive transfer of CD8+ T cells from NFATc2(-/-) mice induced transforming growth factor-beta1 in the airways of recipient mice, thus supporting the survival of CD4+CD25+Foxp3+ glucocorticoid-induced tumor necrosis factor receptor (GITR)+ regulatory T (Treg) cells. Finally, engagement of GITR in NFATc2(-/-) mice induced IFN-gamma levels in the airways, reversed the suppression by Treg cells, and costimulated effector CD4+CD25+ (IL-2Ralpha) and memory CD4+CD127+ (IL-7Ralpha) T cells, resulting in abrogation of carcinoma progression. Agonistic signaling through GITR thus emerges as a novel possible strategy for the treatment of human bronchial adenocarcinoma in the absence of NFATc2, by enhancing IL-2Ralpha+ effector and IL-7Ralpha+ memory T cells.

Relevance:

20.00%

Publisher:

Abstract:

Since the 1990s, sustainable neighbourhoods have increasingly been realised. However, there is often a considerable discrepancy between the goals pursued by the stakeholders involved, their implementation (realisation phase) and their long-term preservation (use phase). This raises the question of how project quality, in the sense of sustainable neighbourhood development, can be improved. Such projects are, however, enormously complex owing to the strong interdisciplinarity and interdependence of their goals and to their multi-layered stakeholder structures; they therefore place particularly high demands on project steering. The specific aim of this thesis is to examine the importance of process steering in the sense of urban governance for the realisation and long-term preservation of sustainable neighbourhoods, and thereby to contribute to the promotion of sustainable urban development. The work rests on a comprehensive theoretical foundation on governance, from which the elements relevant to the context of sustainable neighbourhoods are derived. The hypotheses test the importance of the key characteristics of urban governance (cooperation, participation, negotiation) for project quality during the realisation and use phases. A first empirical study was carried out on twenty exemplary European sustainable neighbourhoods. Their strengths and weaknesses from a sustainability perspective are analysed, the underlying causes identified and options for action outlined. The findings show the need to improve project steering during both the realisation and the use phase. On this basis, a comprehensive approach for the empirical study of urban governance in the context of sustainable neighbourhoods is developed, building on actor-centred institutionalism and the characteristics of urban governance. Using this approach and expert interviews, the realisation process of the sustainable neighbourhood Kronsberg (Hannover) is analysed, considering the stakeholders involved and their action orientations, the key instruments used, and the divergences that arose between stakeholders and their effects on project and process quality. Relevant topics are examined in greater depth in the case study of Neu-Oerlikon (Zürich). This empirical work shows that process steering in the sense of urban governance, compared with classical hierarchical steering, is a necessary but not sufficient condition for improving the project quality of sustainable neighbourhoods. Concrete examples demonstrate that the added value of such steering can only be achieved under certain conditions: in some situations, cooperation as a mode of steering and negotiation as a mode of interaction are limited in their effectiveness in securing project quality, and hierarchical interventions become necessary. No particular steering model is suitable per se; what matters is the individual case: the stakeholder constellation, the individual and institutional action orientations and behaviours of the stakeholders, the framework conditions and the design of the urban governance process. If the rules of the game of this process are not genuinely accepted and lived by the stakeholders, individual and institutional interests dominate at the expense of project quality. The studies further show that the participation of future neighbourhood users is often insufficient in practice, which reduces project quality. In any case, it is decisive that at least one stakeholder, usually the public authorities, is present who ensures the definition of ambitious sustainability standards, their implementation and their preservation, and who creates the necessary framework conditions. The thesis also demonstrates that the preservation of project quality during the use phase (the time factor) has so far received insufficient attention and is rarely incorporated into project planning, even though precisely this aspect determines whether the neighbourhood can live up to its sustainability ambitions in the long run. It is in fact a continuous process that does not end with the inauguration of the neighbourhood. Against this background, relevant fields of action are described, and the need for the long-term continuation of steering in the sense of urban governance, and for the emergence of an urban governance culture, is demonstrated. From the empirical surveys, success and risk factors for urban governance processes during the realisation and use phases are derived. In addition, fields of action that have so far been neglected (long-term environmental management, ecological forms of financing, urban agriculture, environmental communication, etc.) are identified. Taking these findings into account is essential for improving the project quality of sustainable neighbourhoods.
---------------------------------------------- Urban Governance and Sustainable Neighbourhoods: Between Intentions and Implementation --- Summary --- Since the 1990s, the topic of sustainable neighbourhoods has gained importance, even though their development has proven difficult. The gap between the objectives, their implementation and the project as it is actually lived by its inhabitants is often considerable and needs to be reduced. A sustainable neighbourhood is by nature a complex project, with ambitious objectives at the crossroads of multiple disciplines, mobilising numerous stakeholders with diverging interests. Moreover, each project, owing to its specific characteristics, requires an adapted form of steering. The main objective of the research is to analyse the nature of the steering of the design, realisation and operation process of sustainable neighbourhoods. Its results aim to help optimise and promote sustainable urban development. The theoretical foundation of the research is based on the concept of urban governance, adapted to the particular context of the governance of sustainable neighbourhoods. Urban governance, as we understand it, is a mode of steering based on cooperation between public and private stakeholders. The central hypotheses of the work test the scope and limits of the key characteristics of urban governance (cooperation, participation, negotiation), as well as the importance of the notion of continuity for project quality. First, we analysed twenty exemplary European sustainable neighbourhoods and identified their strengths and weaknesses in terms of sustainability, as well as their various modes of steering. The lessons drawn from these examples reveal the need to improve project steering. Second, we developed a detailed analytical framework based on actor-centred institutionalism and the key characteristics of urban governance. Using this framework, we analysed the design and realisation process of the sustainable neighbourhood of Kronsberg (Hannover) through the following elements: the stakeholders (with their own interests and objectives), the spatial planning instruments, the modes of steering, the areas of divergence and convergence between stakeholders, and their impacts on the process and the project. Third, the central hypotheses were tested on the Neu-Oerlikon neighbourhood (Zürich) in order to deepen and broaden the lessons drawn from Kronsberg. The results of the analyses highlight the fact that project steering according to the urban governance model is certainly a necessary but not sufficient condition for improving project quality. Moreover, the added value of urban governance holds only under certain conditions; indeed, in some situations cooperation and negotiation can even reduce project quality. The main lesson of the research is that there is no ideal mode of steering, but that the quality of a project depends on a multitude of factors, such as the stakeholder constellations, their personal and institutional interests, the framework conditions and the 'rules of the game' of urban governance. If these rules of the game in particular are not genuinely taken on board by all stakeholders, personal or institutional interests and behaviours predominate at the expense of project quality. Likewise, if the participation of future users in the development of the sustainable neighbourhood project is not ensured, both the quality of the project and its continuity suffer. We also found that the presence of a stakeholder (as a rule the public authorities) who ensures the definition of ambitious sustainable development objectives and their application is an essential contribution to project quality. Furthermore, the research highlights the deficiencies in the long-term monitoring and maintenance of the sustainability qualities during the use phase of the sustainable neighbourhood projects analysed. In the use phase, the degree of cooperation generally decreases and sectoral modes of operation and steering take hold at the expense of project quality. This confirms the need to continue the steering process according to the urban governance model beyond the realisation phase of projects. The research clarifies the challenges of the fields of action of the use phase (an area still little studied) and demonstrates the relevance of the recommended mode of steering. Finally, the analyses identify success and risk factors likely to influence urban governance systems, as well as the challenges of still neglected fields of sustainability (urban agriculture, long-term environmental management, user behaviour, fair funding, etc.). Taking these lessons into account is essential for improving the management of future sustainable neighbourhood projects.
---------------------------------------------- Urban Governance and Sustainable Neighbourhoods: A Contribution to a Lasting Sustainable Development --- Abstract --- Since the 1990s, sustainable neighbourhoods have become an increasingly important topic. However, their development has proven to be difficult. There is an often considerable gap, which must be reduced, between the initial goals, the way they are implemented and how the project is finally inhabited. A sustainable neighbourhood is inherently a complex project, with ambitious goals that lie at the intersection of multiple disciplines, involving numerous stakeholders with diverging interests. Moreover, each project, due to its specific characteristics, requires an adapted steering. The main goal of this research is to analyse the nature of the steering process during the planning, realisation and use of sustainable neighbourhoods. The results aim to contribute to the promotion of sustainable urban development. The theoretical foundation of this research is based on the concept of urban governance, adapted to the particular context of sustainable neighbourhoods. Urban governance is understood in this work, as a mode of project steering based on the cooperation between public and private stakeholders. The central hypotheses of this work test the importance and the limits of the key characteristics of urban governance (cooperation, participation, negotiation) as well as the importance of continuity for the project quality. To begin with, we surveyed and analysed twenty exemplary European sustainable neighbourhoods and identified their strengths and weaknesses in terms of sustainability, as well as their diverse steering modes. The lessons learned from these examples reveal the need to improve the projects' steering. Secondly we elaborated a detailed framework for analysis founded on stakeholder-centred institutionalism and the key characteristics of urban governance. By systematically applying this framework, we analysed the planning and implementation process of the sustainable neighbourhood "Kronsberg" (Hannover). Our focus was on the following dimensions: the stakeholders (with their particular interests and goals), the instruments of spatial planning, the steering modes, the points of divergence and convergence amongst the stakeholders, as well as their impacts on the process and on the project. The final step was to test the core hypotheses on the neighbourhood "Neu-Oerlikon" (Zürich) in order to broaden the lessons learned from "Kronsberg". The results of the analysis highlight the fact that an urban governance type project steering is certainly a necessary but insufficient condition to improve the project quality. Moreover, the added value of urban governance is only valid under certain conditions. In fact, cooperation and negotiation can even in certain situations reduce the project's quality! The main lesson of this research is that there is not an ideal steering mode, but rather that the quality of the project depends on numerous factors, such as the stakeholder constellation, their individual and institutional interests, the general conditions and the "rules of the game" of urban governance. If these "rules of the game" are not really appropriated by all stakeholders, individual and institutional interests and behaviours predominate at the expense of the project's quality. Likewise, if the future users' participation in the project development is insufficient, both the project's quality and its continuity suffer. 
We have also observed that the presence of a stakeholder (in general the public authorities) who ensures the definition of ambitious goals in terms of sustainable development and their implementation is crucial for the project's quality. Furthermore, this research highlights the deficiencies in the follow-up and long-term preservation of the sustainability qualities in the neighbourhood projects which we have analysed. In the use phase, the degree of cooperation generally diminishes. Attitudes and project management become more sectorial at the expense of the project's quality. This confirms the need to continue the steering process according to the principles of urban governance beyond the project's implementation phase. This research specifies the challenges that affect the use phase (a still neglected area) and shows the relevance of the recommended steering mode. Finally, the analyses also identify the success and risk factors that may influence urban-governance systems, as well as the challenges of still neglected fields of sustainability (urban agriculture, long-term environmental management, user behaviour, fair funding, etc.). Taking into account these outcomes is essential to improve the management of future sustainable-neighbourhood projects.

Relevance:

20.00%

Publisher:

Abstract:

The motivation for this research originated from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200'000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike the CISC processor business, the RISC processor-architecture industry is separate from the RISC chip-manufacturing industry. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating-system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each focusing on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also affect industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance:

20.00%

Publisher:

Abstract:

In this thesis, we study the behavioural aspects of agents interacting in queueing systems using simulation models and experimental methodologies. Each period, customers must choose a service provider. The objective is to analyse the impact of customers' and providers' decisions on queue formation. In a first setting, we consider customers with a certain degree of risk aversion. Based on their perception of the average waiting time and of its variability, they form an estimate of the upper bound of the waiting time at each provider; each period, they choose the provider for which this estimate is lowest. Our results indicate that there is no monotonic relationship between the degree of risk aversion and overall performance: a population of customers with an intermediate degree of risk aversion generally incurs a higher average waiting time than a population of risk-neutral or highly risk-averse agents. We then incorporate the providers' decisions by allowing them to adjust their service capacity based on their perception of the average arrival rate. The results show that customer behaviour and provider decisions exhibit strong path dependence. Furthermore, we show that the providers' decisions cause the weighted average waiting time to converge towards the market's benchmark waiting time. Finally, a laboratory experiment in which subjects play the role of a service provider allowed us to conclude that the lead times for installing and dismantling capacity significantly affect subjects' performance and decisions. In particular, a provider's decisions are influenced by its order backlog, its currently available service capacity and the capacity adjustment decisions it has already taken but not yet implemented. - Queuing is a fact of life that we witness daily. We all have had the experience of waiting in line for some reason and we also know that it is an annoying situation. As the adage says "time is money"; this is perhaps the best way of stating what queuing problems mean for customers. Human beings are not very tolerant, but they are even less so when having to wait in line for service. Banks, roads, post offices and restaurants are just some examples where people must wait for service. Studies of queuing phenomena have typically addressed the optimisation of performance measures (e.g. average waiting time, queue length and server utilisation rates) and the analysis of equilibrium solutions. The individual behaviour of the agents involved in queueing systems and their decision making process have received little attention. Although this work has been useful to improve the efficiency of many queueing systems, or to design new processes in social and physical systems, it has only provided us with a limited ability to explain the behaviour observed in many real queues. In this dissertation we differ from this traditional research by analysing how the agents involved in the system make decisions instead of focusing on optimising performance measures or analysing an equilibrium solution. This dissertation builds on and extends the framework proposed by van Ackere and Larsen (2004) and van Ackere et al. (2010).
We focus on studying behavioural aspects in queueing systems and incorporate this still underdeveloped framework into the operations management field. In the first chapter of this thesis we provide a general introduction to the area, as well as an overview of the results. In Chapters 2 and 3, we use Cellular Automata (CA) to model service systems where captive interacting customers must decide each period which facility to join for service. They base this decision on their expectations of sojourn times. Each period, customers use new information (their most recent experience and that of their best performing neighbour) to form expectations of sojourn time at the different facilities. Customers update their expectations using an adaptive expectations process to combine their memory and their new information. We label "conservative" those customers who give more weight to their memory than to the new information. In contrast, when they give more weight to new information, we call them "reactive". In Chapter 2, we consider customers with different degrees of risk-aversion who take into account uncertainty. They choose which facility to join based on an estimated upper bound of the sojourn time which they compute using their perceptions of the average sojourn time and the level of uncertainty. We assume the same exogenous service capacity for all facilities, which remains constant throughout. We first analyse the collective behaviour generated by the customers' decisions. We show that the system achieves low weighted average sojourn times when the collective behaviour results in neighbourhoods of customers loyal to a facility and the customers are approximately equally split among all facilities. The lowest weighted average sojourn time is achieved when exactly the same number of customers patronises each facility, implying that they do not wish to switch facility. In this case, the system has achieved the Nash equilibrium. We show that there is a non-monotonic relationship between the degree of risk-aversion and system performance. Customers with an intermediate degree of risk-aversion typically achieve higher sojourn times; in particular they rarely achieve the Nash equilibrium. Risk-neutral customers have the highest probability of achieving the Nash equilibrium. Chapter 3 considers a service system similar to the previous one but with risk-neutral customers, and relaxes the assumption of exogenous service rates. In this sense, we model a queueing system with endogenous service rates by enabling managers to adjust the service capacity of the facilities. We assume that managers do so based on their perceptions of the arrival rates and use the same principle of adaptive expectations to model these perceptions. We consider service systems in which the managers' decisions take time to be implemented. Managers are characterised by a profile which is determined by the speed at which they update their perceptions, the speed at which they take decisions, and how coherent they are when accounting for their previous decisions still to be implemented when taking their next decision. We find that the managers' decisions exhibit a strong path-dependence: owing to the initial conditions of the model, the facilities of managers with identical profiles can evolve completely differently. In some cases the system becomes "locked in" to a monopoly or duopoly situation.
The competition between managers causes the weighted average sojourn time of the system to converge to the exogenous benchmark value which they use to estimate their desired capacity. Concerning the managers' profile, we found that the more conservative a manager is regarding new information, the larger the market share his facility achieves. Additionally, the faster he takes decisions, the higher the probability that he achieves a monopoly position. In Chapter 4 we consider a one-server queueing system with non-captive customers. We carry out an experiment aimed at analysing the way human subjects, taking on the role of the manager, take decisions in a laboratory regarding the capacity of a service facility. We adapt the model proposed by van Ackere et al. (2010). This model relaxes the assumption of a captive market and allows current customers to decide whether or not to use the facility. Additionally, the facility also has potential customers who currently do not patronise it, but might consider doing so in the future. We identify three groups of subjects whose decisions cause similar behavioural patterns. These groups are labelled: gradual investors, lumpy investors, and random investors. Using an autocorrelation analysis of the subjects' decisions, we illustrate that these decisions are positively correlated with the decisions taken one period earlier. Subsequently, we formulate a heuristic to model the decision rule considered by subjects in the laboratory. We found that this decision rule fits very well for those subjects who gradually adjust capacity, but it does not capture the behaviour of the subjects of the other two groups. In Chapter 5 we summarise the results and provide suggestions for further work. Our main contribution is the use of simulation and experimental methodologies to explain the collective behaviour generated by customers' and managers' decisions in queueing systems, as well as the analysis of the individual behaviour of these agents. In this way, we differ from the typical literature related to queueing systems, which focuses on optimising performance measures and the analysis of equilibrium solutions. Our work can be seen as a first step towards understanding the interaction between customer behaviour and the capacity adjustment process in queueing systems. This framework is still in its early stages and accordingly there is a large potential for further work that spans several research topics. Interesting extensions to this work include incorporating other characteristics of queueing systems which affect the customers' experience (e.g. balking, reneging and jockeying); providing customers and managers with additional information to take their decisions (e.g. service price, quality, customers' profile); analysing different decision rules and studying other characteristics which determine the profile of customers and managers.
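As a rough illustration of the adaptive-expectations updating and the risk-averse facility choice described above, the following Python sketch shows one plausible formulation. The exact functional forms (the mean-plus-spread upper bound, the single smoothing weight) and all names are assumptions, since the summary does not state the model equations.

```python
def update_expectation(previous, observation, weight_new):
    """Adaptive expectations: blend the remembered estimate with new information.
    'Conservative' agents use a small weight_new, 'reactive' agents a large one."""
    return (1.0 - weight_new) * previous + weight_new * observation

def choose_facility(mean_estimate, spread_estimate, risk_aversion):
    """Join the facility with the lowest estimated upper bound on sojourn time,
    taken here as the perceived mean plus a risk-aversion multiple of the spread."""
    upper_bound = {f: mean_estimate[f] + risk_aversion * spread_estimate[f]
                   for f in mean_estimate}
    return min(upper_bound, key=upper_bound.get)

# Hypothetical customer: perceptions for facilities A and B, moderate risk aversion
means = {"A": 4.0, "B": 5.0}
spreads = {"A": 2.0, "B": 0.5}
print(choose_facility(means, spreads, risk_aversion=1.0))  # -> 'B' (5.5 < 6.0)
```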

Relevance:

20.00%

Publisher:

Abstract:

Notch signalling has an important role in skin homeostasis, promoting keratinocyte differentiation and suppressing tumorigenesis. Here we show that this pathway also has an essential anti-apoptotic function in the keratinocyte UVB response. Notch1 expression and activity are significantly induced, in a p53-dependent manner, by UVB exposure of primary keratinocytes as well as intact epidermis of both mouse and human origin. The apoptotic response to UVB is increased by deletion of the Notch1 gene or down-modulation of Notch signalling by pharmacological inhibition or genetic suppression of 'canonical' Notch/CSL/MAML1-dependent transcription. Conversely, Notch activation protects keratinocytes against apoptosis through a mechanism that is not linked to Notch-induced cell cycle withdrawal or NF-kappaB activation. Rather, transcription of FoxO3a, a key pro-apoptotic gene, is under direct negative control of Notch/HERP transcription in keratinocytes, and upregulation of this gene accounts for the increased susceptibility to UVB of cells with suppressed Notch signalling. Thus, the canonical Notch/HERP pathway functions as a protective anti-apoptotic mechanism in keratinocytes through negative control of FoxO3a expression.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE: APETx2, a toxin from the sea anemone Anthopleura elegantissima, inhibits acid-sensing ion channel 3 (ASIC3)-containing homo- and heterotrimeric channels with IC50 values of < 100 nM and 0.1-2 µM, respectively. ASIC3 channels mediate the acute acid-induced and inflammatory pain response, and APETx2 has been used as a selective pharmacological tool in animal studies. Toxins from sea anemones also modulate voltage-gated Na+ channel (Nav) function. Here we tested the effects of APETx2 on Nav function in sensory neurones. EXPERIMENTAL APPROACH: Effects of APETx2 on Nav function were studied in rat dorsal root ganglion (DRG) neurones by whole-cell patch clamp. KEY RESULTS: APETx2 inhibited the tetrodotoxin (TTX)-resistant Nav1.8 currents of DRG neurones (IC50 2.6 µM). TTX-sensitive currents were less inhibited. The inhibition of Nav1.8 currents was due to a rightward shift in the voltage dependence of activation and a reduction of the maximal macroscopic conductance. The inhibition of Nav1.8 currents by APETx2 was confirmed with cloned channels expressed in Xenopus oocytes. In current-clamp experiments in DRG neurones, the number of action potentials induced by injection of a current ramp was reduced by APETx2. CONCLUSIONS AND IMPLICATIONS: APETx2 inhibited Nav1.8 channels, in addition to ASIC3 channels, at concentrations used in in vivo studies. The limited specificity of this toxin should be taken into account when using APETx2 as a pharmacological tool. Its dual action will be an advantage for the use of APETx2 or its derivatives as analgesic drugs.
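For context, an IC50 such as the 2.6 µM value reported for Nav1.8 block is typically obtained by fitting a concentration-response curve. The Python sketch below fits a one-site Hill model to hypothetical data points; both the data and the choice of model are illustrative assumptions, not the fitting procedure used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, ic50, n_h):
    """Fraction of current remaining at toxin concentration 'conc' (same units as ic50)."""
    return 1.0 / (1.0 + (conc / ic50) ** n_h)

# Hypothetical concentration-response data (µM, fraction of control current)
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0])
remaining = np.array([0.95, 0.85, 0.65, 0.45, 0.20])

params, _ = curve_fit(hill, conc, remaining, p0=[2.6, 1.0])
print("IC50 = %.2f uM, Hill coefficient = %.2f" % tuple(params))
```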