929 results for Leader and boss
Abstract:
Agent-based simulation is a rapidly developing area of artificial intelligence, and simulation studies are used extensively in different areas of disaster management. This work presents an agent-based evacuation simulation designed to handle a variety of evacuation behaviors. Various emergent behaviors of agents are addressed, and dynamic grouping behaviors are studied. Collision detection and obstacle avoidance are also incorporated in this approach. Evacuation is studied with single and multiple exits, and efficiency is measured in terms of evacuation rate, collision rate, etc. NetLogo is the tool used, as it enables efficient modeling of evacuation scenarios.
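To make the modeling approach concrete, here is a minimal sketch of an agent-based evacuation loop with exit seeking and collision counting. It is written in Python rather than NetLogo, and the grid size, agent count, and exit positions are illustrative assumptions, not details of the study above.

```python
# Minimal agent-based evacuation sketch (an illustrative Python
# stand-in for the NetLogo model described above; grid size, agent
# count, and exits are hypothetical). Agents step toward the nearest
# exit; a move into an occupied cell is counted as a collision.
import random

GRID = 20
EXITS = [(0, 10), (19, 10)]                     # two exits on the walls

def nearest_exit(pos):
    return min(EXITS, key=lambda e: abs(e[0] - pos[0]) + abs(e[1] - pos[1]))

def step_toward(pos, goal):
    x, y = pos
    x += (goal[0] > x) - (goal[0] < x)          # one step per axis
    y += (goal[1] > y) - (goal[1] < y)
    return (x, y)

agents = {(random.randrange(GRID), random.randrange(GRID)) for _ in range(80)}
evacuated = collisions = 0
for tick in range(200):
    moved = set()
    for pos in agents:
        nxt = step_toward(pos, nearest_exit(pos))
        if nxt in EXITS:
            evacuated += 1                      # agent leaves the grid
        elif nxt in moved or nxt in agents:
            collisions += 1                     # blocked: agent stays put
            moved.add(pos)
        else:
            moved.add(nxt)
    agents = moved

print(f"evacuated={evacuated} collisions={collisions} remaining={len(agents)}")
```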
Abstract:
Strategic positioning is defined as the starting point of all reflection by an organization (however small it may be) that seeks to secure its place and a return on its own performance. In the case of a collective strategy, there is not just the vision of a single manager; on the contrary, there is a large number of visions from different managers, who must take common decisions that benefit both their own interests and the shared interests of each company. It is therefore essential that the current situation and the objectives to be reached are clearly defined from the very beginning of strategy formulation, in order to avoid divergences that could put the coherence of the strategy at risk. Given the problems encountered by French SMEs, such as starting up the business, financial difficulties, organizational integration, competition and product development, collective strategy emerges as a possible solution that allows SMEs to endure over time. In France, driven by the government and other financial and administrative institutions, this strategy has achieved results that had not previously been thought possible, as demonstrated by the urban-model case study presented in this research. This is why this topic was chosen.
Abstract:
Abstract taken from the publication.
Abstract:
Virginia Tech has been depositing e-theses for over a decade and is a leader in the field. The University of Southampton introduced e-thesis deposit in the 2008-09 academic session.
Abstract:
Protein tagging with ubiquitin, known as ubiquitination, serves diverse functions, including the regulation of several cellular processes such as proteasome-mediated protein degradation, DNA repair, membrane-receptor signaling, and endocytosis, among others (1). Ubiquitin molecules can be removed from their substrates by a large group of proteases called deubiquitinating enzymes (DUBs) (2). DUBs are essential for maintaining ubiquitin homeostasis and for regulating the ubiquitination state of different substrates. The large number and diversity of described DUBs reflects both their specificity and their use in regulating a wide spectrum of substrates and cellular pathways. Although many DUBs have been studied in depth, the substrates and biological functions of most of them remain unknown. This work investigated the functions of the DUBs USP19, USP4 and UCH-L1. Using several molecular and cell biology techniques, it was found that: i) USP19 is regulated by the ubiquitin ligases SIAH1 and SIAH2; ii) USP19 is important for regulating HIF-1α, a key transcription factor in the cellular response to hypoxia; iii) USP4 interacts with the proteasome; iv) the mCherry-UCH-L1 chimera partially reproduces the phenotypes our group previously described using other constructs of the same enzyme; and v) UCH-L1 promotes the internalization of the bacterium Yersinia pseudotuberculosis.
Abstract:
What are the effects of natural disasters on electoral results? Some authors claim that catastrophes have a negative effect on the survival of leaders in a democracy because voters have a propensity to punish politicians for failing to prevent, or poorly handling, a crisis. In contrast, this paper finds that these events can benefit leaders. Disasters are linked to leader survival through clientelism: they generate an inflow of resources in the form of aid, which increases the money available for buying votes. Analyzing the 2010-2011 rainy season in Colombia, considered the country's worst disaster in history, I use a difference-in-differences strategy to show that incumbent parties benefited from the disaster in the local elections. The result is robust to different specifications and alternative explanations. Moreover, places receiving more aid, and those with judicial evidence of vote-buying irregularities, are more likely to reelect the incumbent, supporting the mechanism proposed by this paper.
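For readers unfamiliar with the method, a difference-in-differences estimate of this kind is typically recovered as the coefficient on a treatment-by-period interaction. The sketch below is a generic illustration with synthetic data and hypothetical variable names, not the paper's actual specification.

```python
# Illustrative difference-in-differences sketch with synthetic data;
# variable names and effect sizes are hypothetical, not the paper's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400                                   # hypothetical municipalities
munis = np.arange(n)
affected = rng.integers(0, 2, n)          # hit by the 2010-11 rains?

rows = []
for post in (0, 1):                       # election before / after
    share = (0.40 + 0.05 * affected * post            # the DiD effect
             + 0.02 * affected + rng.normal(0, 0.05, n))
    rows.append(pd.DataFrame({"municipality": munis, "post": post,
                              "affected": affected, "vote_share": share}))
df = pd.concat(rows, ignore_index=True)

# The DiD estimate is the coefficient on the interaction term: the
# change in incumbent vote share in affected municipalities relative
# to the change in unaffected ones.
model = smf.ols("vote_share ~ affected + post + affected:post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["municipality"]})
print(model.params["affected:post"])      # recovers roughly 0.05
```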
Abstract:
Multicultural leadership is a topic of great interest in today's globalized work environment. Colombia is emerging as an attractive marketplace with appealing business opportunities, especially for German enterprises. After presenting Colombia's current political, social and economic situation, the thesis elaborates on the complex subject of cultural differences, focusing on the peculiarities of the German and Colombian national cultures. The resulting implications for a team's collaboration and leader effectiveness are theoretically grounded in the landmark studies of Hofstede and GLOBE. Using semi-structured interview techniques, a qualitative study enriches the previous findings and gives an all-encompassing insight into German-Colombian teamwork. The investigation identifies distinctive behavioral patterns and relations, which imply challenges and success factors for multicultural team leaders. Finally, a categorical analysis examines the influence of cultural traits on team performance and evaluates the effectiveness of the applied leadership style.
Abstract:
We study the role of natural resource windfalls in explaining the efficiency of public expenditures. Using a rich dataset of expenditures and public good provision for 1,836 municipalities in Peru for the period 2001-2010, we estimate a non-monotonic relationship between the efficiency of public good provision and the level of natural resource transfers. Local governments that were strongly favored by the boom in mineral prices were more efficient in using fiscal windfalls, whereas those that benefited from only modest transfers were less efficient. These results can be explained by the increase in political competition associated with the boom. However, the fact that increases in efficiency were related to reductions in public good provision casts doubt on the beneficial effects of political competition in promoting efficiency.
Abstract:
In their assessment of the proposed European Endowment for Democracy (EED), Hrant Kostanyan and Magdalena Nasieniak conclude that an instrument along the lines currently envisaged could, and should, take on the challenge of making the EU a truly committed, pro-active and effective leader in democracy assistance. A flexible, fast-track path for assessing needs and granting funds could become the most visible result of the EU's assistance in this area, delivering almost immediate, tangible results. They argue that the EED therefore needs to become an instrument free of nationally-driven decisions, European 'turf wars' and cumbersome bureaucracy.
Abstract:
More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data- and computation-intensive, and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, create knowledge and skills, and lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now-global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost-effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader in well-curated national data assets and computational infrastructure, expertly used to shape policy, support decisions, empower researchers and roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, analysis and exploitation of these data, and to widening access to them. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation, plus the skills developed, will launch significant advances in research, business, professional practice and government, with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.
Abstract:
Phosphorylation of the coronavirus nucleoprotein (N protein) has been predicted to play a role in RNA binding. To investigate this hypothesis, we examined the kinetics of RNA binding by nonphosphorylated and phosphorylated infectious bronchitis virus N protein with nonviral and viral RNA using surface plasmon resonance (Biacore). Mass spectrometric analysis of N protein identified phosphorylation sites proximal to RNA binding domains. Kinetic analysis by surface plasmon resonance indicated that nonphosphorylated N protein bound with the same affinity to viral RNA as phosphorylated N protein. However, phosphorylated N protein bound to viral RNA with higher affinity than to nonviral RNA, suggesting that phosphorylation of N protein determines the recognition of viral RNA. The data also indicated that a known N protein binding site (involved in transcriptional regulation), consisting of a conserved core sequence present near the 5' end of the genome (in the leader sequence), functioned by promoting high association rates of N protein binding. Further analysis of the leader sequence indicated that the core element was not the only binding site for N protein and that other regions functioned to promote high-affinity binding.
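For context, Biacore kinetic analyses of this kind conventionally fit the standard 1:1 (Langmuir) interaction model, in which the association and dissociation rate constants jointly determine the equilibrium affinity. The block below states that general model for reference only; it is not claimed to be the exact fitting scheme used in this study.

```latex
% 1:1 (Langmuir) binding model routinely used for SPR kinetic fits.
% R(t): response signal; C: analyte concentration; R_max: maximal response;
% k_a: association rate constant; k_d: dissociation rate constant.
\[
  \frac{dR}{dt} = k_a C \,(R_{\max} - R) - k_d R ,
  \qquad
  K_D = \frac{k_d}{k_a}
\]
% A faster association rate k_a at the leader-sequence core element
% lowers K_D (i.e. raises affinity) even if k_d is unchanged.
```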
Abstract:
Two experiments investigated the influence of implicit memory on consumer choice for brands with varying levels of familiarity. Priming was measured using a consideration-choice task, developed by Coates, Butler and Berry (2004). Experiment 1 employed a coupon-rating task at encoding that required participants to meaningfully process individual brand names, to assess whether priming could affect participants' final (preferred) choices for familiar brands. Experiment 2 used this same method to assess the impact of implicit memory on consideration and choice for unknown and leader brands, presented in conjunction with familiar competitors. Significant priming was obtained in both experiments, and was shown to directly influence final choice in the case of familiar and highly familiar leader brands. Moreover, it was shown that a single prior exposure could lead participants to consider buying an unknown, and indeed fictitious, brand.
Abstract:
A new database of weather and circulation type catalogs is presented, comprising 17 automated classification methods and five subjective classifications. It was compiled within COST Action 733 "Harmonisation and Applications of Weather Type Classifications for European regions" in order to evaluate different methods for weather and circulation type classification. This paper gives a technical description of the included methods using a new conceptual categorization for classification methods that reflects the strategy used to define the types. Methods using predefined types include manual and threshold-based classifications, while methods deriving types from the input data include those based on eigenvector techniques, leader algorithms and optimization algorithms. To allow direct comparisons between the methods, the circulation input data and the methods' configuration were harmonized to produce a subset of standard catalogs for the automated methods. The harmonization covers the data source, the climatic parameters used, the classification period, and the spatial domain and number of types. Frequency-based characteristics of the resulting catalogs are presented, including variation of class sizes, persistence, seasonal and inter-annual variability, and trends in the annual frequency time series. The methodological concept of the classifications is partly reflected in these properties of the resulting catalogs. It is shown that, compared to automated methods, subjective classifications produce types with higher persistence, inter-annual variation and long-term trends. Among the automated classifications, optimization methods show a tendency toward longer persistence and higher seasonal variation. However, it is also concluded that the distance metric used and the data preprocessing play at least as important a role in the properties of the resulting classification as the algorithm used for type definition and assignment.
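As a point of terminology, a 'leader algorithm' here is a one-pass clustering scheme: each field is assigned to the closest existing leader within a similarity threshold, or otherwise founds a new type. The sketch below illustrates the idea in Python; the distance metric, threshold and data are hypothetical assumptions, and it is not one of the COST 733 implementations.

```python
# Minimal sketch of a leader-type clustering algorithm for circulation
# fields (illustrative only; not a COST 733 implementation).
import numpy as np

def leader_classify(fields, threshold):
    """fields: (n_days, n_gridpoints) array; threshold: max distance."""
    leaders, labels = [], []
    for x in fields:
        # Distance of this field to every current leader.
        dists = [np.linalg.norm(x - l) for l in leaders]
        if dists and min(dists) <= threshold:
            labels.append(int(np.argmin(dists)))   # join the closest type
        else:
            leaders.append(x)                      # this field leads a new type
            labels.append(len(leaders) - 1)
    return np.array(labels), np.stack(leaders)

# Hypothetical usage: classify 1000 days of gridded pressure anomalies.
rng = np.random.default_rng(0)
days = rng.normal(size=(1000, 50))
labels, types = leader_classify(days, threshold=11.0)
print(f"{types.shape[0]} circulation types found")
```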
Abstract:
Research undertaken through a significant public art commission. The researchers, both artists, were selected separately by Dr Penelope Curtis of Tate, and the shortlist was then awarded through competition (peer reviewed by critics and artists in Germany) as part of the Heidenheim Sculpture Biennial, Germany (€18K). The work was realised by two companies in Heidenheim. Where is Heidenheim? was based within the Heidenheim Zeitung newspaper [HZ] and drew together the site of a local paper in a small German town with other local international papers: Wendover Times, Utah, USA; Limerick Leader, Ireland; Free Imphal Press, Manipur, India; Hibr, Lebanon; Namibia Times, Namibia; and The Countryman, Tasmania, Australia. Each of these papers ran a story showing a sign erected onto HZ in Heidenheim, which was subsequently printed inside HZ itself, linking together sites and local voices. Project research identifying the global partners was conducted through the management of a PhD research student from the BU Media School, Venkata Vermuri. For both artists, the work expands the context of their research into the impact of global networks on public art, and into the traditions and norms of public art being confined to single 'geographical' sites. This research indicates the potential of media to act as a common public space that can itself be put to use.