911 results for New Strategic Theory
Abstract:
Secularism has emerged as a central category of twenty-first century political thought that in many ways has replaced the theory of secularization. According to postcolonial scholars, neither the theory nor the practice of secularization was politically neutral. They define secularism as the set of discourses, policies, and constitutional arrangements whereby modern states and liberal elites have sought to unify nations and divide colonial populations. This definition is quite different from the original meaning of secularism, as an immanent scientific worldview linked to anticlericalism. Anthropologist Talal Asad has connected nineteenth-century worldview secularism to twenty-first century political secularism through a genealogical account that stresses continuities of liberal hegemony. This essay challenges this account. It argues that liberal elites did not merely subsume worldview secularism in their drive for state secularization. Using the tools of conceptual history, the essay shows that one reason that “secularization” only achieved its contemporary meaning in Germany after 1945 was that radical freethinkers and other anticlerical secularists had previously resisted liberal hegemony. The essay concludes by offering an agenda for research into the discontinuous history of these two types of secularism.
Abstract:
The focus of this paper is to outline a method for consolidating and implementing the work on performance-based specification and testing. The first part of the paper reviews the mathematical significance of the variables used in common service life models. The aim is to identify a set of significant variables that influence the ingress of chloride ions into concrete. These variables are termed Key Performance Indicators (KPIs). This will also help to reduce the complexity of some of the service life models and make them more appealing to practising engineers. The second part of the paper presents a plan for developing a database based on these KPIs so that relationships can then be drawn between common concrete mix parameters and KPIs. This will assist designers in specifying a concrete with adequate performance for a particular environment. Collectively, this is referred to as the KPI-based approach, and the concluding remarks outline how the authors envisage the KPI theory relating to performance assessment and monitoring.
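The common service life models mentioned above typically build on the error-function solution to Fick's second law of diffusion, which relates chloride content to depth and exposure time. A minimal sketch of that baseline model is given below; the parameter values (surface chloride content, diffusion coefficient) are illustrative assumptions, not values from the paper.

```python
from math import erf, sqrt

def chloride_content(x_mm, t_years, Cs=0.5, D=5e-13):
    """Chloride content (% binder) at depth x_mm after t_years of exposure.

    Error-function solution to Fick's second law:
        C(x, t) = Cs * (1 - erf(x / (2 * sqrt(D * t))))
    Cs : surface chloride content (% binder), assumed constant (illustrative).
    D  : apparent chloride diffusion coefficient (m^2/s), assumed constant.
    """
    x = x_mm / 1000.0                      # depth in metres
    t = t_years * 365.25 * 24 * 3600.0     # exposure time in seconds
    return Cs * (1.0 - erf(x / (2.0 * sqrt(D * t))))

# Chloride content falls with depth and rises with exposure time:
profile = [chloride_content(x, 50) for x in (10, 30, 50)]
```

Identifying which inputs of such a model (cover depth, diffusion coefficient, surface concentration) dominate the predicted ingress is precisely the kind of sensitivity question the KPI selection addresses.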
Abstract:
On 26 December 2003 an Israeli activist was shot by the Israeli Army while participating in a demonstration organized by Anarchists Against the Wall (AAtW) in the West Bank. This was the first time Israeli soldiers had deliberately fired live bullets at a Jewish-Israeli activist. This paper is an attempt to understand the set of conditions, the enveloping frameworks, and the new discourses that made this event, and the similar shootings that soon followed, possible. Situating the actions of AAtW within a much wider context of securitization, of identities, movements, and bodies, we examine strategies of resistance deployed in highly securitized public spaces. We claim that an unexpected matrix of identity, in which abnormality is configured as a security threat, renders the bodies of activists especially precarious. The paper thus provides an account of the new rationales of the security technologies and tactics that increasingly govern public spaces.
Abstract:
Best concrete research paper by a student - Research has shown that the cost of managing structures puts a high strain on the infrastructure budget, with estimates of over 50% of the European construction budget being dedicated to repair and maintenance. If reinforced concrete structures are not suitably designed and adequately maintained, their service life is compromised and the full economic value of the investment is not realised. The issue is more prevalent in coastal structures as a result of combinations of aggressive actions, such as those caused by chlorides, sulphates and cyclic freezing and thawing.

It is common practice nowadays to ensure the durability of reinforced concrete structures by specifying a concrete mix and a nominal cover at the design stage to cater for the exposure environment. In theory, this should produce the performance required to achieve a specified service life. Although the European Standard EN 206-1 specifies variations in the exposure environment, it does not take into account the macro and micro climates surrounding structures, which have a significant influence on their performance and service life. Therefore, in order to construct structures which will perform satisfactorily in different exposure environments, the following two aspects need to be developed: a performance-based specification to supplement EN 206-1, which will outline the expected performance of the structure in a given environment; and a simple yet transferable procedure for assessing the performance of structures in service, termed KPI Theory. This will allow asset managers not only to design structures for the intended service life, but also to take informed maintenance decisions should the performance in service fall short of what was specified. This paper aims to discuss this further.
Abstract:
Identifying processes that shape species geographical ranges is a prerequisite for understanding environmental change. Currently, species distribution modelling methods do not offer credible statistical tests of the relative influence of climate factors and typically ignore other processes (e.g. biotic interactions and dispersal limitation). We use a hierarchical model fitted with Markov Chain Monte Carlo to combine ecologically plausible niche structures using regression splines to describe unimodal but potentially skewed response terms. We apply spatially explicit error terms that account for (and may help identify) missing variables. Using three example distributions of European bird species, we map model results to show sensitivity to change in each covariate. We show that the overall strength of climatic association differs between species and that each species has considerable spatial variation in both the strength of the climatic association and the sensitivity to climate change. Our methods are widely applicable to many species distribution modelling problems and enable accurate assessment of the statistical importance of biotic and abiotic influences on distributions.
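The unimodal but potentially skewed response terms described above can be illustrated with a much simpler stand-in than the paper's hierarchical MCMC model: below, a cubic-polynomial logit (a crude surrogate for a regression spline) is fitted to synthetic occurrence data by maximum likelihood. All data, coefficients, and the single climate covariate are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
temp = rng.uniform(0.0, 30.0, 500)          # hypothetical climate covariate

# True skewed, unimodal response on the logit scale (assumed form).
eta_true = -4 + 0.9 * temp - 0.035 * temp**2 + 2e-4 * temp**3
occ = rng.random(500) < 1.0 / (1.0 + np.exp(-eta_true))  # occurrence data

def nll(beta):
    """Negative Bernoulli log-likelihood of a cubic-polynomial logit."""
    eta = beta[0] + beta[1] * temp + beta[2] * temp**2 + beta[3] * temp**3
    p = np.clip(1.0 / (1.0 + np.exp(-eta)), 1e-9, 1 - 1e-9)
    return -np.sum(occ * np.log(p) + (~occ) * np.log(1 - p))

fit = minimize(nll, np.zeros(4), method="Nelder-Mead",
               options={"maxiter": 5000})
```

The Bayesian version adds spatially correlated error terms and priors enforcing unimodality; this sketch only shows the shape of the likelihood component.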
Abstract:
We consider a normal form game in which there is a single exogenously given coalition of cooperating players that can write a binding agreement on pre-selected actions. These collective actions typically represent a certain number of dimensions in the players’ strategy space. The actions represented by the other dimensions of the strategy space remain under the complete, individual control of the players.
We consider a standard extension of the Nash equilibrium concept, referred to as a partial cooperative equilibrium, as well as an equilibrium concept in which the coalition of cooperators has a leadership position. Existence results are developed for these new equilibrium concepts, and we identify conditions on these partial cooperative games under which the various equilibrium concepts are equivalent.
We apply this game-theoretic framework to existing models of multi-market oligopolies and international pollution abatement. In a multi-market oligopoly, a merger paradox typically emerges in the partial cooperative equilibrium, which vanishes if the cartel of collaborators exploits its leadership position. Our application to international pollution abatement treaties shows that cooperation by a sufficiently large group of countries results in a Pareto improvement over the standard tragedy-of-the-commons outcome described by the Nash equilibrium.
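The pollution-abatement result can be illustrated with a toy model: n countries choose abatement levels, each gaining B per unit of total abatement while paying a private quadratic cost. The linear-quadratic functional form and all numbers below are our assumptions for illustration, not the paper's model.

```python
def equilibria(n, k, B=1.0, C=1.0):
    """Return (nash_payoff, member_payoff, outsider_payoff).

    Payoff of country i: B * total_abatement - C * a_i**2.
    Nash: each country maximises B*a_i - C*a_i**2     -> a_i = B / (2C).
    Partial cooperation: a coalition of k countries maximises its joint
    payoff, so each member internalises k*B per unit  -> a  = k*B / (2C).
    Outsiders keep playing their Nash action.
    """
    a_nash = B / (2 * C)
    nash_payoff = B * (n * a_nash) - C * a_nash**2

    a_coop = k * B / (2 * C)
    total = k * a_coop + (n - k) * a_nash
    member = B * total - C * a_coop**2
    outsider = B * total - C * a_nash**2
    return nash_payoff, member, outsider
```

In this toy, coalition members abate more, total abatement rises, and both members and outsiders end up better off than at the Nash outcome, mirroring the Pareto improvement described in the abstract.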
Abstract:
On 27 June 2012, the Deputy First Minister of Northern Ireland and former IRA commander, Martin McGuinness, shook hands with Queen Elizabeth II for the first time at an event in Belfast. For many the gesture symbolised the consolidation of Northern Ireland's transition to peace, the meeting of cultures and traditions, and hope for the future. Only a few weeks later, however, violence spilled onto the streets of north and west Belfast following a series of commemorative parades, marking a summer of hostilities. Those hostilities spread into a winter of protest, riot and discontent around flags and emblems, and a year of tensions and commemorative-related violence marked again by a summer of rioting and protest in 2013. Outwardly these examples present two very different pictures of the 'new' Northern Ireland: the former of a society moving forward and putting the past behind it, and the latter of one apparently divided over and wedded to different constructions of the past. Furthermore, they reveal two very different 'places': the public handshake in the arena of public space; the rioting and fighting occurring in spaces distanced from the public sphere. The paper also illustrates the difficulties around the 'public management' of conflict and transition, as many within public agencies struggle with duties to uphold good relations and promote good governance within an environment of political strife, hostility and continuing violence.
This paper presents the key findings and implications of an exploratory project, funded by the Arts and Humanities Research Council, that explored the phenomenon of commemorative-related violence in Northern Ireland. We 1) examine why the performance or celebration of the past can sometimes lead to violence in specific places; 2) map and analyse the levels of commemorative-related violence over the past 15 years; and 3) consider the public management implications of both conflict and transition at a strategic level within the public sector.
Abstract:
Answer Set Programming (ASP) is a popular framework for modelling combinatorial problems. However, ASP cannot easily be used for reasoning about uncertain information. Possibilistic ASP (PASP) is an extension of ASP that combines possibilistic logic and ASP. In PASP a weight is associated with each rule, where this weight is interpreted as the certainty with which the conclusion can be established when the body is known to hold. As such, it allows us to model and reason about uncertain information in an intuitive way. In this paper we present new semantics for PASP in which rules are interpreted as constraints on possibility distributions. Special models of these constraints are then identified as possibilistic answer sets. In addition, since ASP is a special case of PASP in which all rules are entirely certain, we obtain a new characterization of ASP in terms of constraints on possibility distributions. This allows us to uncover a new form of disjunction, called weak disjunction, that has not previously been considered in the literature. In addition to introducing and motivating the semantics of weak disjunction, we also pinpoint its computational complexity. In particular, while the complexity of most reasoning tasks coincides with that of standard disjunctive ASP, we find that brave reasoning for programs with weak disjunctions is easier.
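The weight-as-certainty reading of PASP rules can be sketched for the simplest case: definite (negation-free) programs, where the classical min-based propagation of possibilistic logic applies and an atom is exactly as certain as its least certain support. This is only the baseline that the paper's constraint-based semantics generalises; weak disjunction and negation are not captured here.

```python
def possibilistic_fixpoint(rules):
    """Least fixpoint of min-based certainty propagation.

    rules: iterable of (head, body, weight), body a list of atoms,
    weight in [0, 1]. Enforces, for every rule,
        cert(head) >= min(weight, min over b in body of cert(b)),
    taking the least model (cert defaults to 0 for underived atoms).
    """
    cert = {}
    changed = True
    while changed:
        changed = False
        for head, body, weight in rules:
            # An atom inherits the weakest link in its best derivation.
            c = min([weight] + [cert.get(b, 0.0) for b in body])
            if c > cert.get(head, 0.0):
                cert[head] = c
                changed = True
    return cert
```

For example, a fact `a` with certainty 0.9 and rules `b <- a` (0.6) and `c <- b` (0.8) yield certainty 0.6 for both `b` and `c`: the 0.8 rule cannot make `c` more certain than its premise.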
Abstract:
I have completed 80% of a teaching textbook (text and graphics) on Separation Science and Technology - Theory. The book's content is what I have learned over many years of practice and teaching, with an emphasis on clarifying and explaining the nuances within the theories associated with various practical approaches to chemical and biochemical separations.
The book is divided into self-contained chapters with many worked examples and practice questions. It aligns closely with my teaching on CHM3005D, CHM2010 and CHM2007, and is ideal for PMY8105 and the newly proposed MSci in Analytical Chemistry programme. The book brings together diverse material in a single place and will be a valuable pedagogical resource for the teaching of this key discipline within QUB and elsewhere.
Abstract:
Intergroup contact theory proposes that positive interactions between members of different social groups can improve intergroup relations. Contact should be especially effective in schools, where opportunities may exist to engage cooperatively with peers from different backgrounds and develop cross-group friendships. In turn, these friendships have numerous benefits for intergroup relations. However, there is evidence that children do not always engage in cross-group friendships, often choosing to spend time with same-group peers, even in diverse settings. We argue that in order to capitalize on the potential impact of contact in schools for promoting harmonious intergroup relations, a new model is needed that places confidence in contact at its heart. We present an empirically driven theoretical model of intergroup contact that outlines the conditions that help to make young people contact-ready, preparing them for successful, sustained intergroup relationships by giving them the confidence that they can engage in contact successfully. After evaluating the traditional approach to intergroup contact in schools, we present our theoretical model, which outlines predictors of cross-group friendships that enhance confidence in and readiness for contact. We then discuss theory-driven, empirically tested interventions that could potentially promote confidence in contact. Finally, we make specific recommendations for practitioners and policy makers striving to promote harmonious intergroup relations in the classroom.
Abstract:
For those working in the humanitarian sector, achieving positive outcomes for post-disaster communities through reconstruction projects is a pressing concern. In the wake of recent natural disasters, NGOs have become increasingly involved in the permanent reconstruction of affected communities. They have encountered significant barriers as they implement reconstruction programmes, and this paper argues that it is important to address the visible lack of innovation that is partially to blame. The theoretical bedrock of a current research project will be used as the starting point for this argument, the overall goal of which is to design a competency-based framework model that can be used by NGOs in post-disaster reconstruction projects. Drawing on established theories of management, a unique perspective has been developed from which a competency-based reconstruction theory emerges. This theoretical framework brings together three distinct fields: Disaster Management, Strategic Management and Project Management, each vital to the success of the model. The objectives of this paper are a) to investigate the role of NGOs in post-disaster reconstruction and establish the current standard of practice; b) to determine the extent to which NGOs have the opportunity to contribute to sustainable community development through reconstruction; c) to outline the main factors of a theoretical framework first proposed by Von Meding et al. (2009); and d) to identify the innovative measures that can be taken by NGOs to achieve more positive outcomes in their interventions. It is important that NGOs involved in post-disaster reconstruction become familiar with concepts and strategies such as those contained in this paper. Competency-based organizational change on the basis of this theory has the potential to help define the standard of best practice to which future NGO projects might align themselves.
Abstract:
This work aimed to identify a strategy and develop a model that would allow telecommunications operators to remain sustainable, as well as to identify paths for adapting to an ever-changing reality such as that of the telecommunications industry. The first part of the work presents a literature review of the current state of the art of the main strategies relevant and applicable to the telecommunications industry. The research investigated the current structure of the telecommunications industry and the state of competitiveness of telecommunications operators. The results revealed a constant evolution of technology and business models in this ecosystem, as well as significant competitive pressure on operators, both from companies already established in the market and from emerging ones. Operators have to transform their network and business models to adapt to the changes and trends in the industry and the market. Based on the literature review, a methodology based on an empirical research survey was chosen to assess the state of the industry and derive possible strategies. This survey was administered to telecommunications experts from different subsectors and countries in order to address all the strategic elements of the future business model. The results revealed that companies operating in the Internet market (Over The Top - OTT) represent the greatest threat to the future of telecommunications operators. Operators will only be able to respond by modernising their networks, improving quality, reducing overall cost, and investing in innovative and differentiated products and services. The survey results prove to be consistent with the assumptions of the Blue Ocean Strategy.
The applicability of the Blue Ocean Strategy was explored further, leading to the conclusion that value innovation, obtained simultaneously through cost reduction and differentiation, can increase the advantages that incumbent operators hold in terms of their physical infrastructure and established customer relationships. The particular case of fibre to the home (FTTH) was considered as an application of the Blue Ocean Strategy to a new technology that operators can deploy to create new solutions and open up unexplored market segments. The survey results and the research into the application of the Blue Ocean Strategy were combined to propose a new business model for telecommunications operators that allows them not only to respond to existing challenges but also to achieve a better competitive position in the future. A case study was also carried out, highlighting how Verizon Communications was able to transform its network and business model in response to increasing competitive pressure. Through the value of innovation delivered to its customers, Verizon was able to significantly increase its revenues and customer satisfaction.
Abstract:
The purpose of this work is to identify, within the field of Strategy, how to quantify the strategic importance of a resource, a solution, or an organisation from the customer's point of view, that is, from the point of view of whoever has a need for something. The motivation for its development arose from the observation of a growing trivialisation of the terms related to Strategy, especially after Porter's work. This phenomenon has trivialised, for example, the use of the term strategic importance, since everything, or almost everything, has come to be described as strategically important. Another term that has also been trivialised is value and its derivatives, such as creating value, adding value, transferring value, and so on. The trivialisation is such that it is not difficult to find laypeople using these terms as if they had mastered their meaning. The bibliographic analysis of these terms from the perspective of several authors ultimately led to the "deconstruction" of concepts, especially those that underpin the Porterian view. Among them, value stands out, and this work analyses its meaning according to several areas of knowledge, such as Philosophy, Economics, Quality, Strategy, and Marketing. Through this process, it was possible to see that the theories underlying the perspectives used to analyse the origin of competitive advantage present static and partial views of the competitive environment. It was also possible to verify that the definitions of Strategy given by several authors are not as faithful to the theories as one would expect, since these definitions are often inconclusive and seem to "drift" between them. Such inconsistencies highlight the need for a new theoretical proposal that fills the gaps in the current ones.
This work therefore proposes the concept of Contribution, as a replacement for value, and the Contribution Creation System, as a replacement for the value chain, giving rise to a new structured view of the organisation oriented towards the macro-attributes that address customers' needs. This new perspective on obtaining competitive advantage, called Contribution-Based Management, is integrative because it considers, from a dynamic perspective, not only the factors exogenous to the organisation (the markets) but also the endogenous ones (its resources). Furthermore, it allows the strategic importance to be quantified, which until now could not be done either as an (absolute) score or as a percentage (relative to the customer).
Abstract:
Statistical techniques are fundamental in science, and linear regression analysis is perhaps one of the most widely used methodologies. It is well known from the literature that, under certain conditions, linear regression is an extremely powerful statistical tool. Unfortunately, in practice, some of these conditions are rarely satisfied and the regression models become ill-posed, thus making the application of traditional estimation methods unfeasible. This work presents some contributions to maximum entropy theory in the estimation of ill-posed models, in particular in the estimation of linear regression models with small samples affected by collinearity and outliers. The research is developed along three lines, namely the estimation of technical efficiency with state-contingent production frontiers, the estimation of the ridge parameter in ridge regression, and, finally, new developments in maximum entropy estimation. In the estimation of technical efficiency with state-contingent production frontiers, the work shows that maximum entropy estimators outperform the maximum likelihood estimator. This good performance is notable in models with few observations per state and in models with a large number of states, which are commonly affected by collinearity. It is hoped that the use of maximum entropy estimators will contribute to the much-desired increase in empirical work with these production frontiers. In ridge regression, the greatest challenge is the estimation of the ridge parameter. Although numerous procedures are available in the literature, the truth is that none outperforms all the others. This work proposes a new estimator of the ridge parameter that combines ridge trace analysis and maximum entropy estimation.
The results obtained in the simulation studies suggest that this new estimator is one of the best procedures available in the literature for estimating the ridge parameter. The Leuven maximum entropy estimator is based on the least squares method, Shannon entropy, and concepts from quantum electrodynamics. This estimator overcomes the main criticism levelled at the generalised maximum entropy estimator, since it dispenses with the supports for the parameters and errors of the regression model. This work presents new contributions to maximum entropy theory in the estimation of ill-posed models, based on the Leuven maximum entropy estimator, information theory, and robust regression. The estimators developed perform well in linear regression models with small samples affected by collinearity and outliers. Finally, some computational code for maximum entropy estimation is presented, thereby contributing to the scarce computational resources currently available.
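The ridge trace that the proposed estimator analyses can be sketched quickly: ridge estimates are computed over a grid of ridge parameters k for a deliberately collinear design, and their norms shrink as k grows. This does not reproduce the maximum entropy choice of k from the thesis; the design matrix and coefficients below are invented for illustration.

```python
import numpy as np

def ridge_path(X, y, ks):
    """Ridge coefficient vectors (on centred data) for each ridge parameter k.

    Solves (Xc'Xc + k*I) beta = Xc'y for each k in ks.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    p = X.shape[1]
    return np.array([np.linalg.solve(Xc.T @ Xc + k * np.eye(p), Xc.T @ yc)
                     for k in ks])

# Collinear design: the second column nearly duplicates the first.
rng = np.random.default_rng(1)
x1 = rng.normal(size=40)
X = np.column_stack([x1, x1 + 1e-3 * rng.normal(size=40),
                     rng.normal(size=40)])
y = X @ np.array([1.0, 1.0, 0.5]) + 0.1 * rng.normal(size=40)

path = ridge_path(X, y, ks=[1e-4, 1e-2, 1.0, 100.0])
```

Plotting each coefficient of `path` against k gives the classical ridge trace; choosing the k where the trace stabilises is the step the thesis replaces with a maximum entropy criterion.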