525 results for Contingency


Relevance:

10.00%

Publisher:

Abstract:

In the forecasting of binary events, verification measures that are “equitable” were defined by Gandin and Murphy to satisfy two requirements: 1) they award all random forecasting systems, including those that always issue the same forecast, the same expected score (typically zero), and 2) they are expressible as the linear weighted sum of the elements of the contingency table, where the weights are independent of the entries in the table, apart from the base rate. The authors demonstrate that the widely used “equitable threat score” (ETS), as well as numerous others, satisfies neither of these requirements and only satisfies the first requirement in the limit of an infinite sample size. Such measures are referred to as “asymptotically equitable.” In the case of ETS, the expected score of a random forecasting system is always positive and only falls below 0.01 when the number of samples is greater than around 30. Two other asymptotically equitable measures are the odds ratio skill score and the symmetric extreme dependency score, which are more strongly inequitable than ETS, particularly for rare events; for example, when the base rate is 2% and the sample size is 1000, random but unbiased forecasting systems yield an expected score of around −0.5, reducing in magnitude to −0.01 or smaller only for sample sizes exceeding 25 000. This presents a problem since these nonlinear measures have other desirable properties, in particular being reliable indicators of skill for rare events (provided that the sample size is large enough). A potential way to reconcile these properties with equitability is to recognize that Gandin and Murphy’s two requirements are independent, and the second can be safely discarded without losing the key advantages of equitability that are embodied in the first. This enables inequitable and asymptotically equitable measures to be scaled to make them equitable, while retaining their nonlinearity and other properties such as being reliable indicators of skill for rare events. It also opens up the possibility of designing new equitable verification measures.
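To make the equitability issue concrete, here is a minimal Python sketch (not the authors' code) that computes ETS from the four cells of a 2x2 contingency table and uses Monte Carlo simulation to estimate the expected score of an unbiased random forecasting system; the function names, the 20% base rate and the trial count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ets(a, b, c, d):
    """Equitable threat score (Gilbert skill score) from a 2x2 contingency
    table: a = hits, b = false alarms, c = misses, d = correct negatives."""
    n = a + b + c + d
    a_r = (a + b) * (a + c) / n          # hits expected from a random forecast
    denom = a + b + c - a_r
    return (a - a_r) / denom if denom != 0 else 0.0

def expected_random_ets(n, base_rate, trials=20_000):
    """Monte Carlo estimate of the expected ETS of an unbiased random
    forecasting system whose forecast frequency equals the event base rate."""
    scores = []
    for _ in range(trials):
        obs = rng.random(n) < base_rate
        fcst = rng.random(n) < base_rate      # random, unbiased forecasts
        a = np.sum(fcst & obs)
        b = np.sum(fcst & ~obs)
        c = np.sum(~fcst & obs)
        d = np.sum(~fcst & ~obs)
        scores.append(ets(a, b, c, d))
    return float(np.mean(scores))

# For small samples the expected ETS of a no-skill forecaster is noticeably
# above zero; it shrinks towards zero only as the sample size grows.
for n in (30, 100, 1000):
    print(n, round(expected_random_ets(n, base_rate=0.2), 4))
```

In a simulation of this kind the expected ETS of a no-skill system is positive for small samples and decays towards zero as the sample size grows, which is the sense in which the measure is only asymptotically equitable.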

Relevance:

10.00%

Publisher:

Abstract:

Geographic distributions of pathogens are the outcome of dynamic processes involving host availability, susceptibility and abundance, suitability of climate conditions, and historical contingency including evolutionary change. Distributions have changed fast and are changing fast in response to many factors, including climatic change. The response time of arable agriculture is intrinsically fast, but perennial crops and especially forests are unlikely to adapt easily. Predictions of many of the variables needed to predict changes in pathogen range are still rather uncertain, and their effects will be profoundly modified by changes elsewhere in the agricultural system, including both economic changes affecting growing systems and hosts and evolutionary changes in pathogens and hosts. Tools to predict changes based on environmental correlations depend on good primary data, which is often absent, and need to be checked against the historical record, which remains very poor for almost all pathogens. We argue that at present the uncertainty in predictions of change is so great that the important adaptive response is to monitor changes and to retain the capacity to innovate, both by access to economic capital with reasonably long-term rates of return and by retaining wide scientific expertise, including currently less fashionable specialisms.

Relevance:

10.00%

Publisher:

Abstract:

This paper examines the short- and long-term persistence of tax-exempt real estate funds in the UK using winner-loser contingency table methodology. The persistence tests are applied to a database of a varying number of funds (from a low of 16 to a high of 27), using quarterly returns over the 12 years from 1990 Q1 to 2001 Q4. The overall conclusion is that real estate funds in the UK show little evidence of persistence in the short term (quarterly and semi-annual data) or over longer horizons (bi-annual to six-yearly intervals). In contrast, the results are better for annual data, with evidence of significant performance persistence. At this stage, therefore, an annual evaluation period seems to provide the best discrimination of the winner and loser phenomenon in the real estate market. This result differs from equity and bond studies, where the repeat-winner phenomenon appears stronger over shorter evaluation periods. These results require careful interpretation, however: first, when only small samples are used, significant adjustments must be made to correct for small-sample bias; second, the conclusions are sensitive to the length of the evaluation period and the specific test used. Nonetheless, persistence in the performance of UK real estate funds does seem to exist, at least for annual data, and it appears to be a guide to beating the pack in the long run. Furthermore, although the evidence of persistence for the overall sample of funds is limited, we found that two funds were consistent winners over this period, whereas no fund could be said to be a consistent loser.
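As an illustration of the winner-loser contingency table approach, the following Python sketch implements the standard repeat-winner methodology (winners defined as funds above the cross-sectional median in each period, followed by a cross-product ratio and an approximate Z statistic); it is a simplified, assumed version and does not reproduce the small-sample bias adjustments or the specific tests used in the paper.

```python
import numpy as np

def winner_loser_table(returns):
    """Build the winner-loser contingency table from a (n_funds x n_periods)
    array of period returns: a fund is a 'winner' in a period if its return
    is above the cross-sectional median for that period."""
    winners = returns > np.median(returns, axis=0)   # boolean, fund x period
    prev, curr = winners[:, :-1], winners[:, 1:]
    ww = np.sum(prev & curr)      # winner then winner
    wl = np.sum(prev & ~curr)     # winner then loser
    lw = np.sum(~prev & curr)     # loser then winner
    ll = np.sum(~prev & ~curr)    # loser then loser
    return ww, wl, lw, ll

def persistence_z(ww, wl, lw, ll):
    """Cross-product (odds) ratio and its approximate Z statistic; a Z well
    above zero indicates repeat-winner persistence."""
    odds = (ww * ll) / (wl * lw)
    se = np.sqrt(1 / ww + 1 / wl + 1 / lw + 1 / ll)
    return odds, np.log(odds) / se

# Illustrative use with simulated annual returns for 20 funds over 12 years.
rng = np.random.default_rng(1)
rets = rng.normal(0.08, 0.1, size=(20, 12))
print(persistence_z(*winner_loser_table(rets)))
```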

Relevance:

10.00%

Publisher:

Abstract:

Government and institutionally driven ‘good practice transfer’ initiatives are consistently presented as a means to enhance construction firm and industry performance. Two implicit tenets of these initiatives appear to be that knowledge embedded in good practice will transfer automatically, and that the potential of good practice will be realised regardless of the context in which it is used. The validity of these tenets is increasingly being questioned and, concurrently, more nuanced understandings of knowledge production are being developed which recognise and incorporate context-specificity. This research contributes to this growing, more critical agenda by examining the actual benefits accrued from good practice transfer from the perspective of a small specialist trade contracting firm. A concept model for successful good practice transfer is developed from a single longitudinal case study within a small heating and plumbing firm. The concept model consists of five key variables: environment, strategy, people, technology, and organisation of work. The key findings challenge the implicit assumptions prevailing in the existing literature and support a contingency approach: successful good practice transfer is not just a matter of adopting a practice and mechanistically inserting it into the firm, but requires addressing ‘behavioural’ aspects. For successful good practice transfer, small specialist trade contracting firms need to develop and operationalise organisational slack and mechanisms for scanning external stimuli and absorbing knowledge. They also need to formulate and communicate client-driven external strategies; to motivate and educate people at all levels; to possess internal or accessible complementary skills and knowledge; to have ‘soft focus’ immediate and mid-term benefits at a project level; and to embed good practice in current work practices.

Relevance:

10.00%

Publisher:

Abstract:

Garfield produces a critique of neo-minimalist art practice by demonstrating how the artist Melanie Jackson’s Some things you are not allowed to send around the world (2003 and 2006) and the experimental film-maker Vivienne Dick’s Liberty’s booty (1980) – neither of which can be said to be about feeling ‘at home’ in the world, be it as a resident or as a nomad – examine global humanity through multi-positionality, excess and contingency, and thereby begin to articulate a new cosmopolitan relationship with the local – or, rather, with many different localities – in one and the same maximalist sweep of the work. ‘Maximalism’ in Garfield’s coinage signifies an excessive overloading (through editing, collage, and the sheer density of the range of material) that enables viewers to insert themselves into the narrative of the work. In the art of both Jackson and Dick, Garfield detects a refusal to know or to judge the world; instead, there is an attempt to incorporate the complexities of its full range into the singular vision of the work, challenging the viewer to identify what is at stake.

Relevance:

10.00%

Publisher:

Abstract:

Extinction following positively reinforced operant conditioning reduces response frequency, at least in part through the aversive or frustrative effects of non-reinforcement. According to J.A. Gray's theory, non-reinforcement activates the behavioural inhibition system, which in turn causes anxiety. As predicted, anxiolytic drugs including benzodiazepines affect the operant extinction process. Recent studies have shown that reducing GABA-mediated neurotransmission retards extinction of aversive conditioning. We have shown in a series of studies that anxiolytic compounds that potentiate GABA facilitate extinction of positively reinforced fixed-ratio operant behaviour in C57BL/6 male mice. This effect does not occur in the early stages of extinction, nor is it dependent on cumulative effects of the compound administered. Potentiation of GABA at later stages increases sensitivity to the extinction contingency and facilitates the inhibition of the behaviour that is no longer required. The GABAergic hypnotic zolpidem has the same selective effects on operant extinction in this procedure, and its effects are not due to sedative action. There is evidence across our series of experiments that different GABA-A receptor subtypes are involved in extinction facilitation and anxiolysis. Consequently, this procedure may not be an appropriate model of anxiolytic drug action, but it may be a useful technique for analysing the neural bases of extinction and for designing therapeutic interventions in humans, where failure to extinguish inappropriate behaviours can lead to pathological conditions such as post-traumatic stress disorder.

Relevance:

10.00%

Publisher:

Abstract:

Neutral cues that predict emotional events (emotional harbingers) acquire emotional properties and attract attention. Given the importance of emotional harbingers for future survival, it is desirable to flexibly learn new facts about emotional harbingers when needed. However, recent research revealed that it is harder to learn new associations for emotional harbingers than cues that predict non-emotional events (neutral harbingers). In the current study, we addressed whether this impaired association learning for emotional harbingers is altered by one’s awareness of the contingencies between cues and emotional outcomes. Across 3 studies, we found that one’s awareness of the contingencies determines subsequent association learning of emotional harbingers. Emotional harbingers produced worse association learning than neutral harbingers when people were not aware of the contingencies between cues and emotional outcomes, but produced better association learning when people were aware of the contingencies. These results suggest that emotional harbingers do not always suffer from impaired association learning and can show facilitated learning depending on one’s contingency awareness.

Relevance:

10.00%

Publisher:

Abstract:

This research investigates the link between rivalry and unethical behavior. We propose that people will engage in greater unethical behavior when competing against their rivals than when competing against non-rival competitors. Across a series of experiments and an archival study, we find that rivalry is associated with increased use of deception, unsportsmanlike behavior, willingness to employ unethical negotiation tactics, and misreporting of performance. We also explore the psychological underpinnings of rivalry, which help to illuminate how it differs from general competition, and why it increases unethical behavior. Rivalry as compared to non-rival competition was associated with increased status concerns, contingency of self-worth, and performance goals; mediation analyses revealed that performance goals played the biggest role in explaining why rivalry promoted greater unethicality. Lastly, we find that merely thinking about a rival can be enough to promote greater unethical behavior, even in domains unrelated to the rivalry. These findings highlight the importance of rivalry as a widespread, powerful, yet largely unstudied phenomenon with significant organizational implications. Further, the results help to inform when and why unethical behavior occurs within organizations, and demonstrate that the effects of competition are dependent upon relationships and prior interactions.

Relevance:

10.00%

Publisher:

Abstract:

Sociable robots are embodied agents that are part of a heterogeneous society of robots and humans. They should be able to recognize human beings and each other, and to engage in social interactions. The use of a robotic architecture may strongly reduce the time and effort required to construct a sociable robot. Such an architecture must have structures and mechanisms that allow social interaction, behavior control and learning from the environment. Learning processes described in the science of Behavior Analysis may lead to the development of promising methods and structures for constructing robots able to behave socially and to learn from the environment through a process of contingency learning. In this paper, we present a robotic architecture inspired by Behavior Analysis. Methods and structures of the proposed architecture, including a hybrid knowledge representation, are presented and discussed. The architecture has been evaluated in the context of a nontrivial real problem: the learning of shared attention, employing an interactive robotic head. The learning capabilities of this architecture have been analyzed by observing the robot interacting with humans and the environment. The obtained results show that the robotic architecture is able to produce appropriate behavior and to learn from social interaction. (C) 2009 Elsevier Inc. All rights reserved.
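The abstract does not give implementation details, but the contingency-learning principle it draws on from Behavior Analysis can be sketched as follows: the strength of a response in a given stimulus context increases when the response is reinforced and decays (extinguishes) when reinforcement is withheld. The class below, its action names and the 'pointing cue' scenario are illustrative assumptions, not the authors' architecture.

```python
import random
from collections import defaultdict

class ContingencyLearner:
    """Toy operant (contingency) learner: the strength of an action in a given
    stimulus context is increased when it is followed by reinforcement and
    weakened (extinguished) when reinforcement is withheld. This only
    illustrates the learning principle, not the architecture in the paper."""

    def __init__(self, actions, alpha=0.2):
        self.actions = actions
        self.alpha = alpha                        # learning rate
        self.strength = defaultdict(lambda: 0.5)  # (stimulus, action) -> strength

    def choose(self, stimulus):
        # Probability of each action is proportional to its current strength.
        weights = [self.strength[(stimulus, a)] for a in self.actions]
        return random.choices(self.actions, weights=weights)[0]

    def update(self, stimulus, action, reinforced):
        # Move the strength towards 1 when reinforced, towards 0 otherwise.
        key = (stimulus, action)
        target = 1.0 if reinforced else 0.0
        self.strength[key] += self.alpha * (target - self.strength[key])

# Illustrative shared-attention-like contingency: looking at the object in the
# presence of a pointing cue is reinforced, other responses are not.
agent = ContingencyLearner(actions=["look_at_object", "look_at_face", "idle"])
for _ in range(200):
    act = agent.choose("pointing_cue")
    agent.update("pointing_cue", act, reinforced=(act == "look_at_object"))
print(sorted(agent.strength.items(), key=lambda kv: -kv[1]))
```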

Relevance:

10.00%

Publisher:

Abstract:

In the context of either Bayesian or classical sensitivity analyses of over-parametrized models for incomplete categorical data, it is well known that prior dependence of posterior inferences for nonidentifiable parameters, or the use of overly parsimonious over-parametrized models, may lead to erroneous conclusions. Nevertheless, some authors either pay no attention to which parameters are nonidentifiable or do not appropriately account for possible prior dependence. We review the literature on this topic and consider simple examples to emphasize that in both inferential frameworks, the subjective components can influence results in nontrivial ways, irrespective of the sample size. Specifically, we show that prior distributions commonly regarded as slightly informative or noninformative may actually be too informative for nonidentifiable parameters, and that the choice of over-parametrized models may drastically impact the results, suggesting that a careful examination of their effects should be considered before drawing conclusions.
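A minimal Bayesian sketch (an assumed setup, not one of the models reviewed in the paper) of the prior-dependence point: when a binary outcome Y is observed only for respondents, P(Y=1 | nonrespondent) is nonidentifiable, its posterior simply reproduces its prior, and the marginal P(Y=1) therefore remains prior-dependent no matter how large the sample is.

```python
import numpy as np

rng = np.random.default_rng(2)

def posterior_marginal_prob(n, alpha_mis, beta_mis, draws=50_000):
    """Posterior draws of P(Y = 1) when Y is observed only for respondents.
    P(Y=1 | respondent) and P(respond) are identifiable, but
    P(Y=1 | nonrespondent) is not: its posterior equals its Beta prior,
    so the marginal probability stays prior-dependent at any sample size."""
    # Simulated data: 70% respond; among respondents 60% have Y = 1.
    respond = rng.random(n) < 0.7
    y_obs = rng.random(respond.sum()) < 0.6
    # Conjugate Beta(1, 1) posteriors for the identifiable parameters.
    p_resp = rng.beta(1 + respond.sum(), 1 + (~respond).sum(), draws)
    th_obs = rng.beta(1 + y_obs.sum(), 1 + (~y_obs).sum(), draws)
    # The nonidentified parameter keeps whatever prior we assign it.
    th_mis = rng.beta(alpha_mis, beta_mis, draws)
    return p_resp * th_obs + (1 - p_resp) * th_mis

# Two priors often regarded as weakly informative for the nonidentified
# component give clearly different answers, even with a large sample.
for a, b in [(1, 1), (2, 8)]:
    d = posterior_marginal_prob(n=100_000, alpha_mis=a, beta_mis=b)
    print(f"Beta({a},{b}) prior on P(Y=1|nonresp): posterior mean P(Y=1) = {d.mean():.3f}")
```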

Relevance:

10.00%

Publisher:

Abstract:

We review some issues related to the implications of different missing data mechanisms on statistical inference for contingency tables and consider simulation studies to compare the results obtained under such models to those where the units with missing data are disregarded. We confirm that although, in general, analyses under the correct missing at random and missing completely at random models are more efficient even for small sample sizes, there are exceptions where they may not improve the results obtained by ignoring the partially classified data. We show that under the missing not at random (MNAR) model, estimates on the boundary of the parameter space as well as lack of identifiability of the parameters of saturated models may be associated with undesirable asymptotic properties of maximum likelihood estimators and likelihood ratio tests; even in standard cases the bias of the estimators may be low only for very large samples. We also show that the probability of a boundary solution obtained under the correct MNAR model may be large even for large samples and that, consequently, we may not always conclude that an MNAR model is misspecified merely because the estimate lies on the boundary of the parameter space.
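As a toy illustration of the comparison discussed above (assumed cell probabilities and missingness rate, not the paper's simulation design), the sketch below generates a 2x2 table in which one variable is missing completely at random and compares the complete-case estimator, which disregards partially classified units, with the closed-form maximum likelihood estimator under MCAR, which also uses the supplementary margin of the fully observed variable.

```python
import numpy as np

rng = np.random.default_rng(3)
true_p = np.array([[0.40, 0.10],     # P(X=0,Y=0), P(X=0,Y=1)
                   [0.20, 0.30]])    # P(X=1,Y=0), P(X=1,Y=1)

def simulate(n, p_miss=0.5):
    """Draw n units from the 2x2 table; Y is missing completely at random
    (MCAR) with probability p_miss, X is always observed."""
    cells = rng.choice(4, size=n, p=true_p.ravel())
    x, y = cells // 2, cells % 2
    y_missing = rng.random(n) < p_miss
    return x, y, y_missing

def complete_case(x, y, miss):
    """Estimate cell probabilities from fully classified units only."""
    xc, yc = x[~miss], y[~miss]
    tab = np.array([[np.sum((xc == i) & (yc == j)) for j in (0, 1)] for i in (0, 1)])
    return tab / tab.sum()

def mcar_ml(x, y, miss):
    """Closed-form ML under MCAR: P(X) from all units, P(Y | X) from complete
    units, so the supplementary X margin of partially classified units is used."""
    est = np.zeros((2, 2))
    for i in (0, 1):
        px = np.mean(x == i)
        sel = (x == i) & ~miss
        py1 = np.mean(y[sel])
        est[i] = [px * (1 - py1), px * py1]
    return est

# Repeated sampling: the ML estimator typically shows a somewhat smaller RMSE.
errs_cc, errs_ml = [], []
for _ in range(2000):
    x, y, miss = simulate(n=200)
    errs_cc.append(np.sqrt(np.mean((complete_case(x, y, miss) - true_p) ** 2)))
    errs_ml.append(np.sqrt(np.mean((mcar_ml(x, y, miss) - true_p) ** 2)))
print("complete-case RMSE:", np.mean(errs_cc), " MCAR ML RMSE:", np.mean(errs_ml))
```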

Relevance:

10.00%

Publisher:

Abstract:

This dissertation synthesizes previous research and develops a model for the study of strategic development, strategic congruence and management control. The model is used to analyze a longitudinal case study of the Swedish engineering company Atlas Copco. Employing contingency theory, the study confirms that long-term survival of a company requires adaptation to contingencies. Three levels of strategy are examined: corporate, business and functional. Previous research suggests that consistency between these levels (strategic congruence) is necessary for a company to be competitive. The dissertation challenges this proposition by using a life-cycle perspective and analyzes strategic congruence in the different phases of a life cycle. It also studies management control from a life-cycle perspective. In this context, two types of management control are examined: formal and informal. From a longitudinal perspective, the study further discusses how these types interact during organizational life cycles. The dissertation shows that strategic development is more complex than previous studies have indicated. It is a long, complex and non-linear process, the results of which cannot always be predicted. Previous models of strategy and management control are based on simple relationships and rarely take into account the fact that companies often go through different phases of strategic development. The case study shows that strategic incongruence may occur at times during organizational life cycles. Furthermore, the use of management control varies over time: in the maturity phase formal control is in focus, while informal control plays a bigger role in both the introduction and decline phases. Research on strategy and management control has intensified in recent years, yet there is still a gap regarding the coordination of complex corporate structures. The present study contributes further knowledge on how companies manage long-term strategic development. Few studies deal with more than two levels of strategy. Moreover, the present study addresses the need to understand strategic congruence from a life-cycle perspective, which is particularly relevant in practice, when managers in large companies face difficult issues for which they expect business research to assist them in the decision-making process.

Relevance:

10.00%

Publisher:

Abstract:

This article presents a study of how contemporary Swedish lower secondary school textbooks present the emergence of the Cold War, and how 10 active lower secondary school history teachers interpreted a quotation that was ambiguous in relation to the general narrative in the studied textbooks, thereby analysing textbooks from the perspectives of both content and reception. Applying a theoretical framework of uses of history, the study finds that the narratives presented in the studied textbooks are what could be called traditional, in the sense that they do not acknowledge perspective and representation in history. While the interviewed teachers generally acknowledged that textbook narratives are representations of history and contingent on perspective, few extended this to how their own views affect their interpretations, suggesting an intermediary appreciation of the contextual contingency of historical narratives.

Relevance:

10.00%

Publisher:

Abstract:

Corporate Security is a preventive programme intended to protect a company's assets and preserve normal operating conditions. Companies are subject to threats related to their activities in the markets in which they operate. Many of these threats are known and identified and, as far as possible, a programme is put in place to protect the company's interests. However, threats to the company's other assets are neither evident nor easily perceived. The measures taken to prevent these threats from materialising, through a Corporate Security Programme, or to mitigate the problems arising when they do materialise until normal operating conditions are restored, through the execution of a Contingency Plan, are frequently seen as expenses and rarely as an investment with a return; nor do they generate operating results for the company. The final objective of this master's dissertation was to establish a Corporate Security Programme for the Fundação Getulio Vargas (FGV), recommending the revision of the security norms and procedures that will make up the Security Manual and suggesting that FGV include it in its Strategic Plan. To reach this objective, the following were defined: basic concepts of security and contingency; data classification criteria; management of the security and contingency process; and the security administration function. Several types of research were employed. As to means, the work is characterised as a case study; bibliographic, documentary and field research were used to draw up the recommendations for revising the security norms and procedures. As to ends, it is applied research, motivated by the need to solve concrete problems with a practical purpose. Visits and interviews were carried out at FGV's headquarters and at the building of the Bolsa de Valores do Rio de Janeiro (BVRJ). Given the complexity and breadth of Corporate Security, the study was limited to physical security. The need was observed for FGV to revise its Corporate Security Norms and Procedures, seeking to improve the overall level of security and to adopt modern instruments for preventing incidents that put FGV's assets at risk, namely: people, physical facilities, equipment, information, supplies and communication facilities. The adoption of the Sistemática Integrada de Segurança e Contingência (SISC) methodology is recommended in order to minimise the impact of emergency situations.

Relevance:

10.00%

Publisher:

Abstract:

An accident with the third prototype of the Brazilian Satellite Launching Vehicle (SLV-1 V03) in August 2003 at the Alcântara Base, in the State of Maranhão, dramatically exposed accumulated deficiencies affecting the Brazilian space sector. A report on this accident published by the Ministry of Defense recognized the relevance of the organizational dimension for the success of Brazilian space policy. In this case study, the author analyses the sector's organizational structure, the National Space Activities Development System (NSADS), to evaluate its adequacy to the requirements of policy development. The Theory of Structural Contingency (TSC) provided the analytical framework adopted in the research, complemented by two organizational approaches that focus on high-risk systems: Normal Accident Theory (NAT) and High Reliability Theory (HRT). The last two approaches supported the analysis of the NSADS organizations which are, according to Charles Perrow's definition, directly involved in developing high-risk technological systems, and of their relationship with the System. The case study was supplemented with a brief comparison between the NSADS and the organizational structures of the North American and French civilian space agencies, respectively NASA and CNES, in order to inform the organizational modeling of the Brazilian System.