376 results for Intuition.
Abstract:
This paper addresses the effectiveness of physical-layer network coding (PNC) in improving the capacity of multi-hop multicast in random wireless ad hoc networks (WAHNs). While PNC can be shown to yield a capacity gain, we prove that the per-session throughput capacity with PNC is Θ(1/(nR(n))), where n is the total number of nodes, R(n) is the communication range, and each multicast session consists of a constant number of sinks. The result implies that PNC cannot improve the capacity order of multicast in random WAHNs, contrary to the intuition that PNC may improve the capacity order because it allows simultaneous signal reception and combination. Copyright © 2010 ACM.
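The Θ(1/(nR(n))) scaling can be made plausible with a back-of-envelope order calculation in the spirit of protocol-model capacity analyses; the sketch below is not the paper's proof and assumes a unit area, one multicast session per node, and a constant number of sinks per session.

```latex
% Rough order argument for the per-session multicast throughput \lambda(n).
% Assumptions (not from the paper): unit area, protocol interference model,
% one session per node, k = O(1) sinks per session.
\begin{align*}
  \text{simultaneous transmissions} &\lesssim \frac{c_1}{R(n)^2}
    && \text{(each transmission blocks an area of order } R(n)^2\text{)}\\
  \text{hops per delivered packet}  &\gtrsim \frac{c_2}{R(n)}
    && \text{(a tree spanning } k=O(1) \text{ sinks has length } \Theta(1)\text{)}\\
  \Rightarrow\quad n\,\lambda(n)\cdot\frac{c_2}{R(n)} &\lesssim \frac{c_1}{R(n)^2}
    \;\;\Rightarrow\;\; \lambda(n) = O\!\left(\frac{1}{n\,R(n)}\right).
\end{align*}
% PNC lets a relay combine simultaneously received signals, which improves the
% constants c_1, c_2 but not the order, matching the \Theta(1/(nR(n))) result.
```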
Abstract:
Artifact selection decisions typically involve the selection of one from a number of possible/candidate options (decision alternatives). In order to support such decisions, it is important to identify and recognize relevant key issues of problem solving and decision making (Albers, 1996; Harris, 1998a, 1998b; Jacobs & Holten, 1995; Loch & Conger, 1996; Rumble, 1991; Sauter, 1999; Simon, 1986). Sauter classifies four problem solving/decision making styles: (1) left-brain style, (2) right-brain style, (3) accommodating, and (4) integrated (Sauter, 1999). The left-brain style employs analytical and quantitative techniques and relies on rational and logical reasoning. In an effort to achieve predictability and minimize uncertainty, problems are explicitly defined, solution methods are determined, orderly information searches are conducted, and analysis is increasingly refined. Left-brain style decision making works best when it is possible to predict/control, measure, and quantify all relevant variables, and when information is complete. In direct contrast, right-brain style decision making is based on intuitive techniques—it places more emphasis on feelings than facts. Accommodating decision makers use their non-dominant style when they realize that it will work best in a given situation. Lastly, integrated style decision makers are able to combine the left- and right-brain styles—they use analytical processes to filter information and intuition to contend with uncertainty and complexity.
Abstract:
We propose the use of a stochastic frontier approach to modelling the financial constraints of firms. The main advantage of the stochastic frontier approach over the stylised approaches that use pooled OLS or fixed-effects panel regression models is that we can not only decide whether or not the average firm is financially constrained, but also estimate a measure of the degree of the constraint for each firm and each time period, as well as the marginal impact of firm characteristics on this measure. We then apply the stochastic frontier approach to a panel of Indian manufacturing firms for the 1997–2006 period. In our application, we highlight and discuss the aforementioned advantages, while also demonstrating that the stochastic frontier approach generates regression estimates that are consistent with the stylised intuition found in the literature on financial constraints and the wider literature on the Indian credit/capital market.
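For readers unfamiliar with the technique, a canonical stochastic frontier panel specification is sketched below; this is only an illustrative textbook form, and the paper's exact frontier and distributional assumptions may differ.

```latex
% Generic stochastic frontier panel specification (illustrative textbook form,
% not necessarily the paper's exact model).
y_{it} = x_{it}'\beta + v_{it} - u_{it}, \qquad
v_{it} \sim N(0,\sigma_v^2), \qquad
u_{it} \ge 0, \;\; \text{e.g. } u_{it} \sim N^{+}\!\bigl(z_{it}'\delta, \sigma_u^2\bigr).
```

Here v is ordinary statistical noise, while the one-sided term u measures the distance from the frontier; in an application of this kind the firm- and year-specific estimate of u serves as the degree-of-constraint measure, and the coefficients on the firm characteristics z give their marginal impact on that measure.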
Abstract:
Performance evaluation in conventional data envelopment analysis (DEA) requires crisp numerical values. However, the observed values of the input and output data in real-world problems are often imprecise or vague. These imprecise and vague data can be represented by linguistic terms characterised by fuzzy numbers in DEA to reflect the decision-makers' intuition and subjective judgements. This paper extends the conventional DEA models to a fuzzy framework by proposing a new fuzzy additive DEA model for evaluating the efficiency of a set of decision-making units (DMUs) with fuzzy inputs and outputs. The contribution of this paper is threefold: (1) we consider ambiguous, uncertain and imprecise input and output data in DEA, (2) we propose a new fuzzy additive DEA model derived from the α-level approach and (3) we demonstrate the practical aspects of our model with two numerical examples and show its comparability with five different fuzzy DEA methods in the literature. Copyright © 2011 Inderscience Enterprises Ltd.
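For context, the crisp additive DEA model that such fuzzy formulations typically extend is shown below; the fuzzy version then replaces the input-output data with fuzzy numbers evaluated at α-level cuts (the paper's exact formulation may differ in detail).

```latex
% Crisp additive DEA model for the DMU under evaluation ("o"), with input matrix X
% and output matrix Y; a fuzzy extension replaces x_o, y_o, X, Y by fuzzy numbers
% and solves the programme over their alpha-level cuts.
\begin{align*}
  \max_{\lambda,\, s^-,\, s^+}\;& \mathbf{1}^\top s^- + \mathbf{1}^\top s^+ \\
  \text{s.t.}\;& X\lambda + s^- = x_o, \\
              & Y\lambda - s^+ = y_o, \\
              & \mathbf{1}^\top \lambda = 1, \qquad \lambda,\ s^-,\ s^+ \ge 0.
\end{align*}
% The DMU is (additively) efficient exactly when all optimal slacks are zero.
```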
Abstract:
Despite concerted academic interest in the strategic decision-making process (SDMP) since the 1980s, a coherent body of theory capable of guiding practice has not materialised. This is because many prior studies focus only on a single process characteristic, often rationality or comprehensiveness, and have paid insufficient attention to context. To further develop theory, research is required which examines: (i) the influence of context from multiple theoretical perspectives (e.g. upper echelons, environmental determinism); (ii) different process characteristics from both synoptic formal (e.g. rationality) and political incremental (e.g. politics) perspectives; and (iii) the effects of context and process characteristics on a range of SDMP outcomes. Using data from 30 interviews and 357 questionnaires, this thesis addresses several opportunities for theory development by testing an integrative model which incorporates: (i) five SDMP characteristics representing both synoptic formal (procedural rationality, comprehensiveness, and behavioural integration) and political incremental (intuition and political behaviour) perspectives; (ii) four SDMP outcome variables (strategic decision (SD) quality, implementation success, commitment, and SD speed); and (iii) contextual variables from the four theoretical perspectives: upper echelons, SD-specific characteristics, environmental determinism, and firm characteristics. The present study makes several substantial and original contributions to knowledge. First, it provides empirical evidence of the contextual boundary conditions under which intuition and political behaviour positively influence SDMP outcomes. Second, it establishes the predominance of the upper echelons perspective, with TMT variables explaining significantly more variance in SDMP characteristics than SD-specific characteristics, the external environment, and firm characteristics. A newly developed measure of top management team expertise also demonstrates highly significant direct and indirect effects on the SDMP. Finally, it is evident that SDMP characteristics and contextual variables influence a number of SDMP outcomes: not just overall SD quality, but also implementation success, commitment, and SD speed.
Abstract:
This work attempts to shed light on the fundamental concepts behind the stability of multi-agent systems. We view the system as a discrete-time Markov chain with a potentially unknown transition probability distribution. The system is considered stable when its state has converged to an equilibrium distribution. Faced with the non-trivial task of establishing convergence to such a distribution, we propose a hypothesis-testing approach in which we test whether a particular system metric has converged. We describe several artificial multi-agent ecosystems that were developed, and we present results based on these systems which confirm that the approach qualitatively agrees with our intuition.
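The abstract does not specify the test itself; as one concrete reading of the idea, the sketch below (hypothetical code, not from the paper) tracks a system metric and uses a two-sample Kolmogorov-Smirnov test on two successive observation windows, declaring convergence only when the test stops distinguishing the windows.

```python
# Hypothesis-testing view of convergence: compare the empirical distribution of a
# system metric over two successive windows; if they are statistically
# indistinguishable, treat the system as having reached its equilibrium distribution.
# Illustrative sketch only -- the metric, window size and test are assumptions.
import numpy as np
from scipy.stats import ks_2samp

def has_converged(metric_series, window=500, alpha=0.05):
    """Return True if the last two windows of the metric look identically distributed."""
    if len(metric_series) < 2 * window:
        return False
    a = metric_series[-2 * window:-window]
    b = metric_series[-window:]
    _, p_value = ks_2samp(a, b)
    return p_value > alpha          # fail to reject "same distribution" => converged

# Toy system: an AR(1) chain that gradually forgets its initial condition.
rng = np.random.default_rng(0)
state, trace = 10.0, []
for t in range(5000):
    state = 0.9 * state + rng.normal()   # Markovian update, dynamics treated as unknown
    trace.append(state)
    if t % 500 == 499:
        print(t + 1, has_converged(np.array(trace)))
```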
Abstract:
In this paper, we examine the injunction issued by the prominent politician, broadcaster and older people's advocate, Baroness Joan Bakewell, to engage in ‘death talk’. We see positive ethical potential in this injunction, insofar as it serves as a call to confront more directly the prospects of death and dying, thereby releasing creative energies with which to change our outlook on life and ageing more generally. However, when set against a culture that valorises choice, independence and control, the positive ethical potential of such injunctions is invariably thwarted. We illustrate this with reference to one of Bakewell's interventions in a debate on scientific innovation and population ageing. In examining the context of her intervention, we affirm her intuition about its positive ethical potential, but we also point to an ambivalence that accompanies the formulation of the injunction – one that ultimately blunts the force and significance of her intuition. We suggest that Gilleard and Higgs' idea of the third age/fourth age dialectic, combined with the psycho-analytic concepts of fantasy and mourning, allow us to express this intuition better. In particular, we argue that the expression ‘loss talk’ (rather than ‘death talk’) better captures the ethical negotiations that should ultimately underpin the transformation processes associated with ageing, and that our theoretical contextualisation of her remarks can help us see this more clearly. In this view, deteriorations in our physical and mental capacities are best understood as involving changes in how we see ourselves, i.e. in our identifications, and so what is at stake are losses of identity and the conditions under which we can engage in new processes of identification.
Abstract:
The cybernetic conception presented here is formed on the basis of neurophysiological, neuropsychological and neurocybernetic data, supplemented by plausible hypotheses of the author that fill the gaps left by these data. Attention is focused first of all on the general principles of memory organization in the brain and on the processes within it that realize such psychical functions as the perception and identification of input information about patterns, and problem solving as specified by given input and output conditions. The realization of the second, essentially cogitative, function is discussed in terms of figurative and linguistic thinking at the levels of intuition and understanding. The rationale for, and the principles of, a bionic approach to the creation of corresponding artificial intelligence tools are proposed.
Abstract:
Purpose – The purpose of this paper is to outline a seven-phase simulation conceptual modelling procedure that incorporates existing practice and embeds a process reference model (i.e. SCOR). Design/methodology/approach – An extensive review of the simulation and SCM literature identifies a set of requirements for a domain-specific conceptual modelling procedure. The associated design issues for each requirement are discussed and the utility of SCOR in the process of conceptual modelling is demonstrated using two development cases. Ten key concepts are synthesised and aligned to a general process for conceptual modelling. Further work is outlined to detail, refine and test the procedure with different process reference models in different industrial contexts. Findings – Simulation conceptual modelling is often regarded as the most important yet least understood aspect of a simulation project (Robinson, 2008a). Even today, there has been little research into guidelines to aid in the creation of a conceptual model. Design issues are discussed for building an ‘effective' conceptual model and the domain-specific requirements for modelling supply chains are addressed. The ten key concepts are incorporated to aid in describing the supply chain problem (i.e. the components and relationships that need to be included in the model), the model content (i.e. rules for determining the simplest model boundary and level of detail needed to implement the model) and model validation. Originality/value – The paper addresses Robinson's (2008a) call for research into defining and developing new approaches for conceptual modelling and Manuj et al.'s (2009) discussion on improving the rigour of simulation studies in SCM. It is expected that more detailed guidelines will yield benefits to both expert modellers (i.e. averting typical modelling failures) and novice modellers (i.e. guided practice and less reliance on hopeful intuition).
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2016.
Abstract:
It is often assumed (for analytical convenience, but also in accordance with common intuition) that consumer preferences are convex. In this paper, we consider circumstances under which such preferences are (or are not) optimal. In particular, we investigate a setting in which goods possess some hidden quality with known distribution, and the consumer chooses a bundle of goods that maximizes the probability that he receives some threshold level of this quality. We show that if the threshold is small relative to consumption levels, preferences will tend to be convex; whereas the opposite holds if the threshold is large. Our theory helps explain a broad spectrum of economic behavior (including, in particular, certain common commercial advertising strategies), suggesting that sensitivity to information about thresholds is deeply rooted in human psychology.
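The threshold mechanism can be illustrated with a toy two-good example (entirely hypothetical numbers, not the authors' model): each good's quality is an independent normal draw, and the consumer picks a share w of good A to maximise the probability that the bundle's quality reaches a threshold T. When T is below the mean quality the 50/50 mixture wins (a taste for mixtures, i.e. convex preferences); when T is well above the mean a corner bundle wins.

```python
# Toy illustration of the threshold effect on convexity of preferences.
# Assumed setup (not the paper's model): qualities Q_A, Q_B ~ i.i.d. N(mu, sigma^2),
# bundle quality = w*Q_A + (1-w)*Q_B, objective = P(bundle quality >= T).
import numpy as np
from scipy.stats import norm

mu, sigma = 1.0, 0.5

def p_reach(w, T):
    """P(w*Q_A + (1-w)*Q_B >= T) for independent normal qualities."""
    sd = sigma * np.sqrt(w**2 + (1 - w)**2)   # mixing lowers the bundle's std. dev.
    return norm.sf(T, loc=mu, scale=sd)

weights = np.linspace(0.0, 1.0, 11)
for T in (0.8, 1.6):                          # low threshold vs. high threshold
    probs = [p_reach(w, T) for w in weights]
    best = weights[int(np.argmax(probs))]
    print(f"T={T}: best share of good A = {best:.1f}")
# The 50/50 mixture is optimal for T=0.8 (below mu), while a corner bundle
# (w=0 or w=1) is optimal for T=1.6 (above mu).
```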
Abstract:
Our article concerns a paradoxical phenomenon: in the equilibrium solutions of the version of the Neumann model that represents consumption explicitly, the prices of the necessities that determine the wage can in some cases be zero, so that the equilibrium value of the real wage is also zero. This phenomenon always arises in decomposable economies in which alternative equilibrium solutions with different growth and profit rates exist. The phenomenon can be discussed in a far more transparent form in the simpler variant of the model built on Leontief technology, and we make use of this. We show that solutions whose growth factor is below the maximum are economically meaningless and therefore of no interest. In doing so we show, on the one hand, that Neumann's excellent intuition served him well when he insisted on the uniqueness of his model's solution and, on the other hand, that this does not require any assumption about the decomposability of the economy. The topic is closely related to Ricardo's analysis of the determination of the general rate of profit, cast in modern form by Sraffa, and to the well-known wage-profit and accumulation-consumption trade-off frontiers of neoclassical growth theory, which indicates its interest for both theory and the history of economic thought. / === / In the Marx-Neumann version of the Neumann model introduced by Morishima, the use of commodities is split between production and consumption, and wages are determined as the cost of necessary consumption. In such a version it may occur that the equilibrium prices of all goods necessary for consumption are zero, so that the equilibrium wage rate becomes zero too. In fact, such a paradoxical case will always arise when the economy is decomposable and the equilibrium is not unique in terms of growth and interest rate. It can be shown that a zero equilibrium wage rate will appear in all equilibrium solutions where growth and interest rate are less than maximal. This is another proof of Neumann's genius and intuition, for he arrived at the uniqueness of equilibrium via an assumption that implied that the economy was indecomposable, a condition relaxed later by Kemeny, Morgenstern and Thompson. This situation occurs also in similar models based on Leontief technology, and such versions of the Marx-Neumann model make the roots of the problem more apparent. Analysis of them also yields an interesting corollary to Ricardo's corn rate of profit: the real cause of the awkwardness is bad specification of the model; luxury commodities are introduced without there being a final demand for them, and their production becomes a waste of resources. Bad model specification shows up as a consumption coefficient incompatible with the given technology in the more general model with joint production and technological choice. For the paradoxical situation implies that the level of consumption could be raised and/or the intensity of labour diminished without lowering the equilibrium rate of growth and interest. This entails wasteful use of resources and indicates again that the equilibrium conditions are improperly specified. It is shown that the conditions for equilibrium can and should be redefined for the Marx-Neumann model without assuming an indecomposable economy, in a way that ensures the existence of an equilibrium that is unique in terms of the growth and interest rate and coupled with a positive value for the wage rate, so confirming Neumann's intuition.
The proposed solution relates closely to findings of Bromek in a paper correcting Morishima's generalization of wage/profit and consumption/investment frontiers.
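For orientation, the equilibrium conditions of the classical von Neumann model are sketched below in a textbook form; this is not the paper's exact Marx-Neumann specification, which augments the input matrix with the necessary consumption of labour so that wages enter as input costs.

```latex
% Classical von Neumann equilibrium (textbook form). A = input matrix, B = output
% matrix, x = intensity vector, p = price vector, alpha = growth factor (equal to
% the interest factor at equilibrium). The Marx-Neumann variant discussed above
% augments A with the necessary-consumption requirements of labour.
\begin{align*}
  & B x \;\ge\; \alpha A x,       && \text{(output supports growth at factor } \alpha\text{)}\\
  & p B \;\le\; \alpha\, p A,     && \text{(no process earns more than the interest factor)}\\
  & p\,(B - \alpha A)\,x \;=\; 0, && \text{(free goods have zero price; loss-making processes are idle)}\\
  & x \ge 0,\; p \ge 0,\; p B x > 0.
\end{align*}
% The paper's point: in a decomposable economy several such alpha can coexist, and in
% the non-maximal solutions the consumption-good prices, hence the wage, fall to zero.
```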
Abstract:
Several indicators are in common use for measuring liquidity, each quantifying the phenomenon of liquidity from a different point of view. The article analyses the various liquidity measures proposed in the literature with multidimensional statistical methods: using principal component analysis we look for the factors that best summarise the liquidity characteristics, we then examine the extent to which the individual measures co-move with these factors, and, based on the correlations, we use a clustering procedure to find groups of measures with similar properties. We ask whether the groups of variables formed by analysing the available sample coincide with the measures associated with particular aspects of liquidity, and whether composite liquidity measures can be defined with whose help the phenomenon of liquidity can be measured in several dimensions. / === / Liquidity is measured from different aspects (e.g. tightness, depth, and resiliency) by different ratios. We studied the co-movements and the clustering of different liquidity measures on a sample of the Swiss stock market. We performed a PCA to obtain the main factors that explain the cross-sectional variability of liquidity measures, and we used the k-means clustering methodology to define groups of liquidity measures. Based on our explorative data analysis, we formed clusters of liquidity measures, and we compared the resulting groups with our expectations and intuition. Our modelling methodology provides a framework to analyze the correlation between the different aspects of liquidity as well as a means to define complex liquidity measures.
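As a rough sketch of the workflow described above (variable names, data, and parameters are illustrative placeholders, not the study's choices): standardise a cross-section of liquidity measures, extract principal components, inspect each measure's loadings on the factors, and cluster the measures by their correlation profiles with k-means.

```python
# Illustrative PCA + k-means workflow on a matrix of liquidity measures
# (rows = stocks or stock-days, columns = liquidity measures). Column names,
# the number of components and k are placeholders, not the study's choices.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
measures = pd.DataFrame(                      # synthetic placeholder data; in practice
    rng.normal(size=(250, 6)),                # load the computed liquidity measures here
    columns=["bid_ask_spread", "amihud", "turnover", "depth", "roll", "zero_days"],
)

X = StandardScaler().fit_transform(measures)  # PCA is scale-sensitive
pca = PCA(n_components=3).fit(X)              # main liquidity factors
loadings = pd.DataFrame(pca.components_.T,
                        index=measures.columns,
                        columns=["factor1", "factor2", "factor3"])
print(loadings.round(2))                      # co-movement of each measure with the factors

# Cluster the *measures* (not the observations) by their correlation profiles.
corr_profiles = np.corrcoef(X.T)              # measure-by-measure correlation matrix
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(corr_profiles)
print(dict(zip(measures.columns, labels)))
```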
Abstract:
The purpose of this ethnographic study was to describe and explain the congruency of psychological preferences identified by the Myers-Briggs Type Indicator (MBTI) and the human resource development (HRD) role of instructor/facilitator. This investigation was conducted with 23 HRD professionals who worked in the Miami, Florida area as instructors/facilitators with adult learners in job-related contexts. The study was conducted using qualitative strategies of data collection and analysis. The research participants were selected through a purposive sampling strategy. Data collection strategies included: (a) administration and scoring of the MBTI, Form G, (b) open-ended and semi-structured interviews, (c) participant observations of the research subjects at their respective work sites and while conducting training sessions, (d) field notes, and (e) contact summary sheets to record field research encounters. Data analysis was conducted with the use of a computer program for qualitative analysis called FolioViews 3.1 for Windows. This included: (a) coding of transcribed interviews and field notes, (b) theme analysis, (c) memoing, and (d) cross-case analysis. The three major themes that emerged in relation to the congruency of psychological preferences and the role of instructor/facilitator were: (1) designing and preparing instruction/facilitation, (2) conducting training and managing group process, and (3) interpersonal relations and perspectives among instructors/facilitators. The first two themes were analyzed through the combination of the four Jungian personality functions. These combinations are: sensing-thinking (ST), sensing-feeling (SF), intuition-thinking (NT), and intuition-feeling (NF). The third theme was analyzed through the combination of the attitudes or energy focus and the judgment function. These combinations are: extraversion-thinking (ET), extraversion-feeling (EF), introversion-thinking (IT), and introversion-feeling (IF). A last area uncovered by this ethnographic study was the influence exerted by a training and development culture on the instructor/facilitator role. This professional culture is described and explained in terms of the shared values and expectations reported by the study respondents.