961 results for Behavioral Choice Theory
Abstract:
Purpose - To provide a framework of accounting policy choice associated with the timing of adoption of the UK Statement of Standard Accounting Practice (SSAP) No. 20, "Foreign Currency Translation". The conceptual framework describes the accounting policy choices that firms face in a setting influenced by their financial characteristics, flexible foreign exchange rates, and the stock market response to accounting decisions. Design/methodology/approach - Following the positive accounting theory context, this paper puts into a framework the motives and choices of UK firms with regard to adopting or deferring the adoption of SSAP 20. The paper utilises the theoretical and empirical findings of previous studies to form and substantiate the conceptual framework. Given the UK foreign exchange setting, the framework identifies an initial stage (lack of regulation and flexibility in financial reporting), an intermediate stage (accounting policy choice), and a final stage (accounting choice and policy review). Findings - There are situations where accounting regulation conflicts with the needs and business objectives of firms, and vice versa. Thus, firms may delay adoption up to the point where the increase in political costs can just be tolerated. Overall, the study infers that firms might have chosen to defer the adoption of SSAP 20 until they reached a certain corporate goal, or until the adverse impact (if any) of the accounting change on firms' financial numbers was minimal. The timing of adoption is therefore subject to the objectives of managers in association with market and economic conditions. The paper suggests that the flexibility in financial reporting, which may enhance the scope for income-smoothing, can be mitigated by appropriate standardisation of accounting practice.
Research limitations/implications - First, the study encompassed a period when firms and investors were less sophisticated users of financial information. Second, it is difficult to ascertain the decisions that firms would have taken had the pound appreciated over the adoption period and had firms incurred translation losses rather than translation gains. Originality/value - This paper is useful to accounting standard-setters, professional accountants, academics and investors. The study can give accounting standard-setting bodies useful information when they prepare a change in accounting regulation or set an appropriate date for the implementation of an accounting standard. The paper provides significant insight into the behaviour of firms and the associated impacts of financial markets and regulation on firms' decision-making. The framework aims to assist the market and other authorities to reduce information asymmetry and to reinforce the efficiency of the market. © Emerald Group Publishing Limited.
Abstract:
Purpose – The purpose of this paper is to consider the current status of strategic group theory in the light of developments over the last three decades, and then to discuss the continuing value of the concept, both to strategic management research and to practising managers. Design/methodology/approach – Critical review of the idea of strategic groups, together with a practical strategic mapping illustration. Findings – Strategic group theory still provides a useful approach for management research, allowing a detailed appraisal and comparison of company strategies within an industry. Research limitations/implications – Strategic group research would undoubtedly benefit from more directly comparable, industry-specific studies, with a more careful focus on variable selection and on the statistical methods used for validation. Future studies should aim to build sets of industry-specific variables that describe strategic choice within that industry. The statistical methods used to identify strategic groupings need to be robust to ensure that strategic groups are not solely an artefact of method. Practical implications – The paper looks specifically at an application of strategic group theory in the UK pharmaceutical industry. The practical benefits of strategic groups as a classification system, and of strategic mapping as a strategy development and analysis tool, are discussed. Originality/value – The review of strategic group theory alongside alternative taxonomies, and the application of the concept to the UK pharmaceutical industry.
Abstract:
This paper extends previous analyses of the choice between internal and external R&D to consider the costs of internal R&D. The Heckman two-stage estimator is used to estimate the determinants of internal R&D unit cost (i.e. cost per product innovation) allowing for sample selection effects. Theory indicates that R&D unit cost will be influenced by scale issues and by the technological opportunities faced by the firm. Transaction costs encountered in research activities are allowed for and, in addition, consideration is given to issues of market structure which influence the choice of R&D mode without affecting the unit cost of internal or external R&D. The model is tested on data from a sample of over 500 UK manufacturing plants which have engaged in product innovation. The key determinants of R&D mode are the scale of plant and R&D input, and market structure conditions. In terms of the R&D cost equation, scale factors are again important and have a non-linear relationship with R&D unit cost. Specificities in physical and human capital also affect unit cost, but have no clear impact on the choice of R&D mode. There is no evidence of technological opportunity affecting either R&D cost or the internal/external decision.
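The two-stage logic described above can be sketched as follows. This is a minimal synthetic illustration of the Heckman estimator, not the authors' actual specification: the variables (x, z), coefficients and selection rule are invented for the example. A probit selection equation is fitted by maximum likelihood, the inverse Mills ratio is computed from the fitted index, and the outcome (unit-cost) equation is then estimated by OLS with the ratio as an extra regressor to absorb the selection effect.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 20000

# Synthetic data: x drives the outcome, z shifts selection only, and the
# errors u (selection) and e (outcome) are correlated, which is exactly
# what creates the sample-selection bias the estimator corrects for.
x = rng.normal(size=n)
z = rng.normal(size=n)
u = rng.normal(size=n)
e = 0.8 * u + np.sqrt(1 - 0.8**2) * rng.normal(size=n)

selected = (0.5 * x + 1.0 * z + u > 0)   # stage-1 decision (e.g. do internal R&D)
y = 1.0 + 1.0 * x + e                    # outcome, observed only when selected

# Stage 1: probit selection equation by maximum likelihood.
W = np.column_stack([np.ones(n), x, z])
def neg_loglik(g):
    p = np.clip(norm.cdf(W @ g), 1e-10, 1 - 1e-10)
    return -np.sum(np.where(selected, np.log(p), np.log(1 - p)))
g_hat = minimize(neg_loglik, np.zeros(3), method="BFGS").x

# Inverse Mills ratio evaluated at the fitted selection index.
idx = W @ g_hat
imr = norm.pdf(idx) / norm.cdf(idx)

# Stage 2: OLS on the selected sample, with and without the correction.
X_naive = np.column_stack([np.ones(selected.sum()), x[selected]])
X_corr = np.column_stack([X_naive, imr[selected]])
b_naive, *_ = np.linalg.lstsq(X_naive, y[selected], rcond=None)
b_corr, *_ = np.linalg.lstsq(X_corr, y[selected], rcond=None)
# True slope is 1.0; the naive slope is biased because E[e | selected] varies with x.
```

Omitting the correction term biases the slope whenever the outcome error is correlated with selection, which is why the paper's unit-cost equation allows for selection effects.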
Abstract:
This work explores the relevance of semantic and linguistic description to translation theory and practice. It is aimed towards a practical model of approach to texts to be translated. As literary texts [poetry mainly] are the focus of attention, so are stylistic matters. Note, however, that 'style', and, to some extent, the conclusions of the work, are not limited to so-called literary texts. The study of semantic description reveals that most translation problems do not stem from the cognitive (langue-related), but rather from the contextual (parole-related) aspects of meaning. Thus, any linguistic model that fails to account for the latter is bound to fall short. T.G.G. does, whereas Systemics, concerned with both the 'langue' and 'parole' (stylistic and sociolinguistic mainly) aspects of meaning, provides a useful framework of approach to texts to be translated. Two essential semantic principles for translation are that meaning is the property of a language (Firth), and the 'relativity of meaning assignments' (Tymoczko). Both imply that meaning can only be assessed correctly in the relevant socio-cultural background. Translation is seen as a restricted creation, and the translator's approach as a three-dimensional critical one. To encompass the most technical to the most literary text, and to account for variations in emphasis in any text, translation theory must be based on a typology of functions: Halliday's ideational, interpersonal and textual functions, or Bühler's symbol, signal and symptom functions. Function [overall and specific] will dictate aims and method, and also provide the critic with criteria to assess translation faithfulness. Translation can never be reduced to purely objective methods, however. Intuitive procedures intervene in textual interpretation and analysis, in the choice of equivalents, and in the reception of a translation.
Ultimately, translation theory and practice may constitute the touchstone as regards the validity of linguistic and semantic theories.
Abstract:
Speed's theory makes two predictions for the development of analogical reasoning. Firstly, young children should not be able to reason analogically, due to an undeveloped PFC neural network. Secondly, category knowledge enables the reinforcement of structural features over surface features, and thus the development of sophisticated analogical reasoning. We outline existing studies that support these predictions and highlight some critical remaining issues. Specifically, we argue that the development of inhibition must be directly compared alongside the development of reasoning strategies in order to support Speed's account. © 2010 Psychology Press.
Abstract:
Contrast sensitivity improves with the area of a sine-wave grating, but why? Here we assess this phenomenon against contemporary models involving spatial summation, probability summation, uncertainty, and stochastic noise. Using a two-interval forced-choice procedure we measured contrast sensitivity for circular patches of sine-wave gratings with various diameters that were blocked or interleaved across trials to produce low and high extrinsic uncertainty, respectively. Summation curves were steep initially, becoming shallower thereafter. For the smaller stimuli, sensitivity was slightly worse for the interleaved design than for the blocked design. Neither area nor blocking affected the slope of the psychometric function. We derived model predictions for noisy mechanisms and extrinsic uncertainty that was either low or high. The contrast transducer was either linear (c^1.0) or nonlinear (c^2.0), and pooling was either linear or a MAX operation. There was either no intrinsic uncertainty, or it was fixed or proportional to stimulus size. Of these 10 canonical models, only the nonlinear transducer with linear pooling (the noisy energy model) described the main forms of the data for both experimental designs. We also show how a cross-correlator can be modified to fit our results and provide a contemporary presentation of the relation between summation and the slope of the psychometric function.
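The difference between transducer exponents can be made concrete with a back-of-the-envelope calculation. This is a toy sketch under simplifying assumptions, not the authors' full model (intrinsic uncertainty and the psychometric-slope analysis are omitted): with a c^2.0 transducer, linear pooling over n independent mechanisms, and additive Gaussian noise, the 2IFC threshold falls as n^(-1/4) (fourth-root summation), whereas a linear transducer would predict the steeper n^(-1/2).

```python
from math import sqrt
from scipy.stats import norm

def threshold_2ifc(n_mechs, p=2.0, sigma=1.0, pc=0.75):
    # Decision variable: difference between the two intervals of the
    # pooled response sum(c**p + noise) over n_mechs mechanisms.
    # Mean difference = n * c**p; sd of the difference = sigma * sqrt(2 * n).
    z = norm.ppf(pc)  # signal-to-noise ratio needed for criterion performance
    return (z * sigma * sqrt(2 * n_mechs) / n_mechs) ** (1.0 / p)

# Quadrupling the number of pooled mechanisms (stimulus area):
ratio_energy = threshold_2ifc(1) / threshold_2ifc(4)                # 4**0.25
ratio_linear = threshold_2ifc(1, p=1.0) / threshold_2ifc(4, p=1.0)  # 4**0.5
```

The shallow fourth-root slope is what linear pooling of a squared (energy) response predicts; the steep initial portion of the empirical summation curves requires the noise and uncertainty components that this sketch leaves out.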
Abstract:
The period 2010–2013 was a time of far-reaching structural reforms of the National Health Service in England. Of particular interest in this paper is the way in which radical critiques of the reform process were marginalised by pragmatic concerns about how to maintain the market-competition thrust of the reforms while avoiding potential fragmentation. We draw on the Essex school of political discourse theory and develop a ‘nodal’ analytical framework to argue that widespread and repeated appeals to a narrative of choice-based integrated care served to take the fragmentation ‘sting’ out of radical critiques of the pro-competition reform process. This served to marginalise alternative visions of health and social care, and to pre-empt the contestation of a key norm in the provision of health care that is closely associated with the notions of ‘any willing provider’ and ‘any qualified provider’: provider-blind provision.
Abstract:
Similar to classic Signal Detection Theory (SDT), the recent optimal Binary Signal Detection Theory (BSDT) and the Neural Network Assembly Memory Model (NNAMM) based on it can successfully reproduce Receiver Operating Characteristic (ROC) curves, although the BSDT/NNAMM parameters (intensity of cue and neuron threshold) and the classic SDT parameters (perception distance and response bias) are essentially different. In the present work, BSDT/NNAMM optimal likelihood and posterior probabilities are analytically analyzed and used to generate ROCs, modified (posterior) mROCs, and the optimal overall likelihood and posterior. It is shown that, for the description of basic discrimination experiments in psychophysics within the BSDT, a ‘neural space’ can be introduced where sensory stimuli are represented as neural codes and decision processes are defined; the BSDT’s isobias curves can simultaneously be interpreted as universal psychometric functions satisfying the Neyman-Pearson objective; the just noticeable difference (jnd) can be defined and interpreted as an atom of experience; and near-neutral values of biases are observers’ natural choice. The uniformity or no-priming hypothesis, concerning the ‘in-mind’ distribution of false-alarm probabilities during ROC or overall probability estimations, is introduced. The BSDT’s and classic SDT’s sensitivity, bias, and their ROC and decision spaces are compared.
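For reference, the classic SDT side of that comparison can be generated directly from its two parameters, perceptual distance d' and response bias (criterion). This is the standard equal-variance Gaussian construction, not the BSDT machinery; sweeping the criterion traces out one ROC curve per value of d'.

```python
import numpy as np
from scipy.stats import norm

def sdt_roc(d_prime, criteria):
    # Equal-variance Gaussian SDT: noise ~ N(0, 1), signal ~ N(d', 1).
    # A "yes" response is given when the observation exceeds the criterion.
    false_alarms = norm.sf(criteria)       # P(yes | noise)
    hits = norm.sf(criteria - d_prime)     # P(yes | signal)
    return false_alarms, hits

criteria = np.linspace(-4.0, 6.0, 201)
fa, hit = sdt_roc(2.0, criteria)
# Each point on the curve corresponds to one response bias; d' fixes the curve.
```

Plotting hit against fa gives the familiar bowed ROC; on z-transformed axes the equal-variance curve is a straight line of unit slope, which is one of the properties the BSDT comparison is made against.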
Abstract:
This chapter contributes to the anthology on learning to research - researching to learn because it emphasises the need to design curricula that enable living research and on-going researcher development, rather than curricula that restrict student and staff activities within a marketised approach towards time. In recent decades higher education (HE) has come to be valued for its contribution to the global economy. Referred to as the neo-liberal university, a strong prioritisation has been placed on meeting the needs of industry by providing a better workforce. This perspective emphasises the role of a degree in HE to secure future material affluence, rather than to study as an on-going investment in the self (Molesworth, Nixon & Scullion, 2009: 280). Students are treated primarily as consumers in this model, where through their tuition fees they purchase a product, rather than benefit from the transformative potential university education offers for the whole of life. Given that HE is now measured by the numbers of students it attracts, and later places into well-paid jobs, there is an intense pressure on time, which has led to a method where the learning experiences of students are broken down into discrete modules. Whilst this provides consistency, students can come to view research processes in a fragmented way within the modular system. Topics are presented chronologically, week-by-week, and students simply complete a set of tasks to ‘have a degree’, rather than to ‘be learners’ (Molesworth, Nixon & Scullion, 2009: 277) who are living their research in relation to their own past, present and future. The idea of living research in this context is my own adaptation of an approach suggested by C. Wright Mills (1959) in The Sociological Imagination. Mills advises that successful scholars do not split their work from the rest of their lives, but treat scholarship as a choice of how to live, as well as a choice of career.
The marketised slant in HE thus creates a tension, firstly, for students who are learning to research. Mills would encourage them to be creative, not instrumental, in their use of time, yet they are journeying through a system that is structured for swift progression towards a highly paid job, rather than crafted for reflexive inquiry that transforms their understanding throughout life. Many universities are placing a strong focus on discrete skills for student employability, but I suggest that embedding the transformative skills emphasised by Mills empowers students and builds their confidence, helping them make connections that aid their employability. Secondly, the marketised approach creates a problem for staff designing the curriculum if students do not easily make links across time, over their years of study and whole programmes. By researching to learn, staff can discover new methods to apply in their design of the curriculum, to help students make important and creative connections across their programmes of study.
Abstract:
The medial pFC (mPFC) is frequently reported to play a central role in Theory of Mind (ToM). However, the contribution of this large cortical region to ToM is not well understood. Combining a novel behavioral task with fMRI, we sought to demonstrate functional divisions between dorsal and rostral mPFC. All conditions of the task required the representation of mental states (beliefs and desires). The level of demands on cognitive control (high vs. low) and the nature of the demands on reasoning (deductive vs. abductive) were varied orthogonally between conditions. Activation in dorsal mPFC was modulated by the need for control, whereas rostral mPFC was modulated by reasoning demands. These findings fit with previously suggested domain-general functions for different parts of mPFC and suggest that these functions are recruited selectively in the service of ToM.
Abstract:
Economic theories of rational addiction aim to describe consumer behavior in the presence of habit-forming goods. We provide a biological foundation for this body of work by formally specifying conditions under which it is optimal to form a habit. We demonstrate the empirical validity of our thesis with an in-depth review and synthesis of the biomedical literature concerning the action of opiates in the mammalian brain and their effects on behavior. Our results lend credence to many of the unconventional behavioral assumptions employed by theories of rational addiction, including adjacent complementarity and the importance of cues, attention, and self-control in determining the behavior of addicts. We offer evidence for the special case of the opiates that "harmful" addiction is the manifestation of a mismatch between behavioral algorithms encoded in the human genome and the expanded menu of choices faced by consumers in the modern world.
Abstract:
Understanding online price acceptance and its determining factors can be essential when companies try to manage different types of channel. The paper aimed to reveal the role of enduring involvement in price acceptance in a multichannel (online and offline) context. The study revealed that the hedonic value of shopping can strengthen negative intentions towards price acceptance in the online channel, but also found that, for the segment without shopping motivations, a similar price level can be applied in both the online and the offline environment.
Abstract:
The purpose of this study was to assess the intention to exercise among ethnically and racially diverse community college students using the Theory of Planned Behavior (TPB). In addition to identifying the variables associated with the motivation or intention of college students to engage in physical activity, this study tested the model of the Theory of Planned Behavior, asking: Does the TPB model explain intention to exercise among a racially/ethnically diverse group of college students? The relevant variables were the TPB constructs (behavioral beliefs, normative beliefs, and control beliefs), which combined to form a measure of intention to exercise. Structural Equation Modeling was used to test the predictive power of the TPB constructs for predicting intention to exercise. Following procedures described by Ajzen (2002), the researcher developed a questionnaire encompassing the external variables of student demographics (age, gender, work status, student status, socio-economic status, access to exercise facilities, and past behavior), major constructs of the TPB, and two questions from the Godin Leisure Time Questionnaire (GLTQ; Godin & Shephard, 1985). Participants were students (N = 255) who enrolled in an on-campus wellness course at an urban community college. The demographic profile of the sample revealed a racially/ethnically diverse study population. The original model that was used to reflect the TPB as developed by Ajzen was not supported by the data analyzed using SEM; however, a revised model that the researcher thought was theoretically a more accurate reflection of the causal relations between the TPB constructs was supported. The GLTQ questions were problematic for some students; those data could not be used in the modeling efforts. The GLTQ measure, however, revealed a significant correlation with intention to exercise (r = .27, p = .001).
Post-hoc comparisons revealed significant differences in normative beliefs and attitude toward exercising behavior between Black students and Hispanic students. Compared to Black students, Hispanic students were more likely to (a) perceive “friends” as approving of them being physically active and (b) rate being physically active for 30 minutes per day as “beneficial”. No statistically significant difference was found among groups on overall intention to exercise.
Abstract:
Ensuring the correctness of software has been the major motivation in software research, constituting a Grand Challenge. Due to its impact on the final implementation, one critical aspect of software is its architectural design. By guaranteeing a correct architectural design, major and costly flaws can be caught early in the development cycle. Software architecture design has received a lot of attention in past years, with several methods, techniques and tools developed. However, there is still more to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at FIU (Florida International University). This framework is based on SAM (Software Architecture Model), an ADL (Architecture Description Language) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets, and the properties in first-order linear temporal logic. This dissertation presents a formal verification and testing approach to guarantee the correctness of software architectures. The software architectures studied are expressed in SAM. For the formal verification approach, the technique applied was model checking, and the model checker of choice was Spin. As part of the approach, a SAM model is formally translated to a model in the input language of Spin and verified for its correctness with respect to temporal properties. In terms of testing, a testing approach for SAM architectures was defined which includes the evaluation of test cases, based on Petri net testing theory, to be used in the testing process at the design level.
Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (SAM tool) was implemented to help support the design and analysis of SAM models. The results show the applicability of the approach to the testing and verification of SAM models with the aid of the SAM tool.
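The Petri-net behavioral core that SAM builds on can be sketched in a few lines. This is a minimal marking/firing implementation over a hypothetical two-place net (the place and transition names are invented for the example), not the SAM tool's actual code:

```python
# A Petri net as a marking (tokens per place) plus transitions that
# consume tokens from input places and produce tokens in output places.
def enabled(marking, transition):
    ins, _ = transition
    return all(marking.get(p, 0) >= w for p, w in ins.items())

def fire(marking, transition):
    ins, outs = transition
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, w in ins.items():
        m[p] -= w
    for p, w in outs.items():
        m[p] = m.get(p, 0) + w
    return m

# Hypothetical connector behavior: a request moves a token from 'idle' to 'busy'.
t_request = ({"idle": 1}, {"busy": 1})
m0 = {"idle": 1, "busy": 0}
m1 = fire(m0, t_request)   # {'idle': 0, 'busy': 1}
```

Exploring the reachable markings of such a net is what a model checker like Spin automates once the net is translated into its input language, and test cases at the design level correspond to firing sequences through the net.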
Abstract:
Phobic and anxiety disorders are among the most common, and most debilitating, psychopathological conditions found among children and adolescents. As a result, a treatment research literature has accumulated showing the efficacy of cognitive behavioral treatment (CBT) for reducing anxiety disorders in youth. This dissertation study compared CBT with parent and child (i.e., PCBT) and child group CBT (i.e., GCBT). These two treatment approaches were compared in recognition that a child’s context has an effect on the development, course, and outcome of childhood psychopathology and functional status. The specific aims of this dissertation were to examine treatment specificity and mediation effects of parent and peer contextual variables. The sample consisted of 183 youth and their mothers. Research questions were analyzed using analysis of variance for treatment outcome, and structural equation modeling, accounting for clustering effects, for treatment specificity and mediation effects. Results indicated that both PCBT and GCBT produced positive treatment outcomes across all indices of change (i.e., clinically significant improvement, anxiety symptom reduction) and across all informants (i.e., youths and parents), with no significant differences between treatment conditions. Results also showed partial treatment-specific effects of positive peer relationships in GCBT. PCBT also showed partial treatment-specific effects of parental psychological control. Mediation effects were only observed in GCBT; positive peer interactions mediated treatment response. The results support the use of CBT with parents and peers for treating childhood anxiety. The implications of the findings are further discussed in terms of the need to conduct further mediational treatment outcome designs in order to continue to advance theory and research in child anxiety treatment.