668 results for 1ST-PRINCIPLES THEORY
Abstract:
This chapter introduces different theoretical approaches to negotiation and provides an explanation of these differing frameworks. While the action of a negotiation centres on the background research undertaken and what happens at the negotiation table, there is a need to know what principles and assumptions are informing these activities. Theories offer a way of understanding the underlying structures, processes and relationships of negotiation. Further, negotiation theories also assist with focusing attention on the 'basis of the bargain' and provide a standpoint from which to judge offers and counter-offers during the negotiation.
Abstract:
Business transformations are large-scale organisational change projects that, evidence suggests, are often unsuccessful. This study aims to develop a conceptual model that explains how the management services required for a business transformation are orchestrated during such an initiative. We classify management services (including, but not limited to, change management, risk management, IT management, financial management and program management) as bearing transformational and/or transactional capabilities in a transformation initiative. We then draw upon three principles of musical composition, namely melody, harmony and rhythm, and illustrate how they apply to the orchestration of management services in the management of business transformations. To illustrate our emerging model, we examine the case of Malaysia Airlines, which successfully turned the near-bankrupt organisation around, beyond mere survival. We demonstrate how the notions of melody, harmony and rhythm can be used to describe this endeavour. We conclude by discussing the next steps of our research.
Abstract:
A simple phenomenological model for the relationship between structure and composition of the high-Tc cuprates is presented. The model is based on two simple crystal chemistry principles: unit cell doping and charge balance within unit cells. These principles are inspired by key experimental observations of how the materials accommodate large deviations from stoichiometry. Significant HTSC properties can be explained consistently without any additional assumptions, while retaining valuable insight through geometric interpretation. Combining these two chemical principles with a review of Crystal Field Theory (CFT) or Ligand Field Theory (LFT), it becomes clear that the two oxidation states in the conduction planes (typically d8 and d9) belong to the most strongly divergent d-levels as a function of deformation from regular octahedral coordination. This observation offers a link to a range of coupling effects relating vibrations and spin waves through application of Hund’s rules. An indication of this model’s capacity to predict physical properties for HTSC is provided and will be elaborated in subsequent publications. Simple criteria for the relationship between structure and composition in HTSC systems may guide chemical syntheses within new material systems.
Abstract:
This paper presents a strategy for delayed research method selection in qualitative interpretivist research. An exemplary case details how explorative interviews were designed and conducted in accordance with a paradigm prior to deciding whether to adopt grounded theory or phenomenology for data analysis. The focus here is to determine the most appropriate research strategy, in this case the methodological framing, to conduct research and represent findings, both of which are detailed. Research addressing current management issues requires both a flexible framework and the capability to consider the research problem from various angles, to derive tangible results for academia with immediate application to business demands. Researchers, and in particular novices, often struggle to decide on a research method suitable to address their research problem. This often applies to interpretative qualitative research, where it is not always immediately clear which method is the most appropriate to use, as the research objectives shift and crystallize over time. This paper uses an exemplary case to reveal how the strategy for delayed research method selection contributes to deciding whether to adopt grounded theory or phenomenology in the initial phase of a PhD research project. In this case, semi-structured interviews were used for data generation, framed in an interpretivist approach and situated in a business context. Research questions for this study were thoroughly defined and carefully framed in accordance with the research paradigm's principles, while at the same time ensuring that the requirements of both potential research methods were met. The grounded theory and phenomenology methods were compared and contrasted to determine their suitability and whether they met the research objectives, based on a pilot study.
The strategy proposed in this paper is an alternative to the more 'traditional' approach, which initially selects the methodological formulation, followed by data generation. In conclusion, the suggested strategy for delayed research method selection is intended to help researchers identify and apply the most appropriate method for their research. This strategy is based on explorations of data generation and analysis in order to derive faithful results from the data generated.
Abstract:
Left realists contend that people lacking legitimate means of solving the problem of relative deprivation may come into contact with other frustrated disenfranchised people and form subcultures, which in turn, encourage criminal behaviors. Absent from this theory is an attempt to address how, today, subcultural development in North America and elsewhere is heavily shaped simultaneously by the recent destructive consequences of right-wing Friedman or Chicago School economic policies and marginalized men's attempts to live up to the principles of hegemonic masculinity. The purpose of this paper, then, is to offer a new left realist theory that emphasizes the contribution of these two key determinants.
Abstract:
This study proceeds from a central interest in the importance of systematically evaluating operational large-scale integrated information systems (IS) in organisations. The study is conducted within the IS-Impact Research Track at Queensland University of Technology (QUT). The goal of the IS-Impact Track is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable et al., 2009). The track espouses programmatic research guided by the principles of incrementalism, tenacity, holism and generalisability, through replication and extension research strategies. Track efforts have yielded the bicameral IS-Impact measurement model; the ‘impact’ half includes Organisational-Impact and Individual-Impact dimensions; the ‘quality’ half includes System-Quality and Information-Quality dimensions. Akin to Gregor’s (2006) analytic theory, the IS-Impact model is conceptualised as a formative, multidimensional index and is defined as "a measure at a point in time, of the stream of net benefits from the IS, to date and anticipated, as perceived by all key-user-groups" (Gable et al., 2008, p. 381). The study adopts the IS-Impact model (Gable et al., 2008) as its core theory base. Prior work within the IS-Impact track has been consciously constrained to Financial IS for their homogeneity. This study adopts a context-extension strategy (Berthon et al., 2002) with the aim "to further validate and extend the IS-Impact measurement model in a new context - i.e. a different IS - Human Resources (HR)". The overarching research question is: "How can the impacts of large-scale integrated HR applications be effectively and efficiently benchmarked?" This managerial question (Cooper & Emory, 1995) decomposes into two more specific research questions in the new HR context. (RQ1): "Is the IS-Impact model complete?"
(RQ2): "Is the IS-Impact model valid as a 1st-order formative, 2nd-order formative multidimensional construct?" The study adhered to the two-phase approach of Gable et al. (2008) to hypothesise and validate a measurement model. The initial ‘exploratory phase’ employed a zero-base qualitative approach to re-instantiating the IS-Impact model in the HR context. The subsequent ‘confirmatory phase’ sought to validate the resultant hypothesised measurement model against newly gathered quantitative data. The unit of analysis for the study is the application ‘ALESCO’, an integrated large-scale HR application implemented at Queensland University of Technology (QUT), a large Australian university (with approximately 40,000 students and 5,000 staff). Target respondents of both study phases were the ALESCO key-user-groups: strategic users, management users, operational users and technical users, who directly use ALESCO or its outputs. An open-ended, qualitative survey was employed in the exploratory phase, with the objective of exploring the completeness and applicability of the IS-Impact model’s dimensions and measures in the new context, and of conceptualising any resultant model changes to be operationalised in the confirmatory phase. Responses from 134 ALESCO users to the main survey question, "What do you consider have been the impacts of the ALESCO (HR) system in your division/department since its implementation?", were decomposed into 425 ‘impact citations’. Citation mapping using a deductive (top-down) content analysis approach instantiated all dimensions and measures of the IS-Impact model, evidencing its content validity in the new context. Seeking to probe additional (perhaps negative) impacts, the survey included the additional open question: "In your opinion, what can be done better to improve the ALESCO (HR) system?"
Responses to this question decomposed into a further 107 citations which in the main did not map to IS-Impact, but rather coalesced around the concept of IS-Support. Deductively drawing from relevant literature, and working inductively from the unmapped citations, the new ‘IS-Support’ construct, including the four formative dimensions (i) training, (ii) documentation, (iii) assistance, and (iv) authorisation (each having reflective measures), was defined as: "a measure at a point in time, of the support, the [HR] information system key-user groups receive to increase their capabilities in utilising the system." Thus, a further goal of the study became validation of the IS-Support construct, suggesting the research question (RQ3): "Is IS-Support valid as a 1st-order reflective, 2nd-order formative multidimensional construct?" With the aim of validating IS-Impact within its nomological net (identification through structural relations), as in prior work, Satisfaction was hypothesised as its immediate consequence. The IS-Support construct, having derived from a question intended to probe IS-Impacts, was also hypothesised as an antecedent of Satisfaction, thereby suggesting the research question (RQ4): "What is the relative contribution of IS-Impact and IS-Support to Satisfaction?" With the goal of testing the above research questions, IS-Impact, IS-Support and Satisfaction were operationalised in a quantitative survey instrument. Partial least squares (PLS) structural equation modelling employing 221 valid responses largely evidenced the validity of the commencing IS-Impact model in the HR context. IS-Support too was validated as operationalised (including 11 reflective measures of its 4 formative dimensions). IS-Support alone explained 36% of the variance in Satisfaction; IS-Impact alone, 70%; in combination, both explained 71%, with virtually all influence of IS-Support subsumed by IS-Impact.
Key study contributions to research include: (1) validation of IS-Impact in the HR context, (2) validation of a newly conceptualised IS-Support construct as an important antecedent of Satisfaction, and (3) demonstration of the redundancy of IS-Support when gauging IS-Impact. The study also makes valuable contributions to practice, the research track and the sponsoring organisation.
Abstract:
This paper examines whether innovation in market design can address persistent problems of housing choice and affordability in the ageing inner and middle suburbs of Australian cities. Despite policy consensus that urban intensification of these low density, ‘greyfield’ areas should be able to deliver positive social, economic and environmental outcomes, existing models of development have not increased housing stock or delivered adequate gains in sustainability, affordability or diversity of dwellings in greyfield localities. We argue that application of smart market and matching market principles to the supply of multi-unit housing can unlock land, reduce development costs and improve design.
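As a concrete illustration of the matching-market principles invoked above, the Gale–Shapley deferred-acceptance algorithm is one canonical matching mechanism. The sketch below is a minimal, generic implementation; the buyer and dwelling-site labels in the usage example are illustrative assumptions, not the authors' proposed market design.

```python
def deferred_acceptance(proposer_prefs, reviewer_prefs):
    """Gale-Shapley deferred acceptance.

    proposer_prefs: dict mapping each proposer to its ordered list of reviewers.
    reviewer_prefs: dict mapping each reviewer to its ordered list of proposers.
    Returns a stable matching as a dict {reviewer: proposer}.
    """
    # rank[r][p]: position of proposer p in reviewer r's list (lower is better)
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = list(proposer_prefs)             # proposers with no tentative match
    next_choice = {p: 0 for p in proposer_prefs}
    match = {}                              # reviewer -> tentatively held proposer
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]  # p's best reviewer not yet tried
        next_choice[p] += 1
        held = match.get(r)
        if held is None:
            match[r] = p                    # reviewer was free: hold p
        elif rank[r][p] < rank[r][held]:
            match[r] = p                    # reviewer trades up; held proposer freed
            free.append(held)
        else:
            free.append(p)                  # p rejected; will try its next choice
    return match

# Hypothetical usage: buyers proposing to dwelling sites
buyers = {"b1": ["s1", "s2"], "b2": ["s1", "s2"]}
sites = {"s1": ["b2", "b1"], "s2": ["b1", "b2"]}
print(deferred_acceptance(buyers, sites))  # {'s1': 'b2', 's2': 'b1'}
```

The mechanism's appeal for housing supply is that no matched pair would both prefer to abandon their assignment (stability), which is the sense in which matching markets can "unlock" allocations that posted-price markets miss.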
Abstract:
Transition-metal-free magnetism and half-metallicity have recently been the subject of intense research activity due to their potential in spintronics applications. Here we demonstrate for the first time, via density functional theory, that the recently experimentally realized graphitic carbon nitride (g-C4N3) displays a ferromagnetic ground state. Furthermore, this novel material is predicted to possess an intrinsic half-metallicity not reported to date. Our results highlight a promising new material for realistic metal-free spintronics applications.
Abstract:
The biosafety of carbon nanomaterials needs to be critically evaluated with both experimental and theoretical validation before extensive biomedical application. In this letter, we present an analysis of the binding ability of two-dimensional monolayer carbon nanomaterials on actin by molecular simulation, to understand their adhesive characteristics on the F-actin cytoskeleton. The modelling results indicate that positively charged carbon nanomaterials have higher binding stability on actin. Compared to crystalline graphene, graphene oxide shows a stronger binding influence on actin when carrying 11 positive surface charges. This theoretical investigation provides insights into the sensitivity of actin-related cellular activities to carbon nanomaterials.
Abstract:
"The music industry is going through a period of immense change brought about in part by the digital revolution. What is the role of music in the age of computers and the internet? How has the music industry been transformed by the economic and technological upheavals of recent years, and how is it likely to change in the future? This is the first major study of the music industry in the new millennium. Wikström provides an international overview of the music industry and its future prospects in the world of global entertainment. He illuminates the workings of the music industry, and captures the dynamics at work in the production of musical culture between the transnational media conglomerates, the independent music companies and the public." -- back cover. Table of Contents: Introduction: Music in the Cloud -- Chapter 1: A Copyright Industry -- Chapter 2: Inside the Music Industry -- Chapter 3: Music and the Media -- Chapter 4: Making Music - An Industrial or Creative Process -- Chapter 5: The Social and Creative Music Fan -- Chapter 6: Future Sounds.
Abstract:
"Emphasises asset allocation while presenting the practical applications of investment theory. The authors concentrate on the intuition and insights that will be useful to students throughout their careers as new ideas and challenges emerge from the financial marketplace. It provides a good foundation to understand the basic types of securities and financial markets as well as how trading in those markets is conducted. The Portfolio Management section is discussed towards the end of the course and supported by a web-based portfolio simulation with a hypothetical $100,000 brokerage account to buy and sell stocks and mutual funds. Students get a chance to use real data found in the Wall Street Survivor simulation in conjunction with the chapters on investments. This site is powered by StockTrak, the leading provider of investment simulation services to the academic community. Principles of Investments includes increased attention to changes in market structure and trading technology. The theory is supported by a wide range of exercises, worksheets and problems."--publisher website Contents: Investments: background and issues -- Asset classes and financial markets -- Securities markets -- Managed funds and investment management -- Risk and return: past and prologue -- Efficient diversification -- Capital asset pricing and arbitrage pricing theory -- The efficient market hypothesis -- Bond prices and yields -- Managing bond portfolios -- Equity valuation -- Macroeconomic and industry analysis -- Financial statement analysis -- Investors and the investment process -- Hedge funds -- Portfolio performance evaluation.
Abstract:
Criminal law scholarship is enjoying a renaissance in normative theory, evident in a growing list of publications from leading scholars that attempt to elucidate a set of principles on which criminalisation and criminal law might — indeed should — be based. This development has been less marked in Australia, where a stream of criminologically influenced criminal law scholarship, teaching and practice has emerged over nearly three decades. There are certain tensions between this predominantly contextual, process-oriented and criminological tradition that has emerged in Australia, characterised by a critical approach to the search for ‘general principles’ of the criminal law, and the more recent revival of interest in developing a set of principles on which a ‘normative theory of criminal law’ might be founded. Aspects of this tension will be detailed through examination of recent examples of criminalisation in New South Wales that are broadly representative of trends across all Australian jurisdictions. The article will then reflect on the links between these particular features of criminalisation and attempts to develop a ‘normative theory’ of criminalisation.
Abstract:
Background: Post-stroke recovery is demanding, and an increasing number of studies have examined the effectiveness of self-management programs for stroke survivors. However, no systematic review has been conducted to summarize the effectiveness of theory-based stroke self-management programs. Objectives: The aim is to present the best available research evidence on the effectiveness of theory-based self-management programs on community-dwelling stroke survivors’ recovery. Inclusion criteria: Types of participants: all community-residing adults aged 18 years or above with a clinical diagnosis of stroke. Types of interventions: studies which examined the effectiveness of a self-management program underpinned by a theoretical or conceptual framework for community-dwelling stroke survivors. Types of studies: randomized controlled trials. Types of outcomes: primary outcomes included health-related quality of life and self-management behaviors; secondary outcomes included physical (activities of daily living), psychological (self-efficacy, depressive symptoms) and social outcomes (community reintegration, perceived social support). Search strategy: A three-step approach was adopted to identify all relevant published and unpublished studies in English or Chinese. Methodological quality: The methodological quality of the included studies was assessed using the Joanna Briggs Institute critical appraisal checklist for experimental studies. Data collection: A standardized JBI data extraction form was used. There was no disagreement between the two reviewers on the data extraction results. Data synthesis: Incomplete details about the number of participants and the results in two studies made it impossible to perform a meta-analysis. A narrative summary of the effectiveness of stroke self-management programs is presented. Results: Three studies were included.
The key issues of concern in methodological quality included insufficient information about random assignment, allocation concealment, and the reliability and validity of the measuring instruments, the absence of intention-to-treat analysis, and small sample sizes. The three programs were designed based on the Stanford Chronic Disease Self-Management Program and were underpinned by the principles of self-efficacy. One study showed improvement in the intervention group in family and social roles three months after program completion, and in work productivity at six months, as measured by the Stroke Specific Quality of Life Scale (SSQOL). The intervention group also had an increased mean self-efficacy score in communicating with physicians six months after program completion. The mean changes from baseline in these variables were significantly different from the control group. No significant difference was found in time spent in aerobic exercise between the intervention and control groups at three and six months after program completion. Another study, using the SSQOL, showed a significant interaction effect of treatment and time on family roles, fine motor tasks, self-care, and work productivity. However, there was no significant interaction of treatment and time on self-efficacy. The third study showed improvement in quality of life, community participation, and depressive symptoms among participants receiving the stroke self-management program, the Stanford Chronic Disease Self-Management Program, or usual care six months after program completion; however, there was no significant difference between the groups. Conclusions: There is inconclusive evidence about the effectiveness of theory-based stroke self-management programs on community-dwelling stroke survivors’ recovery. However, the preliminary evidence suggests potential benefits in improving stroke survivors’ quality of life and self-efficacy.
Abstract:
The assumptions underlying the Probability Ranking Principle (PRP) have led to a number of alternative approaches that cater for, or compensate for, the PRP’s limitations. All alternatives deviate from the PRP by incorporating dependencies. This results in a re-ranking that promotes or demotes documents depending upon their relationship with the documents that have already been ranked. In this paper, we compare and contrast the behaviour of state-of-the-art ranking strategies and principles. To do so, we tease out analytical relationships between the ranking approaches and we investigate the document kinematics to visualise the effects of the different approaches on document ranking.
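To make the contrast concrete: the PRP scores each document independently, while the dependency-aware alternatives re-rank greedily against what has already been ranked. The sketch below is a generic illustration of that pattern (Maximal Marginal Relevance is one well-known instance); the relevance scores and similarity function are hypothetical, and this is not necessarily any of the specific strategies compared in the paper.

```python
def prp_rank(docs):
    # PRP: sort by probability of relevance alone; documents scored independently
    return sorted(docs, key=lambda d: d["p_rel"], reverse=True)

def dependent_rerank(docs, sim, lam=0.7):
    # Greedy dependency-aware re-ranking: each pick trades off relevance against
    # similarity to documents already ranked (the MMR pattern).
    remaining, ranked = list(docs), []
    while remaining:
        best = max(remaining, key=lambda d: lam * d["p_rel"]
                   - (1 - lam) * max((sim(d, r) for r in ranked), default=0.0))
        ranked.append(best)
        remaining.remove(best)
    return ranked

# Hypothetical corpus: 'a' and 'b' are near-duplicates, 'c' is novel but less relevant
docs = [{"id": "a", "p_rel": 0.9}, {"id": "b", "p_rel": 0.8}, {"id": "c", "p_rel": 0.5}]
sim = lambda x, y: 1.0 if {x["id"], y["id"]} == {"a", "b"} else 0.0

print([d["id"] for d in prp_rank(docs)])               # ['a', 'b', 'c']
print([d["id"] for d in dependent_rerank(docs, sim)])  # ['a', 'c', 'b']: 'b' demoted
```

The demotion of the near-duplicate 'b' below the less relevant but novel 'c' is exactly the promote/demote behaviour described above.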
Abstract:
Measuring Earth material behaviour on time scales of millions of years transcends our current capability in the laboratory. We review an alternative path considering multiscale and multiphysics approaches with quantitative structure-property relationships. This approach provides a sound basis for incorporating physical principles such as chemistry, thermodynamics, diffusion and geometry-energy relations into simulations and data assimilation across the vast range of length and time scales encountered in the Earth. We identify key length scales for Earth systems processes and find a substantial scale separation between chemical, hydrous and thermal diffusion. We propose that this allows a simplified two-scale analysis where the outputs from the micro-scale model can be used as inputs for meso-scale simulations, which in turn become the micro-model for the next scale up. We present two fundamental theoretical approaches to link the scales: asymptotic homogenisation from a macroscopic, thermodynamic view, and percolation renormalisation from a microscopic, statistical mechanics view.
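The two-scale analysis described above is commonly formalised via an asymptotic (two-scale) expansion; the following is a standard textbook form for a generic field u, not the authors' specific equations:

```latex
% \epsilon = \ell_{\mathrm{micro}} / L_{\mathrm{macro}} is the scale ratio;
% y = x/\epsilon is the fast (micro-scale) variable.
u^{\epsilon}(x) = u_0(x, y) + \epsilon\, u_1(x, y) + \epsilon^{2} u_2(x, y) + \cdots,
\qquad y = x/\epsilon .
```

Averaging the leading-order cell problem over the fast variable y yields effective (homogenised) coefficients, which then serve as inputs to the next scale up, mirroring the micro-to-meso hand-off described above.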