408 results for Print popular literature
Abstract:
The aim of this paper is to contribute to the understanding of the various models used in research on the adoption and diffusion of information technology in small and medium-sized enterprises (SMEs). Starting with Rogers' diffusion theory and behavioural models, the technology adoption models used in IS research are discussed. Empirical research has shown that the reasons why firms choose to adopt or not to adopt technology depend on a number of factors. These factors can be categorised as owner/manager characteristics, firm characteristics and other characteristics. The existing models explaining IS diffusion and adoption by SMEs overlap and complement each other. This paper reviews the existing literature and proposes a comprehensive model which includes the whole array of variables from earlier models.
Abstract:
In this article we introduce the term “energy polarization” to explain the politics of energy market reform in the Russian Duma. Our model tests the impact of regional energy production, party cohesion and ideology, and electoral mandate on the energy policy decisions of Duma deputies (oil, gas, and electricity bills and resolution proposals) between 1994 and 2003. We find a strong divide between Single-Member District (SMD) and Proportional Representation (PR) deputies. The high statistical significance of gas production throughout the three Duma terms demonstrates Gazprom's key position in the post-Soviet Russian economy. Oil production is variably significant in the first two Dumas, when the main legislative debates on oil privatization occur. There is no constant left–right continuum consistent with the deputies' proclaimed party ideology. The pro- and anti-reform poles observed in our Poole-based single-dimensional scale are not necessarily connected with liberal and state-oriented regulatory policies, respectively. Party switching is a solid indicator of Russia's polarized legislative dynamics when it comes to energy sector reform.
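For readers unfamiliar with Poole-style scaling, the following minimal sketch (not the authors' procedure; the roll-call matrix and its dimensions are invented for illustration) recovers a one-dimensional scale from votes via the leading singular vector of a centred vote matrix, a standard low-rank approximation to ideal-point estimation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical roll-call matrix: rows = deputies, columns = energy bills;
# +1 = yea, -1 = nay, 0 = abstention or absence.
votes = rng.choice([-1, 0, 1], size=(450, 60), p=[0.35, 0.15, 0.50])

# Centre the matrix and take the leading singular vector: deputies'
# scores on the first dimension approximate their ideal points, with the
# two extremes corresponding to the pro- and anti-reform poles.
centred = votes - votes.mean(axis=0, keepdims=True)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
positions = u[:, 0] * s[0]

print(positions[:5])  # positions of the first five deputies on the scale
```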
Abstract:
Gabor representations have been widely used in facial analysis (face recognition, face detection and facial expression detection) due to their biological relevance and computational properties. Two popular Gabor representations used in the literature are: 1) Log-Gabor filters and 2) Gabor energy filters. Although these representations are similar, they differ in that the Log-Gabor filters mimic the simple cells of the visual cortex while the Gabor energy filters emulate the complex cells, which leads to subtle differences in their responses. In this paper, we analyze the difference between these two Gabor representations and quantify it on the task of facial action unit (AU) detection. In our experiments conducted on the Cohn-Kanade dataset, we report an average area underneath the ROC curve (A′) of 92.60% across 17 AUs for the Gabor energy filters, while the Log-Gabor representation achieved an average A′ of 96.11%. This result suggests that the small spatial differences that the Log-Gabor filters pick up are more useful for AU detection than the differences in contours and edges that the Gabor energy filters extract.
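As a rough illustration of the two representations (a minimal sketch, not the paper's implementation; parameter names such as f0 and sigma_over_f0 are assumptions), Gabor energy can be computed as the magnitude of a quadrature even/odd Gabor pair, while a Log-Gabor filter is typically built in the frequency domain as a Gaussian on the log-frequency axis:

```python
import numpy as np
from skimage.filters import gabor  # returns the even (real) and odd (imag) responses

def gabor_energy(image, frequency, theta):
    """Gabor energy (complex-cell model): magnitude of the response of a
    quadrature pair of Gabor filters."""
    even, odd = gabor(image, frequency=frequency, theta=theta)
    return np.hypot(even, odd)

def log_gabor_radial(shape, f0=0.1, sigma_over_f0=0.65):
    """Radial frequency response of a Log-Gabor filter (simple-cell model):
    a Gaussian on the log-frequency axis, with no DC component by construction."""
    rows, cols = shape
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    radius = np.hypot(fx, fy)
    radius[0, 0] = 1.0  # placeholder to avoid log(0) at the DC bin
    g = np.exp(-np.log(radius / f0) ** 2 / (2 * np.log(sigma_over_f0) ** 2))
    g[0, 0] = 0.0       # Log-Gabor filters pass no DC
    return g

def log_gabor_response(image, f0=0.1):
    """Filter an image with the radial Log-Gabor kernel in the Fourier domain."""
    G = log_gabor_radial(image.shape, f0)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * G))
```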
Abstract:
This article examines the BBC program Top Gear, discussing why it has become one of the world’s most-watched TV programs, and how it has very successfully captivated an audience who might otherwise not be particularly interested in cars. The analysis of the show is here framed in the form of three ‘lessons’ for journalists, suggesting that some of the entertaining (and highly engaging) ways in which Top Gear presents information to its viewers could be usefully applied in the coverage of politics – a domain of knowledge which, like cars, many citizens find abstract or boring.
Abstract:
The paper has a twofold purpose. First, it highlights the importance of accounting information in the economic development of developing countries, with a particular focus on Libya. Secondly, using the case of Libya's General Company for Pipelines (GCP), it demonstrates that the use of accounting information to achieve economic development goals is determined to a large extent by the political/ideological setting in which it is generated. The study is based on a literature review and archival research, reinforced by a qualitative case study comprising interviews, attendance at meetings and a study of internal documents. The study of the GCP revealed that frequent politically driven changes in the structure and number of popular congresses and committees severely limited the use of accounting information, relegating it to a formal role. In consequence, accounting information had little effect on stimulating economic development in Libya. This study focuses on a single case, which limits generalisability. However, it also suggests fruitful areas of research into the historic factors which have determined the role of accounting in developing and planned economies. By providing insights into the social factors which have determined the use of accounting in a planned economy, this study has implications for similar economies as they move towards a more globalised mode of operations, which enhances the role of accounting in meeting economic development needs. If developing countries are to harness the potential of accounting to aid the achievement of their development plans, the social and political setting in which accounting is conducted needs to be understood.
Abstract:
The need for the development of effective business curricula that meet the needs of the marketplace has created an increase in the adoption of core competency lists identifying appropriate graduate skills. Many organisations and tertiary institutions have individual graduate capability lists including skills deemed essential for success. Skills recognised as ‘critical thinking’ are popular inclusions on core competency and graduate capability lists. While there is literature outlining ‘critical thinking’ frameworks, methods of teaching it and calls for its integration into business curricula, few studies actually identify quantifiable improvements achieved in this area. This project sought to address the development of ‘critical thinking’ skills in a management degree program by embedding a process for critical thinking within a theory unit undertaken by students early in the program. Focus groups and a student survey were used to identify issues of both content and implementation and to develop a student perspective on their needs in thinking critically. A process utilising a framework of critical thinking was integrated through a workbook of weekly case studies for group analysis, discussions and experiential exercises. The experience included formative and summative assessment. Initial results indicate that students placed a greater value on their experience in the organisation theory unit, earned better marks for mid-semester essay assignments, and gave higher ratings on the university-administered survey of student satisfaction.
Abstract:
There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions that come with each, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows Bernoulli trials with unequal probabilities of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how crash data give rise to the “excess” zeros frequently observed in crash data. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales and not from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represents the most defensible modeling approach for datasets with a preponderance of zeros.
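As a hedged illustration of the simulation argument (a minimal sketch, not the authors' experiment; the site count, exposure range, and risk distribution are assumptions), the following generates crash counts from a single-state Poisson-trials process and shows that low exposure plus heterogeneous risk alone produces more zeros than a homogeneous Poisson model predicts:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-state process: every site is "unsafe" to some degree.
# Crashes are Bernoulli outcomes over independent trials with unequal,
# site-specific probabilities (Poisson trials); there is no "perfectly
# safe" state anywhere in the data-generating process.
n_sites = 5000
exposure = rng.integers(50, 500, size=n_sites)            # trials per site (low exposure)
risk = rng.gamma(shape=0.5, scale=7e-3, size=n_sites)     # heterogeneous crash probability
crashes = rng.binomial(exposure, np.clip(risk, 0.0, 1.0)) # observed crash counts

# Compare the observed share of zeros with the zero probability implied by
# a homogeneous Poisson model with the same mean; the gap is the apparent
# "excess" of zeros, produced without any dual-state mechanism.
lam = crashes.mean()
print(f"mean crash count      : {lam:.3f}")
print(f"observed P(count = 0) : {(crashes == 0).mean():.3f}")
print(f"Poisson  P(count = 0) : {np.exp(-lam):.3f}")
```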