893 results for monopoly, synopoly, cartel, competition, black box technology
Abstract:
This article sets out to analyse the scene of the intruding chresmologue (oracle-monger) in Aves, revaluing Aristophanic comedy as a source of historical knowledge. The analysis centres on oracular practice as a technique of written production linked to religious authority. In this way, it explores two fields of study, Old Comedy and Greek divination, whose connection has not been examined in depth. To account for the critical moment of the oracular institution during the Peloponnesian War, perspectives on this phenomenon are reconstructed from other sources such as Thucydides and Demosthenes. This not only offers a 'comic' view of divination, but also makes it possible to understand oracular practice as a technique and, consequently, to see which elements of its workings could be manipulated.
Abstract:
In a globalized world, organizational structures have grown considerably more complex in their operation. While organization theory describes the behaviour of a firm as a system, globalization renders the environment of that system far less intelligible, so that the interdependence between organizations, whether global or local, becomes stronger. This work was produced in order to analyse the existence and constitution of the Economic Groups (Grupos Económicos) of Colombia.
Abstract:
China has been the focus of much academic and business scrutiny of late. Its economic climate is changing and its huge new market opportunities seem quite tantalizing to the would-be 'technology entrepreneur'. But China's market is a relatively immature one; it is still in the process of being opened up to real competition. The corollary of this is that, at this stage of the transitional process, there is still significant State control of market function. This article discusses Chinese competition law, the technology transfer system, how the laws are being reformed and how the technology entrepreneur fares under them. The bottom line is that while opportunities beckon, the wise entrepreneur will nevertheless continue to exercise caution.
Abstract:
Rapidly developing information and telecommunication technologies and their platforms in the late 20th century helped improve urban infrastructure management and influenced quality of life. Telecommunication technologies make it possible for people to deliver text, audio and video material using wired, wireless or fibre-optic networks. Technological convergence amongst these digital devices continues to create new ways in which information and telecommunication technologies are used. The 21st century is an era of converged information, in which people are able to access a variety of services, including internet and location-based services, through multi-functional devices such as mobile phones. This chapter discusses the recent developments in telecommunication networks and trends in convergence technologies, their implications for urban infrastructure planning, and their implications for the quality of life of urban residents.
Abstract:
Theories of individual attitudes toward IT include task-technology fit (TTF), the technology acceptance model (TAM), the unified theory of acceptance and use of technology (UTAUT), cognitive fit, expectation disconfirmation, and computer self-efficacy. Examination of these theories reveals three main concerns. First, the theories mostly 'black box' (or omit) the IT artifact. Second, appropriate mid-range theory has not been developed to contribute to disciplinary progress and to serve the needs of our practitioner community. Third, the theories are overlapping but incommensurable. We propose a theoretical framework that harmonizes these attitudinal theories and shows how they can be specialized to include relevant IS phenomena.
Abstract:
Innovation policies play an important role throughout the development process of emerging industries. However, existing policy studies view the process as a black box and fail to capture the policy-industry interactions throughout the process. This paper aims to develop an integrated technology roadmapping tool in order to facilitate a better understanding of policy heterogeneity at the different stages of new energy industries in China. Through a case study of the Chinese wind energy equipment manufacturing industry, this paper elaborates the dynamics between policy and the growth process of the industry. Further, this paper generalizes some Chinese specifics of the policy-industry interactions. As a practical output, this study proposes a policy-technology roadmapping framework that maps policy-market-product-technology interactions in response to the requirement for analyzing and planning the development of new industries in emerging economies (e.g. China). This paper will be of interest to policy makers, strategists, investors, and industrial experts. © 2011 IEEE.
Abstract:
Innovation policies play an important role throughout the development process of emerging industries in China. Existing policy and industry studies view the emergence process as a black box and fail to capture how the impact of policy varies along that process. This paper aims to develop a multi-dimensional roadmapping tool to better analyse the dynamics between policy and industrial growth for new industries in China. Through reviewing the emergence process of the Chinese wind turbine industry, this paper elaborates how policy and other factors influenced the emergence of the industry along this path. Further, this paper generalises some Chinese specifics of the policy-industry dynamics. As a practical output, this study proposes a roadmapping framework that generalises some patterns of policy-industry interactions for the emergence process of new industries in China. This paper will be of interest to policy makers, strategists, investors and industrial experts. Copyright © 2013 Inderscience Enterprises Ltd.
Abstract:
Technology discloses man's mode of dealing with Nature, the process of production by which he sustains his life, and thereby also lays bare the mode of formation of his social relations, and of the mental conceptions that flow from them (Marx, 1990: 372). My thesis is a sociological analysis of UK policy discourse for educational technology during the last 15 years. My framework is a dialogue between the Marxist-based critical social theory of Lieras and a corpus-based Critical Discourse Analysis (CDA) of UK policy for Technology Enhanced Learning (TEL) in higher education. Embedded in TEL is a presupposition: a deterministic assumption that technology has enhanced learning. This conceals a necessary debate that reminds us it is humans that design learning, not technology. By omitting people, TEL provides a vehicle for strongly hierarchical or neoliberal agendas to make simplified claims politically, in the name of technology. My research has two main aims. Firstly, I share a replicable, mixed methodological approach for linguistic analysis of the political discourse of TEL. Quantitatively, I examine patterns in my corpus to question forms of 'use' around technology that structure a rigid basic argument which 'enframes' educational technology (Heidegger, 1977: 38). In a qualitative analysis of findings, I ask to what extent policy discourse evaluates technology in one way, to support a Knowledge Based Economy (KBE) in a political economy of neoliberalism (Jessop 2004, Fairclough 2006). If technology is commodified as an external enhancement, it is expected to provide an 'exchange value' for learners (Marx, 1867). I therefore examine more closely what is prioritised and devalued in these texts. Secondly, I disclose a form of austerity in the discourse where technology, as an abstract force, undertakes tasks usually ascribed to humans (Lieras, 1996, Brey, 2003: 2). This risks desubjectivisation and loss of power, and limits people's relationships with technology and with each other. A view of technology in political discourse as complete without people closes possibilities for broader dialectical (Fairclough, 2001, 2007) and 'convivial' (Illich, 1973) understandings of the intimate, material practice of engaging with technology in education. In opening the 'black box' of TEL via CDA, I reveal talking points that are otherwise concealed. This allows me to be reflexive and self-critical through praxis, to confront my own assumptions about what the discourse conceals and what forms of resistance might be required. In so doing, I contribute to ongoing debates about networked learning, providing a context to explore educational technology as a technology, language and learning nexus.
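The quantitative step described in the abstract above, examining patterns of 'use' around technology in a policy corpus, can be illustrated with a minimal collocation count. This is a hypothetical sketch, not the thesis's actual pipeline: the toy corpus list and the collocates helper are invented for illustration, standing in for the purpose-built corpus of UK TEL policy texts analysed in the thesis.

```python
from collections import Counter
import re

# Toy stand-in for the corpus of UK TEL policy texts analysed in the thesis.
corpus = [
    "Institutions should use technology to enhance learning.",
    "The use of technology will enhance the student experience.",
    "Technology can be used to deliver flexible learning.",
]

def collocates(texts, node_prefix, window=3):
    """Count words occurring within `window` tokens of any token
    starting with `node_prefix` (e.g. 'use' matches use/used/uses)."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z]+", text.lower())
        for i, tok in enumerate(tokens):
            if tok.startswith(node_prefix):
                lo, hi = max(0, i - window), i + window + 1
                counts.update(t for j, t in enumerate(tokens[lo:hi], start=lo)
                              if j != i)
    return counts

# The most frequent neighbours of 'use' hint at how technology is framed.
print(collocates(corpus, "use").most_common(5))
```

In a real corpus study, the same frequency-and-collocation logic would run over thousands of policy documents and feed the qualitative CDA stage.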
Abstract:
The ultimate intent of this dissertation was to broaden and strengthen our understanding of IT implementation by emphasizing research efforts on the dynamic nature of the implementation process. More specifically, efforts were directed toward opening the "black box" and providing the story that explains how and why contextual conditions and implementation tactics interact to produce project outcomes. In pursuit of this objective, the dissertation was aimed at theory building and adopted a case study methodology combining qualitative and quantitative evidence. Specifically, it examined the implementation process, use and consequences of three clinical information systems at Jackson Memorial Hospital, a large tertiary care teaching hospital. As a preliminary step toward the development of a more realistic model of system implementation, the study proposes a new set of research propositions reflecting the dynamic nature of the implementation process. Findings clearly reveal that successful implementation projects are likely to be those where key actors envision end goals, anticipate challenges ahead, and recognize the presence of and seize opportunities. It was also found that IT implementation is characterized by the systems-theory notion of equifinality; that is, there are likely several equally effective ways to achieve a given end goal. The selection of a particular implementation strategy appears to be a rational process in which actions and decisions are largely influenced by the degree to which key actors recognize the mediating role of each tactic and are motivated to action. The nature of the implementation process is also characterized by the concept of "duality of structure"; that is, context and actions mutually influence each other. Another key finding suggests that there is no underlying program that regulates the process of change and moves it from one given point toward a subsequent and already prefigured end. For this reason, the implementation process cannot be thought of as a series of activities performed in a sequential manner, as conceived in stage models. Finally, it was found that IT implementation is punctuated by a certain indeterminacy. Results suggest that unfavorable and undesirable consequences become less likely only when substantial efforts are focused on what to look for and think about.
Abstract:
The book within which this chapter appears is published as a research reference book (not a coursework textbook) on Management Information Systems (MIS) for seniors or graduate students in Chinese universities. It is hoped that this chapter, along with the others, will be helpful to MIS scholars and PhD/Masters research students in China who seek understanding of several central Information Systems (IS) research topics and related issues. The subject of this chapter – 'Evaluating Information Systems' – is broad, and cannot be addressed in its entirety in any depth within a single book chapter. The chapter proceeds from the truism that organizations have limited resources and those resources need to be invested in a way that provides greatest benefit to the organization. IT expenditure represents a substantial portion of any organization's investment budget, and IT-related innovations have broad organizational impacts. Evaluation of the impact of this major investment is essential to justify the expenditure both pre- and post-investment. Evaluation is also important to prioritize possible improvements. The chapter (and most of the literature reviewed herein) admittedly assumes a black-box view of IS/IT, emphasizing measures of its consequences (e.g. for organizational performance or the economy) or perceptions of its quality from a user perspective. This reflects the MIS emphasis – a 'management' emphasis rather than a software engineering emphasis, where a software engineering emphasis might be on the technical characteristics and technical performance. Though a black-box approach limits the diagnostic specificity of findings from a technical perspective, it offers many benefits. In addition to superior management information, these benefits may include economy of measurement and comparability of findings (e.g. see Part 4 on Benchmarking IS). The chapter does not purport to be a comprehensive treatment of the relevant literature. It does, however, reflect many of the more influential works, and a representative range of important writings in the area. The author has been somewhat opportunistic in Part 2, employing a single journal – The Journal of Strategic Information Systems – to derive a classification of literature in the broader domain. Nonetheless, the arguments for this approach are believed to be sound, and the value from this exercise real. The chapter drills down from the general to the specific. It commences with a high-level overview of the general topic area, achieved in two parts: Part 1 addresses existing research in the more comprehensive IS research outlets (e.g. MISQ, JAIS, ISR, JMIS, ICIS), and Part 2 addresses existing research in a key specialist outlet (the Journal of Strategic Information Systems). Subsequently, in Part 3, the chapter narrows to focus on the sub-topic 'Information Systems Success Measurement', then drills deeper to become even more focused in Part 4 on 'Benchmarking Information Systems'. In other words, the chapter drills down from Parts 1 & 2 (the value of IS), to Part 3 (measuring IS success), to Part 4 (benchmarking IS). While the commencing Parts (1 & 2) are by definition broadly relevant to the chapter topic, the subsequent, more focused Parts (3 and 4) admittedly reflect the author's more specific interests. Thus, the three chapter foci – value of IS, measuring IS success, and benchmarking IS – are not mutually exclusive; rather, each subsequent focus is in most respects a subset of the former.
Parts 1 & 2, 'the Value of IS', take a broad view, with much emphasis on the business value of IS, or the relationship between information technology and organizational performance. Part 3, 'Information System Success Measurement', focuses more specifically on measures and constructs employed in empirical research into the drivers of IS success (ISS). DeLone and McLean (1992) inventoried and rationalized disparate prior measures of ISS into six constructs – System Quality, Information Quality, Individual Impact, Organizational Impact, Satisfaction and Use – later suggesting a seventh construct, Service Quality (DeLone and McLean 2003). These six constructs have been used extensively, individually or in some combination, as the dependent variable in research seeking to better understand the important antecedents or drivers of IS success. Part 3 reviews this body of work. Part 4, 'Benchmarking Information Systems', drills deeper again, focusing more specifically on a measure of the IS that can be used as a 'benchmark'. This section consolidates and extends the work of the author and his colleagues to derive a robust, validated IS-Impact measurement model for benchmarking contemporary Information Systems (IS). Though IS-Impact, like ISS, has potential value in empirical, causal research, its design and validation have emphasized its role and value as a comparator: a measure that is simple, robust and generalizable, and which yields results that are as far as possible comparable across time, across stakeholders, and across differing systems and systems contexts.
Abstract:
The law and popular opinion expect boards of directors to actively monitor their organisations. Further, public opinion holds that boards should have a positive impact on organisational performance. However, the processes of board monitoring and judgment are poorly understood, and board influence on organisational performance needs to be better understood. This thesis responds to the repeated calls to open the 'black box' linking board practices and organisational performance by investigating the processual behaviours of boards. The work of four boards of micro and small-sized nonprofit organisations was studied for periods of at least one year, using a processual research approach, drawing on observations of board meetings, interviews with directors, and the documents of the boards. The research shows that director turnover, the difficulty of recruiting and engaging directors, and the administration of reporting had strong impacts upon board monitoring, judging and/or influence. In addition, board monitoring of organisational performance was adversely affected by directors' limited awareness of their legal responsibilities and directors' limited financial literacy. Directors on average found all sources of information about their organisation's work useful. Board judgments about the financial aspects of organisational performance were regulated by the routines of financial reporting. However, there were no comparable routines facilitating judgments about non-financial performance, and such judgments tended to be limited to specific aspects of performance and were ad hoc, largely in response to new information or the repackaging of existing information in a new form. The thesis argues that Weick's theory of sensemaking offers insight into the way boards went about the task of understanding organisational performance. Board influence on organisational performance was demonstrated in the areas of compliance; instrumental influence through service and through discussion and decision-making; and symbolic, legitimating and protective means. The degree of instrumental influence achieved by boards depended on director competency, access to networks of influence, understandings of board roles, and the agency demonstrated by directors. The thesis concludes that there is a crowding-out effect whereby CEO competence and capability limit board influence. The thesis also suggests that there is a second 'agency problem', a problem of director volition. The research potentially has profound implications for the work of nonprofit boards. Rather than purporting to establish a general theory of board governance, the thesis embraces calls to build situation-specific mini-theories about board behaviour.
Abstract:
Advances in algorithms for approximate sampling from a multivariate target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian 'likelihood-free' techniques (often termed approximate Bayesian computation (ABC)) has emerged in the last few years, which avoids direct likelihood computation by repeatedly sampling data from the model and comparing observed and simulated summary statistics (a minimal code sketch of this idea follows this abstract). In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments: that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way. If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information to make better decisions about future design points. This is of particular interest if the data can be collected sequentially; in a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design that accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively 'dies'. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a small number of years after the initial onset of the disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists.
Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than relying on indirect techniques such as muscle strength assessment, which is generally unable to detect progression owing to the body's natural attempts at compensation. Part III of this thesis builds upon a previous Bayesian technique based on a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainty. More specifically, we develop a more reliable MUNE method by marginalising over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We also make other subtle changes to the model and algorithm to improve the robustness of the approach.
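As a companion to the ABC description in the abstract above, here is a minimal sketch of the rejection-ABC idea: sample parameters from the prior, simulate data from the model, and keep draws whose simulated summary statistics land close to the observed ones. The function names, the toy normal-mean model, and the tolerance value are all illustrative assumptions, not the thesis's SMC-based algorithms.

```python
import numpy as np

def abc_rejection(observed, prior_sampler, simulator, summary, distance,
                  n_draws=10_000, tolerance=0.1):
    """Rejection ABC: keep prior draws whose simulated summary statistics
    fall within `tolerance` of the observed summary statistics."""
    obs_summary = summary(observed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler()          # draw a candidate from the prior
        simulated = simulator(theta)     # simulate data from the model
        if distance(summary(simulated), obs_summary) < tolerance:
            accepted.append(theta)       # close enough: keep the draw
    return np.array(accepted)

# Toy example (hypothetical): infer a normal mean from its sample mean.
rng = np.random.default_rng(0)
observed = rng.normal(loc=2.0, scale=1.0, size=100)

posterior_draws = abc_rejection(
    observed,
    prior_sampler=lambda: rng.normal(0.0, 5.0),
    simulator=lambda mu: rng.normal(mu, 1.0, size=100),
    summary=np.mean,
    distance=lambda a, b: abs(a - b),
)
print(posterior_draws.mean(), posterior_draws.std())
```

Note that no likelihood is ever evaluated; the accepted draws approximate the posterior, and the SMC variants developed in Part I of the thesis aim to reach a comparable approximation with far fewer model simulations.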