809 results for IS-enabled Innovation Framework


Relevance: 100.00%

Abstract:

This thesis is a comparative case study of Japanese video game localization, examining the video games Sairen, Sairen 2 and Sairen Nyûtoransurêshon and the English-language localized versions of the same games as published in Scandinavia and Australia/New Zealand. All games were developed by Sony Computer Entertainment Inc. and published exclusively for the PlayStation 2 and PlayStation 3 consoles. The fictional world of the Sairen games draws heavily on Japanese history, as well as on popular and contemporary culture, and in doing so caters mainly to a Japanese audience. For localization, i.e. the adaptation of a product to make it accessible to users outside the market it was originally intended for, this is a challenging issue. Video games are entertainment media, and localization practice must therefore preserve the games’ effects on players’ emotions. Furthermore, video games are digital products composed of a multitude of distinct elements, some of which are part of the game world, while others regulate the connection between the player, as part of the real world, and the game as a digital medium. As a result, video game localization is also a practice that must cope with the technical restrictions inherent to the medium. The main theory used throughout the thesis is Anthony Pym’s framework for localization studies, which considers the user of the localized product a defining part of the localization process. This concept presupposes that localization is an adaptation performed to make a product better suited for use in a specific reception situation. Pym also addresses the fact that certain products may resist distribution into certain reception situations because of their content, and that certain aspects of localization aim to reduce this resistance through significant alterations of the original product.
While Pym developed his ideas mainly with conventional software in mind, they can also be adapted to study video games from a localization angle. Since modern video games are highly complex entities that often switch between interactive and non-interactive modes, Pym’s ideas are adapted throughout the thesis to suit the particular elements being studied. Instances analyzed in this thesis include menu screens, video clips, in-game action and websites. The main research questions focus on how the games’ rules influence localization, and how the games’ fictional domain influences localization. Because the medium of the video game has so many peculiarities, other theories are introduced to complement the research at hand, including Lawrence Venuti’s discussions of foreignizing and domesticating translation methods for literary translation, and Jesper Juul’s definition of games. Additionally, knowledge gathered from interviews with video game localization professionals in Japan during September and October 2009 is also utilized. Apart from answering the aforementioned research questions, one aim of this thesis is to enrich the still rather small field of game localization studies, and in particular the study of Japanese video games, one of Japan’s most successful cultural exports.

Relevance: 100.00%

Abstract:

The study explores new ideational changes in the information strategy of the Finnish state between 1998 and 2007, after a juncture in Finnish governing in the early 1990s. The study scrutinizes the economic reframing of institutional openness in Finland, which comes with significant and often unintended institutional consequences of transparency. Most notably, the constitutional principle of publicity (julkisuusperiaate), a Nordic institutional peculiarity allowing public access to state information, is now becoming an instrument of economic performance and accountability through results. Finland has a long institutional history of publicity of government information, acknowledged by law since 1951. Nevertheless, access to government information became a policy concern in the mid-1990s, involving a historical narrative of openness as a Nordic tradition of Finnish governing, “Nordic openness” (pohjoismainen avoimuus). International interest in the transparency of governance has also opened the way for institutional re-descriptions in the Nordic context. The essential added value, or contradiction, that transparency brings to the Finnish conceptualisation of governing is the idea that public acts of governing can be economically efficient. This is most apparent in the new attempts at providing standardised information on government and expressing it in numbers. In Finland, the publicity of government information has been a concept with democratic connotations, but new internationally diffusing ideas of performance and national economic competitiveness are discussed under the notion of transparency and its peer concepts, openness and public (sector) information, which are also newcomers to the Finnish vocabulary of governing. These concepts often conflict with one another, paving the way to unintended consequences for the reforms conducted in their name.
Moreover, the study argues that the policy concerns over openness and public sector information are linked to the new drive for transparency. Drawing on theories of new institutionalism, political economy, and conceptual history, the study argues for a reinvention of Nordic openness in two senses. First, in referring to institutional history, the policy discourse of Nordic openness discovers an administrative tradition in response to new dilemmas of public governance; this normatively appealing discourse also legitimizes the new ideational changes. Second, a former mechanism of democratic accountability is being reframed with market and performance ideas, mostly originating from the sphere of transnational governance and governance indices. Mobilizing different research techniques and data (public documents of the Finnish government and international organizations, some 30 interviews with Finnish civil servants, and statistical time series), the study asks how the above ideational changes have been possible, pointing to the importance of nationalistically appealing historical narratives and normative concepts of governing. Concerning institutional developments, the study analyses the ideational changes in central steering mechanisms (political, normative and financial steering) and the introduction of budget transparency and performance management in two cases: census data (Population Register Centre) and foreign political information (Ministry for Foreign Affairs). The new policy domain of governance indices is also explored as a type of transparency. The study further asks what institutional transformations are to be observed in the above cases and in the accountability system. The study concludes that while the information rights of citizens have been reinforced and recalibrated during the period under scrutiny, there has also been a conversion of institutional practices towards economic performance.
As the discourse of Nordic openness has remained rather unquestioned, the new internationally circulating ideas of transparency and the knowledge economy have entered this discourse without public notice. Since the mid-1990s, state registry data has been perceived as an exploitable economic resource in Finland and in the EU, as “public sector information”. This development parallels the new drive for budget transparency in organisations as vital to the state as the Population Register Centre, and has led to the marketization of census data in Finland, an international exception. In the Finnish Ministry for Foreign Affairs, the post-Cold War rhetorical shift from secrecy to performance-driven openness marked a conversion in institutional practices that now hold information services in high regard. But this has not necessarily increased the publicity of foreign political information. In this context, openness is also defined as sharing information with select actors, as a trust-based, non-public activity deemed necessary amid global economic competition. Regarding the accountability system, deliberation and performance now overlap, making it increasingly difficult to identify to whom and for what the public administration is accountable. These evolving institutional practices are characterised by unintended consequences and paradoxes. History is a paradoxical component of this institutional change, as long-term institutional developments now justify short-term reforms.

Relevance: 100.00%

Abstract:

Two new neutral copper-azido polymers, [Cu3(N3)6(tmen)2]n (1) and [Cu6(N3)12(deen)2]n (2) [tmen = N,N,N′,N′-tetramethylethylenediamine; deen = N,N-diethylethylenediamine], have been synthesized by using lower molar equivalents of the chelating diamine ligands with Cu(NO3)2·3H2O and an excess of NaN3. The single-crystal X-ray structure shows that in the basic unit of the 1D complex 1, the three Cu(II) ions are linked by double end-on (EO) azido bridges, with Cu–N(EO)–Cu angles on either side of the magnetic-exchange critical angle of 108°. Complex 2 is a 3D framework built from a basic Cu6 cluster. Cryomagnetic susceptibility measurements over a wide temperature range exhibit dominant ferromagnetic behavior in both complexes. Density functional theory calculations (B3LYP functional) have been performed on the trinuclear unit to provide a qualitative theoretical interpretation of the overall ferromagnetic behavior shown by complex 1.

Relevance: 100.00%

Abstract:

Dance is a potential asset for peacebuilding, creating opportunities for nonverbal, embodied learning and for exploring identity and relationships. Peace scholars consider identity and relationships with the ‘other’ key components in transforming conflict. Focusing on a case study in Mindanao, the Philippines, this paper explores the potential of dance in a peacebuilding context through embodied identity and relationships. In Mindanao, deep-seated cultural prejudices contribute to an ongoing conflict entwined with identity. The permeable membrane (Cohen, Gutiérrez & Walker, 2011) is the organising framework describing the constant interaction between artists, facilitators, participants, and communities. It expands peace scholar John Paul Lederach’s concept of the moral imagination, which requires the capacity to envisage one’s self within a web of relationships. In this paper, multiple qualitative research methods, including personal interviews, are used to further the discussion of dance’s potential to diversify the nonverbal tools available for peacebuilding.

Relevance: 100.00%

Abstract:

Image fusion is a formal framework comprising the means and tools for combining multisensor, multitemporal, and multiresolution data. Multisource data vary in spectral, spatial and temporal resolution, necessitating advanced analytical or numerical techniques for enhanced interpretation capabilities. This paper reviews seven pixel-based image fusion techniques: intensity-hue-saturation, Brovey, high-pass filter (HPF), high-pass modulation (HPM), principal component analysis, Fourier transform and correspondence analysis. Validation of these techniques on IKONOS data (panchromatic band at 1 m spatial resolution and four multispectral bands at 4 m spatial resolution) reveals that the HPF and HPM methods synthesise the images closest to those the corresponding multisensors would observe at the higher resolution level.
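As a minimal sketch of the HPF and HPM ideas reviewed above (my own illustration, not the paper's exact protocol: it assumes the multispectral band has already been resampled to the panchromatic resolution, and the box-filter kernel size and toy arrays are arbitrary), the panchromatic band's high-pass detail is injected additively (HPF) or multiplicatively (HPM) into the multispectral band:

```python
import numpy as np

def box_blur(img, k=5):
    """Simple box low-pass filter (edge-padded)."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def hpf_fuse(pan, ms_band):
    """HPF fusion: add the pan band's high-pass residual to the
    (already upsampled) multispectral band."""
    detail = pan - box_blur(pan)   # spatial detail of the pan band
    return ms_band + detail

def hpm_fuse(pan, ms_band, eps=1e-6):
    """HPM fusion: modulate the multispectral band by the ratio of
    the pan band to its low-pass version."""
    return ms_band * (pan / (box_blur(pan) + eps))

# toy example: 8x8 "pan" image with a vertical edge, flat "ms" band
pan = np.zeros((8, 8)); pan[:, 4:] = 1.0
ms = np.full((8, 8), 0.5)
fused = hpf_fuse(pan, ms)
```

The fused band keeps the multispectral band's overall level while picking up the pan band's edge structure, which is the behaviour the validation in the paper rewards.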

Relevance: 100.00%

Abstract:

Perhaps the most fundamental prediction of financial theory is that the expected returns on financial assets are determined by the amount of risk contained in their payoffs. Assets with a riskier payoff pattern should provide higher expected returns than assets that are otherwise similar but whose payoffs contain less risk. Financial theory also predicts that not all types of risk should be compensated with higher expected returns. It is well known that asset-specific risk can be diversified away, whereas the systematic component of risk, which affects all assets, remains even in large portfolios. Thus, the asset-specific risk that an investor can easily eliminate through diversification should not lead to higher expected returns; only the shared movement of individual asset returns – the sensitivity of these assets to a set of systematic risk factors – should matter for asset pricing. It is within this framework that this thesis is situated. The first essay proposes a new systematic risk factor, hypothesized to be correlated with changes in investor risk aversion, which explains a large fraction of the return variation in the cross-section of stock returns. The second and third essays investigate the pricing of asset-specific risk, uncorrelated with commonly used risk factors, in the cross-section of stock returns. These three essays use stock market data from the U.S. The fourth essay presents a new total return stock market index for the Finnish stock market, beginning with the opening of the Helsinki Stock Exchange in 1912 and ending in 1969, when other total return indices become available. Because a total return stock market index for the period prior to 1970 has not previously been available, academics and stock market participants have not known the historical return that stock market investors in Finland could have achieved on their investments. The new index makes it possible, for the first time, to calculate the historical average return on the Finnish stock market and to conduct further studies that require long time series of data.
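The diversification argument above can be illustrated with a small simulation (illustrative parameters only, not taken from the essays): in a one-factor model, the idiosyncratic component of an equally weighted portfolio's variance shrinks roughly as 1/n, while the systematic component driven by the common factor remains.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5000                          # number of return observations
beta, sf, si = 1.0, 0.02, 0.05    # factor loading, factor vol, idiosyncratic vol

factor = rng.normal(0, sf, T)     # the common systematic risk factor

def portfolio_var(n_assets):
    """Variance of an equally weighted portfolio of n_assets whose
    returns follow beta * factor + idiosyncratic noise."""
    idio = rng.normal(0, si, (T, n_assets))
    returns = beta * factor[:, None] + idio
    return returns.mean(axis=1).var()

v1, v100 = portfolio_var(1), portfolio_var(100)
# idiosyncratic variance falls ~ si**2 / n; the systematic part
# beta**2 * sf**2 cannot be diversified away
```

With 100 assets the portfolio variance collapses toward the systematic floor beta²·sf², which is why only factor sensitivities should be priced.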

Relevance: 100.00%

Abstract:

This study explores the relationship between Intellectual Capital and Maintenance of Work Ability (MWA). Intellectual Capital is the central framework for analysing the increasing knowledge-intensiveness of business life. It is characteristic of Intellectual Capital that the intersection of human capital, internal structures and external structures is essential. Maintenance of Work Ability, on the other hand, has been the leading paradigm for Finnish occupational health and safety activities since the late 1980s. It is also a holistic approach that emphasises the interdependence of competence, work community, work environment and health as the key to work-related wellbeing. This thesis consists of five essays that scrutinise the focal phenomena both theoretically and empirically. The conceptual model that results from the first essay provides a general framework for the whole thesis. The case study in the second essay supports the division of intangible assets, introduced in the first essay, into generative and commercially exploitable intangibles, and further into primary and secondary dimensions of generative intangibles. Further scrutiny of the interaction of generative intangible assets in the third essay reveals that employees’ wellbeing enhances their readiness to contribute to the knowledge creation process. The fourth essay shows that the MWA framework could benefit knowledge-intensive work, but that this would require a different approach from the one commonly adopted in Finland. In the fifth essay, deeper analysis of the MWA framework shows that its potential results from comprehensive support of the functioning of an organisation. The general conclusion of this thesis is that organisations must take care of their employees’ wellbeing in order to secure the innovativeness that is key to surviving in today’s competitive business environment.

Relevance: 100.00%

Abstract:

A low-correlation interleaved QAM sequence family is presented here. In a CDMA setting, these sequences can transport a large amount of data as well as enable variable-rate signaling on the reverse link. The new interleaved sequence family INQ has period N and normalized maximum correlation parameter θ̄_max ≲ a√N, where a ranges from 1.17 in the 16-QAM case to 1.99 for large M²-QAM, where M = 2^m, m ≥ 2. Each user can transfer m + 1 bits of data per period of the spreading sequence. These constructions have the lowest known value of maximum correlation of any sequence family over the same alphabet.
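The maximum-correlation parameter discussed above can be made concrete with a small computation. The sketch below uses a toy family of quadratic-phase (chirp-like, QPSK-alphabet) sequences over a prime period purely to show how the normalized maximum periodic correlation is measured against the √N benchmark; this family is an illustrative stand-in, not the INQ construction.

```python
import numpy as np

def periodic_correlations(family):
    """All periodic auto-/cross-correlation magnitudes of a sequence
    family, excluding only the trivial in-phase autocorrelations."""
    N = len(family[0])
    vals = []
    for i, u in enumerate(family):
        for j, v in enumerate(family):
            for tau in range(N):
                if i == j and tau == 0:
                    continue   # skip shift-0 autocorrelation (always N)
                vals.append(abs(np.vdot(u, np.roll(v, tau))))
    return vals

# toy family: quadratic-phase sequences exp(2*pi*i * a*n^2 / N), prime N
N = 31
family = [np.exp(2j * np.pi * (a * np.arange(N) ** 2 % N) / N)
          for a in range(1, 4)]

theta_max = max(periodic_correlations(family))
ratio = theta_max / np.sqrt(N)   # compare against the a*sqrt(N) bound
```

For this family the Gauss-sum structure pins every nontrivial correlation at either 0 or exactly √N, so the ratio comes out as 1.0; the INQ family of the paper trades a slightly larger constant a for its much richer QAM alphabet.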

Relevance: 100.00%

Abstract:

Frequent episode discovery is a popular framework for mining data available as a long sequence of events. An episode is essentially a short ordered sequence of event types, and the frequency of an episode is some suitable measure of how often it occurs in the data sequence. Recently, we proposed a new frequency measure for episodes based on the notion of non-overlapped occurrences of episodes in the event sequence, and showed that such a definition, in addition to yielding computationally efficient algorithms, has some important theoretical properties connecting frequent episode discovery with HMM learning. This paper presents some new algorithms for frequent episode discovery under this non-overlapped-occurrences-based frequency definition. The algorithms presented here are better, by a factor of N (where N denotes the size of the episodes being discovered), in terms of both time and space complexity than existing methods for frequent episode discovery. We show through simulation experiments that our algorithms are very efficient. The new algorithms presented here arguably have the least possible orders of space and time complexity for the task of frequent episode discovery.
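The non-overlapped frequency notion above can be sketched for a single serial episode (my own simplified illustration; the paper's algorithms track many candidate episodes simultaneously): a greedy left-to-right scan that restarts the episode automaton after each completed occurrence counts a maximal set of non-overlapped occurrences.

```python
def count_non_overlapped(event_seq, episode):
    """Count non-overlapped occurrences of a serial episode (an ordered
    tuple of event types) in a sequence of event types, greedily."""
    count, pos = 0, 0   # pos = index of the next episode slot to match
    for ev in event_seq:
        if ev == episode[pos]:
            pos += 1
            if pos == len(episode):   # one full occurrence completed
                count += 1
                pos = 0               # restart: occurrences may not overlap
    return count

seq = list("ABCABABCAC")
print(count_non_overlapped(seq, ("A", "B", "C")))  # -> 2
```

The scan is a single pass with constant state per episode, which is the kind of property that makes this frequency definition computationally attractive.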

Relevance: 100.00%

Abstract:

The frequent episode discovery framework is popular in temporal data mining and has many applications. Over the years, many different notions of the frequency of an episode have been proposed, along with different algorithms for episode discovery. In this paper, we present a unified view of all the apriori-based discovery methods for serial episodes under these different notions of frequency. Specifically, we present a unified view of the various frequency counting algorithms and propose a generic counting algorithm of which all current algorithms are special cases. This unified view allows one to gain insights into the different frequencies, and we present quantitative relationships among them. Our unified view also helps in obtaining correctness proofs for the various counting algorithms, as we show here. It further aids in understanding and deriving the anti-monotonicity properties satisfied by the various frequencies, the properties exploited by the candidate generation step of any apriori-based method. We also point out how our unified view of counting helps in generalizing the algorithm to count episodes with general partial orders.
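To make the idea of "different notions of frequency" concrete, here is a sketch of one of the classical alternatives, the windows-based count (my own simplified single-episode illustration under an assumed definition: a width-w sliding window "contains" a serial episode if its event types occur in order inside the window):

```python
def occurs_in(window, episode):
    """True if the serial episode occurs in order within the window."""
    pos = 0
    for ev in window:
        if pos < len(episode) and ev == episode[pos]:
            pos += 1
    return pos == len(episode)

def windows_frequency(event_seq, episode, w):
    """Windows-based frequency: the number of width-w sliding windows
    that contain an in-order occurrence of the serial episode."""
    return sum(occurs_in(event_seq[i:i + w], episode)
               for i in range(len(event_seq) - w + 1))

seq = list("ABCABC")
print(windows_frequency(seq, ("A", "B"), 3))  # -> 3
```

A single pair of occurrences can be counted by many windows, while a non-overlapped count registers each at most once, which is exactly the kind of quantitative relationship between frequencies the unified view formalizes.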

Relevance: 100.00%

Abstract:

Frequent episode discovery is a popular framework for temporal pattern discovery in event streams. An episode is a partially ordered set of nodes, each node associated with an event type. Currently, algorithms exist for episode discovery only when the associated partial order is a total order (serial episode) or trivial (parallel episode). In this paper, we propose efficient algorithms for discovering frequent episodes with unrestricted partial orders when the associated event types are unique. These algorithms can easily be specialized to discover only serial or parallel episodes, and they are flexible enough to be specialized for mining in the space of certain interesting subclasses of partial orders. We point out that frequency alone is not a sufficient measure of interestingness in the context of partial-order mining, and propose a new interestingness measure for episodes with unrestricted partial orders which, when used along with frequency, results in an efficient data mining scheme. Simulations are presented to demonstrate the effectiveness of our algorithms.
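The notion of an episode as a partial order can be illustrated with a small occurrence checker (my own sketch, assuming the paper's setting of unique event types, so each type appears once in the episode): an occurrence maps each event type to a time, and it is valid if every precedence constraint of the partial order holds.

```python
def is_occurrence(times, partial_order):
    """times: dict mapping event type -> occurrence time.
    partial_order: iterable of (a, b) pairs meaning 'a must precede b'.
    Returns True if all precedence constraints are satisfied."""
    return all(times[a] < times[b] for a, b in partial_order)

# episode over {A, B, C}: A before C and B before C; A and B unordered
po = [("A", "C"), ("B", "C")]
print(is_occurrence({"A": 1, "B": 3, "C": 7}, po))  # -> True
print(is_occurrence({"A": 5, "B": 3, "C": 4}, po))  # -> False (A after C)
```

A serial episode is the special case where the pairs form a total order, and a parallel episode the case where the constraint set is empty.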

Relevance: 100.00%

Abstract:

Multiwavelength data indicate that the X-ray-emitting plasma in the cores of galaxy clusters is not cooling catastrophically. To a large extent, cooling is offset by heating due to active galactic nuclei (AGNs) via jets. The cool-core clusters, with cooler/denser plasmas, show multiphase gas and signs of some cooling in their cores. These observations suggest that the cool core is locally thermally unstable while maintaining global thermal equilibrium. Using high-resolution, three-dimensional simulations we study the formation of multiphase gas in cluster cores heated by collimated bipolar AGN jets. Our key conclusion is that spatially extended multiphase filaments form only when the instantaneous ratio of the thermal instability and free-fall timescales (t_TI/t_ff) falls below a critical threshold of ≈10. When this happens, dense cold gas decouples from the hot intracluster medium (ICM) phase and generates inhomogeneous and spatially extended Hα filaments. These cold gas clumps and filaments "rain" down onto the central regions of the core, forming a cold rotating torus and in part feeding the supermassive black hole. Consequently, the self-regulated feedback enhances AGN heating and the core returns to a higher entropy level with t_TI/t_ff > 10. Eventually, the core reaches quasi-stable global thermal equilibrium, and cold filaments condense out of the hot ICM whenever t_TI/t_ff ≲ 10. This occurs despite the fact that the energy from AGN jets is supplied to the core in a highly anisotropic fashion. The effective spatial redistribution of heat is enabled in part by the turbulent motions in the wake of freely falling cold filaments. Increased AGN activity can locally reverse the cold gas flow, launching cold filamentary gas away from the cluster center. Our criterion for the condensation of spatially extended cold gas is in agreement with observations and previous idealized simulations.
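The t_TI/t_ff criterion can be made concrete with an order-of-magnitude estimate. All numbers below are my own illustrative assumptions for cool-core ICM conditions, not values from the simulations, and the cooling time is used as a proxy for t_TI (the two are proportional):

```python
import math

# illustrative cool-core ICM values (assumptions, not the paper's data)
k_B = 1.381e-16     # Boltzmann constant, erg/K
r = 3.1e22          # radius ~10 kpc, in cm
g = 5e-8            # assumed local gravitational acceleration, cm/s^2
n_e = 0.05          # electron number density, cm^-3
T = 3e7             # gas temperature, K
Lam = 1e-23         # assumed cooling function value, erg cm^3 / s

t_ff = math.sqrt(2 * r / g)                 # free-fall time from radius r
# t_cool ~ 3 n_e k_B T / (n_e n_H Lam), taking n_H ~ n_e for simplicity
t_cool = 3 * n_e * k_B * T / (n_e ** 2 * Lam)
ratio = t_cool / t_ff
# condensation of extended cold filaments is expected once this
# (t_TI ∝ t_cool) to t_ff ratio drops below ~10
```

With these assumed values the ratio comes out a little above 20, i.e. a core that is globally stable but would start raining cold gas if the density rose or the temperature fell by a modest factor.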

Relevance: 100.00%

Abstract:

Frequent episode discovery is a popular framework for pattern discovery from sequential data. It has found many applications in domains such as alarm management in telecommunication networks, fault analysis in manufacturing plants, and predicting user behavior in web click streams. In this paper, we address the discovery of serial episodes. In the episode context, there are multiple ways to quantify the frequency of an episode, and most current algorithms for episode discovery under the various frequencies are apriori-based, level-wise methods. These methods essentially perform a breadth-first search of the pattern space. However, there are currently no depth-first methods of pattern discovery in the frequent episode framework under many of the frequency definitions. In this paper, we try to bridge this gap. We provide new depth-first algorithms for serial episode discovery under the non-overlapped and total frequencies. Under non-overlapped frequency, we present algorithms that can handle span and gap constraints on episode occurrences; under total frequency, we present an algorithm that can handle a span constraint. We provide proofs of correctness for the proposed algorithms and demonstrate their effectiveness through extensive simulations. We also give detailed run-time comparisons with the existing apriori-based methods and illustrate scenarios under which the proposed pattern-growth algorithms perform better than their apriori counterparts. (C) 2013 Elsevier B.V. All rights reserved.
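As a toy illustration of the gap constraint mentioned above (a simplified single-episode sketch of my own, not the paper's pattern-growth algorithm): non-overlapped occurrences can be counted left to right over timestamped events, discarding any partial match whose inter-event gap exceeds a bound.

```python
def count_non_overlapped_gap(events, episode, max_gap):
    """events: time-ordered list of (time, event_type) pairs.
    Count non-overlapped occurrences of the serial episode in which
    consecutive matched events are at most max_gap apart in time."""
    count, pos, last_t = 0, 0, None
    for t, ev in events:
        if pos > 0 and t - last_t > max_gap:
            pos, last_t = 0, None          # gap violated: drop partial match
        if ev == episode[pos]:
            pos, last_t = pos + 1, t
            if pos == len(episode):        # completed one occurrence
                count, pos, last_t = count + 1, 0, None
    return count

ev = [(1, "A"), (2, "B"), (9, "C"), (10, "A"), (11, "B"), (12, "C")]
print(count_non_overlapped_gap(ev, ("A", "B", "C"), max_gap=3))  # -> 1
```

With max_gap=3 the first A-B pair is abandoned when C arrives 7 time units later, so only the tightly spaced second occurrence counts; relaxing the gap bound recovers both occurrences.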