862 results for World Wide Web
Abstract:
Hollywood has dominated the global film business since the First World War. Economic formulas used by governments to assess levels of industry dominance typically measure market share to establish the degree of industry concentration. The business literature reveals that a marketing orientation strongly correlates with superior market performance, and that market leaders possessing a set of six superior marketing capabilities are able to continually outperform rival firms. This paper argues that the historical evidence shows that the Hollywood Majors have consistently outperformed rival firms and rival film industries in each of those six marketing capabilities, and that unless rivals develop a similarly integrated and cohesive strategic marketing management approach to the movie business and match the Major studios’ superior capabilities, Hollywood’s dominance will continue. This paper also proposes that in cyberspace, whilst the Internet does provide a channel that democratises film distribution, the flat landscape of the World Wide Web means that, in order to stand out from the clutter of millions of cyber-voices seeking attention, independent film companies need to possess superior strategic marketing management capabilities and develop effective e-marketing strategies to find a niche, attract a loyal online audience and prosper. However, mirroring a recent CIA report forecasting a multi-polar world economy, this paper also argues that potentially serious longer-term rivals are emerging and will increasingly take a larger slice of an expanding global box office as India, China and other major developing economies, and their respective cultural channels, grow and achieve economic parity with or surpass the advanced western economies. Thus, in terms of global market share over time, Hollywood’s slice of the pie will comparatively diminish in an emerging multi-polar movie business.
Abstract:
The Rudd Labor Government rode to power in Australia on the promise of an 'education revolution'. The term 'education revolution' carries all the obligatory marketing metaphors that an aspirant government might want recognised by the general public on the eve of coming to power; however, in revolutionary terms it fades into insignificance in comparison to the real revolution in Australian education. This revolution, simply put, is the elevation of Indigenous Knowledge Systems in Australian universities. In the forty-three years since the nation-setting Referendum of 1967, a generation has made a beachhead on the educational landscape. Now a further generation, having made it into the field of higher degrees, yearns for the ways and means to authentically marshal Indigenous knowledge. The Institute of Koorie Education at Deakin has for over twenty years not only witnessed this transition but has also been a leader in the field. With the appointment of two Chairs of Indigenous Knowledge Systems to build on its already established research profile, the Institute has moved towards what is the 'real revolution' in education – the elevation of Indigenous Knowledge as a legitimate knowledge system. This paper lays out the Institute of Koorie Education's Research Plan and the basis of an argument put to the academy that will be the driver for this pursuit.
Abstract:
While critical success factors (CSFs) of enterprise system (ES) implementation are mature concepts and have received considerable attention for over a decade, researchers have very often focused on only a specific aspect of the implementation process or a specific CSF. As a result, there is (1) little documented research that encompasses all significant CSF considerations and (2) little empirical research into the important factors of successful ES implementation. This paper is part of a larger research effort that aims to contribute to understanding the phenomenon of ES CSFs, and reports on preliminary findings from a case study conducted at the Queensland University of Technology (QUT) in Australia. The paper presents an empirically derived CSFs framework built using a directed content analysis of 79 studies from top IS outlets, employing the characteristics of analytic theory, and drawing on six different projects implemented at QUT.
Abstract:
While the information services function's (ISF) service quality is not a new concept and has received considerable attention for over two decades, cross-cultural research into ISF service quality is not very mature. The author argues that the relationship between cultural dimensions and the ISF's service quality dimensions may provide useful insights into how organisations should deal with different cultural groups. This paper shows that ISF service quality dimensions vary from one culture to another. The study adopts Hofstede's (1980, 1991) typology of cultures and the “zones of tolerance” (ZOT) service quality measure reported by Kettinger & Lee (2005) as its primary theory base. The author hypothesised and tested the influences of culture on users' service quality perceptions and found strong empirical support for the study's hypotheses. The results indicate that, as a consequence of their cultural characteristics, users vary both in their overall service quality perceptions and in their perceptions of each of the four dimensions of ZOT service quality.
Abstract:
This conference celebrates the passing of 40 years since the establishment of the Internet (dating this, presumably, to the first connection between two nodes on ARPANET in October 1969). For a gathering of media scholars such as this, however, it may be just as important not only to mark the first testing of the core technologies upon which much of our present‐day Net continues to build, but also to reflect on another recent milestone: the 20th anniversary of what is today arguably the chief interface through which billions around the world access and experience the Internet – the World Wide Web, launched by Tim Berners‐Lee in 1989.
Abstract:
To date, much work has been done to examine the ways in which information literacy (IL) – a way of thinking about, existing alongside and working with information – functions in an academic setting. However, its role in the non-academic library professions has been largely ignored. Given that the public librarian is responsible for designing and delivering services and programmes aimed at supporting the information literacy needs of the community at large, there is great value to be had from examining the ways in which public libraries understand and experience IL. The research described in this paper investigates, through the use of phenomenography, the ways in which public librarians understand and experience the concept of information literacy.
Abstract:
In a resource-constrained business world, strategic choices must be made on process improvement and service delivery. There are calls for more agile forms of enterprise, and much effort is being directed at moving organizations from a complex landscape of disparate application systems to an integrated and flexible enterprise accessing complex systems landscapes through a service-oriented architecture (SOA). This paper describes the analysis of strategies to detect supporting business services. These services can then be delivered in a variety of ways: web services, new application services or outsourced services. The focus of this paper is on strategy analysis to identify those strategies that are common to lines of business and thus can be supported through shared services. A case study of a state government is used to show the analytical method and the detection of shared strategies.
Abstract:
In their quest for resources to support children’s early literacy learning and development, parents encounter and traverse different spaces in which discourses and artifacts are produced and circulated. This paper uses conceptual tools from the field of geosemiotics to examine some commercial spaces designed for parents and children which foreground preschool learning and development. Drawing on data generated in a wider study, I discuss some of the ways in which the material and virtual commercial spaces of a transnational shopping mall company and an educational toy company operate as sites of encounter between discourses and artifacts about children’s early learning and parents of preschoolers. I consider how companies connect with and ‘situate’ people as parents and customers, and then offer pathways designed for parents to follow as they attempt to meet their very young children’s learning and development needs. I argue that these pathways are both material and ideological, and that they are increasingly tending to lead parents to the online commercial spaces of the World Wide Web. I show how companies are using the online environment, and hybrid offline and online spaces and flows, to reinforce an image of themselves as authoritative brokers of childhood resources for parents – an image that is highly valuable in a policy climate which foregrounds lifelong learning and school readiness.
Abstract:
Complex networks have been studied extensively due to their relevance to many real-world systems such as the world-wide web, the internet, biological and social systems. During the past two decades, studies of such networks in different fields have produced many significant results concerning their structures, topological properties, and dynamics. Three well-known properties of complex networks are scale-free degree distribution, small-world effect and self-similarity. The search for additional meaningful properties and the relationships among these properties is an active area of current research. This thesis investigates a newer aspect of complex networks, namely their multifractality, which is an extension of the concept of self-similarity. The first part of the thesis aims to confirm that the study of properties of complex networks can be expanded to a wider field including more complex weighted networks. Those real networks that have been shown to possess the self-similarity property in the existing literature are all unweighted networks. We use the protein–protein interaction (PPI) networks as a key example to show that their weighted networks inherit the self-similarity from the original unweighted networks. Firstly, we confirm that the random sequential box-covering algorithm is an effective tool to compute the fractal dimension of complex networks. This is demonstrated on the Homo sapiens and E. coli PPI networks as well as their skeletons. Our results verify that the fractal dimension of the skeleton is smaller than that of the original network because the shortest distance between nodes is larger in the skeleton; hence, for a fixed box size, more boxes will be needed to cover the skeleton. Then we adopt the iterative scoring method to generate weighted PPI networks of five species, namely Homo sapiens, E. coli, yeast, C. elegans and Arabidopsis thaliana.
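The random sequential box-covering procedure referred to above can be sketched as follows. This is a minimal illustration of the general technique, assuming an adjacency-set representation of an unweighted graph; the function names and the toy usage are our own, not code from the thesis:

```python
import math
import random
from collections import deque

def ball(adj, centre, radius):
    """All nodes within `radius` hops of `centre` (breadth-first search)."""
    dist = {centre: 0}
    queue = deque([centre])
    while queue:
        node = queue.popleft()
        if dist[node] < radius:
            for nb in adj[node]:
                if nb not in dist:
                    dist[nb] = dist[node] + 1
                    queue.append(nb)
    return set(dist)

def box_count(adj, box_size):
    """Random sequential box covering: repeatedly pick a random still-uncovered
    node as a centre and cover every node within distance box_size - 1 of it."""
    uncovered = set(adj)
    boxes = 0
    while uncovered:
        centre = random.choice(sorted(uncovered))
        uncovered -= ball(adj, centre, box_size - 1)
        boxes += 1
    return boxes

def box_dimension(adj, sizes):
    """Estimate the fractal (box) dimension d_B as minus the least-squares
    slope of log N_B against log l_B over the given box sizes."""
    xs = [math.log(s) for s in sizes]
    ys = [math.log(box_count(adj, s)) for s in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return -num / den
```

For a fractal network the box count scales as N_B ~ l_B^(-d_B); since the covering depends on the random order of centres, averaging box_count over many runs gives a more stable estimate.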
By using the random sequential box-covering algorithm, we calculate the fractal dimensions for both the original unweighted PPI networks and the generated weighted networks. The results show that self-similarity is still present in the generated weighted PPI networks. This implication will be useful for our treatment of the networks in the third part of the thesis. The second part of the thesis aims to explore the multifractal behaviour of different complex networks. Fractals such as the Cantor set, the Koch curve and the Sierpinski gasket are homogeneous, since these fractals consist of a geometrical figure which repeats on an ever-reduced scale. Fractal analysis is a useful method for their study. However, real-world fractals are not homogeneous; there is rarely an identical motif repeated on all scales. Their singularity may vary on different subsets, implying that these objects are multifractal. Multifractal analysis is a useful way to systematically characterize the spatial heterogeneity of both theoretical and experimental fractal patterns. However, the tools for multifractal analysis of objects in Euclidean space are not suitable for complex networks. In this thesis, we propose a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is demonstrated in the computation of the generalized fractal dimensions of some theoretical networks, namely scale-free networks, small-world networks and random networks, and a class of real networks, namely the PPI networks of different species. Our main finding is the existence of multifractality in scale-free networks and PPI networks, while the multifractal behaviour is not confirmed for small-world networks and random networks. As another application, we generate gene interaction networks for patients and healthy people using the correlation coefficients between microarrays of different genes. Our results confirm the existence of multifractality in gene interaction networks.
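For reference, the generalized fractal dimensions that such a box-covering scheme estimates are conventionally defined by the standard multifractal formalism (the notation here is the textbook one, not necessarily the thesis's own):

```latex
D_q = \lim_{l \to 0} \frac{1}{q-1}\,\frac{\ln \sum_i p_i^{\,q}(l)}{\ln l}, \qquad q \neq 1,
```

where $p_i(l)$ is the fraction of the measure (for a network, the fraction of nodes) falling in box $i$ at box size $l$; for networks, the $l \to 0$ limit is replaced by the scaling behaviour over a suitable range of box sizes. A monofractal has $D_q$ constant in $q$, whereas a nontrivial dependence of $D_q$ on $q$ signals multifractality.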
This multifractal analysis then provides a potentially useful tool for gene clustering and identification. The third part of the thesis aims to investigate the topological properties of networks constructed from time series. Characterizing complicated dynamics from time series is a fundamental problem of continuing interest in a wide variety of fields. Recent works indicate that complex network theory can be a powerful tool to analyse time series. Many existing methods for transforming time series into complex networks share a common feature: they define the connectivity of a complex network by the mutual proximity of different parts (e.g., individual states, state vectors, or cycles) of a single trajectory. In this thesis, we propose a new method to construct networks from time series: we define nodes as vectors of a certain length in the time series, and the weight of the edge between any two nodes as the Euclidean distance between the corresponding two vectors. We apply this method to build networks for fractional Brownian motions, whose long-range dependence is characterised by their Hurst exponent. We verify the validity of this method by showing that time series with stronger correlation, hence larger Hurst exponent, tend to have smaller fractal dimension, hence smoother sample paths. We then construct networks via the technique of the horizontal visibility graph (HVG), which has been widely used recently. We confirm a known linear relationship between the Hurst exponent of fractional Brownian motion and the fractal dimension of the corresponding HVG network. In the first application, we apply our newly developed box-covering algorithm to calculate the generalized fractal dimensions of the HVG networks of fractional Brownian motions as well as those for binomial cascades and five bacterial genomes. The results confirm the monoscaling of fractional Brownian motion and the multifractality of the rest.
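The horizontal visibility criterion has a simple definition: two points of the series see each other if every point strictly between them lies below both. A minimal sketch of the construction (our own illustrative code, not the thesis implementation):

```python
def horizontal_visibility_graph(series):
    """Edges (i, j), i < j, of the horizontal visibility graph of a series:
    i and j are linked when every value strictly between them is smaller
    than both series[i] and series[j]."""
    n = len(series)
    edges = set()
    for i in range(n - 1):
        edges.add((i, i + 1))        # consecutive points always see each other
        barrier = series[i + 1]      # highest value between i and j so far
        for j in range(i + 2, n):
            if barrier >= series[i]:
                break                # the view from i is permanently blocked
            if series[j] > barrier:  # j rises above everything in between
                edges.add((i, j))
            barrier = max(barrier, series[j])
    return edges
```

The degree distribution of the resulting graph is what carries the signature of the underlying dynamics, which is why its tail behaviour distinguishes, for example, correlated from uncorrelated series.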
As an additional application, we discuss the resilience of networks constructed from time series via two different approaches: the visibility graph (VG) and the horizontal visibility graph. Our finding is that the degree distribution of VG networks of fractional Brownian motions is scale-free (i.e., it follows a power law), meaning that one needs to destroy a large percentage of nodes before the network collapses into isolated parts; while for HVG networks of fractional Brownian motions, the degree distribution has exponential tails, implying that HVG networks would not survive the same kind of attack.
Abstract:
Information Technology (IT) education is in crisis. Enrolments have dropped by as much as 70% at some universities (Markoff, 2009). This, coupled with traditionally high attrition and failure rates (Biggers et al., 2008), is resulting in the number of graduates nationwide being far lower than industry demand (Queensland Government SkillsInfo Report, 2009). This work reports on a radical redesign of the Bachelor of IT degree at QUT. The initial results are very promising, with first-year attrition dropping from being one of the highest at QUT for an undergraduate degree to being one of the lowest. The redesign followed an action research model to reflect on issues and problems with the previous version of the degree and to introduce changes to rectify some of these problems. The resulting degree aims to produce "business savvy" graduates who are capable of using their IT knowledge and skills within cross-functional teams to solve complex problems.
Abstract:
Process modelling – the design and use of graphical documentation of an organisation’s business processes – is a key method for documenting and using information about business processes in organisational projects. Despite current interest in process modelling, this area of study still faces essential challenges. One of the key unanswered questions concerns the impact of process modelling in organisational practice. Process modelling initiatives call for tangible results in the form of returns on the substantial investments that organisations undertake to achieve improved processes. This study explores the impact of process model use on end users and its contribution to organisational success. We posit that the use of conceptual models creates impact in organisational process teams. We also report on a set of case studies in which we explore tentative evidence for the development of impact through process model use. The results of this work provide a better understanding of the impact of process modelling on information practices, and also lead to insights into how organisations should conduct process modelling initiatives in order to achieve an optimum return on their investment.