372 results for Maximum independent set
Abstract:
As is widely known, the mass media in China are state-owned, and television stations are no exception, belonging to this enormous state-owned system. To date, however, with the economic reform of the broadcasting system and China's entry into the WTO, the television industry has grown considerably and the television market has matured amid ever-increasing competition. The players in China's television industry have shifted from the monologue of TV stations to the multiple roles of TV stations, production companies and overseas television companies, although TV stations still make up the majority of China's TV market. Private television production companies in particular are becoming more and more active in this market. In this paper, I describe the development process and challenges of this group in China and ask what the emergence of this group means for China's TV industry as a whole.
Abstract:
Beginning around 2003, television studies has seen the growth of interest in the genre of reality shows. However, concentrating on this genre has tended to sideline the even more significant emergence of the program format as a central mode of business and culture in the new television landscape. "Localizing Global TV" redresses this balance, and heralds the emergence of an important, exciting and challenging area of television studies. Topics explored include reality TV, makeover programs, sitcoms, talent shows and fiction serials, as well as broadcaster management policies, production decision chains and audience participation processes. This seminal work will be of considerable interest to media scholars internationally.
Abstract:
This study is the first to investigate the effect of prolonged reading on reading performance and visual functions in students with low vision. The study focuses on one of the most common modes of achieving adequate magnification for reading by students with low vision, their close reading distance (proximal or relative distance magnification). Close reading distances impose high demands on near visual functions, such as accommodation and convergence. Previous research on accommodation in children with low vision shows that their accommodative responses are reduced compared to normal vision. In addition, there is an increased lag of accommodation for higher stimulus levels as may occur at close reading distance. Reduced accommodative responses in low vision and higher lag of accommodation at close reading distances together could impact on reading performance of students with low vision especially during prolonged reading tasks. The presence of convergence anomalies could further affect reading performance. Therefore, the aims of the present study were 1) To investigate the effect of prolonged reading on reading performance in students with low vision 2) To investigate the effect of prolonged reading on visual functions in students with low vision. This study was conducted as cross-sectional research on 42 students with low vision and a comparison group of 20 students with normal vision, aged 7 to 20 years. The students with low vision had vision impairments arising from a range of causes and represented a typical group of students with low vision, with no significant developmental delays, attending school in Brisbane, Australia. All participants underwent a battery of clinical tests before and after a prolonged reading task. 
An initial reading-specific history and pre-task measurements that included Bailey-Lovie distance and near visual acuities, Pelli-Robson contrast sensitivity, ocular deviations, sensory fusion, ocular motility, near point of accommodation (pull-away method), accuracy of accommodation (Monocular Estimation Method (MEM)) retinoscopy and Near Point of Convergence (NPC) (push-up method) were recorded for all participants. Reading performance measures were Maximum Oral Reading Rates (MORR), Near Text Visual Acuity (NTVA) and acuity reserves using Bailey-Lovie text charts. Symptoms of visual fatigue were assessed using the Convergence Insufficiency Symptom Survey (CISS) for all participants. Pre-task measurements of reading performance and accuracy of accommodation and NPC were compared with post-task measurements, to test for any effects of prolonged reading. The prolonged reading task involved reading a storybook silently for at least 30 minutes. The task was controlled for print size, contrast, difficulty level and content of the reading material. Silent Reading Rate (SRR) was recorded every 2 minutes during prolonged reading. Symptom scores and visual fatigue scores were also obtained for all participants. A visual fatigue analogue scale (VAS) was used to assess visual fatigue during the task, once at the beginning, once at the middle and once at the end of the task. In addition to the subjective assessments of visual fatigue, tonic accommodation was monitored using a photorefractor (PlusoptiX CR03™) every 6 minutes during the task, as an objective assessment of visual fatigue. Reading measures were done at the habitual reading distance of students with low vision and at 25 cms for students with normal vision. The initial history showed that the students with low vision read for significantly shorter periods at home compared to the students with normal vision. 
The working distances of participants with low vision ranged from 3-25 cms and half of them were not using any optical devices for magnification. Nearly half of the participants with low vision were able to resolve 8-point print (1M) at 25 cms. Half of the participants in the low vision group had ocular deviations and suppression at near. Reading rates were significantly reduced in students with low vision compared to those of students with normal vision. In addition, there were a significantly larger number of participants in the low vision group who could not sustain the 30-minute task compared to the normal vision group. However, there were no significant changes in reading rates during or following prolonged reading in either the low vision or normal vision groups. Individual changes in reading rates were independent of their baseline reading rates, indicating that the changes in reading rates during prolonged reading cannot be predicted from a typical clinical assessment of reading using brief reading tasks. Contrary to previous reports the silent reading rates of the students with low vision were significantly lower than their oral reading rates, although oral and silent reading was assessed using different methods. Although the visual acuity, contrast sensitivity, near point of convergence and accuracy of accommodation were significantly poorer for the low vision group compared to those of the normal vision group, there were no significant changes in any of these visual functions following prolonged reading in either group. Interestingly, a few students with low vision (n =10) were found to be reading at a distance closer than their near point of accommodation. This suggests a decreased sensitivity to blur. 
Further evaluation revealed that the equivalent intrinsic refractive errors (an estimate of the spherical dioptric defocus which would be expected to yield a patient’s visual acuity in normal subjects) were significantly larger for the low vision group compared to those of the normal vision group. As expected, accommodative responses were significantly reduced for the low vision group compared to the expected norms, which is consistent with their close reading distances, reduced visual acuity and contrast sensitivity. For those in the low vision group who had an accommodative error exceeding their equivalent intrinsic refractive errors, a significant decrease in MORR was found following prolonged reading. The silent reading rates, however, were not significantly affected by accommodative errors in the present study. Suppression also had a significant impact on the changes in reading rates during prolonged reading. The participants who did not have suppression at near showed significant decreases in silent reading rates during and following prolonged reading. This impact of binocular vision at near on prolonged reading was possibly due to the high demands on convergence. The significant predictors of MORR in the low vision group were age, NTVA, reading interest and reading comprehension, accounting for 61.7% of the variance in MORR. SRR was not significantly influenced by any factors, except for the duration of the reading task sustained; participants with higher reading rates were able to sustain a longer reading duration. In students with normal vision, age was the only predictor of MORR. Participants with low vision also reported significantly greater visual fatigue compared to the normal vision group. Measures of tonic accommodation, however, were little influenced by visual fatigue in the present study. Visual fatigue analogue scores were found to be significantly associated with reading rates in students with low vision and normal vision.
However, the patterns of association between visual fatigue and reading rates were different for SRR and MORR. The participants with low vision with higher symptom scores had lower SRRs and participants with higher visual fatigue had lower MORRs. As hypothesized, visual functions such as accuracy of accommodation and convergence did have an impact on prolonged reading in students with low vision, for students whose accommodative errors were greater than their equivalent intrinsic refractive errors, and for those who did not suppress one eye. Those students with low vision who have accommodative errors higher than their equivalent intrinsic refractive errors might significantly benefit from reading glasses. Similarly, considering prisms or occlusion for those without suppression might reduce the convergence demands in these students while using their close reading distances. The impact of these prescriptions on reading rates, reading interest and visual fatigue is an area of promising future research. Most importantly, it is evident from the present study that a combination of factors such as accommodative errors, near point of convergence and suppression should be considered when prescribing reading devices for students with low vision. Considering these factors would also assist rehabilitation specialists in identifying those students who are likely to experience difficulty in prolonged reading, which is otherwise not reflected during typical clinical reading assessments.
Abstract:
Patients with chest discomfort or other symptoms suggestive of acute coronary syndrome (ACS) are one of the most common categories seen in many Emergency Departments (EDs). While the recognition of patients at high-risk of ACS has improved steadily, identifying the majority of chest pain presentations who fall into the low-risk group remains a challenge. Research in this area needs to be transparent, robust, applicable to all hospitals from large tertiary centres to rural and remote sites, and to allow direct comparison between different studies with minimum patient spectrum bias. A standardised approach to the research framework using a common language for data definitions must be adopted to achieve this. The aim was to create a common framework for a standardised data definitions set that would allow maximum value when extrapolating research findings both within Australasian ED practice, and across similar populations worldwide. Therefore a comprehensive data definitions set for the investigation of non-traumatic chest pain patients with possible ACS was developed, specifically for use in the ED setting. This standardised data definitions set will facilitate ‘knowledge translation’ by allowing extrapolation of useful findings into the real-life practice of emergency medicine.
Abstract:
Accessibility to housing for low to moderate income groups in Australia has been experiencing a severe decline since 2001. On the supply side, the public sector has been reducing its commitment to the direct provision of public housing. Despite high demand for affordable housing, there has been limited supply generated by non-government housing providers. One possible solution to promote an increase in affordable housing supply, like other infrastructure, is through the development of multi-stakeholder partnerships and private financing. This research aims to identify current issues underlying decision-making criteria for building multi-stakeholder partnerships to deliver affordable housing projects. It also investigates strategies for minimising risk and ensuring the financial outcomes of these partnership arrangements. A mix of qualitative in-depth interviews and quantitative surveys has been used as the main method to explore stakeholder experiences regarding their involvement in partnership arrangements in the affordable housing sector in Queensland. Two sets of interviews were conducted following an exploratory pilot study: one set in 2003-2004 and the other in 2007-2008. There were nineteen respondents representing government, private and not-for-profit organisations in the first stage interviews and surveys. The second stage interviews were focussed on twenty-two housing providers in South East Queensland. Initial analyses have been conducted using thematic and statistical analyses. This study extends the use of existing decision making tools and combines the use of a Soft System Framework to analyse the ideal state questionnaires using qualitative thematic analysis. Soft System Methodology (SSM) has been used to analyse this unstructured complex problem by using systematic thinking to develop a conceptual model and carrying it to the real world situations to solve the problem. 
This research found that the diversity of stakeholder capability and their level of risk acceptance will allow partnerships to develop the best synergies and a degree of collaboration which achieves the required financial return within acceptable risk parameters. However, some of the negativity attached to future commitment to such partnerships has been found to be the anticipation of a worse outcome than that expected from independent action. Many interviewees agree that housing providers' fear of financial risk and community rejection has been central to dampening their enthusiasm for entering such investment projects. The creation of a mixed-use development structure will mitigate both risk and return as the commercial income will subsidise the affordable housing development and will normalise concentration of marginalised low-income people who live in a prime location with an award winning design. In addition, tenant support schemes and rent-to-buy incentive programs will encourage them to secure their tenancies and significantly reduce the risk of rent arrears and property damage. There is also a breakthrough investment vehicle offered by the social developer which sells the non-physical but financial product to individual and institutional investors to mitigate further financial risk. Finally, this study recommends modification of the current value-for-money framework in favour of broader partnership arrangements which are more closely aligned with risk minimisation strategies.
Abstract:
Although the branding literature commenced during the 1940s, the first publications related to destination branding did not emerge until half a century later. A review of 74 destination branding publications by 102 authors from the first 10 years of destination branding literature (1998-2007) found at least nine potential research gaps warranting attention by researchers. In particular, there has been a lack of research examining the extent to which brand positioning campaigns have been successful in enhancing brand equity in the manner intended in the brand identity. The purpose of this paper is to report the results of an investigation of brand equity tracking for a competitive set of destinations in Queensland, Australia between 2003 and 2007. A hierarchy of consumer-based brand equity (CBBE) provided an effective means to monitor destination brand positions over time. A key implication of the results was the finding that there was no change in brand positions for any of the five destinations over the four year period. This leads to the proposition that destination position change within a competitive set will only occur slowly over a long period of time. The tabulation of 74 destination branding case studies, research papers, conceptual papers and web content analyses provides students and researchers with a useful resource on the current state of the field.
Abstract:
An experimental investigation has been made of a round, non-buoyant plume of nitric oxide, NO, in a turbulent grid flow of ozone, O3, using the Turbulent Smog Chamber at the University of Sydney. The measurements have been made at a resolution not previously reported in the literature. The reaction is conducted at non-equilibrium so there is significant interaction between turbulent mixing and chemical reaction. The plume has been characterized by a set of constant initial reactant concentration measurements consisting of radial profiles at various axial locations. Whole-plume behaviour can thus be characterized, and parameters are selected for a second set of fixed physical location measurements where the effects of varying the initial reactant concentrations are investigated. Careful experiment design and specially developed chemiluminescent analysers, which measure fluctuating concentrations of reactive scalars, ensure that spatial and temporal resolutions are adequate to measure the quantities of interest. Conserved scalar theory is used to define a conserved scalar from the measured reactive scalars and to define frozen, equilibrium and reaction-dominated cases for the reactive scalars. Reactive scalar means and the mean reaction rate are bounded by the frozen and equilibrium limits but this is not always the case for the reactant variances and covariances. The plume reactant statistics are closer to the equilibrium limit than those for the ambient reactant. The covariance term in the mean reaction rate is found to be negative and significant for all measurements made. The Toor closure was found to overestimate the mean reaction rate by 15 to 65%. Gradient-model turbulent diffusivities had significant scatter and were not observed to be affected by reaction. The ratio of turbulent diffusivities for the conserved scalar mean and that for the r.m.s. was found to be approximately 1.
Estimates of the ratio of the dissipation timescales of around 2 were found downstream. Estimates of the correlation coefficient between the conserved scalar and its dissipation (parallel to the mean flow) were found to be between 0.25 and the significant value of 0.5. Scalar dissipations for non-reactive and reactive scalars were found to be significantly different. Conditional statistics are found to be a useful way of investigating the reactive behaviour of the plume, effectively decoupling the interaction of chemical reaction and turbulent mixing. It is found that conditional reactive scalar means lack significant transverse dependence as has previously been found theoretically by Klimenko (1995). It is also found that conditional variance around the conditional reactive scalar means is relatively small, simplifying the closure for the conditional reaction rate. These properties are important for the Conditional Moment Closure (CMC) model for turbulent reacting flows recently proposed by Klimenko (1990) and Bilger (1993). Preliminary CMC model calculations are carried out for this flow using a simple model for the conditional scalar dissipation. Model predictions and measured conditional reactive scalar means compare favorably. The reaction dominated limit is found to indicate the maximum reactedness of a reactive scalar and is a limiting case of the CMC model. Conventional (unconditional) reactive scalar means obtained from the preliminary CMC predictions using the conserved scalar p.d.f. compare favorably with those found from experiment except where measuring position is relatively far upstream of the stoichiometric distance. Recommendations include applying a full CMC model to the flow and investigations both of the less significant terms in the conditional mean species equation and the small variation of the conditional mean with radius. 
Forms for the p.d.f.s, in addition to those found from experiments, could be useful for extending the CMC model to reactive flows in the atmosphere.
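The covariance term and the Toor closure referred to above can be stated compactly. The following is a sketch in standard textbook notation, not quoted from the thesis: for a single-step second-order reaction such as NO + O3 with rate constant k and instantaneous reactant concentrations a and b, the mean reaction rate decomposes as

```latex
% Reynolds decomposition of the mean reaction rate, A + B -> products:
\overline{w} \;=\; k\,\overline{ab}
            \;=\; k\left(\overline{a}\,\overline{b} \;+\; \overline{a'b'}\right)
```

Toor's closure estimates the covariance term from the statistics of inert (frozen) mixing of the conserved scalar. A negative covariance, as measured in this flow, means reactant segregation lowers the mean rate below the well-mixed value, so a closure that underestimates the magnitude of this negative term will overestimate the mean reaction rate, consistent with the 15 to 65% overestimate reported above.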
Abstract:
Different terminologies have been used to characterize Chinese independent cinema in the 1990s. These definitions focus on experimental practices outside the official production system and independent of official ideology. The film industry has undergone distinctive development since China's entry into the WTO in 2001: private investors have played an essential role in the cinematic economy; strict censorship has been noticeably relaxed; and the film industry is dividing into two opposing extremes. It is therefore necessary to give a new definition of Chinese independent cinema. The definition of independent cinema in today's China that I suggest, in light of the American notion of independence, is that any film that has not been financed, produced and distributed by the majors is independent. At least four corporations count as majors in the Chinese film industry: China Film Group Corporation, Huayi Brothers Corporation, PolyBona Film Distribution Corporation and Shanghai Film Group Corporation. Apart from these four majors, all other film production or distribution companies are independents.
Abstract:
Case note of Leighton Contractors Pty Ltd v Fox (2009) 258 ALR 673 ----- In Leighton Contractors Pty Ltd v Fox (2009) 83 ALJR 1086 ; 258 ALR 673 the High Court considered the liability of a principal contractor for the negligence of independent subcontractors on a building site. In its decision, the court considered the nature and the scope of the duty owed by principals to independent contractors.
Abstract:
The structure of the 1:1 proton-transfer compound from the reaction of L-tartaric acid with the azo-dye precursor aniline yellow [4-(phenylazo)aniline], 4-(phenyldiazenyl)anilinium hydrogen 2R,3R-tartrate, C12H12N3+ · C4H6O6-, has been determined at 200 K. The asymmetric unit of the compound contains two independent phenylazoanilinium cations and two hydrogen L-tartrate anions. The structure is unusual in that all four phenyl rings of both cations have identical 50% rotational disorder. The two hydrogen L-tartrate anions form independent but similar chains through head-to-tail carboxylic O-H...O(carboxyl) hydrogen bonds [graph set C(7)] which are then extended into a two-dimensional hydrogen-bonded sheet structure through hydroxyl O-H...O hydrogen-bonding links. The anilinium groups of the phenyldiazenyl cations are incorporated into the sheets and also provide internal hydrogen-bonding extensions, while their aromatic tails layer in the structure without significant interaction except for weak π-π interactions [minimum ring centroid separation 3.844 (3) Å]. The hydrogen L-tartrate residues of both anions have the common short intramolecular hydroxyl O-H...O(carboxyl) hydrogen bonds. This work has provided a solution to the unusual disorder problem inherent in the structure of this salt as well as giving another example of the utility of the hydrogen tartrate anion in the generation of sheet substructures in molecular assembly processes.
Abstract:
Campylobacter jejuni followed by Campylobacter coli contribute substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to large scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high resolution C. jejuni and C. coli genotyping schemes that are convenient for high throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost effective, interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat is frequently associated with human campylobacteriosis in Australia. Therefore, chicken derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi Locus Sequence Typing Single Nucleotide Polymorphisms) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. This present study showed that SNP + binary typing alone or in combination are effective at detecting epidemiological linkage between chicken derived Campylobacter isolates and enable data comparisons with other MLST based investigations. SNP + binary types obtained from chicken isolates in this study were compared with a previously SNP + binary and MLST typed set of human isolates. 
Common genotypes between the two collections of isolates were identified, and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant in the human isolates. This genotype was, however, absent in the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long-term epidemiological linkage of the isolates, the development of a High Resolution Melt (HRM) based curve analysis method to interrogate the hypervariable Campylobacter flagellin-encoding gene (flaA) is described in Aim 2 of this study. The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short-term epidemiological studies such as outbreak investigations. HRM curve analysis based flaA interrogation is a single-step, closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA-specific primers that did not amplify the flaB gene. HRM curve analysis flaA interrogation was successful at discriminating the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates and correlated with the epidemiological background of the isolates. In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM, and fits the PHRANA (Progressive Hierarchical Resolving Assays using Nucleic Acids) approach for genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method.
Therefore, flaA HRM is a rapid and cost-effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics-driven method to interrogate Campylobacter MLST gene fragments using HRM, called ‘SNP Nucleated Minim MLST’ or ‘Minim typing’. The method involves HRM interrogation of MLST fragments that encompass highly informative “Nucleating SNPs” to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using i) the “Minimum SNPs” and ii) the new ‘HRMtype’ software packages. Species-specific sets of six “Nucleating SNPs” and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. ‘Minim typing’ was tested empirically by typing 15 C. jejuni and five C. coli isolates. The association of clonal complexes (CCs) with each isolate by ‘Minim typing’ and by SNP + binary typing was used to compare the two MLST interrogation schemes. The CCs linked with each C. jejuni isolate were consistent for both methods. Thus, ‘Minim typing’ is an efficient and cost-effective method to interrogate MLST genes. However, it is not expected to be independent of, or meet the resolution of, sequence-based MLST gene interrogation. ‘Minim typing’ in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform that is amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on the unified real-time PCR and HRM platform. They provide high resolution and are simple, cost-effective and ideally suited to rapid and high-throughput genotyping for these common food-borne pathogens.
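The in-silico choice of maximally informative "Nucleating SNPs" described above is conventionally scored with Simpson's (Hunter-Gaston) index of diversity, the resolving-power measure behind tools such as "Minimum SNPs". A hypothetical sketch of that selection criterion (the allele profiles and function names below are invented for illustration and are not data from this study):

```python
from collections import Counter
from itertools import combinations

def simpsons_index(types):
    # Hunter-Gaston diversity: probability that two isolates drawn at
    # random (without replacement) have different types.
    n = len(types)
    return 1 - sum(c * (c - 1) for c in Counter(types).values()) / (n * (n - 1))

def best_snp_subset(profiles, k):
    # Exhaustively score every k-position subset and keep the one whose
    # combined alleles partition the isolate collection most finely.
    positions = range(len(profiles[0]))
    def resolution(subset):
        return simpsons_index([tuple(p[i] for i in subset) for p in profiles])
    return max(combinations(positions, k), key=resolution)

# Invented allele profiles: 6 isolates x 5 candidate SNP positions.
profiles = ["AAGTC", "AAGTA", "GAGTC", "GATTC", "AATTA", "GAGCA"]
subset = best_snp_subset(profiles, 2)   # the 2 most resolving positions
```

Exhaustive search is fine at this toy scale; for full MLST databases a greedy, incremental selection is the usual approach.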
Abstract:
Greyback canegrubs cost the Australian sugarcane industry around $13 million per annum in damage and control. A novel and cost-effective biocontrol bacterium could play an important role in the integrated pest management program currently in place to reduce damage and the associated control costs. During the course of this project, terminal restriction fragment length polymorphism (TRFLP), 16S rDNA cloning, suppressive subtractive hybridisation (SSH) and entomopathogen-specific PCR screening were used to investigate the little-studied canegrub-associated microflora in an attempt to discover novel pathogens from putatively diseased specimens. Microflora associated with these soil-dwelling insects was found to be both highly diverse and divergent between individual specimens. Dominant members detected in live specimens were predominantly from taxa of known insect symbionts, while dominant sequences amplified from dead grubs were homologous to putatively saprophytic bacteria and bacteria able to grow during refrigeration. A number of entomopathogenic bacteria were identified, such as Photorhabdus luminescens and Pseudomonas fluorescens. Dead canegrubs need to be analysed prior to decomposition if these bacteria are to be isolated. Novel strategies to enrich putative pathogen-associated sequences (SSH and PCR screening) were shown to be promising approaches for pathogen discovery and the investigation of canegrub-associated microflora. However, due to inter- and intra-grub community diversity, dead grub decomposition and PCR-specific methodological limitations (PCR bias, primer specificity, BLAST database restrictions, 16S gene copy number and heterogeneity), recommendations have been made to improve the efficiency of such techniques. Improved specimen collection procedures and utilisation of emerging high-throughput sequencing technologies may be required to examine these complex communities in more detail.
This is the first study to perform a whole-grub analysis and comparison of greyback canegrub-associated microbial communities. This work also describes the development of a novel V3-PCR based SSH technique. This was the first SSH technique to use V3-PCR products as a starting material and specifically compare bacterial species present in a complex community.
Abstract:
Parallel combinatory orthogonal frequency division multiplexing (PC-OFDM) yields a lower maximum peak-to-average power ratio (PAR), higher bandwidth efficiency and a lower bit error rate (BER) on Gaussian channels compared to OFDM systems. However, PC-OFDM does not improve the statistics of the PAR significantly. In this chapter, the use of a set of fixed permutations to improve the statistics of the PAR of a PC-OFDM signal is presented. For this technique, interleavers are used to produce K-1 permuted sequences from the same information sequence. The sequence with the lowest PAR among the K sequences is chosen for transmission. The PAR of a PC-OFDM signal can be further reduced by 3-4 dB by this technique. Mathematical expressions for the complementary cumulative distribution function (CCDF) of the PAR of a PC-OFDM signal and an interleaved PC-OFDM signal are also presented.
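The interleaving technique described above is a selected-mapping-style scheme: build K candidate orderings of the same information sequence, compute the PAR of each, and transmit the one with the lowest PAR. A minimal sketch in pure Python (the parameters N = 16 subcarriers and K = 4 candidates, the QPSK alphabet and the function names are illustrative assumptions, not taken from the chapter):

```python
import cmath
import random

def idft(freq):
    # Time-domain OFDM symbol from the frequency-domain subcarrier symbols.
    n = len(freq)
    return [sum(freq[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]

def par(freq):
    # Peak-to-average power ratio of the corresponding time-domain signal.
    powers = [abs(v) ** 2 for v in idft(freq)]
    return max(powers) / (sum(powers) / len(powers))

def select_lowest_par(symbols, interleavers):
    # Candidates: the identity ordering plus K-1 fixed permutations of the
    # same information sequence; transmit whichever has the lowest PAR.
    candidates = [symbols] + [[symbols[i] for i in p] for p in interleavers]
    return min(candidates, key=par)

random.seed(1)
N, K = 16, 4                          # illustrative sizes only
qpsk = [random.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) for _ in range(N)]
interleavers = []
for _ in range(K - 1):                # K-1 fixed, pre-agreed permutations
    p = list(range(N))
    random.shuffle(p)
    interleavers.append(p)

best = select_lowest_par(qpsk, interleavers)
assert par(best) <= par(qpsk)         # never worse than the original ordering
```

In a real transmitter the index of the chosen interleaver must be signalled as side information so the receiver can undo the permutation; the CCDF expressions mentioned in the abstract quantify how taking the minimum over K candidates shifts the PAR statistics.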
Abstract:
The state-owned media system in China has evolved considerably since 1994, when the first independent TV production company was officially registered. Today, there are thousands of independent TV production companies looking for market opportunities in China. Independent production companies have facilitated the circulation of program trade and investment, and in the process have encouraged innovation and professionalization. This paper focuses on the evolution of independents and the changing face of the television market. It discusses the ecology of independent television companies in China and how government regulations are impacting on the TV production market. It argues that independent TV is providing a new strength for China's TV market, one often suspected of being imitative, propagandistic and lacking colour.
Abstract:
Hollywood has dominated the global film business since the First World War. Economic formulas used by governments to assess levels of industry dominance typically measure market share to establish the degree of industry concentration. The business literature reveals that a marketing orientation strongly correlates with superior market performance and that market leaders that possess a set of six superior marketing capabilities are able to continually outperform rival firms. This paper argues that the historical evidence shows that the Hollywood Majors have consistently outperformed rival firms and rival film industries in each of those six marketing capabilities and that unless rivals develop a similarly integrated and cohesive strategic marketing management approach to the movie business and match the Major studios’ superior capabilities, then Hollywood’s dominance will continue. This paper also proposes that in cyberspace, whilst the Internet does provide a channel that democratises film distribution, the flat landscape of the world wide web means that in order to stand out from the clutter of millions of cyber-voices seeking attention, independent film companies need to possess superior strategic marketing management capabilities and develop effective e-marketing strategies to find a niche, attract a loyal online audience and prosper. However, mirroring a recent CIA report forecasting a multi-polar world economy, this paper also argues that potentially serious longer-term rivals are emerging and will increasingly take a larger slice of an expanding global box office as India, China and other major developing economies and their respective cultural channels grow and achieve economic parity with or surpass the advanced western economies. Thus, in terms of global market share over time, Hollywood’s slice of the pie will comparatively diminish in an emerging multi-polar movie business.