47 results for Over the counter derivatives


Relevance: 100.00%

Publisher:

Abstract:

Web 1.0 referred to the early, read-only internet; Web 2.0 refers to the ‘read-write web’ in which users actively contribute to as well as consume online content; Web 3.0 is now being used to refer to the convergence of mobile and Web 2.0 technologies and applications. One of the most important developments in Web 3.0 is geography: with many mobile phones now equipped with GPS, mobiles promise to “bring the internet down to earth” through geographically aware, or locative, media. The internet was earlier heralded as “the death of geography”, with predictions that, with anyone able to access information from anywhere, geography would no longer matter. But mobiles are disproving this. GPS allows the location of the user to be pinpointed, and the mobile internet allows the user to access locally-relevant information, or to upload content which is geotagged to the specific location. It also allows locally-specific content to be sent to the user when the user enters a specific space. Location-based services are one of the fastest-growing segments of the mobile internet market: the 2008 AIMIA report indicates that user access of local maps increased by 347% over the previous 12 months, and restaurant guides/reviews increased by 174%. The central tenet of cultural geography is that places are culturally constructed, composed of the physical space itself, culturally-inflected perceptions of that space, and people’s experiences of the space (Lefebvre 1991). This paper takes a cultural geographical approach to locative media, anatomising the various spaces which have emerged through locative media, or “the geoweb” (Lake 2004). The geoweb is such a new concept that, to date, critical discourse has treated it as a somewhat homogeneous spatial formation.
In order to counter this, and to demonstrate the dynamic complexity of the emerging spaces of the geoweb, the paper provides a topography of different types of locative media space, including: the personal/aesthetic, in which individual users geotag specific physical sites with their own content and meanings; the commercial, like the billboards which speak to individuals as they pass in Minority Report; and the social, in which one’s location is defined by the proximity of friends rather than by geography.
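The delivery mechanism described above (locally-specific content pushed to a user who enters a specific space) reduces in practice to a geofence test against geotagged content. A minimal Python sketch, with invented details: the coordinates, the 100 m radius, and the post itself are illustrative, not from the paper.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def in_geofence(user_fix, site, radius_m=100):
    """True when the user's GPS fix falls inside the site's geofence."""
    return haversine_m(user_fix[0], user_fix[1], site[0], site[1]) <= radius_m

# A geotagged item: content tied to a specific physical location.
geotagged_post = {"lat": -27.4772, "lon": 153.0281, "text": "Brisbane CBD note"}
user = (-27.4775, 153.0285)  # hypothetical user location, ~50 m away
print(in_geofence(user, (geotagged_post["lat"], geotagged_post["lon"])))
```

The same check, run in reverse, is how a geotagged upload gets attached to a named place.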

Abstract:

As a consequence of the increased incidence of collaborative arrangements between firms, the competitive environment characterising many industries has undergone profound change. It is suggested that rivalry is not necessarily enacted by individual firms according to the traditional mechanisms of direct confrontation in factor and product markets, but rather as collaborative orchestration between a number of participants or network members. Strategic networks are recognised as sets of firms within an industry that exhibit denser strategic linkages among themselves than other firms within the same industry. Based on this, strategic networks are determined according to evidence of strategic alliances between firms comprising the industry. As a result, a single strategic network represents a group of firms closely linked according to collaborative ties. Arguably, the collective outcome of these strategic relationships engineered between firms suggests that the collaborative benefits attributed to interorganisational relationships require closer examination with respect to their propensity to influence rivalry in intraindustry environments. Derived in large part from the social sciences, network theory allows for the micro and macro examination of the opportunities and constraints inherent in the structure of relationships in strategic networks, establishing a relational approach upon which the conduct and performance of firms can be more fully understood. Research to date has yet to empirically investigate the relationship between strategic networks and rivalry. The limited research that has been completed utilising a network rationale to investigate competitive patterns in contemporary industry environments has been characterised by a failure to directly measure rivalry. Further, this prior research has typically embedded investigation in industry settings dominated by technological or regulatory imperatives, such as the microprocessor and airline industries.
These industries, due to the presence of such imperatives, are arguably more inclined to support the realisation of network rivalry, through subscription to prescribed technological standards (e.g., microprocessor industry) or by being bound by regulatory constraints dictating operation within particular market segments (airline industry). In order to counter these weaknesses, the proposition guiding research ("Are patterns of rivalry predicted by strategic network membership?") is embedded in the United States Light Vehicles Industry, an industry not dominated by technological or regulatory imperatives. Further, rivalry is directly measured and utilised in research, thus distinguishing this investigation from prior research efforts. The timeframe of investigation is 1993–1999, with all research data derived from secondary sources. Strategic networks were defined within the United States Light Vehicles Industry based on evidence of horizontal strategic relationships between firms comprising the industry. The measure of rivalry used to directly ascertain the competitive patterns of industry participants was derived from the traditional Herfindahl Index, modified to account for patterns of rivalry observed at the market segment level. Statistical analyses of the strategic network and rivalry constructs found little evidence to support the contention of network rivalry; indeed, greater levels of rivalry were observed between firms comprising the same strategic network than between firms participating in opposing network structures. Based on these results, patterns of rivalry evidenced in the United States Light Vehicle Industry over the period 1993–1999 were not found to be predicted by strategic network membership. The findings generated by this research are in contrast to current theorising in the strategic network and rivalry realm. In this respect, these findings are surprising.
The relevance of industry type, in conjunction with prevailing network methodology, provides the basis upon which these findings are contemplated. Overall, this study raises some important questions in relation to the relevance of the network rivalry rationale, establishing a fruitful avenue for further research.
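The traditional Herfindahl Index referred to above is simply the sum of squared market shares. The thesis's segment-level modification is not spelled out in this abstract, so the sales-weighted variant below is only one plausible reading, shown for illustration; the shares and weights are invented.

```python
def herfindahl(shares):
    """Traditional Herfindahl Index: sum of squared market shares (shares sum to 1)."""
    return sum(s ** 2 for s in shares)

def segment_herfindahl(segment_shares, segment_weights):
    """Hypothetical segment-level variant: weight each market segment's
    Herfindahl index by that segment's share of total industry sales."""
    return sum(w * herfindahl(s) for s, w in zip(segment_shares, segment_weights))

# Industry-level concentration for four firms: 0.16 + 0.09 + 0.04 + 0.01 = 0.30
print(herfindahl([0.4, 0.3, 0.2, 0.1]))

# Two segments (e.g. light trucks vs. passenger cars), weighted 60/40:
# 0.6 * 0.5 + 0.4 * 0.25 = 0.40
segments = [[0.5, 0.5], [0.25, 0.25, 0.25, 0.25]]
print(segment_herfindahl(segments, [0.6, 0.4]))
```

Higher values indicate a more concentrated (less rivalrous) market, which is why the index can serve as an inverse proxy for rivalry.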

Abstract:

Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade many crash models have accounted for extra-variation in crash counts—variation over and above that accounted for by the Poisson density. The extra-variation – or dispersion – is theorized to capture unaccounted for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models—tantamount to assuming that unaccounted for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31–40] challenged the fixed dispersion parameter assumption, and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord, with exploration of additional dispersion functions, the use of an independent data set, and presents an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov Chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four of them employed traffic flows as explanatory factors in mean structure while the remainder of them included geometric factors in addition to major and minor road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criteria (DIC) statistics. 
The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e., the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that known influences on expected crash counts are likely to be different from the factors that might help to explain unaccounted for variation in crashes across sites.
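The extra-Poisson variation at issue can be made concrete with a small simulation: a Poisson-gamma mixture (the negative binomial) has variance mu + alpha * mu^2, so sampled crash counts are visibly more dispersed than their mean. The values of mu, alpha, and the sample size below are illustrative, not from the study.

```python
import math
import random

random.seed(42)

def poisson(lam):
    """Knuth's inversion method for a single Poisson draw."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def neg_binomial(mu, alpha):
    """Poisson-gamma mixture with Var(Y) = mu + alpha * mu**2."""
    shape = 1.0 / alpha
    site_mean = random.gammavariate(shape, mu * alpha)  # heterogeneity across sites
    return poisson(site_mean)

mu, alpha, n = 4.0, 0.5, 20000
draws = [neg_binomial(mu, alpha) for _ in range(n)]
mean = sum(draws) / n
var = sum((y - mean) ** 2 for y in draws) / n
# Theoretical values: mean 4, variance 4 + 0.5 * 16 = 12 (vs. 4 under pure Poisson)
print(mean, var)
```

A fixed dispersion parameter corresponds to holding alpha constant across sites; the paper's question is whether alpha itself should instead be a function of covariates.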

Abstract:

The three studies in this thesis focus on happiness and age and seek to contribute to our understanding of happiness change over the lifetime. The first study contributes by offering an explanation for what was evolving into a ‘stylised fact’ in the economics literature, the U-shape of happiness in age. No U-shape is evident from a visual inspection of the age–happiness relationship in the German socio-economic panel data, and it seems counter-intuitive that we just have to wait until we get old to be happy. Eliminating the very young, the very old, and the first-timers from the analysis did not explain away regression results supporting the U-shape of happiness in age, but fixed-effect analysis did. Analysis found that reverse causality arising from time-invariant individual traits explained the U-shape of happiness in age in the German population, and the results were robust across six econometric methods. Robustness was added to the German fixed-effect finding by replicating it with the Australian and the British socio-economic panel data sets. During analysis of the German data an unexpected finding emerged: an exceedingly large negative linear effect of age on happiness in fixed-effect regressions. There is a large self-reported happiness decline by those who remain in the German panel. A similar decline over time was not evident in the Australian or the British data. After testing away age, time and cohort effects, a time-in-panel effect was found. Germans who remain in the panel for longer progressively report lower levels of happiness. Because time-in-panel effects have not been included in happiness regression specifications, our estimates may be biased; perhaps some economics-of-happiness studies that used German panel data need revisiting. The second study builds upon the fixed-effect finding of the first study and extends our view of lifetime happiness to a cohort little visited by economists: children.
Initial analysis extends our view of lifetime happiness beyond adulthood and revealed a happiness decline in adolescent (15 to 23 year-old) Australians that is twice the size of the happiness decline we see in older Australians (75 to 86 year-olds), whom we expect to be unhappy due to declining income, failing health and the onset of death. To resolve a difference of opinion in the literature as to whether childhood happiness decreases, increases, or remains flat in age, survey instruments and an Internet-based survey were developed and used to collect data from four hundred 9 to 14 year-old Australian children. Applying the data to a Model of Childhood Happiness revealed that the natural environment life-satisfaction domain factor did not have a significant effect on childhood happiness. However, the children’s school environment and interactions with friends life-satisfaction domain factors explained over half of a steep decline in childhood happiness that is three times larger than what we see in older Australians. Adding personality to the model revealed what we expect to see with adults: extraverted children are happier, but unexpectedly, so are conscientious children. With the steep decline in the happiness of young Australians revealed and explanations offered, the third study builds on the time-invariant individual trait finding from the first study by applying the Australian panel data to an Aggregate Model of Average Happiness over the lifetime. The model’s independent variable is the stress that arises from the interaction between personality and the life event shocks that affect individuals and peers throughout their lives. Interestingly, a graphic depiction of the stress–age relationship reveals an inverse U-shape, the opposite of the U-shape of happiness in age we saw in the first study. The stress arising from life event shocks is found to explain much of the change in average happiness over a lifetime.
Because the policy recommendations of economists can invoke unexpected changes in our lives, the ensuing stress and resulting (un)happiness warrant consideration before such recommendations are made.
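The fixed-effect logic invoked in the first study can be made concrete with the within transformation: demeaning each person's happiness series removes any time-invariant individual trait, which is exactly what allows such traits to be "tested away". The panel below is invented for illustration.

```python
def within_transform(panel):
    """Within (fixed-effects) transformation.

    panel: {person_id: [happiness scores over time]}
    Subtracting each person's own mean removes anything constant for
    that person, such as a stable personality trait.
    """
    out = {}
    for pid, series in panel.items():
        pbar = sum(series) / len(series)
        out[pid] = [y - pbar for y in series]
    return out

# Hypothetical panel: person B carries a stable trait shifting every
# observation down by 2, but both share the same within-person trajectory.
panel = {"A": [7, 6, 7, 8], "B": [5, 4, 5, 6]}
demeaned = within_transform(panel)
print(demeaned["A"] == demeaned["B"])  # the trait is gone
```

Any cross-sectional pattern (such as a U-shape) driven purely by who carries which trait disappears after this transformation, which is the reverse-causality diagnosis the study reports.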

Abstract:

Background: The four principles of Beauchamp and Childress (autonomy, non-maleficence, beneficence and justice) have been extremely influential in the field of medical ethics, and are fundamental for understanding the current approach to ethical assessment in health care. This study tests whether these principles can be quantitatively measured on an individual level, and subsequently whether they are used in the decision-making process when individuals are faced with ethical dilemmas.

Methods: The Analytic Hierarchy Process was used as a tool for the measurement of the principles. Four scenarios, which involved conflicts between the medical ethical principles, were presented to participants, who made judgements about the ethicality of the action in the scenario and their intentions to act in the same manner if they were in the situation.

Results: Individual preferences for these medical ethical principles can be measured using the Analytic Hierarchy Process. This technique provides a useful tool with which to highlight individual medical ethical values. On average, individuals have a significant preference for non-maleficence over the other principles; however, and perhaps counter-intuitively, this preference does not seem to relate to applied ethical judgements in specific ethical dilemmas.

Conclusions: People state that they value these medical ethical principles but they do not actually seem to use them directly in the decision-making process. The reasons for this are explained through the lack of a behavioural model to account for the relevant situational factors not captured by the principles. The limitations of the principles in predicting ethical decision making are discussed.
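The Analytic Hierarchy Process derives priority weights for the four principles as the principal eigenvector of a pairwise comparison matrix. A minimal sketch via power iteration; the judgements in M are hypothetical, not taken from the study (they encode a respondent who prefers non-maleficence three-to-one over each of the other principles).

```python
def ahp_weights(M, iters=100):
    """Principal-eigenvector priority weights for a pairwise comparison
    matrix M, where M[i][j] says how strongly item i is preferred over j."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]  # renormalise to a probability-like vector
    return w

# Hypothetical judgements over (autonomy, non-maleficence, beneficence,
# justice) on Saaty's 1-9 scale; reciprocals fill the lower triangle.
M = [
    [1,   1/3, 1,   1],
    [3,   1,   3,   3],
    [1,   1/3, 1,   1],
    [1,   1/3, 1,   1],
]
w = ahp_weights(M)
print([round(x, 3) for x in w])  # non-maleficence carries half the weight
```

With consistent judgements like these, the weights settle at the exact ratios (1/6, 1/2, 1/6, 1/6); real respondents' matrices are rarely consistent, which is what the method's consistency checks are for.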

Abstract:

After some years of remarkable growth, the scholarly field of Project Management (PM) research currently finds itself in a crucial stage of development. In this editorial, we analyse submissions to PM's premier specialty journal, the International Journal of Project Management, over the period 2007–2010, and argue that one of the most important ways in which PM research can further evolve is to pay more attention to the mundane, yet important, act of good reviewing, an activity that we believe has received relatively little attention in the PM community thus far. Let us begin by considering the crucial juncture at which PM, as a scholarly discipline, currently finds itself. On the one hand, the PM research field is characterized by signs of major progress. For one, there has been a strong growth in terms of published output: recent years have seen the publication of three major edited volumes with a central focus on PM, published by top-tier publishers (Cattani et al., 2011, Kenis et al., 2009 and Morris et al., 2011); the PM/temporary organizations literature published in ISI ranked peer-reviewed articles is growing exponentially (Bakker, 2010); and besides some of the long-standing PM specialty journals, the field has recently seen the rise of a number of new journals, including the International Journal of Managing Projects in Business, the International Journal of Project Organisation and Management, and the Journal of Project, Program, and Portfolio Management.

Abstract:

Mycotoxins – from the Greek μύκης (mykes, mukos) “fungus” and the Latin (toxicum) “poison” – are a large and growing family of secondary metabolites and hence natural products produced by fungi, in particular by molds (1). It is estimated that well over 1,000 mycotoxins have been isolated and characterized so far, but this number will increase over the next few decades due to the availability of more specialized analytical tools and the increasing number of fungi being isolated. However, the most important classes of fungi responsible for these compounds are Alternaria, Aspergillus (multiple forms), Penicillium, and Stachybotrys. The biological activity of mycotoxins ranges from weak and/or sometimes positive effects such as antibacterial activity (e.g. penicillin derivatives derived from Penicillium strains) to strong mutagenic (e.g. aflatoxins, patulin), carcinogenic (e.g. aflatoxins), teratogenic, neurotoxic (e.g. ochratoxins), nephrotoxic (e.g. fumonisins, citrinin), hepatotoxic, and immunotoxic (e.g. ochratoxins, diketopiperazines) activities (1, 2), which are discussed in detail in this volume.

Abstract:

Numerous initiatives have been employed around the world in order to address rising greenhouse gas (GHG) emissions originating from the transport sector. These measures include: travel demand management (congestion‐charging), increased fuel taxes, alternative fuel subsidies and low‐emission vehicle (LEV) rebates. Incentivizing the purchase of LEVs has been one of the more prevalent approaches in attempting to tackle this global issue. LEVs, whilst having the advantage of lower emissions and, in some cases, more efficient fuel consumption, also bring the downsides of increased purchase cost, reduced convenience of vehicle fuelling, and operational uncertainty. To stimulate demand in the face of these challenges, various incentive‐based policies, such as toll exemptions, have been used by national and local governments to encourage the purchase of these types of vehicles. In order to address rising GHG emissions in Stockholm, and in line with the Swedish Government’s ambition to operate a fossil free fleet by 2030, a number of policies were implemented targeting the transport sector. Foremost amongst these was the combination of a congestion charge – initiated to discourage emissions‐intensive travel – and an exemption from this charge for some LEVs, established to encourage a transition towards a ‘green’ vehicle fleet. Although both policies shared the aim of reducing GHG emissions, the exemption for LEVs carried the risk of diminishing the effectiveness of the congestion charging scheme. As the number of vehicle owners choosing to transition to an eligible LEV increased, the congestion‐reduction effectiveness of the charging scheme weakened. In fact, policy makers quickly recognized this potential issue and consequently phased out the LEV exemption less than 18 months after its introduction (1). 
Several studies have investigated the demand for LEVs through stated‐preference (SP) surveys across multiple countries, including: Denmark (2), Germany (3, 4), UK (5), Canada (6), USA (7, 8) and Australia (9). Although each of these studies differed in approach, all involved SP surveys where differing characteristics between various types of vehicles, including LEVs, were presented to respondents and these respondents in turn made hypothetical decisions about which vehicle they would be most likely to purchase. Although these studies revealed a number of interesting findings in regard to the potential demand for LEVs, they relied on SP data. In contrast, this paper employs an approach where LEV choice is modelled by taking a retrospective view and by using revealed preference (RP) data. By examining the revealed preferences of vehicle owners in Stockholm, this study overcomes one of the principal limitations of SP data, namely that stated preferences may not in fact reflect individuals’ actual choices, such as when cost, time, and inconvenience factors are real rather than hypothetical. This paper’s RP approach involves modelling the characteristics of individuals who purchased new LEVs, whilst estimating the effect of the congestion charging exemption upon choice probabilities and subsequent aggregate demand. The paper contributes to the current literature by examining the effectiveness of a toll exemption under revealed preference conditions, and by assessing the total effect of the policy based on key indicators for policy makers, including: vehicle owner home location, commuting patterns, number of children, age, gender and income. (Extended abstract submission for the Kuhmo Nectar Conference 2014.) The two main research questions motivating this study were: (1) Which individuals chose to purchase a new LEV in Stockholm in 2008? and (2) How did the congestion charging exemption affect the aggregate demand for new LEVs in Stockholm in 2008?
In order to answer these research questions the analysis was split into two stages. Firstly, a multinomial logit (MNL) model was used to identify which demographic characteristics were most significantly related to the purchase of an LEV over a conventional vehicle. The three most significant variables were found to be: intra‐cordon residency (positive); commuting across the cordon (positive); and distance of residence from the cordon (negative). In order to estimate the effect of the exemption policy on vehicle purchase choice, the model included variables to control for geographic differences in preferences, based on the location of the vehicle owners’ homes and workplaces in relation to the congestion‐charging cordon boundary. These variables included one indicator representing commutes across the cordon and another indicator representing intra‐cordon residency. The effect of the exemption policy on the probability of purchasing LEVs was estimated in the second stage of the analysis by focusing on the groups of vehicle owners that were most likely to have been affected by the policy i.e. those commuting across the cordon boundary (in both directions). Given the inclusion of the indicator variable representing commutes across the cordon, it is assumed that the estimated coefficient of this variable captures the effect of the exemption policy on the utility of choosing to purchase an exempt LEV for these two groups of vehicle owners. The intra‐cordon residency indicator variable also controls for differences between the two groups, based upon direction of travel across the cordon boundary. A counter‐hypothesis to this assumption is that the coefficient of the variable representing commuting across the cordon boundary instead only captures geo‐demographic differences that lead to variations in LEV ownership across the different groups of vehicle owners in relation to the cordon boundary. 
In order to address this counter‐hypothesis, an additional analysis was performed on data from a city with a similar geodemographic pattern to Stockholm: Gothenburg, Sweden’s second-largest city. The results of this analysis provided evidence to support the argument that the coefficient of the variable representing commutes across the cordon was capturing the effect of the exemption policy. Based upon this framework, the predicted vehicle type shares were calculated using the estimated coefficients of the MNL model and compared with predicted vehicle type shares from a simulated scenario where the exemption policy was inactive. This simulated scenario was constructed by setting the coefficient for the variable representing commutes across the cordon boundary to zero for all observations to remove the utility benefit of the exemption policy. Overall, the procedure of this second stage of the analysis led to results showing that the exemption had a substantial effect upon the probability of purchasing and aggregate demand for exempt LEVs in Stockholm during 2008. By making use of unique evidence of revealed preferences of LEV owners, this study identifies the common characteristics of new LEV owners and estimates the effect of Stockholm's congestion charging exemption upon the demand for new LEVs during 2008. It was found that the variables that had the greatest effect upon the choice of purchasing an exempt LEV included intra‐cordon residency (positive), distance of home from the cordon (negative), and commuting across the cordon (positive). It was also determined that owners under the age of 30 years preferred non‐exempt LEVs (low CO2 LEVs), whilst those over the age of 30 years preferred electric vehicles. In terms of electric vehicles, it was apparent that those individuals living within the city had the highest propensity towards purchasing this vehicle type.
A negative relationship between choosing an electric vehicle and the distance of an individual’s residency from the cordon was also evident. Overall, the congestion charging exemption was found to have increased the share of exempt LEVs in Stockholm by 1.9%, with, as expected, a much stronger effect on those commuting across the boundary: owners living inside the cordon showed a 13.1% increase, and owners living outside the cordon a 5.0% increase. This increase in demand corresponded to an additional 538 (+/‐ 93; 95% C.I.) new exempt LEVs purchased in Stockholm during 2008 (out of a total of 5 427; 9.9%). Policy makers can take note that an incentive‐based policy can increase the demand for LEVs and appears to be an appropriate approach to adopt when attempting to reduce transport emissions through encouraging a transition towards a ‘green’ vehicle fleet.
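The two-stage procedure above (estimate a multinomial logit model, then zero out the exemption coefficient to simulate the policy-off scenario) can be sketched as follows. The alternatives, base utilities, and beta_exempt are illustrative numbers, not the paper's estimates.

```python
import math

def mnl_probs(utilities):
    """Multinomial logit choice probabilities from systematic utilities."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical utilities for a cross-cordon commuter choosing among a
# conventional car, a non-exempt low-CO2 LEV, and a toll-exempt LEV;
# beta_exempt is an illustrative coefficient on the exemption indicator.
base = {"conventional": 0.0, "low_co2_lev": -1.2, "exempt_lev": -1.5}
beta_exempt = 0.8

with_policy = mnl_probs([base["conventional"],
                         base["low_co2_lev"],
                         base["exempt_lev"] + beta_exempt])
without_policy = mnl_probs(list(base.values()))  # exemption coefficient set to zero

uplift = with_policy[2] - without_policy[2]
print(round(uplift, 3))  # change in predicted exempt-LEV share for this group
```

Summing such per-group uplifts over the observed population is what turns the coefficient into an aggregate-demand estimate like the paper's additional 538 vehicles.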

Abstract:

WikiLeaks has become a global phenomenon, and its founder and spokesman Julian Assange an international celebrity (or terrorist, depending on one’s perspective). But perhaps this focus on Assange and his website is as misplaced as the attacks against Napster and its founders were a decade ago: WikiLeaks itself only marks a new phase in a continuing shift in the balance of power between states and citizens, much as Napster helped to undermine the control of major music labels over the music industry. If the history of music filesharing is any guide, no level of punitive action against WikiLeaks and its supporters is going to re-contain the information WikiLeaks has set loose.

Abstract:

Over the past 20 years the labour market, workforce and work organisation of most if not all industrialised countries have been significantly refashioned by the increased use of more flexible work arrangements, variously labelled as precarious employment or contingent work. There is now a substantial and growing body of international evidence that many of these arrangements are associated with a significant deterioration in occupational health and safety (OHS), using a range of measures such as injury rates, disease, hazard exposures and work-related stress. Moreover, there is an emerging body of evidence that these arrangements pose particular problems for conventional regulatory regimes. Recognition of these problems has aroused the concern of policy makers, especially in Europe, North America and Australia, and a number of responses have been adopted in terms of modifying legislation, producing new guidance material and codes of practice and revised enforcement practices. This article describes one such initiative in Australia with regard to home-based clothing workers. The regulatory strategy developed in one Australian jurisdiction (and now being ‘exported’ into others) seeks to counter this process via contractual tracking mechanisms to follow the work, tie in liability and shift overarching legal responsibility to the top of the supply chain. The process also entails the integration of minimum standards relating to wages, hours and working conditions; OHS; and access to workers’ compensation. While home-based clothing manufacture represents a very old type of ‘flexible’ work arrangement, it is one that regulators have found especially difficult to address.
Further, the elaborate multi-tiered subcontracting and diffuse work locations found in this industry are also characteristic of newer forms of contingent work in other industries (such as some telework), as are the regulatory challenges they pose (such as the tendency of elaborate supply chains to attenuate and fracture statutory responsibilities, at least in terms of the attitudes and behaviour of those involved).


Abstract:

Government and tobacco companies recently faced off in Australia's High Court over the legality of plain packaging for cigarettes. Stephen Stern and Matthew Rimmer give their contrasting views on the arguments and implications.

Abstract:

In recent years, both developing and industrialised societies have experienced riots and civil unrest over the corporate exploitation of fresh water. Water conflicts increase as water scarcity rises and the unsustainable use of fresh water will continue to have profound implications for sustainable development and the realisation of human rights. Rather than states adopting more costly water conservation strategies or implementing efficient water technologies, corporations are exploiting natural resources in what has been described as the “privatization of water”. By using legal doctrines, states and corporations construct fresh water sources as something that can be owned or leased. For some regions, the privatization of water has enabled corporations and corrupt states to exploit a fundamental human right. Arguing that such matters are of relevance to criminology, which should be concerned with fundamental environmental and human rights, this article adopts a green criminological perspective and draws upon Treadmill of Production theory.

Abstract:

The film company Roadshow, the pay television company Foxtel, and Rupert Murdoch’s News Corp and News Limited, as well as other copyright industries, have been clamouring for new copyright powers and remedies. In the summer break, the Coalition Government has responded to such entreaties from its industry supporters and donors with a new package of copyright laws and policies. There has been significant debate over the proposals between the odd couple of Attorney-General George Brandis and the Minister for Communications, Malcolm Turnbull. There have been deep philosophical differences between the two Ministers over the copyright agenda. The Attorney-General George Brandis has supported a model of copyright maximalism, with strong rights and remedies for the copyright empires in film, television, and publishing. He has shown little empathy for the information technology companies of the digital economy. The Attorney-General has been impatient to press ahead with a copyright regime. The Minister for Communications, Malcolm Turnbull, has been somewhat more circumspect, recognising that there is a need to ensure that copyright laws do not adversely impact upon competition in the digital economy. The final proposal is a somewhat awkward compromise between the discipline-and-punish regime preferred by Brandis and the responsive regulation model favoured by Turnbull. In his new book, Information Doesn’t Want to Be Free: Laws for the Internet Age, Cory Doctorow has some sage advice for copyright owners. Things that don’t make money:
* Complaining about piracy.
* Calling your customers thieves.
* Treating your customers like thieves.
In this context, the push by copyright owners and the Coalition Government to have a copyright crackdown may well be counter-productive to their interests. This submission considers a number of key elements of the Coalition Government’s Copyright Crackdown.
Part 1 examines the proposals in respect of the Copyright Amendment (Online Infringement) Bill 2015 (Cth). Part 2 focuses upon the proposed Copyright Code. Part 3 considers the question of safe harbours for intermediaries. Part 4 examines the question of copyright exceptions – particularly looking at the proposal of the Australian Law Reform Commission for the introduction of a defence of fair use. Part 5 highlights the recommendations of the IT Pricing Inquiry and the Harper Competition Policy Review in respect of copyright law, consumer rights, and competition law.

Abstract:

South Africa is an emerging and industrializing economy which is experiencing remarkable progress. We contend that amidst the developments in the economy, the roles of energy, trade openness and financial development are critical. In this article, we revisit the pivotal role of these factors. We use the ARDL bounds [72] and the Bayer and Hanck [11] cointegration techniques, and an extended Cobb–Douglas framework, to examine the long-run association with output per worker over the sample period 1971–2011. The results support a long-run association between output per worker, capital per worker and the shift parameters. The short-run elasticity coefficients are as follows: energy (0.24), trade (0.07), financial development (−0.03). In the long run, the elasticity coefficients are: trade openness (0.05), energy (0.29), and financial development (−0.04). In both the short run and the long run, we note the post-2000 period has a marginal positive effect on the economy. The Toda and Yamamoto [91] Granger causality results show a unidirectional causality from capital stock and energy consumption to output, and from capital stock to trade openness; a bidirectional causality between trade openness and output; and an absence (neutrality) of causality between financial development and output, indicating that these two variables evolve independently of each other.
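Read as log-linear (Cobb–Douglas) elasticities, the reported long-run coefficients translate small percentage changes in the regressors into an approximate percentage change in output per worker. A back-of-envelope sketch; the 10% shocks are illustrative, not from the article.

```python
# Long-run elasticities reported in the abstract.
long_run = {"energy": 0.29, "trade_openness": 0.05, "financial_dev": -0.04}

def output_response(pct_changes, elasticities):
    """Approximate % change in output per worker from small % changes
    in the regressors (holding capital per worker fixed)."""
    return sum(elasticities[k] * pct_changes.get(k, 0.0) for k in elasticities)

# e.g. 10% more energy use and 10% more trade openness:
# 0.29 * 10 + 0.05 * 10 = 3.4% higher output per worker
print(output_response({"energy": 10, "trade_openness": 10}, long_run))
```

The same arithmetic with the short-run coefficients (0.24, 0.07, −0.03) gives the impact effect before the economy settles on its long-run path.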