893 results for Subgeometric Convergence
Abstract:
This paper critically analyzes the divergent perspectives on how copyright and intellectual property laws impact creativity, innovation, and the creative industries. One perspective defines the creative industries based on copyright as the means by which revenues are generated from innovation and the dissemination of new ideas. At the same time, it has been argued that copyright and intellectual property regimes fetter creativity and innovation, and that this has become even more marked in the context of digital media convergence and the networked global creative economy. These issues have resonated in debates around the creative industries, particularly since the initial DCMS mapping study in the UK in 1998 defined creative industries as combining individual creativity and exploitable forms of intellectual property. The issue of competing claims for the relationship between copyright and the creative industries has also arisen in Australia, with a report by the Australian Law Reform Commission entitled Copyright and the Digital Economy. This paper will consider the competing claims surrounding copyright and the creative industries, and the implications for policy-makers internationally.
Abstract:
This article uses topological approaches to suggest that education is becoming-topological. Analyses presented in a recent double-issue of Theory, Culture & Society are used to demonstrate the utility of topology for education. In particular, the article explains education's topological character through examining the global convergence of education policy, testing and the discursive ranking of systems, schools and individuals in the promise of reforming education through the proliferation of regimes of testing at local and global levels that constitute a new form of governance through data. In this conceptualisation of global education policy, changes in the form and nature of testing combine with the emergence of global policy networks to change the nature of the local (national, regional, school and classroom) forces that operate through the ‘system’. While these forces change, they work through a discursivity that produces disciplinary effects, but in a different way. This new–old disciplinarity, or ‘database effect’, is here represented through a topological approach because of its utility for conceiving education in an increasingly networked world.
Abstract:
This paper examines the global policy convergence toward high-stakes testing in schools and the use of test results to ‘steer at a distance’, particularly as it applies to policy-makers’ promise to improve teacher quality. Using Deleuze’s three syntheses of time in the context of the Australian policy blueprint Quality Education, this paper argues that using test scores to discipline teaching repeats the past habit of policy-making as continuing the problem of the unaccountable teacher. This results in local policy-making enfolding test scores in a pure past where the teacher-as-problem is resolved through the use of data from testing to deliver accountability and transparency. This use of the database returns a digitised form of inspection that is a repetition of the habit of teacher-as-problem. While dystopian possibilities are available through the database, in what Deleuze refers to as a control society, for us the challenge is to consider policy-making as a step into an unknown future, to engage with producing policy that is not grounded on the unconscious interiority of solving the teacher problem, but of imagining new ways of conceiving the relationship between policy-making and teaching.
Abstract:
Accurate radiocarbon dating of marine samples requires knowledge of the marine radiocarbon reservoir effect. This effect for a particular site/region is generally assumed constant through time when calibrating marine 14C ages. However, recent studies have shown large temporal variations of several hundred to a couple of thousand years in this effect for a number of regions during the late Quaternary and Holocene. Here we report marine radiocarbon reservoir corrections (ΔR) for Heron Reef and Moreton Bay in the southwestern (SW) Pacific for the last 8 ka, derived from 14C analysis of 230Th-dated corals. Most of our ΔR values for the last ∼5.4 ka agree well with the modern value, but large ΔR variability of ∼410 yr (from trough to peak), with possible decadal/centennial fluctuations, is evident for the period ∼5.4–8 ka. The latter time interval also shows significant variations with similar features in previously published ΔR values for other sites in the Pacific, including southern Peru–northern Chile in the southeastern (SE) Pacific, the South China Sea, Vanuatu and Papua New Guinea, with the largest magnitude of ∼920 yr from the SE Pacific. The mechanisms for these large ΔR variations across the Pacific during the mid-Holocene are complex processes involving (1) changes in the quantity and 14C content of upwelled waters in the tropical east Pacific (TEP) (the frequency and intensity of ocean upwelling in the TEP, and the contribution of Subantarctic Mode Water to the upwelled waters, which is influenced by the intensity and position of the southern westerly winds), and (2) variations in ocean circulation associated with climate change (La Niña/El Niño conditions, the intensity of the easterly trade winds, and the positions of the Intertropical Convergence Zone and the South Pacific Convergence Zone), which control the spreading of the older upwelled surface waters in the TEP to the western sites.
Our results imply the need to employ temporal changes in ΔR values, instead of constant (modern) values, for age calibration of Holocene marine samples, not only for the SW Pacific sites but also for other tropical and subtropical sites in the Pacific.
Abstract:
In his 1987 book, The Media Lab: Inventing the Future at MIT, Stewart Brand provides an insight into the visions of the future of the media in the 1970s and 1980s. He notes that Nicholas Negroponte made a compelling case for the foundation of a media laboratory at MIT with diagrams detailing the convergence of three sectors of the media—the broadcast and motion picture industry; the print and publishing industry; and the computer industry. Brand commented: ‘If Negroponte was right and communications technologies really are converging, you would look for signs that technological homogenisation was dissolving old boundaries out of existence, and you would expect an explosion of new media where those boundaries used to be’. Two decades later, technology developers, media analysts and lawyers have become excited about the latest phase of media convergence. In 2006, the faddish Time Magazine heralded the arrival of various Web 2.0 social networking services: You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy‐strewn basement rec rooms—than you could from 1,000 hours of network television. And we didn’t just watch, we also worked. Like crazy. We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open‐source software. America loves its solitary geniuses—its Einsteins, its Edisons, its Jobses—but those lonely dreamers may have to learn to play with others. Car companies are running open design contests. Reuters is carrying blog postings alongside its regular news feed. Microsoft is working overtime to fend off user‐created Linux.
We’re looking at an explosion of productivity and innovation, and it’s just getting started, as millions of minds that would otherwise have drowned in obscurity get backhauled into the global intellectual economy. The magazine announced that Time’s Person of the Year was ‘You’, the everyman and everywoman consumer ‘for seizing the reins of the global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game’. This review essay considers three recent books, which have explored the legal dimensions of new media. In contrast to the unbridled exuberance of Time Magazine, this series of legal works displays an anxious trepidation about the legal ramifications associated with the rise of social networking services. In his tour de force, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, Daniel Solove considers the implications of social networking services, such as Facebook and YouTube, for the legal protection of reputation under privacy law and defamation law. Andrew Kenyon’s edited collection, TV Futures: Digital Television Policy in Australia, explores the intersection between media law and copyright law in the regulation of digital television and Internet videos. In The Future of the Internet and How to Stop It, Jonathan Zittrain explores the impact of ‘generative’ technologies and ‘tethered applications’—considering everything from the Apple Mac and the iPhone to the One Laptop per Child programme.
Abstract:
The numerical solution of fractional partial differential equations poses significant computational challenges in regard to efficiency as a result of the spatial nonlocality of the fractional differential operators. The dense coefficient matrices that arise from spatial discretisation of these operators mean that even one-dimensional problems can be difficult to solve using standard methods on grids comprising thousands of nodes or more. In this work we address this issue of efficiency for one-dimensional, nonlinear space-fractional reaction–diffusion equations with fractional Laplacian operators. We apply variable-order, variable-stepsize backward differentiation formulas in a Jacobian-free Newton–Krylov framework to advance the solution in time. A key advantage of this approach is the elimination of any requirement to form the dense matrix representation of the fractional Laplacian operator. We show how a banded approximation to this matrix, which can be formed and factorised efficiently, can be used as part of an effective preconditioner that accelerates convergence of the Krylov subspace iterative solver. Our approach also captures the full contribution from the nonlinear reaction term in the preconditioner, which is crucial for problems that exhibit stiff reactions. Numerical examples are presented to illustrate the overall effectiveness of the solver.
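The ingredients named in this abstract can be sketched in a few lines. The following is an illustrative assumption-laden sketch, not the paper's actual scheme: a symmetrised shifted-Grünwald matrix stands in for the fractional Laplacian, the reaction term is a Fisher-type nonlinearity, and a single implicit Euler step is advanced with SciPy's Jacobian-free Newton–Krylov solver, preconditioned by a banded approximation to the Jacobian (bandwidth, grid size and time step are all made up for the example).

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve
from scipy.optimize import newton_krylov
from scipy.sparse.linalg import LinearOperator

n, dt, alpha, bw = 200, 1e-3, 1.8, 2   # grid size, step, fractional order, band width
h = 1.0 / (n + 1)

# Grunwald-Letnikov weights g_k = (-1)^k * C(alpha, k), via the standard recurrence
g = np.zeros(n + 2)
g[0] = 1.0
for k in range(1, n + 2):
    g[k] = g[k - 1] * (k - 1 - alpha) / k

# One-sided shifted-Grunwald matrix, symmetrised as a stand-in for the
# fractional Laplacian (an illustrative choice; for alpha = 2 it reduces
# to the usual tridiagonal [1, -2, 1]/h^2 Laplacian)
L = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        k = i - j + 1
        if k >= 0:
            L[i, j] = g[k]
L = 0.5 * (L + L.T) / h ** alpha

def f(u):                                  # Fisher-type reaction term (assumed)
    return u * (1.0 - u)

x = np.linspace(h, 1.0 - h, n)
u_old = np.exp(-100.0 * (x - 0.5) ** 2)    # initial pulse

def residual(u):                           # implicit Euler residual for one step
    return u - u_old - dt * (L @ u + f(u))

# Banded preconditioner: keep only the diagonals within bw of the Jacobian
# I - dt*(L + diag(f'(u_old))), factorise once, wrap as a LinearOperator
J = np.eye(n) - dt * L
J[np.abs(np.subtract.outer(np.arange(n), np.arange(n))) > bw] = 0.0
J -= dt * np.diag(1.0 - 2.0 * u_old)       # frozen reaction contribution
lu = lu_factor(J)
M = LinearOperator((n, n), matvec=lambda v: lu_solve(lu, v))

u_new = newton_krylov(residual, u_old, inner_M=M, f_tol=1e-9)
print(np.max(np.abs(residual(u_new))))     # residual norm after the step
```

The key point the sketch mirrors is that the dense matrix L is applied only through matrix-vector products inside the Krylov iteration, while the factorised object is the cheap banded approximation.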
Abstract:
In this paper, we introduce the Stochastic Adams-Bashforth (SAB) and Stochastic Adams-Moulton (SAM) methods as an extension of the tau-leaping framework to past information. Using the theta-trapezoidal tau-leap method of weak order two as a starting procedure, we show that the k-step SAB method with k >= 3 is order three in the mean and correlation, while a predictor-corrector implementation of the SAM method is weak order three in the mean but only order one in the correlation. These convergence results have been derived analytically for linear problems and successfully tested numerically for both linear and non-linear systems. A series of additional examples have been implemented in order to demonstrate the efficacy of this approach.
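The SAB/SAM methods extend the basic one-step tau-leap, which is easy to sketch. Below is a minimal tau-leaping simulation of a linear birth–death system (birth at constant rate b, death at rate d·X); the rates, leap size and path count are illustrative assumptions, not the paper's test problems.

```python
import numpy as np

rng = np.random.default_rng(0)
b, d = 10.0, 0.1                 # birth rate, per-capita death rate (assumed)
tau, n_steps, n_paths = 0.05, 1000, 2000

x = np.zeros(n_paths)            # all sample paths start at X = 0
for _ in range(n_steps):
    # propensities: birth at rate b, death at rate d*X; the number of
    # firings over a leap of length tau is Poisson with mean propensity*tau
    births = rng.poisson(b * tau, size=n_paths)
    deaths = rng.poisson(d * np.maximum(x, 0.0) * tau, size=n_paths)
    x = np.maximum(x + births - deaths, 0.0)   # crude guard against negatives

print(x.mean())                  # stationary mean of this process is b/d = 100
```

Multistep variants such as SAB reuse propensity information from previous leaps, in direct analogy with deterministic Adams–Bashforth formulas, rather than evaluating only at the current state as above.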
Abstract:
Background In 2011, a variant of West Nile virus Kunjin strain (WNVKUN) caused an unprecedented epidemic of neurological disease in horses in southeast Australia, resulting in almost 1,000 cases and a 9% fatality rate. We investigated whether increased fitness of the virus in the primary vector, Culex annulirostris, and another potential vector, Culex australicus, contributed to the widespread nature of the outbreak. Methods Mosquitoes were exposed to infectious blood meals containing either the virus strain responsible for the outbreak, designated WNVKUN2011, or WNVKUN2009, a strain of low virulence that is typical of historical strains of this virus. WNVKUN infection in mosquito samples was detected using a fixed cell culture enzyme immunoassay and a WNVKUN-specific monoclonal antibody. Probit analysis was used to determine mosquito susceptibility to infection. Infection, dissemination and transmission rates for selected days post-exposure were compared using Fisher’s exact test. Virus titers in bodies and saliva expectorates were compared using t-tests. Results There were few significant differences between the two virus strains in the susceptibility of Cx. annulirostris to infection, the kinetics of virus replication and the ability of this mosquito species to transmit either strain. Both strains were transmitted by Cx. annulirostris for the first time on day 5 post-exposure. The highest transmission rates (proportion of mosquitoes with virus detected in saliva) observed were 68% for WNVKUN2011 on day 12 and 72% for WNVKUN2009 on day 14. On days 12 and 14 post-exposure, significantly more WNVKUN2011 than WNVKUN2009 was expectorated by infected mosquitoes. Infection, dissemination and transmission rates of the two strains were not significantly different in Culex australicus. However, transmission rates and the amount of virus expectorated were significantly lower in Cx. australicus than Cx. annulirostris.
Conclusions The higher amount of WNVKUN2011 expectorated by infected mosquitoes may be an indication that this virus strain is transmitted more efficiently by Cx. annulirostris compared to other WNVKUN strains. Combined with other factors, such as a convergence of abundant mosquito and wading bird populations, and mammalian and avian feeding behaviour by Cx. annulirostris, this may have contributed to the scale of the 2011 equine epidemic.
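The rate comparisons described above use Fisher's exact test on 2x2 contingency tables. As a rough illustration only, the counts below are hypothetical (denominators of 50 invented to match the reported 68% and 72% transmission rates; they are not the study's data):

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: saliva-positive vs saliva-negative mosquitoes per strain
table = [[34, 16],    # WNVKUN2011: 34/50 positive (68%), assumed counts
         [36, 14]]    # WNVKUN2009: 36/50 positive (72%), assumed counts

odds_ratio, p_value = fisher_exact(table)
print(p_value)        # a difference this small is far from significant
```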
Abstract:
Two types of welfare states are compared in this article. Differences in procedural rights for the young unemployed at the level of service delivery are analyzed. In Australia, rights are regulated through a rigid procedural justice system. The young unemployed within the social assistance system in Sweden encounter staff with high discretionary powers, which makes the legal status of the unemployed weak but, on the other hand, makes the system more flexible. Despite the differences, there is striking convergence in how the young unemployed describe the way discretionary power among street-level staff affects their procedural rights. This convergence can be understood as an effect of the similar professional norms, work customs and occupational cultures of street-level staff, and of a basic logic of conditionality in all developed welfare states, whereby procedural rights are tightly coupled with responsibilities.
Abstract:
Many physical processes appear to exhibit fractional order behavior that may vary with time and/or space. The continuum of order in the fractional calculus allows the order of the fractional operator to be considered as a variable. In this paper, we consider a new space–time variable fractional order advection–dispersion equation on a finite domain. The equation is obtained from the standard advection–dispersion equation by replacing the first-order time derivative by Coimbra’s variable fractional derivative of order α(x)∈(0,1], and the first-order and second-order space derivatives by the Riemann–Liouville derivatives of order γ(x,t)∈(0,1] and β(x,t)∈(1,2], respectively. We propose an implicit Euler approximation for the equation and investigate the stability and convergence of the approximation. Finally, numerical examples are provided to show that the implicit Euler approximation is computationally efficient.
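As a minimal sketch of the kind of time discretisation such a scheme rests on, the following implements the standard L1 approximation of a constant-order Caputo derivative and checks it against the known derivative of u(t) = t. The constant order is a simplifying assumption here; the paper's Coimbra operator is variable-order and more general.

```python
import numpy as np
from math import gamma

alpha, tau, N = 0.6, 0.01, 100     # fractional order, time step, steps (assumed)
t = tau * np.arange(N + 1)
u = t.copy()                       # test function u(t) = t

def caputo_l1(u, n):
    """L1 approximation of the Caputo derivative D_t^alpha u at t_n."""
    k = np.arange(n)
    b = (k + 1) ** (1 - alpha) - k ** (1 - alpha)   # L1 weights
    diffs = u[n - k] - u[n - k - 1]                  # backward differences
    return tau ** (-alpha) / gamma(2 - alpha) * np.sum(b * diffs)

# The exact Caputo derivative of u(t) = t is t^(1-alpha) / Gamma(2-alpha);
# for this linear test function the L1 sum telescopes and is exact.
print(caputo_l1(u, N), t[N] ** (1 - alpha) / gamma(2 - alpha))
```

In a full implicit Euler scheme this discrete operator (with order evaluated pointwise in x and t) is combined with Grünwald-type approximations of the Riemann–Liouville space derivatives to form the linear system solved at each step.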
Abstract:
This chapter interrogates what recognition of prior learning (RPL) can and does mean in the higher education sector—a sector in the grip of the widening participation agenda and an open access age. The chapter discusses how open learning is making inroads into recognition processes and examines two studies in open learning recognition. A case study relating to e-portfolio-style RPL for entry into a Graduate Certificate in Policy and Governance at a metropolitan university in Queensland is described. In the first instance, candidates who do not possess a relevant Bachelor degree need to demonstrate skills in governmental policy work in order to be eligible to gain entry to a Graduate Certificate (at Australian Qualifications Framework Level 8) (Australian Qualifications Framework Council, 2013, p. 53). The chapter acknowledges the benefits and limitations of recognition in open learning and those of more traditional RPL, anticipating future developments in both (or their convergence).
Abstract:
For the third issue of Communication Research and Practice, we bring together a mix of submitted content and papers presented at events that were hosted under the auspices of the International Communication Association. Dal Yong Jin captures the dynamic and contradictory elements of both convergence and transmedia storytelling, and the ‘Korean Wave’, in his paper on webtoons. Exploring this distinctive online form of transmedia storytelling, Jin considers its evolution from the perspectives of digital content, political economy, convergent media and digital labour, and the tensions that surround its potential expansion into global cultural markets.
Abstract:
A novel analysis to compute the admittance characteristics of slots cut in the narrow wall of a rectangular waveguide, including corner diffraction effects and finite waveguide wall thickness, is presented. A coupled magnetic field integral equation is formulated at the slot aperture and solved by the Galerkin approach of the method of moments using entire-domain sinusoidal basis functions. The externally scattered fields are computed using the finite difference method (FDM) coupled with the measured equation of invariance (MEI). The guide wall thickness forms a closed cavity, and the fields inside it are evaluated using the standard FDM. The fields scattered inside the waveguide are formulated in the spectral domain for faster convergence compared with traditional spatial-domain expansions. The computed results have been compared with experimental results and with measured data published in the literature. Good agreement between the theoretical and experimental results demonstrates the validity of the present analysis.
Abstract:
We consider a single server queue with the interarrival times and the service times forming a regenerative sequence. This traffic class includes the standard models: iid, periodic, Markov modulated (e.g., the BMAP model of Lucantoni [18]) and their superpositions. This class also includes the recently proposed traffic models in high speed networks, exhibiting long range dependence. Under minimal conditions we obtain the rates of convergence to stationary distributions, finiteness of stationary moments, various functional limit theorems and the continuity of stationary distributions and moments. We use the continuity results to obtain approximations for stationary distributions and moments of an MMPP/GI/1 queue where the modulating chain has a countable state space. We extend all our results to feedforward networks where the external arrivals to each queue can be regenerative. Finally, we show that the output process of a leaky bucket is regenerative if the input process is, and hence our results extend to a queue with arrivals controlled by a leaky bucket.
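The waiting-time dynamics of such a single server queue follow Lindley's recursion W_{n+1} = max(0, W_n + S_n - A_{n+1}). A minimal simulation sketch for the simplest iid special case (M/M/1; the regenerative class above is far broader, and the rates here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, mu, n = 0.5, 1.0, 200_000       # arrival and service rates (rho = 0.5)

A = rng.exponential(1.0 / lam, n)    # iid interarrival times
S = rng.exponential(1.0 / mu, n)     # iid service times

W = np.zeros(n)                      # waiting time of customer i
for i in range(n - 1):
    W[i + 1] = max(0.0, W[i] + S[i] - A[i + 1])   # Lindley's recursion

# For M/M/1 the stationary mean waiting time is lam / (mu * (mu - lam)) = 1.0;
# discard the first half of the run as burn-in before averaging
print(W[n // 2:].mean())
```

The convergence-rate results in the paper quantify how quickly averages like this one approach their stationary values for general regenerative input.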
Abstract:
In this paper, we first recast the generalized symmetric eigenvalue problem, where the underlying matrix pencil consists of symmetric positive definite matrices, into an unconstrained minimization problem by constructing an appropriate cost function. We then extend it to the case of multiple eigenvectors using an inflation technique. Based on this asymptotic formulation, we derive a quasi-Newton-based adaptive algorithm for estimating the required generalized eigenvectors in the data case. The resulting algorithm is modular and parallel, and it is globally convergent with probability one. We also analyze the effect of inexact inflation on the convergence of this algorithm and that of inexact knowledge of one of the matrices (in the pencil) on the resulting eigenstructure. Simulation results demonstrate that the performance of this algorithm is almost identical to that of the rank-one updating algorithm of Karasalo. Further, the performance of the proposed algorithm has been found to remain stable even over 1 million updates without suffering from any error accumulation problems.
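For reference, the generalized symmetric eigenproblem A v = λ B v that the adaptive algorithm estimates iteratively can be solved directly in the batch setting. A minimal sketch with random illustrative matrices (not data from the paper):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)              # symmetric positive definite
N = rng.standard_normal((5, 5))
B = N @ N.T + 5.0 * np.eye(5)        # symmetric positive definite

vals, vecs = eigh(A, B)              # solves A v = lambda B v

# verify the pencil relation and the B-orthonormality of the eigenvectors
print(np.allclose(A @ vecs, B @ vecs * vals))
print(np.allclose(vecs.T @ B @ vecs, np.eye(5)))
```

An adaptive algorithm of the kind described tracks these eigenvectors from streaming data without ever forming and factorising the full pencil, which is what makes the modular, parallel structure possible.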