119 results for Early 20th century
Abstract:
By the end of the 20th century the shift from the professional recording studio to personal-computer-based recording systems was well established (Chadabe 1997), and musicians could increasingly see the benefit of adding value to the musical process by producing their own work. At the Queensland University of Technology (QUT), where we were teaching, the need for a musicianship program that took account of these trends was becoming clear. The Sound Media Musicianship unit described in this chapter was developed to fill this need and ran from 1999 through 2010.
Abstract:
The development of public service broadcasters (PSBs) in the 20th century was framed around debates about their difference from commercial broadcasting. These debates navigated between two poles. One concerned the relationship between non-commercial sources of funding and the role played by statutory Charters as guarantors of the independence of PSBs. The other concerned PSBs being both a complementary and a comprehensive service, although there are tensions inherent in this duality. In the 21st century, as reconfigured public service media organisations (PSMs) operate across multiple platforms in a convergent media environment, how are these debates changing, if at all? Has the case for PSM "exceptionalism" changed with Web-based services, catch-up TV, podcasting, ancillary product sales, and the commissioning of programs from external sources in order to operate in highly diversified cross-media environments? Do the traditional assumptions about non-commercialism still hold as the basis for different forms of PSM governance and accountability? This paper considers the question of PSM exceptionalism in the context of three reviews into Australian media that took place over 2011-2012: the Convergence Review undertaken through the Department of Broadband, Communications and the Digital Economy; the National Classification Scheme Review undertaken by the Australian Law Reform Commission; and the Independent Media Inquiry that considered the future of news and journalism.
Abstract:
Problems with charity law jurisprudence persist. The difficulties arose in the 20th century and are fundamental to the way the doctrine is presently theorised. They grew out of the approach taken in Pemsel’s Case to the categorisation of the ‘spirit and intendment’ of the Preamble to the Statute of Charitable Uses. Recent statutory reforms, such as the Charities Act 2006 (Eng&W), have compounded the underlying problems rather than resolving them. This paper aims to stimulate thinking about a new foundation for charity jurisprudence – while the approach may seem radical, the paper argues that these new foundations can be discerned underlying the current jurisprudence. The difficulties can be overcome by rediscovering the underlying jurisprudence which is disregarded in the current approach to categorisation. Giving voice, in contemporary language, to that foundational jurisprudence, this paper provides a way out of the current problems. It also provides an alternative way of conceptualising the doctrine of charitable purpose to guide reform.
Abstract:
A recent Australian literature digitisation project uncovered some surprising discoveries in the children's books that it digitised. The Children's Literature Digital Resources (CLDR) Project digitised children's books first published between 1851 and 1945 and made them available online through AustLit: The Australian Literature Resource. The digitisation process also preserved, within the pages of those books, a range of bookplates, book labels, inscriptions, and loose ephemera. This material allows us to trace the provenance of some of the digitised works, some of which came from the personal libraries of now-famous authors, and others from less celebrated sources. These extra-textual traces can contribute to cultural memory of the past by providing evidence of how books were collected and exchanged, and what kinds of books were presented as prizes in schools and Sunday schools. They also provide insight into Australian literary and artistic networks, particularly of the first few decades of the 20th century. This article describes the kinds of material uncovered in the digitisation process and suggests that the material provides insights into literary and cultural histories that might otherwise be forgotten. It also argues that the indexing of this material is vital if it is not to be lost to future researchers.
Abstract:
Since the architectural design studio learning environment was first established in the early 19th century at the École des Beaux-Arts in Paris, there has been a complete transformation in how the discipline of architecture is practiced and how students of architecture acquire information. Digital technologies allow students to access information instantly, and learning is no longer confined to the rigid boundaries of a physical campus. In many Australian schools of architecture, however, the physical design studio learning environment remains largely unchanged. Many learning environments could be mistaken for those last refurbished 30 years ago, being devoid of any significant technological intervention. While some teaching staff are eagerly embracing new digital technologies and attempting to modify their pedagogical approaches, the physical design studio learning environment is resistant to such efforts. In a study aimed at better understanding how staff and students adapt to new blended learning environments, a group of 165 second-year architecture students at a large school of architecture in Australia was separated into two different design studio learning environments: 70% of students were allocated to a traditional design studio setting and 30% to a new, high-technology, prototype digital learning laboratory. The digital learning laboratory was purpose-designed for the case-study users, adapted Student-Centred Active Learning Environment for Undergraduate Programs [SCALE-UP] principles, and was built as part of a larger university research project. The architecture students attended the same lectures, followed the same studio curriculum and completed the same pieces of assessment; the only major differences were the teaching staff and the physical environment within which the studios were conducted. At the end of the semester, all staff and students were asked to complete a questionnaire about their experiences and preferences within the two respective learning environments. The responses represented the opinions of 100% of the 10 teaching staff and over 70% of the students. Using a qualitative grounded theory approach, the data were coded, extrapolated and compared to reveal emerging key themes. These key themes formed the basis for in-depth interviews and focus groups with teaching staff and students, allowing the researchers to understand the data in more detail. The results verified what had become increasingly evident during the course of the semester: an underlying negative resistance to the new digital studio learning environment among both staff and students. Many participants openly yearned for a return to the traditional design studio learning environment, particularly when the new technology caused frustration by being unreliable or failing altogether. This paper reports on the study, discusses the negative resistance and explores its major contributors. The researchers are not aware of any similar previous studies across these particular settings and believe the study offers a necessary and important contribution to emergent research about adaptation to new digital learning environments.
Abstract:
According to Karl Popper, widely regarded as one of the greatest philosophers of science of the 20th century, falsifiability is the primary characteristic that distinguishes scientific theories from ideologies or dogma. For example, advocates who argue that schools should treat creationism as a scientific theory, comparable to modern theories of evolution, would need to engage in the generation of falsifiable hypotheses and abandon the practice of discouraging questioning and inquiry. Ironically, scientific theories themselves are accepted or rejected based on a principle that might be called survival of the fittest. So, for the healthy development of theories to occur, four Darwinian functions should operate: (a) variation – avoid orthodoxy and encourage divergent thinking; (b) selection – submit all assumptions and innovations to rigorous testing; (c) diffusion – encourage the shareability of new and/or viable ways of thinking; and (d) accumulation – encourage the reusability of viable aspects of productive innovations.
Abstract:
The development of creative industries has been connected to urban development since the end of the 20th century. However, the question of why creative industries always cluster and develop in certain cities has not been adequately demonstrated, especially as to how various resources grow, interact and nurture the creative capacity of the locality. It is therefore vital to observe how the local institutional environment nurtures creative industries, and how creative industries consequently change that environment, in order to better address the connection between creative industries and localities. In Beijing, the relocation of CCTV, BTV and Phoenix to Chaoyang District raises the possibility of a new era for Chinese media, one in which the stodginess of propaganda content will give way to exciting new forms and genres. The mixing of media companies in an open commercial environment (away from the political power district of Xicheng) holds the promise of more freedom of expression and, ultimately, of a 'media capital' (Curtin, 2003). These are the dreams of many media practitioners in Beijing. But just how realistic are their expectations? This study adopts the concept of 'media capital' to demonstrate how participants, including state-media organisations, private media companies and international media conglomerates, are seeking out space and networks to survive in Beijing. Drawing on policy analysis, interviews and case studies, this study illustrates how different agents meet, confront and adapt in Beijing. It identifies factors responsible for media industries clustering in China, and argues that Beijing is very likely to be the next Chinese media capital, after sufficient accumulation and development, although as a lower-tier version compared to other media capitals in the world. This study contributes to Curtin's 'media capital' concept, develops his interpretation of the relationship between media industries and government, and suggests that the influence of media companies and professionals over the government should be acknowledged. Empirically, this study assists media practitioners in understanding how the Chinese government perceives media industries and, consequently, how media industries operate in China. The study also reveals that, despite the government's aspirations, China's media industries are still greatly constrained by institutional obstacles. Beijing therefore needs to quicken the pace of media reform, abandon the old mindset and create more room for creativity. Policy-makers in China should keep in mind that the only choice left to them is to further the reform.
Abstract:
A value-shift began to influence global political thinking in the late 20th century, characterised by recognition of the need for environmentally, socially and culturally sustainable resource development. This shift entailed a move away from thinking of ‘nature’ and ‘culture’ as separate entities – the former existing to serve the latter – toward the possibility of embracing the intrinsic worth of the nonhuman world. Cultural landscape theory recognises ‘nature’ as at once both ‘natural’, and a ‘cultural’ construct. As such, it may offer a framework through which to progress in the quest for ‘sustainable development’. This study makes a contribution to this quest by asking whether contemporary developments in cultural landscape theory can contribute to rehabilitation strategies for Australian open-cut coal mining landscapes. The answer is ‘yes’. To answer the research question, a flexible, ‘emergent’ methodological approach has been used, resulting in the following outcomes. A thematic historical overview of landscape values and resource development in Australia post-1788, and a review of cultural landscape theory literature, contribute to the formation of a new theoretical framework: Reconnecting the Interrupted Landscape. This framework establishes a positive answer to the research question. It also suggests a method of application within the Australian open-cut coal mining landscape, a highly visible exemplar of the resource development landscape. This method is speculatively tested against the rehabilitation strategy of an operating open-cut coal mine, concluding with positive recommendations to the industry, and to government.
Abstract:
The relationship between social background and achievement has preoccupied educational researchers since the mid-20th century, with major studies in the area reaching prominence in the late 1960s. Despite five decades of research and innovation since, recent studies using OECD data have shown that the relationship is strengthening rather than weakening. In this paper, the systematic destabilisation of public education in Australia is examined as a philosophical problem stemming from a fundamental shift in political orientation, where "choice" and "aspiration" work to promote and disguise survivalism. The problem for education, however, extends far deeper than the inequity in Federal government funding. Whilst this is a major problem, critical scrutiny must also focus on what states can do to turn back aspects of their own education policy that work to exacerbate and entrench social disadvantage.
Abstract:
Language has been of interest to numerous economists since the late 20th century, with the majority of studies focusing on its effects on immigrants' labour market outcomes, earnings in particular. However, language is an endogenous variable, which, along with its susceptibility to measurement error, biases ordinary least squares estimates. The instrumental variables method overcomes the shortcomings of ordinary least squares in modelling endogenous explanatory variables. In this dissertation, age at arrival combined with country of origin forms an instrument creating a difference-in-difference scenario, addressing endogeneity and attenuation error in measured language proficiency. The first half of the study investigates the extent to which the English-speaking ability of immigrants improves their labour market outcomes and social assimilation in Australia, using the 2006 Census. The findings provide evidence that supports the earlier studies. As expected, immigrants in Australia with better language proficiency earn higher incomes, attain higher levels of education, have a higher probability of completing tertiary studies, and work more hours per week. Language proficiency also improves social integration, leading to a higher probability of marriage to a native and a higher probability of obtaining citizenship. The second half of the study investigates whether language proficiency has similar effects on a migrant's physical and mental wellbeing, health care access and lifestyle choices, using three National Health Surveys. However, only limited evidence has been found for the hypothesised causal relationship between language and health for Australian immigrants.
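The estimator at the core of this design is two-stage least squares (2SLS). Below is a minimal sketch, on synthetic data, of why OLS is biased when an unobserved confounder drives both language proficiency and earnings, and how an instrument recovers the causal effect. All variable names and parameter values are illustrative assumptions, not the dissertation's actual specification.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Synthetic setup (illustrative only): unobserved ability drives both
# language proficiency and earnings, so OLS on proficiency is biased.
ability = rng.normal(size=n)
# Binary instrument: a stand-in for the age-at-arrival by country-of-origin
# interaction. It must shift proficiency (relevance) while affecting
# earnings only through proficiency (exclusion restriction).
instrument = rng.binomial(1, 0.5, size=n)
proficiency = 0.8 * instrument + 0.5 * ability + rng.normal(size=n)
log_income = 0.3 * proficiency + 0.7 * ability + rng.normal(size=n)

def add_const(x):
    return np.column_stack([np.ones(len(x)), x])

# Stage 1: regress the endogenous regressor on the instrument.
Z = add_const(instrument)
first_stage = np.linalg.lstsq(Z, proficiency, rcond=None)[0]
proficiency_hat = Z @ first_stage

# Stage 2: regress the outcome on the fitted values from stage 1.
beta_2sls = np.linalg.lstsq(add_const(proficiency_hat), log_income,
                            rcond=None)[0]

# Naive OLS for comparison: absorbs the ability confound.
beta_ols = np.linalg.lstsq(add_const(proficiency), log_income,
                           rcond=None)[0]

print(f"true effect 0.30 | OLS {beta_ols[1]:.3f} | 2SLS {beta_2sls[1]:.3f}")
```

Run as-is, the OLS coefficient lands well above the true 0.30 because ability inflates both proficiency and income, while 2SLS recovers approximately 0.30; this is the attenuation-and-endogeneity problem the abstract describes.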
Abstract:
Mathematical models of mosquito-borne pathogen transmission originated in the early twentieth century to provide insights into how to most effectively combat malaria. The foundations of the Ross–Macdonald theory were established by 1970. Since then, there has been a growing interest in reducing the public health burden of mosquito-borne pathogens and an expanding use of models to guide their control. To assess how theory has changed to confront evolving public health challenges, we compiled a bibliography of 325 publications from 1970 through 2010 that included at least one mathematical model of mosquito-borne pathogen transmission and then used a 79-part questionnaire to classify each of 388 associated models according to its biological assumptions. As a composite measure to interpret the multidimensional results of our survey, we assigned a numerical value to each model that measured its similarity to 15 core assumptions of the Ross–Macdonald model. Although the analysis illustrated a growing acknowledgement of geographical, ecological and epidemiological complexities in modelling transmission, most models during the past 40 years closely resemble the Ross–Macdonald model. Modern theory would benefit from an expansion around the concepts of heterogeneous mosquito biting, poorly mixed mosquito-host encounters, spatial heterogeneity and temporal variation in the transmission process.
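For readers who want to see the baseline against which the surveyed models are compared, here is a minimal sketch of one common textbook form of the Ross–Macdonald system (extrinsic incubation omitted for brevity). The parameter values are illustrative assumptions, not figures from the surveyed literature.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One simple textbook form of the Ross-Macdonald model.
# x = infected fraction of hosts, z = infected fraction of mosquitoes.
# All parameter values are illustrative assumptions.
m = 2.0    # mosquitoes per human
a = 0.2    # bites per mosquito per day
b = 0.3    # mosquito-to-human transmission probability per bite
c = 0.3    # human-to-mosquito transmission probability per bite
r = 0.01   # human recovery rate per day
g = 0.1    # mosquito death rate per day

def ross_macdonald(t, y):
    x, z = y
    dx = m * a * b * z * (1.0 - x) - r * x
    dz = a * c * x * (1.0 - z) - g * z
    return [dx, dz]

# Integrate for one year from a small initial human infection.
sol = solve_ivp(ross_macdonald, (0.0, 365.0), [0.01, 0.0])

# Basic reproduction number for this form of the model.
R0 = (m * a**2 * b * c) / (r * g)
x_end, z_end = sol.y[:, -1]
print(f"R0 = {R0:.1f}; after one year: x = {x_end:.3f}, z = {z_end:.3f}")
```

The scalar state variables encode the well-mixed, homogeneous-biting character of the classic model; the extensions the abstract recommends (heterogeneous biting, poorly mixed encounters, spatial heterogeneity) would replace the scalars x and z with distributions over hosts and patches.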
Abstract:
One set of public institutions that has seen growing discussion about the transformative impact of new media technologies is the university. The higher education sector, historically one of the more venerable and stable areas of public life, is now the subject of almost continuous speculation about whether it can continue in its current form during the 21st century. Digital media technologies are often seen as being at the forefront of such changes. It has been widely noted that the move towards a knowledge economy generates 'skills-biased technological change' that places a premium upon higher education qualifications, and that this earnings gap remains despite the continuing increase in the number of university graduates. As the demand for higher education continues to grow worldwide, there are new discussions about whether technologically mediated education through new forms such as Massive Open Online Courses (MOOCs) is broadening access to quality learning, or severing the vital connection between teacher and student seen as integral to the learning process. This paper critically appraises such debates in the context of early 21st-century higher education. It discusses ten drivers of change in higher education, many of which relate to themes discussed elsewhere in this book, such as the impact of social media, globalization, and the knowledge economy. It also considers the issues raised in navigating such developments from the perspective of the 'Five P's': practical issues; personal issues; pedagogical issues; policy issues; and philosophical issues. It includes a critical evaluation of MOOCs from the point of view of their educational qualities. It concludes with the observation that while universities will continue to play a significant, and perhaps growing, role in the economy, society and culture, the issues raised about what Clayton Christensen and Henry Eyring term the 'disruptive university' (Christensen and Eyring 2011) are nonetheless pressing, and that cost and policy pressures in particular are likely to generate significant institutional transformations in higher education worldwide.
Abstract:
This dissertation seeks to define and classify potential forms of Nonlinear structure and to explore the possibilities they afford for the creation of new musical works. It provides the first comprehensive framework for the discussion of Nonlinear structure in musical works and a detailed overview of the rise of nonlinearity in music during the 20th century. Nonlinear events are shown to emerge through significant parametrical discontinuity at the boundaries between regions of relatively strong internal cohesion. The dissertation situates Nonlinear structures in relation to linear structures and unstructured sonic phenomena, and provides a means of evaluating Nonlinearity in a musical structure through consideration of the degree to which the structure is integrated, contingent, compressible and determinate as a whole. It is proposed that Nonlinearity can be classified as a three-dimensional space described by three continuums: the temporal continuum, encompassing sequential and multilinear forms of organisation; the narrative continuum, encompassing processual, game structure and developmental narrative forms; and the referential continuum, encompassing stylistic allusion, adaptation and quotation. The use of spectrograms of recorded musical works is proposed as a means of evaluating Nonlinearity in a musical work through the visual representation of parametrical divergence in pitch, duration, timbre and dynamic over time. Spectral and structural analysis of repertoire works is undertaken as part of an exploration of musical nonlinearity and the compositional and performative features that characterise it. The contribution of cultural, ideological, scientific and technological shifts to the emergence of Nonlinearity in music is discussed, and a range of compositional factors that contributed to the emergence of musical Nonlinearity is examined. The evolution of notational innovations from the mobile score to the screen score is plotted, and a novel framework for the discussion of these forms of musical transmission is proposed. A computer-coordinated performative model is discussed, in which a computer synchronises the screening of notational information, provides temporal coordination of the performers through click-tracks or similar methods, and synchronises the audio processing and synthesised elements of the work. It is proposed that such a model constitutes a highly effective means of realising complex Nonlinear structures. A creative folio comprising 29 original works that explore nonlinearity is presented, discussed and categorised using the proposed classifications. Spectrograms of these works are employed where appropriate to illustrate the instantiation of parametrically divergent substructures and examples of structural openness through multiple versioning.
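As a concrete illustration of the proposed spectrogram method, here is a minimal sketch that renders the time-frequency image of a synthetic signal with an abrupt discontinuity in pitch, timbre and dynamic at a region boundary. The signal and all parameters are invented for illustration and stand in for a recorded musical work.

```python
import numpy as np
from scipy.signal import spectrogram
import matplotlib.pyplot as plt

# Synthetic two-region signal with an abrupt discontinuity in pitch,
# timbre and dynamic at t = 2 s: a toy stand-in for the boundary between
# two internally cohesive regions that the dissertation treats as a
# Nonlinear event.
fs = 22_050                                        # sample rate (Hz)
t = np.arange(0, 4.0, 1.0 / fs)
region_a = np.sin(2 * np.pi * 220 * t)             # pure tone at 220 Hz
region_b = np.sign(np.sin(2 * np.pi * 660 * t))    # harsher square-wave timbre
signal = np.where(t < 2.0, region_a, 0.3 * region_b)

# Short-time Fourier magnitude; parametrical divergence across the
# boundary appears as a vertical break in the time-frequency image.
f, times, Sxx = spectrogram(signal, fs=fs, nperseg=2048)

plt.pcolormesh(times, f, 10 * np.log10(Sxx + 1e-12), shading="auto")
plt.ylim(0, 4000)
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Parametrical discontinuity at the region boundary (t = 2 s)")
plt.show()
```

Before the boundary the image shows a single horizontal trace at 220 Hz; after it, a stack of odd harmonics of 660 Hz at lower intensity, making the pitch, timbre and dynamic divergence directly visible.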
Abstract:
Research background: For almost 80 years the Chuck Taylor (or Chuck T's) All Star basketball shoe has been an iconic item of fashion apparel. The Chuck T's were first designed in 1921 by Converse, an American shoe company, and over the decades they became popular not purely for sports and athletic purposes but evolved into the shoe of choice for many subcultural groups as a fashion item. In some circles the Chuck Taylor is still seen as the "coolest" sneaker of all time, one which will never go out of fashion regardless of changing trends. With over 600 million pairs sold all over the world since its release, the Converse shoe is representative not only of a fashion culture but also of a consumption culture that evolved as the driving force behind the massive growth of the Western economic system during the 20th century. Artisan Gallery (Brisbane), in conjunction with the exhibition Reboot: Function, Fashion and the Sneaker, a history of the sneaker, selected 20 designers to customise and redesign the classic Converse Chuck Taylor All Stars shoe, and in doing so highlighted the diversity of forms possible for creative outcomes. As Artisan Gallery Curator Kirsten Fitzpatrick states, "We were expecting people to draw and paint on them. Instead, we had shoes... mounted as trophies", referring to the presentation of "Converse Consumption". The exhibition ran from 21 June to 16 August 2012.
Research question: The Chuck T's is one of many overwhelmingly commercially successful designs of the last century. Nowadays we face the significant problems of overconsumption and the stress this causes on the natural ecosystem, and on people as a result. As an active member of the industrial design fraternity, a discipline that sits at the core of this problem, how can I use this opportunity to comment on the significant issue of consumption? An effective way to do this was to associate the consumption of goods with the consumption of sugar. There are significant similarities between our ceaseless desire to consume products and our fervent need to consume indulgent sweet foods.
Artisan statement: Delicious, scrumptious, delectable... your pupils dilate, your blood pressure spikes, your liver goes into overdrive. Immediately, your brain cuts off the adenosine receptors, preventing drowsiness. Your body increases dopamine production, in turn stimulating the pleasure receptors in your brain. Your body absorbs all the sweetness and turns it into fat, while all the nutrients that you actually require are starting to be destroyed, about to be expelled. And this is only after one bite! After some time, though, your body comes crashing back to earth. You become irritable and begin to feel sluggish. Your eyelids seem heavy while your breathing pattern changes. Your body has consumed all the energy and destroyed all available nutrients. You literally begin to shut down. These are the physiological effects of sugar consumption. A perfect analogy for our modern-day consumer-driven world. Enjoy your dessert!
Research contribution: "Converse Consumption" contributes to the conversation regarding overconsumption by compelling people to reflect on their consumption behaviour through the reconceptualising of the deconstructed Chuck T's in an attractive edible form. The viewer must reconcile the desire to consume the indulgent-looking dessert with the contradictory fact that it is composed of a pair of shoes. That the shoes are Chuck T's makes the effect even more powerful, given their iconic status. These clashing motivations are what make "Converse Consumption" a bizarre yet memorable experience.
Significance: The exhibition was viewed by more than 1,000 people and generated exceptional media coverage and public exposure. As Artisan Gallery Curator Kirsten Fitzpatrick states, "20 of Brisbane's best designers were given the opportunity to customise their own Converse Sneakers, with The Converse Blank Canvas Project." Selection for this project demonstrates the calibre and prominence of the design.
Abstract:
Historically, children in criminal justice proceedings were treated much the same as adults and subject to the same criminal justice processes. Until the early twentieth century, children in Australia were even subjected to the same penalties as adults, including hard labour and corporal and capital punishment (Carrington & Pereira 2009). Until the mid-nineteenth century there was no separate category of 'juvenile offender' in Western legal systems, and children as young as six years of age were incarcerated in Australian prisons (Cunneen & White 2007). It is widely acknowledged today, however, both in Australia and internationally, that juveniles should be subject to a system of criminal justice that is separate from the adult system and that recognises their inexperience and immaturity. As such, juveniles are typically dealt with separately from adults and treated less harshly than their adult counterparts. The United Nations' (1985: 2) Standard Minimum Rules for the Administration of Juvenile Justice (the 'Beijing Rules') stress the importance of nations establishing a set of laws, rules and provisions specifically applicable to juvenile offenders, together with institutions and bodies entrusted with the administration of juvenile justice, designed to meet the varying needs of juvenile offenders while protecting their basic rights. In each Australian jurisdiction except Queensland, a juvenile is defined as a person aged between 10 and 17 years, inclusive. In Queensland, a juvenile is defined as a person aged between 10 and 16 years, inclusive. In all jurisdictions, the minimum age of criminal responsibility is 10 years; that is, children under 10 years of age cannot be held legally responsible for their actions.