Abstract:
This report focuses on risk-assessment practices in the private rental market, with particular consideration of their impact on low-income renters. It is based on the fieldwork undertaken in the second stage of the research process that followed completion of the Positioning Paper. The key research question this study addressed was: What are the various factors included in ‘risk-assessments’ by real estate agents in allocating ‘affordable’ tenancies? How are these risks quantified and managed? What are the key outcomes of their decision-making? The study builds on previous research demonstrating that a relatively large proportion of low-cost private rental accommodation is occupied by moderate- to high-income households (Wulff and Yates 2001; Seelig 2001; Yates et al. 2004). This is occurring in an environment where the private rental sector is now the de facto main provider of rental housing for lower-income households across Australia (Seelig et al. 2005) and where a number of factors are implicated in patterns of ‘income–rent mismatching’. These include ongoing shifts in public housing assistance; issues concerning eligibility for rent assistance; ‘supply’ factors, such as loss of low-cost rental stock through upgrading and/or transfer to owner-occupied housing; patterns of supply and demand driven largely by middle- to high-income owner-investors and renters; and patterns of housing need among low-income households for whom affordable housing is not appropriate. In formulating a way of approaching the analysis of ‘risk-assessment’ in rental housing management, this study has applied three sociological perspectives on risk: Beck’s (1992) formulation of risk society as entailing processes of ‘individualisation’; a socio-cultural perspective which emphasises the situated nature of perceptions of risk; and a perspective which has drawn attention to different modes of institutional governance of subjects, as ‘carriers of specific indicators of risk’. 
The private rental market was viewed as a social institution, and the research strategy was informed by ‘institutional ethnography’ as a method of enquiry. The study was based on interviews with property managers, real estate industry representatives, tenant advocates and community housing providers. The primary focus of inquiry was on ‘the moment of allocation’. Six local areas across metropolitan and regional Queensland, New South Wales and South Australia were selected as case study localities. In terms of the main findings, it is evident that access to private rental housing is not just a matter of ‘supply and demand’. It is also about the assessment of risk among applicants. Risk – perceived or actual – is thus a critical factor in deciding who gets housed, and how. Risk and its assessment matter in the context of housing provision and in the development of policy responses. The outcomes from this study also highlight a number of salient points:
1. There are two principal forms of risk associated with property management: financial risk and risk of litigation.
2. Certain tenant characteristics and/or circumstances – ability to pay and ability to care for the rented property – are the main factors focused on in assessing risk among applicants for rental housing. Signals of either ‘(in)ability to pay’ and/or ‘(in)ability to care for the property’ are almost always interpreted as markers of high levels of risk.
3. The processing of tenancy applications entails a complex and variable mix of formal and informal strategies of risk-assessment and allocation, in which sorting (out), ranking, discriminating and handing over characterise the process.
4. In the eyes of property managers, ‘suitable’ tenants can be conceptualised as those who are resourceful, reputable, competent, strategic and presentable.
5. Property managers clearly articulated concern about the risks entailed in a number of characteristics or situations. Being on a low income was the principal and overarching factor which agents considered. Others included:
- unemployment
- ‘big’ families; sole parent families
- domestic violence
- marital breakdown
- a shift from home ownership to private rental
- Aboriginality and specific ethnicities
- physical incapacity
- aspects of ‘presentation’.
The financial vulnerability of applicants in these groups can be invoked, alongside expressed concerns about compromised capacities to manage income and/or ‘care for’ the property, as legitimate grounds for rejection or a lower ranking.
6. At the level of face-to-face interaction between the property manager and applicants, more intuitive assessments of risk based upon past experience or ‘gut feelings’ come into play. These judgements are interwoven with more systematic procedures of tenant selection.
The findings suggest that considerable ‘risk’ is associated with low-income status, either directly or insofar as it is associated with other forms of perceived risk, and that such risks are likely to impede access to the professionally managed private rental market. Detailed analysis suggests that opportunities for access to housing by low-income householders also arise where, for example:
- the ‘local experience’ of an agency and/or property manager works in favour of particular applicants
- applicants can demonstrate available social support and financial guarantors
- an applicant’s preference or need for longer-term rental is seen to provide a level of financial security for the landlord
- applicants are prepared to agree to specific, more stringent conditions for inspection of properties and review of contracts
- the particular circumstances and motivations of landlords lead them to consider a wider range of applicants
- in particular circumstances, property managers are prepared to give special consideration to applicants who appear worthy, albeit ‘risky’.
The strategic actions of demonstrating and documenting on the part of vulnerable (low-income) tenant applicants can improve their chances of being perceived as resourceful, capable and ‘savvy’. Such actions are significant because they help to persuade property managers not only that the applicant may have sufficient resources (personal and material), but also that they accept that the onus is on them to show they are reputable, have valued ‘competencies’ and understand ‘how the system works’. The parameters of the market do shape the processes of risk-assessment and, ultimately, the strategic relation of power between property manager and tenant applicant. Low vacancy rates and a limited supply of lower-cost rental stock, in all areas, mean that there are many more tenant applicants than available properties, creating a highly competitive environment for applicants. The fundamental problem of supply is an aspect of the market that severely limits the chances of access to appropriate and affordable housing for low-income rental housing applicants, and the impact of this problem is widely recognised. The study indicates three main directions for future focus in policy and program development: providing appropriate supports to tenants to access and sustain private rental housing; addressing issues of discrimination and privacy arising in the processes of selecting suitable tenants; and addressing problems of supply.
Abstract:
Alasdair Duncan’s narrative Metro, set in Brisbane in the early twenty-first century, focuses on Liam, an unapologetically self-styled ‘white, upper middle-class brat’ whose sense of place and identity is firmly mapped by spatial and economic co-ordinates. This article considers the linkages between spatiality and identity in Duncan’s narrative, as well as the ways in which traditional, hegemonic (heterosexual) forms of masculinity are re-invigorated in the enactment of an upper-middle-class script of success, privilege and consumerism. It argues that the safeguarding of these hegemonic forms of masculine identity involves strategies of spatial and bodily expression underpinned by conspicuous consumption, relegating other forms of sexual identity to an exploitable periphery.
Abstract:
Unmanned Aerial Vehicles (UAVs) are emerging as an ideal platform for a wide range of civil applications such as disaster monitoring, atmospheric observation and outback delivery. However, the operation of UAVs is currently restricted to specially segregated regions of airspace outside of the National Airspace System (NAS). Mission Flight Planning (MFP) is an integral part of UAV operation that addresses some of the requirements (such as safety and the rules of the air) of integrating UAVs in the NAS. Automated MFP is a key enabler for a number of UAV operating scenarios as it aids in increasing the level of onboard autonomy. For example, onboard MFP is required to ensure continued conformance with the NAS integration requirements when there is an outage in the communications link. MFP is a motion planning task concerned with finding a path between a designated start waypoint and goal waypoint. This path is described with a sequence of 4 Dimensional (4D) waypoints (three spatial and one time dimension) or equivalently with a sequence of trajectory segments (or tracks). It is necessary to consider the time dimension as the UAV operates in a dynamic environment. Existing methods for generic motion planning, UAV motion planning and general vehicle motion planning cannot adequately address the requirements of MFP. The flight plan needs to optimise for multiple decision objectives including mission safety objectives, the rules of the air and mission efficiency objectives. Online (in-flight) replanning capability is needed as the UAV operates in a large, dynamic and uncertain outdoor environment. This thesis derives a multi-objective 4D search algorithm entitled Multi-Step A* (MSA*) based on the seminal A* search algorithm. MSA* is proven to find the optimal (least cost) path given a variable successor operator (which enables arbitrary track angle and track velocity resolution).
Furthermore, it is shown to be of comparable complexity to multi-objective, vector neighbourhood based A* (Vector A*, an extension of A*). A variable successor operator enables the imposition of a multi-resolution lattice structure on the search space (which results in fewer search nodes). Unlike cell decomposition based methods, soundness is guaranteed with multi-resolution MSA*. MSA* is demonstrated through Monte Carlo simulations to be computationally efficient. It is shown that multi-resolution, lattice based MSA* finds paths of equivalent cost (less than 0.5% difference) to Vector A* (the benchmark) in a third of the computation time (on average). This is the first contribution of the research. The second contribution is the discovery of the additive consistency property for planning with multiple decision objectives. Additive consistency ensures that the planner is not biased (which results in a suboptimal path) by ensuring that the cost of traversing a track using one step equals that of traversing the same track using multiple steps. MSA* mitigates uncertainty through online replanning, Multi-Criteria Decision Making (MCDM) and tolerance. Each trajectory segment is modelled with a cell sequence that completely encloses the trajectory segment. The tolerance, measured as the minimum distance between the track and cell boundaries, is the third major contribution. Even though MSA* is demonstrated for UAV MFP, it is extensible to other 4D vehicle motion planning applications. Finally, the research proposes a self-scheduling replanning architecture for MFP. This architecture replicates the decision strategies of human experts to meet the time constraints of online replanning. Based on a feedback loop, the proposed architecture switches between fast, near-optimal planning and optimal planning to minimise the need for hold manoeuvres.
The derived MFP framework is original and shown, through extensive verification and validation, to satisfy the requirements of UAV MFP. As MFP is an enabling factor for operation of UAVs in the NAS, the presented work is both original and significant.
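The search machinery described above – best-first expansion of 4D waypoints under a cost function and an admissible heuristic – can be illustrated with a minimal sketch. This is a generic A* over (x, y, z, t) states with a fixed unit-time successor operator, not the thesis's actual MSA* (which uses a variable successor operator and a multi-objective cost model); the demo grid, costs and heuristic are illustrative assumptions only.

```python
import heapq

def a_star_4d(start, goal, successors, cost, heuristic):
    """Best-first search over 4D (x, y, z, t) waypoints.

    `successors(node)` yields reachable waypoints, `cost(a, b)` returns a
    non-negative scalar (multiple objectives assumed pre-combined into a
    weighted sum), and `heuristic` must never overestimate the remaining
    cost. Returns the least-cost waypoint sequence, or None.
    """
    frontier = [(heuristic(start, goal), 0.0, start, [start])]
    best_g = {start: 0.0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if g > best_g.get(node, float("inf")):
            continue  # stale queue entry
        for nxt in successors(node):
            g2 = g + cost(node, nxt)
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(
                    frontier, (g2 + heuristic(nxt, goal), g2, nxt, path + [nxt])
                )
    return None

# Toy demo: a 5x5 horizontal grid at fixed altitude, one time unit per track.
def grid_successors(node):
    x, y, z, t = node
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= x + dx <= 4 and 0 <= y + dy <= 4:
            yield (x + dx, y + dy, z, t + 1)

def unit_cost(a, b):
    return 1.0

def manhattan(node, goal):
    return abs(node[0] - goal[0]) + abs(node[1] - goal[1])
```

With start (0, 0, 0, 0) and goal (4, 4, 0, 8), the returned plan is a sequence of nine waypoints, one per time step; because time is part of the state, the same spatial cell at different times is treated as a different node, which is what makes the formulation suitable for dynamic environments.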
Abstract:
This entry uses postcolonial perspectives to interrogate relations of power in the curriculum that are deeply influenced by the aftermath of European colonialism. The insights gained help to analyze continuing inequity in material, cultural, ideological and social aspects of the curriculum. This is a starting point for working out strategies of change and identifying the complexities and contestations which accompany change. The entry provides an introduction to key aspects of postcolonial theory, examines various aspects of the curriculum which are problematized by postcolonial perspectives, and explores ways in which curriculum decolonization is advocated in terms of social equity, race, cultural and gender identity, language and knowledge paradigms.
Abstract:
Digital production and distribution technologies may create new opportunities for filmmaking in Australia. A culture of new approaches to filmmaking is emerging, driven by ‘next generation filmmakers’ who are willing to consider new business models: from online web series to short films produced for mobile phones. At the same time, cultural representation itself is transforming within an interactive, social-media-driven environment. Yet there is very little research into next generation filmmaking. The aim of this paper is to scope and discuss three key aspects of next generation filmmaking, namely: digital trends in film distribution and marketing; the processes and strategies of ‘next generation’ filmmakers; and case studies of viable next generation business models and filmmaking practices. We conclude with a brief examination of the implications for media and cultural policy, which suggests the future possibility of a rapprochement between creative industries discourse and cultural policy.
Abstract:
Background The purpose of this study was to provide a detailed evaluation of adherence to nutrition supplements by patients with a lower limb fracture. Methods These descriptive data are from 49 nutritionally “at-risk” patients aged 70+ years admitted to the hospital after a fall-related lower limb fracture and allocated to receive supplementation as part of a randomized, controlled trial. Supplementation commenced on day 7 and continued for 42 days. Prescribed volumes aimed to provide 45% of individually estimated theoretical energy requirements, covering the shortfall between literature estimates of energy intake and requirements. The supplement was administered by nursing staff on medication rounds in the acute or residential care settings and supervised through thrice-weekly home visits postdischarge. Results Median daily percent of the prescribed volume of nutrition supplement consumed averaged over the 42 days was 67% (interquartile range [IQR], 31–89, n = 49). There was no difference in adherence for gender, accommodation, cognition, or whether the supplement was self-administered or supervised. Twenty-three participants took some supplement every day, and a further 12 missed <5 days. For these 35 “nonrefusers,” adherence was 82% (IQR, 65–93), and they lost on average 0.7% (SD, 4.0%) of baseline weight over the 6 weeks of supplementation compared with a loss of 5.5% (SD, 5.4%) in the “refusers” (n = 14, 29%), p = .003. Conclusions We achieved better volume and energy consumption than previous studies of hip fracture patients but still failed to meet target supplement volumes prescribed to meet 45% of theoretical energy requirements. Clinicians should consider alternative methods of feeding such as a nasogastric tube, particularly in those patients where adherence to oral nutrition supplements is poor and dietary intake alone is insufficient to meet estimated energy requirements.
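The adherence measure reported above – volume consumed as a percentage of the individually prescribed volume, summarised by the median over the supplementation period – is straightforward to compute. The sketch below uses hypothetical daily volumes for a single patient, not study data:

```python
import statistics

def daily_adherence(consumed_ml, prescribed_ml):
    """Percent of the prescribed supplement volume actually consumed."""
    return 100.0 * consumed_ml / prescribed_ml

# Hypothetical 7-day record for one patient prescribed 200 ml/day.
consumed = [200, 150, 0, 180, 200, 100, 170]
percents = [daily_adherence(c, 200) for c in consumed]

median_adherence = statistics.median(percents)  # summary statistic used above
days_refused = sum(1 for c in consumed if c == 0)
```

A patient with no zero-consumption days (or fewer than five missed days) would fall into the “nonrefuser” group described in the Results.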
Abstract:
The processes of digitization and deregulation have transformed the production, distribution and consumption of information and entertainment media over the past three decades. Today, researchers are confronted with profoundly different landscapes of domestic and personal media than the pioneers of qualitative audience research that came to form much of the conceptual basis of Cultural Studies, first in Britain and North America and subsequently across all global regions. The process of media convergence, as a consequence of the dual forces of digitisation and deregulation, thus constitutes a central concept in the analysis of popular mass media. From the study of the internationalisation and globalisation of media content and changing regimes of media production, via the social shaping of communication technologies and, conversely, the impact of communication technology on social, cultural and political realities, to the emergence of transmedia storytelling, the interplay of intertextuality and genre and the formation of mediated social networks, convergence informs and shapes contemporary conceptual debates in the field of popular communication and beyond. However, media convergence challenges not only the conceptual canon of (popular) communication research, but also poses profound methodological challenges. As boundaries between producers and consumers become increasingly fluid, formerly stable fields and categories of research such as industries, texts and audiences intersect and overlap, requiring combined and new research strategies. This preconference aims to offer a forum to present and discuss methodological innovations in the study of contemporary media and the analysis of the social, cultural and political impact and challenges arising through media convergence.
The preconference thus aims to focus on the following methodological questions and challenges:
* New strategies of audience research responding to the increasing individualisation of popular media consumption.
* Methods of data triangulation in and through the integrated study of media production, distribution and consumption.
* Bridging the methodological and often associated conceptual gap between qualitative and quantitative research in the study of popular media.
* The future of ethnographic audience and production research in light of blurring boundaries between media producers and consumers.
* A critical re-examination of which textual configurations can be meaningfully described and studied as text.
* Methodological innovations aimed at assessing the macro social, cultural and political impact of mediatization (including, but not limited to, "creative methods").
* Methodological responses to the globalisation of popular media and the practicalities of international and transnational comparative research.
* An exploration of new methods required in the study of media flow and intertextuality.
Abstract:
Citizenship is a term of association among strangers. Access to it involves contested identities and symbolic meanings, differing power relations and strategies of inclusion, exclusion and action, and unequal room for manoeuvre or productivity in the uses of citizenship for any given group or individual. In the context of "rethinking communication," strenuous action is needed to associate such different life chances in a common enterprise at a national level or, more modestly, simply to claim equivalence for all such groups under the rule of one law.
Abstract:
This paper draws upon the Australian case to argue that the case for support for cultural production and cultural infrastructure has been strengthened overall by its alignment with economic policy goals. In this respect, the rise of creative industries policy discourses is consistent with trends in thinking about cultural policy that have their roots in the Creative Nation strategies of the early 1990s. In terms of the earlier discussion, cultural policy is as much driven by Schumpeterian principles as it is by Keynesian ones. Such an approach is not without attendant risks, and two stand out. The first is the risk of marginalizing the arts through a policy framework that gives priority to developing the digital content industries and views the creative industries primarily as an innovation platform. The second is that other trends in the economy, such as the strong Australian dollar resulting from the mining boom, undercut the development of cultural production in the sections of the creative industries where international trade and investment are most significant, such as the film industry and computer games. Nonetheless, after over a decade of vibrant debate, this focus on linking the cultural and economic policy goals of the creative industries has come to be consistent with broader international trends in the field.
Abstract:
In this paper we adopt a complex systems perspective to examine the perturbations caused by the introduction of the Research Quality Framework (RQF) at a research-intensive Australian university. This case is instructive as it 1) presents a Federal policy initiative that attempted to fundamentally alter the recognition and reward mechanism within a regulated funding environment, 2) analyses the strategies of an institution and its research groups as they sought not only to comply with the implementation of the RQF but to maximise their outcome, and 3) reveals the ways that some actors used this perturbation to advance their own interests. In short, this case represents an instrumental study into the dynamics of how information systems, organisations and individuals co-evolve in practice as they seek to navigate a complex problem scenario.
Abstract:
In the global knowledge economy, knowledge-intensive industries and knowledge workers are widely seen as the primary factors in improving the welfare and competitiveness of cities. To attract and retain such industries and workers, cities produce knowledge-based urban development strategies, where such strategising is also an important development mechanism for cities and their economies. This paper investigates the knowledge-based urban development strategies of Brisbane, Australia that support the generation, attraction and retention of investment and talent. The paper puts forward a clear understanding of the policy frameworks and relevant applications of Brisbane’s knowledge-based urban development experience in becoming a prosperous knowledge city, and concludes by providing valuable insights and directions for other cities seeking knowledge-based urban development.
Abstract:
“Turtle Twilight” is a two-screen video installation. Paragraphs of text adapted from a travel blog type across the left-hand screen. A computer-generated image of a tropical sunset is slowly animated on the right-hand screen. The two screens are accompanied by an atmospheric stock music track. This work examines how we construct, represent and deploy ‘nature’ in our contemporary lives. It mixes cinematic codes with image, text and sound gleaned from online sources. By extending Nicolas Bourriaud’s understanding of ‘postproduction’ and the creative and critical strategies of ‘editing’, it questions the relationship between contemporary screen culture, nature, desire and contemplation.
Abstract:
In this video, text sourced from dream description websites is combined into a narrative. The words, floating against an animated cloud background, are set to a stock music track. This work examines the nature of consciousness and identity in a contemporary context. It mixes the languages of dream description and cinematic narrative. By extending some of Nicolas Bourriaud’s ideas around “postproduction” and the creative and critical strategies of ‘editing’, this work draws attention to the ways popular culture and private anxieties continually mix together in our experiences of lived and imagined realities.
Abstract:
The research field was curatorship of the Machinima genre - a film-making practice that uses real-time 3D computer graphics engines to create cinematic productions. The context was the presentation of gallery non-specific work for large-scale exhibition, as an investigation in thinking beyond traditional strategies of the white cube. Strongly influenced by Christiane Paul's seminal edited collection, 'New Media in the White Cube and Beyond: Curatorial Models for Digital Art', the exhibition repositioned a genre traditionally focussed on delivery through small-screen, indoor, personal spaces into large exhibition hall spaces. Beyond the core questions of collecting, documenting, expanding and rethinking the place of Machinima within the history of contemporary digital arts, the curatorial premise asked how best to invert the relationship between the exhibition and the context of media production within the gaming domain, using novel presentational strategies that might best promote the 'take-home' impulse. The exhibition was used not as the ultimate destination for the work but rather as a place to experience, sort and choose from a high volume of possible works for subsequent investigation by audiences within their own game-ready, domestic environments. In pursuit of this core aim, the exhibition intentionally promoted 'sensory overload'. The exhibition also included a gaming lab experience where audiences could begin to learn the DIY concepts of the medium, and be stimulated to revisit, consider and re-make their own relationship to this genre. The research was predominantly practice-led and collaborative (in close concert with the Machinima community), and ethnographic in that it sought to work with, understand and promote the medium in a contemporary art context. This benchmark exhibition, building on the 15-year history of the medium, was warmly received by the global Machinima community, as evidenced by the significant debate, feedback and general interest recorded.
The exhibition has recently begun an ongoing Australian touring schedule. To date, the exhibition has received critical attention nationally and internationally in Das Superpaper, the Courier Mail, Machinimart, 4ZZZ-FM, the Sydney Morning Herald, Games and Business, Australian Gamer, Kotaku Australia, and the Age.
Abstract:
This project investigates machine listening and improvisation in interactive music systems with the goal of improvising musically appropriate accompaniment to an audio stream in real-time. The input audio may be from a live musical ensemble, or playback of a recording for use by a DJ. I present a collection of robust techniques for machine listening in the context of Western popular dance music genres, and strategies of improvisation to allow for intuitive and musically salient interaction in live performance. The findings are embodied in a computational agent – the Jambot – capable of real-time musical improvisation in an ensemble setting. Conceptually the agent’s functionality is split into three domains: reception, analysis and generation. The project has resulted in novel techniques for addressing a range of issues in each of these domains. In the reception domain I present a novel suite of onset detection algorithms for real-time detection and classification of percussive onsets. This suite achieves reasonable discrimination between the kick, snare and hi-hat attacks of a standard drum-kit, with sufficiently low latency to allow perceptually simultaneous triggering of accompaniment notes. The onset detection algorithms are designed to operate in the context of complex polyphonic audio. In the analysis domain I present novel beat-tracking and metre-induction algorithms that operate in real-time and are responsive to change in a live setting. I also present a novel analytic model of rhythm, based on musically salient features. This model informs the generation process, affording intuitive parametric control and allowing for the creation of a broad range of interesting rhythms. In the generation domain I present a novel improvisatory architecture drawing on theories of music perception, which provides a mechanism for the real-time generation of complementary accompaniment in an ensemble setting.
All of these innovations have been combined into a computational agent – the Jambot, which is capable of producing improvised percussive musical accompaniment to an audio stream in real-time. I situate the architectural philosophy of the Jambot within contemporary debate regarding the nature of cognition and artificial intelligence, and argue for an approach to algorithmic improvisation that privileges the minimisation of cognitive dissonance in human-computer interaction. This thesis contains extensive written discussions of the Jambot and its component algorithms, along with some comparative analyses of aspects of its operation and aesthetic evaluations of its output. The accompanying CD contains the Jambot software, along with video documentation of experiments and performances conducted during the project.
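To illustrate the kind of listening task described in the reception domain, the sketch below implements a generic positive-spectral-flux onset detector. This is a standard textbook technique offered as an assumption-laden illustration, not the Jambot's actual algorithm suite (which additionally classifies kick, snare and hi-hat attacks and is tuned for low-latency operation):

```python
import numpy as np

def spectral_flux_onsets(x, sr, frame=1024, hop=512, thresh=1.5):
    """Detect percussive onsets via positive spectral flux.

    Frames the signal, takes magnitude spectra, sums the positive
    frame-to-frame spectral differences, and picks peaks that exceed a
    local-mean threshold. Returns onset times in seconds.
    """
    window = np.hanning(frame)
    n_frames = 1 + (len(x) - frame) // hop
    mags = np.array([
        np.abs(np.fft.rfft(window * x[i * hop:i * hop + frame]))
        for i in range(n_frames)
    ])
    # Positive spectral flux: only energy increases signal an attack.
    flux = np.maximum(mags[1:] - mags[:-1], 0.0).sum(axis=1)
    onsets = []
    for i in range(1, len(flux) - 1):
        local = flux[max(0, i - 8):i + 1].mean()
        if flux[i] > thresh * local and flux[i] >= flux[i - 1] and flux[i] > flux[i + 1]:
            onsets.append((i + 1) * hop / sr)
    return onsets
```

Run offline over a buffer this reports onsets to roughly one hop (about 23 ms at 22.05 kHz) of frame resolution; a real-time system such as the one described here would instead process each incoming frame incrementally and trade window length against detection latency.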