999 results for Research Grants
Abstract:
Through international agreements, including the United Nations Framework Convention on Climate Change and the Kyoto Protocol, the global community has acknowledged that climate change is a global problem and has sought to achieve reductions in global emissions, within a sufficient timeframe, to avoid dangerous anthropogenic interference with the climate system. The sheer magnitude of the emissions reductions required within such an urgent timeframe presents a challenge to conventional regulatory approaches, both internationally and within Australia. The phenomenon of climate change is temporally and geographically challenging, and it is scientifically complex and uncertain. The purpose of this paper is to analyse the current Australian legal response to climate change and to examine the legal measures which have been proposed to promote carbon trading, energy efficiency, renewable energy, and carbon sequestration initiatives across Australia. As this paper illustrates, the current Australian approach is clearly ineffective, and the law as it stands is overwhelmingly inadequate to address Australia’s emissions and to meet the enormity of the challenges posed by climate change. Consequently, the government should look towards a more effective legal framework to achieve rapid and urgent transformations in the selection of energy sources, energy use and sequestration initiatives across the Australian community.
Abstract:
Melt electrospinning is relatively under-investigated compared to solution electrospinning, but provides opportunities in numerous areas in which solvent accumulation or toxicity is a concern. These applications are diverse and present a broad set of challenges to researchers involved in electrospinning. In this context, melt electrospinning provides an alternative approach that bypasses some of the challenges of solution electrospinning, while bringing new issues to the forefront, such as the thermal stability of polymers. This Focus Review describes the literature on melt electrospinning, and highlights areas where the melt and solution approaches are combined and may potentially merge in the future.
Abstract:
Background: Efforts to prevent the development of overweight and obesity have increasingly focused on the early life course, as we recognise that both metabolic and behavioural patterns are often established within the first few years of life. Randomised controlled trials (RCTs) of interventions are even more powerful when, with forethought, they are synthesised into an individual patient data (IPD) prospective meta-analysis (PMA). An IPD PMA is a unique research design in which several trials are identified for inclusion in an analysis before any of the individual trial results become known, and the data are provided for each randomised patient. This methodology minimises the publication and selection bias often associated with a retrospective meta-analysis by allowing hypotheses, analysis methods and selection criteria to be specified a priori. Methods/Design: The Early Prevention of Obesity in CHildren (EPOCH) Collaboration was formed in 2009. The main objective of the EPOCH Collaboration is to determine whether early intervention for childhood obesity impacts on body mass index (BMI) z-scores at age 18-24 months. Additional research questions will focus on whether early intervention has an impact on children’s dietary quality, TV viewing time, duration of breastfeeding and parenting styles. This protocol includes the hypotheses, inclusion criteria and outcome measures to be used in the IPD PMA. The sample size of the combined dataset at final outcome assessment (approximately 1800 infants) will allow greater precision when exploring differences in the effect of early intervention with respect to pre-specified participant- and intervention-level characteristics. Discussion: Finalisation of the data collection procedures and analysis plans will be complete by the end of 2010. Data collection and analysis will occur during 2011-2012 and results should be available by 2013. Trial registration number: ACTRN12610000789066
Abstract:
This article introduces a “pseudo-classical” notion of non-separability in modelling. This form of non-separability can be viewed as lying between separability and quantum-like non-separability. Non-separability is formalized in terms of the non-factorizability of the underlying joint probability distribution. One decision criterion for determining the non-factorizability of the joint distribution is related to the rank of a matrix; another approach is based on the chi-square goodness-of-fit test. This pseudo-classical notion of non-separability is discussed in terms of quantum games and concept combinations in human cognition.
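As a toy illustration of the two criteria mentioned in the abstract (not the authors' implementation, and using a hypothetical distribution): a joint probability distribution over two variables factorizes into its marginals exactly when its probability matrix has rank 1, and the chi-square statistic against the independence expectation offers an equivalent sample-based check.

```python
import numpy as np

# Hypothetical joint distribution P(x, y) over two binary variables.
# P factorizes as P(x)P(y) exactly when this matrix has rank 1.
joint = np.array([[0.30, 0.20],
                  [0.30, 0.20]])

rank = np.linalg.matrix_rank(joint)  # 1 -> separable by the rank criterion

# Chi-square view: scale to hypothetical counts and compare the
# observed table against the independence (factorized) expectation.
counts = joint * 1000
expected = np.outer(counts.sum(axis=1), counts.sum(axis=0)) / counts.sum()
chi2 = ((counts - expected) ** 2 / expected).sum()
# A chi2 below the 5% critical value for 1 degree of freedom (3.841)
# is consistent with factorizability; here the tables match exactly.
```

Replacing `joint` with a matrix of rank 2 (rows not proportional) flips both criteria, signalling non-separability in this pseudo-classical sense.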
Abstract:
In computational linguistics, information retrieval and applied cognition, words and concepts are often represented as vectors in high dimensional spaces computed from a corpus of text. These high dimensional spaces are often referred to as Semantic Spaces. We describe a novel and efficient approach to computing these semantic spaces via the use of complex valued vector representations. We report on the practical implementation of the proposed method and some associated experiments. We also briefly discuss how the proposed system relates to previous theoretical work in Information Retrieval and Quantum Mechanics and how the notions of probability, logic and geometry are integrated within a single Hilbert space representation. In this sense the proposed system has more general application and gives rise to a variety of opportunities for future research.
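A hedged sketch of the general idea (the specific encoding used in the paper may differ): words can be represented as unit-modulus complex vectors of random phases, with similarity measured by the normalised real part of the Hermitian inner product, so that unrelated words score near zero.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 512  # dimensionality of the semantic space (illustrative choice)

def random_phase_vector(d, rng):
    # Unit-modulus complex vector: each component is e^{i*theta}
    return np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, d))

# Hypothetical mini-vocabulary; a real system would derive these
# representations from co-occurrence statistics over a text corpus.
vocab = {w: random_phase_vector(D, rng) for w in ("quantum", "logic", "banana")}

def similarity(a, b):
    # np.vdot conjugates its first argument (Hermitian inner product);
    # ~1 for identical vectors, ~0 for unrelated random vectors.
    return np.real(np.vdot(a, b)) / len(a)

self_sim = similarity(vocab["quantum"], vocab["quantum"])   # ~1.0
cross_sim = similarity(vocab["quantum"], vocab["banana"])   # near 0
```

Because each component carries only a phase, such vectors store a full vector per word in one complex array, which is one route to the efficiency the abstract alludes to.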
Abstract:
Objective -- To determine whether heart failure with preserved systolic function (HFPSF) has a different natural history from left ventricular systolic dysfunction (LVSD). Design and setting -- A retrospective analysis of 10 years of data (for patients admitted between 1 July 1994 and 30 June 2004, with a study census date of 30 June 2005) routinely collected as part of clinical practice in a large tertiary referral hospital. Main outcome measures -- Sociodemographic characteristics, diagnostic features, comorbid conditions, pharmacotherapies, readmission rates and survival. Results -- Of the 2961 patients admitted with chronic heart failure, 753 had echocardiograms available for this analysis. Of these, 189 (25%) had normal left ventricular size and systolic function. In comparison to patients with LVSD, those with HFPSF were more often female (62.4% v 38.5%; P = 0.001), had less social support, were more likely to live in nursing homes (17.9% v 7.6%; P < 0.001), and had a greater prevalence of renal impairment (86.7% v 6.2%; P = 0.004), anaemia (34.3% v 6.3%; P = 0.013) and atrial fibrillation (51.3% v 47.1%; P = 0.008), but significantly less ischaemic heart disease (53.4% v 81.2%; P = 0.001). Patients with HFPSF were less likely to be prescribed an angiotensin-converting enzyme inhibitor (61.9% v 72.5%; P = 0.008); carvedilol was used more frequently in LVSD (1.5% v 8.8%; P < 0.001). Readmission rates were higher in the HFPSF group (median, 2 v 1.5 admissions; P = 0.032), particularly for malignancy (4.2% v 1.8%; P < 0.001) and anaemia (3.9% v 2.3%; P < 0.001). Both groups had the same poor survival rate (P = 0.912). Conclusions -- Patients with HFPSF were predominantly older women with less social support and higher readmission rates for associated comorbid illnesses. We therefore propose that reduced survival in HFPSF may relate more to comorbid conditions than to suboptimal cardiac management.
Abstract:
Aims -- Telemonitoring (TM) and structured telephone support (STS) have the potential to deliver specialised management to more patients with chronic heart failure (CHF), but their efficacy is still to be proven. Objectives -- To review randomised controlled trials (RCTs) of TM or STS on all-cause mortality and all-cause and CHF-related hospitalisations in patients with CHF, as a non-invasive remote model of specialised disease-management intervention. Methods and Results -- Data sources: We searched 15 electronic databases and hand-searched bibliographies of relevant studies, systematic reviews, and meeting abstracts. Two reviewers independently extracted all data. Study eligibility and participants: We included any RCT comparing TM or STS to usual care of patients with CHF. Studies that included intensified management with additional home or clinic visits were excluded. Synthesis: Primary outcomes (mortality and hospitalisations) were analysed; secondary outcomes (cost, length of stay, quality of life) were tabulated. Results: Thirty RCTs of STS and TM were identified (25 peer-reviewed publications (n=8,323) and five abstracts (n=1,482)). Of the 25 peer-reviewed studies, 11 evaluated TM (2,710 participants), 16 evaluated STS (5,613 participants) and two tested both interventions. TM reduced all-cause mortality (risk ratio (RR) 0.66 [95% CI 0.54-0.81], p<0.0001) and STS showed a similar trend (RR 0.88 [95% CI 0.76-1.01], p=0.08). Both TM (RR 0.79 [95% CI 0.67-0.94], p=0.008) and STS (RR 0.77 [95% CI 0.68-0.87], p<0.0001) reduced CHF-related hospitalisations. Both interventions improved quality of life, reduced costs, and were acceptable to patients. Improvements in prescribing, patient knowledge, self-care, and functional class were observed. Conclusion -- TM and STS both appear to be effective interventions to improve outcomes in patients with CHF.
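For readers unfamiliar with the risk-ratio figures quoted in reviews like this one, the standard log-scale computation of a risk ratio and its 95% confidence interval from 2x2 trial counts can be sketched as follows (the counts here are hypothetical, not data from the review):

```python
import math

def risk_ratio(events_tx, n_tx, events_ctrl, n_ctrl):
    """Risk ratio with a 95% CI via the usual log-scale standard error."""
    rr = (events_tx / n_tx) / (events_ctrl / n_ctrl)
    se = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctrl - 1 / n_ctrl)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical arms: 30/300 deaths under intervention vs 45/300 under usual care.
rr, lo, hi = risk_ratio(30, 300, 45, 300)
# rr ~ 0.67 with a CI that crosses 1: a favourable trend that does not
# reach significance, analogous in shape to an RR of 0.88 [0.76-1.01].
```

A CI lying entirely below 1 (as for the TM mortality and hospitalisation results) is what licenses a claim of significant benefit.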
Abstract:
Against a background of already thin markets in some sectors of major public sector infrastructure in Australia, and the desire of the Australian federal government to leverage private finance, concerns about ensuring sufficient levels of competition are prompting the federal government to seek new sources of inbound Foreign Direct Investment (FDI). The aim of this paper is to justify and develop a means of deploying the eclectic paradigm of internationalisation, as part of an Australian federally funded research project designed to explain the determinants of multinational contractors' willingness to bid for Australian public sector major infrastructure projects. Despite the dominance of the eclectic paradigm as a theory of internationalisation for over two decades, it has seen limited application to multinational construction. The research project is expected to be the first empirical study to deploy the eclectic paradigm to inbound FDI to Australia whilst using the dominant economic theories advocated for use within the eclectic paradigm. Furthermore, the research project is anticipated to yield a number of practical benefits. These include estimates of the potential scope to attract more multinational contractors to bid for Australian public sector infrastructure, including the nature and extent to which this scope can be influenced by the Australian governments responsible for the delivery of infrastructure. The research is also expected to indicate the extent to which indigenous and other multinational contractors domiciled in Australia are investing in special purpose technology and achieving productivity gains relative to foreign multinational contractors.
Abstract:
This article reports on a research program that has developed new methodologies for mapping the Australian blogosphere and tracking how information is disseminated across it. The authors improve on conventional web crawling methodologies in a number of significant ways: first, they track blogging activity as it occurs, by scraping new blog posts when such posts are announced through Really Simple Syndication (RSS) feeds. Second, they use custom-made tools that distinguish between different types of content, allowing them to analyze only the salient discursive content provided by bloggers. Finally, they are able to examine these better-quality data using both link network mapping and textual analysis tools, to produce both cumulative longer-term maps of interlinkages and themes, and specific shorter-term snapshots of current activity that indicate current clusters of heavy interlinkage and highlight their key themes. In this article, the authors discuss findings from a yearlong observation of the Australian political blogosphere, suggesting that Australian political bloggers consistently address current affairs, but interpret them differently from mainstream news outlets. The article also discusses the next stage of the project, which extends this approach to an examination of other social networks used by Australians, including Twitter, YouTube, and Flickr. This adaptation of the methodology moves away from narrow models of political communication, and toward an investigation of everyday and popular communication, providing a more inclusive and detailed picture of the Australian networked public sphere.
Abstract:
Agile ridesharing aims to use the capabilities of social networks and mobile phones to help people share vehicles and travel in real time. However, the application of social networking technologies in local communities to address issues of personal transport faces significant design challenges. In this paper we describe an iterative design-based approach to exploring this problem and discuss findings from the use of an early prototype. The findings focus upon interaction, privacy and profiling. Our early results suggest that explicitly entering information such as ride data and personal profile data into formal fields for explicit computation of matches, as is done in many systems, may not be the best strategy. It may be preferable instead to support informal communication and negotiation with text search techniques.
Abstract:
A concise introduction to the key ideas and issues in the study of media economics, drawing on a broad range of case studies - from Amazon and Twitter to Apple and Netflix - to illustrate how economic paradigms are not just theories, but provide important practical insights into how the media operates today. Understanding the economic paradigms at work in media industries and markets is vitally important for the analysis of the media system as a whole. The changing dynamics of media production, distribution and consumption are stretching the capacity of established economic paradigms. In addition to succinct accounts of neo-classical and critical political economics, the text offers fresh perspectives for understanding media drawn from two 'heterodox' approaches: institutional economics and evolutionary economics. Applying these paradigms to vital topics and case studies, Media Economics stresses the value – and limits – of contending economic approaches in understanding how the media operates today. It is essential reading for all students of Media and Communication Studies, and also for those from Economics, Policy Studies, Business Studies and Marketing backgrounds who are studying the media. Table of Contents: 1. Media Economics: The Mainstream Approach 2. Critical Political Economy of the Media 3. Institutional Economics 4. Evolutionary Economics 5. Case Studies and Conclusions
Abstract:
The significant challenge faced by government in demonstrating value for money in the delivery of major infrastructure revolves around estimating the costs and benefits of alternative modes of procurement. Faced with this challenge, one approach is to focus on a dominant performance outcome visible on the opening day of the asset as the means to select the procurement approach. In this case, value for money becomes a largely nominal concept, determined by the selected procurement mode delivering, or not delivering, the selected performance outcome, notwithstanding possible under-delivery on other desirable performance outcomes and possibly excessive transaction costs. This paper proposes a change of mind-set in this practice, to an approach in which the analysis commences with the conditions pertaining to the project and proceeds to deploy transaction cost and production cost theory to indicate a procurement approach that can claim superior value for money relative to competing procurement modes. This approach to delivering value for money in relative terms is developed in the first-order procurement decision-making model outlined in this paper. The model could complement the Public Sector Comparator (PSC) in terms of cross-validation, and it more readily lends itself to public dissemination. As a possible alternative to the PSC, the model could save time and money by requiring less detailed project preparation than the reference project, and it may send a stronger signal to the market that encourages more innovation and competition.
Abstract:
Thin bed technology for clay/concrete masonry has been gaining popularity in many developed economies in recent times through active engagement between industry and academia. One of the main drivers for the development of thin bed technology is the progressive contraction of the professional brick and block laying workforce, as the younger generation is not attracted to this profession owing to a general perception of manual work as outdated in the modern digital economy. This situation has led to soaring costs of skilled labour and general delays in the completion of construction activities in recent times. In parallel, the advent of manufacturing technologies for producing bricks and blocks to specified dimensions and shapes, along with several rapid-setting binders, has also contributed to the development of thin bed technology. Although this technology is still emerging, especially for applications in earthquake-prone regions, field applications have been reported in Germany for several decades and in Italy since the early 2000s. The Australian concrete masonry industry has recently taken a keen interest in pursuing research with a view to developing this technology. This paper presents the background information, including a review of the literature and pilot studies that have been carried out to enable planning of the development of thin bed technology. The paper concludes with recommendations for future research.
Abstract:
Background: An estimated 285 million people worldwide have diabetes and its prevalence is predicted to increase to 439 million by 2030. For the year 2010, it is estimated that 3.96 million excess deaths in the age group 20-79 years are attributable to diabetes worldwide. Self-management is recognised as an integral part of diabetes care. This paper describes the protocol of a randomised controlled trial of an automated interactive telephone system aiming to improve the uptake and maintenance of essential diabetes self-management behaviours. Methods/Design: A total of 340 individuals with type 2 diabetes will be randomised, either to the routine care arm, or to the intervention arm in which participants receive the Telephone-Linked Care (TLC) Diabetes program in addition to their routine care. The intervention requires the participants to telephone the TLC Diabetes phone system weekly for 6 months. They receive the study handbook and a glucose meter linked to a data uploading device. The TLC system consists of a computer with software designed to provide monitoring, tailored feedback and education on key aspects of diabetes self-management, based on answers voiced or entered during the current or previous conversations. Data collection is conducted at baseline (Time 1), 6-month follow-up (Time 2), and 12-month follow-up (Time 3). The primary outcomes are glycaemic control (HbA1c) and quality of life (Short Form-36 Health Survey version 2). Secondary outcomes include anthropometric measures, blood pressure, blood lipid profile and psychosocial measures, as well as measures of diet, physical activity, blood glucose monitoring, foot care and medication taking. Information on the utilisation of healthcare services, including hospital admissions, medication use and costs, is collected. An economic evaluation is also planned. Discussion: Outcomes will provide evidence concerning the efficacy of a telephone-linked care intervention for the self-management of diabetes. Furthermore, the study will provide insight into the potential for more widespread uptake of automated telehealth interventions, globally.
Abstract:
One of the major challenges in the design of social technologies is the evaluation of their qualities of use and how they are appropriated over time. While the field of HCI abounds in short-term exploratory design and studies of use, relatively little attention has focused on the continuous longitudinal development of prototypes and studies of their emergent use. We ground the exploration and analysis of use in the everyday world, embracing contingency and open-ended use, through a continuously available exploratory prototype. By examining use longitudinally, clearer insight can be gained into realistic, non-novelty usage and appropriation into everyday use. This paper sketches out a framework for design that puts a premium on immediate use and on evolving the design in response to use and user feedback. While such design practices with continuously developing systems are common in the design of social technologies, they are little documented. We describe our approach and reflect upon its key characteristics, based on our experiences from two case studies. We also present five major patterns of long-term usage which we found useful for design.