919 results for Run


Relevance: 10.00%

Publisher:

Abstract:

This paper examines the case of the Forgotten Australians as an opportunity to explore the role of the internet in the presentation of testimony. ‘Forgotten Australians’ are a group who suffered abuse and neglect after being removed from their parents – either in Australia or in the UK – and placed in church- and state-run institutions in Australia between 1930 and 1970. The campaign by this profoundly marginalised group coincided with the decade in which the opportunities of Web 2.0 were seen to be diffusing throughout different social groups and were considered a tool for social inclusion. We outline a conceptual framework that positions the internet as an environment in which the difficult relationships between painful past experiences and contemporary injunctions to remember them are negotiated. We then apply this framework to the analysis of case examples of posts and interactions on two websites with Web 2.0 functionality: YouTube and the National Museum of Australia. The analysis points to commonalities and differences in the agency of the internet in these two contexts, arguing that in both cases the websites supported the development of a testimony-like narrative and the claiming, sharing and acknowledgement of loss.

Relevance: 10.00%

Publisher:

Abstract:

Australian governments face the twin challenges of dealing with extreme weather-related disasters (such as floods and bushfires) and adapting to the impacts of climate change. These challenges are connected, so any response would benefit from a more integrated approach across and between the different levels of government. This report summarises the findings of an NCCARF-funded project that addresses this problem. The project undertook a three-way comparative case study of the 2009 Victorian bushfires, the 2011 Perth Hills bushfires, and the 2011 Brisbane floods. It collected data from the official inquiry reports into each of these events, and conducted new interviews and workshops with key stakeholders. The findings of this project included recommendations that range from the conceptual to the practical. First, it was argued that a reconceptualisation of terms such as ‘community’ and ‘resilience’ was necessary to allow for more tailored responses to varying circumstances. Second, it was suggested that the high level of uncertainty inherent in disaster risk management and climate change adaptation requires a more iterative approach to policymaking and planning. Third, some specific institutional reforms were proposed, including: 1) a new funding mechanism that would encourage collaboration between and across different levels of government, as well as promoting partnerships with business and the community; 2) improving community engagement through new resilience grants run by local councils; 3) embedding climate change researchers within disaster risk management agencies to promote institutional learning; and 4) creating an inter-agency network that encourages collaboration between organisations.

Relevance: 10.00%

Publisher:

Abstract:

An Application Specific Instruction-set Processor (ASIP) is a specialized processor tailored to run a particular application, or set of applications, efficiently. However, when there are multiple candidate applications in the application domain, it is difficult and time-consuming to find the optimum set of applications to implement. Existing ASIP design approaches perform this selection manually, based on a designer’s knowledge. We cut down the number of candidate applications by devising a classification method that clusters similar applications based on the special-purpose operations they share. This provides a significant reduction in comparison overhead while resulting in customized ASIP instruction sets that can benefit a whole family of related applications. Our method gives users the ability to quantify the degree of similarity between the sets of shared operations, in order to control the size of clusters. A case study involving twelve algorithms confirms that our approach can successfully cluster similar algorithms together based on the similarity of their component operations.
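As a sketch of the clustering idea, the snippet below groups applications by the Jaccard similarity of their operation sets, with a threshold playing the role of the user-controlled degree of similarity. The measure, the greedy grouping strategy and all application names are illustrative assumptions; the abstract does not specify the method's internals.

```python
# Hypothetical sketch: cluster applications by the special-purpose
# operations they share, using Jaccard similarity and a user-set threshold.

def jaccard(a, b):
    """Similarity between two sets of operations (1.0 = identical sets)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_apps(apps, threshold):
    """Greedy clustering: an app joins an existing cluster if its operation
    set is similar enough to every member; otherwise it starts a new one."""
    clusters = []
    for name, ops in apps.items():
        for cluster in clusters:
            if all(jaccard(ops, apps[m]) >= threshold for m in cluster):
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return clusters

# Illustrative operation sets, not from the paper's twelve-algorithm study.
apps = {
    "fft": {"mul", "add", "butterfly"},
    "dct": {"mul", "add", "butterfly"},
    "aes": {"xor", "sbox", "shift"},
}
print(cluster_apps(apps, threshold=0.5))  # fft and dct share all operations
```

Raising the threshold produces smaller, tighter clusters; lowering it merges more applications into each candidate instruction set.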

Relevance: 10.00%

Publisher:

Abstract:

Layers (about 60-100 μm thick) of almost pure BaCuO2 (BC1), as determined using X-ray diffractometry (XRD) and scanning electron microscopy (SEM), coat the surfaces of YBa2Cu3O7-x (Y123) samples partial-melt processed using a single-zone vertical furnace. The actual Cu/Ba ratio of the BC1 phase is 1.2-1.3, as determined using energy-dispersive X-ray spectrometry (EDS). The nominally BC1 phase displays an exsolution of BC1.5 or BC2 in the form of thin plates (about 50-100 nm thick) along {100}-type cleavage planes or facets. The exsolved phase also fills cracks within the BC1 layer, which requires it to have been in a molten state at some stage of processing. The samples were influenced by Pt contamination from the supporting wire, which may have stabilised the BC1.5 phase. Many of the Y123 grains have the same morphology as the exsolution domains and run nearly parallel to the thin plates of the exsolved phases, strongly indicating that Y123 nucleation took place at the interface between the BC1 and the exsolved BC1.5 or BC2 phases. The network of nearly parallel exsolved 'channels' provides a matrix and a mechanism through which a high degree of local texture can be initiated in the material.

Relevance: 10.00%

Publisher:

Abstract:

The purpose of this study was to compare the effectiveness of three different recovery modalities - active (ACT), passive (PAS) and contrast temperature water immersion (CTW) - on the performance of repeated treadmill running, lactate concentration and pH. Fourteen males performed two pairs of treadmill runs to exhaustion at 120% and 90% of peak running speed (PRS) over a 4-hour period. ACT, PAS or CTW was performed for 15 min after the first pair of treadmill runs. ACT consisted of running at 40% PRS, PAS consisted of standing stationary and CTW consisted of alternating between 60 s of cold (10°C) and 120 s of hot (42°C) water immersion. Run times were converted to times to cover set distances using the critical power model. Type of recovery modality did not have a significant effect on the change in time to cover 400 m (mean±SD: ACT 2.7±3.6 s, PAS 2.9±4.2 s, CTW 4.2±6.9 s), 1000 m (ACT 2.2±4.0 s, PAS 4.8±8.6 s, CTW 2.1±7.2 s) or 5000 m (ACT 1.4±29.0 s, PAS 16.7±58.5 s, CTW 11.7±33.0 s). Post-exercise blood lactate concentration was lower in ACT and CTW compared with PAS. Participants reported an increased perception of recovery in CTW compared with ACT and PAS. Blood pH was not significantly influenced by recovery modality. The data suggest both ACT and CTW reduce lactate accumulation after high-intensity running, but high-intensity treadmill running performance returns to baseline 4 hours after the initial exercise bout regardless of the recovery strategy employed.
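The conversion of run times to times over set distances can be illustrated with the standard two-parameter critical speed model, d = CS·t + D′, which the abstract refers to as critical power. The speeds and durations below are invented for illustration, not values from the study.

```python
def critical_speed(v1, t1, v2, t2):
    """Estimate critical speed CS (m/s) and anaerobic distance capacity
    D' (m) from two exhaustive runs at constant speeds v1 > v2 (m/s)
    lasting t1 < t2 (s), by fitting the line d = CS*t + D'."""
    d1, d2 = v1 * t1, v2 * t2      # distances covered to exhaustion
    cs = (d2 - d1) / (t2 - t1)     # slope of the distance-time line
    d_prime = d1 - cs * t1         # intercept
    return cs, d_prime

def time_for_distance(distance, cs, d_prime):
    """Predicted time (s) to cover a set distance under the model."""
    return (distance - d_prime) / cs

# Illustrative runs: 120 s at 6 m/s and 420 s at 5 m/s.
cs, d_prime = critical_speed(v1=6.0, t1=120.0, v2=5.0, t2=420.0)
print(cs, d_prime)                            # CS = 4.6 m/s, D' = 168 m
print(time_for_distance(400.0, cs, d_prime))  # predicted 400 m time
```

Expressing each exhaustive run as a predicted time over a fixed distance makes pre- and post-recovery performances directly comparable, as in the 400 m, 1000 m and 5000 m deltas reported above.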

Relevance: 10.00%

Publisher:

Abstract:

This study explores the accuracy and valuation implications of applying a comprehensive list of equity multiples in the takeover context. Motivating the study are the prevalent use of equity multiples in practice, the observed long-run underperformance of acquirers following takeovers, and the scarcity of multiples-based research in the merger and acquisition setting. In exploring the application of equity multiples in this context, three research questions are addressed: (RQ1) how accurate are equity multiples; (RQ2) which equity multiples are more accurate in valuing the firm; and (RQ3) which equity multiples are associated with greater misvaluation of the firm. Following a comprehensive review of the extant multiples-based literature, it is hypothesised that the accuracy of multiples in estimating stock market prices in the takeover context will rank as follows (from best to worst): (1) forecasted earnings multiples, (2) multiples closer to bottom-line earnings, and (3) multiples based on Net Cash Flow from Operations (NCFO) and trading revenue. The relative inaccuracies in multiples are expected to flow through to equity misvaluation (as measured by the ratio of estimated market capitalisation to residual income value, or P/V). Accordingly, it is hypothesised that greater overvaluation will be exhibited for multiples based on Trading Revenue, NCFO, Book Value (BV) and earnings before interest, tax, depreciation and amortisation (EBITDA) than for multiples based on bottom-line earnings, and that multiples based on Intrinsic Value will display the least overvaluation. The hypotheses are tested using a sample of 147 acquirers and 129 targets involved in Australian takeover transactions announced between 1990 and 2005. The results show, first, that the majority of computed multiples examined exhibit valuation errors within 30 percent of stock market values.
Second, and consistent with expectations, the results support the superiority of multiples based on forecasted earnings in valuing targets and acquirers engaged in takeover transactions. Although a gradual improvement in estimating stock market values is not entirely evident when moving down the Income Statement, historical earnings multiples perform better than multiples based on Trading Revenue or NCFO. Third, while multiples based on forecasted earnings have the highest valuation accuracy, they, along with Trading Revenue multiples for targets, produce the most overvalued valuations for acquirers and targets. Consistent with predictions, greater overvaluation is exhibited for multiples based on Trading Revenue for targets, and on NCFO and EBITDA for both acquirers and targets. Finally, as expected, multiples based on Intrinsic Value (along with BV) are associated with the least overvaluation. Given the widespread use of valuation multiples in takeover contexts, these findings offer a unique insight into their relative effectiveness. Importantly, the findings add to the growing body of valuation accuracy literature, especially within Australia, and should help market participants to better understand the relative accuracy and misvaluation consequences of the various equity multiples used in takeover documentation, assisting them in subsequent investment decision making.
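As a minimal, hypothetical sketch of the multiples approach and the within-30-percent accuracy screen described above: a firm's market capitalisation is estimated as a peer-median multiple applied to a value driver (here a P/E-style multiple on earnings), and the valuation error is measured against the observed market value. All figures are invented; the thesis's actual peer selection and multiple definitions are not reproduced here.

```python
def multiple_estimate(peer_multiples, value_driver):
    """Estimated market capitalisation = median peer multiple x value driver
    (e.g. a P/E multiple applied to forecast earnings)."""
    peers = sorted(peer_multiples)
    n = len(peers)
    median = peers[n // 2] if n % 2 else (peers[n // 2 - 1] + peers[n // 2]) / 2
    return median * value_driver

def valuation_error(estimate, market_cap):
    """Absolute error as a fraction of the observed market value."""
    return abs(estimate - market_cap) / market_cap

# Hypothetical target: peer P/E multiples, forecast earnings of $10m,
# observed market capitalisation of $170m.
est = multiple_estimate([12.0, 15.0, 18.0], value_driver=10.0e6)
err = valuation_error(est, market_cap=170.0e6)
print(est, err, err <= 0.30)  # within the 30 percent accuracy band?
```

A signed version of the error (estimate minus market value) would distinguish the over- from undervaluation that the misvaluation results above turn on.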

Relevance: 10.00%

Publisher:

Abstract:

In the summer of 2012-2013, the State Library of Queensland invited us to run a number of workshops for younger participants as part of the Garage Gamer program. The brief centred on the local games industry and the SLQ community, and the core concept was participant contribution. The 'Stories into Games' series of workshops ran across three Saturdays (January 5 - March 2). The workshops were aimed at younger audiences (ages 6-12), and the concept was to engage this group with games as game makers and designers, rather than as players. Each session saw a group of participants create a shared story, illustrate it, and then make game assets and objects out of their illustrative work. These were then put into a raw framework created in the Unity Game Engine so that the stories could be played.

Relevance: 10.00%

Publisher:

Abstract:

New media initiatives in Brazil's capital, Rio de Janeiro, are attempting to change mainstream ideas about favelas (poor districts) and their inhabitants. This thesis focuses on two of these initiatives that are being run by non-government organisations, Viva Favela and Imagens do Povo. This study takes an ethnographic and discursive approach to investigating and comparing two categories of professional photographers to determine how their working practices contribute to empowering the people living in Brazil's favelas. While mainstream photojournalists mainly cover human rights abuses in the favelas, community photographers challenge stereotypes by presenting images of the favelas' everyday life.

Relevance: 10.00%

Publisher:

Abstract:

BACKGROUND: Ankle joint equinus, or restricted dorsiflexion range of motion (ROM), has been linked to a range of pathologies of relevance to clinical practitioners. This systematic review and meta-analysis investigated the effects of conservative interventions on ankle joint ROM in healthy individuals and athletic populations. METHODS: Keyword searches of the Embase, Medline, Cochrane and CINAHL databases were performed, with the final search run in August 2013. Studies were eligible for inclusion if they assessed the effect of a non-surgical intervention on ankle joint dorsiflexion in healthy populations. Studies were quality-rated using a standard quality assessment scale. Standardised mean differences (SMDs) and 95% confidence intervals (CIs) were calculated, and results were pooled where study methods were homogeneous. RESULTS: Twenty-three studies met the eligibility criteria, with a total of 734 study participants. The results suggest that there is some evidence to support the efficacy of static stretching alone (SMDs: range 0.70 to 1.69) and static stretching in combination with ultrasound (SMDs: range 0.91 to 0.95), diathermy (SMD 1.12), diathermy and ice (SMD 1.16), heel raise exercises (SMDs: range 0.70 to 0.77), superficial moist heat (SMDs: range 0.65 to 0.84) and warm-up (SMD 0.87) in improving ankle joint dorsiflexion ROM. CONCLUSIONS: Some evidence exists to support the efficacy of stretching alone, and stretching in combination with other therapies, in increasing ankle joint ROM in healthy individuals. There is a paucity of quality evidence for the efficacy of other non-surgical interventions, so further research in this area is warranted.
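The pooled effect sizes above are standardised mean differences. As a sketch, Cohen's d with a pooled standard deviation, together with an approximate 95% CI, can be computed from two groups' summary statistics as follows; the example numbers are illustrative, not values from the review.

```python
import math

def smd_ci(mean1, sd1, n1, mean2, sd2, n2):
    """Standardised mean difference (Cohen's d with pooled SD) and an
    approximate 95% confidence interval from group summary statistics."""
    # Pooled standard deviation across the two groups.
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp
    # Large-sample standard error of d, then a normal-approximation CI.
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Illustrative data: dorsiflexion ROM gain (degrees) in intervention
# vs control groups of 20 participants each.
d, ci = smd_ci(mean1=14.0, sd1=4.0, n1=20, mean2=10.0, sd2=4.0, n2=20)
print(d, ci)  # d = 1.0, a "large" effect on the usual interpretation
```

A small-sample correction (Hedges' g) multiplies d by roughly 1 - 3/(4(n1 + n2) - 9); with 40 participants the adjustment is minor.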

Relevance: 10.00%

Publisher:

Abstract:

Coordination of dynamic interceptive movements is predicated on cyclical relations between an individual's actions and information sources from the performance environment. To identify dynamic informational constraints, which are interwoven with individual and task constraints, coaches’ experiential knowledge provides a complementary source to support empirical understanding of performance in sport. In this study, 15 expert coaches from 3 sports (track and field, gymnastics and cricket) participated in a semi-structured interview process to identify potential informational constraints which they perceived to regulate action during run-up performance. Expert coaches’ experiential knowledge revealed multiple information sources which may constrain performance adaptations in such locomotor pointing tasks. In addition to the locomotor pointing target, coaches’ knowledge highlighted two other key informational constraints: vertical reference points located near the locomotor pointing target and a check mark located prior to the locomotor pointing target. This study highlights opportunities for broadening the understanding of perception and action coupling processes, and the identified information sources warrant further empirical investigation as potential constraints on athletic performance. Integration of experiential knowledge of expert coaches with theoretically driven empirical knowledge represents a promising avenue to drive future applied science research and pedagogical practice.

Relevance: 10.00%

Publisher:

Abstract:

This paper addresses challenges that are part of the paradigm shift taking place in the way we produce, transmit and use power, a shift associated with what is known as smart grids. The aim of this paper is to explore present initiatives to establish smart grids as a sustainable and reliable power supply system. We argue that smart grids are not isolated, abstract conceptual models. We suggest that establishing sustainable and reliable smart grids depends on a series of contributions, including modelling and simulation projects, technological infrastructure pilots, and systemic methods and training, and not least on how these and other elements must interact to add reality to the conceptual models. We present and discuss three initiatives that illuminate smart grids from three very different positions: first, the new power grid simulator project in the electrical engineering PhD program at Queensland University of Technology (QUT); second, the new smart grid infrastructure pilot run by the Norwegian Centers of Expertise Smart Energy Markets (NCE SMART); and third, the new systemic Master's program on next-generation energy technology at Østfold University College (HiØ). These initiatives represent future threads in a mesh embedding smart grids in models, technology, infrastructure, education, skills and people.

Relevance: 10.00%

Publisher:

Abstract:

Efficient and effective feature detection and representation is an important consideration when processing videos, and a large number of applications, such as motion analysis, 3D scene understanding and tracking, depend on it. Among feature description methods, local features are becoming increasingly popular for representing videos because of their simplicity and efficiency. While they achieve state-of-the-art performance with low computational complexity, their performance is still too limited for real-world applications. Furthermore, rapid increases in the uptake of mobile devices have increased the demand for algorithms that can run with reduced memory and computational requirements. In this paper we propose a semi-binary feature detector-descriptor based on the BRISK detector, which can detect and represent videos with significantly reduced computational requirements, while achieving performance comparable to state-of-the-art spatio-temporal feature descriptors. First, the BRISK feature detector is applied on a frame-by-frame basis to detect interest points; then the detected key points are compared against consecutive frames for significant motion. Key points with significant motion are encoded with the BRISK descriptor in the spatial domain and the Motion Boundary Histogram in the temporal domain. This descriptor is not only lightweight but also has lower memory requirements because of the binary nature of the BRISK descriptor, opening up the possibility of applications on hand-held devices. We evaluate the combined detector-descriptor performance in the context of action classification with a standard, popular bag-of-features and SVM framework. Experiments are carried out on two popular datasets of varying complexity, and we demonstrate performance comparable to other descriptors at reduced computational complexity.
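The significant-motion test between consecutive frames can be sketched in isolation. This toy version assumes key points have already been matched across frames (for example by descriptor matching or optical flow) and simply thresholds their displacement; it is an illustrative assumption, not the authors' implementation.

```python
import math

def filter_moving_keypoints(kps_prev, kps_next, min_motion=2.0):
    """Keep only key points whose matched location in the next frame has
    moved by more than min_motion pixels (the 'significant motion' test).
    kps_prev/kps_next: parallel lists of (x, y) matched key point positions."""
    moving = []
    for (x0, y0), (x1, y1) in zip(kps_prev, kps_next):
        if math.hypot(x1 - x0, y1 - y0) > min_motion:
            moving.append((x0, y0))
    return moving

# Three matched key points: the first barely moves (likely background),
# the other two move several pixels and would go on to be described.
prev_pts = [(10.0, 10.0), (50.0, 50.0), (80.0, 20.0)]
next_pts = [(10.5, 10.0), (55.0, 53.0), (80.0, 26.0)]
print(filter_moving_keypoints(prev_pts, next_pts))
```

Discarding static key points early is what keeps the descriptor lightweight: only the surviving points incur BRISK and Motion Boundary Histogram encoding.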

Relevance: 10.00%

Publisher:

Abstract:

This paper presents the design of μAV, a palm-sized open source micro quadrotor constructed on a single Printed Circuit Board. The aim of the micro quadrotor is to provide a lightweight (approximately 86 g) and cheap robotic research platform that can be used for a range of robotic applications; one possible application is a cheap test bed for robotic swarm research. The goal of this paper is to give an overview of the design and capabilities of the micro quadrotor. The micro quadrotor includes a 9-Degree-of-Freedom Inertial Measurement Unit and a Gumstix Overo® Computer-On-Module, which can run the widely used Robot Operating System (ROS) for use with other research algorithms.

Relevance: 10.00%

Publisher:

Abstract:

This catalogue essay was written to accompany Clark Beaumont's 2014 exhibition at Kings Artist Run in Melbourne, 'Feeling It Out'. It contextualises Clark Beaumont's work within a history of women's participation and achievement in modern and contemporary art, and suggests that this body of work may be grappling with issues of anxiety, ambivalence and doubt about the art world.

Relevance: 10.00%

Publisher:

Abstract:

Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to investigate. Its recent gain in popularity can be attributed, to some degree, to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail is used, and computer-intensive, as it requires many interactions between agents, which can learn and pursue a goal. With the growing availability of data and the increase in computer power, these concerns are fading. Nonetheless, being able to update or extend the model as more information becomes available can be problematic, because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and the consumers' behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from usual ABMs is that it has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but also the model itself. Such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.
Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities' physical characteristics, and b) agents, which describe their behaviour according to their goals and previous learning experiences. This diverges from the traditional approach, in which both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics remain the same; this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required, depending on what simulation is to be run. For example, data can be used to describe the environment to which the agents respond (e.g. weather for solar panels) or to describe the assets and their relation to one another (e.g. the network assets). Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, which can proceed sequentially or in parallel for faster execution. Building agent-based models in this way has proven fast when adding new complex behaviours, as well as new types of assets.
Simulations have been run to understand the potential impact of changes to the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. responses to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains, such as transport, which is planned as future work with the addition of electric vehicles.
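The asset/agent separation described above can be sketched as follows: the same battery asset (physical characteristics only) is driven by an agent that encodes one possible behaviour, here peak shaving. All class and method names are illustrative assumptions, not MODAM's actual API, and the example assumes 1-hour time steps so that kW and kWh values can be compared directly.

```python
class BatteryAsset:
    """Physical characteristics only; no behaviour lives here."""
    def __init__(self, capacity_kwh, depth_of_discharge):
        self.capacity_kwh = capacity_kwh
        self.usable_kwh = capacity_kwh * depth_of_discharge
        self.stored_kwh = 0.0

class PeakShavingAgent:
    """Behaviour: discharge the battery when demand exceeds a threshold.
    A different agent (e.g. price arbitrage) could reuse the same asset."""
    def __init__(self, asset, peak_kw):
        self.asset, self.peak_kw = asset, peak_kw

    def step(self, demand_kw):
        """One 1-hour step: return the demand seen by the grid."""
        surplus = demand_kw - self.peak_kw
        if surplus > 0:
            discharge = min(surplus, self.asset.stored_kwh)
            self.asset.stored_kwh -= discharge
            return demand_kw - discharge
        return demand_kw

battery = BatteryAsset(capacity_kwh=10.0, depth_of_discharge=0.8)
battery.stored_kwh = battery.usable_kwh          # start fully charged
agent = PeakShavingAgent(battery, peak_kw=5.0)
print([agent.step(d) for d in [4.0, 7.0, 6.0]])  # grid-side demand per hour
```

Swapping in a different agent class, or a different asset with the same interface, changes one side of the simulation without touching the other, which is the composability the compositional approach is after.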