984 results for evaluation protocols

Relevance: 40.00%

Abstract:
The scaling down of transistor technology allows microelectronics manufacturers such as Intel and IBM to build ever more sophisticated systems on a single microchip. The classical interconnection solutions based on shared buses or direct connections between the modules of the chip are becoming obsolete, as they struggle to sustain the increasingly tight bandwidth and latency constraints that these systems demand. The most promising solution for future chip interconnects is the Network on Chip (NoC). NoCs are networks composed of routers and channels used to interconnect the different components installed on a single microchip. Examples of advanced processors based on NoC interconnects are the IBM Cell processor, composed of eight CPUs and installed in the Sony PlayStation 3, and the Intel Teraflops project, composed of 80 independent (simple) microprocessors. On-chip integration is becoming popular not only in the Chip Multi Processor (CMP) research area but also in the wider and more heterogeneous world of Systems on Chip (SoC). SoCs comprise all the electronic devices that surround us, such as cell phones, smartphones, home embedded systems, automotive systems, set-top boxes, etc. SoC manufacturers such as ST Microelectronics, Samsung and Philips, as well as universities such as Bologna University, M.I.T. and Berkeley, are all proposing proprietary frameworks based on NoC interconnects. These frameworks help engineers in the shift of design methodology and speed up the development of new NoC-based systems on chip. In this Thesis we propose an introduction to CMP and SoC interconnection networks. Then, focusing on SoC systems, we propose:
• a detailed, simulation-based analysis of the Spidergon NoC, an ST Microelectronics solution for SoC interconnects. The Spidergon NoC differs from many classical solutions inherited from the parallel computing world. Here we propose a detailed analysis of this NoC topology and its routing algorithms. Furthermore, we propose a new routing algorithm designed to optimize the use of the network's resources while also increasing its performance;
• a methodology flow, based on modified publicly available tools, that can be used to design, model and analyze any kind of System on Chip;
• a detailed analysis of an ST Microelectronics proprietary transport-level protocol that the author of this Thesis helped to develop;
• a comprehensive, simulation-based comparison of different network interface designs proposed by the author and the researchers at the AST lab, aimed at integrating shared-memory and message-passing based components on a single System on Chip;
• a powerful and flexible solution to address the timing closure exception issue in the design of synchronous Networks on Chip. Our solution is based on relay-station repeaters and reduces the power and area demands of NoC interconnects while also reducing their buffer needs;
• a solution that simplifies the design of NoCs while also increasing their performance and reducing their power and area consumption. We propose to replace complex and slow virtual channel-based routers with multiple, flexible, small Multi Plane routers. This solution allows us to reduce the area and power dissipation of any NoC while also increasing its performance, especially when resources are reduced.
This Thesis has been written in collaboration with the Advanced System Technology laboratory in Grenoble, France, and the Computer Science Department at Columbia University in the City of New York.
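To make the topology concrete: a Spidergon connects N nodes in a ring, with each node also linked to the node directly across the ring. The sketch below is a generic greedy next-hop choice on such a topology, given purely as an illustration; it is not the proprietary routing algorithm proposed in the Thesis, and the function name is ours.

```python
# Hypothetical sketch of next-hop selection on a Spidergon-style topology:
# N nodes on a ring, each node i also linked "across" to (i + N//2) % N.
# Illustrative only; not the routing algorithm developed in the Thesis.
def spidergon_next_hop(src: int, dst: int, n: int) -> int:
    """Pick the neighbour (clockwise, counterclockwise, or across)
    that moves a packet closer to its destination."""
    d = (dst - src) % n              # clockwise distance to destination
    if d == 0:
        return src                   # already at the destination
    if d <= n // 4:
        return (src + 1) % n         # short clockwise hop
    if d >= n - n // 4:
        return (src - 1) % n         # short counterclockwise hop
    return (src + n // 2) % n        # otherwise take the across link

# Route a packet hop by hop on an 8-node Spidergon from node 0 to node 3
hops, node = [], 0
while node != 3:
    node = spidergon_next_hop(node, 3, 8)
    hops.append(node)
print(hops)  # [4, 3]: across link first, then one ring hop
```

The across links are what distinguish the Spidergon from a plain ring: they cap the worst-case hop count at roughly N/4 + 1 instead of N/2.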

Relevance: 40.00%

Abstract:

The research performed during the PhD candidature was intended to evaluate the quality of white wines as a function of the reduction in SO2 use during the first steps of the winemaking process. In order to investigate the mechanism and intensity of the interactions occurring between lysozyme and the principal macro-components of musts and wines, a series of experiments on model wine solutions was undertaken, focusing attention on the polyphenol, SO2, oenological tannin, pectin, ethanol and sugar components. In the second part of this research program, a series of conventional sulphite-added vinifications was compared to vinifications in which sulphur dioxide was replaced by lysozyme, in order to define potential winemaking protocols suitable for the production of SO2-free wines. To reach this final goal, the technological performance of two selected yeast strains with a low aptitude to produce SO2 during fermentation was also evaluated. The data obtained suggest that the addition of lysozyme and oenological tannins during alcoholic fermentation could represent a promising alternative to the use of sulphur dioxide and a reliable starting point for the production of SO2-free wines. The different vinification protocols studied influenced the composition of the volatile profile of the wines at the end of alcoholic fermentation, especially with regard to alcohols and ethyl esters, partly as a consequence of the yeast's response to the presence or absence of sulphites during fermentation, contributing in different ways to the sensory profiles of the wines. In fact, the amino acid analysis showed that lysozyme can affect the consumption of nitrogen as a function of the yeast strain used in fermentation.
During bottle storage, the evolution of volatile compounds is affected by the presence of SO2 and oenological tannins, confirming their positive role in scavenging oxygen and maintaining the amounts of esters above certain levels, thus avoiding a decline in the wine's quality. Even though a natural decrease in the phenolic profiles was found, due to oxidation caused by the oxygen dissolved in the medium during the storage period, the presence of SO2 together with tannins counteracted the decay of the phenolic content present at the end of fermentation. Tannins also played a central role in preserving the polyphenolic profile of the wines during storage, confirming their antioxidant properties and their action as reductants. Our study, focused on the fundamental chemistry relevant to the oxidative phenolic spoilage of white wines, has demonstrated the ability of glutathione to inhibit the production of yellow xanthylium cation pigments generated from flavanols and glyoxylic acid at the concentrations at which it typically exists in wine. The ability of glutathione to bind glyoxylic acid rather than acetaldehyde may enable glutathione to be used as a 'switch' for glyoxylic acid-induced polymerisation mechanisms, as opposed to the equivalent acetaldehyde-induced polymerisation, in processes such as micro-oxidation. Further research is required to assess the ability of glutathione to prevent xanthylium cation production during the in-situ production of glyoxylic acid and in the presence of sulphur dioxide.

Relevance: 30.00%

Abstract:

This paper reviews the reliability and validity of visual analogue scales (VAS) in terms of (1) their ability to predict feeding behaviour, (2) their sensitivity to experimental manipulations, and (3) their reproducibility. VAS correlate with, but do not reliably predict, energy intake to the extent that they could be used as a proxy for energy intake. They do predict meal initiation in subjects eating their normal diets in their normal environment. Under laboratory conditions, subjectively rated motivation to eat using VAS is sensitive to experimental manipulations and has been found to be reproducible in relation to those experimental regimens. Other work has found them not to be reproducible in relation to repeated protocols. On balance it would appear, in as much as it is possible to quantify, that VAS exhibit a good degree of within-subject reliability and validity, in that they predict meal initiation and amount eaten with reasonable certainty and are sensitive to experimental manipulations. This reliability and validity appears more pronounced under the controlled (but more artificial) conditions of the laboratory, where the signal-to-noise ratio in experiments appears to be elevated relative to real life. It appears that VAS are best used in within-subject, repeated-measures designs where the effect of different treatments can be compared under similar circumstances. They are best used in conjunction with other measures (e.g. feeding behaviour, changes in plasma metabolites) rather than as proxies for these variables. New hand-held electronic appetite rating systems (EARS) have been developed to increase the reliability of data capture and decrease investigator workload. Recent studies have compared these with traditional pen-and-paper (P&P) VAS. The EARS have been found to be sensitive to experimental manipulations and reproducible relative to P&P. However, subjects appear to exhibit a significantly more constrained use of the scale when using the EARS relative to the P&P. For this reason it is recommended that the two techniques are not used interchangeably.

Relevance: 30.00%

Abstract:

The aim of this paper is to aid researchers in selecting appropriate qualitative methods in order to develop and improve future studies in the field of emotional design. These include observations, think-aloud protocols, questionnaires, diaries and interviews. Based on the authors' experiences, it is proposed that the methods under review can be successfully used for collecting data on emotional responses to evaluate user-product relationships. This paper reviews the methods and discusses their suitability, advantages and challenges in relation to design and emotion studies. Furthermore, the paper outlines the potential impact of technology on the application of these methods, discusses the implications of these methods for emotion research, and concludes with recommendations for future work in this area.

Relevance: 30.00%

Abstract:

The evolution of classic power grids into smart grids creates opportunities for most participants in the energy sector. Customers can save money by reducing energy consumption, energy providers can better predict energy demand, and the environment benefits, since lower energy consumption implies lower energy production, including a decrease in emissions from power plants. But the information and communication systems supporting smart grids can also be subject to classical or new network attacks. Attacks can result in serious damage, such as harming the privacy of customers, creating economic loss, and even disturbing the power supply/demand balance of large regions and countries. In this paper, we give an overview of the German smart metering architecture, its protocols and its security. Afterwards, we present a simulation framework which enables researchers to analyze security aspects of smart metering scenarios.

Relevance: 30.00%

Abstract:

Establishing age-at-death for skeletal remains is a vital component of forensic anthropology. The Suchey-Brooks (S-B) method of age estimation has been widely utilised since 1986 and relies on a visual assessment of the pubic symphyseal surface in comparison to a series of casts. Inter-population studies (Kimmerle et al., 2005; Djuric et al., 2007; Sakaue, 2006) demonstrate limitations of the S-B method; however, no assessment of this technique specific to Australian populations has been published. Aim: This investigation assessed the accuracy and applicability of the S-B method to an adult Australian Caucasian population by highlighting the error rates associated with this technique. Methods: Computed tomography (CT) and contact scans of the S-B casts were performed; each geometrically modelled surface was extracted and quantified for reference purposes. A Queensland skeletal database for Caucasian remains aged 15–70 years was initiated at the Queensland Health Forensic and Scientific Services – Forensic Pathology Mortuary (n=350). Three-dimensional reconstruction of the bone surface was performed using innovative volume visualisation protocols on the Amira® and Rapidform® platforms. Samples were allocated into 11 sub-sets of 5-year age intervals, and changes associated with the surface geometry were quantified in relation to age, gender and asymmetry. Results: Preliminary results indicate that computational analysis was successfully applied to model morphological surface changes. Significant differences between observed and actual ages were noted. Furthermore, the initial morphological assessment demonstrates significant bilateral asymmetry of the pubic symphysis, which is unaccounted for in the S-B method. These results suggest refinements to the S-B method when applied to Australian casework. Conclusion: This investigation promises to transform anthropological analysis to be more quantitative and less invasive using CT imaging. The overarching goal contributes to improving skeletal identification and medico-legal death investigation in the coronial process by narrowing the range of age-at-death estimation in a biological profile.

Relevance: 30.00%

Abstract:

Genomic DNA obtained from patient whole blood samples is a key element of genomic research. The advantages and disadvantages, in terms of time-efficiency, cost-effectiveness and laboratory requirements, of the procedures available to isolate nucleic acids need to be considered before choosing any particular method. These characteristics have not been fully evaluated for some laboratory techniques, such as the salting out method for DNA extraction, which has been excluded from comparison in the different studies published to date. We compared three different protocols (a traditional salting out method, a modified salting out method and a commercially available kit method) to determine the most cost-effective and time-efficient method to extract DNA. We extracted genomic DNA from whole blood samples obtained from breast cancer patient volunteers and compared the products obtained in terms of quantity (concentration of DNA extracted and DNA obtained per ml of blood used) and quality (260/280 ratio and polymerase chain reaction product amplification) of the yield. On average, the three methods showed no statistically significant differences in the final result, but when we accounted for the time and cost required by each method, they differed very significantly. The modified salting out method resulted in a seven- and twofold reduction in cost compared to the commercial kit and the traditional salting out method, respectively, and reduced the time required from 3 days to 1 hour compared to the traditional salting out method. This highlights the modified salting out method as a suitable choice for laboratories and research centres, particularly when dealing with a large number of samples.
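The reported differences can be put in back-of-envelope form. Only the ratios below come from the abstract; the absolute cost unit is an assumption for illustration (costs are normalized to the modified salting out method).

```python
# Illustrative only: costs normalized so the modified salting out method
# equals 1 unit (the study reports ratios, not absolute prices).
modified_cost = 1.0
kit_cost = 7 * modified_cost            # ~sevenfold more expensive than modified
traditional_cost = 2 * modified_cost    # ~twofold more expensive than modified

traditional_time_h = 3 * 24             # ~3 days for traditional salting out
modified_time_h = 1                     # ~1 hour for the modified protocol

print(kit_cost / modified_cost)                # 7.0 (relative kit cost)
print(traditional_time_h / modified_time_h)    # 72.0 (speed-up per extraction)
```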

Relevance: 30.00%

Abstract:

Children with Autism Spectrum Disorder experience difficulty in communication and in understanding the social world, which can have negative consequences for their relationships, their ability to manage emotions, and their capacity to deal with the challenges of everyday life. This thesis examines the effectiveness of the Active and Reflective components of the Get REAL program through detailed coding of video-recorded observations and longitudinal quantitative analysis. The aim of Get REAL is to increase the social, emotional and cognitive learning of children with High Functioning Autism (HFA). Get REAL is a group program designed specifically for use in inclusive primary school settings. The program was designed in response to the mixed success of existing social skills programs in generalising learning to new contexts. The theoretical foundation of Get REAL is based upon pedagogical theory and learning theory to facilitate transfer of learning, combined with experiential, individualised, evaluative and organisational approaches. This thesis is by publication and consists of four refereed journal papers: one accepted for publication and three under review. Paper 1 describes the development and theoretical basis of the Get REAL program and details the program structure and learning cycle. The focus of Paper 1 reflects the first question of interest in the thesis, which concerns the extent to which learning derived from participation in the program can be generalised to other contexts. Participants were 16 children with HFA ranging in age from 8 to 13 years. Results provided support for the generalisability of learning from Get REAL to home and school, evidenced by parent and teacher data collected before and after participation in Get REAL.
Following establishment of the generalisation of learning from Get REAL, Papers 2 and 3 focus on the Active and Reflective components of the program in order to examine how individual and group learning take place. Participants (N = 12) in the program were video-recorded during the Active and Reflective sessions. Using identical coding protocols for the video data, improvements in prosocial behaviour and a diminishing of inappropriate behaviours were apparent, with the exception of perspective taking. The data also revealed that 2 of the participants had atypical trajectories. An in-depth case study analysis of these 2 participants was then conducted in Paper 4. Data included reports from health care and education professionals within the school and externally (e.g., a paediatrician), and identified the multi-faceted nature of the care needed for children with comorbid diagnoses and extremely challenging family circumstances as a complex task to effect change. The results of this research support the effectiveness of the Get REAL program in promoting prosocial behaviours, such as improvements in engaging with others and emotional regulation, and in diminishing unwanted behaviours, such as conduct problems. Further, the gains made by the participating children were found to generalise beyond Get REAL to home and other school settings. The research contained in the thesis adds to current knowledge about how learning can take place for children with HFA. The results show that an experiential learning framework with a focus on social cognition, together with explicit teaching scaffolded with video feedback, are key ingredients for the generalisation of social learning to broader contexts.

Relevance: 30.00%

Abstract:

The need to address on-road motorcycle safety in Australia is important due to the disproportionately high percentage of riders and pillions killed and injured each year. One approach to preventing motorcycle-related injury is through training and education. However, motorcycle rider training lacks empirical support as an effective road safety countermeasure to reduce crash involvement. Previous reviews have highlighted that risk-taking is a contributing factor in many motorcycle crashes, rather than merely a lack of vehicle-control skills (Haworth & Mulvihill, 2005; Jonah, Dawson & Bragg, 1982; Watson et al., 1996). Hence, though the basic vehicle-handling skills and knowledge of road rules that are taught in most traditional motorcycle licence training programs may be seen as an essential condition of safe riding, they do not appear to be sufficient in terms of crash reduction. With this in mind, there is considerable scope for improving the focus and content of rider training and education programs. This program of research examined an existing traditional pre-licence motorcycle rider training program and formatively evaluated the addition of a new classroom-based module to address risky riding: the Three Steps to Safer Riding program. The pilot program was delivered in the real-world context of the Q-Ride motorcycle licensing system in the state of Queensland, Australia. Three studies were conducted as part of the program of research: Study 1, a qualitative investigation of delivery practices and student learning needs in an existing rider training course; Study 2, an investigation of the extent to which an existing motorcycle rider training course addressed risky riding attitudes and motives; and Study 3, a formative evaluation of the new program. A literature review, as well as the investigation of motorcyclists' learning needs in Study 1, aimed to inform the initial planning and development of the Three Steps to Safer Riding program.
Findings from Study 1 suggested that the training delivery protocols used by the industry partner training organisation were consistent with a learner-centred approach and largely met the learning needs of trainee riders. However, it also found that information from the course needs to be reinforced by on-road experiences for some riders once licensed and that personal meaning for training information was not fully gained until some riding experience had been obtained. While this research informed the planning and development of the new program, a project team of academics and industry experts were responsible for the formulation of the final program. Study 2 and Study 3 were conducted for the purpose of formative evaluation and program refinement. Study 2 served primarily as a trial to test research protocols and data collection methods with the industry partner organisation and, importantly, also served to gather comparison data for the pilot program which was implemented with the same rider training organisation. Findings from Study 2 suggested that the existing training program of the partner organisation generally had a positive (albeit small) effect on safety in terms of influencing attitudes to risk taking, the propensity for thrill seeking, and intentions to engage in future risky riding. However, maintenance of these effects over time and the effects on riding behaviour remain unclear due to a low response rate upon follow-up 24 months after licensing. Study 3 was a formative evaluation of the new pilot program to establish program effects and possible areas for improvement. Study 3a examined the short term effects of the intervention pilot on psychosocial factors underpinning risky riding compared to the effects of the standard traditional training program (examined in Study 2). 
It showed that the course which included the Three Steps to Safer Riding program elicited significantly greater positive attitude change towards road safety than the existing standard licensing course. This effect was found immediately following training, and mean scores for attitudes towards safety were also maintained at the 12-month follow-up. The pilot program also had an immediate effect on other key variables, such as risky riding intentions and the propensity for thrill seeking, although not significantly greater than that of the traditional standard training. A low response rate at the 12-month follow-up unfortunately prevented any firm conclusions being drawn regarding the impact of the pilot program on self-reported risky riding once licensed. Study 3a further showed that the use of intermediate outcomes such as self-reported attitudes and intentions for evaluation purposes provides insights into the mechanisms underpinning risky riding that can be changed by education and training. A multifaceted process evaluation conducted in Study 3b confirmed that the intervention pilot was largely delivered as designed, with course participants also rating most aspects of training delivery highly. The complete program of research contributed to the overall body of knowledge relating to motorcycle rider training, with some potential implications for policy in the area of motorcycle rider licensing. A key finding of the research was that psychosocial influences on risky riding can be shaped by structured education that focuses on awareness raising at a personal level and provides strategies to manage future riding situations. However, the formative evaluation was mainly designed to identify areas of improvement for the Three Steps to Safer Riding program, and it found several areas of potential refinement to improve the future efficacy of the program. These included aspects of program content, program delivery, resource development, and measurement tools.
The planned future follow-up of program participants' official crash and traffic offence records over time may lend further support for the application of the program within licensing systems. The findings reported in this thesis offer an initial indication that the Three Steps to Safer Riding is a useful resource to accompany skills-based training programs.

Relevance: 30.00%

Abstract:

Sector-wide interest in Reframe: QUT's Evaluation Framework continues, with a number of institutions requesting finer details as QUT embeds the new approach to evaluation across the university in 2013. This interest, both national and international, has warranted QUT's collegial response: to draw upon its experience of developing Reframe and to distil and offer Kaleidoscope back to the sector. The word Reframe is a relevant reference for QUT's specific re-evaluation, reframing and adoption of a new approach to evaluation, whereas Kaleidoscope reflects the unique lens through which any other institution will need to view its own cultural specificity and local context, through an extensive user-led stakeholder engagement approach, when introducing new approaches to learning and teaching evaluation. Kaleidoscope's objective is for QUT to distil the research-based stakeholder approach that proved successful in the Reframe Project into a transferable set of guidelines for use by other tertiary institutions across the sector. These guidelines will assist others to design, develop and deploy their own culturally specific widespread organisational change, informed by stakeholder engagement and organisational buy-in. It is intended that these guidelines will promote, support and enable other tertiary institutions to embark on their own evaluation projects and maximise impact. Kaleidoscope offers an institutional case study of widespread organisational change underpinned by Reframe's (i) evidence-based methodology; (ii) research, including a published environmental scan and literature review (Alderman et al., 2012), the development of a conceptual model (Alderman et al., in press 2013), project management principles (Alderman & Melanie, 2012) and national conference peer reviews; and (iii) a year-long strategic project with national outreach to collaboratively engage in the development of a draft set of National Guidelines.
Kaleidoscope's aims are to inform Higher Education evaluation policy development through national stakeholder engagement and the finalisation of the proposed National Guidelines. In conjunction with the conference paper, the authors will present draft Guidelines and a Framework ready for external peer review by evaluation practitioners from the Higher Education sector, as part of Kaleidoscope's dissemination strategy (Hinton & Gannaway, 2011), applying illuminative evaluation theory (Parlett & Hamilton, 1976) through conference workshops and ongoing discussions (Shapiro et al., 1983; Jacobs, 2000). The initial National Guidelines will be distilled from the Reframe: QUT's Evaluation Framework Policy, Protocols and incorporated Business Rules. It is intended that the outcomes of Kaleidoscope be owned by and reflect sectoral engagement, including iterative evaluation through multiple avenues of dissemination and collaboration across the Higher Education sector. The dissemination strategy, with the inclusion of Illuminative Evaluation methodology, provides an inclusive opportunity for other institutions and stakeholders across the Higher Education sector to give voice through the information-gathering component of evaluating the draft Guidelines, providing a comprehensive understanding of the complex realities experienced across the sector and thereby 'illuminating' both the shared and the unique lenses and contexts. This process will enable any final guidelines developed to have broader applicability, greater acceptance, enhanced sustainability and additional relevance, benefiting the Higher Education sector and enabling adoption and adaptation by any single institution for its local context.

Relevance: 30.00%

Abstract:

Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories, and ultimately in clinical laboratory settings, require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low-multiplex analyses as well as high-multiplex approaches in which software-driven scheduling of data acquisition is required. Performance was assessed by monitoring a range of chromatographic and mass spectrometric metrics, including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnosis of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range of variation in each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as: peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use of, and need for, an SSP to establish robust and reliable system performance. Use of an SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories, and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities.
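The pass/fail thresholds quoted above lend themselves to a simple automated check. The function below is an illustrative sketch of such a check; the thresholds come from the abstract, but the function and argument names are ours, not part of the published SSP.

```python
# Hypothetical helper encoding the pass/fail criteria quoted above for an
# LC-SID-MRM-MS system suitability check. Thresholds are from the abstract;
# the function and argument names are illustrative, not part of the SSP.
def system_suitable(peak_area_cv: float, peak_width_cv: float,
                    rt_sd_min: float, rt_drift_min: float) -> bool:
    """True only if every metric meets its system-suitability threshold."""
    return (peak_area_cv < 0.15 and    # peak area CV < 0.15
            peak_width_cv < 0.15 and   # peak width CV < 0.15
            rt_sd_min < 0.15 and       # retention time SD < 0.15 min (9 s)
            rt_drift_min < 0.5)        # retention time drift < 0.5 min (30 s)

print(system_suitable(0.08, 0.10, 0.05, 0.20))  # True: all metrics pass
print(system_suitable(0.08, 0.10, 0.05, 0.60))  # False: RT drift too large
```

A check like this could run after each SSP acquisition so that a drifting LC gradient or degrading column is flagged before real samples are quantified.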

Relevance: 30.00%

Abstract:

The global demand for food, feed, energy and water poses extraordinary challenges for future generations. It is evident that robust platforms for the exploration of renewable resources are necessary to overcome these challenges. Within the multinational framework MultiBioPro, we are developing biorefinery pipelines to maximize the use of plant biomass. More specifically, we use poplar and tobacco tree (Nicotiana glauca) as target crop species for improving saccharification, isoprenoid and long-chain hydrocarbon contents, fiber quality, and suberin and lignin contents. The methods used to obtain these outputs include GC-MS, LC-MS and RNA sequencing platforms. The metabolite pipelines are well-established tools for generating these types of data, but are limited in that only well-characterized metabolites can be used. Deep sequencing will allow us to include all transcripts present during the developmental stages of the tobacco tree leaf, but the reads have to be mapped back to the sequence of Nicotiana tabacum. With this set-up, we aim at a basic understanding of the underlying processes and at establishing an industrial framework to exploit the outcomes. In a longer-term perspective, we believe that the data generated here will provide the means for a sustainable biorefinery process using poplar and tobacco tree as raw materials. To date, the basal levels of metabolites in the samples have been analyzed, and the protocols utilized are provided in this article.

Relevance: 30.00%

Abstract:

The Australia Council awarded the tender for APAM 2014, 2016 and 2018 to the Brisbane Powerhouse. In awarding the contract for the presentation of APAM by the Brisbane Powerhouse, the Australia Council stipulated that a formal evaluation of the three iterations of APAM, and of activity in the intervening years, be undertaken. Queensland University of Technology, Creative Industries Faculty, under the leadership of Associate Professor Sandra Gattenhof, was contracted to undertake the formal evaluation. This is the first-year report on the Brisbane iteration of the Market. The report draws on data collected from a range of sources: the scoping study undertaken by Justin Macdonnell addressing the Market from 1994 to 2010; the tender document submitted by the Brisbane Powerhouse; in-person interviews with APAM staff and APAM stakeholders; vox pops from delegates in response to individual sessions; producer company/artist case studies; and, most significantly, responses to a detailed online survey sent to all delegates. The main body of the report is organised around three key research aims, as outlined in the Brisbane Powerhouse tender document (2011):
• evaluation of international market development outcomes through showcasing work to targeted international presenters and agents;
• evaluation of national market development outcomes through showcasing work to national presenters and producers;
• evaluation of the exchange of ideas, dialogue, skill development, partnerships, collaborations, co-productions and networks with local and international peers.
The culmination of the data analysis is articulated through five key recommendations, which may assist the APAM delivery team for the next iteration, in 2016. In summary, the recommendations are:
1. An Indigenous focus to remain central to the conception and delivery of APAM;
2. Re-framing APAM's function and its delivery;
3. Logistics and communications in a multi-venue approach, including communications and housekeeping, volunteers, catering, and re-calibrating the employment of Brisbane Powerhouse protocols and processes for APAM;
4. Presentation and promotion for presenters;
5. Strategic targeting of Asian producers.