853 results for Android, Application Programming Interface, Fansubbing, Android Services, App Developing


Relevance:

30.00%

Publisher:

Abstract:

The QUT Homestay Program is an essential part of the university’s commitment to meeting the accommodation needs of international students. Despite the importance of this style of accommodation, there is very little research addressing issues related to homestay arrangements. The program at Queensland University of Technology (QUT) was evaluated in 2002 to develop a continuous improvement framework to ensure the provision of quality homestay services to international students. This paper presents an overview of the evaluation and the key lessons learnt in providing quality homestay services to international students. It covers the social and cross-cultural issues faced by providers and international students in the homestay environment, homestay support needs, and the program information, policies, procedures and code of practice governing the program.

Relevance:

30.00%

Publisher:

Abstract:

miRDeep and its variants are widely used to quantify known and novel microRNAs (miRNAs) from small RNA sequencing (RNAseq) data. This article describes miRDeep*, our integrated miRNA identification tool, which is modelled on miRDeep but improves the precision of novel miRNA detection by introducing new strategies to identify precursor miRNAs. miRDeep* has a user-friendly graphical interface and accepts raw data in FastQ and Sequence Alignment Map (SAM) or the binary equivalent (BAM) format. Known and novel miRNA expression levels, measured by the number of reads, are displayed in an interface that shows each RNAseq read relative to the pre-miRNA hairpin. The secondary pre-miRNA structure and read locations for each predicted miRNA are shown and kept in a separate figure file. Moreover, the target genes of known and novel miRNAs are predicted using the TargetScan algorithm, and the targets are ranked according to confidence score. miRDeep* is an integrated standalone application in which sequence alignment, pre-miRNA secondary structure calculation and graphical display are coded purely in Java. The tool can be run on an ordinary personal computer with 1.5 GB of memory. Further, we show that miRDeep* outperforms existing miRNA prediction tools on our LNCaP and other small RNAseq datasets. miRDeep* is freely available online at http://www.australianprostatecentre.org/research/software/mirdeep-star
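As an illustration of the read-counting step described above, the following is a minimal Python sketch (not miRDeep* itself, which is written in Java) that counts RNAseq reads overlapping pre-miRNA hairpin regions in a coordinate-sorted, indexed BAM file using pysam. The file name and hairpin coordinates are hypothetical placeholders, not values from the paper.

```python
# Minimal sketch: count RNAseq reads overlapping pre-miRNA hairpin regions
# in an indexed BAM file, as a proxy for miRNA expression levels.
# File name and coordinates below are hypothetical placeholders.
import pysam

# hypothetical pre-miRNA hairpins: name -> (chromosome, start, end)
hairpins = {
    "known-mir-1": ("chr1", 1_000_000, 1_000_080),
    "novel-mir-1": ("chr3", 12_000_000, 12_000_080),
}

def count_reads(bam_path, regions):
    """Return the read count per pre-miRNA region."""
    counts = {}
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        for name, (chrom, start, end) in regions.items():
            counts[name] = bam.count(chrom, start, end)
    return counts

if __name__ == "__main__":
    print(count_reads("small_rna.bam", hairpins))
```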

Relevance:

30.00%

Publisher:

Abstract:

Background: A range of health outcomes at a population level are related to differences in levels of social disadvantage. Understanding the impact of any such differences in palliative care is important. The aim of this study was to assess, by level of socio-economic disadvantage, referral patterns to specialist palliative care and proximity to inpatient services. Methods: All inpatient and community palliative care services nationally were geocoded (using postcode) to one nationally standardised measure of socio-economic deprivation, the Socio-Economic Index for Areas (SEIFA; 2006 census data). Referral to palliative care services and the characteristics of referrals were described through data collected routinely at clinical encounters. The location of inpatient care was measured relative to each person’s home postcode and stratified by socio-economic disadvantage. Results: This study covered July to December 2009, with data from 10,064 patients. People from the highest SEIFA group (least disadvantaged) were significantly less likely to be referred to a specialist palliative care service, were more likely to be referred closer to death, and had more and longer episodes of inpatient care. Physical proximity of a person’s home to inpatient care showed a gradient, with distance increasing as levels of socio-economic advantage decreased. Conclusion: These data suggest that a simple relationship between low socio-economic status and poor access to a referral-based specialty such as palliative care does not exist. Different patterns of referral, and hence different patterns of care, emerge.

Relevance:

30.00%

Publisher:

Abstract:

Student performance on examinations is influenced by the level of difficulty of the questions. It therefore seems reasonable to propose that assessing the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries and find substantial variation across the exams on all measures. Most exams include a mix of questions of low, medium and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with the assessment of difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
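To make the correlation analysis concrete, here is a minimal Python sketch of how per-question measure scores might be correlated with a subjective difficulty rating. The abstract does not name the statistic used, so Spearman rank correlation is assumed here, and the numbers are illustrative placeholders rather than data from the study.

```python
# Minimal sketch: correlate per-question complexity measures with a
# subjective difficulty rating. Values are illustrative placeholders only.
from scipy.stats import spearmanr

measures = {
    "external_domain_refs":  [0, 1, 2, 1, 0],
    "explicitness":          [2, 1, 1, 3, 2],
    "linguistic_complexity": [1, 2, 3, 2, 1],
    "conceptual_complexity": [1, 2, 3, 3, 2],
    "code_length":           [5, 12, 30, 22, 8],
    "bloom_level":           [2, 3, 4, 4, 3],
}
difficulty = [1, 2, 3, 3, 2]  # subjective difficulty rating per question

for name, scores in measures.items():
    rho, p = spearmanr(scores, difficulty)
    print(f"{name}: rho={rho:.2f}, p={p:.3f}")
```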

Relevance:

30.00%

Publisher:

Abstract:

Phenomenography is a research approach devised to allow the investigation of the varying ways in which people experience aspects of their world. Whilst growing attention is being paid to interpretative research in LIS, it is not always clear how the outcomes of such research can be used in practice. This article explores the potential contribution of phenomenography in advancing the application of phenomenological and hermeneutic frameworks to LIS theory, research and practice. In phenomenography we find a research tool which, in revealing variation, uncovers everyday understandings of phenomena and provides outcomes that are readily applicable to professional practice. The outcomes may be used in human-computer interface design, enhancement, implementation and training; in the design and evaluation of services; and in education and training for both end users and information professionals. A proposed research territory for phenomenography in LIS includes investigating qualitative variation in the experienced meaning of: 1) information and its role in society; 2) LIS concepts and principles; 3) LIS processes; and 4) LIS elements.

Relevance:

30.00%

Publisher:

Abstract:

Recent research has proposed Neo-Piagetian theory as a useful way of describing the cognitive development of novice programmers. Neo-Piagetian theory may also be a useful way to classify materials used in learning and assessment. If Neo-Piagetian coding of learning resources is to be useful, then it is important that practitioners can learn it and apply it reliably. We describe the design of an interactive web-based tutorial for Neo-Piagetian categorization of assessment tasks. We also report an evaluation of the tutorial's effectiveness, in which twenty computer science educators participated. The participants' average classification accuracies on the three Neo-Piagetian stages were 85%, 71% and 78%. Participants also rated their agreement with the expert classifications, and indicated high agreement (91%, 83% and 91% across the three Neo-Piagetian stages). Self-rated confidence in applying Neo-Piagetian theory to classifying programming questions was 29% before the tutorial and 75% after it. Our key contribution is the demonstration of the feasibility of the Neo-Piagetian approach to classifying assessment materials, by showing that it is learnable and can be applied reliably by a group of educators. Our tutorial is freely available as a community resource.

Relevance:

30.00%

Publisher:

Abstract:

The rapid growth of services available on the Internet and exploited through ever-globalizing business networks poses new challenges for service interoperability. New services, ranging from consumer “apps” to enterprise suites and platform and infrastructure resources, are vying for demand with quickly evolving and overlapping capabilities, and with shorter cycles for extending service access from user interfaces to software interfaces. Services drawn from a wider global setting are subject to greater change and heterogeneity, creating new requirements for structural and behavioral interface adaptation. In this paper, we analyze service interoperability scenarios in global business networks and propose new patterns for service interactions, beyond those proposed over the last ten years through the development of Web service standards and process choreography languages. In contrast to those approaches, we reduce the design-time knowledge assumed for adapting services, giving way to run-time mismatch resolution; we extend the focus from bilateral to multilateral messaging interactions; and we propose declarative ways in which services and interactions take part in long-running conversations via the explicit use of state.
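To illustrate the last point, the following is a minimal Python sketch of a long-running, multilateral conversation whose state is held explicitly and shared by more than two participants. All class, field and participant names are hypothetical and are not drawn from the paper or from any Web service standard.

```python
# Minimal sketch of a long-running multilateral conversation with explicit
# state. Illustrative only; all names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Conversation:
    conversation_id: str
    participants: set[str]                  # more than two services may take part
    state: str = "open"                     # explicit, shared conversation state
    log: list[tuple[str, str, str]] = field(default_factory=list)

    def send(self, sender: str, receiver: str, message: str) -> None:
        if self.state != "open":
            raise RuntimeError("conversation is closed")
        if {sender, receiver} - self.participants:
            raise ValueError("unknown participant")
        self.log.append((sender, receiver, message))

    def close(self) -> None:
        self.state = "closed"

# Usage: three services exchanging messages within one conversation.
conv = Conversation("order-42", {"retailer", "supplier", "carrier"})
conv.send("retailer", "supplier", "purchase order")
conv.send("supplier", "carrier", "shipping request")
conv.close()
```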

Relevance:

30.00%

Publisher:

Abstract:

Motivation: Unravelling the genetic architecture of complex traits requires large amounts of data, sophisticated models and large computational resources. The lack of user-friendly software incorporating all these requisites is delaying progress in the analysis of complex traits. Methods: Linkage disequilibrium and linkage analysis (LDLA) is a high-resolution gene mapping approach based on sophisticated mixed linear models and applicable to any population structure. LDLA can use population history information, in addition to pedigree and molecular markers, to decompose traits into genetic components. Analyses are distributed in parallel over a large public grid of computers in the UK. Results: We have demonstrated the performance of LDLA with analyses of simulated data. There are real gains in statistical power to detect quantitative trait loci when using historical information compared with traditional linkage analysis. Moreover, the use of a grid of computers significantly increases computational speed, allowing analyses that would have been prohibitive on a single computer.
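As a small-scale illustration of distributing independent analyses in parallel, here is a minimal Python sketch that farms per-region jobs out to worker processes. It is only an analogy for running LDLA jobs on a grid; the analysis function and region coordinates are placeholder assumptions, not part of the LDLA software.

```python
# Minimal sketch: distribute independent per-region analyses across worker
# processes, a small-scale analogue of running jobs on a compute grid.
# analyse_region is a stand-in placeholder, not the LDLA software itself.
from multiprocessing import Pool

def analyse_region(region):
    """Placeholder per-region analysis; returns a dummy score."""
    chrom, start, end = region
    return (chrom, start, end, (end - start) % 7)

# hypothetical 1 Mb windows along one chromosome
regions = [("chr1", s, s + 1_000_000) for s in range(0, 10_000_000, 1_000_000)]

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(analyse_region, regions)
    print(results[:3])
```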

Relevance:

30.00%

Publisher:

Abstract:

At a quite fundamental level, the very way in which Public Service Broadcasting (PSB) may envisage its future, usually captured in the semantic shift from PSB to Public Service Media (PSM), is at stake when considering the recent history of public value discourse and the public value test. The core Reithian PSB idea assumed that public value would be created through the application of core principles: universality of availability and appeal, provision for minorities, education of the public, distance from vested interests, quality programming standards, program maker independence, and fostering of national culture and the public sphere. On the other hand, the philosophical import of the public value test is that potentially any excursion into the provision of new media services needs to be justified ex ante. In this era of New Public Management, greater transparency and accountability, and the proposition that resources for public value deliverables be contestable rather than sequestered in public sector institutions, what might be the new Archimedean point around which a contemporised normativity for PSM could be built? This paper argues for the innovation imperative as an organising principle for contemporary PSM. This may appear counterintuitive, as it is precisely PSB’s predilection for innovating in new media services (online, mobile and social media) that has produced the constraining apparatus of the ex ante/public value/Drei-Stufen-Test in Europe, based on principles of competitive neutrality and transparency in the application of public funds for defined and limited public benefit. However, I argue that a commitment to innovation can frame the new products and services that PSM can, and should, be delivering into a post-scarcity, superabundant all-media marketplace as complementary to commercial activity rather than as competitive ‘crowding out’. The evidence presented in this paper for this argument is derived mostly from analysis of PSM in the Australian media ecology. While no PSB outside Europe is subject to a formal public value test, the crowding-out arguments are certainly run in Australia, particularly by powerful commercial interests for whom free news is a threat to monetising quality news journalism. Take right-wing opinion leader Judith Sloan, herself a former ABC Board member: ‘… the recent expansive nature of the ABC – all those television stations, radio stations and online offerings – is actually squeezing activity that would otherwise be undertaken by the private sector. From partly correcting market failure, the ABC is now causing it. We are now dealing with a case of unfair competition and wasted taxpayer funds’ (The Drum, 1 August, http://www.abc.net.au/unleashed/2818220.html). But I argue that the crowding-out argument is difficult to sustain in Australia because of the PSBs’ non-dominant position and the fact that much of the innovation generated by the two PSBs, the ABC and the SBS, has not been imitated or competed for by the commercial broadcasters. The paper brings cases forward, such as SBS’s Go Back to Where You Came From (2011) as an example of product innovation, and the ABC’s Innovation Unit as a case study of process and organisational innovation that has also resulted in specific product and service innovation. In summary, at least some of the old Reithian dicta, along with spectrum scarcity and market failure arguments, have faded or are fading. Contemporary PSM need to justify their role in the system, and to society, in terms of innovation.

Relevance:

30.00%

Publisher:

Abstract:

Modified montmorillonite (MMT) was prepared at different surfactant (HDTMA) loadings through ion exchange. The conformational arrangement of the loaded surfactant within the interlayer space of MMT was obtained by computational modelling. The modelled conformational changes of the surfactant molecules aid the interpretation of results obtained from characterization methods such as XRD and surface analysis of the organoclays. Batch experiments were carried out for the adsorption of p-chlorophenol (PCP), and different conditions (pH and temperature) were used in order to determine the optimum sorption. For comparison, the experiments were repeated under the same conditions for p-nitrophenol (PNP). Langmuir and Freundlich equations were applied to the adsorption isotherms of PCP and PNP. The Freundlich isotherm model was found to be the best fit for both phenolic compounds, indicating that multilayer adsorption is involved in the adsorption process. In particular, the binding affinity value of PNP was higher than that of PCP, which is attributable to their relative hydrophobicities. The adsorption of the phenolic compounds by organoclays intercalated with high surfactant loadings was markedly improved, possibly because the intercalated surfactant molecules within the interlayer space contribute to the partition phase, resulting in greater adsorption of the organic pollutants.
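For reference, the two isotherm models named above take their standard textbook forms below; the symbols are the conventional ones and are not taken from the abstract itself (q_e: amount adsorbed at equilibrium, C_e: equilibrium concentration, q_m: monolayer capacity, K_L, K_F and n: fitted constants).

```latex
% Standard forms of the two isotherm models fitted in the study.
\begin{align}
  q_e &= \frac{q_m K_L C_e}{1 + K_L C_e} && \text{(Langmuir)} \\
  q_e &= K_F \, C_e^{1/n}                && \text{(Freundlich)}
\end{align}
```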

Relevance:

30.00%

Publisher:

Abstract:

This paper considers the problem of reconstructing the motion of a 3D articulated tree from 2D point correspondences subject to some temporal prior. Hitherto, smooth motion has been encouraged using a trajectory basis, yielding a hard combinatorial problem with time complexity growing exponentially in the number of frames. Branch and bound strategies have previously attempted to curb this complexity whilst maintaining global optimality. However, they provide no guarantee of being more efficient than exhaustive search. Inspired by recent work which reconstructs general trajectories using compact high-pass filters, we develop a dynamic programming approach which scales linearly in the number of frames, leveraging the intrinsically local nature of filter interactions. Extension to affine projection enables reconstruction without estimating cameras.
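As a conceptual illustration of a dynamic program that scales linearly in the number of frames, the sketch below selects one candidate state per frame with a cost that couples only neighbouring frames; the pairwise smoothness term stands in for the local filter interactions described above. This is an assumption-laden sketch, not the authors' implementation, and all names and values are hypothetical.

```python
# Minimal sketch of a per-frame dynamic program, linear in the number of
# frames: each frame has a small set of candidate states, and the cost
# couples only neighbouring frames. Not the authors' implementation.

def dp_path(candidates, unary, smooth):
    """candidates[t] lists states for frame t; returns the minimum-cost path."""
    T = len(candidates)
    cost = [[unary(0, s) for s in candidates[0]]]
    back = []
    for t in range(1, T):
        row, ptr = [], []
        for s in candidates[t]:
            best_j, best_c = min(
                ((j, cost[t - 1][j] + smooth(p, s))
                 for j, p in enumerate(candidates[t - 1])),
                key=lambda jc: jc[1],
            )
            row.append(best_c + unary(t, s))
            ptr.append(best_j)
        cost.append(row)
        back.append(ptr)
    # Trace back the optimal state sequence.
    j = min(range(len(cost[-1])), key=lambda k: cost[-1][k])
    path = [candidates[-1][j]]
    for t in range(T - 2, -1, -1):
        j = back[t][j]
        path.append(candidates[t][j])
    return list(reversed(path))

# Illustrative use: pick a smooth sequence of candidate depths.
cands = [[0.0, 1.0, 2.0] for _ in range(5)]
print(dp_path(cands, unary=lambda t, s: abs(s - 1.0), smooth=lambda a, b: (a - b) ** 2))
```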

Relevance:

30.00%

Publisher:

Abstract:

Since the movement for economic reform started in China 20 years ago, the nation's GDP has grown on average by seven to nine per cent a year, making China's construction industry one of the largest in the world. This paper presents an overview of China's foreign economic cooperation development (FECD) in the context of exporting three major construction services, namely contracting, labour and design. The paper outlines the export market profile of Chinese contractors and discusses their current position in the international market. It then addresses the challenges they face in meeting the ambitious strategic targets set out by the Government for the FECD, which cover the export of construction services. Finally, the paper sheds some light on key exporting strategies currently adopted by Chinese contractors.

Relevance:

30.00%

Publisher:

Abstract:

The adoption, within Australia's state-based educational system, of a national school curriculum that includes a pre-Year 1 Foundation Year has raised questions about the purpose of this year of early education. A document analysis was undertaken across three Australian states, examining three constructions of the pre-Year 1 class and the tensions arising from varied perspectives. Tensions have emerged over state-based adaptations of the national curriculum, scripted pedagogies for change management, differing ideological perspectives and the positioning of stakeholders. The results indicate that since 2012 there has been a shift in constructions of the pre-Year 1 class towards school-based ideologies, especially in Queensland. Accordingly, the positioning of children, parents and teachers has also changed. These results resonate with previous international indications of the 'schooling' of early education. The experiences of Australian early adopters of the curriculum offer insights for other jurisdictions in Australia and internationally, and raise questions about future development in early years education.

Relevance:

30.00%

Publisher:

Abstract:

The Queensland University of Technology (QUT) Library, like many other academic and research institution libraries in Australia, has been collaborating with a range of academic and service provider partners to develop research data management services and collections. Three main strategies are being employed, and an overview of the process, infrastructure, usage and benefits is provided for each of these service aspects. The development of processes and infrastructure to facilitate the strategic identification and management of QUT-developed datasets has been a major focus. A number of Australian National Data Service (ANDS) sponsored projects, including Seeding the Commons, Metadata Hub / Store, Data Capture and Gold Standard Record Exemplars, have provided or will provide QUT with a data registry system, linkages to storage, processes for identifying and describing datasets, and a degree of academic awareness. QUT supports open access and has established a culture for making its research outputs available via the QUT ePrints institutional repository. Incorporating open access research datasets into the library collections is an equally important aspect of facilitating the adoption of data-centric eResearch methods. Some datasets are available commercially, and the library has collaborated with QUT researchers, especially in the QUT Business School, to identify and procure a rapidly growing range of financial datasets to support research. The library undertakes the licensing and uses the Library Resource Allocation to pay for the subscriptions. This is a new area of collection development, with much to be learned. The final strategy discussed is the library acting as a “data broker”: QUT Library has been working with researchers to identify these datasets and to undertake the licensing, payment and access arrangements as a centrally supported service on behalf of researchers.

Relevance:

30.00%

Publisher:

Abstract:

The parallel track model is one of several models used in health promotion programmes that focus on community empowerment. It is unique in that it explicitly incorporates an empowerment approach within a top-down health programme. Since its development in 1999-2000, the model has been used in various health programmes in both developed and developing countries. The aim of this review is to examine the nature and extent of the application of this model and its contribution to promoting health. A review of the literature published between 2000 and 2011 was conducted. Nine studies matched the inclusion criteria and revealed that the model has mostly been applied to disadvantaged communities to address health determinants such as poverty and health literacy. This review found that the model had a positive impact on specific health outcomes such as health literacy and community capacity. We conclude that the parallel track model has the most potential for building capacity for community health promotion and appears to be least useful for interventions focusing on health behaviour change within a limited time frame.