969 results for technology standard
A tag-based personalized item recommendation system using tensor modeling and topic model approaches
Abstract:
This research falls in the area of enhancing the quality of tag-based item recommendation systems. It aims to achieve this by employing a multi-dimensional user profile approach and by analyzing the semantic aspects of tags. Tag-based recommender systems have two characteristics that need to be carefully studied in order to build a reliable system. Firstly, the multi-dimensional correlation, called tag assignment
Abstract:
The Urban Informatics Research Lab brings together a group of people who focus their research on interdisciplinary topics at the intersection of social, spatial, and technical research domains—that is, people, place, and technology. Those topics are spread across the breadth of urban life—its contemporary issues and its needs, as well as the design opportunities that we have as individuals, groups, communities, and as a whole society. The lab’s current research areas include urban planning and design, civic innovation, mobility and transportation, education and connected learning, environmental sustainability, and food and urban agriculture. The common denominator of the lab’s approach is user-centered design research directed toward understanding, conceptualizing, developing, and evaluating sociotechnical practices as well as the opportunities afforded by innovative digital technology in urban environments.
Abstract:
This thesis explored how biophilic urbanism, or the integration of natural features into increasingly dense urban environments, has become mainstream in cities around the world. Fourteen factors uncovered through a case study investigation provide insight for decision makers and change agents in Australia to use biophilic urbanism to address impacts of population growth, climate change and resource shortages. The thesis uses an inductive research approach to explore how barriers to the integration of multi-functional vegetated and water design elements into the built environment can be overcome, such that these become standard inclusions in urban design and development processes.
Abstract:
Identifying product families has been considered an effective way to accommodate the increasing product varieties across diverse market niches. In this paper, we propose a novel framework for identifying product families by using a similarity measure for a common product design data structure, the BOM (Bill of Materials), based on data mining techniques such as frequent mining and clustering. For calculating the similarity between BOMs, a novel Extended Augmented Adjacency Matrix (EAAM) representation is introduced that consists of information not only of the content and topology but also of the frequent structural dependency among the various parts of a product design. These EAAM representations of BOMs are compared to calculate the similarity between products and used as a clustering input to group the product families. When applied to real-life manufacturing data, the proposed framework outperforms a current baseline that uses orthogonal Procrustes for grouping product families.
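The compare-then-cluster idea above can be sketched in a few lines. This is a minimal stand-in, not the paper's method: it uses plain adjacency matrices with a shared part ordering and cosine similarity, whereas the actual EAAM also encodes content and frequent structural dependencies; the function names and threshold are hypothetical.

```python
import math

def adjacency_similarity(A, B):
    """Cosine similarity between two flattened BOM adjacency matrices.

    Assumes both matrices use the same part ordering. A richer
    representation (like the EAAM) would also carry content and
    frequent-structure information; this sketch compares topology only.
    """
    a = [v for row in A for v in row]
    b = [v for row in B for v in row]
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def group_families(boms, threshold=0.8):
    """Greedy single-pass clustering of BOMs into product families.

    Each BOM joins the first family whose representative (the family's
    first member) is similar enough; otherwise it starts a new family.
    Returns a list of families, each a list of indices into `boms`.
    """
    families = []
    for i, bom in enumerate(boms):
        for fam in families:
            if adjacency_similarity(boms[fam[0]], bom) >= threshold:
                fam.append(i)
                break
        else:
            families.append([i])
    return families
```

Two BOMs with identical parent-child structure land in the same family, while one with a different topology starts its own; a real pipeline would replace both the similarity measure and the greedy grouping with the paper's EAAM comparison and a proper clustering algorithm.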
Abstract:
Previous work within the Faculty of Law, QUT had considered law students' perceptions and use of technology and how to manage that use without it becoming a distraction. Students' willingness to use technology for their learning purposes, however, had not been tested. The research seeks to understand the effect of law academics' in-class use of technology on both law and justice students. Students' use, and their perception of academics' use, in lectures and tutorials was tested by means of an online survey conducted on an anonymous and voluntary basis. The analysis of results revealed that the majority of respondents rarely use technology in class for their learning purposes. However, most indicated that academics' in-class use of technology enabled their learning. The research also reinforced the need to make any level of engagement with technology meaningful for students. In particular, it identified the need to ensure that students are enabled, by appropriate training, in their use of any required databases or software.
Abstract:
The human choroid is capable of rapidly changing its thickness in response to a variety of stimuli. However, little is known about the role of the autonomic nervous system in the regulation of the thickness of the choroid. Therefore, we investigated the effect of topical parasympatholytic and sympathomimetic agents upon the choroidal thickness and ocular biometrics of young healthy adult subjects. Fourteen subjects (mean age 27.9 ± 4 years) participated in this randomized, single-masked, placebo-controlled study. Each subject had measurements of choroidal thickness (ChT) and ocular biometrics of their right eye taken before, and then 30 and 60 min following, the administration of topical pharmacological agents. Three different drugs: 2% homatropine hydrobromide, 2.5% phenylephrine hydrochloride and a placebo (0.3% hydroxypropyl methylcellulose) were tested in all subjects; each on different days (at the same time of the day) in randomized order. Participants were masked to the pharmacological agent being used at each testing session. The instillation of 2% homatropine resulted in a small but significant increase in subfoveal ChT at 30 and 60 min after drug instillation (mean change 7 ± 3 μm and 14 ± 2 μm respectively; both p < 0.0001). The parafoveal choroid also exhibited a similar-magnitude, significant increase in thickness with time after 2% homatropine (p < 0.001), with a mean change of 7 ± 0.3 μm and 13 ± 1 μm (in the region located 0.5 mm from the fovea center), 6 ± 1 μm and 12.5 ± 1 μm (1 mm from the fovea center) and 6 ± 2 μm and 12 ± 2 μm (1.5 mm from the fovea center) after 30 and 60 min respectively. Axial length decreased significantly 60 min after homatropine (p < 0.01). There were also significant changes in lens thickness (LT) and anterior chamber depth (ACD) (p < 0.05) associated with homatropine instillation.
No significant changes in choroidal thickness, or ocular biometrics were found after 2.5% phenylephrine or placebo at any examination points (p > 0.05). In human subjects, significant increases in subfoveal and parafoveal choroidal thickness occurred after administration of 2% homatropine and this implies an involvement of the parasympathetic system in the control of choroidal thickness in humans.
Abstract:
This study describes a field experiment assessing the effectiveness of education and technological innovation in reducing air pollution generated by domestic wood heaters. Two hundred and twenty-four households from a small regional center in Australia were randomly assigned to one of four experimental conditions: (1) Education only – households received a wood smoke reduction education pack containing information about the negative health impacts of wood smoke pollution, and advice about wood heater operation and firewood management; (2) SmartBurn only – households received a SmartBurn canister designed to improve combustion and help wood fires burn more efficiently; (3) Education and SmartBurn; and (4) neither Education nor SmartBurn (control). Analysis of covariance, controlling for pre-intervention household wood smoke emissions, wood moisture content, and wood heater age, revealed that education and SmartBurn were both associated with significant reductions in wood smoke emissions during the post-intervention period. Follow-up mediation analyses indicated that education reduced emissions by improving wood heater operation practices, but not by increasing health risk perceptions. As predicted, SmartBurn exerted a direct effect on emission levels, unmediated by wood heater operation practices or health risk perceptions.
Abstract:
This pilot project investigated the existing practices and processes of Proficient, Highly Accomplished and Lead teachers in the interpretation, analysis and implementation of National Assessment Program – Literacy and Numeracy (NAPLAN) data. A qualitative case study approach was the chosen methodology, with nine teachers across a variety of school sectors interviewed. Themes and sub-themes were identified from the participants’ interview responses, revealing the ways in which Queensland teachers work with NAPLAN data. The data showed that individual schools and teachers generally adopted their own ways of working with data, with approaches ranging from individual/ad hoc to hierarchical or whole-school. Findings also revealed that data are the responsibility of various persons within the school hierarchy; some work with the data electronically whilst others rely on manual manipulation. Manipulation of data is used for various purposes including tracking performance, value adding and targeting programmes for specific groups of students, for example the gifted and talented. Whilst all participants had knowledge of intervention programmes and how practice could be modified, there were large inconsistencies in knowledge and skills across schools. Some see the use of data as a mechanism for accountability, whilst others mention data with regard to changing the school culture and identifying best practice. Overall, the findings showed inconsistencies in approach to focus area 5.4. Recommendations therefore include a more national approach to the use of educational data.
Abstract:
As a key element in their response to new media forcing transformations in mass media and media use, newspapers have deployed various strategies not only to establish online and mobile products and develop healthy business plans, but also to become dominant portals. Their response to change was the subject of an early investigation by one of the present authors (Keshvani 2000). That was part of a set of short studies inquiring into what impact new software applications and digital convergence might have on journalism practice (Tickle and Keshvani 2000), and also looking for demonstrations of the way that innovations, technologies and protocols then under development might produce a “wireless, streamlined electronic news production process” (Tickle and Keshvani 2001). The newspaper study compared the online products of The Age in Melbourne and the Straits Times in Singapore. It provided an audit of the Singapore and Australia Information and Communications Technology (ICT) climate, concentrating on the state of development of carrier networks as a determining factor in the potential strength of the two services within their respective markets. In the outcome, contrary to initial expectations, the early cable roll-out and extensive ‘wiring’ of the city in Singapore had not produced a level of uptake of Internet services as strong as that achieved in Melbourne by more ad hoc and varied strategies. By interpretation, while news websites and online content were at an early stage of development everywhere, and much the same as one another, no determining structural imbalance existed to separate these leading media participants in Australia and South-east Asia. The present research revisits that situation by again studying the online editions of the two large newspapers in the original study, and one other, The Courier Mail (included, in recognition of the diversification of types of product in this field, as a representative of Newscorp, now a major participant).
The inquiry works through the principle of comparison. It is an exercise in qualitative, empirical research that establishes a comparison between the situation in 2000 as described in the earlier work, and the situation in 2014, after a decade of intense development in digital technology affecting the media industries. It is in that sense a follow-up study on the earlier work, although this time giving emphasis to the content and style of the actual products as experienced by their users. It compares the online and print editions of each of these three newspapers; then the three mastheads as print and online entities, among themselves; and finally it compares one against the other two, as representing a South-east Asian model and Australian models. This exercise is accompanied by a review of literature on the developments in ICT affecting media production and media organisations, to establish the changed context. The new study of the online editions is conducted as a systematic appraisal of the first level, or principal screens, of the three publications, over the course of six days (10-15.2.14 inclusive). For this, categories for analysis were developed through a preliminary examination of the products over three days in the week before. That process identified significant elements of media production, such as: variegated sourcing of materials; randomness in the presentation of items; differential production values among the media platforms considered, whether text, video or still images; the occasional repurposing and repackaging of top news stories of the day; and the presence of standard news values – once again drawn out of the trial ‘bundle’ of journalistic items. Reduced in this way, the online artefacts become comparable with the companion print editions from the same days. The categories devised and then used in the appraisal of the online products have been adapted to print, to give the closest match of sets of variables.
This device, to study the two sets of publications on like standards -- essentially production values and news values -- has enabled the comparisons to be made. This comparing of the online and print editions of each of the three publications was set up as the first step in the investigation. In recognition of the nature of the artefacts, as ones that carry very diverse information by subject and level of depth, and involve heavy creative investment in the formulation and presentation of the information, the assessment also includes an open section for interpreting and commenting on main points of comparison. This takes the form of a field for text, for the insertion of notes, in the table employed for summarising the features of each product, for each day. When the sets of comparisons as outlined above are noted, the process then becomes interpretative, guided by the notion of change. In the context of changing media technology and publication processes, what substantive alterations have taken place in the overall effort of news organisations in the print and online fields since 2001, and in their print and online products separately? Have they diverged or continued along similar lines? The remaining task is to begin to make inferences from that. Will the examination of findings support the proposition that a review of the earlier study, and a forensic review of new models, does provide evidence of the character and content of change -- especially change in journalistic products and practice? Will it permit an authoritative description of the essentials of such change in products and practice? Will it permit generalisation, and provide a reliable base for discussion of the implications of change, and future prospects? Preliminary observations suggest a more dynamic and diversified product has been developed in Singapore, well themed, obviously sustained by public commitment and habituation to diversified online and mobile media services.
The Australian products suggest a concentrated corporate and journalistic effort and deployment of resources, with a strong market focus, but less settled and ordered, and showing signs of limitations imposed by the delay in establishing a uniform, large broadband network. The scope of the study is limited. It is intended to test, and take advantage of, the original study as evidentiary material from the early days of newspaper companies’ experimentation with online formats. Both are small studies. The key opportunity for discovery lies in the ‘time capsule’ factor: the availability of well-gathered and processed information on major newspaper company production, at the threshold of a transformational decade of change in their industry. The comparison stands to identify key changes. It should also be useful as a reference for further inquiries of the same kind that might be made, and for monitoring of the situation in regard to online newspaper portals into the future.
Abstract:
Purpose: The purpose of this work was to evaluate the patient-borne financial cost of common, adverse breast cancer treatment-associated effects, comparing cost across women with or without these side-effects. Methods: 287 Australian women diagnosed with early-stage breast cancer were prospectively followed, starting at six months post-surgery, for 12 months, with three-monthly assessment of detailed treatment-related side effects and their direct and indirect patient costs attributable to breast cancer. Bootstrapping statistics were used to analyze cost data and adjusted logistic regression was used to evaluate the association between costs and adverse events from breast cancer. Costs were inflated and converted from 2002 Australian to 2014 US dollars. Results: More than 90% of women experienced at least one adverse effect (i.e. post-surgical issue, reaction to radiotherapy, upper-body symptoms or reduced function, lymphedema, fatigue or weight gain). On average, women paid $5,636 (95% CI: $4,694, $6,577) in total costs. Women with any one of the following symptoms (fatigue, reduced upper-body function, upper-body symptoms), or women who report ≥4 adverse treatment-related effects, have 1.5 to nearly 4 times the odds of having higher healthcare costs than women who do not report these complaints (p<0.05). Conclusions: Women face substantial economic burden due to a range of treatment-related health problems, which may persist beyond the treatment period. Improving breast cancer care by incorporating prospective surveillance of treatment-related side effects, and strategies for prevention and treatment of concerns (e.g., exercise), has real potential for reducing patient-borne costs.
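The abstract reports bootstrap analysis of cost data with a confidence interval around the mean cost. As a hedged illustration of the general technique only (not the authors' analysis code), a percentile bootstrap interval for a mean can be computed like this; the function name and parameters are hypothetical:

```python
import random
import statistics

def bootstrap_mean_ci(costs, n_resamples=2000, alpha=0.05, seed=42):
    """Percentile bootstrap CI for the mean of `costs`.

    Resamples the data with replacement, records each resample's mean,
    and returns the empirical (alpha/2, 1 - alpha/2) quantiles.
    """
    rng = random.Random(seed)
    means = sorted(
        statistics.fmean(rng.choices(costs, k=len(costs)))
        for _ in range(n_resamples)
    )
    lo = means[int((alpha / 2) * n_resamples)]
    hi = means[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi
```

The percentile method is the simplest bootstrap interval; published analyses often use bias-corrected variants, and skewed cost data is one reason bootstrapping is preferred over normal-theory intervals here.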
Abstract:
Active learning approaches reduce the annotation cost required by traditional supervised approaches to reach the same effectiveness, by actively selecting informative instances during the learning phase. However, the effectiveness and robustness of the learnt models are influenced by a number of factors. In this paper we investigate the factors that affect the effectiveness, more specifically in terms of stability and robustness, of active learning models built using conditional random fields (CRFs) for information extraction applications. Stability, defined as a small variation in performance when small variations in the training data or in the parameters occur, is a major issue for machine learning models, but even more so in the active learning framework, which aims to minimise the amount of training data required. The factors we investigate are a) the choice of incremental vs. standard active learning, b) the feature set used as a representation of the text (i.e., morphological features, syntactic features, or semantic features) and c) the Gaussian prior variance as one of the important CRF parameters. Our empirical findings show that incremental learning and the Gaussian prior variance lead to more stable and robust models across iterations. Our study also demonstrates that orthographical, morphological and contextual features, as a group of basic features, play an important role in learning effective models across all iterations.
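The query-label-retrain cycle that active learning revolves around can be illustrated with a toy loop. This sketch substitutes a nearest-centroid classifier for the paper's CRF and uses least-confidence sampling as one common uncertainty criterion; every name here is hypothetical, and the example shows only the generic cycle, not the paper's experimental setup.

```python
import math

def train_centroids(labeled):
    """Fit a toy nearest-centroid 'model': mean feature vector per class."""
    sums, counts = {}, {}
    for x, y in labeled:
        sums.setdefault(y, [0.0] * len(x))
        counts[y] = counts.get(y, 0) + 1
        sums[y] = [s + v for s, v in zip(sums[y], x)]
    return {y: [s / counts[y] for s in sums[y]] for y in sums}

def predict_proba(centroids, x):
    """Class probabilities via a softmax over negative centroid distances."""
    exps = {y: math.exp(-math.dist(c, x)) for y, c in centroids.items()}
    z = sum(exps.values())
    return {y: e / z for y, e in exps.items()}

def least_confident(centroids, pool):
    """Index of the pool instance whose top predicted probability is lowest."""
    def top_p(x):
        return max(predict_proba(centroids, x).values())
    return min(range(len(pool)), key=lambda i: top_p(pool[i]))

def active_learning_loop(labeled, pool, oracle, rounds=3):
    """Incremental active learning: each round, retrain, query the most
    uncertain unlabeled instance, and add its oracle label to the
    training set (the pool is mutated in place)."""
    for _ in range(rounds):
        model = train_centroids(labeled)
        i = least_confident(model, pool)
        x = pool.pop(i)
        labeled.append((x, oracle(x)))
    return train_centroids(labeled)
```

In the incremental variant studied in the paper, the model is updated as each queried label arrives rather than retrained from scratch per batch; the loop structure is otherwise the same.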
Abstract:
BACKGROUND Prescribing is a complex task, requiring specific knowledge and skills combined with effective, context-specific clinical reasoning. Prescribing errors can result in significant morbidity and mortality. For all professions with prescribing rights, a clear need exists to ensure students graduate with a well-defined set of prescribing skills, which will contribute to competent prescribing. AIM To describe the methods employed to teach and assess the principles of effective prescribing across five non-medical professions at Queensland University of Technology. METHOD The NPS National Prescribing Competencies Framework (PCF) was used as the prescribing standard. A curriculum mapping exercise was undertaken to determine how well the PCF was addressed across the disciplines of paramedic science, pharmacy, podiatry, nurse practitioner and optometry. Identified gaps in teaching and/or assessment were noted. RESULTS Prescribing skills and knowledge are taught and assessed using a range of methods across disciplines. A multi-modal approach is employed by all disciplines. The Pharmacy discipline uses more tutorial sessions to teach prescribing principles and relies less on case studies and clinical appraisal to assess prescribing when compared to other disciplines. Within the pharmacy discipline approximately 90% of the PCF competencies are taught and assessed. This compares favourably with the other disciplines. CONCLUSION Further work is required to establish a practical, effective approach to the assessment of prescribing competence especially between the university and clinical settings. Effective and reliable assessment of prescribing undertaken by students in diverse settings remains challenging.
Abstract:
This paper offers a definition of elite media, arguing that their content focus will sufficiently meet the social responsibility needs of democracy. Its assumptions come from the Finkelstein and Leveson Inquiries and the regulatory British Royal Charter (2013). These provide guidelines on how media outlets meet ‘social responsibility’ standards, e.g. the press has a ‘responsibility to be fair and accurate’ (Finkelstein); an ethical press will feel a responsibility to ‘hold power to account’ (Leveson); news media ‘will be held strictly accountable’ (RC). The paper invokes the British principle of media opting in to observe standards, and so serve the democracy. It will give examples from existing media, and consider the social responsibility of media more generally. Obvious cases of ‘quality’ media include public broadcasters, e.g. BBC, Al-Jazeera, and the ‘quality’ press, e.g. NYT, Süddeutsche Zeitung, but also community broadcasters, specialised magazines, news agencies, distinctive web logs, and others. Where providing commentary, these abjure gratuitous opinion -- meeting a standard of reasoned, informational and fair. Funding is almost a definer, with many such services supported by the state, private trusts, public institutions or volunteering by staff. Literature supporting discussion on elite media will include their identity as primarily committed to a public good, e.g. the ‘Public Value Test’, Moe and Donders (2011); with reference also to recent literature on developing public service media. Within its limits, the paper will treat social media as participants among all media, including elite, and as a parallel dimension of mass communication founded on inter-activity. Elite media will fulfil the need for social responsibility, firstly by providing one space, a ‘plenary’, for debate. Second is the notion of building public recognition of elite media as trustworthy.
Third is the fact that elite media together are a large sector with resources to sustain social cohesion and debate; notwithstanding pressure on funds, and impacts of digital transformation undermining employment in media more than in most industries.
Abstract:
For people with cognitive disabilities, technology is more often thought of as a support mechanism, rather than a source of division that may require intervention to equalize access across the cognitive spectrum. This paper presents a first attempt at formalizing the digital gap created by the generalization of search engines. This was achieved through the development of a mapping of the cognitive abilities required by users to execute low-level tasks during a standard Web search task. The mapping demonstrates how critical these abilities are to successfully using search engines with an adequate level of independence. It will lead to a set of design guidelines for search engine interfaces that allow for the engagement of users of all abilities, and also, more importantly, for search algorithms such as query suggestion and measures of relevance (i.e. ranking).
Abstract:
A laboratory with fully developed and accredited quality systems is highly desirable. In some cases it is mandatory. This is evidenced by the concerns raised when certain medical testing laboratories have not maintained appropriate standards to retain their accreditation with NATA. Whether or not third party accreditation is mandated, good laboratory practice based on sustainable systems is required. The internationally accepted standard ISO/IEC 17025 provides specific guidelines for operating a laboratory that may be accredited. To what extent should a university laboratory implement this standard? Is NATA accreditation necessary? As staffing and funding reduce, is it possible to maintain the high-standard quality systems that are truly required if results are to be assured? Accreditation may be required where it is warranted, but is credit given by institutions for the resources required to achieve and maintain it? These issues are considered in relation to university and commercial laboratories, and specific case points are raised from experience with NATA accreditation in the School of Civil Engineering, QUT.