236 results for match
Abstract:
In this paper, we combine Pierre Bourdieu’s concept of hysteresis (the ‘fish out of water’ experience) with the discourse-historical approach to critical discourse analysis (CDA) as a theoretical and analytical framework through which we examine specific moments in the schooling experiences of one refugee student and one international student, both enrolled in post-compulsory education in Australian mainstream secondary schools. We examine specific moments – as narrated by these students during interviews – in which these students can be described as ‘fish out of water’. As such, this paper takes up the concerns of researchers who call for an examination of the lived geographies and the everyday lives of individual students in mainstream schools. We find that our students’ habitus, conditioned by their previous schooling experiences in their home countries, did not match their new Australian schools, resulting in frustration with, and alienation from, their mainstream schools. However, we also note that schools, too, need to adapt and adjust their habitus to the new multicultural world, in which there are international and refugee students among their usual cohort of mainstream students.
Abstract:
Optimisation of Organic Rankine Cycles (ORCs) for binary-cycle geothermal applications could play a major role in the competitiveness of low to moderate temperature geothermal resources. Part of this optimisation process is matching cycles to a given resource such that power output can be maximised. Two major and largely interrelated components of the cycle are the working fluid and the turbine. Both components need careful consideration. Due to the temperature differences in geothermal resources, a one-size-fits-all approach to surface power infrastructure is not appropriate. Furthermore, the traditional use of steam as a working fluid does not seem practical due to the low temperatures of many resources. A variety of organic fluids with low boiling points may be utilised as ORC working fluids in binary power cycle loops. Due to differences in thermodynamic properties, certain fluids are able to extract more heat from a given resource than others over certain temperature and pressure ranges. This enables the tailoring of power cycle infrastructure to best match the geothermal resource through careful selection of the working fluid and turbine design optimisation to yield the optimum overall cycle performance. This paper presents the rationale for the use of radial-inflow turbines for ORC applications and the preliminary design of several radial-inflow turbines based on a selection of promising ORC cycles using five different high-density working fluids: R134a, R143a, R236fa, R245fa and n-Pentane at sub- or trans-critical conditions. Numerous published studies compare a variety of working fluids for various ORC configurations. However, there is little information specifically pertaining to the design and implementation of ORCs using realistic radial turbine designs in terms of pressure ratios, inlet pressure, rotor size and rotational speed.
Preliminary 1D analysis leads to the generation of turbine designs for the various cycles with similar efficiencies (77%) but large differences in dimensions (139–289 mm rotor diameter). The highest performing cycle (R134a) was found to produce 33% more net power from a 150°C resource flowing at 10 kg/s than the lowest performing cycle (n-Pentane).
The backfilled GEI: a cross-capture modality gait feature for frontal and side-view gait recognition
Abstract:
In this paper, we propose a novel direction for gait recognition research by introducing a new capture-modality-independent, appearance-based feature which we call the Back-filled Gait Energy Image (BGEI). It can be constructed from both frontal depth images and the more commonly used side-view silhouettes, allowing the feature to be applied across these two differing capturing systems using the same enrolled database. To evaluate this new feature, a frontally captured depth-based gait dataset was created containing 37 unique subjects, a subset of which also contained sequences captured from the side. The results demonstrate that the BGEI can effectively be used to identify subjects through their gait across these two differing input devices, achieving a rank-1 match rate of 100% in our experiments. We also compare the BGEI against the GEI and GEV in their respective domains, using the CASIA dataset and our depth dataset, showing that it compares favourably against them. The experiments were performed using a sparse-representation-based classifier with a locally discriminating input feature space, which shows significant improvement in performance over other classifiers used in the gait recognition literature, achieving state-of-the-art results with the GEI on the CASIA dataset.
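The BGEI extends the standard Gait Energy Image (GEI), which is the pixel-wise average of aligned binary silhouettes over a gait cycle. A minimal sketch of that underlying averaging step (the back-filling of frontal depth data described in the paper is not reproduced here):

```python
import numpy as np

def gait_energy_image(silhouettes):
    """Pixel-wise mean of aligned binary silhouettes over a gait
    cycle -- the standard GEI computation. The BGEI's back-filling
    of frontal depth data is not sketched here."""
    stack = np.asarray(silhouettes, dtype=float)
    return stack.mean(axis=0)
```

Bright pixels in the resulting image mark regions the body occupies throughout the cycle, while mid-grey pixels capture limb motion.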
Abstract:
Exposure to air pollution during pregnancy is a potential cause of adverse birth outcomes such as preterm birth and stillbirth. The risk of exposure may be greater during vulnerable windows of the pregnancy which might only be weeks long. We demonstrate a method to find these windows based on smoothing the risk of weekly exposure using conditional autoregression. We use incidence density sampling to match cases with adverse birth outcomes to controls whose gestation lasted at least as long as the case. This matching means that cases and controls have equal-length exposure periods, rather than comparing, for example, cases with short gestations to controls with longer gestations. We demonstrate the ability of the method to find vulnerable windows using two simulation studies. We illustrate the method by examining the association between particulate matter air pollution and stillbirth in Brisbane, Australia.
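The matching rule described above — each case paired only with controls whose gestation lasted at least as long — can be sketched as follows (field names and structure are hypothetical; this illustrates the eligibility rule, not the study's code):

```python
import random

def incidence_density_controls(cases, candidates, n_controls=1, seed=0):
    """Incidence density sampling sketch: each case is matched only
    to candidate controls whose gestation lasted at least as long,
    so case and control exposure periods have equal length."""
    rng = random.Random(seed)
    matched = {}
    for case in cases:
        # Controls must have been "at risk" for at least as many
        # gestational weeks as the case.
        eligible = [c for c in candidates
                    if c["gestation_weeks"] >= case["gestation_weeks"]]
        matched[case["id"]] = rng.sample(eligible,
                                         min(n_controls, len(eligible)))
    return matched
```

Restricting eligibility this way is what guarantees that weekly exposure histories of a case and its controls cover the same gestational weeks.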
Abstract:
Platy grains of cubic Bi2O3, α-Bi2O3, and Bi2O2.75 nanograins were associated with chondritic porous interplanetary dust particles W7029C1, W7029E5, and 2011C2 that were collected in the stratosphere at 17–19 km altitude. Similar Bi oxide nanograins were present in the upper stratosphere during May 1985. These grains are linked to the plumes of several major volcanic eruptions during the early 1980s that injected material into the stratosphere. The mass of sulfur from these eruptions is a proxy for the mass of stratospheric Bi, from which we derive the particle number densities (particles m−3) for "average Bi2O3 nanograins" due to this volcanic activity and those necessary to contaminate the extraterrestrial chondritic porous interplanetary dust particles via collisional sticking. The match between both values supports the idea that Bi2O3 nanograins of volcanic origin could contaminate interplanetary dust particles in the Earth's stratosphere. Copyright 1997 by the American Geophysical Union.
Abstract:
Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the “gold standard” for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that could be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission on a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms.
Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide a valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
Abstract:
With the explosion of Web 2.0 applications such as blogs, social and professional networks, and various other types of social media, the rich online information and various new sources of knowledge flood users and hence pose a great challenge in terms of information overload. It is critical to use intelligent agent software systems to assist users in finding the right information from an abundance of Web data. Recommender systems can help users deal with the information overload problem efficiently by suggesting items (e.g., information and products) that match users’ personal interests. Recommender technology has been successfully employed in many applications such as recommending films, music, books, etc. The purpose of this report is to give an overview of existing technologies for building personalized recommender systems in social networking environments, and to propose a research direction for addressing the user profiling and cold-start problems by exploiting user-generated content newly available in Web 2.0.
Abstract:
The focus of knowledge management (KM) in the construction industry is moving towards capability building for value creation. The study reported by this paper is motivated by recent assertions about the genesis and evolution of knowledge management capability (KMC) in the strategic management field. It attempts to shed light on the governance of learning mechanisms that develop KMC within the context of construction firms. A questionnaire survey was administered to a sample of construction contractors operating in the very dynamic Hong Kong market to elicit opinions on the learning mechanisms and business outcomes of targeted firms. On the basis of a total of 149 usable responses, structural equation modeling (SEM) analysis identified relationships among knowledge-governance mechanisms, knowledge processes, and business performance, thereby supporting the existence of strategic learning loops. The study findings provide evidence from the construction context for capability assertions that knowledge-governance mechanisms and processes form learning mechanisms that carry out strategic learning to create value, effect performance outcomes, and ultimately drive the evolution of KMC. The findings imply that it is feasible for those managing construction firms to govern learning mechanisms through managing the capability-based holistic KM system, thereby reconfiguring KMC to match needs in the dynamic market environment over time.
Abstract:
A case study based on the experiences of (at the time of writing) Brisbane-based start-up SnowSports Interactive and their plans for global expansion. This case study questions whether SnowSports Interactive is ready for global expansion, and if so, which country should be its primary target? Once a country has been chosen, how should SnowSports approach and enter the market? This case study prompts business students (in particular international business students) to consider a company's readiness to enter a global market, utilising evaluation tools from a wide range of disciplines – product, human resources, capital, business strategy. Furthermore, students are asked to match SnowSports' unique characteristics with a country and an entry strategy. The ability to answer the questions posed in this case study will demonstrate a high-level understanding of entrepreneurship and innovation, international business strategy, and cultural awareness, and demonstrate ability in theoretical and framework application.
Abstract:
PURPOSE: This study examined the effects of overnight sleep deprivation on recovery following competitive rugby league matches. METHODS: Eleven male, amateur rugby league players performed two competitive matches, followed by either a normal night's sleep (~8h; CONT) or a sleep-deprived night (~0h; SDEP) in a randomised fashion. Testing was conducted on the morning of the match, immediately post-match, 2h post-match, and the next morning (16h post-match). Measures included counter-movement jump (CMJ) distance, knee extensor maximal voluntary contraction (MVC), voluntary activation (VA), venous blood creatine kinase (CK) and C-reactive protein (CRP), perceived muscle soreness and a word-colour recognition cognitive function test. Percent change between post- and 16h post-match was reported to determine the effect of the intervention the next morning. RESULTS: Large effects indicated a greater post- to 16h post-match percentage decline in CMJ distance following SDEP compared to CONT (P=0.10-0.16; d=0.95-1.05). Similarly, the percentage decline in incongruent word-colour reaction times was increased in SDEP trials (P=0.007; d=1.75). Measures of MVC did not differ between conditions (P=0.40-0.75; d=0.13-0.33), though trends for a larger percentage decline in VA were detected in SDEP (P=0.19; d=0.84). Further, large effects indicated higher CK and CRP responses 16h post-match during SDEP compared to CONT (P=0.11-0.87; d=0.80-0.88). CONCLUSIONS: Sleep deprivation negatively affected recovery following a rugby league match, specifically impairing CMJ distance and cognitive function. Practitioners should promote adequate post-match sleep patterns or adjust training demands the next day to accommodate the altered physical and cognitive state following sleep deprivation.
Abstract:
In order to comprehend user information needs by concepts, this paper introduces a novel method to match relevance features with ontological concepts. The method first discovers relevance features from user local instances. Then, a concept matching approach is developed for matching these features to accurate concepts in a global knowledge base. This approach is significant for the transition from informative descriptors to conceptual descriptors. The proposed method is elaborately evaluated by comparing it against three information gathering baseline models. The experimental results show that the matching approach is successful and achieves a series of remarkable improvements in search effectiveness.
Abstract:
Data structures such as k-D trees and hierarchical k-means trees perform very well in approximate k nearest neighbour matching, but are only marginally more effective than linear search when performing exact matching in high-dimensional image descriptor data. This paper presents several improvements to linear search that allow it to outperform existing methods, and recommends two approaches to exact matching. The first method reduces the number of operations by evaluating the distance measure in order of significance of the query dimensions and terminating when the partial distance exceeds the search threshold. This method does not require preprocessing and significantly outperforms existing methods. The second method improves query speed further by presorting the data using a data structure called d-D sort. The order information is used as a priority queue to reduce the time taken to find the exact match and to restrict the range of data searched. Construction of the d-D sort structure is very simple to implement, does not require any parameter tuning, and requires significantly less time than the best-performing tree structure, and data can be added to the structure relatively efficiently.
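The first method can be sketched as a linear scan that accumulates the squared distance dimension by dimension and abandons a candidate once the partial sum exceeds the best distance found so far. This is an illustrative sketch, not the paper's implementation: here the query's own magnitudes stand in for the paper's significance ordering, and the d-D sort presorting step is not reproduced.

```python
import numpy as np

def partial_distance_search(query, data, dim_order=None):
    """Exact 1-NN linear search with early termination: accumulate
    squared differences dimension-by-dimension and abandon a
    candidate once the partial distance exceeds the best distance
    found so far."""
    if dim_order is None:
        # Visit dimensions with the largest query magnitudes first,
        # so big distance contributions accumulate early (a simple
        # stand-in for the paper's significance ordering).
        dim_order = np.argsort(-np.abs(query))
    best_idx, best_dist = -1, np.inf
    for i, point in enumerate(data):
        partial = 0.0
        for d in dim_order:
            diff = query[d] - point[d]
            partial += diff * diff
            if partial >= best_dist:   # early termination
                break
        else:
            # Loop finished without exceeding the threshold, so this
            # candidate is the new best match.
            best_idx, best_dist = i, partial
    return best_idx, best_dist
```

The result is identical to a brute-force scan; only the number of per-dimension operations changes, which is where the reported speed-up comes from.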
Abstract:
The overarching goal of this project is to better match funding strategies to industry needs to maximise the benefits of R&D to Australia’s infrastructure and building industry. Project partners are: Queensland Department of Public Works; Queensland Transport and Main Roads; Western Australian Department of Treasury and Finance; John Holland; Queensland University of Technology; Swinburne University of Technology; and VTT Technical Research Centre of Finland (Prof Göran Roos). This project has been endorsed by the Australian Built Environment Industry Innovation Council (BEIIC) with Council member Prof Catherin Bull serving on this project’s Steering Committee. This project seeks to: (i) maximise the value of R&D investment in this sector through improved understanding of future industry research needs; and (ii) address the perceived problem of a disproportionately low R&D investment in this sector, relative to the size and national importance of the sector. This research will develop new theory built on open innovation, dynamic capabilities and absorptive capacity theories in the context of strategic foresighting and roadmapping activities. Four project phases have been designed to address this research: 1: Audit and analysis of R&D investment in the Australian built environment since 1990 - access publicly available data relating to R&D investments across Australia from public and private organisations to understand past trends. 2: Examine diffusion mechanisms of research and innovation and its impact on public and private organisations – investigate specific R&D investments to determine the process of realising research support, direction-setting, project engagement, impacts and pathways to adoption. 3: Develop a strategic roadmap for the future of this critical Australian industry - assess the likely future landscapes that R&D investment will both respond to and anticipate.
4: Develop policy to maximise the value of R&D investments to public and private organisations – through translating project learnings into policy guidelines.
Abstract:
This thesis examines the social practice of homework. It explores how homework is shaped by the discourses, policies and guidelines in circulation in a society at any given time with particular reference to one school district in the province of Newfoundland and Labrador, Canada. This study investigates how contemporary homework reconstitutes the home as a pedagogical site where the power of the institution of schooling circulates regularly from school to home. It examines how the educational system shapes the organization of family life and how family experiences with homework may be different in different sites depending on the accessibility of various forms of cultural capital. This study employs a qualitative approach, incorporating multiple case studies, and is complemented by insights from institutional ethnography and critical discourse analysis. It draws on the theoretical concepts of Foucault including power and power relations, and governmentality and surveillance, as well as Bourdieu’s concepts of economic, social and cultural capital for analysis. It employs concepts from Bourdieu’s work as they have been expanded on by researchers including Reay (1998), Lareau (2000), and Griffith and Smith (2005). The studies of these researchers allowed for an examination of homework as it related to families and mothers’ work. Smith’s (1987; 1999) concepts of ruling relations, mothers’ unpaid labour, and the engine of inequality were also employed in the analysis. Family interviews with ten volunteer families, teacher focus group sessions with 15 teachers from six schools, homework artefacts, school newsletters, homework brochures, and publicly available assessment and evaluation policy documents from one school district were analyzed. From this analysis key themes emerged and the findings are documented throughout five data analysis chapters. This study shows a change in education in response to a system shaped by standards, accountability and testing. 
It documents an increased transference of educational responsibility from one educational stakeholder to another. This transference of responsibility shifts downward until it eventually reaches the family in the form of homework and educational activities. Texts in the form of brochures and newsletters, sent home from school, make available to parents specific subject positions that act as instruments of normalization. These subject positions promote a particular ‘ideal’ family that has access to certain types of cultural capital needed to meet the school’s expectations. However, the study shows that these resources are not equally available to all and some families struggle to obtain what is necessary to complete educational activities in the home. The increase in transference of educational work from the school to the home results in greater work for parents, particularly mothers. As well, consideration is given to mothers’ role in homework and how, in turn, classroom instructional practices are sometimes dependent on the work completed at home with differential effects for children. This study confirms previous findings that it is mothers who assume the greatest role in the educational trajectory of their children. An important finding in this research is that it is not only middle-class mothers who dedicate extensive time working hard to ensure their children’s educational success; working-class mothers also make substantial contributions of time and resources to their children’s education. The assignments and educational activities distributed as homework require parents’ knowledge of technical school pedagogy to help their children. Much of the homework being sent home from schools is in the area of literacy, particularly reading, but requires parents to do more than read with children. A key finding is that the practices of parents are changing and being reconfigured by the expectations of schools in regard to reading.
Parents are now being required to monitor and supervise children’s reading, as well as help children complete reading logs, written reading responses, and follow up questions. The reality of family life as discussed by the participants in this study does not match the ‘ideal’ as portrayed in the educational documents. Homework sessions often create frustrations and tensions between parents and children. Some of the greatest struggles for families were created by mathematical homework, homework for those enrolled in the French Immersion program, and the work required to complete Literature, Heritage and Science Fair projects. Even when institutionalized and objectified capital was readily available, many families still encountered struggles when trying to carry out the assigned educational tasks. This thesis argues that homework and education-related activities play out differently in different homes. Consideration of this significance may assist educators to better understand and appreciate the vast difference in families and the ways in which each family can contribute to their children’s educational trajectory.
Abstract:
In this paper we present a new simulation methodology in order to obtain exact or approximate Bayesian inference for models for low-valued count time series data that have computationally demanding likelihood functions. The algorithm fits within the framework of particle Markov chain Monte Carlo (PMCMC) methods. The particle filter requires only model simulations and, in this regard, our approach has connections with approximate Bayesian computation (ABC). However, an advantage of using the PMCMC approach in this setting is that simulated data can be matched with data observed one-at-a-time, rather than attempting to match on the full dataset simultaneously or on a low-dimensional non-sufficient summary statistic, which is common practice in ABC. For low-valued count time series data we find that it is often computationally feasible to match simulated data with observed data exactly. Our particle filter maintains $N$ particles by repeating the simulation until $N+1$ exact matches are obtained. Our algorithm creates an unbiased estimate of the likelihood, resulting in exact posterior inferences when included in an MCMC algorithm. In cases where exact matching is computationally prohibitive, a tolerance is introduced as per ABC. A novel aspect of our approach is that we introduce auxiliary variables into our particle filter so that partially observed and/or non-Markovian models can be accommodated. We demonstrate that Bayesian model choice problems can be easily handled in this framework.
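The "repeat the simulation until $N+1$ exact matches" step can be read as inverse binomial sampling of the per-observation match probability, for which $N/(\text{total draws}-1)$ is the classical unbiased estimator. A toy sketch under that reading (this is not the paper's particle filter, which embeds the idea within PMCMC with auxiliary variables):

```python
import random

def match_probability_estimate(simulate, observed, N, rng=None):
    """Repeat the model simulation until N+1 draws exactly match the
    observed value, then return the inverse-binomial estimator
    N / (total draws - 1) of p = P(draw == observed). Sketch of the
    exact-matching idea only; the full particle filter and MCMC
    wrapper from the paper are not reproduced."""
    rng = rng or random.Random(0)
    matches, draws = 0, 0
    while matches < N + 1:
        draws += 1
        if simulate(rng) == observed:
            matches += 1
    return N / (draws - 1)
```

Because the estimator is unbiased, plugging such estimates into a pseudo-marginal MCMC scheme targets the exact posterior; when exact matching is too expensive, the equality test would be relaxed to a tolerance, as the abstract notes.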