787 results for Gradient-based approaches
Abstract:
A review of the theoretical principles and practice of school-based approaches to critical literacy, with a specific focus on the middle years of schooling.
Abstract:
This paper is an essay on the state of Australian education that frames new directions for educational research. It outlines three challenges faced by Australian educators: highly spatialised poverty with particularly strong mediating effects on primary school education; the need for intellectual and critical depth in pedagogy, with a focus on the upper primary and middle years; and the need to reinvent senior schooling to address emergent pathways from school to work and civic life. It offers a narrative description of the dynamics of policy making in Australia and North America and argues for an evidence-based approach to social and educational policy – but one quite unlike current test- and market-based approaches. Instead, it argues for a multidisciplinary approach to a broad range of empirical and case-based evidence that subjects this evidence to critical, hermeneutic social sciences. Such an approach would join educational policy with educational research, and broader social, community and governmental action, with the aim of reorganising and redistributing material, cultural and social resources.
Abstract:
Recent initiatives in values education in Australia emphasise the importance of the process of valuing and of general methodologies that foster this in the classroom. Although a range of strategies is available, this chapter argues that inquiry-based approaches in the Social Sciences play a significant role in linking valuing processes with decision-making skills. Collectively, these approaches prompt the development of reasoning and self-awareness, which also impacts on student wellness. This chapter proposes some curriculum approaches to foreground values education in the Australian Social Sciences classroom. It argues that valuing is an important life skill that can be cultivated in the classroom through specific valuing strategies. It contends that the development of the capacity to make informed value choices is a critical factor in promoting wellness and resilience in students and in preparing them for the decision-making skills required for effective participation in society.
Abstract:
Decentralised sensor networks typically consist of multiple processing nodes supporting one or more sensors. These nodes are interconnected via wireless communication. Practical applications of Decentralised Data Fusion have generally been restricted to Gaussian-based approaches such as the Kalman or Information Filter. This paper proposes the use of Parzen window estimates as an alternative representation for performing Decentralised Data Fusion. The common information between two nodes must be removed from any received estimates before local data fusion may occur; otherwise, estimates may become overconfident due to data incest. A closed-form approximation to the division of two estimates is described to enable conservative assimilation of incoming information at a node in a decentralised data fusion network. A simple example of tracking a moving particle with Parzen density estimates demonstrates how this algorithm allows conservative assimilation of network information.
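As a rough illustration of the representation referred to above (and not of the paper's own algorithm), the following sketch shows a Parzen window density estimate with isotropic Gaussian kernels; the bandwidth, dimensionality and toy observations are assumptions for the example only, and the closed-form division of estimates and the network protocol are not reproduced.

```python
import numpy as np

def parzen_estimate(samples, query_points, bandwidth=0.5):
    """Parzen window (kernel density) estimate with an isotropic Gaussian kernel.

    samples:      (n, d) array of observed points
    query_points: (m, d) array of locations at which to evaluate the density
    bandwidth:    kernel width h (an assumed smoothing parameter)
    """
    samples = np.atleast_2d(samples)
    query_points = np.atleast_2d(query_points)
    n, d = samples.shape
    norm = (2.0 * np.pi * bandwidth**2) ** (d / 2.0)
    # Pairwise squared distances between query points and samples
    diff = query_points[:, None, :] - samples[None, :, :]
    sq_dist = np.sum(diff**2, axis=-1)
    # Average of Gaussian kernels centred on each sample
    return np.exp(-0.5 * sq_dist / bandwidth**2).sum(axis=1) / (n * norm)

# Example: density of a particle's observed 2D positions evaluated at the origin
observations = np.random.randn(200, 2)              # hypothetical sensor observations
print(parzen_estimate(observations, [[0.0, 0.0]]))  # estimated density near the origin
```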
Abstract:
The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome is one of the most commonly reported eye health problems. This syndrome is caused by abnormalities in the properties of the tear film. Current clinical tools to assess the tear film properties have shown certain limitations. The traditional invasive methods for the assessment of tear film quality, which are used by most clinicians, have been criticized for their lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment has been investigated, but these also present limitations. Hence, no “gold standard” test is currently available to assess tear film integrity. Therefore, improving techniques for the assessment of tear film quality is of clinical significance and is the main motivation for the work described in this thesis. In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The reflection of the light is produced in the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth, the reflected image presents a well-structured pattern. In contrast, when the tear film surface presents irregularities, the pattern also becomes irregular due to the scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for the evaluation of all the dynamic phases of the tear film. However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing the tear film dynamics. A set of novel routines has been purposely developed to quantify the changes in the reflected pattern and to extract a time-series estimate of the TFSQ from the video recording. The routine extracts a maximized area of analysis from each frame of the video recording. In this area a metric of the TFSQ is calculated. Initially, two metrics, based on the Gabor filter and on Gaussian gradient-based techniques, were used to quantify the consistency of the pattern’s local orientation as a measure of TFSQ. These metrics have helped to demonstrate the applicability of HSV to assess the tear film, and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval in contact lens wear. It was also able to clearly show a difference between bare eye and contact lens wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on the TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these non-invasive techniques, the HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation.
The LSI, in turn, appeared to be the most sensitive method for analyzing the tear build-up time (TBUT). The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated. Receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome. The LSI technique gave the best results under both natural and suppressed blinking conditions, closely followed by HSV. The DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique, identified during this clinical study, was its lack of sensitivity to quantify the build-up/formation phase of the tear film cycle. For that reason, an additional metric based on image transformation and block processing was proposed. In this metric, the area of analysis was transformed from Cartesian to polar coordinates, converting the concentric-circle pattern into a quasi-straight-line image from which a block-statistics value was extracted. This metric showed better sensitivity under low pattern disturbance and improved the performance of the ROC curves. Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully understand the HSV measurement and the instrument’s potential limitations. Of special interest was the assessment of the instrument’s sensitivity to subtle topographic changes. The theoretical simulations helped to provide some understanding of the tear film dynamics; for instance, the model extracted for the build-up phase provided some insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model the time series as well as to extract the key clinical parameters (i.e., timing). Unfortunately, those techniques for modeling the tear film time series do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria. Special attention was given to a commonly used fit, the polynomial function, and to considerations for selecting the appropriate model order so that the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of using high-speed videokeratoscopy to assess tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV has shown good performance in a broad range of conditions (i.e., contact lens, normal and dry eye subjects). As a result, this technique could be a useful clinical tool to assess tear film surface quality in the future.
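As a rough sketch of the Cartesian-to-polar transformation and block-processing idea mentioned above (not the thesis's exact metric), the following example unwraps a square area of analysis so that concentric Placido rings become quasi-straight lines and then extracts a simple block statistic; the block size, chosen statistic and synthetic ring image are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def unwrap_to_polar(image, num_radii=128, num_angles=360):
    """Resample a square region of analysis onto a (radius, angle) grid,
    so concentric rings become quasi-straight horizontal lines."""
    h, w = image.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    max_r = min(cy, cx)
    radii = np.linspace(0, max_r, num_radii)
    angles = np.linspace(0, 2 * np.pi, num_angles, endpoint=False)
    rr, aa = np.meshgrid(radii, angles, indexing="ij")
    rows = cy + rr * np.sin(aa)
    cols = cx + rr * np.cos(aa)
    return map_coordinates(image, [rows, cols], order=1)

def block_statistic(polar_image, block=(16, 30)):
    """Illustrative block statistic: mean of per-block standard deviations.
    Higher values indicate a more disturbed (irregular) ring pattern."""
    br, bc = block
    h, w = polar_image.shape
    h, w = h - h % br, w - w % bc          # trim to a whole number of blocks
    blocks = polar_image[:h, :w].reshape(h // br, br, w // bc, bc)
    return blocks.std(axis=(1, 3)).mean()

# Example on a synthetic concentric-ring image (a stand-in for a Placido pattern)
y, x = np.mgrid[-64:64, -64:64]
rings = np.sin(0.5 * np.hypot(y, x))
print(block_statistic(unwrap_to_polar(rings)))
```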
Abstract:
In vector space based approaches to natural language processing, similarity is commonly measured by taking the angle between two vectors representing words or documents in a semantic space. This is natural from a mathematical point of view, as the angle between unit vectors is, up to constant scaling, the only unitarily invariant metric on the unit sphere. However, similarity judgement tasks reveal that human subjects fail to produce data which satisfies the symmetry and triangle inequality requirements for a metric space. A possible conclusion, reached in particular by Tversky et al., is that some of the most basic assumptions of geometric models are unwarranted in the case of psychological similarity, a result which would impose strong limits on the validity and applicability of vector space based (and hence also quantum inspired) approaches to the modelling of cognitive processes. This paper proposes a resolution to this fundamental criticism of the applicability of vector space models of cognition. We argue that pairs of words imply a context which in turn induces a point of view, allowing a subject to estimate semantic similarity. Context is here introduced as a point of view vector (POVV) and the expected similarity is derived as a measure over the POVVs. Different pairs of words will invoke different contexts and different POVVs. Hence the triangle inequality ceases to be a valid constraint on the angles. We test the proposal on a few triples of words and outline further research.
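To make the angular similarity measure discussed above concrete, the sketch below computes angles between a few hypothetical word vectors and checks the triangle inequality that any single fixed space necessarily satisfies (the constraint that, on the paper's account, context-dependent POVVs relax); the vectors themselves are invented for illustration.

```python
import numpy as np

def angle(u, v):
    """Angular distance between two vectors: the metric discussed above."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Hypothetical 3-dimensional semantic vectors for a triple of words
vecs = {"bat": [0.9, 0.1, 0.4], "ball": [0.8, 0.5, 0.1], "cave": [0.1, 0.2, 0.9]}

d_ab = angle(vecs["bat"], vecs["ball"])
d_bc = angle(vecs["ball"], vecs["cave"])
d_ac = angle(vecs["bat"], vecs["cave"])
# In a single fixed space the triangle inequality must hold:
print(d_ac <= d_ab + d_bc)   # True for any fixed vectors
```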
Abstract:
Many data mining techniques have been proposed for mining useful patterns in text documents. However, how to effectively use and update discovered patterns is still an open research issue, especially in the domain of text mining. Since most existing text mining methods have adopted term-based approaches, they all suffer from the problems of polysemy and synonymy. Over the years, people have often held the hypothesis that pattern (or phrase) based approaches should perform better than term-based ones, but many experiments have not supported this hypothesis. This paper presents an innovative technique, effective pattern discovery, which includes the processes of pattern deploying and pattern evolving, to improve the effectiveness of using and updating discovered patterns for finding relevant and interesting information. Substantial experiments on the RCV1 data collection and TREC topics demonstrate that the proposed solution achieves encouraging performance.
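As a rough illustration of the difference between term-based and pattern-based representations (not of the paper's pattern deploying and evolving procedures), the sketch below mines co-occurring term sets within documents as simple higher-level patterns; the toy documents, support threshold and maximum pattern size are assumptions.

```python
from collections import Counter
from itertools import combinations

def frequent_term_patterns(documents, min_support=2, max_size=3):
    """Count term sets that co-occur within a document; sets with
    support >= min_support act as higher-level features than single terms."""
    counts = Counter()
    for doc in documents:
        terms = sorted(set(doc.lower().split()))
        for size in range(2, max_size + 1):
            counts.update(combinations(terms, size))
    return {pattern: c for pattern, c in counts.items() if c >= min_support}

docs = [
    "mining useful patterns in text documents",
    "pattern mining for text documents",
    "term based text mining methods",
]
print(frequent_term_patterns(docs))
```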
Abstract:
It is a big challenge to guarantee the quality of discovered relevance features in text documents for describing user preferences, because of the large number of terms, patterns, and noise. Most existing popular text mining and classification methods have adopted term-based approaches. However, they have all suffered from the problems of polysemy and synonymy. Over the years, people have often held the hypothesis that pattern-based methods should perform better than term-based ones in describing user preferences, but many experiments do not support this hypothesis. The innovative technique presented in this paper makes a breakthrough in addressing this difficulty. This technique discovers both positive and negative patterns in text documents as higher-level features in order to accurately weight low-level features (terms) based on their specificity and their distributions in the higher-level features. Substantial experiments using this technique on Reuters Corpus Volume 1 and TREC topics show that the proposed approach significantly outperforms both state-of-the-art term-based methods underpinned by Okapi BM25, Rocchio or Support Vector Machine, and pattern-based methods, on precision, recall and F measures.
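The Okapi BM25 term-based baseline named above can be sketched as follows; k1 and b are the usual free parameters, the IDF uses a common smoothed variant, and the toy corpus is hypothetical rather than drawn from Reuters Corpus Volume 1.

```python
import math
from collections import Counter

def bm25_score(query_terms, doc_terms, corpus, k1=1.2, b=0.75):
    """Okapi BM25 relevance score of a document for a query (term-based baseline)."""
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N
    tf = Counter(doc_terms)
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)           # document frequency
        idf = math.log((N - df + 0.5) / (df + 0.5) + 1.0)  # smoothed IDF variant
        denom = tf[term] + k1 * (1 - b + b * len(doc_terms) / avgdl)
        score += idf * tf[term] * (k1 + 1) / denom
    return score

corpus = [d.split() for d in [
    "effective pattern discovery in text",
    "term based text classification",
    "support vector machine classifier",
]]
print(bm25_score("text classification".split(), corpus[1], corpus))
```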
Abstract:
Background: Chronic heart failure (CHF) is associated with high hospitalisation and mortality rates and debilitating symptoms. In an effort to reduce hospitalisations and improve symptoms, individuals must be supported in managing their condition. Patients who can effectively self-manage their symptoms through lifestyle modification and adherence to complex medication regimens will experience fewer hospitalisations and other adverse events. Aim: The purpose of this paper is to explain how providing evidence-based information, using patient education resources, can support self-care. Discussion: Self-care relates to the activities that individuals engage in as part of health-seeking behaviours. Supporting self-care practices through tailored and relevant information can provide patients with resources and advice on strategies to manage their condition. Evidence-based approaches to improve adherence to self-care practices in patients with heart failure are not often reported. Low health literacy can result in poor understanding of information about CHF and is related to adverse health outcomes. Also, a lack of knowledge can lead to non-adherence with self-care practices such as following fluid restriction, a low-sodium diet and daily weighing routines. However, these issues need to be addressed to improve self-management skills. Outcome: Recently, the Heart Foundation CHF consumer resource was updated based on evidence-based national clinical guidelines. The aim of this resource is to help consumers improve their understanding of the disease, reduce uncertainty and anxiety about what to do when symptoms appear, encourage discussions with local doctors, and build confidence in self-care management. Conclusion: Evidence-based CHF patient education resources promote self-care practices and early detection of symptom change, which may reduce hospitalisations and improve the quality of life for people with CHF.
Abstract:
Firstly, the authors would like to thank the editor for the opportunity to respond to Dr Al-Azri’s and Dr Al-Maniri’s letter. Secondly, while the current authors also accept that deterrence-based approaches should act as only one cornerstone of a suite of interventions and public policy initiatives designed to improve road safety, deterrence-based approaches have nonetheless consistently proven to be a valuable resource for improving road safety. Dr Al-Azri and Dr Al-Maniri reinforce their assertion about the limited utility of deterrence by citing drink driving research, and the issue of drink driving is particularly relevant within the current context given that the problem of driving after drinking has historically been addressed through deterrence-based approaches. While the effectiveness of deterrence-based approaches to reduce drink driving will always depend upon a range of situational and contextual factors (including police enforcement practices, cultural norms, etc.), the utilisation of this approach has proven particularly effective within Queensland, Australia. For example, a relatively recent comprehensive review of Random Breath Testing in Queensland demonstrated that this initiative not only had a deterrent impact upon self-reported intentions to drink and drive, but was also found to have significantly reduced alcohol-related fatalities in the state. However, the authors agree that deterrence-based approaches can be particularly transient and thus require constant “topping up”, not least through sustained public reinforcement, as was clearly articulated in the seminal work by Homel.
Abstract:
Most web service discovery systems use keyword-based search algorithms and, although partially successful, sometimes fail to satisfy some users' information needs. This has given rise to several semantics-based approaches that look to go beyond simple attribute matching and try to capture the semantics of services. However, the results reported in the literature vary and in many cases are worse than the results obtained by keyword-based systems. We believe the accuracy of the mechanisms used to extract tokens from the non-natural-language sections of WSDL files directly affects the performance of these techniques, because some of them can be more sensitive to noise. In this paper, three existing tokenization algorithms are evaluated and a new algorithm that outperforms all the algorithms found in the literature is introduced.
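As an illustration of the kind of tokenization being evaluated (not the paper's new algorithm or the three algorithms it compares), the sketch below splits a WSDL-style operation name on underscores, digits and camel-case boundaries; the example identifier is invented.

```python
import re

def tokenize_identifier(identifier):
    """Split a WSDL-style identifier (e.g. an operation name) into lower-case tokens,
    breaking on underscores, hyphens, digits and camel-case boundaries."""
    # Insert spaces at lower-to-upper and acronym boundaries, then split on non-letters
    spaced = re.sub(r"(?<=[a-z0-9])(?=[A-Z])|(?<=[A-Z])(?=[A-Z][a-z])", " ", identifier)
    return [t.lower() for t in re.split(r"[^A-Za-z]+", spaced) if t]

print(tokenize_identifier("getCustomerOrderByID_v2"))
# -> ['get', 'customer', 'order', 'by', 'id', 'v']
```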
Abstract:
This paper presents a method of spatial sampling based on stratification by Local Moran’s I_i calculated using auxiliary information. The sampling technique is compared to other design-based approaches, including simple random sampling, systematic sampling on a regular grid, conditional Latin Hypercube sampling and stratified sampling based on auxiliary information, and is illustrated using two different spatial data sets. Each of the samples for the two data sets is interpolated using regression kriging to form a geostatistical map for its respective area. The proposed technique is shown to be competitive in reproducing specific areas of interest with high accuracy.
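The stratification variable named above, Local Moran's I_i, can be computed as in the following sketch, assuming a row-standardised spatial weights matrix with a zero diagonal and the 1/n variance convention; the stratification and sampling steps themselves, and the regression kriging, are not shown.

```python
import numpy as np

def local_morans_i(values, weights):
    """Local Moran's I_i for each observation.

    values:  (n,) array of the auxiliary variable
    weights: (n, n) spatial weights matrix (row-standardised, zero diagonal assumed)
    """
    x = np.asarray(values, float)
    w = np.asarray(weights, float)
    z = x - x.mean()
    m2 = (z**2).sum() / len(x)          # variance term (1/n convention assumed)
    return (z / m2) * (w @ z)

# Example: four locations on a line, each neighbouring the adjacent ones
x = np.array([1.0, 2.0, 8.0, 9.0])
w = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0, 0.0]])
print(local_morans_i(x, w))             # positive values indicate local clustering
```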
Abstract:
Australian efforts to provide orthopaedic surgeons with living, load-bearing scaffolds suitable for current joint (knee and hip) replacement surgery, non-union fracture repair, and meniscal and growth plate cartilage regeneration are being led by teams at the Institute for Medical and Veterinary Science and Women's and Children's Hospital in Adelaide; the Peter MacCallum and St Vincent's Medical Research Institutes in Melbourne; and the Mater Medical Research Institute and the new Institute for Health and Biomedical Innovation at QUT, Brisbane. In each case, multidisciplinary teams are attempting to develop autologous living tissue constructs, utilising mesenchymal stem cells (MSC), with the intention of effecting seamless repair and regeneration of skeletal trauma and defects. In this article we briefly review current knowledge of the phenotypic properties of MSC and discuss the potential therapeutic applications of these cells, as exemplified by their use in cartilage repair and tissue engineering based approaches to the treatment of skeletal defects.
Abstract:
This chapter reports on a project in which university researchers’ expertise in architecture, literacy and communications enabled two teachers in one school to expand the forms of literacy that primary school children engaged in. Starting from the school community’s concerns about an urban renewal project in their neighbourhood, participants collaborated to develop a curriculum of spatial literacies with real-world goals and outcomes. We describe how the creative re-design of curriculum and pedagogy by classroom teachers, in collaboration with university academics and students, allowed students aged 8 to 12 years to appropriate semiotic resources from their local neighbourhood, home communities, and popular culture to make a difference to their material surrounds. We argue that there are productive possibilities for educators who integrate critical and place-based approaches to the design and teaching of the literacy curriculum with work in other learning areas such as society and environment, technology and design and the arts. The student production of expansive and socially significant texts enabled by such approaches may be especially necessary in contemporary neoconservative policy contexts that tend to limit and constrain what is possible in schools.
Abstract:
Gait recognition approaches continue to struggle with challenges including view-invariance, low-resolution data, robustness to unconstrained environments, and fluctuating gait patterns due to subjects carrying goods or wearing different clothes. Although computationally expensive, model-based techniques offer promise over appearance-based techniques for these challenges, as they gather gait features and interpret gait dynamics in skeleton form. In this paper, we propose a fast 3D ellipsoid-based gait recognition algorithm using a 3D voxel model derived from multi-view silhouette images. This approach directly addresses the limitations of view dependency and self-occlusion in existing ellipse-fitting model-based approaches. Voxel models are segmented into four components (left and right legs, above and below the knee), and ellipsoids are fitted to each region using eigenvalue decomposition. Features derived from the ellipsoid parameters are modeled using a Fourier representation to retain the temporal dynamic pattern for classification. We demonstrate the proposed approach using the CMU MoBo database and show that an improvement of 15-20% can be achieved over a 2D ellipse-fitting baseline.
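The ellipsoid-fitting step described above can be sketched as a principal-axis (eigenvalue) decomposition of the voxel coordinates in each segment; the voxel segmentation, the Fourier representation of the features and the classifier are not reproduced, and the synthetic point cloud below is a stand-in for a real segment.

```python
import numpy as np

def fit_ellipsoid(voxels):
    """Fit an ellipsoid to a segment of a 3D voxel model via eigenvalue decomposition.

    voxels: (n, 3) array of voxel centre coordinates belonging to one segment
    Returns the centre, semi-axis scales, and axis directions (columns of `axes`).
    """
    pts = np.asarray(voxels, float)
    centre = pts.mean(axis=0)
    cov = np.cov(pts - centre, rowvar=False)
    eigvals, axes = np.linalg.eigh(cov)          # eigenvalues in ascending order
    semi_axes = np.sqrt(np.maximum(eigvals, 0))  # proportional to ellipsoid semi-axes
    return centre, semi_axes, axes

# Example: points roughly filling an elongated region (hypothetical lower-leg segment)
rng = np.random.default_rng(0)
pts = rng.normal(size=(1000, 3)) * np.array([1.0, 1.0, 4.0])   # elongated along z
centre, semi_axes, axes = fit_ellipsoid(pts)
print(semi_axes)    # the largest value corresponds to the segment's long axis
```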