936 results for pacs: word processing equipment for office automation


Relevance:

20.00%

Publisher:

Abstract:

This abstract is a preliminary discussion of the importance of blending Indigenous cultural knowledges with mainstream knowledges of mathematics to support Indigenous young people. This importance is emphasised in the documents Preparing the Ground for Partnership (Priest, 2005), The Indigenous Education Strategic Directions 2008–2011 (Department of Education, Training and the Arts, 2007) and the National Goals for Indigenous Education (Department of Education, Employment and Workplace Relations, 2008). These documents highlight the contextualising of literacy and numeracy to students’ community and culture (see Priest, 2005). Here, community describes “a culture that is oriented primarily towards the needs of the group”. Martin Nakata (2007) describes contextualising to culture as being about that which already exists, that is, Torres Strait Islander community, cultural context and home languages (Nakata, 2007, p. 2). Continuing, Ezeife (2002) cites Hollins (1996) in stating that Indigenous people belong to “high-context culture groups” (p. 185). That is, “high-context cultures are characterized by a holistic (top-down) approach to information processing in which meaning is ‘extracted’ from the environment and the situation. Low-context cultures use a linear, sequential building block (bottom-up) approach to information processing in which meaning is constructed” (p. 185). In this regard, students who use holistic thought processing are more likely to be disadvantaged in mainstream mathematics classrooms, because Westernised mathematics is presented broken into parts, with limited connections made between concepts and with the students’ culture; it potentially conflicts with how they learn. If this is to change, the curriculum needs to be made more culture-sensitive and community-orientated, so that students know and understand what they are learning and for what purposes.

Relevance:

20.00%

Publisher:

Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states, and over 1000 single-base RTK systems supporting precise positioning applications for surveying, mining, agriculture and civil construction in regional areas. Additionally, future-generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS and Compass, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated network-based and single-base RTK systems, together with multiple GNSS constellations, for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:
• multiple GNSS constellations and multiple frequencies;
• large-scale, wide-area NRTK services with a network of networks;
• complex computation algorithms and processes;
• a greater part of the positioning process shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous user requests (reverse RTK).
These four challenges give rise to two major requirements for future NRTK data processing: expandable computing power and scalable data sharing/transfer capability. This research explores new approaches to addressing these challenges and requirements using a Grid Computing facility, in particular for large data-processing burdens and complex computation algorithms.
A Grid Computing based NRTK framework is proposed in this research. It is a layered framework consisting of: 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. A user’s request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework is performed in a five-node Grid environment at QUT and on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open-source software is adopted to download real-time RTCM data from multiple reference stations over the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results preliminarily demonstrate the concepts and functionality of the new Grid Computing based NRTK framework, although some aspects of system performance are yet to be improved in future work.
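For readers unfamiliar with Ntrip, the caster handshake can be sketched in a few lines of Python. This is only an illustrative assumption of how the data-download step might begin: the mountpoint and credentials are hypothetical, and the fragment builds the NTRIP v1 request string rather than opening a live connection.

```python
import base64

def ntrip_request(mountpoint, user, password, agent="NTRIP PySketch/1.0"):
    """Build the HTTP-style request an NTRIP v1 client sends to a caster."""
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    return (
        f"GET /{mountpoint} HTTP/1.0\r\n"
        f"User-Agent: {agent}\r\n"
        f"Authorization: Basic {auth}\r\n"
        "\r\n"
    )

# Hypothetical mountpoint and credentials for illustration only.
req = ntrip_request("MOUNT1", "user", "pass")
```

In a real client this request is written to a TCP socket connected to the caster; the raw RTCM byte stream follows the caster's "ICY 200 OK" reply.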

Relevance:

20.00%

Publisher:

Abstract:

In Web service based systems, new value-added Web services can be constructed by integrating existing Web services. A Web service may have many implementations that are functionally identical but have different Quality of Service (QoS) attributes, such as response time, price, reputation, reliability and availability. Thus, a significant research problem in Web service composition is how to select an implementation for each of the component Web services so that the overall QoS of the composite Web service is optimal. This is the so-called QoS-aware Web service composition problem. In some composite Web services there are dependencies and conflicts between the Web service implementations, and existing approaches cannot handle such constraints. This paper tackles the QoS-aware Web service composition problem with inter-service dependencies and conflicts using a penalty-based genetic algorithm (GA). Experimental results demonstrate the effectiveness and scalability of the penalty-based GA.
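The penalty idea behind such a GA can be sketched as follows. The utilities, conflict pair and GA parameters below are illustrative assumptions, not the paper's experimental setup: infeasible compositions stay in the population, but their fitness is reduced by a penalty for each violated constraint.

```python
import random

# Hypothetical QoS utilities: utilities[s][i] is the aggregated score of
# implementation i for component service s (higher is better).
utilities = [[0.9, 0.6], [0.5, 0.8], [0.7, 0.4]]
# Hypothetical inter-service conflict: these two (service, impl) choices
# must not appear together in one composition.
conflicts = [((0, 0), (1, 1))]
PENALTY = 10.0

def fitness(plan):
    score = sum(utilities[s][i] for s, i in enumerate(plan))
    # Penalise each violated conflict instead of discarding the plan.
    for (s1, i1), (s2, i2) in conflicts:
        if plan[s1] == i1 and plan[s2] == i2:
            score -= PENALTY
    return score

def evolve(pop_size=20, generations=50, seed=1):
    rng = random.Random(seed)
    n = len(utilities)
    pop = [[rng.randrange(len(utilities[s])) for s in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)          # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:             # mutation
                s = rng.randrange(n)
                child[s] = rng.randrange(len(utilities[s]))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

With these numbers the unconstrained optimum violates the conflict, so the GA should settle on the best conflict-free composition instead.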

Relevance:

20.00%

Publisher:

Abstract:

A continuing challenge for pre-service teacher education is the transfer of learning between the university-based and practical school-based components of training. It is not clear how easily pre-service teachers can transfer university learnings into ‘in school’ practice. Similarly, it is not clear how easily knowledge learned in the school context can be disembedded from that particular context and understood more generally by the pre-service teacher. This paper examines the effect of a community of practice formed specifically to explore learning transfer via collaboration and professional enquiry, in ‘real time’, across the globe. Activity Theory (Engestrom, 1999) provided the theoretical framework through which the cognitive, physical and social processes involved could be understood. For the study, three activity systems formed the community-of-practice network. The first activity system involved pre-service teachers at a large university in Queensland, Australia. The second, introduced by the pre-service teachers, involved Year 12 students and teachers at a private secondary school, also in Queensland, Australia. The third involved university staff engineers at a large university in Pennsylvania, USA. The common object among the three activity systems was to explore the principles and applications of nanotechnology. The participants in the two Queensland activity systems controlled laboratory equipment (a high-powered Atomic Force Microscope – CPII) in Pennsylvania, USA, with the aim of investigating surface topography and the properties of nanoparticles. The pre-service teachers were to develop their remote ‘real time’ experience into school classroom tasks, implement these tasks, and later report their findings to other pre-service teachers in the university activity system. As an extension to the project, the pre-service teachers were invited to co-author papers relating to the project.
Data were collected from (a) reflective journals; (b) participant field notes – a pre-service teacher initiative; (c) surveys – a pre-service teacher initiative; (d) lesson reflections and digital recordings – a pre-service teacher initiative; and (e) interviews with participants. The findings are reported in terms of three major themes: boundary crossing, the philosophy of teaching, and professional relationships. The findings have implications for teacher education: the researchers suggest that deliberate planning for networking between activity systems may well be a solution to the apparent theory/practice gap, and that the physical proximity of activity systems need not be a hindrance.

Relevance:

20.00%

Publisher:

Abstract:

Structural health monitoring (SHM) is the term applied to the procedure of monitoring a structure’s performance, assessing its condition and carrying out appropriate retrofitting so that it performs reliably, safely and efficiently. Bridges form an important part of a nation’s infrastructure. They deteriorate with age and changing load patterns, so early detection of damage helps prolong their service lives and prevent catastrophic failures. Monitoring of bridges has traditionally been done by means of visual inspection. With recent developments in sensor technology and the availability of advanced computing resources, newer techniques have emerged for SHM. Acoustic emission (AE) is one such technology, and it is attracting the attention of engineers and researchers around the world. This paper discusses the use of AE technology in the health monitoring of bridge structures, with a special focus on the analysis of recorded data. AE waves are stress waves generated by the mechanical deformation of a material, and can be recorded by sensors attached to the surface of the structure. Analysis of the AE signals provides vital information regarding the nature of the source of emission. Signal processing of the AE waveform data can be carried out in several ways and is predominantly based on the time and frequency domains. The short-time Fourier transform and wavelet analysis have proved to be superior alternatives to traditional frequency-based analysis for extracting information from recorded waveforms. Some preliminary results of the application of these analysis tools to the signal processing of recorded AE data are presented in this paper.
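As a rough illustration of the short-time Fourier transform step, the sketch below analyses a synthetic AE-like burst (a decaying 150 kHz tone in noise) with SciPy. The sampling rate and burst parameters are assumptions for demonstration, not recorded bridge data.

```python
import numpy as np
from scipy import signal

fs = 1_000_000                      # assumed 1 MHz sampling rate
t = np.arange(0, 0.001, 1 / fs)    # 1 ms record
# Synthetic AE burst: 150 kHz tone with an exponentially decaying
# envelope, buried in low-level noise.
burst = np.exp(-t * 8000) * np.sin(2 * np.pi * 150_000 * t)
x = burst + 0.01 * np.random.default_rng(0).standard_normal(t.size)

# STFT gives a time-frequency map; the strongest band should sit
# near the burst's 150 kHz carrier.
f, tt, Z = signal.stft(x, fs=fs, nperseg=128)
peak_freq = f[np.abs(Z).max(axis=1).argmax()]
```

Unlike a plain FFT over the whole record, the STFT localises the burst in time as well as frequency, which is what makes it useful for transient AE events.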

Relevance:

20.00%

Publisher:

Abstract:

Spatial information captured by optical remote sensors on board unmanned aerial vehicles (UAVs) has great potential for the automatic surveillance of electrical infrastructure. For an automatic vision-based power line inspection system, detecting power lines against a cluttered background is one of the most important and challenging tasks. In this paper, a novel method is proposed specifically for power line detection from aerial images. A pulse coupled neural filter is developed to remove background noise and generate an edge map before the Hough transform is employed to detect straight lines. The Hough transform is then improved by performing knowledge-based line clustering in Hough space to refine the detection results. Experiments on real image data captured from a UAV platform demonstrate that the proposed approach is effective for automatic power line detection.

Relevance:

20.00%

Publisher:

Abstract:

The application of object-based approaches to the problem of extracting vegetation information from images requires accurate delineation of individual tree crowns. This paper presents an automated method for individual tree crown detection and delineation that applies a simplified pulse coupled neural network (PCNN) model in spectral feature space, followed by post-processing using morphological reconstruction. The algorithm was tested on high-resolution multi-spectral aerial images, and the results are compared with those of two existing image segmentation algorithms. The results demonstrate that our algorithm outperforms the other two solutions, with an average accuracy of 81.8%.
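A minimal sketch of binary morphological reconstruction, assuming SciPy is available: connected components of the mask that intersect the marker survive, and the rest are discarded. The toy arrays are illustrative, not the paper's imagery.

```python
import numpy as np
from scipy import ndimage

def reconstruct(marker, mask):
    """Binary morphological reconstruction: keep only the connected
    components of `mask` that intersect `marker`."""
    labels, _ = ndimage.label(mask)           # 4-connected components
    seeds = np.unique(labels[marker & mask])  # components hit by marker
    seeds = seeds[seeds != 0]                 # drop the background label
    return np.isin(labels, seeds)

# Toy example: two blobs in the mask; the marker hits only the first,
# so the second (spurious) blob is removed.
mask = np.array([[1, 1, 0, 0, 1],
                 [1, 1, 0, 0, 1],
                 [0, 0, 0, 0, 0]], dtype=bool)
marker = np.zeros_like(mask)
marker[0, 0] = True
out = reconstruct(marker, mask)
```

In a crown-delineation pipeline, the marker would come from the PCNN response and the mask from a looser vegetation threshold, so reconstruction suppresses detections unsupported by the neural stage.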

Relevance:

20.00%

Publisher:

Abstract:

Light Detection and Ranging (LIDAR) has great potential to assist vegetation management in power line corridors by providing more accurate geometric information about the power line assets and the vegetation along the corridors. However, the development of algorithms for the automatic processing of LIDAR point cloud data, in particular for the feature extraction and classification of raw point cloud data, is still in its infancy. In this paper, we take advantage of LIDAR intensity and classify ground and non-ground points by statistically analysing the skewness and kurtosis of the intensity data. The Hough transform is then employed to detect power lines among the filtered object points. The experimental results show the effectiveness of our methods and indicate that better results were obtained using LIDAR intensity data than elevation data.
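One way the skewness statistic might be put to work is the skewness-balancing heuristic sketched below: the largest values are peeled off until the remaining sample is no longer right-skewed. This is an assumption about how such a statistical filter could be realised, not the authors' exact algorithm, and the synthetic intensity values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic intensities: a large near-symmetric "ground" population plus
# a smaller high-intensity "object" population (illustrative values).
ground = rng.normal(100, 10, 2000)
objects = rng.normal(180, 5, 300)
intensity = np.concatenate([ground, objects])

def skewness_balance(values):
    """Remove the highest values one at a time until skewness <= 0."""
    vals = np.sort(values)
    while vals.size > 2 and stats.skew(vals) > 0:
        vals = vals[:-1]
    return vals

kept = skewness_balance(intensity)
# The high-intensity object returns are stripped out, leaving a
# near-symmetric sample dominated by the ground population.
```

Kurtosis could be tested the same way to flag heavy-tailed mixtures before filtering; the sketch only exercises the skewness half of the statistic pair mentioned above.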

Relevance:

20.00%

Publisher:

Abstract:

This study investigated human visual processing of simple two-colour patterns using a delayed match-to-sample paradigm with positron emission tomography (PET). The study is unique in that the authors specifically designed the visual stimuli to be the same for both pattern and colour recognition, with all patterns being abstract two-colour shapes not easily coded verbally. The authors did this to explore the brain regions required for both colour and pattern processing, and to separate the areas of activation required for one or the other. Ten right-handed male volunteers aged 18–35 years were recruited. The authors found that both tasks activated similar occipital regions, the major difference being more extensive activation in pattern recognition. A right-sided network involving the inferior parietal lobule, the head of the caudate nucleus and the pulvinar nucleus of the thalamus was common to both paradigms. Pattern recognition also activated the left temporal pole and right lateral orbital gyrus, whereas colour recognition activated the left fusiform gyrus and several right frontal regions.

Relevance:

20.00%

Publisher:

Abstract:

Several protocols for the isolation of mycobacteria from water exist, but there is no established standard method. This study compared methods of processing potable water samples for the isolation of Mycobacterium avium and Mycobacterium intracellulare, using spiked sterilised water and tap water decontaminated with 0.005% cetylpyridinium chloride (CPC). Samples were concentrated by centrifugation or filtration and inoculated onto Middlebrook 7H10 and 7H11 plates and Lowenstein-Jensen slants, and into mycobacterial growth indicator tubes with or without polymyxin, azlocillin, nalidixic acid, trimethoprim and amphotericin B. The solid media were incubated at 32°C, at 35°C, and at 35°C with CO2, and read weekly. The results suggest that filtration is a more sensitive concentration method than centrifugation for the isolation of mycobacteria from water. The addition of sodium thiosulfate may not be necessary and may reduce the yield. Middlebrook 7H10 and 7H11 were equally sensitive culture media. CPC decontamination, while effective at reducing the growth of contaminants, also significantly reduces mycobacterial numbers. There was no difference at 3 weeks between the different incubation temperatures.

Relevance:

20.00%

Publisher:

Abstract:

Review of 'Gatz', Elevator Repair Service / Brisbane Powerhouse, published in The Australian, 12 May 2009.

Relevance:

20.00%

Publisher:

Abstract:

A laboratory-scale twin screw extruder has been interfaced with a near-infrared (NIR) spectrometer via a fibre optic link, so that NIR spectra can be collected continuously during small-scale experimental melt-state processing of polymeric materials. The system can be used to investigate melt-state processes such as reactive extrusion, in real time, in order to explore the kinetics and mechanism of the reaction. A further advantage of the system is its capability to measure apparent viscosity simultaneously, which gives important additional information about molecular weight changes and polymer degradation during processing. The system was used to study the melt processing of a nanocomposite consisting of a thermoplastic polyurethane and an organically modified layered silicate.

Relevance:

20.00%

Publisher:

Abstract:

This article explores two matrix methods for inducing the “shades of meaning” (SoM) of a word. A matrix representation of a word is computed from a corpus of traces based on the given word. Non-negative Matrix Factorisation (NMF) and Singular Value Decomposition (SVD) each compute a set of vectors, with each vector corresponding to a potential shade of meaning. The two methods were evaluated by the loss of conditional entropy with respect to two sets of manually tagged data. One set reflects concepts generally appearing in text; the second comprises words used in investigations of word sense disambiguation. Results show that NMF consistently outperforms SVD for inducing both the SoM of general concepts and word senses. The problem of inducing the shades of meaning of a word is subtler than that of word sense induction, and is hence relevant to thematic analysis of opinion, where nuances of opinion can arise.
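A minimal NMF via the Lee-Seung multiplicative updates shows how each basis vector can act as a candidate shade of meaning. The toy word-context matrix and rank are illustrative assumptions, not the article's corpus or evaluation.

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    """Factor V ≈ W @ H with non-negative W, H, using Lee-Seung
    multiplicative updates that minimise squared error."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 1e-3
    H = rng.random((k, m)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# Toy "trace" matrix with two disjoint usage patterns (rows 0-1 vs
# rows 2-3); each NMF basis vector should pick out one pattern.
V = np.array([[4., 2., 0., 0.],
              [2., 1., 0., 0.],
              [0., 0., 1., 3.],
              [0., 0., 3., 9.]])
W, H = nmf(V, k=2)
err = np.linalg.norm(V - W @ H)
```

Because the factors are non-negative, each column of W reads as an additive, interpretable usage pattern, which is one intuition for why NMF suits shade-of-meaning induction better than the signed components of an SVD.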

Relevance:

20.00%

Publisher:

Abstract:

Spoken term detection (STD) commonly involves performing word- or sub-word-level speech recognition and indexing the result. This work challenges the assumption that improved speech recognition accuracy implies better indexing for STD. Using an index derived from phone lattices, this paper examines the effect of language model selection on the relationship between phone recognition accuracy and STD accuracy. Results suggest that language models usually improve phone recognition accuracy, but that their inclusion does not always translate into improved STD accuracy. The findings suggest that using phone recognition accuracy to measure the quality of an STD index can be problematic, and highlight the need for an alternative measure that is more closely aligned with the goals of the specific detection task.

Relevance:

20.00%

Publisher:

Abstract:

While spoken term detection (STD) systems based on word indices provide good accuracy, there are several practical applications in which it is infeasible or too costly to employ a large vocabulary continuous speech recognition (LVCSR) engine. An STD system is presented that is designed to incorporate a fast phonetic decoding front-end and to be robust to decoding errors, whilst still allowing rapid search speeds. This goal is achieved through mono-phone open-loop decoding coupled with a fast hierarchical phone lattice search. Results demonstrate that an STD system designed under the constraint of a fast and simple phonetic decoding front-end requires a compromise between search speed and search accuracy.