960 results for: chiroptical switches, data processing, enantiospecificity, photochromism, steric hindrance


Relevance:

100.00%

Publisher:

Abstract:

The overarching goal of this dissertation was to evaluate the contextual components of instructional strategies for the acquisition of complex programming concepts. On the basis of the research findings, a meta-knowledge processing model is proposed to facilitate the selection of media treatments for electronic courseware. When implemented, this model extends the work of Smith (1998), as a front-end methodology for his glass-box interpreter Bradman, for teaching novice programmers. Technology now provides the means to produce individualized instructional packages with relative ease. Multimedia and Web courseware development accentuate a highly graphical (or visual) approach to instructional formats. Typically, little consideration is given to the effectiveness of screen-based visual stimuli, and, curiously, students are expected to be visually literate despite the complexity of human-computer interaction; visual literacy is much harder for some people to acquire than for others (see Chapter Four: Conditions-of-the-Learner).

An innovative research programme was devised to investigate the interactive effect of instructional strategies (enhanced with text-plus-textual metaphors or text-plus-graphical metaphors) and cognitive style on the acquisition of a special category of abstract (process) programming concepts. This type of concept was chosen to focus on the role of analogic knowledge involved in computer programming. The results are discussed within the context of the internal/external exchange process, drawing on Ritchey's (1980) concepts of within-item and between-item encoding elaborations. The methodology developed for the doctoral project integrates earlier research in a novel, interdisciplinary conceptual framework: instructional science from the USA, for the concept-learning models; British cognitive psychology and human-memory research, for defining the cognitive style construct; and Australian educational research, for the measurement tools for instructional outcomes. The experimental design consisted of a screening test to determine cognitive style, a pretest to determine prior domain knowledge of abstract programming knowledge elements, the instruction period, and a post-test to measure improved performance. This design provides a three-level discovery process that articulates: 1) the fusion of strategic knowledge required by the novice learner for dealing with contexts within instructional strategies; 2) the acquisition of knowledge, using measurable instructional outcomes and learner characteristics; and 3) knowledge of the innate environmental factors which influence instructional outcomes.

This research has successfully identified the interactive effect of instructional strategy, within an individual's cognitive style construct, on the acquisition of complex programming concepts. However, the significance of the three-level discovery process lies in the scope of the methodology to inform the design of a meta-knowledge processing model for instructional science. First, the British cognitive style testing procedure is a low-cost, user-friendly computer application that effectively measures an individual's position on the two cognitive style continua (Riding & Cheema, 1991). Second, the QUEST Interactive Test Analysis System (Izard, 1995) allows a probabilistic determination of an individual's knowledge level, relative to other participants and to test-item difficulties. Test items can be related to skill levels and can consequently be used by instructional scientists to measure knowledge acquisition. Finally, an Effect Size Analysis (Cohen, 1977) allows a direct comparison between treatment groups, giving a statistical measurement of how large an effect the independent variables have on the dependent outcomes. Combined with QUEST's hierarchical positioning of participants, this tool can assist in identifying preferred learning conditions for the evaluation of treatment groups. By combining these three assessment analysis tools in instructional research, a computerized learning shell customised for individuals' cognitive constructs can be created (McKay & Garner, 1999). While this approach has widespread application, individual researchers/trainers would nonetheless need to validate the interactive effects within their specific learning domain with an extensive pilot-study programme (McKay, 1999a; McKay, 1999b). Furthermore, the instructional material need not be limited to a textual/graphical comparison; it could be applied to any two or more instructional treatments of any kind, for instance a structured versus an exploratory strategy. The possibilities and combinations are believed to be endless, provided the focus is maintained on linking the front-end identification of cognitive style with an improved performance outcome. My in-depth analysis provides a better understanding of the interactive effects of the cognitive style construct and instructional format on the acquisition of abstract concepts involving spatial relations and logical reasoning. In providing the basis for a meta-knowledge processing model, this research is expected to be of interest to educators, cognitive psychologists, communications engineers and computer scientists specialising in human-computer interaction.
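
As a concrete illustration of the Effect Size Analysis step, the sketch below computes Cohen's d, the standardized mean difference between two treatment groups (Cohen, 1977). This is a minimal sketch, not the dissertation's instrument; the post-test scores are hypothetical.

    import numpy as np

    def cohens_d(group_a, group_b):
        """Cohen's d: standardized mean difference between two treatment groups."""
        a, b = np.asarray(group_a, float), np.asarray(group_b, float)
        na, nb = len(a), len(b)
        # Pooled standard deviation across both groups
        pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
        return (a.mean() - b.mean()) / np.sqrt(pooled_var)

    # Hypothetical post-test scores for two instructional treatments
    textual = [62, 70, 75, 68, 71, 66]
    graphical = [74, 80, 78, 83, 76, 79]
    print(f"Cohen's d = {cohens_d(graphical, textual):.2f}")

By convention, d of roughly 0.2, 0.5 and 0.8 are read as small, medium and large effects, which is what allows the direct comparison between treatment groups described above.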

Relevance:

100.00%

Publisher:

Abstract:

This action research project set out to develop the competence of senior personnel from a private vocational college in Thailand in the use of administrative computer systems. The findings demonstrate the critical significance of progressive, incremental learning tailored to the personal and professional needs of learners. Learner competence was found to depend on the creation of an environment that promotes learner confidence.

Relevance:

100.00%

Publisher:

Abstract:

RFID is gaining significant momentum as the preferred choice for automatic identification and data collection systems. However, various data processing and management problems, such as missed readings and duplicate readings, hinder wide-scale adoption of RFID systems. To this end, we propose an approach that filters the captured data, performing both noise removal and duplicate elimination. Experimental results demonstrate that the proposed approach improves the missed-data restoration process compared with the existing method.
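
The abstract does not detail the filtering algorithm; the sketch below merely illustrates the two generic cleaning steps it names, duplicate elimination within a time window and removal of rarely seen (likely noisy) tags, with hypothetical thresholds:

    from collections import defaultdict

    def clean_rfid(readings, dup_window=2.0, min_count=2):
        """readings: list of (tag_id, timestamp) pairs sorted by timestamp.
        Drops re-reads of a tag within dup_window seconds (duplicates) and
        tags observed fewer than min_count times overall (likely noise)."""
        counts = defaultdict(int)
        for tag, _ in readings:
            counts[tag] += 1
        last_seen, cleaned = {}, []
        for tag, ts in readings:
            if counts[tag] < min_count:      # noise removal
                continue
            if tag in last_seen and ts - last_seen[tag] < dup_window:
                last_seen[tag] = ts          # duplicate: refresh window, drop read
                continue
            last_seen[tag] = ts
            cleaned.append((tag, ts))
        return cleaned

    print(clean_rfid([("t1", 0.0), ("t1", 0.5), ("t2", 1.0), ("t1", 3.0)]))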

Relevance:

100.00%

Publisher:

Abstract:

This manuscript describes a facile alternative route for making thin-film yttria-stabilized zirconia (YSZ) electrolytes by liquid-phase-assisted electrophoretic deposition, utilizing an electrostatically and sterically stabilized YSZ suspension followed by sintering. The very fine YSZ particles in the ball-milled suspension sustain their dispersion through an electrostatic mechanism, as evidenced by their higher zeta potentials. Binder addition to the ball-milled suspension is also shown to contribute a complementary steric-hindrance effect on suspension stability. As a consequence, film quality and sinterability improve in the sequence: film made from the non-ball-milled suspension, film made from the ball-milled suspension, and film made from the ball-milled suspension with binder addition. The specific deposition mechanisms pertaining to each suspension are also postulated and discussed. A very thin, dense electrolyte layer of ∼10 μm can be achieved via the electrophoretic deposition route using the ball-milled suspension with binder addition; this, in turn, makes the electrolyte resistance a nearly negligible part of the overall cell resistance. Furthermore, we tested the performance of an SOFC utilizing the as-formed 10 μm YSZ electrolyte, i.e. YSZ-NiO|YSZ|LSM (La0.8Sr0.2MnO3-δ), whereby a maximum power density of ∼850 mW cm−2 at 850 °C was demonstrated.
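
To see why a ∼10 μm electrolyte makes its resistance nearly negligible, a rough area-specific resistance (ASR) estimate helps; the conductivity figure below is a typical literature value for YSZ near 850 °C, not one reported in this work:

    \mathrm{ASR} = \frac{L}{\sigma} \approx \frac{10 \times 10^{-4}\,\mathrm{cm}}{0.03\,\mathrm{S\,cm^{-1}}} \approx 0.03\,\Omega\,\mathrm{cm^2}

This is comfortably below the ∼0.15 Ω cm² often quoted as an acceptable electrolyte contribution to the overall cell resistance.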

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses a major challenge in data mining applications where full information about the underlying processes, such as sensor networks or large online databases, cannot practically be obtained due to physical limitations such as low bandwidth or limited memory, storage, or computing power. Motivated by the recent theory of direct information sampling called compressed sensing (CS), we propose a framework for detecting anomalies in these large-scale data mining applications where full information is not practically obtainable. Exploiting the facts that the intrinsic dimension of the data in these applications is typically small relative to the raw dimension and that compressed sensing can capture most information with few measurements, our work shows that spectral methods used for volume anomaly detection can be applied directly to the CS data with performance guarantees. Our theoretical contributions are supported by extensive experimental results on large datasets, which show satisfactory performance.
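
A minimal sketch of the framework's two ingredients: a random (Gaussian) compressed-sensing measurement of high-dimensional data, followed by the classical PCA residual-subspace detector applied directly to the compressed measurements. The dimensions, subspace rank and anomaly injection are illustrative, not taken from the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, m = 500, 200, 40                # samples, raw dim, compressed dim

    # Data with small intrinsic dimension, plus a few injected volume anomalies
    basis = rng.normal(size=(d, 5))
    X = rng.normal(size=(n, 5)) @ basis.T + 0.01 * rng.normal(size=(n, d))
    idx = np.arange(0, n, 50)
    X[idx] += 3 * rng.normal(size=(len(idx), d))   # anomalous rows

    Phi = rng.normal(size=(m, d)) / np.sqrt(m)     # CS measurement matrix
    Y = X @ Phi.T                                  # compressed measurements

    # PCA residual-subspace anomaly detection on the compressed data
    Yc = Y - Y.mean(axis=0)
    U, S, Vt = np.linalg.svd(Yc, full_matrices=False)
    P = Vt[:5].T                                   # top-5 principal subspace
    residual = Yc - Yc @ P @ P.T
    scores = (residual ** 2).sum(axis=1)           # squared prediction error
    print("top suspects:", np.argsort(scores)[-10:])

The point mirrors the abstract's claim: because the intrinsic dimension is small, the residual energy that flags anomalies survives the random projection.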

Relevance:

100.00%

Publisher:

Abstract:

Protein mass spectrometry (MS) pattern recognition has recently emerged as a new method for cancer diagnosis. Unfortunately, classification performance may degrade owing to the enormously high dimensionality of the data. This paper investigates the use of Random Projection (RP) for dimensionality reduction of protein MS data. The effectiveness of RP is analyzed and compared against Principal Component Analysis (PCA) using three classification algorithms, namely Support Vector Machine, feed-forward neural networks and k-nearest neighbour. Three real-world cancer data sets are employed to evaluate the performance of RP and PCA. In these investigations, the RP method demonstrated classification performance better than, or at least comparable to, PCA when the dimensionality of the projection matrix is sufficiently large. This paper also explores the use of RP as a pre-processing step prior to PCA. The results show that, without sacrificing classification accuracy, performing RP prior to PCA significantly reduces computation time.
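
A minimal sketch of the RP-before-PCA pipeline using scikit-learn; the synthetic matrix stands in for high-dimensional MS spectra, and all dimensions are illustrative rather than the paper's settings:

    import numpy as np
    from sklearn.random_projection import GaussianRandomProjection
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 15000))     # stand-in for high-dim MS spectra
    y = rng.integers(0, 2, size=200)      # stand-in cancer/control labels

    # RP cheaply shrinks 15000 -> 500 dims; PCA then refines 500 -> 20
    pipe = make_pipeline(
        GaussianRandomProjection(n_components=500, random_state=0),
        PCA(n_components=20),
        KNeighborsClassifier(n_neighbors=3),
    )
    # Random data, so the score itself is chance level; the point is that
    # PCA now factorizes a 500-column matrix instead of a 15000-column one.
    print(cross_val_score(pipe, X, y, cv=3).mean())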

Relevance:

100.00%

Publisher:

Abstract:

As a leading framework for processing and analyzing big data, MapReduce is leveraged by many enterprises to parallelize their data processing on distributed computing systems. Unfortunately, the all-to-all data forwarding from map tasks to reduce tasks in the traditional MapReduce framework generates a large amount of network traffic. The fact that the intermediate data generated by map tasks can be combined, with a significant traffic reduction in many applications, motivates us to propose a data aggregation scheme for MapReduce jobs in the cloud. Specifically, we design an aggregation architecture under the existing MapReduce framework with the objective of minimizing the data traffic during the shuffle phase, in which aggregators can reside anywhere in the cloud. Experimental results show that our proposal outperforms existing work by reducing network traffic significantly.
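
The underlying intuition, aggregating intermediate pairs before they cross the network, is the classic combiner idea; the word-count-style sketch below illustrates the traffic saving, not the paper's aggregator-placement scheme:

    from collections import Counter

    def map_task(lines):
        # Emit a (key, 1) pair for every word in this task's input split
        return [(w, 1) for line in lines for w in line.split()]

    def combine(pairs):
        # Local aggregation: one (key, partial_sum) per key leaves the node
        # instead of one pair per occurrence, shrinking shuffle traffic
        agg = Counter()
        for k, v in pairs:
            agg[k] += v
        return list(agg.items())

    def reduce_task(all_pairs):
        agg = Counter()
        for k, v in all_pairs:
            agg[k] += v
        return dict(agg)

    split = ["big data big traffic", "data data data"]
    raw = map_task(split)
    combined = combine(raw)
    print(len(raw), "->", len(combined), "pairs shuffled")
    print(reduce_task(combined))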

Relevance:

100.00%

Publisher:

Abstract:

Big data is data so large or complex that it exceeds the processing capacity of conventional data processing systems. This book provides a big picture of this broad research area, covering all phases of the big data value chain. The authors survey most of the relevant technologies in each phase. The book is recommended for readers interested in advanced research on big data, as well as for industry practitioners interested in building big data applications. Readers without the necessary technical background may need complementary reading.

Relevance:

100.00%

Publisher:

Abstract:

This work presents software developed to process solar radiation data. The software can be used at meteorological and climatic stations and also to support solar radiation measurements in research on solar energy availability, providing data quality control, statistical calculations and validation of models, as well as easy interchange of data. (C) 1999 Elsevier B.V. Ltd. All rights reserved.
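
As an illustration of the kind of quality-control test such software applies, the sketch below flags physically implausible global irradiance samples. The limits follow the general shape of common physical-limits checks; the exact tests and constants implemented in this work are not given in the abstract:

    import math

    SOLAR_CONSTANT = 1361.0  # W/m^2, approximate

    def qc_flags(ghi_series, elevation_deg_series):
        """Flag global horizontal irradiance (GHI) samples outside plausible
        bounds: non-negative and below an elevation-dependent ceiling."""
        flags = []
        for ghi, elev in zip(ghi_series, elevation_deg_series):
            mu0 = max(math.sin(math.radians(elev)), 0.0)  # cosine of zenith angle
            ceiling = 1.2 * SOLAR_CONSTANT * mu0 + 50.0   # illustrative upper limit
            flags.append("ok" if 0.0 <= ghi <= ceiling else "suspect")
        return flags

    print(qc_flags([850.0, -5.0, 1400.0], [60.0, 10.0, 30.0]))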

Relevance:

100.00%

Publisher:

Abstract:

Grinding is usually the last finishing process applied to a precision component in the manufacturing industries. The process is used to manufacture parts of different materials and thus demands low roughness, control of dimensional and shape errors, and optimum tool life, at minimum cost and time. Damage to the parts is very costly, since the previous processes and the grinding itself are wasted when a part is damaged at this stage. This work investigates the efficiency of digital signal processing tools applied to acoustic emission signals for detecting thermal damage in the grinding process. To accomplish this goal, an experimental programme of 15 runs was carried out on a surface grinding machine operating with an aluminum oxide grinding wheel on ABNT 1045 and VC131 steels. The acoustic emission signals were acquired from a fixed sensor placed on the workpiece holder. A high-sampling-rate acquisition system at 2.5 MHz was used to collect the raw acoustic emission signal instead of the root-mean-square value usually employed. In each test, the AE data were analyzed off-line, and the results were compared with an inspection of each workpiece for burn and other metallurgical anomalies. A number of statistical signal processing tools were evaluated.
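
The abstract does not list which statistics were evaluated; the sketch below computes per-window statistics commonly screened in AE-based burn detection (RMS, kurtosis, skewness) over a raw signal sampled at 2.5 MHz. The synthetic burst stands in for a possible thermal-damage event:

    import numpy as np
    from scipy.stats import kurtosis, skew

    def window_stats(ae, fs=2.5e6, win_ms=1.0):
        """Per-window RMS, kurtosis and skewness of a raw AE signal."""
        n = int(fs * win_ms / 1000.0)
        stats = []
        for start in range(0, len(ae) - n + 1, n):
            w = ae[start:start + n]
            stats.append((np.sqrt(np.mean(w ** 2)), kurtosis(w), skew(w)))
        return np.array(stats)

    rng = np.random.default_rng(0)
    sig = rng.normal(0.0, 0.1, 250_000)                  # background AE
    sig[100_000:105_000] += rng.normal(0.0, 1.0, 5_000)  # synthetic burst
    print(window_stats(sig)[:5])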

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses the problem of processing biological data such as cardiac beats and signals in the audio and ultrasonic ranges, calculating wavelet coefficients in real time with the processor clock running at the frequencies of present-day ASICs and FPGAs. The Parallel Filter Architecture for the DWT has been improved to calculate wavelet coefficients in real time with the hardware reduced to 60% of the original. The new architecture, which also processes the IDWT, is implemented with Radix-2 or Booth-Wallace constant multipliers. Including serial memory register banks, a single-integrated-circuit signal analyzer for the ultrasonic range is presented.
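
For reference, one level of the DWT such a filter architecture computes reduces to filtering followed by downsampling by two. A software sketch with Haar filters (chosen for brevity; the paper's multiplier structures are not modeled here):

    import numpy as np

    def dwt_level(x, h, g):
        """One DWT level: convolve with low-pass h and high-pass g,
        then keep every other sample (downsampling by 2)."""
        a = np.convolve(x, h)[1::2]   # approximation coefficients
        d = np.convolve(x, g)[1::2]   # detail coefficients
        return a, d

    s = np.sqrt(2.0)
    h = np.array([1.0, 1.0]) / s      # Haar low-pass filter
    g = np.array([1.0, -1.0]) / s     # Haar high-pass filter
    x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
    a, d = dwt_level(x, h, g)
    print(a)   # pairwise sums / sqrt(2)
    print(d)   # pairwise differences / sqrt(2)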

Relevance:

100.00%

Publisher:

Abstract:

The nonlinear (NL) optical properties of antimony oxide based glasses (AG) were characterized for excitation wavelengths from 800 to 1600 nm. The NL refractive indices, n2, and the two-photon absorption (TPA) coefficients, β, were evaluated using the Z-scan technique. Values of n2 ≈ 10⁻¹⁵–10⁻¹⁴ cm²/W of electronic origin were measured, and negligible TPA coefficients (β < 0.003 cm/GW) were determined. The response time of the nonlinearity is faster than 100 fs, as determined using the Kerr shutter technique. The figure of merit usually considered for all-optical switching, T = 2βλ/n2, indicates that AG are very good materials for ultrafast switches at telecom wavelengths. © 2007 IEEE.
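
Plugging the reported bounds into the figure of merit at a telecom wavelength (taking λ = 1550 nm, the upper bound on β and the smaller measured n2, i.e. the worst case):

    T = \frac{2\beta\lambda}{n_2} \lesssim \frac{2 \times (3 \times 10^{-12}\,\mathrm{cm/W}) \times (1.55 \times 10^{-4}\,\mathrm{cm})}{10^{-15}\,\mathrm{cm^2/W}} \approx 0.93

So even in the worst case T < 1, and with n2 = 10⁻¹⁴ cm²/W the figure drops to about 0.09, consistent with the claim that these glasses suit ultrafast switching at telecom wavelengths.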

Relevance:

100.00%

Publisher:

Abstract:

In geophysics and seismology, raw data need to be processed to generate useful information that researchers can turn into knowledge. The number of sensors acquiring raw data is increasing rapidly. Without good data management systems, more time can be spent querying and preparing datasets for analysis than acquiring the raw data. Moreover, a great deal of good-quality data acquired at great effort can be lost forever if it is not stored correctly; local and international cooperation will probably be reduced, and much data will never become scientific knowledge. For this reason, the Seismological Laboratory of the Institute of Astronomy, Geophysics and Atmospheric Sciences at the University of São Paulo (IAG-USP) has concentrated on its data management system. This report describes the efforts of the IAG-USP to set up a seismology data management system that facilitates local and international cooperation. © 2011 by the Istituto Nazionale di Geofisica e Vulcanologia. All rights reserved.