985 results for Interactive Techniques
Abstract:
With the goal of improving the academic performance of primary and secondary students in Malaysia by 2020, the Malaysian Ministry of Education has made a significant investment in developing a Smart School Project. The aim of this project is to introduce interactive courseware into primary and secondary schools across Malaysia. As has been the case around the world, interactive courseware is regarded as a tool to motivate students to learn meaningfully and enhance learning experiences. Through an initial pilot phase, the Malaysian government has commissioned the development of interactive courseware by a number of developers and has rolled this courseware out to selected schools over the past 12 years. However, Ministry reports and several independent researchers have concluded that its uptake has been limited, and that much of the courseware has not been used effectively in schools. This has been attributed to weaknesses in the interface design of the courseware, which, it has been argued, fails to accommodate the needs of students and teachers. Taking the Smart School Project's science courseware as a sample, this research project has investigated the extent, nature, and reasons for the problems that have arisen. In particular, it has focused on examining the quality and effectiveness of the interface design in facilitating interaction and supporting learning experiences. The analysis has been conducted empirically, by first comparing the interface design principles, characteristics and components of the existing courseware against best practice, as described in the international literature, as well as against the government guidelines provided to the developers. An ethnographic study was then undertaken to observe how the courseware is used and received in the classroom, and to investigate stakeholders' (school principals', teachers' and students') perceptions of its usability and effectiveness.
Finally, to understand how issues may have arisen, a review of the development process has been undertaken and it has been compared to development methods recommended in the literature, as well as the guidelines provided to the developers. The outcomes of the project include an empirical evaluation of the quality of the interface design of the Smart School Project's science courseware; the identification of other issues that have affected its uptake; an evaluation of the development process and, out of this, an extended set of principles to guide the design and development of future Smart School Project courseware to ensure that it accommodates the various stakeholders' needs.
Abstract:
This paper investigates the use of the dimensionality-reduction techniques weighted linear discriminant analysis (WLDA) and weighted median Fisher discriminant analysis (WMFD) before probabilistic linear discriminant analysis (PLDA) modeling, for the purpose of improving speaker verification performance in the presence of high inter-session variability. Recently, it was shown that WLDA techniques can provide improvement over traditional linear discriminant analysis (LDA) for channel compensation in i-vector based speaker verification systems. We show in this paper that the speaker-discriminative information available in the distances between pairs of speakers clustered in the development i-vector space can also be exploited in heavy-tailed PLDA modeling by using the weighted discriminant approaches prior to PLDA modeling. Based upon the results presented in this paper using the NIST 2008 Speaker Recognition Evaluation dataset, we believe that WLDA and WMFD projections before PLDA modeling can provide an improved approach when compared to uncompensated PLDA modeling for i-vector based speaker verification systems.
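The weighting idea behind WLDA can be sketched concretely. Below is a minimal, illustrative computation of a distance-weighted between-class scatter matrix from speaker mean i-vectors, assuming an inverse-squared-distance weighting so that close (easily confused) speaker pairs dominate. The function name, weighting choice and toy data are hypothetical, not taken from the paper; a full system would obtain projection directions from the eigenvectors of the whitened scatter.

```python
import math

def weighted_between_class_scatter(speaker_means, weight=lambda d: 1.0 / (d * d)):
    """Distance-weighted between-class scatter: close (confusable) speaker
    pairs receive larger weights than well-separated ones (illustrative)."""
    dim = len(speaker_means[0])
    S_b = [[0.0] * dim for _ in range(dim)]
    n = len(speaker_means)
    for i in range(n):
        for j in range(i + 1, n):
            diff = [a - b for a, b in zip(speaker_means[i], speaker_means[j])]
            d = math.sqrt(sum(x * x for x in diff))
            w = weight(d)
            for r in range(dim):
                for c in range(dim):
                    S_b[r][c] += w * diff[r] * diff[c]
    return S_b

# Toy 2-D "i-vector" speaker means (hypothetical data)
means = [[0.0, 0.0], [0.1, 0.0], [3.0, 4.0]]
S_b = weighted_between_class_scatter(means)
```

Swapping the weighting function for a median-based variant gives the flavour of the WMFD approach; either way the point is that the scatter, and hence the projection, is dominated by the hardest-to-separate speaker pairs.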
Abstract:
Navigational collisions are one of the major safety concerns for many seaports. Continuing growth of shipping traffic, in both the number and size of vessels, is likely to result in an increased number of traffic movements, which consequently could result in a higher risk of collisions in these restricted waters. This continually increasing safety concern warrants a comprehensive technique for modeling collision risk in port waters, particularly for modeling the probability of collision events and the associated consequences (i.e., injuries and fatalities). A number of techniques have been utilized for modeling the risk qualitatively, semi-quantitatively and quantitatively. These traditional techniques mostly rely on historical collision data, often in conjunction with expert judgments. However, they are hampered by several shortcomings: the randomness and rarity of collision occurrences yield too few collision counts for a sound statistical analysis; they are insufficient for explaining collision causation; and they take a reactive approach to safety. A promising alternative that overcomes these shortcomings is the navigational traffic conflict technique (NTCT), which uses traffic conflicts as an alternative to collisions for modeling the probability of collision events quantitatively. This article explores the existing techniques for modeling collision risk in port waters. In particular, it identifies the advantages and limitations of the traditional techniques and highlights the potential of the NTCT in overcoming those limitations. In view of the principles of the NTCT, a structured method for managing collision risk is proposed. This risk management method allows safety analysts to diagnose safety deficiencies in a proactive manner, and consequently has great potential for managing collision risk in a fast, reliable and efficient manner.
Abstract:
This chapter reviews common barriers to community engagement for Latino youth and suggests ways to move beyond those barriers by empowering them to communicate their experiences, address the challenges they face, and develop recommendations for making their community more youth-friendly. As a case study, this chapter describes a program called Youth FACE IT (Youth Fostering Active Community Engagement for Integration and Transformation) in Boulder County, Colorado. The program enables Latino youth to engage in critical dialogue and participate in a community-based initiative. The chapter concludes by explaining specific strategies that planners can use to support active community engagement and develop a future generation of planners and engaged community members that reflects emerging demographics.
Abstract:
In this paper we consider the variable order time fractional diffusion equation. We adopt the Coimbra variable order (VO) time fractional operator, which defines a consistent method for VO differentiation of physical variables. The Coimbra variable order fractional operator can also be viewed as a Caputo-type definition. Although this is the most appropriate definition, having fundamental characteristics that are desirable for physical modeling, numerical methods for fractional partial differential equations using this definition have not yet appeared in the literature. Here, an approximate scheme is first proposed. The stability, convergence and solvability of this numerical scheme are discussed via the technique of Fourier analysis. Numerical examples are provided to show that the numerical method is computationally efficient.
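For reference, the Coimbra variable order operator mentioned above is commonly written as follows, for 0 < α(t) < 1 (one common form; the paper's exact convention may differ):

```latex
{}^{\mathcal{C}} D_t^{\alpha(t)} f(t)
  = \frac{1}{\Gamma\bigl(1-\alpha(t)\bigr)}
    \int_{0^+}^{t} (t-\sigma)^{-\alpha(t)}\, f'(\sigma)\, d\sigma
  + \frac{\bigl(f(0^+)-f(0^-)\bigr)\, t^{-\alpha(t)}}{\Gamma\bigl(1-\alpha(t)\bigr)}
```

When f is continuous at t = 0 the second term vanishes, which is consistent with the Caputo-type interpretation given above.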
Abstract:
The research team recognized the value of network-level Falling Weight Deflectometer (FWD) testing for evaluating the structural condition trends of flexible pavements. However, practical limitations due to the cost of testing, traffic control and safety concerns, and the difficulty of testing a large network may discourage some agencies from conducting network-level FWD testing. For this reason, the surrogate measure of the Structural Condition Index (SCI) is suggested for use. The main purpose of the research presented in this paper is to investigate data mining strategies and to develop a prediction method for structural condition trends in network-level applications that does not require FWD testing. The research team first evaluated the existing and historical pavement condition, distress, ride, traffic and other data attributes in the Texas Department of Transportation (TxDOT) Pavement Maintenance Information System (PMIS), applied data mining strategies to the data, discovered useful patterns and knowledge for SCI value prediction, and finally provided a reasonable measure of pavement structural condition which is correlated to the SCI. To evaluate the performance of the developed prediction approach, a case study was conducted using SCI data calculated from FWD data collected on flexible pavements over a 5-year period (2005–09) from 354 PMIS sections representing 37 pavement sections on the Texas highway system. The preliminary study results showed that the proposed approach can be used as a supportive pavement structural index when FWD deflection data are not available, and can help pavement managers identify the timing and appropriate treatment level of preventive maintenance activities.
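The kind of surrogate prediction described above can be sketched as a simple regression of SCI on a pavement condition attribute. The sketch below uses a closed-form ordinary least-squares fit on hypothetical ride-score data; the real approach mines many PMIS attributes and the data here are invented for illustration only.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit ys ~ a*xs + b, via the closed-form
    normal equations for a single predictor."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical PMIS-style data: ride score vs. FWD-derived SCI
ride = [3.1, 3.5, 2.8, 4.0, 4.4, 2.5]
sci = [0.72, 0.80, 0.65, 0.88, 0.95, 0.60]
a, b = fit_line(ride, sci)
predicted_sci = a * 3.8 + b  # estimate SCI for a section with no FWD data
```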
Abstract:
In the context of increasing demand for potable water and the depletion of water resources, stormwater is a logical alternative. However, stormwater contains pollutants, among which metals are of particular interest due to their toxicity and persistence in the environment. Hence, it is imperative to remove toxic metals from stormwater to the levels prescribed by drinking water guidelines for potable use. Consequently, various techniques have been proposed, among which sorption using low-cost sorbents is economically viable and environmentally benign in comparison to other techniques. However, sorbents show affinity towards certain toxic metals, which results in poor removal of other toxic metals. It was hypothesised in this study that a mixture of sorbents with different metal affinity patterns can be used for the efficient removal of a range of toxic metals commonly found in stormwater. The performance of six sorbents in the sorption of Al, Cr, Cu, Pb, Ni, Zn and Cd, the toxic metals commonly found in urban stormwater, was investigated to select suitable sorbents for creating the mixtures. For this purpose, a multi-criteria analytical protocol was developed using the decision-making methods PROMETHEE (Preference Ranking Organisation METHod for Enrichment Evaluations) and GAIA (Graphical Analysis for Interactive Assistance). Zeolite and seaweed were selected for the creation of trial mixtures based on their metal affinity patterns and their performance on predetermined selection criteria. The metal sorption mechanisms employed by seaweed and zeolite were defined using kinetics, isotherm and thermodynamics parameters, which were determined using batch sorption experiments.
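The PROMETHEE II ranking step used in such a protocol can be sketched briefly. The code below computes net outranking flows under the simple "usual" preference function (a criterion contributes its full weight whenever one alternative strictly beats another); the sorbent scores and weights are illustrative, not data from the study.

```python
def promethee_net_flows(scores, weights):
    """PROMETHEE II with the 'usual' preference function: per criterion,
    P(a,b) = 1 if a strictly beats b, else 0 (all criteria maximised)."""
    n = len(scores)
    total_w = sum(weights)

    def pi(a, b):  # aggregated preference of alternative a over b
        return sum(w for w, sa, sb in zip(weights, scores[a], scores[b])
                   if sa > sb) / total_w

    flows = []
    for a in range(n):
        phi_plus = sum(pi(a, b) for b in range(n) if b != a) / (n - 1)
        phi_minus = sum(pi(b, a) for b in range(n) if b != a) / (n - 1)
        flows.append(phi_plus - phi_minus)  # net flow ranks alternatives
    return flows

# Hypothetical sorbent performance on three criteria (higher is better)
scores = [[0.9, 0.4, 0.7],   # e.g. zeolite
          [0.6, 0.8, 0.8],   # e.g. seaweed
          [0.3, 0.5, 0.2]]   # e.g. a rejected sorbent
flows = promethee_net_flows(scores, weights=[0.5, 0.3, 0.2])
ranking = sorted(range(len(flows)), key=lambda i: -flows[i])
```

GAIA is essentially a principal-components visualisation of the same per-criterion flow data, so the net flows above are the shared starting point for both methods.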
Additionally, the kinetics rate-limiting steps were identified using an innovative approach based on GAIA and Spearman correlation techniques, developed as part of this study to overcome the limitations of conventional graphical methods in predicting the degree to which each kinetics step limits the overall metal removal rate. The sorption kinetics of zeolite was found to be primarily limited by intraparticle diffusion followed by the sorption reaction steps, which were governed mainly by the hydrated ionic diameter of the metals. The isotherm study indicated that the metal sorption mechanism of zeolite was primarily of a physical nature. The thermodynamics study confirmed that the energetically favourable nature of sorption increased in the order of Zn < Cu < Cd < Ni < Pb < Cr < Al, which is in agreement with the metal sorption affinity of zeolite. Hence, sorption thermodynamics has an influence on the metal sorption affinity of zeolite. On the other hand, the primary kinetics rate-limiting step for seaweed was the sorption reaction process followed by intraparticle diffusion. Boundary layer diffusion was also found to limit the metal sorption kinetics at low concentrations. According to the sorption isotherm study, Cd, Pb, Cr and Al were sorbed by seaweed via ion exchange, whilst sorption of Ni occurred via physisorption. Furthermore, ionic bonding was responsible for the sorption of Zn. The thermodynamics study confirmed that sorption by seaweed was energetically favourable in the order of Zn < Cu < Cd < Cr ≈ Al < Pb < Ni. However, this did not agree with the affinity series derived for seaweed, suggesting a limited influence of sorption thermodynamics on the metal affinity of seaweed. The investigation of zeolite-seaweed mixtures indicated that mixing sorbents has an effect on the kinetics rates and the sorption affinity.
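The conventional graphical route to the intraparticle-diffusion contribution that the study's GAIA/Spearman approach improves upon is the Weber–Morris plot, a linear fit of uptake against the square root of time. A minimal sketch with hypothetical batch-sorption data:

```python
import math

def weber_morris_fit(times, uptakes):
    """Fit q_t = k_id * sqrt(t) + C (Weber-Morris intraparticle diffusion
    model) by ordinary least squares on the transformed variable sqrt(t)."""
    xs = [math.sqrt(t) for t in times]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(uptakes) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (q - my) for x, q in zip(xs, uptakes))
    k_id = sxy / sxx       # intraparticle diffusion rate constant
    C = my - k_id * mx     # intercept, read as a boundary-layer effect
    return k_id, C

# Hypothetical batch data: contact time (min) and metal uptake q_t (mg/g)
times = [1, 4, 9, 16, 25, 36]
q = [1.2, 2.1, 3.0, 4.1, 4.9, 6.1]
k_id, C = weber_morris_fit(times, q)
```

A near-zero intercept C suggests intraparticle diffusion dominates; a large intercept points to a boundary-layer contribution, which is exactly the kind of qualitative judgment the study replaced with correlation-based analysis.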
Additionally, theoretical relationships were derived to predict the boundary layer diffusion rate, intraparticle diffusion rate, sorption reaction rate and enthalpy of mixtures based on those of the individual sorbents. In general, low coefficients of determination (R²) for the relationships between theoretical and experimental data indicated that the relationships were not statistically significant. This was attributed to the heterogeneity of the properties of the sorbents. Nevertheless, in relative terms, the intraparticle diffusion rate, sorption reaction rate and enthalpy of sorption had higher R² values than the boundary layer diffusion rate, suggesting that there was some relationship between the former set of parameters for mixtures and those for the individual sorbents. The mixture containing 80% zeolite and 20% seaweed showed similar affinity for the sorption of Cu, Ni, Cd, Cr and Al, which was attributed to the approximately similar sorption enthalpies of these metal ions. Therefore, it was concluded that the seaweed-zeolite mixture can be used to obtain the same affinity for various metals present in a multi-metal system, provided the metal ions have similar enthalpies during sorption by the mixture.
Abstract:
Digital information that is place- and time-specific is increasingly becoming available on all aspects of the urban landscape. People (cf. the Social Web), places (cf. the Geo Web), and physical objects (cf. ubiquitous computing, the Internet of Things) are increasingly infused with sensors and actuators, and tagged with a wealth of digital information. Urban informatics research explores these emerging digital layers of the city at the intersection of people, place and technology. However, little is known about the challenges and new opportunities that these digital layers may offer to road users driving through today's megacities. We argue that this aspect is worth exploring, particularly with regard to Auto-UI's overarching goal of making cars both safer and more enjoyable. This paper presents the findings of a pilot study in which 14 urban informatics research experts participated in a guided ideation (idea creation) workshop within a simulated environment. They were immersed in different driving scenarios to imagine novel urban informatics applications specific to the driving context.
Abstract:
In Australia, railway systems play a vital role in transporting the sugarcane crop from farms to mills. The sugarcane transport system is very complex and uses daily schedules, consisting of a set of locomotive runs, to satisfy the requirements of the mill and harvesters. The total cost of sugarcane transport operations is very high; over 35% of the total cost of sugarcane production in Australia is incurred in cane transport. Efficient schedules for sugarcane transport can reduce the cost and limit the negative effects that this system can have on the raw sugar production system. There are several benefits to formulating the train scheduling problem as a blocking parallel-machine job shop scheduling (BPMJSS) problem, namely: preventing two trains from occupying one section at the same time; keeping the train activities (operations) in sequence during each run (trip) by applying precedence constraints; passing trains through a section in the correct order (priorities of passing trains) by applying disjunctive constraints; and easing train passing by resolving rail conflicts through blocking constraints and parallel machine scheduling. Therefore, the sugarcane rail operations are formulated as a BPMJSS problem. Mixed integer programming and constraint programming approaches are used to describe the BPMJSS problem. The model is solved by the integration of constraint programming, mixed integer programming and search techniques. The optimality performance is tested with the Optimization Programming Language (OPL) and CPLEX software on small and large instances based on specific criteria. A real-life problem is used to verify and validate the approach. Constructive heuristics and new metaheuristics, including simulated annealing and tabu search, are proposed to solve this complex and NP-hard scheduling problem and produce a more efficient scheduling system.
Innovative hybrid and hyper metaheuristic techniques are developed and coded in C# to improve solution quality and CPU time. Hybrid techniques integrate heuristic and metaheuristic techniques consecutively, while hyper techniques fully integrate different metaheuristic techniques, heuristic techniques, or both.
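The simulated annealing component mentioned above can be sketched with a generic skeleton. The toy objective below (ordering jobs to minimise total completion time) stands in for the far richer BPMJSS objective and neighbourhoods; all names and data are illustrative.

```python
import math
import random

def simulated_annealing(cost, initial, neighbour,
                        t0=10.0, cooling=0.95, steps=2000, seed=42):
    """Generic simulated annealing: accept worse neighbours with probability
    exp(-delta/T) to escape local optima; track the best solution seen."""
    rng = random.Random(seed)
    current, best, temp = initial, initial, t0
    for _ in range(steps):
        cand = neighbour(current, rng)
        delta = cost(cand) - cost(current)
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            current = cand
        if cost(current) < cost(best):
            best = current
        temp *= cooling
    return best

# Toy stand-in for ordering locomotive runs: minimise total completion time
durations = [5, 3, 8, 2, 7]

def cost(order):
    t, total = 0, 0
    for job in order:
        t += durations[job]
        total += t  # sum of completion times over all jobs
    return total

def swap_neighbour(order, rng):
    i, j = rng.sample(range(len(order)), 2)
    new = list(order)
    new[i], new[j] = new[j], new[i]
    return new

best = simulated_annealing(cost, list(range(len(durations))), swap_neighbour)
```

A hybrid scheme in the paper's sense would seed `initial` with a constructive heuristic before annealing; a hyper scheme would additionally switch among neighbourhood moves (swap, insert, tabu-guided) during the run.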
Abstract:
This practice-led study explores different ways the subject of sustain-ability can be addressed within an Interactive Media Arts practice. The exploration encompasses three creative projects: Charmed, Distracted and e. Menura superba. Grounded in an ecological philosophy inspired by vegetarianism and the critical design philosophy of defuturing, the work shows how such a philosophical position can guide the redirection of practice. The concern for sustain-ability within my practice, and more generally the question of Interactive Media Arts and sustain-ability, I refer to as a problématique. The objective of this study is not to find an answer or a truth to an instrumentally posed question, but to explore the complexities of the problématique through a program of practice and intellectual investigation. The aim is to redirect my practice and to find a renewed raison d'être for practice through a process of opening up, encountering, and discovering otherwise unknown possibilities for practice. In the context of sustain-ability, this opening up of possibilities can be considered a form of futuring. Such futuring, I argue, is only possible if the things we take for granted as integral aspects of our being, practices and life worlds are revealed in ways that estrange them, rendering them visible in ways that allow questioning and change.
Abstract:
Background: Outside the mass spectrometer, proteomics research does not take place in a vacuum. It is affected by policies on funding and research infrastructure. Proteomics research both impacts and is impacted by potential clinical applications. It provides new techniques and clinically relevant findings, but the possibilities for such innovations (and thus funders' perception of the field's potential) are also affected by regulatory practices and the readiness of the health sector to incorporate proteomics-related tools and findings. Key to this process is how knowledge is translated. Methods: We present preliminary results from a multi-year social science project, funded by the Canadian Institutes of Health Research, on the processes and motivations for knowledge translation in the health sciences. The proteomics case within this wider study uses qualitative methods to examine the interplay between proteomics science and regulatory and policy makers regarding clinical applications of proteomics. Results: Adopting an interactive format to encourage conference attendees' feedback, our poster focuses on deficits in effective knowledge translation strategies from the laboratory to policy, clinical and regulatory arenas. An analysis of the interviews conducted to date suggests five significant choke points: the changing priorities of funding agencies; the complexity of proteomics research; the organisation of proteomics research; the relationship of proteomics to genomics and other omics sciences; and conflict over the appropriate role of standardisation. Conclusion: We suggest that engagement with aspects of knowledge translation, such as those mentioned above, is crucially important for the eventual clinical application of proteomics science on any meaningful scale.
Abstract:
Real-time remote sales assistance is an underdeveloped component of online sales services. Solutions involving web page text chat, telephony and video support prove problematic when seeking to remotely guide customers through their sales processes, especially with configurations of physically complex artefacts. Recently, there has been great interest in the application of virtual worlds and augmented reality to create synthetic environments for remote sales of physical artefacts. However, there is a lack of analysis and development of appropriate software services to support these processes. We extend our previous work with the detailed design of configuration context services to support the management of an interactive sales session using augmented reality. We detail the context and configuration services required, presenting a novel data service that streams configuration information to the vendor for business analytics. We expect that a fully implemented configuration management service, based on our design, will improve the remote sales experience for customers and vendors alike via analysis of the streamed information.
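One way to picture the streamed configuration information is as a sequence of self-describing events, one per customer configuration change, that a vendor-side analytics consumer can ingest. The sketch below is purely illustrative: the event schema, field names and data are assumptions, not the paper's design.

```python
import json
import time

def config_event(session_id, component, option, price_delta):
    """One configuration-change event, serialised as a JSON line suitable
    for streaming to a vendor analytics consumer (illustrative schema)."""
    return json.dumps({
        "ts": time.time(),          # event timestamp
        "session": session_id,      # interactive sales session identifier
        "component": component,     # artefact component being configured
        "option": option,           # option the customer selected
        "price_delta": price_delta, # price impact of this change
    })

# A hypothetical fragment of one session's configuration stream
events = [
    config_event("s-01", "chassis", "long-wheelbase", 1200.0),
    config_event("s-01", "colour", "red", 0.0),
]
```

Newline-delimited JSON keeps each event independently parseable, so the vendor can analyse a session incrementally while the customer is still configuring.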
Abstract:
The overall aim of this project was to contribute to existing knowledge regarding methods for measuring the characteristics of airborne nanoparticles and controlling occupational exposure to them, and to gather data on nanoparticle emission and transport in various workplaces. The scope of this study involved investigating the characteristics and behaviour of particles arising from the operation of six nanotechnology processes, subdivided into nine processes for measurement purposes. It did not include toxicological evaluation of the aerosols and, therefore, no direct conclusions were made regarding the health effects of exposure to these particles. Our research included real-time measurement of submicrometre and supermicrometre particle number and mass concentrations, count median diameter, and alveolar deposited surface area using condensation particle counters, an optical particle counter, a DustTrak photometer, a scanning mobility particle sizer, and a nanoparticle surface area monitor, respectively. Off-line particle analysis included scanning and transmission electron microscopy, energy-dispersive x-ray spectrometry, and thermal optical analysis of elemental carbon. Sources of fibrous and non-fibrous particles were included.