133 results for Idea of approaches to a number


Relevance:

100.00%

Publisher:

Abstract:

With the advances in computer hardware and software development techniques over the past 25 years, digital computer simulation of train movement and traction systems has been widely adopted as a standard computer-aided engineering tool [1] during the design and development stages of existing and new railway systems. Simulators of different approaches and scales are used extensively for a wide range of system studies. Simulation has proven to be the cheapest means of carrying out performance prediction and system behaviour characterisation. When computers were first used to study railway systems, they were mainly employed to perform repetitive but time-consuming computational tasks, such as matrix manipulations for power network solution and exhaustive searches for optimal braking trajectories. With only simple high-level programming languages available at the time, full advantage of the computing hardware could not be taken. Hence, structured simulations of the whole railway system were not very common. Most applications focused on isolated parts of the railway system. It is more appropriate to regard those applications as primarily mechanised calculations rather than simulations. However, a railway system consists of a number of subsystems, such as train movement, power supply and traction drives, which inevitably contain many complexities and diversities. These subsystems interact frequently with each other while the trains are moving, and each takes on particular features in different railway systems. To further complicate the simulation requirements, constraints like track geometry, speed restrictions and friction have to be considered, not to mention possible non-linearities and uncertainties in the system. In order to provide a comprehensive and accurate account of system behaviour through simulation, a large amount of data has to be organised systematically to ensure easy access and efficient representation, and the interactions and relationships among the subsystems should be defined explicitly. These requirements call for sophisticated and effective simulation models for each component of the system. The software development techniques available nowadays allow the evolution of such simulation models. Not only is the applicability of the simulators greatly enhanced by advanced software design; maintainability and modularity for easy understanding and further development, as well as portability across hardware platforms, are also encouraged. The objective of this paper is to review the development of a number of approaches to simulation models. Particular attention is given to models for train movement, power supply systems and traction drives. These models have been successfully used to enable various ‘what-if’ issues to be resolved effectively in a wide range of applications, such as speed profiles, energy consumption and run times.
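
As a minimal illustration of the train-movement component of such simulators (a sketch of the general technique, not the models reviewed in the paper), the following Python snippet integrates the longitudinal dynamics of a single train under a Davis-type resistance law and a speed restriction; all coefficients and track data are invented placeholders.

```python
# Minimal single-train movement simulation: forward Euler integration of
# F = m*a with tractive effort, Davis-type running resistance and a speed limit.
# All coefficients below are illustrative placeholders, not real vehicle data.

def davis_resistance(v, a=2000.0, b=30.0, c=5.0):
    """Running resistance in newtons at speed v (m/s): A + B*v + C*v^2."""
    return a + b * v + c * v * v

def simulate_run(distance=2000.0, mass=200e3, max_tractive_effort=150e3,
                 speed_limit=25.0, dt=0.5):
    """Return a list of (time, position, speed) samples until `distance` is covered."""
    t, x, v = 0.0, 0.0, 0.0
    history = [(t, x, v)]
    while x < distance:
        # Full power until the speed limit is reached, then hold speed.
        effort = max_tractive_effort if v < speed_limit else davis_resistance(v)
        accel = (effort - davis_resistance(v)) / mass
        v = max(0.0, v + accel * dt)
        x += v * dt
        t += dt
        history.append((t, x, v))
    return history

if __name__ == "__main__":
    run = simulate_run()
    t_end, x_end, v_end = run[-1]
    print(f"run time {t_end:.0f} s, distance {x_end:.0f} m, speed {v_end:.1f} m/s")
```

A full simulator of the kind reviewed would add braking curves, gradient and curvature forces, and coupling to the power-supply and traction-drive models.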

Relevance:

100.00%

Publisher:

Abstract:

The aim of this work was to quantify exposure to particles emitted by wood-fired ovens in pizzerias. Overall, 15 microenvironments were chosen and analyzed in a 14-month experimental campaign. Particle number concentration and distribution were measured simultaneously using a Condensation Particle Counter (CPC), a Scanning Mobility Particle Sizer (SMPS) and an Aerodynamic Particle Sizer (APS). The surface area and mass distributions and concentrations, as well as estimates of lung-deposited surface area and PM1, were evaluated using the SMPS-APS system with dosimetric models, taking into account the presence of aggregates on the basis of the Idealized Aggregate (IA) theory. The fraction of inhaled particles deposited in the respiratory system and different fractions of particulate matter were also measured by means of a Nanoparticle Surface Area Monitor (NSAM) and a photometer (DustTrak DRX), respectively. In this way, supplementary data were obtained while monitoring trends inside the pizzerias. We found that surface area and PM1 particle concentrations in pizzerias can be very high, especially when compared to other critical microenvironments such as transport hubs. During pizza cooking under normal ventilation conditions, concentrations up to 74, 70 and 23 times higher than background levels were found for number, surface area and PM1, respectively. A key parameter is the oven shape factor, defined as the ratio between the size of the face opening in respect
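
As a rough illustration of how metrics of this kind are typically combined (the numbers below are invented placeholders, not the study's measurements), the following Python sketch computes a concentration ratio to background and a lung-deposited surface-area dose as concentration × minute ventilation × exposure time.

```python
# Illustrative exposure calculation; all numbers are made-up placeholders,
# not measurements from the study.

def deposited_surface_area_dose(ldsa_conc_um2_cm3, minute_ventilation_l_min,
                                duration_min):
    """Lung-deposited surface-area dose in um^2:
    concentration (um^2/cm^3) * ventilation (cm^3/min) * time (min)."""
    ventilation_cm3_min = minute_ventilation_l_min * 1000.0
    return ldsa_conc_um2_cm3 * ventilation_cm3_min * duration_min

background = 40.0        # um^2/cm^3, hypothetical background LDSA concentration
during_cooking = 2800.0  # um^2/cm^3, hypothetical level during pizza cooking

ratio = during_cooking / background
dose = deposited_surface_area_dose(during_cooking,
                                   minute_ventilation_l_min=15.0,
                                   duration_min=60.0)
print(f"concentration ratio to background: {ratio:.0f}x")
print(f"deposited surface-area dose over 1 h: {dose:.2e} um^2")
```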

Relevance:

100.00%

Publisher:

Abstract:

This overview focuses on the application of chemometrics techniques to the investigation of soils contaminated by polycyclic aromatic hydrocarbons (PAHs) and metals, because these two important and very diverse groups of pollutants are ubiquitous in soils. The salient features of various studies carried out in the micro- and recreational environments of humans are highlighted in the context of the various multivariate statistical techniques available across discipline boundaries that have been effectively used in soil studies. Particular attention is paid to techniques employed in the geosciences that may be effectively utilized for environmental soil studies; classical multivariate approaches that may be used in isolation or as complementary methods to these are also discussed. Chemometrics techniques widely applied in atmospheric studies for identifying sources of pollutants or for determining the importance of contaminant source contributions to a particular site have seen little use in soil studies, but may be effectively employed in such investigations. Suitable programs are also available for suggesting mitigating measures in cases of soil contamination, and these are also considered. Specific techniques reviewed include pattern recognition techniques such as Principal Components Analysis (PCA), Fuzzy Clustering (FC) and Cluster Analysis (CA); geostatistical tools include variograms, Geographical Information Systems (GIS), contour mapping and kriging; source identification and contribution estimation methods reviewed include Positive Matrix Factorisation (PMF) and Principal Component Analysis on Absolute Principal Component Scores (PCA/APCS). Mitigating measures to limit or eliminate pollutant sources may be suggested through the use of ranking analysis and multi-criteria decision making (MCDM) methods. These methods are mainly represented in this review by studies employing the Preference Ranking Organisation Method for Enrichment Evaluation (PROMETHEE) and its associated graphic output, Geometrical Analysis for Interactive Aid (GAIA).
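
To make the pattern-recognition step concrete, here is a minimal sketch of PCA applied to a soil contaminant matrix (samples × analytes) using scikit-learn; the data are random placeholders and the autoscaling choice is one common option, not a prescription from the review.

```python
# Minimal PCA sketch for a soil contaminant matrix (samples x analytes).
# The data are random placeholders standing in for measured PAH/metal
# concentrations; a real study would load a measured data set instead.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
analytes = ["naphthalene", "pyrene", "benzo[a]pyrene", "Pb", "Zn", "Cu"]
X = rng.lognormal(mean=1.0, sigma=0.8, size=(50, len(analytes)))  # 50 samples

X_scaled = StandardScaler().fit_transform(X)   # autoscale: mean 0, unit variance
pca = PCA(n_components=3).fit(X_scaled)
scores = pca.transform(X_scaled)               # sample scores on PC1-PC3

print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
print("loadings on PC1:", dict(zip(analytes, np.round(pca.components_[0], 2))))
```

The scores can then be mapped or clustered, and source-apportionment methods such as PMF or PCA/APCS build on the same sample-by-analyte matrix.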

Relevance:

100.00%

Publisher:

Abstract:

The robust economic growth across South East Asia and the significant advances in nanotechnologies over the past two decades have resulted in the creation of intelligent urban infrastructures. Cities like Seoul, Tokyo and Hong Kong have been competing against each other to develop the first ‘ubiquitous city’, a strategic global node of science and technology that provides all municipal services to residents and visitors via ubiquitous infrastructures. This chapter scrutinises the development of ubiquitous and smart infrastructure in Korea, Japan and Hong Kong. These cases provide invaluable lessons for policy-makers and for urban and infrastructure planners considering the adoption of such systems approaches in their cities.

Relevance:

100.00%

Publisher:

Abstract:

Purpose: Graduated driver licensing (GDL) has been introduced in numerous jurisdictions in Australia and internationally in an attempt to ameliorate the significantly greater risk of death and injury that young novice drivers face from road crashes. The GDL program in Queensland, Australia, was extensively modified in July 2007. This paper reports the driving and licensing experiences of Learner drivers progressing through the current-GDL program and compares them with the experiences of Learners who progressed through the former-GDL program.

Method: Young drivers (n = 1032; 609 females, 423 males) aged 17 to 19 years (M = 17.43, SD = 0.67) were recruited as they progressed from a Learner to a Provisional driver’s licence. They completed a survey exploring their sociodemographic characteristics and their driving and licensing experiences as a Learner. Key measures for a subsample (n = 183) of the current-GDL drivers were compared with the former-GDL drivers (n = 149) via t-tests and chi-square analyses.

Results: As expected, Learner drivers progressing through the current-GDL program gained significantly more driving practice than those in the former program, and this practice was more likely to be provided by mothers than in the past. Female Learners in the current-GDL program reported less difficulty obtaining supervision than those in the former program. The number of attempts needed to pass the practical driving assessment did not change, nor did the amount of professional supervision. The current-GDL Learners held their licence for a significantly longer duration than those in the former program, with the majority reporting that their logbook entries were accurate on the whole. Compared to those in the former program, a significantly smaller proportion of male current-GDL Learners reported being detected for a driving offence, while the females reported significantly lower crash involvement. Most current-GDL drivers reported undertaking their supervised practice at the end of the Learner period.

Conclusions: The enhancements to the GDL program in Queensland appear to have achieved many of their intended results. The current-GDL Learners participating in the study reported obtaining a significantly greater amount of supervised driving experience than former-GDL Learners. Encouragingly, the current-GDL Learners did not report any greater difficulty in obtaining supervised driving practice, and there was a decline in the proportion of current-GDL Learners engaging in unsupervised driving. In addition, the majority of Learners do not appear to be attempting to subvert logbook recording requirements, as evidenced by high rates of self-reported logbook accuracy. The results have implications for the development and evaluation of GDL programs in Australia and around the world.
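
For readers unfamiliar with the comparisons reported above, the sketch below shows, with invented numbers rather than the study's data, the kind of independent-samples t-test and chi-square test used to compare current-GDL and former-GDL Learners.

```python
# Illustrative group comparisons of the kind reported in the paper,
# using invented numbers rather than the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# t-test: hours of supervised practice, current vs former GDL (hypothetical values).
current_hours = rng.normal(loc=90, scale=25, size=183)
former_hours = rng.normal(loc=60, scale=25, size=149)
t, p_t = stats.ttest_ind(current_hours, former_hours, equal_var=False)

# Chi-square: offence detection (yes/no) by program (hypothetical counts).
table = np.array([[20, 163],    # current-GDL: detected, not detected
                  [35, 114]])   # former-GDL: detected, not detected
chi2, p_chi, dof, expected = stats.chi2_contingency(table)

print(f"t = {t:.2f}, p = {p_t:.3f}; chi2 = {chi2:.2f}, p = {p_chi:.3f}")
```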

Relevance:

100.00%

Publisher:

Abstract:

The Saffman-Taylor finger problem is to predict the shape and, in particular, the width of a finger of fluid travelling in a Hele-Shaw cell filled with a different, more viscous fluid. In experiments the width depends on the speed of propagation of the finger, tending to half the total cell width as the speed increases. To predict this result mathematically, nonlinear effects on the fluid interface must be considered; usually surface tension is included for this purpose. This makes the mathematical problem sufficiently difficult that asymptotic or numerical methods must be used. In this paper we adapt numerical methods used to solve the Saffman-Taylor finger problem with surface tension to instead include the effect of kinetic undercooling, a regularisation effect important in Stefan melting-freezing problems, for which Hele-Shaw flow serves as a leading-order approximation when the specific heat of a substance is much smaller than its latent heat. We find the existence of a solution branch where the finger width tends to zero as the propagation speed increases, disagreeing with some aspects of the asymptotic analysis of the same problem. We also find a second solution branch, supporting the idea of a countably infinite number of branches as in the surface tension problem.
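
For orientation, the standard formulation of Hele-Shaw flow with kinetic undercooling (as usually written in the literature; the notation here is ours and not necessarily the paper's) replaces the surface-tension condition on the interface by one proportional to the normal velocity:

```latex
\nabla^2 p = 0 \quad \text{in the viscous fluid}, \qquad
v_n = -\frac{\partial p}{\partial n} \quad \text{on the interface}, \qquad
p = c\, v_n \quad \text{on the interface},
```

where p is the pressure in the viscous fluid, v_n the normal velocity of the interface and c ≥ 0 the kinetic undercooling coefficient; the surface-tension regularisation would instead set p proportional to the interface curvature.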

Relevance:

100.00%

Publisher:

Abstract:

This report presents the findings of an exploratory study into students’ perceptions of the use of criterion-referenced assessment in an undergraduate differential equations class. Students in the class were largely unaware of the concept of criterion referencing and of the various interpretations that this concept has among mathematics educators. Our primary goal was to investigate whether explicitly presenting assessment criteria to students was useful to them and guided them in responding to assessment tasks. Quantitative data and qualitative feedback from students indicate that while students found the criteria easy to understand and useful in informing them as to how they would be graded, the manner in which they actually approached the assessment activity was not altered as a result of the explicitly communicated grading criteria.

Relevance:

100.00%

Publisher:

Abstract:

Maximum-likelihood estimates of the parameters of stochastic differential equations are consistent and asymptotically efficient, but unfortunately difficult to obtain if a closed-form expression for the transitional probability density function of the process is not available. As a result, a large number of competing estimation procedures have been proposed. This article provides a critical evaluation of the various estimation techniques. Special attention is given to the ease of implementation and comparative performance of the procedures when estimating the parameters of the Cox–Ingersoll–Ross and Ornstein–Uhlenbeck equations respectively.
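
As an illustration of the benchmark case where exact MLE is feasible, the Ornstein–Uhlenbeck process dX_t = κ(θ − X_t)dt + σ dW_t has a Gaussian transition density, so its exact log-likelihood can be maximised directly. The sketch below (simulated data, a generic optimiser) is one way to do this and is not taken from the article; for the Cox–Ingersoll–Ross process the same idea applies with a non-central chi-square transition density.

```python
# Exact maximum-likelihood estimation for the Ornstein-Uhlenbeck process
#   dX_t = kappa * (theta - X_t) dt + sigma dW_t,
# whose transition density is Gaussian:
#   X_{t+dt} | X_t ~ N(theta + (X_t - theta) e^{-kappa dt},
#                      sigma^2 (1 - e^{-2 kappa dt}) / (2 kappa)).
# Data are simulated here purely for illustration.
import numpy as np
from scipy.optimize import minimize

def simulate_ou(kappa, theta, sigma, x0, dt, n, rng):
    x = np.empty(n); x[0] = x0
    for i in range(1, n):
        mean = theta + (x[i - 1] - theta) * np.exp(-kappa * dt)
        var = sigma**2 * (1 - np.exp(-2 * kappa * dt)) / (2 * kappa)
        x[i] = rng.normal(mean, np.sqrt(var))
    return x

def neg_log_likelihood(params, x, dt):
    kappa, theta, sigma = params
    if kappa <= 0 or sigma <= 0:
        return np.inf                      # keep the optimiser in the valid region
    mean = theta + (x[:-1] - theta) * np.exp(-kappa * dt)
    var = sigma**2 * (1 - np.exp(-2 * kappa * dt)) / (2 * kappa)
    resid = x[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

rng = np.random.default_rng(42)
x = simulate_ou(kappa=1.5, theta=0.05, sigma=0.2, x0=0.05, dt=1/252, n=2000, rng=rng)
fit = minimize(neg_log_likelihood, x0=[1.0, 0.0, 0.1], args=(x, 1/252),
               method="Nelder-Mead")
print("estimated (kappa, theta, sigma):", np.round(fit.x, 3))
```

The approximate schemes surveyed in the article are needed precisely when no such closed-form transition density is available.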

Relevance:

100.00%

Publisher:

Abstract:

Gesture in performance is widely acknowledged in the literature as an important element in making a performance expressive and meaningful. The body has been shown to play an important role in the production and perception of vocal performance in particular. This paper is concerned with the role of gesture in creative works that seek to extend vocal performance via technology. A creative work for vocal performer, laptop computer and a Human Computer Interface called the eMic (Extended Microphone Stand Interface controller) is presented as a case study to explore the relationships between movement, voice production and musical expression. The eMic is an interface for live vocal performance that allows the singer’s gestures and interactions with a sensor-based microphone stand to be captured and mapped to musical parameters. The creative work discussed in this paper presents a new compositional approach for the eMic, working with movement as a starting point for the composition and thus using choreographed gesture as the basis for musical structures. By foregrounding the body and movement in the creative process, the aim is to create a more visually engaging performance in which the performer is better able to use the body to express their musical objectives.
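
As a generic illustration of the sensor-to-parameter mapping that interfaces of this kind rely on (this is not the actual eMic software, whose implementation is not described here), the following Python sketch scales a hypothetical raw sensor reading into a control range and smooths it before it would be applied to a synthesis parameter.

```python
# Generic sensor-to-parameter mapping sketch (illustrative only; the actual
# eMic mapping software is not described in the abstract).

def scale(value, in_min, in_max, out_min=0.0, out_max=1.0):
    """Linearly map a raw sensor reading to a control range, clamped."""
    norm = (value - in_min) / (in_max - in_min)
    norm = min(max(norm, 0.0), 1.0)
    return out_min + norm * (out_max - out_min)

class SmoothedControl:
    """One-pole low-pass smoothing so gestures do not produce audible
    stepping in the mapped musical parameter."""
    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing
        self.state = 0.0

    def update(self, target):
        self.state = self.smoothing * self.state + (1 - self.smoothing) * target
        return self.state

tilt_sensor_raw = 612            # hypothetical 10-bit sensor reading
filter_cutoff = SmoothedControl()
control = scale(tilt_sensor_raw, in_min=0, in_max=1023, out_min=200, out_max=4000)
print(f"mapped filter cutoff: {filter_cutoff.update(control):.1f} Hz")
```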

Relevance:

100.00%

Publisher:

Abstract:

Higher-order spectral (bispectral and trispectral) analyses of numerical solutions of the Duffing equation with a cubic stiffness are used to isolate the coupling between the triads and quartets, respectively, of nonlinearly interacting Fourier components of the system. The Duffing oscillator follows a period-doubling intermittency catastrophic route to chaos. For period-doubled limit cycles, higher-order spectra indicate that both quadratic and cubic nonlinear interactions are important to the dynamics. However, when the Duffing oscillator becomes chaotic, the global behavior of the cubic nonlinearity becomes dominant: quadratic nonlinear interactions are weak, while cubic interactions remain strong. As the nonlinearity of the system is increased, the number of excited Fourier components increases, eventually leading to broad-band power spectra for chaos. The corresponding higher-order spectra indicate that although some individual nonlinear interactions weaken as the nonlinearity increases, the number of nonlinearly interacting Fourier modes increases. Trispectra indicate that the cubic interactions gradually evolve from encompassing a few quartets of Fourier components for period-1 motion to encompassing many quartets for chaos. For chaos, all the components within the energetic part of the power spectrum are cubically (but not quadratically) coupled to each other.
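
To make the analysis pipeline concrete, here is a minimal Python sketch (our own illustration, with arbitrary parameter values, not those of the paper) that integrates the forced Duffing equation and forms a segment-averaged bispectrum estimate, the quantity used to detect quadratic phase coupling between Fourier triads.

```python
# Integrate the forced Duffing oscillator with cubic stiffness,
#   x'' + delta*x' + alpha*x + beta*x^3 = gamma*cos(omega*t),
# and estimate a segment-averaged bispectrum B(f1, f2) = <X(f1) X(f2) X*(f1+f2)>.
# Parameter values are arbitrary illustrations, not those used in the paper.
import numpy as np
from scipy.integrate import solve_ivp

delta, alpha, beta, gamma, omega = 0.3, -1.0, 1.0, 0.5, 1.2

def duffing(t, y):
    x, v = y
    return [v, -delta * v - alpha * x - beta * x**3 + gamma * np.cos(omega * t)]

fs = 20.0                                # samples per unit time
t_eval = np.arange(0.0, 2000.0, 1.0 / fs)
sol = solve_ivp(duffing, (t_eval[0], t_eval[-1]), [0.1, 0.0],
                t_eval=t_eval, rtol=1e-8, atol=1e-10)
x = sol.y[0][len(t_eval) // 4:]          # discard the initial transient

def bispectrum(signal, nfft=256):
    """Average X(f1) X(f2) conj(X(f1+f2)) over non-overlapping windowed segments."""
    nseg = len(signal) // nfft
    B = np.zeros((nfft // 2, nfft // 2), dtype=complex)
    for k in range(nseg):
        seg = signal[k * nfft:(k + 1) * nfft]
        X = np.fft.fft((seg - seg.mean()) * np.hanning(nfft))
        for i in range(nfft // 2):
            for j in range(nfft // 2 - i):
                B[i, j] += X[i] * X[j] * np.conj(X[i + j])
    return B / nseg

B = bispectrum(x)
print("peak bispectral magnitude:", np.abs(B).max())
```

The trispectrum is estimated analogously by averaging products over quartets, X(f1) X(f2) X(f3) X*(f1+f2+f3).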

Relevance:

100.00%

Publisher:

Abstract:

Over the last two decades, particularly in Australia and the UK, the doctoral landscape has changed considerably with increasingly hybridised approaches to methodologies and research strategies as well as greater choice of examinable outputs. This paper provides an overview of doctoral practices that are emerging in the context of the creative industries, with a focus on practice-led approaches within the Doctor of Philosophy and recent developments in professional doctorates, from a predominantly Australian perspective. In interrogating what constitutes ‘doctorateness’ in this context, the paper examines some of the diverse theoretical principles which foreground the practitioner/researcher, methodological approaches that incorporate tacit knowledge and reflective practice together with qualitative strategies, blended learning delivery modes, and flexible doctoral outputs; and how these are shaping this shifting environment. The paper concludes with a study of the Doctor of Creative Industries at Queensland University of Technology as one model of an interdisciplinary professional research doctorate.

Relevance:

100.00%

Publisher:

Abstract:

This study investigated how Year 7 students’ interpretation of mathematical problems affected their ability to demonstrate what they can do in NAPLAN numeracy testing. In the study, mathematics is viewed as a culturally and socially determined system of signs and signifiers that establish the meaning, origins and importance of mathematics. The study hypothesises that students are unable to succeed in NAPLAN numeracy tests because they cannot interpret the questions, even though they may be able to perform the necessary calculations. To investigate this, the study applied contemporary theories of literacy to the context of mathematical problem solving. A case study design with multiple methods was used. The study used a correlation design to explore the connections between the NAPLAN literacy and numeracy outcomes of 198 Year 7 students in a Queensland school. Additionally, qualitative methods provided a rich description of the effect of the various forms of NAPLAN numeracy questions on the success of ten Year 7 students in the same school. The study argues that there is a quantitative link between reading and numeracy. It illustrates that interpretation (literacy) errors are the most common error type in the selected NAPLAN questions and are made by students of all abilities. In contrast, conceptual (mathematical) errors are less frequent amongst more capable students. This has important implications for preparing students for NAPLAN numeracy tests. The study concludes by recommending that an increased focus on the literacies of mathematics would be effective in improving NAPLAN results.
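
As a pointer to the kind of quantitative link described here, the snippet below shows how a literacy–numeracy correlation of this sort is typically computed; the scores are randomly generated placeholders, not the study's data.

```python
# Illustrative correlation between reading (literacy) and numeracy scores.
# The scores below are randomly generated placeholders, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
reading = rng.normal(550, 60, size=198)                    # hypothetical scale scores
numeracy = 0.6 * reading + rng.normal(220, 40, size=198)   # built-in positive link

r, p = stats.pearsonr(reading, numeracy)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```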

Relevance:

100.00%

Publisher:

Abstract:

High fidelity simulation as a teaching and learning approach is being embraced by many schools of nursing. Our school embarked on integrating high fidelity (HF) simulation into the undergraduate clinical education program in 2011. Low and medium fidelity simulation had been used for many years, but this did not simplify the integration of HF simulation. Alongside considerations of how and where HF simulation would be integrated, issues arose with student consent and participation in observed activities; data management of video files; staff development; and conceptualising how methods for student learning could be researched.

Simulation for undergraduate student nurses commenced as a formative learning activity undertaken in groups of eight, in which four students take the ‘doing’ role and four act as structured observers, who then take a formal role in the simulation debrief. Challenges in integrating simulation into student learning included conceptualising and developing scenarios to trigger students’ decision making and their application of the skills, knowledge and attitudes needed to solve clinical ‘problems’. Developing and planning scenarios for students to ‘try out’ skills and make decisions for problem solving went beyond choosing the pre-existing scenarios built into the software. The supplied scenarios were not concept based but rather focused on knowledge, skills and the technology of the manikin. The challenge lay in using the technology to build conceptual mastery rather than using technology simply because it was available. As we integrated HF simulation into the final year of the program, the focus was on building skills, knowledge and attitudes that went beyond technical skill and on providing an opportunity to bridge the gap with theory-based knowledge that students often found difficult to link to clinical reality. We wished to provide opportunities to develop experiential knowledge based on application and clinical reasoning processes in team environments where problems are encountered and where, to solve them, the nurse must show leadership and direction.

Other challenges included gaining student consent for simulations to be videotaped and the ethical considerations this raises. For example, if one student in a group of eight did not consent, did this mean they missed the opportunity to undertake the simulation, or that others in the group were disadvantaged by being unable to review their performance? This has implications for freely given consent, but also for equity of access to learning opportunities for students who wished to be taped and those who did not. Alongside this issue were the details of data management, storage and access. Developing staff with varying levels of computer skills to use the software, and to take a different approach to being the ‘teacher’, required innovation, and we took an experiential approach. Deciding which explicit learning approaches to trial was not a difficult proposition, but working out how to enact this as research, with issues of blinding, timetabling of blinded groups, reducing bias when testing the results of different learning approaches, and gaining ethical approval, was problematic. This presentation gives examples of these challenges and how we overcame them.