903 results for race-based discrimination
Abstract:
This paper presents an extended study on the implementation of support vector machine (SVM)-based speaker verification in systems that employ continuous progressive model adaptation using the weight-based factor analysis model. The weight-based factor analysis model compensates for session variations in unsupervised scenarios by incorporating trial confidence measures into the general statistics used in the inter-session variability modelling process. Employing weight-based factor analysis in Gaussian mixture models (GMM) was recently found to provide significant performance gains for unsupervised classification. Further improvements in performance were found through the integration of SVM-based classification into the system by means of GMM supervectors. This study focuses particularly on the way in which a client is represented in the SVM kernel space using single and multiple target supervectors. Experimental results indicate that training client SVMs using a single target supervector maximises performance while exhibiting a degree of robustness to the inclusion of impostor training data in the model. Furthermore, the inclusion of low-scoring target trials in the adaptation process is investigated and found to significantly aid performance.
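The sketch below is a minimal illustration, not the authors' implementation, of the single-target-supervector idea: stack the MAP-adapted GMM means into one supervector and train a linear SVM with that single positive example against an impostor cohort. Function names, the scaling used in gmm_supervector, and the use of scikit-learn are assumptions for illustration only.

```python
# Illustrative sketch: client SVM trained from a single target GMM supervector
# against impostor supervectors (all names and scalings are assumptions).
import numpy as np
from sklearn.svm import SVC

def gmm_supervector(adapted_means, ubm_weights, ubm_covars):
    """Stack MAP-adapted component means into one supervector, scaled per
    component (a common KL-divergence-motivated normalisation)."""
    scaled = [np.sqrt(w) * m / np.sqrt(c)
              for w, m, c in zip(ubm_weights, adapted_means, ubm_covars)]
    return np.concatenate(scaled)

def train_client_svm(target_supervector, impostor_supervectors):
    """One positive example (the single target supervector) versus the
    impostor background set, with a linear kernel on the supervectors."""
    X = np.vstack([target_supervector[None, :], impostor_supervectors])
    y = np.array([1] + [0] * len(impostor_supervectors))
    svm = SVC(kernel="linear", class_weight="balanced")
    svm.fit(X, y)
    return svm

def verify(svm, trial_supervector, threshold=0.0):
    """Accept the trial if its signed distance to the hyperplane exceeds the threshold."""
    return svm.decision_function(trial_supervector[None, :])[0] > threshold
```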
Abstract:
To evaluate whether luminance contrast discrimination losses in amblyopia on putative magnocellular (MC) and parvocellular (PC) pathway tasks reflect deficits at retinogeniculate or cortical sites. Fifteen amblyopes (six anisometropic, seven strabismic and two mixed) and 12 age-matched controls were investigated. Contrast discrimination was measured using established psychophysical procedures that differentiate MC and PC processing. Data were described with a model of the contrast response of primate retinal ganglion cells. All amblyopes and controls displayed the same contrast signatures on the MC and PC tasks, with three strabismics having reduced sensitivity. Amblyopic PC contrast gain was similar to electrophysiological estimates from visually normal, non-human primates. Sensitivity losses evident in a subset of the amblyopes reflect cortical summation deficits, with no change in retinogeniculate contrast responses. The data do not support the proposal that amblyopic contrast sensitivity losses on MC and PC tasks reflect retinogeniculate deficits; rather, they are due to anomalous post-retinogeniculate cortical processing of retinal signals.
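For orientation, a commonly fitted form for the contrast response of primate retinal ganglion cells is the saturating (Michaelis-Menten) function shown below; the abstract does not state the exact parameterisation used, so this is offered only as an assumed, standard illustration.

```latex
% R(C): response at contrast C; R_{max}: saturated response;
% C_{50}: semi-saturation contrast, whose inverse sets the contrast gain.
R(C) = R_{\max}\,\frac{C}{C + C_{50}}
```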
Abstract:
This paper introduces an event-based traffic model for railway systems adopting fixed-block signalling schemes. In this model, the events of trains' arrival at and departure from signalling blocks constitute the states of the traffic flow. A state transition corresponds to the progress of the trains by one signalling block and is realised by referring to past and present states, as well as to a number of pre-calculated look-up tables of run times in the signalling block under various signalling conditions. Simulation results are compared with those from a time-based multi-train simulator to study the improvement in processing time and accuracy.
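A minimal sketch of the event-based idea follows: each transition advances every train by one block, with the block traversal time read from a pre-calculated run-time table keyed by signalling condition. The table layout, the three-aspect signalling rule, and all identifiers are assumptions for illustration, not the paper's code.

```python
# Illustrative sketch of one event-based state transition using
# pre-calculated run-time look-up tables (all names are assumptions).
from dataclasses import dataclass

@dataclass
class TrainState:
    train_id: str
    block: int        # index of the signalling block currently occupied
    clock: float      # time of the last arrival/departure event (s)

def signal_aspect(block, occupied_blocks):
    """Aspect seen on entering `block`, derived from occupancy ahead
    (a simple three-aspect rule assumed for illustration)."""
    if block + 1 in occupied_blocks:
        return "red"
    if block + 2 in occupied_blocks:
        return "yellow"
    return "green"

def step(states, run_time_table):
    """One state transition: every train progresses by one signalling block.
    `run_time_table[(block, aspect)]` holds the pre-calculated run time."""
    occupied = {s.block for s in states}
    next_states = []
    for s in states:
        aspect = signal_aspect(s.block, occupied)
        dt = run_time_table[(s.block, aspect)]
        next_states.append(TrainState(s.train_id, s.block + 1, s.clock + dt))
    return next_states
```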
Abstract:
Many industrial processes and systems can be modelled mathematically by a set of Partial Differential Equations (PDEs). Finding a solution to such a PDE model is essential for system design, simulation, and process control purposes. However, major difficulties appear when solving PDEs with singularity. Traditional numerical methods, such as finite difference, finite element, and polynomial-based orthogonal collocation, not only have limitations in fully capturing the process dynamics but also demand enormous computational power due to the large number of elements or mesh points required to accommodate sharp variations. To tackle this challenging problem, wavelet-based approaches and high resolution methods have recently been developed, with successful applications to a fixed-bed adsorption column model. Our investigation has shown that recent advances in wavelet-based approaches and high resolution methods have the potential to be adopted for solving more complicated dynamic system models. This chapter highlights the successful applications of these new methods in solving complex models of simulated-moving-bed (SMB) chromatographic processes. An SMB process is a distributed parameter system and can be mathematically described by a set of partial/ordinary differential equations and algebraic equations. These equations are highly coupled, exhibit wave propagation with steep fronts, and require significant numerical effort to solve. To demonstrate the numerical computing power of the wavelet-based approaches and high resolution methods, a single-column chromatographic process modelled by a Transport-Dispersive-Equilibrium linear model is investigated first. Numerical solutions from the upwind-1 finite difference, wavelet-collocation, and high resolution methods are evaluated by quantitative comparison with the analytical solution for a range of Peclet numbers. After that, the advantages of the wavelet-based approaches and high resolution methods are further demonstrated through applications to a dynamic SMB model for an enantiomer separation process. This research has revealed that for a PDE system with a low Peclet number, all existing numerical methods work well, but the upwind finite difference method consumes the most time for the same degree of accuracy of the numerical solution. The high resolution method provides an accurate numerical solution for a PDE system with a medium Peclet number. The wavelet collocation method is capable of capturing steep changes in the solution, and thus can be used for solving PDE models with high singularity. For the complex SMB system models under consideration, both the wavelet-based approaches and high resolution methods are good candidates in terms of computational demand and prediction accuracy on the steep front. The high resolution methods have shown better stability in achieving steady state in the specific case studied in this chapter.
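To make the baseline scheme concrete, here is a minimal upwind-1 finite difference sketch for a single-column transport-dispersive model, dc/dt = -u dc/dz + D_ax d2c/dz2, i.e. the reference method the chapter compares against wavelet-collocation and high resolution methods. The boundary treatment and all parameter values are placeholders, not the chapter's case-study settings.

```python
# Minimal upwind-1 finite difference sketch for a transport-dispersive column:
#   dc/dt = -u * dc/dz + D_ax * d2c/dz2
# (parameter values and boundary conditions are placeholders).
import numpy as np

def simulate_column(L=1.0, u=1.0, peclet=100.0, n=200, t_end=1.0, c_in=1.0):
    d_ax = u * L / peclet                         # axial dispersion from the Peclet number
    dz = L / n
    dt = 0.4 * min(dz / u, dz**2 / (2 * d_ax))    # stability-limited time step
    c = np.zeros(n + 1)                           # initially clean column
    t = 0.0
    while t < t_end:
        c_new = np.empty_like(c)
        c_new[0] = c_in                           # Dirichlet inlet (simplified)
        # first-order upwind convection + central-difference dispersion
        c_new[1:-1] = (c[1:-1]
                       - u * dt / dz * (c[1:-1] - c[:-2])
                       + d_ax * dt / dz**2 * (c[2:] - 2 * c[1:-1] + c[:-2]))
        c_new[-1] = c_new[-2]                     # zero-gradient outlet
        c, t = c_new, t + dt
    return c                                      # axial concentration profile at t_end
```

At low Peclet numbers the dispersion term smooths the front and this scheme performs acceptably; at high Peclet numbers its numerical diffusion smears the steep front, which is the behaviour that motivates the wavelet-collocation and high resolution alternatives discussed above.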
Abstract:
The impact of what has been broadly labelled the knowledge economy has been such that, even in the absence of precise measurement, it is the undoubted dynamo of today’s global market and an essential part of any global city. The socio-economic importance of knowledge production in a knowledge economy is clear, and it is an emerging social phenomenon and research agenda in geographical studies. Knowledge production, and where, how and by whom it is produced, is an urban phenomenon that is poorly understood in an era of strong urbanisation. This paper focuses on knowledge community precincts as the catalytic magnet infrastructures impacting on knowledge production in cities. The paper discusses the increasing importance of knowledge-based urban development within the paradigm of the knowledge economy, and the role of knowledge community precincts as instruments to seed the foundation of knowledge production in cities. It then explores the knowledge-based urban development potential, and particularly the knowledge community precinct development potential, of Sydney, Melbourne and Brisbane, and benchmarks this against that of Boston, Massachusetts.
Abstract:
This paper proposes a generic decoupled image-based control scheme for cameras obeying the unified projection model. The scheme is based on the spherical projection model. Invariants to rotational motion are computed from this projection and used to control the translational degrees of freedom. Importantly, we form invariants which decrease the sensitivity of the interaction matrix to variations in object depth. Finally, the proposed results are validated with experiments using a classical perspective camera as well as a fisheye camera mounted on a 6-DOF robotic platform.
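The sketch below illustrates only the underlying idea: project image points onto the unit sphere and build a feature that is invariant to camera rotation (the inner product of two spherical projections is unchanged when both unit vectors rotate together). The paper's actual invariants are chosen further to reduce depth sensitivity; the functions and intrinsics handling here are assumptions for illustration.

```python
# Illustrative sketch: spherical projection and a simple rotation-invariant feature.
import numpy as np

def to_sphere(u, v, K):
    """Back-project pixel (u, v) with intrinsic matrix K onto the unit sphere."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return ray / np.linalg.norm(ray)

def rotation_invariant(p1, p2):
    """Cosine of the angle between two spherical projections.
    A pure camera rotation R maps (p1, p2) to (R p1, R p2), leaving this unchanged."""
    return float(p1 @ p2)
```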
Abstract:
Airborne measurements of particle number concentrations from biomass burning were conducted in the Northern Territory, Australia, during campaigns in June and September 2003, corresponding to the early and the late dry season in that region. The airborne measurements were performed along horizontal flight tracks at several heights in order to gain insight into the particle concentration levels and their variation with height within the lower boundary layer (LBL), the upper boundary layer (UBL), and the free troposphere (FT). The measurements found that the concentration of particles during the early dry season was lower than that for the late dry season. For the June campaign, the concentrations of particles in the LBL, UBL, and FT were (685 ± 245) particles/cm3, (365 ± 183) particles/cm3, and (495 ± 45) particles/cm3, respectively. For the September campaign, the concentrations of particles were found to be (1233 ± 274) particles/cm3 in the LBL, (651 ± 68) particles/cm3 in the UBL, and (568 ± 70) particles/cm3 in the FT. The particle size distribution measurements indicate that during the late dry season there was no change in the particle size distribution below (LBL) and above (UBL) the boundary layer. This indicates that there was possibly some penetration of biomass burning particles into the upper boundary layer. In the free troposphere the particle concentration and size measured during both campaigns were approximately the same.
Abstract:
In the era of the knowledge economy, cities and regions have started increasingly investing in their physical, social and knowledge infrastructures so as to foster, attract and retain global talent and investment. Knowledge-based urban development, as a new paradigm in urban planning and development, is being implemented across the globe in order to increase the competitiveness of cities and regions. This chapter provides an overview of the lessons from the Multimedia Super Corridor, Malaysia, one of the first large-scale manifestations of knowledge-based urban development in South East Asia. The chapter investigates the application of the knowledge-based urban development concept within the Malaysian context and, particularly, scrutinises the development and evolution of the Multimedia Super Corridor by focusing on strategies, implementation policies, infrastructural implications, and the agencies involved in the development and management of the corridor. In the light of the literature and case findings, the chapter provides generic recommendations, on the orchestration of knowledge-based urban development, for other cities and regions seeking such development.
Abstract:
This paper provides an overview of the current QUT Spatial Science undergraduate program based in Brisbane, Queensland, Australia. It discusses the development and implementation of a broad-based educational model for the Faculty of Built Environment and Engineering courses, and specifically the course structure of the new Bachelor of Urban Development (Spatial Science) study major. A brief historical background of surveying courses is given prior to detailing the three distinct and complementary learning themes of the new course structure with a graphical course matrix. Curriculum mapping of the spatial science major has been undertaken as the course approaches formal review in late 2010. Work-integrated learning opportunities have been embedded into the curriculum and a brief outline is presented. Some issues relevant to the tertiary surveying/spatial sector are highlighted in the context of changing higher education environments in Australia.
Abstract:
Short-term traffic flow data is characterized by rapid and dramatic fluctuations, reflecting frequent congestion in the lane and exhibiting strongly nonlinear features. Traffic state estimation based on data gained from electronic sensors is critical for intelligent traffic management and traffic control. In this paper, a solution to freeway traffic estimation in Beijing is proposed using a particle filter based on a macroscopic traffic flow model, which estimates both traffic density and speed. The particle filter is a nonlinear prediction method with clear advantages for traffic flow prediction. However, as the sampling period increases, the traffic state curve becomes more volatile, so prediction accuracy is affected and forecasting becomes more difficult. In this paper, the particle filter model is applied to estimate short-term traffic flow. A numerical study is conducted based on Beijing freeway data with a sampling period of 2 min. The relatively high accuracy of the results indicates the superiority of the proposed model.
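A minimal bootstrap particle filter for density estimation is sketched below to show the predict-weight-resample loop. The propagation model is a placeholder random walk and the measurement model is a simple Greenshields-type flow relation; neither is the macroscopic model actually used in the paper, and all parameter values are assumptions.

```python
# Minimal bootstrap particle filter sketch for traffic density estimation
# (placeholder dynamics and measurement model, not the paper's macroscopic model).
import numpy as np

rng = np.random.default_rng(0)

def propagate(densities):
    """Placeholder dynamics: densities drift with process noise (veh/km)."""
    return np.clip(densities + rng.normal(0.0, 2.0, size=densities.shape), 0.0, 120.0)

def likelihood(measured_flow, densities, free_speed=100.0, jam_density=120.0, sigma=50.0):
    """Greenshields-type flow q = k * v_f * (1 - k / k_jam) as a simple measurement model."""
    predicted_flow = densities * free_speed * (1.0 - densities / jam_density)
    return np.exp(-0.5 * ((measured_flow - predicted_flow) / sigma) ** 2)

def particle_filter(flow_measurements, n_particles=1000):
    particles = rng.uniform(0.0, 120.0, size=n_particles)        # initial density guesses
    estimates = []
    for z in flow_measurements:                                   # one z per 2-min sample
        particles = propagate(particles)                          # predict
        w = likelihood(z, particles)                              # weight
        w /= w.sum()
        estimates.append(float(np.sum(w * particles)))            # posterior mean density
        idx = rng.choice(n_particles, size=n_particles, p=w)      # multinomial resampling
        particles = particles[idx]
    return estimates
```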
Abstract:
Computer simulation is a versatile and commonly used tool for the design and evaluation of systems with different degrees of complexity. Power distribution systems and electric railway networks are areas to which computer simulation is being heavily applied. A dominant factor in evaluating the performance of a software simulator is its processing time, especially in the case of real-time simulation. Parallel processing provides a viable means to reduce computing time and is therefore suitable for building real-time simulators. In this paper, we present different issues related to solving the power distribution system with parallel computing on a multiple-CPU server, and we concentrate, in particular, on the speedup performance of such an approach.
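The sketch below shows the kind of speedup measurement described: independent sub-problem solves are distributed over worker processes and the parallel wall-clock time is compared with a sequential run. The "sub-network solve" here is a stand-in dense linear solve, not the paper's load-flow solver, and the task decomposition is an assumption.

```python
# Hedged sketch: measuring speedup of independent sub-network solves on a
# multi-CPU machine (the solve below is a stand-in, not the paper's solver).
import time
from multiprocessing import Pool

import numpy as np

def solve_subnetwork(seed):
    """Stand-in for solving one sub-network: a well-conditioned dense linear system."""
    rng = np.random.default_rng(seed)
    a = rng.random((400, 400)) + 400 * np.eye(400)
    b = rng.random(400)
    return float(np.linalg.solve(a, b).sum())

if __name__ == "__main__":
    tasks = list(range(32))

    t0 = time.perf_counter()
    serial = [solve_subnetwork(s) for s in tasks]        # sequential baseline
    t_serial = time.perf_counter() - t0

    t0 = time.perf_counter()
    with Pool(processes=4) as pool:                      # 4 worker processes
        parallel = pool.map(solve_subnetwork, tasks)
    t_parallel = time.perf_counter() - t0

    print(f"speedup with 4 workers: {t_serial / t_parallel:.2f}x")
```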
Abstract:
There is a need for educational frameworks for computer ethics education. This discussion paper presents an approach to developing students’ moral sensitivity, an awareness of morally relevant issues, in project-based learning (PjBL). The proposed approach is based on a study of IT professionals’ levels of awareness of ethics. These levels are labelled My world, The corporate world, A shared world, The client’s world and The wider world. We give recommendations for how instructors may stimulate students’ thinking with the levels and how the levels may be taken into account in managing a project course and in an IS department. Limitations of the recommendations are assessed and issues for discussion are raised.
Abstract:
Background and Aim: To investigate participation in a second round of colorectal cancer screening using a fecal occult blood test (FOBT) in an Australian rural community, and to assess the demographic characteristics and individual perspectives associated with repeat screening. Methods: Potential participants from round 1 (50–74 years of age) were sent an intervention package and asked to return a completed FOBT (n = 3406). Doctors of participants testing positive referred them for colonoscopy as appropriate. Following screening, 119 participants completed qualitative telephone interviews. Multivariable logistic regression models evaluated the association between round-2 participation and other variables. Results: Round-2 participation was 34.7%; the strongest predictor was participation in round 1. Repeat participants were more likely to be female; inconsistent screeners were more likely to be younger (aged 50–59 years). The proportion of positive FOBTs was 12.7%, colonoscopy compliance was 98.6%, and the positive predictive value for cancer or adenoma of advanced pathology was 23.9%. Reasons for participation included testing as a precautionary measure or having a family history of, or friends with, colorectal cancer; reasons for non-participation included apathy or doctors' advice against screening. Conclusion: Participation was relatively low and consistent across rounds. Unless suitable strategies are identified to overcome behavioral trends and/or to screen out ineligible participants, little change in overall participation rates can be expected across rounds.
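For illustration only, the sketch below shows the general shape of a multivariable logistic regression of round-2 participation on round-1 participation, sex and age group, using statsmodels. The file name and column names are placeholders, not the study's data dictionary.

```python
# Illustrative sketch of a multivariable logistic regression for round-2 participation
# (file and variable names are placeholders).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("round2_screening.csv")   # hypothetical dataset, one row per invitee

model = smf.logit(
    "participated_round2 ~ participated_round1 + C(sex) + C(age_group)",
    data=df,
).fit()

print(model.summary())
print(np.exp(model.params))                # odds ratios
print(np.exp(model.conf_int()))            # 95% confidence intervals for the odds ratios
```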
Abstract:
Photo-curable biodegradable macromers were prepared by ring opening polymerization of D,L-lactide (DLLA), ε-caprolactone (CL) and 1,3-trimethylene carbonate (TMC) in the presence of glycerol or sorbitol as initiator and stannous octoate as catalyst, and subsequent methacrylation of the terminal hydroxyl groups. These methacrylated macromers, ranging in molecular weight from approximately 700 to 6000 g/mol, were cross-linked using ultraviolet (UV) light to form biodegradable networks. Homogeneous networks with high gel contents were prepared. One of the resins based on PTMC was used to prepare three-dimensional structures by stereo-lithography using a commercially available apparatus.
Abstract:
We present an approach to automating computationally sound proofs of key exchange protocols based on public-key encryption. We show that satisfying the property called occultness in the Dolev-Yao model guarantees the security of a related key exchange protocol in a simple computational model. Security in this simpler model has been shown to imply security in a Bellare-Rogaway-like model. Furthermore, occultness in the Dolev-Yao model can be searched for automatically by a mechanisable procedure. Thus automated proofs for key exchange protocols in the computational model can be achieved. We illustrate the method using the well-known Lowe-Needham-Schroeder protocol.
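For reference, the three messages of the Lowe-corrected Needham-Schroeder public-key protocol used as the example are sketched symbolically below; the encoding as nested tuples and the function names are assumptions for illustration, not the paper's formalism.

```python
# Symbolic sketch of the Needham-Schroeder-Lowe public-key protocol;
# enc(pk_X, ...) stands for encryption under X's public key, nonces are opaque tokens.
def enc(public_key, *fields):
    return ("enc", public_key, fields)

def nsl_trace(A="A", B="B", N_A="N_A", N_B="N_B"):
    """The three messages of the Lowe-corrected protocol, in order."""
    return [
        (A, B, enc(f"pk_{B}", N_A, A)),        # 1. A -> B : {N_A, A}_pk(B)
        (B, A, enc(f"pk_{A}", N_A, N_B, B)),   # 2. B -> A : {N_A, N_B, B}_pk(A)  (Lowe's fix adds B)
        (A, B, enc(f"pk_{B}", N_B)),           # 3. A -> B : {N_B}_pk(B)
    ]

for sender, receiver, message in nsl_trace():
    print(sender, "->", receiver, ":", message)
```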