945 results for Mathematical Techniques - Integration
Abstract:
Significant attention has been given in the urban policy literature to the integration of land-use and transport planning and policies, with a view to curbing sprawling urban form and diminishing the externalities associated with car-dependent travel patterns. By taking land-use and transport interaction into account, this debate mainly focuses on how successful integration can contribute to societal well-being, providing efficient and balanced economic growth while accomplishing the goal of developing sustainable urban environments and communities. Integration is also a focal theme of contemporary urban development models, such as smart growth, liveable neighbourhoods, and new urbanism. Even though the available planning policy options for ameliorating urban form and transport-related externalities have matured, owing to growing research and practice worldwide, there remains a lack of suitable evaluation models for reflecting on the current status of urban form and travel problems, or on the success of implemented integration policies. In this study we explore the applicability of indicator-based spatial indexing to assess land-use and transport integration at the neighbourhood level. For this, a spatial index is developed from a number of indicators compiled from international studies and trialled on the Gold Coast, Queensland, Australia. The results of this modelling study reveal that composite indicator methodology can provide an effective metric for determining the success of city plans in terms of their sustainability performance. The model proved useful in demarcating areas where planning intervention is applicable, and in identifying the most suitable locations for future urban development and plan amendments. Lastly, we integrate variance-based sensitivity analysis with the spatial indexing method, and discuss the applicability of the model in other urban contexts.
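The composite indicator methodology described above can be sketched in a few lines: indicator values are normalised to a common scale and combined with weights into a single index per neighbourhood. The indicator values and equal weights below are illustrative assumptions, not data from the study.

```python
import numpy as np

# Hypothetical indicator matrix: rows are neighbourhoods, columns are
# indicators (e.g. land-use mix, street connectivity, transit access).
indicators = np.array([
    [0.30, 12.0, 0.55],
    [0.80, 30.0, 0.90],
    [0.50, 18.0, 0.40],
])

# Min-max normalise each indicator to [0, 1] so units become comparable.
mins, maxs = indicators.min(axis=0), indicators.max(axis=0)
normalised = (indicators - mins) / (maxs - mins)

# Equal weights here; in practice weights may come from expert judgement
# or be probed with the variance-based sensitivity analysis mentioned above.
weights = np.full(indicators.shape[1], 1.0 / indicators.shape[1])

# Composite index per neighbourhood: weighted sum of normalised values.
index = normalised @ weights
```

Neighbourhoods scoring near 1 perform well on all indicators; low scores flag candidate areas for planning intervention.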
Abstract:
Polarisation diversity is a technique to improve the quality of mobile communications, but its reliability is suboptimal because it depends on the mobile channel and the antenna orientations at both ends of the mobile link. A method to optimise the reliability is established by minimising the dependency on antenna orientations. While the mobile base station can have fixed antenna orientation, the mobile terminal is typically a handheld device with random orientations. This means orientation invariance needs to be established at the receiver in the downlink, and at the transmitter in the uplink. This research presents separate solutions for both cases, and is based on the transmission of an elliptically polarised signal synthesised from the channel statistics. Complete receiver orientation invariance is achieved in the downlink. Effects of the transmitter orientation are minimised in the uplink.
Abstract:
Management of the industrial nations' hazardous waste is a pressing and rapidly growing global threat. Improved environmental information must be obtained and managed concerning the current status, temporal dynamics and potential future status of these critical sites. To test the application of spatial environmental techniques to the problem of hazardous waste sites, a Superfund (CERCLA) test site was chosen in an industrial/urban valley experiencing severe TCE, PCE, and CTC ground water contamination. A paradigm is presented for investigating the spatial/environmental tools available for mapping, monitoring and modelling the environment and its toxic contaminant plumes. This model incorporates a range of technical issues concerning the collection of data as augmented by remote sensing tools, the format and storage of data utilizing geographic information systems, and the analysis and modelling of the environment through the use of advanced GIS analysis algorithms and geophysical models of hydrologic transport, including statistical surface generation. This spatially based approach is evaluated against the current government/industry standards of operation. Advantages of, and lessons learned from, the spatial approach are discussed.
Abstract:
Previous studies have shown that the human lens contains glycerophospholipids with ether linkages. These lipids differ from conventional glycerophospholipids in that the sn-1 substituent is attached to the glycerol backbone via a 1-O-alkyl or a 1-O-alk-1'-enyl ether rather than an ester bond. The present investigation employed a combination of collision-induced dissociation (CID) and ozone-induced dissociation (OzID) to unambiguously distinguish such 1-O-alkyl and 1-O-alk-1'-enyl ethers. Using these methodologies, the human lens was found to contain several abundant 1-O-alkyl glycerophosphoethanolamines, including GPEtn(16:0e/9Z-18:1), GPEtn(11Z-18:1e/9Z-18:1), and GPEtn(18:0e/9Z-18:1), as well as a related series of unusual 1-O-alkyl glycerophosphoserines, including GPSer(16:0e/9Z-18:1), GPSer(11Z-18:1e/9Z-18:1), and GPSer(18:0e/9Z-18:1), that to our knowledge have not previously been observed in human tissue. Isomeric 1-O-alk-1'-enyl ethers were absent or in low abundance. Examination of the double bond position within the phospholipids using OzID revealed that several positional isomers were present, including sites of unsaturation at the n-9, n-7, and even n-5 positions. Tandem CID/OzID experiments revealed a preference for double bonds in the n-7 position of 1-O-ether linked chains, while n-9 double bonds predominated in the ester-linked fatty acids [e.g., GPEtn(11Z-18:1e/9Z-18:1) and GPSer(11Z-18:1e/9Z-18:1)]. Different combinations of these double bond positional isomers within chains at the sn-1 and sn-2 positions point to a remarkable molecular diversity of ether-lipids within the human lens.
Abstract:
E-mail spam has remained a scourge and a menacing nuisance for users, internet and network service operators, and providers, in spite of the anti-spam techniques available. Spammers relentlessly circumvent the anti-spam techniques embedded or installed as software products on both the client and server sides of fixed and mobile devices. This continuous evasion degrades the capabilities of these techniques, as none of them provides a comprehensively reliable solution to the problem posed by spam and spammers. A major problem arises, for instance, when these techniques misclassify legitimate e-mails as spam (false positives), or fail to block spam on the SMTP server (false negatives). The spam then passes on to the receiver, yet the originating server neither notices nor has an auto-alert service to indicate that the spam it was designed to prevent has slipped through to the receiver's SMTP server; the receiver's SMTP server in turn fails to stop the spam from reaching the user's device, again with no auto-alert mechanism to flag this failure, causing staggering costs in lost time, effort and money. This paper takes a comparative overview of the literature on these anti-spam techniques, especially the filtering technologies designed to prevent spam, examining their merits and demerits with a view to enhancing their capabilities, and offers evaluative analytical recommendations that will be subject to further research.
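As one concrete illustration of the statistical filtering techniques surveyed, a minimal Naive Bayes text filter can be sketched as follows. The toy training corpora and log-odds decision rule are illustrative assumptions, not a production anti-spam system.

```python
import math
from collections import Counter

# Toy training corpora (illustrative only).
spam_docs = ["win cash now", "cheap pills win", "free cash offer"]
ham_docs = ["meeting agenda attached", "lunch tomorrow", "project report attached"]

def word_counts(docs):
    counts = Counter(w for d in docs for w in d.split())
    return counts, sum(counts.values())

spam_counts, spam_total = word_counts(spam_docs)
ham_counts, ham_total = word_counts(ham_docs)
vocab_size = len(set(spam_counts) | set(ham_counts))

def spam_score(message):
    """Log-odds of spam vs ham with Laplace smoothing; a score > 0 means
    the message looks more like the spam corpus than the ham corpus."""
    score = 0.0
    for w in message.split():
        p_spam = (spam_counts[w] + 1) / (spam_total + vocab_size)
        p_ham = (ham_counts[w] + 1) / (ham_total + vocab_size)
        score += math.log(p_spam / p_ham)
    return score
```

The false positive/negative trade-off discussed above corresponds to where the decision threshold is placed on this score.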
Abstract:
In this paper, we propose a new multi-class steganalysis for binary images. The proposed method can identify the type of steganographic technique used by examining the given binary image. In addition, it is capable of differentiating images carrying a hidden message from those without one. To do this, we extract a set of features from the binary image. The feature extraction method combines a method extended from our previous work with new methods proposed in this paper. Based on the extracted feature sets, we construct our multi-class steganalysis using an SVM classifier. We also present empirical results demonstrating that the proposed method can effectively identify five different types of steganography.
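The multi-class classification step can be sketched as follows. As a dependency-light stand-in for the SVM classifier named above, this sketch uses a simple nearest-centroid rule; the synthetic feature vectors, cluster parameters and class layout (class 0 = no hidden message, classes 1-5 = steganographic methods) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature vectors: each class (clean image plus five
# steganographic methods) is simulated as a Gaussian cluster.
n_per_class, n_features, n_classes = 40, 8, 6   # class 0 = no hidden message
centres = rng.normal(0, 5, size=(n_classes, n_features))
X = np.vstack([centres[c] + rng.normal(0, 1, (n_per_class, n_features))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

# Nearest-centroid stand-in for the SVM: assign each feature vector to
# the class whose training centroid is closest in feature space.
fitted = np.vstack([X[y == c].mean(axis=0) for c in range(n_classes)])
pred = np.argmin(((X[:, None, :] - fitted[None]) ** 2).sum(-1), axis=1)
accuracy = float((pred == y).mean())
```

An SVM with a suitable kernel would replace the centroid rule with margin-based decision boundaries, but the overall pipeline (features in, class label out) is the same.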
Abstract:
These lecture notes describe recent developments in evolutionary multi-objective optimisation (MO) techniques in detail, along with their advantages and drawbacks compared to traditional deterministic optimisers. The role of Game Strategies (GS), such as Pareto, Nash or Stackelberg games, as companions or pre-conditioners of multi-objective optimisers is presented and discussed on simple mathematical functions in Part I, and their implementation on simple aeronautical model optimisation problems, using a friendly computer-based design framework, is covered in Part II. Real-life (robust) design applications dealing with UAV systems or civil aircraft, combining the EA and Game Strategy material of Parts I and II, are solved and discussed in Part III, providing the designer with new compromise solutions useful for digital aircraft design and manufacturing. Many details related to lecture notes Parts I, II and III can be found in [68].
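A core building block of the Pareto game mentioned above is the non-dominated (Pareto) front of a set of objective vectors, which can be sketched as follows; the two-objective design values are hypothetical.

```python
def dominates(a, b):
    """Pareto dominance for minimisation: a dominates b if a is no worse
    in every objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical two-objective values (e.g. drag vs weight for a design).
designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
front = pareto_front(designs)
```

Evolutionary multi-objective optimisers maintain and refine such a front over generations rather than converging to a single optimum.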
Abstract:
The incidence of major storm surges in the last decade has dramatically emphasized the immense destructive capability of extreme water level events, particularly when driven by severe tropical cyclones. Given this risk, it is vitally important that the exceedance probabilities of extreme water levels are accurately evaluated to inform risk-based flood and erosion management, engineering and future land-use planning, and to ensure that the risk of catastrophic structural failure due to under-design, or of expensive waste due to over-design, is minimised. Australia has a long history of coastal flooding from tropical cyclones. Using a novel integration of two modeling techniques, this paper provides the first estimates of present day extreme water level exceedance probabilities around the whole coastline of Australia, and the first estimates that combine the influence of astronomical tides, storm surges generated by both extra-tropical and tropical cyclones, and seasonal and inter-annual variations in mean sea level. Initially, an analysis of tide gauge records was used to assess the characteristics of tropical cyclone-induced surges around Australia. However, given the dearth (temporal and spatial) of information around much of the coastline, and therefore the inability of these gauge records to adequately describe the regional climatology, an observationally based stochastic tropical cyclone model has been developed to synthetically extend the tropical cyclone record to 10,000 years. Wind and pressure fields derived for these synthetically generated events have then been used to drive a hydrodynamic model of the Australian continental shelf region, with annual maximum water levels extracted to estimate exceedance probabilities around the coastline. To validate this methodology, selected historic storm surge events have been simulated and the resultant storm surges compared with gauge records.
Tropical cyclone-induced exceedance probabilities have been combined with estimates derived from a 61-year water level hindcast described in a companion paper to give a single estimate of present day extreme water level probabilities around the whole coastline of Australia. Results of this work are freely available to coastal engineers, managers and researchers via a web-based tool (www.sealevelrise.info). The described methodology could be applied to other regions of the world, such as the US east coast, that are subject to both extra-tropical and tropical cyclones.
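The final step of this methodology, turning a long synthetic record of annual maximum water levels into exceedance probabilities, can be sketched empirically. The Gumbel-distributed annual maxima below are an illustrative stand-in for the hydrodynamic model output, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for the synthetic 10,000-year record: annual maximum water
# levels (metres) drawn from a Gumbel distribution for illustration.
n_years = 10_000
annual_maxima = rng.gumbel(loc=1.0, scale=0.3, size=n_years)

def exceedance_probability(levels, threshold):
    """Annual exceedance probability: the fraction of years whose
    maximum water level exceeds the given threshold."""
    return float(np.mean(levels > threshold))

p = exceedance_probability(annual_maxima, 2.0)
return_period = 1.0 / p  # average recurrence interval in years
```

In practice an extreme value distribution is usually fitted to the annual maxima so that probabilities beyond the simulated record length can be extrapolated, but the empirical estimate above conveys the idea.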
Abstract:
Planning techniques for large-scale earthworks are considered in this article. To improve these activities, a “block theoretic” approach was developed that provides an integrated solution consisting of an allocation of cuts to fills and a sequence of cuts and fills over time. It accounts for the constantly changing terrain by computing haulage routes dynamically; consequently, more realistic haulage costs are used in the decision-making process. A digraph is utilised to describe the terrain surface, which has been partitioned into uniform grids; it reflects the true state of the terrain and is altered after each cut and fill. A shortest path algorithm is successively applied to calculate the cost of each haul, and these costs are summed over the entire sequence to provide a total cost of haulage. To solve this integrated optimisation problem, a variety of solution techniques were applied, including constructive algorithms, meta-heuristics and parallel programming. Extensive numerical investigations have successfully shown the applicability of our approach to real-sized earthwork problems.
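The dynamic haulage-cost step can be sketched with Dijkstra's shortest path algorithm over a grid of traversal costs; the terrain values and 4-neighbour grid structure below are illustrative assumptions, not the study's digraph.

```python
import heapq

def shortest_haul_cost(grid, start, goal):
    """Dijkstra over a grid digraph: each cell holds a traversal cost
    that can change after a cut or fill, so haul routes are recomputed
    dynamically against the current terrain state."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]  # cost of entering the next cell
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return float("inf")

# Hypothetical terrain costs; a cut or fill would update these in place,
# after which the next haul is re-priced on the altered surface.
terrain = [[1, 1, 5],
           [5, 1, 5],
           [5, 1, 1]]
cost = shortest_haul_cost(terrain, (0, 0), (2, 2))
```

Summing such per-haul costs over a candidate cut/fill sequence gives the total haulage cost that the optimiser minimises.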
Abstract:
Designing systems for multiple stakeholders requires frequent collaboration with those stakeholders from the start. In many cases, at least some stakeholders lack a professional habit of formal modeling. We report observations from student design teams as well as two case studies: one of a prototype for supporting creative communication about design objects, and one of stakeholder involvement in early design. In all observations and case studies we found that non-formal techniques supported strong collaboration, resulting in a deep understanding of early design ideas, of their value, and of the feasibility of solutions.
Abstract:
Dynamics is an essential core engineering subject. It includes high-level mathematical and theoretical content, as well as basic concepts that are abstract in nature. Hence, Dynamics is considered one of the hardest subjects in the engineering discipline. To assist our students in learning this subject, we have conducted a Teaching & Learning project to study ways and methods of teaching Dynamics effectively using visualization techniques. The research project adopts the five basic steps of the Action Learning Cycle. It is found that visualization is a powerful tool for students learning Dynamics, helping to break down the perception among students that Dynamics is a hard subject.
Abstract:
Rolling Element Bearings (REBs) are vital components in rotating machinery, providing rotary motion. In slow speed rotating machines, bearings are normally subjected to heavy static loads, and a catastrophic failure can cause enormous disruption to production and endanger human safety. At low operating speeds, the impact energy generated by the rolling elements striking defective components is not sufficient to produce a detectable vibration response. This is further aggravated by the inability of general measuring instruments to accurately detect and process the weak signals present at the initiation of a defect. Furthermore, the weak signals are often corrupted by background noise. This is a serious problem faced by maintenance engineers today, as the inability to detect an incipient failure can significantly increase the risk of functional failure and costly downtime. This paper presents the application of noise removal techniques for enhancing detection capability in slow speed REB condition monitoring. Blind deconvolution (BD) and the adaptive line enhancer (ALE) are compared to evaluate their performance in enhancing the source signal while removing background noise. In the experimental study, incipient defects were seeded on a number of roller bearings and the signals were acquired using an acoustic emission (AE) sensor. Kurtosis and the modified peak ratio (mPR) were used to determine the detectability of signals corrupted by noise.
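The kurtosis detectability measure mentioned above can be sketched directly: Gaussian background noise has a kurtosis near 3, while impulsive defect signatures raise it sharply. The simulated signal below (periodic impulses added to noise) is an illustrative stand-in for the AE measurements.

```python
import numpy as np

def kurtosis(x):
    """Sample kurtosis (fourth standardised moment). Gaussian noise
    gives values near 3; impulsive defect signatures push it higher."""
    x = np.asarray(x, dtype=float)
    centred = x - x.mean()
    return float(np.mean(centred ** 4) / np.mean(centred ** 2) ** 2)

rng = np.random.default_rng(2)
noise = rng.normal(0.0, 1.0, 10_000)   # background noise only
signal = noise.copy()
signal[::500] += 12.0                  # add periodic defect impulses

k_noise, k_signal = kurtosis(noise), kurtosis(signal)
```

Noise removal techniques such as BD and ALE aim to make exactly this separation sharper when the impulses are much weaker than in this exaggerated example.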
Abstract:
We investigate the utility to computational Bayesian analyses of a particular family of recursive marginal likelihood estimators characterized by the (equivalent) algorithms known as "biased sampling" or "reverse logistic regression" in the statistics literature and "the density of states" in physics. Through a pair of numerical examples (including mixture modeling of the well-known galaxy dataset) we highlight the remarkable diversity of sampling schemes amenable to such recursive normalization, as well as the notable efficiency of the resulting pseudo-mixture distributions for gauging prior-sensitivity in the Bayesian model selection context. Our key theoretical contributions are to introduce a novel heuristic ("thermodynamic integration via importance sampling") for qualifying the role of the bridging sequence in this procedure, and to reveal various connections between these recursive estimators and the nested sampling technique.
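As a minimal point of reference for the marginal likelihood (evidence) that these recursive estimators target, the sketch below estimates the evidence of a conjugate Gaussian toy model by simple Monte Carlo over the prior and checks it against the closed form. The recursive biased-sampling schemes in the abstract generalise this idea across a bridging sequence of distributions; the toy model here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model: datum y ~ N(theta, 1) with prior theta ~ N(0, 1).
# The marginal likelihood is available in closed form, so a simple
# prior-sampling estimator can be checked against it.
y = 0.5
n_draws = 200_000
theta = rng.normal(0.0, 1.0, n_draws)          # draws from the prior

# Likelihood of y under each prior draw; the evidence is its prior mean.
likelihood = np.exp(-0.5 * (y - theta) ** 2) / np.sqrt(2 * np.pi)
evidence_mc = float(likelihood.mean())

# Closed form: marginally y ~ N(0, 1 + 1).
evidence_exact = float(np.exp(-0.25 * y ** 2) / np.sqrt(4 * np.pi))
```

Prior sampling is inefficient when the posterior is far from the prior, which is precisely the regime where bridging sequences and recursive normalization earn their keep.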
Abstract:
The television quiz program Letters and Numbers, broadcast on the SBS network, has recently become quite popular in Australia. This paper explores the potential of this game to illustrate and engage student interest in a range of fundamental concepts of computer science and mathematics. The Numbers Game in particular has a rich mathematical structure whose analysis and solution involves concepts of counting and problem size, discrete (tree) structures, language theory, recurrences, computational complexity, and even advanced memory management. This paper presents an analysis of these games and their teaching applications, and reports some initial results from their use in student assignments.
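A brute-force solver for the Numbers Game illustrates the counting and tree-search structure mentioned above. The rules here are simplified (intermediate negatives are allowed, and only exact integer division is used), so this is a sketch rather than a faithful implementation of the broadcast rules.

```python
from itertools import combinations

def solve_numbers(numbers, target):
    """Exhaustive tree search for the Numbers Game: combine the given
    numbers with +, -, *, / to reach the target exactly. Returns an
    expression string, or None if no exact solution exists."""
    def search(vals):
        # vals: set of (value, expression) pairs still available.
        for (a, ea), (b, eb) in combinations(vals, 2):
            rest = vals - {(a, ea), (b, eb)}
            candidates = [(a + b, f"({ea}+{eb})"), (a * b, f"({ea}*{eb})"),
                          (a - b, f"({ea}-{eb})"), (b - a, f"({eb}-{ea})")]
            if b != 0 and a % b == 0:           # only exact division
                candidates.append((a // b, f"({ea}/{eb})"))
            if a != 0 and b % a == 0:
                candidates.append((b // a, f"({eb}/{ea})"))
            for v, e in candidates:
                if v == target:
                    return e
                found = search(rest | {(v, e)})
                if found:
                    return found
        return None

    if target in numbers:
        return str(target)
    return search({(n, str(n)) for n in numbers})

expr = solve_numbers([25, 4, 3, 7], 103)   # 25*4 + 3 = 103 is one solution
```

The branching factor of this search (pick a pair, pick an operation, recurse) is exactly the kind of counting and complexity analysis the paper proposes as classroom material.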