232 results for Variational techniques
Abstract:
Designing systems for multiple stakeholders requires frequent collaboration with those stakeholders from the start. In many cases, at least some stakeholders lack a professional habit of formal modeling. We report observations from two case studies of stakeholder involvement in early design in which non-formal techniques supported strong collaboration, resulting in a deep understanding of requirements and of the feasibility of solutions.
Abstract:
Bauxite refinery residues are derived from the Bayer process by the digestion of crushed bauxite in concentrated caustic at elevated temperatures. Chemically, they comprise, in varying amounts (depending upon the composition of the starting bauxite), oxides of iron and titanium, residual alumina, sodalite, silica, and minor quantities of other metal oxides. In recent years, bauxite residues have been neutralised with seawater to reduce their alkalinity, through the precipitation of hydrotalcite-like compounds and other Mg, Ca, and Al hydroxide and carbonate minerals. A combination of X-ray diffraction (XRD) and vibrational spectroscopy techniques, including mid-infrared (IR), Raman, near-infrared (NIR), and UV-Visible, has been used to characterise bauxite residue and seawater-neutralised bauxite residue. Both ferrous (Fe2+) and ferric (Fe3+) ions within bauxite residue can be identified by their characteristic NIR bands: ferrous ions produce a strong absorption band at around 9000 cm-1, while ferric ions produce two strong bands at 25000 and 14300 cm-1. The presence of adsorbed carbonate and hydroxide anions can be identified at around 5200 and 7000 cm-1, respectively, attributed to overtones of the fundamental vibrations observed in the mid-IR spectra. The complex bands in the Raman and mid-IR spectra around 3500 cm-1 are assigned to the OH stretching vibrations of water and of the various oxides present in bauxite residue. Combinations and overtones of the carbonate and hydroxyl fundamentals give rise to many of the features of the NIR spectra.
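For orientation only (the arithmetic is not part of the study), the band positions quoted above can be converted from wavenumber to wavelength with the usual relation, which places the two ferric bands in the visible region probed by UV-Visible spectroscopy and the ferrous band in the NIR:

```latex
\lambda\,[\mathrm{nm}] = \frac{10^{7}}{\tilde{\nu}\,[\mathrm{cm^{-1}}]}:\qquad
25000\ \mathrm{cm^{-1}} = 400\ \mathrm{nm},\quad
14300\ \mathrm{cm^{-1}} \approx 700\ \mathrm{nm},\quad
9000\ \mathrm{cm^{-1}} \approx 1110\ \mathrm{nm}.
```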
Abstract:
Recurrence relations in mathematics form a very powerful and compact way of looking at a wide range of relationships. Traditionally, the concept of recurrence has often been a difficult one for the secondary teacher to convey to students. Closely related to the powerful proof technique of mathematical induction, recurrences are able to capture many relationships in formulas much simpler than so-called direct or closed formulas. In computer science, recursive coding often has a similar compactness property and, perhaps not surprisingly, suffers from classroom problems similar to those of recurrences: students often find both the basic concepts and the practicalities elusive. Using models designed to illuminate the relevant principles for students, we offer a range of examples that use the modern spreadsheet environment to illustrate the great expressive and computational power of recurrences.
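As a minimal illustration of the compactness point (not one of the paper's own examples), compare the Fibonacci recurrence with its closed form; in a spreadsheet the recurrence is simply a cell formula such as =A1+A2 filled down a column:

```python
import math

def fib_recurrence(n):
    """F(0)=0, F(1)=1, F(n)=F(n-1)+F(n-2): the recurrence is the whole definition."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def fib_closed(n):
    """Binet's closed form: correct, but far less transparent than the recurrence."""
    phi = (1 + math.sqrt(5)) / 2
    psi = (1 - math.sqrt(5)) / 2
    return round((phi ** n - psi ** n) / math.sqrt(5))

# Both give the same values; only the recurrence reads like the original relationship.
assert all(fib_recurrence(n) == fib_closed(n) for n in range(30))
```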
Abstract:
We first classify state-of-the-art approaches to the stream authentication problem in the multicast environment, grouping them into Signing and MAC approaches. A new approach for authenticating digital streams using Threshold Techniques is introduced. The main advantages of the new approach are that it tolerates packet loss up to a threshold number of packets and has minimal space overhead. It is most suitable for multicast applications running over lossy, unreliable communication channels while, at the same time, preserving the security requirements. We use linear equations based on Lagrange polynomial interpolation and Combinatorial Design methods.
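The abstract does not spell out the construction; the sketch below is only a generic Shamir-style threshold scheme, not the authors' protocol, but it shows how Lagrange interpolation lets any threshold-sized subset of pieces reconstruct the protected value, which is what yields tolerance to packet loss up to the threshold:

```python
import random

P = 2**61 - 1  # illustrative prime modulus; all arithmetic is over this field

def make_shares(secret, k, n):
    """Encode `secret` into n shares so that any k of them suffice to recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over the prime field recovers the secret."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(secret=123456789, k=3, n=5)
assert recover(random.sample(shares, 3)) == 123456789  # any 3 of the 5 shares suffice
```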
Abstract:
With nine examples, we seek to illustrate the utility of the Renormalization Group approach as a unification of other asymptotic and perturbation methods.
Abstract:
The objective of this chapter is to provide an overview of traffic data collection that can and should be used for the calibration and validation of traffic simulation models. There are large differences in the availability of data from different sources. Some types of data, such as loop detector data, are widely available and used. Others can be measured with additional effort, for example travel time data from GPS probe vehicles. Still others, such as trajectory data, are available only in rare situations such as research projects.
Abstract:
For renewable energy sources whose outputs vary continuously, a Z-source current-type inverter has been proposed as a possible buck-boost alternative for grid interfacing. With a unique X-shaped LC network connected between its dc power source and the inverter topology, the Z-source current-type inverter is, however, expected to suffer from compounded resonant complications in addition to those associated with its second-order output filter. To improve its damping performance, this paper proposes the careful integration of Posicast or three-step compensators before the inverter pulse-width modulator for damping triggered resonant oscillations. In total, two compensators are needed for wave-shaping the inverter boost factor and modulation ratio, and they can conveniently be implemented using the first-in first-out stacks and embedded timers of modern digital signal processors widely used in motion control applications. Both techniques are found to damp the resonance of the ac filter well, but for transitions from the current-buck to the boost state the three-step technique is less effective, due to the sudden intermediate discharging interval introduced by its non-monotonic stepping (unlike the monotonic stepping of Posicast damping). These findings have been confirmed in both simulations and experiments using a laboratory prototype.
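As a rough sketch of the kind of compensator involved, the function below implements generic half-cycle Posicast shaping of a step reference; the damping ratio and natural frequency are illustrative placeholders, not parameters of the paper's prototype, and the paper's boost-factor and modulation-ratio shaping is not reproduced:

```python
import math

def posicast_two_step(t, step=1.0, zeta=0.1, wn=2 * math.pi * 400):
    """Classic half-cycle Posicast: split a step command into two monotonic sub-steps
    so that the delayed portion cancels the first overshoot of an underdamped
    second-order stage (here standing in for the inverter's LC resonance)."""
    wd = wn * math.sqrt(1 - zeta ** 2)                            # damped natural frequency
    delta = math.exp(-zeta * math.pi / math.sqrt(1 - zeta ** 2))  # fractional overshoot
    t_half = math.pi / wd                                         # half the damped period
    first = step / (1.0 + delta)                                  # applied immediately
    return first if t < t_half else step                          # full value after t_half
```

In a digital implementation the delayed portion is simply the reference read back out of a first-in first-out buffer after half the damped period, which is why the paper notes that the FIFO stacks and embedded timers of a DSP are sufficient.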
Abstract:
This paper proposes a combination of source-normalized weighted linear discriminant analysis (SN-WLDA) and short utterance variance (SUV) PLDA modelling to improve short utterance PLDA speaker verification. As short-utterance i-vectors vary with the speaker, session variations, and the phonetic content of the utterance (utterance variation), a combined approach of SN-WLDA projection and SUV PLDA modelling is used to compensate for the session and utterance variations. Experimental studies have found that the combined SN-WLDA and SUV PLDA modelling approach shows an improvement over the baseline system (WCCN[LDA]-projected Gaussian PLDA (GPLDA)), as it effectively compensates for the session and utterance variations.
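Neither SN-WLDA nor PLDA is reproduced here; the toy sketch below only shows the overall shape of such a pipeline (project i-vectors with a discriminative transform, then score trial pairs), with plain LDA and cosine scoring standing in for SN-WLDA and the PLDA backend, and with entirely synthetic data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic stand-in for development i-vectors: 20 speakers, 10 utterances each.
rng = np.random.default_rng(0)
speakers = np.repeat(np.arange(20), 10)
ivectors = rng.normal(size=(200, 100)) + speakers[:, None] * 0.05

# Plain LDA stands in for the SN-WLDA projection; cosine scoring replaces PLDA.
lda = LinearDiscriminantAnalysis(n_components=19).fit(ivectors, speakers)

def trial_score(enrol_iv, test_iv):
    """Cosine similarity between projected enrolment and test i-vectors."""
    a = lda.transform(enrol_iv[None, :])[0]
    b = lda.transform(test_iv[None, :])[0]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(trial_score(ivectors[0], ivectors[1]))  # same speaker, so a higher score is expected
```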
Abstract:
The foliage of a plant performs vital functions. As such, leaf models need to be developed for modelling the plant architecture from a set of scattered data points captured using a scanning device. The leaf model can be used for purely visual purposes or as part of a further model, such as a model of fluid movement or of a biological process. For these reasons, an accurate mathematical representation of the surface and boundary is required. This paper compares three approaches for fitting a continuously differentiable surface through a set of scanned data points from a leaf surface with a technique already used for reconstructing leaf surfaces. The techniques considered are discrete smoothing D2-splines [R. Arcangeli, M. C. Lopez de Silanes, and J. J. Torrens, Multidimensional Minimising Splines, Springer, 2004], the thin plate spline finite element smoother [S. Roberts, M. Hegland, and I. Altas, Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions, SIAM, 1 (2003), pp. 208-234], and the radial basis function Clough-Tocher method [M. Oqielat, I. Turner, and J. Belward, A hybrid Clough-Tocher method for surface fitting with application to leaf data, Appl. Math. Modelling, 33 (2009), pp. 2582-2595]. Numerical results show that discrete smoothing D2-splines produce reconstructed leaf surfaces that better represent the original physical leaf.
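None of the three methods compared in the paper is reproduced here, but scipy's thin plate spline radial basis smoother gives a compact feel for the underlying task of fitting a smooth surface through noisy scattered (x, y, z) leaf points; the data and smoothing parameter below are purely illustrative:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Synthetic stand-in for scanned leaf points: scattered (x, y) positions with noisy heights z.
rng = np.random.default_rng(1)
xy = rng.uniform(-1.0, 1.0, size=(400, 2))
z = np.sin(xy[:, 0]) * np.cos(xy[:, 1]) + 0.01 * rng.normal(size=400)

# Thin plate spline RBF smoother; `smoothing` trades data fidelity against surface roughness.
surface = RBFInterpolator(xy, z, kernel='thin_plate_spline', smoothing=1e-3)

# Evaluate the fitted surface on a regular grid for visualisation or further modelling.
grid = np.stack(np.meshgrid(np.linspace(-1, 1, 50), np.linspace(-1, 1, 50)), axis=-1)
heights = surface(grid.reshape(-1, 2)).reshape(50, 50)
```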
Abstract:
Purpose: Corneal confocal microscopy (CCM) is a rapid, non-invasive ophthalmic technique which has been shown to diagnose and stratify the severity of diabetic neuropathy. Current morphometric techniques assess individual static images of the subbasal nerve plexus; this work explores the potential for non-invasive assessment of the wide-field morphology and dynamic changes of this plexus in vivo. Methods: In this pilot study, laser scanning CCM was used to acquire maps (using a dynamic fixation target and semi-automated tiling software) of the central corneal sub-basal nerve plexus in 4 diabetic patients with neuropathy, 6 without neuropathy, and 2 control subjects. Nerve migration was measured in an additional 7 diabetic patients with neuropathy, 4 without neuropathy, and 2 control subjects by repeating a modified version of the mapping procedure within 2-8 weeks, thus facilitating re-identification of distinctive nerve landmarks in the 2 montages. The rate of nerve movement was determined from these data and normalised to a weekly rate (µm/week) using customised software. Results: Wide-field corneal nerve fibre length correlated significantly with the Neuropathy Disability Score (r = -0.58, p < 0.05), vibration perception (r = -0.66, p < 0.05) and peroneal conduction velocity (r = 0.67, p < 0.05). Central corneal nerve fibre length did not correlate with any of these measures of neuropathy (p > 0.05 for all). The rate of corneal nerve migration was 14.3 ± 1.1 µm/week in diabetic patients with neuropathy, 19.7 ± 13.3 µm/week in diabetic patients without neuropathy, and 24.4 ± 9.8 µm/week in control subjects; these differences were not statistically significant (p = 0.543). Conclusions: Our data demonstrate that it is possible to capture wide-field images of the corneal nerve plexus and to quantify the rate of corneal nerve migration by repeating this procedure over a number of weeks. Further studies with larger sample sizes are required to determine the utility of this approach for the diagnosis and monitoring of diabetic neuropathy.
Abstract:
Grading is basic to the work of Landscape Architects concerned with design on the land. Gradients conducive to easy use, rainwater drained away, and land slope contributing to functional and aesthetic use are all essential to the amenity and pleasure of external environments. This workbook has been prepared specifically to support the program of landscape construction for students in Landscape Architecture. It is concerned primarily with the technical design of grading rather than with its aesthetic design. It must be stressed that the two aspects are rarely separate; what is designed should be technically correct and aesthetically pleasing - it needs to look good as well as to function effectively. This revised edition contains amended and new content which has evolved out of student classes and discussion with colleagues. I am pleased to have on record that every delivery of this workbook material has resulted in my own better understanding of grading and the techniques for its calculation and communication.
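By way of a single worked figure of the kind the workbook's calculations revolve around (the numbers are chosen for illustration and are not drawn from the workbook itself):

```latex
G = \frac{\text{rise}}{\text{run}} \times 100\%
  = \frac{0.6\ \text{m}}{30\ \text{m}} \times 100\% = 2\%,
\qquad \text{equivalently a ratio of } 1{:}50.
```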
Abstract:
The solutions proposed in this thesis contribute to improving gait recognition performance in practical scenarios, further enabling the adoption of gait recognition in real-world security and forensic applications that require identifying humans at a distance. Pioneering work has been conducted on frontal gait recognition using depth images to allow gait to be integrated with biometric walkthrough portals. The effects of challenging conditions for gait, including clothing, carrying goods, and viewpoint, have been explored. Enhanced approaches are proposed for the segmentation, feature extraction, feature optimisation, and classification elements, and state-of-the-art recognition performance has been achieved. A frontal depth gait database has been developed and made available to the research community for further investigation. Solutions are explored in the 2D and 3D domains using multiple image sources, and both domain-specific and modality-independent gait features are proposed.
Abstract:
This research is a step forward in improving the accuracy of detecting anomalies in a data graph representing connectivity between people in an online social network. The proposed hybrid methods are based on fuzzy machine learning techniques utilising different types of structural input features. The methods are presented within a multi-layered framework which provides the full set of requirements needed for finding anomalies in data graphs generated from online social networks, including data modelling and analysis, labelling, and evaluation.
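The thesis's hybrid fuzzy methods are not reproduced here; the sketch below only illustrates what "structural input features" of a graph look like on a synthetic network, with a crude standardised-distance score standing in for the actual anomaly detectors:

```python
import networkx as nx
import numpy as np

# Synthetic social graph; the thesis's own fuzzy/hybrid detectors are not reproduced.
G = nx.barabasi_albert_graph(500, 3, seed=42)

# Simple structural features per node: degree and local clustering coefficient.
degree = np.array([d for _, d in G.degree()])
clustering = np.array(list(nx.clustering(G).values()))
features = np.column_stack([degree, clustering])

# Crude anomaly score: distance from the mean feature vector in standardised units.
z = (features - features.mean(axis=0)) / features.std(axis=0)
score = np.linalg.norm(z, axis=1)
anomalous_nodes = np.argsort(score)[-10:]  # the ten most unusual nodes by this score
```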