Abstract:
The life history strategies of massive Porites corals make them a valuable resource not only as key providers of reef structure, but also as recorders of past environmental change. Yet recent documented evidence of an unprecedented increase in the frequency of mortality in Porites warrants investigation into the history of mortality and associated drivers. To achieve this, both an accurate chronology and an understanding of the life history strategies of Porites are necessary. Sixty-two individual Uranium–Thorium (U–Th) dates from 50 dead massive Porites colonies from the central inshore region of the Great Barrier Reef (GBR) revealed the timing of mortality to have occurred predominantly over two main periods from 1989.2 ± 4.1 to 2001.4 ± 4.1, and from 2006.4 ± 1.8 to 2008.4 ± 2.2 A.D., with a small number of colonies dating earlier. Overall, the peak ages of mortality are significantly correlated with maximum sea-surface temperature anomalies. Despite potential sampling bias, the frequency of mortality increased dramatically post-1980. These observations are similar to the results reported for the Southern South China Sea. High resolution measurements of Sr/Ca and Mg/Ca obtained from a well-preserved sample that died in 1994.6 ± 2.3 revealed that the time of death occurred at the peak of sea surface temperatures (SST) during the austral summer. In contrast, Sr/Ca and Mg/Ca analyses of two colonies dated to 2006.9 ± 3.0 and 2008.3 ± 2.0 suggest that both died after the austral winter. An increase in Sr/Ca ratios and the presence of low-Mg calcite cements (as determined by SEM and elemental ratio analysis) in one of the colonies were attributed to stressful conditions that may have persisted for some time prior to mortality.
For both colonies, however, the timing of mortality coincides with the 4th and 6th largest flood events reported for the Burdekin River in the past 60 years, implying that factors associated with terrestrial runoff may have been responsible for mortality. Our results show that a combination of U–Th and elemental ratio geochemistry can potentially be used to precisely and accurately determine the timing and season of mortality in modern massive Porites corals. For reefs where long-term monitoring data are absent, the ability to reconstruct historical events in coral communities may prove useful to reef managers by providing some baseline knowledge on disturbance history and associated drivers.
Abstract:
We first classify state-of-the-art stream authentication schemes for the multicast environment, grouping them into signing and MAC approaches. A new approach to authenticating digital streams using threshold techniques is introduced. Its main advantages are tolerance of packet loss up to a threshold number of packets and minimal space overhead. It is most suitable for multicast applications running over lossy, unreliable communication channels while, at the same time, satisfying the security requirements. We use linear equations based on Lagrange polynomial interpolation and combinatorial design methods.
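The threshold idea referenced above can be sketched with Shamir-style secret sharing over a prime field, where any k of n shares suffice to recover the payload by Lagrange interpolation at x = 0. This is a generic illustration under assumed parameters, not the paper's construction; the prime P and the names make_shares and recover are illustrative.

```python
import random

P = 2**127 - 1  # a Mersenne prime; any prime larger than the secret works

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    # Evaluate the degree-(k-1) polynomial at x = 1..n.
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 from at least k shares."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, k=3, n=5)
assert recover(shares[:3]) == 123456789  # any 3 of the 5 shares suffice
assert recover([shares[0], shares[2], shares[4]]) == 123456789
```

In a stream setting, the n shares would be spread across n packets, so the receiver tolerates the loss of up to n − k packets, mirroring the loss-tolerance property claimed above.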
Abstract:
With nine examples, we seek to illustrate the utility of the Renormalization Group approach as a unification of other asymptotic and perturbation methods.
Abstract:
The objective of this chapter is to provide an overview of traffic data collection that can and should be used for the calibration and validation of traffic simulation models. The availability of data differs greatly between sources. Some types, such as loop detector data, are widely available and widely used. Others can be measured with additional effort, for example travel time data from GPS probe vehicles. Some types, such as trajectory data, are available only in rare situations such as research projects.
Abstract:
Large arrays and networks of carbon nanotubes, both single- and multi-walled, feature many superior properties which offer excellent opportunities for various modern applications ranging from nanoelectronics, supercapacitors, photovoltaic cells, and energy storage and conversion devices, to gas- and biosensors, nanomechanical and biomedical devices, etc. At present, arrays and networks of carbon nanotubes are mainly fabricated from pre-fabricated, separated nanotubes by solution-based techniques. However, the intrinsic structure of the nanotubes (mainly, the level of structural defects), which is required for the best performance in nanotube-based applications, is often damaged during array/network fabrication by the surfactants, chemicals, and sonication involved in the process. As a result, the performance of the functional devices may be significantly degraded. In contrast, directly synthesized nanotube arrays/networks can preclude the adverse effects of solution-based processing and largely preserve the excellent properties of the pristine nanotubes. Owing to their advantages of scaled-up production and precise positioning of the grown nanotubes, catalytic and catalyst-free chemical vapor deposition (CVD), as well as plasma-enhanced chemical vapor deposition (PECVD), are the methods most promising for the direct synthesis of nanotubes.
Abstract:
Bayesian experimental design is a fast-growing area of research with many real-world applications. As computational power has increased over the years, so has the development of simulation-based design methods, which involve a number of algorithms, such as Markov chain Monte Carlo, sequential Monte Carlo and approximate Bayes methods, facilitating more complex design problems to be solved. The Bayesian framework provides a unified approach for incorporating prior information and/or uncertainties regarding the statistical model with a utility function which describes the experimental aims. In this paper, we provide a general overview of the concepts involved in Bayesian experimental design, and focus on describing some of the more commonly used Bayesian utility functions and methods for their estimation, as well as a number of algorithms that are used to search over the design space to find the Bayesian optimal design. We also discuss other computational strategies for further research in Bayesian optimal design.
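One commonly used Bayesian utility is expected information gain (the mutual information between parameter and data), often estimated by nested Monte Carlo as described above. The sketch below assumes a simple linear-Gaussian model (prior theta ~ N(0, 1), observation y ~ N(theta·d, SIGMA²)); the model, the noise level SIGMA, and the function names are illustrative assumptions, not taken from the paper.

```python
import math, random

random.seed(0)
SIGMA = 1.0  # known observation noise (assumed)

def log_lik(y, theta, d):
    # Gaussian log-likelihood of observation y with mean theta * d.
    mu = theta * d
    return -0.5 * math.log(2 * math.pi * SIGMA**2) - (y - mu)**2 / (2 * SIGMA**2)

def eig(d, n_outer=500, n_inner=500):
    """Nested Monte Carlo estimate of expected information gain at design d."""
    inner = [random.gauss(0, 1) for _ in range(n_inner)]  # prior draws for the evidence
    total = 0.0
    for _ in range(n_outer):
        theta = random.gauss(0, 1)               # theta ~ prior N(0, 1)
        y = theta * d + random.gauss(0, SIGMA)   # simulate y | theta, d
        # log evidence via a numerically stable log-sum-exp over inner draws
        lls = [log_lik(y, t, d) for t in inner]
        m = max(lls)
        log_evid = m + math.log(sum(math.exp(l - m) for l in lls) / n_inner)
        total += log_lik(y, theta, d) - log_evid  # log-likelihood ratio
    return total / n_outer

# A stronger design (larger |d|) is more informative about theta in this model.
e_hi, e_lo = eig(2.0), eig(0.1)
assert e_hi > e_lo
```

An optimal-design search would then maximise this estimate over candidate designs d; in practice the search algorithms surveyed in the paper replace the naive grid comparison shown here.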
Abstract:
This paper proposes a combination of source-normalized weighted linear discriminant analysis (SN-WLDA) and short utterance variance (SUV) PLDA modelling to improve short utterance PLDA speaker verification. As short-utterance i-vectors vary with the speaker, session variations and the phonetic content of the utterance (utterance variation), a combined approach of SN-WLDA projection and SUV PLDA modelling is used to compensate for the session and utterance variations. Experimental studies found that the combined SN-WLDA and SUV PLDA modelling approach improves on the baseline system (WCCN[LDA]-projected Gaussian PLDA (GPLDA)), as it effectively compensates for the session and utterance variations.
Abstract:
The foliage of a plant performs vital functions. As such, leaf models are required to be developed for modelling the plant architecture from a set of scattered data captured using a scanning device. The leaf model can be used for purely visual purposes or as part of a further model, such as a fluid movement model or biological process. For these reasons, an accurate mathematical representation of the surface and boundary is required. This paper compares three approaches for fitting a continuously differentiable surface through a set of scanned data points from a leaf surface, with a technique already used for reconstructing leaf surfaces. The techniques which will be considered are discrete smoothing D2-splines [R. Arcangeli, M. C. Lopez de Silanes, and J. J. Torrens, Multidimensional Minimising Splines, Springer, 2004.], the thin plate spline finite element smoother [S. Roberts, M. Hegland, and I. Altas, Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions, SIAM, 1 (2003), pp. 208--234] and the radial basis function Clough-Tocher method [M. Oqielat, I. Turner, and J. Belward, A hybrid Clough-Tocher method for surface fitting with application to leaf data., Appl. Math. Modelling, 33 (2009), pp. 2582-2595]. Numerical results show that discrete smoothing D2-splines produce reconstructed leaf surfaces which better represent the original physical leaf.
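As a rough illustration of the surface-fitting task (a standard thin plate spline smoother, one of the three compared techniques, not the discrete smoothing D2-spline method the paper favours), scattered (x, y, z) leaf data can be fitted directly with numpy. The kernel r² log r, the regularisation parameter lam, and the synthetic "leaf" z = x·y are the usual textbook choices, assumed here for demonstration.

```python
import numpy as np

def tps_smooth(pts, z, lam=1e-3):
    """Fit a smoothing thin plate spline f(x, y) to scattered data.

    pts: (n, 2) array of sample locations; z: (n,) heights.
    Returns a callable evaluating the fitted surface at (m, 2) query points.
    """
    n = len(pts)
    def phi(r):
        # TPS radial kernel r^2 log r, with the r = 0 singularity set to 0.
        with np.errstate(divide="ignore", invalid="ignore"):
            return np.where(r > 0, r**2 * np.log(r), 0.0)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    K = phi(d) + lam * np.eye(n)            # smoothing regularisation on the diagonal
    P = np.hstack([np.ones((n, 1)), pts])   # affine (degree-1 polynomial) part
    A = np.block([[K, P], [P.T, np.zeros((3, 3))]])
    b = np.concatenate([z, np.zeros(3)])
    sol = np.linalg.solve(A, b)
    w, a = sol[:n], sol[n:]
    def f(q):
        q = np.atleast_2d(q)
        r = np.linalg.norm(q[:, None, :] - pts[None, :, :], axis=2)
        return phi(r) @ w + a[0] + q @ a[1:]
    return f

# Fit a smooth synthetic surface z = x*y through noisy scattered samples.
rng = np.random.default_rng(1)
pts = rng.uniform(-1, 1, size=(60, 2))
z = pts[:, 0] * pts[:, 1] + rng.normal(0, 0.01, 60)
f = tps_smooth(pts, z, lam=1e-4)
err = abs(f(np.array([[0.5, 0.5]]))[0] - 0.25)  # true value at (0.5, 0.5) is 0.25
assert err < 0.2
```

The fitted surface is continuously differentiable away from the data sites, which matches the requirement above that the leaf representation support further modelling such as fluid movement.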
Abstract:
Purpose Corneal confocal microscopy (CCM) is a rapid non-invasive ophthalmic technique, which has been shown to diagnose and stratify the severity of diabetic neuropathy. Current morphometric techniques assess individual static images of the sub-basal nerve plexus; this work explores the potential for non-invasive assessment of the wide-field morphology and dynamic changes of this plexus in vivo. Methods In this pilot study, laser scanning CCM was used to acquire maps (using a dynamic fixation target and semi-automated tiling software) of the central corneal sub-basal nerve plexus in 4 diabetic patients with neuropathy, 6 diabetic patients without neuropathy, and 2 control subjects. Nerve migration was measured in an additional 7 diabetic patients with neuropathy, 4 without neuropathy and 2 control subjects by repeating a modified version of the mapping procedure within 2-8 weeks, thus facilitating re-identification of distinctive nerve landmarks in the 2 montages. The rate of nerve movement was determined from these data and normalised to a weekly rate (µm/week), using customised software. Results Wide-field corneal nerve fibre length correlated significantly with the Neuropathy Disability Score (r = -0.58, p < 0.05), vibration perception (r = -0.66, p < 0.05) and peroneal conduction velocity (r = 0.67, p < 0.05). Central corneal nerve fibre length did not correlate with any of these measures of neuropathy (p > 0.05 for all). The rate of corneal nerve migration was 14.3 ± 1.1 µm/week in diabetic patients with neuropathy, 19.7 ± 13.3 µm/week in diabetic patients without neuropathy, and 24.4 ± 9.8 µm/week in control subjects; however, these differences were not statistically significant (p = 0.543). Conclusions Our data demonstrate that it is possible to capture wide-field images of the corneal nerve plexus, and to quantify the rate of corneal nerve migration by repeating this procedure over a number of weeks.
Further studies on larger sample sizes are required to determine the utility of this approach for the diagnosis and monitoring of diabetic neuropathy.
Abstract:
Grading is basic to the work of Landscape Architects concerned with design on the land. Gradients conducive to easy use, rainwater drained away, and land slope contributing to functional and aesthetic use are all essential to the amenity and pleasure of external environments. This workbook has been prepared specifically to support the program of landscape construction for students in Landscape Architecture. It is concerned primarily with the technical design of grading rather than with its aesthetic design. It must be stressed that the two aspects are rarely separate; what is designed should be technically correct and aesthetically pleasing - it needs to look good as well as to function effectively. This revised edition contains amended and new content which has evolved out of student classes and discussion with colleagues. I am pleased to have on record that every delivery of this workbook material has resulted in my own better understanding of grading and the techniques for its calculation and communication.
Abstract:
In the years since Nicolas Bourriaud’s Relational Aesthetics (1998) was published, a plethora of books (Shannon Jackson’s Social Works: Performing Art, Supporting Publics [2011], Nato Thompson’s Living as Form: Socially Engaged Art from 1991–2011 [2011], Grant Kester’s Conversation Pieces: Community and Communication in Modern Art [2004], Pablo Helguera’s Education for Socially Engaged Art: A Materials and Techniques Handbook [2011]), conferences and articles have surfaced, creating a rich and textured discourse that has responded to, critiqued and reconfigured the proposed social utopias of Bourriaud’s aesthetics. As a touchstone for this emerging discourse, Relational Aesthetics outlines in a contemporary context the plethora of social and process-based art forms that took as their medium the ‘social’. It is, however, Claire Bishop’s book Artificial Hells: Participatory Art and the Politics of Spectatorship (Verso), that offers a deeper art historical and theoretically considered rendering of this growing and complicated form of art, and forms a central body of work in this broad constellation of writings about participatory art, or social practice art/socially engaged art (SEA), as it is now commonly known...
Abstract:
The solutions proposed in this thesis contribute to improving gait recognition performance in practical scenarios, further enabling the adoption of gait recognition in real-world security and forensic applications that require identifying humans at a distance. Pioneering work has been conducted on frontal gait recognition using depth images to allow gait to be integrated with biometric walkthrough portals. The effects of gait challenging conditions including clothing, carrying goods, and viewpoint have been explored. Enhanced approaches are proposed for the segmentation, feature extraction, feature optimisation and classification elements, and state-of-the-art recognition performance has been achieved. A frontal depth gait database has been developed and made available to the research community for further investigation. Solutions are explored in the 2D and 3D domains using multiple image sources, and both domain-specific and modality-independent gait features are proposed.
Abstract:
As a Lecturer of Animation History and 3D Computer Animator, I received a copy of Moving Innovation: A History of Computer Animation by Tom Sito with an element of anticipation in the hope that this text would clarify the complex evolution of Computer Graphics (CG). Tom Sito did not disappoint, as this text weaves together the multiple development streams and convergent technologies and techniques throughout history that would ultimately result in modern CG. Universities now have students who have never known a world without computer animation and many students are younger than the first 3D CG animated feature film, Toy Story (1995); this text is ideal for teaching computer animation history and, as I would argue, it also provides a model for engaging young students in the study of animation history in general. This is because Sito places the development of computer animation within the context of its pre-digital ancestry and throughout the text he continues to link the discussion to the broader history of animation, its pioneers, technologies and techniques...
Abstract:
Australian labour law, at least from the mid-twentieth century, was dominated by the employment paradigm: the assumption that labour law’s scope was the regulation of employment relationships – full-time and part-time, and continuing, fixed term or casual – with a single (usually corporate) entity employer. But no sooner had the employment paradigm established and consolidated its shape than it began to fall apart. Since the 1980s there has been significant growth in patterns of work that fall outside this paradigm, driven by organisational restructuring and management techniques such as labour hire, sub-contracting and franchising. Beyond Employment analyses the way in which Australian labour law is being reframed in this shift away from the pre-eminence of the employment paradigm. Its principal concern is with the legal construction and regulation of various forms of contracting, including labour hire arrangements, complex contractual chains and modern forms like franchising, and of casual employment. It outlines the current array of work relationships in Australia, and describes and analyses the way in which those outside continuous and fixed term employment are regulated. The book seeks to answer the central question: How does law (legal rules and principles) construct these work relationships, and how does it regulate these relationships? The book identifies the way in which current law draws the lines between the various work relationships through the use of contract and property ownership, and describes, analyses and synthesises the legal rules that govern these different forms of work relationships. The legal rules that govern work relationships are explored through the traditional lens of labour law’s protective function, principally in four themes: control of property, and the distribution of risks and rewards; maintenance of income security; access to collective voice mechanisms, focusing on collective bargaining; and health, safety and welfare.
The book critically evaluates the gaps in the coverage and content of these rules and principles, and the implications of these gaps for workers. It also reflects upon the power relationships that underpin the work arrangements that are the focus of the book and that are enhanced through the laws of contract and property. Finally, it frames an agenda to address the gaps and identified weaknesses insofar as they affect the economic wellbeing, democratic voice, and health and safety of workers.