Abstract:
Human hair is a relatively inert biopolymer and can survive natural disasters. It is also found as trace evidence at crime scenes. Previous studies by FTIR microspectroscopy and attenuated total reflectance (ATR) successfully showed that hairs can be matched and discriminated on the basis of gender, race and hair treatment when interpreted by chemometrics. However, these spectroscopic techniques are difficult to operate in the field. By contrast, some near infrared spectroscopy (NIRS) instruments equipped with an optical probe are portable and thus facilitate in-field measurements for potential application directly at a crime or disaster scene. This thesis focuses on bulk hair samples, which are free of their roots and thus independent of any potential DNA contribution to identification. It explores the building of a profile of an individual with the use of the NIRS technique on the basis of information on gender, race and hair treatment, i.e. variables which can match and discriminate individuals. The complex spectra collected may be compared and interpreted with the use of chemometrics, and these methods can then be used as a protocol for further investigations. Water is commonly present at forensic scenes, e.g. at home in a bath or swimming pool; it is also common outdoors in the sea, rivers, dams and puddles, and especially during DVI incidents at the seashore after a tsunami. For this reason, the matching and discrimination of bulk hair samples after water immersion was also explored. Through this research, it was found that near infrared spectroscopy, with the use of an optical probe, successfully matched and discriminated bulk hair samples to build a profile for possible application at a crime or disaster scene. Interpreted with chemometrics, such characteristics included gender and race.
A novel approach was to measure the spectra not only in the usual NIR range (4000–7500 cm-1) but also in the Visible NIR range (7500–12800 cm-1). This proved particularly useful in exploring the discrimination of differently coloured hair, e.g. naturally coloured, bleached or dyed. The NIR region is sensitive to molecular vibrations of the hair fibre structure, as well as those of dyes and of damage from bleaching, whereas the Visible NIR region responds preferentially to the natural colourants, the melanins, through electronic transitions. This approach was shown to provide improved discrimination between dyed and untreated hair. This thesis is an extensive study of the application of NIRS, with the aid of chemometrics, for matching and discrimination of bulk human scalp hair. The work not only indicates the strong potential of this technique in the field but also breaks new ground with the exploration of the NIR and Visible NIR ranges for spectral sampling. It also develops methods for measuring spectra from hair which has been immersed in different water media (sea, river and dam).
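Chemometric matching of the kind described above can be reduced, at its very simplest, to comparing a query spectrum against labelled reference spectra with a similarity measure. The following is a minimal, hypothetical sketch; the data, labels and the choice of Pearson correlation as the similarity measure are illustrative assumptions, not the thesis's actual protocol.

```python
import math

def correlation(a, b):
    # Pearson correlation between two equally sampled spectra.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def match_spectrum(query, references):
    # Return the label of the reference spectrum most correlated with the query.
    # references: {label: spectrum} with hypothetical class labels.
    return max(references, key=lambda label: correlation(query, references[label]))
```

Real chemometric workflows apply baseline correction and multivariate methods such as principal component analysis before any such comparison; this sketch only illustrates the final matching step.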
Abstract:
Purpose: The classic study of Sumby and Pollack (1954, JASA, 26(2), 212-215) demonstrated that visual information aided speech intelligibility under noisy auditory conditions. Their work showed that visual information is especially useful under low signal-to-noise conditions, where the auditory signal leaves greater margins for improvement. We investigated whether simulated cataracts interfered with the ability of participants to use visual cues to help disambiguate the auditory signal in the presence of auditory noise. Methods: Participants in the study were screened to ensure normal visual acuity (mean of 20/20) and normal hearing (auditory threshold ≤ 20 dB HL). Speech intelligibility was tested under an auditory-only condition and two visual conditions: normal vision and simulated cataracts. The light-scattering effects of cataracts were imitated using cataract-simulating filters. Participants wore blacked-out glasses in the auditory-only condition and lens-free frames in the normal auditory-visual condition. Individual sentences were spoken by a live speaker in the presence of prerecorded four-person background babble set to a speech-to-noise ratio (SNR) of -16 dB. The SNR was determined in a preliminary experiment to support 50% correct identification of sentences under the auditory-only conditions. The speaker was trained to match the rate, intensity and inflections of a prerecorded audio track of everyday speech sentences. The speaker was blind to the visual condition of the participant to control for bias. Participants’ speech intelligibility was measured by comparing the accuracy of their written account of what they believed the speaker to have said to the actual spoken sentence. Results: Relative to the normal vision condition, speech intelligibility was significantly poorer when participants wore simulated cataracts. Conclusions: The results suggest that cataracts may interfere with the acquisition of visual cues to speech perception.
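The -16 dB speech-to-noise ratio above is a plain decibel relationship between speech and babble levels. A minimal sketch of how one might compute the gain to apply to a noise track so that a target SNR is reached (the function name and the RMS-based definition are illustrative assumptions, not the study's actual calibration procedure):

```python
import math

def noise_gain_for_snr(speech_rms, noise_rms, target_snr_db):
    # Gain g such that 20*log10(speech_rms / (g * noise_rms)) == target_snr_db.
    # A negative target (e.g. -16 dB) means the noise ends up louder than the speech.
    return speech_rms / (noise_rms * 10 ** (target_snr_db / 20))
```

For example, with equal speech and babble RMS levels, reaching -16 dB requires amplifying the babble by a factor of 10^(16/20), roughly 6.3.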
Abstract:
Developing the social identity theory of leadership (e.g., [Hogg, M. A. (2001). A social identity theory of leadership. Personality and Social Psychology Review, 5, 184–200]), an experiment (N=257) tested the hypothesis that as group members identify more strongly with their group (salience), their evaluations of leadership effectiveness become more strongly influenced by the extent to which their demographic stereotype-based impressions of their leader match the norm of the group (prototypicality). Participants, with more or less traditional gender attitudes (orientation), were members, under high or low group salience conditions (salience), of non-interactive laboratory groups that had “instrumental” or “expressive” group norms (norm), and a male or female leader (leader gender). As predicted, these four variables interacted significantly to affect perceptions of leadership effectiveness. Reconfiguration of the eight conditions formed by orientation, norm and leader gender produced a single prototypicality variable. Irrespective of participant gender, prototypical leaders were considered more effective in high than in low salience groups, and in high salience groups prototypical leaders were more effective than less prototypical leaders. Alternative explanations based on status characteristics and role incongruity theory do not account well for the findings. Implications of these results for the glass ceiling effect and for a wider social identity analysis of the impact of demographic group membership on leadership in small groups are discussed.
Abstract:
The critical impact of innovation on national and global economies has been discussed at length in the literature. Economic development requires the diffusion of innovations into markets, and it has long been recognised that economic growth and development depend upon a constant stream of innovations. Governments have been keenly aware of the need to ensure this flow does not dry to a trickle, and have introduced many and varied industry policies and interventions to assist in seeding, supporting and diffusing innovations. In Australia, as in many countries, government support for the transfer of knowledge, especially from publicly funded research, has resulted in the creation of knowledge exchange intermediaries. These intermediaries are themselves service organisations, seeking innovative service offerings for their markets. The choice for most intermediaries is generally a dichotomous one, between market-pull and technology-push knowledge exchange programmes. In this article, we undertake a case analysis of one such innovative intermediary and its flagship programme. We then compare this case with other successful intermediaries in Europe. We put forward the research proposition that the design of intermediary programmes must match the service type they offer: market-pull programmes require market-pull design, in close collaboration with industry, whereas technology-push programmes can be problem-solving innovations where demand is latent. The discussion reflects the need for an evolution in knowledge transfer policies and programmes beyond the first generation ushered in with the US Bayh-Dole Act (1980) and Stevenson-Wydler Act (1984). The data analysed form a case study comparison of market-pull and technology-push programmes, focusing on primary and secondary socio-economic benefits (using both Australian and international comparisons).
Abstract:
In this thesis an investigation into theoretical models for the formation and interaction of nanoparticles is presented. The work includes a literature review of current models followed by five chapters of original research. The thesis has been submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy by publication, and each of the five chapters therefore consists of a peer-reviewed journal article. The thesis concludes with a discussion of what has been achieved during the PhD candidature, the potential applications of this research, and ways in which the research could be extended in the future. In this thesis we explore stochastic models pertaining to the interaction and evolution mechanisms of nanoparticles. In particular, we explore in depth the stochastic evaporation of molecules due to thermal activation and its ultimate effect on nanoparticle sizes and concentrations. Secondly, we analyse the thermal vibrations of nanoparticles suspended in a fluid and subject to standing oscillating drag forces (as would occur in a standing sound wave), and finally on lattice surfaces in the presence of high heat gradients. We describe a number of new models for multi-compartment networks joined by multiple, stochastically evaporating links. The primary motivation for this work is the description of thermal fragmentation, in which multiple molecules holding parts of a carbonaceous nanoparticle may evaporate. Ultimately, these models predict the rate at which the network or aggregate fragments into smaller networks/aggregates, and with what aggregate size distribution. The models are highly analytic and describe the fragmentation of a link holding multiple bonds using Markov processes chosen to best describe different physical situations; these processes have been analysed using a number of mathematical methods.
The fragmentation of the network/aggregate is then predicted using combinatorial arguments. Whilst there is some scepticism in the scientific community pertaining to the proposed mechanism of thermal fragmentation, we have presented compelling evidence in this thesis supporting the currently proposed mechanism and shown that our models can accurately match experimental results. This was achieved using a realistic simulation of the fragmentation of the fractal carbonaceous aggregate structure using our models. Furthermore, a method of manipulation using acoustic standing waves is investigated. In our investigation we analysed the effect of frequency and particle size on the ability of a particle to be manipulated by means of a standing acoustic wave. We report the existence of a critical frequency for a particular particle size; this frequency is inversely proportional to the Stokes time of the particle in the fluid. We also find that at large frequencies the subtle Brownian motion of even larger particles plays a significant role in the efficacy of the manipulation, owing to the decreasing size of the boundary layer between acoustic nodes. Our model uses a multiple-time-scale approach to calculate the long-term effects of the standing acoustic field on the particles interacting with the sound. These effects are then combined with the effects of Brownian motion to obtain a complete mathematical description of the particle dynamics in such acoustic fields. Finally, we develop a numerical routine for the description of "thermal tweezers". Currently, the technique of thermal tweezers is predominantly theoretical; however, there have been a handful of successful experiments which demonstrate the effect in practice. Thermal tweezers is the name given to the way in which particles can be manipulated on a lattice surface by careful selection of a heat distribution over the surface.
Typically, theoretical simulations of the effect can be rather time-consuming, with supercomputer facilities processing data over days or even weeks. Our alternative numerical method for simulating particle distributions pertaining to the thermal tweezers effect uses the Fokker-Planck equation to derive a quick numerical method for calculating the effective diffusion constant resulting from the lattice and the temperature. We then use this diffusion constant and solve the diffusion equation numerically using the finite volume method. This saves the algorithm from calculating many individual particle trajectories, since it describes the flow of the probability distribution of particles in a continuous manner. The alternative method outlined in this thesis can produce a larger quantity of accurate results on a household PC in a matter of hours, which is a substantial improvement on what was previously achievable.
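The routine described above amounts to evolving a probability distribution on a grid rather than tracking individual trajectories. A minimal, hypothetical illustration of a finite-volume update for the 1-D diffusion equation (uniform grid, zero-flux boundaries, with a constant D standing in for the lattice-and-temperature-derived effective diffusion constant of the thesis):

```python
def diffuse_1d(u, D, dx, dt, steps):
    # Explicit finite-volume update of du/dt = D * d2u/dx2 on a uniform grid.
    # Zero-flux (reflecting) boundaries conserve total probability mass.
    r = D * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for D*dt/dx^2 > 0.5"
    u = list(u)
    for _ in range(steps):
        new = u[:]
        for i in range(len(u)):
            left = u[i - 1] if i > 0 else u[i]            # mirror at left edge
            right = u[i + 1] if i < len(u) - 1 else u[i]  # mirror at right edge
            new[i] = u[i] + r * (left - 2 * u[i] + right)
        u = new
    return u
```

Starting from a point-like distribution, the mass stays constant while the peak spreads, which is the behaviour the continuum description trades trajectory sampling for.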
Abstract:
The middle years of schooling are increasingly recognised as a crucial stage in students' lives, one that has significant consequences for ongoing educational success. International research indicates that young adolescents benefit from programs designed especially for their needs, and the middle years have become an important reform issue for education systems. Teaching Middle Years offers a systematic overview of the philosophy, principles and issues in middle schooling. It includes contributions from academics and school-based practitioners on intellectual and emotional development in early adolescence, pedagogy, curriculum and assessment of middle years students. Written for teachers, student teachers, education leaders and policy makers, Teaching Middle Years is an essential resource for anyone involved in educating young adolescents. Teaching Middle Years is the first comprehensive Australian book to match and surpass the quality of many overseas publications.
Abstract:
Aims: To describe a local data linkage project to match hospital data with the Australian Institute of Health and Welfare (AIHW) National Death Index (NDI) to assess long-term outcomes of intensive care unit patients. Methods: Data were obtained from hospital intensive care and cardiac surgery databases on all patients aged 18 years and over admitted to either of two intensive care units at a tertiary-referral hospital between 1 January 1994 and 31 December 2005. Date of death was obtained from the AIHW NDI by probabilistic software matching, in addition to manual checking through hospital databases and other sources. Survival was calculated from time of ICU admission, with a censoring date of 14 February 2007. Data for patients with multiple hospital admissions requiring intensive care were analysed only from the first admission. Summary and descriptive statistics were used for preliminary data analysis. Kaplan-Meier survival analysis was used to analyse factors determining long-term survival. Results: During the study period, 21 415 unique patients had 22 552 hospital admissions that included an ICU admission; 19 058 surgical procedures were performed, with a total of 20 092 ICU admissions. There were 4936 deaths. Median follow-up was 6.2 years, totalling 134 203 patient-years. The casemix was predominantly cardiac surgery (80%), followed by cardiac medical (6%) and other medical (4%). The unadjusted survival at 1, 5 and 10 years was 97%, 84% and 70%, respectively. The 1-year survival ranged from 97% for cardiac surgery to 36% for cardiac arrest. An APACHE II score was available for 16 877 patients. In those discharged alive from hospital, the 1-, 5- and 10-year survival varied with discharge location. Conclusions: ICU-based linkage projects are a feasible way to determine long-term outcomes of ICU patients.
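The Kaplan-Meier analysis named in the Methods estimates survival as a running product, over observed event times, of (1 - deaths / number at risk), with censored patients leaving the risk set without contributing an event. A minimal sketch of the estimator on toy data (a real analysis of this scale would use a vetted statistics package, not hand-rolled code):

```python
def kaplan_meier(times, events):
    # times: follow-up time per patient; events: 1 = death observed, 0 = censored.
    # Returns [(event_time, survival_probability), ...] at each death time.
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    surv, curve, i = 1.0, [], 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = censored = 0
        while i < len(pairs) and pairs[i][0] == t:   # group ties at time t
            deaths += pairs[i][1]
            censored += 1 - pairs[i][1]
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= deaths + censored                 # both leave the risk set
    return curve
```

With four patients followed for 1, 2, 3 and 4 years and the year-2 patient censored, survival steps to 0.75, then 0.375, then 0.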
Abstract:
This paper explains, somewhat along a Simmelian line, that political theory may produce practical and universal theories like those developed in theoretical physics. The aim of this paper is to show that the Element of Democracy Theory may be true by comparing it to Einstein’s Special Relativity, specifically with respect to the parameters of symmetry, unification, simplicity and utility. These parameters are what validate a theory in physics, as meeting them not only fits with current knowledge but also produces paths towards testing (application). As the Element of Democracy Theory meets these same parameters, it could settle the debate concerning the definition of democracy. This will be shown firstly by discussing why no one has yet achieved a universal definition of democracy; secondly by explaining the parameters chosen (and why these and not others confirm or scuttle theories); and thirdly by comparing how Special Relativity and the Element of Democracy match the parameters.
Abstract:
Even for a casual observer of the journalism industry, it is becoming difficult to escape the conclusion that journalism is entering a time of crisis. At the same time that revenues and readerships for traditional publications, from newspapers to broadcast news, are declining, journalistic content is being overtaken by a flotilla of alternative options ranging from the news satire of The Daily Show in the United States to the citizen journalism of South Korea’s OhmyNews and a myriad of other news blogs and citizen journalism websites. Worse still, these new competitors to the journalism industry frequently take professional journalists themselves to task where their standards appear to have slipped, and are beginning to match the news industry’s incumbents in terms of insight and informational value: recent studies have shown, for example, that avid Daily Show viewers are as well informed about the U.S. political process as, if not better informed than, those who continue to follow mainstream print or television news (see e.g. Fox et al., 2007). The show’s host Jon Stewart – who has consistently maintained his self-description as a comedian, not a journalist – even took the fight directly to the mainstream with his appearance on CNN’s belligerent talk show Crossfire, repeatedly making the point that the show’s polarised and polarising ‘left vs. right’ format was “hurting” politics in America (the show disappeared from CNN’s line-up a few months after Stewart’s appearance; Stewart, 2004). Similarly, news bloggers and citizen journalists have shown persistence and determination both in uncovering political and other scandals and in highlighting the shortcomings of professional journalism as it investigates and reports on such scandals.
Abstract:
This paper illustrates the prediction of opponent behaviour in a competitive, highly dynamic, multi-agent and partially observable environment, namely RoboCup small size league robot soccer. The performance is illustrated in the context of the highly successful robot soccer team, the RoboRoos. The project is broken into three tasks: classification of behaviours, modelling and prediction of behaviours, and integration of the predictions into the existing planning system. A probabilistic approach is taken to dealing with the uncertainty in the observations and to representing the uncertainty in the prediction of the behaviours. Results are shown for a classification system using a Naïve Bayesian Network that determines the opponent’s current behaviour, and these results are compared to an expert-designed fuzzy behaviour classification system. The paper illustrates how the modelling system will use the information from behaviour classification to produce probability distributions that model the manner in which the opponents perform their behaviours. These probability distributions are shown to match well with the existing multi-agent planning system (MAPS) that forms the core of the RoboRoos system.
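A Naïve Bayesian classifier of the kind used for behaviour classification scores each candidate behaviour by combining per-feature likelihoods under an independence assumption. A minimal Gaussian sketch (the feature vectors and behaviour labels here are hypothetical, not the RoboRoos system's actual inputs or network structure):

```python
import math

def fit(samples):
    # samples: {behaviour_label: [[feature, ...], ...]}.
    # Learn a per-class (mean, variance) for each feature.
    model = {}
    for label, rows in samples.items():
        stats = []
        for col in zip(*rows):
            m = sum(col) / len(col)
            v = sum((x - m) ** 2 for x in col) / len(col) + 1e-6  # variance floor
            stats.append((m, v))
        model[label] = stats
    return model

def predict(model, features):
    # Pick the behaviour with the highest summed Gaussian log-likelihood
    # (naive assumption: features are independent given the behaviour).
    def log_likelihood(stats):
        return sum(-0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
                   for x, (m, v) in zip(features, stats))
    return max(model, key=lambda label: log_likelihood(model[label]))
```

A fuzzy classifier, by contrast, would encode the expert's membership functions by hand; the Bayesian version learns its parameters from observed play.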
Abstract:
Existing literature has failed to find robust relationships between individual differences and the ability to fake psychological tests, possibly due to limitations in how successful faking is operationalised. In order to fake, individuals must alter their original profile to create a particular impression. Currently, successful faking is operationalised through statistical definitions, informant ratings, known-groups comparisons, the use of archival and baseline data, and breaches of validity indexes. However, there are many methodological limitations to these approaches. This research proposed a three-component model of successful faking to address this, in which an original response is manipulated into a strategic response, which must match a criterion target. Further, by operationalising successful faking in this manner, this research takes into account the fact that individuals may have been successful in reaching their implicitly created profile, but that this may not have matched the criteria they were instructed to fake. Participants (N=48, 22 students and 26 non-students) completed the BDI-II honestly. Participants then faked the BDI-II as if they had no, mild, moderate and severe depression, as well as completing a checklist revealing which symptoms they thought indicated each level of depression. Findings were consistent with a three-component model of successful faking, in which individuals effectively changed their profile to what they believed was required; however, this profile differed from the criteria defined by the psychometric norms of the test. One of the foremost issues for research in this area is the inconsistent manner in which successful faking is operationalised. This research allowed successful faking to be operationalised in an objective, quantifiable manner.
Using this model as a template may give researchers a better understanding of the processes involved in faking, including the role of strategies and abilities in determining the outcome of test dissimulation.
Abstract:
Water environments are greatly valued in urban areas as ecological and aesthetic assets. However, it is the water environment that is most adversely affected by urbanisation. Urban land use coupled with anthropogenic activities alters the stream flow regime and degrades water quality, with urban stormwater being a significant source of pollutants. Unfortunately, urban water pollution is difficult to evaluate in terms of conventional monetary measures. True costs extend beyond the immediate human or physical boundaries of the urban area and affect the function of surrounding ecosystems. Current approaches for handling stormwater pollution and water quality issues in urban landscapes are limited, as they are primarily focused on ‘end-of-pipe’ solutions and are commonly based on insufficient design knowledge, faulty value judgements or inadequate consideration of full life-cycle costs. It is in this context that the adoption of a triple bottom line approach is advocated to safeguard urban water quality. The problem of degradation of urban water environments can only be remedied through innovative planning, water-sensitive engineering design and the foresight to implement sustainable practices. Sustainable urban landscapes must be designed to match the triple bottom line needs of the community, starting with ecosystem services such as the water cycle, then addressing social and immediate ecosystem health needs, and finally the economic performance of the catchment. This calls for a cultural change towards urban water resources rather than the current piecemeal, single-issue approach. This paper discusses the challenges in safeguarding urban water environments and the limitations of current approaches. It then explores the opportunities offered by integrating innovative planning practices with water engineering concepts into a single cohesive framework to protect valuable urban ecosystem assets.
Finally, a series of recommendations is proposed for protecting urban water resources within the context of a triple bottom line approach.
Abstract:
A mathematical model is developed to simulate the discharge of a LiFePO4 cathode. The model contains three size scales, which match experimental observations in the literature on the multi-scale nature of LiFePO4 material. A shrinking core is used on the smallest scale to represent the phase transition of LiFePO4 during discharge. The model is validated against existing experimental data, and this validated model is then used to investigate parameters that influence active material utilisation. Specifically, the size and composition of agglomerates of LiFePO4 crystals are discussed, and we investigate and quantify the relative effects that the ionic and electronic conductivities within the oxide have on oxide utilisation. We find that agglomerates of crystals can be tolerated under low discharge rates. The role of the electrolyte in limiting (cathodic) discharge is also discussed, and we show that electrolyte transport does limit performance at high discharge rates, confirming the conclusions of recent literature.
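In a shrinking-core picture, the untransformed core's volume is depleted at a rate set by the applied current, so for a spherical particle the core radius obeys dr/dt = -I / (Q · 4πr²), where Q is the charge stored per unit volume. The following is an illustrative Euler-step sketch of that single ingredient under a constant-current assumption; the symbols and the simplifications are mine, not the paper's full three-scale model.

```python
import math

def shrinking_core_radius(r0, current, capacity_per_volume, dt, steps):
    # Untransformed-core volume shrinks at rate current / capacity_per_volume:
    #   d/dt (4/3 * pi * r^3) = -current / capacity_per_volume
    #   =>  dr/dt = -current / (capacity_per_volume * 4 * pi * r^2)
    r, radii = r0, [r0]
    for _ in range(steps):
        if r > 0.0:
            r = max(r - current * dt / (capacity_per_volume * 4 * math.pi * r * r), 0.0)
        radii.append(r)
    return radii
```

Note the geometric effect this captures: the same current shrinks the radius faster as the core gets smaller, because the reacting interface area 4πr² is decreasing.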
Abstract:
Providing water infrastructure in times of accelerating climate change presents interesting new problems. Expanding demands must be met or managed in contexts of increasingly constrained sources of supply, raising ethical questions of equity and participation. Loss of agricultural land and natural habitats, the coastal impacts of desalination plants and concerns over the re-use of waste water must be weighed against demand management issues of water rationing, pricing mechanisms and inducing behaviour change. This case study examines how these factors impact on infrastructure planning in South East Queensland, Australia: a region with one of the developed world’s most rapidly growing populations, which has recently experienced the most severe drought in its recorded history. Proposals to match forecast demands and potential supplies for water over a 20-year period are reviewed by applying ethical principles to evaluate practical plans to meet the water needs of the region’s activities and settlements.