489 results for Strong-Field Phenomena
Abstract:
Mentoring has been the focus of both research and writing across a range of professional fields including, for example, education, business, medicine, nursing and law for decades. Even so, researchers have argued that much confusion continues to surround its meaning and understanding. Part of this confusion lies in the fact that it has been described in many ways. Some writing in the field focuses on it as a workplace activity for men and women, a developmental process for novices and leaders alike, a career tool for enhancing promotion, an affirmative action strategy for members of minority groups, and a human resource development strategy used in organisations (Ehrich and Hansford, 1999).
Abstract:
Technology platforms originally developed for tissue engineering applications produce valuable models that mimic three-dimensional (3D) tissue organization and function to enhance the understanding of cell/tissue function under normal and pathological situations. By replicating physiological and pathological conditions as closely as possible, these models allow investigators to probe the basic mechanisms of morphogenesis, differentiation and cancer. Significant efforts to investigate angiogenic processes and factors in tumorigenesis are currently being undertaken to establish ways of targeting angiogenesis in tumours. Anti-angiogenic agents have been accepted for clinical application as attractive targeted therapeutics for the treatment of cancer. Combining the areas of tumour angiogenesis, combination therapies and drug delivery systems is therefore closely related to understanding the basic principles applied in tissue engineering models. Studies with 3D model systems have repeatedly identified complex interacting roles of matrix stiffness and composition, integrins, growth factor receptors and signalling in development and cancer. These insights suggest that the plasticity, regulation and suppression of these processes can provide strategies and therapeutic targets for future cancer therapies. From a historical perspective, the merging of the fields of tissue engineering and controlled release of therapeutics, including inhibitors of angiogenesis in tumours, is becoming clearly evident as a major future advance. New delivery systems are expected to greatly enhance the ability to deliver drugs locally and in therapeutic concentrations to relevant sites in living organisms. Investigating the phenomena of angiogenesis and anti-angiogenesis in 3D in vivo models, such as the Arterio-Venous (AV) loop model in a separated and isolated chamber within a living organism, adds another significant horizon to this perspective and opens new modalities for translational research in this field.
Abstract:
Evidence-based Practice (EBP) has recently emerged as a topic of discussion amongst professionals within the library and information services (LIS) industry. Simply stated, EBP is the process of using formal research skills and methods to assist in decision making and establishing best practice. The emerging interest in EBP within the library context serves to remind the library profession that research skills and methods can help ensure that the library industry remains current and relevant in changing times. The LIS sector faces ongoing challenges in terms of the expectation that financial and human resources will be managed efficiently, particularly if library budgets are reduced and accountability to the principal stakeholders is increased. Library managers are charged with the responsibility to deliver relevant and cost effective services, in an environment characterised by rapidly changing models of information provision, information access and user behaviours. Consequently they are called upon not only to justify the services they provide, or plan to introduce, but also to measure the effectiveness of these services and to evaluate the impact on the communities they serve. The imperative for innovation in and enhancements to library practice is accompanied by the need for a strong understanding of the processes of review, measurement, assessment and evaluation. In 2001 the Centre for Information Research was commissioned by the Chartered Institute of Library and Information Professionals (CILIP) in the UK to conduct an examination into the research landscape for library and information science. The examination concluded that research is “important for the LIS [library and information science] domain in a number of ways” (McNicol & Nankivell, 2001, p.77). At the professional level, research can inform practice, assist in the future planning of the profession, raise the profile of the discipline, and indeed the reputation and standing of the library and information service itself. At the personal level, research can “broaden horizons and offer individuals development opportunities” (McNicol & Nankivell, 2001, p.77). The study recommended that “research should be promoted as a valuable professional activity for practitioners to engage in” (McNicol & Nankivell, 2001, p.82). This chapter will consider the role of EBP within the library profession. A brief review of key literature in the area is provided. The review considers issues of definition and terminology, highlights the importance of research in professional practice and outlines the research approaches that underpin EBP. The chapter concludes with a consideration of the specific application of EBP within the dynamic and evolving field of information literacy (IL).
Abstract:
In recent years, practitioners and researchers alike have turned their attention to knowledge management (KM) in order to increase organisational performance (OP). As a result, many different approaches and strategies have been investigated and suggested for how knowledge should be managed to make organisations more effective and efficient. However, most research has been undertaken in the for-profit sector, with only a few studies focusing on the benefits nonprofit organisations might gain by managing knowledge. This study broadly investigates the impact of knowledge management on the organisational performance of nonprofit organisations. Organisational performance can be evaluated through either financial or non-financial measurements. In order to evaluate knowledge management and organisational performance, non-financial measurements are argued to be more suitable given that knowledge is an intangible asset which often cannot be expressed through financial indicators. Non-financial measurement concepts of performance, such as the balanced scorecard or the concept of Intellectual Capital (IC), are well accepted and used within the for-profit and nonprofit sectors to evaluate organisational performance. This study utilised the concept of IC as the method to evaluate KM and OP in the context of nonprofit organisations due to the close link between KM and IC: KM is concerned with managing the KM processes of creating, storing, sharing and applying knowledge, and the organisational KM infrastructure, such as organisational culture or organisational structure, that supports these processes. IC, on the other hand, measures the knowledge stocks at different ontological levels: at the individual level (human capital), at the group level (relational capital) and at the organisational level (structural capital). In other words, IC measures the value of the knowledge which has been managed through KM. As KM encompasses the different KM processes and the KM infrastructure facilitating these processes, previous research has investigated the relationship between KM infrastructure and KM processes. Organisational culture, organisational structure and the level of IT support have been identified as the main factors of the KM infrastructure influencing the KM processes of creating, storing, sharing and applying knowledge. Other research has focused on the link between KM and OP or organisational effectiveness. Based on existing literature, a theoretical model was developed to enable the investigation of the relation between KM (encompassing KM infrastructure and KM processes) and IC. The model assumes an association between KM infrastructure and KM processes, as well as an association between KM processes and the various levels of IC (human capital, structural capital and relational capital). As a result, five research questions (RQ) with respect to the various factors of the KM infrastructure, as well as the relationship between KM infrastructure and IC, were raised and included in the research model:
RQ 1 Do nonprofit organisations which have a Hierarchy culture have stronger IT support than nonprofit organisations which have an Adhocracy culture?
RQ 2 Do nonprofit organisations which have a centralised organisational structure have stronger IT support than nonprofit organisations which have a decentralised organisational structure?
RQ 3 Do nonprofit organisations which have stronger IT support have a higher value of Human Capital than nonprofit organisations which have less strong IT support?
RQ 4 Do nonprofit organisations which have stronger IT support have a higher value of Structural Capital than nonprofit organisations which have less strong IT support?
RQ 5 Do nonprofit organisations which have stronger IT support have a higher value of Relational Capital than nonprofit organisations which have less strong IT support?
In order to investigate the research questions, measurements for IC were developed which were linked to the main KM processes. The final KM/IC model contained four items for evaluating human capital, five items for evaluating structural capital and four items for evaluating relational capital. The research questions were investigated through empirical research using a case study approach focused on two nonprofit organisations providing trade promotion services through local offices worldwide. Data for the investigation of the assumptions were collected via qualitative as well as quantitative research methods. The qualitative study included interviews with representatives of the two participating organisations as well as in-depth document research. The purpose of the qualitative study was to investigate the factors of the KM infrastructure (organisational culture, organisational structure, IT support) of the organisations and how these factors were related to each other. The quantitative study, on the other hand, was carried out through an online survey amongst staff of the various local offices. The purpose of the quantitative study was to investigate what impact the level of IT support, as the main instrument of the KM infrastructure, had on IC. Overall, several key themes emerged from the study:
• Knowledge Management and Intellectual Capital were complementary to each other, which should be expressed through measurements of IC based on KM processes.
• The various factors of the KM infrastructure (organisational culture, organisational structure and level of IT support) are interdependent.
• IT was a primary instrument through which the different KM processes (creating, storing, sharing and applying knowledge) were performed.
• A high level of IT support was evident when participants reported higher levels of IC (human capital, structural capital and relational capital).
The study supported previous research in the field of KM and replicated the findings from other case studies in this area. The study also contributed to theory by placing KM research within the nonprofit context and analysing the linkage between KM and IC. From the managerial perspective, the findings gave clear indications that would allow interested parties, such as nonprofit managers or consultants, to understand more about the implications of KM for OP and to use this knowledge to implement efficient and effective KM strategies within their organisations.
Abstract:
In the region of self-organized criticality (SOC), interdependency exists between multi-agent system components, and slight changes in near-neighbor interactions can break the balance of equally poised options, leading to transitions in system order. In this region, the frequency of events of differing magnitudes exhibits a power law distribution. The aim of this paper was to investigate whether a power law distribution characterized attacker-defender interactions in team sports. For this purpose, we observed an attacker and a defender in a dyadic sub-phase of rugby union near the try line. Videogrammetry was used to capture players’ motion over time as player locations were digitized. Power laws were calculated for the rate of change of the players’ relative position. Data revealed that the three emergent patterns from dyadic system interactions (i.e., try; unsuccessful tackle; effective tackle) displayed a power law distribution. Results suggested that the pattern-forming dynamics of dyads in rugby union exhibited SOC. It was concluded that rugby union dyads evolve in SOC regions, suggesting that players’ decisions and actions are governed by local interaction rules.
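As an illustration of the kind of power-law analysis described above, the following is a minimal sketch (not the authors' code) of estimating the exponent by a straight-line fit to the event-frequency histogram on log-log axes; the variable names, binning and synthetic data are assumptions.

```python
import numpy as np

def fit_power_law(magnitudes, n_bins=20):
    """Fit frequency ~ magnitude**(-alpha) by least squares in log-log space.

    magnitudes: 1-D array of event sizes (e.g. rates of change of the
    attacker-defender relative position); returns (alpha, intercept).
    """
    counts, edges = np.histogram(magnitudes, bins=n_bins)
    centres = 0.5 * (edges[:-1] + edges[1:])
    mask = counts > 0                       # avoid log(0)
    log_x = np.log10(centres[mask])
    log_y = np.log10(counts[mask])
    slope, intercept = np.polyfit(log_x, log_y, 1)
    return -slope, intercept                # alpha is the negative slope

# Toy usage with synthetic heavy-tailed data
rng = np.random.default_rng(0)
events = rng.pareto(1.5, size=5000) + 1.0
alpha, _ = fit_power_law(events)
print(f"estimated exponent: {alpha:.2f}")
```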
Abstract:
This paper describes the current status of a program to develop an automated forced landing system for a fixed-wing Unmanned Aerial Vehicle (UAV). This automated system seeks to emulate human pilot thought processes when planning for and conducting an engine-off emergency landing. Firstly, a path planning algorithm that extends Dubins curves to 3D space is presented. This planning element is then combined with nonlinear guidance and control logic, and simulated test results demonstrate the robustness of this approach to strong winds during a glided descent. The average path deviation errors incurred are comparable to, or even better than, those of manned, powered aircraft. Secondly, a study into suitable multi-criteria decision making approaches and the problems that confront the decision-maker is presented. From this study, it is believed that decision processes that utilize human expert knowledge and fuzzy logic reasoning are most suited to the problem at hand, and further investigations will be conducted to identify the particular technique/s to be implemented in simulations and field tests. The automated UAV forced landing approach presented in this paper is promising, and will allow the progression of this technology from the development and simulation stages through to a prototype system.
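The abstract does not detail the planning algorithm; purely as a hedged illustration, the sketch below shows one simple way a planar (Dubins-style) path could be lifted into 3D for an unpowered glide by assigning altitude loss along arc length at an assumed constant glide ratio. All names and parameters are hypothetical.

```python
import numpy as np

def lift_path_to_3d(xy_path, start_alt, glide_ratio=9.0):
    """Assign altitudes to a 2-D path for an unpowered descent.

    xy_path: (N, 2) array of waypoints in the horizontal plane [m].
    start_alt: altitude at the first waypoint [m].
    glide_ratio: horizontal distance travelled per unit altitude lost
                 (an assumed, aircraft-specific constant).
    Returns an (N, 3) array of x, y, z waypoints.
    """
    xy = np.asarray(xy_path, dtype=float)
    seg = np.linalg.norm(np.diff(xy, axis=0), axis=1)   # segment lengths
    arc = np.concatenate(([0.0], np.cumsum(seg)))        # cumulative distance
    z = start_alt - arc / glide_ratio                    # constant-slope descent
    return np.column_stack([xy, z])

# Toy usage: straight-line 1 km path starting from 200 m altitude
path2d = np.stack([np.linspace(0, 1000, 11), np.zeros(11)], axis=1)
path3d = lift_path_to_3d(path2d, start_alt=200.0)
print(path3d[-1])   # final waypoint and the altitude remaining there
```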
Abstract:
Two-stroke outboard boat engines using total loss lubrication deposit a significant proportion of their lubricant and fuel directly into the water. The purpose of this work is to document the velocity and concentration field characteristics of a submerged swirling water jet emanating from a propeller, in order to provide information on its fundamental characteristics. Measurements of the velocity and concentration fields were performed in a turbulent jet generated by a model boat propeller (0.02 m diameter) operating at 1500 rpm and 3000 rpm. The measurements were carried out in the Zone of Established Flow, up to 50 propeller diameters downstream of the propeller. Both the mean axial velocity profile and the mean concentration profile showed self-similarity. Further, the standard deviation growth curve was linear. The effects of propeller speed and dye release location were also investigated.
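For context, a commonly assumed self-similar form for such jets (a standard textbook expression, not a fit quoted from this paper) is a Gaussian profile whose width grows linearly with downstream distance:

```latex
% Illustrative self-similar jet profile (standard form, not the paper's fit)
\[
  \frac{U(x,r)}{U_c(x)} = \exp\!\left[-\left(\frac{r}{b(x)}\right)^{2}\right],
  \qquad b(x) = k\,(x - x_0),
\]
% U_c(x) is the centreline velocity, b(x) the local jet width, k the
% spreading rate and x_0 a virtual origin; the mean concentration profile
% is often described by the same functional form with a different width.
```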
Abstract:
As with the broader field of education research, most writing on the subject of school excursions and field trips has centred around progressive/humanist concerns for building pupils’ self-esteem and for the development of the ‘whole child’. Such research has also stressed the importance of a broad, grounded, and experiential curriculum - as exemplified by subjects containing these extra-school activities - as well as the possibility of strengthening the relationship between student and teacher. Arguing that this approach to the field trip is both exhausted of ideas and conceptually flawed, this paper proposes some alternative routes into the area for the prospective researcher. First, it is argued that by historicising the subject matter, it can be seen that school excursions are not simply the product of the contemporary humanist desire for diverse and fulfilling educational experiences; rather, they can, in part, be traced to eighteenth-century beliefs among the English gentry that travel formed a crucial component of a good education, to the advent of an affordable public rail system, and to school tours associated with the Temperance movement. Second, field trips can be understood from within the associated framework of concerns over the governance of tourism and the organisation of disciplinary apparatuses for the production of an educated and regulated citizenry. Far from being a simple learning experience, museums and art galleries form part of a complex of disciplinary and power relations designed to produce a populace with very specific capacities, aspirations and styles of public conduct. Finally, rather than allowing children ‘freedom’ from the constraints of the classroom, on the contrary, through the medium of the field trip, children can become accustomed to having their activities governed in the broader domain of the generalised community. School excursions thereby constitute an effective tactic through which young people have their conduct managed, and their social and scholastic identities shaped and administered.
Abstract:
There is increasing agreement that understanding complexity is important for project management because of difficulties associated with decision-making and goal attainment which appear to stem from complexity. However, the current operational definitions of complex projects, based upon size and budget, have been challenged, and questions have been raised about how complexity can be measured in a robust manner that takes account of structural, dynamic and interaction elements. Thematic analysis of data from 25 in-depth interviews of project managers involved with complex projects, together with an exploration of the literature, reveals a wide range of factors that may contribute to project complexity. We argue that these factors contributing to project complexity may be defined in terms of dimensions, or source characteristics, which are in turn subject to a range of severity factors. In addition to investigating definitions and models of complexity from the literature and in the field, this study also explores the problematic issues of ‘measuring’ or assessing complexity. A research agenda is proposed to further the investigation of phenomena reported in this initial study.
Theoretical and numerical investigation of plasmon nanofocusing in metallic tapered rods and grooves
Abstract:
Effective focusing of electromagnetic (EM) energy to nanoscale regions is one of the major challenges in nano-photonics and plasmonics. The strong localization of optical energy into regions much smaller than allowed by the diffraction limit, also called nanofocusing, offers promising applications in nano-sensor technology, nanofabrication, near-field optics and spectroscopy. One of the most promising solutions to the problem of efficient nanofocusing is related to surface plasmon propagation in metallic structures. Metallic tapered rods, commonly used as probes in near-field microscopy and spectroscopy, are of particular interest. They can provide very strong EM field enhancement at the tip due to surface plasmons (SPs) propagating towards the tip of the tapered metal rod. A large number of studies have been devoted to the manufacturing process of tapered rods or tapered fibers coated with a metal film. On the other hand, structures such as metallic V-grooves or metal wedges can also provide strong electric field enhancements, but manufacturing these structures is still a challenge. It has been shown, however, that the attainable electric field enhancement at the apex of a V-groove is higher than at the tip of a metal tapered rod when the dissipation level in the metal is strong. Metallic V-grooves also have very promising characteristics as plasmonic waveguides. This thesis will present a thorough theoretical and numerical investigation of nanofocusing during plasmon propagation along a metal tapered rod and into a metallic V-groove. Optimal structural parameters, including optimal taper angle, taper length and shape of the taper, are determined in order to achieve maximum field enhancement factors at the tip of the nanofocusing structure. An analytical investigation of plasmon nanofocusing by metal tapered rods is carried out by means of the geometric optics approximation (GOA), also called adiabatic nanofocusing. However, GOA is applicable only for analysing tapered structures with small taper angles, and without considering a terminating tip structure, so that reflections can be neglected. Rigorous numerical methods are employed for analysing non-adiabatic nanofocusing by tapered rods and V-grooves with larger taper angles and rounded tips. These structures cannot be studied by analytical methods due to the presence of waves reflected from the taper section, the tip and also from (artificial) computational boundaries. A new method is introduced to combine the advantages of GOA and rigorous numerical methods in order to significantly reduce the use of computational resources and yet achieve accurate results for the analysis of large tapered structures within reasonable calculation time. A detailed comparison between GOA and rigorous numerical methods will be carried out in order to find the critical taper angle of the tapered structures at which GOA is still applicable. It will be demonstrated that the optimal taper angles, at which maximum field enhancements occur, coincide with the critical angles at which GOA is still applicable. It will be shown that the applicability of GOA can be substantially expanded to include structures which previously could be analysed by numerical methods only. The influence of the rounded tip, the taper angle and dissipation on the plasmon field distribution along the tapered rod and near the tip will be analysed analytically and numerically in detail.
It will be demonstrated that electric field enhancement factors of up to ~2500 within nanoscale regions are predicted. These are sufficient, for instance, to detect single molecules using surface-enhanced Raman spectroscopy (SERS) with the tip of a tapered rod, an approach also known as tip-enhanced Raman spectroscopy (TERS). The results obtained in this project will be important for applications in which strong local field enhancement factors are crucial to device performance, such as near-field microscopy or spectroscopy. The optimal design of nanofocusing structures, for which the delivery of electromagnetic energy to the nanometer region is most efficient, will lead to new applications in near-field sensors, near-field measuring technology, and the generation of nanometer-sized energy sources. This includes applications in tip-enhanced Raman spectroscopy (TERS); manipulation of nanoparticles and molecules; efficient coupling of optical energy into and out of plasmonic circuits; second harmonic generation in non-linear optics; and delivery of energy to quantum dots, for instance, for quantum computations.
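For reference, the applicability of the geometric optics approximation (adiabatic nanofocusing) discussed above is commonly expressed through an adiabatic parameter that must stay small along the taper; the criterion below is the standard form from the nanofocusing literature, quoted as background rather than from this thesis:

```latex
% Standard adiabatic (GOA) criterion for plasmon nanofocusing
\[
  \delta(z) \;=\; \left| \frac{d}{dz}\!\left(\frac{1}{\operatorname{Re}\,q(z)}\right) \right| \;\ll\; 1 ,
\]
% where q(z) is the local wavenumber of the surface plasmon at position z
% along the taper; GOA (adiabatic nanofocusing) is justified only while
% delta stays well below unity, which bounds the usable taper angle.
```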
Abstract:
Suggestions that peripheral imagery may affect the development of refractive error have led to interest in the variation in refraction and aberration across the visual field. It is shown that, if the optical system of the eye is rotationally symmetric about an optical axis which does not coincide with the visual axis, measurements of refraction and aberration made along the horizontal and vertical meridians of the visual field will show asymmetry about the visual axis. The departures from symmetry are modelled for second-order aberrations, refractive components and third-order coma. These theoretical results are compared with practical measurements from the literature. The experimental data support the concept that departures from symmetry about the visual axis in the measurements of crossed-cylinder astigmatism J45 and J180 are largely explicable in terms of a decentred optical axis. Measurements of the mean sphere M suggest, however, that the retinal curvature must differ in the horizontal and vertical meridians.
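As a schematic illustration of the argument (assuming, purely for illustration, that an aberration term grows quadratically with the field angle measured from the optical axis), decentring the optical axis by an angle theta_0 relative to the visual axis gives, along a visual-field meridian:

```latex
% Illustrative decentration model (assumed quadratic field dependence)
\[
  J(\theta) \;=\; a\,(\theta - \theta_0)^{2} + c
          \;=\; a\,\theta^{2} \;-\; 2a\,\theta_0\,\theta \;+\; \left(a\,\theta_0^{2} + c\right),
\]
% where theta is the field angle measured from the visual axis and
% theta_0 the decentration of the optical axis. The linear term
% -2 a theta_0 theta is what makes the measured variation asymmetric
% about the visual axis (theta = 0) even for a rotationally symmetric eye.
```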
Abstract:
A strong designated verifier signature scheme makes it possible for a signer to convince a designated verifier that she has signed a message, in such a way that the designated verifier cannot transfer the signature to a third party, and no third party can even verify the validity of a designated verifier signature. We show that anyone who intercepts one signature can verify subsequent signatures in the Zhang-Mao ID-based designated verifier signature scheme and the Lal-Verma ID-based designated verifier proxy signature scheme. We propose a new and efficient ID-based designated verifier signature scheme that is strong and unforgeable. As a direct corollary, we also obtain a new, efficient ID-based designated verifier proxy signature scheme.
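To illustrate only the non-transferability idea described above (this is not the ID-based scheme the abstract proposes), a classic textbook construction makes the "signature" a MAC under a Diffie-Hellman key shared by the signer and the designated verifier: the verifier could have produced the same value herself, so showing it to anyone else proves nothing. A toy sketch with assumed, insecure parameters:

```python
import hashlib

# Toy public parameters: a small known Mersenne prime and an arbitrary
# base; insecure sizes, for illustration only.
P = 2**61 - 1
G = 5

def keygen(secret):
    """Public key g^x mod p for a private exponent x."""
    return pow(G, secret, P)

def dv_sign(message, signer_secret, verifier_public):
    """Signer MACs the message under the shared Diffie-Hellman key g^(xy)."""
    shared = pow(verifier_public, signer_secret, P)
    return hashlib.sha256(str(shared).encode() + message).hexdigest()

def dv_verify(message, sigma, verifier_secret, signer_public):
    """The designated verifier recomputes the same MAC from her own secret.

    Because she could equally have created sigma herself, the signature
    convinces her but cannot be transferred convincingly to a third party.
    """
    shared = pow(signer_public, verifier_secret, P)
    return sigma == hashlib.sha256(str(shared).encode() + message).hexdigest()

# Toy usage
x, y = 123456789, 987654321                      # signer and verifier secrets
sig = dv_sign(b"pay 10", x, keygen(y))
print(dv_verify(b"pay 10", sig, y, keygen(x)))   # True
```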
Abstract:
This thesis proposes that contemporary printmaking, at its most significant, marks the present through reconstructing pasts and anticipating futures. It argues this through examples in the field occurring in contexts beyond the Euramerican (Europe and North America). The arguments revolve around how the practice of a number of significant artists in Japan, Australia and Thailand has generated conceptual and formal innovations in printmaking that transcend local histories and conventions whilst, paradoxically, also building upon them and creating new meanings. The arguments do not portray the relations between contemporary and traditional art as necessarily antagonistic but rather as productively dialectical. Furthermore, the case studies demonstrate that, in the 1980s and 1990s particularly, the studio practice of these printmakers was informed by other visual arts disciplines and reflected postmodern concerns. Departures from convention witnessed in these countries within the Asia-Pacific region shifted the field of the print into a heterogeneous and hybrid realm. The practitioners concerned (especially in Thailand) produced work that was more readily equated with performance and installation art than with printmaking per se. In Japan, the incursion of photography interrupted the decorative cast of printmaking and delivered it from a straightforward, craft-based aesthetic. In Australia, fixed notions of national identity were challenged by print practitioners through deliberate cultural rapprochements and technical contradictions (speaking across old and new languages). However, time-honoured print methods were not jettisoned by any of the case study artists. Their re-alignment of the fundamental attributes of printmaking, in line with materialist formalism, is a core consideration of my arguments. The artists selected for in-depth analysis from these three countries are all innovators whose geographical circumstances and creative praxis drew on local traditions whilst absorbing international trends. In their radical revisionism, they acknowledged the specificity of history and place, conditions of contingency and forces of globalisation. The transformational nature of their work during the late twentieth century connects it to the postmodern ethos and to a broader artistic and cultural nexus than has hitherto been recognised in literature on the print. Emerging from former guild-based practices, they ambitiously conceived their work to be part of a continually evolving visual arts vocabulary. I argue in this thesis that artists from the Asia-Pacific region have historically broken with the hermetic and Euramerican focus that has generally characterised the field. Inadequate documentation of, and access to, print activity outside the dominant centres of critical discourse imply that readings of postmodernism have been too limited in their scope of inquiry. Other locations offer complexities of artistic practice where re-alignments of customary boundaries are often the norm. By addressing innovative activity in Japan, Australia and Thailand, this thesis exposes the need for a more inclusive theoretical framework and a wider global reach than currently exists for ‘printmaking’.
Abstract:
The loss of valuable water resources due to pipe failure has become a major problem in Australia, especially in areas under high levels of water restrictions. Generally, pipe failure occurs due to a combination of physical and environmental factors. Stresses induced by the shrinking and swelling of reactive soils are one of the major factors affecting the performance of buried pipes. This paper presents the details of field instrumentation undertaken to monitor the performance of an in-service water reticulation pipe buried in a reactive soil and subjected to seasonal climatic changes.
Abstract:
Human hair is a relatively inert biopolymer and can survive through natural disasters. It is also found as trace evidence at crime scenes. Previous studies by FTIR-Microspectroscopy and -Attenuated Total Reflectance (ATR), interpreted by chemometrics, successfully showed that hairs can be matched and discriminated on the basis of gender, race and hair treatment. However, these spectroscopic techniques are difficult to operate at- or on-field. On the other hand, some near infrared spectroscopic (NIRS) instruments, equipped with an optical probe, are portable and thus facilitate on- or at-field measurements for potential application directly at a crime or disaster scene. This thesis is focused on bulk hair samples, which are free of their roots and thus independent of any potential DNA contribution to identification. It explores the building of a profile of an individual with the use of the NIRS technique on the basis of information on gender, race and treated hair, i.e. variables which can match and discriminate individuals. The complex spectra collected may be compared and interpreted with the use of chemometrics. These methods can then be used as a protocol for further investigations. Water is a common substance present at forensic scenes, e.g. at home in a bath or a swimming pool; it is also common outdoors in the sea, rivers, dams and puddles, and especially during DVI incidents at the seashore after a tsunami. For this reason, the matching and discrimination of bulk hair samples after water immersion treatment were also explored. Through this research, it was found that near infrared spectroscopy, with the use of an optical probe, successfully matched and discriminated bulk hair samples to build a profile for possible application to a crime or disaster scene; interpreted through chemometrics, such characteristics included gender and race. A novel approach was to measure the spectra not only in the usual NIR range (4000 – 7500 cm-1) but also in the Visible NIR (7500 – 12800 cm-1). This proved to be particularly useful in exploring the discrimination of differently coloured hair, e.g. naturally coloured, bleached or dyed. The NIR region is sensitive to molecular vibrations of the hair fibre structure, as well as those of the dyes and of the damage from bleaching, whereas the Visible NIR region preferentially responds to the natural colourants, the melanins, which involve electronic transitions. This approach was shown to provide improved discrimination between dyed and untreated hair. This thesis is an extensive study of the application of NIRS, with the aid of chemometrics, for the matching and discrimination of bulk human scalp hair. The work not only indicates the strong potential of this technique in this field but also breaks new ground with the exploration of the use of the NIR and Visible NIR ranges for spectral sampling. It also develops methods for measuring spectra from hair which has been immersed in different water media (sea, river and dam).
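As a hedged illustration of the chemometric workflow referred to above (the specific algorithms, spectra and labels here are assumptions made for the sketch, not the thesis data), a typical approach compresses the correlated spectral variables with PCA and then discriminates groups such as treatment classes:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical data: rows = hair samples, columns = absorbance values at
# NIR/Vis-NIR wavenumbers; labels could encode gender, race or treatment.
rng = np.random.default_rng(1)
spectra = rng.normal(size=(60, 300))        # placeholder spectra
labels = np.repeat(["untreated", "dyed", "bleached"], 20)

# Standardise, compress the correlated spectral variables with PCA,
# then discriminate the groups with LDA; score by cross-validation.
model = make_pipeline(StandardScaler(), PCA(n_components=10),
                      LinearDiscriminantAnalysis())
scores = cross_val_score(model, spectra, labels, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```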