900 results for Image processing, Microscopy, Histopathology, Classification, K-means
Abstract:
Neuropathy is a cause of significant disability in patients with Fabry disease, yet its diagnosis is difficult. In this study we compared the novel noninvasive techniques of corneal confocal microscopy (CCM) to quantify small-fiber pathology, and non-contact corneal esthesiometry (NCCA) to quantify loss of corneal sensation, with established tests of neuropathy in patients with Fabry disease. Ten heterozygous females with Fabry disease not on enzyme replacement therapy (ERT), 6 heterozygous females, 6 hemizygous males on ERT, and 14 age-matched, healthy volunteers underwent detailed quantification of neuropathic symptoms, neurological deficits, neurophysiology, quantitative sensory testing (QST), NCCA, and CCM. All patients with Fabry disease had significant neuropathic symptoms and an elevated Mainz score. Peroneal nerve amplitude was reduced in all patients and vibration perception threshold was elevated in both male and female patients on ERT. Cold sensation (CS) threshold was significantly reduced in both male and female patients on ERT (P < 0.02), but warm sensation (WS) and heat-induced pain (HIP) were only significantly increased in males on ERT (P < 0.01). However, corneal sensation assessed with NCCA was significantly reduced in female (P < 0.02) and male (P < 0.04) patients on ERT compared with control subjects. According to CCM, corneal nerve fiber and branch density was significantly reduced in female (P < 0.03) and male (P < 0.02) patients on ERT compared with control subjects. Furthermore, the severity of neuropathic symptoms and the neurological component of the Mainz Severity Score Index correlated significantly with QST and CCM. This study shows that CCM and NCCA provide a novel means to detect early nerve fiber damage and dysfunction, respectively, in patients with Fabry disease.
Abstract:
The importance of student engagement for higher education quality, for making deep learning outcomes possible, and for student retention is increasingly being understood. The issue of student engagement in the first year of tertiary study is of particular significance. This paper takes the position that the first year curriculum, and the pedagogical principles that inform its design, are critical influencers of student engagement in the first year learning environment. We use an analysis of case studies prepared for Kift’s ALTC Senior Fellowship to demonstrate ways in which student engagement in the first year of tertiary study can be successfully supported through intentional curriculum design that motivates students to learn, provides a positive learning climate, and encourages students to be active in their learning.
Abstract:
In recent years, Business Process Management (BPM) has become a high-priority area for most organizations. Since this concept is multidisciplinary, success in this endeavour requires considering different factors. A number of studies have been conducted to identify these factors; however, most are limited to the introduction of high-level factors or to the identification of the means of success within only a specific context. This paper presents a holistic framework of success factors as well as the associated means for achieving success. The framework introduces nine factors, namely culture, leadership, communication, Information Technology, strategic alignment, people, project management, performance measurement and methodology. Each of these factors is characterized further by defining sub-constructs, and under each sub-construct the means for achieving success are also introduced. This framework, including the means for achieving success, can be useful to BPM project stakeholders in adequately planning the initiative and checking progress during implementation.
Abstract:
A major focus of research in nanotechnology is the development of novel, high-throughput techniques for fabrication of arbitrarily shaped surface nanostructures of sub-100 nm to atomic scale. A related pursuit is the development of simple and efficient means for parallel manipulation and redistribution of adsorbed atoms, molecules and nanoparticles on surfaces – adparticle manipulation. These techniques will be used for the manufacture of nanoscale surface-supported functional devices in nanotechnologies such as quantum computing, molecular electronics and lab-on-a-chip, as well as for modifying surfaces to obtain novel optical, electronic, chemical, or mechanical properties. A favourable approach to formation of surface nanostructures is self-assembly. In self-assembly, nanostructures are grown by aggregation of individual adparticles that diffuse by thermally activated processes on the surface. The passive nature of this process means it is generally not suited to formation of arbitrarily shaped structures. The self-assembly of nanostructures at arbitrary positions has been demonstrated, though this has typically required a pre-patterning treatment of the surface using sophisticated techniques such as electron beam lithography. On the other hand, a parallel adparticle manipulation technique would be suited for directing the self-assembly process to occur at arbitrary positions, without the need for pre-patterning the surface. There is at present a lack of techniques for parallel manipulation and redistribution of adparticles to arbitrary positions on the surface. This is an issue that needs to be addressed since these techniques can play an important role in nanotechnology. In this thesis, we propose such a technique – thermal tweezers. In thermal tweezers, adparticles are redistributed by localised heating of the surface. This locally enhances surface diffusion of adparticles so that they rapidly diffuse away from the heated regions. Using this technique, the redistribution of adparticles to form a desired pattern is achieved by heating the surface at specific regions. In this project, we have focussed on the holographic implementation of this approach, where the surface is heated by holographic patterns of interfering pulsed laser beams. This implementation is suitable for the formation of arbitrarily shaped structures; the only condition is that the shape can be produced by holographic means. In the simplest case, the laser pulses are linearly polarised and intersect to form an interference pattern that is a modulation of intensity along a single direction. Strong optical absorption at the intensity maxima of the interference pattern results in an approximately sinusoidal variation of the surface temperature along one direction. The main aim of this research project is to investigate the feasibility of the holographic implementation of thermal tweezers as an adparticle manipulation technique. Firstly, we investigate theoretically the surface diffusion of adparticles in the presence of sinusoidal modulation of the surface temperature. Very strong redistribution of adparticles is predicted when there is strong interaction between the adparticle and the surface, and the amplitude of the temperature modulation is ~100 K. We have proposed a thin metallic film deposited on a glass substrate heated by interfering laser beams (optical wavelengths) as a means of generating a very large amplitude of surface temperature modulation.
Indeed, we predict theoretically by numerical solution of the thermal conduction equation that the amplitude of the temperature modulation on the metallic film can be much greater than 100 K when heated by nanosecond pulses with an energy ~1 mJ. The formation of surface nanostructures of less than 100 nm in width is predicted at optical wavelengths in this implementation of thermal tweezers. Furthermore, we propose a simple extension to this technique where a spatial phase shift of the temperature modulation effectively doubles or triples the resolution. At the same time, increased resolution is predicted by reducing the wavelength of the laser pulses. In addition, we present two distinctly different, computationally efficient numerical approaches for theoretical investigation of surface diffusion of interacting adparticles – the Monte Carlo Interaction Method (MCIM) and the random potential well method (RPWM). Using each of these approaches we have investigated thermal tweezers for redistribution of both strongly and weakly interacting adparticles. We have predicted that strong interactions between adparticles can increase the effectiveness of thermal tweezers, by demonstrating practically complete adparticle redistribution into the low-temperature regions of the surface. This is promising from the point of view of thermal tweezers applied to directed self-assembly of nanostructures. Finally, we present a new and more efficient numerical approach to theoretical investigation of thermal tweezers for non-interacting adparticles. In this approach, the local diffusion coefficient is determined from solution of the Fokker-Planck equation. The diffusion equation is then solved numerically using the finite volume method (FVM) to directly obtain the probability density of adparticle position. We compare predictions of this approach to those of the Ermak algorithm solution of the Langevin equation, and relatively good agreement is shown at intermediate and high friction. In the low friction regime, we predict and investigate the phenomenon of ‘optimal’ friction and describe its occurrence due to very long jumps of adparticles as they diffuse from the hot regions of the surface. Future research directions, both theoretical and experimental, are also discussed.
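The core mechanism described above, thermally activated hopping whose rate follows a sinusoidally modulated surface temperature, can be illustrated with a minimal one-dimensional kinetic Monte Carlo sketch in Python. This is not the MCIM, RPWM or Fokker-Planck/FVM approach developed in the thesis; the activation energy, attempt frequency, temperatures and lattice size are assumed illustrative values, and adparticle interactions are ignored.

```python
import numpy as np

# Minimal 1D sketch of thermally activated hopping on a surface whose
# temperature is modulated sinusoidally. Illustrative assumptions only:
# not the thesis's MCIM/RPWM methods, and interactions are ignored.
kB = 8.617e-5          # Boltzmann constant, eV/K
E_a = 0.5              # hop activation energy, eV (assumed)
nu = 1e12              # attempt frequency, 1/s (assumed)
T0, dT = 300.0, 100.0  # mean surface temperature and modulation amplitude, K
L = 200                # lattice sites covering one modulation period

x = np.arange(L)
T = T0 + dT * np.sin(2 * np.pi * x / L)   # sinusoidal surface temperature
rate = nu * np.exp(-E_a / (kB * T))       # Arrhenius hop rate at each site

rng = np.random.default_rng(0)
pos = rng.integers(0, L, size=5000)       # 5000 non-interacting adparticles
dt = 0.1 / rate.max()                     # keep per-step hop probability small

for _ in range(50000):
    hops = rng.random(pos.size) < rate[pos] * dt     # which particles hop
    step = rng.choice([-1, 1], size=pos.size)        # random hop direction
    pos = np.where(hops, (pos + step) % L, pos)      # periodic boundaries

# Adparticles deplete the hot regions and pile up where T(x) is lower.
counts = np.bincount(pos, minlength=L)
print("occupancy at hottest site:", counts[L // 4])
print("occupancy at coldest site:", counts[3 * L // 4])
```

The steep Arrhenius dependence of the hop rate on the local temperature drives the redistribution: for the assumed values the rate at the hottest sites exceeds that at the coldest sites by several orders of magnitude, so adparticles escape the heated regions far faster than they return.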
Abstract:
An alternative approach to port decoupling and matching of arrays with tightly coupled elements is proposed. The method is based on the inherent decoupling effect obtained by feeding the orthogonal eigenmodes of the array. For this purpose, a modal feed network is connected to the array. The decoupled external ports of the feed network may then be matched independently by using conventional matching circuits. Such a system may be used in digital beam forming applications with good signal-to-noise performance. The theory is applicable to arrays with an arbitrary number of elements, but implementation is only practical for smaller arrays. The principle is illustrated by means of two examples.
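As a rough illustration of the decoupling principle: for a reciprocal two-element array, the orthogonal eigenvectors of the mutual impedance matrix define the eigenmodes, and exciting the ports with these weight vectors (the job of the modal feed network, e.g. a 180-degree hybrid for two elements) yields decoupled modal ports whose impedances can be matched independently. The 2x2 impedance matrix in the sketch below is an assumed example, not data from the paper.

```python
import numpy as np

# Illustrative sketch of eigenmode decoupling for a coupled two-element
# array. The mutual impedance matrix Z below is an assumed example.
Z0 = 50.0                                   # reference impedance, ohms
Z = np.array([[40.0 + 10j, 25.0 - 5j],      # [self, mutual; mutual, self]
              [25.0 - 5j, 40.0 + 10j]])

# For a reciprocal, symmetric array the orthogonal eigenvectors of Z define
# the eigenmodes (here simply the even and odd modes). A modal feed network
# excites the ports with these weight vectors.
_, Q = np.linalg.eigh(Z.real)               # orthonormal eigenvectors
Zm = Q.T @ Z @ Q                            # modal impedance matrix

print("modal impedance matrix (off-diagonal terms vanish):")
print(np.round(Zm, 6))

# Each decoupled modal port can now be matched independently with a
# conventional single-port circuit designed for its modal impedance.
for k, zk in enumerate(np.diag(Zm)):
    gamma = (zk - Z0) / (zk + Z0)           # modal reflection coefficient
    print(f"mode {k}: Z = {zk:.1f} ohm, |Gamma| unmatched = {abs(gamma):.3f}")
```

The diagonal modal impedance matrix reflects the decoupling; the same construction applies to larger arrays, although, as the abstract notes, the modal feed network is only practical for smaller arrays.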
Abstract:
Residential aged care in Australia does not have a system of quality assessment related to clinical outcomes, creating a significant gap in quality monitoring. Clinical outcomes represent the results of all inputs into care, thus providing an indication of the success of those inputs. To fill this gap, an assessment tool based on resident outcomes (the ResCareQA) was developed and evaluated in collaboration with residential care providers. A useful output of the ResCareQA is a profile of resident clinical status, and this paper will use such outputs to present a snapshot of nine residential facilities. Such comprehensive data has not yet been available within Australia, so this will provide an important insight. ResCareQA data was collected from all residents (N=498) of nine aged care facilities from two major aged care providers. For each facility, numerator–denominator data were calculated to assess the degree of potential clinical problems. Results varied across clinical areas and across facilities, and rank-ordered facility results for selected clinical areas are reviewed and discussed. Use of the ResCareQA to generate clinical outcome data provides a concrete means of monitoring care quality within residential facilities; regular use of the ResCareQA could thus contribute to improved care outcomes within residential aged care.
Abstract:
Cyber bullying – bullying through the use of technology – is a growing phenomenon that is currently most commonly experienced by young people, with its consequences often manifested in schools. Cyber bullying shares many of the same attributes as face-to-face bullying, such as a power imbalance and a sense of helplessness on the part of the target. Not surprisingly, targets of face-to-face bullying are increasingly turning to the law, and it is likely that targets of cyber bullying may also do so in an appropriate case. This article examines the various criminal, civil and vilification laws that may apply to cases of cyber bullying and assesses the likely effectiveness of these laws as a means of redressing the power imbalance between perpetrator and target.
Abstract:
Powerful brands create meaningful images in the minds of customers (Keller, 1993). A strong brand image and reputation enhances differentiation and has a positive influence on buying behaviour (Gordon et al., 1993; McEnally and de Chernatony, 1999). While the power of branding is widely acknowledged in consumer markets, the nature and importance of branding in industrial markets remains under-researched. Many business-to-business (B2B) strategists have claimed brand-building belongs in the consumer realm. They argue that industrial products do not need branding as it is confusing and adds little value to functional products (Collins, 1977; Lorge, 1998; Saunders and Watt, 1979). Others argue, however, that branding and the concept of brand equity are increasingly important in industrial markets, because it has been shown that what a brand means to a buyer can be a determining factor in deciding between industrial purchase alternatives (Aaker, 1991). In this context, it is critical for suppliers to initiate and sustain relationships due to the small number of potential customers (Ambler, 1995; Webster and Keller, 2004). To date, however, there is no model available to assist B2B marketers in identifying and measuring brand equity. In this paper, we take a step in that direction by operationalising and empirically testing a prominent brand equity model in a B2B context. This not only makes a theoretical contribution by advancing branding research, but also addresses a managerial need for information that will assist in the assessment of industrial branding efforts.
Abstract:
This paper describes the approach taken to the clustering task at INEX 2009 by a group at the Queensland University of Technology. The Random Indexing (RI) K-tree has been used with a representation that is based on the semantic markup available in the INEX 2009 Wikipedia collection. The RI K-tree is a scalable approach to clustering large document collections. This approach has produced quality clustering when evaluated using two different methodologies.
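As a hedged sketch of the Random Indexing step only (the K-tree itself is a height-balanced clustering tree and is not reproduced here): each term is assigned a sparse ternary index vector, each document vector is the sum of the index vectors of its terms, and the resulting fixed-dimensional vectors are then clustered. The dimensionality, the sparsity, the use of plain tokens rather than the semantic-markup representation described in the paper, and the substitution of scikit-learn's k-means for the K-tree are all assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Minimal Random Indexing (RI) sketch: every term gets a sparse ternary
# index vector, and a document vector is the normalised sum of the index
# vectors of its terms. Parameters and k-means substitution are assumed.
DIM, NONZERO = 500, 10
rng = np.random.default_rng(42)
index_vectors = {}

def index_vector(term):
    """Create (on first use) and return the sparse ternary vector for a term."""
    if term not in index_vectors:
        v = np.zeros(DIM)
        slots = rng.choice(DIM, size=NONZERO, replace=False)
        v[slots] = rng.choice([-1.0, 1.0], size=NONZERO)
        index_vectors[term] = v
    return index_vectors[term]

def document_vector(tokens):
    """Sum the index vectors of a document's tokens and normalise."""
    v = sum((index_vector(t) for t in tokens), np.zeros(DIM))
    norm = np.linalg.norm(v)
    return v / norm if norm else v

docs = [
    "image processing microscopy histopathology classification".split(),
    "k means clustering of document collections".split(),
    "image segmentation and classification in microscopy".split(),
    "scalable document clustering with tree structures".split(),
]
X = np.vstack([document_vector(d) for d in docs])

# Cluster the reduced vectors (the INEX submission used a K-tree instead).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```

Because the index vectors are nearly orthogonal, the fixed-dimensional document vectors approximately preserve term-overlap similarity, which is what makes the representation scalable to large collections such as the INEX 2009 Wikipedia corpus.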
Abstract:
Starbug is an inexpensive, miniature autonomous underwater vehicle (AUV) ideal for data collection and ecosystem surveys. Starbug is small enough to be launched by one person without the need for specialised equipment, such as cranes, and it operates with minimal to no human intervention. Starbug was one of the first AUVs in the world in which vision is the primary means of navigation and control. More details of Starbug can be found here: http://www.csiro.au/science/starbug.html
Abstract:
In a consumerist society obsessed with body image and thinness, obesity levels have reached an all-time high. This multi-faceted book, written by a range of experts, explores the social, cultural, clinical and psychological factors that lie behind the 'Obesity Epidemic'. It is required reading for the many healthcare professionals dealing with the effects of obesity and for anyone who wants to know more about the causes of weight gain and the best ways of dealing with it. Fat Matters covers a range of issues from sociology through medicine to technology. This is not a book for the highly specialised expert. Rather, it is a book that shows the diversity of approaches to the phenomenon of obesity, tailored to the reader who wants to be up-to-date and well-informed on a subject that is possibly as frequently discussed and as misunderstood as the weather.
Abstract:
Multi-disciplinary approaches to complex problems are becoming more common – they enable criteria manifested in distinct (and potentially conflicting) domains to be jointly balanced and satisfied. In this paper we present airport terminals as a case study which requires multi-disciplinary knowledge in order to balance conflicting security, economic and passenger-driven needs and correspondingly enhance the design, management and operation of airport terminals. The need for a truly multi-disciplinary scientific approach which integrates information, process, people, technology and space domains is highlighted through a brief discussion of two challenges currently faced by airport operators. The paper outlines the approach taken by this project, detailing the aims and objectives of each of seven diverse research programs.