233 results for k-means
Abstract:
Business Process Management (BPM) has in recent years become a high-priority area for most organizations. Since the concept is multidisciplinary, success in a BPM endeavour requires considering many different factors. A number of studies have been conducted to identify these factors; however, most are limited to introducing high-level factors or to identifying the means of success within only a specific context. This paper presents a holistic framework of success factors together with the associated means for achieving success. The framework introduces nine factors, namely culture, leadership, communication, Information Technology, strategic alignment, people, project management, performance measurement and methodology. Each factor is characterized further by a set of sub-constructs, and under each sub-construct the means for achieving success are introduced. The framework, including these means for achieving success, can help BPM project stakeholders to plan the initiative adequately and to check progress during implementation.
Abstract:
A major focus of research in nanotechnology is the development of novel, high-throughput techniques for fabrication of arbitrarily shaped surface nanostructures at sub-100 nm to atomic scales. A related pursuit is the development of simple and efficient means for parallel manipulation and redistribution of adsorbed atoms, molecules and nanoparticles on surfaces – adparticle manipulation. These techniques will be used for the manufacture of nanoscale surface-supported functional devices in nanotechnologies such as quantum computing, molecular electronics and lab-on-a-chip, as well as for modifying surfaces to obtain novel optical, electronic, chemical, or mechanical properties. A favourable approach to the formation of surface nanostructures is self-assembly. In self-assembly, nanostructures are grown by aggregation of individual adparticles that diffuse by thermally activated processes on the surface. The passive nature of this process means it is generally not suited to the formation of arbitrarily shaped structures. The self-assembly of nanostructures at arbitrary positions has been demonstrated, though this has typically required a pre-patterning treatment of the surface using sophisticated techniques such as electron beam lithography. On the other hand, a parallel adparticle manipulation technique would be suited to directing the self-assembly process to occur at arbitrary positions, without the need for pre-patterning the surface. There is at present a lack of techniques for parallel manipulation and redistribution of adparticles to arbitrary positions on the surface. This is an issue that needs to be addressed, since such techniques can play an important role in nanotechnology. In this thesis, we propose such a technique – thermal tweezers. In thermal tweezers, adparticles are redistributed by localised heating of the surface. This locally enhances surface diffusion of adparticles so that they rapidly diffuse away from the heated regions.
Using this technique, the redistribution of adparticles to form a desired pattern is achieved by heating the surface at specific regions. In this project, we have focussed on the holographic implementation of this approach, where the surface is heated by holographic patterns of interfering pulsed laser beams. This implementation is suitable for the formation of arbitrarily shaped structures; the only condition is that the shape can be produced by holographic means. In the simplest case, the laser pulses are linearly polarised and intersect to form an interference pattern that is a modulation of intensity along a single direction. Strong optical absorption at the intensity maxima of the interference pattern results in an approximately sinusoidal variation of the surface temperature along one direction. The main aim of this research project is to investigate the feasibility of the holographic implementation of thermal tweezers as an adparticle manipulation technique. Firstly, we investigate theoretically the surface diffusion of adparticles in the presence of a sinusoidal modulation of the surface temperature. Very strong redistribution of adparticles is predicted when there is strong interaction between the adparticle and the surface, and the amplitude of the temperature modulation is ~100 K. We have proposed a thin metallic film deposited on a glass substrate and heated by interfering laser beams (optical wavelengths) as a means of generating a very large amplitude of surface-temperature modulation. Indeed, we predict theoretically by numerical solution of the thermal conduction equation that the amplitude of the temperature modulation on the metallic film can be much greater than 100 K when heated by nanosecond pulses with an energy of ~1 mJ. The formation of surface nanostructures of less than 100 nm in width is predicted at optical wavelengths in this implementation of thermal tweezers.
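The mechanism described above rests on the exponential (Arrhenius) temperature dependence of surface diffusion. The sketch below is illustrative only: the activation energy, prefactor, interference period and temperatures are assumed values, not the thesis's parameters, but it shows why a ~100 K sinusoidal temperature modulation produces a dramatic spatial variation in the local hop rate.

```python
import numpy as np

# Sinusoidal surface-temperature profile T(x) = T0 + dT*sin(2*pi*x/L),
# as produced by two interfering laser beams, modulating an Arrhenius
# diffusion coefficient D(x) = D0 * exp(-Ea / (kB * T(x))).
kB = 8.617e-5          # Boltzmann constant, eV/K
Ea = 0.75              # assumed diffusion activation energy, eV
D0 = 1e-3              # assumed pre-exponential factor, cm^2/s
T0, dT = 300.0, 100.0  # base temperature and modulation amplitude, K
L = 400e-9             # assumed interference period, m

x = np.linspace(0.0, L, 1000)
T = T0 + dT * np.sin(2 * np.pi * x / L)
D = D0 * np.exp(-Ea / (kB * T))

# Because T appears in an exponential, a ~100 K modulation changes the
# local diffusion rate by many orders of magnitude, driving adparticles
# out of the hot regions far faster than they return.
enhancement = D.max() / D.min()
print(f"diffusion enhancement across the pattern: {enhancement:.2e}")
```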
Furthermore, we propose a simple extension to this technique in which a spatial phase shift of the temperature modulation effectively doubles or triples the resolution. Increased resolution is also predicted when the wavelength of the laser pulses is reduced. In addition, we present two distinctly different, computationally efficient numerical approaches for theoretical investigation of surface diffusion of interacting adparticles – the Monte Carlo Interaction Method (MCIM) and the random potential well method (RPWM). Using each of these approaches we have investigated thermal tweezers for redistribution of both strongly and weakly interacting adparticles. We have predicted that strong interactions between adparticles can increase the effectiveness of thermal tweezers, by demonstrating practically complete adparticle redistribution into the low-temperature regions of the surface. This is promising from the point of view of thermal tweezers applied to directed self-assembly of nanostructures. Finally, we present a new and more efficient numerical approach to theoretical investigation of thermal tweezers for non-interacting adparticles. In this approach, the local diffusion coefficient is determined from solution of the Fokker-Planck equation. The diffusion equation is then solved numerically using the finite volume method (FVM) to directly obtain the probability density of adparticle position. We compare predictions of this approach to those of the Ermak algorithm solution of the Langevin equation, and relatively good agreement is shown at intermediate and high friction. In the low-friction regime, we predict and investigate the phenomenon of ‘optimal’ friction and describe its occurrence due to very long jumps of adparticles as they diffuse from the hot regions of the surface. Future research directions, both theoretical and experimental, are also discussed.
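The finite-volume idea mentioned above can be sketched in a few lines: evolve a probability density on a 1D grid with a spatially varying diffusion coefficient, using conservative fluxes at cell faces. This is only a minimal illustration, not the thesis's solver; the diffusion profile is invented, and the flux is written in the gradient-of-Dp (Itô) convention, which for temperature-dependent diffusion depletes density where D is large (the exact form derived from the Fokker-Planck equation may differ).

```python
import numpy as np

# 1D finite volume step for  dp/dt = d^2(D(x) p)/dx^2  with reflecting
# (zero-flux) boundaries. D(x) is an assumed profile, larger in the
# "hot" centre of the domain.
n = 200
L = 1.0
dx = L / n
x = (np.arange(n) + 0.5) * dx                       # cell centres
D = 1e-3 * (1.0 + 4.0 * np.exp(-((x - 0.5 * L) / 0.1) ** 2))

p = np.full(n, 1.0 / L)            # uniform initial probability density
dt = 0.2 * dx**2 / D.max()         # explicit stability limit

for _ in range(20000):
    flux = -np.diff(D * p) / dx                     # flux at interior faces
    flux = np.concatenate(([0.0], flux, [0.0]))     # zero flux at boundaries
    p -= dt * np.diff(flux) / dx                    # conservative update

# Total probability is conserved exactly by construction, and density
# is depleted in the hot (high-D) centre relative to the cool edges.
print(p.sum() * dx, p[n // 2], p[4])
```

Because the update subtracts a difference of face fluxes, probability can only move between neighbouring cells, which is what makes the finite volume form attractive for a conservation law like the diffusion equation.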
Abstract:
An alternative approach to port decoupling and matching of arrays with tightly coupled elements is proposed. The method is based on the inherent decoupling effect obtained by feeding the orthogonal eigenmodes of the array. For this purpose, a modal feed network is connected to the array. The decoupled external ports of the feed network may then be matched independently using conventional matching circuits. Such a system may be used in digital beamforming applications with good signal-to-noise performance. The theory is applicable to arrays with an arbitrary number of elements, but implementation is practical only for smaller arrays. The principle is illustrated by means of two examples.
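The core linear-algebra fact behind the eigenmode feed can be shown numerically. The coupling matrix below is an invented symmetric example for a 3-element array, not taken from the paper: because a reciprocal array's coupling matrix is symmetric, its eigenvectors are orthogonal, and exciting the array through those eigenvector weights diagonalises the coupling, i.e. the modal ports see no cross-coupling and can be matched one at a time.

```python
import numpy as np

# Assumed symmetric coupling matrix of a 3-element array (illustrative).
S = np.array([[0.10, 0.60, 0.25],
              [0.60, 0.10, 0.60],
              [0.25, 0.60, 0.10]])

eigvals, Q = np.linalg.eigh(S)   # columns of Q: orthonormal eigenmode weights
S_modal = Q.T @ S @ Q            # coupling seen at the modal (external) ports

# The off-diagonal entries vanish (to numerical precision): the modal
# ports are decoupled, so each can be matched independently.
off_diag = S_modal - np.diag(np.diag(S_modal))
print(np.max(np.abs(off_diag)))
```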
Abstract:
Residential aged care in Australia does not have a system of quality assessment related to clinical outcomes, creating a significant gap in quality monitoring. Clinical outcomes represent the results of all inputs into care, thus providing an indication of the success of those inputs. To fill this gap, an assessment tool based on resident outcomes (the ResCareQA) was developed and evaluated in collaboration with residential care providers. A useful output of the ResCareQA is a profile of resident clinical status, and this paper uses such outputs to present a snapshot of nine residential facilities. Such comprehensive data have not previously been available within Australia, so these results provide an important insight. ResCareQA data were collected from all residents (N=498) of nine aged care facilities from two major aged care providers. For each facility, numerator–denominator data were calculated to assess the degree of potential clinical problems. Results varied across clinical areas and across facilities, and rank-ordered facility results for selected clinical areas are reviewed and discussed. Use of the ResCareQA to generate clinical outcome data provides a concrete means of monitoring care quality within residential facilities; regular use of the ResCareQA could thus contribute to improved care outcomes within residential aged care.
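The numerator–denominator reporting mentioned above can be sketched simply. The facility names and counts below are invented for illustration: for a given clinical area, the numerator is the number of residents with the problem and the denominator is the number of residents assessed at that facility, and facilities can then be rank-ordered by the resulting rate.

```python
# Hypothetical (numerator, denominator) counts per facility for one
# clinical area, e.g. residents with a pressure injury / residents assessed.
facilities = {
    "Facility A": (6, 52),
    "Facility B": (3, 61),
    "Facility C": (9, 48),
}

rates = {name: num / den for name, (num, den) in facilities.items()}

# Rank-order facilities from lowest to highest prevalence of the problem.
for name, rate in sorted(rates.items(), key=lambda kv: kv[1]):
    print(f"{name}: {rate:.1%}")
```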
Abstract:
Cyber bullying – or bullying through the use of technology – is a growing phenomenon that is currently most commonly experienced by young people, with the consequences manifesting in schools. Cyber bullying shares many of the same attributes as face-to-face bullying, such as a power imbalance and a sense of helplessness on the part of the target. Not surprisingly, targets of face-to-face bullying are increasingly turning to the law, and it is likely that targets of cyber bullying may also do so in an appropriate case. This article examines the various criminal, civil and vilification laws that may apply to cases of cyber bullying and assesses the likely effectiveness of these laws as a means of redressing the power imbalance between perpetrator and target.
Abstract:
Powerful brands create meaningful images in the minds of customers (Keller, 1993). A strong brand image and reputation enhances differentiation and has a positive influence on buying behaviour (Gordon et al., 1993; McEnally and de Chernatony, 1999). While the power of branding is widely acknowledged in consumer markets, the nature and importance of branding in industrial markets remains under-researched. Many business-to-business (B2B) strategists have claimed that brand-building belongs in the consumer realm. They argue that industrial products do not need branding, as it is confusing and adds little value to functional products (Collins, 1977; Lorge, 1998; Saunders and Watt, 1979). Others argue, however, that branding and the concept of brand equity are increasingly important in industrial markets, because what a brand means to a buyer has been shown to be a determining factor in deciding between industrial purchase alternatives (Aaker, 1991). In this context, it is critical for suppliers to initiate and sustain relationships, due to the small number of potential customers (Ambler, 1995; Webster and Keller, 2004). To date, however, there is no model available to assist B2B marketers in identifying and measuring brand equity. In this paper, we take a step in that direction by operationalising and empirically testing a prominent brand equity model in a B2B context. This makes not only a theoretical contribution by advancing branding research, but also addresses a managerial need for information that will assist in the assessment of industrial branding efforts.
Abstract:
This paper describes the approach taken to the clustering task at INEX 2009 by a group at the Queensland University of Technology. The Random Indexing (RI) K-tree has been used with a representation that is based on the semantic markup available in the INEX 2009 Wikipedia collection. The RI K-tree is a scalable approach to clustering large document collections. This approach has produced quality clustering when evaluated using two different methodologies.
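For readers unfamiliar with Random Indexing, the core idea can be sketched briefly. Everything below is illustrative and not the INEX 2009 configuration: each term is assigned a fixed sparse random "index vector" (a few ±1 entries in a modest-dimensional space), and a document's reduced representation is simply the sum of the index vectors of its terms, which approximately preserves similarity without building a full term-document matrix.

```python
import numpy as np

# Assumed parameters: index-vector dimensionality and number of nonzeros.
rng = np.random.default_rng(0)
dim, nonzeros = 100, 4

def index_vector():
    """Sparse ternary random vector: a few +1/-1 entries, rest zero."""
    v = np.zeros(dim)
    pos = rng.choice(dim, size=nonzeros, replace=False)
    v[pos] = rng.choice([-1.0, 1.0], size=nonzeros)
    return v

# Toy documents as bags of terms (invented for illustration).
docs = [["clustering", "wikipedia", "xml"],
        ["clustering", "kmeans"],
        ["underwater", "vehicle"]]

term_index = {}
doc_vectors = []
for terms in docs:
    v = np.zeros(dim)
    for t in terms:
        if t not in term_index:      # one fixed index vector per term
            term_index[t] = index_vector()
        v += term_index[t]           # document = sum of its term vectors
    doc_vectors.append(v)

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Documents sharing terms typically end up closer in the reduced space.
print(cos(doc_vectors[0], doc_vectors[1]), cos(doc_vectors[0], doc_vectors[2]))
```

The reduced vectors can then be fed to any clustering algorithm; in the paper's setting that algorithm is the K-tree, which scales to large collections.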
Abstract:
Starbug is an inexpensive, miniature autonomous underwater vehicle ideal for data collection and ecosystem surveys. Starbug is small enough to be launched by one person without the need for specialised equipment, such as cranes, and it operates with minimal to no human intervention. Starbug was one of the first autonomous underwater vehicles (AUVs) in the world where vision is the primary means of navigation and control. More details of Starbug can be found here: http://www.csiro.au/science/starbug.html
Abstract:
In a consumerist society obsessed with body image and thinness, obesity levels have reached an all-time high. This multi-faceted book, written by a range of experts, explores the social, cultural, clinical and psychological factors that lie behind the 'Obesity Epidemic'. It is required reading for the many healthcare professionals dealing with the effects of obesity, and for anyone who wants to know more about the causes of weight gain and the best ways of dealing with it. Fat Matters covers a range of issues from sociology through medicine to technology. This is not a book for the highly specialised expert. Rather, it is a book that shows the diversity of approaches to the phenomenon of obesity, tailored to the reader who wants to be up-to-date and well-informed on a subject that is possibly as frequently discussed and as misunderstood as the weather.
Abstract:
Multi-disciplinary approaches to complex problems are becoming more common – they enable criteria manifested in distinct (and potentially conflicting) domains to be jointly balanced and satisfied. In this paper we present airport terminals as a case study which requires multi-disciplinary knowledge in order to balance conflicting security, economic and passenger-driven needs and correspondingly enhance the design, management and operation of airport terminals. The need for a truly multi-disciplinary scientific approach which integrates information, process, people, technology and space domains is highlighted through a brief discussion of two challenges currently faced by airport operators. The paper outlines the approach taken by this project, detailing the aims and objectives of each of seven diverse research programs.
Abstract:
Digital collections are growing exponentially in size as the information age takes a firm grip on all aspects of society. As a result, Information Retrieval (IR) has become an increasingly important area of research. It promises to provide new and more effective ways for users to find information relevant to their search intentions. Document clustering is one of the many tools in the IR toolbox and is far from being perfected. It groups documents that share common features. This grouping allows a user to quickly identify relevant information. If these groups are misleading then valuable information can accidentally be ignored. Therefore, the study and analysis of the quality of document clustering is important. With more and more digital information available, the performance of these algorithms is also of interest. An algorithm with a time complexity of O(n²) can quickly become impractical when clustering a corpus containing millions of documents. Therefore, the investigation of algorithms and data structures to perform clustering in an efficient manner is vital to its success as an IR tool. Document classification is another tool frequently used in the IR field. It predicts categories of new documents based on an existing database of (document, category) pairs. Support Vector Machines (SVM) have been found to be effective when classifying text documents. As the algorithms for classification are both efficient and of high quality, the largest gains can be made from improvements to representation. Document representations are vital for both clustering and classification. Representations exploit the content and structure of documents. Dimensionality reduction can improve the effectiveness of existing representations in terms of quality and run-time performance. Research into these areas is another way to improve the efficiency and quality of clustering and classification results. Evaluating document clustering is a difficult task.
Intrinsic measures of quality such as distortion only indicate how well an algorithm minimised a similarity function in a particular vector space. Intrinsic comparisons are inherently limited by the given representation and are not comparable between different representations. Extrinsic measures of quality compare a clustering solution to a “ground truth” solution. This allows comparison between different approaches. As the “ground truth” is created by humans, it can suffer from the fact that not every human interprets a topic in the same manner. Whether a document belongs to a particular topic or not can be subjective.
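The extrinsic evaluation described above can be made concrete with a toy example. The documents, topics and cluster assignments below are invented: cluster purity scores a clustering against a human "ground truth" partition by counting, in each cluster, the documents belonging to that cluster's majority topic.

```python
from collections import Counter

# Invented ground-truth topics and a clustering of six toy documents.
ground_truth = ["sport", "sport", "politics", "politics", "politics", "tech"]
clustering   = [0,       0,       1,          1,          0,          2]

def purity(clusters, truth):
    """Fraction of documents assigned to their cluster's majority topic."""
    total = 0
    for c in set(clusters):
        members = [t for t, k in zip(truth, clusters) if k == c]
        total += Counter(members).most_common(1)[0][1]
    return total / len(truth)

print(purity(clustering, ground_truth))  # 5/6, since one "politics"
                                         # document landed in the sport cluster
```

Unlike distortion, this score is comparable across different representations and algorithms, which is exactly why extrinsic measures allow comparison between approaches; its weakness, as the abstract notes, is that the ground truth itself is a subjective human judgement.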
Abstract:
Patients with idiopathic small fibre neuropathy (ISFN) have been shown to have significant intraepidermal nerve fibre loss and an increased prevalence of impaired glucose tolerance (IGT). It has been suggested that the dysglycemia of IGT and additional metabolic risk factors may contribute to small nerve fibre damage in these patients. Twenty-five patients with ISFN and 12 age-matched control subjects underwent a detailed evaluation of neuropathic symptoms and neurological deficits (Neuropathy Deficit Score (NDS), Nerve Conduction Studies (NCS), Quantitative Sensory Testing (QST) and Corneal Confocal Microscopy (CCM)) to quantify small nerve fibre pathology. Eight (32%) patients had IGT. Whilst all patients with ISFN had significant neuropathic symptoms, NDS, NCS and QST except for warm thresholds were normal. Corneal sensitivity was reduced, and CCM demonstrated a significant reduction in corneal nerve fibre density (NFD) (P<0.0001), nerve branch density (NBD) (P<0.0001) and nerve fibre length (NFL) (P<0.0001), and an increase in nerve fibre tortuosity (NFT) (P<0.0001). However, these parameters did not differ between ISFN patients with and without IGT, nor did they correlate with BMI, lipids or blood pressure. Corneal confocal microscopy provides a sensitive, non-invasive means to detect small nerve fibre damage in patients with ISFN, and metabolic abnormalities do not appear to relate to nerve damage.