186 results for Machines


Relevance:

10.00%

Publisher:

Abstract:

Everything revolves around desiring-machines and the production of desire… Schizoanalysis merely asks what are the machinic, social and technical indices on a socius that open to desiring-machines (Deleuze & Guattari, 1983, pp. 380-381). Achievement tests like NAPLAN are fairly recent, yet common, education policy initiatives in much of the Western world. They intersect with, use and change pre-existing logics of education, teaching and learning. Much has been written about the form and function of these tests, the ‘stakes’ involved and the effects of their practice. This paper adopts a different “angle of vision” to ask what ‘opens’ education to these regimes of testing (Roy, 2008). It builds on previous analyses of NAPLAN as a modulating machine, or a machine characterised by the increased intensity of connections and couplings. One affect can be “an existential disquiet” as “disciplinary subjects attempt to force coherence onto a disintegrating narrative of self” (Thompson & Cook, 2012, p. 576). Desire operates at all levels of the education assemblage; however, our argument is that achievement testing manifests desire as ‘lack’, seen in the desire for improved results, the desire for increased control, the desire for freedom and the desire for acceptance, to name a few. For Deleuze and Guattari, desire is irreducible to lack; instead, desire is productive. As a productive assemblage, education machines operationalise and produce through desire; “Desire is a machine, and the object of the desire is another machine connected to it” (Deleuze & Guattari, 1983, p. 26). This intersection is complexified by the strata at which these connections occur, and by the molar and molecular connections and flows they make possible. Our argument is that when attention is paid to the macro and micro connections, and to the machines built and disassembled as a result of high-stakes testing, a map is constructed that outlines possibilities, desires and blockages within the education assemblage. This schizoanalytic cartography suggests a new analysis of these ‘axioms’ of testing and accountability. It follows the flows and disruptions made possible as different or altered connections are made and as new machines are brought online. Thinking of education machinically requires recognising that “every machine functions as a break in the flow in relation to the machine to which it is connected, but at the same time is also a flow itself, or the production of flow, in relation to the machine connected to it” (Deleuze & Guattari, 1983, p. 37). Through its potential to map desire, desire-production and the production of desire within those assemblages that have come to dominate our understanding of what is possible, Deleuze and Guattari’s method of schizoanalysis provides a provocative lens for grappling with the question of what one can do, and what lines of flight are possible.

Relevance:

10.00%

Publisher:

Abstract:

This paper applies concepts Deleuze developed in his ‘Postscript on the Societies of Control’, especially those relating to modulatory power, dividuation and control, to aspects of Australian schooling, to explore how the transition from disciplinary to control societies is manifesting itself. Two modulatory machines of assessment, NAPLAN and My Schools, are examined as a means to better understand how the disciplinary institution is changing as a result of modulation. This transition from discipline to modulation is visible in the declining importance of the disciplinary teacher–student relationship as a measure of the success of the educative process. The transition occurs through seduction, because that which purports to measure classroom quality is in fact a serpent of modulation that produces simulacra of the disciplinary classroom. The effect is to sever what happens in the disciplinary space from its representations in a luminiferous ether that overlays the classroom.

Relevance:

10.00%

Publisher:

Abstract:

Legal Context: In the wake of the Copenhagen Accord 2009 and the Cancun Agreements 2010, a number of patent offices have introduced fast-track mechanisms to encourage patent applications in relation to clean technologies, such as those pertaining to hydrogen. However, patent offices will be under increasing pressure to ensure that the granted patents satisfy the requisite patent thresholds, as well as to identify and reject cases of fraud, hoaxes, scams, and swindles.

Key Points: This article examines the BlackLight litigation in the United States, the United Kingdom, and the European Patent Office, and considers how patent offices and courts deal with patent applications in respect of clean energy and perpetual motion machines.

Practical Significance: The capacity of patent offices to grant sound and reliable patents is critical to the credibility of the patent system, particularly in the context of the current focus upon promoting clean technologies.

Relevance:

10.00%

Publisher:

Abstract:

Distributed computing of all-to-all comparison (ATAC) problems in heterogeneous systems is increasingly important in various domains. Though Hadoop-based solutions are widely used, they are inefficient for the ATAC pattern, which is fundamentally different from the MapReduce pattern for which Hadoop is designed. They exhibit poor data locality and unbalanced allocation of comparison tasks, particularly in heterogeneous systems. This results in massive data movement at runtime and ineffective utilization of computing resources, affecting the overall computing performance significantly. To address these problems, a scalable and efficient data and task distribution strategy is presented in this paper for processing large-scale ATAC problems in heterogeneous systems. It not only saves storage space but also achieves load balancing and good data locality for all comparison tasks. Experiments with bioinformatics examples show that about 89% of the ideal performance capacity of the multiple machines can be achieved using the approach presented in this paper.
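
To make the data-locality and load-balancing issue concrete, the following minimal Python sketch distributes pairwise comparison tasks over heterogeneous workers, preferring workers that already hold both inputs and weighting load by a per-worker capacity factor. It only illustrates the general ATAC scheduling problem, not the strategy proposed in the paper; the worker names, replica layout and equal-cost assumption are all invented.

# Illustrative sketch (not the paper's algorithm): distribute all-to-all
# comparison (ATAC) tasks over heterogeneous workers, preferring workers
# that already hold both inputs (data locality) and normalising load by
# a per-worker capacity factor.
from itertools import combinations

def distribute_atac(items, workers, replicas, capacity):
    """items: data item ids; replicas: item -> set of workers storing a copy;
    capacity: worker -> relative speed (e.g. core count)."""
    load = {w: 0.0 for w in workers}
    assignment = {}
    for a, b in combinations(items, 2):          # every pairwise comparison task
        local = replicas[a] & replicas[b]        # workers holding both inputs
        candidates = local or (replicas[a] | replicas[b]) or set(workers)
        # pick the candidate with the lowest capacity-normalised load
        best = min(candidates, key=lambda w: load[w] / capacity[w])
        assignment[(a, b)] = best
        load[best] += 1.0                        # assume equal task cost
    return assignment

# Example: 4 items, 2 workers of different speed, a simple replica layout.
items = ["d0", "d1", "d2", "d3"]
workers = ["w_fast", "w_slow"]
replicas = {"d0": {"w_fast"}, "d1": {"w_fast", "w_slow"},
            "d2": {"w_slow"}, "d3": {"w_fast", "w_slow"}}
capacity = {"w_fast": 2.0, "w_slow": 1.0}
print(distribute_atac(items, workers, replicas, capacity))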

Relevance:

10.00%

Publisher:

Abstract:

Purpose – Ideally, there is no wear in the hydrodynamic lubrication regime. A small amount of wear occurs during the start and stop of machines, and the amount of wear is so small that it is difficult to measure with accuracy. Various wear measuring techniques have been used, of which out-of-roundness was found to be the most reliable method for measuring small wear quantities in journal bearings. This technique was further developed to achieve higher accuracy in measuring small wear quantities, and it proved to be reliable as well as inexpensive. The paper aims to discuss these issues.

Design/methodology/approach – In an experimental study, the effect of antiwear additives was studied on journal bearings lubricated with oil containing solid contaminants. The test durations were too long and the wear quantities achieved were too small. To minimise the test duration, short tests of about 90 min duration were conducted, and wear was measured by recording changes in a variety of parameters related to weight, geometry and wear debris. Out-of-roundness was found to be the most effective method, and it was further refined by enlarging the out-of-roundness traces on a photocopier. The method proved to be reliable and inexpensive.

Findings – The study revealed that the most commonly used wear measurement techniques, such as weight loss, roughness changes and change in particle count, were not adequate for measuring small wear quantities in journal bearings. The out-of-roundness method, with some refinements, was found to be one of the most reliable methods for measuring small wear quantities in journal bearings working in the hydrodynamic lubrication regime. By enlarging the out-of-roundness traces and determining the worn area of the bearing cross-section, weight loss in bearings was calculated, and the result was repeatable and reliable.

Research limitations/implications – This research is basic in nature, providing a rudimentary solution for measuring small wear quantities in rotary devices such as journal bearings. The method requires enlarging traces on a photocopier and determining the shape of the worn area on an out-of-roundness trace on a transparency, which is simple but crude; an automated procedure may be required to determine the weight loss from the out-of-roundness traces directly. The method can be very useful in reducing test duration and measuring wear quantities with higher precision in situations where wear quantities are very small.

Practical implications – This research provides a reliable method of measuring wear of circular geometry. The Talyrond equipment used for measuring the change in out-of-roundness due to bearing wear shows high potential to be used as a wear measuring device as well. Measurement of weight loss from the traces is an enhanced capability of this equipment, and this research may lead to the development of a modified version of Talyrond-type equipment for wear measurement in circular machine components.

Originality/value – Wear measurement in hydrodynamic bearings requires long-duration tests to achieve adequate wear quantities. Out-of-roundness is one of the geometrical parameters that changes with the progression of wear in circular components, and it was therefore found to be an effective wear measuring parameter that relates to change in geometry. The method of increasing sensitivity by enlarging the out-of-roundness traces is original work through which the area of the worn cross-section can be determined and the weight loss derived, for materials of known density, with higher precision.
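
As a rough illustration of the weight-loss calculation described above, the following Python sketch assumes the worn volume equals the worn cross-sectional area measured on the enlarged trace (scaled back by the square of the enlargement factor) multiplied by the bearing's axial length, with weight obtained from the material density. The relation and all numbers are illustrative assumptions, not values from the paper.

# Illustrative arithmetic only (assumed relation, not the paper's exact procedure):
# worn volume ≈ (traced worn area / magnification^2) * bearing axial length,
# weight loss = worn volume * material density.
def weight_loss_from_trace(traced_area_mm2, magnification, bearing_length_mm,
                           density_g_mm3):
    true_area_mm2 = traced_area_mm2 / magnification**2   # undo photocopier enlargement
    worn_volume_mm3 = true_area_mm2 * bearing_length_mm
    return worn_volume_mm3 * density_g_mm3               # grams

# Example: 50 mm^2 measured on a 10x-enlarged trace, 20 mm long bronze bearing.
print(weight_loss_from_trace(50.0, 10.0, 20.0, 0.0088))  # ~0.088 g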

Relevance:

10.00%

Publisher:

Abstract:

Increasingly large-scale applications are generating an unprecedented amount of data. However, the growing gap between computation and I/O capacity on High End Computing (HEC) machines creates a severe bottleneck for data analysis. Instead of moving data from its source to the output storage, in-situ analytics processes output data while simulations are running. However, in-situ data analysis incurs much more contention for computing resources with simulations, and such contention severely damages simulation performance on HEC machines. Since different data processing strategies have different impacts on performance and cost, there is a consequent need for flexibility in the location of data analytics. In this paper, we explore and analyze several potential data-analytics placement strategies along the I/O path. To find the best strategy for reducing data movement in a given situation, we propose a flexible data analytics (FlexAnalytics) framework. Based on this framework, a FlexAnalytics prototype system is developed for analytics placement. The FlexAnalytics system enhances the scalability and flexibility of the current I/O stack on HEC platforms and is useful for data pre-processing, runtime data analysis and visualization, as well as for large-scale data transfer. Two use cases – scientific data compression and remote visualization – have been applied in the study to verify the performance of FlexAnalytics. Experimental results demonstrate that the FlexAnalytics framework increases data transition bandwidth and improves application end-to-end transfer performance.
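
The placement trade-off can be illustrated with a deliberately simplified cost model in Python: either reduce the data in situ (paying compute time alongside the simulation) or ship the raw output across the I/O path. This is only a sketch of the trade-off discussed above, not the FlexAnalytics decision logic; the parameters and example numbers are invented.

# Simplified cost model (illustrative, not the FlexAnalytics implementation):
# compare moving raw output vs. reducing it in situ before transfer.
def choose_placement(raw_bytes, reduction_ratio, bandwidth_Bps, insitu_cost_s):
    """reduction_ratio: fraction of data remaining after in-situ analytics."""
    move_raw = raw_bytes / bandwidth_Bps
    reduce_then_move = insitu_cost_s + (raw_bytes * reduction_ratio) / bandwidth_Bps
    if reduce_then_move < move_raw:
        return "in-situ", reduce_then_move
    return "offline", move_raw

# Example: 100 GB output, 10:1 reduction, 1 GB/s link, 20 s of in-situ compute.
print(choose_placement(100e9, 0.1, 1e9, 20.0))   # ('in-situ', 30.0)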

Relevance:

10.00%

Publisher:

Abstract:

A combined data matrix consisting of high performance liquid chromatography–diode array detector (HPLC–DAD) and inductively coupled plasma–mass spectrometry (ICP-MS) measurements of samples from the plant roots of Cortex moutan (CM) produced much better classification and prediction results than those obtained from either of the individual data sets. The HPLC peaks (organic components) of the CM samples and the ICP-MS measurements (trace metal elements) were investigated with the use of principal component analysis (PCA) and linear discriminant analysis (LDA); essentially, these qualitative results suggested that discrimination of the CM samples from three different provinces was possible, with the combined matrix producing the best results. Another three methods, K-nearest neighbor (KNN), back-propagation artificial neural network (BP-ANN) and least squares support vector machines (LS-SVM), were applied for the classification and prediction of the samples. Again, the combined data matrix analyzed by the KNN method produced the best results (100% correct; prediction set data). Additionally, multiple linear regression (MLR) was utilized to explore any relationship between the organic constituents and the metal elements of the CM samples; the extracted linear regression equations showed that the essential metals, as well as some metallic pollutants, were related to the organic compounds on the basis of their concentrations.
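
The data-fusion idea can be sketched generically in Python with scikit-learn: concatenate the chromatographic and elemental matrices column-wise, scale them, and cross-validate a KNN classifier. The arrays below are synthetic placeholders with arbitrary dimensions, not the CM measurements, and the pipeline is only an illustration of the general approach.

# Generic sketch of data fusion followed by KNN classification (synthetic data).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 60
hplc = rng.normal(size=(n, 25))        # stand-in for HPLC-DAD peak areas
icpms = rng.normal(size=(n, 12))       # stand-in for ICP-MS element levels
province = rng.integers(0, 3, size=n)  # three provinces of origin

combined = np.hstack([hplc, icpms])    # fused data matrix
model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
print(cross_val_score(model, combined, province, cv=5).mean())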

Relevance:

10.00%

Publisher:

Abstract:

Network topology and routing are two important factors in determining the communication costs of big data applications at large scale. For a given Cluster, Cloud, or Grid system, the network topology is fixed, and static or dynamic routing protocols are preinstalled to direct the network traffic; users cannot change them once the system is deployed. Hence, it is hard for application developers to identify the optimal network topology and routing algorithm for their applications with distinct communication patterns. In this study, we design a CCG virtual system (CCGVS), which first uses container-based virtualization to allow users to create a farm of lightweight virtual machines on a single host. It then uses the software-defined networking (SDN) technique to control the network traffic among these virtual machines. Users can change the network topology and control the network traffic programmatically, thereby enabling application developers to evaluate their applications on the same system with different network topologies and routing algorithms. Preliminary experimental results with both synthetic big data programs and the NPB benchmarks show that CCGVS can represent application performance variations caused by network topology and routing algorithms.
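
As a toy illustration of why topology matters for a fixed communication pattern, the following Python sketch (using networkx) compares the total hop count of an all-pairs exchange among eight hosts on a ring versus a simple two-switch tree. It only illustrates the kind of comparison CCGVS is meant to enable; it has nothing to do with the CCGVS implementation, and the topologies are made up.

# Toy comparison of communication cost on two hand-built topologies.
import itertools
import networkx as nx

def total_hops(graph, nodes):
    # Sum of shortest-path lengths over every unordered host pair.
    return sum(nx.shortest_path_length(graph, a, b)
               for a, b in itertools.combinations(nodes, 2))

hosts = list(range(8))
ring = nx.cycle_graph(8)               # hosts connected in a ring

tree = nx.Graph()                      # two switches, four hosts each
tree.add_edge("s0", "s1")
for h in hosts:
    tree.add_edge(h, "s0" if h < 4 else "s1")

print("ring:", total_hops(ring, hosts))
print("tree:", total_hops(tree, hosts))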

Relevance:

10.00%

Publisher:

Abstract:

A novel near-infrared spectroscopy (NIRS) method has been researched and developed for the simultaneous analyses of the chemical components and associated properties of mint (Mentha haplocalyx Briq.) tea samples. The common analytes were total polysaccharide content, total flavonoid content, total phenolic content, and total antioxidant activity. To resolve the NIRS data matrix for such analyses, least squares support vector machines (LS-SVM) was found to be the best chemometrics method for prediction, although it was closely followed by the radial basis function/partial least squares model. Interestingly, the commonly used partial least squares method was unsatisfactory in this case. Additionally, principal component analysis and hierarchical cluster analysis were able to distinguish the mint samples according to their four geographical provinces of origin, and this was further facilitated with the use of the chemometrics classification methods K-nearest neighbors, linear discriminant analysis, and partial least squares discriminant analysis. In general, given the potential savings in sampling and analysis time, as well as in the costs of the special analytical reagents required for the standard individual methods, NIRS offered a very attractive alternative for the simultaneous analysis of mint samples.
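
The kind of model comparison described can be sketched in Python with scikit-learn on synthetic spectra. scikit-learn does not provide LS-SVM, so an RBF-kernel SVR is used here as a rough stand-in, compared against PLS regression by cross-validated R²; the data, dimensions and settings are placeholders, not the mint dataset or the paper's calibrations.

# Rough analogue of comparing a kernel SVM regressor with PLS on spectra-like data.
import numpy as np
from sklearn.svm import SVR
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 200))                          # stand-in NIR spectra
y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=80)    # stand-in property value

for name, model in [("SVR (RBF)", SVR(kernel="rbf", C=10.0)),
                    ("PLS", PLSRegression(n_components=5))]:
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(name, round(score, 3))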

Relevance:

10.00%

Publisher:

Abstract:

Over the past few decades, frog species have been experiencing a dramatic decline around the world. The reasons for this decline include habitat loss, invasive species and climate change. To better understand the status of frog species, classifying frogs has become increasingly important. In this study, acoustic features are investigated for multi-level classification of Australian frogs at family, genus and species level, covering three families, eleven genera and eighty-five species collected from Queensland, Australia. For each frog species, six instances are selected, from which ten acoustic features are calculated. The multicollinearity among the ten features is then studied to select non-correlated features for subsequent analysis. A decision tree (DT) classifier is used to visually and explicitly determine which acoustic features are relatively important for classifying family, which for genus, and which for species. Finally, a weighted support vector machines (SVM) classifier is used for the multi-level classification with the three most important acoustic features at each level. Our experimental results indicate that using different acoustic feature sets can successfully classify frogs at different levels, with average classification accuracy of up to 85.6%, 86.1% and 56.2% for family, genus and species, respectively.
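
A minimal Python sketch of the final, class-weighted SVM step is shown below on synthetic three-dimensional "acoustic features" with imbalanced classes; the feature values, class sizes and kernel settings are invented for illustration and do not come from the paper or its recordings.

# Class-weighted SVM on synthetic, imbalanced "acoustic feature" data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_per_class = [40, 25, 10]                      # deliberately imbalanced classes
X = np.vstack([rng.normal(loc=i, scale=0.8, size=(n, 3))
               for i, n in enumerate(n_per_class)])
y = np.concatenate([[i] * n for i, n in enumerate(n_per_class)])

# class_weight="balanced" reweights misclassification penalties inversely to
# class frequency, the usual way a "weighted SVM" is realised in scikit-learn.
clf = SVC(kernel="rbf", class_weight="balanced")
print(cross_val_score(clf, X, y, cv=5).mean())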

Relevance:

10.00%

Publisher:

Abstract:

Instead of regarding a particular type of gambling activity (for example, electronic gambling machines, table games) as an isolated factor for problem gambling, recent research suggests that gambling involvement (for example, as measured by the number of different types of gambling activities played) should also be considered. Using a large sample of the Victorian adult population, this study found that the strength of association between problem gambling and the type of gambling reduced after adjusting for gambling involvement. This finding supports recent research that gambling involvement is an important factor in assessing the risk of problem gambling. The study also provides insights into the measurement of gambling involvement and provides alternative statistical modelling to analyse problem gambling.
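
The adjustment idea can be illustrated with a small Python simulation: a logistic regression of problem-gambling status on participation in one activity type, fitted with and without an involvement covariate, shows the association shrinking once involvement is controlled for. The data are simulated and the model is a generic logistic regression, not the Victorian survey data or the modelling used in the study.

# Simulated example of attenuation after adjusting for gambling involvement.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000
involvement = rng.poisson(2, size=n)                    # number of activity types
egm = (rng.random(n) < 0.2 + 0.1 * involvement.clip(max=5)).astype(float)
logit_p = -3 + 0.6 * involvement + 0.2 * egm            # risk driven mainly by involvement
problem = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)

unadjusted = sm.Logit(problem, sm.add_constant(egm)).fit(disp=0)
adjusted = sm.Logit(problem, sm.add_constant(
    np.column_stack([egm, involvement]))).fit(disp=0)
print("EGM coefficient, unadjusted:", round(unadjusted.params[1], 2))
print("EGM coefficient, adjusted:  ", round(adjusted.params[1], 2))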

Relevance:

10.00%

Publisher:

Abstract:

The voice of a traditional communication drum can be heard over great distances. Yet now in Papua New Guinea (PNG) it is hearing, by phone, the voice of a loved one who has moved far away from home for work, marriage or studies that brings the greatest delight. As recently as 2007, most areas of this Pacific island nation had no form of telephony available. Apart from radio, modern communication forms have been restricted predominantly to the urban areas where only a small percentage of the people reside. Landline telephones, television, Internet, facsimile machines and so on have never reached the majority of the inhabited areas...

Relevance:

10.00%

Publisher:

Abstract:

A new technology – 3D printing – has the potential to make radical changes to aspects of the way in which we live. Put simply, it allows people to download designs and turn them into physical objects by laying down successive layers of material. Replacements or parts for household objects such as toys, utensils and gadgets could become available at the press of a button. With this innovation, however, comes the need to consider impacts on a wide range of forms of intellectual property, as Dr Matthew Rimmer explains. 3D Printing is the latest in a long line of disruptive technologies – including photocopiers, cassette recorders, MP3 players, personal computers, peer to peer networks, and wikis – which have challenged intellectual property laws, policies, practices, and norms. As The Economist has observed, ‘Tinkerers with machines that turn binary digits into molecules are pioneering a whole new way of making things—one that could well rewrite the rules of manufacturing in much the same way as the PC trashed the traditional world of computing.’

Relevance:

10.00%

Publisher:

Abstract:

This paper presents an effective classification method based on Support Vector Machines (SVMs) in the context of activity recognition. Local features that capture both spatial and temporal information in activity videos have made significant progress recently, and efficient and effective features, feature representation and classification play a crucial role in activity recognition. For classification, SVMs are popular because of their simplicity and efficiency; however, the commonly applied multi-class SVM approaches suffer from limitations, including easily confused classes and computational inefficiency. We propose using a binary tree SVM to address these shortcomings of multi-class SVMs in activity recognition. The binary tree is constructed using Gaussian Mixture Models (GMMs), with activities repeatedly allocated to subnodes until every newly created node contains only one activity. Then, for each internal node, a separate SVM is learned to classify activities, which significantly reduces the training time and increases the speed of testing compared to the popular 'one-against-the-rest' multi-class SVM classifier. Experiments carried out on the challenging and complex Hollywood dataset demonstrate performance comparable to the baseline bag-of-features method.
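
The tree-construction idea can be sketched in Python with scikit-learn: at each internal node the remaining activity classes are split into two groups by fitting a two-component Gaussian mixture to their class-mean feature vectors, and a binary SVM is trained to route samples to the correct subtree. This is an illustrative reconstruction under stated assumptions (in particular, applying the GMM to class means), not the authors' implementation, and the example data are synthetic.

# Sketch of a binary-tree SVM built with a GMM-based split (illustrative only).
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

def build_tree(X, y, classes):
    # Leaf: only one activity class remains at this node.
    if len(classes) == 1:
        return classes[0]
    # Split the remaining classes into two groups via a 2-component GMM
    # fitted to the class-mean feature vectors (an assumed simplification).
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    split = GaussianMixture(n_components=2, random_state=0).fit_predict(means)
    left = [c for c, s in zip(classes, split) if s == 0]
    right = [c for c, s in zip(classes, split) if s == 1]
    if not left or not right:                  # degenerate split: just halve the list
        half = len(classes) // 2
        left, right = list(classes[:half]), list(classes[half:])
    # Train one binary SVM at this node: does a sample belong to the left group?
    node_mask = np.isin(y, classes)
    is_left = np.isin(y, left)[node_mask]
    node_svm = SVC(kernel="rbf").fit(X[node_mask], is_left)
    return (node_svm, build_tree(X, y, left), build_tree(X, y, right))

def predict_one(tree, x):
    # Walk down the tree, letting each node's SVM pick the subtree.
    while isinstance(tree, tuple):
        svm, left_subtree, right_subtree = tree
        tree = left_subtree if svm.predict(x.reshape(1, -1))[0] else right_subtree
    return tree

# Example on synthetic 2-D features for four activity classes.
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(loc=(i, -i), size=(30, 2)) for i in range(4)])
y = np.repeat(np.arange(4), 30)
tree = build_tree(X, y, list(range(4)))
print(predict_one(tree, np.array([3.0, -3.0])))   # expected: class 3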

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Electronic medication administration record (eMAR) systems are promoted as a potential intervention to enhance medication safety in residential aged care facilities (RACFs). The purpose of this study was to conduct an in-practice evaluation of an eMAR being piloted in one Australian RACF before its roll-out, and to provide recommendations for system improvements. Methods: A multidisciplinary team conducted direct observations of workflow (n=34 hours) at the RACF site and the community pharmacy. Semi-structured interviews (n=5) with RACF staff and the community pharmacist were conducted to investigate their views of the eMAR system. Data were analysed using a grounded theory approach to identify challenges associated with the design of the eMAR system. Results: The current eMAR system does not offer an end-to-end solution for medication management. Many steps, including prescribing by doctors and communication with the community pharmacist, are still performed manually using paper charts and fax machines. Five major challenges associated with the design of the eMAR system were identified: limited interactivity; inadequate flexibility; problems related to information layout and semantics; a lack of relevant decision support; and system maintenance issues. We suggest recommendations to improve the design of the eMAR system and to optimise existing workflows. Discussion: Immediate value can be achieved by improving system interactivity, reducing inconsistencies in data entry design and offering dedicated organisational support to minimise connectivity issues. Longer-term benefits can be achieved by adding decision support features and establishing system interoperability requirements with stakeholder groups (e.g. community pharmacies) prior to system roll-out. In-practice evaluations of technologies like the eMAR system have great value in identifying design weaknesses that inhibit optimal system use.