858 results for Cutting machine
Abstract:
The commercialization of aerial image processing depends heavily on platforms such as UAVs (Unmanned Aerial Vehicles). However, the lack of an automated UAV forced landing site detection system has been identified as one of the main impediments to allowing UAV flight over populated areas in civilian airspace. This article proposes a UAV forced landing site detection system based on machine learning approaches, including the Gaussian Mixture Model and the Support Vector Machine. A range of learning parameters is analysed, including the number of Gaussian mixtures, the support vector kernel (linear, radial basis function (RBF) and polynomial), and the order of the RBF and polynomial kernels. Moreover, a modified footprint operator is employed during feature extraction to better describe the geometric characteristics of the local area surrounding a pixel. The performance of the presented system is compared to a baseline UAV forced landing site detection system which uses edge features and an Artificial Neural Network (ANN) region type classifier. Experiments conducted on aerial image datasets captured over typical urban environments reveal that improved landing site detection can be achieved with an SVM classifier with an RBF kernel using a combination of colour and texture features. Compared to the baseline system, the proposed system provides a significant improvement in the chance of detecting a safe landing area, and its performance is more stable than the baseline's in the presence of changes to the UAV altitude.
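A minimal sketch of the classifier stage described in this abstract, using scikit-learn to compare the kernels analysed (linear, RBF, polynomial). The per-pixel colour/texture features and labels below are synthetic placeholders; the paper's modified footprint operator is not reproduced here.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12))    # per-pixel colour + texture features (placeholder)
y = rng.integers(0, 2, size=1000)  # 1 = safe landing area, 0 = unsafe (placeholder)

# Grid search over the kernel types and kernel orders analysed in the paper.
search = GridSearchCV(
    make_pipeline(StandardScaler(), SVC()),
    param_grid={
        "svc__kernel": ["linear", "rbf", "poly"],
        "svc__C": [0.1, 1, 10],
        "svc__gamma": ["scale", 0.1, 1],  # RBF/poly kernel width
        "svc__degree": [2, 3],            # polynomial kernel order
    },
    cv=5,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```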
Abstract:
Objective: To synthesise recent research on the use of machine learning approaches to mining textual injury surveillance data. Design: Systematic review. Data sources: The electronic databases searched included PubMed, CINAHL, MEDLINE, Google Scholar, and ProQuest. The bibliography of all relevant articles was examined and associated articles were identified using a snowballing technique. Selection criteria: For inclusion, articles were required to meet the following criteria: (a) used a health-related database, (b) focused on injury-related cases, and (c) used machine learning approaches to analyse textual data. Methods: The papers identified through the search were screened, resulting in 16 papers selected for review. Articles were reviewed to describe the databases and methodology used, the strengths and limitations of different techniques, and the quality assurance approaches used. Due to heterogeneity between studies, meta-analysis was not performed. Results: Occupational injuries were the focus of half of the machine learning studies, and the most common methods described were Bayesian probability or Bayesian network based methods, used either to predict injury categories or to extract common injury scenarios. Models were evaluated through comparison with gold standard data, content expert evaluation, or statistical measures of quality. Machine learning was found to provide high precision and accuracy when predicting a small number of categories, and was valuable for visualisation of injury patterns and prediction of future outcomes. However, difficulties related to generalisability, source data quality, complexity of models, and integration of content and technical knowledge were discussed. Conclusions: The use of narrative text for injury surveillance has grown in popularity, complexity and quality over recent years. With advances in data mining techniques, increased capacity for analysis of large databases, involvement of computer scientists in the injury prevention field, and more comprehensive use and description of quality assurance methods in text mining approaches, it is likely that we will see continued growth and advancement in knowledge of text mining in the injury field.
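A minimal sketch of one technique family common to the reviewed studies: a Bayesian (here, multinomial naive Bayes) classifier assigning injury categories to narrative text. The narratives and labels below are invented placeholders, not data from any reviewed study.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Placeholder injury narratives and category labels.
narratives = [
    "worker slipped on wet floor and fractured wrist",
    "caught hand in conveyor belt during maintenance",
    "fell from ladder while replacing light fitting",
]
labels = ["fall", "machinery", "fall"]

# Vectorise the free text, then fit a Bayesian category classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(narratives, labels)
print(model.predict(["operator's glove caught in rotating machine part"]))
```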
Abstract:
This thesis is concerned with the detection and prediction of rain in environmental recordings using different machine learning algorithms. The results obtained in this research will help ecologists to efficiently analyse environmental data and monitor biodiversity.
Abstract:
The mining industry is highly suitable for the application of robotics and automation technology since the work is arduous, dangerous and often repetitive. This paper describes the development of an automation system for a physically large and complex field robotic system: a 3,500-tonne mining machine (a dragline). The major components of the system are discussed, with particular emphasis on the machine/operator interface. A very important aspect of this system is that it must work cooperatively with a human operator, seamlessly passing control back and forth in order to achieve the main aim: increased productivity.
Abstract:
Neural interface devices, and the melding of mind and machine, challenge the law in determining where civil liability for injury, damage or loss should lie. The ability of the human mind to instruct and control these devices means that, in a negligence action against a person with a neural interface device, determining the standard of care owed by him or her will be of paramount importance. This article considers some of the factors that may influence the court’s determination of the appropriate standard of care to be applied in this situation, leading to the conclusion that a new standard of care might evolve.
Abstract:
The mining industry presents us with a number of ideal applications for sensor-based machine control because of the unstructured environment that exists within each mine. The aim of the research presented here is to increase the productivity of existing large compliant mining machines by retrofitting them with enhanced sensing and control technology. The current research focuses on the automatic control of the swing motion cycle of a dragline and on an automated roof bolting system. We have achieved: closed-loop swing control of a one-tenth-scale model dragline; and single-degree-of-freedom closed-loop visual control of an electro-hydraulic manipulator in the laboratory, developed from standard components.
Abstract:
Everything revolves around desiring-machines and the production of desire… Schizoanalysis merely asks what are the machinic, social and technical indices on a socius that open to desiring-machines (Deleuze & Guattari, 1983, pp. 380-381). Achievement tests like NAPLAN are fairly recent, yet common, education policy initiatives in much of the Western world. They intersect with, use and change pre-existing logics of education, teaching and learning. Much has been written about the form and function of these tests, the ‘stakes’ involved and the effects of their practice. This paper adopts a different “angle of vision” to ask what ‘opens’ education to these regimes of testing (Roy, 2008)? This paper builds on previous analyses of NAPLAN as a modulating machine, or a machine characterised by the increased intensity of connections and couplings. One affect can be “an existential disquiet” as “disciplinary subjects attempt to force coherence onto a disintegrating narrative of self” (Thompson & Cook, 2012, p. 576). Desire operates at all levels of the education assemblage; however, our argument is that achievement testing manifests desire as ‘lack’, seen in the desire for improved results, the desire for increased control, the desire for freedom, and the desire for acceptance, to name a few. For Deleuze and Guattari, desire is irreducible to lack; instead, desire is productive. As a productive assemblage, education machines operationalise and produce through desire; “Desire is a machine, and the object of the desire is another machine connected to it” (Deleuze & Guattari, 1983, p. 26). This intersection is complexified by the strata at which connections occur, and by the molar and molecular connections and flows they make possible. Our argument is that when attention is paid to the macro and micro connections, and to the machines built and disassembled as a result of high-stakes testing, a map is constructed that outlines possibilities, desires and blockages within the education assemblage. This schizoanalytic cartography suggests a new analysis of these ‘axioms’ of testing and accountability. It follows the flows and disruptions made possible as different or altered connections are made and as new machines are brought online. Thinking of education machinically requires recognising that “every machine functions as a break in the flow in relation to the machine to which it is connected, but at the same time is also a flow itself, or the production of flow, in relation to the machine connected to it” (Deleuze & Guattari, 1983, p. 37). Through its potential to map desire, desire-production and the production of desire within those assemblages that have come to dominate our understanding of what is possible, Deleuze and Guattari’s method of schizoanalysis provides a provocative lens for grappling with the question of what one can do, and what lines of flight are possible.
Abstract:
Cord cutting refers to the act of cable and satellite consumers cancelling their subscriptions and opting instead for non-traditional distribution outlets, like streaming media platforms. The trend has been the subject of much debate in the trade press and a source of much concern for the industry. Yet many questions remain unanswered: Is it really a major trend? Does it save consumers money? Can viewers still find the content they love? How do we even “cut the cord” anyway?
Abstract:
Lateralization of temporal lobe epilepsy (TLE) is critical for successful outcome of surgery to relieve seizures. TLE affects brain regions beyond the temporal lobes and has been associated with aberrant brain networks, based on evidence from functional magnetic resonance imaging. We present here a machine learning-based method for determining the laterality of TLE, using features extracted from resting-state functional connectivity of the brain. A comprehensive feature space was constructed to include network properties within local brain regions, between brain regions, and across the whole network. Feature selection was performed based on random forest and a support vector machine was employed to train a linear model to predict the laterality of TLE on unseen patients. A leave-one-patient-out cross validation was carried out on 12 patients and a prediction accuracy of 83% was achieved. The importance of selected features was analyzed to demonstrate the contribution of resting-state connectivity attributes at voxel, region, and network levels to TLE lateralization.
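A minimal sketch of the pipeline this abstract describes: random-forest-based feature selection followed by a linear SVM, evaluated with leave-one-patient-out cross-validation. The resting-state connectivity features and laterality labels below are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 500))   # 12 patients x connectivity features (placeholder)
y = rng.integers(0, 2, size=12)  # 0 = left TLE, 1 = right TLE (placeholder)

pipe = make_pipeline(
    # Keep only features the random forest ranks as important.
    SelectFromModel(RandomForestClassifier(n_estimators=200, random_state=0)),
    SVC(kernel="linear"),        # linear model predicting laterality
)
scores = cross_val_score(pipe, X, y, cv=LeaveOneOut())
print("leave-one-patient-out accuracy:", scores.mean())
```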
Abstract:
In the past few years, the virtual machine (VM) placement problem has been studied intensively and many algorithms for it have been proposed. However, these algorithms have not been widely used in today's cloud data centers because they do not consider the cost of migrating from the current VM placement to the new optimal placement. As a result, the gain from optimizing VM placement may be less than the migration cost incurred in moving to the new placement. To address this issue, this paper presents a penalty-based genetic algorithm (GA) for the VM placement problem that considers the migration cost in addition to the energy consumption and the total inter-VM traffic flow of the new placement. The GA has been implemented and evaluated experimentally, and the results show that it outperforms two well-known algorithms for the VM placement problem.
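A minimal sketch of the penalty-based fitness idea: the cost of a candidate placement combines energy, inter-VM traffic, and a penalty proportional to the number of migrations away from the current placement. The cost models and weights are illustrative assumptions, not the paper's exact formulation.

```python
import random

NUM_VMS, NUM_HOSTS = 20, 6
current = [random.randrange(NUM_HOSTS) for _ in range(NUM_VMS)]  # current placement

def energy(placement):
    return len(set(placement))  # proxy: number of active hosts

def traffic(placement):
    # Proxy: count communicating VM pairs placed on different hosts.
    return sum(placement[i] != placement[i + 1] for i in range(NUM_VMS - 1))

def fitness(placement, w_energy=1.0, w_traffic=0.5, w_migration=0.8):
    migrations = sum(a != b for a, b in zip(placement, current))
    return (w_energy * energy(placement)
            + w_traffic * traffic(placement)
            + w_migration * migrations)  # migration-cost penalty term

# One GA-style step: mutate a copy and keep the lower-cost placement.
candidate = current[:]
candidate[random.randrange(NUM_VMS)] = random.randrange(NUM_HOSTS)
best = min(current, candidate, key=fitness)
print(fitness(best))
```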
Abstract:
Purpose The purpose of this paper is to highlight and promote fresh thinking in services marketing research. Design/methodology/approach The topic of the special issue was deliberately chosen to encourage fresh ideas and concepts that will move the discipline forward. The accepted papers have been categorised for ease and convenience of reading by scholars and practitioners, with a short commentary on each category. Findings There is a wealth of forward-thinking by service(s) marketing researchers that bodes well for the future of the sub-discipline. Research limitations/implications The special issue does not address fresh thinking in all areas of services marketing research. Other potential areas for fresh thinking are identified. Originality/value New thinking in a scholarly field is necessary to propel the discipline forward.
Abstract:
In this paper, we present a machine learning approach to measuring the visual quality of JPEG-coded images. The features for predicting perceived image quality are extracted by considering key human visual sensitivity (HVS) factors such as edge amplitude, edge length, background activity and background luminance. Image quality assessment involves estimating the functional relationship between HVS features and subjective test scores. The quality of the compressed images is obtained without reference to their original images (a 'no-reference' metric). Here, the problem of quality estimation is transformed into a classification problem and solved using the extreme learning machine (ELM) algorithm. In ELM, the input weights and bias values are randomly chosen and the output weights are calculated analytically. The generalization performance of the ELM algorithm for classification problems with imbalance in the number of samples per quality class depends critically on the input weights and bias values. Hence, we propose two schemes, the k-fold selection scheme (KS-ELM) and the real-coded genetic algorithm (RCGA-ELM), to select the input weights and bias values such that the generalization performance of the classifier is maximized. Results indicate that the proposed schemes significantly improve the performance of the ELM classifier under imbalanced conditions for image quality assessment. The experimental results show that the visual quality estimated by the proposed RCGA-ELM emulates the mean opinion score very well. The experimental results are compared with an existing JPEG no-reference image quality metric and the full-reference structural similarity image quality metric.
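A minimal sketch of a plain extreme learning machine: random input weights and biases, a single hidden layer, and output weights solved analytically via a least-squares pseudo-inverse. The HVS features and quality labels below are placeholders, and the paper's KS-ELM / RCGA-ELM weight-selection schemes are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))     # HVS features per image (placeholder)
y = rng.integers(0, 5, size=200)  # quality class labels (placeholder)
T = np.eye(5)[y]                  # one-hot targets

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (fixed)
b = rng.normal(size=n_hidden)                # random biases (fixed)

H = np.tanh(X @ W + b)            # hidden-layer activations
beta = np.linalg.pinv(H) @ T      # output weights, calculated analytically

pred = np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
print("training accuracy:", (pred == y).mean())
```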
Abstract:
Inventory Management (IM) plays a decisive role in enhancing the efficiency and competitiveness of manufacturing enterprises, and major manufacturing enterprises are therefore following IM practices as a strategy to improve efficiency and achieve competitiveness. However, the spread of an IM culture among Small and Medium Enterprises (SMEs) is limited, owing to a lack of initiative and expertise as well as financial constraints, in developed countries, let alone developing ones. Against this backdrop, this paper attempts to ascertain the role and importance of IM practices in the performance of SMEs in the machine tools industry of Bangalore, India. The relationship between inventory management practices and inventory cost is probed based on primary data gathered from 91 SMEs. The paper brings out that formal IM practices have a positive impact on the inventory performance of SMEs.
Abstract:
We apply the method of multiple scales (MMS) to a well-known model of regenerative cutting vibrations in the large-delay regime. By "large" we mean the delay is much larger than the time scale of typical cutting tool oscillations. The MMS up to second order for such systems has been developed recently, and is applied here to study tool dynamics in the large-delay regime. The second-order analysis is found to be much more accurate than the first-order analysis. Numerical integration of the MMS slow flow is much faster than for the original equation, yet shows excellent accuracy. The main advantage of the present analysis is that infinite-dimensional dynamics is retained in the slow flow, while the more usual center manifold reduction gives a planar phase space. Lower-dimensional dynamical features, such as Hopf bifurcations and families of periodic solutions, are also captured by the MMS. Finally, the strong sensitivity of the dynamics to small changes in parameter values is seen clearly.
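For orientation, a standard non-dimensional form of the regenerative cutting (chatter) model, the class of delay-differential equation the MMS is applied to, is sketched below; the precise model and coefficients used in the paper may differ:

\[
\ddot{x}(t) + 2\zeta\,\dot{x}(t) + x(t) = p\,\bigl[x(t-\tau) - x(t)\bigr], \qquad \tau \gg 1,
\]

where \(x\) is the tool displacement, \(\zeta\) the damping ratio, \(p\) the cutting-force coefficient, and \(\tau\) the regenerative delay (one spindle revolution); the large-delay regime corresponds to \(\tau\) being much greater than the \(O(1)\) oscillation time scale.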
Abstract:
In this paper, downscaling models are developed using a support vector machine (SVM) for obtaining projections of monthly mean maximum and minimum temperatures (T-max and T-min) at the river-basin scale. The effectiveness of the model is demonstrated through application to downscaling the predictands for the catchment of the Malaprabha reservoir in India, which is considered to be a climatically sensitive region. The probable predictor variables are extracted from (1) the National Centers for Environmental Prediction (NCEP) reanalysis dataset for the period 1978-2000, and (2) simulations from the third-generation Canadian Coupled Global Climate Model (CGCM3) for emission scenarios A1B, A2, B1 and COMMIT for the period 1978-2100. The predictor variables are classified into three groups, namely A, B and C. Large-scale atmospheric variables such as air temperature and zonal and meridional wind velocities at 925 mb, which are often used for downscaling temperature, are considered as predictors in Group A. Surface flux variables such as latent heat (LH), sensible heat, shortwave radiation and longwave radiation fluxes, which control the temperature of the Earth's surface, are tried as plausible predictors in Group B. Group C comprises all the predictor variables in Groups A and B. Scatter plots and cross-correlations are used to verify the reliability of the CGCM3 simulation of the predictor variables and to study the predictor-predictand relationships. The impact of trends in the predictor variables on the downscaled temperature was studied. The predictor air temperature at 925 mb showed an increasing trend, while the rest of the predictors showed no trend. The performance of the SVM models that were developed, one for each combination of predictor group, predictand, calibration period and location-based stratification (land; land and ocean) of climate variables, was evaluated. In general, models using predictor variables pertaining to the land surface improved the performance of SVM models for downscaling T-max and T-min.
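A minimal sketch of the downscaling step: a support vector regression model mapping large-scale predictors (e.g. reanalysis variables) to a basin-scale predictand such as monthly T-max. The data below are synthetic placeholders standing in for the NCEP/CGCM3 inputs described above.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# 276 months (1978-2000) x 8 large-scale predictor variables (placeholder).
X = rng.normal(size=(276, 8))
t_max = 30 + X[:, 0] + rng.normal(scale=0.5, size=276)  # predictand (placeholder)

# Standardise predictors, then tune an RBF support vector regressor.
model = GridSearchCV(
    make_pipeline(StandardScaler(), SVR(kernel="rbf")),
    param_grid={"svr__C": [1, 10, 100], "svr__gamma": ["scale", 0.1]},
    cv=5,
)
model.fit(X, t_max)
print("calibration R^2:", model.score(X, t_max))
```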