966 results for Separators (Machines)
Abstract:
Brazil is the world's biggest producer of sugar cane, with a planted area of 7 x 10^6 hectares. The system mainly used for planting is the semi-mechanized one, which consists of opening the furrows with a machine, manually placing the fractioned stalks and then covering the furrows with machines. The large amount of human labor required by the semi-mechanized system is becoming harder to find and more expensive, indicating the need for a fully mechanized operation. Currently the Brazilian agricultural machinery industry offers six different types of fully mechanized sugar cane planters (two planting whole stalks and four using mechanically harvested stalks known as billets). All of them plant two furrows simultaneously at 1.5 m row spacing. This study analyzed five different machines and the following variables: working speed (km h^-1), effective capacity (ha h^-1), drawbar force (kgf), drawbar power (HP), fuel consumption (L h^-1) and costs (US$ ha^-1), comparing them with the semi-mechanized system. The research also characterized the stalks for planting by viable bud number (%), non-viable bud number (%) and billet length (m). Lastly, the mechanized planting system proved cheaper than the conventional one, although none of the machines has an adequate metering mechanism for placing the right amount of sugar cane seed.
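The relation between the measured working speed and the reported effective capacity can be sketched as follows; the 3 m working width (two furrows at 1.5 m spacing) comes from the abstract, while the field-efficiency figure is an illustrative assumption, not a value from the study:

```python
def effective_capacity(speed_kmh: float, width_m: float, field_eff: float) -> float:
    """Effective field capacity in ha/h.

    speed_kmh * width_m is an area rate in (km * m)/h; dividing by 10
    converts it to ha/h.  field_eff (0-1) discounts time lost to turns
    and refilling, so effective capacity < theoretical capacity.
    """
    return speed_kmh * width_m / 10.0 * field_eff

# Hypothetical two-row planter: 1.5 m rows -> 3 m width, 80% efficiency
print(effective_capacity(5.0, 3.0, 0.8))  # 1.2 ha/h
```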
Abstract:
Recently, we built a classification model capable of assigning a given sesquiterpene lactone (STL) to exactly one tribe of the plant family Asteraceae from which the STL has been isolated. Although many plant species are able to biosynthesize a set of peculiar compounds, the occurrence of the same secondary metabolites in more than one tribe of Asteraceae is frequent. Building on our previous work, in this paper we explore the possibility of assigning an STL to more than one tribe (class) simultaneously. When an object may belong to more than one class simultaneously, it is called multilabeled. In this work, we present a general overview of the techniques available to examine multilabeled data. The problem of evaluating the performance of a multilabeled classifier is discussed. Two particular multilabeled classification methods, cross-training with support vector machines (ct-SVM) and multilabeled k-nearest neighbors (M-L-kNN), were applied to the classification of the STLs into seven tribes from the plant family Asteraceae. The results are compared to a single-label classification and are analyzed from a chemotaxonomic point of view. The multilabeled approach allowed us to (1) model the reality as closely as possible, (2) improve our understanding of the relationship between the secondary metabolite profiles of different Asteraceae tribes, and (3) significantly decrease the number of plant sources to be considered for finding a certain STL. The presented classification models are useful for the targeted collection of plants with the objective of finding plant sources of natural compounds that are biologically active or possess other specific properties of interest.
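Multilabel k-nearest-neighbour classification can be sketched with a simplified majority-vote variant (the published M-L-kNN method uses a per-label Bayesian posterior; this is only an illustration of the idea, and all data below are invented):

```python
import math

def ml_knn_predict(train_X, train_Y, x, k=3):
    """Simplified multilabel kNN: keep every label carried by at least
    half of the k nearest training objects.  train_Y holds one *set* of
    labels per object, so an object may belong to several classes."""
    nearest = sorted(range(len(train_X)),
                     key=lambda i: math.dist(train_X[i], x))[:k]
    votes = {}
    for i in nearest:
        for label in train_Y[i]:
            votes[label] = votes.get(label, 0) + 1
    return {lab for lab, v in votes.items() if v >= k / 2}

# Toy 2-D "descriptor" data: points near the origin belong to tribe "A",
# distant points to tribe "B", and one border object to both (multilabeled).
X = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5), (3, 3)]
Y = [{"A"}, {"A"}, {"A"}, {"B"}, {"B"}, {"B"}, {"A", "B"}]
print(ml_knn_predict(X, Y, (0.5, 0.5)))  # {'A'}
```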
Abstract:
In the context of cancer diagnosis and treatment, we consider the problem of constructing an accurate prediction rule on the basis of a relatively small number of tumor tissue samples of known type containing expression data on very many (possibly thousands of) genes. Recently, results have been presented in the literature suggesting that it is possible to construct a prediction rule from only a few genes such that it has a negligible prediction error rate. However, in these results the test error or the leave-one-out cross-validated error is calculated without allowance for the selection bias. There is no allowance because the rule is either tested on tissue samples that were used in the first instance to select the genes being used in the rule, or because the cross-validation of the rule is not external to the selection process; that is, gene selection is not performed anew in training the rule at each stage of the cross-validation. We describe how in practice the selection bias can be assessed and corrected for by performing either cross-validation or the bootstrap external to the selection process. We recommend using 10-fold rather than leave-one-out cross-validation and, for the bootstrap, we suggest using the so-called .632+ bootstrap error estimate designed to handle overfitted prediction rules. Using two published data sets, we demonstrate that when correction is made for the selection bias, the cross-validated error is no longer zero for a subset of only a few genes.
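The selection bias can be demonstrated with a small experiment on pure-noise data: selecting "genes" on the full data set before cross-validating gives an optimistically low error, while re-selecting inside each training fold (external cross-validation) does not. The feature scorer and nearest-centroid rule below are illustrative stand-ins, not the paper's exact choices:

```python
import random
random.seed(0)

def top_features(X, y, g):
    """Rank features by absolute difference of class means; keep g."""
    def score(j):
        a = [X[i][j] for i in range(len(X)) if y[i] == 0]
        b = [X[i][j] for i in range(len(X)) if y[i] == 1]
        return abs(sum(a) / len(a) - sum(b) / len(b))
    return sorted(range(len(X[0])), key=score, reverse=True)[:g]

def centroid_cv_error(X, y, feats, folds=5):
    """K-fold CV error of a nearest-centroid rule.  If feats is None,
    features are re-selected inside every training fold (external CV)."""
    idx = list(range(len(X)))
    random.shuffle(idx)
    errors = 0
    for f in range(folds):
        test = idx[f::folds]
        train = [i for i in idx if i not in test]
        fs = feats if feats is not None else top_features(
            [X[i] for i in train], [y[i] for i in train], 10)
        cent = {c: [sum(X[i][j] for i in train if y[i] == c)
                    / sum(1 for i in train if y[i] == c) for j in fs]
                for c in (0, 1)}
        for i in test:
            d = {c: sum((X[i][j] - m) ** 2 for j, m in zip(fs, cent[c]))
                 for c in (0, 1)}
            errors += (min(d, key=d.get) != y[i])
    return errors / len(X)

# Pure-noise data: any apparent accuracy is selection bias.
n, p = 40, 500
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [i % 2 for i in range(n)]

internal = centroid_cv_error(X, y, top_features(X, y, 10))  # selection outside CV
external = centroid_cv_error(X, y, None)                    # selection inside CV
print(internal, external)  # internal is optimistically low; external is not
```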
Abstract:
We introduced a spectral clustering algorithm based on the bipartite graph model for the Manufacturing Cell Formation problem in [Oliveira S, Ribeiro JFF, Seok SC. A spectral clustering algorithm for manufacturing cell formation. Computers and Industrial Engineering. 2007 [submitted for publication]]. It constructs two similarity matrices: one for parts and one for machines. The algorithm executes a spectral clustering algorithm on each separately to find families of parts and cells of machines. The similarity measure in that approach utilized only limited information between parts and between machines. This paper reviews several well-known similarity measures that have been used in Group Technology. Computational clustering results are compared by various performance measures. (C) 2008 The Society of Manufacturing Engineers. Published by Elsevier Ltd. All rights reserved.
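One of the classic Group Technology similarity measures reviewed in such studies is the Jaccard coefficient between machines, computed from the parts they process (size of the intersection over size of the union of their part sets). The incidence data below are invented for illustration:

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two machines, each given as the set
    of parts it processes: |intersection| / |union|.  A classic Group
    Technology similarity coefficient for machine grouping."""
    return len(a & b) / len(a | b)

# Hypothetical 4-machine x 5-part routing data
routing = {
    "M1": {"P1", "P2", "P3"},
    "M2": {"P1", "P2"},
    "M3": {"P4", "P5"},
    "M4": {"P3", "P4", "P5"},
}
print(jaccard(routing["M1"], routing["M2"]))  # 0.666... (share 2 of 3 parts)
print(jaccard(routing["M1"], routing["M3"]))  # 0.0 (no parts in common)
```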
Abstract:
A graph clustering algorithm constructs groups of closely related parts and machines separately. After they are matched for the fewest intercell moves, a refining process runs on the initial cell formation to decrease the number of intercell moves further. A simple modification of this main approach can handle practical constraints, such as the popular constraint bounding the maximum number of machines in a cell. Our approach greatly improves computational time. More importantly, the number of intercell moves also improved when the computational results were compared with the best known solutions from the literature. (C) 2009 Elsevier Ltd. All rights reserved.
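Counting intercell moves from part routings and a cell assignment is straightforward: every operation a part needs on a machine outside its own cell is one intercell move. The routing data below are invented for illustration:

```python
def intercell_moves(routing, cell_of_machine, cell_of_part):
    """Count operations that a part requires on a machine assigned to a
    different cell than the part itself; each is one intercell move.
    `routing` maps each part to the set of machines it visits."""
    moves = 0
    for part, machines in routing.items():
        for m in machines:
            if cell_of_machine[m] != cell_of_part[part]:
                moves += 1
    return moves

routing = {"P1": {"M1", "M2"}, "P2": {"M1", "M3"}, "P3": {"M3", "M4"}}
cells_m = {"M1": 0, "M2": 0, "M3": 1, "M4": 1}
cells_p = {"P1": 0, "P2": 0, "P3": 1}
print(intercell_moves(routing, cells_m, cells_p))  # 1 (P2 visits M3 in cell 1)
```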
Abstract:
The fabrication of heavy-duty printer heads involves a great deal of grinding work. Previously in the printer manufacturing industry, the four grinding procedures were conducted manually on four separate grinding machines. The productivity of the whole grinding process was low due to the long loading time, and the machine floor space occupation was large because of the four separate grinding machines. The manual operation also caused inconsistent quality. This paper reports the system and process development of a highly integrated and automated high-speed grinding system for printer heads. The developed system, which is believed to be the first of its kind, not only produces printer heads of consistently good quality, but also significantly reduces the cycle time and machine floor space occupation.
Abstract:
There is no specific test to diagnose Alzheimer's disease (AD). Its diagnosis should be based upon clinical history, neuropsychological and laboratory tests, neuroimaging and electroencephalography (EEG). Therefore, new approaches are necessary to enable earlier and more accurate diagnosis and to follow treatment results. In this study we used a Machine Learning (ML) technique, named Support Vector Machine (SVM), to search for patterns in EEG epochs to differentiate AD patients from controls. As a result, we developed a quantitative EEG (qEEG) processing method for automatic differentiation of patients with AD from normal individuals, as a complement to the diagnosis of probable dementia. We studied EEGs from 19 normal subjects (14 females/5 males, mean age 71.6 years) and 16 patients with probable mild to moderate AD (14 females/2 males, mean age 73.4 years). The analysis of EEG epochs yielded an accuracy of 79.9% and a sensitivity of 83.2%. The analysis considering the diagnosis of each individual patient reached 87.0% accuracy and 91.7% sensitivity.
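The abstract does not state how epoch-level classifier outputs were combined into a per-patient diagnosis; one plausible aggregation, shown here purely as an assumption, is a majority vote over a patient's epochs, which can explain why patient-level accuracy (87.0%) exceeds epoch-level accuracy (79.9%):

```python
def patient_diagnosis(epoch_predictions):
    """Aggregate per-epoch classifier outputs (1 = AD, 0 = control)
    into one per-patient diagnosis by majority vote.  Averaging many
    noisy epoch decisions can outperform any single epoch decision.
    This aggregation rule is an assumption, not the paper's method."""
    return int(sum(epoch_predictions) > len(epoch_predictions) / 2)

# Hypothetical patient: 10 EEG epochs, 7 classified as AD
print(patient_diagnosis([1, 1, 0, 1, 1, 0, 1, 1, 0, 1]))  # 1 (AD)
```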
Abstract:
Introduction: Recently developed portable dental X-ray units increase the mobility of forensic odontologists and allow more efficient X-ray work in a disaster field, especially when used in combination with digital sensors. This type of machine might also have potential for application in remote areas, military and humanitarian missions, dental care of patients with mobility limitations, as well as imaging in operating rooms. Objective: To evaluate the radiographic image quality acquired by three portable X-ray devices in combination with four image receptors, and to evaluate their medical physics parameters. Materials and methods: Images of five samples, consisting of four teeth and one formalin-fixed mandible, were acquired by one conventional wall-mounted X-ray unit, MinRay® 60/70 kVp, used as a clinical standard, and three portable dental X-ray devices: AnyRay® 60 kVp, Nomad® 60 kVp and Rextar® 70 kVp, in combination with a phosphor image plate (PSP), a CCD or a CMOS sensor. Three observers evaluated the images for standard image quality as well as forensic diagnostic quality on a 4-point rating scale. Furthermore, all machines underwent tests for occupational as well as patient dosimetry. Results: Statistical analysis showed good image quality for all systems, with the combination of Nomad® and PSP yielding the best score. A significant difference in image quality between the combinations of the four X-ray devices and four sensors was established (p < 0.05). For patient safety, the exposure rate was determined; the exit dose rates for MinRay® at 60 kVp, MinRay® at 70 kVp, AnyRay®, Nomad® and Rextar® were 3.4 mGy/s, 4.5 mGy/s, 13.5 mGy/s, 3.8 mGy/s and 2.6 mGy/s, respectively. The kVp of the AnyRay® system was the most stable, with a ripple of 3.7%. Short-term variations in the tube output of all devices were less than 10%. AnyRay® presented a higher estimated effective dose than the other machines.
Occupational dosimetry showed that the dose at the operator's hand was lowest with protective shielding (Nomad®: 0.1 µGy). It was also low when using remote control (distance > 1 m: Rextar® < 0.2 µGy, MinRay® < 0.1 µGy). Conclusions: The present study demonstrated the feasibility of using the three portable X-ray systems for specific indications, based on acceptable image quality and sufficient accuracy of the machines, following the standard guidelines for radiation hygiene. (C) 2010 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Consider a tandem system of machines separated by infinitely large buffers. The machines process a continuous flow of products, possibly at different speeds. The life and repair times of the machines are assumed to be exponential. We claim that the overflow probability of each buffer has an exponential decay, and provide an algorithm to determine the exact decay rates in terms of the speeds and the failure and repair rates of the machines. These decay rates provide useful qualitative insight into the behavior of the flow line. In the derivation of the algorithm we use the theory of Large Deviations.
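A minimal Monte Carlo sketch of the kind of fluid model described: a single buffer fed at a constant rate and drained by one machine with exponential life and repair times. All rates here are made-up illustrations, and the paper itself treats the full tandem line analytically via large deviations rather than by simulation:

```python
import random
random.seed(1)

def simulate_buffer(feed=1.0, drain=2.0, fail=0.5, repair=1.5,
                    horizon=200000.0):
    """Fluid buffer fed at constant rate `feed` and emptied at rate
    `drain` by a machine with exponential life (rate `fail`) and repair
    (rate `repair`) times.  The buffer rises at `feed` while the machine
    is down and falls at `drain - feed` while it is up.  Buffer levels
    are recorded at each state change to roughly estimate the
    stationary tail P(content > b)."""
    t, level, up = 0.0, 0.0, True
    samples = []
    while t < horizon:
        dt = random.expovariate(fail if up else repair)
        rate = (feed - drain) if up else feed
        level = max(0.0, level + rate * dt)
        samples.append(level)
        up = not up
        t += dt
    return samples

samples = simulate_buffer()
for b in (1.0, 3.0, 5.0):
    tail = sum(s > b for s in samples) / len(samples)
    print(b, tail)  # the tail shrinks roughly geometrically in b
```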
Abstract:
Modelling and simulation studies were carried out at 26 cement clinker grinding circuits, including tube mills, air separators and high pressure grinding rolls, in 8 plants. The results reported earlier have shown that tube mills can be modelled as several mills in series, and that the internal partition in tube mills can be modelled as a screen which must retain coarse particles in the first compartment but not impede the flow of drying air. In this work the modelling has been extended to show that the Tromp curve, which describes separator (classifier) performance, can be modelled in terms of d50(corr), by-pass, the fish hook and the sharpness of the curve. Also, the high pressure grinding rolls model developed at the Julius Kruttschnitt Mineral Research Centre gives satisfactory predictions using a breakage function derived from impact and compressed bed tests. Simulation studies of a full plant incorporating a tube mill, HPGR and separators showed that the models could successfully predict the performance of another mill working under different conditions. The simulation capability can therefore be used for process optimization and design. (C) 2001 Elsevier Science Ltd. All rights reserved.
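A separator partition curve of the kind described (cut size d50(corr), sharpness, by-pass) can be illustrated with Whiten's widely used reduced efficiency curve. This sketch is a generic partition-curve form, not necessarily the paper's exact parameterisation, and it omits the fish-hook term:

```python
import math

def tromp(d, d50c, alpha, bypass):
    """Fraction of feed of size d reporting to the coarse stream:
    Whiten's reduced efficiency curve (sharpness `alpha`, corrected cut
    size `d50c`) plus a constant fine-particle by-pass to coarse."""
    x = d / d50c
    corrected = (math.exp(alpha * x) - 1) / (math.exp(alpha * x)
                                             + math.exp(alpha) - 2)
    return bypass + (1 - bypass) * corrected

# By construction the corrected curve passes through 0.5 at d = d50c,
# and at zero size only the by-pass fraction reports to coarse:
print(tromp(30.0, 30.0, 2.5, 0.0))  # 0.5
print(tromp(0.0, 30.0, 2.5, 0.2))   # 0.2
```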
Abstract:
This paper discusses the design and characterisation of a short, and hence portable, impact load cell for in-situ quantification of ore breakage properties under impact loading conditions. Much literature has been published in the past two decades on impact load cells for ore breakage testing. It has been conclusively shown that such machines yield significant quantitative energy-fragmentation information about industrial ores. However, the documented load cells are all laboratory systems that are not adapted for in-situ testing because of their dimensions and operating requirements. The authors report on a new portable impact load cell designed specifically for in-situ testing. The load cell is 1.5 m in height and weighs 30 kg. Its physical and operating characteristics are detailed in the paper, including physical dimensions, calibration and signal deconvolution. Emphasis is placed on the deconvolution issue, which is significant for such a short load cell. Finally, it is conclusively shown that the short load cell is quantitatively as accurate as its larger laboratory analogues. (C) 2002 Elsevier Science B.V. All rights reserved.
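The deconvolution step can be illustrated generically: if the recorded strain signal is the convolution of the cell's impulse response with the impact force, the force can be recovered by forward substitution. This is a bare discrete sketch with made-up numbers; practical load-cell deconvolution must also handle wave propagation and measurement noise:

```python
def convolve(h, x):
    """Discrete convolution y = h * x (full length)."""
    y = [0.0] * (len(h) + len(x) - 1)
    for i, hi in enumerate(h):
        for j, xj in enumerate(x):
            y[i + j] += hi * xj
    return y

def deconvolve(h, y, n):
    """Recover the first n input samples from y = h * x by forward
    substitution (requires h[0] != 0).  Noise-free toy version only."""
    x = []
    for k in range(n):
        acc = y[k] - sum(h[i] * x[k - i]
                         for i in range(1, min(k, len(h) - 1) + 1))
        x.append(acc / h[0])
    return x

h = [1.0, 0.5, 0.25]          # hypothetical impulse response
force = [0.0, 2.0, 3.0, 1.0]  # "true" impact force pulse
signal = convolve(h, force)   # what the strain gauges would record
print(deconvolve(h, signal, len(force)))  # recovers [0.0, 2.0, 3.0, 1.0]
```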
Abstract:
Purpose. Health promotion policy frameworks, recent theorizing, and research all emphasize understanding and mobilizing environmental influences to change particular health-related behaviors in specific settings. The workplace is a key environmental setting. The Checklist of Health Promotion Environments at Worksites (CHEW) was designed as a direct observation instrument to assess characteristics of worksite environments that are known to influence health-related behaviors. Methods. The CHEW is a 112-item checklist of workplace environment features hypothesized to be associated, both positively and negatively, with physical activity, healthy eating, alcohol consumption, and smoking. The three environmental domains assessed are (1) physical characteristics of the worksite, (2) features of the information environment, and (3) characteristics of the immediate neighborhood around the workplace. The conceptual rationale and development studies for the CHEW are described, and data from observational studies of 20 worksites are reported. Results. The data on CHEW-derived environmental attributes showed generally good reliability and identified meaningful sets of variables that plausibly may influence health-related behaviors. With the exception of one information environment attribute, intraclass correlation coefficients ranged from 0.80 to 1.00. Descriptive statistics on selected physical and information environment characteristics indicated that vending machines, showers, bulletin boards, and signs prohibiting smoking were common across worksites. Bicycle racks, visible stairways, and signs related to alcohol consumption, nutrition, and health promotion were relatively uncommon. Conclusions. These findings illustrate the types of data on environmental attributes that can be derived, their relevance for program planning, and how they can characterize variability across worksites.
The CHEW is a promising observational measure that has the potential to assess environmental influences on health behaviors and to evaluate workplace health promotion programs.
Abstract:
We compared the quality of realtime fetal ultrasound images transmitted using ISDN and IP networks. Four experienced obstetric ultrasound specialists viewed standard recordings in a randomized trial and rated the appearance of 30 fetal anatomical landmarks, each on a seven-point scale. A total of 12 evaluations were performed for various combinations of bandwidths (128, 384 or 768 kbit/s) and networks (ISDN or IP). The intraobserver coefficient of variation was 2.9%, 5.0%, 12.7% and 14.7% for the four observers. The mean overall ratings by each of the four observers were 4.6, 4.8, 5.0 and 5.3, respectively (a rating of 4 indicated satisfactory visualization and 7 indicated quality as good as the original recording). Analysis of variance showed neither significant interobserver variations nor significant differences in the mean scores for the different types of videoconferencing machines used. The most significant variable affecting the mean score was the bandwidth used. For ISDN, the mean score was 3.7 at 128 kbit/s, which was significantly worse than the mean score of 4.9 at 384 kbit/s, which was in turn significantly worse than the mean score of 5.9 at 768 kbit/s. The mean score for transmission using IP was about 0.5 points lower than that using ISDN across all the different bandwidths, but the differences were not significant. It appears that IP transmission in a private (non-shared) network is an acceptable alternative to ISDN for fetal tele-ultrasound and one deserving further study.
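The intraobserver coefficient of variation reported per observer is simply the standard deviation of repeated ratings relative to their mean, expressed as a percentage; the ratings below are invented for illustration, not the study's data:

```python
import statistics

def intraobserver_cv(ratings):
    """Intraobserver coefficient of variation (%): the spread of one
    observer's repeated ratings of the same material, relative to the
    mean rating.  (A simplification; the study pooled 30 landmarks.)"""
    return statistics.stdev(ratings) / statistics.mean(ratings) * 100

# Hypothetical repeated ratings by one observer on the 7-point scale
print(round(intraobserver_cv([4.6, 4.8, 4.7, 4.5, 4.9]), 1))
```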
Abstract:
Low concentrate density from wet drum magnetic separators in dense medium circuits can cause operating difficulties due to an inability to obtain the required circulating medium density and, indirectly, high medium solids losses. The literature is almost silent on the processes controlling concentrate density. However, the common name for the region through which concentrate is discharged, the squeeze pan gap, implies that some extrusion process is thought to be at work. There is no model of magnetics recovery in a wet drum magnetic separator which includes as inputs all significant machine and operating variables. A series of trials, in both factorial experiments and single-variable experiments, was done using a purpose-built rig featuring a small industrial-scale (700 mm lip length, 900 mm drum diameter) wet drum magnetic separator. A substantial data set of 191 trials was generated in this work. The results of the factorial experiments were used to identify the variables having a significant effect on magnetics recovery. It is proposed, based both on the experimental observations of the present work and on observations reported in the literature, that the process controlling magnetic separator concentrate density is one of drainage. Such a process should be definable by an initial moisture, a drainage rate and a drainage time, the latter being set by the volumetric flowrate and the volume within the drainage zone. The magnetics can be characterised by an experimentally derived ultimate drainage moisture. A model based on these concepts and containing adjustable parameters was developed. This model was then fitted to a randomly chosen 80% of the data, and validated by application to the remaining 20%. The model is shown to be a good fit to the data over concentrate solids contents from 40% to 80% solids, and for both magnetite and ferrosilicon feeds. (C) 2003 Elsevier Science B.V. All rights reserved.
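The fit-then-validate procedure (fit the adjustable parameters on a random 80% of the trials, check on the remaining 20%) can be sketched with a generic exponential drainage model. The model form, parameter values and synthetic data below are illustrative assumptions, not the paper's:

```python
import math
import random
random.seed(2)

def moisture(t, m0, m_ult, k):
    """Generic drainage sketch: moisture decays from the initial value
    m0 toward an ultimate drainage moisture m_ult at rate k."""
    return m_ult + (m0 - m_ult) * math.exp(-k * t)

# Synthetic "trials": drainage times with noisy measured moistures
true = dict(m0=0.60, m_ult=0.20, k=1.3)
data = [(t / 10, moisture(t / 10, **true) + random.gauss(0, 0.005))
        for t in range(40)]
random.shuffle(data)
fit, hold = data[:32], data[32:]      # 80% for fitting / 20% validation

def sse(k, pts):
    """Sum of squared residuals for rate k (m0, m_ult assumed known)."""
    return sum((m - moisture(t, true["m0"], true["m_ult"], k)) ** 2
               for t, m in pts)

# One-parameter grid search over the adjustable rate k
k_hat = min((k / 100 for k in range(50, 300)), key=lambda k: sse(k, fit))
print(k_hat, sse(k_hat, hold))  # k_hat should land close to the true 1.3
```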
Abstract:
Recent work by Siegelmann has shown that the computational power of recurrent neural networks matches that of Turing machines. One important implication is that complex language classes (infinite languages with embedded clauses) can be represented in neural networks. The proofs are based on a fractal encoding of states to simulate the memory and operations of stacks. In the present work, it is shown that similar stack-like dynamics can be learned in recurrent neural networks from simple sequence prediction tasks. Two main types of network solutions are found and described qualitatively as dynamical systems: damped oscillation and entangled spiraling around fixed points. The potential and limitations of each solution type are established in terms of generalization on two different context-free languages. Both solution types constitute novel stack implementations, generally in line with Siegelmann's theoretical work, which supply insights into how embedded structures of languages can be handled in analog hardware.
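The fractal stack encoding behind Siegelmann-style constructions can be sketched directly: the entire stack is a single number in [0, 1), and push and pop are simple affine maps on that number. (A full construction also tracks stack depth so an empty stack is distinguishable from a stack of zeros; that bookkeeping is omitted from this minimal sketch.)

```python
def push(s: float, bit: int) -> float:
    """Shift the stack contents down one binary place and store the new
    top bit in the most significant place of the unit interval."""
    return s / 2 + bit / 2

def pop(s: float):
    """Read the top bit from the most significant place and shift the
    remaining contents back up; returns (bit, new_stack)."""
    bit = 1 if s >= 0.5 else 0
    return bit, 2 * s - bit

s = 0.0
for b in (1, 0, 1):      # push 1, then 0, then 1
    s = push(s, b)
top, s = pop(s)          # last-in, first-out: 1 comes back first
print(top)               # 1
top, s = pop(s)
print(top)               # 0
```

Because push and pop are contractions/expansions by a factor of two, the reachable stack states form a Cantor-like fractal set in [0, 1), which is exactly the kind of state geometry the learned network dynamics approximate.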