965 results for Milking machines


Relevance: 10.00%

Abstract:

Experience has shown that milling machines with carbide-tipped teeth are capable of profiling most asphalt concrete (AC) and portland cement concrete (PCC) pavements. Most standard milling operations today leave a very coarse, generally objectionable surface texture. This research used a Cedarapids Wirtgen 1900C mill modified by adding extra teeth: 411 teeth at a 5 mm transverse spacing (standard spacing is 15 mm) on a 6 ft 4 in. long drum. The mill was used to profile and texture the surface of one AC and two PCC pavements. One year after the milling operation there is still some noticeable change in tire noise, but the general appearance is good. Milling with the additional teeth provides an acceptable surface texture with improved Friction Numbers compared to a non-milled surface.

Relevance: 10.00%

Abstract:

OBJECTIVE: To explore the user-friendliness and ergonomics of seven new-generation intensive care ventilators. DESIGN: Prospective task-performing study. SETTING: Intensive care research laboratory, university hospital. METHODS: Ten physicians experienced in mechanical ventilation, but without prior knowledge of the ventilators, were asked to perform eight specific tasks [turning the ventilator on; recognizing mode and parameters; recognizing and setting alarms; mode change; finding and activating the pre-oxygenation function; pressure support setting; stand-by; finding and activating non-invasive ventilation (NIV) mode]. The time needed for each task was compared to a reference time (obtained by a trained physiotherapist familiar with the devices). A time >180 s was considered a task failure. RESULTS: For each of the tests on the ventilators, all physicians' times were significantly higher than the reference time (P < 0.001). A mean of 13 +/- 8 task failures (16%) was observed per ventilator. The most frequently failed tasks were mode and parameter recognition, starting pressure support and finding the NIV mode. The least often failed tasks were turning on the pre-oxygenation function and alarm recognition and management. Overall, there was substantial heterogeneity between machines, some exhibiting better user-friendliness than others for certain tasks, but no ventilator was clearly better than the others on all points tested. CONCLUSIONS: The present study adds to the available literature outlining the ergonomic shortcomings of mechanical ventilators. These results suggest that closer ties between end-users and manufacturers should be promoted at an early development phase of these machines, based on scientific evaluation of the cognitive processes involved when users operate them in the clinical setting.
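As an illustration of the scoring described above, a minimal sketch (with hypothetical task times and reference values, not the study's data) of how each task could be checked against the 180 s failure threshold and compared to the reference time:

```python
import numpy as np

FAILURE_THRESHOLD_S = 180  # a task taking longer than 180 s counts as a failure

# Hypothetical timings (seconds) for one ventilator:
# rows = physicians, columns = the eight evaluated tasks.
task_times = np.array([
    [25, 60, 190, 45, 30, 220, 50, 200],
    [30, 75, 150, 60, 35, 185, 55, 240],
])
# Hypothetical reference times set by the trained physiotherapist.
reference_times = np.array([10, 20, 35, 15, 12, 30, 18, 40])

failures = task_times > FAILURE_THRESHOLD_S
print("task failures per physician:", failures.sum(axis=1))
print("failure rate per task:      ", failures.mean(axis=0))
print("mean slowdown vs reference: ", (task_times / reference_times).mean(axis=0).round(1))
```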

Relevance: 10.00%

Abstract:

PURPOSE: Late toxicities such as second cancer induction become more important as treatment outcome improves. Often the dose distribution calculated with a commercial treatment planning system (TPS) is used to estimate radiation carcinogenesis for the radiotherapy patient. However, for locations beyond the treatment field borders, the accuracy is not well known. The aim of this study was to perform detailed out-of-field measurements for a typical radiotherapy treatment plan administered with a CyberKnife and a TomoTherapy machine and to compare the measurements to the predictions of the TPS. MATERIALS AND METHODS: Individually calibrated thermoluminescent dosimeters were used to measure absorbed dose in an anthropomorphic phantom at 184 locations. The measured dose distributions from 6 MV intensity-modulated treatment beams for the CyberKnife and TomoTherapy machines were compared to the dose calculations from the TPS. RESULTS: Both TPS underestimate the dose far away from the target volume. Quantitatively, the CyberKnife TPS underestimates the dose at 40 cm from the PTV border by a factor of 60, the TomoTherapy TPS by a factor of two. If a 50% dose uncertainty is accepted, the CyberKnife TPS can predict doses down to approximately 10 mGy/treatment Gy and the TomoTherapy TPS down to 0.75 mGy/treatment Gy; the CyberKnife TPS can then be used up to 10 cm from the PTV border, the TomoTherapy TPS up to 35 cm. CONCLUSIONS: We determined that the CyberKnife and TomoTherapy TPS substantially underestimate the doses far away from the treated volume. It is recommended not to use out-of-field doses from the CyberKnife TPS for applications such as modeling of second cancer induction. The TomoTherapy TPS can be used up to 35 cm from the PTV border (for a 390 cm³ PTV).
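A minimal sketch (with hypothetical dose values, not the study's measurements) of the comparison reported above: the ratio of measured to TPS-predicted out-of-field dose, flagged against the 50% uncertainty criterion:

```python
# Hypothetical thermoluminescent dosimeter readings and TPS predictions,
# expressed in mGy per treatment Gy, at increasing distance from the PTV border.
distance_cm         = [5, 10, 20, 40]
measured_mGy_per_Gy = [12.0, 5.0, 1.5, 0.6]
tps_mGy_per_Gy      = [10.5, 3.0, 0.4, 0.01]

for d, m, t in zip(distance_cm, measured_mGy_per_Gy, tps_mGy_per_Gy):
    underestimation = m / t                      # factor by which the TPS is low
    within_50pct = abs(m - t) / m <= 0.5         # the acceptance criterion used above
    print(f"{d:2d} cm: measured/TPS = {underestimation:6.1f}, within 50%: {within_50pct}")
```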

Relevance: 10.00%

Abstract:

Although the radiation doses involved in basic research radiology are relatively small, the increasing number of radiological procedures makes the associated risks increasingly significant. Quality control techniques in radiological practice have to ensure an adequate system of protection for people exposed to radiation. These techniques belong to a quality assurance program for X-ray machines and are designed to correct problems related to equipment and radiological practices, to obtain radiological images of high quality, and to reduce unnecessary exposures.

Relevance: 10.00%

Abstract:

Summary: The effect of automatic milking on udder health in two Finnish dairy herds

Relevance: 10.00%

Abstract:

Background: Bone health is a concern when treating early stage breast cancer patients with adjuvant aromatase inhibitors. Early detection of patients (pts) at risk of osteoporosis and fractures may be helpful for starting preventive therapies and selecting the most appropriate endocrine therapy schedule. We present statistical models describing the evolution of lumbar and hip bone mineral density (BMD) in pts treated with tamoxifen (T), letrozole (L) and sequences of T and L. Methods: Available dual-energy x-ray absorptiometry exams (DXA) of pts treated in trial BIG 1-98 were retrospectively collected from Swiss centers. Treatment arms: A) T for 5 years; B) L for 5 years; C) 2 years of T followed by 3 years of L; and D) 2 years of L followed by 3 years of T. Pts without DXA were used as a control for detecting selection biases. Patients randomized to arm A were subsequently allowed an unplanned switch from T to L. Allowing for variations between DXA machines and centres, two repeated-measures models, using a covariance structure that allows for different times between DXA, were used to estimate changes in hip and lumbar BMD (g/cm2) from trial randomization. Prospectively defined covariates at the time of trial randomization, considered as fixed effects in the multivariable models in an intention-to-treat analysis, were: age, height, weight, hysterectomy, race, known osteoporosis, tobacco use, prior bone fracture, prior hormone replacement therapy (HRT), bisphosphonate use and previous neo-/adjuvant chemotherapy (ChT). Similarly, the T-scores for lumbar and hip BMD measurements were modeled using a per-protocol approach (allowing for the treatment switch in arm A), specifically studying the effect of each therapy upon T-score percentage. Results: A total of 247 out of 546 pts had between 1 and 5 DXA; a total of 576 DXA were collected. The number of DXA measurements per arm was: arm A, 133; B, 137; C, 141; and D, 135. The median follow-up time was 5.8 years. Significant factors positively correlated with lumbar and hip BMD in the multivariate analysis were weight, previous HRT use, neo-/adjuvant ChT, hysterectomy and height. Significant negatively correlated factors in the models were osteoporosis, treatment arm (B/C/D vs. A), time since endocrine therapy start, age and smoking (current vs. never). Modeling the T-score percentage, differences from T to L were -4.199% (p = 0.036) and -4.907% (p = 0.025) for the hip and lumbar measurements, respectively, before any treatment switch occurred. Conclusions: Our statistical models describe the lumbar and hip BMD evolution for pts treated with L and/or T. The results for both localisations confirm that, contrary to expectation, the sequential schedules do not seem less detrimental for BMD than L monotherapy. The estimated difference in BMD T-score percentage is at least 4% from T to L.
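A minimal sketch (synthetic data and simplified covariates, not the BIG 1-98 dataset) of a repeated-measures model of hip BMD with treatment arm and baseline covariates as fixed effects; a random intercept per patient stands in, as a simplification, for the covariance structure that allows different times between DXA exams:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic long-format data: one row per DXA exam per patient (hypothetical values).
rng = np.random.default_rng(0)
rows = []
for pid in range(60):
    arm = rng.choice(["A", "B", "C", "D"])
    age, weight = rng.normal(61, 8), rng.normal(68, 10)
    base_bmd = rng.normal(0.95, 0.10)
    for t in np.sort(rng.uniform(0, 5, size=rng.integers(1, 5))):
        drop = 0.01 * t * (1.5 if arm != "A" else 1.0)  # letrozole-containing arms lose more
        rows.append(dict(patient_id=pid, arm=arm, age=age, weight=weight,
                         years=t, hip_bmd=base_bmd - drop + rng.normal(0, 0.02)))
df = pd.DataFrame(rows)

# Mixed-effects model: fixed effects for arm and covariates, random intercept per patient.
model = smf.mixedlm("hip_bmd ~ arm + years + age + weight",
                    data=df, groups=df["patient_id"])
print(model.fit().summary())
```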

Relevance: 10.00%

Abstract:

Due to the advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and real-time monitoring and short-term forecasting of weather.

In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography.

With the advent of high-resolution digital elevation models, the field of spatial prediction met new horizons. By exploiting image processing tools along with physical heuristics, a large number of terrain features which account for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes.

Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial auto-correlation in the original space, which makes the use of classical geostatistics cumbersome.

The challenges explored during the thesis are manifold. First, the complexity of the models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of predictions.

The resulting maps of average wind speed find applications in renewable resource assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
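A minimal sketch (synthetic data and assumed terrain features, not the thesis code) of the core idea: an epsilon-SVR with an RBF kernel learns wind speed from geographical coordinates plus multiscale topographic features, with the model complexity tuned by cross-validation:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# Columns assumed: x, y, elevation, slope, curvature, exposure to the prevailing wind.
X = rng.normal(size=(300, 6))
y = 5 + 2 * X[:, 2] - 1.5 * X[:, 4] + rng.normal(scale=0.5, size=300)  # synthetic wind speed

pipe = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
grid = GridSearchCV(
    pipe,
    {"svr__C": [1, 10, 100], "svr__gamma": ["scale", 0.1], "svr__epsilon": [0.1, 0.5]},
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```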

Relevance: 10.00%

Abstract:

The aim of this study was to compare the effects of two training methods on explosive strength in 9 regional- and national-level athletes (19.6 ± 2.6 years, 1.76 ± 7 m and 68.9 ± 3.5 kg) who trained during the competitive period for 6 weeks, twice per week, following an increasing periodization. The subjects were divided into 3 groups of 3 athletes each, using different training methods. The first experimental group (1) combined the vibration platform and inertial (flywheel) machines, with a vibration exposure time of 30 s at an intensity of 45 Hz and an amplitude of 5 mm, with a 1-min rest, plus 3 sets of 8 repetitions on the yo-yo machines performed at maximum velocity in the concentric phase and under control in the eccentric phase, with a 3-min rest. The free-weights experimental group (2) performed 4 exercises: half squat, plyometrics (stretch-shortening cycle), horizontal multi-jumps and accelerations, following the explosive strength training guidelines proposed by Badillo and Gorostiaga (1995). A control group (3) performed no strength training. Before and after the intervention period the following tests were carried out: squat jump (SJ) and countermovement jump (CMJ). The results indicated that groups (1), (2) and (3) all showed significant decreases in SJ and CMJ. It is concluded that both training methods, the combination of vibratory stimuli with inertial machines and free weights, must be carefully controlled and periodized, since during the competitive period the athlete accumulates levels of muscular and physiological fatigue far higher than in other periods of the season.

Relevance: 10.00%

Abstract:

There is currently a wide variety of brands and models of voltage stabilizers, although all of them are designed and built for the same purpose: to deliver a stable voltage at the output of the device. Voltage stabilizers are manufactured because, despite technical advances and improvements in energy services, the frequent voltage sags and surges in the electrical supply network have not been eliminated, and these can cause malfunctions in electronic equipment. This is why most users of electronic equipment place a voltage stabilizer between the supply line and their devices. The objective of this work is to build a three-coil device with a ferromagnetic core that performs the function of an AC voltage stabilizer. By applying the saturable-core reluctance technique to this practical case, the aim is to develop a reliable and inexpensive voltage stabilizer. This work should provide a low-cost solution for working environments, among others, where AC-powered machines that are very sensitive to mains voltage variations are used.
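An illustrative sketch (a toy model, not the device built in this work) of the principle a saturable core exploits: once the core saturates, the output voltage is compressed toward the nominal value even as the input swings widely. A tanh curve stands in for the magnetization characteristic, and the nominal voltage and saturation factor are assumed values:

```python
import numpy as np

V_NOMINAL = 230.0  # assumed nominal mains RMS voltage
K = 1.5            # hypothetical saturation factor of the core

def stabilized_output(v_in: float) -> float:
    """Toy transfer curve: tanh models core saturation, normalized so that
    nominal input maps to nominal output."""
    return V_NOMINAL * np.tanh(K * v_in / V_NOMINAL) / np.tanh(K)

for v_in in (185, 210, 230, 250, 275):
    v_out = stabilized_output(v_in)
    print(f"input {v_in:3d} V ({(v_in - V_NOMINAL) / V_NOMINAL:+.0%}) -> "
          f"output {v_out:5.1f} V ({(v_out - V_NOMINAL) / V_NOMINAL:+.0%})")
```

With these assumed values, a roughly ±20% swing at the input is compressed to well under ±10% at the output, which is the behavior a reluctance-based stabilizer aims for.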

Relevance: 10.00%

Abstract:

The book presents the state of the art in machine learning algorithms (artificial neural networks of different architectures, support vector machines, etc.) as applied to the classification and mapping of spatially distributed environmental data. Basic geostatistical algorithms are presented as well. New trends in machine learning and their application to spatial data are discussed, and real case studies based on environmental and pollution data are carried out. The book provides a CD-ROM with the Machine Learning Office software, including sample data sets, which allows both students and researchers to put the concepts into practice quickly.

Relevance: 10.00%

Abstract:

Today, cooperatives collect, select, treat and separate fruit according to its caliber (weight; maximum, mean and/or minimum diameter) so that it reaches the final consumer according to its category (caliber). To compete in a market that is increasingly demanding in quality and price, automatic classification systems are required that deliver optimal results at high levels of production and productivity. For these tasks there are industrial graders that weigh the fruit with load cells and, based on the measured weight, classify it by assigning each piece to its corresponding outlet (packing table) through a system of electromagnets. Unfortunately, grading fruit by weight alone is not at all reliable, since it ignores skin thickness, water content, sugar content and other highly relevant factors that considerably influence the final results. The aim of this project is to upgrade existing fruit graders by installing a fast and robust industrial machine-vision system operating in the infrared range (for greater reliability), providing optimal final results in the classification of fruit and vegetables. In this way, the project offers the opportunity to improve the performance of the fruit classification line, increasing speed, reducing time losses and human error, and unquestionably improving the quality of the final product demanded by consumers.
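A minimal sketch (hypothetical caliber thresholds, not the project's specification) of the weight-based grading step performed by the existing machines: a load-cell reading is mapped to a caliber class and hence to an outlet (packing table):

```python
import bisect

# Hypothetical caliber boundaries in grams; each interval corresponds to one outlet.
CALIBER_BOUNDS_G = [90, 120, 150, 180]
OUTLETS = ["outlet 1 (smallest)", "outlet 2", "outlet 3", "outlet 4", "outlet 5 (largest)"]

def assign_outlet(weight_g: float) -> str:
    """Map a load-cell weight reading to the outlet of its caliber class."""
    return OUTLETS[bisect.bisect_right(CALIBER_BOUNDS_G, weight_g)]

for w in (85.0, 117.5, 151.2, 210.0):
    print(f"{w:6.1f} g -> {assign_outlet(w)}")
```

The infrared machine-vision upgrade proposed here would add image-derived features (for example surface defects, or estimates related to water and sugar content) on top of this weight-only rule.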

Relevance: 10.00%

Abstract:

This letter presents advanced classification methods for very high resolution images. Multisource information, both spectral and spatial, is efficiently exploited through the use of composite kernels in support vector machines. Weighted summations of kernels accounting for separate sources of spectral and spatial information are analyzed and compared to classical approaches such as pure spectral classification or stacked approaches using all the features in a single vector. Model selection problems are addressed, as well as the importance of the different kernels in the weighted summation.
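A minimal sketch (synthetic data, assumed spectral/spatial feature split) of the weighted kernel summation described above: an RBF kernel on the spectral features and another on the spatial features are combined with a weight mu and passed to an SVM as a precomputed kernel:

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_train, n_test = 200, 50
X_spec = rng.normal(size=(n_train + n_test, 10))   # e.g. band reflectances
X_spat = rng.normal(size=(n_train + n_test, 5))    # e.g. morphological/texture features
y = (X_spec[:, 0] + X_spat[:, 0] > 0).astype(int)  # synthetic labels

def composite_kernel(A_spec, B_spec, A_spat, B_spat, mu=0.5, gamma=0.1):
    """Weighted summation of a spectral and a spatial RBF kernel."""
    return (mu * rbf_kernel(A_spec, B_spec, gamma=gamma)
            + (1 - mu) * rbf_kernel(A_spat, B_spat, gamma=gamma))

tr, te = slice(0, n_train), slice(n_train, None)
K_train = composite_kernel(X_spec[tr], X_spec[tr], X_spat[tr], X_spat[tr])
K_test = composite_kernel(X_spec[te], X_spec[tr], X_spat[te], X_spat[tr])

clf = SVC(kernel="precomputed", C=10).fit(K_train, y[tr])
print("test accuracy:", (clf.predict(K_test) == y[te]).mean())
```

The weight mu and the kernel parameters would normally be chosen by model selection, which is one of the issues the letter addresses.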

Relevance: 10.00%

Abstract:

Clear Lake, Iowa's third largest natural lake, is a premier natural resource and popular recreational destination in north central Iowa. Despite the lake's already strong recreational use, water quality concerns have not allowed the lake to reach its full potential. Clear Lake is listed on Iowa's Draft 2010 303(d) Impaired Waters List for algae, bacteria, and turbidity. Many restoration practices have been implemented to treat the algae and turbidity impairments, but few practices have been installed to treat bacteria. Reducing beach bacteria levels is a priority of the lake restoration partners. Federal, State, and local partners have invested more than $20 million in lake and watershed restoration efforts to improve water clarity and quality. These partners have a strong desire to ensure that high bacteria levels at public swim beaches do not undermine the other water quality improvements. Recent bacteria source tracking completed by the State Hygienic Laboratory indicates that Canada geese are a major contributor to bacteria loading at the Clear Lake swim beaches. Other potential sources include unpermitted septic systems in the watershed. The grant request proposes to reduce bacteria levels at Clear Lake's three public swim beaches by using beach cleaner machines to remove goose waste, installing goose deterrents at the swim beaches, and continuing a septic system update grant program. Implementation of these practices began in 2011, and bacteria samples collected in 2012 indicate they can be effective if the effort is continued.

Relevance: 10.00%

Abstract:

Spatial data analysis, mapping and visualization are of great importance in various fields: environment, pollution, natural hazards and risks, epidemiology, spatial econometrics, etc. A basic task of spatial mapping is to make predictions based on some empirical data (measurements). A number of state-of-the-art methods can be used for the task: deterministic interpolations; methods of geostatistics, such as the family of kriging estimators (Deutsch and Journel, 1997); machine learning algorithms such as artificial neural networks (ANN) of different architectures; hybrid ANN-geostatistics models (Kanevski and Maignan, 2004; Kanevski et al., 1996); etc. All the methods mentioned above can be used for solving the problem of spatial data mapping.

Environmental empirical data are always contaminated/corrupted by noise, and often with noise of unknown nature. That is one of the reasons why deterministic models can be inconsistent, since they treat the measurements as values of some unknown function that should be interpolated. Kriging estimators treat the measurements as the realization of some spatial random process. To obtain an estimation with kriging, one has to model the spatial structure of the data: the spatial correlation function or (semi-)variogram. This task can be complicated if there is not a sufficient number of measurements, and the variogram is sensitive to outliers and extremes. ANN is a powerful tool, but it also suffers from a number of drawbacks. ANNs of a special type, multilayer perceptrons, are often used as a detrending tool in hybrid (ANN + geostatistics) models (Kanevski and Maignan, 2004). Therefore, the development and adaptation of a method that is nonlinear, robust to noise in the measurements, able to deal with small empirical datasets and supported by a solid mathematical background is of great importance.

The present paper deals with such a model, based on Statistical Learning Theory (SLT): Support Vector Regression. SLT is a general mathematical framework devoted to the problem of estimating dependencies from empirical data (Hastie et al., 2004; Vapnik, 1998). SLT models for classification, Support Vector Machines, have shown good results on different machine learning tasks. The results of SVM classification of spatial data are also promising (Kanevski et al., 2002). The properties of SVM for regression, Support Vector Regression (SVR), are less studied. First results of the application of SVR for spatial mapping of physical quantities were obtained by the authors for mapping of medium porosity (Kanevski et al., 1999) and for mapping of radioactively contaminated territories (Kanevski and Canu, 2000).

The present paper is devoted to further understanding of the properties of the SVR model for spatial data analysis and mapping. A detailed description of the SVR theory can be found in (Cristianini and Shawe-Taylor, 2000; Smola, 1996), and basic equations for the nonlinear modeling are given in Section 2. Section 3 discusses the application of SVR for spatial data mapping on a real case study: soil pollution by the Cs137 radionuclide. Section 4 discusses the properties of the model applied to noisy data or data with outliers.
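A minimal sketch (synthetic coordinates and values, not the Cs137 case study) of SVR used as a spatial mapper: the epsilon-insensitive loss ignores small measurement noise, the C parameter limits the influence of outliers, and the fitted model is evaluated on a regular grid to produce the map:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(150, 2))               # monitoring locations (x, y)
values = (np.sin(coords[:, 0]) + 0.1 * coords[:, 1]
          + rng.normal(scale=0.1, size=150))             # noisy measurements
values[::30] += 3.0                                      # a few gross outliers

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
svr.fit(coords, values)

# Predict on a regular grid to produce the map.
gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
prediction_map = svr.predict(grid).reshape(gx.shape)
print(prediction_map.shape)
```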

Relevance: 10.00%

Abstract:

The aim of this work was to evaluate the association between milk flow, teat morphological measurements and subclinical mastitis prevalence in Gir cows. Eighty cows in the 2nd and 3rd lactations, with 90 to 200 days of lactation, were divided into fast or slow groups according to milk flow during milking. Teat morphometry was assessed by ultrasound scanning of the right anterior teat and by external measurements. Milk samples were collected for somatic cell count (SCC) and microbiological culture. The effect of milk flow during milking was evaluated by analysis of variance of milk yield, SCC, morphometry and external measurements. The association of teat morphometry and external measurements with SCC and the microorganisms found in milk was analysed. Milk flow was significantly correlated with milk production. Gir cows with slower milk flow had longer teat canals and greater milk yield in comparison to cows with fast milk flow. Teat-end to floor distance influenced the SCC of Gir cows. The prevalence of subclinical mastitis and the type of mastitis-causing pathogens were not affected by milk flow during milking.
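A minimal sketch (synthetic yields, not the study data) of the group comparison described: a one-way analysis of variance of milk yield between the fast and slow milk-flow groups, assuming SciPy is available:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical daily milk yields (kg) for the two milk-flow groups.
slow_flow_yield = rng.normal(loc=12.5, scale=2.0, size=40)
fast_flow_yield = rng.normal(loc=11.0, scale=2.0, size=40)

f_stat, p_value = stats.f_oneway(slow_flow_yield, fast_flow_yield)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # with two groups this matches a two-sample t-test
```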