965 results for Milking machines


Abstract:

Human-machine interaction by voice covers many research areas, notably speech recognition, speech synthesis and identification, speaker verification and identification, and voice (command) activation of robotic systems. Recognizing speech is natural and simple for people, but it is a complex task for machines, for which various methodologies and techniques exist, among them neural networks. The objective of this work is to develop a Matlab tool for the recognition and identification of words pronounced by a speaker, from a set of possible words, with good reliability within pre-established margins. The system is independent of the speaker who pronounces the word, i.e. that speaker will not have taken part in the training of the system. An interface has been designed that allows the acquisition of the voice signal and its processing by means of neural networks and other techniques. By adding a control stage to the system, it could be used to give commands to a robot such as the Alfa6Uvic or any other device.
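
Below is a minimal Python sketch of the general pipeline such a tool implements: crude spectral features extracted from an audio signal, followed by a small neural-network word classifier. It is illustrative only; the thesis tool was built in Matlab, and the band_energies features, vocabulary size, and synthetic "recordings" are assumptions, not the actual system.

```python
# Minimal sketch of speaker-independent word recognition with a neural network.
# Illustrative only: the feature extractor and the synthetic data are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier

def band_energies(signal, n_bands=20):
    """Crude spectral features: log energy in equal-width frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bands)
    return np.log1p([np.sum(b**2) for b in bands])

rng = np.random.default_rng(0)
# Placeholder "recordings": random signals standing in for labelled utterances.
X = np.array([band_energies(rng.standard_normal(8000)) for _ in range(200)])
y = rng.integers(0, 5, size=200)          # 5 vocabulary words

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X[:150], y[:150])                 # train on some utterances...
print("held-out accuracy:", clf.score(X[150:], y[150:]))  # ...test on the rest
```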

Abstract:

Development and environmental issues of small cities in developing countries have largely been overlooked although these settlements are of global demographic importance and often face a "triple challenge"; that is, they have limited financial and human resources to address growing environmental problems that are related to both development (e.g., pollution) and under-development (e.g., inadequate water supply). Neoliberal policy has arguably aggravated this challenge as public investments in infrastructure generally declined while the focus shifted to the metropolitan "economic growth machines". This paper develops a conceptual framework and agenda for the study of small cities in the global south, their environmental dynamics, governance and politics in the current neoliberal context. While small cities are governed in a neoliberal policy context, they are not central to neoliberalism, and their (environmental) governance therefore seems to differ from that of global cities. Furthermore, "actually existing" neoliberal governance of small cities is shaped by the interplay of regional and local politics and environmental situations. The approach of urban political ecology and the concept of rural-urban linkages are used to consider these socio-ecological processes. The conceptual framework and research agenda are illustrated in the case of India, where the agency of small cities in regard to environmental governance seems to remain limited despite formal political decentralization.

Abstract:

During timber exploitation in forest stands, harvesting machines pass repeatedly along the same track and can cause soil compaction, which leads to soil erosion and restricted tree root growth. The level of soil compaction depends on the number of passes and on the weight of the wood load. This paper aimed to evaluate soil compaction and eucalyptus growth as affected by the number of passes and the wood load of a forwarder. The study was carried out in Santa Maria de Itabira county, Minas Gerais State, Brazil, on a seven-year-old eucalyptus stand planted on an Oxisol. The trees were felled by chainsaw and manually removed. Plots of 144 m² (four rows 12 m long in a 3 x 2 m spacing) were then marked off for the conduction of two trials. The first tested the traffic intensity of a forwarder that weighed 11,900 kg, carried 12 m³ of wood (density of 480 kg m-3) and passed 2, 4, and 8 times along the same track. In the second trial, the forwarder carried loads of 4, 8, and 12 m³ of wood, and the machine was driven four times along the same track. In each plot, the passes affected four rows. Eucalyptus was planted in 30 x 30 x 30 cm holes on the compacted tracks. The soil in the area is clayey (470 g kg-1 clay and 440 g kg-1 sand); at depths of 0-5 cm and 5-10 cm, respectively, soil organic carbon was 406 and 272 g kg-1 and the moisture content during the trial was 248 and 249 g kg-1. These layers were assessed for soil bulk density and water-stable aggregates. The infiltration rate was measured by a cylinder infiltrometer. After 441 days the measurements were repeated, with additional analyses of soil organic carbon, total nitrogen, N-NH4+, N-NO3-, porosity, and penetration resistance. Tree height, stem diameter, and stem dry matter were measured. Forwarder traffic increased soil compaction, penetration resistance and microporosity, while it reduced the geometric mean diameter, total porosity, macroporosity and infiltration rate. Stem dry matter yield and tree height were not affected by soil compaction. Two passes of the forwarder were enough to cause these disturbances at their highest levels. The compaction effects were still persistent 441 days after the forwarder traffic.

Abstract:

Compaction is one of the most destructive factors affecting soil quality; however, its effects on the microbial community and on enzyme activity have so far not been investigated in detail. The objective of this study was to evaluate the effects of soil compaction caused by the traffic of agricultural machines on the soil microbial community and its enzyme activity. Six compaction levels were induced by driving tractors with different weights over a Eutrustox soil, and the final density was measured. Soil samples were collected after corn from the 0-0.10 and 0.10-0.20 m layers. The compaction effect on all studied properties was evident. In the soil with the highest bulk density, total bacterial counts were reduced significantly (by 22-30 %) and nitrifying bacteria by 38-41 % compared to the control. On the other hand, fungal populations increased by 55-86 % and denitrifying bacteria by 49-53 %. Dehydrogenase activity decreased by 20-34 %, urease by 44-46 % and phosphatase by 26-28 %. The organic matter content and soil pH decreased more in the 0-0.10 than in the 0.10-0.20 m layer and possibly influenced the reduction of the microbial counts, except denitrifying bacteria, and of all enzyme activities, except urease. The results indicated that soil compaction influences the community of aerobic microorganisms and their activity. This effect can alter nutrient cycling and reduce crop yields.

Abstract:

The research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). A simple k-nearest neighbor algorithm is considered as a benchmark model. PNN is a neural-network reformulation of well-known nonparametric principles of probability density modeling using kernel density estimators and Bayesian optimal or maximum a posteriori decision rules. PNN is well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNN is that it can be easily used in decision support systems dealing with problems of automatic classification. The support vector machine is an implementation of the principles of statistical learning theory for classification tasks. Recently, SVMs were successfully applied to different environmental topics: classification of soil types and hydro-geological units, optimization of monitoring networks, and susceptibility mapping of natural hazards. In the present paper, both simulated and real data case studies (low and high dimensional) are considered. The main attention is paid to the detection and learning of spatial patterns by the applied algorithms.
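
A probabilistic neural network is essentially a Parzen-window (kernel density) classifier combined with a Bayesian decision rule. The Python sketch below illustrates that idea on synthetic two-dimensional data; the bandwidth and the data are illustrative assumptions, not the case studies of the paper.

```python
# Minimal sketch of a PNN viewed as a kernel-density classifier with a
# Bayesian decision rule; bandwidth and synthetic spatial data are assumptions.
import numpy as np
from sklearn.neighbors import KernelDensity

def pnn_predict(X_train, y_train, X_test, bandwidth=0.3):
    classes = np.unique(y_train)
    log_post = []
    for c in classes:
        kde = KernelDensity(bandwidth=bandwidth).fit(X_train[y_train == c])
        prior = np.mean(y_train == c)
        # log posterior (up to a constant): log p(x|c) + log p(c)
        log_post.append(kde.score_samples(X_test) + np.log(prior))
    return classes[np.argmax(np.column_stack(log_post), axis=1)]

rng = np.random.default_rng(1)
X0 = rng.normal([0, 0], 0.5, size=(100, 2))   # class 0 around the origin
X1 = rng.normal([2, 2], 0.5, size=(100, 2))   # class 1 shifted away
X, y = np.vstack([X0, X1]), np.r_[np.zeros(100), np.ones(100)]
print(pnn_predict(X, y, np.array([[0.1, -0.2], [1.9, 2.3]])))
```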

Abstract:

Machine learning has been widely applied to analyze data in various domains, but it is still new to personalized medicine, especially dose individualization. In this paper, we focus on the prediction of drug concentrations using Support Vector Machines (SVM) and on the analysis of the influence of each feature on the prediction results. Our study shows that SVM-based approaches achieve prediction results similar to those of a pharmacokinetic model. The two proposed example-based SVM methods demonstrate that individual features help to increase the accuracy of drug concentration predictions with a reduced library of training data.
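
As an illustration of the general approach, the following Python sketch trains a support vector regression model to predict a concentration from a few patient features. The feature names, toy data, and hyper-parameters are assumptions made for the example; they are not the paper's dataset or its example-based SVM methods.

```python
# Minimal sketch of predicting a drug concentration from patient features with
# support vector regression; the data-generating formula is purely illustrative.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Columns: dose (mg), body weight (kg), hours since last dose.
X = np.column_stack([rng.uniform(100, 600, 200),
                     rng.uniform(40, 100, 200),
                     rng.uniform(1, 24, 200)])
# Toy concentration: proportional to dose/weight, decaying with time.
y = 10 * X[:, 0] / X[:, 1] * np.exp(-0.1 * X[:, 2]) + rng.normal(0, 2, 200)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X[:150], y[:150])
print("predicted concentrations:", model.predict(X[150:153]).round(1))
```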

Abstract:

Inadequate usage can degrade natural resources, particularly soils. In recent years, more attention has been paid to practices aiming at the recovery of degraded soils, e.g., the use of organic fertilizers, liming and the introduction of species adapted to adverse conditions. The purpose of this study was therefore to investigate the recovery of physical properties of a Red Latosol (Oxisol) degraded by the construction of a hydroelectric power station. In the study area, a soil layer about 8 m thick had been removed by heavy machines, leading not only to soil compaction but also to a high degree of degradation. The experiment was arranged in a completely randomized design with nine treatments and four replications. The treatments consisted of: 1- soil mobilization by tilling (to ensure the effect of mechanical mobilization in all treatments) without planting, but with growth of spontaneous vegetation; 2- Black velvet bean (Stizolobium aterrimum Piper & Tracy); 3- Pigeonpea (Cajanus cajan (L.) DC); 4- Liming + black velvet bean; 5- Liming + pigeonpea until 1994, when replaced by jack bean (Canavalia ensiformis); 6- Liming + gypsum + black velvet bean; 7- Liming + gypsum + pigeonpea until 1994, when replaced by jack bean; and two controls as reference: 8- Native Cerrado vegetation and 9- bare soil (no tilling and no planting), left under natural conditions and, in this situation, without spontaneous vegetation. In treatments 1 through 7, the soil was tilled. The treatments were installed in 1992 and left unmanaged for seven years, until brachiaria (Brachiaria decumbens) was planted in all plots in 1999. Seventeen years after implantation, soil macroporosity, microporosity, total porosity, bulk density and aggregate stability were assessed in the previously described treatments in the 0.00-0.10, 0.10-0.20 and 0.20-0.40 m layers, and soil penetration resistance and soil moisture in the 0.00-0.15 and 0.15-0.30 m layers. The plants were evaluated for brachiaria dry matter and for the spontaneous growth of native tree species in the plots as of 2006. Results were analyzed by analysis of variance and Tukey's test at 5 % for mean comparison. In all treatments, except for the bare soil (no recovery measures), ongoing recovery of the degraded soil physical properties was observed. Macroporosity, soil bulk density and total porosity were good soil quality indicators. The occurrence of spontaneous native species indicated the soil recovery process. The best-adapted species was Machaerium acutifolium Vogel, with the largest number of plants and the most advanced development; the dry matter production of B. decumbens in the recovering soil was similar to that under normal conditions, evidencing soil recovery.

Abstract:

This article presents an experimental study about the classification ability of several classifiers for multi-class classification of cannabis seedlings. As the cultivation of drug type cannabis is forbidden in Switzerland, law enforcement authorities regularly ask forensic laboratories to determine the chemotype of a seized cannabis plant and then to conclude whether the plantation is legal or not. This classification is mainly performed when the plant is mature, as required by the EU official protocol, which makes the classification of cannabis seedlings a time-consuming and costly procedure. A previous study by the authors investigated this problem [1] and showed that it is possible to differentiate between drug type (illegal) and fibre type (legal) cannabis at an early stage of growth using gas chromatography interfaced with mass spectrometry (GC-MS), based on the relative proportions of eight major leaf compounds. The aims of the present work are, on one hand, to continue the former work and to optimize the methodology for the discrimination of drug- and fibre-type cannabis developed in the previous study and, on the other hand, to investigate the possibility of predicting illegal cannabis varieties. Seven classifiers for differentiating between cannabis seedlings are evaluated in this paper, namely Linear Discriminant Analysis (LDA), Partial Least Squares Discriminant Analysis (PLS-DA), Nearest Neighbour Classification (NNC), Learning Vector Quantization (LVQ), Radial Basis Function Support Vector Machines (RBF SVMs), Random Forest (RF) and Artificial Neural Networks (ANN). The performance of each method was assessed using the same analytical dataset, which consists of 861 samples split into drug- and fibre type cannabis, with drug type cannabis being made up of 12 varieties (i.e. 12 classes). The results show that linear classifiers are not able to manage the distribution of classes, in which some overlap areas exist, for both classification problems. Unlike linear classifiers, NNC and RBF SVMs best differentiate cannabis samples both for 2-class and 12-class classifications, with average classification results up to 99% and 98%, respectively. Furthermore, RBF SVMs correctly classified as drug type cannabis the independent validation set, which consists of cannabis plants coming from police seizures. For forensic casework, this study shows that the discrimination between cannabis samples at an early stage of growth is possible with fairly high classification performance for discriminating between cannabis chemotypes or between drug type cannabis varieties.
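
The kind of classifier benchmark described above can be sketched in Python with scikit-learn as follows. The synthetic data (8 features standing in for the leaf-compound proportions, 12 classes for the varieties) and the hyper-parameters are illustrative assumptions; PLS-DA and LVQ are omitted because they have no direct scikit-learn equivalent.

```python
# Minimal sketch of benchmarking several classifiers on a multi-class problem,
# in the spirit of the comparison above; not the paper's GC-MS data or settings.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=8, n_informative=6,
                           n_classes=12, n_clusters_per_class=1, random_state=0)

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "RBF SVM": SVC(kernel="rbf", C=10.0, gamma="scale"),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)   # 5-fold cross-validation
    print(f"{name:13s} mean accuracy: {scores.mean():.3f}")
```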

Abstract:

This thesis project was aimed at studying the molecular mechanisms underlying learning and memory formation, in particular as they relate to the metabolic coupling between astrocytes and neurons. To this end, changes in the metabolic activity of different mouse brain regions after 1 or 9 days of training in an eight-arm radial maze were assessed by (14C) 2-deoxyglucose (2DG) autoradiography. Significant differences were identified in the areas engaged during the behavioral task at day 1 (when animals are confronted with the learning task for the first time) and at day 9 (when animals are performing at a high level). These areas include the hippocampus, the fornix, the parietal cortex, the laterodorsal thalamic nucleus and the mammillary bodies at day 1; and the anterior cingulate, the retrosplenial cortex and the dorsal striatum at day 9. Two of these cerebral regions (those presenting the greatest changes at day 1 and day 9: the hippocampus and the retrosplenial cortex, respectively) were microdissected by laser capture, and selected genes related to neuron-glia metabolic coupling, glucose metabolism and synaptic plasticity were analyzed by RT-PCR. 2DG and gene expression analyses were performed at three different times: 1) immediately after the end of the behavioral paradigm, 2) 45 minutes and 3) 6 hours after training. The main goal of this study was the identification of the metabolic adaptations following the learning task. Gene expression results demonstrate that the learning task profoundly modulates the pattern of gene expression over time, meaning that these two cerebral regions with high 2DG signal (hippocampus and retrosplenial cortex) have adapted their metabolic molecular machinery accordingly. Almost all studied genes show a higher expression in the hippocampus at day 1 compared to day 9, while an increased expression was found in the retrosplenial cortex at day 9. These molecular adaptations are already observable 45 minutes after the end of the task. However, 6 hours after training a high gene expression was found at day 9 (compared to day 1) in both regions, suggesting that a single day of training is not sufficient to detect transcriptional modifications several hours after the task. Thus, gene expression data match the 2DG results, indicating a transfer of information in time (from day 1 to day 9) and in space (from the hippocampus to the retrosplenial cortex), at both the cellular and the molecular level. Moreover, learning seems to modify the neuron-glia metabolic coupling, since several genes involved in this coupling are induced. These results also suggest a role of glia in neuronal plasticity.

Abstract:

Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are therefore of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man oriented), FMEA (system oriented), or HAZOP (process oriented), is not satisfactory. The use of a dynamic modeling approach that allows multiple-oriented analyses may constitute an alternative to overcome this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, realized on an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes from an OH&S perspective. The industrial process is modeled as a set of interconnected subnets (state spaces), which describe its constitutive machines. Process-related factors are introduced, in an explicit way, through machine interconnections and flow properties. Man-machine interactions are modeled as triggering events for the state spaces of the machines, and the CREAM cognitive behavior model is used to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities are used to determine the state changes in each machine. Interconnecting machines implies the composition of such flows and consequently the interconnection of the measure constraints. This is reflected by the construction of constraint enrichment hierarchies, which can be used for simulation and analysis optimization in a clear mathematical framework. The use of Petri nets to perform multiple-oriented analysis opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process. Most of all, however, it opens perspectives in the fields of risk comparison and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.
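
A Petri net, in its simplest form, is a set of places holding tokens and transitions that consume and produce those tokens. The Python sketch below illustrates that basic mechanism; it is a generic toy net with a hypothetical machine/operator example, not the CO-OPN formalism or the MORM model itself.

```python
# Generic Petri-net sketch (places, tokens, transitions) to illustrate the kind
# of state-space modeling described above; the example subnet is hypothetical.
from dataclasses import dataclass, field

@dataclass
class PetriNet:
    marking: dict                                     # place -> number of tokens
    transitions: dict = field(default_factory=dict)   # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Hypothetical machine subnet: an operator action triggers a state change.
net = PetriNet(marking={"machine_idle": 1, "operator_ready": 1})
net.add_transition("start_cycle",
                   inputs={"machine_idle": 1, "operator_ready": 1},
                   outputs={"machine_running": 1})
net.fire("start_cycle")
print(net.marking)   # machine_idle and operator_ready consumed, machine_running produced
```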

Abstract:

MOTIVATION: Analysis of millions of pyro-sequences is currently playing a crucial role in the advance of environmental microbiology. Taxonomy-independent, i.e. unsupervised, clustering of these sequences is essential for the definition of Operational Taxonomic Units. For this application, reproducibility and robustness should be the most sought after qualities, but have thus far largely been overlooked. RESULTS: More than 1 million hyper-variable internal transcribed spacer 1 (ITS1) sequences of fungal origin have been analyzed. The ITS1 sequences were first properly extracted from 454 reads using generalized profiles. Then, otupipe, cd-hit-454, ESPRIT-Tree and DBC454, a new algorithm presented here, were used to analyze the sequences. A numerical assay was developed to measure the reproducibility and robustness of these algorithms. DBC454 was the most robust, closely followed by ESPRIT-Tree. DBC454 features density-based hierarchical clustering, which complements the other methods by providing insights into the structure of the data. AVAILABILITY: An executable is freely available for non-commercial users at ftp://ftp.vital-it.ch/tools/dbc454. It is designed to run under MPI on a cluster of 64-bit Linux machines running Red Hat 4.x, or on a multi-core OSX system. CONTACT: dbc454@vital-it.ch or nicolas.guex@isb-sib.ch.
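
For illustration, the sketch below clusters a handful of toy sequences hierarchically from a pairwise-distance matrix. It is not the DBC454 algorithm (which is density-based and designed for millions of reads); it only shows, under assumed distances and an assumed cutoff, the general kind of taxonomy-independent OTU clustering involved.

```python
# Illustrative sketch only: hierarchical clustering of short sequences from a
# pairwise-distance matrix; not the DBC454 algorithm itself.
from difflib import SequenceMatcher
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

seqs = ["ACGTACGTAA", "ACGTACGTAT", "ACGTACGAAA",   # one putative OTU
        "TTGGCCAATT", "TTGGCCAAGT"]                  # another putative OTU

n = len(seqs)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        # 1 - similarity ratio as a crude pairwise distance
        dist[i, j] = dist[j, i] = 1 - SequenceMatcher(None, seqs[i], seqs[j]).ratio()

tree = linkage(squareform(dist), method="average")
labels = fcluster(tree, t=0.3, criterion="distance")   # cutoff chosen for this toy data
print(labels)   # e.g. [1 1 1 2 2]
```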

Abstract:

Avalanche forecasting is a complex process involving the assimilation of multiple data sources to make predictions over varying spatial and temporal resolutions. Numerically assisted forecasting often uses nearest neighbour methods (NN), which are known to have limitations when dealing with high dimensional data. We apply Support Vector Machines to a dataset from Lochaber, Scotland to assess their applicability in avalanche forecasting. Support Vector Machines (SVMs) belong to a family of theoretically based techniques from machine learning and are designed to deal with high dimensional data. Initial experiments showed that SVMs gave results which were comparable with NN for categorical and probabilistic forecasts. Experiments utilising the ability of SVMs to deal with high dimensionality in producing a spatial forecast show promise, but require further work.
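
A minimal sketch of the comparison, using scikit-learn on synthetic data, is given below: both models produce a categorical forecast (avalanche / no avalanche) and a probabilistic one. The features, class balance, and labeling are assumptions, not the Lochaber dataset.

```python
# Minimal sketch comparing nearest-neighbour and SVM forecasts, categorical and
# probabilistic, on synthetic data standing in for daily snowpack/weather records.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=10, n_informative=6,
                           weights=[0.8, 0.2], random_state=0)   # class 1 = "avalanche"
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

nn = KNeighborsClassifier(n_neighbors=10).fit(X_tr, y_tr)
svm = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)

print("NN  accuracy:", nn.score(X_te, y_te))
print("SVM accuracy:", svm.score(X_te, y_te))
# Probabilistic forecast for one day: probability of the "avalanche" class.
print("NN  P(avalanche):", nn.predict_proba(X_te[:1])[0, 1])
print("SVM P(avalanche):", svm.predict_proba(X_te[:1])[0, 1])
```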

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology as being closely associated with indoor radon. This association was indeed observed for the Swiss data but not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-to-top method complexity approach was adopted and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests for data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved to be better adapted for modeling of high-threshold categorization and for automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions.
In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definitions. Simulations should be the basis, while other methods can provide complementary information to support efficient indoor radon decision making.
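
A GRNN is equivalent to Nadaraya-Watson kernel regression, which makes it easy to sketch. The Python example below interpolates a toy field at two query locations; the coordinates, values, and kernel bandwidth are illustrative assumptions, not the Swiss indoor radon data.

```python
# Minimal sketch of a General Regression Neural Network (Nadaraya-Watson kernel
# regression) used for spatial interpolation of a toy measurement field.
import numpy as np

def grnn_predict(coords, values, query, sigma=0.5):
    """Predict at query points as a Gaussian-weighted mean of training values."""
    d2 = ((query[:, None, :] - coords[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2 * sigma**2))
    return (w @ values) / w.sum(axis=1)

rng = np.random.default_rng(3)
coords = rng.uniform(0, 10, size=(200, 2))                        # measurement locations (x, y)
values = np.sin(coords[:, 0]) + 0.1 * rng.standard_normal(200)    # toy measured field
query = np.array([[2.0, 5.0], [7.5, 1.0]])
print(grnn_predict(coords, values, query))
```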

Abstract:

Although cross-sectional diffusion tensor imaging (DTI) studies revealed significant white matter changes in mild cognitive impairment (MCI), the utility of this technique in predicting further cognitive decline is debated. Thirty-five healthy controls (HC) and 67 MCI subjects with DTI baseline data were neuropsychologically assessed at one year. Among them, 40 were stable (sMCI; 9 single domain amnestic, 7 single domain frontal, 24 multiple domain) and 27 were progressive (pMCI; 7 single domain amnestic, 4 single domain frontal, 16 multiple domain). Fractional anisotropy (FA) and longitudinal, radial, and mean diffusivity were measured using Tract-Based Spatial Statistics. Statistics included group comparisons and individual classification of MCI cases using support vector machines (SVM). FA was significantly higher in HC compared to MCI in a distributed network including the ventral part of the corpus callosum and right temporal and frontal pathways. There were no significant group-level differences between sMCI and pMCI or between MCI subtypes after correction for multiple comparisons. However, SVM analysis allowed for individual classification with accuracies up to 91.4% (HC versus MCI) and 98.4% (sMCI versus pMCI). When considering the MCI subgroups separately, the minimum SVM classification accuracy for stable versus progressive cognitive decline was 97.5%, in the multiple domain MCI group. SVM analysis of DTI data provided highly accurate individual classification of stable versus progressive MCI regardless of MCI subtype, indicating that this method may become an easily applicable tool for the early individual detection of MCI subjects evolving to dementia.
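
The individual-classification step can be sketched as an SVM evaluated with leave-one-out cross-validation, as below. The synthetic feature matrix (per-subject FA values) and the group means are assumptions for illustration, not the study's DTI data or its reported accuracies.

```python
# Minimal sketch of individual-level classification with an SVM and
# leave-one-out cross-validation; the synthetic "FA" features are illustrative.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)
n_hc, n_mci, n_feat = 35, 67, 50          # subjects and per-tract FA features
X = np.vstack([rng.normal(0.45, 0.05, (n_hc, n_feat)),    # controls: slightly higher FA
               rng.normal(0.40, 0.05, (n_mci, n_feat))])  # MCI: slightly lower FA
y = np.r_[np.zeros(n_hc), np.ones(n_mci)]

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
acc = cross_val_score(clf, X, y, cv=LeaveOneOut())         # one held-out subject per fold
print(f"leave-one-out accuracy: {acc.mean():.3f}")
```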