25 results for Tractor trailer combinations
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The aim of this project is to develop a universal guidance-assistance system, adaptable to any vehicle, that significantly increases the efficiency of field work carried out by agricultural machinery. The proposed system can be configured according to the characteristics of the machinery or of the task to be performed, and it can guide along a straight line and create passes parallel to the straight reference pass. A further objective is to improve the performance, reliability and usability of the monitoring software initially installed on the tractor, and to characterize the Trimble® AgGPS 332 GPS receiver in order to verify the accuracy of the device. The results obtained in improving the monitoring software are very satisfactory, since operating inaccuracies that limited its usability were corrected. The results obtained in characterizing the AgGPS 332 receiver make it easier to judge which type of differential correction is most suitable for the required working accuracy, given its set-up and running costs. The guidance-assistance trials validated guidance with a 10-metre look-ahead as an aid equivalent to manual guidance when the working speed is 5 km/h, the driver has visual references and is not fatigued; the results for guidance at 3 and 50 metres are not satisfactory at 5 km/h. Nevertheless, during the design process, the trials and the analysis of the results, some shortcomings and limitations were identified, and a series of improvements is proposed to address them.
Abstract:
Background: Protein domains represent the basic units in the evolution of proteins. Domain duplication and shuffling by recombination and fusion, followed by divergence, are the most common mechanisms in this process. Such domain fusion and recombination events are predicted to occur only once for a given multidomain architecture. However, other scenarios may be relevant in the evolution of specific proteins, such as convergent evolution of multidomain architectures. With this in mind, we study glutaredoxin (GRX) domains, because these domains of approximately one hundred amino acids are widespread in archaea, bacteria and eukaryotes and participate in fusion proteins. GRXs are responsible for the reduction of protein disulfides or glutathione-protein mixed disulfides and are involved in cellular redox regulation, although their specific roles and targets are often unclear. Results: In this work we analyze the distribution and evolution of GRX proteins in archaea, bacteria and eukaryotes. We study over one thousand GRX proteins, each containing at least one GRX domain, from hundreds of different organisms and trace the origin and evolution of the GRX domain within the tree of life. Conclusion: Our results suggest that single-domain GRX proteins of the CGFS and CPYC classes have each evolved through duplication and divergence from one initial gene that was present in the last common ancestor of all organisms. Remarkably, we identify a case of convergent evolution in domain architecture that involves the GRX domain. Two independent recombination events of a TRX domain to a GRX domain are likely to have occurred, which is an exception to the dominant mechanism of domain architecture evolution.
Abstract:
Selecting the most suitable tractor for a given fruit farm requires considering two fundamental aspects. First, the fruit grower must weigh the economic advisability of the purchase against the option of hiring contractors (if not for all operations, at least for the most costly ones); second, the tractor's power must be consistent with the agricultural implements to be coupled to it.
Abstract:
We prove that any subanalytic locally Lipschitz function has the Sard property. Such functions are typically nonsmooth, and their lack of regularity necessitates the choice of some generalized notion of gradient and of critical point. In our framework these notions are defined in terms of the Clarke and the convex-stable subdifferentials. The main result of this note asserts that for any subanalytic locally Lipschitz function the set of its Clarke critical values is locally finite. The proof relies on Pawlucki's extension of the Puiseux lemma. In the last section we give an example of a continuous subanalytic function which is not constant on a segment of "broadly critical" points, that is, points for which we can find arbitrarily short convex combinations of gradients at nearby points.
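For readers unfamiliar with the nonsmooth notions involved, the Clarke subdifferential of a locally Lipschitz function $f:\mathbb{R}^n\to\mathbb{R}$ admits the following standard characterization (a textbook definition, not a statement specific to this note):

```latex
% Clarke subdifferential: convex hull of limits of gradients taken at
% nearby points where f is differentiable (such points have full measure
% by Rademacher's theorem).
\partial^{\circ} f(x) = \operatorname{conv}\Bigl\{ \lim_{k \to \infty} \nabla f(x_k)
    \;:\; x_k \to x,\ f \text{ differentiable at } x_k \Bigr\}
```

A point $x$ is Clarke critical when $0 \in \partial^{\circ} f(x)$, and the main result asserts that the set of values $f$ takes at such points is locally finite.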
Abstract:
In this paper we investigate the role of horospheres in Integral Geometry and Differential Geometry. In particular we study envelopes of families of horocycles by means of “support maps”. We define invariant “linear combinations” of support maps or curves. Finally we obtain Gauss-Bonnet type formulas and Chern-Lashof type inequalities.
Abstract:
This study presents a first attempt to extend the "Multi-scale integrated analysis of societal and ecosystem metabolism (MuSIASEM)" approach to the spatial dimension using GIS techniques in the metropolitan area of Barcelona. We use a combination of census and commercial databases, along with a detailed land-cover map, to create a layer of Common Geographic Units that we populate with the local values of human time spent in different activities according to the MuSIASEM hierarchical typology. In this way we mapped the hours of available human time against the working hours spent at different locations, highlighting the gradients in spatial density between the residential locations of workers (generating the labour supply) and the places where the working hours actually take place. We found a strong trimodal pattern of clusters of areas with different combinations of time spent on household activities and on paid work. We also measured and mapped the spatial segregation between these two activities, and we put forward the conjecture that this segregation increases with higher energy throughput, since the size of the functional units must be able to cope with the flow of exosomatic energy. Finally, we discuss the effectiveness of the approach by comparing our geographic representation of exosomatic throughput with the one produced by conventional methods.
Abstract:
The aim of this study is to analyse how some of the general monolingual dictionaries of Spanish published in the last ten years have treated the phenomenon of lexical collocations, namely those word combinations which, from the standpoint of the norm, show certain combinatorial restrictions, essentially semantic in nature, imposed by usage (Corpas 1996). The dictionaries analysed were: the "Diccionario Salamanca de la lengua española", directed by Juan Gutiérrez (1996); the "Diccionario del español actual" by Manuel Seco, Olimpia Andrés and Gabino Ramos (1999); the twenty-second edition of the RAE dictionary (2001); and the "Gran diccionario de uso del español actual. Basado en el corpus Cumbre", directed by Aquilino Sánchez (2001). Our study is based on a corpus of 52 lexical collocations compiled from the subentries under the letter "b" in each of the selected dictionaries. We then examined the entries for each of the elements that make up a collocation (base and collocate) in order to see whether the dictionaries account for these same combinations elsewhere in the lexicographic article, such as in the definitions or the examples. In analysing the lexicographic information we focused on four aspects: a) the information contained in the preliminary pages of each work; b) the placement of collocations within the entry; c) the assignment of a collocation to a particular entry; and d) grammatical labelling.
Abstract:
Reaching a commitment among agents has always been a difficult task, especially when they have to decide how to distribute the available amount of a scarce resource among themselves. On the one hand, there is a multiplicity of possible ways of assigning the available amount; on the other, each agent will propose the distribution that provides her with the highest possible award. In this paper, with the purpose of making this agreement easier, we first use two different sets of basic properties, called Commonly Accepted Equity Principles, to delimit what agents can propose as reasonable allocations. Second, we extend the results obtained by Chun (1989) and Herrero (2003), obtaining new characterizations of well-known bankruptcy rules. Finally, using the fact that bankruptcy problems can be analyzed in terms of awards and losses, we define a mechanism which provides a new justification of the convex combinations of bankruptcy rules.
Keywords: Bankruptcy problems, Unanimous Concessions procedure, Diminishing Claims mechanism, Piniles’ rule, Constrained Egalitarian rule.
JEL classification: C71, D63, D71.
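As a concrete illustration of what a convex combination of bankruptcy rules computes, the sketch below mixes two classical rules, the proportional rule and constrained equal awards. These particular rules are standard textbook examples chosen for the illustration, not necessarily the rules characterized in the paper:

```python
# Sketch: convex combinations of bankruptcy rules. The two rules below
# (proportional, constrained equal awards) are classical examples used
# for illustration, not necessarily those characterized in the paper.

def proportional(estate, claims):
    """Each claimant receives a share proportional to her claim."""
    total = sum(claims)
    return [estate * c / total for c in claims]

def constrained_equal_awards(estate, claims):
    """Equal awards capped at each claim; the cap is found by bisection."""
    lo, hi = 0.0, max(claims)
    for _ in range(100):
        mid = (lo + hi) / 2
        if sum(min(mid, c) for c in claims) < estate:
            lo = mid
        else:
            hi = mid
    return [min(lo, c) for c in claims]

def convex_combination(rules, weights, estate, claims):
    """Mix rules with weights summing to 1; awards still sum to the estate."""
    awards = [rule(estate, claims) for rule in rules]
    return [sum(w * a[i] for w, a in zip(weights, awards))
            for i in range(len(claims))]

claims = [100.0, 200.0, 300.0]
estate = 300.0
mix = convex_combination([proportional, constrained_equal_awards],
                         [0.5, 0.5], estate, claims)
print([round(x, 2) for x in mix])  # -> [75.0, 100.0, 125.0]
```

Because each rule allocates exactly the estate, any convex combination of them does too, which is why such mixtures are themselves well-defined bankruptcy rules.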
Abstract:
We show that nuclear C*-algebras have a refined version of the completely positive approximation property, in which the maps that approximately factorize through finite-dimensional algebras are convex combinations of order zero maps. We use this to show that a separable nuclear C*-algebra A which is closely contained in a C*-algebra B embeds into B.
Abstract:
Background and Purpose: Early prediction of motor outcome is of interest in stroke management. We aimed to determine whether lesion location at DTT is predictive of motor outcome after acute stroke, and whether this information improves the predictive accuracy of the clinical scores. Methods: We evaluated 60 consecutive patients within 12 hours of MCA stroke onset. We used DTT to evaluate CST involvement in the MC and PMC, CS, CR, and PLIC, and in combinations of these regions, at admission, at day 3, and at day 30. Severity of limb weakness was assessed using the m-NIHSS (items 5a, 5b, 6a, 6b). We calculated infarct volumes and FA values in the CST of the pons. Results: Acute damage to the PLIC was the best predictor associated with poor motor outcome, axonal damage, and clinical severity at admission (P<.001). There was no significant correlation between acute infarct volume and motor outcome at day 90 (P=.176, r=0.485). The sensitivity, specificity, and positive and negative predictive values of acute CST involvement at the level of the PLIC for poor motor outcome at day 90 were 73.7%, 100%, 100%, and 89.1%, respectively. In the acute stage, DTT predicted motor outcome at day 90 better than the clinical scores (R2=75.50, F=80.09, P<.001). Conclusions: In the acute setting, DTT is promising for stroke mapping to predict motor outcome. Acute CST damage at the level of the PLIC is a significant predictor of unfavorable motor outcome.
Abstract:
A parts-based model is a parametrization of an object class using a collection of landmarks that follow the object structure. Matching parts-based models is one of the problems where pairwise Conditional Random Fields have been successfully applied. The main reason for their effectiveness is tractable inference and learning, due to the simplicity of the graphs involved, usually trees. However, these models do not consider possible patterns of statistics among sets of landmarks, and thus they suffer from using overly myopic information. To overcome this limitation, we propose a novel structure based on hierarchical Conditional Random Fields, which we explain in the first part of this thesis. We build a hierarchy of combinations of landmarks, where matching is performed taking the whole hierarchy into account. To preserve tractable inference we effectively sample the label set. We test our method on facial feature selection and human pose estimation on two challenging datasets: Buffy and MultiPIE. In the second part of this thesis, we present a novel approach to multiple kernel combination that relies on stacked classification. This method can be used to evaluate the landmarks of the parts-based model approach. Our method is based on combining the responses of a set of independent classifiers, one for each individual kernel. Unlike earlier approaches that linearly combine kernel responses, our approach uses them as inputs to another set of classifiers. We show that we outperform state-of-the-art methods on most of the standard benchmark datasets.
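The stacked combination idea can be sketched as follows: each kernel gets its own independent classifier, and their responses are fed as features to a second-level classifier rather than being mixed with fixed linear weights. The toy data, the per-kernel scorers and the training loop below are illustrative stand-ins, not the thesis implementation:

```python
import math

# Sketch of stacked kernel combination: one independent classifier per kernel,
# whose responses feed a second-level classifier instead of a fixed linear mix.
# The data, the per-kernel scorers and the training loop are toy stand-ins.

X = [(0.1, 0.9), (0.2, 0.8), (0.9, 0.1), (0.8, 0.3), (0.7, 0.9), (0.9, 0.8)]
y = [0, 0, 1, 1, 1, 1]

# Stand-ins for the per-kernel base classifiers: one score per "kernel".
base = [lambda p: p[0], lambda p: p[1]]

def responses(p):
    """Responses of all base classifiers for one sample."""
    return [f(p) for f in base]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Second level: logistic regression on the base responses, trained by SGD.
w, b = [0.0, 0.0], 0.0
for _ in range(2000):
    for p, t in zip(X, y):
        r = responses(p)
        g = sigmoid(sum(wi * ri for wi, ri in zip(w, r)) + b) - t
        w = [wi - 0.5 * g * ri for wi, ri in zip(w, r)]
        b -= 0.5 * g

preds = [int(sigmoid(sum(wi * ri for wi, ri in zip(w, responses(p))) + b) > 0.5)
         for p in X]
print(preds)
```

The second-level classifier learns which kernel responses are informative (here, the first coordinate), which is what distinguishes stacking from a pre-specified linear combination.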
Abstract:
The theory of compositional data analysis is often focused on the composition alone. However, in practical applications we often treat a composition together with covariables on some other scale. This contribution systematically gathers and develops statistical tools for this situation. For instance, for the graphical display of the dependence of a composition on a categorical variable, a coloured set of ternary diagrams might be a good idea for a first look at the data, but it will quickly hide important aspects if the composition has many parts or takes extreme values. On the other hand, coloured scatterplots of ilr components may not be very instructive for the analyst if the conventional, black-box ilr is used. Thinking in terms of the Euclidean structure of the simplex, we suggest setting up appropriate projections which, on the one hand, show the compositional geometry and, on the other, are still comprehensible to a non-expert analyst and readable for all locations and scales of the data. This is done, for example, by defining special balance displays with carefully selected axes. Following this idea, we need to ask systematically how to display, explore, describe, and test the relation to complementary or explanatory data of categorical, real, ratio or again compositional scales. This contribution shows that it is sufficient to use some basic concepts and very few advanced tools from multivariate statistics (principal covariances, multivariate linear models, trellis or parallel plots, etc.) to build appropriate procedures for all these combinations of scales. This has some fundamental implications for their software implementation, and for how they might be taught to analysts who are not already experts in multivariate analysis.
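A "balance" is a single ilr coordinate contrasting two groups of parts, which is what makes balance displays interpretable. The sketch below computes the two balances of a 3-part composition for one analyst-chosen partition; the partition itself is an illustrative assumption, not one prescribed by the contribution:

```python
import math

# Sketch: "balance" coordinates (a particular ilr basis) for a 3-part
# composition, with an interpretable partition chosen by the analyst:
# first (x1, x2) against x3, then x1 against x2. The partition is an
# illustrative choice.

def closure(x):
    """Rescale a positive vector so its parts sum to 1."""
    s = sum(x)
    return [xi / s for xi in x]

def balance(numer, denom):
    """Normalized log-ratio of geometric means of two groups of parts."""
    r, s = len(numer), len(denom)
    gmean = lambda v: math.exp(sum(math.log(p) for p in v) / len(v))
    return math.sqrt(r * s / (r + s)) * math.log(gmean(numer) / gmean(denom))

x = closure([1.0, 2.0, 4.0])
b1 = balance(x[:2], x[2:])    # (x1, x2) against x3
b2 = balance(x[:1], x[1:2])   # x1 against x2
print(round(b1, 4), round(b2, 4))  # -> -0.8489 -0.4901
```

Note that balances are invariant to the closure: rescaling the raw vector leaves every balance unchanged, which is the scale invariance that compositional methods require.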
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made in which the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
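One standard example of such a linking parametrization is the Box-Cox-style power transform, which tends to the logarithm as its parameter goes to zero and so can morph an analysis of raw data into a log-based (logratio) analysis frame by frame. The function below illustrates this limiting behaviour; it is a generic sketch, not the presentation's own code:

```python
import math

# A Box-Cox-style power transform: (x**a - 1) / a tends to log(x) as a -> 0.
# Varying a in small steps is one way to interpolate "frame by frame" between
# an analysis of raw data (a = 1) and a log-based analysis (a = 0).

def power_transform(x, a):
    """Box-Cox power transform; the a = 0 case is its log limit."""
    return math.log(x) if a == 0 else (x ** a - 1.0) / a

for a in (1.0, 0.5, 0.1, 0.01, 0.0):
    print(a, round(power_transform(4.0, a), 4))
```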
Abstract:
Aitchison and Bacon-Shone (1999) considered convex linear combinations of compositions. In other words, they investigated compositions of compositions, where the mixing composition follows a logistic Normal distribution (or a perturbation process) and the compositions being mixed follow a logistic Normal distribution. In this paper, I investigate the extension to situations where the mixing composition varies along a number of dimensions. Examples would be where the mixing proportions vary with time or distance, or a combination of the two. Practical situations include a river, where the mixing proportions vary along the river or across a lake, and possibly with a time trend. This is illustrated with a dataset similar to that used in the Aitchison and Bacon-Shone paper, which looked at how pollution in a loch depended on the pollution in the three rivers that feed the loch. Here, I explicitly model the variation in the linear combination across the loch, assuming that the mean of the logistic Normal distribution depends on the river flows and the relative distance from the source origins.
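The deterministic core of such a model can be sketched as a composition observed at location t that is a convex linear combination of fixed source compositions, with mixing weights varying along t. All numbers below are hypothetical, and the smooth deterministic weights stand in for the paper's logistic-Normal model of the mixing composition:

```python
# Sketch: a composition observed at location t as a convex linear combination
# of three fixed source compositions, with mixing weights that vary along t.
# All numbers are hypothetical; deterministic weights stand in for the
# logistic-Normal mixing composition of the paper.

def closure(x):
    """Rescale a positive vector so its parts sum to 1."""
    s = sum(x)
    return [xi / s for xi in x]

# Three source ("river") pollutant compositions with 3 parts each.
sources = [
    closure([0.6, 0.3, 0.1]),
    closure([0.2, 0.5, 0.3]),
    closure([0.1, 0.2, 0.7]),
]

def mixing_weights(t):
    """Hypothetical weights varying smoothly with distance t in [0, 1]."""
    return closure([(1 - t) ** 2, 2 * t * (1 - t), t ** 2])

def mixed_composition(t):
    """Convex combination of the sources at location t."""
    w = mixing_weights(t)
    return closure([sum(w[k] * sources[k][i] for k in range(3))
                    for i in range(3)])

print([round(p, 3) for p in mixed_composition(0.5)])
```

At t = 0 the mixture coincides with the first source and at t = 1 with the third, so the observed composition drifts smoothly between the sources as distance varies.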
Abstract:
Catadioptric sensors are combinations of mirrors and lenses designed to obtain a wide field of view. In this paper we propose a new sensor that has omnidirectional viewing ability and also provides depth information about its nearby surroundings. The sensor is based on a conventional camera coupled with a laser emitter and two hyperbolic mirrors. The mathematical formulation and precise specifications of the intrinsic and extrinsic parameters of the sensor are discussed. Our approach overcomes limitations of existing omnidirectional sensors and eventually leads to reduced production costs.