57 results for Dataflow diagrams
Abstract:
This report presents a study of the automatic generation of Java code from UML diagrams with the ArgoUML tool. The traditional software life cycle suffers from problems such as the lack of synchronisation between code and documentation, poor portability and interoperability issues. The MDA paradigm aims to solve these problems in part by making models the centre of software development and by relying on the automatic generation of models and code.
Abstract:
In this project, in order to analyse through an example (the database of an online shop) some of the possibilities offered by XML, a system has been built that generates relational databases (in SQL) from class diagrams defined in XML. This demonstrates the flexibility that XML brings to the definition, processing and transformation of documents.
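The pipeline the abstract describes (XML class diagrams in, SQL DDL out) can be illustrated with a minimal sketch. The XML tags and the type mapping below are illustrative assumptions, not the project's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal XML class-diagram format (tag and attribute
# names are assumptions for illustration only).
XML_MODEL = """
<model>
  <class name="Product">
    <attribute name="id" type="INTEGER"/>
    <attribute name="name" type="VARCHAR(100)"/>
    <attribute name="price" type="DECIMAL(10,2)"/>
  </class>
</model>
"""

def classes_to_sql(xml_text):
    """Emit one CREATE TABLE statement per <class> element."""
    root = ET.fromstring(xml_text)
    statements = []
    for cls in root.findall("class"):
        cols = ",\n  ".join(
            f"{attr.get('name')} {attr.get('type')}"
            for attr in cls.findall("attribute")
        )
        statements.append(f"CREATE TABLE {cls.get('name')} (\n  {cols}\n);")
    return "\n".join(statements)

print(classes_to_sql(XML_MODEL))
```

A real generator would also have to map associations between classes to foreign keys, which is where most of the design decisions lie.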
Abstract:
This document gathers the technical documentation produced during the development of a management system for an excavation company built on J2EE technology, together with the decisions taken to deliver it within the planned deadline. It is divided into nine main chapters. The first defines the main characteristics of the project management. The second collects the requirements, analysis and design of the project, accompanied by the corresponding use-case and sequence diagrams. The third describes the data model used. The fourth covers the physical model of the system, defining the architecture, the main technical characteristics of the system and its implementation, and the decisions adopted during development. The fifth is a record of the changes made to the original design. The sixth is devoted to the tests carried out once development was finished. The seventh is an installation manual describing the required specifications and the steps to follow in order to deploy the application. The eighth gives a brief economic assessment of the application together with an opportunity study. The ninth presents the conclusions drawn from this final-year project.
Abstract:
Documentation (requirements, UML diagrams, etc.) for the development of an application for managing a driving school.
Abstract:
Theory of compositional data analysis is often focused on the composition only. However, in practical applications we often treat a composition together with covariables on some other scale. This contribution systematically gathers and develops statistical tools for this situation. For instance, for the graphical display of the dependence of a composition on a categorical variable, a coloured set of ternary diagrams might be a good idea for a first look at the data, but it will quickly hide important aspects if the composition has many parts or takes extreme values. On the other hand, coloured scatterplots of ilr components may not be very instructive for the analyst if the conventional, black-box ilr is used. Thinking in terms of the Euclidean structure of the simplex, we suggest setting up appropriate projections which, on one side, show the compositional geometry and, on the other side, are still comprehensible to a non-expert analyst and readable for all locations and scales of the data. This is done, e.g., by defining special balance displays with carefully selected axes. Following this idea, we need to ask systematically how to display, explore, describe, and test the relation to complementary or explanatory data of categorical, real, ratio or again compositional scales. This contribution shows that it is sufficient to use some basic concepts and very few advanced tools from multivariate statistics (principal covariances, multivariate linear models, trellis or parallel plots, etc.) to build appropriate procedures for all these combinations of scales. This has some fundamental implications for their software implementation, and for how they might be taught to analysts who are not already experts in multivariate analysis.
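A balance of the kind mentioned above, i.e. an ilr coordinate contrasting two groups of parts, can be sketched in a few lines; the grouping and the example data are illustrative assumptions:

```python
import math

def geometric_mean(xs):
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

def balance(composition, num_idx, den_idx):
    """Isometric log-ratio balance between two groups of parts:
    sqrt(r*s/(r+s)) * ln(g(numerator parts)/g(denominator parts)),
    with r and s the sizes of the two groups."""
    num = [composition[i] for i in num_idx]
    den = [composition[i] for i in den_idx]
    r, s = len(num), len(den)
    coef = math.sqrt(r * s / (r + s))
    return coef * math.log(geometric_mean(num) / geometric_mean(den))

# Balance of the first part against the other two.
b = balance([0.5, 0.3, 0.2], [0], [1, 2])
```

A "balance display" in the sense of the abstract would plot such carefully chosen balances instead of default ilr coordinates.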
Abstract:
"compositions" is a new R package for the analysis of compositional and positive data. It contains four classes corresponding to the four different types of compositional and positive geometry (including the Aitchison geometry). It provides means for computation, plotting and high-level multivariate statistical analysis in all four geometries. These geometries are treated in a fully analogous way, based on the principle of working in coordinates and on the object-oriented programming paradigm of R. In this way, the functions called automatically select the most appropriate type of analysis as a function of the geometry. The graphical capabilities include ternary diagrams and tetrahedrons, various compositional plots (boxplots, barplots, pie charts) and extensive graphical tools for principal components. Afterwards, portion and proportion lines, straight lines and ellipses in all geometries can be added to plots. The package is accompanied by a hands-on introduction, documentation for every function, demos of the graphical capabilities and plenty of usage examples. It allows direct and parallel computation in all four vector spaces and provides the beginner with a copy-and-paste style of data analysis, while letting advanced users keep the functionality and customizability they demand of R, as well as all necessary tools to add their own analysis routines. A complete example is included in the appendix.
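The basic Aitchison-geometry operations such a package builds on can be sketched as follows; this is a plain illustration of the operations themselves, not the package's actual API:

```python
import math

def closure(x):
    """Rescale a positive vector so its parts sum to 1."""
    t = sum(x)
    return [xi / t for xi in x]

def perturb(x, y):
    """Aitchison perturbation: component-wise product, then closure."""
    return closure([xi * yi for xi, yi in zip(x, y)])

def power(x, a):
    """Aitchison power transformation: component-wise power, then closure."""
    return closure([xi ** a for xi in x])

def aitchison_dist(x, y):
    """Euclidean distance between centred log-ratio (clr) coefficients."""
    def clr(z):
        g = math.exp(sum(math.log(zi) for zi in z) / len(z))
        return [math.log(zi / g) for zi in z]
    return math.dist(clr(x), clr(y))
```

Perturbation and powering play the roles of vector addition and scalar multiplication in the simplex, which is what makes "working in coordinates" possible.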
Abstract:
R, from http://www.r-project.org/, is 'GNU S': a language and environment for statistical computing and graphics. Many classical and modern statistical techniques have been implemented in the base environment, and many more are supplied as packages. There are 8 standard packages, and many more are available through the CRAN family of Internet sites, http://cran.r-project.org. We have started to develop a library of functions in R to support the analysis of mixtures, and our goal is a MixeR package for compositional data analysis that provides support for:
- operations on compositions: perturbation and power multiplication, subcomposition with or without residuals, centering of the data, computing Aitchison, Euclidean and Bhattacharyya distances, the compositional Kullback-Leibler divergence, etc.;
- graphical presentation of compositions in ternary diagrams and tetrahedrons with additional features: barycentre, geometric mean of the data set, percentile lines, marking and colouring of subsets of the data set and their geometric means, annotation of individual data in the set, etc.;
- dealing with zeros and missing values in compositional data sets, with R procedures for the simple and multiplicative replacement strategies;
- time series analysis of compositional data.
We will present the current status of MixeR development and illustrate its use on selected data sets.
Abstract:
This paper describes a navigation system for autonomous underwater vehicles (AUVs) in partially structured environments, such as dams, harbors, marinas or marine platforms. A mechanical scanning imaging sonar is used to obtain information about the location of planar structures present in such environments. A modified version of the Hough transform has been developed to extract line features, together with their uncertainty, from the continuous sonar dataflow. The information obtained is incorporated into a feature-based SLAM algorithm running an Extended Kalman Filter (EKF). Simultaneously, the AUV's position estimate is provided to the feature extraction algorithm to correct the distortions that the vehicle motion produces in the acoustic images. Experiments carried out in a marina located on the Costa Brava (Spain) with the Ictineu AUV show the viability of the proposed approach.
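The line-extraction step can be illustrated with a bare-bones Hough transform sketch. The resolution parameters and test points below are illustrative assumptions; the paper's modified version additionally propagates feature uncertainty:

```python
import math
from collections import Counter

def hough_lines(points, rho_res=1.0, theta_steps=180):
    """Vote in discretised (rho, theta) space; each point votes for
    every line that could pass through it: rho = x*cos(t) + y*sin(t).
    Peaks in the accumulator correspond to detected lines."""
    acc = Counter()
    for x, y in points:
        for k in range(theta_steps):
            theta = math.pi * k / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            acc[(round(rho / rho_res), k)] += 1
    return acc

# Collinear points (here on the vertical line x = 5) pile their votes
# into a shared accumulator cell, whose count equals the point count.
pts = [(5.0, float(y)) for y in range(10)]
cell, votes = hough_lines(pts).most_common(1)[0]
```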
Abstract:
There is almost no case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these "zero data" represent a mathematical challenge for the interpretation. We need to start by recognising that there are zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot coexist with nepheline. Another common essential zero is a North azimuth; however, we can always change that zero for the value of 360°. These are known as "essential zeros", but what can we do with "rounded zeros", which result from values below the detection limit of the equipment? Amalgamation, e.g. adding Na2O and K2O as total alkalis, is one solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires a good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same happens if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation). The method that we propose takes into consideration the well-known relationships between some elements. For example, in copper porphyry deposits there is always a good direct correlation between the copper values and the molybdenum ones, but while copper will always be above the detection limit, many of the molybdenum values will be "rounded zeros". So we take the lower quartile of the real molybdenum values, establish a regression equation with copper, and then estimate the "rounded" zero values of molybdenum from their corresponding copper values. The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable.
Keywords: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
Abstract:
Throughout the history of electrical engineering education, vector and phasor diagrams have been used as a fundamental learning tool. At present, computational power has replaced them with long data lists, the result of solving equation systems by means of numerical methods. Diagrams have thus been shifted to an academic background and, although explained theoretically, they are not used in a practical way within specific examples. This may work against students' understanding of the complex behavior of electrical power systems. This article proposes a modification of the classical Perrine-Baum diagram construction that allows both a more practical representation and a better understanding of the behavior of a high-voltage electric line under different levels of load. This modification also allows forecasting the obsolescence of this behavior and the line's loading capacity. In addition, we evaluate the impact of this tool on the learning process, showing comparative undergraduate results over three academic years.
Abstract:
Johnson CCD photometry was performed in the two subgroups of the association Cepheus OB3, for selected fields each containing at least one star with previous UBV photoelectric photometry. Photometry for about 1000 stars down to visual magnitude 21 is provided, although the completeness tests show that the sample is complete down to V = 19 mag. Individual errors were assigned to the magnitude and colours of each star. Colour-colour and colour-magnitude diagrams are shown. Astrometric positions of the stars are also given. The reduction procedure is described in full detail.
Abstract:
Our procedure to detect moving groups in the solar neighbourhood (Chen et al., 1997) in the four-dimensional space of the stellar velocity components and age has been improved. The method, which takes advantage of non-parametric estimators of the density distribution to avoid any a priori knowledge of the kinematic properties of these stellar groups, now includes the effect of observational errors in the process of selecting moving group stars, uses a better estimation of the density distribution of the total sample and field stars, and classifies moving group stars using all the available information. It is applied here to an accurately selected sample of early-type stars with known radial velocities and Strömgren photometry. Astrometric data are taken from the HIPPARCOS catalogue (ESA, 1997), which results in an important decrease in the observational errors with respect to ground-based data, and ensures the uniformity of the observed data. Both the improvement of our method and the use of precise astrometric data have allowed us not only to confirm the existence of classical moving groups, but also to detect finer structures that in several cases can be related to kinematic properties of nearby open clusters or associations.
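A one-dimensional analogue of the non-parametric density estimation used to find such overdensities in velocity space can be sketched as follows; the Gaussian kernel, the bandwidth and the data are illustrative assumptions, not the paper's estimator:

```python
import math

def gaussian_kde(sample, h):
    """Return a kernel density estimator for the sample:
    a Gaussian bump of bandwidth h centred at each observation,
    averaged and normalised to integrate to one."""
    n = len(sample)
    norm = 1.0 / (n * h * math.sqrt(2 * math.pi))
    def density(v):
        return norm * sum(math.exp(-0.5 * ((v - s) / h) ** 2) for s in sample)
    return density

# Two clumps of "stars" at v = 0 and v = 10 km/s: the estimated density
# peaks near the clumps and dips in between, without assuming any model.
kde = gaussian_kde([0.0] * 50 + [10.0] * 50, h=1.0)
```

Candidate moving groups then correspond to regions where the estimated density significantly exceeds that of the field-star background.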
Abstract:
We have investigated the different contributions to the entropy change at the martensitic transition of different families of Cu-based shape-memory alloys. The total entropy change has been obtained through calorimetric measurements. By measuring the evolution of the magnetic susceptibility with temperature, the entropy change associated with conduction electrons has been evaluated. The contribution of the anharmonic vibrations of the lattice has also been estimated using various parameters associated with the anharmonic behavior of these alloys, collected from the literature. The results found in the present work have been compared to values published for the martensitic transition of group-IV metals. For Cu-based alloys, both electron and anharmonic contributions have been shown to be much smaller than the overall entropy change. This finding demonstrates that the harmonic vibrations of the lattice are the most relevant contribution to the stability of the bcc phase in Cu-based alloys.